
Introduction 

Deleting information from any document is something to think about twice, maybe three times. But deleting a requirement from a specification is not as simple as deleting a sentence. A requirement is an object which holds not only the specification sentence but also other information such as name-value pair attributes, history, link information, and so on.

In this document we explain the possible ways to approach deleting requirements.

Challenges

Sometimes it is necessary to discard some information from the project for various reasons. But even in such cases, discarding specification information carries risks:

  • It may break the integrity of the overall information map.
  • The information piece to be discarded may not be needed in one configuration but may still be critical in others.
  • The information piece to be discarded now may still be an important component of an audit trail.
  • The information itself may have no value anymore, but the links it bears may still constitute some value.

So when we decide to delete any information from the requirements base, we should consider several cases, or somebody has to consider them for us in advance.

How to delete requirements from DOORS Next Generation 

How can we delete any requirement from DNG? Or what is the best approach to delete requirements from the requirements database? 

For the reasons mentioned above, deleting requirements from the GUI doesn't actually remove the data from the database; it just becomes inaccessible. To keep the integrity intact, any deleted requirement still occupies a placeholder in the database.

Soft Delete of Artifacts

Since a "delete" operation via the GUI doesn't physically remove the data from the database, and yet the data becomes permanently unreachable once deleted, we may want to consider a "soft" delete operation instead.

It is a widely accepted approach to organize requirements in modules within DNG. Just as we create and edit requirements within modules, we also tend to execute the delete command there. This command is called "Remove Artifact":

 

What we are doing with this command is not actually deleting the requirement but removing it from the module, so it remains accessible among the base artifacts and folders.

 

 

Also, by design, there is a concept of requirement reusability, which means the requirement may coexist in other modules.

This is why, when removing a requirement from a module, it is not "deleted" by default. However, if the artifact is present in only one module, there is an option to delete it via the "remove" command, as follows:

If the artifact to be removed doesn’t exist in any other module, this option permanently deletes it. 

Permanent deletion of the artifact will still leave a trace of it in the database for the integrity of the data, but it will no longer be possible to retrieve the artifact.

So our recommendation is not to permanently delete any artifact but to remove it. However, there might still be some issues with removing the artifact from the module:

When an artifact is removed from the module, the specification text, its key-value pair attributes, and its history all remain; only its link to the module it belongs to is broken. But there is one more important piece of information which gets broken: the links to other artifacts! In this example we see that the requirement is linked to another requirement in another module. When the artifact is removed from the module, we lose that link as well. So when we bring it back into the module, it won't have the link information anymore:

Sometimes our clients prefer this behavior, sometimes not. For those who find it undesirable, we generally recommend a "softer" delete operation: define a boolean attribute called "Deleted" with the default value "false". This attribute is assigned to all artifact types, and instead of deleting an artifact, you simply change the boolean value from "false" to "true".

Of course, this alone is not enough. We also need to define a filter for every view that excludes requirements whose "Deleted" attribute is set to "true".
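
The same flag should then be respected by any external tooling that reads requirements. Below is a minimal sketch, assuming your server accepts basic authentication; the query capability URL, the namespace, and the property name for the custom "Deleted" attribute are hypothetical placeholders (OSLC query parameters such as oslc.where and oslc.select are standard, but the property URI for a custom attribute is generated by your own server, so look it up in your project's resource shapes):

```python
# Minimal sketch: query DOORS Next over OSLC and keep only artifacts whose
# custom "Deleted" attribute is "false". All URLs and the property name
# below are hypothetical placeholders; replace them with your deployment's.
import requests

QUERY_BASE = "https://elm.example.com/rm/views"  # hypothetical OSLC query capability URL

params = {
    # Declare a prefix for the namespace where the custom attribute lives
    # (hypothetical namespace; check your project's resource shapes).
    "oslc.prefix": "ex=<https://elm.example.com/rm/types/>",
    # Filter out soft-deleted artifacts (the exact value syntax depends on
    # how your server represents the boolean attribute).
    "oslc.where": 'ex:Deleted="false"',
    "oslc.select": "dcterms:identifier,dcterms:title",
}
headers = {"Accept": "application/rdf+xml", "OSLC-Core-Version": "2.0"}

resp = requests.get(QUERY_BASE, params=params, headers=headers,
                    auth=("user", "password"), timeout=30)
resp.raise_for_status()
print(resp.text)  # RDF/XML describing only the non-deleted artifacts
```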

Hard Delete of Artifacts

Hard delete of artifacts can be understood as deleting an artifact with the option "If the artifact is not in other modules, permanently delete it." selected.

But this will still not be sufficient if the artifact is used in more than one module. To ensure a hard delete, first remove it from all modules it is used in.

You can use the "in modules" information to find out which modules the artifact appears in.

Then, just simply locate the base artifact in the folder structure and select “Delete Artifact” from the context menu. 

 

Upon confirming the deletion it will be permanently deleted: 

 

This operation will still not delete the artifact from other configurations if there is more than one stream and the artifact appears in those streams.

How to clean up requirements 

As mentioned above, even requirements deleted from the GUI are not deleted from the database, for various reasons, mostly to maintain data integrity and database indices. In theory, a reindexing and database maintenance script should be run after every physical deletion of artifacts, which would not make sense for a daily operation performed by an end user.

For artifacts which are permanently deleted from the GUI, there is a repotools command which helps remove them from the database as well: the deleteJFSResources command. However, use it with extra caution and please review the information at the link below:

https://www.ibm.com/support/pages/deleting-data-permanently-doors-next-generation-project 

Summary

Making use of configurations, streams, and modules complicates the deletion concept. We may lose information where we don't expect to, and we may still encounter artifacts even after properly deleting them, since they remain available in other configurations. That is why we generally advise using the "Deleted" attribute approach and implementing filtering for non-deleted artifacts.

Installation of IBM ELM (Engineering Lifecycle Management)

This article describes our experience in implementing the ELM (Engineering Lifecycle Management) product from IBM.

Introduction

IBM ELM applications are the leading platform for complex product and software development. ELM extends the standard functionality of ALM products. It provides an integrated end-to-end solution that offers complete transparency and tracking of all technical data, through requirements, testing and deployment. ELM optimizes collaboration and communication between all stakeholders, improving decision-making, productivity, and overall product quality. This product has undergone a significant update and has also changed its name from CLM (Collaborative Lifecycle Management) to ELM (Engineering Lifecycle Management).

The main differences between CLM and ELM relate to the change in product names:

- Rational DOORS – in ELM it is called IBM Engineering Requirements Management DOORS Family (DOORS)

- Rational DOORS Next Generation (DNG) – in ELM it is called IBM Engineering Requirements Management DOORS Next (DOORS Next)

- Rational Team Concert (RTC) – in ELM it is called IBM Engineering Workflow Management (EWM)

- Rational Quality Manager (RQM) – in ELM it is called IBM Engineering Test Management (ETM)

- And many more...

The user interface already changed with version 6.0.6.1. However, some applications have kept their names, such as Report Builder, Global Configuration Management, Quality Management, etc.

Other differences between CLM and ELM include the fact that ELM supports different web application servers, operating systems, and databases. This provides maximum flexibility in adopting new advanced features and simplifies potential future migration to or from:

- IBM ELM on Cloud SaaS

- IBM ELM as a Managed Service

- IBM ELM containerized on RedHat OpenShift, which is currently under development

The current version that we are installing is 7.0.2 SR1 (ifix15). This product is primarily aimed at customers in healthcare, military, transportation, and many other critical areas of industry and manufacturing. Among the main common requirements in these areas are speed, reliability, flexibility, and security. For the effective and reliable functioning of the ELM solution, an initial analysis of requirements and planning is important.

Challenges

After defining the client's requirements (required applications, architecture), the actual deployment of the ELM product begins. During the design, we consider the client's requirements and adapt the architecture to the existing infrastructure and the technical requirements of the ELM product. The ELM product supports implementation on Linux (RedHat), Windows, and IBM AIX operating systems.

IBM ELM offers several applications, some of which are listed below:

  • JTS (Jazz Team Server)
  • CCM (Change and Configuration Management)
  • RM (Requirements Management)
  • QM (Quality Management)
  • ENI/RELM (IBM Engineering Lifecycle Optimization - Engineering Insights)
  • AM (IBM Engineering Systems Design Rhapsody – Model Management)
  • GC (Global Configuration Management)
  • LDX (Link Index Provider)
  • Jazz Reporting Service, which includes more applications:
    •  RS (Report Builder)
    •  DCC (Data Collection Component)
    •  LQE (Lifecycle Query Engine)
  • JAS (Jazz Authorization Server)
  • RPENG (Document Builder) – which is implemented as a separate application/system

For large infrastructures, we recommend implementing each application on a separate server. That, however, significantly increases the financial requirements. For smaller deployments with lower demands on ELM, we look for the best possible balance between performance and reliability, taking possible financial constraints into consideration to achieve the best results. Optimizing an IBM ELM solution can also mean deploying several applications on one server at the same time; hosting multiple applications on one server is used for less frequently used applications. Based on our experience, the following logical arrangement of applications on servers (ASx) has proven itself:

  • AS1: JTS, LDX, GC
  • AS2: QM, CCM
  • AS3: DCC, RPENG
  • AS4: LQE, RS, RELM
  • AS5: RM

For a better understanding of the architecture, the following figure shows a proposal for a possible logical topology:

Preferred Solutions

The recommendation for large customers is to install each application on a separate server, if the technical and financial limitations of the customer's environment allow it. However, this results in increased server costs, which consist of installations, licenses, upgrades, and the resources necessary for the operation of the given solution.

For small and medium-sized customers, based on an initial analysis, we recommend installing less-used applications on a shared server. Over time, however, problems with memory and system resource utilization may occur. With a properly designed architecture, these applications can later be separated without data loss. For detecting and preventing problems and for future planning, the deployment of a monitoring system is recommended.

The result of the initial analysis determines which applications will be installed on separate servers. It is also advisable to install on separate servers those applications that the customer uses regularly and that handle thousands to millions of requests. Other, less computing- and memory-intensive applications can be distributed so that they share computing and memory capacity.

An essential part of every ELM deployment is the JTS application, which ensures the connection of the individual applications. The recommended deployment of ELM packages in the infrastructure is to use IHS (IBM HTTP Server), configured as a reverse proxy.

A necessary part of the solution is the implementation of the database server. The database server stores most of the data, so it is vital to pay extra attention to it when deploying and choosing a suitable product. The installation ships with Apache Derby, a basic database intended primarily for testing purposes with a maximum of 10 users. This database is not intended for production environments, and therefore we do not recommend it. Databases suitable for both production and testing include:

  • IBM DB2
  • Oracle
  • Microsoft SQL Server

If the installation is carried out on the cloud, it is preferred to use the DB2 database. In the case of large companies, the Oracle database is more suitable. The use of these databases directly depends on the number of users and the amount of user data.


After the product is installed, we address the implementation of user authentication and security. For basic security, the WebSphere Liberty basic registry is used, which is available immediately after installation, but this option is recommended for testing purposes only. Another authorization option is the LDAP protocol, which can be supplemented by installing JAS (Jazz Authorization Server). With LDAPS authorization, users are authorized using a hierarchical user structure. JAS extends the solution with the OAuth 2.0 protocol and also makes it possible to authorize third-party applications.

Specialized solutions, approaches and tips

From Softacus' point of view, we try to adapt to customers and carry out installations according to their usual standards. Based on the client's requirements, Softacus can work remotely; in the case of customers from critical infrastructure, we can also work on-site. VPN (Virtual Private Network) technologies are used for secure access.

During installations, it is necessary to ensure that installed IBM ELM applications do not have access to environments they are not authorized for. Attention should also be paid to the installation of applications that are delivered as a "package" (an application with a set of sub-applications to choose from; the client chooses which sub-applications they need). After installation and subsequent configuration of the environment, it is necessary to verify the functionality of the system, with the assistance of the Softacus testing team.

One of the last steps after performing the installation and securing the server communication is data backup. Together with the customer, it is necessary to think about a disaster recovery plan and to schedule regular data backups to protect data against loss or system failure. Monitoring is an essential part of every infrastructure nowadays: it helps with prediction and improves the response to adverse events on servers and in applications.

Conclusion

ELM product implementation includes analysis, planning, deployment, and continuous monitoring. At Softacus, we work on improvement and efficiency during the entire course of the solution. It is a continuous and complicated process, during which it is possible to implement new functionalities and extensions into the customers' environment.


DOORS Next provides great functionality for managing requirements within your team, but as your project grows, sooner or later you will need to export and import data using external formats.

For DOORS Next data, the first format to think of is the project template (or component template, in case the configuration management feature is enabled in RM). The content of a template can be set up to include different types of data, but all items of the included types will end up in the template (e.g. if you include artifacts in a template, all folders will be added to it).

Another option is the ReqIF package. It allows exchanging requirements between requirements management platforms which are compatible with ReqIF, with more data selection options (e.g. you can include only certain artifacts or modules).

Excel/CSV format is also an option for exporting and importing data to DOORS Next, and this article provides some details on these functions.

The purpose of this article is to describe several cases of Excel/CSV import and export, prerequisites and limitations.

The very first thing you need to know is which fields are mandatory for importing data to DOORS Next via spreadsheets, and what they mean. The best way to get these fields is to export data from DOORS Next first; here is some explanation for them. For the basic case (an extended case is explained below) you need the following attributes as columns of a spreadsheet:

  • Artifact type - must be an existing artifact type
  • Name or Primary Text - either Name or Primary Text must be included in the spreadsheet. If you include Primary Text, the value of the Name attribute will be updated automatically (this is simply how DOORS Next works)

If you want to update existing requirements via an Excel/CSV file, you need to include an ID attribute column. In this case you can use the 'update' option, and values in the 'ID' column will be checked against the IDs of existing artifacts.

If you want to update module content, e.g. the module structure (insert new artifacts into a module and/or change the order of artifacts), you need to include module-specific attributes in the spreadsheet (see the sketch after this list). They are:

  • isHeading - boolean value, reflecting the module-specific heading option of an artifact, regardless of artifact type
  • parentBinding - ID of the parent of the current artifact. The value is empty if an artifact is at the top of the module hierarchy, or contains the ID of the 'parent' of the current artifact (e.g. the heading of the chapter to which the current artifact belongs)
  • Module - ID of the module, which will be the same for all rows in a spreadsheet, because an import into a module always targets one specific module
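
To make the column set concrete, here is a minimal sketch in Python (pandas) that builds such an import file. The column names mirror the list above and all IDs are hypothetical; always verify the exact headers and value formats against a spreadsheet exported from your own server:

```python
# Minimal sketch: build a CSV for importing artifacts into a module.
# Column names mirror the attributes described above; IDs are hypothetical.
import pandas as pd

rows = [
    # A heading at the top of the module hierarchy (empty parentBinding).
    {"Artifact Type": "Heading", "Primary Text": "1 Introduction",
     "isHeading": "true", "parentBinding": "", "Module": "10001"},
    # A requirement nested under an existing artifact; parentBinding holds
    # the ID of its parent (a hypothetical heading with ID 20002).
    {"Artifact Type": "Requirement",
     "Primary Text": "The system shall log every delete operation.",
     "isHeading": "false", "parentBinding": "20002", "Module": "10001"},
]

columns = ["Artifact Type", "Primary Text", "isHeading", "parentBinding", "Module"]
pd.DataFrame(rows, columns=columns).to_csv("module_import.csv", index=False)
```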

You can use Excel/CSV import for link creation, and here there are several options. The most straightforward option is to literally insert links directly. In this case you can use a spreadsheet column to build a link to another item (either another artifact in DOORS Next, a work item in the EWM application, or a test artifact in ETM). For EWM and ETM this is the main option; you can better understand the format of the spreadsheet after exporting some samples. That can help you build your own spreadsheet for importing links either for base artifacts or for artifacts in a module, depending on the import option you use.

For links to other DOORS Next artifacts, you can use Excel/CSV to prepare your requirements for linking via the 'Link by Attribute' feature. In this case you need a special attribute which holds the IDs of the artifacts to be linked, and you can fill the values of this attribute via the spreadsheet.

Depending on your linking policy and needs (linking to base artifacts or linking to artifacts in certain modules), you will use either the IDs of the artifacts you need to link to (e.g. 12345 is the ID of an artifact you want to build a link to) or pairs of module ID and artifact ID (e.g. 23456.34567, where 23456 is the ID of a module and 34567 is the ID of an artifact in this module, when you want to create a link to a modular artifact). After a successful import of these values you can proceed with the 'Link by Attribute' feature.
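
As a small sketch of preparing those values in Python (the IDs are the hypothetical ones from the example above), note that the pairs must be kept as text so that spreadsheet tools do not parse them as decimal numbers:

```python
# Minimal sketch: build "moduleID.artifactID" pairs for the
# 'Link by Attribute' feature. IDs are hypothetical placeholders.
module_id = "23456"
artifact_ids = ["34567", "34568", "34569"]

# Keep the pairs as strings; "23456.34567" must not become a float.
link_targets = [f"{module_id}.{aid}" for aid in artifact_ids]
print(link_targets)  # ['23456.34567', '23456.34568', '23456.34569']
```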

You can also use Excel to import primary text with complex formatting (fonts, colors, or even tables), but the source must also be DOORS Next; formatting in this case is carried in a cell note. Another case is importing embedded artifacts: here the spreadsheet includes a link to an embedded artifact which must already exist on the server.

A few general tips which could help you when working with Excel/CSV import and export:

  • Use column names from exported spreadsheets to be sure they are right
  • If you are importing values for modular artifacts (pairs of IDs separated by dots), check the cell format: by default Excel may interpret them as decimal numbers and truncate them (see the sketch after this list)
  • If you are sure you are doing everything right but the import fails, try switching to the English locale
  • If you need to exclude some artifacts from a module, be sure to keep the 'METADATA' section from the initial state of the module (it includes the list of artifact IDs), otherwise the 'delete artifacts from the module' option won't work
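
As a minimal sketch of the second tip, assuming you post-process an exported CSV in Python rather than Excel, forcing every column to text keeps the module.artifact pairs intact:

```python
# Minimal sketch: read an exported CSV with all columns as text, so that
# values like "23456.34567" are not parsed as floats and truncated.
import pandas as pd

df = pd.read_csv("export.csv", dtype=str, keep_default_na=False)
print(df.head())
```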

The Excel/CSV import and export feature can help you in various cases, both as a standalone option and combined with other functions of DOORS Next. Reach out to us to learn more and get support with your specific case.


Global Configuration Management in IBM ELM

Introduction

Configuration Management has always been a difficult problem to solve, and it has tended to become more difficult as product complexity increases. As end-user expectations evolve, development organizations require more custom and tailored solutions. This introduces the necessity of effective management of versions and variants, which adds further sophistication to the problem. Managing a complex product development lifecycle in this sense requires a similarly sophisticated solution.

The complexity of end products and their ability to be tailored for the end user are not the only concerns for system developers. What about increasing competitive pressure, efficiency, and so on? These, too, grow in importance when developing a system.

Challenges in Product Development

Of course, when we talk about developing a system, we should think about how companies will be able to develop multiple products, share common components, and manage variants, product lines, and even product families, especially if this system needs to address many different user profiles, expectations, and needs.

One main concern for such companies is how to optimize the reuse of assets across such product lines or product families for effective and efficient development processes. Without the governance of effective tools and technologies, such reusability may introduce new risks. For instance, a change to a component in one variant or one release cycle may not be propagated correctly, completely, and efficiently to all other recipient versions and variants.

This may result in redundancy, error-prone manual work, and conflicts. We do not want to end up in this situation while seeking solutions for managing product lines, families, versions, and variants. In this article, we would like to provide a brief explanation of how the IBM Engineering Lifecycle Management (ELM) solution tackles such challenges by enabling cross-domain configuration management across requirements, system design, verification and validation, and all the way down to software source code management.

This solution, Global Configuration Management (GCM), is an optional application of the ELM Solution, which integrates several products to provide a complete set of applications for software or systems development.

The Solution

Global Configuration Management (GCM) offers the ability to reuse engineering artifacts from different system components with different lifecycle statuses across multiple parts of the system. Stakeholder, system, sub-system, safety, software, hardware requirements, verification and validation scenarios, test cases, test plans, system architectural design, simulation models, etc. can be reused as well as configured from a single source of truth for different versions and variants. Each of these information sources above has their own configuration identification as per their domain and lifecycle. One main challenge is to combine all these islands of information into one big configuration.

Fig 1

In the example above, there are 3 global configurations:

  • A configuration which was sent to manufacturing, the initial phase of the ACC (Adaptive Cruise Control)
  • A configuration where the identified issues are being addressed (mainly to fix some software bugs)
  • And a configuration on which engineering teams are currently working to release the next functionally enriched revision.

All these configurations include a certain set of requirements; the test cases relevant to those requirements, which will be used to verify them; the system architecture parts which satisfy the requirements and will be used during the verification process; and, if the system includes software, the relevant source code to be implemented.

Of course, the sketch above can be extended to a full configuration of an end product, for instance a complete standalone system like an automobile. In this case, we define a component hierarchy to represent the car. For a reference on component structures for global configurations, see the IBM article cited below (Fig 2).

In such applications of GCM, a combination of system components and streams can be used as follows to define the whole system:

Fig 2: Image from article: "CLM configuration management: Defining your component strategy" - https://jazz.net/library/article/90573

In such examples, the streams represented as right arrows on an orange background (global streams) correspond to a part of the system and are constituted by several configurations of the relevant domains. For instance, an "air pressure sensor" is defined by a set of requirements for this part, as well as a system design for the sensor and the test cases to verify the requirements. All these domain-level, local configurations correspond to different information assets. The global configuration can be defined in such a way that if a different configuration is defined for the Ignition System ECU (a different baseline is selected for the Ignition System ECU requirements, test plans and cases, and firmware), we end up with a different variant of the system. This effect is illustrated in the figure below:

Fig 3: Developing 3 different variants at the same time, facilitating reusability.

We may have a base product and different variants being developed on top of it. The variants may share considerable commonality, as they are based on a common platform. This platform is a collection of engineering assets like requirements, designs, embedded software, test plans and test cases, etc.

Effective reuse of the platform engineering assets is achieved through the Global Configuration Management application. Via GCM, ELM can efficiently manage the sharing of engineering artifacts and facilitate effective variant management. While a standard variant (we can also call it the Core Platform) is being developed (see Fig 3), some artifacts constituting specifications, test cases, system models, and source code for subsystems, components, or sub-components can be directly shared and reused, while others may undergo small changes that make a big difference. Sometimes a fix surfaces in one variant and should be propagated to the other variants as well, because we may not know whether those variants are affected by the issue. Such fixes can easily and effectively be propagated through all variants, making sure the issue is addressed there as well. This is possible because all artifacts, whether reused or changed, are based on one single source of truth and share the same basis, which makes comparison of configurations (variants) very easy and just one click away.

ELM with GCM manages strategic reusability by facilitating parallel development of the “Standard Variant” (Core Platform) and allows easy and effective configuration comparison and change propagation across variants.

Tips and Best Practices for GCM

The technology and implementation behind GCM are very elegant. When developing systems, especially safety-critical systems, two-way linking of information is required. Information here means any engineering artifact, from requirements to tests, reviews, work items, system architecture, and code changes.

So one should maintain a link from source to target, and backward traceability from target to source should also be satisfied. How is this possible with variants? There must be an issue here, especially when talking about traceability between different domains, say between requirements and tests.

To be more specific, let's look at a simple example:

In this example, we have two variants. The initial variant consists of one requirement and a test case linked to that requirement: "The system shall do this," and the test case specifies how to verify it. The link between the requirement and the test case is valid (green connection). Now, in the second variant, we have to update the requirement, because the system shall do this under a certain condition. Both requirements have the same ID; they are two different versions of the same artifact. We can compare these configurations and see how the specification has changed from version 1 to version 2. But now, what happens to the link? Is it still valid? Is the test case really testing the second version of the requirement? We have to check. That is why the link is no longer valid in the second variant.

The team working on the first variant sees that the test case exactly verifies version 1 of the requirement. But the team working on the second variant is not sure: the link between the requirement and the test case is suspect because the requirement has changed. How is this possible with bidirectional links? Actually, it is not, because it is supposed to be the same link on both sides, just as it is the same version of the requirement. So, in a way, we also version-control the links. However, if we anchor both ends of a link to artifacts in different domains, creating two instances of the same link is not possible. ELM enables this by removing the reverse direction of links when Global Configuration is enabled and maintaining the backward links via indices. This is the beauty of the design: links from source to target exist, while the backward direction is maintained by link indices, which makes the solution very flexible for defining variants and having different instances of links between artifacts in different variants.

There are some considerations when enabling Global Configurations in projects. The main questions I ask to decide whether to enable it are: do I need to manage variants? How many variants? Is reusability required? The answers to these questions will settle the decision without second thoughts.

Then come the typical questions of designing the global configuration environment: what is a component, how granular should it be, how to decide on streams and configurations, which domains should be included in GC, when to baseline, and how to define a taxonomy for system components. That discussion requires a separate article of its own, so let's leave it for next time.

Conclusion

IBM ELM's Global Configuration Management (GCM) is a purpose-built solution for version and variant management, designed to enable Product Line Engineering by facilitating effective and strategic reuse.

With GCM implemented, the difficulties around sharing changes among different variants disappear, and every change can be easily and effectively propagated among different product variants.

GCM mainly helps organizations reduce time to market and cost by reducing error-prone manual and redundant activities.

It also improves product quality by automating the change management process, which can be used by multiple product development teams and multiple product variants.


We are looking for a motivated individual to work with us on our projects in the area of IBM Engineering Lifecycle Management in order to strengthen our growing team.

Role description
As an IBM Engineering Lifecycle Management Consultant, you will be working alongside some of the top experts in Europe and the USA, and you will support some of the largest companies in regulated industries like Automotive, Rail, Medical, Pharma, Aerospace & Defense.
You will be part of exciting and challenging projects in the area of systems and software engineering. Our clients are companies mainly in DACH, Europe, and the USA.

What you can expect

  • Competent and skilled team of colleagues
  • Pleasant working atmosphere
  • Laptop and smartphone of your choice
  • Support of your further education and self-development
  • Working at a company with flat decision structures
  • High Level of flexibility
  • Performance and skill based remuneration

Main responsibilities

  • Working with our customers on their daily business activities in order to develop a deep understanding of their business processes and objectives
  • Engaging with key business users and stakeholders and collecting and defining requirements
  • Managing and analyzing changes to requirements in the IBM ELM product suite
  • Capturing and tracing changes to requirements in the IBM ELM product suite
  • Coordinating and collaborating with other consultants, architects, and developers, and providing the team with analysis, design, development, and enhancement of IBM ELM solutions and IBM ELM Extensions
  • Discussing and cooperating with the development team to specify, execute and provide relevant solutions
  • Working on business and technical tasks in IBM products like DOORS, DOORS Next, EWM (RTC), ETM (RQM), Rhapsody and other Jazz applications
  • Leading proactive communication and information exchange
  • Providing feedback and guidance to clients to support technology-related decision making
  • Helping our clients with process development in the ALM toolchain
  • Analysis and discussions with other stakeholders regarding development processes
  • Presentation and documentation of concepts, workflows, and data models
  • Implementation of process configuration, preparation of user documentation, and support during the implementation of the tools
  • Support during pre-sales and post-sales activities

Ideal candidate profile

  • Good experience with IBM Engineering Lifecycle Management products, including IBM DOORS Next, Engineering Workflow Management, and Engineering Test Management, or similar tools in the ALM area
  • Solid understanding of processes in the area of requirements engineering, design, MBSE, development, and testing of complex products
  • Education with a focus on engineering or IT is a big advantage
  • Professional and project experience in complex development environments
  • Knowledge of traditional and agile development processes (V-Model, Agile, SAFe)
  • Knowledge of other ALM tools (DOORS Classic, Codebeamer, JAMA, PTC Windchill, Polarion) or integrations with third-party tools is an advantage
  • Software development skills (e.g. JavaScript, Java, Python…) are a plus
  • Customer orientation and high-quality service delivery
  • Strong analytical and problem solving skills
  • Willingness to travel within Europe (mainly DACH) when needed
  • Good communication and presentation skills
  • Solid knowledge of German (level B2 or higher) and English (level B2 or higher)

If you have skills in similar domains and you learn quickly, then we would also be happy to receive your application.

About Softacus
Established in 1981, Softacus AG has developed a focus on the following three areas: software resale, consulting, and software development outsourcing. We are experts when it comes to consulting and service delivery of IBM software products and solutions, and we help our clients to improve visibility and transparency when licensing and managing commercial software.
Softacus is a responsible organization of 50+ team members in nearly 10 countries, and our customers include high-profile enterprises and SMEs in regulated industries.
Continuously investing in the skills of our employees is essential in order to find and deliver the right talent to fit our customers' expectations.
Thanks to our very close cooperation with IBM and direct relations with almost all technical departments of IBM, we support customers quickly and professionally in selecting the correct product and base solution.

Do you think this position is for you?
Then we look forward to receiving your application documents with your CV, certificates, possible start date and also your salary expectations!

Phone Number: +49 69 173201020
Email: info@softacus.com

Thank you very much!

Date: May 17 2022, 11:00 CEST

Are you part of the Life Sciences and Healthcare industry? Are you developing new medical devices and want to know: How can you improve quality?

Give your team full transparency and enable them to make the right decisions at the right time. Create work products as a holistic, linked data set in a single view of truth and ensure your devices' safety.

How can you streamline compliance?
Create your work products with a system that provides design control for every aspect and integrates compliance into the development process, resulting in significant cost reduction. See how IBM ELM and Softacus face the challenges of IEC 62304 compliance.

How can you reduce time to innovation?
Deliver innovative offerings to market in less time by leveraging design reuse, early validation with model-based engineering, and accelerated product delivery with agile engineering practices.

Discover how we meet these challenges with IBM Engineering Lifecycle Management and Softacus!

Register now to participate in our webinar on 17 May 2022 at 11:00.

Main Functionalities

  • Ability to migrate large numbers of modules from DOORS to DNG
  • Possibility to transform the data according to specific user needs (split, merge modules)
  • Transformation of attributes
  • Transformation in GCM context: view to stream, split across components
  • Ability to migrate history and baselines


 

DOORS Tables

Our application has the ability to transform DOORS tables into DNG HTML tables within the DNG artifact.
We also support the ability to transform OLE Excel tables into DNG artifact HTML tables.

Migration of History data

Ability to save DOORS object history and import it into either an attribute or the initial content of the "Primary Text" attribute.
Possibility to create a link back to the original object in case of migration without the history (recommended option).

Import from a specific DOORS baseline (currently in development).

High Automation Level

The tool has an API to automate:

  • Artifact creation and mapping
  • Attribute creation and mapping
  • Link Type creation
  • Project area & component creation
  • GCM creation
  • Creation of views, folders, streams and baselines
  • User creation in Jazz
  • Automated assignment of groups to users

Conversions

  • Rich text into attributes
  • DXL Layouts into string attributes
  • Ability to convert attributes into embedded artifacts (e.g. for parametrization)
  • Transformation of pictures: we support png, jpeg, and jpg; due to DNG limitations, .wmf and .svg are not supported

Links

  • Recreation of links within the project area
  • Recreation of links within the component
  • Recreation of links across project areas
  • Recreation of links across components with GC context

Splitting of Data

  • Split to several components 
  • Split to project areas
  • Split to several servers
  • Split of data by view to several streams
  • Stream Updates

Profiles per Project

The solution has customizable options (currently via a YAML file) to set up a migration profile for each project.

Migration of Views

The solution supports migration of views (beta).
Some views are incompatible with DNG, and we tag them.

Lightning Speed

We have developed a concept for avoiding thousands or millions of calls to DNG.

Example: 10,000 artifacts in 150 seconds.

Statistics

  • Project area size
  • Module sizes
  • Number of attributes
  • Number of artifacts

Validator

  • Checks counts and data
  • Counts object types and attribute types
  • Checks views
  • Checks links

Mapping

  • Object view to component stream
  • Attribute harmonization
    • Possible automated cleanup of the data
  • Mapping of custom object types to artifact types
  • Storage of the mapping for each execution
  • Sandboxing - migrate, test, and improve

OLE Object Handling

  • Migration of OLEs from an attribute into the primary text (merge) or as a linked artifact (link to a base artifact)
  • Ability to migrate pictures as embedded or standalone artifacts, linking embedded pictures to their artifacts (as base artifact links)

Notifications

Error handling prior to and after the migration.

Test Management

(Planned) A test on each object with proof that the artifact was migrated from DOORS to DNG properly.

Other Advantages

  • Keeps references to DOORS Classic
  • 60 widgets for DNG available from Softacus, mostly sharing a common architecture; the widgets resolve several gaps and improve the user experience in DNG
  • Support for tags
  • Handling of huge project data
  • Own migration server and infrastructure for migration
  • Handling of special Unicode symbols
  • Handling of nested symbols
  • Incompatibility checks
  • Attribute merging
  • We can recreate DXL scripts in JavaScript, Java, and Python, or another language if required

Technical Information

  • Concurrent loading from DOORS to DNG
  • Usage of an internal database during migration, which allows flexible linking after the data has been migrated
  • Possibility to resume the migration process in case of a disruption

 

Flexible for any customer use case

All migrations are different. We built this solution because the traditional ways of migrating were not sufficient for our needs and those of our customers.

This video shows the first release of the solution.

 


Development of specific APIs

The tool is able to do custom migrations thanks to the APIs we have created ourselves.


Filter History

This extension allows users to filter the history directly in a module, without needing to open the history separately in a new window or tab.


Softacus AG

Löwenstrasse 20
8001 Zürich
Switzerland
E-Mail: info@softacus.com
Tel.: +41 43 5087081
Fax: +41 43 344 6075 

VAT: CHE-108.817.809 MWST
D-U-N-S® Number 486800618


Softacus GmbH

Westendstrasse 28
60325 Frankfurt am Main
Germany
E-Mail: info@softacus.com
Tel.: +49 69 34876544
Fax: +49 69 5830 35709

VAT: DE301903892
D-U-N-S® Number 313482703


Softacus s.r.o.

Křídlovická 351/47A
603 00 Brno
Czech Republic
E-Mail: info@softacus.com
Tel.: +420 530333482
Fax: +41 43 344 6075

VAT: CZ07286333
D-U-N-S® Number 496165108


Softacus s.r.o.

Tatranské nám. 3
058 01 Poprad
Slovakia
E-Mail: info@softacus.com
Tel: +421 911 083 612
Fax: +41 43 344 6075

VAT: SK2121388148
D-U-N-S® Number 2121388148
