January 3, 2019

Applying Analytics to Improve Engineering and Operations: The Road Ahead

By Vivek Jaykrishnan

In my last blog post, I introduced the topic of analytics and its application in improving engineering and operations. In this post, I'll discuss some of the engineering use cases we have worked on in applying analytics, and lay down some rules of the road to help you get started on building this capability within your organization.

For the last couple of years, we have been working with multiple product engineering teams to help them leverage analytics in their engineering operations. Some of the use cases our team has worked on are given below:

Product Management Function

We are currently working with a few customers in the regulated medical devices and aerospace and defense industries, where the focus of applying analytics is mainly on determining which features are most used, along with product usage patterns, configurations, and so on. We recently worked with a medical device customer who wanted us to look at the clinical logs generated while using their product, compare them with the logs generated through automation, and identify frequently used features and workflows.

Development Function

We have been working with a leading global beverage manufacturer, analyzing their source code to help them identify hotspots and the context-specific tests they need to execute. We have also been working with a medical device manufacturer on a proof of value to automate the process of impact analysis. The regulated medical devices industry places a lot of focus on impact analysis, and the power of analytics helps them get it done faster.

Quality Assurance

When it comes to QA use cases, we have been working with Dell EMC; our experience there is covered in more detail in the video accompanying this post.

We are also currently working with a medical device manufacturer on identifying duplicate test scenarios, defect deduplication, and so on.

Release Management

We are currently engaged with a European automotive OEM's central release management team, helping the team make go/no-go decisions by predicting the risk level of the various product components they receive as part of their software supply chain.

The road ahead, or: where do we apply analytics in our operations?

As you can see from the use cases above, Capgemini has been working with several product engineering companies to apply analytics to improve operations. In fact, we have an entire engineering analytics competency team which focuses on the operational aspects of applying analytics.

One common concern among customers is how to identify the low-hanging fruit for applying analytics. In other words, what are the opportunities, and in which areas of our operations can we apply analytics?

We already have various mechanisms to improve the way we work, such as heuristics, rules of thumb, best practices, common sense, insights, intuitive judgements, feedback, and SME tribal knowledge. A primary goal of applying analytics is to help us “automate” or “codify” these “heuristics” and “rules” for better and faster decision making.

Let us take an example from the quality domain to understand how we "automate" or "codify" rules. In a test selection problem, common sense tells us to test defects that have been fixed, identify related test failures and pick up those tests, identify dependent modules from the code base and the tests for those modules, identify tests that failed the last time a similar code change happened, and so on.
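To make the idea of codifying such heuristics concrete, here is a minimal sketch of rule-based test selection. All names, data structures, and data here are hypothetical, purely for illustration, and not an actual client implementation:

```python
# Minimal sketch of codified test-selection rules.
# All names and data are hypothetical, for illustration only.

def select_tests(changed_modules, coverage_map, recent_failures, history):
    """Pick tests using three simple, codified heuristics."""
    selected = set()
    # Rule 1: tests that cover the changed (or dependent) modules.
    for module in changed_modules:
        selected.update(coverage_map.get(module, []))
    # Rule 2: tests that failed recently -- likely related to open defects.
    selected.update(recent_failures)
    # Rule 3: tests that failed when a similar change happened before.
    for module in changed_modules:
        selected.update(history.get(module, []))
    return sorted(selected)

coverage_map = {"payments": ["t_checkout", "t_refund"], "auth": ["t_login"]}
history = {"payments": ["t_invoice"]}
print(select_tests(["payments"], coverage_map, {"t_login"}, history))
# -> ['t_checkout', 't_invoice', 't_login', 't_refund']
```

In practice, each rule would query real repositories (version control, test management, defect tracking) rather than in-memory dictionaries, but the shape of the codified heuristics is the same.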

Though all of the above can be done manually, in a continuous engineering organization it is very difficult to do this effectively without automation. This is where we can leverage the power of analytics and automate or codify the rules.

There is also a ladder of analytics maturity to climb: descriptive analytics to find out what is happening, diagnostic analytics to understand why it is happening, predictive analytics to know what is likely to happen, and prescriptive analytics to decide what needs to be done in a given context, before finally deploying artificial intelligence.

Rules of the road, or: how do we apply analytics to our operations?

When applying analytics to your operations, the first step is to select the right analytics partner. Your partner needs not only analytics capabilities but also an understanding of the product engineering phases. The partner can be in-house, external, or both. Finally, the partner should be open to co-creation of value and to experimentation.

The next aspect to focus on is data availability. If you don't have the data yet, start collecting it. Collecting data is no longer an intensive activity, as a number of product engineering tools allow easy data collection via their APIs.

Once you have your data, start looking at data quality: check for correctness, completeness, duplication, and sufficiency, and fix missing records. You will find that these data curation activities give you as many insights as you might get by implementing an analytics model.
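A basic data-quality audit of this kind needs very little machinery. The sketch below checks a batch of defect records for completeness and duplicates; the field names and records are made up for illustration:

```python
# Minimal data-curation sketch: audit defect records for completeness
# and duplicates. Field names and data are hypothetical.

def audit(records, required=("id", "module", "severity")):
    report = {"missing_fields": 0, "duplicates": 0, "clean": []}
    seen = set()
    for rec in records:
        if any(rec.get(f) in (None, "") for f in required):
            report["missing_fields"] += 1      # incomplete record
            continue
        if rec["id"] in seen:
            report["duplicates"] += 1          # de-duplication
            continue
        seen.add(rec["id"])
        report["clean"].append(rec)
    return report

records = [
    {"id": 1, "module": "auth", "severity": "high"},
    {"id": 1, "module": "auth", "severity": "high"},   # duplicate
    {"id": 2, "module": "", "severity": "low"},        # incomplete
]
report = audit(records)
print(report["missing_fields"], report["duplicates"], len(report["clean"]))
# -> 1 1 1
```

Even a crude report like this tends to surface process problems (records created without mandatory fields, copy-pasted defects) before any model is built.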

Next, explore the data to identify anomalies, trends, and patterns, and select the rules you want to apply, either by modeling the current manual process or by applying industry best practices and methods.

You then need to model the rules, identifying the right algorithms from families such as segmentation algorithms, association algorithms, and statistical algorithms.
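As one example from the segmentation family, a tiny one-dimensional k-means can split modules into low- and high-churn groups, a common first cut when deriving hotspots. The churn figures are invented for illustration:

```python
# Sketch of a segmentation algorithm: 1-D k-means with two clusters,
# splitting modules into low- and high-churn groups. Data is hypothetical.

def kmeans_1d(values, iters=10):
    lo, hi = min(values), max(values)          # initial centroids
    for _ in range(iters):
        low = [v for v in values if abs(v - lo) <= abs(v - hi)]
        high = [v for v in values if abs(v - lo) > abs(v - hi)]
        lo = sum(low) / len(low)               # update centroids
        hi = sum(high) / len(high)
    return lo, hi

churn = [2, 3, 4, 40, 45, 50]  # commits per module over a release
lo, hi = kmeans_1d(churn)
print(round(lo, 1), round(hi, 1))
# -> 3.0 45.0
```

Real engagements would use a library implementation over many features rather than one metric, but the idea is the same: let the data propose the segments instead of picking thresholds by hand.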

The next rule, and one of the most important, is to validate the model, either with the help of an SME or by using statistical measures such as precision, recall, and F-measure.
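These three measures are straightforward to compute once an SME has labelled a ground-truth set. The test names and numbers below are illustrative only:

```python
# Validating a model's selections against SME-labelled ground truth
# using precision, recall, and F-measure. Data is hypothetical.

def prf(predicted, actual):
    tp = len(predicted & actual)              # true positives
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(actual) if actual else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

predicted = {"t1", "t2", "t3", "t4"}   # tests the model selected
actual = {"t2", "t3", "t4", "t5"}      # tests the SME says were needed
p, r, f = prf(predicted, actual)
print(p, r, round(f, 2))
# -> 0.75 0.75 0.75
```

High precision means the model wastes little effort on irrelevant selections; high recall means it misses little of what the SME would have chosen. Which matters more depends on the cost of a miss in your domain.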

Then move into the operationalizing phase, where you decide on the technology aspects of deploying the solution across multiple programs, keeping the common needs across those programs in mind when you design the data lake.

Challenges and key takeaways

No journey is without challenges. Identifying the right project, and then finding out whether the project or program is analytics-ready, is one of the initial challenges. Sometimes the repositories are not as integrated as desired, and that has to be addressed. Some data quality problems may also arise from process violations. The availability and bandwidth of SMEs are often a concern. Last but not least is finding the right skill set, which would ideally be data scientists with product engineering experience.

The key takeaways, then, are the following:

  • Now or never: One of the key trends revealed by the WQR 2018 survey was the convergence of artificial intelligence (AI), machine learning (ML), and analytics, and their use in carrying out smarter automation. Cognitive automation is clearly the future, and organizations seeking to reap the full benefits of analytics need to start collecting data now or risk being left behind by their competitors.
  • Mind your data: To start on this journey, one of the key steps that organizations need to take is to identify the data quality, accessibility, and availability gaps and fix them in order to better leverage analytics.
  • Start small: This means starting with a pilot project having limited aims and objectives and then rolling out the project across the organization based on the results. The best way to build competency in this area is to identify a use case and pick up the right project along the lines of something that has already been done in other organizations.
Vivek Jaykrishnan

About

Vivek Jaykrishnan is a highly experienced enterprise test consultant and test architect. He has a career spanning 18+ years, steering the functions of verification and validation, including functional, system, integration, and performance testing, in leadership positions with organisations of repute. Vivek has a proven track record of working across various engagement models, including outsourced product V&V in service organisations, independent V&V in a captive unit, and globally distributed development in a product company. Vivek also has extensive experience in developing and implementing verification and validation strategies and driving V&V to align with various development methodologies, including continuous delivery, agile, iterative development, and the waterfall model. Vivek is passionate about applying cognitive intelligence within testing, helping organisations move beyond product quality to product integrity, and chasing the next frontiers in IoT testing.




*Opinions expressed on this blog reflect the writer’s views and not the position of the Sogeti Group