
Single source of truth: why it matters

Marco Venzelaar
February 06, 2019

We are living in the information age, and we know from our private lives that information can be used (and abused) all around us. The same is true in our professional lives, yet there information is still very much under-utilised and the potential is great. The opportunities are wide-ranging; to name just a few:

  • Estimating the time to complete a task – it would be great to base this on actual historical data; instead we rely on anecdotal information from memory and a ‘feeling’ for where the project is heading.
  • Daily and weekly reports – the information behind the daily and weekly reports created by test managers and project managers is generally held somewhere within the project’s systems, yet we still compile these reports manually.
  • Unit, system and end-to-end test execution – while unit testing is likely to be automated (let’s hope so), its results are held within development only, and the other test phases have their own repositories or stand-alone documents, each issuing its own version of a report.

Information is fragmented across the project: in documents, in different management tools, in project members’ heads and, not to forget, in the ‘gut feeling’ of the project managers. Perceptions and assumptions fill the gaps when (part of the) information is missing, and we all know where that can lead.

Then there are the project requirements, which might be held in requirement documents, design documents, requirement management tools or all of the above. People will instinctively build spreadsheets to accumulate or link this information, but in doing so they create yet another source of information. At any given point in time, does the project manager know the true status of the project? If you are running an Agile or DevOps project (in true style, that is), you are likely to have access to pipeline information or Agile dashboards. Unfortunately, too often this information is tailored to the development teams and stays within the project itself. What about programme directors or the wider stakeholders? They want a broader view. Decisions based on silo-ed information will eventually become ill-informed decisions. Quality assurance has taken a back-seat role in all this over the last few years, but now is the time it can bring renewed value.

Collating data points from across the project into a common format and a central location allows the project or the organisation to draw up dashboards that show the state of the project based on all the information. This ‘single source of truth’ lets everyone across the project have the conversation with the same information in front of them. That means not only better-informed decisions, but also decisions that generally pull in the same direction. If there is an issue to be dealt with, the same up-to-date information is available to all, and decisions can be made quickly.
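
As a rough illustration of what that collation can look like, the sketch below pulls records from a few tools over their APIs and maps them onto one common schema. The endpoint URLs and field names here are purely hypothetical placeholders, not those of any particular tool; substitute whatever your own test-management, CI and defect-tracking systems expose.

```python
import requests

# Hypothetical endpoints -- replace with the real URLs (and credentials)
# of the tools your project actually uses.
SOURCES = {
    "unit_tests":   "https://ci.example.com/api/test-results",
    "system_tests": "https://testtool.example.com/api/runs/latest",
    "defects":      "https://tracker.example.com/api/issues?status=open",
}

def normalise(source, record):
    """Map each tool's own field names onto one common schema."""
    return {
        "source": source,
        "item": record.get("name") or record.get("title"),
        "status": (record.get("status") or record.get("result", "")).lower(),
        "updated": record.get("updated") or record.get("timestamp"),
    }

def collect():
    """Pull raw records from every source and return one combined list."""
    combined = []
    for source, url in SOURCES.items():
        response = requests.get(url, timeout=10)
        response.raise_for_status()
        for record in response.json():
            combined.append(normalise(source, record))
    return combined

if __name__ == "__main__":
    for row in collect():
        print(row)
```

Once every record lands in the same shape, building a dashboard (or feeding a reporting database) on top of it is straightforward.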

A more advanced step is to connect this information to cognitive learning capabilities, which is not that difficult to do. This allows you to start learning and predicting from the centrally held information, and it can be done at any level of the organisation. It helps build clearer estimates based on real figures, allows the organisation to truly learn from past experience and supports decision-making at crucial points in the delivery process. When you collect a lot of information from across the project, analysing it all becomes harder simply because of its volume and diversity, and this is where cognitive learning and machine learning provide immense help.
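
To make the estimation point concrete, here is a minimal sketch of learning from past delivery data. The figures and features (original estimate, number of requirements) are invented for illustration only; in practice they would come from the centrally held project data, and a real model would use far richer inputs.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Illustrative historical records: [estimated_days, number_of_requirements]
# for completed tasks, and the actual days each one took.
features = np.array([
    [3, 4],
    [5, 8],
    [2, 3],
    [8, 12],
    [4, 6],
])
actual_days = np.array([4.0, 7.5, 2.5, 11.0, 5.0])

# Fit a simple regression model on past tasks...
model = LinearRegression()
model.fit(features, actual_days)

# ...and use it to predict how long a new task is likely to take.
new_task = np.array([[6, 9]])
print(f"Predicted duration: {model.predict(new_task)[0]:.1f} days")
```

Even a basic model like this replaces the ‘gut feeling’ estimate with one grounded in real figures, and it improves as more completed work flows into the single source of truth.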

Another smaller but still important point is that large projects have a transient population of project members. Information might be handed over from a departing member to a new one, but it will never be all of the information, and it often ends up on yet another piece of paper or in yet another document that, in most cases, isn’t available to others in the project.

Have a good look at your (project) organisation, or let us take an independent look, and see how this information can be brought together. You can start small, expand within the project and eventually replicate the approach across projects. The great thing is that it does not mean every project must use the same tools: many tools nowadays expose APIs that give access to their information. Our engineers can use those APIs, or create custom APIs so that older, previously unconnected tools can share their data openly for everyone in the project and organisation to use; a small sketch of that idea follows below.
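
As a hedged example of wrapping an older tool with a custom API: many legacy tools can at least export their results to a file. The sketch below assumes a hypothetical nightly CSV export (the file name is made up) and simply serves it as JSON so that dashboards and other projects can consume it like any other API.

```python
import csv
from flask import Flask, jsonify

app = Flask(__name__)

# Hypothetical path to the legacy tool's nightly CSV export; point this
# at wherever the tool actually writes its results.
EXPORT_FILE = "legacy_test_results.csv"

@app.route("/api/test-results")
def test_results():
    """Expose the legacy tool's export as JSON for dashboards to consume."""
    with open(EXPORT_FILE, newline="") as f:
        rows = list(csv.DictReader(f))
    return jsonify(rows)

if __name__ == "__main__":
    app.run(port=5000)
```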

It is time to liberate your data and make informed decisions!


About the author

Managing Consultant | Test Automation | UK
Marco started his career in Quality Assurance with ISO certifications, including environmental and safety regulations. His attention to detail on processes continued into his career as an expert testing consultant, where he combined it with his passion for test automation and performance testing, and he now applies this to Cognitive QA.
