Over the past few years, I have been working extensively in and around Business Intelligence.
With my sense of Enterprise Architecture, I have consistently had the feeling that there is potential for improvement. The symptoms, seen from different viewpoints, all relate to time-consuming work for new deliveries and maintenance. At one public organization, the business was told it would take nine years to deliver the solution for the requested follow-up of operational business processes. At another organization, the BI people hide from business representatives in the corridors, because they have no time to take on their requirements. At yet another organization, the standard answer is that your case will be taken on six months from now. The most surprising thing about this is that the business actually has a demand, a need, probably based on a desire to improve their business.
The main symptom can be seen as a lack of capacity to meet business demand.
What is the solution? In most cases the standard answer is: we need more resources.
Of course, higher capacity through more resources will help. However, I firmly believe there are other ways to address the capacity issue, where capacity is more about throughput than about the number of human resources.
What is the throughput of your BI delivery organization? That’s an important question!
IS IT POSSIBLE TO IMPROVE THE THROUGHPUT?
I am investigating this issue continuously in different assignments.
So far, I have identified the following mechanisms for increased throughput in the BI delivery:
The mechanisms are not listed in any ranking order. Below, I present each of them briefly.
The impact of EA is mainly in the area of standardization. There are business processes and procedures that are similar and, without a doubt, possible to harmonize. Harmonization leads to a kind of shared service, which in turn leads to faster delivery and lower resource capacity needs over time.
A dynamic organization has different delivery teams with different focuses. The focus can be a technology area, a business area, or a capability such as business demand management. Every organization has its own prerequisites and culture that drive this set-up. The idea is that specialization in certain areas will speed up deliveries. Another view of 'dynamic' is human resource capacity. It is important to distribute the delivery teams across different regional locations to secure the supply of competence. Contracting a service provider, with agreed service levels for scaling capacity up and down, is usually a valuable complement. Both these scenarios enable a human resource capacity that can more easily be adapted to the actual business demand.
Accelerators is the term for mechanisms, software or software components, that can be used to speed up the delivery and maintenance of solutions.
An example of this is IDA, the Insights & Data Accelerator developed by Sogeti in Sweden.
IDA is a tool where the DW model and its ETL procedures are defined in a metadata model. The processes to create and maintain the DW are automated through BIML, making them considerably faster and more efficient. IDA also simplifies and partly automates the documentation of BI solutions.
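The core idea behind metadata-driven accelerators like this is that table definitions and load logic are generated from a metadata model rather than hand-written. A minimal sketch of that pattern, with a made-up metadata format (none of the names below reflect IDA's or BIML's actual model):

```python
# Hypothetical sketch of metadata-driven DW generation: the table structure
# and a simple load statement are both derived from one metadata entry.

def generate_ddl(table: dict) -> str:
    """Generate a CREATE TABLE statement from a metadata entry."""
    cols = ",\n  ".join(f"{c['name']} {c['type']}" for c in table["columns"])
    return f"CREATE TABLE {table['name']} (\n  {cols}\n);"

def generate_load(table: dict) -> str:
    """Generate a simple INSERT ... SELECT load from the staging source."""
    names = ", ".join(c["name"] for c in table["columns"])
    return (f"INSERT INTO {table['name']} ({names})\n"
            f"SELECT {names} FROM {table['source']};")

# Example metadata entry (illustrative names only)
customer_dim = {
    "name": "dim_customer",
    "source": "stg_customer",
    "columns": [
        {"name": "customer_key", "type": "INT"},
        {"name": "customer_name", "type": "VARCHAR(100)"},
    ],
}

print(generate_ddl(customer_dim))
print(generate_load(customer_dim))
```

The throughput gain comes from the fact that adding a new table is now a metadata change, not a new hand-built ETL package.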
Business in the driver’s seat
Many BI development processes are based on a model where the BI delivery organization does most of the work and drives a large number of iterations with the business to find the solution. This interaction takes time, and there are usually misunderstandings along the road. Developing the business's capability to work with the demand and drive the design will increase the throughput of the developments dispatched to the delivery center.
Self-Service is the ability of the business to design and implement BI reports.
If the business gets access to its data in an easy and authorized way, and can design and implement its own reports, it will make a huge impact on delivery capacity.
Self-service will also drive a deeper business understanding of the data, which in turn will drive better and more precise demands. In some cases, business people can do their own basic specifications with a tool like Power BI, if the data is made available. The data does not have to come from a DW initially; it can be a dedicated data set and structure prepared for the business for assessment and tests.
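Such a dedicated data set is often nothing more than raw transactions denormalized into a flat, report-ready shape the business can load into a self-service tool. A minimal sketch of that preparation step, with invented sample data:

```python
# Hypothetical sketch: preparing a small, dedicated data set for self-service
# analysis outside the DW. Raw transactions are aggregated into a flat,
# month/product-level structure suitable for a tool like Power BI.

from collections import defaultdict

transactions = [  # raw operational rows (illustrative)
    {"date": "2024-01-05", "product": "A", "amount": 120.0},
    {"date": "2024-01-07", "product": "A", "amount": 80.0},
    {"date": "2024-01-09", "product": "B", "amount": 50.0},
]

def prepare_dataset(rows):
    """Aggregate raw transactions to month/product level."""
    totals = defaultdict(float)
    for r in rows:
        month = r["date"][:7]  # "YYYY-MM"
        totals[(month, r["product"])] += r["amount"]
    return [{"month": m, "product": p, "total": t}
            for (m, p), t in sorted(totals.items())]

for row in prepare_dataset(transactions):
    print(row)
```

The point is that this kind of preparation can be done quickly and iterated with the business, long before the data is formally modeled into the DW.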
Master Data Management
In traditional Data Warehouse processes, a large portion of both work and procedures goes into building and maintaining dimensions, which are typically master data structures.
With MDM processes and a master data structure in place, the BI work is about capturing transactions and fitting them into the master data structures. MDM has the potential to considerably lower the amount of BI work and thereby increase capacity.
MDM can also host dimensions that otherwise are typically built separately into many applications. With these dimensions in place centrally, less work is needed to clean data, and changes to dimensions are handled in one place, which increases throughput in BI application management.
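With dimensions maintained centrally, the BI-side work reduces to resolving incoming transactions against the master structure. A minimal sketch of that conformance step (all names are illustrative, not any particular MDM product's API):

```python
# Hypothetical sketch of the MDM idea: dimensions are governed centrally as
# master data, and BI work shrinks to capturing transactions and resolving
# them against the master structure.

master_customers = {  # centrally governed: natural key -> surrogate key
    "ACME-001": 1,
    "ACME-002": 2,
}

def conform(transaction: dict) -> dict:
    """Resolve a transaction's customer to the master dimension key.

    Unknown keys are flagged with None instead of silently creating a new
    dimension row; in a real MDM process they would be routed to data
    stewardship for resolution.
    """
    key = master_customers.get(transaction["customer_id"])
    return {
        "customer_key": key,  # None means: route to stewardship
        "amount": transaction["amount"],
    }

fact = conform({"customer_id": "ACME-001", "amount": 99.0})
print(fact)  # {'customer_key': 1, 'amount': 99.0}
```

The dimension build-and-maintain effort that dominates traditional DW work is gone from the BI side; it lives in the MDM process instead.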
Today we usually speak about Business Intelligence as the old way and Artificial Intelligence as the new way. There should be no separation between the two. It is all about intelligence, and in most of the cases I have worked with lately, the two melt together. The data warehouse and the data lake are not two different things; they form a unified platform. The business also expects traditional reports and AI-oriented ones to be provided together.
Working with the complete perspective and scope provides invaluable synergies in the longer term, as well as faster time to market.
In part 1, I will elaborate a little further on the first mechanism, Enterprise Architecture. The other mechanisms will be covered in upcoming blog posts.
How can Enterprise Architecture increase the capacity/throughput in BI delivery?
Enterprise Architecture (EA) has a big influence on the long-term efficiency of intelligence delivery. Several perspectives can be taken into account under this label.
The first perspective is business harmonization. Harmonizing business processes at a common level, aligned with defined performance indicators and other reporting requirements, simplifies the design and delivery of BI solutions tremendously. Let's say you harmonize 5 processes into 1: one solution now covers what would otherwise have required five, a 400% increase in throughput within that scope.
The second important perspective, an obvious one, is Information Architecture: unifying information terminology and structures across the enterprise. This partly includes Master Data Management, which has a unified information structure as its foundation.
A third perspective is the Enterprise Information Architect's job: understanding what data is where, how it is structured, and what limitations exist in data structures and application accessibility.
These questions usually consume enormous amounts of time, so handling them from an EA perspective has very high value and clearly affects throughput, as the issues often surface in the delivery phase.
A fourth perspective is a unified and efficient platform, which requires EA thinking to be realized correctly: how to build the data warehouse, how to use the data lake, how to secure data authorization, how to set up BI portals with a combination of reports, dashboards and self-service, how to optimize the integration of all the tools at hand, and more.
I have seen many initiatives where the lack of EA has created structures that, from an EA perspective, seem ill-conceived. Of course, in most cases, the solution was good from a narrow perspective.
When working with BI and Information Management together, we cannot afford not to apply an EA mindset to the different initiatives. It will have a huge impact over time if done right…