Analysis

How better harnessing of data can improve project returns

Richard Corderoy, managing director of the Oakland Group, says that using machine learning to find historic trends and correlations can help monitor actual performance while a project is in progress.

According to the Project Management Institute’s 11th Global Project Management Survey, UK organisations waste an average of $162m (£132m) for every $1bn (£815m) spent on projects, and only 45% of projects complete on time. Why do so many projects deviate from the initial proposal?

Large-scale infrastructure programmes are inherently complex and bring significant challenges. Take, for example, the hundreds of people from multiple organisations working on a single project, many of them contractors responsible for very specific aspects. Combine this with teams changing several times over the course of the project and it is no wonder there is little continuity. It is hard to establish an organisational culture when everyone is working in different ways, and this is one of the main reasons for a disjointed approach, with little visibility and control over a project.

The power of data

Politicians love the “mega-project”, but by their very nature these projects are prone to change, largely because they involve so many stakeholders over such a long period. The one thing that remains constant across the project lifecycle is the creation of data.

Complex programmes are data-producing machines: each one holds vast amounts of corporate history that currently fills up databases but is rarely looked at. Being able to tap into this data and integrate the various systems can provide a new level of insight, allowing previous decisions to be understood and analysed.

Say, for example, the original team that made the decisions around risk leaves during the project. The new team brought in does not have the insight to understand why those decisions were made, or whether they were good ones in the first place. Using data to maintain the corporate memory helps close this knowledge gap and provides a foundation for assessing future risk.


Becoming data-centric

Though we know mega-projects generate a pool of untapped data, how exactly do you collect and harness the information it holds?

Many mega-projects comprise multiple interconnected smaller projects delivered by multiple teams, each with its own systems and databases. Data can be collected from a range of sources across the supply chain, whether scheduling systems, risk management systems, accountancy packages or safety-specific databases.

Once these core data sources have been identified, the data can be pulled from the various databases into a central system. The project number is used as the key that links the different systems together, creating interconnectivity across the whole project, as in the sketch below.
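As a minimal illustration of that linking step, the snippet below joins hypothetical extracts from a scheduling, risk and cost system on a shared project number using pandas. The column names and figures are invented for illustration and are not drawn from any real programme or from Oakland’s own tooling.

```python
import pandas as pd

# Hypothetical extracts from three source systems; in practice these would be
# pulled from the scheduling, risk and finance databases via their own
# connections or APIs.
schedule = pd.DataFrame({
    "project_id": ["P-1001", "P-1002"],
    "planned_finish": ["2024-06-30", "2024-09-15"],
    "percent_complete": [72, 40],
})
risks = pd.DataFrame({
    "project_id": ["P-1001", "P-1001", "P-1002"],
    "risk_score": [8, 5, 3],
})
costs = pd.DataFrame({
    "project_id": ["P-1001", "P-1002"],
    "budget": [4_200_000, 1_800_000],
    "spend_to_date": [3_900_000, 650_000],
})

# Summarise risk per project, then join everything on the shared project number,
# which acts as the key linking the separate systems.
risk_summary = risks.groupby("project_id", as_index=False)["risk_score"].max()
central = (
    schedule
    .merge(risk_summary, on="project_id", how="left")
    .merge(costs, on="project_id", how="left")
)
print(central)
```

The design choice here is simply that every source system carries the same project number, so a chain of joins on that field is enough to bring schedule, risk and cost views into one table.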

Using data analytics platforms, the collected data can then be converted into a “4D” data model, in which the data can be viewed all at once and from any point in time. Once this model has been created, it can identify patterns in the data: for example, it can recognise when projects have not been updated or closed, which could be causing delays across the entire programme. The outputs are then fed back into the core system so it can learn over time and provide more accurate predictions.
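The kind of pattern-spotting described above can be sketched with a simple, hypothetical rule-based check. A real analytics platform would be far richer than this, but the idea of flagging records that have gone quiet, or that report completion without ever being closed, looks something like the following.

```python
import pandas as pd

# Hypothetical project register with last-update timestamps and status flags.
projects = pd.DataFrame({
    "project_id": ["P-1001", "P-1002", "P-1003"],
    "status": ["open", "open", "open"],
    "last_updated": pd.to_datetime(["2024-01-05", "2023-08-20", "2024-02-28"]),
    "percent_complete": [72, 100, 15],
})

as_of = pd.Timestamp("2024-03-01")
stale_after = pd.Timedelta(days=90)

# Flag records that have not been updated recently, or that report 100%
# completion but were never formally closed; both patterns can hide delays
# elsewhere in the programme.
projects["stale"] = (as_of - projects["last_updated"]) > stale_after
projects["unclosed"] = (
    (projects["percent_complete"] >= 100) & (projects["status"] != "closed")
)

print(projects[projects["stale"] | projects["unclosed"]])
```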

How forecasting can help

Complex projects are just that: incredibly complicated. The dependencies and interactions between workstreams can be mind-boggling, and the knock-on consequences of delays and issues are equally hard to understand and react to quickly. As most project managers are naturally optimistic (as are most humans), it can take time to fully comprehend the implications.

However, making decisions based on sound data can provide a level of accuracy and objectivity that previously wasn’t accessible. Traditional forecasting models and approaches rely on individuals at multiple stages in the process, each making assessments on limited information.

Using machine learning to find historic trends and correlations makes it possible to monitor actual performance while the project is in progress. This allows risks and their potential impact to be identified more quickly, while steadily improving the quality of future estimates.
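As an illustrative sketch only, the example below trains a small regression model (here scikit-learn’s gradient boosting, an assumed choice rather than anything the article specifies) on made-up historic project features to predict schedule overrun, then scores a live project as its actuals arrive.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Hypothetical historic projects: [planned duration (months), open risk count,
# % design change] against the eventual schedule overrun in months.
X_hist = np.array([
    [12, 4, 5],
    [24, 9, 12],
    [18, 6, 8],
    [36, 15, 20],
    [10, 2, 3],
    [30, 11, 15],
])
y_overrun = np.array([1.0, 6.5, 3.0, 14.0, 0.5, 9.0])

# Learn the historic relationship between project characteristics and overrun.
model = GradientBoostingRegressor(random_state=0)
model.fit(X_hist, y_overrun)

# Score a live project as its actuals come in, so emerging risk becomes visible
# before it shows up in the reported schedule.
live_project = np.array([[20, 8, 10]])
print(f"Predicted overrun: {model.predict(live_project)[0]:.1f} months")
```

In practice the features, the volume of historic data and the choice of model would all be specific to the programme; the point is that each completed project feeds the next forecast.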

With overspends common in delivering these highly complex projects, poor forecasting can have serious impacts on the overall project budget. Professor Lukumon Oyedele, Chair Professor of Enterprise and Project Management at UWE Bristol, has stated that companies often plan for a profit of 10-15% but struggle to make a 2% margin.

Overall, this type of forecasting can improve the chances of a project finishing successfully in terms of risk, budget and planned schedule. Alongside this, the accuracy of these predictions means the time and financial contingencies set aside for uncertainty can be reduced, improving the overall returns on the project budget.
