Making Data Easy

by Maxime Montreuil


For years, we worked on gathering more data; the focus was on having the best person enter it once and making it available across systems and reports in a timely manner. As time passed, we started asking for more details, pictures, forecasts, and so on, and we quickly reached a point where the time spent capturing that next bit of information wasn’t the best use of a foreman’s or plant operator’s time.

While we continued to improve the quality and timeliness of the information being reported, we started working on capturing outside data without user interaction. This included automated meters, GPS information, and Robotic Process Automation (RPA). The RPA bots allowed us to either scrape information from a website or simply process it and move it around.
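As a simplified illustration of what one of those RPA jobs does, here is a minimal Python sketch that scrapes a value from a web page and forwards it to another system. The URL, CSS selector, and destination endpoint are hypothetical placeholders, not our actual systems.

```python
# Minimal sketch of an RPA-style job: scrape a metric from a web page
# and hand it off to another system. All names here are hypothetical.
import requests
from bs4 import BeautifulSoup

SOURCE_URL = "https://example.com/daily-production"   # placeholder page
DEST_URL = "https://example.com/api/metrics"          # placeholder endpoint

def scrape_and_forward():
    # Fetch the page and parse the HTML.
    page = requests.get(SOURCE_URL, timeout=30)
    page.raise_for_status()
    soup = BeautifulSoup(page.text, "html.parser")

    # Pull the figure we care about (the selector is illustrative).
    tons = soup.select_one("#tons-produced").get_text(strip=True)

    # Move the data along without any human touching it.
    requests.post(DEST_URL, json={"tons_produced": tons}, timeout=30)

if __name__ == "__main__":
    scrape_and_forward()
```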

And then, sometime in 2017/2018, we reached the point where adding more data wasn’t useful. We reached data overload, with more data than people could digest. In fact, we had so much data that we could no longer act on it or react to it. We had overdosed on data and become data handlers rather than data consumers, spending hours every day prepping and massaging data instead of using it productively through data mining and predictive analytics. This is the point where making data easy, flipping the problem on its head, became a strategic axis for us, and we began looking at how to deliver data as a service.


As time has passed, we have gone through several iterations of technology to store and use that information: traditional data warehouses, data marts, multi-dimensional cubes, and Tabular models. We have created reports with SSRS, ReportsNow, and other tools, exported them to PDFs of 50 or more pages, and more recently built Power BI dashboards. But along the way we also became a bottleneck for our end users: to make the data useful, they needed ISS to create those reports and visualizations for them. To achieve the goal of making data easy and providing data as a service, we had to make it simpler to integrate data across our systems, providing APIs and automations that allow for bi-directional communication with our systems, and we needed to embrace newer technologies, architectures, and toolsets that let our end users help themselves.
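To give a flavour of that data-as-a-service idea, here is a minimal sketch of a read/write HTTP API built with FastAPI. The endpoint names and the metric payload are hypothetical, chosen only to show the bi-directional pattern, and are not our actual interfaces.

```python
# Minimal data-as-a-service sketch: one endpoint to read data out of a
# system, one to push data back in. All names are hypothetical.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

# Stand-in for a real backing store (database, warehouse, etc.).
metrics: dict[str, float] = {"tons_produced": 1250.0}

class Metric(BaseModel):
    name: str
    value: float

@app.get("/metrics/{name}")
def read_metric(name: str) -> Metric:
    # Outbound direction: other systems and analysts pull data from us.
    return Metric(name=name, value=metrics[name])

@app.post("/metrics")
def write_metric(metric: Metric) -> Metric:
    # Inbound direction: field systems push new readings back to us.
    metrics[metric.name] = metric.value
    return metric
```

Served with something like uvicorn, a sketch like this lets downstream tools pull data on demand instead of waiting for a report to be built for them.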

Today, we embrace Microsoft’s Power Platform and Azure services to deliver best-in-class tools to our business. We are uplifting our backend replication and data warehouse strategy, taking full advantage of data lake, Databricks, and Data Factory solutions in Azure. This will give us better performance and maintainability, and allow us to scale up and down as the need for data changes. It also enables us to build better security models so that we can easily share data with our end users.
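As a rough sketch of the kind of job this architecture enables, the PySpark snippet below reads raw readings from a data lake and aggregates them into a curated table, the sort of work typically run on Databricks. The storage paths and column names are hypothetical examples, not our actual schema.

```python
# Sketch of a Databricks-style PySpark job: read raw files from the
# data lake, aggregate, and write a curated table. Paths and column
# names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("curate-production").getOrCreate()

# Raw zone: files landed by meters, GPS feeds, and RPA jobs.
raw = spark.read.parquet("abfss://raw@examplelake.dfs.core.windows.net/production/")

# Curated zone: one row per plant per day, ready for Power BI.
daily = (
    raw.groupBy("plant_id", F.to_date("reading_time").alias("day"))
       .agg(F.sum("tons_produced").alias("tons_produced"))
)

daily.write.mode("overwrite").parquet(
    "abfss://curated@examplelake.dfs.core.windows.net/daily_production/"
)
```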

In our new paradigm, we have new kinds of users, best described as data analysts or data scientists, sitting directly inside our business units. These users consume our data from angles that had never been explored before. Their needs differ from those of our traditional users: they need considerably more data, and better tools to analyze that volume. Our Azure data strategy transformation will support them better and grow with their needs.

We will also be using new tools, such as Power Apps and Power Automate, to support our digital transformation efforts in the most efficient way, adopting new development methods along the way.

We are rolling out Power BI dashboards left and right, giving users a simpler way to absorb large volumes of information through better visualizations, while providing all the details they need should they want to drill into a specific metric.

One of our next evolutions will be finding better ways to automate alerts through actionable intelligence. Having a KPI that turns dark red in a Power BI dashboard is great, but if nobody sees it in a timely manner, or it doesn’t provide the data needed to make a decision, we are missing opportunities to react. We need a platform that lets us and our users define thresholds and other triggers based on statistical deviation from the norm, delivered straight to their mailboxes and/or as pop-up alerts on their devices, letting them know that something has changed and needs their attention. By doing so, we’ll increase the chance that we act on an issue sooner, making us a more agile company.
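A minimal sketch of that kind of trigger, assuming a simple z-score rule over a rolling history; the metric, window, and threshold are illustrative choices, not a committed design:

```python
# Sketch of a statistical-deviation alert: flag today's value when it
# sits more than a few standard deviations from the recent norm.
# The metric, window, and threshold are illustrative choices.
import statistics

def check_for_alert(history: list[float], today: float,
                    z_threshold: float = 3.0) -> str | None:
    """Return an alert message if `today` deviates from the norm."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return None  # flat history, nothing to compare against
    z = (today - mean) / stdev
    if abs(z) >= z_threshold:
        return f"Metric deviated {z:+.1f} sigma from its 30-day norm."
    return None

# Example: 30 days of daily tonnage, then an unusually low reading.
history = [1250.0, 1245.0, 1260.0, 1255.0, 1248.0] * 6
alert = check_for_alert(history, today=980.0)
if alert:
    # In production this would go to email or a device notification
    # rather than stdout.
    print(alert)
```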

And then, like everybody, we are starting to dip our toes into the world of Machine Learning and Artificial Intelligence, figuring out ways to assist our users in their day-to-day tasks. This brings us closer to a world of computer-assisted decision making, where information is processed on the fly, raised to our business leaders, and accompanied by an automated assessment of the situation. This will create an environment where our people can take all of it into account and make the best decision for the future of Colas. The computer won’t replace the human in the final decision, at least not in the medium term, but it can help improve the understanding of the context.

If being a trailblazer who wants to revolutionize an industry that hasn’t yet seen much digital transformation, predictive analytics, or actionable intelligence, and that still relies heavily on people’s gut instincts, is something that makes you giddy, come join us and help us build this new world and change how the Engineering & Construction industry will operate tomorrow!

We have an exciting journey in front of us…