The oil and gas industry error rate that big data could reduce

A recent report shows firms are spending lots of money to acquire information, but the mistakes in exploration, extraction and processing keep piling up. Use these key questions to drive an analytics agenda


Few phrases were more grating than “data is the new oil,” an adage bandied about far too often by IT industry pundits until they began to realize just how difficult it is to extract actionable insight from unstructured information. This could explain why, for companies dealing with actual oil and gas, the move toward big data analytics has been slow to begin.

According to a recent study from Molten Group, a London-based consulting firm specializing in the energy sector, it’s not that oil and gas firms are ignoring the potential of big data analytics. Molten says upstream divisions of the “supermajors,” the biggest firms in the sector, spend between US$1 billion and $3 billion a year on acquiring data. Like enterprises in so many other industries, they’re just not making the most of it.

It’s too bad because, as Molten partner and head of Global Energy Colin Frost says in the report, there’s an obvious goal on which any big data effort in oil and gas could be based: reducing the average error rate.

“A 10 percent error rate is deemed acceptable in the sector. However, I think we need to question if this level of error is appropriate,” he says. “Clearly the nature of exploration and production means there will always be an element of error. However, I think we can do better and over time reduce the error rate across the board. The key to producing better forecasts is better data and its management.”

Perhaps that’s easier said than done, but Molten’s study offers some very concrete ideas that could galvanize CIOs and IT departments who are interested in contributing more value to their organizations. While much of the report covers familiar themes in big data — you need the right governance model, bad data leads to bad outcomes, and so on — it also poses industry-specific questions for which a successful big data initiative could provide better answers. What is the next basin in which to invest exploration dollars? Where should oil and gas companies drill exploration wells? How can they optimize their production rates while also optimizing recovery factors? If they have to shut down an operation and start decommissioning, how much will it cost?

Of course, part of the effort around big data analytics is not merely collecting and storing unstructured information but having the channels available to convey meaning and insight across the organization. This is where unified communications, which can bring disparate modes of collaboration into a more integrated whole, becomes important. Molten’s study hints at this when it mentions the importance of people in the equation: “(Oil and gas companies need) a well-designed and highly effective organization that enables the business to make and execute decisions required to realize its strategic objectives.”

In other words, they need people to work together better. That’s what unified communications can do, and if those people are armed with the right data, the error rate could be reduced significantly. Maybe this is the sector that will show us that data isn’t necessarily the new oil, but the pipeline that fuels greater accuracy, greater value and greater success.

For more industry-specific advice, download ‘Create a Collaborative Network in the Oil and Gas Industry,’ an Allstream Industry Brief.

