It’s no secret that the Oil and Gas industry is cautious and calculated when adopting new technology. There are more than a few reasons for this, but the most fundamental, in my opinion, are the absolute requirement for safe operation combined with the sheer inertia of investment in existing technologies. It’s the analog of turning an oil tanker – it’s going to take a while.
That said, when there is uptake of a particular set of technologies, the consumption and demand come at an astounding pace and scale. Without question, Oil and Gas is one of the largest industries globally, if not the largest. According to Wikipedia, six of the ten companies with the highest revenue in the world are Oil and Gas companies. Profit margin, on the other hand, especially in the current low-price oil environment, is another matter. The pressure from that side of the business is actively driving an increased appetite for new technologies that reduce costs.
What does the adoption of Big Data technology stacks mean for Oil and Gas? This, in part, is the topic of a joint presentation I’ll be delivering along with Brian Clark of Objectivity at the Strata + Hadoop World conference in San Jose, Calif., on March 31.
In my estimation, when the Oil and Gas sector turns to today’s Big Data platforms, it will push the limits of what has been done to date. For the moment the trend of slow technological adoption remains steady; in a time when Lambda architecture implementations have proliferated throughout Silicon Valley on the strength of their scalability, flexibility, fault tolerance, and price point, Oil and Gas seems to be fighting the tide. There are, however, internal and external factors propelling adoption faster than expected.
For one, geophysical data acquisition has been increasing at staggering rates for decades, requiring data scientists in the industry to push the technological envelope to glean a better understanding of the subsurface. Also, today’s digital oil field – the result of applying IoT technologies within established analytical geoscience – has brought benefits but also compounded the problem, producing torrential streams of data from myriad sensors designed to record and transmit 24/7. Processing these data volumes, in turn, yields orders of magnitude more data.
Like many other industries, the analytical sciences targeted at hydrocarbon recovery are drowning in a sea of unstructured data, with little hope of extracting full value from that data. Lambda provides a life preserver, of sorts, by giving these disciplines an opportunity to increase throughput while decreasing latency in a scalable, cost-effective manner, and can therefore close or even eliminate the current bandwidth gap. Having accomplished that, there is great potential to fuse the growing array of data sources into analytical workflows that can provide further insight.
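To make the Lambda idea concrete: a batch layer periodically recomputes complete views over an immutable master dataset (high throughput, high latency), a speed layer maintains incremental views over records that arrived since the last batch run (low latency), and a serving layer merges the two at query time. The sketch below is a minimal, hypothetical illustration in Python – the sensor names, record shapes, and class names are invented for this example and are not drawn from any particular product or the presentation itself:

```python
from collections import defaultdict

class BatchLayer:
    """Periodically recomputes a complete view (here: record counts per
    sensor) from scratch over the append-only master dataset."""
    def __init__(self):
        self.master = []   # immutable, append-only master dataset
        self.view = {}     # last fully recomputed batch view

    def append(self, record):
        self.master.append(record)

    def recompute(self):
        # Full recomputation: simple, fault-tolerant, but high latency.
        view = defaultdict(int)
        for rec in self.master:
            view[rec["sensor"]] += 1
        self.view = dict(view)

class SpeedLayer:
    """Incrementally counts records that arrived since the last batch run."""
    def __init__(self):
        self.delta = defaultdict(int)

    def ingest(self, record):
        self.delta[record["sensor"]] += 1

    def reset(self):
        # Called once a new batch view covers these records.
        self.delta.clear()

def query(batch, speed, sensor):
    """Serving layer: merge the stale batch view with the fresh delta."""
    return batch.view.get(sensor, 0) + speed.delta.get(sensor, 0)
```

The design choice this toy captures is the one the industry cares about: the batch layer trades latency for correctness and simplicity of recovery (just recompute), while the speed layer covers only the small window of recent data, keeping queries current without reprocessing everything.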
In the presentation at Strata + Hadoop World, Brian and I will frame the problem domain and present a scalable geoscience-specific application of Big Data technologies to address the analytical requirement at hand.
I hope to see you at the conference and look forward to sharing my thoughts on this subject!
Marco is the data model architect for CGG, a France-based, fully integrated geoscience company providing leading geological, geophysical, and reservoir capabilities to a broad base of customers, primarily in the global oil and gas industry. Since joining CGG in 2007, Marco’s focus has been developing scalable, object-oriented data models capable of meeting the data-management challenges of geoscience applications.