With 2015 already half over, I've been thinking about just how fast time flies. Case in point: almost ten years ago, I was working at Objectivity as a systems engineer for Fugro-Jason, now part of CGG, a geoscience company based in Paris, France, and one of Objectivity's many long-standing solutions partners.

Today I’m delighted that I’m still at Objectivity, enabling CGG GeoSoftware to remain on the cutting edge of big data technology. Over these ten years the requirements of the application and the demands on the database have increased significantly. The number and accuracy of the sensor-generated samples and readings have grown by six orders of magnitude, and the amount of data stored and processed has grown by a corresponding three orders of magnitude.

It’s no surprise that the oil and gas industry is under massive pressure to meet the world’s energy demands by discovering more hydrocarbon deposits with minimal environmental impact and in a commercially viable manner. The data needed to conduct these advanced analytics is far more than a few numbers in a spreadsheet: by sending sound waves deep into Earth’s subsurface, Statoil Canada, for example, captured more than two terabytes of information per day in the country’s first high-density, slip-sweep program in 2011. A large field can have tens of thousands of wells, with up to ten tools per well generating multi-dimensional logs and curves that equate to multiple gigabytes of data. Once the source data is processed, generating hundreds of new curves and other related information, those gigabytes turn into terabytes very quickly.

At that break-neck rate of ingest, however, it’s not just big data: it’s fast data too. Downhole monitoring, where permanent downhole gauge systems are installed to record continuously over the life of the well, is also gaining popularity. The oil and gas industry is recognizing that it was participating in the Internet of Things long before the term hit mainstream media. Fast data, such as streams from seismic sensors and other time-series sources, connects the world one touchpoint at a time and therefore demands real-time response. After all, you can’t drill for oil using outdated calculations.

CGG has understood the importance of combining big data with fast data since I started working with them a decade ago. Objectivity/DB, together with big data tools like Spark and HDFS, lets oil and gas companies like CGG build better images and richer understandings of Earth’s layers from these ever-larger datasets.

CGG originally chose Objectivity/DB for its ACID transactions, reliability, scalability, and performance in managing many different complex object types for many users, without the overhead of an object/relational mapping. A particular feature of Objectivity/DB is its support for inter-object relationships (also known as associations). This feature allows CGG to link the different object types for particular sets of wells, tools, and sensors, building and maintaining complex structures on top of the data that support many different query access patterns.
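To make the idea concrete, here is a minimal sketch in plain Java of the kind of well / tool / sensor object graph that such associations make easy to persist and traverse. The class and field names (Well, Tool, SensorCurve, and so on) are hypothetical, invented purely for illustration; the sketch uses ordinary Java references rather than Objectivity/DB's actual association API and does not reflect CGG's real schema.

import java.util.ArrayList;
import java.util.List;

// A well owns a set of downhole tools (one-to-many link).
class Well {
    String name;
    List<Tool> tools = new ArrayList<>();

    Well(String name) { this.name = name; }
}

// A tool belongs to one well (back-reference) and owns logged curves.
class Tool {
    String serialNumber;
    Well well;
    List<SensorCurve> curves = new ArrayList<>();

    Tool(String serialNumber, Well well) {
        this.serialNumber = serialNumber;
        this.well = well;
        well.tools.add(this);
    }
}

// A curve holds time-series readings from a downhole sensor.
class SensorCurve {
    String measurement;   // e.g. pressure, temperature
    double[] samples;

    SensorCurve(String measurement, double[] samples, Tool tool) {
        this.measurement = measurement;
        this.samples = samples;
        tool.curves.add(this);
    }
}

class Demo {
    public static void main(String[] args) {
        Well well = new Well("Well-042");
        Tool gauge = new Tool("PDG-7", well);
        new SensorCurve("pressure", new double[] {3012.4, 3011.9, 3013.1}, gauge);

        // One query access pattern: from a well, walk to every curve on every tool.
        for (Tool t : well.tools) {
            for (SensorCurve c : t.curves) {
                System.out.println(well.name + " / " + t.serialNumber + " : " + c.measurement);
            }
        }
    }
}

The point of the sketch is the shape of the data: once the links between wells, tools, and curves exist as first-class relationships, a question like "all pressure curves for this well" becomes a simple traversal rather than a series of joins.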

I’d love to share more about the exciting work that Objectivity and CGG have done as partners, but I don’t want to give away all our secrets just yet. You can find out more about the industry-changing innovations we’re accomplishing with our technologies by attending our upcoming webinar, “Drilling Insights Out of Big and Fast Data: The Changing Landscape of Leveraging Data in the Oil and Gas Industry” on July 29th at 10 a.m. PST.

If there’s one thing I have learned over ten years of working with CGG, it’s that technology only gets better with age, hopefully just like certain systems engineers who build it.



