Blog

A new approach to analyzing time-series data

These days, most large organizations have a plan for big data integration (see Figure 1), that is, for collecting and analyzing their big data assets from many sources. For instance, e-commerce businesses have the tools to sort through CRM databases for order logs, customer correspondence, and delivery information. They can pair that data with historical weather records to assess how temperature affects when customers are most likely to order certain products, or how changes in weather have historically impacted delivery schedules.
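
As a rough sketch of that kind of integration, pairing order logs with weather records might look like the following. The data, column names, and temperature figures here are invented for illustration; a real pipeline would pull from the CRM and a weather archive rather than inline tables:

```python
import pandas as pd

# Hypothetical order log (e.g., a CRM export) and historical weather records.
orders = pd.DataFrame({
    "date": pd.to_datetime(["2015-01-05", "2015-01-05", "2015-01-12", "2015-01-19"]),
    "product": ["soup", "ice cream", "soup", "ice cream"],
    "quantity": [40, 5, 55, 3],
})
weather = pd.DataFrame({
    "date": pd.to_datetime(["2015-01-05", "2015-01-12", "2015-01-19"]),
    "temp_c": [2, -3, 1],
})

# Join the two sources on date, then relate demand to temperature.
merged = orders.merge(weather, on="date", how="left")
demand_by_temp = (
    merged.groupby(["product", "temp_c"])["quantity"].sum().reset_index()
)
print(demand_by_temp)
```

The interesting analysis starts after the join: once each order carries the weather on its date, a simple group-by already hints at which products sell in cold snaps versus warm spells.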

Making Spark Work for Next Generation Workflows

Introduction

You know that you are dealing with “Big” data when you can no longer use general-purpose, off-the-shelf solutions for your problems. Big data technologies are all specialized for specific sets of problems. Apache Spark™ is uniquely designed for in-memory processing of data workflows at scale. Currently, it is the most active open-source project for big data processing. One key strategy for extracting the most value from large connected datasets is the use of graph analytics to derive business insights.

Distributed graph databases also support analytics at scale, but they are specifically designed to store complex graph data and perform fine-grained real-time graph analytics. By using Spark for expressing and executing your workflows in conjunction with a distributed graph database, you can design and execute workflows suitable for the next generation of applications that exploit insights derived from complex graph analytics.
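
As a small illustration of the kind of insight graph analytics can surface, here is a sketch using the single-machine networkx library as a stand-in for a distributed graph database. The referral graph and customer names are invented; at scale the same kind of query would run against the graph store, with Spark orchestrating the surrounding workflow:

```python
import networkx as nx

# A toy "customer refers customer" graph. In production this connected
# dataset would live in a distributed graph database.
g = nx.DiGraph()
g.add_edges_from([
    ("alice", "bob"), ("alice", "carol"),
    ("bob", "carol"), ("carol", "alice"),
])

# PageRank as one example of a graph-analytics insight: which customers
# are most central to the referral network?
scores = nx.pagerank(g)
top = max(scores, key=scores.get)
print(top, scores[top])
```

The business insight is the ranking itself: centrality scores like these can feed back into a Spark workflow, for example to prioritize retention offers for the most influential customers.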

A walk down the memory lane of object data modeling

In my last blog, I covered the past ten years of working with a particular Objectivity customer. In this one, I want to go back even further, more than 25 years, to the founding of Objectivity.

Let me start by presenting a brief timeline:

The 1980s: Object technologies became popular, although even prior to that time, objects had been used in some significant projects. These technologies included languages, modeling, tools and databases.
1989: The Object Management Group (OMG) was formed to coordinate and standardize efforts among multiple organizations in different verticals, all trying to leverage the power of objects.
1996: The Unified Modeling Language (UML) was accepted by the OMG, unifying modeling methods from luminaries like Grady Booch, Ivar Jacobson and James Rumbaugh.
2005: A Task Force was set up within the OMG to bring together multiple tools in the Business Process space.
2011: The Cloud Standards Customer Council was created.
So suffice it to say that objects have been around for a long time and are still here.

Started from the Bottom, Now We’re Here: 10 Years of Innovation with CGG

With 2015 already half over, I got to thinking about just how fast time flies. Case in point: almost ten years ago, I was working at Objectivity as a systems engineer supporting Fugro-Jason, now part of CGG, a geoscience company based in Paris, France, and one of Objectivity's many long-standing solutions partners.

Today I’m delighted to still be at Objectivity, helping CGG GeoSoftware remain on the cutting edge of big data technology. Over these ten years, the requirements of the application and the demands on the database have increased significantly: the number and accuracy of sensor-generated samples and readings have grown by six orders of magnitude, and the amount of data stored and processed has correspondingly grown by three orders of magnitude.

Hard Data vs. Soft Data

Do you ever wonder which way home will be fastest today? I know this conundrum personally: although I have a few options for getting home (thanks to innovative apps like Waze), I know from experience that the shortest path from work is...
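
The "fastest way home" question is, at bottom, a shortest-path query over a road graph whose edge weights are travel times. Here is a minimal sketch using Dijkstra's algorithm; the road network, place names, and travel times are invented for illustration:

```python
import heapq

# Hypothetical road network: travel times in minutes between points.
roads = {
    "work":     [("highway", 10), ("downtown", 4)],
    "highway":  [("home", 12)],
    "downtown": [("bridge", 6)],
    "bridge":   [("home", 3)],
}

def fastest(start, goal):
    """Dijkstra's algorithm: cheapest total travel time from start to goal."""
    queue = [(0, start)]        # (cost so far, node)
    best = {start: 0}
    while queue:
        cost, node = heapq.heappop(queue)
        if node == goal:
            return cost
        if cost > best.get(node, float("inf")):
            continue            # stale queue entry
        for nxt, minutes in roads.get(node, []):
            new_cost = cost + minutes
            if new_cost < best.get(nxt, float("inf")):
                best[nxt] = new_cost
                heapq.heappush(queue, (new_cost, nxt))
    return float("inf")

print(fastest("work", "home"))  # prints 13: downtown -> bridge beats the highway
```

Of course, apps like Waze re-run this kind of query with live edge weights, which is exactly where the hard data (measured travel times) meets the soft data (drivers' reports).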

Information Fusion and Data Integration: Fast vs. Batch

Almost any popular, fast-growing market experiences at least a bit of confusion around terminology. Multiple firms are frantically competing to insert their own “marketectures,” branding, and colloquialisms into the conversation with the hope their verbiage will come out on top.

Add in the inherent complexity at the intersection of Business Intelligence and Big Data, and it’s easy to understand how difficult it is to discern one competitive claim from another. Everyone and their strategic partner is focused on “leveraging data to glean actionable insights that will improve your business.” Unfortunately, the process involved in achieving this goal is complex, multi-layered, and very different from application to application depending on the type of data involved.
