Thinking about my previous blog post on our long-term customer in the oil and gas industry sent me down memory lane to other customers, this time one in building automation that I worked with in the early days of Objectivity.
This customer has a long history with Objectivity, during which it has grown through various corporate and technology acquisitions in several different countries. Although their automation system has evolved over time, Objectivity/DB has remained the persistent object store at the heart of it.
The system monitors networks of sensors for heating, air conditioning and ventilation, fire detection, intrusion, and other equipment. The main goals of the system are the following: protect both past and future investments; support expanding needs as new technology trends in information, communication and building automation emerge; and ensure compatibility across a wide range of equipment from many different suppliers. Deployments of the system range from single buildings to large distributed campuses and airports.
Information fusion has its foundation in data fusion as used by military and intelligence agencies, generally defined as the use of techniques that combine data from multiple sources and draw inferences from the combined information. This process yields inferences that would be less efficient, or impossible, to achieve from any single source alone.
Depending on the model used, there are several levels of assessment or refinement. As the fusion process goes through these different levels, the information is refined as more value is added. Information fusion can be defined as the process of merging information from disparate sources despite differences in conceptual, contextual and typographical representations, typically combining data from structured, unstructured and semi-structured resources.
The world is full of real-world objects (people, places, things) and relationships (knows, likes). Information fusion works with these real-world objects and relationships, and in the fusion process discovers new objects and relationships. These map naturally onto an object model.
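To make the idea concrete, here is a minimal sketch in Python (not Objectivity/DB's actual API; the `Entity`, `relate`, and `fuse` names are illustrative) of real-world objects and relationships in an object model, with a fusion step that merges records about the same object from two disparate sources:

```python
# A minimal, hypothetical sketch of an object model for information fusion.
# Entities stand for real-world objects; relationships link them by a verb.
from dataclasses import dataclass, field

@dataclass
class Entity:
    """A real-world object such as a person, place, or thing."""
    name: str
    attributes: dict = field(default_factory=dict)
    relationships: list = field(default_factory=list)  # (verb, Entity) pairs

    def relate(self, verb, other):
        """Record a relationship such as 'knows' or 'likes'."""
        self.relationships.append((verb, other))

def fuse(sources):
    """Merge records from multiple sources into one Entity per name.

    Each source refines the entity with additional attributes,
    illustrating how value is added at each level of fusion.
    """
    entities = {}
    for source in sources:
        for record in source:
            ent = entities.setdefault(record["name"], Entity(record["name"]))
            ent.attributes.update(
                {k: v for k, v in record.items() if k != "name"})
    return entities

# Two disparate sources describing overlapping real-world objects
hr_feed = [{"name": "Ada", "role": "engineer"}]
badge_feed = [{"name": "Ada", "building": "B2"},
              {"name": "Grace", "building": "B2"}]

people = fuse([hr_feed, badge_feed])
# Fusion discovered that two records describe the same person, and a
# downstream step can now assert a newly discovered relationship:
people["Ada"].relate("works_with", people["Grace"])
```

In a persistent object store, the entities and their relationship links would be stored directly, rather than being decomposed into rows and join tables.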
In my last blog, I covered the last ten years of working with a particular Objectivity customer. In this blog, I want to go back even further, more than 25 years, to the start of Objectivity.
Let me start by presenting a brief timeline:
The 1980s: Object technologies became popular, although even prior to that time, objects had been used in some significant projects. These technologies included languages, modeling, tools and databases.
1989: The Object Management Group (OMG) was formed to coordinate and standardize efforts among multiple organizations in different verticals, all trying to leverage the power of objects.
1996: The Unified Modeling Language (UML) was accepted by the OMG, unifying modeling methods from luminaries like Grady Booch, Ivar Jacobson and James Rumbaugh.
2005: A Task Force was set up within the OMG to bring together multiple tools in the Business Process space.
2011: The Cloud Standards Customer Council was created.
So suffice it to say that objects have been around for a long time and are still here.
Noticing that 2015 is already half over got me thinking about just how fast time flies. Case in point: almost ten years ago, I was working at Objectivity as a systems engineer for Fugro-Jason, now part of CGG, a geoscience company based in Paris, France, and one of the many long-standing solutions partners with Objectivity.
Today I’m delighted that I’m still at Objectivity, enabling CGG GeoSoftware to remain on the cutting edge of big data technology. Over these ten years the requirements of the application and the demands on the database have increased significantly. The number and accuracy of sensor-generated samples and readings have increased by six orders of magnitude, and the amount of data stored and processed has grown by three orders of magnitude.