How Graph Analytics Can Help the Finance Sector Innovate

Over the last decade, financial institutions have increasingly relied on data and data analytics to gain a competitive edge, as well as to minimize their exposure to risk and compliance issues. For instance, institutional investors apply sophisticated data analytics algorithms to historical and streaming data, looking for patterns and associations that indicate which stocks to buy and sell. According to the Wall Street Journal, algorithmic trading now accounts for roughly one-third of foreign currency exchanges. When it comes to their own business operations, financial institutions use data tools for tracking compliance, assessing risk, and detecting potential fraud or security breaches.

For these reasons, many institutions are now utilizing data analytics platforms to help them make informed decisions based on contextual analysis of historical and real-time data. An enterprise-class graph analytics platform is a vital addition to this toolset: where current methods generally rely on statistical pattern-matching, a graph approach can detect cause-and-effect chains that become apparent when transactions and stock prices are analyzed together.
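To make the idea of cause-and-effect chains concrete, here is a minimal sketch in plain Python. The account names, amounts, and the `transfer_chains` helper are all hypothetical, invented for illustration; a real graph analytics platform would traverse a persisted graph rather than an in-memory adjacency list, but the principle is the same: transactions form directed edges, and chains of transfers become paths.

```python
from collections import defaultdict, deque

# Hypothetical transaction records: (sender, receiver, amount).
transactions = [
    ("acct_A", "acct_B", 50_000),
    ("acct_B", "acct_C", 49_500),
    ("acct_C", "acct_D", 49_000),
    ("acct_A", "acct_E", 1_200),
]

# Build a directed graph: each account points to accounts it has paid.
graph = defaultdict(list)
for sender, receiver, amount in transactions:
    graph[sender].append(receiver)

def transfer_chains(start, max_depth=4):
    """Breadth-first search for chains of transfers originating at `start`."""
    chains = []
    queue = deque([[start]])
    while queue:
        path = queue.popleft()
        for nxt in graph.get(path[-1], []):
            if nxt in path or len(path) > max_depth:
                continue  # skip cycles and overly long chains
            chains.append(path + [nxt])
            queue.append(path + [nxt])
    return chains

chains = transfer_chains("acct_A")
```

A chain such as A → B → C → D of near-identical amounts is exactly the kind of multi-hop structure that statistical pattern-matching over individual transactions tends to miss.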


Conquering The Database Epidemic, Pt. 2: Where are we headed?

In Part 1 of this blog series, I looked at the fundamental principles behind all database technologies and the evolution of DBMSs as system requirements changed. In this concluding article, I’ll address the enormous changes in requirements that Objectivity is seeing and suggest some ways of attacking the problems that they are introducing.

The Rise of Big Data

The dramatic, still-growing use of the Web has made it necessary for companies to gather and analyze a much wider variety of data types than ever before. They also need to store and process more data in order to garner business intelligence and improve operations. The data center and communications infrastructure itself has become an additional data generator, producing voluminous logs from multiple sources.

Many of these “Big Data” systems operate on huge volumes of relatively simple data, much of it requiring conversion, filtering or consolidation before it can be used for analytical purposes. In the early days, much of this new data was stored in structured files. Hadoop, with its MapReduce parallel processing component and the scalable, robust Hadoop Distributed File System (HDFS), rapidly gained momentum, becoming the most widely used framework for Big Data systems.
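The map/shuffle/reduce pattern behind MapReduce can be sketched in a few lines of Python. The log records below are invented for illustration; in a real Hadoop job the map and reduce functions would run in parallel across HDFS blocks, but the data flow is the same.

```python
from collections import defaultdict

# Illustrative log lines standing in for records stored in HDFS.
records = [
    "ERROR disk full",
    "INFO job started",
    "ERROR network timeout",
]

# Map phase: emit (key, value) pairs -- here, one count per log level.
def map_phase(record):
    level = record.split()[0]
    yield (level, 1)

# Shuffle phase: group intermediate values by key.
groups = defaultdict(list)
for record in records:
    for key, value in map_phase(record):
        groups[key].append(value)

# Reduce phase: aggregate each key's values into a final result.
counts = {key: sum(values) for key, values in groups.items()}
```

This kind of conversion and consolidation of simple records is precisely the workload described above.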


Conquering The Database Epidemic, Pt. 1: Do we have enough of them yet?

In the year 2000, it seemed that database technology had matured to the point where changes were incremental, at least in the enterprise. Today, there is such a wide choice of database management systems (DBMSs) and other storage technology that it is hard to determine the best fit for a particular problem. Some wonder whether we have a surfeit of databases. In this blog series, I’ll look back at how we arrived where we are today, examine the new demands and challenges that we are facing, and suggest some lines of attack using Objectivity’s suite of database platforms.



This blog post examines signaling in recommendation applications as a use case for graph databases.

After reading books like “Freakonomics,” written by pop economics superstars Stephen Dubner and Steven Levitt, and “Outliers,” written by acclaimed New Yorker staff writer Malcolm Gladwell, you realize that economic theory can help you see the world from a different perspective. New insights using economic principles are helping to address questions that have previously seemed unanswerable. In his book “Who Gets What and Why: The New Economics of Matchmaking and Market Design,” Alvin Roth designs successful matching markets where desperate organ recipients find compatible organ donors and needy job seekers find companies looking for candidates with their skillset. Roth is opening up a new way of solving matching problems.

Roth discusses the need for natural signals of interest in successful market designs. This communication, which Roth calls “signaling,” is used to qualify potential matches and determine if the matches are desired (177). For example, a male peacock's bright display of color is no advantage in avoiding predators, but his plumage allows potential mates to identify him as strong and healthy. A signal, in other words, can be costly for the sender yet still help to qualify a match or determine its strength. The value of signaling is seen clearly through the use of online dating sites to meet people.
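A minimal sketch of how signals can qualify and rank matches, with invented participants and data. In Roth's terms, each participant may send only a few costly signals, so a signal carries real information about intent; mutual signals therefore indicate a stronger match than one-sided ones. In a graph database these signals would be edges between participant nodes.

```python
# Hypothetical signal data: who has signaled interest in whom.
signals = {
    "alice": {"bob", "carol"},
    "bob": {"alice"},
    "carol": {"dave"},
    "dave": {"carol"},
}

def match_strength(a, b):
    """Score a potential match: 2 = mutual interest, 1 = one-sided, 0 = none."""
    score = 0
    if b in signals.get(a, set()):
        score += 1
    if a in signals.get(b, set()):
        score += 1
    return score

# Rank candidate pairs by signal strength, strongest first.
pairs = [("alice", "bob"), ("carol", "dave"), ("alice", "dave")]
ranked = sorted(pairs, key=lambda p: match_strength(*p), reverse=True)
```

A recommendation engine backed by a graph database would traverse these signal edges directly, surfacing mutual-interest pairs before one-sided ones.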


Recapping 2015 at Objectivity

2015 was an exciting year for all of us at Objectivity, marked by three important developments: the continuing emergence of new technologies for Big and Fast Data, growing demand for Industrial IoT applications and solutions, and a series of major accomplishments within the company.

Over the last 12 months, we have seen a growing number of organizations elevate the strategic value of their data assets. The first generation of Big Data systems primarily focused on data ingest and batch analysis by leveraging more cost-effective scale-out and cluster computing. Now the emerging importance of Fast Data from various streaming sources, such as sensors, has brought about the recognition that tremendous competitive value can be achieved by narrowing the time gap between data coming in and actionable insights coming out.

The importance of narrowing this gap between data arrival and value realization is particularly great in applications around the Industrial IoT. From utilities and various manufacturing sectors to financial services, public safety and logistics, we witnessed increasing instrumentation of devices in the form of massive sensor networks generating volumes of streaming Fast Data. As the volume of this new data grows, the need to accelerate the sensor-to-insight cycle has become even more urgent.
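One common way to shorten the sensor-to-insight gap is to evaluate each reading as it arrives rather than in batch. The sketch below, with invented readings and a hypothetical `SlidingWindowMonitor` class, flags a value that deviates sharply from the recent rolling average, the kind of check a Fast Data pipeline runs inline on a sensor stream.

```python
from collections import deque

class SlidingWindowMonitor:
    """Flag a sensor reading that deviates sharply from the recent average."""

    def __init__(self, window_size=5, threshold=2.0):
        self.window = deque(maxlen=window_size)  # most recent readings
        self.threshold = threshold               # allowed relative deviation

    def ingest(self, reading):
        """Return True if `reading` is anomalous versus the rolling mean."""
        alert = False
        if len(self.window) == self.window.maxlen:
            mean = sum(self.window) / len(self.window)
            if abs(reading - mean) > self.threshold * max(mean, 1e-9):
                alert = True
        self.window.append(reading)
        return alert

monitor = SlidingWindowMonitor()
stream = [20.1, 20.3, 19.8, 20.0, 20.2, 95.0, 20.1]  # e.g. temperature readings
alerts = [r for r in stream if monitor.ingest(r)]
```

Because the decision is made per reading, the insight arrives within moments of the data, instead of after the next batch job.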


Automating the Indoors: How Objectivity is driving innovation in the Industrial IoT

My previous blog post about our long-term customer in the oil and gas industry sent me down memory lane to other customers, this time one in building automation that I worked with in the early days of Objectivity.

This customer has a long history with Objectivity during which it has grown through various corporate and technology acquisitions in several different countries. Although their automation system has evolved over time, Objectivity/DB has remained as the persistent object store at the heart of it.

The system monitors networks of sensors for heating, air conditioning and ventilation, fire detection, intrusion, and other equipment. The main goals of the system are to protect past investments while providing investment protection well into the future; to support expanding needs as new trends in information, communication and building automation technology emerge; and to ensure compatibility across a wide range of equipment from many different suppliers. Deployments of the system range from single buildings to large distributed campuses and airports.
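The compatibility goal above is typically achieved with a common device abstraction. The Python sketch below is purely illustrative and is not the customer's actual design or the Objectivity/DB API: every class, identifier, and status value is invented. It simply shows how a uniform interface lets a monitoring loop treat equipment from different suppliers identically.

```python
from abc import ABC, abstractmethod

class Device(ABC):
    """Common interface spanning equipment from different suppliers."""

    def __init__(self, device_id, location):
        self.device_id = device_id
        self.location = location

    @abstractmethod
    def read_status(self):
        """Return a normalized status string, regardless of vendor."""

class VendorAThermostat(Device):
    def read_status(self):
        return "ok"  # would wrap vendor A's proprietary protocol

class VendorBSmokeDetector(Device):
    def read_status(self):
        return "ok"  # would wrap vendor B's proprietary protocol

# The monitoring loop treats every device uniformly via the base class.
devices = [VendorAThermostat("t-1", "lobby"), VendorBSmokeDetector("s-9", "atrium")]
statuses = {d.device_id: d.read_status() for d in devices}
```

In the real system, such device objects would be persisted directly in Objectivity/DB as the object store at the heart of the application.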
