Conquering The Database Epidemic, Pt. 2: Where are we headed?

In Part 1 of this blog series, I looked at the fundamental principles behind all database technologies and the evolution of DBMSs as system requirements changed. In this concluding article, I’ll address the enormous changes in requirements that Objectivity is seeing and suggest some ways of attacking the problems that they are introducing.

The Rise of Big Data

Dramatically increased, and still growing, use of the Web has made it necessary for companies to gather and analyze a far wider variety of data types than ever before. They also need to store and process more data in order to garner business intelligence and improve operations. This has introduced an additional data generator: the data center and communications infrastructure itself, which can produce voluminous logs from multiple sources.

Many of these “Big Data” systems operate on huge volumes of relatively simple data, much of it requiring conversion, filtering or consolidation before it can be used for analytical purposes. In the early days, much of this new data was stored in structured files. Hadoop, with its MapReduce parallel-processing component and the scalable, robust Hadoop Distributed File System (HDFS), rapidly gained momentum, making it the most widely used framework for Big Data systems.
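
To make that filter-and-consolidate pattern concrete, here is a minimal, Hadoop Streaming-style sketch in Python. The log layout (a host, a status code, and a byte count per line) is a hypothetical placeholder, and the shuffle phase is simulated locally rather than run on a cluster.

```python
from itertools import groupby

def mapper(lines):
    """Filter: drop malformed records, emit (status_code, 1) pairs."""
    for line in lines:
        parts = line.split()
        if len(parts) != 3:
            continue  # conversion/filtering step: skip garbled log lines
        _host, status, _nbytes = parts
        yield f"{status}\t1"

def reducer(pairs):
    """Consolidate: sum the per-record counts for each status code."""
    keyed = sorted(p.split("\t") for p in pairs)  # stands in for the shuffle/sort
    for status, group in groupby(keyed, key=lambda kv: kv[0]):
        total = sum(int(count) for _, count in group)
        yield f"{status}\t{total}"

if __name__ == "__main__":
    sample_log = [  # hypothetical web-server log lines
        "10.0.0.1 200 5120",
        "10.0.0.2 404 310",
        "garbled-line",
        "10.0.0.1 200 2048",
    ]
    for out_line in reducer(mapper(sample_log)):
        print(out_line)
```

On a real cluster, the mapper and reducer would run as separate processes, with HDFS holding the input logs and Hadoop performing the sort-and-shuffle between them.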

Conquering The Database Epidemic, Pt. 1: Do we have enough of them yet?

In the year 2000, it seemed that database technology had matured to the point where changes were incremental, at least in the enterprise. Today, there is such a wide choice of database management systems (DBMSs) and other storage technology that it is hard to determine the best fit for a particular problem. Some wonder whether we have a surfeit of databases. In this blog series, I’ll look back at how we arrived where we are today, examine the new demands and challenges that we are facing, and suggest some lines of attack using Objectivity’s suite of database platforms.

Object…What?

Until the mid-1970s, most systems were built using functional decomposition. Object-oriented systems were introduced with a flurry of promises in the early ‘80s, many of which actually proved to be true, for once.

More recently, people have been talking about object-based systems, object stores and object-based file systems. In this article, I’d like to clarify the characteristics of each type of technology. Truth in advertising—there’s a lot of overlap, so I’ll try to smooth out the bumps in the ride.
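
As a first marker on that ride, here is a tiny Python contrast between the two styles mentioned above; the Sensor example and its names are invented purely for illustration.

```python
# Functional decomposition: data and the functions that act on it are separate.
def calibrate(readings, offset):
    return [r + offset for r in readings]

# Object orientation: state and behavior are bundled behind one interface.
class Sensor:
    def __init__(self, readings, offset):
        self._readings = readings
        self._offset = offset

    def calibrated(self):
        return [r + self._offset for r in self._readings]

print(calibrate([1.0, 2.0], 0.5))             # [1.5, 2.5]
print(Sensor([1.0, 2.0], 0.5).calibrated())   # same result, encapsulated
```

The behavior is identical; what changes is where the data lives and who is allowed to touch it, which is the crux of the distinctions between the technologies discussed here.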

A new approach to analyzing time-series data

These days, most large organizations have a plan for big data integration: collecting and analyzing their big data assets from many sources. For instance, e-commerce businesses have the tools to sort through CRM databases for order logs, customer correspondence, and delivery information, and can pair that data with historical weather records to assess how temperature affects when customers are most likely to order certain products, or how changes in the weather have historically affected delivery schedules.
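
The following Python/pandas sketch illustrates that kind of pairing: it joins a daily order series with a historical weather series on date and asks how the two move together. All column names and figures are invented placeholders.

```python
import pandas as pd

# Hypothetical daily order counts and historical weather, keyed by date.
orders = pd.DataFrame({
    "date": pd.to_datetime(["2024-01-01", "2024-01-02", "2024-01-03", "2024-01-04"]),
    "umbrella_orders": [3, 41, 5, 38],
})
weather = pd.DataFrame({
    "date": pd.to_datetime(["2024-01-01", "2024-01-02", "2024-01-03", "2024-01-04"]),
    "precip_mm": [0.0, 14.5, 0.2, 11.0],
})

# Join the two time series on date, then ask how rainfall relates to orders.
joined = orders.merge(weather, on="date")
print(joined)
print("correlation:", joined["umbrella_orders"].corr(joined["precip_mm"]))
```

A real integration would pull these series from the CRM database and a weather archive rather than inline literals, but the join-on-time-then-correlate shape is the same.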

Information Fusion and Data Integration: Fast vs. Batch

Almost any popular, fast-growing market experiences at least a bit of confusion around terminology. Multiple firms are frantically competing to insert their own “marketectures,” branding, and colloquialisms into the conversation, in the hope that their verbiage will come out on top.

Add in the inherent complexity at the intersection of Business Intelligence and Big Data, and it’s easy to understand how difficult it is to discern one competitive claim from another. Everyone and their strategic partner is focused on “leveraging data to glean actionable insights that will improve your business.” Unfortunately, the process involved in achieving this goal is complex, multi-layered, and very different from application to application depending on the type of data involved.