Next-generation database analytics help companies prepare for big data

Published: Friday, May 11, 2012

Database maintenance and upkeep are an essential part of the IT department's responsibilities. As more companies deploy virtualization, cloud computing and other technologies, data center infrastructures continue to experience disruptions that may or may not negatively impact an organization's mission-critical applications. With the age of big data fast approaching, companies that don't prepare their systems accordingly risk performance bottlenecks, one of the most common and damaging problems a database can face.

To mitigate the risks of introducing massive volumes of information into an organization, database administrators should consider deploying monitoring tools. These generally come in three models: day-to-day, ad-hoc and alert monitoring. Day-to-day monitoring observes performance on an ongoing basis, while ad-hoc monitoring lets database administrators check on the system's efficiency when something goes wrong. Alert monitoring lets decision-makers define threshold metrics that, when reached, notify system administrators of the event.
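
As a rough illustration of the alert model, the sketch below polls a few database health figures and notifies an administrator whenever one crosses its threshold. The metric names, thresholds and polling interval are hypothetical, and the collection and notification steps are placeholders for whatever statistics views and paging system a given database actually exposes.

# Minimal sketch of alert-style monitoring; metrics and thresholds are hypothetical.
import time

THRESHOLDS = {
    "active_connections": 400,       # hypothetical connection ceiling
    "avg_query_ms": 250,             # hypothetical latency ceiling
    "cache_hit_pct_min": 90,         # hypothetical minimum cache hit ratio
}

def collect_metrics():
    # Placeholder for querying the database's own performance counters.
    return {"active_connections": 120, "avg_query_ms": 310, "cache_hit_pct": 95}

def notify(message):
    # Placeholder for paging or emailing the on-call administrator.
    print(f"ALERT: {message}")

def check_once():
    m = collect_metrics()
    if m["active_connections"] > THRESHOLDS["active_connections"]:
        notify(f"connections at {m['active_connections']}")
    if m["avg_query_ms"] > THRESHOLDS["avg_query_ms"]:
        notify(f"average query latency {m['avg_query_ms']} ms")
    if m["cache_hit_pct"] < THRESHOLDS["cache_hit_pct_min"]:
        notify(f"cache hit ratio down to {m['cache_hit_pct']}%")

if __name__ == "__main__":
    while True:
        check_once()
        time.sleep(60)  # day-to-day monitoring would simply log these same figures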

As the big data phenomenon approaches, it is more important than ever that databases have monitoring and analytic tools capable of deciphering and making sense of large volumes of information. Unfortunately, many of today's in-house analytic tools are inefficient under big data workloads, according to an ITworld report.

In the past, a competitive advantage over rival firms could often be attributed to who a business knows rather than what it knows. The advent of big data changed all that, driving the need for database customization so systems can handle gigantic workloads and analyze vast quantities of information.

"There is broad consensus in most organizations that enterprise data, and perhaps more importantly, the ability to analyze large volumes or smaller subsets of data at will, in real time, are crucial business differentiators," a separate Database Trends and Applications report said.

ITworld noted that most databases are not designed to leverage in-house analytics, meaning that when information does need to be analyzed, it must be migrated from the storage warehouse to a separate evaluation system. As a result, queries can stall and bottlenecks can emerge that choke the entire system. This lag can also negatively affect business performance by slowing response times and, in turn, the decisions that administrators make.

By leveraging in-database analytic tools, organizations can increase the speed at which they decipher data to make informed decisions, identify new models for acquiring revenue and create new markets based on those evaluations, according to an IBM blog post by Bill Zanine.
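
The sketch below is one way to picture the difference, using an in-memory SQLite table with a hypothetical sales schema: the extract-then-analyze path pulls every row into the application before any math happens, while the in-database path pushes the aggregation into a SQL query and returns only the summarized result.

# Sketch contrasting extract-then-analyze with in-database aggregation;
# the table and data are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("east", 120.0), ("west", 75.5), ("east", 310.25)])

# Extract-then-analyze: every row crosses the wire before any computation,
# which is where the migration-induced bottlenecks described above arise.
totals = {}
for region, amount in conn.execute("SELECT region, amount FROM sales"):
    totals[region] = totals.get(region, 0.0) + amount

# In-database analytics: the aggregation runs where the data lives and only
# the summary comes back to the application.
in_db_totals = dict(conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region"))

assert totals == in_db_totals
print(in_db_totals)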

"In-database analytics will allow you to run models that used to take weeks in a matter of hours or minutes," Zanine asserted. "It just doesn't make sense to stick to the same monthly or quarterly segmentation or targeting cycles. Instead, you must plan to continually evaluate and respond to customer behavior."

In-database analytics also give businesses the ability to make decisions that were previously out of reach because the data needed to support them could not be analyzed. The solutions also let database administrators run deeper evaluations than traditional analytic tools allow, giving companies a competitive advantage over rival firms that lack the same innovative services.

"The big data market is expanding rapidly as large IT companies and startups vie for customers and market share," IDC business analytics solutions program vice president Dan Vesset said in another report. "For technology buyers, opportunities exist to use big data technology to improve operational efficiency and to drive innovation. Use cases are already present across industries and geographic regions."