Over the last few decades, organizations have turned to a dizzying array of IT systems to capture and store data. In sectors as diverse as retail, manufacturing, healthcare, government, education and financial services, the desire to gain insights into a vast array of activities and events is increasingly at the center of operational success.
Yet, as organizations amass repositories tipping into the petabytes and attempt to pull data elements from countless disparate sources, making sense of it all has never been tougher. Enter big data — today’s big IT buzzword.
Amazon Web Services defines big data as any amount of data that’s too big to be handled by one computer, while Wikipedia deems it data sets with sizes beyond the ability of commonly used software tools to capture, curate, manage and process within a tolerable elapsed time.
Regardless of the definition, big data represents a resource from which to extract meaning and drive better decision-making — all from the proliferation of digital bits and bytes that fill servers, hard drives, mobile devices and websites. Yet, as organizations turn to big data to spur innovation and gain a competitive edge, many are finding that they face big challenges.
For one thing, there’s the sheer volume of data that organizations must wade through. Consulting firm Gartner Inc. reports that enterprise data growth rates now average 40 percent to 60 percent annually. For another, data increasingly resides in systems and pools that are difficult to access and put to use.
And the task isn’t getting any easier. In many cases, organizations are attempting to tap into transactional data extending back decades. In addition, they’re coping with mountains of unstructured data — including geolocation information, audio and video, photos, websites, voicemails, messaging streams, data from machine sensors and more — generated from a growing array of digital devices, including mobile phones and social media.
“Combining everything and making sense of everything is the challenge of the digital age,” says Gary Curtis, chief technology strategist and managing director at consulting firm Accenture.
Ultimately, how can organizations tap into big data to gain valuable insights? What type of approach is necessary to produce real-world results?
“The biggest challenge is finding a way to cut through the sheer volume of data and the inherent complexity of different databases and unstructured systems,” states Scott Schlesinger, vice president and head of North America Business Information Management at consulting firm Capgemini. “A well-conceived strategy and the appropriate tools for conducting advanced analytics and modeling are critical.”
Of course, the path to progress can prove bumpy. Capgemini, in a 2012 report titled The Deciding Factor: Big Data & Decision Making, found that while 65 percent of respondents reported a need to make management decisions based on “hard analytic information,” significant obstacles stand in the way.
There is also some question as to whether enough skilled big data practitioners exist to make it work.
“While there’s much fuss made about the endless possibilities of what can be done with the streams of data garnered from the web, the hard truth is that few people know what to do with it,” notes Jamie Condliffe of Gizmodo.
In addition, many organizations fail to view data strategically and are unable to adopt a “big data culture.”
Organizations also struggle to make effective use of unstructured data for decision-making — particularly social media data — while structural silos inhibit a more integrated approach to data analytics.
A best-practice approach to big data means combining the right technologies and tools, building effective workflows and policies, finding talent that can tap into analytics and predictive analytics software, and applying the technology to delve beyond the surface and address real problems, notes Kalyan Viswanathan, director for Global Consulting Practices Information Management with Tata Consultancy Services.