The value of data, like the time value of money, decays over time. A constant influx of streaming data from social feeds, sales and other sources means that what was true a week or even an hour ago isn't necessarily true now. To truly capture the value of streaming data, enterprises must be able to leverage high-performance analytics to extract insights from data as it is collected, while there is still time to act on it.

What's Behind the Need for Speed?

A just-published Big Data Executive Survey by NewVantage Partners shows the "need for speed" is behind sharp increases in spending on big data initiatives. Of the Fortune 1000 firms surveyed, only 5.6% identified cost savings and operational efficiency as the primary drivers of big data investment. By contrast, a whopping 83.5% of respondents named factors relating to speed, insight and business agility as the primary reasons for investing in big data technologies and resources. Of this group, 46.5% of firms have invested, or are planning to invest, in order to increase speed and reduce time-to-insight.

The Race Against Time

Clearly, most enterprises believe that fast insights from big data will deliver significant advantages, enabling them to act faster and make better decisions while bringing new capabilities to market. A separate survey conducted by Accenture concurs. The 2015 report, entitled "Accenture, Big Success with Big Data: Executive Summary," revealed striking results:

- 89% of business leaders believe big data will revolutionize business operations just as the Internet did.
- 85% believe big data will dramatically change the way they do business.
- 79% agree that "companies that do not embrace big data will lose their competitive position and may even face extinction."
- 83% have pursued big data projects in order to seize a competitive edge.
It's time to gear up for this major market disruption and be prepared to win the race to extract faster and better intelligence for more informed decision-making. Unfortunately, large expenditures on technology alone will not drive the needed improvements in speed.

A Wrench in the Gears?

A quick scan of the traditional process on the left in the diagram below shows a few reasons why it takes so long to get insights and answers from data. Each step in the traditional process introduces significant cumulative delays, which put big data initiatives at risk of failure. The primary offenders are bottlenecks such as:

- Extensive time to clean and transform a variety of data types.
- Time-consuming data indexing operations required with any change in data.
- Sluggish analysis performance because of limited processing power.
- Limited access to parallel processing clusters or big vendor frameworks.
- Relatively slow networking speeds.

You Can't Manage What You Can't Measure

To understand how efficient an organization's data pipeline is, or is not, you need to look at overall mean time to decision (MTTD): a measure of the time it takes to capture and process data and deliver the insights needed to drive business decisions. This includes the time to prepare and move data to the analytics infrastructure, to transform and load data, and to analyze it. By mapping and measuring each step in the data pipeline, you can focus on eliminating the biggest delays from the process and accelerating MTTD.

Contact us to hear how Ryft dramatically compresses today's data pipelines to speed MTTD and deliver relevant insights in real time.
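As a rough illustration of the measurement idea above, MTTD can be estimated by timing each pipeline stage per run and averaging the end-to-end totals. The sketch below is a minimal, hypothetical example: the stage names (prepare_and_move, transform_and_load, analyze) and the timings are illustrative assumptions, not measurements from any real system.

```python
# Hypothetical sketch: estimating mean time to decision (MTTD) by summing
# the measured duration of each pipeline stage, then averaging across runs.
# Stage names and timings below are illustrative, not real measurements.

from statistics import mean

# Each run records the seconds spent in each stage of the data pipeline.
pipeline_runs = [
    {"prepare_and_move": 1200, "transform_and_load": 900, "analyze": 300},
    {"prepare_and_move": 1500, "transform_and_load": 800, "analyze": 450},
    {"prepare_and_move": 1100, "transform_and_load": 950, "analyze": 350},
]

def time_to_decision(run: dict) -> int:
    """Total elapsed time for one run, from data capture to insight."""
    return sum(run.values())

mttd = mean(time_to_decision(run) for run in pipeline_runs)
print(f"MTTD: {mttd:.0f} seconds")

# Averaging per-stage times shows where the biggest delays accumulate,
# which is where optimization effort should be focused.
for stage in pipeline_runs[0]:
    stage_mean = mean(run[stage] for run in pipeline_runs)
    print(f"  {stage}: {stage_mean:.0f} s on average")
```

Breaking MTTD down by stage, rather than tracking only the end-to-end total, is what makes the metric actionable: it points directly at the bottleneck to eliminate first.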