Notes and Video from the 2015 Hadoop Summit

Last month’s North America Hadoop Summit brought together developers, data scientists, data analysts, vendors, and more to learn how to leverage Hadoop within enterprise data architectures. The summit also provided an opportunity for attendees to hear from thought leaders across a number of verticals who are using Hadoop (and a myriad of other tools such […] Read More

Datacenter Trends: FPGA Acceleration is the Answer to Enterprise Data Analysis Performance Challenges (Part 2)

Earlier, we discussed the challenges that plague big data initiatives, and why there needs to be a dramatic shift in the industry and technology to meet the real-time needs of today’s businesses. x86 architectures can no longer keep up, and FPGA systems are emerging as the solution to data analytics challenges. FPGA-accelerated systems have […] Read More

Data Center Trends: FPGA Acceleration is the Answer to Enterprise Data Analysis Performance Challenges

Driven by the relentless and increasing volume and velocity of data, there has been a renewed interest in using FPGAs to execute operations in parallel for the real-time performance needed to drive decision making in the enterprise. The operators of the world’s largest data centers and leading search engines are accelerating their data analysis and […] Read More

Supercharge Big Data Success with New Analytics Architectures

Real-time analysis of a wide array of both person- and machine-made data streams is becoming integral to getting value from data. However, current infrastructures simply cannot process the velocity and volume of data these streams produce. Recently, we gave the presentation below at the Enterprise HPC forum to showcase how the new Ryft ONE can simultaneously analyze […] Read More

Sorry, But 1940s Compute Architectures Can’t Overcome Big Data Performance Bottlenecks

The von Neumann architecture, developed in the 1940s, is a computer design in which instructions and data are stored together in a common memory. Computers using this design revolutionized “life as we know it” by providing an efficient engine for compute-centric tasks. That’s why von Neumann computers (most commonly seen in x86 systems) still […] Read More

2015: The Year for Agile Data?

It’s no surprise: businesses are moving faster than ever before while, at the same time, data is being amassed at ever-greater speeds. It’s the story everyone is talking about right now. The other question being discussed is how, exactly, to reconcile the two. The answer: agile data. The idea of agile […] Read More