Why Edge Computing Is Here to Stay: Five Use Cases

Data loses its value when it can’t be analyzed fast enough. Edge computing and analytics can solve that challenge for enterprises ranging from oil and gas producers to banks and retailers. Security cameras, phones, machine sensors, thermostats, cars, and televisions are just a few of the everyday devices that generate data that can […]

In Big Data, Mean Time is Money

To begin to unlock the true value of their data and reduce Mean Time to Decision (MTTD), organizations need to eliminate their biggest processing delays, and that means new approaches to Big Data analysis and entirely new processing architectures. Here, RTInsights contributor Patrick McGarry explains why investing in platforms based on FPGA technologies will be critical to […]
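
As a rough, hypothetical illustration of the metric this post centers on (not taken from the article itself): if MTTD is read as the average gap between when data arrives and when a decision is made on it, it can be measured with a few lines of code. The function name and the event timestamps below are invented for the example.

```python
from datetime import datetime, timedelta

def mean_time_to_decision(events):
    """events: iterable of (data_arrival, decision_made) datetime pairs."""
    gaps = [decided - arrived for arrived, decided in events]
    return sum(gaps, timedelta()) / len(gaps)

# Two invented events: data arrived, then a decision was finally made on it.
events = [
    (datetime(2016, 5, 1, 9, 0), datetime(2016, 5, 1, 9, 42)),
    (datetime(2016, 5, 1, 9, 5), datetime(2016, 5, 1, 11, 20)),
]
print(mean_time_to_decision(events))  # 1:28:30 for these two events
```

Shrinking that average, rather than just collecting more data, is the "mean time is money" argument in a nutshell.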

Data Center Trends: FPGA Acceleration is the Answer to Enterprise Data Analysis Performance Challenges (Part 2)

Earlier, we discussed the challenges that plague Big Data initiatives and why the industry needs a dramatic shift in technology to meet the real-time needs of today’s businesses. x86 architectures can no longer keep up, and FPGA-based systems are emerging as the solution to data analytics challenges. FPGA-accelerated systems have […]

Data Center Trends: FPGA Acceleration is the Answer to Enterprise Data Analysis Performance Challenges

Driven by the relentless growth in the volume and velocity of data, there has been renewed interest in using FPGAs to execute operations in parallel for the real-time performance needed to drive decision making in the enterprise. The operators of the world’s largest data centers and leading search engines are accelerating their data analysis and […]
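
As a loose software analogy only (the post is about hardware, not this code): an FPGA evaluates many records at once in dedicated pipelines, whereas the sketch below approximates that data parallelism by fanning a simple search out across worker processes. All names and data here are hypothetical.

```python
from concurrent.futures import ProcessPoolExecutor

def count_matches(chunk, term="error"):
    """Count records in a chunk that contain the search term."""
    return sum(1 for record in chunk if term in record)

def parallel_count(records, workers=4):
    """Split the records into chunks and search them concurrently."""
    size = max(1, len(records) // workers)
    chunks = [records[i:i + size] for i in range(0, len(records), size)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(count_matches, chunks))

if __name__ == "__main__":
    log = ["error: disk full", "ok", "error: timeout", "ok"] * 1000
    print(parallel_count(log))  # 2000 matches, counted chunk-by-chunk in parallel
```

The appeal of FPGA acceleration is that this kind of "apply the same operation to every record" work maps onto hardware pipelines directly, without the process-spawning and data-copying overhead a software version pays.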

What Comes After Moore’s Law?

Last week we took a look at the demise of Moore’s Law. It’s no secret that workloads are becoming larger and more complex than ever, and traditional x86 hardware and scaled-out infrastructure cannot meet today’s demands for low latency, high compute capacity, and massive storage. In fact, as you can see in AMD’s graph below, data-driven […]
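
For context on what the "demise of Moore’s Law" gives up, here is a back-of-the-envelope projection assuming the classic rule of thumb that transistor counts double roughly every two years; the starting figure is illustrative, not from the post.

```python
# Moore's Law arithmetic; the two-year doubling period and the starting
# transistor count are illustrative assumptions, not measured data.
def projected_transistors(start_count, years, doubling_period=2):
    """Project a transistor count forward if doubling continued on schedule."""
    return start_count * 2 ** (years / doubling_period)

# A hypothetical 1-billion-transistor chip, projected ten years out:
print(f"{projected_transistors(1e9, 10):,.0f}")  # 32,000,000,000 if doubling held
```

When that doubling stalls while data volumes keep growing, the gap has to be closed by different architectures rather than by waiting for faster general-purpose chips.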

Sorry, But 1940s Compute Architectures Can’t Overcome Big Data Performance Bottlenecks

The von Neumann design, developed in the 1940s, is a computer architecture in which instructions and data are stored together in a common memory. Computers using this design revolutionized “life as we know it” by providing an efficient machine for compute-centric tasks. That’s why von Neumann computers, most commonly seen in x86 systems, still […]
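
To make that definition concrete, here is a toy, hypothetical sketch of the von Neumann model the post describes: instructions and data sit in one shared memory, and everything is pulled through a single fetch-decode-execute path, one word at a time. The instruction set and program are invented for the example.

```python
# Toy von Neumann machine: one memory holds both the program and its data,
# and a single accumulator processes whatever comes over that shared path.
memory = [
    ("LOAD", 4),    # address 0: load the value stored at address 4
    ("ADD", 5),     # address 1: add the value stored at address 5
    ("HALT", None), # address 2: stop
    None,           # address 3: unused
    40,             # address 4: data
    2,              # address 5: data
]

acc, pc = 0, 0
while True:
    op, operand = memory[pc]      # every instruction fetch crosses the same memory bus
    pc += 1
    if op == "LOAD":
        acc = memory[operand]     # ...and so does every data access
    elif op == "ADD":
        acc += memory[operand]
    elif op == "HALT":
        break
print(acc)  # 42
```

That single shared path between memory and processor is the bottleneck the post argues Big Data workloads keep running into.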