Sorry, But 1940s Compute Architectures Can’t Overcome Big Data Performance Bottlenecks

The von Neumann computer, developed in the 1940s, refers to a type of computer architecture in which instructions and data are stored together in a common memory. Computers using this design revolutionized “life as we know it” by providing an efficient appliance for compute-centric tasks. That’s why von Neumann computers—most commonly seen in x86 systems—still […] Read More

2015: The Year for Agile Data?

It’s no surprise—businesses are moving faster than ever before while, at the same time, data is being amassed at ever-greater speeds. It’s the story everyone is talking about right now. The other question being discussed is how exactly to reconcile the two. The answer: agile data. The idea of agile […] Read More

For High Performance in Big Data, Look Under The Hood

“Insanity is doing the same thing over and over again and expecting different results.” – Albert Einstein. For many, this quote captures how companies are solving data performance problems today. Got performance problems? Option 1: throw bigger servers at the problem. This is called scale-up; it works until you run out of conventional processors, memory, or disk you can fit on […] Read More