This is three times faster than the previously known record.
For those who don’t know, Jim Gray et al. established a series of benchmarks, including the 1TB sort, to give database vendors a playground for honest comparisons. The results are maintained online. Here are the two related papers:
Google managed to sort 1TB in 68 seconds using their MapReduce infrastructure on 1,000 machines. They then attempted to sort 1PB of data on 4,000 machines. It’s interesting that at the 1PB scale you start running into hard-disk failure rates.
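To get a feel for why disk failures become a factor at that scale, here is a rough back-of-envelope sketch in Python. All of the numbers (drives per machine, annualized failure rate, run length) are my own assumptions for illustration, not figures from Google’s post.

```python
# Back-of-envelope estimate of expected drive failures during a petabyte sort.
# Every number below is an assumption for illustration only.

n_machines = 4_000            # cluster size mentioned for the 1PB sort
drives_per_machine = 12       # assumed
afr = 0.03                    # assumed 3% annualized failure rate per drive
run_hours = 6                 # assumed run length in hours

n_drives = n_machines * drives_per_machine
hours_per_year = 24 * 365

# Probability that a single drive fails during the run (linear approximation).
p_fail_one = afr * run_hours / hours_per_year
expected_failures = n_drives * p_fail_one

print(f"Drives in cluster: {n_drives}")
print(f"Expected drive failures during the run: {expected_failures:.2f}")
```

Under these assumptions the expectation comes out to roughly one failed drive per run, which is why a sort at that scale has to tolerate disk failures as a matter of course rather than as an exception.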
Interesting stuff. I am looking forward to the paper.