This is 3 times faster than the previously known record.
For those who don’t know, Jim Gray et al. established a series of benchmarks, including the 1TB sort, to give database vendors a playground for honest comparisons. The results are maintained online. Here are the two related papers:
Google managed to sort 1TB in 68 seconds using their MapReduce infrastructure on 1,000 machines. Then, they attempted to sort 1PB of data on 4,000 machines. It’s interesting that at the 1PB scale you start running into hard disk failure rates, as the rough calculation below suggests.
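To get a feel for why, here’s a back-of-envelope sketch. Only the 4,000-machine figure comes from the announcement; the disks per machine, annualized failure rate, and job duration are my own assumptions for illustration:

```python
# Rough estimate of disk failures during a 1PB sort run.
# Assumed figures (NOT from Google's post): 12 disks per machine,
# ~3% annualized failure rate (AFR), a 6-hour job.
machines = 4_000                # from the announcement
disks_per_machine = 12          # assumption
afr = 0.03                      # assumed annualized failure rate
job_hours = 6                   # assumed job duration

total_disks = machines * disks_per_machine
failures_per_disk_hour = afr / (365 * 24)
expected_failures = total_disks * failures_per_disk_hour * job_hours
print(f"~{expected_failures:.1f} disk failures expected during the run")
# With these numbers, roughly one disk is expected to die mid-run,
# so the sort has to tolerate disk loss just to finish.
```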
Interesting stuff. I am looking forward to the paper.