
1TB sort done in 68 seconds by Google

This is three times faster than the previously known record.

For those who don’t know, Jim Gray et al. established a series of benchmarks, including the 1TB sort, in order to give database vendors a playground for honest comparisons. The results are maintained online. Here are the two related papers:

Google managed to sort 1TB in 68 seconds using their MapReduce infrastructure on 1,000 machines. They then attempted to sort 1PB of data on 4,000 machines. It’s interesting that at the 1PB scale one starts running into hard disk failure rates.
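To see how MapReduce lends itself to sorting, here is a minimal single-process sketch of a range-partitioned sort in the MapReduce style (my own illustration, not Google's implementation): the map phase emits each record as its own key, the shuffle routes keys to "reducers" by sampled range boundaries, and each reducer sorts its partition locally; concatenating the partitions in boundary order yields a globally sorted result.

```python
import random

def map_phase(records):
    # Identity map: for sorting, each record's key is the record itself.
    return [(r, None) for r in records]

def partition(key, boundaries):
    # Route a key to the reducer whose range contains it.
    for i, b in enumerate(boundaries):
        if key < b:
            return i
    return len(boundaries)

def mapreduce_sort(records, num_reducers=4):
    pairs = map_phase(records)
    # Pick range boundaries from a sample so partitions stay balanced.
    sample = sorted(random.sample(records, min(len(records), 100)))
    step = max(1, len(sample) // num_reducers)
    boundaries = sample[step::step][: num_reducers - 1]
    # Shuffle: group keys into per-reducer buckets.
    buckets = [[] for _ in range(num_reducers)]
    for key, _value in pairs:
        buckets[partition(key, boundaries)].append(key)
    # Reduce: each reducer sorts its bucket; ordered concatenation
    # of the buckets is globally sorted.
    return [k for bucket in buckets for k in sorted(bucket)]

data = [random.randrange(10**6) for _ in range(1000)]
assert mapreduce_sort(data) == sorted(data)
```

In a real cluster the buckets live on different machines and each reducer writes its sorted partition to disk, which is why the benchmark stresses both the network (shuffle) and the disks.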

Interesting stuff. I am looking forward to the paper.
