All I'm really aware of are Google Proprietary Magic™ (motherfuckers, I want Colossus baaaaad), CFS from DataStax, Storm (sorta-not-really?), and Spark.
Sector/Sphere is what the Sloan Digital Sky Survey uses.
Instead of supplying map and reduce routines, you implement generic "user-defined functions" (UDFs). That gives you more flexibility in how the work is handled, and even if all you implement is map and reduce UDFs, it supposedly gets better performance than Hadoop.
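If the model isn't obvious, here's a rough sketch of the idea in Python, not the actual Sector/Sphere API (which as far as I remember is C++): the engine just runs an arbitrary UDF over data segments and routes whatever keyed output it emits, so map and reduce fall out as special cases.

```python
# Hypothetical sketch of the "generic UDF" idea -- NOT the real Sector/Sphere
# API. The engine hands each UDF a segment of records and groups whatever
# (key, value) pairs it emits for the next stage.

from collections import defaultdict

def run_udf(udf, segments):
    """Apply one UDF to every data segment; group its output by key."""
    routed = defaultdict(list)
    for segment in segments:
        for key, value in udf(segment):
            routed[key].append(value)
    return routed

# Map and reduce are just two particular UDFs under this model.
def word_count_map(segment):
    for line in segment:
        for word in line.split():
            yield word, 1

def word_count_reduce(segment):
    # here each "segment" is a (key, values) pair produced by the shuffle
    key, values = segment
    yield key, sum(values)

segments = [["the quick brown fox", "the lazy dog"]]
shuffled = run_udf(word_count_map, segments)           # map pass
totals = run_udf(word_count_reduce, shuffled.items())  # reduce pass
print(dict(totals))  # {'the': [2], 'quick': [1], 'brown': [1], ...}
```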
It's also designed to support distributing work over WANs. I think Hadoop really wants every compute node to be on the same LAN.
>I think Hadoop really wants every compute node to be on the same LAN.
Fuckin' A right it does. You should see the labyrinthine depths people descend to in order to scale Hadoop: sub-clusters of sub-clusters, rack-local clusters, ZooKeeper nodes all over the place.
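For anyone who hasn't been down that hole: "rack-local" usually means pointing net.topology.script.file.name in core-site.xml at a script that maps node addresses to rack paths. Roughly what one of those scripts looks like (the subnets and rack names here are made up):

```python
#!/usr/bin/env python3
# Sketch of a Hadoop rack-awareness topology script (the thing
# net.topology.script.file.name points at). Hadoop invokes it with one or
# more node IPs/hostnames as arguments and expects one rack path per
# argument on stdout. The subnet-to-rack mapping is invented for illustration.

import sys

RACK_BY_SUBNET = {
    "10.0.1.": "/dc1/rack1",
    "10.0.2.": "/dc1/rack2",
    "10.0.3.": "/dc2/rack1",
}

def rack_for(node):
    for prefix, rack in RACK_BY_SUBNET.items():
        if node.startswith(prefix):
            return rack
    return "/default-rack"  # Hadoop's fallback rack name

if __name__ == "__main__":
    print("\n".join(rack_for(node) for node in sys.argv[1:]))
```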
>You should write a page for it.

You aren't supposed to create pages for your own projects/products on Wikipedia; they should come from neutral parties.