Link Details

By lucyc@sandsmedia.com
via reddit.com
Published: Dec 20, 2013 / 09:26

The Large Hadron Collider experiments manage tens of petabytes of data spread across hundreds of data centres. Managing and processing data at this scale required substantial infrastructure and novel software systems, built through years of R&D and extensive commissioning in preparation for the LHC's first data. The evolution of this global computing infrastructure, and the specialisations made by the experiments, offer lessons relevant to many commercial "big data" users. This talk examines the data and workflow management system of one of the LHC experiments, drawing out successes, weaknesses and organisational issues that have parallels in a commercial setting. Filmed at JAX London 2013.