Hadoop is helping to solve many Big Data problems through massively distributed processing across large numbers of machines. It creates real value for both business and engineering teams by allowing terabytes of data to be processed and analyzed quickly and efficiently, enabling near-real-time decision making. The scale and distributed model of Hadoop also create new challenges for Operations teams, which must be prepared to provision, install, configure, maintain, and scale large clusters quickly and reliably to keep pace with the demands of Big Data and the needs of the business.
We tackled this topic in a recent presentation at the LA Chef/Hadoop Joint Meetup: http://www.ustream.tv/recorded/33117271
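To give a rough taste of the kind of automation covered in the talk, a minimal Chef recipe for bringing up an HDFS DataNode might look something like the sketch below. The package, template, and service names are assumptions for illustration only; real Hadoop cookbooks are considerably more involved.

```ruby
# Hypothetical recipe: install and configure an HDFS DataNode.
# Package, template, and service names are assumptions and will
# vary by Hadoop distribution and platform.

package 'hadoop-hdfs-datanode'

# Render hdfs-site.xml from an ERB template shipped with the cookbook.
template '/etc/hadoop/conf/hdfs-site.xml' do
  source 'hdfs-site.xml.erb'
  owner  'hdfs'
  group  'hadoop'
  mode   '0644'
  # Restart the DataNode whenever its configuration changes.
  notifies :restart, 'service[hadoop-hdfs-datanode]'
end

service 'hadoop-hdfs-datanode' do
  action [:enable, :start]
end
```

Applying a recipe like this across hundreds of nodes with a configuration management tool is what makes provisioning and scaling large clusters repeatable rather than a series of manual, one-off installs.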