Recently my colleague Merv Adrian and I spoke with a top-four financial services firm about its upcoming Hadoop deployment. After several months of testing, the ten-node pilot was moving to production within a few weeks. We asked whether any consulting firms or systems integrators had helped with the project; the response was that Hadoop skills were simply unavailable.
If a top-tier finserv company can’t find Hadoop skills, how will other enterprises?
Hadoop is a complicated beast. Commercial distributions frequently comprise more than two dozen projects, each with its own pace of development, version incompatibilities, and bugs. Just getting Hadoop running – let alone processing data – is a substantial achievement requiring a variety of skills. Actually doing something useful with the data requires another set of uncommon, if not rare, skills.
Enterprise adoption of Hadoop will stall in 2015 unless Hadoop becomes drastically simpler to deploy and manage or the pool of available talent grows far faster than it has to date. With new components being added almost daily (Spark, Flink, Hama, etc.), the former doesn’t look possible. The availability of Hadoop skills must increase, and increase rapidly, for meaningful adoption to occur.
What are you doing to develop Hadoop skills at your company? Let me know in the comments.