2014 was a big year for Hadoop security. After a number of acquisitions, some exciting tit-for-tat strategies between vendors, and the entry of established DAP (Database Audit and Protection) vendors into the Hadoop security market, clients now have a number of good choices for securing their big data in Hadoop for production. The available controls fall into four categories:
- Controls applied when data is captured.
- Controls applied when data is preserved.
- Controls applied when data is analyzed.
- Auditing, monitoring and assessment functions.
For structured data in Hadoop, security architects have regained the controls that they were used to in RDBMSs. For unstructured data, clients must resort to basic platform security and hope for the best. The 2015 version of my research note “Protecting Big Data in Hadoop” assesses the available controls and their maturity levels, and introduces them to security and risk management professionals who need to secure big data in Hadoop.
Though plenty of controls are now available, the market still disagrees on the best way to control the many paths data takes through Hadoop, whether North/South or East/West between the transforms and components. Some vendors push central security services; others push data encryption (or another transform) that would implicitly govern access to the data along all its paths.
I believe that my new report gives a very good overview and the guidance necessary to finally benefit from the new paradigms in production environments.