Admitting that I never understood how orange could be the new black is no big deal, as there are many things I do not readily understand. Regardless of whatever colours I've likely missed since then, I guarantee you that today's thing is PEC.
Privacy. Enhancing. Computation.
Don't stop reading there. Your initial response may be 'humbug, that's something for the academics and scribes, not useful for me today'. But hang on, because it isn't. PEC is a term I use not for a single technology, but for a group of modern, sustainable techniques and technologies that you do not want to miss out on.
A few years ago, when I thought I saw an upcoming trend, I discussed it with clients as 'privacy-preserving analytics'. I was wrong for two reasons. One, analytics is not the only relevant use case (though early adoption centred almost exclusively on that area). Two, 'preserving' would imply the privacy was already there. In most cases, PEC simply makes things better. For privacy, that is.
A few that have come across my desk since, in a wide variety of acronyms just to make you feel you're in the army: various forms of HE (homomorphic encryption, be it partial, fully, or over the Torus), as well as DiffPriv (differential privacy), generative-AI-based SynthData (synthetic data), FedML (federated machine learning), PAML (privacy-aware machine learning), sMPC (secure multi-party computation), ConfComp (confidential computing, also known as TEEs, trusted execution environments), TTP (trusted third parties), PIR (private information retrieval), ZKP (zero-knowledge proofs), and PSI (private set intersection).
PEC can be applied at the data level, the software level, or the hardware level. These techniques are not all brand new, nor are any of them confined to the academic realm. They are increasingly applicable, viable means to unlock new levels in the business game while respecting and enhancing privacy, and, as a result, finally deserving of a customer's trust.
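To make one of those acronyms concrete: here is a minimal, illustrative sketch of DiffPriv at the data level, a differentially private mean using the Laplace mechanism. The dataset, bounds, and epsilon value are hypothetical choices for the example, not from any particular product or library; a production system would use a vetted library rather than hand-rolled noise.

```python
import math
import random

def dp_mean(values, lower, upper, epsilon):
    """Differentially private mean via the Laplace mechanism.

    Each value is clamped to [lower, upper], so the sensitivity of
    the mean over n values is (upper - lower) / n. Calibrating
    Laplace noise to sensitivity / epsilon yields epsilon-DP.
    """
    n = len(values)
    clamped = [min(max(v, lower), upper) for v in values]
    true_mean = sum(clamped) / n
    scale = (upper - lower) / n / epsilon
    # Inverse-transform sample from Laplace(0, scale).
    u = random.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))
    return true_mean + noise

# Hypothetical usage: release an average customer age without
# exposing any individual record too precisely.
ages = [34, 45, 29, 61, 50] * 200  # 1,000 records
noisy_avg = dp_mean(ages, lower=0, upper=100, epsilon=1.0)
```

The point of the sketch: the analyst still gets a useful aggregate (the noise scale shrinks as the dataset grows), while each individual gains a formal, tunable privacy guarantee. That trade-off dial is what makes these techniques business-viable rather than purely academic.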
To get started, here are Three Critical Use Cases for Privacy-Enhancing Computation Techniques.
I’ll give you two spoiler alerts:
1) Very (very!) soon, you will see a study showing that over a third of your competitors are investing in this area for the coming years.
2) It’s a visual one, and looks as follows: