Deus Ex Machina — Only as Good as Its Human Programmers

By Mike McGuire | September 27, 2013 | 2 Comments

My esteemed colleague Jake Sorofman’s recent post got my attention. Probably because I really like Jake’s writing, but also because I’ve been around tech for a while. And my dad’s an artist who literally only uses his eyes, hands and some paint to create. Stuff. Paintings.

So what really got me going was when Jake asked the questions, “…can automation commoditize the human genius now required to produce emotionally evocative content? Can this lightning in a bottle be canned and distributed at scale?”

Now we’re headed to that fork in the road where Jake and I part ways. He goes on to say that human emotions are easily manipulated and that, given certain well-known and well-understood forms of storytelling — his example of Hollywood is spot-on — a well-designed machine could spit out a convincing script. Perhaps. But what will happen is an exponential increase in derivative, formulaic content that, while satisfying a majority of consumers, ultimately works against both the audience and the creators. Why? As the content becomes more and more predictable, its value — and, from a marketer’s perspective, the value of the marketing messages that can be associated with it — erodes even faster than the machines can master reductive storytelling. Why? Because it becomes the norm. And all the norm knows how to do is replicate itself. Machines can certainly gain further insight through repetition and clever programming, but ultimately they will simply refine an existing model.

So to all the creatives of the world, I ask you not to ignore my esteemed colleague. Rather, as you contemplate the power of automation, also try to remember the words of noted philosopher (and shredding guitarist) Frank Zappa, who once said:

“Without deviation from the norm, progress is not possible.”

2 Comments
  1. 27 September 2013 at 10:12 am
    tom austin says:

    Mike,

    I sympathize, but you (and perhaps Jake) have missed the opportunity to consider whether deep-learning smart machines can master deviance and surprise. Those technologies wouldn’t follow a set of rules humans fed them. They’d analyze literature (and movies, etc.) on their own. If the environment reinforced novelty, you’d get novelty.

    Don’t be so quick to dispense with this threat.

  2. 27 September 2013 at 12:34 pm
    Mike McGuire says:

    Hey Tom . . . Thanks so much for taking the time to check out the post.
    I think there is so much AI can do to offload the acquisition and management of data — tasks that currently take humans hours/days/weeks to complete. But, and it’s a highly qualified, puzzled but… Where I get lost in your argument is that to master deviance and surprise, among other subjective interpretations of info/events, aren’t the root algorithms defined by humans with all their biases?
    And I’m curious why you refer to these developments as a threat?

Comments are closed.