by Andrew White | September 19, 2014 | 3 Comments
I stayed up last night until around 12.30am local time, around 5.30am Greenwich Mean Time/UK. I had to watch one of the most important votes of my lifetime; at least that is how I see it. The UK, the sixth largest economy in the world, a key ally of the US, permanent member of the Security Council, and doorstep to Europe, is perhaps one of the stronger speakers and ambassadors for freedom of the individual, civil law, property rights, and the great British banger. Scotland was, and is, part of the UK. After all is said and done, just over 55% of the votes went with “no” to exiting the Union. Whether they wanted it or not, the result will still mean more devolution for Scotland – even for the 55% that said they wanted to stay in the UK. More devolution will likely follow for other parts of the UK also.
It was almost unthinkable that Scotland would even have considered such a vote. Breaking up the UK would have triggered, or been a catalyst, for so many bad things for the West: The West needs to be strong, united, and organized as the enemies are legion and dynamic. Some of the possible genies that would have followed an exit of Scotland from the UK could have been:
- A shot in the arm for other, near-organized secession movements including Spain (Catalonia), Russia/Ukraine, Canada (Quebec), even talk in the US (Texas is never far removed). If the unthinkable proved possible in the UK, it is just as possible in Texas
- The UK political scenery. The Labour party would have lost a notable power base it needs in order to regain power in Westminster, and the UKIP v Conservative argument would have destroyed any chance of a rational, if establishment-oriented, response. The Conservatives would likely have won the next election, but in a more fractious environment, and would then have overseen a national vote that removed the remaining UK from Europe – just as Scotland voted to get into the Euro.
- The remaining Commonwealth would have been undermined. The remaining UK would have been less interested, and less able, to keep supporting the resulting group of nations that feel there is value in connecting with the old British Empire grandparent. As the Commonwealth continued to erode, nations around the globe, many small, would have less support from a large, significant ally.
- European security. The UK was never, as a whole, a central European federalist. The role the country plays is as a free-market counterweight to that movement – even if that role is currently under scrutiny and faces its own vote after the next general election. A “yes” vote would very likely have hastened the UK’s withdrawal from Europe and put at risk its membership of the Security Council and its contributions to NATO. With that, French federalists would have been left to fight it out with Germany, and Europe would have had little choice but to opt for a two-speed Europe. This would have hastened the pressure on the Euro and the demise of the European project at the fringes of Europe.
- The US and its relations with Europe would have changed. The US’s closest moral ally would have been downsized, and though the rhetoric would have implied little change, the reality is that the UK’s ability to hold up its end of the bargain, with the US and NATO, as well as operating as a global voice of reason, would have been diminished.
These genies have been put back into the bottle. Of course it is not all a bed of roses, nor should we have thought it would be – life itself isn’t like that. But the voice of reason has won through – finally – in a rare victory. The UK still has to vote on European membership in a year or so; France and the federalists still have to be managed and constrained; work with the US and NATO can continue reliably.
But when you hear Scottish folk blame their lot on the politicians in Westminster (as you did last night on TV), you have to wonder where they get their ideas from. Barmy politicians, with too much time on their hands, wove a clever, dangerous message. The WORLD is struggling to cope with our situation – it has little to do with one government. That is not to say that Westminster is without blame, and it does not let them off the hook for not listening to their citizens north of the border. But deindustrialization is a fact of life. Austerity is something we have to get used to. Getting on your bike is a good strategy. Life is tough. Let’s get on with it.
The real value of the vote, however, is that all I just noted above is moot. As the genie is pushed back into the bottle, Pandora’s Box is actually now open. More powers will be given to the Scottish parliament. As this goes, so Wales, Northern Ireland and England will follow. The UK’s constitutional monarchy will change. How it will change I don’t know – nor do many others, though there are a lot of ideas now spinning around in the UK. But change is coming – that is for sure. At least these changes will now be explored within, and under the auspices of, one UK. I just hope the politicians actually work together. I am not so sure. If you watched C-SPAN last night and saw how the various members from the different parties “argued” over each other, one wonders how they can come together. Some of the weirdos even claimed the others were racist – on live TV.
In a rare opportunity the rational outcome was realized. Let’s see how long it survives.
Category: Commonwealth European Union NATO Reason Scotland UK Tags:
by Andrew White | September 17, 2014 | 3 Comments
There was an awesome article in The Economist, US print edition, September 6-12th, 2014, entitled Free Exchange: Pardon the Disruption. The article’s subtitle suggests that organizational inflexibility is usually the reason behind the failure of a firm that succumbs to new forms of competition. Sound familiar? It should.
The article leads off with the views of Clayton Christensen, a professor at Harvard Business School. I am sure you remember reading your copy of The Innovator’s Dilemma. It remains a classic in my mind, though of late some folks have tried to pull it apart. The article suggests that most research looking at the size of firms and their capacity to innovate fails to detect a relationship between the two dimensions. This goes to one part of Christensen’s point – the other being the metrics that guide behavior and dictate “success” for larger firms (which actually demotivate the desire for new, smaller, disruptive innovations). New research from Rebecca Henderson, also at Harvard Business School, suggests that the return on research investment by old firms in fast-evolving industries is exceeded by the return on research investment by their younger rivals.
Rebecca Henderson’s idea is that firms are nothing more than information processing systems. She suggests in her work, and in other references cited in the article, that these information processing systems are organized in efficient ways that over time tend to lead to a hard wiring that prevents the change needed to react to disruptive (not just normal new) threats. From reading the article I get the feeling that her ideas and work seem to suggest that Christensen is still very much correct. Large firms get good at sensing stuff – just look at the amount spent on business intelligence. But do the sensors dull the mind and the organizational DNA from spotting a disruption that just does not make sense? That seems to be the case.
I love the idea that firms are information processing systems. Of course, so are we. And of course we all know that everything is based on information. Yet the theory of the firm, founded on the relationship between transaction costs and size, only goes so far in exploring the efficiency of processing certain kinds of information. If I can figure out a better way than the competition to understand a customer segment and fulfill a need, I might start up a firm and try to redraw the competitive boundaries. That is business-speak for “my ideas for information processing are different and I think I can make money on that idea”. That won’t help in selling a business case, but the connection is implied.
And Rebecca Henderson’s point seems valid. The older we and our organizations get, the more likely we are to set in stone what had previously been fluid, dynamic information flows. We start to budget for that acceptance. We tune our antennae to look for expected competitive signals – those that comply with our view of the world. We tend to avoid anything that changes, or threatens to change, that acceptance. And in that rigidity we create the perfect opportunity for a new threat, a new flexible competitor, a new disruptive rival.
Category: Competition Competitive Advantage Disruption Infonomics Information Theory Information Value Innovation Innovators Dilemma Theory of the firm Transaction Cost Tags:
by Andrew White | September 17, 2014 | Submit a Comment
Since I started on about equal outcome versus equal opportunity the other day, I was flabbergasted to note an article in today’s US print edition of the WSJ that explains how folks with a poor credit score, or a loan default, will actually get a discount on their mortgage rates beyond what others with great credit scores can get.
In “Banks Discount Mortgages“, the WSJ reports on the Neighborhood Assistance Corp of America, or NACA, another quango sanctioned by the US Department of Housing and Urban Development, focused on helping those with poor credit scores or loan defaults. I had never heard of this group before. It turns out they originate and distribute loans. They don’t require down payments, they don’t check borrowers’ credit scores, and they approve borrowers for mortgages as soon as 12 months after a default or other adverse credit event. As if that weren’t ‘discount’ enough, such borrowers can now get a 0.125% rate discount for the life of a 15-year fixed loan.
In the same newspaper there was an article (see “In California, a Novel Use of Eminent Domain Hits Headwinds” – offline: Mortgage-Relief Plan Hits Hurdles) that tells the story of a couple who purchased a house in California at the height of the bubble, with no money down. Over the next few years the giant loan was rolled into new, bigger loans to pay off a car and fix stuff. A final rollover completed the mess, and what started out as a $310,000 mortgage, already extended, ended up as a $423,000 sinkhole. I am left wondering: where in all this mess was responsibility left behind? Since house prices fell as soon as this family stretched themselves, why didn’t they hunker down and start to cut back? Why did they seem to go “all in”, so that now those who behaved sensibly are likely to bail out such behavior?
Back to the original point: discounted mortgages for the poor or high-risk among us. I think this is totally abhorrent. The goal is worthy; the means is the problem. It creates price distortions that confuse and make the market inefficient. It encourages irresponsible behavior. It punishes those of us, eventually, who save, who cut back, who don’t splurge on a fancy car or second holiday. It might even lead to a bubble that could catch us all out. Oh, didn’t we just go through that before?
I accept that there is no easy or simple way to reconcile increased property ownership, which adds to social stability and a more stable economy, with the lack of economic means for all. Rather than harp on about politically incorrect arguments for educating and encouraging moral upbringing in those that get into trouble, perhaps we should focus on growing the economy with the right kind of jobs. Then, in time, salaries will increase and those with low incomes and poor credit will be able to afford a nicer house. Housing and protection from the elements should be a minimum from the state. Fiddling the mortgage system (and regular taxpayers) such that it (and they/we) can be taken advantage of is not.
Category: Banking Banking Regulation Political Politics Tags:
by Andrew White | September 16, 2014 | 4 Comments
In last week’s US print edition of The Economist (Sept 6-12, 2014), the Technology Quarterly included an article called “The language of the internet of things“. This is an important topic and a helpful article. In a nutshell, the internet of things is growing like gangbusters, yet the chances that all the vendors and technologies that comprise it will talk the same language – meaning that they will all integrate easily and effectively – are slim to none. The issue is not a lack of standards, but too many of them.
I have had personal experience with this issue. In the late 1990s and early 2000s, the consumer goods and retail industry started to develop a standard to simplify how retailers and their trading partners define the products you and I purchase in stores and online. The data was not overly complex – it would define basic product information and nothing unique or secret. The data, once consistent and shared between trading partners, could help align business processes and work – removing complexity and allowing firms to focus on selling to you and me. It seemed, and still seems, an eminently good idea.
At the time Wal-Mart was not involved with this initial effort, then called UCC.net. Kmart was also a large retailer at the time, and was only just experiencing the financial difficulties that would, in later years, bring it down. Firms like P&G, Staples, HP, Dell, and others were working with Kmart on this and related standards. What was interesting is how Wal-Mart behaved.
Wal-Mart had been encouraging its own suppliers to use its data definitions for products when communicating with it. Wal-Mart was large enough that it was almost the industry standard itself. But only “almost” – it never was quite large enough. So its largest competitors, spearheaded by Kmart, led the push for a real, industry-wide, shared, neutral standard. Once the standards work had progressed for a while, Wal-Mart saw the writing on the wall. Why continue developing its own “standard” when the wider community had invested enough, as if on its behalf but not really, to develop a real standard? Wal-Mart canned its own investment and joined the broader industry standard effort. Wal-Mart determined that cooperating with competitors was more effective in this area than competing with the community over what should never have been, directly, a source of advantage.
The behavioral issues behind this are predictable, but technology people sometimes assume technology is the reason here. It is not. One has to understand the use and value of standards, all standards, to business, and where they figure in the competitive DNA of the firm. In this case UCC.net (now known as GS1) was focused on synchronizing common or near-public data between trading partners. The promise was that if this base-level data could be cleaned up and shared, more valuable information (e.g. trading-partner-specific data) could be exchanged using the same mechanism (standards extension and infrastructure). From Wal-Mart’s perspective this was the lowest common denominator. The wider community had come up with a workable solution, so it was perfectly rational for Wal-Mart to write off its own sunk costs and leverage the shared cost with the community. If something competitive later comes about as a result, being on the inside is far better than spending your own money fighting the wave.
Now back to the Economist article and the possibility of an internet of things-wide standard for “thing” communication. There are already several large standards in play, often formed around large centers of gravity such as technologies, key vendors, or partnerships. First, it is possible that they may converge on a lowest common denominator, but this will be costly.
Second, by agreeing on a lowest common denominator, the wider community of interest risks dumbing down the standard so that it is no longer able to deliver on the ultimate promise. The basic promise is efficiency and cost reduction; the ultimate promise is effectiveness and value. You only have to look at what happened with Global Data Synchronization to figure out how successful this is likely to be.
Human and financial motivators trump technological elegance. And technological elegance that delivers an advantage to business trumps all other technological elegance, whatever its cool factor. I would use that lens to predict where the standards for the language of the internet of things will land.
Category: Collaboration Competition Competitive Advantage Consumer Goods/Retail Global Data Synchronization (GDS) GS1 Internet of Things Retail Industry Standardization Tags:
by Andrew White | September 16, 2014 | 4 Comments
It is with great sadness that I wait for a vote on Thursday by the population of Scotland to see if the UK stays together. It is actually hard to imagine that such a vote could even take place. The idea that the UK might want to dissolve has no more crossed my mind than the idea that, feeling hungry, I would start by nibbling on my own arm. Really, to think that a politician could make a career out of such a move (I refer to Alex Salmond, First Minister, and divisive leader of the ‘yes’ campaign). I say divisive for I cannot see one positive reason for the body of Scotland to leave. Separated, both countries would be weaker. Sure, Alex gets to run his own country, but for what?
I have said on many occasions that as the world gets flatter, and communication speeds up, the more fragmented as a society we are becoming. Look at the period since WWII and see how first the British Empire broke up. Much of this was natural and good, for sure, and the Commonwealth acted as a magnetic binding for a more positive, contemporary union. But then central and Eastern Europe broke up; the USSR, and south-eastern Europe as well. The arguments for individual nationalism and devolution roll onward. We hear that Catalonia in Spain may push if Scotland votes in the affirmative. When will Quebec try again? When will Texas or California?
To think the UK might fall apart in my lifetime is heartbreaking. The end of empire began only just over 100 years ago. In 1914 a financially broken Britain had to concede economic control to the US. That in and of itself was not a bad thing. But the empire as a people was about to embark on a painful imperial experiment and readjustment. The core was, and is, the UK. Scotland is part of that Union. Because of that Union great things happened – not least the industrial revolution (a Scotsman invented the steam engine). Our Crowns came together. I feel a need to relay my Scottish connections: Scotland invented modern banking.
And to think the people of Scotland accept Alex Salmond’s point of view that Scotland would be economically better off without the Union? Really? Think about that for a moment. It is incredibly naive. Almost a dream, or a nightmare really. Yes, I know that there are no Tories north of the border, and yes, I understand the region is predominantly left-leaning and not happy with “Westminster”, but so what? There are regions around the world like this all the time. They don’t split off to create their own country. They get on their bike and make change happen. Better from within – since there is no chance when without. And it’s not as if there has been civil unrest between England, Wales and Scotland. Well, other than on the rugby field.
The polls have had their say. Some go with ‘too close to call’. Even a slim “no” vote will be hugely damaging. British politicians have promised new devolutionary powers to console the northerners with a hankering for power. What that means, no one knows, as reported in two fine articles in the US print edition of the Financial Times today.
I don’t understand how nationalist fervor can take over what is a rational structure that adds more value as a union. I am heartbroken that in just a few years of effort, a mad few can incite the excited many to break apart what took hundreds of years to forge.
Category: European Union Political Politics Scotland Scottish Vote UK Tags:
by Andrew White | September 15, 2014 | 2 Comments
I admit it, and if you read my blog, you know it. I am more of an equal opportunity guy than an equal outcome guy. I suspect President Obama is more of an equal outcome guy. But I was confused over the weekend. An article in the US print edition of the Wall Street Journal, titled “Relaxing a key college loan“, suggests that the Obama administration is moving to relax standards related to education loans for those folks with low credit scores.
I am confused since I do understand the end (increased access to a good education for more people) but I don’t understand the chosen means to that end. Why relax standards and saddle those with poor credit scores with more loans? This reeks of the engineering Clinton and Bush peddled with relaxed federal home loan standards that precipitated the financial crisis we are still enjoying.
I get the goal. There are other ways to achieve this end. Tax or education credits, perhaps matched to improving credit scores, to help encourage the right kind of socially desirable behavior (i.e. responsibility for your actions), is one suggestion. Even improved and free/subsidized public education, for the less well off, would work. This might even help keep the cost of education down through increased competition with those establishments that charge higher rates.
The Obama policies will generate several consequences:
- a loan bubble for those that can’t afford the loans in the first place
- ongoing and increased price distortion in the cost of education, leading to higher costs (as you decrease the differential, the higher-priced offers are likely to increase as a result)
- a likely increase in loan defaults among those that can ill afford such burdens
Tax or education credits are not ideal, but they would not have as many negative consequences as the currently proposed policy. Free or targeted and subsidized education is the best option though. I would not mind my taxes helping out there.
So even if President Obama has a different goal from mine, I still can’t see the logic of his proposed policy. It is as faulty as the same policies that tore down the US and then the global economy.
Category: Education Obama Personal Political US Government Tags:
by Andrew White | September 12, 2014 | 3 Comments
I just had the good fortune of spending a couple of days in Western Canada. I met with numerous end users spanning several industries, including the public sector (several different departments), mining, retail, manufacturing, and other regulated industries. I was speaking with a number of CIOs and other senior IT leaders concerning the need for an updated, business-relevant information strategy – especially given the increasing focus on digitization.
There were a number of themes and threads that arose over the couple of days that align, as it happens, with numerous other inquiries in the last few months. So I thought I would detail some of these for you.
There was one overarching theme that solidified in my mind: that of complexity. More specifically, how firms in general tend to continuously overlay new complex processes and rules atop what was already a complex business or organization. This ‘adding to complexity’ seems to be everywhere and is creating all kinds of perverse or unintended consequences. More insidiously, many feel that they are actually making things easier.
I feel that one reason we are “over-complicating” our processes and organizations is that senior management feels their firms are too big and hard to manage, and the natural reaction is to establish some lowest common denominator, in terms of process or analytics, to try to standardize how the masses are to behave. You see this in every area of behavior: sales, marketing, operations, service, fulfillment, and even finance.
It must be the case that some of us are high performers and others are low performers, and we all know the Peter Principle: it tends to be easier to promote an effective lieutenant to boss, and have them introduce atop the new organization new processes that should emulate the success that led to the promotion.
Well the results are often quite predictable:
- The very effective lieutenant becomes an ineffective boss, gets fed up and leaves, or worse, gets promoted again
- The other high performers get fed up with the ever-increasing bureaucracy, give up, and move on
- The low performers carry on as before, only with more credible claims to protection as “things take longer to learn”
I once read a management book on how complexity kills performance-based progress and how firms should seek to avoid and weed out complexity. This is indeed wisdom. But the disease is alive and growing, it seems.
There were several other smaller themes that popped up over and again, probably adding to the complexity I refer to above:
- How to get the business involved when IT is doing all the leading when it comes to (fill in your pet investment of the day: big data, master data management, business intelligence, content management, records management, etc.)? Many such examples were put to me. It seems many IT shops are pretty smart and do have an understanding of business needs, but the business is too far removed from what IT is doing ‘on its behalf’.
- How to get IT to slow down, since their ability to get investments approved is likely to prove their undoing? Too many IT shops have gotten funds approved for investments that have no real business outcome targeted. This manifests as a program in flight with no measurable outcome by which to declare victory, or even completion. How such programs and investments get approved in the first place I will never know. But IT needs to slow down and focus on the business more. All this investment in cool tools and new toys is OK, to a degree. But budgets need to show value, not activity.
- How to classify information assets and prioritize which are most important? This is a classic from the domain of information architecture. The question never really dies – it’s a fan favorite. But it is back in vogue, since so many of our new information investments, even the good and smart ones, require us to prioritize among all the noise that is the information in our organizations. What I found interesting too is that I just published a note on this topic specifically: Gartner’s Three Rings of Information Governance Help You Prioritize Different Types of Data
- How to start managing information in a more complex, heterogeneous, best-of-breed application landscape? And its sister: how to start managing information in a more process-standardized, even centralized, homogeneous (read: ERP mentality) application landscape? And the more interesting cousin: how to start managing information as my business shifts from one model (e.g. centralized/standardized) to the other (e.g. best of breed)?
All three of these questions are in fact different sides of the same three-sided die. What I find interesting, though, is how misaligned end-user goals are with the software and consulting vendors there to help them. They may appear aligned, but they are not. If they were, would we really organize our business applications and data the way we do today? I think not.
- What do we do if we are successful as a business, such as making a profit or growing fast, but we know we have issues in our data?
- How and where do we address the growing issues of information security, risk, and ethics?
- How can we (in IT) be successful if the business side does not have a usable set of business metrics driving their goals and deliverables?
There were more, and there are lots of discussion points to explore in this gold mine. If you are stopping by our upcoming IT Expo/Symposium, I’d love to chat with you and explore these and more. Hope to see you there in a few weeks.
Category: Bad Practices Best of Breed Big Data Business and IT Business Applications Business Case Business Drivers Business Outcome Business Process Content Management Dark Data Digital Business Digital Economy Digital Information Strategy Digital Workplace Digitlization Enterprise Information Management (EIM) ERP Gartner Gartner IT/Expo Information Advantage Information Architecture Information as an Asset Information Governance Information Innovation Information Organization Information Strategy Information Trust Information Value Innovation Master Data Management MDM Multichannel Integration Silver Bullet Symposium 2014 Tags:
by Andrew White | September 11, 2014 | 4 Comments
I was talking to a client the other day and this was their question:
“How can we do “MDM light” in/on our data lake, since we don’t have the time or money to do that “big MDM” thing?”
Of course, there is no such thing as MDM light. You either govern the data, or you don’t. But there are different ways to design the governance effort – who would design an overbearing governance process when the business wants agility?
It turns out that in this case MDM was misunderstood. Too many folks seem to forget that MDM is meant to be a minimalist program. It is only meant to focus on master data – which means there is data that is not in focus, such as application-suite shared data, or application-specific data. Of the data in a notable business application, probably no more than 5% of its structured data is master data. As such, I used several metaphors for master data and MDM and its narrow focus as a means of support for others to then add insight to related (non-master) data:
- Central stem of a Christmas tree (the ornaments would be the related, non-master data, along with the branches which might be application suite shared data)
- The spinal cord in the human body, with the limbs, muscles, veins and so on “hanging off” it in some fashion, acting as the other information types
And before you respond, yes, I know these are not ideal. The truth is that some data is more important than other data, and in many cases (not all, I admit), master data is among the more important. If I know how to relate master data in one data pool to master data in another data pool, I have a leg up on understanding the relationship between the related data and the master data in each pool. Thus, for a small investment up front, by adding some limited scope of information governance (in this case, MDM), I give my users the chance to make sense of the other stuff in the pools and lakes and tributaries. That is what MDM was meant to be all along!
Given that we are, on the whole, a lazy bunch, the idea that we limit our governance efforts to the data that offers the greatest bang for our buck seems to appeal. And in a data lake concept, that is true even more so.
By seeking to govern, as lightly as possible, only the central, core master data that can be used to link and relate pools and feeds of data, we leave users free to leverage the other data more quickly. In other words, only try to actively govern the 5% of the data in the lake, such that the various pools can be contextually related by each user as needed. That sounds like a plan to me.
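The idea above can be sketched in a few lines of code. This is a minimal, illustrative sketch only – the pool names, record shapes, and keys are all hypothetical, not from any real MDM product: the only actively governed artifact is a small master cross-reference (the “spinal cord”) that maps each pool’s local identifiers to one golden ID, letting a user join records across otherwise ungoverned pools.

```python
# Two ungoverned pools in the "lake", each with its own local customer key.
# (Hypothetical data for illustration.)
crm_pool = [
    {"cust_id": "C-17", "name": "Acme Ltd", "region": "EMEA"},
    {"cust_id": "C-42", "name": "Globex", "region": "APAC"},
]
billing_pool = [
    {"account": "A-901", "balance": 1200.50},
    {"account": "A-777", "balance": 89.99},
]

# The only actively governed data: a master index mapping each pool's
# local keys to a single golden master ID. This is the ~5% under MDM.
master_index = {
    "M-1": {"crm": "C-17", "billing": "A-901"},
    "M-2": {"crm": "C-42", "billing": "A-777"},
}

def resolve(master_id):
    """Relate records from each pool via the governed master index."""
    keys = master_index[master_id]
    crm = next(r for r in crm_pool if r["cust_id"] == keys["crm"])
    bill = next(r for r in billing_pool if r["account"] == keys["billing"])
    # Merge the related records into one contextual view for the user.
    return {"master_id": master_id, **crm, **bill}

print(resolve("M-1"))
```

The point of the sketch: neither pool is cleansed or restructured, yet because the small master index is governed, a user can contextually relate any pool to any other on demand.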
And just to end on an argumentative point: once the lake supports this information governance “layer”, it is no longer a lake and is more like a data store or warehouse with some active information governance.
Category: Data Lake Information Governance Master Data Management MDM Tags:
by Andrew White | September 3, 2014 | 3 Comments
A few months ago I noted that the brainy ones on CNBC were too removed from reality. They were discussing Twitch, and they could not understand why anyone would use it. They seemed unable to understand kids and gamers playing massively multiplayer online (MMO) games and watching others doing the same. They assumed non-gamers were watching: “but other people just watching a video feed?” No, they did not get it. I did, since my oldest son plays MMOs, as I do, but he is also a Twitch user.
Turns out that for my son, Twitch is, or was, all the rage. Players sometimes stream their games so others can watch and critique. Other gamers, that is. Other gamers he hopes to play against at a later time in the league he participates in. There are tournaments around the world that are streamed. I myself have listened to the fascinating audio of these tournaments, as my son watches them on his iPhone, during long drives. I have to admit, the American narrators are not a patch on the Danes or the Germans. The accent helps with the mania and the panic as one team smashes through the defense of another in League of Legends. There is a cadre of followers – gamers and their other halves, and wanna-be gamers.
Amazon acquired Twitch on August 25th for $970M. There had been a rumor just the week before that Google had acquired it. Within a week of the acquisition, Amazon started to fiddle with the rules around what can be streamed and what cannot. I can feel the ire from my son, and he tells me many serious gamers feel the same. They are already experimenting with alternatives, of which there are several. Amazon had better beware. They should hire my son to figure out how to grow the service, before they lose all the important gamers that made Twitch worth $1bn in the first place.
Category: Inc, Massively Multiplayer Online (MMO), Twitch
by Andrew White | August 29, 2014 | 4 Comments
I visited a client the other day and they wanted to talk about data lakes. Someone at the client, not at the meeting, had been promoting the concept of a data lake as an answer to the question we explored. Before I tell you what happened, let me update you on my “opening” position.
A few weeks ago my colleague Nick Heudecker and I published a note (see The Data Lake Fallacy: All Water and No Substance) on data lakes. The note called out what appeared to be missing from the vendor hype related to data lakes: the lack of any sustaining practice (or technology) to help persist value from re-use of the data in the lake. There IS value in mining information in a lake. But to assume that the IP and structure used to expose that insight and value persist in the data lake is wrong. A data lake does not persist them. In the jargon: “no information governance, no sustainable or repeatable value”. It seems to be good advice.
Not everyone agrees. Another colleague of mine brought this InfoWorld “review” by a “strategic developer” to my attention – see “Gartner gets the ‘data lake’ concept all wrong”. It claims we said that data lakes are not useful, and that somehow a large-scale, enterprise-wide, wall-to-wall governance effort is required. Apparently we were also touting proprietary technology. Since we said neither of those things (both claims are devoid of context, and a data lake is not sufficient in either case), I don’t even feel the need to respond. Had the piece responded to the main fallacy we called out, I would have. Truth is, if you don’t maintain any structure in the data you use, how on earth can someone who follows you get a leg up and avoid repeating your effort? Either way, the hype around data lakes continues apace.
So let’s go back to the meeting this week with the client.
This client has several established data warehouses, each with some successful, if local, information governance supporting analytics. The client had 17 or so data centers, each supporting one of these data warehouses. The business uses these 17 systems a lot and gets value from the data – they rely on what they get from them.
There was one question: can we use a data lake? However, we had to drill down to the REAL questions behind what was being asked. There were two real questions/desires:
- Can we reduce IT costs by reducing the number of data centers, and
- Can we increase synergy by supporting shared governance across the silos, as if we had a single, unified layer?
In truth this client wants to consolidate data centers and, quite separately, adopt a focused information governance program to sustain the common data that spans and connects the local insights for additional value. As far as I can tell, a data lake plays no role in either question. Yet it was being pushed by a vendor to one of the end users at this client.
The end user even spotted the fallacy themselves. They asked, “If we used a data lake, don’t we actually take steps backwards, in that we ‘lose’ all those currently siloed yet effective IP and governance frameworks?”
YES! A data lake by definition has a zero barrier to entry and so supports zero information governance. Any and all data is accepted, because it has no need to conform or relate to the rest of the data that already exists in the lake. If there IS a cost to enter, it is not a data lake. In contrast, a data warehouse or EDW has a higher barrier to entry. So why not go for a balance? In this case the user was right. A data lake would be a step backward.
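The barrier-to-entry contrast can be sketched in a few lines of Python. This is a deliberately simplified illustration, not any vendor's architecture: the warehouse path validates records against an assumed schema at ingest (schema-on-write), while the lake path accepts anything and defers all interpretation to each reader (schema-on-read).

```python
# Hypothetical sketch of "barrier to entry": a warehouse governs at
# ingest; a lake accepts everything as-is. Schema and records are
# illustrative assumptions, not from any real system.

REQUIRED = {"customer_id", "amount"}  # assumed warehouse schema

def warehouse_ingest(store, record):
    """Schema-on-write: reject records that do not conform."""
    if not REQUIRED.issubset(record):
        raise ValueError("record does not conform to the warehouse schema")
    store.append(record)

def lake_ingest(store, record):
    """Schema-on-read: zero barrier -- everything lands as-is."""
    store.append(record)

warehouse, lake = [], []
good = {"customer_id": "c1", "amount": 10}
bad = {"tweet": "hello"}  # unrelated to anything else in the store

warehouse_ingest(warehouse, good)
lake_ingest(lake, good)
lake_ingest(lake, bad)            # the lake accepts it without question
try:
    warehouse_ingest(warehouse, bad)
except ValueError:
    pass                          # the warehouse rejects it at the door
print(len(warehouse), len(lake))  # warehouse holds 1 record, lake holds 2
```

The trade-off the post describes falls out directly: the lake's low cost of entry is exactly what leaves every consumer to re-discover, record by record, what conforms to what.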
So why was a data lake being referenced? Perhaps this vendor is selling a form of data warehouse but wants to use the new silver-bullet name. My final recommendation to the client: forget the new names. Identify the real requirements (data center consolidation, and multi-warehouse information governance) and design the target architecture. If you really want a name for it, let’s chat again. But don’t use “data lake”, since it does not seem to fit.
Category: Data Lake, Enterprise Data Warehouse, Information Governance, Information Strategy