by Andrew White | March 12, 2014 | 2 Comments
Chess – the game of Kings.
A game of skill, strategy, and, at the highest levels, very little luck.
“In his book Chess Metaphors, Diego Rasskin-Gutman points out that a player looking eight moves ahead is already presented with as many possible games as there are stars in the galaxy. Another staple, a variation of which is also used by Rasskin-Gutman, is to say there are more possible chess games than the number of atoms in the universe.” 1
Chess must be the quintessential big data problem! And Deep Blue was tuned to exploit a computer’s ability to crunch through thousands, even millions, of moves, patterns and games in order to outsmart the world’s greatest chess champion.
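The core of that number crunching is brute-force game-tree search. Here is a toy minimax sketch of the general technique – an illustration only, not Deep Blue’s actual code, which ran on custom parallel hardware with hand-tuned evaluation functions:

```python
# Toy minimax: score a position by searching `depth` plies ahead.
# The game-specific pieces (evaluate, legal_moves, apply_move) are
# passed in as functions; any two-player game could plug in here.

def minimax(position, depth, maximizing, evaluate, legal_moves, apply_move):
    moves = legal_moves(position)
    if depth == 0 or not moves:
        return evaluate(position)  # static score at the search horizon
    scores = (minimax(apply_move(position, m), depth - 1, not maximizing,
                      evaluate, legal_moves, apply_move) for m in moves)
    return max(scores) if maximizing else min(scores)
```

With roughly 35 legal moves in a typical chess position, each extra ply multiplies the tree by about 35, which is why even modest lookahead explodes into the astronomical numbers Rasskin-Gutman describes.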
Man versus Machine. In 1996 Kasparov faced Deep Blue in the first of the “man versus machine” chess matches. Kasparov was victorious 4-2. It seemed that man was safe, for now. But a year later, after a lot of tuning of Deep Blue by some very smart programmers, there was a rematch. The world was stunned when the match ended in a victory for Deep Blue. In fact the manner of Kasparov’s defeat was also of considerable importance for the discussion of Man versus Machine. It went like this….
Game 1 May 3rd 1997
Kasparov was a known aggressor. He was also at the top of his game. He knew that Deep Blue would be better than before, but he did not know how or where this “better” would show. So for the first 14 moves by each player, no pieces were taken and, remarkably, Kasparov’s pieces had not moved beyond the third rank. Nothing had advanced to the middle of the board. Kasparov was prodding, testing, perhaps looking for a character.
Eventually a piece was offered by Kasparov. Would the computer take it? Yes. Then an exchange (i.e. like-for-like pieces) was offered and again the computer pounced. But Kasparov did not stick to one plan; he moved around the board, trying multiple different strategies and set pieces, and the lack of a single strategy seemed to confuse the computer. Professionals would later say that Deep Blue actually wasted time! At one point Deep Blue moved a bishop to one location, only to move it back to where it started a move later. This was not characteristic of a winning strategy.
Finally Kasparov set a trap; Deep Blue followed “the book” and entered. The door snapped shut: Kasparov had a couple of pawns threatening to queen, and Deep Blue was in trouble. At this point, move 44, Deep Blue made what the professionals at the time and since have claimed was a mistake. It seems Deep Blue had run out of longer-term strategies and instead played a “safe” move, not really threatening anything with its rook, at a time when its own position in the game was perilous. Two moves later Deep Blue resigned. Why did it waste its 44th move? What was it thinking? We now know that this seemingly insignificant “bug” (?) was actually going to bring Kasparov down, and hand victory in the entire match to Deep Blue. But at the time none of this was known, and Kasparov was happy – for now – 1-0 Kasparov.
The biggest data cruncher of the day, solving what is arguably one of the largest big data problems ever, had lost. Though it could think through every possible move, it was not able to appreciate the subtleties of an array of strategies that were designed not to win on their own, but to test out an understanding of how the computer played.
Game 2 May 4th 1997
This time Deep Blue teed off as white. The game developed a little more predictably as each player drew on numerous patterns (i.e. previous games played) proven over the years. Kasparov relied on his limited memory, enriched with innovation and change. Deep Blue could only create something ‘new’ once its algorithms predicted less-than-favorable outcomes from the predictable moves. For the computer, innovation was a last resort.
The first piece was taken after both players had moved 27 times. Another 18 moves each and it was all over. Deep Blue drew a resignation from the champion. However, on inspection, many other chess champions have since shown that Kasparov seems to have thrown the game! At move 43 Kasparov could have made a move that would have led to perpetual check. That would have been a draw. It seems Kasparov may have attributed more capability or “brains” to the computer than he should have. Perhaps Kasparov was overthinking? In the first game Kasparov had jumped from one recognized pattern to the next, throwing in, now and then, moves that didn’t conform to any book or game stored deep in Deep Blue’s memory banks. In game 2 Kasparov reverted to more formal play. Deep Blue had drawn Kasparov in and won. Interestingly, both games ended on turn 45.
The End Game
Games 3 (48 moves), 4 (56 moves), and 5 (49 moves) all ended in draws. Reports suggest that Kasparov continued to mix it up with his somewhat unorthodox use of moves that didn’t quite conform to what the books said should happen. Sometimes it worked; at other times it may have done as much damage to Kasparov’s own chances of building a winning position.
Miraculously, game six ended in only 19 moves! Deep Blue won the game, and the match, 3.5 to 2.5. Looking at the game myself, you might even think you could have kept up with both players, at least up until move 10. The board looked very strange and underdeveloped. Even at this stage, a weak chess player like me would assume, on looking at the board, that Kasparov’s troops were better placed than Deep Blue’s. But by turn 17 Kasparov was led into a trade of his queen for less important pieces. Two moves later, history was made.
Many have since written that Kasparov was tired. Apparently he may have assumed that the computer had human traits that meant it could keep up with his somewhat unpredictable moves. Kasparov played as if his opponent was human! In his efforts to confuse an opponent that could not actually ‘see’ his machinations, Kasparov may have forgotten how to play to win. Some writers even went as far as to say that Kasparov did not play as well as he could. That may be so too.
Deep Blue was a computer program designed, at this time, to beat Kasparov. It had lost to him in 1996, and IBM’s programmers were bent on improving Deep Blue for the rematch in 1997. Its horsepower was doubled. The amount of data available to it, representing every conceivable game of any import, was ready in its memory banks for matching, dissecting, reassembling, and so on. But Kasparov forgot it was a cold, unfeeling machine. We surmise now that the very idea of a bug existing in Deep Blue never entered Kasparov’s mind as Deep Blue made strange moves. Kasparov may have thought the computer was smarter than he was. And this led to mistakes, and a loss of confidence in his own capabilities.
Deep Blue was an amazing computer, backed up by an amazing group of very smart programmers. But even they have their limitations.
Man versus Machine OR Man and Machine
“Although I [Kasparov] had prepared for the unusual format, my match against the Bulgarian Veselin Topalov, until recently the world’s number one ranked player, was full of strange sensations. Having a computer program available during play was as disturbing as it was exciting. And being able to access a database of a few million games meant that we didn’t have to strain our memories nearly as much in the opening, whose possibilities have been thoroughly catalogued over the years. But since we both had equal access to the same database, the advantage still came down to creating a new idea at some point.” 1
So when competitors have similar capabilities or technologies (or in this case, sources of truth for possible winning chess strategies), the differentiator comes down to the human and their application of the (same or similar) technology. Thus what you do with the tools you have is far more important and impactful than the tool itself.
Beyond Man v Machine
“In 2005, the online chess-playing site Playchess.com hosted what it called a ‘freestyle’ chess tournament in which anyone could compete in teams with other players or computers. The surprise came at the conclusion of the event. The winner was revealed to be not a grandmaster with a state-of-the-art PC but a pair of amateur American chess players using three computers at the same time. Their skill at manipulating and ‘coaching’ their computers to look very deeply into positions effectively counteracted the superior chess understanding of their grandmaster opponents and the greater computational power of other participants. Weak human + machine + better process was superior to a strong computer alone and, more remarkably, superior to a strong human + machine + inferior process.” 1
Clearly this example shows how a pure strategy (i.e. the “best” anything) does not necessarily win. One might even surmise that in any long-cycle game or set of interactions (think of a business cycle), a mixed or hybrid (and adaptive) strategy should win out over any number of pure strategies. Purity will, eventually, give itself away. The toy simulation below illustrates the point.
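To see why purity gives itself away, here is a toy sketch in repeated rock-paper-scissors (purely illustrative, and nothing to do with Deep Blue’s internals): a pure strategy is quickly exploited by an adaptive opponent, while a mixed strategy cannot be read.

```python
# Toy illustration: pure vs. mixed strategies in repeated rock-paper-scissors.
import random
from collections import Counter

BEATS = {"rock": "scissors", "paper": "rock", "scissors": "paper"}
COUNTER_MOVE = {beaten: winner for winner, beaten in BEATS.items()}

def score(mine, theirs):
    return 0 if mine == theirs else (1 if BEATS[mine] == theirs else -1)

def play(strategy, rounds=1000):
    history, total = Counter(), 0
    for _ in range(rounds):
        move = strategy()
        # Adaptive opponent: counter my most common move so far.
        if history:
            reply = COUNTER_MOVE[history.most_common(1)[0][0]]
        else:
            reply = random.choice(list(BEATS))
        total += score(move, reply)
        history[move] += 1
    return total

pure = lambda: "rock"                       # a "best" single strategy
mixed = lambda: random.choice(list(BEATS))  # a mixed strategy
print("pure strategy score: ", play(pure))   # heavily negative: exploited
print("mixed strategy score:", play(mixed))  # near zero: unreadable
```

The pure player loses almost every round once its pattern is spotted; the mixed player breaks roughly even no matter how hard the opponent studies it.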
Thus information leaders need to think less of silver bullets and the best this or that. They would be better served by assembling a reasonable set of assets and investments and trying to optimize their application. The leaders that count will be those that figure out not the best technology, or even the best people, but the optimized balance of people (organization), process (strategy) and technology (capabilities).
The key points of this narrative:
- We need to think more of “man and machine” and forget concepts like “man v machine”
- Business leaders should not take, on blind faith, the outcome of their business system. There always needs to be some defined level of checks and balances (read: information governance)
- The winningest strategies will not be pure; they will tend to be adaptive and even mixed.
- Information can only offer possibilities; leaders and stakeholders realize the value
Notes and Sources
- The Signal and the Noise, Nate Silver, 2013.
- The Chess Master and the Computer, http://www.nybooks.com/articles/archives/2010/feb/11/the-chess-master-and-the-computer/?pagination=false
- Kasparov and Deep Blue, Bruce Pandolfini, Simon and Schuster, 1997
by Andrew White | March 5, 2014 | 2 Comments
They do happen: client (end-user) inquiries that are just priceless. I had a call to close out Tuesday’s busy schedule, and it took me back to my roots. The initial question centered on a wide-ranging conversation about future trends in how retailers will gather product data from various sources and share it through several or all channels (what some now call omni-channel). More importantly, the users’ perspective was that historical efforts were oriented around a retailer’s goal of piling inventory high and selling it. The future will be about how to interact with and influence prospects in a digitalized world.
This ended up a wide-ranging conversation. It was really rewarding for me since it was futuristic to the point of thinking aloud about how the retail/supply chain industry might be reordered if the information that underpins it changes in value and importance. For example, we explored:
- the role and unfulfilled promise of Global Data Synchronization (a form of digital commons)
- why is it that most firms limp along with dubious, at best, data quality?
- why is it so hard for enterprises to collaborate on joint business processes?
- what role and value do (information-centric) industry standards play, and why is leadership around them so lacking?
- what would it take to shift enterprises from their current fixation on enterprise business applications and associated information?
- what would it take for interested stakeholders to cooperate and even collaborate on shared multienterprise processes focused on sharing information?
- how to shift the industry forward to recognize the need to govern and share public, community-based information as well as private, partner-specific information, even in the same infrastructure architecture?
The dialog was not about big data or cloud or mobile. They were taken as environmental conditions. The conversation explored the role and nature of information that powers how an industry operates. That’s hot!
This was a conversation, not an inquiry where I answered all the questions. Jointly we poked around at the edges of reality. It is rewarding to hear from end users who have the right questions and who want to think through the answers. I wish more visionaries like this sat on the boards of the right consortia and standards bodies. Things might actually change for the better.
by Andrew White | March 5, 2014 | 1 Comment
The UK’s financial services sector has been known for its innovation. Ever since Thatcher’s ‘big bang’ deregulation, the sector has developed new products and services. Well, the UK is at it again. Monday’s US print edition of the Financial Times carried an article reporting that Britain is to scrap its tax on Bitcoin. Now technically this is not an innovation; Bitcoin happens to be one of the better-known virtual (digital) currencies. But the UK is innovating with respect to its regional competitors. Such a move will encourage the flow of Bitcoin funds through the UK financial services sector, which will act as a more significant clearing house and center of gravity.
Bitcoin has its challenges. It is at risk of dubious or even criminal money laundering, due to its innate ability to avoid traceability. But transactions should be innocent until proven guilty. And why would crooks use popularized virtual currencies when there are many less well-known ones available anyway? Bottom line: the UK still has mettle, and it can use it from time to time.
by Andrew White | March 5, 2014 | 1 Comment
Bernie Sanders’ article in Wednesday’s US print edition of the Wall Street Journal (see There’s no need to end Saturday mail delivery) does a disservice to the reader, and to himself. He claims the financial woes of the US Postal Service are manufactured by wanna-be capitalists who want to create an opportunity for private firms to fill a gap created by dismantling the US Postal Service.
Mr. Sanders refers to a change in the law in 2006 that required the US Postal Service to pay for its unfunded health and pension liabilities ahead of time. The idea is that when Posty retires, the funds for his or her retirement and medical costs will be there, ready, ahead of time. What a GOOD idea, Mr. Sanders! Or would you prefer to pay tomorrow’s bills with next week’s salary?
The reality is that even with these changes in 2006, the US Postal Service STILL has massive unfunded liabilities approaching $50bn (see http://www.gao.gov/assets/660/650511.pdf). Mr. Sanders claims that if the ‘prefund’ didn’t exist the US Postal Service might make $1bn a year. Oh, that’s good. At that rate it would take roughly 50 years to cover the existing liabilities; meanwhile new liabilities keep piling up, and there is no money left for investment. The GAO report linked above suggests the US Postal Service may never even meet its current liabilities, let alone any new ones. Forget about investment.
The article is pure spin. It should not be allowed, and Mr. Sanders shows his true political colors. The US Postal Service should have been deregulated years ago and allowed to invest in new and innovative services to develop its own business. For too long government types have tied its hands and prevented innovation. Way to go, Mr. Sanders. Why don’t you move over and let someone else try running the place? We need younger people to run our government who will be around long enough to experience the fruits of the very policies they enact. We need senators with skin in the game, not fingers in the pie.
by Andrew White | February 27, 2014 | 4 Comments
I have to say I am amazed at the way organizations understand their customers. Here I am, halfway around the world, tired and ready for bed, and I want to sit down and watch a program I have paid for. I refer to last weekend’s The Walking Dead. I have Xfinity at home – it’s a GREAT Internet service (in my area) – far, far faster and more stable than my last provider, AT&T Uverse. However, it seems Comcast monitors the IP address of the Internet access point I am using.
Even though I am using my own account login, the vendor ‘sees’ that I am not in the US, and even though I have tried two “hide my IP” tools, Comcast prevents me from watching something I have paid for. Additionally, Comcast offers a service to let you watch programs online; they even offer an iPad app for the same service. No luck on either iPad or PC. Comcast is just too tight. I mean COME ON! I have paid. Just because I travel does not mean I should pay more. The sales rep never said, “oh, and if you happen to be a traveler, half of what we offer does not actually work”.
Since I cannot watch my program, and since I am now wound up, I might as well talk about my Uverse v Xfinity experience.
I have been a Uverse customer for several years; I was an early customer in my location. The download speed promised was 18Mbps and it never got past 12Mbps. Upload was promised as 2Mbps and it never got past 1.4Mbps. AT&T does not guarantee any speed. The service was stable for long periods, then for a time it would go wobbly. Sometimes it was local work in the vicinity, sometimes weather-related downed lines. Once the service was interrupted it would take days for engineers to get it stable again. But once stable it went well. We were very happy with it, overall. We used digital TV, HD, and phone too.
I had a ludicrous customer service experience once whereby I was being charged just $7 a month for one of the DVR boxes I had returned. For SIX MONTHS I would phone to get the bill credited; AT&T would apologize and promise to take the item off the bill “next time”. Next month it would appear again. I would call in and the cycle continued. I finally got a sensible response: their internal system was “broken” and they could NOT remove the item. Someone in authority had to override the price field and set it to zero! What a laugh. They did respond eventually – but it was a right royal pain.
More recently I needed a more stable line and service, and it turns out Comcast offered over 100Mbps download with a guarantee. I tried it and have only used it for about a month. It seems stable enough, though I have had a few issues with the phone line. I can’t afford any issues with either Internet or phone – so I will have to monitor the situation. Here is a quick summary of the good and the bad of the two services:
AT&T Uverse – for:
- Very thorough technicians (they wear indoor shoe covers to protect your floors)
- Very knowledgeable technicians and support (they know, and share, their knowledge of what is a new technology)
- Offers wireless DVR boxes so you can move your TV/box around the house easily
- Wireless Internet signal in the house (from their own 2Wire box) was pretty strong, and steady
- TV boxes were automatically Internet connected (so you could actually daisy chain a PC off your TV DVR or set top box)*
- Works easily with SlingBox so you can watch TV AND recorded shows easily over the web
AT&T Uverse – against:
- Not very fast in my location (no fiber optic, yet)
- Speeds never up to the promise offered
- Customer Service a long, arduous effort
- Once the line went out, it took days, even several weeks, for the service to stabilize again
* It turns out Uverse uses ONE line into the house and through to the main gateway. From there, established coax (or Ethernet) delivers TV and Internet on the same line. This limits throughput, but gets you Internet on every TV box. Xfinity uses TWO lines to split TV and Internet. This gets you better speeds but NO Internet at ANY TV box. Thus with Xfinity, placement of the main gateway is CRITICAL, versus somewhat important with AT&T.
Comcast Xfinity – for:
- Very fast (in my area)
- Speed guaranteed and delivered – I have checked and do so frequently – strong and very stable (at the gateway)
Comcast Xfinity – against:
- Does not work easily with SlingBox (you need to wire between the gateway and a TV box beforehand; AT&T does not need that wire)
- Technicians seem OK so far but they are not as customer service focused. They don’t wear shoe covers when they trample all over your house
- No wireless TV DVR or set-top box offered yet (it’s not hard – COME ON!)
- You cannot watch ANY recorded shows over the web (unless you bother to get SlingBox set up but that is a pain due to the extra wiring needed)
- Wireless Internet signal is VERY WEAK and NOT STABLE, even in a regular wood-constructed house! I mean it fluctuates a HUGE amount. I have run many tests and it is amazing how bad it is (see the logging sketch after this list). Though 100Mbps is available at the gateway, just ONE ROOM away it drops to 20 to 40Mbps, with a huge drop to 1Mbps every 20 seconds. That kills any streaming dead.
- Tight fisted with use of their own “on demand” app for online or mobile devices. They limit you to only US based access.
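For what it’s worth, here is a minimal sketch of the kind of repeated speed check I describe above. I actually used web-based speed testers; this scripted version assumes the third-party speedtest-cli Python package, so treat it as one possible approach rather than what I literally did:

```python
# Minimal sketch: sample download speed repeatedly to see fluctuation.
# Assumes the third-party "speedtest-cli" package (pip install speedtest-cli).
import time
import speedtest

def log_speeds(samples=10, interval=20):
    st = speedtest.Speedtest()
    st.get_best_server()              # pick the lowest-latency test server
    for i in range(samples):
        mbps = st.download() / 1e6    # download() returns bits per second
        print(f"sample {i + 1}: {mbps:.1f} Mbps")
        time.sleep(interval)          # space samples out to catch the dips

if __name__ == "__main__":
    log_speeds()
```

Run it from a machine wired to the gateway, and again from one room away over wireless, and the difference shows up immediately.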
I am about to try a test: turn off their wireless service at the gateway, put my own wireless hub (I have a new 802.11ac unit) in front of it, wired to their gateway, and see if I can spread some of my lovely 100Mbps bandwidth wirelessly around the house. If this works, I will use that.
So bottom line – I am getting a much, much better speed, but the rest of the offering is not as good as AT&T’s. Shame AT&T didn’t get fiber optic where I live. And shame Comcast does not sub-license shoe covers, wireless set-top boxes and wireless routers from AT&T.
Now I can go to bed somewhat less frustrated. I gave up trying to convince the Comcast server that I am a real customer who happens to be outside the US. I will get to watch my program Sunday – when home. But Comcast needs to get with the program and get up to date. Seriously.
by Andrew White | February 25, 2014 | 5 Comments
A colleague of mine (Debbie Wilson) shared a link to a survey from the Electronic Manufacturing Services industry website. Though specific to one industry and its use of ERP solutions, I have to say that my inquiry load over the last 5 years shows that the main finding of the survey is virtually universal. Yet the issue persists, and we are not really that much nearer a “final solution”.
The survey reports (see Top 5 ERP challenges for EMS providers) the most frequently reported issues holding back success with ERP systems. You can read all about the survey and the “five” at the link above, but the TOP item, most often reported as the greatest challenge with a successful ERP program, is “data integrity”.
Yes, data integrity. How is it that in 2014 this remains the greatest challenge? Surely we know better by now. Why is it that so many ERP projects are declared ‘successful’ at the handover to the business? Why is it that the software vendor and the systems integrator go home “happy and paid” and yet only a few months later the end user discovers ‘problems in the data’? Why do we all put up with this?
ERP systems do not live alone. No ERP acceptance or hand-off should take place until an effective information governance AND stewardship function is established. This means looking at data BEFORE it gets into ERP, as well as data ‘across ERP’, and, just as important, the data ‘after ERP’ and where it goes. If the end-to-end view is not set up, ERP will just slowly drift into apathy. It will become much less valuable to the business and be seen not to yield the benefit it was supposed to provide. ERP is a good strategy for some organizations, at some points in their lives, but we all still have a lot to do to make them deliver the goods.
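To make the ‘before ERP’ point concrete, here is a minimal sketch of the kind of pre-load integrity check a stewardship function might automate. The field names and rules are hypothetical; real checks would come from your own governance policies:

```python
# Minimal sketch: screen records BEFORE they enter ERP.
# Field names and rules below are hypothetical examples.
REQUIRED_FIELDS = ("material_id", "description", "unit_of_measure")

def validate(record, seen_ids):
    """Return a list of integrity problems for one candidate record."""
    errors = [f"missing {f}" for f in REQUIRED_FIELDS if not record.get(f)]
    if record.get("material_id") in seen_ids:
        errors.append("duplicate material_id")
    return errors

def screen(records):
    """Split a batch into clean records and rejects for the stewards."""
    seen, clean, rejected = set(), [], []
    for rec in records:
        errs = validate(rec, seen)
        if errs:
            rejected.append((rec, errs))   # route to data stewards, not ERP
        else:
            seen.add(rec["material_id"])
            clean.append(rec)
    return clean, rejected
```

The point is not the code but the placement: the check runs, and a steward acts, before the record ever lands in the ERP system.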
Footnote: though I say “ERP”, the problem described here is just as prevalent with any large-scale application landscape. This includes CRM, Procurement/SRM, PLM, SCM, and any industry-specific application suite (for example, core banking) that equates to ERP. End-user organizations with any number of business applications have the same challenges in varying degrees. Many times the problem is well known but not “visible to leadership”. Bummer.
by Andrew White | February 25, 2014 | 1 Comment
Well what a week so far. Loved it. Very busy but the kind of thing that makes you think.
Day 2 started just like day 1. Woke at 3am – I guess due to jet lag. Cleared the now burgeoning email. How can you work in a location where you don’t get emails during the day, but when you wake up there are 160 of them just waiting for your attention?
Breakfast at 7.
I had a couple of 1-1′s before my first presentation at 10 – a new one on the Business of Information Management. There is renewed emphasis on information strategy, and how it should be taken seriously as an investment program. After all, those organizations that generate a greater return on their information investment than their peers will, in general, outperform the other guys. So why shouldn’t we, as information leaders, think more strategically about information? Before the dot.com crash (remember that?) ‘strategy’ was king – it was over the top, even. Post dot.com, ‘strategy’ got a bit of a bad name. Everyone focused more on the here and now; it was thought silly to look too far out. Well, that lack of vision and strategy is now holding many firms back. This deck looked at how to apply skills and disciplines from other domains, like general management and supply chain, to improve information strategy. Specifically I looked at scenario planning (for longer-term information strategy analysis), Activity Based Costing and Value Stream Mapping (from SCM) to help with information life-cycle improvement issues, and Root Cause Analysis (to help with overall data quality issues). Very busy room – great questions. Can’t wait to see the scores to see if this new deck gets rewarded. I believe it has a bright future!
11.30am to 2.30pm and another 1-1 before lunch. A quick bite and then into a workshop on “how to get started with information governance”. Packed room – every seat taken. This was also a new workshop, built on the success and use of an MDM workshop. This workshop tries to help organizations determine what kinds of information assets need governance sooner rather than later. So we look at master data, other structured data, analytic data, transaction data, records, content, digital assets, social data, etc. Interestingly, some industries are struggling with so much data across the spectrum; others are (still) focused on getting a handle on master data first.
3pm team meeting with a client. Shame we only had 30 minutes. Great discussion about how to get started with an information governance program. MDM was one key aspect of this. In fact that was a key theme for me from the two days. MDM is very often one of the key starting points for an IG program.
Another 1-1 and then a quick coffee break before closing key note.
Final summary of topics from 1-1′s and impromptu discussions with users on the show floor:
- Getting started and selling value of information governance – 6
- Using advanced metadata management techniques like Enterprise Metadata Management to get more value from information – 3
- Making information governance “stick” day to day – 4
- Can I get information governance working effectively when the data in question is in my (BI) data warehouse – 2
Almost every inquiry touched on master data, even if MDM was not mentioned. I don’t want to say that MDM is ‘for everyone’, but it seems everyone has master data. The general question behind ALL the 1-1′s might really be re-stated as “how do we start managing our most important information assets for better business outcomes?” MDM was one route for some firms. For others, the name of the program was not clear, since the data itself did not attract a program name.
The final question is a hot topic too – and one I have blogged on several times (see Information Governance on (or in) the Data Warehouse – Does it Exist?). It is VERY HARD to get business people interested in day-to-day stewardship and governance of data in a data warehouse. But if you have to start there, you need to transition to upstream, operational IG as soon as possible, before the users lose interest!
Fun time. Hope we added some value. Can’t wait for the next Summit. On to London in a couple of weeks. See you there!
by Andrew White | February 24, 2014 | 1 Comment
Wow, what a day! So busy – as you would want and expect. And it’s only day 1.
Opened up with a pre-event tutorial on Master Data Management. The room ended up full, with people standing at the back. I hope the other sessions got as good a response as I did with MDM. MDM again emerged on the pre-event surveys this year, so we have a virtual track with several MDM-targeted sessions for budding information governors, CDOs and MDM program leads. A couple of attendees had follow-up questions too. Lots of interest in MDM.
Then the Gartner keynote. I tweeted away; I hope the tweets all made sense. Did someone see TIM? I thought I did….. Information IS the asset now. Analytics is one of the main methods for information exploitation. I took feedback from attendees through the day, and some of them ‘aligned’ with each of the three speakers, who each took on a different role. I think Frank’s distrust of all the hyped technology won the argument and the day.
Followed up with a couple of 1-1′s, and I had a few more later in the afternoon. The running tally of topics and threads looks like this:
- Getting started and selling value of information governance – 3
- Using advanced metadata management techniques like Enterprise Metadata Management to get more value from information – 2
- Making information governance “stick” day to day – 3
In almost all cases MDM was also part of the conversation. Master data is becoming a universally recognized topic, even if some challenges remain with MDM. Some firms (not any I met here, yet) don’t get the concept that MDM is focused only on master data, and that there is a whole lot of other data in our business applications that is NOT master data.
I had a busy room after lunch for my “governing the governance board” session. I love to use props, so I had my FIFA soccer rules with me (information governance) and also my soccer referee book, cards and whistle (information stewardship). I hope the idea came across well. I met an end user after the session who drew for me his organization’s structure for IG and IS – it was almost identical to our template model. Love it when a plan comes together.
Then I ran hotfoot to the show floor to present the Master Data Management for Customer Data Solutions Magic Quadrant. Full house there too. Never the best environment to present in, as it is always loud. But I saw lots of nods in the right places.
Ended the day with a couple of 1-1′s. Now off to the show floor for a walk about.
by Andrew White | February 20, 2014 | 3 Comments
Karen Heath, an analytics solution architect for Accenture’s Health business, posted this question (“Is it really possible to achieve a single version of the truth?”) in this week’s Information Management top stories. Of course I had to run quickly over to the site to read the article – how could I pass up such a red rag!
I have to commend Karen on a fine article and premise. But I wanted to call out a few of her points and explore how effective they really are; I don’t think her article explores them enough. These are points that always come up in dialog with end users, and with some vendors and analysts too!
- Is there such a thing as single-version-of-the-truth?
- What is single-version-of-the-truth, anyway?
- Where does this truth persist, and why?
Firstly, is there such a thing? Many end-user organizations use the phrase, but the reality is that such a thing does not really exist! In fact it’s a somewhat silly question. What most end users really mean is the following:
- Can we at least agree to disagree (what the truth is)?
- Can we at least service our own individual needs with a “view” into what the truth is?
- Can you not push your ideas of the truth onto my ideas?
- Why can’t we just all get along?
The more precise phrase would be something like “a commonly agreed and accepted set of truths that operate as a foundation, on which we will each derive our own interpreted, contextually centric views”. That is not a very marketable phrase, so we all tend to use “single version of the truth”. Try having this conversation in a research meeting and you can spend most of the hour arguing about this whole area! The sketch below shows the idea.
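Here is a toy sketch of that definition – one agreed foundation record, from which each function derives its own contextual view. All names here are hypothetical illustrations:

```python
# Toy sketch: one agreed "golden record", many contextual views.
# Attribute and view names are hypothetical.
golden_record = {
    "customer_id": "C-1001",
    "legal_name": "Acme Holdings Ltd",
    "country": "GB",
    "credit_limit": 50000,
}

def sales_view(rec):
    # Sales cares about the name and the credit headroom.
    return {"name": rec["legal_name"], "credit_headroom": rec["credit_limit"]}

def compliance_view(rec):
    # Compliance cares about legal identity and jurisdiction.
    return {"legal_name": rec["legal_name"], "jurisdiction": rec["country"]}
```

Both views are ‘true’, and both derive from the same agreed foundation; neither one is the single version of the truth.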
The second question really depends on the focus. In BI land the phrase tends to relate to far more than master data. With respect to the data warehouse, the truth is focused on all the transactions, and now content and big data, that are collected and used to drive analytics. Master data might make up no more than 5% of the data in that warehouse.
In Master Data Management, the focus is ONLY master data. Or at least, it is for a while; the scope will always tend to grow over time. See my research round-up for 2012!
Now as to ‘where’ this truth persists – that is much easier to address. Karen’s role seems related to business analytics. This implies something important. It implies (I admit, not necessarily so) a focus on a data warehouse of some kind that collects data from many sources and provides a “single view” (physical, virtual, or even logical) of a lot of data to power analytics and reporting. This is critical to the discussion since, as I have said before, I don’t think many organizations are as successful with “information governance” by the (end-user) people, for the (end-user) people, when the focus is the data warehouse. See “Information Governance on (or in?) the Data Warehouse – Does it Exist?” Mark Beyer, a colleague of mine, sometimes goes further and suggests that information governance does not exist in the data warehouse, for the same reason I cite here.
As an ‘ex’ end-user myself, my priority was the business data within the realm of the business applications I used to do my day-to-day job. I spent far less time helping IT clean up data in some warehouse to improve reports we got from time to time. My bosses, over the years, followed the same practice. The only time the emphasis on the data warehouse made sense was when I had no other choice, or when my business performance required it. Most business performance metrics focus on just that – performance – not data unrelated to performance.
So I believe that trying to instantiate information governance, led by the business, on data in a warehouse to drive reporting, or even compliance, is never as easy as establishing information governance on information in core business systems. If you can get to the latter, you will have the attention of the business user – front and center. Thus the level of change management you need can be identified and measured in a way that motivates the end user to support it.
Therefore the question for me in the article is this – are we focused on governing data in the warehouse (realm of IT), or information in the core business systems (realm of the business user)? They may seem related – we even came up with the phrase “analytical MDM” to denote the relationship – but the discussion Karen so nicely explores is answered in very different ways depending on the focus of the information in question.
by Andrew White | February 20, 2014 | 2 Comments
Interesting reading this morning with “Opponents say PROTECT Act could destroy mHealth, endanger patients“. The article refers to new proposals that would define which parts of Health IT are to be regulated. This will cause organizations to prioritize their IT investments in a specific way. It smacks of big-time interference, or perhaps overreaction? We need to watch the progress of this newly proposed bill….