Andrew White

A member of the Gartner Blog Network

Andrew White
Research VP
8 years at Gartner
22 years IT industry

Andrew White is a research vice president and agenda manager for MDM and Analytics at Gartner. His main research focus is master data management (MDM) and the drill-down topic of creating the "single view of the product" using MDM of product data. He was co-chair… Read Full Bio

Book Reviews: Monetary Theory, Bretton Woods and the Gold Standard

by Andrew White  |  November 22, 2014  |  2 Comments

Two books – one from 2009 and the other from 1936.  A great pair I can tell you!

Monetary Theory and Bretton Woods, Filippo Cesarano, Cambridge, 2009.  Wow – what a book!  One of the four books I annotated most – the others being Does IT Matter, Nation of Takers and Dealing with Darwin.  Since I have been delving into Bretton Woods and its aftermath, I had to dig deeper and understand the gold standard, and why so many economists are enamored with its golden image.  It turns out it is just a system that worked for a while.  Much like Bretton Woods.  But I have learned why we can never go back to a gold standard, or any other fixed standard.  The natural world is not actually a fixed system but an organic, changing model that is more dynamic, with various equilibria that come and go.  This book explains a lot – it is easily read, and nicely chronological in tracing how monetary theory has evolved.  Well worth the read – and I am now even more excited to read more on the subject.  Recommended 9 out of 10.

The Downfall of the Gold Standard, Gustav Cassel, Oxford University Press, 1936.  This book was referenced by several recently published books on monetary systems and the gold standard.  I was able to find an old second-hand (used) copy.  And mighty glad am I.  When this book was written, the world was between two world wars.  The 'first', or better-known, gold standard had just come to an end with the commencement of WWI.  After the war, a sterling bloc reformed around Britain and its remaining Empire.  Germany was being rebuilt.  The dollar was emerging as a global force and making inroads as a globally traded currency, on its way to becoming the next reserve currency.  But at the time, the "return to gold" was the oft-cited goal of central bankers the world over.  This excellent book explores the failings of the gold standard as it struggled in the period between the end of WWI and 1936, and compares that experience with the more stable sterling-based gold standard that operated before the war.  The interwar period consisted of a flexible gold-exchange standard as well as an attempted fixed-rate period.  The gold standard failed, spectacularly, and Cassel explains clearly why.  The U.S. (and to a degree, France) imported much of the world's gold for different reasons, yet at the same time did not permit the price rises at home that should have accommodated such behavior.  The very trust agreed between nations was disrupted.  The accepted "rules of the game" were violated.  Only the new sterling area actually showed any signs of stability, but sterling's days were numbered.  In effect, the U.S. was discovering, as it went, what it meant to have a globally traded and settled currency.  And it was ruining the standard through its actions (from the perspective of global trade and prices).  WWII changed all that.  You won't find a finer summary of what explains the demise of the gold standard.  Recommended 9 out of 10.


Category: Economic Growth Economic History Economy Gold Standard UK Pound Sterling US dollar     Tags:

Global Trade Information Management: Silver Bullet Count – 2 requested today, so tally up to 22

by Andrew White  |  November 6, 2014  |  3 Comments

I only just blogged today that a client had asked for a silver bullet: Did you get your Silver Bullet yet?  Opening count starts today at 20!  Well, in my penultimate inquiry today it happened again.  That's two in one day.  Pretty good going, I think.  Maybe I need to invest in silver!  I was joking, of course.



The silver bullet concerned a retailer struggling with collecting, managing and exploiting the data needed to support a sprawling global trade management and compliance program.  Of course, this is a going concern, so there is data everywhere right now.  The question is how to rein it in and get control of it.  If only there were a turn-key solution one could use to "do it all" at "no real cost".  Sigh…



Category: Silver Bullet     Tags:

Did you get your Silver Bullet yet? Opening Count starts today at 20!

by Andrew White  |  November 6, 2014  |  3 Comments

At Orlando Symposium a few weeks ago I handed out a number of silver bullets – about 18 of them, in fact.  Nearly twenty individual one-on-one meetings with end users resulted in a question that looked something like this:

"We have this issue (with data).  I don't really want to pay the high price that vendors are asking (for their software and services).  What I really want is a more practical solution, priced for my budget, that covers the important bases."

I have previously offered a virtual silver bullet (see In Case Your Program Needed One – Here is Your Silver Bullet), but given that my idea of giving away, literally, silver bullets was a “hit” with my colleagues, I thought I would start to count the number of physical and virtual bullets asked of me.  It so happened that today I was on an inquiry with a client where I was asked for yet another silver bullet.  I had been asked for one last week – so I will open the ledger today at 20.

I will be “down under” next week and at our Symposium Gold Coast - with my bag of silver bullets in hand.  I wonder how many I will need to give away.

If you have need of your own silver bullet, here it is:



Category: Silver Bullet     Tags:

Economic Minute #5: Interest rates and their impact on corporate investment: all at sea

by Andrew White  |  November 6, 2014  |  2 Comments

An article in last week's (October 18-24, 2014) US print edition of the Economist set off my alarm bells again.  It was titled, "Tight, loose, irrelevant".  Interest rates do not seem to affect investment decisions as economists had previously assumed.  The article reviews a new paper on the topic that explores the apparent change in the relationship between interest rates and firms' decisions to change their level of investment.

History was thought to tell us that a government could stimulate investment, and so private-sector-driven growth, by lowering interest rates.  Lower rates mean that borrowing money is cheaper, so for a given amount of profit, or cash, the opportunity cost falls and more money is borrowed and invested.  If a government wants to cool a rapidly growing economy, the opposite was thought true too: increasing rates means the burden on firms is larger, and so they are dissuaded from investing.

Through the economic crisis and into the current malaise, the world has seen record near-zero interest rates for extended periods of time.  Yet private sector investment has not reacted as expected.  Why?  The report cited by the Economist article suggests that the state and financial performance of the individual firm is a greater signal of its likely desire to increase, or decrease, investment.  But even that data is not enough.  Corporate profits have been very healthy up until recently.  Presumably interest rates play a role, but they seem a far less important driver, or barrier, than previously thought.  So the issue persists (how do governments encourage growth in private sector investment as a key driver of economic growth?) and it seems our understanding of what makes firms tick is still lacking.

As if that was not enough, a related article in the US print edition of the Financial Times, Monday October 20, explored the same issue but from Europe’s position.  In Wolfgang Munchau’s Comment piece, Eurozone stagnation is a greater threat than debt, he argues that the chronic shortfall in private sector investment is creating the conditions for a long period of weak demand with falling prices.  This would act as a massive anchor on trading nations too, and could be enough to counter any positive effort to promote growth elsewhere in the global economy.

Footnote: It is my firm belief that private sector investment and productivity are two key “handles” that we (collectively) need to tweak if we are to change the current economic situation.  The former needs a change in policy at the state level; the latter is firmly in the hands of business leaders.  The former also has the opportunity to impact the entire economy at a stroke; the latter has more opportunity at the local, even innovator and individual firm level.  And IT plays a key role on the productivity puzzle.


Category: Debt Economic Growth Economic Productivity Economy Euro Globalization Interest Rates     Tags:

What Software Developers can learn from their PC Gaming Developer Cousins

by Andrew White  |  November 5, 2014  |  3 Comments

As an avid PC gamer I happen to have a love for military strategy and war games.  Just to expose my nerdy credentials, I once played a "play-by-email" game with my father (he in the UK, me in the US).  We simulated War in the Pacific over a three-and-a-half-year cycle.  Every week we would take turns to move our pieces, plan our strategy, and send the resulting file by email.  He won.  Nuff said.

Well, the user interface (UI) is critically important to PC games – probably to games on all platforms.  The quality of the UI can mean the success or failure of a product in the market.  In enterprise application software its impact is less clear-cut than that – but it can still be significant.  So here is a story of an innovation that has just emerged that I believe will change the PC gaming market.  Enterprise software developers can leverage this story, I feel.

For the last 20+ years the way you moved units (troops, armies, navies) on a board (the map on your PC screen) has not changed.  You would use your mouse to select a unit and then click a location.  The unit would then move to that location.  Code determined the path chosen, and you, the player, had little or no direct control over the route.  In early games, and even in poorly designed games today, units would take silly paths and often get stuck somewhere – turning up late to the fight you had planned for them.  You might even lose the war because of this.  The challenge was somewhat resolved with "way-points".  These allowed you to manually select specific points along the way.  In a tedious manner you would again select the unit and then click multiple times at specific locations on the map where the unit was to move.  A move could include any number of individual way-points, and tedious clicks all over the map.  And between the way-points the unit was still controlled by code that selected the most appropriate way to get to each.  So the problem was never really resolved.

This legacy approach, in use by almost all military and real time strategy games, is about to be blown away.  Enter Ultimate General: Gettysburg by Game-Labs LLC.  In the graphic below you can see how I give each unit movement instructions.  I select the unit and then I draw the specific path I want them to take – from start to end.  I can go direct, or I can move around the map in “S’s” and in fact any shape I want to.  It is intuitive, it is self-evident.  At a stroke I get a pleasing UI and an effective UI.
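For developers curious about the mechanics, the draw-to-move idea reduces to a simple geometry problem: capture the mouse stroke as a series of points and resample it into evenly spaced waypoints the unit can follow.  Here is a minimal sketch in Python – the function name and `spacing` parameter are my own invention for illustration, not anything from Ultimate General: Gettysburg:

```python
import math

def resample_path(stroke, spacing):
    """Resample a freehand mouse stroke (a list of (x, y) points)
    into waypoints spaced `spacing` apart along the drawn path."""
    if len(stroke) < 2:
        return list(stroke)
    waypoints = [stroke[0]]
    prev = stroke[0]
    budget = spacing                  # distance left until the next waypoint
    for curr in stroke[1:]:
        seg = math.hypot(curr[0] - prev[0], curr[1] - prev[1])
        while seg >= budget:
            t = budget / seg          # interpolate along the current segment
            prev = (prev[0] + t * (curr[0] - prev[0]),
                    prev[1] + t * (curr[1] - prev[1]))
            waypoints.append(prev)
            seg -= budget
            budget = spacing
        budget -= seg
        prev = curr
    if waypoints[-1] != stroke[-1]:
        waypoints.append(stroke[-1])  # always end exactly where the player did
    return waypoints
```

The engine would then feed these resampled waypoints to its ordinary movement code, so the player's drawn path, not a path-finding algorithm, dictates the route.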


Previously, if you wanted to select multiple units before giving them a move instruction you had to use the control key, or the shift key, or some combination, and use your mouse to select the units themselves.  This was also tedious.  With Ultimate General: Gettysburg you simply click and draw around the units you want to order together, as if you were a general leaning over the map and making sweeping hand gestures.  See the graphic below:


Once drawn I issue orders once and all units in the sweep receive the same order.  Again, this is simple, intuitive, and pleasing.
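Under the hood, a sweep-select like this is just a point-in-polygon test: the drawn loop becomes a polygon, and any unit whose position lies inside it joins the group.  A hedged sketch using the standard even-odd ray-casting test – the data shapes here are my assumptions, not Game-Labs' actual code:

```python
def units_in_sweep(units, loop):
    """Return the units whose positions fall inside a drawn loop.

    `units` is a list of dicts like {"name": ..., "pos": (x, y)};
    `loop` is the closed polygon traced by the player's mouse."""
    def inside(x, y):
        hit = False
        n = len(loop)
        for i in range(n):
            x0, y0 = loop[i]
            x1, y1 = loop[(i + 1) % n]   # wrap around to close the loop
            if (y0 > y) != (y1 > y):     # edge crosses the horizontal ray
                x_cross = x0 + (y - y0) * (x1 - x0) / (y1 - y0)
                if x < x_cross:
                    hit = not hit        # odd number of crossings = inside
        return hit
    return [u for u in units if inside(*u["pos"])]
```

Every unit returned then receives the same order in one stroke, which is exactly the "sweeping hand gesture" effect described above.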

In both cases above, movement and grouping, the legacy software designers used the tools to hand (mouse and screen) and created a standard that has worked well enough for a long, long time.  This new game will force other developers to respond, since it behaves the way I would think and act if I were that general.

I have written about the importance of the UI before (see Why Mobile Gaming is Killing our High Tech Industry – though Doom and id Software may yet help).  The UI of your "A" screens (if you know what that means) should exploit high taction, as this game does, to engage the user.  Software developers should adopt techniques that work with, or extend, what end users want to do every day.

I think it's fun to watch how software vendors address their perceived end-user needs while not actually being users themselves, or even asking them.  Too many start their standard software demo with their "C" screens, where all the capability is presented.  Smart vendors start their demo with their "A" screens, showing the results and benefits of the work that should be implied by the use of their technology.  If only we all could be generals for a day…





Category: PC Gaming Real-time Strategy Software Developers Taction User Interface     Tags:

Poverty in Decline; Italy on the Hook; Debt Mountain Grows!

by Andrew White  |  November 5, 2014  |  1 Comment

In Monday's print edition of the Wall Street Journal was an Opinion piece by Mr Irwin, professor of economics and co-director of the Political Economy Project at Dartmouth College, titled "The Ultimate Global Antipoverty Program".  In it Mr Irwin highlights the progress felt around the globe in the drive to reduce extreme poverty.  According to data from the World Bank, the share of the world's population living in that condition fell from 36% in 1990 to 15% in 2010.  In a nutshell this was driven by the desire of poorer, less-developed countries to improve their lot by adopting, on the whole, capitalist processes and constructs within their economies.  It was not achieved by governments redistributing to the poor (since those governments didn't initially have much surplus to redistribute).

This won't sit well with the politically correct who want to highlight the apparent anomaly that appears when you look at poverty within a country, and specifically within the USA.  It is statistically possible for the poverty gap to widen within a country even while, between countries, the gap between the poorest and the richest actually closes.  That is because the lowest level of poverty for all nations measured is rising, even if the better-off are better off still.  But ignoring the politics of this argument, one should not ignore the positive influence capitalism has had on places like China, India, and Africa.  Of course these are not silver bullets, nor Crown Jewels, in the fight against cronyism – that is a flaw in man whatever the political system, not a flaw in the 'system' per se.  The article looks at how the politically correct and socialists will avoid agreeing with the data Mr Irwin presents.  All in all it's a warm little article.

The Missing Debt Deleveraging

In the same newspaper there was another Opinion piece ("Missing the Prime Suspect in the Global Slowdown") by Mr Melloan, former columnist and deputy editor of the Journal editorial page, and author, who calls out the alarming data showing that debt deleveraging is not actually taking place.  Remember that word, "deleveraging"?  It was all the rage a few years ago, as Americans were told to deleverage, or reduce debt, in order to become more responsible and financially viable.  Well, it turns out that some personal debt deleveraging has taken place, especially in the US though less so in other developed nations, but public debt, owed by sovereign nations, is soaring.

Low interest rates, Mr Melloan affirms, are motivating us – governments all – to save less and borrow more.  Saving is a critical element of capital formation and one of the so-called "lift off" vectors of the industrial revolution.  A growing economy needs savers, since saving helps funnel excess profits back into re-investment.  Today's governments are afraid to raise interest rates for fear of slowing their sluggish economic growth even further.  Yet it is the low rates that discourage saving.  And that make borrowing so easy (for some).

It seems to me that we are caught between a rock and a hard place.  Even Japan blew the doors off its quantitative easing initiative last Friday by announcing a massive expansion.  This program is meant to fight deflation.  Remember Milton Friedman's dictum that "inflation is always and everywhere a monetary phenomenon"?  Such flooding of the markets with money is supposed to drive inflation, to spur consumers to spend ahead of price increases.  So far, that hasn't happened.  All in all it's a conundrum.  And the twist is that higher interest rates would increase the debt burden on sovereign states.  The interest due on all those loans would soar as rates increase, so it is a real pickle.

Eurozone on the Italian Hook

And a third article, "Italy is Big Test for Success of Bolder Bond Buying", explores my favorite issue of the day, the euro.  Specifically, the challenge the EU and ECB have in deploying their own monetary guns, in terms of bond purchases, to "do" quantitative easing.  Italy, according to the article, is the weak link and a possible catalyst for ongoing stress.  It has anemic growth and a very high public sector debt burden (near 135% of GDP).  The country could be back in recession shortly.  The article states, "If QE can't rescue Italy, then it can't rescue the Eurozone."

And the cards seem stacked against success.  Additional data reported in the article suggest that bank lending in Italy is already 53% of GDP, higher than in France and Germany, and yet it is through the banking system that the extra cash has to be utilized.  Worse, the ECB's recent comprehensive assessment of the eurozone's banks suggests that the Italian banking sector is one of the weakest in the eurozone (i.e. understated bad debts and capital shortfalls).  Finally, the article concludes that bank capital (from QE) is not what Italy actually needs; it needs more corporate capital.  And that is not likely to come from QE.  So the conundrum keeps on spinning.


Category: Debt Economic Growth Economic Productivity Economy Euro Eurozone     Tags:

IBM’s Insight 2014 Conference – Headline Messages and Perspectives

by Andrew White  |  October 31, 2014  |  3 Comments

I was at IBM's Insight event in Las Vegas this week for a couple of days.  Here are some of my perspectives on the event.  I also compare and contrast with what I observed while attending SAP's TechEd the week before.

Bottom line: IBM has stuck to its "big data and analytics" message for the last 2 or 3 years.  In some ways this was a tad repetitive, or so it seemed for a while.  The main keynote had that feeling of sameness about it.  However, a new message emerged as day 1 progressed: the offerings talked about in previous years are now here, available, and ready for use.  In other words, IBM's big data and analytics solutions are real and some of its vision has been realized.  I did not see too much new innovation, however…

Now for the details.

Client examples were sprinkled through the various keynotes, and customer stories always help convey the vendor messages.  I particularly liked the Pratt and Whitney (Internet of Things evolving predictive maintenance) and Ceva Logistics (evolving use and exploitation of supply chain asset oriented analytics) stories.  The key tips from Ceva were “do taxonomy first, and do information retention policy early”.  Sage advice, for sure.

Another new emphasis on an old topic concerned IBM Watson.  Throughout the first day there were several new product announcements, and several played on the commercial availability of IBM Watson-powered products.  These represented the leading, innovative examples that were only loosely described in previous years.

Like SAP the week before, IBM has had a silver bullet looking for a problem to solve.  It seems IBM has done a little more homework than SAP in this area, in my view.  IBM's Watson seems to have come of age, while SAP's HANA is still in high school.  Several commercial solutions were demoed, including Chef Watson, targeted at you and me and our creative culinary juices.  There was another demo of IBM Watson specific to healthcare (oncology).  IBM emphasized how these were packaged and targeted at specific roles and use cases.  Packaged solutions are what Swiss Army knives, such as IBM Watson and SAP HANA, need to be meaningful to business users.

Cool Metaphors

As I listened to the speakers through day 1, several cool metaphors or phrases resonated with me.  One was a phrase to define an industry leader: "two moves for one".  That sounded cool.  The point being that in the time and money your competitor takes to make a single move, as if on a chess board, your organization could execute two.  This is a powerful message, and one way bigger and broader than mere analytics.  It conveys the fan-favorite "plan, do, check, act" (a slight nod to i2 Technologies' Sanjiv Sidhu).

The second cool idea was teased by Beth Smith, GM Information Management, in her division's keynote.  She got my heart racing with "liquid data layer", then promptly let me down by not following up and exploring the idea.  She mentioned it one more time, as a near afterthought, and that was it.  I thought I was going to be made weak at the knees with a pitch about semantic governance or something.  Bummer.

Odd Thoughts

One odd thought came to me that had a sports flavor.  It happens to apply to IBM and SAP.  The question was: why does big data produce so many one-time data innovations?

Another odd thought came to mind during the ECM keynote.  It was this: over 90 per cent of what Doug Hunt (GM, ECM) referred to actually related to structured data, not unstructured data.  In truth, he was referring to examples oriented towards content, but content with some contextual structure applied.  Once structured, content becomes information, even if it is not physically stored in a relational database.  Such information should be subject to the exact same information governance policies and practices as master data.  This ties into a recent blog of mine (see Is it too soon for unstructured data governance) on the issue.

Open questions, considerations and queries

In the main keynote IBM overemphasized analytics, as if analytics alone would save or improve your business.  There was very little talk about information management per se; there was talk of faster boxes, servers and stuff, but that's the 'easy' part to talk about.  Governance, quality, and trust in data, and information value, were not the priority in this event's headlines.

Additionally, IBM had a key slide that it failed to explore, explain or exploit.  The slide emphasized how insight from analytics leads to action and a change in (business) outcome.  They did not explore enough, for me, that change in action.  It's as if the focus was on the stop-lights, and it was assumed you, the driver, know how to drive.  There was good use of case management, but again, that is a poor substitute for all the business apps that are the vehicle controls we all use every day.

From queries to wrinkles

A wrinkle for me: "markets of one are a stopover".  That did not resonate with me, but it was said several times in the keynotes.  Markets of one is a concept that should sit at the heart of a new digital business strategy.  IBM was trying to outdo the concept with the suggestion that a market of one does not imply the ability to interact with, and even influence, the market.  That is not true, other than in IBM's telling.  No one said a market of one was meant to exclude interaction or influence.  IBM had good content, but the message chosen to convey it didn't work for me.

One final wrinkle and thought left me wanting more.  After the conclusion of the divisional keynotes (corporate, business leaders, information management, ECM, and business analytics), I was left wondering this: why didn't IBM call out the market discontinuity by dropping the separate keynotes for info mgt, ECM, and analytics?  All the keynotes included a major focus on analytics.  And though the opening keynote was a kind of overview, it too was heavily focused on analytics.  I'd like to see a vendor break the mold, avoid siloed keynotes, and focus on a matrixed message and structure.

So wrapping up…

I tweeted quite early on day 1 that, for me, "taxonomy was tops".  What I meant was that the new world IBM is talking about really points to the validity, quality and meaningful use of my taxonomy versus your taxonomy.  At the end of the day, whether we are using content or information, dark or big data, streaming or in-memory data, if we can't express the semantic meaning of the data, we are done.  IBM did not really explore or explain how this brave new data world will unfold.  I know IBM has cool stuff going on and new products rolling out.  But I didn't see IBM with a visible, rounded, public road map showing how all the necessary tools converge to align, prioritize, then operationally govern key information artifacts across an organization, or across its firewall.  We all know IBM can build almost anything (they have a big bag of tools and products) but they do not yet seem to want to push this particular envelope.  I accept that there are few, if any, real buyers for such a vision, but I think it is on our collective horizon.





Category: Analytics Big Data Business Intelligence Dark Data IBM IBM Insights 2014 Information Management Information Organization Information Stewardship Information Strategy     Tags:

Book Review October: Making It Happen: RBS and the men who blew up the British Economy

by Andrew White  |  October 31, 2014  |  2 Comments

Making It Happen: RBS and the men who blew up the British Economy, Iain Martin, Simon and Schuster UK, 2013.  After reading about numerous US organizations that played a role in the financial crisis that preceded our economic crisis (Bear Stearns, Lehman, Washington Mutual, Amaranth, Merrill Lynch etc.), I really wanted the same kind of account, but from the UK side.  This was the only book I could find that seemed new and focused on a major bank (RBS).  The book is readable, in chronological order, and focuses nicely on the men in suits who created the giant that was RBS.  I didn't realize until this book that at the time of its demise, RBS was the world's largest bank by balance sheet, with almost $2 trillion on its books.  The author also catches nicely the spine-twisting moment when RBS closes the ABN Amro deal, at exactly the moment when questions were being asked about the creditworthiness of the CDOs on its own books.  Worse, ABN Amro had a balance sheet stuffed with the same things, and RBS had just paid top dollar for them.  Chilling.  The fault I have with the book is that it does not explain or describe enough the fall of RBS, and the intricacies of its meltdown.  It's all a bit of a rush.  The first 208 pages are all prelude, though with important narrative on the US adventures that opened up the core bank's arteries to the toxic CDOs.  The next 80 pages are the real ride.  I really wanted a bit more about the events, and how the sweat must have been palpable in many important meetings.  The details don't quite live up to the depth of, say, Bear Trap or A Colossal Failure of Common Sense.  Due to lack of alternatives, this was an "ok" read.  But the definitive UK-finance story seems yet to be written.  Somewhat recommended 5 out of 10.


Category: Economic History Economy RBS Royal Bank of Scotland     Tags:

Why does the MDM (aka master data) hub have to capture the state of data?

by Andrew White  |  October 24, 2014  |  4 Comments

I was enjoying an invigorating inquiry with an end-user client yesterday.  The question on the table was this: why does the MDM hub, the place where the single source of truth (for master data) resides, need to be the place where we recognize, capture and govern the state of data?  I felt this was an intriguing question.  It immediately exposes for me one of the weaknesses of our collective information infrastructure and application landscapes, and also highlights an apparent fatal flaw in our current information systems design.

First the weakness: even with all the talk of standards, protocols, and metadata, the information systems and applications deployed around our organizations do not, on the whole, get on well with each other.  I don't mean to say they are not, or cannot be, integrated.  Many of our systems are integrated; that is very different from the degree to which each system understands or respects the (information) needs of others (i.e. context).  In fact, by design, we have compartmentalized context, because that is how we do things.  Using systems thinking we break things down into their smallest common components and assemble them into logical or meaningful groups.  Thus the concept of an application, or app, is somewhat explained.

This exposes the flaw.  A business application is meant to be a representation of how work should get done.  Years ago these were monolithic beasts; now they are almost transient services, assembled periodically, perhaps one day on the fly.  However, the very notion that we can predefine how work gets done leads to the idea that there is a boundary, and a boundary results in a semantic model that meets the needs of that which is bound.  Thus we have many applications, each with its own authority model over its own semantic landscape.  We designed systems efficiently, so that they did not have to worry or concern themselves with other applications' needs for the same information shared or "integrated" between them.

We are not going to solve this problem quickly, if at all.  The fact that I said this was a "problem" is itself problematic; it may not be a real problem that needs to be "solved", whatever "solved" means.

Maybe we don't need to solve the entire problem.  In fact, Master Data Management (MDM) is a great case in point – one that should be brainstormed at the highest echelons of IT strategy.  I say this because MDM is at once a Trojan horse and, at the same time, a silver bullet.  It is a Trojan horse because it is showing us that we DON'T actually need to govern, master, and properly manage all the data in our business systems.  We don't need to because people are generally quite good at doing things, and at coping with adversity.  An effective MDM program helps an organization prioritize and effectively manage only what matters, and govern what gives back.  Much data is fine as is, in whatever state it finds itself.  If we only govern the part of the information make-up that really, truly matters, good things will come, and the rest of the world will operate quite nicely, thank you very much.  Another way to look at it might be this: we only need to exploit our master data slightly better than our nearest competitor…

MDM is also a silver bullet:

  • It sounds so easy
  • It’s not even that new as an idea (didn’t an Enterprise Data Warehouse do this anyway?)
  • It's a technology – can't we just buy one?
  • We can't afford a massive cost, so is there an easy way to reconcile and manage this data, on the cheap?

MDM is not a technology.  MDM is not easy.  MDM is not cheap – but I don't mean to say that it must cost you $1M to acquire.  I mean that the mental energy (executive and non-executive) needed to understand what MDM is, and how to make it work for the organization, is not insignificant.  The money you spend on "it" is the least of your worries.  Truly understanding what makes MDM different from ERP, EDW, data integration, SOA, cloud, and in-memory is the real concern.

So back to the question: Yes, the MDM hub needs to play the role of the source of information state since the rest of your information infrastructure and application landscape is incapable of doing so for the benefit and information of others.  This is a most interesting idea – one that sounds pretty dry, even IT’ish, but from a business perspective (note I have never worked in IT!) it is really quite interesting.  One I wish more of us explored – ideally with a nice, cool libation in hand.
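The idea of the hub holding the authoritative “state” of master data can be made concrete with a small sketch.  To be clear, everything below is my own illustration, not any vendor’s implementation: the system names, the trust ranking, and the survivorship rule (most trusted source wins per attribute, with recency as the tie-breaker) are all hypothetical, chosen only to show why the hub, rather than each application, ends up holding the golden record.

```python
# Illustrative sketch of a toy MDM hub: several source systems hold partial,
# conflicting views of a customer; the hub consolidates them into one golden
# record that other systems can consult for the authoritative state.
from dataclasses import dataclass

@dataclass
class SourceRecord:
    system: str       # originating application (hypothetical names: ERP, CRM, WEB)
    customer_id: str  # shared natural key, assumed already matched
    attributes: dict  # attribute name -> value (None = not held by this system)
    updated: int      # change timestamp (e.g. epoch seconds)

# Hypothetical trust hierarchy: higher number = more authoritative source.
TRUST = {"ERP": 2, "CRM": 1, "WEB": 0}

def golden_record(records):
    """Survivorship per attribute: prefer the most trusted system,
    break ties with the most recent update."""
    golden = {}
    best = {}  # attribute -> (trust, updated) of the value currently kept
    for rec in records:
        rank = (TRUST.get(rec.system, -1), rec.updated)
        for attr, value in rec.attributes.items():
            if value is not None and rank > best.get(attr, (-2, -1)):
                golden[attr] = value
                best[attr] = rank
    return golden

crm = SourceRecord("CRM", "C-42", {"email": "a@x.com", "phone": None}, 200)
erp = SourceRecord("ERP", "C-42", {"email": "old@x.com", "phone": "555-0100"}, 100)
print(golden_record([crm, erp]))  # email survives from ERP (higher trust), phone from ERP
```

The point of the sketch is the last line: neither the CRM nor the ERP view is complete or correct on its own, so only the hub can answer “what is the current state of customer C-42?” on behalf of everyone else.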

Category: Authority Model Information Lifecycle Master Data Master Data Lifecycle Master Data Management MDM     Tags:

Euro Woes and Politicking – on and on

by Andrew White  |  October 24, 2014  |  2 Comments

Two articles in today’s print editions of the Financial Times and the Wall Street Journal call out the extreme debate raging beneath the malaise in the euro zone.

In an Opinion piece for the Financial Times (see Blame Berlin for bad policies, not reluctance to spend more), Otmar Issing (former ECB chief economist) suggests that calls on Germany to increase its spending (even deficit spending) to “promote growth” are wrong and should be ignored.  In fact, the argument is that Germany is a model European country that has done well.  As such, it should not be dragged down by the less efficient states that today hamper growth.

In “France, Italy Take Their Austerity Fight to Brussels“, the Wall Street Journal reports on the politicking taking place before a summit set to review budget proposals for 2015.  Under agreed rules, a country cannot run a deficit larger than a specific percentage of GDP.  When a larger deficit is run, the country can be fined (making things worse, of course).  I don’t actually know or remember whether any country has actually paid a fine, but I do remember that France, and even Germany, have flagrantly ignored the rule in the past.  However, the issue today is austerity (reduced deficit spending in at-risk economies) versus increased spending (in those same at-risk economies) to help grow the euro zone out of its funk.

The fact is that Germany is a model economy, of sorts.  The problem, though, is that Germany is able to account for its own economic situation first and foremost – it is that “success” for which its politicians are elected – and Europe’s success and votes second.  On the one hand, the country is acting as a pillar of strength among weaker euro partners; on the other hand, Germany is not “mucking in” enough and jointly getting its hands dirty with the real work needed.  Germany, by being German (disciplined, managing within its means), saved the euro.  But by staying German, it is letting its euro neighbors struggle on their own.

And the real issue is the euro structure itself.  It is an incomplete federalist structure whereby Germany is able to count its economic and political condition separately from its euro neighbors.  In the US federal model there is a state-level budget, equivalent to a country’s, but it takes secondary priority to the federal budget.  There is an election, accountable to the people, for the head of the nation.  Europe remains an incomplete model with few of the political, monetary, fiscal, and centralized powers of the US model.  In Europe there is no federal gilt or bond market; debts are not accounted for at the European or supra-national level.  The ECB cannot fund its work the way the U.S. can.  The banking system is not structured the same way.  The federalists in Europe want more, as if an “all in” approach would fix the problems.  In part, it would fix the fact that there is no unitary response framework for such situations.  But more federalism would not change the economic imbalances across the euro zone that actually created the mess in the first place.  No amount of politicking will fix that.  For now, let the politicians meet, eat their croissants and drink their coffee in plush surroundings, and print expensive leather-bound documents and meeting minutes, while the unemployed sit patiently waiting for their unemployment checks.



Category: European Central Bank (ECB) European Union     Tags: