Andrew White

A member of the Gartner Blog Network

Andrew White
Research VP
8 years at Gartner
22 years IT industry

Andrew White is a research vice president and agenda manager for MDM and Analytics at Gartner. His main research focus is master data management (MDM) and the drill-down topic of creating the "single view of the product" using MDM of product data. He was co-chair…

Beyond Master Data Management – when even our own words fail us

by Andrew White  |  January 30, 2015

I noted today an article in the US print edition of the Financial Times titled "Slip of the tongue sparks debt sell-off".  The article refers to a comment made by the Belarus president, Alexander Lukashenko, at a press conference.  He meant to say that his country would "refinance" its debt.  However, the word he used was "restructure".  In that context the word implies a default, and thus a haircut for debt holders.  The result was a short-term panic in which the price of the country's bonds fell markedly: the $1bn bond due in 2015 crashed the day before the article was published, driving its yield up to 46.8 per cent on an annualized basis.  The president was forced to appear in public again in order to correct his error and try to recover the value.  At the time the article was published, that value had not yet been recovered.
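
To see why a crashing price on a short-dated bond translates into such an eye-watering annualized yield, here is a minimal sketch of the arithmetic.  The face value, price, and time to maturity below are illustrative assumptions, not figures from the article; only the resulting yield of roughly 46.8 per cent is reported there.

```python
# Rough annualized-yield arithmetic for a short-dated bullet bond.
# Face value, price, and maturity below are invented for illustration;
# the FT article only reports the resulting yield of roughly 46.8%.

def annualized_yield(face: float, price: float, years_to_maturity: float) -> float:
    """Annualized return if the bond repays its face value at maturity."""
    holding_period_return = face / price - 1.0
    return (1.0 + holding_period_return) ** (1.0 / years_to_maturity) - 1.0

# A bond repaying 100 in half a year, bought at 82.5 after the sell-off:
print(f"{annualized_yield(100.0, 82.5, 0.5):.1%}")  # -> 46.9%
```

The same arithmetic shows why short-dated bonds produce such dramatic numbers: a modest price drop, compounded over a fraction of a year, balloons into a huge annualized rate.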

We can implement programs to help assure that when we say "customer x" or "product y" we all know which customer or product that is.  But we can't implement programs to guard against a Freudian slip.

Oops.


Share your Innovative Information Management Case Study at Gartner Summit!

by Andrew White  |  January 30, 2015

Gartner is hosting its 2015 Enterprise Information and Master Data Management Summits – in EMEA on March 11th and 12th in London, UK, and in the US on April 1st and 2nd in Las Vegas, NV.  As part of the exciting agenda we always include end-user case study presentations.  They are the best attended and almost always attract the highest scores when rated by attendees.  Who can resist a good war story about how peers fought through seemingly insurmountable odds and ended up with a successful information-based initiative?

We are looking for case study speakers right now.  We have a couple of slots left for the US event.  So if you are interested, drop me an email (andrew.white@gartner.com) and we can chat over the phone.  I can share with you the details and we can go from there.  No commitments needed yet – just a chat.  Go on, reach out.  You know you want to be on stage and share your success!

Our surveys suggest that end users are particularly looking for case studies across:

  • Enterprise Information Management (EIM)
  • Master Data Management (MDM)
  • Information Governance and Stewardship
  • Coping with change management
  • Surviving the loss of a sponsor (which can kill a program)
  • Innovative use of Information Management technology

If you have a good case study exploring business value across the use, management and governance of content, records, analytics, structured information, big data, open/linked data, or even someone else's data, reach out to me and let's chat.

And sorry, no vendors allowed.


The truth and lies behind an analytic: inflation

by Andrew White  |  January 28, 2015

Wise words from the Financial Times' writer John Kay.  In today's US print edition of that newspaper he wrote a Comment piece titled "History is the antidote to fear of falling prices".  He ends the article with, "A raised body temperature might be a sign of fever or the result of a relaxing hot bath: it is wise to determine which it is before you start to worry, far less prescribe remedies."  The article refers to the press and political alarm over a bout of deflation, now mostly being reported in Europe but well entrenched in Japan and starting to be spotted in the US.

The short column is a really interesting read.  He points out that in reality deflation is economically quite common.  For example, using the longest semi-official price series we see a 140-fold rise in prices in the UK since 1750.  But if you ignore what happened with prices during the Napoleonic and First World Wars, that increase all but disappears up through 1938.
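
As a rough sanity check on what a 140-fold rise actually means per year – taking 1750 to 2015 (about 265 years) as the window, which is my assumption for the end point – the average annual rate implied is surprisingly tame:

```python
# Average annual inflation implied by a 140-fold price rise since 1750.
# The end year (2015) is my assumption; the 140-fold figure is from Kay.
years = 2015 - 1750
avg_annual_inflation = 140 ** (1 / years) - 1
print(f"{avg_annual_inflation:.2%}")  # -> about 1.88% per year
```

Two and a half centuries of compounding turn a gentle 1.9 per cent a year into a 140-fold rise, which is why a headline multiple means little without the time dimension attached.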

Kay also explores the fringes of the inflation analytic.  What is the real impact on you and me when that analytic is reported to be just above, or just below, zero?  In reality – not a lot.  The analytic is an aggregate of a somewhat subjective set of goods and services that very often differs between nations and even changes in composition over time.  Some commodity items may be falling in price while other necessary items, such as capital goods, may be rising.  In other words, we need to be wary of the components, not just look at the headline number.
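
To illustrate, here is a toy price index – the basket, weights, and price moves are all invented for the example – showing how a "deflationary" headline can emerge even while most of the basket is rising in price:

```python
# A toy price index: a weighted average of component price changes.
# The basket, weights, and year-over-year moves are invented for illustration.

basket = {
    # item: (weight in the index, year-over-year price change)
    "fuel":     (0.10, -0.20),   # commodity item falling sharply
    "food":     (0.15, -0.02),
    "housing":  (0.35, +0.03),   # required, capital-like item rising
    "services": (0.40, +0.02),
}

headline = sum(weight * change for weight, change in basket.values())
print(f"Headline inflation: {headline:+.2%}")  # -> -0.45%, despite most of the basket rising
```

A headline of -0.45 per cent here says almost nothing on its own: the deflation is entirely a fuel story, while housing and services are still inflating.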

Additionally, he identifies the underlying determinants of real price changes.  Here he refers to some pretty large macro and even long-cycle conditions: productivity changes, the opening up of new lands, global trade patterns and changes to them, and industrialization.  All these interplay and are complex in their own right.  Lastly, Mr Kay does not talk about the sources of the component data, how it is collected (the process involved), or the quality of the data itself.  All of these are fraught with problems.  In the US, monthly data related to unemployment is gathered by federal agencies from each state.  If a state happens to miss the monthly deadline, a federal employee will "guess" the number from past results in order to avoid holding up the federal publication.  There is every chance some "bad" data is used in many, many cases, wherever we look.
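
A minimal sketch of that kind of stop-gap imputation – assuming, hypothetically, that the missing state figure is guessed from its own recent history – shows how the guess flows silently into the national aggregate:

```python
# Toy example: a state misses the deadline, so its figure is "guessed"
# from past months before being rolled into a national total.
# All figures are invented for illustration.

history = {
    "state_a": [510, 520, 515],   # last three monthly figures, all reported
    "state_b": [300, 310, 305],
    "state_c": [210, 205, None],  # missed this month's deadline
}

def latest_or_guess(series: list) -> float:
    """Use the reported value, or fall back to the average of past months."""
    if series[-1] is not None:
        return series[-1]
    past = [v for v in series[:-1] if v is not None]
    return sum(past) / len(past)  # the "guess" -- indistinguishable downstream

national_total = sum(latest_or_guess(s) for s in history.values())
print(national_total)  # 515 + 305 + 207.5 = 1027.5, guess silently included
```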

We can apply John Kay's sage advice to our own use of analytics and information signals.  Big data is all the rage, and the idea that some unknown inference is just waiting to be discovered in the burgeoning set of data that grows every day is quite common.  But will such information really signal an earth-shattering insight?  Possibly.  Does new information suddenly make old information redundant?  Maybe.  Should we change direction at every twist in outcome implied by our dashboards?  Perhaps.

Bottom line: we should not be alarmed by a reported deflationary analytic if the underlying causes are positive for us individually, for businesses, or for the economy.  So "Don't panic, Mr. Mainwaring!"  Also, I need to find a book that looks at global inflation trends and their causes.  Any ideas?


The incurable problem with data fitness

by Andrew White  |  January 27, 2015

I was reading an end user's strategy document for their upcoming Master Data Management (MDM) program.  As part of the strategy there is a significant technology decision – whether to go for a packaged solution from any number of software vendors, or to develop their own custom or bespoke solution.

There was a page listing a number of design assumptions for the custom/bespoke approach.  One stood out for me: "This decision will introduce an inhibitor to the end-to-end solution."  The "this" in "this decision" is in fact the idea that processes and rules will be enforced as data quality issues are identified – issues that would otherwise impact the effective use of information and its fitness for purpose.  Yes, the author of this assumption feels that by inserting a process to resolve business data quality issues at source, the "end-to-end process" would in some way be inhibited.  Really?  Where is the finding that says, "By not addressing our poorly governed business processes, poor quality data bites us in the xxx some undetermined period of time later, and thus processes are more randomly inhibited and with greater impact"?

We should all recognize that most organizations get by quite nicely, thank you very much, with pretty crappy data.  This problem – of poor quality and low fitness of data – is not new, nor will it ever go away.  But anyone who blindly assumes that a process designed to resolve those issues at source is going to be worse than doing nothing is clearly not in the business of selling business process improvement.  There is a trade-off – I admit.  So why not simulate the business process "as is" and "as it could be" and evaluate the two with real business metrics and outcomes?  My anecdotal perspective suggests that 95 times in 100, a process to trap issues at source is less disruptive to the aggregate business outcome.  In fact it will more likely improve business outcomes!
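
In that spirit, here is a minimal "as is" versus "to be" simulation.  All of the rates and costs are invented for illustration; the point is the comparison method, not the numbers:

```python
# Toy comparison: do nothing vs. trap data quality issues at source.
# Error rates and costs below are invented for illustration only.
import random

random.seed(42)

N_RECORDS = 100_000
ERROR_RATE = 0.02          # share of records created with a quality issue
COST_AT_SOURCE = 5.0       # cost to stop and fix a record at creation
COST_DOWNSTREAM = 120.0    # cost when a bad record disrupts a later process
DOWNSTREAM_HIT_RATE = 0.4  # chance a bad record actually bites later

def total_cost(trap_at_source: bool) -> float:
    total = 0.0
    for _ in range(N_RECORDS):
        if random.random() >= ERROR_RATE:
            continue                      # clean record, no cost either way
        if trap_at_source:
            total += COST_AT_SOURCE       # "to be": fix it now, cheaply
        elif random.random() < DOWNSTREAM_HIT_RATE:
            total += COST_DOWNSTREAM      # "as is": it bites us later
    return total

print(f"as is (do nothing):     {total_cost(False):>10,.0f}")
print(f"to be (trap at source): {total_cost(True):>10,.0f}")
```

Under these invented numbers, trapping issues at source costs roughly a tenth of doing nothing.  The real exercise is replacing the invented rates and costs with ones measured from your own processes.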


In the Digital World, Content is King and License Holder is Prince

by Andrew White  |  January 27, 2015

We know that the cost to store information is relatively low compared to the cost of authoring, creating or curating it.  We also know that, in this Internet-based economy, the cost to distribute information is similarly low.  In fact such costs were at the heart of the business disruption experienced in the music business, as the third-party organizations sitting between consumers (you, me, or more likely our children) and the music artist were disintermediated.  I can still remember the experience and fun of Napster – as a means for me to acquire digital copies of the same vinyl I had paid for in years past.  The number of physical stores selling CDs has plummeted.  Streaming is now doing to MP3s and online stores what they had done to physical CD sales.  I just experienced SoundCloud, and I now look at my CD collection and wonder why I ever bothered.  My son has used the service for a long time – he just introduced it to me as I was looking for some Gustav Holst over the weekend – and I found it in seconds, and more than I ever wanted.

But as such costs plummet, the basic equation has become polarized across three participants: the author/artist, the consumer, and the holder of the license to access the IP.  That third participant is where the current battle lies.  The license holder might be the artist, or it might be a third party.  But the keeper of the rights is where the money is going.

Saturday's US print edition of the Wall Street Journal carried an article titled "Music Services Face Suit Over Oldies".  The article highlights how sound recordings were not brought under the protection of federal copyright law until 1972.  Until that time the law was a patchwork of state-level laws.  Apparently some pre-1972 music was recently aired, and the companies that played the music are being sued.  The really interesting part was the last sentence of the article: lawmakers in Washington are considering a broad overhaul of national copyright law, including the oldies issue.

Copyright law is certainly outdated – it does not take account of the efficiency and dynamics of how current technology can legally – and illegally – store, share, or otherwise consume content.  Our business innovation, centered on information, moves much faster than our law-making systems operate.  Additionally, those who write the laws are often from the wrong end of the age spectrum – they don't experience the results of their policy themselves.  But we should all be wary of new laws here.  These laws will likely define much in terms of who might win in the new digital world.


Storm clouds gather over Greece, its debt, and the chances of a ‘Grexit’

by Andrew White  |  January 27, 2015

The Comment columns in today's US print edition of the Financial Times were awash with calls to save Greece and the Euro, in some fashion.  One article talked about forgiving Greece's debt, standing at a burgeoning 175% of GDP.  Another explored how halving the debt might save Greece and the Euro.

In Gideon Rachman's column, entitled "Forgiveness that Europe cannot afford", the author explores what could happen if Greece's debt is forgiven:

  1. A political backlash in northern parts of Europe, which would strengthen far-right and nationalist parties
  2. Far-left and anti-capitalist parties would gain credibility in southern Europe and would press for similar debt write-downs
  3. A breakdown in trust between members of the EU that accepted the Greek write-down, which would make keeping the euro together much harder

Such a write-down could be part of a negotiated settlement that requires reform in Greece.  Such reform would seek to put in place effective business regulation, end closed shops, and implement a reliable tax system.  Would Greece go for this?  If Rachman is right, it won't matter.  A write-down is a default: the market will run riot and run rings around the euro.  Surely it would be better to let Greece exit and reintroduce the drachma; if the Greeks want to fix their problems, and quickly, the drachma would appreciate in value.  If they don't want to fix their problems, their currency will devalue and the problems of Greece will no longer hamper the Euro.  At least that is the theory.

If you have any doubt as to the situation on the ground in Greece today, check out these:

  • The US print edition of the Wall Street Journal – an Opinion piece titled "Greece's Last Evasion" talks about the effort one needs to go through to legally start up a new company.
  • See my book review of The 13th Labour of Hercules, which tells of much more corruption and government misrule weighing on what could be a normal, growing, thriving economy.


Upcoming Gartner Enterprise Information and Master Data Management Summit – ask the analyst…

by Andrew White  |  January 26, 2015

So you are thinking of coming to the 2015 Gartner Enterprise Information and MDM Summit?  Why not indeed – it will be the best place, with the most comprehensive set of knowledge and advice around Enterprise Information Management and MDM on the planet – bar none.  And that's just the sales pitch.

Here is a list of some of the new topics we explore this year:

  • Digital business transformation
  • Open data and how to exploit it
  • Information management meets mobility
  • Data lakes meet information governance (when is a data lake not a data lake?)
  • Making change management stick
  • New roles, new leaders and new responsibilities
  • Content management and unstructured data (is it too late?)
  • Hadoop meets infrastructure

On a more serious note, if you have a question about the summit, the agenda, the content, who will be there, or anything related to information management, why don't you submit a question?  Here is the link: Ask a Question.  I will respond to all questions either by email or in a blog post.  Hopefully we can meet and compare notes in Las Vegas!

See you there!


Beware the Currency Wars of 2015

by Andrew White  |  January 26, 2015

The title of this blog post is the title of an Opinion piece in the Friday January 8th US print edition of the Wall Street Journal – see Beware the Currency Wars of 2015.  It was written by Mike Newton, former macro trader and global head of emerging market FX strategy at HSBC.  The article was spot on in its call for global orchestration of monetary policy: "The situation demands policy coordination."

Bretton Woods came about when the global community accepted a shared goal related to stable prices as a means to support economic growth, over any one nation's attempt at individual growth.  It was an imperfect framework, and it was used to circumvent the fallout of the breakdown in the gold standard, another but slightly less imperfect system.  The gold standard, when based on sterling, worked for longer and was more stable, mainly because Britain's growth objectives were helpful to most other participating nations.  The Bretton Woods agreement broke down chiefly due to France's and the U.S.'s desire to hoard gold, and to U.S. government actions that resulted in an imbalance between its spending and reserves (i.e. debt).  However, for periods of time both systems helped Global Inc. to cope with the then-current economic conditions.  This is what we need today:

  • The US is experiencing lopsided growth, with fast-growing employment but low wage growth, coupled with energy-driven deflation
  • The Euro zone is not a homogenous set of economic states.  It is a mixed bag of states needing different medicine: periphery states (e.g. Greece, Portugal) need investment, debt reduction, and regulatory reform; central and northern economies (e.g. Germany, the Nordics) need higher interest rates to balance export growth.  Add to this Italy, Spain and France – central economies that need regulatory reform, debt reduction, and interest rates different from both the periphery and Germany – and the entire zone is inconsistent.
  • A global financial sector hampered by increased regulation in the interests of saving ourselves from ourselves (and the next financial crisis)

There was a half-baked attempt at global coordination during this economic cycle: the London G20 summit of 2009 produced little more than a memo calling for new spending and better coordination of policy.  Now you have the Fed talking of pushing interest rates up at the same time that the EU is talking of billions of euros of QE.  You have trading blocs doing everything they can to help themselves, and no real coordination at all.  If ever there was a case for real, effective global monetary policy coordination, this is it.  I called this in October last year in Economic Minute 4 – Global Call for Bretton Woods II.  The global economy will lurch on for a long while, accidentally looking for a road to growth.  If we can just put away our own toys for a moment, and sit down as adults, we might agree on a better way.


My Holiday Book Reviews – Imperial Japanese Navy and Army

by Andrew White  |  January 16, 2015

The Imperial Japanese Navy – In the Pacific War, Mark E. Stille, Osprey, 2014

This is a really nice book – part desk reference and part narrative.  By ship class (e.g. carrier types, battleships, cruisers, destroyers, etc.) Stille explores and explains the history of each design, the major structural and technical innovations, and the armaments.  He points out the strengths and weaknesses of each, and also the upgrades received during each ship's lifetime.  For each ship he then details a short history of its service, its major engagements and battle scars.  A really nice resource to have on hand and an easy, engaging read.

The study of the Imperial Japanese Navy (IJN) is fascinating.  Japan is an island nation not unlike Britain.  It had designs on a similar empire, and of course domestically its history, tradition and grandeur go further back in time than they do for Britain.  It was the Washington Treaty of 1922 (see The Washington Conference, 1921-22, edited by Erik Goldstein and John Maurer, CASS, 1994, and my book review) that resulted in the cancellation of a strong alliance between Britain and Japan.  This alliance might have helped prevent the war in the Pacific, but Britain was worried over how such a relationship would damage its political relationship with the US.  The Washington Treaty was an attempt by the U.S. and the UK to (basically) prevent an escalating construction of major fleets that had been crippling economies and budgets ever since the conclusion of WWI.  With the treaty, Japan (and the US and UK) agreed specific limits on large ship types (carriers and battleships), but the treaty did not cover cruisers, though they were the focus of later conferences' attempts at treaty limitations.

In WWII, Japan's strategy against the U.S. was centered on the success the IJN had experienced against Russia in 1905 at the Battle of Tsushima, a decisive naval engagement that effectively decided that war in Japan's favor.  This idea of a decisive naval engagement permeated Japanese strategic thinking regarding the U.S.  Add to this the difference in economic capability between Japan and the US, and the development of Japan's navy was driven by a need for higher quality than the US forces rather than higher quantity.  The very idea of a large number of united carriers as a single strike force was a major Japanese innovation, albeit adapted from Britain's initial experiment at Taranto in the Mediterranean in 1940.  Up until Pearl Harbor the U.S. Navy had used single carriers as support vessels for battleship-centered fleets.

Japan's battleships got bigger and bigger, heavier and heavier, and with larger and larger guns.  The scale, the muscle (much of it hidden in the thickness of the armor), and the beauty of the outline of the Yamato are breathtaking.  Yet she hardly fired her big guns – at the time the largest of any ship on the planet.  But this leads to one of the stranger points of Pacific war history, and it is explored in the book.  The battleships of Japan never really engaged any navy in the large fleet encounter for which the battleship was designed.  There were one or two smaller engagements, but still no significant impact from Japanese battleships.  During the war the carrier eclipsed the battleship as the primary offensive weapon, but nonetheless the Japanese never had the luck, nor could they engineer the conditions, for an effective decisive engagement.  I find this interesting and, in fact, in my gaming simulations of the Pacific war I use such resources more adventurously.  The war is never won by Japan, but it is often extended by a year or so.

Recommended 8 out of 10

The Imperial Japanese Army – The Invincible Years 1941-1942, Bill Yenne, Osprey, 2014

I am in awe of naval history and its impact on the wider course of history.  As such I have not read as much concerning army logistics or army campaigns related to the Pacific war.  I have read a lot concerning Burma, Malaya, and Guadalcanal, but that is about it.  This book nicely fills in the gaps.  The scale and organization needed to pull off so many military activities from December 1941 to the end of 1942 are staggering.  I knew about the secret weapon the Japanese used in Malaya – the bicycle – but Yenne does a nice job of exploring the many other fronts, amphibious landings, and senior Japanese military leaders in action.  The material related to Java, Borneo, Celebes, and New Guinea is splendid.  I didn't learn too much more about Singapore, Malaya, or Guadalcanal since they have been covered quite heavily in other material.

An interesting point Yenne makes is that for the most part the IJA was never really defeated.  That is, it rarely lost a major land battle; it did lose a few, and those few were critical to the strategy.  The early scuffles with Russia (think Zhukov) led to a stalemate on that front that otherwise might have had an impact on the war in Europe, especially as the German Wehrmacht was closing in on Moscow in December 1941.  But many of the cities and islands of the expanded Japanese empire remained firmly under Japanese rule at the end of the war.  And this is to be expected, even if not well thought through (by me at least).  There was no need for the allies, and chiefly the U.S., to fight every emplacement of Japanese land forces.  All they needed to do was a) nullify the Japanese ability to supply distributed forces (which led to the demise of the IJN), and b) then pick and choose which string of islands to conquer, and in what order, to move toward threatening the Japanese homeland.  That second phase was initiated and founded with Operation Cartwheel.  So this was a great book that adds a lot of interesting insight to my WWII bookshelf.

Recommended 8 out of 10

These two books were my Christmas presents from my recently deceased father-in-law – really nice presents on my favorite topic from a dear friend.  Thanks, John.


US Bank/Credit Card Companies Lag Behind Global Security Standard

by Andrew White  |  January 7, 2015

I read “New Credit Cards Fall Short on Fraud Controls” in Monday’s U.S. Print edition of the WSJ.

I find it both sad and incredible that U.S. banks and some retailers that use credit cards (in the article both Discover and J.P. Morgan Chase & Co. are cited) are not adopting the "chip and PIN" technology used elsewhere in the world, and instead plan to continue to use the chip-and-signature capability.  It must be the case that U.S. banks find the losses from credit card fraud more palatable than the costs of adopting the newer, safer technology.

As a European I have lived through the transition from one format to the chip and PIN format.  The change I had to undertake was easy relative to the effort of spotting the fraud on my card in the first place, then proving it to the bank, tracking the paperwork, and re-establishing a new credit card ID with a trusted supplier.  Yet the article reports how U.S. banks have tested chip and PIN and found that some customers in the US do not like the change.  What else are they going to say?  Few of us seek change, but surely this change is warranted?

Banks should seek to eradicate these fraudsters.  Economic principles, it seems, are going to allow these crooks to keep at it.  Only when the levels of fraud become too great will these firms acquiesce to the inevitable.

I have been subject to fraud myself several times: my long-distance telephone service was hijacked 15 years ago, and the fraud stopped once I added a PIN to my account.  I have had a credit card ID stolen (it was used in a duty-free shop in mainland Europe, and it was there that the card was swiped, twice).  A chip and PIN service should be adopted.  Enough already.
