Andrew White is a research vice president and agenda manager for MDM and Analytics at Gartner. His main research focus is master data management (MDM) and the drill-down topic of creating the "single view of the product" using MDM of product data. He was co-chair…

A Day in the Life of an Analyst – at Gartner’s BI and IM Summit 2015, Sydney.

by Andrew White  |  February 23, 2015

I had a packed day – here is a quick run-through of the highlights.

6.01am breakfast at the hotel – only me and the hotel staff at the time.  I guess I was first to breakfast.

6.45am more email back at the room, but need to get ready.  More keynote practice.

8.00am keynote technical check, voice warm up and last minute rehearsal.  All ready.  Looking forward to the keynote now.

9.32am event keynote to a packed room.  A little nervous now.

10.32am event keynote ended perfectly on time.  Awesome.  Break, coffee, review a note from a peer on organizing for effective big data.

11.30am 1-1 with an end user struggling with information governance and getting it established in the business.  One interesting question the user had was this: users are confusing agile information governance with on-demand.  Here is a quick comment on the difference.

  • Agile Information Governance. This is a new adaptation of information governance.  Traditional information governance is broader – even wall-to-wall, end-to-end – very much top-down, formalized, and focused on the classic definition of a decision framework for information access.  This adaptation tends to focus on smaller teams, fewer roles, and in fact less data that is effectively governed.  At the same time the focus is less on decision rights and more on an iterative process that starts small and grows the scope of what is governed over time, and the driver is only ever business-outcome related.
  • On-demand. This is a style of service delivery.  However, information – even information governance – is not a ‘service’ to be delivered.  It is part of what the business does – every day, all day.  Agile information governance might consume on-demand services – but they are not the same.

I think the confusion arises because “agile” implies small, nimble work.  This sounds (perhaps) like on-demand.  But one is very specific work, and the other is a style of delivery.

12.10pm lunch – more email.  Completed peer review of that note on big data.

1.30pm Ask the Analysts regarding Master Data Management.  This was for me a thrilling thirty minutes.  I wish it were an hour.  Twenty-odd folks were peppering me with questions – I just hope I was able to provide value in my responses.  It is hard to tell in such a pressured environment.  Here is a summary of the range of questions we explored:

  • How to cope with systems that call the same data by different terms?
  • How to cope with data called different things in the same hub?
  • How to cope with analytics that look the same but are defined differently?
  • How to avoid MDM processes that slow down the agile data discovery work?
  • What tools are used to clean up data?
  • What tools exist to support MDM programs, including Microsoft SQL Server?
  • How to find or determine the stakeholder who benefits from MDM?

2.15pm quick coffee and bumped into a couple of users in the show floor area.  Quick chat about governance effectiveness.

3.15pm Session: Organizing for Effective Information Management.  Busy room – focus on the organizational challenges that need to be addressed in order to support an effective EIM program.

4.30pm 1-1 with a small vendor with a cloud-based MDM-like technology solution.

5.10pm quick drink and walk around the show floor.  Hung out with Gartner colleagues afterward for a few minutes to compare notes.

5.35pm met with a Gartner sales rep and walked to dinner with an end user.  Great conversation about how to develop and prioritize a broad EIM program that helps move the needle in the business.

9pm back at the hotel.  Clear up email and early to bed.  Tired.


Category: Uncategorized

On a Quest for Dark Data? Then show us your wimp!

by Andrew White  |  February 19, 2015

Have you been following the trickle of news articles reporting the varied attempts around the world to prove the reality of dark matter?  It seems Large Hadron Colliders and the like are all the rage again.  Some unsuspecting town will probably create a black hole and won’t even know it.

Seriously though, it is an interesting tale of exploration.  In last weekend’s US print edition of the Wall Street Journal there was an article titled “A Cosmic Quest for Dark Matter”.  The article reports the latest estimates that the universe we see all around us equates to about 4% of everything.  Dark matter apparently could be a further 23%, and the even more mysterious dark energy makes up the rest.

So the hunt goes on.  Next up are WIMPs, or weakly interacting massive particles.  By smashing together certain particles we should observe tell-tale signs of WIMP behavior, even though they are so small we might not ever see one.  Apparently this is hard work.

I wonder if information in our organizations follows the same ratio of real matter, dark matter and dark energy: that the visible data we see represents about 4% of what really exists, or maybe of what is formally managed, and that the vast majority, dark data, is just hanging around.  Some might be perceived – the equivalent of dark matter.  And the mysterious dark energy is the data we have no awareness of at all?

I wonder which data makes up the WIMPs?


Category: Dark Data, Dark Matter, Data Lake

Enterprise Information and Master Data Management (MDM) Summit Q&A part 2

by Andrew White  |  February 18, 2015

We have in hand a number of questions from attendees at last year’s summit that were handed in at the closing keynote.  We have also received questions from those inquiring about this year’s events in London and Las Vegas, and we have pooled our thoughts on a number of the more common questions we see today. I am posting a series of these questions, along with an answer.  Many of these questions are listed “as received”.  I hope you enjoy them.

  1. Where does MDM [vendors] says like Cordis/Markit EDM fit?
    • Markit (previously known as Cadis) describes itself (on its website) as “a central hub to manage the acquisition, validation, storage and distribution of data in a consistently, fully-audited environment.” This sounds like an information service of some kind.  In consumer goods/retail there is GDSN, which seems to be similar.  It seems like Markit can act as a source of information for use by business applications; or it could – conceivably – replace an internal data store of some kind…perhaps…  So not really MDM, but not far off.  Maybe it’s just a source of information to be considered alongside other sources…part of an overall information infrastructure.
  2. Data quality MQ. Can you speak more on emerging offering around “what should I clean?”
    • The “what” you should clean really depends on “what” is important to your organization. For example, I would suggest NOT trying to clean up supplier master data to help with “single view of supplier” in support of a spend reduction program if, at the time, the business has fat margins and is focused on growing into new markets.  I would however suggest interrogating the quality, consistency and impact of customer, location, market, brand and product data for that same organization.  So the ‘what’ you should consider cleaning up should always be tied in some way to an identified business outcome.  If there is no outcome in mind, and no business sponsor in view, don’t try.  No one will thank you.
  3. Please elaborate concept of virtualization of data in MDM.  Thanks.
    • This is both a simple and a tricky question. Virtualization itself is a simple concept.  Data is “virtualized” in that a copy of the data is rendered elsewhere from where it persists.  This data is not duplicated – it is a representation of the data.  In some ways the registry implementation style of MDM is a form of virtualized master data – but that is a bit of a stretch (see the sketch after this list).  I think it’s fair to say that it is possible to virtualize data as part of a broader MDM program, especially as it relates to the initial setting up and evaluation of sources of data for system-of-record consideration.  But once MDM is implemented there is no virtualized data – the data is actively governed – that is the whole point of MDM.  Virtualizing the results of an MDM program may help someone else who cannot integrate with the MDM hub – that’s a possibility…but very manual, I think.
  4. An MDM project requires a mix of ETL, DQ, and potentially one or more MDM domain – how do clients navigate the multiple MQs?
    • With difficulty in some cases, and easily in others. For example, the vast majority of end-user organizations actually only tackle one of the larger domains at a time, that is to say either a customer-oriented domain or a product-oriented domain.  Trying to do both at the same time would likely suck up so much of the lifeblood of an organization that it won’t be able to do its normal job.  Even for each of these large domains there may still be other domains involved – still “multidomain” – though often related to the single larger domain.  For example, with customer master data you may still include location data.  This would technically still be multidomain MDM, but the main complexity derives from the customer data domain (in most cases).  For all forms of MDM there is a need for some common capabilities like ETL, DQ and so on.  They need to be selected as you go, taking into account longer-term needs to help make acquisition efficient.
  5. Should we focus on a single vendor for each tool or go for best of breed?  Elaborate.
    • This is similar to the last question. The truth is that you need to think along two lines of reasoning.  First, what is the source of complexity in your information-based systems?  Is it derived from one kind of master data, or several?  Are the degrees of complexity equal?  Do these sources of complexity hold business process outcome improvement hostage?  This line of thinking will lead to a conclusion that one or more data domains need to be addressed – and this might mean MDM.  Second, what is the budget window, and what is the risk propensity of the organization?  If you can budget smart and can look ahead more than a year, you might consider evaluating a vendor on its ability to meet current and future MDM program needs.  However, most end-user organizations do not do this.  They select one vendor for one domain (with a real evaluation) and just accept that the same vendor can meet the other data domain needs later, based on a verbal promise.  Sometimes this works out.  Sometimes it works out with a different product from the same vendor.  Sometimes it doesn’t work out.  Users should do proper evaluations for what it is they are buying/licensing.
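To make the registry point in question 3 concrete, here is a minimal sketch – in Python, with invented class and field names, not any particular vendor’s product – of a registry-style hub that “virtualizes” master data: the hub persists only cross-references between source records, and a consolidated view is assembled on demand at read time rather than copied and stored.

    # Hypothetical registry-style MDM hub: only cross-references are persisted
    # centrally; attribute values stay in the source systems, and a "virtual"
    # golden record is assembled on demand.

    class RegistryHub:
        def __init__(self, sources):
            self.sources = sources   # e.g. {"crm": crm_db, "erp": erp_db}
            self.xref = {}           # golden_id -> [(source_name, local_id), ...]

        def register(self, golden_id, source_name, local_id):
            """Record that a source system's local record maps to a golden ID."""
            self.xref.setdefault(golden_id, []).append((source_name, local_id))

        def golden_record(self, golden_id):
            """Assemble a virtualized view at read time; nothing is duplicated."""
            view = {}
            for source_name, local_id in self.xref.get(golden_id, []):
                for field, value in self.sources[source_name].get(local_id, {}).items():
                    view.setdefault(field, value)  # naive survivorship: first source wins
            return view

    # Toy source systems standing in for real application databases.
    crm = {"c-101": {"name": "ACME Corp", "phone": "555-0100"}}
    erp = {"e-77": {"name": "ACME Corporation", "credit_limit": 50000}}

    hub = RegistryHub({"crm": crm, "erp": erp})
    hub.register("G-1", "crm", "c-101")
    hub.register("G-1", "erp", "e-77")
    print(hub.golden_record("G-1"))
    # {'name': 'ACME Corp', 'phone': '555-0100', 'credit_limit': 50000}

Note the contrast with a consolidated implementation style: there the hub would persist (and actively govern) the golden record itself, which is why the answer above says that once MDM is fully implemented the data is no longer merely virtualized.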

Why don’t you come along to our summits – ask your questions – and let’s see if we can help!

Here is the first set of questions I responded to.

  • From a vendor perspective, why are multi-domain MDM solutions so challenging?  I.e. Technical/process problems?  Or marketing problems?  Other?
  • What are your thoughts on product data solutions for e-commerce?
  • Magic Quadrant for data model management?  Vendors ER studio, Power Designer, ERWin etc.  Criteria?  Repository power, version control, meta-data, cross-platform support?
  • I keep hearing that “the business” needs to own the rules, the process, be the data stewards.  So what do you do when your sales team just wants to sell and they believe MDM and data quality is an IT problem?
  • In terms of architecting the information environment, how do organizations understand the requirements for MDM and CRM tools?  What is the difference between the two?  When would I choose one over the other…or do we need both?



Category: MDM Summit, MDM-Summit-NA

Mobile Gaming is Dumbing Down PC Gaming – part 2

by Andrew White  |  February 18, 2015

I blogged on this topic recently: Why Mobile Gaming is Killing our High Tech Industry – though Doom and id Software may yet help…

I claimed that dollars spent to “dumb down” PC-based games to fit on a mobile platform are dollars that would have been better spent pushing the PC-based game boundaries.  The short-term revenue gains from the new channel are offset against the pioneering investment in pushing the PC boundaries that promises even larger profits for more companies later.

There is clearly money to be made in selling games, even what was originally a PC-based game, to mobile users, but the technology will not focus on pushing the traditional boundaries of UI or processing development.  UIs are smaller and processing power is limited; though I admit there might be innovation in the P2P or collaboration space.  Networking and peer consistency are not pushed that much, since those are already part of large-scale PC gaming environments today like World of Warcraft and Eve Online.  Heck, even the defunct Sims Online had more users than most mobile networked games have so far.

The bottom line is that the boundaries of PC gaming are shrinking, not growing, and this bodes poorly for our high-tech industry. One reason many of us have powerful PCs at home and at work is due to FPS gaming in the 1980s and 1990s.  If Doom had not happened, I would not be sitting here now, and many software firms would not be where they are today either.

Well, I have more proof, sadly, in one of my favorite game designers, Peter Molyneux.  He is the creator of Black and White, an early form of ‘god game’.  Well, he used Kickstarter to launch a new game, Godus, that was to be what Black and White never was.  Two years later he admits that adding mobile support, before the PC game was even finished, killed off the vision for the game.  Money diverted from the PC game to start the mobile version denuded the PC game of much-needed funds and focus.  Now Godus is in trouble.  See here.  It was only a year ago that things looked so good…  And to top it all off, the Kickstarter will fail too, since the original game goals on the PC will not have been met.  I paid for a half-finished game, re-designed to pander to the mobile masses.  Sadly, on the video you can hear Peter admitting that someone else now has a better vision than he.  Incredible.

Everywhere you look you can see erstwhile successful PC gaming publishers adding mobile support to their PC games. Instead of developing a new PC game, they go off and adapt a PC game to a mobile platform.  Or, much worse, they design a game from the start to work across multiple platforms, thus assuring the PC game will be a compromise and not an innovation.  Mainframe software was adapted to powerful PCs that supported real-time what-if analysis of business and supply chain scenarios.  Those PCs were only there due to things like Doom.


Category: God Games, Godus, Id Software, Innovation, Peter Molyneux

Is it fair to ask: can you outsource innovation?

by Andrew White  |  February 18, 2015

There was an interesting story on the front page of the Business & Tech section of today’s US print edition of the Wall Street Journal.  The article is called “Why GM Hired 8,000 Programmers”.  The article notes that two years ago GM ended a $3bn IT outsourcing deal.  Now the firm has built and deployed some innovative commerce-driving web site solutions to differentiate itself in the market.  These innovative solutions were built by the now “in-sourced” programmers hired by GM.  Note too that the CIO is no newbie but none other than Randy Mott, former CIO at Wal-Mart.  At Wal-Mart Randy was a leader during a period of great innovation in retail and consumer goods, namely what became CPFR.

So what do outsourcing agencies say about this story?

A couple of points and questions come to mind as I read the article.

  • What is the main value proposition for outsourcing?  Is it cost cutting and efficiency or is it innovation?

Outsourcing is a great way to access highly repeatable services.  The provider stays in business by scaling repeatable work.  If it offered one-off work, the costs would skyrocket and the service provider would have to change business models.

Equally, working with outsourcing companies often ends up being a discussion more about service-level agreements and terms and conditions – and this before you get to define the new requirements.  In contrast, with your own resources you just issue a new edict.

  • What role does a strong, effective CIO play in this decision?

A strong CIO understands the business and how information and technology (they are different!) change the game.  They ‘get’ outsourcing and put it in its rightful place.  A CIO that plans on making a successful career based only on outsourcing tends to be short-lived: the focus will be on cost cutting, not innovative investment for growth.

I believe outsourcing makes sense for work that does not provide differentiation or innovation.  I believe it can free up funds. But you should try to sell the resulting new investment along with the cost savings – at the same time.  And you need a leader, not a pacifier.

On reflection I think the question, “can you outsource innovation?” is the wrong question.  I think you can use outsourcing to help fund innovation; and some services offered by outsourcing vendors can be consumed in some innovation.  But the idea that a core business differentiation or innovation can be outsourced is hard to accept, for anything but the shortest amount of time possible.

PS: a little-known secret related to CPFR: the original CFAR pilot with Warner Lambert (now Pfizer) used EDI transactions – there was no hint of using real-time messaging via the Internet at that time.  I know – I was the one who asked the question publicly of the pilot presenters at Benchmarking Partners.


Category: Differentiation, Innovation, Outsourcing

Gartner’s Enterprise Information and Master Data Management Summits 2015

by Andrew White  |  February 17, 2015

I am really looking forward to our upcoming summits in London, UK and Las Vegas, Nevada.  It is with a slight tinge of sadness that I attend as Conference Chair for the US event, as I have been Conference Chair since inception, and this is my last year. I am giving way to some new blood and will now focus on other responsibilities.  It has been a wonderful ride, and I learned so much.  I even got the chance to meet and hopefully help end users learn and discover the power of information – and hopefully how Gartner can help them in their work.

The event has changed dramatically.  When we first met at the world’s first Master Data Management summit in Hollywood, Florida, there were about 250 of us.  It was a massive “group hug” as folks nervously assembled in the sunshine to find others with the same “disease”.  MDM had finally shorn itself of its historical trappings of Customer Data Integration and Product Information Management.  By 2014 the event had grown to nearly 900 folks, and the event itself had outgrown the initial birth of MDM and had entered its next phase (more on this later). John Radcliffe, my friend and partner in crime, and I took the keynote stage, and the rest is history.  In some keynotes we wore hats (he was “party” and I was “thing”); in some keynotes we wore beach shirts, and I actually tore off my shirt.  Once I even dressed up in a soccer referee’s uniform (to make a point about policy enforcement and governance).  John retired from Gartner in 2012.

John and I actually first met some years before at a Gartner offsite where we were working on research coverage for the following year.  He was in the CRM team covering CDI and I was in the ERP/Supply Chain team covering PIM.  We were literally leaning against different sides of a post in a large hotel ballroom, listening to our leaders present.  We just happened to lean forward, spot each other, and strike up a conversation.  Within a few minutes we spotted what turned out to be a pattern that was to change our careers and has now become something much bigger.

We started out with a single event in the US with 250 attendees.  A few years later we spun out into Europe.  In 2014 we hosted two events (US and EMEA); the US event grew to nearly 900 attendees.  In 2014, and more so in 2015, the MDM event has grown again.  MDM remains at the center of information management.  MDM remains for many the starting point of another attempt at an effective, overall information management program.  But the event is now getting larger and broader.  We now have content and Gartner speakers spanning the total range of information management.  This year we have content for Enterprise Content Management (ECM).  We even have a case study presentation lined up to explore one organization’s multiyear vision and reality for ECM.  ECM itself is a discrete IM program, but when you align vision, strategy, metrics, governance, organization, life cycle and even technology across programs, you are attempting EIM.  Hence EIM becomes the umbrella; inside it we started with MDM, but we are now spreading our wings.

The event this year will again sport four tracks – one of which will focus on business leaders (to support the idea of a team-send) and also program leaders.  Most of the content at the event will be Gartner content and end-user case studies.  There are vendors too – some of the industry’s largest and/or most innovative technology and service providers.  So if you want to get the latest in MDM, EIM and things like ECM, and meet vendors in the space, this is the place to go.

In future events and locations we will likely have entire tracks for ECM, Records Management, e-Discovery, Metadata Management, and maybe even Business Intelligence (itself today a larger event….but who knows…), application integration, and so on.  Information is central to the digital business.  This event is already the “go to” place for the Chief Data Officer, and the CIO or information leader responsible for information strategy, governance, risk, compliance and innovation.  I hope to see you there this year – and next.


Category: Chief Data Officer, Enterprise Content Management (ECM), Enterprise Information Management (EIM), MDM Summit, MDM-Summit-NA

Enterprise Information and Master Data Management (MDM) Summit Q&A

by Andrew White  |  February 7, 2015

We have in hand a number of questions from attendees at last year’s summit that were handed in at the closing keynote.  We have also received questions from those inquiring about this year’s events in London and Las Vegas, and we have pooled our thoughts on a number of the more common questions we see today. I am posting a series of these questions, along with an answer.  Many of these questions are listed “as received”.  I hope you enjoy them.

  1. From a vendor perspective, why are multi-domain MDM solutions so challenging?  I.e. Technical/process problems?  Or marketing problems?  Other?
    • Though the range of technologies needed to support MDM (e.g. data quality, data model, dashboard, workflow, analytics, integration etc.) looks the same in name, the reality is that there are differences. The differences in the technologies needed for different data domains such as customer and product, as well as additional differences that are specific to industries even for the same data domain, result in different capabilities.  For example, for customer data you most often need a data quality capability such as entity resolution. For product data the more common data quality capability you need is semantic and/or text-string parsing (see the sketch after this list).  Vendors have addressed the markets in different ways to seek competitive positions.  Thus specialization leads to a solution fitted for one data domain and/or industry that is less competitive in another.  To try to put all the capability, all the intellectual property, and all the industry support into one single product or solution is just far-fetched.  There are some end users whose requirements are not that complex, and as such they may use a product that does actually support several domains, but these are still relatively rare.
  2. What are your thoughts on product data solutions for e-commerce?
    • This is a complex question, really. Some of the MDM of product data vendors have focused on customer-facing business processes, such as those related to order to cash and e-commerce, or even omni-commerce.  Some of these solutions have therefore been asked by end users to store much more than structured product data.  What vendor do you know ever says, “No, I can’t do that”?  At the same time, some MDM of customer data solutions have been deployed in a different part of the same organization, focused on customer data.  With big data there is even more need to integrate and unify the customer, prospect, product and sentiment data.  Thus MDM is at the center of that conversation too.  Not an easy one to answer definitively.  Hope this helps identify some of the trends.
  3. Magic Quadrant for data model management?  Vendors ER studio, Power Designer, ERWin etc.  Criteria?  Repository power, version control, meta-data, cross-platform support?
    • Great question. The challenges here are several.  This is not really one market.  A lot of the vendors or tools that provided some of these capabilities have been acquired, and the capability has been rolled up into another product and so into another market (and so another Magic Quadrant) as a feature.  Some of the technologies are widely re-usable for (say) MDM, application integration, B2B, data migration and so on.  Thus the use case differs greatly too.  So there are several different capabilities that architects and designers use, and increasingly some of this data is needed by governance and stewardship users in lines of business.  So we do watch the space, and if we see coalescence of demand (from you) and a growing set of interchangeable, competing offerings from vendors, you can guarantee we would cover that with a Magic Quadrant.
  4. I keep hearing that “the business” needs to own the rules, the process, be the data stewards.  So what do you do when your sales team just wants to sell and they believe MDM and data quality is an IT problem?
    • This is one of my favorite questions. It’s the wrong kind of question though, and the fact that it arises suggests that MDM (and data quality, for that matter) is not understood by the business people the question refers to.  When I was a supply chain planner at Elizabeth Arden I was the informal information steward.  I accepted the role unofficially, from my peers – I was a citizen steward.  Whenever one of our business processes, such as planning a new product or revising a production run for a new promotion, was held hostage to bad data, the other business users would come to me to “fix it”.  I was the only one that had an understanding of the IT systems and apps we were using, and I knew how to get things fixed.  In this case the business (I was representing the business) knew the value of information and we did the work.  It is the same for sales.  Unless the sales team understands what bad data does to their sales process, they won’t care for the data.  So the question should never come up – if the right education is established.
  5. In terms of architecting the information environment, how do organizations understand the requirements for MDM and CRM tools?  What is the difference between the two?  When would I choose one over the other…or do we need both?
    • This is quite a broad question – given that MDM and CRM are two very different things. CRM is a business strategy focused on understanding and servicing a total relationship with customers.  This may need a CRM application, and each one may come with its own customer master file.  MDM is a program that assures consistency of master data (not all data) – such as customer or product master data. MDM programs are supported by MDM solutions that bring with them a tool set to help reconcile and manage those disparate versions of master data – in this case customer data across all your CRM apps, suites, and cloud-based apps – to ensure you have only one version of the customer across them.  From a business perspective you need CRM (if you sell to customers) and MDM.  That is not the same as saying you need a CRM app and an MDM app.  The degree to which you need technology cannot be answered in this format – it requires a detailed understanding of your current business, application and information environment, a view of where those are headed, and a clear view of what business goals are to be exploited and/or maximized.
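To illustrate the contrast drawn in question 1 between the typical data quality capabilities of the customer and product domains, here is a minimal sketch in Python.  The matching threshold and regular expressions are invented for illustration; real MDM and data quality tools use far richer matching (address standardization, phonetics, machine learning) and curated attribute dictionaries.

    import difflib
    import re

    # Customer domain: entity resolution - do two records refer to the same party?
    def same_customer(a, b, threshold=0.7):
        """Naive entity resolution: fuzzy-compare normalized names.
        The threshold is illustrative only."""
        norm = lambda s: re.sub(r"\W+", " ", s.lower()).strip()
        return difflib.SequenceMatcher(None, norm(a), norm(b)).ratio() >= threshold

    print(same_customer("ACME Corp.", "Acme Corporation"))  # True (ratio ~0.72)
    print(same_customer("ACME Corp.", "Apex Holdings"))     # False

    # Product domain: semantic/text-string parsing - extract structured attributes.
    def parse_product(description):
        """Naive attribute extraction from a free-text product description."""
        attrs = {}
        size = re.search(r"(\d+(?:\.\d+)?)\s*(oz|ml|kg|g)\b", description, re.I)
        if size:
            attrs["size"], attrs["uom"] = float(size.group(1)), size.group(2).lower()
        pack = re.search(r"(\d+)\s*[-x]?\s*pack\b", description, re.I)
        if pack:
            attrs["pack_count"] = int(pack.group(1))
        return attrs

    print(parse_product("Shampoo 13.5 oz 6-pack"))
    # {'size': 13.5, 'uom': 'oz', 'pack_count': 6}

The point is that “data quality” names two quite different mechanics here – record matching versus attribute extraction – which is one reason a vendor strong in one domain is not automatically strong in the other.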


Category: Application Architecture, Applications, CRM, E-Commerce, Education, Enterprise Information Management (EIM), Information Architecture, Master Data Management, MDM, MDM of Customer Data, MDM of Product Data, MDM Summit, MDM-Summit-NA, Multicommerce, Omni Channel

Unintended consequences: mortgage lending growth inhibits economic recovery

by Andrew White  |  February 5, 2015

In this week’s US print edition of the Economist there is an article entitled “Banks have been boosting mortgage lending for decades, at the expense of corporate loans”.  The article is the Free Exchange column, under the heading “As Safe as Houses”.  The article reports on recent research in the US suggesting that over the last 150 years there has been a slow but gradual shift in the composition of bank loans.  It seems the bank loan mix has shifted from around 30% mortgages in the 1900s to about 60% more recently.  The ‘loser’ has been business loans, used for capital improvement etc.

“So what?” you may ask.  The result of this “squeeze” on business is that those organizations have needed to go elsewhere for their funding.  The place they went, according to the article, has been the bond market. This might sound OK, but it seems the risks the bond market takes on are different from those banks take.  The bond market prefers longer-term risks over short-term ones.  Unfortunately this means that small start-up firms, who predominantly take out a large share of the shorter-term loans, have been hardest hit.

The implication is that innovation and creative destruction, the ultimate sources of economic growth, could be stymied.  It turns out that there is ample evidence that the rate of growth of start-ups in the US has slowed. This slowing down of new business development does not cause recession, but it does slow down any chance of a rebound or recovery once a slowdown gets entrenched.  This is where we are today.

So the conclusion of the article is that well-intentioned, government-led programs to promote home ownership, which itself was thought to promote economic stability, have actually contributed, or even led, to a condition that undermines that very stability.


Category: Debt, Economic Growth, Economy

Currency Pegging – Size Doesn’t Matter – You Knew That

by Andrew White  |  February 4, 2015

There was a small article about a small European country (Denmark) in Tuesday’s US print edition of the Financial Times.  The article was called “Danes shield currency peg from storm unleashed by Swiss”.  The article explains how the Danish central bank has been active in protecting (i.e. maintaining) its currency peg to the euro, just as speculation in the krone is increasing.  This increase in speculation is part of the fall-out from the Swiss central bank’s recent move to abandon the ceiling under which its currency had been held against the euro.

Though Denmark is a small nation relative to the euro zone, I get the feeling the role it plays will be much larger.

On Wednesday September 16th 1992 I was sitting in an office in Liverpool helping United Biscuits, a large UK-based (at the time) confectionery and food group, develop its ERP and SCM implementation strategy. Someone burst into the room around 10am to say that the Bank of England had just upped its interest rate to 15%.  This was in defense of the pound, which had been under extreme pressure in the markets.  The pound had been pegged to the D-Mark for some time, and then (via the ECU) joined the Exchange Rate Mechanism, or ERM.

The problem in 1992 was the disparity between the UK and continental European economic cycles and the exchange rate adopted to equalize the currencies.  The loss of faith by the markets in those relationships led to selling of the pound, pushing the currency to the extreme edge of its agreed ERM band.  The emergency move in interest rates was an attempt to staunch the bleeding and prevent the Bank of England from emptying its coffers in defense of the pound.  It was never going to be enough – by that evening the Chancellor of the Exchequer announced the pound would have to leave the ERM and therefore float.  It became known as Black Wednesday, and it shows how you cannot buck the markets.

Denmark is about to undergo a similar experience, though I hope it does not lead to the same magnitude of effort or result.  What is different here is the size of the currency trade (very small compared to sterling), the size of the economy (smaller), the state of the economic cycle (not out of whack with mainland Europe) and the financial reserves (smaller).  Also different is that Denmark has been pegged first to the D-Mark and then to the euro for nearly 30 years.  So it is a successful peg – one of the very few in history.  But Denmark is also, ominously, the last member of ERM2, successor to the ERM from which the UK, then Italy, withdrew.

Currency speculators were recently caught flat-footed by the Swiss move.  They don’t want to be burned again, and the pressure is only just mounting, according to the article.  The economic impact won’t be significant, but I think the political impact will far outweigh any other.

Interestingly, once Britain left the ERM in 1992 its economy immediately went on to grow faster than that of the future euro zone.  Its economy was freed from self-imposed currency manipulation, and the exchange rate evolved towards its natural economic competitive position. Denmark won’t likely experience this economic boom, for one simple reason.  Much of its trade is with Germany, with which its interest rates are aligned more practically, and so the Danish economy is very much on the same cycle as Germany. There is no pent-up, out-of-cycle growth to exploit.  And the challenge Denmark faces is down to pure market speculation, not down to any loss of faith by the market in any economic issue.

The goal of exchange rate and price stability, one of the main selling points of the euro, is not and never was the sole economic goal of trading nations.  It is an important goal that should be balanced against natural economic disparity across trading regions.  The euro has not been able to achieve this, yet.  Denmark, though not economically out of cycle with the central euro zone regions (i.e. Germany), may just be another speculative bump along the road.



Category: Denmark, Economy, Euro, Exchange Rate Mechanism, Exchange Rates, Interest Rates

Greece’s troubles will haunt the Euro

by Andrew White  |  February 4, 2015

I don’t write book reviews for books I am only halfway through reading.  But I noted in Tuesday’s US print edition of the Wall Street Journal an article titled “Europe Stimulus to Lift Biggest Naysayer, for One”.  The article highlights how Germany will benefit from the ECB’s massive QE program.  The part of the article that caught my attention was a comment made by Germany’s central bank president, Jens Weidmann, who is quoted as “saying that the ECB’s massive stimulus would fail to address the euro zone’s debt and competitiveness problems.  The program, he says, will take pressure off countries such as Italy and France to revamp their labor market and push through other economic overhauls.”

So you might dismiss this as political hyperbole, or anti-ECB politicking.  That is, until you read The Euro Trap, by Hans-Werner Sinn (2014, Oxford University Press).  I am but halfway through, and the data and arguments exploring and exposing what happened in the run-up to and aftermath of the euro are simply alarming.

The euro itself had such a calming effect on interest rates for joining countries that it actually created the very bubbles that now threaten its own existence.  With euro membership in the near future for countries like Greece, Portugal and even Italy, such countries were “forgiven” by the market for their profligate spending.  In effect these countries were joining the euro in order to buy their way into German interest rates.  Their debt burdens were reduced as a result, thus allowing the opportunity to reduce debt more cheaply.  These countries did not do so, and in fact increased debt and went on a spree to pay themselves more than their economic productivity would have allowed.  This is what Mr. Weidmann is talking about.  With the euro’s introduction those same countries forged ahead of other countries with higher wages, without any real improvement in productivity.  The euro made this possible. And now these issues are again at the center of the underlying strains and imbalances across the euro area.

The book is chock-full of information you just have to see to believe.  The euro is far from safe.  There are just too many organic or structural forces at play.  It is amazing it [the euro] has lasted this long.  After reading half the book I recommend it.  It’s the closest thing an economist gets to a thriller.  Reading it you hope that the murderer is caught and you seek a happy ending.  The problem with this thriller is that it is still being written.


Category: Economic Growth, Economic History, Economic Productivity, Economy, Euro, European Central Bank (ECB)