by Andrew White | December 18, 2014 | 3 Comments
I saw an interesting article in this week's Economist (US print edition, Dec 13-19). It was an article in the science and technology section, titled “A history of birds in 48 genomes“. The article explores new research that supports the idea that birds are descendants of one of the dinosaur lines, the one that actually included Tyrannosaurus Rex. Being able to talk about dinosaurs is of course cool. But what was really cool was the way in which the conclusions were reached.
Traditionally scientists would classify species into their position in the tree of life according to their physical and behavioral characteristics. But deciding which characteristics were more important was a tricky business. Using genes was a more effective way to simplify the effort, but until recently only a handful of genes could be used. The article highlights two breakthroughs that have changed this. One is the falling cost of rapid gene sequencing, and the other is the increase in computing power. In other words, more computing power, more data (whole genomes rather than a handful of genes), and better techniques (a more mature process) have created a much more effective outcome. And additional insight can now be observed, as the article explains.
One word of caution for those in love with “big data just means more data”. The effort here went from “some data” to “all the data”. There was no ill-defined data lake or unknown set of data to draw on. The genome is a finite set of data. So this is a good “right data” case study and not an “any old big data works” case study. This story does not support the promise that more data is always a good thing. There is ample evidence that more data alone does not necessarily produce better results; in some cases it produces worse results. Check out More Than You Know: Finding Financial Wisdom in Unconventional Places, Michael J. Mauboussin, Columbia University Press, 2007, for an interesting read.
Category: Big Data Business Intelligence Right Data Tags:
by Andrew White | December 17, 2014 | 1 Comment
Fatal Risk: A Cautionary Tale of AIG’s Corporate Suicide, Roddy Boyd, 2011, Wiley. Not actually sure why I read this book. I guess it rounds out the broad set of tales relating to what happened during the financial crisis. Boyd’s style is a little annoying. While overall the book is a chronology of what happened, in each chapter Boyd tackles one topic (AIG and banking, AIG and property, AIG and hedge funds, AIG and mortgages etc.) then takes us back to the source of that topic’s ills. We thus tend to keep repeating the same points about the disaster over and over in each section. I would have preferred a consistently chronological tale. In a nutshell, though, the author does a good job exploring the roots of AIG’s downfall. It was about excessive risk taking by a firm dedicated to helping others protect themselves from risk. Boyd also shows simply how useless the rating agencies were at the time. Just when AIG needed some stability in its finances – a breathing space in order to clear things up – its credit ratings were downgraded. The downgrade triggered an increase in its required capital holdings as a proportion of its exposed risk. This requirement was just not very useful at that moment. It’s as if the rating agencies are always behind the curve: the downgrade came too late and ended up playing a not insignificant role in blowing open the sinkhole of funds needed to save the firm. If the rating agencies had been more on the ball and downgraded AIG months before, the story might have been different. Since Hank Greenberg left AIG just as the disaster hit, we are denied his participation. It would have been fun to imagine him in meetings, at the helm, as the earth opened up under him. Recommended: 7 out of 10.
Category: Book Review Tags:
by Andrew White | December 16, 2014 | 1 Comment
Gartner is hosting its 2015 Enterprise Information and Master Data Management Summits – in EMEA on March 11th and 12th in London, UK, and in the US on April 1st and 2nd in Las Vegas, NV. As part of the exciting agenda we always include end-user case study presentations. They are the best attended and almost always attract the highest scores when rated by attendees. Who can resist a good war story about how peers fought through insurmountable odds and ended up with a successful information management program?
We are looking for case study speakers right now. We have a couple of slots left for the EMEA event and we are just opening up the “doors” for the US event. So if you are interested, drop me an email (firstname.lastname@example.org) and we can chat over the phone. I can share with you the details and we can go from there. No commitments needed yet – just a chat. Go on, reach out. You know you want to be on stage and share your success!!!
Our surveys suggest that end users are particularly looking for case studies across:
- Enterprise Information Management (EIM)
- Master Data Management (MDM)
- Information Governance and Stewardship
- Innovative use of Information Management technology
If you have a good case study, exploring business value across the use, management and governance of content, records, analytics, structured information, big data, open/linked data, or even someone else’s data, reach out to me and let’s chat.
And sorry, no vendors allowed.
Category: MDM Summit MDM-Summit-NA Tags:
by Andrew White | December 12, 2014 | Submit a Comment
In Tuesday’s (December 2nd, 2014) US print edition of the Wall Street Journal there was an Opinion piece called, “Congress’s Budget Office Needs Better Numbers“. It seems that the tools and practices used and followed by the CBO have not been kept up to date. The result is that the main source of information used to drive congressional policy on financial issues is outdated, outmoded and, more specifically, flawed. The impact of this is that decisions taken by lawmakers may in fact produce unintended and possibly negative outcomes.
The article gives several examples of the sources of these apparent flaws:
- The CBO uses a model to try to understand the deleterious impacts of changes in tax policy and how these affect behavior. It seems the models have done a poor job of accurately predicting the effects of tax increases and of regulations that impact growth.
- The models used to relate how private competition replaces public costs are inaccurate. It seems the CBO models consistently underestimate the cost savings from allowing private insurers to replace, for example, single-payer health benefits.
- Overall transparency of the model itself is lacking. It seems that much of the work and the model is operated as a black box, which means few others, even outside the CBO, can offer ideas on how to improve the work and therefore improve the quality of the findings informing political debate and decisions.
If this were a public company, or part of one, and if the findings in the article were in fact proven, heads would roll. It is not likely that such poorly performing work would be allowed to continue. If assets don’t add value to their intended outcome, investment in them would change. I assume the article implies a current political motivation that prevents improvements in what the CBO does. I hope better heads will prevail and changes will be made.
Category: Business Intelligence Business Outcome Business Performance Congressional Budget Office Tags:
by Andrew White | December 11, 2014 | 1 Comment
I spied an interesting article in today’s US print edition of the Financial Times called, “A banking grandee’s rethink on the rules of the game“. Sir David Walker, chairman of Barclays and author of a government report that has proven very influential in guiding recent regulatory developments, is interviewed. His perspectives on what he would have done then, knowing what he knows now, are most interesting. One thing explored in the article is the roles and new board oversight committees he would put in place. One concerns risks, external and internal to the business, specifically focusing on “unknown unknowns”. He says, for example, he didn’t know what cybercrime was five years ago.
At Gartner we periodically discuss the scope of the role of the new CDO. This emerging role is really just an excuse to put a laser-like focus on the part of the CIO’s role that has, for many CIOs, been hard to develop and sustain. For example, too many CIOs tell us that they are so busy negotiating vendor contracts or managing physical infrastructure issues that they have little time left to look at information innovation or strategy. To help, the message is “hire a CDO”. Or, elevate the CIO role and delegate the other stuff to someone else.
The scope of the CDO has tended to focus on information compliance (often in regulated industries) and also information innovation, information strategy and information governance, including leading the Master Data Management program. Sir David Walker’s interview makes it clear to me that the role of the CDO should also include information security and information risk management. This is not to say that chief risk officers or security and firewall departments need to report to the CDO. It just means that there is value in a unified organizational structure where such reins converge, as they relate to information.
Physical technology and organizational security can carry on doing what it needs to do. But those that understand the life and times of information, and its relevant impact on business operations and performance, are best suited to analyzing, understanding, and setting information policy for the entirety of the information life-cycle. I suspect this is not an open and shut case, though. The role of the CDO is still emerging; more are hired every day. And the roles assigned differ, often across organizations even in the same industry. But if I try to be prescriptive, I just feel that articles and analysis from the likes of the chairman of Barclays lead me to this conclusion. What do you think?
Category: Banking Banking Regulation Chief Data Officer CIO Information Governance Information Innovation Information Leadership Information Risk Management Information Strategy Tags:
by Andrew White | December 10, 2014 | 1 Comment
Hey, ever wanted to be an analyst? It is thoroughly recommended. After being an end user in several organizations (never working in IT, I have to add), then working in the vendor space for 9 years, and now as an analyst, I can honestly say that the experience provides unique insight into an invaluable dimension I didn’t know existed as an end user or vendor. We are hiring, and specifically we are hiring for information management. This role will be focused on information technologies and specifically Master Data Management, with a focus more on product data (this time) and less so on customer data. So this means experience in and around business process management (BPM) and business applications like ERP, SCM, PLM, procurement etc. is most valuable. So if you are interested, let me know at email@example.com. Here are the details in case you are interested. The role is targeted for the US.
PS someone emailed me directly last week and my Outlook literally ate the email. You might want to resend to me if I did not respond to you. Sorry about that.
Category: Uncategorized Tags:
by Andrew White | December 10, 2014 | 2 Comments
Two articles yesterday suggest to me that things are not what they seem.
In Monday’s US print edition of the Wall Street Journal there was an article on the front page called, “Banks Ask Big Firms to Move Out Cash“. New regulations designed to help stabilize the banking sector and protect it from risks during recessions and financial crises may actually put it at more risk. Banks are encouraged to maintain higher reserves against the loans and debts they hold. If banks increase their loans or debts they are to increase their cash or credit reserves equally. Sounds OK, overall. The problem is that each kind of asset is treated differently. Large sums of cash would ordinarily be deposited in banks by hedge funds and other cash-rich entities. Think about that: this is what banks do. Such accumulation of cash (savings deposits) adds to investment, in that banks use that cash and loan it on to others who want to build firms, bridges, and schools. But the new regulation is forcing banks to increase their holdings of reserves, and such an increase in reserves means that banks have less money to loan to those that want to grow the economy. Additionally, some government agencies are considering charging interest on larger deposits! Yes, a form of charge for saving. Can you believe it? Both policies will reduce the demand on banks for savings. The result will be less saving for subsequent investment. This is Freakonomics at its best!
The second article, inside the front cover of the same Wall Street Journal, was titled “Mortgage Firms Detail Aid Program“. There is so much wrong with this article. Fannie Mae and Freddie Mac are lowering their requirements for mortgage applications in order to help get the mortgage business moving. They plan to reduce the required down payment, perhaps to as little as 3%. This is madness. We should not lower our standards; we should uphold and advertise them. We should focus policy on economic growth to put money in people’s pockets so they can decide, if they want to, to save for a new house. We should not dabble in social engineering. We are willingly repeating the same mistakes that led to the financial crisis that ruined our economy. This should be stopped at all costs.
Category: Banking Banking Regulation Big Government Freakonomics Mortgages Tags:
by Andrew White | December 10, 2014 | 3 Comments
Two articles today in the US print edition of the Financial Times suggest that it is only a matter of time before Greece leaves the euro. And with its leaving, trust in the euro would actually rise (after an initial fall), and Greece would be able to manage its own inefficiencies away with exchange rate policy and a weak drachma (or whatever its new currency would be called).
The front page of the newspaper sports an article called, “Snap election in Greece reignites fears for eurozone“. The article explains how the Greek prime minister has called snap elections; the risk is that if he loses, an extreme far-left party (Syriza) might win. Syriza is growing in popularity on an anti-austerity, anti-euro ticket.
You cannot forget that youth unemployment is near all-time highs, as is overall unemployment. Also, national debt is unbelievably high and showing few signs of coming under control. Economic growth is only just showing positive signs, with some good prospects next year. But all this is not the point. If Syriza took over, political sentiment would overtake economic sense. Greece would resist the chains placed on it by its euro creditors, and if they were broken, the euro could well fall as a result.
Inside the paper, on page 4, was the more damning article for Greece. And this article was not even about Greece. It was titled, “Merkel seeks to allay party fiscal fears with U-turn“. The report explores a speech the German chancellor made at her party’s annual conference. Such events are always filled with rhetoric and crowd-pleasing passages. This was no different, but for the U-turn and to whom it applies. Just the previous day Merkel had said she had no plans to pander to her economically liberal critics within the party with a timetable for tax cuts. Yet during a 70-minute speech at the party conference in Munich the chancellor suggested that, should the nation remain true to its fiscally neutral balance, efforts to relieve the tax burden would be considered. Though not a firm commitment, it is very, very different from a plain and simple “nein”. The damning part for me, though, was a quote lifted from the speech: “We have stopped living on credit. We are thinking of our children and grandchildren.” This explains the thinking behind Merkel and Schäuble (the German finance minister) when they call on Greece to become fiscally responsible.
The math does not allow this to take place any time soon. And German defiance, rooted in financial competency, will not and cannot be compromised. In fact its resilience explains why the euro has survived this long. However, we now have a clear rock and a hard place. In fact the only way such math can add up is a choice between:
- Massive, ongoing, and generally debilitating wage reduction in Greece, leading to significant lowering of the standard of living, or
- Exit from the euro, and control of its own exchange rate to help cushion the slightly smaller reduction in standard of living still required
This is due to Greece living beyond its economic means for so long. And no nation has, as yet, figured out an easy way to cope with such a correction. Each and every time throughout history, default has occurred, or ejection from a currency union or peg, or revolution, or sometimes all three. The next six weeks, the time for Greece to work through its election, will be watched like a hawk.
Category: Euro Crisis European Union Eurozone Exchange Rates Germany Greece Tags:
by Andrew White | December 9, 2014 | 1 Comment
I blogged yesterday with “Which Matters Most – the Analytic or the Analysis?” and concluded that the analysis was more valuable than the analytic. Well, it so happens that I am here today to revise my perspective! I was sitting there (actually, standing at my standing desk) and beavering away at my inbox. Suddenly an email popped in with the headline, “Maersk Cancels North Asia to US route in favor of 2M Alliance“. Being a fan of global trade, and knowing something of the scale and importance of the shipping that takes place between Asia and the US, this headline stopped me in my tracks. I hot-footed over to the email, and clicked my way to the article. It turns out that the actual story title is a little alarmist. It seems Maersk is going to reduce its own capacity on this very important trade route and instead use 3rd party partners to fill the drop in capacity – apparently somewhere between 10 and 13% of the total between those two regions. 10% is a notable drop in capacity as a sole supplier; but overall global trade seems to be fine – this is just a change in the provider of the means of transport.
But there was a comment, hidden away in the bowels of the analysis. The worrying comment I spotted was this: “Maersk Line said the overall trans-Pacific trade and its TP5 service between North China, South Korea and Japan to the West Coast in particular had provided unprofitable results for nine out of the last 10 years.” So the analysis led me to assume “no real change” in how global trade is executed: I am less concerned with the change in ownership of the vessels moving the goods. But this data point is a point of contention for me: I am concerned that the routes might not be profitable. If they are not profitable such routes may not continue. Perhaps the 3rd party has larger ships that can accommodate the 10% capacity from Maersk, while keeping such lines profitable. I sure hope so. Any disruption to the key supply lines between Asia and the US will have big impacts on the health of the US economy. So I wanted to call out the value I found in an almost “aside” analytic or data point hidden in the body of the analysis. Interesting, perhaps.
Category: Global Trade Globalization Shipping and Logistics Transportation Tags:
by Andrew White | December 8, 2014 | 3 Comments
In Saturday’s US print edition of the Wall Street Journal there was an interesting article titled, “To Find Fraud, Just Do the Math“. The article reported on how powerful yet easy-to-apply analytics have been, or can be, used to spot financial fraud. According to Benford’s Law, 30.1% of numbers in a list of financial transactions should begin with ‘1’. Each subsequent digit should appear as the leading digit in a progressively smaller proportion, with ‘9’ as the beginning digit in under 5% of cases. The article shows a graphic of financial transactions from Enron that does suggest a breach of Benford’s Law in 2000.
The article presents a case study at a call center operation where operators were authorized to issue rebates up to $50. After some time and a lot of rebates, Benford’s Law was applied to the leading digits for all the refunds for each of the call center operators. The number ‘4’ turned up far in excess of its expectation. Analysis was done and fraud was uncovered.
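The kind of leading-digit check described in the call center case can be sketched in a few lines of Python. This is a minimal illustration of Benford's Law, not the tool the article describes; the function names are mine:

```python
import math
from collections import Counter

def benford_expected(d):
    # Benford's Law: P(leading digit = d) = log10(1 + 1/d)
    return math.log10(1 + 1 / d)

def leading_digit(x):
    # Strip any leading zeros and decimal point, e.g. 0.045 -> 4
    s = str(abs(x)).lstrip("0.")
    return int(s[0])

def leading_digit_frequencies(amounts):
    # Observed proportion of each leading digit 1..9 in a list of amounts
    counts = Counter(leading_digit(a) for a in amounts if a != 0)
    total = sum(counts.values())
    return {d: counts.get(d, 0) / total for d in range(1, 10)}

# Expected proportions under the Law:
print(round(benford_expected(1), 3))  # 0.301 -> ~30.1% should lead with '1'
print(round(benford_expected(9), 3))  # 0.046 -> under 5% should lead with '9'
```

Comparing `leading_digit_frequencies` of, say, all refunds per operator against `benford_expected` is the essence of the check: an operator whose ‘4’s (in the article’s case) turn up far in excess of the expected ~9.7% is worth a closer look.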
So this is an open and shut case then. Simply automate the analysis of leading digits in an array of numbers, and when you see a breach of the ‘Law, search for wrongdoing. Sounds all so simple. But what if I now know how the ‘Law works? Why can’t I create a simple program to generate a range of fraudulent numbers that ensures the overall proportions for each digit remain the same? Benford’s Law seems to work when applied to a source of numbers only when that source is ignorant of the analysis. Once I understand how I am being measured, surely I can change my behavior (for good or ill). So I would think that strategically the analysis is more important than the analytic. What do you think?
Category: Analytics Benford's Law Business Intelligence Information Analysis Tags: