Michael Blechar

A member of the Gartner Blog Network

Mike Blechar
VP Distinguished Analyst
17 years at Gartner
43 years IT industry

Michael Blechar is vice president and distinguished analyst in the Information Management Research area of Gartner's Research and Advisory Services. Mr. Blechar specializes in the area of metadata management/repositories, information and data services…

Metadata Management for Pace Layering

by Michael Blechar  |  March 3, 2012  |  1 Comment

Many organizations find themselves with an enterprise application strategy that no longer satisfies the needs and desires of the business community, and this often leads to poor decisions and bad investments. In response, Gartner has developed a “Pace-Layered Application Strategy” methodology for categorizing applications which, when coupled with metadata management best practices, enables a differentiated management and governance process that reflects how applications and related information assets are used and how quickly they change: precisely the information needed to make good business decisions and investments.

At the heart of this approach is the idea that solutions can be categorized into three “layers”: 1) foundational solutions that do not provide highly competitive value, 2) foundational solutions that provide competitive differentiation, and 3) “innovative” solutions that pilot new ideas which add further competitive value. In general, the first two categories (or “layers”) tend to have a large number of interdependencies across data, process/workflow, application and technology, while the latter may be relatively isolated from and independent of the other solutions, though some aspects of an innovation may be independent while other aspects are more tightly integrated with existing solutions.

For example, adding new mobile access from an iPad would introduce new technology and (perhaps) new application logic, but the network being used and the data being accessed might be heavily shared with foundational applications. Or, an innovation project may use existing technology and applications to access new social media data in the cloud.

The net of all this is that understanding the impact of innovations, or of changes to foundational applications, requires a good understanding of the interdependencies within and across each of the pace layers for all aspects of the solutions; in other words, the metadata about those solutions. Metadata understanding is what enables architects and business managers to decide which solutions to change, when to change them and which approaches best meet those needs and objectives.
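As a concrete illustration, here is a minimal sketch in Python of how pace-layer and dependency metadata might be captured and queried for impact analysis; the solution names, layer assignments and dependencies are purely hypothetical.

```python
# Minimal sketch: pace-layer and dependency metadata used for impact analysis.
# Solution names, layers and dependencies are hypothetical examples.

solutions = {
    "core_billing":    {"layer": "foundational",    "depends_on": ["customer_db"]},
    "customer_db":     {"layer": "foundational",    "depends_on": []},
    "loyalty_pricing": {"layer": "differentiating", "depends_on": ["customer_db", "core_billing"]},
    "ipad_mobile_app": {"layer": "innovative",      "depends_on": ["customer_db"]},
}

def impacted_by(changed, catalog):
    """Return every solution that directly or indirectly depends on `changed`."""
    impacted, frontier = set(), {changed}
    while frontier:
        frontier = {name for name, meta in catalog.items()
                    if set(meta["depends_on"]) & frontier and name not in impacted}
        impacted |= frontier
    return impacted

# A change to the shared customer data store ripples across all three layers.
print(impacted_by("customer_db", solutions))
```

Even this toy catalog shows why an innovation that looks isolated (the mobile app) can still be exposed to changes in a foundational, shared data store.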

Pace layering is a best practice for application strategies, but needs enterprise-wide metadata management as a complementary best practice to be successful.

Gartner clients who would like to read more about this topic should access the following research notes:

Accelerating Innovation by Adopting a Pace-Layered Application Strategy
Metadata Management for Pace Layering


Innovation Through the Use of Pattern-based Thinking

by Michael Blechar  |  January 31, 2012  |  Submit a Comment

Many of you may know that I happen to be a really good chess player (former NY City Open Junior Champion and US Air Force Chess Champion). To get good at the game, you must learn to identify repeatable patterns (e.g., “in this configuration of pieces, White has the possibility of generating a winning attack”). It frequently takes a couple of years of heavy study and play to become familiar with the numerous patterns. The better players will also be able to sense when a certain position is similar to a known pattern but, due to a slight difference in piece placement, the result will be quite different. And the great players have the ability to juggle all these patterns and differences, couple them with deep analysis of certain key alternatives in the “decision tree of possibilities”, and choose the right one.

It should not be surprising, then, that it took quite a while to create a world-class chess-playing computer. But in 1997, IBM’s Deep Blue finally won a six-game chess match against world champion Garry Kasparov. At first, it was thought that this meant the death of professional chess. What has happened instead is that leading grandmasters (and amateurs alike) have turned to desktop chess software to help them better understand where there are new, unexpected “variants to the patterns”, which has led to even more complex, dynamic and exciting games between humans playing face to face (or over the internet, with an agreement not to use computers during the game).

In February 2011, we saw the next generation of computers like Deep Blue when IBM’s Watson beat the two greatest reigning champions of the US game show Jeopardy! It accomplished that through capabilities like pattern recognition and knowledge stored in parallel processors. Watson is now being used by IBM at client beta sites to explore other possible commercial uses.

So, why am I telling you all this? We are entering an age in which the information explosion, coupled with ever-increasing computer speeds and parallel processing, is changing where organizations will need to invest in innovation for business improvement opportunities.

Innovation Through Pattern-based Thinking: With the explosion of new types of “big data” available from social and cloud sources, we are seeing emerging business improvement opportunities for new roles like the “data scientist”, enabled by computers that can see new patterns, and variants of known patterns, in this information and in the related information the organization already has available. The pendulum is swinging away from business process improvement through traditional application development and toward information-based business improvement through computer-aided analytics. This is an emerging trend which will be increasingly realized over the next three to five years and will probably explode into more general mainstream use thereafter.

Just thought you might want to know…..! :)

Gartner clients who would like to learn more about this topic should find the following two research notes of interest:

Analytics and Learning Technology: CIOs, CTOs Should Rethink Art of the Possible
Emerging Information Use Cases Challenge Traditional Information Environments


Metadata Repository Request for Information (RFI) and Ratings Toolkit

by Michael Blechar  |  November 24, 2011  |  Submit a Comment

In my last post, I mentioned a new research note I had just published (“Decision Framework for Evaluating Metadata Repositories*”). In that post I discussed a proven, successful approach for evaluating metadata repositories based on specific scopes of emphasis and levels of organizational maturity in terms of governance and metadata management practices. I have just released a companion piece of research, “Toolkit: Sample RFI and Vendor Rating Spreadsheet for Evaluating Metadata Repositories*”, which includes a spreadsheet for rating and ranking repositories using the decision framework, as well as a sample set of RFI questions for gathering the information needed to rate the solutions.

While I obviously cannot go into detail on the RFI here, I’d like to highlight three important things which repository selection teams should address before sending the RFI out to vendors. These will streamline the selection process, prevent lawsuits from vendors who feel they were treated unfairly, and head off later debates and arguments among team members about the rating criteria being used or which vendors should qualify as “finalists”.

Must Haves: Information managers will need to identify "must have" criteria which the vendor solution must support to be considered for selection. For example, a constraint may be that "metadata must be stored and managed in a relational database management system". These are useful for quickly eliminating vendors whose solutions do not meet the minimum requirements (and for letting them understand why). However, organizations need to be careful when coming up with their list of "must haves," since the highest rated — or best compromise — solution may get eliminated on the basis of criteria that are not truly must haves. In other words, these need to be only the "unconditional deal breakers".

Level of Criteria Detail: The scoring spreadsheet does not need to go down to the most detailed level of selection criteria. The impact percentage of each low-level detail in the decision framework (at the "leaf level" of the hierarchical rating tree) should be large enough to affect the overall rating/ranking of vendors; otherwise you end up with time-consuming evaluation of tiny criteria which have little effect on the total decision. In other words, given that the "must haves" will already eliminate the contenders that are clear misfits, there is generally no need to evaluate scoring differences between vendors on a criterion which has a total impact of 1% or less on the decision.

Minimum Rating: A suggested guideline is to establish, before sending out the RFI, a "minimum acceptable rating level." That might, for example, be 80%, meaning that if a vendor/repository scores anything less than this percentage, that solution will be considered unsuitable for use. This allows the removal of vendors whose scores fall short of "final consideration" for acquisition. It also, obviously, helps the selection team focus on the real candidates without debating whether a solution which falls short (but which some members like) should be a finalist or not.
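To make these three points concrete, here is a minimal sketch in Python of must-have filtering, weighted leaf-level criteria and a minimum acceptable rating; the criteria, weights and vendor scores are hypothetical illustrations, not the contents of the actual toolkit.

```python
# Minimal sketch: must-have filtering, weighted scoring and a minimum rating cutoff.
# Criteria, weights and vendor scores are hypothetical; they are not the toolkit's contents.

MUST_HAVES = {"stores_metadata_in_rdbms"}          # unconditional deal breakers
WEIGHTS = {                                        # leaf-level criteria, summing to 1.0
    "metamodel_extensibility": 0.30,
    "impact_analysis_reporting": 0.30,
    "vendor_viability_and_support": 0.25,
    "total_cost_of_ownership": 0.15,
}
MINIMUM_RATING = 0.80                              # agreed before the RFI goes out

vendors = {
    "Vendor A": {"must_haves": {"stores_metadata_in_rdbms"},
                 "scores": {"metamodel_extensibility": 0.9, "impact_analysis_reporting": 0.8,
                            "vendor_viability_and_support": 0.9, "total_cost_of_ownership": 0.7}},
    "Vendor B": {"must_haves": set(),              # fails a must-have, eliminated immediately
                 "scores": {"metamodel_extensibility": 1.0, "impact_analysis_reporting": 0.9,
                            "vendor_viability_and_support": 0.8, "total_cost_of_ownership": 0.9}},
}

def weighted_rating(scores):
    """Sum of criterion weight times the vendor's score for that criterion."""
    return sum(WEIGHTS[c] * scores.get(c, 0.0) for c in WEIGHTS)

for name, v in vendors.items():
    if not MUST_HAVES <= v["must_haves"]:
        print(f"{name}: eliminated (missing must-have)")
        continue
    rating = weighted_rating(v["scores"])
    verdict = "finalist" if rating >= MINIMUM_RATING else "below minimum rating"
    print(f"{name}: {rating:.0%} -> {verdict}")
```

In the toolkit itself the must-haves, criteria and weights would, of course, come from the decision framework and be agreed by the selection team before the RFI goes out.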

Obviously, Gartner clients who want to understand these points (and others I cannot cover in a blog) in more depth will want to see the toolkit's RFI questions, selection criteria and ratings spreadsheet.

*Available to Gartner clients or for a fee


Evaluating Metadata Repositories

by Michael Blechar  |  October 31, 2011  |  2 Comments

I recently published a research note called “Decision Framework for Evaluating Metadata Repositories*” which describes best practices for the process of rating and ranking repository solutions. Obviously, it goes into much greater detail than I can address in this blog. However, I’d like to highlight a few key points for my readers.

Most organizations will end up with their metadata stored in multiple, different technologies and places, under the control of different people in various roles, with some limited coordination of metadata across these sources (see my blog post “Metadata Management: Sources and Integration Impact Success and Failure” for more details). Therefore, organizations will need to evaluate the metadata management capabilities of multiple solutions – both individually and in concert.

The first and biggest mistake I see organizations make when doing metadata repository evaluations is not understanding the scope of their short-term and long-term metadata management needs. Related to this is not knowing the current and future needs of the roles they will be supporting with the repositories, and therefore outgrowing their initial selection too quickly to justify the acquisition or build expense and implementation effort.

The metadata stores/repositories of technologies you already own, and those which you could acquire, come in different sizes, at different costs and with different areas of focus. Until you know the scope you need supported – and which specific metadata artifacts, relationships and use cases you need to manage within that scope – you cannot create a valid set of criteria for evaluating your options.

Some organizations mistakenly try to select a “corporate repository solution” capable of managing any and all types of metadata when they are unable to fund or manage that scope of cost and effort. Or, conversely, they are so focused on one short-term need that they lose sight of how the solution for that need will coexist with the broader long-term metadata management plan – causing extra effort and cost later on.

Warning: A bigger issue (and beyond the scope of this blog) is how well your information governance practices currently support the scope of metadata management you plan to implement. Frequently, repositories only exacerbate the effects of a lack of sound information governance practices.

This issue – whether to focus on a best-of-breed short-term solution and worry about broader long-term needs later, or to use a (perhaps more expensive and complex) solution now which can be expanded to cover the broader future needs of the enterprise – is a fundamental decision which drastically affects the repository evaluation criteria.

The second mistake is to weight some detailed technical criteria too heavily, causing the exclusion of the best alternative. For example, it may be important to you that the repository support the Object Management Group’s Reusable Asset Specification (RAS) standard, or that your metadata assets be stored in a relational rather than a proprietary database. While it is good to include these types of criteria as requirements, if they are treated as “must haves” you may very well eliminate the solution which best fits your overall objectives, scope and needs.

Or, to say it a different way, your selection criteria need to balance technical requirements against other factors like vendor execution and vision, service and support, and total cost of ownership.

Net: Most organizations will find that their best metadata management alternative is to use the metadata management capabilities of the tools they already own, supplemented by the implementation of multiple best-of-breed repositories with some limited federation or consolidation across the technologies. This needs to be part of a broader enterprise-wide metadata management strategy/program which can help define the scope and needs for the selection/evaluation of the best-of-breed metadata repositories, and where and when metadata federation and consolidation should occur.

*Available to Gartner clients or for a fee


Metadata Management: Sources and Integration Impact Success and Failure

by Michael Blechar  |  October 1, 2011  |  1 Comment

While most organizations I talk to want one comprehensive metadata management solution, they generally understand that the issues surrounding an enterprise-wide set of shared metadata are beyond their scope, funding and governance abilities. Hence most metadata management efforts – especially initially – tend to focus on smaller-scoped domains like data warehousing, master data management or the publishing and management of software services in registries or repositories.

When it comes to selecting the most appropriate domain to “start on”, I find that organizations frequently do not consider which further “sub-scope” to go after, or the issues surrounding the sources and integration of metadata for that sub-scope. Let me give an example.

Let’s say that we have selected master data management (MDM) as our primary scope of metadata management to address. First, we need to know which subset of the domain is to be addressed first. Is it general customer or product metadata, or some other sub-domain of MDM? And is the goal to document and manage all the metadata for that sub-domain (customer contracts, customer orders, etc.) or just some subset of scope within that sub-domain (such as customer name and address metadata, to the exclusion of contracts and orders)?

While many (most?) organizations fail to consider this issue of scope, others are able to get to this level of detail when making metadata management strategy decisions. But I would suggest there are two layers beneath this which can cause the metadata management effort to fail, and which need to be considered.

The first is the sources of metadata. Metadata can come from a spectrum of sources (see “The Eight Common Sources of Metadata*”) ranging from well-defined and managed metadata residing in models and metadata repositories to what’s locked up in the heads of certain business users or IT personnel – or, even worse, it might not exist at all! So, before deciding on a metadata management effort for a given sub-domain like “customer orders”, it would be valuable to know whether the metadata about that domain exists, and how difficult in terms of time, cost and availability it would be to support that domain versus another.

The second, related issue is integration. When metadata resides in a machine-readable format along with a definition of that metadata, it is more understandable and (re)usable than when it does not. The more the metadata definition follows a standard which crosses technologies and domains, the more likely it is to be sharable. And the more there are pre-existing bridges, federation or consolidation across technologies for that metadata, the more sharable it becomes (see “Six Common Approaches to Metadata Federation and Consolidation*”).

You might know the sources of metadata, but the less that metadata can be shared across the technologies of the domain on which you are focused (e.g., across an MDM or data warehousing suite of tools), the less valuable it will be, and the more difficult it becomes to manage governance, risk and compliance, to coordinate change and release management, and so on.
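As a simple illustration of what consolidation into a shared, machine-readable format can look like, here is a minimal sketch in Python; the tool names, export fields and mapping rules are hypothetical examples, not a reference to any particular product.

```python
# Minimal sketch: consolidating metadata from two hypothetical tools into one shared format.
# Tool names, export field names and the mapping rules are illustrative assumptions only.

modeling_tool_export = [  # e.g., attributes exported by a data modeling tool
    {"entity": "CustomerOrder", "attribute": "order_id", "type": "INTEGER",
     "definition": "Unique order key"},
]
etl_tool_export = [       # e.g., column metadata exported by a data warehousing/ETL tool
    {"table": "CUST_ORDER", "column": "ORDER_ID", "datatype": "NUMBER(10)"},
]

def from_modeling_tool(rec):
    return {"asset": f'{rec["entity"]}.{rec["attribute"]}',
            "type": rec["type"], "definition": rec.get("definition", ""),
            "source": "modeling tool"}

def from_etl_tool(rec):
    return {"asset": f'{rec["table"]}.{rec["column"]}',
            "type": rec["datatype"], "definition": "",
            "source": "ETL tool"}

# The consolidated store: every tool's metadata mapped into one common shape.
consolidated = ([from_modeling_tool(r) for r in modeling_tool_export] +
                [from_etl_tool(r) for r in etl_tool_export])

for rec in consolidated:
    print(rec)
```

The point of the sketch is only that once both tools' exports are mapped into one common shape, the same customer-order metadata can be reported on, governed and change-managed together rather than in each tool's silo.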

Net: When deciding on your metadata management strategy you must, of course, be driven by business opportunities and threats. But to minimize the potential for failing to achieve your metadata management objectives – and to make a fully informed decision on how the metadata management effort is going to move forward – also factor into your decisions the availability and type of sources for the metadata, and how those sources can or cannot be integrated through federation and consolidation.

*Available to Gartner clients or for a fee


The Heart of Information Infrastructure is the Information Capabilities Framework

by Michael Blechar  |  October 1, 2011  |  4 Comments

To borrow a phrase from an old James Bond movie soundtrack by Wings, “But if this ever changin’ world in which we live in makes you give in and cry, say live and let die”. This can feel especially true for those in information management who are being bombarded with requests to support all kinds of structured and unstructured data, residing inside the organizational firewall or externally in the “cloud” or on the internet, and who need some way to be responsive while maintaining the integrity of the information for which they have management responsibility.

A key strategy is to implement an information infrastructure which is able to support different categories of “use cases” – such as online transaction processing applications, analytical applications and other hybrid forms of applications and workflows – using a core set of technologies and interfaces which can be shared across those use cases. To account for current and future needs, these information infrastructure capabilities ought to be provided as services which are application and technology agnostic.

We at Gartner recently released a conceptual “information capabilities framework” (ICF) to describe what is needed in an information infrastructure. It can be implemented in different ways and with different tools. There are six high-level “verbs” in the ICF, like “Describe” and “Govern”, which are broken out into more detailed verbs like “Model”, “Profile” and “Identify”. Each of the verbs represents a key aspect of the information infrastructure which must be supported by people, process and technology. At the heart of the ICF is a core set of metadata management capabilities needed to support the framework (the metadata management, too, can be implemented in different ways by different organizations).

Net: Information managers ought to make sure that they have such a conceptual framework and should critique their current tools, processes and service-level agreements against it, in support of the business and IT personnel who have use case needs for the data being managed. Obviously, information managers will want to address any “shortfalls” between the needed conceptual capabilities and their current people, process and technology implementations.
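As a rough illustration of using such a framework as a checklist, here is a minimal sketch in Python. Only the verbs named above are filled in (the remaining high-level verbs are left as a placeholder), and the coverage entries are hypothetical examples rather than an assessment of any real toolset.

```python
# Minimal sketch: treating the ICF verb hierarchy as a capability checklist.
# Only the verbs named in the post appear; "..." stands in for the rest of the
# framework, and the "covered_by" entries are hypothetical examples.

icf = {
    "Describe": {
        "Model":    {"covered_by": ["data modeling tool"]},
        "Profile":  {"covered_by": ["data quality tool"]},
        "Identify": {"covered_by": []},          # nothing recorded yet
    },
    "Govern": {},   # detailed verbs omitted in this sketch
    # ... the remaining high-level ICF verbs would be listed here
}

def shortfalls(framework):
    """List detailed verbs with no supporting people/process/technology recorded."""
    return [f"{verb}.{detail}"
            for verb, details in framework.items()
            for detail, meta in details.items()
            if not meta.get("covered_by")]

print(shortfalls(icf))   # e.g. ['Describe.Identify']
```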

Gartner clients who want more information about these topics should read:

Information Management in the 21st Century
The Information Capabilities Framework: An Aligned Vision for Information Infrastructure
Defining the Scope of Metadata Management for the Information Capabilities Framework
Information Management in the 21st Century Is About All Kinds of Semantics
How to Use (And Not Use) Gartner’s Information Capabilities Framework


Understanding the Data Architecture Requires Modeling Collaboration

by Michael Blechar  |  October 1, 2011  |  Submit a Comment

 

Just a quick post on something I’ve been focused on – the need to coordinate the modeling efforts of information management (IM) personnel with business analysts and developers to better understand the data architecture.

Historically, IM modelers have been documenting the data architecture using structured analysis/structured design (SA/SD) methods (like Information Engineering and Yourdon) implemented on the data modeling/database design tools which support them. However, business process management (BPM) and service-oriented architecture (SOA) methods and tools being used by business analysts and developers – which allow them to model data using different paradigms than IMers – are leading to a fragmented view and understanding of the data architecture.

What’s needed to adequately understand the data architecture is collaboration across modeling roles, methods and tools – either through synchronized model metadata, using federation across (or bridges between) the modeling tools, or through consolidation of the model metadata into a repository for common reporting.

Gartner clients can read more about these topics in the following:

Data Modeling and Data Architecture; A Required Strategy for Enterprise Information Architecture
From Traditional to Collaborative Forms of Modeling Data Architecture
Six Common Approaches to Metadata Federation and Consolidation


Solution Architecture Collaboration

by Michael Blechar  |  July 30, 2011  |  1 Comment

In past blogs you’ve heard me talk about the need for collaboration across business process management (BPM), service-oriented architecture (SOA) and master data management (MDM) initiatives. I’ve also discussed how most of these initiatives tend to be initially focused on a “far-less-than-enterprise” scope of effort – in many cases targeting for improvement one business process, application development project or domain area of data. And I’ve tried to drive home the point that each of these initiatives delivers less value to the organization on its own than when, through collaboration, the whole becomes greater than the sum of the parts.

But I do not want to oversimplify the transition from project-oriented BPM, SOA and MDM to broader enterprise-wide collaboration. Arguably, less than 5% of all organizations are doing BPM, SOA and MDM at the same time, and those who are doing all three are generally not doing them in collaboration to any great extent. Moreover, the maturity of the disciplines and technologies for each of these initiatives is at a different stage of mainstream readiness.

SOA is the most mature and well into mainstream use in IT organizations, although many are struggling to use it as an enterprise-wide discipline (especially in terms of agile development methods). BPM is at a similar place, being implemented on selected processes identified as having low-hanging-fruit value to the business, but cross-organizational or enterprise-wide BPM is in use in perhaps less than 5% of organizations. And, finally, MDM is even newer; while arguably the most important of the three for future new business opportunities and agility, it is perhaps 5-10 years away from being ready for adoption by most mainstream IT organizations. Moreover, MDM tool suites and solutions are primarily delivered in silos around domains of data like “customer” or “product”; the more integrated (enterprise-wide) MDM solutions are even less mature.

How then should cross-initiative collaboration be accomplished in light of these maturity issues and the desire for more enterprise-wide sharing of benefits to the organization?

IMHO, collaboration first and foremost has to start with enterprise-wide planning of solutions within an enterprise solution architecture team. This team is new to most organizations, but it is critical for identifying where collaboration between the initiatives makes the most sense – both now (opportunistically), based on immediate needs, and later (strategically), based on future plans and needs. This is the most pragmatic way to plan the transition from project-oriented BPM, SOA and MDM to a more enterprise-wide unification of these initiatives.

Net: Staffing the enterprise solution architecture team – whether with enterprise architects experienced in planning the future solution architecture (the best practice) or with subject-matter-expert analysts who simply talk to one another about their project plans (still better than having uncoordinated initiatives) – is the key to optimizing the return on investment of the BPM, SOA and MDM initiatives.


Trouble Getting Agreement to Share Processes and Applications? Focus on Sharing Data!

by Michael Blechar  |  June 30, 2011  |  Submit a Comment

Frequently the politics and culture of organizations lead to autonomous business units wanting sole control and use of their processes and applications (generally with some limited bridging of data across the organizational silos). This, of course, leads to increased time and expense in integrating and managing the siloed processes, applications and data – inflating IT time and costs. Therefore, seeking to create more shared solutions generally makes good business sense.

I spent the better part of last week at Gartner’s Enterprise Architecture Summit in San Diego as Gartner’s resident Solution Architect answering questions about how to overcome the reluctance of business units to share processes and applications from the perspective of business process management (BPM) and service-oriented architecture (SOA). Interestingly, the enterprise architects in attendance were focused solely on the technology and business architecture sharing issues to the exclusion of the information architecture. Therefore, most were surprised when I recommended a greater focus on master data management (MDM) as the solution to their problems.

Historically, organizations have been more apt to share data than applications. There are many reasons for this, including the need for consistent and high quality customer, product and supply chain data across the enterprise. More recently, we have also seen other motivators emerge, including government legislation regarding the management and reporting of the privacy of data – leading to organizations creating roles charged with governance, risk and compliance (GRC) responsibilities and authority regarding information assets.

This, in turn, is leading to the funding of MDM projects to identify key data and improve the quality and management practices related to that information – including federating a single version of the “truth” from disparate, often redundant and inconsistent, sources of data. When this happens, the processes and applications which use the old redundant and inconsistent data are generally superseded by new shared services that access the single version of the truth, with the older versions of the processes and applications phased out either immediately or over time.
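To illustrate what “federating a single version of the truth” can mean in practice, here is a minimal sketch in Python; the source systems, fields and the survivorship rule (most recent value wins) are hypothetical simplifications of what a real MDM hub would do.

```python
# Minimal sketch: building a "single version of the truth" customer record from
# redundant, inconsistent sources. Source systems, fields and the survivorship
# rule below are hypothetical examples only.

crm_record     = {"customer_id": "C-100", "name": "ACME Corp.",
                  "address": "12 Main St", "updated": "2011-05-01"}
billing_record = {"customer_id": "C-100", "name": "Acme Corporation",
                  "address": "12 Main Street", "updated": "2011-06-15"}

def golden_record(records):
    """Trivial survivorship rule: for each field, the most recently updated value wins."""
    merged = {}
    for rec in sorted(records, key=lambda r: r["updated"]):   # oldest first, newest wins
        merged.update(rec)
    return merged

master = golden_record([crm_record, billing_record])
print(master)
```

Shared services would then read and update this mastered record, rather than each silo keeping its own conflicting copy.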

Net: If your organization has siloed processes and applications and there is a reluctance to break those silos into more shared process and software services, consider using MDM projects in collaboration with GRC personnel to create shared data and then evolve the processes and applications from their silos into using shared services around the shared data.


Are BPM, SOA and MDM Creating New Silos Instead of Breaking Them Down?

by Michael Blechar  |  May 28, 2011  |  2 Comments

I recently had the opportunity to co-author with Ross Altman and Andrew White a “trilogy” of newly published research notes (see below) on the value of collaboration between business process management (BPM), service-oriented architecture (SOA) and master data management (MDM) initiatives. All three initiatives have at their heart a belief that collaboration, and the reuse of applications and data in the form of services that can be shared across workflows, lowers costs and brings added agility to the business and IT.

So why is it, then, that those focused on these three complementary initiatives appear to be building walls around their areas of expertise and, in essence, creating application (SOA), process (BPM) and data (MDM) silos to the exclusion of each other and at the expense of the enterprise?

It is much like a group of doctors who each specialize in a different area of medicine and each claim that there is nothing wrong with the patient – only to have the patient die because no one had a broad enough view of the illness to see that the patient was in critical condition overall.

I’m seeing the idea of “reusable services” being abused to mean that the “specialists” in SOA, BPM and MDM can put a black-box set of services around their individual domains without real concern for the specific use case needs of anyone else. Part of this, I’m sure, has to do with the fact that both the business and IT are becoming increasingly complex and demanding, leaving less time to understand and address collaboration issues.

But this lack of collaboration should not be tolerated in organizations. A failure to ensure collaboration across SOA, BPM and MDM is an error in judgment and cheats the organization out of the added value that is possible when these initiatives are done in a complementary manner.

To ensure that this collaboration becomes a reality, organizations need to address two major roles which are especially important to success – one at the enterprise solution architecture planning level, where cross-architecture and cross-project collaboration occurs, and another at the project level, where the solution architecture design of assets reusable across projects from an SOA, BPM and MDM perspective occurs.

You can read more about these roles in this earlier blog post:

http://blogs.gartner.com/michael_blechar/2010/12/08/role-of-the-application-architect/

Newly published research*

Why You Should Coordinate Your MDM, SOA and BPM Initiatives
Why Application and Business Architects and Analysts Should Care About MDM
MDM, SOA and BPM: Alphabet Soup or a Toolkit to Address Critical Data Management Issues?

*Available to Gartner clients or for a fee
