John Rizzuto
Research VP
6 years at Gartner
10 years IT Industry

John Rizzuto enables investors and business strategists to take a holistic view of the software industry and its participants by leveraging the qualitative insights of the Gartner platform and linking them to quantitative measures of business performance. In previous roles, he evaluated software companies’ strategy, market position, and financial and business models as a financial analyst.

Microsoft Is Staying the Windows Course – Deal With It.

by John Rizzuto  |  April 23, 2013  |  3 Comments

The call is deafening – “Why doesn’t Microsoft reveal Windows 8 unit sales?” Windows 8 sales are tepid. We know this; we have seen many real data points confirming it. However, Microsoft didn’t build Windows 8 to save the PC industry; Microsoft built Windows 8 to enable the PC industry to transform itself. The financial markets, the press, bloggers and analysts have built a thinly veiled echo chamber that amplifies their voices and creates the perception that their chorus is widening. Many believe they will get Microsoft to change its strategy. Don’t bet on it. A new era of Windows is only in its infancy.

For the last 30 years the PC market has been evolving. Whether Compaq’s early suitcase-sized mobile computer or today’s Ultrabook, the core of these systems has stayed relatively constant. This extraordinary, evolutionary path has seemingly run its course. The world is demanding something different from computing devices, and Microsoft is responding. And so is Intel. These two stalwarts of the PC era are transforming themselves and aim to create something very different for the next 30 years. Microsoft’s plan is built on a vision, not a sound bite. Crazy? Maybe so, but I submit Microsoft is pretty good at this vision thing.

We know the Microsoft story – Bill Gates and Paul Allen originally made software that measured street traffic, which laid the foundation for DOS and then Windows, now powering billions of devices used by billions of people. It was the 1970s, before anyone knew what “high tech” was and before there was an endless stream of easy money for startups. The tech sector had only a handful of participants and traded at a steep discount to the S&P. This bit of trivia is worth highlighting, as I think it’s when Microsoft first developed the early markers of its DNA. Back then there was only one way to survive: take on industry giants and live in fear of their footsteps. It demanded continuous paranoia and building something ever greater than what came before. Microsoft was always willing to go where its customers wanted to go; I give little credence to the idea that Microsoft dictated where its customers went. It innovated, it executed. That is its DNA.

Microsoft has a history of “creative destruction”, building new software to replace its old software. Software never wears out, and for years Microsoft didn’t have a Software Assurance (SA) option (even today the majority of revenue is transactional). If Microsoft wanted a customer to buy something new, it had to build it and sell it to them. Microsoft has done this over and over for the last 30 years. The consistency to redefine itself time and time again does not come from chasing the flavor of the day. It demands strategy and focus, but most of all it has made the ability to change and adapt a core competency of Microsoft. However, there’s a downside – it makes the balance between staying focused on your mission and knowing when to adapt it critical.

The tablet and the smartphone are not the first things that the “Technorati” projected would make Microsoft irrelevant. Microsoft was always under threat. In the 80s it was Apple’s Lisa and IBM’s OS/2. Then Novell’s NetWare, directory services, free email, AOL, the Internet, Netscape, Java, Linux, VMware, virtualization, the cloud, advertising-financed software, open source software, and so on. Microsoft has consistently addressed threats, not always perfectly, not always successfully, but it certainly earned the label of the tech company of the 20th century. It often out-executed on others’ ideas. But, no matter how you look at it, Microsoft responded to the ever-changing environment it faced – aggressively and with conviction.

So, why are we so obsessed with Windows 8?   Microsoft, on its latest quarterly conference call with investors, revealed two things that stand out – its business outside of Windows is very healthy and Windows 8 is the operating system to drive the PC world into the future. The future is about touch and mobility and that is why Microsoft built Windows 8. Period. It’s going to stay the course, because its vision is well beyond these first couple of quarters.  Windows 8 is not something Microsoft intended to “throw against the wall and see if it stuck”.  Windows 8 is part of a transformation for the company and the ecosystem in which it participates. 

Microsoft is patient and it is focusing on where Windows is going to be in the coming years and beyond.  Ecosystems need this for confidence, stability and support and the PC industry needs this to reinvent itself.  Which brings up a couple of interesting questions: What is a tablet and what is a PC?   Maybe a tablet is just a PC in a different form factor, a different name.  I will not touch that debate.  But, in the eyes of Microsoft and even Intel, ARM Holdings, and investors, it really does not matter, as long as their technologies are the most successful in giving people what they want.

We don’t know Windows 8 sales volumes.  Does it matter? It will not change anything.  Microsoft’s Windows strategy is right there in front of us – love it or hate it; you can reach out and touch it.   It’s unlikely that we have heard the last on Windows and of Microsoft, and it’s quite possible that Microsoft’s best years lie ahead of it.  And, if you think transition and change are hard, try complacency. 

 

 


Category: Microsoft, PC, Uncategorized, Windows 8

Data Not Included – The Era of the Data Collector

by John Rizzuto  |  March 27, 2013  |  1 Comment

It’s all because of connectivity, don’t ya know? The “Internet of Things” is a simple concept – anything can be connected to the Internet. Anything. An embedded electronic gizmo, smaller than a fingernail, and boom, there it is – in your browser or app – that “thing”, transmitting all sorts of information. Trivial things, such as a reminder to water the flowers; and critical things, such as a jet engine signaling it’s statistically likely to fail on its next flight. All this information, all this new type of data, and the sophisticated analysis to make sense of it – it is real. But it will take many years to realize the revolutionary impact big data and big analytics will have on us. Importantly, however, we have passed the point of no return – the big data and big analytics craze, in all its hype, evangelical praise, and emphatic disdain, is secular and irreversible. Welcome to the “Era of Transformation”.

Up to now, technology was primarily about efficiency: driving costs out of the system through automation, increased speed, and replacing physical channels with digital ones. The Era of Transformation is something else. It’s about effectiveness. The Era of Transformation is likely to go on for a decade or more. It will transform our organizations – and lifestyles – in ways that cannot be imagined. How and why is it that technology makes us more effective only now? Today, software and systems have the ability to take millions upon millions of seemingly mutually exclusive data points (and, perhaps more importantly, the ability to gather them), run a myriad of algorithms against them, and discover relationships – cause and effect – answering not only what happened and why, but, ultimately, what will happen next and what to do about it. It is an intractable, if not impossible, problem for the human mind.

The prevailing distributed or client-server model was about delivering applications to users; the cloud model is a bit more about bringing data and applications together. Enterprise applications, primarily creators of data, will be accompanied by a tsunami of new enterprise applications that consume data. Inevitably this will break the current methods of distributing and leveraging information. In the current enterprise application model, the RDBMSs and the teams that support them are the “center” of the data universe. Analyzing data? Contact the database admin, work with her to create an interface, and she will provide a copy of the data you need – and off you go. Each connection is point to point, one data source to one data consumer, and it is either hand coded or engineered with third-party ETL tools. This is data integration.
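To make the point-to-point pattern concrete, here is a minimal, hypothetical sketch of one such hand-coded pipeline: one application database feeding one reporting consumer. The table and column names are illustrative, not drawn from any real system.

```python
# Hypothetical point-to-point ETL: one source, one consumer, hand coded.
import sqlite3

def extract_transform_load(source_db: str, warehouse_db: str) -> int:
    """Copy order records from an application DB into one reporting table."""
    src = sqlite3.connect(source_db)
    dst = sqlite3.connect(warehouse_db)
    dst.execute(
        "CREATE TABLE IF NOT EXISTS daily_orders "
        "(order_id INTEGER, region TEXT, amount_usd REAL)"
    )
    rows = src.execute("SELECT order_id, region, amount_cents FROM orders").fetchall()
    # Transform: normalize region codes and convert cents to dollars.
    cleaned = [(oid, (region or "UNKNOWN").upper(), cents / 100.0)
               for oid, region, cents in rows]
    dst.executemany("INSERT INTO daily_orders VALUES (?, ?, ?)", cleaned)
    dst.commit()
    src.close()
    dst.close()
    return len(cleaned)
```

Every additional consumer of that data means another script like this one, which is exactly the brittleness described below.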

When reflecting on these data silos, my colleague Ted Friedman expressed it well during his keynote at Gartner’s 2013 BI Summit: “First, we have to stop thinking about data as a byproduct”. And this is the real change that big data brings to the way we design, deploy and use applications: how we treat the data they create and how they get the data they require. There are countless analytical applications emerging to capitalize on data – applications for everything from digital marketing to studying diseases, among a slew of others. All these applications have one thing in common – data not included. The line of business will covet these applications; they will need to move fast and painlessly. Application users will demand a simple way to get the data sets they need, analyze them, preserve their findings, get new data sets, analyze those, and so on. However, much of the data these analytical applications will need comes from outside the organization. For example, just in the U.S., there are nearly 100 Federal Agencies with Statistical Programs, each publishing data that is accessible via the Internet. I look at this problem and think déjà vu. Years ago, point-to-point connections from application to application and their inherent brittleness broke the application integration model.
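“Data not included” means the application has to fetch its inputs at run time. Below is a hedged sketch of that step; the URL and column names are placeholders I made up, not a real federal endpoint.

```python
# Hypothetical fetch of an externally published data set (CSV over HTTP).
import csv
import io
import urllib.request

def fetch_external_dataset(url: str) -> list[dict]:
    """Download a published CSV data set and return it as a list of records."""
    with urllib.request.urlopen(url) as response:
        text = response.read().decode("utf-8")
    return list(csv.DictReader(io.StringIO(text)))

# Usage against an imaginary statistical-agency endpoint:
# records = fetch_external_dataset("https://data.example.gov/series/unemployment.csv")
# monthly = [r for r in records if r.get("frequency") == "monthly"]
```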

These days, the title “data analyst” is an oft-used term that rivals “big data”. The data analyst is the glory gal. She takes the reams of data at her disposal, uses her BI and analytics tools, and comes up with answers – and questions – that she would never have been able to find or known to ask. She’s the resident hero. I submit another role, the data aggregator, will rise in prominence and importance. The data aggregator does the heavy lifting to prepare the data so it’s ready for the data analyst. The data aggregator will gather data from this practically infinite set of sources, collect it, format it, assure its quality, and then make each source seamlessly available to many data consumers. It has to be repeatable, scalable, and done rapidly and often. It likely needs to be self-service for the business user. This steward will be required to provide internal, transactional, long-lived, short-term and real-time data. The tool he will need to realize this vision does not exist, but it will. And when it does, it will transform the ETL market such that it will be unrecognizable.
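As a rough sketch of what that aggregator role implies (many sources in, one normalized and quality-checked data set out, reusable by any number of consumers), consider the following. The source names, fields and quality rule are my own illustrative assumptions, not a description of any existing tool.

```python
# Hypothetical aggregator: pull from registered sources, normalize, quality-check.
from typing import Callable, Iterable

def aggregate(sources: dict[str, Callable[[], Iterable[dict]]]) -> list[dict]:
    """Combine records from every registered source into one curated data set."""
    combined: list[dict] = []
    for name, fetch in sources.items():
        for raw in fetch():
            record = {
                "source": name,
                "metric": str(raw.get("metric", "")).strip().lower(),
                "value": raw.get("value"),
            }
            # Simple quality gate: drop rows with no usable metric or value.
            if record["metric"] and record["value"] is not None:
                combined.append(record)
    return combined

# Many consumers can then reuse the same curated set, e.g.:
# curated = aggregate({"sensors": read_sensor_feed, "crm": read_crm_export})
```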

I recommend that our Gartner Invest clients read the following documents: Top 10 Technology Trends Impacting Information Infrastructure, 2013; Hadoop Is Not a Data Integration Solution; Data Integration Enables Information Capabilities for the 21st Century;   Emerging Role of the Data Scientist and the Art of Data Science; and The Future of Data Management for Analytics Is the Logical Data Warehouse.  These are only a few titles from our library on data and analytics.  Be sure to get on the inquiry calendars of any member of our Information Management Team, including Gartner Invest regulars:  Merv Adrian (big data, DBMS), Mark Beyer (big data, data warehousing), Roxane Edjlali (DBMS, data management), Donald Feinberg (DBMS, data warehousing) and Ted Friedman (data integration and data quality).

You will need a Gartner login to access documents mentioned.



Category: Uncategorized

Lack of Social Policy Will Bite You.

by John Rizzuto  |  March 23, 2013  |  Comments Off

Social policy is not just good for the company; it’s good for employees, partners and customers as well.

It might not have “gone viral” yet, but it’s only Saturday. Bloomberg Businessweek’s “Silicon Valley Still Showing Signs It’s a Boys Club” was posted earlier this morning, and the blogosphere and other news organizations have already seized on the story. And no, I will not espouse the moral, ethical or legal dilemmas camouflaged in the subtext of this article on the Gartner Blog Network.

Last Sunday, Adria Richards attended the PyCon conference in Santa Clara. During a presentation she tweeted a picture of two men sitting behind her, who she alleged were making inappropriate and sexual jokes during the presentation. The visceral reactions unleashed a barrage of death threats, praise, and claims of foul and fair play. No one wins. Ms. Richards was fired, to the chagrin of her supporters and the praise of her detractors. One of the men was fired, and both companies scrambled to explain their reasoning and the events that transpired via their corporate blogs (SendGrid and playhaven). Both CEOs, Jim Franklin of SendGrid and Andy Yang of playhaven, are soliciting comments directly via email at ceo@sendgrid.com and ceo@playhaven.com.

I find it ironic that the very media they are using to reach the public is the same media that triggered the need to address the events in the first place. More commentary will traverse cyberspace in the days ahead and, I suspect, while both CEOs may understand the magnitude of the situation, it is an unfortunate way to have to spend their time. I don’t know the stated social media policy of SendGrid, where Ms. Richards worked. If there was a social policy and she violated its terms, the “difficult situation” Mr. Franklin was forced to comment on may have taken another direction, but, at the very least, Ms. Richards may have been culpable for her actions. On the other hand, if there were a policy, Ms. Richards may have handled the situation differently and not been subject to an “after the fact” behavioral analysis, as Mr. Franklin described.

In his blog Mr. Franklin stated that he felt Ms. Richards could no longer fulfill her job responsibilities in the aftermath of this incident. I cannot judge whether that is true. Mr. Franklin also mentions his business is in danger as a result of the situation and that he has an obligation to the company’s employees, community and customers. He is right, but nonetheless, there are surely members of that community who disagree with Mr. Franklin’s actions. These dissenters may rethink their relationship with SendGrid. In reality, Mr. Franklin is settling for the less damaging alternative, hoping for minimal backlash from his actions.

I joined Twitter and the GBN years ago, but it is only recently that I started to blog and tweet. Because of the confusion and uncertainty in the early days regarding Gartner’s policy, I decided to wait it out. Since that time, Gartner’s policy has been hashed out, it is well stated with both rules and guidelines, and I feel comfortable with my responsibility when tweeting and blogging – whether via my “personal” media channels or those as a Gartner representative.

I feel strongly about corporate policy, visibility and ethics.  We spend an awfully large part of our lives working for our organizations and we are dependent on them.   Social media has exploded and it’s bigger than any company – it has even played a role in toppling some governments, while reinforcing support for others.   I and my livelihood are at risk if my company ignores the risks of not having a clear social strategy – because its reputation is at risk.  My colleagues and I are at risk if our organization is forced to “make it up as it goes” when investigating perceived employee misconduct via social media – because it’s reactionary and will be influenced by prevailing sentiments at the time of the offense.

Gartner has been emphasizing the importance of social policy and social strategy. My colleagues Carol Rozwell and Jenny Sussin have published on developing a strategy for responding to social media, Handling Social Media Issues Appropriately Requires Preplanning. Carol also published You Need an Enterprise Strategy for Social Business Initiatives and Use Gartner’s Social Business Program Maturity Model to Plan Your Next Move, which I recommend reading. Jenny, along with Ed Thompson, also wrote The Consequences of Fake Fans, ‘Likes’ and Reviews on Social Networks to help in brand management. And, finally, Jeffrey Mann wrote Take Four Initial Steps Toward a Social Media Policy to help with why you need a social policy and how to manage one. These are only a few examples of the wealth of content we have on social media, from security to enhancing customer experience to leveraging it with your “non-social” customers.

Social media is here and will affect every aspect of our businesses – and our personal lives. Enterprises cannot hide from it. In the end, whether protecting or promoting our partners, customers, brands, reputation, or employees, social media must be considered. If it is not already, it will inevitably be one of the most influential and visible channels we have. Ignore it at your own peril.

You will need a Gartner login to access documents mentioned.



Category: Social, Social Media, Social Policy

Adobe Talks “The Last Millisecond”

by John Rizzuto  |  March 7, 2013  |  2 Comments

Can a millisecond make a difference for a software platform?  If it’s a digital marketing platform, it may separate the winners and losers as SVP of Adobe’s digital marketing business, Brad Rencher, explains.

Adobe is aiming to build a millisecond of a lead on the competition. This millisecond lead, as Adobe would have it, may very well be insurmountable by competitors. It turns out that the millisecond is the one that occurs between the last piece of data a consumer “gives” a system and the content with which the system responds. What happens in that millisecond? The system needs to correlate, manipulate, measure and analyze all the various pockets of data it has on the consumer and then choose, assemble and display the relevant content to her. The content that these algorithms choose, and that the infrastructure renders, must create a positive user experience. The experience is designed to encourage the consumer to take a preferred action – in ecommerce that is usually to buy something. This is not big data, this is big aspiration.
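Stripped to a toy example (my own hypothetical sketch, not Adobe’s architecture), the work inside that last millisecond amounts to scoring candidate content against what is known about the visitor under a hard time budget:

```python
# Hypothetical real-time content selection under a latency budget.
import time

def choose_content(profile: dict, candidates: list[dict],
                   budget_ms: float = 1.0) -> dict:
    """Return the best-scoring candidate found before the time budget expires."""
    if not candidates:
        return {}
    deadline = time.perf_counter() + budget_ms / 1000.0
    best, best_score = candidates[0], float("-inf")
    for candidate in candidates:
        if time.perf_counter() > deadline:
            break  # Out of time: serve the best answer found so far.
        # Toy relevance score: overlap between visitor interests and content tags.
        score = len(set(profile.get("interests", [])) & set(candidate.get("tags", [])))
        if score > best_score:
            best, best_score = candidate, score
    return best
```

The hard part, of course, is that real scoring must reconcile every pocket of data the system holds, not a single list of tags, which is why the vision still outpaces what anyone can deliver today.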

Envision a lifecycle. The digital content lifecycle. It stretches from content creation to content management to content utilization. Adobe has a core offering in each: creative solutions to build rich digital content, a web content management platform, and web and information analytics to select the right content. Its goal is to bring these three pillars together in real time so the process is dynamic and orchestrated instantly.

This is not technology; it is magic. Adobe cannot do this today, no one can. However, it is a panacea that every major consumer-oriented company is pursuing – even if it remains theoretical. Adobe has great potential to fulfill this promise and surely the right stuff to make important inroads now. The key questions are how effective it will be and how long it will take to fulfill the vision.

At the Adobe Digital Marketing Summit, Adobe unveiled a streamlined digital marketing cloud. The Adobe Digital Marketing Cloud comprised 27 separate point products and has been rationalized to five solutions: Analytics, Social, Experience Management, Target, and Media Optimizer. Some features and components demonstrated are available today and some are expected to hit the market before year-end. Adobe has made important strides in making these solutions more intuitive and in integrating complementary functionality more seamlessly within them.

Of the many capabilities Adobe demonstrated, the one that interested me the most was the shared workspace between Adobe Creative Cloud and Adobe Marketing Cloud. The workspace enables marketers, agencies and content creators to collaborate, coordinate and build campaign process flows between the Adobe Creative Cloud and the Adobe Marketing Cloud, which share a common infrastructure platform. It turns a sequential and linear process into a more effective iterative one. It’s Adobe’s first step in creating synergies across the content lifecycle.

Synergies across all three of these segments will be critical for Adobe’s success. This market promises to heat up and the competition will be fierce. If Adobe can show that content creation, an area it dominates, is a key element and differentiator of the broader digital marketing platform, it will be difficult for competitors to respond. Adobe possesses some of the best-in-class technologies and capabilities in content management and analytics; hence, I continually seek examples of Adobe leveraging its position in the content creation market into its digital marketing business. This is going to be fun to watch. But don’t blink; in an instant it might be over.


Category: Uncategorized

Informatica: Poised To Accelerate?

by John Rizzuto  |  March 4, 2013  |  1 Comment

I paraphrase:

We are transforming Informatica from a category leader to an Industry leader – Sohaib Abbasi, CEO, Informatica

And so the theme was set for Informatica’s annual industry analyst summit, held February 26–27. Informatica’s 2012 was a year to forget. Informatica, which had been on a solid multi-year run of double-digit revenue growth, faltered (missing its own forecast) and posted only modest revenue growth while seeing a meaningful decline in license revenue. However, if you listen to the subtext of the conference, in 2012 Informatica hit an important inflection point. Over the last year, many executive decision makers joined Informatica, including its EVP of Worldwide Field Operations, SVP of WW Sales Specialists, VP of Latin America, Country Manager in China, SVP of WW Alliances, VP of WW Channels, and its CMO. Informatica is setting out to redefine itself and to put in motion an updated long-term plan.

Redefine can sometimes be a scary word. Informatica is evolving. Over the last several years it has made acquisitions and delivered new products to enter new markets. What I like about these markets is their synergy potential with each other and with Informatica’s core data integration and data quality roots. Informatica calls itself “The Data Integration Company”, but down the road it might be more apt to think of Informatica as a broader data management company.

Informatica, from a revenue and product perspective, is a leader and enjoys top spots in Gartner’s Magic Quadrants for Data Integration Tools and Data Quality Tools. These markets remain attractive and healthy. There has been continual angst regarding the competitive landscape in DI & DQ tools – particularly among financial analysts – but there are several things that could be better understood about these segments. My colleagues Merv Adrian and Ted Friedman wrote an excellent piece that addresses some of the concerns, “Hadoop Is Not a Data Integration Solution”. The reality is that ETL has changed and the requirements for solutions are accelerating. As such, there are likely four important things to realize about these markets:

  1. As complexity increases, IT organizations are under more pressure to move from hand coding to packaged solutions.
  2. The functional gap between the high end of the market and the low end has expanded.
  3. Point-to-point integration, or application DBMS to data warehouse DBMS, is being replaced by the much more complicated requirement of any data source to any data repository.
  4. The need for “real-time” information is growing.

All these bode well for Informatica and although the competitive landscape is intensifying, Informatica is poised to strengthen its position.

Thus far, Informatica has built itself on the DI & DQ opportunity, but the future it envisions relies on an expanded footprint. Informatica believes it is competing in markets that, in aggregate, are nearly $10 billion in size. It is not an unreasonable expectation. However, beyond its PowerCenter DI & DQ products, Informatica generates less than 15% of its revenue. One area, master data management (MDM), is expected to be a very large opportunity, and Informatica is a leader in one of Gartner’s MDM MQs. Informatica is plotting a different strategy to address this market. While most MDM solutions are domain-specific, covering, say, product or customer data, Informatica is building a multi-domain solution. Depending on how the market evolves, this can be an opportunity for the company – or a risk. Another area to highlight is Informatica’s cloud strategy. Informatica has a well-established data integration cloud product line and it is among the revenue leaders in the PaaS segment. In addition to MDM and cloud, Informatica has products in complex event processing (CEP), archiving, masking, and replication. All these markets are in their rapid growth phase. The next step is to bring them together.

Informatica has yet to realize, and has not discussed, synergies across these businesses, which is fair, as most solutions are typically bought and evaluated against their segment-centric brethren. For Informatica to become greater than the sum of its parts, it needs to establish cross-market synergies. At first glance, Informatica’s technologies appear complementary, and as markets mature, solutions will likely arise that rely on, and create value by combining, their specific capabilities. This will also invite fierce competition.

In any case, Informatica is excited about its prospects and I am too. The road is long and there are the inevitable potholes along the way but the journey has begun. It’s now that we can stop waiting for Informatica to become the company it wants to be and start watching for Informatica to be the company it wants to become.

My colleague Andrew White has a very good blog post that says more about the event.

In addition to Merv Adrian and Ted Friedman, I highly recommend inquiries with Mark Beyer and Roxane Edjlali to talk more about big data, data warehousing and data and information management.


Category: Uncategorized

One Company’s Quest to Make Software Beautiful

by John Rizzuto  |  February 20, 2013  |  1 Comment

Beautiful enterprise software?   A paradox.   Enterprise software is not supposed to be beautiful – bits and bytes are not beautiful, sexy or pleasing.  Need it be?  After all, technology, in order to do anything useful for us, should effect an action or change in our physical environment; if it achieves that objective why be beautiful?   And, practically speaking, we need to build a bridge from the analog world to the digital world.

The bridge, or user interface (UI), is how we humans interact with software. It’s a rather utilitarian process: we get the software to do what it was designed for based on the inputs we give it, and we make decisions based on its output. UIs are the least “valuable” part of the process; clearly what is important is the information we put in and what we do with the information that comes out. In computer science speak, we talk of “garbage in, garbage out” – this refers to the information itself, so who cares how that information gets in or out of the software? Alas, heretofore, most UIs were designed to make that process as painless and as fast as possible, but not beautiful.

Appreciating our very analog and very human affinity for things beautiful, Infor, the world’s third largest enterprise application software company, has a different take on what software can be. It can be beautiful. Infor is serious about this; its design studio, Hook & Loop, was purposely plopped into the heart of NYC’s very fashionable Chelsea district so its software designers are surrounded by – and recruited from – designers in art, fashion and theater. Hook & Loop determines nearly every design decision Infor makes, from company logos and marketing campaigns to the user interface and charts that are part of its software.

I have attended hundreds of management presentations and I can’t recall any CEO stressing efforts to make software “beautiful”. Infor defines beautiful software as being beyond “more usable” or “pretty”. Infor believes beautiful software is what drives an exceptional user experience. One question: if beautiful software drives a superior user experience, what is a superior user experience? We all readily recognize it when we feel it, but how do you design for it? It’s open to debate, but, for now, as far as Infor is concerned, that is for Hook & Loop to determine. I anxiously await the results.


Category: Uncategorized