ERP implementation benchmark: Comparing SAP, Oracle, and Microsoft

Summary: Recent research compares important dimensions of IT project success and failure on ERP systems from SAP, Oracle, and Microsoft Dynamics.

A recent survey (PDF download) by Panorama Consulting Solutions compares important dimensions of IT project success and failure with respect to ERP systems from SAP, Oracle, and Microsoft Dynamics.

Choosing the colors of ERP
Photo credit: "Choosing the colors of ERP" by Michael Krigsman

According to the report, the data set consists of more than 2,000 respondents from 61 countries who selected or implemented SAP, Oracle, or Microsoft Dynamics ERP solutions between February 2006 and May 2012.

Market share. As the diagram below shows, the report indicates that SAP holds the largest market share of the three vendors, at 22 percent. The gap between SAP and the other two vendors is considerable.

ERP market share

Selection rates. Given SAP's market share, it is no surprise the company often appears on procurement short lists, as indicated in this chart:

ERP short lists

Although SAP achieves the top spot on short lists, both Oracle and Microsoft are chosen more frequently than SAP. As the survey report states, after "assessing the available information, organizations are not easily convinced that SAP is the best option." It is possible that SAP's reputation for being expensive and complicated to implement deters potential buyers.

Implementation duration. According to the survey, Oracle projects show the largest gap between planned and actual implementation durations, as the graph shows:

ERP implementation time

In general, Microsoft has the shortest implementations, relative to Oracle and SAP. Overall, 61 percent of all implementations reported in the survey run late. This number is not surprising, as it is consistent with data from other research.

The study also lists the reasons that projects run late. As the table below shows, technical issues cause only a small percentage of delays. More commonly, projects are late due to organizational, business, and project management challenges.

ERP delay reasons

SUMMARY HIGHLIGHTS

The report presents the following overall summary information:

SAP

  • Largest share of the market
  • Highest short-listing rate
  • Lowest selection rate when short-listed
  • Longest payback period

Oracle

  • Highest selection rate when short-listed
  • Longest implementation duration
  • Largest delta between planned and actual implementation duration
  • Lowest percentage of users who realized between 81 and 100 percent of benefits

Microsoft Dynamics

  • Smallest share of the market
  • Lowest short-listing rate
  • Shortest implementation duration
  • Highest percentage of users who realized between 81 and 100 percent of benefits

CIO STRATEGY CONCLUSIONS

The survey data suggests that Microsoft Dynamics is doing something right, despite its low market share. However, it is likely that Dynamics projects tend to be smaller than those involving SAP or Oracle, which would explain the shorter durations and higher success rates.

Oracle comes off worst among the three vendors, with the lowest benefits realization and the largest gap between planned and actual project duration. SAP has the longest payback period, suggesting its projects may involve greater business process complexity than is typical of the other vendors.

Before making decisions based on this study, ERP buyers should be aware of an important deficiency in the survey reporting. Because the survey does not distinguish by company size, the results are likely an average across projects of all sizes; they might look significantly different broken down by project and company size. For this reason, the survey is interesting and even helpful, but ultimately less useful than it could be.

Topics: Enterprise Software, CXO

Talkback

  • Comparing Criteria

    The market share is indicated, but what about the value? As mentioned, breaking the analysis down by company size, number of users, and project value would extend it further.

    In addition, what about an indicator showing the time basis of these data, and whether any of the three companies is gaining or losing share?

    Thank you for the great article and special thanks to NAZMI who found it.
    h.meligy@...
  • Congrats!

    A really well-written article. I agree with the comments that splitting the results by project size may show a different result.

    As a Microsoft Dynamics AX reseller, the results in the article do not surprise me. Dynamics AX is an easier, less complex system than SAP (at this stage; Microsoft provided around 1,000 enhancements, additions, and changes in the latest version, and this velocity of product development will reportedly continue).

    I have two information pages on Dynamics AX if you would like to know more:

    Why choose Microsoft Dynamics AX?

    www.renown.com.au/MicrosoftDynamics/Dynamics_AX/Why_Choose_Microsoft_Dynamics_AX.aspx

    and

    An overview of Microsoft Dynamics AX, including a download of the latest Microsoft Dynamics AX Product Guide, found at:

    www.renown.com.au/MicrosoftDynamics/Dynamics_AX.aspx

    Hope you find that of interest.
    Michael
    RBSAU
    • Questionable Research = Unreliable Results

      I too am part of the MSFT Dynamics community and have worked with all four ERP solutions in the portfolio including Dynamics AX since its acquisition by MSFT. In that regard, I would love to accept some of the information being presented by Panorama in this latest paper as being totally factual and reliable - it makes for some great promotional content.

      However, I cannot do so as this study is based on questionable statistical methodology and so the results must be questioned as well.

      The information presented in this paper is apparently based on voluntary participation in a website-based survey hosted by Panorama, not on a structured random sample or a similarly sound statistical methodology. So how does one evaluate the viability of data from some set of voluntary respondents on a website? How do you eliminate bias in your outcomes based on why those respondents chose to participate in the first place (versus investing the same time in some other endeavor)? One could even ask how the respondents were drawn to the survey on the Panorama website in the first place (for instance, were they clients of Panorama at some point who were solicited to participate in some manner?).

      Panorama needs to do a better job of discussing their methodology, as well as revealing the statistical likelihood of error in their results, before I personally would accept any of their findings in this paper.

      Jack Moran
      Director of Manufacturing Solutions
      Sikich LLP
      www.sikich.com
      jemrunner
  • Size = what?

    How do you equate project size to the potential for fail/success? I don't get that. Where are your metrics?
    dahowlett
  • Calls for Transparency Appropriate

    Although some might think this post is “sour grapes”, I’ve been searching for an answer to how the “Clash of the Titans” numbers changed so dramatically from last year with only 200 more respondents.

    According to the 2011 “Clash of the Titans” report, Panorama interviewed 1,800 customers from 2005-2011 and concluded that the “actual” time it took to implement ORCL was 11 months. In the 2012 report, they said they looked at 2,000 customers (although in the live broadcast of the 2012 report they said 1,800) and the implementation time increased 7 months to 18.

    Further confusing the picture are the market share claims. Panorama claims to have looked at 324 ORCL customers in 2011 (18% of 1,800) and only 300 in 2012 (15% of 2,000).

    If the 2012 report were an annual study--which I doubt--these numbers would be possible. But even if it is, publishing reports with the same title and different criteria isn't appropriate.

    I’m not a statistics expert so if I’m missing something, please let me know. But, I think the calls for more transparency are essential if people are going to trust Panorama’s findings.

    I do work for Oracle but the opinions expressed in this post are entirely my own.
    coach74