Sage Data Cloud, new UI and other re-tooling progress

Summary: User interface work is consuming the R&D budgets of ERP vendors everywhere. Products designed for the client-server or Web 1.0 world really need re-developing. Infor has gotten out front, but efforts by Sage, SAP and SumTotal are right behind them.

Imagine you run R&D for a major software firm. Your company has grown massively over the last couple of decades through acquisitions of competing and complementary product lines. You now have to support and enhance dozens of product lines, product architectures and more. Unfortunately, pure-play competitors are introducing all-new applications at a dizzying clip on all-new cloud platforms. They're also introducing new social, analytic and mobile capabilities as part of their modern applications.

Many of the new competitors have only one product line, one technology stack (i.e., a cloud multi-tenant stack), one version of their products and so on. Of course they can move fast. The old-school vendor with lots of product lines can't move that quickly, and its customers and channel partners are not taking the delays well. So, what can it do?

I’ve seen four software firms attack this problem recently. SumTotal took a bold approach and re-architected almost all of their products under a single, new platform. They now have an architectural flexibility that lets them add new functionality very quickly. They can also re-platform and integrate an acquired product very fast now, too. Their reward is that they can now move at the market’s speed, remain competitive and innovate at a pace that customers demand.

Infor has one of the largest product lines of any application software vendor worldwide. They, too, felt these same market pressures and acted to modernize their product lines. One of the hallmarks of their effort was to dramatically improve the user experience (UX) of all of their product lines. In doing so, they also created a cloud analytics database that serves up and stores both Infor transactional data as well as data sets of varying sizes from all kinds of big and small data sources.  To date, Infor has created a solution that breathes a lot of new life into its long-time product lines.

Sage North America, part of The Sage Group plc, is another of these long-lived, large-portfolio firms with a lot of code to maintain and enhance. Sage has an incredible number of customers (over 6 million worldwide) and channel partners eager to see new cloud, mobile and other enhancements from it.

Sage took a couple of steps to kick-start the transition. The first was a culling of the product line. The company recently sold off its ACT! and SalesLogix products (to Swiftpage) and its non-profit solutions (to Accel-KKR). Fewer products meant scarce R&D dollars were spread across fewer product lines. Second, the company created standalone, cloud-only applications (e.g., Sage Payment Solutions) that work with most of its on-premises solutions. These bolt-on applications extend the functionality of existing solutions; they are not re-writes of the existing core apps.

Third, the company created a Sage cloud data space (called, obviously, the Sage Data Cloud). Customers get access to a 50 GB (or bigger, if needed) cloud storage area. Information from Sage cloud applications (like the standalone ones mentioned above) lands here, and copies of data stored in on-premises application servers are replicated into this cloud, too. As a result, mobile applications can access core application data and cloud application data via one cloud server.
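To make that flow concrete, here is a minimal Python sketch of the pattern as I read it; the names (CloudDataStore, replicate_from_on_premises) are hypothetical, not Sage APIs. On-premises rows are pushed into a single cloud store, and a mobile client reads only from that store.

from dataclasses import dataclass, field

@dataclass
class CloudDataStore:
    """Stand-in for the Sage Data Cloud storage area (e.g., a 50 GB allotment)."""
    records: dict = field(default_factory=dict)

    def upsert(self, source: str, key: str, payload: dict) -> None:
        # Replicated copies are tagged with the system they came from.
        self.records[(source, key)] = payload

    def read(self, source: str, key: str):
        return self.records.get((source, key))

def replicate_from_on_premises(store: CloudDataStore, rows: list) -> None:
    """Copy on-premises application rows into the cloud store (one-way push)."""
    for row in rows:
        store.upsert(source="on_premises_erp", key=row["id"], payload=row)

# A mobile app reads everything through the single cloud store:
store = CloudDataStore()
replicate_from_on_premises(store, [{"id": "CUST-001", "name": "Acme Ltd."}])
print(store.read("on_premises_erp", "CUST-001"))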

[Image: Sage cloud strategy diagram - courtesy of Sage Group plc, used with permission]

This cloud data space shares some ideological similarities with what Infor has done. Infor places core ERP data in its cloud and has all user interaction hit the cloud data store. Whether users are running queries, completing transactions or crunching internal data against a third-party data source, it all occurs in the Infor cloud data store. The result of this architecture is that data and results are served up fast regardless of the user's computing device (desktop PC, cell phone, tablet, etc.). It is my understanding that their cloud data store is their primary data store. Data may also get written back to an underlying on-premises application's relational database or to a persistent storage device. But it is the cloud data store that is Infor's one source of truth.

From what I understood of Sage's approach, the one source of truth depends on what kind of application controls the transaction. If the information originates in and is processed by an on-premises application, then the on-premises relational database is the primary store and a copy of the data is replicated to the cloud service. If the data originates in one of the newer all-cloud bolt-on applications, then the cloud server is the primary store.
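A small, hedged sketch of that routing rule, as I understand it (my own naming, not Sage's code): the system of record is chosen by where the transaction originates.

from enum import Enum, auto

class Origin(Enum):
    ON_PREMISES_APP = auto()   # classic installed ERP application
    CLOUD_BOLT_ON = auto()     # newer all-cloud bolt-on application

def primary_and_replica(origin: Origin):
    """Return (system of record, where the copy goes) for a transaction."""
    if origin is Origin.ON_PREMISES_APP:
        return ("on-premises relational database", "replicated copy in the Sage Data Cloud")
    return ("Sage Data Cloud", "no on-premises copy required")

print(primary_and_replica(Origin.ON_PREMISES_APP))
print(primary_and_replica(Origin.CLOUD_BOLT_ON))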

[Photo: Sage North America CTO Himanshu Palsule - copyright 2013 TechVentive, Inc., all rights reserved]

I followed up with Sage's CTO, Himanshu Palsule, to see if this was correct. He stated that the Sage Data Cloud stores data in three distinct categories: Reference Data, Transactional Data and Cloud Only Data. Reference Data is the data needed to extend the reach of the on-premises products to mobile devices and browser-based persona use cases. This is data such as Customers, Addresses, Contacts, Inventory Items, Open Invoices, etc. I view much of this as reasonably static table and/or master file information. He added that this data is indeed owned by the ERP system (on-premises or cloud-hosted, such as Sage 300 in Azure) and is replicated up to the Sage Data Cloud in a common format regardless of the ERP product (Sage 50, 100, 300, etc.). This data is synchronized on an ongoing basis.

Transactional Data includes transactions that originate in the mobile and browser-based applications and are then processed using the business rules of the ERP system. These transactions, such as Quotes, Orders, Invoices, Payments, etc., are owned by the Sage Data Cloud, but are processed and injected into the standard on-premises ERP workflows. What Sage is effectively providing is a choice for its customers to continue to leverage their investment in their business processes, training, customization, etc., while extending the reach of their applications to mobile users.

Cloud Only Data is data needed to augment the on-premises reference data to provide a rich user experience, for example the ability to maintain related inventory items for upsell opportunities in Mobile Sales. This data is maintained in a browser-based UI and creates a consistent experience for users even as they upgrade their on-premises system from, say, Sage 50 to Sage 300. Historically, Sage would have added these capabilities to each of its on-premises ERP systems, using native SDKs and differing user experiences. Now, new features will be implemented only once yet serve multiple products.
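Pulling the three categories together, here is an illustrative sketch of how the ownership and processing rules Himanshu describes could be modelled. The class and field names are my assumptions, not Sage's schema.

from dataclasses import dataclass
from enum import Enum, auto

class Category(Enum):
    REFERENCE = auto()      # owned by the ERP, replicated up, kept in sync
    TRANSACTIONAL = auto()  # owned by the cloud, injected into ERP workflows
    CLOUD_ONLY = auto()     # owned by the cloud, never written back to the ERP

@dataclass
class Record:
    category: Category
    entity: str     # e.g. "Customer", "Quote", "RelatedItems"
    payload: dict

def book_of_record(record: Record) -> str:
    """Who owns this item?"""
    if record.category is Category.REFERENCE:
        return "ERP system (on-premises or cloud hosted)"
    return "Sage Data Cloud"

def processed_by_erp_rules(record: Record) -> bool:
    """Is the item pushed through the on-premises ERP business rules?"""
    return record.category is Category.TRANSACTIONAL

quote = Record(Category.TRANSACTIONAL, "Quote", {"customer": "CUST-001"})
print(book_of_record(quote), processed_by_erp_rules(quote))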

When vendors create two or more places where data may be stored, users should be alert to potential issues with latency, synchronization and disaster recovery. Latency is an issue when an application first updates one data store (e.g., an on-premises relational database) but a user whose data is served up from a different data store (e.g., a cloud reporting server) won't see the new or updated data for a few seconds or minutes.

According to Himanshu, “Data that is synchronized (Reference Data) is not for a cloud reporting server, where differences in on-premises data and cloud data can give different results.  Rather our use cases are for potentially a mobile sales rep, attempting to create a quote for a deleted Customer or Inventory Item that may not have been synchronized in the cloud from on-premises system.  Because we utilize the on-premises ERP business rules to process the transactions created by the mobile applications, these conditions will be detected and reported back as an error that the customer is actually deleted, preventing bad data from going into the ERP system.  We currently do not offer a cloud reporting server. However that type of data is typically updated after a day-end processing (Sales Journal Update) function has been performed in the on-premises system. This update will then load the data to the cloud in an incremental manner.  We will utilize techniques to ensure that the “in-process” data that is being uploaded will not be visible to the cloud reporting server until the entire set has been uploaded.”  

Disaster recovery gets trickier when one of the data stores experiences an outage or failure. How will the problem get detected across all data stores, and how will the data stores be brought back to a complete and mirrored view of each other? In Sage's case, there are two data stores: on-premises and cloud. Himanshu adds: "We utilize vector clock tick counts to keep data in sync. The cloud and ERP system keep track of this tick count. A restore from backup of either the cloud or on-premises data will trigger a refresh up to the tick count to get the reference data back in sync."
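Here is a deliberately simplified sketch of how a tick-count scheme of that general shape can decide what to replay after a restore. It illustrates the idea only; it is not Sage's implementation, and the names are invented.

def changes_to_resync(change_log, last_seen_tick):
    """Return every change stamped with a tick newer than the restored side's counter."""
    return [change for tick, change in sorted(change_log.items()) if tick > last_seen_tick]

# The cloud side stamps each reference-data change with a monotonically increasing tick.
cloud_change_log = {
    101: {"entity": "Customer", "id": "CUST-001", "op": "update"},
    102: {"entity": "Invoice", "id": "INV-950", "op": "insert"},
    103: {"entity": "Customer", "id": "CUST-002", "op": "insert"},
}

# After restoring an on-premises backup, that side's counter is back at 101,
# so ticks 102 and 103 must be replayed to bring the reference data back in sync.
print(changes_to_resync(cloud_change_log, last_seen_tick=101))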

Further, how do customers stop, for example, a cloud application while the persistent storage or on-premises main storage media are being reset? Himanshu responded: "The Sage Data Cloud Connector runs as a Windows service on the on-premises server. This service can be stopped by the customer locally in the event that system admin functions such as Fiscal Period Close or something like restoring data from a backup is performed. Once complete, the service can be restarted and synchronization will resume. When the service is down, the mobile and web users can continue their work, and any transactions that need to be processed by the ERP system will be queued in the Sage Data Cloud until the connector comes back on-line."
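And a rough sketch of that queue-until-online behaviour. This is illustrative only; the real connector is a Windows service, and the class and method names here are invented.

from collections import deque

class SageDataCloudQueue:
    """Holds mobile/web transactions while the on-premises connector service is stopped."""

    def __init__(self):
        self.pending = deque()
        self.connector_online = False

    def submit(self, transaction):
        # Mobile and web users keep working; nothing is lost while the connector is down.
        self.pending.append(transaction)
        self.drain()

    def drain(self):
        # Once the connector service is restarted, queued work is replayed
        # through the on-premises ERP business rules in arrival order.
        while self.connector_online and self.pending:
            transaction = self.pending.popleft()
            print(f"Handing {transaction} to the on-premises ERP workflow")

queue = SageDataCloudQueue()
queue.submit({"type": "Order", "customer": "CUST-001"})   # queued: connector is stopped
queue.connector_online = True
queue.drain()                                             # replayed on restart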

Synchronization is an issue when other applications (e.g., production scheduling) that depend on this application's (e.g., cost accounting) data are accessing one version of the data (e.g., via an on-premises RDBMS) while users see a different dataset (e.g., the redundant cloud store). Small differences in data values can trigger unexpected results that can be hard to explain or understand.

[Graphic: Sage cloud strategy - courtesy of Sage Group plc, used with permission]

Going forward, I’d like to see Sage do more of the following:

1) Make the Sage Data Cloud more of a destination for non-ERP data. 50 GB is not a big data space. It will likely hold most Sage customers' ERP data and some additional data, but it is not in the realm of true big data sets. When you're comparing ERP, HR or other data against social sentiment information, POS transaction data from big retailers, etc., you need to work with large data sets that often run to a terabyte or a petabyte.

2) Significantly expand the analytic power and application breadth of this cloud service. Provide more pre-built analytic modules that use data resident within the Sage Data Cloud. Sage will need to develop applications, design more connectors to third-party data sources and expand its channel partners to include more third-party data providers. Himanshu tells me that "As we expand the data that we store in the cloud, this type of functionality will be added to the roadmap. Our partnership with Microsoft and use of the Azure platform will allow us to leverage the ability to build up and tear down computing clusters for big data analytics."

3) Add an in-memory database capability to the Sage Data Cloud. Speed, large data volumes, etc. will compel Sage's competitors to offer this with their cloud data stores. The typical SMB customer of Sage is going to expect more from the company's hosted or pure-cloud solutions, and this area is going to be one of those expectations. I have learned that the Sage Data Cloud is being architected with different data stores, such as relational, key/value, document, in-memory caches, etc., and that an in-memory database will be used where appropriate (a minimal sketch of this kind of tiered read path appears after this list).

4) Have the analytic applications drive a new user experience and workflow. When the analytic applications detect anomalous behavior, they should utilize a workflow engine to route the findings, suggest possible actions and guide the most appropriate user to a solid business outcome. Sage is already offering an early example of this kind of capability via the Sage Inventory Advisor. Hopefully, Sage will provide more of this type of functionality moving forward. Alerts and approvals should be prominent as well.

5) Create one and only one book of record for all of the applications. This would require changes to the various Sage ERP solutions. Himanshu and I may differ on this point. From his perspective, there is a reason to have many products serving different customer bases, with each product solving unique business problems. I believe some common functions (like accounting, payroll, etc.) could share a common data model that serves many product lines simultaneously. This would permit the development of common objects, mobile apps, etc. that painlessly connect to and serve multiple ERP products. However, I do agree with Himanshu's point when it comes to different vertical applications.
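As noted in point 3 above, here is a minimal sketch of the kind of tiered read path that a mix of in-memory caches and persistent stores implies: check memory first, fall back to the persistent store, and warm the cache on a miss. The names are illustrative, not Sage's.

class TieredStore:
    """Illustrative mix of an in-memory cache in front of a persistent store."""

    def __init__(self):
        self.memory_cache = {}   # fast, volatile
        self.persistent = {}     # stand-in for a relational/document/key-value store

    def write(self, key, value):
        self.persistent[key] = value
        self.memory_cache[key] = value   # keep hot data in memory

    def read(self, key):
        if key in self.memory_cache:     # served from memory when possible
            return self.memory_cache[key]
        value = self.persistent.get(key)
        if value is not None:
            self.memory_cache[key] = value   # warm the cache for next time
        return value

store = TieredStore()
store.write("ITEM-42", {"description": "Widget", "on_hand": 120})
print(store.read("ITEM-42"))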

My next post will cover SAP’s UI re-development approach.

About

Brian is currently CEO of TechVentive, a strategy consultancy serving technology providers and other firms. He is also a research analyst with Vital Analysis.

Talkback

  • Farmers vs Hunters - It makes a difference.

    Interesting post, Brian, that got me thinking.

    There is a lot of stuff in here, so I think the article might be more digestible if it were divided into two.

    One describing the response of traditional ERP vendors to cloud-based newcomers and the other describing the Sage Data Cloud and the related concerns.

    Starting with the challenge facing traditional players in responding to new entrants.

    Your contention is that traditional R&D heads have to deal with multiple products and platforms, which limits the resources they can apply to each offering (hence how effectively the company can respond to the competition), while the newer players have limited product offerings and only one platform, which gives them the benefit of focus.

    I would argue this is not the problem, just a red herring used by incumbents as an excuse.

    Usually each product line has a consistent set of technologies, drives unique revenue streams, and should make its own profit. In short, each product line is a business and should be run as such. So there is no argument to be made about not having enough R&D resources because they are “too thinly spread”. Every R&D VP sings the same tune no matter whether he/she is in a startup or an established company. There are never enough engineers to go around, just as there are never enough marketing heads or dollars, sales heads, etc.

    In the case of Sage, like many other companies before it, each product was a market leader at some point. They were eventually overtaken. Not because of limited R&D resources but due to problems in execution. It did not matter that the competitor was big like Intuit, or a startup like Netsuite or Intacct. The competitor just did the right things and did them better.

    So I think you would agree that responding to a competitor is not about responding to their R&D but responding to them as a whole business and more importantly responding to the market. R&D is just a small part.

    In the case of Sage, the response developed by CEO Sue Swenson and her staff of CTO Motasim Najeeb, CFO Marc Lupe, Mid-Market Business Head Lori Schultz, Lori Seal and others was three-pronged.

    First, act like a SaaS company and reduce the friction associated with buying, implementing, using, supporting and upgrading your products. Auto updates, subscription pricing, etc. came from that.

    Second, extend existing products to deliver greater value to customers via cloud and mobile. The products you saw from Summit and others came from that effort. The Sage Data Cloud came from recognizing that a common infrastructure to enable these Connected Services made sense.

    Third, build/acquire new SaaS mobile offerings to modernize the portfolio. The SageOne product came from that initiative. Acquisition of a cloud offering for the mid-market was also looked at, but that effort died after Sue Swenson left, with X3 being nominated as the upper mid-market offering from Sage.

    The arrival of Pascal Houillon as CEO in 2011 added a 4th leg to this strategy, which was to rebrand, I expect in order to lower average Customer Acquisition Cost and to drive up Sales per Customer through better cross-sell.

    Pascal was also able to unload the NPS and ACT/SalesLogix offerings, though I am not sure it makes any difference at all to what happens moving forward, other than that they are two fewer businesses the executive team has to worry about.

    My contention is that Sage’s original 3-legged strategy was as good a strategy as anyone else’s. It was a place to start. Over time you learn and these things get refined.

    The challenge, as always, has to do with fitting the kind of people you need to what needs to be done.

    Time after time you find nimble teams overtaking market leaders and making them irrelevant in a few short years. This is well documented by Clayton Christensen.
    Sometimes these challengers come from established organizations, but mostly they come from startups.

    In addition to the reasons Christensen gives I also think it is the kind of people that staff these organizations that is very different and makes a difference.

    Using the sales analogy, challengers are staffed by “hunters” and they work and act like them. They are interested in acquiring new customers. Running a large existing business that already has a substantial customer base, lives off M&S, and wants to increase share of wallet is not their strength. That is for the “farmers”.

    The first step for Sage and traditional players in responding to the threat posed by hunters is to have their own set of hunters. Each business should be tasked with having both farming and hunting activities, and these activities should be allowed to operate independently.

    Also, each business should be required to grow its new customer base aggressively, not just be profitable. Surely a $100M+ business (say ACCPAC/MAS) can figure out how to compete with a $30M business (say Intacct)? It is never about resources. My contention is that with the right people, the right prioritizations, and the right tone, traditional businesses can compete quite well with newcomers. After all, it is people from Sage and other companies who started Netsuite and Intacct. They were the hunters, and companies like Sage need to focus on retaining them and having them build their new businesses in-house.

    Which brings me to my final point regarding this topic. I don’t think it is ever a good idea to pull R&D out of the business units and consolidate it under a corporate umbrella. That leaves the business unit without the core it needs to drive innovation, and without the close collaboration with customers, sales, support and marketing that is required to build winning products. If R&D was already disconnected from reality, as it is often and justifiably accused of being, doing something like this makes the disconnect many orders of magnitude worse. The justification for this is always “to make things more efficient, to share common technologies and components, etc.” In other words, cut costs. That is not where the focus should be in responding to competitive challenges based on innovation. Rather, what is needed is a focus on innovation by each business unit to meet its unique needs, within its own unique timeframes. It is messy, we know that. Like capitalism. With a lot of waste. But also with a lot of progress. And a lot more effective than the alternative. The Soviets proved that centralized anything does not scale, is not responsive to the demands of an ever-changing world, and is way more wasteful in the long term.

    The second part of the article was on the Sage Data Cloud. It is not very complicated, innovative or unique. It is being, or has been, built by every ISV (Tally.Net is a particularly good example of what is possible) that wants to extend its existing on-premises products with cloud and mobile offerings.

    Like all integration products it provides for both data integration and control integration. In addition, it also stores data, much as a data warehouse might, so that it can be used for reporting, etc. It has the same challenges as well, notably keeping data in sync.

    The Data Cloud is really not the point; it is the stuff you build around it that will matter.
    And yes, instead of each product group building one for itself, it makes sense for the company to build one for use across products.

    The prototype for this was built in mid-to-late 2011. Architects and engineers from all business units were involved. Which returns me to the previous point I made about leaving R&D inside business units: when common stuff has to be done, resources can easily be pooled, but the essential point is that the majority of the R&D work in one business unit has little to do with that in another.

    The Sage Data Cloud effort was stopped when Sue Swenson left. The new point of view was that Sage should not be in the business of building integration software and should use a third-party solution. Not sure what the story is now. Maybe there is built-in support on Azure for it?

    With respect to how Sage should move forward with the Data Cloud, the challenge is not in R&D.

    Rather, a strong developer program is required to encourage 3rd parties to build add-on products, and associated with that should be a marketplace and an aggressive sales and marketing program that promotes these add-on products to customers.

    So where does one find the developers? I would suggest that the starting point is the Sage channel, which has the domain knowledge and has been building add-ons for years.
    Sunil Pande