Dennis Howlett gave an example of Western Digital's use of discussion forums for customer feedback - something that's arguably collaborative, allows employees and customers to share information, and is used to gather suggestions for improvements. Is this Enterprise 2.0?
Some years back I was a hack attending an IDC (I think) conference where Western Digital presented their use of forums as a way of getting customer feedback on what they were offering, plus improvements they could make. At the time it was deemed a success. No one bothered to call it Enterprise 2.0 or anything close. It was part of doing business. Fast forward to the present. We’re regaled with reasons to ‘do’ the E2.0 thing. But what has changed out there in enterprisey land?
Wikis long predate any discussion of the terms web 2.0, enterprise 2.0, and social media. The wiki has a long history of use, starting with its development in 1995 as a counterpart to the centralized code management tools developers used to work collectively and keep team efforts organized. There are concrete uses for a wiki (see 8 Things You Can do With an Enterprise Wiki for examples), specific ways to catalyze and lead adoption of the tool, and concrete ways to measure its utility in your organization – Return on Adoption (see Wikipatterns for examples of both).
But none of this is really new. I saw the same thing two years ago, and earlier this decade in higher education. When course management software started to become a major presence in higher education, it sparked a debate over how – and whether – to centralize the creation of course websites and curriculum content using IT-administered enterprise software.
Before course management systems, faculty typically developed their own course websites with common items like a syllabus, reading materials, and links to relevant websites. There was significant variation in the sophistication and quality of these websites, and the great promise of course management systems was to remove this variation and introduce consistency by giving people a common set of tools that would be easier to learn than building a website from scratch.
The same debates were there: Was it just about the tools, or really about better pedagogy? Who owned the curriculum materials – faculty or the university? How would you attract the early adopters and convince them to move their content to the new platform? How would you catalyze and lead broad adoption? How would you measure the ROI? Who would pay for it? This last one was particularly fun in an environment where 1) there truly is never enough money, and 2) the shift was from piecemeal websites developed by faculty at what was seen as no cost (in reality, they were shouldering it themselves by buying their own software and training manuals) to a software system that needed resources and a line item in the IT budget – or was it the dean’s budget? Split among all the colleges? Or should the Provost pay?
Over time, these issues have been debated and settled in various ways. Some universities cover the cost as an IT expense; others pay for it from academic budgets. Still others share the cost between IT and colleges (which gives both a say in how things are managed). Course management and online learning systems have matured considerably. There are commercial software vendors like Blackboard, open source projects like Moodle and Sakai, and a variety of homegrown tools. Instructional technology and curriculum development departments at many universities assist faculty in learning how to use the software and design materials that take advantage of the capabilities of technology to enhance learning. All of this has taken time, and there is no single set of rules that all institutions can follow to be successful.
To me, a term like Enterprise 2.0 is a nice label to affix once you recognize that there is something visibly different about how an organization functions. For example, universities that have encouraged wide adoption and use of technology in teaching and learning, developed online courses in addition to traditional ones, and changed the educational environment to emphasize learning for understanding instead of content delivery and memorization could conceivably be given the label “Enterprise 2.0.” Universities are enterprises, and 2.0 represents a change from 1.0.
Enterprise 2.0 is a lot like the term “innovate” vs. “innovative”. When I see something that’s really impressive, I might label it as innovative, after the fact. When I hear someone talk about how they’re going to innovate, my built-in BS detector goes off.
Being innovative is not something you do based on a neat, step-by-step process. There isn’t a universal set of rules that make your behavior innovative. It’s a result of understanding the purpose of your work, observing how it gets done and thinking critically about your role, and recognizing the things you can do to make it better. Sometimes that results in refinements, and other times it leads to more significant changes. But when you’re in tune with the purpose and process, you intuitively know when and why change is necessary, and you can communicate that to others with authenticity.
In sum, here are several timeless patterns I’ve observed in my years of working with a variety of organizations on technology adoption. As Merlin Mann said in a recent speech, “This is not a list. It is a list of four things, but don’t think of it as a list. Because that makes me mad. Item 1.”
Never underestimate how busy people are, and how quickly they will ignore or dismiss something they don’t see as useful.
What has worked for me, time and time again, is to work my way through an organization team by team, department by department, and find out what day-to-day problems people want to solve. (My work with the Brown University Chemistry Department on a language tool for international graduate students is just such an example. We didn’t use a wiki for wiki’s sake; we built a source of information to help graduate students learn how to use technical terms in their proper context in a chemistry lab. This project is the subject of a chapter in the recently published book Authenticity in the Language Classroom and Beyond: Adult Learners.)
Rules are for impatient people. You need to observe patterns to see what works well and where the weaknesses lie.
The best strategy for long-lasting technology adoption comes from running a small pilot, working out the kinks, telling a good story with relevant examples from the pilot, giving people permission and encouragement to find the best uses, and letting them guide their peers.
Stewart Mader is founder & senior consultant at Future Changes. He has led or advised wiki and content management software adoption in large companies, small and medium enterprises, and universities. He is the author of Wikipatterns and Using Wiki in Education, and created the widely-used Wikipatterns.com community for sharing technology adoption strategies.