Dynamic Site Reviews

Web site reviews should focus on processes rather than simply a snapshot of the site at one point in time. In addition, an analysis of technology and internal organizational structure should accompany the design review to form a more complete picture of site operations.
Written by Craig Roth, Contributor


META Trend: By 2004, portal frameworks will become the centerpiece of a delivery infrastructure that acts as a fulcrum to aggregate reusable application, content, analytical, and collaboration components for highly dynamic user interfaces. By 2005, organizations will exploit portal frameworks to deliver contextual business workspaces, enabled via maturing XML and Web service standards. Through 2007, portal vendors will increasingly leverage enterprise infrastructure services.

As organizations increasingly depend on their customer-facing Web sites for revenue and lead generation, there has been a corresponding increase in demand for site reviews. These reviews generally focus on site design (e.g., navigation, use of color, comprehension of content, usability, icons). However, competitive pressures are forcing sites to be personalized and refreshed more frequently, so evaluating a point-in-time snapshot of the site design becomes less important than evaluating the processes behind its creation. By 2005, Web sites that treat each refresh as a separate project will fall behind competitors’ sites and become a burden on their respective IT departments.

How to Evaluate Web Site Processes: A Web Site Maturity Model

A capability maturity model, such as the one developed by the Carnegie Mellon Software Engineering Institute, should be used to assess a site’s design, technology, and people processes. Such a model rates processes on a scale of 1 to 5, where 1 means a process has been performed once but is not repeatable and 5 represents the highest level of process maturity. We have applied this model to Web site processes to obtain the summary ratings described below; an actual review should rate site processes against a detailed set of criteria to arrive at the summary rating:

  • Initial: The Web site has been assembled and released at least once. If it has been through re-releases, each was treated independently with little or no feedback from the previous release. No effort has been made to standardize on a design or toolset. Processes for updating the site have not been established. Objectives, measurements, and feedback loops are not in place.
  • Repeatable: The Web site has been assembled and released more than once. Project plans and lessons learned from previous releases are being applied to make each successive release smoother and more predictable. Processes for content-oriented updates to the site have been automated and put in the hands of content owners.
  • Defined: Web site design guidelines have been published and designers are indoctrinated in their use. Web architecture for the site is documented and tool standards are agreed on. Site business objectives are known and published. A taxonomy covering the content on the site is defined and used to tag data, which is utilized to align content to user patterns and for categorical searches. Privacy guidelines are published.
  • Managed: Web site analytics are gathered, tabulated by business criteria (e.g., customer pattern, items purchased), regularly reviewed, and aligned with business goals. Performance metrics are gathered and reviewed by operations for early detection of capacity problems or usage trends. Internal and external monitoring of uptime provides rapid alerting and resolution of site outages. Project management practices facilitate outsourcing of design, coding, and hosting. Governance procedures are in place to ensure consistency in design among product groups, geographic regions, and divisions.
  • Optimizing: Changes in customer patterns are detected and regularly used to drive changes to site content and functionality. Summary information on Web site contribution to business objectives is available to senior management and used to drive changes in business strategy, product strategy, and funding.
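The rollup from detailed criteria to a summary rating can be sketched in code. This is a minimal, illustrative sketch only: the criteria names and ratings below are hypothetical, not part of the META Group model, and the `min` rule (a single weak process caps overall maturity) is one common interpretation of how capability levels aggregate.

```python
# Illustrative sketch of rolling detailed criteria up into a summary
# maturity rating per area. Criteria names and ratings are hypothetical.

MATURITY_LEVELS = {1: "Initial", 2: "Repeatable", 3: "Defined",
                   4: "Managed", 5: "Optimizing"}

def summary_rating(criteria_ratings):
    """Summary rating is the lowest rating among all criteria --
    a single weak process caps overall maturity."""
    if not criteria_ratings:
        return 1  # nothing assessed: assume Initial
    return min(criteria_ratings.values())

design_criteria = {  # hypothetical detailed criteria, each rated 1-5
    "published design guidelines": 3,
    "audience measurement on redesigns": 2,
    "customer-pattern feedback loop": 2,
}

level = summary_rating(design_criteria)
print(level, MATURITY_LEVELS[level])  # -> 2 Repeatable
```

A review team would substitute its own detailed criteria per area (design, technology, people) and track the three summary ratings over successive assessments.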
The Dynamic Site Review

We recommend a dynamic site review that examines processes along with a snapshot of the current state for three areas (see Figures 1 and 2):

  • Design
    Snapshot: Design reviews examine a snapshot of the site and apply design imperatives or heuristics (e.g., users should not have to scroll; the site should be usable by the color blind; sites should work across browsers, use three colors or fewer, use two fonts or fewer, and always provide help), qualitative aesthetic evaluation (e.g., icons were difficult to interpret, not enough white space, too busy), comprehension tests (e.g., page links were understandable, page content made sense), and error checking.
    Process: Analysis of design processes is less concerned with how the site currently looks than with the processes that will keep it current and fresh over a long period of time. While snapshot design reviews can be done remotely by browsing a site, process reviews require interviewing site owners and stakeholders and are therefore less common. Site design processes address questions such as: Is a process in place to determine customer patterns and evolve the site as the needs of different customer types change? Is there a process for audience measurement (e.g., panels, surveys) when major design changes are made? Are design guidelines in place, and are designers and developers aware of them, to ensure the design remains sound?
  • Technology
    Snapshot: A technology snapshot review ensures that the current technology meets the site’s goals for performance, robustness, function, and scalability for each product category (e.g., Internet-strength Web analytics, Web content management, HTML page editors) and technology use (e.g., browser support, JSP/ASP, Flash).
    Process: Because Web technologies change rapidly, the processes behind technology selection and compliance matter more than a snapshot analysis. For example, the process for measuring Web activity against the site’s business objectives is more important than the presence of usage-tracking tools, and procedures for end-user submission and approval of content are more important than the presence of a Web content management system.
  • People
    Snapshot: An organizational snapshot review verifies that roles and responsibilities are properly defined throughout the organization: who owns the site design, who sets technology strategy, who approves content, and where the budget for site infrastructure comes from. Where a company spans many product groups, divisions, or geographic locations, the degree of federation versus centralization should be explored, as should which components are outsourced (creative design, coding, hosting, monitoring).
    Process: Successful outsourcing depends on numerous repeatable processes, including business alignment, risk analysis, and project management. Budgeting processes should be reviewed to ensure that responsibility is commensurate with funding (i.e., centralized dictates work best with centralized funding). Migration processes (including timelines and role transfers) are often required for rolling out new technologies, mergers and acquisitions, or consolidation of Web sites.
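Some of the snapshot design heuristics above (three colors or fewer, two fonts or fewer) lend themselves to automation as part of a repeatable review process. The sketch below is illustrative only: it scans raw CSS with regular expressions, whereas a production check would use a proper CSS parser and the browser's computed styles.

```python
import re

# Illustrative sketch: automate two design heuristics (<= 3 colors,
# <= 2 font families) against raw CSS text. Thresholds and the CSS
# sample are assumptions for demonstration, not a real site's styles.

def check_design_heuristics(css_text, max_colors=3, max_fonts=2):
    # Collect distinct hex color values (#abc or #aabbcc).
    colors = set(re.findall(r"#[0-9a-fA-F]{3,6}\b", css_text))
    # Collect the first (preferred) family from each font-family rule.
    fonts = set()
    for decl in re.findall(r"font-family\s*:\s*([^;}]+)", css_text):
        fonts.add(decl.split(",")[0].strip().strip("'\""))
    return {
        "colors_ok": len(colors) <= max_colors,
        "fonts_ok": len(fonts) <= max_fonts,
        "colors": sorted(colors),
        "fonts": sorted(fonts),
    }

css = ("body { color: #333; font-family: Arial, sans-serif; } "
       "h1 { color: #0055aa; font-family: Georgia, serif; }")
report = check_design_heuristics(css)
print(report["colors_ok"], report["fonts_ok"])  # -> True True
```

Running such checks on every release turns a one-time snapshot judgment into a repeatable process, which is the point of the maturity model above.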
Business Impact: A customer-facing Web site's ability to meet business goals depends on its usability, use of technology, proper organizational structure, and renewal processes.

Bottom Line: Site success is determined by the people, processes, and technology behind it, not just the design of the site. Organizations should measure the maturity of their site on a regular basis and strive to improve it over time.

META Group originally published this article on 28 October 2003.
