
Top 5 Identity Fallacies: #1 We'll Add It In Later

There are several fallacies that appear and reappear in identity discussions, technologies, and deployments. This is the first article in a series that examines these fallacies, why they are so easy to fall into, and what their consequences are in networked computing.
Written by Phil Becker, Contributor

One of the most frequently encountered identity fallacies is that identity can be added to an application later - perhaps even by someone else, such as "the guys in the IT infrastructure, admin, or security groups." Closely related is the belief that an application will require far less identity capability than turns out to be the case. The primary cause of this fallacy is a narrow focus on an application's functionality, which leads developers to overlook the realities of the larger networked environment in which it will be deployed. Add a dash of wishful thinking, and you have one of the biggest causes of the current identity crisis on the internet.


Most developers begin by focusing heavily on the part of an application that seems most essential to what they are building. Only after they are confident they have made good progress in solving that part of the problem (thus proving they can deliver the desired results) do they begin to think deeply about, and add in, the rest of what is needed to create a deliverable application. This has significant consequences.

In today's internet environment, this has led many to fail to see and understand the central place of identity in the applications they are developing. Identity is often seen as just a user name and password, with a few privilege levels of authorization derived from that authentication - a conceptually simple thing that can easily be added along the way. The result is a proliferation of identity silos and identity store formats, little commonality or integration in the processes for managing that identity data, and a growing problem with building or managing coherent policies that allow online communities to develop and operate effectively across these silos.
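
To make the misconception concrete, here is a minimal sketch, in TypeScript, of the per-application identity silo described above. It is purely illustrative; the type and field names are hypothetical and do not come from any particular product or from this article.

    // A hypothetical sketch of the "identity is just a user name and password"
    // model that each application tends to reinvent for itself.

    interface LocalUser {
      username: string;
      passwordHash: string;              // authentication reduced to one credential
      privilegeLevel: "user" | "admin";  // authorization reduced to a coarse flag
    }

    // Each application keeps its own private store of these records...
    const userStore = new Map<string, LocalUser>();

    function authenticate(username: string, passwordHash: string): LocalUser | null {
      const user = userStore.get(username);
      if (user !== undefined && user.passwordHash === passwordHash) {
        return user;
      }
      return null;
    }

    // ...and nothing here is shared with, or manageable by, any other
    // application: provisioning, deprovisioning, policy, and audit all have
    // to be repeated per silo.

Every application that reinvents this pattern adds one more store to provision, one more password to reset, and one more place where policy must be enforced by hand.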

Twenty years ago, the algorithm was king. This led programmers to develop their algorithms with coarse "test user interfaces". Only after the algorithm was working fairly well did they turn to thinking about and designing how a user could reasonably interact with what they had created. The result was applications that made users adapt to their unforgiving interface demands.

Fifteen years ago the windowed desktop interface appeared. Developers had to learn a new interface paradigm, which pushed them to think about the user interface much earlier in application development. Soon the interface was driving the design. Developers began seeing the "flash" of the interface as a far more value-producing part of their applications, and the user experience became a focus of application development.

Ten years ago the web browser and the internet appeared. The developer's first focus became building methods to interact with distributed data and finding user interfaces that could deliver enhanced functionality over the network, within the environment of the web browser. But the browser's interface capabilities now constrained the user interface. Browser applications could easily look beautiful, but they couldn't adapt well to the needs of user processes. This gave the user experience a flat, publishing-mode "feel."

Over the past ten years, rich development environments have appeared that make most aspects of the underlying development of web applications far easier than ever before. Browser extensions now allow much more "user friendly" browser-based experiences to be constructed. But along the way the subtle constraints of the web browser as an interface became internalized as "the way things are." Chief among these is a loss of well-coupled interactivity with the application - a remnant of the browser's publishing interface model.

Today, when developers approach applications, they assume that all user interaction will be handled as a function within the browser. They presume that what they see everywhere is what should be, and that they don't need to think far beyond what is readily available. Since identity isn't called out as a construct in this environment, it is easy to minimize its importance.

The result has been the proliferation of "bolt-on" infrastructure to supply and support identity and identity management. But if the goal of web applications is to orient them around users and communities, then identity must become a central part of the thinking and design of such applications. Only then can the "bolt it on later" identity fallacy finally be laid to rest.
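
As a counterpoint, here is a minimal sketch, again hypothetical, of what treating identity as a first-class design concern could look like: the application is written against an identity abstraction from the start, and the concrete provider (a directory, a federation protocol, a single sign-on service) is supplied from outside rather than bolted on afterward. The interface and names are assumptions for illustration, not a prescription from the article.

    // A hypothetical contract the application depends on from day one.
    interface Identity {
      subject: string;                     // a stable identifier, not a local row id
      attributes: Record<string, string>;  // claims the application can base policy on
    }

    interface IdentityProvider {
      authenticate(credential: string): Promise<Identity | null>;
      isAuthorized(identity: Identity, action: string): Promise<boolean>;
    }

    // The application logic depends only on the abstraction, so identity is a
    // design decision made up front rather than infrastructure added later.
    async function handleRequest(
      provider: IdentityProvider,
      credential: string,
      action: string
    ): Promise<void> {
      const identity = await provider.authenticate(credential);
      if (identity === null || !(await provider.isAuthorized(identity, action))) {
        throw new Error("access denied");
      }
      // ...application logic proceeds with a shared notion of who the user is...
    }

The point of the sketch is not the specific interface but the ordering: the identity contract is decided before the application logic is written, not after.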
