The "thin client" model of today has changed very little from the model Sun Microsystems developed in the early 1980s, which let diskless client hardware boot remotely from the expensive hard drives in a central server. The main difference is that the local client is a great deal smarter and a great deal less expensive than it once was. In those days, hardware was expensive and bandwidth was cheap. Today, hardware is really cheap and, while bandwidth is cheap compared to 1980, as you point out, it is not always available. And when it is available, it is not always sufficient for the task at hand. (Nationally, the demand for bandwidth is still expanding exponentially.)
Sure, for the text-based chores you describe (plus an occasional JPEG), the bandwidth demands are minimal, but what about the compute-intensive chores? Graphic design? Gaming? Further, every processor cycle not consumed by a thick client is instead consumed on some central server -- plus the added overhead of managing tasks from many clients at once. Even a thin client will spend most of its time idle, waiting for us to hit the next key. Why not have that "thin client" doing productive work while it waits on me? That beats sending those bits across the country to be processed elsewhere and then sending them back.
Sure, centrally locating data is a real plus -- mainly because it shifts the responsibility for safeguarding our data onto someone else, who stands to lose far more by losing our data than we do by losing it ourselves -- but does it really save us money? Does it save us time? Yes, your lost-productivity argument is sound; but, really, how often does the dreaded BSoD actually happen? Don't we need those occasional failures to remind us of the pitfalls of placing unbridled faith in our technology to protect us?
In reality, the average user could buy the least expensive PC available today, pay about the same as for a thin client, and do everything they will ever need to do. But that is not what people do -- they buy all the bells and whistles because they can. Your blue-sky scenario also assumes that there will be standards rather than a plethora of competing solutions, none of which work together.
Your basic argument could just as easily be applied to television. Do I buy a satellite dish and take responsibility for keeping it working and replacing it when the technology changes, or do I sign up for cable TV and call them when something goes wrong? They absorb the upgrade costs, but I pay a monthly fee for the service. In the end, the financial outlay is probably about the same. And whether one is more convenient than the other is in the eye of the beholder.
Don't get me wrong. I love the fact that all of my important correspondence is stored on an Exchange server that someone else backs up every night. I also love that I can get at that information from my BlackBerry -- and that I can browse the web anywhere within a geographic area I rarely travel outside of anyway.
Still, given a choice between my BlackBerry, which gives me a thin-client-like experience at 80 Kbps, and a full-blown PC at multiple Mbps as my only option at work and at play, I will pick the fully functional laptop every time! And so will most of your readers.
It all sounds good -- and there are many applications for which the model makes sense -- but there are many for which it makes no sense: insufficient bandwidth, insufficient cycles for the task, insufficient flexibility for the user.
Thin clients will always have their place, but I doubt they will ever be the predominant model.