Sunday, December 16, 2007

LastData – Data Storage for the LastComputer

The second major technical aspect of the LastComputer, after applications (which has been the major focus of this blog so far), is data storage. When using the LastComputer your data must just be there, available, all the time. It must be there when you turn on your LastComputer, it must be there after your LastComputer crashes and is replaced, it must be there whether you’re on your big LastComputer desktop or laptop or whether you’re using your LastComputer mobile device. And most of your data must be there when you’re using a public terminal or using your friend’s computer (or, for ALL of your data, your friend’s LastComputer) at their home.

Satisfying these never-lost/always-accessible needs, while still running as quickly as a traditional personal computer, requires a combination of automatic data caching, replication, and backup. Most likely this means that the master location of your data will be off in the cloud (i.e., in server data farms).

Is it too much to expect everyone's data to be stored on a server farm? Isn't that too much data and too much bandwidth? I don't think so. Thanks to deduplication, most of your data requires very little extra storage: you run the same programs as millions of others, own the same songs as millions of others, and share business documents with many others, so you are left with only your personal files. Personal files do not require much storage space, except for personal photographs and movies. (I joke that the world only needs one picture of a baby with a messy face full of food, with all variations handled by diff, but most parents and grandparents demand the real thing.) So storage will be fairly cheap per user, and probably can fit within the LastComputer's pricing model, with a little extra payment required for the photo/video-happy users (e.g., the new parents of the world).
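
To illustrate the deduplication point, here is a minimal sketch of content-addressed storage, in which identical files owned by millions of users are kept only once. The store layout and names here are purely illustrative, not any vendor's actual format.

    # A sketch of content-addressed deduplication: identical files (the same
    # song, program, or shared document owned by millions of users) are stored
    # once, keyed by a hash of their contents. STORE_DIR is a hypothetical
    # server-side blob store.
    import hashlib
    import os
    import shutil

    STORE_DIR = "dedup_store"

    def store_file(path):
        """Add a file to the store; duplicate content costs (almost) nothing extra."""
        with open(path, "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()
        blob = os.path.join(STORE_DIR, digest)
        if not os.path.exists(blob):          # only never-seen content uses new space
            os.makedirs(STORE_DIR, exist_ok=True)
            shutil.copyfile(path, blob)
        return digest                         # the per-user catalog stores only this key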

As for the bandwidth problem, in most cases a person's LastComputer will have cached what they need, and changes are relatively few, so bandwidth costs remain reasonable. The typical LastComputer user is simply using local copies of their documents, reading their email, and listening to their songs, never thinking about the fact that much of the data that appears to be on their computer is really a cache of the real data off in some server farm.

No one has achieved this LastData model yet, at least not for the public, but here is a representative sampling of some products and categories that are partway there:

Personal access-your-data-from-anywhere home servers:

This class of product (e.g. the HP MediaSmart Server and other Windows Home Server boxes) helps with backup, some amount of remote access, the ability to see files from many computers, and recovery of files and systems. Good goals, but it's so wrong, for these reasons:

  • Someone in every home becomes the IT department.
  • Must be addressable from anywhere in the world, and so internet providers must be cooperative.
  • Must be always on. Future solutions must save energy, not consume more energy.
  • This one machine now becomes the single point of failure for all your computers – so who backs up the backup machine?
Online backup

There are many great online backup services to choose from (e.g., ibackup, .mac, backup.com, xDrive, Mozy, Arsenal, Carbonite, Amerivault) and the space is hot right now (e.g., EMC recently acquired Mozy for $76 million, while IBM acquired Arsenal Digital just days ago, although Omnidrive is looking troubled). Many of these services offer a few GB of backup for free, but keeping a backup of your entire system will likely require more space than that. For $50 to $100 per year these services can back up your entire system (lower per-system costs are available for corporate plans).

I personally use Mozy and am so far mostly happy with the way it works, but I have not had a catastrophic failure since I started using it, so I don't know how well it will pull through when I really need it.

These online backup systems use a subscription model of roughly $6/month. That fits in well as a component of the $30/month subscription/leasing price I previously proposed for the LastComputer, which would include automatic backup (or caching, if you prefer) as an intrinsic part of that $30/month cost. (With deduplication, the actual storage cost per computer will come way down.)

Where the current online backup solutions fall short is in their integration with the entire computer-owning experience:
  • Online backup is currently a separate experience from computer purchasing and maintenance.
  • Too-frequent user intervention is required.
  • Data recovery is not trivial.
  • If all or part of my system fails, it's not clear that all programs will be configured correctly when I restore.
  • If I need to restore everything to a new computer, I'm not confident that all of the programs I own (have paid for and configured) can be restored to their previous state automatically from backup, without my needing to go through some painful re-installations.
  • If I am on a different system (e.g. my mobile device or a public terminal) most of them don’t provide a clean way to access my data.
  • The big problem: We should not even need to think about "data backup". What's on my computer should just be "my stuff" and should always exist without my thinking about it.
Synchronization / Replication

One may view the LastComputer data issue as largely a file-synchronization problem. Whichever LastComputer you’re using (primary, mobile, friends’, hotel, or post-crash replacement) you want your data to be replicated on that computer as you need it. Folder-synchronization tools (e.g., FolderShare, PowerFolder, Unison, and good ‘ole rsync) show one way this may work. These tools allow you to have the same set of files on multiple computers; if a file is changed on one computer it automatically gets updated on the other computers.

I personally use FolderShare to keep certain files consistent across many machines (e.g. photos, files with information for the whole family, or common command files I need on each of my machines).

These file synchronization programs are not, in their current form, exactly what LastComputer users need, but they do validate the utility and performance of cache-like, update-as-needed data management. And they can be used to recreate some parts of the LastComputer on your own. For example, two friends may want to use file synchronization to constantly keep each other's computers backed up. Or you may want to keep certain parts of your home and office computers always in sync. Or your team across the globe may want to have local copies of commonly used documents without the performance loss of around-the-world communications each time they refer to a file.
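
As a rough illustration of what these folder-synchronization tools do, here is a toy one-way sync pass. Real tools such as rsync, Unison, or FolderShare add delta transfer, two-way merging, and conflict handling; this sketch is only meant to show the basic compare-and-copy idea.

    # A toy one-way folder sync: copy any file that is missing, or newer on
    # the source side, into the destination tree.
    import os
    import shutil

    def sync_folder(src, dst):
        for root, _dirs, files in os.walk(src):
            for name in files:
                src_path = os.path.join(root, name)
                rel = os.path.relpath(src_path, src)
                dst_path = os.path.join(dst, rel)
                os.makedirs(os.path.dirname(dst_path), exist_ok=True)
                # copy when the destination copy is missing or older than the source
                if (not os.path.exists(dst_path)
                        or os.path.getmtime(src_path) > os.path.getmtime(dst_path)):
                    shutil.copy2(src_path, dst_path)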

I maintain some fading hope that Microsoft will surprise us all with a really nice worldwide data replication system, if Ray Ozzie has one more big replication trick up his sleeve.

Web OS & Online Apps:

Many expect that thin clients (browser, Flash, Silverlight, Java) will replace desktop applications entirely. I've written previously (1, 2) that I expect this to happen for some common and simple-UI applications, but not all. If I'm wrong, then LastData becomes a non-issue because all of your data will always be on the server. Here are many companies and products betting that I'm wrong: Google Docs, Zoho, Zimbra, ThinkFree, Gnome Online Desktop, eyeOS, Ulteo (and so on; see the Read/Write office roundup).

Offline/Online WebApps

A simple extension of the Web OS/SOA idea is applications that run in the browser but continue to run when disconnected from the Web. Zoho has most recently shown how well this works with their "Go Offline" option, which uses Google Gears to cache your data locally. I wrote earlier about Google Gears, and Zoho is the first to show it working. It's not a great system, because it requires manual intervention, but it works well enough to prove the point.

I keep expecting Adobe to move ahead in this area with AIR applications; AIR is intended to let developers use their web (HTML, PDF, Flash) skills to develop desktop applications. It's only a minor shift from there to on-demand applications and data anywhere (see the Adobe discussion).

The disk drive in the cloud

Imagine a drive or folder on your computer—oh, let's just call it “G:” or “/G” for example—that also appears to be on every other computer you use. All of the G data (which will be all of your data) is really off in some server-farm somewhere (you don’t need to know where), but to you as a user it just appears to be in your local G, on any computer you use. The data in G is accessible by both your online apps and your offline apps. You then store and retrieve all of your data (documents, photos, movies, music, high-score game settings, etc…) on G and only G. No extra steps or copying or backup will be necessary. This will be data nirvana.

To achieve adequate performance on this G folder, tweaks will be needed to the current OSes to predictively cache and search and replicate, but those are just tweaks, not far off from what is already built into OS X or Vista.
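
As a sketch of how such a G folder might behave, here is a toy read-through/write-through cache. The RemoteStore interface (download/upload) is a hypothetical stand-in for whatever server farm holds the master copies; a real system would also queue uploads and prefetch predictively.

    # A toy "G drive": reads hit a local cache first and fall back to the
    # remote store; writes land locally and are pushed to the remote.
    import os

    class GDrive:
        def __init__(self, cache_dir, remote):
            self.cache_dir = cache_dir
            self.remote = remote                      # assumed to offer download()/upload()
            os.makedirs(cache_dir, exist_ok=True)

        def read(self, name):
            local = os.path.join(self.cache_dir, name)
            if not os.path.exists(local):             # cache miss: fetch from the cloud
                with open(local, "wb") as f:
                    f.write(self.remote.download(name))
            with open(local, "rb") as f:
                return f.read()

        def write(self, name, data):
            local = os.path.join(self.cache_dir, name)
            with open(local, "wb") as f:              # the local copy is just a cache...
                f.write(data)
            self.remote.upload(name, data)            # ...the master copy lives remotely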

I hope that’s what Google has in mind for their GDrive.

Monday, October 1, 2007

Adobe leads in software platforms for thin-client apps

I've tested a lot of webOS / Web Office platforms (e.g. Goowy, DesktopTwo, Google Office, etc...). Within each of those platforms are many (mini) applications. Most of the applications are not very good compared to desktop apps, but those that don't suck all have one thing in common: they're written in Adobe's Flash. (BTW, all of the platforms that try to make it look like you have a desktop within a browser window are just silly, and they use a whole lot of code to get this silly effect... the browser already does a good job of running many apps within a desktop AND has the further benefit of being able to interact with the real desktop.)

Today Adobe announced that they are acquiring Virtual Ubiquity, maker of Buzzword. Buzzword is a word processor written in Flash 9. Buzzword is a great word processor (the first web-based word processor I can say that about), downloads (and updates) immediately from the web (and with AIR will be standalone), integrates with the desktop, and is fast. Furthermore, Buzzword is one of the first attempts in years to make a word processor that is not simply a copy of MS Word.

Adobe has Flash, the best technology for delivering web apps. It is working on AIR to make those also work as desktop apps. They're pushing out their document-sharing services. And music players.

Google has Google Office, a collection of apps that are, eh... good enough. (Google should get out of the DHTML-only mindset.)

Microsoft has lots of Office users, but is still only promising to make those apps "pushable" and to simplify file sharing. (MS should improve downloadability and sharing of their office applications.)

Lots of WebOS makers are trying... hoping to be acquired by one of those.

And the winner is: At this point, Adobe is in the lead.


P.S. I nearly forgot about this blog for a few months. Sheesh. If the "Last Computer" topic interests you, then take a look at Read/WriteWeb. That blogger is way more active than I am.

Wednesday, June 27, 2007

Last Computer financial model - look to the cell phone

    In recent blog entries I’ve been describing how software will be implemented on The Last Computer. I'll get back to that later. But first, some thoughts on how individuals and organizations will pay for their Last Computers.
Convergence of mobile phones with computers

I bought a new cell phone last week. In most respects my new cell phone really is a computer that just so happens to be capable of fitting in my pocket and making phone calls. Over time we can expect what we call a "mobile phone" and what we call a "computer" to merge.

Perhaps this future merged device will take the form of a powerful mobile phone that docks into a keyboard / mouse / monitor when we're at our desks; or perhaps it will look like today's mobile phone but when placed on any flat surface it will project a keyboard and mouse pad onto that surface while projecting a high-density widescreen image against any available wall.

I don't feel safe in predicting the physical model of this converged Last Computer (whether it will be more like today's cell phone, more like today's second-to-last computer, or something completely new), but I can predict its financial model. The Last Computer and its related software and services will most likely be sold in a way very similar to today's cell phones.

I purchase my new cell phone (purchase? my?)

Here's how I bought my cell phone this week. I dropped into a local T-Mobile store. Tried out a few devices. Found one I liked. Played with it. A nice salesman answered some questions. He described a variety of service and payment options. While selling me the phone he cleverly up-sold me on various additional services (text-, time-, data-, and warranty-plans), which I mostly accepted because it was so easy to accept and because each item would appear as only a small increment on my monthly bill (for greater payment simplicity I could have those monthly payments automatically put on my credit card). I paid a few hundred for my phone, and within a few minutes it was mine and it was working and I was happy. The whole thing was so easy and painless.

Contrast this with any time I've bought a personal computer, or been supplied one at work, or upgraded from an old one (see previous blog). For simple ease of process, the cell phone purchasing model is vastly superior to today's computer purchasing model.

Consider another aspect of my recent phone purchase. In the above description I referred twice to that cell phone as if it belonged to me ("my cell phone", "it was mine") but in truth it does not belong to me, it belongs to T-Mobile. I don't own the phone; I am only leasing or renting it. The mobile companies are too clever to say that they'll lease or rent you a phone, instead they make it sound like they'll sell or even give it to you, along with an X-year plan. (We can all truly purchase an unlocked phone, and own that unlocked phone. But that costs a lot more up-front money than a plan. I'm perfectly fine with a plan and the other simple benefits it entails, and most other people are too.)

The Last Computer leasing model

The Last Computer will generally be leased. The terms lease or rent will probably be avoided when communicating with the individual consumer (although they may be necessary in corporate situations), and, as with the cell phone, the user will feel as if they own the computer. But it will be some form of leasing, and that's how I'll refer to the transaction here.

I don't know what the leasing price will be. In a small sampling of people I've queried, they generally seem satisfied with the idea of paying some small initial amount (let's say $200) to buy a computer and $30 per month for all the benefits described previously in my initial ACME Last Computer post. Over the course of three years (a typical obsolescence period for mid-capability computers) that comes to $1280, which is typically hundreds more than someone would currently pay for such a computer. If, as I have proposed, the consumer automatically gets upgraded every three years, they will continue to pay $360/year (or $1080 per three-year cycle) for the privilege of owning their always-up-to-date computer. (Note that peripherals, such as monitors, may not need upgrading as frequently as central units, resulting in greater savings for the vendors.)
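
For concreteness, the lease arithmetic above spelled out as a tiny calculation (these are the illustrative numbers from the paragraph, not a real pricing model):

    # Lease arithmetic for the hypothetical Last Computer pricing above.
    down_payment = 200            # one-time buy-in
    monthly_fee = 30              # subscription price per month
    months_per_cycle = 36         # three-year upgrade/obsolescence cycle

    first_cycle_total = down_payment + monthly_fee * months_per_cycle   # 1280
    later_cycle_total = monthly_fee * months_per_cycle                  # 1080
    per_year_after_buyin = monthly_fee * 12                             # 360
    print(first_cycle_total, later_cycle_total, per_year_after_buyin)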

This is a good financial deal for the individual consumer (or the corporate user, with some modifications), since it is an easier and less risky option than paying a large amount up-front for what is a risky purchase. For the computer vendor it is more money collected over the lifetime of that computer; on the face of it, it's not clear whether this $30/month arrangement alone is a financial win over a direct sale, but (as described in the next section) it does open the door to much more income over the lifetime of the lease (which, hopefully, is the long lifetime of the consumer).

Up-selling to the consumer

Now that the consumer is in the system, and is making easy monthly payments for a computer largely controlled by the seller, there is big potential for selling more goods and services to that consumer, and in ways that are actually good for that consumer:

  • Bundled internet services: A computer is not much good these days without a connection to the internet. What could be more convenient than combining the payment for internet service with the payment for leasing the computer? So it is natural for the internet service provider (cable, DSL, WiMAX, etc.) to partner (i.e. bundle) with the Last Computer provider. (Does this mean, for example, that Dell should start selling cable services, or Comcast should start selling computers, or that they should partner?) As the market is already seeing combined providers for internet, phone, music, and movie delivery, we can expect these mega-providers to offer computers as part of the one-price mix as well.

  • Bundled software and software services: Some predict the day when we will stop paying for software directly and instead all software will be a service (often ad-supported but sometimes as a leased service). I hope they are wrong. The ability to easily purchase and use software via your Last Computer provider can revitalize the ability to sell software (more on this below), with the Last Computer corporation as the retailer earning their cut of the sale.

  • Personalized marketing opportunities: With the Last Computer model, and all of the customer's data available to the Last Computer provider (whether because of caching, backup, system queries, or because it goes through their pipes), there is an opportunity to deliver relevant marketing that suits the needs of the user: be it advertising, coupons, or up-selling of relevant services and software.

  • Perpetual retention (customer lock-in): Because of the ease of remaining in the system, and because there is no longer a need to choose a new computer every few years (since automatic upgrades are part of Last Computer contracts), the consumer is unlikely to leave their Last Computer vendor for another.
The cell-phone leasing model leads to the miracle of small- (if not micro-) payments

It is amazing that on my cell phone I can purchase, download, and install software or other content for as little as 49 cents. 49 cents! (It's so painless and per-purchase cheap that countless kids out there are constantly buying total crap for their phones without a second thought.) When done right, all I need to do is click on "I agree" and the software is downloaded and installed, the small price is added to my monthly phone bill, and the phone company and content authors get paid. It's an amazing accomplishment that should be incorporated into The Last Computer for the benefit of creators and consumers.

I know of no other situation where it makes economic sense to sell anything so cheaply. In the traditional software model, the overhead in billing alone is much more than that 49 cents. Add to that the required packaging, middleman markup, marketing, and so on, and the end-user price on that 49-cent piece of software could easily jump to 49 dollars (which is then high enough that sales are lost due to sticker shock and piracy and open-source free alternatives, which together push the price up even higher or drive the creators out of business). Even a simple on-line purchasing system requires me to fill out credit information (again!), which adds to my reluctance to pay that high price.

Alternatively, there is free, ad-supported software. I believe that the recent trend toward making all computing services free and ad-supported can only go so far (and leads to lesser quality over time), and in many cases is just a stop-gap while some micro-payment model is created. Ideally micro-payments would come down to something like this: "Yes, I implicitly agree to pay the New York Times 1/100 of a penny every time I read one of their well-researched and articulate articles." The cell-phone / Last Computer billing model is not quite there, but it is close. I could easily agree to allow New York Times to add some small fee (25 cents?) to my phone bill during each month that I read some of their articles. And I could easily agree to pay a few cents to dollars, now and then, when I find software I like and its purchase and installation is as simple as "I agree".
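
To make the carrier-billing idea concrete, here is a minimal sketch of how tiny per-item charges could simply accumulate as line items on an existing monthly bill, avoiding a separate credit-card transaction (and its fixed overhead) for each 49-cent sale. The class and vendor names are hypothetical, not any carrier's real system.

    # A sketch of carrier-style micro-billing: small charges become line items
    # on the customer's existing monthly bill.
    from collections import defaultdict

    class MonthlyBill:
        def __init__(self, base_subscription=30.00):
            self.base = base_subscription             # e.g., the Last Computer lease
            self.items = defaultdict(float)

        def add_charge(self, vendor, description, amount):
            # a single "I agree" click on the device ends up here as one line item
            self.items[(vendor, description)] += amount

        def total(self):
            return round(self.base + sum(self.items.values()), 2)

    bill = MonthlyBill()
    bill.add_charge("GameCo", "downloadable game", 0.49)
    bill.add_charge("New York Times", "articles read this month", 0.25)
    print(bill.total())                               # 30.74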

The traditional software companies can also win from this Last Computer small-easy-payment purchasing model. A Last Computer owner may be happy to allow Microsoft, for example, to add a couple of dollars to their bill in months when that person uses Microsoft Office. That person is still free to use Open Office for free if they think it's good enough, or to use Microsoft Office for $2/month (and get the additional Last Computer software installation and support). Again, this is good for the consumer, it's good for the software developer, and of course it's good for any ACME Last Computer corporation who gets a cut of this "sale".

Corporate financing models

Most of what has been said about individual consumers applies to corporate users of Last Computers, but with corporate IT buffering the technicalities and with pricing managed in bundles (so corporations are more likely to shop around for Last Computer providers every few years). Once again the current cell-phone market is the one to follow, but as it applies to corporate purchasing.

Summary of Last Computer financial model: It's like a cell phone

Buying a Last Computer of tomorrow (with its services, software, features, bundles, and upgrades) will be much like buying a cell phone of today. The consumer will benefit from simplicity and lower buy-in prices. The Last Computer manufacturer will benefit from higher and steadier overall income, improved customer retention, and partnering opportunities. And third parties will benefit from selling to customers, through a Last Computer distribution chain, into a revitalized software and services industry. It will be a win win win.

Sunday, June 3, 2007

Upgrading to my second-to-last computer. A case study.

This weekend I switched my home personal computer from one that was 7 years old, to one that is only 3 or 4 years old (inherited, sort of). It was enough to about double the clock speed, and finally have a video card capable of viewing YouTube videos at more than a couple of frames per second.

This is the last time I ever expect to have to go through the process of manually moving over to a new personal computer. I put off doing so for the longest time because I expected the process to be painful. The next time I get a new "personal computer" I fully expect it to follow a Last Computer model much like the one I described in the open letter to the ACME Computer Corporation.

I've recorded notes about this computer transfer, both to document the painful process that the Last Computer hopes to avoid, and for historical purposes (since in the Last Computer future people will no longer follow a migration process to new computers). If you get as bored and frustrated reading these notes as I was writing them, jump ahead to the end.

Notes taken while moving from old computer to new computer:

  • Preparation This Week: I'm lucky enough to live near a Fry's Electronics where they still sell software in boxes. I select "PCmover", which claims to move all my programs, files, and settings to a new computer. For just $30.

  • Friday: I drive to work so I can carry home a monitor. I figure that doing this transfer will be a whole lot easier if each computer has its own monitor.

  • Saturday 12:20: Lug second monitor upstairs. Not enough space on desk, and not enough outlets, so find clear spot on floor to set up new computer, with monitors and keyboards and such.

  • 12:30: begin removing old programs and accounts from new computer

  • 1:30: still running windows updates on the new computer, since no one has used it in a year or more

  • 1:46: install PCmover on old computer

  • 2:00: click through all PCmover messages saying, basically "I agree that this may not work"

  • 2:04: install PCmover on new computer; new computer is ready to receive; old computer is ready to send; press "go"

  • 2:08: uh-oh, old computer has 2 drives, new computer only one; PCmover is not sure how to make this migration happen -- based on the help screen I will choose NOT to transfer the second drive (D:), and instead will manually move that disk drive from one machine to the other after transferring drive C: -- hope this is not a mistake

  • 2:13: all dialogs have been set, decisions made; with fingers crossed I press "Next" and it begins.

  • 2:16: PCmover has finished analyzing the systems and gives me one last chance before running, saying it may take several minutes to several hours. I press "Next" and let it begin.

  • 2:24: Eight minutes in comes the first "time remaining" estimate: 5 hr, 24 minutes. Ugh. Maybe it's time to go catch up on the Heroes marathon Tivo recorded last weekend.

  • 5:10: Almost three hours into moving the one drive (now I sure am glad I didn't choose to transfer the second drive, drive D, which is much larger than C) and PCmover tells me there's another hour to go. (Time enough to watch another episode of Heroes.)

  • 6:08: PCmover says it finished. "Done. Filling the moving van is complete." (whatever that means)

  • 6:14: I've turned off and rebooted my new computer. On startup it looks like my old system (same user accounts, backgrounds, desktop setup). Before going further I'll now transfer the old D: drive from my old computer to the new one because I'm sure items on C: are referring to drive D: and I want to have D: available before that happens.

  • 9:15: got so frustrated back then when manually moving the D: drive that I yelled and swore like a sailor (hard to know which is C: and which is D: on the old computer, and I guessed wrong the first time, then had a helluva time squeezing the hard disk into its slot on the new computer among the tangle of cables). Was frustrated a second time when none of the memory from the old computer worked in the new computer. Huffing and puffing, I went to have dinner and watch more episodes of Heroes. Now the new computer boots and I'll try to turn on all the old software and see what works (PCmover disables startups the first time through because, as I agreed in the earlier dialog, there's no guarantee they'll actually work).

  • Saturday 11:46 PM: I've been updating old programs for the last two hours (Windows updates, virus scan, the programs I use most often). I'm tired. Going to bed.

  • Sunday 8:30 AM: Boot computer to continue upgrades and getting programs to work -- it's in a weird 640x480 4-bit color mode - dangit - cannot reset that - have to download video drivers or something - the whole thing's running incredibly slow -- (things were going so well, it seemed, and now the video driver is not working)

  • 9:26: graphics problems resolved - should have remembered my old policy of not trying to upgrade too many components at once

  • 10:24: - man, it takes a long time for MS updates to do all their stuff on a computer that has not run updates for more than a year... this is giving me too much waiting time to wander off into YouTube videos, slashdot links... idle hands...

  • 12:43: - most of my daily programs are now working -- time for lunch

  • Sunday afternoon: Most of the programs I use each day are working on the new computer. It is noticeably faster than the old computer, but just barely (I hope to buy more memory and see if that helps). The room is a terrible mess, with computer parts, screws, and memory all over the place. I'll need to clean this up before the wife comes home. Maybe I'll put that off by blogging first. But what am I supposed to do with all these old computer parts?

A few comments on the status of this transfer:

  • I'm a little surprised at how few of the programs I've tried so far are failing. Most of them are working just fine. Still, a few problems:

    • Copernic has no indexed data, but the settings are OK
    • iTunes takes forever to start up, ending with a message that it must be reinstalled if it's to burn CDs correctly, but I don't burn CDs so I'm not going to reinstall and lose all my settings
    • every once in a while about 60 blank IE windows will open up in quick succession -- but after a few hours that stopped happening
    • had to re-install virus-protection software (by the time this is installed the computer doesn't run as fast as it first did)
    • had to remove Windows indexing settings that slow the computer down
    • Mozy backup software did not transfer

  • The process that transferred drive C: took longer than I expected: almost four hours. (I'm really glad now that I did not transfer the larger drive D: in this process, but just physically moved the drive -- although that's the same thing I did years ago when moving this drive to my previous computer -- this drive must be getting really, really long in the tooth.) I'm going to have to revise my story that a new LastComputer can auto-configure itself over the internet while its owner is making dinner.

  • Given all the work it took to make this move, how minor the improvement is over the old computer, and how cheap fast computers are these days, I wish I'd forked out a few bucks to buy a completely new computer and done the transfer to that.

Lessons learned from moving to my second-to-last computer:

Even though it was more successful than I expected (I haven't yet had to dig around for any old CDs to reinstall old software or copy old license IDs), this process sucked! It took up most of my weekend, was at times incredibly frustrating, and was at all other times tedious. I cannot believe that average people (who, unlike me, do not make their living with these things) are expected to get through this process. I cannot believe that computer and software manufacturers expect to make their customers happy with this process.

I can believe that a lot of computer upgrades are not being sold because potential customers fear this upgrade process. I'm convinced that those customers would be happy to pay hundreds of dollars every few years for faster equipment if moving to that new equipment was painless. I'm sure that the industry is leaving a lot of customer money on the table.

    Imagine if it were this difficult to transfer to a new automobile. If, instead of signing some papers and moving some crap out of the back seat of one car into the back of another, automobile owners had to reinstall all their settings, and spark plugs, and belts, and radios, and so on, and then, after all that was done, had to figure out how to dispose of the old car -- who would buy a new car then? The first auto manufacturer to make auto ownership quickly upgradable/transferable would own the market.

When the computer industry (hardware and software) moves to the Last Computer model we'll open paths to a huge revenue stream from screaming mobs of continuously-contented, continuously-returning customers at home and in the workplace.

Thursday, May 31, 2007

Google’s inroads into solving 3 of 4 web-based app problems

This week Google has taken two more big steps toward The Last Computer, with release of Google Gears and acquisition of GreenBorder virtualization technology. Too bad they’re stepping with such clunky shoes.

In my previous post (Browser-based thin-client apps) I wrote about why Web 2.0 AJAXish applications are part of The Last Computer. But I also wrote about the four reasons browser applications will not replace all desktop applications.

3 of 4 web-app problems are solved by removing old client/server thinking

The first three of those four problems (Connection Required, Slow, & Sandboxed) are not problems of the browser paradigm itself (although the fourth, HTML == Poor Interface, is). These three problems are due to an old, fixed way of thinking about browser apps. The old thinking is that the browser is on a client machine, and the server is on a distant machine somewhere on the other side of the internet, and never the two shall meet. There is no reason for this separation other than tradition, and if this tradition is broken then the first three problems approach zero.

    Consider: For ftp, bit torrent, and most p2p applications, there is no single client talking to a single server. The client is just as often a server. Client/server are outdated concepts.
The first step to breaking with old web/server thinking is to realize what a web server is: it’s the simplest bit of software ever written. A web server receives a request in a string; it sends back a response in a string. There’s no easier program to write than a basic web server. Perl fans like to write a web server in just a few lines. I’ve written at least twenty different web servers using Nombas’ ScriptEase Machine, because it is easier and smaller to create a specific web server than to install and configure a big commercial one.

Big Step #1: Move the web server onto the client

Once you realize what a small effort an actual web server is, especially if it serves only one client, there is no reason not to move that web server onto the client machine. Simply by moving a web server onto the client, we've already (mostly) solved the first three of the web-app problems:

  1. Connection Required – Solved: server is on 127.0.0.1, no internet needed
  2. Slow – Solved (mostly): No internet latency added
  3. Sandboxed – Solved: Server has full access to client’s machine
For years I’ve used this web-server-on-desktop approach to create simple applications that use a browser interface to access my own computer. Mostly these have been for personal use when I need a quick program with a simple interface—usually a browser link or simple browser form will suffice. It works great (although it always feels like a hack).
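
To show how little code a single-client, localhost-only web server really takes, here is a minimal sketch (in Python, purely as an illustration of the idea; any scripting language would do). Because it binds only to the loopback address, the client and server are the same machine.

    # A minimal single-user web server bound to the loopback address only:
    # no internet connection is needed, no network latency is added, and the
    # "server" has full access to the local disk (here it just lists the home
    # directory for the local browser).
    import os
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class LocalHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            listing = "\n".join(sorted(os.listdir(os.path.expanduser("~"))))
            self.send_response(200)
            self.send_header("Content-Type", "text/plain; charset=utf-8")
            self.end_headers()
            self.wfile.write(listing.encode("utf-8"))

    # browse to http://127.0.0.1:8080/ in the local browser
    HTTPServer(("127.0.0.1", 8080), LocalHandler).serve_forever()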

Google Desktop was the first widespread offering I saw that used this approach. To bring Google-like searching to your own personal computer, Google simply wrote a web server that you ran on your own machine, where it could search your local files; your existing browser became the interface to whatever that installed web server delivered.

    BTW, this Google Desktop interface was terrible. People didn't notice it was terrible because it looked just like regular Google results, which are fine for the web (where expectations are very low), but when compared against a real desktop app it really was/is terrible. Long before Google Desktop I'd been using Copernic for the same thing. Even more so than my comparison of Gmail vs. Outlook, Google Desktop versus Copernic, side-by-side, showed the extreme comparative suckiness of the browser as an interface.
Big Step #2: Join my web server into “the cloud” – Any computer is the network

So Google long ago realized some benefit of moving a web server onto the client. Why did they stop there for so long? Wouldn't Gmail, and other parts of Google's huge suite of web-based applications, also benefit (i.e. lose 3 of the 4 web-app problems) from moving the server onto the client?

Consider: Google does not have “a server” delivering my Gmail, web results, and so on. Instead, Google is some kind of amorphous cloud of machines that somehow magically act as one huge magical system. Those machines are fairly straightforward boxes. They’re unreliable; they break. Google works around problems. It’s magic.

I'm sitting here right now on a computer that is pretty much the same as those gazillion Google computers. But my computer can be dedicated to serving just me, while in the Google cloud I surely get less than one full system dedicated to me. Why can't my personal computer become part of that amorphous Google web-server cloud? The web server on my computer can better serve me individually (online and off) while also being part of the Google mega-computer-cloud. It's the best of both worlds.

Google begins big steps with Google Gears, solving 2 problems (and creating more)

This week Google introduced a beta of Google Gears, taking those first two big steps (in beta form) toward solving the problems that keep browser-based apps from replacing desktop apps. Google Gears is (primarily) a web server that runs on the desktop. In time, I expect Google Gears to tie into the Google cloud. The result is that all Google web applications will work when offline (synchronizing when connected), will become faster (because they will work directly with the local server), and will break out of the sandbox and allow access to data on one's own computer (initially in limited form).

With Google Gears, web apps will become, for most functional intents and purposes, synonymous with desktop apps.

Google extends baby steps with GreenBorder virtualization acquisition

When you look at the Google Gears site, the first thing you see is this:

    This is an early release of Google Gears (BETA). After installation, please pay attention to the warning dialogs and grant access only to websites that you trust.
Once you extend the Google cloud to the desktop, so that any computer is the network, you’ve now introduced some big security holes. As the sandbox is opened and allows more capabilities, it also allows more danger. For instance, a web server on your Windows desktop has full access to delete your entire drive, read financial data, etc… whereas the browser on its own (barring bugs) cannot.

This is where Google’s second big news of the week comes in, their acquisition of GreenBorder. This acquisition has so far been reported as just a tool that protects a browsing session from danger, but I believe it has a deeper purpose that, when tied with Google Gears, gets Google closer to their version of The Last Computer.

If you can virtualize the session, not just of the browser but of the browser combined with Google Gears, then you have two things. 1) A safe environment for web apps that can do nearly anything a desktop app can do, including accessing much of the local computer’s resources, and 2) A virtualized “desktop” that can be neatly bundled, backed up (or cached), and even shared with other computers, so that as you move between computers your virtualized sessions can move with you.

With Google Gears and GreenBorder, can Google now replace web apps?

In combining Google’s news of the week, the whole story is much bigger than the sum of its parts. Google Gears + GreenBorder (+ existing Google apps) ~= Web Apps Replacing Desktop Apps.

That’s a big story. Enough to make Microsoft and Adobe squirm (more on them in future blogs). But this story still has a big problem: problem #4: HTML sucks for making rich and compelling interfaces. The browser, composed of HTML + CSS + Canvas + Plugins + Ecmascript + KitchenSinks, does not make pretty interfaces, at least not without a ton of hacks, memory, and money thrown at it for every application. The browser and its markup are just the wrong tools for making rich applications. Furthermore, the browser is not a thin client as it’s so often billed to be--it’s a fat, clinically obese, needs-a-crane-to-leave-the-bed client.

Close, Google, but no cigar yet

With these recent steps, Google is getting close to The Last Computer, but in some very inelegant ways. At least they show that they understand what The Last Computer is all about. And in their Googly way (with the nearly limitless resources of money and smart people to throw at the problem, even if they are hobbled by inelegant choices of basic components) they’ll probably make it work sufficiently well to become an integral part of most Last Computer systems.



Addendum June 10, 2007:

Wednesday, May 30, 2007

Browser-based thin-client apps: a part of The Last Computer

When I talk about The Last Computer people often counter with the idea that, with the advancements in Web 2.0 Ajax-y applications, there will soon be no need for a personal computer at all because all applications can be online.

For a decade or more many have claimed that the browser will become the ubiquitous front end to all applications, making the OS and even the computer a mere commodity. Following this claim is usually the prediction that these changes are on the verge of putting Microsoft out of business. (If I had a nickel for every million dollars that has been invested into putting Microsoft out of business through a browser thin client, I'd be a very wealthy man.) In the past the claim has always proven to be incorrect. The browser-based apps were just never good enough to replace desktop applications.

Are Web 2.0 browser-based apps now good enough to replace desktop applications? As a test, for the past couple of months I've switched away from my old Outlook program to GMail for all of my personal correspondence (Google's GMail is an example of one of the top-notch browser-based applications using Ajax and CSS and other magic). It was painful at first, of course, as any program switch is, but overall I've been very impressed. GMail is remarkably quick, interactive, and useful, considering that all of the code logic runs in the browser I already have and is updated as needed onto any computer I log on to. From a user-interface perspective, GMail is probably as good as any standalone desktop email reader of 10 years ago.

Yes, Web 2.0 apps are much richer than they were during the last dozen years when web apps were predicted to replace desktop apps. Much better than when we used to have to push each update from a slow server, and didn't have CSS to make things pretty, and had way too many browsers to support, and had to do the equivalent of today's XMLHttpRequest via communicating with separate 1-pixel-tall frames (if the browser supported frames and ecmascript, which many didn't). But they remain only about as good as the equivalent standalone app of 10 years ago.

Here's the thing: If you were to compare GMail side-by-side with Outlook on the same computer, and didn't know GMail was web-based but instead thought it was a standalone application, you'd think "what a crappy email program, slow, ugly, hard-to-use, and featureless--this looks like a 1.0 product of 10 years ago". But when you find out GMail is browser based, you say "wow, all this in a browser? Cool!"

Bad as it is, is GMail good enough to replace Outlook? Yes. For most situations and users it is, simply for these reasons:

  • It's free.
  • No installation or upgrade required.
  • It's available from any internet-connected browser anywhere.
  • No backup worries
  • Did I mention that it's free?
I'll be sticking with GMail for the foreseeable future for these reasons (and because I'm losing my Exchange account as I enter my 3rd retirement soon). I'll get used to it. Google will improve it. It will remain "good enough".

But GMail, and web-based apps in general, are not great, and will never be as great as desktop applications, for these reasons:

  1. Connection Required: They don't work when offline
  2. Slow: Slow with a good connection, and very slow with a bad connection
  3. Sandboxed: Interaction with my real computer, or with any programs not part of the Google environment, is inconvenient (attaching files, sharing the clipboard, etc...)
  4. HTML==PoorInterface: html+css+canvas plus browser differences impose severe limitations on how good anything can look and make it difficult to create a rich interface

Reasonable fixes are coming to all of these problems (except probably the html+css problems). These issues will get better. But standalone desktop apps will also get better.

Summary: Web-based apps are good enough to replace desktop apps in many situations. And we can expect them to improve (although always lagging many years behind desktop apps). But web apps are only part of the solution. For real power applications and users, and for processor-intensive applications like games, browser+html+css+ecmascript+canvas+other-standards-layers is not going to cut it.

Browser-based thin-client apps will be a big part of, but not all of, The Last Computer.

Wednesday, May 9, 2007

Open letter to ACME Corporation to create the Last Computer: Consumer Model

Dear Important Dude at the ACME Corporation:

For a long time I've been dancing around an idea with all the computer geeks I talk to, everyone I work with, and some plain regular people who spend any amount of time on their personal computer. The idea has taken many wrong turns (mostly involving seeing it wrongly as a data backup problem), but now I think it is clear enough that I'd like to share the idea with you in the simple and remote hope that you can make something come of it.

In short:

    An ACME Personal Subscription Computer: The only computer that any typical person will ever need to "own".

The Current Problem:

    The typical person relies on computers more and more: for email, photos, documents, tons of data. But computers are not getting any easier to use when things go wrong: setup is hard, transfer to a new computer is near-rocket-science as far as most users are concerned, data is lost (those precious photos of my grandkids), viruses, crashed discs. Every time one of my friends or relatives gets a crash or needs a new computer even I usually cannot get things up and running quickly, and usually some data is lost in the process. (BTW, if they had an ACME before the crash, the fact that there were problems does not make them any more likely to buy an ACME again! There is not a lot of brand loyalty.)

Short Description of the Solution:

    Customer no longer "buys" a computer, but instead gets a subscription for an ACME Personal Computer from their local shop (e.g. Best Buy, Office Max, Target, etc...). I expect this to cost something like $30/month. For that monthly price they have a working computer, with all of their data, for as long as they pay their subscription. If ever anything goes wrong with the computer, anything, customer simply brings it in to their local Best Buy (or Office Max, etc...), hands it in, and immediately receives a replacement which is identical to or better than the one they brought in. Customer brings the replacement computer home, plugs it in (to monitor, keyboard, and internet), and turns it on. This replacement computer will then churn for a while, and in an hour or two the customer has a working replacement computer with all of their data back. It's almost as if the computer never broke in the first place. Their precious photos of their grandchildren are right there where they left them. Their settings are as they like them. It's almost like magic.

    It is not magic, of course, but simply clever use of automated caching (i.e. back-up) of the computer’s data over their internet connection and without their explicitly having to do anything. They're getting the same computer they used to pay $700 for, but now they're getting it without the old headaches and they're paying for it by subscription. They're happy. ACME is happy.

This has been an extremely truncated description, of course, skipping tons of financial and technical details. I'm always glad to talk on and on about it, if you'd like to.

Like I said, I've covered the issue with computer geeks (who are not the target audience) from a technical angle. But I've also discussed it with regular people (family, in-laws, cousins, friends, kids, and old people) and the idea of paying a small subscription fee for a computer that ALWAYS works and NEVER loses their data is quite appealing. (It's certainly appealing to me, if it means I never again get a desperate call: "Help, my computer has crashed! You're the expert, Mr. Smartypants: fix my computer and save my data!")

Thank you for taking the time to read this far. I think the world will be better off with a computer like this. Maybe ACME can be the company that provides it.

Sunday, April 22, 2007

The Last Computer

The time is near when each of us will purchase (if purchase is the right word) the last computer (if computer is the right word) we will ever own (if own is the right word).

One of the business plans I've been mulling over for a few years has been to create the Last Computer Corporation, to facilitate creation of this Last Computer. I've dropped plans to tackle this myself for a few reasons:

  1. The task is bigger than I'm prepared to tackle now.
  2. I'm convinced of the inevitability of the last computer, whether I am part of that or not, and so I am not needed to make it happen.
  3. Other plans, where I have a greater chance of positive impact, have gained more of my attention.
And so instead of creating the Last Computer, I've created this blog to document others' steps and missteps toward its creation.

Those who can't do, watch.