Why Google Chrome OS will fail (on netbooks)

Matt Barton

I've been keeping an eye on news developments about Google's attempt at a true-blue operating system, and came across an op-ed today called Five Reasons Google Chrome OS Will Fail on, of all things, Google News. The author does a good job of laying out the key challenges Google will face on the netbook front, and he draws the obvious parallel between Chrome OS and Linux. I have stated my belief that Linux (or at least some free operating system) will eventually topple Microsoft, but so far that belief has had no basis in reality. Microsoft and, of all things, the Macintosh still reign supreme, with Linux and other free OSes representing only a tiny sliver of the pie. I guess Google felt that the netbook market was fairly open and the best place to charge in, and I have to agree. But just because it's the best place to charge doesn't mean they stand a chance.

I think the key here is the "net" in "netbook." While most people associate Microsoft with PCs, they tend to associate Google with the internet. Indeed, many people I talk to seem to think Google IS the internet. Having a dirt-cheap netbook out there with Google's branding all over it might well inspire those people to see it as a legitimate and trustworthy product for doing what a netbook should do: surf the net and take advantage of the many net services that are now rather shabbily (ahem, Google Docs) trying to match Office (Microsoft's or otherwise). In my opinion, that latter angle is the wrong one to take. Google Docs is just too far behind. Every time I use Google Docs I keep thinking--man, this is it? I would have expected it to be at least three or four times as good as it is now after regular updates and patchwork. It still feels like a 40% beta from a small company. Sure, it has great features, especially on the collaborative side, but it just feels cheap, and the formatting never seems to work out. I don't expect it to have all the redundant and seldom-used features of Word, but it should at least be stable and cause no grief when it comes to exporting and importing across formats. The silly thing doesn't even seem to handle tabs appropriately, much less tables.

But, anyway, I digress. I guess the key question about netbooks is whether people really need them, even at a fantastically low price. Let's say it was $100; that's about as cheap as I can imagine at the moment for something with a color screen, a working keyboard, and good enough innards to run Google Docs and YouTube. I think you'd be looking at just barely better web functionality than you can get on a good mobile phone, and certainly nowhere near what you'd get on a budget or used laptop from eMachines or what have you. That said, if they could get it down to $50, a lot of possibilities start opening up, especially for kids, travelers, grandparents, and the like. I mean, at that price you just buy one for a trip and throw it away when you're done; we're getting into "disposable" territory here. It'd be like buying a disposable camera: sure, you're not going to get rid of your HDV camcorder, but if you just want a quick and dirty solution, it's great. If you're with the family at the beach and want to surf the net while your kids surf the waves, $50 isn't so bad for something that will end up getting wet and sandy. So, yeah, I can see some definite potential, but so many "ifs."

Comments

Bill Loguidice

To my thinking, there's a significant two-part reason why Linux and, to a lesser degree, MacOS will not topple Windows any time in the foreseeable future. The first part is the fact that the world is standardized on Microsoft Office, and no amount of compatibility offered by other products will topple that. You'll need a genuine copy of Microsoft Office to play in the business/professional world properly, except in rare cases where you can get by with a functional clone. That leaves you running Windows or MacOS only, or some type of user-unfriendly, performance-unfriendly compatibility layer on the Linux side.

The second part is one of drivers and compatibility. Every device supports Windows first, Mac second, and everything else only if you're lucky, or only unofficially. Until there's some type of unified driver layer that works on EVERY flavor of Linux (and thus would make sense for vendors to officially support), Linux doesn't stand a chance with Joe Consumer, who won't go to the effort of tracking down drivers and functionality for their new Canon multifunction printer, for instance. Even I'm in that camp. I'd love to move to Linux or some other OS, but I'm always held back by compatibility concerns. I need to be able to use my stuff!

In the world of netbooks you have two basic choices right now: Linux or Windows XP. Linux, in this scenario, is for when you want a lean OS and don't need to connect to any specific peripherals or be compatible with any Windows software. It's a web OS, and an OS for those who want to put in the time and effort to make it something more. Windows XP is the leanest modern version of Windows, and it offers the same functionality as any other Windows system when that's what you need. So where does Chrome OS fit in? It can only hope to replace the Linux option, not the Windows XP option, which will become the Windows 7 option once that's available. Again, two different OSes, two different approaches, two different audiences. While the Windows netbook/laptop can do 100% of what the Linux/Chrome OS netbook/laptop can, the reverse is not true.

So to my mind, the markets, because of the needs of the users, will remain bifurcated on the netbook side, with no danger at all on the full-sized laptop/desktop side, where Windows, Mac, and Linux/other will remain a clear 1, a distant 2, and a mostly insignificant 3, respectively.

Books!
Bill Loguidice, Managing Director | Armchair Arcade, Inc.

Matt Barton

It's really surprised me how poorly Linux has done these past few years. I liked to use the analogy of open architecture on the hardware side, and how that allowed the IBM PC "compatible" to dominate--surely an open architecture on the OS side would have the same effect. But sadly, that doesn't seem to have happened. I guess the main issue is that open hardware still allowed companies to make money, whereas open-source software doesn't seem to offer that. Sure, you can still charge competitively for service and support, but that doesn't seem to be enough. We've kind of glommed on to this idea that software is a product, not a service, and we're just beginning to see some business models on the fringes trying to change that (mostly via subscription services and perhaps in things like antivirus software). It also doesn't help that there are so many different Linux distros out there and none of them seem to have a slick spokesperson or marketing team. It's still stuck in that uber-geek community, many of whom are downright hostile to anyone who hasn't drunk the Kool-Aid. Yeah, I just don't see it happening anytime soon.

I can see it in my head, though. You just buy a year or so of support for your OS, perhaps with a "lifetime support" option (I mean, what are we really talking about here, 8 years, tops?), and that gets you full support and updates for your "free OS," as well as funding future development and code for all the problematic devices and compatibility issues. I just don't believe Canon or other hardware makers are anti-Linux. If they see that Linux is worth supporting, they will provide the drivers, or at least make it easy for others to. If they don't, and some rival company does, they lose market share. Also, I don't think the support would be cheaper than Windows. You'd probably end up paying more in the long run, actually, since Microsoft usually provides support for a ridiculous number of years anyway. With what I'm talking about, you'd probably pay at least $100, maybe $200 a year, and that would add up over time depending on economies of scale (i.e., the reason MS can charge so little is that they sell so many copies).
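To put rough numbers on that, here's a quick back-of-envelope sketch in Python. Every figure in it is an assumption for illustration only--the license price, the per-year support fee, and the machine lifetime are mine, not real pricing:

# Back-of-envelope comparison: one-time OS license vs. yearly support
# subscription for a "free OS." All figures are hypothetical.
windows_license = 200      # assumed one-time cost of a Windows license
support_per_year = 150     # assumed yearly support fee for the free OS
years_of_use = 8           # the "lifetime" of a machine, as guessed above

subscription_total = support_per_year * years_of_use
print(f"One-time license: ${windows_license}")
print(f"{years_of_use} years of support at ${support_per_year}/yr: ${subscription_total}")
# With these made-up numbers the subscription costs several times more over
# the life of the machine, which is the economies-of-scale point above.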

In any case, I'm convinced that you can't beat Windows with a cheaper product, a better product, or a "freer" product. People just don't care about those things. Look at how long DOS was able to persist when everybody else had GUIs, and it's not like DOS was cheap. I think you're 100% correct about compatibility and familiarity trumping all else. People just don't want to deal with compatibility issues or learn anything new unless it's absolutely necessary. You could have a Linux OS out there that was leaner, faster, and in every way put Vista or Windows 7 to shame, give it away for free, and have people on every street corner touting it, but it wouldn't do a damn bit of good. The only way it can possibly succeed is if there is a for-profit company behind it with a real marketing team and a product they believe in--and enough capital to ensure that they can overcome obstacles such as drivers and provide decent support. Is Google that company?

Bill Loguidice

That's the thing: there has to be "one" Linux for Linux to have a chance to succeed, and the fact of the matter is there isn't and can't be, by the very nature of what Linux is supposed to be. While your hardware analogy is a good one with regard to one of the main reasons the PC standard became the standard, the fact of the matter is that it was all essentially the same hardware. It's not all exactly the same Linux.

More and more, the browser is the OS (to be cliché for a moment), but until that final layer of interface is cracked (where peripherals are designed to work with it), the browser will still need that powerful, single platform, which for better or worse IS Windows (which all peripherals are designed to work with).

Really, for 99% of the population, it doesn't matter if it's Windows, Mac, or Linux, as long as the thing works. Frankly, I don't care either, but for me to have what I deem a 100% system, it STILL has to be Windows-based. Hell, even if my next system were a Mac, it would still dual boot into Windows 7. And frankly, if I'm going to go to that effort, I may as well just get a Windows 7 system and have a Mac as a secondary system.

Books!
Bill Loguidice, Managing Director | Armchair Arcade, Inc.

Catatonic

In case you missed it, the Chrome OS is Linux-based. And it seems that the web browser will be the only application.

Matt Barton

Well, I'm wary of anything that suggests the common Joe or average man or what have you is an idiot. They're usually a good deal smarter than we (and I include myself in that) give them credit for. If the public hasn't latched on to Linux, it's probably not so much that they're idiotic or clueless as that Microsoft Windows is simply better for them.

What I'm seeing now in the stores is starting to scare me, to be honest. The local Wal-Mart used to have a whole aisle of PC games. Now it's shrunk to about 1/5 of an aisle and moved to a less accessible space--AND it has to share that space with apps and productivity software. Everything else is consoles, with about 75% of it going to Japanese systems (console or handheld). It's not like the Xbox or Xbox 360 is really dominating here either. This leads me to a conclusion that I don't like at all: most people who buy a computer, or even many who buy consoles, never buy any software for it. They just use what is on the machine and that's it. I heard that from someone, maybe during a woot interview--that most people only buy 2 or at most 3 new games for a console after they buy it, period.

Who cares? Well, I think this might be a spot where netbooks could make a difference. Even if people aren't interested in going to Wal-Mart and buying new games or software, they might be lured in by micro-transactions and casual-market stuff. I've seen it happen. My mother-in-law and grandpa, for instance, have bought several casual games, and I'm pretty sure they've never even been to the computer section of Wal-Mart. In short, the key here may be not just the OS or the netbook or the price, but making it really easy and cheap to buy additional software. That might make a difference. But again, if everything is going to be browser-, Java-, or Flash-based, the OS is almost redundant anyway.

Bill Loguidice

While it's true that PC games retail is generally weak - all of the money is in a select group of casual titles, Sims products, and MMOs - your assessment of the console side is rather off. Attach rates vary by system, but it's well north of five games. You have to remember that, just like on the PC side, there is a robust online game delivery system for every console and most of the handhelds (and soon all of the handhelds). Things like Steam on the PC side aren't even factored into PC game sales, and that's a huge market. The X factor on the console side is of course the Wii, which has attracted an unprecedented casual (for lack of a better term) audience, who may in fact get "bored" of the whole videogame thing after the fresh initial thrill, lowering the attach rate. The other factor, of course, is that the Wii is home to arguably an overly high percentage of shovelware and mini-games because of its audience, which tends to eat up anything with a familiar license or branding. The PC, being an open platform, is a reflection of all sides, which is why it's equally full of shovelware, casualware, and hardcore stuff.

I don't think anyone is calling Joe Average stupid, but the fact of the matter is that the dominant platform is nearly always better for those who don't wish to get down to the nitty-gritty details and put lots of work into it, because the most stuff will be designed for it. In the case of PCs, that means dealing with Windows over Linux. I don't think there's any way to argue against that. The path of least resistance is where most people (myself included) prefer to go, and in the vast majority of cases that's with a Windows-based system. As a secondary system, really anything web-enabled and able to install browser extensions is more than adequate, and that certainly includes anything running Linux. That's probably Linux's future and greatest strength: being in a mass of secondary and tertiary systems and products. Eventually the OS won't matter and the office suite won't matter, but that time is no time soon. Old paradigms die hard.

Books!
Bill Loguidice, Managing Director | Armchair Arcade, Inc.

Matt Barton
change
Bill Loguidice wrote:

While it's true that PC games retail is generally weak - all of the money is in a select group of casual titles, Sims products, and MMOs - your assessment of the console side is rather off. Attach rates vary by system, but it's well north of five games.

That may very well be, but I could've sworn someone spouted out that figure. I'd need to see some studies or something before I thought otherwise; we have to remember that our immediate group here is very different from the casual gamer. Heck, most of us probably buy at least one game a week, maybe 5 per month, easily.

Bill Loguidice wrote:

I don't think anyone is calling Joe Average stupid, but the fact of the matter is that the dominant platform is nearly always better for those who don't wish to get down to the nitty-gritty details and put lots of work into it, because the most stuff will be designed for it. In the case of PCs, that means dealing with Windows over Linux. I don't think there's any way to argue against that. The path of least resistance is where most people (myself included) prefer to go, and in the vast majority of cases that's with a Windows-based system. As a secondary system, really anything web-enabled and able to install browser extensions is more than adequate, and that certainly includes anything running Linux. That's probably Linux's future and greatest strength: being in a mass of secondary and tertiary systems and products. Eventually the OS won't matter and the office suite won't matter, but that time is no time soon. Old paradigms die hard.

That may be true, but the habits have to form. It becomes a bit circular if you just say IBM PC and compatibles became the standard because they were so popular. You can say that about Microsoft, too--they are popular because everybody has it--but it still doesn't really explain why people were willing to move to DOS and then Windows in the first place. Somehow, the IBM PC had a critical mass that allowed for enough people to adopt the platform that it made sense to try to clone it and start competing for that userbase's dollars. I could be wrong, but I don't see any real advantage the IBM PC had over the competition in terms of usability or compatibility when it came out. Of course it did have advantages for business and serious apps, but I don't really see how those would have been that big of a deal to Joe Schmoe.

I think the critical era was really in the mid-80s, when you had systems out with colorful GUIs and multimedia out the wazoo, and the clones were still chugging along lucky to have anything beyond monochrome and a PC speaker. There's a ten-year gap between the first Macintosh and Windows 95, for instance, and I definitely didn't see a mass migration from DOS to Windows 3.1 (though your experience may be different). Heck, I remember people refusing to go to Windows 95, and there are STILL people who insist that DOS is far superior (of course most of those have now moved on to Linux and extol its shell until your ears fall off).

So, anyway, what I'm saying is that there was a huge gap there in usability and at least a reduced gap in compatibility. My guess is that if you were to tell someone in 1985 that the Atari ST, Amiga, and Macintosh would all fail horribly compared to MS-DOS clones in every sector--even though they wouldn't get a true GUI-based OS for ten years--they would have laughed at you. Personally, I would have found it much more plausible that Macintosh would become the standard.

The one thing that always comes to mind when I think about stuff like this is the QWERTY keyboard. That keyboard arrangement was designed to slow you down, because when it was invented, there was a problem with people typing so fast that it jammed up their typewriters or some such. However, out of force of habit, we've maintained that layout, even though other layouts like the Dvorak are far better. That case really kind of makes it clear what you're saying about habits dying hard. But they obviously do sometimes change, or we'd still be using typewriters and not computers. ;)

Greg (not verified)
New habits

Your post is great. It makes me contemplate why people use what they do. It seems the new "buzz" among marketing types (well, maybe not that new) is that something has to offer 10 times (10x) what the current offering does for people to adopt it. The Dvorak keyboard point was interesting -- maybe it was only 8x faster :)

I know that I loved the Amiga for the preemptive multitasking, but I suppose that wasn't enough. This all feels like a new Malcolm Gladwell book. Have a good day and I love reading your posts even if I am a bit of a blog lurker!

Greg

Bill Loguidice
Comments
Matt Barton wrote:

That may very well be, but I could've sworn someone spouted out that figure. I'd need to see some studies or something before I thought otherwise; we have to remember that our immediate group here is very different from the casual gamer. Heck, most of us probably buy at least one game a week, maybe 5 per month, easily.

Here is your link on attach rates, and you can drill down to the actual data: http://forum.pcvsconsole.com/viewthread.php?tid=18305

Matt Barton wrote:

That may be true, but the habits have to form. It becomes a bit circular if you just say IBM PC and compatibles became the standard because they were so popular. You can say that about Microsoft, too--they are popular because everybody has it--but it still doesn't really explain why people were willing to move to DOS and then Windows in the first place. Somehow, the IBM PC had a critical mass that allowed for enough people to adopt the platform that it made sense to try to clone it and start competing for that userbase's dollars. I could be wrong, but I don't see any real advantage the IBM PC had over the competition in terms of usability or compatibility when it came out. Of course it did have advantages for business and serious apps, but I don't really see how those would have been that big of a deal to Joe Schmoe.

I think the critical era was really in the mid-80s, when you had systems out with colorful GUIs and multimedia out the wazoo, and the clones were still chugging along lucky to have anything beyond monochrome and a PC speaker. There's a ten-year gap between the first Macintosh and Windows 95, for instance, and I definitely didn't see a mass migration from DOS to Windows 3.1 (though your experience may be different). Heck, I remember people refusing to go to Windows 95, and there are STILL people who insist that DOS is far superior (of course most of those have now moved on to Linux and extol its shell until your ears fall off).

What I was saying is not circular. To my thinking (and others'), the PC won out for three reasons: one, any company could make a PC clone; two, with so much competition within the PC world from all the clones, there was a constant push for the next big thing to advance the state of the art on the platform; and three, PCs were used in the office. Eventually these three things overwhelmed the one-company, niche approach of the competition. Apple survived only because they had the most compelling niche for the era (desktop publishing, versus video (Commodore) and audio (Atari) production), plus far fewer losses from side projects than Commodore and Atari, who seemed to want to play in every market segment. Apple also carved out a niche in educational institutions, unlike Commodore and Atari, who were never able to fully crack that nut (particularly Atari).

So of my three points, to provide more detail: one, there were a huge number of PC clones, which eventually drove prices down to levels matching Atari and Commodore, so those platforms' price advantages were minimized. Two, while the PC started out as technologically inferior to everything else out there, it eventually caught up to and surpassed the other platforms in the two most immediately obvious areas - graphics and sound. Of course, it was also the first computer platform to more or less standardize on hard drives, and to me that's a huge factor on the technological side, as we're all well aware of the benefits of hard drives over everything being disk-based. And three, PCs were used in the office. Businesses are notorious for going the no-frills route and being averse (for both good and bad reasons) to the latest and greatest. Command-line systems were proven and familiar; GUIs were new and untested, and frankly, on the earliest systems, offered a performance hit for business applications. GUI-based systems never got the momentum while they still could, the PC platform became entrenched in businesses, and it flowed down to the home desktop as the other points improved over time, namely cost and more home-friendly technology.

In short, it's been proven time and again that people don't necessarily go with the best; they just go with what's good enough and stick with it until they HAVE to make a change or there's something so amazingly good as a replacement that there's no way they can resist. DOS and Windows 3.1 were good enough in comparison to the competition, and eventually Windows 95 and its descendants erased the final barriers.

Matt Barton wrote:

So, anyway, what I'm saying is that there was a huge gap there in usability and at least a reduced gap in compatibility. My guess is that if you were to tell someone in 1985 that the Atari ST, Amiga, and Macintosh would all fail horribly compared to MS-DOS clones in every sector--even though they wouldn't get a true GUI-based OS for ten years--they would have laughed at you. Personally, I would have found it much more plausible that Macintosh would become the standard.

Again, the GUI was something new, unproven, and somewhat scary. In fact, at times, it was derided versus the more "professional" command-line interface. Also, let's not overlook Apple ALWAYS charging a premium back then, even more so than they do today. They were famous for 30% or greater profit margins on their computers, something few companies could ever get away with. In short, their computers cost way too much and always have. Only in recent years have they loosened that up a bit.

Matt Barton wrote:

The one thing that always comes to mind when I think about stuff like this is the QWERTY keyboard. That keyboard arrangement was designed to slow you down, because when it was invented, there was a problem with people typing so fast that it jammed up their typewriters or some such. However, out of force of habit, we've maintained that layout, even though other layouts like the Dvorak are far better. That case really kind of makes it clear what you're saying about habits dying hard. But they obviously do sometimes change, or we'd still be using typewriters and not computers. ;)

Again, it's what I said above. The QWERTY arrangement is good enough and it's what everyone knows, so the system sustains itself. It would require huge amounts of education and training to get people to learn Dvorak, and for what, just so they can type 50 WPM versus 35 WPM? 35 WPM is good enough, so why change? Now if something came out that allowed you to type 100 WPM versus 35 WPM, then you might have something there, but then you'd still be working against the existing infrastructure of keyboards and devices in the QWERTY format. Hell, we couldn't even get this country on the metric system when it was a government initiative, because there was too much resistance to change. Imagine trying to change something that people use every day now.
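To make the "good enough" math concrete, here's a quick illustrative calculation in Python. The WPM figures come from the discussion above; the daily typing load is an assumption made up for the example:

# Rough estimate of time saved by a faster layout. The typing volume
# below is an assumed guess, not a measurement.
words_per_day = 2000          # assumed casual typing load
qwerty_wpm = 35
dvorak_wpm = 50

minutes_qwerty = words_per_day / qwerty_wpm
minutes_dvorak = words_per_day / dvorak_wpm
print(f"QWERTY: {minutes_qwerty:.0f} min/day, Dvorak: {minutes_dvorak:.0f} min/day")
print(f"Saved: {minutes_qwerty - minutes_dvorak:.0f} minutes a day")
# Roughly a quarter-hour a day for a casual typist -- arguably not worth
# relearning a layout and fighting the installed base of QWERTY hardware.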

Books!
Bill Loguidice, Managing Director | Armchair Arcade, Inc.

Matt Barton
Progress
Bill Loguidice wrote:

Here is your link on attach rates, and you can drill down to the actual data: http://forum.pcvsconsole.com/viewthread.php?tid=18305

It looks like 3-5 was too low; it's more like a spread of roughly 4-11, with some systems getting as low as 3.5 and others as high as 11. It's interesting how the Xbox has such a high attach rate; my guess is that it has something to do with the type of gamer it attracts (including many ex-PC gamers who are used to buying lots of new games). I would've thought the DS would have more, though. Nintendo is who I mainly had in mind; I can't tell you how many of my friends had an NES and never bought any games, only playing Mario & Duck Hunt with the occasional rental.

Bill Loguidice wrote:

as we're all well aware of the benefits of hard drives over everything being disk-based. And three, PCs were used in the office. Businesses are notorious for going the no-frills route and being averse (for both good and bad reasons) to the latest and greatest. Command-line systems were proven and familiar; GUIs were new and untested, and frankly, on the earliest systems, offered a performance hit for business applications. GUI-based systems never got the momentum while they still could, the PC platform became entrenched in businesses, and it flowed down to the home desktop as the other points improved over time, namely cost and more home-friendly technology.

That's probably it, there--the idea that a command-line interface was somehow more "professional," and of course the hard drives were a huge factor. It's amazing to me that other machines went so long without internal hard drives being standard. Heck, I didn't get one until we purchased an Amiga 3000 back in '92 or so, and it was limited to a paltry 50 megs (paltry even by the standards of the time). Still, being able to run a game off the hard drive instead of floppies was amazing; it sped up pretty much everything. If we'd had even a 25-meg drive in the A1000, there's no telling how things might have gone differently.

Bill Loguidice wrote:

Again, the GUI was something new, unproven, and somewhat scary. In fact, at times, it was derided versus the more "professional" command-line interface. Also, let's not overlook Apple ALWAYS charging a premium back then, even more so than they do today. They were famous for 30% or greater profit margins on their computers, something few companies could ever get away with. In short, their computers cost way too much and always have. Only in recent years have they loosened that up a bit.

Does Apple really charge that much? Holy cow. No wonder they're niche. I've heard rumors off and on that they might open up a bit, but I guess they've been successful at squashing all the Apple clones (Franklin, anyone?). It'd be nice if they did what Microsoft did and just sold their OS and let anyone manufacture Macs, but they always seem to have been in the hardware business. I've always seen them as more a case of style vs. substance myself, but their fans really seem to enjoy them. At least they have beautiful screens and look great on the desktop. ;)

Bill Loguidice wrote:

Again, it's what I said above. The QWERTY arrangement is good enough and it's what everyone knows, so the system sustains itself. It would require huge amounts of education and training to get people to learn Dvorak, and for what, just so they can type 50 WPM versus 35 WPM? 35 WPM is good enough, so why change? Now if something came out that allowed you to type 100 WPM versus 35 WPM, then you might have something there, but then you'd still be working against the existing infrastructure of keyboards and devices in the QWERTY format. Hell, we couldn't even get this country on the metric system when it was a government initiative, because there was too much resistance to change. Imagine trying to change something that people use every day now.

Yes, and look at who is most mired in the old system--government-subsidized industries (i.e., farms). You don't buy a "gallon" of Coca-Cola; you buy a 2-liter. You only buy a "gallon" of old farm stuff like milk. You get the same nonsense in almost every government-subsidized industry: old-fashioned, wasteful nonsense that would be improved a hundredfold if they just cut the umbilical cord and forced them to make it or break it on their own.

Back when we had government-subsidized computing, we made only incremental progress.

BTW, that Tipping Point book looks interesting. I think I'll have to add it to my reading queue. Re-reading "All Quiet on the Western Front" now.
