Categories
Uncategorized

Tux is on a cereal box?

The original Tux
Linux, for all you Windows and Mac users out there, has a mascot. The mascot’s name is Tux. Here’s the original Tux. I’ve pretty much never seen this original mascot associated with anything other than Linux.
Revised Tux
This is a more hip, revised Tux. You can see a whole bunch of different kinds of this revised Tux at Tux Factory. You can find Tux as Zelda’s Link, Tux as the devil, Tux as a soccer player, Tux as Chewbacca, etc. That’s all lots of fun.
Cereal Tux
I was shocked, though, to find this revised Tux on a real cereal box in a real grocery store. Thanks to my Android phone, I was able to capture the moment. Has anyone eaten this cereal? Is it any good?


You mean products fail for other reasons?

If you read recent press coverage of Google’s Nexus One, it all seems to make sense. A phone sold only online, with no chance for people to try it in person in a brick-and-mortar store, wasn’t going to sell well. There wasn’t an advertising campaign for it. Very few articles or blogs about the end of the Nexus One seem to think there was a problem at all with the phone itself. No one says the phone wasn’t ready for consumers or that it was too difficult to use.

Yet two years ago, when Asus was just starting to be successful with the Eee PC netbook (which came preinstalled with a version of Linux, which Microsoft had to stop right away by resurrecting XP for the first of many times to come), that’s exactly what a lot of the press coverage assumed. Geez. I mean, the lack of an advertising campaign or of in-person models to try out in the store couldn’t have anything to do with Linux netbooks not selling. It must be that Linux is too hard to use. It must be that Linux isn’t ready for consumers. It must really be that consumers just prefer Windows when given the choice.

Well, there is some truth to that in that the Linux distro Asus chose to put on the Eee PC was essentially crippled (not at all like Ubuntu, PCLinuxOS, Fedora, Debian, OpenSuSE, or any of the other popular distros of the time). It wasn’t even vanilla Xandros. It was a custom Xandros that could be customized only through pasting cryptic commands in the terminal.

Nevertheless, if they’d marketed it correctly, Linux could have been a success. The problem with Linux on “the desktop” (or the laptop or netbook) is the myth of meritocracy. You don’t win by being the best. You win by marketing.

Think about it.

When the iPad was announced, critics focused on the features it didn’t have (no webcam, no Flash, no USB ports), but Apple with its clever marketing department convinced the hordes that the device was magic, so the hordes bought it. If a Linux tablet had been released without Flash, people would have just laughed and said “This is the reason Linux will never succeed—they need to realize the masses use Flash.” But Apple releases a tablet and all of a sudden people are actually saying Flash isn’t necessary. HTML5 is suddenly the wave of the future. Apps for websites are suddenly better than just going to the websites themselves.

I also see a lot of Linux pooh-poohers claim Linux doesn’t have any apps, and that Windows users have certain killer apps they need, and that’s why Linux won’t succeed. Well, when Android first started, it had very few apps. In fact, for the end of 2008 and all through 2009, iPhone fanatics kept pointing out how many tens of thousands of apps the iTunes App Store had compared to the few thousand Android had. Well, Android now has almost 100,000 apps. If this pace continues, the iTunes App Store and Android Market will probably have the same number of apps by this time next year. The Linux desktop (as opposed to server or embedded) has been around since… the late 90s? Android has been around since 2008. The Linux desktop isn’t mainstream but Android is.

What should we learn from all this? Marketing matters. Being able to test a physical product out yourself matters. Dell selling badly marketed (or even anti-marketed) Ubuntu models on its website isn’t going to sell Ubuntu preinstalled in great numbers, nor are relatively obscure vendors like System76 or ZaReason without a proper storefront or brand-name recognition.

I would love it if all the bugs in Ubuntu (or some other popular Linux distro) could be fixed. I would love it if some more attention would be paid to ease of use or to making more applications available in the software repositories. I would love that. But that won’t fix Bug #1. If Linux wants to make a dent in the desktop/laptop/netbook world, it needs to give up the idea of being good enough and start embracing the idea of crafting, shipping, and marketing a product—yes, one people can try out in a brick-and-mortar store. In other words, what I said two years ago is still true.


Ubuntu on a Macbook Pro

I’m not abandoning Mac OS X, but you knew it had to happen—I have installed Ubuntu on the Macbook Pro as a dual-boot. It hasn’t been easy, mind you. Previously, I had done a few dual-boot setups with Ubuntu and Windows or Ubuntu and some other Linux distro or even Ubuntu and an older version of Ubuntu. Ubuntu on a Macbook Pro is a totally different experience.

So first I went to the Ubuntu wiki to find out if it was worth my time. According to the Macbook Pro 3,1 page, everything works pretty much out of the box with Ubuntu 10.04 (Lucid Lynx). That was encouraging. Then I read up on the generic Apple Intel installation instructions. They didn’t sound too complicated. Install rEFIt, repartition the hard drive, install Ubuntu in the new partitioned space. Easy, right? Well, not so easy. Here are a few bumps I encountered along the way:

  • rEFIt didn’t install correctly. After you install it, you should reboot and see the rEFIt menu. No menu. So I had to do some digging and found out there is a script you can run in the terminal to sort of reinstall rEFIt.
  • I couldn’t resize my hard drive through Disk Utility or BootCamp. Both failed, claiming there wasn’t enough free space, even though there was plenty (at least 70 GB after I backed up my files to an external hard drive and deleted them, planning to copy them back later). So, believe it or not, I took the hours to completely reinstall Mac OS X from scratch and then repartition the drive.
  • Since Ubuntu can’t reliably write to HFS+, I put my music, pictures, etc. on a shared FAT32 partition. Unfortunately, iTunes doesn’t really dig that. If I try to skip to the next song, I get about five seconds of the rainbow circle of death before the next song will actually play. The symlinks from the FAT32 partition also broke at first, because initially it was mounted as /Volumes/Storage but then it suddenly became /Volumes/STORAGE. After fixing everything to point to the upper-case mount point, the links appear to be working again.
  • Ubuntu would not install the first five times I tried. That’s right. I tried five times. It kept failing in the middle of the installation, claiming the CD was bad or the CD drive was bad or the laptop was too hot. All of those things could have been true to some degree. The CD had a little bit of dirt on it, which I tried to clean off but couldn’t get completely clean. The CD drive was definitely bad. In OS X it was pretty good at reading commercially produced CDs and DVDs but would sometimes reject homebrews (it would spin and try to read for a minute or two and then just spit the disc out). Also, unlike my wife’s new Macbook Pro, this old MBP overheats like nobody’s business. You could probably fry an egg on it. Eventually, I did something that worked, and I’m not sure which part of it did it. I turned the computer off for the night (let it cool down completely). Then I immediately booted it up and while Ubuntu was installing, I never left it alone. I played gBrainy. I looked in the file browser. I changed various settings. I didn’t let the CD rest and give up. So I don’t know if it was having it cool or constantly engaging the live CD session, but eventually Ubuntu did get installed.
  • I installed the Nvidia driver, but then Hardware Drivers instructed me to use a more recent one. After that, suspend wouldn’t resume. Once I removed the old driver and rebooted, though, resume from suspend worked fine, as did Compiz.
  • The touchpad works extremely well for two-finger scrolling, but the touch sensitivity is a bit much (and can’t be adjusted, as far as I can tell), so I have to be careful not to tap the touchpad accidentally when trying to scroll; otherwise, I end up clicking. If I turn off tapping to click, then I can’t right-click by tapping down two fingers. A bit annoying.
  • Control is a rather small key on the Mac keyboard, but for most navigation it’s used more often than the Cmd key (the Super key, for all intents and purposes). The key placement is a bit odd when you’re coming from Mac OS X or even from a regular Windows keyboard. It takes a bit of getting used to.
  • I thought Skype was broken, but it wasn’t. I set my account to offline instead of invisible, and apparently if you’re offline you can’t do the Skype test call (it just fails immediately). I didn’t know that, so I was trying all these crazy fixes like uninstalling PulseAudio or whatever. Turns out it just works fine if you’re invisible or online.
  • The Picasa from the Google repositories is broken with the latest Lucid kernel. If you download the .deb straight from Google, though, it works just fine.
  • I had a 32-bit Ubuntu CD already, so I didn’t really want to bother downloading 64-bit Ubuntu to take advantage of all 4 GB of RAM (and waste another blank CD, since Macs can’t boot from USB). I guess that would have been interesting to try, but 32-bit works quite snappily with only a bit more than 3 GB of RAM being recognized.
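Incidentally, the broken symlinks from the mount-point case change can be repointed in bulk from the terminal. Here’s a rough sketch of the idea—the paths and directory names are illustrative, not my exact setup:

```shell
#!/bin/sh
# Simulate the problem: a link that still points at the old mount-point case.
mkdir -p demo
ln -sf /Volumes/Storage/Music/song.mp3 demo/song.mp3

# Repoint every symlink under demo/ from /Volumes/Storage to /Volumes/STORAGE.
find demo -type l | while read -r link; do
  target=$(readlink "$link")
  case "$target" in
    /Volumes/Storage/*)
      newtarget="/Volumes/STORAGE/${target#/Volumes/Storage/}"
      ln -sf "$newtarget" "$link"   # -f replaces the stale link in place
      ;;
  esac
done

readlink demo/song.mp3
```

(On HFS+ the filesystem is usually case-insensitive, which is exactly why the mount point can silently flip case while the symlink targets stay literal.)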

Overall, I have to say Ubuntu works quite well on a Mac. I think it even runs a bit cooler, too (still very hot but maybe not hot enough to fry an egg on). My plan is to keep playing around with both (sometimes boot into OS X, sometimes boot into Ubuntu). With a FAT32 partition for files, I have that luxury, except that I will have to be in OS X to import into iTunes and iPhoto, whereas Rhythmbox and Picasa on Ubuntu will automatically watch folders for new files.


Made the move to Mac

As a follow-up to Why I might switch to Mac from Ubuntu, I did actually get a Mac… or, more precisely, my wife got a new Mac, and I inherited her old one.

Clarifications
Unfortunately, it seemed some of the commenters on that entry brought their own agendas and grudges without actually reading what I wrote. I have tried other distros, many of them, in fact—probably at least 20 distros over the past five years. You can read about some of my more recent failed attempts at trying non-Ubuntu distros. Two of my reasons for switching had nothing to do with Ubuntu specifically—that there were hardware regressions in the Linux kernel (and bugs in other upstream packages), and that the whole approach of the operating system development being wholly independent of the hardware development is a flawed approach if you want to increase adoption (which, incidentally, Ubuntu is trying to do, and not all Linux distros are).

To those who claim Macs “just work,” I have to disagree. For more details, read Macs are just computers, not magic and Macs are computers, not magic (part 2).

In terms of what happened in getting the new Mac, it’s been an interesting mix of positives and negatives (Can you believe it? Macs are not the holy grail, nor are they the devil incarnate).

The Apple Store
One of the nice things about the Apple store is that there are a lot of display models of various Apple products you can try out. So my wife and I got to spend considerable time playing around with the new Macbook Pro before we decided on purchasing it. More importantly, the sales staff appear to be trained on finding the right balance between being unavailable and being oversolicitous. A few annoying things about the sales staff, though:

  • They assume you know nothing about Macs, even if you are a long-time Mac owner (as my wife is).
  • They aren’t overly pushy, but they do try to upsell you (AppleCare, training programs, iWork, etc.).
  • They take every opportunity to bash so-called “PCs” in side comments (and by PC, they mean Windows PCs, because, as we all know, Macs aren’t personal computers, and Linux just doesn’t exist, nor does FreeBSD). Want to know where the stereotype of Mac users as being snobby zealots comes from? It comes from the Apple store employees (and from the “I’m a Mac, I’m a PC” commercials). I like Mac and Linux and Windows. Is that a crime to like all three?

The Migration Experience
At home with the new Mac, we used the Migration Assistant to move my wife’s files, settings, and applications over to the new computer. I don’t know who at Apple is in charge of the Migration Assistant, but that person needs to be replaced. First, it prompts you to make the transfer via firewire. The new Macbook Pro doesn’t come with a firewire cable, though. We had an old firewire cable from an external hard drive, but apparently that’s the wrong kind. We tried to do the transfer via ethernet. We soon realized that was a mistake, as the transfer was going to take three hours. Unfortunately, Migration Assistant is set up so that you can’t do anything else on the computer while the migration is happening, and the time remaining arbitrarily goes up, stands still, or randomly drops. At one point, it said it was going to take four hours. So we canceled it by killing the Migration Assistant on the source Macbook Pro and then forcing a shutdown on the destination Macbook Pro. Then we did the Migration Assistant again but this time with just the settings and applications (not the files). The files we copied over manually from an external hard drive backup afterwards (during that copy, my wife could actually use her new computer).

Apart from the Migration Assistant process being godawful, the migration result itself is pretty good. The setup was exactly the way she had it on her old computer. Wireless keys remembered. Dock configured in the exact same way. Mail with all IMAP accounts set up. Wallpaper the same. It was an exact replica of her account on the old Mac. All the programs worked, including CS3 (I thought maybe that might need a new activation key or something).

Unfortunately, one thing that didn’t work (and this points to a major usability issue with Mac OS X, which is being able to resize windows from only one corner) was her window setting with iTunes. See, her old Macbook Pro was 15″ and this new one was 13″, so the iTunes window extended beyond what the screen could display. We couldn’t figure out how to drag the window past the universal menu bar (I thought maybe there might be an equivalent to Alt-mouse-drag in Linux, but couldn’t find one). Clicking the + button (which usually zooms in other applications) just toggled between full iTunes and the iTunes mini player. Finally, I did a Google search and found that you could go to Window > Zoom in the menu bar to get it to zoom (since the + button in iTunes acts in a way inconsistent with other OS X applications). Solved that. Annoying to have to solve.

Meanwhile, I was tailoring my wife’s old computer to suit my needs. I deleted all her design and font programs (she’s the graphic designer; I’m not). I got rid of Mail, Safari, and iCal. Put on Firefox, Chrome, Thunderbird, Transmission, and some other programs I found at Open Source Mac. I love the smooth animation (when importing photos in iPhoto, when switching applications) that I just never could get in Ubuntu, even with Compiz. I don’t like that I can’t toggle hidden files with Control-H (or even Cmd-H). I don’t like that Finder is an always-on application (meaning, when I’m switching applications with Cmd-Tab, I want to switch between only actual applications, and not the file browser if no file browser window is open). I had to install a third-party application to turn off the annoying boot-up noise.
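For the hidden-files gripe there is at least a terminal workaround—a well-known Finder preference rather than a keyboard shortcut, so it’s still clunkier than Control-H in GNOME:

```shell
# Tell Finder to show hidden files, then relaunch Finder so the change takes effect.
defaults write com.apple.finder AppleShowAllFiles TRUE
killall Finder

# Write FALSE and killall Finder again to hide them.
```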

Really, though, the main draws for me to my wife’s old laptop are not any OS X–specific features per se. What I like most are

  • The magnetic power cord, because I am a klutz and actually broke my HP Mini power cord recently.
  • The larger hard drive. Since the HP Mini was my main computer, it was kind of tough to deal with having only a 16 GB SSD, and the upgrade options for a 1.8″ 5mm PATA Zif hard drive aren’t wonderful.
  • The ability to do Netflix streaming (the PS3 fake-Bluray experience isn’t as good as the web browser experience). I guess you could argue that’s OS X–specific, in the sense that Netflix supports Mac OS X and doesn’t support Linux. It has nothing to do with the usability of the operating system design.

Unlike most Linux users, I have always been a fan of iTunes. I’ve used Foobar, WinAmp, Songbird, Exaile, Rhythmbox, AmaroK, JuK, Banshee, and all the rest. I still think iTunes is the best. But I’m going to keep buying songs through Amazon’s MP3 store, since I want to be able to easily port the music to my Sansa Clip or to Ubuntu, should I decide later to set up a dual-boot. I’m also going to be sticking with Android, even after my phone becomes “obsolete” (obsolescence is subjective, I guess). I do like the iPhone, but it’s a bit too restrictive. I like the xScope web browser, and I don’t see any free web browsers in the iTunes app store like it. I like having a rooted device without worrying that updates will constantly break my installation. I like being able to send certain contacts straight to voicemail. I like the Google Voice app (which Apple has rejected for the iPhone).

In Conclusion
Yes, I will continue to update my Ubuntu documentation on Psychocats. Don’t worry. I plan to have Ubuntu in VirtualBox on Mac OS X. I also still have my HP Mini around with Ubuntu on it. My wife and I don’t travel often, but when we do, a 10″ netbook is far more convenient to travel with than a 15″ laptop. So even though Mac OS X is now my main OS, I will continue to document and test Ubuntu. And, mpt, I don’t know if you got my email, but I would be interested in helping the Ubuntu experience design team if that offer is still good.


Ubuntu 10.04 (Lucid Lynx) first impressions

They say you’re not supposed to upgrade to alpha pre-releases of Ubuntu on your main computer. Unfortunately, I have only one computer (my HP Mini 1120nr netbook) to test on, and it has a 16 GB SSD, so dual-booting isn’t even really an option. I just took the plunge, downloaded the latest Lucid Alpha .iso, “burnt” it to USB using UNetBootIn, and then installed it over my Ubuntu 9.10 (Karmic Koala) installation.

I have to say I’m not impressed. Yes, I know it’s an alpha release, but I’ve done alpha releases of older versions of Ubuntu, and it’s usually not this bad so close to the beta release.

A few things I didn’t like

  1. Broadcom drivers can’t be fetched without an internet connection. Okay, so this was true with the last Ubuntu release also, but I know in previous versions Ubuntu would autodetect I had a Broadcom wireless card and then prompt me to activate the necessary drivers and then have it just work (which is what Ubuntu is supposed to do). What does Lucid do? It tells me there are drivers I need to install. When I click on the little green square icon to launch jockey-gtk and try to activate the driver, I get told that the driver can’t be fetched from the online repository. Why should you need an internet connection to get your internet connection working? That’s silly. I’ve filed a bug on it: 535824.
  2. Applications crashing left and right. I’m a bit more hopeful on this one. This does tend to happen in alpha releases. Nevertheless, it’s ridiculous with Lucid. It’s not even the application launches and then crashes. It crashes even before it launches. That happened for Gwibber, for Ubiquity, for Software Center.
  3. Wireless slow to reconnect after resuming from suspend. This bug was annoying in Intrepid and Jaunty. It seemed to go away for Karmic, but now it’s back in Lucid. Look, the whole point of suspend-to-RAM (also known as sleep) is that you can put your computer into a battery-saving state that can be quickly used again without a long wait. If I wanted a long wait, I’d have shut down and then booted up again. It honestly would be quicker than waiting 30 seconds to a minute for wireless to reconnect. Same old bug: 274405.
  4. Internal mic settings not autodetected. Another thing that appeared in previous releases but you’d think they’d have fixed by now. Nope. The hardware detection isn’t the problem. It’s the settings configuration. By default, Ubuntu uses the microphone selection to use the microphone. Really, though, my internal mic is the line-in selection. Shouldn’t Ubuntu be able to tell that for certain models the internal mic is the line-in selection and just select that by default? Bug previously filed: 441480.
  5. General problems. To be honest, I just don’t have the motivation to file bugs on all these, since most of the bugs I file get ignored (or acknowledged and then not fixed). When I resume from suspend, in addition to wireless taking a long time to reconnect, the battery icon for gnome-power-manager appears and disappears from the taskbar like a blinking light. I also get an error message about the monitor configuration. Update manager is holding back certain updates, but the updates still appear. What’s up with that? I had to explicitly go to Edit Connections on Network Manager to get it to automatically reconnect to my wireless network. Shouldn’t it try to automatically reconnect by default? That’s what it did in previous versions.

Another worthy critique

Someone on the Ubuntu Forums linked to 16 things that could be improved in Ubuntu 10.04, and I have to say it’s brilliant and very thorough. I don’t agree 100% with it (for example, Control-Alt-Delete needing to launch gnome-system-monitor). I do, however, agree with most of it and the general sentiment, which is that a lot of the decisions the Ubuntu devs made seem to have absolutely no rationale. It’s not that it’s a rationale I or others disagree with. It appears to be a totally non-existent rationale.

I’d like to elaborate on a couple of points here.

First of all, I don’t have a problem with the window buttons being on the left, as opposed to on the right. I’ve used both Windows and Mac OS X extensively, and I can use both just fine. Here’s the real issue, though. On Mac OS X, the window buttons are on the left but the close window button is on the absolute left. On Lucid Lynx, the button group is on the left, but the close button is on the right of the group. That means if you want to close a window with your mouse, you have to move the mouse over to the middle-left of the window instead of the absolute left corner of the window. Believe it or not, for most users, closing the window is the most common action used with the mouse (not maximizing/restoring or minimizing). Whereas you have easy key combinations to switch windows (Alt-Tab or Cmd-Tab) or minimize windows (Control-Alt-D, Windows-D, or Cmd-H), there isn’t really an easy and consistent way to close windows. Sometimes in Ubuntu it’s Control-W. Sometimes it’s Control-Q. Sometimes you have to do the awkward Alt-F4. Also, it’s safer to use the mouse to close a window since you’re less likely to close the wrong window. I’ve more than once Alt-F4’ed (in both Windows and Linux) the wrong window (thinking it was in focus when it wasn’t).

Someone brought up in the comments that a smaller font may be better for netbooks but isn’t great for larger desktop monitors. Well, Ubuntu seems to be able to autodetect my screen resolution is 1024×576. I’m sure for a lot of large desktop monitors it can autodetect your screen resolution as 1600×1200 or whatever. Would it be that difficult to have the defaults auto-adjusted to your screen resolution? So if you’re using a netbook, the default font would be 8pt or 9pt, and if you’re using a large monitor the default font would be 12pt or 14pt. Hey, there’s an idea.
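To be clear, nothing like this ships in Ubuntu as far as I know—this is just a sketch of the kind of logic I mean, with function name and thresholds invented for illustration:

```shell
#!/bin/sh
# Hypothetical: choose a default desktop font size from the screen height.
# In a real implementation the height would come from RandR, e.g.:
#   height=$(xrandr | awk '/\*/ {split($1, a, "x"); print a[2]; exit}')
pick_font_size() {
  height=$1
  if [ "$height" -le 600 ]; then
    echo 8        # netbook resolutions like 1024x576 or 1024x600
  elif [ "$height" -le 1050 ]; then
    echo 10       # typical laptop panels
  else
    echo 12       # large desktop monitors, 1600x1200 and up
  fi
}

pick_font_size 576    # netbook
pick_font_size 1200   # large desktop monitor
```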

The Future of Ubuntu

Pretty soon I’ll be finishing up my fifth year with Ubuntu. I started with Ubuntu 5.04 (Hoary Hedgehog) in May 2005. I’ve used every release since then: Breezy Badger, Dapper Drake, Edgy Eft, Feisty Fawn, Gutsy Gibbon, Hardy Heron, Intrepid Ibex, Jaunty Jackalope, and Karmic Koala. I’ve posted literally tens of thousands of times on the forums to help new users with their problems. I’ve filed bug reports. I’ve written documentation (both official and unofficial). Over the years, I’ve seen Ubuntu improve a lot. In the old days, there were separate live and installer CDs. The installer CD didn’t even have a point-and-click interface. You couldn’t enable the extra repositories without manually editing the /etc/apt/sources.list file. You couldn’t safely write to NTFS. There was no bootsplash. There was no Wubi to allow a 99.9999% safe dual-boot setup with Windows. I like the recent logo rebranding, too.

With all that vast improvement, though, Ubuntu still hasn’t come significantly closer to fixing Bug #1. There are a few good reasons for this, the main one being that Ubuntu still hopes people will download, burn, install, and configure Ubuntu on their own. This isn’t the way to penetrate the market. And the preinstalled Ubuntu options are not appealing to the general public for various reasons. Dell doesn’t advertise Ubuntu well or price it competitively with Windows. Dell also does not sell Ubuntu on higher-end models… or even in very many countries. You cannot find the Ubuntu-preinstalled Dell models in a physical store to try out. You have to buy it sight-unseen. Same deal with System76 and ZaReason for that last part. If I’m going to be shelling out hundreds or thousands of dollars on a laptop, I want to be able to try it out and see how it looks and feels. With my last two purchases, I had to do it sight unseen (Xandros-preinstalled Asus Eee PC 701 and Ubuntu-preinstalled HP Mini 1120nr). It wasn’t fun having to scour the internet for various reviews and then realizing there were always one or two quirks that no one mentioned that I later discovered.

I don’t know if Jane Silber or Mark Shuttleworth will ever stumble upon my blog, but I wrote two years ago what I believe their best strategy would be, and I still believe that to be true: Ubuntu: The Open Source Apple Challenger? You need a store. You need a physical store with well-designed custom fully Linux-compatible laptops. It has to be as sleek as the Apple Store but with Ubuntu’s unique branding and, more importantly, a more open philosophy. Yes, we highly recommend you use this Ubuntu laptop and this Ubuntu phone and this Ubuntu MP3 player and this Ubuntu printer, but you may also find Ubuntu works well with many other devices. These are the ones we guarantee will work. No kernel regressions. Lots of extra testing.

When you file a bug report for Ubuntu, you’ll have to post lspci and other stuff only if you’re using a non-sanctioned model. Otherwise, Launchpad will automatically know exactly what model you have.
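For context, gathering that info by hand today looks roughly like this (the commands are real; the output filename is just my example):

```shell
#!/bin/sh
# Collect the hardware details typically attached to a Launchpad bug report.
{
  echo "== Kernel =="
  uname -r
  echo "== PCI devices =="
  lspci -nn 2>/dev/null || echo "(lspci not available)"
  echo "== USB devices =="
  lsusb 2>/dev/null || echo "(lsusb not available)"
} > hardware-info.txt

cat hardware-info.txt
```

With a sanctioned model, none of that should be necessary—Launchpad would already know the hardware from the model number.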

I can hear the Ubuntu zealot backlash in my head now. “How can you complain about something that’s free?” “Why don’t you just get a Mac?” “Ubuntu just needs more polish.” No. No. No. That’s not it. See, as I’ve pointed out before, you can’t have it both ways. If you’re going to say that which is free is not worthless, you have to stand by the quality of that which is free, which means you have to accept that there can be criticism of that which is free. Otherwise, you have to say free is necessarily inferior to that which is non-free. Besides, I have devoted hundreds of hours to helping Ubuntu. Maybe I didn’t pay money for it (except that one time I donated to the forums), but I certainly have donated enough of my time and energy to the project to be able to voice a criticism or two. I’ve certainly filed my fair share of bug reports and posted my fair share of brainstorms. And, sure, Ubuntu could use some more polish, but polish won’t save the day if people are still supposed to download and install Ubuntu themselves. For more details on that, see Linux-for-the-masses narratives.

Should I get a Mac, though? I don’t know. I have a lot of problems with Macs. I don’t like how you can resize windows from only the lower-right corner. I don’t like how there is a universal menu bar. I don’t like how accidentally dragging an icon off the dock makes it vanish in a poof of smoke. I don’t like how you can’t get a new Finder window by pressing Cmd-N. I don’t like how Enter renames and Cmd-O opens. I don’t like how minimized applications don’t restore when you Cmd-Tab to them. I don’t like how closing the last window of an application doesn’t quit the application.

You know what, though? Even though I don’t agree with how Apple set up the interface, I understand the rationale behind each and every one of those decisions. I don’t have to agree with the rationale to understand it. For some of the Ubuntu or Gnome teams’ decisions, I cannot see the rationale at all. They just seem like bugs or arbitrary decisions. They don’t all follow a consistent paradigm or vision. More importantly, Apple does have some great innovative things. Love the multi-touch implementation on the new Macbooks. Love the magnetic power cords.

I guess we’ll see what happens when I’m next in a position to buy a new computer. If, by the time I buy a new computer, Ubuntu has physical stores with well-polished and properly marketed preinstalled laptops, I’ll probably get one of those. If, by the time I buy a new computer, Google Chrome OS netbooks are actually a good option, I’ll probably get one of those. If, however, we’re still in the same place we are now with Linux preinstalled, I may be getting a Mac. Don’t let me down, Jane and Mark. I admire so much of what you do, but Ubuntu really has so much more and different to do to get across that Bug #1 threshold. It isn’t just about improving software. It’s about an entirely new business approach.

P.S. I’m not threatening to leave Ubuntu. I’m simply stating what I believe to be a practical approach. If it’s been two years and I go to Google Chrome OS or Mac OS X, I’ll probably still be doing Ubuntu tutorials to help new users. They’ll just still be primarily for Windows ex-power users and not the so-called masses (aka “jane six-pack,” aka “average user”).


How else can Linux fail in the consumer space?

Many Linux advocates and Linux bashers still think the success or failure of Linux in the consumer (not server or embedded) space rests on technical merits. Implementation, marketing, pricing, inertia, vendor lock-in—no, of course, those have nothing to do with whether people decide on Linux as opposed to Windows or Mac OS X. Would it help to work on the technical merits of Linux? Sure. Will that alone make Linux a success for consumers? Hardly. Technical merits will get technical users into it (Network admin, want a server? Use Linux. Hey, TiVo, want a free operating system for your DVR product? Use Linux).

Linux had a few good opportunities to succeed but flubbed the execution:

  1. OLPC. When I heard about the One Laptop Per Child project, I got giddy. It was marketed as the $100 laptop. It was going to be durable. It was going to use Linux. It was going to help kids in developing countries learn. If that had been what really happened, Linux would have really taken off, at least in certain demographic segments of the world. What really happened? Well, the laptop was nowhere near $100. It was more like $200. And if rich folks wanted them, they had to pay $400 ($200 to get one, $200 to give one). It also was a pretty ugly laptop, with an extremely crippled version of Linux.
  2. Dell. When Dell started up its Idea Storm section, it probably had no idea the section would be bombarded by Linux users demanding Dell start offering Linux preinstalled. Well, Dell half-heartedly gave in and offered a couple of select models with Ubuntu preinstalled. This half-hearted effort doomed the new venture to failure. Dell hid Ubuntu away so no one could see it on its website without a direct link or clever Google searching. Dell priced the Ubuntu laptops higher than spec-equivalent Windows laptops. Dell “recommended” Windows on all the Ubuntu laptop pages (it still does). Dell still used Linux-unfriendly hardware (Broadcom, anyone?). To sum up, Dell was not invested in really selling Linux preinstalled. It just wanted to sort of, kind of appease the Linux community (most of whom continue to buy the cheaper Windows-preinstalled laptops and then install Linux for themselves).
  3. Netbooks. I love the idea of netbooks. The execution was terrible, though. They were not heavily advertised. Early netbooks had 512 MB of RAM and 4 GB SSD drives with 7″ screens. The battery life was poor. The keyboards were cramped. The screen resolution was practically non-existent. Worse yet, all the OEMs included crippled versions of Linux… Linpus Linux Lite, Xandros… installing software became in reality the nightmare that Linux haters often misrepresent it to be. It would be like having apps for the iPhone without an App Store. Yes, you could install a regular Linux version yourself, but that’s not what the everyday consumer is going to do. Microsoft slammed the years-familiar XP down on netbooks, and—suffering from a bad implementation and no marketing or advocacy from OEMs—Linux on netbooks floundered.
  4. Android. In many ways, Android is actually a success. But it is not the success it could have been. When people were saying various Android phones could be the next “iPhone killer,” I thought, Hey, maybe they could be. We’ll see. I wasn’t surprised to see that the G1 did not kill the iPhone, the MyTouch didn’t kill the iPhone, the Hero didn’t kill the iPhone, nor did the Droid, nor did the Nexus One. I have a MyTouch 3G with Android, and I love my phone. I understand very well why it didn’t kill the iPhone, though. Apple understands how to make an excellent user experience, and Google doesn’t. That’s the bottom line. I’m not an Apple fanboy. I actually disagree with a lot of the design decisions Apple makes. What I don’t dispute is that Apple has a vision. Every decision, whether I agree with it or not, has a rationale that makes sense. Yes, there are pros and cons, and Apple weighed them and decided the pros outweighed the cons. With Android, though, and with various HTC phones using Android, I see various bad interface implementations that have no pros at all. I just don’t see anyone properly testing these things. For example, on the MyTouch and the Nexus, the speaker is on the back of the phone. Why? On some of the Android text dialogues, you have to tap into the text field (even if you have no hard keyboard) to get the onscreen keyboard to appear (shouldn’t it appear automatically if the text field is in focus?). Those are just a couple of examples.

Just yesterday, Steve Jobs announced the iPad to much ridicule. People made fun of the name. People said it would be useless without Flash, without a USB port, without a front-facing camera, and without multi-tasking. They called it an oversized iPhone. They said the 4:3 aspect ratio wouldn’t be good for movies. The LED screen wouldn’t be good for reading in sunlight or for long periods of time.

I kind of liked it. I wasn’t overwhelmed by it. I wasn’t drooling. But I can see the appeal. It looks like a slick device, and it’s priced a lot lower than people thought it would be (most of the speculation saw it between $700 and $1000). If it’s a standalone device (doesn’t need to hook up or sync to a Windows or OS X computer with iTunes), I might consider it.

I would be curious to see if any OEM will step up to the plate here, though, and give Linux a real chance. I doubt it. It would be quite simple. Create a tablet just like the iPad (it has to include proper multi-touch, though… no backing out for fear of so-called patent infringement, Google). Run a Linux-based operating system that is mainly open source (but can have some proprietary programs on it). Include multi-tasking. Include a proper software repository. Use a regular hard drive instead of an SSD. Include USB ports. Have better screen resolution or a widescreen aspect ratio. Then price it just a little below the iPad… oh, and give it a proper name… one people won’t make fun of.

How simple is that? Will it happen? Probably not. A bunch of iPad imitators will pop up, sure. They’ll each have serious flaws, though. Many will lack multi-touch. Most will be too bulky. Some won’t have a sensible user interface. Some will be too expensive. Then I can tack that on as yet another way Linux has failed in the consumer space.

Mark Shuttleworth, if you’re reading this, it’s about time you realized Bug #1 gets fixed only once you create a full and unified software-hardware user experience. Hordes of Windows users aren’t going to download the Ubuntu .iso, set their BIOSes to boot from CD, repartition their hard drives, install Ubuntu, and then troubleshoot hardware compatibility problems. You (or someone with your savvy and financial resources) need to be the open source Steve Jobs if Linux is going to succeed in the consumer space.

Categories
Uncategorized

The Power of Defaults

I tend to see two extremes whenever there are arguments about what should be the default (I’m speaking specifically of arguments on the Ubuntu Forums, but this could be applied to really anything in technology or anything in life in general).

One extreme is that defaults don’t matter at all. It’s not worth arguing about. Just put whatever as the default. Then people can just choose to change the default to something else if they don’t like the default. The other extreme is that defaults matter enough to have 500-post forum threads about arguing back and forth. Somewhere in the middle is some sanity.

Defaults matter. But defaults are only defaults. People can choose options other than the defaults.

Why do defaults matter? Here are some examples:

  • I love that on my wife’s MacBook Pro, you just press the function keys, and they do something right away (lower the volume, adjust the brightness). My netbook by default needs the blue Fn key pressed in combination with the function keys to get that behavior. I can easily change that. But if I change it, it’s confusing for anyone else using my netbook, because the labels printed on the physical keys indicate that the function keys are plain function keys and that you need Fn in combination to do anything else. In other words, whole products sometimes have fixed parts built around the assumption that defaults will go unchanged.
  • I use VLC for playing individual sound bites or videos. When I dug into the settings for VLC, I didn’t understand half of what was there, and there were a lot of options to configure. Very confusing for a multimedia newbie like me. Good thing I didn’t have to configure all those settings. I just used the defaults. Sane defaults save the user from having to understand unnecessary minutiae.
  • As far as I can tell, every Linux user has a list of things she does immediately after a fresh installation. I usually change the wallpaper to a picture of my cat, replace Evolution with Thunderbird, add in some proprietary codecs, and delete the bottom Gnome panel. Sane defaults should make sense for the greatest number of users. Even though I personally delete the bottom Gnome panel, the vast majority of Gnome users like to have both a top and bottom panel, so having only a top panel wouldn’t make sense; it would mean more people taking more time to configure things. If defaults are well thought out, less time is spent tinkering and adjusting and more time is spent using.
  • Linux live CDs can come in handy, especially if you need to help a Windows user recover deleted data. What happens in that default live session is the first impression the non-Linux user is going to have of Linux, and it may be the only impression she gets. So if an ugly noise or splash screen appears, that’s the impression she walks away with. It doesn’t matter that the noise or splash screen can be changed. Likewise, if you are using the live CD to show someone what Linux is like, you don’t want to have to “uninstall” and then “install” a whole bunch of software in the live session, especially if the computer you’re using has very little RAM.
  • And don’t forget that even though power users like to tinker and explore, most people just stick with defaults. 99% of Windows computers I see have the taskbar on the bottom, even though you can easily drag it to the top or the sides. 99% of Windows XP computers I see have the stupid blue theme, even though you can easily change to a silver or classic theme. And even though Firefox’s market share has skyrocketed in the past five years, Internet Explorer is still, globally, used more than Firefox, Opera, Chrome, or Safari. Its being the default web browser in Windows probably has something to do with that.

Yes, if you have an absolutely unbearable default, many people will probably just ditch it anyway, but instead of thinking “I’m so glad I have the freedom to change this setting,” they’ll most likely be thinking “What a terrible default! Who thought of that? Now we’re all going to have to change this!”

Sometimes defaults have ethical considerations, too. For example, making people opt out of sharing information with a company or third-party corporate “partner” is a bad default (people should always have to opt in for that sort of thing), because if people forget to change the defaults or never investigate all of their basic and advanced settings, they will end up sharing more than they intended to share.
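The difference a single default makes is easy to see in code. Here is a hypothetical sketch (none of these function or field names come from any real product; they are purely illustrative) of the same account-creation logic with opposite sharing defaults:

```python
# Hypothetical sketch: identical account-creation code with opposite
# defaults for data sharing. Names are illustrative, not from a real API.

def create_account_opt_out(name, share_with_partners=True):
    # Opt-out default: sharing is ON unless the user turns it off.
    return {"name": name, "share_with_partners": share_with_partners}

def create_account_opt_in(name, share_with_partners=False):
    # Opt-in default: nothing is shared until the user explicitly asks.
    return {"name": name, "share_with_partners": share_with_partners}

# A user who never opens the settings screen gets whatever the default is:
forgetful = create_account_opt_out("jane")
protected = create_account_opt_in("jane")

print(forgetful["share_with_partners"])  # True: shared without meaning to
print(protected["share_with_partners"])  # False: private unless opted in
```

The code in both functions is identical; only the default differs, and only one of the two protects the user who never touches her settings.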

So if I see a bad default in Ubuntu, I’m going to make a point to say it’s a bad default. Good defaults matter. I will not, however, spend hours of my time arguing the point back and forth. Some things are a deal… they may not be a big deal, but they are still a deal.

Categories
Uncategorized

T-Mobile MyTouch 3G with Android… Four months later

I already wrote T-Mobile MyTouch 3G First Impressions and A month with the MyTouch 3G and Android, but someone requested I write yet another follow-up post after having used the phone for a while.

Well, it’s been almost four months, and I have to say that my general impression hasn’t changed much. I can sum it up as generally positive with a few annoying glitches. If you are a part of the Apple ecosystem already, the iPhone is a better choice. But if you are a Linux user or already caught up in the Google ecosystem, an Android phone is a much better choice. A lot of other Android phones have had more hype (Hero, Droid). The MyTouch is a good phone, though. If I actually liked Sprint, I would have waited for the Hero. And if I actually wanted a boxy-looking phone with a “real” keyboard, I’d have waited for the Droid.

Here are the annoying glitches that have bugged me the most:

  • Every web browser for Android has a serious flaw. Ultimately, I’m willing to settle for the flaws in Browser over the flaws in the other ones (Steel, Dolphin, Opera, etc.).
  • The Facebook app is basically good, but when you click on a picture thumbnail, it doesn’t enlarge the picture within the Facebook app. Instead, it launches the Browser app to view the picture. Lame.
  • After the whole cease-and-desist fiasco, I wanted to support Cyanogen for making a Google-compliant fully legal rooted (i.e., jailbroken) ROM, so I’ve been using Cyanogen recently. Unfortunately, the performance has been spotty. Sometimes it’s super-speedy, and sometimes it’s super-laggy. I may end up giving DWang’s ROM a go again, even though it’s not technically legal (it’s in the spirit of the law but not the letter—apparently Google cares very much about the letter, though).
  • Google Voice is a great service. The Google Voice Android app, however, is buggy as all hell. Sometimes it crashes. Sometimes it’ll randomly duplicate SMS messages if I write the message in landscape (instead of portrait) mode.
  • This doesn’t really bother me any more. If you are thinking of getting a MyTouch, though, you should know this: the touchscreen interface takes getting used to. Unlike the iPhone, light swipes are not recognized. You need to press your finger on the screen when you swipe.
  • With the latest versions of Android, there is no way to disable the camera sound (which is extremely loud). I had to install an app called Sound Manager in order to silence it.

That’s about it.

What has been the good stuff?

  • Opening links in background windows (though Google recently changed its Google News site so that you cannot open links in background windows there; other sites work fine).
  • Good Google Voice integration.
  • Ability to turn any song into a ringtone without special software is great. Right now the Noisettes’ “Wild Young Hearts” is my ringtone.
  • Ability to send unwanted calls straight to voicemail through the phone and to just block them altogether with Google Voice is invaluable.
  • USB tethering is even better than Wifi tethering. You just plug it in and Ubuntu automatically starts using the connection. No need to select anything or change a setting.

In the end, though, a phone is a phone. It makes calls. It receives calls. I can check my email and look up something quick on the web. There are subtle nuances that differentiate Symbian from WebOS, Windows Mobile from iPhone OS, and all of those from Android. Smartphones all pretty much do the same thing, though.

Categories
Linux Ubuntu

Linux users take note: Google knows marketing

While critics and advocates of so-called “desktop Linux” waste their time imagining a world in which some consumer-targeted Linux distro manages to fix all its bugs and then self-proclaimed computer illiterates everywhere download and burn .iso files and then set their BIOSes to boot from CD and install and configure Linux themselves, Google moves forward with Linux doing what Apple has always done: market! Strengths? Highlight those. Perceived weaknesses? Market those as strengths. Actual weaknesses? What actual weaknesses?

Seriously, instead of saying “Anything Windows can do, Linux can do” (some BS statement I’ve seen repeated on numerous Linux forums over the years) or “Linux will be a Windows replacement when it can do X” (another popular BS statement), just be honest about what Linux can do well and then play that up. For years, Linux distros had “app stores” called package managers. Because they didn’t have savvy marketing departments, somehow those package managers became a deficiency (“if only I could double-click a setup.exe as I did in Windows”) instead of a strength (get all your software in one place automatically updated and easily searchable). Apple knew how to take that concept and make it sexy. Voila! The App Store. Google followed up with the Android Market.
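To make the comparison concrete, here is a toy sketch in Python (purely illustrative; real package managers like apt also handle dependencies, signatures, and mirrors) of the three things a repository gives you that a pile of setup.exe files never did: one searchable catalog, one install path, and one update channel.

```python
# Toy model of a package repository. The package names and version
# numbers are made-up examples, not a real catalog.

repository = {  # one trusted catalog, maintained by the distro
    "gimp": "2.6.7",
    "firefox": "3.5.7",
    "vlc": "1.0.2",
}

installed = {}

def search(keyword):
    """Search every available application in one place."""
    return sorted(name for name in repository if keyword in name)

def install(name):
    """Install from the catalog; no hunting for a setup.exe."""
    installed[name] = repository[name]

def upgrade_all():
    """Every installed app updates through the same single channel."""
    for name in installed:
        installed[name] = repository[name]

install("gimp")
repository["gimp"] = "2.6.8"  # the distro publishes an update
upgrade_all()
print(installed["gimp"])  # 2.6.8: updated without visiting any website
```

The point isn’t the code; it’s that “all your software in one place, automatically updated and easily searchable” was always a real, marketable feature just waiting for someone to market it.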

Likewise, for years, Linux distros have offered relatively safe computing for web, email, word processing, light photo editing, and music organization. Did that get played up as a strength? No. Linux advocates and critics instead decided to focus on what Linux didn’t offer (mainly Windows-only applications and drivers for some third-party hardware peripherals). What does Google do? Remind people (YouTube watchers, anyway) that they use “the internet” (web browser, really) for 90% of their computing anyway. Why not focus on the web browser instead of niche applications (the features in Photoshop that only professionals use, since the rest are in GIMP; high-end commercial video games, since people who use their computers 90% of the time on the web will either not play video games or play them on a console; iTunes, because you’re going to buy an Android phone and not an iPhone anyway, target audience of this YouTube video)? Why not play up the strengths of Linux?

Linux fanatics and haters, I give you… proper marketing:

It should also be mentioned that Google isn’t stupid. It knows that people generally buy devices, not operating systems. Who wants to install an OS herself and have to go through figuring out drivers and other such nonsense? That’s the OEM’s job. If you’re like the vast majority of consumers, you don’t buy an iPhone and install Linux on it. You buy an Android phone. You don’t buy a Windows netbook (or, worse yet, buy a badly configured obscure Linux distro preinstalled—Xandros and Linpus, I mean you!) and install Linux on it. You buy a Chrome OS netbook.

Categories
Apple and Mac OS X Computers Linux Ubuntu Windows

Where is this dreamland in which Windows “just works”?

First of all, I have to say it is not my intention to bash Windows. I am not a Windows hater. I actually like Windows. I use it at work every weekday, and I have found ways to have a generally pleasant experience with it. I like Mac OS X better than Windows, though, and I like Ubuntu Linux better than Mac OS X. I actually am quite a firm believer in using the operating system that works best for you and that all the major platforms have pros and cons.

What I can’t stand is Windows power users having a bad experience trying to migrate to Ubuntu (or some other Linux distribution) and then proclaiming “This is why Windows will always dominate the desktop” or “This is why Linux isn’t ready for the masses.” The “this” in these proclamations means they had some problem using a peripheral or getting their wireless to work or whatever. I don’t get it. Really. I don’t understand the logic in such proclamations. This conclusion comes from several flawed assumptions:

  1. Windows always works.
  2. People choose Windows because it always works.
  3. If Linux always worked, the masses would suddenly flock to Linux.
  4. The problem I had with Linux is a problem everyone would have in Linux.

The truth is that if you work in tech support (I don’t officially, but I have unofficially in my last two jobs), you know that there are problems (many problems) on both Windows and Mac OS X. Windows has been the dominant platform at both my current and previous workplaces, and every single day there are Windows problems abounding—cryptic error messages, printer driver conflicts, wireless drivers preventing laptops from going into standby, blue screens of death, rogue viruses, and frozen applications. Believe me, our official tech support guy doesn’t just sit around twiddling his thumbs. He is busy.

Oddly enough, when people have these constant Windows problems, they don’t decide Windows “isn’t ready for the masses.” They just stick with it. Maybe they’ll say “I hate computers.” Maybe some smug Mac user (who also has problems of a different sort but somehow turns a blind eye to them) will say “I hate PCs” (and by PC they mean Windows PC). Oh, but the second a Windows power user tries Linux and encounters one or two problems, suddenly Windows is this always-working utopia. “I’d never have this problem in Windows.” Sure, buddy. Let me tell you about problems.

Last week, a friend of mine wanted to create a playlist of songs to put on her iPhone for a party she was throwing. Here are the problems she encountered:

  • The iPhone wouldn’t update because it couldn’t connect to the iTunes server.
  • After it appeared to start the update, iTunes estimated the update download to take 54 minutes.
  • When the download failed after a half hour, she gave up on getting updated firmware on her iPhone altogether.
  • After installing the Amazon MP3 Installer, the download of the purchased MP3 failed midway through and would not complete or offer a useful error message after clicking retry.
  • The iTunes store worked better for purchasing music but cost more ($1.29 per song instead of $.99 per song)—not really a technical problem but still annoying.
  • She couldn’t sync the songs in her playlist to the iPhone, since the iPhone had been authorized on too many computers already, so she had to call Apple to get them to deauthorize her other computers so she could authorize her current computer.

So that’s “just working”? These are not the only problems she’s had on a Windows computer, and she’s had multiple computers. More importantly, she could not solve all these problems on her own; she needed me to walk her through almost every step of the way. Is this pretty typical? Yes, actually. As I said before, I’m not even the real tech support guy at work, but people still ask me for help with their Windows problems every single day of the week. It could be Microsoft Word inserting some stupid line that can’t be erased or deleted. It could be Firefox not accepting cookies for a website even when you’ve enabled them in Tools > Options. It could be the printer icon not allowing you to delete an errored-out print job.

If there were really an operating system that offered you a flawless experience that didn’t require you to be your own tech support or for you to find outside tech support, then a lot of people would be out of jobs. Help desks everywhere would be laying off employees by the tens of thousands.

So does Linux have problems? Sure. It has a lot of problems. But those problems are not the primary (or even secondary or tertiary) reason most people use Windows. Windows’ dominance has mainly to do with inertia, marketing, brand-name recognition, and a near-monopoly on preinstallations. Why should I have to state this obvious fact? Because, again and again, Windows power users who have spent years or even decades perfecting the art of making Windows a bearable experience perpetuate the nonsense that there are no problems in Windows and that any problem in Linux must be the reason Linux for desktops/laptops/netbooks isn’t more popular than it is.

Further Reading
Linux-for-the-masses narratives
Macs are computers, not magic (part 2)