Categories
Uncategorized

Ubuntu on a Macbook Pro

I’m not abandoning Mac OS X, but you knew it had to happen—I have installed Ubuntu on the Macbook Pro as a dual-boot. It hasn’t been easy, mind you. Previously, I had done a few dual-boot setups: Ubuntu and Windows, Ubuntu and some other Linux distro, even Ubuntu and an older version of Ubuntu. Ubuntu on a Macbook Pro is a totally different experience.

So first I went to the Ubuntu wiki to find out if it was worth my time. According to the Macbook Pro 3,1 page, everything works pretty much out of the box with Ubuntu 10.04 (Lucid Lynx). That was encouraging. Then I read up on the generic Apple Intel installation instructions. They didn’t sound too complicated: install rEFIt, repartition the hard drive, install Ubuntu in the new partition. Easy, right? Well, not so easy. Here are a few bumps I encountered along the way:

  • rEFIt didn’t install correctly. After you install it, you should reboot and see the rEFIt menu. No menu. So I had to do some digging and found out there is a script you can run in the terminal to sort of reinstall rEFIt.
  • I couldn’t resize my hard drive through Disk Utility or BootCamp. Both failed, claiming there wasn’t enough free space, even though there was plenty (at least 70 GB after I backed up my files to an external hard drive and deleted them, planning to copy them back later). So, believe it or not, I spent hours completely reinstalling Mac OS X from scratch and then repartitioning the drive.
  • Since Ubuntu can’t reliably write to HFS+, I put my music, pictures, etc. on a shared FAT32 partition. Unfortunately, iTunes doesn’t really dig that. If I try to skip to the next song, I get about five seconds of the rainbow circle of death before the next song will actually play. The symlinks from the FAT32 partition also broke at first, because initially it was mounted as /Volumes/Storage but then it suddenly became /Volumes/STORAGE. After fixing everything to point to the upper-case mount point, the links appear to be working again.
  • Ubuntu would not install the first five times I tried. That’s right. I tried five times. It kept failing in the middle of the installation, claiming the CD was bad or the CD drive was bad or the laptop was too hot. All of those things could have been true to some degree. The CD had a little bit of dirt on it, which I tried to clean off but couldn’t get completely clean. The CD drive was definitely bad. In OS X it was pretty good at reading commercially produced CDs and DVDs but would sometimes reject homebrews (it would spin and try to read for a minute or two and then just spit the disc out). Also, unlike my wife’s new Macbook Pro, this old MBP overheats like nobody’s business. You could probably fry an egg on it. Eventually, I did something that worked, and I’m not sure which part of it did it. I turned the computer off for the night (let it cool down completely). Then I immediately booted it up and while Ubuntu was installing, I never left it alone. I played gBrainy. I looked in the file browser. I changed various settings. I didn’t let the CD rest and give up. So I don’t know if it was having it cool or constantly engaging the live CD session, but eventually Ubuntu did get installed.
  • I installed the Nvidia driver, but then Hardware Drivers instructed me to use a more recent driver. After installing the newer one, suspend wouldn’t resume, but once I removed the old driver and rebooted, resume from suspend worked fine, as did Compiz.
  • The touchpad works extremely well for two-finger scrolling, but the touch sensitivity is a bit much (and can’t be adjusted, as far as I can tell), so I have to be careful not to tap the touchpad accidentally when trying to scroll; otherwise, I end up clicking. If I turn off tapping to click, then I can’t right-click by tapping down two fingers. A bit annoying.
  • Control is a rather small key on the Mac keyboard, but for most navigation in Ubuntu it’s used more often than the Cmd key (the Super key, for all intents and purposes). The placement is a bit odd whether you’re coming from Mac OS X or from a regular Windows keyboard. Takes a bit of getting used to.
  • I thought Skype was broken, but it wasn’t. I set my account to offline instead of invisible, and apparently if you’re offline you can’t do the Skype test call (it just fails immediately). I didn’t know that, so I was trying all these crazy fixes like uninstalling PulseAudio or whatever. Turns out it just works fine if you’re invisible or online.
  • The Picasa package from the Google repositories is broken with the latest Lucid kernel. If you download the .deb straight from Google, though, it works just fine.
  • I had a 32-bit Ubuntu CD already, so I didn’t really want to bother downloading 64-bit Ubuntu to take advantage of all 4 GB of RAM (and waste another blank CD, since Macs can’t boot from USB). I guess that would have been interesting to try, but 32-bit works quite snappily, even though only a bit more than 3 GB of RAM is recognized.
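The mount-point rename behind the broken symlinks above is fixable with a couple of lines of shell: delete each stale link and recreate it against the new path. Here's a minimal sketch, using a temp directory and hypothetical folder names in place of the real /Volumes paths:

```shell
# Demo of repointing stale symlinks after a mount point changes case.
# A temp directory stands in for /Volumes so this is safe to run anywhere.
DEMO="$(mktemp -d)"
OLD="$DEMO/Volumes/Storage"   # old mixed-case mount point (assumption)
NEW="$DEMO/Volumes/STORAGE"   # new all-caps mount point (assumption)
mkdir -p "$NEW/Music"

# A stale link left over from when the partition mounted at the old path
ln -s "$OLD/Music" "$DEMO/Music"

# The fix: delete the stale link and recreate it against the new path
if [ -L "$DEMO/Music" ]; then
    rm "$DEMO/Music"
    ln -s "$NEW/Music" "$DEMO/Music"
fi
```

On the real system, you'd run the same delete-and-relink step over each linked folder (Music, Pictures, and so on) in your home directory instead of a temp dir.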

Overall, I have to say Ubuntu works quite well on a Mac. I think it even runs a bit cooler, too (still very hot but maybe not hot enough to fry an egg on). My plan is to keep playing around with both (sometimes boot into OS X, sometimes boot into Ubuntu). With a FAT32 partition for files, I have that luxury, except that I will have to be in OS X to import new files into iTunes and iPhoto, whereas Rhythmbox and Picasa on Ubuntu will automatically watch folders for them.

Using a Mac full-time: the good, the bad, and the pretty

I think most new Mac users come straight from Windows; I came from Ubuntu. I’m now using my wife’s old Macbook Pro as my main computer (pretty much as a desktop, since the battery life is abysmal) and my Ubuntu netbook for portability. Here are some good and bad experiences I’ve had.

First, the good:

  • Audio simplicity. I can use Skype without worrying it won’t work with my PulseAudio config or having to recompile Alsa. I can get playback in MuseScore without having to deal with Jack (which I know nothing about).
  • Instantaneous wireless resume. With Ubuntu, the time to get wireless back up after resume from sleep ranged from a few seconds all the way to a minute and a half, depending on the release (Jaunty was the worst, sometimes taking up to two minutes). I tried WICD. No better performance there. I resorted to all sorts of weird tweaks. At least resume was a bit quicker in Karmic. Even in Lucid beta 2, if you look at /usr/lib/pm-utils/sleep.d/55NetworkManager, you’ll see a comment at the top of the file that says, “Make NetworkManager smarter about how to handle sleep/resume. If we sleep for less time than it takes for TCP to reset a connection, and we are assigned the same IP on resume, we should not break established connections. Apple can do this, and it is rather nifty.” That comment’s been there at least since Jaunty. Don’t know if it was there in Intrepid or Hardy as well.
  • Magnetic power cord. As I think I’ve mentioned before, I’m a total klutz and have killed more than one power cord by tripping on them. It’s good to have the peace of mind of a magnetic power cord that just gets yanked out harmlessly.
  • Sound quality. The speakers are great on the Macbook Pro. Nothing tinny like what I’ve experienced in other computers.
  • Smooth animation. Yes, Compiz has a lot of fancy effects, but there’s always something a little jerky or pixelated about everything “cool” I’ve seen in Linux.
  • Netflix streaming. It’s much better on a Mac than it is on PS3 or Wii. And it is, of course, just non-existent on Linux. Even though Macs can’t run all the software Windows can, they do seem to get more third-party support for consumer commercial stuff than Linux does.
  • Simple extended monitor. I haven’t done it recently, but since this is my wife’s old Mac, I have in the past, and it’s pretty simple to do an extended desktop with an external monitor. Linux is getting there, but not quite.
  • Photo and music management. I know a lot of Linux users dig their Amarok or Exaile or whatnot. I’ve always, even back in my Windows-using days, liked iTunes. And since it supports drag and drop to USB devices, I can even use it with my Sandisk player. iPhoto is like a slightly more polished version of F-Spot (which I liked in Ubuntu).
  • Multi-touch touchpad. I use an HP laptop at work (with Windows) and have an HP Mini at home (with Ubuntu), and I don’t dig the one-finger scrolling on the side of the trackpad. Two-finger scrolling is great.
  • Simple USB drive renaming. You’d think it wouldn’t get much simpler than launching GParted, changing the label on the drive, and hitting Apply. It’s a lot easier on OS X, though. Just hit Enter to rename, type the new name, and hit Enter again. No separate program to launch.

Now, the bad:

  • Rainbow circle of death. This Macbook Pro has an over 2 GHz processor and 4 GB of RAM. There’s no reason I should ever be getting any kind of freeze-ups unless I’m running something from Adobe Creative Suite. But, no, even with just Finder, Firefox, and Thunderbird running, I can sometimes get the rainbow circle of death. xkill would be handy here, but Option-Cmd-Escape works, too.
  • Reboots for most updates. I’m used to needing a reboot for only kernel upgrades. I don’t know why Apple does this, but even for simple application updates, it wants to reboot the system.
  • Ugly Thunderbird. I still prefer Thunderbird to Mail, but it doesn’t look good on the Mac, and the downloadable Thunderbird themes are not that great either.
  • Spaces not working correctly. I’m glad Apple decided to put workspaces in Mac OS X, considering Linux distros have had them for quite some time. Unfortunately, I couldn’t quite get Spaces to work the way I wanted it to. For a while, I could, but suddenly my keyboard shortcuts for switching workspaces just weren’t working. I think there are too many keyboard shortcut conflicts in OS X. I just gave up.
  • Hard drive icon always jutting out. I’m on a widescreen laptop, so my most valuable screen real estate is vertical. On a laptop, though, all screen space matters. So I moved the Dock to the right. But I want to have the hard drive icon line up with the Dock. Instead, it juts out a bit, no matter what settings I use for the grid, font, or icon size. I can live with it, but that’s annoying. I did find a neat trick online to get the Dock to go to the corner of the screen instead of floating in the middle, so the trash can is now in the lower-right corner. That’s nice.
  • Application and window management. I just don’t get the application not quitting when the window closes. So if I’m using the terminal and type exit, that means I’m done. And if it’s the last terminal window, I don’t see why I should have to hit Cmd-Q to quit fully. More importantly, I like being able to hit Cmd-Tab (or Control-Tab) to switch between open windows. If I have multiple windows open in one application, I don’t want to have to worry about first switching to that application (Cmd-Tab) and then switching to that particular window (Cmd-`). That’s too much fine-tuned control, and the Cmd-` keyboard shortcut just isn’t easy for my fingers to position themselves for.
  • Overheating. You can fry an egg on this laptop after ten minutes of use. It gets really hot. Now I know why my wife got a cooling pad for it. Fortunately, her new Macbook Pro seems to run a lot cooler.
  • F keys messed up. F9 is supposed to be for Exposé, but now it’s apparently for dimming the backlighting on the keyboard. As far as I can tell, you can either have all the F keys act as normal F keys, in which case Exposé will work again on F9 but none of the volume and brightness keys will work, or you can keep the brightness and volume keys working and have F9 dim the keyboard backlighting. Either way, you’ll have to resort to Fn-F9 at some point to get the functionality you want.
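As for the rainbow-circle freezes in the first bullet: besides Option-Cmd-Escape, the Terminal offers a rough xkill equivalent on both OS X and Linux. A sketch, with `sleep` standing in for the hung application so it's safe to run:

```shell
# Force-kill an unresponsive program from the terminal.
# `sleep` is a harmless stand-in for the frozen application here.
sleep 300 &
APP_PID=$!

kill -9 "$APP_PID"                    # by name, `killall -9 AppName` does the same
wait "$APP_PID" 2>/dev/null || true   # reap it; ignore the kill exit status
```

With a real application, you'd usually find the PID first with `ps aux | grep AppName`, or just use `killall` and skip the PID entirely.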

So, yeah, some gains and some new niggles to deal with. I have Ubuntu Lucid installed in VirtualBox right now, so I’ll be playing around with that too, and I’ll probably install that on my netbook after official release so I can get updated screenshots for my tutorials.

Made the move to Mac

As a follow-up to Why I might switch to Mac from Ubuntu, I did actually get a Mac… or, more precisely, my wife got a new Mac, and I inherited her old one.

Clarifications
Unfortunately, it seemed some of the commenters on that entry brought their own agendas and grudges without actually reading what I wrote. I have tried other distros, many of them, in fact—probably at least 20 distros over the past five years. You can read about some of my more recent failed attempts at trying non-Ubuntu distros. Two of my reasons for switching had nothing to do with Ubuntu specifically: that there were hardware regressions in the Linux kernel (and bugs in other upstream packages), and that developing the operating system wholly independently of the hardware is a flawed approach if you want to increase adoption (which, incidentally, Ubuntu is trying to do, and not all Linux distros are).

To those who claim Macs “just work,” I have to disagree. For more details, read Macs are just computers, not magic and Macs are computers, not magic (part 2).

In terms of what happened in getting the new Mac, it’s been an interesting mix of positives and negatives (Can you believe it? Macs are not the holy grail, nor are they the devil incarnate).

The Apple Store
One of the nice things about the Apple store is that there are a lot of display models of various Apple products you can try out. So my wife and I got to spend considerable time playing around with the new Macbook Pro before we decided on purchasing it. More importantly, the sales staff appear to be trained to strike the right balance between being unavailable and being oversolicitous. A few annoying things about the sales staff, though:

  • They assume you know nothing about Macs, even if you are a long-time Mac owner (as my wife is).
  • They aren’t overly pushy, but they do try to upsell you (AppleCare, training programs, iWork, etc.).
  • They take every opportunity to bash so-called “PCs” in side comments (and by PC, they mean Windows PCs, because, as we all know, Macs aren’t personal computers, and Linux just doesn’t exist, nor does FreeBSD). Want to know where the stereotype of Mac users as being snobby zealots comes from? It comes from the Apple store employees (and from the “I’m a Mac, I’m a PC” commercials). I like Mac and Linux and Windows. Is that a crime to like all three?

The Migration Experience
At home with the new Mac, we used the Migration Assistant to move my wife’s files, settings, and applications over to the new computer. I don’t know who at Apple is in charge of the Migration Assistant, but that person needs to be replaced. First, it prompts you to make the transfer via firewire. The new Macbook Pro doesn’t come with a firewire cable, though. We had an old firewire cable from an external hard drive, but apparently that’s the wrong kind. We tried to do the transfer via ethernet. We soon realized that was a mistake, as the transfer was going to take three hours. Unfortunately, Migration Assistant is set up so that you can’t do anything else on the computer while the migration is happening, and the time-remaining estimate arbitrarily climbs, stalls, or drops. At one point, it said it was going to take four hours. So we canceled it by killing the Migration Assistant on the source Macbook Pro and then forcing a shutdown on the destination Macbook Pro. Then we ran the Migration Assistant again, this time with just the settings and applications (not the files). The files we copied over manually from an external hard drive backup afterwards (during that copy, my wife could actually use her new computer).

Apart from the Migration Assistant process being godawful, the migration result itself is pretty good. The setup was exactly the way she had it on her old computer. Wireless keys remembered. Dock configured in the exact same way. Mail with all IMAP accounts set up. Wallpaper the same. It was an exact replica of her account on the old Mac. All the programs worked, including CS3 (I thought maybe that might need a new activation key or something).

Unfortunately, one thing that didn’t work (and this points to a major usability issue with Mac OS X: windows can be resized from only one corner) was her window setting for iTunes. See, her old Macbook Pro was 15″ and the new one is 13″, so the iTunes window extended beyond what the screen could display. We couldn’t figure out how to drag the window up past the menu bar (I thought there might be an equivalent to Alt-mouse-drag in Linux, but couldn’t find one). Clicking the + button (which usually zooms in other applications) just toggled between full iTunes and the iTunes mini player. Finally, I did a Google search and found that you can go to Window > Zoom in the menu bar to get it to zoom (since the + button in iTunes acts in a way inconsistent with other OS X applications). Solved that. Annoying to have to solve.

Meanwhile, I was tailoring my wife’s old computer to suit my needs. I deleted all her design and font programs (she’s the graphic designer; I’m not). I got rid of Mail, Safari, and iCal. Put on Firefox, Chrome, Thunderbird, Transmission, and some other programs I found at Open Source Mac. I love the smooth animation (when importing photos in iPhoto, when switching applications) that I just never could get in Ubuntu, even with Compiz. I don’t like that I can’t toggle hidden files with Control-H (or even Cmd-H). I don’t like that Finder is an always-on application (meaning, when I’m switching applications with Cmd-Tab, I want to switch between only actual applications, and not the file browser if no file browser window is open). I had to install a third-party application to turn off the annoying boot-up noise.
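On the hidden-files front, there is at least a Finder preference I've seen suggested for showing dotfiles, though it's nowhere near as convenient as a Control-H toggle in Nautilus. A sketch of the commonly cited commands (the exact key spelling may vary between OS X versions):

```shell
# Tell Finder to show hidden files, then restart Finder to apply
defaults write com.apple.finder AppleShowAllFiles TRUE
killall Finder

# and to hide them again:
# defaults write com.apple.finder AppleShowAllFiles FALSE
# killall Finder
```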

Really, though, the main draws for me to my wife’s old laptop are not any OS X–specific features per se. What I like most are

  • The magnetic power cord, because I am a klutz and actually broke my HP Mini power cord recently.
  • The larger hard drive. Since the HP Mini was my main computer, it was kind of tough to deal with having only a 16 GB SSD, and the upgrade options for a 1.8″ 5mm PATA Zif hard drive aren’t wonderful.
  • The ability to do Netflix streaming (the PS3 fake-Bluray experience isn’t as good as the web browser experience). I guess you could argue that’s OS X–specific, in the sense that Netflix supports Mac OS X and doesn’t support Linux. It has nothing to do with the usability of the operating system design.

Unlike most Linux users, I have always been a fan of iTunes. I’ve used Foobar, WinAmp, Songbird, Exaile, Rhythmbox, AmaroK, JuK, Banshee, and all the rest. I still think iTunes is the best. But I’m going to keep buying songs through Amazon’s MP3 store, since I want to be able to easily port the music to my Sansa Clip or to Ubuntu, should I decide later to set up a dual-boot. I’m also going to be sticking with Android, even after my phone becomes “obsolete” (obsolescence is subjective, I guess). I do like the iPhone, but it’s a bit too restrictive. I like the xScope web browser, and I don’t see any free web browsers in the iTunes app store like it. I like having a rooted device without worrying that updates will constantly break my installation. I like being able to send certain contacts straight to voicemail. I like the Google Voice app (which Apple has rejected for the iPhone).

In Conclusion
Yes, I will continue to update my Ubuntu documentation on Psychocats. Don’t worry. I plan to have Ubuntu in VirtualBox on Mac OS X. I also still have my HP Mini around with Ubuntu on it. My wife and I don’t travel often, but when we do, a 10″ netbook is far more convenient to travel with than a 15″ laptop. So even though Mac OS X is now my main OS, I will continue to document and test Ubuntu. And, mpt, I don’t know if you got my email, but I would be interested in helping the Ubuntu experience design team if that offer is still good.

Why I might switch to Mac from Ubuntu

Who am I?
I’ve been using Ubuntu for almost five years now. I’ve offered some technical support on the Ubuntu Forums and been a moderator there off and on. I’ve maintained a new-user-targeted documentation site for every release of Ubuntu except the very first (4.10). I’ve also contributed to a few official Wiki pages. Even though nanotube did all the heavy lifting, I did help out a fair bit in at least the beginning stage of UbuntuZilla. I’ve filed bug reports at Launchpad. I’m not a programmer, but I feel I’ve contributed a fair bit to Ubuntu.

Why I was drawn to Ubuntu
I admire a lot of what Mark Shuttleworth has done. He has an enormous amount of wealth. A lot of people who don’t have much wealth like to think that if they did, they would undoubtedly give most of it away. It’s easy to give away other people’s money. It is not so easy to give away your own. My parents aren’t nearly as rich as Shuttleworth, yet somehow they managed to give a large percentage of their money away to church and to various charities and still maintain a very comfortable upper-middle-class lifestyle. My wife and I are struggling to make ends meet while also trying to give to causes we deem worthy. Sinking millions of pounds into what could have been a dead-end project is a risk I admire Mark Shuttleworth for taking. He could have been ridiculed. He could have lost a lot of money on nothing.

He had a vision, though. I liked that original vision. I liked the free CDs shipped anywhere. I liked the idea of one CD with one application per task, not a lot of confusing options, and sensible defaults. More importantly, I liked the idea of Ubuntu—humanity toward others, which showed quite well in the Ubuntu Forums. And Ubuntu was one of the few distros to try to strike a reasonable balance between the lofty ideals of Free software zealotry and the pragmatism of proprietarily-licensed software.

Where did Ubuntu go wrong?
For a while, I had high hopes for Ubuntu. Every release seemed to make Ubuntu more polished, every additional feature seemed to make Ubuntu more accessible for the Linux novice. A few things that have come up recently have made me a bit disillusioned with Ubuntu, though:

  • These days, decisions and “improvements” seem more like arbitrary changes instead of actual user experience improvements. Grub suddenly became less configurable, as did GDM. Notifications would appear and randomly disappear at odd times (for example, if my wireless reconnected, the notification would still say I was disconnected and then change to connected only about ten seconds after I’d actually reconnected).
  • My bug reports have really come to naught. A few years ago, if someone had complained on the Ubuntu Forums about a problem with Ubuntu, I would have been first in line to say “Complaining here won’t do any good. If you want to tell the developers, file a bug report.” After seeing that most of my bug reports have been unanswered or unfixed, sometimes for years, I don’t know that filing a bug report is really the best thing to do.
  • Brainstorm is a mess. Really, there isn’t an efficient way for developers to get proper feedback from users. If I, as a user, can’t make sense of Brainstorm’s thousands of ideas, how can the developers, who are busy developing?
  • I’ve seen too many hardware regressions. A lot of this isn’t Ubuntu’s fault. A lot of this is upstream. Regardless, upstream affects the Ubuntu experience. The real problem is that the Linux kernel tries to support everything well. There isn’t enough focus. So something that is in theory supposed to be Linux compatible (say, an Intel Pro Wireless 2200bg card) can work perfectly in one release, and then have random disconnects in the next two releases and then work perfectly again in the next release. Personally, I’ve had a Broadcom card that works and doesn’t work in alternating Ubuntu releases, and that makes no sense to me. If the problem is that hardware manufacturers aren’t making it easy for Linux developers to make drivers, then that hardware should never work. If, however, the hardware works in one Ubuntu release and doesn’t work in the next release, that is definitely the fault of Linux, whether it is the kernel team upstream or the Ubuntu team… or both.
  • Recent decisions have seemed to focus on whim or business more than user experience, particularly the change to Yahoo! as the default search engine in Firefox and the random moving of the window control buttons from right to left. I have no problem with change. I also have no problem with Ubuntu making money. But there seems to be an utter disregard for how changes affect users. A little more communication would help. More details here.
  • The most important thing is there doesn’t seem to be a real strategy in place for fixing Bug #1. Yes, there are power users who like to install their own operating systems and troubleshoot hardware compatibility issues. In order for your product to take off, though, it can’t be just an operating system. It has to be a product. It has to be something people can purchase. And the limited options from Dell (which recommends Windows, even on the Linux parts of its website) don’t cut it. They also aren’t created by Ubuntu. They just use Ubuntu. Recently, Google released the Nexus One as its idea of hardware perfectly matching the software in Android. There is no Ubuntu equivalent. There isn’t hardware designed to be the ultimate Ubuntu experience. I’ve heard various Ubuntu advocates propose making an Ubuntu commercial. What’s the point, though? If someone saw an Ubuntu commercial, she couldn’t just go and buy Ubuntu, especially in certain countries. The options are limited or non-existent. And hardware compatibility is iffy (Dell still uses Broadcom cards… I have a Broadcom card in my Ubuntu-preinstalled HP Mini, which HP no longer makes, by the way).

The straw that broke this camel’s back
This window button move in Ubuntu 10.04 is really indicative of a bad direction Ubuntu is headed in. Defaults matter. One of the things I liked about Ubuntu, as I stated before, is its sensible defaults. I don’t have to agree with everything the Ubuntu teams decide or that Mark Shuttleworth decides. That’s fine. You want GIMP out… I don’t agree with it, but I at least understand the rationale behind the decision (it takes up a lot of space on the disk, and most people do not need the crazy power-user features GIMP offers as a photo editor). This decision about the window controls came out of nowhere and had no apparent rationale. Instead of getting good reasons for the change, all we got was… nothing for a while. We got some people saying “Hey, it’s different” or “Just get used to it” or “You can change it back easily if you want.” These aren’t reasons for a change. These are coping strategies. If a change happens, there should be good reason for it. Look, I get Shuttleworth saying Ubuntu is not a democracy. It doesn’t have to be a democracy, though. How about, as self-appointed benevolent dictator for life, just explaining why you made a decision? People don’t have to agree with your decision, but if they at least have a reason for it, they are more likely to accept it. How about, even though you have the power and right not to listen to people, just soliciting feedback?

It took a lot of pressing from users to get Shuttleworth to talk a bit more about what kind of “feedback” and “data” he was looking for. He said at least that the decision wasn’t final, and he wanted genuine data. Based on his remarks in this bug report, it really does seem, though, that he has made up his mind that this is what is going to happen, regardless of what data and feedback people present him with—especially when people present a lot of legitimate points against the move, and then he just replies “And the major argument against it appears solely to be ‘we’re used to it here.'” For more details on those legitimate points, take a look at this and this.

Democracy v. Dictatorship = false dichotomy
In case anyone’s wondering, there are more than two options out there. You don’t have to put every decision to a vote. And you don’t have to totally disregard community input. You don’t have to try to please everyone or please no one. And you don’t have to be subject to mob rule if you offer a little transparency.

My advice to Shuttleworth for the future would be this: if you want to make a unilateral change, just be open about your reasons for it. You can be a strong leader without pissing off large segments of your user base. Just say “I want to change this a bit, because I think it offers X, Y, and Z usability improvements. I realize a change is difficult for everyone, and I also concede there are A, B, and C tradeoffs in making the change. The tradeoffs are worth it, though. Ultimately, the decision rests with me and the desktop experience team. Nevertheless, I would like to hear your concerns about the change, and the best way for you to communicate your concerns is through methods D and E.” Would that be so difficult? Any time you make a change, there will always be some people unhappy about it. You can still make the process a little less heated with just some communication and openness. After all, on your webpage, you say “Ubuntu is a community developed operating system that is perfect for laptops, desktops and servers.” Your millions of pounds help make Ubuntu happen. We all know that. Keep in mind that it would behoove you to not piss off your user base, as the success of Ubuntu can’t be bought with pounds alone. Millions of users contribute to Ubuntu in many ways as well.

Why Mac?
When I voiced opposition to this latest change in Ubuntu, I got a lot of “Ubuntu is not a democracy” and “You can always use something else.” Well, as I just explained, you can very well have a non-democracy that is still community-focused. I hope Mark Shuttleworth will reconsider for the future his approach to communicating (or not communicating, in this instance) with the larger Ubuntu communities. Really, though, if I’m going to be using an operating system maintained by a dictator, I might as well go for one who understands that 1) hardware and software planned together make for a better user experience and 2) even if users don’t agree with his design decisions, he should still have rationales for those decisions.

I can’t even tell you how many design decisions I disagree with Apple about (resize only from bottom right corner, zoom instead of maximize, disk image mounting for software installation, dock icons in poof of smoke when dragged off dock, etc.). You know what, though? Each one of those decisions I disagree with I also understand the rationale for. More importantly, I like how Apple doesn’t like to tackle too much at once. Instead of trying to support all hardware and then having regressions on various theoretically “supported” devices, Apple realizes it’s better to have a great experience on a limited number of devices.

And the attention to detail is impressive. The magnetic cord I love. I am a total klutz and can’t tell you how many cords I’ve ruined by tripping on them or tugging them the wrong way. In fact, I just broke my HP Mini cord this weekend and had to order a replacement. Not so with the magnetic cord on my wife’s Macbook Pro. When the Macbook is sleeping, the power light fades slowly in and out instead of doing a hard on-off blink. The power button is flush with the frame of the laptop, not jutting out. The sound quality is always good on Mac laptop speakers. There’s a lot to admire about Apple’s approach. It is one great way to present an integrated hardware-software computer experience. My hope was that someone would present another great way. We’ll see if that ever happens.

Am I abandoning Free software?
Not really. First of all, I don’t know that I’m going Mac. Macs are expensive, so I’d have to save up for one. Even if I do go Mac, though, my Mac experience would be very different from my wife’s Mac experience. For one thing, I might dual-boot with Linux Mint. And even if I stick with Mac OS X, I will use Thunderbird instead of Mail, Firefox instead of Safari, OpenOffice instead of iWork, and my Android phone instead of an iPhone (Cyanogen’s rooted ROM has made me really appreciate the Android platform even though the iPhone has its advantages too). No change has to be permanent, though. If Ubuntu comes around or changes the way it does business, or if some other Linux distro focuses its energy on preinstallation, proper marketing and distribution, and thorough hardware-compatibility testing on a few select models, I might make my way back. In the meantime, if I go Mac, don’t worry—I’ll still be making my Ubuntu tutorials. A bad decision though the window control switch is, it’s probably not bad enough for most Ubuntu users to actually abandon Ubuntu at this point. For me, it was a tipping point. It’s been a good five years.


Ten Brainstorm ideas I wish more people would vote up

Ubuntu Brainstorm is a mess. There are literally tens of thousands of ideas posted up there. How can you make any sense of it? Well, you can’t. I thought I’d just draw some attention to some ideas I think are worthwhile in the hopes that people will vote them up or at least discuss them.

Here’s my top ten along with quick blurbs as to why they’re important:

  1. Not everyone has broadband internet access at home. So-called “Linux for Human Beings” should focus on accessibility.
  2. One good SVG takes up less disk space than seven PNGs of various sizes, and it also looks great no matter how big you make it.
  3. I don’t think this requires a justification. I’m using the latest Ubuntu 10.04 alpha, and the problem still requires a workaround (deleting and recreating the keyring password with “unsafe storage”).
  4. Why ask a user to paste a command into the terminal when the program could just run the command by itself?
  5. Privacy should be the default, with sharing as an opt-in.
  6. Why give new users the option through the GUI to accidentally remove admin access?
  7. For the last time: if hiding asterisks or dots is “a security feature,” then you should be voting up Idea #11136: Remove visual feedback from GUI password dialogues. If it isn’t a security feature, though, then you should vote this up so as not to confuse users who are expecting visual feedback when they type passwords. This happens a lot.
  8. Lots of widescreen monitors are out these days. Why waste vertical screen space with a second panel? A lot of people seem to think moving the window buttons from right to left is no big deal, so why would it be a big deal to just remove one Gnome panel by default? And the defaults-don’t-matter crowd (which I am not a part of) can just add it back with a few clicks.
  9. I take a lot of screenshots for tutorials. I know a lot of other folks do too. It’d be great if gnome-screenshot didn’t keep prompting for a file name. Just create the file… or allow an easy preference option to do so.
  10. I understand why Ubuntu doesn’t include various codecs and software by default, but apart from pasting in cryptic code, new users don’t have an easy way to access the Medibuntu repositories. It’d be great if they could check just one more box (as they can with the Partner repositories).
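For reference, the “cryptic code” in question amounts to adding the Medibuntu apt source by hand. If memory serves, the era’s instructions boiled down to a line along these lines (with the codename matching your release):

```
# /etc/apt/sources.list.d/medibuntu.list  (codename varies by release)
deb http://packages.medibuntu.org/ lucid free non-free
```

followed by installing the medibuntu-keyring package and refreshing the package lists. A single extra checkbox in Software Sources would replace all of that.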


Ubuntu 10.04 (Lucid Lynx) first impressions

They say you’re not supposed to upgrade to alpha pre-releases of Ubuntu on your main computer. Unfortunately, I have only one computer (my HP Mini 1120nr netbook) to test on, and it has a 16 GB SSD, so dual-booting isn’t even really an option. I just took the plunge, downloaded the latest Lucid Alpha .iso, “burnt” it to USB using UNetbootin, and then installed it over my Ubuntu 9.10 (Karmic Koala) installation.

I have to say I’m not impressed. Yes, I know it’s an alpha release, but I’ve done alpha releases of older versions of Ubuntu, and it’s usually not this bad so close to the beta release.

A few things I didn’t like

  1. Broadcom drivers can’t be fetched without an internet connection. Okay, so this was true with the last Ubuntu release also, but I know in previous versions Ubuntu would autodetect that I had a Broadcom wireless card, prompt me to activate the necessary drivers, and then just work (which is what Ubuntu is supposed to do). What does Lucid do? It tells me there are drivers I need to install. When I click on the little green square icon to launch jockey-gtk and try to activate the driver, I get told that the driver can’t be fetched from the online repository. Why should you need an internet connection to get your internet connection working? That’s silly. I’ve filed a bug on it: 535824.
  2. Applications crashing left and right. I’m a bit more hopeful on this one. This does tend to happen in alpha releases. Nevertheless, it’s ridiculous with Lucid. It’s not even that the application launches and then crashes. It crashes before it even launches. That happened for Gwibber, for Ubiquity, and for Software Center.
  3. Wireless slow to reconnect after resuming from suspend. This bug was annoying in Intrepid and Jaunty. It seemed to go away for Karmic, but now it’s back in Lucid. Look, the whole point of suspend-to-RAM (also known as sleep) is that you can put your computer into a battery-saving state that can be quickly used again without a long wait. If I wanted a long wait, I’d have shut down and then booted up again. It honestly would be quicker than waiting 30 seconds to a minute for wireless to reconnect. Same old bug: 274405.
  4. Internal mic settings not autodetected. Another thing that appeared in previous releases but you’d think they’d have fixed by now. Nope. The hardware detection isn’t the problem. It’s the settings configuration. By default, Ubuntu assumes the internal mic is the “microphone” input selection, but on my hardware the internal mic is actually the line-in selection. Shouldn’t Ubuntu be able to tell that for certain models the internal mic is the line-in selection and just select that by default? Bug previously filed: 441480.
  5. General problems. To be honest, I just don’t have the motivation to file bugs on all these, since most of the bugs I file get ignored (or acknowledged and then not fixed). When I resume from suspend, in addition to wireless taking a long time to reconnect, the battery icon for gnome-power-manager appears and disappears from the taskbar like a blinking light. I also get an error message about the monitor configuration. Update manager is holding back certain updates, but the updates still appear. What’s up with that? I had to explicitly go to Edit Connections on Network Manager to get it to automatically reconnect to my wireless network. Shouldn’t it try to automatically reconnect by default? That’s what it did in previous versions.

Another worthy critique

Someone on the Ubuntu Forums linked to 16 things that could be improved in Ubuntu 10.04, and I have to say it’s brilliant and very thorough. I don’t agree 100% with it (for example, Control-Alt-Delete needing to launch gnome-system-monitor). I do, however, agree with most of it and the general sentiment, which is that a lot of the decisions the Ubuntu devs made seem to have absolutely no rationale. It’s not that it’s a rationale I or others disagree with. It appears to be a totally non-existent rationale.

I’d like to elaborate on a couple of points here.

First of all, I don’t have a problem with the window buttons being on the left, as opposed to on the right. I’ve used both Windows and Mac OS X extensively, and I can use both just fine. Here’s the real issue, though. On Mac OS X, the window buttons are on the left, but the close window button is on the absolute left. On Lucid Lynx, the button group is on the left, but the close button is on the right of the group. That means if you want to close a window with your mouse, you have to move the mouse over to the middle-left of the window instead of the absolute left corner of the window. Believe it or not, for most users, closing the window is the most common action used with the mouse (not maximizing/restoring or minimizing). Whereas you have easy key combinations to switch windows (Alt-Tab or Cmd-Tab) or minimize windows (Control-Alt-D, Windows-D, or Cmd-H), there isn’t really an easy and consistent way to close windows. Sometimes in Ubuntu it’s Control-W. Sometimes it’s Control-Q. Sometimes you have to do the awkward Alt-F4. Also, it’s safer to use the mouse to close a window since you’re less likely to close the wrong window. I’ve more than once Alt-F4’ed (in both Windows and Linux) the wrong window (thinking it was in focus when it wasn’t).

Someone brought up in the comments that a smaller font may be better for netbooks but isn’t great for larger desktop monitors. Well, Ubuntu seems to be able to autodetect that my screen resolution is 1024×576. I’m sure for a lot of large desktop monitors it can autodetect a resolution of 1600×1200 or whatever. Would it be that difficult to have the defaults auto-adjusted to your screen resolution? So if you’re using a netbook, the default font would be 8pt or 9pt, and if you’re using a large monitor the default font would be 12pt or 14pt. Hey, there’s an idea.
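A resolution-aware default wouldn’t require much logic. Here’s a minimal sketch in shell (the function name and thresholds are my own invention, purely for illustration):

```shell
#!/bin/sh
# Hypothetical sketch: pick a default font size from the detected
# horizontal resolution. The thresholds are invented for illustration.
pick_font_size() {
    width=$1
    if [ "$width" -le 1024 ]; then
        echo 9      # netbook-class screens (e.g. 1024x576)
    elif [ "$width" -le 1440 ]; then
        echo 10     # typical laptop panels
    else
        echo 12     # large desktop monitors (e.g. 1600x1200)
    fi
}

pick_font_size 1024     # netbook: prints 9
pick_font_size 1600     # desktop: prints 12
```

In a real implementation, the width would come from the X server (what the display settings dialog already detects), but the point is that the mapping itself is trivial.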

The Future of Ubuntu

I’m coming up on the end of my fifth year with Ubuntu. I started in May 2005 with Ubuntu 5.04 (Hoary Hedgehog). I’ve used every release since then: Breezy Badger, Dapper Drake, Edgy Eft, Feisty Fawn, Gutsy Gibbon, Hardy Heron, Intrepid Ibex, Jaunty Jackalope, and Karmic Koala. I’ve posted literally tens of thousands of times on the forums to help new users with their problems. I’ve filed bug reports. I’ve written documentation (both official and unofficial). Over the years, I’ve seen Ubuntu improve a lot. In the old days, there were separate live and installer CDs. The installer CD didn’t even have a point-and-click interface. You couldn’t enable the extra repositories without manually editing the /etc/apt/sources.list file. You couldn’t safely write to NTFS. There was no bootsplash. There was no Wubi to allow a 99.9999% safe dual-boot setup with Windows. I like the recent logo rebranding, too.

With all that vast improvement, though, Ubuntu still hasn’t come significantly closer to fixing Bug #1. There are a few good reasons for this, the main one being that Ubuntu still hopes people will download, burn, install, and configure Ubuntu on their own. This isn’t the way to penetrate the market. And the preinstalled Ubuntu options are not appealing to the general public for various reasons. Dell doesn’t advertise Ubuntu well or price it competitively with Windows. Dell also does not sell Ubuntu on higher-end models… or even in very many countries. You cannot find the Ubuntu-preinstalled Dell models in a physical store to try out. You have to buy it sight-unseen. Same deal with System76 and ZaReason for that last part. If I’m going to be shelling out hundreds or thousands of dollars on a laptop, I want to be able to try it out and see how it looks and feels. With my last two purchases, I had to do it sight unseen (Xandros-preinstalled Asus Eee PC 701 and Ubuntu-preinstalled HP Mini 1120nr). It wasn’t fun having to scour the internet for various reviews and then realizing there were always one or two quirks that no one mentioned that I later discovered.

I don’t know if Jane Silber or Mark Shuttleworth will ever stumble upon my blog, but I wrote two years ago what I believe their best strategy would be, and I still believe that to be true: Ubuntu: The Open Source Apple Challenger? You need a store. You need a physical store with well-designed custom fully Linux-compatible laptops. It has to be as sleek as the Apple Store but with Ubuntu’s unique branding and, more importantly, a more open philosophy. Yes, we highly recommend you use this Ubuntu laptop and this Ubuntu phone and this Ubuntu MP3 player and this Ubuntu printer, but you may also find Ubuntu works well with many other devices. These are the ones we guarantee will work. No kernel regressions. Lots of extra testing.

When you file a bug report for Ubuntu, you’ll have to post lspci and other stuff only if you’re using a non-sanctioned model. Otherwise, Launchpad will automatically know exactly what model you have.

I can hear the Ubuntu zealot backlash in my head now. “How can you complain about something that’s free?” “Why don’t you just get a Mac?” “Ubuntu just needs more polish.” No. No. No. That’s not it. See, as I’ve pointed out before, you can’t have it both ways. If you’re going to say that which is free is not worthless, you have to stand by the quality of that which is free, which means you have to accept that there can be criticism of that which is free. Otherwise, you have to say free is necessarily inferior to that which is non-free. Besides, I have devoted hundreds of hours to helping Ubuntu. Maybe I didn’t pay money for it (except that one time I donated to the forums), but I certainly have donated enough of my time and energy to the project to be able to voice a criticism or two. I’ve certainly filed my fair share of bug reports and posted my fair share of brainstorms. And, sure, Ubuntu could use some more polish, but polish won’t save the day if people are still supposed to download and install Ubuntu themselves. For more details on that, see Linux-for-the-masses narratives.

Should I get a Mac, though? I don’t know. I have a lot of problems with Macs. I don’t like how you can resize windows from only the lower-right corner. I don’t like how there is a universal taskbar. I don’t like how accidentally dragging an icon off the dock makes it vanish in a poof of smoke. I don’t like how you can’t get a new finder window by pressing Cmd-N. I don’t like how Enter renames and Cmd-O opens. I don’t like how minimized applications don’t restore when you Cmd-Tab to them. I don’t like how closing the last window of an application doesn’t quit the application.

You know what, though? Even though I don’t agree with how Apple set up the interface, I understand the rationale behind each and every one of those decisions. I don’t have to agree with the rationale to understand it. For some of the Ubuntu or Gnome teams’ decisions, I cannot see the rationale at all. They just seem like bugs or arbitrary decisions. They don’t all follow a consistent paradigm or vision. More importantly, Apple does have some great innovative things. Love the multi-touch implementation on the new Macbooks. Love the magnetic power cords.

I guess we’ll see what happens when I’m next in a position to buy a new computer. If, by the time I buy a new computer, Ubuntu has physical stores with well-polished and properly marketed preinstalled laptops, I’ll probably get one of those. If, by the time I buy a new computer, Google Chrome OS netbooks are actually a good option, I’ll probably get one of those. If, however, we’re still in the same place we are now with Linux preinstalled, I may be getting a Mac. Don’t let me down, Jane and Mark. I admire so much of what you do, but Ubuntu still has much more, and much different, work to do to get past that Bug #1 threshold. It isn’t just about improving software. It’s about an entirely new business approach.

P.S. I’m not threatening to leave Ubuntu. I’m simply stating what I believe to be a practical approach. If it’s been two years and I go to Google Chrome OS or Mac OS X, I’ll probably still be doing Ubuntu tutorials to help new users. They’ll just still be primarily for Windows ex-power users and not the so-called masses (aka “jane six-pack,” aka “average user”).


What bothers me about the Ubuntu-Yahoo deal

On Tuesday, Rick Spencer announced on the Ubuntu developers mailing list that Ubuntu has entered a revenue sharing deal with Yahoo! and will make Yahoo! the default search engine in the next Ubuntu release (10.04, Lucid Lynx). This sparked an extremely long discussion thread on the Ubuntu Forums about whether this is a good idea or not.

Generally speaking (with few exceptions), the reactions fall into one of two categories:

  1. This is great. I won’t use Yahoo! myself, but if it makes money for Ubuntu, why not? How hard is it to change the default? Two clicks.
  2. This is unacceptable. Yahoo! is in bed with Microsoft. This is wrong. If Ubuntu needs money, we should donate. Why wasn’t the community consulted?

Well, my reaction to this deal wasn’t quite either of those. Yes, I believe the community should have been consulted. That isn’t really what bothered me. What bothered me is that the decision was made solely with regard to revenue, with no thought at all for the user experience. It wasn’t “We evaluated the default search engine and decided Yahoo! has better search results or gives a better search experience than Google, and so we have decided to enter a revenue-sharing deal with Yahoo!” Nor was it even “We evaluated Yahoo! and Google and found the Yahoo! search experience to be only slightly worse than the Google one or about equal, but we thought revenue-sharing would be worth the sacrifice.” No, no mention of the user experience at all. It’s just the revenue.

I have nothing against Ubuntu making money. Mark Shuttleworth has deep pockets, but if Ubuntu is to be self-sustaining, it can’t just drain his pocketbook indefinitely. Nevertheless, defaults matter; if they didn’t, this deal would get Ubuntu no money, because if most people changed the default, very few users would keep Yahoo!, and Ubuntu wouldn’t see much revenue from the deal.

That last bit is something people don’t realize. If all (or even most of) the Ubuntu users change the default to Google or Cuil or Scroogle, then you can’t say “Well, I won’t use it, but great for Ubuntu to make some money.” They won’t be making money if you all keep changing the search engine.

But we won’t all be changing the search engine. Anyone handed the live CD and trying to do a search will either not know Yahoo! is the default search engine or just not bother to change it. (One of the reasons defaults matter.)

So I can see only two sensible reactions to this deal:

  1. This is great. Anything to make Ubuntu money. I intend to keep Yahoo! as the default to make Ubuntu money.
  2. Extra revenue is great, but why isn’t the user experience even considered when making this decision?

Obviously, I choose the latter.


How else can Linux fail in the consumer space?

Many Linux advocates and Linux bashers still think the success or failure of Linux in the consumer (not server or embedded) space rests on technical merits. Implementation, marketing, pricing, inertia, vendor lock-in—no, of course, those have nothing to do with whether people decide on Linux as opposed to Windows or Mac OS X. Would it help to work on the technical merits of Linux? Sure. Will that alone make Linux a success for consumers? Hardly. Technical merits will get technical users into it (Network admin, want a server? Use Linux. Hey, TiVo, want a free operating system for your DVR product? Use Linux).

Linux had a few good opportunities to succeed, but flubbed the execution:

  1. OLPC. When I heard about the One Laptop Per Child project, I got giddy. It was marketed as the $100 laptop. It was going to be durable. It was going to use Linux. It was going to help kids in developing countries learn. If that had been what really happened, Linux would have really taken off, at least in certain demographic segments of the world. What really happened? Well, the laptop was nowhere near $100. It was more like $200. And if rich folks wanted them, they had to pay $400 ($200 to get one, $200 to give one). It also was a pretty ugly laptop, with an extremely crippled version of Linux.
  2. Dell. When Dell started up its Idea Storm section, it probably had no idea the section would be bombarded by Linux users demanding Dell start offering Linux preinstalled. Well, Dell half-heartedly gave in and offered a couple of select models with Ubuntu preinstalled. This half-hearted effort doomed the new venture to failure. Dell hid Ubuntu away so no one could see it on their website without a direct link or clever Google searching. Dell priced the Ubuntu laptops more than spec-equivalent Windows laptops. Dell “recommended” Windows on all the Ubuntu laptop pages (it still does). Dell still used Linux-unfriendly hardware (Broadcom, anyone?). To sum up, Dell was not invested in really selling Linux preinstalled. It just wanted to sort of, kind of appease the Linux community (most of whom continue to buy the cheaper Windows-preinstalled laptops and then install Linux for themselves).
  3. Netbooks. I love the idea of netbooks. The execution was terrible, though. They were not heavily advertised. Early netbooks had 512 MB of RAM and 4 GB SSD drives with 7″ screens. The battery life was poor. The keyboards were cramped. The screen resolution was practically non-existent. Worse yet, all the OEMs included crippled versions of Linux… Linpus Linux Lite, Xandros… installing software became in reality the nightmare that Linux haters often misrepresent it to be. It would be like having apps for the iPhone without an App Store. Yes, you could install a regular Linux version yourself, but that’s not what the everyday consumer is going to do. Microsoft slammed the years-familiar XP down on netbooks, and—suffering from a bad implementation and no marketing or advocacy from OEMs—Linux on netbooks floundered.
  4. Android. In many ways, Android is actually a success. But it is not the success it could have been. When people were saying various Android phones could be the next “iPhone killer,” I thought, Hey, maybe they could be. We’ll see. I wasn’t surprised to see that the G1 did not kill the iPhone, the MyTouch didn’t kill the iPhone, the Hero didn’t kill the iPhone, nor did the Droid, nor did the Nexus One. I have a MyTouch 3G with Android, and I love my phone. I understand very well why it didn’t kill the iPhone, though. Apple understands how to make an excellent user experience, and Google doesn’t. That’s the bottom line. I’m not an Apple fanboy. I actually disagree with a lot of the design decisions Apple makes. What I don’t dispute is that Apple has a vision. Every decision, whether I agree with it or not, has a rationale that makes sense. Yes, there are pros and cons, and Apple weighed them and decided the pros outweighed the cons. With Android, though, and with various HTC phones using Android, I see various bad interface implementations that have no pros at all. I just don’t see anyone properly testing these things. For example, on the MyTouch and the Nexus, the speaker is on the back of the phone. Why? On some of the Android text dialogues, you have to tap into the text field (even if you have no hard keyboard) to get the onscreen keyboard to appear (shouldn’t it appear automatically if the text field is in focus?). Those are just a couple of examples.

Just yesterday, Steve Jobs announced the iPad to much ridicule. People made fun of the name. People said it would be useless without Flash, without a USB port, without a front-facing camera, without multi-tasking. They called it an oversized iPhone. They said the 4:3 aspect ratio wouldn’t be good for movies. The LED screen wouldn’t be good for reading in sunlight or for long periods of time.

I kind of liked it. I wasn’t overwhelmed by it. I wasn’t drooling. But I can see the appeal. It looks like a slick device, and it’s priced a lot lower than people thought it would be (most of the speculation saw it between $700 and $1000). If it’s a standalone device (doesn’t need to hook up or sync to a Windows or OS X computer with iTunes), I might consider it.

I would be curious to see if any OEM is going to step up to the plate here, though, and give Linux a real chance. I doubt it. It would be quite simple, though. Create a tablet just like the iPad (has to include proper multi-touch, though… no backing out for fear of so-called patent infringement, Google). Run a Linux-based operating system that is mainly open source (but can have some proprietary programs on it). Include multi-tasking. Include a proper software repository. Use a regular hard drive instead of SSD drive. Include USB ports. Have better screen resolution or a widescreen aspect ratio. Then price it just a little below the iPad… oh, and give it a proper name… one people won’t make fun of.

How simple is that? Will it happen? Probably not. A bunch of iPad imitators will pop up, sure. They’ll each have serious flaws, though. Many will lack multi-touch. Most will be too bulky. Some won’t have a sensible user interface. Some will be too expensive. Then I can tack it on as yet another way Linux has failed in the consumer space.

Mark Shuttleworth, if you’re reading this, it’s about time you realized Bug #1 gets fixed once you create a full and unified software-hardware user experience. Hordes of Windows users aren’t going to download the Ubuntu .iso, set their BIOSes to boot from CD, repartition their hard drives, install Ubuntu, and then troubleshoot hardware compatibility problems. You (or someone with your savvy and financial resources) need to be the open source Steve Jobs if Linux is going to succeed in the consumer space.


The Power of Defaults

I tend to see two extremes whenever there are arguments about what should be the default (I’m speaking specifically of arguments on the Ubuntu Forums, but this could be applied to really anything in technology or anything in life in general).

One extreme is that defaults don’t matter at all. It’s not worth arguing about. Just put whatever as the default. Then people can just choose to change the default to something else if they don’t like the default. The other extreme is that defaults matter enough to have 500-post forum threads about arguing back and forth. Somewhere in the middle is some sanity.

Defaults matter. But defaults are only defaults. People can choose options other than the defaults.

Why do defaults matter? Here are some examples:

  • I love that on my wife’s Macbook Pro, you just press the function keys, and they do something right away (lower the volume, adjust the brightness). My netbook by default needs to have the blue Fn key pressed in combination with the function keys to get that behavior to happen. I can easily change that. But if I change it, it’s confusing for anyone else using my netbook, because the instructions on the physical keys themselves indicate the function keys are normal functions and that you need the Fn key in combination in order to do anything. In other words, whole products sometimes have fixed parts built around the assumption that defaults will go unchanged.
  • I use VLC for playing individual sound bits or videos. When I dug into the settings for VLC, I didn’t understand half of what was there, and there were a lot of options to configure. Very confusing for a multimedia newbie like me. Good thing I didn’t have to configure all those settings. I just used the defaults. Sane defaults save the user from having to understand unnecessary minutiae.
  • As far as I can tell, every Linux user has a list of things she does immediately after a fresh installation. I usually change the wallpaper to a picture of my cat, replace Evolution with Thunderbird, add in some proprietary codecs, and delete the bottom Gnome panel. Sane defaults should make sense for most users. Even though I personally delete the bottom Gnome panel, the vast majority of Gnome users like to have both a top and bottom panel, so to have the top panel only wouldn’t make sense, because it would mean more people would have to take more time configuring things. If defaults are well-thought-out, less time is spent tinkering and adjusting and more time is spent using.
  • Linux live CDs can come in handy, especially if you need to help a Windows user recover deleted data. What happens in a default installation is the first impression that non-Linux user is going to have of Linux and may be the only impression she has. So if an ugly noise or splash screen appears, that’s the impression she’s going to get. It doesn’t matter if that noise or splash screen can be changed. Likewise, if you are using the live CD to show someone what Linux is like, you don’t want to have to “uninstall” and then “install” in the live session a whole bunch of software, especially if the computer you’re using has very little RAM.
  • And don’t forget that even though power users like to tinker and explore, most people just stick with defaults. 99% of Windows computers I see have the taskbar on the bottom, even though you can easily drag it to the top or the sides. 99% of Windows XP computers I see have the stupid blue theme, even though you can easily change to a silver or classic theme. Even though Firefox’s marketshare has skyrocketed in the past five years, Internet Explorer is still, globally, the most-used browser, ahead of Firefox, Opera, Chrome, and Safari. Its being the default web browser in Windows probably had something to do with that.

Yes, if you have an absolutely unbearable default, many people will probably just ditch it anyway, but instead of thinking “I’m so glad I have the freedom to change this setting,” they’ll most likely be thinking “What a terrible default! Who thought of that? Now we’re all going to have to change this!”

Sometimes defaults can have ethical considerations, too. For example, making people have to opt out of sharing information with a company or third-party corporation “partner” is a bad default (people should always have to opt in for that sort of thing), because it means if people forget to change the defaults or don’t investigate all of their basic settings and advanced settings, they will end up sharing more than they intended to share.
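In configuration terms, opt-in simply means the shipped default is “off.” A minimal sketch (the config file name and the setting key are hypothetical):

```shell
#!/bin/sh
# Hypothetical sketch: read a setting, falling back to the shipped default.
# share_usage_data defaults to "false", so sharing is strictly opt-in.
get_setting() {
    key=$1; default=$2
    conf="$HOME/.appconfig"   # hypothetical per-user config file
    if [ -r "$conf" ] && grep -q "^$key=" "$conf"; then
        sed -n "s/^$key=//p" "$conf" | head -n 1
    else
        echo "$default"       # no explicit choice means "no"
    fi
}

get_setting share_usage_data false
```

The ethical weight is all in that second argument: ship `false`, and forgetting to visit the settings page costs the user nothing.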

So if I see a bad default in Ubuntu, I’m going to make a point to say it’s a bad default. Good defaults matter. I will not, however, spend hours of my time arguing the point back and forth. Some things are a deal… they may not be a big deal, but they are still a deal.

Categories
Linux Ubuntu

The GUI v. CLI Debate

I’ve been helping with online tech support for Ubuntu for over four years now, and every now and then the discussion comes up about whether it’s “better” to use terminal command instructions or to use point-and-click instructions when offering help.

Inevitably, some die-hard CLI (command-line interface) fans come out and say that the terminal is “more powerful” and that every Linux user should learn to use the terminal, and then some die-hard GUI (graphical user interface) fans come out and say that the terminal is intimidating and that if Linux wants more users, it has to develop more graphical interfaces for things; and then you get the hardcore Linux users who claim they don’t care if Linux gets more users or not, etc., etc., ad nauseam.

The truth is that neither CLI nor GUI is always “better” than the other. There are appropriate situations for both CLI and GUI on a support forum. I hope everyone can agree that all common tasks should be able to be done in the CLI and through the GUI. Choice is ultimately what’s most important, so that those who prefer the CLI can use the CLI, and those who prefer the GUI can use the GUI.

But if I am offering help to new users, do I give GUI instructions or CLI instructions? It depends on what kind of support I’m giving.

When is GUI support appropriate?
If a new user wants to know how to do a basic task that she will probably repeat (or, if not the exact task, then at least something similar) in the future, then I will usually give point-and-click instructions to encourage that user to explore the GUI for that kind of task. For example, if a new user asks “How do I install Audacity?” then I am not going to say “Just use sudo apt-get install audacity.” Instead, I’ll tell her to use Applications > Ubuntu Software Center or Applications > Add/Remove, or just link her to this tutorial on how to install software. There are several reasons I do this:

  • Even though the apt-get command makes perfect sense to me, it is just cryptic gobbledygook to a new user, and it will not help her to install other software in the future unless I bother to explain how the command works; and, more importantly, even if she understands how apt-get works, she’ll still need to know the name of the package she wants to install in order to use the command most efficiently.
  • A lot of new Linux users (myself included, when I first started) have an irrational fear of the terminal, even if you tell them to copy and paste the command with a mouse (no typing necessary). Eventually, as they become more comfortable with the new environment that Gnome or KDE (or Xfce or whatever other user interface they’re exploring) has to offer, they are more likely to be amenable to learning terminal commands and even liking them.
  • Among Windows power users (the most likely group to migrate to an almost-unheard-of operating system that requires download, installation, and configuration by the user rather than the OEM), Linux distros already have a reputation for being too terminal-dependent. It’s great to show new users just how much can be done by pointing and clicking, and that will make their transition to Linux that much easier.
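For the curious, here is roughly what the CLI route looks like, and why it trips up newcomers: you have to know (or hunt down) the exact package name before the install command is any use. A sketch, using the Audacity example from above:

```shell
# Search the package archives first -- the package name is not always
# obvious from the application's name:
apt-cache search audacity
# Then install it; sudo asks for your password because installing
# software changes the whole system, not just your account:
sudo apt-get install audacity
```

Two commands, two concepts (package search and root privileges) that the Software Center hides behind a single search box and an Install button.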

Ah, some veteran forum members would protest, but what if I don’t want to bother making screenshots or typing out long point-and-click instructions that can be summed up in a single command? To that, I say if you’re too lazy to offer appropriate help, don’t offer help at all. Someone else will help. Or, better yet, find a good screenshot-laden tutorial and link to the tutorial instead (that’s actually how I started up my Psychocats Ubuntu tutorials site—I got tired of constantly retyping the same support posts over and over again, so I just made one place I could keep linking new users to).

I would say something similar to those who use Fluxbox or Enlightenment and want to primarily help those who use Gnome or KDE. If you aren’t familiar with the graphical environment the user you’re trying to help is using, don’t offer help in that instance. Save your help for when the CLI is appropriate.

When is CLI support appropriate?
The GUI may be fine for common tasks (installing software, launching applications, managing files and folders), but what if someone runs into a problem? What if the task at hand is not a common one but a one-time setup or configuration? If a new user says “When I try to launch Firefox, it just disappears,” there is no way I’m going to offer a point-and-click solution. Problems are best diagnosed with the CLI, and terminal commands (even for GUI applications) are more likely to yield helpful error messages. Likewise, if her wireless card isn’t recognized properly or fixed by System > Administration > Hardware Drivers, it isn’t a crime to walk her through manually editing configuration files, because once the problem is fixed, she should never have to do that again.
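To make that concrete, here is roughly what I’d ask that Firefox user to run (a sketch; `firefox` stands in for whatever application is misbehaving):

```shell
# Start the application from a terminal instead of the menu, so any
# error messages appear on screen instead of vanishing with the window:
firefox
# If output scrolls by too fast, save it to a file to post on the forum.
# 2>&1 merges error messages into the same file as normal output:
firefox > firefox-output.txt 2>&1
# Show the last few lines, where the crash message usually is:
tail firefox-output.txt
```

The error text that ends up in that file is usually worth more than ten screenshots of a window that isn’t there.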

If you do offer CLI solutions to problems, though, as much as possible try to explain what these commands mean or do. You don’t have to copy and paste in a whole man page (in fact, that probably won’t be helpful at all—I’ve been using Linux for years and have yet to find a man page I actually understand). Just keep in mind that to many new users, terminal commands are like a foreign language they can’t even say hello or thank you in.

CLI and GUI aren’t going away any time soon. One is a hammer. One is a screwdriver. No one tool will suit everyone best at all times. Use what’s appropriate. Appreciate that what you like or prefer may not be what someone else likes or prefers.