Friday, January 09, 2009

2009: Year of the Linux Handheld?

Palm's new Pre handheld will run a Linux-based OS

A running inside joke on Slashdot is the "Year of the Linux Desktop", the perennial prediction that the upcoming year will finally be the mythical time when Linux achieves wide-scale adoption on desktop PCs. Although the number of PCs running Linux has increased each year, an emerging trend is that Linux is becoming popular on consumer handheld devices.

The reason for Linux's rise in popularity on cellphones is that manufacturers (e.g. Motorola) are being forced to ditch the simpler, proprietary operating systems that run on the "dumbphones" they sell, because a more powerful operating system is needed to handle better multitasking and internet connectivity. Windows Mobile has been a popular pick with manufacturers because it meets the needs of smartphones, but its licensing costs are enough to make manufacturers like HTC experiment with other platforms, as HTC did with Google's Android on the G1.

Similarly, Palm has just announced its brand new Pre smartphone, which just so happens to be powered by Linux. Palm's "OS 2.0" (or webOS, as it is now known) has been in the works for as long as anyone can remember, and if you check out Engadget's videos of the Pre, you'll see just how hard they were working.

With devices like the Palm Pre and the upcoming swath of Android phones, 2009 might be the year that Linux starts challenging Microsoft's dominance in the smartphone arena. At the same time, it will be interesting to see how Linux holds up in the ongoing netbook race as well.

If devices like the Asus Eee PC continue to ship with Linux, it would not be surprising if Microsoft started offering manufacturers crippled versions of Vista for netbooks at a significant discount. If Microsoft did that, I think we'd see Linux's market share on netbooks decline: even a crippled Vista would be easier for the average Joe to use for email/web/IM than Linux, and Linux's cost advantage would shrink. Still, Microsoft already missed the boat with netbooks - there is no "Windows Vista Netbook Edition", and one would have needed to exist early last year for Microsoft to crush Linux on netbooks. Instead, manufacturers found that companies like Xandros were willing to provide small, fast, and flexible Linux distributions tailored to their netbook target audience, while Microsoft wasn't interested. This was undoubtedly a good thing for Linux adoption.

The Linux-powered Pandora gaming handheld is due out in Q1 2009. (Yes, this render looks like it was made in 3D Studio Max circa 1998...)

In 2009, Linux will also see interesting applications in devices like the Pandora gaming handheld (a spiritual successor to the GP2X). A 600 MHz ARM processor, 800x480 screen, 802.11g, Bluetooth, USB 2.0 host, and a purported 10 hours of battery life - This handheld sounds like a gamer's dream.

However, the success of any handheld device hinges on one thing being done superbly well: software, software, software. If the games for the Pandora are bad, not even hackability can save it. If the software on the Palm Pre turns out to be garbage, same fate. A pattern starts to emerge, though - Linux has been successful on netbooks because the software stack of Firefox/Thunderbird/Pidgin is rock solid. The software that tackles the primary use cases of netbooks is fantastic (in large part because Firefox, Thunderbird, and Pidgin are simply mature, well-run software projects). The tricky part with the Pandora is that its primary use case (gaming) does not have a fantastic stack of off-the-shelf open source software to draw on, beyond perhaps emulators. No offense, but SuperTux isn't exactly what I'd like to be playing on a Pandora.

There is nothing magical about running Linux that will make a device better for end-users. It can cut down development and licensing costs, but ultimately the fate of Linux on handhelds in 2009 will come down to the quality of the software that runs on it. Let's hope it goes better than last year.

Sunday, September 07, 2008

Has Linux lost the ISV battle?

For as long as anyone can remember, one of the big problems with Linux has been the lack of commercial applications. Independent software vendors (ISVs) are generally sticking to Windows or OS X, resulting in very little commercial software being available for Linux. Free software ideologies aside, there are many commercial applications that Linux would benefit from being able to run.

As a cross-platform software developer, I've found there are many challenging issues unique to developing on Linux. First and foremost is the issue of binary compatibility.

In order to build an x86 Linux binary that runs on as many desktop Linux distributions as possible, the most widely documented procedure is to simply build your application on the oldest distribution you can find. Most Linux libraries are backwards-compatible, meaning an application compiled against an older version will run with newer versions of the library. In theory, this seems like a reasonable way to make a universal Linux binary. In practice, things are very different - Do you statically link, or dynamically link and bundle the libraries? How exactly does one do all of this? Is it practical to roll this procedure into your build system? Furthermore, where is the official documentation for this procedure? What are the best practices for producing a universal x86 Linux binary?
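In practice, the main constraint is glibc symbol versioning: a binary won't start on a system whose glibc is older than any version stamp it references, which is exactly why "build on the oldest distro you can find" works. A rough way to inspect what a binary actually demands (a sketch assuming binutils is installed; /bin/ls is just an example target):

```shell
#!/bin/sh
# Sketch: list the glibc symbol versions a binary references.
# A binary built on an old distribution will show only old
# GLIBC_* stamps, so it runs on newer systems too.
BIN=/bin/ls   # any dynamically linked executable works here
if command -v objdump >/dev/null 2>&1; then
    versions=$(objdump -T "$BIN" 2>/dev/null \
        | grep -o 'GLIBC_[0-9.]*' | sort -u)
else
    versions="(objdump not installed - binutils required)"
fi
versions=${versions:-"(none found - possibly statically linked)"}
echo "$BIN requires: $versions"
```

The higher the newest GLIBC_* stamp, the newer the oldest distribution your binary can run on.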

Another issue is software distribution. As a commercial software developer, how do you distribute your software to as many customers as possible? You'd need to create DEB and RPM packages, and probably have some generic graphical installer package as well. On Windows, a single installer .EXE will install on 2000, XP, Vista, etc. On Linux, you either need to create tons of packages, or you have to limit your customer base by creating packages for only the most popular distros. Already you've multiplied the amount of effort required to develop for Linux manyfold.
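To give a sense of the per-format effort, here is a minimal sketch of staging a binary .deb by hand. The package name and every control field below are made-up illustration values, and an RPM would need a completely separate .spec file and rpmbuild run - exactly the duplicated effort described above:

```shell
#!/bin/sh
# Sketch: stage and build a minimal binary .deb by hand.
# "myapp" and all control fields are made-up example values.
set -e
stage=$(mktemp -d)/myapp_1.0-1
mkdir -p "$stage/DEBIAN" "$stage/usr/bin"

# The payload: a trivial stand-in for a real application binary.
printf '#!/bin/sh\necho hello from myapp\n' > "$stage/usr/bin/myapp"
chmod 755 "$stage/usr/bin/myapp"

# Every binary package needs a DEBIAN/control file.
cat > "$stage/DEBIAN/control" <<'EOF'
Package: myapp
Version: 1.0-1
Section: utils
Priority: optional
Architecture: all
Maintainer: Example Dev <dev@example.com>
Description: Toy package illustrating .deb structure
EOF

# dpkg-deb does the actual packing (guarded in case it's absent).
if command -v dpkg-deb >/dev/null 2>&1; then
    dpkg-deb --build "$stage"
fi
```

Multiply this by every distro family and architecture you support, and the contrast with a single Windows installer .EXE is clear.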

Additionally, you're fighting an uphill battle against open source software. If Adobe made a version of Photoshop for Linux (and allowed you to buy DEBs from their site), most people would still just install GIMP through Synaptic or Ubuntu's Add/Remove Applications dialog. Even worse (for Adobe), GIMP is installed on every Ubuntu system by default. The best Adobe can hope for is to offer DEBs via their site, and hope that people have a priori knowledge of their product, and go to their website to buy it. There is no "App Store" for Ubuntu, and perhaps there should be because distributions certainly don't make it easy to sell your software for Linux.

Many of the issues hindering independent software vendors from developing applications for Linux could be alleviated by much better organization by the Linux Foundation. When I first heard about the Linux Standard Base (LSB), I was excited at the prospect of finally having universal x86 binaries for Linux - perhaps, I thought, it would finally open the door for more commercial Linux applications. However, to date, I can count the number of LSB-certified applications on one hand. Mass adoption of Linux by ISVs did not happen, and I can't say that I blame them. If I were a developer coming from Windows, I wouldn't know where to start. The Linux Foundation's getting started guide includes an article on porting your application to the LSB, but that's for existing Linux applications, not for applications that already run on another platform.

If I were a developer looking to write a new application on Linux, I would not know where to begin. Do I use GTK or Qt? wxWidgets? What are the standard system libraries on Linux? Where is everything documented? There is no central documentation repository that guides Linux developers and provides answers to these questions. Windows developers have MSDN, OS X developers have Apple's Developer Connection, Linux developers have nothing but a bunch of scattered webpages, each trying to convince you that their library is the best one to use. This is not a productive approach, and an organization like the Linux Foundation should make a serious effort to give developers the information they need to develop their applications quickly. You can't expect developers coming from Windows to figure out which libraries to use by googling for answers - There needs to be some centralized site that provides developers with the answers they need. To me, this highlights the lack of leadership in the Linux desktop community.

In the kernel, it's very clear who is in charge. There is a clear chain of command, and this allows the kernel developers to work as an effective organization. Within userspace (i.e. libraries and software applications), we do not see the same command structure. We have Freedesktop.org creating "standards" and backend software for the Linux platform, but it says of the software it hosts:

None of this is "endorsed" by anyone or implied to be standard software, remember that freedesktop.org is a collaboration forum, so anyone is encouraged to host stuff here if it's on-topic.


What desktop Linux appears to have is a plethora of organizations acting independently (creating libraries, etc.), with no clear cross-organization leadership. Freedesktop.org has been successful in getting many of these organizations to cooperate and has undoubtedly resulted in an improved desktop Linux (see HAL and DBus), but it doesn't seem to be preaching a clear vision to those organizations, nor providing any guidance for new developers wishing to take advantage of the new technologies it has fostered.

Many of my views presented here have been shaped by my experience writing proprietary software on an embedded Linux platform. I've worked with engineers who've never written software for Linux, and they have a hard time finding answers to their questions because Linux doesn't have anything like MSDN. It's very easy to make bad decisions about which libraries to use on Linux due to the lack of centralized documentation.

The nightmare of binary compatibility, lack of support from Linux distributions, and the absence of centralized documentation and guidance for Linux software developers make it a difficult and expensive platform to develop on. It's a great platform to slap together little applications on, but when you have to deal seriously with the issues that independent software vendors have to deal with when developing desktop applications, Linux as a platform simply isn't worth the effort.


* Update: According to Phoronix, CyberLink's DVD-playing software appears to be for sale for Ubuntu in the Canonical Store. Two thoughts on this:
  1. Might this be the start of the Ubuntu app store?
  2. CyberLink is probably just experimenting with this; they don't expect to make a lot of money from it. They make their money from distribution deals (getting bundled with DVD-ROM drives, or more recently with Linux-based netbooks/MIDs), not from selling their software in stores. I suspect this is also an experiment on Canonical's part, as they gauge the response of Ubuntu users and find the best way to integrate this into Ubuntu (hopefully in Add/Remove Applications one day).

Ubuntu and the ASUS P5Q-E Motherboard

I built a brand new PC a few weeks ago, and getting my ASUS P5Q-E motherboard to work in Linux took a few tweaks. I had taken the hard drive out of my old PC and dropped it right into the new one, expecting it to work. Ubuntu started booting, but it hung at the early bootup splash screen, where the progress bar bounces back and forth. GRUB had managed to boot the kernel image, but something was wrong - the kernel couldn't find my hard disks, so it wasn't booting.

To solve this problem, I had to change the following BIOS options:

  1. Under MAIN / Storage Configuration, I had to change "Configure SATA as ..." to [AHCI]. This allowed the kernel to find my disks and boot.
  2. I experienced some weird USB problems while booting, and so under ADVANCED / USB Configuration, I had to change "BIOS EHCI Hand-Off" to [Disabled].
  3. For good luck, I also made sure ACPI 2.0 was enabled under power saving.
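Once Ubuntu is up, a rough way to confirm the kernel is actually using the ahci driver for your disks is a sketch like the one below. The check is approximate: on kernels with AHCI compiled in rather than built as a module, /proc/modules won't list it.

```shell
#!/bin/sh
# Rough check: is the ahci driver loaded as a kernel module?
# (On kernels with AHCI built in, /proc/modules won't show it.)
if [ -r /proc/modules ] && grep -q '^ahci ' /proc/modules; then
    sata_mode="ahci module loaded"
else
    sata_mode="ahci not listed (may be built into the kernel, or SATA is in IDE mode)"
fi
echo "SATA driver status: $sata_mode"
```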
Hopefully someone finds this useful. When Ubuntu didn't boot at first, I thought to myself, "Damn, I should have checked whether this new Intel chipset has good support in the kernel". I was worried the motherboard just wasn't going to work. However, after tweaking those BIOS options, things are working fine. +1 for Linux.

Random things

It's been a while since my last update, but I'd like to start practicing writing again, so I'd better start blogging more often. Since 2006, I've been involved with a growing open source project and that's been eating up most of my free time.

In the meantime, I'm still a die-hard Linux fan, although I haven't kept up with the latest and greatest stuff as much. I've also recently started experimenting with (drumroll) Windows Vista, and I've been impressed overall with it. My wireless USB stick has the same problems in Vista as it does in Linux, so I guess that's a good thing for Linux (?!). :)

Also on my list of random things to write about is Phoronix. Phoronix is a well-written Linux news site that's written and organized in a style that's aimed at Linux hardware enthusiasts. The editor(s) there do an excellent job of covering the latest and greatest developments in the Linux software world as well. It's definitely worth adding to your RSS reader.

Aha, I just remembered a topic for a post I wanted to write (I have most of the post written down on a napkin here). Next up, how to make Linux work with your ASUS P5Q-E motherboard. Stay tuned.

Sunday, June 22, 2008

To iPhone or not to iPhone...


$199 USD for a 3G iPhone, with a soul-stealing ridiculously-priced contract.

Is it worth it?

I've been considering getting an iPhone when it launches in Canada on July 11th, but rumours indicate that it's going to cost about $90 CAD/month for service with Rogers. I currently pay about $10 CAD/month for a cheapo prepaid cellphone. In 2 months I'll be moving across the country, so now is a convenient time to reconsider my options. I've decided that if I were to get an iPhone, it'd replace my landline. Does it make it any more affordable? No, $90/month on a contract for 3 years still seems insane to me.
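Spelled out over the 3-year term, using the monthly prices quoted above:

```shell
#!/bin/sh
# Back-of-the-envelope cost comparison over a 36-month contract,
# using the monthly prices quoted above (CAD, before taxes/fees).
iphone_total=$((90 * 36))     # $90/month rumoured Rogers iPhone plan
prepaid_total=$((10 * 36))    # $10/month cheapo prepaid phone
diff=$((iphone_total - prepaid_total))
echo "iPhone over 3 years:  \$${iphone_total}"
echo "prepaid over 3 years: \$${prepaid_total}"
echo "difference:           \$${diff}"
```

That's roughly a $2,880 premium over the contract, which dropping the landline only partially offsets.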

The massive draw of the iPhone for me is the software. The sheer number of cool applications that are going to be released for it makes it very appealing, plus the stock software (email, web browser) is top notch. On the other hand, it's a nightmare if you want to make it sync with Linux.

What're my other options? Stick with the $10 CAD/month cheapo phone, and possibly invest in something else for mobile internet. This is where Linux comes back into play...



The Nokia N810 WiMAX Edition has caught my eye. A little-known fact is that Canada already has a nation-wide WiMAX network, with access offered by both of our big telcos (as Rogers Portable Internet and Bell Sympatico Unplugged). For about $60 CAD/month (tax inc.), it looks like I can get 1.5 Mbps WiMAX, which I believe will work with the N810. As an added bonus, I could just use this WiMAX connection as my internet connection at home too.

Lastly, did I mention that the N810 runs Linux? Nokia's device runs OS 2008 and it looks Android-ready too. I'm a fan of Linux-based embedded devices (like the GP2X), so this adds a bit of hackability to the thing. Unfortunately, the N810 WiMAX Edition doesn't look like it comes out for another month, so we'll have to see what the reviews are like when it's released. Until then, I'll keep pondering...

Tuesday, May 13, 2008

Debian Bug Screws us All

This morning, I spotted this nasty tidbit on Slashdot: Debian Bug Leaves Private SSL/SSH Keys Guessable

It turns out a maintainer of the OpenSSL package on Debian removed the "seeding" of the random number generator that is used to generate, among other things, SSH keys. For those unfamiliar with random number generators, they work by generating a sequence of pseudo-random numbers based on some initial seed. The default value most programmers use when seeding their random number generators is simply the current time, because it changes quickly and ensures a great deal of variability in what the generated sequence will look like. If you seed your random number generator with the same number every time, you'll end up with the same sequence of numbers being generated over and over again - it won't be random at all!
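The effect is easy to demonstrate. The sketch below uses awk's srand()/rand() (not OpenSSL's generator) purely to illustrate the principle: seed with the same value and you get the exact same "random" sequence every time.

```shell
#!/bin/sh
# Demonstrate why the seed matters: identical seeds reproduce
# identical "random" sequences. awk's srand()/rand() stand in
# for any pseudo-random generator here.
gen() {
    awk -v seed="$1" 'BEGIN {
        srand(seed)
        for (i = 0; i < 5; i++) printf "%d ", int(rand() * 1000)
    }'
}
a=$(gen 42)
b=$(gen 42)   # same seed as above
c=$(gen 43)   # different seed
echo "seed 42: $a"
echo "seed 42: $b  <- identical: not random at all"
echo "seed 43: $c"
```

An attacker who can guess the seed can regenerate every key the generator ever produced, which is exactly what made the Debian keys guessable.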

What this means to Debian users is that your SSH keys are not random, and they're much easier to crack/guess because of this. Because Ubuntu is based on Debian, and the OpenSSL packages are relatively untouched by the Ubuntu maintainers, this bug also affects all Ubuntu users.

Both Debian and Ubuntu have released security updates which fix the problem and ensure that any future keys that are generated have the expected level of security. However, keys that have already been generated need to be expired and replaced.

Fortunately, the Ubuntu update that you will receive through update-manager takes care of this for you. For a desktop user, this is sufficient. For system administrators who might use SSH keys widely, it's a massive pain in the ass.
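For an individual user key, the fix amounts to generating a fresh keypair and redistributing the public half. A sketch (writing into a temporary directory so nothing real is overwritten; 2048-bit RSA is an arbitrary illustrative choice):

```shell
#!/bin/sh
# Sketch: generate a replacement SSH keypair. Output goes to a
# temp directory here; in practice you'd target ~/.ssh/id_rsa and
# then push the new .pub to every authorized_keys holding the old
# key - multiplied across hosts, hence the sysadmin pain.
set -e
dir=$(mktemp -d)
if command -v ssh-keygen >/dev/null 2>&1; then
    # -N '' = no passphrase (demo only), -q = quiet, -f = key path
    ssh-keygen -t rsa -b 2048 -N '' -q -f "$dir/id_rsa"
    keygen_ran=1
    ls -l "$dir"
else
    keygen_ran=0
    echo "ssh-keygen not installed"
fi
```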

There's much more to this story though - A Slashdot user dug up the original Debian bug report that led to the "fix" that removed the seeding. The OpenSSL developers used uninitialized memory to seed their random number generator, which caused a warning in Valgrind that someone playing with the code noticed. Valgrind is a tool to help find memory leaks and memory corruption (i.e. programmers' mistakes). However, the original code wasn't erroneous at all - the programmer who wrote it must have assumed that uninitialized memory has more "randomness" than the time alone, which may or may not have been a good assumption. Regardless, it's clear that the real mistake was "fixing" this code without fully understanding the consequences, and the Debian package maintainers (and the developer who submitted the patch) are at fault.

What happened here is a nightmare scenario for an open source software developer like myself. When my development team makes a release, we do some distribution packaging ourselves, but we also count on other people to make packages for other distributions. If any of those package maintainers modify our software in any way, we can no longer guarantee the quality of our software. If this sounds familiar, you might remember that this is exactly why Mozilla wanted Debian to stop using the Firefox name back in 2006. Mozilla wanted to ensure that all of their users got "Firefox", not "Firefox plus Joe Blow's crappy tweaks", and I completely agree with them. As more open source projects grow and become professionally run, I can see this becoming a more common issue in the future.

Finally, there is the question of whether or not the OpenSSL vulnerability was introduced intentionally. To give the poor guy the benefit of the doubt, I think it was an honest mistake. He fixed something that he thought was broken, and it turns out he was wrong - an understandable, human mistake. The uploader that approved his change probably should have caught the mistake, but again, he too was also only human. Is it possible that this was intentional? Sure. Is it possible that this could be used as a blueprint for future open source sabotage? Absolutely.

Are we any less likely to see security flaws introduced like this again? Absolutely not. It's the process through which packages are maintained and updated that is broken here.

The solution? Discuss.


Disclaimer: Don't go on a witch-hunt for the Debian guys who made mistakes here. Stuff like this happens, and if it were you or I in their shoes, we may have made the exact same mistake. I stress once again that it's the process of distribution package management that is flawed, not the people involved.

Saturday, May 10, 2008

Blogger Captcha Cracked?

I just received 50+ emails from Blogger saying new comments were posted on my blog:



Up until now, the spam situation with Blogger was decent. I'd only ever had the odd spam comment come through, and I had been able to delete them all. With this massive barrage, though, I don't have the patience to go through and delete them all, especially knowing that it can happen again.

For now, I'll just hope that Blogger deletes the spammer's account and all of the comments he posted.

Sunday, May 04, 2008

Toy Story, Cave Story, .... Ubuntu Story ?

About once a month, I receive an email solicitation asking me to promote something on my blog. 9/10 times I just ignore it, because they're usually asking me to promote a spam blog. Not cool.



It looks like we did better with this month's email. Rather than a pointless spam blog, this one was about UbuntuStory.com, which seems like a friendly advertisement for Ubuntu by a community member. I'm probably preaching to the choir here, but if you haven't tried Ubuntu yet, I'd check out the site.

Sorry, your first impression from that site will be wrong - Ubuntu won't take you on a wild African adventure, but it can make your computer much less of a pain in the ass to use. :)