Sunday, September 07, 2008

Has Linux lost the ISV battle?

For as long as anyone can remember, one of the big problems with Linux has been the lack of commercial applications. Independent software vendors (ISVs) are generally sticking to Windows or OS X, resulting in very little commercial software being available for Linux. Free software ideologies aside, there are many commercial applications that Linux would benefit from being able to run.

As a cross-platform software developer, I've found that Linux poses many challenges all its own. First and foremost, there is the issue of binary compatibility.

In order to build an x86 Linux binary that runs on as many desktop Linux distributions as possible, the most widely documented procedure is to simply build your application on the oldest distribution you can find. Most Linux libraries are backwards-compatible, meaning an application compiled against an older version will run with newer versions of the library. In theory, this seems like a reasonable way to make a universal Linux binary. In practice, things are very different - Do you statically link, or dynamically link and bundle the libraries? How exactly does one do all of this? Is it practical to roll this procedure into your build system? Furthermore, where is the official documentation for this procedure? What are the best practices for producing a universal x86 Linux binary?

Another issue is software distribution. As a commercial software developer, how do you distribute your software to as many customers as possible? You'd need to create DEB and RPM packages, and probably have some generic graphical installer package as well. On Windows, a single installer .EXE will install on 2000, XP, Vista, etc. On Linux, you either need to create tons of packages, or you have to limit your customer base by creating packages for only the most popular distros. Already you've multiplied the effort required to develop for Linux many times over.
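To illustrate just the DEB half of that burden, here's the minimum viable package. Everything in it - package name, version, maintainer - is a made-up example, not a recipe for any real product:

```shell
# Lay out the package tree; DEBIAN/control is the only mandatory metadata:
mkdir -p myapp_1.0/DEBIAN myapp_1.0/usr/bin
cat > myapp_1.0/DEBIAN/control <<'EOF'
Package: myapp
Version: 1.0
Architecture: amd64
Maintainer: Example Developer <dev@example.com>
Description: Hypothetical commercial application
EOF

# A stand-in for your compiled binary:
printf '#!/bin/sh\necho myapp\n' > myapp_1.0/usr/bin/myapp
chmod +x myapp_1.0/usr/bin/myapp

# On a Debian-based system this produces myapp_1.0.deb:
command -v dpkg-deb >/dev/null && dpkg-deb --build myapp_1.0 || echo "dpkg-deb not installed"
```

And that buys you exactly one package format - the RPM side needs a separate spec file and an rpmbuild run, which is precisely the duplicated effort at issue.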

Additionally, you're fighting an uphill battle against open source software. If Adobe made a version of Photoshop for Linux (and allowed you to buy DEBs from their site), most people would still just install GIMP through Synaptic or Ubuntu's Add/Remove Applications dialog. Even worse (for Adobe), GIMP is installed on every Ubuntu system by default. The best Adobe can do is offer DEBs via their site and count on people already knowing about the product and seeking it out. There is no "App Store" for Ubuntu, and perhaps there should be, because distributions certainly don't make it easy to sell your software for Linux.

Many of the issues hindering independent software vendors from developing applications for Linux could be alleviated by much better organization by the Linux Foundation. When I originally heard about the Linux Standard Base (LSB), I was excited at the prospect of finally having universal x86 binaries for Linux; perhaps, I thought, it would finally open the door for more commercial Linux applications. However, to date, I can count the number of LSB-certified applications on one hand. Mass adoption of Linux by ISVs did not happen, and I can't say that I blame them. If I were a developer coming from Windows, I wouldn't know where to start. The Linux Foundation's getting started guide includes an article on porting your application to the LSB, but that's for existing Linux applications, not for applications that already run on another platform.

If I were a developer looking to write a new application on Linux, I would not know where to begin. Do I use GTK or Qt? wxWidgets? What are the standard system libraries on Linux? Where is everything documented? There is no central documentation repository that guides Linux developers and provides answers to these questions. Windows developers have MSDN, OS X developers have Apple's Developer Connection, Linux developers have nothing but a bunch of scattered webpages, each trying to convince you that its library is the best one to use. This is not a productive approach, and an organization like the Linux Foundation should make a serious effort to give developers the information they need to develop their applications quickly. You can't expect developers coming from Windows to find the right libraries by googling for answers - There needs to be some centralized site that provides developers with the answers they need. To me, this highlights the lack of leadership in the Linux desktop community.

In the kernel, it's very clear who is in charge. There is a clear chain of command, and this allows the kernel developers to work as an effective organization. Within userspace (i.e., libraries and software applications), we do not see the same structure. We have Freedesktop.org creating "standards" and backend software for the Linux platform, but of the software it hosts, it claims:

None of this is "endorsed" by anyone or implied to be standard software, remember that freedesktop.org is a collaboration forum, so anyone is encouraged to host stuff here if it's on-topic.


What desktop Linux appears to have is a plethora of organizations acting independently (creating libraries, etc.), with no clear cross-organization leadership. Freedesktop.org has been successful in getting many of these organizations to cooperate and has undoubtedly resulted in an improved desktop Linux (see HAL and DBus), but it doesn't seem to be preaching a clear vision to those organizations, nor providing any guidance for new developers wishing to take advantage of the new technologies it has fostered.

Many of my views presented here have been shaped by my experience writing proprietary software on an embedded Linux platform. I've worked with engineers who've never written software for Linux, and they have a hard time finding answers to their questions because Linux doesn't have anything like MSDN. It's very easy to make bad decisions about what libraries to use on Linux due to the lack of centralized documentation.

The nightmare of binary compatibility, lack of support from Linux distributions, and the absence of centralized documentation and guidance for Linux software developers make it a difficult and expensive platform to develop on. It's a great platform to slap together little applications on, but when you have to deal seriously with the issues that independent software vendors have to deal with when developing desktop applications, Linux as a platform simply isn't worth the effort.


* Update: According to Phoronix, CyberLink DVD playing software appears to be for sale for Ubuntu in the Canonical Store. Two thoughts on this:
  1. Might this be the start of the Ubuntu app store?
  2. CyberLink is just experimenting with this; they don't expect to make a lot of money from it. They make their money from distribution deals like getting bundled with DVD-ROMs, or more recently, getting bundled with Linux-based Netbooks/MIDs, not from selling their software in stores. I suspect this is also an experiment on Canonical's part, as they gauge the response of Ubuntu users and find the optimal way to integrate this into Ubuntu (hopefully in Add/Remove Applications one day).

Ubuntu and the ASUS P5Q-E Motherboard

I built a brand new PC a few weeks ago, and getting my ASUS P5Q-E motherboard to work in Linux took a few tweaks. I had taken my hard drive out of my old PC and dropped it right into the new PC, expecting it to work. Ubuntu managed to start booting, but it hung at the earliest bootup splash screen, where the progress bar bounces back and forth. GRUB had managed to boot the kernel image, but something was wrong - The kernel couldn't find my hard disks, so it wasn't booting.

To solve this problem, I had to change the following BIOS options:

  1. Under MAIN / Storage Configuration, I had to change "Configure SATA as ..." to [AHCI]. This allowed the kernel to find my disks and boot.
  2. I experienced some weird USB problems while booting, and so under ADVANCED / USB Configuration, I had to change "BIOS EHCI Hand-Off" to [Disabled].
  3. For good measure, I also made sure ACPI 2.0 was enabled under power saving.
Hopefully someone finds this useful. When Ubuntu first failed to boot, I thought to myself, "Damn, I should have checked whether this new Intel chipset has good support in the kernel". I was worried the motherboard just wasn't going to work. However, after tweaking those BIOS options, things are working fine. +1 for Linux.

Random things

It's been a while since my last update, but I'd like to start practicing writing again, so I'd better blog more often. Since 2006, I've been involved with a growing open source project, and that's been eating up most of my free time.

In the meantime, I'm still a die-hard Linux fan, although I haven't kept up with the latest and greatest stuff as much. I've also recently started experimenting with (drumroll) Windows Vista, and I've been impressed overall with it. My wireless USB stick has the same problems in Vista as it does in Linux, so I guess that's a good thing for Linux (?!). :)

Also on my list of random things to write about is Phoronix. Phoronix is a well-written Linux news site organized in a style aimed at Linux hardware enthusiasts. The editor(s) there do an excellent job of covering the latest and greatest developments in the Linux software world as well. It's definitely worth adding to your RSS reader.

Aha, I just remembered a topic for a post I wanted to write (I have most of the post written down on a napkin here). Next up, how to make Linux work with your ASUS P5Q-E motherboard. Stay tuned.

Sunday, June 22, 2008

To iPhone or not to iPhone...


$199 USD for a 3G iPhone, with a soul-stealing ridiculously-priced contract.

Is it worth it?

I've been considering getting an iPhone when it launches in Canada on July 11th, but rumours indicate that it's going to cost about $90 CAD/month for service with Rogers. I currently pay about $10 CAD/month for a cheapo prepaid cellphone. In 2 months I'll be moving across the country, so now is a convenient time to reconsider my options. I've decided that if I were to get an iPhone, it'd replace my landline. Does it make it any more affordable? No, $90/month on a contract for 3 years still seems insane to me.

The massive draw with the iPhone for me is the software. The sheer number of cool applications that are going to be released for it makes it very appealing, plus the stock software (email, web browser) is top notch. On the other hand, it's a nightmare if you want to make it sync with Linux.

What're my other options? Stick with the $10 CAD/month cheapo phone, and possibly invest in something else for mobile internet. This is where Linux comes back into play...



The Nokia N810 WiMAX Edition has caught my eye. A little known fact is that Canada already has a nation-wide WiMAX network, with access offered by both of our big telcos (as Rogers Portable Internet and Bell Sympatico Unplugged). For about $60 CAD/month (tax inc.), it looks like I can get 1.5 Mbps WiMAX which I believe will work with the N810. As an added bonus, I could just use this WiMAX connection as my internet connection at home too.

Lastly, did I mention that the N810 runs Linux? Nokia's device runs OS 2008 and it looks Android-ready too. I'm a fan of Linux-based embedded devices (like the GP2X), so this adds a bit of hackability to the thing. Unfortunately, the N810 WiMAX Edition doesn't look like it comes out for another month, so we'll have to see what the reviews are like when it's released. Until then, I'll keep pondering...

Tuesday, May 13, 2008

Debian Bug Screws us All

This morning, I spotted this nasty tidbit on Slashdot: Debian Bug Leaves Private SSL/SSH Keys Guessable

It turns out a maintainer of the OpenSSL package on Debian removed the "seeding" of the random number generator that is used to generate, among other things, SSH keys. For those unfamiliar with random number generators, they work by generating a sequence of pseudo-random numbers based on some initial seed. The default value most programmers use when seeding their random number generators is simply the time, because it changes quickly and ensures a great deal of variability in what the generated random sequence of numbers will look like. If you seed your random number generator with the same number every time, you'll end up with the same sequence of numbers being generated over and over again - It won't be random at all!
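The point is easy to demonstrate with bash's built-in $RANDOM (a bash-ism, not POSIX): assigning to RANDOM seeds its generator, and reusing a seed replays the exact same "random" sequence.

```shell
RANDOM=42; first="$RANDOM $RANDOM $RANDOM"    # seed, then draw three numbers
RANDOM=42; second="$RANDOM $RANDOM $RANDOM"   # same seed -> same three numbers
echo "$first"
echo "$second"
[ "$first" = "$second" ] && echo "not random at all"
```

A fixed or predictable seed is exactly what makes the generated keys guessable: an attacker who knows the seeding scheme can simply replay it.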

What this means to Debian users is that your SSH keys are not random, and they're much easier to crack/guess because of this. Because Ubuntu is based on Debian, and the OpenSSL packages are relatively untouched by the Ubuntu maintainers, this bug also affects all Ubuntu users.

Both Debian and Ubuntu have released security updates which fix the problem and ensure that any future keys that are generated have the expected level of security. However, keys that have already been generated need to be expired and replaced.

Fortunately, the Ubuntu update that you will receive through update-manager takes care of this for you. For a desktop user, this is sufficient. For system administrators who might use SSH keys widely, it's a massive pain in the ass.
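For anyone doing that cleanup by hand, the user-key half is a one-liner with standard OpenSSH tools. The path below is a throwaway location for illustration; in real life you'd point -f at your actual key (e.g. ~/.ssh/id_rsa), and host keys additionally need regenerating as root.

```shell
# Generate a fresh RSA keypair with no passphrase prompt (-N '') into a
# scratch path; adjust -f when replacing a real key:
key="$(mktemp -u /tmp/id_rsa_XXXXXX)"
ssh-keygen -t rsa -b 2048 -f "$key" -N '' -q
ls "$key" "$key.pub"
```

Don't forget the other side of the relationship either: any authorized_keys entries pointing at a weak public key need the new key swapped in.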

There's much more to this story though - A Slashdot user dug up the original Debian bug report that led to the "fix" that removed the seeding. The OpenSSL developers used uninitialized memory to seed their random number generator, which caused a warning in Valgrind that someone playing with the code noticed. Valgrind is a tool to help find memory leaks and memory corruption (i.e., programmers' mistakes). However, the original code wasn't erroneous at all - The programmer who wrote it must have reasoned that uninitialized memory has more "randomness" than the time, which may or may not have been a good assumption. Regardless, it's clear that the real mistake was "fixing" this code without fully understanding the consequences, and the Debian package maintainers (and the developer who submitted the patch) are at fault.

What happened here is a nightmare scenario for an open source software developer like myself. When my development team makes a release, we do some distribution packaging ourselves, but we also count on other people to make packages for other distributions. If any of those package maintainers modify our software in any way, we can no longer guarantee the quality of our software. If this sounds familiar, you might remember that this is exactly why Mozilla wanted Debian to stop using the Firefox name back in 2006. Mozilla wanted to ensure that all of their users got "Firefox", not "Firefox plus Joe Blow's crappy tweaks", and I completely agree with them. As more open source projects grow and become professionally run, I can see this becoming a more common issue in the future.

Finally, there is the question of whether or not the OpenSSL vulnerability was introduced intentionally. To give the poor guy the benefit of the doubt, I think it was an honest mistake. He fixed something that he thought was broken, and it turns out he was wrong - an understandable, human mistake. The uploader who approved his change probably should have caught the mistake, but again, he too was only human. Is it possible that this was intentional? Sure. Is it possible that this could be used as a blueprint for future open source sabotage? Absolutely.

Are we any less likely to see security flaws introduced like this again? Absolutely not. It's the process through which packages are maintained and updated that is broken here.

The solution? Discuss.


Disclaimer: Don't go on a witch-hunt for the Debian guys who made mistakes here. Stuff like this happens, and if it were you or I in their shoes, we may have made the exact same mistake. I stress once again that it's the process of distribution package management that is flawed, not the people involved.

Saturday, May 10, 2008

Blogger Captcha Cracked?

I just received 50+ emails from Blogger saying new comments were posted on my blog:



Up until now, the spam situation with Blogger was decent. I'd only ever had the odd spam comment come through, and I had been able to delete them all. With this massive barrage though, I don't have the patience to go through and delete them all, especially knowing that it can happen again.

For now, I'll just hope that Blogger deletes the spammer's account and all of the comments he posted.

Sunday, May 04, 2008

Toy Story, Cave Story, .... Ubuntu Story ?

About once a month, I receive an email solicitation asking me to promote something on my blog. 9/10 times I just ignore it, because they're usually asking me to promote a spam blog. Not cool.



It looks like this month's email did better. Rather than a pointless spam blog, it was regarding UbuntuStory.com, which seems like a friendly advertisement for Ubuntu by a community member. I'm probably preaching to the choir here, but if you haven't tried Ubuntu yet, I'd check out the site.

Sorry, your first impression from that site will be wrong - Ubuntu won't take you on a wild African adventure, but it can make your computer much less of a pain in the ass to use. :)

Tuesday, April 22, 2008

Ubuntu 8.04 Beta Thoughts

Yesterday, I decided to upgrade to the Ubuntu 8.04 (Hardy Heron) Beta. I've been having problems with my internet connection and I thought an upgrade might fix them, but now I'm convinced it's a problem with my LAN segment, not my card. Anyways, among the shiny new goodness:

  • Firefox 3 - Great new GTK look on all the buttons, they don't look like they're from 1993 anymore!
  • Wait, more crazy Firefox 3 goodness - apparently now I can cut and paste images around by right-clicking on them in the Blogger "Compose" mode.
  • Notebooks in Tomboy! If you're a Tomboy Notes user, you won't believe how useful this is. I currently have 78 notes in Tomboy, so being able to organize them is a huge win for me.
Tracker (left) and Tomboy (right)
  • Tracker for indexing, now with a spiffy tray icon. More importantly, when you right-click on the tray icon, you can easily pause the data indexing (for example, if you have an older PC like mine and want to fire up a game that pushes your system, like Quake Wars)
  • The new screen resolution and display setup dialog. I've read lots of people evangelizing about this - it's supposed to make setting up a second display easier, such as when you're plugging in a projector. I haven't tested this personally though, and I'll believe it when I see it.

  • Lots of little improvements to Evolution as well. It handles multiple operations nicer now, and seems quite a bit snappier. Big thanks to the Evolution team for their hard work.
  • Other random things that I haven't personally tested: The new gio stuff in Nautilus is supposed to make doing multiple file copies simultaneously "better", and the old VFS mounts have been replaced with a new system. I use SSH mounts through Gnome frequently, so I'm looking forward to playing with the new system.
I'm sure there's lots of other little things I've yet to stumble across, and it'll take about a month for me to notice any little bugs that crop up. So far so good though - it looks like we've got another good Ubuntu release!

Friday, January 11, 2008

HOWTO: Figure out what's using your soundcard in Linux

It's 2008, and while Linux audio is getting better for desktop users, I still occasionally find myself running into a situation where one application is tying up my soundcard. Sometimes I'll try to run jackd through qjackctl and it'll fail because Firefox has a flash video loaded in it or something like that.

Anyways, to figure out what's using your soundcard on Linux, you can run:

sudo fuser -v /dev/dsp*
sudo fuser -v /dev/snd/*

The first command above will list all the OSS applications using your sound hardware, and the second will tackle ALSA applications.

For example, if I run those commands with nothing running:

gamegod@home:~/$ sudo fuser -v /dev/dsp*
gamegod@home:~/$ sudo fuser -v /dev/snd/*
USER PID ACCESS COMMAND
/dev/snd/controlC0: gamegod 6236 F.... mixer_applet2

The output above is showing me that GNOME's mixer applet is using the "control" interface on my soundcard. This won't interfere with any applications, so you can always safely keep this running.

As another example, if I run Mixxx (software for DJs) before running those commands, I'll see:

gamegod@home:~/$ sudo fuser -v /dev/dsp*
gamegod@home:~/$ sudo fuser -v /dev/snd/*
USER PID ACCESS COMMAND
/dev/snd/controlC0: gamegod 6236 F.... mixer_applet2
/dev/snd/pcmC0D0p: gamegod 12936 F...m mixxx
/dev/snd/seq: gamegod 12936 F.... mixxx

This time the output above is showing me that Mixxx is using my soundcard's first audio output interface (pcmC0D0p corresponds to ALSA's hw:0,0), as well as ALSA's MIDI sequencer interface.
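As an aside, fuser ships in the psmisc package and isn't always installed. A rough substitute - same idea, sketched here and less polished - is to walk /proc looking for file descriptors that point into /dev/snd:

```shell
# For each process, list its open fds and check whether any resolve to a
# sound device; run as root to see processes owned by other users:
for pid in /proc/[0-9]*; do
    if ls -l "$pid"/fd 2>/dev/null | grep -q '/dev/snd/'; then
        echo "${pid#/proc/}: $(cat "$pid"/comm 2>/dev/null)"
    fi
done
```

It prints the PID and command name of each offender, which is usually all you need before deciding what to kill or close.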

KDE 4.0 Released!

After having sworn off KDE many years ago, I might just have to give it a second shot - KDE 4.0 was just released, and brings a massive overhaul to the desktop environment. Congratulations to the entire KDE team on making this release happen. A lot of very talented people put a lot of hard work into this.




If you're looking for eye-candy, check out the KDE 4 screenshots on their site. Kickoff and KRunner look well thought-out, and I'm itching to give them a try. They also look like they're better integrated into the desktop environment than their GNOME equivalents (Deskbar and any of those XP/Vista-style system panel clones).

If you want to give it a spin, the source and binary packages for several distros are available on the KDE 4.0.0 info page. Packages for (K)Ubuntu 7.10 are available along with a LiveCD here.

Good stuff.