The Ubuntu 9.04 Intel Graphics Fiasco
Because I was traveling, I didn’t get around to upgrading my desktop to Ubuntu 9.04 until yesterday. After what seemed like the fastest Ubuntu installation and quickest boot ever on my system, I was excited to log into Gnome and see what the stable release of Jaunty had to offer. Instead, I was met with a lot of frustration and a loss of faith in Ubuntu’s commitment to stability, due to egregious regressions in the performance of my Intel video card.
In earlier versions of Ubuntu, my Intel 82945G graphics chipset was able to handle desktop effects and 3D applications like Blender seamlessly. After the upgrade to Jaunty, however, it was clear that, as the release notes warned, “Users of Intel video chipsets have reported performance regressions” (“disasters” would have been a better description). Desktop effects were jumpy at best, and playback of embedded Flash video was far from smooth. And searches on the forums suggested that I was one of the lucky ones: some Intel users reported being unable to start X at all in Jaunty.
After a lot of googling (and two reinstalls due to a potential fix that left me with an unbootable system), I was able to solve the video problem by reverting to the 2.4 Xorg intel driver. This solution was simple enough, but few inexperienced Ubuntu users are likely to find it, let alone know how to apply it. Instead, individuals who install Jaunty and find severe graphical performance problems are going to give up on Linux. And that’s a regrettable outcome that could have been avoided with a little foresight on the part of Ubuntu developers.
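For readers who want to try the same thing, here is a rough sketch of the downgrade I ended up doing. I’m reconstructing it from memory, and the exact package version below is only an example (check packages.ubuntu.com for the current 2.4-series build for your architecture), so treat it as an outline rather than a recipe:

    # See which intel driver is currently installed (Jaunty ships the 2.6 series)
    dpkg -l xserver-xorg-video-intel

    # Download the older 2.4-series package (e.g. the Intrepid build) from
    # packages.ubuntu.com, then downgrade to it
    sudo dpkg -i xserver-xorg-video-intel_2.4.1-1ubuntu10_i386.deb

    # Hold the package so the 2.6 driver isn't pulled back in by the next update
    echo "xserver-xorg-video-intel hold" | sudo dpkg --set-selections

    # Restart X afterwards (log out and back in, or reboot)

The downgrade-and-hold approach is cruder than using a proper PPA, but it avoids pulling any other experimental X packages onto the system.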
Ditching stability?
The performance regressions in the Intel video driver result from major code revisions in Xorg and the driver itself, which will ultimately provide a leaner and faster graphical framework for Linux. Those changes are useful and valuable. But I was very disappointed to find that Ubuntu developers had pushed the new code into Jaunty before it became stable on most hardware.
If I wanted to take my chances on bleeding-edge software, I’d switch to Debian Unstable or compile my own beta kernels. I use Ubuntu because I need my computer to work reliably with as little effort on my part as possible. In the past, Ubuntu’s commitment to usability and stability has stood out in the free-software world, where developers tend to sacrifice those features in order to push the latest and greatest code, regardless of how well-tested it may be.
The Ubuntu developers should have foreseen the problems that the new version of Xorg posed for Intel users (the issues were reported well before Jaunty came out in April) and saved the update for Ubuntu 9.10’s release next October, when the code should be considerably more stable. This would have been detrimental to the few users who actually benefit from the revised code in its current state, but it would have made the Jaunty experience much better on most computers.
The Intel video fiasco is a rare exception among what has generally been a strong commitment to usability and stability on the part of Ubuntu developers. Let’s hope that the lesson of this mistake is learned, and that Ubuntu doesn’t squander its reputation by continuing to push out software before it’s ready for general consumption.
Don’t ditch stability. It’s unnecessary. I used a similar technique to the one recommended in several different places (which I think all derived from a HOWTO on the Ubuntu Forums), but opted for the most stable code I could.
I wrote about it on my blog. I’m not really sure it’s fit for the masses, but it worked for me.
http://nnutter.com/2009/05/fix-poor-intel-graphics-performance-on-jaunty/
I updated my kernel to 2.6.30 RC4, I’m using the X Updates PPA (NOT X Edgers!), and I added the MTRR fix to GDM so that the correct registers are always loaded.
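For anyone who wants the short version without clicking through, the relevant pieces look roughly like this; the PPA line is from memory and the MTRR values are placeholders, so check the blog post and your own /proc/mtrr and dmesg output before copying anything:

    # X Updates PPA for Jaunty, e.g. in /etc/apt/sources.list.d/x-updates.list
    deb http://ppa.launchpad.net/ubuntu-x-swat/x-updates/ubuntu jaunty main

    # then pull in the updated packages
    sudo apt-get update && sudo apt-get upgrade

    # MTRR fix: re-assert the write-combining range for the video aperture
    # before X starts, e.g. near the top of /etc/gdm/Init/Default.
    # The base/size below are examples only; read yours off dmesg.
    echo "base=0xd0000000 size=0x10000000 type=write-combining" > /proc/mtrr

The point of using X Updates rather than X Edgers is to stay on packaged driver releases instead of raw git snapshots, which was as stable as I could get while still picking up the fixes.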
I avoided installing Ubuntu on my new laptop because of this.
There is a very simple rule of thumb: always wait a month or two after a *buntu release to install. I was impatient, upgraded my new system and got screwed by the ATI fglrx driver (still not working with a lot of hardware).
This is a non-issue for vendors like Zareason, System76 or Dell, because they can wait until things are steady or figure out a workaround for the hardware they sell.
I’d love to see Ubuntu keep the 6-month schedule, but also have a consistent (and pre-advertised) point release (9.04.1 in this case) targeted at about a month after the regular release, but with no hard commitment to a release date. The .1 release would be out when ready. Users looking for stability should be directed to wait for that release, and enthusiasts who want the bleeding edge, well, they know what to do 🙂
@Leo Ubuntu will most likely not be fixing this issue as it “violates” the nature of a release. They release Ubuntu using version X of some program, say the Linux kernel, and only provide patches and updates to that version. Since the updates we need are in a new version of the Linux kernel it will most likely remain an issue to be solved by the community (as it already has been).
@Nathan
I think that the part in parentheses needs to be shouted. The community worked together and has resolved this issue.
What will probably happen, as it has happened in the past, is that Ubuntu will create an updated CD image for Jaunty that has the fix applied to it.
I had minor (though annoying) problems with nvidia, and really major ones with ati (remember the previous Ubuntu, where Blender (or the wm) did not work at all with the restricted, otherwise usable, ati driver?); I hoped that intel might be better… By the way, waiting solves nothing; restricted drivers are updated rarely, or at least that’s how it seemed to me…
On my primary machine, I keep the latest LTS release, currently 8.04.2. On my laptop and other equipment, I give the newest versions a try. If I had only one computer, I’d stick with the LTS releases.
@Nathan
@Charon79m
Yes, I agree, many fixes come from the community, and as Nathan was suggesting, I think, some involve forward-looking fixes (e.g., a graphics driver is finally made to work with a new X server), and some are merely “rolling back to older software until things get better”. The latter type really means a workaround that rarely gets released officially. And that’s the way things should be: if the big distros don’t push new code, we’ll always be stuck with stagnating software. Waiting always helps, though.
While CDs have sometimes been published with updates since the main release, what I am proposing is to institutionalize that and give official status to the point release.
9.04 doesn’t claim to be stable; that is what 8.04 LTS (Long Term Support) is for. People shouldn’t use the latest and greatest if they are worried about stability; that is why Ubuntu provides the LTS releases. Ubuntu has been known to experiment with the in-between releases so that when another LTS comes around, it is stable. That’s how they hash out a lot of bugs. I have never used an in-between release that is 100% stable, especially within the first few weeks after release. But that’s part of the fun of using Linux.
While it’s great fun to poke people in the eye over bugs that don’t get fixed… it might not be very constructive.
You know what is constructive? Talking about how to handle it better next time. Proposed sprints for the Karmic UDS include:
https://blueprints.launchpad.net/ubuntu/+spec/desktop-karmic-xorg-intel-video-retrospective
https://blueprints.launchpad.net/ubuntu/+spec/desktop-karmic-xorg-intel-upstreaming-working-session
https://blueprints.launchpad.net/ubuntu/+spec/desktop-karmic-xorg-future-graphics-technologies
Anyone from WorksWithU going to go to UDS and cover the discussions? Or maybe help make sure all the sessions get videoed and archived so everyone can follow the discussions?
At the very least, I would imagine Chris would want to be able to report on the retrospective sprint.
-jef
These regressions are nothing more than bad publicity. So far, I’ve experienced 4 major regressions, including this Intel debacle. I can’t explain things like this to my customers.
@ David:
9.04 doesn’t claim to be stable? LOL… I guess Canonical sees this differently. Check this quote from a release-notes mailing they sent to all partners:
Helpful tips for upgrading
1. Should I upgrade my customers to the new release?
If your customers are on Ubuntu 8.04 LTS and you want to minimise the number of upgrades at their sites, or they need an invariant OS platform, then it might be advisable to wait until our next LTS is released. Otherwise then, yes, feel free to do so, as all of our releases are enterprise-grade quality, maintained and supported.
“loss of faith in Ubuntu’s commitment to stability”
Ubuntu had a commitment to stability? Must have been before I started using it in 2007.
“But I was very disappointed to find that Ubuntu developers had pushed the new code into Jaunty before it became stable on most hardware.”
But that’s what they do in *every release*. What were you expecting?
In every release, they know about major bugs and regressions (Nvidia cards failing, major problems with PulseAudio, major problems with Gnome), and just ship it out the door anyway. Gotta meet that deadline!
http://linuxhaters.blogspot.com/2008/06/evolution-of-ubuntu-user.html
Canonical’s priorities need to change, or Ubuntu will just continue descending into failure.
1. Time-based releases don’t work. Just watch Lucy and Ethel on the chocolate assembly line if you don’t understand why.
2. Stability of the subsystem is very important. Don’t release any changes to the video, audio, networking, etc. until they have been VERY thoroughly tested, and all significant outstanding bugs have been fixed. If there are still bugs, wait until the next release. No one wants an “upgrade” that prevents them from booting or logging in. We want our computers to work; we’re not your beta testers.
3. Stability of applications is not so important as being up-to-date. When Firefox or OpenOffice crashes, it’s not a big deal. You might lose some work, but it won’t affect anything else you’re doing, and more likely than not, it will save its state and return to it when you start it again. The new features and fixed bugs are worth the risk of a few more unfixed bugs.
“I’d love to see Ubuntu keep the 6 month schedule, but also have a consistent (and pre-advertised) point release (9.04.1 in this case) about a month after the regular release, but with no commitment of a release time.”
Don’t you mean “a beta release”, followed by a full release after some testing? They already do that.
I had severe problems with 9.04 and both the intel chipset and ati. I was able to fix the laptop with intel by using the 2.6.30-rc kernel, but the new ati (fglrx) driver doesn’t support my 1.5-year-old laptop. I put in the open-source drivers, but I still can’t get MythTV working, although Flash plays full screen OK.
I also had to drop back to the 2.4 Xorg intel driver. After that things were much better. I think it is very disappointing that such a popular linux distro should have problems like this; it is the one new people are most likely to try!
I’m a huge fan of Ubuntu, but their six-monthly releases simply aren’t stable enough for inexperienced users, or even experienced users who don’t have weeks on end spare to fix problems.
After several upgrade disasters, I’ve reverted all my systems back to Ubuntu Hardy 8.04 Long Term Support, and have been happy ever since.
Canonical should promote their LTS releases as their primary stable consumer releases, and leave the non-LTS releases to those who want to dangle over the cutting edge.
Focussing solely on LTS would also increase take-up. Most Ubuntu books focus on LTS. More commercial software publishers would be willing to support a 3-5 year upgrade cycle. Tutorials for new users could remain valid for 3-5 years. The list goes on.
I think this is the reason Suse (OpenSuse?) is moving to an 8-month release cycle. The problem, though, seems to be Gnome’s release cycle; the distributions have been trying to release with the latest Gnome, and that comes every 6 months. Going to an 8-month cycle provides some buffer, but it also starts to throw the clock off (over 24 months an 8-month cycle yields three releases while Gnome ships four); after three Gnome upgrades, the next Suse (OpenSuse?) would be a full Gnome edition behind, and eventually it’s the Gregorian vs. the Julian calendar. Maybe Gnome could move to an 8-month release cycle for the sake of the distros that support its platform.
However, I think this is a very interesting test of how the community rallies around a problem and finds solutions en masse, and those solutions are happening. The community knew — or had opportunity to know — about the graphics issue in the release notes during the beta period, and the linux press (especially Phoronix) has some good info on the upstream UXA development and why there are some problems.
The gamble, though, was on the nature of the problems: there are so many different combinations of UXA with different Xorg and hardware settings that it’s nearly impossible to tell which users will hit show-stopping problems and which won’t notice anything.
Having read the advisory beforehand (but I didn’t beta test), it was my sense that Ubuntu rolled the dice with fair warning in the release notes. It was either that or hold off until UXA is stabilized, and that horizon isn’t yet visible. Some time back — was it Dapper? — Ubuntu tried putting off a release, and they were pilloried in the press and bawled out in the blogs.
So their choices, basically, were to postpone the release and get criticized for making their users wait, criticized because the long-term releases are the ones to worry about, criticized by proprietary developers for not having the firepower to get a finished product out on time, and risk cranks writing about the demise of Ubuntu.
Or to put out the distro and get criticized for releasing a half-baked product, criticized for knowing about the problem ahead of time and shipping anyway, criticized for turning off potential users, and criticized for not being professional.
But at least when they put out the distro, even more eyes will be peering into the problem. Then maybe that open-source community model will work its magic and the problems will be fixed faster than Ubuntu could have done themselves had they postponed shipment.
I have a GM965 and didn’t really have any problems, but went ahead and tried running the 2.6.30-rc4 kernel with the Xorg Edgers driver and UXA with great results. Then I tried Nutter’s fix earlier today (fancy seeing you here — I pulled your fix off the thread in ubuntuforums); I’m easily where I was at with Intrepid, but I may be just lucky.
I’m wondering if the issue with pushing this out has anything to do with not pushing OO.o3 into 8.10.
I tried every solution I could find, including reverting to the 2.4 driver, to no avail. The “potential fix” left my system unbootable (it freezes on login, and starting in a terminal and trying the “rollback” doesn’t work), and now I will have to reinstall using 8.10. If I had been a first-time user, this would have had me running back to Windows as fast as my legs would carry me.
Ubuntu are victims of circumstance. With every new release they are expected to do something new and exciting. Currently, changes are being made in the kernel and Xorg to how graphics are handled, which requires drivers to be rewritten. Combine the two and you have a mess on your hands.
Ubuntu got the timing wrong, and as a result Intel’s graphics driver is crap. ATI has never worked properly in Linux, so ATI problems are hard to pin down to changes in this release.
aikiwolfie:
Did they get the timing wrong? Or did Canonical fail to allocate development resources internally for best benefit? How much Canonical manpower is dedicated to working on the upstream Xorg server code that flows down into Ubuntu releases? If Canonical is content to watch other business interests do all the upstream development work and set the development roadmaps, will it ever be in a position to ensure the timing is right for its own interests?
-jef
Ubuntu 9.04 isn’t for people who crave stability. Those users should stick with the safer 8.04 LTS release. Crying the blues over using the latest kernel, with the changes it made to the Graphics Execution Manager (GEM), and then blaming Canonical for including the 2.6.28 kernel is a bit ridiculous when they backed off the even newer 2.6.29 kernel. Suggesting that the latest release should have used kernel 2.6.27 for the few users who have Intel chips makes little sense when competing distros such as Fedora will eventually have the same problem. The only advantage is that Intel may have stopped its foot-dragging by now.
I never put my faith in Intel. I have an Nvidia card on my main computer and it never lets me down. I have another computer with an onboard Intel chipset and 9.04 works fine, but I have no 3D effects yet. I can wait for that to happen.
In all, I am happy with 9.04 on both computers. One suggestion: if you went the upgrade route, boot into the older 2.6.27 kernel until you have a new driver. If you did a fresh installation, maybe you should blame yourself a bit for not checking things out first. IMO, it is shared blame, but Intel should get most of it.
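For anyone not sure how to do that: Jaunty still uses GRUB legacy, and an upgrade leaves the old kernel in the boot menu, so it’s just a matter of picking it. A minimal sketch (the entry number below is an example; count the title lines in your own menu.lst):

    # List the boot entries; GRUB legacy numbers them from 0, top to bottom
    grep ^title /boot/grub/menu.lst

    # Either pick the 2.6.27 entry from the GRUB menu at boot (press Esc
    # during the timeout if the menu is hidden), or make it the default by
    # editing /boot/grub/menu.lst and setting, for example:
    #   default   2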
When you have a set release schedule, it comes down to timing. If everybody responds and all of the pieces fall into place your distro is great. If someone does not do their job (in this case Intel), then it can be an unsatisfactory experience for some users.
They could have altered the schedule or run with an older kernel. Had they done either of these things those who love 9.04 now would be complaining instead of you. Since most people are happy, then they did the right thing.
“The Intel video fiasco is a rare exception among what has generally been a strong commitment to usability and stability on the part of Ubuntu developers.”
You’re kidding me, right? I jumped on the Ubuntu bandwagon with Gutsy. Back then, certain intel graphics chipsets were blacklisted (some say unnecessarily) due to minor video playback issues. Otherwise, compiz worked fine, so long as you found out how to turn it on. Then Hardy came out. Can anyone say “PulseAudio issues?”, not to mention certain hiccups with network-manager. So here we are in 2009, and Jaunty breaks compiz for many intel folks. I’m not sure you can say Ubuntu’s commitment to stability is so sparkling. Rather, it seems they are much more concerned with being able to market themselves as bleeding-edge, even if it means that critical parts of the system (graphics, sound, networking) break or have serious issues for many users.
I also let my guard down until I ran into Ekiga and SIP not working (now it does). I think the Intel graphics problem really only affects those of us who like the latest. I ran into someone today who hasn’t heard of Linux.
When the operating system ships faulty software for something as critical as the Intel graphics driver, it’s undoubtedly a failure. No excuses.
If a company has any hope of making a mainstream operating system acceptable to the masses as Canonical claims they do, they must commit to stability.
When you ship with a faulty graphics driver, you automatically and instantly take yourself out of contention for the market in which you’re hoping to compete.
This is very clear evidence of the failure of time-based releases. That anyone can still think this is a good idea boggles my mind.
Oddly enough, all major distributions are time-based.
I tried updating my HP XE783 from 8.04 to 8.10, but the PulseAudio sound problems broke too many apps that were working in 8.04, so I reinstalled 8.04. I loaded 9.04 onto a different partition, and the Intel graphics problems made it unusable for me. At least this time I can choose to boot into 8.04 at startup. I am an IT professional; I can only imagine what a new user of Linux would do when this occurred to them. Is it any wonder that they run back to Microsoft to hold their hands?
Does Windows sound not work? Do Windows graphics not work on Intel graphics chipsets? Not this badly, and not with all the cryptic commands to roll back drivers and change kernels to try to fix them. Clearly, in my experience, 8.10 and 9.04 had major problems. Sound and graphics are too important to the computing experience to release to users in the half-baked shape they were in. I want Linux to succeed, but trumpeting releases with these major problems in them as great won’t help the Linux cause; it will hurt it and relegate it to the technical elite.
I think Ubuntu 8.10 and 9.04 should have been labeled BETA releases. They were not ready for the general public. What should have been done was to wait six months for PulseAudio to stabilize and another six months for the Xorg Intel graphics drivers to stabilize. Let’s say a potential new Linux user reads about all the hype that Ubuntu generates and tries the non-LTS releases: they can’t watch YouTube videos with 8.10, or they try 9.04 and their Intel graphics hang or freeze. Most of the reviews out there don’t mention the negatives of these releases. I can see this hurting the Linux cause. They could have left PulseAudio out and used the old Xorg; the new releases would still have had some updated features and software to market. Currently I have to use older versions of OpenOffice and GIMP than I have on my wife’s Windows PC, because 8.04 is the only release that runs OK on my Linux PCs. Unfortunately the company I work for uses Windows.
Only how many months to the next LTS release?
Next LTS will be 10.04, a full year away.
After this experience with 9.04 (the intel graphics along with Kile the LaTeX editor), I will either stick with LTS or go with Debian in the future.
[…] However, Canonical’s core user base seems to have strong faith in Ubuntu’s ongoing quality — though some readers (including our own Christopher Tozzi) have had issues with video cards. […]
I’m still fairly new to Ubuntu (although it’s my main os now). If I install 8.10, does it have the stability of 8.04? Not sure if that should be a no brainer…
“Instead, individuals who install Jaunty and find severe graphical performance problems are going to give up on Linux.”
so true, I almost did. A big portion of my usage is flash video. If linux can’t play flash, well… so long 🙁
I’m having this same issue. Ubuntu has a LOT of regressions. It almost makes me want to go back to Debian stable. 6 months is not enough time to make a stable OS.
Well, I installed Ubuntu on my laptop and ran into problems with Windows 7 file sharing with my desktop, and with the sound coming out of my laptop headphones (there are no mute settings in Ubuntu, by the way)! I looked in the forums and saw people commenting on it going back to 2008! They go round and round in circles and rarely solve the problem.
I keep trying linux, but find myself back with MS. I guess I’ll give it another year then??
Wouldn’t mind paying £30 for a widely used Linux-based OS that works with every laptop, on most hardware, and is stable, will connect with Windows shares, has a graphically appealing interface… etc. Linux would be a great OS to build on!!
But what I find with Linux is it’s still full of geek stuff, like “edit such-and-such a file”… etc. You can’t just click on something to sort things out when they get a bit technical.
Don’t get me wrong, I have a leaning toward IT, and am myself a bit of a computer geek. But I find Linux, when you just want something to work, is a pain in the butt!
If only they could think of the end user a bit more, we would have a valid alternative to Windows. Ubuntu started doing this, but they need to KEEP doing it.
Eee PC & Ubuntu 9.04…
After a while using the Eee PC with the default Xandros-based OS, I wanted to try something else, and since Ubuntu 9.04 had just come out it seemed like a good choice. While installing and playing with the new Ubuntu version I learned a few things that might he…