Channel Futures

Open Source


In Defense of the Six-Month Release Cycle

  • Written by Christopher Tozzi
  • May 10, 2009

The poor experience of many users upgrading to Jaunty has prompted calls for a less ambitious Ubuntu release cycle (for examples, see the comments on my recent post about video-driver problems).  Instead of pushing out an updated version of Ubuntu with a new feature set every six months, some have argued, developers should issue new releases less frequently, or recommend that only LTS versions be used for production.  I disagree.  Here’s why.

In general, the free-software world is dominated by laissez-faire attitudes when it comes to deadlines and schedules.  Open-source projects often fail to meet their own roadmaps; the latest stable version of Debian Linux, for example, arrived last February six months late.  The proprietary world doesn’t necessarily do any better–witness the delays plaguing Vista’s release–but that’s not an excuse for open-source developers to slack off in sticking to schedules.

Regularity = reliability = professionalism

Ubuntu stands out as an impressive exception in a software ecosystem where roadmaps often have little meaning.  Since Warty Warthog made its appearance in October 2004, Ubuntu has issued a new release every six months exactly on schedule, with the single exception of version 6.06, which was deliberately postponed six weeks.  And in most cases, each new release has actually been stable and ready for production use.
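
Part of what makes the schedule so legible is that Ubuntu’s version numbers encode it directly: each release is labeled YY.MM for its year and month of release, so 4.10 is October 2004 and 9.04 is April 2009. A trivial sketch of the convention:

```python
from datetime import date

def ubuntu_version(release: date) -> str:
    """Ubuntu version numbers are simply YY.MM of the release date."""
    return f"{release.year % 100}.{release.month:02d}"

print(ubuntu_version(date(2004, 10, 20)))  # 4.10 (Warty Warthog)
print(ubuntu_version(date(2009, 4, 23)))   # 9.04 (Jaunty Jackalope)
```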

Ubuntu’s ability to meet deadlines provides an aura of reliability and professionalism that attracts users on both desktops and servers.  Personally, I like knowing that my Hardy servers will receive security updates through 2013, and that the Intrepid kiosks I set up will patch themselves until 2010.  Windows machines, in contrast, are only supported for as long as Microsoft feels like it, and few other Linux distributions offer completely predictable support life-cycles.
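
Those support horizons follow from Ubuntu’s published policy of the era – roughly 18 months of updates for standard releases, and three years (desktop) or five years (server) for LTS releases. A minimal sketch of the arithmetic in Python (the month figures and the helper functions are my own illustration, not an official API):

```python
from datetime import date

# Support windows as commonly stated circa 2009, in months
# (an assumption for illustration, not pulled from an official source):
SUPPORT_MONTHS = {"standard": 18, "lts_desktop": 36, "lts_server": 60}

def add_months(d: date, months: int) -> date:
    """Shift a date forward by whole months (day normalized to the 1st)."""
    total = d.month - 1 + months
    return date(d.year + total // 12, total % 12 + 1, 1)

def end_of_support(release: date, kind: str) -> date:
    """End-of-support month for a release of the given kind."""
    return add_months(release, SUPPORT_MONTHS[kind])

print(end_of_support(date(2008, 4, 1), "lts_server"))  # 2013-04-01 (Hardy servers)
print(end_of_support(date(2008, 10, 1), "standard"))   # 2010-04-01 (Intrepid)
```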

Maintaining momentum

In addition to proffering a sense of reliability and professionalism, Ubuntu’s biannual release cycle is a good way to keep things exciting for users and the press.  It gives the community something to look forward to regularly.  It places Ubuntu at the top of technology headlines twice a year, generating a lot of publicity.  And each new release provides an excuse for users to socialize and reaffirm the community that makes Ubuntu possible, as Guy Thouret described in a recent post.

Less frequent releases would harm the ability of the Ubuntu community and developers to maintain the momentum necessary to keep the project rolling smoothly, and would diminish the name-recognition that the operating system currently enjoys as the world’s most popular Linux distribution.

The Scylla and Charybdis of biannual releases

Of course, sticking to a roadmap can pose problems.  On the one hand, Ubuntu runs the risk of producing lackluster releases that offer little innovation over previous versions.  At the same time, attempts to pack each release full of new features without adequate time for testing can lead to stability problems.

So far, Ubuntu has generally done an excellent job of navigating these two extremes.  Most releases offer meaningful new functionality without compromising stability.  (Jaunty may be an exception, but I’m giving it a month before I declare it a total usability disaster based on the graphics issues.)  There’s no reason this pace can’t be sustained into the future.

Over the course of five years and ten releases, Ubuntu has proven its ability to maintain a six-month release cycle without compromising either innovation or stability.  The release roadmaps may be demanding and lead to serious botches from time to time, but they’re also one of the features that set Ubuntu apart from other Linux distributions and proprietary operating systems.
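
For the curious, the cadence is regular enough to enumerate mechanically. A small sketch that lists idealized six-month version labels from Warty onward – bearing in mind that the real 6.06 slipped from the 6.04 slot this produces:

```python
from datetime import date

def six_month_cadence(first: date, count: int) -> list:
    """Enumerate YY.MM version labels on a strict six-month cadence.

    Idealized: the real 6.06 (Dapper) slipped from April to June,
    so this list shows 6.04 where Dapper actually shipped as 6.06."""
    versions = []
    year, month = first.year, first.month
    for _ in range(count):
        versions.append(f"{year % 100}.{month:02d}")
        month += 6
        if month > 12:
            month -= 12
            year += 1
    return versions

print(six_month_cadence(date(2004, 10, 1), 10))
# ['4.10', '5.04', '5.10', '6.04', '6.10', '7.04', '7.10', '8.04', '8.10', '9.04']
```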

WorksWithU is updated multiple times per week. Don’t miss a single post. Sign up for our RSS and Twitter feeds (available now) and newsletter (coming in 2009).

Tags: Agents, Cloud Service Providers, MSPs, VARs/SIs, Open Source

24 comments

  1. Scott May 11, 2009 @ 4:35 am

    The short release cycle as compared to Windows makes improvements more incremental. They tend to go unnoticed more easily. Also, Microsoft has a chorus of marketing drones to stir up excitement over the insignificant changes in Windows from one release to the next one six years later.

    There’s an idea for an article: compare the number, and significance, of changes in Windows vs Ubuntu over the same time period. Say, 2004 to 2008.

  2. Andrew Oakley May 11, 2009 @ 7:53 am

    I agree that a regular, predictable release cycle is good. I just don’t think that “every six months” is the correct release cycle for beginner or desktop enterprise users. It’s simply not tenable to expect a large organisation with hundreds of desktop users, or a small home user with no sysadmin skills, to update – which invariably means “partial re-install” – every six months. Developers and enthusiasts, sure, a partial re-install every 6 months is fine for them.

    Requiring a partial re-install every 6 months is not anyone’s definition of a stable system. These six monthly releases should be promoted as “cutting edge” releases, with the LTS releases promoted as “stable” releases.

    LTS releases are ideal for beginner and desktop enterprise users, and Canonical should be promoting the LTS release much more heavily as being particularly suitable for them. By promoting the LTS release as “the current stable release”, it would also allow those users to be confident that tutorials and books written for “the current stable version of Ubuntu” would be suitable for them.

    But there is currently no scheduled “next LTS release” AFAIK.

    Regular, predictable release cycles are good. I agree. That’s why we should have one for the LTS releases, too.

  3. milkman May 11, 2009 @ 7:59 am

    As long as they try not to break anything while they do it, the 6 month schedule is fine. The Intel driver breakage is ridiculous, and they should’ve had the foresight to ship an older Intel driver while waiting for Intel to un-fornicate their newer version. The number of people running laptops with Intel graphics should be enough to let them know that.

  4. PatR May 11, 2009 @ 8:13 am

    I think the problem with longer release cycles is also one of package management, common in Linux. I still use Hardy, but if I were a normal user and I wanted to upgrade, say, to OO.org 3.2 because I really need some nice new features, then I’m stuck. (I don’t say it’s better to upgrade to a new version, though.) Also, if I’m in need of a new driver that isn’t included in Hardy – anyone have a clue for an easy way? So I think for a newbie it’s easier to update than to hunt down / compile or download several .debs somewhere on the net. By the way, I own an Intel Thinkpad 🙂

  5. Vadim P. May 11, 2009 @ 8:56 am

    I’m pro 6-months too.

  6. dragonbite May 11, 2009 @ 9:35 am

    I like the 6 month release cycle since it allows introducing new versions/technology more quickly. This includes new applications and security patches.

    The other advantage of a 6 month release cycle is that if they decide they really need some code cleanup, they could distribute a release that is primarily the clean-up, and within a year a new, glitzy version will be available, built on top of the cleaned-up code.

  7. FabriceV May 11, 2009 @ 9:50 am

    Well,
    The problem is that Linux reality is far from the propaganda of zealots.
    1) Regularity = reliability = professionalism => applies equally to one-year or six-month releases… And six-month releases have not helped; they have just promoted marketing competition between the major distros.
    2) Maintaining momentum => Yes, that’s the most convincing argument… Vaporware is the main argument that supports six-month releases.
    3) The Scylla and Charybdis of biannual releases => currently this model fails technically (and not just since Jaunty, as you seem to write…) but wins at vaporware… And Ubuntu has nicely scheduled buggy releases that simply demonstrate they strongly believe marketing helps them more than the product itself (and this vision of production is widely diffused).
    => As long as distros (not solely Ubuntu) do not resolve the problem of continuous updates and the addition of software and drivers into a current release (and yes, the kernel philosophy is problematic)… using a one-year release is just a pragmatic answer, not a complete solution.

    And yes, Linux is fine, as the third OS… Less stable, less featured, but free and enjoyable… Fortunately it is not sold at the price of MacOS… because nearly no one wants (or is able) to use it, even though free desktops have been available for more than 15 years. Thus the problems of Ubuntu releases are symptomatic but not entirely Ubuntu’s fault.

  8. Andre P. May 11, 2009 @ 9:50 am

    Mark Shuttleworth has done the most to offer Linux to the millions of MS-Windows users out there; that is undeniable. Also undeniable is the fact that Ubuntu is nothing without Debian. It takes Debian a loooong time to release what they call a Stable Release. But between the 6 month Ubuntu rush and the 2 year plus Debian Stable Release, there has to be an in-between. For me that in-between is Debian Testing. It doesn’t have the polish of Ubuntu, it doesn’t have the latest packages, but it is stable. Some Ubuntu packages even come from Debian Experimental! Nevertheless… I still recommend Ubuntu to my Windows-using friends, I just don’t use it myself.

  9. Fr33d0m May 11, 2009 @ 10:12 am

    “Requiring a partial re-install every 6 months is not anyone’s definition of a stable system.”

    Nobody requires you to move off of LTS. Folks are thinking of these non-LTS releases as the same as LTS – they aren’t.

    IMHO the statement above is symptomatic. People want the latest version – or believe they must upgrade – and the way these releases are presented, there isn’t any mention made of the difference between LTS and the interims. Even on Ubuntu’s website the interims are listed as stable – and they are, for most folks. But they are not really intended to be as stable/usable as LTS.

  10. Wesley May 11, 2009 @ 10:21 am

    I find myself in the middle of the argument. I am just an end user, not a programmer. I really like the idea of pushing the LTS releases for general use and the others for the enthusiast crowd. I know this is not standard, but to ease these issues, would it not make sense to offer some easy route, while running an LTS, to upgrade to a new version of software like OpenOffice 3?

  11. PatR May 11, 2009 @ 10:30 am

    “I know this is not standard, but to ease these issues would it not make sense to offer some easy route while running a LTS to say upgrade to a new version of software like to Open Office 3”

    That’s what I mentioned before. But someone needs OO.org 3, someone else Inkscape, and another person something different. That’s what is broken in Linux package management, and it renders it somehow impossible not to push out a release every 6 months. Same goes for drivers.

    Now I don’t say that Mac or Windows’ self-contained distribution system is better, but there HAS to be a solution in between.

    If you could fix this problem I would be pro longer release cycles, as I think you could make them more stable then + nobody needs all system components renewed every 6 months – not home users, not companies – they need some new apps, but not the whole system (the best example is XP, I think).

  12. Anon E Moose May 11, 2009 @ 11:12 am

    For all of your cavalier attitude toward MS support, please note that they support their operating systems for far longer than your “LTS” release. These timetables are published and, if anything, get revised upwards, so quit with the damn Fossaganda FUD already.

    Secondly, a 6 month upgrade cycle is ridiculous. We used to laugh at “having” to reinstall Win98 every 6 months, yet this is considered acceptable practice in this case?

    I would rather have software that is released when it is ready. Perhaps not to the anal degree that Debian employs, but certainly not Ubuntu’s Fire It Out The Door, It’s Release Day philosophy.

    I think SuSE has moved in the right direction, giving themselves a few more months between releases.

    6 month release cycles are little more than a showboat for the latest, greatest Beta, and you know it. Users are either locked into a rapidly aging “LTS” release, the enforced obsolescence (something else MS gets accused of, but not Ubuntu) of a short term support release, or the bother of an OS upgrade every 6 months if they want to stay current.

    No, it’s definitely not a proprietary software model. Not even Apple sells users a point release every 6 months.

    But it’s free, so it’s better.

  13. LinuxCanuck May 11, 2009 @ 11:27 am

    I agree entirely with the article. The six month release cycle pushes the envelope and produces a better product overall. In addition, the aggressive release cycles of competitors such as Fedora, SUSE and Mandriva further drive Linux to excellence.

    LTS is important, because not all users should be on the 6 month upgrade path. People who crave stability do not have to upgrade. They can upgrade every three years if they want. Six months is only an option.

    The next LTS is due out in a year, following Karmic, which is two years after the previous LTS, Hardy. That gives LTS users one year to decide when to upgrade.

    I think that Canonical is being more than generous. It offers the latest and greatest for those who want bleeding edge and it offers a stable path for those who want that with opportunity to stretch your wings as you see fit in between these extremes.

    The problem is not with Canonical, but with users who do not know what they want. They think that they want to be on the bleeding edge, but then don’t realize that it isn’t meant for everybody.

    There is always a cost. The cost of being on the bleeding edge is potential problems and instability and if you are not willing to accept that then don’t go there. The cost of stability is you don’t get the latest and greatest.

    I do not mind problem solving and run Ubuntu from alpha to final release. I accept crashes and bugs as part of the deal. I also run other distros. I get a front-row seat as to where Linux is headed. I like that. However, I never run the development versions as my main distro. I restrict that machine to the latest six month release of Ubuntu/Kubuntu.

    Every user needs to take stock and determine what they want based on their needs and their capabilities.

  14. Sam Trenholme May 11, 2009 @ 12:36 pm

    The article has the following false information in it:

    “Windows machines, in contrast, are only supported for as long as Microsoft feels like it”

    This is NOT true.

    Windows XP will receive security patches until mid-2014:

    http://support.microsoft.com/lifecycle/?LN=en-gb&C2=1173

    Microsoft’s “extended support” lifecycle means no improvements to the base OS, but security patches will still be made.

    The only version of Linux that exists today that will still be getting security patches in 2014 is RHEL and its derivatives.

    – Sam

  15. Christopher Tozzi May 11, 2009 @ 1:29 pm

    Sam Trenholme: it’s true that Microsoft publishes timetables for the support cycles of its various operating systems well in advance of their end-of-life dates, and in principle the length of support is consistent across all releases. But Microsoft doesn’t abide strictly by these dates, as it demonstrated when it extended the support cycle for Windows XP in response to lower-than-expected Vista adoption.

    This didn’t shut anyone out, but it still makes it hard for consumers to know how long Microsoft really intends to support its products–do you rely on the timetables published when a new version of Windows first appears, or do you gamble that Microsoft will end up announcing extended support dates when ‘General Availability’ of a given product nears its end?

    Moreover, only critical security updates are offered for free through 2014 (and I guess it’s up to Microsoft to decide what’s critical and what’s just an update). Consumers have to pay for other patches. This model further complicates efforts to plan software deployments.

  16. aikiwolfie May 11, 2009 @ 2:02 pm

    Instead of constantly updating the core components of the OS, I would like to see Canonical use the non-LTS versions of Ubuntu to introduce more polished applications.

    If Ubuntu is to be taken seriously as a desktop OS, then applications are critical. Business users need more than just OpenOffice.org – for instance, a graphics application as powerful as The Gimp but as easy to use as Photoshop. Home users need a decent home movie editor, DVD playback without all the Medibuntu nonsense, and real games. I’m talking about something to compete with the Playstation, Xbox, Wii and Windows-based games.

    There is just so much more that needs to be done other than constantly upgrading the core OS.

  17. Ubuntu Look » In Defense of the Six-Month Release Cycle May 11, 2009 @ 2:07 pm

    […] Read more at Works With U […]

  18. Giving up on Ubuntu May 12, 2009 @ 2:54 pm

    “Since Warty Warthog made its appearance in April 2004, Ubuntu has issued a new release every six months exactly on schedule”

    … which is exactly why the quality of the releases is going steadily downhill. Time-based releases don’t work.

    “But they should work”

    But they don’t work.

    “Here, read this post on Mark’s blog about why time-based releases are good”

    BUT THEY DON’T WORK.

    Every release has major, serious bugs that were known before the release, and it was just shoved out the door anyway to meet an arbitrary deadline. This is a horrible policy that makes absolutely no sense from a user standpoint. Maybe from a lazy developer standpoint it makes sense, but not from down here. I’d much rather wait three months for a delayed release than “upgrade” my system and then spend the next three months trying to fix it.

    And no, using only LTS releases isn’t a solution. The LTS has just as many bugs and regressions as any other release. LTS doesn’t mean “long-term stability”, it means “long-term support”.

    I’d love to see the download number trends over time. Google Trends is showing a plateau of interest in Ubuntu. I wouldn’t be surprised if it starts to go downhill from here on out.

  19. Jef Spaleta May 13, 2009 @ 7:07 am

    Giving Up:

    Don’t read too much into Google Trends… no one really understands how to interpret them, not even Google.

    The hard reality is that there are a lot of different timescales associated with open development. There is no master plan…there is no centralized control. There is no inherent timescale that makes sense.

    But at the same time, you can’t just decide to hold back the tide of changes, or else you never do the integration work to make all the changes work together. You’ll never get ahead of the problem, but you can sure lag behind. If you are also doing your own new development, as Canonical is starting to do instead of strictly being an integrator, it’s even more important that you are not targeting stagnant code for your development work.

    The 6 month cycle is a compromise that lets you do some stabilization and some development at the same time in the same codebase. The issue isn’t the 6 month cycle or a distribution which uses it. The issue really is whether it is appropriate to hand that sort of distribution to a wide consumer audience.

    It’s important to contrast how Canonical is approaching this compared to Red Hat. Red Hat has not locked itself into a strict release or development schedule for its enterprise product line, while Fedora tries very hard to meet a 6 month schedule. By having distinct solutions aimed at different usage cases and value points, Red Hat is able to separate the fast-rate-of-change pressures on innovation and development from the slow-rate-of-change pressure for stable enterprise products.

    What Canonical needs to figure out is how to do that separation in the scope of a consumer desktop/device Linux distribution. They will need a fast rate of change for development and integration work, but they may need to introduce a new distribution release target model for OEM partners.

    -jef

  20. shamil May 14, 2009 @ 9:24 am

    I have beef with the 6 month release cycle also. It sucks. A yearly cycle would be much better. The MEPIS yearly cycle is great. There would be fewer bugs, fewer upgrades to a new version (perhaps even fewer upgrades that went bad), fewer fresh installs. A 6 month release cycle is just too frequent. Once a year would be great.

    A 6 month release cycle is horrendous. Imagine if MS came out with a new version of Windows every six months. It’s all really too much to keep up with on a 6 month cycle. A yearly cycle is much better.

    On another note…
    Ubuntu’s a great base for a Linux OS. But it’s so graphically and CLI-tied to sudo it’s disgusting. You log in with your password for your user profile, then use the same password for administrative tasks? In effect it’s just a different way of logging in as root. For this, I don’t use it.

  21. Dave Johansen May 17, 2009 @ 8:16 pm

    The problem with being so schedule driven is that it takes precedence over quality. For Mythbuntu 9.04, there’s still a major show stopper bug that hasn’t been fixed ( https://bugs.launchpad.net/ubuntu/+source/mysql-dfsg-5.0/+bug/326768 ) and it’s almost a month after release.

    I agree that there are advantages to having a deadline and shooting for it, but it shouldn’t trump quality (which unfortunately seems to be the case too often with Ubuntu).

  22. Nobody Important May 17, 2009 @ 9:02 pm

    “And in most cases, each new release has actually been stable and ready for production use.”

    Liar. I love Ubuntu, but there’s no sense in making things up. 8.04’s release was a mess until 8.04.1, and 8.10 was down the drain for a week or so after. 9.04 is stable-ish, but it was a poor time to have a release: add together a bad time for drivers and a lack of worthy updates. Nothing interesting happened in the last six months, and everyone yawned at Ubuntu for a while. Fedora 11 will get all the good stuff, and only because it’s a month later. Ubuntu could, and should, have been delayed.

    The only reason worth upgrading anymore is the updated repositories. This drive to update and change and move means that there isn’t a stable platform to aim at – dependencies float about, hopefully supported in the next release, maybe not.

    Besides, what company is updating its OS every six months? Oh, yeah, NONE. Most are still on XP, a seven-year-old dinosaur that stabilized a platform and gave developers a target to aim for. Companies are lazy – they’d probably still be using 6.06, if they had migrated then. I can see the French police using 8.04 for years.

    I may use Ubuntu now, but I simply saw no point in 9.04’s release. It could have been skipped and nobody would have cared (if anything, it regressed, in terms of the Update Manager pop-up nonsense, driver issues and the half-baked notifications). The six month hamster wheel has worn me out and I simply don’t care anymore – I’m just going to go track Debian Stable and get back to the things on my computer that really matter.

  23. sander May 23, 2009 @ 7:23 pm

    Delaying software releases because there is a major or small issue does not solve anything; it is an easy way out that will not prevent release-delaying bugs in future releases. A much more professional way to handle pre-release bugs is like this:
    1) Never delay the release. Delaying causes fear, as potential new users will overreact; they will think the software is crap, they will think the bug also affects them (which definitely is not always true), they may already have planned their deployment (e.g. companies) and hence have to change their planning. <-- Delays are *very* bad for trust in the software!!!
    2) Highlight the major/small issue in the “Known Issues” document.
    3) Focus all resources on fixing the major/small issue just after the release, so users can download the update that fixes the issue asap.
    4) (Most important!) Investigate why the issue could not get fixed in time and look for solutions to avoid this in future releases. Do we need more testers? How can we attract more testers? Don’t we have enough developer resources? Is there a gap in the testing procedures? Do we need automatic testing to avoid the same thing happening again? Do we need help from hardware vendors? And so forth. <-- THIS is the smart way to solve this kind of issue; delaying a release is a sign of weak project management (for instance Debian, which has a too-democratic structure).

    [email protected]: if a major issue is found in Ubuntu, news sites like this one should focus on why the issue got there and on how it can be avoided in future releases. This will be much more productive than defending the six-month release cycle.

  24. Uh September 11, 2009 @ 11:53 am

    The release should ALWAYS be delayed until all the major bugs and regressions have been fixed. Shoving it out the door on a specific date when you know it’s going to break users’ machines is the height of irresponsibility.
