Where are the modular laptops?
Mark Pickavance looks at the likely reasons that modularity in the world of computers isn’t fashionable any longer
If you asked me to explain in just a single word why the PC became so successful, that word wouldn’t be Intel, Microsoft or any other brand. It would be ‘modular’.
The reason that the PC is modular, to a degree, is that IBM saw it as a little cousin to its mainframe computers, and those by definition were modular. Where most computers of the era were self-contained systems that had whatever the maker put in them from the outset, the notion of card slots and other expansion possibilities made the IBM XT something special.
Obviously, IBM could have made it cheaper by not having expansion in the design, but the firm chose to have it, and it was the defining part of the personal computing jigsaw that still sees us using the offspring of that device today.
So if it was such a great idea, and the fulcrum on which the world of personal computing hinged, why do computer makers hate this aspect so much these days? And having promised both modular laptops and even greater modularity in desktop systems, why do they now appear to be reneging on those promises?
Why a modular laptop could be good
In a desktop PC, it’s relatively easy to adjust the system in almost any way (unless it’s a Dell) by buying parts, because it’s inherently modular. The critical features that make this modularity work, mostly, are standards like ATX, USB, SATA and HDMI. When you have standards specifying how physically big a hard drive is and where its connectors sit, you can build a working system from a collection of parts that weren’t specifically made to go together.
That’s fine for desktop computers, and with a bit of effort it could be even better, but what about laptops? They have far fewer standards, the 2.5” SATA drive being one, but you can’t take a screen from a Toshiba and make it work with a Lenovo, the majority of portable systems can’t have their CPU or motherboard replaced, and very few have upgradable GPUs. If you’re lucky, and there’s no guarantee, you can switch out the RAM and hard drive, and possibly replace the battery. However, that last item is bespoke, as is the charger that powers it.
Surely, it would make sense if laptops were modular in exactly the same way as desktop computers? Then when you wanted a better screen or a faster CPU, you’d buy that and not junk the entire system. This works for those desktop users who are prepared to venture inside their case, so why can’t that happen with laptops?
The practical reasons
Late last year, I wrote a piece for this very publication about rumours that Intel wanted to move away from socketed processors. One of the reasons that I gave for this movement was reliability, because allowing a user to remove a CPU does offer the possibility of part failure through poor contacts.
One of the challenges of making things modular is that you increase the points of potential failure, and you need connections that are positive and which, in the case of a laptop, won’t separate when the equipment is moved. That’s a challenge but not an insurmountable one. However, for the hardware maker, fitting a RAM slot to a laptop motherboard is more expensive than soldering the memory directly to it, so each increment of modularity comes at a price.
Another practical issue is the chassis, because with the desktop PC you buy an enclosure that’s big enough for your future needs, not necessarily just for exactly what you put in it to begin with. That’s not possible with a portable system, because for obvious reasons the makers want them as small and thin as possible. To achieve that, most of the parts are combined on a robotic production line and aren’t that easy to disassemble and reconstruct.
By definition, a laptop with a proper chassis in the desktop sense would be bigger than normal, to allow for disassembly and all the part interconnections. That’s not impossible, but it would require some very clever engineering and the definition of a bucket-load of new standards, so that the chassis could be scaled in various directions to handle a bigger screen or a larger keyboard.
None of this is impossible, but it would require a very tight vision of what parts of a system could reasonably change, and how future enhancements might be included when we’ve no idea yet what they might be.
The desktop PC has managed this by allowing some standards to die and others to replace them, and because of that, most of the hardware in a modern system wouldn’t mesh well with a PC made ten years ago. So the standards for the modular laptop would need to evolve and be managed by a standards organisation like IEEE, but perhaps one that’s less influenced by Microsoft, for example.
The economic reasons
If we can overcome the technical challenges, then why shouldn’t the modular laptop exist? Maybe the answer is more about economics than technology. All system makers exist to ship machines, and their simple goal is to ship more systems each year than the last. Putting aside the impracticality of an ever-expanding market, what stops most laptop makers achieving that dream is that their customers already own working hardware, and they won’t generally upgrade until it either stops working or becomes so old that it’s no longer viable.
To this end, the equipment makers don’t want a machine to go on indefinitely, and to them replacement parts are a double-edged sword. For business users, the lack of a replaceable battery isn’t a selling point, but if makers thought they could get away without one (as Apple has) then they’d remove it too. If you can’t replace the battery, then after about three or four years of daily charging the system will no longer hold enough power to be used away from the mains, and you’ll need to buy a new one.
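To put rough numbers on that battery claim, here’s a quick sketch. The fade rate used (0.02% of capacity lost per full charge cycle) is purely an illustrative assumption, not a measured figure; real lithium cells fade in a more complex, non-linear way.

```python
# Illustrative battery-fade model. The fade rate is an assumption
# chosen for illustration, not a figure from any real datasheet.
def remaining_capacity(original_wh, cycles, fade_per_cycle=0.0002):
    """Return the watt-hours left after a number of full charge cycles,
    assuming simple linear fade and never going below zero."""
    return max(0.0, original_wh * (1 - fade_per_cycle * cycles))

# Daily charging for four years is roughly 4 * 365 = 1460 cycles.
left = remaining_capacity(50.0, 4 * 365)
print(f"After four years: {left:.1f} Wh of an original 50 Wh")
```

Even with this gentle linear assumption, a 50Wh pack is down to around 35Wh after four years of daily charging, which is exactly the point where many users start shopping for a new machine rather than a new battery.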
A modular system could, by definition, carry on indefinitely, parts allowing, which might entirely scupper the sale of a replacement system. Yes, makers would get the profit from the expansion parts you bought, but they’d also see you sell your old screen or CPU module on eBay, negating a sale they might have made elsewhere.
To make this work economically, the machine will cost more than one without modular features, and the parts will carry a premium too. But will people be willing to accept that, or will they just take the view that in three years’ time they’ll simply buy another cheap system and junk or sell the old one? They might well do that, because the turnover of technology in this sector is so rapid that there’s no guarantee the new modules available in three years’ time will be compatible with the system you bought.
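The sums customers would do here are simple enough to sketch. All the prices below are entirely hypothetical, invented for illustration: a premium modular machine upgraded twice over six years versus a cheap disposable one replaced wholesale every three years.

```python
# Back-of-envelope cost comparison. Every figure is a hypothetical
# assumption for illustration, not real pricing from any vendor.
modular_base = 900      # assumed premium for a modular chassis
upgrade_cost = 250      # assumed price of one module upgrade
disposable_base = 500   # assumed price of a cheap, sealed laptop

# Over six years: one modular machine plus two upgrades,
# versus two disposable machines bought three years apart.
modular_total = modular_base + 2 * upgrade_cost
disposable_total = 2 * disposable_base
print(f"Modular: £{modular_total}, disposable: £{disposable_total}")
```

On these invented numbers the disposable route still comes out cheaper, which is precisely the calculation that makes buyers shrug at a modular premium, even before compatibility risk is factored in.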
This scenario very strongly reminds me of the motherboards that appeared in the mid-90s that had dual processor sockets. The idea was that you could get the system up and running on a single CPU, and then when you wanted a boost, you could buy another and simply drop it in. However, what this didn’t take account of was the pace of change, which meant by the time you came to add another 400MHz CPU, a single CPU of 900MHz existed that could easily trounce your dual-CPU system but on a different socket and chipset!
Therefore, many of these boards were bought, along with GPUs intended to be paired with another of the same specification, and they ultimately never realised their potential. The fact is that what seems like a great idea now might not seem so brilliant in three years’ time.
As for multi-CPU motherboards, they became horribly complicated and expensive to make, and people realised the futility of them. Intel moved multi-socket support out of its consumer CPU range and into the Xeon series, aimed at specialist workstation and server customers. And then multi-core systems provided much the same solution without many of the drawbacks.
If we all knew that we’d keep our next PC for ten years, then perhaps we’d consider paying more for it and getting an upgrade path as part of that deal. However, that assumes that PC makers would embrace the idea of providing that possibility, which isn’t a certainty.
At the heart of why we’ve not seen modular systems of the type we’d like is the nature of competition and how standards fit within that concept. Hardware makers love standards when they think they’re helping to promote or sell things, but they’re much less keen on them when they’re considered a hindrance to profit.
A classic example of this was the debacle a few years ago over ‘n’ class wi-fi, where the router makers wanted to bring new products to market, but the standards on which they were based weren’t finished. That resulted in ‘Pre-N’ devices, which were proprietary guesses at how ‘n’ might eventually turn out, many of which weren’t remotely compatible with other wi-fi hardware.
Some hardware makers are strongly anti-standards, unless those standards are defined by them, covering their devices with proprietary connectors and patented features. While such tactics serve short-term gains by attempting to lock people into an equipment range, they’re ultimately damaging for the sector as a whole. Recently, the EU recognised this in respect of mobile phones, and it now insists that all phones use the same USB charging technology, in an attempt to stem the mountain of redundant proprietary chargers filling drawers across Europe. Even though Apple chose to ignore this, it’s generally worked, and the smartphone industry is better for it.
An attempt at a modular laptop would most likely be a patented concept, and therefore any system builder wishing to expand on the success of the originating company would probably have to license the design. The temptation would be that, instead of doing so, they’d design their own version, creating an entirely proprietary solution, or license only parts of the concept, making parts from the two companies incompatible. Ideally a company would design the concept and license it for very little, but most companies would see this as helping their competition or as being corporately soft. Therefore, even if a company did bring a modular design to market, it would probably try to keep it exclusive, limiting the scope of the solution and the range of modular options available.
The irony is that the aspects that make the idea of a modular system so attractive to the buying customer are exactly the same that put the hardware makers off producing them. Until they can be convinced that what’s good for the customer is also good for them, these concepts will remain either unfulfilled or blighted by proprietary standards.
What’s really interesting for me, as a technical writer, is that this is a prime example of where the economics of the industry is entirely at odds with the expectations of the buying public. That’s a dangerous scenario, because once you stop giving people what they’d actually like, they might well stop buying what you do offer altogether.
Sales numbers from the last quarter of 2012 are painting an unpleasant picture where many of the classic PC makers (Dell in particular) are showing a marked reduction in units shipped. What’s also happening is that the market is turning away from high-end (and therefore high-profit) systems, knowing that even entry-level hardware is pretty useful these days.
So far, the industry’s reaction has taken one of two directions: denial or entrenchment. Those in the denial camp are trying to convince themselves that the whole tablet era is just a blip, and that when people get bored with Angry Birds they’ll go and buy a proper computer.
Meanwhile, a siege mentality is gripping other parts of the industry, with the likes of Intel and Apple trying to turn the PC into an appliance that’s built to a price and sold to be thrown away, never upgraded at any point in its life. In that respect, it’s not just that the true modular laptop or PC never arrived; we’re actually heading in entirely the opposite direction! With sales falling, what manufacturers don’t want to do is miss that replacement sale in three to five years’ time because they let you upgrade instead.
Given the economic woes the world is suffering at this time and the inability of the USA to deal with its fiscal deficit, that doesn’t seem like a strategy that will lead anywhere but the bottom of the technological barrel.
Apple’s so far successful strategy has been to turn computers into commodities (the iPad), make them status symbols, have very few models or options, sell them at a premium and replace them often. Selling primarily into the first world, this has worked, because up till now there’s been a portion of society that can afford their prices and to replace their hardware every year (or seven months later even, with the latest iPad). However, should America and Europe’s economic problems get worse, that seems like commercial suicide, because Apple doesn’t do cheap or upgradeable.
Personally, I’d love to see one manufacturer really take the technological bull by the horns and build a modular system where you could swap out every part of the laptop should it fail or you want a performance boost. I don’t see this happening any time soon, though, because makers are all convinced that it would damage their own market or just improve it for others.
The track record is that when times get tough, people become risk averse, and that’s probably where we are right now. So while the idea of the modular computer is as attractive as ever, it’s no closer to reality.