Thursday, September 20, 2012

Microsoft's Impending Software Versioning Woes and a Solution

The release of Windows 8 is only weeks away. In an attempt to get into the mobile game, or rather, to get competitive in the mobile game (Windows Phone? Really?), there will be versions of Win8 for the ARM architecture as well as for traditional x86 and x86-64. I predict Microsoft is going to run into some problems on this front (on top of the flak they are going to get for the unintuitive GUI and the locking of the bootloader), and I'll offer an alternative.

In the last several years, as PCs transitioned from 32-bit to 64-bit architectures, there were some growing pains. Software that users had been able to run wouldn't necessarily run after an upgrade, and vice versa. Microsoft did a reasonably decent job combating this with Compatibility Mode and Windows XP Mode (a tool too few users know about, in my opinion). For the most part, 64-bit PCs could run 32-bit software, users gradually learned the difference, and it wasn't so bad. x86 and x86-64 are very similar apart from the data width, since x86-64 is just an extension of the x86 instruction set. Adding ARM to the mix is a different story: ARM is a completely different design philosophy with a completely different instruction set, so an x86 binary simply cannot execute on it. So what happens when a user buys a Win8 (ARM) tablet expecting all their desktop software to run on it? (Hint: Vista-level user frustration all over again.)
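To see why an x86 binary is a dead end on ARM, note that every Windows executable records the machine type it was built for in its PE header, and the loader rejects anything that doesn't match. Here's a quick Python sketch that reads that field (the offsets and constants come from the published PE/COFF spec; the function name is my own):

```python
import struct

# COFF "Machine" values from the PE/COFF specification.
MACHINE_TYPES = {
    0x014C: "x86",
    0x8664: "x86-64",
    0x01C0: "ARM",
    0x01C4: "ARM (Thumb-2, the Windows-on-ARM target)",
}

def pe_architecture(path):
    """Return the architecture a Windows .exe/.dll was compiled for."""
    with open(path, "rb") as f:
        f.seek(0x3C)                          # DOS header field e_lfanew:
        pe_offset = struct.unpack("<I", f.read(4))[0]  # offset of PE header
        f.seek(pe_offset + 4)                 # skip the "PE\0\0" signature
        machine = struct.unpack("<H", f.read(2))[0]
    return MACHINE_TYPES.get(machine, hex(machine))
```

Run this on any .exe on your disk and you'll see exactly one architecture come back; that one value is the whole problem.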

Let me stop for a moment and compare the extremes of how other systems deal with software across architectures. On one side, you have Apple. Every device within a product line uses the same architecture (often the same processor), so software compatibility isn't a problem. Also, on their mobile platform, they have strict control over what software users can run on their devices. There are ways around this, but chances are, if you understand how to sideload iOS apps after a jailbreak, you know enough about software to not run into problems. It is a big deal when Apple switches architectures (as they did in 2006, moving from PowerPC to Intel x86), and after that, they drop support for the old architecture completely.

On the opposite extreme is the Linux and wider open-source community. Software is distributed as source code, and users compile it for their own architecture. Because of this, there is a Linux kernel port for nearly every architecture in existence, including some really obscure ones. Virtually any program can run on any system, as long as you have the source code and a compiler for your architecture. On the other hand, you also need a reasonably firm grasp of the build process: configure scripts, makefiles, and other things that are well over my head.
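For anyone who hasn't lived it, the canonical build-from-source ritual looks something like this (scripted here in Python purely for illustration; it assumes you're sitting in a source tree that ships a configure script):

```python
import subprocess

# The classic three-step Unix build: configure probes the machine and
# generates makefiles, make compiles for this architecture, and
# "make install" copies the results into place.
for step in (["./configure"], ["make"], ["sudo", "make", "install"]):
    subprocess.run(step, check=True)
```

Three lines, but the user is doing configuration work that a Windows user never sees, and any failure along the way is theirs to debug.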

Microsoft seems to hold a philosophy that goes something like this: the end user doesn't know much about computers and doesn't need to know much to use one; therefore we should (try to) make our products user-friendly and, in the process, protect the user from their own ignorance (otherwise they might break something or get a virus!). Let's ignore how unsettling I find that philosophy, and how often "user-friendly" software has been so locked down that it is unusable, and try to tackle the problem from their point of view.

Apple allows only one architecture at a time per platform and keeps tabs on available software. This isn't an option for MS if they want to compete in the mobile market without getting into the hardware business (and we know how bad Microsoft is at hardware). But MS places far too much value on intellectual property to ask developers to release their source code (and would never do so for its own products), and furthermore, asking users to compile software violates the philosophy above.

So how about this:
Software installers (i.e. setup.exe files) contain the source code, uncompiled but encrypted. Also in the installer is a compiler library with the decryption key for the source embedded inside (the library itself ships pre-compiled, which at least keeps the key out of plain sight). When a user runs the installer, the compiler library checks the OS, processor, and so on; generates or looks up the correct compiler options for the system; decrypts and compiles the source into binaries; and then starts the regular install process (moving files, updating the registry, etc.).

To the end user, this just looks like installing: they neither know nor care what is going on as the progress bar slides along, but now the same software runs on all their Windows devices, regardless of architecture.

To the developer, little changes. They have always had to pre-compile software for Windows and create install files; the only difference now is how that process works under the hood, and a wizard handles the details (and oh, how Microsoft loves wizards). Their code is protected, just as it was as a compiled binary, but they no longer have to compile and distribute a different version of their software for each combination of OS and architecture, or make sure their users get the right one.
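Here's roughly what that install step might look like, sketched in Python for readability. Everything here is hypothetical: the XOR "decryption" stands in for a real cipher, the flag table is a toy, and an actual installer would drive a proper embedded toolchain rather than shelling out to cc.

```python
import platform
import subprocess
import tempfile
from pathlib import Path

# Toy lookup table mapping the detected machine type to compiler options.
# A real installer would consult something far richer than this.
ARCH_FLAGS = {
    "AMD64": ["-O2", "-m64"],
    "x86":   ["-O2", "-m32"],
    "ARM":   ["-O2"],          # real cross/native flags differ per toolchain
}

def decrypt_source(blob: bytes, key: bytes) -> bytes:
    """Placeholder for the real decryption step. A shipped compiler
    library would carry a proper cipher and an embedded key, not this
    XOR toy."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(blob))

def install(encrypted_blob: bytes, key: bytes) -> Path:
    # 1. Check the OS/processor and pick the matching compile options.
    arch = platform.machine()              # e.g. "AMD64" on 64-bit Windows
    flags = ARCH_FLAGS.get(arch, ["-O2"])

    # 2. Decrypt the bundled source code.
    source = decrypt_source(encrypted_blob, key).decode("utf-8")

    # 3. Compile it for the machine we are actually running on.
    workdir = Path(tempfile.mkdtemp(prefix="setup_"))
    src_file = workdir / "app.c"
    src_file.write_text(source)
    binary = workdir / "app.exe"
    subprocess.run(["cc", *flags, "-o", str(binary), str(src_file)],
                   check=True)

    # 4. ...then the usual install steps: copy files, registry, shortcuts.
    return binary
```

The plumbing is the easy part, of course; the hard parts are the key management and maintaining the option-lookup table for every supported combination of OS and architecture.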

In theory, this would be a workable solution. Such a system could be implemented in Visual Studio, so even existing software could be distributed to mobile users, and versioning would no longer be an issue. Maybe someone with a deeper understanding of Windows development can identify problems with this scheme that I can't. Security might be a problem (as it always is with MS), but if the system were designed from the start with robust security in mind, even large developers of major, expensive, proprietary software (Adobe, etc.) could distribute this way without fear of a competitor (or worse, a pirate!) extracting the source code and reverse engineering the software. If anyone has any thoughts or comments, please share them.