After years of being a Windows user (since 2.0) and an administrator I've learned to ignore the marketing hype surrounding new Windows versions. But I tried out the Windows 7 beta just so I could settle arguments about what it can or can't do. It's only a minor upgrade from Vista with some stuff added and removed, but the fanatics have been evangelizing it like it's the start of a new era in computing technology. Linux netbooks really get them hyper.
Table of Contents
8. Control and Security
9. Shortcuts to Panic
10. Divide and Conquer
11. Sharing Almost Redefined
^ 1. Preparation
I can't give a fair comparison between Vista and Windows 7 so I'll be referencing XP a lot. I haven't used Vista much as I have no need for it internally and only used it at a customer site a while ago. The customer I did ECAD work for insisted on it which reduced the workstation performance by half compared to the previously installed XP, even with Aero disabled. Over RDC it was even worse with constant stalling especially compared to a Windows Server 2003 system on the same network (disabling auto-tuning didn't fix it either). I charge by the hour so it really wasn't a problem. I haven't worked for them since the Windows 7 beta was released so I can't install it on the same system. I'm not going to install Vista in VMware to compare either. I feel that a virtual machine is not a good way to benchmark a desktop OS because there are too many variables.
I downloaded the 64-bit DVD ISO (3.2GB) and installed it on VMware Player 2.5 on 64-bit Ubuntu 8.04.2 (Hardy Heron). I'm using a Phenom 9550 with 8GB PC2-6400 ECC memory and a pair of Maxtor IDE drives using software RAID (md), LVM, and LUKS/dm-crypt. Drive encryption doesn't affect performance much with modern processors as the drives are so much slower. An XP VM runs just fine on the same system. I gave the Windows 7 VM 2GB of memory and a pair of 16GB virtual volumes. The beta is the Ultimate edition which will be the most expensive of the lot.
^ 2. Installation
Windows 7 Setup is graphical like those on most Linux distributions (which they've had for many years) but the functionality isn't much better than the XP installer. The only really useful addition is the ability to use USB storage devices to load drivers instead of requiring a floppy drive. Like in XP it is single-task based, so every partition edit is immediately applied, while most Linux installers queue up a series of operations and then perform them in a batch. The installation and updates took a long time with several reboots (and of course entering the 25-digit key). I don't remember the details of what the one Vista installation and update process was like, but both are faster to install than XP with its service pack and several updates that often have to be performed sequentially. The progress messages from the setup screen did seem to indicate some updates were installed but after logging in Windows Update installed some more. The earlier messages may have been referring to updates on the ISO that weren't slipstreamed.
Ubuntu's graphical installation process is a lot faster, especially considering the number of applications included. Its updating can take longer but the package manager updates everything, not just the OS. I use an internal mirror with netbooting via PXELinux that's partially automated using Kickstart so my installations are very fast and already updated. You can achieve some of the same benefits on Windows with the AIK, WinPE, WDS, and slipstreaming but you still have to deal with licensing, product activation, and updating. Windows Update only covers Microsoft products so other applications need their own update functions else you have to do them manually. Like Ubuntu and Debian most Linux distributions come with text or graphical installers or both. Text installers are not as friendly but work on systems with limited memory. Graphical installers are easier but require more memory. Many of them are integrated into Live CDs that can be used for web browsing or playing music while the OS is being installed in the background. There are a few Windows Live CDs, mostly based on BartPE. I haven't tried any of them but once I made a Windows 98 Live CD with a DriveSpace volume that had Quake installed as a feasibility study for RAM disk usage on an embedded system. Using Windows 98 on an embedded system was STUPID but I wasn't the engineer in charge of the project.
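As a rough sketch, that netboot setup boils down to a pxelinux.cfg entry pointing the installer at a Kickstart file - the label, paths, and server address here are made-up placeholders, not my actual configuration:

```
LABEL hardy-install
  KERNEL ubuntu-installer/amd64/linux
  APPEND initrd=ubuntu-installer/amd64/initrd.gz ks=http://10.0.0.1/ks.cfg --
```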
The full install was about 9GB but I suspect there is some debug code and related utilities taking up space. There is a separate 200MB system partition with about 32MB used for the boot loader and its language support files. An XP install with IE 7 added is about 3.5GB at most. An Ubuntu install is about 3.5GB (with a lot more applications) and comes on a CD.
After installation the first issue I encountered was that Windows 7 doesn't have a driver for the default AMD PCNet virtual network device. I had to set ethernet0.virtualDev = "e1000" in the vmx file which it recognized as an Intel PRO/1000 MT device. I started with a VMware configuration from an XP VM. If you are starting from scratch you may find EasyVMX helpful. I then extracted and installed the VMware Tools.
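For anyone hitting the same problem, the relevant vmx lines end up looking something like this (the connection type shown is from my NAT setup and may differ in yours):

```
ethernet0.present = "TRUE"
ethernet0.virtualDev = "e1000"
ethernet0.connectionType = "nat"
```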
^ 3. Storage
Some of the features I wanted to try included Logical Volume Management (or "Dynamic Disks" as Microsoft calls it), software RAID, and drive encryption. I didn't use these technologies until I started using Linux. I knew that Windows had the capability but systems I built had hardware RAID and didn't need LVM or encryption. With Windows 7 Setup there is no way to configure these during installation. You have to create a basic disk and then convert it afterwards. What's annoying is that if you later erase the partition contents, leaving only the Dynamic Disks and RAID layout, the installer will install to them and they will be fully operational when Windows first starts. While the installer does indicate which partitions are dynamic or are using encryption it doesn't do so for RAID or anything on the system partition. Linux installers usually show the individual partitions and the child volumes for everything so it's easy to follow. I'm surprised by how primitive Windows Setup is considering how long it's been since they released XP and features like Dynamic Disks have been available since Windows 2000. It may be the result of the decision to only offer these features in specific editions of the OS (which causes problems in Vista with mixed installations). I haven't found anything definite about Dynamic Disk availability and the various editions of Windows 7 yet. Linux distributions don't have these restrictions and most graphical installers can set up RAID and LVM. For drive encryption the only installer that can handle it (that I'm aware of) is the alternate text-mode installer on Ubuntu (and Debian) but I suspect more distributions will add it. Drive encryption is really important for laptops.
I set up Dynamic Disks first, as software RAID requires them. Applying the change was instantaneous. While setting up the RAID mirror with the Disk Management tool I encountered an error that stated the boot configuration of the system could not be updated and I should use bcdedit.exe to fix it manually. I didn't bother, and I found that the system wouldn't boot from the mirror if the primary was removed. On Linux this is also a little tricky with the GRUB boot loader. The problem is that in a failure scenario it's hard to deterministically identify at boot which is the good drive versus the bad if the latter is partially functioning but not syncing. Obviously this requires a BIOS that can boot the next good drive if one has failed completely (or partially, after a timeout).
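The usual Linux workaround is to put the boot loader on every member of the mirror so the BIOS can fall back to the second drive. A sketch for a two-drive md mirror with GRUB legacy as shipped with Hardy (device names are assumptions):

```
# Install GRUB to the MBR of both mirror members so either drive can boot.
grub-install /dev/sda
grub-install /dev/sdb
```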
^ 4. Encryption
BitLocker is Microsoft's drive encryption system. Another implementation is BitLocker To Go, which targets USB storage devices. I tried to "turn on" BitLocker for the C: drive but found it can't use a dynamic volume. So I went back to Disk Management only to find that you can't revert a dynamic volume to basic if the OS is on it. So I had to reinstall. But then I found that the installer won't allow you to delete a dynamic partition. I had to boot Knoppix and use cfdisk to delete the type 42 SFS partitions. Then I was finally able to reinstall and activate BitLocker, then change the disks to dynamic, and then set up RAID mirroring.
On Linux these problems don't exist. Any combination of software RAID, encryption, and LVM volumes can be created and stacked in any order on almost any storage device. On my Ubuntu system I set up RAID first, then LVM, some logical volumes (usually one per user) within the LVM group, then LUKS/dm-crypt volumes on those, then format them to ext3. I split the encrypted volumes up this way because even a single bit error can cause major damage on encrypted data. With separate volumes I limit the damage if one develops an error. This means the partition headers for RAID and LVM are unencrypted but I can't imagine they contain any significant data that could compromise the system. Unintentionally I tested the reliability of this configuration over several months amid random system crashes. I eventually narrowed down the problem to RAM - I had been burned by a bad batch of Crucial BallistiX PC2-8500 2.2V modules like a lot of people. Only the final crash lost any data and I recovered most of it (the critical data was backed up elsewhere).
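For reference, the stack I described can be built with a handful of commands. This is a sketch only, with made-up device names, sizes, and volume names:

```
# RAID-1 -> LVM -> per-user LUKS volume -> ext3
mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sda2 /dev/sdb2
pvcreate /dev/md0                           # make the mirror an LVM physical volume
vgcreate vg0 /dev/md0
lvcreate -L 20G -n crypt-alice vg0          # one logical volume per user
cryptsetup luksFormat /dev/vg0/crypt-alice  # prompts for the passphrase
cryptsetup luksOpen /dev/vg0/crypt-alice home-alice
mkfs.ext3 /dev/mapper/home-alice
```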
From what I've read, setting up BitLocker on Vista required creating a 1.5GB partition for the loader/authentication system, which obviously can't be encrypted or else it can't boot. In Windows 7 the partitioning wasn't necessary and it looks like only the 200MB system partition is used (unless part of C: is not encrypted, which I can't determine). My system doesn't have a TPM so I had to use a USB key as it won't allow just a PIN. At first it insisted that a TPM was required but I found that this was due to the default setting in the Group Policy. After changing the setting it still wasn't working. I noticed that my USB flash drive wasn't enumerating properly and in Device Manager the USB Mass Storage Device was reporting "This device cannot start. (Code 10)". It turns out that I had over-optimized the vmx file and needed to add some settings back. BitLocker was now satisfied and I selected the new option "Require a Startup key at every startup" and selected the USB drive for the key. Next I had to choose what to do with the recovery key - save it to a removable drive, save it to a file in an unencrypted location, or print it. Both key files are small and have long names, probably a serial number. The recovery key is a text file that includes a description of what it is for. The startup key is a binary file with a BEK extension and the hidden and system attributes set. The final screen had a "Run BitLocker system check" option selected by default. It restarts the system and attempts to read the startup key from the USB drive. After I clicked the Continue button it just kind of did nothing until I found that a restart prompt dialog was hidden behind the Control Panel window (minor bug). It restarted and booted back into Windows and reported that the test failed. After several attempts I tried it without the test and it proceeded to encrypt the drive. It did take a while but the system was usable while this was occurring.
You could say its ability to encrypt the volume the OS was operating from is an advantage but I'm not sure it makes up for Windows Setup not being able to do it. After it completed I rebooted with the USB drive but the key check failed. Apparently it couldn't see the drive. I searched with Google a bit and found many other reports of the same problem with Vista. Many users thought the BitLocker utility wasn't saving the key to the USB drive but I think they were confused because it was hidden. It may be an issue with the BIOS not enumerating the device or the boot application not communicating with my particular USB drive. I ended up having to enter the recovery key and Windows booted. The recovery key is 48 digits long and is entered as 8 groups of 6 digits which is easy to enter with a numeric keypad. Each group apparently includes a checksum as it validates them as you type. After the last group is entered correctly it begins loading Windows immediately instead of waiting for you to press Enter.
On Linux, LUKS/dm-crypt uses PINs exclusively; the PIN decrypts a second key stored in the partition header which is then used to decrypt the volume. LUKS has eight key slots and any key can be used to add or delete the others. This can be combined with other authentication mechanisms via boot scripts for TPM, USB keys (via pam_usb), and pretty much anything else. The problem is that while scripts exist for some of the authentication options there isn't much of a standard implementation and many distros don't include all the functionality. Both have biometric support - Windows has a control panel applet and fprint tools are in the Ubuntu repositories - but I don't have a fingerprint reader and can't test either. Like with Windows, a part of the OS needs to be unencrypted so it can authenticate the PIN and unlock the rest of the volume. On Linux this is the /boot subdirectory, which contains the Linux kernel. This means the entire kernel loads and can provide access to any hardware device for which it contains a built-in module (driver). For both Windows and Linux the boot-time authentication system is the primary attack vector for an encrypted filesystem. If an attacker installs spyware it could copy the key elsewhere for later retrieval. The Linux kernel and start-up scripts can be configured to boot off a USB key instead, and WinPE could probably act in the same capacity for Windows. But this security hole also exists with the BIOS, and even with a TPM it's still possible to hack into or around it (although not easily).
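The key slots are managed with cryptsetup; a quick sketch (the device path is an assumption from my setup):

```
cryptsetup luksAddKey /dev/vg0/crypt-alice      # add a key to a free slot
cryptsetup luksKillSlot /dev/vg0/crypt-alice 0  # revoke the key in slot 0
cryptsetup luksDump /dev/vg0/crypt-alice        # list which slots are in use
```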
You can "suspend" BitLocker, which simply means it stores the key in plaintext on the drive someplace, making it pretty insecure. Obviously you could do the same with LUKS/dm-crypt but who would want to? According to the help file it's for tasks like BIOS updates, which may be due to the way TPM operates. You can also "turn off" BitLocker, which decrypts the drive - a surprising option. Like the original encryption pass, the system is usable while it is decrypting. Kind of neat, but again I can't imagine where someone would need it (even for forensics). I went ahead and tried it anyway. The decryption pass is very slow but it worked and I could boot without entering the keys again. Just for fun I turned it back on again to see how long it would take to re-encrypt. That's when I encountered a fundamental limitation of the whole architecture. Remember that I had to set up a Dynamic Disk and RAID after encrypting because BitLocker required a basic disk structure? Well, now that it was dynamic it couldn't encrypt it. So I had to reinstall all over again, using Knoppix to delete the partitions, etc.
An alternative encryption option is the file-level Encrypting File System. It can unlock files automatically upon login. Starting with the 8.10 (Intrepid Ibex) release, Ubuntu has a similar feature, the Encrypted Private Directory, that also unlocks at login. In addition, LUKS/dm-crypt volumes can be controlled by login when combined with pam_mount. With both Windows and Ubuntu the encryption keys are themselves encrypted when used by the login process and are associated with the user's password (a common weak point). The pam_mount implementation in Ubuntu (and Debian) has a few problems; one is that the association between the user's password and the decryption key is not maintained if the user changes their password. Of course with both Windows and Linux these encryption options are still vulnerable to spyware.
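A pam_mount volume entry for this looks roughly like the following (the user name and paths are placeholders, not my configuration):

```
<volume user="alice" fstype="crypt"
        path="/dev/vg0/crypt-alice" mountpoint="/home/alice" />
```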
In summary, Windows and Linux have comparable features with LVM, software RAID, and encryption. The Windows solution has a nice GUI and adds a couple of questionable features, but its implementation is very inflexible and limited by a crippled installer, architecture, and licensing. It's not that these features are "enterprise only" as they can be very useful for home systems and laptops. But they're unlikely to see usage outside of large corporate environments because of the limitations. Ubuntu and other distributions are very flexible and without licensing problems, but the encryption installation and authentication functionality needs to be streamlined. Both Windows and Linux encryption solutions have multiple options for management with directory services, but I don't have one set up and it's outside the scope of this review.
^ 5. Interface
The desktop looked plain enough. It wasn't using Aero since VMware Player's experimental DirectX acceleration feature isn't enough to get it working even with registry hacks. I don't find the desktop 3D effects trends useful and don't have Compiz enabled on my system since it currently doesn't work well with multi-head desktops and full-screen OpenGL applications. Those problems are expected to be solved with future releases of the DRI.
The new "superbar" taskbar is interesting. Basically you can pin a menu item to it, which is similar in functionality to the Quick Launch toolbar you could enable in the taskbar on XP. Task buttons for running applications also end up there, which at first seems confusing as you can't just look at it and tell what's running or not. The difference between a pinned menu item button and a running task button is that the latter will pop up a list of active windows when it has focus, similar to the "Group similar taskbar buttons" function in XP when there are more active task buttons than will fit on the screen. If a pinned menu item also has open windows then they are just listed above it, so it makes sense in a way. Right-clicking on an item allows you to unpin it and lists recent files used with the application. The button grouping function on XP slowed me down when I had many CAD documents open so I disabled it. I would have to use the superbar a lot to know if I like it better.
^ 6. Applications
There still isn't much in the way of applications included but some of the existing ones have improved. Some that were in Vista have been removed and others added. This is not a criticism of the lack of bundled applications, but when you compare it to Ubuntu and the relative installation sizes you wonder what's using up all the space. Internet Explorer 8 continues trying to catch up with Firefox and is also available for XP. Like web browsers in most Linux distributions it doesn't include the Java, Flash, or Silverlight/Moonlight plug-ins. Notepad is still useless as it doesn't handle LF-only newlines and still has no syntax highlighting, even for batch files. I normally install EditPad Lite. The calculator has improved and is equivalent to the default gcalctool on Ubuntu. WordPad is substantially better and can open and save OOXML and ODF files, but I didn't test for compatibility with Word 2007 and OpenOffice.org Writer 3. Its closest Linux equivalent is probably AbiWord, but I normally use Writer. Ubuntu includes OpenOffice.org but removing it and adding AbiWord saves about 300MB of space. Even combined they are much smaller than an Office 2007 installation. Paint now has multi-level undo. It also has a lot more scalable shapes, but having them in a raster graphics editor seems like a waste of code. Paint is basically useless for photo editing and is only good for people who think vector editors are hard. The closest F/OSS competitor is Tux Paint, which doesn't have the editing tools but is more fun. Both WordPad and Paint now use the Office 2007-style ribbon. I'm indifferent about the ribbon. It makes some functions easier to find but in applications with a lot of functions it makes the rest harder. There's also the standard selection of basic games with a few more network-enabled versions.
Windows Media Player 12 currently doesn't support XP and is still bloated fatware compared to foobar2000, Media Player Classic, and VLC (on Ubuntu I use Rhythmbox). Considering the number and types of plug-ins available it probably qualifies as its own OS. Initially it can only rip to WMA, MP3 and WAV, but plug-ins for Ogg and other formats are available. It's not very good at solving codec problems. I tried a few videos including Elephant's Dream (DivX MPEG-4) and it reported it couldn't play the files and the problem may be the file type or codec. A "Web Help" button took me to a web page about error #C00D1199 (not very helpful). On Ubuntu when Totem doesn't have the correct codec it offers to install what it needs. With Windows, advanced users often just squash the problem by installing every possible codec. The other audio utility, Sound Recorder, can only save to WMA when in XP it could only save to WAV. In Ubuntu, gnome-sound-recorder can save to FLAC, Speex, Ogg Vorbis, and WAV.
I don't watch much TV and I've never used a PVR although I've been intending to set one up. Windows 7 Ultimate includes Windows Media Center. There are several similar PVRs for Linux, most of which are free and support Windows also.
The data backup and restore functions are more integrated than the buggy backup utility from Veritas in XP. In the properties panel of files and directories there is a "Previous Versions" tab that can be used to restore them from backups and restore points. When you connect a removable drive one of the autoplay options is to "Use this drive for backup".
^ 7. Legacy
There are some old Windows NT holdovers in XP. The Explorer Install New Font/Add Fonts dialog and the ODBC Data Source Administrator are the two most obvious. I wanted to see if they were still around in Windows 7. The Install New Font option in the Fonts folder has disappeared as font installation is handled by a TTF file context menu. That left the ODBC utility. I went through the same routine as I would have on XP with an ODBC registration for an Access 2007 database. I didn't want to bother installing an Office 2007 trial so I installed the Access Database Engine (a.k.a. ACE) and then went into the Administrative Tools and ran the ODBC Data Source Administrator. I clicked the User DSN Add button but only an SQL Server driver was listed. I double-checked System DSN and, not seeing it there either, went through the Program Files directory to verify it actually installed. After some searching it turned out to be a 64-bit issue: the link in the Administrative Tools was to the 64-bit version of the utility, while the ACE driver is 32-bit, so I had to use the 32-bit version located at C:\Windows\SysWOW64\odbcad32.exe instead. So I finally was able to select the Access mdb/accdb driver, enter a name, and then select a database. Then I was greeted by an old familiar dialog. It may seem petty but I get annoyed if I have to map a drive letter to select a database on a server share.
<rant> For the record I think the Jet/ACE database is unreliable garbage and would take SQL Server any day. Unfortunately I have to defer to software engineers that insist it's easier with .NET to work with Jet/ACE than SQL Server because "SQL is hard". The Jet engine is the abandoned offspring of the SQL Server team - deprecated in favor of SQL Server Express. The Access team couldn't seem to live without a particular query function so they forked Jet and their version is known as ACE. If you install Access 2007 on Windows 7 you end up with both Jet and ACE. In spite of what Microsoft says about upsizing (data migration) from Jet/ACE to SQL Server they are not quite compatible due to different field types and date ranges. I have tried using Access 2003 with a SQL Server back-end but it couldn't handle things like tables with auto-incrementing key fields. I haven't tried the same with Access 2007 and don't plan to since I've come to the conclusion that using a fat client instead of a web interface for data entry and reports is a waste of drive space. </rant>
One annoyance was that any online help that used the HTML Help Control for chm files (like the ODBC Data Source Administrator) would open a window that was locked to always be on top. This makes working with any application full-screen impossible with the help open. I had to open the chm files manually to get around it. Most of the other applications use the newer help system which has the opposite problem - when you want to lock it on top you can't. Problems like this don't occur on Linux because the window controls are provided by the window manager, while on Windows each application has to implement its own controls. If a Windows application developer doesn't feel like adding an "always on top" control then you go without or use a third-party utility that can override the window properties; or maybe not legally, as the EULA states "You may not...work around any technical limitations in the software". Another irritation is the number of fixed-size windows with list boxes. I'm not overly fond of scrollbars, especially when I have lots of desktop space. They did add some new hotkeys for window management, which is an improvement, but some of them don't work without Aero.
^ 8. Control and Security
Like Ubuntu with its sudo implementation, the first user account created in Windows 7 has superuser privileges. This user can elevate their access on demand but doesn't log in directly with a superuser account (same with Ubuntu and the Linux root account). Explorer has a context menu on executable files that allows them to be run with administrative privileges. The same can be done with most Linux file managers by configuring an "open with" sudo entry. On both, most processes run using various system accounts.
This brings up everyone's favorite Vista feature - UAC. It has undergone some behavior modifications. If you are logged in with an account that has superuser privileges it only pops up if you try to install something or copy anything to certain directories like C:\Program Files or C:\Windows. Like in XP you get a security warning (not UAC) if you try to run an executable from any network location but not if you just copy it to the desktop or public folders (C:\Users\Public) and run it. Non-admin users get more UAC prompts. Considering typical user behavior is to use the first account with the least annoyances this looks like a set-up for a "blame the users" malware excuse that will keep Microsoft's lawyers happy. Every time I see "UAC" I think of the fictional company Union Aerospace Corporation that was responsible for the demonic invasion in Doom.
One odd feature is PC Safeguard, which is apparently an updated version of an add-on named SteadyState. SteadyState is available for XP but it's not something I encountered before. It adds a data reversion function for documents and settings in a shared user account. The changed data is stored as a file on the drive that is cleared at logout. On Linux you could achieve the same thing by mounting a temporary home directory on Aufs and deleting the rw branch at logout. Some of the reviews I've read promoted it for safer web browsing (remember that integrating IE with the OS was a feature). Compared to Live CDs this seems like a hack for administrators who don't know how to lock down a desktop or set up thin clients. Promoting it as a recovery mechanism reminds me of a CAD system I used long ago that had wonderful crash recovery right up to the last command issued. It was a feature that had evolved because all it did was crash.
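The Aufs trick looks something like this (the paths are assumptions; in practice you'd run the mounts at login and the unmounts at logout from a session script):

```
# Writable tmpfs branch layered over a read-only template home;
# unmounting discards everything the guest changed.
mount -t tmpfs tmpfs /var/lib/guest-rw
mount -t aufs -o br=/var/lib/guest-rw=rw:/home/guest=ro none /home/guest
# ...guest session runs here...
umount /home/guest
umount /var/lib/guest-rw
```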
One interesting feature is the integrated parental controls. On Linux there are graphical solutions for web access control (DansGuardian with the Ubuntu Christian Edition GUI and Mandriva's drakguard), but Windows 7 also includes controls for access to applications by user and optionally by the ambiguous ESRB ratings. I've encountered the need for per-user application access control with a parent that wanted to play games he felt were too violent for his kids. I solved the problem by changing ownership and permissions on the XDG desktop entry, but a GUI would make it easier for a parent to do it. I like using parental controls, not because they work, but because they encourage kids to learn more about computers while trying to get around them.
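What I did amounted to something like this (the file name and account name are made up):

```
# Let only dad read (and therefore see and launch) the menu entry;
# the kids' accounts fall under "other" and get nothing.
chown dad:dad /usr/share/applications/violent-game.desktop
chmod 700 /usr/share/applications/violent-game.desktop
```

Note this only hides the menu entry; the game binary itself needs the same treatment if the kids know how to run it directly.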
^ 9. Shortcuts to Panic
Lately there's been a lot of concern about Linux malware and XDG desktop entries. On Windows, programs that are runnable (executable) are determined by their filename extensions (exe, com, bat, cmd) which are hidden by default in Explorer. On Linux and other Unix-like operating systems the filename doesn't matter. Their files have attributes (read, write, and execute) similar to Windows (read-only, hidden, and system). In general a file can't be run as a program directly unless the execute attribute is set. It is possible to get a script (similar to a Windows batch or command file) to run without being marked executable if you call the program that interprets it (sh, bash, etc.) and tell it to run the script. When it comes to sending a program through e-mail the executability of a Windows program remains as the filename doesn't change (although some extensions are blocked by some e-mail servers). File attributes are not included with the file so the e-mail client that receives them saves it according to the settings on that system. On Unix-like systems that usually means without the executable attribute set. An XDG desktop entry is a text file that is essentially the equivalent of a Windows shortcut (lnk) file. Since the XDG file is not a program it usually doesn't have the execute attribute set which is technically correct. But the problem with both Windows lnk files and XDG files is that they can be created to run any program and specify any parameters so effectively they act as programs by proxy. They can be used as a basic malware carrier by being configured to create a simple program when launched. They could also be used to exploit an existing security hole in an application by having it connect to an outside source that has been maliciously created to exploit that hole. There has been some debate about requiring the file managers and desktop environments to not launch XDG desktop files without the execute attribute set. 
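To make the "program by proxy" point concrete, here is a minimal desktop entry written from the Desktop Entry specification - the name and command are harmless placeholders:

```shell
# Create a desktop entry whose Exec line runs an arbitrary command.
# Nothing here requires the execute attribute on the file itself.
cat > helpful-icon.desktop <<'EOF'
[Desktop Entry]
Type=Application
Name=Helpful Icon
Exec=sh -c "ls -la > /tmp/listing.txt"
Icon=help-browser
EOF
```

A desktop environment that launches this without checking the execute attribute will happily run whatever the Exec line says.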
Currently Gnome and KDE don't require the attribute but the file manager Thunar in Xfce does. Unlike Windows, which always has Explorer, on Linux there are many desktop environments in use, so a security problem with one is not necessarily a problem for the others. The problem with Windows lnk files has existed since the early days of Windows. Here is an example shortcut Target command line that creates an entry in the current user's Startup folder which will show a directory listing at every login:
C:\Windows\System32\cmd.exe /C echo cmd /K dir "%HOMEPATH%\Documents">"%APPDATA%\Microsoft\Windows\Start Menu\Programs\Startup\dir.bat"
This will work with Windows 95 and newer. Give it a nice helpful icon (C:\Windows\hh.exe) and hook it up to the ftp client (available since Windows 95) and you've got yourself script kiddie-quality spyware. Safeguards like caution dialogs can be implemented on both Windows and Linux, but don't underestimate the ignorance of users under the influence of social engineering, especially when they've grown used to being flooded with UAC prompts in Vista.
^ 10. Divide and Conquer
I wanted to try recreating the common Linux practice of separating user files from the rest of the OS with /home as a mount point for a separate partition. Whenever I rebuild a standalone Windows system I usually spend half my time backing up and restoring user documents so they don't get overwritten when I reinstall or use a recovery disk. I almost never use the Windows "repair install" option as usually the system has malware and replacing Windows components does nothing for any other application executables or their cached updates. With Linux you don't have to worry about wiping out /home during a clean reinstall and at most it will require fixing ownership of the existing files. In addition, the root account uses /root for its home so it's easy to isolate any problems from other users' home directories. With Windows it's a lot more complicated due to a history of unconstrained application file management. While the directory structure suggested where files should go, there wasn't anything preventing them from being written all over the place and overwriting system files. This led to crazy solutions like Windows File Protection which tried to automatically fix the damage after it happened. While system damage problems have been mostly solved with the introduction of Windows Resource Protection in Vista, many badly-written and legacy applications still store user files in the same directory as the application itself - resulting in permissions problems when multiple users access the same files. Newer applications should use the appropriate special folder, but some access them by an absolute path and don't follow if the target directory is relocated. By "should" I mean that they can't save them elsewhere because the "Designed for Windows" logo requirements prevent them from doing so; in other words - security by marketing agreement (Bad Sony! You don't get a logo!)
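On the Linux side the whole separation is a single line in /etc/fstab (the device name here is an assumption based on an LVM setup like mine):

```
/dev/vg0/home  /home  ext3  defaults  0  2
```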
On Linux, user applications running under a user's account can write to that user's home directory and not much else. With the rest of the system being open source, a distribution's staff can fix applications that don't follow the FHS rules.
With previous versions of Windows I've attempted to move user home directories and special folders to a different drive, and it was a mess. While some special folders like My Documents had a Target property that could change the target directory and move the contents, others like Favorites didn't have that option and required editing the registry and copying the files manually. I would try to move the files using XXCOPY, but dealing with access control list (ACL) problems and applications that insist on bad file management made it more hassle than it was worth. In Windows 7 all of the special folders are now relocatable, but legacy applications are still a problem. In XP each user had the "My Documents" special folder while other user file locations like "My Music", etc. were just normal directories contained within it. In Windows 7 these are all special folders and are no longer nested in My Documents. This is helpful because they can be redirected to different locations instead of being dragged along with My Documents. Another annoyance in XP was that other user-related application data was stored in a different directory higher in the tree. It is now hidden within the user's home directory (same as Linux).
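For the curious, the registry side of the old manual method went through the per-user User Shell Folders key. A sketch of a .reg file is below; the target path is made up, and note that the real values are of type REG_EXPAND_SZ (a plain string import like this only works for a fixed path with no environment variables in it):

```
Windows Registry Editor Version 5.00

; Hypothetical example: point the Favorites special folder at a second drive.
[HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders]
"Favorites"="D:\\Users\\someuser\\Favorites"
```

The registry change only redirects the folder; the existing files still had to be copied over by hand, which is where the ACL headaches started.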
On Windows, NTFS supports volume mount points or "mounted drives" which act similarly to directory mounting in Linux. You can target a mounted drive at any empty directory. During installation I reserved a portion of the virtual drive for a 2GB partition. Since C:\Users always has at least the initial user account in it I couldn't mount it there. I created a directory for a new user and mounted the partition there using the Disk Management tool. I then created the user account and changed the ownership and permissions on the directory. ACLs make this complicated with inherited permissions and such. Linux directory permissions are much simpler, but an add-on for ACLs is available. Normally the user home directory and special folders are only created when the user initially logs in, so I logged out and back in as the new user. I've noticed that the initial "Preparing your desktop" operation takes a very long time with nothing obvious to account for it. The new home directory and special folders are created, and probably the initial registry for the account, but I couldn't tell what else it did. It takes at least twice as long as on my XP VM, and I did make another account without the mounted drive just to make sure it wasn't interfering. On Ubuntu adding a user takes only a few seconds including storing the password, setting group memberships, and copying /etc/skel to the new user's home directory. There is an additional few seconds' delay on the initial login with Gnome while it creates its configuration directories, but that's it. After the long pause I found the mounted drive directory under C:\Users wasn't used. Instead it created a new directory named <username>.<domainname>, as it would if the system had joined a domain. To use the mounted drive I would have to change all of the special folders to point at the other user directory, or give the partition a drive letter and set them to that. This wouldn't work for the hidden application data directory either.
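For comparison, here's roughly what the whole exercise looks like on the Ubuntu side - a sketch only, run as root, with a hypothetical spare partition /dev/sdb1 and a made-up username:

```
mkfs.ext3 /dev/sdb1                                          # format the spare partition
mkdir -p /home/newuser
mount /dev/sdb1 /home/newuser
echo '/dev/sdb1 /home/newuser ext3 defaults 0 2' >> /etc/fstab   # make it permanent
adduser --home /home/newuser newuser                         # copies /etc/skel into it
```

adduser may complain that the home directory already exists; a chown -R newuser:newuser /home/newuser afterward covers the ownership, and that's the whole job - seconds, not minutes, and no surprise <username>.<domainname> directory appearing elsewhere.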
Special folders can also be redirected to a share on a server. That's not useful for a stand-alone system but it's common in larger business network environments. It doesn't help the legacy applications discussed above, but it's convenient when users need to access the same data from multiple systems and it allows for simpler backup. I knew I could redirect the special folders to a Samba share with the standard Windows SMB protocol, but I wanted to try connecting them to an NFS export. Adding NFS support to Windows 7 was fairly easy but I didn't bother with the name mapping for my simple test. The add-on does have a lot of options for setting default permissions, character encoding, buffer size, etc. NFS is not a browsable protocol so you can't just click on the Network icon in Explorer and find servers. NFSv4 adds a pseudofs option that can be used to create a read-only share listing all available shares, which can then be used to connect to the actual shares. If you know the exact address of the pseudofs export you can use the "\\<servername>\<sharename>" syntax in Explorer's address bar to see them, but I wasn't able to connect to the actual shares this way. I also could not see any way to map NFS shares to drive letters in Explorer. With SMB shares you right-click on the server's share list and select "Map Network Drive", but since NFS is not browsable at that level you have nothing to click. I was able to use the Windows "mount" command to map drive letters using either the Explorer \\<servername>\<sharename> or Unix <servername>:/<exportname> syntax, but there's no option to reconnect at login. There are ways to automate NFS drive letter mapping but it's a missing feature regardless. Being able to map NFS exports to a drive letter is enough to prove that special folders can be mapped to an NFS share. There are some issues with extra crud being left behind when some files are copied to the NFS export but I didn't test for it.
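For reference, the two syntaxes the Windows "mount" command accepted looked like this (server and export names are placeholders; run from a command prompt with the NFS client feature enabled):

```
C:\> mount \\fileserver\export Z:
C:\> mount fileserver:/export Y:
C:\> mount              (with no arguments, lists the current NFS mounts)
C:\> umount Z:
```

Since there's no reconnect-at-login option, a command like the first one dropped into a login script is one way to fake persistence.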
^ 11. Sharing Almost Redefined
During the installation, if Windows Setup detects you are connected to a network, you are prompted as to what kind of network the system is connected to - home, work, or public. Selecting home takes you to the "Homegroup" configuration. If you click the "Tell me more about homegroups" link the help pops up a blank window. Actually there's not much more to homegroups than that. It looks like there are basically five parts: an extension of network profiles, new special folders for file management and sharing (Libraries and Homegroup), SMB file sharing, SSDP, and a new generic file sharing account (AlphaUser$). To test this I set up another Windows 7 VM and tested between it and the first.
The network profile extensions are easy to understand. Take multiple profile support in NetworkManager and extend it to shared directory permissions, control of some network services like Samba, and firewall profiles and you have it. There are three profiles for home, work, and public with increasing levels of lock-down. In the firewall they correspond to the profiles private, domain (i.e. Active Directory), and public. The Homegroups are only active with the home profile. A relatively obvious and trivial extension of network settings control so it's probably only worth a dozen patents at the USPTO.
With XP the starting point for navigating user files in Explorer was "My Documents". In Windows 7 the new special folder Libraries is the starting point, but it only exists in Explorer and not in the underlying directory structure. The next level of special folders under Libraries are the media category libraries Documents, Music, Pictures, and Videos (more can be created). Their contents are the aggregate of the files contained in the sub-folders, which is very confusing when there are duplicate file names. These also are not represented in the underlying directory structure but do exist as files. For example, the Documents library file is C:\Users\<username>\AppData\Roaming\Microsoft\Windows\Libraries\Documents.library-ms. You can delete them from the navigation panel, which deletes the library file and hides the sub-folders in the Libraries view, but this has no effect on the actual sub-folder directories or their contents (some really bizarre architecture here). The default Libraries can be restored from the context menu of Libraries, which causes any sub-folders to reappear. Under each media category library are two more special folders, a read-only private one and a read/write public one; "My Music" and "Public Music" for example (the missing RIAA's Music is conspicuous). In the properties panel of a media category library you can set which of the folders within is the default save location. On Linux this would be similar to changing the settings in the local XDG config. There is also an option to optimize the library for various kinds of content but I'm not sure what it's for (indexing? compression?)
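The Linux analogue I mean is the per-user xdg-user-dirs configuration, where each standard directory can be pointed anywhere (the Work path below is just an example):

```
# ~/.config/user-dirs.dirs - paths must start with $HOME/ or be absolute
XDG_DOCUMENTS_DIR="$HOME/Work/Documents"
XDG_MUSIC_DIR="$HOME/Music"
XDG_PICTURES_DIR="$HOME/Pictures"
XDG_VIDEOS_DIR="$HOME/Videos"
```

Applications that use the XDG lookup get the new locations automatically; ones with hard-coded paths ignore it, which is the same legacy-application problem Windows has.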
Homegroups are a network representation of the Libraries - simply SMB file sharing with a different view and authentication mechanism. It doesn't work with anything except another Windows 7 system, which will annoy a lot of users (see comments at the bottom of the linked page). This could be another rusty bent nail in the coffin of XP, but normal SMB file sharing is still available. In Explorer's navigation panel there is a Homegroup location. Under it are objects for users on various systems, instead of just systems as you would see when browsing the SMB network. Under each is the same structure as the Libraries. Users can select which libraries to share with the Homegroup (adding them to the Homegroup group) or with specific users, allowing read-only or read/write access. If a file is dropped on a media category object it goes into whichever library within it has read/write access, presumably with some rules for choosing between multiple valid targets. From the network view each user has a private and a public directory, making it easy to keep files organized. Well, it would be if it had been implemented correctly, because each user's public directory on the same system is actually the same shared directory, C:\Users\Public (another brainless design decision). If you check the sharing properties of the Public directory you'll find that sometimes only the current logged-in user (along with Administrators and maybe Homegroup) is shown as having share access when in fact all local users have access; the dialog randomly hides them. Security through confused obscurity? Since the Public directory is shared by all users, everyone has a sharing veto. I can't imagine this scaling well.
The new authentication mechanism includes SSDP for resource discovery, the Homegroup group, and what appears to be a new generic sharing account AlphaUser$. In a way it acts as a new "network guest" account. I think the idea behind AlphaUser$ is to use it as a proxy for sharing when a directory service is not available to authenticate users between connected systems. I think that when a file transfer occurs it's done under the AlphaUser account and then the ownership is changed on arrival. A password is needed to join a Homegroup and one is randomly generated when they are set up. SSDP has been used before in conjunction with UPnP. I accidentally discovered that the auto-complete function in Explorer's address bar showed some of my previous Homegroup locations in an unfriendly path referencing SSDP:\\Provider\Microsoft.Networking.SSDP//uuid:27088a07-14dd-438f-8433-bc5933be615c
There are other problems with the current system. One user couldn't access the shared folders in Homegroup on the same system but could access everything on the other system. I also found that if a user disables sharing on a media category library it sometimes causes the sub-folder to show the wrong sharing icons. This makes it difficult to understand which permissions apply where. The Library special folder seems to just add unnecessary depth to the tree. The name is odd but I can't think of a better one. Perhaps the intent is that completed work will be stored there and the desktop used for files that are currently being edited. I know a lot of users who work that way. I've never liked Explorer; when it was introduced in Windows 95 I still strongly preferred File Manager with two vertical panes. I tolerate Nautilus on Ubuntu, probably because I have two monitors and its "Places" side panel is a lot less cluttered than Explorer, especially with the new Libraries and Homegroup objects. On Windows I always use xplorer².
A default install of Ubuntu doesn't include SMB file sharing although it's easily enabled by adding Samba. Gnome and KDE (on Kubuntu) do have integrated SMB network browsing that doesn't need Samba. Sharing between users on the same system is easy; maybe too easy. One policy I really don't like is that every user's home directory is readable by everyone else on the system by default. This is done to ease sharing but is unnecessary. Parents don't necessarily want to share everything with their kids, like "marital photos". Each user has their own group (USERGROUPS=y in /etc/adduser.conf) and their home directories are set to be readable by members of their group. I normally remove read access for others (and for newly created users with DIR_MODE=0750 in adduser.conf) because if one user needs read access to another user's home directory that user can simply be added to the other's group. This provides a lot more control than the default settings. For ad-hoc sharing I have a "local" directory that all local users can write to and a "public" directory that is the same but shared via Samba and NFS. To get around authentication problems I normally set Samba to treat unidentified users on the network as guests ("map to guest = bad user" in /etc/samba/smb.conf). Using a specific account with a password as a proxy (like Homegroups do) would make sharing between systems easier and more secure.
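The DIR_MODE=0750 setting boils down to plain Unix permission bits. A quick sketch on a throwaway directory (standing in for /home/<user>; on a real system you would also chmod the existing entries under /home as root):

```shell
#!/bin/sh
# Demonstrate what DIR_MODE=0750 gives a newly created home directory.
home=$(mktemp -d)       # stand-in for /home/<user>
chmod 0750 "$home"      # owner: rwx, group: r-x, others: ---
stat -c '%a' "$home"    # prints: 750
rmdir "$home"
```

To let one user read another's home under this scheme, add them to the other's per-user group (adduser alice bob on Ubuntu) instead of opening the directory to the whole system.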
^ 12. Epilogue
The Windows 7 beta seemed relatively stable, but I wasn't installing much or putting it under continuous use. I had a lot more problems with the initial release of Vista. I did manage to crash Explorer a few times without trying, but this is a "beta", which is the equivalent of an "alpha" for most other software projects. This is why many system administrators wait until the first Windows service pack before mass deployments. Since they're skipping a second beta you might want to wait until SP2. Of course version numbering and service pack availability are taking on a marketing influence, so it's getting more difficult to know when it's safe to deploy.
Some readers may get the impression that I'm against commercial or closed-source software. Actually, I don't have a problem using either. I've used the Linux versions of Perforce and TestTrack Pro, Nvidia's driver for my graphics card, and games like Quake 4. Given equal functionality I'll take an open-source product over a closed one, as it gives me the option to maintain it if the developers abandon it, which reduces risk. I don't use Linux because it's free. I could easily use BSD or Solaris. I use Linux because I won't use Windows unless somebody's paying me to. Considering how long I've used Microsoft's products, that's saying quite a lot.