Well, I just tried to reboot my laptop by clicking Start, then Shut Down, and picking Restart. What happened? It hung for 10 minutes with no activity. Everything stopped responding - no hard drive light, nothing. Finally I had to power off. They used to call it Plug and Pray back in the days when Windows 95 came out (notice the "r"?). This is a high-end Dell laptop with 4 gigabytes of RAM (heck, my Amiga 2000 had 5 MB and that was a lot!), a Core 2 Duo CPU, and supposedly a big I/O bus, but oy, does it spend a lot of time hanging, waiting for the hard drive.
So, in all this time I have to wait, I've been contemplating: what is it, exactly, that gives me the impression that no matter how fast the computer is, it is still too slow? Have you thought the same? Back in the 1980s there were the DOS PC, the old Mac, the Apple IIe, the Commodore 64 and Amiga, the Atari 800 and ST, and so on. They were slow by today's standards, but back then they seemed fast enough - until you got used to them and wished they were faster. Each new generation of machines was "faster," yet the same pattern repeated: fast enough at first, then you wished for more. Heck, if you kept the same machine for 6 years, it ran slower than the day you got it. This is particularly a Windows phenomenon. I've noticed it because, being the miser I am, I like to recondition old equipment and use it for other purposes, so I have reformatted and reinstalled Windows many times on old machines, only to find - hey, they run a lot faster after a reinstall, almost as fast as a brand new machine!
So, what's that all about?
First, Windows itself is a behemoth operating system built version upon version upon older versions, with a mandate to maintain compatibility all the way back to MS-DOS 3.0, circa 1984. Remember the old IBM PC? In fact, Windows is very much like the Internet in this respect. The Internet as it stands today rests on basic standard protocols called TCP and IP, developed by the Defense Advanced Research Projects Agency (DARPA) and several universities, including my alma mater, U of M. These protocols grew out of research in the late 1960s and 1970s, with the goal (so the story goes) of withstanding a nuclear holocaust and still routing data between defense and university facilities. They were not designed for efficiency of communication, nor for anything beyond simple text files, data files, e-mails, and the like. Nowadays, on top of those basic protocols, we have added the rich Web (invented in the late 1980s and deployed in the early 1990s), streaming media, mobile apps, voice and fax virtual phone lines, virtual networks, rich e-mails with formatting and embedded media - it's amazing what they do with 1's and 0's. However, notice that the Internet doesn't work very well for these things. Why? Because the basic foundation - packets split up and finding their own way so they can arrive in spite of a nuclear blast - was not designed with them in mind.
Windows is just like this - it grew up on a simplified PC hardware architecture: a CPU, 640K of RAM, a text-mode monitor, and a keyboard. Oh, you want graphics? Sure, we can do that - we just add a patch here. Oh, you want a mouse too? Sure, another patch. One for sound, one for USB devices. Put it all together and you end up with a patchwork built upon an ancient design.
UNIX, on the other hand, has had quite an interesting series of rebirths. It started out in similar times, but with a different bent and direction. As opposed to Bill Gates and his cadre of executives and lawyers, UNIX started out as an open operating system, without the avid goal of making money: AT&T Bell Labs published its source code. Much UNIX software has remained like this, spawning institutions such as the GNU Project and its General Public License, and influencing Sun Microsystems and others. That openness extends to the design and the handling of new constructs. Take, for instance, the famous (and possibly apocryphal) decision made at Microsoft in the 1980s: most computers then had at most 64K of RAM, so they multiplied it by 10 and said, "That's the most memory we'll ever need," and proceeded to design their operating system under that limitation. UNIX (AT&T), on the other hand, said: the largest address we can represent in a word of X bits is 2^X, so that is the memory ceiling. And it turns out the maximum addressable memory depends on the processor you run on, so as the hardware gets bigger, UNIX addresses the memory, no problem.
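To see what that design choice buys you, here is a minimal sketch of the address-space arithmetic - my own illustration, not anything from either OS. The ceiling follows the hardware's word size instead of being a constant baked into the software:

```python
# Address-space arithmetic: how much memory can be reached with a given
# number of address bits. The ceiling scales with the hardware.
for bits in (16, 20, 32, 64):
    max_bytes = 2 ** bits
    print(f"{bits}-bit addresses reach {max_bytes:,} bytes")

# 16-bit -> 65,536 bytes: the 64K machines of the early 1980s.
# 20-bit -> 1,048,576 bytes: the 8086's 1 MB, of which 640K went to programs.
# 32-bit -> 4 GB; 64-bit -> 16 exabytes. No redesign, just bigger words.
```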
The bottom line is, we are all limited not by the resources at hand, but by what we imagine to be resources. The decisions we make, the actions we take, and more importantly, the assumptions we make without realizing those are merely assumptions, become the box that sets the limits.
So, fast forward to today, the end of the first decade of the 21st century. The fact that Windows works as "well" as it does is pretty much a miracle! Anyhow, that was a long digression on the history of computers, but I find it quite interesting that this Dell, which has basically the exact same hardware as my MacBook Pro, runs so slowly - while the MacBook Pro is insanely solid and fast.
The only times in the past month I have had to reboot the MacBook were when it downloaded a software update from Apple that said "you should reboot." That's twice. In fact, when I close the lid, it goes to standby in, no kidding, 3 seconds. When I open the lid, no kidding, it comes out of standby and connects to the WiFi network in 3 seconds. I don't know how they do this magic - and that speed is truly magical. I suspect that a) the UNIX-based Mac OS X is a well-designed, well-organized OS, and b) the lack of real-time monitoring "security" software like antivirus and antispyware really gives it a boost - that stuff probably causes as many problems as the software it protects me from, or more!
So, how do I solve the problem when the Dell won't reboot? I give the power button a nice, hard, satisfying jab, hold it down for 20 seconds, and it turns off. Bam. Almost as instantly as the Mac shuts down normally. Well, at least there's the satisfaction of having some level of control after all.
Second, what slows machines down is the accumulation of "garbage." Windows is notorious for this, but to be fair, all computers are susceptible to it to some degree. For example, central to Windows' operation is an internal database called the Registry. The Registry is a central location where settings are stored - settings for your hardware, for Windows itself, for installed software and drivers, and more. As you can imagine, it can get huge. Things get left behind by upgrades, uninstalls, and normal usage ("garbage"), and they drag down overall system performance. There are utilities out there to clean this stuff up; my favorite is Glary Utilities. Still, I find it interesting that the old UNIX way of doing things - plain text configuration files, unchanged in spirit for 40 or 50 years - works better than the newer Registry (which took on its central role with Windows 95).
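Just to make the Registry concrete, here is a minimal, read-only sketch (Windows-only) using Python's standard winreg module. It lists the per-machine uninstall entries - one of the places where leftovers from long-gone programs tend to pile up. The key path is the standard Windows one; everything else is my own illustration, and nothing here deletes anything:

```python
import winreg

UNINSTALL = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, UNINSTALL) as root:
    num_subkeys = winreg.QueryInfoKey(root)[0]     # count of subkeys
    for i in range(num_subkeys):
        name = winreg.EnumKey(root, i)             # one subkey per product
        try:
            with winreg.OpenKey(root, name) as key:
                display, _type = winreg.QueryValueEx(key, "DisplayName")
                print(display)
        except OSError:
            # No DisplayName value: often an orphaned or hidden entry.
            print(f"(no display name) {name}")
```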
Third, of course, is the file system - the way the computer organizes files on hard drives and storage media. Old Windows and DOS used FAT and FAT32 (yeah, there are jokes about it); Windows now uses NTFS. However, there are inefficiencies in the Windows file systems. First, if you have a lot of files in a folder, operations get very slow - especially with a lot of small files. Second is inefficient use of drive space: space is allocated in fixed-size blocks (clusters), so file sizes get rounded up to the next whole block, and the tail end of the last block is wasted - a file smaller than one block still consumes a full block on the drive. UNIX again seems to do much better in drive use. One very interesting example is my MP3 players: I had a Sansa Fuze with 8 GB of memory, and my 1,200 songs pretty much filled it up. The same songs stored on my iPhone take up much less space - even though the iPhone OS leaves much less than 8 GB free to start with, everything fits and I still have room. (The iPhone runs a mobile-modified version of Mac OS X.)
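Here is a minimal sketch of measuring that rounding waste ("slack") for a folder. The 4 KB cluster size is an assumption - it's a common NTFS default, but your volume may differ - and the path is just an example:

```python
import os

CLUSTER = 4096  # bytes per allocation unit (assumed; check your volume)

def slack(root):
    """Return (total data bytes, bytes wasted to cluster rounding)."""
    used = wasted = 0
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            try:
                size = os.path.getsize(os.path.join(dirpath, name))
            except OSError:
                continue  # unreadable or vanished file; skip it
            # Round up to the next whole cluster: a 1-byte file costs 4 KB.
            on_disk = -(-size // CLUSTER) * CLUSTER
            used += size
            wasted += on_disk - size
    return used, wasted

used, wasted = slack(r"C:\Windows\Temp")
print(f"data: {used:,} bytes, slack: {wasted:,} bytes")
```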
Coupled with the file system is the collection of "garbage" files. These are typically temporary files created by software for a few moments' or a few hours' use - and then the software usually "forgets" to clean up after itself. (Hmm, does that sound like my daughter?) These collections of files slow the machine down (remember, the Windows file system gets slower as the number of files stored in it grows). The marvel here is that UNIX-family file systems like UFS (and Apple's HFS on the Mac) have been around for so long and degrade far more gracefully as entries pile up, thanks to clever indexing and entry-location schemes. In Windows, temp files collect everywhere, especially in the Windows Temp folder. Glary Utilities can help identify and clean these, improving system performance. And it is good practice to clean out the stuff you no longer need. However, I admit that a) you don't always know what you have, because a lot of it accumulates behind the scenes, and b) you may not know when or whether you'll need it again, so you may want to hang onto it.
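As a taste of what those cleanup tools do, here's a minimal, report-only sketch: it walks the current user's temp folder and lists files untouched for 30 days. Nothing is deleted - the threshold and the decision to delete are yours. (This is my own illustration, not how Glary Utilities works internally.)

```python
import os
import tempfile
import time

CUTOFF = time.time() - 30 * 24 * 3600  # 30 days ago, in seconds

for dirpath, _dirs, files in os.walk(tempfile.gettempdir()):
    for name in files:
        path = os.path.join(dirpath, name)
        try:
            if os.path.getmtime(path) < CUTOFF:
                print(os.path.getsize(path), path)
        except OSError:
            pass  # file vanished or is locked; skip it
```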
I have found that archiving files off onto CD or DVD for later use works well, and I have written a utility (MediaCat) that indexes those files into a database you can search later to locate both the file and the disc. Surprisingly, there don't seem to be many other utilities that do this - there are a few, but none of them tell you where you put the disc, only the name of the disc. Kind of useless when you have 300 discs!! Hmm, what was that I was saying about garbage collection??
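The core idea is simple enough to sketch. Here's a minimal version in Python with SQLite - to be clear, the schema and function below are my own illustration of the concept, not MediaCat's actual design. The point is that each row stores the disc's physical location alongside the file path, so a search answers both "which disc" and "where is it":

```python
import os
import sqlite3

db = sqlite3.connect("catalog.db")
db.execute("""CREATE TABLE IF NOT EXISTS files (
                disc TEXT, location TEXT, path TEXT, size INTEGER)""")

def index_disc(mount_point, disc_name, shelf_location):
    """Walk a mounted disc and record every file, plus where the disc lives."""
    rows = []
    for dirpath, _dirs, files in os.walk(mount_point):
        for name in files:
            full = os.path.join(dirpath, name)
            rows.append((disc_name, shelf_location,
                         os.path.relpath(full, mount_point),
                         os.path.getsize(full)))
    db.executemany("INSERT INTO files VALUES (?, ?, ?, ?)", rows)
    db.commit()

# Usage:  index_disc("D:\\", "Backup 2009-11", "binder 3, sleeve 12")
# Search: SELECT disc, location FROM files WHERE path LIKE '%vacation%'
```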
So, much like my Chrysler Town and Country, the longer I have this Macbook, the more I like it.