Wednesday, March 4, 2015

Google is the new IBM

A Tangent

This is the 128th post on my blog, and as anyone in computer science knows, that is a significant number.  128 is 2 to the 7th power, exactly half of 256 - the number of distinct values a single byte can hold.

The History

It has happened again - the world has gone full circle.  The world of computing, that is.  In the 1980's, as we moved from the IBM PC and Apple II into a full-blown explosion of ubiquitous computing, IBM and Microsoft developed what became a baseline for computers.  We used to remark that "everyone and their brother" had a brand of computer, and there was a plethora of components available so you could build your own.  Every DOS- and Windows-compatible computer was called an "IBM Compatible," but just as often it was also called an "IBM PC" - even the ones made by others (Brother, Epson, Gateway, and a thousand other names).

How did this come about?  Microsoft was just beginning when Bill Gates bought the rights to DOS (originally QDOS, from Seattle Computer Products).  Microsoft then held the copyright (vigorously defended), and sold usage rights to hardware manufacturers like Tandy, IBM, Compaq, and others.  Since IBM was the biggest company in that field at the time (and a pioneer and innovator in the hardware), it became the best-known brand name.  Note that Apple devised one of the first "all-in-one" computers - keyboard and CPU in a single case instead of a kit of separate components.  That design - Steve Wozniak's engineering driven by Steve Jobs' vision - revolutionized personal computing, but IBM perhaps more than anyone capitalized on it.

(Side Note) Ironically enough, the reason IBM developed the Personal Computer at the time was that they felt they needed a product that would prove, by how limited it was, that the only way to do serious computing was to get a real computer - a mainframe.  The PC would be a showcase in failure, and when it failed, it would morph into a terminal for the mainframe (and the AS/400 mini).

As prices came down - thanks to technology and manufacturing improvements, and competition among a myriad of manufacturers - the public bought into small-system computing in a big way.  IBM was a household name even before this revolution (think typewriters - before our first computer, we had an IBM Selectric that I spent hours hammering out my various writings on), so they cemented their name during this phase.  Everybody and their brother was licensing technology not just from Microsoft, but also from IBM - as IBM patented many inventions and advancements in personal computing.

Along came Google - a company named after "googol," the enormous number whose name was coined by a mathematician's young nephew.  The search engine company that was really an advertising company.  We all know their story: they got huge, fast.  They branched into web browsing, and then the mobile smart device market opened up.  Apple revolutionized yet another market, and Google decided to venture into it with a mobile OS - Android.

Now, the various aspects of Google come together - web browser, search engine, advertising, and a mobile OS.  This is a massive synergistic environment designed to make money off of finding your personal habits and selling that information to advertisers.  But like smart cookies (yes, pun intended), they made their Android OS open source.  This follows in the long tradition of UNIX, developed at AT&T Bell Laboratories (think Dennis Ritchie and Ken Thompson), whose source code was widely licensed in its early days.  Many different branches of UNIX abound, and indeed a large Unix-like family re-opened that openness as true open source - Linux.  At this point, it's fair to say that thousands of different flavors of UNIX and Unix-like systems are in continuous use today - if you include all the Linux distributions, Mac OS X, and the derivatives of OS X: iOS, CarPlay, and whatever they call the Apple TV OS.

So, Google has entered the fray with a very successful mobile OS, Android.  Many other people have "branched" or "forked" it by downloading the source code, modifying it, and building their own version of Android.  Now, various downstream versions of Android are running on a plethora of machines - not just phones and tablets, but also embedded devices, set-top boxes, and more.  Truly, the ubiquitous nature of Android means it has become the de facto standard by sheer number of devices.  And thus, Android runs on everybody and their brother's equipment.

However, it is a different mix today than in the late 1970's and early 1980's when this was all beginning.  Computing is everywhere - not just on your desk or in your lap, but on your wrist, in all your electronics, heck even woven into the fabric you wear.  And Apple is now a huge, major player in all of these markets, even more so than in the days of the Apple II and IIe.  And Apple is one big, cohesive unit of consistency and connectedness, whereas Android is a scattered, confused mess of hardware manufacturers, versions, app stores, and more.

What It Means To Us

Does anyone remember what happened in the IBM PC days of PC proliferation?  What were some of the issues we had?  First, we had all kinds of compatibility problems - drivers, hardware, and more.  Software constantly crashed and malfunctioned because the underlying system was different from the one it was developed on.  Then came a watershed moment as computers got interconnected - viruses spread like a plague.  So if you had a certain video card, or the wrong version of a driver, or some hard drive controller, or some modem, or mouse, or keyboard, or what have you - things didn't work right because it wasn't all compatible together.  It was the Wild West.  Personally, I hated "PC Compatibles" not only because they were the norm, but because they were boring and built with poor quality from hardware to software (and those few built with good hardware quality had software and firmware issues - heck, they ran DOS and Windows!).

The great thing about the proliferation is that it led to cheap equipment - but the double-edged sword meant shrinking profits for manufacturers and the moving of jobs overseas.  Lots of variety, but we got bitten by the lack of consistency.  In 1987, when I started selling PCs, the sales margin on a PC was 30 to 45 points (or more, perhaps).  By 1990, after I had gotten out, that had shrunk to maybe 10 points, and it is now around 2-3 points.

Enter Apple.  What's different about what they did?  They moved their flagship systems to a UNIX-based OS, and they had expansion slots to add third-party hardware - so how is what they did different?

Apple has throughout its history been criticized for maniacal control of its environments.  From hand-selecting components for its computers to an ecosystem that closes out competitors, many have said they did it for profit.  Profit is the end result, but not the primary goal.  As has been evidenced over and over, a singular focus on profitability will lead you to stumble off the profitable path, because you lose focus on the basics of product quality and customer satisfaction.  Then somebody comes in and undercuts you on price with the same product, and you are finished.  Kaput.  Brand loyalty doesn't exist, because it was never your focus.

Apple has consistently (since its rebirth after booting John Sculley) kept every aspect of its products controlled in order to produce a consistent, reliable series of quality products that work together seamlessly.  Expansion is through industry-accepted standards (USB, SATA, Ethernet, SODIMM, etc.), or at least ones that they deem should become accepted (Thunderbolt, Lightning, etc.).  Software remains tight with security to protect users' privacy - the company sells products, not subscriptions (yes, I know about iCloud storage and iTunes Match).  You buy an iPhone, you buy an iPad, you buy a Mac, and that's it.  They don't want your data - they don't want to snoop on their private, encrypted text messaging platform, and they don't want to collect your usage statistics to sell to advertisers.  I know: as part of the Apple iOS Developer Program, I have not had any access to iTunes customer data to help me sell my apps.

With Android, everybody and his brother makes an Android device.  But remember the fork?  Those manufacturers forked Android two or three years ago to develop that new device you just got for Christmas or Hanukkah.  So now you have a brand-new device running ancient software that doesn't protect you from viruses.  Worse, suppose you are an app developer - which version of Android do you target?  4.1?  4.2?  4.3?  4.4?  5.0?  Which devices do you target?  You probably go with the big markets - Samsung, HTC, LG, etc.  That both limits you and makes your development and testing cycles much more complex.
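To put that targeting problem in back-of-the-envelope terms, here is a tiny sketch.  The version and device lists are purely illustrative (not real market data), but they show how every extra OS version multiplies the configurations a developer has to build against and test:

```python
# Hypothetical OS versions and device families an Android developer
# might choose to support; a real list would come from market analytics.
android_versions = ["4.1", "4.2", "4.3", "4.4", "5.0"]
device_families = ["Samsung", "HTC", "LG", "Motorola"]

# Every (version, device) pair is a configuration to test.
test_matrix = [(v, d) for v in android_versions for d in device_families]
print(f"Configurations to cover: {len(test_matrix)}")  # 5 * 4 = 20

# By contrast, a single-vendor platform supporting two OS versions:
print(f"Single-vendor equivalent: {2 * 1}")  # 2
```

The point is not the exact numbers - it's that the matrix grows multiplicatively, which is why fragmented platforms are so much more expensive to develop for.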

And what are the results?  Let's look at one market: smartphones.  A recent article shows that Apple, even with a global market share of around 25%, accounts for about 90% of the industry's profits.  And it isn't as though their products are that much more expensive than competitors' - indeed, they cost about the same.  Sounds like they are doing something right.

So, back in the day I was an expert, but I never did like the IBM PC (or the PC-compatible clones).  For the same reasons and more, Android has many drawbacks.  For a technical geek it may be the cat's meow, because it is more open, more accessible to the inner workings.  But I am both a technical geek and a user, and reliability and security, coupled with a commitment to protecting privacy and personal information, matter more to me.  I don't need a remote terminal login to my phone - and I'm fine without it!  And I certainly don't want an open messaging platform where the vendor gleans everything from my text messaging (and everything else I do online) and uses it to monetize my personal, private activities.
