The Tablet that Desperately Wants You to Buy It

Over the holidays I chuckled a little bit every time I saw a commercial for the Microsoft Surface. It wasn't hard to see the desperation in those commercials, which were trying so hard to convince people to replace their MacBook with a Surface. What is pathetic is that these commercials use the same old tried-and-failed tactics that Old World Technology companies have been using for years with little success. Now they are using them seemingly louder and more often, which only makes them all the more sad.

First, the commercials focus almost exclusively on hardware features. From the particular processor they use, to the removable keyboard, to the use of a stylus, the marketing team behind these campaigns still believes that people buy technology devices based on nothing but hardware specifications. That ship sailed way back in 2007 with the introduction of the iPhone. However, it's the only weapon that Old World Technology companies have in their holsters, so they'll keep pulling the trigger even though it's out of bullets.

Notice also that the commercials never mention that these devices run Windows 8, because you might as well say these tablets are infected with Ebola. The common perception of Windows 8 is not favorable (to put it mildly), and Microsoft is performing marketing gymnastics to avoid talking about Windows 8 while promoting Windows 8-based devices. No wonder Microsoft has now announced that Windows 10 will be free for Windows 7 and Windows 8 users.

It's also interesting that Microsoft chooses to compare their devices to a MacBook laptop instead of an iPad tablet. The definition of a "laptop" is blurred, since for what a lot of people do, an iPad is more than enough. People can also buy keyboards for the iPad and make it much more like a traditional laptop. Styluses can be added to an iPad as well. However, an iPad starts at $499, while a Microsoft Surface Pro 3 starts at $799. The keyboard that Microsoft raves so much about will set you back another $129. And don't think that the keyboard is an "optional" accessory. Attempting to use the Surface Pro without a keyboard is like attempting to use, well, a laptop without a keyboard. So in reality the Surface Pro starts at just under $1,000. Maybe now we see why Microsoft wants to position the Surface against the MacBook: the prices are comparable. It is a very tough sell to convince someone to buy a $1,000 Surface over a $499 iPad. The problem is that people don't really want a tablet that can replace their laptop. They want a tablet. Just a tablet. A tablet that is simple to use, won't break the bank, and doesn't require overly complex accessories.
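For the record, here is the simple arithmetic behind that "just under $1,000" figure, using the retail prices cited above:

```python
# Entry-level prices cited above, in USD.
ipad = 499
surface_pro_3 = 799
type_cover_keyboard = 129  # the practically mandatory "optional" keyboard

surface_total = surface_pro_3 + type_cover_keyboard
print(f"Usable Surface Pro 3: ${surface_total}")         # $928, just under $1,000
print(f"Premium over an iPad: ${surface_total - ipad}")  # $429
```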

Let’s talk a little more about the keyboard and stylus that Microsoft thinks are the best thing since sliced bread. As I mentioned above, for all the talk that the Surface is a tablet with an optional detachable keyboard, if you’ve ever used a Surface you know that the keyboard is virtually non-optional. The same thing goes for the stylus. The commercials say, “I can write with a pen.” The reality is, “I MUST use a pen.” Attempting to use the Surface without a stylus is an exercise in frustration. The reason is that for everything that Microsoft says about the Surface Pro being a modern tablet, its operating system is still rooted deeply in the traditional (i.e. Old World) Windows desktop model.

Does anyone remember the “Tablet PCs” that Bill Gates declared were “the future” back in the year 2000? Probably not, because if you were unfortunate enough to get suckered into purchasing one back then, you found out that the “Tablet PC” was really just an over-priced, overweight laptop that had a stylus. Sales of the so-called Tablet PCs were abysmal, eerily reminiscent of how the Microsoft Surface line has sold so far. The main problem then, as now, is that attempting to use an operating system designed for a keyboard and mouse with a touchscreen interface is unwieldy, cumbersome, and generally just plain awkward.

Of course today Microsoft has Windows 8, which is an operating system designed for tablets and mobile devices, so things should be better now, right? Unfortunately, Windows 8 has only made things worse for Microsoft. Windows 8 is a Frankenstein of an operating system, attempting to combine the old desktop paradigm with a new touchscreen interface. The problem for Microsoft is that people nearly universally hate the new Windows 8 touchscreen interface. So they fall back to the familiar desktop Windows interface, which is really hard to use with a touchscreen. All the controls are too small to use effectively with a fingertip, so a stylus is not optional but a practical necessity. But this introduces a new problem: holding a tablet in one hand and a stylus in the other is doubly awkward. So that lauded kickstand suddenly becomes not just a nice feature but an absolute requirement, because you need to put the device down on a table to actually use it with a pen. Suddenly that tablet isn't much of a tablet. Oh, and hopefully you don't lose that stylus, because it costs $45 to replace. It's also easy to lose, by the way, because unlike previous Tablet PCs, there is no slot to slide the stylus into when you're not using it.

If you've ever attempted to use a Surface as a true tablet, you quickly realize that it is just not the right shape to use effectively. It is simply too tall to use handily in portrait mode (i.e. "vertically") and too wide to use comfortably in landscape mode (i.e. "horizontally"). It is quite apparent that the Surface line was designed as a laptop with a detachable keyboard. For some reason Microsoft thinks that people really want their tablets to be laptops. But it doesn't take a genius to see that this is not true. There is a reason iPad sales took off like a rocket. It was the right combination of computing power and simplicity, with no baggage required. Most people didn't miss the keyboard. Virtually no one complained that there wasn't a stylus. For those who did want keyboards and styluses, a plethora of third-party options are available, but the important point is that none of these add-ons are necessary. The iPad works just fine with nothing more than your fingers.

Then there are issues of fit and polish. The fact that the Surface Pro is really just a traditional Windows PC stuffed into a very thin form factor brings along the heat issues that affect all PCs. One of the first things people will notice about the Surface Pro as compared to other true tablets is the sound. No, not the audio from YouTube videos, but rather the sound of a fan. Yes, the Surface Pro requires a fan to keep it from overheating. People will also notice, if they hold the Surface, that it gets fairly hot, even with the fan. Not that these issues are truly that big of a deal, but it again goes to show that the Surface is not truly a tablet, but rather a laptop with a detachable keyboard. A lot of tech companies fail to appreciate the fine little details that can make or break the success of a device. The sum total of all the fine little touches (or lack thereof) in a technology device can be the difference between people enjoying a device and begrudgingly tolerating it.

Finally, if all that weren't enough, Windows still brings along the baggage that is malware. Conveniently, Microsoft doesn't mention that the Surface is still as susceptible to malware as any other Windows-based computer. All the detachable keyboards in the world can't stop a Surface from getting infected with a virus that brings it to its knees. Its stylus can't remove malware either. In my experience, malware is one of the most frustrating experiences a computer user can have. It is also one of the main motivations for people leaving the Windows platform over the last several years. Even if we ignore every other issue I wrote about in this article, I can't imagine that people would happily leave the virtually malware-free comfort of an Apple MacBook or iPad to go back to dealing with malware. This issue alone should be a deal-breaker. Until Microsoft can adequately address the issue of malware, those who use Windows-based devices are setting themselves up for a world of hurt.

In all honesty, I think the Surface Pro is actually a very good device for some very particular uses. I have in fact recommended the Surface Pro to clients as a best-of-breed device when they required a Windows-based laptop/tablet hybrid with a stylus. The main use case is doctors who want to use a stylus when working with Windows-only practice management software. It is a little ironic that the price caused a few of my clients to decline my recommendation, but I digress. I believe Microsoft should stop attempting to sell the Surface as a replacement for the MacBook or even the iPad, because it just makes them look desperate. Plus, if any unfortunate consumers actually believe Microsoft's commercials, customer dissatisfaction will probably do more to hurt Microsoft in the long run.

Technology and Economic Change

A friend on Facebook posted the following:

I'd like to share some thoughts on ideas from an article I read recently (and unfortunately can't locate at the moment).

The gist of the article is that the Technology (or Internet) Revolution of the last few decades has widened the gap between the rich and non-rich much as the Industrial Revolution of the 19th Century did. Not as the result of governmental policies, but due to fundamental changes in how our economic system functions. For example:

* Technology has resulted in the concentration of wealth and power.

* Technology has eliminated many low-skill and/or repetitive jobs that traditionally served as a way out of poverty and into the middle class.

* Technology has opened up competition for many higher-skill jobs (design, coding, journalism, automation, etc.) to the global market, resulting in wage stagnation for middle-class workers.

What I took away from the article is that – just as the abuses of the Industrial Revolution led to many labor, industrial and economic reforms – the current revolution will require systemic changes to our economic framework. And, more importantly, traditional Liberal and Conservative ideologies, which are based upon the experience of the 20th Century, really aren't equipped to provide those solutions.

Any thoughts?

Any thoughts? Of course I have thoughts on this topic! But a Facebook comment simply won’t do my thoughts justice, so I told him I’d write a blog post.

The points presented obviously slant towards technology being a negatively disruptive force in the economy. My friend's takeaway was that because of these negative disruptions, systemic changes will be required in our economy. I must assume he meant governmental regulations, because he then talks about liberal and conservative ideologies.

I must say that when presented in this context, the technology and Internet revolution sure does sound like a scourge on humanity! Who wouldn’t be for controlling this abomination? But of course, this is only one side of the story. Before we can discuss this topic, we should take a more balanced look.

Has technology concentrated wealth and power? Undoubtedly people have become extraordinarily wealthy from their Internet businesses. But technology has also created incredible opportunities for everyday entrepreneurism that could never have been imagined before. When a company that didn't exist before 1995, like Amazon.com, can become the world's largest bookseller and topple old giants like Borders, technology has clearly created a more even playing field. And the fact that the Internet has basically destroyed the recording industry's stranglehold on music must be mentioned in the same breath as any claims of power concentration. Certainly the Internet has caused at least as much de-concentration of power as any perceived concentration.

Besides these big examples, the Internet has created entirely new career categories such as web developers, social media marketers, mobile app developers, and YouTube celebrities. So if it is true that the Internet has eliminated low-skill and/or repetitive jobs, then we must also acknowledge the new jobs it has created. We must also accept the fact that the Internet has made it easier for entrepreneurs to find customers and therefore made it easier for non-technical businesspeople – from housekeepers to accountants – to compete against big companies.

It's no secret that competition is good for consumers. When the Internet makes it easier for small businesses to compete, consumers win. Not only does competition help keep costs down, but it also increases innovation and gives consumers more choices. Admittedly, with competition keeping costs down, wages may be affected. But while some wages may stagnate, others may increase for those who take the bull by the horns and adapt to the changes the Internet has brought forth.

So with the perspective of a more balanced viewpoint, do we really need government to "reform" our economy? Technology has transformed not only the economy but almost all aspects of our society. The fact that this conversation started on social media and that I can publish my response on a blog that anyone in the world can read is proof enough of that. What we must realize is that regulation has side effects. If we want to stifle the supposed negative ramifications of technological change, we must accept that we will also dampen the positive effects that technology can bring to the economy and society.

I do agree with one of my friend's points: that traditional political ideologies aren't suited to provide solutions for a rapidly changing economy, especially when most politicians don't even understand that which they would attempt to regulate. I would go so far as to say that any political ideology that insists on controlling the economy is not suited to The New World of Technology.

Instead of the violent, forceful change that government brings, I trust in the natural, holistic evolution that free people bring about on their own as they adapt to change. Systemic changes have already happened and will continue to happen as the market transforms. Not through any action of government, but rather through the behavior of free people in a market that has been relatively free of regulation. The technology industry is one of the closest examples of a functioning free market that we have seen in recent history. Luckily, by the nature of its rapid advance, there has been little opportunity for governments to suppress it, at least in this country. That has been a good thing, because it was the technology industry that fueled our economy's growth in the 1990's and has almost single-handedly kept it afloat through the rough times of the 2000's. On top of that, it fundamentally changed the way we live and communicate. Certainly we must be appreciative of the positive advances that this market, free of government interference, has brought forth.

We must understand that we are only at the beginning of the technological advances coming. If we attempt to apply control and regulations now, based on our rudimentary understanding of technology as it exists today, we potentially strangle the benefits of technology that hasn’t even been invented yet. We threaten to stifle the advancement of modern technology much in the same way that was done in the early part of the 20th century with the heavy-handed government regulation of radio and telephone communication. This resulted in the government-sanctioned monopoly of AT&T and the concentration of power in politically-favored media companies. Ironically, this concentration of power has only started to erode because of the Internet. What a shame it would be to strangle the freedom the Internet has brought us because we are frightened of change. What a shame it would be if our fear returned us to the pre-Internet world where our information was tightly controlled by a select few from government and big corporate interests.

Bottom line, free people adapt to change. The relatively free market that technology has developed in has brought us amazing advances in our society and economic growth. Why would we want to endanger it? Attempting to control the economy amounts to playing god and not doing a very good job of it. No politician is smarter than all of us. Offering them control of the new economy is a foolhardy endeavor, certainly destined to be a so-called “cure” worse than any perceived “disease”.

Apple, Samsung, the Evolution of Smartphones, and Real Innovation

Now that the iPhone 6 models have been released, the buzz among consumers and competitors alike is in full swing. Both Apple fans and detractors are lighting up the Internet with their opinions on larger screen sizes and claims about which companies are the most innovative. For me, it has been interesting to observe the evolution of the smartphone and the corresponding market. I feel that we are now at the launch point for the next stage of advancement in mobile technology, so it is a good time to reflect and look forward.

Let’s be clear that Apple created the smartphone market as we know it today. Sure there were “smartphones” before the iPhone, but that is like comparing DOS-based computers to the Macintosh and all graphical computers after. They simply weren’t in the same league. For all the talk about Apple not innovating lately, I think a lot of people are missing the big picture. The iPhone was a huge leap forward from the “primordial” smartphones, just as the Macintosh was a huge leap from early text-based personal computers. Once a market segment is created, competition begins to fill in. But competing in an established category is quite different from creating a market and continually moving it forward.

Considering how many people felt themselves "technology illiterate" in the PC era, the fact that millions upon millions of people now own smartphones is quite a feat. But obviously this didn't happen overnight. It's been seven years since Apple introduced the original iPhone and six years since they rolled out the App Store. Every year Apple brings out new advances, but while other companies want to ride the bleeding edge, Apple has been very different in their approach. It's almost as if Apple knows that too much, too quickly has the potential to turn off consumers. They seem to understand that mainstream society needs to get used to technology advancements before building further upon them. Steve Jobs once used the analogy of people and technology being on a "first date" in the 80's, and I think the analogy somewhat extends to now. Move too fast and someone will get dumped. Had Apple gone too quickly and made the iPhone too complex or less reliable, it may have stymied the adoption of the iPhone and smartphones in general. However, as Apple was slowly bringing us along, being careful not to move too fast or introduce bleeding-edge technology that could have negatively impacted user experience, a certain set of users wanted to move faster.

In the early days of Android, phone manufacturers tried just about everything and anything they could to compete with the iPhone and differentiate themselves from other Android phones. I likened it to throwing everything against the wall to see what would stick. For all the gimmicky things that Android phone manufacturers tried, the one thing that actually seemed to stick was bigger screen size. Ironically, bigger screen sizes may not have been an intentional development, but rather one born of necessity, as the phones that initially supported LTE needed bigger batteries to handle the increased drain of the early LTE chipsets. Bigger batteries required bigger phones and correspondingly bigger screen sizes. Samsung then took the larger screen size concept to the next level with the Galaxy Note in 2012. It isn't hard to understand why certain people like a bigger screen. Techie-types seem to like anything with bigger specs, and those with bad eyesight think that bigger screens mean easier reading. Others simply like to have more screen room to work with. Regardless, Samsung's bigger screens ended up being virtually the only thing that average users could identify with on non-Apple phones. Somewhat paradoxically, while competitors claimed bigger screens on phones were better, they simultaneously pushed smaller screens on tablets! But that is a discussion for another time.

Where Apple was being careful not to push technology too fast, they may have been just a little too cautious. The mainstream that was "technology illiterate" became savvy quicker than Apple expected. Ironically, it appears that the ease of use Apple was so careful to protect empowered users to not be so fearful of technology, emboldening them to explore products from other companies. Samsung's bigger screens were a simple draw for those who wanted to push the technology envelope. The simple fact that Apple has now introduced larger screen sizes shows that there must have been significant customer demand for them. From my own personal experience as a technology consultant, I can say that I was asked many questions about larger screen sizes in the last couple of years. Several people indicated that they were contemplating leaving the iPhone to get a bigger screen. Congratulations, Samsung! Even a blind squirrel finds a nut sometimes!

Now don't get me wrong. I think competition is a great thing. It keeps everyone on their toes, constantly working to improve their products or services. Ultimately consumers win from varied choices and lower costs. But I think one can tell a lot about the company behind the products from their advertisements. Why does it seem that the large majority of Samsung's ads are trying to poke fun at the iPhone? Samsung perhaps doesn't realize (or perhaps they do) that they are also making fun of people who use iPhones. This isn't a good way to win friends and influence people. If your products are so great, Samsung, why do you need to build them up by trying to tear down others? And what does it say about the people who are influenced by this type of conceited, self-congratulatory commercial? It's a throwback to the Old World of Technology, where many technology professionals gave off an aura of smug superiority. I think most people have no desire to deal with egotistical technology professionals any longer.

Unfortunately for Samsung, the jig may be up. For all the claims that having a bigger screen was innovative, it was something that was very simple to copy. Now that Apple has introduced bigger screen sizes, Samsung no longer has an easy claim to fame. The fact that sales of the iPhone 6 have been record-setting seems to indicate that while people did in fact want smartphones with big screens, what they truly wanted was an iPhone with a big screen. For all of Samsung's hype, their "Next Big Thing" usually was simply their next "big" thing. Now that everyone's big, what will Samsung do? When the "we had big screens first!" marketing campaign fizzles out, what gimmicky tech features will they resort to next?

Hopefully Samsung enjoyed their time in the sun because it seems that Apple is back with a vengeance. Their current path of innovation, save the Apple Watch, may seem subtle at this point, but it all stands to fundamentally reshape not only the mobile device market, but the entire technology landscape as well. Besides making incredible technology products, what Apple does at its core is bring technologies into the mainstream. By making technology easy to use and accessible, Apple makes technology more powerful than any hardware specification alone can. Let’s look forward a little bit:

  • HomeKit will become the standard for unified home automation across disparate devices. The “Jetsons” home will finally come closer to reality.
  • HealthKit will become the standard for organizing personal health information from various sensors and data input. Soon we will hear stories about how Apple technology is literally saving people’s lives.
  • Apple Pay will move forward the payment transaction industry that has been stuck in a quagmire. Where other tech companies have tried to bring mobile payments to the mainstream, only Apple has the customer base and industry influence to actually pull it off.
  • All these technologies will also be tied into Apple Watch, which already seems like it will be one of the hottest tech items of the coming years. I will write more about Apple Watch in a future article.

The problem for Samsung, or any other competing manufacturer, is that unlike a simple large screen, none of these technologies that Apple is bringing forward are easy to copy. This is because Apple isn't just bringing raw technology advances to the table. Apple is doing the very hard work of making the technology easy to use and accessible. This requires a lot of development work as well as significant investments in creating industry relationships. Companies that are primarily manufacturers do not have the nearly 40 years of R&D and ingrained culture of innovation that Apple has. It's quite a different thing to bring forth an entire ecosystem of amazing user experience than it is to throw some tech specs at a board and slap together cheap electronic devices. To Samsung, smartphones are just another TV or microwave that they churn out en masse. But to Apple, it's personal. They aren't just a manufacturing company. They really do care about making "insanely great" devices. This is the legacy of Steve Jobs. The payoff for Apple is that they are clearly the most valuable company in the technology industry, even if we don't look at the numbers. Apple's clout among companies and consumers in the economy at large is priceless, and the fact that the entire world waits with bated breath to see what Apple does next is proof enough.

Those who think they can compete with Apple once it has established mainstream success with a technology ecosystem should learn from recent history. Many tried to knock Apple off the iPod/iTunes pedestal. All failed miserably. Corporate juggernauts like Microsoft and Sony seemed feeble when attempting to replicate the success that Apple had. Again, this was because Apple wasn't simply making digital music players. Anyone could and did make those. Apple knew that they had to focus on the entire user experience and make it brain-dead simple for users to not only play songs on their devices, but also purchase and organize their music as well. Getting music onto early MP3 players was a chore for all but the most techie among us. It seems obvious now, but if you couldn't get music onto your device, the device itself was pretty useless, no matter how great its tech specs were. The same will hold true for HomeKit, HealthKit, and Apple Pay. While its competitors were all busy trying to make phones with the biggest, baddest tech specs, Apple was quietly leapfrogging them in technology that consumers will truly care about. By the time other companies figure this out, assuming they ever do, they will likely be too late. If Apple is successful in creating new technology ecosystems around these innovations, it will be extraordinarily hard for anyone to compete. This will be especially true for a simple manufacturing company like Samsung, which doesn't even make the operating system that runs on its phones.

Apple’s marketing slogan for the iPhone 6 is “Bigger than Bigger”. Subtle, but absolutely on point. While it seems that size may in fact matter, at the same time it really doesn’t matter. There are bigger things than big screens and we are about to see this come to fruition. It will certainly be interesting to see what Apple’s competition does next.

The Apple-IBM Deal: No, Hell Didn’t Freeze Over

Steve Jobs Flips off IBM

If you have even a rudimentary knowledge of the history of the personal computer, you know that during the 1980's Apple and IBM were considered mortal enemies. In the fight for dominance of the personal computer market, it was the upstart Apple that created the personal computer revolution versus the old guard IBM, the 800-pound gorilla of the technology industry. To this day there are still perceptions of Apple and IBM as distinct opposites in the technology world, even though a lot of time has passed since the PC wars of the 80's and things have changed quite a bit. So when Apple and IBM announced a strategic partnership, it wasn't surprising that many people were somewhat confused. How in the world could these two companies form a strategic partnership?

If you know the history of technology as well as I do, the announcement wasn’t actually all that surprising. While at first IBM was in fact Apple’s antagonist, Microsoft actually became the common enemy of both companies. Let me give you a brief background.

Yes, it was IBM that came out with the "IBM PC" that ran a DOS operating system made by Microsoft. But due to shortsightedness on IBM's part, along with some strategic maneuvers on Microsoft's part, Microsoft became the big winner in the personal computer market, crushing every personal computer maker whose machines didn't run MS-DOS (and later Windows). Apple was virtually the only such personal computer company to survive, albeit just barely. At the same time, the PC revolution crushed nearly every old-school technology company that was prominent during the mainframe era of the 60's and 70's. Even IBM itself was nearly put out of business by the onslaught of IBM-compatible PC clones running Microsoft operating systems.

What saved Apple was the return of Steve Jobs and the subsequent expansion of Apple’s technology offerings into mobile devices such as the iPod, followed later by the iPhone and iPad, along with the revenues of the iTunes and App Stores. What saved IBM was their refocusing on their corporate services offerings back in the 1990’s. In fact, IBM was the first big name to get out of the PC business in 2004, when they sold their PC division to Lenovo. Only after Apple ushered in The New World of Technology with the iPhone and iPad, drastically changing the technology market, did other names such as Dell and HP begin to seriously target the corporate services market that IBM had long dominated. Both Apple and IBM realized that the PC market was beyond direct competition with Microsoft, but there were bigger things in store. Apple focused on the consumer market and IBM focused on enterprise services.

Fast-forward to present day and the deal really makes perfect sense. Apple is the dominant force in the consumer and small business market due to the iPhone and iPad. IBM commands a lot of influence in the corporate world. IBM wants to grow with the mobile device revolution and perhaps due to the lessons learned in the 1980’s, knows that there is plenty of money to be made in offering services instead of trying to create their own devices. Apple would love to get more enterprise business and knows that partnering with such a well-respected name like IBM is probably the quickest way to achieve growth.

So it really is just a very simple strategic alliance between two companies with a lot to gain between them. Yes, it seems a little funny at first, but the reality is that both companies are very mature and powerful and stand to get more powerful together. What was your first reaction when hearing this announcement?

The Facebook Messenger App is NOT the Devil!

The Facebook Messenger App – Could it be SATAN!? No, just some sensationalist claims gone viral.

Unless you’ve been living under a rock, you are well aware of all the dire warnings about Facebook’s “new” Messenger app floating around the Internet. At first I wasn’t going to write anything about it, but it seems that the story continues to get bigger. So I feel it necessary to discuss the warnings and how it all got started.

An article written by Nick Russo for a Houston radio station claimed that the Facebook Messenger app would have permissions to do all sorts of privacy-invading things if you installed it. For some reason, the article went viral. Well, it probably went viral for the same reason people forward chain letters about virus hoaxes: it had just enough sensationalism mixed in with an authoritative tone to seem credible. The name of Nick's radio station is "The Bull," and perhaps that should have been an indication to readers that his article was, for the most part, BS.

I'm not sure why this radio personality felt it necessary to pretend to be a technology expert. The very first time I read the article, I knew there was something just not right. I tried to research his claims for some friends who were asking, and for the life of me I couldn't find anything indicating that this guy had any professional experience besides working in radio. There's nothing wrong with working in radio, but if you're going to use your platform to disseminate information, please be sure you know what you're talking about! As far as I've seen, Nick has not yet written an apology for his fear-mongering article, but rather has shifted into portraying himself as some sort of privacy advocate. Once again, I'm all for privacy advocates, but if you're going to advocate – know of what you speak beyond just a cursory scratching of the surface.

Nick Russo made a lot of outlandish claims regarding what the Facebook Messenger app could do. The first problem with his claims was that he didn't make any distinction between smartphone platforms. I knew right away when reading his article that there was no way Apple would allow an app like that into their App Store. Certainly it might be possible on an Android-based phone, however unlikely, but Apple puts every single app submitted to their store through an approval process. Every. Single. App. Yeah, there's no chance that Apple would allow Facebook Messenger, or any other app, to do the following, as claimed by Nick Russo (after the list, I'll sketch how app permissions can actually be inspected):

  • change or alter your connection to the Internet or cell service … for its own reasons without telling you.
  • send text messages to your contacts on your behalf … when they want
  • see through your lens on your phone whenever they want .. listen to what you’re saying via your microphone if they choose to
  • read your phone’s call log, including info about incoming and outgoing calls … Facebook will know all of this
  • read e-mails you’ve sent and take information from them to use for their own gain.
  • read personal profile information stored on your device … addresses, personal info, pictures or anything else
  • Facebook will now have a tally of all the apps you use, how often you use them and what information you keep or exchange on those apps.

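Incidentally, none of this is hidden or mysterious. On Android, every permission an app requests is declared up front in its AndroidManifest.xml, and anyone can inspect it. Here's a minimal sketch in Python, assuming you've already decoded an app's manifest to plain XML (for example with a tool like apktool); the file path is hypothetical:

```python
# Minimal sketch: list the permissions an Android app declares.
# Assumes the app's AndroidManifest.xml has been decoded to plain XML
# (APKs store it in a binary format); the path below is hypothetical.
import xml.etree.ElementTree as ET

ANDROID_NS = "{http://schemas.android.com/apk/res/android}"

def declared_permissions(manifest_path):
    root = ET.parse(manifest_path).getroot()
    return sorted(
        elem.get(ANDROID_NS + "name", "")
        for elem in root.iter("uses-permission")
    )

for perm in declared_permissions("messenger/AndroidManifest.xml"):
    print(perm)  # e.g. android.permission.CAMERA, android.permission.READ_CONTACTS
```

Comparing the output for Messenger against the regular Facebook app's manifest makes the point Nick himself later conceded: the permission lists largely overlap.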
It’s not like Apple iPhones are some off-the-wall brand that can be safely overlooked when discussing smartphones. They are just a *little* popular, to put it lightly. So to write an article like this with such extreme claims and not know about Apple’s approval process is simply irresponsible. But even if we were to ignore iPhones for the moment, does anyone really think that Facebook would want to do most of what is claimed above to their users? Perhaps Mr. Russo should have put in a call to someone at Facebook to ask a few questions first? Or at least do a tiny little bit of research on this thing called the Internet before publishing an article like this? I bet even the resident PC guy at “The Bull” probably could have warned Nick that his claims were pretty far out and to be careful before publishing his article. But alas, Mr. Russo took a little sliver of knowledge and believed he knew more than he did – running off like “The Bull” in a china shop and starting a viral tidal wave in the process.

To be fair, in theory – extreme theory – what Nick Russo claims above could possibly be accomplished by highly malicious apps running on some smartphone platforms. But Facebook Messenger isn't a malicious app. And Nick must have found that out, because in his next article he states, "I've now learned that both the New Facebook Messenger App and the original Facebook app have many of the SAME permissions." Yes, I'm sure he did learn a few things once his article went viral! But perhaps those things should have been learned BEFORE publishing! As it turns out, the Facebook Messenger app (which isn't new, but has been out for years) does virtually nothing different from any other similar app, including the normal Facebook app that over a billion people already use. Oops!

Apparently once he found that out, Nick chose to portray himself as a privacy advocate, championing the idea that he made people more aware of the privacy choices on their phones. Fair enough, but let's call a spade a spade. If he really cared about people's privacy choices, he would have done some research and consulted with technology experts so that he could have written a balanced article. Any good that he has done has been completely obscured by the hysteria he created. Advocacy by accident at best. Fear-mongering at worst.

Bottom line, there are many articles that debunk Nick’s claims. Here is another article discussing some of Nick’s claims as “myths”. Facebook even posted an article discussing the privacy concerns. So the moral of the story is that we can’t believe everything we read – especially when it comes to technology topics. While we may not like the fact that Facebook is making everyone use a separate app for Messenger, spreading misinformation isn’t helping anybody.

IRS: Bad Sectors or Bad Intentions?

In my previous article, I mentioned that all the attention paid to the details of Lois Lerner's hard drive crash was just a red herring. I believe the IRS's attempt to use the hard drive crash excuse is simply the least important link in the chain of a comprehensive "innocence by incompetence" campaign. However, as weak an excuse as it is for the IRS, the technical details of Lois Lerner's hard drive could actually end up being a smoking gun if we investigate far enough.

As I mentioned in my last article, I've seen more than my fair share of hard drive failures in the course of my 20-year professional career. I am very familiar with the functioning of hard drives, both in their mechanical and digital operations. So allow me to offer a quick primer on hard drive failure.

Any hard drive failure is commonly referred to as a hard drive "crash". Technically the term "crash" has a very specific meaning in reference to hard drives – a head crash, if you care to know – but non-technical people may use the term to refer to any number of hard drive problems. Most people will say a hard drive crashed if the failure prevents data from being read from the drive by normal methods and/or the computer will no longer boot from that drive. However, there are many less serious problems that could seem like a hard drive "crash" to non-technical users, and restoring the drive to normal operation in those cases would not be difficult for a technology professional. More serious failures involve problems with the physical drive mechanism and may require that the drive be sent to a data recovery specialist or forensic lab with highly sophisticated equipment to retrieve data.
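To make "bad sectors" concrete: a drive in this failure mode returns read errors on some sectors while the rest remain perfectly readable. Below is a rough, read-only sketch of how a technician might survey a drive for unreadable sectors. The device path and sector size are assumptions, and this is an illustration only; real recovery work uses dedicated tools, and every minute a dying drive spins can make things worse:

```python
# Rough sketch: count unreadable ("bad") sectors on a raw disk device.
# Read-only illustration, not a recovery tool; requires root on Linux.
# DEVICE and SECTOR are assumptions for the sake of the example.
import os

DEVICE = "/dev/sdb"      # hypothetical path to the suspect drive
SECTOR = 512             # classic sector size; many newer drives use 4096
CHUNK = SECTOR * 2048    # scan 1 MiB at a time for speed

bad = total = 0
fd = os.open(DEVICE, os.O_RDONLY)
try:
    offset = 0
    while True:
        try:
            data = os.pread(fd, CHUNK, offset)
        except OSError:
            # Unreadable region; a real tool would retry sector-by-sector.
            bad += CHUNK // SECTOR
            total += CHUNK // SECTOR
            offset += CHUNK
            continue
        if not data:
            break  # reached the end of the device
        total += len(data) // SECTOR
        offset += len(data)
finally:
    os.close(fd)

print(f"~{bad} unreadable sectors out of ~{total} scanned")
```

The point of the exercise: bad sectors are normally localized. A scan like this coming back with nothing readable at all would be extraordinary, which is exactly the issue with the IRS's story below.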

In my experience, hard drive failures are unfortunately far too common. I have no problem believing that a hard drive crash could have befallen the computer used by Lois Lerner. Sure the timing seems questionable, but from a purely technical standpoint, this isn’t the smoking gun by itself. We must begin by questioning the specifics of the actual drive failure. Was it an actual head crash or a less serious glitch? Unfortunately the IRS has reported that the hard drive in question has already been recycled so no further attempts at recovery can be made, nor can the true cause of the hard drive failure be verified. All we know is that an e-mail from an IT manager to Lois Lerner in August 2011 said, “The sectors on the hard drive were bad which made your data unrecoverable.”

At this point I don't believe that either the technical diagnosis of the hard drive failure or the details of the recovery efforts made by the IRS IT department have been made public. These details need to be uncovered, because if we know the diagnosis of the hard drive failure, we can begin to understand whether the drive and the data on it were intentionally destroyed. Alternately, we can also begin to deduce whether the IRS IT department was using proper procedures during the recovery attempt and whether there was any intentional wrongdoing within the IT department – or simply further incompetence.

According to the e-mail trail provided after Lerner’s hard drive failed, the data on the drive was deemed so important that the IRS IT department even went to the unusual lengths of sending the hard drive to their criminal investigation division’s forensic lab so they could attempt data recovery. A forensic lab often is able to piece together some data from a failed drive even if all the data on a drive is not recoverable. For a forensic lab to not be able to recover any data at all is highly unusual. This indicates that either the drive was wiped clean using advanced data deletion technology or it suffered extreme damage. Extreme damage is a rare occurrence for normal hard drive failures. Because the drive was sent to a criminal forensic lab, in theory the forensic specialists should have been able to tell if the hard drive was intentionally damaged or if there was anything unusual about the condition of the drive.

Lerner's hard drive was reported as crashed on June 13, 2011. It wasn't sent to the forensic lab until August 5, 2011. That's a long time. Talk to any technology professional who is competent in data recovery and they will tell you that the longer a failed drive keeps running during recovery attempts, the more damage can be done. What was the IRS IT department doing that it took nearly two months before they determined the drive was so bad it needed a data recovery lab? Especially if it was later determined that the "sectors on the hard drive were bad" – that type of failure should have been fairly obvious early on. Regardless, "bad sectors" do not take out all the data on a hard drive unless virtually the entire drive was damaged. In theory, the two months of effort by the IRS IT department could have damaged the drive beyond the point of data recovery even by a forensic lab. That would be a fairly inexcusable case of incompetence – or an intentional effort to scrub data from the drive. Either way, it's not a show of good faith on the IRS's part.

Bottom line, there appears to be a chain of IT employees at the IRS who had access to the hard drive at one point or another, as well as "HP experts" and forensic specialists at the IRS's criminal investigation division. If there is in fact a cover-up, the weak link in the chain may very well be any one of these IT people. Assuming they are not as politically motivated as IRS officials, it may be possible to get expert testimony from one or more of the IT people who worked with and examined Lois Lerner's hard drive concluding that the drive had been tampered with or intentionally damaged. At the very least, we could find out why no data could be retrieved at all.

My hope is that if Lois Lerner’s hard drive is a smoking gun, one of the IT people involved will be brave enough to testify to this. The world could use another Edward Snowden right about now.

IRS: Innocence by Incompetency

Ever since the news broke a little over a week ago that the IRS lost e-mails connected to Lois Lerner because of a computer hard drive crash, I've been wanting to write an article addressing the technical aspects of this situation. However, the story kept growing as each day went by, so I waited. As I sit down to begin this article late in the evening of June 23rd, I've just spent almost 4 hours watching the latest hearing live on CSPAN-2. Yes, you can't get much geekier than spending an evening watching a government hearing discussing hard drives, backup tapes, and IT department policy. But I am who I am, and the intersection of technology and politics is my wheelhouse. There probably hasn't been a more prominent political story revolving around technology issues. Because of the size and scope of the various technical issues involved, this article will be the first in a likely series tackling each major point in this long chain of events.

I almost feel that I don't need to write these articles, because it seems even technology laypeople instinctively know there is something highly suspicious about this situation. In this day and age of advanced technology, a simple hard drive crash doesn't seem like a justified excuse for losing an important trail of digital communication. This is especially true for a government bureaucracy that purportedly symbolizes accurate record keeping. However, I still think a thorough review of the technology and management issues is worth undertaking.

As a technology professional, I’ve seen more than my fair share of hard drive failures. In my experience, hard drive failures are far too common, so I have no problem believing that a hard drive crash could have befallen the computer used by Lois Lerner. However, looking at the big picture, the hard drive failure really shouldn’t be relevant. All the hubbub about a hard drive crash is truly a red herring. Nonetheless, I will address the hard drive issue in my next article.

Any organization with halfway competent IT management that is required to preserve e-mails will have an e-mail archiving system in place. They will not defer responsibility for preserving required communications to individual employees – unless, of course, the negligence is intentional. One very important idea behind archiving is that e-mail communication may be used for criminal investigations, among other things, and it should be obvious that employees may decide not to preserve e-mails that incriminate them. The reality is that it is much easier to archive messages as they pass through a central server than it is to attempt to store and retrieve them from individual computers. And of course, automated centralized archiving eliminates the possibility of employees "losing" e-mails to cover their asses. To not archive messages at the server level is literally "so 1990's".
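To illustrate how simple the core idea is, here's a minimal sketch of server-side journaling: every message passing through the mail server is written to an append-only archive, timestamped and content-hashed so tampering is detectable. The paths and names are hypothetical, and a real deployment would of course use a dedicated archiving product or Exchange's built-in journaling rather than homegrown code:

```python
# Minimal sketch of server-side e-mail journaling: archive every message
# as it passes through the mail server, before any employee can touch it.
# Paths are hypothetical; real systems use dedicated archiving products.
import hashlib
import time
from pathlib import Path

ARCHIVE_DIR = Path("/var/mail-archive")  # hypothetical write-once store

def archive_message(raw_message: bytes) -> Path:
    """Store one raw message; the filename encodes arrival time + digest."""
    digest = hashlib.sha256(raw_message).hexdigest()
    stamp = time.strftime("%Y%m%dT%H%M%SZ", time.gmtime())
    path = ARCHIVE_DIR / f"{stamp}-{digest[:16]}.eml"
    ARCHIVE_DIR.mkdir(parents=True, exist_ok=True)
    path.write_bytes(raw_message)
    return path

# The mail server (e.g. an SMTP hook or Exchange journal rule) would call
# archive_message() on every message before final delivery, so a workstation
# hard drive crash could never take the only copy with it.
```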

There are many archiving products and services available that work with common e-mail servers. It is well known that the IRS uses the Microsoft Exchange platform, easily the most popular e-mail system for large enterprises, so the IRS would have had its pick of any number of e-mail archiving systems. In fact, the IRS did have a contract with a company called Sonasoft that specializes in e-mail archiving (their tagline is "Email Archiving Done Right"). This contract was terminated at the end of fiscal year 2011 (August 31st), which seems highly unusual given the timing of the Lois Lerner hard drive failure and the supposedly lost e-mails in June of 2011. It was testified that the IRS only contracted with Sonasoft to archive the e-mails of the Chief Counsel's office within the IRS, which covered just 3,000 employees, not all 90,000. Selecting such a small subset of employees also seems highly unusual. If archiving the e-mails of 3,000 IRS employees is deemed important, why not all 90,000? At the very least, shouldn't the heads of major divisions within the IRS, such as Lois Lerner, have had automated e-mail archiving as well? If nothing else, just from a productivity standpoint, the loss of e-mails for key personnel would be highly detrimental, and an e-mail archiving system would be well worth the cost for the protection it provides, not to mention compliance with federal regulations in case of wrongdoing.

From a technical standpoint, the cost to archive the e-mails of all employees would not have been significantly greater. As with many technology systems, the greatest costs are in the initial implementation and baseline infrastructure, not in the scaling of said systems. While archiving the volume of e-mail generated by 90,000 people would require a large amount of storage, it is not an impossible task. There are many companies in the United States that have hundreds of thousands of employees and are required to archive e-mails in order to comply with federal regulations. For that matter, we know the NSA has enormous data centers that are more than capable of monitoring and archiving communications on a hyper-massive scale. Certainly it wouldn't be beyond the capability of another gigantic federal agency such as the IRS to properly manage its own required records. The agency charged with enforcing the president's signature legislation shouldn't have a problem archiving the e-mail of a piddly 90,000 accounts, should it? It is in charge of maintaining records on hundreds of millions of American citizens, after all.

But that's exactly what the current IRS chief wants you to believe: that the IRS's technology infrastructure, plus the policies and procedures that manage it, are so woefully antiquated and out-of-date that they just couldn't prioritize the archiving of e-mail messages. This is true even though e-mail messages are considered official records by the IRS's own handbook and they are required to preserve them. The excuse has been given that properly archiving all the e-mails of the agency would cost between $10 and $30 million and they just didn't have the funding. I would love to know how this figure was arrived at, because it seems extraordinarily high given that they were already contracting with a company that was doing e-mail archiving, and scaling that up shouldn't have approached anywhere near these costs. Even several hundred terabytes of storage didn't cost anywhere near $10 million in 2011.
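As a sanity check on that $10–$30 million figure, here's a deliberately generous back-of-the-envelope estimate. Every input below is my own assumption for illustration, not an IRS number:

```python
# Back-of-the-envelope: raw storage needed to archive 90,000 mailboxes.
# Every input here is an assumption for illustration, not an IRS figure.
employees = 90_000
gb_per_user_per_year = 5     # generous for 2011-era e-mail volume
years_retained = 7
usd_per_tb = 500             # rough 2011 enterprise disk pricing
redundancy = 3               # keep three full copies for safety

tb_retained = employees * gb_per_user_per_year * years_retained / 1000
raw_cost = tb_retained * usd_per_tb * redundancy
print(f"~{tb_retained:,.0f} TB retained, raw storage ~${raw_cost:,.0f}")
# => ~3,150 TB and roughly $4.7 million in disks. Software licensing and
# labor add real money, but it is hard to see how scaling an existing
# archiving contract balloons to $30 million.
```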

What the IRS wants the American public to accept is that they can't be proven guilty because they were incompetent. The truth of the matter, speaking as a technology expert with 20 years of professional experience, is that given the laughable policies and procedures the IRS had in place, the chain of events as they describe it is entirely plausible. The burning question is whether these policies were in place truly due to incompetence or for convenient plausible deniability. At this point, we can only assume they are "innocent by incompetence." But this cannot be acceptable. If anything, the hypocrisy implicated here is far too much for anyone but the most ardent authoritarian to embrace. Neither the IRS nor most law enforcement agencies would accept the excuse that a technical problem resulted in the destruction of evidence. In fact, "spoliation of evidence" is a legal concept that allows courts to assume that evidence destroyed for any reason (whether claimed "accidental" or otherwise) would have been unfavorable to the party that destroyed it, and to assume the worst. Given the stringent federal regulations that require publicly held corporations to archive years' worth of e-mails, it would be a significant case of "do as I say, not as I do" statist double standards to allow the IRS to get away with this highly convenient set of circumstances.

While many apologists claim that the scandal is unfairly targeting Barack Obama, the reality is that he is the Chief Executive and the IRS is an agency of the Executive Branch. That alone should prompt any leader of integrity to take charge and demand answers. But what is especially disturbing is that Obama was elected under the auspices of "Hope and Change," and one of its key tenets was "transparent and open government." In fact, one of his first acts as president was to release presidential memoranda addressing the free flow of information from government agencies, regardless of the protection of personal interests. So for Obama to turn a blind eye to this situation is an egregious violation of his own proclamations. Do we really want a president who doesn't stand by his own promises?

While "innocence by incompetency" may keep certain IRS figures out of jail or the president from being impeached, it won't help the IRS in the long run. As I mentioned before, even technology laypeople realize there is something extraordinary about this situation. People of all political persuasions are incredulous at the arrogance and audacity shown by IRS management over having their credibility questioned. We the people cannot stand by and let this pass. If the IRS wants to prove just how incompetent it really is, then that incompetence needs to be severely punished. Instead of rewarding the agency by increasing its budget, we need to drastically reduce the power it has over the American people. The first step is to eliminate or rescind any further increases in power the IRS receives, such as those dubiously authorized by Obamacare. The ultimate step would be to abolish the IRS completely. While such a thought seemed like fantasy only a short time ago, the unprecedented nature of this situation has people seriously questioning the justified existence of such an agency and its legitimate role in a free society.

What steps do you think should be taken against the IRS for their seeming incompetency?

Windows is the Elephant in the Room

I just read an article by Ed Bott of ZDNet discussing how Microsoft's marketing for the Surface Pro 3 has backfired. Basically, the article states that since Microsoft was comparing the upcoming Surface Pro 3 to a MacBook Air and also stating that the Surface Pro 3 can replace a laptop, the tech journalists who attempted to replace their MacBook Airs with a Surface Pro 3 were less than happy with their experience. The article, however, attempts to explain why these journalists' experiences aren't representative of an average user's needs – basically, that a tech journalist's workflow is far too complex compared to that of an average user who would own a MacBook, so the readers of the tech journalists' reviews were being done an injustice because their needs are far too different from a tech journalist's. Many of the comments on the article agreed with the author, attempting to rationalize the poor reviews of the Surface Pro 3 as a laptop replacement. Finally, the author states that "Getting the tech press to step outside of an Apple-centric bubble and imagine a world where people might choose a Windows laptop over a MacBook is the biggest challenge of all."

That last statement would be utterly hilarious if it didn't completely ignore the long history of the PC era until the last few years. Until Apple broke through with the iPhone and then ended the PC era with the iPad, Apple was virtually ignored by tech journalists. Perhaps the tech press is Apple-centric for a good reason: they are embracing The New World of Technology because they finally have a real choice after 20 years of Microsoft domination.

It amazes me how many people, tech journalists and otherwise, still believe that comparing a Windows PC to a Mac is, well, an “apples-to-apples” comparison. Let’s face it, people who are using laptops in any “work” environment are more like tech journalists than not. For Microsoft to compare their hardware to Apple’s hardware is completely missing the point and they got exactly what they deserved.

It’s like judging a woman completely on her “specs”. There’s a lot more to a woman than her physical appearance, no matter what kind of perceived “performance” you may get out of it. The reality is that what’s on the inside is a lot more important because ultimately that is where most of the work actually takes place. You’re never going to make the most of that “hardware” if you can’t manage the “software” effectively. Similarly, when you compare the different platforms, a Mac is simply a more friendly, easy-to-use, and therefore productive work environment. Microsoft simply does not create the level of user experience refinements that Apple does. That’s been true since 1984. Let’s not pretend here. The disaster that is Windows 8 should be proof enough of that.

Those who are used to the Mac will definitely have a hard time switching to Windows. Hell, Windows users are having a hard time switching to Windows 8! But it seems as if some people, Ed Bott included, are simply writing it off as a “transitional” problem. It’s a lot more than that. Sure, those who are used to Windows will have a learning curve switching to a Mac, but from my experience, most people get comfortable within a week and then start to realize the advantages the Mac OS brings to them. My favorite quote from someone who switched to a Mac after years of Windows use was “This is how computing should be.” On the contrary, those who try to switch from a Mac to Windows rarely ever get used to Windows and will go back to a Mac as soon as they can.

It's not just a matter of "what you're used to". I've seen far too many examples of this play out among average, everyday people who are clients, friends, or friends of clients. When people have a chance to experience both Windows and the Mac, overwhelmingly they choose a Mac. I think we're seeing this play out in the larger market as consumers make their own purchase decisions more and more. Most PC purchases were, and to a great degree still are, made en masse by big companies. The fact that the Mac market continues to grow while the overall PC market shrinks is one sign of this.

The other part that doesn't make sense is people saying that tech journalists aren't a good comparison to an average person. That's absolutely true, but not for the reasons being bandied about by Ed Bott or certain commenters. If anyone had the ability to make a transition to Windows from a Mac, it would be a tech journalist – someone who presumably is very comfortable with technology. If a tech journalist has trouble switching to a Surface Pro 3, then what chance in hell does an average consumer have? Tech journalists usually don't give enough attention to the needs of an average user; they let their tech bias slant their view toward hardware specs rather than ease of use. Ed Bott's article is a perfect example of that. The fact that several tech journalists panned the Surface Pro 3 should be a warning heeded by people thinking of purchasing one. The reality is that either the device is too complex for an average user or it isn't robust enough for a professional. Either way, that's not a good reality for Microsoft.

As I've said, I've seen the switching scenario play out hundreds of times over the last 20+ years in both directions. I've had many clients who used a Mac at one point, then were forced to switch to a PC by their work environment. Years later, they still wished for a Mac and eventually made their own purchase decision to go back. I have *never* seen this scenario with someone wanting to go back to Windows. Usually, as the saying goes, once you go Mac you never go back. Especially since you can run Windows in Boot Camp or a virtual machine, the old compatibility argument is long gone.

What's truly ironic is that I see people switching to the Mac because they say that, compared to Windows 8, the Mac is more like "Windows". Ouch. Seriously, ouch.

Microsoft can try to compare their hardware to Apple’s hardware all day long, but they’re ignoring the elephant in the room that is Windows.

Fighting the Technology Backlash!

I'm not usually one to let things easily bother me. It takes just the right provocation to actually make me feel anything resembling anger. However, when I watched the viral video "Look Up" by Gary Turk, it pushed my buttons. Sure, much has been written about people getting lost in technology, such as the article called "Dear Mom On The iPhone" that gets passed around every so often. There is definitely a simmering backlash out there against all the new technology introduced in the last several years. But there was something about "Look Up" that was just exasperating to me. It went too far. I couldn't let it go. I knew that it was time to fight back against the technology backlash. So to quote Samuel L. Jackson from Jurassic Park, "Hold on to your butts", because things are about to get real!

Frankly, I’m sick and tired of all the whining and complaining regarding The New World of Technology that we live in today. Whether it’s someone bitching about people taking too many pictures or people looking at their phones too much or people spending too much time on Facebook or people playing too many video games, the level of antagonism towards technology in general has gone far enough. I’ve had it. Especially when a lot of this teeth-gnashing is being done on social media, being posted on YouTube, or being written about on a freaking blog!

I get it. Technology is changing things. Very rapidly. I've written about it before – that the last 6 or so years have seen technology disrupt our society unlike at any time before. Change is scary, I know. People like to complain about change. This isn't new. But usually the stereotypical scene here is a couple of old people sitting on their rockers grumbling about their ailments and how those young whippersnappers just don't have any respect. Now it seems that young is the new old, and anyone over the age of 30 is eligible for old-timer syndrome.

Now, let me show some empathy here first. Yes, there are many people out there who are in fact clinically addicted to their mobile devices, video games, or social media. They should seek out professional help. Then there are legions of people who are not truly addicted but probably spend an inordinate amount of time using technology of some sort. People in this situation should be encouraged to practice some moderation. But there's a difference between gently encouraging someone and browbeating them into submission like "Look Up" tries to do. The reality is that true encouragement will work a lot better than bullying someone into compliance. In fact, attempting to harass someone will likely cause the opposite reaction, as anyone can tell you that forbidden fruit tastes the sweetest. To those who like to complain: what does it really matter to you anyway? Focus your energies on improving yourself and quit trying to run everyone else's lives. You're only making yourself seem totally out of touch and marginalizing your own influence.

The fact that technology is so mobile now allows people to interact with it on an almost constant basis. Where in the past technology virtually chained us to our desks, we now use our technology out in public like never before. This is where I think those with curmudgeon-like tendencies get their knickers all up in a twist. It's out there in their faces now. Everywhere they turn, they see The New World of Technology. Not only in real life but all over traditional media now too. They can't get away from it. The old human instinct of being afraid of what we don't understand kicks in. I feel their pain. But frankly, those of us who are making use of new technology are tired of listening to the grumbling. If you don't like it, then don't use it, but leave the rest of us alone.

Perhaps the greatest thing about mobile technology is that it gives us access to the Internet wherever we are. There is so much we can do with the Internet, from communication to research. It is literally the world at our fingertips. Saying someone is addicted to the Internet is like saying someone is addicted to communication. It's like saying someone is addicted to learning. It's like saying someone is addicted to life.

Now as I said before, there are people who probably do spend too much time using technology. For those who think that's a big problem, just have some patience. Society as a whole behaves very similarly to how individuals behave – just on a longer, drawn-out scale. Visualize a kid who has just received a new toy that they've been wanting for a really long time. At first they seem addicted to it. They're constantly playing with it. Maybe they even go too far and play with it when they really shouldn't, or stay up way past bedtime because they don't want to stop playing. Maybe they even sneak it into places they shouldn't. But as with most things, they eventually get out of the "shiny new toy" phase and stop playing with it as much. Or as they mature in general, they learn moderation. How the parents react to shiny new toy syndrome will go a long way in determining how a child develops. If the parents come down hard, it usually only makes the child want to play with the toy more. If the parents chastise the child, the child will still want to play with the toy, but now they'll go to lengths to hide their play. Either way, the child loses some respect for the parents, feeling the parents just don't understand and aren't making an effort to do so. However, if the parent takes the time to understand why a child loves a toy so dearly and patiently teaches the child that sometimes there can be too much of a good thing, the child will probably be more likely to learn and practice moderation going forward.

I believe society is in the shiny new toy phase with The New World of Technology. It has moved so fast that we are still figuring out the new rules of the game. Or more precisely, we are making them up as we go along. To make things harder, technology continues to change right out from under us. Just when it seems we have a handle on things, new services like Pinterest or Instagram, added to the constant stream of new devices being introduced, totally change the nature of the beast. Everyone, especially those complaining the loudest, had better buckle up, because as a technology professional I guarantee you we're just getting started.

Instead of complaining, I would encourage you to look at the bright side of all this new technology.

Playing games stimulates imagination and provides a motivation for learning. Video games truly are no different from traditional games in this regard, except that the evolving nature of video games has opened up exciting new opportunities. In many ways, I believe video games are a new form of classic storytelling. Where older generations revered their books and movies, younger generations hold the same adoration for the adventures they find in their video games. People today don't need to just read about exciting worlds; they can virtually be part of them. I know it can be hard for older generations to grasp this concept. Video games seem so childish and such a waste of time to many. However, chastising the playing of video games only serves to prove just how out of touch you are.

Spending time on social media is a true and valid form of communication. Just because it is a new way to communicate doesn't make it any less legitimate. In a world where a lack of communication is endemic, discouraging communication is highly counterproductive. Sharing our experiences through photos is another form of communication. Just because you'd rather "take it all in," don't try to diminish how others prefer to capture their memories. Again, you only serve to make yourself irrelevant to them.

At one point in our history, we could only communicate with others who were in our immediate vicinity. Then the telephone changed all that. We could literally get in touch with anyone across town or around the world. I'm sure many people back when the telephone was being introduced thought that phone calls were not a natural thing and that spending too much time on the phone was bad. Growing up in the '80s, I know the stereotype was teenage girls getting yelled at by their parents to get off the phone, so this "problem" spanned generations. Today a lot of voice conversation has been replaced by texting and social media, but the core complaint still hangs around.

Social media, just like the telephone before it, does in fact allow people to connect. Just because it isn't necessarily "in real life" doesn't mean those connections aren't real. In fact, social media allows more people to connect in meaningful ways. Just as the telephone allowed family members living in different countries to communicate verbally where it was previously impossible, social media allows people to connect with others from around the block or around the world in ways that were not possible before. With mobile devices, this level of communication is now possible anywhere, anytime. Sure, we are still figuring out the etiquette for this new medium. But instead of complaining, help set the new rules. Be open-minded and understanding and you'll get a lot further.

The problem I have with works like Gary Turk's "Look Up" is that they are extremely condescending. While I understand he's just trying to get people not to miss out on the world around them, the tone is quite patronizing, and the video feels like an anti-technology hit piece. I picked out a few quotes as examples:

“This media called social is anything but”

“All this technology we have it’s just an illusion”

“When you step away from this device of delusion”

“We’re a generation of idiots, smart phones and dumb people”

“Look up from your phone, shut down the display
Stop watching this video, live life the real way.”

There's no balance in his video. Watching it you'd think that technology was destroying the human race and we need to rid the world of this menace. But most reasonable people know otherwise. Whether it was listening to rock n' roll, watching TV, playing pinball, playing arcade games, or whatever the new thing was at the time, I'm certain many people remember being told that those things were bad by the older generations of their youth. Which only makes it more surprising that they are now turning around and doing the same to the young generations of today. The main difference is that technology has moved so rapidly that otherwise young generations are already feeling passed by. What is truly interesting is that chronological age isn't even the true gauge of potential curmudgeon status. What seems more important is how comfortable one is with technology. People in their 30's and 40's show such a wide range of technology experience that it isn't unheard of for younger individuals to act like stereotypical grumpy old nags towards others who are actually older. In fact, Gary Turk is only 27. Act your age, Gary!

As a father of two girls and someone who runs his own business, I can say for certain that The New World of Technology and social media have expanded my opportunities for connecting. Besides having gained a lot of business directly from contacts made through social media, I have connected with a lot of people I never would have otherwise had the chance to meet. Plus, I've reconnected with many people from my past whom I would not likely have had much of an opportunity to reach through other means. So where many people bemoan the idea that people are not communicating because of new technology, I believe it has in reality expanded communication for the better. Just because you don't recognize it as such, or are too afraid to learn more, doesn't mean it is wrong or not "the real way". Sure, there can be too much of a good thing, but remember that most people will learn moderation with their shiny new toy. I'm not alone in this way of thinking. Here is a great rebuttal to the "Dear Mom on the iPhone" article I mentioned, called "Dear Mom Judging the Mom on Her iPhone."

Technology has driven our civilization forward since the dawn of time. Whether it's stone tools or quantum computing, the technology that humans create virtually defines us. To take such a hardline stance on new technology only serves to create divisions where none need to exist. Whereas older generations may feel a level of ambivalence or even animosity toward new ways of living and communicating, younger generations have no such reservations and devour these new methods with abandon. They have no reason to feel otherwise. Young people strongly identify with their technology and the way they use it. But when they are told that the things they enjoy aren't "the right way", they'll push back. They'll lose respect for those who attack the way they live their lives. Eventually they'll stop listening. Choose your attitude carefully or risk becoming irrelevant.

Let's celebrate human achievement and what technology can do for us instead of making videos that call us "a generation of idiots" with "smart phones and dumb people".

#FightTheBacklash

March 12th was NOT the Birthday of “The Internet”

A lot of news stories circulated yesterday celebrating "the birthday of the Internet". I'm about to get a little nitpicky here, but guess what, it's my blog so I can do what I want. If anything, March 12th can be considered the birthday of the World Wide Web, but the Internet has been in existence in various forms since 1969. So for technical correctness, please stop saying this is the 25th anniversary of "the Internet". I know that for many of us, the web – for all practical purposes – IS the Internet, but let's try to be just a little historically correct, shall we?

That being said, I'm about to get even more nitpicky. March 12th, 1989 is the date that Tim Berners-Lee first put forth a proposal to his employer, CERN, for developing a new way of linking and sharing information over the Internet (just to reiterate my point above, the Internet had to already BE in existence in order to create a World Wide Web). However, I feel it is a stretch to say that this proposal, while it was the genesis of the World Wide Web, marks the actual birthday of the web. The proposal put forth the very basic ideas that would grow into the web, but if one reads it, it is much more a request to research the feasibility of such a system and to develop a plan to implement it. In fact, at the end of the proposal, Berners-Lee specifically calls out that a second phase would be required to set up a real system at CERN. To boot, the proposal was never actually officially accepted; Berners-Lee's boss simply allowed him to work on it as a side project.

So what do I consider the real birthday of the World Wide Web? It’s hard to say specifically, but here are some important dates:

  • November 12, 1990: This is the date that Tim Berners-Lee put forth another proposal detailing the technical specification for actually developing a system that he called “WorldWideWeb”. This proposal was accepted and the real work of creating the web was put into motion. This could more accurately be called the birthday of the web.
  • February 26, 1991: On this date Berners-Lee released the first web browser to his colleagues within CERN. At this point the web was only available within CERN, but the fact that people were browsing is significant.
  • May 17, 1991: The first machine serving web pages to the public Internet is set up by Tim Berners-Lee. Truly, this could be considered the birthday of the web, as it was the first time anyone in the world (who had Internet access, of course) could feasibly browse the web. Not that there was much information of interest available that day. From this point forward, web servers were set up in organizations all over the world, and development of web browsers for all computer operating systems began in earnest.
  • April 30, 1993: The source code of the World Wide Web protocols is placed into the public domain by CERN. Tim Berners-Lee urged CERN to do this so that the technology would be freely available to anyone who wanted to set up a web site. Had this not happened, the web might never have become the de facto standard for organizing and sharing information on the Internet. Some people consider this the real birth of the World Wide Web and the moment the Internet began to creep into the mainstream.

Now, I'm not going to argue much with the people behind the Web at 25 movement, since Tim Berners-Lee himself is supportive of this project, even though I might disagree with the particulars. They say he "invented" the web in 1989, but that's like saying Edison invented the light bulb before he actually got it working. It's one thing to come up with an idea; it's another to actually make it reality. I still say that the "invention" or "birth" of the World Wide Web took place in late 1990/early 1991, as the dates above show. But if people want to celebrate the idea that the web was born on March 12, 1989, that's fine, especially since the man who created it isn't arguing. In my research as a technology historian, I know that many dates in history are hard to pin down exactly, especially when it comes to technology development. In fact, the dates I list above may not be entirely accurate depending on how people define technology releases and which source is claiming what. At least March 12, 1989 points to written documentation of the first reference to the project that would eventually become the World Wide Web. But please, at least call it what it is, not "the birthday of the Internet".