IRS: Innocence by Incompetency

Ever since the news broke a little over a week ago that the IRS lost e-mails connected to Lois Lerner because of a computer hard drive crash, I’ve been wanting to write an article addressing the technical aspects of this situation. However, the story kept growing as each day went by so I waited. As I sit down to begin this article late in the evening of June 23rd, I’ve just spent almost 4 hours watching the latest hearing live on CSPAN-2. Yes, you can’t get much geekier than spending an evening watching a government hearing discussing hard drives, backup tapes, and IT department policy. But I am who I am and the intersection of technology and politics is my wheelhouse. In all of history, there probably hasn’t been a more famous political story revolving around technology issues. Because of the size and scope of the various technical issues involved, this article will be the first of a likely series of articles tackling each major point in this long chain of events.

I almost feel that I don’t need to write these articles because it seems even technology laypeople instinctively know there is something highly suspicious about this situation. In this day and age of advanced technology, a simple hard drive crash simply doesn’t seem like a justified excuse for losing an important trail of digital communication. This is especially true for a government bureaucracy that purportedly symbolizes accurate record keeping. However, I still think the technology and management issues at play are worth a thorough examination.

As a technology professional, I’ve seen more than my fair share of hard drive failures. In my experience, hard drive failures are far too common, so I have no problem believing that a hard drive crash could have befallen the computer used by Lois Lerner. However, looking at the big picture, the hard drive failure really shouldn’t be relevant. All the hubbub about a hard drive crash is truly a red herring. Nonetheless, I will address the hard drive issue in my next article.

Any organization with halfway competent IT management that is required to preserve e-mails will have an e-mail archiving system in place. It will not defer responsibility for preserving required communications to individual employees – unless, of course, the negligence is intentional. One very important idea behind archiving is that e-mail communication may be used in criminal investigations, among other things, and it should be obvious that employees may decide not to preserve e-mails that are incriminating. The reality is that it is much easier to archive messages as they pass through a central server than it is to attempt to store and retrieve them from individual computers. And of course, automated centralized archiving eliminates the possibility of employees “losing” e-mails to cover their asses. Not archiving messages at the server level is “so 1990s”.
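To make the idea concrete, here is a minimal sketch of server-side journaling in Python, assuming the third-party aiosmtpd package. Every message relayed through the server gets written to an append-only archive before anything else happens, so no individual mailbox or hard drive is ever the system of record. The hostnames, port, and archive path are hypothetical.

```python
import hashlib
import pathlib
from datetime import datetime, timezone

from aiosmtpd.controller import Controller

ARCHIVE = pathlib.Path("/var/mail-archive")  # hypothetical append-only store


class ArchivingHandler:
    """Copies every message passing through the relay before delivery."""

    async def handle_DATA(self, server, session, envelope):
        raw = envelope.content  # the full raw message, as bytes
        digest = hashlib.sha256(raw).hexdigest()
        stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
        # One immutable file per message; no employee ever touches this store.
        (ARCHIVE / f"{stamp}-{digest}.eml").write_bytes(raw)
        # A real journaling relay would now hand the message off to the
        # downstream mail server (e.g., Exchange) for normal delivery.
        return "250 Message accepted for delivery"


if __name__ == "__main__":
    controller = Controller(ArchivingHandler(), hostname="0.0.0.0", port=2525)
    controller.start()  # runs the SMTP listener on a background thread
    input("Archiving relay on port 2525; press Enter to stop.\n")
    controller.stop()
```

Commercial products like Sonasoft’s add indexing, retention policies, and tamper-evidence on top of this, but the core principle is exactly that simple: capture at the server, not at the endpoint.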

There are many archiving products and services available that work with common e-mail servers. It is well-known that the IRS uses the Microsoft Exchange platform, easily the most popular e-mail system for large enterprises. Therefore the IRS would have had its pick of any number of e-mail archiving systems. In fact, the IRS did have a contract with a company called Sonasoft that specializes in e-mail archiving (their tagline is “Email Archiving Done Right”). This contract was terminated at the end of fiscal year 2011 (August 31st), which seems highly unusual given the timing of the Lois Lerner hard drive failure and the supposedly lost e-mails in June of 2011. It was testified that the IRS only contracted with Sonasoft to archive the e-mails of the Chief Counsel’s office within the IRS, which covered just 3,000 employees, not all 90,000. It also seems highly unusual to select such a small subset of employees. If archiving the e-mails of 3,000 IRS employees is deemed important, why not all 90,000? At the very least, shouldn’t the heads of major divisions within the IRS, such as Lois Lerner, have had automated e-mail archiving as well? If nothing else, just from a productivity standpoint, the loss of e-mails for key personnel would be highly detrimental, and an e-mail archiving system would be well worth the cost for the protection it provides – not to mention compliance with federal regulations in case of wrongdoing.

From a technical standpoint, the cost to archive the e-mails of all employees would not have been significantly greater. As with many technology systems, the greatest costs are in the initial implementation and baseline infrastructure, not in the scaling of those systems. While archiving the volume of e-mail generated by 90,000 people would require a large amount of storage, it is not an impossible task. There are many companies in the United States that have hundreds of thousands of employees and are required to archive e-mails in order to comply with federal regulations. For that matter, we know the NSA has enormous data centers that are more than capable of monitoring and archiving communications on a hyper-massive scale. Certainly it wouldn’t be beyond the capability of another gigantic federal agency such as the IRS to properly manage its own required records. The agency that has been charged with enforcing the president’s signature legislation shouldn’t have a problem archiving the e-mail of a piddly 90,000 accounts, should it? It is in charge of maintaining records of hundreds of millions of American citizens, after all.

But that’s exactly what the current IRS chief wants you to believe. That the IRS’s technology infrastructure, plus the policies and procedures that manage it, are so woefully antiquated and out-of-date that they just couldn’t prioritize the archiving of e-mail messages. This despite the fact that e-mail messages are considered official records by the IRS’s own handbook and the agency is required to preserve them. The excuse has been given that properly archiving all the e-mails of the agency would cost between $10 and $30 million and they just didn’t have the funding. I would love to know how this figure was arrived at, because it seems extraordinarily high given that they were already contracting with a company that was doing e-mail archiving, and scaling it up shouldn’t have approached anywhere near these costs. Even several hundred terabytes of storage didn’t cost anywhere near $10 million in 2011.
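As a sanity check on that figure, here is a back-of-envelope calculation. Every number in it is my own assumption for illustration – deliberately generous message volumes and a rough 2011-era enterprise storage price – not an IRS figure:

```python
# Back-of-envelope sizing for archiving all IRS e-mail.
# Every figure below is an assumption for illustration, not an IRS number.
employees = 90_000
messages_per_day = 100      # generous per-employee average
avg_message_kb = 75         # body plus typical attachment overhead
days_per_year = 365
retention_years = 7

total_kb = (employees * messages_per_day * avg_message_kb
            * days_per_year * retention_years)
total_tb = total_kb / 1024**3  # KB -> TB

cost_per_tb = 2_500         # assumed 2011 enterprise storage, in dollars
print(f"Archive size over {retention_years} years: {total_tb:,.0f} TB")
print(f"Raw storage cost: ${total_tb * cost_per_tb:,.0f}")
```

Run it and you get roughly 1,600 TB and about $4 million – for seven years of everything, before any deduplication or compression, and still under the low end of the IRS’s own estimate.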

What the IRS wants the American public to accept is that they can’t be proven guilty because they were incompetent. The truth of the matter, speaking as a technology expert with 20 years of professional experience, is that given the laughable policies and procedures the IRS had in place, the chain of events as they describe it is entirely plausible. The burning question is whether these policies were in place truly due to incompetence or for convenient plausible deniability. At this point, we can only assume they are “innocent by incompetence.” But this cannot be acceptable. If anything, the hypocrisy implicated here is far too much for anyone but the most ardent authoritarian to embrace. Neither the IRS nor most law enforcement agencies would accept the excuse that a technical problem resulted in the destruction of evidence. In fact, “spoliation of evidence” is a legal concept that allows courts to presume that evidence destroyed for any reason (whether claimed “accidental” or otherwise) would have been unfavorable to the party that destroyed it, and to assume the worst. Given the stringent federal regulations that require publicly held corporations to archive years’ worth of e-mails, it would be a significant case of “do as I say, not as I do” statist double standards to allow the IRS to get away with this highly convenient set of circumstances.

While many apologists claim that the scandal is unfairly targeting Barack Obama, the reality is that he is the Chief Executive and the IRS is an agency of the Executive Branch. That alone should prompt any leader of integrity to take charge and demand answers. But what is especially disturbing is that Obama was elected under the auspices of “Hope and Change,” and one of its key tenets was “transparent and open government.” In fact, one of his first acts as president was to release presidential memorandums addressing the free flow of information from government agencies, regardless of the protection of personal interests. So for Obama to turn a blind eye to this situation is an egregious violation of his own proclamations. Do we really want a president who doesn’t stand by his own promises?

While “innocence by incompetency” may keep certain IRS figures out of jail or the president from being impeached, it won’t help the IRS in the long run. As I mentioned before, even technology laypeople realize there is something extraordinary about this situation. People of all political persuasions are incredulous at the arrogance and audacity shown by IRS management over having their credibility questioned. We the people cannot stand by and let this pass. If the IRS wants to prove just how incompetent it really is, then that incompetence needs to be severely punished. Instead of rewarding the agency by increasing its budget, we need to drastically reduce the power it has over the American people. The first step is to eliminate or rescind any further increases in power the IRS receives, such as those dubiously authorized by Obamacare. The ultimate step would be to abolish the IRS completely. While such a thought seemed like fantasy only a short time ago, the unprecedented nature of this situation has people seriously questioning whether such an agency is justified and what legitimate role it could have in a free society.

What steps do you think should be taken against the IRS for their seeming incompetency?

Windows is the Elephant in the Room

I just read an article by Ed Bott of ZDNet discussing how Microsoft’s marketing for the Surface Pro 3 has backfired. Basically, the article states that since Microsoft was comparing the upcoming Surface Pro 3 to a MacBook Air and also stating that the Surface Pro 3 can replace a laptop, the tech journalists who attempted to replace their MacBook Airs with a Surface Pro 3 were less than happy with their experience. The article, however, attempts to explain why these journalists’ experiences aren’t representative of an average user’s needs – that a tech journalist’s workflow is far too complex compared to that of an average user who would have a MacBook. So the readers of the tech journalists’ reviews were being done an injustice because their needs are far too different from a tech journalist’s. Many of the comments on the article were in agreement with the author, attempting to rationalize the poor reviews of the Surface Pro 3 as a laptop replacement. Finally, the author states that “Getting the tech press to step outside of an Apple-centric bubble and imagine a world where people might choose a Windows laptop over a MacBook is the biggest challenge of all.”

That last statement would be utterly hilarious if it didn’t completely ignore the long history of the PC era until the last few years. Until Apple broke through with the iPhone and then ended the PC era with the iPad, Apple was virtually ignored among tech journalists. Perhaps the tech press is Apple-centric for a good reason. They are embracing The New World of Technology because they finally have a real choice as compared to 20 years of Microsoft domination.

It amazes me how many people, tech journalists and otherwise, still believe that comparing a Windows PC to a Mac is, well, an “apples-to-apples” comparison. Let’s face it, people who are using laptops in any “work” environment are more like tech journalists than not. For Microsoft to compare their hardware to Apple’s hardware is completely missing the point and they got exactly what they deserved.

It’s like judging a woman completely on her “specs”. There’s a lot more to a woman than her physical appearance, no matter what kind of perceived “performance” you may get out of it. The reality is that what’s on the inside is a lot more important because ultimately that is where most of the work actually takes place. You’re never going to make the most of that “hardware” if you can’t manage the “software” effectively. Similarly, when you compare the different platforms, a Mac is simply a more friendly, easy-to-use, and therefore productive work environment. Microsoft simply does not create the level of user experience refinements that Apple does. That’s been true since 1984. Let’s not pretend here. The disaster that is Windows 8 should be proof enough of that.

Those who are used to the Mac will definitely have a hard time switching to Windows. Hell, Windows users are having a hard time switching to Windows 8! But it seems as if some people, Ed Bott included, are simply writing it off as a “transitional” problem. It’s a lot more than that. Sure, those who are used to Windows will have a learning curve switching to a Mac, but in my experience, most people get comfortable within a week and then start to realize the advantages the Mac OS brings them. My favorite quote from someone who switched to a Mac after years of Windows use was “This is how computing should be.” By contrast, those who try to switch from a Mac to Windows rarely ever get used to Windows and will go back to a Mac as soon as they can.

It’s not just a matter of “what you’re used to”. I’ve seen far too many examples of this play out with average, everyday people who are clients, friends, or friends of clients. When people have a chance to experience both Windows and the Mac, overwhelmingly they choose a Mac. I think we’re seeing this play out in the larger market as consumers make their own purchase decisions more and more. Most PC purchases were, and still are to a great degree, made en masse by big companies. The fact that the Mac market continues to grow while the overall PC market shrinks is one sign of this.

The other part that doesn’t make sense is people saying that tech journalists aren’t a good comparison to an average person. That’s absolutely true, but not for the reasons being bandied about by Ed Bott or certain commenters. If anyone had the ability to make a transition to Windows from a Mac, it would be a tech journalist – someone who presumably is very comfortable with technology. If a tech journalist has trouble switching to a Surface Pro 3, then what chance in hell does an average consumer have? Usually tech journalists do not give enough attention to the needs of an average user. They let their tech bias slant their view toward hardware specs while paying too little attention to ease of use. Ed Bott’s article is a perfect example of that. The fact that several tech journalists panned the Surface Pro 3 should be a warning heeded by people thinking of purchasing one. The reality is that either the device is too complex for an average user or it isn’t robust enough for a professional. That’s not a good reality for Microsoft.

As I’ve said, I’ve seen the switching scenario play out hundreds of times over the last 20+ years in both directions. I’ve had many clients who used a Mac at one point, then were forced to switch to a PC by their work environment. Years later, they still wished for a Mac and eventually went back once the purchase decision was their own. I have *never* seen this scenario with someone wanting to go back to Windows. Usually, as the saying goes, once you go Mac you never go back. Especially given that you can run Windows in Boot Camp or a virtual machine, the old compatibility argument is long gone.

What’s truly ironic is that I see people switching to the Mac because they say that compared to Windows 8, they think the Mac is more like “Windows”. Ouch. Seriously Ouch.

Microsoft can try to compare their hardware to Apple’s hardware all day long, but they’re ignoring the elephant in the room that is Windows.

Fighting the Technology Backlash!

I’m not usually one to let things easily bother me. It takes just the right provocation to actually make me feel anything resembling anger. However, when I watched the viral video “Look Up” by Gary Turk, it pushed my buttons. Sure, there has been much written about people getting lost in technology such as this article called Dear Mom On The iPhone that gets passed around every so often. There is definitely a simmering backlash out there against all the new technology introduced in the last several years. But there was something about “Look Up” that was just exasperating to me. It went too far. I couldn’t let it go. I knew that it was time to fight back against the technology backlash. So to quote Samuel L. Jackson from Jurassic Park, “Hold on to your butts”, because things are about to get real!

Frankly, I’m sick and tired of all the whining and complaining regarding The New World of Technology that we live in today. Whether it’s someone bitching about people taking too many pictures or people looking at their phones too much or people spending too much time on Facebook or people playing too many video games, the level of antagonism towards technology in general has gone far enough. I’ve had it. Especially when a lot of this teeth-gnashing is being done on social media, being posted on YouTube, or being written about on a freaking blog!

I get it. Technology is changing things. Very rapidly. I’ve written about it before – that the last 6 or so years have seen technology disrupt our society unlike at any time before. Change is scary, I know. People like to complain about change. This isn’t new. But usually the stereotypical scene here is a couple of old people sitting on their rockers grumbling about their ailments and how those young whippersnappers just don’t have any respect. Now it seems that young is the new old, and anyone over the age of 30 is eligible for old-timer syndrome.

Now, let me show some empathy here first. Yes, there are many people out there who are in fact clinically addicted to their mobile devices or video games or social media. They should seek out professional help. Then there are legions of people who are not truly addicted but probably spend an inordinate amount of time using technology of some sort. People in this situation should be encouraged to practice some moderation. But there’s a difference between gently encouraging someone and browbeating them into submission like “Look Up” tries to do. The reality is that true encouragement will work a lot better than bullying someone into compliance. In fact, attempting to harass someone will likely cause the opposite reaction, as anyone can tell you that forbidden fruit tastes the sweetest. To those who like to complain: what does it really matter to you anyway? Focus your energies on improving yourself and quit trying to run everyone else’s lives. You’re only making yourself seem totally out of touch and marginalizing your own influence.

The fact that technology is so mobile now allows people to interact with it on an almost constant basis. Where in the past technology virtually chained us to our desks, we now use our technology out in public like never before. This is where I think those with curmudgeon-like tendencies get their knickers all up in a twist. It’s out there in their faces now. Everywhere one turns, one can see The New World of Technology. Not only in real life but all over traditional media now too. They can’t get away from it. The old human instinct of being afraid of what we don’t understand kicks in. I feel their pain. But frankly, those of us who are making use of new technology are tired of listening to the grumbling. If you don’t like it, then don’t use it – but leave the rest of us alone.

Perhaps the greatest thing about mobile technology is that it gives us access to the Internet anywhere we are. There is so much we can do with the Internet from communication to research. Literally it is the world at our fingertips. Saying someone is addicted to the Internet is like saying someone is addicted to communication. It’s like saying someone is addicted to learning. It’s like saying someone is addicted to life.

Now as I said before, there are people who probably do spend too much time using technology. For those who think that’s a big problem, just have some patience. Society as a whole behaves very similarly to how individuals behave – just on a longer, drawn-out scale. Visualize a kid who has just received a new toy that they’ve been wanting for a really long time. At first they seem addicted to it. They’re constantly playing with it. Maybe they even go too far and play with it when they really shouldn’t, or stay up way past bedtime because they don’t want to stop playing. Maybe they even sneak it into places they shouldn’t. But as with most things, they eventually get out of the “shiny new toy” phase and stop playing with it as much. Or as they mature in general they learn moderation. How the parents react to shiny-new-toy syndrome will go a long way in determining how a child develops. If the parents come down hard, it usually only makes the child want to play with the toy more. If the parents chastise the child, the child will still want to play with the toy, but now they’ll go to lengths to hide their play. Either way, the child loses some respect for the parents, feeling that the parents just don’t understand and aren’t making an effort to do so. However, if the parent takes the time to understand why the child loves the toy so dearly and patiently teaches the child that sometimes there can be too much of a good thing, the child will probably be more likely to learn and practice moderation going forward.

I believe society is in the shiny-new-toy phase with The New World of Technology. It has moved so fast that we are still figuring out the new rules of the game. Or more precisely, we are making them up as we go along. To make things harder, technology continues to change right out from under us. Just when it seems we have a handle on things, new services like Pinterest or Instagram, added to the constant stream of new devices being introduced, totally change the nature of the beast. Everyone, especially those complaining the loudest, had better buckle up, because as a technology professional I guarantee you we’re just getting started.

Instead of complaining, I would encourage you to look at the bright side of all this new technology.

Playing games stimulates imagination and provides a motivation for learning. Video games truly are no different than traditional games in this regard, except that the evolving nature of video games has opened up exciting new opportunities. In many ways, I believe video games are a new form of classic storytelling. Where older generations revered their books and movies, younger generations hold the same adoration for the adventures they find in their video games. People today don’t need to just read about exciting worlds, they can virtually be part of them. I know it can be hard for older generations to grasp this concept. Video games seem so childish and a waste of time to many. However, chastising the playing of video games only serves to prove just how out of touch you are.

Spending time on social media is a true and valid form of communication. Just because it is a new way to communicate doesn’t make it any less legitimate. In a world where a lack of communication is endemic, discouraging communication is highly counterproductive. Sharing our experiences through photos is another form of communication. Just because you’d rather “take it all in,” don’t try to diminish how others prefer to capture their memories. Again, you only serve to make yourself irrelevant to them.

At one point in our history, we could only communicate with others who were in our immediate vicinity. Then the telephone changed all that. We could literally get in touch with anyone across town or around the world. I’m sure many people back when the telephone was being introduced thought that phone calls were not a natural thing and spending too much time on the phone was bad. Growing up in the 80’s, I know the stereotype was teenage girls getting yelled at by their parents to get off the phone, so this “problem” spanned generations. Today a lot of voice conversation has been replaced by texting and social media, but the core complaint still hangs around.

Social media, just like the telephone before it, does in fact allow people to connect. Just because it isn’t necessarily “in real life” doesn’t mean those connections aren’t real. In fact, social media allows more people to connect in meaningful ways. Just like the telephone could allow family members who lived in different countries to verbally communicate where it was impossible previously, social media allows people to connect with people from around the block or around the world in ways that were not possible prior. With mobile devices, this level of communication is now possible anywhere, anytime. Sure, we are still figuring out the etiquette for this new medium. But instead of complaining, help set the new rules. Be open-minded and understanding and you’ll get a lot further.

The problem I have with works like Gary Turk’s “Look Up” is that they are extremely condescending. While I understand he’s just trying to get people not to miss out on the world around them, the tone is quite patronizing and the video feels like an anti-technology hit piece. I picked out a few quotes as examples:

“This media called social is anything but”

“All this technology we have it’s just an illusion”

“When you step away from this device of delusion”

“We’re a generation of idiots, smart phones and dumb people”

“Look up from your phone, shut down the display
Stop watching this video, live life the real way.”

There’s no balance in his video. Watching it you’d think that technology was destroying the human race and we need to rid the world of this menace. But most reasonable people know otherwise. Whether it was listening to rock n’ roll, watching TV, playing pinball, playing arcade games, or whatever the new thing was at the time, I’m certain many people remember being told that those things were bad by the older generations of their youth. Which only makes it more surprising that they are turning around and doing the same to the young generations of today. The main difference is that technology has now moved so rapidly that otherwise-young generations are already feeling passed by. What is truly interesting is that chronological age isn’t even the true gauge of potential curmudgeon status. What seems more important is how comfortable one is with technology. People in their 30’s and 40’s show a wide range of technology experience, so it isn’t unheard of for younger individuals to act like stereotypical grumpy old nags towards others who are actually older. In fact, Gary Turk is only 27. Act your age, Gary!

As a father of two girls and someone who runs his own business, I can say for certain that The New World of Technology and social media have expanded my opportunities for connecting. Besides having gained a lot of business directly from contacts made through social media, I have connected with a lot of people I would never otherwise have had the chance to meet. Plus I’ve reconnected with many people from my past whom I would not likely have had much of an opportunity to reach through other means. So where many people bemoan the idea that people are not communicating because of new technology, in reality I believe it has expanded communication for the better. Just because you don’t recognize it as such, or are too afraid to learn more, doesn’t mean it is wrong or not “the real way”. Sure, there can be too much of a good thing, but remember most people will learn moderation with their shiny new toy. I’m not alone in this way of thinking. Here is a great rebuttal to the Dear Mom on the iPhone article I mentioned, called “Dear Mom Judging the Mom on Her iPhone.”

Technology has driven our civilization forward since the dawn of time. Whether it is stone tools or quantum computing, the technology that humans create virtually defines us. To take such a hardline stance on new technology only serves to create divisions where none need to exist. Whereas older generations may feel a level of ambivalence or even animosity toward new ways of living and communicating, younger generations have no such reservations and devour these new methods with abandon. They have no reason to feel otherwise. Young people strongly identify with their technology and the way they use it. But when they are told that the things they enjoy aren’t “the right way”, they’ll push back. They’ll lose respect for those who attack the way they live their lives. Eventually they’ll stop listening. Choose your attitude carefully or risk becoming irrelevant.

Let’s celebrate human achievement and what technology can do for us instead of making videos that call us “a generation of idiots” with “smart phones and dumb people”.

#FightTheBacklash

March 12th was NOT the Birthday of “The Internet”

A lot of news stories circulated yesterday celebrating “the birthday of the Internet”. I’m about to get a little nitpicky here, but guess what, it’s my blog so I can do what I want. If anything, March 12th can be considered the birthday of the World Wide Web, but the Internet has been in existence in various forms since 1969. So for technical correctness, please stop saying this is the 25th anniversary of “the Internet”. I know that for many of us, the web – for all practical purposes – IS the Internet, but let’s try to be just a little historically correct, shall we?

That being said, I’m about to get even more nitpicky. March 12th, 1989 is the date that Tim Berners-Lee first put forth a proposal to his employer, CERN, for developing a new way of linking and sharing information over the Internet (just to reiterate my point above, the Internet had to already BE in existence in order to create a World Wide Web). However, I feel it is a stretch to say that this proposal, while it was the genesis of the World Wide Web, marks the actual birthday of the web. The proposal put forth the very basic ideas that would grow into the web. But if one reads it, it is much more a request to research the feasibility of such a system and to develop a future plan for implementing it. In fact, at the end of the proposal, Berners-Lee specifically calls out that a second phase would be required to set up a real system at CERN. To boot, this proposal was never actually officially accepted; Berners-Lee’s boss simply allowed him to work on it as a side project.

So what do I consider the real birthday of the World Wide Web? It’s hard to say specifically, but here are some important dates:

  • November 12, 1990: This is the date that Tim Berners-Lee put forth another proposal detailing the technical specification for actually developing a system that he called “WorldWideWeb”. This proposal was accepted and the real work of creating the web was put into motion. This could more accurately be called the conception of the web.
  • February 26, 1991: On this date Berners-Lee released the first web browser to his colleagues within CERN. At this point the web was only available within CERN, but the fact that people were browsing is significant.
  • May 17, 1991: Tim Berners-Lee puts the first web server live on the public Internet. Truly, this could be considered the birthday of the web, as it was the first time anyone in the world (who had Internet access, of course) could feasibly browse the web (see the sketch after this list for what that looked like). Not that there was much information of interest available that day, but the web was “live” for all intents and purposes. From this point forward, web servers were set up in organizations all over the world and development of web browsers for all computer operating systems began in earnest.
  • April 30, 1993: CERN places the source code of the World Wide Web software into the public domain. Tim Berners-Lee urged CERN to do this so that it would be freely available to anyone who wanted to set up a web site. Had this not happened, the web might never have become the de-facto standard for organizing and sharing information on the Internet. Some people consider this the real birth of the World Wide Web and the moment the Internet began to creep into the mainstream.
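For the curious, here is roughly what “browsing” meant in those earliest days. This Python sketch speaks HTTP/0.9, the one-line protocol the first server and browser used: the client sends a bare GET, the server streams back raw HTML with no headers, then closes the connection. Treat it as an illustration of the old protocol rather than a working client – modern servers, including whatever machine answers at info.cern.ch today, may well refuse a bare HTTP/0.9 request.

```python
import socket

# HTTP/0.9, as spoken between the first browser and server in 1991:
# a single "GET <path>" line from the client, raw HTML back from the
# server (no status line, no headers), then the connection closes.
def http09_get(host: str, path: str) -> str:
    with socket.create_connection((host, 80), timeout=10) as sock:
        sock.sendall(f"GET {path}\r\n".encode("ascii"))
        chunks = []
        while data := sock.recv(4096):
            chunks.append(data)
    return b"".join(chunks).decode("latin-1")

# The first page ever served lived at this (now historical) address.
print(http09_get("info.cern.ch", "/hypertext/WWW/TheProject.html"))
```

That one-line request was the entire protocol – everything else we associate with the web came later.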

Now, I’m not going to argue much with the people behind the Web at 25 movement, since Tim Berners-Lee himself is supportive of this project, even though I might disagree with the particulars. They say he “invented” the web in 1989, but that’s like saying Edison invented the light bulb before he actually got it working. It’s one thing to come up with an idea; it’s another to actually make it reality. I still say that the “invention” or “birth” of the World Wide Web took place in late 1990/early 1991, as the dates above show. But if people want to celebrate the idea that the web was born on March 12, 1989, that’s fine, especially since the man who created it isn’t arguing. In my research as a technology historian, I know that many dates in history are hard to pin down exactly, especially when it comes to technology development. In fact, the dates I list above may not be entirely accurate depending on how people define technology releases and which source is claiming what. At least March 12, 1989 does point to written documentation of the first reference to the project that would eventually become the World Wide Web. But please, at least call it what it is, not “the birthday of the Internet”.

Sneak Peek #3 of Diggers Episode Featuring Steve Jobs’ Lost Aspen Time Tube

Here is the second clip I found from the upcoming episode of Diggers where we recovered Steve Jobs’ Lisa mouse.

Another Sneak Peek of Diggers Episode Featuring Steve Jobs’ Lost Aspen Time Tube

I found two additional clips from the upcoming episode of Diggers where we recovered Steve Jobs’ Lisa mouse. Here is the first one:

Sneak Peek of Diggers Episode; “Lost” Steve Jobs Aspen Time Tube

C|NET just released a video sneak peek of the upcoming Diggers episode where they unearth the Lost Aspen Time Tube containing Steve Jobs’ Lisa mouse from his “Lost” 1983 Speech. I am featured at the end of the clip talking about Steve Jobs and the mouse. Check it out!

The original article with the video is here on C|NET’s site.

“Lost” Steve Jobs Time Capsule Episode to Air February 25 on National Geographic Channel

This is a picture of me (second from the left), the cast of Diggers, and two people who were involved with the time capsule at the IDCA conference in 1983. From left to right: “Ringy”, Marcel Brown, “KG”, Thane Roberts, and my client John Celuch. The time capsule is behind us.

We finally have an official air date for the highly anticipated excavation of the “Lost” Steve Jobs Time Capsule and his Lisa mouse that was buried inside. The National Geographic Channel will air two episodes of Diggers on February 25th starting at 10 PM Eastern. One of the episodes will feature the Aspen Time Tube and the efforts that finally culminated in the recovery of the Lisa mouse. I will also personally be featured in the show, offering historical perspective on the significance of Steve Jobs, the “lost” speech that he gave at the conference in Aspen, the history of the technology industry since 1983, and some of the more interesting technology artifacts that were uncovered.

I have few details about the episode, but here is what I have been told:

  • Usually Diggers runs 30-minute segments per dig site, but because the Aspen excavation was such a big project, it will be featured in an entire one-hour episode.
  • The Aspen episode is intended to be the season premiere. Whether this means it will air first of the two episodes that night is still up in the air, but I certainly hope so!

So set your DVRs and don’t miss this episode! For those of you that live in my area, we are planning a watch party to be held somewhere that has a large party room with TVs. Stay tuned for details!

Mac OS X Mavericks: The Sky is Falling … Or Not.

Settle it down there, Chicken Little!

An article on ZDNet, “Mavericks: The end of Macs in the enterprise?”, complains that Apple will no longer update older versions of its Mac operating system now that Mac OS X 10.9 has been released. The author claims that Apple is forcing CIOs to choose between upgrading to an untested operating system or leaving themselves open to attack. The author certainly makes a convincing-sounding argument, but ultimately it is not much more than Chicken Little claiming the sky is falling.

The author starts off by stating that Macs have never been that popular in the enterprise, which is somewhat funny because if Macs are not that popular in the corporate world, then what is the point of writing this article? I can only guess that it serves to tip the author’s hand that he simply isn’t that fond of Macintosh computers, which he reinforces by calling them “shiny” and later “pretty”, as if that is the only reason people buy Apple computers.

To say that Macs have never been that popular “in the enterprise” is true, but it isn’t the whole truth. The reality is that Macs have never been popular with IT departments who are in charge of managing a large number of commodity computers. If we asked end-users, I’m sure we’d get a very different answer as to which computer they prefer, given a choice. In addition, Macs are popular with departments that value what the Mac brings to the table, most obviously design and/or publishing departments within corporations. These departments usually operate mostly autonomously from the larger corporate IT department because 1) many IT people have an irrational dislike of computers that aren’t Windows-based PCs, and 2) they generally do not require the constant support a Windows environment requires. Most creative departments are fairly happy to self-support their Macs because they can. And because Mac-using departments generally don’t need help from corporate IT, it only serves to further raise the ire of IT departments, who generally don’t like users who aren’t beholden to their assistance.

Macs will probably never be popular with enterprise IT departments, but it isn’t because of security “issues”. It is because the enterprise wants commodity computing. Macs will never be commodity computing. We shouldn’t look for Macs to replace PCs as commodity computing because mobile devices are doing that already. Not that it is relevant to this discussion, but it is somewhat amusing to see that Apple’s iPhone and iPad are the darling of corporate America. Again, not because they have been blessed by corporate IT, but rather because end users have overwhelmingly demanded it.

The reality is that, as a CIO, it doesn’t matter what you think. You will need to support whatever devices your end-users are bringing into the work environment. BYOD isn’t a suggestion anymore. But this is actually a very good thing for the enterprise. Shift the burden of device support to your users. Configure your IT infrastructure so that it doesn’t matter what device your users bring in. And for the love of god, make sure your corporate data security isn’t dependent on securing end-user devices. If your network can be compromised because some PC has a virus, then you’ve failed. Just as companies don’t give employees cars or worry about maintaining their employees’ transportation, companies should make it an expectation that employees have properly functioning computing devices. If it doesn’t work, it’s up to the employees to get it fixed. That will ensure employees make good decisions about their technology. Which usually means NOT choosing a Windows PC.

Still, the entire premise of this article is highly suspect. Why on Earth would you switch to an untested operating system? Mavericks is out now. Start testing it. It’s not like you need to upgrade right this second. You have at least a few weeks if not months to do some very thorough testing and for software vendors to patch software if necessary. The sky is not falling, Chicken Little.

I understand that when viewing the world through Windows-colored glasses, people tend to be a little jittery. Yes, security patches are an absolutely critical thing for Windows systems. I can’t blame anyone for being a little shell-shocked when they are in charge of a Windows environment. Windows is a war zone, and those who use Windows need to take every precaution necessary. But after 12 years and only a very few instances of malicious software – most of which has been very minor, not very widespread, and/or easily remedied – I think the proof is in the pudding that Mac OS X is a very secure operating system. Apple might not be patching older operating systems? Boo-hoo. Pull up your big boy pants and realize this is The New World of Technology. If security is your concern, you should be embracing anything that doesn’t start with a “W” and end with “indows”.

If the idea is to walk away from headaches, everyone in IT should have long ago run screaming from the migraine that is Windows. Managing Mac computers is a walk in the park compared to the nightmare that is Windows. If the choice is not to leave yourself open to attackers, Windows is NOT that choice. Windows’ swiss-cheese design leaves itself open to attackers, and we must stay vigilant to patch every hole in the dam that springs a leak. The Mac OS has proven itself to be a formidable fortress, and the latest version is more secure than ever. Instead of acting like Chicken Little and screaming that the sky is falling, take a breath, look at the big picture, and quit missing the forest for the trees.

Can We Now Please Get Serious About Viruses?

Many news reports last month warned of a new type of ransomware called CryptoLocker. In a nutshell, CryptoLocker uses sophisticated encryption techniques to scramble an infected user’s data and then holds the data for ransom. Only if the user pays $300 will the data be decrypted and become usable again. If the user does not pay $300 within about 3 or 4 days of getting infected, CryptoLocker automatically destroys the decryption key required to unlock the data and the user will never be able to recover the data in any other way. If this sounds nasty, you’re damn right it is.
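To see why destroying the key is so final, here is a toy sketch using Python’s third-party cryptography package: encrypt a stand-in for a user’s files, then throw the key away. This illustrates the principle only – the real CryptoLocker used a more elaborate hybrid RSA/AES scheme with the private key held on the attackers’ servers – but the endgame is the same.

```python
from cryptography.fernet import Fernet, InvalidToken

# A stand-in for a victim's irreplaceable data.
data = b"family photos, business records, client files..."

key = Fernet.generate_key()             # the attacker's secret
ciphertext = Fernet(key).encrypt(data)  # what's left on the victim's disk

del key  # the ransom deadline passes and the key is destroyed

# Without the original key, decryption fails, and brute-forcing a
# 128-bit key is computationally infeasible. The data is simply gone.
try:
    Fernet(Fernet.generate_key()).decrypt(ciphertext)
except InvalidToken:
    print("Data is unrecoverable without the original key.")
```

Once that key is gone, no amount of cleverness gets the data back – which is exactly why the only real defenses are offline backups and not getting infected in the first place.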

Of course, as with nearly all malicious software, this malware can only infect Windows-based systems. At this time, CryptoLocker cannot infect Macintosh computers, iOS devices (iPhone, iPad, iPod Touch), or Android-based devices. While it is within the realm of possibility that criminals could create a Mac version, the underlying secure UNIX-based design of the Mac makes this very unlikely (the virtual non-existence of malware for the Mac OS X platform after 12 years should be proof enough). I’ll keep it simple and say there’s zero chance of this happening on iOS platforms. And while this particular incarnation of CryptoLocker probably wouldn’t be effective on an Android device, there are already examples of ransomware popping up on Android devices.

In other news, another piece of malware called Dexter has resurfaced in South Africa after infecting systems in the US, UK, and dozens of other countries towards the end of last year. This particular malware attacks Windows-based point-of-sale systems and skims credit card information from customers shopping at infected stores. But what’s tens of millions of dollars between friends, eh?

While scams can happen on any platform, and some cross-platform development environments (Java, Adobe Flash) can spread malware to any platform that supports them, the bottom line is that Windows is the center of the malware universe. Windows is so full of holes it makes Swiss cheese jealous. For all the anti-virus software out there, its effectiveness has steadily declined over the years, detecting only 70 to 90 percent of malware according to a report from a few years ago. The situation hasn’t improved since, as malware is increasingly prevalent and more sophisticated in its methods of attack and evasion.

Windows is a war zone. If you choose to participate in this environment, you must take increasingly intricate actions to stay protected. And that protection is dubious in nature. Where simple anti-virus software and firewalls used to be enough for most people, it is becoming increasingly clear that additional layers of protection are necessary to actually be “protected”. Most of these steps are far beyond the average computer user’s comprehension or ability to implement, and even then it is a constant battle to stay updated and aware.

When will enough be enough? Untold numbers of individuals and businesses lose millions upon millions of dollars a year combating a problem that Microsoft’s operating system fosters. Sure, it isn’t Microsoft’s fault that malware authors feast on their operating system, but the reality is that Microsoft created the environment for malware to flourish. Something MUST change in the technology industry, because this simply can NOT continue. Technology is supposed to make our lives easier, not harder.

It is time to face the stark reality that Windows is no longer (not that it has ever truly been) a platform that we can consider a viable foundation to run our lives or businesses on. For all the hype about Windows 8 (not that anyone is listening), the reality is that Windows users are one infection away from losing their valuable data. That data could be irreplaceable photos of their children. Or it could be information that their livelihood depends on. Or it could be other people’s confidential information that they have been entrusted with. I, for one, am sick of dealing with this problem. It does not need to be this way.

Other operating systems, namely the Mac and iOS, are virtually immune to malware. Nothing is perfect, but Mac OS X is a paradise compared to the Windows war zone. And iOS is virtually impregnable with Apple strictly controlling that environment and how software can be installed on it.

Technology professionals, it is time for a “come to Jesus” moment. If you continue to advocate the deployment of technologies founded on Windows – and if you advocate for the deployment of other malware-susceptible platforms such as Android – you are doing your customers, clients, or employers a disservice. More than that, I suggest you are now sabotaging those who pay your salaries. Take a look in the mirror and ask yourselves if you can live with the potential disaster that lurks around the corner. The next CryptoLocker or Dexter attack may hit your systems, and you’ll have no one to blame but yourselves. It is time to take a stand and start informing those who look to you for technology expertise that the only real solution to malware is to move away from the platforms that are its breeding grounds. Yes, it will be tough to swim against the current, but the tide is already changing. Will you help lead the charge or simply follow along?