Karma and the 1985 World Series

1985 World Series, Out at First

Even Royals fans know this is an out.

I do not acknowledge the supposed results of the 1985 World Series.

As far as I’m concerned, the series is still in game 6, bottom of the ninth, pending a review of a play at first base. Given that video replays clearly show the runner is out at first and umpire Don Denkinger made the wrong call, I believe that the game stands at one out with bases empty in the bottom of the ninth inning.

Now unlike many Cardinals fans, I’m not going to say that this call absolutely cost the Cardinals the World Series. They still had two more outs to go, and who knows what would have happened after that. It is certainly possible the Royals would have rallied regardless. However, in the same breath, with the correct call the Royals would have remained down by a run with no base runners and only two outs left. It doesn’t take a baseball expert to surmise that a team in the World Series, two outs away from winning with their dominant closer on the mound, has the odds clearly on its side. But just to emphasize the point, an article written by a self-professed Royals fan analyzing the percentages shows that the Cardinals’ chances of winning the game would have gone from 81% to 89% if the correct call had been made. Instead, the Cardinals’ chances of winning dropped to 67% with the blown call. Many Royals fans will argue that the Cardinals still had the odds on their side and therefore “the call” can’t be blamed for the Cardinals losing, but the effects of such an unprecedentedly bad call on the psyche of a team can’t be ignored.

Let’s put a little perspective on the magnitude of the call made by Don Denkinger. Umpires make bad calls all the time, to be sure. However, rarely are bad calls so irrefutably wrong. Rarely are bad calls made in a game and situation of such importance. Rarely do bad calls so directly affect the outcome of a championship. As I mentioned, all the TV replays showed the call was wrong. The announcers couldn’t believe the call. After the game, the commissioner of baseball told Denkinger that the call was wrong. The call was so bad that Denkinger himself later admitted he made the wrong call. This wasn’t just an ordinary bad call. It was an epically bad call in the tightest of games. This was a bad call of historic proportions. Given the situation, it is perhaps the worst call ever in the history of baseball. It is fairly obvious that the Cardinals just weren’t the same team after that call, and it is hard to blame them given the unbelievable situation. Unfortunately, we will never know what would have happened if the play had been correctly called and the game had continued like business as usual.

I’m a big believer in Karma. I think Karma can even extend in ways that are hard for us to fathom. Therefore, I do not think it is coincidence that the Kansas City Royals, who had not played a postseason game since that fateful 1985 World Series, finally made it into the playoffs this year – the very year that instant replay was finally introduced into Major League Baseball. Had instant replay been available in 1985, Don Denkinger’s colossally bad call would have been easily reversed and the series would have wrapped up without controversy. Maybe baseball Karma was at work and the Royals were destined to not play in the postseason until instant replay could prevent a bad call from influencing a series like it did in 1985. Could be. Just maybe. Baseball is a strange sport like that.

Given we can’t go back in time, this is the way it should be.

Like I said above, even with the odds squarely in their favor, I’m not going to claim that the Cardinals are the rightful 1985 World Series champions. Even if the call had been correctly made or reversed, they still had two outs to go, and without the benefit of a time machine, we cannot go back to find out if they would have closed out that game. However, it is also a stretch to say that the Royals can rightfully claim the 1985 crown. The bad call and ensuing brouhaha gave them a significant unfair advantage. So how do we resolve the situation? It wouldn’t be right to strip the Royals of the title and give it to the Cardinals, but we can’t ignore the monumental umpiring error that possibly robbed a team of a championship. The best I can come up with is to acknowledge both teams as 1985 World Champions. In this year of instant replay, we should attempt to right what once went wrong. Twenty-nine years later we can’t apply instant replay and resume game 6 of the 1985 World Series, but we can retroactively acknowledge the egregiously bad call that completely changed a series.

Apple, Samsung, the Evolution of Smartphones, and Real Innovation

Now that the iPhone 6 models have been released, the buzz among consumers and competitors alike is in full swing. Both Apple fans and detractors are lighting up the Internet with their opinions on larger screen sizes and claims of which companies are the most innovative. For me, it has been interesting to observe the evolution of the smartphone and the corresponding market. I feel that we are now at the launch point for the next stage of advancement in mobile technology, so it is a good time to reflect and look forward.

Let’s be clear that Apple created the smartphone market as we know it today. Sure there were “smartphones” before the iPhone, but that is like comparing DOS-based computers to the Macintosh and all graphical computers after. They simply weren’t in the same league. For all the talk about Apple not innovating lately, I think a lot of people are missing the big picture. The iPhone was a huge leap forward from the “primordial” smartphones, just as the Macintosh was a huge leap from early text-based personal computers. Once a market segment is created, competition begins to fill in. But competing in an established category is quite different from creating a market and continually moving it forward.

Considering how many people felt themselves “technology illiterate” in the PC era, the fact that millions upon millions of people now own smartphones is quite a feat. But obviously this didn’t happen overnight. It’s been seven years since Apple introduced the original iPhone and six years since they rolled out the App Store. Every year Apple brings out some new advances, but while other companies want to ride the bleeding edge, Apple has been very different in their approach. It’s almost as if Apple knows that too much, too quickly has the potential to turn off consumers. They seem to understand that mainstream society needs to get used to technology advancements before building further upon them. Steve Jobs once used the analogy of people and technology being on a “first date” in the 80’s, and I think the analogy still extends today. Move too fast and someone will get dumped. Had Apple gone too quickly and made the iPhone too complex or less reliable, it may have stymied the adoption of the iPhone and smartphones in general. However, as Apple was slowly bringing us along, being careful not to move too fast or introduce bleeding-edge technology that could have negatively impacted the user experience, a certain set of users wanted to move faster.

In the early days of Android, phone manufacturers tried just about everything and anything they could to compete with the iPhone and differentiate themselves from other Android phones. I likened it to throwing everything against the wall to see what would stick. For all the gimmicky things that Android phone manufacturers tried, the one thing that actually seemed to stick was bigger screen size. Ironically, bigger screen sizes may not have been an intentional development, but rather came out of necessity, as the phones that initially supported LTE needed bigger batteries to handle the increased drain of the early LTE chipsets. Bigger batteries required bigger phones and correspondingly bigger screen sizes. Samsung then took the larger screen size concept to the next level with the Galaxy Note in 2012. It isn’t hard to understand why certain people like a bigger screen. Techie types seem to like anything with bigger specs, and those with bad eyesight think that bigger screens mean they are easier to read. Others simply like to have more screen room to work with. Regardless, Samsung’s bigger screens ended up being virtually the only thing that average users could identify with on non-Apple phones. Somewhat paradoxically, while competitors claimed bigger screens on phones were better, they simultaneously pushed smaller screens on tablets! But that is a discussion for another time.

Where Apple was careful not to push technology too fast, they may have been just a little too cautious. The mainstream that was “technology illiterate” became savvy more quickly than Apple expected. Ironically, it appears that the ease of use Apple was so careful to protect empowered users to not be so fearful of technology, emboldening them to explore products from other companies. Samsung’s bigger screens were a simple draw to those wanting to push the technology envelope. The simple fact that Apple has now introduced larger screen sizes shows that there must have been significant customer demand for them. From my own personal experience as a technology consultant, I can say that I was asked many questions about larger screen sizes in the last couple of years. Several people indicated that they were contemplating leaving the iPhone to get a bigger screen. Congratulations, Samsung! Even a blind squirrel finds a nut sometimes!

Now don’t get me wrong. I think competition is a great thing. It keeps everyone on their toes, constantly working to improve their products or services. Ultimately, consumers win from varied choices and lower costs. But I think one can tell a lot about the company behind the products from their advertisements. Why does it seem that the large majority of Samsung’s ads are trying to poke fun at the iPhone? Samsung perhaps doesn’t realize (or perhaps they do) that they are also making fun of the people who use iPhones. This isn’t a good way to win friends and influence people. If your products are so great, Samsung, why do you need to build them up by trying to tear down others? And what does it say about the people who are influenced by this type of conceited, self-congratulatory commercial? It’s a throwback to the Old World of Technology, where many technology professionals gave off an aura of smug superiority. I think most people have no desire to deal with egotistical technology professionals any longer.

Unfortunately for Samsung, the jig may be up. For all the people who claimed that having a bigger screen was innovative, it was something very simple to copy. Now that Apple has introduced bigger screen sizes, Samsung no longer has an easy claim to fame. The fact that sales of the iPhone 6 have been record-setting seems to indicate that while people did in fact want smartphones with big screens, what they truly wanted was an iPhone with a big screen. For all of Samsung’s hype, their “Next Big Thing” usually was simply their next “big” thing. Now that everyone’s big, what will Samsung do? When the “we had big screens first!” marketing campaign fizzles out, what gimmicky tech features will they resort to next?

Hopefully Samsung enjoyed their time in the sun because it seems that Apple is back with a vengeance. Their current path of innovation, save the Apple Watch, may seem subtle at this point, but it all stands to fundamentally reshape not only the mobile device market, but the entire technology landscape as well. Besides making incredible technology products, what Apple does at its core is bring technologies into the mainstream. By making technology easy to use and accessible, Apple makes technology more powerful than any hardware specification alone can. Let’s look forward a little bit:

  • HomeKit will become the standard for unified home automation across disparate devices. The “Jetsons” home will finally come closer to reality.
  • HealthKit will become the standard for organizing personal health information from various sensors and data input. Soon we will hear stories about how Apple technology is literally saving people’s lives.
  • Apple Pay will move forward the payment transaction industry that has long been stuck in a quagmire. Where other tech companies have tried to bring mobile payments to the mainstream, only Apple has the customer base and industry influence to actually pull it off.
  • All these technologies will be also tied into Apple Watch, which already seems like it will be one of the hottest tech items of the coming years. I will write more about Apple Watch in a future article.

The problem for Samsung, or any other competing manufacturer, is that unlike a simple large screen, none of these technologies that Apple is bringing forward are easy to copy. This is because Apple isn’t just bringing raw technology advances to the table. Apple is doing the very hard work of making the technology easy to use and accessible. This requires a lot of development work as well as significant investments in creating industry relationships. Companies that are primarily manufacturers do not have the nearly 40 years of R&D and ingrained culture of innovation that Apple has. It’s quite a different thing to create an entire ecosystem of amazing user experience than it is to throw some tech specs at a board and slap together cheap electronic devices. To Samsung, smartphones are just another TV or microwave that they churn out en masse. But to Apple, it’s personal. They aren’t just a manufacturing company. They really do care about making “insanely great” devices. This is the legacy of Steve Jobs. The payoff for Apple is that they are clearly the most valuable company in the technology industry, even if we don’t look at the numbers. Apple’s clout among companies and consumers in the economy at large is priceless, and the fact that the entire world waits with bated breath to see what Apple does next is proof enough.

Those who think they can compete with Apple once it has established mainstream success with its technology ecosystems should learn from recent history. Many tried to knock Apple off the iPod/iTunes pedestal. All failed miserably. Corporate juggernauts like Microsoft and Sony seemed feeble when attempting to replicate the success that Apple had. Again, this was because Apple wasn’t simply making digital music players. Anyone could and did make those. Apple knew that they had to focus on the entire user experience and make it brain-dead simple for users to not only play songs on their devices, but also purchase and organize their music as well. Getting music on early MP3 players was a chore for all but the most techie among us. It seems obvious now, but if you couldn’t get music on your device, the device itself was pretty useless, no matter how great its tech specs were. The same will hold true for HomeKit, HealthKit, and Apple Pay. While its competitors were all busy trying to make phones with the biggest, baddest tech specs, Apple was quietly leapfrogging them in technology that consumers will truly care about. By the time other companies figure this out, assuming they ever will, they will likely be too late. If Apple is successful in creating new technology ecosystems around their new innovations, it will be extraordinarily hard for anyone to compete. This will be especially true for a simple manufacturing company like Samsung, which doesn’t even make the operating system that runs on its phones.

Apple’s marketing slogan for the iPhone 6 is “Bigger than Bigger”. Subtle, but absolutely on point. While it seems that size may in fact matter, at the same time it really doesn’t matter. There are bigger things than big screens and we are about to see this come to fruition. It will certainly be interesting to see what Apple’s competition does next.

Ferguson and Big Government Hypocrisy

You probably asked for it to become this way, even if you don’t yet realize it.

I live near St. Louis, MO, which, as you probably know by now, is the metropolitan area where the city of Ferguson is located. My house is less than 25 miles away from Ferguson, which suddenly became the focus of the nation last month. The killing of Michael Brown by a police officer has set off both a figurative and literal firestorm in the city and the nation.

The Ferguson situation has become much larger than the initial incident. There are three main threads to this story. First, obviously, is the question of whether the killing of Michael Brown by a police officer was murder or self-defense. That evolved into protests encompassing the larger issue of police brutality. When the protestors (or other people who were only interested in starting trouble) became violent and destructive, the perceived overreaction of the local police brought up questions about the militarization of local police forces and violations of free speech. The three issues are obviously interrelated, but they are also distinct. Each point deserves a thorough discussion.

I’m not going to say much about the killing of Michael Brown because there are so many unanswered questions. None of us know for sure what happened that day so it is impossible to make a conclusion. However, I know there is a lot of suspicion regarding the integrity of police departments so I completely understand the concerns people may have regarding the investigation itself.

I’m certainly no fan of police brutality. Those who choose to “serve and protect” must always remember the second part of that statement. I don’t believe that police should have any special protection if they violate people’s rights.

I will say that violence is wrong and I think all sides are united against violence, whether it is from police or rioters. Preventing violence is truly the core of all the issues surrounding the Ferguson situation, yet I think many people are missing a big common thread.

One thing became pretty evident over the first few nights of the protests and riots. The way the police handled the situation with military-style equipment pushed a lot of people’s buttons. Many claimed to be shocked and appalled at the sight of highly militarized police forces on suburban American streets. From the TV reports, especially the live ones, it certainly seemed like heavily armed police using tear gas and rubber bullets were going overboard attempting to clear people, including journalists, from the streets. Scenes like that set off a frenzy on social media with people decrying the brutality of police trampling people’s right to protest. Certainly I’m no fan of anybody’s rights being infringed upon. But what I found hypocritical was that many of the people who were criticizing this overreach of government are people who consistently advocate for government overreach in other ways.

You can’t have your cake and eat it too when it comes to big government! If you want more government control in one area, you must be willing to accept big government across the board, including more powerful police and more intrusion into your personal life. If you don’t like overly-powerful government agencies, you must be willing to shrink government entitlement programs and allow other people to live their lives as they see fit. Big government is two sides of the same coin. You can not honestly expect to have a government that is powerful and manipulative on one hand, yet unintrusive and peaceful on the other. Government only knows one way to get things done. When you only have a hammer, everything looks like a nail.

Government, at its core, is the authorized use of violence. Note that I didn’t say “justified”. I said “authorized”. There is a big difference. Justified means that an action is morally sound. Authorized simply means an action was permitted or commissioned. I think a lot of people misunderstand the functioning of government. Governments, and the laws created by them, are not the shining ideal of virtue. Nor can we expect them to be. Simply because government takes an action doesn’t make that action justified, even if a law authorizes the conduct. Laws are made by imperfect people, especially considering that those who make laws are often imperfect and possibly corrupt politicians. There are many examples in history of very bad laws if you need proof of this. These imperfect laws are then enforced by other imperfect people, some of whom may be unethical themselves, and you can begin to see the recipe for disaster that too many laws bring us. The more laws we have, the greater the odds of the authorized but unjustified use of violence.

Because government is violence incarnate, it should be an option of last resort. Ideally government should only be there to protect people’s rights when certain people will not cooperatively respect the rights of others. When we begin to ask government to do more than protect our natural rights, we open a Pandora’s box of unintended consequences. The more we ask of government, the more powerful it becomes. Ultimately, that power is concentrated into people with guns and armor.

I often tell people that all government actions are at gunpoint. A lot of people push back against this statement. They claim that they don’t have men with guns forcing them to comply with the laws or pay their taxes. However, this is not a metaphor. It is reality. All government laws, taxes, regulations, etc. are enforced by the threat of violence. Usually just the threat is enough. But if anyone chooses to disobey a law or decides not to pay a tax, the threat becomes real. It doesn’t matter if the law or tax is truly justified; breaking laws or not paying taxes will eventually result in men with guns taking you to jail or forcibly taking your property. Continue to resist and force will be used against the “perpetrator”, up to and including lethal force if deemed “justified” by those enforcing the laws. Perhaps those who didn’t believe me before saw what happened in Ferguson and have begun to understand.

This isn’t a partisan issue, either. Government has grown steadily since the beginning of the last century, but under the current and previous presidential administrations, government at all levels has seen unprecedented increases in scope and corresponding overreach. It is probably easy to correlate the militarization of local police with the Bush administration’s war on terror. But government growth is government growth. Whether you supported the so-called “Patriot” Act and its liberty-destroying actions, or whether you called for the government to force free people to purchase health insurance with Obama’s ironically named “Affordable” Care Act, the end result is ultimately the same. If the militarization of police is now becoming visible due to the war on terror, how long will it take for the militarization of the IRS to be felt from Obamacare? Laws must be enforced, and the more power government has, the more powerful those enforcers will become. Whether it’s BLM agents with helicopters and armored vehicles killing cattle at the Bundy Ranch, state governments dictating who may and may not marry, or local police SWAT teams with body armor and tear gas terrorizing residents and journalists in Ferguson, these incidents have more in common than some people would care to admit. The common thread of big government weaves throughout.

Perhaps hypocrisy is too strong of a word for some people in this situation. They honestly don’t yet understand the correlation between the expansion of big government and violent government overreach. The scenes in Ferguson in the aftermath of the Michael Brown shooting should serve as a wake-up call for these people. Every time you want government to make a new law, think about what you saw in Ferguson. Every time you want government to raise taxes, recall those images from Ferguson. Every time you want more government, remember Ferguson – because ultimately that is exactly what you are asking for.

The Apple-IBM Deal: No, Hell Didn’t Freeze Over

Steve Jobs Flips off IBM

If you have even a rudimentary knowledge of the history of the personal computer, you know that during the 1980’s Apple and IBM were considered mortal enemies. In a fight to the death for dominance of the personal computer market, the upstart Apple, which created the personal computer revolution, squared off against the old-guard IBM, the 800-pound gorilla of the technology industry. To this day there are still perceptions of Apple and IBM as distinct opposites in the technology world, even though a lot of time has passed since the PC wars of the 80’s and things have changed quite a bit. So when Apple and IBM announced a strategic partnership, it wasn’t surprising that many people were somewhat confused. How in the world could these two companies form a strategic partnership?

If you know the history of technology as well as I do, the announcement wasn’t actually all that surprising. While at first IBM was in fact Apple’s antagonist, Microsoft actually became the common enemy of both companies. Let me give you a brief background.

Yes, it was IBM that came out with the “IBM PC” that ran a DOS operating system made by Microsoft. But due to shortsightedness on IBM’s part, along with some strategic maneuvers on Microsoft’s part, Microsoft became the big winner in the personal computer market, crushing every personal computer maker that didn’t run MS-DOS (and later Windows). Apple was virtually the only personal computer company to survive, albeit just barely. At the same time, the PC revolution crushed nearly every old-school technology company that was prominent during the mainframe era of the 60’s and 70’s. Even IBM itself was nearly put out of business by the onslaught of IBM-compatible PC clones running Microsoft operating systems.

What saved Apple was the return of Steve Jobs and the subsequent expansion of Apple’s technology offerings into mobile devices such as the iPod, followed later by the iPhone and iPad, along with the revenues of the iTunes and App Stores. What saved IBM was their refocusing on their corporate services offerings back in the 1990’s. In fact, IBM was the first big name to get out of the PC business in 2004, when they sold their PC division to Lenovo. Only after Apple ushered in The New World of Technology with the iPhone and iPad, drastically changing the technology market, did other names such as Dell and HP begin to seriously target the corporate services market that IBM had long dominated. Both Apple and IBM realized that the PC market was beyond direct competition with Microsoft, but there were bigger things in store. Apple focused on the consumer market and IBM focused on enterprise services.

Fast-forward to present day and the deal really makes perfect sense. Apple is the dominant force in the consumer and small business market due to the iPhone and iPad. IBM commands a lot of influence in the corporate world. IBM wants to grow with the mobile device revolution and perhaps due to the lessons learned in the 1980’s, knows that there is plenty of money to be made in offering services instead of trying to create their own devices. Apple would love to get more enterprise business and knows that partnering with such a well-respected name like IBM is probably the quickest way to achieve growth.

So it really is just a very simple strategic alliance between two companies with a lot to gain between them. Yes, it seems a little funny at first, but the reality is that both companies are very mature and powerful and stand to get more powerful together. What was your first reaction when hearing this announcement?

The Facebook Messenger App is NOT the Devil!

The Facebook Messenger App – Could it be SATAN!? No, just some sensationalist claims gone viral.

Unless you’ve been living under a rock, you are well aware of all the dire warnings about Facebook’s “new” Messenger app floating around the Internet. At first I wasn’t going to write anything about it, but it seems that the story continues to get bigger. So I feel it necessary to discuss the warnings and how it all got started.

An article written by Nick Russo for a Houston radio station claimed that the Facebook Messenger app would have permissions to do all sorts of privacy-invading things if you installed it. For some reason, the article went viral. Well, it probably went viral for the same reason people send chain letters about virus hoaxes: it had just enough sensationalism mixed in with an authoritative tone to seem credible. The name of Nick’s radio station is “The Bull,” and perhaps that should have been an indication to people reading it that his article was, for the most part, BS.

I’m not sure why this radio personality felt it necessary to pretend to be a technology expert. The very first time I read the article I knew there was something just not right. I tried to research his claims for some friends who were asking and for the life of me I couldn’t find anything about this guy stating that he had any professional experience besides working in radio. There’s nothing wrong with working in radio, but if you’re going to use your platform to disseminate information, please be sure you know what you’re talking about! As far as I’ve seen, Nick has not yet written an apology for his fear-mongering article, but rather has shifted into portraying himself as some sort of privacy advocate. Once again, I’m all for privacy advocates, but if you’re going to advocate – know of what you speak beyond just a cursory scratching of the surface.

Nick Russo made a lot of outlandish claims regarding what the Facebook Messenger app could do. The first problem with his claims was that he didn’t make a distinction between smartphones. I knew right away when reading his article that there was no way Apple would allow an app like that into their App Store. Certainly it might be possible with an Android-based phone, however unlikely, but Apple puts every single app submitted to their store through an approval process. Every. Single. App. Yeah, there’s no chance that Apple would allow Facebook Messenger, or any other app, to do the following as claimed by Nick Russo:

  • change or alter your connection to the Internet or cell service … for its own reasons without telling you.
  • send text messages to your contacts on your behalf … when they want
  • see through your lens on your phone whenever they want .. listen to what you’re saying via your microphone if they choose to
  • read your phone’s call log, including info about incoming and outgoing calls … Facebook will know all of this
  • read e-mails you’ve sent and take information from them to use for their own gain.
  • read personal profile information stored on your device … addresses, personal info, pictures or anything else
  • Facebook will now have a tally of all the apps you use, how often you use them and what information you keep or exchange on those apps.

It’s not like Apple iPhones are some off-the-wall brand that can be safely overlooked when discussing smartphones. They are just a *little* popular, to put it lightly. So to write an article like this with such extreme claims and not know about Apple’s approval process is simply irresponsible. But even if we were to ignore iPhones for the moment, does anyone really think that Facebook would want to do most of what is claimed above to their users? Perhaps Mr. Russo should have put in a call to someone at Facebook to ask a few questions first? Or at least do a tiny little bit of research on this thing called the Internet before publishing an article like this? I bet even the resident PC guy at “The Bull” probably could have warned Nick that his claims were pretty far out and to be careful before publishing his article. But alas, Mr. Russo took a little sliver of knowledge and believed he knew more than he did – running off like “The Bull” in a china shop and starting a viral tidal wave in the process.

To be fair, in theory – extreme theory – what Nick Russo claims above could possibly be accomplished by highly malicious apps running on some smartphone platforms. But Facebook Messenger isn’t a malicious app. And Nick must have found that out, because in his next article he states, “I’ve now learned that both the New Facebook Messenger App and the original Facebook app have many of the SAME permissions.” Yes, I’m sure he did learn a few things once his article went viral! But perhaps those things should have been learned BEFORE publishing! As it turns out, the Facebook Messenger app (which isn’t new, but has been out for years) does virtually nothing different from any other similar app, including the normal Facebook app that over a billion people already use. Oops!

Apparently once he found that out, Nick chose to portray himself as a privacy advocate, championing the idea that he made people more aware of the privacy choices on their phones. Fair enough, but let’s call a spade a spade. If he really cared about people’s privacy choices, he would have done some research and consulted technology experts so that he could have written a balanced article. Any good he has done has been completely obscured by the hysteria he created. Advocacy by accident at best. Fear-mongering at worst.

Bottom line, there are many articles that debunk Nick’s claims. Here is another article discussing some of Nick’s claims as “myths”. Facebook even posted an article discussing the privacy concerns. So the moral of the story is that we can’t believe everything we read – especially when it comes to technology topics. While we may not like the fact that Facebook is making everyone use a separate app for Messenger, spreading misinformation isn’t helping anybody.

Jumping Off a 35-Foot Cliff and Facing One’s Fears

35-foot Cliff

That’s the cliff I jumped off. Yes, from the top.

Other than my immediate family, not many people are aware that I have a bit of a fear of heights. It’s not a debilitating phobia like others may actually suffer from, but I certainly get strong feelings of anxiety when looking out from tall buildings or over railings. Sometimes even driving over long or tall bridges gives me “the willies”. It’s enough of a fear that I will make a little effort to avoid it when possible, but not so much that it actually interferes with “normal” situations.

I had the good fortune to take a lake vacation for the week of July 4th with some friends on Lake Norfolk in Arkansas. One of the activities available is jumping off a cliff into the lake. There is a short section about 10-12 feet up to jump off, and then there is an approximately 35-foot jump from the top. Joining my kids, I had no problem jumping off the 10-foot section. But we were all curious about the 35-foot jump. My 12-year-old daughter especially, who is a big fan of the Divergent series, wanted to try the jump to emulate the initiation of the Dauntless faction. We went up to the top section and took a look. Surprisingly, the view over the edge didn’t make the hair on the back of my neck stand up – not that there is actually any hair on the back of my neck, but you know the feeling! So my daughter and I talked and we both decided to face our fears and take the leap.

I asked my daughter if she wanted me to go first and of course she did. So I started psyching myself up to make the leap. Now that I had committed to making the jump, the view over the edge suddenly looked a little more intimidating! I got myself up to the edge and began visualizing my leap. Coincidentally, I had just been talking to my daughters the day before about fear and the fight-or-flight response. Now we got to put it into action. Besides the anxiety of jumping off a cliff, there was some fear of the falling sensation and also of hitting the water incorrectly so that it would hurt. My conscious mind knew that I had watched many other people jump before me and nobody was getting hurt. All these things raced through my mind as I tried to urge myself to make the jump. I looked down at all the people who were in boats watching me and the other jumpers. I looked down at the water. Then my hands and arms literally got numb. Not tingly, but seriously prickly-numb, like electricity was flowing through them. I started shaking out my arms to ward off the sensation. It was like my body couldn’t believe what my brain was contemplating. After a few seconds this numbness subsided and I was ready to go. Now the hard part was making my legs actually jump. I was mentally ready for the jump, but it’s another thing entirely to physically force yourself to leap off a cliff. I went for it once but stopped. My legs simply wouldn’t do what I was commanding them to do. Then I started talking to myself loudly, trying to muster up the last bit of courage to force myself to go through with it. It took a few more seconds but I finally felt the moment and I jumped …

… I was flying through the air …

The hardest part was making the jump. The rest was easy. Not that I had enough time to think about anything but trying to hit the water feet-first. It was over so quickly that I don’t really remember anything except the view of my legs clearing the cliff – that and hitting the water. I mostly landed correctly and splashdown really didn’t hurt. Although who knows how much adrenaline was pumping through my veins!

I made room for my daughter to jump. Unlike me, who took several minutes to finally leap over the cliff, she hesitated just once but then went for it. It was nice to have a shared experience like that, especially when conquering something that is one of my only fears. I rode high on that the rest of the day. It’s not every day we get to push our own limits, and I’m glad I was able to pass this particular test.

Sharia Law? Hobby Lobby Ain’t a Government, It’s a Private Business

george takei

I want equal rights for everyone as long as I agree with those rights.

I appreciate people like George Takei who have successfully used social media to leverage their celebrity status into large followings. Especially when they not only use their platform for entertaining, but also to comment on social issues. Sticking one’s neck out in support of a cause is a risky thing to do, but I believe it is a worthwhile endeavor to speak out, especially against the establishment. However, at times it seems that people like Takei need to educate themselves a little before speaking, lest they highlight their ignorance of the issues.

The recent Hobby Lobby decision by the Supreme Court has set social media on fire, with people on one side claiming this is a defeat for women’s rights. George Takei wrote an article, Hobby Lobby Ain’t A Church, It’s A For Profit Business, in which he asks what the decision would have been if Hobby Lobby were run by Muslims and they attempted to enforce Sharia Law on their employees. While at first the comparison may appear relevant (Christian beliefs vs Muslim beliefs), the differences between the two are so big that it borders on an intentional distortion of the facts, assuming Takei truly understands what rights are.

Sharia law, like any system of laws, requires government enforcement to be of any influence. Or at least a group of people claiming authority and using violent force, or the threat of violent force, to coerce people into compliance. To my knowledge, Hobby Lobby does not force anyone to work for them. Nor do they force anyone to purchase from them. Nor do they stop any of their employees from purchasing anything, let alone birth control. They are simply choosing not to offer certain types of birth control in the insurance benefits they offer their employees, based on their moral convictions.

Takei, like many others on his side of the debate, goes into various arguments attempting to show hypocrisy in Hobby Lobby’s beliefs regarding what they choose to offer or invest in. Others I’ve read similarly go into supposed scientific arguments about why Hobby Lobby isn’t being consistent with their beliefs. The bottom line is that it doesn’t matter. Hobby Lobby is a business owned by free individuals in a free market. Unless one has an ownership interest in a particular company, one should have no say in how that company runs its business, no matter how silly or illogical one believes it is acting. Just as one should not have a say in how another chooses to live their life – something Takei strongly crusades for.

Takei is a fervent advocate of gay rights and has been especially vocal regarding the issue of gay marriage. He even states in this article, “Our personal beliefs stop at the end of our noses, and we should therefore keep it out of other people’s business — and bedrooms.” The hypocrisy in this statement is practically self-evident! He literally states that our personal beliefs should be kept out of other people’s business. Yet here he is advocating that the Supreme Court should have upheld a law that literally sticks the nose of government squarely into other people’s businesses. A law that is based upon the arbitrary belief that a business is somehow obligated to offer a particular set of insurance benefits as defined by others.

The real problem here is that we have a system where certain people believe that health insurance is a “right.” There is also a correlated belief out there that employers are the anointed dispensers of health insurance for the country. Since certain people believe that health insurance is a “right,” and these same people generally believe that employers have an obligation to provide health insurance for their employees, it seems to make sense to them that employers should offer health insurance that covers virtually every potential need a person could have. This was the motivation behind the so-called Affordable Care Act, also known as Obamacare, which attempted to force employers to offer exactly this type of health insurance. Now that the Supreme Court has ruled against this aspect of the law, it should be no surprise that those who supported this law are up in arms. It flies in the face of their personal morality.

Therein lies the problem. People like George Takei don’t understand what rights are. I will write on this more extensively in a future article, but for now suffice it to say that true and natural rights do not involve infringing on the rights of others. In this case, the example is clear. People have the right to purchase birth control. A purchase transaction is a voluntary exchange between the buyer and the seller. People also have the right to purchase health insurance. Once again, the transaction is voluntary. The idea that employers are obligated to offer a particular type of health insurance is advocating a coerced transaction. Whether or not the employer actually wants to offer a particular type of insurance, the advocates of health insurance as a “right” believe it is justified and moral to use the threat of government violence to make sure an employer delivers an arbitrarily defined set of insurance – even if this defined set is against the morality of the employer. They believe their definition of morality is superior to any other morality. Exercising one’s true and natural rights does not involve infringing on the rights of others. Otherwise, it is not a right – especially when the attempt to exercise this so-called right requires violence or the threat of violence.

To be clear, the most hypocritical aspect of this whole situation is that just as Sharia law is an arbitrary system of morals, the idea that health insurance is a right, along with a raft of related ideas, is also an arbitrary set of morals. Accusing Hobby Lobby of the equivalent of Sharia law ignores the fact that the accusers are attempting to use real government violence to enforce their own arbitrary set of moralities. So who are the bad guys here? Free individuals running their own privately-owned business according to their beliefs, or the people attempting to use government to coerce others to comply with a particular set of moralities? George Takei and others need to take a look in the mirror. If they want others to keep their beliefs out of their bedrooms, they need to make sure to keep their beliefs out of others’ businesses. You may be a big celebrity, but that doesn’t give you the authority to extend your personal beliefs beyond the end of your own nose.

IRS: Bad Sectors or Bad Intentions?

In my previous article, I mentioned that all the attention paid to the details of Lois Lerner’s hard drive crash was just a red herring. I believe that the IRS attempting to use the hard drive crash excuse is simply the least important link in the chain of a comprehensive “innocence by incompetence” campaign. However, as weak an excuse as it is for the IRS, the technical details of Lois Lerner’s hard drive could actually end up being a smoking gun if we investigate far enough.

As I mentioned in my last article, I’ve seen more than my fair share of hard drive failures in the course of my 20-year professional career. I am very familiar with the functioning of hard drives, both in their mechanical and digital operations. So allow me to offer a quick primer on hard drive failure.

Any hard drive failure is commonly referred to as a hard drive “crash”. Technically the term “crash” has a very specific meaning in reference to hard drives – a head crash, if you care to know – but non-technical people may use the term “crash” to refer to any number of hard drive problems. Usually most people will say a hard drive crashed if the failure prevents data being read from the drive by normal methods and/or the computer will no longer boot from that drive. However, there are many less serious problems that could seem like a hard drive “crash” to non-technical users and restoring the drive to normal operation in those cases would not be difficult for a technology professional. More serious failures involve problems with the physical drive mechanism and may require the drive be sent to a data recovery specialist or forensic lab with highly sophisticated equipment to retrieve data.
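To make the “bad sectors” idea concrete, here is a rough sketch (not any actual recovery tool, and deliberately simplified) of how a diagnostic scan works: read the drive block by block and record the offsets that throw I/O errors. The block size and the in-memory “drive image” are assumptions for illustration; a real scan would read from a raw device and use specialized tooling.

```python
import io

BLOCK_SIZE = 4096  # an assumed block size for illustration


def scan_for_bad_blocks(device, block_size=BLOCK_SIZE):
    """Read a drive (or drive image) block by block, recording the
    byte offsets that raise I/O errors -- i.e. "bad sectors"."""
    bad_blocks = []
    offset = 0
    while True:
        device.seek(offset)
        try:
            chunk = device.read(block_size)
        except OSError:            # an unreadable ("bad") block
            bad_blocks.append(offset)
            offset += block_size   # skip past it and keep scanning
            continue
        if not chunk:              # reached the end of the device
            break
        offset += len(chunk)
    return bad_blocks


# A healthy 1 MB "drive image" in memory: no bad blocks expected.
healthy = io.BytesIO(b"\x00" * (1024 * 1024))
print(scan_for_bad_blocks(healthy))  # []
```

The point of the sketch: a handful of bad sectors shows up as a handful of unreadable offsets, with the rest of the data still readable around them, which is why “bad sectors” alone rarely destroys everything on a drive.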

In my experience, hard drive failures are unfortunately far too common. I have no problem believing that a hard drive crash could have befallen the computer used by Lois Lerner. Sure, the timing seems questionable, but from a purely technical standpoint, this isn’t the smoking gun by itself. We must begin by questioning the specifics of the actual drive failure. Was it an actual head crash or a less serious glitch? Unfortunately, the IRS has reported that the hard drive in question has already been recycled, so no further attempts at recovery can be made, nor can the true cause of the hard drive failure be verified. All we know is that an e-mail from an IT manager to Lois Lerner in August 2011 said, “The sectors on the hard drive were bad which made your data unrecoverable.”

At this point, neither the technical diagnosis of the hard drive failure nor the details of the recovery efforts made by the IRS IT department appear to have been made public. These details need to be uncovered, because if we know the diagnosis of the hard drive failure, we can begin to understand whether the drive and the data on it were intentionally destroyed. Alternately, we can also begin to deduce whether the IRS IT department was using proper procedures during the recovery attempt and whether there was any intentional wrongdoing within the IT department – or simply further incompetence.

According to the e-mail trail provided after Lerner’s hard drive failed, the data on the drive was deemed so important that the IRS IT department even went to the unusual lengths of sending the hard drive to their criminal investigation division’s forensic lab so they could attempt data recovery. A forensic lab often is able to piece together some data from a failed drive even if all the data on a drive is not recoverable. For a forensic lab to not be able to recover any data at all is highly unusual. This indicates that either the drive was wiped clean using advanced data deletion technology or it suffered extreme damage. Extreme damage is a rare occurrence for normal hard drive failures. Because the drive was sent to a criminal forensic lab, in theory the forensic specialists should have been able to tell if the hard drive was intentionally damaged or if there was anything unusual about the condition of the drive.

Lerner’s hard drive was reported as crashed on June 13, 2011. It wasn’t sent to the forensic lab until August 5, 2011. That’s a long time. Talk to any technology professional who is competent in data recovery and they will tell you the longer a failed drive is running in an attempt to recover data, the more damage that can be done. What was the IRS IT department doing that it took two months before they determined the drive was so bad it needed a data recovery lab? Especially if it was later determined that the “sectors on the hard drive were bad”. That type of failure should have been fairly obvious early on. Regardless, “bad sectors” do not take out all the data on a hard drive unless virtually the entire drive was damaged. In theory it could be possible that the efforts over two months by the IRS IT department could have damaged the drive beyond the point of data recovery even by a forensic lab. That would be a fairly inexcusable case of incompetence – or an intentional effort to scrub data from the drive. Either way it’s not a show of good faith on the IRS’s part.

Bottom line, there appears to be a chain of IT employees at the IRS that had access to the hard drive at any point in time, as well as “HP experts” and forensic specialists at the IRS’s criminal investigation division. If there is in fact a cover-up, the weak link in the chain may very well be any one of these IT people. Assuming they are not as politically motivated as IRS officials, it may be possible to get expert testimony from any one or more of the IT people that worked with and examined Lois Lerner’s hard drive that would conclude the drive had been tampered with or intentionally damaged. At the very least, we could find out why no data could be retrieved at all.

My hope is that if Lois Lerner’s hard drive is a smoking gun, one of the IT people involved will be brave enough to testify to this. The world could use another Edward Snowden right about now.

IRS: Innocence by Incompetency

Ever since the news broke a little over a week ago that the IRS lost e-mails connected to Lois Lerner because of a computer hard drive crash, I’ve been wanting to write an article addressing the technical aspects of this situation. However, the story kept growing as each day went by, so I waited. As I sit down to begin this article late in the evening of June 23rd, I’ve just spent almost 4 hours watching the latest hearing live on CSPAN-2. Yes, you can’t get much geekier than spending an evening watching a government hearing discussing hard drives, backup tapes, and IT department policy. But I am who I am, and the intersection of technology and politics is my wheelhouse. In all of history, there probably hasn’t been a more famous political story revolving around technology issues. Because of the size and scope of the various technical issues involved, this article will be the first of a likely series of articles tackling each major point in this long chain of events.

I almost feel that I don’t need to write these articles, because it seems even technology laypeople instinctively know there is something highly suspicious about this situation. In this day and age of advanced technology, a simple hard drive crash doesn’t seem like a justified excuse to lose an important trail of digital communication. This is especially true for a government bureaucracy that purportedly symbolizes accurate record keeping. However, I still think a thorough review of the technology and management issues is worth undertaking.

As a technology professional, I’ve seen more than my fair share of hard drive failures. In my experience, hard drive failures are far too common, so I have no problem believing that a hard drive crash could have befallen the computer used by Lois Lerner. However, looking at the big picture, the hard drive failure really shouldn’t be relevant. All the hubbub about a hard drive crash is truly a red herring. Nonetheless, I will address the hard drive issue in my next article.

Any organization with halfway competent IT management that is required to preserve e-mails will have an e-mail archiving system in place. They will not defer responsibility for preserving required communications to individual employees – unless, of course, the negligence is intentional. One very important idea behind archiving is that e-mail communication may be used for criminal investigations, among other things, and it should be obvious that employees may decide not to preserve e-mails that are incriminating. The reality is that it is much easier to archive messages as they pass through a central server than it is to attempt to store and retrieve them from individual computers. And of course, automated centralized archiving eliminates the possibility of employees “losing” e-mails to cover their asses. Not archiving messages at the server level is literally “so 1990s”.
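The server-side archiving idea can be illustrated with a toy sketch (this is a conceptual illustration, not how Exchange or Sonasoft actually work): every message is copied into an append-only archive *before* it is delivered, so no individual employee’s mailbox – or hard drive – is ever the only copy.

```python
import json
import time


class ArchivingMailServer:
    """Toy mail server: messages are archived centrally at delivery
    time, so a client-side data loss cannot erase the record."""

    def __init__(self):
        self.archive = []    # stand-in for write-once archive storage
        self.mailboxes = {}  # user -> list of delivered messages

    def deliver(self, sender, recipient, body):
        record = {"from": sender, "to": recipient,
                  "body": body, "ts": time.time()}
        self.archive.append(json.dumps(record))  # archived FIRST
        self.mailboxes.setdefault(recipient, []).append(record)

    def user_loses_mailbox(self, user):
        # simulate a client-side "hard drive crash"
        self.mailboxes.pop(user, None)


server = ArchivingMailServer()
server.deliver("lois", "colleague", "important correspondence")
server.user_loses_mailbox("colleague")
# The client copy is gone, but the central archive still has it.
print(len(server.archive))  # 1
```

The design point is the ordering: because archiving happens on the server before delivery, a crashed workstation (or a deliberate deletion) is irrelevant to record preservation.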

There are many archiving products and services available that work with common e-mail servers. It is well-known that the IRS uses the Microsoft Exchange platform, easily the most popular e-mail system for large enterprises. Therefore the IRS would have had its pick of any number of e-mail archiving systems. In fact, the IRS did have a contract with a company called Sonasoft that specializes in e-mail archiving (their tagline is “Email Archiving Done Right”). This contract was terminated at the end of fiscal year 2011 (August 31st), which seems highly unusual given the timing of the Lois Lerner hard drive failure and the supposedly lost e-mails in June of 2011. It was testified that the IRS only contracted with Sonasoft to archive the e-mails of the Chief Counsel within the IRS, which covered just 3,000 employees, not all 90,000. It also seems highly unusual to select such a small subset of employees. If archiving the e-mails of 3,000 IRS employees is deemed important, why not all 90,000 employees? At the very least, shouldn’t the heads of major divisions within the IRS, such as Lois Lerner, have had automated e-mail archiving as well? If nothing else, just from a productivity standpoint, the loss of e-mails for key personnel would be highly detrimental, and an e-mail archiving system would be well worth the cost for the protection it provides, not to mention compliance with federal regulations in case of wrongdoing.

From a technical standpoint, the costs to archive the e-mails of all employees would not have been significantly greater. As with many technology systems, the greatest costs are in the initial implementation and baseline infrastructure, not in the scaling of said systems. While archiving the volume of e-mail generated by 90,000 people would require a large amount of storage, it is not an impossible task. There are many companies in the United States that have hundreds of thousands of employees and are required to archive e-mails in order to comply with federal regulations. That being said, we know the NSA has enormous data centers that are more than capable of monitoring and archiving communications on a hyper-massive scale. Certainly it wouldn’t be beyond the capability of another gigantic federal agency such as the IRS to properly manage their own required records. The agency that has been charged with enforcing the president’s signature legislation shouldn’t have a problem archiving the e-mails of a piddly 90,000 accounts, should they? They are in charge of maintaining records on hundreds of millions of American citizens, after all.
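A quick back-of-envelope calculation puts the storage requirement in perspective. Every number below is an assumption chosen for illustration, not IRS data, but even with generous figures the result lands in the tens of terabytes per year:

```python
# Rough estimate of archive storage for 90,000 mailboxes.
# All per-employee figures are assumptions for illustration.
employees      = 90_000
emails_per_day = 50    # sent + received, per employee
avg_size_kb    = 75    # average message size incl. attachments
workdays       = 250   # working days per year

bytes_per_year = employees * emails_per_day * avg_size_kb * 1024 * workdays
terabytes = bytes_per_year / 1024**4
print(f"{terabytes:.0f} TB per year")  # 79 TB per year
```

Roughly 80 TB per year under these assumptions – so even several years of agency-wide archiving stays in the low hundreds of terabytes, which squares with the point that storage alone was nowhere near a $10 million problem in 2011.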

But that’s exactly what the current IRS chief wants you to believe. That the IRS’s technology infrastructure, plus the policies and procedures that manage it, are so woefully antiquated and out-of-date that they just couldn’t prioritize the archiving of e-mail messages. This is true even though e-mail messages are considered official records by the IRS’s own handbook, and they are required to preserve them. The excuse has been given that properly archiving all the e-mails of the agency would cost between $10-$30 million and they just didn’t have the proper funding. I would love to know how this figure was arrived at, because it seems extraordinarily high: they were already contracting with a company that was doing e-mail archiving, and scaling it shouldn’t have approached anywhere near these costs. Even several hundred terabytes of storage didn’t cost anywhere near $10 million in 2011.

What the IRS wants the American public to accept is that they can’t be proven guilty because they were incompetent. The truth of the matter, speaking as a technology expert with 20 years of professional experience, is that given the laughable policies and procedures the IRS had in place, the chain of events as they describe it is entirely plausible. The burning question is whether these policies were in place truly due to incompetence or for convenient plausible deniability. At this point, we can only assume they are “innocent by incompetence.” But this cannot be acceptable. If anything, the hypocrisy implicated here is far too much for anyone but the most ardent authoritarian to embrace. Neither the IRS nor most law enforcement agencies would accept the excuse that a technical problem resulted in the destruction of evidence. In fact, “spoliation of evidence” is a legal concept that allows courts to assume that evidence destroyed for any reason (whether claimed “accidental” or otherwise) would have been unfavorable to the party that destroyed it – to assume the worst, in other words. Given the stringent federal regulations that require publicly held corporations to archive years’ worth of e-mails, it would be a significant case of “do as I say, not as I do” statist double standards to allow the IRS to get away with this highly convenient set of circumstances.

While many apologists claim that the scandal is unfairly targeting Barack Obama, the reality is that he is the Chief Executive and the IRS is an agency of the Executive Branch. That alone should prompt any leader of integrity to take charge and demand answers. But what is especially disturbing is that Obama was elected under the auspices of “Hope and Change,” and one of those key tenets was “transparent and open government.” In fact, one of his first acts as president was to release presidential memoranda addressing the free flow of information from government agencies, regardless of the protection of personal interests. So for Obama to turn a blind eye to this situation is an egregious violation of his own proclamations. Do we really want a president that doesn’t stand by his own promises?

While “innocence by incompetency” may keep certain IRS figures out of jail or the president from being impeached, it won’t help the IRS in the long run. As I mentioned before, even technology laypeople realize there is something extraordinary about this situation. People from all political persuasions are incredulous at the arrogance and audacity shown by IRS management over having their credibility questioned. We the people cannot stand by and let this pass. If the IRS wants to prove just how incompetent it really is, then that incompetence needs to be severely punished. Instead of rewarding the agency by increasing its budget, we need to drastically reduce the power it has over the American people. The first step is to eliminate or rescind any further increases in power the IRS receives, such as those dubiously authorized by Obamacare. The ultimate step would be to abolish the IRS completely. While such a thought seemed like fantasy only a short time ago, the unprecedented nature of this situation has people seriously questioning the justified existence of such an agency and its legitimate role in a free society.

What steps do you think should be taken against the IRS for their seeming incompetency?

Windows is the Elephant in the Room

I just read an article by Ed Bott of ZDNet discussing how Microsoft’s marketing for the Surface Pro 3 has backfired. Basically, the article states that since Microsoft was comparing the upcoming Surface Pro 3 to a MacBook Air and also stating that the Surface Pro 3 can replace a laptop, the tech journalists who attempted to replace their MacBook Airs with a Surface Pro 3 were less than happy with their experience. The article, however, attempts to explain why these journalists’ experiences aren’t representative of an average user’s needs – that a tech journalist’s workflow is far too complex compared to that of an average user who would have a MacBook, so readers of these reviews were being done an injustice because their needs are far too different from a tech journalist’s. Many of the comments on the article were in agreement with the author, attempting to rationalize the poor reviews of the Surface Pro 3 as a laptop replacement. Finally, the author states that “Getting the tech press to step outside of an Apple-centric bubble and imagine a world where people might choose a Windows laptop over a MacBook is the biggest challenge of all.”

That last statement would be utterly hilarious if it didn’t completely ignore the long history of the PC era until the last few years. Until Apple broke through with the iPhone and then ended the PC era with the iPad, Apple was virtually ignored among tech journalists. Perhaps the tech press is Apple-centric for a good reason. They are embracing The New World of Technology because they finally have a real choice as compared to 20 years of Microsoft domination.

It amazes me how many people, tech journalists and otherwise, still believe that comparing a Windows PC to a Mac is, well, an “apples-to-apples” comparison. Let’s face it, people who are using laptops in any “work” environment are more like tech journalists than not. For Microsoft to compare their hardware to Apple’s hardware is completely missing the point and they got exactly what they deserved.

It’s like judging a woman completely on her “specs”. There’s a lot more to a woman than her physical appearance, no matter what kind of perceived “performance” you may get out of it. The reality is that what’s on the inside is a lot more important because ultimately that is where most of the work actually takes place. You’re never going to make the most of that “hardware” if you can’t manage the “software” effectively. Similarly, when you compare the different platforms, a Mac is simply a more friendly, easy-to-use, and therefore productive work environment. Microsoft simply does not create the level of user experience refinements that Apple does. That’s been true since 1984. Let’s not pretend here. The disaster that is Windows 8 should be proof enough of that.

Those who are used to the Mac will definitely have a hard time switching to Windows. Hell, Windows users are having a hard time switching to Windows 8! But it seems as if some people, Ed Bott included, are simply writing it off as a “transitional” problem. It’s a lot more than that. Sure, those who are used to Windows will have a learning curve switching to a Mac, but from my experience, most people get comfortable within a week and then start to realize the advantages the Mac OS brings to them. My favorite quote from someone who switched to a Mac after years of Windows use was “This is how computing should be.” On the contrary, those who try to switch from a Mac to Windows rarely ever get used to Windows and will go back to a Mac as soon as they can.

It’s not just a matter of “what you’re used to”. I’ve seen far too many examples of this play out with average, everyday people who are clients, friends, or friends of clients. When people have a chance to experience both Windows and the Mac, overwhelmingly they choose a Mac. I think we’re seeing this play out in the larger market as consumers make their own purchase decisions more and more. Most PC purchases were, and to a great degree still are, made en masse by big companies. The fact that the Mac market continues to grow while the overall PC market shrinks is one sign of this.

The other part that doesn’t make sense is people saying that tech journalists aren’t a good comparison to an average person. That’s absolutely true, but not for the reasons being bandied about by Ed Bott or certain commenters. If anyone had the ability to make a transition to Windows from a Mac, it would be a tech journalist – someone who presumably is very comfortable with technology. If a tech journalist has trouble switching to a Surface Pro 3, then what chance in hell does an average consumer have? Usually tech journalists do not give enough attention to the needs of an average user; they let their tech bias slant their view toward hardware specs while paying too little attention to ease of use. Ed Bott’s article is a perfect example of that. The fact that several tech journalists panned the Surface Pro 3 should be a warning heeded by people thinking of purchasing one. The reality is that either the device is too complex for an average user or it isn’t robust enough for a professional. That’s not a good reality for Microsoft.

As I’ve said, I’ve seen the switching scenario play out hundreds of times over the last 20+ years in both directions. I’ve had many clients who used a Mac at one point, then were forced to switch to a PC by their work environment. Years later, they still wished for a Mac and eventually made their own purchase decision to go back. I have *never* seen this scenario with someone wanting to go back to Windows. Usually, as the saying goes, once you go Mac you never go back. Especially given that you can run Windows in Boot Camp or a virtual machine, the old compatibility argument is long gone.

What’s truly ironic is that I see people switching to the Mac because they say that compared to Windows 8, they think the Mac is more like “Windows”. Ouch. Seriously Ouch.

Microsoft can try to compare their hardware to Apple’s hardware all day long, but they’re ignoring the elephant in the room that is Windows.