Reading 06: Unethical Government Backdoors

Technology companies are currently facing intense scrutiny and debate concerning one of the most difficult balancing acts of the modern digital age. Should companies value individual customer privacy, or should they instead cede to the demands of government agencies for weaker security in the name of saving lives and protecting national security? While I think there exists a middle ground where both goals can be respected and achieved without denying the other entirely, if forced to choose between the two extremes, the answer is clear. Companies like Apple should not lessen the security of their products and build one-key-fits-all government backdoors. As Apple’s CEO Tim Cook put it, they should not “undermine the very freedoms and liberty our government is meant to protect.”

Like most ethical debates, I do think there exists a great deal of grey area in the case of government technology backdoors. And I do believe that a company like Apple should endeavor to help a government agency like the FBI under the right circumstances. In fact, Apple has been making every ethical effort in its power to assist the FBI in its investigation over the past several months. But this new demand by the FBI falls well outside the bounds of an ethical request.

Make no mistake, the San Bernardino terrorist attack was a tragedy, and the perpetrators and all involved should be brought to justice. But the U.S. government should not act unethically itself in its efforts to do so, and it shouldn’t force U.S. companies like Apple to act unethically on its behalf. The FBI’s intentions may be good, as noted by Tim Cook in his customer address, but in its efforts to protect national security, the FBI is actually putting millions of innocent Americans at risk. Despite any promises that an Apple-produced iPhone backdoor would only be used in this case and other special circumstances, “there is no way to guarantee such control.” If Apple either willingly submits to this request or is forced to under court order, there is no going back – no closing of Pandora’s proverbial box once it has been opened. The ability to hack an iPhone and recover all of its data will exist for misuse by anyone capable of accessing the backdoor – whether that be identity thieves, computer hackers, or even corrupt government officials. By trying to recover the information of a few terrible individuals, we would actually be putting everyone at risk. It is this very risk posed to all that discredits the argument some may make that if you have nothing to hide, then you shouldn’t be worried about lower security measures and less privacy. In a world where iPhones and other digital devices are increasingly becoming the central hubs of our lives, we may not have anything to hide from the FBI, but we have everything to hide – bank accounts, personal identification information, etc. – from criminals, hackers, and other ill-doers who would use digital backdoors for nefarious purposes.

Besides the unethical nature of the inherent risk posed to millions of Americans by the FBI’s backdoor demand, there also exist the unethical implications the demand has specifically for Apple and its employees – namely its software engineers. The article “The Conscription of Apple’s Software Engineers” summarizes this point perfectly. The point is right there in the title. If forced to bow to the FBI in court, Apple and its engineers essentially become forced employees of the government, doing the government’s hacking and investigative work for it. What then if every software engineer at Apple refuses to write the unethical backdoor code? What if every engineer at Apple resigns on ethical grounds? Would the government punish disobeyers in court? While this sort of hypothetical takes the situation to the extreme, it showcases the absurdity of the FBI’s request. The government does not “have a claim on the brainpower and creativity of citizens and corporations.” If it does, under the auspices of an exploitative reading of the All Writs Act of 1789, then I fear for the future of a free United States of America.

Reflection on Project 2: Job Interview Guide

Writing a job interview guide really reinforced how complicated, involved, and stressful the job interview process is, especially for college students unaccustomed to the rigors of job searching. While every section of our guide includes valuable information, I think the most important sections are those on when to prepare, how to prepare, and what resources to utilize. If I could go back in time and do it all over again, I would have most certainly started my application and interview preparation earlier, especially for my internships. Luckily, I had two amazing internships at great companies the past two summers; however, I could have started prepping much earlier to make the process less stressful. By the time senior year came around, I knew to get an early start on preparation and applications. I am still amazed at the time and effort needed outside of regular classwork to really feel well prepared for job interviews.

With the seemingly ever-increasing amount of time and effort required for students to be truly well prepared and competitive in their job hunt, many colleges are facing the difficult decision of whether or not to reorganize and alter their curricula to better face this reality. Overall, I feel that Notre Dame’s computer science courses have prepared me extremely well for a career in the technology industry. Nonetheless, I would suggest a few minor changes that could make a significant difference in the job and internship prospects of future CS students. The biggest suggestion I have is to move the required Algorithms course from the fall of senior year to either semester of junior year. Because I was taking Algorithms concurrently with the majority of my full-time job search, I often encountered interview questions that involved Algorithms concepts we had not yet covered in class. This put me at a not-insignificant disadvantage compared to students from other schools, such as Stanford, that teach Algorithms prior to senior year.

Computer science is also an interesting field in that it falls under different colleges and categorizations at different universities. At Notre Dame it is within the College of Engineering – a natural home in my opinion. But I’d also be intrigued if Notre Dame explored separating out computer science into its own School of Computing. In my opinion, this would allow computer science students to take classes more appropriate to their interests that would better prepare them for job hunting and interviews, rather than having to take required engineering courses like chemistry and biochemistry. I can almost guarantee that I will never use anything from either of those classes. My time would have been much better spent focusing on computer science in those class slots.

All in all, I do think Notre Dame’s CSE program does a great job preparing students for the workforce. The preparation provided by coursework at Notre Dame along with the extra hours spent prepping outside of classes, attending career fairs, and taking 4 a.m. trains into Chicago for interviews all contributed to me landing a great job at a great company. And no matter how Notre Dame chooses to rearrange its curriculum, I still believe preparation outside of one’s classwork is going to separate the truly successful job seekers from those who graduate without offers in hand.


Reading 05: Boeing Whistleblowing

There are many reasons I decided not to concentrate in cyber security as a Computer Science student at Notre Dame. Among these reasons certainly stands the complexity and difficulty associated with developing strong computer security systems. It seems only a matter of time until the next large company makes headlines for allowing the theft of private customer data or for its own lax internal security practices. It turns out storing hundreds of passwords in a file called “passwords” might not be the greatest idea. Despite the difficulty associated with managing computer security at a large corporation, companies are still not absolved from the responsibility of doing their best to protect themselves, their employees, customers, shareholders, and the public at large through the development and practice of robust computer security protocols. When neglected, this responsibility often leads to an uncomfortable situation in which a company’s own employee(s) expose suspected wrongdoing within the company – a practice dubbed whistleblowing.

In 2006, several employees blew the whistle on Boeing, the largest airplane manufacturer in the world, regarding the company’s inability – and, in their opinion, lack of effort – to develop and practice the computer security systems necessary to adhere to the 2002 Sarbanes-Oxley Act, a law aimed at preventing a repeat of the Enron situation in which shareholders were misled by falsified financial data. The employees in question were subsequently fired in 2008, and their dismissal was upheld in court in 2011.

Considering that companies are rarely happy about whistleblowing, as evidenced by Boeing’s reaction of firing the employees who spoke to the media, there are usually two sides to every whistleblowing story. And in a world where we often see monolithic corporations like Boeing raking in billions of dollars in profit, it’s easy to come to the premature conclusion that everything Boeing did in this situation was wrong and everything the punished employees did was right. But in my opinion the reality of the situation includes much more grey than black and white, with rights and wrongs committed by both sides.

First of all, it seems readily apparent that Boeing was not treating its internal computer security employees fairly and ethically in the first place. When describing Boeing’s push to become Sarbanes-Oxley compliant, one employee spoke of “the first two years as pure hell.” The inability of the Fortune 50 company to meet its legal requirements more than three years after the initial compliance deadline, coupled with the constant arguing and back-and-forth between company managers and external auditors, further demonstrates the lack of organization, direction, and leadership that led to a hostile work environment for computer security employees at the company.

At the same time, employees of a company who have access to private company data and information bear their own ethical responsibilities. I think the employees who were fired by Boeing had the responsibility both to blow the whistle and to maintain the security of private company information. Their major mistake came in the manner in which they blew the whistle, choosing to go directly to the media rather than filing their complaints solely through the proper legal channels. As unjust as it may seem, Boeing was acting within the law when it fired the employees, as upheld in court in 2011. The whistleblower provision of the Sarbanes-Oxley Act “protects employees from discrimination if they deliver the information to a federal regulatory or law enforcement agency, a member or committee of Congress, or a work supervisor.” The media is notably absent from that list.

Upon learning that “Boeing also recently suffered three separate cases of data theft in which the personal information of more than 400,000 employees was stolen,” one can partially understand the anger of executives when the company’s own employees spoke to and leaked information to the media. At the same time, if reports of Boeing “spying on other employees to ferret out whistleblowers by videotaping workers and reading their e-mail” are true, one can equally understand the need employees might feel to expose the transgressions they believe are taking place. Furthermore, the fact that the whistleblowers in this case felt the need to go to the media rather than through the approved legal channels might point to a flawed whistleblowing system itself, one that doesn’t promote and incentivize proper, ethical behavior within the process.

Now that the dust has settled on the 2006 Boeing whistleblowing story, it seems to me that unethical decisions were made by both parties – the company and the whistleblowers. Boeing could have treated its own employees in a more ethical manner while attempting to implement Sarbanes-Oxley compliance in the first place, and the employees in question could have brought their concerns to the attention of the proper legal authorities, rather than the media, to best protect the private data and information of Boeing. If anything can be learned from the confusing, complicated tale of whistleblowing at Boeing, it might be that these stories often have no clear-cut ethical and unethical participants.