Alana and Mary Bryce

Neg

**FCC 1ac**

**FCC 1ac – plan**

**FCC 1ac – NSA Overreach**
 * **The United States federal government should substantially curtail National Security Agency surveillance of domestic communications to or from United States persons through Federal Communications Commission oversight.**


 * **Contention 1 – NSA Overreach**

The NSA, originally formed to monitor outside threats to the security of the United States, has increasingly turned its surveillance towards the American public.7 The NSA was originally formed in 1952, growing out of intelligence and cryptology analytics developed during WWII, which naturally shaped the agency’s mission to monitor threats coming from outside the United States.8 Today, the NSA is “authorized to collect, process, analyze, produce, and disseminate signals intelligence information and data for foreign intelligence and counterintelligence purposes to support national and departmental missions, and to provide signals intelligence support for the conduct of military operations.”9 Under the **letter of the law**, this power is significantly limited in the domestic arena. The Foreign Intelligence Surveillance Act of 1978 (“FISA”)10 bars the NSA from intercepting any domestic, electronic communications of persons inside the United States unless a judge on the Foreign Intelligence Surveillance Court (“FISA Court”) issues a warrant upon finding that “the purpose of the surveillance is to obtain foreign intelligence information. . . and there is probable cause to believe that the target of the surveillance is an agent of a foreign power.”11 FISA also places various restrictions on other forms of domestic surveillance activities that do not intercept the contents of communications, such as the “installation and use” of pen registers or trap and trace devices, which capture the origin and destination of phone calls or other communications to and from a particular telephone number or other device.12 In 2001, Congress substantially expanded FISA with the USA PATRIOT Act,13 adding, among other provisions, a section that authorizes the Director of the Federal Bureau of Investigation (“FBI”)—or FBI agents designated by the Director—to petition the FISA Court for “an order requiring the production of any tangible things. . . 
for an investigation to obtain foreign intelligence information not concerning a United States person or to protect against international terrorism or clandestine intelligence activities.”14 Moreover, under Executive Order 12,333, when the NSA conducts intelligence-gathering activities abroad—which are not regulated by FISA15—it may collect, retain, or disseminate information about United States persons “only in accordance with procedures established by the head of the agency and approved by the Attorney General.”16 Despite its foreign-centric mission and the **express limits on its domestic authority**, the NSA has increasingly turned its attention to activities of persons within the United States in the wake of 9/11. For instance, in 2006, it was discovered that the NSA had created a call database in 2001 that collected tens of millions of citizens’ phone records from data provided by AT&T, Verizon, and BellSouth.17 “[T]he largest database ever assembled in the world” at the time, its goal was to log “every call ever made within the nation’s borders.”18 The NSA itself has acknowledged its serious obligation to operate effectively in an increasingly interconnected and globalized world without stepping on the toes of civil liberties for the sake of national security.19 Additionally, the NSA’s intrusions into domestic communications extend beyond call data to reach citizens’ activity on the Internet.20 For years, the NSA “unlawfully gathered tens of thousands of emails and other electronic communications between Americans” as part of the agency’s broader collection of communications as they “flow across Internet hubs” under Section 702 of FISA.21 Pursuant to these practices, the NSA may have intercepted as many as 56,000 domestic electronic communications through various methods,22 some of which the FISA Court has found unconstitutional.23 The disclosure of these NSA practices triggered a **substantial backlash**. 
Many Americans reacted by taking steps to insulate themselves from what they considered unwarranted government intrusion on their private lives and activities.24 Even though several crucial FISA Court rulings have been partially declassified and released to the public25 in an effort to demonstrate that the NSA’s powers are not unrestrained, public trust and confidence in the agency has clearly diminished.26 In the wake of these disclosures, forty-five percent of Americans felt that the government went too far in its surveillance programs pursuant to anti-terrorism efforts.27 This “massive swing” in public opinion about government policies embodies “the public reaction and apparent shock at the extent to which the government has gone in trying to prevent future terrorist incidents.”28 Coupled with the steps that many Internet users are taking to prevent government intrusion on their online activities and communication, this shift in public opinion shows that Americans are dissatisfied with the reach of government surveillance.29
 * **Legal prohibitions bar NSA domestic surveillance – but the lack of a credible oversight mechanism means the NSA actively captures US person data through Section 702 and EO 12,333**
 * **Healey, 14** - J.D. Candidate, The George Washington University Law School (Audra, “A Tale of Two Agencies: Exploring Oversight of the National Security Administration by the Federal Communications Commission”, FEDERAL COMMUNICATIONS LAW JOURNAL Vol. 67, December)

III. OVERSIGHT IS NEEDED, AND THE FCC SHOULD PROVIDE IT A. Existing executive and legislative oversight mechanisms are inadequate in promoting efficiency and public confidence in the NSA. The executive and legislative mechanisms currently in place to provide oversight of the NSA are inadequate in promoting public confidence and effective national security. Ostensibly, the activities of the NSA are generally governed by the Constitution, federal law, executive orders, and regulations of the Executive Branch.41 On the legislative side, there are two congressional bodies—the House Permanent Select Committee on Intelligence (“HPSCI”) and the Senate Select Committee on Intelligence (“SSCI”)—that are responsible for ensuring that the NSA follows the applicable laws and regulations.42 In the executive branch, NSA oversight is vested in the President’s Intelligence Advisory Board, the Office of the Director of National Intelligence, and the Department of Justice.43 In addition to these legislative and executive oversight mechanisms, the NSA has also implemented internal controls: the Office of the Inspector General performs audits and investigations while the Office of Compliance operates to ensure that the NSA follows relevant standards.44 However, despite the appearance of effective controls, these oversight mechanisms have failed to prevent the current **public crisis in confidence** that the NSA is fulfilling its mission with the least possible adverse impact on the privacy of U.S. citizens. The authority of the NSA, subject to the above controls, is very limited **on paper**. Every intelligence activity that the NSA undertakes is purportedly constrained to the purposes of foreign intelligence and counterintelligence.45 For instance, Executive Order 12,333 provides the authority for the NSA to engage in the “collection of communications by foreign persons that occur wholly outside the United States.”46 Additionally, FISA authorizes the NSA to compel U.S. 
telecommunications companies to assist the agency in targeting persons who are not U.S. citizens and are reasonably believed to be located outside the United States.47 However, **despite the appearances of controls**, both external and internal, the “communications of U.S. persons are sometimes incidentally acquired in targeting the foreign entities.”48 The varying types of data gathered can produce a “detailed map” of a given person’s life based on those persons with whom they are in contact.49 For instance, metadata can be used to piece together substantial information about relationships; this information includes who introduced two people, when they met, and their general communication patterns, as well as the nature and the extent of their relationships.50 The recently disclosed collection of contact lists by the NSA has not been authorized by Congress or FISA.51 Additionally, while other collection policies that touch upon domestic communications, such as those under Section 702, have authorization, often neither lawmakers nor the public have even a rough estimate of how many communications of U.S. citizens are being acquired.52 The NSA is easily able to operate around its apparent lack of authority. 
One anonymous official has been quoted as saying that the NSA consciously avoids the restrictions placed on it by FISA by collecting this information from access points all over the world.53 This method means that the NSA is not required to restrict itself to collecting contact lists belonging to specified intelligence targets.54 The collection mechanism ostensibly operates under the assumption that the bulk of the data collected through the overseas access points is not data from American citizens.55 However, this is not necessarily true due to the globalized nature of the Internet as a communications infrastructure, as “data crosses boundaries even when its American owners stay at home.”56 The oversight mechanisms currently applied to this collection program require the NSA only to satisfy its own internal oversight mechanisms or to answer possible inquiries from the executive branch that there is a “valid foreign intelligence target” in the data collected.57 Moreover, congressional oversight is not effective because members of Congress have candidly said they do not know precisely the right questions to ask NSA officials.58 Often, in congressional hearings, NSA officials and other senior members of the intelligence community are evasive unless directly pressed, and the congressional committees are stymied by their lack of knowledge regarding just which questions need asking.59 Given the realities of the NSA overstepping its authority, there is no indication to the public that the agency, even as it has been collecting data from American citizens, has been required to answer to its various oversight mechanisms in an effective manner. 
In response, President Obama directed the Privacy and Civil Liberties Oversight Board (“PCLOB”) to conduct two reports about NSA intelligence gathering methods.60 The PCLOB is an independent, bipartisan agency within the executive branch tasked with reviewing and analyzing executive branch actions taken in the name of national security to determine whether appropriate consideration has been afforded to civil liberties in the development and implementation of national anti-terrorism policy.61 The recent PCLOB Report emphasizes that there is a “compelling danger. . . that the personal information collected by the government will be misused to harass, blackmail, or intimidate, or to single out for scrutiny particular individuals or groups. . . . while the danger of abuse may seem remote, given historical abuse of personal information by the government during the twentieth century, the risk is more than merely theoretical.”62 The second report addressed more specifically the Internet surveillance activities of the NSA—those undertaken pursuant to Section 702.63 These reports demonstrate that there is a serious risk of abuse of the data collected by the NSA, as well as illustrating the failings of current governmental oversight of NSA data collection policies. Moreover, according to some classified intelligence documents released by The Washington Post and other outlets, the NSA appears to be overwhelmed by the sheer amount of data it has collected, which indicates that the mechanisms in place do not adequately help the NSA to focus its search. 
For instance, the NSA has begun to implement a program (SCISSORS) in order to focus on the portion of the data that is relevant amongst the mass of data collected.64 This is because the NSA was collecting broad swaths of data with “little or no [foreign intelligence] information.”65 The first PCLOB report indicates that the NSA metadata collection program does not pass any semblance of relevancy standards to target the data to a specific question of national security; this is because the NSA does not have reason to suspect the owners of the metadata, unlike in other cases where the collection was lawful.66 Thus, the current oversight system suffers from some serious failings. First, it does not allow for a focused inquiry by the congressional committees. Additionally, the NSA can get around requirements imposed on it by FISA by conducting Internet surveillance abroad that **nonetheless captures U.S. data flows**, many of which traverse foreign networks. Moreover, the NSA has over-collected data with little value to the agency’s national security mission, and therefore must sift through masses of data involving regular American citizens while fighting a public battle about how much information the agency collects.67 This all suggests deficiencies in the NSA’s oversight structure, as the preventive executive, legislative, and internal controls have all proven ineffective.
 * **Existing oversight mechanisms guarantee NSA circumvention of prohibitions on domestic surveillance – that creates a confidence crisis in the future of the internet**
 * **Healey, 14** - J.D. Candidate, The George Washington University Law School (Audra, “A Tale of Two Agencies: Exploring Oversight of the National Security Administration by the Federal Communications Commission”, FEDERAL COMMUNICATIONS LAW JOURNAL Vol. 67, December)

The effects of the NSA disclosures on the Internet Freedom agenda **go beyond the realm of Internet governance**. The loss of the **United States as a model** on Internet Freedom issues has made it harder for local civil society groups around the world—including the groups that the State Department’s Internet Freedom programs typically support203—to advocate for Internet Freedom within their own governments.204 The Committee to Protect Journalists, for example, reports that in Pakistan, “where freedom of expression is largely perceived as a Western notion, the Snowden revelations have had a damaging effect. The deeply polarized narrative has become starker as the corridors of power push back on attempts to curb government surveillance.”205 For some of these groups, in fact, even the appearance of collaboration with or support from the U.S. government can diminish credibility, making it harder for them to achieve local goals that align with U.S. foreign policy interests.206 The gap in trust is particularly significant for individuals and organizations that receive funding from the U.S. government for free expression activities or circumvention tools. Technology supported by or exported from the United States is, in some cases, inherently suspect due to the revelations about the NSA’s surveillance dragnet and the agency’s attempts to covertly influence product development. Moreover, revelations of what the NSA has been doing in the past decade are eroding the moral high ground that the United States has often relied upon when putting public pressure on authoritarian countries like China, Russia, and Iran to change their behavior. In 2014, Reporters Without Borders added the United States to its “Enemies of the Internet” list for the first time, explicitly linking the inclusion to NSA surveillance. 
“The main player in [the United States’] vast surveillance operation is the highly secretive National Security Agency (NSA) which, in the light of Snowden’s revelations, has come to symbolize the abuses by the world’s intelligence agencies,” noted the 2014 report.207 The damaged perception of the United States208 as a leader on Internet Freedom and its diminished ability to legitimately criticize other countries for censorship and surveillance open the door for foreign leaders to justify—and even expand—their own efforts.209 For example, the Egyptian government recently announced plans to monitor social media for potential terrorist activity, prompting backlash from a number of advocates for free expression and privacy.210 When a spokesman for the Egyptian Interior Ministry, Abdel Fatah Uthman, appeared on television to explain the policy, one justification that he offered in response to privacy concerns was that “the US listens in to phone calls, and supervises anyone who could threaten its national security.”211 This type of rhetoric makes it difficult for the U.S. to effectively criticize such a policy. Similarly, India’s comparatively mild response to allegations of NSA surveillance has been seen by some critics “as a reflection of India’s own aspirations in the world of surveillance,” a further indication that U.S. spying may now make it easier for foreign governments to quietly defend their own behavior.212 It is even more difficult for the United States to credibly indict Chinese hackers for breaking into U.S. government and commercial targets without fear of retribution in light of the NSA revelations.213 These challenges reflect an overall decline in U.S. soft power on free expression issues.
 * **The perception of surveillance overreach wrecks the US internet freedom agenda**
 * **Kehl, 14** – Policy Analyst at New America’s Open Technology Institute (Danielle, “Surveillance Costs: The NSA’s Impact on the Economy, Internet Freedom & Cybersecurity”, July)

The 2013 revelations of mass surveillance by the U.S. government transformed the global debate about Internet freedom. Where once Washington routinely chided foreign governments and their corporate collaborators for engaging in online censorship, monitoring and other forms of Internet repression, the tables have turned. Edward Snowden, a former National Security Agency (NSA) contractor, leaked thousands of documents revealing America’s most secret electronic surveillance programs, unleashing a tidal wave of criticism and charges of hypocrisy, many directed at some of the very U.S. officials who have championed online freedom. America’s Internet freedom agenda – the effort to preserve and extend the free flow of information online – **hangs in the balance**.1 Already a contested space, the Internet after the Snowden revelations has become even more politically charged, with deep international divisions about its governance and heated battles over its use as a tool of political change. With 2.8 billion Internet users today, and several billion more expected over the next decade, the contest over online freedom grows more important by the day.2 As an ever-greater proportion of human activity is mediated through Internet-based technologies, the extent of online rights and restrictions takes on an increasingly vital role in political, economic and social life.3 Despite the many complications arising from the Snowden disclosures, America still needs a comprehensive Internet freedom strategy, one that tilts the balance in favor of those who would use the Internet to advance tolerance and free expression, and away from those who would employ it for repression or violence.4 It will need to pursue this strategy while drawing a **sharp distinction** between surveillance for national security purposes (in which all governments engage) and monitoring as a means of political repression (which democracies oppose). This is not an easy task, but it is an important one. 
More than a year after the first Snowden revelations emerged, now is the time to reenergize the Internet freedom agenda. Internet Freedom before Snowden The U.S. government’s explicit pursuit of Internet freedom began during the Bush administration’s second term. Among other steps, the establishment of the State Department’s Global Internet Freedom Task Force aimed to coordinate efforts to promote Internet freedom and to respond to online censorship.5 Building on this foundation, Secretary of State Hillary Rodham Clinton made the expansion of online rights a major focus of U.S. foreign policy in the first Obama term. Speaking in 2010, she cited Franklin Delano Roosevelt’s Four Freedoms and added a fifth, the “freedom to connect – the idea that governments should not prevent people from connecting to the Internet, to websites or to each other.”6 A year later, she pledged America’s “global commitment to Internet freedom, to protect human rights” – including the rights to expression, assembly and association – “online as we do offline.”7 And after the Arab Spring, the United States in 2011 established the Freedom Online Coalition, a collaboration of 23 countries to coordinate efforts to expand global Internet freedom.8 The U.S. government has backed up its words with resources. Since 2009, the State Department and other government agencies have spent more than $125 million on Internet freedom programming.9 In addition to the State Department’s efforts, other government agencies, including the Broadcasting Board of Governors, the U.S. Agency for International Development, the Defense Advanced Research Projects Agency and others, fund the development and deployment of tools aimed at expanding Internet freedom. 
These programs invest in technologies that allow users to circumvent firewalls so as to access censored material, communicate outside the watchful eye of autocratic regimes, secure their websites and data, link computers in decentralized mesh networks, and establish new Internet connections when existing ones have been cut.10 The government supplements the provision of technology with training programs in dozens of countries. The Obama administration also took regulatory steps to promote Internet freedom, particularly after technology demonstrably facilitated the 2009 Green Revolution in Iran and the 2011 Arab Spring. The Treasury Department relaxed restrictions on the export of Internet-related software and services to Iran, explicitly to “foster and support the free flow of information to individual Iranian citizens.”11 Two years later, the White House issued an executive order that imposed sanctions on individuals who engaged in computer and network disruption, monitoring and tracking on behalf of the governments of Iran or Syria.12 The United States has aimed to promote the free flow of online information through diplomatic action as well. State Department diplomats pressure repressive regimes to loosen their Internet restrictions, free imprisoned bloggers and ensure that citizens can express themselves online without fear of punishment. U.S. government officials have engaged in significant dialogue with U.S. and multinational technology companies about their involvement in aiding Internet repression and in establishing transparency standards. American diplomats have also pressed for Internet freedom in the proliferating international fora that have taken up the issue. In 2012, for instance, the United States won approval of a U.N. Human Rights Council resolution affirming that freedom of expression and other rights that people have offline must also be protected online.13 Trade agreements have provided yet another vehicle for the U.S. 
Internet freedom agenda with, for example, hortatory language in the U.S.-Korea Free Trade Agreement calling for the free flow of online information.14 A key element of U.S. action has been aimed at preventing fundamental changes to the multistakeholder model of Internet governance, which brings together individuals, governments, civil society organizations, private firms and others for transparent and consensus-based decisionmaking.15 One such challenge arose at the December 2012 World Conference on International Telecommunications, when 89 countries – a majority of ITU members in attendance – supported an attempt by Russia, China, Iran and others to give governments greater control over the Internet.16 Despite opposition from the United States and others, the session ended with 89 countries signing the revised treaty; 55 other countries did not. As a sign of what may come in future international treaty negotiations, such numbers did not favor the multistakeholder model, and this was so even before the Snowden revelations emerged to complicate U.S. efforts. The Snowden Fallout and the Internet Freedom Agenda The dramatic revelations about NSA spying that began to emerge in June 2013 provoked a storm of international reaction.17 Political leaders expressed outrage at American surveillance practices and threatened a raft of retaliatory measures. President Dilma Rousseff of Brazil cancelled a planned state visit to the United States and the Brazilian government later organized an international meeting (NetMundial) to discuss the future of Internet governance.18 German Chancellor Angela Merkel was deeply affronted by the alleged monitoring of her personal cellphone. Chinese and other officials charged America with blatant hypocrisy. The fallout affected the private sector as well; where previously the focus of many observers had been on the aid given by U.S. 
companies to foreign governments engaged in Internet repression, the gaze shifted to the role American corporations play – wittingly or not – in enabling U.S. surveillance. Countries that had been the target of American reproaches rebuked the U.S. government for what they saw as hypocrisy. The United Nations and other international venues became platforms for international criticism of the United States. Germany and Brazil together sponsored a resolution adopted by the U.N. General Assembly in late 2013 backing a “right to privacy” in the digital age.19 In June 2014, the U.N. High Commissioner for Human Rights issued a report that endorsed digital privacy as a human right and criticized mass surveillance as “a dangerous habit rather than an exceptional measure.”20 Some European officials began to question the existing Internet governance model itself. In a statement, the European Commission said, “Recent revelations of large-scale surveillance have called into question the stewardship of the US when it comes to Internet Governance. So given the US-centric model of Internet Governance currently in place, it is necessary to broker a smooth transition to a more global model.”21 Nongovernmental groups that might otherwise be partners with the U.S. government in promoting Internet freedom reacted sharply as well. Reporters Without Borders, for instance, listed the NSA as an “Enemy of the Internet” in its 2014 report on entities engaged in online repression. 
Drawing no distinction between surveillance aimed at protecting national security and surveillance intended to suppress free expression and political dissent, the organization declared the NSA “no better than [its] Chinese, Russian, Iranian or Bahraini counterparts.”22 Mass surveillance methods used by democracies like the United States, it added, are “all the more intolerable” as they “are already being used by authoritarian countries such as Iran, China, Turkmenistan, Saudi Arabia and Bahrain to justify their own violations of freedom of information.”23 Tim Berners-Lee, the inventor of the World Wide Web, said, “Mass surveillance is the most immediate threat to the open Internet and the most insidious because we can’t see it.”24 The Electronic Frontier Foundation asserted that “mass surveillance is inherently a disproportionate measure that violates human rights,”25 and officials with Human Rights Watch observed that the surveillance scandal would render it more difficult for the U.S. government to press for better corporate practices and for companies to resist overly broad surveillance mandates. “Now,” its chief researcher said, “the vision and credibility of the U.S. and its allies on Internet freedom is in tatters.”26 The reactions to the Snowden disclosures threatened to go beyond verbal denunciations, diplomatic protests and critical press. The most serious commercial fallout came in the **rising support for data localization requirements**. 
Russia in July 2014 approved legislation that requires data operators to store the personal data of its citizens within the country’s borders.27 Indonesia, Brazil and Vietnam have also called for their citizens’ data held by companies such as Facebook to be stored domestically.28 Data localization has been debated in the European Parliament and elsewhere on the continent as well.29 Apart from the chilling effect on innovation and the loss of business to American companies, Internet freedom itself could become a casualty of such mandates. If a user’s data must be held within the borders of a repressive country, its government will have new opportunities to censor, monitor and disrupt online information flows. Such moves, combined with increasing questions about the multistakeholder approach to Internet governance (and possible support for a government-driven approach), together give rise to concerns about the potential “Balkanization,” or fragmentation, of the Internet, in which a constellation of national-level systems could take the place of the current global online infrastructure. As former NSA general counsel Stewart Baker warned, “The Snowden disclosures are being used to renationalize the Internet and roll back changes that have weakened government control of information.”30 This is evident in other proposed steps as well. Brazil and the European Union have announced plans for an undersea cable that would route data transmissions directly between Europe and Latin America and bypass the United States.31 The European Union threatened to suspend the Safe Harbor data-sharing agreement with the United States and promulgated new rules for it that EU officials said stemmed directly from worries after the Snowden disclosures.32
 * **The US needs to draw a sharp distinction between domestic and national security surveillance to make the US Internet Freedom agenda credible – otherwise global internet fragmentation will result**
 * **Fontaine, 14** – President of the Center for a New American Security; was foreign policy advisor to Senator John McCain for more than five years; worked at the State Department, the National Security Council and the Senate Foreign Relations Committee; was associate director for Near Eastern affairs at the National Security Council; B.A. in International Relations from Tulane University (Richard, “Bringing Liberty Online: Reenergizing the Internet Freedom Agenda in a Post-Snowden Era”, Center for a New American Security, September 18, 2014)
 * edited for language
 * **Data localization will destroy global economic growth**
 * **Chandler and Le, 15** – Director, California International Law Center, Professor of Law and Martin Luther King, Jr. Hall Research Scholar, University of California, Davis; A.B., Harvard College; J.D., Yale Law School AND Free Speech and Technology Fellow, California International Law Center; A.B., Yale College; J.D., University of California, Davis School of Law (Anupam and Uyen, “DATA NATIONALISM”, 64 Emory L.J. 677, lexis)
C. Economic Development. Many governments believe that by forcing companies to localize data within national borders, they will increase investment at home. Thus, data localization measures are often motivated, whether explicitly or not, by desires to promote local economic development. In fact, however, data localization raises costs for local businesses, reduces access to global services for consumers, hampers local start-ups, and interferes with the use of the latest technological advances. In an Information Age, the global flow of data has become the **lifeblood of economies** across the world. While some in Europe have raised concerns about the transfer of data abroad, the European Commission has recognized "the critical importance of data flows notably for the transatlantic economy." n209 The Commission observes that international data transfers "form an integral part of commercial exchanges across the Atlantic including for new growing digital businesses, such as social media or cloud computing, with large amounts of data going from the EU to the US." n210 Worried about the effect of constraints on data flows on both global information sharing and economic development, the Organisation for Economic Co-operation and Development (OECD) has urged nations to avoid "barriers to the location, access and use of cross-border [*722] data facilities and functions" when consistent with other fundamental rights, in order to "ensure cost effectiveness and other efficiencies." n211 The worry about the impact of data localization is widely shared in the business community as well. The value of the Internet to national economies has been widely noted. n212 Regarding Brazil's attempt to require data localization, the Information Technology Industry Council, an industry association representing more than forty major Internet companies, had argued that "in-country data storage requirements would detrimentally impact all economic activity that depends on data flows." 
n213 The Swedish government agency, the National Board of Trade, recently interviewed fifteen local companies of various sizes across sectors and concluded succinctly that " trade cannot happen without data being moved from one location to another ." n214 Data localization, like most protectionist measures, leads only to small gains for a few local enterprises and workers, while causing significant harms spread **across the entire economy**. The domestic benefits of data localization go to the few owners and employees of data centers and the few companies servicing these centers locally. Meanwhile, the harms of data localization are widespread, felt by small, medium, and large businesses that are denied access to global services that might improve productivity . In response to Russia's recently passed localization law, the NGO Russian Association for Electronic Communications stressed the potential economic consequences, pointing to the withdrawal of global services and substantial economic losses caused by the passing of similar laws in other countries. n215 For example, besides the loss of international social media platforms, localization would make it impossible for [*723] Russians to order airline tickets or consumer goods through online services. Localization requirements also seriously affect Russian companies like Aeroflot because the airline depends on foreign ticket-booking systems. n216 Critics worried, at the time, that the Brazilian data localization requirement would "deny[] Brazilian users access to great services that are provided by US and other international companies." n217 Marilia Marciel, a digital policy expert at Fundacao Getulio Vargas in Rio de Janeiro, observes, "Even Brazilian companies prefer to host their data outside of Brazil." n218 Data localization affects domestic innovation by denying entrepreneurs the ability to build on top of global services based abroad . 
Brasscom, the Brazilian Association of Information Technology and Communication Companies, argues that such obligations would "hurt[] the country's ability to create, innovate, create jobs and collect taxes from the proper use of the Internet." n219 Governments implementing in-country data mandates imagine that the various global services used in their country will now build infrastructure locally. Many services, however, will find it uneconomical and even too risky to establish local servers in certain territories. n220 Data centers are expensive, all the more so if they have the highest levels of security. One study finds Brazil to be the most expensive country in the Western hemisphere in which to build data centers. n221 Building a data center in Brazil costs $ 60.9 million on average, [*724] while building one in Chile and the United States costs $ 51.2 million and $ 43 million, respectively. n222 Operating such a data center remains expensive because of enormous energy and other expenses - averaging $ 950,000 in Brazil, $ 710,000 in Chile, and $ 510,000 in the United States each month. n223 This cost discrepancy is mostly due to high electricity costs and heavy import taxes on the equipment needed for the center. n224 Data centers employ few workers, with energy making up three-quarters of the costs of operations. n225 According to the 2013 Data Centre Risk Index - a study of thirty countries on the risks affecting successful data center operations - Australia, Russia, China, Indonesia, India, and Brazil are among the riskiest countries for running data centers. n226 Not only are there significant economic costs to data localization, the potential gains are more limited than governments imagine. Data server farms are hardly significant generators of employment, populated instead by thousands of computers and few human beings. The significant initial outlay they require is largely in capital goods, the bulk of which is often imported into a country. 
The diesel generators, cooling systems, servers, and power supply devices tend to be imported from global suppliers. n227 Ironically, it is often American suppliers of servers and other hardware that stand to be the beneficiaries of data localization mandates. n228 One study notes, "Brazilian suppliers of components did not benefit from this [data localization requirement], since the imported products dominate the market." n229 By increasing capital purchases from abroad, data localization requirements can in fact increase merchandise trade deficits . Furthermore, large data farms are [*725] enormous consumers of energy, n230 and thus often further** burden overtaxed energy grids**. They thereby harm other industries that must now compete for this energy, paying higher prices while potentially suffering limitations in supply of already scarce power . Cost, as well as access to the latest innovations, drives many e-commerce enterprises in Indonesia to use foreign data centers. Daniel Tumiwa, head of the Indonesian E-Commerce Association (IdEA), states that "the cost can double easily in Indonesia." n231 Indonesia's Internet start-ups have accordingly often turned to foreign countries such as Australia, Singapore, or the United States to host their services. One report suggests that "many of the "tools' that start-up online media have relied on elsewhere are not fully available yet in Indonesia." n232 The same report also suggests that a weak local hosting infrastructure in Indonesia means that sites hosted locally experience delayed loading time. n233 Similarly, as the Vietnamese government attempts to foster entrepreneurship and innovation, n234 localization requirements effectively bar start-ups from utilizing cheap and powerful platforms abroad and potentially handicap Vietnam from "joining in the technology race." n235 Governments worried about transferring data abroad at the same time hope, somewhat contradictorily, to bring foreign data within their borders. 
Many countries seek to become leaders in providing data centers for companies operating across their regions. In 2010, Malaysia announced its Economic Transformation Program n236 to transform Malaysia into a world-class data [*726] center hub for the Asia-Pacific region. n237 Brazil hopes to accomplish the same for Latin America, while France seeks to stimulate its economy via a "Made in France" digital industry. n238 Instead of spurring local investment, data localization can lead to the loss of investment. First, there's the retaliation effect. Would countries send data to Brazil if Brazil declares that data is unsafe if sent abroad? Brasscom notes that the Brazilian Internet industry's growth would be hampered if other countries engage in similar reactive policies, which "can stimulate the migration of datacenters based here, or at least part of them, to other countries." n239 Some in the European Union sympathize with this concern. European Commissioner for the Digital Agenda, Neelie Kroes, has expressed similar doubts, worrying about the results for European global competitiveness if each country has its own separate Internet. n240 Then there's the avoidance effect. Rio de Janeiro State University Law Professor Ronaldo Lemos, who helped write the original Marco Civil and is currently Director of the Rio Institute for Technology and Society, warns that the localization provision would have caused foreign companies to avoid the country altogether: "It could end up having the opposite effect to what is intended, and scare away companies that want to do business in Brazil." n241 Indeed, such burdensome local laws often lead companies to launch overseas, in order to try to avoid these rules entirely. Foreign companies, too, might well steer clear of the country in order to avoid entanglement with cumbersome rules. For example, Yahoo!, while very popular in Vietnam, places its servers for the [*727] country in Singapore. 
n242 In these ways we see that data localization mandates can backfire entirely, leading to avoidance instead of investment. Data localization requirements place burdens on domestic enterprises not faced by those operating in more liberal jurisdictions. Countries that require data to be cordoned off complicate matters for their own enterprises, which must turn to domestic services if they are to comply with the law. Such companies must also develop mechanisms to segregate the data they hold by the nationality of the data subject. The limitations may impede development of new, global services. Critics argue that South Korea's ban on the export of mapping data, for example, impedes the development of next-generation services in Korea: Technology services, such as Google Glass, driverless cars, and information programs for visually-impaired users, are unlikely to develop and grow in Korea. Laws made in the 1960s are preventing many venture enterprises from advancing to foreign markets via location/navigation services. n243 The harms of data localization for local businesses are not restricted to Internet enterprises or to consumers denied access to global services. As it turns out, most of the economic benefits from Internet technologies accrue to traditional businesses. A McKinsey study estimates that about seventy-five percent of the value added created by the Internet and data flow is in traditional industries, in part through increases in productivity . n244 The potential economic impact across the major sectors - healthcare, manufacturing, electricity, urban infrastructure, security, agriculture, retail, etc. - is estimated at $ 2.7 to $ 6.2 **trillion** per year. n245 This is particularly important for emerging economies, in which traditional industries remain predominant. The Internet raises profits as well, due to increased revenues, lower costs of goods sold, and lower administrative costs. 
n246 With data localization mandates, traditional businesses [*728] will lose access to the many global services that would store or process information offshore. Data localization requirements also interfere with the most important trends in computing today. They limit access to the disruptive technologies of the future, such as cloud computing, the "Internet of Things," and data-driven innovations (especially those relying on "big data"). Data localization sacrifices the innovations made possible by building on top of global Internet platforms based on cloud computing. This is particularly important for entrepreneurs operating in emerging economies that might lack the infrastructure already developed elsewhere. And it places great impediments to the development of both the Internet of Things and big data analytics, requiring costly separation of data by political boundaries and often denying the possibility of aggregating data across borders. We discuss the impacts on these trends below.**

Economic decline causes war and miscalculation Royal 10**— Jedidiah Royal, Director of Cooperative Threat Reduction at the U.S. Department of Defense, M.Phil. Candidate at the University of New South Wales, 2010 (“Economic Integration, Economic Signalling and the Problem of Economic Crises,” //Economics of War and Peace: Economic, Legal and Political Perspectives//, Edited by Ben Goldsmith and Jurgen Brauer, Published by Emerald Group Publishing, ISBN 0857240048, p. 213-215)**
 * Less intuitive is how periods of economic decline may increase the likelihood of external conflict . Political science literature has contributed a moderate degree of attention to the impact of economic decline and the security and defence behaviour of interdependent states. Research in this vein has been considered at systemic, dyadic and national levels. Several notable contributions follow. ¶ First, on the systemic level, Pollins (2008) advances Modelski and Thompson's (1996) work on leadership cycle theory, finding that rhythms in the global economy are associated with the rise and fall of a pre-eminent power and the often bloody transition from one pre-eminent leader to the next. As such, exogenous shocks such as economic crises could usher in a redistribution of relative power (see also Gilpin, 1981) that leads to uncertainty about power balances, increasing the risk of miscalculation (Feaver, 1995). Alternatively, even a relatively certain redistribution of power could lead to a permissive environment for conflict as a rising power may seek to challenge a declining power (Werner, 1999). Separately, Pollins (1996) also shows that global economic cycles combined with parallel leadership cycles impact the likelihood of conflict among major, medium and small powers, although he suggests that the causes and connections between global economic conditions and security conditions remain unknown. ¶ Second, on a dyadic level, Copeland's (1996, 2000) theory of trade expectations suggests that 'future expectation of trade' is a significant variable in understanding economic conditions and security behaviour of states. He argues that interdependent states are likely to gain pacific benefits from trade so long as they have an optimistic view of future trade relations. 
However, if the expectations of future trade decline, particularly for difficult [end page 213] to replace items such as energy resources, the likelihood for conflict increases , as states will be inclined to use force to gain access to those resources. Crises could potentially be the trigger for decreased trade expectations either on its own or because it triggers protectionist moves by interdependent states.4 ¶ Third, others have considered the link between economic decline and external armed conflict at a national level. Blomberg and Hess (2002) find a strong correlation between internal conflict and external conflict, particularly during periods of economic downturn. They write, ¶ The linkages between internal and external conflict and prosperity are strong and mutually reinforcing. Economic conflict tends to spawn internal conflict , which in turn returns the favour. Moreover, the presence of a recession tends to amplify the extent to which international and external conflicts self-reinforce each other. (Blomberg & Hess, 2002, p. 89) ¶ Economic decline has also been linked with an increase in the likelihood of terrorism (Blomberg, Hess, & Weerapana, 2004), which has the capacity to spill across borders and lead to external tensions. ¶ Furthermore, crises generally reduce the popularity of a sitting government. “Diversionary theory" suggests that, when facing unpopularity arising from economic decline, sitting governments have increased incentives to fabricate external military conflicts to create a 'rally around the flag' effect. Wang (1996), DeRouen (1995), and Blomberg, Hess, and Thacker (2006) find supporting evidence showing that economic decline and use of force are at least indirectly correlated. 
Gelpi (1997), Miller (1999), and Kisangani and Pickering (2009) suggest that the tendency towards diversionary tactics are greater for democratic states than autocratic states, due to the fact that democratic leaders are generally more susceptible to being removed from office due to lack of domestic support. DeRouen (2000) has provided evidence showing that periods of weak economic performance in the United States, and thus weak Presidential popularity, are statistically linked to an increase in the use of force. ¶ In summary, recent economic scholarship positively correlates economic integration with an increase in the frequency of economic crises, whereas political science scholarship links economic decline with external conflict at systemic, dyadic and national levels.5 This implied connection between integration, crises and armed conflict has not featured prominently in the economic-security debate and deserves more attention. ¶ This observation is not contradictory to other perspectives that link economic interdependence with a decrease in the likelihood of external conflict, such as those mentioned in the first paragraph of this chapter. [end page 214] Those studies tend to focus on dyadic interdependence instead of global interdependence and do not specifically consider the occurrence of and conditions created by economic crises. As such, the view presented here should be considered ancillary to those views.**

Growth is a controlling impact – eliminates the rational incentives for war Gartzke 11 **(Erik, Associate Professor of Political Science – University of California-San Diego, Ph.D. – University of Iowa, B.A. – University of California-San Francisco, “SECURITY IN AN INSECURE WORLD”, Cato Institute, 2-9, www.cato-unbound.org/2011/02/09/erik-gartzke/security-in-an-insecure-world/)**
 * Almost as informative as the decline in warfare has been where this decline is occurring. Traditionally, nations were constrained by opportunity. Most nations did not fight most others because they could not physically do so. Powerful nations, in contrast, tended to fight more often, and particularly to fight with other powerful states. Modern “ zones of peace” are dominated by powerful, militarily capable countries. These countries could fight each other, but are not inclined to do so . At the same time, weaker developing nations that continue to exercise force in traditional ways are incapable of projecting power against the developed world, with the exception of unconventional methods, such as terrorism. The world is thus divided between those who could use force but prefer not to (at least not against each other) and those who would be willing to fight but lack the material means to fight far from home. Warfare in the modern world has thus become an activity involving weak (usually neighboring) nations, with intervention by powerful (geographically distant) states in a policing capacity. So, the riddle of peace boils down to why capable nations are not fighting each other. There are several explanations, as Mack has pointed out. The easiest, and I think the best, explanation has to do with an absence of motive . Modern states find little incentive to bicker over tangible property, since armies are expensive and the goods that can be looted are no longer of considerable value . Ironically, this is exactly the explanation that Norman Angell famously supplied before the World Wars. Yet, today the evidence is abundant that the most prosperous , capable nations prefer to buy rather than take . Decolonization, for example, divested European powers of territories that were increasingly expensive to administer and which contained tangible assets of limited value. 
Of comparable importance is the move to substantial consensus among powerful nations about how international affairs should be conducted. The great rivalries of the twentieth century were ideological rather than territorial . These have been substantially resolved, as Francis Fukuyama has pointed out. The fact that remaining differences are moderate, while the benefits of acting in concert are large (due to economic interdependence in particular) means that nations prefer to deliberate rather than fight . Differences remain, but for the most part the capable countries of the world have been in consensus, while the disgruntled developing world is incapable of acting on respective nations’ dissatisfaction. While this version of events explains the partial peace bestowed on the developed world, it also poses challenges in terms of the future. The rising nations of Asia in particular have not been equal beneficiaries in the world political system. These nations have benefited from economic integration, and this has proved sufficient in the past to pacify them . The question for the future is whether the benefits of tangible resources through markets are sufficient to compensate the rising powers for their lack of influence in the policy sphere. The danger is that established powers may be slow to accommodate or give way to the demands of rising powers from Asia and elsewhere, leading to divisions over the intangible domain of policy and politics. Optimists argue that at the same time that these nations are rising in power, their domestic situations are evolving in a way that makes their interests more similar to the West. Consumerism, democracy, and a market orientation all help to draw the rising powers in as fellow travelers in an expanding zone of peace among the developed nations. 
Pessimists argue instead that capabilities among the rising powers are growing faster than their affinity for western values, or even that fundamental differences exist among the interests of first- and second-wave powers that cannot be bridged by the presence of market mechanisms or McDonald’s restaurants. If the peace observed among western, developed nations is to prove durable , it must be because** warfare proves futile as nations transition to prosperity**. Whether this will happen depends on the rate of change in interests and capabilities, a difficult thing to judge. We must hope that the optimistic view is correct, that what ended war in Europe can be exported globally. Prosperity has made war expensive , while the fruits of conflict, both in terms of tangible and intangible spoils have declined in value. These forces are not guaranteed to prevail indefinitely. Already, research on robotic warfare promises to lower the cost of conquest. If in addition, fundamental differences among capable communities arise, then warfare over ideology or policy can also be resurrected. We must all hope that the consolidating forces of prosperity prevail, that war becomes a durable anachronism.**

FCC 1ac – Tech Leadership

Contention 2 – Tech Leadership

NSA surveillance is crushing U.S. cloud-computing – decks competitiveness and spills over to the entire tech sector Donohue 15 – Professor of Law, Georgetown Law and Director, Center on National Security and the Law, Georgetown Law (Lauren, HIGH TECHNOLOGY, CONSUMER PRIVACY, AND U.S. NATIONAL SECURITY, Symposium Articles, 4 Am. U. Bus. L. Rev. 11 p.15-18, 2015, Hein Online)//JJ// //I. ECONOMIC IMPACT OF NSA PROGRAMS The NSA programs, and public awareness of them, have had an immediate and detrimental impact on the U.S. economy. They have cost U.S. companies billions of dollars in lost sales, even as companies have seen their market shares decline. American multinational corporations have had to develop new products and programs to offset the revelations and to build consumer confidence. At the same time, foreign entities have seen revenues increase. Beyond the immediate impact, the revelation of the programs, and the extent to which the NSA has penetrated foreign data flows, has undermined U.S. trade agreement negotiations. It has spurred data localization efforts around the world, and it has raised the spectre of the future role of the United States in Internet governance. Even if opportunistic, these shifts signal an immediate and long-term impact of the NSA programs, and public knowledge about them, on the U.S. economy. A. Lost Revenues and Declining Market Share Billions of dollars are on the line because of worldwide concern that the services provided by U.S. information technology companies are neither secure nor private. Perhaps nowhere is this more apparent than in cloud computing. Previously, approximately 50% of the worldwide cloud computing revenues derived from the United States. The domestic market thrived: between 2008 and 2014, it more than tripled in value. But within weeks of the Snowden leaks, reports had emerged that U.S. companies such as Dropbox, Amazon Web Services, and Microsoft's Azure were losing business. 
By December 2013, ten percent of the Cloud Security Alliance had cancelled U.S. cloud services projects as a result of the Snowden information. In January 2014 a survey of Canadian and British businesses found that one quarter of the respondents were **moving their data outside the United States**. The Information Technology and Innovation Foundation estimates that declining revenues of corporations that focus on cloud computing and data storage alone could reach $35 billion over the next three years. Other commentators, such as Forrester Research analyst James Staten, have put **actual losses** as high as **$180 billion** by 2016, unless something is done to restore confidence in data held by U.S. companies. The monetary impact of the NSA programs extends **beyond cloud computing** to the **high technology industry**. Cisco, Qualcomm, IBM, Microsoft, and Hewlett-Packard have all reported declining sales as a direct result of the NSA programs. Servint, a webhosting company based in Virginia, reported in June 2014 that its international clients had **dropped by 50%** since the leaks began. Also in June, the German government announced that because of Verizon's complicity in the NSA program, it would end its contract with the company, which had previously provided services to a number of government departments. As a senior analyst at the Information Technology and Innovation Foundation explained, "It's clear to every single tech company that this is affecting their bottom line." The European commissioner for digital affairs, Neelie Kroes, predicts that the fallout for U.S. businesses in the EU alone will amount to billions of Euros. Not only are U.S. companies losing customers, but they have been forced to spend billions to add encryption features to their services. 
IBM has invested more than a billion dollars to build data centers in London, Hong Kong, Sydney, and elsewhere, in an effort to reassure consumers outside the United States that their information is protected from U.S. government surveillance.26 Salesforce.com made a similar announcement in March 2014.27 Google moved to encrypt terms entered into its browser.28 In June 2014 it took the additional step of releasing the source code for End-to-End, its newly-developed browser plugin that allows users to encrypt email prior to it being sent across the Internet.29 The following month Microsoft announced Transport Layer Security for inbound and outbound email, and Perfect Forward Secrecy encryption for access to OneDrive.30 Together with the establishment of a Transparency Center, where foreign governments could review source code to assure themselves of the integrity of Microsoft software, the company sought to put an end to both NSA back door surveillance and doubt about the integrity of Microsoft products.31 Foreign technology companies, in turn, are seeing **revenues increase**. Runbox, for instance, an email service based in Norway and a direct competitor to Gmail and Yahoo, almost immediately made it publicly clear that it does not comply with foreign court requests for its customers' personal information. Its customer base increased 34% in the aftermath of the Snowden leaks. Mateo Meier, CEO of Artmotion, Switzerland's biggest offshore data hosting company, reported that within the first month of the leaks, the company saw a 45% rise in revenue. Because Switzerland is not a member of the EU, the only way to access data in a Swiss data center is through an official court order demonstrating guilt or liability; there are no exceptions for the United States. In April 2014, Brazil and the EU, which previously used U.S. 
firms to supply undersea cables for transoceanic communications, decided to build their own cables between Brazil and Portugal, using Spanish and Brazilian companies in the process. 36 OpenText, Canada's largest software company, now guarantees customers that their data remains outside the United States. Deutsche Telekom, a cloud computing provider, is similarly gaining more customers. Numerous **foreign companies** are marketing their **products as "NSA proof"** or "safer alternatives" to those offered by U.S. firms, gaining market share in the process.**//

//The best and newest research confirms the link// //**Marthews and Tucker, 15** – National Chair at Restore the Fourth AND PhD in economics and professor of Marketing at MIT (Alex and Catherine, “Government Surveillance and Internet Search Behavior”, 29 April 2015)// // This study is the first to provide **substantial empirical documentation** of a chilling effect, both domestically in the shorter term and internationally in the longer term, that appears to be related to increased awareness of government surveillance online . Furthermore, this chilling effect appears in countries other than the US to apply to search behavior that is not strictly related to the government but instead forms part of the private domain. Our findings have the following policy implications. From an economic perspective, our finding that there was an effect on international Google users’ browsing behavior has potential policy implications for the effects of government surveillance on international commerce. **From a US competitive standpoint, the longer-run effect observed on international Google users’ search behavior indicates that knowledge of US government surveillance of Google could indeed affect their behavior**. At the most limited end of the spectrum, it could steer them away from conducting certain searches on US search engines; at the most severe end of the spectrum, they might choose to use non-US search engines. Such effects may not be limited simply to search engines. 
For example, as Google’s services are embedded in a large array of products, it could potentially hinder sales of Android-enabled mobile phones. Though preliminary attempts are being made to work towards initial measures of the economic impact of surveillance revelations (Dinev et al., 2008), no systematic study yet exists. All we can do, within the context of our data, is to indicate that on the basis of the effects we find, the strong possibility of **substantial economic effects** exists, and to suggest that such potential adverse economic impacts should be incorporated into the thinking of policy makers regarding the appropriateness of mass surveillance programs. There are limitations to the generalizability of our findings. First, we are not sure how the results generalize outside of the search domain towards important tech industries such as the rapidly growing US cloud computing industry. Second, we are not sure how the revelations affected search on Google’s major competitors, such as Bing and Yahoo! Search. It may be that the effect on their services was lessened by reduced media focus on them relative to Google in the light of the PRISM revelations and potentially the extent to which users anticipated that their servers may be located outside of the US. Third, our results are focused on the effects of revelations about government surveillance as opposed to the direct effects of government surveillance per se. Notwithstanding these limitations, we believe that our study provides an important first step in understanding the potential for effects of government surveillance practices on commercial outcomes and international competitiveness. //

//**That undermines US global technological leadership**// // **Castro and McQuinn 15**, Daniel Castro works at the Center for Data Innovation, Government Technology, The Information Technology & Innovation Foundation, worked at the U.S. Government Accountability Office, went to Carnegie Mellon. Alan McQuinn works at the Federal Communications Commission, previously had the Bill Archer Fellowship at the University of Texas, (June 2015, “Beyond the USA Freedom Act: How U.S. Surveillance Still Subverts U.S. Competitiveness”, // // CONCLUSION When historians write about this period in U.S. history it could very well be that one of the themes will be how the United States lost its **global technology leadership** to other nations . And clearly one of the factors they would point to is the long-standing privileging of U.S. national security interests over U.S. industrial and commercial interests when it comes to U.S. foreign policy . This has occurred over the last few years as the U.S. government has done relatively little to address the rising commercial challenge to U.S. technology companies, all the while putting intelligence gathering first and foremost. Indeed, policy decisions by the U.S. intelligence community have **reverberated throughout the global economy**. If the U.S. tech industry is to remain the leader in the global marketplace, then the U.S. government will need to set a new course that balances economic interests with national security interests. The cost of inaction is not only short-term economic losses for U.S. companies, but a wave of protectionist policies that will systematically weaken U.S. technology competitiveness in years to come, with impacts on economic growth, jobs, trade balance, and national security through a weakened industrial base. Only by taking **decisive steps to reform its digital surveillance activities** will the U.S. government enable its tech industry to effectively compete in the global market. //

//**Tech leadership is the** **primary driver of hegemony** **–**// //**Weiss 14 – Fellow of the Academy of the Social Sciences in Australia, Professor Emeritus in Government and International Relations at the University of Sydney, Honorary Professor of Political Science at Aarhus University. (Linda, America Inc.?: Innovation and Enterprise in the National Security State, Cornell University Press, 4/1/14, p. 1-3)**//**JJ** So what accounts for America’s transformative capacity ? Where do its breakthrough innovations come from? My answer traces the relationship between high technology, national security, and political culture. It advances three interlinked propositions regarding the role of the NSS as technology enterprise and commercialization engine; **its geopolitical drivers** ; and the institutional consequences of an antistatist constraint. **The national security state as technology enterprise.** First, America's capacity for transformative innovation derives not merely from the entrepreneurship of its private sector, or simply from the state as such, but from the national security state —a particular cluster of federal agencies that collaborate closely with private actors in pursuit of security-related objectives. The NSS is a wholly new postwar creation that is geared to the permanent mobilization of the **nation's science and technology resources** for **military primacy**, and here I document and explain why it has had to become increasingly involved in commercial undertakings. Although centered on defense preparedness, the NSS is a good deal broader than the military, yet narrower than the state as a whole. In addition to its defense core in the Department of Defense, the NSS comprises several other components created at the height of the Cold War to pursue, deliver, or underwrite innovation in the service of securing technological supremacy. 
Although some are designated as "civilian" in their origins, evolution, and current mix of activities, these NSS components remain deeply enmeshed in national security or dual-use functions (as we shall see in chapter 2). Acting as commander in chief, the president sits at the peak of this complex, supported by the Oval Office and, in particular, the Office of Science and Technology Policy. In sum, I discuss NSS activities not in the more popular sense of a surveillance state, but as a national "technology enterprise" in which the military is the central, but far from exclusive, actor. In telling this story, I demonstrate and account for a major shift in NSS innovation programs and policies that involved the national security agencies cultivating and undertaking commercialization ventures. In the first phase (c. 1945 up to the 1970s), this process of fostering commercially relevant (general-purpose or dual-use) technologies took both direct and indirect forms. Then (especially from the 1980s onward) it also took a more proactive form, via patenting and licensing reforms and cooperative agreements to transfer technology from the federal labs to the private sector, via the launching of new procurement and joint innovation initiatives, and via the creation of new venture capital (VC) schemes. By placing greater emphasis on commercialization opportunities, some of these incentives sought to sweeten collaboration with the DOD and other security-related agencies, and thus to increase NSS influence over the direction of technology. A significant problem for the NSS has been that since the late 1970s, it has become progressively more challenging to enlist innovative companies in the private sector to work on security-related projects. While traditional defense suppliers grew increasingly large and specialized in systems integration, by the 1970s the more innovative producer companies—above all, critical suppliers of integrated circuits—had begun to pull away from the federal market. 
Attracting nondefense firms to do defense work was at one time easy because the government market (in semiconductors and computers, for instance) was so much larger than the private market, and healthy profits could be made. But by the mid-1970s commercial markets had come into their own, leading firms to reorient production to suit the more standardized demand. One consequence of lacking the earlier pull power of massive demand is that NSS agencies have had to create new incentives to foster private-sector collaboration. One of the major incentives intended to reattract the private sector is the inclusion of commercial goals in NSS technology policies. Commercial viability therefore has to stand alongside security and technological supremacy in NSS policy. For instance, if a firm works with an agency to create a technology, service, or prototype for use by the U.S. Army, it will also be encouraged from the outset of the project to create a similar product for the commercial market. In this way, and many more, the NSS has progressively been drawn into promoting commercial innovation for security reasons. One implication, demonstrated in some detail, is that the NSS has achieved a much broader reach than commonly implied by the notion of a military-industrial complex. Geopolitical drivers. What are the drivers of the NSS technology enterprise? **Geopolitics and related threat perceptions** have been **the original catalyst** for NSS formation and its **evolution as an innovation engine**. This state- (and technology-) building dynamic has occurred in three broad phases: the Cold War, the rise of Japan as techno-security challenge, and the post-9/11 era of asymmetric threats. The NSS emerged and expanded in fits and starts after World War II in response to a perceived international threat, emanating from the Soviet Union, that proved both enduring and persistent. 
It is instructive to note that in this phase the NSS bears at least some comparison with the erstwhile "developmental states" of Northeast Asia. They too emerged in response to an intensely perceived security threat, from neighboring China and North Korea, but instead sought national security more broadly via economic improvement, or industrial catch-up. Living on the fault lines of the Cold War in the presence of a credible and unyielding security threat exerted an unusual pressure on the East Asian states to pursue security by building economic strength. More distinctively in the case of Japan, Peter Katzenstein has developed the argument that, against the backdrop of terrible defeat, domestic power struggles succeeded in reorienting Japan's conception of security in favor of economic rather than military strength. Thus the Japanese state practices a form of "technological national security" in order to ensure against its resource dependence and reduce its exposure to international supply disruptions (Katzenstein 1996, 2005; also Samuels 1994). Fundamental motivations drawn from different historical experiences thus serve to underline a unique feature of the NSS. In contrast to Japan (and the East Asian developmental states more generally), America's national security state has been geared to the pursuit of technological superiority, not for reasons of national independence, economic competitiveness, or resource dependency, but **in order to maintain American primacy**. For the United States, the experience of World War II drove home the point that science and technology (S&T) was a game changer—the key to winning the war—and that **future preparedness** would depend on achieving and sustaining technological superiority. Geopolitics is thus the driver, not economics. 
I emphasize this point because many analysts have viewed the Pentagon as the source of an industrial policy that is pursued beneath the radar—a claim that this book disputes since it mistakes the nature of the primary driver. From its inception, the NSS was tasked with ensuring the technology leadership of the United States for the purpose of national defense. Even as the Soviet menace retreated, security proved paramount as the U.S. confronted a newly resurgent Japan that threatened to dethrone it as the regnant technology power. Appreciating the strength and intensity of the U.S. security focus means never underestimating the significance of this point: as long as U.S. military strategy continues to rely on a significant technology lead over its adversaries (real or potential), threats to that lead can never be simply (or even primarily) a commercial matter—even when the NSS "goes commercial."
 * NSS = National Security State**

(Yuhan, “America’s decline: A harbinger of conflict and rivalry”, 1-22, [], ldg) This does not necessarily mean that the US is in systemic decline, but it encompasses a trend that appears to be negative and perhaps alarming. Although the US still possesses incomparable military prowess and its economy remains the world’s largest, the once seemingly indomitable chasm that separated America from anyone else is narrowing. Thus, the global distribution of power is shifting, and the inevitable result will be a world that is less peaceful, liberal and prosperous, burdened by a dearth of effective conflict regulation. Over the past two decades, no other state has had the ability to seriously challenge the US military. Under these circumstances, motivated by both opportunity and fear, many actors have bandwagoned with US hegemony and accepted a subordinate role. Canada, most of Western Europe, India, Japan, South Korea, Australia, Singapore and the Philippines have all joined the US, creating a status quo that has tended to mute great power conflicts. However, as the hegemony that drew these powers together withers, so will the pulling power behind the US alliance. The result will be an international order where power is more diffuse, American interests and influence can be more readily challenged, and conflicts or wars may be harder to avoid. As history attests, power decline and redistribution result in military confrontation. For example, in the late 19th century America’s emergence as a regional power saw it launch its first overseas war of conquest towards Spain. By the turn of the 20th century, accompanying the increase in US power and waning of British power, the American Navy had begun to challenge the notion that Britain ‘rules the waves.’ Such a notion would eventually see the US attain the status of sole guardians of the Western Hemisphere’s security to become the order-creating Leviathan shaping the international system with democracy and rule of law. 
Defining this US-centred system are three key characteristics: enforcement of property rights, constraints on the actions of powerful individuals and groups and some degree of equal opportunities for broad segments of society. As a result of such political stability, free markets, liberal trade and flexible financial mechanisms have appeared. And, with this, many countries have sought opportunities to enter this system, proliferating stable and cooperative relations. However, what will happen to these advances as America’s influence declines? Given that America’s authority, although sullied at times, has benefited people across much of Latin America, Central and Eastern Europe, the Balkans, as well as parts of Africa and, quite extensively, Asia, the answer to this question could affect global society in a profoundly detrimental way. Public imagination and academia have anticipated that a post-hegemonic world would return to the problems of the 1930s: regional blocs, trade conflicts and strategic rivalry. Furthermore, multilateral institutions such as the IMF, the World Bank or the WTO might give way to regional organisations. For example, Europe and East Asia would each step forward to fill the vacuum left by Washington’s withering leadership to pursue their own visions of regional political and economic orders. Free markets would become more politicised — and, well, less free — and major powers would compete for supremacy. Additionally, such power plays have historically possessed a zero-sum element. In the late 1960s and 1970s, US economic power declined relative to the rise of the Japanese and Western European economies, with the US dollar also becoming less attractive. And, as American power eroded, so did international regimes (such as the Bretton Woods System in 1973). 
**A world without American hegemony is one where great power wars re-emerge**, the liberal international system is supplanted by an authoritarian one, and trade protectionism devolves into restrictive, anti-globalisation barriers. This, at least, is one possibility we can forecast in a future that will inevitably be devoid of unrivalled US primacy.
 * Hegemonic decline causes great power wars**
 * Zhang et al., Carnegie Endowment researcher, 2011**

[Eric, technical writer and user advocate for The Rackspace Cloud, September 14, 2010 []]
 * Cloud computing solves climate modeling**
 * Boyce, 10**

The promise of the cloud isn’t just about gaming and the ability to safely store all those photos that you wish you hadn’t ever taken. Many of **the most promising cloud-based applications** also **require massive computational power**. Searching a database of global DNA samples requires abundant, scalable processing power. Modeling protein folding is another example of how compute resources will be used. Protein folding is linked to many diseases including Alzheimer’s and cancer, and analyzing the folding process can lead to new treatments and cures, but it requires enormous compute power. Projects like Folding@home are using distributed computing to tackle these modeling tasks. **The cloud will offer a larger, faster, more scalable way to process data and thus benefit any heavy data manipulation task**. 6. Is it going to be hot tomorrow? Like protein folding modeling, **climate simulation and forecasting requires a large amount of data storage and processing.** Recently the German Climate Computing Center (DKRZ) **installed a climate calculating supercomputer that is capable of analyzing 60 petabytes of data** (roughly 13 million DVDs) at over 158 teraflops (trillion calculations per second). In the next couple of decades, this level of computing power **will be widely available and will exist on remote hardware. Sophisticated climate models combined with never before seen compute power will provide better predictions of climate change and more rapid early warning systems**.
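The card's "60 petabytes ≈ 13 million DVDs" figure can be sanity-checked with quick arithmetic. This is a back-of-the-envelope sketch; the 4.7 GB single-layer DVD capacity is our assumption, not stated in the evidence:

```python
# Sanity check of the DKRZ figures quoted in the Boyce card.
# Assumption: a single-layer DVD holds about 4.7 GB (not stated in the card).
PETABYTE = 1e15  # bytes
GIGABYTE = 1e9   # bytes

data_bytes = 60 * PETABYTE
dvd_bytes = 4.7 * GIGABYTE

dvds = data_bytes / dvd_bytes
print(f"{dvds / 1e6:.1f} million DVDs")  # ~12.8 million, consistent with the quoted ~13 million
```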

[Vicky Pope is the head of climate science advice at the Met Office Hadley Centre, “How science will shape climate adaptation plans,” 16 September 2010, []] Some would argue that **the demand for information on how climate change will affect our future outstrips the current capability of the science and climate models.** My view is that **as scientists**, we can provide useful information, but **we need to** be clear about its limitations and **strive to improve information** **for the future**. We need to be clear about the uncertainties in our projections while still extracting useful information for practical decision-making. I have been involved in developing climate models for the last 15 years and despite their limitations we are now able to assess the probability of different outcomes for the first time. That means **we can quantify the risk of** these **outcomes happening**. These projections – the UK climate projections published in 2009 - are already forming the backbone of adaptation decisions being made in the UK for 50 to 100 years ahead. A project commissioned by the Environment Agency to investigate the impact of climate change on the Thames estuary over the next 100 years concluded that current government predictions for sea level rise are realistic. A major outcome from the scientific analysis was that the worst-case scenarios for high water levels can be significantly reduced - from 4.2m to 2.7m – because we are able to rule out the more extreme sea level rise. As a result, massive investment in a tide-excluding estuary barrage is unlikely to be needed this century. This will be reviewed as more information becomes available, taking a flexible approach to adaptation. The energy industry, working with the Met Office, looked at the likely impact of climate change on its infrastructure. The project found that very few changes in design standards are required, although it did highlight a number of issues. 
For instance, transformers could suffer higher failure rates and efficiency of some types of thermal power station could be markedly reduced because of increasing temperatures. A particular concern highlighted by this report and reiterated in today's report from the Climate Change Committee - the independent body that advises government on its climate targets - is that little is known about how winds will change in the future - important because of the increasing role of wind power in the UK energy mix. Fortunately many people, from private industry to government, recognise the value of even incomplete information to help make decisions about the future. **Demand for climate information is increasing, particularly relating to changes in the short to medium term**. **More still needs to be done to refine the climate projections and make them more usable and accessible. This is especially true if we are to provide reliable projections for the next 10 to 30 years. The necessary science and modelling tools are being developed,** and the first tentative results are being produced. We need particularly to look at how we communicate complex and often conflicting results. In order to explain complex science to a lay audience, scientists and journalists are prone to progressively downplay the complexity. Conversely, in striving to adopt a more scientific approach and include the full range of uncertainty, we often give sceptics an easy route to undermine the science. All too often uncertainty in science offers a convenient excuse for delaying important decisions. However, in the case of climate change there is overwhelming evidence that the climate is changing — in part due to human activities — and that changes will accelerate if emissions continue unabated. In examining the uncertainty in the science we must take care to not throw away what we do know. Science has established that climate is changing. 
**Scientists** now **need to press on in developing the emerging tools that will be used to underpin sensible adaptation decisions which will determine our future.** **Pascual and Elkind 2010** (Carlos [US Ambassador to Mexico, Served as VP of foreign policy @ Brookings]; Jonathan [principal dep ass sec for policy and int energy @ DOE]; Energy Security; p 5; kdf) Climate change is arguably the greatest challenge facing the human race. It poses profound risks to the natural systems that sustain life on Earth and consequently creates great challenges for human lives, national economies, nations' security, and international governance. New scientific reports emerging from one year to the next detail ever more alarming potential impacts and risks. It is increasingly common for analysts and policymakers to refer to climate change as a threat multiplier, a destructive force that will exacerbate existing social, environmental, economic, and humanitarian stresses. The warming climate is predicted to bring about prolonged droughts in already dry regions, flooding along coasts and even inland rivers, an overall increase in severe weather events, rising seas, and the spread of disease, to cite just a few examples. Such impacts may spark conflict in weak states, lead to the displacement of millions of people, create environmental refugees, and intensify competition over increasingly scarce resources. One of the great challenges of climate change is, indeed, the scope of the phenomenon. The ongoing warming of the globe results chiefly from one of the most ubiquitous of human practices, the conversion of fossil fuels into energy through simple combustion. Halting and reversing climate change, however, will require both unproven (perhaps even unimagined) technology and sustained political commitment. We must change living habits in all corners of the globe over the course of the next several decades. We must resist the impulse to leave the problem for those who follow us or to relax our efforts if we achieve a few years of promising progress. The profound challenge will lie in the need for successive rounds of sustained policymaking, successive waves of technological innovation, and ongoing evolution of the ways in which we live our lives.
 * Key to warming adaptation**
 * Pope, 10**
 * Warming is a threat magnifier, makes all impacts inevitable and causes extinction**

Abstract: In this paper we demonstrate how a cloud-based computing architecture can be used for **planetary defense and space situational awareness (SSA)**. We show how utility compute can facilitate both a financially **economical and highly scalable solution** for space debris and near-earth object impact analysis. As we improve our ability to track smaller space objects, and satellite collisions occur, the volume of objects being tracked vastly increases, increasing computational demands. Propagating trajectories and calculating conjunctions becomes increasingly time critical, thus requiring an architecture which can scale with demand. The extension of this to tackle the problem of a future near-earth object impact is discussed, and how cloud computing can play a key role in this civilisation-threatening scenario. Introduction: Space situational awareness includes scientific and operational aspects of space weather, near-earth objects and space debris. This project is part of an international effort to provide a global response strategy to the threat of a Near Earth Object (NEO) impacting the earth, led by the United Nations Committee for the Peaceful Use of Space (UN-COPUOS). The impact of a NEO – an asteroid or comet – is a severe natural hazard but is unique in that technology exists to predict and to prevent it, given sufficient warning. As such, the International Spaceguard survey has identified nearly 1,000 potentially hazardous asteroids >1km in size although NEOs smaller than one kilometre remain predominantly undetected, exist in far greater numbers and impact the Earth more frequently. Impacts by objects larger than 100 m (twice the size of the asteroid that caused the Barringer crater in Arizona) could occur with **little or no warning**, with the energy of hundreds of nuclear weapons, and are “devastating at potentially unimaginable levels” (Figure 1). 
The tracking and prediction of potential NEO impacts is of international importance, particularly with regard to disaster management. **Space debris poses a serious risk to satellites** and space missions. Currently Space Track publishes the locations of about 10,000 objects that are publicly available. These include satellites, operational and defunct, space debris from missions and space junk. It is believed that there are about 19,000 objects with a diameter over 10cm. **Even the smallest space junk** travelling at about 17,000 miles per hour **can cause serious damage**; the Space Shuttle has undergone 92 window changes due to debris impact, resulting in concerns that a more serious accident is imminent, and the International Space Station has to execute evasion manoeuvres to avoid debris. There are **over 300,000 objects** over 1cm in diameter and there is a desire to track most, if not all of these. By improving ground sensors and introducing sensors on satellites the Space Track database will increase in size. By tracking and predicting space debris behaviour in more detail we can reduce collisions as the orbital environment becomes ever more crowded. **Cloud computing provides the ability to trade computation time against costs**. It also favours an architecture which inherently scales, providing burst capability. By treating compute as a utility, compute cycles are only paid for when they are used. Here we present a cloud application framework to tackle space debris tracking and analysis, that is being extended for NEO impact analysis. Notably, in this application propagation and conjunction analysis results in peak compute loads for only 20% of the day, with burst capability required in the event of a collision when the number of objects increases dramatically; the Iridium-33 Cosmos-2251 collision in 2009 resulted in an additional 1,131 trackable objects (Figure 2). 
Utility computation can quickly adapt to these situations consuming more compute, incurring a monetary cost but keeping computation wall clock time to a constant. In the event of a conjunction event being predicted, satellite operators would have to be quickly alerted so they could decide what mitigating action to take. In this work we have migrated a series of discrete manual computing processes to the Azure cloud platform to improve capability and scalability. It is the initial prototype for a broader space situational awareness platform. The workflow involves the following steps: obtain satellite position data, validate data, run propagation simulation, store results, perform conjunction analysis, query satellite object, and visualise. Satellite locations are published twice a day by Space Track, resulting in bi-daily high workloads. Every time the locations are published, all previous propagation calculations are halted, and the propagator starts recalculating the expected future orbits. Every orbit can be different, albeit only slightly from a previous estimate, but this means that all conjunction analysis has to be **recomputed**. The quicker this workflow is completed the quicker possible conjunction alerts can be triggered, providing **more time for mitigation**. The concept project uses Windows Azure as a cloud provider and is architected as a data-driven workflow consuming satellite locations and resulting in conjunction alerts, as shown in Figure 3. Satellite locations are published in a standard format known as a Two-Line Element (TLE) that fully describes a spacecraft and its orbit. Any TLE publisher can be consumed, in this case the Space Track website, but also ground observation station data. The list of TLEs are first separated into individual TLE Objects, validated and inserted into a queue. 
TLE queue objects are consumed by comparator workers which check to see if the TLE exists; new TLEs are added to an Azure Table and an update notification added to the Update Queue. TLEs in the update notification queue are new and each requires propagation; this is an embarrassingly parallel computation that scales well across the cloud. Any propagator can be used. We currently support the NORAD SGP4 propagator and a custom Southampton simulation (C++) code. Each propagated object has to be compared with all other propagations to see if there is a conjunction (predicted close approach). Any conjunction source or code can be used; currently only SGP4 is implemented; plans are to incorporate more complicated filtering and conjunction analysis routines as they become available. Conjunctions result in alerts which are visible in the Azure Satellite tracker client. The client uses Virtual Earth to display the orbits. Ongoing work includes expanding the Virtual Earth client as well as adding support for custom clients by exposing the data through a REST interface. This pluggable architecture ensures that additional propagators and conjunction codes can be incorporated, and as part of ongoing work we intend to expand the available analysis codes. The framework demonstrated here is being extended as a generic space situational service bus to include NEO impact predictions. This will exploit the pluggable simulation code architecture and the cloud’s burst computing capability in order to allow refinement of predictions for **disaster management** simulations and potential emergency scenarios anywhere on the globe. Summary: We have shown how a new architecture can be applied to space situational awareness to provide a scalable robust data-driven architecture which can enhance the ability of existing disparate analysis codes by integrating them together in a common framework. 
By automating the ability to alert satellite owners to potential conjunction scenarios we reduce the potential of conjunction oversight and decrease the response time, thus making space safer. This framework is being extended to NEO trajectory and impact analysis to help improve planetary defence capability for all.
 * Cloud computing is also critical to** **space situational awareness** **—solves asteroids and debris**
 * Johnston et al 9** [Steven, PhD in computer engineering and MEng degree in software engineering, specializes in cloud-based architecture, Kenji Takeda, Solutions Architect and Technical Manager for the Microsoft Research Connections EMEA team, has extensive experience in Cloud Computing, Hugh Lewis, professor at University of Southampton, specialist in space situational awareness, Simon Cox, professor of Computational Methods and Director of the Microsoft Institute for High Performance Computing at University of Southampton, Graham Swinerd, professor at University of Southampton, specializes in space situational awareness, “Cloud Computing for Planetary Defense”, http://eprints.soton.ac.uk/71883/1/John_09.pdf, October 2009, 3/31/15]
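The Johnston et al. workflow (ingest TLEs, deduplicate against a table, queue update notifications, propagate each new object, then check all pairs for close approaches) can be sketched in miniature. This is an illustrative stand-in, not the authors' Azure code: the in-memory dict standing in for an Azure Table, the toy straight-line "propagator" standing in for SGP4, and the 5 km alert threshold are all our assumptions.

```python
from queue import Queue
from math import dist

CONJUNCTION_KM = 5.0    # alert threshold (assumed for illustration)

known = {}              # stand-in for the "Azure Table": id -> (position, velocity)
updates = Queue()       # stand-in for the "Update Queue": ids needing (re)propagation

def ingest(tle_id, pos, vel):
    """Comparator worker: store new or changed elements and queue an update."""
    if known.get(tle_id) != (pos, vel):
        known[tle_id] = (pos, vel)
        updates.put(tle_id)

def propagate(pos, vel, minutes):
    """Dummy propagator: straight-line motion (stand-in for SGP4)."""
    return tuple(p + v * minutes for p, v in zip(pos, vel))

def conjunctions(minutes):
    """Drain the update queue, propagate everything, report close approaches."""
    while not updates.empty():
        updates.get()   # in the real pipeline each worker propagates one object
    future = {i: propagate(p, v, minutes) for i, (p, v) in known.items()}
    ids = sorted(future)
    return [(a, b) for i, a in enumerate(ids) for b in ids[i + 1:]
            if dist(future[a], future[b]) < CONJUNCTION_KM]

ingest("SAT-A", (0.0, 0.0, 0.0), (1.0, 0.0, 0.0))   # drifting object
ingest("SAT-B", (10.0, 0.0, 0.0), (0.0, 0.0, 0.0))  # stationary object
print(conjunctions(minutes=8))  # SAT-A drifts to x=8, within 5 km of SAT-B
```

The embarrassingly parallel step the card highlights is the `propagate` call: each queued object can be handled by an independent worker, which is why burst capacity (e.g. after the Iridium-Cosmos collision) maps naturally onto utility compute.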

// **Humanity has the skills** and know-how to deflect a killer asteroid of virtually any size, **as long as the incoming space rock is spotted with enough lead time**, experts say. Our species could even nudge off course a 6-mile-wide (10 kilometers) behemoth like the one that dispatched the dinosaurs 65 million years ago. We'd likely have to slam multiple spacecraft into a gigantic asteroid over a period of several decades to do the job, but the high stakes would motivate such a strong and sustained response, researchers say. "If you can hit it with a kinetic impactor, you can hit it with 10 or 100 of them," former NASA astronaut Ed Lu, chairman and CEO of the nonprofit B612 Foundation, which is devoted to protecting Earth against asteroid strikes, said during a news conference last month. "And I would submit to you that if we were finding an asteroid that's going to wipe out all life on Earth, or the majority of life on Earth, that funding is not an issue for launching 100 of them," Lu added. Undiscovered asteroids Lu and four other spaceflyers spoke Oct. 25 at the American Museum of Natural History in New York City. A primary purpose of the event was to draw attention to the danger asteroids pose to human civilization and life on Earth, and to discuss ways to mitigate the threat. Earth has been pummeled by space rocks repeatedly over the eons and will continue to get hit, a reality that was reinforced in February when a 55-foot-wide (17 meters) space rock exploded in the atmosphere over the Russian city of Chelyabinsk, injuring more than 1,000 people. The Russian meteor came out of nowhere, evading detection by the various instruments that are scanning the heavens for potentially hazardous objects. And there are many more such space rocks out there, gliding through deep space unknown and unnamed. 
To date, scientists have discovered about **10,000 near-Earth objects**, or NEOs — just **1 percent of the 1 million** or so asteroids thought to come uncomfortably close to our planet at some point in their orbits. So the **top priority** of any asteroid-defense effort should be a stepped-up detection campaign, Lu said. "Our challenge is to find these asteroids first, before they find us," he said. "**You cannot deflect an asteroid you haven't yet found**."//
 * We have deflection capabilities, but** **detection** **is key**
 * Wall, 13** – senior writer at space.com (Mike, Space.com, “How Humanity Could Deflect a Giant Killer Asteroid”, 11/22/13, [], //11)//

//**Extinction**// //**Matheny, 7** (Jason G Matheny, Prof of Health Policy and Management at the Bloomberg School of Public Health at Johns Hopkins University, “Reducing the Risk of Human Extinction,” Risk Analysis Volume 27 Number 5, Oct. 15 2007, http://www.upmc-biosecurity.org/website/resources/publications/2007_orig-articles/2007-10-15-reducingrisk.html)//twemchen Even if extinction events are improbable, the expected values of countermeasures could be large, as they include the value of **all future lives**. This introduces a discontinuity between the CEA of extinction and nonextinction risks. Even though the risk to any existing individual of dying in a car crash is much greater than the risk of dying in an asteroid impact, asteroids pose a much greater risk to the existence of future generations (we are not likely to crash all our cars at once) (Chapman, 2004). The "death-toll" of an extinction-level **asteroid impact** is the population of Earth, **plus all** the **descendants** of that population who would otherwise have existed if not for the impact. There is thus a discontinuity between risks that threaten 99% of humanity and those that threaten 100%.

**FCC 1ac – FCC rocks**


 * Contention 3 – FCC rocks**

IV. HOW THE FCC SHOULD ADDRESS THE NSA SURVEILLANCE: IMPLEMENTING THE SOLUTION Congress is equipped to enact legislation codifying FCC oversight of the NSA by virtue of both current law and the PCLOB’s recommendations. First, the Telecommunications Act can serve as the basis for the FCC to take action to further develop its protection of consumers on the Internet. Moreover, there has been some movement in Congress calling on the FCC to take action regarding the NSA phone database, indicating the possibility of the FCC taking up an oversight role.116 Further, Congress gave the FCC broad investigation, regulatory, and enforcement powers, as well as the privacy-focused directive of implementing Customer Proprietary Network Information protection.117 Additionally, the first PCLOB Report calls for extensive changes in the NSA and FISA Court regime while the second report calls expressly for industry input and expertise: the FCC could facilitate some of the suggested changes through its subject matter expertise. Even as the FCC is set up to facilitate the PCLOB recommendations, Congress needs to codify the legal authority for the FCC to do this specifically. **Granting express legal authority is key**, as organic statutes of agencies determine what a given agency can and cannot do. Congressional authorization would be a logical outgrowth of both the FCC’s regulatory interests and current legal recommendations regarding NSA oversight. A. Congress should amend the organic statutes of the FCC and NSA and encourage participation in the FISA Court. The lack of oversight of NSA data collection practices will continue to be problematic moving forward, as national security is an ongoing concern and technology is a large part of life in a modern society. There is need for effective and transparent oversight of the NSA’s data collection. 
As such, Congress should act by amending the organic statutes of both the NSA and the FCC to provide the FCC with oversight authority over the NSA, and by allowing the FCC to participate as amicus curiae with the FISA Court. 1. Congress should amend the NSA organic statute to provide for collection of data by the FCC. The NSA needs transparent and easily understood oversight. While it should not have to disclose national security information, the agency should be required to disclose basic statistics, such as how much information it is gathering, similar to Recommendation 9 in the second PCLOB Report.118 This would at least illustrate to the public, via the FCC, that the NSA is **targeting** its surveillance at legitimate threats to national security — **rather than performing blanket surveillance of all Internet users**. Further, these reforms would comport with the PCLOB’s enumerated Recommendations.119 As of now, “lawmakers and the public do not have even a rough estimate of how many communications of U.S. persons are acquired under section 702.”120 Because the NSA is required to target foreign communications in order for its surveillance to be lawful,121 an annual snapshot showing the volume of its surveillance will help foster some degree of transparency,122 helping assure citizens that their privacy is not being intruded upon, without hampering legitimate national security efforts.123 This expanded role for the FCC in relation to the NSA should be codified by Congress. First, Congress should amend the NSA’s organic statute to require the agency to comply with FCC requests for data. Additionally, while the FCC does not have the security clearance to review the substance of the surveillance, such clearance is not necessary on an agency-wide basis. Instead, Congress should require the NSA to provide targeting statistics that could be reasonably disclosed, or at least preliminary statistics that could focus the FCC’s inquiry. 
This new legislation is all that is necessary to facilitate oversight on the NSA side, as the FCC will require most of the congressional authorization. 2. The FCC’s organic statute should be amended to allow the FCC authority over NSA data collection and participation in the FISA Court. To enact a solution based on FCC oversight of NSA data collection, Congress should pass legislation allowing the FCC to collect information from the NSA, and to allow the FCC to submit its findings about this data to congressional oversight committees as well as the FISA Court. While novel, this solution is in keeping with the PCLOB recommendations, particularly the recommendation emphasizing the need for the NSA to publicly disclose the scope of its surveillance.124 Moreover, it is not uncommon for agencies to have oversight authority over other agencies.125 Thus, this type of inter-agency accountability could be codified to provide the FCC with oversight authority over NSA data collection. Congress should first authorize the FCC to request certain types of data from the NSA. Similar to the PCLOB’s recommendation,126 this data, rather than being substantive, would be statistical; for instance, it might include data and the basic context surrounding how many communications providers from which the NSA is collecting metadata, or how many email contact lists the NSA is gathering.127 This would thereby provide oversight over the relevancy problem, wherein the NSA collects information in such wide swaths so as not to be tied to any particularized inquiry.128 The FCC would therefore be in a position to review the volume of information, while keeping it confidential. The legislation should also include authorization for the FCC to interact with the other oversight bodies. 
Congress should give the FCC the authority to send any of the statistics that the agency finds problematic to the FISA Court and the relevant congressional committees, and should provide for the FCC to be informed of proceedings implicating data collection over which the FCC would be granted authority. Additionally, Congress should provide a mechanism for the FCC to liaise with Congress on a regular basis specifically about the NSA data collection since it involves sensitive information: for instance, setting out regular reports or allowing Congress to send inquiries to the FCC as needed on the technical aspects of the NSA’s methods of data collection. The language could also allow for public comment on NSA collection to some extent, modeled on the current FCC notice and comment procedures. The FCC could thereby ask for generalized comments without disclosing the exact nature of its inquiry. Thus, the FCC could solicit public comment on the underlying idea of NSA surveillance as it relates to the communications infrastructure and incorporate valid comments in its representations to the relevant oversight mechanisms. This would enable the FCC to incorporate comments by carriers and consumer interest groups into the oversight process and allow some degree of public participation without sacrificing national security. Moreover, the legislation must include a mechanism for protecting national security information. The FCC has knowledge about the underlying infrastructure where the data is coming from as well as experience dealing with sensitive information.129 However, there are valid concerns in disclosing any sort of information implicating national security. To that end, Congress may wish to consider adding a position in the FCC for an intelligence officer with clearance who can look into relevance when the amounts of data raise a red flag in the FCC’s internal process for reviewing the data. 
Moreover, placement of an NSA staffer in the FCC would facilitate inter-agency cooperation and dialogue about data collection. For enforcement, in order to preserve national security, Congress should avoid providing the FCC any mechanism to call the NSA before it via hearing. However, the FCC would be able to report specially to the House and Senate committees, as well as petition the FISA Court as amicus curiae. Additionally, if the PCLOB wants to stay involved and keep developing oversight, Congress should provide an avenue for the FCC to call forth another PCLOB investigation should the need arise. 3. Congress should allow outside parties to petition the FISA Court. Congress should follow the PCLOB Recommendation to allow outside parties to petition the FISA Court to put forth independent views. The PCLOB recommendation about FISA Court operations would allow for public comment.130 While there are logistical problems with allowing other parties before the court, the PCLOB suggests that a Special Advocate could advise the FISA Court whether amicus participation would be helpful in a given case.131 Input from outside sources132—and, in particular, the FCC—would be useful in terms of providing technical insights into the impact of NSA surveillance on telecommunications. In particular, the FCC could be among the independent viewpoints incorporated in the continuing process of evaluating upstream and “about” collection.133 Moreover, even if Congress decides to provide limited amicus participation, the FCC, providing volumetric data or technical expertise, could help act as a bridge between the public, parties in the communications field, and the court. 
The FISA Court itself considers each and every surveillance application fastidiously, but the public needs to have the same confidence in the court’s impartiality and rigor as those government actors who interact with or serve on the court.134 While there is need for secrecy due to national security concerns, there is also the need for the court to take into account a greater range of views and legal arguments, as well as receive technical assistance and legal input from outside parties.135 The PCLOB report indicates that, while there are difficulties in inviting amicus participation by parties lacking national security clearance, such as the FCC, the fact that it has been done in one instance indicates that it is possible to invite participation from outside parties without infringing upon national security.136 Moreover, as mentioned above, it may be useful for Congress to create a position at the FCC in which national security clearance is granted. Not only would this create a safeguard for the integrity of national security information, but this would provide for a person who can be called before the FISA Court who could be exposed to the facts of a given case, and using the data that has been collected and/or analyzed by the FCC, could provide insight into a particular instance. Therefore, Congress should encourage the FISA Court to use its ability to appoint technical experts, as well as pass legislation to allow for more amicus participation by outside parties.137 Congress should enact legislation following the PCLOB recommendations with an eye towards focusing on the FCC as an expert by enacting legislation for the FCC to participate as amicus curiae before the FISA Court. V. CONCLUSION The FCC is in a position to provide oversight and transparency to the NSA Internet monitoring scandal. 
As an agency tasked with regulating the technology and communications sectors, the FCC has been keeping up with the infrastructure and development of technology vis-à-vis the Internet as it pertains to its congressional mandate and its own regulations. Moreover, there would not be an intrusion onto national security efforts because only the volume of information collected would be disclosed. The current crisis in public confidence shows that there is a place for the FCC to be an integral part of the oversight process. The FCC would focus the inquiry of the congressional oversight committees and provide the FISA Court with much-needed outside perspective and **technical assistance**, while simultaneously giving the public some comfort and adding transparency to the process. This inter-agency monitoring could increase **accountability and public confidence** in a way that **traditional oversight mechanisms cannot**: thus, the FCC is in a unique position to add value to the oversight of the NSA and Congress should pursue codifying this solution.
 * FCC oversight of the NSA curtails** **bulk** **internet surveillance and enforces** **targeted** **surveillance that excludes domestic users. FCC** **technical expertise** **and** **transparency** **prevent circumvention**
 * Healey, 14** - J.D. Candidate, The George Washington University Law School (Audra, “A Tale of Two Agencies: Exploring Oversight of the National Security Administration by the Federal Communications Commission”, FEDERAL COMMUNICATIONS LAW JOURNAL Vol. 67, December)

C. The FCC mission can be naturally expanded to protect privacy in relation to surveillance. The FCC has a strong privacy background as well as a strong history of promoting openness and transparency on the Internet. First, this section shows the FCC has been extending many of its regulations to the Internet and adapting to changes in technology as it does so. Second, the FCC has a strong history of protecting the nation’s communications infrastructure. The FCC has experience with accounting for the globalized nature of communications.91 This section next argues that the FCC’s background in these areas prepares the agency to step into a new role overseeing the NSA collection of data. Finally, this section discusses the benefits of tasking the FCC with this important oversight role. 1. The FCC has a strong background and significant expertise that will allow the agency to provide oversight of the NSA. Since the “advent of the Internet,” the FCC has been involved in regulating this facet of the nation’s communications infrastructure.92 For instance, as early as 1980, the FCC considered the extent to which information processing (as involved in Internet services) required further or different regulation from other communications networks.93 In 1980, the FCC began to recognize a distinction between basic and enhanced services, and applied this distinction until its codification in the Telecommunications Act of 1996.94 Following codification, the FCC continued its use of this framework, but expanded its scope to include elements of Internet infrastructure, such as broadband connectivity.95 However, the FCC remained willing to consider applying its regulatory framework to new technologies.96 This flexibility has helped the agency adapt to new and changing technology as it influences the nation’s communications infrastructure. Additionally, the FCC acknowledges the impact of privacy on the Internet. 
The recognition that “[c]onsumers’ privacy needs are no less important when consumers communicate over and use broadband Internet access than when they rely on [telephone] services,” has played a large part in FCC policy, as the agency has long supported protecting the privacy of broadband users.97 The FCC further ensures that consumers have control over how their information is used, and that they are protected from “malicious third parties.”98 Moreover, there is a direct link between consumer confidence and the adoption of new technology, which the agency has taken into account as it formulates new policies. As former Chairman Genachowski explained, in the FCC’s view, “[i]f consumers lose trust in the Internet, this will **suppress broadband adoption** and **online commerce** and communication, and all the benefits that come with it.”99 Moreover, the FCC has recognized that it can, and should, play a major role in protecting privacy and **consumer confidence** in the Internet, including **working with industry members** to provide best practices for security100 and encouraging broadband adoption.101 The next logical step is for Congress to authorize the FCC to further develop Internet privacy principles in the context of **protecting consumers from NSA monitoring of their Internet communications** and **access of the Internet providers’ infrastructure** to do so. 2. FCC oversight of the NSA could confer significant benefits. The lack of oversight indicates the need for a solution that is **publicly visible** but would not undermine national security: due to its relevant expertise, the FCC is that solution. First, there are benefits specific to the FCC’s area of expertise which make it well-suited to provide insight into the data collection regarding the public good and communications infrastructure. 
Second, the FCC’s unique insights into the technological aspects of the Internet put the agency in a position to be uniquely helpful to congressional oversight committees. Moreover, the FCC is also particularly well-suited to provide oversight consistent with plans advocated by the PCLOB: for instance, specially providing the FISA Court with useful and insightful amicus curiae briefs.102 There are significant benefits to the FCC being the agency to provide insight into the NSA’s monitoring activities. The NSA gets the information it collects from “major Internet switches” and depending on the type of surveillance, does not have to notify the companies from which it collects data.103 However, the FCC could, with additional congressional authority, provide insight into basic statistics about the information collected by the NSA: for instance, volume, requiring the NSA to at least show patterns (i.e., the “relationship mapping” aspects).104 This could be beneficial to the national security mission: by providing a volumetric, technical analysis, based on practices that can be described, the FCC could help focus the NSA’s data collection, and thereby contribute to the effort to **reduce overcollection**, as well as provide a grounds for congressional monitoring and more effective court cases.105 Moreover, the FCC routinely deals with sensitive information and collecting public comments.106 For instance, the FCC often makes certain pieces of information confidential in its proceedings. Recently, the agency issued protective orders in its comment-seeking proceeding regarding the Technological Transition of the Nation’s Communications Infrastructure.107 This experience would facilitate the FCC acting as a bridge between the NSA and its oversight mechanisms. 
Additionally, the PCLOB report calls for a similar oversight scheme.108 The PCLOB, in its first report, calls for the government to work with Internet service providers and other companies that regularly receive FISA production orders to develop rules permitting the companies to voluntarily disclose certain statistical information.109 Additionally, the PCLOB recommends that the government publicly disclose detailed statistics to provide a more complete picture of government surveillance operations.110 The PCLOB also recommends that independent experts as well as telecommunications service providers help assess at least one data collection technique.111 The FCC regularly interacts with these companies in its own rulemaking proceedings, and would therefore be in a position to facilitate independent expertise being utilized in assessing the efficacy of the collection.112 This is not only because the agency works with the companies and the infrastructure involved already,113 but also because the FCC’s general technical expertise places the agency in a position to consider what types of statistics would be helpful to the public. The need for expertise in determining the technical aspects of whether the data being collected is authorized is not limited to DOJ and NSA efforts, but extends to the FISA Court. In its first report, the PCLOB calls for Congress to enact legislation enabling the FISA Court to hear independent views.114 While a federal agency rather than an “independent” entity, the FCC would be particularly well-suited to bolster the outside input and provide the FISA Court with information regarding the impact on telecommunications, particularly the Internet, of NSA surveillance of the American public. The FCC would be a particularly helpful independent view to involve in the FISA Court proceedings because of its technical expertise. 
Furthermore, the FCC has significant experience dealing with sensitive information, such as trade secrets.115 Both these traits make the agency particularly well-suited to provide helpful insights to the FISA Court.
 * FCC** **public visibility** **,** **expertise** **in communication tech, and** **data analysis** **curb overreaching and boost** **public** **and** **industry** **confidence** **in privacy protection**
 * Healey, 14** - J.D. Candidate, The George Washington University Law School (Audra, “A Tale of Two Agencies: Exploring Oversight of the National Security Administration by the Federal Communications Commission”, FEDERAL COMMUNICATIONS LAW JOURNAL Vol. 67, December)

The NSA’s extensive surveillance of U.S. citizens was brought into the spotlight by the recent disclosures of former NSA contractor Edward Snowden.5 The first of Snowden’s disclosures, released by The Guardian on Wednesday, June 5, 2013, revealed that the NSA was collecting phone call detail records from millions of U.S. consumers on a daily basis.6 This has prompted widespread public concern about the extensive information collection policy of the NSA. As technology continues to develop and the Internet continues to play a major role in modern life, governmental monitoring of Internet activity will likely become an area of increasing concern. The best way to ensure proper oversight of this monitoring is by empowering an administrative agency: namely, the Federal Communications Commission (the “FCC”). This Note will address what role the FCC could and should play in overseeing intelligence activities that implicate individual privacy on the Internet and telecommunications networks. This Note argues that the FCC, as the expert independent agency that routinely deals with the Internet and telecommunications networks, has both the tools and capacity to provide some oversight and protection for Internet users. Part II discusses the background of each agency, beginning with the NSA, then delves into the FCC and its efforts to keep pace with the ever-changing Internet. Part III argues that, because the existing mechanisms for overseeing governmental, domestic surveillance programs are inadequate, and given the FCC’s long history of scrutinizing the interplay of national security and privacy involving telecommunications, Congress should empower the FCC to address privacy concerns raised by the NSA’s surveillance of U.S. citizens. Part IV discusses how the FCC could address NSA surveillance activities, laying out possible, practical solutions that Congress should provide.
 * FCC has the technical expertise to effectively curtail NSA internet surveillance**
 * Healey, 14** - J.D. Candidate, The George Washington University Law School (Audra, “A Tale of Two Agencies: Exploring Oversight of the National Security Administration by the Federal Communications Commission”, FEDERAL COMMUNICATIONS LAW JOURNAL Vol. 67, December)