Zero Days Negative MI 7



case defense


1nc solvency

2nc alt causes

2nc cybersecurity impossible

2nc status quo solves

critical infrastructure advantage

1nc critical infrastructure

1nc grid impact

2nc grid impact

1nc water impact

at: agriculture

at: air traffic control

at: econ impact

at: emergency response impact

at: internet impact

ip theft advantage

1nc china modernization

2nc china modernization

1nc hegemony

1nc russian modernization

oco’s advantage

1nc treaties/norms

1nc cyberwar

2nc cyberwar

2nc fear mongering

2nc retaliation

2nc status quo solves

2nc us strikes first

at: china impact

offcase arguments

advantage counterplans

1nc oversight cp

2nc oversight solves

1nc regulations cp

2nc regulations cp

1nc wassenaar regulations cp

2nc solvency

at: cp doesn’t solve china

at: can’t catch all vulnerabilities

cyberdeterrence da

1nc cyberdeterrence da

2nc link/turns case wall

link – legal restrictions

link – transparency

brink – no cyberwar now

internal link – china war

internal link/impact – korea war

impact – china war

at: cyberdefense

at: cyberoffense bad

at: deterrence impossible

at: deterrence doesn’t apply to cyber

at: deterrence fails (attribution)

at: no retaliation

at: other agencies solve

at: transparency solves war

at: treaties solve

nato counterplan

1nc nato counterplan

2nc nato cp solvency

2nc nato impact

2nc russia cyberwar impact

at: permutation

politics links

1nc politics link

tpa solves ip theft

case defense


1nc solvency

No solvency --- US demand doesn’t drive global zero-day use

Bellovin et al. 14 [Steven M., professor of computer science at Columbia University, Matt Blaze, associate professor of computer science at the University of Pennsylvania, Sandy Clark, Ph.D. student in computer science at the University of Pennsylvania, Susan Landau, 2012 Guggenheim Fellow; she is now at Google, Inc., April, 2014, “Lawful Hacking: Using Existing Vulnerabilities for Wiretapping on the Internet,” Northwestern Journal of Technology and Intellectual Property, 12 Nw. J. Tech. & Intell. Prop. 1] //khirn

P165 It is interesting to ponder whether the policy of immediately reporting vulnerabilities could disrupt the zero-day industry. Some members of the industry, such as HP DVLabs, "will responsibly and promptly notify the appropriate product vendor of a security flaw with their product(s) or service(s)." n245 Others, such as VUPEN, which "reports all discovered vulnerabilities to the affected vendors under contract with VUPEN," n246 do not. Although it would be a great benefit to security if the inability to sell to law enforcement caused the sellers to actually change their course of action, U.S. law enforcement is unlikely to have a major impact on the zero-day market since it is an international market dominated by national security organizations.

Can’t solve lack of trust within the private sector --- regulatory and competitive barriers

Jaffer 15 [Jamil N., Adjunct Professor of Law and Director, Security Law Program, George Mason University Law School, Occasional Papers Series, published by the Dean Rusk Center for International Law and Policy, 4-1-2015, “Cybersecurity and National Defense: Building a Public-Private Partnership,”] //khirn

But, second, and perhaps even more important, is the lack of trust within the private sector — the inability of private industry actors to communicate with one another the threats they’re seeing. And there are a lot of reasons for that. There are regulatory reasons, there are competitive reasons, and there’s just an inherent sense of, “It’s hard for me to tell the guy next door what I’m doing.” Now, the truth is that at the systems administrator level this happens all the time. Systems administrators of major corporations all the time will call each other up and say, “Hey, I’m seeing this on my network. Are you seeing it?” And the reason that relationship works is because they trust each other. They know that the other sys admin is not going to, you know, screw them over competitively. They do worry at the corporate level, however. If general counsel were to know about this kind of conversation going on, they’d probably be tamping it down and saying, “Look, you can’t be talking to, you know, the sys admin over at our competitor because who knows if he tells his CEO what’s going to happen to us competitively.”

Vulnerabilities inevitable --- orphans

Bellovin et al 14 (Steven M. Bellovin (computer science prof at Columbia), Matt Blaze (associate prof at UPenn), Sandy Clark (Ph.D student at UPenn), & Susan Landau (Guggenheim fellow), April 2014, Lawful Hacking: Using Existing Vulnerabilities for Wiretapping on the Internet, Northwestern Journal of Technology and Intellectual Property, 12 Nw. J. Tech. & Intell. Prop. 1, lexis) /AMarb

To whom should a vulnerability report be made? In many cases, there is an obvious point of contact: a software vendor that sells and maintains the product in question, or, in the case of open-source software, the community team maintaining it. In other cases, however, the answer is less clear. Not all software is actively maintained; there may be “orphan” software without an active vendor or owner to report to.253 Also, not all vulnerabilities result from bugs in specific software products. For example, standard communications protocols are occasionally found to have vulnerabilities,254 and a given protocol may be used in many different products and systems. In this situation, the vulnerability would need to be reported not to a particular vendor, but to the standards body responsible for the protocol. Many standards bodies operate entirely in the open,255 however, which can make quietly reporting a vulnerability—or hiding the fact that it has been reported by a law enforcement agency—problematic. In this situation, the choice is simple: report it openly.

Can’t solve info sharing --- legal barriers

Bucci, Ph.D., Rosenzweig and Inserra 13 (Steven P., Paul, and David, April 1, 2013, A Congressional Guide: Seven Steps to U.S. Security, Prosperity, and Freedom in Cyberspace, Heritage Foundation) /AMarb

There are four steps that can be taken to enable and encourage the needed cyber information sharing. First, Congress should remove barriers to voluntary private-sector sharing. Currently, legal ambiguities impede greater collaboration and sharing of information.[14] As a result, nearly every cybersecurity proposal in the last Congress contained provisions for clarifying these ambiguities to allow sharing. The 2011 Cyber Intelligence Sharing and Protection Act (CISPA), the Strengthening and Enhancing Cybersecurity by Using Research, Education, Information, and Technology (SECURE IT) Act of 2012, and the Cyber Security Act (CSA) of 2012 all authorized sharing by stating that “[n]otwithstanding any other provision of law” a private-sector entity can “share” or “disclose” cybersecurity threat information with others in the private sector and with the government.[15] While sharing information is important, all of it should be voluntary, in order to encourage true cooperation. After all, any arrangement that forces a business to share information is, by definition, not cooperation but coercion. Voluntary sharing will also allow organizations with manifest privacy concerns to simply avoid sharing their information, while still receiving helpful information from the government and other organizations. Second, those entities that share information about cyber threats, vulnerabilities, and breaches should have legal protection. The fact that they shared data about an attack, or even a complete breach, with the authorities should never open them up to legal action. This is one of the biggest hindrances to sharing today, as it seems easier and safer to withhold information than to share it, even if it will benefit others. The Information Technology Industry Council (ITIC) provides several examples of how liability concerns block effective information sharing. 
Under current law, “Company A [could] voluntarily report what may be a cybersecurity incident in an information-sharing environment, such as in an ISAC (Information Sharing and Analysis Centers), or directly to the government, such as to the FBI.” The result of such sharing could be that government prosecutors, law enforcement agencies, or civil attorneys use this information as the basis for establishing a violation of civil or criminal law against Company A or a customer, partner, or unaffiliated entity harmed by the incident sues Company A for not informing them of the incident as soon as they were aware of it. Company A’s disclosure can be seen as a “smoking gun” or “paper trail” of when Company A knew about a risk event though Company A did not yet have a legal duty to report the incident. Such allegation could lead to costly litigation or settlement regardless of its validity.[16] With the threat of legal action, businesses have determined that they are better off not sharing information. Strong liability protection is critical to expanding information sharing. Third, the information that is shared must be exempted from FOIA requests and use by regulators. Without such protection, a competitor can get its hands on potentially proprietary information through a FOIA action. Alternatively, if information is shared with a regulator, it will dampen voluntary sharing, since organizations will fear a backlash from regulators, who could use shared information to penalize a regulated party or tighten rules. Once again, the ITIC provides a valuable example. 
If a company shares information on a potential cybersecurity incident and “later finds that a database was compromised that included Individually Identifiable Health Information as defined under the Health Insurance Portability and Accountability Act (HIPAA),” then the Federal Trade Commission could use the shared information “as evidence in a case against [that company] for violating the security provisions of HIPAA.”[17] If shared information is exempted from FOIA and regulatory use, a company can share important data without fear that its competitive advantages will be lost to other firms or used by regulators to impose more rules or costs.[18]

NSA won’t listen to the plan --- circumvention inevitable

Gellman 13 (Barton Gellman writes for the national staff. He has contributed to three Pulitzer Prizes for The Washington Post, most recently the 2014 Pulitzer Prize for Public Service. The Washington Post: “NSA broke privacy rules thousands of times per year, audit finds.” Published August 15th, 2013. Accessed June 29th, 2015.) KalM

The National Security Agency has broken privacy rules or overstepped its legal authority thousands of times each year since Congress granted the agency broad new powers in 2008, according to an internal audit and other top-secret documents. Most of the infractions involve unauthorized surveillance of Americans or foreign intelligence targets in the United States, both of which are restricted by statute and executive order. They range from significant violations of law to typographical errors that resulted in unintended interception of U.S. e-mails and telephone calls. The documents, provided earlier this summer to The Washington Post by former NSA contractor Edward Snowden, include a level of detail and analysis that is not routinely shared with Congress or the special court that oversees surveillance. In one of the documents, agency personnel are instructed to remove details and substitute more generic language in reports to the Justice Department and the Office of the Director of National Intelligence. In one instance, the NSA decided that it need not report the unintended surveillance of Americans. A notable example in 2008 was the interception of a “large number” of calls placed from Washington when a programming error confused the U.S. area code 202 for 20, the international dialing code for Egypt, according to a “quality assurance” review that was not distributed to the NSA’s oversight staff. In another case, the Foreign Intelligence Surveillance Court, which has authority over some NSA operations, did not learn about a new collection method until it had been in operation for many months. The court ruled it unconstitutional.
The Obama administration has provided almost no public information about the NSA’s compliance record. In June, after promising to explain the NSA’s record in “as transparent a way as we possibly can,” Deputy Attorney General James Cole described extensive safeguards and oversight that keep the agency in check. “Every now and then, there may be a mistake,” Cole said in congressional testimony. The NSA audit obtained by The Post, dated May 2012, counted 2,776 incidents in the preceding 12 months of unauthorized collection, storage, access to or distribution of legally protected communications. Most were unintended. Many involved failures of due diligence or violations of standard operating procedure. The most serious incidents included a violation of a court order and unauthorized use of data about more than 3,000 Americans and green-card holders. In a statement in response to questions for this article, the NSA said it attempts to identify problems “at the earliest possible moment, implement mitigation measures wherever possible, and drive the numbers down.” The government was made aware of The Post’s intention to publish the documents that accompany this article online. “We’re a human-run agency operating in a complex environment with a number of different regulatory regimes, so at times we find ourselves on the wrong side of the line,” a senior NSA official said in an interview, speaking with White House permission on the condition of anonymity. “You can look at it as a percentage of our total activity that occurs each day,” he said.
“You look at a number in absolute terms that looks big, and when you look at it in relative terms, it looks a little different.” There is no reliable way to calculate from the number of recorded compliance issues how many Americans have had their communications improperly collected, stored or distributed by the NSA. The causes and severity of NSA infractions vary widely. One in 10 incidents is attributed to a typographical error in which an analyst enters an incorrect query and retrieves data about U.S phone calls or e-mails. But the more serious lapses include unauthorized access to intercepted communications, the distribution of protected content and the use of automated systems without built-in safeguards to prevent unlawful surveillance. The May 2012 audit, intended for the agency’s top leaders, counts only incidents at the NSA’s Fort Meade headquarters and other ­facilities in the Washington area. Three government officials, speak­ing on the condition of anonymity to discuss classified matters, said the number would be substantially higher if it included other NSA operating units and regional collection centers. Senate Intelligence Committee Chairman Dianne Feinstein (D-Calif.), who did not receive a copy of the 2012 audit until The Post asked her staff about it, said in a statement late Thursday that the committee “can and should do more to independently verify that NSA’s operations are appropriate, and its reports of compliance incidents are accurate.” Despite the quadrupling of the NSA’s oversight staff after a series of significant violations in 2009, the rate of infractions increased throughout 2011 and early 2012. An NSA spokesman declined to disclose whether the trend has continued since last year. One major problem is largely unpreventable, the audit says, because current operations rely on technology that cannot quickly determine whether a foreign mobile phone has entered the United States. 
In what appears to be one of the most serious violations, the NSA diverted large volumes of international data passing through fiber-optic cables in the United States into a repository where the material could be stored temporarily for processing and selection. The operation to obtain what the agency called “multiple communications transactions” collected and commingled U.S. and foreign e-mails, according to an article in SSO News, a top-secret internal newsletter of the NSA’s Special Source Operations unit. NSA lawyers told the court that the agency could not practicably filter out the communications of Americans. In October 2011, months after the program got underway, the Foreign Intelligence Surveillance Court ruled that the collection effort was unconstitutional. The court said that the methods used were “deficient on statutory and constitutional grounds,” according to a top-secret summary of the opinion, and it ordered the NSA to comply with standard privacy protections or stop the program.

The plan doesn’t solve basic NSA surveillance --- that makes corporate trust impossible

Kehl 14 (July 2014, Danielle Kehl is a senior policy analyst at New America's Open Technology Institute, where she researches and writes about technology policy, “Surveillance Costs: The NSA’s Impact on the Economy, Internet Freedom & Cybersecurity”)