Category Archives: GEOINT

Papers from “Where’s the Value? Emerging Digital Economies of Geolocation” session

The written texts from the AAG panel session I co-organized with Agnieszka Leszczynski entitled “Where’s the Value? Emerging Digital Economies of Geolocation” are now available. The panelists were Elvin Wyly (UBC), Rob Kitchin (National University of Ireland at Maynooth), Agnieszka Leszczynski (University of Birmingham) and Julie Cupples (University of Edinburgh).

Two were posted to blogs (linked below) and two are reproduced below. Although I posted links to a couple of these previously, this blog entry collects them all. (Two panelists, Sam Kinsley and David Murakami Wood, were regrettably unable to attend.)

Thanks again to all!

~ ~ ~

Elvin Wyly: “Capitalizing the Records of Life” (see below)

Rob Kitchin: “Towards geographies of and produced by data brokers”

Agnieszka Leszczynski: “What makes location valuable? Geolocation as evidence, meaning, & identity” (see below)

Julie Cupples: “Coloniality, masculinity and big data economies”

And here again is the audio from the session.

~ ~ ~

“Capitalizing the Records of Life”
Elvin Wyly, UBC

Let me begin with a confession.  I did some homework reading the cv’s of my colleagues on this panel, and this is where I found the answer to our central question, “Where’s the value in the emerging digital economies of geolocation?”  I.  Am.  In.  Awe.  It’s here, right here, right now, in the intersecting life-paths of extraordinary human geographers coming together to share the results of labor, creativity, critical insight and commitment.  Julie Cupples’ work on decolonizing education and geographies of media convergence intersects with Agnieszka Leszczynski’s inquiry into the gendered dimensions of the erosion of locational privacy and the “new digital spatial mediations of everyday life,” and Sam Kinsley’s ‘Contagion’ project on the movement of ideas through technologically mediated assemblages of people, devices, and algorithms.  David Murakami Wood’s Smart Cities project and editorial assemblage in the journal Surveillance and Society respond directly to the challenges and opportunities in Rob Kitchin’s (2014) call in The Data Revolution for a “more critical and philosophical framing” of the “ontology, epistemology, ideology, and methodology” of the “assemblage surrounding” the production and deployment of geolocational data.  And many of these connections have been the subject of wise anticipatory reflections on Jeremy Crampton’s Open Geography, where the adjective and the verb of ‘open’ in the New Mappings Collaborative give us a dynamic critical cartography of the overwhelming political and knowledge economies of spatialized information.

As I read the cv’s of my panelists, it became obvious that the value of a record of a life — that’s the Latin curriculum vitae — is the new frontier of what James Blaut (1993) once called The Colonizer’s Model of the World, and what Kinsley (2014) has diagnosed as “the industrial retention of collective life.”  Smart cities, the social graph, the Internet of Things, the Quantified Self, the Zettabyte (2^70 bytes) Age analyzed by Kitchin:  all of this signifies a new quantitative revolution defined by the paradox of life in the age of post-humanist human geography.  In the closing lines of Explanation in Geography, David Harvey announced “by our models they shall know us” — a new generation of human geographers bearing the models and data of modern science; today, it’s the algorithms, models, and corporations that arrive bearing humans — millions and billions of them — whose curricula vitae can be measured, mapped, and monetized at scales that are simultaneously personalized and planetary.  Facebook alone curates more than 64 thousand years of human social relations every day (four-fifths of it on mobile devices and four-fifths of it outside the U.S. and Canada) and LinkedIn CEO Jeffrey Weiner (quoted in MarketWatch, 2015) recently declared, “We want to digitally map the global economy, identifying the connections between people, companies, jobs, skills, higher educational organizations and professional knowledge and allow all forms of capital, intellectual capital, financial capital, and human capital to flow to where [they] can best be leveraged.”

Capitalized curricula vitae, however, are automating and accelerating what Anne Buttimer once called the ‘danse macabre’ of the knowledge economies of spatialized information, because the deceptively friendly concept of ‘human capital’ is in fact a deadly contradiction:  capital is dead labor, the accumulated financial and technological appropriation of surplus value created through human labor, human creativity, and human thought.  Buttimer’s remark about geospatial information being “a chilly recording by a detached observer, a hollow rattle of bones” hurt — because this is what she said in a conversation with the legendary time-geographer Torsten Hägerstrand, who in the 1940s spent years with his wife Britt in the church-register archives of a rural Swedish parish to understand “a human population in its time and space context.”  Here’s what Hägerstrand (2006, p. xi) recalls:

 “[We] worked out the individual biographies of all the many thousands of individuals who had lived in the area over the last hundred years.  We followed them all from year to year, from home to home, and from position to position.  As the data accumulated, we watched the drama of life unfold before our eyes with graphic clarity.  It was something of stark poetry to see the people who lived around us, many of whom we knew, as the tips of stems, endlessly twisting themselves down in the realm of times past.”

Hägerstrand wrote that he was disturbed and alarmed by Buttimer’s words, and I am too, because Allan Pred (2005, p. 328) began his obituary for Hägerstrand by quoting Walter Benjamin, emphasizing that it is not only knowledge or wisdom, but above all real life — “the stuff stories are made of” — which “first assumes transmissible form at the moment of …death.”  But just as “every text has a life history” (Pred, 2005, p. 331) that comes to an end, now Allan Pred’s curriculum vitae has also assumed transmissible form of the market-driven, distorted sort you can track through the evolving Hägerstrandian time-space prisms of the digitized network society.  Hägerstrand is dead, but he has a Google Scholar profile that’s constantly updated by the search robots, and the valorized geolocatable knowledge of his citations puts him in a danse macabre of apocalyptic quantification:  he is “worth” only 1.093 percent of the valorization of another dead curriculum vitae, that of Foucault, who’s also on Google Scholar.  The world is falling in love with geography, but we don’t need more than just a few human geographers to do geography, thanks to the self-replicating algorithms and bots of the corporate cloud of cognitive capital.

The geolocatable knowledge economy is thus a bundle of contradictions and the endgame of the organic composition of human capital.  Human researchers spending years in the archives to build databases are now put into competition with the fractal second derivatives of code:  how do I balance my respect and reverence for our new generation of geographers screen-scraping APIs and coding in R, D3, Python, and Ruby on Rails without giving up what we have learned from the slow, patient, embodied labor of previous generations working by hand?  I see the tips of stems, not just in Hägerstrand’s small Swedish parish, but right here, in this room.  Tips of stems, endlessly twisting down in the realm of times past — but in today’s times where each flower now faces unprecedented competition in every domain:  jobs, research support, academic freedom, human care, human recognition, human attention.  Tips of stems, endlessly twisting through time-spaces of a present suffused with astronomical volumes of geographical data in what the historian George Dyson (2012) calls the “universe of self-replicating code.”  Tips of stems, tracing out an entirely new ontology of socio-spatial sampling theory defined by the automated mashup analytics that now combine Hägerstrand’s time-space diagrams with Heisenberg’s observational uncertainties, Alan Turing’s (1950) ‘universal machine,’ and Foucault’s archaeology of knowledge blended with Marx’s conception of the “general intellect” and Auguste Comte’s notion of the ‘Great Being’ of all the accumulated intergenerational human knowledge, tradition, and custom.
Tips of stems, tracing lifeworlds of a situationist social physics that treats smartphones as “brain extenders” (Kurzweil, 2014) converging into a planetary “hive mind” (Shirky, 2008) while reconfiguring the observational infrastructures and human labor relations of an empiricist hijacking of positivism:  if Chris Anderson (2008) is correct that the petabyte age of data renders the scientific method obsolete, then who needs theory?

We all need theory — we humans.  Theory is the intergenerational inheritance of human inquiry, human thought, and human struggle.  Let me be clear:  I mean no disrespect to the extraordinary achievements of the new generation of data revolutionaries represented by my distinguished panelists, and all of you who can code circles around my pathetic rusty do-loop routines in FORTRAN, Cobol, and SAS.  Tips of stems, twisting themselves down into the realms of human history:  take a look around, at one of the last generations of human geospatial analysts, before we’re all replaced by algorithmic aggregation.  Yesterday’s revolution was humans doing quantification.  Today’s revolution is quantification doing humans.


Anderson, Chris (2008).  “The End of Theory:  The Data Deluge Makes the Scientific Method Obsolete.”  Wired, June 23.

Blaut, James (1993).  The Colonizer’s Model of the World.  New York:  Guilford Press.

Dyson, George (2012).  “A Universe of Self-Replicating Code.”  Edge, March 26.

Hägerstrand, Torsten (2006).  “Foreword.”  In Anne Buttimer and Tom Mels, By Northern Lights:  On the Making of Geography in Sweden.  Aldershot:  Ashgate, xi-xiv.

Kitchin, Rob (2014).  The Data Revolution:  Big Data, Open Data, Data Infrastructures and Their Consequences.  London:  Sage Publications.

Kinsley, Sam (2014).  “Memory Programmes:  The Industrial Retention of Collective Life.”  Cultural Geographies, October.

Kurzweil, Ray (2014).  Comments at ‘Will Innovation Save Us?’ with Richard Florida and Ray Kurzweil.  Vancouver:  Simon Fraser University Public Square, October.

MarketWatch (2015).  “LinkedIn Wants to Map the Global Economy.”  MarketWatch, April 9.

Pred, Allan (2005).  “Hägerstrand Matters:  Life(-path) and Death Matters — Some Touching Remarks.”  Progress in Human Geography 29(3), 328-332.

Shirky, Clay (2008).  Here Comes Everybody:  The Power of Organizing Without Organizations.  New York:  Penguin.

Turing, Alan M. (1950).  “Computing Machinery and Intelligence.”  Mind 59(236), 433-460.


~ ~ ~

“What makes location valuable? Geolocation as evidence, meaning, & identity”
Agnieszka Leszczynski, University of Birmingham

I want to invert the question that Jeremy and I posed to the panel when organizing this session: rather than asking ‘where is the value’ in geolocation, what is it that makes geolocation valuable? If particular kinds of economies are emerging around location, it is because geolocation itself is somehow intrinsically valuable, and I’d like to make some preliminary propositions to this end.

Over the last few years I have been particularly interested in the ways in which emergent surveillance practices of the security agencies, made broadly known to us through the as yet still-unfolding Snowden revelations, are crystallizing around big data – its collection, mining, interception, aggregation, and analytics. Specifically, I’m interested in the ways in which locational data is figuring as central within these emergent regimes of dataveillance. Indeed at the close of 2013, Barton Gellman and Ashkan Soltani, reporting in the Washington Post, identified at least ten American signals intelligence programmes, or SIGADs, that explicitly sweep up locational data – i.e., where location data is the target or object of data capture, interception, and aggregation.

  • Under a SIGAD designated HAPPYFOOT, the NSA taps directly into mobile app data traffic that streams smartphone locations – often unencrypted and in the clear – to location-based advertising networks organized around the delivery of proximately relevant mobile ads. This locational data, which is often determined through mobile device GPS capabilities, is far higher-resolution than network location, allowing the NSA “to map Internet addresses to physical locations more precisely than is possible with traditional Internet geolocation services”;
  • Documents dating from 2010 reveal that the NSA and GCHQ exploit weaknesses in ‘leaky’ mobile social and gaming applications that veil secondary data mining operations behind primary interfaces, piggybacking off of commercial data collection by siphoning up personal information, including location, under a signals intelligence program code-named ‘TRACKER SMURF’ after the children’s animated classic;
  • In perhaps the most widely publicized example, the NSA collects over 5 billion cell phone location records off of cell towers worldwide, bulk processing this location data through an analytics suite code-named CO-TRAVELER, which looks to identify new targets for surveillance on the basis of parallel movement with existing targets of surveillance – i.e., individuals whose cell phones ping off of the same cell towers in the same succession at the same time as individuals already under surveillance;
  • Just a few months ago, it was leaked that the CSE (Communications Security Establishment), Canada’s version of the NSA, was tracking domestic as well as foreign travelers via Wi-Fi at major Canadian airports for up to two weeks as they transited through the airports and subsequently through other ‘nodes’ – including other domestic and international airports, urban Wi-Fi hotspots, transport hubs, major hotels and conference centers, and even public libraries – both within Canada and beyond, in a pilot project for the NSA;
  • And, most recently, under a SIGAD code-named LEVITATION, the CSE has been shown to be intercepting data cable traffic to monitor up to 15 million file downloads a day. Particularly significant in the leaked CSE document detailing this programme is that the CSE explicitly states that it is looking to location data to improve LEVITATION’s capabilities for intercepting both GPS waypoints and “[d]evices close to places,” so as to further isolate and develop surveillance targets, including those carrying and using devices within proximity of designated locations.
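The co-movement logic attributed to CO-TRAVELER above – flagging devices that ping off the same cell towers in the same succession at the same times as an existing target – can be sketched in a few lines. To be clear, this is a toy illustration of the general idea only: the data layout, the `co_travelers` function, and the `min_shared` threshold are all invented for the example, not details of the actual program.

```python
from collections import defaultdict

def co_travelers(pings, target, min_shared=3):
    """Flag devices whose (tower, hour) sequence overlaps a target's.

    pings: list of (device_id, tower_id, hour) tuples -- a hypothetical
    simplification of bulk cell-tower location records.
    """
    # Index which devices were seen in each (tower, hour) cell.
    seen = defaultdict(set)
    for device, tower, hour in pings:
        seen[(tower, hour)].add(device)

    # Count how often each other device co-occurs with the target.
    counts = defaultdict(int)
    for cell, devices in seen.items():
        if target in devices:
            for device in devices - {target}:
                counts[device] += 1

    # Devices sharing at least `min_shared` tower-hours are flagged.
    return {d for d, n in counts.items() if n >= min_shared}

pings = [
    ("target", "A", 9), ("target", "B", 10), ("target", "C", 11),
    ("phoneX", "A", 9), ("phoneX", "B", 10), ("phoneX", "C", 11),
    ("phoneY", "A", 9),
]
print(co_travelers(pings, "target"))  # {'phoneX'}
```

The point of the sketch is how little is needed: a device becomes a surveillance candidate purely because its movement pattern parallels someone else’s, with no other information about its owner.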

So the question is, why geolocation? Why is it of such great interest to the security agencies? And here I want to argue that it is of interest because it is inherently valuable, and uniquely so amongst other forms of personally identifiable information (PII). This value is latent in the spatio-temporal and spatial-relational nature of geolocation data.

  • the spatio-temporal nature of many spatial big data productions means that it may be enrolled as definitive evidence of our complicity or involvement in particular kinds of socially disruptive events or emergencies by virtue of our presence – or, as in the case of CO-TRAVELER, co-presence and co-movement – in particular spaces at particular times;
  • furthermore, longitudinal retention of highly precise, time-stamped geolocational data traces allows for the reconstruction of detailed individual spatial histories, which, like the CO-TRAVELER example, similarly participate within what Kate Crawford has recently characterized as emergent truth economies of big data, in which data is truth;
  • the relational nature of spatial big data productions, in which our data may be used to discern our religious, ethnic, political and other kinds of personal affiliations and identities on the basis of the kinds of places that we visit and the ability to establish linkages with other PII across data flows;
  • and, in this vein, the ways in which locations are inherently meaningful – they may be as revealing of highly sensitive information about ourselves as our DNA. For instance, the specialty of a medical office that we visit may reveal that we have a degenerative genetic disease, and the nature of that disease – information that we otherwise socially understand as some of the most private information about ourselves;
  • and, of course, the ways in which location is not only revealing of identity positions but is itself identity – for example, a group of researchers determined that unique individuals could be identified from the spatial metadata of only four cell phone calls at a very high confidence level.
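The identifiability point in that last bullet – that a handful of spatio-temporal points can single out one person – can be illustrated with a toy sketch. The `matching_devices` function, the data layout, and the place names below are all invented for illustration; they are not the researchers’ actual dataset or method.

```python
def matching_devices(traces, points):
    """Return the device ids whose trace contains all the given
    (place, hour) points.

    The fewer devices that match a set of points, the more identifying
    those points are: once only one device matches, the points ARE an
    identity.
    """
    return {d for d, trace in traces.items() if set(points) <= trace}

# Hypothetical location traces for two devices.
traces = {
    "alice": {("cafe", 9), ("office", 10), ("gym", 18), ("home", 22)},
    "bob":   {("cafe", 9), ("office", 10), ("home", 22), ("bar", 20)},
}

# One observed point is ambiguous; adding a second already
# narrows the match to a single individual.
print(matching_devices(traces, [("cafe", 9)]))               # both devices match
print(matching_devices(traces, [("cafe", 9), ("gym", 18)]))  # {'alice'}
```

In a real mobility dataset the traces are far richer, which is why so few points suffice: spatial behavior is highly distinctive, so small combinations of places and times are effectively unique signatures.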

So in asking where the value is in geolocation, my take is that it is valuable – both to the intelligence apparatuses that I have highlighted here and to corporate entities – because it is uniquely sensitive – revealing and identifying – amongst other forms of PII.

~ ~ ~

TASC to build app store for spy agency

Tim Shorrock (author of Spies for Hire) with news of a new geospatial intelligence contract for the NGA, worth $25m.

NGA has plan for total “Map of the World”

John Goolgasian, NGA

According to the NGA, one of the most popular sessions at the recent GEOINT 2013* conference (held over from 2013) was one which offered a total “Map of the World”:

But what is it?

Map of the World is the foundation for intelligence integration, said NGA Director Letitia A. Long in her keynote address at the four-day event.

The clue lies in this statement:

Twelve different data views will make up Map of the World and nine of them are online now, including maritime and aeronautical.

This, along with Goolgasian’s involvement, indicates that it is probably related to, or draws from, the work of the World-Wide Human Geography Database Working Group (WWHGD). I’ve written about Goolgasian on this blog before.

The WWHGD is a government-private contractor group (Booz Allen Hamilton provides the contact points and presumably runs it) that is seeking to:

The WWHGD Working Group is designed to build voluntary partnerships around human geography data and mapping focused on the general principle of making appropriate information available at the appropriate scales to promote human security. This involves a voluntary “whole-of-governments” national and international approach to create a human geography data framework that can leverage ongoing efforts around the world to identify, capture, build, share, and disseminate the best available structured and unstructured foundation data.

Here are the data they’re looking at in these layers:

The inclusion of things like land ownership maps directly onto the arguments of Geoffrey Demarest, who was a key player in the Bowman Expeditions. You can judge for yourselves about the set of information here. Personally I think it’s way too rigid and ahistorical (what about a history of foreign intervention in an area, or standards of living and well-being?).

But even beyond that it reflects a belief in the efficacy of totalizing indexes. We heard something about this at the AAG, and Brad Evans and Julian Reid have a discussion about it in their new book Resilient Life.

The article continues:

“Through a single point on the Earth, the Map of the World will present an integrated view of collection assets from across the community, mapping information for military operations, GEOINT observations, and NGA analytic products, data and models,” said Goolgasian.

Worth keeping an eye on.

Contractor receives $400K in federal funds for automatic license plate reading

According to reporting by Bloomberg News, the IRS, the Forest Service and the U.S. Air Force’s Air Combat Command have awarded a contractor over $400,000 in contracts for its automated license plate recognition (ALPR) system since 2009.

It’s not clear if the contracts to Vigilant Solutions are ongoing, given that Homeland Security dropped similar plans in February of this year following widespread opposition from civil liberties groups.

“Especially with the IRS, I don’t know why these agencies are getting access to this kind of information,” said Jennifer Lynch, a senior staff attorney with the Electronic Frontier Foundation, a San Francisco-based privacy-rights group. “These systems treat every single person in an area as if they’re under investigation for a crime — that is not the way our criminal justice system was set up or the way things work in a democratic society.”

Other countries (including the UK) have long had such systems in place.

If you go to the Vigilant website they have a long complaining blog post about the lies and distortions by civil liberties groups:

License plate readers are under siege nationwide, thanks to a well-funded, well-coordinated campaign launched by civil liberties groups seeking to take advantage of the growing national debate over surveillance. 

Unfortunately, the campaign led by the American Civil Liberties Union (ACLU) has deliberately clouded and even omitted those facts.

According to this article, Vigilant actually successfully used the First Amendment to overturn a Utah law restricting license plate recognition:

Vigilant Solutions and DRN [Digital Recognition Network] sued the state of Utah on constitutional grounds, arguing that the law infringed on the First Amendment right to take photographs of public images in public places, a right that everyone in Utah shares.

The law was overturned, but Vigilant complains that state agencies were then barred from using any of the data collected, impacting their profits. They also complain about data retention limits.

What’s also interesting about companies such as this is that they illustrate the argument for understanding policing and military together (see this blog post by Derek Gregory for example).

DNI Clapper: press coverage “inaccurate”

DNI Clapper labeled press coverage of the Snowden affair as “inaccurate, misleading and incomplete” at the GEOINT 2013* meeting today.

He also repeated his position that Snowden is not a whistleblower.

Our new paper on intelligence now online


Very excited to announce our paper “The New Political Economy of Geographical Intelligence” is now online at the publisher’s website for the Annals of the Association of American Geographers.

The publishers have provided a link for free access to the first 50 people (click here for free access)! (Edit: these have unfortunately all been claimed)

The regular link, which will remain after those free accesses are used up, is this one.


A troubling new political economy of geographical intelligence has emerged in the United States over the last two decades. The contours of this new political economy are difficult to identify due to official policies keeping much relevant information secret. The U.S. intelligence community increasingly relies on private corporations, working as contractors, to undertake intelligence work, including geographical intelligence (formally known as GEOINT). In this article we first describe the geographical intelligence “contracting nexus” consisting of tens of thousands of companies (including those in the geographical information systems and mapping sector), universities and nonprofits receiving Department of Defense and intelligence agency funding. Second, we discuss the “knowledge nexus” to conceptualize how geographical knowledge figures in current U.S. intelligence efforts, themselves part of the U.S. war on terror and counterinsurgency (COIN). To analyze the contracting nexus we compiled and examined extensive data on military and intelligence contracts, especially those contracts awarded by the country’s premier geographical intelligence agency, the National Geospatial-Intelligence Agency (NGA), for satellite data. To analyze the knowledge nexus we examined recent changes in the type of geographical knowledges enrolled in and produced by the U.S. intelligence community. We note a shift from an emphasis on areal and cultural expertise to a focus on calculative predictive spatial analysis in geographical intelligence. Due to a lack of public oversight and accountability, the new political economy of geographical intelligence is not easy to research, yet there are reasons to be troubled by it and the violent surveillant state it supports.

Key Words: geographical intelligence, geographical knowledge, GEOINT, government contracting, National Geospatial-Intelligence Agency.

Annals paper on Geographical Intelligence

I’m very excited to say that our (myself, Sue Roberts and Ate Poorthuis) paper for the Annals of the Association of American Geographers is now at the proofs stage. The first page is below. I believe it will be out in an early 2014 issue.

Pages from Proofs

PDF of first page

Who are Booz Allen Hamilton?

Edward Snowden, the 29-year-old at the center of the NSA revelations, worked at a company called Booz Allen Hamilton (“Booz Allen”), just before leaving for Hong Kong. Who are they?

Booz Allen are a major, major intelligence contractor. Put another way, they’ve taken at least $31 billion from the government (at least $19.6B from the Department of Defense, making them the 25th largest defense contractor). Their market cap is $2.51B. According to their financials, they earn more than 98% of their revenue from the government.

Here are a few links to stories about them of relevance. Wikipedia will give the general background, but I want here to highlight a couple of other sources you might overlook.

The Guardian has a nice run-down on them, pointing out links to DNI Clapper, Mike McConnell (Bush administration DNI) and James Woolsey.

The investigative journalist Tim Shorrock, author of Spies for Hire, has done the most reporting on them. It doesn’t take much when looking at them to find significant links to academic geography, GIS and GEOINT.

2007: Booz Allen Hamilton, Mike McConnell. Shorrock on Democracy Now.

2010: The corporate intelligence community. A photo tour (Shorrock). Northern Virginia, the epicenter of intel contracting.

Joan Dempsey, seen here at one of the many GEOINT conferences for which she has been MC. Dempsey is an Executive Vice President of Booz Allen, and a Board member of the USGIF, the US Geospatial Intelligence Foundation.

The USGIF is the organization behind the annual GEOINT conferences. Although a non-profit, it reported assets of around $5.1m in FY2011. It is mostly run by defense contractors, academics with intel and GEOINT interests (e.g., the Chair of the GMU Geography department, Dr. Peggy Agouris) or other interested GIS experts (including Mike Goodchild, ahem, and Jack Dangermond, CEO of Esri). Even the intel journalist Matthew Aid, who usually takes a fairly mainstream view of the IC, recently remarked that the USGIF had entered into a “sweetheart” contract with the NGA.

Update. The Wall Street Journal has a new piece on Booz Allen including the following nugget: “25,000 people, 76% of whom have government security clearances allowing them to handle sensitive national security information.” This includes 27% at one of the toppest of top levels, Top Secret//SCI. It’s worth studying official figures on security clearances, given here, which indicate 1.4 million people hold a “Top Secret” clearance. Looked at one way, that’s 1.4 million chances of a leak. Just in case you were wondering how a 29-year-old defense contractor got hold of such sensitive documents.

New information on Prism

I didn’t say much about Prism in my post yesterday as it didn’t seem quite as clear as the Verizon court order. (Compare the two here.) Additionally, the complete slideset was not posted by the Guardian, unlike the Verizon court order. We now have some additional information. (Update: The Guardian has now published a single additional slide.)

First, the program obviously exists. See this job ad requiring expertise in it, this datasheet from Cryptome indicating its use since 2003, and this senior intel officer’s online resume at LinkedIn mentioning Prism expertise.

I did think it odd that it was only funded at $20m. My guess right now based on additional reporting by Declan McCullagh, Chief Political Correspondent at CNET, is that it is software that facilitates data extraction/interface with the named companies. Additionally, Marc Ambinder, who I mentioned in my post, says “PRISM is a kick-ass GUI that allows an analyst to look at, collate, monitor, and cross-check different data types provided to the NSA from internet companies located inside the United States.”

It evidently works within the law but, if we accept tech company pronouncements, does not provide the sort of continuous “direct access” to company servers that has been discussed. The “fact of” Prism’s existence is not classified, but what it does is. McCullagh’s argument that “Prism is an unclassified web tool” is completely misleading.

Nevertheless, these are really technical clarifications. The main points remain, I think:

1. Tech companies work with the government/NSA within the law to provide user data. We should still be concerned, even if this is just one small part of US surveillance. Most immediately, we need to rethink the law, especially FISA and the Patriot Act. Do not pay attention to tech company pronouncements that they operate within the law. No one said otherwise. But that’s the problem.

2. The government can obtain access to user records from these companies. Saying that it is overseen by the FISA Court is irrelevant–who’s going to appeal? The Court’s deliberations are secret. And if you did appeal, good luck: the Supreme Court recently refused to hear an appeal by Amnesty International because they “lack standing,” i.e., they don’t know for a fact that they were affected by the law. And as McCullagh concedes, “How much oversight and review the Foreign Intelligence Surveillance Court actually provides is less than clear.”

3. The amount of data collected is still considerable. Consider this scenario laid out by Ambinder:

Under the FISA Amendments Act of 2008, the NSA and the attorney general apply for an order allowing them to access a slice of the stuff that a company like Facebook keeps on its servers. Maybe this order is for all Facebook accounts opened up in Abbottabad, Pakistan. Maybe there are 50 of them. Facebook gets this order.

Now, these accounts are being updated in real-time. So Facebook somehow creates a mirror of the slice of stuff that only the NSA can access. The selected/court-ordered accounts are updated in real-time on both the Facebook server and the mirrored server. PRISM is the tool that puts this all together. Facebook has no idea what the NSA is doing with the data, and the NSA doesn’t tell them.

The companies came online at different points, according to the documents we’ve seen, maybe because some of them were reluctant to provide their data and others had to find a way to standardize their data in a way that PRISM could understand. Alternatively, perhaps PRISM updates itself regularly and is able to accept more and more types of inputs.

What makes PRISM interesting to us is that it seems to be the ONLY system that the NSA uses to collect/analyze non-telephonic non-analog data stored on American servers but updated and controlled and “owned” by users overseas. It is a domestic collection platform USED for foreign intelligence collection. It is of course hard to view a Facebook account in isolation and not incidentally come into contact with an account that is owned by an American. I assume that a bunch of us have Pakistani Facebook friends. If the NSA is collecting on that account, and I were to initiate a Facebook chat, the NSA would suck up my chat. Supposedly, the PRISM system would flag this as an incidental overcollect and delete it from the analyst’s workspace. Because the internet is a really complicated series of tubes, though, this doesn’t always happen. And so the analyst must sometimes “physically” segregate the U.S. person’s data.

The top 3 myths about the recent surveillance revelations

The recent, and still ongoing, revelations in the Guardian by its columnist Glenn Greenwald and his colleagues have already given rise to a number of dismissive myths.

Here are three of them, and my responses.

1. “It’s nothing new. We’ve known about this for a long time.”

For example, Senator Chambliss, ranking member of the Senate Intelligence Committee: “Everyone’s been aware of it for years.”

This is a common human reaction to any information presented as important. It's healthy and reflects a critical attitude. You may remember similar responses to the WikiLeaks cables, which nonetheless turned out to be incredibly useful. So it's worth distinguishing what is new here from what we already knew. (And there is a difference between "known" in the sense of an undisputed fact and merely "suspected.")

In 2005 the New York Times revealed (after sitting on the story until George Bush was re-elected) that the NSA had been performing "warrantless wiretaps" under a program known as "Stellar Wind." The story was reported by James Risen and Eric Lichtblau (see the super-useful EFF timeline here), who later won a Pulitzer Prize for their reporting. This was, and remains, genuinely new information, not least because it was not a rogue operation but was carried out under the full direction of the Bush White House. It was a central plank in liberals' opposition to Bush's war on terror as it applied domestically (the Iraq war was the other, as it applied overseas). Risen was subpoenaed twice by the government as part of its still-ongoing investigation into one of his alleged sources (Jeffrey Sterling, a former CIA employee) for a separate story (see case files here).

When President Obama took office he reportedly shut down this program. But note that it concerned "warrantless" wiretapping, or interception. What if you could get access without needing a warrant, and do so legally? This is, in large part, what is significant about the recent revelations. Yes, Sens. Udall and Wyden have been trying to put information about this on the public record, as the latter noted on Twitter yesterday.

But now this is confirmed by the Guardian’s Verizon story, rather than hinted or speculated at. So what is new is that we now know that:

First, the Guardian published an actual FISA Court order, something never before made public. The order revealed that the US is collecting information (specifically, metadata) on all communications by customers, both foreign and domestic, of the country's biggest telecom provider: specifically, Verizon's business customers. Senator Feinstein, who chairs the Senate Intelligence Committee, said in a press conference on June 6, 2013, that as far as she was aware this was a routine three-month extension of a program going back to at least 2006.

It was previously speculated or thought that this was going on (e.g., see this USA Today story from 2006). But now we know.

As recently as March 2013, Director of National Intelligence James Clapper, when asked directly whether the US was collecting information on millions of Americans, said "no." Glenn Greenwald called this outright a "lie" on "Democracy Now" this morning.

Second, the Guardian and the Washington Post both revealed the existence of another program, known as Prism, that collects the actual content of communications from Yahoo, Google, Apple, Facebook and so on, of people (including Americans) overseas. According to the document, which the Guardian has authenticated, the NSA has had “direct access” into the servers of these companies on an ongoing basis since 2007.

2. “It’s just metadata, not content.”

This is a serious misunderstanding. The secret FISA court order published by the Guardian gives the FBI and the NSA access to all "transactional metadata," which defenders of the program immediately characterized as akin to reading the outside of an envelope rather than the letter inside. But to conclude that your personal privacy has not been violated is to be ignorant of what you can do with metadata. Note that the metadata include phone numbers, location, length of the call, and who called whom. From this, it is easy enough to build a pretty complete picture of what's going on (and it may therefore be even more valuable than the actual content!). After all, according to the Wall Street Journal, it was metadata that revealed former CIA Director Gen. David Petraeus's affair with his mistress, Paula Broadwell. Investigators were able to note her location and contacts in order to build a case against her before reading any of her messages' content. Investigators then used the metadata as probable cause to obtain a warrant to read her emails, which led them to Petraeus. It is in the nature of "big data" that it can be extensively mined for significant patterns and findings, and can be leveraged against ancillary data (Crampton et al. 2013).
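To make the "just metadata" point concrete, here is a toy sketch in Python. The records, names, and tower IDs are all invented for illustration, but the analysis, tallying who talks to whom, when, for how long, and from where, is exactly the kind of inference call-detail records support without any message content at all:

```python
from collections import defaultdict

# Hypothetical call-detail records: (caller, callee, timestamp, seconds, cell tower).
# There is no message content here -- this is "just metadata."
records = [
    ("alice", "bob",   "2013-06-01T09:05", 120, "tower-17"),
    ("alice", "bob",   "2013-06-01T22:40", 900, "tower-17"),
    ("alice", "carol", "2013-06-02T10:15",  60, "tower-03"),
    ("alice", "bob",   "2013-06-03T23:10", 780, "tower-17"),
]

# Total talk time per contact for one subject.
contact_time = defaultdict(int)
for caller, callee, ts, secs, tower in records:
    if caller == "alice":
        contact_time[callee] += secs

# The heaviest contact, the late-night call times, and the recurring tower
# already sketch a relationship and a location pattern -- no content required.
top_contact = max(contact_time, key=contact_time.get)
print(top_contact, contact_time[top_contact])  # bob 1800
```

Scaled up to millions of subscribers, the same tallies become a social graph with timestamps and locations attached, which is why "it's only the envelope" understates what the envelope reveals.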

Locational metadata is by itself a critical window into activity. Indeed, there is a whole field of intelligence analysis known as "Activity-Based Intelligence" (ABI), a key part of intelligence, including geographical intelligence (GEOINT), that relies on geolocational data. A recent paper (pdf) by a joint team of investigators from MIT, Harvard and Louvain showed that they could uniquely identify an individual 95 percent of the time in a large, anonymized dataset knowing just four pieces of spatio-temporal metadata. So if I know where you are at just four points in time, I can almost certainly identify you uniquely even if personal identifiers are stripped (and in the Verizon order they are not). Then I can track you, see who you interact with and for how long, and build a pretty good picture that will at least get me a subpoena (which, remember, requires less evidence than a warrant).
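The uniqueness result can be illustrated with a miniature version of the same idea. The traces below are invented, and real mobility datasets are vastly larger, but the mechanism is the same: four known (place, hour) observations about a target are matched against anonymized traces, and typically only one trace is consistent with all four:

```python
# Anonymized mobility traces: pseudonymous user -> set of (cell tower, hour) points.
# All data here are invented for illustration.
traces = {
    "user-001": {("tower-17", 9), ("tower-17", 22), ("tower-03", 10),
                 ("tower-17", 23), ("tower-05", 8)},
    "user-002": {("tower-17", 9), ("tower-04", 13), ("tower-03", 10),
                 ("tower-09", 18)},
    "user-003": {("tower-02", 9), ("tower-17", 22), ("tower-06", 11),
                 ("tower-17", 23)},
}

# Four outside observations of the target (e.g., from a receipt, a check-in,
# a geotagged tweet, a credit-card swipe).
observed = {("tower-17", 9), ("tower-17", 22), ("tower-03", 10), ("tower-17", 23)}

# Which "anonymous" traces contain all four observed points?
matches = [uid for uid, trace in traces.items() if observed <= trace]
print(matches)  # ['user-001']
```

Stripping the name from a trace does little when the trace itself is this distinctive; that is the sense in which four points of locational metadata re-identify almost everyone.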

Also note that under US law metadata are deemed to have been "given" by you to a third party, and so are not subject to a warrant based on probable cause (per the Fourth Amendment) but only to a subpoena, which is much easier to obtain.

3. “The leaks (and the leakers) threaten legal, approved measures that are designed to ensure the safety of Americans. We should prosecute/investigate/stop leakers.”

For example, during the same press conference yesterday, Sen. Feinstein was asked if the Verizon leak should be investigated. She replied “Yes, I think so” (video).

There are several points to be made here. First, it is part of the problem, not the solution, that these programs (Verizon and PRISM, as well as others we sometimes hear about, such as "Ragtime," a codename revealed in Marc Ambinder's book Deep State) operate within the law. It indicates that the laws themselves are wrong, overbroad, and unconstitutional. This includes the Patriot Act.

Second, to say that "Congress is fully briefed," as both President Obama and Sen. Feinstein did, is both irrelevant and untrue. Only a very small group of legislators (typically the "Gang of Four" [CRS pdf] or the "Gang of Eight" [CRS pdf]) get anything like regular national security/intelligence briefings (there was a separate one yesterday, for 27 interested Senators). But since they cannot tell the public what is going on, and the Intelligence Committees rarely hold publicly accessible meetings, this is not much good to US citizens, nor even to the other Senators and Representatives not included.

As to the point that these leaks damage operational programs and even cost lives, and that we therefore need to investigate and prosecute leakers: first, there is a deficit of publicly available information that would provide any basis for a conversation about these matters. Second, one fully briefed member of the intelligence committee, Senator Ron Wyden, said yesterday of this blanket surveillance: "Based on several years of oversight, I believe that its value and effectiveness remain unclear."

Third, investigating leakers is not only wrong but counter-productive. These "leakers" are not acting for financial gain (just think how much money Bradley Manning could have made, or how much Thomas Drake has lost) but as whistleblowers. Whistleblowing, which candidate Obama praised in 2008, is an act carried out to alert the public to government waste, inefficiency, or malfeasance. Prosecuting these whistleblowers, especially under the World War I-era Espionage Act (a favorite of the Obama administration), will deter future whistleblowers and hence undermine the public's ability to know about government waste, fraud and mismanagement.

These are my top 3 myths. There are others, and feel free to add your own.