
Elden: Foucault’s Last Decade – Update 13

Stuart Elden posts another update on his book Foucault’s Last Decade.

via Foucault’s Last Decade – Update 13.

Dictionary of Human Geography–new edition

The Dictionary of Human Geography has long been the gold standard for reference works in the field. The first edition was published in 1981, edited by Derek Gregory, Ron Johnston and David M. Smith. In a world of readers, encyclopedias, key thinkers, companions, and handbooks, the Dictionary occupies a special place. In the admittedly less crowded marketplace of the 1980s, it was the one reference book many graduate students would actually pay for. (It’s still the one I recommend to grad students today.)

I’m very pleased therefore to say that I’ve agreed to join the editorial team of a new, revised edition–the 6th. As Derek Gregory recently announced on his blog, the other editors are Clive Barnett, Diana Davis, Geraldine Pratt, Joanne Sharp, and Henry Yeung. Derek remains Editor-in-Chief.

Over the years the Dictionary has changed–most notably it has grown in both size and number of contributors. One of the questions for any book such as this is that of relevance in an era not only of Wikipedia (which I also recommend to students!) but of ebooks (and copies of shadowy legal/moral status thereof–a short Google search will give you the entire 5th edition for free). More substantially, there is the question of content relevance; the last edition came out in 2009 and the next edition wouldn’t be out before 2017. (There was a 5-year gap between the 1st and 2nd editions, then gaps of 8, 6, and 9 years.)

I’m very much looking forward to working on the new edition; we’ll have an opportunity to engage with the exciting developments in critical cartography and the geoweb, among other things. But I’m also interested in seeing how the book can remain relevant in the face of the challenges above (and others no doubt!).

AeroVironment


A company I’ve recently become interested in is AeroVironment.

Their stock has recently recovered from a downturn last year. As of this writing they are trading at $31.16 a share.

They manufacture unmanned aerial vehicles (UAVs), or drones, and have been in business since 1971. Their founder, Dr. Paul MacCready, designed the Gossamer Condor, the aircraft that achieved the first sustained, controlled human-powered flight.

More soon.

Big Data, intel, and finance

Interesting story in the New York Times on how intelligence capabilities are being repurposed for financial risk assessment. It involves people formerly at Palantir, one of the most intriguing and enigmatic intel companies.

There were plenty of parallels between the two worlds, but instead of agencies, spies and eavesdropping satellites, finance has markets, investment advisers and portfolios. Both worlds are full of custom software, making each analysis of a data set unique. It is hard to get a single picture of anything like the truth.

And in much the way Palantir seeks to find common espionage themes, like social connections and bomb-making techniques, among its data sources, Mr. Lonsdale has sought to reduce financial information to a dozen discrete parts, like price changes and what percentage of something a person holds.

Surveillance costs–new study

Shortly after the Edward Snowden revelations began in June 2013, I wrote a commentary for the Society and Space open site on the costs of security.

One of the issues I addressed had to do with the economic and other costs of surveillance:

What does the US actually pay? One attempt at an answer to this surprisingly difficult question was recently provided by the National Priorities Project (NPP). Their estimate was that the US national security budget was $1.2 trillion a year.

A new report by the New America Foundation further explores the costs of surveillance in terms of lost business opportunities for US companies, US foreign policy, and cybersecurity:

  • Direct Economic Costs to U.S. Businesses: American companies have reported declining sales overseas and lost business opportunities, especially as foreign companies turn claims of products that can protect users from NSA spying into a competitive advantage. The cloud computing industry is particularly vulnerable and could lose billions of dollars in the next three to five years as a result of NSA surveillance.
  • Potential Costs to U.S. Businesses and to the Openness of the Internet from the Rise of Data Localization and Data Protection Proposals: New proposals from foreign governments looking to implement data localization requirements or much stronger data protection laws could compound economic losses in the long term. These proposals could also force changes to the architecture of the global network itself, threatening free expression and privacy if they are implemented.
  • Costs to U.S. Foreign Policy: Loss of credibility for the U.S. Internet Freedom agenda, as well as damage to broader bilateral and multilateral relations, threaten U.S. foreign policy interests. Revelations about the extent of NSA surveillance have already colored a number of critical interactions with nations such as Germany and Brazil in the past year.
  • Costs to Cybersecurity: The NSA has done serious damage to Internet security through its weakening of key encryption standards, insertion of surveillance backdoors into widely-used hardware and software products, stockpiling rather than responsibly disclosing information about software security vulnerabilities, and a variety of offensive hacking operations undermining the overall security of the global Internet.

These may end up being upper bounds of the costs (and consequences), but they are very helpful in identifying what is at stake here. I haven’t read the whole report yet, but the executive summary is here (pdf).


The OSS Map Division–new piece in OSS Journal


The Journal of the OSS Society has just published a short piece I wrote on the OSS Map Division. It appears in this issue (2014).

This is one of my most unusual publications. I was asked to do it quite a while ago. I think it has turned out nicely, and they used a most intriguing map from the AGS Library archives that I discovered in 2012 with support from the David Woodward Memorial Fellowship in the History of Cartography. Thanks to Charles Pinck, OSS Society President, for his interest.

Updated 8/5/14 to link to pdf as published.

What is neoliberalism?

robinjames (@doctaj) on neoliberalism:

I want to hone in on one tiny aspect of neoliberalism’s epistemology. As Foucault explains in Birth of Biopolitics, “the essential epistemological transformation of these neoliberal analyses is their claim to change what constituted in fact the object, or domain of objects, the general field of reference of economic analysis” (222). This “field of reference” is whatever phenomena we observe to measure and model “the market.” Instead of analyzing the means of production, making them the object of economic analysis, neoliberalism analyzes the choices capitalists make: “it adopts the task of analyzing a form of human behavior and the internal rationality of this human behavior” (223; emphasis mine). (The important missing assumption here is that for neoliberals, we’re all capitalists, entrepreneurs of ourself, owners of the human capital that resides in our bodies, our social status, etc.) [3] Economic analysis, neoliberalism’s epistemontological foundation, is the attribution of a logos, a logic, a rationality to “free choice.”

I particularly like the way she enrolls Big Data and the algorithmic in her understanding of neoliberalism:

Just as a market can be modeled mathematically, according to various statistical and computational methods, everyone’s behavior can be modeled according to its “internal rationality.” This presumes, of course, that all (freely chosen) behavior, even the most superficially irrational behavior, has a deeper, inner logic. According to neoliberal epistemontology, all genuinely free human behavior “reacts to reality in a non-random way” and “responds systematically to modifications in the variables of the environment” (Foucault, summarizing Becker, 269; emphasis mine).

This approach ties in with what others have been saying for a number of years now about the algorithmic (I’m thinking of the work of Louise Amoore on data derivatives, among others) and the calculative (e.g., Stuart Elden’s readings of Foucault and Heidegger). I’ve just completed a paper on Big Data and the intelligence community that tries to make some of these points, and Agnieszka Leszczynski and I have a CFP out for the Chicago meetings next year which we certainly hope will include these issues.

(Via this excellent piece on NewApps)