June 10, 2011
[hyperpublic] The risks and beauty of hyperpublic life
Jeff Jarvis moderates a discussion. “We need principles to inform the architecture” for where we want to go, rather than waiting for terms of service from the sites we use. We need a discussion about terms of service for the Internet. (He’s careful to note that he’s not suggesting there be a single terms of service for the Net.) We all have a stake in the discussion of the public and private, Jeff says. We should be careful about our metaphors, he says, citing Doc Searls’ cautioning against calling it a medium since a medium is something that can be owned and controlled. “It is a platform for publics,” Jeff says.
NOTE: Live-blogging. Getting things wrong. Missing points. Omitting key information. Introducing artificial choppiness. Over-emphasizing small matters. Paraphrasing badly. Not running a spellchecker. Mangling other people’s ideas and words. You are warned, people.
Adam Greenfield, director of Urbanscale, a shop in NYC. He’s interested in how you design public spaces. He wants to push back against the idea of a networked city of the future: we already live in the networked city. Locative and declarative media (e.g., FourSquare) are widely adopted. We live among declarative objects, not just people declaring their position; e.g., Tower Bridge tweets its open and closed states. In Tokyo, the side of a building is a QR code; it is an object with an informational shadow. Objects are increasingly capable of gathering, processing, displaying, transmitting, and/or taking action on info…which implies new modes of surveillance. His contention: tens of millions of people are already living in these conditions, and we therefore need a new theory of networked objects.
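To make “declarative objects” concrete, here’s a minimal sketch (my own illustration, not anything Greenfield showed) of a thing that announces its state changes to the network, the way the Tower Bridge account tweets its openings and closings. The publish transport here is a stand-in for whatever the real object would use (Twitter, MQTT, an HTTP POST):

```python
import json
import time

class DeclarativeObject:
    """A sketch of a 'declarative object': a thing that announces
    its own state to the network when that state changes."""

    def __init__(self, name):
        self.name = name
        self.state = None

    def set_state(self, new_state):
        # Only declare when something actually changed.
        if new_state != self.state:
            self.state = new_state
            self.publish()

    def publish(self):
        # Stand-in for a real transport (Twitter, MQTT, HTTP...).
        print(json.dumps({
            "object": self.name,
            "state": self.state,
            "ts": time.time(),
        }))

bridge = DeclarativeObject("tower_bridge")
bridge.set_state("opening")  # -> {"object": "tower_bridge", "state": "opening", ...}
bridge.set_state("closing")
```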
He offers a taxonomy of what this class of objects implies. He begins with traffic signals controlled by motion detectors. The info is not uploaded to the network, and it has a clear public good. It is prima facie unobjectionable.
Then there is the mildly disruptive and disrespectful object. E.g., a sensor detects someone passing by a billboard that reacts to that presence. There’s no public good here. More concerning is a touch screen vending machine that makes gender assumptions based on interpretation of a camera image. Further, that info is gathered and used.
Another step into the disturbing: advertisers scan faces and make predictive, normative assumptions that they sell to marketers.
But what about when power and knowledge reside in an ensemble of discrete things? E.g., in Barcelona, access to some streets depends on a variety of sensors and signs. It can even reside in code: surveillance camera software gets upgraded without any referendum.
We should be talking about public objects: any discrete object in the common spatial domain intended for the use and enjoyment of the general public [with a couple of refinements that went by too fast]. They should be open (as in open APIs), and their goods should be non-rivalrous and non-excludable. This is great, but we should remember that it increases the “attack surface” for hostile forces. Also, we need to evolve etiquettes and protocols of precedence and deconfliction. We should do this because it moves against the capture of public space by private entities, and it opens up urban resources like we’ve never seen (discoverable, addressable, queryable, and scriptable). The right to the city should be underwritten by the architecture of its infrastructure.
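Greenfield’s criteria — open APIs; discoverable, addressable, queryable, scriptable — suggest something like the following sketch: my own toy illustration, using only the Python standard library, of a traffic signal as a public object exposing read-only state over HTTP. The endpoint path and field names are assumptions, not anything proposed on the panel:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# The public object's current state (a real signal would update this).
SIGNAL = {"id": "signal-42", "location": "5th & Main", "phase": "green"}

class PublicObjectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/state":
            body = json.dumps(SIGNAL).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)  # non-rivalrous: any caller can read it
        else:
            self.send_error(404)

if __name__ == "__main__":
    # Addressable (a URL), queryable (GET /state), scriptable (JSON).
    HTTPServer(("", 8080), PublicObjectHandler).serve_forever()
```

Note how the openness cuts both ways, as Greenfield warns: the same endpoint that makes the signal scriptable for citizens is new attack surface for hostile actors.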
Q: [jeff] Why is the gender-sensing vending machine creepy? Would you be ok if it guessed but let you correct it or ignore it?
A: I’ve been working with informed consent, but I heard this morning that that may not be the best model. We don’t want to over-burden people with alert boxes, etc.
Jeffrey Huang talks about a case study: the design of a new campus in the deserts of Ras Al Khaimah (one of the Emirates). In 2009, the Sheikh agreed to fund the creation of the new campus. Jeff and others were brought in to design the campus, from bare sand up. The initial idea for the project was to avoid creating a typical gated campus, but rather to make it open. They also wanted to avoid the water costs of creating a grass-lawned campus. “The ambition was to grow a campus where it made sense ecologically”: buildings where there are natural cooling winds, etc. They’re designing large, fluid, open spaces, where “seeing and being seen is maximized.” There would be a network of sensors so that the campus would be aware of what’s going on inside, including recognizing where individuals are. People’s profile info could be projected into their shadows. They wonder if they need special places of privacy. “There should be less necessity to design the private if and only if the hyperpublicness is adequately designed.” E.g., if no one owns the data, there’s full transparency about who looks at the data and what’s being captured.
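A rough sketch of the transparency idea Huang describes — every read of the campus sensor data is itself a public record. This is my own illustration; the names and in-memory storage stand in for whatever the real system would use:

```python
import time

SENSOR_DATA = {"room-101": {"occupancy": 3}}
ACCESS_LOG = []  # itself publicly queryable: watching the watchers

def read_sensor(room, requester):
    # Every access leaves a visible trace before any data is returned.
    ACCESS_LOG.append({"who": requester, "what": room, "when": time.time()})
    return SENSOR_DATA.get(room)

read_sensor("room-101", "facilities@campus")
read_sensor("room-101", "security@campus")
print(ACCESS_LOG)  # anyone on campus can audit who has been looking
```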
Betsy Masiello from Google’s policy team gives some informal remarks. To her, a hyperpublic life implies Paris Hilton: Constant streaming, making your behavior available for everyone to see. But, she says, what this panel is really about is a data-driven life. It’s important not to blur the two. There’s public good that comes from big data analysis, and some baseline economic good.
She says she thinks about predictive analytics in two ways. 1. Analysis done to give you predictions about what you might like; it’s clear to the user what’s going on. 2. Predictions based on other people’s behavior, e.g., search, and Adam’s soda machine. Both create value. But what are the risks? The risk is a hyperpublic life: that all this data gets re-identified and attached to your identity. But this misses something…
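A toy illustration (mine, with invented data) of Masiello’s second kind of prediction — inferring what you might like from other people’s behavior rather than your own:

```python
from collections import Counter

# Other users' baskets of services, and mine.
others = [{"search", "maps"}, {"search", "news"}, {"search", "maps"}]
mine = {"search"}

# Recommend whatever co-occurs most with what I already use.
scores = Counter(
    item
    for basket in others
    if mine & basket          # this user overlaps with me...
    for item in basket - mine # ...so count what else they use
)
print(scores.most_common(1))  # [('maps', 2)]
```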
E.g., she came across a 1998 Jonathan Franzen essay, “Imperial Bedroom.” “Without shame there can be no distinction between public and private,” he wrote. She says you can feel shame even if you’re anonymous, but Franzen is basically right. Which brings her to a positive solution. “The design problem is how to construct and identify multiple identities, and construct and manage some degree of anonymity.” It is true that our tech will allow us to identify everyone, but policy could punish companies for doing so. Likewise, there are some policy decisions that would make it easier to maintain multiple identities online.
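One way to picture the design problem Masiello names — multiple identities that can’t be joined up behind your back — is deriving per-context pseudonyms from a single user-held secret. This HMAC construction is my assumption for illustration, not something she proposed:

```python
import hashlib
import hmac

MASTER_KEY = b"user-held secret"  # never leaves the user's device

def context_identity(context: str) -> str:
    """Derive a stable pseudonym for one context; without the key,
    pseudonyms from different contexts can't be linked."""
    return hmac.new(MASTER_KEY, context.encode(), hashlib.sha256).hexdigest()[:16]

print(context_identity("work"))          # stable within its context...
print(context_identity("health-forum"))  # ...but unlinkable across contexts
```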
Jeff: Your fear of re-identification surprises me.
Betsy: The hyperpublic identity is created from the collapse of contexts, without people knowing about it. People don’t know how all their contexts are becoming one. I think people want a separation between data used to personalize an ad and data they are sharing with their friends.
Jeff: This is John Palfrey’s “breakwalls”… But I’d think that Google would want as few restrictions as possible. They create liabilities for Google.
Betsy: That’s the design challenge. Search engines and Japanese soda machines haven’t gotten it right yet.
Jeff: What are the emerging principles? Separating gathering from usage. Control. Transparency…
Adam: I don’t think there’s anonymous data any more.
Betsy: Yes, but could we create it via policy?
Adam: There are some fine uses of predictive analytics. E.g., epidemiology. But not when the police use it to predict crimes.
Jeff: Why not? Ok, we’ll talk later.
Q: What about third party abuse?
Adam: Our principle should be “First, do no harm.”
Huang: It’s a problem often because the systems don’t know enough. Either roll it back, or train it so it can make better distinctions.
Betsy: You can maybe get both. FourSquare is an individual stating her identity. The flip side is anonymous data about locations. That provides tremendous value, and you can do that while protecting identities.
Jeff: But if the rule is “if you can’t protect it, don’t collect it,” then we’ll never collect anything and won’t get any of those benefits.
Q: [latanya] It’s not true that only those with something to hide want to remain anonymous. E.g., if you hide only the positive results of HIV tests, you can infer who has HIV. You have to protect the privacy of those who do not have HIV as well.
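A toy sketch of the inference Sweeney is warning about, with invented data: if only the positive results are withheld, the withholding itself becomes the signal.

```python
records = [("alice", "negative"), ("bob", "positive"), ("carol", "negative")]

# Naive "privacy": suppress only the sensitive (positive) results.
published = [
    (name, result if result == "negative" else "WITHHELD")
    for name, result in records
]

# Anyone reading the published list can now infer who is positive:
print([name for name, result in published if result == "WITHHELD"])  # ['bob']
```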
Jeff: But I got benefit from going public with my prostate cancer.
Latanya: But we live in a world of parallel universes. You got to control which ones knew about your cancer.
Q: [I couldn’t hear it]
Betsy: You don’t need to reveal anything about the individual pieces of data in a big data set in order to learn from it.
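Betsy’s claim here is roughly what differential privacy formalizes. A minimal sketch (my framing, not hers): answer aggregate queries with calibrated noise, so the analyst learns the big-data pattern without any individual record being revealed. The epsilon value is illustrative:

```python
import random

def private_count(records, predicate, epsilon=0.5):
    """Return a noisy count: Laplace(scale=1/epsilon) noise, built as
    the difference of two exponentials, masks any one record."""
    true_count = sum(1 for r in records if predicate(r))
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

checkins = [{"venue": "cafe"}, {"venue": "gym"}, {"venue": "cafe"}]
print(private_count(checkins, lambda r: r["venue"] == "cafe"))
```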
Q: (jennie toomey) There are lots of things we want kept private that have nothing to do with guilt or shame. Much of what we keep private we use to create intimacy.
Betsy: I was quoting Franzen.
Q: Privacy means something different in non-democratic societies.
Adam: We know historically that if info can be used against us, it eventually will be.
Q: Recommended: Solove’s “A Taxonomy of Privacy”
Adam: The info collected by E. Germany was used against people after E. Germany fell.
Jeff: But if we only listen to the fears, we won’t get any of the benefits.