
October 26, 2008

Tweeting museum

Leslie Madsen-Brooks at BlogHer writes about museums using Twitter. It’s a whole lotta links and a whole lotta love, including a link to Beth Kanter’s interview with MuseumTweets (= Amy Fox).

[Tags: museums twitter lesley_madsen_brooks beth_kanter amy_fox everything_is_miscellaneous ]


Categories: Uncategorized Tagged with: culture • digital culture • everythingIsMiscellaneous • knowledge • museums • twitter Date: October 26th, 2008 dw


October 22, 2008

FreeCulture and Open Universities

From the FreeCulture movement has emerged the Open University Campaign based on the new Wheeler Declaration:

An open university is one in which

1. The research the university produces is open access.
2. The course materials are open educational resources.
3. The university embraces free software and open standards.
4. If the university holds patents, it readily licenses them for free software, essential medicines, and the public good.
5. The university network reflects the open nature of the internet.

where “university” includes all parts of the community: students, faculty, administration.

As we used to say: Right on.

[Tags: free_culture open_access open_courseware open_university university education ]


Categories: Uncategorized Tagged with: digital culture • digital rights • education • for_everythingismisc • knowledge • university Date: October 22nd, 2008 dw


September 6, 2008

[ae] James Boyle

James Boyle is chairman of Creative Commons and teaches law at Duke. He’s talking about the nature of openness. [Note: Live blogging. Error prone and error-full.]

We have patterns of behavior that economic theory does not predict. We are risk averse. For example, it makes no sense to buy a warranty; we buy one out of an absurd sense that buying the warranty affects the device’s outcome. There is another kind of bias that we wouldn’t predict from economic theory: a systematic bias against openness. We don’t expect openness and collaboration to generate what they do. We overestimate the risks of openness, underestimate the risks of closed systems, and overestimate closed systems’ benefits.

Suppose in 1990 I came to you with two proposals: build an open system, or build something like Minitel, CompuServe, or AOL, controlled and permission-based. Which would you pick? If you pick the first, you’ll have piracy, spam, massive amounts of crap, flame wars, massive violations of IP, use for immoral purposes. “I think you’d pick network #2,” because those risks are foreseeable, but you couldn’t imagine wikis, blogs, Google Maps, etc. It’s hard for us to imagine the benefits of open systems. It’s not intuitive.

Again, in 1990 you are asked to assemble the greatest encyclopedia, in most languages, updated in real time, written from a neutral point of view. In 1990, you’d say you need maybe a billion dollars, a hierarchical corporation, lots of editors, vetted writers, peer reviewers, and a copyright and trademark on it all to recoup the money you’ve invested. And someone else says, “We’ll have a web site, and people will, like, put stuff up and other people will edit it.” How many of us would have picked #2? We don’t understand openness.

Free software is the same story.

What conclusions should we draw? Some people are raised in places where they learn how to drive in snow and ice. They learn to turn into the skid, contrary to our impulses. We can train ourselves to overcome our biases. But open doesn’t always work. Sometimes we do need closed, controlled. E.g., open won’t get us all the way to a phase 3 drug trial. Open doesn’t always work for privacy. We need a world with both open and closed.

So far, James says, we in the audience agree. Now for some things that will not flatter our sympathies.

He brings up Putnam’s “Bowling Alone,” which describes the loss of civic organizations in America. Putnam notes that in the early 1900s American intellectuals saw that the move to cities had fragmented the old ties. But they didn’t say that history would just automatically correct itself. Instead, they created organizations like the Kiwanis, the Elks, etc. They invented institutions to make up for a problem they saw. Eventually, those institutions worked.

So, if we are bad at judging the boundaries between open and closed, and if it’s important to get them right, then it’s incumbent on us to create the institutions of civil society that enable us to get past our biases. Creative Commons is one such institution. It provides an infrastructure for sharing our work.

Science Commons is another such group. The Web was created to exchange scientific info, but it currently works much better for buying shoes or porn than for that. The vast majority of the scientific literature is behind paywalls. You can find it but not read it. Nor can you build a sort of Google Maps mashup: take all the literature on malaria, find all the geo locations and all the proteins, overlay them, build a wikipedia for science. You can’t do that because it’s illegal and technologically impossible, and even if you could, you couldn’t reassemble it and do a click-and-buy. “The World Wide Web doesn’t work for science.” Science Commons tries to address that…
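
To make Boyle’s mashup example concrete for myself: if the malaria literature were open and carried machine-readable metadata, the overlay he describes would be a small scripting job. Here’s a rough sketch in Python (my own illustration with invented record fields, not anything Science Commons has actually built):

# Sketch: overlay open-access malaria papers on a map, grouped by study site.
# The record fields (title, site, lat, lon, proteins) are invented for illustration;
# the point is that with open metadata this is a trivial mashup, not a legal project.

from collections import defaultdict
import json

papers = [
    {"title": "Artemisinin resistance markers", "site": "Mekong Delta",
     "lat": 10.03, "lon": 105.78, "proteins": ["K13"]},
    {"title": "Vector control field trial", "site": "Mekong Delta",
     "lat": 10.03, "lon": 105.78, "proteins": []},
    {"title": "PfEMP1 variant surface antigens", "site": "Kilifi",
     "lat": -3.63, "lon": 39.85, "proteins": ["PfEMP1"]},
]

def to_geojson(records):
    """Group papers by coordinates and emit one GeoJSON feature per study site."""
    by_site = defaultdict(list)
    for r in records:
        by_site[(r["lat"], r["lon"], r["site"])].append(r)

    features = []
    for (lat, lon, site), recs in by_site.items():
        features.append({
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [lon, lat]},
            "properties": {
                "site": site,
                "papers": [r["title"] for r in recs],
                "proteins": sorted({p for r in recs for p in r["proteins"]}),
            },
        })
    return {"type": "FeatureCollection", "features": features}

if __name__ == "__main__":
    print(json.dumps(to_geojson(papers), indent=2))

The code isn’t the point; the licensing and the lack of shared metadata are.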

Q: Is the bias a metaphor or an inherent inability to understand openness?
A: About 80% is explained by the fact that for most of my generation’s lives, our experience of property was with physical things; if I have it, you can’t. There are economic benefits to knowing who owns it. The closed intuitions generally work there.


[I have to stop to get ready to give my talk …] [Tags: ae08 ars_electronica james_boyle creative_commons copyright ]


Categories: Uncategorized Tagged with: ae08 • conference coverage • copyright • digital culture • everythingIsMiscellaneous • knowledge • libraries • science Date: September 6th, 2008 dw


June 10, 2008

Britannica tweaks the wiki

Britannica has announced that it’s going to enable some measure of reader participation in extending the online version of its encyclopedia. You can see the beta of the new site here.

The detailed overview of the planned site says:

two things we believe distinguish this effort from other projects of online collaboration are (1) the active involvement of the expert contributors with whom we already have relationships; and (2) the fact that all contributions to Encyclopaedia Britannica’s core content will continue to be checked and vetted by our expert editorial staff before they’re published.

Excellent! We need lots of variations on the theme of collaboration. Editing and expertise add value. They also slow things down and reduce the ability to scale, while Wikipedia’s process makes it possible to read an article that’s been altered, if only for a few minutes, by some devilish hand. It all depends on what you’re trying to do, and collectively we’re trying to do everything. So this is good news from Britannica. It’ll be fascinating to watch.
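
For my own amusement, the difference in process fits in a few lines of Python. This is a toy caricature of “vet before publish” versus “publish now, revert later,” not a description of either site’s actual software:

# Toy model of two collaboration pipelines (my own caricature, not either site's code).

from dataclasses import dataclass, field
from typing import List

@dataclass
class Article:
    title: str
    live_text: str
    pending: List[str] = field(default_factory=list)  # edits awaiting expert review

def wiki_style_edit(article: Article, new_text: str) -> None:
    """Publish immediately; bad edits are visible until someone reverts them."""
    article.live_text = new_text

def britannica_style_edit(article: Article, new_text: str) -> None:
    """Queue the edit; nothing changes until an editor approves it."""
    article.pending.append(new_text)

def expert_review(article: Article, approve) -> None:
    """Editors work through the queue; only approved edits go live."""
    for text in article.pending:
        if approve(text):
            article.live_text = text
    article.pending.clear()

a = Article("Malaria", "Stub.")
wiki_style_edit(a, "Malaria is caused by MOON RAYS.")              # live instantly, until reverted
britannica_style_edit(a, "Malaria is caused by Plasmodium parasites.")
expert_review(a, approve=lambda t: "MOON RAYS" not in t)           # slower, but vetted
print(a.live_text)

One pipeline trades speed and scale for vetting; the other trades vetting for speed and scale. That’s the whole argument in miniature.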

To pick a nit, though, I’m less convinced by Britannica’s insistence on objectivity as a value. The blog post says “we believe that the creation and documentation of knowledge is a collaborative process but not a democratic one.” It lists three positive consequences of this. The third is “objectivity, and it requires experts.” In a reference that makes you wish they’d at least once use the word “Wikipedia,” the post continues: “In contrast to our approach, democratic systems settle for something bland and less informative, what is sometimes termed a ‘neutral point of view.'” I think it would be reasonable for Britannica to tell us that an expert-based, edited system is likely to yield articles that are more comprehensive, more uniform in quality, more accurate, and more reliable. But haven’t we gotten past thinking that expertise yields objectivity?

Anyway, I think it’s amazing that the Britannica, in its 240th year, is taking this step. Britannica will be better for it, and so will we. [Tags: britannica wikipedia knowledge everything_is_miscellaneous ]


Categories: Uncategorized Tagged with: britannica • culture • digital culture • education • everythingIsMiscellaneous • folksonomy • knowledge • media • wikipedia Date: June 10th, 2008 dw


Berkman lunch: Anne Balsamo on Designing Culture

Anne Balsamo from U of Southern California and the Annenberg School is giving a Berkman lunchtime talk, called “Designing Culture: The Technological Imagination at Work.” [Live blogging, paraphrasing. And Anne is talking about deep themes. So, these notes will be especially inadequate, as well as getting things wrong, missing stuff, etc.]

Her book touches on technological imagination (how we engage the materiality of the world), technological innovation, and the reworking of culture. She’s particularly interested in the importance of training the technological imagination. Her book discusses designers who explicitly consider culture throughout the design process, and it speculates about what it would take to train imaginations to be as good at creating new cultural possibilities as they are at designing new technologies. This is the responsibility of educators as well as of engineers, etc.

Chapter 1 does some framing. Chapter 2 is called “Gendering the technological imagination,” extending the topic of her previous book. “Technology was always gendered. We just didn’t recognize it as such.” It draws on feminist theories of reproduction, treating all technologies as reproductive. Chapter 3 (“The Performance of Innovation”) draws on her time at PARC designing a museum exhibit on the future of reading. It focuses on how we perform innovations, rather than discover them. Chapter 4 (“Public interactives and technological literacies”) reflects on the literacies a designer must take into consideration when designing interactive pieces, and on how interactives draw upon existing literacies and require new ones for the future. It then looks at the ethics of designing public interactives. Chapter 5 (“Working the Paradigm Shift”) is on the labor of creating this shift. It draws on Henry Jenkins and calls on people to do the hard work of shifting the paradigm: people have to learn how to engage deeply under the hood, as well as do the policy work. Chapter 6 is a coda (“The Work of the Book in a Digital Age”) about why she’s writing a book in the age of the digital. The book is transmedia and includes a multimedia documentary (“Women of the World Talk Back”) she co-authored about 15 years ago, a Web site, and some other pieces. She is also working on a new thesaurus that maps technology as a cultural ensemble.

She talks about working the paradigm shift. We have failed to bridge C.P. Snow’s two cultures. We need to do so through practices. New participants (esp. women) and new commitments. We need to learn to be learners, not to be the smartest person in the world. And we need more collaborative teams and new spaces where people can work together on technological things. We need places that aren’t owned territorially but are places where people can come together from multiple disciplines.

She is working on a new MacArthur project. MacArthur understands that scholarship will be distributed and networked. Part of her new grant is understanding the technology to enable this to happen. Learning is happening in distributed fashion, not in any one place. She is looking at how museums and libraries will function as part of this distributed learning environment. She’s starting with the portfolio of reading devices developed at PARC for the museum exhibit. She is looking at digital learning objects, mixed reality learning environments (body-based, gesture-based), and thinking with objects (DIY … but, Anne asks, as the digital divide persists, will the poor get access only to the virtual while the affluent learn how to solder, weld, and saw…).

Libraries and museums are important for preserving culture and bringing it into new understandings.

She leaves us with the question: What about the future of libraries and museums?

Discussion begins, but I’m not going to try to capture all of it. Here are some random points:

Anne says that we need to be smart about our metadata, recognizing that there is always a narrative there. If we don’t think about this, the semantic web will be stupid.

Anne thinks books will continue to be printed. But libraries may be about more than lending books and CDs/DVDs. They could lend tools, toys… [Great vision!] A library is also a stage where people can perform and participate in their culture. [Tags: berkman ann_balsamo libraries museums culture technology everything_is_miscellaneous ]


Categories: Uncategorized Tagged with: berkman • culture • digital culture • knowledge • libraries • museums • technology Date: June 10th, 2008 dw


June 6, 2008

Open education and Publius

Berkman‘s Publius project keeps rolling along. There’s already lots of excellent stuff there, exploring how the Net is constituting its own governance mechanisms and norms. For example, today Peter Suber and Melissa Hagemann discuss open access, science, research, and education. But you can just browse through the topics and be pretty sure you’ll hit on something well worth reading.

[Tags: berkman publius governance ]


Categories: Uncategorized Tagged with: berkman • digital culture • digital rights • education • everythingIsMiscellaneous • governance • knowledge • publius • science Date: June 6th, 2008 dw


May 21, 2008

Health Commons launched

Science Commons, in its relentless drive for product line expansion (I kid because I love), has posted a white paper proposing a Health Commons. In it, the authors, Marty Tenenbaum and John Wilbanks, lay out the problems and suggest a solution.

They write:

We are no longer asking whether a gene or a molecule is critical to a particular biological process; rather, we are discovering whole networks of molecular and cellular interactions that contribute to disease. And soon, we will have such information about individuals, rather than the population as a whole. Biomedical knowledge is exploding, and yet the system to capture that knowledge and translate it into saving human lives still relies on an antiquated and risky strategy of focusing the vast resources of a few pharmaceutical companies on just a handful of disease targets.

After citing more problems with the current system, the authors propose a Health Commons:

Imagine a virtual marketplace or ecosystem where participants share data, knowledge, materials and services to accelerate research. The components might include databases on the results of chemical assays, toxicity screens, and clinical trials; libraries of drugs and chemical compounds; repositories of biological materials (tissue samples, cell lines, molecules), computational models predicting drug efficacies or side effects, and contract services for high-throughput genomics and proteomics, combinatorial drug screening, animal testing, biostatistics, and more. The resources offered through the Commons might not necessarily be free, though many could be. However, all would be available under standard pre-negotiated terms and conditions and with standardized data formats that eliminate the debilitating delays, legal wrangling and technical incompatibilities that frustrate scientific collaboration today.

The paper emphasizes the need for metadata standards: “Providing such standards, Health Commons improves and extends the public domain by integrating hundreds of public databases into a single framework…” The Commons also provides the needed “social and legal infrastructure,” and a portal that provides the right set of services.
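
To see why those standards matter, imagine two public databases describing the same assay results in different formats; a shared schema is what lets them be queried as one resource. A minimal sketch, with field names I’ve invented (nothing to do with Health Commons’ actual formats):

# Minimal sketch of integrating two differently-formatted public databases
# into one shared record schema. Field names are invented for illustration.

def from_assay_db(row: dict) -> dict:
    """Database A keys results by compound ID and reports IC50 in nanomolar."""
    return {
        "compound": row["compound_id"],
        "target": row["target_protein"],
        "potency_nM": float(row["ic50_nM"]),
        "source": "assay_db",
    }

def from_trial_db(row: dict) -> dict:
    """Database B uses drug names and reports potency in micromolar."""
    return {
        "compound": row["drug_name"],
        "target": row["protein"],
        "potency_nM": float(row["potency_uM"]) * 1000.0,  # convert micromolar to nanomolar
        "source": "trial_db",
    }

records = (
    [from_assay_db(r) for r in [{"compound_id": "CHEM-42", "target_protein": "DHFR", "ic50_nM": "85"}]]
    + [from_trial_db(r) for r in [{"drug_name": "CHEM-42", "protein": "DHFR", "potency_uM": "0.09"}]]
)

# Once normalized, a single query spans both databases.
dhfr_hits = [r for r in records if r["target"] == "DHFR" and r["potency_nM"] < 100]
print(dhfr_hits)

Multiply that by hundreds of databases and thousands of labs, and the value of pre-negotiated terms and common formats becomes obvious.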

They hope that by lowering research costs, some of the 5,000 tropical diseases currently “uneconomical to address,” for example, will become the target of pharmaceutical R&D. “Health Commons makes it cost effective for small groups of researchers to conduct industrial scale R&D on rare diseases by exploiting the economies of scale afforded by an ecosystem of shared knowledge…”

The authors see the benefits going beyond the Commons’ value to non-profits. “Every pharmaceutical company sits on a wealth of promising targets and leads that they won’t develop themselves.”

The Health Commons could be a huge step forward. But it will take some work. “To realize the full potential, existing companies need to rethink their business models to leverage the commons.” As an example, the paper points out that “Only six out of the 1800 biotechnology companies funded since 1980 have made more money than was cumulatively invested in them.” Rather than counting on striking it rich with proprietary drugs discovered via proprietary R&D platforms, perhaps companies could profit by opening up their platforms and taking a cut of any drugs discovered with them.

Finally, Health Commons will provide a way to continuously publish research, along with comments, to supplement the traditional publishing model.

Health Commons can and should be a big deal. It requires lots of pieces coming together over time, but its acknowledgment of the role of profit is encouraging, and it is in the hands of serious, committed, and wickedly smart people. [Tags: health science science_commons health_commons pharma everything_is_miscellaneous ]


Categories: Uncategorized Tagged with: everythingIsMiscellaneous • health • knowledge • metadata • pharma • science Date: May 21st, 2008 dw


May 7, 2008

Harvard Law goes Open Access

The Harvard Law faculty has voted unanimously for an Open Access policy based on the one that the Harvard Faculty of Arts and Sciences passed a few months ago. Yay!

John Palfrey, Harvard Law’s new vice dean for library and information resources (and, of course, the soon-to-be-former exec dir of the Berkman Center) gets to implement this happy policy.

[Tags: open_access harvard libraries john_palfrey ]


Categories: Uncategorized Tagged with: everythingIsMiscellaneous • harvard • john_palfrey • knowledge • libraries • open_access Date: May 7th, 2008 dw


April 7, 2008

Gene Koo on Wikipedia and postmodern truth

Nice post today by Gene Koo about Wikipedia’s view of truth in a postmodern world. A social process replaces the simple one-to-one relationship which we used to think “knowing” was. Something like that.

[Tags: wikipedia knowledge philosophy postmodernism ]


Categories: Uncategorized Tagged with: everythingIsMiscellaneous • knowledge • philosophy • postmodernism • wikipedia Date: April 7th, 2008 dw


April 4, 2008

[topicmaps] Steve Pepper: Everything is a subject

Steve Pepper begins by talking about Vannevar Bush, whose influence on the Web has been profound. Bush was concerned with finding info, says Steve. His aim was to model how we find info on how the human mind works, i.e., by association. But, says Steve, Bush’s memex revolved entirely around documents, which is not how we think. [Caution: Live-blogging!]

Documents are about subjects. Subjects exist as concepts in our brains. They’re connected by a network of associations. Docs are how we happen to capture and communicate ideas. “Hypertext has been barking up the wrong tree” ever since the memex. (Steve then couches this more softly, acknowledging how much he loves the Web, etc.) We should be organizing information around topics/subjects, not around documents.

Why? Because topic maps reflect how we think. That’s why topic maps are ideal for web sites. They’re subject-based and associative. See topicmaps.com.

Steve counters the impression that topic maps are a portal technology. They were invented in 1991, before the Web. They “just turned out to be ideal for the purpose.” Until recently, they were mainly used for portals, but now they’re used increasingly to represent domains of knowledge. TMs are bigger than Topics, Associations, and Occurrences (TAO), for knowledge has a context. The concept of scope enables the expression of contextual validity, enabling multiple viewpoints. This makes topic maps more than a simple semantic tech. Semantics are decontextualized meaning, whereas pragmatics is contextualized meaning. See www.hoyre.no
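
For readers who haven’t played with topic maps, the TAO model is compact enough to sketch in a few lines of Python. This is my own drastically reduced illustration of topics, associations, occurrences, and scope, not the ISO data model:

# A stripped-down illustration of the TAO model: Topics, Associations, Occurrences,
# plus scope for contextual validity. My own simplification, not ISO 13250.

from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Topic:
    identifier: str                                               # subject identifier (URI)
    names: List[Tuple[str, str]] = field(default_factory=list)    # (name, scope)
    occurrences: List[Tuple[str, str]] = field(default_factory=list)  # (type, resource)

@dataclass
class Association:
    kind: str
    roles: List[Tuple[str, Topic]]                                # (role, player)

norway = Topic("http://psi.example.org/norway",
               names=[("Norway", "en"), ("Norge", "no")],         # scope picks the valid name
               occurrences=[("homepage", "http://www.norway.no")])
oslo = Topic("http://psi.example.org/oslo", names=[("Oslo", "en")])

# Associations relate subjects, not documents.
capital_of = Association("capital-of", roles=[("capital", oslo), ("country", norway)])

def name_in_scope(topic: Topic, scope: str) -> str:
    """Scope lets multiple viewpoints coexist: ask for the Norwegian name, get 'Norge'."""
    return next(n for n, s in topic.names if s == scope)

print(name_in_scope(norway, "no"))   # -> Norge

Documents show up only as occurrences hanging off topics, which is the inversion Steve is arguing for.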

Merging “is the single most powerful feature of topic maps.” Merging was the original motivation for topic maps: merging multiple indexes. It enables a “global knowledge federation.” You can arbitrarily merge any two topic maps. That can’t be done with relational databases or XML documents. But how to make it useful? It can’t be done by relying on names, since every subject has multiple names, says Steve. The only solution for computers is identifiers. A topic in a topic map is a symbol that represents something in the real world, says Steve. He quotes the ISO definition: “A subject is any ‘thing’ whatsoever, whether or not it exists or has any other specific characteristics, about which anything whatsoever may be asserted by any means whatsoever.”

Meaning is expressed through the relationship between the representation and that to which it refers. Subject identifiers are central to topic maps. For example, which Steve Pepper wrote the letter of protest to the ISO committee? There’s a Steve Pepper in NJ who has a CD called “The Information Age.” But if you look at the metadata on the PDF of Steve’s letter, there’s a URI that describes Steve. This allows humans to disambiguate. At the moment there’s no good way to register such identities. “PSIs [Published Subject Identifiers] are perhaps not the final answer, but they’re a pretty good stopgap” and can easily be remapped if something else turns out to be the answer.
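
And merging really does fall out of the identifiers. Here’s a toy sketch of merging two maps on shared PSIs, using Steve’s two-Steve-Peppers example (my own simplification, assuming each topic carries a PSI URI; not a conforming topic map merge):

# Toy merge of two topic maps: topics with the same subject identifier (PSI)
# collapse into one, and their names and occurrences are unioned.
# My own sketch of the idea, not a conforming topic-map merge.

def merge(map_a: dict, map_b: dict) -> dict:
    """Each map is {psi_uri: {"names": set, "occurrences": set}}."""
    merged = {}
    for source in (map_a, map_b):
        for psi, topic in source.items():
            slot = merged.setdefault(psi, {"names": set(), "occurrences": set()})
            slot["names"] |= topic["names"]
            slot["occurrences"] |= topic["occurrences"]
    return merged

music_index = {
    "http://psi.example.org/steve-pepper-musician": {
        "names": {"Steve Pepper"},
        "occurrences": {"album: The Information Age"},
    },
}
standards_index = {
    "http://psi.example.org/steve-pepper-topicmaps": {
        "names": {"Steve Pepper"},
        "occurrences": {"letter of protest to ISO"},
    },
}

combined = merge(music_index, standards_index)
# Same name, different identifiers: the two Steve Peppers stay distinct topics.
print(len(combined))   # -> 2

Merging on names would have mashed the two Peppers together; merging on identifiers keeps them apart, which is the whole point.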

Steve ends by asking Microsoft to become more subject-centric. Windows is highly document-centric, he says. He wants a desktop that shows him the subjects and topics he cares about, rather than folders and apps. Although there are some Semantic Web people working on a semantic desktop, Steve thinks Topic Maps is better for human-facing representations of knowledge. Why not have an entire subject-centric operating system, he asks: NLP for categorizing documents, p2p, facilities for merges, etc.

Topic maps started out as a way to merge indexes, Steve says. It turned into a knowledge representation formalism. Now it’s the flag-bearer for subject-centric computing. Subject-centric computing is a paradigm shift, Steve says, comparing it to object-oriented programming, and then to the Copernican revolution. [Tags: topic_maps steve_pepper ]


Categories: Uncategorized Tagged with: conference coverage • everythingIsMiscellaneous • knowledge • metadata Date: April 4th, 2008 dw




