Cultural Analytics: history

September 2005: iGrid conference at Calit2, which includes demos of the EVL LambdaVision display (55 tiled 30-inch LCD screens).

November 2005: Lev Manovich submits grant proposal to ACLS:

From Summary:

I propose to develop a new approach for the study of visual culture, including art, graphic design, vernacular imagery, photography, cinema, and digital media. The idea is to apply the techniques of computer-based data analysis and data display which already have become routine in the sciences – information visualization, image processing, data mining, data clustering, and others - to the ‘data’ of visual culture, i.e. cultural images.

Similar to the scientists who apply these techniques to massive data sets in order to see new patterns, we can analyze the imagery of whole artistic movements and whole historical periods – for instance, all Dutch seventeenth century landscapes, all nineteenth century vernacular photographs available in museum collections, or even – one day – all of twentieth century cinema.

From proposal text:

Scientific communities are spending significant resources today to develop large format displays – such as the EVL LambdaVision display at Calit2 where my new lab is situated... I am convinced that we should follow the practice of scientists to be able to study our own data – cultural images - on very large displays such as LambdaVision. The use of large format displays is especially beneficial when we want to use visualization to look at actual image sets – so that we can visually examine relationships between tens of thousands of images.

April 2007: Calit2 and CRCA provide funding to establish the Software Studies Initiative.

May-June 2007: Responding to a challenge by Larry Smarr to develop new applications for HIPerWall, Manovich and Wardrip-Fruin write the Cultural Analytics white paper, which extends the ideas of Manovich's 2005 ACLS proposal:

"Can we create quantitative measures of cultural innovation? Can we have a real-time detailed map of global cultural production and consumption? Can we visualize flows of cultural ideas, images, and trends? Can we visually represent how cultural and lifestyle preferences – whether for music, forms, designs, or products – gradually change over time?

Today sciences, business, governments and other agencies rely on computer-based analysis and visualization of large data sets and data flows. They employ statistical data analysis, data mining, information visualization, scientific visualization, visual analytics, and simulation. We believe that it is time that we start applying these techniques to cultural data. The large data sets are already here – the result of the digitization efforts by museums, libraries, and companies over the last ten years (think of book scanning by Google and Amazon) and the explosive growth of newly available cultural content on the web."

"Visualizations should be designed to take full advantage of the largest gigapixel wall-size displays available today – that are being constructed at CALIT2."

April 2008: NEH establishes the Office of Digital Humanities and announces the Humanities High-Performance Computing (HHPC) grant program:

"Humanities High-Performance Computing (HHPC) refers to the use of high-performance machines for humanities and social science projects. Currently, only a small number of humanities scholars are taking advantage of high-performance computing. But just as the sciences have, over time, begun to tap the enormous potential of HPC, the humanities are beginning to as well. Humanities scholars often deal with large sets of unstructured data. This might take the form of historical newspapers, books, election data, archaeological fragments, audio or video contents, or a host of others. HHPC offers the humanist opportunities to sort through, mine, and better understand and visualize this data."

November 2008: The Software Studies Initiative is one of three labs awarded an HHPC grant.

April 2008: A group of Calit2 researchers headed by Lev Manovich receives an Interdisciplinary Collaboratory Grant from the UCSD Chancellor's office to develop the Cultural Analytics Research Environment: an open platform for digital humanities research that will support real-time analysis of different types of visual and media data and a variety of visualization and mapping techniques.

The group that submitted the proposal included the following Calit2 researchers:

Lev Manovich (Visual Arts);
Noah Wardrip-Fruin (Communication);
Falko Kuester (Calit2 and Structural Engineering);
Jim Hollan (Cognitive Science).

January 2009: NEH and NSF announce the Digging into Data Challenge. Grant amounts: up to $300,000 USD.

"The Digging into Data Challenge is an international grant competition sponsored by four leading research agencies, the Joint Information Systems Committee (JISC) from the United Kingdom, the National Endowment for the Humanities (NEH) from the United States, the National Science Foundation (NSF) from the United States, and the Social Sciences and Humanities Research Council (SSHRC) from Canada."

"The creation of vast quantities of Internet accessible digital data and the development of techniques for large-scale data analysis and visualization have led to remarkable new discoveries in genetics, astronomy, and other fields, and—importantly—connections between academic disciplinary areas."

"With books, newspapers, journals, films, artworks, and sound recordings being digitized on a massive scale, it is possible to apply data analysis techniques to large collections of diverse cultural heritage resources as well as scientific data. How might these techniques help scholars use these materials to ask new questions about and gain new insights into our world?"