Book: Remix Theory: The Aesthetics of Sampling


This book is scheduled to be released in the next few weeks.
Remix Theory: The Aesthetics of Sampling is an analysis of Remix in art, music, and new media. Eduardo Navas argues that Remix, as a form of discourse, affects culture in ways that go beyond the basic recombination of material. His investigation locates the roots of Remix in early forms of mechanical reproduction, in seven stages, beginning in the nineteenth century with the development of the photo camera and the phonograph, leading to contemporary remix culture. The book places particular emphasis on the rise of Remix in music during the 1970s and ‘80s in relation to art and media at the beginning of the twenty-first century. Navas argues that Remix is a type of binder, a cultural glue—a virus—that informs and supports contemporary culture.
Publisher: Springer Wien New York. Official link:
Table of Contents and Introduction are available on the official link.
To get a sense of the content of the book, read an earlier text, also published by Springer, which is now part of chapter three of the book: Regressive and Reflexive Mashups in Sampling Culture. Official link to article:
Specific case studies for this book are made possible thanks to a post-doctoral fellowship in the Department of Information Science and Media Studies at the University of Bergen, in collaboration with the Software Studies Lab at the University of California, San Diego.
Remix Theory: The Aesthetics of Sampling can now be pre-ordered. You can place your order on Amazon, Barnes & Noble, Powell’s Books, or another major online bookseller in your region, anywhere in the world. The book is scheduled to be available in Europe in July 2012 and in the U.S. in September/October of 2012.
The book will also be available electronically through university libraries that have subscriptions with Springer’s online service, Springerlink. Educators who find the book, in whole or in part, useful for classes are encouraged to consider the latter option to make the material available to students at an affordable price.
Once the book is released everywhere, anyone should be able to preview book chapters on Springerlink.

For all questions, please feel free to contact me at eduardo_at_navasse_dot_net.

Below are selected excerpts from the book:
From Chapter One, Remix[ing] Sampling, page 11:
Before Remix is defined specifically in the late 1960s and ‘70s, it is necessary to trace its cultural development, which will clarify how Remix is informed by modernism and postmodernism at the beginning of the twenty-first century. For this reason, my aim in this chapter is to contextualize Remix’s theoretical framework. This will be done in two parts. The first consists of the three stages of mechanical reproduction, which set the ground for sampling to rise as a meta-activity in the second half of the twentieth century. The three stages are presented with the aim to understand how people engage with mechanical reproduction as media becomes more accessible for manipulation. […] The three stages are then linked to four stages of Remix, which overlap the second and third stage of mechanical reproduction.
From Chapter Two, Remix[ing] Music, page 61:
To remix is to compose, and dub was the first stage where this possibility was seen not as an act that promoted genius, but as an act that questioned authorship, creativity, originality, and the economics that supported the discourse behind these terms as stable cultural forms. […] Repetition becomes the privileged mode of production, in which preexisting material is recycled towards new forms of representation. The potential behind this paradigm shift would not become evident until the second stage of Remix in New York City, where the principles explored in dub were further explored in what today is known as turntablism: the looping of small sections of records to create new beats—instrumental loops, on top of which MCs and rappers would freestyle, improvising rhymes. […]
From Chapter Three, Remix[ing] Theory, page 125:
Once the concept of sampling, as understood in music during the ‘70s and ‘80s, was introduced as an activity directly linked to remixing different elements beyond music (and eventually evolved into an influential discourse), appropriation and recycling as concepts changed at the beginning of the twenty-first century; they cannot be considered on the same terms prior to the development of machines specifically designed for remixing. This would be equivalent to trying to understand the world in terms of representation prior to the photo camera. Once a specific technology is introduced it eventually develops a discourse that helps to shape cultural anxieties. Remix has done and is currently doing this to concepts of appropriation. Remix has changed how we look at the production of material in terms of combinations. This is what enables Remix to become an aesthetic, a discourse that, like a virus, can move through any cultural area and be progressive and regressive depending on the intentions of the people implementing its principles.

Computational Folkloristics

A new article from Tim Tangherlini and his UCLA colleagues:

Computational Folkloristics
By James Abello, Peter Broadwell, Timothy R. Tangherlini
Communications of the ACM, Vol. 55 No. 7, Pages 60-70.

Tim Tangherlini organized the most amazing workshop I ever attended: Networks and Network Analysis for the Humanities (an NEH Institute for Advanced Topics in Digital Humanities). Every day we had three lectures from leading experts in network analysis from the academy and also companies such as YouTube, plus hands-on software training. Being able to have lunch with leading people from computer science (who are normally very hard to access) and discuss your project with them was a really unique opportunity. Tim rocks! (Which is more than a metaphor because he really does - in 2002 he produced a documentary "Our Nation: A Korean Punk Rock Community.") His new article is a must-read.

Nice infographic about social media

Provided by:

Psychology of Social Networking

Big Data and Uncertainty in the Humanities

September 22, 2012,
The Institute for Digital Research in the Humanities,
University of Kansas

This conference seeks to address the opportunities and challenges humanistic scholars face with the ubiquity and exponential growth of new web-based data sources (e.g. electronic texts, social media, and audiovisual materials) and digital methods (e.g. information visualization, text markup, crowdsourcing metadata).

“Big data” is any dataset that is too large to be analyzable with traditional means (whether e.g. manual close readings or database queries). Developments in cloud computing, data management, and analytics mean that humanists and allied scholars can analyze and visualize larger patterns in big data sets. With these opportunities come the challenges of scale and interpretation; we have moved from the uncertainty resulting from having too little data to the uncertainty implicit in large amounts of data.

What does this mean for how humanists structure, query, analyze and visualize data? How does this change the questions we ask and the interpretations we assign? How do we combine the best of a macro (larger-pattern) and a micro (close reading) approach? And how is interpretative and other uncertainty modeled?

Presentations addressing these practical and epistemological questions are welcome.