Sher Doruff: The Translocal Event and the Polyrhythmic Diagram (2006)

29 February 2012, dusan

This thesis identifies and analyses the key creative protocols in translocal performance practice, and ends with suggestions for new forms of transversal live and mediated performance practice, informed by theory. It argues that ontologies of emergence in dynamic systems nourish contemporary practice in the digital arts. Feedback in self-organised, recursive systems and organisms elicits change, and change transforms. The arguments trace concepts from chaos and complexity theory to virtual multiplicity, relationality, intuition and individuation (in the work of Bergson, Deleuze, Guattari, Simondon, Massumi, and other process theorists). The thesis then examines the intersection of methodologies in philosophy, science and art and the radical contingencies implicit in the technicity of real-time, collaborative composition. Simultaneous forces or tendencies such as perception/memory, content/expression and instinct/intellect produce composites (experience, meaning, and intuition, respectively) that affect the sensation of interplay. The translocal event is itself a diagram – an interstice between the forces of the local and the global, between the tendencies of the individual and the collective. The translocal is a point of reference for exploring the distribution of affect, parameters of control and emergent aesthetics. Translocal interplay, enabled by digital technologies and network protocols, is ontogenetic and autopoietic; diagrammatic and synaesthetic; intuitive and transductive. KeyWorx is a software application developed for real-time, distributed, multimodal media processing. As a technological tool created by artists, KeyWorx supports this intuitive type of creative experience: a real-time, translocal “jamming” that transduces the lived experience of a “biogram,” a synaesthetic hinge-dimension. The emerging aesthetics are processual – intuitive, diagrammatic and transversal.

Doctor of Philosophy, SMARTlab Programme in Performative New Media Arts, Central Saint Martins College of Art & Design, University of the Arts, London
288 pages

PDF
PDF (Appendix “The KeyWorx Interviews: Transcripts of Interviews and Conversations with KeyWorx Artists”)

Sher Doruff, Nancy Mauro-Flude (eds.): Connected: LiveArt (2005)

29 February 2012, dusan

The Connected! Programme spanned a two-year period from January 2003 to January 2005. It officially concluded with a celebratory Birthday party for Art in the Theatrum Anatomicum of Waag Society, the local ‘home’-base of many Connected! projects. Although most of the people present at that event agreed with Federico Bonelli’s assessment “that art could have committed suicide in 1984”, the research and the show go on.

The Connected! Programme had four nested components: Projects, Artists-in-Residence, Sentient Creatures Lecture Series and Anatomic. This book documents many of the activities in these domains: the lectures, the events, the workshops, the performances, the installations, the discourse. Yet it’s interesting to note that pulling together material for this publication was a bit like trying to capture the wind. Much of the work produced in this two-year period emphasized the real-time process of the making. Documentation of that often fragile, unstable and always already ephemeral process is sketchy at best and marginal to the actualization of the event itself. For many of these artists, documentation is a secondary concern, an afterthought. For others, documenting is an integral process indistinguishable from the event itself.

There are myriad photos in this catalogue of artists behind their laptops. Myriad photos that say little about the levels and layers of codified communication emitted from those unseen screens. These casual, unpretentious shots are images of social networks in progress – the translocal – a feedback loop of the local effecting the global affecting the local affecting the global. Not only does the artwork produced, or better transduced, scramble representational meaning, but so too does the process of making. Performance practice that addresses the indeterminate dance-on-the-edge-of-chaos in compositional processes is a felt thing, an experience that doesn’t always translate well to laptop snapshots.

Co-writers: Federico Bonelli, Beth Coleman, Josephine Dorado, Lucas Evers, Wander Eikelboom, Howard Goldkrand, Jan-Kees van Kampen, Arjen Keesmaat, Jeff Mann, Mark Meadows, Hellen Sky, Michelle Teran, Ananya Vajpeyi

Publisher Waag Society, Amsterdam, 2005
Creative Commons BY-SA 2.0 Netherlands License
160 pages

publisher

PDF (updated on 2012-9-3)

Bill Stewart: Living Internet (2000)

27 February 2012, dusan

An in-depth reference about the Internet.

The site was written from 1996 through 1999, first published on the web on January 7, 2000, and updated regularly. It has more than 700 pages, 2,000 intra-site links, and 2,000 external links to some of the world’s best online content about the Internet.

The site is authored by Bill Stewart, who has used the Internet since 1988 and first appreciated the power of the medium during the Tiananmen Square rebellion in China in 1989, when he saw how the net kept Chinese communities around the world in touch with the events through email and newsgroups, bypassing all government censorship.

View online (HTML)

Michael Hauben, Ronda Hauben: Netizens: On The History And Impact Of Usenet And The Internet (1996)

27 February 2012, dusan

Netizens, one of the first books detailing the Internet, looks at the creation and development of this participatory global computer network. The authors conducted online research to find out what makes the Internet “tick”. This research resulted in an informative examination of the pioneering vision and actions that have helped make the Net possible. The book is a detailed description of the Net’s construction and a step-by-step view of the past, present, and future of the Internet, the Usenet and the WWW.

The book gives you the perspective needed to understand how the Net can impact the present and the turbulent future. It answers questions such as: What vision inspired or guided the Net’s pioneers at each step? What technical or social problem or need were they trying to solve? What can be done to help nourish the future extension and development of the Net? How can the Net be made available to a broader set of people?

With foreword by Tom Truscott
A print edition was published by the IEEE Computer Society Press, later distributed by John Wiley
ISBN 0-8186-7706-6

reviews

authors
publisher
google books

View online (HTML)

José Felipe Ortega Soto: Wikipedia: A Quantitative Analysis (2009)

26 February 2012, dusan

This doctoral thesis offers a quantitative analysis of the top ten language editions of Wikipedia, from different perspectives. The main goal has been to trace the evolution over time of key descriptive and organizational parameters of Wikipedia and its community of authors. The analysis focuses on logged authors (editors who created a personal account to participate in the project). The comparative study encompasses general evolution parameters, a detailed analysis of the inner social structure and stratification of the Wikipedia community of logged authors, a study of the level of inequality of contributions (among authors and articles), a demographic study of the Wikipedia community, and some basic metrics to analyze the quality of Wikipedia articles and the trustworthiness of individual authors. The work concludes with a study of the implications of these findings for the future sustainability of Wikipedia in the coming years.
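The abstract does not name the inequality metric here, but a standard way to quantify how unevenly contributions are distributed among authors (or articles) is a Gini coefficient over per-author revision counts. A minimal sketch, assuming nothing more than a plain list of counts (the figures below are hypothetical, not taken from the thesis):

```python
# Illustrative sketch: Gini coefficient over per-author revision counts.
# The revision counts here are made up; a pipeline such as the one described
# in this thesis would derive them from Wikimedia database dumps.

def gini(counts):
    """Return the Gini coefficient (0 = perfect equality, 1 = maximal inequality)."""
    values = sorted(c for c in counts if c >= 0)
    n = len(values)
    total = sum(values)
    if n == 0 or total == 0:
        return 0.0
    # Standard rank-weighted formula over the ascending-sorted values.
    cumulative = sum((i + 1) * v for i, v in enumerate(values))
    return (2 * cumulative) / (n * total) - (n + 1) / n

# Hypothetical monthly revision counts for a handful of logged authors.
revisions_per_author = [1, 2, 2, 5, 40, 350]
print(f"Gini coefficient: {gini(revisions_per_author):.3f}")
```

A highly skewed list like the one above yields a coefficient close to 1, reflecting the familiar pattern in which a small core of very active editors produces most of the revisions.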

The analysis of the inequality level of contributions over time, and of the evolution of additional key features identified in this thesis, reveals an untenable trend: the effort demanded of the most active authors increases progressively as time passes. These authors may eventually reach the upper limit of revisions they can perform each month, triggering a decline in the number of monthly revisions and an overall recession of the content creation and reviewing process in Wikipedia. Finally, another important contribution for the research community is WikiXRay, the software tool we have developed to perform the statistical analyses included in this thesis. The tool completely automates the process of retrieving the database dumps from the Wikimedia public repositories, massaging them to obtain key metrics and descriptive parameters, and loading them into a local database, ready to be used in empirical analyses.
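WikiXRay itself is not reproduced here, but the retrieve–massage–load workflow just described can be sketched roughly as follows. The dump URL, the table schema and the single metric (revisions per logged author) are illustrative assumptions, not WikiXRay’s actual interface:

```python
# Rough sketch of a dump-to-database metrics pipeline in the spirit of the
# workflow described above. File names, URL and schema are placeholders.
import gzip
import sqlite3
import urllib.request
import xml.etree.ElementTree as ET
from collections import Counter

# Hypothetical example: a stub-meta-history dump of a small wiki.
DUMP_URL = "https://dumps.wikimedia.org/simplewiki/latest/simplewiki-latest-stub-meta-history.xml.gz"

def count_revisions_per_author(dump_path):
    """Stream the XML dump and count revisions per logged (registered) author."""
    counts = Counter()
    with gzip.open(dump_path, "rb") as handle:
        for _, elem in ET.iterparse(handle):
            if elem.tag.endswith("username"):  # contributor username element
                counts[elem.text] += 1
            elem.clear()                       # free parsed elements as we go
    return counts

def load_metrics(counts, db_path="metrics.db"):
    """Store per-author revision counts in a local SQLite database."""
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS author_revisions (author TEXT, revisions INTEGER)")
    conn.executemany("INSERT INTO author_revisions VALUES (?, ?)", counts.items())
    conn.commit()
    conn.close()

if __name__ == "__main__":
    local_file, _ = urllib.request.urlretrieve(DUMP_URL, "dump.xml.gz")
    load_metrics(count_revisions_per_author(local_file))
```

Streaming the dump with iterparse keeps memory use bounded, which is the main practical constraint when processing full revision-history dumps of the larger language editions.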

As far as we know, this is the first research work to implement a comparative analysis, from a quantitative point of view, of the top ten language editions of Wikipedia, presenting complementary results from different research perspectives. Therefore, we expect that this contribution will help the scientific community to enhance its understanding of the rich, complex and fascinating working mechanisms and behavioral patterns of the Wikipedia project and its community of authors. Likewise, we hope that WikiXRay will facilitate the hard task of developing empirical analyses of any language version of the encyclopaedia, boosting in this way the number of comparative studies like this one in many other scientific disciplines.

Doctoral Thesis
Ingeniero de Telecomunicación
Universidad Rey Juan Carlos, Escuela Técnica Superior de Ingeniería de Telecomunicación, Madrid, 2009
Supervisor: Jesús M. González Barahona
Creative Commons BY-SA 3.0 License

author

PDF
