Filed under thesis | Tags: · algorithm, art, automation, control society, generativity, governmentality, labour, lecture-performance, machine learning, neural networks
“Performing Algorithms: Automation and Accident investigates how artists might stage encounters with the algorithms driving our post-industrial, big-data-based, automatic society. Several important theories of this contemporary condition are discussed, including control societies, post-industrial societies, the automatic society, the cybernetic hypothesis, and algorithmic governmentality. These concepts are interwoven with histories of labour and automation, recent developments in machine learning and neural networks, and my own past work.
Through a series of expanded lecture performances that describe our algorithmic condition while setting it into motion, this research seeks to discover ways in which to advance new critical positions within a totalizing technical apparatus whose very design preempts it. The included creative works have been performed, exhibited, and published between 2014 and 2018. They are made available online through an artificially intelligent chatbot, a frequent figure in the research, which here extends the concerns of that research through to how the work is framed and presented.
The thesis focuses on both generative art and the lecture performance, which converge in performing algorithms but are rarely discussed in connection with one another. Both emerged in parallel as artistic methods in the 1960s, as management and computation were taking root in the workplace. Furthermore, as the Internet became widespread from the 1990s, generative art and the lecture performance each found renewed prominence.
With human language and gesture increasingly modelling themselves on the language of computation, and work constantly reshaped by the innovations of capital, this project identifies “not working” both as technological breakdown and as a condition of labour under automation. A discussion of the first fatal accident involving a self-driving vehicle illustrates this dual condition. Shifting from glitch art’s preoccupation with provoking errors to a consideration of not working, this research proposes artistic strategies that learn to occupy rather than display the accident.”
Publisher Faculty of the Victorian College of the Arts and Melbourne Conservatorium of Music, The University of Melbourne, 2019
Filed under book | Tags: · abolitionism, algorithm, artificial intelligence, dna, facebook, google, prediction market, race, racism, segregation, social media, surveillance, technology
“From everyday apps to complex algorithms, Ruha Benjamin cuts through tech-industry hype to understand how emerging technologies can reinforce White supremacy and deepen social inequity.
Benjamin argues that automation, far from being a sinister story of racist programmers scheming on the dark web, has the potential to hide, speed up, and deepen discrimination while appearing neutral and even benevolent when compared to the racism of a previous era. Presenting the concept of the “New Jim Code,” she shows how a range of discriminatory designs encode inequity by explicitly amplifying racial hierarchies; by ignoring but thereby replicating social divisions; or by aiming to fix racial bias but ultimately doing quite the opposite. Moreover, she makes a compelling case for race itself as a kind of technology, designed to stratify and sanctify social injustice in the architecture of everyday life.
This illuminating guide provides conceptual tools for decoding tech promises with sociologically informed skepticism. In doing so, it challenges us to question not only the technologies we are sold but also the ones we ourselves manufacture.”
Publisher Polity Press, Cambridge, 2019
ISBN 9781509526406, 1509526404
Interview with author (Sanjana Varghese, Guardian, 2019)
Matthew Plummer-Fernandez: The Art of Bots: A Practice-based Study of the Multiplicity, Entanglements and Figuration of Sociocomputational Assemblages (2018)
Filed under thesis | Tags: · algorithm, art, assemblage, bots, design, software, software art
“This thesis examines and analyses an emerging art practice known as artbots. Artbots are internet-based software applications that are imbued with character and configured to engage and entertain online audiences. This form of practice, and the community of practice leading it, was found to be underrepresented and misunderstood. I argue that this artform is original and warrants a more thorough understanding. This thesis develops a conceptual framework for understanding artbots that focuses on and enables questioning around pertinent aspects of the practice.
A wide range of literature was reviewed to provide theoretical underpinnings for this framework, including literature on algorithm studies, science and technology studies, and software architecture. The devised framework examines artbot case studies through the notions of multiplicity, entanglement, and figuration, having understood artbots as heterogeneous sociocomputational assemblages comprised of software components and human intra-activity.
The research followed a varied methodology that encompassed participant observation and my own practice-based experiments in producing artbots. The study resulted in several original works. In addition, a showcase titled Art of Bots brought together key proponents and artbots, further providing material that is analysed in this thesis. The study helped identify and discuss artbots with attention to how they utilise modular software components in novel arrangements, how normative human and nonhuman relations of interaction are eschewed in favour of entangled interrelations, and how artbots challenge common narratives dictating technological constructs by inventing unique characters and figurations.”
Publisher Goldsmiths, University of London, 2018
Creative Commons BY-NC-ND License