Grant D. Taylor: The Machine That Made Science Art: The Troubled History of Computer Art 1963-1989 (2004)
Filed under thesis | Tags: · aesthetics, art, art history, computer art, computing, cybernetics, programming, technology
“This thesis represents an historical account of the reception and criticism of computer art from its emergence in 1963 to its crisis in 1989, when aesthetic and ideological differences polarised and eventually fragmented the art form. Throughout its history, static-pictorial computer art has been extensively maligned. In fact, no other twentieth-century art form has elicited such a negative and often hostile response. In locating the destabilising forces that affect and shape computer art, this thesis identifies a complex interplay of ideological and discursive forces that influence the way computer art has been and is received by the mainstream artworld and the cultural community at large. One of the central factors that contributed to computer art’s marginality was its emergence in that precarious zone between science and art, at a time when the perceived division between the humanistic and scientific cultures was reaching its apogee. The polarising force inherent in the “two cultures” debate framed much of the prejudice towards early computer art. For many of its critics, computer art was the product of the same discursive assumptions, methodologies and vocabulary as science. Moreover, it invested heavily in the metaphors and mythologies of science, especially logic and mathematics. This close relationship with science continued as computer art looked to scientific disciplines and emergent techno-science paradigms for inspiration and insight. While recourse to science was a major impediment to computer art’s acceptance by the artworld orthodoxy, it was the sustained hostility towards the computer that persistently wore away at the computer art enterprise. The anticomputer response came from several sources, both humanist and anti-humanist. The first originated with mainstream critics whose strong humanist tendencies led them to reproach computerised art for its mechanical sterility.
A comparison with aesthetically and theoretically similar art forms of the era reveals that the criticism of computer art is motivated by the romantic fear that a computerised surrogate had replaced the artist. Such usurpation undermined some of the keystones of modern Western art, such as notions of artistic “genius” and “creativity”. Any attempt to rationalise the human creative faculty, as many of the scientists and technologists were claiming to do, would for the humanist critics have transgressed what they considered the primordial mystique of art. Criticism of computer art also came from other quarters. Dystopianism gained popularity in the 1970s within the reactive counter-culture and avant-garde movements. Influenced by the pessimistic and cynical sentiment of anti-humanist writings, many within the arts viewed the computer as an emblem of rationalisation, a powerful instrument in the overall subordination of the individual to the emerging technocracy.” (Abstract)
Ph.D. Thesis
Landscape and Visual Arts, The Faculty of Architecture, The University of Western Australia, 2004
via MediaArtHistories
PDFs (updated on 2016-2-17)
Lev Manovich: Software Takes Command (2008–)
Filed under book | Tags: · aesthetics, computing, design, history of computing, media, media design, media theory, software, software studies

“Software has replaced a diverse array of physical, mechanical, and electronic technologies used before the 21st century to create, store, distribute and interact with cultural artifacts. It has become our interface to the world, to others, to our memory and our imagination – a universal language through which the world speaks, and a universal engine on which the world runs. What electricity and the combustion engine were to the early 20th century, software is to the early 21st century. Offering the first theoretical and historical account of software for media authoring and its effects on the practice and the very concept of ‘media,’ the author of The Language of New Media (2001) develops his own theory for this rapidly-growing, always-changing field.
What were the thinking and motivations of the people who in the 1960s and 1970s created the concepts and practical techniques that underlie contemporary media software such as Photoshop, Illustrator, Maya, Final Cut and After Effects? How do their interfaces and tools shape the visual aesthetics of contemporary media and design? What happens to the idea of a ‘medium’ after previously media-specific tools have been simulated and extended in software? Is it still meaningful to talk about different mediums at all? Lev Manovich answers these questions and supports his theoretical arguments with detailed analysis of key media applications such as Photoshop and After Effects, popular web services such as Google Earth, and projects in motion graphics, interactive environments, graphic design and architecture.”
First version self-published in 2008
Publisher Bloomsbury, July 2013
International Texts in Critical Media Aesthetics series, 5
Creative Commons Attribution Non-Commercial License
ISBN 1623567459, 9781623567453
xi+357 pages
Reviews: McKenzie Wark (Public Seminar, 2015), Alessandro Ludovico (Neural, 2014), Jussi Parikka (Cultural Politics, 2014), Patrick Davison (International Journal of Communication, 2014), Yanni Alexander Loukissas (Journal of Design History, 2014), Brock Craft (Popular Communication, 2014), Warren Buckland (New Review of Film and Television Studies, 2014), Martin E. Roth (Asiascape, 2014), Manuel Portela (MatLit, 2013), Alan Bilansky (Digital Humanities Quarterly, 2019).
Interviews: Michael Connor (Rhizome, 2013), Illya Szilak (HuffPost, 2017).
PDF (2 MB, added on 2019-8-23)
EPUB (7 MB, added on 2019-8-23)
Issuu (added on 2013-9-1)