The Bartlett’s Neil Spiller is still blazing a trail after all these years
In the first of a new series on how architecture schools are responding to new technology, the Bartlett’s Neil Spiller says the rapid evolution of computing is liberating for architects, but also poses some serious challenges
It is nearly 12 years since I edited the precursor to BD’s IT pages, during which time the practice of architecture has changed radically. The commercial availability of complex software and the hardware technologies it relies on has created a fast, accurate and globally transferable design culture and community. Architects are caught in the same running-to-stay-still dance as Lewis Carroll’s Red Queen as we attempt to cope with the changes the virtual world brings us. All this was predicted a long time ago.
My own involvement with cyberspatial architecture began around this time — from 1992 onwards I was encouraging my students at the Bartlett to embrace the potential of cyberspace, and the architectures that might dwell within it. My 1998 book, Digital Dreams: Architecture and the New Alchemic Technologies, sought to illustrate the amazing potential of new technology, particularly cyberspace, nanotechnology, biotechnology and emergence, and their similarities to the more arcane “technologies” of alchemy, shamanism and other transmutational systems. Digital Dreams trumpeted a new future for architects and new ways for architects to operate within the world. Was I right? Yes, most of the time!
In these past dozen years, many architects have been instrumental in exploring some of the emerging terrain engendered by digital architecture. One example is Lars Spuybroek, whose Fresh Water Pavilion (H2O eXPO) in the Netherlands was an important piece of architecture on the way to full-blown responsive architecture.
Mark Burry’s revelation of the underlying family of geometries behind Gaudí’s work has rewritten Gaudí’s place within the pantheon of proto-cyberspatial architects. Burry has been in the vanguard, working with parametrics and generative components, while Marcos Novak’s AlloSphere, Eversion and Transarchitecture, Greg Lynn’s work on animate space, and hopefully my own work on vacillation, reflexivity, bioscapes and surreality, have all been important.
Today, parametric and generative component design have spawned a host of notions about skinning architecture, evolving architecture and fabricating architecture. Some architects have attempted to construct building elements that respond to data streams in real time. Others have recognised that complex, computationally drawn cutting and routing patterns can create minute landscapes. This idea, that the old Fordist notions of mass production and the limits on the variety of factory-produced goods are no longer valid, allows architects to propose non-standard geometries and elements for buildings.
It costs relatively little to create ornate works, so the appropriateness of decoration has re-entered the architectural debate. This is a challenge for our more spartan colleagues. The envelope of a building can be created from a series of braided surfaces visualised on computer, and machine instructions can be sent straight to the factory to enable full-size fabrication. Other architects use evolutionary algorithms to generate work, so proposals are in some sense bred. Information is now ubiquitous. All manner of data can be collected, transmitted and relocated, and can be used to create animated surfaces within a structure while also forming the fundamental building blocks of buildings. The old typologies of building have therefore become corrupted and blurred. Without the rapid evolution of the computer and its ways of processing and keeping check on large amounts of data, none of these new projects would have been possible.
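The “breeding” of proposals by evolutionary algorithm can be illustrated with a minimal sketch. This is not any particular practice’s tool: the genome here (a list of facade-panel openness ratios) and the fitness function are invented purely for illustration, assuming the simplest selection–crossover–mutation loop.

```python
import random

def evolve(fitness, genome_len=8, pop_size=30, generations=60, mutation=0.1):
    """Minimal evolutionary loop: breed candidate designs toward higher fitness."""
    # Start with a random population of candidate genomes.
    pop = [[random.random() for _ in range(genome_len)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)      # rank by fitness
        parents = pop[: pop_size // 2]           # keep the fitter half
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, genome_len)
            child = a[:cut] + b[cut:]            # one-point crossover
            for i in range(genome_len):
                if random.random() < mutation:   # occasional random mutation
                    child[i] = random.random()
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

# Illustrative fitness: prefer panel openness ratios near 0.6.
target = 0.6
best = evolve(lambda g: -sum(abs(x - target) for x in g))
```

Selection keeps the fitter half of each generation, crossover recombines two parent designs, and mutation injects variety; over successive generations the population drifts toward better-scoring proposals, which is all “breeding” means in this context.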
Computation technology is a double-edged sword. It promises liberation from laborious work and instant, or near-instant, communication. It promises smart interactive materials, surfaces and buildings. However, it can be responsible for surveillance, 24-hour working, a ubiquitous style of place, and ecological damage. The task for contemporary architects is to propose architectures that navigate and negotiate between these polarities, and in doing so create buildings and cities that are welcoming, enabling, facilitating, liberal, networked and spatially exciting.
Beyond a space
Architects also need to understand that architecture must be bedded into a landscape of ecology that far exceeds the boundaries of any specific site, country or continent. It is in the spatial manipulation of the relationships in these ecologies that their architecture resides. Architects must design with the imperatives of flora, fauna, machines and networks in mind, and their architecture must be capable of husbanding the forces of biochemistry, virtuality, movement patterns, and seasonal, diurnal and even millennial perturbations. Design must accommodate and rearticulate slow and abrupt phase changes of site and landscape.
The projects shown here are a knot of positions within the terrain I have just described, gleaned from work conducted by the Advanced Virtual & Technological Architectural Research (Avatar) group at the Bartlett. My own work, Genetic Gazebo, asks: can we create architectures that slip into other locations and spaces, return to show us what they’ve found, and “plant” a notation of this event in our environment? Some of these “plantings” might persist for a long time, others for shorter periods. Such ideas are capable of producing a sublimity of space that grows and decays, changes and rearranges, and that speaks of the human condition as the actor in a series of linear, non-linear and quantum events.
One thing is certain: our world will continue to feel the onslaught of the digital tsunami, and architects must tame its power to create sublime new architectures. We must finally address the health of our planet in this context, and digital architecture, used prudently, will enable us to do this. It is a great time to be an architect.
Cyber facts about cyberspatial architecture
The term cyberspatial architecture was first used in the mid-1990s. Before then, most architects experienced computers as drafting tools; afterwards, theorists could talk of an idealised architectural space exempt from real-world concerns such as gravity and corridors. Now, in the credit-crunched noughties, cyberspatial architecture is synthesised into all projects, in terms of representation and equally in terms of smart materials, data collection and reflexive spatial practices. Whatever architects do, they rely on cyberspace to model, draw and produce it.
Three projects by the Bartlett’s Avatar group
Genetic Gazebo - Neil Spiller
This is a kind of self-wiring computer that calculates and manages the formation of a vista in a garden. In place of binary code it uses a quadripartite code gleaned from the DNA of fossilised prehistoric insects, dead gerbils and the fluttering of bathing birds.
Bonsai ship - Christian Kerrigan
Here both mechanical and biotactical interventions are used to grow a ship from a copse of trees. This heavy-metal bonsai technique constricts tree growth with metal corsets, producing dense, structurally efficient timber. It is a 200-year architectural project controlled by amber clocks, with even a slipway grown the same way.
Kitchen biolab - Sacha Leong
This project explores a world where the everyday joins the rarefied protocols of tissue-engineering and biotechnology. A domestic kitchen doubles up as a biotech lab, and the breakfast table becomes a miniature architectural landscape.
For larger versions, see attached images.
Neil Spiller is professor of architecture and digital theory and vice-dean at the Bartlett, University College London. His new book, Digital Architecture Now, is published by Thames & Hudson in September.