Article originally published in the September 2017 issue of Domus Magazine.
The ongoing digitalization has opened up many possibilities to scale up engineering and architecture work both spatially and temporally: from building to city scale, and from point-in-time to long-term analysis. We now draw much more transversally from all areas of expertise in the industry: structural analysis, energy consumption and production, thermal and psychological comfort, lifecycle analysis, biodiversity…
Thanks to computational tools, we try to quantify and visualize the evolution of the carbon emissions and the flows of matter, energy and water that support the activity of the multitude of agents and systems shaping our complex cities. It is now common to model precisely the rooftop solar potential of an entire city at once, to simulate the way sunlight and wind heat and cool its streets and buildings, or to optimize dozens of design options to meet sustainability targets. To cover this much ground at ever lower cost, we build and use a growing software infrastructure running on data, the so-called new oil of the 21st century. We extend it further and further by making buildings, networks and cities “smart”, contributing to the streams flowing into our data collection pipelines.
We are beginning to see the cracks in this digital infrastructure. The data we collect and rely upon become hard to process and reason about; they are wrong, incomplete, imprecise or obsolete: our clients possess only scrambled utility-bill information, sensors start to drift or interfere with the very thing they were supposed to measure… The time we devote to actual design work pales in comparison to the time spent on data cleansing, plumbing and modeling tasks. The illusion of control given by complex models and their simplistic metrics creates blind spots in our understanding of reality. Our expertise is stretched so thin across disciplines that we sometimes use our tools as black boxes and lose some of our much-needed critical judgment.
The digital infrastructure is here to stay, and its expansion is a huge opportunity to tackle the challenges we face, from climate change to resource scarcity. There is undeniable economic and scientific value in this way of doing things, but its social and environmental gains are much more uncertain. Is it the best way at our disposal to reach urban sustainability? First, we may need to rely more on basic models that are simple but understandable and pragmatic; second, to balance the ecological footprint of this in silico infrastructure against its benefits; and finally, to get better at improving its parts and our practices.
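To make the first point concrete, here is a minimal sketch of the kind of "basic model" argued for: a back-of-envelope estimate of annual rooftop photovoltaic yield using the classic first-order formula E = A × H × η × PR. Every input value is an illustrative assumption, not measured data; the point is that the whole model is legible at a glance.

```python
def annual_pv_yield_kwh(roof_area_m2, usable_fraction,
                        annual_irradiation_kwh_m2,
                        panel_efficiency, performance_ratio):
    """First-order yield estimate: E = A_usable * H * eta * PR.

    A_usable: roof area actually available for panels (m2)
    H:        annual solar irradiation on the roof plane (kWh/m2/yr)
    eta:      panel conversion efficiency (dimensionless)
    PR:       performance ratio covering wiring, inverter, soiling losses
    """
    usable_area = roof_area_m2 * usable_fraction
    return usable_area * annual_irradiation_kwh_m2 * panel_efficiency * performance_ratio

# Illustrative inputs: a 100 m2 roof, 60% usable, 1200 kWh/m2/yr of
# irradiation, 18%-efficient panels, a 0.75 performance ratio.
estimate = annual_pv_yield_kwh(100, 0.6, 1200, 0.18, 0.75)
print(round(estimate))  # ≈ 9720 kWh per year
```

A model this small can be checked by hand, argued about with a client, and compared honestly against the output of a city-scale simulation pipeline.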
This way of doing things requires an expertise that is both broad and vertical, integrating the big macroscopic sustainability issues with the kind of microscopic fieldwork we engage in when we clean up data, articulate equations, and write or use simulation code. The digital transformation has to be more than just a better way to store and move ever more data around a complex web of software systems. Its job is first and foremost to create useful information for the design process and its participants: architects, engineers, clients, building occupants… For these different kinds of users, we have yet to build good human-data interfaces to complement our building and city information models. Let’s do this!