Stuff Engineering Students Like: Episode 1, Dangerous Stuff

When I was invited to present at a seminar on The Politics and Ethics of Global Urban Practice, I had a mate staying with me. She teaches on the women and gender studies program(me) at Georgetown and, as we shot the breeze about teaching, it struck me that what I wonder about on my bike home is actually central to all of her classes.

I am the woman teaching a wee bit on other people’s courses about 'The Environment' or 'Sustainability' or 'Effics' or some other ambiguous topic that eventually requires our engineering students (reluctantly) to submit a written task. I was that student in the nineties and those were pretty much the only women who lectured me.

I used this for my slide show and 'hacked' the book cover of Christian Lander's Stuff White People Like because the colours.


And, worse, I am clearly failing at the ethics bit of this because I had to spend the morning before the seminar dealing with three international students who had cheated the plagiarism-checking algorithm when they submitted their essays to me. Ugh. So, I thought I would blog my presentation just to document some experiments with squeezing politics and ethics into my teaching. Necessarily, episode one of this pedagogical work has been about talking to myself, back then, over there – the 'old me': young, adventurous and dumb. The second element is about mixed feelings: is a little politics and ethics on engineering courses a dangerous thing? I'll cover the troubling project, funded by UCL Grand Challenges, that really had me questioning this in another post...

Firstly, I should explain four special things about my department (CEGE at UCL):

  • CEGE requires high A-level grades (UK exams for 18 year olds) but no subject pre-requisites and (consequently) has about 30% female undergraduates compared to a national average of 17% on civil engineering courses that usually ask for A-levels in physics
  • Since 2004, CEGE has seen a massive curriculum reform, shifting away from traditional technical modules to scenarios and projects focused on applying engineering knowledge in contexts that are social, economic and environmental
  • We are on a cramped London site with limited lab space and time, which leaves students little opportunity to make, to play with materials and to fail with their hands. This gap is not central to this post but I will come back to it at the end
  • I’ve been lucky to pick up teaching on unconventional and important engineering courses developed by Sarah Bell (Sustainable Infrastructure, Engineering Thinking, Urban Flooding and Drainage) and Dina D’Ayala (Strengthening of Low-Engineering Buildings)

Ok. Now, in spite of all these great things, remember that I am usually in a room with undergraduate engineers or aspiring (and sometimes experienced) staff from international humanitarian organisations. So I operate on the assumption that many are like the 'old me': drawn to projects, models, solutions, silver-bullets, audiobooks not pages, diagrams not paragraphs, matrices not stories, corporate bureaucracy, clarity of hierarchy and so on. So, as I try to bring my practice (a.k.a. mistakes) to bear on this work, I try to teach against the pillars of this worldview: processes, codes and categories. I love them all.

At the seminar, it happened that the architects and engineers were the ones with slides and pictures rather than text and words and I am hoping that in my spidery illustrations, you will see a further point here that ties in to our ways of acquiring and preferring particular kinds of knowledge.

Process: the algorithm might be wrong

This is about challenging an appealing formula by using pictures and stories to question 'normal' sequences, hierarchies and perspectives.

The image contrasts the "standard" algorithm presented in guidance on transitional shelter (p.111 of this pdf) with a photograph that I took about 12 days after the 2010 earthquake in Haiti. In the lecture, students are asked to think about why, when I returned later the same day, people had already gone back to a damaged house at the bottom of the ravine - a ravine that floods with sewage every rainy season - and were hanging out clothes by a flickering lantern at dusk. One answer is that this hillside is an excellent example of risk mitigation. It was a choice of location, from a limited set of choices, that minimised the risk of eviction, reduced rents, cut the time and costs of getting to work and allowed people to live closer to acquaintances already in the city and in buildings carefully adapted to protect from theft and hurricanes.

This is a process that has a history and a specific context that resists the algorithm.

Codes and Standards: critical thresholds and risks

This is about practising the application of codes and standards to show that standards may have unintended and undesirable consequences for engineers keen to do good and make things healthy and safe.

In a lecture on seismic mitigation, we look at the guidance published by a real international organisation in a real country. It is designed to help decide which schools should be retrofitted to withstand an earthquake.

Students are given information about three schools on a map and have to work out whether the retrofit is 'affordable'.

The most remote, tumbledown buildings turn out to be too costly to fix so the priority schools are those closest to roads, cities (and ministries!).

Students are then given information about the livelihoods of families in each place and have to comment on the possibilities of spatial or distributional injustice and whether mitigating one risk could embed or exacerbate others.
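The affordability exercise can be sketched as a toy calculation. To be clear: the schools, costs, pupil numbers and the threshold below are all invented for illustration; the real exercise uses guidance from an actual organisation.

```python
# A toy version of the classroom exercise: three hypothetical schools,
# a cost-per-pupil affordability rule, and the (uncomfortable) ranking
# it produces. All numbers are invented for illustration.

schools = [
    # name, retrofit cost (USD), pupils, km from the nearest road
    ("Capital Primary", 40_000, 400, 1),
    ("Roadside Secondary", 60_000, 500, 5),
    ("Mountain Village School", 90_000, 150, 40),
]

AFFORDABILITY_THRESHOLD = 250  # max cost per pupil deemed 'affordable'

def cost_per_pupil(cost, pupils):
    return cost / pupils

affordable = [
    name
    for name, cost, pupils, _km in schools
    if cost_per_pupil(cost, pupils) <= AFFORDABILITY_THRESHOLD
]

print(affordable)  # the remote, tumbledown school drops off the priority list
```

The point of the exercise survives the toy numbers: whatever threshold is chosen, a per-pupil cost rule systematically favours large schools near roads and cities.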

In a lecture on water quality, students are asked to comment on the risky features of a water system using a sketch taken from WHO sanitary inspection forms (p. 150 Annex 2): tools for non-engineers to identify the worst risks to health in their own water systems. 

The process of measuring water quality is explained as expensive, slow and dependent on laboratory infrastructure and engineers. Students are then asked whether visual inspection by community members or high-tech testing by experts is riskier for people trying to manage their own water supplies.

In both cases, students are asked: should people in poor countries accept lower standards for building codes or drinking water quality? Could imposing high, unachievable standards for particular risks or 'sexy' problems we think we can solve create greater risks in the aggregate for those "other" people? This provokes but does not resolve further questions: what is an acceptable risk and who decides what is acceptable?

To bring this closer to home and avoid a "them and us" debate, students are introduced to the QALY system for deciding on health investments in the UK. Crudely, if a new intervention can extend life for less than, say, £22k per quality-adjusted year, the country invests in it. Then a student has to read out this quote to show the process and political content of the decision:

“the threshold was not based on "empirical research" as no such research existed anywhere in the world … [it is] really based on the collective judgment of the health economists we have approached across the country. There is no known piece of work which tells you what the threshold should be.”

Former Chairman of NICE, Professor Michael Rawlins (House of Commons, 2007)
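The arithmetic behind that threshold is simple enough to sketch. The ~£22k figure comes from the discussion above; the intervention costs and years gained below are invented for illustration.

```python
# A back-of-the-envelope version of the QALY test: divide the extra cost
# of an intervention by the quality-adjusted life years it buys, and fund
# it if the result falls under the threshold. Numbers are illustrative.

THRESHOLD_PER_QALY = 22_000  # GBP per quality-adjusted life year (from the post)

def cost_per_qaly(extra_cost_gbp, qalys_gained):
    return extra_cost_gbp / qalys_gained

def fund(extra_cost_gbp, qalys_gained):
    return cost_per_qaly(extra_cost_gbp, qalys_gained) <= THRESHOLD_PER_QALY

# A treatment costing £60k that buys three good years: £20k/QALY -> funded.
print(fund(60_000, 3))  # True
# The same treatment buying only two years: £30k/QALY -> not funded.
print(fund(60_000, 2))  # False
```

The calculation is trivial; the quote above is the whole point: the threshold that the division is compared against is a collective judgment, not an empirical finding.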

To wrap up, and link this to professional codes of conduct and standards of behaviour, we look at two ideas that have been elaborated by Jonathan Wolff at UCL. Firstly, the idea that, as professionals, we have to account for our decisions. Unlike a doctor who is obliged (fraught with power and knowledge asymmetries as this may be) to sit with each patient to see whether they give consent to treatment, infrastructure engineers cannot sit with each rail passenger to outline the risks of riding a train. Instead of informed consent, society may prefer what Wolff calls “comfortable de-focusing”. This, then, has implications for our technical formulation of risk (Wolff, 2006):

risk = hazard x likelihood

becomes

risk = hazard x likelihood x blame x shame

And this goes beyond calculations and liabilities to the institutional, social and political context of engineering.
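One way to read Wolff's extended formulation, very loosely, is as a weighting exercise: the risk a society acts on is the engineering product (hazard x likelihood) scaled by blame and shame factors. The numeric weights below are invented purely for illustration; Wolff's argument is qualitative, not a calibrated model.

```python
# A sketch of the two formulations above: identical technical risks can
# diverge once blame and shame are weighed in. All weights are invented.

def technical_risk(hazard, likelihood):
    # the conventional engineering formulation
    return hazard * likelihood

def perceived_risk(hazard, likelihood, blame=1.0, shame=1.0):
    # blame/shame factors > 1 amplify the risk society responds to;
    # 1.0 for both collapses this back to the technical formulation
    return technical_risk(hazard, likelihood) * blame * shame

# Two failures with the same technical risk...
baseline = perceived_risk(10, 0.01)                       # no-fault accident
# ...diverge once blame (say, skimped maintenance) and shame are weighed in.
blameworthy = perceived_risk(10, 0.01, blame=3, shame=2)  # six times larger

print(baseline, blameworthy)
```

The design point is that the extra factors live outside the calculation engineers are trained to do, in the institutional and political context the next sentence points to.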

Categories: it matters who decides and how

Whether you end up 'living above or below the algorithm' depends on a whole series of technical and political processes which, depending on the context, implicate and have implications for engineers.

In this example, students are read a quote from an engineer after the Christchurch earthquake to illustrate the consequences of categories which are decided by politicians on the advice of engineers:

“In retrospect, I wish we had just had red zones (uninhabitable) and green zones (you can go home)… We thought, as engineers, that where we needed more detailed assessment, we should mark the zone as orange. Then we could investigate before letting people go home. But, in fact, the uncertainty, the not knowing…. it was worse for those people. People from the orange zones had higher rates of depression and suicide.”

Students are asked to think of other similar examples.

In this example, students are introduced to simple categories of "vulnerability" used by international organisations to target support to the most needy. Then, they are told a story about the earthquake in Kobe to look at how "being old" was not in itself the vulnerability but the fact that many older people lived in old, wooden houses in the old city centre. Their modest incomes came from letting out rooms to poorer, younger people looking for work. These houses were disproportionately affected by the earthquake and the fire that followed so this population was disproportionately represented in temporary housing. And because their housing was their only asset and their income from tenants had dried up, this group found it hard to borrow money. And because their housing was high density and on valuable land, with multiple occupants and owners and the layout pre-dated the latest urban planning standards, reconstruction was delayed. Students are asked whether they think vulnerability is a property of a person, a structure or a system.

These images also show how quickly the neat categories get tangled and ambiguous when a story is told or a picture is drawn. These categories are not amenable to diagrams.

Humility and Curiosity: necessary but insufficient?

In all cases, I was hoping to, at least, seed curiosity and humility by making the examples real, the stories personal and the dilemmas uncomfortable. The sessions were supposed to provoke awareness of alternative possibilities (extending the concepts or frameworks). When we worked through examples, there was also a chance to practise, to navigate and discover that questions could not be resolved (experiential, discursive, uncomfortable). In most of our courses, however, the objective is to become fluent and confident (assessed, achieved, solved). If I had left students feeling fluent and confident with any of this material I would surely have failed.

But what if these students were like the old me?

What if learning was just an accidental side-effect of wanting to solve or win? For me, this meant that I did not experience certain ways of knowing and I enjoyed a certain laziness about material that could not be acquired this way. Indeed, to my horror, I often see in myself and sense in our students:

Thanks Evgeny Morozov for inspiring this slide with your public lecture at LSE!


The next post will look at why this is so very troubling...

Stuff Engineering Students Like: Episode 2, Dangerous Minds

The final part of this two-part post on the Politics and Ethics of Global Practice is about ambivalence: my uneasy feeling that trying to teach what you have learned through practice (a.k.a. blunders) may not be possible or desirable.

This first struck me when I started showing, in lectures and seminars, some of the films made by Alison Killing as part of our RIBA-funded project (re)constructing the city.

This film, in particular, was one that I used to transition from the ethical (how do you evaluate? what variables? what values? what formulae? what trade-offs? what consequences?) to the political (is it you that should evaluate?).

Learning comes about: it is not an outcome

I find this film both incredibly compelling and incredibly troubling (see for yourselves). But others seem more compelled than troubled: where I was expecting shock and criticism, I've had people ask how to get a job like the ones in the film.

So, I stopped showing it. Although I wanted students to learn through practice and exposure to alternatives, when that exposure was via the medium of film, I did not like what was being learnt!

The responses it provoked - like Frankenstein's monster - were unintended and uncontrollable: producing very different, unpredictable and emotional reactions to those of my beloved diagrams and exercises. This all sits uneasily with a particular idea and format of teaching: learning outcomes are decided ahead of time, delivery is didactic and, while I may be accountable, I am also somehow in charge. INGO-ers will be familiar with this rubric... and with the participatory alternatives that are supposed to remedy it.

Privileging the familiar

The original intention of making the films was to show heavyweight Haitian urbanists and politicians challenging the action and legitimacy of the organisations that had entered Port-au-Prince after the earthquake. Instead, the stories were told by our compatriots or thirty-something peers. It shows just how difficult it is not to privilege the familiar when there is limited time to build the trust needed to get other voices on camera.

This privileging of the familiar is a paradox not only of professional practice but also of pedagogy: we do it and our students do it. It's not enough to rely on peer-based, project-based or scenario-based learning to address all the exploratory and social aspects of professional practice. We also need to create opportunities for students to observe and question the gender, class and race dynamics that play out in the lecture room.

Privileging the protagonists

My sense is that the reactions to this film are not only about familiarity. There also seems to be a tendency to identify with the apparent protagonists.

This might just be me - after all I wanted to be an engineer because of MacGyver, The A-Team, Airwolf and Maggie Philbin - quintessential heroes of eighties TV. Or, it might be that people who choose engineering assume that being an engineer will satisfy some kind of protagonist urge.

To fixate on the talking heads might be a pathway to empathy and alternative perspectives but this depends on who is talking and what excites the viewer. And it needn't be people. Technology provokes the same hypnosis. Last term, first year students had to build mini hydroelectric power stations. Their final task was to work out the flow rate of water through the system using a measuring jug and stopwatch. When a digital flow meter was unveiled to repeat the tests, they discarded their measurements in favour of the laptop screen: this had to be better than their own hands, right? We said over and over again that the device had not been calibrated because of an equipment problem. It would show a pattern but no quantities. Like an untuned guitar, even played perfectly it would churn out nonsense.
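The jug-and-stopwatch method the students distrusted is, of course, just a division. A minimal sketch, with the jug volume and timing invented for illustration:

```python
# The students' measurement: volumetric flow rate is collected volume
# divided by elapsed time. No calibration required - the jug and the
# stopwatch carry their own units. Values here are illustrative.

def flow_rate(volume_litres, seconds):
    """Volumetric flow rate in litres per second."""
    return volume_litres / seconds

# Filling a 2-litre measuring jug takes 8 seconds:
print(flow_rate(2.0, 8.0))  # 0.25 L/s
```

An uncalibrated digital meter, by contrast, reports numbers in unknown units: it can show that flow rose or fell, but not by how much, which is exactly what the students were told and chose not to hear.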

This magpie instinct to privilege the shiny, the innovative and the heroic seems to be at the expense of trusting hands, eyes and just clunky, everyday stuff.

Privileging the written

Speaking of clunky, you might notice that my previous post - built around images and diagrams rather than text - was a bit more fluent than this one. By sketching out an array of teaching materials, I've been trying to show:

  • how much I love to "taxonomise" the world before writing anything
  • that the habit and joys of ordering help some people feel fluent and might even mean more or different types of people are able to share and then challenge an idea
  • that this way of trying to organise things gets messy when we're told stories, when we're hearing narratives rather than tracing out systems

This is not an argument for stories over diagrams or vice versa. It is an argument that having recourse to only one way of knowing is narrow. This narrowness, though, seems to permeate academic and pedagogical formulae and to privilege the written.

Our discourse analysis work suggested that urban and humanitarian professionals also appear to do this. When asked to draw, some workshop participants, even those for whom drawing was a profession, rejected it as a way of coming to know things. Participants talked about drafting written policies without flinching but would not countenance drafting a drawing. While there was an unconcerned professional illiteracy with forms of information other than text, expecting 'beneficiaries' to draw maps was the participatory tool of choice. Curiously, when participants did examine aerial photos of a neighbourhood, they had to set aside, in some way, their assumed prior knowledge and expertise.

To ponder this, I like to go back to Matthew Crawford's suggestion (2012) that drawing requires

"that you short-circuit your normal mode of perception, which is less data-driven than concept-driven" ... "trying to attend to the visual data more directly" (Crawford, pp. 91-93).

It makes me think that maybe there is room for any discipline to accept other ways of coming to know and to become more comfortable or open to knowing nothing.


Practice makes....

Discussions on the theme of urban practice and pedagogy are due to follow this first UCL Urban Lab session on politics and ethics. These are intimately and ambivalently intertwined around how we collectively generate and pass on knowledge. And how, more mysteriously, we seem to lose track of our mistakes along the way, as they are wrapped into the rote of our own disciplines.

When I say that the production of films made me feel like Frankenstein, it's because the whole enterprise of working with an urban designer showed me that making, for other disciplines (designing space, producing a film), is, necessarily, also a commitment to the possibility of making mistakes.

My ambivalence is that, obviously, if you love to get stuff done, stuff is made. This seems to lead to a brand of mischief that is anathema to both civil engineering and the project cycles built into humanitarian response:

“Anyone who is both clever and lazy is qualified for the highest leadership duties, because he possesses the intellectual clarity and the composure necessary for difficult decisions. One must beware of anyone who is stupid and diligent -- he must not be entrusted with any responsibility because he will always cause only mischief.” (von Hammerstein, 1878-1943).

While some engineering disciplines are placing renewed emphasis on design through iteration, prototyping and agility and the humanitarian discourse is preoccupied with innovation, the concern, it seems to me, is whether these procedures can avoid the narrow, privileging processes that start much earlier.

Appetites urge...

"When you have a hammer, every problem looks like a nail"

Throwing the 'diligence' out with the bathwater, however, disguises a deeper protagonism problem: it is the very sense of power and of works to be accomplished, that reveals a certain set of seemingly surmountable problems.

My ambivalence is that I can't believe this is all bad! I've felt and, I think, shared the compulsion and euphoria associated with design and making in the rising after disaster. And I've come to wonder whether critics from text-based disciplines overlook the joy, commitment, mistakes, concerted imitation and dignity in the human appetite for technological fixes and expertise...

Maybe, it's ok to get stuck in if we try to be actively conscious. Or, we could keep reminding ourselves that being keen or feeling potent makes us see (only) problems we think we can and should (by virtue of being potent) solve.

But is it enough to "pass the stick", observe the room, reflect on the discourse?

Or should we leave the room?

Is it enough to be told or reminded that interfering in things or places that were already made is an intervention into a social and political process? Or do we need a closer examination of our discipline and the people bound to pursue it?

So what to do?!

I look at our curriculum and the way it is organised and I think our students are really lucky. They are taught by a dedicated, diverse staff. They get to work on safe, simulated projects that force them to apply what they know, work with each other and produce all sorts of written, drawn and spoken material.

The two things that nag at me: all the scenarios and contexts don't necessarily get us away from 'solutionism'; and there remains a strange, unquestioning confidence in "the algorithm", especially once it is hidden in a computer and can churn out results.

What I'd like is to be able to model, through teaching, a professional ethos that:

  • is ready to do nothing or shut up;
  • questions the 'why' of its categorisations and realises these are political as well as technical;
  • accepts what cannot be known;
  • acknowledges what is missed, unsatisfying, inconclusive or tragic - inherently contradictory (from Daniel Miller's chapter on housing in the book Stuff);
  • can sometimes resort to narrative but acknowledges whose stories are interwoven and whose evidence is being marshalled;
  • somehow uses observation and discussion, images and exercises to unearth and confront disparities of power during professional encounters, distracting people from what they assume to be their own prior expertise and know-how.

And I keep coming back to this thing about making mistakes: making, as in getting our hands on actual material; and mistakes, as in failing to find the right answer or recognising there might not be a right answer that we - because of where we come from and who we are - can or should find.

Only modest repairs to teaching and practice then!

Something along the lines of Matthew Crawford's 'stochastic art' (after Aristotle):

"Mastery [of a stochastic art] is compatible with failure to achieve its end... This experience of failure tempers the conceit of mastery ... In diagnosing and fixing things made by others, one is confronted with obscurities, and must remain constantly open to the signs by which they reveal themselves."