Building without Concrete?

Concrete has been used as a lazy solution to every problem in the built environment. We can, and should, reduce our dependence on it.

After water, concrete is the most widely used substance on earth. Its huge environmental burden is generally assumed to be necessary, and much research is devoted to reducing the carbon costs of its manufacture. Robyn Pender argues we should ask deeper questions: how far do buildings truly require concrete, and do we deploy it wisely?

The rise of concrete

In the history of the built environment, concrete is very much a newcomer – Ordinary Portland Cement (OPC) was not developed until the middle of the 19th century, and not commonly accepted for general construction use until after the First World War. Since then, however, it has become ubiquitous: as a mortar and render; formed into breeze blocks, roof tiles and pipes; reinforced with steel to make structural frames; poured to make foundations and footings, lift shafts and stairs, paving and road surfaces… an inescapable presence in every city and on every building site.

Until coal began to be industrially exploited, construction relied on a tiny corpus of materials that took a minimum of energy to produce and use: plant-based (including wood, thatch and fibre), stone, earth, brick, and lime.1 With transport both difficult and expensive, buildings were made almost entirely from whatever was available on site or close by. Where these materials were imperfect, the local builders learnt tricks to make them work, such as protecting weak stone with lime renders. Designs varied from region to region, reflecting not only local availability, but also local climate and local economic needs. Vernacular buildings were simple, robust and easy to repair, and to alter if necessary.

It seems unlikely that concrete would ever have been so widely used for buildings if the master-and-apprentice systems that passed along traditional knowledge had not disappeared during the First World War. The loss of manpower during the war, and its replacement with fossil fuels, produced the environment for concrete to flourish. It was promoted as requiring neither expertise nor time, and these continue to be the principal reasons given for its employment. The new world of modern construction had arrived, based on the centralised manufacture of standardised materials and the development of standardised design. If concrete proved quick to fail and difficult to repair, demolition and replacement were made easier by fossil-fuel energy, and all this drove a new economy. As buildings became more complicated and more innovative, expertise became increasingly siloed, and there was less and less opportunity to learn from mistakes.

Keeping dry

The introduction of mass-produced window glass in the 17th century marks a turning point in construction: it is what Frederick Measham Lea (a former head of the Building Research Station) christened a ‘raincoat’ material. Like the other industrial-era materials – sheet metals, plastics, and of course concrete – it resists rain by virtue of being ‘waterproof’. The water beads on the outside, collecting and flowing down the surface. Over the 20th century raincoat envelopes came to dominate the built environment in the Global North, and they are now commonly assumed to be the optimum way to weatherproof.

Figure 1. Clay was used not just for cob walls and wattle-and-daub, but as the primary mortar for masonry of stone and brick, and not merely for simple vernacular buildings, but equally for cathedrals and castles. The core mortars we long presumed to be ‘degraded lime mortars’ have now proved to be earth, often stabilised with lime aggregate. Photo: Robyn Pender

The traditional approach, tried and tested over millennia, is what Lea christened the ‘greatcoat’ envelope. Greatcoats provide weather resistance by using permeable materials, which have well-connected networks of pores forming fine capillaries able to transfer water. They are resistant to rain because most of the pores are filled with air: when a raindrop hits a surface pore, it is prevented from penetrating by the pressure of the air it traps in the deeper pores (this also gives notable flood resistance). The droplet is held in the surface pore until it evaporates again, carried away by the wind. Even in a strong rainstorm, little or no rain runs down the surface. Like a woollen greatcoat, the wall remains dry on the inside no matter how wet the outside may feel.

The only way rain can be drawn deeper in is if it should happen to hit the mouth of a capillary already filled with water. As long as the wall is in good condition this is rare. Greatcoat construction is characterised by detailing intended to prevent water entering at weak points (wide eaves to protect the wall heads; hood mouldings and sills with drips around windows). In the past, most buildings were rendered with lime-based mortars that not only increased their resistance, but also helped them to dry if a moisture problem occurred.2 Current research is examining whether lime-based renders might protect modern brick-and-block cavity walls from water penetration.
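The scale of the suction exerted by a water-filled capillary can be estimated from the Young-Laplace relation. The sketch below is purely illustrative – the pore radii, surface tension and contact angle are assumed round-number values, not measurements from any real render:

```python
import math

def capillary_pressure(radius_m, surface_tension=0.072, contact_angle_deg=0.0):
    """Young-Laplace capillary pressure (Pa) across a water meniscus
    in a cylindrical pore.

    surface_tension: ~0.072 N/m for water near room temperature.
    contact_angle_deg: 0 = fully wetting mineral surface (an assumption).
    """
    return (2 * surface_tension
            * math.cos(math.radians(contact_angle_deg)) / radius_m)

# Assumed, order-of-magnitude pore radii: 100 um, 10 um, 1 um
for r in (1e-4, 1e-5, 1e-6):
    p = capillary_pressure(r)
    print(f"radius {r:8.0e} m -> capillary pressure {p / 1000:7.1f} kPa")
```

A 1 µm pore already pulls at roughly 144 kPa – well above atmospheric pressure – which is why a water-filled capillary wicks so effectively, while the same fine pores, when air-filled, resist a raindrop by trapping air behind it.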

Sadly, the greatcoat approach is no longer understood by many building professionals. Permeable renders and protective detailing have disappeared from new build and repair, and greatcoats are regularly turned into raincoats by coatings and concrete. This is doubly unfortunate, because modern raincoat envelopes have some fundamental weaknesses. Raincoats require joints, and thermal expansion and contraction makes these difficult to seal. Concrete is brittle and prone to cracking, especially at interfaces with other materials such as brick. When water flowing down the surface meets a fine crack or imperfectly sealed joint, it will be drawn in and trapped.

Nonetheless, after the war concrete entirely replaced the earlier mortar systems in the UK and elsewhere, even in building conservation. One popular use was for floors and foundations. These usually incorporated damp-proof courses (DPCs), meant to protect the building from water rising from ground level through capillary action. But ‘rising damp’ is another modern problem, caused by the introduction of plumbing in and around buildings. Leaks provide a continuous and copious supply of water under pressure, but are often hidden because the pipes have been concealed in concrete. If a building is exposed to flooding, the DPC in a concrete floor can act as a capillary, wicking water into the walls and through the building. The use of concrete for the surfaces surrounding buildings has led to additional problems: splash-back erosion and run-off, exacerbation of capillary rise, and groundwater changes leading to subsidence or heave.

Learning from the past?

Earth mortars are already being rediscovered by ‘eco-builders’. For lime, recent work on hot mixing (using quicklime rather than slaked lime) has rediscovered that highly workable mortars can be made quickly and easily, removing many of the objections that have slowed lime’s readoption for new build.

In addition to better moisture behaviour and sustainability, traditional mortars have a number of other advantages over concrete. One is their more forgiving chemistry: the negative impact of salts from cement on stone and brick is well known, and there are other problems such as the corrosion of embedded copper pipes. Another is their capacity to absorb building movements: they do not need expansion joints, and they deform plastically without losing their bond to the masonry. Early 20th-century engineers understood this to be desirable, but were constrained by calculation tools that could handle only simple rigid structures. Now, with ample computing power at our disposal, deformable materials are once again becoming popular.

The difficulties in maintaining and repairing reinforced-concrete buildings (epitomised by the recent tragic collapse in Florida) are additional considerations. For sustainability, it is critical that our buildings are as long-lived and easy to maintain as, say, a 15th-century cob cottage.

Tall buildings?

Very tall buildings depend on concrete, but although alternatives to concrete frames have been proposed, a deeper question must be asked first: how appropriate are these structures to a sustainable future? They rely on heavy servicing in the form of lighting, ventilation, HVAC, pumps, lifts and so on, and they are less efficient than mid-rise buildings in terms of occupant density. Changing office use post-pandemic is another consideration. Can traditional construction help here as well? The old city of Sana’a in Yemen is composed of earth buildings as much as eight storeys high, which occupants consider to be very comfortable: more so than their concrete-based neighbours, which imitate the aesthetics rather than the materiality.

A positive future

A built environment beyond concrete begins to look very promising. There must be many forgotten practical reasons behind traditional building materials and systems, so obvious to our predecessors that they never bothered to write them down. We are having to relearn the basics from scratch, but it is already clear that, for most building purposes, turning away from concrete is likely to bring more benefits than problems.

Over the past century, concrete has been used as a lazy solution for every problem in the built environment. But the all-purpose tool is never the sharpest. Concrete will not disappear; the characteristic for which cement was invented – its ability to set under water – will continue to be invaluable. But it is time to question the materials and designs we use. A future built environment that draws on both traditional and ‘modern’ materials and systems, and puts longevity and sustainability at the heart of design and decision-making, should also be a much better place in which to live.


1. The ten volumes of Historic England’s Practical Building Conservation series for Routledge cover the historic development of most building materials and systems used in the UK.

2. The mechanisms have been investigated in research by, amongst others, David Wiggins (Historic Environment Scotland Technical Paper 27) and Historic England together with Sheffield Hallam University.
