Never trust other people’s statistics, says Julia Park. There’s always a risk someone put a decimal point in the wrong place

Julia Park

Research is a curious and privileged occupation that can take many forms. The Oxford Dictionary defines it as, ‘The systematic investigation into and study of materials and sources in order to establish facts and reach new conclusions’. As so much research is now commissioned by private companies or individuals, they might like to add ‘unbiased’ next time the definition is reviewed.

Much of what I do falls some way short of that formal definition. I am happy to describe myself as a ‘loose researcher’, not least because, like many others, I rely heavily on other people’s work. I am usually doing no more than reading, thinking, testing and writing. As that rarely involves primary research, which tends to be difficult to find and often impenetrable, an element of Chinese whispers has to be accepted.

When I do come up with ideas, they’re usually practical rather than inspirational. Housing policy and standards are a strange obsession for someone who is fundamentally not keen on rules at all. But having accepted that rules are often necessary because not everyone can be trusted to do the right thing (just look at what’s being produced under permitted development rights), it’s pretty important that they are sensible and remain relevant.

The devil is often in the detail, particularly when standards become regulation – a difference that is often under-appreciated. When Lifetime Homes was introduced in the early 1990s, the vast majority of new homes were houses, even in London. By the time Habinteg took over, flats had become the norm in all major cities. It was difficult to persuade the standard-bearers that, in apartment blocks, plumbing and drainage can’t simply meander around between the floorboards, and that reasonable access to a WC could still be achieved with a basin alongside, provided the basin was pushed back a bit.

Two years later, they accepted that effectively requiring every fitting to be on a different wall was awkward to build, and that the boxing required to conceal pipework added a large amount of unusable space to the gross area of the bathroom. It rarely looked good and I put it to them that it risked being ripped out and replaced with something easier to live with, but almost certainly less accessible. The only beneficiary would be landfill. We haven’t solved the issue in wheelchair housing but thankfully the ‘three-in-a-row bathroom’ is now possible at Category 2. The phrase ‘by asking for a bit less, you often end up with a bit more’, passed on by a wise civil servant, has stayed with me. He also happened to mention that the 800mm zone next to the WC came from a study of just 15 wheelchair users.

As a vocal proponent of space standards, I am most frequently asked (and most dread) the question, ‘Why are UK homes the smallest in Europe?’ The claim is always attributed to the RIBA. A report it published in 2010 included a table of European comparators which revealed that the average size of all UK homes was 85sq m and the average of newly built homes 76sq m. On both measures, Denmark had the largest homes (108sq m and 137sq m respectively). The table dated back to 2005.

When, in 2017, I wrote a book, One Hundred Years of Housing Space Standards: What Now?, I included the same table, with the source and the 2005 date. I attributed it to a Cambridge professor who had published it in 2014. I didn’t attempt to source the original data, on the basis that if it was good enough for him, it was good enough for me.

Only it wasn’t. I received an email from a reader who had tracked down the origin of the UK data for new homes. The figure of 76sq m appears to date back to at least 1996, possibly to 1980. He pointed out that the EHS (English Housing Survey) for 2014-15 shows that the average size of UK homes built since 2005 is 87sq m. An article published in the RIBA Journal in February 2017 concluded that UK homes are now very similar in size to those in France and Germany. People hate it when I tell them any of this because ‘rabbit hutch Britain’ makes a much more effective strapline.

Since then, I’ve become more suspicious about statistics. A blog in Construction Buzz a week or so ago caught my eye. Looking at the potential demand for ‘upward extensions’, it reported that developers have identified that 180,000 rooftop homes could be created in London with ‘potential to house 720,000 people’. Four people per extension seemed a bit ambitious but that wasn’t the most surprising figure. The research also looked at the number of people per square metre in major UK cities. The results were astonishing: Brighton and Hove has the highest concentration at 10 people per square metre. Leeds has eight, London seven, Portsmouth, Manchester, Newcastle and Nottingham, six.

The original (wrong) table

Could this really be true? Might they have added in all the upper floor space to the base area of each city? That seemed unlikely. Is it something for Radio 4’s More or Less? I decided to call the company named in the blog to find out more. The chap at the end of the phone asked how he could help. Referring to the research, I asked how 10 people could physically fit into a square metre. Even three would be a squeeze, wouldn’t it? How does anyone move around? There must be some empty squares. Does that mean others have 20 people in them? He agreed it sounded a lot, but couldn’t help directly because his company sells building products (insulation mainly…). Did he know where the numbers had come from? Keen to please, he took my details and said he would ask the independent research company to get in touch.

I heard from them the next day. The data sources looked sound. The population figures came from the ONS, the area of the cities from another reputable website. The methodology was straightforward too: the number of people living in each city was simply divided by the area of the city in square metres. It all seemed to work – our cities really do seem to be heaving.

At the end of the day it was still niggling away. I went back through the data. The city areas were originally in square kilometres. Instead of multiplying by 1,000,000 to convert to square metres, someone had multiplied by 1,000. Feeling vindicated (and sympathetic), I let the researchers know.
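For anyone who wants to check the working, the arithmetic is easy to sketch. The population and area figures below are purely illustrative placeholders rather than the researchers’ actual data, but they show how multiplying square kilometres by 1,000 instead of 1,000,000 inflates the density a thousand-fold, and how turning the correct figure the other way up gives the space each resident really has.

```python
# Illustrative placeholders only - not the ONS population or city-area data used in the blog.
population = 280_000        # residents of a hypothetical city
area_km2 = 30.0             # city area in square kilometres

# The slip: 1 sq km is 1,000,000 sq m, not 1,000 sq m.
area_m2_wrong = area_km2 * 1_000        # what the spreadsheet did
area_m2_right = area_km2 * 1_000_000    # what it should have done

density_wrong = population / area_m2_wrong   # about 9.3 people per sq m - physically impossible
density_right = population / area_m2_right   # about 0.009 people per sq m

# The same figure the right way up: square metres per person.
space_per_person = area_m2_right / population  # about 107 sq m each

print(f"wrong density: {density_wrong:.1f} people per sq m")
print(f"right density: {density_right:.4f} people per sq m")
print(f"space each:    {space_per_person:.0f} sq m per person")
```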

The moral of all this? Use reliable sources and always attribute the data you use. While you can’t check everything, try to trace back to primary sources when the data is central to your work or your conclusions. When you’re carrying out any kind of research (even the loose type), get someone else to read it. Whether it’s your own work or someone else’s, stand back and ask yourself whether something at least feels plausible. Save your calculations, and if a number feels seriously wrong, start by checking the decimal point; it could save a lot of time and trouble.

How much space do Brighton and Hove-ians really have? 109sq m each (assuming my maths is correct…).

Incidentally, the ONS data behind this work is fascinating. Next time you feel like a loose researcher, you might like to take a look.