The planning system is a potential goldmine of detailed data, insight and forecasts that could help create places and businesses that adapt in real time to the ever-evolving needs of their citizens and customers. But it is not. The sheer volume of insight lost would be a scandal had it not simply always been the case.
Every year, over 450,000 planning applications are submitted, but we collect very little of the data contained in these applications – and that’s before you even consider the reams of supplementary (and costly!) data-informed analysis setting out the social, economic and environmental impact that forms the evidence base for each application.
In addition, the 364 local planning authorities in England each commission their own evidence base documents when preparing a new local plan. The results of this exercise are stored in non-machine-readable PDFs, accessed as and when needed by people who have to trawl through them to find what they want. The data being collected is not stored in a way that realises its full value.
Why is this the way things are done? Why aren’t we thinking about how we could use data to power digital technology that improves the way we plan? Simply because the technology to do something different, something innovative, is new. Our current planning system has been with us for over a century. Digital technology? A few decades, if that. So, while it might be forgivable that we currently are where we are, it would fast become unforgivable should we choose to stay where we are.
Change is coming to planning standards
This is why it’s so exciting to see change starting to happen. Some great work has already started on developing planning application data standards and digital services, with Hackney Council, PlanX and the GLA all tackling elements of the challenge of standardising the top-level information that is submitted by applicants. But this is just the tip of the iceberg when it comes to the data contained within the planning system.
At Connected Places Catapult we have been building a map of all the data used as part of the evidence base in local plans and planning proposals. The evidence base for local plans seeks to understand the condition of a place and the ‘systems’ within it (eg, employment land, green space, housing need) and forecast how they may change in the future, so as to inform planning policy. The evidence base for planning proposals seeks to understand the context into which a development will land and the economic, social and environmental impacts of that development.
Our research into the data contained within a typical local plan suggests there are on average 50 evidence base documents used to create one local plan, collectively informed by more than 400 data sets. For a development proposal, there are typically around 20 documents submitted, also supported by more than 400 data sets. Comparing these data sets reveals many that are used multiple times across multiple documents, as well as data sets used in both proposals and plans, such as traffic counts.
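Spotting data sets that recur across documents is, at heart, a simple counting exercise over the evidence-base map. A minimal sketch in Python, using entirely hypothetical document and data set names to stand in for the real map:

```python
from collections import Counter

# Hypothetical, simplified extract of an evidence-base map:
# each document lists the data sets that inform it.
evidence_base = {
    "Housing Needs Assessment": {"census", "house prices", "traffic counts"},
    "Employment Land Review": {"census", "business rates"},
    "Transport Assessment": {"traffic counts", "bus timetables"},
}

# Count how many documents each data set informs.
usage = Counter(ds for datasets in evidence_base.values() for ds in datasets)

# Data sets used by more than one document are candidates for standardisation.
reusable = sorted(ds for ds, n in usage.items() if n > 1)
print(reusable)  # ['census', 'traffic counts']
```

In practice the ranking would also weigh the cost of producing each data set and how often it crosses the plan/proposal boundary, but the principle is the same: standardise what is reused most.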
Taking a systems approach to the planning system
While we can’t aim to standardise all these datasets anytime soon, might we be able to standardise the most important or re-usable data? And how might we go about identifying what elements of the planning system could most usefully be standardised? We don’t know for sure, but better mapping of what data is used in what documents is a decent first step.
Developing agreement on what to standardise will, however, involve much more. As part of our digital planning programme we have been taking a systems approach: mapping the planning system and key user journeys, building a PlanTech community, building a map of the planning evidence base data, and researching the cost of that data. With this knowledge, we can consider which data sets would be the most impactful to standardise.
We also recently hosted a roundtable of representatives from across the sector, including Ordnance Survey, MHCLG, BEIS, Open Systems Lab, the Geospatial Commission, the Planning Officers Society and the London Borough of Hackney, where we ran through a list of 18 different areas we had identified as important to consider for standardisation. The workshop confirmed some of our hunches but also brought to our attention some new opportunities. In the end, there were three key areas we’d recommend for standardisation:
- Progress or status of a proposal – In our original user research into the development management process, we got an excellent quote from an amateur developer who said: “Submitting a planning application is like walking through a series of black curtains. You open one curtain, then another one, then another one. You don’t know how many there are or what order they come in, but I know there is a home at the end.” Creating a standardised series of steps in the journey of a proposal would improve the user experience.
- Proposal red line boundary – This defines the boundary of a proposed development. Currently, this data is created by drawing on a physical map, then scanning and converting it to a JPEG or PDF. That file is then itself scanned or redrawn multiple times throughout the development management process. Creating a single digital record would both create efficiencies and reduce errors.
- National Proposals Map – All local plans have a series of core maps which contain information about sites allocated for development, conservation areas and geography-specific policies. This has been considered before by the Royal Town Planning Institute, but technology and digital literacy within local authorities now mean a national open data map for this is within our grasp.
We will be launching our research into data in the planning system at PlanTech Week, 17–19 September. During the event, you’ll hear the latest on how data and technology are transforming the planning sector. Find out more on the event’s dedicated website: https://www.plantechweek.com/.