Following the West Midlands urban tech summit, Stefan Webb and Joseph Bailey talk about the importance of opening up planning data and working with Birmingham City Council to understand the impact of new developments
The world of data analytics, big data and machine learning seems to have passed by much of the planning system. Yet paradoxically, of all public city services, the planning system possibly spends the most money on generating and retrieving data. This data, required (usually as a result of regulation and legislation) to provide the evidential grounding for planning applications, masterplans and city plans, is held in a vast number of overlapping document management systems which have little or no interoperability, and are inaccessible to machine and human alike.
The lack of accessibility of this data has consequences, the most obvious being the significant cost of repeatedly retrieving, generating and analysing the same data. This data is extracted either by local authority officers or by developers and the consultants they employ, who are paid full price for what is essentially a cut-and-paste of a previous piece of analysis.
Having more open and accessible planning data available should allow others – be they different services in local authorities, or those in the market wishing to develop new data-driven products and services – to use the same data without the costs of having to generate or retrieve it over and over again.
And this is recognised in Government. Speaking at the Urban Tech Summit in Birmingham, Communities and Local Government Secretary Sajid Javid made clear that his number one priority is getting more homes built, and called out planning as being “an area ripe for innovation”. The Department for Communities and Local Government (DCLG) will be launching a new platform to unlock data, with Javid commenting that “embracing digital is no longer a ‘nice to have’ for local government”.
In the absence of more open, systematic and accessible planning data, performing a task as simple as comparing two developments is no small feat. Yet this comparison enables local authorities to better communicate the opportunities and impacts associated with development to planners and citizens alike.
Despite the challenges around access to planning data, Future Cities Catapult wanted to see if communicating the impacts of developments through time might be feasible with existing data. Using documents available to the public and the planning department at the City of Birmingham, we investigated two large developments – one currently under construction and one with a recently published masterplan.
Our review unearthed more than 180 PDF documents, a large portion of which contained rich information about the impact of both developments – during and after construction. The PDF format meant much of this information was effectively ‘trapped’ and cumbersome to extract for use in digital tools. However, the nature of the information – text, tables and geospatial features – means that it shouldn’t be difficult to provide in machine readable formats. In fact, there are many existing standards that could be used to facilitate this.
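To give a flavour of what “machine readable” means here: one of those existing standards is GeoJSON, an open format for geospatial features. The sketch below is purely illustrative – the development name, figures and coordinates are invented for the example – but it shows how a site boundary that today sits as a drawing inside a PDF could instead be published as structured data that maps, dashboards and comparison tools can consume directly.

```python
import json

# Hypothetical example: a development site boundary expressed as a
# GeoJSON Feature rather than a drawing embedded in a PDF.
site_boundary = {
    "type": "Feature",
    "properties": {
        "development": "Example Development",  # illustrative name
        "status": "under construction",
        "dwellings_proposed": 350,             # illustrative figure
    },
    "geometry": {
        "type": "Polygon",
        "coordinates": [[
            [-1.9026, 52.4796],  # illustrative coordinates near Birmingham
            [-1.9001, 52.4796],
            [-1.9001, 52.4812],
            [-1.9026, 52.4812],
            [-1.9026, 52.4796],  # polygon ring closes on its first point
        ]],
    },
}

# Serialised like this, the same data can feed many tools
# without anyone re-keying it from a document.
print(json.dumps(site_boundary, indent=2))
```

Because the format is an open standard rather than tied to particular licensed software, the same file works in off-the-shelf GIS packages, web maps and bespoke analysis scripts alike.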
Although the planning system needs an overhaul, there are many small interim changes that could be implemented to stimulate positive transformation. Based on our experience of liberating data on development whilst working with Digital Birmingham, we’ve suggested five small changes to facilitate the development of digital tools for communicating the opportunities associated with new developments:
- Set minimum data provisions – authorities should require a specific list of data to be provided for all development. This may encompass other mandatory requirements but would enable the generation of a systematic summary of the impact of a development extending current requirements. This will help with like-for-like comparisons.
- Mandate the provision of machine readable information – PDFs should be in machine readable versions as demonstrated by ODI Leeds, and authorities should not limit these formats to particular licensed software.
- Consider sharing by default – unless developers and consultants are generating commercially sensitive information (in which case consideration should be given to aggregating and anonymising it), they should upload their information to an open platform. This enables automated testing of the submission (for example, ensuring that all the required information is present), in turn saving admin time.
- Recognise the value in external data capture – collecting more information about the city (e.g. real time monitoring of air quality, traffic, noise, waste generation) and making it available to others makes it easier for developers and consultants to fulfil their minimum data requirements, and encourages consultancy processes to use the same data. In the long term, it may even enable the authority to automate some processes performed by developers or consultants.
- Enhance transparency – to enable the improvement and reuse of impact assessments and analyses, authorities should mandate that developers and consultancies provide transparent and reproducible methodologies alongside their insights.
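The first and third suggestions above – a minimum data list and automated testing of submissions – can be combined into a very simple check. This is a hypothetical sketch: the field names are invented for illustration, not drawn from any actual authority's requirements.

```python
# Hypothetical minimum data provision for a development submission.
REQUIRED_FIELDS = {
    "site_boundary",
    "dwellings_proposed",
    "transport_assessment",
    "air_quality_assessment",
}


def check_submission(submission: dict) -> list[str]:
    """Return the required fields absent from a submission, sorted."""
    return sorted(REQUIRED_FIELDS - submission.keys())


# An illustrative submission that is missing one required item.
submission = {
    "site_boundary": "boundary.geojson",
    "dwellings_proposed": 350,
    "transport_assessment": "transport.csv",
}

missing = check_submission(submission)
if missing:
    print("Submission incomplete, missing:", ", ".join(missing))
    # → Submission incomplete, missing: air_quality_assessment
else:
    print("All required information present.")
```

Even a check this basic, run automatically when a developer uploads to an open platform, would catch incomplete submissions before an officer spends time on them.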
Whilst there is and should be the ability to reflect local circumstances and priorities in how local plans, masterplans, and planning applications are evidenced, core data requirements would allow for greater standardisation and automation of evidence gathering and analysis.
Implementing these incremental changes will fuel the design of digital tools that communicate the opportunities associated with new developments in a transparent and consistent way. Comparing developments like-for-like – along with their impacts and the opportunities they offer – will be quicker, easier and, perhaps most importantly, a lot more compelling.