the Environment Centre

Creating an Internal Mapping Tool for a local Charity

the Environment Centre (tEC) is an environmental charity based in Southampton.

A few years ago, they developed an online tool to display energy efficiency data for households on an interactive map.

The data behind the maps was saved in Google Fusion Tables, which was scheduled to be deprecated in 2019.

They have investigated alternatives to hosting the data on Fusion Tables, but have been unable to identify a suitable alternative for such a large dataset, and so invited me to put forward a proposal for a new version of the tool.


The Previous Tool

The government has released energy efficiency data which includes address fields. Working from this public data, the addresses were geocoded to obtain latitude and longitude coordinates for each record. The tool displays this data on a Google map and includes the following features:

1. A customised Google map background with the option for a map or satellite view
2. Checkboxes and sliders to allow the user to query the data and show homes meeting certain criteria
3. A count of the number of properties shown for each query
4. An address search and radius feature to show only records for a particular address, street or area, or to show only records a specified distance from a given point
5. Pop-up information balloons showing the detailed information for each record
6. Google street view imagery
7. The option to view the data table for any given query
8. A download button to allow the user to download the results of a query
9. Ability to overlay other datasets (e.g. polygons)
10. A reset button
11. A tEC-branded navigation bar
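The address radius feature (item 4) amounts to filtering records by great-circle distance from a chosen point. A minimal sketch using the standard haversine formula follows; the function and field names here are illustrative assumptions, not code from the original tool:

```javascript
// Great-circle distance between two points given in decimal degrees,
// returned in metres (standard haversine formula).
function haversineMetres(lat1, lon1, lat2, lon2) {
  const R = 6371000; // mean Earth radius in metres
  const toRad = (d) => (d * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// Keep only the records within radiusMetres of a chosen centre point.
// Assumes each record carries hypothetical lat/lng fields.
function withinRadius(records, centre, radiusMetres) {
  return records.filter(
    (r) => haversineMetres(centre.lat, centre.lng, r.lat, r.lng) <= radiusMetres
  );
}
```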

The Challenge

There were a range of free and low-cost tools for visualising simple datasets, but the Environment Centre (tEC) believed that they needed a bespoke tool to replicate their previous mapping tool.

This is because existing products are either cost-prohibitive or lacking functionality.

The previous tool could visualise tens of thousands of records and query dozens of fields in a short space of time.
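At its core, querying dozens of fields over tens of thousands of records is a matter of composing per-control predicates and filtering. A minimal illustrative sketch (the field names and predicate logic are hypothetical, not taken from the tool):

```javascript
// Each active UI control (checkbox, slider) contributes one predicate;
// a record is shown only if it satisfies all of them.
function applyFilters(records, predicates) {
  return records.filter((r) => predicates.every((p) => p(r)));
}

const records = [
  { epc: "C", rooms: 3 },
  { epc: "E", rooms: 5 },
];
const results = applyFilters(records, [
  (r) => r.epc <= "D", // e.g. checkbox: EPC rating D or better
  (r) => r.rooms >= 3, // e.g. slider: at least 3 rooms
]);
console.log(results.length); // 1
```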

With the loss of Fusion Tables, the Environment Centre (tEC) were keen to explore alternatives.

The largest challenge I faced as a developer was the requirement that the system be maintainable on a non-existent monthly budget. tEC needed the new tool to run on cost-effective platforms, with no use of external APIs that don't provide free credits to charities.

The new tool would need to process over 60,000 records of data from a comma-separated values (.csv) spreadsheet export and render the locations in Google Maps with little to no lag.
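Parsing such a file in Node could look like the following minimal sketch. It assumes a header row and simple comma-separated values without quoted fields (a production build would use a streaming CSV parser for 60,000+ rows), and the column names are hypothetical:

```javascript
// Turn raw CSV text into an array of record objects keyed by header name.
function parseCsv(text) {
  const [header, ...rows] = text
    .trim()
    .split("\n")
    .map((line) => line.split(","));
  return rows.map((cells) =>
    Object.fromEntries(header.map((name, i) => [name, cells[i]]))
  );
}

const sample = "address,latitude,longitude,epc_rating\n1 High St,50.9097,-1.4044,C";
const records = parseCsv(sample);
console.log(records[0].epc_rating); // "C"
```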

As tEC had added their own information and improved on the public dataset, they asked for the implementation of a user authentication system with a simple user email and password login process.

User Interface

We require a new tool that closely matches (or improves upon) the previous front end interface and all of the features listed above. We are open to ideas regarding the back-end.

We do not require the use of Google Maps but would not want to lose the familiar design, user-friendly controls or access to Street View if an alternative service was used.


The new interface will be developed to closely resemble the existing interface components, with an updated look and feel. All designs will be created in the design tool Figma and sent over for approval as soon as they are available.

It is recommended to stay with Google Maps, not only because of its familiarity but also because of its rich developer support and well-documented API.

Ongoing Costs

We do not have a budget for ongoing recurring costs but do currently have access to Google for Nonprofits credits.

The tool should be designed such that it can be maintained and edited internally, noting that we have some knowledge of HTML, JavaScript and CSS linked to the development of the previous tool.

There may be future work to finesse the tool, but this should not be assumed.

Ongoing costs were taken into account before fully exploring viable solutions for the mapping tool. As the tool must incur minimal ongoing costs, it is recommended that the website be built with three free tools and one paid service –

1. Netlify and Gatsby – Frontend (Client side) – Free
2. Heroku and ExpressJS – Backend (Server side) – Free on the Hobby plan, with a keep-alive plugin to stop the server from idling
3. Firebase – Database (Users and S3 URL for ExpressJS) – Free
4. Amazon S3 – File hosting (.csv file upload) – $0.023 / GB for file storage and $0.005 / 1,000 requests

Scalability

While our initial focus is upon producing maps for our internal use, we would like to be able to license maps to our partners in local authorities and other third party organisations.

The initial build of the system will include the potential to scale for partners and other organisations; since it will not include advanced user management, adding partner accounts will involve a straightforward, yet manual, process with the user database (training will be provided).

Security

We require strong security controls to prevent unauthorised access to the data, which in some usage scenarios could be personal in nature. We would also seek to limit the risk of loss of IP.

The web application includes two servers: a client-side interface (Gatsby) and an ExpressJS Node server for handling authentication and serving JSON data.

For security, all requests to the server endpoints will be denied unless they provide valid user credentials. The users will be manageable with a Firebase setup, which handles authentication and updates the server on user state in every request made.

Finally, the uploaded spreadsheet file containing the row and column data will be privately hosted on Amazon S3 and locked down so that it can only be accessed by the ExpressJS server.

In summary, a user must be logged in on the front-end through Firebase; Firebase confirms the user's status with the ExpressJS back-end, which then builds the data endpoints and returns the results to the logged-in user on the front-end.
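The request gating described above can be sketched as an Express-style middleware. Here the injected `verifyIdToken` parameter stands in for a real verifier such as firebase-admin's `admin.auth().verifyIdToken`; all names are illustrative assumptions rather than code from the delivered tool:

```javascript
// Build a middleware that rejects any request lacking a valid ID token.
// verifyIdToken(token) is expected to resolve to the decoded user claims,
// or throw if the token is invalid or expired.
function makeAuthMiddleware(verifyIdToken) {
  return async function requireUser(req, res, next) {
    const header = req.headers.authorization || "";
    const token = header.startsWith("Bearer ") ? header.slice(7) : null;
    if (!token) return res.status(401).json({ error: "missing token" });
    try {
      req.user = await verifyIdToken(token); // attach decoded claims
      next();
    } catch (err) {
      res.status(401).json({ error: "invalid token" });
    }
  };
}
```

In an Express app this would be registered ahead of the data endpoints, so every JSON route is denied unless Firebase has vouched for the user.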

Improvements to the Previous Tool

The previous tool has some limitations which we would be keen to remove. These include:

  1. Some lag when querying large datasets, especially when attempting to open the data table
  2. The download option is limited to certain browsers and queries returning a small number of results (and columns)
  3. Downloads not having the correct .csv or .xlsx extension
  4. Reliance on third-party systems (i.e. Google Fusion Tables) to work
  5. The address radius feature does not seem to be working currently (the address searches, in general, are likely to require improvement)
  6. While not essential, it would be useful to be able to draw around an area and display the results falling within this area
The new build will address each of these in turn:

  1. Where possible, datasets will be cached locally to improve performance
  2. Using React modules, download options will be improved to include all results across all modern browsers
  3. React modules will be implemented to provide the option of downloading as .csv or .xlsx where possible
  4. Where possible, all reliance on third-party systems will be removed; data processing, filtering, display, authentication, etc. will be handled by bespoke systems owned by tEC on final delivery and sign-off
  5. The address radius feature will be fixed and reimplemented in the new build of the mapping tool
  6. Polygon drawing on the map to search will be explored; while it cannot be guaranteed, every attempt will be made to implement this feature
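For the "draw around an area" feature, one standard approach is a ray-casting point-in-polygon test over the drawn shape's vertices. This is a generic algorithm sketch, not code from the delivered tool, and the coordinate layout ([x, y] pairs) is an assumption:

```javascript
// Ray-casting point-in-polygon test: cast a ray from the point and count
// how many polygon edges it crosses; an odd count means the point is inside.
function pointInPolygon([x, y], polygon) {
  let inside = false;
  for (let i = 0, j = polygon.length - 1; i < polygon.length; j = i++) {
    const [xi, yi] = polygon[i];
    const [xj, yj] = polygon[j];
    const crosses =
      yi > y !== yj > y && x < ((xj - xi) * (y - yi)) / (yj - yi) + xi;
    if (crosses) inside = !inside;
  }
  return inside;
}

const square = [[0, 0], [0, 10], [10, 10], [10, 0]];
console.log(pointInPolygon([5, 5], square)); // true
console.log(pointInPolygon([15, 5], square)); // false
```

Filtering the dataset against the drawn polygon is then the same predicate-and-filter pattern used for the other query controls.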

Continue Viewing My Case Studies

Tell Me About Your Project