Main Street is a collaborative project between the DX Lab and the Tweed Regional Museum, built during a two-week ‘digital drop-in’ run by the Lab. The drop-in program supports creative and innovative thinking and is open to collaborations with staff, researchers, students, artists and digital peers. Erika Taylor, Curator of the Tweed Regional Museum, worked with the Lab developers for two weeks to develop the concept and the resulting first iteration of Main Street.
The prototype (a desktop experience for now) explores how the collection at The Library can be used in conjunction with a NSW regional collection to provide a digital experience that compares the two. Main Street uses 100 images of “Main Streets” from the Tweed Regional Museum Collection (running along the top of the page) and compares them with 100 images of “Sydney Main Streets” from the SLNSW Collection (along the bottom of the page). The data sets are organised in sequential order, ranging from the 1880s to the 1950s. The space in the middle of the page displays common words from both the Sydney Morning Herald and the Tweed Daily newspapers for the relevant decade, which we pull from Trove.
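The decade-by-decade layout above relies on bucketing each collection's records by the decade they were created in. A minimal sketch of that grouping step, in Python rather than the project's PHP, with an assumed record shape of `{"year": ..., "title": ...}`:

```python
# Illustrative only: group image records by decade so each column of the
# page can show the matching slice of both collections.
from collections import defaultdict


def bucket_by_decade(records):
    """Group records by decade, returned in chronological order."""
    buckets = defaultdict(list)
    for record in records:
        decade = (record["year"] // 10) * 10  # e.g. 1887 -> 1880
        buckets[decade].append(record)
    # Sort decades so iteration matches the page's 1880s-to-1950s order.
    return dict(sorted(buckets.items()))
```

The same per-decade keys can then drive the middle panel, looking up the common newspaper words for whichever decade is in view.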
Main Street uses several cultural heritage APIs: eHive, Trove and the Library’s own. The project documentation and resulting code are available in our GitHub account. The project was built using the Symfony 2 framework, primarily because it is agnostic and suited our RAD (Rapid Application Development) approach.
The development process was not all plain sailing by any stretch of the imagination, and presented a few obstacles around filtering and aggregating the data from three very disparate APIs.
- An API is built on the premise of ‘how the data is stored’, and there is no agreed, standardised method across sources. Each data source therefore required its own implementation to extract the necessary data.
- The primary endpoint provided by each API is search; however, harvesting data based on a pre-defined search query presents a problem. The search endpoints are restricted to returning a subset of the results up to a maximum threshold, simulating a paginated search. Therefore, in order to return all of the results, a large number of cURL requests were required, utilising the offset and limit parameters.
- Upon obtaining the response from the search query, further cURL requests were required in order to obtain the complete record for each of the results. Combine this with the initial set of requests and you have effectively built a DoS attack. In order to restrain resource intensity, a maximum threshold was set on the number of results harvested from each of the data sources. Note: for eHive, an additional set of cURL requests was required just to obtain the tags for each of the results.
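The paginated harvesting pattern described above can be sketched as follows. This is a simplified illustration in Python rather than the project's actual PHP/cURL code; the `offset`/`limit` parameter names and the page-fetching callback are assumptions, since each API exposes pagination slightly differently:

```python
# Illustrative sketch: walk a paginated search endpoint using offset/limit,
# stopping at a maximum-record cap to restrain resource intensity.
from typing import Callable, List


def harvest(fetch_page: Callable[[int, int], List[dict]],
            page_size: int = 50,
            max_records: int = 100) -> List[dict]:
    """Call fetch_page(offset, limit) repeatedly until the cap is hit
    or the source runs out of results."""
    records: List[dict] = []
    offset = 0
    while len(records) < max_records:
        page = fetch_page(offset, page_size)
        if not page:
            break  # no more results from this source
        records.extend(page)
        offset += page_size
    return records[:max_records]  # enforce the per-source threshold
```

In a real harvester, `fetch_page` would wrap the HTTP request for one source (and, for eHive, the follow-up requests for full records and tags would hang off each returned result); sleeping between calls is also worth adding so the harvest does not hammer the upstream APIs.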
You can read more about the process of making Main Street in this post. We hope you enjoy exploring the main street comparisons from the two different locations. We will continue to work on this early-stage prototype to improve it, and hopefully add more content from other cultural heritage institutions.
This project is supported by Arts NSW’s Mentorship, Fellowship and Volunteer Placement Program; a devolved funding program administered by Museums and Galleries NSW on behalf of the NSW Government.