Great Britain Historical Geographical Information System (GBHGIS)

New York Conference: Working Digitally with Historical Maps

This one-day conference brought together map librarians who are digitising their collections with academic researchers using historical maps as sources for GIS systems and historical databases. The focus was on novel uses of historical mapping in a digital environment.

Date: Saturday 25th February 2012
Time: 10.00am - 7pm
Venue: South Court Auditorium, Stephen A. Schwarzman Building, New York Public Library, Fifth Avenue at 42nd Street, New York, NY 10018-2788, USA

Special session of the Annual Meeting of the Association of American Geographers.

Supported by the New York Public Library, Cartography Associates and Old Maps Online, and hosted by the New York Public Library.

Download PDFs from the day here:

New York Conference: Programme [Acrobat (.pdf) - 484kb Fri, 21 Dec 2012 09:42:00 GMT]

New York Conference: Abstracts [Acrobat (.pdf) - 500kb Fri, 21 Dec 2012 11:32:00 GMT]

Presentations from session 1: Building rich resources

Max Edelson* & Bill Ferster (University of Virginia) - The "New Map of Empire" Project: Enhancing Cartography Scholarship with Dynamic Online Collections

The New Map of Empire project displays historic maps so that scholars can analyze them and readers can explore them alongside published texts.  Built to enhance historian S. Max Edelson's book-in-progress, The New Map of Empire: How Britain Imagined America before Independence (Harvard University Press, 2013), this website features about 500 geo-referenced manuscript and published maps drawn from several archives and online repositories.  It is built on the VisualEyes platform developed by Bill Ferster at UVa's SHANTI and makes use of a base map tailored from the Google Maps API.  Dynamic overlays developed with UVa’s Scholars’ Lab show the demographic and jurisdictional growth of British settlement in North America.  Auto-search buttons bring up parts of the collection that correspond to chapters of the book and frame them geographically in terms of region.  Each collection is displayed as an index map —that is, a series of overlapping bounding boxes that show the spatial area represented—which can be populated with high-resolution images of the maps.  These technologies create an intuitive environment in which readers can explore the maps and scholars can direct them to see what is important about their content.  Goals for the project include building a shell version of the site that scholars can use to display their own map collections on the web, using mobile platforms, and in e-book format.  The project, previously known as the Cartography of American Colonization Database, has received funding through an NEH Digital Humanities Start-Up Grant as well as an ACLS Digital Innovation Fellowship.


Joseph Hurley (Georgia State University) - Visualizing Neighborhood Change: The Georgia State University Library Digital Map Collection, "Planning Atlanta: A New City in the Making, 1950s - 1980s"

“Planning Atlanta: A New City in the Making, 1950s – 1980s” is a rich collection of digitized and geo-referenced historical Atlanta urban planning maps.  Containing over 700 historically significant city planning maps from the City of Atlanta and the Atlanta Regional Commission, this previously un-cataloged and hidden collection provides a vivid portrait of how the city's built environment, segregated neighborhoods and land use patterns changed over time.  Significant among these planning maps is an extensive number of urban renewal plans, many of which document “negro expansion areas” and neighborhoods that no longer exist due to redevelopment, highway creation and forced removal.  This digital collection employs flexible metadata fields, which provide unprecedented access to these maps and allow users to search by categories such as neighborhood, land use and urban renewal.  Moving beyond the traditional digital library collection, Planning Atlanta is designed to be an interactive collection that allows the original maps to be engaged with in ways that far exceed their original intentions.  Users of Planning Atlanta can open the maps directly into Google Earth as KMZ super overlays, thus enabling users with minimal GIS skills to readily visualize neighborhood and city-wide change by comparing these historical city planning maps with contemporary satellite images.  In this paper session, I will introduce this digital collection and discuss how Georgia State University faculty members from departments as varied as English and Geography are using this collection in their classes.


Michael Page (Emory University) - Modeling the History of the City using Library Resources

Atlases have long proved invaluable to scholars examining urban history. By leveraging GIS to link spatial features with archive and library collections, the historical atlas can be transformed into a tool for digital humanities, social science, and public health. This presentation shares the concept, methods, intended outcomes, and challenges of a current project at Emory University Libraries in (re)mapping early 20th-century Atlanta.


Marcel A Fortin* (University of Toronto) & Jennifer Bonnell (University of Guelph) - The Don Valley Historical Mapping Project

The Don Valley Historical Mapping Project is a collaborative HGIS project between Marcel Fortin, GIS and Map Librarian at the University of Toronto, and historian Jennifer Bonnell, currently a postdoctoral fellow at the University of Guelph.

The project documents historical changes in the landscape of the Don River Valley. Drawing from a wide range of geographical information available for the Don River watershed, including historical maps, geological maps, fire insurance plans, planning documents, and city directories, the project uses GIS software to place, compile, synthesize and interpret this information and make it more accessible to the public as geospatial data and scanned maps.

To date, the project team has compiled geospatial datasets documenting changes to the river channel and shoreline of Toronto harbor, 1858-1918; industrial development in the Lower Don River Watershed, 1857-1951; historical mill sites in the Don River Watershed, 1825-1852; and land ownership in the watershed in 1860 and 1878.


John Cloud (NOAA Central Library) - Starting from Hassler's Primary Triangle: The Survey of the Coast in "New York Bay and Harbor and the Environs" as the Foundation for Geo-Spatial Data for North America.

The National Oceanic and Atmospheric Administration (NOAA) is the oldest scientific agency in the US government. It began, in 1807, as the Survey of the Coast, under the Swiss immigrant geodesist Ferdinand Hassler. The original geodetic network, and later topographic and hydrographic mapping, concentrated on what Hassler termed “New York Bay and Harbor and the Environs”. Historic Coast Survey materials of New York are compelling, because their creators were among the finest cartographers of the age, and because by their nature, they elicit and illuminate issues such as changing datums, evolving sea levels, shifting monuments, ecological transformations, and the challenges of mapping one of the most dynamic and celebrated human environments on the planet.

Presentations from session 2: Enabling Access

Julie Sweetkind-Singer (Stanford University) - Digital Philanthropy: Increasing Access through Donor Collaboration

Over the past three years, Stanford University Libraries has embarked on a program to work directly with local Bay Area map collectors to scan and provide online access to their private collections.  These collections mainly include items not owned by the libraries, allowing Stanford to increase its digital holdings of rare and valuable materials.  The program benefits not only Stanford's scholars and students, but also the donors, who are able to make their collections available to a public they could not easily reach.  The collections typically have been built over decades, focusing on specific regions or themes and giving one a unique view of a topic or geographic area.  This talk will focus on the steps taken to put the program in place, including developing workflows, writing contracts, moving materials, manipulating metadata, and creating points of access.


Matt Knutzen* (New York Public Library) & Shekhar Krishnan (MIT Program in Science Technology and Society) - Unbinding the Atlas:  Working with Digital Maps

Scanned sources are now a routine part of scholarly research, but maps present a unique dilemma in the digital revolution underway in universities and libraries. Unlike texts, audio or video, historical maps are rich funds of data which become no easier to read even after they are scanned. The work of warping, tracing, and annotating scanned maps overwhelms even the most ambitious scholar. Our presentation will show how open source GIS software, free web services and linked data standards, powered by "crowd-sourced" public data, offer vast new archives for place-based research across the social sciences, geography and humanities.

The "spatial turn" in these disciplines has long been acknowledged, but there remains a wide gap between the data-driven empiricism of geographers and GIS experts, and new research agendas being advanced in "digital" history, anthropology, and humanities. The estrangement of quantitative from theoretical research in these disciplines has meant scholars lack the technical skills to build databases linking sources and stories to places and periods. Maps and geo-data are still mostly used to illustrate text-based arguments, rather than ask new questions about space and time.  We will demonstrate our ongoing work with the New York Public Library Maps Division on their Map Warper and Digitizer, a platform for scholars and the public to geo-reference scanned maps and digitize historical features in a web environment. We will also demonstrate a prototype Digital Gazetteer, a web-based interface for browsing, editing and relating named entities and geographic features over time.


Bonnie Burns (Harvard University Map Collection) - OpenGeoportal: A Collaborative Geographic Search Tool

Academic libraries and geospatial researchers are creating and collecting geospatial data at a furious rate, but discovery of spatially referenced materials in library catalogs can still be difficult due to a lack of integrated spatial searching.  OpenGeoportal is a new system that provides an intuitive, map-based search interface along with more traditional text search tools. The powerful searching is coupled with fast data preview functionality, and this combination maximizes the accessibility of our catalog of tens of thousands of metadata records.

OpenGeoportal is envisioned as a way for institutions to share development resources and create a common interface to search for all kinds of geospatial data, both vector and raster.  It was developed collaboratively and is based on the open source components MapServer, OpenLayers and Solr.  The original partnership of Tufts, MIT and Harvard has grown to include academic partners Princeton, Stanford, Berkeley, Columbia, UConn, and Yale, as well as government partners such as MassGIS.  The group of partners is committed to sharing both development tasks and standards compliant metadata.

Within the OpenGeoportal metadata catalog are thousands of records describing geo-referenced images of historical maps.  From early global scale images to 1970s topographic maps, the OpenGeoportal catalog can provide access to a wide array of materials.  These maps can be previewed in geographic space, and most of the images can be downloaded instantly complete with geo-referencing information. This presentation will provide a demonstration of the OpenGeoportal search interface and an introduction to the historic materials in the catalog.


Meredith Westington* & Keith Bridge (NOAA/NOS/Office of Coast Survey) - The Value of a Bounding Box: Moving Historical Charts beyond the Image Browser

The Office of Coast Survey's Historical Map & Chart Collection contains over 33,000 scans of maps and charts from the late 1700s to the present day. The Collection is a rich historical archive of charts and maps produced by the National Oceanic and Atmospheric Administration and its predecessor agencies, namely the U.S. Coast and Geodetic Survey and U.S. Lake Survey. In 2011, the Office of Coast Survey launched two new search capabilities to make the collection more accessible: by geographic position and by geographic place name.  Users, who were previously tied to searching geographies by names in chart titles or state attributes, can now discover new materials.

The new geographic searches are successful due to extensive efforts to collect 'bounding boxes' for each chart—a task complicated by the fact that nautical charts vary widely in scale and size.  Bounding boxes, maximum and minimum geographic extents, were collected in the digital environment using a special tool that looked at ratios of pixels on the screen. This presentation will look at the bounding box collection tool and the methods used to develop the place name search based on the Geographic Names Information System, and will detail a possible method for automatic geo-rectification based on bounding boxes and chart neatlines.
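The pixel-ratio idea described above can be sketched in a few lines. This is a hypothetical illustration of the general technique, not the Office of Coast Survey's actual tool; the function name, reference points and image dimensions are all invented:

```python
def chart_bounding_box(ref1, ref2, width, height):
    """Extrapolate a chart's bounding box (west, south, east, north)
    from two on-screen reference points, assuming a locally linear
    pixel-to-degree ratio.  Each ref is (col, row, lon, lat)."""
    x1, y1, lon1, lat1 = ref1
    x2, y2, lon2, lat2 = ref2
    deg_per_px_x = (lon2 - lon1) / (x2 - x1)
    deg_per_px_y = (lat2 - lat1) / (y2 - y1)  # negative: row 0 is the top
    west = lon1 - x1 * deg_per_px_x
    east = lon1 + (width - x1) * deg_per_px_x
    north = lat1 - y1 * deg_per_px_y
    south = lat1 + (height - y1) * deg_per_px_y
    return (west, south, east, north)

# Two reference points clicked on a 1000 x 1000 pixel scan (invented values):
bbox = chart_bounding_box((100, 100, -74.2, 40.8),
                          (900, 900, -73.8, 40.4), 1000, 1000)
```

A linear extrapolation like this is only approximate for large or small-scale charts, which is presumably why neatline-based geo-rectification is raised as the next step.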


Gregory J Allord (US Geological Survey) - USGS Historical Topographic Map Collection: Converting and Integrating Lithographic Maps into The National Map.

The U.S. Geological Survey (USGS) topographic mapping program has accurately portrayed the complex geography of our nation on lithographically printed maps for almost 130 years. Currently, USGS is two-thirds of the way toward scanning, cataloging and geo-referencing  the approximately 200,000 topographic maps published since the inception of the Bureau's topographic mapping program in 1884.

Historical maps are an important national resource that provide a long-term record and document the natural, physical and cultural landscape. The history documented by this collection and the analysis of spatial patterns is invaluable throughout the scientific and nonscientific disciplines. Genealogists, historians, anthropologists, archeologists and others use this collection for research as well as for a framework on which a myriad of information can be presented in relation to the landscape.

The next stage in serving this comprehensive Historical Topographic Map Collection is completion of a digital repository of USGS topographic maps that will be available at no cost.  More than 140,000 GeoPDF files are currently available for viewing and printing by the general public and non-GIS users.  Development is underway to make spatially referenced files available through The National Map portal, which will provide a time series service and support for spatial analysis.  Other National Map data services, such as geographic names, will be tested for searching historical maps and linking to official geographic names data. This paper provides a description of the topographic map collection, organization and processing steps, status of maps processed to date, development of services, and future plans.

Presentations from session 3: Extracting and Defining Features

James Burt* (University of Wisconsin-Madison), Gregory J Allord (U.S. Geological Survey) &
Jeremy White (New York Times; University of Wisconsin-Madison) - Efficient Georeferencing of Small-scale Scanned Map Images

Agencies and libraries throughout the world are scanning historical printed thematic maps at an accelerating rate. Data overlay and other uses require geo-referenced images, but standard GIS geo-referencing tools are both tedious and error-prone. We demonstrate and report on new software suitable for small-scale maps that addresses the major user bottlenecks found in existing programs. Because of the nature of thematic maps, the program makes no assumptions about the layout of control marks in the image and works with about 20 map projections. (If the projection is unknown the user can search for the best-fitting projection from a list of candidates.) Our approach uses low-order polynomials, based on graticule control marks, as the transform between image and projected spaces. Rather than requiring keyboard entry, we automatically populate longitude/latitude fields using a local linear model derived from previous entries. More importantly, we do not require exact digitizing of control marks. We perform pattern searches for graticule intersections within surrounding windows. The exact search pattern is determined by a control mark's position within the suite of marks, and is modified according to the map projection. Search windows can be roughly specified by on-screen digitizing or they can be optionally generated by the program. Cross-validation is used for error diagnosis, and images failing to meet a user-specified threshold are flagged as such. Output is TIFF-formatted files in geographic or projected coordinates. The program is in the public domain and distributed as a Windows binary and in C# source code.
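In its simplest first-order (affine) form, a polynomial transform of the kind described above amounts to a least-squares fit from pixel coordinates to longitude/latitude. The sketch below illustrates that general technique only, not the authors' program; the control marks are invented:

```python
import numpy as np

def fit_affine(pixels, lonlats):
    """Fit a first-order polynomial (affine) transform from image pixel
    coordinates to geographic coordinates by least squares."""
    px = np.asarray(pixels, dtype=float)
    ll = np.asarray(lonlats, dtype=float)
    # Design matrix: one row [x, y, 1] per graticule control mark.
    A = np.column_stack([px, np.ones(len(px))])
    coeffs, *_ = np.linalg.lstsq(A, ll, rcond=None)
    return coeffs  # 3x2 matrix mapping (x, y, 1) -> (lon, lat)

def apply_transform(coeffs, x, y):
    lon, lat = np.array([x, y, 1.0]) @ coeffs
    return lon, lat

# Four graticule intersections digitized on a scanned map (invented values):
pixels = [(100, 100), (900, 100), (100, 900), (900, 900)]
lonlats = [(-75.0, 41.0), (-74.0, 41.0), (-75.0, 40.0), (-74.0, 40.0)]
coeffs = fit_affine(pixels, lonlats)
center = apply_transform(coeffs, 500, 500)
```

Higher-order polynomials add squared and cross terms to the design matrix, which is what allows the fit to absorb modest distortion and projection curvature.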


Richard Marciano (University of North Carolina Chapel Hill) - Connecting People, Past, and Place: exploring semi-automated extraction of text and polygons from common historic sources

The challenge we are exploring is the development of processes for automating and crowdsourcing the extraction and geo-referencing of individual and household records from a variety of publicly-available sources of information, so that these public documents can be used by local organizations as public goods.  This also involves automating polygon extraction from historic maps.

The Digital Innovation Lab at UNC Chapel Hill is launching a pilot project called "P³: Connecting People, Past, and Place", intended to develop a software platform for harvesting and spatializing historical data from the most comprehensive and publicly-available sources of information about everyday life in early 20th century America: city directories, newspapers, urban ground plans, and census enumerations.

The Lab will initially explore techniques for automating the extraction of data about individuals and businesses from digitized historical city directories for North Carolina.  There are hundreds of such sources from the late 19th and early 20th centuries digitized by the North Carolina Collection of the University Library and published by the Internet Archive.  City directories contain alphabetical lists of the residents for a given town, along with their addresses and occupations. Many city directories published after 1900 also contain a street index, listing the residents of each street in street address order. In the southern U.S., city directories were racially coded until the 1960s, making it possible to identify African-American residents, businesses, and neighborhoods.


Andrea White (Louisiana State University and the University of New Orleans) - Creating an Archaeological Sensitivity Model for New Orleans using Historic Maps and Historical GIS

For geographers studying past urban landscapes, historic maps are a valuable tool. GIS has become a powerful method to overlay these archival maps onto the modern landscape, as well as perform spatial analysis to understand the past cityscape. Researchers can examine historic charts, drawings, and maps for clues to historic land use and to understand landscape change through time.  For archaeologists, these historic cartographic sources can indicate the potential location for archaeological sites.  In New Orleans, Louisiana, a large metropolitan and historic city, there are hundreds of buried archaeological sites. These sites, and the features and artifacts contained within them, tell the hidden story of the city's colonial, antebellum, and more recent past. Since these archaeological sites are no longer visible, planners and developers often inadvertently destroy sites or may not understand the scientific value of archaeology. Due to the rapid pace of modern development, historical GIS modeling could aid in assessing the archaeological potential of a property quickly. Currently, the University of New Orleans and the Louisiana Division of Archaeology are developing a large-scale GIS archaeological sensitivity model that incorporates numerous historic maps, historical data, and archaeological site information to trace the development of the city over 250 years.  Our final product is not only a powerful planning tool, but a new device for researchers examining the past urbanism of the city.


Anne Leonard, Robin Michals & Peter Spellane* (New York City College of Technology) - Using old maps and new methods to discover the early chemical and petroleum industries of Newtown Creek

The Newtown Creek, a waterway in the inner harbor of New York City, was, for a period in the late 19th century, one of the most important petroleum refining and chemical production sites in the US. Before petroleum was refined there, Newtown Creek was home to the country's largest coal oil refinery. Examining the precise locations of the pioneering companies' operations, especially those of producers of sulfuric acid and refiners of coal and petroleum oils, may indicate the role of industrial synergy in drawing these businesses to the Newtown Creek. We use hardcopy maps, land conveyances, aerial and land photographs, geo-rectified political and fire insurance maps, period health reports, and descriptions of then-current technical and business practices to inform descriptions of the historical and environmental impact of the chemical and petroleum technologies practiced along the banks of Newtown Creek from the 1860s through the early 20th century, with a focus on the interactions between industrial entities along the Creek. A spatial and spatio-temporal analysis of the development of these industries allows a new examination of the effects of the chemicals and oil production industries on Newtown Creek and provides a basis for understanding the history of New York City and for understanding the EPA's recent addition of Newtown Creek to the Superfund National Priorities List of abandoned hazardous waste sites.


Stuart Macdonald (EDINA, University of Edinburgh) - Addressing History - Crowdsourcing the Past

The JISC-funded AddressingHistory project, led by EDINA at the University of Edinburgh in partnership with the National Library of Scotland, has created an online crowdsourcing tool and API which enables users (particularly local and family history groups, and genealogists) to combine data from digitized historical Scottish Post Office Directories (PODs) for Edinburgh (1785, 1865, 1905 in the first instance) with contemporaneous historical maps. The technologies deployed are scalable for the full collection of 670 Post Office Directories covering the whole of Scotland.

Phase 2 funding sought to develop functionality complementary to the original work and to broaden the geographic coverage of content. Work included spatial searching and enhancing the parsing process which assigns a geo-reference to POD entries (utilizing the Google geocoding utility). Multiple addresses (i.e. entries where individuals may have a domestic address and one or more business addresses) were also made explicit for searching purposes.

POD configuration files (used to configure the parser settings) were externalized, so that the rules required to successfully parse the POD content can be augmented (allowing a user to add a repeated anomaly found within the structure of a POD).  We also plan to 'externalize' the parsing tool, so that a user wishing to geo-reference a POD (for an area of the country or era not covered by the tool) will have the ability to do so.  Additional content for Edinburgh, Glasgow and Aberdeen (1881, 1886, 1891), to coincide with census (and an inter-census) years, and a mobile Augmented Reality application will be added shortly.
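A rule-driven directory parser of this general shape can be pictured as a list of externally supplied patterns tried in order. This is a hypothetical sketch, not AddressingHistory's actual configuration; the rule, field names and example line are invented:

```python
import re

# Externalized rules (in practice loaded from a configuration file); each
# maps a regular expression over a directory line to named fields.
RULES = [
    re.compile(r"^(?P<surname>[A-Z][a-z]+) (?P<forename>[A-Z][a-z.]*), "
               r"(?P<occupation>[^,]+), (?P<address>.+)$"),
]

def parse_pod_line(line, rules=RULES):
    """Return the first rule's named fields for a directory entry,
    or None if no rule matches (a user could then add a new rule)."""
    for rule in rules:
        match = rule.match(line.strip())
        if match:
            return match.groupdict()
    return None

entry = parse_pod_line("Smith John, bookbinder, 12 George Street")
```

Keeping the rules outside the code is what lets users handle a recurring anomaly in one POD without touching the parser itself.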

Presentations from session 4: Digital Gazetteers

Merrick Lex Berman (China Historical GIS, Harvard University) - Historical Gazetteer Development and Integration:  CHGIS, Regnum Francorum, and GeoNames

As a growing corpus of historical place name data becomes available on the semantic web, or is exposed through open APIs, the practical problem of system integration demands more attention.   Disambiguation routines must move beyond textual string matching of place name spellings and take into account various filtering parameters, such as parent jurisdictions, feature classifications, and proximity.  Although temporal filters might also be used, the majority of currently available gazetteer data contain no date elements, therefore we must take the first step of attempting to create correlations between datasets of undated place names and historical attestations of place names.  This discussion will report the results of using a string matching and proximity algorithm to find matching candidates between the China Historical GIS place names and GeoNames.  A comparison with French historical place names will be attempted, based on data developed by Johan Ahlfeldt for Regnum Francorum.  In addition to batch geocoding methods, an early implementation of the MapWarper tool for digitization of named features from historical maps will be evaluated.
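A toy version of such a string-matching-plus-proximity filter might look like the following. This is illustrative only; the thresholds, field names and similarity measure are assumptions, not the authors' algorithm:

```python
import math
from difflib import SequenceMatcher

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def match_candidates(record, gazetteer, min_sim=0.8, max_km=50):
    """Rank gazetteer entries by name similarity, filtered by proximity
    and (when both sides supply one) feature classification."""
    hits = []
    for entry in gazetteer:
        sim = SequenceMatcher(None, record["name"].lower(),
                              entry["name"].lower()).ratio()
        if sim < min_sim:
            continue
        dist = haversine_km(record["lat"], record["lon"],
                            entry["lat"], entry["lon"])
        if dist > max_km:
            continue
        if record.get("class") and entry.get("class") \
                and record["class"] != entry["class"]:
            continue
        hits.append((sim, dist, entry["name"]))
    # Best name match first; nearest first among equal matches.
    return sorted(hits, key=lambda h: (-h[0], h[1]))
```

The proximity and classification filters are exactly what prunes spurious candidates that pure string matching would accept, such as similarly spelled places on other continents.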


Raj Singh (Open Geospatial Consortium) - Establishing a Global Data Sharing Framework for Place Names

Building upon last year's special session on historical gazetteers, this talk presents work done by stakeholders in the place naming community to create a basic framework for data sharing across disciplines, ranging from archaeology to museum accession records to government place name databases to consumer "check-in" mobile apps. The Open Geospatial Consortium and the World Wide Web Consortium have led a standards effort to create a basic framework which will encourage greater data sharing about places on the web.

Having a standard information model across communities will enhance the ability for researchers and businesses to cross-reference place-based information across databases and disciplines, enabling, for example, one to navigate from an ancient battle location to the businesses located in that place today, and then on to museum collections containing objects from that place. In this session we will present the draft core standard for all place names, as well as a number of extensions to the core, including ones for consumer applications, augmented reality, historical records, and government place name records.


Ashley Holt (National Geospatial-Intelligence Agency) - Gazetteer Representation of Place Name Usage

Gazetteers provide an important foundation for geo-location services and applications, for example during crisis mapping, humanitarian assistance and disaster response. In addition, the variations in names (or nicknames) that people actually use to describe places can encode important relationships between a geographic location and the social constructs associated with place, which may change over time. Many new datasets have emerged that capture local and vernacular place names from user-contributed and harvested data. Deriving an understanding of place name usage patterns from these data will require new approaches for capturing place name information that is contained implicitly in user-contributed data, as well as methods for identifying patterns in vernacular place name usage over time. This paper will present a research agenda and framework for gazetteer representation of place name usage and changes in place name usage over time.

Keynote Presentation: Finding and Referencing Old Maps Online

This joint presentation will demonstrate and launch a new global search portal for digitised historical maps: Old Maps Online.

David Rumsey* (Cartography Associates),
Humphrey Southall* (Univ of Portsmouth - Great Britain Historical GIS) &
Petr Pridal* (Klokan Technologies)

Hundreds of thousands of historical maps have now been scanned and made available on-line by libraries around the world, and this has been a great boon to anyone interested in the history of cartography. However, those interested in the history of the places shown on maps have been less well served: just because a map is "on the web" does not mean we can find the relevant library web site, and even when we find the site the available catalogues are little help in finding maps covering particular places. One indication of the unhelpful nature of existing map catalogues is the fact that, while essentially all major libraries have computerized their book catalogues, under a quarter of map catalogues have been computerized. A further problem is that even when digitized historical maps have been made available via geo-spatially aware online systems, the resulting references, i.e. the Uniform Resource Locators for accessing the maps, are generally very technology-dependent and unlikely to work even a few years later. This keynote presentation launches the Old Maps Online project, providing a universal search portal for historic maps designed to complement rather than compete with libraries' own search interfaces, and also developing best practices for defining persistent Uniform Resource Identifiers for historic maps - URIs not URLs. The portal is an enhanced version of an interface already developed for the David Rumsey Collection, and several major US and UK collections are committed to contributing, but our aim is to include as many collections as possible.