Friday, 17 October 2008

NEDF Part 4: Recommendations for a Plan of Action

The following recommendations should be seen as a checklist of actions that collectively will move NZ's elevation infrastructure fully into a digitally wired Web 2.0 world. For this to be achieved, every contributor needs to move their elevation assets and knowledge into a digital, web-enabled form. Standards need to shift from official published documents describing the circumstances for the standard and containing formulae and data references, to authoritative web-services that actively embody best practice use of the standard.

While this might traditionally be approached in a grand-design, top-down, organised way, it can also be approached as a bottom-up grass-roots movement in which each contributor progressively establishes a suite of web-services associated with their own elevation assets and knowledge. Such an approach is anathema to traditionalists who need to organise, but the beauty of Web 2.0 is that, provided each participant approaches their part of the problem using appropriate standards (eg the OGC WFS and WCS standards), with the expectation that everything will be in a state of continuous evolution as they incrementally respond to market needs – ie the development principle of continuous just-in-time beta releases rather than occasional massive version changes – then collectively we will converge on a working solution with minimal grand-design overhead and a significantly reduced risk of failure. Success in a Web 2.0 environment is directly related to shortened time to market. Don’t ‘talk and plan’, just ‘do it and do it again’.

There are three types of actions in the following recommendations: those associated with new and improved data, those associated with licensing and pricing, and those associated with web-service enabling existing and new digital elevation assets. All are important in the long run, but the provision of web-services is the easiest to achieve quickly and will drive the imperative for the other two, by generating demand and, equally importantly, by making the need more transparent. So wherever digital assets and process knowledge are already in the public domain – eg central govt data and standards – there is the opportunity to make a very significant start. Each agency with elevation assets knows its digital assets better than I do and will be able to take the principles outlined here and convert them into an appropriate implementation plan, which will almost certainly deviate from the details I have suggested below. The most important thing is that each agency takes these principles on board and considers its assets in the light of Web 2.0 thinking.

Access to Existing Data

Most high resolution data is owned by local government, under a range of different licensing arrangements for access to the data for anything other than its original purpose, so work is needed to:

1. establish a web-service based on-line catalogue of all primary sources of elevation data, their ownership and licensing. This includes LiDAR data and earlier data sources such as contours and spot height measurements (a sketch of querying such a catalogue follows this list).

2. negotiate licensing arrangements for access to existing data where possible

3. establish web-services for on-demand data access and delivery

4. establish protocols for ensuring that future high resolution elevation data is licensed for the widest possible access.

5. encourage all owners of elevation data assets to participate in making their data available. This includes non-traditional contributors such as Transit NZ and road engineering contractors, who have very detailed before-and-after data associated with highway construction, road realignment etc., or architects and construction companies who build buildings whose outside dimensions (footprint and height) are needed to convert Digital Surface Models (DSM) to bare-earth Digital Elevation Models (DEM).
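
As a sketch of what the catalogue service in recommendation 1 might look like from a user's perspective, the snippet below queries a hypothetical OGC CSW catalogue endpoint using the OWSLib client. The URL is a placeholder, the choice of CSW is only one option, and a real catalogue record would also need to carry the ownership and licensing fields discussed above.

    from owslib.csw import CatalogueServiceWeb
    from owslib.fes import PropertyIsLike

    # Hypothetical catalogue endpoint; any OGC CSW service would behave the same way.
    csw = CatalogueServiceWeb("https://example.govt.nz/elevation/csw")

    # Find records that mention LiDAR anywhere in their metadata.
    query = PropertyIsLike("csw:AnyText", "%LiDAR%")
    csw.getrecords2(constraints=[query], maxrecords=20)

    for rec_id, rec in csw.records.items():
        # A useful record would also expose owner, licence and coverage extent.
        print(rec_id, rec.title)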

Reference Frame Solutions

Precise conversion between existing reference frames is limited by the state of our current knowledge of the reference frames, so a programme of work is required to resolve at least the uncertainties in existing knowledge and establish protocols for continuing refinement of our reference frame knowledge.

  • geoid reference: current knowledge of the geoid reference is based on a set of disconnected historic high precision level surveys that followed the roads of the day, predating, for instance, the Haast Pass road. Two possible solutions present themselves, together with the web-service actions needed to support them:

    1. extend the high precision surveys, using modern equipment, to close the loops that are currently open, and link neighbouring surveys. This will allow the existing survey data to be recomputed, reducing the uncertainty in the existing data.

    2. investigate the option of adding a levelling payload to the existing road (& rail) condition surveying equipment. This equipment regularly traverses all major roads, recording road pavement condition as a function of location. If the survey vehicle had level recording gear added to its payload, and all data from successive surveys were accumulated, the frequency of the measurements would probably mean that even lesser precision individual measurements could result in greater overall precision (see the sketch after this list).

    3. establish web-services for on-demand data access and delivery of all the historic and real-time raw data gathered

    4. establish web-processing services to provide on-demand standard reference analysis of this data.
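
To illustrate the precision argument behind option 2, the sketch below (Python, with made-up numbers) shows how accumulating many lower-precision passes over the same route reduces the random error roughly as 1/√n. This only addresses random error; systematic biases in the vehicle-mounted equipment would not average out this way.

    import numpy as np

    # Illustrative only: one road segment, repeatedly surveyed by a
    # vehicle-mounted levelling payload of modest single-pass precision.
    rng = np.random.default_rng(1)
    true_rise = 12.345        # metres of height gain over the segment (made up)
    sigma_single = 0.020      # assumed 20 mm standard error for a single pass
    passes = 50               # passes accumulated over successive condition surveys

    measurements = true_rise + rng.normal(0.0, sigma_single, passes)
    estimate = measurements.mean()
    predicted_sigma = sigma_single / np.sqrt(passes)   # roughly 2.8 mm

    print(f"single-pass sigma {sigma_single*1000:.1f} mm, "
          f"{passes}-pass mean sigma ~ {predicted_sigma*1000:.1f} mm, "
          f"estimate {estimate:.4f} m")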

  • sea-level reference: the key to precision in sea-level based reference frames is the time-span of the measured baseline, coupled with the quality of the reference to the associated land-based bench-mark(s). A number of the existing sea-level stations are based on relatively short baselines of under a year. Two years of intensive measurement is normally considered the minimum needed to properly model the tidal pattern. Modelling for sea-level change requires continuous, but less frequent, monitoring. The suggested solution is to:

    1. determine the configuration of an optimal network of port and open-coast monitoring stations

    2. establish permanent sea-level monitoring stations with data-loggers

    3. establish web-services for on-demand data access and delivery of all the historic and real-time raw data gathered

    4. establish web-processing services to provide on-demand standard reference analysis of this data.
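
As a sketch of the kind of standard reference analysis item 4 refers to, the snippet below fits mean sea level and four principal tidal constituents to a synthetic two-year hourly gauge record by linear least squares. The constituent set, the synthetic record and the method are illustrative assumptions, not a prescription for the real web-processing service.

    import numpy as np

    # Angular speeds (degrees per hour) of four principal tidal constituents.
    SPEEDS = {"M2": 28.9841042, "S2": 30.0, "K1": 15.0410686, "O1": 13.9430356}

    # Synthetic two-year hourly record standing in for a real tide gauge.
    rng = np.random.default_rng(0)
    t = np.arange(2 * 365 * 24, dtype=float)     # hours since the start of the record
    truth = (1.8
             + 0.9 * np.cos(np.radians(SPEEDS["M2"]) * t - 0.4)
             + 0.3 * np.cos(np.radians(SPEEDS["S2"]) * t))
    obs = truth + rng.normal(0.0, 0.05, t.size)

    # Design matrix: mean level plus a cos/sin pair per constituent.
    cols = [np.ones_like(t)]
    for speed in SPEEDS.values():
        wt = np.radians(speed) * t
        cols += [np.cos(wt), np.sin(wt)]
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, obs, rcond=None)

    msl = coef[0]
    amps = {name: round(float(np.hypot(coef[1 + 2*i], coef[2 + 2*i])), 3)
            for i, name in enumerate(SPEEDS)}
    print(f"mean sea level ~ {msl:.3f} m, amplitudes: {amps}")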

  • ellipsoidal reference: New Zealand uses many 'standard ellipsoids', some unique to NZ and others that are also widely used internationally. Unlike the geoid and sea-level references, ellipsoids are generally mathematically defined and not subject to ongoing refinement through measurement. The one exception is the family of ellipsoids based on NZGD2000, which are designed to allow for the differential tectonic movement that distorts the NZ landmass. NZ has a network of permanent highest-precision differential GPS stations established to monitor and define these distortions.

    1. establish web-services for on-demand data access and delivery of all the historic and real-time raw data gathered

    2. establish web-processing services to provide on-demand standard reference conversions between the ellipsoids used in NZ (see the sketch after this list)

    3. establish web-processing services to provide the standard reference reduction of the data from the GPS stations, so that users can obtain the difference between the standard ellipsoid and the distorted position of the NZ terrain at any date within the range of the observations.
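
As a sketch of the conversion service in item 2, the snippet below uses the pyproj library to convert a point from NZGD1949 (EPSG:4272, International 1924 ellipsoid) to NZGD2000 (EPSG:4167, GRS80). Wrapping this in a web-processing interface, the chosen point, and the use of pyproj are illustrative assumptions rather than a prescription.

    from pyproj import Transformer

    # Datum/ellipsoid conversion between the two geodetic datums most commonly
    # encountered in NZ data; pyproj selects a default transformation between them.
    tf = Transformer.from_crs("EPSG:4272", "EPSG:4167", always_xy=True)

    lon49, lat49 = 174.76, -41.28          # an arbitrary point near Wellington, in NZGD49
    lon2000, lat2000 = tf.transform(lon49, lat49)

    print(f"NZGD49   {lat49:.6f}, {lon49:.6f}")
    print(f"NZGD2000 {lat2000:.6f}, {lon2000:.6f}")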

Elevation Surface Interpolation Solutions

There are many of these, some geared to particular source data types – eg contour to DEM, or stereo image to DSM – and others geared to producing elevation models with particular characteristics – eg drainage enforcement, optimising height and/or slope accuracy, or removal of certain subtle artefacts. Ultimately, the wider the selection the better. Some are available as open source code, others are licensed – the open source ones are obviously more amenable to being published as a web-service, but the important thing is to get the codes into use.
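
As a minimal sketch of the interpolation step (and of recommendation 1 below), the snippet uses scipy's griddata to turn synthetic scattered spot heights into a raster grid at a nominated resolution. The points, extent and choice of linear interpolation are assumptions for illustration only; a real service would read its points from the primary-data web-services described earlier.

    import numpy as np
    from scipy.interpolate import griddata

    # Synthetic scattered elevation observations over a 1 km square.
    rng = np.random.default_rng(2)
    x = rng.uniform(0, 1000, 500)                  # easting (m)
    y = rng.uniform(0, 1000, 500)                  # northing (m)
    z = 100 + 0.05 * x + 10 * np.sin(y / 150.0)    # made-up terrain heights (m)

    # User-nominated extent and resolution.
    res = 25.0                                     # cell size (m)
    gx, gy = np.meshgrid(np.arange(0, 1000, res), np.arange(0, 1000, res))

    # Linear (TIN-style) interpolation onto the grid; cells outside the
    # convex hull of the observations come back as NaN.
    dem = griddata((x, y), z, (gx, gy), method="linear")
    print(dem.shape, np.nanmin(dem), np.nanmax(dem))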

1. establish web-services using open-source codes for interpolation of raw elevation data into a raster elevation model for a user nominated extent and resolution.

2. stand up existing ‘best of breed’ derived elevation datasets as web-services, eg as an OGC WCS compliant service, so users can extract subsets as needed (as sketched below). Initially these datasets will be disconnected from their source data and codes, but in the longer term, as the full processing workflow becomes available, they will be pre-computed elevation datasets constantly updated from all the available web-based primary data sources and software codes.
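
A minimal sketch of consuming such a WCS service with the OWSLib client follows; the endpoint URL, coverage name, extent and output format are hypothetical placeholders for whatever an agency actually stands up.

    from owslib.wcs import WebCoverageService

    # Hypothetical WCS endpoint serving a pre-computed national DEM.
    wcs = WebCoverageService("https://example.govt.nz/elevation/wcs", version="1.0.0")
    print(list(wcs.contents))                       # coverages offered by the service

    resp = wcs.getCoverage(
        identifier="nz_dem_8m",                     # hypothetical coverage name
        bbox=(1740000, 5420000, 1760000, 5440000),  # NZTM2000 extent of interest
        crs="EPSG:2193",
        format="GeoTIFF",
        resx=8, resy=8,
    )
    with open("subset.tif", "wb") as f:
        f.write(resp.read())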

Reduction from surface model to bare-earth model

As has been noted earlier, this is a particular issue with processing LiDAR datasets and can account for up to 30% of the total cost of producing a bare-earth elevation model. It is also often the most contentious part of the data delivery contract, and therefore where the most gain can potentially be made and where there is the least precedent for how to approach an optimal solution. In other words, this is likely to be the hardest part to achieve.

1. establish web-services for known surface objects. With LiDAR it is usually thought that surface objects (eg buildings, bridges) can be automatically identified in the raw LiDAR data and then removed. To a certain extent this is true, but if a city council, for instance, already has 3D models of downtown buildings at a dimensional precision that exceeds that of the LiDAR, then it makes sense to use that data source. Similarly, if a city utility already has data about assets in its drainage network – eg pipes and culverts under roads that can’t be directly observed in the LiDAR – that can be very useful input to a drainage enforcement algorithm when creating a surface elevation model for drainage or flood modelling. So data describing all of these known objects should also be made available as web-services (a toy sketch follows).
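
The toy sketch below illustrates the footprint case with synthetic data: a known building footprint, supplied from an external 3D model rather than classified out of the LiDAR, is cut out of a DSM and the ground re-interpolated from the surrounding cells. Real footprints, drainage assets and LiDAR tiles would of course be far messier than this.

    import numpy as np
    from scipy.interpolate import griddata

    # Synthetic DSM: gently sloping ground with a 12 m building on it.
    ny, nx = 100, 100
    gy, gx = np.mgrid[0:ny, 0:nx]
    dsm = 50 + 0.1 * gx                      # ground surface (m)
    dsm[40:60, 30:50] += 12.0                # building roof heights

    # Footprint mask taken from an external source (eg a council 3D city model).
    footprint = np.zeros_like(dsm, dtype=bool)
    footprint[40:60, 30:50] = True

    # Replace DSM cells inside the footprint with ground interpolated
    # from the surrounding bare-earth cells.
    ground = ~footprint
    dem = dsm.copy()
    dem[footprint] = griddata(
        (gx[ground], gy[ground]), dsm[ground],
        (gx[footprint], gy[footprint]), method="linear",
    )
    print(float(dsm[50, 40]), float(dem[50, 40]))   # roof height vs recovered ground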

Contents
NEDF Part 1: The Australian National Elevation Data Framework
NEDF Part 2: Implications for New Zealand
NEDF Part 3: Strawman NZ Elevation Data Framework
NEDF Part 4: Recommendations for a Plan of Action
