Background
The Australian NEDF national workshop follows a process of wide consultation, regional needs-assessment workshops and report preparation, all supporting a case for significant investment in an enduring high-resolution elevation data framework encompassing both the marine and land environments of the Australian continent.
Throughout this document, the term NEDF refers to the Australian NEDF; references to a possible …
The Proposed NEDF
The Australians have consulted widely and produced very creditable draft business plan and user-needs analysis documents and, by their own assessment, a less creditable draft science case. These are being reviewed by a four-person panel of senior experts from the AAS, ANZLIC, CSIRO and the universities, who recognise the shortcomings and will recommend approaches to resolve them. The shortcomings in the science case are considered superficial and easily addressed rather than fundamental. The existing documents are available for download (refs 1-5); revised documents will be circulated when ready.
The key characteristics of a successful NEDF vision are:
- a formal governance structure,
- a national, nested, multi-resolution ‘bare earth’ land and marine elevation dataset,
- nationally consistent specifications relating user need to required elevation precision, plus formalised best practice,
- processes to ensure that needs are assessed and prioritised, and that resources and systems are in place to collect data meeting needs as they evolve over the long term,
- robust, authoritative metadata establishing fitness for purpose,
- a central searchable data catalogue,
- essentially free availability of elevation data,
- nationally accessible, federated, distributed data storage facilities,
- vibrant elevation research and industry communities that:
  - contribute to GDP significantly beyond the level of government investment,
  - provide feedback that advances both needs and solutions.
The Proposed NEDF Dataset Structure
There was a very strong desire that the NEDF should be enduring and forward looking, pushing the existing Australian Spatial Data Infrastructure (SDI) to the next level. However, the solution as discussed is a traditional SDI solution augmented by a national strategy and governance for a suite of nested 'product' datasets that would satisfy the 80/20 needs of users and be made 'freely available' through a web portal. Elevation products would be made available in a horizontal-resolution hierarchy of 9”, 3”, 1”, 1/3”, 1/9” … corresponding nominally to 250m, 90m, 30m, 10m, 3m, 1m …; as a rule of thumb, vertical resolution is typically one third of the horizontal resolution. Discussion focused on the relationships between user needs and elevation data requirements, the diversity of special uses and how they would be addressed, prospects for technical breakthroughs, the possibility of meeting all needs with a single national LiDAR or similar high-resolution survey, and the contrast between expectations and what exists now: a national bare-earth DEM at 250m resolution, augmented by 90m and 30m SRTM DSM (surface) products, with access to the 30m product restricted due to 'counter-terrorism' concerns.
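A quick sketch of the arithmetic behind that hierarchy (mine, purely illustrative; the metre figures used in product names are nominal roundings, and east-west cell size additionally shrinks with the cosine of latitude):

```python
import math

# One arc-second of latitude is roughly 111,320 m / 3600 on the WGS84
# ellipsoid; east-west extent scales down by cos(latitude).
METRES_PER_ARCSEC = 111_320 / 3600  # ~30.9 m

def cell_size_m(arcsec: float, latitude_deg: float = 0.0) -> tuple[float, float]:
    """Approximate (north-south, east-west) ground size of a grid cell."""
    ns = arcsec * METRES_PER_ARCSEC
    ew = ns * math.cos(math.radians(latitude_deg))
    return ns, ew

# The nested hierarchy, with the rule-of-thumb vertical resolution of
# roughly one third of the horizontal resolution.
for arcsec in (9, 3, 1, 1 / 3, 1 / 9):
    ns, _ = cell_size_m(arcsec)
    print(f'{arcsec:7.3f}"  horizontal ~{ns:6.1f} m   vertical ~{ns / 3:5.1f} m')
```

At the equator this gives roughly 278m, 93m, 31m, 10m and 3m cells, which the product names round to 250m, 90m, 30m, 10m and 3m.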
The fundamental issue with any proposal based on product datasets is the effort required to produce a solution (i.e. a dataset) other than one of the core free datasets. Fitness for purpose is never a binary function; it is always a matter of degree, with inherent uncertainty and error. The 80/20 rule is therefore misleading, since it superficially implies that 80% of needs are fully satisfied, whereas it is more likely to mean that, hypothetically, 40% of needs are fully satisfied, 45% are partially satisfied and 15% are completely unsatisfied. Further, the costs of exploring even a subtly different solution are very high, because the knowledge, raw data, processing capability and processing capacity aren’t readily available.
The Proposed NEDF Elevation Surface
Participants recognised that while the majority might be happy with one solution, there is significant need for a variety of surfaces – including bare earth (DEM), surface (DSM) and terrain features (DTM) – and a choice of data formats, including rectangular prisms, sloped tops, point heights … These differences are fundamental and will persist into the future; there is no one data product that satisfies all needs. Conversions between DEM, DSM and DTM are non-trivial and often require very significant processing and/or additional data. For LiDAR it was reported that DSM-to-DEM conversion can represent 30% of the costs. Information such as building footprints and elevations, urban trees and open drainage channels may be most appropriately sourced from city or council infrastructure datasets and used to inform the DSM-to-DEM conversion, rather than being inferred from the raw LiDAR DSM data. Raw elevation data should therefore include height information from many ancillary datasets as well as the raw LiDAR point-cloud heights.
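To make the footprint-informed conversion concrete, here is a deliberately simplistic sketch (mine, not from the workshop, and no substitute for the morphological or progressive-TIN filters used in production LiDAR processing): cells flagged as non-ground by an ancillary layer, such as a council building-footprint dataset rasterised onto the same grid, are removed and the holes filled from neighbouring ground cells.

```python
import numpy as np

def dsm_to_dem(dsm: np.ndarray, non_ground: np.ndarray,
               max_passes: int = 200) -> np.ndarray:
    """Crude DSM -> DEM sketch.

    Cells flagged as non-ground (e.g. building footprints from a council
    infrastructure dataset) are removed, and the holes are filled by
    repeated 4-neighbour averaging of surrounding ground cells.
    """
    dem = dsm.astype(float).copy()
    dem[non_ground] = np.nan
    for _ in range(max_passes):
        holes = np.isnan(dem)
        if not holes.any():
            break
        p = np.pad(dem, 1, constant_values=np.nan)
        neigh = np.stack([p[:-2, 1:-1], p[2:, 1:-1],
                          p[1:-1, :-2], p[1:-1, 2:]])
        counts = (~np.isnan(neigh)).sum(axis=0)
        fill = np.where(counts > 0,
                        np.nansum(neigh, axis=0) / np.maximum(counts, 1),
                        np.nan)
        dem[holes] = fill[holes]
    return dem

# Toy example: a 5x5 ground surface at 10 m with a 19 m "building".
dsm = np.full((5, 5), 10.0)
dsm[1:4, 1:4] = 19.0
footprint = np.zeros((5, 5), dtype=bool)
footprint[1:4, 1:4] = True
print(dsm_to_dem(dsm, footprint))  # interior relaxes back to ~10 m
```

The point is only that ancillary vector data can drive the non-ground mask; real workflows classify the point cloud itself, which is exactly why the conversion carries such a large share of the cost.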
The Proposed NEDF Reference Frame
Much was said about the complications that arise in going from 10m vertical accuracy to sub-metre accuracy, especially from a national perspective. Differences in the specification of zero elevation become critical at these resolutions. These include the ellipsoid shape (GPS reference frame), geoid shape (gravitational reference frame), mean sea level (topographic zero contour), mean high water mark (topographic coastline), mean high water springs (cadastral coastline), lowest astronomical tide (bathymetric zero), and variations between state and national approaches to providing solutions. There is wide variation in the precision to which these reference frames are known, the extent to which they are available in digital form, and even the extent to which the differences can be reconciled with current technology. Some current best-available data is based on historic, essentially local, arbitrary reference frames that cannot be recovered at precisions that would satisfy modern usage.
There was also recognition that existing technologies are least effective in the coastal/surf zone, which limits the ability to reconcile differences between bathymetric and land-based reference systems. Further, climate change will result in a continually changing sea-level model. Collectively, these reference frames will be the subject of significant refinement, from both theoretical and observational perspectives, over the next few decades. Consequently, any dataset in a data-product-centric NEDF, generated at a fixed point in time, will be out of date shortly after its publication, forcing a significant proportion of the user community to use solutions outside the NEDF.
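The best known of these relationships – the separation between the GPS ellipsoid and the geoid – is simple to state even though determining it precisely is not. A minimal sketch (illustrative only; the numbers are hypothetical, and in Australia the undulation would come from a geoid model in the AUSGeoid family):

```python
def orthometric_height(h_ellipsoidal_m: float, geoid_undulation_m: float) -> float:
    """H = h - N: orthometric (gravity-related) height from a GPS height.

    h is the height above the reference ellipsoid (what GPS measures);
    N is the geoid undulation at that location, i.e. the separation
    between ellipsoid and geoid, taken from a geoid model such as
    AUSGeoid. Uncertainty in N flows straight into H, which is why
    geoid refinement matters for sub-metre national elevation data.
    """
    return h_ellipsoidal_m - geoid_undulation_m

# Hypothetical values: an 85.20 m GPS height with a 22.70 m undulation.
print(orthometric_height(85.20, 22.70))  # -> 62.5
```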
Beyond Data to Automated Workflow
A radical realisation started to emerge at the workshop: the issues of continual change and refinement, and of diversity of need, might be resolved by taking a managed source-data and processing-workflow approach, with both components of the solution available for web-portal users to mix and match at will to suit their needs and cost constraints. There wasn’t time at the workshop to thoroughly explore the full implications of such a shift, but 'workflow' issues were discussed by many participants during the afternoon breakout sessions, and all three breakout session chairs mentioned workflow in their five-minute summary reports at the concluding session.
To use the hypothetical example introduced earlier, a managed online workflow solution would allow the partially satisfied and completely unsatisfied users to obtain variants of the solution that more closely satisfy their requirements.
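As a purely illustrative sketch of the mix-and-match idea (none of these names or steps come from the workshop), a portal might expose catalogued sources and composable processing steps rather than fixed products:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Source:
    """A catalogued raw-data holding, e.g. a LiDAR survey tile."""
    name: str
    resolution_m: float

# Each processing step transforms a dataset reference; real steps would
# run on shared HPC facilities rather than return strings.
Step = Callable[[str], str]

def run_workflow(source: Source, steps: list[Step]) -> str:
    dataset = f"{source.name}@{source.resolution_m}m"
    for step in steps:
        dataset = step(dataset)
    return dataset

# A user mixes and matches steps to suit their needs and budget.
to_dem: Step = lambda d: f"dsm_to_dem({d})"
regrid: Step = lambda d: f"regrid_to(1/9\", GDA94)({d})"
print(run_workflow(Source("lidar_tile_042", 1.0), [to_dem, regrid]))
# -> regrid_to(1/9", GDA94)(dsm_to_dem(lidar_tile_042@1.0m))
```

The substance of such a system would of course lie in the catalogue, provenance and compute management, not in this toy composition.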
In Australia the computational and storage infrastructure for such a solution is to a large degree already in place: each state has a high-performance computing (HPC) facility, and the NCRIS (National Collaborative Research Infrastructure Strategy) is designed to coordinate development and delivery of the required software systems. However, representatives from the HPC community weren’t present at the workshop.
NEDF Funding Model Options
The other major theme to emerge was the impact and desirability of a whole-of-government public/private partnership approach, as opposed to a purely government-led solution. Such a partnership could still result in apparently free data use: a government-led approach would be funded from tax, whereas a public/private solution might be funded partly by tax (the government paying for 'early adopter bulk access'), substantially augmented by in-line advertisements (à la Google AdWords). In such a scenario the cost to government might be substantially reduced, likely by in excess of one tenth, though costings had not yet been done by the private sector, since the structure of the partnership would have a very great influence on revenue flows and therefore on investment strategies. It was stressed that the key features of any successful private contribution would be a predictable market unfettered by government intervention (other than as a 'guaranteed early adopter purchaser'), and full industry involvement in the user-needs phase so that everyone understood what the deliverable was. With such provisos, there were considered to be no capacity constraints in the private sector to deliver whatever was required – even radical solutions such as those involving national very-high-resolution products.