Adding topography to topology

Terrain data is playing an increasingly important role in the AEC sector. James Cutler, Managing Director of eMapSite, looks at the wealth of data that’s currently available to engineers, planners and architects for use in GIS (Geographic Information Systems) and CAD applications.

Time was when terrain was something you walked over and if you needed anything more detailed you sent out surveyors to collect selected site information, a process that is at once accurate, time consuming, expensive and inflexible. Today, nothing could be more different – commercial, legislative and regulatory, marketing, design, landscape and compliance pressures have created an environment in which the real world needs to be portrayed as accurately as possible and where height has become a core factor. For instance, terrain characterisation is an important step on the route from raw landscape height data to GIS applications like feature recognition and erosion and disaster damage prediction.

Ordnance Survey’s enhanced data product Land-Form PROFILE Plus enables nationally consistent 3D modelling for activities such as flood risk assessment, pipeline maintenance, and route planning for road and rail. The data offers an enriched digital terrain model with an overall 2m post spacing, and height accuracy within 15 to 25 centimetres for selected high-risk areas such as flood plains and urban areas.

Even the nomenclature has changed – height can mean different things, as we’ll see below. The purpose of this article is to illustrate the key role that terrain data in all its guises is now playing across the AEC sector and to highlight the multiplicity of alternatives available to users.

So, terrain data… what is it?

Well, it’s the representation of the height of the ground, of features on the landscape, of buildings and so on, isn’t it? And immediately we’re into a number of perceptions and variables that need to be understood before effective use can be made of the available data.

Contour lines are perhaps the most established representation of landscape, and most people are aware that the closer together they are, the steeper the gradient of the land. Few, though, could visualise very accurately the lie of the land purely on the basis of a contour map, as traditional shading, hachuring and colouring methods have waned.

Traditional photogrammetric techniques rely on highly trained photogrammetrists to view overlapping aerial photographs through a stereo-plotter (a sophisticated tool that enables the viewer to see a 3D view of the overlap area), to identify a selected elevation (typically above mean sea level) and then to follow that elevation across the overlap area. The exercise is repeated for each desired elevation (say every 5m or 10m) to produce a contour map. This repetitive activity requires consistent national coverage of stereoscopic aerial photography or satellite imagery, as well as a highly trained team of photogrammetrists and data auditors.

The results, as seen on the OS Landranger and Explorer maps, are a marvel to behold and are available in digital form for use in CAD and GIS systems to inform decision making. Ordnance Survey even created a bespoke paper product called LandPlan that “burns” the contours into a backdrop map at 1:10 000 scale for users needing no additional sophistication. While expensive, these techniques did enable Ordnance Survey to produce the longstanding Land-Form PANORAMA and Land-Form PROFILE contour products. At 1:50 000 scale, Land-Form PANORAMA is a “frozen” product, in part owing to external factors but primarily because at this scale even major terra-forming activities, such as motorway embankments and bridges, barely impact the alignment of contours at 10m intervals. Land-Form PROFILE, with its 5m contours, is however a product subject to update as a result of engineering works, transport infrastructure development and the like.
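The contour-following work described above is nowadays routinely automated: given a gridded terrain model, a contour at a chosen elevation can be traced by finding where that elevation crosses the edges of each grid cell and interpolating the crossing point. The sketch below is illustrative only (the function name and grid layout are assumptions, not any OS or vendor API); real tools such as GDAL’s contour utilities do this far more robustly.

```python
# Find where a contour at a given elevation crosses the horizontal
# and vertical edges of a regular height grid, using linear
# interpolation along each edge. Illustrative sketch only.

def contour_crossings(dem, level, spacing=1.0):
    """Return (x, y) points where `level` crosses grid-cell edges.
    `dem` is a list of rows of heights; `spacing` is the grid posting."""
    points = []
    rows, cols = len(dem), len(dem[0])
    for r in range(rows):
        for c in range(cols):
            z0 = dem[r][c]
            # edge to the right-hand neighbour
            if c + 1 < cols:
                z1 = dem[r][c + 1]
                if (z0 - level) * (z1 - level) < 0:
                    t = (level - z0) / (z1 - z0)   # fraction along the edge
                    points.append(((c + t) * spacing, r * spacing))
            # edge to the neighbour in the row below
            if r + 1 < rows:
                z1 = dem[r + 1][c]
                if (z0 - level) * (z1 - level) < 0:
                    t = (level - z0) / (z1 - z0)
                    points.append((c * spacing, (r + t) * spacing))
    return points

dem = [[10, 20],
       [10, 20]]
pts = contour_crossings(dem, 15.0)   # the 15m contour crosses each row midway
```

Joining the crossing points into continuous lines (the “marching squares” step) is what production software adds on top of this.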

In Autodesk Civil 3D, surfaces can be built from a variety of 3D source data.

However, established mathematical techniques for surface generation, such as those used in terrain visualisation, were an early beneficiary of the escalation in computer power. There has since been an explosion of terrain data production in the form of regular and irregular grids, often called digital terrain models (DTM) or digital elevation models (DEM) – data structures far more processing-friendly than contour lines. In a DEM the height of a subset of all the points on a given terrain is stored explicitly, in a way that makes it possible to interpolate a height for every point on the terrain. The two DEM forms used most are based either on a regular grid or on a triangulation of the points for which an explicit height is known (a triangulated irregular network, or TIN).
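For a grid DEM, that interpolation step is typically bilinear: the height at an arbitrary point is blended from the four surrounding grid posts. A minimal sketch, assuming a regular grid with its origin at (0, 0) and a uniform post spacing (the function name is hypothetical):

```python
# Bilinear interpolation of the height at (x, y) from a regular grid DEM.
# Assumes grid origin (0, 0) and uniform post spacing; illustrative only.

def dem_height(dem, x, y, spacing=1.0):
    """Interpolate a height from the four grid posts surrounding (x, y)."""
    cx, cy = x / spacing, y / spacing
    c0, r0 = int(cx), int(cy)          # lower-left post of the enclosing cell
    fx, fy = cx - c0, cy - r0          # fractional position inside the cell
    z00 = dem[r0][c0]
    z10 = dem[r0][c0 + 1]
    z01 = dem[r0 + 1][c0]
    z11 = dem[r0 + 1][c0 + 1]
    low = z00 * (1 - fx) + z10 * fx    # blend along x on each row of posts,
    high = z01 * (1 - fx) + z11 * fx   # then along y between the two rows
    return low * (1 - fy) + high * fy

dem = [[0.0, 10.0],
       [10.0, 20.0]]
h = dem_height(dem, 0.5, 0.5)   # centre of the cell -> 10.0
```

A TIN interpolates differently – linearly across each triangle face – which is why grid and TIN models derived from the same source can disagree slightly between posts.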

In recognition of the processing and modelling requirements of users, Ordnance Survey have long made available grid versions of Land-Form PANORAMA (with a 50m “posting”, i.e. grid interval) and Land-Form PROFILE (with a 10m posting). However, it is essential to note that, owing to the mathematical interpolation at the heart of their derivation, these products are inherently less accurate than the source contour products.

Most users are aware that, with vertical accuracies of the order of +/-1.8-3m and analogous user scales of 1:50 000 and 1:10 000, these products are not suitable for detailed site modelling but are aimed rather at mid-market landscape visualisation applications. As such, demand has been bolstered by the renewable energy sector, particularly for wind energy modelling and wind-farm visualisation. With stakes high on both sides, visualisation and its associated disciplines of Zones of Visual Impact (ZVI) and inter-visibility analysis have played a major role in adjusting wind-farm locations.
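At its core, an inter-visibility test is a line-of-sight calculation over the terrain model: sample the ground along the line between observer and target and check whether it rises above the sight line anywhere. A minimal sketch under assumed conventions (grid indexed as dem[row][col], nearest-post sampling); production ZVI tools sweep this test over every cell in the study area and handle sub-cell interpolation and earth curvature properly.

```python
# Line-of-sight test between two posts of a grid DEM: the target is
# visible if no intermediate sample rises above the straight sight line.
# Illustrative sketch only; names and conventions are assumptions.

def line_of_sight(dem, obs, tgt, obs_height=2.0, steps=100):
    """True if `tgt` (col, row) is visible from `obs` (col, row),
    given an observer eye height above the ground. Terrain between
    posts is taken from the nearest post for simplicity."""
    (c0, r0), (c1, r1) = obs, tgt
    z0 = dem[r0][c0] + obs_height
    z1 = dem[r1][c1]
    for i in range(1, steps):
        t = i / steps
        c = round(c0 + t * (c1 - c0))   # nearest-post sampling
        r = round(r0 + t * (r1 - r0))
        sight = z0 + t * (z1 - z0)      # height of the sight line here
        if dem[r][c] > sight:
            return False                # terrain blocks the view
    return True

dem = [[10, 10, 50, 10, 10]]                    # a ridge between the two points
visible = line_of_sight(dem, (0, 0), (4, 0))    # blocked by the 50m post
```

Running the same test from one candidate turbine location to every grid cell yields the Zone of Visual Impact for that location.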

For an even coarser view of the landscape, users should look to the SRTM-90 and EDX-250 terrain models (the numbers indicate the grid intervals in metres). The former is of interest as it is a free DTM captured by the Space Shuttle and is accompanied by paid-for products down to a 30m posting.

But that’s all really pretty coarse? Send for the surveyors…

Maybe so (see below), but the last decade, and the new millennium in particular, has witnessed the advent of a new generation of digital terrain data that to some extent fills in the gap between site survey and nationally consistent data sets, and does so in a way and at a price point that can significantly reduce direct costs and shorten development approval chains.

It is not that the underlying techniques have not been well understood and available for a long time; rather, the technologies available to bring the outputs to market have evolved to a production level commensurate with the demands of the user. Again, photogrammetry has played a key role, but laser, RADAR, satellite imagery and computing science have all contributed.

Terrain models and elevation models

Product / Method                             | Resolution | Vertical accuracy
NextMap Britain Digital Surface Model (DSM)  | 5m grid    | +/-0.5m (SE England), +/-1m (elsewhere)
LIDAR (laser imaging)                        | 1-3m grid  | +/-0.15-0.6m (scattered coverage)
Building Heights (photogrammetry)            | n/r        | +/-0.15-1m
OS Profile Plus                              | various    | various (depending on source)
NextMap Britain Digital Terrain Model (DTM5) | 5m grid    | +/-0.6m (SE England), +/-1.2m (elsewhere)

As indicated previously, stereo-pairs of aerial photography enable very accurate height measurements to be taken, with precision increasing with photo scale. Thus, with the latest generation of airborne digital cameras capturing urban areas at anything up to 6.25cm resolution, and with a new generation of software tools, it is now possible to automate the extraction of heights of individual features. Building height data sets are the result, and these are beginning to change the way in which noise modelling, urban regeneration, pollution risk, flood analysis and building design (including wind funnelling), among many other applications, are undertaken. In turn, these data sets are turning up in the new generation of computer and arcade games, movies, homeland security and virtual reality (VR) environments.

Naturally, the complexity of the tools used and the sheer volume of data involved mean that the most detailed data is only available for urban areas and that there is no nationally consistent data set with sub-metre vertical accuracy. They also mean a commensurate increase in cost, although this pales beside the cost of putting survey teams out to capture the same data. The various data sets have been or are being productised and come in two broad categories – terrain models and elevation or surface models (see boxout, left).

Ah, the surveyors…

In the final analysis there will always be a need for land surveyors and their ilk in site survey and related activities, but, very much in line with RICS Geomatics Dept thinking, the skill sets required by the modern surveyor range of necessity far wider than those of their methodical forebears. The advent of GPS in the 1980s and, more recently, Real Time Kinematic (RTK) GPS provides the platform for extensively automated survey data capture tools, while the rise of mobile computing, Wi-Fi and associated technologies allows that captured data to be validated against the topographic base and, once approved, integrated into current operations across a host of consultants and contractors working on the same project. Project management, compliance, capture methodologies, validation procedures and much more all fall to the surveyor. Indeed, many are now playing an invaluable part in facilitating the development of new capability within the OS MasterMap structure, the pre-build layer, of which more in another article.

www.emapsite.com
