Seeing the Landscape in Layers: A Walk Through Lidar Remote Sensing

There is a quiet precision in Lidar work. Every return pulse is a story, every elevation layer a whisper of the terrain. What looks like color and code is a dialogue between light and surface. This lab took that conversation apart, from raw LAS files to a three-dimensional rendering of life above and below the canopy.


Importing Lidar Data and Seeing Beyond the Noise

The first task was to import LAS-format Lidar data into ENVI with proper spatial settings. Each file represented a fragment of the world, captured as a point cloud.

Following the lab instructions, the X and Y pixel size was set to 2 meters, matching the dataset’s spatial resolution. Each DEM and DSM tile, ten in total, had to be loaded separately before being stitched together.
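ENVI performs the gridding internally, but the idea can be sketched with NumPy alone: bin each return into a 2 m cell and keep one statistic per cell. Everything below (the synthetic returns, the minimum-z rule as a crude ground estimate) is an illustrative assumption, not the lab's actual data or ENVI's exact algorithm.

```python
import numpy as np

# Synthetic stand-in for Lidar returns over a 100 m x 100 m area.
rng = np.random.default_rng(0)
n = 5000
x = rng.uniform(0, 100, n)                 # easting (m)
y = rng.uniform(0, 100, n)                 # northing (m)
z = 50 + 0.1 * x + rng.normal(0, 0.5, n)   # elevation (m)

pixel = 2.0                                # the lab's 2 m X/Y pixel size
cols = rows = int(100 / pixel)

# Map each return to its 2 m cell.
ci = np.minimum((x / pixel).astype(int), cols - 1)
ri = np.minimum((y / pixel).astype(int), rows - 1)

# Keep the lowest return per cell as a rough ground surface.
dem = np.full((rows, cols), np.inf)
np.minimum.at(dem, (ri, ci), z)
dem[np.isinf(dem)] = np.nan                # cells that received no returns
```

The same binning with the highest return per cell would approximate a DSM instead of a DEM.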

This was the first reminder that Lidar is not a single image. It is a mosaic of returns waiting to be arranged. In ENVI, these returns became a readable surface, a translation of invisible light into visible structure.

The rainbow color table was used for the Digital Elevation Model (DEM) because it provides a clear visual gradient that distinguishes subtle elevation differences across the terrain. Blue and purple tones effectively represent low-lying areas, while green, yellow, and red tones highlight mid to high elevations.

This continuous spectrum helps reveal topographic patterns such as valleys, slopes, and ridges at a glance. The rainbow scheme is especially effective in educational and interpretive settings where the goal is to communicate elevation variation quickly, even to viewers without technical backgrounds.

Although more specialized color ramps like “terrain” or “hypsometric” scales are sometimes used in scientific mapping, the rainbow palette remains a practical choice for general visualization because it maximizes visual contrast and makes topographic structure immediately recognizable.
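As a sketch of what a rainbow color table does outside ENVI, Matplotlib's "rainbow" colormap (an assumption; ENVI's own rainbow table may differ in its exact stops) maps normalized elevations onto the same purple-to-red gradient:

```python
import numpy as np
from matplotlib import colormaps

# Hypothetical DEM: elevations rising from 50 m to 350 m.
dem = np.linspace(50, 350, 100).reshape(10, 10)

# Normalize to [0, 1], then look up colors: low -> blue/purple, high -> red.
norm = (dem - dem.min()) / (dem.max() - dem.min())
rgba = colormaps["rainbow"](norm)   # shape (10, 10, 4) RGBA array
```

The normalization step is the key idea: the color table sees relative elevation, so the same palette adapts to any terrain's range.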


Mosaicking and Visualizing the Terrain

Once the data was loaded, the next step was to mosaic the DEM and DSM tiles. The Digital Elevation Model (DEM) revealed the earth’s raw anatomy: valleys, ridges, and plains stripped of vegetation and buildings. Colored with the rainbow palette, the gradients of blue to red displayed the quiet rise and fall of the terrain.
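ENVI's mosaicking dialog hides the mechanics, but conceptually each georeferenced tile is pasted into a larger array at its row/column offset. A minimal sketch, assuming ten hypothetical tiles in a 2 x 5 layout:

```python
import numpy as np

tile_size = 100  # pixels per tile side (hypothetical)

# Ten hypothetical DEM tiles, keyed by their (row, col) position in the grid.
tiles = {(r, c): np.full((tile_size, tile_size), float(r * 5 + c))
         for r in range(2) for c in range(5)}

# Paste each tile into the full mosaic at its offset.
mosaic = np.full((2 * tile_size, 5 * tile_size), np.nan)
for (r, c), data in tiles.items():
    mosaic[r * tile_size:(r + 1) * tile_size,
           c * tile_size:(c + 1) * tile_size] = data
```

In practice the offsets come from each tile's map coordinates, which is why the 2 m pixel size had to be set consistently at import.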

The Digital Surface Model (DSM) told a more crowded story. The Waves color palette highlighted the clutter of life above the ground: buildings, trees, and other surface features.

Finally, the orthophotos — two high-resolution aerial images at 25 cm resolution — anchored these abstractions to reality. Their red, green, and blue tones matched the patterns seen in the elevation models. The low blues of the DEM aligned with rivers, while the red heights traced ridgelines and tree clusters.

Together, the DEM, DSM, and orthophotos provided a complete view of the land. The DEM grounded the analysis, the DSM added surface complexity, and the orthophotos confirmed the patterns seen in elevation.


Deriving the Canopy Height Model by Subtracting Earth from Sky

The Canopy Height Model (CHM) comes from a simple concept: everything above ground exists in the difference between DSM and DEM.

Using ENVI’s Band Math, the formula float(b1) - float(b2) extracted this vertical space. The result was a model representing the height of vegetation and structures. The process was not without errors. Extreme negative values such as –838 appeared, indicating missing data or projection mismatches.
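Outside ENVI, that Band Math expression is a single NumPy subtraction. The small arrays here are placeholders, with b1 standing for the DSM and b2 for the DEM, as in the lab's formula:

```python
import numpy as np

# Hypothetical DSM and DEM values in metres.
dem = np.array([[100.0, 101.0], [102.0, 103.0]])   # b2: bare earth
dsm = np.array([[112.0, 101.0], [102.0, 120.0]])   # b1: top of surface

# ENVI's float(b1) - float(b2): everything above the ground.
chm = dsm.astype(np.float32) - dem.astype(np.float32)
```

Where DSM and DEM agree, the CHM is zero (bare ground); where they differ, the difference is the height of whatever stands there.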

To fix this, the Replace Bad Data function was applied, replacing values between –838 and 0. What remained were positive, meaningful heights that represented actual canopy and surface elevation.
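The same cleanup can be expressed as a masked replacement. Whether ENVI substitutes zero or a no-data value depends on the dialog settings; the sketch below assumes zero, so cleaned cells read as bare ground:

```python
import numpy as np

# Hypothetical CHM with the kind of artifacts the lab encountered.
chm = np.array([[-838.0, 2.5],
                [-0.4, 18.0]])

# Emulate Replace Bad Data: map the invalid range [-838, 0) to 0 m.
cleaned = np.where((chm >= -838.0) & (chm < 0.0), 0.0, chm)
```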

This step revealed the separation between the natural and the built world, expressed in meters rather than lines on a map.


Lidar Mapping and Turning Numbers into Meaning

Once cleaned, the CHM was color-coded through Density Slicing to visualize variations in canopy height.

Height Range (m)    Color      Description
0.000–0.300         Black      Bare ground
0.310–3.000         Green      Grass or low vegetation
3.010–5.000         Blue       Small shrubs
5.010–10.000        Yellow     Medium vegetation
10.010–15.000       Magenta    Growing trees
15.010–20.000       Red        Mature canopy or structures
20.010–23.000       Cyan       Tall canopy
23.010–30.000       Coral      Highest vegetation or buildings
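Density slicing is, in effect, a lookup against class breaks. Using the table's upper bounds as bins (ENVI's exact slice edges may differ at the thousandth of a metre), `np.digitize` reproduces the classification:

```python
import numpy as np

# Upper bound of each height class from the density-slice table (m).
breaks = [0.3, 3.0, 5.0, 10.0, 15.0, 20.0, 23.0, 30.0]
colors = ["black", "green", "blue", "yellow",
          "magenta", "red", "cyan", "coral"]

# A few hypothetical CHM heights to classify.
chm = np.array([0.1, 2.0, 7.5, 22.0, 28.0])

# right=True means each class includes its upper bound.
classes = np.digitize(chm, breaks, right=True)
labels = [colors[i] for i in classes]
```

Applied to the whole raster, the class index per pixel is what drives the color map.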

The resulting color map looked alive. Green patches spread like meadows, while coral tones appeared where tall trees or rooftops reached higher.

This transformation turned statistics into geography and elevation data into ecological context. Every color represented a measurable reality.


3D Visualization and the Landscape Taking Shape

With ENVI’s SurfaceView tool and a vertical exaggeration factor of 5.0, the CHM became a living terrain. Peaks appeared where canopy thrived, while troughs marked open ground. The file 1827_CHM_2m.tif turned into a 3D landscape of height and depth, each vertex showing how light returned from earth to sensor.

Interpretation became instinctive. The white peaks revealed clusters of mature trees or built structures. Darker regions showed bare land or low vegetation. The 3D view bridged perception and measurement, allowing a more intuitive understanding of surface height and distribution.


Lessons from the Lab

Working through Lidar data feels like peeling the layers of a digital earth. It is both science and art.

From DEM to DSM to CHM, every step revealed a new level of understanding. Beyond learning ENVI workflows, this lab demonstrated the relationships between data, projection, and meaning. It showed how errors distort interpretation and how visualizing height reveals structure.

At its core, Lidar analysis is about perspective. Behind every pixel lies an echo of light that once touched the surface of the world.
