# Pine Creek Example 3

This example follows the previous Pine Creek example and illustrates the use of unmixing and clustering together. The principle is simple: we reuse the result of the previous analysis and add a clustering analysis on top of it. Here are the steps:

• extract the roads and buildings by unmixing (nbex_pine_creek2)
• use KMeans to cluster Pine Creek; at this step we generate 5 clusters
• stack the two classification maps to obtain 6 classes

This example adds some control over noise. First, the bands affected by atmospheric scattering are removed. This has no influence on the clustering, but it does affect the MNF step. Next, MNF is applied with the help of a Savitzky-Golay filter.
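The denoising step below relies on a Savitzky-Golay filter applied band by band. A minimal sketch of the idea using SciPy's `savgol_filter` (the sine signal and noise level here are illustrative, not taken from the Pine Creek data; the window length 15 and polynomial order 2 match the parameters passed to `denoise_bands` in this example):

```python
import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(0)
x = np.linspace(0, 4 * np.pi, 200)
clean = np.sin(x)
noisy = clean + rng.normal(0, 0.3, x.size)

# Fit a degree-2 polynomial over a sliding window of 15 samples
smooth = savgol_filter(noisy, window_length=15, polyorder=2)

# The filtered signal is closer to the clean curve than the noisy one
err_noisy = np.mean((noisy - clean) ** 2)
err_smooth = np.mean((smooth - clean) ** 2)
```

The same scheme applied to a single MNF component removes high-frequency noise while preserving the slowly varying structure of the band.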

When classification maps are stacked, majority voting is generally used. In this example, instead, the roads and buildings map always wins.
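The "roads and buildings always win" rule is implemented by the `Stack` class defined in the code below. A toy sketch of the same formula on made-up arrays, showing how a boolean man-made mask overrides the 5 cluster labels and produces a sixth class:

```python
import numpy as np

# 5 KMeans cluster labels (0..4) on a toy 3x3 image
clusters = np.array([[0, 1, 2],
                     [3, 4, 0],
                     [1, 2, 3]])
# Boolean mask of man-made pixels (roads and buildings)
man_made = np.array([[False, True,  False],
                     [False, False, True],
                     [True,  False, False]], dtype=bool)

# Where the mask is False the cluster label is kept;
# where it is True the pixel is forced into class 5
cmap = (((clusters + 1) * np.logical_not(man_made)) + (man_made * 6)) - 1
print(cmap)
# [[0 5 2]
#  [3 4 5]
#  [5 2 3]]
```

The `+ 1` / `- 1` shift keeps the class map zero-based while letting the multiplication by the inverted mask zero out the man-made pixels before class 5 is written in.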

In [1]:
# Run on Python 2.7 and 3.x

from __future__ import print_function

%matplotlib inline

import os
import numpy as np
import os.path as osp
import pysptools.util as util
import pysptools.noise as ns
import pysptools.eea as eea
import pysptools.classification as cls
import pysptools.abundance_maps as amp

def NFINDR(data, n_, info=None):
    ee = eea.NFINDR()
    U = ee.extract(data, n_, maxit=5, normalize=False, ATGP_init=True)
    return U

def FCLS(data, U, suffix=''):
    am = amp.FCLS()
    amaps = am.map(data, U, normalize=False)
    return amaps

def MNF(data, n_components):
    mnf = ns.MNF()
    mnf.apply(data)
    # get the first n_components
    return mnf.get_components(n_components)

def kmeans(data, n, suffix):
    km = cls.KMeans()
    km.predict(data, n)
    km.display(colorMap='Paired', suffix=suffix)

def MNF_reduce_component_2_noise_and_invert(data):
    # Reduce the noise of the second component and
    # return the inverse transform
    mnf = ns.MNF()
    tdata = mnf.apply(data)
    dn = ns.SavitzkyGolay()
    tdata[:,:,1:2] = dn.denoise_bands(tdata[:,:,1:2], 15, 2)
    # inverse_transform removes the PCA rotation;
    # we obtain a whitened cube with
    # a noise reduction for the second component
    return mnf.inverse_transform(tdata)

class Stack(object):

    def __init__(self, clusters, man_made, n):
        # self.cmap starts at zero
        self.cmap = (((clusters + 1) * np.logical_not(man_made)) + (man_made * 6)) - 1
        self.n_clusters = n

    def display(self, colorMap='Paired', suffix=''):
        out = cls.Output('Stack')
        out.display(self.cmap, self.n_clusters, colorMap=colorMap, suffix=suffix)

def display(image):
    import matplotlib.pyplot as plt
    img = plt.imshow(image)
    img.set_cmap('gray')
    plt.show()
    plt.clf()

def remove_bands(M):
    """
    Remove the bands affected by atmospheric
    scattering:
    [0..4]
    [102..110]
    [148..169]
    [211..end]
    """
    p1 = list(range(5,102))
    p2 = list(range(111,148))
    p3 = list(range(170,211))
    Mp = M[:,:,p1+p2+p3]
    return Mp
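As a quick sanity check, the kept ranges contain 97 + 37 + 41 = 175 bands. A standalone sketch (the 220-band cube size is an assumption based on the 92AV3C / Indian Pines scene; the function body is copied from the cell above):

```python
import numpy as np

def remove_bands(M):
    # Keep only the bands outside the atmospheric scattering ranges
    p1 = list(range(5, 102))
    p2 = list(range(111, 148))
    p3 = list(range(170, 211))
    return M[:, :, p1 + p2 + p3]

# Tiny dummy cube with the assumed 220 bands of the 92AV3C scene
M = np.zeros((2, 2, 220))
print(remove_bands(M).shape)
# (2, 2, 175)
```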
In [2]:
data_path = os.environ['PYSPTOOLS_DATA']

sample = '92AV3C.hdr'

data_file = osp.join(data_path, sample)
data, header = util.load_ENVI_file(data_file)

# Remove the bands with atmospheric scattering
data_clean = remove_bands(data)

n_clusters = 5

Note that running KMeans on the cleaned data or on the original data gives the same result. The difference shows up at the NFINDR and MNF steps.

First, we run KMeans on the cleaned data. This is the reference map.

In [3]:
kmeans(data_clean, n_clusters, 'cleaned data')

Next, we take the code of Pine Creek example 2 and run it again. We obtain a class map for the roads and buildings.

In [4]:

U = NFINDR(data_clean, 12)
amaps = FCLS(data_clean, U)
man_made = (amaps[:,:,2] > 0.02)+(amaps[:,:,9] > 0.1)

Next, we apply MNF in two different flavors. The first is a straight application of MNF. For the second, we remove some noise from the second component. This has a smoothing effect.

In [5]:
n_components = 40

tdata = MNF(data_clean, n_components)
km = cls.KMeans()
km.predict(tdata, n_clusters)
km.display(colorMap='Paired', suffix='MNF')

idata = MNF_reduce_component_2_noise_and_invert(data_clean)
km = cls.KMeans()
clusters = km.predict(idata, n_clusters)
km.display(colorMap='Paired', suffix='MNF with component 2 noise reduction')

Finally, we stack the clustering result with the man-made result, giving six classes. In brown, the roads and buildings are superimposed on the cluster classes.

In [6]: