Understanding How GPS, Remote Sensing and GIS Work

Global Positioning System
Accuracy of GPS?
There are four basic levels of accuracy - or types of solutions - you can obtain with your real-time GPS mining system:


Autonomous                                Accuracy 15 - 100 meters
Differential GPS (DGPS)                   Accuracy 0.5 - 5 meters
Real-Time Kinematic Float (RTK Float)     Accuracy 20 cm - 1 meter
Real-Time Kinematic Fixed (RTK Fixed)     Accuracy 1 - 5 cm


GPS satellites broadcast on three different frequencies, and each frequency (or carrier wave) carries some information or codes on it. You can think of it as three different radio stations broadcasting several different programs. The table below lists the signals and their contents:


              L1 Carrier            L2 Carrier            L3 Carrier
Wavelength    19 cm                 24 cm                 Data not available
Frequency     1575.42 MHz           1227.6 MHz
Code          C/A Code              P Code
Message       Navigation Message    Navigation Message

P Code: Reserved for direct use only by the military
C/A Code: Used for rougher positioning
For single-frequency use, only the L1 carrier is used
For dual-frequency use, the L1 and L2 carriers are used
The navigation message (usually referred to as the ephemeris) tells us where the satellites are located, in a special coordinate system called WGS-84. If you know where the satellites are at any given time, then you can compute your location here on earth.
Some Interesting Links :

Measuring GPS Accuracy
Introduction to the Global Positioning System for GIS and TRAVERSE
GPS Accuracy
How is the Accuracy of a GPS Receiver Described? - By Chuck Gilbert, Road Measurement Data Acquisition System
Global Positioning Systems - Accuracy
GPS Accuracy by mapfacts.com
How Accurate is GPS?
How Accurate Is GPS by Infield Solutions Pty Limited


Different types of answers given by a GPS.

Autonomous Positions
Uses: C/A code only
Requires: Only one receiver
Data from at least four satellites
Provides: An accuracy range of about 15 - 100 meters

This solution is designed for people who just need an approximate location on the earth, such as a boat at sea or a hiker in the mountains.

Real-Time Differential GPS (DGPS) Positions
Uses: C/A code only
Requires: Two receivers
A radio link between the two receivers

Reference receiver at a known location broadcasts RTCM (Radio Technical Commission for Maritime Services) corrections.
Rover receiver applies corrections for improved GPS positions
Data from at least four satellites - the same four at both the reference and rover (common satellites)
Provides: An accuracy range of about 0.5 - 5 meters, depending upon the quality of receiver and antenna used.

This solution gives much better results because we have a known position at the reference receiver. However, it requires a radio link between the reference receiver and the roving (moving) receiver.

Real Time Kinematic (RTK) Float Positions
Uses: C/A code and carrier waves.
Requires: Two receivers


Reference receiver at a known location tracks satellites and then broadcasts this satellite data over a radio link in a format called CMR. (CMR is a Trimble-defined format.)
Rover receiver receives data from both the satellites and the reference station.
A radio link between the two receivers.
Data from at least four common satellites.

Provides: An accuracy range of about 20 cm to 1 meter.
This solution uses more of the satellite signal than the autonomous or DGPS solution. The CMR data is carrier phase data. The float solution is actually an intermediate step towards the most precise answer, which we'll discuss next.

Real Time Kinematic (RTK) Fixed Solutions
Uses: C/A code and carrier waves.

Requires: Two receivers
Reference receiver at a known location tracks satellites and then broadcasts CMR data over a radio link.
Rover receiver receives data from both the satellites and the reference station.
A radio link between the two receivers.

Initialization, which is achieved most easily with dual-frequency receivers
Data from at least five common satellites to initialize on-the-fly (in motion)
Tracking of at least four common satellites after initializing

Provides: An accuracy range of about 1 - 5 cm.

Notice that with each increasing level of precision, there are more requirements. The most important unique requirement for the RTK fixed solution is something called an initialization. It is not feasible here to explain what happens during an initialization, but it is relevant to mention that initialization is necessary to work at centimeter-level accuracy. Dual-frequency receivers can perform this process automatically.

If the receiver loses the initialization - which can happen if it fails to track enough satellites - then your working accuracy will drop to the float solution's status temporarily. Remember, however, that both of these solutions require a radio link to your reference receiver. If, for any reason, you lose your radio link, you will drop back to the autonomous level - the least precise - until the radio link is regained.
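The fallback behavior described above can be sketched as a small decision rule. This is only an illustration of the logic in the text, not any particular receiver's firmware; the state names and function are invented for the example:

```python
# Hypothetical sketch of how a rover's reported solution type degrades
# when conditions are lost, mirroring the fallback rules in the text.

def solution_type(radio_link_ok, initialized, common_satellites):
    """Return the best solution a rover can report given its conditions."""
    if not radio_link_ok:
        return "autonomous"      # no corrections at all: least precise
    if common_satellites < 4:
        return "autonomous"      # too few common satellites for DGPS/RTK
    if initialized:
        return "RTK fixed"       # centimeter level
    return "RTK float"           # decimeter level until re-initialized

print(solution_type(True, True, 6))    # RTK fixed
print(solution_type(True, False, 6))   # RTK float: initialization lost
print(solution_type(False, True, 6))   # autonomous: radio link lost
```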

Factors that affect GPS
There are a number of potential error sources that affect either the GPS signal directly or your ability to produce optimal results:
Number of satellites - minimum number required:
You must track at least four common satellites - the same four satellites - at both the reference receiver and rover for either DGPS or RTK solutions. Also, to achieve centimeter-level accuracy, remember you must have a fifth satellite for on-the-fly RTK initialization. This extra satellite adds a check on the internal calculation. Any additional satellites beyond five provide even more checks, which is always useful.


Multipath - reflection of GPS signals near the antenna:
Multipath is simply reflection of signals, similar to the phenomenon of ghosting on a television screen. GPS signals may be reflected by surfaces near the antenna, causing error in the travel time and therefore error in the GPS positions.


Ionosphere - change in the travel time of the signal:
Before GPS signals reach your antenna on the earth, they pass through a zone of charged particles called the ionosphere, which changes the speed of the signal. If your reference and rover receivers are relatively close together, the effect of ionosphere tends to be minimal. And if you are working with the lower range of GPS precisions, the ionosphere is not a major consideration. However if your rover is working too far from the reference station, you may experience problems, particularly with initializing your RTK fixed solution.


Troposphere - change in the travel time of the signal:
The troposphere is essentially the weather zone of our atmosphere, and droplets of water vapour in it can affect the speed of the signals. The vertical component of your GPS answer (your elevation) is particularly sensitive to the troposphere.


Satellite Geometry - general distribution of the satellites:
Satellite geometry - or the distribution of satellites in the sky - affects the computation of your position. This is often referred to as Position Dilution of Precision (PDOP).

PDOP is expressed as a number, where lower numbers are preferable to higher numbers. The best results are obtained when PDOP is less than about 7.

PDOP is determined by your geographic location, the time of day you are working, and any site obstructions that might block satellites. You can use planning software to help you determine when you'll have the most satellites in a particular area.

When satellites are spread out, PDOP is Low (good).

When satellites are closer together, PDOP is High (weak).
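As a rough sketch of how satellite spread turns into a PDOP number: each satellite contributes a unit line-of-sight vector (plus a column of ones for the receiver clock), and PDOP comes from inverting that geometry matrix. The satellite directions below are invented for illustration and the function name is our own:

```python
# Sketch of PDOP from satellite geometry. Each row of G is the unit
# line-of-sight vector to one satellite plus a 1 for the clock term;
# PDOP is the root of the position part of the covariance (G^T G)^-1.
import numpy as np

def los(azimuth_deg, elevation_deg):
    """Unit vector toward a satellite at the given azimuth and elevation."""
    az, el = np.radians(azimuth_deg), np.radians(elevation_deg)
    return (np.cos(el) * np.sin(az), np.cos(el) * np.cos(az), np.sin(el))

def pdop(unit_vectors):
    G = np.array([[ux, uy, uz, 1.0] for ux, uy, uz in unit_vectors])
    Q = np.linalg.inv(G.T @ G)
    return float(np.sqrt(Q[0, 0] + Q[1, 1] + Q[2, 2]))

# Well-spread: one satellite overhead, three spaced evenly at 30 degrees up
spread = [los(0, 90), los(0, 30), los(120, 30), los(240, 30)]
# Bunched: all four crowded near the zenith in one patch of sky
bunched = [los(0, 80), los(90, 75), los(180, 85), los(270, 78)]

print(pdop(spread) < pdop(bunched))  # True: spread-out satellites = lower PDOP
```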


Satellite Health - Availability of Signal:
While the satellite system is robust and dependable, it is possible for the satellites to occasionally be unhealthy. A satellite broadcasts its health status, based on information from the U.S. Department of Defense. Your receivers have safeguards to protect against using data from unhealthy satellites.


Signal Strength - Quality of Signal:
The strength of the satellite signal depends on obstructions and the elevation of the satellites above the horizon. To the extent possible, obstructions between your GPS antenna and the sky should be avoided. Also watch out for satellites that are close to the horizon, because their signals are weaker.


Distance from the Reference Receiver:
The effective range of a rover from a reference station depends primarily on the type of accuracy you are trying to achieve. For the highest real-time accuracy (RTK fixed), rovers should be within about 10 - 15 km (about 6 - 9 miles) of the reference station. As the range exceeds this recommended limit, you may fail to initialize and be restricted to RTK float solutions (decimeter accuracy).


Radio Frequency (RF) Interference:
RF interference may sometimes be a problem both for your GPS reception and your radio system. Some sources of RF interference include:
Radio towers
Transmitters
Satellite dishes
Generators
One should be particularly careful of sources that transmit either near the GPS frequencies (1227 and 1575 MHz) or near harmonics (multiples) of these frequencies. One should also be aware of the RF generated by one's own machines.


Loss of Radio Transmission from Base:
If, for any reason, there is an interruption in the radio link between a reference receiver and a rover, then your rover is left with an autonomous position. It is very important to set up a network of radios and repeaters, which can provide the uninterrupted radio link needed for the best GPS results.

GPS: Basic Stuff
http://www.innovativegis.com/basis/pfprimer/Topic7/TOPIC7.html#GPS:_Basic_Stuff

GIS technology allows you to view maps in the blink of an eye, visualize the spatial patterns in data sets and even model the complex interrelationships among mapped variables. But its abstract renderings (digital maps) require a real-world expression to make GIS a practical tool. For a long time farmers and other "field folk" have been breathing dust and swatting mosquitoes, while all the time lusting for a simple way to figure out where in the world they are and where they might be going. Celestial navigation, used by early mariners as they gazed at the heavens, eventually gave way to the surveying and mapping sciences. But these solutions still seem beyond the grasp of the average bushwhacker. What was needed was a simple field unit that puts the power of GIS on the kitchen table, in a vehicle, or directly in our hands while standing in a field.

That's where the global positioning system (GPS) comes in. It allows us to link GIS maps and their related data sets to real-world positions and movements. The GPS is based on a constellation of 21 satellites, each of which circles the globe every 12 hours. The system can be thought of as a set of "man-made stars" serving as an electronic equivalent to celestial navigation. So, how does it work? And will it work for you?

Figure 7.5 shows the important considerations in GPS technology. It uses a space-age update to the same principle of triangulation that you learned in high school geometry. First the space-age stuff. One of the satellites sends a signal toward earth stating the exact time. But when a GPS receiver on the ground checks the time, it's a little off. Multiplying the time lag by the speed of light (at which the radio waves travel) tells you how far away the satellite is. Knowing the positions of and distances to a set of satellites allows calculation of the position of the GPS receiver.

Although the process involves complicated electronics, it uses the same calculations you used in geometry class involving that device with a pencil on one arm and a life-threatening sharp point on the other. Recall that you would stick the point at a known location (satellite position) on a piece of paper, then extend the arms for a given distance (satellite-to-receiver time lag * speed of light) and make a small arc. Repeat for a second point/distance, and where the arcs cross determines where you are in two-dimensional space.

In three-dimensional space, spheres of a calculated radius are mathematically "drawn" about a set of satellites whose precise positions are known at any instant in time through orbit mathematics. The intersection of the spheres determines the position of the GPS receiver on the earth.

In trigonometric theory, only three channels (satellites) need to be monitored, but in practice four or more are needed for improved accuracy and to cancel receiver clock errors. The world of electronic wizardry (involving technical stuff like pseudo-random code, carrier-phase, ephemeris adjustments, and time hacks) allows accurate timing to one billionth of a second (.000000001) and can produce extremely accurate distance measurements in three-dimensional space. Generally speaking, averaged stationary measurements (termed static mode) tend to be more accurate than a single reading or sets of readings made while on-the-go (kinematic mode).
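The position calculation described above - distances from known satellite positions, with a fourth measurement soaking up the receiver clock error - can be illustrated with a toy iterative least-squares solver. The coordinates below are made-up units, not real orbits, and the solver is a textbook sketch rather than receiver firmware:

```python
# Toy trilateration with a receiver clock bias: solve for (x, y, z, bias)
# from measured pseudoranges to satellites at known positions.
import numpy as np

def solve_position(sat_positions, pseudoranges, iterations=10):
    x = np.zeros(4)  # receiver x, y, z and clock bias (as a distance)
    for _ in range(iterations):
        rho = np.linalg.norm(sat_positions - x[:3], axis=1)
        predicted = rho + x[3]
        # Jacobian: unit vectors from satellites to receiver, plus clock column
        H = np.hstack([(x[:3] - sat_positions) / rho[:, None],
                       np.ones((len(rho), 1))])
        x += np.linalg.lstsq(H, pseudoranges - predicted, rcond=None)[0]
    return x

sats = np.array([[15.0, 0, 20], [-10, 10, 20], [-10, -10, 20], [0, 0, 25]])
truth = np.array([1.0, 2.0, 3.0])
clock_bias = 0.5  # receiver clock error expressed as distance
measured = np.linalg.norm(sats - truth, axis=1) + clock_bias

est = solve_position(sats, measured)
print(np.round(est, 3))  # recovers the true position and the clock bias
```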



GPS: Intermediate Stuff

As with everything in the real-world, actual GPS performance depends on several "muddling" factors. First and foremost is GPS's history as a US Department of Defense program. They financed the billions needed to set up the system for military purposes and feel a bit uncomfortable if just anyone (such as terrorists or enemy troops) can simply tap into their system. They purposely degrade the signal using an operational mode called selective availability (S/A) to provide an accuracy of only about 100 meters. With the military muddling turned off, accuracy of about 10 meters is commonplace.

The signal degrading can be overcome by a method termed differential correction. A differential GPS unit uses real-time corrections from a local "reference receiver" whose exact location is known. When the reference receiver gets a satellite signal, it calculates its implied position, then quickly "reverse calculates" the direction and distance correction needed to place it where it should be. The correction is broadcast to the field units, or stored for post-processing of field readings back at the office.
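The "reverse calculate and broadcast" idea above can be shown with a toy position-shift correction. Real RTCM corrections are per-satellite range corrections, not a simple position shift, so treat this purely as an illustration of the principle; the function names and numbers are invented:

```python
# Toy differential correction: the reference receiver knows its surveyed
# position, computes the error in its GPS-derived position, and the rover
# applies the same shift to its own reading.

def correction(reference_truth, reference_measured):
    """Shift needed to move the measured reference position onto the truth."""
    return tuple(t - m for t, m in zip(reference_truth, reference_measured))

def apply_correction(rover_measured, corr):
    return tuple(m + c for m, c in zip(rover_measured, corr))

ref_truth = (1000.0, 2000.0)   # surveyed reference position (east, north)
ref_meas = (1002.1, 1998.7)    # what the reference receiver computed
corr = correction(ref_truth, ref_meas)

rover_meas = (1500.8, 2503.2)
print(apply_correction(rover_meas, corr))  # rover shifted by the same error
```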

There are several companies offering commercial differential correction signals similar to renting a telephone pager. Areas around navigable waterways can receive free correction signals from "beacons" established and maintained by the US Coast Guard (an interesting "combative" relationship between the USCG and DOD; go figure). However, these "real-time" signals aren't required since you can download fairly accurate corrections for most areas from the Internet and "post-process" your GPS files the next day.

In general, there are two main hurdles in processing GPS signals: jitters and jumps. As with any instrument, inherent error for a set of readings at a fixed location yields a jittery cluster of possible positions, termed the sphere of uncertainty (see fig. 7.6). The cluster is statistically summarized to report the general accuracy of a GPS unit. A frequently used measure, the circular error probable (CEP), identifies the radius of a circle capturing 50 percent of the readings around test locations. Another measure reports the radius of a circle having one standard deviation around the actual location. Both measures assume the cluster of points is evenly distributed around the actual point. The worst kind of jitters has a directional bias.
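The CEP measure above is easy to sketch: take the distance of each reading from the true point and find the radius that captures half of them. The readings below are made-up, and the median distance is used as an approximation of the 50 percent radius:

```python
# Sketch of circular error probable (CEP): the radius of the circle around
# the true point that captures about half of the readings.
import math

def cep(readings, truth):
    distances = sorted(math.dist(r, truth) for r in readings)
    return distances[len(distances) // 2]  # median distance ~ 50% radius

truth = (0.0, 0.0)
readings = [(0.2, 0.1), (-0.4, 0.3), (1.1, -0.2), (0.05, -0.05),
            (-0.9, -0.6), (0.3, 0.4), (-0.1, 0.8), (0.6, -0.7)]
print(round(cep(readings, truth), 3))
```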

Also, satellites come and go over the horizon with time; as one is dropped and another picked up, the calculated position can take a temporary jump. Although four satellites are technically sufficient, multi-channel receivers lock in on several more satellites and instantaneously switch without a sharp jump. Processing software uses running and dampened averages of several readings to cope with the jitters and jumps. Keep in mind that the silicon in all GPS hardware is about the same; it's creative software that separates the best receivers.
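The "dampened average" idea can be sketched with an exponential moving average, which smooths jitter and softens the jump when the satellite set changes. The smoothing factor and readings below are arbitrary illustrations, not what any particular receiver uses:

```python
# Sketch of a dampened (exponential moving) average over GPS readings.

def smooth(readings, alpha=0.3):
    out, value = [], readings[0]
    for r in readings:
        value = alpha * r + (1 - alpha) * value  # blend new reading with history
        out.append(value)
    return out

# A jittery series with a sudden jump (e.g., a satellite switch) at the end
raw = [10.0, 10.2, 9.9, 10.1, 10.0, 12.0]
print([round(v, 2) for v in smooth(raw)])  # the final jump is much softened
```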

A well-tuned differential GPS system in static mode for use on the farm can easily place you within a meter horizontally and five meters vertically. A simple, inexpensive, autonomous system can place you somewhere within a "football field." That is, if atmospheric, ground-cover and terrain factors permit; things quickly deteriorate under a dense vegetation canopy and at the bottom of steep canyons. Also, the satellites are not always available in a nicely dispersed pattern in the sky. That means you need to plan to be in the field at the times the satellites' celestial charts dictate; try explaining that one to your field crew.

A GPS's ability to rapidly and accurately locate positions on the earth's surface is a powerful addition to GIS technology. However, it is important to keep in mind that GPS is not intended to fully replace conventional surveys. It augments cadastral records with real-time and real-world positioning. When attached to a vehicle, GPS tracks it better than a hound dog.

The contribution of GPS to generating and updating GIS maps is obvious. Yet, GPS is more than a data collection device; it's a practical tool to navigate GIS results. As GIS matures, more of its applications will involve GIS modeling, such as "variable-width buffers" around streams considering terrain steepness, ground cover and soil erodibility. Although such relationships follow common sense, their spatial expression is extremely complex. The contractions and expansions of a variable-width buffer on a paper map are impossible to see in the field. However, if you download the coordinates of the buffer into your GPS, you can navigate the complicated spatial result, in effect delineating the spatial expression "on-the-go."

GPS: Advanced Stuff
You have a grasp of the basics of GPS, the "gal-darn positioning system" that fires up every time you go near that new Star Trek tractor. It uses a constellation of satellites and mathematical triangulation to tell you where you are, rather precisely, if all goes well. However, there are several complex considerations involved in accurate positioning.

First, the distance from a satellite to you is determined by the time it takes a radio signal to traverse the space. Your typical stopwatch can't cut it: 1/100th of a second equals 1,860 miles for a radio signal traveling at the speed of light. A timing error of just .01 second could put you on another continent. Clock accuracy has to be in the nanosecond range (.000000001 second), which translates to less than one foot and keeps the precision in precision farming.
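The arithmetic behind those claims is just distance = timing error * speed of light, using the approximate 186,000 miles per second figure implied by the text:

```python
# Distance error from a GPS timing error: error (s) * speed of light.

SPEED_OF_LIGHT_MILES_PER_S = 186_000  # approximate value, as used in the text

def distance_error_miles(timing_error_s):
    return timing_error_s * SPEED_OF_LIGHT_MILES_PER_S

print(distance_error_miles(0.01))          # 1860.0 miles: another continent
print(distance_error_miles(1e-9) * 5280)   # feet per nanosecond: under a foot
```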

While the timing signal is the heart of the system, GPS receivers also monitor other information from the satellite. The almanac for a satellite reports the timetable of the satellite, path of travel, orbital fine-tuning and horizon setting. This information is augmented by ephemeris data identifying the fine-tuning of the satellite''s position resulting from small predictable changes in orbit induced by gravitational effects of the sun, moon or other bodies, as well as other effects like the solar wind tugging on the satellite. The almanac information is updated monthly; the ephemeris data get updated every time the satellite passes over a ground control station.

In a sense, the almanac is like a bus schedule, giving the satellite's expected position at any time, while the ephemeris data update the estimate tempered by actual conditions. The result is the precise positioning of the satellite every billionth of a second. Knowing the exact location of each satellite and the distance from it to you provides the input to the trigonometry equations that solve for your position: less than a meter if all goes well. All this goes on at speeds, decimal places and minute spacing only electrical engineers and mathematicians truly appreciate.

An even more precise (and concurrently more complex) positioning method isn't based on the measurement of time directly. It counts the number of radio waves between the individual satellites and the receiver. By carefully timing, counting and measuring the arrival of the waves, carrier phase receivers provide decimeter (one-tenth of a meter) accuracy. However, you won't see these GPSs on tractors for a while since they have a hefty price tag and are notoriously unstable when moving.

So, what can go wrong? As discussed earlier, the military can muddle civilian use of the system or selectively turn it off during times of crisis. While differential correction usually restores the muddled measurements to within a meter, it requires access to base station corrections. Post-processing of a GPS data file can be done via the Internet; however, the turn-around time is several hours. For the real-time positioning needed in precision farming, you need additional hardware to monitor broadcast corrections and a fancier GPS receiver to make the adjustments "on-the-fly."

Even with differential correction, atomic clocks in the satellites and an extra satellite measurement for calculations, there are other sources of error, more subtle and difficult to deal with. Earth's ionosphere can subtly affect the speed of the radio waves. As the density of charged particles in the ionosphere increases or the path gets closer to the horizon, the waves slow down and the travel time increases. While equations adjust for speed variation on an average day, under average ionospheric conditions, more advanced GPS receivers can tweak the calculations by measuring subtle changes in the signals broadcast on the two carrier frequencies. Weather conditions can disrupt calculations since water vapor affects radio signals in a similar manner. Although this error is almost impossible to correct, fortunately it is usually small.
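The dual-frequency trick works because ionospheric delay scales with one over the frequency squared, so measurements on the two carriers (L1 and L2, listed earlier in this document) can be combined to cancel it. This is the standard "ionosphere-free combination"; the delay numbers below are illustrative:

```python
# Ionosphere-free combination of dual-frequency range measurements:
# delay ~ 1/f^2, so (f1^2 * R1 - f2^2 * R2) / (f1^2 - f2^2) cancels it.

F1, F2 = 1575.42e6, 1227.6e6  # L1 and L2 carrier frequencies, Hz

def ionosphere_free(range_l1, range_l2):
    return (F1**2 * range_l1 - F2**2 * range_l2) / (F1**2 - F2**2)

true_range = 20_200_000.0      # meters, a typical satellite distance
iono_delay_l1 = 5.0            # meters of extra apparent range on L1
iono_delay_l2 = iono_delay_l1 * (F1 / F2) ** 2  # larger on the lower frequency

measured_l1 = true_range + iono_delay_l1
measured_l2 = true_range + iono_delay_l2
print(ionosphere_free(measured_l1, measured_l2))  # recovers the true range
```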

A localized source of error that can have a significant effect is termed multi-pathing. It occurs when signals bounce off nearby objects before getting to the receiver. The effect is similar to "ghosting" on a TV where the signal takes a circuitous route instead of going directly to the receiver as the equations assume. Advanced signal processing techniques and antenna engineering minimize these problems.

A final consideration is the result of the relative positioning of the satellites. There are currently 21 satellites (plus three spares) circling the earth every 12 hours. They are dispersed so there are about 11 available to each half of the globe at any time. Of these, three are necessary for two-dimensional positioning (x,y) and four are needed for three-dimensional (x,y,z) positioning. However, the relative positioning of the satellites used for the calculations has a significant effect on accuracy.


If the satellites are clustered in one part of the sky, the readings are less reliable. Although the orbits are designed to disperse the pattern, local features, such as a ridge or wind row of trees, often blot out some of the paths, forcing the selection of an alternate set of satellites that are more bunched. If the satellites are too close together (see fig. 7.7), the geometry of the overlap of the circles forms a box instead of a point of intersection. If the satellites are too far apart, the increased atmosphere and interference from local terrain can muddle things.

Ideally, the three (or four) satellites should be balanced along a circle 45 degrees above the horizon, with minimal ionospheric, weather and multi-pathing effects at play. If that's the case you are operating at peak precision; if not, things start to degrade. Most of the time, however, you can expect to be navigating within a few feet, which is a heck of a lot better than my golf swing's mark from only 100 yards out.
Remote Sensing

The GIS/GPS technologies position both spatial data and spatial relationships on the landscape. But how to effectively identify, measure and monitor farm conditions is a continuing challenge. A GIS and its closely related field of remote sensing form an alliance that greatly enhances the technical toolkit for mapping. Remote sensing is actually GIS's older brother, having its modern roots in World War II. Camouflage detection film was used to distinguish between healthy vegetation and cut branches piled on top of military equipment. To the human eye and normal film the healthy and cut branches were both green (at least for a few days), but on the new film they showed up as two different colors.

Remote sensing uses relative variations in electromagnetic radiation (EMR) to identify landscape characteristics and conditions. In fact, so do your eyes. Sunlight (the "visible" form of EMR) starts off with fairly equal parts of blue, green and red light. When sunlight interacts with an object, the object's composition causes it to absorb varying amounts of the different wavelengths of EMR "light." What light isn't absorbed is reflected to your eyes. Your brain interprets the subtle differences in the amount of blue, green and red in the reflected light to recognize the thousands of colors we relate to our surroundings.

Vegetation is particularly "photogenic" because of its structure, pigments and water content. Since sunlight is a plant's source of energy, it goes out of its way to present its leaves in the best possible light. When thousands of plants in a field align themselves, their structure forms an excellent receptor and reflector of sunlight.

The physiology of a leaf determines the relative absorption and reflection of light. The equal portions of blue, green and red light from the sun are basically unaffected by the surface of the leaf, but when the light encounters the chloroplasts containing chlorophyll A and B it is radically altered (see fig. 7.8). These pigments absorb most of the blue and red light for the energy needed in photosynthesis used in plant growth and maintenance. Other pigments in the leaf (e.g., carotenes) absorb lesser amounts of the other wavelengths of light.

As the pigment-altered light continues deeper into the leaf, it interacts with the spongy mesophyll. This bubble-like structure acts like a mirror and reflects the light back toward the sky. Since the blue and red wavelengths have been diminished, we see a predominance of green in the reflected light: a healthy "green" leaf (because blue and red are usurped by the plant).

An unhealthy leaf, however, looks a lot different, particularly in remote sensing imagery. When water pressure changes (e.g., when a branch is cut from its stem), the spongy mesophyll in the leaves collapses within hours and this area's efficiency of reflecting light is greatly reduced. The chloroplasts, on the other hand, keep on working away at photosynthesis for several days. The result is that we "see" a slight change in reflectance (predominantly green) at first, then a slow progression to brown as the chloroplasts eventually quit preferentially absorbing blue and red light.

However, what makes remote sensing's view different is its ability to look at reflected light beyond visible blue, green and red light. "Invisible" near-infrared light (NIR) is at wavelengths just beyond the red light your eyes can detect. These wavelengths are unaffected by the plant's pigments and are highly reflected by the spongy mesophyll. When the "bubbles" in this portion of a leaf collapse, there is an immediate and dramatic change in the reflectance of near-infrared light. That's the principle behind camouflage detection film: we see a branch as green for days; remote sensing imagery detects a dramatic difference in near-infrared light in a matter of hours.

What makes remote sensing data so useful is that it encapsulates biological and physical characteristics into an image. The encoded variations in reflected light emanating from a field provide information about changing conditions and crop status: important stuff you should keep your eye on.

Figure 7.9 extends the discussion of the basic concepts of plant physiology and its interactions with light from a plant to a whole field. From a simplified view, as more biomass is added, the reflectance curve for bare soil (similar to a dead leaf) is transformed into a spectral signature that typifies one big green leaf. As the crop matures, the reflectance pattern changes again. How spectral signatures change provides valuable insight into field conditions.
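The red/near-infrared contrast described above is the basis of common vegetation indices. NDVI is not named in this text, but it is the standard example of the principle: healthy canopy reflects much more NIR than red, bare soil does not. The reflectance values below are illustrative:

```python
# Normalized Difference Vegetation Index: contrast of NIR vs. red reflectance.

def ndvi(nir, red):
    return (nir - red) / (nir + red)

# Illustrative reflectance values (fraction of incoming light reflected)
print(round(ndvi(nir=0.50, red=0.08), 2))  # dense healthy crop: high NDVI
print(round(ndvi(nir=0.25, red=0.20), 2))  # bare soil: near zero
```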

Now that you have a basic understanding of what happens to light in a plant canopy, let's take a loftier view and see how it is translated into a computer image. An aerial camera operates like your eye, except photographic paper replaces the optical nerves in the retina. The image is focused through the lens, with variation in light recorded by a photochemical process on the film.

The scanner in a satellite operates a bit differently, more like your laser printer that "sees" the world through thousands of dots. Its sensor focuses for an instant at a spot on the ground (a few meters in diameter) as shown in figure 7.10. Like your eyes, it records the relative amounts of the different types of light it "sees": a lot of green for a dense healthy crop; much less green and more blue and red for bare ground. In addition to normal light (termed the visible spectrum), it can record other types that we can't see, such as near infrared, thermal and radar energy. The sensor sweeps from side to side and the satellite moves forward, recording the relative amounts of light reflected from millions of spots on the ground.

When these spots (termed pixels, for "picture elements") are displayed on a computer, they form an image similar to an aerial photograph. In fact, a photograph can be "scanned" to generate a digital image, like pulling the satellite out of the sky and passing it over the photo instead of the actual terrain. The important point is that behind any digital image there are millions of numbers recording the various types of light reflected from each instantaneous spot.

Three factors govern the quality and appropriateness of remote sensing data: 1) spatial, 2) spectral and 3) temporal resolutions.

Spatial resolution identifies the smallest thing (spatial unit) contained in the data. In a photograph, it is a glob of developed crystals embedded in the emulsion; in a digital image it's the size of the pixel. Up to a point, smaller is better. If there is too much spatial detail, you "can't see the forest for the trees," nor store the burgeoning file.
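The "burgeoning file" point is simple arithmetic: halving the pixel size quadruples the number of pixels (and the storage) for the same field. The field and pixel sizes below are arbitrary examples:

```python
# Pixel count for a square field at a given pixel size: halving the pixel
# size quadruples the number of pixels to store.

def pixel_count(field_m, pixel_m):
    per_side = field_m / pixel_m
    return per_side * per_side

print(pixel_count(800, 4))   # 4 m pixels over an 800 m square field
print(pixel_count(800, 2))   # 2 m pixels: four times as many
```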

Spectral resolution refers to the number and width of the wavelength bands (colors) contained in the data. Again, more is better, up to a point. The human eye and normal film "see" just three broad bands of light: blue, green and red. Optical scanners can record many more narrowly defined bands that can be "tuned" for specific wavelengths to enhance features of interest. The ability to extend the bands beyond our vision (particularly to near infrared energy) and analyze just the important ones allows us to gain a lot more information from the data than simply viewing an image.

The rub is that there is a tradeoff between spatial and spectral resolutions: pushing both of them to the maximum results in too little energy for a sensor's detector. Early satellite systems required a lot of energy to activate their detectors; therefore, they only had four broad bands and a footprint of about an acre. Modern detectors can record many more narrow bands and commonly have a footprint of only a few meters. At these resolutions (spectral and spatial), even satellite data becomes appropriate for some aspects of site-specific management.

Temporal resolution refers to the time step between images. A series of images collected each week over a field provides far more information than a single image taken at just one point in the crop's growth. You guessed it; more is better, up to a point. But this time the point isn't driven by optical physics but by your wallet. By its very nature, site-specific management implies small areas, while most remote sensing systems (particularly satellites) are designed to service large areas. Pricing and distribution channels for digital data in smaller bites (and bytes) and turn-around times needed by farmers are just now coming on line.

While the full potential of remote sensing might be just around the corner, an aerial photo backdrop is an essential element of any precision farming system. There's a growing number of ways you can acquire such an image. If you're lucky you can download a "rectified" image from the Internet or pick up one from a governmental agency in your locale. Some farmers have struck a deal with th

Your Position: Where Are You?
All GIS databases require a common coordinate system to identify where in the world the data are and to spatially register various maps. A coordinate system is composed of two elements:

a spheroid that mathematically describes the three-dimensional shape of the earth and
a map projection that mathematically converts the spherical representation to two-dimensional coordinates for display on a flat screen or printed on a sheet of paper.
Mentioning the word "mathematically" twice in the same sentence is journalistic suicide, but it should reinforce the idea that maps within a GIS are numbers first, pictures later. As users of the technology you won't be tested on the intellectual elegance of a blackboard full of equations, but need to understand the basic concepts and the coordinate "got'chas" you might encounter.

The first is the choice of the equation of the spheroid. It is similar to enlarging and shrinking a giant balloon in an attempt to "best fit" the earth's surface. Keep in mind that the spinning earth is fatter at the equator, so the equation of a simple sphere won't do. Nor is the spheroid with the best overall fit the best fit for all locations on the earth. Hence, there is a multitude of variations to the basic equation; blow up the balloon a bit and a little more squashing at the poles gives North America a better fit but messes up things for Bolivia.

Once an appropriate spheroid is selected, most GISs record positions using latitude/longitude (Lat/Long). As shown in figure 7.1, an imaginary line is drawn from the center of the earth to a point on the earth's surface, and two angles are used to describe the line's position: 1) the east-west sweep along the equator, termed longitude, and 2) the north-south deflection, termed latitude.

Longitude ranges from 0 at the Prime Meridian passing through Greenwich, England, to +180 toward the east and 0 to -180 toward the west. Latitude ranges from 0 at the equator to +90 at the North Pole and 0 to -90 at the South Pole. For example, Denver's position shown in the figure is -104.9 degrees longitude (west) and +39.8 degrees latitude (north).
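These angular coordinates are directly computable. As a minimal sketch (using the textbook haversine formula on a spherical earth; the 6371 km mean radius and the Greenwich latitude are common reference values, not taken from this text), the Denver position in the figure can be turned into a ground distance to any other Lat/Long pair:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two Lat/Long points on a sphere.

    Signs follow the convention in the text: west longitude and
    south latitude are negative.
    """
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# Denver (from the figure) to Greenwich (0 degrees longitude, ~51.5 N)
d = haversine_km(39.8, -104.9, 51.5, 0.0)
print(round(d))  # roughly 7,500 km on a spherical earth
```

On the true spheroid the answer would differ slightly, which is exactly why a GIS carries the full spheroid equations rather than a simple sphere.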


So far things are fairly accurate. However, as shown in the figure 7.2, it is impossible to accurately cram the earth's curved surface onto a flat plane. The result is that 1) all map projections distort shape, area, distance and direction; 2) the distortions increase with larger areas; 3) different projections produce different distortions; and 4) the best projection depends on the application.

The bright spot in this dismal situation is that most GISs use accurate Lat/Long coordinates for internal referencing and handle all of the math for converting to a flat screen or printed page whenever you ask. Your charge is to choose an appropriate coordinate system and stick to it when entering, processing and viewing data.





Your Position: Projecting the Right Image

As we have just discussed, there are two elements defining a GIS coordinate system: the spheroid and the projection. The spheroid describes locations in three-dimensional space (curved earth) while the projection converts them to a two-dimensional rendering for plotting on a screen or sheet of paper (flat map). Changes in either element change the relative positioning of mapped data. For example, locations in the United States "move" nearly 200 feet between the North American Datum established in 1927 (NAD27) and the revised earth-surface equations of the World Geodetic System established in 1984 (WGS84). Major problems arise when your GPS is set to WGS84, but the maps you downloaded from the Internet are NAD27-based; your registration is off by more than half a football field from the start. But that's nothing compared to the positioning errors that arise if you mix projections.
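To get a feel for what a 200-foot datum shift means in angular terms, here is a rough back-of-the-envelope sketch (plain spherical geometry with an approximate 111,320 m per degree of latitude; the function is illustrative and is not an actual NAD27 transformation, which varies from place to place):

```python
import math

METERS_PER_DEG_LAT = 111_320.0  # approximate spacing of latitude lines

def shift_in_degrees(shift_m, lat_deg):
    """Convert a ground shift in meters to (d_lat, d_lon) in degrees.

    Longitude lines converge toward the poles, so the same ground
    distance spans more degrees of longitude at higher latitudes.
    """
    d_lat = shift_m / METERS_PER_DEG_LAT
    d_lon = shift_m / (METERS_PER_DEG_LAT * math.cos(math.radians(lat_deg)))
    return d_lat, d_lon

# A 200-foot (~61 m) datum-style offset at Denver's latitude
d_lat, d_lon = shift_in_degrees(61.0, 39.8)
print(f"{d_lat:.5f} deg lat, {d_lon:.5f} deg lon")
```

The shift is only about half a thousandth of a degree, which is why a map can look perfectly fine on screen while every feature sits half a football field from where your GPS says it should be.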

There are three basic types of map projection shown in figure 7.3: 1) cylindrical, 2) azimuthal and 3) conic.

These three projections refer to the shape of the projection surface. A cylindrical projection wraps a "sheet of paper" around the spheroid (earth), then projects each location perpendicular to the cylinder. An azimuthal projection simply projects locations onto the flat sheet of paper. A conic projection twists the paper into a cone before projecting locations onto it.
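To make the cylindrical case concrete, here is a minimal sketch using the standard Mercator equations (textbook projection math; the function name and the unit-sphere radius are illustrative choices, not from this article). Longitude maps linearly to x, while latitude is stretched more and more toward the poles:

```python
import math

def mercator(lat_deg, lon_deg, radius=1.0):
    """Project Lat/Long onto a cylinder wrapped around the equator.

    x grows linearly with longitude; y stretches toward the poles,
    which is why Mercator maps exaggerate areas at high latitudes.
    """
    x = radius * math.radians(lon_deg)
    y = radius * math.log(math.tan(math.pi / 4 + math.radians(lat_deg) / 2))
    return x, y

# Equal 30-degree steps of latitude land farther and farther apart on the map
for lat in (0, 30, 60):
    x, y = mercator(lat, 0)
    print(lat, round(y, 3))
```

The widening gaps between equal latitude steps are the distortion the text warns about: the projection surface can only touch the spheroid along a line, and everything away from that line gets stretched.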

A projection's graticule depicts the two-dimensional appearance of a regular grid inscribed on a three-dimensional surface. Figure 7.3 shows significant differences in the grid's appearance for the three projections, which can translate into several football fields of movement.

Even within a single projection type, the orientation and placement of the projection surface can introduce dramatic changes, as shown in figure 7.4, where two maps of the United States use subtly different cylindrical projection specifications (differences in orientation and placement). Note that the map on the left is more compressed in the north-south direction. A myriad of differences in the shape, area, distances and directions among mapped features can be introduced by changes in the spatial referencing (spheroid and projection) of the data. So what can you do about all this "slop" in mapping?

First, choose a suitable spheroid. Since precision farming actively uses GPS data, it makes sense to use the revised WGS84 or NAD83 datum. There are four commonly used map projections in the United States (Mercator, Transverse Mercator, Albers equal-area conic, and Lambert conformal conic) and two planar coordinate systems (the State Plane Coordinate System and Universal Transverse Mercator). My personal favorite is Universal Transverse Mercator (UTM) because it is consistent throughout the world and uses the metric system. The State Plane system is tailored for each state by dividing it into zones based on geomorphology and uses different projections for east-west and north-south oriented zones.
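One reason UTM is consistent worldwide is that its longitude zones follow a fixed rule: sixty zones, each six degrees wide, numbered eastward from 180 degrees west. A small sketch of that convention (the formula is the standard UTM zone rule; the function name is mine):

```python
def utm_zone(lon_deg):
    """UTM longitude zone number (1-60); each zone spans 6 degrees,
    numbered eastward starting at 180 degrees west."""
    return int((lon_deg + 180) // 6) + 1

print(utm_zone(-104.9))  # Denver falls in UTM zone 13
```

Within each zone, UTM reports metric eastings and northings from a zone-specific origin, which is what makes field-scale distance arithmetic so convenient.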

Actually, any of the above-mentioned systems will do, as most GISs can simply switch from one to another. However, it's like a translator at the United Nations: the basic concepts are easily converted into different languages, but subtle points can be lost. Your safest bet is to ensure that your GIS "speaks as one" (whichever one) and immediately convert incoming data to the "official tongue."

Navigational Systems
GPS, DGPS, and Backup Systems: A Farmer's Guide to Precision Farming
Who uses GPS?

GPS and Racing Applications

An Overview of Remote Sensing
Remote sensing is an extensive science, drawing on many areas for support and development. It depends greatly on the support of governments and private industries worldwide. Satellite and digital imagery play an important role in remote sensing, providing information about the land studied.

Remote sensing systems have four basic components for measuring and recording data about an area from a distance: the energy source, the transmission path, the target and the sensor. The energy source, electromagnetic energy, is especially important, as it is the medium that carries information from the target to the sensor.

Remote sensing provides important coverage, mapping and classification of landcover features, such as vegetation, soil, water and forests (diagram of spectral reflectance curves for vegetation, soil and water). The Kananaskis Valley has provided an environment for remote sensing studies, using satellite and digital imagery (from Landsat, SPOT and CASI).

The degree of accuracy achieved in classification depends on the quality of the images and the researcher's knowledge of the native species in the area. Topographic data and a Digital Elevation Model also increase classification accuracy. Correlations can then be drawn between drainage, surficial deposits and topographic features in order to show the relationships that occur among forest, vegetation and soils. This provides important information for land classification and land-use management.

Remote sensing is an interesting and exploratory science, as it provides images of areas in a fast and cost-efficient manner and shows "what is happening right now" in a study area. While airphotos and fieldwork remain critical sources of information, their cost and turnaround time may not be feasible for a given study. Recently acquired satellite and digital imagery provides more overall detail to assist the researcher in the classification process. Literature reviews and map interpretation can also support the interpretation process.

The benefits of remote sensing continue to grow. It can be used to survey areas that are hard to reach for fieldwork, and it provides a more detailed, permanent and objective record that offers a different perspective. Airphotos remain a favoured and easily accessible source of information for classification.

Remote Sensing of the Global Environment

RST: Theoretical and Technical Perspectives of Remote Sensing; Special Applications

Remote Sensing

The philosophical underpinnings of remote sensing

GOES 3.9 um Channel Tutorial
Developed by NOAA/NESDIS Cooperative Institute for Research in the Atmosphere (CIRA), at the Colorado State University in Fort Collins, Colorado.
Table of Contents
Introduction
Basic Radiation Science
Energy Sources
Emission and Reflection
3.9 & 10.7 um Channel Comparisons
Temperature Responsivity
Sub-pixel Response
Noise
Diffraction
Imagery Presentation
Imagery Applications
Currently Developed
Night-time Fog, Stratus & Cirrus
Super-cooled Clouds
Fog, Ice & Water Clouds Over Snow
Winter Storms
Earth- & Sea-surface Temperatures
Thin Cirrus & Multi-layered Clouds
Urban Heat "Islands"
Fire Detection
Under Investigation
Day-time Reflectivity
Visibility Contaminants
Sun Glint
Cumulus Bands at Night
Convective Cloud Phases
Volcanic Ash Cloud Monitoring (NEW!)
Glossary

Glossary
The GOES 3.9 um Tutorial Glossary
AVHRR -
Advanced Very High Resolution Radiometer, a 5-channel (4 infrared channels and 1 visible channel) instrument flown on board NOAA sun-synchronous polar-orbiting satellites.
band -
can refer to either a narrow spectral channel selected out of the electromagnetic spectrum, or to a larger portion of the spectrum.
blackbody -
a surface or body that absorbs all radiation incident upon it. Likewise, a blackbody has the maximum possible radiative emission for its given temperature.
channel -
a discrete portion of the spectrum measured by a satellite instrument, defined by a filter function (vs. wavelength). Satellite channels have a finite width, typically ranging from around 0.2 um in the visible to greater than 1.0 um in the infrared, or to greater than 10 um for sounder infrared channels.
emissivity -
also called emittance - the non-dimensional ratio of the radiance emitted from an object at a particular wavelength to the radiance that a blackbody would emit at that same temperature and wavelength. Thus a surface with an emissivity equal to 1.0 is a blackbody. All natural surfaces have emissivities less than 1.0, although most earth land surfaces have infrared emissivities between 0.9 and 1.0.
GOES -
Geostationary Operational Environmental Satellites, a series of satellites, in geosynchronous orbit, launched by the U.S. and operated by NOAA/NESDIS. There have been three generations of GOES satellites, starting with the SMS/GOES series in the mid-1970s. The most recent GOES satellites are GOES-8 and GOES-9, launched in 1994 and 1995, respectively.
geostationary -
sometimes called geosynchronous - a characteristic of a satellite orbit in which the satellite circles the globe, over the equator, in synchronization with the earth's rotation. These satellites have a period of 24 h, allowing images of the scene below the satellite to be taken continuously, with little or no perceived movement.
Imager -
as applied to GOES satellites: a 5-channel instrument designed to measure in the visible and the infrared portions of the electromagnetic spectrum, to provide operational images every 15 minutes over most of the U.S.
infrared -
the portion of the electromagnetic spectrum with wavelengths ranging from longer than visible radiation, starting around 0.7 um, to wavelengths shorter than those in the microwave portion of the spectrum. Satellite instruments typically measure infrared radiation between wavelengths of about 3 um and 20 um.
Kirchhoff's Law -
the law that states that for objects in thermodynamic equilibrium (being characterized by a single temperature, or radiatively stable) the absorptivity equals the emissivity.
longwave -
when referring to the infrared portion of the electromagnetic spectrum, longwave is the region above about 10 um.
lookup table - or color table -
the enhancement (often using color) applied to satellite imagery, used to emphasize certain features that may be of interest.
medium wave -
when referring to the infrared portion of the electromagnetic spectrum, medium wave is the region between about 5 um and 10 um.
microwave -
the portion of the electromagnetic spectrum with much longer wavelengths than infrared radiation, typically above about 1 mm.
NEDR -
Noise Equivalent Delta Radiance, or noise equivalent radiance - the uncertainty in satellite measurements in terms of radiance units. The NEDR is usually a constant, regardless of the temperature of the scene being observed.
NEDT -
Noise Equivalent Delta Temperature, or noise equivalent temperature - the uncertainty in satellite measurements in terms of temperature units. The NEDT is a value which depends on the temperature of the scene being observed.
NESDIS -
the National Environmental Satellite, Data, and Information Service, the part of NOAA which operates U.S. weather satellites and provides satellite data services to other branches of NOAA and other branches of the government.
NOAA -
the National Oceanic and Atmospheric Administration, part of the U.S. Department of Commerce, responsible for monitoring and predicting the state of the oceans and the atmosphere. Also the name of the current series of polar-orbiting sun-synchronous weather satellites operated by NOAA.
Planck function -
a function named after Max Planck, which describes the blackbody radiative emission of a surface or body as a function of wavelength and temperature. The Planck radiance is a unique value for each wavelength and temperature.
polar-orbiting -
a characteristic of a satellite orbit that allows the satellite to circle the globe approximately over the poles of the earth. Polar-orbiting satellites have orbital inclinations, with respect to the equator, of close to 90 degrees. Typically, polar-orbiting weather satellites are also sun-synchronous.
radiance -
a conserved quantity of energy per unit area, per unit solid angle, and per unit of spectrum bandwidth. Radiances are measured by satellite instruments called radiometers or spectrometers, typically in units of mW / (m2.sr.cm-1), in the infrared portion of the spectrum or in units of W / (m2.sr.um), in the visible portion of the spectrum.
radiance temperature -
sometimes called brightness temperature, or blackbody temperature - the temperature measured by a satellite instrument, usually detected in terms of radiance, but converted into a temperature through the Planck function at a given wavelength.
resolution -
the size of the field-of-view (FOV) of a satellite picture element, as measured on the earth in kilometers. Resolution can have a second meaning: as the distance between the centers of adjacent picture elements. The two resolutions can be different, resulting in either overlap of individual FOVs, or gaps between them.
scene temperature -
the actual temperature of the scene being viewed. This temperature differs from the radiance temperature of the surface due to emissivity, reflectance, and atmospheric attenuation of the radiation.
shortwave -
when referring to the infrared portion of the electromagnetic spectrum, shortwave is the region below about 5 um.
Sounder -
as applied to GOES satellites: a 19-channel instrument designed to provide visible and infrared spectral radiances, used to vertically probe, or sound, the atmosphere. This is done by employing spectral bands with different amounts of atmospheric absorption, in order to measure temperatures and moisture at different depths in the atmosphere. Sounder data is typically available from GOES every hour, over the same locations.
sun-synchronous -
a characteristic of a satellite orbit that allows the satellite's path to precess, or rotate slowly, in synchronization with the earth's revolution around the sun. Sun-synchronous satellites view the earth below at the same local time each pass and, by necessity, are polar-orbiting, viewing the earth below during both a day-time and a night-time overpass, approximately 12 hours apart.
TIROS -
Television and InfraRed Observation Satellite - an old term used for the first polar-orbiting weather satellites. Currently, satellites in the series are called NOAA satellites.
visible -
the portion of the electromagnetic spectrum viewable by the naked eye, with wavelengths ranging from approximately 0.43 um to 0.69 um.
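Several of the glossary terms (blackbody, Planck function, radiance, emissivity, radiance temperature, scene temperature) fit together in a single calculation. The sketch below uses the standard Planck-law constants rather than anything from the tutorial itself: it computes blackbody radiance at the 10.7 um channel wavelength, scales it by an emissivity below 1.0, then inverts the Planck function to recover the radiance (brightness) temperature, which comes out a few kelvins colder than the true scene temperature, just as the scene temperature entry describes.

```python
import math

H = 6.626e-34   # Planck constant, J s
C = 2.998e8     # speed of light, m/s
KB = 1.381e-23  # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Blackbody spectral radiance, W / (m^2 sr m), from the Planck function."""
    a = 2 * H * C ** 2 / wavelength_m ** 5
    b = math.exp(H * C / (wavelength_m * KB * temp_k)) - 1
    return a / b

def brightness_temp(wavelength_m, radiance):
    """Invert the Planck function: measured radiance -> radiance temperature."""
    a = 2 * H * C ** 2 / wavelength_m ** 5
    return H * C / (wavelength_m * KB * math.log(a / radiance + 1))

wl = 10.7e-6       # GOES 10.7 um longwave-infrared channel
scene_t = 290.0    # true scene temperature, K
emitted = 0.95 * planck_radiance(wl, scene_t)   # surface emissivity of 0.95
tb = brightness_temp(wl, emitted)
print(round(tb, 1))   # radiance temperature lands a few K below 290
```

For a blackbody (emissivity 1.0) the inversion returns the scene temperature exactly; the gap between the two temperatures here is entirely the emissivity effect, before atmospheric attenuation is even considered.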

Global Positioning System

Remote Sensing
http://educationally.narod.ru/gisremotephotoalbum.html
Satellite Image
http://www.spot.com/?countryCode=US&languageCode=en