[From the U.S. Government Printing Office, www.gpo.gov]
Coastal Zone Information Center Go @ - - in", CENTER 0 Al i U DEVELOPMENT AND APPLICATION OF OPERATIONAL TECHNIQUES FOR THE INVENTORY AND MONITORING OF RESOURCES AND USES FOR THE TEXAS COASTAL ZONE Peggy Harwood The General Land Office 1700 North Congress Austin, Texas 78701 Robert Finley Bureau of Economic Geology The University of Texas at Austin Austin, Texas 78712 Samuel McCulloch Texas Natural Resources Information System Austin, Texas 78711 Patricia A. Malin The General Land Office 1700 North Congress Austinjexas 78701 John A. Schell Remote Sensing Center Texas A&M University College Station, Texas 77840 October, 1977 TYPE III (FINAL REPORT) April, 1975 through October, 1977 VOLUME If: APPENDICES prepared for: GODDARD SPACE FLIGHT CENTER GREENBELT, MARYLAND 20771 GENERAL LAND OFFICE OF TEXAS BOB ARMSTRONG, COMMISSIONER QH 541.15 M64 H3 1977 v. 2 This investigation was funded by National Aeronautics and Space Administration, Goddard Space Flight Center and the State of Texas through cost sharing. I COASTAL ZONE 1 N F. 0 Riii A `fi 0" Al 'U`7E N TE R DEVELOPMENT AND APPLICATION OF OPERATIONAL TECHNIQUES FOR THE INVENTORY AND MONITORING OF RESOURCES AND USES FOR THE TEXAS COASTAL ZONE li APPENDICES U - S . DEPARTMENT OF COMMERCE NOAA Appendix COASIAL SERVICES CENTER 2234 SOUTH HOBSON AVENUE A LANDSAT DATA LIBRARY CHARLESTON, SC 29405-2413 B TEXAS PARKS AND WILDLIFE DEPARTMENT VEGETATION SAMPLING METHOD AND FIELD INVESTIGATIONS SUMMARY C WIND VELOCITY AND DIRECTION TIME-HISTORIES AT THE TIME OF EACH LANDSAT IMAGE ANALYZED D SOFTWARE PROGRAMS AND MODIFICATIONS E GLOSSARY F DEVELOPMENT AND TESTING OF EXPERIMENTAL COMPUTER-ASSISTED ANALYTICAL TECHNIQUES G CONTROL NETWORK DATA SUMMARY H ANNOTATED BIBLIOGRAPHY ON THE APPLICATION OF AERIAL PHOTOGRAPHY AND LANDSAT IMAGERY TO THE STUDY OF COASTAL REGIONS I ACCURACY EVALUATION FOR EACH SCENE MAPPED, BY LAND COVER AND LAND USE CATEGORY DATA TABLES FOR ACCURACY'OF COMPUTER CLASSIFICATION IN Qsl_ THE HARBOR ISLAND TEST SITE N" K COST RECORDING F70R THE LANDSAT PROJECT L THE COST-SAVING ANALYSIS IN AN ECONOMIC CONTEXT M RAW DATA COSTS FOR LANDSAT MAPS DERIVED FROM IMAGE INTERPRE- TATION AND COMPUTER-ASSISTED ANALYSIS, WITH ONE TABLE ON LABOR COSTS FOR THE ENVIRONMENTAL-GEOLOGY MAP PrOperty of CSC Libra APPENDIX A LANDSAT DATA LIBRARY APPENDIX A LANDSAT COVERAGE OF THE TEST SITES 2, 3, 4, 5 FOR LANDSAT INVESTIGATION #23790 DATA CLOUD AVAILABILITY SCENE ID DATE COVER QUALITY Test Site 2: Summer: June - Aug. 2 1037 - 16251 08/29/72 20% 8888 1343 - 16253 07/01/73 20% 8888 1361 - 16252 07/19/73 20% 8888 2, 4 1703 - 16175 06/26/74 10% 8858 Fall: Sept. - Nov. 2 1073 - 16251 10/04/72 30% 8888 Winter: Dec. - Feb. 2 1217 - 16261 02/25/73 20% 8888 2 1505 - 16230 12/10/73 00% 2822 2 1901 - 16110 01/10/75 10% 8808 2, 4 2375 - 1.6112 02/01/76 00% 2 1576 - 16152 02/19/74 00% 8888 Spring: Mar. - May 1253 - 16262 04/02/73 20% 8888 2, 4 1289 - 16261 05/08/73 00% 8888 2 2051 - 16140 03/14/75 00% 8855 2 5027 - 16050 05/16/75 10% 5588 A-1 DATA CLOUD AVAILABILITY SCENE ID DATE COVER QUALITY Test Site 3: Summer: June - Aug. 1343 - 16253 07/01/73 20% 8888 1361 - 16252 07/19/73 20% 8888 1038 - 16305 08/30/72 20% 8888 1362 - 16305 08/30/72 20% 8888 2, 4 1703 - 16175 06/26/74 10% 8858 Fall: Sept. - Nov. 1092 - 16312 10/23/72 20% 8888 1110 - 16313 11110172 00% 8888 1452 - 16291 10/18/73 00% 7828 Winter: Dec. - Feb. 2, 4 1146 - 16314 12/16/72 00% 8888 1164 - 16312 01/03/73 10% 8888 1182 - 16313 01/21/73 00% 8888 2, 4 2034 - 16200 02/25/75 00% 8888 2016 - 16200 02/07/75 10% 5888 2 1578 - 16264 02/21/74 10% 8282 Spring: Mar. - May 1253 - 16262 04/02/73 20% 8888 1289 - 16261 05/08/73 00% 8888 1236 - 16320 03/16/73 10% 8888 A-2 DATA CLOUD AVAILABILITY SCENE ID DATE COVER QUALITY 1290 - 16315 05/09/73 00% 8888 1308 - 16314 05/27/73 20% 8888 2, 4 1614 - 16261 03/29/74 10% 8888 1974 - 16133 03/24/75 00% 8858 2 5028 - 16104 05/17/75 10% 8885 Test Site 4: Summer: June - Aug. 2 1326 - 16315 06/14/73 10% 8888 5 1380 - 16,311 08/07/73 30% 8883 5 1722 - 16232 07/15/74 30% 8858 5 2501 - 16081 06/06/76 20% 8885 2 1740 - 16292555 08/02/74 20% 8888 5 1758 - 16221 08/20/74 20% 8888 2, 4 5082 - 16080 07/10/75 10% 8888 Fall: Sept. - Nov. 2, 5 1092 - 16314 10/23/72 10% 8888 5 1110 - 16320 11/10/72 10% 8888 1452 - 16293 10/18/73 10% 8828 5 1776 - 16212 09/07/74 30% 5588 2 2268 - 16184 10/17/75 00% 5555 Winter: Dec. - Feb. 2, 4, 5* 1146 - 16320 12/16/72 20% 8888 5 1164 - 16:315 01/03/73 20% 8888 2, 3** 1182 - 16:315 01/21/73 00% 8888 A-3 DATA CLOUD AVAILABILITY SCENE ID DATE COVER QUALITY 5 2016 - 16202 02/07/75 00% 5885 2, 4 2034 - 16202 02/25/75 00% 8883 5 2375 - 16112 02/01/76 00% 8588 2, 4 2376 - 16172 02/02/76 00% 8888 Spring: Mar. - May 5 1236 - 16323 03/16/73 20% 8888 5 1254 - 16323 04/03/73 10% 8888 5 1290 - 16321 05/09/73 20% 8888 2 5-334 - 15523 03/18/76 30% 8888 2 1308 - 16320 05/27/73 10% 8888 2 1974 - 16135 03/24/75 10% 8858 5 5028 - 16111 05/17/75 10% 5588 Test Site 5: Summer: June - Aug. 1362 - 16315 07/20/73 20% 8888 1380 - 16314 08107173 20% 8888 1722 - 16235 07/15/74 20% 8888 2, 4 1740 - 16231 08/02/74 10% 8888 @z 1758 - 16223 08/20/74 10% 8888 Fall: Sept. - Nov. 2 1110 - 16322 11/10/72 10% 8888 2 1776 - 16215 09/07/74 20% 5855 1452 - 16300 10/18/73 20% 8888 A-4 DATA CLOUD AVAILABILITY SCENE ID DATE COVER QUALITY Winter: Dec. - Feb. 2, 4 1182 - 16322 01/21/73 00% 8888 1506 - 16293 12/11/73 10% 8888 2, 4 2034 - 16205 02/25/75 00% 8888 Spring: May'. - May 1614 - 16270 03/29/74 20% 8888 1974 - 16142 03/24/75 10% 8888 2070 - 16203 04/02/75 20% 8588 2 1290 - 16324 05/09/73 20% 8888 SPECIAL COLOR COMPOSITE (BANDS 4,5,6) REQUIRED DUE TO POOR BAND 7 TAPE COULD NOT BE REPRODUCED BY EDC/GODDARD DATA AVAILABILITY 1 Imagery on Order 2 Imagery on Hand 3 Tapes on Order 4 Tapes on Hand 5 Selected Products on Hand for Special Study A-5 APPENDIX B TEXAS PARKS AND WILDLIFE DEPARTMENT VEGETATION SAMPLING METHOD AND FIELD INVESTIGATIONS.SUMMARY APPENDIX B VEGETATION SAMPLING FOR LANDSAT DATA prepared by Larry Lodwick Texas Parks and Wildlife Department Introduction In an effort to obtain quantitative ground data on the plant commu- nities as defined by Landsat Telemetry, a sampling system which will allow a quick, yet quantitative analysis of the communities needs to be de- veloped. The procedure should be adaptable to the various communities Pre- sent, from mud flats, with only sparse vegetation cover, to salt marshes being predominately grasses or grass-like plants, to woodlands. The most efficient sampling system for large areas would be the point intercept measurement of cover. This has several advantages in that (1) it gives an indication of biomass (especially if height of the vegetation is known); (2) it can be used for all qrowth forms, from bryophytes to tree canopies; and (3) it can be adapted for the size of the community to be sampled (i.e., points farther apart for larger vegetation types). Although widely spaced points tend to reduce the precision of the measurements, it does serve as more rapid measurement than other sampling techniques (Mueller-Dombois and Ellenberg, 1974). From the data obtained by the cover method, it is possible to desig- nate plant associations which could be related to the various images interpreted by Landsat. Materials and Methods An advantage to sampling cover as opposed to other parameters is the small amount of equipment required for field sampling. The required mater- ials consist of a tape measure with a minimum length of 25 meters (or B-1 25 yards), a meter (or yard), stick, tally sheets (f i gure I ) for the sam- pling data and a plant press for collecting those plants which the inves- tigator is unfamiliar with. The method for data collection is as follows: 1. Prior to going into the field the sampling site should be located using the Landsat' printout to determine the approximate center of the vegetation type to be sampled. This point should then be located on a topographic map.(U.S.G..S.@ or low-altitude photograph. 2. Using the topographic map or photograph, locate the sampling site on the ground, setting a stake at the center point. The tape mea- sure should then be extended first to the north, then south, east, and west of the center point (preferably with the use of a com- pass) to a distance of 25 meters or yards (figure 2). The purpose of determining the location and direction of the transects prior to beginning of sampling is to reduce the bias of the investiga- tor. 3. At 25 evenly spaced points in each of the four directions from the center point, preferably at 1 meter (or yard) intervals, all species directly above or below the points should be identified and their heights, measured with the meter (or yard) stick (which, in the case of trees, may be estimated), recorded on the tally sheet (figures 3 and 3a). Bare soil, without vegetation, should also be recorded and treated as a species. This will enable one to assess the bare ground (mud flats, dredge spoil, etc). Those plant species with which the investigator is unfamiliar should be collected, pressed, and sent in for identification. Preferably the B-2 collection sample should consist of two or three individuals pressed separately with flowers (or fruits) and roots. These should be pressed. Information as to the sampling site, soil type, soil wet- ness (tidal marsh, dry uplands, standing water, etc.) should be recorded. The unknown plants should be numbered and the number listed on the tally sheet in place of the name. After the plant is identified, the number should be replaced by the correct name. 4. The information on the tally sheet should include the name of the investigator, the transect line number (as related to the map), the bearing of the line (north, south, east, or west), the date, and the amount ofinundation at the time of the sampling (i.e., dry land, mud, standing water, etc.). 5. One copy of each field sheet should then be sent to Austin for evaluation and analysis of the plant associations, 6. After several sites have been investigated, any problems encoun- tered need to be discussed to determine what changes in the pro- cedures might be made to alleviate the problems. Reference Mueller-Dombois, Dieter, and Ellenberg, Heinz, 1974, Aims and Methods of Vegetation Ecology: John Wiley and Sons, New York, 547 p. B-3 VEGETATION SURVEY Investigator Date Line Bearing Level of Inundation Sampling Points SPECIES 11 2 1 3 4 1 5 1 6 7 1 8 1 9 10 1 Ll 1 12 13 1 14 1 15 116 17 118 1 19 201 21 1 22 231 24 125 1 ca Figure 1 Sample tally sheet for recording ground cover. T X X X X X HiHijHH-1111111 I W1111111111!111 ....... 1111illiIiIIIiIIIIE X X X X Z T S Figure 2. Sample area for the measure of the point intercept method. After reaching a predetermined sampling site, select the center point and with the use of a compass, record those species which occur at 25 regular inter- vals (preferably one-meter intervals) directly north, south, east, and west of the center point. B-5 V1 III I I ziif f --t fa a 2 ".-I INA 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 IS 19 20 21 22 23 24 25 TAXODIUM MYRICA TYPHA SCIRPUS LYTHRUM SPARTINA SALIX DISTICHUM CERIFERA LATIFOLIA AM. NQ I PATENS NIGRA Figure 3. A schematic representation of a wetland vegetation type containing seven species. B-6 VECETATION.SURVEY investigator L. Lodwick Date 14 Jan. 1976 Line 001 Bearing North Level of Inundation Dry ground Sampling Points of SPECIES 11 2 1 3 4 1 5 1 6 7 1 8 1 9 101 Ll 12 1 13 1 14 15 1 16 1 17 18 1 3.9 1 20 1 21 221 231 24 25 lptsl Scirpus americana IM ).9m 2 Lythrum sp. #1 2m 2m 2m. 2m 4 ).5m ?.5m Taxodium distichum ?. 5M ).5M 4 Myrica cerifera L.2m .2m 1.2m IM Im 5 Spartina patens 2m 2m 12m 1 3 Typha I ati f ol i a 2m 2m 2m Salix nigra 4.5m 4. 5m 4m 4m 4m 5 Exposed soil X X X 3 Figure 3a. A tally sheet with the data for the association shown in Figure 3. LANDSAT FIELD INVESTIGATION SUMMARY prepared by George Clements Texas Parks and Wildlife Department Introduction Biological field verification was conducted to assist in correlating ADP imagery data and to document vegetation represented in the ADP and image-interpretation marsh classes. Approximately 18 vegetation sampling sites were chosen within spectrally uniform areas, and image-interpreta- tion results were verified for each of the four test sites. The Texas Gulf Coast is an area characterized by many diverse vege- tative ecosystems. These include large expanses of coastal prairie, salt marshes, fluvial timberlands and, farther south, resacas and and mesqui- tal-chaparral brushl.ands. This diversity is attributed to the wide vari- ance of climatic factors, substrates,and elevational differences of the coastal area. This investigation was conducted with particular emphasis on those 1and areas immediately adjacent to the Gulf and estuarine waters, espe- cially marshes. In the broadest sense, a marsh is a tract of soft, wet land usually characterized by monocotyledons such as grasses, rushes, and cattails. A more suitable definition applicable to the coastal regions of Texas is a tract of intertidal rooted vegetation which is alternately in- undated and drained by the rise and fall of the tide. This tidal water can be saline, brackish, or fresh water, thus further delineating types of marsh. Differences in elevation affecting frequency of tidal inundation give rise to another set of marsh terminology: low and high marshes. A marsh is, in reality, a series of communities which change gradually from the tidal creeks and pools to higher ground with the major varia- B-8 tions in habitat steming from differences in elevation and the consequent effects on duration of tidal inundation and substrate differences (Odum, Copeland and McMahan, 1974). Materials and Methods The field approach used was the point intercept for some sites and detailed observations and estimations for sites where the vegetation was virtually homogeneous or when the site was impenetrable brushland. The procedure for the point intercept transect method was as follows: 1. The site was located on a USGS topographic map by the image interpreter. 2. The site center point was located on the ground by the field investigator. 3. A 25-meter tape marked at one-meter intervals was stretched from the center point to the north, east, west and south for a to- tal of 100 sampling points. At each of the points (1 m), the species of plant and the height were recorded on a field data sheet. If immediate identification could not be made, a sample specimen was taken for later identification. Discussion Low Marsh The low brackish-to-saline marshes of the Texas coast, those which are tidally inundated almost daily, characteristically are vegetated with Spartina alterniflora as the dominant species. Often associated with it are.Batis maritima and within its range, Avicennia germinans. Generally, low marshes are more extensive on the upper coast in the Galveston-Freeport vicinity and gradually decrease in acreage southward B-9 down the coast. The low marsh in East Matagorda, West Matagorda, Espiritu Santo, and San Antonio bays are predominately situated on the bayward side of the barrier islands or peninsulas and, to lesser degrees, in other pe- ripheral areas of the bays. Beginning in the Aransas-Redfish Bay system, acreage of low marsh diminishes giving way to tidal flats and extensive areas of shallow waters with submerged rooted vegetation. In the Lower Laguna Madre area, the low Spartina alterniflora marsh is virtually nonexistent. However, low salinity areas well away from the bay shores were found which were not subject to tidal influences. Those areas supported typical halophytic vegetation. Site I of the Laguna Vista quadrangle was one example of this situation where Salicornia virginica, Batis maritima, Monanthochloe littoralis, Sparti@a spartina and Borrichia frutescens were growing with Opuntia lindheimeri and Karwinskia humbold- tiana. This tract was being utilized as rangeland for livestock. High Marsh Flats This topographical-unit is difficult to define by speciation or de- lineate from other units. By definition, it is tidally affected by estuarine waters but only during those periods of highest neap tides or more often by wind tides. The difficulties in delineating the unit and assigning characteristic species arise from the frequency of its inter- grading with other units, most often with both lower marsh and higher coastal prairie. Species most often found in the Unit I classified as high marsh are Distichlis spicata, Monanthochloe littoralis, Sporbolus K @,rin- icus and Borrichia frutescens. These species readily intergrade with lower ground species such as Batis maritima, Salicornia spp. and even Spartina alterniflora. Extensive patches of Spartina spartinae, the coastal prairie salt grass, are often found wi/thin the high marsh unit. B-10 Generally, the acreage of high marsh is much less than low marsh on the Texas coast. One of the most well-delineated high marsh units, from a ground view, is a pprtion of the area known as Welder's Flats located southeast of Seadrift, Texas, and the largest tract is in the Aransas National Wildlife Refuge southwest of Seadrift. Coastal Salt Grass Prairie Thousands of acres of Gulf Coastal plain extending from the Sabine River to the Rio Grande are dominantly vegetated by Spartina spartinae. These coastal prairies are generally low, poorly drained, heavy clay and often saline soils. This ecosystem is most often closely associated with the lands immediately bordering the coastal estuaries and marsh systems and often intergrades with the high marshes. This unit is represented by transect and ground site observation data in all four of the test site areas where Spartina spartinae is the dominant vegetation. Residents of the lower Texas coast call this association "sacahuistal," while on the upper coast, it is called salt grass prairie. Many coastal ranchers burn these areas annually since fresh, new growth provides a more suitable range for cattle. Barrier Islands The Texas coast has the longest chain of barrier islands and penin- sulas found anywhere in the world. They range in width from a few hundred yards to three or four miles (Andrews 1971). The generalized structure of these barriers is a gentle slope upward from the Gulf water line to the first set of dunes or foredunes. Height and development of the foredunes vary from low (six feet or less) on the upper Texas coast to the well- developed dunes of the lower coast which often attain heights of thirty B-11 feet or more. Continuing bayward, another set or two of dunes are found, usually not as high as the foredunes. These then give way to tracts of rolling, sandy substrate dotted with numerous depressions, swales, and hum- mocks, then grade into more level expanses until the bay shore is reached. At this point, the community may change to a barren tidal flat, high marsh, low marsh, or intergrades of the three depending on which bay 5ys- tem is being examined. The vegetation of the barrier islands varies more widely from the foredunes to the bayshore than from the upper coast to the lower coast. This vegetative community, on the whole, would have to be classified as grassland, as grasses constitute the dominant plants. The species list of dune plants changes very little from the upper to lower coast. Most frequently encountered on the foredunes proper are Sesuvium portulacastrum, Ipomoea pes-caprae, I. littoralis, Heliotropium curassavicum, Philoxerus vermicularis and Uniola paniculata. The dominant grass of the barrier flat along most of the coast is Spartina.patens, with Spartina spartinae being second dominant on the up- per coast and Schizachyrium scoparium second in the lower coastal barrier system. Other species frequently encountered during fieldwork in all areas included Cassia fasciculata, Helianthus spp., Ambrosia psilostachya, Andropogon qlomeratus, Machaeranthera phyllocephala, and Croton spp. These species were usually more prevalent on higher ground and hummocks of the barrier flat. Low, wet or moist swales and depressions were encountered behind the dunes at all barrier flat ground investigation sites. In the Aransas area, these were vegetated with Typha @p., Scirpus @pj. and Eleo- charis spp., while in the Port Isabel area, Dichromena colorata and Scirpus supinus were dominant. B-12 Summary 1. Low marsh is intertidal, rooted vegetation which is inundated on an almost daily basis. The characteristic species of this habitat on the Texas coast is Spartina alterniflora. Acreage of low marsh is more extensive on the upper Texas coast and diminishes to virtually nonexistent on the lower coast. 2. High marsh is that area which is tidally inundated only during highest neap tides or by wind tides. It is often difficult to delineate due to its intergrading with other units. I believe the key species of this system is Distichilis spicata and �Ror- obolus virginicus. Acreage of high marsh is much less than acre- age of low marsh on the Texas coast. 3. Coastal salt grass prairie is probably the most abundant habi- tat or ecosystem on the Gulf Coastal plain. The key species in this habitat is Spartina spartinae. 4. The Texas coastline has the longest chain of barrier islands and peninsulas in the world. The dominant plants are the grasses Spartina patens, S. spartinae and Schizachyrium scoparium. Num- erous microhabitats with their characteristic species are found within the strandplains. 5. Mesquital-chaparral is one of the more dominant associations on the arid lower Texas coast. This unit is found on the higher ground of the mainland areas. Characteristic species are Prosopis qlandulosa, Celtis.spinosa, and numerous other spiny trees and shrubs. B-13 References Carlton, J.M. 1975. A guide to common Florida salt marsh and mangrove vegetation. Fla. Marine Research Pub. No. 6. 30 p. Correll, D.S. and H. B. Correll. 1975. Aquatic and wetland plants of southwestern United States. Stan. Univ. Press. 1777 p. Fleetwood, R.J. 1967. Plants of Laguna Atascosa National Wildlife Refuge. U.S. Dept. Interior, Fish and Wildlife Ser. 48 p. Gould, F.W. and T.W. Box. 1965. Grasses of the Texas coastal bend. Welder Wildlife Found. contrib. 34 ser. C (rev.), Tex. A & M Univ. Press, 186 p. Hotchkiss, N. 1972. Common marsh, underwater, and floating-leaved plants of the United States and Canada. New York: Dover Publications, Inc. 124 p. Irwin, H.S. and M.M. Wills. 1961. Roadside flowers of Texas. Univ. of Texas Press. 295 p. Jones, F.B. 1975. Flora of the Texas coastal bend. Welder Wildlife Found. contrib. B-6, Mission Press, Corpus Christi, Texas. 262 p. Odum, H.T., B.J. Copeland and E.A. McMahan, Ed. 1974. Coastal ecological systems of the United States. Wash., D.C.: The Conservation Found. Spencer, E.B. 1974. All about weeds. New York: Dover Publications, Inc. 333 p. U.S.D.A. 1971. Common weeds of the United States. New York : Dover Publications, Inc. 463 p. B-14 APPENDIX C WIND VELOCITY AND DIRECTION TIME- HISTORIES AT THE TIME OF EACH LANDSAT IMAGE ANALYZED APPENDIX C WIND VELOCITY AND DIRECTION TIME- HISTORIES AT THE TIME OF EACH LANDSAT IMAGE ANALYZED Effective winds are those over 12 mph (10.4 knots). Wind strength is defined by S=(V-v) 2d where v is observed velocity, v-12 mph, d is duration in hours (Price, 1975). C-1 WIND SPEED (Knots) Galveston, Texas Fastest mile, direction 1- 8 May, 1973 I May 2 May 3 May 4 May 0 50 MNE=@@ Knots 5 May 6 May 7 May 8 May 30- VICTORIA March 22-30, 1974 25- 0 S=275 re) 0 a- 20- 0 0 0 0 S=38 15- S=20 S= 20 S-7 S=20 S- 0 M 0 5- LLJ 0- LLJ a- U) 0 5- z 0 Lu Go W 10- a- (D (n 0 0 N 15- S=7 Wind bearing of N40'E plotted to be i 1 0 consistent with adjacent bearings 0 LU Q@ WD - Wind direction 0- 20- U) S- Wind strength z 0 25- 30 22 23 24 25 26 27 28 29 30 MARCH (9AM Readings) 30- VICTORIA 0 rn 0 February 18 - 26, 1975 2 S=730 In 0 25 - 3: W 0 W :r 0 (n 20- S=275 S=130 COIJ z 3: it 15 - : a W j@jij@iii:i: 0 X (n 10- ...... u- u- 0 0 5- LIJ W 0-- 5- L3 W W Go 10- 0 z 15- a) Sz20 71 S=38 U.1 ox 20- 0 Q (n COD) Effective winds z 0 S=400 WD-Wind direction 25- 3: S-Wind strength 30 1 -1 - F Is 19 20 21 22 23 24 25 26 FEBRUARY (9AM Readings) C-4 30- CORPUS CHRISTI S=730 w 25- February 18-26, 1975 w q- 0 U) 0 c20- 0 w S 220 11 K) C 0 S=170-0 11 L3 w 15-: x U) IO_. U_ LL 0 0 C 5- w w 0- Q_ z a 5- w a- 10- z 15- S=63 S 653 Effective Winds S=38 w WD-Wind direction S=220 20- S -Wind strength z 0 25- D: S=550 301a 19 io 2'1 22 23 24 i5 i6 FEBRUARY (9AM READINGS) C-5 30- CORPUS CHRISTI 25- July 3-11, 1975 W W a- 20- Effective winds U) 0 0 WD Wind direction z 3: 0 S Wind strength Uj 0 x N X0 10- CU U_ 0 5- C W 0- W Q_ (n 0 5- Z 0 W W a- 10- .. ........ S=75 z _D 3: 15- S=63 S=63 0 0 S C\j S=93 W S=130 130 S= 130 00 0 It I x 0 n 0 00 0 20- 11 11 11 0 0 0 LO LO q T_ 3: S=275 3: U) ao M I Zj- - 11 11 z 0 Q 0 0 3: 3: 25- 3. 3: 30 3 4 7' 8' JULY (9AM READINGS) 20V C-6 30- 0 0 (D CORPUS) CHRISTI (D r1o .0 25- January 26-February3,1976 S 550 i Uj 3: 3: 0 CL 0 0 20- a IS-220(00 3: S 93 Uj 15 - 3: 0 S=20 S=7 10- U_ U_ 42 0 0 5- Uj Uj 0-- Q_ z 0 5- Uj Uj a_ U) 10- ... 8 15- S=38 S=3 S=38 Uj x 0 0 Effective winds 0 0 m 20- LO Co U) WD-Wind direction z 0 25- 3.1 S - Wind strength 3: 30 1- 26 2@ 28 29 3b 31 1 2 3' JANUARY- FEBRUARY (9AM READINGS) C-7 30- BROWNSVILLE 0 S=1150 February 18-26, 1975 0) 25- w w a. 20- S=170 U) S=130 z 3: 15- 0 w ... ..... . 0 10- .. . . ..... U_ LL 0 0 5- LU 0- w Q._ (n 0 5- z w w a. 10- 10, z ......... M, 3@: S38 w ir 20- S=170 Wind direction z WD (0 Co S=400 25- S -Wind strength 0 a)- 30- S=825 18 19 20 21 22 23 24 5 FEBRUARY (9 AM Readings) C-8 APPENDIX D SOFTWARE PROGRAMS AND MODIFICATIONS APPENDIX D SOFTWARE PROGRAMS AND MODIFICATIONS As indicated in the text, much of the software utilized for this in- vestigation was obtained from NASA. These programs were considered adequate for handling the basic classification tasks but were somewhat deficient in other areas. Consequently,, during the course of the project, several pro- grams were developed by the TNRIS staff to aid in the analysis effort. NASA also continued to develop capabilities in this area. Two such programs which were transferred to TNRIS are ELLTAB and HGROUP. Documentation and software for these and the other NASA programs used in the project can be obtained from the Computer Software Management and Information Center (COSMIC). The information provided in this appendix regarding these programs de- veloped by TNRIS staff describes the basic algorithm in each case. User manuals and related documentation can be obtained by contacting the Texas Natural Resources Information System, P.O. Box 13087, Austin, Texas 78711. D-1 ELLTAB PROGRAM I C***PROGRAM NOR14AL 2 C THIS PROGRAM NORMALIZES A VECTOR ARRAY 3 C 4 C NV = VECTOR DI-HENSION 5 C NS = NUMBER OF SAMPLES 6 C KF = VARIABLE FOW@4AT FOR READING CLASS NAME AND OATA VECTOR 7 C 8 DIMENSION KF(20),KC(50),D(50,50),Vl4AX(50) 9 C READ PARAMETERS, FORMAT AND DATA ARRAY 10 READ(5,100) NV,NS,KF 11 100 FORAAT(215,/,20A4) 12 DO 10 I-1,NS 13 10 READ(5,KF) KC(I),(D(I,J),J=l,NV) 14 C FIND MAXIMUM VALUE FOR EACH VKRIABLE 15 DO 20 J=1,NV 16 DO 20 I=I,NS 17 20 Vi'4AX(J)=Al4AXI(Vf4AX(J),D(I,J)) 18 C NOR14ALIZE THE VECTORS 19 DO 30 J=I,NV 20 DO 30 I=I,NS 21 30 D(I,J)=D(I,J)/V!lAX(J) 22 DO 40 I=I,NS 23 40 WRITE(l,KF) KC(I),(D(I,J),J=1,NV) 24 WRITE(6,200) 25 200 FOKIAT(//,' NORMALIZATION COMPLETED',//) 26 STOP 27 END DB0200*ELLTAB(l).S-F I FUNCTION SURF (X, KK, NN, ND) 2 c 3 C C011PUTES SU14 X OR SUM X**2 FROM A VECTOR. 4 C X = ARRAY CONTAINING THE SCORES TO BE USED. 5 C KK = ROW OR COLUMN NUtlBER IF X IS A MATRIX. SET I IF X IS A VECTOR. 6 C IF KK IS POSITIVE AND NOT 1, IT IS A COLUMN VECTOR. 7 c IF KK IS NEGATIVE AND NOT 1, IT IS A ROW VECTOR. 8 C NN = NUMBER OF VALUES TO BE SUMMED. IF NEGATIVE, SU14 X**2 COMPUTED. 9 C ND = NUMBER OF ROWS (OR ELEMENTS) DIMENSIONED FOR X IN THE' 10 C CALLING PROGRAM. 11 C 12 DIMENSION X(54,I) 13 SURF = 0.0 14 N = IABS(NN) 15 K = IABS(KK) 16 IF (NN) 5,55,10 17 5 IF (KK) 15,55,25 18 10 IF (KK) 35,55,45 19 15 DO 20 I=I,N 20 20 SURF - SURF + X(K,I)**2 21 RETURN 22 25 DO 30 I=I,N D-2 ELLTAB PROGRAM (Con't) 23 30 SUMF = SUMF + X(I,K)**2 24 RETURN 25 35 DO 40 I=I,N 26 40 SUMF = SUMF + X(K,I) 27 RETURN 28 45 DO 50 I=1,N 29 50 SUMF - SURF + X(1,K) 30 55 RETURN 31 END I SUBROUTINE CCDS (KF, KI, KqJ, KK, KL, KM) 2 C 3 C READS AND PRINTS TITLE, PARAMETERS, AND FORMAT CONTROL CARDS. 4 C KF = VECTOR HOLDING VARIABLE FORMAT ON RETURN. 5 C KI, KJ, KL, KM = PARAMETER VALUES. 6 C KH = TEMPORARY STORAGE WITHIN THIS ROUTINE. 7 C BLANK TITLE CARD YIELDS STOP. 8 C 9 DIMENSION KF(20), KH(20) 10 READ (5,9) KH 11 9FORMAT (20) 12 IF (KH(l) EQ. KH(2)) STOP 13 READ (5,10) KI, KJ, KK, KL, KM, KF 14 10 FORMAT (515 / 20A4) 15 WRITE (6,15) KH, KI, KJ, KK, KL, KM, KF 16 150FORMAT (IH1, 20A4 // 11H PARAMETERS / 13H COL 1- 5 15 17 113H COL 6-10 = , 15 13H COL 11-15 - , 15 13H COL 16-20 18 215 / 13H COL 21-25 15 15H DATA FORMAT 20A4) 19 RETURN 20 END D-3 HGROUP PROGRAM 1 C PROGRAM HGROUP 2 C 3 C HIERARCHICAL PROFILE-GROUPING ANALYSIS. 4 C PARAMETER CONTROL-CARD FIELDS. 5 C COL 1-5. NUMBER OF VARIABLES (MAX - 54). 6 C COL 6-10. NUMBER OF SUBJECTS (MAX - 54). 7 C COL 11-15. LEVEL OF GROUPING TO BEGIN GROUP-MEMBERSHIP PRINTING. 8 C COL 20. 1 = STANDARDIZE DATA ON EACH VARIABLE BEFORE GROUPING. 9 C COL 25. 1 = TRANSPOSE DATA MATRIX IN ORDER TO GROUP VARIABLES. 10 C FORMAT MUST SPECIFY AN ALPHANUMERIC SUBJECT-CODE FIELD, FOLLOWED BY 11 C NV SCORE FIELDS. IF DATA MATRIX IS TRANSPOSED (COL 25 = 1), 12 C GROUP-MEMBERSHIP CODES WILL BE SERIAL NUMBERS OF VARIABLES. 13 C SUBPROGRAMS REQUIRED ARE SUMF AND CCDS. 14 C 15 DIMENSION D(54,54), KG(54), W(54), KF(20) 16 REAL*8 LC(54), KC(54) 17 ND = 54 18 5CALL CCDS (KF, NV, NS, KP, KS, KT) 19 T - NS 20 C READ ALL DATA CARDS AND STANDARDIZE COLUMNS (VARIABLES), IF 21 C OPTIONED 22 DO 10 I-1,NS 23 10 READ KF, KC(I), (D(I,J), J=1,NV) 24 IF (KS EQ. 0) GO TO 20 25 DO 15 J=1,NV 26 A - SUMF(D, J, NS, ND) / T 27 S = SQRT(SUMF(D, J, -NS, ND) / T A A) 28 DO 15 I=1,NS 29 15 D(l,J) = (D(l,J) - A) / S 30 20 IF (KT EQ. 0) GO TO 30 31 C TRANSPOSE DATA MATRIX, IF OPTIONED. 32 N = MAXO(NS, NV) 33 DO 25 I-1,N 34 DO 25 J=I,N 35 X = D(I,J) 36 D(I,J) = D(J,q) 37 25 DOM = X 38 NS = NV 39 NV - T 40 C CONVERT DATA MATRIX TO INITIAL MATRIX OF ERROR POTENTIALS. 41 30 DO 45 1=1,NS 42 DO 35 J=1,NV 43 35 W(J) - D(I,J) 44 DO 45 J=I,NS 45 D(I,J) = 0.0 46 DO 40 K=1.,NV 47 40 D(I,J) = D(I,J) + (D(J,K) W(K))**2 48 45 D(I,J) = D(I,J) / 2.0 49 DO 55 I=I,NS 50 DO 55 J-1,NS 51 55 D(J,I) = 0.0 D-4 HGROUP PROGRAM (Con't) 52 NG-NS 53 C INITIALIZE GROUP-MEMBERSHIP AND GROUP-N VECTORS. 54 DO 60 I=1,NS 55 KG(I)=I 56 60 W(I)=1.0 57 C LOCATE OPTIMAL COMBINATION, IF MORE THAN 2 GROUPS REMAIN. 58 65 NG=NG-1 59 IF (NG EQ. 1.) GO TO 5 60 X=10.0**10 61 DO 75 I=1,NS 62 IF (KG(I) NE.I) GO TO 75 63 DO 70 J-I,NS 64 IF (I EQ. J OR. KG(J) NE. J) GO TO 70 65 DX = D(I,J) -- D(I,I) - D(J,J) 66 IF (DX GE. X) GO To 70 67 X=DX 68 L=I 69 M= J 70 70 CONTINUE 71 75 CONTINUE 72 NL = W(L) 73 NM - W(M) 74 WRITE (6,80) NG, L, NL, M. NM, X 75 800FORMAT (/ 14, 25H GROUPS AFTER COMBINING G, 13, 76 14H (N=, 13, ;7H) AND G, 13, 4H(N= 13, 10H), ERROR=, 77 2 F16.6) 78 C MDDIFY GROUP-MEMBERSHIP AND GROUP-N VECTORS, AND ERROR 79 C POTENIALS. 80 WS W (L) + W (6qM) 81 X D(L,M) * WS 82 Y D(L,L) * W(L) + D(M,M) * W(M) 83 D(L,L) = D(L,M) 84 DO 85 I-1,NS 85 IF (KG(I) EQ. M) KG(I) = L 86 85 CONTINUE 87 DO 95 I=1,NS 88 IF (I EQ. L OR. KG(I) NE. I) GO TO 95 89 IF (I GT. L) GO TO 90 90 OD(I,L) = (D(l,L) * (W(I) + W(L)) + D(I,M) (W(I) + W(M)) 91 1+ X - Y - D(I.,I) * W(I)) / (W(I) + WS) 92 GO TO 95 93 90OD(L,I) - (D(L.,I) * (W(L) + W(I)) + (D(M,I) + D(I,M)) 94 1* (W(M) + W(I)) + X - Y - D(I,I) * W(I)) / (W(I) + WS) 95 95 CONTINUE 96 W(L) - WS 97 IF (NG GT. K.P) GO To 65 98 C PRINT GROUP MEMBERSHIPS OF ALL OBJECTS, IF OPTIONED. 99 DO 115 I-1,NS 100 IF (KG(I) NE. I) GO TO 115 101 L-0 102 DO 100 J-I,NS 103 IF (KG(J) NE. I) GO TO 100 D-5 HGROUP PROGRAM (Con't) 104 L = L + 1 105 LC(L) = KC) 106 IF (KT EQ. 1) LC(L) - J 107 100 CONTINUE 108 IF (KT EQ. 0) GO TO 102 109 IF (KT EQ. 1) GO TO 104 110 104 WRITE (6,105) 1, L, (LC(J), J=I,L) ill 105 FORMAT (2H G, 13, 4H (N-, 13, 2H) , 2514 (14,K, 2514)) 112 102 WRITE (6,110) 1, L, (LC(J), J=I,L) 113 110 FORMAT (2H G, 13, 4H (N=, 13, 2H) , 15A7 (14X, 15A7)) 114 115 CONTINUE 115 GO TO 65 116 END D-6 SCALE REGISTER PROGRAM DB0200*-1).RDCLAS I SUBROUTINE RDCLAS(ILDISK,ISDLO,ISDWHI,NXWD,NSAM) 2 C***THIS SUBROUTINE READS A LARSYS CLASSIFICATION FILE 3 C***AND RETURNS CLASSIFICATION RESULTS FOR LINE ILDISK 4 DIMENSION NXWD(4000),IDATA(1000),NPTS(4),LSTART(4),LEND(4), 5 *ISTART(4),IEND(4) 6 DATA JUMP/O/ 7 DEFINE XCCT(SDISK)=(SDISK+NSAM-1)/NSAM 8 IF (JUMP) GO TO 10 9 JUMP=1 10 NCCTLO=MAXO(XCCT(ISDWLO-l), 1) 11 NCCTHI=MIN0(XCCT(ISDWHI+1),4) 12 C***READ HEADER RECORDS FROM LARSYS FILES 13 DO 5 NCCT=NCCTLO,NCCTHI 14 NUNIT=24+NCCT 15 READ(NUNIT,END=90,ERR=90) NXWD(l) 16 RESD(NUNIT,END=90,ERR=90) NXWD(l) 17 READ (NUNIT,END=90,ERR=90) NXWD(l) 18 READ(NUNIT,END=90,ERR=90) NXWD(l) 19 READ(NUNIT,END=90,ERR=90) (NXWD(l),I=1,10) 20 C***SAVE FIELD INFOMATION 21 NPTS(NCCT)=NXWD(l) 22 LSTART(NCCT)=NXWD(6) 23 LEND(NCCT)=NXWD(7) 24 ISTART(NCCT)=NXWD(9) 25 5 IEND(NCCT)=NXWD(l0) 26 C***FILL LINE WITH 'NO DATA' FLAGS 27 10 DO 20 I=1,4000 28 20 NXWD(I)='OOOOO0' 29 C***LOCATE REQUESTED DATA ON CLASSIFICATION FILES 30 DO 50 NCCT=NCCTLO,NCCTHI 31 NUNIT=24+NCCT 32 IF (ILDISK.LT.LSTART(NCCT).OR.ILDISK.GT.LEND(NCCT)) GO TO 50 33 C***READ A CLASSIFIED LINE 34 J=NPTS(NCCT) 35 30 READ(NUNIT,END=50,ERR=95) ILINE,(IDATA(I),I=1,J) 36 IF (ILINE.LT.ILDISK) GO TO 30 37 C***INSERT CLASSIFIED DATA INTO NXWD 38 J=O 39 KK=ISTART(NCCT)NSAM*(NCCT-1) 40 LL=IEND(NCCT)+NSAM(NCCT-1) 41 DO 40 I=KK,LL 42 J=J+l 43 40 NXWD(l)-IDATA(J) 44 50 CONTINUE 45 RETURN 46 C***ERROR READING CLASSIFICATION FILE 47 90 WRITE(6,100) NUNIT 48 100 FORMAT(' ERROR READING HEADER ON CLASSIFICATION FILE--UNIT ',12) 49 STOP D-7 SCALE REGISTER PROGRAM (Con't) 50 95 WRITE(6,200) NUNIT 51 200 FORMAT(' ERROR READING CLASSIFICATION FILE--UNIT ',12) 52 STOP 53 c 54 C***REWIND FILES AND RESET JUMP FLAG 55 ENTRY RESET 56 JUMP-O 57 DO 60 NCCT=NCCTLO,NCCTHI 58 NUNIT=24+NCCT 59 60 REWIND NUNIT 60 RETURN 61 END I SUBROUTINE MAPRNT(KTlPIX) 2 c ----------------- 3 C 4 C (E H SCHLOSSER) 5 C 6 c 7 C THIS SUBROUTINE REGISTERS ERTS MSS DATA FUR PRTCLASS 8 C 9 C 10 C EXTERNAL SUBROUTINES/FUNCTIONS CALLED 11 C ----------------------------------- 12 C 13 C NITHDG, 14 c SYMTAB 15 C TICGEN 16 C READ2N 17 c 18 C 19 INCLUDE KOMQT,LIST 20 INCLUDE KOMNER,LIST 21 INCLUDE KOMKLS,LIST 22 INCLUDE KOMFIT,LIST 23 INCLUDE WINDEF,LIST 24 INCLUDE KOMDEN,LIST 25 INCLUDE KOWW,LIST 26 INCLUDE KOMALT,LIST 27 INCLUDE KOMSYM,LIST 28 INCLUDE KOMTIC,LIST 29 C 30 DATA IOUT/8/ @ OUTPUT TAPE OF CLASSIFIED DATA 31 DIMENSION NXWD(4000) 32 DIMENSION IPBUF(1000) 33 DIMENSION LINFMT(4) 34 DATA LINFMT/'(1X,J4,1H:,NNNA1,1H:,J4)'/'/ 35 DATA KOLON/':'/ D-8 SCALE REGISTER PROGRAM (Cont) 36 c 37 INCLUDE TRFORM,LIST 38 INCLUDE NITAB,LIST @ DEFINE PROCEDURE TO COMPUTE ALT PRINT UNIT NUMBERS 39 INCLUDE DIGITS,LIST @ DEFINE PROCEDURES FOR DIGIT EXTRACTION 40 c 41 C 42 C INITIALIZE WIND0WS 43 c 44 NSAM=NERSAM/4 45 IPLMIN=PPDOWW(WLIN,WMIN) 46 IPLMAX= PPDOWW(WLIN,WMAX ) 47 IPCMAX=PPDOWW(WCOL, WMIN) 48 IPCMAX=PPDOWW (WCOL, WMAX) 49 NITMAX=I+(IPCMAX-IPCMIN)/(KPAGE-3) 50 IF(NITMAX..LE.8) GO TO 110 51 CALL MDWARN('WINDOW TOO WIDE') 52 GO TO 900 53 110 IF(KSYOWW(WORIG).EQ.-SCA') GO TO 130 54 IF(KSYOWW(WORIG).EQ.'DEG') GO TO 140 55 IF(KSYOWW(WORIG).EQ.'MIN') GO TO 140 56 GO TO 160 57 130 WRITE(6,135) NWDOW,MSA0DW(WLIN,WORIG),MSAOWW(WSAM,WORIG) 58 135 FORMAT(6X,WINDOW #-,13,' (ORIGIN ',14,' LINE, ',14,- SAmPLE).) 59 GO TO 170 60 140 WRITE(6,145) NWNDOW,GEDOWW,(WLAT,WORIG),GEDOWW(WLON,WORIG) 61 145 FORMAT(6X,'WINDOW #',13,' (ORIGIN ',F9.4,' LAT, ',F9.4,' LON)') 62 GO TO 170 63 160 WRITE(6,165) NWNDOW,UTMOWW,(WEA,WORIG),UTMOWW(WNO,WORIG) 64 165 FORMAT(6X,'WINDOW, #',13,' (ORIGIN ',-3P,F8.3,' KM E, ',F8.3, 65 & ' KM N)') 66 170 IPLTIC=99999 67 IPCTIC=99999 68 LVLTIC=l 69 C 70 c 71 C 72 C GENERATE TABULAR DATA 73 C 74 NITLO=O 75 NITHI=O 76 INCLUDE NITROT,LlST 77 NIT=O 78 NUNIT=NTAB(NIT) 79 CALL NITHDG(NUNIT) 80 CALL SYMTAB(NUNIT) 81 IF(KTIPIX.NE.1) WRITE(NUNIT,175) KTIPIX 82 175 FORMAT('0(l COUNT PIXEL)'/) 83 CALL GENTIC(NUNIT) 84 C 85 C D-9 SCALE REGISTER PROGRAM (Con't) 86 C BREAK WINDOW INTO SECTIONS, EACH C0MPOSED OF NOT MORE THAN MALTM PRINT UNITS 87 C 88 IPMCOD=MOD((IPCMAX"IPCMIN), (KPAGE-8)+l 89 FLD(30,6,LINFMT(2))=FLD(00,6,JHUNS(IPCMOD)) 90 FLD(00,6,LINFMT(3)=FLD(00,6,JTENS(IPCMOD)) 91 FLD(06,6,LINFMT)=FLD(00,6,JONES(IPCMOD)) 92 DO 800 NiLO=l,NITMAX,MALTM 93 NITHI=MINO((NITLO+MALTM-1),NITMAX 94 INCLUDE NITROT,LIST 95 CORLIN=CORL4P(IPLMI,O) 96 MSALIN=CORLIN @ FUTURE CORRECTION 97 CORLIN-MSALIN @ FUTURE CORRECTION -- TRUNCATE TO INTEGER 98 IPLIN=PPDL4C(CORLIN,O) 99 IPCLO=IPCMN+(KPAGE-8)*(NITLO-1) 100 IPCHI-MINO((IPCMIN+(KPAGE-8)*NITHI-1),IPCMAX) 101 NTICK=O 102 CALL GETIC 103 C 104 C 105 C HEAD PRINT UNITS 106 C 107 DO 180 NIT=NITLO,NITHI 108 NUNIT=NTAB(NIT) 109 CALL NITHDG(NUNIT) 110 180 CONTINUE ill CORSAM=CORS4P(IPLIN,IPCLO) 112 MSASLO=ADJS4C(CORLIN,CORSAM) 113 CORSAM=CORS4P(IPLIN,IPCHI+l) 114 MSASHI=ADJS4C(CORLIN,CORSAM==1.0 115 CALL SAMSCL 116 CALL BORDER 117 C 118 C 119 C COMPUTE FIRST/LAST DENSITY SAMPLES 120 C 121 200 CORSAM=CORS4P(IPLIN,IPCLO) 122 MSASLO-ADJS4C(CORLIN,CORSAM) 123 CORSAM=CORS4P(IPLIN,IPCHI+l) 124 MSASHI-ADJS4C(CORLIN,CORSAM)+1.0 125 C 126 C 127 C READ DENSITY LINE 128 C 129 CALL RDCLAS(MSALIN,MSASLO,MSASHI,NXWD,NSAM) 130 C 131 C 132 C LOCATE FIRST DENSITY PIXEL 133 C 134 MSASAM=MSASLO 135 NWDLO=MSASLO D-10 SCALE REGISTER PROGRAM (Cont) 136 NWD=NWDL0 137 NWDHI=MSASHI 138 IF(MSASA.LT.1) GO TO 350 139 C 140 C 141 C SCREEN PIXEL DENSITY 142 C 143 310 IF(NXWD(NWD).EQ.'000000) GO TO 350 144 C 145 C 146 C REGISTER/COUNT SCREENED PIXELS 147 C 148 CORSAM=CORS4A(MSALIN,MSASAM) 149 IPCOL=PPDC4C(CORLIN,CORSAM) 150 IPBUF(IPCOL-IPCIN+3)=NXWD(NWD) 151 C 152 C 153 C SCAN DENSITY PIXELS 154 C 155 320 MSASAM=MSASAM+1 156 NWD=NWD+l 157 325 IF(MSASAM.GT.MSASHI) GO TO 400 158 GO TO 310 159 C 160 C 161 C REGISTER FIRST 'NO DATA' PIXEL 162 C 163 350 CORSAM=CORS4A(MSALIN,MSASAM) 164 IPC1=PPDC4C(CORLIN,CORSAM) 165 IF(MSASAM.GT.O) GO TO 360 166 MSASAM= 1 167 NWD=l 168 C 169 C 170 C SCAN 'NO DATA' PIXELS 171 C 172 360 MSASAM=MSASAM+1 173 NWD=NWD+l 174 365 IF(MSASA.GT.MSASHI) GO TO 380 175 IF (NXWD(NWD).NE.'OOOOOO') GO TO 380 176 GO TO 360 177 C 178 C 179 C REGISTER STRING OF 'NO DATA' PIXELS 180 C 181 380 CORSAM=CORS4A(MSALIN,MSASAM-1) @ LAST 'NO DATA' PIXEL 182 IPC2=PPDC4C(CORLIN,CORSAM) 183 DO 385 IPCOL=IPCl,IPC2 184 385 IPBUF(IPCOL-IPCMI+3)=+999999 185 IF(MSASAM.GT.MSASHI) GO TO 400 186 GO TO 310 D-11 SCALE REGISTER PROGRAM (Con't) 187 C 188 c 189 C INCREMENT DISK LINE AND WRITE PRINT LINE 190 C 191 400 MSALIN=MSALIN+1 192 CORLIN-MSALIN @FUTURE CORRECTION 193 NLPRNT=PPDL4C(CORLIN,O) 194 IF(NLPRNT.GT.IPLIN) CALL LINOUT 195 IF(NLPRNT.GT.IPLMAX) GO TO 500 196 GO TO 200 197 c 198 c 199 C FOOT PRINT uNiTS 200 C 201 500 CALL BORDER 202 CALL SAMSCL 203 DO 550 NIT=NITLO,NITHI 204 NUNIT=NTAB(NIT) 205 WRITE(NUNIT,525) 206 525 FORMAT('O') 207 550 CONTINUE 208 C 209 C 210 800 CONTINUE 211 WRITE(NUNIT,805) 212 805 FORMAT('O'/6X,' **SEE UNIT 0 FOR LEGEND**'/) 213 NWNDOW=NWNDOW+1 214 ENDFILE IOUT 215 CALL RESETJ 216 C 217 900 RETURN 218 C 219 C 220 C 221 C 222 c 223 c 224 c 225 SUBROUTINE GETIC 226 NTICK=NTICK+l 227 IPLTIC=LINTIC(NTICK) 228 IPCTIC=COLTIC(NTICK) 229 LVLTIC-LEVTIC(NTICK) 230 RETURN 231 c 232 c 233 c 234 c 235 c 236 c D-12 SCALE REGISTER PROGRAM (Con't) 237 SUBROUTINE SAMSCL 238 FORMAT(6x,12411) 239 IPCNLO=MSASLO 240 DO 998 NIT=NITLO,NiTHI 241 NUNIT=NTAB(NIT) 242 IPCNHI=MINO,((IPCNLO+KPAGE-9),MSASHI) 243 DO 1000 I=I]IPCNLO,IPCNHi 244 1000 IPBUF(I-IPCIN+3)=1/1000 245 WRITE(NUNIT,99) (IPBUF(I-IPCMIN+3),I=IPCNLO,IPCNHI) 246 DO 100 I=IP(IPCNLO,IPCNHI 247 100 IPBUF(I-IPCMIN+3)=(1-1000*(I/1000))/100 248 WRITE(NUNIT,,99) (IPBUF(I-IPCMIN+q3),I=IPCLO,IPCNHI) 249 DO 10 I=IPCNLO,IPCNHI 250 10 IPBUF(I-IPCMIN+3)=(1-100*(1/100))/10 251 WRITE(NUNIT,,99) (IPBUF(I-IPCMIN+3),I=IPCNLO,IPCNLO) 252 DO 1 I=IPCNLO,,IPCNHI 253 1 IPBUF(I-IPCMIN.+)-I-1O*(I/1O) 254 WRITE(NUNIT,99) (IPBUF(IPCMIN+3),I=IPCNLO,IPCNHI) 255 998 IPCNLO=IPCNHI+l 256 RETURN 257 c 258 c 259 c 260 c 261 c 262 c 263 SUBROUTINE BORDER 264 DO 100 I=IPCLO,NITHI 265 100 IPBUF(I-IPCMIN+3)=':' 266 IPCNLO=IPCLO 267 DO 300 NIT=NITLO,NITHI 268 NUNIT=NTAB(NIT) 269 IPCNHI=MINO((IPCNLO+KPAGE-9),IPCHI) 270 WRITE(NUNIT,225) (IFBUF(I-IPCMI+3),I=IPCNLO,IPCNIH) 271 225 FORMAT(6X,124Al) 272 300 IPCNLO=IPCNHI+l 273 DO 800 I=IPCLO,IPCHI 274 800 IPBUF(I-IPCIN+3)=O @ MUST ZERO PRINT BUFFER! 275 RETURN 276 c 277 c 278 c 279 c 280 c 281 c 282 SUBROUTINE LINOUT 283 DIMENSION JSYTIC(2) 284 DATA JSYTIC/"*',.+/ 285 c 286 C LOOK UP SYMBOLS 287 c D-13 SCALE REGISTER PROGRAM (Con't) 288 DO 150 I=IPCLO,IPCHI 289 IPCCNT=MINO(IPBUF(I-IPCMIN+3),(KSYMSZ-1)) 290 150 IPBUF(I-IPCMIN+3)=KSYM(IPCCNT+I) 291 GO TO 300 292 C 293 C 294 C FLAG PRINT LINE(S) WITHOUT SCAN LINE DATA DUE TO SCALING 295 C 296 200 KSYBIT=IABS(KSYBIT) @ DISABLE OVERPRINT 297 DO 250 I=IPCLO,IPCHI 298 IF(IPBUF(I-IPCMIN+3).EQ.' GO TO 250 299 IF(IPBUF(I-IPCMIN+3).EQ'') GO TO 230 300 IF(IPBUF(I-IPCMIN+3).EQ.) GO TO 230 301 IPBUF(I-IPCMIN+3)=-:- 302 GO TO 250 303 230 IPBUF(I-IPCMIN+3)=' 304 250 CONTINUE 305 C 306 C 307 C INSERT TICK MARKS 308 C 309 300 IF(IPLTIC.GT.IPLIN) GO TO 400 @ SAVE TICK FOR SUBSEQUENT LINE 310 IF(LVLTIC.EQ.0) GO TO 330 @ ALWAYS INSERT PRIMARY TICKS 311 IF(IPBUF(IPCTIC-IPCMIN+3).EQ.' ') GO TO 330 312 IF(IPBUF(IPCTIC-IPCtIC+3).NE.':') GO TO 350 313 330 IF(IPBUF(IPCTIC-IPCMIN+2).EQ.:') 314 & IPBUF(IPCTIC-IPCMI,+2)='' @ LEFT TICK HALO 315 IPBUF(IPCTIC-IPCMIN+3)=JSYTIC(LVLTIC+l) @ TICK 316 IF(IPBUF(IPCTIC-IPCMIN+4).EQ.-:-) 317 & IPBUF(IPCTIC-IPCMIN+4)-' @ RIGHT TICK HALO 318 350 CALL GETIC 319 GO TO 300 320 400 CONTINUE 321 C 322 C 323 C WRITE PRINT UNIT LINE 324 C 325 IPCNLO=IPCLO 326 DO 540 NIT=NITLO,NITHI 327 NUNIT=NTAB(NIT) 328 IPCNHI=MINO((IPCNLG+KPAGE-9),IPCHI) 329 IF((NIT.EQ.NITHAX).AND.(IPCMOD.LT.122)) GO TO 530 330 WRITE(NUNIT,520) MSALIN, 331 & (IPBUF(I-IPCMIN+3),I=IPCNLO,IPCNHI),KOLON 332 520 FORMAT(lX,J4,':-,125Al) 333 GO TO 540 334 530 WRITE(NUNIT,LINFMT) MSALIN, 335 & (IPBUF(I-IPCMIN+3),I=IPCNLO,IPCNHI),MSALIN 336 540 IPCNLO=IPCNHI+l 337 NPIX=IPCHI-IPCL0+l 338 WRITE(IUUT) NPIX,(IPBUF(I-IPCMIN+3),I=IPCLO,IPCHI) D-14 SCALE REGISTER PROGRAM (Con't) 339 IF(KSYBIT.LT.6) GO TO 700 340 C 341 C 342 C OVERPRINT SYMBOLS 343 C 344 DO 660 KBIT=06,KSYBIT,6 345 DO 610 I=IPCLO,IPCHI 346 610 FLD(30,6,IPBUF(I-IPCIAI1+3))=FLO(KBIT,6,IP6UF(I-IPC,1111+3)) 347 IPCNLO=IPCLO 348 DO 640 NIT=NL,NITlI 349 NUNIT=NTAB(NIT) 350 IPCNHI=tIINO((IPCNLO+KPAGE-9),IPCll) 351 WRITE(NUNIT,620) 352 (IPBUF(I-IPCMI4+3),I=IPCNLO,IPCNHI) 353 620 FOUAT(,5X,124RI) 354 640 IPCNLO=IPCNRI+l 355 660 CONTINUE 356 c 357 C 358 C INCREMENT PRINT LINE 359 C 360 700 IPLIN=IPLIN--1 361 IF(NLPRNT.GIPLIN) GO TO 200 362 C 363 C 364 C REINITIALIZE LIE 365 C 366 KSYBIT=IABS(KSYBIT) ENABLE OVERPRINT 367 DO 750 I=IPCLO,IPCRI 368 750 IPBUF(I-IPCAIN+3)=O 369 RETURN 370 END D-15 MR-CLEAN PROGRAM 1 C***PROGRAM MR-CLEAN*** 2 C 3 C***THIS PROGRAM WAS WRITTEN TO HOMOGENIZE LANDSAT CLASSIFICATION 4 C RESULTS BY RECLASSIFYING A PIXEL TO REFLECT THAT OF ITS NEIGHBORS 5 c 6 C***READ IN CLASSES TO BE LEFT ALONE 7 DIMENSION LINE(3300,3),ISCAN(3300),ISYM(30) 8 DATA KNT/I/,IN,IOUT/10,11/ 9 5 READ(5,100,END=15,ERR=10) ISYM(KNT) 10 100 FORMAT(l) 11 KNT=Y-NT0+l 12 GO TO 5 13 10 WRITE(6,105) 14 105 FORAT(HO,- ERROR READING SYMBOLS) 15 STOP 16 C***READ FIRST TREE SCAN LINES 17 15 CNT=KNT-1 18 WRITE(31,333) 19 WRITE(32,333) 20 WRITE(33,333) 21 WRITE(34,333) 22 WRITE(35,333) 23 333 FOKIAT (IH,T46, EXAS NATURAL RESOURSES INFORMATION SYSTEM 24 25 CALL MA6600('31 ) 26 CALL MA6600('32 ) 27 CALL MA6600('33 ) 28 CALL MA6600('34 ) 29 CALL MA6600('35 ) 30 WRITE(6,106) 31 106 FORMAT(lHl) 32 READ(IN,END=98,ERR-99) NPIX,LINE(I,I),I-1 NPIX) 33 READ(IN,END=98,ERR=99) NPIX,(LINE(1,2),I=I,NPIX) 34 READ(IN,END-98,ERR=99) NPIX,(LINE(1,3),I-1,NPIX) 35 C***DON'T PROCESS FIRST SCAN LINE OR FIRST OR LAST PIXELS 36 WRITE(IOUT) NPIX,(LINE(J,),J-1,NPIX) 37 CALL MAP(LINE(l,l),NPIX) 38 16 ISCAN(l)-LINE(1,2) 39 ISCAN(NPIX)-LINE(NPIX,2) 40 LAST=NPIX-1 41 C***PROCESS MIDDLE SCAN LINE, PIXEL BY PIXEL 42 DO 90 1=2,LAST 43 ISCAN(l)-LINE(1,2) 44 IF (KNT.EQ.0) GO To 25 - 45 DO 20 J=1,KNT 46 IF (LINE(I,24)-ISYM(J)) 20,90,20 47 20 CONTINUE 48 25 N1=0 49 N2=0 50 N3=0 51 N4=0 D-16 MR-CLEAN PROGRAM (Cont) 52 C***COUNT NUMBER OF LIKE PIXELS FOR EACH NEIGHBOR 53 DO 30 K=1,3 54 DO 30 L=1,3 55 IF (LlNE(I+L-2,K).EQ.LINE(I-1,1)) N=Nl+l 56 IF (LINE(I+L-2,K).EQ.LINE(I-l)) N2-N2+1 57 IF (LINE(I+L-2,K).EQ.LINE(I+I,I)) N3=N3+1 58 30 IF (LINE(I+I,-2,K).EQ.LINE(I-I,2)) N4=N4+1 59 C***DOES ANY NEIGHBORING CLASS CONTAIN 5 PIXELS OR MORE? 60 GO TO (40,40,40,40,45,45,45,45,45),Nl 61 40 GO TO (50,50,50,50,55,55,55,55,55),N2 62 50 GO TO (60,60,60,60,65,65,65,65,65),N3 63 60 GO TO (90,90,90,90,75,75,75,75,75),N4 64 C***CHANGE PIXEL CLASS 65 45 ISCAN(I)=LINE(I-1,I) 66 GO TO 90 67 55 ISCAN(I)=LINE(I,I) 68 GO TO 90 69 65 ISCAN(I)=LINE(1+1,1) 70 GO TO 90 71 75 ISCAN(I)=LINE(I-1,2) 72 90 CONTINUE 73 C***WRITE THE ALTERED SCAN LINE 74 WRITE(OUT) NPIX,(ISCAN(I),I-1,NPIX) 75 CALL MAP(ISCAN(I),NPIX) 76 C***SHIFT SCAN LINES UP THE ARRAY 77 DO 95 1=12 78 DO 95 J=I,NPIX 79 95 LINE(J,I)-LINE(J,I+l) 80 C***READ NEW SCAN LINE AND LOOP 81 READ(IN,END-98,ERR-99) NPIX,(LINE(I,3),I-I,NPIX) 82 GO TO 16 83 C***EOF, WRITE LAST SCAN LINE 84 98 WRITE(IOUT) NPIX,(LINE(1,2),I=I,NPIX) 85 CALL MAP(LINE(1,2),NPIX) 86 ENDFILE IOUT 87 WRITE(6,110) 88 110 FURMAT(lH,END OF MR CLEAN) 89 Ill CALL MA6663('31 ) 90 CALL MA6663(-32 ) 91 CALL MA6663('33 ) 92 CALL MA6663('34 ) 93 CALL MA6663('35 94 STOP 95 C***ERROR ON READ 96 99 WRITE(6,115) 97 115 FORRAT(IHO,' ERROR READING IMPUT FILE') 98 GO TO 111 99 END. D-17 DETECT PROGRAM I C***PROGRAM DETECT 2 C 3 C***THIS PROGRAM COMPARES TWO REGISTERED CLASSIFICATION MAPS 4 C OF THE SAME AREA AND NOTES CHANGE BETWEEN THE TWO. 5 C 6 C***INPUT FILES = UNIT 10 AND UNIT 11 7 C UNFORMATTED FILES IN THE FORM: N,SYM1,SYM2,...,SYMN 8 C WHERE N IS THE NUMBER OF SYMBOLS PER LINE AND IS 9 C FOLLOWED BY THE N PRINT SYMBOLS TO BE USED 10 C***PARAMETER CARD(FREE FOILMAT): LINE,SAMPLE,UNIT 11 C WHERE LINE,SAMPLE ALL FOR OFFSETTING THE FILES 12 C I.E., PIXEL M,N ON UNIT 10 IS MATCHED WITH PIXEL 1,1 13 C ON UNIT 11 TO ADJUST -FOR SLIGHT THIS REGISTRATION 14 C (MAXIMUM OF 5 LINES ALLOWED). 15 C SPECIAL SYMBOLS ARE EXCLUDED FROM TESTING SINCE 16 C THESE ARE USED IN THE REGISTERED MAPS FOR TICK MARKS, ETC. 17 C IF NO CHANGE OCCURS, THE OUTPUT PIXEL IS BLANKED OUT. 18 C IF CHANGE OCCURS, THE SYMBOL ON UNIT 11 IS PRINTED UNLESS 19 C 'UNIT' ON THE PARAMETER CARD IS 10. IN THIS CASE, 20 C THE SYMBOL FROM UNIT 10 IS PRINTED. 21 C 22 DIMENSION L10(1000),Lll(1000),IPRNT(1000) 23 DATA LINE,SAMP/20/,IPRNT/1000*611 24 INTEGER COLON/:/,ASTER//PLUS/4+/ 25 C***WRITE THE HEADER ON THE ALTERATE PRINT FILES 26 WRITE(31,333) 27 WRITE(32,333) 28 WRITE(33,333) 29 WRITE(34,333) 30 WRITE(35,333) 31 333 FORMAT(lHI,T46,'TEXAS NATURAL RESOURSES INFORMATION SYSTEM', 32 33 C***ELIMINATE MARGINS ON THE ALTERNATE PRINT FILES 34 CALL MA6600('31 ) 35 CALL MA6600('32 ) 36 CALL MA6600(33 ) 37 CALL MA6600(34 ) 38 CALL MA6600(35 ) 39 WRITE(6,666) 40 666 FORAT(H) 41 C***READ OFFSET DATA 42 READ(5,100,END=20,ERR-90) LINE,SAMP,UNIT 43 100 FORMATO 44 ISAM-ISAMP-1 ~45 IF (LINE.EQ.0) GO TO 20 46 IF (LINE-5) 10,10,5 47 5WRITE(6,66) 48 66 FORMAT(' MAXIMUM OFFSET OF 5 LINES HAS BEEN EXCEEDED') 49 GO TO 99 50 C***FIND STARTING LINE ON UNIT 10 51 10 DO 15 I=I,LINE 52 15 READ(10,END-86,ERR-90) NlO,(Ll0(J),J=1,NlO) D-18 DETECT PROGRAM (Con't) 53 GO TO 25 54 C***READ FILES AND LOOK FOR CHANGE 55 20 READ(10,END-86,ERR=90) N1,(LlO(I),I=,Nl0) 56 25 READ(11,END-87,ERR=95) Nll,(Lll(I),I-1,Nll) 57 NPIX=MINO (NI0,N I I-ISAMP) 58 DO 80 I-1,NPIX 59 C***EXCLUDE SPECIAL SYMBOLS FROM TESTING 60 IF (LIO(I)COLON) 30,72,30 61 30 IF (LlO(I)-ASTER) 35,74,35 62 35 IF (LIO(l)-PLUS) 40,76,40 63 40 IF (LII(I+SM)-COLON) 45,72,45 64 45 IF (LII(I+ISAM)-ASTER) 50,74,50 65 50 IF (Lll(I+ISAM)-PLUS) 55,76,55 66 C***TEST FOR CHANGE AND SET PRINT SYMBOL 67 55 IF (Ll0(I)-Llll(I4+ISAm)) 60,80,60 68 60 IPRNT(I)=Llll:I+ISAM) 69 IF (IUNIT.EQ.11) IPRNT(I)=Ll(I) 70 GO TO 80 71 72 IPRNT(I)-COLON 72 GO To 80 73 74 IPRNT(I)=ASTIR 74 GO TO 80 75 76 1PRNT(I)=PLUS 76 80 CONTINUE 77 C***WRITE OUT CHANGE DETECTION LINE AND LOOP 78 CALL MAP(IPRIT(l),NPIX) 79 DO 85 I=1,1000 80 85 IPRNT(I)=6H 81 GO To 20 82 C***EOF AND ERROR PESSAGES 83 86 WRITE(6,300) 84 300 FORMAT( END OF FILE 10') 85 GO TO 99 86 87 WRITE(6,400) 87 400 FOWAT( END OF FILE ll) 88 GO TO 99 89 90 WRITE(6,500) 90 500 FORMAT( ERROR READING FILE 10) 91 C***RESET MARGINS ON ALTERNATE PRINT FILES 92 GO TO 99 93 95 WRITE(6,600) 94 600 FOEVAT( ERROR READING FILE 11) 95 99 CALL MA6663('31 96 CALL MA6663('32 97 CALL MA6663('33 98 CALL MA6663('34 99 CALL MA6663('35 100 STOP 101 END D-19 EXTRACT PROGRAM I C***PROGRAM EXTRACT 2 C 3 C THIS PROGRAM EXTRACTS BOUNDARIES FROM LANDSAT CLASSIFICATION FILES 4 C UNIT 8 REGISTERED MAP USED AS INPUT 5 C UNIT 15 OUTPUT IF BOUNDARY MAP 6 C*** 7 DIMENSION LINEI(4000),LINE2(4000),LPRT1(4000),LPRT2(4000) 8 DATA IN,IOUT/8,15/ID,NPTS,XINC/1,1,0.1/,ISAVE/O/ 9 YINC-1./6. 10 CALL SETADR(IOUT, 1) 11 WRITE(31,333) 12 WRITE(32,333) 13 WRITE(33,333) 14 WRITE(34,333) 15 WRITE(35,333) 16 333 FORAT(HI,T46,TEXAS NATURAL RESOURCES INFORMATION SYSTEM', 17 18 CALL MA6600('31 ) 19 CALL MA6600('32 ) 20 CALL MA6600('33 ) 21 CALL MA6600('34 ) 22 CALL MA6600('35 ) 23 C***READ IN CONTROL POINTS 24 5 READ(5,500,END-6,ERR-85) NROW,NCOL 25 500 FORMAT () 26 Xl=(NCOL-I)*XINC + XINC/2. 27 Yl=(NROW-1)*YINC + YINC/2. 28 WRITE(IOUT) ID,NPTS,X1,Yl 29 ID=ID+l 30 GO TO 5 31 6 NPTS=2 32 C***INITIALIZE ARRAYS 33 DO 10 1=1,4000 34 LPRT(I)=H 35 10 LINE(1)=6H 36 C***READ FIRST CLASSIFIED LINE INTO CORE 37 READ(IN,END=99,ERR=86) NPIX,(LINE(I),I=1,NPIX) 38 NRIJ=l 39 IEND=NPIX-1 40 C***LOCATE BOUNDARY POINTS IN THE FIRST LINE 41 DO 20 I-1,END 42 IF (LINE()-LINE(1+1)) 14,20,14 43 C***EXTRACT BOUNDARIES 44 14 XI-I*XINC 45 Yl=(NROW-1)*YINC 46 X2=XINC 47 Y2=NRYINC 48 LPRT2(l)-LINE2(l) 49 LPRT(1+1)=LINE2(1+1) 50 WRITE(IOUT) ID,NPTS,X1,Yl,X2,Y2 D-20 EXTRACT PROGRAM (Con't) 51 ID-ID+l 52 20 CONTINUE 53 C***SHIFT DATA, THEN PROCESS NEXT LINE 54 30 DO 40 1-1,4000 55 LPRTI=LPRT(I) 56 LPRT(I)-6 57 LII1(1)=LINE(I) 58 40 LINE2(1)=61 59 READ(IN,END-:99,ERR-86) NPIX,(LINE2(I),I=1,NPIX) 60 NROW=NRW+1 61 C***LOCATE BOUNDARY POINTS 62 DO 50 1-1,NPIX 63 IF (I.EQ.NPIX) GO To 46 64 IF (LINEI)-LINE(I+1)) 44,46,44 65 C***EXTRACT BOUNDARIES 66 44 Pl=IXINC 67 Ql=(NROW-1)*YINC 68 P2=I*XINC 69 Q2=NROW*YINC 70 LPRT2(I)=LINE2(I) 71 LPRT2(1+1)=LINE2(I+I) 72 WRITE(IOUT) ID,NPTS,Pl,Ql,P2,Q2 73 ID=ID+l 74 C***CHECK FOR BOUNDARY POINTS WITH PREVIOUS LINE 75 46 IF (LINE(I)-LINE1(I)) 48,50,48 76 48 LPRT1(I)=LINE1(I) 77 LPRT2(I)=LINE(I) 78 IF (ISAVE.EQ.0) GO TO 150 79 C***SAVE NEW BOUNDARY LINE 80 XXI=(I-I)*XINC 81 YYI=(NROWI)*YINC 82 XX2=I*XINC 83 YY2=(NR0141)*YIC 84 C***IF THE END POINTS DON'T MATCH WRITE THE CHAIN 85 IF (X2.EQ.XX1.AND.Y2.EQ.YY2) GO TO 145 86 WRITE(IOUT) ID,NPTS,X1,YI,X2,Y2 87 ID=ID+l 88 Xl-XXI 89 Yl-YYl 90 X2=XX2 91 Y2-YY2 92 GO TO 50 93 C***TIE ADJACENT BOUNDARY CHAINS TOGETHER 94 145 X2=XX2 95 Y2-YY2 96 GO TO 50 97 C***BEGIN A NEW CHAIN 98 150 Xl-(I-)*XINC 99 Yl=(NROW-14)*YINC 100 X2=I*XINC D-21 EXTRACT PROGRAM (Con't) 101 Y2=(NR0-1)*YINC 102 ISAVE=l 103 50 CONTINUE 104 C***OUTPUT BOUNDARY ON LINE PRINTER, THEN GET NEXT LINE 105 CALL MAP(LPRT1(1),NPIX) 106 IF (ISAVE.EQ.0) GO TO 30 107 WRITE(IOUT) ID,NPTS,X1,Yl,X2,Y2 108 ID=ID+l 109 ISAVE=O 110 GO TO 30 ill C***ERROR READING CONTROL POINTS 112 85 WRITE6,101) 113 101 FORMAT(' ERROR READING CONTROL POINTS') 114 STOP 115 C***ERROR READING FILE 116 86 WRITE(6,100) 117 100 FORMAT(' ERROR READING CLASSIFICATION FILE-) 118 STOP 119 C***END OF FILE 120 99 ENDFILE IOUT 121 CALL 14AP(LPRTI(l),NPIX) 122 WRITE(6,199) ID 123 199 FORMAT(IO- CHAINS EXTRACTED') 124 WRITE(6,200) 125 200 FORMAT(BOUNDARY EXTRACTION COMPLETED') 126 CALL MA6663('311) 127 CALL MA6663('32 ) 128 CALL MA6663('33 ) 129 CALL MA6663('34 ) 130 CALL MA6663('35 ) 131 STOP 132 END D-22 MERGE PROGRAM I C***THIS PROGRAM WILL MERGE SECTIONS OF TW0 LANDSAT DATA TAPES 2 C***FILE ASSIGNMENTS 3 C*** UNIT 10--FIRST INPUT TAPE 4 C*** UNIT 11--SCATCH FILE 5 C*** UNIT 13--OUTPUT TAPE 6 C***CARD INPUT: 7 C*** CARD I:STARTING LINE, ENDING LINE, BEGINING SAMPLE 8 C*** NOTE:THE PROGRAM WILL ADJUST THE STARTING SAMPLE 9 C*** SO THAT IT WILL FALL ON A WORD BOUNDARY 10 C*** CARD 2:ID FOR SECOND TAPE 11 C*** NOTE:IF BOTH FILES ARE ON THE SAME TAPE, 12 C*** PUNCH 'SAME' IN COL 1-4 13 C*** 14 DIMENSION 1BUF(733),JBUF(733)/7330/ 15 *CARD(5)/'@ASG,BOTH 10.,16N, / 16 C***COPY HEADER RECORD FROM FIRST TAPE 17 CALL NTRA14(10,10,2,9,IBUF,ISTAT,22) 18 CALL NTRA4(13,1O,1,9,IBUF,ISTAT,22) 19 CALL NTRAN(1LO,2,139,IBUF,ISTAT,22) 20 CALL NTAN(l3,1,139,IBUF,ISTAT,22) 21 C***READ CONTROL CARD AND ESTABLISH LIE AND SAMPLE LIMITS 22 READ(5,100) LINEI,LINE2,NSAM 23 100 FOILKAT ( ) 24 MOVE=LINEI+I. 25 CALL NTRAN(J.0,10,7,MOVE,22) 26 LAST=LINE2-1,INE1+1 27 1 START= N S/kM,--llOD(NSkl-1, 18) 28 1RITE(6,200) ISTART 29 200 FO14AT(THE FIRST SAMPLE NUMBER FROM TAPE I IS 14) 30 ISTART=ISTART-(ISTART/18)*2 31 NPIX=720-ISTART1 32 C***COPY SECTION OF FIRST TAPE 33 DO 20 I=1,LAST 34 CALL NTRAN(10,2,733,IBUF,ISTAT,22) 35 K=O 36 DO 10 J=ISTAR:1,720 37 K-K+l 38 10 JBUF(K)=IBUF(J) 39 20 CALL NTRAN(11,1,NPIX,JBUF,ISTAT,22) 40 C***CHANGE TAPES 41 CALL NTRAN(11,10,22) 42 READ(5,300) TAPE 43 300 FORMAT(A6) 44 IF (TAPE.NESAME, GO TO 25 45 CALL NTRAN(10,10,22) 46 CALL NTRAN(10,8,1,22) 47 GO TO 26 48 25 CALL EQUIP(REE,S 10. D-23 MERGE PROGRAM (Con't) 49 CARD(4)=TAPE 50 CALL EQUIP(CARD) 51 26 CALL NTRAN(10,7,[IO,22) 52 C***COPY SECTION OF SECOND TAPE 53 DO 40 1=1,LAST 54 CALL NTRAN(11,2,NPIX,JBUF,ISTAT,22) 55 CALL NTRAN(10,2,733,IBUF,ISTAT,22) 56 K=NPIX 57 JJ=ISTART-1 58 DO 30 J=1,JJ 59 K=K+l 60 30 JBUF(K)=IBUF(J) 61 40 CALL NTRAN(13,1,733,JBUF,ISTAT,22) 62 CALL NTRAN(13,9,22) 63 STOP 64 END D-24 APPENDIX E GLOSSARY APPENDIX E GLOSSARY Band - A group of wavelengths of light producing one color or con- venient group of wavelengths, such as near-infrared. Change Detection - Change detection is the process by which two images may be compared, resolution cell by resolution cell, and an out- put generated whenever corresponding resolution cells have different enough gray shades or gray shade n-tuoles. Channel - The same as "band" when used in computer work. Clustering - Mathematical procedure for organizing multispectral data into spectrally homogeneous groups. Clusters require identifica- tion and interpretation in a post-processing analysis. ISOCLS is a spectral Clustering program. **Color Composite - Color composite of three channels of ERTS-1 multi- spectral scanner digital data. The composites are third-or fourth- generation images, compared to first-generation composites produced from computer-compatible tapes using film recorder. **Computer-Compatible Tapes - Tapes containing digital ERTS-1 data. These tapes are standard 19-cm (7-1/2-in.) wide magnetic tapes in 9-track or 7-track format. Four tapes are required for the four- band multispectral digital data corresponding to one ERTS-1 scene. *Digital Image - A digital image, or digitized image, or digital picture function of an image, is an image in digital format and is obtained by partitioning the area of the image into a finite two-dimen- sional array of small uniformly shaped mutually exclusive regions, called resolution Cells, and assigning a "representative" gray shade to each such spatial region. A digital image may be ab- stractly thought of as a function whose domain is the finite two-dimensional set of resolution cells and whose range is the set of gray shades. **ERTS-1 Scene 2- Collection of the image data of one nominal framing area (185 km ) of the Earth's surface. The scene includes all data from each spectral band of each sensor. *Feature Selection - Feature selection is the process by which the features to be used in the pattern recognition problem are de- termined. Sometimes feature selection is called property selection. **Gray Scale - A scale of gray tones between white and black with an arbitrary number of segments. The ERTS-1 images have a 15-step gray scale exposed on every frame of imagery. The scale gives the relationship between gray level on the image and the electron beam density used to expose the original image. E-1 *Image - An image is a spatial representation of an object, scene, or another image. It can be real or virtual as in optics. In pattern recognition, image usually means a recorded image such as a photo- graph, map, or picture. It may be abstractly thought of as a con- tinuous function I of two variables defined on some bounded region of a plane. When the image is a photograph, the range of the func- tion I is the set of gray shades usually considered to be normalized to the interval 0,1 . The gray shade is located at spatial coor- dinate (x,y). A recorded image may be in photographic, video signal, or digital format. *Image Enhancement - Image enhancement is any of a group of operations which improve the detectability of the targets or categories. These operations include, but are not limited to, contrast improvement, edge enhancement, preprocessing, quantization, spatial filtering, noise suppression, image smoothing, and image sharpening. **ISOCLS - Iterative Self-Organizing Clustering System, a computer program developed at JSC using a clustering algorithm to group homogeneous spectral data. Controlling inputs allow investigators to control the size and number of clusters. Because the system produces a classification-type clustering map in which clusters require post- processing identification and interpretation, the system is frequently called a nonsupervised classification system. **LARSYS The set of classification programs for aircraft data handling and analysis developed at the Laboratory for the Applications of Remote Sensing, Purdue University. **Maximum Likelihood Ratio - Maximum likelihood ratio in remote sensing is a probability decision rule for classifying a target from multispectral data. Two types of errors are feasible: failure to classify the target correctly and misclassification of background as the target. In its simplest form, the likelihood ratio is Pt/Pb. This expression compares the probability (P) of an unknown spectral measurement being classified as target (t) to the probability of an unknown spectral measurement being classified as background (b). When Pt/Pb > I,- the formula decides t; and when TPb < 1, it decides b. ProbWi'lity density functions are compute from spectral samples, often called training samples. As the number of training samples increases, the mathematical computations of the maximum likelihood ratio increase in complexity. As a result, digital computer analysis is required. The analysis is called automatic data processing of multispectral remotely sensed data or automatic spectral pattern recognition of multispectral remotely sensed data. **MSS - Multispectral scanner system, sometimes called the multispectral scanner. The MSS usually refers to the ERTS-1 operational scanning system. **Nonsupervised Classification -.A procedure grouping spectral data into homogeneous clusters. Identification and interpretation are done in a postprocessing analysis. E-2 *Pattern Recognition - Pattern recognition is concerned with, but not limited to, problems of: (1) pattern discrimination, (2) pattern classification, (3) feature selection, (4) pattern identification, (5) cluster identification, (6) feature extraction, (7) preprocessing, (8) filtering, (9) enhancement, (10) pattern segmentation, or (11) screening. **Pixel - Picture resolution element, or one instantaneous field of view recorded by the multispectral scanning system. An ERTS-1 pixel is about 0.49 hectare (1.09 acres). One ERTS-1 frame contains about 7.36 x 10 pixels, each described by four radiance values. *Preprocessing - Preprocessing is an operation applied before pattern identification is performed. Preprocessing produces, for the cate- gories of interest, pattern features which tend to be invariant under changes such as translation, rotation, scale, illumination levels, and noise. In essence, preprocessing converts the measurements patterns to a form which allows a simplification in the decision rule. Preprocessing can bring into registration, bring into con- gruence, remove noise, enhance images, segment target patterns, detect, center, and normalize targets of interest. **Radiance - Measure of the radiant energy emitted by a radiator in a given direction. "Reflectance - Ratio of the radiance of the energy reflected from a body to that incident upon it. Reflectance is usually measured in percent. *Registering - Registering is the translation-rotation alignment process by which two images of like geometries and of the same set of ob- jects are positioned coincident with respect to one another so that corresponding elements of the same ground area appear in the same place on the registered images. In this manner, the corresponding gray shades of the 'two images at any (x,y) coordinate or resolution cell will represent the sensor output for the same object over the full image frame be-Ing registered. *Resolution - Resolution is a generic term which describes how well a system, process, component or material , or image can reproduce an isolated object or separate closely spaced objects or lines. The limiting resolution,, resolution limit, or spatial resolution is de- scribed in terms of the smallest dimension of the target or object that can just be discriminated or observed. Resolution may be functions of object contrast or spatial position as well as element shape (single point, number of points in a cluster, continuum, or line, etc.). "Signature - A set of spectral, tonal, or spatial characteristics of a classification serving to identify a feature by remote sensing. "Spectral Response - Spectral radiance of an object sensed at the satellite and recorded by the multispectral scanner. E-3 **Supervised Classification - Classification procedure in which data of known classes are used to establish the decision logic from which unknown data are assigned to the classes. The automatic data processing supervised classification procedure used at JSC during the ERTS-1 project used a Gaussian maximum likelihood decision rule. **Training Field - The spatial sample of digital data of a known ground feature selected by the investigator. From the sample the spectral characteristics-are computed for supervised multispectral classifi- cation of remotely sensed data. The statistics associated with train- ing fields form the input to the maximum likelihood ratio computations and train the computer to discriminate between samples. Extracted from Interpretation Systems Inc., 1976. **NASA Tech Memorandum, 1974. E-4 APPENDIX F DEVELOPMENT AND TESTING OF EXPERIMENTAL COMPUTER-ASSISTED ANALYTICAL TECHNIQUES DEVELOPMENT AND TESTING OF EXPERIMENTAL COMPUTER-ASSISTED ANALYTICAL TECHNIQUES Introduction The development and testing of computer-assisted techniques for anal- ysis of Landsat digital data was an evolutionary process throughout the course of the investigation. The basic software for accomplishing classi- fications of spectral data was obtained from NASA's Johnson Space Center and adapted to the UNIVAC '1100/41 System used by the Texas Natural Resour- ces Information System (TNRIS) staff. Additional software was developed by the TNRIS staff during the project to enhance the basic capability. Three of the four designated test areas, test sites 2, 3, and 5, were used to experiment with the various software routines and to develop a set of pro- cedures for application of the software and related techniques to classi- fication of land cover and land use within the test area. Test site 4 was reserved for a test and evaluation of the developed procedures and software. The following paragraphs will provide some of the details regarding the analysis effort on each test site, primarily to document the diffi- culties encountered during the development effort and to record the para- meters used for the analysis. The test sites are discussed in the same or- der in which they were addressed during the investigation. Two Landsat scenes were used on test site 3 and one each on sites 2 and 5. The control networks established for many of the Landsat scenes are contained in ap- pendix D for future reference if these same scenes need to be analyzed. The test site 4 analysis results are described in the main body of the report. F-1 Test Site 3 Early computer-assisted classification efforts were aimed primarily at establishing-unsupervised analysis procedures and evaluating available software for classifying Landsat Multispectral Scanner (MSS) data. Test site 3, consisting of the Austwell, Welder Flats, and adjoining Pass Cavallo/Port O'Connor USGS 7 1/2-minute Quadrangles, was the first test site classified. Two Landsat scenes containing test site 3 were evaluated: Scene Date 1614-16261 29 MAR 74 2034-16200 25 FEB 74 The first of two areas examined in site 3 (Scene 1614-16261) was the Austwell quadrangle area. ISOCLS clustering was performed on the entire Austwell quad area by sampling every third line and every other column of MSS data. Resulting statistics were used by the CLASSIFY and DISPLAY processors to produce a five-class computer map of the Austwell area. When visually compared to the Environmental Geologic Atlas of the Texas Coastal Zone - Port Lavaca Area (Mc Gowen et al., in print), the five classes correspond to pasture lands, agricultural land, one class of marsh, and two classes of water. Further clustering in the marsh area separated the marsh into two distinct types. Unclassified areas were also reexamined and yielded three more classes: an industrial area and two industrial holding tanks. The final classification resulted in nine classes which compared favorably with the broad categories on the BEG Biological Assemblages and Land Use Maps. Using procedures similar to those previously described, a classifica- tion map was produced for the Pass Cavallo/Port O'Conner quad areas. A F-2 .procedural change in the clustering process was tried in producing this classification. Using ISOCLS to cluster entire quad-sheet-size areas con- sumes large amounts of computer time. A less expensive method was adopted: a grayscale map was examined, along with available photography, to select numerous small areas throughout the scene in order to cover the observed spectral variation. Training class statistics were derived from cluster- ing every other line and column of data within these small areas. Classi- fication of the Pass Cavallo/Port O'Conner area yielded thirteen classes. When compared to photography and the maps of the Port Lavaca area, four to six classes corresponded to water of various depths and turbidity. The remaining classes delineated the barren land (beaches, dunes, and spoil areas) and separated the low vegetated wetland areas from the higher and drier vegetated areas. The second computer-assisted classification of test site 3 was made using Landsat scene 2034-16201 MSS data. Clustering was performed on all test site 3 areas with line and col- umn sampling intervals identical to the previous analysis. New ISOCLS parameter values were introduced to correct shortcomings found in the pre- vious classification results. ISOCLS parameter values for both the first and second evaluation of site 3 are noted in the table below. TEST SITE # ISOCLS PARAMETER VALUES Scene 1614-16261 Scene 2034-16200 Channels 2,3,4 Channels 1,2,3,4 ISTOP 20 ISTOP 10 NMIN 100 NMIN 20 DLMIN 3.2 DLMIN 2.0 STDMAX 4.5 STDMAX 3.0 MAXCLS 30 MAXCLS 30 KRN 2 KRN 2 F1-3 The classification results contained 17 classes for the Austwell area and 26 classes for both the Welder Flats and Pass Cavallo/ Port O'Conner areas. It should be noted that ELLTAB, a fast new "look-up" type of classi- fier, was introduced at this time. The new classifier produced results iden- tical to the LARSYS Classifier and was 17 times faster. SCALE/ REGISTER was also substituted for the LARSYS DISPLAY processor to produce a more usable map-like product (scaled and registered to the USGS quad maps). Correlation between the computer-assisted analysis results and BEG photo-interpretation results helped determine which classes were common to both techniques. Common classes again included water of various degrees of turbidity, high and low marsh environments, and barren areas such as beaches, dunes, and spoil areas. The computer-assisted procedures and tech- niques proved adequate for producing favorable classification results. A formalized set of steps for computer-assisted analysis was established as a result of the work on test site 3. This was changed several times during the investigation. The final set of procedures used for the site 4 analy- sis are contained in the main body of the report.. Test Site 2 The computer-assisted analysis of test site 2 (scene 1289-16261/8 May 1973) followed the procedures established at the end of test site 3. Considerable experimentation was done regarding the,classification para- meters in an attempt to attain smaller standard deviations. The results of this work established the parameters which were used during the remainder of the project. Because of the size and shape of the test area (seven quad- F-4 rangle sheets arranged stepwise along the coast), each of the steps re- quired four runs to complete all seven maps. The quad sheets included in test site 2 were: Jones Creek, Freeport, Oyster Creek, Christmas Point, Hoskins Mound, Sea Isle, and San Luis Pass. The Landsat scene contained an exceptionally wide range of spectral levels, including particularly high reflectance areas (urban/industrial), compared to those scenes previously studied. To allow for this wider range of reflectances and improve upon the initial classification obtained using previously established parameters, the sample size for acquiring the ISOCLS statistics was increased in some areas. Between 40 and 50 clusters were generated which had to be reduced to 40 subclasses for use of the ELLTAB programs. Because of Table limits, the 40 subclasses had to be displayed in two sets and consolidated manually by overlaying the two printouts for comparison with ground truth and subsequent refinement. The classification results compared favorably with similar results from the image-interpretation approach. Test Site 5 The environmental makE!-Up of test site 5 (scene 2034-16205) was some- what more complex than SitE!s 2 or 3. The presence of (1) subaqueous grass flats, (2) algal mats, (3) undifferentiated barren areas, (4) croplands, and (5) urban areas, contributed to the site's complexity. Test site 5 consisted of an area represented by five 7 1/2-minute USGS Quadrangles: Hawk Island, Laguna Atascosa, La Leona, La Coma, and Three Islands. By grouping adjoining quadrangle areas, the analysis was handled as three separate runs. Classification and display procedures were similar to those used in the site 3 analysis with the exception of an F-5 additional program: HGROUP was introduced to the classification scheme. The establishment of a DAM control network for test site 5 required slightly more time than previous test sites. This was due to a lack of adequate control points in key locations and approximately 40 percent of the scene contained coveragQ-over Mexico. ISOCLS clustering was performed on two selected areas in site 5 with sampling intervals consisting of every other line and column of data. Previously established parameter values were used with the exception of MAXCLS. The maximum number of clusters to be generated was changed from 30 to 50 in order to deal with the somewhat larger and more complex test site. The computer-assisted classification and display results compared favorably with the broad categories of the BEG land use and land cover class- ification scheme. Similar categories were: (1) water, (2) urban/built-up land, (3) grasslands, (4) wetlands, and (5) barren areas. Classified results contained 25 classes of which 17 were displayed. HGROUP was utilized for determining the 17 classes. After the site 5 analysis, a convenient method for combining spectrally similar classes was introduced (HGROUP) and the computer- assisted classification procedures were further modified for better efficiency. F-6 APPENDIX G CONTROL NETWORK DATA SUMMARY APPENDIX G CONTROL NETWORK DATA SUMMARY Test Site 5 Control Network Scene Id 2034-16205 Attitude Pitch = +-18 Roll = -.35 Control/Check Points (DAM-405) CCT POINT LINE SAMPLE LATITUDE LONGITUDE 3 1 665 746 26.2424 97.4070 3 2 703 765 26.2135 97.4043 3 3 705 788 26.2108 97.3920 3 4 718 792 26.2009 97.3915 3 5 768 718 26.1730 97.4433 3 6 786 711 26-1605 97.4507 4 7 517 20 26.3371 97.3305 4 8 583 86 26.2860 97.3042 4 9 610 45 26.2701 97.3335 4 10 608 94 26.2675 97.3050 4 CHK-11 619 88 26.2590 97.3113 4 12 695 135 26.2037 97.2977 RMS 60 meters G-1 Test Site 2 Control Network Scene Id 1289-16261 Attitude Pitch = +0-00 Roll = -0-72 Control Check Points (DAM-405) CCT POINT LINE SAMPLE LATITUDE LONGITUDE 1 CHK-2 492 92 29.52410 96-66320 1 CHK-4 797 392 29.28210 96-54950 1 CHK-5 544 637 29.43840 96-35930 1 8 1841 188 28.56630 96-86740 2 26 1612 218 28.65200 96-34080 3 CHK-9 868 652 29.05820 95-47530 3 CHK-13 1501 19 28.67300 95-96600 3 17 267 423 29.50280 95.48730 3 is 390 454 29.41330 95-49350 4 19 1069 70 28.89600 95-38400 4 21 987 176 28.94320 95-30730 4 CHK-22 470 450 29.28000 95.04690 4 23 563 417 29.21800 95-08370 4 25 594 199 29.21700 95-21440 4 CHK-29 782 154 29.08910 95-27960 4 CHK-35 937 207 28.97510 95-27990 4 CHK-37 1066 52 28.89900 95-39520 RMS 57 meters G-2 Scene 1, Test Site 4 Scene Id 2034-16202 Attitude Pitch = t.05 Roll = -.43 Control/Check Points (DAM-405) CCT POINT LINE SAMPLE LATITUDE LONGITUDE 1 1 57 408 28.2707 98.0213 3 2 369 435 27.9064 97.1350 2 3 539 625 27.8433 97.5250 2 4 536 585 27.8492 97.5477 2 5 1050 771 27.4733 97.5390 1 6 1598 787 27.1579 98.0950 3 7 1936 156 26.8336 97.5913 RMS 60 meters Scene 2, Test Site 4 Scene Id 2376-16172 Attitude Pitch = -.06 Roll = -.41 Control/Check Points (DAM-7605) POINT LINE SAMPLE LATITUDE LONGITUDE 1 68 920 28.0860 97.8903 2 150 2250 27.9064 97.1360 3 318 1593 27.8493 97.5477 G-3 Scene 2, Test Site 4 (continued) POINT LINE SAMPLE LATITUDE LONGITUDE 4 506 770 27.7917 98.0594 5 836 1781 27.4713 97.5396 6 1387 997 27.1600 98.0973 7 2194 1616 26.5427 97.8931 RMS 111 meters Scene 3, Test Site 4 Scene Id 5082-16080 Attitude Pitch = +.28 Roll = -.46 Control/Check Points (DAM-7605) POINT LINE SAMPLE LATITUDE LONGITUDE 1 726 1618 27.8493 97.5477 2 728 1660 27.8432 94.5251 3 1235 1803 27.1600 97.5390 4 1788 1005 26.8249 98.0973 5 2129 1989 28.2630 97.6060 6 268 651 28.0825 98.0136 7 486 941 28.0825 97.8903 8 155 2440 28.1698 96.9684 9 143 2495 28.1725 96.9337 RMS 95 meters G-4 Scene 4, Test Site 4 Scene Id 1146-16320 Attitude Pitch = +.26 Roll = -.57 Control/Check Points (DAM-405) CCT POINT LINE SAMPLE LATITUDE LONGITUDE 1 1 129 752 28.0873 97.8903 3 2 224 474 27.8989 97.1361 2 3 383 624 27.8493 97.5477 2 4 1125 697 27.3267 97.6545 2 5 1449 36 27.1600 98.0973 3 7 2136 647 26-5572 97.4254 2 8 2284 629 26.5275 97.9236 RMS 82 meters G-5 APPENDIX H ANNOTATED BIBLIOGRAPHY ON THE APPLICATION OF AERIAL PHOTOGRAPHY AND LANDSAT IMAGERY TO THE STUDY OF COASTAL REGIONS APPENDIX H ANNOTATED BIBLIOGRAPHY ON THE APPLICATION OF AERIAL PHOTOGRAPHY AND LANDSAT IMAGERY TO THE STUDY OF COASTAL REGIONS Two references which appear repeatedly are: 1. Symposium, Significant Results obtained from ERTS-1, referring to: Freden, S.C., Mercanti, E.P., and Becker, M.A., eds., 1973a, Symposium on significant results obtained from the Earth Resources Technology Satellite-1, Vol.I, Technical presentations. Sections A and B: Washington, D.C., Goddard Space Flight Center, NASA SP-327, March 5-9, 1973, 1730 p. 2. 3rd ERTS-1 Symposium, referring to: Freden, S.C., Mercanti, E.P., and Becker, M.A., eds.. 1973b, Third Earth Resources Technology Satellite-1 Symposium, Vol.I, Technical presentations, Sections A and B: Washington, D.C., Goddard Space Flight Center, NASA SP-351, December 10-14, 1973, 1974 p. H-1 Alexander, R.H., 1973, ERTS regional-scale overview linking land use and environmental processes in CARETS: Symposium, Significant Results Obtained from ERTS-1, p. 931-937. The pattern of tones and textures of ERTS-1 images was found to most closely correspond to land use maps when compared with preexisting maps of various types for the same area. Anderson, D.M., Gatto, L.W., McKim, H.L., and Petrone, A., 1973, Sediment distribution and coastal processes in Cook Inlet, Alaska: Symposium, Significant Results Obtained from ERTS-1, p. 1323-1339. Bands 6 and 7 allowed determination of the coastline of Cook Inlet, while bands 4 and 5 showed the suspended sediment and current patterns in the estuary. Circulation was seen to be primarily counterclockwise. Previously unmapped tidal flats and certain cultural features were identified. Anderson, R.R., Alsid, L., and Carter, V., 1975, Applicability of Skylab orbital photography to coastal wetland mapping: Proc. Am. Soc. Photogrammetry, 41st Ann. Mtg., March 9-14, 1975, p. 371-377. Skylab S190A color IR photographs were enlarged to a scale of 1:125,000, allowing transparent overlays to be used in mapping wetland features directly from the imagery. The study area was divided into five mappable units: (1) fresh estuarine river marsh, (2) brackish estuarine river marsh, (3) fresh estuarine bay marsh, (4) brackish estuarine bay marsh, and (5) near saline marsh. Anderson, R.R., Carter, V., and McGinness, J., 1973, Applications of ERTS data to coastal wetland ecology with special reference to plant community mapping and typing and impact of man: 3rd ERTS-1 Symposium, p. 1225-1242. ERTS-1 imagery is shown to be useful in wetland studies for test areas in North Carolina and Georgia. The authors explain why different data enhancement techniques were used and how each type aided the program. H-2 Anderson, R.R., Carter, V., and McGinness, J., 1973, Mapping Atlantic coastal marshlands, Maryland, Georgia, using ERTS-1 imagery: Symposium, Signifi- cant Results Obtained from ERTS-1, p. 603-608. ERTS-1 data were used as an inexpensive source of information for mapping the extensive coastal marshes of the Eastern United States. This paper is a'report on the feasibility of this approach. The study found that the following information could be derived from LANDSAT: (1) upper wetland boundary, (2) drainage pattern in wetland, (3) plant communities, (4) ditching activities associated with agriculture, and (5) lagooning for water-side housing developments. Anderson, R.R., and Wobber, F.J., 1973, Wetlands mapping in New Jersey: Photogramm. Eng., v. 39, p. 353-358. Color and color IR aerial photographs were shown to be a practical way to map or inventory wetlands. Determinable features include the mean high-water mark, the upper wetland boundary, and species associations in the area. Barrell, E.C., and Curtis, L.F., eds., 1974, Coastal vegetation surveys, in Environmental remote sensing: Applications and achievements: Edward Arnold, publisher, p. 127-143. Ecological zones in the coastal region have distinct vegetation coverage which is discernible on aerial photographs. The vegetation assemblages in six zones are discussed; these are intertidal, mudflat, salt marsh, shingle beach, sand dunes, and cliffs. The examples given are for the English coast. Bartlett, D., Klemas, V., and Rogers, R., 1975, Investigations of coastal land use and vegetation with ERTS-1 and SKYLAB-EREP: Proc. Am. Soc. Photogrammetry, 41st Ann. Mtg., March 9-14, 1975; p. 378-389. Machine-assisted analysis of ERTS-1 MSS scanner data was compared to manual interpretation of Skylab-EREP S190B photographs to determine H-3 I their usefulness in wetlands management. Skylab-EREP S190B was found to be superior in resolution but had the disadvantage of limited coverage. The authors conclude that either type of data is useful for the inventory of land cover types on a regional basis. Bowker, D.E., and others, 1973, Correlation of ERTS multispectral imagery with suspended matter and chlorophyll in lower Chesapeake Bay: Symposium, Significant Results Obtained from ERTS-1, p. 1291-1297. ERTS data were shown to be useful in monitoring estuarine waters for the assessment of siltation, productivity, and water type. Major areas of suspensate concentrations have been determined for Chesapeake Bay. Carter, V., and Schubert, J., 1974, Coastal wetlands analysis from ERTS MSS digital data and field spectral measurements: Ann Arbor, Michigan, Proc. 9th Internat. Symposium, Remote Sensing of Environment, p. 1241-1260. In utilizing vegetation distribution in the coastal area to map wetlands, signature analysis of vegetation types and physical features found in the zone has been tabulated. Also, seasonal variation and its effects are discussed. The system utilized computer analysis of the digital data. Clark, D.K., Zaitzeff, J.B., Strees, L.V., and Glidden, W.S., 1974, Computer derived coastal water classifications via spectral signatures: Ann Arbor, Michigan, Proc. 9th Internat. Symposium, Remote Sensing of Environment, p. 1213- 1239. ERTS-1MSS data were shown to be highly effective in the detection, classification, and delineation of water masses. Technical details of the processing of the color and black-and-white photographs are given, along with good definitions of terms. The enhancement and color slicing techniques used in the study are explained. DeBlieux, C., 1962, Photogeology in Louisiana coastal marsh and swamp: Gulf Coast Assoc. Geol. Soc., Trans., 12th Ann. Mtg., Oct.-Nov. 1962, p. 231-241. H-4 Aerial photographs of the Louisiana coastal zone show several structural features which exhibit surface expressions. This method may be useful in the search for stratigraphic and structural traps for oil and gas. Denathieu, P.G., and Verger, F.H., 1973, The utilization of ERTS-1 data for the study of the French Atlantic Littoral: 3rd ERTS-1 Symposium, p. 1447-1450. By utilizing ERTS-1 data, it was possible to accurately determine the direction of transport of sediments from rivers emptying into the Atlantic Ocean along the southwest coast of France. Ocean currents and a turbidity front at a depth of about 50 meters were observed. Dolan, R., and Vincent, L., 1973. Coastal processes: Photogramm. Eng., v. 39, p. 255-260. High-altitude aircraft photographs were used in conjunction with ground truth to study the crescentric forms seen on long, sandy coasts. These features indicate where overwash may occur during storms. Dolan, R., and Vincent, L., 1973, Evaluation of land use mapping from ERTS in the shore zone of CARETS: Symposium, Significant Results Obtained from ERTS-1, p. 939-948. ERTS-1 data provided a basis for land cover and land use mapping within the shore zone. MSS bands 4, 5, 6, and 7 are compared for their respec- tive utility in delinE!ating various features. Problems in mapping are discussed for: urban and built-up areas, forest areas, water, nonforested wetland, and barren land. El-ashry, M.R., and Wanless, H.R., 1967, Shoreline features and their changes: Photogramm. Eng., v. 33, p. 184-189. Sequential aerial photographs are shown to be essential in understanding long-term changes in shoreline geomorphology. Through the use of sequential photographs, net changes and the rate of change can be determined. H-5 Estes, J.E., Thaman, R.R., and Senger, L.W., 1973, Applications of ERTS-1 satellite imagery for land use mapping and resource inventories in the central coastal region of Claifornia: 3rd ERTS-1 Symposium, p. 457-490. ERTS-1 data were used to construct land use, landform, drainage, and vegetation maps of central California. Kelp distribution offshore was also mapped. The appendix lists the various categories mapped from the ERTS data. Feinberg, E.B., Yunghans, R.S., Stitt, J., and Mairs, R.L., 1973, Impact of ERTS-1 images on management of New Jersey's coastal zone. 3rd ERTS-1 Symposium, p. 497-503. The New Jersey Department of Environmental Protection utilized ERTS data to monitor and manage that state's coastal zone. Primary uses of ERTS are to: (1) detect land use changes, (2) monitor offshore waste disposal, (3) pick sites for outfalls of sewage treatment plants, and (4) allocate funds for shore protection. Fezer, F., 1971, Photo interpretation applied to geomorphology--A review: Photogrammetria, v. 27, n. 1, p. 1-50. The majority of the previous literature on the utilization of aerial photographs in geomorphology is reviewed. Most examples (and literature subjects) are for regions outside the United States. The paper includes a section on coastal landforms (p. 21-22) and a small section on deltas (p. 20). Flores, L.M., Reeves, C.A., Hixon, S.B., and Paris, J.F., 1973, Unsupervised classification and areal measurement of land and water coastal features on the Texas Coast: Symposium, Significant Results Obtained from ERTS-1, p. 1675-1681. By using two classification algorithms with digital ERTS data, it was possible to determine from 17 to 30 different classes representing mix- H-6 tures of water, land, and vegetation. The two areas studied were the Trinity River delta and the Galveston area. Fontanel, A., Guillenot, J., and Guy, M., 1973, First ERTS-1 results in Southeastern France: Geology, sedimentology, pollution at sea: Symposium, Significant Results Obtained from ERTS-1, p. 1483-1511. There are four parts to this paper: (1) Linear Trends Observed in the Western French Alps; (2) Some Results from the Study of the Dynamic Behavior of Coastal Sedimentation in the Gulf of Lions; (3) Study of Pollution at Sea in the Western Mediterranean; and (4) Processing of ERTS Imagery. Part 2.- p. 1492-1499. ERTS-1 data clearly showed past shorelines of the Rhone River delta and allowed them to be accurately mapped. Many of these shorelines had not been known prior to ERTS data usage. Fault control is also indicated in the area of ERTS data coverage. Gallagher, J.L., Reimult, R.J., and Thompson, D.E. , 1972, a comparison of four remote sensing media for assessing salt marsh primary productivity: Ann Arbor, Michigan, Proc. 8th Internat. Symposium, Remote Sensing of Environment, 2-6 Oct. 1972, p. 1287-1296. Four types of imagery from fixed wing aircraft were compared: Kodak Aerochrome Infrared, Ektachrome MS Aerographic, Kodak Infrared Aerographic, and imagery from a Bendix thermal mapper. Imagery interpretation was done with and without enhancement, and ground truth was used to evaluate results. Grimes, B.H., and Hubbard, @J.C.E., 1971, A comparison of film type and the importance of season for interpretation of coastal marshland vegetation: Photogramm. Rec., v. 7., p. 213-222. Color aerial photographs were found to be the best film type to use in England for the determination of coastal marshland vegetation. October was found to be the best time of the year for determination of vegetation H-7 while February was best for mapping topographic features. Mudflats were best seen on false color imagery. Guss, P., 1972, Tidelands management mapping for the coastal plains region: Washington, D.C., Am. Soc. Photogrammetry, Proc. Coastal Mapping Symposium, June 5-8 1972, p. 243-262. The author describes a pilot project conducted for the State of South Carolina to determine the feasibility of using aerial photographs to produce multipurpose maps for tidelands management. Requirements of the project are discussed as well as limitations of the data. Heath, G.R., and Parker, H.O., 1973, Forest and range mapping in the Houston area with ERTS-1 data: Symposium, Significant Results Obtained from ERTS-1, p. 167-172. The paper discusses procedures and results for two types of investigations using ERTS-1 data: forestry studies in which species content and condition of timber stands were determined, and a range investigation concerned with vegetation mapping in the Gulf coast marsh. Species of Spartina could be differentiated with or without computer-aided analytical techniques. The boundary between S. patens and S. spartinae in the area closely coincides with the wetlands inner boundary. Hunter, R.E., 1973, Distribution and movement of suspended sediment in the Gulf of Mexico off the Texas coast: Symposium, Significant Results Obtained from ERTS-1, p. 1341-1348. Sediment plumes observed on ERTS imagery differ very slightly in amount of suspended sediment. Data along the Texas coast show the extent and form of these plumes, some of which extend for many kilometers parallel to the shoreline. These plumes permit interpretation of nearshore currents. Kevlin, R.T., 1973, Recognition of beach and nearshore depositional features of Chesapeake Bay: Symposium, Significant Results Obtained from ERTS-1, p. 1269-1274. H-8 ERTS-1 support aircraft imagery was used to map such nearshore features as longshore bars in Chesapeake Bay. Also mapped were welded beach ridges and recurved spits. Klemas, V., Bartlett, D., Philpot, W., Rogers, R., and Reed, L., 1974, Coastal and estuarine studies with ERTS-1 and Skylab: Remote Sensing of Environment, v. 3, P. 153-174. The repetitive nature of the ERTS and Skylab imagery covering Delaware Bay allowed detection of changing conditions in and around the bay. Coastal vegetation, land use, current circulation, water turbidity, and ocean waste dispersion were studied. Ground truth allowed correlation of sediment concentration with reflectance on the images. The method used to determine currents from ERTS-1 data is discussed. Klemas, V., Bartlett, D., Rogers, R., and Reed, L., 1974. Inventories of Delaware's coastal vegetation and land-use utilizing digital processing of ERTS-1 imagery: Ann Arbor, Michigan, Proc. 9th Internat. Symposium, Remote Sensing of Environment, p. 1399-1410. Computer analysis of digital ERTS-1 data produced maps of various vegetative types with accuracies of 83-90 percent when compared with previous, conventional vegetation maps. The investigators plan to update and refine the system until higher accuracies are attained. Klemas, V., Daiber, F., and Bartlett, D., 1973, Identification of marsh vegetation and coastal land use in ERTS-1 imagery: Symposium, Significant Results Obtained from ERTS-1, p. 615-627. ERTS-1 data were used in combination with high- and low-altitude aircraft coverage and ground truth to determine the accuracy of using ERTS-1 data alone to distinguish stands of vegetation. Plant communities can be discriminated, but problems were encountered owing to the limited resolution of the satellite data. U-2 and RB-57 coverage were used to map small vegetation communities and the larger stands more accurately. H-9 Klemas, V., Ouley, C.W., and Rogers, R., 1973, Monitoring coastal water properties and current circulation with ERTS-1: 3rd ERTS-1 Symposium, p. 1387-1411. Currents in Delaware Bay detectable from ERTS-1 data coincided with predicted and measured currents in the bay. Convergent boundaries between different water masses were detected; some exhibited convergent shear. Waste disposal distribution was mapped. Results from the ERTS study are being used to predict potential oil slick movement and to estimate sediment transport. Klemas, V., Srna, R., Treasure, W., and Otley, M., 1973, Applicability of ERTS-1 imagery to the study of suspended sediment and aquatic fronts: Symposium, Si-gnificant Results Obtained from ERTS-1, p. 1275-1290. ERTS images of Delaware Bay were studied and compared with ground truth and aircraft coverage. Suspended sediment patterns and several types of aquatic interfaces, or fronts, were observed. Klemas, V., and others, 1974, Correlation of coastal water turbidity and current circulation with ERTS-1 and Skylab imagery: Ann Arbor, Michigan, Proc. 9th Internat. Symposium, Remote Sensing of Environment, p. 1289- 1317. Imagery and digital tapes of ERTS-1 data, along with extensive ground truth as to the exact amount of suspended sediment in the water, gave an indication of reflectance signatures for various sediment concentrations. The study of sediment distribution has also allowed determination of circulation patterns in Delaware Bay. Klemas, V., and others, 1974, Inventory of Delaware's wetlands: Photogramm. Eng., v. 40, no. 4, p. 433-439. , Enhanced RB-57 color IR imagery was used to map five categories in the wetlands of Delaware: (1) salt marsh cord grass, (2) salt marsh hay and spike grass, (3) reed grass, (4) high tide bush and sea myrtle, and (5) freshwater species in impounded areas. H-10 These units were characterized by: #1 - Spartina. alterniflora, #2 - Spartina. patens and Distichlis spicata. #3 - Phragmites communis #4 - Iva frutescens and Baccharis halimifolia Magoon, O.T., Berg, D.W., and Hallermeier, R.J., 1973, Application of ERTS-1 imagery in coastal studies: Symposium, Significant Results Obtained from ERTS-1, p. 1697-1698. Use of MSS ERTS images has permitted more accurate determination of tidal inlet configuration (as well as information on long-shore transport), updating of navigation charts in uninhabited, remote areas, and the near- shore water movement patterns. No enhancement techniques were used. .Mairs, R.L., Wobber, F.J., Garefalo, D., and Yunghans, R., 1973, Application of ERTS-1 data to the protection and management of New Jersey's coastal environment: Symposium, Significant Results Obtained from ERTS-1, p. 629-633. Using MSS bands 4 and 5, it was possible to detect the extent, drift, and dispersion of waste disposed in coastal waters. Moore, G.K., and North, G.W., 1974, Flood inundation in the southeastern United States from aircraft and satellite imagery: Ann Arbor, Michigan, Proc. 9th Internat. Symposium, Remote Sensing of Environment, p. 607-620. ERTS-1 data are useful in flood mapping if satellite passage coincides with the exact time of flooding. Otherwise, color-infrared photography is the most useful for determining the extent of flooding in forested areas during winter months. Orr, D.G., and Quick, J.R., 1971, Construction materials in delta areas: Photo- gramm. Eng., v. 37, no. 4, p. 337-351. H-11 Black-and-white, color, and color-infrared aerial photographs were used to locate depositional features on the Mississippi River delta. This method is shown to be a practical way of locating new sand, gravel, and clay deposits in this area. Pestrong, R., 1969, Multiband photos for a tidal marsh: Photogramm. Eng., v. 35, p. 453-470. The author compares various wavelength-sensitive films used in aerial photography to determine their usefulness in delineating vegetation zones within tidal marshes. Ektachrome IR was superior for vegetation type determination, while Ektachrome color transparencies were the most useful for general interpretation. Pirie, D.M., and Stellar, D.D., 1973, California coastal processes study: 3rd ERTS-1 Symposium, p. 1413-1446. ERTS-1 data were used to analyze nearshore currents, sediment transport, and river discharge along the California coast. Seasonal patterns in sedi- ment transport were found to be related to current systems and coastal morphology. Sediment plumes at times extended much farther offshore than previously thought. Sediment distribution was determined by using computer enhancement of the data. Polcyn, F.C., and Lyzenga, D.R., 1973, Updating coastal and navigational charts using ERTS-1 data: 3rd ERTS-1 Symposium, p. 1333-1346. Fairly accurate water depth data, up to 30 feet in the Bahamas and up to 200 meters in Lake Michigan, were obtained from ERTS-1 data. Processing of original ERTS data for depth information costs approximately $1.50 per square mile. Details of the process were not given. Reimold, R.J., Gallagher, J.L., and Thompson, D.E., 1972, Coastal mapping with remote sensors: Washington, D.C., Am. Soc. Photogrammetry, Proc. Coastal Mapping Symposium, p. 99-112. H-12 Slaughter, T.H., 1973, Seasonal changes of littoral transport and beach width and resulting effect on protective structures: Symposium, Signifi- cant Results Obtained from ERTS-1, p. 1259-1267. The direction of littoral transport and resulting beach width along Maryland's shoreline changes seasonally. This change makes erosion rates difficult to determine. Ground truth combined with ERTS-1 coverage points out the need for a year-long study to see a complete seasonal cycle. This would aid in protective structure design along waterfront properties, since the full potential for erosion would be shown. Sonu, C.J., 1964, Study of shore processes with aid of aerial photogrammetry: Photogramm. Eng., v. 30, p. 932-941. Ways in which aerial photographs may be used to better understand coastal environments are discussed. A large bibliography lists the majority of papers printed on the subject prior to this paper. Stafford, D.B., and Langfelder, J., 1971, Air photo study of coastal erosion: Photogramm. Eng., v. 37, no. 6, p. 565-575. Aerial photographs taken in successive years permit accurate determination of erosion in coastal areas. Only horizontal distances can be measured, making volumetric measurements of erosion difficult. These data.are shown to be necessary for development of urban areas or industry in coastal regions. Steller, D., Lewis, L.V., and Phillips, D.M., 1972, Southern California coastal processes as analyzed from multisensor data: Ann Arbor, Michigan, Proc. 8th Internat. Symposium, Remote Sensing of Environment, p. 983-998. Airborne imagery was used to detect and measure suspended sediment and tracer dyes in the nearshore zone off southern California. The methods discussed were developed to study sediment transport and coastal effluent distribution in the area. H-13 Steller, D.D., and Pirie, D.M., 1974, California nearshore processes: Ann Arbor, Michigan, Proc. 9th Internat. Symposium, Remote Sensing of Environment, p. 1261-1278. The suspended sediment present in turbulent nearshore waters, along with the repetition of ERTS-1 data over a year-long period, have allowed the oceanic circulation patterns near the California coast to be determined. Tuyahov, A.J., and Holz, R.K., 1973, Remote sensing of a barrier island: Photogramm. Eng., v. 39, 177-188. Three types of imagery are compared for their effectiveness in determining the environments of Padre Island, Texas. Color, color-infrared, and thermal infrared are compared in delineating vegetation stands, vegetated vs. nonvegetated dunes, tidal flats, and hurricane washover channels. Williams, R.S., Jr., 1973, Coastal and submarine features on MSS imagery of southeastern Massachusetts: Comparison with conventional maps: Symposium, Significant Results Obtained from ERTS-1, p. 1413-1422. ERTS-1 data provided the necessary geologic and hydrographic information to update conventional maps of coastal areas where conditions vary rapidly. The data obtained through ERTS are both accurate and relatively inexpensive and provide a constant update. Williamson, AX, and Grabau, W.E., 1973, Sediment concentration mapping in tidal estuaries: 3rd ERTS-1 Symposium, p. 1347. Methods are discussed for the determination of the amount of suspended sediment in water and of ways ERTS data may be used to exactly locate and delineate surface-water turbidity. Wobber, F.J., and Anderson, R.R., 1973, Simulated ERTS data for coastal manage- ment: Photogramm. Eng., v. 39, p. 593-598. H-14 ERTS data are shown to be potentially very useful in mapping wetland boundaries, monitoring land use changes in wetlands, studying offshore currents, and in planning dredge spoil disposal. Wright, F.F., Sharma, G.D., and Burbank, D.C., 1973, ERTS-1 observations of sea surface circulation and sediment transport, Cook Inlet, Alaska: Symposium, Significant Results Obtained from ERTS-1, p. 1315-1322. Suspended sediment, visible on MSS 4 and 5, allowed the determination of sediment and pollutant trajectories, areas of probable commercial fish concentration, and the circulation regime. Yost, E., Hollman, R., Alexander, J., and Nuzzi, R., 1973, An interdisciplinary study of the estuarine and coastal oceanography of Block Island and adjacent New York coastal waters: 3rd ERTS-1 Symposium, p. 1607. Water samples were taken to correspond with the timing of ERTS-1 coverage. This procedure allowed reflectance to be "quantified" in terms of the amount of suspended sediment present in the water. H-15 APPENDIX I ACCURACY EVALUATION FOR EACH SCENE MAPPED, BY LAND COVER AND LAND USE CATEGORY APPENDIX I Accuracy Evaluation For Each Scene Mapped, by Land Cover and Land Use Category The percent. correct value for each category is based on the assumption that one-half of the questionable points would ultimately be considered correct if additional data became available. Analysis of Test Site 2 Landsat Scene 1289-16261, 8 May 1973 Number of Points Checked Percent Classification Correct Incorrect Questionable Total Correct U 3 0 0 3 100 Ui 1 0 0 1 100 Ut 4 0 0 4 100 Ue A 0 1 0 1 0 G 42 10 3 55 80.0 Gd Gb 8 0 0 8 100 1 0 0 1 100 wo 4 0 3 7 85.7 Wlm 21 2 0 23 91.3 Whm 14 1 0 15 93.3 Wtf 1 0 0 1 100 Wga WS 3 0 0 3 100 B 1 0 0 1 100 Bd Bds BU Total -103 14 6 123 86.2 Percent correct@ assuming one-half of questionables are correct . . . . . . 86.2% Percent correct assuming all of questionables are correct . . . . . . . . . 88.6% Percent correct assuming all of questionables, are incorrect . . . . . . . . 83.7% 1-2 Analysis of Test Site 3 Landsat Scepe 1614-16261, 29 Mar. 1974 Number of Points Checked Percent Classification Correct Incorrect Questionable Total Correct U Ui 1 0 0 1 100 Ut Ue A 36 0 0 36 100 G 81 6 0 87 93.1 Gd 2 0 0 2 100 Gb Gbr wo 0 0 1 1 100 Wlm 11 0 0 11 100 Whm 4 2 0 6 66.7 Wtf Wga Ws 2 0 0 2 100 B 2 0 0 2 100 Bd Bds 3 0 0 3 100 Bu Total 142 8 1 151 94.7% Percent correct of total assuming one-half of questionables are correct 94.7% Percent correct of total assuming all of questionables are correct . . . . 94.7% Percent correct of total assuming all of questionables are incorrect . . . 94.0o/ol 1-3 Analysis of Test Site 3 Landsat Scene 2034-16200, 25 Feb. 1975 Number of Points Checked Percent Classification Correct Incorrect Questionable Total Correct U Ui 1 0 0 1 100 Ut 1 0 0 1 100 Ue A 32 1 33 97.0 G 88 5 93 94.6 Gd Gb Gbr wo WIM 7 4 0 11 63.6 Whm 5 2 0 7 71.4 Wtf 2 0 0 2 100 Wga Ws 1 0 0 1 100 B 1 0 0 1 100 Bd Bds Su 1 1 0 2 50.0 Total -139 13 0 152 91.5% Percent correct of total assuming one-half of questionables are correct 91.5% Percent correct of total assuming all of questionables are correct . . . . 91.5% Percent correct of total assuming all of questionables are incorrect . . . 91.5% 1-4 Analysis of Test Site 4 Landsat Scene 2034-16202, 25 Feb. 1975 Number of Points Checked Percent Classification Correct Incorrect Questionable Total Correct U 4 2 1 7 71.4 Ui 3 0 1 4 100 Ut 6 0 0 6 100 Ue A 3 0 0 3 100 G 14 0 0 14 100 Gd Gb 4 0 0 4 100 Gbr wo 12 0 2 14 92.9 Wlm 3 0 1 4 100 Whm 0 0 1 1 100 Wtf 8 3 2 13 69.2 Wga 13 0 0 13 100 WS B 1 0 0 1 100 Bd Bds 2 1 1 4 75.0 Bu 1 2 2 5 40.0 Total 74 8 11 93 86.0% Percent correct of total assuming one-half of questionables are correct 86.0% Percent correct of total assuming all of questionables are correct . . . . 91.4% Percent correct of total assuming all of questionables are incorrect . . . 79.6% 1-5 Analysis of Test Site 4 Landsat Scene 5082-16080, 10 July 1975 Number of Points Checked Percent Classification Correct Incorrect Questionable Total Correct U 5 0 1 6 100 Ui 2 0 0 2 100 Ut 3 0 0 3 100 Ue A 4 3 0 7 57.1 G 8 1 0 9 88.9 Gd Gb 4 2 0 6 66.7 G:w 2 0 2 4 75.0 wo 13 1 2 16 87.5 Wlm 2 1 0 3 66.7 Am Wtf 3 0 0 3 100 Wga 16 1 1 18 94.4 Ws 1 0 0 1 100 B 1 0 1 2 100 Bd Bds 3 0 0 3 100 Bu 2 1 0 3 66.7 @otal 69 10 7 86 84.9% Percent correct of total assuming one-half of questionables are correct 84.9% Percent correct of total assuming all of questionables are correct . . . . 88.4% Percent correct of total assuming all of questionables are incorrect . . . 80.2% 1-6 Analysis of Test Site 4 Landsat Scene 2376-16172, 2 Feb. 1976 Number of Points Checked Percent Classification Correct Incorrect Questionable Total Correct U 3 0 1 4 100 Ui 2 1 0 3 66.7 Ut 5 0 0 5 100 Ue A 4 1 0 5 80.0 G 12 0 0 12 100 Gd 1 0 0 1 100 Gb 5 0 0 5 100 C@r wo 8 1 1 10 90 wlm 0 0 1 1 100 Whm Wtf 5 5 6 16 50.0 Wqa 8 0 0 8 100 Ws B 1 0 1 2 100 Bd Bds 4 0 0 4 100 BLJ 1 1 0 2 50.0 Total 59 9 10 78 82.1% Percent correct of total assuming one-half of questionables are correct 82.1% Percent correct of total assuming all of questionables are correct . . . . 88.5% Percent correct of total assuming all of questionables are incorrect . . . 75.6% 1-7 Analysis of Test Site 5 Landsat Scene 2034-16205, 25 Feb. 1975 Number of Points Checked Percent Classification Correct Incorrect Questionable Total Correct U Ui Ut 1 0 0 1 100 Ue A 16 1 0 17 94.1 G 42 1 0 43 97.6 Gd 1 0 0 1 100 Gb wo Wlm Whm Wtf 15 1 0 16 93.8 Wga 13 0 0 13 100 Ws B 4 2 0 6 66.7 Bd Bds 1 0 0 1 100 BU 18 7 0 25 72.0 Total ill 12 0 123 90.2% Percent correct of total assuming one-half of questionables are correct 90.2% Percent correct of total assuming all of questionables are correct . . . . 90.2% Percent correct of total assuming all of questionables are incorrect . . . 90.2% 1-8 - APPENDIX J DATA TABLES FOR ACCURACY OF COMPUTER CLASSIFICATION IN THE HARBOR ISLAND TEST SITE Table 2 COMPUTER AND IMAGE-INTERPRETATION CLASSIFICATION ACCURACY COMPARISON 10 JULY 1975 U A G Wo W B U 2 0 0 0 0 0 A 0 1 0 0 0 0 G 0 0 1 0 0 0 WO 6 4 8 19 0 1 W 2 0 2 0 22 0 B 0 0 0 0 0 6 51 -t74 69% Table 3 COMPUTER AND IMAGE-INTERPRETATION ACCURACY COMPARISON WITH CLASS ADJUSTMENTS 10 JULY 1975 U Up W B WA U 1 0 0 0 0 Up 6 33 0 1 0 W 2 2 22 0 0 B 0 0 0 6 0 WA 0 0 0 0 1 63 -t 74 85% J-2 Appendix i DATA TABLES FOR ACCURACY OF COMPUTER CLASSIFICATION IN THE HARBOR ISLAND TEST SITE Table 1. COMPUTER AND IMAGE-INTERPRETATION CLASS CORRELATION MATRIX 10 JULY 1975 U A G wo. w B o 0 0 0 0 21 12 10 16 19 0 0 @3 7 8 5 3 1 2 0 0 0 63 0 & 0 0 10 25 0 4 % 4 0 8 0 15 1 5 2 9 22 1 0 2 0 0 0 4 0 G 0 0 0 0 0 0 z 8 0 0 0 2 0 > 0 0 0 3 2 4 J-1 Table 4 COMPUTER AND IMAGE-INTERPRETATION CLASS CORRELATION MATRIX 2 FEB. 1976 U A G k1o w B 2 0 0 0 2 23 11 1 47 15 12 4 13 4 7 9 0 0 A 0 4 0 0 14 0 & 2 0 7 16 1 0 % 6 1 2 0 29 4 2 10 1 0 13 1 0 0 0 0 0 0 G 4 0 0 0 8 0 z 3 0 0 0 17 0 A 0 0 0 0 1 0 Table 5 COMPUTER AND IMAGE-INTERPRETATION CLASSIFICATION ACCURACY COMPARISON 2 FEB. 1976 U A G wo w B U 3 1 2 2 0 0 A 0 0 0 0 0 0 G 4 0 12 4 2 2 wo 0 0 2 4 0 0 w 3 4 0 0 22 1 B 1 0 0 0 0 5 46 74 62% J-3 Table 6 COMPUTER AND IMAGE-INTERPRETATION ACCURACY COMPARISON WITH CLASS ADJUSTMENTS 2 FEB. 1976 U A Up Wa W B U 3 1 4 0 0 0 A 0 0 0 0 0 0 Up 4 0 22 0 2 2 Wa. 0 0 0 8 0 0 W 0 4 0 0 17 1 B 1 0 0 0 0 5 55 74 74% Table 7 COMPUTER AND IMAGE-INTERPRETATION CLASS CORRELATIONj USING AN INTENSIFIED SAMPLE 10 JULY 1975 U A G wo w B WA 0 0 1 0 1 18 0 4 7 .16 21 5 1 1 10 3 2 5 2 2 1 A 4 0 0 0 41 1 3 & 0 3 3 24 2 0 0 % 1 4 5 0 13 1 0 0 10 3 16 2 1 0 2 0 0 0 1 0 10 z 8 0 0 0 2 0 56 A 1 0 0 0 2 0 50 > 1 0 0 2 1 1 0 J-4 Table 8 COMPUTER AND IMAGE-INTERPRETATION CLASSIFICATION ACCURACY COMPARISON USING AN INTENSIFIED SAMPLE 10 JULY 1975 U A G WO W B WA U 10 3 2 5 2 2 1 A 0 0 0 0 0 0 0 G 0 0 0 0 0 0 0 WO 5 20 22 61 10 3 1 W 5 4 5 2 54 2 3 B 0 0 1 0 1 18 0 WA 11 0 0 0 5 0 116 259 -t 374 69% Table 9 COMPUTER AND IMAGE-INTERPRETATION CLASSIFICATION ACCURACY COMPARISON WITH CLASS ADJUSTMENTS (COMBINING ALL VEGETATION CLASSES-A, G, WO), USING AN INTENSIFIED SAMPLE 10 JULY 1975 U Up W B WA U 10 10 2 2 1 Up 5 103 10 3 1 W 5 11 54 2 3 B 0 1 1 18 0 WA 11 0 5 0 116 301-!-374 80% J-5 Table 10 COMPUTER AND IMAGE-INTERPRETATION CLASS CORRELATION, USING AN INTENSIFIED SAMPLE 2 FEB. 1976 U A G wo w B WA 2 0 0 0 1 20 2 8 3 52 8 8 4 2 16 1 7 3 0 2 1 0 2 0 0 7 0 0 & 2 0 4 27 3 1 0 % 3 6 1 0 25 2 4 5 8 1 0 10 0 0 0 0 0 0 0 0 0 G 3 0 0 0 9 1 56 z 1 0 0 0 10 0 6. A 4 0 0 0 0 0 10 J-6 Table 11 COMPUTER AND IMAGE-INTERPRETATION CLASSIFICATION ACCURACY COMPARISON USING AN INTENSIFIED SAMPLE 2 FEB. 1976 U A G WO W B WA U 16 1 7 3 0 2 1 A 0 0 0 0 0 0 0 G 8 3 52 8- 8 4 2 WO 2 0 4 27 3 1 0 W 9 16 2 0 52 2 10 B 2 0 0 0 1 20 2 WA 7 0 0 0 9 1 66 233 -t 351 66% Table 12 COMPUTER AND IMAGE-INTERPRETATION CLASSIFICATION ACCURACY COMPARISON WITH CLASS ADJUSTMENTS, USING AN INTENSIFIED SAMPLE 2 FEB. 1976 U A Up W B WA U 16 1 10 0 2 2 A 0 0 0 0 0 0 Up 10 3 91 11 5 2 W 9 16 2 52 2 10 B 2 0 0 1 20 2 WA 7 0 0 9 1 66 245 -t 351 =70% J-7 APPENDIX K COST RECORDING FOR THE LANDSAT PROJECT I APPENDIX K COST RECORDING FOR THE LANDSAT PROJECT prepared by Ed Deakin III August 7, 1975 I. Objectives Cost records are to be maintained for product development at each of the test sites. The purpose of maintaining these records will be to assist in the evaluation of cost effectiveness of satellite data- gathering methods. Ii. Types of Records There are two types of records that are to be kept by individual project participants. These are: 1. time allocation records, and 2. equipment usage records. Data from these records will be accumulated by a project accountant. The records which the project accountant will use are: 1. staff cost accumulation sheets, and 2. equipment cost accumulation sheets. Each of these types of records and their use is described below. III. Time Allocation Records Staff members are to maintain an account of the time spent on each task at each site, and for time spent on each step according to the Project Evaluation Review Schedule (PERS). The Time Allocation Record (Exhibit 1) is designed to facilitate this record-keeping. K-1 AGEIICY_ LANDSAT PROJECT EXHIBIT I TIME ALLOCATION RECORD NAMP: Woel; End in,g:_ STAFF LEVEL: P E RS Draft Date:-- NONDAY TUESDAY THURSDAY F R I DAY Site Site S i t'j Site Site - Code Stan Hmr. Task- Code H=s Task Stop Hourd Task Codc. S T, -tf!.P- a I.Code Step.li=--. 'P;, Other: Other: Other: O:her: ITOTAL HOURS. TOTAL HOURS TOTAL HOURS TOTAL HOURS TOTAL HOURS The staff member should fill.in..a new Time Allocation Record each day; as work is performed,, a notation is made on the record. There are two task codes: E for Examining ADP Software, and B for Building a Regional Base. If a staff member is working on examining ADP software for test site 3, and is indexing tapes from EROS (Step 4), the person would enter an E in the Task column, a 3 in the Site Code column, and a 4 in the Step column. Exhibit Ia shows a Time Allocation Record that has been filled in for John Doe, who performed that task on Tuesday of the week ending September 5, 1975. Exhibit Ia also shows sample entries for Monday, a holiday, and for other days of the week. Note that on Friday, this person performed several tasks. The second task, which is labeled B 4 7 2 in the four columns of the form indicates that this person spent two hours on Friday building a regional base at Harbor Island, and during that time the person was involved in "ground truth" interpretation. Time should be kept to within 1/4 of an hour. (Smaller divisions of time are generally more costly than the benefit of increased accuracy obtained). IV. Equipment Usage Records Use of specialized equipment and use of the computer should be recor- ded on Equipment Usage Records (Exhibit II). The recording of com- puter use will be handled by the computer accounting system; thus use of these records by the staff will concentrate on the use of specialized equipment. An Equipment Usage Record form should be kept K-3 AGENCY LANDSAT PROJECT EMIBIT L;j TIME ALLOCATrON RECORD NAME- Week End i-q-. STAFF LEVEL: rER5 D@-aft Date: 7. If 7 MONDAY TUESDAY WEDNESDAY THURSDAY FRIDAY Site Site I Site, Site Site Task Code Step HoLm Task I Codel Step Hain, Task Codo i Step FcLrs Tu c od r:@H@ Ste&!. fl,lg- ,E' @3 3 3 131,1 ,5 @3 /0 3 17 Z/ Other: Other; Other: r= Other: Zz@ TOTAL HO TOTAL HOURS -URS S, -FTOTAL HOURS TOTAL 17OURS TOTAL HOURS LANDSAT PROjECT EXHIBIT EQUIPMENT USAGE RECORD TYPE OF EQUIPNENT: Weo2k Ending:_ 111IRS Draft MONDAY TUESD Y WEDNESDAY TliljlRSDAY FRlD"\Y Site S "' Si"'l Site Cojc 'odc Task SLep firs, Trkl8(,',T, Step tIrs. Task IC.de Step Hrs. Task C Step lfrs. Task, Code Step Ilirs. I--T TOTAL USAGE USAGE TOTAL USAGE WAL USAGE TOTAL USAGE near each piece of equipment. When the equipment is used, the user should record the task, site code, step, and time of use in the appro- priate columns on the Equipment Usage Record (EUR). In many cases, equip- ment use is limited to a few of the steps for each task. The project accountant should be able to compare the times spent on tasks with the equipment usage record to help verify the data contained on the EUR. An example of an EUR for a Richards Light Table is shown in Exhibit Ila. This type of equipment is used to perform Steps 7 and 8 in Task B only. Thus, usage for the equipment should conform fairly closely to the time spent on those tasks. From the example, one can see that the equipment was used on Tuesday and Friday only. Tasks performed on those days were as noted. V. Staff Cost Accumulation Sheets Data from the time allocation records must be transferred to cost records which accumulate costs by task, site, and step. The project accountant will use the cost accumulation forms in order to transfer data from the time allocation records. An example of a Staff Cost Accumulation Sheet is shown in Exhibit III. An example of a filled-in Staff Cost Accumulation Sheet is presented in Exhibit IlIa. The first line shows that a Geologist I spent one hour during the week on Step 1 of Task E at Site 3. The standard rate for a Geologist I is $10.00. This rate is multiplied by the hours K-6 ACI:,'ICY LANDSAT PROJECT EXHIBLT l3a NT USAGE EQUfPML RECORD TYPE OF FOUIPMrNT: PER!; DrIft Date:_____ M' ONDAY TUISS DAY W EDN'T, SE, A Y Tlj7j'@'D,,Y Site S i Site Site Tas, Codc Step Hrs. Tskj(S!,dee Step I-Irs. Task @;ode Step Fl's.1 Tasl: Cocle Step Task Step 7 1 /Z, ,3 TOTAL USAGE LITOTAL USAGE TOTAL USAGE TOTAL USAGE TOTAL USAGE Step IlIrs. 7 To- LANDSAT PROJECT EXHIBIT III COST ACCUMULATION SHEET TASK:-- S A E: CUMULATIVE WEFKLY WEEK ENDING: Staff S td. S ta ff Staff Std. Total For Step Level Hrs. lirs. Total ITevpl Hrs .Hrtds Total ILevel Hrs. Frs. Total This Step LANDSAT PROJECT COST ACCUMULATION SHEET EXHILIT IIIa TASK: STA F F _5 CUMULATIVE WEEKLY WEEK ENDING: Staff Sid Staff Sid. Staff Std: Total For Step Levell Hrs. Hrs: Total evel Hrs. Hrs. Total Lev, H,,. I lirs Total This Stop /O.M /0. C? & /0.00 361-00 G.2 @,Feo 00 3 1 - ;I11Z 5- /0-00 -5-0. 00 G-2 9,00 72-00 5,5 5--.70 /0,40 13-2 -1P '9 0 az F,00 1.4.00 ,:-7 F, 60 3 4, 0 G /0.'m 40-obl .2- J0. oo /6.0D e o ,53 _0 0 - P.60 61) J7 0 worked to obtain the total cost for that staff level during the week. The totals are added across the line, and a total cost for each step is entered in the last column. In Step 4, there are three staff levels which were engaged in perfor- ming this particular step this week. The hours for each and the standard rates for each are used to determine the totals. The three total costs are added together to arrive at a total cost for Step 4 for this week. (Notice that the Geologist II hours can be tied back to Exhibit Ia. The arrow indicates the steps where this can be done.) Step 8 required more than three staff levels. To indicate the contin- uation onto the next line, a diagonal'iline was placed in the "Total for This Step" column, and the additional data were entered in the next line. If a particular task requires more than one Staff Cost Accumulation Sheet, additional sheets can be added, with the notation "Continua- tion" made at the top. Weekly accumulations should be made in order to facilitate reporting. At the time that reports are due, the weekly cost accumulation sheets can be used as a basis for preparing cumulative cost accumulation sheets. The cumulative box would be checked, and the accumulated hours for each of the staff levels would be entered under the appropriate steps. At the end of each task for each site, the cumulative cost accumulation sheet will have the total times spent on each of the steps in that task as well as the total standard costs for that step. K-10 The standard costs to be used for each staff level should be the costs that are expected to occur if the project is operational. These costs would include the employee's hourly rate plus a provision for employee benefits, and other costs related to that employee's time. The staff levels are abbreviated in the cost accumulation forms. The project director should prepare a key to indicate the staff levels associated with each of the abbreviations, as well as thO.:standard rate for each level. VI. Equipment Cost Accumulation Sheets The costs associated with the use of each piece of specialized equip- ment and with the computer should be accumulated on Equipment Cost Accumulation Sheets (Exhibit IV). The process of transferring the data from individual Equipment Usage Records to these sheets is identical to the process for transferring staff time records. The standard rates for- equipment use should be determined based on the expected life of equipment. This can be approximated by taking the expected life of the equipment in years and multiplying it by the expected annual us,age in hours. This "productive hours" life of the equipment is then divided into the equipment cost to arrive at an hourly cost for use of the equipment. For example, if a machine will last for two years, and is used an average of 520 hours per year (or 10 hours per week), it has a productive-hours life of 1040 hours (2 years x 520 hours per year). If the equipment costs $18,560 and can be sold for $4,000 at the end of the second year, then the net equipment cost is $14,560 ($18,560 - $4,000). The hourly rate K-11 LAIDSAT PRDJECT E;XliIBIT IV COST ACCUMULATION SHEET TASK: s rri,: CUMULATIVE WEEKLY WEEIZ ENDING: Equip Std 1Equip Std Equ P std Total For Step Type 1-Irs. Rate Total Type Hrs. Rate Total TvP-2-! lirs. Rate Total This Step would be this $14,560 divided by 1,040 hours' or $14.00 per hour. Computer costs should be assigned to each step based on the records maintained by the computer center. Each job submitted to the center should be coded to indicate the task, site, and step to which the job applies. Standard computer use costs should be based on the computer costs expected to occur under operational conditions. VII. Other Cost Records Certain other costs will be incurred under the project. The most significant of these is likely to be travel costs. The basic docu- ment for these costs will be the travel voucher. These vouchers should be coded with the task, site,and step codes so that the travel costs can be associated with the final cost repnrts. Other costs,such as supplies, should be estimated. In general, these costs will be too small to require detailed record-keeping. Estimates of supplies use should be reported by breaking down the total use for the project to individual steps on an appropriate basis. VIII. Reporting Costs.Ificurred, A report of-costs incurred for each step should be prepared to in- dicate the costs likely to occur in an operational setting. Such a report should list costs for!each of the three major step categories: Data Acquisition, Information Extraction, and Display. Under each of these steps, costs should be shown with the following categories: K-13 Cost Source of Information Staff Staff Cost Accumulation Sheets--Cumulative Equipment Equipment Cost Accumulation Sheets--Cumulative Travel Travel Vouchers--according to codes Other As Estimated The estimates for Other Costs should be documented to provide a means of tracing these costs. K-14 APPENDIX L THE COST-SAVING ANALYSIS IN AN ECONOMIC CONTEXT APPENDIX L. THE COST-SAVING ANALYSIS IN AN ECONOMIC CONTEXT In general, the cost-benefit method of analysis provides a framework by which alternative government investments can be evaluated. Any expendi- ture of monies involves both benefits and costs, the gain from the action and the expense of undertaking it. In a decision-making context, it is not only the direct benefits and costs of an action which should be weighed, but also the benefits and costs of alternatives. It is then the net gain over the alternative or the opportunity cost of a selected course of action which should count. The key to a cost-benefit study, then, is opportunity cost, or the value foregone in deciding to undertake a particular action rather than an alternative. Cost-benefit, then, is examination of choices among the most promisinq alternatives. Selection is forced upon societv because of limited means or, in the context of government, on government agencies because of budget constraints. A cost-saving study is a form of cost-benefit. Any investment of monies which has the objective of reducing costs for a product or service focuses on the costs only. If the benefits and costs of one action are B and C, and the benefits and costs of the closest cost alternative are B2 and C21 since Bl = B21 then C1 - C2is equal to the cost-savings.I This is shown in the diagram below, where price is assumed equal to cost: A P=C 2 2 D E I Q2 Ql Q @eE. shan Benefit Analysis: Praeger Publishers, Washington, HO'li, P. 4COst- BI or total benefits of Action I is OABQ2 and C 1 or total costs is OC 1EQ2' For alternative 2, total benefits (B 2) are OABQ2 and total costs are OC 2BQ 2' Since for Q2 amount of the good, Bl=B 22 then cost-savings reduces to C 2-CI or C1C2BE. This approach assumes that demand, AQ2 is the same for all pro- ducts. In the context of this study, a cost-savings approach assumes decision- makers are indifferent to the type of map they use. Each map supplies him with the information he needs, regardless of the differences in number of classes, mapping technique, etc. The question of choice among maps is dic- tated by the least-cost method of mappi-ng whether it is by computer-assisted processing of digital tapes, image interpretation of Landsat imagery, or conventional mapping using aerial photos. This approach contrasts with one which considers the demand for each map as being slightly different. The demand for a map would depend upon the information content of each map and its impact on the decision-making process. If decision-makers have a desire to make the best decision for the State of Texas, then their demand for a map will be dictated by how they see each map influencing this decision. The benefit-cost equation would be slightly more complicated than that for the cost-saving approach. From the diagram illustrating cost-saving, it can be deduced that this study has several major faults. One is that no account is made for user needs (demands). Table 8 in the text shows the various costs per square mile of information assuming various volumes of users, but these volumes of de- mand are hypothetical only. The comparison of Landsat maps with conventional methods does not establish the demand for either. A survey of potential users would assess each user's current source of information, and the cost of such information, and compare that cost to Landsat products.. The L-2 cost-savings for a Landsat map would be the sum of the savings in cost for each user where each individual savings in cost would be computed as the cost of Landsat maps minus the costs of present information sources. Another fault in the study is that the mapping results from computer and image analysis techniques are based on small areas. One inherent advantage of the computer approach is that once the mapping parameters are established, large areas can be mapped quickly, albeit with possibly different features being mapped than by image-interpretation methods. An additional weakness in com- paring the products from the three mapping approaches is that the maps con- tain different levels of information and different accuracies. Also, the computer methods still can be improved upon, whereas the methods based on more conventional techniques are essentially stable with regard to signi- ficant improvements. These faults in the cost-savings study are, of course, due to difficul- ties and incompatibilities in the sources of information rather than in the cost-savings approach itself. The study compared Landsat-derived maps to conventional mapping by way of illustration; the cost-savings study is an example only. Several other points should be made about the Landsat study. Costs shown would be costs to the state if it should invest. This is not the relevant criterion for a cost-benefit study, however. The costs which should count are costs to the nation. In this study, purchase costs of Land- sat imagery and tapes are counted as the only costs, yet such costs do not count placing a satellite into orbit. The remaining cost which is not cal- culated here amounts to a positive transfer or subsidy to the state from the federal government.2 2See E.J. Mishan, Cost-Benefit Analysis: Praeger Publishers, Washington, D.C., 1971, p. 34. L-3 A cost not treated in Landsat mapping is that of administrative over- head. Some other indirect costs may have been omitted. For the environ- mental geology map, no calculation was made of office overhead or labor costs other than interpretation. Quality.criteria such as accuracy and consistency of classification or timeliness in delivery of products were not compared to costs. Finally, it is assumed that the areas mapped by these various approaches will be completed within the period of a year so that all costs are current costs as much as possible. The present value of money, and therefore dis- counting, was not considered in the computations. L-4 APPENDIX M RAW DATA COSTS FOR LANDSAT MAPS DERIVED FROM IMAGE INTERPRETATION AND COMPUTER-ASSISTED ANALYSIS, WITH ONE TABLE ON LAB13R COSTS FOR THE ENVIRONMENTAL-GEOLOGY MAP APPENDIX M. RAW DATA COSTS FOR LANDSAT MAPS DERIVED FROM IMAGE INTERPRETATION AND COMPUTER-ASSISTED ANALYSIS, WITH ONE TABLE ON LABOR COSTS FOR THE ENVIRONMENTAL- GEOLOGY MAP. These tables contain the cost data: labor costs, equipment costs, office overhead, and miscellaneous costs. Frequently they contain additional data not actually used in the computations in the text. This data is not extraneous, though, for it permits additional computations for parties who are interested in tabulating their own costs. A list of tables is shown below. Tables Title 1. Labor Costs of Image Interpretation: staff salary and fringe benefits by staff level, per hour. 2. Labor Costs of Computer-Assisted Analysis: staff salary and fringe benefits by staff level, per hour. 3. Labor Costs -for Bureau of Economic Geology South Texas Mapping Project: staff salary and fringe benefits by staff level, per hour. 4. Labor Costs of Image Interpretation: data interpretatio ,n, staff time by steps in interpretation, representative time for a scene -in site four. 5. Labor Costs of Image Interpretation: data interpretation, staff time by activities other than interpretation, time for site four. 6. Labor Costs of Computer-Assisted Analysis: data interpre- tation, staff time by steps, representative time for a scene in site four. 7. Labor Costs of Data Acquisition: staff time by agency. 8. Equipment Costs: special equipment in image interpretation. 9. Equipment Costs: special equipment in computer-assisted analysis. M-1 Tables-(con't) Title 10. Equipment Costs: data acquisition, special equipment in ordering imagery. 11. Equipment Costs: image interpretation, total hours of use of special equipment for a representative scene in site four. 12. Equipment Costs: computer-assisted analysis, total hours of use of special equipment for scene 3 of site four. 13. Equipment Costs: computer-assisted analysis, total minutes of computer time by program, representative operational run and estimates for supplemental and error runs. 14. Equipment Costs: data acquisition, total hours of use of special equipment in ordering imagery. 15. Office Overhead: image interpretation, costs of office equipment. 16. Office Overhead: computer-assisted analysis, costs of office equipment. 17. Office Overhead: costs of office space and materials. 18. Miscellaneous Costs: data. 19. Miscellaneous Costs: image interpretation and computer- assisted analysis, travel costs for site four. 20. Miscellaneous Costs: image interpretation, scene and scribe coat enlargement. 21. Miscellaneous Costs: image interpretation, drafting costs for one scene. 22. Miscellaneous Costs: data display, commercial costs of hard copy color display for computer-assisted analysis and image interpretation maps. .M-2 Table 1. Labor Costs of Image Interpretation: Staff Salary and Fringe Benefits by Staff Level, Per Hour (1976-77). JOB CLASSIFICATION RANGE AVERAGE SALARY HOURLY FRINGE TOTAL PER PER FOR RATE BENEFITS REMUNERATION MONTH MONTH YEAR PER HOUR Research Scientist Associate IV $1465-1852 $1659 $19,908 $9.58 $1.13 $10.71 Research Scientist Assistant 11 804-1014 909 10,908 5.25 .73 5.98 Research Scientist Assistant 1 680-888 784 9,408 4.53 .64 5.27 Technical Staff Assistant 111 538-727 633 7,596 3.65 .54 4.19 Secretary 595-804 661 7,932 3.82 .56 4.38 Explanation of Positions: Research Scientist Associate IV is the Senior Image Interpretor. Research Scientist Assistant (I or 11) is an assistant who mainly performs steps 5 and 7. Technical Staff Assistant III is a draftsman-. Fringe Benefits OASI (Social Security) = 5.85% of first $ 15,300. UCI (Unemployment Compensation) = .2111 of first @ 4,200. WCI (Workmen's Compensation) = .35% of total salary. Premium Sharing = $ 15 per month for full-time employees, only half-time to full-time employees are eligible and the $ 15 per month is prorated on hours of work. Matching Retirement = 6.0% of the first $ 25,000 and only those working half-time or more are eligible.. 1 The hourly rate is figured as 173.2 hours per month. Source: Gary Otting, Univ. of Texas Personnel. M- 3 Table 2. Labor Costs of Computer-Assisted Analysis: Staff Salary and Fringe Benefits by Staff Level, Per Hou-r. Job ClIassi- Range Per Average Salary Forl Hourly Fringe To+_--.0 ficatii Month IPer Monthl Year Rate Benefitsl R Per Hour P@- -iour System Analyst 1, $1302-1639 $1470.50 $17,646 $ 8.49 $ 1.19 $ 9.68 Group 16 Engineering Tech- 1141-1437 1289.00 15,468 7.44 1.11 8.55 nician IV, Group 14 Clerk Typist 11, 590-743 666.50 7,998 3.85 .62 4.47 Group 4 Explanation of positions: A system analyst served to adapt programs to LANDSAT needs during the investigation. Actual analysis by computer was carried on by an engineering technician. Fringe Benefits: OASI (Social Security) 5.85% of first $15,300. 2 UCI (Unemployment Compensation) = .210. of first $4,209. WC1 (Workman's Compensation) = .35% of total salary. Premium sharing for health insurance = S15 maximum per month Matching Retirement = 7.5% of gross salary. The hourly rate is figured as 173.2 hours per month. 2The state pays both unemployment and workman's compensation from general funds. In order to obtain a comparable opportunity cost, the rate for the university system (paid by employers) was used. These rates are computed on experience for professionals and therefore should be comparable to professionals in state government. Source: Phyllis Snyder, Office of Personnel, General Land Office and Bill Monks, Texas Employment Commission. M-4 Table 3. Labor Costs for Bureau of Economic Geology South Texas Mapping Project: Staff Salary and Fringe Benefits by Staff Level, Per Hour (1976-77). - Job Classification Range Per Average Salary For Hourly, Fringe Total Remun- Mon Lh Per Morith Year, Rate Derlu@'i 'tS e-r@d-tiOn Per Per 0ur Hour Research Scientist $1370- $1523 $18,276 '@)8.79 $1.08 $ 9.87 Associate 111 1675 Research Scientist 1465- 1659 19,908 9.58 1.13 10.71 Associate IV 1852 Senior Cartographer 1281- 1451 17,412 8.38 1.05 9.43 1620 Research Scientist 680- 784 9,408 4,53 .64 5.27 Assistant 1 888 Fringe Benefits: OASI (Social Security) = 5.85% of first $15,300. UCI (Unemployment Compensation) = .2% of first S4,200. WCI (Workmans Compensation) = .35% of total salary. Premium Sharing = $15 per month for full-time employees, only half-time to full-time employees are eligible and the $15 is prorated on hours of work. Matching Retirement = 6.0% of the first $25,000 and only those working half- time or more are eligible. 1Hourly rate figured on average hours per month of 173.2. Source: Gary Otting, University of Texas Personnel. M-5 Table 4. Labor Costs of Image Interpretation: Data Interpretation, Staff Time By Steps in Interpretation, Representative Time for a Scene in Site Four. Step Description Hours of Time for Research Hours of Time for Research Representative, Total Time Scientist Associate IV Scientist Assistant I In Hours, Associate IV and Assistant I Scene Represen- Scene Represen- tative tative 2 3 1 2 3 1 Review aerial photoqraphy, Coastal 7 1/4 5 1/4 6 6 1/4A 6 1/4 Atlas Maps, and published tide and weather data for test site and image data 2 Take a preliminary field trip to 4 4 4 8A 4 4 4 8A' 16 become generally acquainted with test site 3 Complete line boundary map of test 13 3/4 12 3/4 12 12 3/4A 1/2 0 12 3/4 1site area 4 Classify features according to the 15 1/2 3 3/4 61/4 6 1/2 6 1/2 modified Anderson system 5 Study supportive data in detail, re- view results, field check and corre- late with biological data 6 Document results for scene, especially problems and uniqje aspects of imagery 7 Produced correcteJ image interpretatio at 1:125,000 and overlays at 1:24,000 8 Oualitative analysis of classification products to evaluite accuracy 9 Evaluate format and content of result- 1 1 1/2 3/4A 5 1/2 5 7 5 3/4 6 1/2 Ing map 10 Evaluate image Interpretation of 3 A scene In conjunction with other scenes Total so 81 1/2 131 1/2 Footnotes: Averane time,.taken as representative time, is denoted by an "A". All other representative times were adjusted, Scene 4 of site 4 was not completed. r Source: Robert Finley and Robert Baumoardne Bureau of Economic Geoloqv. Table 5. Other Labor Costs of Image Interpretation: Data Interpretation, Staff Time by Activities Other Than Interpretation. Hours of Time Description of Activity Research Scientist Research Scientist Grand Total Associate IV Assistant I (1) Meetings 18 3/4 18 3/4 (2) Research (Literature 18 3/4 18 3/4 and Design) (3) Quarterly Report 18 1/4 18 1/4 (4) Other 20 1/4 20 1/4 Total, Labor Cost other than Inter- pretation 76 76 (5) Turbidity Study 53 3/4 24 77 3/4 (6) Change Detection in 2 112 4 6 112 Spoil Area Study Total, Special Studies 56 1/4 28 84 1/4 M-7 Table 6. Labor Costs Of Computer-Assisted Analysis: Data Interpretation, Staff Time By Steps, Re- presentative Time For A Scene In Site Four Step Description With Computer Proaram Packaae Representative Hnurs Averaqe For In Parentheses For Four Scenes Scenes 3 & 4 1 Select LANDSAT scene and determine data tapes ID number (ERTSIDC) 1.5 .5 2 Examine available imagery l.OA .5 3 Merge data tapes @r duplicate tapes if necessary (MERGE) 3.0 1.0 4 Estimate scan line and sample numbers A for the area of interest 2.5 2.5 5 Generate grayscale maps of the area A (GRAYMAP /P I COUT) 7.0 9.0 6 Obtain meteorological data 3 .5 .5 7 Participate in orientation field trip4 8.0 8.0 8 Establish control network (COEF) 13.0A 12.0 9 Classify water (DAM) 8.0 6.0 10 Cluster all training areas within the A scene (ISOCLS) 5.0 4.5 11 Examine class statistics 2 1 3.0 1 4.0 12 Refine a training class if indicated by step 105 - - 13 Use class statistics to build the look- 5.OA up table (ELLTAB Table) 3.0 14 Combine classes for display purposes (HGROUP) 7.0 4.0 15 Classify the area (ELLTAB CLASSIFY) 9.OA 8.0 16 Register and display the classified A results (REGISTER) 19.0 14.0 17 Outline or color code homogeneous N areas 1.0 1.0 M-8 Table 6. Labor Costs of Computer-Assisted Analysis (Con't) Step Description With Computer Program Package Representative Hours Average For In Parentheses For Four Scenes Scenes 3 & 4 18 rxamine the classificatinn. map Anti field check 24.0 24.0" 19 Stop if satisfied with results 20 Retrain on unclassified or poorly separated areas (ISOCLS) 2.0 2.0 other programs, COMPUter name 18 Correlation 7.0 3.5 Mr. Clean 3.0A 2.5 Change Detection 1.0 1.0 Total 130.5 111.5 AAverage used for all four scenes to the nearest half hour. NIn step 18, 8 hours was used as typical analysis time and 16 hours for a field trip. In step 17, each map would normally be color coded. 1Staff is an Engineering Technician IV, Group 14. 2Scene four was contained on one computer compatible tape. In certain cases, it is a non-representative scene and therefore the times on the first three scenes prove more representative. 3Thirty minutes is the maximum time for all four scenes. 4An orientation field trip is necessary for at least one scene. The twenty-four hours could be taken as a figure for one scene or, alternatively, as sufficient for all four scenes. 5Step 12 proved to be a bogus case, for more than one class had to be refined in every case. Source: Bill Hupp, Texas Natural Resources Information System, The Interpre- ter for Site Four. M-9 Table 7. Labor Costs Of Data Acquisition: Staff Time By Agency. Step No. Responsible Agency Estimated Time (Hours) 1 TWDB 1.0 2 BEG 2.0 3 TWDB 0.5 4 GLO 0.5 5 TWDB 1.0 6 TWDB 2.0 7 TWDB 0.5 8 TWDB 0.5 TOTAL 8.5 1Steps are described in the LANDSAT Quarterly Report of December, 1975. These steps could be expected for data acquisition for any site, given four scenes as a data acquisition package for the site. M-10 Table 8. Equipment Costs: Special Equipment in Image Interpretation. Name Description or Model No. Producing Costs as of I Life Other Company July 1 , 1976 Richards Light Table with Zoom Transfer Scope: --Zoom ransfer ScopeTM Wide 5ase 21`4 Bausch and Lomb $4,975.00 Scientific Optical Instruments Rochester, N.Y. 146021 __r%lchards Light Table MM_ Richards $4,950.00 231100 1545 Spring Hill Rd. McLean, Va. 22101 TOTAL $9,925.00 5 Richards Light Tal with Z20M Transfei Scope: --Richards Table 11'able Only MM Richards (See Above) $8,785.00 475100 --Light Source 2500 Ft. L-3500 K $ 50.00 --Reel Bracket Motorized 1000 Ft. $1,490.00 240 Stereo.- 240R/ Bausch and Lomb $4,770.00 scope 15AE TOTAL $15,095.00 5 1All costs are retail. .2 This system is comparable to an FMA (Photointerpreters Station) used in the LANDSAT Project owned by the Bureau of Economic Geology and obtained from the U.S. Air Force, Ogden, Utah. Source: Neil P. Yingling, Bausch and Lomb Salesman, Dayton, Ohio. Table 9. Equipment Costs: Special Equipment in romputer-Assisted Analysis. Name Model No. Company - - - Costs Life Keyboard Printer Terminal AJ832-30 Anderson Jacobson, Inc. $4,490.00 5 1065 Morse Ave. Sunnyville, Calif. Teleterm Printer C011132 Computer Devices, Inc. $3,900.00 5 9 Ray Ave. Burlington, Mass. Film Viewing Table, GFL3040 Richards Corp., Inc. $2,755.00 5 Photographic Inter- 1545 Springhill Rd. pretation McLean, Va. Microfiche Reader "Realist Realist, Inc. $ 466.00 5 3335," N. 93, W. 16288 Regal Vanguard Dr. Series Menomonee, Wis. Motorized Reader/Printer 400M 3M Business Products $2,582.21 5 Sales 1948 S. Interregional Austin, Texas 78104 1Retail equipment costs as of July, 1976 include transportation. Source: O.T. Greer, Texas Water Development Board. M-12 Table 10. Equipment Costs: Data Acquisition, Special Equipment In Ordering Imagery. Name IDesc ription or Other Catalogue No. Company Retail Costl Life Recordak Unit: Total Access Files to LANDSAT Eastman Kodak, Co. $9,434.20 5 Photography 610 Gray St. Houston, Texas 77702 1. Recordak Microstar With Printer Adapter Reader 150 1683 2,565.65 5 2. Lens 21-28x Lens Kit 103 0477 121.25 5 3. Recordak Microstar Zoom Kit 103 0535 266.75 5 4. Recordak Image Con- 130 7362 4,462.00 5 trol Board, Interface 5. Recordak Printer 141 3343 1,421.05 5 6. Recordak 11 inch 141 1776 130.90 5 Print Platen 7. Recordak Retrieval Station Console 150 1444 252.20 5 Side Shelf 150 1469 63.05 5 Front Shelf 150 1485 63.05 5 S. Recordak Access Files Module Type 16-60 150 1527 36.90 5 Base and Top 150 1568 if 51.40 5 Table 11. Equipment Costs: Image Interpretation, Total Hours of Use of Special Equipment for a Repre- sentative Scene in Site Four. Equipment Hours Steps Zoom Transfer Scope 19 1/4 Total time of associate on steps 3 and 4 Richards Light Table (MIM-231100) 6 3/4 1/4 of total time for assistant on step 5 Richards Light Table (MIM-475100) 1 Total time of assistant on step 9 M-14 Table 12. Equipment Costs: Computer-Assisted Analysis, Total Hours Of Use Of'Special Equipment For Scene 3 of Site Four. Remote Computer Terminal: Step Hours Date 8 7 August 8 8 7 August 10 Total 8 14 The remote computer terminal can be treated as either of two nieces of equipment, the keyboard printer terminal or the teleterm printer. See the table on special equipment in computer-assisted analysis. M-15 Table 13. Equipments Costs: Computer-Assisted, Total Minutes of Computer Time by Program, Representa- tive Operational Run and Estimates for Supplemental and Error Runs. OPERATING RUN: PROGRAM STEP COMPUTER TIME IN MINUTES ERTSIDC 1 1.15 MERGE 3 1.21 GRAYMAP 5 3.16A PICOUT 5 11.30A COEF 8 NIL A DAM 9 8.62 ISOCLS 10 7.00 ELLTAB TABLE 13 8.66A A HGROUP 14 .15 ELLTAB CLASSIFY 15 8.86 A SCALE REGISTER 16 39.30 A MR. CLEAN N.A. 2.88 CHANGE DETECTION N.A. 2.28 TOTAL OPERATING 94.57 SUPPLEMENTAL 30.00 ERROR 15.00 -GRAND TOTAL 139.57 A Average for scenes 3 and 4. Source: Bill Hupp, Texas Natural Resources Information System. M-16 Table 14. Equipment Costs: Data Acquisition, Total Hours of Use of Special Equipment in Ordering Imagery. 1 2 Step In Data Description of Step Hours Of Equipment Percent Use Life In Acquisition Use Attributable Years to LANDIAT Project 1 Obtain current LANDSAT Accessions .5 Keyboard 100.0 5 List From EROS Data Center Printer I Terminal 2 Select LANDSAT Imagery According .5 Recordak Unit 10.00 5 3: to Criteria For Cloud Cover and Quality 6 Receive and Index LANDSAT Data .5 Richards Light 100.0 5 Table (Model GFL 3040) 1A full list of steps can be found in the December, 1975 Quarterly Report. 2See tables on equipment for cost and model information. ' 3The Recordak Unit is used by other agencies working through the Texas Natural Resources Information System Library. Source: Sam McCulloch, Texas Natural Resources Information System. Table 14. Equipment Costs: Special Equipment in Ordering Image (Con't) Name Description Cost Per Number of Cost2 Source Cassette Cassette 9. Master EROS File Black and White $15.00 142 $2,130.00 EROS Data Center Cassette Sioux Falls, S.D. Color Cassette 40.00 96 3,840.00 10. Duplizate File Black and White 6.60 142 937.20 Cassette Color Cassette 26.00 96 2,496.00 00 1Cost of Recordak equipment from July 1, 1976 to June 30, 1977. 2Purchase cost of microfilm files to TWDB. Source: Recordak Unit, O.T. Greer, Texas Water Development Board; microfilm cassettes, Sam McCulloch, Texas Natural Resources Information System. Table 15. Costs of.Office Equipment IIii Imz@ge Interpretation, Site Four Room, 15 Ft. by 20 Ft., Housing Senior Interpreter and Assistant: Pescription of Furniture lCost Per Item lNo. of Item Metal Executive Desk $200.00 1 $200.00 Metal Executive Chair 60.00 1 60.00 Work Tables (30 inches by 50 inches) 100.00 1 100.00 Tracing Table (24 inches by 36 inches) 150.00 1 150.00 Drafting Stool 65.00 1 65.00 Chalk Board (4 Ft. by 6 Ft.) 150.00 1 150.00 Bulletin Board (4 Ft. by 6 Ft.) 75.00 2 150.00 Guest Arm Chairs 60.00 2 120.00 TOTAL $995.00 .Room, 10 Ft. By 12 Ft., Housing Secretary Description of Furniture )Cost Per Item No. of Items Total Secretarial Desk (Metal) $220.00 1 $220.00 Secretarial Chair (Metal) 60.00 1 60.00 Five Drawer File Cabinet 125.00 1 Typewriter TOTAL 694.00 1 $1,099.00 This list of equipment approximates current costs (August, 1976) of replacing equipment used by personnel at the Bureau of Economic Geology during the Landsat Project, excluding drafting personnel. The room, housing the senior interpreter and assistant (Research Scientist Associate IV and Research Scientist Assistant), is approximate size for the area of a room shared with personnel working on other BEG projects. A standard room size was used for all secretaries working on the project. Source: Ruth King, Planning Program, General Land Office. M-19 Table 16. Office Overhead: Computer-Assisted Analysis, Costs of Office Equipment. Room, 15 Ft. by 20 Ft., Housing Two Engineering Technicians, the Computer Operator And Supervisor Description of Furniture Cost Per Item No. of Items Total Metal Executive Desk $200.00 2 $400.00 Metal Executive Chairs 50.00 2 100.00 Five Drawer Map File (with Bases and Caps) 350.00 2 700.00 Metal Work Table 100.00 2 200.00 Guest Arm Chair 60.00 2 120.00 Metal Bookcase (Five-Shelf) 125.00 1 125.00 Five-Drawer File Cabinet 125.00 1 125.00 Film Cannister Rack (Wood, Sixty Bin) 150.00 1 150.00 Wood Bookcase (36 inches by 84 135.00 1 135.00 inches) TOTAL $2055.00 Room, 8 Ft. by 10 Ft., Housing System Analyst Descrigtion Cost Per Item No. of Items Total Metal Executive Desk $200.00 1 $200.00 Metal Executive Chair 50.00 1 50.00 Wood Bookcase (36 in. by 84 in.) 135.00 1 135.00 Five-Drawer File Cabinet 125.00 1 125.00 TOTAL $510.00 M-20 Table 16. Office Overhead: Computer-Assistedl Analysis, Costs of Office Equipment Room, 10 Ft. by 12 Ft., Housing Secretary: Description of Furniture Cost Per Item No. of Items Total Secretarial Desk (Metal) $220.00 1 $220.00 Secretarial Chair (Metal) 60.00 1 60.00 Five-Drawer File Cabinet 125.00 1 125.00 Typewriter 694.00 1 694.00 TOTAL $1099.00 iThis list of equipment approximates current costs (August, 1976) of replac- ing equipment used by personnel at the Texas Natural Resources Information System during the Landsat Project. It excludes housing of the computer and computer support staff and housing of the recordak equipment. The room size approximates true size with the exception of the room housing the secretary. A standard room size was used for all secretaries working on the project. Source: Ruth King, Planning Program, General Land Office. M-21 Table 17. Office Overhead: Image Interpretation-and Computer-Assisted Analysis, Costs of Office Space and Materials. Office Space: Rental at $ .31 per month per square foot.. Office Supplies: Initial cost @ $90.00 per office Monthly cost @ $25.00 per office $90.00 2078.4 (hours in a year) $ .043/hour $25.00 173.2 (hours in a month) .144/hour $ .187/hour Source: Ruth King, Planning Program, General Land Office. M-22 Table 18. Miscellaneous Costs: Data Acquisition, LANDSAT Imagery and Digital Tapes for One Site. Cost of Imagery for Image Interpretatio Item Scale Band Cost 1) Color Transparency 1:1,000,000 Composite $12.00 2) Black and White Print 1:250,000 5 15.00 3) Black and White Positive 1:1,000,000 4 5.00 Transparency 5 5.00 6 5.00 7 5.00 4) Black and White Negative 1:1,000,000 5 6.00 Transparency 7 6.00 TOTAL $74.00 5) Color Master 50.00 Total Cost Per Scene $124.00 Tin Four Scenes For Site = TOTAL $496.00 Cost of Digital Tapes for Computer-Assisted Analysis Interpretation 1) Black and White Positive Transparency, with Scale 1:1,000,000 and Band 7 $ 5.00 2) Nine Track Tape Set, BPI = 1600 200.00 $205.00 Times Four Scenes Per Site 1 TOTAL $820.00 M-23 Table 18. Miscellaneous Costs (Con't) 1A typical order for one site is four scenes. The primary scenes are the latest imagery on winter scenes. Two are chosen, a year apart. A summer scene is then chosen between the two winter scenes. A final scene is one some years back, giving historical perspective. For example, for site 4, the four scene dates were: February 25, 1975; February 2, 1976; July 10, 1975; December 16, 1972. Source: Order form for site four, with purchase costs dated March 6, 1976. M-24 Table 19. Travel Costs For One Site. Preliminary Field Trip: Computer-Assisted Analysis Interpretation (Step 7) Technician and assistant @ 2 1/2 days @ per them of $22.00 = $110.00 Image Interpretation (Step 2) Senior Interpreter and Assistant @ 2 112 days @ per them of $22.00 = $110.00 Field Check: Computer-Assisted Analysis Interpretation (Step 18) Technician and Assistant @ 2 112 days @ per them of $22.00 = $110.00 Image Interpretation (Step 5) Senior Interpreter and Assistant @ 2 112 days @ per them of $22.00 = $110.00 The travel time could cover a larger area than the 200 square miles re- presenting one site. The degree of homogeneity of the area would influence the amount of traveling necessary. M-25 Table 20. Miscellaneous Cost: Scene and Scrib? Coat Enlargement In Image Interpretation. Enlarge portion of LANDSAT scene to scale of 1:125,000 from scale of 1:1,000,000 $2.80/Scene Enlarge scribe coat sheet to scale of 1:24,000 from scale of 1:125,000 $23.00/Scene Total $25,80/Scene iThis work is done by the Automation Division (D-19), Texas Department of Public Highways and Transportation. Source: Bill Hupp, Texas Natural Resources Information System. M-26 Table 21. Miscellaneous Costs: Image Interpretation, Drafting Costs for One Scene. Step Labor Materials Print Total 3 $56.58 $6.00 $62.58 7 9.43 $6.00 15.43 $78.01 M-27 Table 22. Miscellaneous Costs: Data Display, Conmiercial Costs of Hard Copy, Color Display for Computer-Assisted Analysis and Image Interpretation Maps. Number of Classes Approximate Area Scale Approximate Number of Cost Identified In Square Mile Map Size Copies (1) 23 200 1:125,000 10"XIO" 1 560.00 Computer- (2) 23 200 1 :125,000 10"XIO" 100 682.00 Assisted Analysis Map (3) 23 200 1 :125,000 10"XIO" 2,400 1,248.00 (4) 23 4,000 1 :125,000 42 " x4l " 1 1,750.00 T (5) 23 4,000 1 :125,000 42"x4l " 100 3,113.00 r'.) 00 (6) 23 4,000 1:125,000 42"x4l " 3,500 3,859.00 (1) 23 200 1 :125,000 1 O"xl 0" 1 208.00 (2) 23 200 1 :125,000 10"X10" 100 330.00 Image In- (3) 23 200 1 :125,000 10"xl 0" 2,400 896.00 terpretation (4) 23 4,000 1:125,000 38 " x42 1 1,772.00 Map (5) 23 4,000 1 :125,000 38"W" 100 3,135.00 (6) 23 4,000 1 :125,000 38"W" 2,500 3,981.00 Source: Seiscom Delta, Inc., Houston, Texas. COA INIF E IN T E R DATE DUE GAYLORDINo. 2333 PRMTED IN U S A III @111101111111 111111 3 6668 14106 7241