
Unmanned: Drones, Data, and the Illusion of Perfect Warfare


by William M. Arkin


  8. Volkan Tas, Optimal Use of TDOA Geo-Location Techniques Within the Mountainous Terrain of Turkey, Naval Postgraduate School, September 2012; Myrna B. Montminy, Captain, USAF; “Passive Geolocation of Low-Power Emitters in Urban Environments Using TDOA,” Air Force Institute of Technology, AFIT/GE/ENG/07-16, March 2007.

  9. My Share of the Task, pp. 144–145.

  10. Though most sources, including McChrystal in My Share of the Task, refer to TF 714, there is a mind-boggling list of other secret organizations that were involved: Task Force 145 North, Task Force 170, the SOCOM J-239 Information Operations (IO) shop, and Project Sovereign Challenge.

  11. Top Secret America, pp. 221–255.

  12. PowerPoint Briefing, “Operational ISR in the CENTCOM AOR,” Colonel Terri Meyer, USCENTAF/A2, Shaw AFB, South Carolina, Al Udeid AB, n.d. (December 2004); obtained by the author.

  13. My Share of the Task, p. 155.

  14. PowerPoint Briefing, “Operational ISR in the CENTCOM AOR,” Colonel Terri Meyer, USCENTAF/A2, Shaw AFB, South Carolina, Al Udeid AB, n.d. (December 2004); obtained by the author.

  15. My Share of the Task, p. 154.

  16. My Share of the Task, p. 157.

  17. My Share of the Task, p. 149.

  18. My Share of the Task, p. 156.

  19. Jeremy Scahill and Glenn Greenwald, “The NSA’s Secret Role in the U.S. Assassination Program,” The Intercept (website), February 10, 2014, 12:03 AM EDT; https://firstlook.org/theintercept/article/2014/02/10/the-nsas-secret-role/ (accessed July 9, 2014).

  20. Top Secret America, p. 242.

  21. Jeremy Scahill and Glenn Greenwald, “The NSA’s Secret Role in the U.S. Assassination Program,” The Intercept (website), February 10, 2014, 12:03 AM EDT; https://firstlook.org/theintercept/article/2014/02/10/the-nsas-secret-role/ (accessed July 9, 2014).

  22. PowerPoint Briefing, USAF FMV Needs, Initiatives, & Requirements, Robert T. “Bo” Marlin, DISL; Deputy Director, ISR Capabilities (AF/A2C), May 22, 2013; obtained by the author.

  23. Rick Atkinson, “Left of Boom; ‘If You Don’t Go After the Network, You’re Never Going to Stop These Guys. Never,’” Washington Post, October 3, 2007, p. A1 (Part 4 in a series).

  24. See On Point II: Transition to the New Campaign: The United States Army in Operation Iraqi Freedom, May 2003–January 2005, p. 191; Michael T. Flynn, Rich Juergens, and Thomas L. Cantrell, “Employing ISR; SOF Best Practices,” Joint Forces Quarterly (JFQ), issue 50, 3rd quarter 2008.

  25. My Share of the Task, p. 156.

  26. Michael T. Flynn, Rich Juergens, and Thomas L. Cantrell, “Employing ISR; SOF Best Practices,” Joint Forces Quarterly (JFQ), issue 50, 3rd quarter 2008.

  27. Rebecca Grant, “Iraqi Freedom and the Air Force,” Air Force Magazine, March 2013.

  28. Stew Magnuson, “Military ‘Swimming in Sensors and Drowning in Data,’” National Defense, January 2010.

  29. Michael W. Isherwood, “Roadmap for Robotics; USAF expects unmanned aircraft to play a huge role in future warfare,” Air Force Magazine, December 2009; Sean D. Naylor, “Inside the Zarqawi Takedown; Persistent Surveillance Helps End 3-Year Manhunt,” Defense News, June 12, 2006, p. 1.

  30. DOD PowerPoint Briefing, Department of Defense Sustainability, n.d. (November 2009).

  31. PowerPoint Briefing, “Persistent and Evolving Threats,” Deputy Chief of Staff, G-2 LTG John F. Kimmons, MICA Luncheon, Fort Huachuca, Arizona, October 31, 2005; obtained by the author.

  32. My Share of the Task, p. 165.

  33. My Share of the Task, p. 153.

  CHAPTER FIFTEEN Beyond the Speed of War

  1. The Signatures Support Program (SSP), under the purview of the Defense Intelligence Agency, was previously called the National Signatures Program (NSP). See Chadwick T. Hawley, “Signatures Support Program,” in Atmospheric Propagation VI, edited by Linda M. Wasiczko Thomas and G. Charmaine Gilbreath, Proc. of SPIE (2009), Vol. 7324-17.

  2. Chadwick T. Hawley, “Signatures Support Program,” in Atmospheric Propagation VI, edited by Linda M. Wasiczko Thomas and G. Charmaine Gilbreath, Proc. of SPIE (2009), Vol. 7324-17.

  “The Signatures Support Program (SSP) leverages the full spectrum of signature-related activities (collections, processing, development, storage, maintenance, and dissemination) within the Department of Defense (DOD), the intelligence community (IC), other Federal agencies, and civil institutions. The enterprise encompasses acoustic, seismic, radio frequency, infrared, radar, nuclear radiation, and electro-optical signatures. The SSP serves the war fighter, the IC, and civil institutions by supporting military operations, intelligence operations, homeland defense, disaster relief, acquisitions, and research and development. Data centers host and maintain signature holdings, collectively forming the national signatures pool. The geographically distributed organizations are the authoritative sources and repositories for signature data; the centers are responsible for data content and quality. The SSP proactively engages DOD, IC, other Federal entities, academia, and industry to locate signatures for inclusion in the distributed national signatures pool and provides world-wide 24/7 access via the SSP application.”

  3. Matthew Edward Fay, Major, United States Marine Corps; An Analysis of Hyperspectral Imagery Data Collected During Operation Desert Radiance, Naval Postgraduate School, June 1995, p. 5.

  4. MASINT is defined as “Intelligence obtained by quantitative and qualitative analysis of data (metric, angle, spatial, wavelength, time dependence, modulation, plasma, and hydromagnetic) derived from specific technical sensors for the purpose of identifying any distinctive features associated with the emitter or sender, and to facilitate subsequent identification and/or measurement of the same. The detected feature may be either reflected or emitted.”

  MASINT is information derived from measurements of physical phenomena intrinsic to an object or event. Measurements/signatures resulting from the shift, change, vibration, fluctuation, existence of, or the lack of any of these states for a given phenomenon:

  • Electro-Optical (examples: infrared, laser, spectral),

  • Radar,

  • Polarimetric,

  • High-Power or Unintentional Radio Frequency Emanations,

  • Geo-Physical (examples: seismic, acoustic, magnetic, gravimetric, infrasonic),

  • Chemical,

  • Biological,

  • Nuclear, and

  • Biometrics (relies on unique signatures of human beings).

  See PowerPoint Briefing, National MASINT Management Office (NMMO), MASINT/Common Sensor COI, April 2009; obtained by the author.

  5. “Non-literal exploitation is the analysis of measurable, quantifiable, repeatable data collected by remotely-located sensors to produce information of intelligence value that cannot be interpreted by the human eye and cognitive system.” See NGA.IP.0006_1.0 2011-07-27 Implementation Profile for Tactical Hyperspectral Imagery (HSI) Systems, 2011, pp. 16–17.

  “A hybrid definition of nonliteral imagery exploitation can be found in Joint Pub 1-02 [the official military dictionary]…. The process of extracting non-spatial information from image data, automatically or semi-automatically, using non-traditional, advanced processing techniques, employing models, measurements, signatures (spectral, textural, temporal, polarization), or other features to detect, locate, classify, discriminate, characterize, identify (material, unit, function), quantify (material, time, physical), track, predict, target, or assess objects, emissions, activities, or events represented in the imagery.” See Matthew Edward Fay, Major, United States Marine Corps; An Analysis of Hyperspectral Imagery Data Collected During Operation Desert Radiance, Naval Postgraduate School, June 1995, p. 2.

  6. Spatial resolution is the smallest distance between two ground points such that both points can be resolved by the sensor. Spectral resolution is the number and dimension (size) of specific wavelength intervals in the electromagnetic spectrum to which a remote sensing instrument is sensitive. See Christopher Burt, Detection of Spatially Unresolved (Nominally Sub-Pixel) Submerged and Surface Targets Using Hyperspectral Data, Naval Postgraduate School, September 2012, p. 15.

  7. A pixel is defined as: “The atomic element of an image having a discrete value. Although a pixel value represents a minute area of an image, the generic use of the term does not specify the exact shape or symmetry of the area (circle, oval, square, rectangle, other) represented by the value.” Information received from NGA, 2014.

  Pixel size is a direct indicator of the spatial resolution of the sensor because pixels are the smallest elements that can be detected by the sensor. Spatial resolution is a measure of the smallest angular or linear separation between two objects that can be resolved by the sensor. More simply put, it is the smallest separation at which two objects on the ground can be detected as separate objects.

  8. The electromagnetic spectrum extends from the short-wave cosmic ray region to the long-wave TV and radio-wave region and includes, among others: gamma rays, X-rays, ultraviolet, visible, near infrared, thermal infrared, microwave, and radio waves (TV and radio bands). The wavelengths of visible light range from 400 to 700 nanometers (nm), near-infrared wavelengths range from 700 to 1100 nm, and short-wave infrared wavelengths range from 1400 to about 3500 nm.
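  The wavelength boundaries in this note can be captured in a minimal sketch (Python, used here purely for illustration; the band names and cutoffs are exactly those given above, and the 1100–1400 nm gap between near infrared and short-wave infrared is left unclassified):

```python
def spectral_region(wavelength_nm):
    """Map a wavelength in nanometers to the regions named in the note."""
    if 400 <= wavelength_nm < 700:
        return "visible"
    if 700 <= wavelength_nm <= 1100:
        return "near infrared"
    if 1400 <= wavelength_nm <= 3500:
        return "short-wave infrared"
    return "outside the ranges given here"

# spectral_region(550)  -> "visible"
# spectral_region(900)  -> "near infrared"
# spectral_region(2000) -> "short-wave infrared"
```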

  9. Henry Canaday, “Seeing More with Hyperspectral Imaging,” Geospatial Intelligence Forum (GIF) 11.2, p. 21.

  10. Paul J. Pabich, Lieutenant Colonel, USAF; Hyperspectral Imagery: Warfighting Through a Different Set of Eyes, Occasional Paper No. 31, Center for Strategy and Technology, Air War College, October 2002, p. 4.

  11. A number of multispectral sensors were developed, particularly by NASA, but it was not until the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) was flown aboard a specially configured U-2 in 1989 that hyperspectral information extraction was achieved, with AVIRIS imaging 224 contiguous spectral bands at a resolution of 10 nanometers. See Matthew Edward Fay, Major, United States Marine Corps; An Analysis of Hyperspectral Imagery Data Collected During Operation Desert Radiance, Naval Postgraduate School, June 1995, pp. 9–10.

  12. Spectral imaging, as opposed to the more familiar visible-light imagery (that is, photography), can distinguish many objects from their surroundings by their strong reflection, or lack thereof, in parts of the electromagnetic spectrum that are washed out in a photographic view. The distinguishing reflectance (or emittance) in a small portion of the spectrum may be undetectable when an entire portion of the spectrum, such as the visible-light portion, is observed as a whole.

  13. Paul J. Pabich, Lieutenant Colonel, USAF; Hyperspectral Imagery: Warfighting Through a Different Set of Eyes, Occasional Paper No. 31, Center for Strategy and Technology, Air War College, October 2002, pp. 1–2.

  14. Christopher Burt, Detection of Spatially Unresolved (Nominally Sub-Pixel) Submerged and Surface Targets Using Hyperspectral Data, Naval Postgraduate School, September 2012, pp. 9–10.

  15. See discussion in Andrew C. Rice, Context-Aided Tracking with Adaptive Hyperspectral Imagery, Air Force Institute of Technology, AFIT/GE/ENG/11-43, June 2011, pp. 1–9.

  16. Paul J. Pabich, Lieutenant Colonel, USAF; Hyperspectral Imagery: Warfighting Through a Different Set of Eyes, Occasional Paper No. 31, Center for Strategy and Technology, Air War College, October 2002, p. 13.

  17. Jeffrey H. Bowles, John A. Antoniades, Mark M. Baumback, John M. Grossmann, Daniel Haas, et al., “Real-time analysis of hyperspectral data sets using NRL’s ORASIS algorithm,” Proc. SPIE 3118, Imaging Spectrometry III, 38 (October 31, 1997); Matthew Edward Fay, Major, United States Marine Corps; An Analysis of Hyperspectral Imagery Data Collected During Operation Desert Radiance, Naval Postgraduate School, June 1995.

  18. Deploying a hyperspectral sensor in space was not a seamless task. The Lewis spacecraft was launched in August 1997 with a 384-channel sensor but did not succeed in demonstrating hyperspectral technology. Soon after launch, the spacecraft developed a slow spin, rendering the solar array unusable. The spacecraft could not be recovered, and it reentered Earth’s atmosphere in September 1997.

  Ten days after 9/11, the air force’s Warfighter 1 hyperspectral package, carried on board a civilian satellite, was destroyed when its launch vehicle failed.

  19. In cases where targets were located in environments that masked or distorted their signature spectra, some sensors were not able to discriminate the target even when it was overtly exposed. See Jeffrey D. Sanders, Target Detection and Classification at Kernel Blitz 1997 Using Spectral Imagery, Naval Postgraduate School, December 1998.

  20. Naval Research Laboratory, “NRL Demonstrates First Autonomous Real-Time Hyperspectral Target Detection System Flown Aboard a Predator UAV,” October 31, 2000; www.nrl.navy.mil/media/news-releases/2000/nrl-demonstrates-first-autonomous-realtime-hyperspectral-target-detection-system-flown-aboard-a-predator-uav#sthash.UuOUzpv4.dpuf (accessed April 30, 2014).

  21. Data from a nadir-looking visible hyperspectral sensor were analyzed by an onboard real-time processor. A three-band false-color waterfall display of the hyperspectral data with overlaid target cues, along with the corresponding high-resolution image chips, was transmitted to a ground station in real time.

  The push-broom sensor consisted of a grating spectrometer and a 1024x1024 custom charge-coupled device (CCD) camera. The sensor operated at a frame rate of 40 Hz and provided 1,024 cross-track spatial pixels and 64 wavelength bands (450 to 900 nm). The panchromatic imaging sensor operated in the visible-wavelength region and consisted of a CCD line scanner and a large-format lens (300 mm). This sensor operated at a frame rate of 240 Hz and provided high-resolution imagery via 6,000 cross-track spatial pixels. A high-frame-rate video frame grabber and custom demodulation software decoded the transmitted data, which consisted of a false-color waterfall display, target cue information, and corresponding high-resolution image chips.

  See Naval Research Laboratory, “NRL Demonstrates First Autonomous Real-Time Hyperspectral Target Detection System Flown Aboard a Predator UAV,” October 31, 2000; www.nrl.navy.mil/media/news-releases/2000/nrl-demonstrates-first-autonomous-realtime-hyperspectral-target-detection-system-flown-aboard-a-predator-uav#sthash.UuOUzpv4.dpuf (accessed April 30, 2014).
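  The sensor figures in this note imply the raw data volume the onboard real-time processor had to handle. A back-of-envelope calculation (sample bit depth is not given in the source, so the rates below are in samples per second rather than bits):

```python
# Figures from the NRL/Predator demonstration described above.
HSI_FRAME_RATE_HZ = 40         # hyperspectral push-broom frame rate
HSI_CROSS_TRACK_PIXELS = 1024  # cross-track spatial pixels
HSI_BANDS = 64                 # wavelength bands (450 to 900 nm)

PAN_FRAME_RATE_HZ = 240        # panchromatic line-scanner frame rate
PAN_CROSS_TRACK_PIXELS = 6000  # high-resolution cross-track pixels

hsi_samples_per_sec = HSI_FRAME_RATE_HZ * HSI_CROSS_TRACK_PIXELS * HSI_BANDS
pan_samples_per_sec = PAN_FRAME_RATE_HZ * PAN_CROSS_TRACK_PIXELS

print(hsi_samples_per_sec)  # 2621440
print(pan_samples_per_sec)  # 1440000
```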

  22. Paul J. Pabich, Lieutenant Colonel, USAF; Hyperspectral Imagery: Warfighting Through a Different Set of Eyes, Occasional Paper No. 31, Center for Strategy and Technology, Air War College, October 2002, p. 16.

  23. Chadwick T. Hawley, “Signatures Support Program,” in Atmospheric Propagation VI, edited by Linda M. Wasiczko Thomas and G. Charmaine Gilbreath, Proc. of SPIE (2009), Vol. 7324-17.

  24. Matthew Edward Fay, Major, United States Marine Corps; An Analysis of Hyperspectral Imagery Data Collected During Operation Desert Radiance, Naval Postgraduate School, June 1995, pp. 1–2.

  25. Over Afghanistan, Hyperion was used to give researchers “a unique opportunity to compare hyperspectral images of targets before and after they were bombed, adding to the store of signature data that can be used in applying the technology to targeting and post-attack damage assessment”; Paul J. Pabich, Lieutenant Colonel, USAF; Hyperspectral Imagery: Warfighting Through a Different Set of Eyes, Occasional Paper No. 31, Center for Strategy and Technology, Air War College, October 2002, p. 17.

  26. Paul J. Pabich, Lieutenant Colonel, USAF; Hyperspectral Imagery: Warfighting Through a Different Set of Eyes, Occasional Paper No. 31, Center for Strategy and Technology, Air War College, October 2002, p. 7.

  27. Amy Butler, “USAF Turns to Hyperspectral Sensors in Afghanistan: New sensors provide new edge in finding explosives in Afghanistan,” Aviation Week & Space Technology, September 19, 2011; Paul J. Pabich, Lieutenant Colonel, USAF; Hyperspectral Imagery: Warfighting Through a Different Set of Eyes, Occasional Paper No. 31, Center for Strategy and Technology, Air War College, October 2002, p. 7.

  28. USAF, HyCAS ACTD Management Plan, January 7, 2003.

  29. “Using a National Aeronautics and Space Administration (NASA) WB-57 aircraft flown at an altitude of approximately 15,240 meters (roughly 50,000 feet), 218 flight lines of hyperspectral data were collected over Afghanistan between August 22 and October 2, 2007. These HyMap data were processed, empirically adjusted using ground-based reflectance measurements, and georeferenced to Landsat base imagery. Each pixel of processed HyMap data was compared to reference spectrum entries in a spectral library of minerals, vegetation, water, ice, and snow in order to characterize surface materials across the Afghan landscape.” See USGS Projects in Afghanistan, Hyperspectral Surface Materials Maps; http://afghanistan.cr.usgs.gov/hyperspectral-maps (accessed May 12, 2014).

  30. Shannon O’Harren, Trude V. V. King, Tushar Suthar, and Kenneth D. Cockrell, “Information-driven Interagency Operations in Afghanistan,” Joint Forces Quarterly (JFQ), Issue 51, 4th quarter 2008.

  31. Chadwick T. Hawley, “Signatures Support Program,” in Atmospheric Propagation VI, edited by Linda M. Wasiczko Thomas and G. Charmaine Gilbreath, Proc. of SPIE (2009), Vol. 7324-17.

  32. Henry Canaday, “Seeing More with Hyperspectral Imaging,” Geospatial Intelligence Forum (GIF) 11.2, p. 22. See also National Air Intelligence Center PowerPoint Briefing, Hyperspectral Collection and Analysis System (HyCAS) ACTD, n.d. (June 2003).

  33. SpecTIR Government Solutions (SGS), GEOINT | MASINT | IMINT; SpecTIR Spectator, Quarterly Newsletter, Vol. 1, 2012.

  The SpecTIR Hyperspectral Automated Processing and Exploitation System (SHAPES) is described by its manufacturer as a highly rugged ground-based hyperspectral remote-sensing capability in a trailer. SHAPES’s sensor can complete a scan in thirty seconds, and then SHAPES takes twenty to forty seconds to process, exploit, and generate a report from scanned data. Processing includes radiometric calibration, atmospheric compensation, and target detection. See Henry Canaday, “Seeing More with Hyperspectral Imaging,” Geospatial Intelligence Forum (GIF) 11.2, p. 22.

 
