Editor’s Note: This piece was originally penned by John Croft, FAA NextGen Outreach Writer and Editor. In this article, Croft explains the steps being taken to maximize the power of cameras on the ground to help the flight deck, and how those cameras are becoming invaluable weather-sensing tools. By further automating the process by which weather patterns and hazards are relayed to pilots, the accuracy of weather information can be increased and the overhead cost of analyzing thousands of weather images can be minimized. Here’s his full article about how the FAA is tackling this approach:

Weather cameras all over Alaska show the current conditions. The FAA is working to convert those images into weather information.

Cameras can be a vital tool, giving pilots a firsthand, real-time look at visibility at remote airports and en route waypoints. But can cameras be turned into weather sensors?

FAA weather researchers think the answer is yes. In a new research project, planned to go live in Alaska during 2019, the FAA will fuse human intelligence and automation to evolve cameras into measurable visibility sensors, improving weather situational awareness as well as forecast accuracy.

Flight Category graphics give pilots a snapshot of visibility conditions over a broad area.

For pilots, the information will make for safer flights: they will be able to see estimated visibility conditions along terrain-challenged routes at a glance in a weather application. By delivering more observation data, the project could also give forecasters additional input to boost the reliability of localized predictions.

The work is part of the broader FAA NextGen Weather Program, which includes Weather Technology in the Cockpit (WTIC) research and the Aviation Weather Research Program (AWRP). WTIC uses NextGen information and surveillance technologies to deliver enhanced weather information to pilots in the cockpit. AWRP is an effort to explore and develop ways to improve the weather information that supports decision making in the National Airspace System.

In Alaska, where weather can deteriorate quickly, general aviation pilots could use help from both programs. There are relatively few manned or automated weather observation stations available for real-time weather and localized forecasts in the state.

In 2008, the FAA began installing weather cameras in the Alaskan wilderness. Each site typically has four cameras pointing in the cardinal directions. There are about 300 camera sites and about 1,000 individual cameras typically providing new images every 10 minutes that pilots can access through the FAA’s AVCamsPlus website.

Pilots can manually cycle through individual cameras to see real-time visibility and history.

The FAA cautions that the camera images are for situational awareness only, not for determining the regulatory visibility minima required to start or complete certain flights. The cameras augment 39 Automated Surface Observing System (ASOS) weather stations that provide pilots with a variety of information over audio or digital links.

ASOS measures, among other things, wind speed, temperature, pressure and dew point, and — perhaps most important to pilots — the ceiling and visibility they can expect. ASOS measurements are considered the “truth model” for visibility, although the system measures air clarity rather than how far a pilot actually can see, according to the National Weather Service (NWS). The sensor measures visibility in the horizontal plane and cloud ceiling (clear, scattered, broken or overcast) in the vertical plane.

Apps that pilots use for navigation or flight planning typically have a Flight Category option that graphically shows visibilities at reporting stations with ASOS or similar automated weather stations.

It uses one of four color codes, ordered here from best to poorest visibility:

  1. Green is the best – visual flight rules (VFR).
  2. Blue indicates marginal VFR (MVFR).
  3. Red indicates instrument flight rules (IFR).
  4. Purple is the poorest – low IFR (LIFR).

So, if the map shows all the stations along your route in green, that’s good.
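
The mapping from a station’s reported ceiling and visibility to those colors follows fixed FAA thresholds. Here is a minimal Python sketch of that mapping; the function name and example values are illustrative:

```python
def flight_category(ceiling_ft, visibility_sm):
    """Map ceiling (feet AGL) and visibility (statute miles) to the
    standard FAA flight category; the worse of the two values governs."""
    if ceiling_ft < 500 or visibility_sm < 1:
        return "LIFR"   # purple: low IFR
    if ceiling_ft < 1000 or visibility_sm < 3:
        return "IFR"    # red
    if ceiling_ft <= 3000 or visibility_sm <= 5:
        return "MVFR"   # blue: marginal VFR
    return "VFR"        # green

# Example: an 800-foot ceiling with 4 miles of visibility is IFR.
print(flight_category(800, 4))   # -> "IFR"
```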

At the numerous camera locations without ASOS in Alaska, pilots can refer to AVCamsPlus and click on individual cameras to see the most recent views. To estimate visibility, they compare the live view with a stored view taken in clear weather and annotated with distance markers to terrain features.
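
That manual comparison boils down to a simple rule: visibility is roughly the distance to the farthest annotated landmark the viewer can still make out. Below is a minimal sketch of that logic, where the landmark list and the visibility judgment are hypothetical stand-ins for the camera annotations and the human’s call:

```python
def estimate_visibility(landmarks, is_visible):
    """Estimate visibility (a lower bound, in statute miles) as the
    distance to the farthest landmark still discernible, mirroring the
    manual pilot procedure described above.

    landmarks  -- (name, distance_sm) pairs from the annotated clear-day image
    is_visible -- callable(name) -> bool, the judgment a human (or an
                  algorithm) makes against the live image
    """
    visible = [dist for name, dist in landmarks if is_visible(name)]
    return max(visible) if visible else 0.0

# Hypothetical annotations for one camera view.
landmarks = [("river bend", 1.5), ("ridge line", 4.0), ("far peak", 9.0)]
# Suppose only the two nearer features can be made out in the live image.
print(estimate_visibility(landmarks, lambda name: name != "far peak"))  # -> 4.0
```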

Pilots compare live images to stored images to estimate visibility. The FAA hopes to automate that process.

Stepping camera by camera along a route during preflight planning, and repeating the check every so often, is time-consuming and workload-intensive. With visibility estimates derived from the cameras, however, a pilot could see all of that information on a Flight Category-type map before and during a flight. That’s what FAA WTIC researchers and FAA partner Rockwell Collins aim to enable by evaluating visibility at a subset of camera sites, using human observers hired through Amazon Mechanical Turk (MTurk), a crowdsourcing site where participants are paid to complete tasks assigned via the internet.

During the study in the summer of 2017, researchers posted camera images on the crowdsourcing site. Paid observers estimated visibility by comparing the actual images against annotated good-weather images. Researchers learned it took 8–10 observers looking at an image to reach a consensus, or crowd solution, on the visibility. A WTIC-developed algorithm rates Amazon MTurk observers based on how closely their answers match crowd solutions. WTIC Program Manager Gary Pokodner said 80 percent of crowdsourced visibility results matched the ASOS visibility to within 20 percent, a positive result. In many cases, results that varied by more than 20 percent were due to camera placement that caused obstructed views.
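
A rough sketch of how such a consensus-and-rating scheme could work follows; the median statistic, the scoring rule, and the example numbers are assumptions, since the article does not detail the WTIC algorithm:

```python
import statistics

def crowd_solution(estimates_sm):
    """Consensus visibility from a crowd of 8-10 observer estimates; the
    article does not name the statistic, so a median is assumed here."""
    return statistics.median(estimates_sm)

def rate_worker(worker_estimate, consensus, tol=0.20):
    """Score an observer by closeness to the crowd solution (a sketch of
    the WTIC-developed rating algorithm, whose details are not public)."""
    return abs(worker_estimate - consensus) <= tol * consensus

def matches_asos(consensus, asos_visibility, tol=0.20):
    """The study's benchmark: does the crowd fall within 20% of ASOS?"""
    return abs(consensus - asos_visibility) <= tol * asos_visibility

estimates = [3.0, 3.5, 4.0, 3.0, 5.0, 3.5, 4.0, 3.0]   # eight observers
consensus = crowd_solution(estimates)                   # -> 3.5
print(matches_asos(consensus, asos_visibility=4.0))     # -> True, within 20%
print(rate_worker(5.0, consensus))                      # -> False, an outlier
```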

In a separate study, AWRP researchers had a different goal in mind for the cameras: using image processing and edge-detection algorithms developed by the MIT Lincoln Laboratory to determine visibility at remote sites to generate more accurate localized forecasts. Forecasters already use data from traditional ground weather sensors and from satellites in their prediction models; the visibility estimates from cameras will provide additional data to improve their forecasts. “Better observations, better forecasts,” said Jenny Colavito, the ceiling and visibility project lead for AWRP. “But in order to use the cameras, we have to digitize their output. Right now, it’s just an image, so we have to extract a visibility estimate using automation.”
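
As a hedged illustration of the idea (not the Lincoln Laboratory algorithms themselves, which the article does not detail), one could score how many clear-day edges survive in the live frame, here with OpenCV; the file paths are hypothetical:

```python
import cv2

def edge_visibility_score(live_path, clear_path):
    """Fraction of clear-day edges still detectable in the live image; a
    simplified stand-in for an edge-detection visibility estimator."""
    clear = cv2.imread(clear_path, cv2.IMREAD_GRAYSCALE)
    live = cv2.imread(live_path, cv2.IMREAD_GRAYSCALE)
    clear_edges = cv2.Canny(clear, 100, 200) > 0   # reference edge map
    live_edges = cv2.Canny(live, 100, 200) > 0
    if clear_edges.sum() == 0:
        return 0.0
    # Edges that survive in the live view suggest those features are visible.
    return float((clear_edges & live_edges).sum() / clear_edges.sum())

# A score near 1.0 suggests clear conditions; near 0.0, heavy obscuration.
# Converting the score to a distance would still need annotated landmark
# distances, as in the manual procedure above.
```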

When Colavito learned of the WTIC crowdsourcing project, she saw an opportunity. “This year we are working together with WTIC to create a hybrid,” she said. The idea is to gather crowdsourced visibility estimates through MTurk and to insert the Lincoln Laboratory-developed automation as one member of the crowd. “What we’re hoping is that our automation is going to be equivalent to a high-achieving worker for converging on a solution,” said Colavito. “If you have someone who’s really good, then you might only need one other person to verify that you’re correct.”
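
A sketch of that hybrid convergence logic follows; the tolerance and averaging rules are assumptions, as the article describes only the concept of treating the automation as a high-achieving crowd member:

```python
import statistics

def hybrid_estimate(automated_sm, human_estimates_sm, tol=0.20):
    """Insert the automated estimate as one member of the crowd. If the
    first human estimate verifies it to within `tol`, converge with a
    single worker; otherwise fall back to the full crowd."""
    if not human_estimates_sm:
        return automated_sm
    first = human_estimates_sm[0]
    if abs(first - automated_sm) <= tol * automated_sm:
        return (automated_sm + first) / 2       # one verifier was enough
    return statistics.median([automated_sm] + human_estimates_sm)

print(hybrid_estimate(4.0, [3.8, 3.5, 4.2]))    # -> 3.9, converged early
```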

The estimates from the crowd can also be used to improve the automation through machine-learning techniques. Ultimately, the goal is to refine the automation to the point that it could serve as a standalone measure of visibility, allowing near-real-time interpretation and negating the need for a set of human eyes to weigh in, or at least minimizing the size of the crowd. Determining the connection between camera visibility, ASOS visibility and the information a pilot needs to fly safely remains a hurdle. “We have found cases where there is a discrepancy between the ASOS reading and what we see in the camera imagery,” Colavito said.
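
One simple way the crowd’s labels could feed back into the automation is a calibration fit from the raw automation score to crowd-estimated visibility. The least-squares model and the data points below are illustrative, not the program’s actual machine-learning approach:

```python
import numpy as np

# Hypothetical training pairs: automation edge score vs. crowd visibility (sm).
edge_scores = np.array([0.10, 0.35, 0.60, 0.85, 0.95])
crowd_vis_sm = np.array([0.5, 2.0, 4.0, 7.0, 10.0])

# Fit visibility ~= slope * score + intercept by ordinary least squares.
A = np.vstack([edge_scores, np.ones_like(edge_scores)]).T
(slope, intercept), *_ = np.linalg.lstsq(A, crowd_vis_sm, rcond=None)

def calibrated_visibility(score):
    """Map a raw edge score to a visibility estimate via the fitted line."""
    return slope * score + intercept

print(round(calibrated_visibility(0.5), 1))   # -> about 4.0 sm on these points
```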

While ASOS is considered the meteorological ground truth, this summer Pokodner and his team will study how best to define the visibility measure most useful to a pilot, the “aviation visibility,” and what additional information human viewers might be able to add, such as “mountains not visible.” If testing goes as planned, Colavito said, her group will begin to integrate the camera visibility data into its gridded analysis models and will work with the NWS to integrate the data into NWS numerical weather prediction models. Pokodner said WTIC researchers will study how to set triggers in the edge-detection algorithms to indicate when the weather is changing, the ideal time to initiate crowdsourcing of images.
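
Such a trigger might look like the sketch below, which fires when the automated score shifts sharply between samples; the window and threshold values are illustrative assumptions:

```python
def weather_change_trigger(score_history, window=3, threshold=0.15):
    """Fire when the automated visibility score moves by more than
    `threshold` over the last `window` samples, the moment the article
    suggests is ideal for initiating crowdsourcing of images."""
    if len(score_history) < window + 1:
        return False
    return abs(score_history[-1] - score_history[-1 - window]) > threshold

scores = [0.90, 0.88, 0.89, 0.87, 0.62]   # a sudden drop in edge score
print(weather_change_trigger(scores))      # -> True: request crowd estimates
```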

About John Croft

John Croft is the FAA NextGen Outreach Writer and Editor.