Forecasting changes in weather and climate—and their influence on human health, economic vitality and environmental sustainability—is a big numbers game. The National Oceanic and Atmospheric Administration uses numerous observation platforms—satellites, aircraft, ships, buoys and in-place sensors—to siphon a steady torrent of raw environmental data into the agency’s computer systems.
“It’s like a vacuum cleaner for environmental data,” says Joe Klimavicz, who as NOAA’s chief information officer has primary responsibility for guiding the agency’s massive Hoovering operation.
NOAA assimilates, analyzes, processes and disseminates this data to the public through multiple channels, including some 750 web sites. The value of that information became widely apparent last year as millions of gallons of oil flowed into the Gulf of Mexico following the explosion of the Deepwater Horizon drilling rig in April. As government agencies scrambled to coordinate their response and the public sought to understand the scope of the disaster and its implications, Klimavicz’s office went live on the web with the Geospatial Platform. The online tool integrated 600 layers of data to provide a visual, geospatial means of accessing, via points and clicks, data that otherwise would have been inaccessible.
“We build satellites. We exploit the data. We disseminate our data to customers in vast quantities.”
—Joseph Klimavicz, CIO, NOAA
The new tool reflects a goal of the agency to take information off the shelf and get it into the hands of people—scientists, policy makers, emergency responders and citizens—who can use it. The site provides users with information about oil flows, habitat degradation, risks to human health and impacts on fisheries. The day it went live, the site attracted more than three million visitors. To make the site more useful, Klimavicz anticipates integrating decision-making tools into the geospatial platform as a means to help users harness the data, he told Government Computer News, which recognized NOAA’s Geospatial Platform as one of 10 exceptional IT achievements in government last year.
‘Data as Our Lifeblood’
Observational data collected by NOAA are the planet’s vital statistics. The quantity of information collected—sea surface temperatures, chlorophyll concentrations in oceans, sea levels, air temperatures, to name a few—will increase ten-fold by 2020, Klimavicz says. Then there are the data generated by NOAA’s supercomputers, which run climate-forecasting models and produce approximately 80 terabytes of data per day. The sheer volume of numbers crunched during weather, climate and hurricane forecasting is such that simply doubling the resolution of the results generated by those simulations requires a 16-fold increase in computing capacity.
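The 16-fold figure follows from the geometry of grid-based simulation. A minimal back-of-envelope sketch (an illustrative calculation, not NOAA’s actual cost model, assuming a three-dimensional grid whose time step must shrink in proportion to the grid spacing, as numerical stability typically requires):

```python
# Why doubling model resolution multiplies compute roughly 16-fold:
# halving the grid spacing doubles the point count along each of the
# three spatial axes (2 x 2 x 2 = 8), and keeping the simulation
# numerically stable then forces the time step to be halved as well,
# doubling the number of steps (8 x 2 = 16).

def compute_scaling(resolution_factor: int) -> int:
    """Relative compute cost when grid resolution improves by `resolution_factor`."""
    spatial = resolution_factor ** 3   # finer grid in x, y and z
    temporal = resolution_factor       # proportionally more time steps
    return spatial * temporal

print(compute_scaling(2))  # prints 16, matching the 16-fold increase cited above
```

The same arithmetic shows why resolution gains are so expensive: a four-fold improvement would demand roughly 256 times the computing capacity.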
Public demand for weather and climate data doubles every year or so as well, Klimavicz says. During major snowstorms that blanket much of the country, hits to the agency’s web sites have spiked by a factor of eight. During hurricanes, traffic on those sites has surged eight to 12 times normal levels during peak hours.
“The demand and expectations continue going up,” says Klimavicz. “Nobody is saying ‘There is a budget crunch, so we’ll let you get by today.’ Budgets are declining. My job is to figure out the right technology path and how to implement that technology and manage the data in today’s fiscally constrained environment.”
“I look at data as our lifeblood,” he says.
Taking the Intel Outside
Klimavicz, 51, is a native of the Washington, D.C. area. He has lived and worked in and around the nation’s capital for most of his life, the past 25 years as a federal employee. He attended college at Virginia Tech, in Blacksburg, Va., earning a Bachelor of Science in 1983 and a Master of Engineering in 1988. He is married and has three daughters.
He entered the workforce with the CIA as an imagery scientist charged with developing photogrammetric math models in the National Photographic Interpretation Center. He recalls that his “first job was writing optical equations for satellites . . . in a very intensive environment. There were days I would go to work and never see another human being [in those] dark cold computer rooms.”
He “moved around quite a bit within the intelligence and national security community,” working on the Director of Central Intelligence staff before taking a job at the Defense Department. His last position before coming to NOAA was deputy CIO at the National Geospatial-Intelligence Agency.
“I kind of look at NOAA as having an equivalent mission to NRO (National Reconnaissance Office) and NGA combined on the unclassified side of the house,” Klimavicz says. “We build satellites. We exploit the data. We disseminate our data to customers in vast quantities. It’s a lot of the same kinds of techniques and problems.”
‘Constantly Analyzing the Data’
Upon assuming the responsibilities of CIO, Klimavicz zeroed in on strengthening NOAA’s security profile. He immediately moved to shore up the confidentiality, integrity and availability of data that the agency needs to warn of approaching storms, protect the country’s coasts, project the impact of climate change and monitor air quality in the event of a terrorist attack. The intelligence work he did prior to joining NOAA informs Klimavicz’s approach to his current job, yet there is a big difference. While at the CIA and DoD, he protected secrets that could affect national security. In the context of public weather and climate data, security is more about maintaining the integrity of information and ensuring that it is available when needed by public- and private-sector decision makers. “Tornado warnings have to go out in seconds or minutes,” he says. “The public counts on NOAA to deliver relevant, accurate and timely data.”
Klimavicz’s office established NOAA’s Cyber Security Center last summer. The Security Center uses tools to correlate security log events across the enterprise, alerting analysts to potential security incidents. When it is operating at full capacity, the center will screen all the agency’s data, including potential breaches totaling “a billion or more raw events per day,” he says. Those initial screens identify several thousand events daily for security analysis, of which “a dozen require further investigation. . . . We are making sure we’re constantly analyzing the data.”
As the IT demands of NOAA’s operations expand exponentially, managing the cost of supercomputers, storage space and dedicated networks becomes a bigger challenge. The electricity needed to power a computer capable of running the agency’s simulations can run to $1 million annually.
The agency is consolidating IT operations, including many data centers, currently housed at more than 200 facilities across the country. In Hawaii, for example, the agency is consolidating a dozen smaller offices, each with “a server room, a couple of racks of equipment and one or two guys maintaining it,” at a single location on Oahu, Klimavicz says. “A lot of the satellite command and control operation and processing consolidation has been done. We’ve also got all of our administrative and financial systems in one data center.”
To further enhance efficiency and green up the agency’s operations, Klimavicz is pushing as much of NOAA’s IT operations as possible to the cloud, “which allows us to go green very quickly.”
The bottom line? Budgets are down. Demand for accurate, timely climate data is up. High-performing technology at massive scale is critical. Sustainability is imperative. In the numbers game that is climate and weather, Klimavicz is making it all add up.
John Pulley is a veteran journalist in the Washington, D.C., area and founder of The Pulley Group, an editorial services agency.