For several reasons, news reporters covering the climate occasionally cite data from private firms that analyze the risks of severe weather damaging buildings, roads and other infrastructure across both large and small geographic areas.
First, those firms offer data at a level of granularity, sometimes down to specific addresses, that is largely unavailable from federally funded and produced climate data.
Second, some of those firms offer easy-to-understand risk scores.
A particular neighborhood or property might have a flood risk score of 4 out of 10, for example, or a wildfire risk score of 1 out of 10.
Also, some firms will work with reporters to give them free access to data that otherwise sits behind a paywall.
But as journalists use those analyses, they should understand this burgeoning market for paid climate data and be willing to question these companies — particularly about how they arrive at their risk assessments and how they would handle a severe curtailment of federal climate research under the second administration of President Donald Trump.
In this explainer, we’ll address these questions:
- What climate data does the federal government in the U.S. collect and provide?
- What are some of the major climate risk data companies to know about?
- What type of climate risk data is privatized, who uses it, and what risks do these firms analyze?
- What concerns do academic researchers have about the climate risk data that companies sell?
- Should the federal government make detailed, address-level climate risk data publicly available?
- What questions should journalists be asking?
Keep reading to learn more.
What climate data does the U.S. federal government collect and provide?
The U.S. federal government has historically put billions of dollars per year toward scientific research examining the changing climate in the U.S. and globally.
During most of the 2010s, federal funding for climate science and clean energy technologies was between $8 billion and $13 billion per year, according to a 2018 Government Accountability Office analysis. The Inflation Reduction Act, which became law in 2022, put hundreds of billions more federal dollars toward climate-focused projects, along with tax credits and loans to individuals and businesses.
NASA and the National Oceanic and Atmospheric Administration, an agency of the U.S. Department of Commerce, are the two primary sources of government-produced climate data. Many federal climate data products are available for free through NASA’s Center for Climate Simulation and NOAA’s National Centers for Environmental Information.
The Federal Emergency Management Agency provides searchable flood maps, while the Environmental Protection Agency provides weather maps with projections of future temperatures, precipitation, extreme storms and heat, and sea level rise.
Journalistic investigations have revealed flaws in some government climate risk data. A 2022 Washington Post investigation into FEMA flood maps found “communities throughout the country where FEMA’s maps are failing to warn Americans about flood risk.” As part of the investigation, Michael Grimm, then acting deputy associate administrator of the agency’s Federal Insurance and Mitigation Administration, told the Post that those maps “do not forecast flooding. Maps only reflect past flooding conditions and are a snapshot in time. They do not represent all hazards and do not predict future conditions.”
While federal agencies collect a tremendous amount of climate data through weather stations, satellites and even human observers, predicting future risks is a difficult task, and it often relies on data about damaging weather events that communities have faced in the past.
“That systematic underestimation of climate change is a function of the way we assess risk,” says Justin Mankin, an associate professor of geography and earth sciences at Dartmouth College. “Risk assessment is effectively an analysis that’s retrospective. Your snapshot of risks today is a function of what history has been, not that which is coming.”
At the same time, the federal government appears to be stepping back from investing in climate research. Under the first and second administrations of President Donald Trump, some federal climate data has been scrubbed from public view.
“Along with eliminating climate-related jobs, the Trump administration has also been rapidly removing climate language and scientific data from federal agency websites,” writes environment reporter Kiley Price in a February 2025 article for Inside Climate News.
One final note on this issue: Public climate data is typically produced at a coarser geographic resolution, while private climate data firms sell risk projections at much finer levels, down to individual properties.
What are some of the major climate risk data companies?
This is not an exhaustive list, but here are some of the better-known firms that sell data predicting future climate risks:
- Carbon4 Finance
- ClimateCheck
- Climate-X
- First Street
- Jupiter Intelligence
- Riskthinking.AI
- Verisk
- XDI
- Zesty.ai
Several of those firms were founded during the 2010s, in response to demand from credit rating agencies, businesses, investment firms and government officials for detailed climate risk data to inform policy and financial decisions.
“There’s been a market uptick, particularly a few years ago,” says Oriana Chegwidden, a research scientist at CarbonPlan, a nonprofit open data organization focusing on climate change solutions. “There was kind of a really, a huge, huge growth in these companies.”
First Street is among the most widely known to the public, as it licenses its risk data to popular real estate websites — more on that below.
What type of climate risk data is privatized?
Firms often sell climate risk analyses at the property level. The underlying technique is called downscaling — taking global or regional climate data and refining it into property-level climate risk projections.
“Tomes and tomes of work have been written on this by the downscaling community, and they all get different answers depending on the analytical choices they make,” Mankin says. The techniques are complex, he adds, but could include making choices about how rain falls and how water flows over surfaces, such as concrete or vegetation.
Broadly speaking, this detailed risk analysis is what companies are selling. They’re also often selling an appealing user interface, such as interactive, attractive maps.
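The mechanics of downscaling vary widely from firm to firm, but the basic idea of refining a coarse grid to a single point can be sketched with a toy example. The bilinear interpolation below, and every number in it, is hypothetical; commercial models are far more elaborate and proprietary.

```python
# Toy illustration of spatial downscaling: estimating a value at one
# specific location from a coarse climate-model grid. The numbers and
# the choice of bilinear interpolation are hypothetical, for
# illustration only.

def bilinear(x, y, grid):
    """Interpolate inside a 2x2 grid cell; x and y range from 0 to 1."""
    (v00, v01), (v10, v11) = grid
    top = v00 * (1 - x) + v01 * x
    bottom = v10 * (1 - x) + v11 * x
    return top * (1 - y) + bottom * y

# Hypothetical annual rainfall (mm) at the four coarse grid corners
# surrounding a property.
coarse_cell = ((900.0, 1100.0),
               (1000.0, 1300.0))

# Suppose the property sits 30% of the way across the cell and 60% of
# the way down it.
estimate = bilinear(0.3, 0.6, coarse_cell)
print(round(estimate, 1))  # a property-level estimate between the corner values
```

Real downscaling pipelines layer many such choices — which interpolation scheme, which terrain and land-cover corrections, which historical records — and, as Mankin notes, different choices yield different answers.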
Who uses data from private climate risk companies?
Paying clients such as government agencies and financial institutions, among others.
Popular real estate platforms, such as Redfin and Zillow, license data from First Street and let users search specific addresses and see flood, wildfire and other climate risk estimates, along with insurance coverage recommendations, depending on the platform.
Several climate data firms confirmed that they are willing to provide their data for free to journalists.
What types of risks do these firms analyze?
Firms that produce climate risk data often focus on physical risks — potential damage to physical infrastructure, like roads, bridges, factories, sports stadiums, schools, and apartment buildings.
Those firms often cater to financial services, with the goal of identifying portfolio assets, such as real estate, that could be at risk of damage due to climate change. While financial services are regulated by federal government agencies like the Securities and Exchange Commission, there is no national regulator of the climate data market.
There are two types of physical climate risks, according to the U.S. Environmental Protection Agency: acute and chronic.
- Acute physical risks are those associated with a particular extreme event, such as a hurricane, tornado or flood.
- Chronic physical risks are those stemming from changing global climate patterns over time, leading to things like recurrent heat waves and higher sea levels.
Is the climate risk data that companies sell trustworthy?
Modeling procedures that climate risk firms use may be difficult to scrutinize — either because firms don’t make those procedures public, or, if they do make their technical procedures and modeling assumptions public, they may be difficult for reporters to understand.
How firms turn broad, government-produced climate and weather data into granular, address-level risk estimates is the “black box” that researchers sometimes talk about.
Some firms, such as First Street, provide technical reports on their risk modeling methodologies. Journalists covering private climate data risk analyses should vet any methodologies provided by a company with independent experts, such as academic climatologists who study risk modeling.
According to researchers we spoke with, this is the work companies are selling: using statistical modeling methods to turn broad climate data into detailed, address-level data.
“There’s an incentive for them to differentiate themselves from others, to have their own special sauce,” says Chegwidden, the research scientist at CarbonPlan. “And there’s an incentive to be as good as you can be.”
Emissions scenarios may play a role in how firms develop their risk assessments. The best-known emissions scenarios come from the United Nations’ Intergovernmental Panel on Climate Change. They describe best-case, worst-case and middle-ground trajectories for global temperature increases, usually through 2100, depending on future levels of greenhouse gas emissions.
Some firms may not use future emissions assumptions at all and may simply extrapolate future risks based on past data, says Galina Hale, a professor of economics at the University of California Santa Cruz who studies the financing of climate solutions.
For firms that do use emissions scenarios, the scenario a firm uses could affect their results, according to Chegwidden.
For example, assuming a very high emissions scenario would imply more melting of polar ice and potentially more flooding for particular low-lying areas, whereas a lower emissions scenario could produce forecasts suggesting less flooding.
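To illustrate how the choice of scenario can move a projection, the toy model below uses warming levels loosely based on IPCC best estimates for low, middle and high emissions pathways; the “flood days” relationship is invented for demonstration and is not any firm’s actual model.

```python
# Hypothetical sketch of scenario sensitivity. The warming figures
# loosely echo IPCC AR6 best-estimate warming by 2100 for low, middle
# and high emissions pathways; the flood-day model is invented.

SCENARIOS = {  # assumed global warming by 2100, in degrees Celsius
    "low_emissions": 1.8,
    "middle": 2.7,
    "high_emissions": 4.4,
}

def projected_flood_days(baseline_days, warming_c, sensitivity=3.0):
    """Toy model: each degree of warming adds `sensitivity` flood days/year."""
    return baseline_days + sensitivity * warming_c

# The same neighborhood, the same model, three different scenario choices:
for name, warming in SCENARIOS.items():
    print(name, projected_flood_days(5, warming))
```

Even in this crude sketch, the high-emissions assumption yields nearly twice the flood days of the low-emissions one, which is why knowing a firm’s scenario choice matters when comparing its numbers to another vendor’s.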
In an August 2024 CarbonPlan report, Chegwidden, with CarbonPlan editorial lead Maggie Koerth and executive director Jeremy Freeman, requested data from nine climate risk firms in order to attempt to identify discrepancies between the analyses those firms produced.
“Specifically, we asked for historical and future risk scores from 342 locations: fire risk for 128 post office locations in California and flood risks for 214 post office and public school locations in New York,” they write. “This request ought to have been easy to fulfill, with data files small enough for email and requiring minimal documentation.”
Two companies, Jupiter Intelligence and XDI, agreed to participate. First Street, Verisk, ClimateCheck and Zesty.ai declined, while the remaining firms did not respond. The data the researchers obtained were “based on different analyses and use different definitions and systems for scoring risks,” they write.
Flood and fire risks through 2100 broadly increased in both datasets, but the scale of the risk varied, as did the risk at more granular levels of data. The authors suggest that future comparisons be done at the level of zip codes or census tracts in order to identify discrepancies between the analyses climate risk firms are producing and selling.
“Depending on who you ask about what risk is, and how climate change will influence risk, the answer could be different,” Chegwidden says.
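The coarser comparison the CarbonPlan authors suggest — aggregating address-level scores to a shared geography before comparing vendors — can be sketched as follows. All vendor names and scores here are made up.

```python
# Minimal sketch of comparing two vendors' risk scores after
# aggregating addresses to ZIP code level. Vendor names and scores
# are hypothetical.
from collections import defaultdict
from statistics import mean

# Hypothetical records of (zip_code, vendor, flood_risk_score)
records = [
    ("10001", "vendor_a", 4), ("10001", "vendor_a", 6),
    ("10001", "vendor_b", 7), ("10001", "vendor_b", 9),
    ("11201", "vendor_a", 2), ("11201", "vendor_b", 3),
]

# Group scores by ZIP code, then by vendor.
by_zip = defaultdict(lambda: defaultdict(list))
for zip_code, vendor, score in records:
    by_zip[zip_code][vendor].append(score)

# Compare vendors' mean scores within each ZIP code.
for zip_code, vendors in sorted(by_zip.items()):
    means = {v: mean(s) for v, s in vendors.items()}
    gap = abs(means["vendor_a"] - means["vendor_b"])
    print(zip_code, means, "gap:", gap)
```

A persistent gap between vendors at the same geography is exactly the kind of discrepancy a reporter could ask both firms to explain.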
Credit rating agencies, which assess the creditworthiness of municipalities and companies, often have in-house climate risk analysts. For example, credit rater Moody’s bought climate risk analytics firms Four Twenty Seven in 2019 and RMS in 2021.
But recent reporting and research suggests that credit rating agencies may be underestimating climate risks for municipalities. In 2023, Bloomberg journalist Gautam Naik reported that “there is growing concern that credit rating analysts are misreading climate risks in the $133 trillion global bond market.”
And in a report published in March 2025, sustainable finance analyst Shu Xuan Tan of the nonprofit Institute for Energy Economics and Financial Analysis finds that credit rating agency methodologies tend to “overlook longer-term impacts, leaving investors vulnerable to financial shocks,” adding that those agencies should “refine their methodologies as climate risks grow to capture long-term financial threats.”
What concerns do academic researchers have about the climate risk data that companies sell?
Most companies that sell climate risk data employ people with doctoral degrees in relevant fields, such as earth science, geography or economics. They may use methodologies that are scientifically sound. But what they may be lacking, according to some academic climatologists, is the scientific process, including peer review and reproducibility — other researchers independently obtaining the same results.
In public-sector climate science, which includes scientists working for government entities or academic institutions that often make their data analyses and models public, “we can interrogate these models, take them apart, see where they work, where they don’t work and improve upon them,” Mankin says.
But, he adds, stakeholders in the public sector are different from those catered to by private firms — which, like all businesses, seek to maximize profits. In public-sector climate science, the people building the models and performing data analysis are also members of the public, Mankin explains. That means they’re part of the same group — the public — using the data to make adaptation or mitigation decisions.
“In private-sector climate data analytics, the model builders and the model users are not the same people anymore,” Mankin says. “And so, the incentive structure around the fidelity and accuracy and transparency of the models — they’re totally different. When the user and the developer are the same person, the imperative to get it right is really high.”
Individual climate data firms may well offer sound data and predictions in certain cases, says Madison Condon, an associate professor of law at Boston University who has studied the market for localized climate risk data. Those projections may be useful for identifying climate risks at larger scales, like for municipalities, or they may offer useful predictions for certain types of hazards but not others.
But Condon says she is wary of recommending that consumers use climate risk projections to make financial decisions — such as whether to take out a 30-year mortgage based on flood or fire risk projections for a particular address. “That’s a pretty specific question that the products out there sort of claim to be a one-stop shop for every asset in America, for every hazard,” Condon says. “I think the takeaway is that you should be very cautious in using those data products, if you’re basing financial decisions on them.”
Should the federal government make detailed, address-level climate risk data publicly available?
In public finance economics, there are instances in which something does not meet the strict definition of a “public good,” but it may still make sense for the government to provide it publicly for free. The reason has to do with positive externalities, says Hale.
“Positive externalities” are benefits people and businesses receive even if they aren’t directly involved in producing or consuming something.
If those benefits can only be obtained through government involvement, and if those benefits outweigh the benefits the private market can provide, then the case can be made that the government should step in.
Streetlights are one analogy for why it could make sense for the federal government to produce granular climate data, says Hale.
Imagine a heavily populated city that doesn’t provide streetlights. While individual property owners could pay to light the sidewalks in front of their properties, the city’s streets would be poorly lit overall.
Research has linked better street lighting with reduced crime, so the city could create a positive externality — safer streets — by lighting its streets in a more complete way than individual property owners could.
In the case of climate risk data, federal or state governments could create a positive externality — a more robust understanding of potential climate risks for neighborhoods or even specific addresses — for both individuals and businesses.
“Producing this data, if it is to be shared, people will benefit from that,” Hale says. “There’s also efficiencies. Because at the moment, if private companies have to come up with this data, they’re all putting resources into figuring out these things. And that means hiring more people — OK, more jobs for economists, I guess that’s good. But it’s generally not efficient. And that’s why if the government was to model, or collect and provide this data, then insurance companies could use it, and developers and home buyers.”
Condon notes that insurers and private climate data companies often rely on federal government climate models that are so large they can only feasibly be run on expensive, complex computers. NOAA has at least two computing clusters dedicated to climate and weather modeling, while NASA has at least one.
“The public good element is that nowhere in the world is there a private global climate model,” Condon says. “The huge atmospheric models that represent the relationship between the atmosphere and the oceans, they run on the world’s fastest supercomputers. Most developed countries only have one.”
There are also opportunity costs associated with rebuilding areas damaged by extreme weather events, Mankin says, such as neighborhoods destroyed during the January 2025 fires in Los Angeles. Opportunity costs are the forgone alternatives, such as the construction of new homes, when money instead has to go toward rebuilding destroyed ones.
“When we’re talking about the net economic consequences of something like the L.A. fires, it is those opportunity costs as well, it’s not just direct damages,” Mankin says. “It is the fact that L.A.’s economy would have been on a different economic growth trajectory had those fires not occurred.”
Municipalities may be able to mitigate such damage with detailed, publicly available climate risk data, much in the way that cities, business and consumers use national, publicly funded weather information to prepare for severe storms, Mankin adds.
But is there federal funding for detailed climate risk data?
With ongoing cuts to the federal workforce, including proposed billions in cuts to climate and weather research and analysis agencies like the National Oceanic and Atmospheric Administration, it’s difficult to imagine federal resources being put toward producing detailed climate risk data anytime soon.
On top of potential funding cuts, federal officials in late April 2025 told nearly 400 scientists to halt work on the sixth edition of the congressionally mandated National Climate Assessment — an occasional, authoritative report on climate change risks and consequences that the U.S. government publishes.
“We are pens down now, because that’s been defunded,” says Mankin, one of the NCA authors. Some NCA authors are exploring publishing the report through independent channels, according to a May 2025 article by Nature correspondent Alexandra Witze.
While the resources of private firms working to produce detailed climate risk data pale in comparison to those the federal government could harness, obtaining accurate risk data at a granular level can still be costly and time consuming.
Hale says she is working on a project at the University of California Santa Cruz aimed at producing a detailed physical risk analysis at the property level for small sections of the California coast, to inform adaptation and mitigation decisions for those communities. There are about a half dozen people working nearly full-time on the project, she says.
“Engineers, ocean scientists, economists are all trying to figure that out,” she says. “So, to get it really accurate would take a lot of work. The data that includes all residential properties is, like, terabytes. And that’s just for California.”
(A terabyte is about 1,000 gigabytes. Many commercially sold laptops come with a few hundred gigabytes to a terabyte of storage.)
When and why did the U.S. federal government start funding climate research?
It’s helpful for journalists to remind audiences about the purpose of government-funded climate and weather information: protecting both life and property. Businesses regularly make investment decisions based on federal climate and weather data.
Federally funded physical science research dates to the early 1800s.
“Calls for a National Climate Service — a federal entity that would provide location-specific climate and adaptation information for free — have been around since the 1970s and continue to this day,” Condon writes in a 2023 Arizona State Law Journal article.
In 1970, President Richard Nixon created NOAA, merging several different scientific research agencies into a single agency under the U.S. Department of Commerce.
“The oceans and the atmosphere are interacting parts of the total environmental system upon which we depend not only for the quality of our lives, but for life itself,” Nixon wrote in a message to Congress delivered that July. “We face immediate and compelling needs for better protection of life and property from natural hazards, and for a better understanding of the total environment — an understanding which will enable us more effectively to monitor and predict its actions, and ultimately, perhaps to exercise some degree of control over them.”
What questions should journalists be asking?
Here are a few starting-point questions to inform your reporting on how private climate data firms and others produce and use risk analyses.
How are insurers, which use federal climate data and research to inform their pricing, preparing for potential major funding cuts to that federal data collection and analysis?
“The insurers are mostly using outputs of government data,” Condon says. “They’re mostly using all the data products that are now pretty under threat, so the data products from NOAA, mostly, and NASA.”
Which statistical models are climate risk data firms using?
“I absolutely would be asking about methodology,” Hale says. “I would be asking to open the black box. What is going in? What’s happening inside? How do you decide the reliability? I would want to know all the details of which raw data they use.”
How are climate risk data firms ensuring their risk projections are as accurate as possible?
“I would want the receipts on accuracy,” Mankin says. “And how model improvements are made.”
Do firms inform the public when their analyses suggest certain areas are at high risk of climate-related damage? If so, how?
“What is their minimal, moral and ethical obligation, given the public sector investment in the science that undergirds their business model?” Mankin says.
Who are their customers, which emissions scenarios do they rely on and what other assumptions do they make about the country’s climate future?
“Having a better handle on who the customer base is could give insight into what this data is being made to do,” Chegwidden says.
How are local government officials using private climate data to make planning decisions and how much did the data cost?
“If you’re in a small town, it’s just not that many people who are involved in oversight of [things] like sewage and urban planning,” Condon says. “And so, you’re a concerned citizen, you would be very interested in good data sets, and how they’re being applied to various decisions.”
Further reading
Carbon Majors and the Scientific Case for Climate Liability
Christopher Callahan and Justin Mankin. Nature, April 2025.
Climate Beliefs and Asset Prices
Galina Hale and Bhavyaa Sharma. VoxEU Column, October 2024.
The People Have a Right to Climate Data
Justin Mankin. New York Times guest essay, January 2024.
Climate Services: The Business of Physical Risk
Madison Condon. Arizona State Law Journal, August 2023.