Welcome to the EROS User Experience webinar series, where we talk to staff at EROS to learn more about the data, tools, and services coming out of the USGS Earth Resources Observation and Science, or EROS, Center. Today's webinar is entitled The New Annual (1985 to 2023) National Land Cover Database: Improving on a 30 Year Legacy. I'm your host, Danielle Golon, the remote sensing user services lead here at EROS in Sioux Falls, South Dakota. The time is currently 1 p.m. Central, and we have a very exciting webinar today, so let's go ahead and get started. First, a few logistics to ensure the best audio experience. All participants have been muted. If you have any questions or comments during the webinar, please enter them in the chat and we will address them at the end of the webinar. For the best view of the slides, we also suggest changing your view in Teams to focus on the speaker under the View tab, if Teams hasn't already done that for you. Today's webinar is being recorded. A recording will be available later on the annual NLCD website, as well as the USGS Trainings YouTube channel and the USGS media gallery. Today's webinar will consist of a presentation followed by a question and answer session. Today we have several speakers from the USGS EROS team working on the National Land Cover Database, or NLCD, on the line to talk about some exciting news coming out of the NLCD. Our speakers today include Terry Sohl, who serves as the chief of the Integrated Science and Applications Branch, or ISAB, here at the USGS EROS. Terry began his career at EROS in 1993, serving as a contractor on the original team to help develop the first NLCD using Landsat imagery. In subsequent years, his research focused on the development of a new USGS land cover model, FORE-SCE.
In 2021, Terry became the branch chief of ISAB, leading staff in remote sensing based science, with applications including land change dynamics, drought, disturbance and fire, agricultural monitoring, topographic mapping, vegetation dynamics, and more. Also speaking today will be Jon Dewitz, a physical scientist here at the USGS EROS. Jon has been involved with the NLCD for 23 years. Jon led the field data collection and trained the teams and partners on NLCD methodology for the early releases. Jon was the operations manager for the next three releases, the acting project manager for the last two NLCD releases, and the product supervisor and coordinator for all releases from 2001 to 2021. He is currently supporting the annual NLCD team and is the acting LANDFIRE project manager. And our third speaker of the day will be Jesslyn Brown, a research geographer here at the USGS EROS, where Jess has worked for over 30 years. Jess's research has involved utilizing remote sensing to improve our understanding of changes in terrestrial vegetation related to climate and other driving forces, for applications including early warnings of droughts, tracking vegetation phenology (i.e., seasonal dynamics), and mapping land cover and land use. For the last two years, Jess has helped to lead the evolution of NLCD to produce annual land cover data for the United States of America. Today's presentation will provide an overview of NLCD, including why the project began, what data sets are available, exciting news about the newest data, how the data are being used in real-world applications, and why these data are important. Once our speakers have finished their presentation, we will then transition over to the Q&A portion of the webinar. We have several user services staff members and staff from the NLCD team here at EROS on the line to help answer any questions you may have.
During the Q&A portion, again, please feel free to add your questions or feedback throughout the webinar using that webinar chat. We will try to answer all your questions in the chat within the time allotted, but if we are not able to address your questions during the Q&A portion, we will follow up with you offline. And if there's ever a future webinar topic you would like us to cover, please feel free to suggest that in the chat as well. With that, it is my pleasure to introduce today's first speaker, Terry Sohl. Take it away, Terry. Thank you, Danielle. Thank you, everybody, for showing up, both online and out here in the audience. This is a very special talk for me. I mean, it was 31 years ago that I started at EROS with the first job to help start the first NLCD, and to stand here 31 years later to see a project that's still going and has had such an impact on people is really special. I do want to highlight that we have Jess and Jon and myself that are going to talk today, but the people that have contributed to this project over the years are countless. Just today, I happened to put the link to this out on Facebook and had a few more names pop up from people noting that they had worked on this in the past, so we had to do some quick editing. But this is a project that really has had a lot of people over the years that have helped support it. You can't talk about NLCD without talking about Landsat; NLCD and Landsat are intricately linked. And, you know, obviously, for the folks in the building here, we're very familiar with Landsat. But that familiar Landsat series is the longest operational satellite system in the world, with over 50 years of continuous records. The genesis of it goes all the way back to the 1960s.
We had Stewart Udall, who was the Secretary of the Interior at the time, who had a vision of taking all this technology that was going on with the space race at the time, turning those cameras around and looking at the Earth's surface, not from the standpoint of military application or intelligence, but for environmental or other applications. That led to the launch of the first Landsat satellite in 1972. The center in which I'm speaking right now, the Earth Resources Observation and Science Center here in Sioux Falls, opened the next year, where we disseminate, archive, and do science with the Landsat data. There have been a series of satellites since. Right now we have two systems that are currently acquiring data, Landsat 8 and 9, with Landsat 9 the latest that launched in 2021. We have Landsat Next, that next system to keep the legacy alive, that's going to launch in the early 2030s. We're going to talk a lot about the impact of NLCD over the years, and we'll talk about the impact of Landsat over the years. But just since we made the data freely available in 2008, over 100 million scenes have been downloaded. And what makes this data special, and what makes NLCD special compared to some of the other land cover products that are out there, is that long term record, and we're certainly going to hit on the ability to take a 40 year record, like the new products that Jess is going to talk about, to look at a wide variety of applications. A little bit of history of land cover and USGS: it goes back to the 1970s, and at the time, aerial photography was actually used to produce the first product. This was called the LUDA product, manually interpreted, a very intensive process, but that was the first product. It wasn't until the early 1990s that we had a remote sensing based land cover product for the US, and this was a coarser product with the AVHRR, the Advanced Very High Resolution Radiometer. From a Landsat perspective, you know, Landsat launched in '72.
It wasn't until the 90s that we even considered doing a national scale map, and a lot of that was cost. At the time, Landsat scenes were between $2,000 and $4,000. So during the early 1990s, a group of federal agencies got together. They had a common data buy of Landsat data to share across federal agencies, and that was the formation of the MRLC consortium. After that data buy, they initiated a pilot, which I'll talk about in a second. They wanted to investigate the capability to use that Landsat archive to do land cover, and that was the genesis of NLCD. That's the genesis of my career here at EROS. In the early 1990s, we were asked to do a prototype, and we wanted to do it by federal region. So Federal Region Three was a collection of about five states: Pennsylvania, West Virginia, Virginia, etc. It was a daunting task at the time. It took about 45 Landsat scenes to cover this area. And remember, we had to load the data from tapes, we had to handpick registration points, we had to create the land cover product itself. I'm not going to go into the methodology, but the methodology is light-years ahead today compared to what we used to do back then, where manual interpretation used to play a big role. But it was the successful completion of that prototype that led the MRLC consortium to decide to go ahead and fund the first National Land Cover Database, again a daunting task. It took about 440 Landsat scenes to cover the conterminous US. It's a process that we started in the mid 1990s, but we didn't finish until late 1999. But that was the genesis of NLCD, the first land cover product, largely funded by USGS and EPA. Most of the work was done by us here at USGS. I do want to mention all the federal partners with MRLC that have contributed over the years. They really have been vital.
But ever since then, ever since that early 1992 product, which was the only product in town at the time, NLCD truly has been the geographic face of the nation. I'm going to turn it over to Jon, who's going to talk about the next step, which is taking NLCD from these early days to a more modern phase. We're going to follow with Jess, who's going to talk about the new annual land cover products. And then I'm going to come back and wrap up with some of the impacts. Thanks, Terry. Thanks for the introduction and the history of NLCD 1992. As I start talking about the next generation of NLCD, I just wanted to reemphasize how revolutionary the 92 data set was, and how it set the stage for everything else that was to come. As remote sensing started to become more accessible with new tools for analysis, the next evolution of NLCD started to take shape. NLCD 2001 planning focused on remote sensing concepts. For example, 92 classes like transitional burn and urban recreational grasses were changed to land cover definitions more easily defined and classified by their spectral information. Second, NLCD 2001 changed from a single product to a multiple-product database. This new concept utilized fractional components, which allowed users to capture additional information from these complementary classifications. In this example, the 92 land cover here shows orchards classified around the city of Fresno in green. For the 2001 product, though, we had these other two products, and the 92 row crops, orchards, small grains, and fallow classes were merged into a single class called cultivated crops. However, with the 2001 database concept, orchards are still captured, as we can see here in the canopy cover. The canopy layer functions as a modifier to the land cover classes, giving users additional information about orchards as well as forested wetlands and scattered forest anywhere canopy exists.
Developed classes were also changed to allow the prediction of percent developed impervious surface. As we started to define our methods, partnerships became the focus that would allow completion of this database concept. The MRLC consortium was made up of these partners, along with many state partners eager to have the next generation of accurate and usable land cover. Areas across the US were broken into regions, and each partner provided either mapping expertise and specific knowledge for those ecoregions or funding that enabled the USGS to map those regions. At the heart of this partnership was imagery, as Landsat scenes were still very expensive. This imagery was made up of three seasonal images: a spring, a leaf-on, and a leaf-off image. These were shared with the partners in the consortium, with C-CAP and GAP mapping all of the coastal regions in collaboration with the USGS. This imagery was also the starting point for the LANDFIRE project, which also mapped several regions. As mapping moved forward, the fractional components became a large focus for the consortium. These fractional components required high-res training data, like the one meter image we see here. This image is one of thousands of chips that were used to train the first canopy cover prediction. The same is true for impervious surface, which also required large amounts of training data. And yes, the majority of high-res imagery at this time was black and white USGS digital orthophoto quarter quads. Landsat imagery, as always for NLCD, is the foundation of our maps. This is one of the leaf-on imagery mosaics of Salt Lake City. This image is transformed with the training data previously mentioned into the following components for 2001. The first component is land cover. In addition to the huge amounts of high-res training data, much of the land cover required field work for accuracy.
NOAA, along with many different internal and external contractors, worked with myself and others to collect this land cover training data. The second component is impervious surface. The high-res training data was generated in small chips across the entire zone or ecoregion for training. The methodology for land cover, canopy, and impervious surface came from a lot of work between Bruce Wylie, Limin Yang, Suming Jin, George Xian, and of course, Collin Homer. Forest canopy prediction was the most fickle of all of these classifications. We had slight changes in phenology that translated into sometimes large differences in canopy density, so the seamless imagery mosaics were very important. You can again see the database concept here. We're in Salt Lake City, where both land cover and impervious would show developed in this area. However, we can also see the canopy over this urban area and well into the surrounding mountains. This database concept was put into every product NLCD has delivered. After 2001 was published, work on the rest of the states and territories continued. Alaska at that time had almost no high-res imagery, just a little bit of high-res around Anchorage that we used for training the impervious surface prediction. All of the land cover training data was collected by Dave Selkowitz and myself over a very long summer of work. There were additional Alaska-only classes, which were dwarf shrub, sedge, lichens, and moss. Hawaii mapping went very quickly, as NOAA had completed a one meter classification. We simply sampled from that for many classes and created our own products. Puerto Rico land cover was completed by the International Institute of Tropical Forestry and the USDA, while the USGS completed the impervious surface and canopy products. This work also translated into another collaborative project, the North American Land Change Monitoring System. This collaboration continues today, with NLCD comprising the United States portion.
With 2001 published, 2006 and 2011 were created using similar methodology, but with change detection across multiple image pairs. This change detection created the first nationwide map of change for NLCD. Research for the 2016 release began before 2011 was even published. Suming Jin was the primary architect for the creation of this methodology, and this new methodology greatly improved change detection and land cover accuracy. The new change detection and land cover mapping methodology relied heavily on four different aspects, with each contributing a piece to the land cover. Landsat, of course, is the foundation. Specialized partner data was highly integrated throughout, and I'll talk about that a little more on the next slide. Shape and context were used throughout for accurate mapping, especially in the boundaries, but also to get better pools of training data that had fewer of the spectral outliers that tend to confuse the classifier. Succession and trajectory analysis also greatly improved land cover change through time by removing earlier mapping errors. NLCD also moved from five-year to two- and three-year releases. For 2016, we also released five new products. First of these was the impervious surface descriptor, which labeled each type of developed land: transportation, buildings, exurban areas, well pads, etc. This was also the first release of the NLCD shrub components, which also greatly improved rangeland areas of the West. These first shrub components later evolved into the RCMAP project, which today still puts out yearly rangeland components from 1985 to current, along with other predictions. We also released several science products. The first created additional land cover classes differentiating rangeland grass and shrub from other areas. My last slide shows the degree of integration with the MRLC partners.
Each partner contributed directly to parts of NLCD land cover, tree canopy cover, impervious surface, and rangeland monitoring. The Forest Service took over tree canopy mapping in 2011. NOAA continued mapping coastal areas and later adopted the NLCD 30 meter land cover as they moved to one meter mapping. The BLM funds RCMAP for rangeland monitoring, which was used for all of the rangeland areas. LCMAP products contributed imagery and, later, developed classes. The EPA was a partner from the very beginning and gave funding to the start of NLCD, and continues collaboration today on accuracy assessment. LANDFIRE products contributed to shrub and forest classes and transitional areas, and the National Agricultural Statistics Service contributed heavily to agricultural mapping for NLCD. These partners also relied on the NLCD products for various parts of their work. This integration continued in the 2016, 2019, and 2021 NLCD releases. And with that, I'll turn it over to Jess for the next generation of NLCD. Thank you, Jon. So this is a great day. Earlier this morning, our products were released through a whole bunch of different access methods, which I'll go into a little bit later. So annual NLCD represents the most recent evolution of land cover for the contiguous US, and it's awesome to be here at this point to share this with you all. I like the expression "you've come a long way, baby." And I also want to say thank you to the stunningly talented team of scientists and engineers here at EROS and other locations, but mainly here, who have created this product suite. I can't mention all of you; Terry showed that slide earlier with 50 or more names on it. So just know that your work is really appreciated. So what is the annual NLCD? This is a Landsat time series product. We consider it a Level 4 product, utilizing earlier levels of Landsat. And we couldn't do this without the long history of the Landsat program.
Annual NLCD is a suite of six land cover and land cover change products for each year annually from 1985 to 2023, at that 30 meter spatial resolution that we're all used to. It's created through deep learning algorithms and a time series approach to characterize annual change, classify land cover, and characterize annual fractional impervious surface, such as Jon was just describing. So we follow the heritage, the legacy NLCD, for much of what we do. Again, this is the contiguous US. We plan to map other areas in the future, but what we're releasing today is the lower 48 states, and we are also planning future annual update cycles. You'll see the website there; for lots more information, go to that link. There's a digital object identifier to use for citation if you use our data, which we hope you will. All right, here's some more detail about these products. We have the six product suite on the left side. You'll see the land cover; land cover change is the next one; then land cover confidence, fractional impervious surface, impervious descriptor, and spectral change day of year. For the land cover, we utilize the 16 level two land cover categories that users are used to. We have chosen not to make any changes to this land cover thematic classification, so that users don't have to make too many changes in their use of land cover. So this animation provides a closer look at what the annual frequency of land cover can provide. Users have convinced us for quite a while that they want data with higher frequency, and they want it provided with lower latency. So what's apparent first, and this will loop, what is apparent first is the growth of the urban development here in the Dallas Fort Worth area. And you'll also see these reservoirs through time, flickering a little bit back and forth with different water levels.
And significantly, you can see Lake Ray Roberts, ten miles north of Denton, Texas, up towards the top edge of the image. This was dammed in the early 1980s, and the reservoir was filled up between 1985 and 1987. So really conveniently, at the beginning of our record, you can see that reservoir filling up over time. So I want to go through the familiar products first, the ones that have been part of the legacy NLCD releases. Land cover, of course, is the predominant thematic land cover category within the specific mapping year, with respect to these 16 broad categories of artificial and natural surface cover. The other two that will be familiar to land cover users that are part of this release are the fractional impervious surface and the impervious descriptor. The fractional impervious surface reflects the fractional area of those 30 meter pixels that is covered with artificial substrate or structures. And then the impervious descriptor is categorical information, giving the user information on whether it's a road or other types of urban structures. The first of the three that you might not be as familiar with is land cover change. This is a product that shows the change between one year and the next, and the changes are represented in the second year. So from 1985 to 1986, the land cover change will be reflected in that '86 product. And the numerical system tells you exactly what changed: the first two digits are the prior land cover category, and the second two digits of the numerical value are the latter year land cover. Land cover confidence comes out of our methodology. This is the probability value for that land cover class; it gives you an idea of how well the classification procedure was mapping this category. We do have categories that are awfully close together in their spectral values, like grassland and shrubland, for example. So you might see lower confidence values for those types of classes.
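The two-digit-plus-two-digit change encoding described above can be sketched in a few lines of code. This is an illustrative helper, not part of any released NLCD tooling; the class names follow the standard NLCD legend, and only a subset of the 16 classes is shown:

```python
# Illustrative decoder for an annual NLCD land cover change value, where the
# first two digits are the prior-year class code and the last two digits are
# the latter-year class code (e.g. 8223 = Cultivated Crops -> Developed,
# Medium Intensity). Subset of the NLCD legend for brevity.

NLCD_CLASSES = {
    11: "Open Water",
    21: "Developed, Open Space",
    22: "Developed, Low Intensity",
    23: "Developed, Medium Intensity",
    24: "Developed, High Intensity",
    41: "Deciduous Forest",
    52: "Shrub/Scrub",
    71: "Grassland/Herbaceous",
    82: "Cultivated Crops",
}

def decode_change(code: int) -> tuple[str, str]:
    """Split a four-digit change code into (prior class, latter class) names."""
    prior, latter = divmod(code, 100)  # e.g. 8223 -> (82, 23)
    return (NLCD_CLASSES.get(prior, f"class {prior}"),
            NLCD_CLASSES.get(latter, f"class {latter}"))
```

For example, `decode_change(8223)` reads as cropland converting to medium-intensity development, the kind of transition visible in the Dallas Fort Worth animation.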
And then the last one at the bottom is the spectral change day of year. This is a product that reflects the timing of change that was detected by our time series algorithm: the day of year when Landsat surface reflectance was detected as being different from the prior pattern of reflectance over time. And again, here's another closer look at a different area. So no longer Dallas; this is Marysville, Washington. And hopefully, yeah, it looks like it's working. We have all six of our products shown in a looping annual animation. Marysville is a city in Washington, and actually this area has seen a great amount of urban growth over the past 39 years as well. So I don't have a lot of time to go into the methods here today, so this is going to be the one slide on methodology. There's a lot more depth to this, and we're working right now diligently on publications to provide this information more deeply to the public. Our methodology relies on a multi-stage deep learning architecture, which we refer to as LCAMS, standing for Land Cover Artificial Mapping System. And this helps us generate not only the land cover, but also the impervious surface, in one integrated way, so that we don't have discrepancies between the products. So this allows us to generate the thematic land cover, fractional impervious cover, and other related information that supports the entire product suite. The system relies on a series of neural networks that are trained using labels derived from a modified version of the 2021 edition of the legacy NLCD; we learn from the past to create this land cover data set. So we use three deep learning models chained together. A U-Net is our detection head, and that leverages spatial features. We also have a refinement head that allows us to bring in the intensity information on the fractional impervious surface.
And we also bring in the land change information from our continuous change detection harmonic modeling process, creating refined land cover. So these are all linked together, and this creates a spatial-temporal classification that leverages legacy approaches from NLCD and from historical LCMAP as well, but it's really powered by these modern deep learning algorithms. The USGS has a long history of conducting validation for our land cover products, and we're in the process of doing that for annual NLCD right now. Our plans have long roots in prior validation that was conducted for individual NLCD epochs and in the prior annual LCMAP reference and validation effort. We've got two phases, and we've just completed the first one. Over both phases, we're collecting 10,000 plots across the contiguous US. The first 5,000 are randomly collected, and the second 5,000 we will start now, now that we have our products completed. The stratification is designed to help us better sample rare land cover classes and, actually, land cover change. This reference and validation effort involves all 39 years. So we will validate all 39 years using photo interpretation; we have manual interpreters working diligently to collect really high quality reference data over these 39 years. We'll complete the formal validation of annual NLCD next year and publish that information on USGS ScienceBase. But how do we know how good it is? How good is what we produce? We're utilizing the first 5,000 plots to estimate agreement. What we have shown in orange here: there are four different levels at which we assess this agreement. Primary level two stands for those 16 categories, those higher-detail land cover classes. Primary level two plus alternate means that we allowed for fuzzy classification.
Our 30-meter pixels often have more than one land cover type present, and an alternate label is collected by our reference team. And so if we get the secondary label right, the alternate label right, in our land cover category, that's counted as a match. You can read the levels there: primary level one would be the more general eight categories of land cover versus 16, and so on. The bottom half of the table is actually utilizing the same 5,000 plots that we just collected and evaluating historical or legacy NLCD. And really, the bottom line here is that you're seeing a very similar level of accuracy utilizing the same reference data plots. As I said, this will be published later next year. So these charts show some of the trends in CONUS land cover. However, this is not all 16 land cover classes, because it's hard to distinguish all of them if we have them all in a single graph. The one on the left shows the four categories that cover the greatest amount of land across the 48 states: shrub/scrub, cultivated crops, grasslands, and evergreen forest. Shrub/scrub is in orange, cultivated crops in brown, grasslands in yellow, and evergreen forest in green. And what you'll see is some slight decreases and increases in these land cover types over time. I'll just talk about the cropland right now. Our cropland class has decreased over time, and the decreases in that line actually correspond very well with cycles such as those in the Conservation Reserve Program, which was established in the mid 1980s; a lot of land was taken out of agricultural production due to that program. And we also see increases on the right in developed land cover through time. And some of that cropland is also converted into developed land. So how do you get the data? Data access: we're providing data access through some familiar sources on the right side of this slide and then some new access points.
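The "primary plus alternate" agreement described above can be illustrated with a small sketch. This is not the USGS scoring code, and the labels and plot counts below are hypothetical; it only shows the scoring rule: a mapped class counts as a match if it equals either the primary or the alternate reference label.

```python
# Illustrative "primary plus alternate" (fuzzy) agreement scoring.
# Not the USGS validation code; reference labels here are made-up examples.

def agreement_rate(reference_plots, map_labels):
    """reference_plots: list of (primary, alternate_or_None) label pairs.
    map_labels: the mapped class for each plot, in the same order.
    Returns the fraction of plots where the map matches either label."""
    matches = 0
    for (primary, alternate), mapped in zip(reference_plots, map_labels):
        if mapped == primary or (alternate is not None and mapped == alternate):
            matches += 1
    return matches / len(reference_plots)
```

Scoring only against the primary label gives the stricter "primary level two" number; also accepting the alternate label raises agreement on mixed 30-meter pixels, which is why the plus-alternate rows in the table sit higher.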
So the MRLC website is very familiar to existing NLCD users. A mosaic download site is there. We also have a customized NLCD web viewer, where you can download your area of interest for the products that you want and the years that you want, a very customized and tailored way to download the data. Mosaic download is for grabbing the whole US all at once, year by year. On the left side, we've now added EarthExplorer as a data access point, so that's new. These were all stood up this morning, by the way. And we also have data in the cloud: in the S3 bucket right next to where the Landsat archive resides, you can also find and get NLCD data to work with in the cloud. We couldn't be where we are today without processing in the cloud. So just a jump to annual NLCD by the numbers. On the left side, this is the amount of Landsat observations: at the pixel level, greater than 295 trillion pixels were processed, and it took us just about two months of time to do this in the cloud. And our output land cover suite is 2.1 trillion pixels for all products, about 8.9 billion pixels per product per year. These are our update plans. I'll just say, in the time allowed, that we are producing an update; for the conterminous US, the plan is to do that early next summer, in 2025. We'll add the 2024 data and give users updated land cover. Also next year, we'll publish our full validation and provide those data. Beyond that, our plan is to work on Alaska and Hawaii. I've got a little tilde there; the dates are a little bit in question, but we believe that we can map land cover successfully for those two states from 2000 forward, perhaps 1999. But that's the plan, and we will also continue to update those year by year. And I just want to segue, because Terry would like to go into more detail about our user community. But I know that many of you online are users of these data.
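The by-the-numbers figures quoted above hang together: six products times 39 annual editions at roughly 8.9 billion pixels each lands right at the quoted 2.1 trillion output pixels. A quick back-of-the-envelope check:

```python
# Sanity check of the "annual NLCD by the numbers" figures quoted above:
# ~8.9 billion pixels per product per year, times 6 products, times the
# 39 annual editions (1985-2023), should land near the ~2.1 trillion total.

pixels_per_product_year = 8.9e9   # quoted per-product, per-year pixel count
products = 6                      # land cover, change, confidence, etc.
years = 2023 - 1985 + 1           # 39 annual editions

total_pixels = pixels_per_product_year * products * years
print(f"{total_pixels / 1e12:.1f} trillion pixels")  # prints "2.1 trillion pixels"
```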
So land cover is really foundational for hundreds, maybe thousands, of applications and science studies, and USGS is listening to you. We want to know your use of the data, and we want it to continue to be very useful to you going forward. So I'm going to turn the podium back to Terry, who's going to go into our user community in more depth. Thank you, Jess. Just an indication of some of the interest in annual NLCD: as of this morning, almost 600 people had registered for the webinar. So why is NLCD important? You know, 30 years. Why are we still here? Why are we still doing it? Well, we're using a new deep learning approach to map land cover with NLCD, so why not ask deep learning why NLCD is important? If you go to ChatGPT and you ask what are some of the applications of NLCD, you can ask for 5, 10, 20, 50, 500 if you want. It'll come up with 500 different applications of NLCD, and it ranges from hydrology to biodiversity to carbon and greenhouse gases, to weather, to economic applications, to policy. If you want to go further for fun, go to ChatGPT and type in "real world example of NLCD for" and then type in a state, type in an application. Whether it's biodiversity, or you want to find out how NLCD is used in South Dakota, type it in. I just chose the one in the lower left, energy development. So how is NLCD being used for energy development? One example it came up with is solar energy development in the Mojave Desert, with more detail on how NLCD is being used to help identify optimal sites, measure the impact on sensitive habitats, assess land use conflicts, and support environmental review and compliance. This is one application, and I could type in "real world example for energy development" and come up with a hundred more. It really is hard to overestimate the impact of NLCD on the federal government, on society as a whole. This is work that's done by the National Land Imaging Program in USGS.
So the National Land Imaging Program funds Landsat, and they fund NLCD. This is a Sankey chart that demonstrates the relationships between that NLCD product suite and all the federal agencies and all the different applications. And the point here is it's a mess of spaghetti. That's kind of the point. I mean, it's very hard, again, to overestimate the impact on the federal community. You know, more recently, some initial analyses: when you look at agriculture, the forestry sector, or the climate sectors, over 70% of all Landsat applications in some way flow through NLCD. NLCD is a foundational product tied to Landsat, and the impact on the community is hard to overestimate. You know, from a science perspective, if you want to nerd out from a citation perspective, there have been over 10,000 citations in the peer-reviewed literature, and those papers that talk about the use of NLCD have themselves been cited over 350,000 times. From a policy perspective, over 3,300 policy documents in the U.S. cite NLCD. If you look at the impact in the national news, over 10,000 stories. And I do want to thank Carol, our librarian, in the back, for helping put together these numbers. Again, it's the science community, the federal community, the research community. The impacts are very large. One thing I do want to do is kind of talk about some of the categories and give you a flavor for how NLCD is used. And so, just by sectors: climate and weather. You know, the applications for NLCD are anything from national scale all the way down to very local. So something like rainfall extremes. You know, people think of climate impacting the landscape. It works the other way as well. So the landscape impacts the climate. And so this is a study that looked at urbanization from NLCD over the years and tied it to increases in rainfall extremes. Or you can look at the urban heat island work. This is George Xian here.
And EROS does that work, looking at the impact of landscape change on urban heat island. The one I do want to hit is the upper right, this carbon balance work. We gave one person a sneak preview of the annual NLCD data, if you will. Kelcy Smith, and he's the one that cut it out for us over here. So Ben Sleeter, who works with the Western Geographic Science Center for USGS, he has a model called LUCAS. It's a land use and carbon model. And one of the things that we did, to try to demonstrate the value of an annual NLCD product, is we provided a preview of this data about a month ago, and he ran it through the LUCAS model. So taking land use transitions from annual NLCD, urbanization, developed densification, agricultural change, forest change. You know, having that annual change from 1985 all the way to present, as he said, is a game changer. It's a game changer for him in the carbon community. So this is a study site that was done in central California, just to the east of San Francisco Bay; the Sierra Nevadas are the mountains on the far right, the forested area. So, close up, this is carbon in 1985. So the green that you see there, the darker the green, the higher the carbon content. So this is 1985. This is the end of the period, in 2022. So you see a massive loss of carbon. And if you look at the corresponding map to the left, you see that similar look across the entire Sierra Nevada. That's fire, mostly. You know, a lot of these are fires that have swept through and reduced the carbon amount in these areas. And it's also forest cutting. So you'll see these smaller patches; that corresponds to forest cutting over time. This is a map that shows change over time. So that deep magenta color is where you've had a very strong loss of carbon over time. And again, it shows a very high footprint of those fires. And you see that again throughout the Sierra Nevadas. And this is showing some of the products.
Net primary productivity and other carbon measures over time. And again, this is something that's unique, you know, being able to look at carbon change on an annual basis from 1985 to present at that spatial resolution. Again, as Ben says, game changer. Biodiversity and wildlife. So again, everything from a national scale application, looking at something like species richness, looking at habitat structure and content, trying to tie that to the species prevalence of amphibians or birds or reptiles. Looking at, you know, me, I'm a birder, so looking at migration hotspots in the eastern U.S. over time, looking at pollinators in North Dakota, all the way down to something very specific: moose vehicle collision modeling. You know, it's amazing when you look at the applications of NLCD and what comes up. So this is tying habitat prevalence to transportation networks. And they did that to look at areas that are most at risk for moose collisions; that modeling's used by insurance companies. So, I mean, these are real world applications. Again, a little bit deeper dive. This is work that I did about ten years ago taking NLCD, and that long historical record allows us to not only look at what's happened in the past, but what's happening out into the future. And so, with something like NLCD, we developed a model here at EROS that forecasts into the future. And using that model, and modeling land cover change based on NLCD going out into the future, tying that to climate, we can look at something like changes in the future distribution of sharp-tailed grouse, a species that's very popular for hunting in South Dakota. By 2075, with land use and climate change, that range is very, very reduced. So you have, again, a real world impact. You know, hunting in South Dakota is a multibillion dollar industry.
And, you know, being able to use NLCD in this fashion for forecasting into the future is really a direction that we want to go as a center. Hydrology: again, at very broad scale, like looking at surface runoff or sedimentation across the whole U.S., looking at harmful algal blooms, something we hear more and more about. Or something that's even more policy related: non-point source water quality trading from a hydrologic perspective. Another example that again has that anticipatory element, looking out into the future: if you look at NLCD data, go to central Minnesota, starting about 2005. There's a timber company, Potlatch, that sold a lot of land to R.D. Offutt, which is the world's largest producer of potatoes. And within a couple of years, over 40 square miles of forest land were plowed and converted to potatoes. Well, we picked that up with NLCD. That's information that is valuable to people 200 miles downstream; Minneapolis gets their water from the Mississippi River. Within a couple of years of all that land cover change, they started to notice water quality issues downstream in Minneapolis. So we were asked, by the EPA and the city of Minneapolis, here at EROS, to look at historical land cover change using NLCD, using that to help model what's going to happen in the future, and looking at what the likely water treatment needs will be for the city of Minneapolis. So again, a very practical application, where they used our information to help buy new equipment, because based on what we provided, they're going to need to be able to treat for cyanotoxins and nitrates in the near future. Human health: an area, again, that we're trying to get more into. So mosquito distribution, looking at a disease vector like mosquitoes and how that's related to the landscape, looking at air quality, looking at rabies, and even Covid.
And there's not one or two, there's at least 3 or 4 papers that came out after Covid that tied the prevalence of Covid to the distribution of the land cover on the landscape. So that's from a disease perspective. From a natural hazards and safety perspective, whether we're looking at nationwide wildfire risk, as urban centers creep into the wildland interface, this wildland-urban interface has a big impact on fire risk, to flood hazards, to coastal inundation, to landslide susceptibility. Policy and economic applications: so, national flood policy, NLCD data is used to help define that. It's helped to look at the Clean Water Act, helped to look at quantifying U.S. land value from an economic perspective, or even environmental justice, you know, trying to look at the distribution of landscapes and how that relates to different populations. So, you know, I'm gonna point to Danny here. He's probably surprised. But 30 years, so 30 years, 30 years ago, three of us set out on a field trip with Danny's dad. Sorry. Anyway, it's a very special project to me. It started 30 years ago with Danny's dad, and here we are 30 years later, and Danny's in the audience, and... Sorry, I didn't mean to do this. But it's just a very special project to me. And to see, 30 years later, the hard work that's gone in and the number of people in this room that have contributed is just amazing. And I hope that 30 years from now there's another Howard that's working on it. And I, no pressure, Danny, I mean, I don't know, but I do want to point out the impact of this project.
I mean, for me personally, it obviously means a lot, but for the nation as a whole, it's really hard to overestimate the impact of NLCD, both on the scientific community and from a federal user perspective. You know, with this new deep learning approach, we really are very well positioned to move forward to that next generation of land cover, and hopefully in another 30 years, we're still here doing this. And we want to hear from users. You know, we want to hear from you guys what we can do better, whether that's critique of our current products or what new products you'd like to see. You know, we're always trying to design that next version of NLCD. So I'll end with, again, the access: accessing the annual NLCD site where you can get the data, and if you have comments or questions. So I believe Danielle is going to open it up for questions, and we can have questions in the building or questions from online. Yes. Perfect. Thank you for a great presentation, Terry, Jon, and Jess. For everyone listening, we encourage you to please go out and take a look at the new annual NLCD data. These data are available from the MRLC website as well as from the USGS Earth Explorer, which, actually, we just provided a webinar on how to use last month. So a webinar recording of that can be found on the USGS YouTube training channel, in case you're interested in following along while you access the data. If you prefer to access and work with the data in the cloud, we also included information on where to find these new annual NLCD data products that are stored in the cloud, too. While working with the data, if you have any questions or feedback, please feel free to reach out to us. We have provided our contact information on the screen. You can either email custserv@usgs.gov or call (605) 594-6151. And you can always use that feedback button.
That's available on the Earth Explorer application, which will send a message to our team as well. And lastly, we'd like to remind everyone about the EROS User Group, or EUG, which is a listserv to be notified of future webinars and other ways to provide feedback on the data and tools provided by the USGS EROS. As we're wrapping up the presentation, I've added a few final links in the chat, which include more information about NLCD, as well as details on how to sign up to be notified about future NLCD data on their listserv, how to sign up for that EUG listserv I mentioned earlier, and a link to watch some of our previously recorded webinars. At this time, we'll now move on to the Q&A portion of the webinar. If you have any final questions or feedback about NLCD, perhaps any features you would like to see in the future, please feel free to add those to the chat now as well. With that, we'll go ahead and start to field a few questions from the online audience. We have several questions that were written in. The first question we have is: are the methods between NLCD 2001 and 2021 similar enough that change analysis can be done using these two data sets? So there's some nuance to that question. The 2021 product spans 2001 to 2021, and all of those products that were released at that time can be used together and compared. If you have that first original version of 2001, there have been changes that won't allow comparison. For instance, Landsat orthorectification changed where things are on the ground, and if you have a building and all of a sudden Landsat decides, oh, it's more accurate if we move it here, you get weird change. So all of the products that were released at the same time are comparable, but we have different years they were released. So you don't want to mix the ones released in different years. Okay. Thank you, Jon. Our next question is: to create NLCD for the U.S.,
do we use only one model, or an integration of many regionally based models? So the use of U-Net, the spatial part of our spatial-temporal approach, has models that are trained over subregions of the U.S. So it's not one model, and I'm looking at Riley to give me the nod that that's true. Yeah. So we had multiple regions across the U.S. in the spatial part of it. But really, the modeling approach is more complex than just the U-Net part of it, because then we bring in the multi-temporal information. We also have a very long and detailed science product user guide that we've stood up on the website that will give you a lot more background on the methodologies that were used to create annual NLCD. Perfect. Thanks, Jess. And, Jess, I think this was covered, but just as a reminder: how often should we, or should we at all, expect updates to all prior year releases, now that we have this fantastic annual NLCD resource? I think, if I heard the question correctly: how often should we expect updates to all prior years? Yes. Yeah. So our goal is not to update every year, every year, if that makes sense. So each year we will be adding new data, and we will likely be updating the last few years. But we want to keep the majority of the record as stable as possible. So when a new update is added, users should check and see how many years have been updated before they download new data. Perfect. Thanks, Jess. Our next question: if one wants to extract the total extent of rangeland in CONUS, does one need to process the RCMAP data, or is the rangeland extent represented in the NLCD data? Current NLCD data or historic data is a little different. The RCMAP data itself is a great product, and if you have any inquiries about rangeland, that is the best place to go. For legacy NLCD 2001 to 2021, we did lean on that product pretty heavily, although the change that we show is just a fraction of what you can see if you look at the RCMAP product.
So that's where I would go. That's the root of the data. Perfect. Thank you. All right. Our next question: if one wants to chart land cover change over time, would you now recommend the new annual NLCD product over the legacy NLCD? Yes, I think that that's what I would do, because the entire record is longer and uses a pretty consistent process through time. It didn't share some of our post-classification methods, but we worked pretty hard to make sure that we decreased what we colloquially call land cover flip-flop, land cover change that is really more noise-related, by utilizing a post-classification routine. So I think that, to explore change across the country, the new annual NLCD would give you not only a longer history, but internally consistent processing. Our next question: what is the training data for the AI model? Is the AI model public? Well, I don't believe that we've made the full AI model public at this time. We could, but it's a complex process. The training was culled from legacy NLCD, and that training data is also just internally staged at this point. We're going to have to see if there's a lot of call for that before we put that data online. All right. Our next question is: congrats on the new NLCD release. If not already available, what is the release schedule of the 10,000 reference plots? Okay, the release schedule for the 10,000 reference plots is approximately nine months from now. So I would say mid-summer 2025. Perfect. All right. Our next question: can you further explain what is meant by using three model heads for the annual NLCD deep learning model? Okay, that question would require a pretty deep and potentially long answer, although the mic is just being handed to our deep learning expert, Riley Fleckenstein. Riley, you're on. I'll keep it short. Essentially, there are three main components of our models. Two of them are responsible for the land cover classification: there's a spatial component and a temporal component.
And the third component is responsible for the fractional impervious prediction. And then we use a set of rules and guidelines in order to combine them into the final products. Perfect. Thank you, Riley. I recommend reading the science product user guide first, and then if you have more questions, please contact us. All right. We have a couple more questions, but we are coming up on the top of the hour, so I'll try and get through these quickly. Are there any plans to collaborate with the USFS, or the Forest Service, for the annual tree canopy cover data? We need to have some more discussions on that. I think that we don't have a firm plan, but I can see how that would be a really good follow-on to this effort. So the Forest Service will still be publishing, and at this point is still releasing, NLCD forest tree canopy. So that will continue. They're starting their new updates, and that will follow, I think, in the next year. But I'm not sure on the exact timeline. Our next question: very curious about the comparisons between LCMAP and annual NLCD, specifically where and why they may disagree. We have not done that yet; that's the short answer. Our next question: is there any ongoing project aimed at knowledge transfer to extend land cover mapping to other countries? I'm collaborating on a project with the USDA ARS focused on North American regional organization, and was wondering if the methods and protocols being used could be implemented in Canada and/or Mexico as a starting point. I think with additional work that would be possible. We do have a collaboration right now where we're exploring land cover classification methods over Wuhan, China, and over Brasilia in Brazil. You know, it's early times yet. What we're really finding is that the biggest challenge to going international is high quality training data for classification. So yes, it's possible, but you need that good quality training to make it work. Perfect. Thanks, Jess.
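The three-head combination Riley described in the Q&A, two land cover heads (spatial and temporal) plus a fractional impervious head merged by a set of rules, can be sketched in toy form. Everything here is an illustrative assumption: the array shapes, the class codes, the averaging rule, and the impervious threshold are invented for the sketch and are not the actual annual NLCD model:

```python
import numpy as np

# Toy sketch of combining three model "heads": a spatial land cover head,
# a temporal land cover head, and a fractional impervious head.
# All shapes, class codes, and combination rules are assumptions.

rng = np.random.default_rng(0)
H, W, n_classes = 4, 4, 8                              # tiny toy tile, 8 toy classes

spatial_logits = rng.normal(size=(H, W, n_classes))    # per-pixel scores from imagery
temporal_logits = rng.normal(size=(H, W, n_classes))   # per-pixel scores from time series
impervious_frac = rng.uniform(size=(H, W))             # 0..1 impervious fraction

# Rule 1 (toy): sum the two land cover heads, take the best class per pixel.
land_cover = np.argmax(spatial_logits + temporal_logits, axis=-1)

# Rule 2 (toy): where the impervious fraction is high, override with a
# hypothetical "developed" class (code 7 in this toy legend).
DEVELOPED = 7
land_cover = np.where(impervious_frac > 0.8, DEVELOPED, land_cover)

print(land_cover.shape)
```

The real combination logic is documented in the annual NLCD science product user guide mentioned above; this sketch only conveys the general pattern of multiple heads reconciled by post-hoc rules.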
We do have a couple of additional questions online, but we are over time, so we will follow up offline later with all the users that left a question. But for right now, again, I encourage you all to go out and explore the new annual NLCD data set and let us know if you have any additional questions or feedback on the data. If you do have additional questions or feedback later about NLCD or any other data, tools, or services at the USGS EROS, you can always email our team at custserv@usgs.gov. Thank you all again for joining us today. We hope to see you at our next webinar in the future, which will be on accessing and working with Landsat data in the cloud. Again, if you'd like to be notified of when that webinar will occur, please sign up for the EROS User Group, or EUG, listserv by emailing that custserv@usgs.gov email address we mentioned in the chat. Again, thank you all for taking the time to join us today. We hope you enjoyed the webinar, and we'll follow up offline with those users that didn't get a question answered. Thanks again for joining us.