The shift in the Earth observation (EO) market from selling pixels to selling finished intelligence products has sharply accelerated in the past couple of years, due to two factors. First, satellite imagery has become commoditized, partly due to the launch of dozens of small satellites. Second, advances in artificial intelligence, cloud computing, and cloud storage have greatly expanded the number of people who can access sophisticated analyses of this imagery or run the analyses themselves.
For the second installment in this series on geospatial analytics, I discussed these developments with:
Descartes Labs was founded by a group of former Los Alamos National Lab scientists who had been working for a long time with very large datasets, including geospatial datasets and satellite imagery. While these datasets were becoming more and more prevalent, Schlereth explains, these scientists were often hamstrung by the technology to which they had access. “They were not able to quickly scale up computer resources to deal with these large datasets and develop machine learning algorithms that would automatically cleanse and prepare the datasets for scientific analysis,” he recalls. “They were also not able to make these data available to others for large-scale analysis.”
So, they left the lab, started a company, and set about building that capability. Schlereth is responsible for bringing the company’s technology to market, including its platform and datasets, as well as its relationships with customers and the services it provides to them.