Three Trends Driving the Geospatial AI Revolution


Geospatial AI, the intersection of geospatial data and artificial intelligence, is a new frontier of technological innovation that promises to transform entire industries. Paul Hahn discusses the three driving forces enabling the rise of geospatial AI and the immense potential and opportunities it represents for enterprises, governments, and the world we live in.

 

For decades, geographic information systems (GIS) have been used widely to present a view of our world based on geographic and geospatial data. In many asset-intensive industries, such as energy, transportation, and the public sector, the ability to visualise business objects on maps is critical to improving efficiency and decision-making.

However, a significant revolution is underway, expanding the use of geographic data in a way that promises to disrupt entire industries.

Sometimes called geospatial AI, geospatial analytics, or GEOINT (geospatial intelligence), the intersection of geospatial data and artificial intelligence will be critical to enterprises and governments, from weather centres and national labs to defence agencies, healthcare, agriculture, insurance, transportation, and many more.

Behind the rise of geospatial AI are three trends: increased availability of geospatial data from satellites and remote sensing, the advancement of artificial intelligence (particularly machine and deep learning), and the availability of massive computational power.


Geospatial Big Data

Geospatial data analysis has always been a big-data use case. Most Earth observation data consists of highly detailed imagery and time-series data in large file sizes.

In the past, geospatial data was difficult to obtain and was used primarily by capital-intensive industries like oil and gas, or for governmental purposes such as land-use planning, intelligence, and defence. Now, with increasing availability, geospatial data is being used broadly by organisations exploring how analysing the location of objects on Earth can yield actionable intelligence.

Over the past two decades, there has been a dramatic reduction in satellite launch costs, dropping from an average of $18,500 (€16,445)/kg in 2000 to $2,720 (€2,418)/kg in 2018 for placing a satellite into low Earth orbit [1]. In 2018, Euroconsult [2] projected that over 7,000 small satellites would be launched over the next ten years, increasing the number of Earth observation satellites from 540 to over 1,400.

At the same time, the emergence of low-cost nanosatellites, satellite constellations, and newer sensor technologies such as synthetic aperture radar (SAR) and hyperspectral imaging are creating a growing collection of geospatial big data.

The increase in remote sensing capabilities has given rise to the Earth observation data market, which Northern Sky Research [3] predicts will reach $6.9 billion (€6.1 billion) by 2027, doubling its 2017 size. In Europe alone, the market for Earth observation data is projected to grow from $719 million (€639 million) in 2016 to $1.42 billion (€1.26 billion) by 2021, according to a 2017 Technavio report [4].

 

Artificial Intelligence

Everyone is talking about AI in the enterprise, and adoption of artificial intelligence has grown dramatically, with virtually every organisation rushing to integrate and deploy AI methodologies in their core business practices. A recent Cray-sponsored survey revealed that 34% of respondents already consider AI a “crucial to business” capability, with 72% expecting it to be so by 2022.

During this first wave of AI exploration, many organisations have tested the waters by applying a few AI-based techniques to their standard workflows. In June of 2019, Gartner [5] reported that four active AI projects were typical for enterprises. These small-scale implementations were good first steps, which helped many enterprises test the feasibility of AI for their business. However, a dramatic increase in AI adoption is coming, with IDC predicting [6] that spending will increase from $24 billion (€21.3 billion) in 2018 to $77.6 billion (€68.9 billion) in 2022.

Along the way, machine and deep learning have emerged as promising approaches for analysing geospatial data, offering a practical approach to identifying features or objects in detailed satellite imagery. Deep learning techniques take advantage of a neural network [7], a computational system that attempts to mimic the processing of a human brain.
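To make this concrete, here is a minimal sketch, in PyTorch, of the kind of convolutional neural network used to classify satellite image tiles. The tile size, band count, and number of land-cover classes are illustrative assumptions, not details drawn from the systems described in this article.

```python
# A minimal sketch, not a production model: a small convolutional
# neural network for classifying 64x64 satellite image tiles.
# The band count (4: e.g. RGB + near-infrared) and the number of
# land-cover classes are illustrative assumptions.
import torch
import torch.nn as nn

class TileClassifier(nn.Module):
    def __init__(self, in_channels: int = 4, num_classes: int = 5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                  # 64x64 -> 32x32
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                  # 32x32 -> 16x16
        )
        self.classifier = nn.Linear(64 * 16 * 16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(start_dim=1))

model = TileClassifier()
batch = torch.randn(8, 4, 64, 64)             # 8 synthetic 4-band tiles
logits = model(batch)                         # shape: (8, 5)
```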

Today, a variety of governmental and commercial organisations are using deep learning techniques, coupled with geospatial data from satellites, to demonstrate the potential of geospatial AI. For example, at the 2019 Canadian Geoignite Conference [8], Canada Mortgage and Housing Corporation [9] described the use of deep learning and satellite geospatial data to track housing starts, specifically for remote and indigenous communities. In 2017, Capgemini [10] used SAR data and machine learning to measure forest restocking in the United Kingdom, augmenting a costly, labour-intensive manual process.

As geospatial AI matures, we’ll see an increased focus on integrated AI workflows, where performance and cost-efficiency concerns will dictate technology choices. For example, in the data acquisition stage of the geospatial AI workflow, where massive amounts of geospatial data are captured and staged for processing, technology decisions will centre on high-performance storage capacity and throughput. Other stages, such as data preparation and model development, will lead to price and performance trade-offs between systems incorporating general-purpose CPUs or specialised GPUs from Intel, AMD, and NVIDIA.
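As an illustration of the data-preparation stage mentioned above, the following sketch cuts a large satellite GeoTIFF into fixed-size tiles for model training, reading window by window so the full scene never has to fit in memory. It assumes the open-source rasterio library and a hypothetical input file; the article itself names no specific tools.

```python
# A sketch of the data-preparation stage: cutting a large satellite
# GeoTIFF into fixed-size tiles for model training. Uses the open-source
# rasterio library (an assumption; the article names no specific tools),
# and "scene.tif" is a hypothetical input file.
import numpy as np
import rasterio
from rasterio.windows import Window

TILE = 256  # tile edge length in pixels (illustrative choice)

def iter_tiles(path: str):
    """Yield (row, col, array) tiles read window by window,
    so the full scene never has to fit in memory at once."""
    with rasterio.open(path) as src:
        for row in range(0, src.height - TILE + 1, TILE):
            for col in range(0, src.width - TILE + 1, TILE):
                window = Window(col, row, TILE, TILE)
                data = src.read(window=window)   # (bands, TILE, TILE)
                yield row, col, data.astype(np.float32)

# Example: count tiles containing usable (non-zero) data.
usable = sum(1 for _, _, t in iter_tiles("scene.tif") if t.any())
```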

 

Here Comes Supercomputing

Widespread adoption of geospatial AI can’t happen without supercomputing systems that provide the massive computing and storage resources required for timely AI application development and delivery.

Geospatial AI is a supercomputing problem [11]. The digital universe is doubling in size every two years, headed for 175 zettabytes by 2025 [12], and AI applications thrive on massive datasets. Add to this the growing interest in, and need for, distributed machine and deep learning methods [13], which improve data scientist productivity by training deep learning models in less time, parallelising the training computation across multiple machines.
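As a concrete (and deliberately minimal) sketch of distributed data-parallel training, the example below uses PyTorch’s DistributedDataParallel: each worker holds a replica of the model, processes its own shard of the data, and gradients are averaged across workers at each step. The framework choice, model, and data are assumptions for illustration only.

```python
# A minimal sketch of distributed data-parallel training: each worker
# holds a model replica, processes a shard of the data, and gradients
# are averaged across workers. Launch with torchrun, e.g.
# `torchrun --nproc_per_node=4 train.py`. Model and data are placeholders.
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    dist.init_process_group(backend="gloo")    # "nccl" on GPU clusters
    rank = dist.get_rank()

    model = torch.nn.Linear(128, 2)            # placeholder model
    ddp_model = DDP(model)
    optimizer = torch.optim.SGD(ddp_model.parameters(), lr=0.01)
    loss_fn = torch.nn.CrossEntropyLoss()

    for step in range(10):                     # each rank sees its own shard
        x = torch.randn(32, 128)               # synthetic features
        y = torch.randint(0, 2, (32,))         # synthetic labels
        optimizer.zero_grad()
        loss = loss_fn(ddp_model(x), y)
        loss.backward()                        # gradients all-reduced here
        optimizer.step()

    if rank == 0:
        print("finished", step + 1, "synchronised steps")
    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```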

To illustrate just how much computational power is needed for training, consider this example: a recent research report from Digital Catapult [14] found that the approximate minimum computation required to train a deep neural network on a dataset of 1.28 million images would be on the order of an exaflop (one quintillion compute operations).
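A rough back-of-envelope check, using commonly cited figures rather than numbers from the Digital Catapult report itself, shows how a total of that order arises: roughly 4 GFLOPs for one ResNet-50 forward pass, a backward pass costing about twice the forward, 1.28 million images, and 90 training epochs.

```python
# Back-of-envelope check of the exaflop figure. All numbers are
# commonly cited assumptions, not taken from the Digital Catapult report:
# ~4 GFLOPs per ResNet-50 forward pass, backward ~2x the forward,
# 1.28 million images, 90 epochs.
forward_flops = 4e9                      # per image, forward pass
flops_per_image = forward_flops * 3      # forward + ~2x for backward
images = 1.28e6
epochs = 90

total = flops_per_image * images * epochs
print(f"~{total:.2e} operations")        # ~1.38e+18: on the order of an exaflop
```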

Processing and analysing ever-growing volumes of data, supporting heterogeneous workloads, and enabling distributed training methods all require increasingly powerful and capable computing architectures.

Supercomputers are tightly integrated, highly scalable, zero-waste architectures that offer the right technology for each task to enable maximum application efficiency and eliminate computational bottlenecks. They excel at ingesting, moving and processing massive volumes of data. And now systems are being purpose-built so that everything from the processors to the software ecosystem is geared to allow diverse AI and enterprise workflows to run on a single system simultaneously. Lastly, the compute power offered by supercomputers makes it possible to train larger deep learning neural networks using bigger geospatial training sets in less time.

Market forecasts for supercomputing bear this point out. In June 2019, Hyperion Research predicted [15] that by 2023, over $2.7 billion (€2.4 billion) would be spent worldwide on AI-dedicated high-performance computing servers, and that AI represents the fastest-growing segment, with a five-year CAGR near 30 percent.

It boils down to this: supercomputers are the only machines that offer the tools and technologies to deliver the capabilities that organisations will inevitably crave as they embrace the coming geospatial AI wave.

 


Putting It All Together

For the business leader charged with improving operational efficiency or creating breakthrough results, the geospatial data revolution offers a tantalising promise: almost unlimited opportunity when coupled with imagination. It’s hard to think of any business sector or government entity that can’t use geospatial data and artificial intelligence.

About the Author

Paul Hahn is the Analytics and AI Market Manager at Cray, Inc. For more information, please visit www.cray.com

 

References
1. https://ttu-ir.tdl.org/bitstream/handle/2346/74082/ICES_2018_81.pdf, 48th International Conference on Environmental Systems
2. www.euroconsult-ec.com/shop/index.php?id_product=109&controller=product
3. https://spacenews.com/forecasts-call-for-rapid-growth-in-earth-observation-market/
4. https://www.businesswire.com/news/home/20170810005058/en/Top-3-Drivers-Impacting-Satellite-based-EO-Market
5. https://www.gartner.com/en/newsroom/press-releases/2019-07-15-gartner-survey-reveals-leading-organizations-expect-t
6. https://www.idc.com/getdoc.jsp?containerId=prUS44291818
7. https://www.sas.com/en_us/insights/analytics/neural-networks.html
8. https://gogeomatics.ca/event/geoignite-2019/
9. https://geospatial.blogs.com/geospatial/spatial-analytics/
10. https://www.capgemini.com/2017/08/how-to-do-machine-learning-on-satellite-images/
11. https://www.intel.com/content/www/us/en/high-performance-computing/ai-hpc-is-happening-now-report.html
12. https://www.networkworld.com/article/3325397/idc-expect-175-zettabytes-of-data-worldwide-by-2025.html
13. https://www.cray.com/blog/deep-learning-cray-distributed-training-framework/
14. https://assets.ctfassets.net/nubxhjiwc091/6qDT7u9pzUsq8uukwCAayw/acc7e59350faa88fc504fc990c17deb7/MIG_MachinesforMachineIntelligence_Report_DigitalCatapult-1.pdf
15. https://insidehpc.com/2019/06/hpc-market-five-year-forecast-bumps-up-to-44-billion-worldwide/
