Google last week announced a set of new open-source artificial intelligence (AI) tools aimed at supporting India's agricultural sector and improving cultural and linguistic diversity in global language models. The initiatives include a new Agricultural Monitoring and Event Detection (AMED) API and a collaboration with the Indian Institute of Technology (IIT) Kharagpur under its Amplify Initiative.
The AMED API, developed by Google DeepMind and Google’s Partnerships Innovation team, builds upon the company’s existing Agricultural Landscape Understanding (ALU) API. It leverages crop labels, satellite imagery, and machine learning to monitor field activity, providing data on crop types, field sizes, sowing and harvesting dates, as well as historical agricultural activity over the past three years. Updated fortnightly, the API is designed to help the agricultural ecosystem build targeted solutions to enhance productivity and climate resilience.
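The article does not publish AMED's actual schema or endpoints, but the data it describes per field (crop type, field size, sowing and harvest dates) can be sketched as a simple record. The class and field names below are illustrative assumptions, not the real API's interface:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical record mirroring the fields the article says AMED reports
# for each monitored field: crop type, field size, and sowing/harvest
# dates. The real API's schema is not described here, so these names
# are assumptions for illustration only.
@dataclass
class FieldActivity:
    crop_type: str
    field_size_hectares: float
    sowing_date: date
    harvest_date: date

    def growing_days(self) -> int:
        """Days elapsed between sowing and harvest."""
        return (self.harvest_date - self.sowing_date).days

# Example: a single kharif-season rice field (illustrative values).
record = FieldActivity(
    crop_type="rice",
    field_size_hectares=1.2,
    sowing_date=date(2024, 6, 15),
    harvest_date=date(2024, 10, 20),
)
print(record.growing_days())  # 127
```

With fortnightly updates and three years of history, downstream tools could aggregate such records over time to estimate harvest volumes, one of the uses the article mentions.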
Alok Talekar, Agriculture and Sustainability Research Lead at Google DeepMind, said the new API aims to deliver “granular, real-time data” to support more impactful agricultural interventions. The tool is expected to assist with crop-specific needs and harvest volume prediction.
Start-ups such as TerraStack, incubated at IIT Bombay, are already exploring the use of Google’s APIs to build rural land intelligence systems for applications like climate risk assessment and rural lending.
In a parallel effort to improve the cultural relevance of AI models, Google has partnered with IIT-Kharagpur on its Amplify Initiative, which focuses on developing high-quality, hyperlocal datasets that reflect India’s linguistic and cultural diversity. These datasets will be made freely available to support the development of more inclusive AI systems.
The initiative follows earlier pilots in Sub-Saharan Africa and will now begin building datasets in Indic languages on topics including healthcare and safety. Google said the datasets are collected using a community-centric approach and expert review to ensure responsible development and minimise bias.
“AI models can be even more helpful with a deeper understanding of the vastness and complexity of the lived human experience,” said Madhurima Maji, Lead Program Manager for the Amplify Initiative in India.
IIT-Kharagpur’s Dr Mainack Mandal described the collaboration as a “new chapter in global AI development”, citing the importance of making AI more responsive to India’s “incredible plurality”.
These efforts build on Google’s ongoing Project Vaani, which recently reached a milestone by contributing over 21,000 hours of Indic speech audio and more than 800 hours of transcribed speech. The data, gathered from over 112,000 speakers across 120 districts, is being made available through India’s Bhashini platform and Hugging Face.
Dr Partha Talukdar, Language Research Lead at Google DeepMind, said the company is focused on enabling AI experiences that are “natural and intuitive” for users across India.
Google also highlighted the broader impact of its AI research, pointing to applications in maternal health, clinical efficiency, and agricultural intelligence. The AlphaFold Protein Structure Database, developed by Google DeepMind, is reportedly being used by over 150,000 researchers in India to advance studies on diseases such as cancer and autoimmune disorders.