Amid growing concerns associated with artificial intelligence (AI), Prime Minister Narendra Modi on Wednesday called for close scrutiny of the rapid expansion of AI and asked the Group of 20 (G20) nations to join hands in addressing the issue. During the virtual G20 Summit, the PM said that AI should be developed keeping in mind the needs of people. “AI should reach people and it must be safe for the society,” he was quoted as saying.
The PM also called deepfake a “big concern”.
This comes a day after the government highlighted its unwavering commitment to protecting citizens from misinformation, especially misinformation originating from AI and deepfakes. During an interview with ANI, the Minister of State (MoS) for Information Technology, Rajeev Chandrasekhar, underscored the government’s ongoing efforts to establish robust frameworks. He added that, if deemed necessary, new legislation would be introduced to safeguard the citizens of the country from the potential threats posed by deepfakes and misinformation.
It is to be noted that the concerns raised by the government and the Prime Minister are in line with recent incidents, such as the deepfake video of actor Rashmika Mandanna. The video, depicting a woman resembling Mandanna entering a lift in a black swimsuit, went viral before social media users identified it as a deepfake. Mandanna expressed her distress, describing the incident as “extremely scary” and highlighting how vulnerable individuals are to harm from the misuse of technology on social media platforms.
To recall, Prime Minister Modi recently acknowledged the significant threats posed by emerging technologies like AI and deepfakes at the Diwali Milan programme on November 17. Speaking at the event at the BJP headquarters in New Delhi, the PM urged caution over the rising threat of AI-generated deepfakes. He shared a personal experience of encountering a fake video depicting him participating in Garba, emphasising the potential for such misinformation to create turmoil and unrest in society.