
Schools, teachers are posting videos of students online: Here is why parents should be concerned

This Deepavali, amid the Hindi-speaking meme trendsetters in Swiggy India's advertisement, appeared a small burst of a Tamil storm. With her two neatly folded braids tied in bright yellow ribbons, the three-foot-tall figure that hopped in and out of the frame was none other than Siva Dharshini. The "believe yourself" fame gained widespread attention when classroom reels of her, shot in a government school in Madurai, started circulating online. Her confident expressions and joyful engagement won over viewers.

Siva Dharshini, however, isn't the only child who pops up on our feeds. There are countless other classroom videos of children explaining math, showing off clay art, or sharing skills they learnt at school. From up north, there are even videos of children cheerfully telling the camera what they brought for lunch, or announcing who their favourite Bollywood heroes are. And just days before Swiggy India's ad was released, another classroom video went viral: a teacher asks her students to step forward one by one and share their holiday plans. The first child says he'll visit his ooru (hometown) and cut a cake for his younger brother; when the teacher asks "endha ooru?" (which hometown?), he innocently names his village. One by one, each child is made to repeat the same drill on camera, revealing how and where they plan on spending their holidays.

What these seemingly harmless classroom videos reveal is a growing trend: Indian schools, both public and private, and their teachers are steadily expanding the digital footprint of young students, and sometimes even exposing vital demographic details that can give strangers access to these children. While there is growing discussion about why parents should think twice before posting their children online or pushing them toward the child-influencer track, far less is said about schools doing the same. Somehow, parents don't seem to recognise that the risks are no different.

Do you consent?

Only one out of the six parents CE spoke to said that their child's school had asked for consent before posting pictures of their child on the school's social media accounts. "It wasn't a separate consent form. It was simply a multiple-choice question," says Vyas Srinivasan, a parent.

"The issue starts here," notes Stegana Jency, director of the Centre for Child Rights and Development. "It is good if schools are asking for parental consent, but does it count if they are not allowing the parents to make an informed choice? The answer is no. The consent form, at all costs, needs to explicitly state the risks that are associated with posting photographs and videos to their openly available and accessible accounts on the Internet," she explains, adding that schools don't even include a simple note in their captions asking others not to reuse the content without authorisation, even if such a disclaimer may offer little legal protection in case of mishaps.

Another parent said that their child's school had a policy of not taking classroom photographs or videos, but that it does post photos and videos from celebrations, annual day, and sports day. "They don't really ask consent, but it's understandable, I guess." Four other parents, meanwhile, said that their respective wards' schools took classroom photos and videos for internal purposes, such as the schools' newsletters. But when asked whether the school had provided any documentation assuring that these photos and videos wouldn't be repurposed, or that it would take responsibility if they were, the parents said no such assurances were given.
One of them even asked, "What could possibly go wrong?"

A 2024 New York Times investigation found that on Telegram, groups of men were openly sharing fantasies about sexually abusing minors featured on Instagram and praising how easily the platform made such images accessible. In these exchanges, paedophiles compared Instagram to a "candy store".

Children's clear photographs, in the wrong hands, can be even more dangerous in the age of AI, says Rahul Roy, a machine learning camera systems engineer. "Four or five years back, AI models were not as good, and they needed a huge amount of data and per-person training to generate a convincing deepfake. That was why deepfakes worked only on celebrities, because they had a significant online presence. Now, in 2025, AI models have become so much better. These state-of-the-art models only need some features of the face to generate very convincing deepfakes," Rahul warns.

Vinod Arumugam, a cyber social activist, elucidates, "When schools post photos or videos of children on social media, anyone who downloads them can see the location details, and that can pave the way to any kind of attack. When schools post over a period of time, a lot of data becomes available to easily track a child." The classroom videos can sometimes reveal far more than intended: the school a child attends, its location, their class, their names, and even details on visible ID cards such as blood group or parents' phone numbers. "Together, this information can make families vulnerable to targeted cyberattacks too," Vinod adds.

Ridicule and crime

But cybercrime isn't the only danger. Children are also exposed to ridicule. Since many schools and teachers don't make active efforts to remove or report re-uploads, these videos are often reused for mockery. Even Siva Dharshini faced derision for her looks and voice. Meme pages routinely exploit children's clips, adding captions like "Bunty, tu slow hai kya?" (Bunty, are you slow?) to shame those who appear shy or introverted.

"Children may not grasp this harassment now, but the digital trail can follow them and hurt them when they're older," Stegana flags. She cites a recent extreme case in which 16 children were sexually abused at a children's home. Once the incident surfaced, their identities were put at serious risk: the media was able to trace and expose photographs of the survivors, in violation of the POCSO Act, through the home's social media account. "Thankfully we had them taken down quickly, but this is a clear example of how things can go out of hand in such extreme cases," she adds.

And how does the vulnerability ladder work? Firstly, children are automatically the most vulnerable. Secondly, their gender often shapes the level of risk they face in cases of sexual abuse, but it doesn't end there. Class and caste also play a major role, with children from lower-income or marginalised caste backgrounds being more frequently targeted. The same hierarchy fuels ridicule too: those with the least power are often attacked the most.

So is it right to argue that schools should never photograph or film children? Child-rights activists, cyber-safety experts, and AI researchers say the answer is more nuanced. Stegana says, "We aren't asking schools and teachers to completely refrain from the practice, but are only urging them to do so safely, by protecting the identity of children wherever necessary. There is a need to evolve a protocol and bring about a clear policy to safeguard children on this front."
Vinod adds that his push is towards a more integrated framework that includes four stakeholders: the state, the media, the respective institutions, and the people. "The state should bring in a policy, the media should question and keep the policy in check, and the institutions and people should be more aware of the risks and abide by the policy," he said.

Where AI is concerned, there is a need for the law to protect citizens better, believes Rahul. "Yes, photos are being used to train AI models, but we can't prevent that. Mathematically speaking, AI models are not copying any data but using the data to learn the weights inside them so that they can come up with a probability distribution. From a technological point of view, it is good that such a tool exists, but it is up to our government and lawmakers to protect us from misuse," he concludes.

Safeguard children from AI-related attacks

- Do not upload high-resolution photos or videos on the internet, as AI models need high-resolution images to work with.
- Do not upload a photo or video that has a clear view of the face. It is harder for a model to extract facial features if the picture is taken from a different angle.
- There are tools that add subtle noise or a watermark to the face; this ruins the features of the face in vector space while remaining barely visible to the human eye. Since AI models work in vector space, it makes it harder for them to work with the images.

Tips to protect children better

- Blur or mask faces (a rough code sketch follows these lists)
- Avoid showing ID cards, name tags, or uniform badges
- Remove personal details such as notebooks, labels, timetables, or anything with names, phone numbers, or addresses
- Never post children answering personal questions
- Do not share real-time content, especially when children are on a field trip
- Avoid capturing children in vulnerable moments
- Turn off location tagging and remove geotags
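For school staff who prepare these posts themselves, the first and last of the tips above can be partly automated before an image ever leaves the school computer. What follows is a minimal illustrative sketch in Python, not something recommended by the experts quoted here: it assumes the OpenCV and Pillow libraries and a hypothetical file named annual_day.jpg, uses OpenCV's bundled Haar cascade to find and blur faces, and then re-saves the picture without its EXIF metadata, which is where GPS geotags live.

# blur_and_strip.py: an illustrative sketch; file names and settings are placeholders
import cv2
import numpy as np
from PIL import Image

INPUT = "annual_day.jpg"        # hypothetical photo a school wants to post
OUTPUT = "annual_day_safe.jpg"  # the version that would actually be uploaded

# Load with Pillow, then hand the pixels to OpenCV for face detection.
original = Image.open(INPUT).convert("RGB")
frame = cv2.cvtColor(np.array(original), cv2.COLOR_RGB2BGR)

# OpenCV ships a pre-trained Haar cascade for frontal faces.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
detector = cv2.CascadeClassifier(cascade_path)
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

# Blur every detected face region heavily enough to be unrecognisable.
for (x, y, w, h) in faces:
    region = frame[y:y + h, x:x + w]
    frame[y:y + h, x:x + w] = cv2.GaussianBlur(region, (51, 51), 0)

# Rebuild the image from raw pixels and save it without passing any EXIF data,
# so GPS geotags and other camera metadata from the original file are dropped.
safe = Image.fromarray(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
safe.save(OUTPUT)

print(f"{len(faces)} face(s) blurred; saved {OUTPUT} without metadata.")

A Haar cascade will miss faces in profile or in poor light, so the result still needs a human check before posting; the sketch only shows that the two steps the experts describe, masking faces and removing geotags, take a few lines of widely available code.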

18 Nov 2025 6:00 am