Kids are making deepfakes of each other, and laws aren’t keeping up

08.07.2025    MinnPost

This story was originally published by Jasmine Mithani of The 19th. Meet Jasmine and read more of her reporting on gender, politics and policy.

Last October, a teenage boy in Wisconsin used a picture of his classmate celebrating her bat mitzvah to create a deepfake nude he then shared on Snapchat.

This is not an isolated incident. Over the past few years there has been case after case of school-age children using deepfakes to prank or bully their classmates. And it keeps getting easier to do. When they emerged online eight years ago, deepfakes were initially difficult to make. Nowadays, advances in generative artificial intelligence have brought the tools to the masses. One troubling consequence is the prevalence of deepfake apps among young users.

"If we would have talked five or six years ago about revenge porn in general, I don't think that you would have found so many offenders were minors," explained Rebecca Delfino, a law professor at Loyola Marymount University who studies deepfakes.

Related: Trump signs Klobuchar-Cruz Take It Down Act on nonconsensual deepfakes

Federal and state legislators have sought to tackle the scourge of nonconsensual intimate image (NCII) abuse, sometimes referred to as revenge porn, though advocates prefer the former term. Laws criminalizing the nonconsensual distribution of intimate images (for authentic images, at least) are in effect in every U.S. state and Washington, D.C., and last month President Donald Trump signed a similar measure into law, known as Take It Down. But unlike the federal measure, many of the state laws don't apply to explicit AI-generated deepfakes. Fewer still appear to directly grapple with the fact that perpetrators of deepfake abuse are often minors.

Fifteen percent of students reported knowing about AI-generated explicit images of a classmate, according to a survey published in September by the Center for Democracy and Technology (CDT), a center-left think tank. Students also reported that girls were much more likely to be depicted in explicit deepfakes. According to CDT, the findings show that NCII, both authentic and deepfake, is a significant issue in K-12 public schools.

"The conduct we see minors engaged in is not all that different from the pattern of cruelty, humiliation and exploitation and bullying that young people have long done to each other," said Delfino. The difference lies not only in the use of technology to carry out some of that behavior, but the ease with which it is disseminated.

Policymakers at the state and federal level have come at perpetrators of image-based sexual abuse hard and fast, no matter their age, Delfino said. The reason is clear, she said: The distribution of nonconsensual images can have long-lasting, serious mental health harms for the target of abuse. Victims can be forced to withdraw from life online because of the prevalence of nonconsensual imagery. Image-based sexual abuse has similar negative mental health impacts on survivors as offline sexual violence.

Delfino said that under most existing laws, youth offenders are likely to be treated similarly to minors who commit other crimes: They can be charged, but prosecutors and courts would likely take into account their age in doling out punishment. Yet while some states have developed penal codes that factor a perpetrator's age into their punishment, including by imposing tiered penalties that attempt to spare first-time or youth offenders from incarceration, most do not. While most agree there should be consequences for youth offenders, there's less consensus about what those consequences should be, and a push for education over severe charges.

Jail time offers answers and questions

A survey by the Cyber Civil Rights Initiative (CCRI), a nonprofit that combats online abuse, found that people who committed image-based sexual abuse cited the threat of jail time as one of the strongest deterrents against the crime. That's why the organization's policy recommendations have consistently pushed for criminalization, said Mary Anne Franks, a law professor at George Washington University who leads the initiative.

Many states have sought to address the issue of AI-generated child sexual abuse material, which covers deepfakes of people under 18, by modifying existing laws banning what is legally known as child pornography. These laws tend to have more severe punishments: felonies instead of misdemeanors, high minimum jail time or considerable fines. For example, Louisiana mandates a minimum five-year jail sentence, no matter the age of the perpetrator.

While incidents of peer-on-peer deepfake abuse are increasingly cropping up in the news, information on what criminal consequences youth offenders have faced remains scarce. There is often a considerable amount of discretion involved in how minors are charged. Generally, juvenile justice falls under state rather than federal law, giving local officials leeway to impose punishments as they see fit.

If local prosecutors are forced to decide between charging minors with severe penalties that are aimed at adults or declining to prosecute, most will likely choose the latter, said Lindsay Hawthorne, the communications coordinator at Enough Abuse, a Massachusetts-based nonprofit fighting against child sexual abuse. But that throws away an opportunity to teach youth about the consequences of their actions and prevent reoffending. Charges that come at a prosecutor's discretion are also more likely to disproportionately criminalize youth of color and LGBTQ youth, she said.

A different approach to incarceration

Delfino noted that in an ideal case, a judge in juvenile court would weigh many factors in sentencing: the severity of the harm caused by deepfake abuse, the intent of the perpetrator and adolescent psychology. Experts say that building these factors directly into guidelines can help better deal with offenders who may not understand the consequences of their actions, and allow for different enforcement mechanisms for people who say they weren't seeking to cause harm.

For example, new laws passed this session in South Carolina and Florida have proportional penalties that take into account circumstances including age, intent and prior criminal history. Both laws mirrored a model bill written by MyOwn Image, a nonprofit dedicated to preventing technology-facilitated sexual violence. Founded by image-based sexual abuse survivor Susanna Gibson, the organization has been advocating for stronger laws banning nonconsensual distribution of intimate images at the state level, bringing a criminal justice reform lens into the debate.

Under the Florida law, which took effect in May, offenders who profit from the distribution of nonconsensual intimate images are charged with felonies, even for a first offense. But first-time offenders who use intimate images to harass victims are charged with a misdemeanor; if they do it again, they are charged with a felony. This avoids sweeping criminalization of people who may not fully understand the harm caused by their actions, Will Rivera, managing director at MyOwn Image, said in a statement.

South Carolina's newly passed law addressing AI-generated child sexual abuse material, meanwhile, explicitly states that minors with no prior related criminal record should be referred to family court, and recommends behavioral health counseling as part of the adjudication. A separate South Carolina law banning nonconsensual distribution of intimate imagery also has tiered charges depending on intent and previous convictions.

Beyond criminalization

Experts are mostly united in believing that incarcerating youth offenders would not solve the issue of image-based sexual abuse. Franks said that while her group has long recommended criminal penalties as part of the answer, there need to be more policy solutions for youth offenders than just threatening jail time.

Amina Fazlullah, head of tech policy advocacy at Common Sense Media, explained that laws criminalizing NCII and abusive deepfakes need to be accompanied by digital literacy and AI education measures. That could fill a massive gap: According to Stanford, there currently isn't any comprehensive research on how many schools specifically teach students about online exploitation.

Since most teens aren't keeping abreast of criminal codes, AI literacy initiatives could teach young users what crosses the line into illegal behavior and provide information for victims of nonconsensual intimate imagery to seek redress. Digital literacy could also emphasize ethical use of technology and create space for conversations about app use. Hawthorne noted that Massachusetts' law banning deepfakes, which went into effect last year, directs adolescents who violate it to take part in an education program that explains laws and the impacts of sexting.

Ultimately, Franks said, the behavior that underlies deepfake abuse isn't new, and so we do not need to rewrite our responses from scratch. "We should just stick to the things that we know, which don't change with technology, which is consent, autonomy, agency, safety. Those are all things that should be at the heart of what we talk to kids about," she said.

Like abstinence-only education in schools, shaming and scaring kids about more common practices like sexting is not an effective way to prevent abuse, Franks noted, and can discourage kids from seeking help from adults when they are being exploited. Franks noted that parents, too, have the power to instill in their children agency over their own images every time they take a photo.

She also said there are myriad other ways to regulate the ecosystem around sexually explicit deepfakes. After all, most policy around deepfakes addresses harm already done, and laws like the federal Take It Down Act put a burden on the victim to request the removal of their images from online platforms. Part of addressing the issue is making it harder to create and rapidly distribute nonconsensual imagery, and keeping deepfake tools out of kids' hands, experts said.

One avenue for change that advocates see is applying pressure on companies whose tools are used to create nonconsensual deepfakes. Third parties that help distribute them are also becoming a target. After a CBS News investigation, Meta took action to remove advertisements for so-called nudify apps on its platforms. Franks also suggested app stores could delist them.

Payment processors, too, have a lot of power over the ecosystem. When Visa, Mastercard and Discover cut off payments to PornHub after a damning New York Times report revealed how many nonconsensual videos it hosted, the largest pornography site in the world deleted everything it couldn't confirm was above board, the majority of its total content. Last month, Civitai finally cracked down on generative AI models tailored around real people after payment processors refused to work with the company. This followed extensive reporting by tech news site 404 Media on the image platform's role in the spread of nonconsensual deepfakes.

And of course, Franks said, revamping the liability protections digital services enjoy under Section 230 could force tech companies' hands, compelling them to be more proactive about preventing digital sexual violence.

A version of this article first appeared in Tech Policy Press.
