California bill would require parent influencers to remove content featuring their children

As the daughter of a social media influencer, Caymi Barrett said she navigates life following a digital trail she wishes wasn’t there.
“Everything my mother posted is on social media,” she said. “Photos I wish never saw the light of day, secret details about my life, even when I started my first period.”
Barrett was speaking at a news conference Wednesday in support of Senate Bill 1247, which would require social media platforms to provide a process for adults to request the removal of content that features them as minors and was created by a family member who was compensated for sharing the material online.
The bill would require the parent or other relative to remove or edit the content within 10 business days of receiving the request. Those who fail to comply could face civil action, with statutory damages capped at $3,000 for each day the content remains online.
Sen. Steve Padilla (D-San Diego), who introduced the bill last month, said it would help protect the dignity and mental health of those whose childhoods were documented on social media. The measure has been referred to the Senate Privacy, Digital Technologies and Consumer Protection Committee and is scheduled for a hearing on April 6.
“The evolution of these applications and technologies is amazing,” Padilla said. “But it changes our social dynamics and creates conditions where, while very productive for some people, they also require caution.”
The bill would build on previous legislation from Padilla, signed into law two years ago, that requires content creators who feature children in at least 30% of their work to set aside a portion of their earnings in a trust the children can access when they turn 18.
Alyson Stoner, a former child actress who appeared in films such as “Step Up” and “Cheaper by the Dozen,” spoke at the news conference and said she had suffered a range of harms as a result of her childhood being made public. Strangers dissected her appearance, her face was superimposed onto pornographic images, and a stalker showed up at one of her dance lessons.
While the dangers of child stardom are well known, Stoner is concerned that social media is now creating similar conditions for children across the country.
“Boundaries have blurred as the family home becomes a content set and a child’s real life becomes entertainment,” said Stoner, who now works as a mental health advocate. “The family members or adults around them who are supposed to be safe, trustworthy people are often the ones doing the recording.”
Barrett, who recalls being targeted by predators and harassed online, said her mother knew the harm the posts were causing but continued to share her daughter’s life on social media.
“Everything that came with those posts undermined my safety and well-being,” she said. “To this day, I still wonder what strangers know about me and whether they have preconceived ideas about me based on what my mother posted.”
Parents who create content centered on their children have come under increased scrutiny in recent years after Ruby Franke — a prominent “mommy blogger” who shared stories about her Utah family on YouTube — pleaded guilty to child abuse in 2023. Her daughter, Shari Franke, now advocates for stronger child protections online.
Keeping children safe on social media or while using artificial intelligence is a hot topic in California and nationally. Gov. Gavin Newsom said California is paving the way for legal restrictions on social media and artificial intelligence, but child safety advocates say there’s still a long way to go.
A landmark ruling this week in Los Angeles County Superior Court could reshape whether tech companies are responsible for the harms their products cause to children. On Wednesday, the court found that Instagram and YouTube can be held responsible for designing their platforms to attract young users.