Winsome Sears Slams Concerns Over ‘Misgendering’ Nashville School Shooter

Virginia Lt. Governor Winsome Sears (R-VA) said it was ridiculous for people to be concerned about “misgendering” the woman who murdered six people at a Christian school in Nashville this week. 

Sears made the comments during Friday’s episode of “Real Time with Bill Maher,” where she appeared on a panel alongside journalist and author James Kirchick.

“This person murdered six people, I don’t really care who you say you are. You murdered six people, and three of them were children. You don’t get a say. Well, she’s dead now, but you don’t get a say in telling us who you are and what you’re about. You killed six people,” Sears said. 

"They are misgendering and dead naming the murderer. They are referring to the murderer by their given name and not their chosen name. … If someone says they're a man they're a man."

"This person murdered 6 people. I don't really care who you say you are." pic.twitter.com/AYqQ1s0wc6

— Eric Abbenante (@EricAbbenante) April 1, 2023

Sears was referring to the 28-year-old woman who shot and killed six people at The Covenant School in Nashville, Tennessee, on Monday. Officials have not yet announced a motive for the shooter, who identified as transgender, but they have described the attack as targeted.

In the days following the shooting, media outlets such as The New York Times and USA Today scrambled after claiming that police had “misidentified” the shooter’s gender.

“There was confusion later on Monday about the gender identity of the assailant in the Nashville shooting,” the Times said. “Officials had used ‘she’ and ‘her’ to refer to the suspect, who, according to a social media post and a LinkedIn profile, appeared to identify as a man in recent months.”

Sears made her comments after Kirchick said that some in the media had not followed their own rules, continuing to identify the shooter as female despite the shooter’s self-identification as male.

“If you noticed, they are misgendering and deadnaming the murderer, right. They are referring to the murderer by their given name, not their chosen name, and referring to her as a woman, as opposed to what her identity apparently was as a man, which is not the way the media usually does these things,” he said. “They are usually very particular about the subjective sense of gender identity and respecting that. If someone says they’re a man then they’re a man, but in this case they’re not doing that.”

Sears, who won an upset electoral victory in Virginia alongside Governor Glenn Youngkin, later said that parents deserved more say over the education of their children.

“I’m a parent, I’m a parent all day. I get to decide what happens in my child’s life, not you, not the government, not anybody. I don’t co-parent. I had this child, I’m responsible for this child. If anything happens to little Johnny, you’re calling me. If I don’t want my child given lap dances at school by a drag queen, I don’t want it done.”

Italy Bans ChatGPT Over Data Privacy, Child Safety Concerns

Italy has temporarily banned the artificial intelligence chatbot ChatGPT over concerns about data privacy and child safety.

In a statement Friday, the Guarantor for the Protection of Personal Data (GPDP), which oversees data privacy online, barred the U.S.-based chatbot and its developer, OpenAI, from processing data from users in Italy. The agency said that OpenAI had no legal basis to collect data from Italian users to train the model and no age verification system to protect children from inappropriate answers.

“[There is] no way for ChatGPT to continue processing data in breach of privacy laws,” the agency said in a press release Friday. “The Italian [Supervisory Authority] imposed an immediate temporary limitation on the processing of Italian users’ data by OpenAI, the US-based company developing and managing the platform. An inquiry into the facts of the case was initiated as well.”

Much of the agency’s concern stemmed from the amount of information the app collects to train its language models. “[N]o information is provided to users, nor to interested parties whose data was collected by OpenAI, LLC and processed through the ChatGPT service,” the GPDP order stated. The order also noted that the data processed by ChatGPT can be inaccurate, since the AI’s responses do not always match factual circumstances.

Furthermore, the order states that there is an “absence of a suitable legal basis in relation to the collection of personal data and their treatment for the purpose of training the algorithms underlying the functioning of ChatGPT.” The agency also cited a March 20 data breach that exposed the conversations and payment information of some users of the more sophisticated, paid version of ChatGPT.

The other issue highlighted by the Italian government was the lack of age verification. OpenAI’s terms of service say the service is intended for users aged 13 and older, but Italy said the absence of age verification filters for minors “exposes them to absolutely unsuitable responses with respect to their degree of development and self-awareness.” The agency said that because of the missing age checks and the broader violations of Italian law, it had no choice but to impose a blanket ban on harvesting data from all users.

The ban went into effect immediately. OpenAI has 20 days to notify the GPDP of the measures it is taking to comply with the order; otherwise, it faces a fine of up to 20 million euros or 4% of the company’s annual global revenue.

The move from the Italian government comes just days after a Belgian man reportedly committed suicide following weeks of conversations with a different chatbot. According to Euronews, the man, who was in his 30s and had a wife and two young children, took his own life after interacting for several weeks with ELIZA, a chatbot powered by a different language model.

According to his wife, who spoke with several European news outlets, the man became fixated on climate change and spent his days confiding his fears in ELIZA. The chatbot allegedly stoked those fears until he developed suicidal ideation, and he reportedly suggested sacrificing himself so that ELIZA could save humanity through AI.