The arrival of artificial intelligence has already had a transformative impact on society, and experts say it’s only the beginning. But with progress comes an inevitable downside, as the tech revolution threatens something Americans have long valued and fiercely guarded: privacy.

Several recent high-profile incidents underscore the volatile intersection where digital advancement and privacy now collide. 

Ring, the doorbell camera company owned by Amazon, faced enormous backlash after its disastrous Super Bowl ad, which was meant to celebrate the company's technology for tracking down and finding a lost dog. Instead, it provoked outrage from viewers and privacy advocates who saw it as a harbinger of an AI-powered surveillance network that could be exploited by law enforcement and corporate interests. 

The company’s CEO found himself apologizing for Ring’s vast camera network and its capabilities, even though monitoring homes and neighborhoods is its entire business model. 

As a result, Ring canceled its partnership with Flock Safety, a security software firm that sells license plate-scanning tech to law enforcement.

Meanwhile, OpenAI, the company behind ChatGPT, came under fire after it was revealed employees had banned accused Canadian school shooter Jesse Van Rootselaar’s account over disturbing messages — but never alerted police.

Doorbell cams don’t just keep watch on porch pirates; they can catch your neighbor cutting the lawn or power‑walking past your house, or serve as a neighborhood watchdog. 

AI chatbots serve up answers to any questions you have in an instant, and all that personal data winds up on a server farm somewhere. 

The sales pitch is peace of mind, but the actual cost — your privacy — may be greater than you care to pay.

The “Search Party” Ring ad that aired during the Super Bowl generated huge backlash. It was about neighbors helping track a lost dog using outdoor camera footage, but detractors said it raised troubling issues about privacy. Ring

“That’s actually terrifying to me,” says Matt Sailor, CEO of the surveillance solutions company IC Realtime. “You’re going to allow companies to use the data that’s being recorded and archived from your home with your family involved, not caring about the subject matter, and kind of do it under the guise of, ‘oh, we’re doing it to save Fido.’ It’s just wrong.”

“We’re definitely in a stage where we have to start resetting our expectations about what is private,” adds Michel Paradis, a lawyer who teaches a course at Columbia University on the Law of Artificial Intelligence. 

“And we also just have to be very cautious.”

On paper, Americans have never been more protected. 

In practice, the experts say, the system is a joke. 

“Right now the laws we have are essentially running a dial-up connection in a 5G world,” says Paul Armstrong, a tech advisor and founder of TBD Group.

Meta recently paid a $725 million fine to settle accusations of privacy violations, but for such a big company, that's simply the cost of doing business. 

“Fines like this are like affirmations for these big tech firms,” says Sree Sreenivasan, CEO of Digi Mentors. “It shows them they’re on the right track with all of this stuff.”

“A nine-figure fine sounds enormous until you realize the number looks like a rounding error on a quarterly earnings call,” adds Armstrong. 

Peter Jackson, a cybersecurity and privacy attorney at Greenberg Glusker in Los Angeles, says most consumers have no idea how exposed they really are. 

OpenAI banned Jesse Van Rootselaar’s account over disturbing messages before the shooter killed eight people in Tumbler Ridge, BC, Canada — but the company did not alert police. via REUTERS

“Consumers are underinformed about what is happening with their information,” he says. “[Privacy] disclosures are technically thorough and practically useless. Most people don’t really understand what any of it means.”

Jackson agrees that current penalties don’t meet the moment. 

He points to a recent case in which The Walt Disney Co. agreed to pay $2.75 million to settle allegations it violated California’s consumer privacy laws. 

The entertainment giant was accused of not fully complying with users’ requests to opt out of data sharing on Disney’s streaming services. 

A person’s whereabouts as captured by a Ring camera can be used to build a court case against them. WSYX

The fine is a record under California’s privacy act, but as Jackson notes, “That amount is nothing to Disney. US privacy law is not sufficiently armed with penalties strong enough to incentivize companies to do better.”

“The privacy erosion isn’t a bug but a feature of the business model of most tech companies,” says Arash Vakil, a professor of business at CUNY and product consultant. 

“These companies that have a built-in subscription model are going to be able to have a better opportunity to maximize shareholder revenue, shareholder value.” 

“The reality is that these companies live on data,” adds Sailor. 

“They live on the information that you provide them. They’re gathering an amazing amount of information from your daily habits, and your data is not really your data. The companies own it.”

In this photo, Travis Decker is seen on the day he picked up his three young daughters in Wenatchee, Washington — the last time they were seen alive. The girls’ bodies were later discovered in their father’s truck, prompting a manhunt. Decker’s body was found months later. Chelan County Sheriff's Office

Judging by the popularity of doorbell cams and AI engines, consumers seem to be just fine with that outcome. “People are just really lazy, and they kind of always choose the easy way out,” Sailor says.

Many were confused when the FBI was able to retrieve Nest cam footage from the night that Nancy Guthrie, the mother of “Today” co-host Savannah Guthrie, was seemingly abducted from her Tucson, Arizona, home, after law enforcement said the data was inaccessible because the family did not have a paid subscription.

Days later, FBI Director Kash Patel said video from the home was “recovered from residual data located in backend systems.”

Google, the parent company of Nest, has no obligation to retain such data if the user does not have a paid subscription, and it may be overwritten at some point, though it’s not clear when that typically happens. The company’s privacy policy notes that video expires after three hours.

Jaron Mink, an assistant professor of computer science and engineering at Arizona State University, told NPR that the US has lax privacy regulations.

Halloween thieves were identified with this security camera footage. WUSA9

“Sometimes it means that it’s more difficult for the functionality to delete data to actually occur, because it’s not built as a requirement in the system in mind,” Mink said.

While Google has denied that Nest user video is used to train AI models, according to Ars Technica, it has said, “We may use your inputs, including prompts and feedback, usage, and outputs from interactions with AI features to further research, tune, and train Google’s generative models, machine learning technologies, and related products and services.”

Sreenivasan, the former chief digital officer of New York City, says most Americans have made their peace with trading privacy for convenience and a sense of safety. 

“Absolutely, that’s what has happened. People have this weird relationship with technology,” he says. 

Critics say that people have traded privacy for convenience with technology like the Ring camera. Nick Beer – stock.adobe.com

“They want all the convenience and all the privacy, but don’t do anything about the privacy and do everything about the convenience.”

The pattern started long before AI, with cookies — those small text files stored by websites to remember your login and preferences. 

“When cookies first came around, Americans were like, ‘Yeah, whatever,’ ” Sreenivasan says. “You accepted everything, and you didn’t care, because you wanted the convenience. If you stopped accepting cookies, it wouldn’t remember your account, it wouldn’t remember your favorites, it wouldn’t remember your history. It would make online shopping a terrible experience.”

Then came Gmail in 2004, promising “unlimited messages, no having to delete anything,” Sreenivasan recalls. “We knew immediately that they are scanning your emails and giving ads. If my wife wrote me, ‘Honey, can you pick up some milk?’ I’d get an ad for Gristedes.”

In this footage, an Amazon worker was seen taking someone’s cat. Diane Huff-Medina via Storyful

Vakil sees the same pattern now with AI, cameras and apps. “Users or consumers have been very happily trading free stuff for their data,” he says. “We’ve kind of become accustomed to the convenience this technology offers . . . but you have to remember: if the product is free, then you are the product.”

But others think consumers are in an impossible position. “People were never given a genuine choice to either accept these terms or don’t use the product,” says Armstrong. “Opting out increasingly means opting out of modern life.”

Technology always creates new opportunities for invasions of privacy. 

What makes an AI chatbot different from Google is that it doesn’t just spit out links; it talks back. “AI chatbots have a kind of personality to them that makes them feel much more like a confidant,” Paradis says. “There’s a personality there that’s driving these responses.”

That “intimacy of the chatbot experience,” he says, “raises many questions about privacy . . . and how what people ask AI chatbots can potentially be used against them, either by law enforcement or even just socially.”

It has all the features of a confidential relationship — except it isn’t. “Legally, there’s no reason at all that anything you put into a chatbot should be considered as anything other than the type of information you would give to a bank,” Paradis says, noting banks can be forced by court order to hand over customer records.

Camera footage showed two officers in a Bronx apartment building shortly before one of them fired at an intoxicated tenant. Juan Rivera

What responsibility do tech companies have when their tools brush up against real‑world violence? In the wake of the Feb. 10 shooting in British Columbia, OpenAI pledged to overhaul its safety protocols. But would more proactive measures risk a different nightmare — a “Minority Report” world where people are punished for what they might do?

“The ChatGPT situation with the Canadian shooter exposes a no-win scenario nobody has legislated for yet,” Armstrong says. 

“Failing to report means complicity, but reporting means building a surveillance apparatus capable of flagging someone for a thought.”

Paradis agrees it’s a legal gray area. 

“In the early days of Google, people asked if Google should have to report to the police if you are unusually interested in ISIS based on your search history,” he notes. “With AI, you’re not just searching for how to buy a silencer. You’re asking it to explain how to use it and how to install it.”

Don’t expect lawmakers in Washington to sort this out, either.

“Certainly at the federal level, I think that’s going to be very unlikely,” Paradis says of stricter AI legislation. 

Over the next few years, he expects most real action to occur at the state level. The Trump administration has taken “a generally libertarian view,” he notes, even pushing agencies via executive order to look for ways to preempt state AI regulations.

“We are sort of in a digital Wild West,” says Jackson. “Our legal system in its current state is not built to fight such battles.”

Paradis is cautiously optimistic that we’ll eventually adapt, as we have with past disruptive technologies. “This is not the first major technology that’s created huge disruptions to our sense of privacy,” he says, pointing to cameras and radio, once seen as terrifying. “We’ve gotten smarter about it. And I think the same will happen now.”
