5 Comments
Richard Kraushuber

I do not have a problem with that; big thanks for the U.S. customs info.

Kenneth E. Harrell

Well, at least on iOS, Face ID can be required for every application, enforcing biometric authentication before any app on your phone can be opened.

That can help mitigate at least some of the issues mentioned in the earlier part of the post.

Athena

Great post! Have you read Joy, B. (2018). Why the future doesn't need us. In R. V. Yampolskiy (Ed.), Artificial intelligence safety and security (pp. 3-20). Boca Raton, FL: Chapman and Hall/CRC?

Some of his visions of the future seem increasingly salient.

Stephen Fitzpatrick

In several of the reported cases of teen suicide, it wasn't until parents or loved ones went through the teen's ChatGPT history that they saw the depth of the problems they were having. I suspect it will work its way into TV shows and movies: police combing through AI logs just like they do with cell phones. Interesting piece. On another note, looking forward to the book ...

Jess Maeve

You are definitely on the right track; this topic needs to be addressed more fully. I've been writing about the intersection of AI and domestic violence, and I'd love for you to check out my essays, as you appear to share my concerns, albeit from an entirely different starting point.

Relative to what you mentioned about law enforcement: I can attest from experience that they are not allowed to log into someone's account without some sort of warrant, and no one really seems to understand how to obtain one for non-tangible items. I offered the login credentials to the detective working on the felony charges brought against my soon-to-be ex-husband, but the detective was only permitted to receive a USB drive from me, on which I had saved an export of the whole ChatGPT account. This is where my ex spent countless months straight having conversations about my mental stability and their "shared" (echo chamber) view that I had to be a narcissistic pathological liar. These inaccurate conclusions were the basis of the onslaught of abuse I suffered, physically and psychologically. But guess what: law enforcement thinks AI is about deepfakes and other buzzwords, and fails to recognize the tool it has become for an abuser (or even just someone with a lack of respect for your boundaries) to use as a form of manipulation.

The robots won’t take over our physical world until we have gladly handed over our minds. Now is the time to have serious conversations about the potential consequences.
