On ChatGPT Impersonation

Unfortunately, ChatGPT can be used to convincingly impersonate individuals. If it is deployed in settings like DMs and text messages, personal information shared in those conversations with a reasonable expectation of privacy can be exposed to others, putting the identities of innocent people at risk. Even when you think you are not sharing anything sensitive about yourself, seemingly innocuous data about you can be weaponized against you if it falls into the wrong hands.
