It only takes seconds for AI to turn an innocent photo into something pornographic that can be distributed online. CBC's Ellen Mauro breaks down how the images are being used illegally, explains why they're nearly impossible to stop and sees first-hand how easy these deepfakes are to make.
Salvation Army ends housing program in Durham Region after funding cuts — and some landlords say it owes them
- today, 5:14 AM
- cbc.ca