BFM Top 5 at 5: Story 3 Deep faked images of two 18-year-olds circulated on social media

Sinar Project fellow Melissa Lim explains to BFM why Malaysian authorities may not be able to take any action, and discusses possible solutions to the growing problem of deepfake images and videos.

Resource Type:
News Article

Melissa shares that in this specific case of fake images of two young women, enforcement practices are not enough. Section 6 of the Sexual Offences Against Children Act only makes it a crime to make and produce sexual abuse material, and this does not include nude pictures.

Melissa also speaks about what needs to be done to improve enforcement and what should be included in the upcoming AI bill, which is set to be tabled sometime next year.

"It is also more important to have digital education and that needs to be accessible to the public especially
kids because we cannot rely solely on criminalizing AI. AI is a tool and any tools can be misused."

"We need to educate people to use the tools better and not misuse them. The AI bill should give more options for recourse for victims because as of now it looks like the only recourse that victims have is the criminal option and any civil action will also rely on the police to identify the perpetrator to run investigations against the perpetrator and charge the perpetrator."

"So the government needs to be equipped with better enforcement structures and tools so that the police is more enabled to identify and find the perpetrator."

Listen to the podcast starting from 12:20.
