(1) Use of AI apps that "undress" women in photos has skyrocketed in the United States
(2) Nude images can be produced and distributed without the subjects' consent
(3) AI image [Herald DB]
(4) [Herald Economy = Reporter Son In-gyu] In the United States, the number of users of deepfake applications and websites that use artificial intelligence (AI) to strip women's clothes from photos has surged, raising concerns about abuse
(5) Bloomberg reported on the 10th, citing the social network analysis company Graphika, that 24 million people visited deepfake websites that use AI to "undress" people in photos in September alone Deepfake, a combination of "deep learning" and "fake," refers to AI-generated images or videos that manipulate faces to look real
(6) According to Graphika, the number of links advertising AI undressing apps on social media such as X (formerly Twitter) and Reddit increased by more than 2,400 percent in September compared with the beginning of this year These deepfake apps and websites use AI to create images that appear to show the person in a photo undressed, and the people depicted are mostly women
(7) Graphika attributed the popularity of such apps and websites to the release of AI models that can create images far more convincing than those of just a few years ago Developers can build undressing apps for free using open-source AI
(8) However, Bloomberg noted that these apps and websites are intensifying concerns that recent advances in AI technology could be exploited for crime Taking photos from social media without the subjects' consent or control, turning them into pornographic material such as nude images, and distributing them can cause serious legal and ethical problems
https://biz.heraldcorp.com/view.php?ud=20231210000176