A form of manipulation that develops with new technologies: Deepfake

DOI : https://doi.org/10.32739/uha.id.12612

Experts warn against content produced with deepfake technology, which has recently been high on the agenda... Such content is so convincing that detecting it requires a very detailed examination. Assoc. Prof. Bahar Muratoğlu Pehlivan explained that detection involves checking for errors in the images, defects in the person's face and head movements, the frequency of blinking, and whether the image distorts when the person moves. She noted that a person's appearance, movements and voice can be imitated in a way that is very close to reality, said that the use of deepfakes for fake news and black propaganda is a great concern, and added that deepfake technology is an extremely dangerous tool for manipulating society.

Assoc. Prof. Bahar Muratoğlu Pehlivan: "Deepfakes are an extremely dangerous tool to manipulate society"

Assoc. Prof. Bahar Muratoğlu Pehlivan, Deputy Dean of Üsküdar University's Faculty of Communication and a faculty member in the Department of Journalism, made statements on deepfakes, which have recently been high on the agenda.

Deepfakes can imitate a person's image and voice in a way that is very close to reality...

Defining 'deepfake' as a form of manipulation developed with new technologies, Assoc. Prof. Bahar Muratoğlu Pehlivan said: "It can imitate a person's appearance, movements and voice in a way that is very close to reality. In this way, it can make it look as if a person did things they never did and said things they never said. More generally, synthetic media technology can not only simulate people's actions but also produce images of people who do not actually exist, or alter various properties of real images."

Deepfakes hinder citizens' right to accurate information

Noting that deepfake technology is becoming increasingly common and easier to use, Pehlivan said: "Deepfake technology is based on deep learning-based artificial intelligence, which transfers faces from one person to another using source images of the person to be imitated."
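To make the face-transfer idea mentioned above concrete, the sketch below shows, in highly simplified form, the shared-encoder / per-identity-decoder architecture used by many face-swap tools: one encoder learns a common face representation, a decoder trained on person B reconstructs B's face, and a "swap" is produced by decoding A's encoded frame with B's decoder. This is a minimal illustration written for this article, not the specific technology Pehlivan describes; the layer sizes, image resolution and variable names are assumptions chosen for brevity, and training code is omitted.

import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, latent_dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64x64 -> 32x32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32x32 -> 16x16
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self, latent_dim: int = 128):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # 16x16 -> 32x32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),   # 32x32 -> 64x64
            nn.Sigmoid(),
        )

    def forward(self, z):
        h = self.fc(z).view(-1, 64, 16, 16)
        return self.net(h)

# One shared encoder, one decoder per identity.
encoder = Encoder()
decoder_a = Decoder()   # would be trained to reconstruct person A
decoder_b = Decoder()   # would be trained to reconstruct person B

# After training each (encoder, decoder) pair as an autoencoder on faces of one person,
# a swap is produced by encoding a frame of A and decoding it with B's decoder.
frame_of_a = torch.rand(1, 3, 64, 64)      # stand-in for a real source frame
swapped = decoder_b(encoder(frame_of_a))   # B's face with A's pose and expression
print(swapped.shape)                       # torch.Size([1, 3, 64, 64])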

Stating that one of the concerns in this regard is that it often targets women, Pehlivan said: "On the other hand, its use for fake news and black propaganda is also a great concern. It appears to be an extremely dangerous tool for manipulating society. In an age when false information spreads very quickly, and when its spread cannot be stopped and the damage cannot be undone even after it is shown to be false, this creates serious problems at both the individual and the social level. In this respect, it poses a threat to the democratic order and hinders citizens' right to obtain accurate information."

Manipulative fake content can cause polarization...

Reminding that producing and disseminating fabricated images of individuals is a crime in itself, Assoc. Prof. Bahar Muratoğlu Pehlivan said: "False information is spread about people whose images are simulated through deepfakes, and the personality rights of these individuals are violated. On the other hand, the ability to produce images in this way makes it easier to commit other crimes through new communication technologies. Another problem is that sharing fake content that can manipulate society can cause polarization and conflict."

Detecting deepfake content requires a very thorough examination

Underlining that images should be examined in great detail and each frame analyzed separately in order to detect deepfake content, Pehlivan said: "Errors in the images, defects in the person's face and head movements, the frequency of blinking and whether the image distorts when the person moves are examined. Tools and programs developed for this purpose can also be used."
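As a rough illustration of the frame-by-frame inspection described above, the sketch below walks through a video with OpenCV, measures how much each frame differs from the previous one, and flags frames whose change is unusually large, a crude stand-in for "distortion when the person moves". The file name and the threshold are assumptions for this example; the tools Pehlivan refers to use far more sophisticated, learned features.

import cv2
import numpy as np

def flag_suspicious_frames(video_path: str, z_threshold: float = 3.0):
    """Return indices of frames whose change from the previous frame is anomalously large."""
    cap = cv2.VideoCapture(video_path)
    diffs = []
    prev_gray = None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if prev_gray is not None:
            # Mean absolute difference between consecutive frames.
            diffs.append(float(np.mean(cv2.absdiff(gray, prev_gray))))
        prev_gray = gray
    cap.release()

    diffs = np.array(diffs)
    if diffs.size == 0:
        return []
    mean, std = diffs.mean(), diffs.std()
    # Flag frames whose change is far above the video's own average.
    return [i + 1 for i, d in enumerate(diffs) if std > 0 and (d - mean) / std > z_threshold]

if __name__ == "__main__":
    print(flag_suspicious_frames("interview_clip.mp4"))  # hypothetical file name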

Noting that artificial intelligence applications are being developed to detect deepfakes, Pehlivan concluded her remarks as follows: "Of course, deepfake technology is also improving day by day, and current verification tools may one day become inadequate. However, verification and synthetic media detection techniques will continue to develop along with it. The important thing is to be skeptical of content spread through new communication technologies and to check whether it has been confirmed."

Üsküdar News Agency (ÜNA)