
About Deepfake Detection

What you see... is no longer the truth.

“Deepfakes will challenge public trust in what’s real.”

Cybersecurity faces an emerging threat generally known as deepfakes. Malicious use of AI-generated synthetic media, potentially the most powerful cyber-weapon in history, is just around the corner, and the cybersecurity industry has only a short time to get ahead of it before it challenges public trust in reality. Hany Farid, the "father" of digital image forensics, put it bluntly to The Washington Post: "Increasingly accessible tools for creating convincing fake videos are a deadly virus. However, the number of people working on the video-synthesis side, as opposed to the detector side, is 100 to 1."

Nation-states and Hollywood VFX artists have been able to manipulate media since its very beginning, but today anyone can download deepfake software and create convincing fake videos in their spare time, because the cost of producing these new forms of synthetic media has dropped dramatically. It will soon be as easy to create a fake video as it is to add an Instagram filter. Celebrities and politicians will be the primary targets of the weaponization of this deepfake technology. Deepfakes swap celebrities' faces into porn videos and put words in politicians' mouths, but they could do a lot worse. It may only be a matter of time before the general public is at risk too.

Deepfakes are such a threat to the U.S. that the Defense Department is launching a project to repel "large-scale, automated disinformation attacks". The Pentagon's Joint Artificial Intelligence Center recently declared that deepfakes pose a very real threat to national security. Rep. Adam B. Schiff (D-Calif.), who chairs the U.S. House Intelligence Committee, said, "I don't think we're well prepared at all. And I don't think the public is aware of what's coming." Meanwhile, this month Texas became the first state to criminalize deepfakes.

From a cybersecurity perspective, we have to address all known forgery methods with the highest possible accuracy and develop generalizable artifact-detection methods for “zero-day deepfakes”. The science of detecting deepfakes is, effectively, an arms race: those who develop deepfake technology are acutely aware of its tremendous potential for abuse. Identifying unknown tampered content is technically challenging, which is why our research always has to keep up with, or stay ahead of, the forgers. Given this urgency, the critical need is to develop solid detection algorithms before we are in the eye of the storm.
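As a rough illustration of what "artifact detection" means in practice, the sketch below scores individual video frames for manipulation artifacts and pools the scores into a video-level decision. This is a generic example, not this product's actual engine; the backbone, input size, and pooling strategy are assumptions made only for the illustration.

    # Generic, illustrative sketch of frame-level artifact detection.
    # NOT the product's engine; backbone, input size, and pooling are assumed.
    import torch
    import torch.nn as nn
    from torchvision import models

    class FrameArtifactClassifier(nn.Module):
        """Scores a single frame for the probability that it was manipulated."""
        def __init__(self):
            super().__init__()
            # Small CNN backbone; a real detector would load trained weights here.
            self.backbone = models.resnet18(weights=None)
            self.backbone.fc = nn.Linear(self.backbone.fc.in_features, 1)

        def forward(self, frames):                 # frames: (batch, 3, 224, 224)
            return torch.sigmoid(self.backbone(frames))

    # Pool per-frame scores into a single video-level score.
    model = FrameArtifactClassifier().eval()
    with torch.no_grad():
        frames = torch.randn(8, 3, 224, 224)      # stand-in for 8 decoded frames
        video_score = model(frames).mean().item()
    print(f"estimated manipulation probability: {video_score:.2f}")

In practice such a classifier would be trained on known forgery methods, which is exactly why generalization to unseen, "zero-day" manipulations remains the hard part.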

Our deepfake detection technology is designed to detect deepfake videos and, more broadly, fake content in visual and audio communication. After a year of research in synthetic-media detection, we have built a multi-layered neural engine to spot deepfake content. When a platform integrates our technology, it automatically warns you if you are watching, reading, or hearing fake content. This will enable governments, social media platforms, instant messaging apps, and news and media organizations to detect AI-made forgery in digital content before it can cause social harm.
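To make the integration idea concrete, here is a purely hypothetical sketch of how a platform might call a detection service and surface a warning. The endpoint URL, JSON field names, and threshold below are invented for illustration and are not a documented API.

    # Hypothetical integration sketch; the URL, fields, and threshold are made up.
    import requests

    DETECTOR_URL = "https://detector.example.com/v1/analyze"   # placeholder URL

    def should_warn(media_url: str, threshold: float = 0.8) -> bool:
        """Ask a detection service to score a clip and decide whether to warn."""
        resp = requests.post(DETECTOR_URL, json={"media_url": media_url}, timeout=30)
        resp.raise_for_status()
        score = resp.json()["manipulation_score"]  # assumed response field
        return score >= threshold                  # True -> show a warning banner

    if __name__ == "__main__":
        if should_warn("https://example.com/clip.mp4"):
            print("Warning: this content appears to be AI-generated or manipulated.")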

What's new in version Varies with device

Last updated on 30/09/2019

Minor bug fixes and improvements. Install or update to the newest version to check it out!


Additional app information

Latest version

Varies with device

Requires Android

Varies with device


Deepfake Detection screenshots
