Abstract

The rapid spread of generative AI has revolutionized media production, creating new challenges for information integrity as convincing deepfakes proliferate. Journalists play a critical role in upholding credible public information, yet existing deepfake detection technologies often overlook their specific workflows and requirements. This dissertation addresses these needs by identifying what journalists require from detection tools and evaluating usability in realistic scenarios through the following works:

Journalists' Needs and Tool Design: Through qualitative user studies, we uncover journalists' preferences for tools that provide transparent, explainable evidence and context-aware analysis integrated into news verification routines. These insights drive the design of DeFake, a detection tool that integrates intra-frame, inter-frame, audio manipulation, and speaker identity analysis to support nuanced decision-making.

Scenario-Based Evaluation: We present Dungeons & Deepfakes (D&DF), an innovative scenario-based role-play methodology for testing detection tools with journalists in the United States and Bangladesh. The evaluation demonstrates how time pressure, story impact, and explicit suspicions shape tool usage and reveals the prevalence of automation and confirmation biases.

Simulation-Based Learning for Critical Media Verification: Building on insights from earlier phases, we design and examine a simulation-based learning environment that leverages AI-generated scenarios to support critical verification practice. The platform extends the D&DF framework into an interactive educational prototype, enabling users to engage in structured verification simulations. A small pilot study evaluates the system's feasibility, usability, and pedagogical potential, showing that the approach is workable and offers promising directions while highlighting necessary refinements for future educational deployment.

Collectively, these contributions advance practical solutions for media verification and empower journalists to safeguard information integrity within an AI-driven landscape.

Publication Date

12-2025

Document Type

Dissertation

Student Type

Graduate

Degree Name

Computing and Information Sciences (Ph.D.)

Department, Program, or Center

Department of Computing and Information Sciences (Ph.D.)

College

Golisano College of Computing and Information Sciences

Advisor

Matthew Wright

Advisor/Committee Member

Roshan Peiris

Advisor/Committee Member

Andrea Hickerson

Campus

RIT – Main Campus
