PROPOSAL ABSTRACT
Prepared for:
Broad Agency Announcement
Media Forensics (MediFor)
Information Innovation Office
DARPA-BAA-15-58
Prepared by:
Viterbi School of Engineering
University of Southern California
Daniel P. Burns
USC Institute for Creative Technologies
12015 Waterfront Drive
Playa Vista, CA 90094-2536
Phone: 831.915.1212
Email:
7 October 2015
Introduction
Both the manipulation of images and the automated detection of manipulated imagery are at once enabled by digital computing and constrained by that same technology. There is little confidence, let alone assurance, that incrementally improved algorithms and enhanced conventional systems will replace today's slow, unverified manual analysis of imagery for possible manipulation with reliable and rapid machine analysis. Quantum computing offers a potential path to rapid and valid machine processing of images to detect manipulation. That capability is no longer in the realm of the future: for the last four years, the University of Southern California (USC) has successfully operated, and conducted effective research on, a Quantum Annealer (QA) manufactured by D-Wave. This facility can be used to evaluate new ways of assessing images for possible manipulation. Doing so will require new algorithms, new approaches, and continued improvement and validation of this emerging technology, and it is made possible by the presence of the optimal team on site at USC. While QA capabilities may be useful at every stage of this process, pursuing their use in Integrity Analytics Research and Development seems most propitious at this time.
Goals and Impact
The underlying goal of this project is to detect manipulations and thereby assist in measuring integrity. The first objective is to utilize work already in progress on image identification via QA to establish the differences between images that have been manipulated and those that are pristine. While some of this research may at first be empirical and not tightly directed at proving a stated thesis, a second objective is to use the early data to build a valid and verifiable theory-based foundation for future research and application. A third objective is to assess a range of different quantum analyses and evaluate their ability to differentiate an original image from its manipulated version. With that set of differences defined, the final objective is to conceive, test, and document ways in which a manipulated image can be identified as such without access to the original. Once that determination can be made and the likelihood of manipulation quantified, a score can be assigned to each image and appropriate steps taken in response. Exploratory evaluations of the use of QA for Physical and Semantic integrity will also be undertaken and, if indicated, pursued.
Technical Plan
The first effort will be devoted to evaluating the well-published algorithms on image manipulation detection (Farid, 2009) for areas in which QA can provide either qualitative improvements or compute-time reductions. Once those areas are identified, the next phase will involve programming the D-Wave computer to enhance or accelerate the selected processes. Before proceeding, a number of original images of varying content will be identified and characterized. They will then be manipulated in the ways known to be encountered in today's environment. The degree of manipulation will be quantified and assessed "manually" by showing the original and the manipulated image to a person who is unaware of the manipulation and asking that person to rate the extent of the differences on a Likert Scale. These photos will then be analyzed using the "converted" algorithms and any other approach conceived by the QA researchers. The analyses will consider a range of parameters for comparison and examine a number of data visualization techniques to determine the optimal way to discern critical differences.
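To make the D-Wave programming step concrete, the following sketch shows one plausible way to cast a block-level anomaly search as a QUBO (quadratic unconstrained binary optimization) problem using D-Wave's open-source dimod package. The variance-based anomaly score, the sparsity and coupling weights, and the synthetic image are illustrative assumptions only; the project's actual features will come from the "converted" algorithms selected in the first phase.

import numpy as np
import dimod

def block_scores(image, block=8):
    # Score each block by how far its variance departs from the image-wide mean
    # (an illustrative stand-in for a real forensic feature).
    h, w = image.shape
    scores, coords = [], []
    for r in range(0, h - block + 1, block):
        for c in range(0, w - block + 1, block):
            scores.append(image[r:r + block, c:c + block].var())
            coords.append((r // block, c // block))
    scores = np.asarray(scores)
    return (scores - scores.mean()) / (scores.std() + 1e-9), coords

def build_qubo(scores, coords, sparsity=0.5, coupling=0.25):
    # Selecting a block lowers the energy by its anomaly score minus a sparsity
    # penalty; adjacent selections earn a mild reward so low-energy states
    # favour contiguous suspect regions.
    Q = {}
    for i, s in enumerate(scores):
        Q[(i, i)] = sparsity - float(s)
    for i, (ri, ci) in enumerate(coords):
        for j in range(i + 1, len(coords)):
            rj, cj = coords[j]
            if abs(ri - rj) + abs(ci - cj) == 1:
                Q[(i, j)] = -coupling
    return Q

rng = np.random.default_rng(0)
img = rng.normal(size=(32, 32))
img[8:16, 8:16] *= 3.0  # synthetic stand-in for a manipulated patch
scores, coords = block_scores(img)
bqm = dimod.BinaryQuadraticModel.from_qubo(build_qubo(scores, coords))
best = dimod.ExactSolver().sample(bqm).first  # on hardware: EmbeddingComposite(DWaveSampler())
suspect = [coords[i] for i, v in best.sample.items() if v == 1]
print("suspect blocks:", suspect)

Because a quantum annealer natively minimizes exactly this kind of quadratic binary objective, any detection step that can be phrased this way is a candidate for QA acceleration.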
Finally, the team will use the QA capabilities once again to examine the differences between original and manipulated images and identify those differences that are "universal," i.e., markers that reveal manipulation even when the original is not available for comparison. The quantification of this parameter, or series of parameters, will then serve as the manipulation score of subject images. These scores will then be compared to the "manual" Likert Scale scores to investigate correlations between human difference detection and machine-automated single-photo analysis via QA.
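As an illustration of the planned human-machine comparison, the short sketch below computes linear and rank correlations between hypothetical QA-derived manipulation scores and hypothetical mean Likert ratings; both data vectors are placeholders, not project results.

import numpy as np
from scipy import stats

# Placeholder data: QA-derived manipulation scores and mean Likert ratings
# (1 = "no visible difference", 5 = "extensive differences") for ten images.
qa_scores = np.array([0.12, 0.85, 0.40, 0.05, 0.91, 0.33, 0.76, 0.22, 0.67, 0.50])
likert_means = np.array([1.2, 4.6, 2.8, 1.0, 4.9, 2.1, 4.0, 1.8, 3.5, 3.0])

pearson_r, pearson_p = stats.pearsonr(qa_scores, likert_means)
spearman_r, spearman_p = stats.spearmanr(qa_scores, likert_means)
print(f"Pearson r = {pearson_r:.2f} (p = {pearson_p:.3f})")
print(f"Spearman rho = {spearman_r:.2f} (p = {spearman_p:.3f})")

Spearman's rank correlation is included because Likert responses are ordinal, so a monotone rather than strictly linear relationship is the more defensible hypothesis.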
Evaluation and Data Requirements
The team will incorporate the NIST Evaluation Plan into its procedures and schedules. Additional data requirements are not known at this time but are expected to be trivial, if present at all. The team anticipates that NIST will vary integrity indicators in accordance with the different methods of editing and/or manipulating images. The QA-enabled system will be designed to report any finding of manipulation and to quantify the degree of certainty of that finding. If possible, the type of manipulation and the method of its application will also be identified. Most importantly, the area of suspected manipulation will be indicated through a visualization of the image with the suspect area highlighted. The methodology that successfully identified the manipulation will be reported as well.
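As a minimal sketch of that reporting output, assuming the flagged block coordinates and a per-image certainty value produced upstream (both placeholders here), the suspect region could be highlighted with matplotlib as follows.

import numpy as np
import matplotlib.pyplot as plt

def highlight_blocks(image, suspect_blocks, block=8, certainty=None):
    # Overlay a semi-transparent mask on each suspect block and title the
    # figure with the (placeholder) certainty of manipulation.
    mask = np.zeros(image.shape)
    for br, bc in suspect_blocks:
        mask[br * block:(br + 1) * block, bc * block:(bc + 1) * block] = 1.0
    plt.imshow(image, cmap="gray")
    plt.imshow(np.ma.masked_where(mask == 0, mask), cmap="autumn", alpha=0.5)
    if certainty is not None:
        plt.title(f"Suspected manipulation (certainty {certainty:.0%})")
    plt.axis("off")
    plt.savefig("manipulation_report.png", dpi=150)

# Example call with placeholder values.
rng = np.random.default_rng(0)
img = rng.normal(size=(32, 32))
highlight_blocks(img, suspect_blocks=[(1, 1)], certainty=0.87)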
Statement of Work
Task 1 Algorithm Evaluation for QA Enhancement
Task 2 Implement QA programs for selected algorithms
Task 3 Develop or obtain a series of original and manipulated images in varying formats
Task 4 Assess degree of manipulation via Likert Scale reporting of observed differences
Task 5 Conduct comparison runs using the QA programs, varying observed parameters
Task 6 Using resultant data, analyze the differences in original and manipulated images
Task 7 Identify characteristics common to all manipulated images or list of suspect data
Task 8 Quantify manipulation markers ascertained by this process
Task 9 Compare derived scores with manually observed differences
Task 10 Develop and test hypotheses supporting observations
Task 11 Participate in NIST testing
Task 12 Record and Report on all of the above
Team Capabilities
References
Farid, H. (2008). Digital Image Forensics. Scientific American, 298(6), 66-71.
Farid, H. (2009). A Survey of Image Forgery Detection. IEEE Signal Processing Magazine, 26(2), 16-25.