Fortify
User Guide
The only warranties for HP products and services are set forth in the express warranty statements
accompanying such products and services. Nothing herein should be construed as constituting an additional
warranty. HP shall not be liable for technical or editorial errors or omissions contained herein.
The information contained herein is subject to change without notice.
Confidential computer software. Valid license from HP required for possession, use, or copying. Consistent
with FAR 12.211 and 12.212, Commercial Computer Software, Computer Software Documentation, and
Technical Data for Commercial Items are licensed to the U.S. Government under vendor's standard
commercial license.
Copyright Notice
Documentation Updates
The title page of this document contains the following identifying information:
• Software Version number
• Document Release Date, which changes each time the document is updated
• Software Release Date, which indicates the release date of this version of the software
To check for recent updates or to verify that you are using the most recent edition of a document, go to:
http://h20230.www2.hp.com/selfsolve/manuals
This site requires that you register for an HP Passport and sign in. To register for an HP Passport ID, go to:
http://h20229.www2.hp.com/passport-registration.html
You will also receive updated or new editions if you subscribe to the appropriate product support service.
Contact your HP sales representative for details.
Contents iii
About Folders . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
About Grouping Issues . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
About the Source Code Panel . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
About Displayed Source Code. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
About the Project Summary Panel . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
About the Functions View . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
About the Issue Auditing Panel . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
About the Summary Tab. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
About the SecurityScope Details Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
About the Recommendations Tab. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
About the History Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
About the Diagram Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
About the Filters Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
About the Analysis Evidence Panel . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42
Customizing the Auditing Interface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
About Searching Issues . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
About Search Modifiers. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
Search Query Examples . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48
Performing Simple Searches . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49
Performing Advanced Searches . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 50
Adding Custom Tags: For FPRs Not Uploaded to Software Security Center . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52
Synchronizing Custom Tags—For an FPR Not on Software Security Center . . . . . . . . . . . . . . . . . . . . . . . . 54
Committing Custom Tags—For an FPR on Software Security Center . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 54
Synchronizing Custom Tags—For an FPR on Software Security Center . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55
Committing Filter Sets and Folders . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55
Synchronizing Filter Sets and Folders. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56
About Project Template Selection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56
About Sharing Project Templates with Software Security Center . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
Downloading Project Templates from Software Security Center . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
Uploading Project Templates to Software Security Center . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
About Working with Audit Data. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59
Opening Projects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59
About Merging Audit Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59
About the Event Bridge Utility . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60
About Additional Metadata . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60
Uploading Results to Software Security Center . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60
About Advanced Configuration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 62
Using the Audit Guide in Advanced Mode . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 62
About Integrating with Bug Tracking Systems. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 62
About Third-Party Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
About Public APIs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
About Penetration Test Schema . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
Chapter 4: Auditing Analysis Results. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64
Evaluating Issues . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64
Performing Quick Audits . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64
Creating Issues for Undetected Vulnerabilities . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 66
Suppressing Issues . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 66
Submitting Issues as Bugs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 67
About Hotspot Ranking of Unaudited Issues. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 68
Listing Issues by Hotspot Ranking . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 68
About Correlation Justification. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
Using Correlation Justification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
Chapter 5: Audit Workbench Reports . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 72
About Default Report Templates . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 72
About the Fortify Security Report. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 72
About the Fortify Developer Workbook Report . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 74
About the OWASP Top Ten 2004, 2007, 2010 Reports. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 75
About the Fortify Scan Summary Report . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 76
About Modifying Report Templates . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 77
Viewing Report Templates . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 77
Selecting Report Sections. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 78
About Editing Report Subsections . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 78
Saving Modified Report Template Settings . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 81
About Report Template XML Files . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 82
Generating Reports . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 85
Chapter 6: Functions View . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87
About the Functions View . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87
Opening the Functions View . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87
Sorting and Viewing Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 88
Locating Functions in Source Code. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 88
Locating Classes in Source Code . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 88
Determining Which Rules Matched a Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89
Writing Rules for Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89
Creating Custom Cleanse Rules . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 90
Chapter 7: Troubleshooting . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 91
Using the Debug Option. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 91
Addressing the org.eclipse.swt.SWTError Error . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 91
About Out of Memory Errors. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 92
Allocating More Memory for Static Code Analyzer. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 92
Determining the Amount of Memory Used by External Processes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 92
Resetting the Default Views. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 93
Appendix A: Sample Files . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 94
Basic Samples . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 94
Advanced Samples. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 95
Appendix B: Static Analysis Results Prioritization. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 97
About Results Prioritization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 97
Quantifying Risk. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 97
Inside a Static Analysis Engine . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 98
Estimating Impact and Likelihood with Input from Rules and Analysis. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 99
Preface
About Contacting HP Fortify Software
If you have questions or comments about any part of this guide, contact HP Fortify using the information
provided in the following sections.
Technical Support
650.735.2215
fortifytechsupport@hp.com
Corporate Headquarters
Moffett Towers
1140 Enterprise Way
Sunnyvale, CA 94089
650.358.5600
contact@fortify.com
Website
http://www.hpenterprisesecurity.com
About HP Fortify Assistive Technologies
In accordance with Section 508 of the U.S. Rehabilitation Act, HP Fortify Software Security Center, HP Fortify
Audit Workbench, HP Fortify Plug-in for Eclipse, and HP Fortify Package for Microsoft Visual Studio have
been engineered to work with the JAWS screen reading software package from Freedom Scientific. JAWS
provides text-to-speech support for use by the visually impaired. With JAWS, labels, text boxes, and other
textual components can be read aloud, providing greater access to the information therein.
To generate text-to-speech translations in an HP Fortify product’s graphical user interface, you use standard
JAWS commands. The following table lists keyboard combinations that can help you use JAWS with HP Fortify
products. For more information on using JAWS, consult the JAWS documentation.
Read values in combo boxes Press CTRL + DOWN ARROW key or press ENTER to enable Form mode.
Tab through multi-line text boxes Press CTRL + TAB to move from one multiline text box to the next.
Read multi-line labels Press INSERT + DOWN ARROW to read all lines in label.
Read disabled check boxes Press ESC to switch Forms mode to Virtual Cursor mode.
Change Log
The following table lists changes made to the HP Fortify Audit Workbench User Guide.
Software Release-Version    Date        Change
4.00-01 06/04/13 Removed the section About the Interface from Chapter 1
Chapter 1: Getting Started with
HP Fortify Audit Workbench
The following topics provide an overview of Audit Workbench, instructions on how to start the tool, and
instructions on how to upgrade the Static Code Analyzer suite (SCA, Audit Workbench, and any plug-ins or
packages you have installed) as new versions of the installer become available.
About Upgrades
You can check on the availability of new Static Code Analyzer suite (including Audit Workbench) versions
from the Audit Workbench user interface. If a version newer than the one you have installed is available, you
can download it and upgrade your instance.
You can also configure Audit Workbench to check for, download, and install new versions automatically at
startup. Whether you upgrade your Static Code Analyzer and Apps manually or automatically, your data are
preserved.
To upgrade Static Code Analyzer and Apps from Audit Workbench, a Software Security Center administrator
must first set up the auto upgrade capability on the server host. The following topics address how to set up
auto upgrades (as a Software Security Center administrator) for Audit Workbench and how you can perform the
upgrades from Audit Workbench.
For information about the system requirements for using the auto upgrade feature, see the HP Fortify Software
Security Center System Requirements document.
Enabling HP Fortify Static Code Analyzer Suite Updates from HP Fortify Audit
Workbench
To make a new Static Code Analyzer Suite installer available to Audit Workbench users for upgrades:
1. On the Software Security Center host, navigate to the <SSC_Install>/Core/support/tomcatForSSC/
webapps/ssc/WEB-INF/internal directory and open the securityContext.xml file in a text editor.
2. Locate, and then uncomment the following line:
<!-- <security:intercept-url pattern="/update-site/**" access="PERM_ANONYMOUS"/> -->
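With the comment delimiters removed, the line reads:

```
<security:intercept-url pattern="/update-site/**" access="PERM_ANONYMOUS"/>
```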
3. Save and close the securityContext.xml file.
4. Navigate to the <SSC_Install>/Core/support/tomcatForSSC/webapps/ssc/update-site/
installers directory.
5. Open and read the readme.txt file.
6. Copy the sample update.xml file content (between and including the <installerInformation> and
</installerInformation> tags), and then paste the copied sample text into a new text file.
7. Name the new file “update.xml” and save it to the <SSC_Install>/Core/support/tomcatForSSC/
webapps/ssc/update-site/installers directory.
8. Any time a new Static Code Analyzer Suite installer file
(HP_Fortify_SCA_and_Apps_<version>_<OS>.exe) becomes available, place it in the <SSC_Install>/
Core/support/tomcatForSSC/webapps/ssc/update-site/installers directory.
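The file-placement steps above (4 through 8) can be sketched as shell commands. In this sketch, SSC_INSTALL stands in for <SSC_Install>, the update.xml file is created empty as a placeholder for the sample content from readme.txt, and the installer filename is a hypothetical example of the pattern named in step 8:

```shell
# Hedged sketch of steps 4-8; SSC_INSTALL stands in for <SSC_Install>.
SSC_INSTALL=${SSC_INSTALL:-./ssc-demo}
INSTALLERS="$SSC_INSTALL/Core/support/tomcatForSSC/webapps/ssc/update-site/installers"

mkdir -p "$INSTALLERS"

# Step 7: save the update.xml descriptor to the installers directory
# (its real content comes from the sample in readme.txt; empty placeholder here).
: > "$INSTALLERS/update.xml"

# Step 8: place each new installer file in the same directory as it becomes
# available (the filename below is a hypothetical stand-in).
: > "HP_Fortify_SCA_and_Apps_4.00_windows_x64.exe"
cp "HP_Fortify_SCA_and_Apps_4.00_windows_x64.exe" "$INSTALLERS/"

ls "$INSTALLERS"
```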
Note: HP Fortify Audit Workbench comes with several code samples to use to help you learn to use the tool.
For information about these samples, see Sample Files on page 94.
To validate a modified or new mapping, use the externalmetadata.xsd file, which is located in the
<SCA_and_Apps_Install>\Core\config\schemas directory. HP recommends that, after you change your
mapping document, you open the FPR file in the plug-in to see how the mapping works with the scan results.
If you change the external metadata document or create a new mapping document, be sure to make the same
changes on Software Security Center.
Note: When you update security content, any changes made locally to the Secure Coding Rulepacks and
external metadata are overwritten.
The following topics provide information about how to update HP Fortify Security Content (Security Content)
and manage security settings.
5. Select the settings for the types of issues you want to display in the results, and then click Run Scan.
Static Code Analyzer analyzes the source code. If Static Code Analyzer encounters any problems as it scans
the source code, it displays a Warning dialog box.
6. Click OK.
After the scan is completed, Audit Workbench displays the analysis results.
The Static Code Analyzer scan process includes the following stages:
• During the clean phase, Static Code Analyzer removes files from previous translation of the project.
• During the translation stage, Static Code Analyzer translates the source code identified in the
previous screen into an intermediate format that is associated with a build ID. The build ID is
typically the project name.
• During the scan stage, Static Code Analyzer scans source files identified during the translation phase
and generates analysis results, in the HP Fortify Project (FPR) format.
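On the command line, the same three stages are typically driven with the sourceanalyzer tool. The sketch below only prints the commands (MyBuild and src/Main.java are hypothetical placeholders), so it is illustrative rather than an actual scan:

```shell
# Hedged sketch: the three scan stages as sourceanalyzer command lines.
# MyBuild and src/Main.java are hypothetical placeholders; the commands are
# printed rather than executed so the sketch does not require SCA installed.
OUT='sourceanalyzer -b MyBuild -clean
sourceanalyzer -b MyBuild src/Main.java
sourceanalyzer -b MyBuild -scan -f results.fpr'
printf '%s\n' "$OUT"
```

Here -b names the build ID, -clean removes previous translation artifacts, and -scan with -f writes the analysis results to an FPR file, matching the clean, translation, and scan stages described above.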
b. In the Fortify Secure Coding Rulepacks list, clear the check boxes that correspond to any Rulepacks you
want to disable during the scan.
c. To add a custom Rulepack, click Add Custom Rulepack, and then browse to and select the Rulepack file.
d. Click OK.
15. Select your scan settings, and then click Run Scan.
Static Code Analyzer starts the scan and displays progress information throughout the process. If Static Code
Analyzer encounters any problems scanning the source code, it displays a warning.
After the scan is completed, Audit Workbench loads the audit project and displays the analysis results.
The auditing interface consists of the following sections, which are numbered in the screen capture:
1. Issues panel
2. Source code panel
3. Functions panel
4. Issue auditing panel
5. Analysis Evidence panel
These sections are described in the following topics.
About Folders
Audit Workbench organizes scan results for a project into folders, which contain logically defined sets of
issues. Audit Workbench displays the folders as color-coded tabs in the issues panel (top left panel). The
number of issues that each folder contains is displayed at the top of each tab.
Audit Workbench comes with five folders. The filter set you select (Filter Set list) determines which folders are
visible in the issues panel. The following folders are visible while the default Security Auditor View filter set is
selected:
• The Critical folder contains issues that have a high impact and a high likelihood of occurring. Issues at this
risk level are easy to discover and to exploit, and represent the highest security risk to a program.
Remediate critical issues immediately.
You can create your own folders as you need them. For example, you might group all hot issues for a project
into a Hot folder and group all warning issues for the same project into a Warning folder. For instructions on
how to create your own folders, see Creating Folders.
Creating Folders
You can create your own folders as you need them. To display issues in a new folder, create a folder filter that
targets the new folder.
To create a new folder:
1. Select Tools → Project Configuration.
The Project Configuration dialog box opens.
2. Click the Folders tab.
The panel on the left displays existing folders. Fields on the right show the filter set, name, color, and
description of the selected folder.
3. From the Folder for Filter Set list, select a filter set to enable a folder that is displayed only in that
filter set. The Folder for Filter Set selection filters the folders displayed in the folder list.
4. To add a folder:
a. Click the plus icon next to Folders.
The Create New Folder dialog box opens.
b. Type a unique name for the folder, select a folder color, and then click OK.
Audit Workbench displays the folder name at the bottom of the folder list.
5. In the Description box, type a description of the issues the folder is designed to list.
Renaming Folders
If you rename a folder, the name change is global and is reflected in all filter sets.
To rename a folder:
1. Select Tools → Project Configuration.
The Project Configuration dialog box opens.
2. Click the Folders tab.
3. From the Folders for Filter Set list, select a filter set that displays the folder you want to rename.
4. In the Folders panel, select the folder to rename.
5. In the Name box, select the existing folder name, and then type a new name.
In the Folders panel, the folder name changes as you type.
6. Click OK.
The new folder name displays on the tabs.
Element Description
Issue Displays the issue location, including the filename and line number
Custom Tags area Displays lists with values that the auditor can add to the issue as attributes
For example, valid values for the Analysis tag are as follows:
• Not an issue
• Reliability issue
• Bad practice
• Suspicious
• Exploitable
Comments Appends additional information about the issue to the comment field
Rule Information Shows descriptive information such as issue category and kingdom
Abstract/Custom Abstract    Summary description of an issue, including any custom abstracts defined for your organization
The SecurityScope Details tab shows the following information about runtime issues found by PTA or Runtime
Application Protection:
• Arguments—shows the Index, Value, and Return Value
• Request—shows the HTTP Request information, including the Method URL, content-length, accept-encoding,
referer, connection, accept-language, host, accept-charset, user-agent, content-type, cookie, accept, and
keep-alive
• Stack Trace—shows the order of methods called during execution and line number information. Blue code
links are clickable and are displayed only for code scanned by SCA
Element Description
Recommendations/Custom Recommendations    Provides recommendations for this type of issue, including examples, as well as custom recommendations defined by your organization
Tips/Custom Tips    Provides tips for this type of issue, including any custom tips defined by your organization
The Diagram tab displays information relevant to the rule type. Execution order is represented along the
vertical axis.
For dataflow issues, the trace starts at the top with the first function to call the taint source, then traces the
calls to the source (blue node), and ends the trace at the sink (red node). In the diagram, the source (src) and
sink nodes are also labeled. A red “X” on a vertical axis indicates that the function called finished executing.
The horizontal axis shows the call depth. A line shows the direction that control is passed. If control passes
with tainted data traveling through a variable, the line is red; if control passes without tainted data, the
line is black.
The icons used for the expression type of each node in the diagram are the same icons as those used in the
Analysis Evidence panel. To view the icons and the descriptions, see About the Analysis Evidence Panel on page
42.
Option Description
Filters Displays a list of the visibility and folder filters configured in the selected filter set.
• Visibility filters show or hide issues
• Folder filters sort the issues into the folders in the analysis results panel
Right-click a filter to show issues that match the filter or to enable, disable, copy, or delete it.
Then    Indicates the filter type, where hide is a visibility filter and folder is a folder filter
Note: This option is visible when you create a new filter or edit an existing filter. In this case,
a dialog box displays the Then section.
Table 5 lists the icons used in the Analysis Evidence panel to show how data flow moves in the source code.
Icon Description
• Information is read from a source external to the code (HTML form, URL, and so on)
• Comparison is made
• Passthrough: tainted data passes from one parameter to another in a function call
• A pointer is created
• A pointer is dereferenced
• Execution jumps
• Generic
The Analysis Evidence panel can contain inductions, which provide supporting evidence for their parent nodes.
Inductions consist of a text node, displayed in italics as a child of the trace node, and an induction trace, which
is displayed as a child of the text node. A box surrounds the induction trace. The italics and the box distinguish
the induction from a standard subtrace.
Preference Description
Show Removed Issues    Shows all issues that were uncovered in the previous analysis but are no longer evident in the new SCA analysis results. When multiple scans are run on a project over time, vulnerabilities are often remediated or become obsolete. Static Code Analyzer marks these vulnerabilities as Removed Issues.
Collapse Issues    Shows similar issues, based on certain attributes, under a shared parent node in the Issues tree
Use Short File Names    References the issues in the Issues view by file name only, instead of by relative path (enabled by default)
Show Category of Issue    Shows the category to which each issue belongs after the file name and line number
Show Only My Issues    Displays only the issues assigned to you
Show Abstract in Issue Summary Shows the abstract information on the right of the Summary tab
Show Comments in Issue Summary Shows comments in the center of the Summary tab
Show ‘All’ Folder in Issue Summary Graph    Shows another bar in the chart on the Project Summary tab
Right justify ‘All’ folder Displays the All folder with its contents listed on the right
Display name in folder tabs Displays the name text in the folder tabs
Include Comments in History view Shows the history items for comments on the History tab
Attribute Used for Quick Audit Action From this list, you can select a custom tag for quick audit
actions. This enables you to assign custom tag values to issues.
For information about quick audits, see Performing Quick Audits
on page 64.
5. To specify your interface preferences, select or clear the preference check boxes.
Note: To restore the default settings at any time, click Reset Interface.
6. To save your preferences, click OK.
Search terms can be further qualified with modifiers. For more information, see About Search Modifiers on
page 46. The basic syntax for using a modifier is modifier:<search_term>.
A search string can contain multiple modifiers and search terms. If you specify more than one modifier, the
search returns only issues that match all the modified search terms. For example,
file:ApplicationContext.java category:SQL Injection returns only SQL injection issues found in
ApplicationContext.java.
If you use the same modifier more than once in a search string, then the search terms qualified by those
modifiers are treated as an OR comparison. So, for example, file:ApplicationContext.java
category:SQL Injection category:Cross-Site Scripting returns SQL injection issues and cross-site
scripting issues found in ApplicationContext.java.
For complex searches, you can also insert the AND or the OR keyword between your search queries. Note that
AND and OR operations have the same priority in searches.
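To illustrate these rules, the following sketches combine modifiers; the file names are hypothetical, and only modifiers documented in the table below are used:

```
file:Order.java category:SQL Injection
    (different modifiers: returns only SQL injection issues found in Order.java)

file:Order.java file:Cart.java
    (the same modifier used twice: returns issues found in Order.java OR Cart.java)

category:SQL Injection OR category:Cross-Site Scripting
    (an explicit OR keyword between two queries)
```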
Modifier Description
[issue age] Searches for the issue age, which is either removed, existing, or new
<custom_tagname> Searches the specified custom tag. Note that tag names that contain spaces must
be delimited by square brackets.
Example: [my tag]:value
analysis Searches for issues that have the specified audit analysis value (such as
“exploitable,” “not an issue,” and so on)
analyzer Searches the issues for the specified analyzer
audience Searches for issues by intended audience. Valid values are “targeted,”
“medium,” and “broad”
audited Searches for issues by audit status: true if the Primary Custom Tag is set,
and false if it is not set
dynamic Searches for issues that have the specified dynamic hot spot ranking value
file Searches for issues where the primary location or sink node function call occurs
in the specified file.
[fortify priority order] Searches for issues that have a priority level that matches the specified
priority determined by the HP Fortify analyzers. Valid values are critical, high,
medium, and low, based on the expected impact and likelihood of exploitation.
The impact value indicates the potential damage that might result if an issue is
successfully exploited. The likelihood value is a combination of confidence,
accuracy of the rule, and probability that the issue can be exploited.
Audit Workbench groups issues into folders based on the four priority values
(critical, high, medium, and low) by default.
historyuser Searches for issues that have audit data modified by the specified user
kingdom Searches for all issues in the specified kingdom
maxconf Searches for all issues that have a confidence value up to and including the
number specified as the search term
<metagroup_name> Searches the specified metagroup. Metagroups include [owasp top ten 2010],
[sans top 25 2010], [pci 2.1], and others. Square brackets delimit field names
that include spaces.
minconf Searches for all issues that have a confidence greater than or equal to the
specified value.
package Searches for issues where the primary location occurs in the specified package
or namespace. (For data flow issues, the primary location is the sink function.)
[primary context] Searches for issues where the primary location or sink node function call occurs
in the specified code context. Also see sink, [source context].
primaryrule (rule) Searches for all issues related to the specified sink rule
ruleid Searches for all issues reported by the specified rule IDs. These include the
rules used to generate the issue source, sink, and all passthroughs
sink Searches for issues that have the specified sink function name. Also see
[primary context]
source Searches for data flow issues that have the specified source function name. Also
see [source context]
[source context] Searches for data flow issues that have the source function call contained in the
specified code context
Also see source, [primary context].
sourcefile Searches for data flow issues with the source function call contained in the
specified file
Also see file.
status Searches issues that have the status reviewed, not reviewed, or under review
suppressed Searches for suppressed issues
taint Searches for issues that have the specified taint flag
trace Searches for issues that have the specified string in the data flow trace
tracenode Enables you to search on the nodes within an issue’s analysis trace. Each
tracenode search value is a concatenation of the tracenode’s file path, line
number, and additional information.
<no attribute> Searches for issues that have any of the most common attributes that match the
specified string
Alternatively, to select a search term you used previously, click the arrow in the search box, and then
select a search term from the list.
To get assistance in composing the comparison for your search string, do the following:
1. Click your cursor in the search box, and then press CTRL + SPACE.
2. From the displayed list, double-click an issue attribute to begin your search string.
3. To get assistance specifying the comparison, with your cursor placed after the modifier in the search box,
press CTRL + SPACE.
4. From the displayed list, double-click the comparison to add to your search string.
5. Finish typing the search term.
The issues panel lists all of the issues that match your search string.
Audit Workbench saves all of the search terms you enter for the current session. To select a search term you
used previously, click the arrow in the search box, and then select a search term. (After you quit Audit
Workbench, the saved search terms are discarded.)
Creating complex search strings can involve several steps. If you enter an invalid search string, the magnifying
glass icon in the text field changes to a warning icon to notify you of the error. Click the warning sign to view
information about the search term error.
The advanced search feature makes it easier to build complex search strings. For a description of this feature
and instructions on how to use it, see Performing Advanced Searches.
5. To add an AND query row, in the top right corner of the dialog box, click AND. To add an OR query
row, in the top right corner of the dialog box, click OR.
6. Add as many query rows as you need for the search.
7. To delete a row, click Delete to the right of the row. To remove all rows, click Clear.
8. Click Find.
Note: As you build your search string, the Advanced Search dialog box displays any errors in the status below
the search string builder. The Find button is not enabled unless all errors are resolved.
Consider the following examples:
• To search for all privacy violations in file names that contain jsp with getSSN() as a source, type the
following:
category:"privacy violation" source:getssn file:jsp
• To search for all file names that contain com/fortify/awb, type the following:
file:"com/fortify/awb"
• To search for all paths that contain traces with mydbcode.sqlcleanse as part of the name, type the
following:
trace:mydbcode.sqlcleanse
• To search for all paths that contain traces with cleanse as part of the name, type the following:
trace:cleanse
• To search for all issues that contain cleanse as part of any modifier, type the following:
cleanse
• To search for all suppressed vulnerabilities with asdf in the comments, type the following:
suppressed:true comments:asdf
To create a custom tag for results that have not been uploaded to Software Security Center:
1. Select Tools → Project Configuration.
The Project Configuration dialog box opens.
2. Click the Custom Tags tab.
3. At the top of the Tags panel, click the plus icon.
4. In the Enter Value dialog box, type a label for the new tag, and then click OK.
5. To enable users to add new values for the tag during an audit, leave the Extensible check box on the Custom
Tags tab selected. To enable only managers, security leads, and administrators to add new values for the
tag during an audit, clear the check box.
6. To specify a value for the custom tag:
a. At the top of the Values column, click the plus icon.
b. In the Enter Value dialog box, type a value for the new tag, and then click OK.
c. Repeat Step a and Step b for as many values as you need for the tag.
The value that you give your custom tag can be a discrete attribute for the particular issue this custom tag
addresses. For example, you may want to specify that this custom tag addresses a due date or server
quality issue.
7. (Optional) Enter descriptions of the custom tag and its values in the corresponding Description boxes.
8. (Optional) From the Default Value list, select the default value for the tag.
If you set a default value for the tag, then issues that do not have a value set for that tag have the default
value. If no default value is specified, then the tag value is set to Not Set.
3. Provide your Software Security Center username and password, and then click OK.
If the project template is not on Software Security Center, the Custom Tag Upload dialog box opens.
4. Do one of the following:
• To have Audit Workbench upload custom tags to the global pool on Software Security Center, click Yes.
• To prevent Audit Workbench from uploading custom tags to the global pool on Software Security
Center, click No.
The modified custom tags are also updated in the global pool.
Custom tags are not removed from the global pool. In Audit Workbench you are always looking at the custom
tags for a project. These are a subset of tags in the global pool. This ensures that custom tags not visible in
Audit Workbench can still be used for other projects.
Opening Projects
To open an FPR file:
1. Open Audit Workbench.
2. Select File → Open Project.
The Choose Project dialog box opens.
3. Browse to and select the FPR file, and then click Open.
If the FPR format is HP Fortify Static Code Analyzer version 4.5.1 or earlier, the Migration Wizard
automatically starts. Otherwise, the FPR displays in the Audit perspective. Migrate the issue IDs before you
migrate the audit data, as described in the HP Fortify Static Code Analyzer Migration Guide.
Important: If your custom bug tracker accesses supporting jar files, you must add them to the Class-Path
attribute in your bug tracker’s MANIFEST.MF file.
Evaluating Issues
To evaluate and assign auditing values to an issue or group of issues:
1. In the analysis results panel, select the issue or group of issues.
For information about the Analysis Evidence panel, see About the Analysis Evidence Panel on page 42.
2. Read the abstract on the Issue Summary tab, which provides high-level information about the issue, such as
which analyzer found the issue.
For example, “Command Injection (Input Validation and Representation, data flow)” indicates that this
issue, detected by the data flow analyzer, is a command injection issue in the Input Validation and
Representation kingdom.
3. Click the More Information link or the Issue Details tab to get more details about the issue.
4. On the Summary tab, assign values to the issue to represent your evaluation.
Default choices in the Analysis menu are:
• Not an issue
• Reliability issue
• Unknown
• Suspicious
• Exploitable
5. (Optional) In the Comments field, type a comment about the issue and your evaluation.
Suppressing Issues
As you assess successive scans of a project version, you might want to completely suppress some exposed
issues. In Audit Workbench, it is useful to mark an issue as suppressed if you are sure that the specific
vulnerability is not, and will never be, an issue of concern.
You might also want to suppress warnings for specific types of issues that might not be high priority or of
immediate concern. For example, you can suppress issues that are fixed, or issues that you plan not to fix.
To suppress an issue, do one of the following:
• In the issues panel, select the issue, and then, on the Summary tab in the Issue Auditing panel, click the
Suppress icon .
• Right-click the issue in the issues panel, and then select Suppress Issue.
Suppression marks the issue and all future discoveries of this issue as suppressed. As such, it is a
semi-permanent marking of a vulnerability.
To display issues that have been suppressed, from the menu bar, select Options → Show Suppressed Issues.
To unsuppress an issue, do one of the following:
• In the issues panel, select the suppressed issue, and then, on the Summary tab in the Issue Auditing panel,
click the Unsuppress icon .
• Right-click the issue in the issues panel, and then select Unsuppress Issue.
1. Select the issue in the issues panel, and then click the File Bug icon on the Summary tab.
If you are submitting a bug from Audit Workbench for the first time, the Configure Bugtracker Integration
dialog box opens.
2. Select the bug-tracking application, and then click OK.
The File Bug dialog box opens.
3. Review the issue description and change any values, if necessary.
4. Click Submit.
If your bug-tracking system requires you to log on, you must do so before you can file a bug through that user
interface. The issue is submitted as a bug in the bug-tracking application.
Unlikely Contains elements or characteristics also contained in audited issues marked Not an
Issue
Because you first selected a correlated issue, the View Correlations button is available.
3. Click View Correlations.
The Correlation Justification dialog box opens and displays the following three panels:
• The correlated issues tree on the left displays all correlated issues within a correlated group, sorted
based on analyzers.
• The relationship panel at the top right displays the correlation chain between issues. The chain
describes any indirect or direct relationships between the two selected issues.
• The panel at the bottom right describes each correlation rule in the correlation chain displayed in the
relationship panel.
4. To select two issues, press the CTRL key, and then click each issue.
5. To inspect the attributes that correlate these two issues, move your cursor to each link in the relationship
panel.
6. Click OK.
Use correlation justification to gain insight into code vulnerabilities and understand why certain issues are
correlated. This can help to reduce the time it takes to remediate the issues.
Section Subsection
Results Certification
Results certification summary. You can edit the text element
of this subsection.
Attack Surface
Attack surface summary. You can edit the text element of this
subsection.
Reference Elements
List of all libraries that SCA used during the translation phase
of analysis. You can edit the text element of this subsection.
Rulepacks
List of Rulepacks that SCA used in the analysis. You can edit the
text element of this subsection.
Properties
List of properties that SCA set during the analysis phase. You
can edit the text element of this subsection.
Scan Information
Scan information, including the SCA version, machine name,
and the name of the user who ran the scan. You can edit the
text element of this subsection.
Results Certification
Results certification information, including the results
certification summary and the details of the results
certification. You can edit the text element of this subsection.
Rulepacks
List of Rulepacks that SCA used during the analysis. You can
edit the text element of this subsection.
Properties
Lists the properties that SCA set during the analysis. You can
edit the text element of this subsection.
Commandline Arguments
Lists the arguments that the program passed to SCA during
analysis. You can edit the text element of this subsection.
Warnings
Lists the warnings that occurred during analysis. You can
edit the text element of this subsection.
3. To view detailed information about what a section contains, click the listed section title.
On the right, the Generate Report dialog box displays the subsection headings and descriptions.
Variable Description
$PROJECT_LABEL$ The build label that Audit Workbench displays when the FPR file that
SCA generated was built with the -build-project option
$PROJECT_NAME$ Build ID
$PROPERTIES$ Complete list of properties set during analysis phase (same format as
project summary)
$RESULTS_CERTIFICATION$ Complete certification detail with listing of validity on a per file basis
(see project summary)
$RESULTS_CERTIFICATION_SUMMARY$ Short sentence describing certification (same format as project
summary)
$RULEPACKS$ Complete list of Rulepacks used during analysis (same format as
project summary)
$RUN_INFO$ Content from the Project Summary Runtime Information tab
$SCAN_COMPUTER_ID$ Hostname of machine on which scan was performed
$SCAN_DATE$ Date of analysis with the default formatting style for the locale
$SCAN_SUMMARY$ Summary of codebase scanned in the format # files, # lines of
code
The following XML is the Results Outline section of the HP Fortify Security Report:
<ReportSection enabled="false" optionalSubsections="true">
<Title>Results Outline</Title>
<SubSection enabled="true">
<Title>Overall number of results</Title>
<Description>Results count</Description>
<Text>The scan found $TOTAL_FINDINGS$ issues.</Text>
</SubSection>
<SubSection enabled="true">
<Title>Vulnerability Examples by Category</Title>
<Description>Results summary of the highest severity issues.
Vulnerability examples are provided by category.</Description>
<IssueListing limit="1" listing="true">
<Refinement>severity:(3.0,5.0] confidence:[4.0,5.0]</Refinement>
<Chart chartType="list">
<Axis>Category</Axis>
</Chart>
</IssueListing>
</SubSection>
</ReportSection>
In this example, the Results Outline section contains two subsections. The first subsection is a text
subsection named Overall number of results. The second subsection is a results list named
Vulnerability Examples by Category. A section can contain any combination of subsections as its
contents.
Audit Workbench displays the Functions view in the top right panel.
3. To view coverage information about top-level (global) functions, expand the Top-level functions node.
The following icons indicate the rule coverage status of the function:
• A solid red square indicates that the function is not identified by any rules.
• A blue triangle indicates that the function is in a package covered by the Secure Coding Rulepacks.
• A green circle indicates that the function matches a rule defined in either the Secure Coding Rulepacks or a
custom Rulepack. (Matching a rule may not result in a reported issue.)
1. In the Functions view, right-click a class, and then select Find Usages on the shortcut menu.
The Search view (at center bottom) lists the file locations and line numbers in which the class is used.
Chapter 7: Troubleshooting 91
About Out of Memory Errors
The following two scenarios might trigger out-of-memory errors in Audit Workbench:
Scenario 1: An uploaded FPR file is too large
To solve this problem, increase the amount of memory allocated for running Audit Workbench.
To increase the memory allocated for Audit Workbench, set the environment variable AWB_VM_OPTS to
allocate memory for Audit Workbench. (For example, set AWB_VM_OPTS=-Xmx700M to allocate 700 MB to Audit
Workbench.)
Note: If you choose to set AWB_VM_OPTS, do not allocate more memory than is physically available.
Overallocation degrades performance.
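On a Unix-like system, the same variable can be set in the shell before launching Audit Workbench; the following is a minimal sketch, and the 1300 MB figure is an arbitrary illustration, not a recommendation:

```shell
# Allocate a larger maximum JVM heap to Audit Workbench for this shell session.
# -Xmx accepts the usual size suffixes (M for megabytes, G for gigabytes).
export AWB_VM_OPTS=-Xmx1300M

# Verify the value before starting Audit Workbench.
echo "$AWB_VM_OPTS"
```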
Scenario 2: Running an SCA scan through the advanced scan wizard
In this scenario, the instance of SCA executed during the scan is running out of memory. To resolve this
problem, increase the amount of memory allocated to SCA.
Determining the Amount of Memory Used by External
Processes
You can use the com.fortify.model.ExecMemorySetting setting in the fortify.properties file to
determine how much memory external processes such as iidmigrator or the Event Bridge use. The default
setting is as follows:
com.fortify.model.ExecMemorySetting = 600
The value for this setting, which is expressed in MB, is translated into maximum heap size. In this case, 600
equates to -Xmx600M.
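For example, to raise the limit to 1000 MB (an arbitrary illustrative value, not a recommendation), the line in fortify.properties would read:

```
# Allow external processes up to 1000 MB of heap (equivalent to -Xmx1000M)
com.fortify.model.ExecMemorySetting = 1000
```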
Appendix A: Sample Files
Your HP Fortify software installation includes a number of sample files that you can use when testing or
learning to use SCA. The sample files are located in the following directory:
<HP_Fortify_Install>/Samples
The Samples directory contains two subdirectories: basic and advanced. Each code sample includes a
README.txt file that provides instructions on how to scan the code in SCA and view the output in Audit
Workbench.
The basic subdirectory includes an assortment of simple language-specific samples. The advanced
subdirectory contains more advanced code samples, including samples that enable you to integrate SCA with
your bug-tracking system.
Basic Samples
Table 13 lists the sample files in the basic subdirectory, a short description of each file, and the vulnerabilities
identified. Each sample includes a README.txt file that provides details and instructions on its use.
javascript Includes the sample.js JavaScript file. Vulnerabilities: Cross-Site Scripting (XSS),
Open Redirect
Advanced Samples
Table 14 provides a list of the sample files in the advanced subdirectory
(<HP_Fortify_Install>\Samples\advanced). Each sample includes a README.txt file that provides
further details and instructions on its use.
Bugzilla Includes a Build.xml file built using the Audit Workbench bugtracker plugin
framework. The plugin includes the same functionality as the built-in Bugzilla
plugin so that it can be used as a guide to creating your own plugin.
You need to have Microsoft Visual Studio Visual C/C++ 2005 (or newer) installed.
You should also have the Fortify Analyzers installed, with the plugin for the Visual
Studio version you are using.
The code includes a Command Injection issue and an Unchecked Return Value
issue.
configuration This is a sample J2EE application that has vulnerabilities in its web module
deployment descriptor, web.xml.
csharp This is a simple C# program that has SQL injection vulnerabilities. Versions are
included for VS2003, VS2005, and VS2010. Upon successful completion of the
scan, you should see the SQL Injection vulnerabilities and one Unreleased
Resource vulnerability. Other categories might also be present, depending on the
Rulepacks used in the scan.
customrules Several simple source code samples and Rulepack files that illustrate rules
interpreted by four different analyzers: semantic, data flow, control flow, and
configuration. This directory also includes several miscellaneous real-world rules
samples that may be used for scanning real applications.
findbugs A sample that demonstrates how to run the FindBugs static analysis tool together
with the Fortify Source Code Analysis Engine (Fortify SCA Engine) and filter out
results that overlap.
HPQC A sample that demonstrates the Audit Workbench bugtracker plugin framework
by implementing a plugin to HP Quality Center. This plugin communicates with an
HPQC server instance through the HPQC client-side addin. The bug tracker talks
to the addin through a COM interface, and the addin handles the communication
to the server.
javaAnnotations Includes a sample application that illustrates problems that may arise in its code
and how to fix them using Fortify Java Annotations.
The goal of this example is to illustrate how the use of Fortify Annotations can
result in increased accuracy in the reported vulnerabilities. The accompanying
README file illustrates the potential problems and solutions associated with
vulnerability results.
maven-plugin Tests can be run on any project that uses Maven (for instance, those included in
the samples directory, or WebGoat 5.3: http://code.google.com/p/webgoat/)
webgoat WebGoat test J2EE web application provided by the Open Web Application
Security Project (http://www.owasp.org). This directory contains the WebGoat
5.0 sources.
The WebGoat Java sources can be used directly for Java vulnerability scanning with the
Fortify Source Code Analysis Engine.
Quantifying Risk
Now that we have explained how we use impact, likelihood, and remediation information, we will explain how
we quantify these values as part of the static analysis process.
Since it is not possible to determine if or when an organization will suffer consequences related to a particular
vulnerability, Static Code Analysis takes a probabilistic approach to prioritizing vulnerabilities. Risk is defined
quantitatively, as follows:
risk = impact x likelihood
The risk that a vulnerability poses is equal to the impact of the vulnerability multiplied by the likelihood that
the impact will occur. We define impact as the negative outcome resulting from a vulnerability and likelihood
as the probability that the impact will come to pass.
Impact can come in many forms. For example, an organization might lose money or reputation because of a
successful attack, or it might lose business opportunity because the presence of a vulnerability causes a
system to fail a regulatory compliance check.
Two factors contribute to the likelihood that a particular vulnerability will cause harm:
• The probability that the vulnerability will be discovered (by an attacker or an auditor)
• The conditional probability that, once found, the vulnerability will be exploited
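The decomposition above can be sketched numerically. The probability and impact values below are invented purely for illustration; they are not produced by, or calibrated against, any Fortify tool:

```python
# Hypothetical worked example of risk = impact x likelihood, where likelihood
# is the product of the two probabilities described above.

impact = 4.0         # potential damage on a hypothetical 0-5 scale
p_discovered = 0.8   # probability the vulnerability is discovered
p_exploited = 0.5    # conditional probability it is then exploited

# Likelihood combines the probability of discovery with the conditional
# probability of exploitation.
likelihood = p_discovered * p_exploited

# Risk is the impact of the vulnerability scaled by that likelihood.
risk = impact * likelihood

print(likelihood, risk)
```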
Figure: Category risk is derived from impact, accuracy, confidence, and probability.