
Building Security In

Editor: Gary McGraw, [email protected]

Software Penetration Testing

Brad Arkin, Symantec
Scott Stender, Information Security Partners
Gary McGraw, Cigital

Quality assurance and testing organizations are tasked with the broad objective of assuring that a final software application fulfills its functional business requirements. Such testing most often involves running a series of dynamic functional tests to ensure proper implementation of the application's features. However, because security is not a feature or even a set of features, security testing doesn't directly fit into this paradigm.1

Security testing poses a unique problem. Most software security defects and vulnerabilities aren't related to security functionality—rather, they spring from an attacker's unexpected but intentional misuses of the application. If we characterize functional testing as testing for positives—verifying that a feature properly performs a specific task—then security testing is in some sense testing for negatives. The security tester must probe directly and deeply into security risks (possibly driven by abuse cases and architectural risks) to determine how the system behaves under attack.

One critical exception to this rule occurs when the tester must verify that security functionality works as specified—that the application not only doesn't do what it's not supposed to do, but that it does do what it's supposed to do (with regard to security features).

In any case, testing for a negative poses a much greater challenge than verifying a positive. Quality assurance people can usually create a set of plausible positive tests that yield a high degree of confidence that a software component will perform functionally as desired. However, it's unreasonable to verify that a negative doesn't exist merely by enumerating actions intended to produce a fault and reporting whether, and under which circumstances, a fault occurs. If "negative" tests don't uncover any faults, we've only proven that no faults occur under particular test conditions; by no means have we proven that no faults exist. When applied to security testing, where the lack of a security vulnerability is the negative we're interested in, this means that passing a software penetration test provides very little assurance that an application is immune to attack. One of the main problems with today's most common approaches to penetration testing is misunderstanding this subtle point.

Penetration testing today

Penetration testing is the most frequently and commonly applied of all software security best practices, in part because it's an attractive late-lifecycle activity. Once an application is finished, its owners subject it to penetration testing as part of the final acceptance regimen. These days, security consultants typically perform assessments like this in a "time boxed" manner (expending only a small and predefined allotment of time and resources to the effort) as a security checklist item at the end of the life cycle.

One major limitation of this approach is that it almost always represents a too little, too late attempt to tackle security at the end of the development cycle. As we've seen, software security is an emergent property of the system, and attaining it involves applying a series of best practices throughout the software development life cycle (SDLC; see Figure 1).1 Organizations that fail to integrate security throughout the development process often find that their software suffers from systemic faults both at the design level and in the implementation (in other words, the system has both security flaws and security bugs). A late-lifecycle penetration testing paradigm uncovers problems too late, at a point when both time and budget severely constrain the options for remedy. In fact, more often than not, fixing things at this stage is prohibitively expensive.

An ad hoc software penetration test's success depends on many factors, few of which lend themselves to metrics and standardization. The most obvious variables are tester skill, knowledge, and experience. Currently, software security assessments don't follow a standard process of any sort and therefore aren't particularly amenable to a consistent application of knowledge (think checklists and boilerplate techniques). The upshot is that only skilled and experienced testers can successfully perform penetration testing.

The use of security requirements, abuse cases, security risk knowledge, and attack patterns in application design, analysis, and testing is rare in current practice. As a result, security findings can't be repeated across different teams and vary widely depending on the tester. Furthermore, any test regimen can be structured in such a way as to influence the findings. If test parameters are determined by individuals motivated not to find any security issues (consciously or not), it's likely that the penetration testing will result in a self-congratulatory exercise in futility.

[Figure 1. The software development life cycle. Throughout this series, we'll focus on specific parts of the cycle; here, we're examining penetration testing. Touchpoints shown across the life-cycle artifacts (requirements and use cases, design, test plans, code, test results, field feedback): abuse cases, security requirements, risk analysis, external review, risk-based security tests, static analysis (tools), risk analysis, penetration testing, and security breaks.]

Results interpretation is also an issue. Typically, results take the form of a list of flaws, bugs, and vulnerabilities identified during penetration testing. Software development organizations tend to regard these results as complete bug reports—thorough lists of issues to address to secure the system. Unfortunately, this perception doesn't factor in the time-boxed nature of late-lifecycle assessments. In practice, a penetration test can only identify a small representative sample of all possible security risks in a system. If a software development organization focuses solely on a small (and limited) list of issues, it ends up mitigating only a subset of the security risks present (and possibly not even those that present the greatest risk).

All of these issues pale in comparison to the fact that people often use penetration testing as an excuse to declare victory. When a penetration test concentrates on finding and removing a small handful of bugs (and does so successfully), everyone looks good: the testers look smart for finding a problem, the builders look benevolent for acquiescing to the test, and the executives can check off the security box and get on with making money. Unfortunately, penetration testing done without any basis in security risk analysis leads to this situation with alarming frequency. By analogy, imagine declaring testing victory by finding and removing only the first one or two bugs encountered during system testing!

A better approach

All is not lost—security penetration testing can be effective, as long as we base the testing activities on the security findings discovered and tracked from the beginning of the software life cycle, during requirements analysis, architectural risk analysis, and so on. To do this, a penetration test must be structured according to perceived risk and offer some kind of metric relating risk measurement to the software's security posture at the time of the test. Results are less likely to be misconstrued and used to declare pretend security victory if they're related to business impact through proper risk management.

Make use of tools

Tools should definitely be part of penetration testing. Static analysis tools can vet software code, either in source or binary form, in an attempt to identify common implementation-level bugs such as buffer overflows.2 Dynamic analysis tools can observe a system as it executes as well as submit malformed, malicious, and random data to a system's entry points in an attempt to uncover faults—a process commonly referred to as fuzzing. The tool then reports the faults to the tester for further analysis.3 When possible, use of these tools should be guided by risk analysis results and attack patterns.
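To make the idea concrete, here is a minimal sketch of a Miller-style fuzz driver in C. It assumes a hypothetical component entry point, parse_record(), and simply feeds it random byte buffers in a child process, reporting any crash to the tester; a real dynamic analysis tool would also save the offending inputs and shape them using the attack patterns identified during risk analysis.

    /* fuzz_driver.c: minimal random-input fuzz sketch (illustrative only).
       parse_record() is a hypothetical entry point; link the harness
       against the component under test. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>
    #include <unistd.h>
    #include <sys/types.h>
    #include <sys/wait.h>

    int parse_record(const unsigned char *buf, size_t len);  /* component under test */

    int main(void)
    {
        unsigned char buf[4096];
        srand((unsigned)time(NULL));

        for (int i = 0; i < 10000; i++) {
            size_t len = (size_t)(rand() % (int)sizeof(buf));
            for (size_t j = 0; j < len; j++)
                buf[j] = (unsigned char)(rand() % 256);   /* random, usually malformed, input */

            pid_t pid = fork();
            if (pid == 0) {                                /* child: exercise the entry point */
                parse_record(buf, len);
                _exit(0);
            }

            int status = 0;
            waitpid(pid, &status, 0);
            if (WIFSIGNALED(status))                       /* a crash is a fault worth reporting */
                fprintf(stderr, "iteration %d: entry point crashed with signal %d\n",
                        i, WTERMSIG(status));
        }
        return 0;
    }

In practice, the harness would seed the generator reproducibly and write each crashing input to disk so the fault can be analyzed and later turned into a regression test.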
Tools offer two major benefits. First, when used effectively, they can perform most of the grunt work needed for basic software security analysis. Of course, a tool-driven approach can't be used as a replacement for review by a skilled security analyst (especially because today's tools aren't applicable at the design level), but such an approach does help relieve a reviewer's work burden and can thus drive down cost. Second, tool output lends itself readily to metrics, which software development teams can use to track progress over time. The simple metrics commonly used today don't offer a complete picture of a system's security posture, though, so it's important to emphasize that a clean bill of health from an analysis tool doesn't mean that a system is defect free. The value lies in relative comparison: if the current run of the tools reveals fewer defects than a previous run, we've likely made some progress.

Test more than once

Today, automated review is best suited to identifying the most basic implementation flaws. Human review is necessary to reveal flaws in the design or more complicated implementation-level vulnerabilities (of the sort that attackers can and will exploit), but such review is costly. By leveraging the basic SDLC touchpoints described in this series of articles, penetration tests can be structured in such a way as to be cost effective and give a reasonable estimation of the system's security posture.

Penetration testing should start at the feature, component, or unit level, prior to system integration. Risk analysis performed during the design phase should identify and rank risks as well as address intercomponent assumptions.4,5 At the component level, risks to the component's assets must be mitigated within the bounds of contextual assumptions. Tests should attempt unauthorized misuse of, and access to, target assets as well as try to violate any assumptions the system might make relative to its components.

Testers should use static and dynamic analysis tools uniformly at the component level. In most cases, no customization of basic static analysis tools is necessary for component-level tests, but a dynamic analysis tool will likely need to be written or modified for the target component. Such tools are often data-driven tests that operate at the API level. Any tool should include data sets known to cause problems, such as long strings and control characters.6 Furthermore, the tool design should reflect the security test's goal: to misuse the component's assets, violate intercomponent assumptions, or probe risks.
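As an illustration of such a data-driven test, the sketch below assumes a hypothetical component API, set_username(), that is documented to reject invalid input. It drives the API with inputs known to cause problems and flags any case the component accepts rather than rejects; the data set would normally be drawn from the risks and attack patterns identified for that component.

    /* component_abuse_test.c: data-driven negative test sketch (illustrative).
       set_username() is a hypothetical component API that should return
       nonzero (rejection) for invalid input. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int set_username(const char *name);   /* component under test */

    int main(void)
    {
        static char long_string[8192];
        memset(long_string, 'A', sizeof(long_string) - 1);   /* oversized input */

        /* Inputs chosen to misuse the component: long strings, control
           characters, format specifiers, and traversal sequences. */
        const char *hostile[] = {
            long_string,
            "user\x01\x02\x7f",
            "%s%s%s%n",
            "../../etc/passwd",
            "",
        };

        int failures = 0;
        for (size_t i = 0; i < sizeof(hostile) / sizeof(hostile[0]); i++) {
            if (set_username(hostile[i]) == 0) {              /* hostile input accepted */
                fprintf(stderr, "case %zu: hostile input accepted\n", i);
                failures++;
            }
        }
        return failures ? EXIT_FAILURE : EXIT_SUCCESS;
    }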
Unit testing carries the benefit of breaking system security down into several discrete parts. Theoretically, if each component is implemented safely and fulfills intercomponent design criteria, the greater system should be in reasonable shape (although this problem is much harder than it seems at first blush7). By identifying and leveraging security goals during unit testing, we can significantly improve the greater system's security posture.

Penetration testing should continue at the system level and be directed at the integrated software system's properties, such as global error handling, intercomponent communication, and so on. Assuming unit testing has successfully achieved its goals, system-level testing shifts its focus toward identifying intercomponent issues and assessing the security risk inherent at the design level. If, for example, a component assumes that only trusted components have access to its assets, security testers should structure a test to attempt direct access to that component from elsewhere. A successful test can undermine the system's assumptions and could result in an observable security compromise. Dataflow diagrams, models, and high-level intercomponent documentation created during the risk analysis stage can also be a great help in identifying where component seams exist.
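A sketch of such a direct-access test follows, assuming a hypothetical internal component that listens on TCP port 9000 of host 10.0.0.5, speaks a made-up one-line request format, and is supposed to serve only the trusted front end. The test connects from an untrusted vantage point and reports whether the component answers an unauthenticated request.

    /* direct_access_test.c: probe an intercomponent trust assumption (sketch).
       The host, port, and request format are hypothetical placeholders for
       the component identified during risk analysis. */
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <netinet/in.h>
    #include <arpa/inet.h>
    #include <sys/socket.h>

    int main(void)
    {
        int fd = socket(AF_INET, SOCK_STREAM, 0);
        if (fd < 0) { perror("socket"); return 2; }

        struct sockaddr_in addr;
        memset(&addr, 0, sizeof(addr));
        addr.sin_family = AF_INET;
        addr.sin_port = htons(9000);                      /* hypothetical internal port */
        inet_pton(AF_INET, "10.0.0.5", &addr.sin_addr);   /* hypothetical component host */

        if (connect(fd, (struct sockaddr *)&addr, sizeof(addr)) != 0) {
            printf("direct connection refused: assumption holds\n");
            close(fd);
            return 0;
        }

        /* Issue a request without going through the trusted front end. */
        const char *req = "GET_ASSET user=admin\n";
        write(fd, req, strlen(req));

        char reply[256];
        ssize_t n = read(fd, reply, sizeof(reply) - 1);
        close(fd);
        if (n > 0) {
            reply[n] = '\0';
            printf("finding: component answered an untrusted caller: %s", reply);
            return 1;                                      /* trust assumption violated */
        }
        printf("connected, but no reply to the untrusted request\n");
        return 0;
    }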
Tool-based testing techniques are appropriate and encouraged at the system level, but for efficiency's sake, such testing should be structured to avoid repeating unit-level testing. Accordingly, system-level tests should focus on aspects of the system that couldn't be probed during unit testing.

If appropriate, system-level tests should analyze the system in its deployed environment. Such analysis should be targeted to ensure that suggested deployment practices are effective and reasonable and that external assumptions can't be violated.

Integrate with the development cycle

Perhaps the most common problem with the software penetration testing process is the failure to identify lessons to be learned and propagated back into the organization. As we mentioned earlier, it's tempting to view a penetration test's results as a complete and final list of bugs to be fixed rather than as a representative sample of faults in the system.

Mitigation strategy is thus a critical aspect of the penetration test. Rather than simply fixing identified bugs, developers should perform a root-cause analysis of the identified vulnerabilities. If most vulnerabilities are buffer overflows, for example, the development organization should determine just how these bugs made it into the code base. In such a scenario, lack of developer training, misapplication (or nonexistence of) standard coding practices, poor choice of languages and libraries, intense schedule pressure, or any combination thereof could ultimately represent an important cause.

Once a root cause is identified, developers and architects should devise mitigation strategies to address the identified vulnerabilities and any similar vulnerability in the code base. In fact, best practices should be developed and implemented to address such vulnerabilities proactively in the future. Going back to the buffer overflow example, an organization could decide to train its developers and eliminate the use of potentially dangerous functions such as strcpy() in favor of safer string-handling libraries.
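The contrast behind that recommendation is easy to see in a few illustrative lines of C: the unbounded copy below overflows its 16-byte destination whenever an attacker supplies a longer name, while the bounded version truncates instead of overflowing. Here snprintf() simply stands in for whatever safer string-handling library the organization adopts.

    #include <stdio.h>
    #include <string.h>

    /* Risky pattern: strcpy() copies until it finds a NUL byte, so any
       input longer than 15 characters writes past 'name', the classic
       buffer overflow. */
    void greet_unsafe(const char *input)
    {
        char name[16];
        strcpy(name, input);                       /* no bound on the copy */
        printf("hello, %s\n", name);
    }

    /* Safer pattern: bound the copy to the destination size and guarantee
       NUL termination; over-long input is truncated, not written out of
       bounds. */
    void greet_safe(const char *input)
    {
        char name[16];
        snprintf(name, sizeof(name), "%s", input);
        printf("hello, %s\n", name);
    }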
A good last step is to use test result information to measure progress against a goal. Where possible, tests for the mitigated vulnerability should be added to automated test suites. If the vulnerability resurfaces in the code base at some point in the future, any measures taken to prevent the vulnerability should be revisited and improved. As time passes, iterative security penetration tests should reveal fewer and less severe flaws in the system. If a penetration test reveals flaws of serious severity, the "representative sample" view of the results should give the development organization serious reservations about deploying the system.
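A sketch of such a regression test, again assuming a hypothetical fixed routine, format_greeting(), that now bounds its output, replays the over-long input from the original finding and fails the automated suite if the result ever escapes the destination buffer again.

    /* regression_overflow_test.c: automated regression test sketch.
       format_greeting() is a hypothetical routine that was fixed after a
       penetration-test finding; the test replays the original attack input. */
    #include <assert.h>
    #include <string.h>

    int format_greeting(char *dst, size_t dstlen, const char *input);

    int main(void)
    {
        char attack[1024];
        memset(attack, 'A', sizeof(attack) - 1);   /* the over-long input from the finding */
        attack[sizeof(attack) - 1] = '\0';

        char out[16];
        format_greeting(out, sizeof(out), attack);

        /* The fix must keep the result inside the destination buffer. */
        assert(strlen(out) < sizeof(out));
        return 0;
    }

Wired into the automated suite, a test like this flags the regression immediately if the unsafe pattern is ever reintroduced.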

Penetration testing is the most commonly applied mechanism used to gauge software security, but it's also the most commonly misapplied. By applying penetration testing at the unit and system level, driving test creation from risk analysis, and incorporating the results back into an organization's SDLC, an organization can avoid many common pitfalls. As a measurement tool, penetration testing is most powerful when fully integrated into the development process in such a way that findings can help improve design, implementation, and deployment practices.

References
1. G. McGraw, "Software Security," IEEE Security & Privacy, vol. 2, no. 2, 2004, pp. 80-83.
2. B. Chess and G. McGraw, "Static Analysis for Security," IEEE Security & Privacy, vol. 2, no. 6, 2004, pp. 76-79.
3. B.P. Miller et al., Fuzz Revisited: A Re-Examination of the Reliability of Unix Utilities and Services, tech. report CS-TR-95-1268, Dept. of Computer Science, Univ. of Wisconsin, Apr. 1995.
4. D. Verdon and G. McGraw, "Software Risk Analysis," IEEE Security & Privacy, vol. 2, no. 5, 2004, pp. 81-85.
5. F. Swiderski and W. Snyder, Threat Modeling, Microsoft Press, 2004.
6. G. Hoglund and G. McGraw, Exploiting Software, Addison-Wesley, 2004.
7. R. Anderson, Security Engineering: A Guide to Building Dependable Distributed Systems, John Wiley & Sons, 2001.

Brad Arkin is a technical manager for Symantec Professional Services. His primary area of expertise is helping organizations improve the security of their applications. Arkin has a dual BS in computer science and mathematics from the College of William and Mary and an MS in computer science from George Washington University, and is an MBA candidate at Columbia University and London Business School. Contact him at brad_arkin@symantec.com.

Scott Stender is a partner with Information Security Partners. His research interests are focused on software security, with an emphasis on software engineering and security analysis methodology. Stender has a BS in computer engineering from the University of Notre Dame. Contact him at [email protected].

Gary McGraw is chief technology officer of Cigital. His real-world experience is grounded in years of consulting with major corporations and software producers. McGraw is the coauthor of Exploiting Software (Addison-Wesley, 2004), Building Secure Software (Addison-Wesley, 2001), Java Security (John Wiley & Sons, 1996), and four other books. He has a BA in philosophy from the University of Virginia and a dual PhD in computer science and cognitive science from Indiana University. Contact him at [email protected].
