Privacy Violation

ABSTRACT

Mishandling private information, such as customer passwords or social security numbers, can compromise user privacy and is often illegal.

EXPLANATION

Privacy violations occur when:

1. Private user information enters the program.

2. The data is written to an external location, such as the console, file system or network.

Example: The following code sends account credentials to a web user. Specifically, the OWA_SEC.get_password() function returns the user-supplied plaintext password associated with the account, which is then printed to the HTTP response.


...
HTP.htmlOpen;
HTP.headOpen;
HTP.title('Account Information');
HTP.headClose;
HTP.bodyOpen;
HTP.br;
HTP.print('User ID: ' || OWA_SEC.get_user_id || '<br/>');
-- Privacy violation: the plaintext password is written to the HTTP response.
HTP.print('User Password: ' || OWA_SEC.get_password || '<br/>');
HTP.br;
HTP.bodyClose;
HTP.htmlClose;
...


Other examples may contain logging statements that store plaintext passwords on the file system. Although many developers trust the file system as a safe storage location for data, it should not be trusted implicitly, particularly when privacy is a concern.
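
For instance, a logging routine along the following lines exhibits the same flaw. This is an illustrative sketch rather than code from a real application: UTL_FILE is used as one possible logging mechanism, and the APP_LOG_DIR directory object and auth.log file name are assumed.

DECLARE
  v_user VARCHAR2(256) := OWA_SEC.get_user_id;
  v_pass VARCHAR2(256) := OWA_SEC.get_password;  -- private data entering the program
  v_log  UTL_FILE.FILE_TYPE;
BEGIN
  -- Privacy violation: the plaintext password leaves the program and
  -- lands on the file system.
  v_log := UTL_FILE.FOPEN('APP_LOG_DIR', 'auth.log', 'a');
  UTL_FILE.PUT_LINE(v_log, 'login attempt: user=' || v_user
                           || ' password=' || v_pass);
  UTL_FILE.FCLOSE(v_log);

  -- Safer: record the event without the secret itself, e.g.
  -- UTL_FILE.PUT_LINE(v_log, 'login attempt: user=' || v_user);
END;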

Private data can enter a program in a variety of ways:

- Directly from the user in the form of a password or personal information

- Accessed from a database or other data store by the application, as in the sketch following this list

- Indirectly from a partner or other third party
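
For example, reading a customer's social security number out of a table taints the receiving variable just as surely as if the user had typed the value in. The following sketch assumes a hypothetical CUSTOMERS table with CUSTOMER_ID and SSN columns; the names are illustrative only.

DECLARE
  v_ssn         VARCHAR2(11);
  v_customer_id NUMBER := 42;  -- placeholder key for illustration
BEGIN
  -- Private data enters the program from a data store rather than from the user.
  SELECT ssn
    INTO v_ssn
    FROM customers
   WHERE customer_id = v_customer_id;

  -- From here on, v_ssn must be treated as private: writing it to the
  -- response, a log file, or the console would be a privacy violation.
  -- HTP.print('SSN: ' || v_ssn);
END;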

Sometimes data that is not labeled as private can have a privacy implication in a different context. For example, student identification numbers are usually not considered private because there is no explicit, publicly available mapping to an individual student's personal information. However, if a school generates identification numbers based on student social security numbers, then the identification numbers should be considered private.
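
To make the implication concrete, consider a hypothetical ID scheme of the following form (the values and naming are invented for illustration), in which the supposedly non-private identifier is derived directly from the SSN and therefore reveals part of it:

DECLARE
  v_ssn        VARCHAR2(11) := '123-45-6789';  -- illustrative value only
  v_student_id VARCHAR2(16);
BEGIN
  -- The student ID embeds the last four digits of the SSN, so anyone who
  -- learns the ID also learns part of the SSN; the ID therefore inherits
  -- the SSN's privacy requirements.
  v_student_id := 'STU-' || SUBSTR(REPLACE(v_ssn, '-'), 6, 4);
END;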

Security and privacy concerns often seem to compete with each other. From a security perspective, you should record all important operations so that any anomalous activity can later be identified. However, when private data is involved, this practice can in fact create risk.
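
One way to reconcile the two goals is to record that the sensitive operation took place while redacting the private value itself. The sketch below is one possible approach, not a prescribed fix; the APP_LOG_DIR directory object, audit.log file name, and card number are assumed for illustration. It masks all but the last four digits before the value reaches the log.

DECLARE
  v_card   VARCHAR2(19) := '4111111111111111';  -- illustrative value only
  v_masked VARCHAR2(19);
  v_log    UTL_FILE.FILE_TYPE;
BEGIN
  -- Keep only the last four digits for the audit trail.
  v_masked := RPAD('*', LENGTH(v_card) - 4, '*') || SUBSTR(v_card, -4);

  v_log := UTL_FILE.FOPEN('APP_LOG_DIR', 'audit.log', 'a');
  UTL_FILE.PUT_LINE(v_log,
      TO_CHAR(SYSTIMESTAMP, 'YYYY-MM-DD HH24:MI:SS')
      || ' payment processed, card=' || v_masked);
  UTL_FILE.FCLOSE(v_log);
END;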

Although there are many ways in which private data can be handled unsafely, a common risk stems from misplaced trust. Programmers often trust the operating environment in which a program runs, and therefore believe that it is acceptable to store private information on the file system, in the registry, or in other locally controlled resources. However, even if access to certain resources is restricted, this does not guarantee that the individuals who do have access can be trusted. For example, in 2004, an unscrupulous employee at AOL sold approximately 92 million private customer e-mail addresses to a spammer marketing an offshore gambling web site [1].

In response to such high-profile exploits, the collection and management of private data is becoming increasingly regulated. Depending on its location, the type of business it conducts, and the nature of any private data it handles, an organization may be required to comply with one or more of the following federal and state regulations:

- Safe Harbor Privacy Framework [3]

- Gramm-Leach-Bliley Act (GLBA) [4]

- Health Insurance Portability and Accountability Act (HIPAA) [5]

- California SB-1386 [6]

Despite these regulations, privacy violations continue to occur with alarming frequency.

REFERENCES

[1] J. Oates. AOL man pleads guilty to selling 92m email addies. The Register.

[2] Privacy Initiatives. U.S. Federal Trade Commission.

[3] Safe Harbor Privacy Framework. U.S. Department of Commerce.

[4] Financial Privacy: The Gramm-Leach-Bliley Act (GLBA). Federal Trade Commission.

[5] Health Insurance Portability and Accountability Act (HIPAA). U.S. Department of Health and Human Services.

[6] California SB-1386. Government of the State of California.

[7] M. Howard, D. LeBlanc. Writing Secure Code, Second Edition. Microsoft Press.

[8] Standards Mapping - Common Weakness Enumeration - (CWE) CWE ID 359

[9] Standards Mapping - OWASP Top 10 2007 - (OWASP 2007) A6 Information Leakage and Improper Error Handling

[10] Standards Mapping - OWASP Top 10 2013 - (OWASP 2013) A6 Sensitive Data Exposure

[11] Standards Mapping - Payment Card Industry Data Security Standard Version 1.1 - (PCI 1.1) Requirement 3.2, Requirement 3.4, Requirement 4.2, Requirement 8.4

[12] Standards Mapping - Payment Card Industry Data Security Standard Version 1.2 - (PCI 1.2) Requirement 3.2, Requirement 3.4, Requirement 4.2, Requirement 6.5.6, Requirement 8.4

[13] Standards Mapping - Payment Card Industry Data Security Standard Version 2.0 - (PCI 2.0) Requirement 3.2, Requirement 3.4, Requirement 4.2, Requirement 6.5.5, Requirement 8.4

[14] Standards Mapping - Payment Card Industry Data Security Standard Version 3.0 - (PCI 3.0) Requirement 3.2, Requirement 3.4, Requirement 4.2, Requirement 8.2.1

[15] Standards Mapping - Security Technical Implementation Guide Version 3 - (STIG 3) APP3210.1 CAT II, APP3310 CAT I, APP3340 CAT I

[16] Standards Mapping - Security Technical Implementation Guide Version 3.4 - (STIG 3.4) APP3210.1 CAT II, APP3340 CAT I

[17] Standards Mapping - Security Technical Implementation Guide Version 3.5 - (STIG 3.5) APP3210.1 CAT II, APP3340 CAT I

[18] Standards Mapping - Security Technical Implementation Guide Version 3.6 - (STIG 3.6) APP3210.1 CAT II, APP3340 CAT I

[19] Standards Mapping - Security Technical Implementation Guide Version 3.7 - (STIG 3.7) APP3210.1 CAT II, APP3340 CAT I

[20] Standards Mapping - Web Application Security Consortium 24 + 2 - (WASC 24 + 2) Information Leakage


Copyright 2014 Fortify Software - All rights reserved.
(Generated from version 2014.2.0.0007 of the Fortify Secure Coding Rulepacks)
desc.dataflow.sql.privacy_violation