Retired - Web Application Security Policy

I. Title

A. Name: Web Application Security Policy

B. Number: 20180418-webappsec

C. Author: M. Muth, G.M. Sanker (ISC)

D. Status: [ ] proposed [ ] under review [ ] approved [ ] rejected [ ] obsolete [X] retired

E. Date proposed: 2018-04-18

F. Date revised: N/A

G. Date approved: N/A

H. Effective date: N/A

I. Retired date: 06/01/2021

Information Systems and Computing's Office of Information Security has the authority and responsibility to establish information security policies, guidelines, and standards.

This policy describes the requirements for securing web applications developed by Penn that access University data. It includes a baseline set of requirements for all such applications, and additional requirements for applications that access moderate- or high-risk University data as defined in the Data Risk Classification (see References, below). The policy also provides best practices recommendations to guide developers and application owners in further steps to protect Penn's data. Requirements pertaining to the computing systems on which the applications run are handled in the Computer Security Policy (see References, below).

The purpose of the policy is to protect the confidentiality, integrity, and availability of University data.

Web applications are a popular target for exploitation. They tend to be publicly facing, and they present a broad, diverse attack surface owing to the variety of software underlying web-based solutions, the prevalence of unconstrained user input, and the potential for attackers to pivot from a vulnerable web application to sensitive Penn data or to other Penn systems.

Non-compliance with this policy poses great risk to Penn and to individuals whose data Penn maintains. For Penn and its Schools and Centers, there may be regulatory fines, lawsuits, reputational damage, and the loss of trust by critical members of our community. For individuals, a loss of privacy, identity theft, embarrassment, harassment, and other problems may result. Further, security incidents can threaten the confidentiality, integrity, and availability of Penn's computing infrastructure and the critical data on which Penn's research, teaching, and service missions depend.

Application Integration Point

Any abstract facility within a software application that allows for the programmatic manipulation of the behavior of that application.

Developer

Any UPenn community member, third party, or contractor responsible for creating or implementing changes to the functionality or logic of a University-run web application.

Moderate or High risk University data

Data defined on the Penn Data Risk Classification as High or Moderate risk.

Penetration Testing

An authorized, simulated attack on a system or application performed by individuals using both automated tools and custom or manual tests to evaluate security posture. The goal is to assess how resistant the system or application would be to a non-automated/at least minimally targeted or customized attack and recommend steps to remediate unacceptable risk.

Security Liaison

An individual appointed by a School or Center who is responsible for staying aware of major information security policies, programs, and initiatives at Penn, for actively promoting security awareness in his or her School or Center, and for serving as a contact person for information security incidents.

Software Development Lifecycle (SDLC)

A process for planning, creating, testing, and deploying a web application.

Vulnerability Scanning

The process of inspecting applications or systems, typically over the network, for common vulnerabilities that may be exploited by an attacker.

Web Application

A computer program created by a developer that provides dynamic content over HTTP or HTTPS.

The policy applies to:

  • All web applications that are developed by Penn employees and that access Moderate or High risk University data.
  • All developers and application owners of those applications.
  • All web application dependencies, including libraries, frameworks, and software that serve those applications.
  1. Platform and Application Testing
    1. Critical Component registration will result in regular platform vulnerability scanning by ISC Information Security.
    2. In addition to platform vulnerability scanning, the application layer of all applications housing or accessing moderate- or high-risk data must be scanned for vulnerabilities. The application owner can comply with this policy by participating in processes developed and implemented by ISC Information Security in collaboration with the wider campus IT community. Alternatively, the application owner may opt to be responsible for maintaining their own scanning processes which are at least equivalent in both scope and nature to those offered by ISC Information Security.
      1. The above scanning must be performed at intervals appropriate to the application owner's assessment of the technical or business impact of potential exploitation, as well as on an event-driven basis (e.g., prior to initial implementation, after a major code revision, or upon publication of a new vulnerability).
        1. If the application owner participates in ISC's scanning program, the application owner is responsible for notifying ISC and requesting a scan; ISC is responsible for conducting the scan in a timely manner.
        2. If the application owner maintains their own scanning process, they are responsible for scans that are event-driven.
  2. Secure Coding
    Implement a Software Development Life Cycle (SDLC) that includes:
    1. Security as a design requirement;
    2. Use of a framework where feasible and appropriate, as identified by the Application Best Practices (see References, below); and
    3. Adherence to secure coding practices, such as minimizing the risks identified by the Coding Vulnerability Checklist (see References, below).
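As an illustration of the secure coding practices above, the following sketch shows one widely recommended practice, parameterized database queries, which prevents attacker-controlled input from altering a query (SQL injection). The table and data are made-up examples, not part of the policy.

```python
import sqlite3

# Illustrative example only: a throwaway in-memory database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES (?)", ("alice",))

def find_user(conn, name):
    # Unsafe alternative: f"SELECT ... WHERE name = '{name}'" would let
    # crafted input rewrite the query. Binding the value as a parameter
    # makes the driver treat it strictly as data.
    cur = conn.execute("SELECT id, name FROM users WHERE name = ?", (name,))
    return cur.fetchall()

print(find_user(conn, "alice"))        # [(1, 'alice')]
print(find_user(conn, "' OR '1'='1"))  # [] -- injection attempt finds nothing
```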
  3. Application Data Security
    1. Purge or move old data offline whenever possible.
    2. If stored in a database, encrypt high-risk data wherever feasible and determined to be not cost prohibitive by the data owner.
    3. Retain only the amount of log data needed to meet business needs.
    4. If high-risk data is hosted by a third party, ensure that the contract with them covers protection of the data (see Penn Purchase Order Terms and Conditions, Exhibit A in References, below).
  4. Patching
    Security patches (including those for libraries, frameworks, and any other dependencies) must be applied on a timely basis. Patches for security vulnerabilities that vendors or NIST's National Vulnerability Database designate as critical must be applied immediately or as soon as reasonably practical.
  5. User Authentication
    1. Wherever possible, use the campus authentication system (SAML - see References, below) for user accounts and passwords.
    2. If it is not possible to use the campus authentication system, ensure that user passwords are at least as strong as PennKey passwords, as described in the PennKey Password Rules (see References, below).
    3. Require two-step verification if the application serves moderate- or high-risk data.
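Where local passwords are unavoidable, a server-side strength check is one way to enforce a minimum standard. The specific rules in this sketch (12+ characters, at least three character classes) are illustrative placeholders; consult the actual PennKey Password Rules for the authoritative requirements.

```python
import re

def meets_policy(password: str) -> bool:
    # Placeholder rules for illustration only -- NOT the official
    # PennKey requirements.
    if len(password) < 12:
        return False
    classes = [
        re.search(r"[a-z]", password),       # lowercase
        re.search(r"[A-Z]", password),       # uppercase
        re.search(r"[0-9]", password),       # digit
        re.search(r"[^a-zA-Z0-9]", password) # symbol
    ]
    return sum(1 for c in classes if c) >= 3

print(meets_policy("correct-Horse-7"))  # True
print(meets_policy("short1A"))          # False
```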
  6. Service-to-Service Application Authentication
    1. All Penn managed application integration points must be secured with a strong password, certificate authentication, or a Kerberos principal or other equivalent access control. This includes but is not limited to database connections, RESTful and SOAP web services, and SSH/SFTP calls to a platform, but excludes integration points where no University data is handled.
    2. Use a known trusted password generator to generate passwords (LastPass; see References, below). The generated password must meet or exceed the complexity rubric put forth by the guidelines for setting PennKey passwords (see PennKey Password Rules in References, below).
    3. The secret (private key or password) should not be hard coded into the source code of the application or stored in the source code repository.
    4. The secret must be encrypted at rest whenever possible using NIST-specified standards and appropriately secured on the file system per the Server Best Practices. (Note that in some cases, such as SSH keys, the associated private key file cannot itself be encrypted, since the OS needs it in a clear-text state to function.)
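The requirements above can be sketched in code: generate credentials with a cryptographically secure generator, and read them from the deployment environment rather than from a literal in the source tree. The variable name `APP_DB_PASSWORD` is a made-up example, and the environment is only one of several acceptable delivery mechanisms (a secrets manager or a protected file work as well).

```python
import os
import secrets

def new_service_password(length_bytes: int = 24) -> str:
    # Python's stdlib `secrets` is a cryptographically secure source;
    # a vetted external generator (e.g., LastPass) also satisfies 6.2.
    return secrets.token_urlsafe(length_bytes)

def get_db_password() -> str:
    # Read the secret at runtime instead of hard-coding it (6.3).
    password = os.environ.get("APP_DB_PASSWORD")
    if not password:
        # Fail closed: never fall back to a default baked into the code.
        raise RuntimeError("APP_DB_PASSWORD is not set")
    return password
```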
  7. Authorization
    Review application and utility (e.g., database) accounts and privileges regularly, especially when someone with elevated privileges leaves.
  8. Logging
    Unless necessary to the business function, never log moderate- or high-risk data.
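One mechanical way to help enforce this is a logging filter that redacts known high-risk patterns before records reach any handler. The sketch below covers a single pattern (US SSNs) purely as an illustration; a real deployment would need to cover every sensitive field the application handles.

```python
import logging
import re

# Illustrative pattern for one kind of high-risk data.
SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

class RedactSSN(logging.Filter):
    """Rewrite SSN-like substrings in the message before it is emitted."""
    def filter(self, record: logging.LogRecord) -> bool:
        record.msg = SSN_RE.sub("[REDACTED]", str(record.msg))
        return True  # keep the (now-sanitized) record

logger = logging.getLogger("app")
logger.addFilter(RedactSSN())
```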
  9. Training
    Developers should complete University-provided, security-specific application development training or be able to demonstrate equivalent understanding of secure web architecture design and secure coding practices.

DRAFT POLICY -- DRAFT POLICY -- DRAFT POLICY

Policy draft in 30-day comment period, from May 1, 2019 through May 30, 2019
Submit comments via email to 20180418-webappsec-comments@lists.upenn.edu

IX. Recommendations and Best Practices

  1. Application Testing
    1. Conduct penetration testing (using an internal or external tester), particularly if the application accesses high-risk data. Note: ISC's Office of Information Security has a Penetration Testing program and recommendations for third-party providers in this space.
    2. Any web application should be scanned regularly or on an event-driven basis to prevent operational impact due to vulnerabilities.
    3. Use well-respected frameworks and libraries.
    4. Don't take on unused dependencies, unnecessary features, components, files and documentation.
  2. Secure Coding
    1. Implement a Software Development Life Cycle (SDLC) that includes:
      1. Regular regression testing;
      2. Code review; and
      3. Documentation of the process required to administer and maintain security.
    2. Security defects should be entered into a defect tracking system and clearly defined as security defects. This information should be protected, prioritized and categorized according to severity. Defects found in unreleased application versions should be fixed prior to release. Defects discovered in released applications should be assessed based on:
      1. Sensitivity of potentially exposed data; and
      2. Access required to exploit said security defect.
  3. Security and Privacy Impact Assessment
    Conduct a Security and Privacy Impact Assessment (SPIA) covering:
    1. Every application that accesses high- or moderate-risk data;
    2. The libraries on which those applications depend; and
    3. Application contacts/developers.
  4. High Risk Data Considerations
    Consider any policy or legal implications as appropriate, consulting others as needed.
  5. User Authentication
    Require authentication for all parts of an application (TBD: Definition/other description); do not leave a subset of functionality or data public, which significantly increases complexity and thus the risk that protected data will be exposed.
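The default-deny pattern described above can be sketched as a decorator that every request handler must pass through, so no route is accidentally left public. This is framework-agnostic pseudocode in Python; `current_user` is a placeholder for whatever session or SSO lookup the application actually uses.

```python
from functools import wraps

def current_user(request):
    # Placeholder: in a real application this would validate a session
    # cookie or SSO assertion. Here the request is just a dict.
    return request.get("user")

def login_required(handler):
    # Wrap every handler so unauthenticated requests are rejected
    # before any application logic runs.
    @wraps(handler)
    def wrapper(request):
        if current_user(request) is None:
            return {"status": 401, "body": "authentication required"}
        return handler(request)
    return wrapper

@login_required
def show_report(request):
    return {"status": 200, "body": f"report for {request['user']}"}

print(show_report({})["status"])                 # 401
print(show_report({"user": "alice"})["status"])  # 200
```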
  6. Service-to-Service Application Authentication
    1. As events warrant (e.g. staff leaving, technical changes), change any keys or passwords used by the application. This should be a tested, documented procedure in order to minimize risk of downtime.
    2. For older applications or those not developed according to best practices, check for weak passwords.
    3. When application integration points span zones of administrative control (e.g. between a network on campus and a cloud resource), add other controls (e.g. VPN or host-based IP restrictions) to reduce risk exposure.
  7. Logging
    1. Have a defined log monitoring practice to identify unusual or anomalous behavior associated with the application and follow it.
    2. Use off-host logging, with ISC's Security Logging Service or equivalent (see References, below).
  8. Host-based IDS
    Consider using a host intrusion detection system (HIDS) to monitor platforms housing applications with moderate or high risk data to observe unauthorized or unusual activity.
  9. Web Application Firewall
    Consider using a web application firewall to protect against common attacks and unwanted data injection.
  10. Security Response
    Establish a repeatable process for responding to external notifications of current or observed attacks. This should include identifying your organization's Security Liaison and how they will communicate critical information.
  11. Training
    1. Developers should complete security-specific application development training annually, assuming an updated, University-provided method is available.
    2. Application owners are encouraged to engage local IT groups for support and contractor vetting, to ensure that third-party developers and contractors can demonstrate an understanding of secure web architecture design and secure coding practices.
  12. Authorization
    1. Wherever possible, use the campus authorization system (see PennGroups in References, below) as the system of record for roles.
    2. Wherever possible, limit network access to IP address space of application consumers.
    3. If it is not possible for PennGroups to be the system of record, then if possible, feed entitlements downstream to PennGroups (define ad hoc groups in PennGroups).
    4. Only provide access to those who need it (e.g. students vs faculty vs staff vs a particular department vs an ad hoc group).
    5. Regularly review accounts and ad hoc privileges that are not automatically removed (e.g. by affiliation changes).
    6. Allow application owners to view users who have access and give them a clear way to request access changes.
    7. Note or log when users are added or deleted from authorization lists.
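As an illustration of limiting network access to the IP address space of application consumers (recommendation 2 above), the stdlib sketch below checks a client address against an allowlist. The networks shown are documentation placeholders, not actual Penn ranges.

```python
import ipaddress

# Placeholder consumer networks for illustration only.
ALLOWED_NETS = [
    ipaddress.ip_network("10.0.0.0/8"),     # e.g., an internal range
    ipaddress.ip_network("192.0.2.0/24"),   # e.g., a partner range
]

def client_allowed(client_ip: str) -> bool:
    # Reject anything outside the known consumer address space.
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in ALLOWED_NETS)

print(client_allowed("10.1.2.3"))      # True
print(client_allowed("198.51.100.7"))  # False
```

In practice this check usually lives in a firewall, load balancer, or web-server configuration rather than in application code, but the logic is the same.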
  13. SSL/TLS
    SSL/TLS should be hardened such that it rates a B or better using SSL Labs tests.
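One concrete hardening step that helps toward a good SSL Labs grade is refusing protocol versions below TLS 1.2, shown here with Python's stdlib `ssl` module as an illustrative sketch. Cipher-suite tuning and certificate configuration are separate, deployment-specific steps, and most deployments set this in the web server rather than in application code.

```python
import ssl

# Server-side context that refuses SSLv3/TLS 1.0/TLS 1.1 handshakes.
context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.minimum_version = ssl.TLSVersion.TLSv1_2
```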

A. Verification: ISC Information Security will use security scanners at least quarterly to scan all registered servers and applications.
B. Notification: ISC Information Security will report violations of this policy to the primary contact in ISC's Assignments Service, or to the appropriate Security Liaison.
C. Remedy: The remedy may be immediate removal of the system from the network, depending on the severity of the operational impact on PennNet. ISC Information Security shall report non-compliance to local School/Center management and University management. ISC Information Security will offer assistance to the LSP for the area in correcting security problems, after which the system may be reconnected to the network, and/or normal service restored.
D. Financial Implications: The individual, department, or unit owning the application shall bear the costs of ensuring compliance with this policy.
E. Responsibility: Responsibility for compliance lies with the application owner. The Office of Audit, Compliance, and Privacy, and the Office of Information Systems and Computing, are available for consultation in connection with this policy.
F. Time Frame: This policy shall be effective three months after final approval for new applications and 12 months after final approval for existing applications. If a school or center security liaison believes that the school or center cannot comply with this timeframe, he or she may petition for an extension under Appeals, below.
G. Enforcement: The Office of Audit, Compliance and Privacy shall conduct periodic audits. Please see the Policy on Computer Disconnection from PennNet in References, below. Individuals not adhering to the policy may be subject to sanctions such as immediate removal from the network.
H. Appeals: Requests for waiver from the requirements of this policy are decided by the Vice President for Information Systems and Computing. A waiver granted for the inability to meet one compliance requirement does not exempt the system owner from meeting all other requirements. All waiver requests may be submitted to ISC Information Security.

Policy Status
Status: Retired
Date: 01/01/2022
Approval: ISC CIO - Tom Murphy