Scenario
When one of the university’s service providers suffered a data breach that exposed health information for several thousand students, Taylor saw an opportunity to make lemonade from an unfortunate circumstance. From his position in the institution’s IT department, Taylor had long advocated for a strong, formal program to evaluate the risks posed by third-party providers. His appeals had gone largely unheeded, though, caught between resistance from senior leaders about the cost of such a program and opposition from faculty—and sometimes students—who were loath to forgo certain technology tools and services if the service provider couldn’t pass a risk assessment. And yet when word of the breach got out, the students were angry and the administration wanted answers.
In this case, the student health center had implemented a third-party application that allowed students to submit family medical histories and information about their own health and stress levels to a service that triangulated those data with grades and other university-provided measures of engagement. The service would then provide students with weekly personalized recommendations about lifestyle practices that could improve their physical and mental health. Only students who opted in to the program were included, and the data were confidential…until the breach.
Seeing firsthand the consequences of insufficient attention to third-party risk management (TPRM), the campus community was suddenly on board. Taylor was charged with establishing TPRM policy, processes, and standards, and he assembled a small group from various units across campus, including legal counsel, regulatory compliance, IT, and the faculty senate. They began the arduous process of developing a full inventory of third-party products and their uses. What quickly became clear was that there were too many third-party products and services already deployed and too many more in the queue to perform an exhaustive review of each one. The group established guidelines for reviewing existing and new tools,
applying a prioritization schema based on risk and reach. Some tools were jettisoned because the university already had other products or services that performed the same or similar functions and were less risky. Some were replaced with more trustworthy alternatives. A long-running research project was using a third-party tool that the TPRM group would not have approved but that was required by the agency that sponsored the research; for this tool, the IT staff implemented additional security controls to better protect the university. Even with such exceptions, though, and relatively cursory reviews for other products and services, Taylor could confidently say to the members of the university community that significant amounts of risk had been identified and either eliminated or minimized, with relatively minor impacts on users and programs that depended on third-party tools.
1. What Is It?
Third-party risk management (TPRM) refers to the activities and policies designed to identify, assess, and mitigate the potential risks from products and services provided by outside vendors, suppliers, contractors, or service providers. Higher education relies on a large—and seemingly always expanding—catalog of technology tools, any of which carries some risk to the institution and its constituents. Using third-party products and services can bring significant benefits, but institutions need to weigh those benefits against the costs when evaluating the risk from third parties. Managing the risks of applications developed in-house presents its own challenges, and that difficulty is multiplied for technology developed and maintained by a third-party provider, which might be a commercial vendor, a different higher education institution, or another type of entity.
TPRM involves understanding the potential risks external parties may pose to an organization’s operations, data security, reputation, and regulatory compliance. Those risks encompass areas including information security, data privacy, regulatory compliance, business continuity, basic functionality (ensuring technology products and services function as intended, without breaking anything else), accessibility, and ethical considerations. On the security front, any of the elements of the C-I-A triad (confidentiality, integrity, and availability) could be compromised by a third-party product. For some campuses, environmental, social, and governance (ESG) considerations need to be taken into account, and a third party might not satisfy those requirements. One common risk for cloud-based services is the potential for breach of confidential information. Similar risks may apply to on-premises software if the source code is stolen and attackers then use that to find vulnerabilities in the software. Such risks apply broadly across many vendors and could involve enterprise tools used across the institution or specialized software used in a single class or research project. This risk also includes integrations between cloud services such as Learning Tools Interoperability in a campus learning management system. Risks extend to student-led activities—for example, if a student group takes credit card payments for a fund-raising activity, that may incur risk for the institution (if cardholder data is breached), even though it’s “just students” using the technology.
2. How Does It Work?
Classic risk modeling multiplies the likelihood (odds) of an event by its impact (cost) to understand the economics of reducing risk. An institution can then propose various controls to reduce risk and weigh the cost of those controls against the value of the reduced risk. For a given product, multiple types of risk might apply, and each of those risks can be assigned a score. The overall risk for the product or service can then be expressed as the highest score of all the applicable risks. For example, if the risk of service outage is low, and the risk of data breach is medium, then the overall risk is medium. Assessing these risks, however, can be complex. In the case of a learning management system, the impact of an outage during winter break is very different from the impact on the first day of final exams.
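The scoring model described above can be sketched in a few lines of code. This is a minimal illustration, not a prescribed methodology: the three-level scale, the numeric mapping, and the cutoffs for the overall rating are all assumptions chosen to match the article's example.

```python
# Sketch of classic risk scoring: risk = likelihood x impact per risk type,
# overall risk = the highest of the individual scores.
# The scale and cutoffs below are illustrative assumptions, not a standard.

LEVELS = {"low": 1, "medium": 2, "high": 3}

def risk_score(likelihood: str, impact: str) -> int:
    """Score one risk as likelihood x impact on a 1-3 scale each."""
    return LEVELS[likelihood] * LEVELS[impact]

def overall_risk(risks: dict[str, tuple[str, str]]) -> str:
    """Overall product risk is driven by the highest individual score."""
    top = max(risk_score(l, i) for l, i in risks.values())
    # Map the numeric score back to a coarse rating (illustrative cutoffs).
    if top >= 6:
        return "high"
    if top >= 3:
        return "medium"
    return "low"

# Matching the article's example: low outage risk, medium breach risk.
product_risks = {
    "service outage": ("low", "medium"),     # score 2
    "data breach":    ("medium", "medium"),  # score 4
}
print(overall_risk(product_risks))  # medium
```

A real program would refine this considerably, for instance by making impact context-dependent (an LMS outage during finals week scores far higher than one during winter break), but the max-of-scores shape stays the same.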
Part of the complexity of TPRM is maintaining an accurate inventory of the third parties and knowing every instance on a campus that uses each tool or service, as well as knowing whom to contact about a particular third party and who is responsible for the third-party tool if it’s used by multiple units on campus. Another aspect of TPRM is monitoring vendor health and the risk that a vendor might go out of business, end support for a product, or significantly increase pricing, forcing an unexpected change. All of this work often requires more effort than many campuses are able or willing to devote, resulting in point-in-time evaluations that are less reliable than a comprehensive program. The procurement process might address risk through standard terms and conditions that apply across all vendors, not just technology vendors. One way vendors can offer assurance is by undergoing a SOC 2 Type II audit, in which an independent auditor evaluates the operating effectiveness of the vendor’s security controls over a period of time.
