Methodology

The following describes the methodology used to develop the Canadian Global Health Equity in Biomedical Research Report Card. Development of the methodology was overseen by a student committee and guided by metrics that meet defined criteria and by the advice of colleagues working in global health.

All elements of the evaluation, including the selection of universities, selection of metrics, data collection, scoring, and grading, were conducted by the UAEM Report Card Student Team between June 2016 and July 2017.

SELECTION OF UNIVERSITIES

The Canadian iteration of the UAEM University Report Card specifically evaluates the member institutions of the U15, a non-governmental organization of the 15 research-intensive universities in Canada.

Collectively, the members of the U15 represent 47 percent of all university students in Canada, 71 percent of all full-time doctoral students in the country, 87 percent of all contracted private-sector research in Canada, and 80 percent of all patents and startups in Canada. These figures can be found here:

SELECTION OF EVALUATION METRICS

To provide a comprehensive overview of university commitment to global health research, the Canadian Report Card measures 65 performance indicators in 5 general categories.

The 65 specific metrics were selected on the basis of the following criteria:

  • Significance as indicators of global health research
  • Availability of standardized data sources for all evaluated institutions
  • Consistent data and comparability across evaluated institutions
  • Ability of evaluated institutions to concretely improve performance on these metrics
  • Diverse range of indicators measuring policy and implementation

INNOVATION

Q1-a: What percentage of the University’s total CIHR research funding in the last 2 fiscal years is allocated to global health research?

Q1-b: What percentage of the University’s total CIHR research funding in the last 2 fiscal years is allocated to rare disease research, Tuberculosis, Malaria, and/or antimicrobial resistance (AMR)?

Q1-c: What percentage of the University’s total CIHR research funding in the last 2 fiscal years is allocated to neglected disease (ND) research and diseases with neglected aspects, including HIV/AIDS, Tuberculosis, Malaria, etc.?

Q1-d: What percentage of the University’s total CIHR research funding in the last 2 fiscal years is allocated to emerging infectious disease research?

Q2: How many noncommunicable disease (NCD) research grants (or other similar grants awarded to projects focused on underserved Canadian populations) has the university received in the last 2 calendar years? If multiple grants were received, please specify the name and amount of each.

Q3-a: What percentage of the university’s medical PubMed publications in the last 2 calendar years (January 1st, 2019 – December 31st, 2020) are focused on global health?

Q3-b: What percentage of the university’s medical PubMed publications in the last 2 calendar years (January 1st, 2019 – December 31st, 2020) are focused on public health strategy?

Q3-c: What percentage of the university’s medical PubMed publications in the last 2 calendar years (January 1st, 2019 – December 31st, 2020) are focused on neglected diseases, HIV, TB, malaria, antimicrobial resistance, and/or access to medicines in low- and middle-income countries?

Q3-d: What percentage of the university’s medical PubMed publications in the last 2 calendar years (January 1st, 2019 – December 31st, 2020) are focused on indigenous health?

Q4: Is any of the university’s medical research being done in collaboration with, funded by, or driven by alternative models for research and development (e.g., Drug Discovery and Data-Sharing platforms, Prizes, Philanthropy for Drug Discovery, Drug Patent Pools, Public-Private Partnerships, etc.)?

Q5: Is the University currently engaged in or supporting research on global drug pricing mechanisms to ensure equitable access to affordable medicines, or has research in this area been carried out? Examples of qualifying initiative types were provided.

Q6: Is the university currently in one or more partnerships with a pharmaceutical corporation either via a specific research project, lab, center, initiative, or another model?

Q7: Does the university have, or plan to open, a research center or institute dedicated specifically to neglected diseases, HIV/AIDS, TB, Malaria and/or antimicrobial resistance (AMR)?

Q8: Does the university have, or plan to open, a research center or institute dedicated specifically to indigenous medical or health research?

ACCESS

Q1-a: Has the university officially and publicly committed to licensing its medical discoveries in ways that promote access and affordability for resource-limited populations?

Q1-b: Does the website of the university’s technology transfer office (TTO) make an effort to disclose, explain and promote access licensing commitments and practices?

Q2: Has the university adopted initiatives or policies supporting open access publications (EXCLUDING STATEMENTS REGARDING COVID-19)?

Q3: What percentage of the university’s total medical sciences publication output is published in open-access publications?

Q4-a: In the past 2 calendar years, what percentage of the university’s total research licenses were non-exclusive? (EXCLUDING COVID-19 RELATED LICENSES)

Q4-b: In the past 2 calendar years, what percentage of the university’s health technology licenses were non-exclusive?

Q5-a: In the past 2 calendar years, for what percentage of all health technologies did the university seek patents in low- and lower-middle-income countries as defined by the World Bank?

Q5-b: In the past 2 calendar years, for what percentage of all health technologies did the university seek patents in upper-middle-income countries as defined by the World Bank?

Q6-a: In the past 2 calendar years, what percentage of the university’s exclusive licenses of health technologies included provisions to promote access to those technologies in low- and middle-income countries as defined by the World Bank? Please provide examples of either template or redacted language. 

Q6-b: In the past 2 calendar years, what percentage of the university’s exclusive licenses of health technologies included provisions to promote access to populations with historically low access to medicines, including indigenous communities, within high-income countries as defined by the World Bank? Please provide examples of either template or redacted language. (not weighted)

Q6-c: In the past 2 calendar years, what percentage of the university’s exclusive licenses of health technologies included provisions to promote access to those technologies in high-income countries as defined by the World Bank? 

Q7-a: Has the university publicly acknowledged the existence/effectiveness of alternative models of research and development as being important to ensuring access to medical innovation?

Q7-b: What actions has the university and its technology transfer office (TTO) undertaken in the past 2 calendar years to improve access to the technologies they have licensed in resource-poor settings?

Q8: Has the university publicly stated the existence/effectiveness of alternative models of research and development as being important to ensuring access to medical innovation?

Q9: Has the university submitted a patent(s) recently (after 2010) to the Medicines Patent Pool (MPP) or World Intellectual Property Organization (WIPO) for protected intellectual property status for medicines treating HIV, hepatitis C, malaria, tuberculosis, neglected diseases, or other patented essential medicines in low- and middle-income countries?

EMPOWERMENT

Q1-a: Does the university offer its students access to global health engagement and/or education? [As indicated by the existence of a university center/institute, department, and/or non-degree program in global health.]

Q1-b: Does the university offer its students access to global health engagement and/or education? [As indicated by the existence of a university graduate degree, major/concentration, focus/specialization, certificate, or undergraduate degree in global health.]

Q2-a: Did the university offer courses in the last 2 academic years that address the policy and legal context of biomedical R&D, and more specifically, the impact of intellectual property policies on research priorities and global access to medical innovations? 

Q2-a-b: Is/are the course(s) required for any major programs?

Q2-b: Did the university offer courses in the last 2 academic years that address the prevalence of and/or lack of research on neglected diseases, including neglected aspects HIV, TB, and/or malaria?

Q2-b-b: Is/are the course(s) required for any major programs?

Q2-c: Did the university offer training opportunities (courses, conferences, seminars, etc.) for students and faculty on cultural sensitivity and/or decolonized approaches to global health in the last 2 academic years?

Q2-c-b: Is/are the course(s) required for any major programs?

Q3: Does the university have programs to maximize Indigenous student enrollment in health-related programs? If applicable, please describe the program. 

Q4: Has the university hosted a major (in person or remote) conference, symposium or campus-wide event in the past 2 academic years on:
Part A: The policy and legal context of biomedical R&D, specifically the impact of intellectual property rights on research priorities and global access to medical innovations? If the university has hosted multiple events for this category, please specify how many;
Part B: Neglected diseases, HIV, TB, and/or malaria, and health needs of low- and middle-income countries? If the university has hosted multiple events for this category, please specify how many;
Part C: Promoting drug access for low-income populations in high-income countries? If the university has hosted multiple events for this category, please specify how many.

Q5: Is the university formally involved in a global health partnership with one or more universities based in low- and middle-income countries? 

Q6: Does the university offer any of its students an opportunity to learn more about alternative models for research and development through courses, workshops, or other opportunities?

TRANSPARENCY

Q1: How responsive was the university’s Technology Transfer Office (TTO) to emails from UAEM regarding the section surveys?

Q2-a: For questions relying on public data (CATEGORY 1) in the Access section, was sufficient information available online?

Q2-b: For questions relying on public data (CATEGORY 1) in the Innovation section, was sufficient information available online?

Q2-c: For questions relying on public data (CATEGORY 1) in the Empowerment section, was sufficient information available online?

Q2-d: For questions relying on public data (CATEGORY 1) in the Transparency section, was sufficient information available in public sources/databases? 

Q2-e: For questions relying on public data (CATEGORY 1) in the COVID-19 section, was sufficient information available in public sources/databases? 

Q3: How much discrepancy exists between university responses in the submitted forms and the data collected internally by UAEM from publicly available sources for Category 1 and 2 questions?

Q4: What percentage of the university’s clinical trial data was published during the last 2 calendar years?

Q5: In the past 4 calendar years, what percentage of completed clinical trials conducted by the university and registered on ClinicalTrials.gov had their data shared as summary results on ClinicalTrials.gov or PubMed?

Q6: Does the university have policies that mandate that all university researchers publish all results of all clinical trials? Y/N. If so, please provide the university’s policy or examples of the university’s policy.

Q7: Do you recommend or require your researchers to prospectively register all clinical trials with an appropriate registry before any subject is enrolled? 

Q8: Does the university publicly acknowledge the need to be transparent in clinical trial results? If yes, check all that apply.

Q9-a: Does the university engage in commissioned research from private companies? Y/N

Q9-b: Do these companies have the ability to insert clauses affecting or preventing data publication? Y/N 

Q10: Does the university have clear guidelines or policies for conflict of interest regarding partnerships with industry with a potential for commercial interest?

COVID-19

Q1: Has the University made COVID-19 specific adjustments to their technology licensing practices? If so, please summarize or provide examples of licensing changes. (not weighted)

Q2-a: Has the University signed on to the Open Covid Pledge or signed a similar agreement regarding intellectual property for the COVID-19 pandemic? Please specify which agreement the university is a signatory to.

Q2-b: If the agreement the university signed involves technology sharing (e.g., the AUTM Guidelines), what technologies has it shared so far? Please provide specific examples. (not weighted)

Q3: Has the University made agreements on COVID-19 research/technology licensing with private companies? If yes, is the license exclusive or non-exclusive? If possible, please provide sample language from the licensing agreement. 

Q4: What percentage of the University’s publications on COVID-19 biomedical research are available open access (from March 2020 – March 2021)? 

Q5-a: Has the university hosted a (remote) conference, symposium or event in the last 12 months on COVID-19 needs of low- and middle-income countries?

Q5-b: Has the university hosted a (remote) major conference, symposium, or event in the last 12 months on biomedical COVID-19 related challenges/solutions for populations with historically low access to medicines, including Indigenous communities, within high-income countries such as Canada and the United States?

Q6-a: Has the University translated its COVID-19 public health strategy research findings into practical resources that are available and accessible to indigenous communities? If yes, please provide examples. 

Q6-b: If yes, how many languages are the resources accessible in?

While we acknowledge there will be variation across universities selected for evaluation (e.g. in levels of research funding, student body size), we also recognize that these institutions are public universities. This homogeneity among Canadian universities will allow for more direct comparisons than would be possible with a mix of public and private institutions. Regardless, UAEM has selected evaluation criteria intended to minimize the impact of any variations that may arise.

Importantly, all metrics that analyze continuous variables account for variation in school size and funding by normalizing the absolute number to the overall level of combined CIHR, NSERC and Gates Foundation funding. For example, when evaluating a university’s investment in neglected disease (ND) research, Antimicrobial Resistance (AMR) and neglected aspects of HIV/TB/Malaria, our metric is calculated by dividing a given institution’s overall medical research funding devoted to ND and related research projects (from the >100 funding sources included in the G-Finder report) by the total CIHR + NSERC + Gates funding to generate an “ND Innovation Index”. This enables us to adjust for confounding by institutional size and allows for a meaningful comparison of performance across institutions.
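The normalization above can be sketched as a simple ratio. This is an illustrative sketch only: the function name and the funding figures below are hypothetical, not taken from the report's data.

```python
def nd_innovation_index(nd_funding: float, cihr: float, nserc: float, gates: float) -> float:
    """Sketch of the "ND Innovation Index": neglected-disease (ND) research
    funding normalized by combined CIHR + NSERC + Gates Foundation funding."""
    total = cihr + nserc + gates
    if total <= 0:
        raise ValueError("combined funding must be positive")
    return nd_funding / total

# Hypothetical figures (millions CAD), for illustration only:
index = nd_innovation_index(nd_funding=4.2, cihr=120.0, nserc=60.0, gates=10.0)
```

Because both the numerator and the denominator scale with institutional size, the resulting ratio can be compared meaningfully across institutions.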

For categorical metrics, we have developed pre-defined sets of discrete categories by which all universities can be uniformly evaluated, and for which performance is likely to be independent of variation in university size, funding, capacity or resources.

DATA SOURCES AND COLLECTION

A critical aspect of the Report Card methodology is the collection and analysis of data using two broad categories of data extraction:

  1. Data obtained by accessing publicly available sources, such as university websites, online grant databases, government publication databases, and search engines; these data are collected by UAEM researchers, staff, and interns. This type of data is denoted Category 1 (PUBLIC DATA).
  2. Data obtained by self-report of university officials in response to survey instruments designed and provided by UAEM. This type of data is denoted Category 2 (SELF-REPORTED DATA).

We attempt to maintain rigor and minimize bias by systematically collecting and analyzing data according to detailed, predetermined standard operating procedures (SOPs) developed specifically for the Report Card methodology. All researchers were briefed on and practiced these protocols prior to beginning data collection.

For CATEGORY 1 (PUBLIC DATA), we address data quality and consistency as follows:

  • We prospectively developed SOPs and standardized data entry forms, including uniform search terms to which all investigators are required to adhere.
  • We performed quality control tests to ensure that investigators were obtaining the same results from the collection procedures.
  • In all cases, two investigators independently and concurrently performed the same data collection and search processes to ensure data consistency. Discrepancies between the two researchers were resolved by a third investigator.

For CATEGORY 2 (SELF-REPORTED DATA), we address data quality and consistency, including concerns about questionnaire non-response, as follows:

  • Compared to the first iteration of the Report Card, we chose to reduce the number of questions we asked of administrators if answers could be easily verified via public sources by our team of investigators.
  • We provide the same questionnaires to all institutions.
  • We identified between 5 and 10 specific administrators in leadership positions at each university whom we felt were most likely to recognize the value of the surveys and would encourage a response from within their teams. The individual contact details were searched publicly via the university websites and other publicly available sources. These individuals were initially emailed the survey tools, with later calls made to confirm the receipt of the survey. The contact list for each university includes, but is not limited to, directors of technology transfer offices, deans of individual schools (law, public health, medicine), and vice presidents for research.
  • We use standardized communication strategies to deliver the survey instruments to all institutions and conduct consistent follow-up via e-mail; institutions were given at least 3 months to respond to all survey instruments, and each administrator was contacted a minimum of three times to encourage response.
  • Where possible, we have asked questions in a manner such that the variable under question is categorical rather than continuous; this is in an effort to maximize the likelihood of response from institutions.
  • We apply standardized scoring of responses across all institutions.
  • If more than one person per institution replies and there is a discrepancy in the responses, we first aim to verify the correct answer via verified public sources. If this is not possible, we use the answer that favors the university.

SCORING AND GRADING

As in previous iterations, and given the purpose of the Report Card, greater weight is allocated to the Innovation and Access sections, each accounting for 25% of the total grade. The Empowerment section is worth 10% of the total grade, reflecting the greater difficulty of evaluating its metrics and the lack of a measurable correlation between them and increased access to medicines or progress on neglected diseases in low- and middle-income countries. The Transparency section is allocated 20% of the total grade because open and collaborative biomedical research drives research progress and is essential to ensuring access and innovation for all. Finally, the new COVID-19 section, which applies the values and questions of the other four sections to evaluate university responses to the COVID-19 pandemic, is weighted at 20% of the total grade.

For each question, the institution is assigned a raw score based on the gathered data. Each question is also associated with a weighting multiplier from 0.25 to 2.0, based on the relative importance of each question as determined by UAEM’s report card team. The weighted score for a given question is the product of the raw score and the weighting multiplier. To minimize bias due to non-response to CATEGORY 2 (self-reported) questions, we have designed the Report Card such that each section is a mix of CATEGORY 1 (public data) and CATEGORY 2 (self-reported) questions.
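The scoring arithmetic can be sketched as follows. The question-level calculation (raw score times a 0.25–2.0 multiplier) and the section weights follow the text above; the final aggregation step, which assumes each section score is first normalized to a 0–1 range, is our assumption for illustration.

```python
# Section weights from the text above (sum to 1.0).
SECTION_WEIGHTS = {
    "Innovation": 0.25,
    "Access": 0.25,
    "Empowerment": 0.10,
    "Transparency": 0.20,
    "COVID-19": 0.20,
}

def weighted_question_score(raw: float, multiplier: float) -> float:
    """Weighted score = raw score x question multiplier (0.25 to 2.0)."""
    if not 0.25 <= multiplier <= 2.0:
        raise ValueError("multiplier outside the 0.25-2.0 range")
    return raw * multiplier

def total_grade(section_scores: dict) -> float:
    """Combine per-section scores (assumed normalized to 0-1) into the final grade."""
    return sum(SECTION_WEIGHTS[name] * score for name, score in section_scores.items())
```

For example, a raw score of 3 on a question with a 2.0 multiplier yields a weighted score of 6, and a university scoring perfectly in every section would receive a total grade of 1.0 under this sketch.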