Understanding How COVID-19 Impacted the Collection and Use of Faculty Satisfaction Metrics by Research Administration Offices

By SRAI JRA posted 03-14-2025 07:03 PM

  

Volume LVI, Number 1


Katherine Bui
Stanford University

Keith R. Berry Jr.
University of Arkansas

 

Abstract

Research administrators (RAs) at institutions of higher education (IHEs) provide critical support to faculty throughout the research lifecycle, which includes developing research, applying for funding opportunities, managing awards through closeout, and maintaining compliance. Fulfilling these tasks requires well-developed RA processes and clear customer service pathways for faculty. Collecting metrics on faculty satisfaction provides insight into how well RA infrastructure supports research and which areas need improvement. In 2019, RAs were surveyed on faculty satisfaction metric collection practices regarding processes and customer service, and on whether these metrics influenced change. After this initial data collection, the COVID-19 pandemic began and temporarily changed how research was performed, further impacting RA processes and customer service through mandatory remote work. Three years after the start of the pandemic, a survey was distributed to RAs to compare results and understand how COVID-19 impacted the collection of faculty satisfaction metrics. Results showed that during the height of the pandemic there was no increase in metric collection; however, within the last year of the pandemic, when RA offices were looking forward to a new permanent normal, there was a surge. The pandemic changed “where” faculty support was provided, beginning with the necessary shift to remote work; the data showed permanent changes to RA work environments, with hybrid and fully remote work maintained after the shutdown ended across different types of offices and IHE sizes. COVID-19 impacted research, and this study examines how it impacted the collection of faculty satisfaction metrics when RA offices were forced to pivot their processes and customer services.

Keywords: Faculty Satisfaction Metrics, COVID-19 Impact, Remote Work Transition

 

Introduction

University faculty researchers are blazing trails: discovering and developing new technologies, exploring historical works with modern equipment and methods, advancing healthcare-related treatments, and pursuing a plethora of other highly valuable avenues with the goal of improving the fundamental knowledge in their areas of expertise for the betterment of their surrounding communities and society. Behind these faculty members are research administrators (RAs), whose knowledge and expertise are required to support successful research programs throughout the research lifecycle: research development, pre-award, post-award, compliance, and dissemination of research results (Kulakowski & Chronister, 2006). The working relationship between an RA and a faculty researcher is critical for smooth and successful management of research through timely development and submission of grant proposals, setup and management of awards, and closeout of completed awards. Over the years, the view of research administration has evolved from being solely a gateway through which faculty researchers submit grant proposals to fund research projects to a modern view of providing advanced services to faculty customers that reduce administrative burden and improve compliance.

As the view of research administration continues to shift toward customer service, it is important to gauge the overall satisfaction of the customers receiving that service through the collection of metrics. Metrics in research administration offices can be used to change an RA’s behavior, drive performance, and support investments (Marina et al., 2015). According to Kriemeyer and Lindemann, metrics are “a means of representing a quantitative or qualitative measurable aspect of an issue in a condensed form,” so a metric can represent a set of circumstances occurring in a particular environment (2011, p. 75). Faculty satisfaction metrics on research administration processes and services can be both qualitative and quantitative and can be used to improve the quality and types of services RAs offer. Although limited, a few studies have examined faculty satisfaction metrics in research administration from both qualitative and quantitative viewpoints (Besch, 2014; Cole, 2007). When used appropriately, faculty satisfaction metrics can demonstrate a “misalignment between tasks, goals, and/or people” (Marina et al., 2015, p. 107) within a research administration office’s services. This misalignment provides an opportunity for change to increase the efficiency and effectiveness of research administration processes and services. Despite the finite amount of literature discussing the collection of faculty satisfaction metrics for research administration processes and services, even less discusses whether the collection of metrics leads to change that could benefit both RAs and faculty.

 

How COVID-19 Impacted Everyday Processes for Faculty

In 2019, as part of the lead author’s master’s thesis, data were collected on whether faculty satisfaction metrics were collected in institutes of higher education (IHEs) research administration offices and whether those metrics were used to make changes to the processes and customer services those offices provide (Bui, 2019). Since that data collection, the COVID-19 pandemic significantly altered how research administration offices at IHEs function. For this study, the COVID-19 pandemic is defined as beginning when it was officially declared a pandemic on March 11, 2020, and ending May 11, 2023, when the COVID-19 Public Health Emergency declaration ended (David J. Sencer CDC Museum, 2023; CDC, 2023). During these 3 years and 2 months, research and research administration were forced to adapt, transform, and consider permanent changes as safety became the priority while RAs and researchers tried to maintain productivity.

At the beginning of the pandemic, on March 15, 2020, states began to implement mandatory shutdowns to prevent further spread of the virus, and IHEs made individual decisions on how to ensure safety while maintaining mission-supporting operations (David J. Sencer CDC Museum, 2023; Smalley, 2021). Given the immediacy and serious nature of COVID-19, IHE leaders needed to make critical decisions about which activities and research would be deemed essential and remain in person with appropriate accommodations. While the definition of “essential” varied across institutions, research focused on COVID-19 was given high priority, followed by considerations of whether halting a research activity would harm participants, critical samples, data, equipment, or institutional infrastructure (Sohrabi et al., 2021). Based on this characterization, most IHEs sent students home to finish classes remotely, allowed administrative staff to work from home, and suspended in-person non-essential research functions (Coyne et al., 2020). These transitions were unprecedented for IHE leaders and were made quickly, without the faculty and stakeholder buy-in usually included in decision analysis (Coyne et al., 2020). Despite these exceptionally unique circumstances and creative solutions, grouping activities as essential or non-essential to satisfy public health recommendations created immediate and long-term consequences for faculty and their research.

With only select research topics able to proceed during the shutdown, inequity naturally emerged among faculty based on their type of research expertise. Some faculty were able to maintain research progress because their work allowed them to pivot to COVID-19, or because it did not require extensive physical laboratory space or travel for field work. Myers et al. (2020) describe how, early in COVID-19, “Scientists working in fields that tend to rely on physical laboratories and time-sensitive experiments—bench sciences such as biochemistry, biological sciences, chemistry, and chemical engineering—reported the largest declines in research time, in the range of 30–40% below pre-pandemic levels. Conversely, fields that are less equipment-intensive—such as mathematics, statistics, computer science and economics—reported the lowest declines in research time.” Faculty whose research slowed immediately then experienced long-term delays in writing and submitting manuscripts until they had complete data sets (Ramos, 2021). Even faculty who had complete data sets and could use the shutdown to write manuscripts encountered delays in peer review due to the prioritization of COVID-19 research (Ramos, 2021). A study examining the impact of COVID-19 on publications found that only 4 months after the start of COVID-19, almost half of published medical research in top journals was dedicated to COVID-19 (Raynaud et al., 2021). This demonstrates a shift not only in research focus but also a potential shift in the peer review process toward COVID-19 research. However, another study by Wooden and Hanson (2022), examining the effects of COVID-19 on authors in American Geophysical Union journals, found an increase in submissions during the pandemic.

The mixed results on research publication output during and after COVID-19 illustrate the inequity faculty faced. Even once faculty were able to transition back to campus and labs, hurdles remained: social distancing requirements preventing full lab staff capacity; delays and shortages of research supplies; restarting research projects that had suffered from the inability to maintain samples and specimens; and determining how to proceed with clinical research amid competing demands between institutional mask requirements and protocol-driven tests (Madhusoodanan, 2020). Overall, the impact on research and publication during COVID-19 varied greatly for several reasons, but the nature and requirements of the research did cause disparities among faculty.

Beyond their own research, faculty often have multiple responsibilities, such as teaching, academic advising, research mentorship, leadership duties, and administrative committees (New York University, n.d.). During the height of COVID-19, the effort allocated across these various tasks was often strained. Faculty were asked to transition their teaching format and materials to online delivery that was often new to them, with little training and additional effort required (Aydın et al., 2023; Coyne et al., 2020). Additionally, faculty had to assess the various methods available for distance learning, such as self-paced independent study, remote interactive workshops, or real-time immersive environments (Cook et al., 2010). Figuring out how to quickly transition teaching formats with minimal disruption to students while maintaining the integrity of the learning was difficult for faculty (Aljanazrah et al., 2022). Graduate students encountered the same roadblocks in their dissertation research that faculty experienced in their own, creating stress and delays for both (Elmer & Durocher, 2020). The support role faculty play for students put them at risk of absorbing students’ stress (Velez-Cruz & Holstun, 2022). During COVID-19, stress was already heightened by unique adjustments and an unforeseeable end, and faculty felt this stress both for themselves and for their students.

How COVID-19 Impacted Everyday Processes for Research Administrators

While faculty faced numerous challenges during the pandemic and mandatory shutdowns beyond those mentioned above, RAs experienced similar challenges adapting to remote work and assisting faculty during a period of uncertainty and confusion. A 2021 study by the National Council of University Research Administrators (NCURA) found that just over 600 respondents at public institutions were working remotely, out of 1,618 total responses across multiple types of institutions (Madnick et al., n.d.). Beyond this study, limited resources are available to quantify IHE RA remote work during COVID-19, but studies of United States employees and higher education staff suggest that the majority of university RAs transitioned to remote work during the height of the pandemic. According to telecommuting statistics, before the pandemic fewer than 5% of the U.S. employee workforce worked at home three or more days a week, but during the peak of the pandemic, when stay-at-home regulations were in place, 60% of employees were able to work from home successfully (Global Workplace Analytics, n.d.-b). The Bureau of Labor Statistics (BLS) categorizes jobs compatible with remote work as those with professional specialties, technical support, administrative support, and half of sales jobs; these categories align with the nature of research administration work in a university setting (Global Workplace Analytics, n.d.-a). The University of Michigan Human Resources conducted a staff survey regarding remote work nine months after the start of the pandemic: 71% of staff members had mostly worked remotely during the first 9 months of the pandemic, 87% of respondents were interested in continuing to work remotely at least one day a week, and 92% said their unit had successfully adapted to a remote work environment (Kaleba, 2021).

Prior to COVID-19, flexible work arrangements for RAs did exist, but how widely accepted these arrangements were in IHEs is unknown. IHE leadership often believed the common myth that to support university operations, staff must be physically present during standard operating hours (8 a.m. to 5 p.m.); COVID-19, however, was a forced test of versatile work conditions for university staff, including RAs (Jones, 2022). In the years following the peak of COVID-19, several IHEs have continued to explore adaptable work arrangements for staff. The University of Pittsburgh enacted an interim flexible work policy in July 2021 in preparation for fully reopening, and a survey the following year found that 95% of employees were satisfied with hybrid work (Jones, 2022). In October 2023, the University of Pittsburgh implemented a permanent flexible work policy to demonstrate commitment and support for staff and how they conduct their work (University of Pittsburgh Human Resources, n.d.). This is just one example of an institution accepting flexible work for staff over the long term.

Although remote work for research administration has become more accepted, RAs and other IHE staff also encountered additional challenges similar to those experienced by faculty. One of the first complex challenges IHEs faced was ensuring that staff had the right technological support to continue daily tasks: portable computers, accessories for virtual meetings, software to support remote phone calls, and the ability to learn new virtual collaboration platforms with limited training (Sharma, 2023). A study of RAs in Canada and how they adapted to a new environment during COVID-19 identified that RAs experienced techno-stress in organizing remote work setups and finding new norms for communicating day-to-day tasks (Sharma, 2023). With in-person communication at a mandatory standstill, RAs switched from in-person meetings and impromptu face-to-face conversations in an office setting to scheduled virtual video meetings, instant messaging, phone calls, and email conversations (Coyne et al., 2020). Processes also needed to become more reliant on technology and automation, such as transitioning from wet to electronic signatures for items like faculty expenditure reports. COVID-19 changed the logistics of how and where RAs provide their services to faculty, and figuring out available and preferred communication pathways was an important piece of successfully maintaining those services and processes.

Purpose of this Study

With the change in how RAs provided customer services and assisted with processes during the COVID-19 pandemic, collecting metrics on faculty satisfaction could give offices insight into the impact of their transitions and whether further changes are necessary based on the results. Overall, COVID-19 may have affected the motivation and need for research administration offices in IHEs to collect faculty satisfaction metrics on their customer services and processes, and therefore may have increased the use of faculty satisfaction metrics to implement change. This study is a direct comparison with the original study completed in 2019 (prior to COVID-19), with the addition of data collection on how COVID-19 influenced metric collection and the use of metrics in making changes.

 

Methods

To capture whether faculty satisfaction metric collection changed from pre-COVID-19 to post-COVID-19, an electronic survey was widely distributed to RAs in IHEs through emails and community listservs, requesting information about metric collection, metric usage, and how COVID-19 had impacted their work environment. There was minimal change in metric collection post-COVID-19 compared with the original survey; however, the data showed an increase in metric collection within the last year, an increased focus on metrics for service qualities over processes, and several changes to RA work environments through hybrid or fully remote work. The results confirmed that COVID-19 impacted the RA work environment, and the pivot toward collecting faculty satisfaction metrics for service qualities near the end of the COVID-19 timeline demonstrates that RAs and IHEs were able to start thinking about thriving rather than merely surviving the unknown the pandemic forced on society.

IRB Approval of Study

The original study in 2019 was reviewed and approved by the IRB at Johns Hopkins University, and the 2023 study was reviewed and approved by the University of Arkansas IRB. Both studies received exempt status due to the data collection methods and the structure of the questions.

Survey Distribution Methods

Data collection in 2019 and 2023 was approached in a similar manner: targeted electronic communication with RAs at IHEs. In both cases, data were collected through anonymous electronic surveys distributed among the RA community in the United States. There were multiple approaches to distributing the survey. The original email list (656 individual emails) used in 2019 to recruit potential respondents was used again. To help offset anticipated bounce-backs from individuals no longer with their organizations, an additional 145 emails were added in 2023, for a total of 801 targeted individual emails. These emails were collected from research university public websites for individuals with job titles associated with research administration at varying levels of experience and leadership. The website search was driven by collecting email addresses from universities ranked by NSF research and development expenditures (National Science Foundation, n.d.). The 801 targeted emails were sent using mail merge; 190 bounced back and did not reach the intended individual. In addition to individual emails, the survey was distributed via RA online communities such as the RA listserv, the Society of Research Administrators International (SRAI) Open Forum, the Future of the Field Forum, the National Council of University Research Administrators (NCURA) community, and the National Organization of Research Development Professionals (NORDP). The email used for these communities outlined the purpose of the survey and its anticipated length, and stated that answers would be anonymous.

Survey Structure

In the 2023 survey, all original questions from the 2019 survey were included, with several additional questions to better address previously identified limitations. Question formats varied among yes/no, select all that apply, and select the best provided answer. The initial determination question, in both 2019 and 2023, asked whether participants knew if their office collects faculty satisfaction metrics, and branching logic based on a “yes” or “no or unsure” answer filtered respondents to the appropriate subset of questions. In 2019, participants who answered “no or unsure” were brought to the end of the survey and thanked for their participation. In 2023, additional questions were added after this first determination question to gather data on participants who answered “no or unsure”; this path was designated Survey 3. If the participant answered yes to collecting faculty satisfaction metrics, they were shown a second determination question on their current employment status relative to pre-March 2020. Specifically, participants were asked whether they were working at the same university and in the same office; the same university and a different office; the same university but not previously working in research administration; or a different university. Participants at the same university and same office were directed to a subset of questions known as Survey 1; the remaining respondents were presented with Survey 2.

Most questions in Surveys 1 and 2 were similar, but Survey 1 had additional questions about collecting metrics prior to March 2020, with the anticipation that these respondents would have more institutional knowledge of whether faculty satisfaction metrics were collected. Both Surveys 1 and 2 retained the original 2019 questions as their first 10 questions. Questions were added to both surveys to collect information on metrics for processes and customer service qualities after March 2020, on remote work, and on participant demographics. Survey 3 contained new questions for participants who answered no or were unsure about collecting faculty satisfaction metrics: demographic questions like those in Surveys 1 and 2, along with questions on whether participants believed metrics should be collected and why metrics are not being collected.
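The branching logic described above can be sketched as a small routing function. This is a hypothetical illustration of the decision flow, not the actual Qualtrics configuration; the answer strings and survey labels are paraphrased from the text.

```python
from typing import Optional

def route_respondent(collects_metrics: str,
                     employment_status: Optional[str] = None) -> str:
    """Route a 2023 respondent to Survey 1, 2, or 3 based on the two
    determination questions described in the survey design.

    collects_metrics: "yes" or "no or unsure"
    employment_status: asked only when collects_metrics == "yes"
    """
    if collects_metrics == "no or unsure":
        # In 2019 these respondents were thanked and exited;
        # in 2023 they received demographic/opinion questions (Survey 3).
        return "Survey 3"
    if employment_status == "same university, same office":
        # Extra pre-March-2020 questions, anticipating that these
        # respondents hold the most institutional knowledge.
        return "Survey 1"
    # All other employment situations (different office, new to RA,
    # or a different university) receive Survey 2.
    return "Survey 2"
```

For example, a respondent who collects metrics but has moved to a different university since 2019 would be routed to Survey 2.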

Software and Storage of Survey Materials

In 2023, the new survey was designed in Qualtrics and all data from completed surveys were stored anonymously within the Qualtrics software with access limited to both authors. The survey was open to participants for approximately one month, opening on August 23, 2023, and closing September 22, 2023. Once the survey was closed, the authors used Qualtrics to assist with sorting and reviewing the data in addition to organizing the data analysis in Excel.

Limitations

There were several limitations to this study that are important to highlight. A major limitation was the low sample size and response rate, despite extensive participant recruitment efforts and an electronic survey with closed-ended questions for easy completion. COVID-19 was an unprecedented event for the generations in the current workforce. Scientists, employers, and the public were hungry to capture data on how society was evolving, and on its numerous downstream effects, through the most convenient method that required no face-to-face contact and could span continents: electronic surveys. However, with the increased desire to gather information through surveys comes the potential to overuse the method and create survey fatigue within the respondent population. The lower response to the survey distributed in this study could be a result of survey burnout, as seen in national survey response rates and surveys distributed for academic research. Krieger et al. (2023) examined US national and state response rates for six major national surveys before and after the start of the pandemic and found that five of the six saw a decrease of about 29% in response rates between 2019 and 2020. This demonstrates an initial dramatic decline in survey response rates at the beginning of the pandemic; moreover, according to the Census Bureau, the drop has been sustained throughout the pandemic and beyond (Rothbaum & Bee, 2022). For example, the response rate for the Current Population Survey Annual Social and Economic Supplement (CPS ASEC) was 82.3% in February 2020, 72.2% in March 2020, 69.9% in April 2020, 76.2% in March 2021, 68.9% in March 2023, and 69.5% in January 2024 (Rothbaum & Bee, 2022; U.S. Bureau of Labor Statistics, 2024).

Even close to four years after the onset of the pandemic, response rates for this national survey remain lower than pre-pandemic levels and continue to hover near those of April 2020. Within academic research, a decline in survey response rates has also been observed. A review by de Koning et al. (2021) of surveys administered to neurosurgeons, trainees, and medical students across published articles before and after the pandemic found an increase in publications using online surveys and a decrease in survey responses after the onset of COVID-19, indicating that the increased use of online surveys during the pandemic has overused the method and created survey fatigue. Although there is minimal research on RAs and survey fatigue, both authors have observed an increase in widely distributed online surveys in RA communities.

As previously discussed, even with a larger sample size, the data may not represent the collection of faculty satisfaction metrics across the entire IHE RA population, as not all RAs are knowledgeable on the topic. Additionally, several participants started the survey but skipped questions or did not complete it. The 2023 data were collected only three months after the emergency status of the pandemic ended; extending data collection further beyond that end point might have shown more of the anticipated increase in faculty satisfaction metrics.

Although this study aims to compare pre- and post-pandemic use of faculty satisfaction metrics in RA offices, both surveys were anonymous and the 2023 survey was distributed beyond the original respondents, so a direct one-to-one comparison of results was not possible.

 

Results and Discussions

Survey answers highlighted several trends and insights into how COVID-19 has impacted the collection of faculty satisfaction metrics and IHE research administration work environments. It was hypothesized that the share of respondents answering yes to collecting faculty satisfaction metrics would increase from 2019 to 2023 because of the impacts COVID-19 brought to research. However, comparing the 2019 (pre-COVID-19) and 2023 survey results showed no significant change in the amount of faculty satisfaction metric collection by RAs in IHEs, though there was an increase in collection within the year preceding the 2023 survey. That year corresponds to the end of the COVID-19 public health emergency in the U.S.A. Comparing pre- and post-COVID survey results also revealed a shift in focus from collecting metrics on processes to collecting metrics on customer service. This switch remains prevalent in the data when examining the specific customer service qualities and processes that are common priorities for RAs. As feedback on customer service qualities came to the fore in the 2023 results, the data also demonstrated an evolution in work setups. In 2023, most respondents reported working remotely in some capacity that was not possible prior to COVID-19, both among respondents at the same IHE and office as in 2019 and among those who had transitioned to a new IHE after 2019. RAs who currently work remotely in some capacity were found across varying types of research administration offices and at different sizes of IHEs. Beyond analyzing data from respondents who collect faculty satisfaction metrics, additional insight was obtained from those who do not, and on whether their demographics illustrate a trend in who has access to this information, which further impacts the ability to obtain data on this topic.

Overview of Responses

In 2019 there were 156 responses and in 2023 there were 98 responses to the widely distributed surveys. The participant recruitment timelines, during which the surveys were active and notifications were provided to RAs, were similar: 33 days in 2019 and 30 days in 2023. In 2019, 45 respondents answered yes to collecting faculty satisfaction metrics, 56 answered no, and 55 answered unsure. In 2023, 25 participants answered yes, 38 answered no, and 35 answered unsure.
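The counts above can be tabulated to show the yes-rate in each year; this is a simple restatement of the reported numbers for comparison, not the authors' analysis pipeline.

```python
# Response counts as reported in the text
responses = {
    2019: {"yes": 45, "no": 56, "unsure": 55},
    2023: {"yes": 25, "no": 38, "unsure": 35},
}

for year, counts in responses.items():
    total = sum(counts.values())
    yes_rate = counts["yes"] / total
    print(f"{year}: n={total}, yes={yes_rate:.1%}")

# Prints:
# 2019: n=156, yes=28.8%
# 2023: n=98, yes=25.5%
```

The similar yes-rates (28.8% vs. 25.5%) are consistent with the finding that overall metric collection did not change significantly between the two surveys.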

Timeline of the Multiyear Study

Figure 1 outlines the overall COVID-19 timeline in comparison to the data collection periods for both surveys. The first data set was collected about 6 months prior to the first recorded case of COVID-19 in the U.S.A, and the second about 3 months after the Public Health Emergency for COVID-19 ended (David J. Sencer CDC Museum, 2023; CDC, 2023). From the first recorded U.S.A case on January 20, 2020, until the pandemic was declared was 51 days, by which point there were about 1,400 confirmed COVID-19 cases in the U.S.A (Elflein, 2022). According to the CDC, when the pandemic was declared there were 118,000 cases across 114 countries and 4,291 deaths (David J. Sencer CDC Museum, 2023). COVID-19 cases quickly escalated both within the U.S.A and worldwide, and as cases continued to rise, state governments began implementing shutdowns on March 15, 2020. These shutdowns included public areas where face-to-face contact was common: K-12 and higher education school systems, restaurants, retail stores, transportation, and factories (Fairlie, 2020). With IHEs impacted by the shutdown, as discussed in the introduction, RAs transitioned how they provided support to faculty, mainly to remote support. IHE transition plans remained in effect for various lengths of time, from several months to years, and some IHEs have implemented permanent changes that align with their initial transition plans, such as fully remote employees. In one survey across several countries, 838 IHEs were asked about their plans for allowing students back on campus, and 66% planned for on-campus classes in Fall 2020 (Nurunnabi & Almusharraf, 2020). However, from 2020 to 2023, several COVID-19 variants caused IHEs to re-implement precautions such as masks, online learning, and distancing to maintain the overall safety of their students (Friedell, 2022).

The timeline from March 15, 2020, to May 11, 2023, when the public health emergency ended, spanned over 3 years. This period involved long stretches of uncertainty and change for RAs, but their everyday work remained required and important to researchers.

 

Figure 1. COVID-19 Timeline and Data Collection

COVID-19 timeline overview in relation to both data collection time points and how research and research administration changed during COVID-19.

Collection and Use of Faculty Satisfaction Metrics

Respondents who answered yes to collecting faculty satisfaction metrics, both in 2019 and in 2023, were asked how long their office had collected these metrics; Figure 2 highlights the comparison between the two data sets.

Evaluating the duration of metric collection between 2019 and 2023 against the COVID-19 timeline in Figure 1, there was a decrease in collection begun during the height of COVID-19 (the 1-2 year and 3-4 year categories). In 2019, 13% of RA offices collecting faculty satisfaction metrics had done so for 1-2 years, dropping to 6% in 2023, and the 3-4 year category decreased from 22% in 2019 to 11% in 2023. These trends support the idea that during the height of the pandemic, universities, staff, faculty, and students were in survival mode rather than focusing effort on improving processes and services, despite large changes to how work was being completed (Van Der Linden, 2021). From March 2020 through the next several years, faculty and staff had to adapt to numerous changes in how to meet their responsibilities and support the mission of their IHE. Preserving time and resources took precedence over expending valuable effort on assessing faculty feedback while the research community faced uncertainty about when, or whether, the previously defined “normal” would return to IHEs.

Months before the 2023 data set was collected, the U.S. declared an end to the PHE, and there was an increase in the “less than 1 year” category for collecting metrics, from 7% in 2019 to 22% in 2023. This increase aligns with universities and RAs emerging from the chronic stress of COVID-19 and having the capacity to reflect on pandemic transitions, assess a new normal, and evaluate the impact on processes and services (Van Looy, 2021). Additionally, 100% of the 2023 respondents in the less-than-1-year category were employed in the same office as prior to COVID-19, demonstrating that these offices initiated faculty satisfaction metric collection toward the end of COVID-19 when they had not done so before. The majority of respondents in the 5-years-or-more category were also in the same office as prior to COVID-19. The trends in these two categories reflect either a new desire to collect faculty satisfaction metrics or a sustained need for metrics that persisted through the COVID-19 transition.

An increase in faculty satisfaction metric collection was anticipated during the COVID-19 pandemic as work environments shifted; instead, collection decreased during the height of the pandemic and increased as the pandemic began to taper off. Had the 2023 data collection been pushed out another year, the less-than-1-year figure might have been even higher, capturing RAs settling into a different routine in which process and customer service adjustments need faculty satisfaction feedback now that a new standard for remote work has emerged in research administration.
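The duration-category shifts described above can be summarized numerically. The sketch below is illustrative only: it tabulates the percentages reported in this section (7% to 22% for less than 1 year, 13% to 6% for 1-2 years, 22% to 11% for 3-4 years) and computes the percentage-point change per category; it does not reproduce the underlying response counts.

```python
# Illustrative tabulation of the duration-of-collection shares reported above.
# Values are the percentages cited in the text, not raw response counts.
shares = {
    "less than 1 year": {"2019": 7, "2023": 22},
    "1-2 years": {"2019": 13, "2023": 6},
    "3-4 years": {"2019": 22, "2023": 11},
}

def point_change(category: str) -> int:
    """Percentage-point change from 2019 to 2023 for one duration category."""
    s = shares[category]
    return s["2023"] - s["2019"]

for category in shares:
    print(f"{category}: {point_change(category):+d} percentage points")
```

A positive change appears only in the less-than-1-year category, matching the surge in newly initiated collection near the end of the pandemic described above.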

 

Figure 2. Faculty Satisfaction Metrics 2019 vs 2023

Comparing 2019 and 2023 respondent answers on how long their office has collected faculty satisfaction metrics, and whether the 2023 respondent is in the same office as prior to COVID-19.

Content of Faculty Satisfaction Surveys

In 2019, the survey was designed to assess not only whether metrics were being collected in research administration offices, but also whether those metrics were being used to make changes to RA processes and customer service qualities. For respondents who answered yes to collecting metrics, Figure 3 shows what percentage collected metrics on customer service qualities and processes, and whether changes occurred. From 2019 to 2023 there was an increase in collecting customer service-related metrics and making related changes, but a decrease in collecting and acting on process-related metrics. These results indicate that during and immediately after COVID-19 there was an increased focus on faculty feedback regarding customer service qualities rather than processes.

Figure 3 further breaks down the specific processes and customer service qualities that were polled in both 2019 and 2023. The literature makes it evident that the pandemic changed how and where university staff complete their work, with the potential to impact both processes and customer service for RAs. For over 20 years prior to the pandemic, research administration work had progressively moved online, driven by new technologies such as electronic Research Administration (eRA) systems (eRA, 2024). These systems give IHEs access to federal grants through the internet and remove the need to process applications on paper. With the government transitioning to electronic submissions, IHEs and RAs had strong reasons to adopt online technology for managing research administration processes (Cayuse, n.d.). When COVID-19 shifted the research community to a predominantly remote capacity, most IHEs already had established online processes for effective research administration, requiring less transition, but some IHEs, like Thomas Jefferson University, required quick conversion of processes to an online format (Jones, 2020; Alcaine, 2020).

With the removal of in-person IHE office environments during COVID-19, how RAs provided customer service to faculty evolved. A study by Dr. Sharma, conducted in the spring of 2021 after the peak of the COVID-19 crisis, examined how RAs’ preferred modes of contact changed as a result of COVID-19. Prior to COVID-19, 56% of respondents preferred communication by email and 42% preferred in-person contact (Sharma, 2023). During the pandemic, email preference increased to 61%, and with in-person contact no longer an option, video communication was favored at 24%. Approximately one year after the transition to remote work, the study found that 57% of respondents favored email, 7% video, and 28% in-person (Sharma, 2023). Although email remained the preferred method of communication before, during, and after COVID-19, the preference for in-person communication shifted dramatically. This supports the conclusion that the transition to remote work changed how research administration customer service was provided across modes of communication. Of the four customer service qualities surveyed in Figure 3, one concerned effective communication and two are directly related to it: responsiveness and availability. Metric collection on effective communication and availability increased from 2019 to 2023, but collection on responsiveness decreased. With face-to-face interaction removed, one would expect increased faculty feedback on how to reach their RA (availability) and how to sustain productive communication (effective communication). Responsiveness might also have been expected to become a more important topic for faculty feedback, since impromptu office visits are no longer an option when emails go unanswered.

Given the change in where RAs completed their work and their new communication preferences under remote work, an increase would be expected in collecting faculty feedback on these changes and on whether the new ways of providing customer service meet faculty needs. It is understandable that when IHEs evaluated “how” RAs were doing their work during the pandemic, faculty feedback concentrated on the customer service qualities affected by the transition from office to remote work, rather than on previously established online processes.

 

Figure 3. Metric Collection on Service Qualities and Processes

Comparing 2019 to 2023 metric collection, and changes resulting from collection, for common research administration service qualities and processes.

Breaking Down Why Faculty Satisfaction Metrics are Not Collected

In 2019, the majority of survey respondents answered no (36%) or unsure (35%) as to whether their offices collect faculty satisfaction metrics (71% total). In 2023, respondents who answered no or unsure were directed to a new survey to gather additional information about the respondent and why metrics are not collected. Figure 4 provides an overview of these respondents and their reasons. The largest category was not knowing why faculty satisfaction metrics are not collected. The next two largest categories centered on not having enough time or staff for metric collection. During COVID-19, many types of employment saw increased employee turnover, and this also affected research administration (Akinyooye & Nezamis, 2021; Anderson, 2022). As discussed previously, the decrease in collecting faculty satisfaction metrics during COVID-19 could reflect the “survival mode” of RAs and a lack of time for collection, but this data also suggests that metrics may not have been collected due to staff turnover. RA turnover could exacerbate the “not enough time to collect metrics” problem, and insufficient staffing could also mean there is no one knowledgeable about best practices for collecting metrics. Both time and staff are roadblocks to IHE research administration offices collecting faculty satisfaction metrics.

In 2019, 35% of respondents did not know whether faculty satisfaction metrics were being collected, which appeared to be a limitation of the survey: not all RAs are privy to whether metrics are collected and whether they influence change. In 2023, questions were added to assess whether there are trends among those in research administration who have access to knowledge about metric collection; for example, are the metrics treated as top-down knowledge held by supervisors, or are those with more years of experience more likely to be informed of metric collection?

The survey asked all participants about their length of time in research administration and whether they supervise staff. Of the respondents who answered yes to collecting metrics, 70% were supervisors, compared to 42% of those who answered no and 34% of those who were unsure. Figure 4 further breaks down the 42% of respondents who do not know why faculty satisfaction metrics are not collected and the 58% who do know. Respondents who do not know why metrics are not collected trended toward being non-supervisors with more years of experience in research administration. Respondents who do know why metrics are not collected also trended toward more years of experience, but included more supervisors. Of those who answered yes to collecting faculty satisfaction metrics, 17 answered the question about length of research administration experience, and 12 of those had 7+ years of experience. The number of respondents increased with length of research administration experience both for those who collect metrics and for those who do not, regardless of whether they know why metrics are not collected; in other words, respondents were likely to have many years of experience but not necessarily knowledge of whether faculty satisfaction metrics are collected. However, the majority of the respondents who do collect faculty satisfaction metrics also have supervising responsibilities (n=12), though not all have 7+ years of experience.
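The grouping described in this paragraph is, in essence, a cross-tabulation of supervisor status against metric awareness. A minimal sketch of that tabulation is below; the respondent records are hypothetical stand-ins (the study's raw response file is not reproduced here), and only the grouping pattern mirrors the analysis described.

```python
from collections import Counter

# Hypothetical respondent records; fields mirror the survey questions
# discussed above: supervisor status, years of RA experience, and
# whether the respondent knows why metrics are (not) collected.
respondents = [
    {"supervisor": True,  "years": "7+",  "knows_why": True},
    {"supervisor": False, "years": "7+",  "knows_why": False},
    {"supervisor": True,  "years": "3-6", "knows_why": True},
    {"supervisor": False, "years": "7+",  "knows_why": False},
]

# Cross-tabulate (supervisor, knows_why) pairs across respondents.
crosstab = Counter((r["supervisor"], r["knows_why"]) for r in respondents)

print("supervisors who know why:", crosstab[(True, True)])
print("non-supervisors who do not:", crosstab[(False, False)])
```

The same pattern extends to any pair of survey fields (for example, years of experience against metric awareness) by changing the tuple key.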

Information about metric collection may be withheld between supervisors and their subordinates for various reasons; according to Butt (2021), supervisors may retain knowledge to protect it, to maintain the supervisor’s image, or to preserve a position of authority. In 2019 and 2023, the survey asked respondents what, in general, is being done with faculty satisfaction metrics, and the majority in both data sets answered that the metrics were used to motivate staff, carry out performance evaluations, or execute adjustments in customer service and processes. These uses, along with the motivators described above, may lead supervisors not to share metric information with their employees if doing so would break confidentiality with the PI or create tension between the RA and the PIs who provide feedback.

The survey included additional questions about RAs’ opinions on collecting faculty satisfaction metrics. Of those who do not collect metrics or do not know whether metrics are collected, 78% agree that faculty feedback metrics should be collected on processes, and 78% also agree that they should be collected on customer service qualities. Despite most respondents not knowing why metrics are not collected, or lacking sufficient time or staff, there is a clear desire for, and perceived value in, collecting them in research administration offices. The smallest respondent category for this question was that the research administration office had collected metrics previously and they were not used. Based on this data, faculty satisfaction metrics are seen as valuable information, but not all RAs know whether their offices prioritize them. Supervisors are more likely to be aware of metric collection, and those with 7+ years in research administration are more likely to participate in research administration surveys.

 

Figure 4. Why Faculty Satisfaction Metrics Are Not Collected

Responses from 2023 respondents who do not collect faculty satisfaction metrics on why metrics are not collected. Additional data on all respondents’ years in research administration and whether they supervise RAs, in relation to whether the respondent knows about metric collection or why metrics are not collected.

Transitioning to Remote Work and the Impact on RA Turnover

COVID-19 catapulted research and research administration into remote work through forced closures of in-person activities and the creation of unique solutions to maintain everyday responsibilities. In the second data collection in 2023, questions were added to better capture the evolution of remote work from before COVID-19 to after. Figure 5 illustrates survey respondents’ changes in work environment, regardless of whether they collect faculty satisfaction metrics. Respondents were first asked whether they worked remotely in some capacity prior to the pandemic. They were then asked about their current work environment in 2023: in-person, hybrid, or fully remote with no expectation to go into the office. The data was further examined by whether the respondent remained in the same office as prior to COVID-19. The largest category of respondents comprises those who are in the same office/university as prior to COVID-19, did not work remotely before the pandemic, and currently work hybrid schedules in which part of their work is remote. Respondents with hybrid schedules were asked how many days they are required to be in the office; the average was about 7.8 days per month. The second largest category comprises those who now work at a different university than prior to COVID-19, did not work remotely before the pandemic, and now work fully remote. These two trends in Figure 5 capture offices changing their practices from fully in-office work to allowing some remote work, and RAs changing offices and IHEs to gain the flexibility to work remotely in some capacity.

In a three-part catalyst series for the Society of Research Administrators International, Meaghan Ventura discussed results from a 2023 survey on motivation and retention factors in research administration with 316 respondents across various types of research institutions (Ventura, 2024a). Within this series, a highly rated motivation for RAs to leave a current employer was a lack of flexible work options such as working from home or remotely (Ventura, 2024b). Conversely, the survey also showed that having these flexible work options is a motivator to stay with an employer (Ventura, 2024b). A similar study on RA motivations to stay with or leave their current employer was conducted by Welch and Brantmeier (2021) at the beginning of the COVID-19 timeline, when uncertainty loomed over whether there would be a new normal for how research administration work is completed. That study included flexible work schedules as a motivating factor, but the option was one of the least selected (Welch & Brantmeier, 2021). From 2021 to 2023, COVID-19 gradually descended from its initial peak into a new normal in which remote work is more common (Main & Haan, 2023). RA motives for staying or leaving the workplace have shifted to align with these new norms, which are here to stay. These studies of RA motivators align with the data collected in this study, showing trends of either staying within the same office under new work arrangements or leaving the office to gain work flexibility.

 

Figure 5. Transitioning to Remote Work

2023 respondents were asked about their work environments prior to COVID-19 and currently in 2023. This figure depicts the transition of those who did not work remotely prior to COVID-19 and currently work remotely in some capacity, and whether those respondents are in the same office from 2019 to 2023.

Remote Work Demographics

The data collected from the survey demonstrated a transition to remote work for RAs in IHEs, which was expected given the onset of COVID-19; it also depicts a potentially long-lasting transition, with the majority of respondents working either fully remote or in a hybrid structure. Respondents were also asked additional questions about the demographics of their IHE and their research administration position. Figure 6 breaks down respondents by work environment, type of research administration office, size of IHE based on research expenditures, whether there is daily interaction with faculty (PIs), and whether those with daily PI interaction collect metrics. The majority of respondents were from larger IHEs and worked fully remote or hybrid. Respondents who work remote or hybrid mostly come from central and department offices, and a large number have daily interaction with faculty. For example, 17 respondents work fully remote in central offices; of those, 11 have daily PI interaction and 5 collect metrics, and of the 11 with daily PI interaction, 3 also collect metrics. As shown in Figure 6, faculty satisfaction metrics are collected at a higher rate by those in fully remote or hybrid positions than by those who work 100% on-site. COVID-19 forced the transition to remote work, but it is encouraging to see lasting changes to RA work environments even for positions requiring consistent, frequent customer service to faculty, suggesting that RAs are providing successful support regardless of where that support is delivered. Overall, remote and hybrid work is prevalent across different types of RA offices and IHE sizes, but may be more common in central and department research administration offices within IHEs with larger annual research expenditures.
These observable increases in remote and hybrid positions are also associated with increased faculty satisfaction metric collection: a 16:1 ratio (remote+hybrid to in-person) overall, and a 10:1 ratio among those with daily PI interactions. As remote and hybrid options continue for RA positions, balancing faculty satisfaction in those positions against in-person positions will remain an important evaluation criterion moving forward.
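The 16:1 and 10:1 ratios quoted above reduce to a simple comparison of counts. A minimal sketch, using hypothetical counts chosen only to be consistent with the stated ratios (the underlying figure data is not reproduced here):

```python
from math import gcd

def collection_ratio(remote_hybrid: int, in_person: int) -> str:
    """Reduce two counts of metric-collecting respondents to a lowest-terms X:Y ratio."""
    d = gcd(remote_hybrid, in_person)
    return f"{remote_hybrid // d}:{in_person // d}"

# Hypothetical counts consistent with the ratios reported in the text.
print(collection_ratio(16, 1))   # all respondents
print(collection_ratio(10, 1))   # respondents with daily PI interaction
```

Reducing by the greatest common divisor means the same function reports, for example, counts of 32 and 2 as the identical 16:1 ratio.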

 

Figure 6. Research Administrator Demographics

RA respondents’ demographics for work environment, type of RA office, size of IHE based on research expenditures, daily PI interaction, and whether metrics were collected.

 

 

Conclusions

Faculty satisfaction metrics were being collected in university RA offices on processes and customer service qualities both before and after COVID-19. Those metrics are influencing change in both categories, but there was not a significant change in overall metric collection after COVID-19. Based on the data collected, an increase in metric collection began within the year before the 2023 survey distribution. When COVID-19 started to spread within the United States, universities made rapid changes to protect workers, faculty, and students based on CDC recommendations. These transformations in how IHEs would operate in a pandemic were made swiftly, with little to no time for feedback from those the changes would impact. The pandemic continued for several years with regulations that required prolonged accommodations, which also allowed universities to think about new norms for RAs. Several respondents documented these new norms through changes in how and where their work is completed, reflected in the increase in hybrid and fully remote employees now that the COVID-19 pandemic has ended.

With the pandemic coming to an end, distancing regulations removed, and university employees acclimating to the newly established work environments, offices may have more capacity to collect faculty satisfaction metrics, as seen in the increase in metric collection during the last year of the pandemic. Amid such a large transition in the research administration enterprise due to COVID-19, there was an increased focus on metrics for customer service qualities over processes. RAs provide customer service expertise that supports faculty research, so understanding how remote work has affected administrators’ ability to communicate efficiently, respond to inquiries, remain available to collaborate, and maintain RA knowledge was seen as valuable information based on the survey results. Although RAs stated that metrics on faculty satisfaction should be collected for both processes and customer service, whether related to remote work or not, a large portion of respondents still do not collect metrics or do not know whether metrics are collected. Staffing availability and the time needed to collect, evaluate, and implement change based on faculty feedback remain roadblocks to using this pathway for informed change in research administration offices. Even after the pandemic ended and the wave of adjustments calmed, time and staffing still limit the ability to collect metrics. Additionally, a portion of respondents still either do not know whether metrics are collected or why they are not. Although the survey for this study was distributed across various boards and universities to reach a wide variety of RAs, it appears that most research administrators who are aware of metric collection are those in supervising positions.
This presents a challenge for collecting data on whether metrics are collected, as the knowledge is not readily available to all RAs. Additionally, 56% of respondents now work in a new office or university, or are new to research administration compared to before COVID-19, which adds complexity: respondents in a new work environment may not know whether metrics were collected before COVID-19 or are being collected now.

The data in this study identified trends across varying types of research administration offices in collecting faculty satisfaction metrics, whether those metrics drive change, and how the 2023 results compare with the 2019 results collected prior to COVID-19. Future work could investigate other impacts of the changes in work environments and RA turnover resulting from COVID-19 and the shutdowns. Further, there was promising data showing an increase in the collection of faculty satisfaction metrics within the last year (2022-2023). Re-collecting data on RA offices’ faculty satisfaction metric practices in 2-3 years may better show whether COVID-19 drove an increase in collection, and reveal the long-lasting effects of remote/hybrid work.

Acknowledgements

This work was completed without the support of grant funding. We would like to thank the JRA Author Fellowship Program for connecting us through a mentee-mentor relationship that led to the collaboration on this project.

Katherine Bui
Stanford University School of Medicine
1520 Page Mill Rd, MC 5705
Palo Alto, CA 94304
phone: (650) 498-7093

Keith R. Berry Jr.
Division of Research and Innovation, University of Arkansas
Fayetteville, Arkansas 72701

 

Authors’ Notes

Correspondence concerning this article should be directed to Katherine Bui, Stanford University (katbui@stanford.edu). Katherine completed this project independent of Stanford University.

 

References

Akinyooye, L., & Nezamis, E. (2021, June). As the COVID-19 pandemic affects the nation, hires and turnover reach record highs in 2020. Monthly Labor Review, 1–24. https://doi.org/10.21916/MLR.2021.11

Alcaine, J. (2020, May 14). Out of the office: Teleworking in Research Administration is here to stay. https://www.srainternational.org/blogs/srai-news/2020/05/14/out-of-the-office-teleworking-in-research-administ

Almanzar, A., Yerousis, G., Hamed, G., & Khlaif, Z. N. (2022). Digital transformation in times of crisis: Challenges, attitudes, opportunities and lessons learned from students’ and faculty members’ perspectives. Frontiers in Education, 7. https://doi.org/10.3389/feduc.2022.1047035

Anderson, J. (2022, July 29). Institutions turn to new workforce strategies to counter effects of great resignation. Health Care Compliance Association (HCCA) - JDSupra. https://www.jdsupra.com/legalnews/institutions-turn-to-new-workforce-8206211/

Aydın, N., Sayır, M. F., Aydeniz, S., & Şimşek, T. (2023). How did COVID-19 change faculty members’ use of technology? SAGE Open, 13(1). https://doi.org/10.1177/21582440221149720

Besch, J. (2014). Using a client survey to support continuous improvement: An Australian case study in managing change. Research Management Review, 20(1). https://files.eric.ed.gov/fulltext/EJ1038829.pdf

Bui, K. D. (2019). Evaluation of faculty satisfaction metrics in university research administration offices: Is the data used? [Master’s thesis, Johns Hopkins University]. http://jhir.library.jhu.edu/handle/1774.2/62064

Butt, A. S. (2021). Consequences of top-down knowledge hiding: A multi-level exploratory study. VINE Journal of Information and Knowledge Management Systems, 51(5), 749–772. http://dx.doi.org/10.1108/VJIKMS-02-2020-0032

Cayuse. (n.d.). Research administration: Then and now. Retrieved March 16, 2024, from https://cayuse.com/research-administration-then-and-now/

CDC. (2023, September 12). The end of the federal COVID-19 Public Health Emergency (PHE) Declaration. https://www.cdc.gov/coronavirus/2019-ncov/your-health/end-of-phe.html

Cole, S. S. (2007). Research administration as a living system. The Journal of Research Administration, 38(2). https://files.eric.ed.gov/fulltext/EJ902221.pdf

Cook, D. A., Garside, S., Levinson, A. J., Dupras, D. M., & Montori, V. M. (2010). What do we mean by web-based learning? A systematic review of the variability of interventions. Medical Education, 44(8), 765–774. https://doi.org/10.1111/J.1365-2923.2010.03723.X

Coyne, C., Ballard, J. D., & Blader, I. J. (2020). Recommendations for future university pandemic responses: What the first COVID-19 shutdown taught us. PLoS Biology, 18(8). https://doi.org/10.1371/journal.pbio.3000889

David J. Sencer CDC Museum. (2023, March 15). CDC Museum COVID-19 Timeline. https://www.cdc.gov/museum/timeline/covid19.html#:~:text=January%2020%2C%202020,respond%20to%20the%20emerging%20outbreak

de Koning, R., Egiz, A., Kotecha, J., Ciuculete, A. C., Ooi, S. Z. Y., Bankole, N. D. A., Erhabor, J., Higginbotham, G., Khan, M., Dalle, D. U., Sichimba, D., Bandyopadhyay, S., & Kanmounye, U. S. (2021b). Survey fatigue during the COVID-19 pandemic: An analysis of neurosurgery survey response rates. Frontiers in Surgery, 8, 690680. https://doi.org/10.3389/fsurg.2021.690680

Elflein, J. (2022, November 17). Cumulative COVID cases in the U.S. from 2020 to 2022. Statista. https://www.statista.com/statistics/1103185/cumulative-coronavirus-covid19-cases-number-us-by-day/

Elmer, S. J., & Durocher, J. J. (2020). Moving student research forward during the COVID-19 pandemic. Advances in Physiology Education, 44, 741–743. https://journals.physiology.org/doi/full/10.1152/advan.00153.2020

eRA. (2024, January 10). How we got here: A history of eRA. https://www.era.nih.gov/about-era/how-we-got-here#

Fairlie, R. (2020). The impact of COVID‐19 on small business owners: Evidence from the first three months after widespread social‐distancing restrictions. Journal of Economics & Management Strategy, 29(4), 727. https://doi.org/10.1111/JEMS.12400

Friedell, D. (2022, April 25). Some American colleges return to COVID-19 restrictions. VOA News. https://learningenglish.voanews.com/a/some-american-colleges-return-to-covid-19-restrictions/6544416.html

Global Workplace Analytics. (n.d.-a). How many people could work-from-home. Retrieved March 2, 2024, from https://globalworkplaceanalytics.com/how-many-people-could-work-from-home

Global Workplace Analytics. (n.d.-b). Remote/hybrid work/in-office trends and forecast. Retrieved March 2, 2024, from https://globalworkplaceanalytics.com/work-at-home-after-covid-19-our-forecast

Jones, M. (2020). Evaluating digital transformation during a worldwide pandemic in research administration [Student paper, School of Business, Thomas Jefferson University]. https://jdc.jefferson.edu/sbsp

Jones, S. (2022, September 22). COVID-birthed remote work becoming part of Pitt employee fabric. University Times. University of Pittsburgh. https://www.utimes.pitt.edu/news/covid-birthed-remote-work

Kaleba, N. (2021, February 16). Staff survey highlights interest in remote work, other concerns. The University Record. https://record.umich.edu/articles/staff-survey-highlights-interest-in-remote-work-other-concerns/

Krieger, N., LeBlanc, M., Waterman, P. D., Reisner, S. L., Testa, C., & Chen, J. T. (2023). Decreasing survey response rates in the time of COVID-19: Implications for analyses of population health and health inequities. American Journal of Public Health, 113(6), 667–670. https://doi.org/10.2105/AJPH.2023.307267

Kriemeyer, M., & Lindemann, U. (2011). Complexity metrics in engineering design. Springer.

Kulakowski, E., & Chronister, L. U. (2006). Research administration and management. Jones & Bartlett Learning.

Madhusoodanan, J. (2020, June 5). Frozen cells and empty cages: Researchers struggle to revive stalled experiments after the lockdown. Nature. https://doi.org/10.1038/D41586-020-01704-Y

Madnick, R., Bailey, E., Kinglsey, L., Rouleau, D., & Smelser, D. (n.d.). RemoteWorkSurvey.

Main, K., & Haan, K. (2023, June 12). Top remote work statistics & trends [in 2024]. Forbes Advisor. https://www.forbes.com/advisor/business/remote-work-statistics/

Marina, S., Davis-Hamilton, Z., & Charmanski, K. E. (2015). Evaluating research administration: Methods and utility. The Journal of Research Administration, 46, 95–114. http://dsp.research.uiowa.edu

Myers, K. R., Tham, W. Y., Yin, Y., Cohodes, N., Thursby, J. G., Thursby, M. C., Schiffer, P., Walsh, J. T., Lakhani, K. R., & Wang, D. (2020). Unequal effects of the COVID-19 pandemic on scientists. Nature Human Behaviour, 4(9), 880–883. Nature Research. https://doi.org/10.1038/s41562-020-0921-y

National Science Foundation. (n.d.). NSF – NCSES academic institution profiles – Rankings by total R&D expenditures. Retrieved March 2, 2024, from https://ncsesdata.nsf.gov/profiles/site?method=rankingBySource&ds=herd

New York University. (n.d.). Responsibilities of the faculty member. Retrieved March 2, 2024, from https://www.nyu.edu/faculty/governance-policies-and-procedures/faculty-handbook/the-faculty/other-faculty-policies/responsibilities-of-the-faculty-member.html

Nurunnabi, M., & Almusharraf, N. (2020). Social distancing and reopening universities after the COVID-19 pandemic: Policy complexity in G20 countries. Journal of Public Health Research, 9(Suppl 1), 50–59. https://doi.org/10.4081/jphr.2020.1957

Ramos, S. (2021). COVID-19’s impact felt by researchers. American Psychological Association. https://www.apa.org/science/leadership/students/covid-19-impact-researchers

Raynaud, M., Goutaudier, V., Louis, K., Al-Awadhi, S., Dubourg, Q., Truchot, A., Brousse, R., Saleh, N., Giarraputo, A., Debiais, C., Demir, Z., Certain, A., Tacafred, F., Cortes-Garcia, E., Yanes, S., Dagobert, J., Naser, S., Robin, B., Bailly, É., … Loupy, A. (2021). Impact of the COVID-19 pandemic on publication dynamics and non-COVID-19 research production. BMC Medical Research Methodology, 21(1). https://doi.org/10.1186/s12874-021-01404-9

Rothbaum, J., & Bee, A. (2022, September 23). How has the pandemic continued to affect survey response? Using administrative data to evaluate nonresponse in the 2022 Current Population Survey Annual Social and Economic Supplement. U.S. Census Bureau. https://www.census.gov/newsroom/blogs/research-matters/2022/09/how-did-the-pandemic-affect-survey-response.html

Sharma, A. (2023). Understanding the impact of the COVID-19 pandemic on research administration in Canada. The Journal of Research Administration, 54. https://files.eric.ed.gov/fulltext/EJ1390792.pdf

Smalley, A. (2021, March 22). Higher education responses to Coronavirus (COVID-19). National Conference of State Legislatures. https://www.ncsl.org/education/higher-education-responses-to-coronavirus-covid-19

Sohrabi, C., Mathew, G., Franchi, T., Kerwan, A., Griffin, M., Del Mundo, J. S. C., Ali, S. A., Agha, M., & Agha, R. (2021). Impact of the coronavirus (COVID-19) pandemic on scientific research and implications for clinical academic training – A review. International Journal of Surgery, 86, 57–63. https://doi.org/10.1016/j.ijsu.2020.12.008

U.S. Bureau of Labor Statistics. (2024, March 1). Household and establishment survey response rates. https://www.bls.gov/osmr/response-rates/

University of Pittsburgh Human Resources. (n.d.). Permanent flex work policy implemented. Retrieved March 2, 2024, from https://www.hr.pitt.edu/news/permanent-flex-work-policy-implemented

van der Linden, R. (2021, April 13). Process improvement and COVID-19 survey results. LinkedIn. https://www.linkedin.com/pulse/process-improvement-covid-19-survey-results-ric-van-der-linden/

van Looy, A. (2021). How the COVID-19 pandemic can stimulate more radical business process improvements: Using the metaphor of a tree. Knowledge and Process Management, 28(2), 107–116. https://doi.org/10.1002/kpm.1659

Velez-Cruz, R., & Holstun, V. (2022). Pandemic impact on higher education faculty self-care, burnout, and compassion satisfaction. Journal of Humanistic Counseling, 61, 124. https://pmc.ncbi.nlm.nih.gov/articles/PMC9348051/pdf/JOHC-61-118.pdf

Ventura, M. (2024a, January 9). Motivation and retention factors in research administration: Part 1: Literature review and focus on topic exploration. https://www.srainternational.org/blogs/srai-news/2024/01/09/motivation-and-retention-factors-in-research-admin

Ventura, M. (2024b, February 7). Motivation and retention factors in research administration: Part 2: Motivation factors. https://www.srainternational.org/blogs/srai-news/2024/02/07/motivation-and-retention-factors-in-research-admin?CommunityKey=8844bda2-af14-40c9-b985-b982f583786a

Welch, L., & Brantmeier, N. K. (2021). Examining employee retention and motivation trends in research administration. Journal of Research Administration, 52(2), 70–86. https://files.eric.ed.gov/fulltext/EJ1325462.pdf

Wooden, P., & Hanson, B. (2022). Effects of the COVID-19 pandemic on authors and reviewers of American Geophysical Union Journals. Earth and Space Science, 9(2). https://doi.org/10.1029/2021EA002050
