
User Research

Qualitative user research was conducted from January through March 2018 to better understand how community members experience IT at Stanford. The findings from this research are intended to inform strategic planning and service improvement projects.

This report contains qualitative findings from user research conducted January–March 2018 by the University IT Service Design team as part of the inaugural Campus IT Plan program.

Research Goals

The findings from this research are intended to inform future strategic planning and service improvement projects by technical and business owners of IT systems across Stanford. The objective of this qualitative research was to understand how various audiences experience IT at Stanford. Specific goals were to:

  • Learn how key technology services, tools, and systems are experienced

  • Learn what’s working well and what can be improved

  • Understand how each audience obtains IT support and what their experience is when they receive it

  • Identify gaps in IT provided by Stanford

  • Identify future IT needs

What is qualitative vs. quantitative research?

Qualitative research is primarily exploratory and is used to gain an understanding of underlying reasons, opinions, and motivations. It helps us develop an understanding of a problem space, build empathy with users, and collect evidence that exposes potential patterns. Where quantitative research uses larger datasets to generalize results to a broader population, qualitative research goes deeper with a smaller sample size to uncover meaningful human stories that can serve as provocations for new designs, service improvements, and innovations.

Quantitative data can often answer “What? When? Where? How many?” while qualitative data focuses mainly on “Why? How?” For this reason, qualitative findings should be verified with quantitative data if the purpose is to extrapolate broader trends in a population, and quantitative data benefits from qualitative data for a deeper understanding of how the data points connect and why.

How to interpret this report

When reading the findings in this report, please keep in mind that we are reporting what we heard from the participants in the study, and that the sample sizes and the breakdown of audiences who participated (listed below) are important context for interpreting the findings. Through these reports, we are communicating the rich human narratives that can help build empathy and understanding with our community.

This report captures what we believe to be useful insights, not necessarily statistically significant findings. We hope these findings are useful for understanding the people in our community, and that they capture unique perspectives that deepen our understanding of the various problem spaces that IT services address.

Report structure

The Executive Research Report (below) summarizes key findings from the research, and detailed findings by topic area follow it. The full participant breakdown and research protocol are documented in the Appendix, available as a separate document on this page.

How you might use this report

These insights are not an end in themselves; they are meant to be provocations that lead to further discovery, ideation, and design work.

Some of the findings in this report may be specific and tactical feedback related to a particular service. Others might be more general or strategic.

When reading the findings, consider the following:

  • What are the implications of these findings to my service, area, or organization?

  • Where could we benefit from additional, deeper research (either qualitative or quantitative) to better understand context or validate these findings with a larger population?

  • How might we use these findings to inspire creative thinking and innovation in our team or organization?

How to provide feedback, ask questions, or request further discovery

You can send feedback or follow-up questions directly to the Service Design team at servicedesignteam@lists.stanford.edu. If you are interested in having a member of the team present findings to your group or lead your team through an ideation workshop related to relevant findings, please reach out to us. We would be happy to partner with your group to conduct further secondary research, facilitate ideation workshops, or collaborate on design improvement projects coming out of this work.

Sample interview questions

Below is a sample of the questions used when interviewing research participants. All participants were prompted with a similar set of questions; the questions below were used for graduate students and postdocs.

  • What comes to mind when you think about Information Technology at Stanford?
  • What core technologies, systems, or apps do you use most regularly here at Stanford to support you as a grad student/postdoc?
  • Thinking about your typical quarter, what are your biggest pain points with technology and with how technology supports you as a grad student/postdoc? What works really well?
  • Where do you go when you want or need to figure out how to achieve something with technology at Stanford?
  • How do you get support for technology issues, and what are your pain points about getting IT support? (tell me about a recent experience)
  • Are there any gaps in what Stanford offers as central or departmental technology services/support that you feel you have to seek from outside the university? If so, what are those gaps?
  • What are three things that technology could do better to support you as a grad student/postdoc?
  • What is a technology (here at Stanford or outside of Stanford) that you are really excited about, and would be excited to have more access to?
  • What technology do you think will be important to your studies/work in the next 3-5 years?

To learn more about the research protocols, please refer to the Appendix.

 

Read the executive summary of findings, including general insights that cross all categories, summaries of topic area findings, and key audience insights (undergraduate, graduate, staff, and faculty).

View report (PDF, 36 pages)

The following reports highlight detailed findings in ten topic areas that reflect significant feedback gathered from participants during the user research. Review the summary of contents below to discover which reports may be of interest to your group(s).

Axess

Length: 14 pages

Contents:

  • Student feedback on Axess, including: usability, reliability, accessing transcripts, viewing grades, course and degree planning
  • Faculty feedback on Axess
  • Staff feedback on Axess

View report

Business Operations and Applications

Length: 41 pages

Contents:

  • Core business applications used by participants
  • What matters to end users of enterprise systems
    • Reliability
    • Usability
    • Flexible browser and device support
    • Robust system integrations
  • How systems enable efficiency
  • Desires and gaps:
    • Awareness and onboarding
    • Communication and user input
    • Desire for increased vendor management support
    • Hardware and software to support business operations
    • More robust data analytics and reporting tools
  • User experience of key systems

View report

Communication and Collaboration

Length: 44 pages

Contents:

  • Summary of tools used (supported by Stanford, and not)
  • How different audiences communicate and collaborate
  • Key themes heard from participants:
    • Increased adoption of chat
    • Challenges with email and calendar
    • People love Zoom
    • The importance of real-time collaboration
    • Too many tools
    • Low awareness of tools and functionality
    • Usability of tools matters

View report

Conducting and Managing Research

Length: 55 pages

Contents:

  • Tools used to conduct and manage research
  • IT considerations for research labs:
    • Communication and collaboration
    • Device registration and compliance
    • Local IT support
  • Enabling efficient research management:
    • System usability and integrations
    • Surfacing opportunities
    • Real-time collaboration
    • Capturing research data
    • Documentation of conflicts of interest
    • Reference management
    • Formatting and templates
    • Profiles and websites
  • Research computing
  • Data storage and management
  • Research financials
  • Awareness and onboarding of technologies to support research

View report

Information Security and Privacy

Length: 17 pages

Contents:

  • Two-step authentication
  • Minimum security compliance
  • Backing up and accessing data
  • Managing multiple credentials
  • Data breaches and vulnerabilities

View report

IT Support

Length: 42 pages

Contents:

  • How different audiences get IT support
  • Positive experiences with IT support
  • What leads to poor client experiences
  • Preference for local, personalized support
  • Gaps in IT support
  • Awareness and onboarding to support options

View report

Network and Wifi

Length: 13 pages

Contents:

  • First-time setup and connection challenges
  • Wifi connectivity and performance issues
  • Wired network feedback

View report

Printing

Length: 7 pages

Contents:

  • Undergraduate experience of printing
    • Cost of printing
    • Difficulty connecting to printers
    • Printing from residential clusters
    • Reliability of printers
  • Graduate student experience of printing
  • Faculty experience of printing

View report

Room AV and Classroom Technology

Length: 9 pages

Contents:

  • Conference room AV
    • Technology enables connection
    • Inconsistent technology causes challenges
    • Effective use requires some training
    • Unclear pathways to support
  • Classroom technology
    • Quality varies widely across campus
    • AV makes an impression on recruiting and peers
    • Technology can be unreliable
    • Desire for hands-on support
    • Onboarding and instructional documentation
    • Unclear pathways to improvement

View report

Teaching, Learning, and Student Success

Length: 41 pages

Contents:

  • Teaching
    • Learning management systems
    • Course discovery and enrollment
    • Teaching team communication and collaboration
    • Alternative teaching methods and tools
    • Technology onboarding for TAs and faculty
  • Learning
    • Tools used to complete coursework
    • Student collaboration and communication
    • Online video lectures
    • Software licenses
    • Printing course materials and assignments
    • Residential and library computer clusters
  • Student success
    • Course and degree planning
    • Accessing grades and transcripts
    • Job searching and career planning
    • Technology onboarding for students
    • Usability of systems

View report

Download four "persona" posters for undergraduate students, graduate students (and TAs), researchers (faculty, grad, and postdoc), and staff. The posters are 11"x17" and are meant to provide quick, scannable views of audience highlights based on role/affiliation.

View persona posters (PDF, 4 pages)

View a full participant breakdown by organization and role, and all research protocols (including interview approach, outreach tactics, interview scripts, and handouts).

View Appendix (PDF, 31 pages)

Questions?

If you have questions, concerns, or feedback regarding the findings presented in the reports above, please contact the Service Design team at servicedesignteam@lists.stanford.edu.