Plenty of techniques can help you to teach more inclusively, but so can data. Knowing who is in your classroom and how various student groups have performed academically can spark powerful changes in your teaching. Imagine if you had data at your fingertips to answer questions like these:
Does the percentage of Black students in my courses reflect their overall proportion at my institution?
Are students from lower socioeconomic backgrounds earning grades similar to those of their wealthier counterparts?
Did redesigning my course narrow the performance gap between first-generation college students and other students?
It seems like it should be easy to get such questions answered by the appropriate campus office. Unfortunately, too many institutions sleep on the potential of basic demographic information to transform teaching. Instructors often find that requesting and retrieving this data is a challenging and lengthy process.
At the University of North Carolina at Chapel Hill, we have developed a web-based analytics tool to make accessing such data easier and automatic for instructors who teach undergraduates. Once logged in, a faculty member can interact with a dashboard that displays information via bar graphs and tables. We call the tool My Course Analytics Dashboard.
Since building it, we have had many people across academe ask us for advice on how to design a similar version for their own campus.
Before we dive into the logistics, an important caveat: The analytics in our tool are descriptive — not predictive. Say you are a faculty member at Chapel Hill: When you open the dashboard, it displays demographic and academic-performance information about student groups in your previous courses. The tool doesn’t provide data about your current students or tell you why different student groups performed differently in your past courses. But the data does show the demographic mix and student grades — grouped by race, gender, first-generation status, and the like — and points out potential problems. For example, does the data show any significant discrepancies between how well transfer students have done in your courses compared with nontransfer students? Whether and how you use the information to adjust your teaching practices and narrow such gaps is entirely up to you. (Readers can find more information about how the tool is used in teaching in a new Chronicle report, “Diving Into Data to Improve Teaching.”)
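To make the “descriptive, not predictive” distinction concrete, here is a minimal sketch of the kind of summary such a tool surfaces. It is not the actual My Course Analytics Dashboard code, and the column names are hypothetical; it simply tallies how each student group performed in past courses, with no modeling or prediction involved.

```python
# A hypothetical, simplified extract of past-course records for one instructor.
# Column names are illustrative, not the dashboard's actual schema.
import pandas as pd

records = pd.DataFrame({
    "semester": ["2021F", "2021F", "2022S", "2022S", "2022S"],
    "transfer_status": ["Transfer", "Non-transfer", "Transfer",
                        "Non-transfer", "Non-transfer"],
    "grade_points": [2.3, 3.4, 2.7, 3.1, 3.6],  # course grade on a 4.0 scale
    "dfw": [1, 0, 0, 0, 0],                     # 1 = D, F, or withdrawal
})

# Descriptive only: summarize how each group actually performed.
summary = (
    records.groupby("transfer_status")
           .agg(students=("grade_points", "size"),
                mean_grade=("grade_points", "mean"),
                dfw_rate=("dfw", "mean"))
           .round(2)
)
print(summary)
```

A dashboard like ours renders this kind of table as bar graphs and tables, one per demographic category, with campuswide figures available for comparison.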
What follows is a blueprint for how to create this kind of descriptive data tool on your own campus:
Step No. 1: Identify champions of the idea. You need advocates who can communicate the value of a descriptive-analytics tool to administrators and faculty members. Privacy laws mean student information is tightly guarded by campus data stewards — and rightly so. To build an analytics tool, you will need to educate and build trust with those stewards.
Are there faculty members on your campus who have used demographic data to improve their teaching? They can demonstrate its value in promoting student success, something everyone can get behind. In making the case for this tool at Chapel Hill, one of us (Kelly) demonstrated how descriptive data and inclusive teaching methods allowed her to remove or narrow the performance gaps between certain demographic groups in her courses.
In the absence of such examples on your own campus, use stories from other institutions. Positive change often happens when administrators learn what other campuses are doing and want to make sure they don’t fall behind.
Step No. 2: Decide what to include in your dashboard. That depends first on what’s available. Think of this step like planning a dinner party, knowing that you may need to slim down the menu. At our university, we relied on people in institutional research and assessment to provide a list of the potential data our tool could pull in, from a technical standpoint. They also explained why certain data was coded the way it was, based on its source. For example, the available campus data on gender may be binary — even though we know some of our students don’t identify in only one of two categories — because of the way it’s collected.
We also had campus lawyers weigh in on our early ideas to identify the potential legal consequences of making this data accessible to faculty members.
Once we had a sense of the range of data available, we had to narrow down what to include on the dashboard. We began with eight demographic variables (things like race, first-generation college status, Pell eligibility) and later added one more (age) after requests from our professional-school colleagues. We recommend starting small, knowing that you can always add more data categories later.
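If it helps to picture it, the “start small” advice can be as simple as a short, extendable list of categories. The sketch below is hypothetical, not our tool’s actual configuration; it just shows categories being added over time rather than all at once.

```python
# Hypothetical dashboard configuration: begin with a handful of demographic
# categories and extend the list later. Not the tool's actual schema.
DASHBOARD_CATEGORIES = {
    "race_ethnicity": "Race/ethnicity, as coded in the campus data source",
    "gender": "Gender (binary, because of how it is collected)",
    "first_generation": "First-generation college status",
    "pell_eligible": "Pell Grant eligibility",
    "transfer_status": "Transfer vs. non-transfer",
}

# Added later, after requests from professional-school colleagues.
DASHBOARD_CATEGORIES["age_group"] = "Age group"
```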
Step No. 3: Make design decisions with the stakeholders. During the planning process, we worked as a team to answer a lot of questions, including:
Will faculty members be able to see their data during a current semester or only after it ends? We spent a lot of time discussing the pros and cons of this decision. We worried that some sensitive information might become identifiable to an instructor. For example, if there is only one student who is both Latino and of low socioeconomic status, then the instructor might easily identify the student and have access to sensitive financial information. In the end, we chose to show only data from completed semesters: We didn’t want data to bias interactions between instructor and student, or affect grading. Instead, we wanted the data from a professor’s previous semesters to serve as an aid for reflection.
How should the dashboard function? For example, we wanted users to be able to access combined data from across courses and semesters. We also wanted campuswide demographic data easily available within the dashboard for comparison with the instructor’s results. We recommend thinking about the user experience early in development.
Can administrators see the data? Can instructors see their colleagues’ dashboards? We decided No on both questions because we wanted the focus to remain on individual professional development.
How far back will the data go? We went as far back as possible. For our university, that meant instructors could see data from 2010 onward.
Can graduate-student instructors access their data if they are listed as the instructor of record? We decided Yes.
What about very small courses? We decided not to report data on classes with fewer than 10 students. (A sketch of how a minimum-size threshold and cross-semester aggregation might work follows this list.)
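For readers who want to see these decisions in code form, here is a minimal sketch, assuming the same hypothetical column layout as the earlier example plus a numeric year column. It combines past semesters from a cutoff year onward and withholds any group smaller than a minimum size. Our published threshold applies to whole classes; the same idea can be applied to demographic subgroups so that individual students cannot be identified. The actual dashboard’s rules and data model may differ.

```python
# A hypothetical helper, not the dashboard's actual implementation.
import pandas as pd

MIN_GROUP_SIZE = 10   # we chose not to report on fewer than 10 students
FIRST_YEAR = 2010     # how far back the data goes at our institution

def group_summary(records: pd.DataFrame, category: str) -> pd.DataFrame:
    """Aggregate past-course grades by one demographic category,
    combining semesters and withholding groups too small to report."""
    past = records[records["year"] >= FIRST_YEAR]
    summary = (
        past.groupby(category)
            .agg(students=("grade_points", "size"),
                 mean_grade=("grade_points", "mean"))
            .round(2)
    )
    # Withhold small cells so individual students cannot be identified.
    return summary[summary["students"] >= MIN_GROUP_SIZE]

# Example use: group_summary(records, "first_generation")
```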
Step No. 4: Don’t forget a maintenance plan. Which campus office(s) or program(s) will house the dashboard, train people to use it, and update the tool? In our case, the mix of people and offices that developed the tool continue to work together now that it is built:
Our institutional-data office manages the information and maintains the dashboard.
The faculty-development center and its website serve as the central hub for helping faculty members learn about the dashboard, including how to access it and interpret results.
Our undergraduate-education office specializes in data-driven approaches to teaching and advocates the dashboard’s value to administrators, chairs, and individual instructors.
This sort of tool requires staffing and incurs costs to create and maintain, so identifying the office(s) that will provide continuing support is crucial.
Step No. 5: Build a support system for users. Don’t assume people will understand what they read on the dashboard, or how to manage their reaction to potentially dismaying or confusing information. Some helpful questions to ask:
Will users be required to complete training to gain access to their course data? What will the training emphasize? We decided that, along with sharing our rationale for the tool, we wanted every user to complete a short online training on implicit bias and stereotypes before they could log in to the dashboard.
What kind of technical documentation will users need to get started and understand the data? You could provide some combination of screenshots, videos, and online training.
What resources are necessary to help users with next steps? You might need to gather resources or create workshops that provide ideas to instructors who want to make changes in their teaching after seeing their data.
Step No. 6: Corral some users to do a test run. No matter how much thought you put into its design, your tool will have a few bugs. Test users can provide invaluable feedback to identify problems before the official release. We assembled test users from a variety of departments for focus groups (and lunch!) to discuss how they navigated the dashboard, which features they liked, and which they would like to see added, as well as their thoughts on the nuts and bolts of the tool’s appearance and navigability.
In designing data tools, you run the risk of presenting information that is either too fine-grained (it overwhelms users) or too broad (it’s generic and unhelpful). Users at your institution can help you find the sweet spot.
Step No. 7: Lastly, just because you build it, don’t assume they will come. It takes time for a new tool to gain traction. And faculty interest in using student data as a teaching-improvement tool will vary from individual to individual.
Create a multi-pronged and coordinated strategy to announce the tool. Promote it in campus publications, on the learning-management site, on faculty email lists, or all of the above. We found it compelling to collate stories about how faculty members can and have used the data. Be strategic in integrating your dashboard into campus activities. For example:
Present examples of what instructors can do with the data. This could be a 10-minute session at a departmental meeting or a longer presentation at a faculty retreat.
Make demonstrating the tool a routine part of new faculty orientation and teaching institutes.
Partner with units — such as the campus diversity and inclusion office — that share common goals with your team.
Organize interdisciplinary faculty groups that meet regularly for a year to share ideas and experiences with the dashboard and discuss the scholarship of teaching and learning.
For faculty members who are up for tenure or promotion, provide models of how to use the data in their dashboard to demonstrate teaching effectiveness.
Use the tool as leverage in grant applications and show how it can be used to support grant outcomes related to inclusive teaching and other pedagogical goals.
Don’t underestimate the time and commitment it takes to create this type of collaborative project. You very likely will find that creating a vision and support for your dashboard on campus will require more time than it takes to actually build the thing. We hope our suggestions can help you fast-track the process.
In our faculty-development work and writing, we often espouse the importance of holding up a mirror to our teaching. It could be in the form of peer feedback, student feedback, or self-reflection with demographic data. What we see reflected may not always represent where we want to be as instructors, but we cannot begin to fix something if we do not know whether it is broken. This dashboard gives us one more tool to enhance our teaching and make learning more inclusive for all students.
Viji Sathy is associate dean of evaluation and assessment in the office of undergraduate education and a professor of psychology and neuroscience at the University of North Carolina at Chapel Hill. She is co-author of a new book, Inclusive Teaching: Strategies for Promoting Equity in the College Classroom, published in August 2022 by West Virginia University Press. Her Twitter handle is @vijisathy.
Kelly A. Hogan is associate dean of instructional innovation and a STEM teaching professor in biology at the University of North Carolina at Chapel Hill. She is co-author of a new book, Inclusive Teaching: Strategies for Promoting Equity in the College Classroom, published in August 2022 by West Virginia University Press. She is on Twitter @DrMrsKellyHogan.
Bob Henshaw is an instructional consultant in the Center for Faculty Excellence at the University of North Carolina at Chapel Hill, and an ed-tech liaison.