University System of Georgia Case Study
Course Registration Systems & Reports
What is the systems strategy in a nutshell?
The University System of Georgia (USG) organized systematic data collection at all 26 USG institutions on courses with a study abroad component by having campuses enter an attribute on the course section in the student information system.
Institutions enter a primary study abroad code along with supplementary codes for the length of the overseas component and for administrative aspects of the experience (whether the course was faculty led, offered at an institution's branch campus, embedded in a course that is only partly study abroad, or part of the USG's study abroad consortium). Additional supplementary attributes capture other types of credit-bearing overseas experiences, including service learning, internships, and research abroad.
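As an illustration only, a coding scheme like the one described above might pair one primary attribute with supplementary attributes for length, administration, and experience type. Every code and description below is hypothetical; the actual USG/Banner attribute codes are not published in this case study.

```python
# Hypothetical attribute codes -- NOT the actual USG/Banner codes.
PRIMARY = "SA"  # primary study abroad attribute

# Supplementary attributes (all values are illustrative placeholders)
LENGTH = {"SA1": "short-term", "SA2": "mid-length", "SA3": "long-term"}
ADMIN = {"FLD": "faculty-led", "BRC": "branch campus",
         "EMB": "embedded", "CON": "consortium"}
OTHER = {"SVL": "service learning abroad", "INT": "internship abroad",
         "RES": "research abroad"}

def describe(codes):
    """Return human-readable descriptions for a section's attribute codes."""
    lookup = {**LENGTH, **ADMIN, **OTHER}
    return [lookup.get(c, "primary study abroad" if c == PRIMARY else "unknown")
            for c in codes]
```

For example, `describe(["SA", "FLD"])` returns `["primary study abroad", "faculty-led"]`, mirroring how a single course section can carry the primary code plus several supplementary codes at once.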
The attribute facilitates after-the-fact analysis of student and course data about overseas courses. The system office collects these course attributes in its Academic Data Collection twice per semester and connects them to student records to carry out analysis and track the USG's overall progress in growing study abroad, one type of experiential learning in the USG strategic plan.
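A minimal sketch of that after-the-fact analysis, under stated assumptions: the record structures, field names, and the `"SA"` flag below are illustrative inventions, not the actual Academic Data Collection schema. The idea is simply to join course-section attributes to student enrollments and identify the distinct students in flagged sections.

```python
# Hypothetical records -- field names and codes are illustrative only.
sections = [
    {"crn": "10001", "attributes": {"SA", "FLD"}},  # flagged study abroad section
    {"crn": "10002", "attributes": set()},          # regular section
]
enrollments = [
    {"student_id": "S1", "crn": "10001"},
    {"student_id": "S2", "crn": "10001"},
    {"student_id": "S2", "crn": "10002"},
]

def study_abroad_students(sections, enrollments, flag="SA"):
    """Return distinct students enrolled in sections carrying the flag."""
    flagged = {s["crn"] for s in sections if flag in s["attributes"]}
    return {e["student_id"] for e in enrollments if e["crn"] in flagged}

print(sorted(study_abroad_students(sections, enrollments)))  # ['S1', 'S2']
```

Once the flagged students are identified, they can be joined to other student-level data (class standing, major, outcomes) to build the broader participation picture described below.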
How was this strategy created, and who are the key players?
This initiative began as a collaboration between the USG Office of International Education and international education personnel on system campuses. The desire to collect this data came from the campuses themselves, which is critical to its success: it is not just a mandate from the system office. The USG Office of International Education worked with other system offices to develop the standard coding system for the attributes and have them included in the Academic Data Collection.
The USG Academic Services (AS) unit of its Information Technology Services division did the technical work of extracting the new study abroad attributes in the Academic Data Collection, warehousing them at USG, and documenting them in the USG academic data element dictionary. Further meetings and communications between key partners continued as needed to plan, create documentation, and roll out the data collection to campuses.
Academic Services also produced a document guiding campus personnel on how to enter these course section attributes in Banner. The work to include the attributes in the system office data collection was a one-time effort. On campuses, however, the attributes must be entered on each course section, which takes time.
So, what’s the impact of this strategy?
The addition of these attributes to course sections since spring 2019 allows both campuses and the system office to track and analyze the course sections with overseas components. Combined with additional course-specific data, this information allows institutions and the system to gain a broader picture of who is participating in learning overseas, at what point in their academic career, and what the outcomes are for students.
Any helpful context about implementing this strategy on your campus?
The January 2020 USG strategic plan identifies student enrollments in experiential learning (including study abroad) as a metric under the goal of Community Impact. Monitoring this data and reporting progress toward the strategic goal is an incentive for institutions to enter the course attributes. In addition, the University System of Georgia uses the same student information system across all 26 institutions, which facilitates standardized data entry. The USG already had a robust Academic Data Collection that pulls data on students and courses directly from Banner.
What tips do you have for institutions looking to implement a similar strategy?
Take the time to define the different types of overseas learning and clearly articulate what you count as study abroad. USG formulated its definition of study abroad from the Forum on Education Abroad. It is also essential to use widely recognized time frames for counting experiences; we modeled our program length measurements on Open Doors. However, we found that some schools or departments used homegrown program length measurements that made it hard to benchmark against nationally accepted models such as the one used in Open Doors.