Universities, by definition, are made up of several professional schools. While many would say that more professional schools signal a more prestigious university, more schools can also mean more complications at the IT level.
Just like any business, colleges and universities use data to improve performance – but they can face challenges unique to their field. Depending on how its technology is set up and the layout of its buildings, a university might have to spread a technology implementation across several schools – and, in many cases, across several locations. Here are three keys to keep in mind if your university is undergoing a large technology implementation.
An overall goal
It is very important to have an overall goal – something that will unite the diverse colleges or departments within the organization. This speaks to the vision of leadership, setting a tone from the top that guides decisions made at lower levels. For Texas A&M, a recent transformation of its electronic content management centered on saving money – using technology to reduce overhead expenses that could then be budgeted for teaching, research, and outreach.
That’s the kind of goal anyone in a university community can get behind. The university capitalized on the positive momentum the goal generated by forming a steering committee to communicate the goal to the wider community and gather feedback along the way.
Leadership’s decision can also guide the selection of a tool – a university-wide recommendation at the start of the process can save time and money down the road. Often, universities face a situation where each individual college finds its own technology solution, only sometimes addressing the common goal leadership has set. That siloed technology infrastructure presents a problem when a university is trying to organize its data in one place.
If it’s too late to get the entire organization on the same page with one technology, it is important to find a solution that can work with the many different databases already in place, integrating the data to produce the results the university needs. Data governance becomes important here, to make sure that the same numbers – which might have been labeled differently in separate systems – are interpreted correctly once the data is integrated.
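As a minimal sketch of that governance step, the snippet below renames fields from two siloed systems to a single canonical schema before the records are combined. All system names, field names, and figures here are hypothetical, purely for illustration:

```python
# Two source systems export the same kind of record under different field names.
engineering_records = [
    {"doc_id": "E-101", "pages_scanned": 42},
    {"doc_id": "E-102", "pages_scanned": 17},
]
business_records = [
    {"document": "B-201", "page_count": 30},
]

# Governance mapping agreed on up front: source field -> canonical field.
FIELD_MAP = {
    "engineering": {"doc_id": "document_id", "pages_scanned": "page_count"},
    "business": {"document": "document_id", "page_count": "page_count"},
}

def normalize(system: str, record: dict) -> dict:
    """Rename one record's fields to the canonical schema."""
    mapping = FIELD_MAP[system]
    return {mapping[field]: value for field, value in record.items()}

# Once normalized, records from every school can be reported on together.
combined = (
    [normalize("engineering", r) for r in engineering_records]
    + [normalize("business", r) for r in business_records]
)

total_pages = sum(r["page_count"] for r in combined)
print(total_pages)  # 89
```

The point is that the mapping is defined once, centrally, rather than each school reconciling numbers by hand after the fact.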
The process doesn’t end with implementation
As with any major technology implementation, the initial installation might actually be the easiest part. Some of the hardest work comes next – validating data, or convincing users that the data, which might look different from what they are used to seeing, is right. A lot of effort can go into getting users up to speed with the technology, especially if it is different from what they’ve been using. It helps to have expert users who can train others around campus, so as not to further tax an IT department that might already be stretched thin by the back-end work that needs to be done.
For example, at Texas A&M, the project manager helped promote buy-in for the changes throughout the university by working with communications and marketing staff from both the university and the content management vendor to build an identifiable brand for the new service. The university developed a reputation for transparency and open communications with stakeholders throughout the process, helped in part by the fact that stakeholders identified common terms and helpful articles to provide information about the service.
In other words, there was a lot of high-tech work that needed to be done…but a little low-tech communication went a long way in helping make that work successful.