
A rubric for exploring new technologies


When was the last time you explored the ever-expanding world of technology tools for teaching and learning? Perhaps a suggestion from a student, a conversation with a colleague, or an unsolicited email from a vendor sparked interest in a new and promising technology tool.

Academic Technology Support Services (ATSS) receives a steady stream of new learning tool inquiries, such as:
    “I’m looking for a software that allows students to collectively create a digital concept map - a program that would enable students to collectively work (ideally simultaneously) on assembling a concept map.”
  • “I would like to create interactive exercises for learning (drag-and-drop, animations, matching exercises, interactive timelines, etc.).”
  • “I’m looking for a way that students can engage with course readings and each other directly by making comments on the reading and responding to each other. I would also like to interact with students as well.”
While each of these inquiries begins with valid teaching and learning goals, we must also consider that each tool we adopt becomes part of a larger technology ecosystem at the University. It is in our interest to prioritize technologies that benefit our community, weighing students' and instructors' user experience, expert support, and system interoperability.

Balancing individual needs with community needs is a challenge; we’d like to share a simple process you can use to determine whether a new tool is a good fit.

Support Learning Goals

While Canvas provides a standard set of learning tools, there may be times when you have an instructional need that is not met by Canvas options. Prompted to look beyond Canvas, or even beyond UMN-supported tools, you may find something that perfectly meets your needs. Before deciding to use any technology tool, first ensure that you have a specific instructional need that directly supports your learning goal (see Evaluating a New Technology Tool).

But don’t let your review stop there. Assuming you have determined an authentic instructional need, let’s walk through a few additional considerations and suggest a high-level workflow for you to use in making your own informed decisions about technology tool use.

Put Students First

Logistics

In usability testing at the University of Minnesota, students have indicated that it’s easy to become overwhelmed by the variety of tools in use, which requires them to navigate multiple sites and platforms across their courses. As one student put it:
“I wish that my professors used similar platforms [and tools] because while I get that they all have something different that works for them, I'm having to check something like six different websites every day to try and compile my stuff together.”
The logistics of using any technology tool may present usability issues for your students (or you) in tool availability and support, ease of use, cost, and more.

Accessibility

To create an inclusive learning environment for students and instructors, it’s important that course materials, activities, and tools be as accessible as possible. How can you determine if a non-university-supported tool is accessible? 
  • Start by reviewing the Voluntary Product Accessibility Template (VPAT), which is a vendor self-report.
  • Perform a cursory review of the tool: is it intuitive and easy to use?
  • Check whether the vendor is committed to accessibility, e.g., is accessibility part of their product roadmap?

Data Practices

Any technology tool must handle data privately and securely. When using University-supported tools, you can trust that they have been evaluated and that student data is protected. When using tools that are not supported by the University, you must make this determination yourself. Any time an account login is required, information is passing back and forth. What does it mean to be aware of the logistics of safe data practices?
  • If the tool is not integrated with the LMS (Canvas), students will need to establish their own accounts, generate and protect passwords, and ensure that they're following safe practices when sharing their collaborative efforts with working group teammates and the class at large. In the case of individual lapses or larger scale data breaches, you'll want to ensure that the vendor commits to reporting those issues to you and your students in a timely manner (typically, within 48-72 hours) and that they have appropriate plans for storing your data for future use and/or responsibly destroying it when the class -- or your license -- ends.
  • For any tools that have not been evaluated by the Office of Information Technology, University Information Security, or the University FERPA-Compliance Office, please exercise caution when sharing any private information -- including names and email addresses of students -- with the vendor. In the case of a data breach, you'll need to contact the FERPA-compliance office so that students can be notified.

Adopt a Workflow

While the prospect of making informed decisions may seem overwhelming, use what you’ve learned here to create a process for evaluating any new technology tool. In time, it will become second nature to look at all new tools with a thoughtfully critical eye.
  • Start with a high-level rubric to evaluate basic use considerations. A simple graphic rubric can help prompt your initial tool review: list the three criteria, Logistics, Accessibility, and Data Practices, across the top with example questions beneath each, then rate the tool per criterion with a red X (no considerations met), a gray minus (some considerations met), or a green check (all considerations met).
    While this is a good starting point, be sure to review Evaluating a New Technology Tool for a comprehensive list of considerations when reviewing technology tools at the University of Minnesota.
  • Know when it’s best to make adjustments to your learning materials or activities, rather than take on another technology tool.
  • Consult with AT staff about your goals and options. Contact your local academic technologist or ATSS to set up a consultation.
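If it helps to make the rubric concrete, its three-level rating (no, some, or all considerations met) can be sketched as a tiny script. The criterion names come from the rubric above; the function, its inputs, and the scoring rule are hypothetical illustrations, not a University tool:

```python
# Hypothetical sketch of the high-level rubric: rate a tool on each
# criterion based on the fraction of its example questions that are met.

def rate_tool(answers):
    """answers maps each criterion to the fraction of questions met (0.0-1.0).
    Returns 'X' (none met), '-' (some met), or 'OK' (all met) per criterion."""
    symbols = {}
    for criterion, fraction_met in answers.items():
        if fraction_met == 0:
            symbols[criterion] = "X"   # red X: no considerations met
        elif fraction_met < 1:
            symbols[criterion] = "-"   # gray minus: some considerations met
        else:
            symbols[criterion] = "OK"  # green check: all considerations met
    return symbols

# Example: strong logistics, partial accessibility, unreviewed data practices.
print(rate_tool({"Logistics": 1.0, "Accessibility": 0.5, "Data Practices": 0.0}))
```

A red X or gray minus in any column is a prompt to consult with AT staff before adopting the tool, not necessarily a final verdict.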
Want to see the rubric in action? Join us for our next Digital Sparks session, where we will demo four technology tools and apply our high-level rubric to each of them. Register for Digital Sparks.

For more information, see Evaluating a New Technology Tool.