Emerging Technology (Enhancing, Engaging, Connecting)
- By David W. Dodd
- June 1st, 2018
Today’s institutions are typically characterized by widespread use of learning management systems, such as the highly regarded Canvas. Our physical classrooms feature projectors, smartboards, sound systems, and similar tools from vendors such as Epson, Panasonic, NEC, and SMART. Online courses are supported by Zoom, Kaltura, Cisco, and other digital media and conferencing technologies. The teaching and learning environments we’ve constructed are robust and feature-rich. But how do we know they are working? How do we know if students are actually learning more effectively as a result of these investments? Work has begun in this area of assessment. But there is far more to do.
As the late W. Edwards Deming is often credited with saying, “In God we trust; all others bring data.” Various studies have been conducted over the years aimed at assessing the impact of technology on teaching and learning. Many were relatively informal and not based on sound research methodology. Further, such research involves humans, who are far more complicated and challenging than, say, determining the specific gravity of a mineral. Quantitative measures alone are inadequate. Similarly, simply asking students how they “feel” about technology tells us little. Previous studies have been all over the proverbial board in this regard.
One approach that began about a decade ago and that is rapidly growing in popularity is called learning analytics. As defined by the EDUCAUSE Learning Initiative (ELI), learning analytics is the use of data, analysis, and predictive modeling to improve teaching and learning. The term is thrown around so much and so loosely that its inherent value is often lost, and that is most unfortunate. Learning analytics in actual usage is both sound and substantive. This field takes into account the complex nature of humans, as well as the fact that individual learners are inherently different. Learning analytics also stipulates that learners act within a context that must be considered. Fundamentally, this equates to the difference between rote, recall, and grades on the one hand, and learning, retention, and application on the other.
The Goal is Improved Learning
Another critical aspect of learning analytics is that the fundamental goal is improving learning. Rather than grading, which is transactional, the goal of learning analytics is to be transformational. The purpose is understanding the complex process of human learning and finding ways to improve it. In the end, this can include accounting for individual student needs and strengths, learning environments, practices and behaviors, tools and technologies, and many other factors.
The information on which learning analytics is based comes from technologies that include learning management systems and next-generation student information systems. And in a bit of irony, research drawing on this information is used to further strengthen not only these technologies, but also pedagogical practices, campus environments, “nudge” factors such as coaching effective student practices and behaviors, and other important elements.
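To make the idea concrete, consider a minimal sketch of how LMS data might feed a “nudge.” Everything here is a hypothetical illustration: the field names, the weights, and the thresholds are assumptions for the sake of example, not any vendor’s actual model.

```python
# Hypothetical learning-analytics "nudge" rule: combine two simple LMS
# engagement signals into a risk score so an advisor can reach out early.
# All field names, weights, and thresholds below are illustrative only.

def risk_score(logins_per_week, assignments_submitted, assignments_due):
    """Blend submission rate and login frequency into a 0-1 risk score."""
    submission_rate = assignments_submitted / assignments_due if assignments_due else 1.0
    login_factor = min(logins_per_week / 5.0, 1.0)  # treat 5+ logins/week as full engagement
    engagement = 0.6 * submission_rate + 0.4 * login_factor
    return round(1.0 - engagement, 2)

def students_to_nudge(records, threshold=0.5):
    """Return the IDs of students whose risk score meets the threshold."""
    return [r["id"] for r in records
            if risk_score(r["logins"], r["submitted"], r["due"]) >= threshold]

roster = [
    {"id": "s1", "logins": 6, "submitted": 4, "due": 4},  # engaged student
    {"id": "s2", "logins": 1, "submitted": 1, "due": 4},  # falling behind
]
print(students_to_nudge(roster))  # → ['s2']
```

A real system would, of course, rest on richer data and validated predictive models rather than hand-picked weights; the point of the sketch is simply that the same transactional records the LMS already collects can be turned into timely, student-centered intervention.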
When practiced effectively, learning analytics is a strategic rather than an operational endeavor. It is not merely quantitative optimization, but rather thoughtful and innovative. It does not assume the efficacy of legacy practices, but rather challenges them. It is not institutionally or teacher-centered, but is truly student-centered. Indeed, it is based on finding ways to most effectively fulfill the central mission of our institutions—student learning.
Look Beyond Legacy Practices
In order to build campus environments that can benefit from learning analytics, institutions must first look beyond legacy practices that are transaction-based. Doing so means incorporating more considerations into the selection of systems, technologies, and professional development programs for faculty. It means selecting systems not merely for successfully fulfilling transactions, but more importantly, for the information they provide to educators. Excellent systems are both available and emerging that incorporate these principles. They include such standouts as the Instructure Canvas LMS, the emerging Workday Student system, and systems being developed internally at visionary universities.
To be clear, learning analytics is not based on choosing the right technologies. It is based on creating a campus culture that places a high value on constantly rethinking and improving student learning. As a result, learning analytics is a holistic endeavor.
With innovative approaches, there are inevitably corresponding challenges. With learning analytics, these include respecting student privacy, valuing individual learner differences, and fostering conversations about student learning, among others. But students are at the core of our institutional missions. Constantly improving student learning is our most important challenge.
This article originally appeared in the June 2018 issue of College Planning & Management.
David W. Dodd is vice president of Information Technology and CIO at the Stevens Institute of Technology in Hoboken, NJ. He can be reached at 201/216-5491 or email@example.com.