By Brian Lamb
As Jim Groom notes in his paper, there is really no fun in bashing the Learning Management System (LMS) anymore.
That particular buzz was definitively killed two years ago, when the EDUCAUSE Learning Initiative (ELI) published its white paper on Next Generation Digital Learning Environments (NGDLE). The ELI is not widely regarded as a hotbed of radical, anti-ed tech sentiment, yet when it consulted “with more than 70 community thought leaders” it came to a sobering assessment of what is by far the most commonly used platform for online learning:
What is clear is that the LMS has been highly successful in enabling the administration of learning but less so in enabling learning itself. Tools such as the grade book and mechanisms for distributing materials such as the syllabus are invaluable for the management of a course, but these resources contribute, at best, only indirectly to learning success. Initial LMS designs have been both course- and instructor-centric, which is consonant with the way higher education viewed teaching and learning through the 1990s.

Describing the emerging needs as “interoperability; personalization; analytics, advising, and learning assessment; collaboration; and accessibility and universal design”, the white paper promotes “a ‘Lego’ approach to realizing the NGDLE, where NGDLE-conforming components are built that allow individuals and institutions the opportunity to construct learning environments tailored to their requirements and goals.”
The white paper has been widely read and oft-cited in the field, in large part defining what a serious, forward-thinking learning technology strategy should address. Its significance is reasserted by the July–August 2017 issue of EDUCAUSE Review, which is largely dedicated to reflections, updates and elaborations on the NGDLE vision. These pieces for the most part double down on the existing claims of importance. In one article Stephen Laster of McGraw Hill Education argues that “the full implementation of the NGDLE framework not only will allow the edtech industry to cement the powerful role that technology can play in solving our efficiency and effectiveness issues but also will enable us to achieve an immensely positive impact on education at large.”
The technological vision articulated by Laster is no less bold: “a seamless, open ecosystem that prioritizes flexibility over structure and in which institutions have the freedom to construct learning environments that are central to their mission.”
It’s foolish to argue against calls for enhanced interoperability amongst learning technologies that promote seamless open ecosystems and free us to do what we need to do. That said, interoperability is hardly a new concept. Such calls sound all too familiar to those of us who worked to realize the decades-old vision of “learning objects”, which promised to develop standards to enhance “interoperability” of learning materials that would lead to adaptive, personalized learning experiences. These efforts cited LEGO blocks as their metaphor as well.
While it is a lovely thing to dream of a generational leap in the capabilities of our learning environments, of optimized bespoke toolsets connected seamlessly by APIs, it’s difficult to envision the transformation described in these manifestos being implemented by the average institution, where most ed tech units have been hollowed out by more than a decade of outsourcing and austerity. Yes, there are elite institutions that can still afford to employ teams of developers, pay the vendor licenses and hire the consultants and contractors to make NGDLE-type learning environments happen. But those options seem remote, bordering on incomprehensible to those of us struggling to keep software and materials up to date, to build relationships with our communities, to provide the support our students and instructors need to get through the next semester.
As the Director of a “Learning Technology and Innovation” team, I feel obliged to align our efforts with best practices in the field, and to position us to take advantage of future developments. But so far I have read little in NGDLE discourse that gives me a sense of what concrete steps I can be taking to position us for the next wave. ELI Director Malcolm Brown, a co-author of the 2015 white paper, hails the “Zen-like emptiness” of the NGDLE, and its non-prescriptive nature: NGDLE “makes no recommendation on vendor vs. local applications or on commercial vs. open.”
It’s not easy to articulate a vision for “what comes next”, and given the amount of half-baked speculation I’ve spewed out over the years I am in no position to criticize. But the promise of NGDLE remains fuzzy and inchoate to most of us, a dream of algorithmic secret sauce that will rescue us in the near future if we trust in the industry to provide.
In the same issue of EDUCAUSE Review, Chris Gilliard offers a welcome counterpoint to these happy technodreams, and also identifies a more pervasive danger in blind faith in what the digital future may hold if we simply accept things as they are.
I call the web “broken” because its primary architecture is based on what Harvard Business School Professor Shoshana Zuboff calls “surveillance capitalism,” a “form of information capitalism [that] aims to predict and modify human behavior as a means to produce revenue and market control.” Web2.0—the web of platforms, personalization, clickbait, and filter bubbles—is the only web most students know. That web exists by extracting individuals’ data through persistent surveillance, data mining, tracking, and browser fingerprinting and then seeking new and “innovative” ways to monetize that data. As platforms and advertisers seek to perfect these strategies, colleges and universities rush to mimic those strategies in order to improve retention.
…a web based on surveillance, personalization, and monetization works perfectly well for particular constituencies, but it doesn’t work quite as well for persons of color, lower-income students, and people who have been walled off from information or opportunities because of the ways they are categorized according to opaque algorithms.
Gilliard makes the case that seemingly value-neutral or even liberatory technologies embed values that in fact reinforce social injustice. Tressie McMillan Cottom calls it “the iron cage in binary code. Not only is our social life rationalized in ways even Weber could not have imagined but it is also coded into systems in ways difficult to resist, legislate or exert political power.”
Critics such as Gilliard, Cottom, and Audrey Watters articulate a wider sense that the web has not only failed to achieve the breathless utopian ideal of a space in which traditional power relationships would be challenged, but is increasingly a force for power to exert itself in ways that were until recently unimaginable. Higher education seems resigned to accepting the fundamental logic of surveillance capitalism as it stands, without asserting values or working to address its ill effects.
But if the implementation of NGDLE seems beyond the scope of action for most educational technologists, how can we begin to address such deeply rooted and disturbing realities while clinging to the promise of digital and networked tools to enhance learning?
At the #OER17 conference in London, Kate Green, Markus Deimann and Christian Friedrich co-facilitated the Workshop “Towards Openness: Safety in Open Online Learning?” The core activity of this workshop was to invite participants to construct “interventions”, such as a “prototype, class, tool, process or even a recommendation” that would confront the darker realities that we face.
In other words, the facilitators challenged us to think of some action to uncover, to challenge or to address the dark complexities of open online learning today. Something that we ourselves might be able to accomplish. That strikes me as a very fruitful question to ask. And the follow-up question: what are some examples of contemporary practice in learning technology that constitute such interventions? Before wrapping up, I’d like to point at some work that makes me feel more hopeful.
In the course of her work exploring “critical digital pedagogy in troubled political times”, Amy Collier has presented the challenge to educators to construct and protect “digital sanctuary” in their practice. Collier states that a “digital sanctuary initiative questions the role our technological systems play in students’ safety and looks for ways to minimize risks to students associated with those technological encounters.” She offers a number of practical guidelines for the handling of student data, such as to “closely examine and rethink student tracking protocols”, and asks “could we build an inventory of all of the digital tools that collect data and then surface that information to students as part of curriculum?”
Collier also suggests “using data literacy as a teaching opportunity”, which brings to mind a significant set of recent interventions from Michael Caulfield. In response to the wave of attention paid to “fake news” in the past year, Caulfield has been regularly posting responses to the present unreality. Some of these are abstract (his frequent criticisms of “the stream” as an inadequate frame for constructing knowledge), others are practical and easy to do (such as exercises promoting the use of reverse image search to determine the provenance of online memes). He has collected them into an open textbook Web Literacy for Student Fact Checkers, and if we were looking for immediate, concrete steps to move the needle it seems like these techniques and principles could be applied across all disciplines. As Caulfield argues, “the web is both the largest propaganda machine ever created and the most amazing fact-checking tool ever invented. But if we haven’t taught our students those capabilities is it any surprise that propaganda is winning?”
We need to do so much more, but I could point to a couple of interventions at my own Thompson Rivers University. During a research fellowship at TRU, Alan Levine built a number of tools that were dedicated to the notion of making it easier to share materials on the open web while also not requiring any personal data from the contributors (such as user accounts). These tools can be used for simple writing spaces, or collecting images or sounds. The concept is captured by the as-yet undefined yet undeniably brilliant acronym of SPLOT (which stands for something like Simplest Possible Learning Online Tool… or something). Whether or not one wishes to adopt the specific SPLOT tools, one wonders what might result if more of us worked to build tools that were tightly focused on simple experiences, and that resisted the trend to build learning on the accretions of personal data.
TRU Professor of Law Katie Sykes has used SPLOTs in some of her courses, but her most impactful learning technology activity was built on top of a vendor platform. In her course “Designing Legal Expert Systems: Apps for Access to Justice”, Sykes had her students build real, functioning apps for non-profit organizations, such as RISE Women’s Legal Centre, Animal Justice Canada and the BC SPCA. Students in this course get valuable and demonstrable experience and skills training, and far from going through the motions of “disposable assignments” instead engage in needed work with, as Sykes notes, a mission “to facilitate broader access to justice for all.”
I can’t resist quoting Mike Caulfield’s take on Katie Sykes’s work, from a recent keynote that he gave:
Students walk away from this class having actually made a tangible difference in the world. This is a class that quite literally — literally! — saves puppies. Puppies! And the benefits go beyond that. Students are working with non-profit partners on this stuff and making connections that will help them get into that 60% of students that find a job as a lawyer. They are learning what law looks like in the real world, and how lawyers might collaborate with others. They are learning how to break down tricky law problems, while their colleagues are studying textbooks. And I can’t say this enough. They are literally saving puppies. How can you argue with that?
In my two decades working in this field, some things seem destined to repeat themselves over and over. Some things feel like they have not changed at all. But surveying the current landscape, it’s hard not to feel we are at a very dangerous point. It is all too tempting to give in to despair. I’m grateful to Green, Deimann and Friedrich for planting the idea of interventions for more ethical open online learning. I’m looking for more to do.