Six years in at Canvas: Why open source still matters

It’s a cool coincidence that a commenter on the e-Literate blog raised a question about whether we should call the Canvas LMS open source or not; six years ago this week I wrote my first blog post as an employee of Instructure on the importance of openness to our company culture and identity. (Full disclosure: I still work for Instructure, though this blog post is my own perspective six years in.)

Canvas is a learning management system used by K-20 institutions around the world. Though Canvas was built for the cloud, it has been open source under the AGPLv3 since 2011.

Early on, it was obvious that being open source provided some natural inoculation against being acquired by one of our competitors. It also presented customers with an escape hatch in case Instructure were to fail them. Thankfully, those concerns are less relevant now that Instructure has made it to IPO and Canvas has a proven track record in the market.

Instructure’s commercial success may actually heighten questions about whether and why Canvas is open source, however. Let’s start by acknowledging that the classification of “open source” vs “commercial open source” vs “open core” is a sometimes hotly debated topic. I’m not overly sensitive about these labels; Instructure will continue to release new Canvas code to GitHub every few weeks no matter what folks want to call it :)

That said, I personally call Canvas “commercial open source” rather than “open core” for two main reasons:

1. Open core usually means a large part of the code base has not been open sourced. The only parts of Canvas that we don’t open source are those related to our commercial hosting, like Vector and Hot Tub. Canvas is a cloud-native, single-version, multi-tenant application, and if we didn’t have proprietary capabilities for automated scalability, burst capacity, failover across AWS regions, etc., our commercial hosting service wouldn’t have its great reputation for reliability. But even modules like Canvas Analytics and our native mobile apps are open source. We just built a new accessibility checker add-on for the content editor that we’re preparing to open source.

2. Some projects labeled “open core” withhold so much code that it’s not realistic to run the software yourself. That’s not the case here. It is totally feasible to run a Canvas implementation from the Canvas open source code, and there are legit examples of this happening. While we don’t require anyone to register or tell us about it when they run Canvas themselves, we do know about those who post to our community forums or join us at our user conferences.

Of course, we’re super pleased with how many institutions do choose Canvas cloud, and we’re proud of the services and support we give them. And that brings me to what I think is the more interesting conversation, why we make Canvas open source at all:

1. Canvas open source is one way to test whether or not educational institutions think Instructure’s support, reliability, collaboration, etc is worth paying for. Hey, it’s working :)

2. Canvas open source is one way to give transparency or assurance around system security and code integrity. When your source code is open, anyone can test it for vulnerabilities — and talk about it. When your source code is closed, people not only can’t check to see if the code is vulnerable, they can’t verify that you fixed vulnerabilities. But because not everyone knows how to do a thorough security check themselves, Instructure runs an annual open security audit on Canvas where third-party, independent experts are paid to find vulnerabilities, and then report on it for everyone to see.

3. Canvas open source gives schools, institutions, and even other companies that don’t have in-region AWS, or simply don’t have the money to pay for hosting, the chance to use (we think) the best academic LMS out there.

Just this month I heard a story that once again reminded me why open source is the right thing to do: A small start-up called Nucleos is experimenting with a “portable cloud” service to deliver teacher training on Canvas open source in places where even connectivity can’t be taken for granted. I don’t want to steal their thunder or misrepresent their story, but I do hear they’ll have a case study out soon.

Lumen nails student-centeredness in new personalization engine

This is so right

David Wiley’s description of Lumen Learning’s OER-based personalization engine reframes the learning analytics conversation by reminding us what we really want learners to become. The R&Ed team that I work with spent some time earlier this year aiming for the heart of the term “student-centered learning”, and we concluded that student-centered learning practices tend to share a common (if sometimes unspoken) goal: to develop learners’ capability for self-directed, lifelong learning by granting them more control of and responsibility for the learning process.

Nowadays, “student-centered” is often conflated with “personalization”, which is often conflated with “adaptive learning” technology. David deftly identifies a key problem with most technology-driven approaches to personalization:

There is no active role for the learner in this “personalized” experience. These systems reduce all the richness and complexity of deciding what a learner should be doing to – sometimes literally – a “Next” button. As these systems painstakingly work to learn how each student learns, the individual students lose out on the opportunity to learn this for themselves. Continued use of a system like this seems likely to create dependency in learners, as they stop stretching their metacognitive muscles and defer all decisions about what, when, and how long to study to The Machine.

The Lumen Learning approach appears to be quite different — and aligned with a truly student-centered approach:

First, Lumen is acknowledging that though developing learners’ understanding of the material is critical, there is a higher, more profound goal of learner autonomy.

Second, Lumen is using the power of learning analytics to improve learning habits through self-reflection, as a step on the path toward autonomy.

So much work in predictive analytics and adaptive learning seeks to relieve people of the time-consuming work of individual diagnosis and remediation. That’s a two-edged sword: using technology to increase efficiency can too easily sacrifice humanness if you’re not deliberate in the design and usage of the technology. This topic came up quickly amongst the #DigPedNetwork group when Jim Groom and I chatted about closed/open learning environments earlier this month, suggesting that we haven’t fully explored this dilemma as educators or educational technologists.

So it’s refreshing to me when technology providers like Lumen Learning recognize (to paraphrase Charles Graham) that there are things computers will always be better at, and things humans will always be better at. And one of the things humans are pretty great at is enriching the learning experience and producing affective outcomes through personal interaction. This is an ideal that attracted me to the Instructure Canvas team in the beginning, and is part of what motivates me to do the work I do, day-in and day-out.

Thanks for the introduction to this new project, David; I can’t wait to see the new Lumen Learning in action.

Equal opportunity data monitoring in education

I stumbled on this article in the Guardian on the value of learning analytics, which indicates a degree of irony in our often unquestioning pursuit of educational improvement:

The University and College Union is wary about using quantitative data sources as a performance management tool. “By their very nature, such sources of data do not take into account a range of other contextual factors which are of critical importance when making judgments about…

…students? engagement? learning? Nope:

… individual staff members’ work,” says its president, Simon Renton.

The article is titled, “Are universities collecting too much information on staff and students?” The answer seems to be “yes” for the former; “no” for the latter. Data mining to improve student learning through top-down measures is almost uniformly accepted; but once you start talking about data mining to improve faculty teaching, well, that’s clearly a matter you wouldn’t want to divorce from the critically important contextual factors that surround the practice.

What Claims Should We Expect from Educational Technology Vendors?

There’s been some strong journalistic probing going on over at e-Literate the past couple weeks, aimed at an education technology vendor’s overstated marketing messages. The messages have repeatedly claimed or implied that their technology has a direct, positive impact on student outcomes. Seeing as I work for an ed tech company, and am in the position of continually talking with our sales, marketing, and product teams on what our technology might or mightn’t do for education, Phil and Michael’s posts definitely caught my attention.

One of the most important recurring threads in education technology research is this: We should default to skepticism that technology is a causal factor in improved learning. The foundation for this skepticism was set well over two decades ago by Dick Clark, who disputed any unique power in different instructional media, and was substantiated when Thomas Russell found “no significant difference” in research on mode of delivery.

Instead, the best starting point to understand the value of a given education technology is to look toward efficiency (especially cognitive efficiency) and cost-savings. These values may not seem paramount for all educators or learners (though I would argue cognitive efficiency should be), and so we also look for technology with affordances that best fit our desired learning outcomes and instructional philosophies. Again, that places the emphasis on instructional methods as the catalyst for change in student outcomes.

This doesn’t mean all educational technology is the same, or even “just as good”. Technology is built by different designers, with different beliefs about what people are good for, and what computers are good for. Some technology emphasizes automation of the learning process; some technology emphasizes personal interaction between learners. Some technology is easier to use. Some technology is more reliable. Or more adaptable. Etc.

I’ll stop here before this turns into me pitching our technology.

But I will say that in my current role I am often asked by prospective clients, “Can you prove that your product improves learning outcomes?”

The answer isn’t simple, as much as we both may want it to be.

So, I see these questions (and the discussion at e-Literate) as a chance to not only reach back to re-discover what historical research has shown about the impact of technology, but also to think about our own expectations for technology — and our expectations for the people who use it. It’s a chance to figure out how technology can support or amplify the roles that teachers, administrators, and students each have in creating educational change.

The credit, then, goes not to the technology, but to the people who employ it.

Peter Blair reading Wilfred Owen’s Futility

WWI postcard by Edith Cavell, artist signed T. Corbella.


When I was in college I spent much of my free time playing with ways that poetry was beginning to intersect with the new digital world. Back in 1998 I recorded my good friend and thespian Peter Blair reading Wilfred Owen’s heart-rending poem, Futility. Here’s that reading:

by Wilfred Owen

Move him into the sun —
Gently its touch awoke him once,
At home, whispering of fields unsown.
Always it woke him, even in France,
Until this morning and this snow.
If anything might rouse him now
The kind old sun will know.

Think how it wakes the seeds —
Woke, once, the clays of a cold star.
Are limbs so dear-achieved, are sides
Full-nerved, — still warm, — too hard to stir?
Was it for this the clay grew tall?
— O what made fatuous sunbeams toil
To break earth's sleep at all?

Born on March 18, 1893, Wilfred Owen became one of the outstanding poets of World War I. He joined the British Artists Rifles O.T.C. in 1915 and was commissioned into the Manchester Regiment in 1916. Invalided home in 1917, he returned to the Western Front in 1918, where he was killed in battle on November 4, just a week before the Armistice.

Donald Clark on why we shouldn’t worry about teacher re-use of OER

Donald Clark’s recent post listing significant obstacles or setbacks to the OER movement is pretty brilliant. One key point he makes that I hadn’t thought of is an “obsession with reusability”. He writes, “the obsession with the reuse of content by teachers, rather than straight use by learners, has led to an inward-looking attitude. Teaching is a means to an end and the most valuable OER resources are those used directly by learners.”

Alan Levine notes the rarity of actual instances of reuse/remix. Part of the problem goes beyond awareness, past discoverability, and right down to the actual conspicuousness of reusable learning activities for everyday teachers. (I’ve talked about this as a key value of Canvas Commons.)

But I think Clark’s point is especially brilliant, and leads back to his implied observation that the most successful OER projects were designed not to foster teacher re-use, but to directly engage learners (Wikipedia, Khan Academy, etc.). This suggests that if OER efforts focus on direct student use, teacher re-use will follow.

“Leaky sensory gating” may support — and exacerbate — creativity

What is creativity? Often it’s marked by divergent thinking and the ability to make novel connections between different ideas or concepts. Think high-productivity conceptual blending.

No wonder, then, that new research from Northwestern University suggests that a failure to “filter out ‘irrelevant’ sensory information” is related to creativity. This study adds to previous research suggesting that distractibility and creativity are somehow intertwined.

The Northwestern researchers write, “‘Leaky’ sensory gating … may help people integrate ideas that are outside of the focus of attention, leading to creativity in the real world…” They theorize that this is because “creative people with ‘leaky’ sensory gating may have a propensity to deploy attention over a wider focus or a larger range of stimuli.”

It’s a two-edged sword, though. Creative people often lament their own distractibility and the nuisance caused by the activity around them. My own son, who spends most of his leisure time designing games in Scratch or Unity, is continually harassed by the noises that surround him.

2U’s 2015 Impact Report defends online education through content marketing

2U has released a 2015 “impact report” that I think is worth mentioning for its substance, structure, and style.

2U’s message: You or someone you know thinks online ed is lame, but actually it’s not. Thanks to 2U.


I’ll admit it makes me sad that, despite continuing evidence of no significant difference, we still have to justify/defend online learning in the year 2015. But they’re right. We do. For example, the 2014 Babson Survey report Grade Level states, “academic leaders rating online learning outcomes as ‘Inferior’ or ‘Somewhat Inferior’ remained steady [in 2014] at 25.9%.”

One of the things you’ll notice as you (double-)scroll to navigate through the report is that it’s both elegant and painless to read. Which brings me to the structure and the style of this piece:

For those of you interested in sales or marketing tactics, 2U’s report is a good example of commercial teaching — the report leads from the problem (persistent negative perception of online education) to their unique solution (strongly partnered online degree programs).

My colleague Sean Morris and I are both deeply invested in figuring out how to make important education research both appealing and impactful. We talked in depth about the language used in this 2U piece, and how even though this is labeled an “impact report”, it certainly feels much lighter weight. We had to ask ourselves, is this in actuality a research report or is it mere marketing narrative? It’s both. In my previous career I would have been a prime target for this particular piece, yet I didn’t feel like the report was BS’ing me or hiding truth behind fluffy language. Perhaps that’s because I know enough of the underlying tension (e.g. the perception vs reality challenge that online learning faces) that I was able to accept the claims without too much scrutiny of sources.

If you work in education, what examples have you seen of simple but substantial (and, dare I say, influential) content marketing that worked for you?

Cathy Davidson on designing student-centered learning

I was very glad to read Cathy Davidson’s description of a student-centered approach to course design that itself ends up articulating the core, underlying goal of student-centered learning: Empower every learner to be autonomous, self-directed, and successful.

It’s because of this goal that student-centered learning is often conflated with “active learning”; achieving this goal in a real, transferable way requires active learning.

Coincidentally, the research team I work on has been focusing on this distinctive idea over the past two months as we prepare for InstructureCon 2015. We’ve scoured the academic literature on “student-centered learning” and its various incarnations, and are coming to a sense of how this important idea is connected with associated theories, applications, and practices. It’s becoming clear that, if we as educators believe in the goals and principles of student-centered learning, we must deliberately provide scaffolding for our students; we can’t assume the mindsets, skills, and habits of self-directedness will emerge ex nihilo.