
Tool & Content Malleability

January 28th, 2009 · jonmott

I’ve recently finished an article with Mike Bush (a colleague here at BYU) in which we coin what we believe to be a new term in the standards debate: “content and tool malleability.” Our piece is modestly titled “The Transformation of Learning with Technology: Learner-Centricity, Content and Tool Malleability, and Network Effects” and will appear in the March–April edition of Educational Technology Magazine. A couple of months after it’s published, I’ll be able to post the article in its entirety here. For now, I want to provide a preview of our notion of malleability. We suggest that malleability has three key attributes: openness, modularity, and interoperability. Teaching and learning tools and content must become more malleable in all three respects if they are to become authentically reusable, remixable, redistributable, and repurposable.

We cite IBM’s ultimately successful implementation of the principles of modularity and interoperability, which enabled the PC maker to call on outside vendors for parts for its machines, creating an essentially malleable computing platform:

Their rejection of proprietary technology in favor of openness created the opportunity for IBM to call on Microsoft to develop the operating system and for a host of other companies (including Microsoft!) to go on to create thousands upon thousands of software applications, guaranteeing the long-term success of IBM’s initial design. Furthermore, competing companies that chose a proprietary and closed approach for their hardware, software, or both (e.g., Texas Instruments, Amiga, Atari, Commodore, and Radio Shack) are nowhere to be found among twenty-first-century personal computers. Even Apple, with the initial version of its innovative Macintosh, came close to disaster until it opened things up with the Macintosh II (Bush, 1996). In the end, the nature of IBM’s approach not only ensured success in its initial venture, but the continued application of the same principles over the years by IBM’s successors also makes it possible for today’s machines to run much of the same software that was created for the original IBM PC.

Among the principles of openness, modularity, and interoperability that brought success to the IBM PC venture, modularity seems preeminent and has been documented in detail by scholars at the Harvard Business School (Baldwin & Clark, 2000). In their initial work, they analyzed how modularity evolved as a set of design principles between 1944 and 1960. Then, using Holland’s theory of complex adaptive systems as a theoretical foundation, they explain how those design principles went on to radically transform the information technology industry from the 1960s through the end of the century. They show how modular design and design processes fostered change as the industry moved from one consisting of a few dozen companies and dominated by IBM to one involving over a thousand companies, in which IBM plays a significantly smaller role. For example, the “packaged software” sector of the information technology industry consisted of about seven firms in 1970, valued at just over $1 billion (in constant 2002 dollars). Thirty-two years later that sector had grown to 408 companies with a market capitalization of $490 billion (Baldwin & Clark, 2006).
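As a quick sanity check on those figures, here is a back-of-the-envelope sketch using only the two endpoint values cited above (the intermediate trajectory was of course not smooth); the implied compound annual growth rate of the sector works out to roughly 21% per year:

```python
# Implied compound annual growth of the packaged-software sector,
# using the figures cited from Baldwin & Clark (2006):
# ~$1B across 7 firms in 1970 -> $490B across 408 firms in 2002.
start_value = 1.0    # billions, constant 2002 dollars
end_value = 490.0    # billions, constant 2002 dollars
years = 32

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 21% per year
```

Sustained over three decades, that rate of compounding is what turned a billion-dollar niche into a half-trillion-dollar industry.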

Unfortunately, the application of the principles that made such developments possible in the computer industry is rare to nonexistent in many areas of education today. The educational technology landscape is best characterized by monolithic, enterprise technology silos with rigid, often impenetrable walls. Course management systems (CMSs), for example, are generally “all-or-nothing” propositions for institutions, teachers, and students. That is, even if you use an open source CMS like Moodle, you are (without significant customization) bound to use Moodle’s content publishing tool, Moodle’s quiz tool, Moodle’s gradebook, and so on. Moreover, the CMS paradigm itself, tied as it is to semester calendars and time-bounded learning experiences (courses), severely limits learning continuity and persistence. Teachers and students are not free to choose the right, best, or preferred tool for each teaching or learning activity they undertake, a technology paradigm that artificially limits possibilities and forecloses optimal teaching and learning choices.

The monolithic and rigid nature of today’s learning tools and content mirrors the way content has traditionally been made available to faculty and students: books and other resources (including online courses) have generally been all-or-nothing, take-them-or-leave-them propositions. A similar business model was prevalent in pre-Internet days, resulting in CD-ROM databases that were more expensive than many potential consumers could afford. One analysis compared this marketing approach to a public water distribution system that sells each household the whole reservoir rather than placing a meter at each home.

New approaches to content distribution, however, particularly the OpenCourseWare (OCW) and Open Educational Resources (OER) movements, promise to make a vast array of content open to instructors and students to reuse, revise, remix, and redistribute. The OCW Consortium, beginning with MIT in 2002, has now grown to include hundreds of institutions around the world that have chosen to place course materials online. The efforts of these institutions have spawned a related effort, dubbed Open Educational Resources (OER), to make learning materials and content (as opposed to complete courses) freely available as well (Breck, 2007). Around the world, millions of people, inside and outside of academia, are publishing content under Creative Commons licenses, making that content open for others to use in a variety of ways. We are rapidly approaching the tipping point at which a critical mass of participants in open content and open learning is sufficient to dramatically increase the value of each additional participant in the network (as described in the next section).
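For readers unfamiliar with the arithmetic behind that tipping-point claim, here is a minimal sketch. It assumes a Metcalfe’s-law-style model, in which a network’s potential value is proportional to the number of possible pairwise connections among its participants; this model is our illustrative assumption, not a claim from the article:

```python
def network_value(n):
    """Possible pairwise connections among n participants
    (a Metcalfe's-law-style proxy for network value)."""
    return n * (n - 1) // 2

# The marginal value of the next participant grows with network size:
# joining a network of n participants creates n new connections.
for n in (10, 100, 1000):
    marginal = network_value(n + 1) - network_value(n)
    print(f"n={n}: value={network_value(n)}, next joiner adds {marginal}")
```

Because each new participant adds as many potential connections as there are existing participants, the value of joining keeps rising as the network grows, which is the dynamic that makes a critical mass self-reinforcing.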

The stunning reality of the new standard of openness is that it is quite simple. The key is to create lots and lots of open content and to provide open, easy access to it. While technical standards and specifications, such as the Shareable Content Object Reference Model (SCORM), are important when it comes to producing, indexing, discovering, sequencing, packaging, and tracking content, openness by itself is a paradigm-shifting approach in the teaching and learning world.

The fact that content is openly available and usable is just as important as any particular technical feature of that content. While openness stands on its own as a radical innovation, we need to avoid the temptation to downplay the importance of standards and specifications, for they are essential to realizing the vision of open, modular, and interoperable learning environments.

This reality is not without historical precedent. Printing became affordable and available in large part due to what we today call standards. Indeed, as one scholar declared, “This then—the standardization and rapid multiplication of texts—was what the Fifteenth Century invention of printing made possible” (Bühler, 1952). Bühler also pointed out that printing’s contributions went beyond the replication issue, stating that modern scholarship only became possible with the production of identical copies of texts. Although the value of mass duplication is not to be discounted, the fact that scholars could reference each other’s work represented enormous value. Given this standardization, they were thus able to criticize, comment upon, connect to, and build upon what had come before. In many ways, printing standards facilitated the first widespread appearance of mashups in human history. The existence of identical copies was but one characteristic that facilitated the eventual widespread availability of books. In addition, several other factors contributed to the production process itself, eventually increasing the opportunity for wider distribution.

…

Although SCORM is not perfect, it at least began to address the issue of establishing a framework within which learning content can be made to interoperate in a variety of settings. Just as SIF (the Schools Interoperability Framework) opens up the opportunity to reuse information created and used by the various operational elements of schools, SCORM still holds the promise of facilitating the sharing of learning content, not only across learning management systems but also across the tools that facilitate the design and development of learning content. In addition, common authentication schemes (e.g., OpenID) built upon Web services interoperability will ultimately allow learners to seamlessly navigate multiple Web-based teaching and learning applications, opening up possibilities for personal learning environments in which multiple sources of content and experiences work together to help students learn in ways tailored to each individual.

With developments like SCORM 2.0 on the horizon, as well as increasingly powerful software, hardware, and networking tools, technological barriers are falling. The challenge now is to harness these new enabling technologies to create more open, modular, and interoperable learning content as well as production and learning tools that are each malleable with respect to their individual functionality. Together, these technologies will help further the transformation of education from a teaching-oriented enterprise to a learning-centered one.

As noted, the entire piece will be available soon. We hope this notion of malleability helps move the conversation forward. What are your thoughts? Are we on the right track?

References:

Baldwin, C. Y., & Clark, K. B. (2000). Design rules, Volume 1: The power of modularity. Cambridge, MA: MIT Press.

Baldwin, C. Y., & Clark, K. B. (2006). Modularity in the design of complex engineering systems. In D. Braha, A. A. Minai, & Y. Bar-Yam (Eds.), Complex engineered systems: Science meets technology (pp. 175–205). New York: Springer.

Breck, J. (2007, Nov./Dec.). Introduction to special issue on opening educational resources. Educational Technology, 47(6), 3–5.

Bühler, C. F. (1952). Fifteenth century books and the twentieth century. New York: The Grolier Club.

Bush, M. D. (1996, Nov.). Fear & loathing in cyberspace: Of heroes and villains in the information age. Multimedia Monitor; http://arclite.byu.edu/digital/heroesa5.html.

The Wikinomics of Education

August 8th, 2008 · jonmott

I started reading Wikinomics this week. In the book, the authors observe that “deep changes in technology, demographics, business, the economy and the world” have ushered in a “new age where people participate” like never before (2008, p. 10). Moreover, they contend that we have already reached a “tipping point where new forms of mass collaboration are changing how goods and services are invented, produced, marketed, and distributed on a global basis.” In The Wisdom of Crowds, Surowiecki explains that large groups of people can be “smart” when they are diverse, individuals in the group are independent from each other, and thought processes are decentralized (2004, p. 42). Another view of so-called “crowdsourcing” suggests that humanity is now capable of “using the kind of collective intelligence once reserved for ants and bees—but now with human IQ driving the mix” (Libert, 2007, p. 1). The result? A “quantum increase in the world’s ability to conceive, create, compute, and connect. We are only beginning to comprehend the consequences.”

The troubling thing to me about all of this is how little mention there is of education in these books. For example, Tapscott and Williams specifically mention education only four times in their 340-page volume on “Wikinomics” (see Index, p. 343). The references themselves are also enlightening. The first is a mention of the MIT OpenCourseWare initiative (pp. 22–23). The second references TakingITGlobal’s efforts to reform education by providing a “set of tools and curricular activities that will get students collaborating with other students in other countries” (p. 51). The third refers to the California Department of Education’s Open Source Textbook Project (p. 69). The fourth is merely an additional mention of the California textbook project (p. 301). Note that only one of these references relates to the way students actually learn; the others are about content creation and distribution.

This is additional evidence that technology’s real impact on education is yet to be realized. In a 2007 IRRODL article, David Annand observed: “Much like the Industrial Revolution before it, rapid technological change in the Information Age has to date created significant, fundamental change in virtually all sectors of society except education” (2007, emphasis added).

What factors will bring about a fundamental paradigm shift in learning? For starters, I believe we need to press onward in our efforts to make teaching and learning technology (both tools and content) more modular and interoperable. We also need to do a better job of leveraging the network effect, connecting more learners to more content and to more fellow learners. Finally, none of this will matter if we don’t stay doggedly focused on learning (instead of on making administrative and teaching tasks more efficient).

This is all the subject of an article I’m working on with my BYU colleague Mike Bush. I’ll post a link to it when it’s published.