Ariam's ARCADE

Many Abilities//Many Paths//Many Communities

I was so excited to publish this piece in EdSurge last month with my colleague Laura McBain. Reposting!

As we transition into the new year reckoning with a violent insurrection organized on social media, the spread of disinformation about a deadly pandemic and breakdowns in distance learning, it would be remiss of us not to acknowledge the impact of technology on every facet of our society. We have been unified in our frustration and concern that we are inching closer to a dystopian future.

2020 brought greater visibility to how racism, sexism, and other -isms permeate technology and will continue to create divides that may become irreparable. In the tech world, there have been DEI efforts, legislation targeting the racist impact of technology, warnings from ethicists and independent bias ratings created to rein in the harm—but none of these solutions address the real issue: humans. We must confront how the destructive harm unfolding through our technologies today is a reflection of the implicit bias and prejudice of the designers building the technology—and how they were taught. Many designers don’t know how to identify harmful bias and are completely unaware of how their own biases shape the products they build. So how can we start addressing this issue head-on in 2021?

Humans are the problem, and luckily education offers a solution.

We need to move beyond the quick fixes that are not working, and invest in the next generation of technologists by radically reshaping why and how we teach computer science. The natural place to start is within the broad but influential community of computer science education, which includes teachers, administrators, curriculum designers and anyone involved in shaping how future technologists learn. Our young people need to be technically proficient in Python, R and Lisp to build AI, machine learning and other emerging technologies. However, computing skills are not enough; we need to equip our young people with the knowledge, skills and moral courage to design equitable tech that dismantles existing power dynamics, protects non-dominant groups, represents everyone and prioritizes the well-being of society.

As CS and technology educators, we have helped create dozens of spaces for young people to tinker with technology over the past decade. Reflecting back on that time span, we can’t help but wonder how many young people graduated from those spaces capable of building a new bot, but incapable of recognizing their own biases. Where are they now? What cool and potentially dangerous technology have they put into the world? We cannot go back in time, but we can use this new insight to design a better, more equitable vision for computer science education.

A radically reshaped computer science education will:

Prioritize racial literacy and history.

It’s important for all young people to believe they can be creators of technology, and it’s also reckless for us to omit that technology has historically been designed as a tool to surveil and oppress non-dominant communities. Prioritizing racial literacy means we must acknowledge how white supremacy has been ingrained into technology and collectively recognize that tech has not been neutral and that it has the power to harm. Examples might include how early punch card tabulators were used by the Third Reich in Nazi Germany to process racial censuses of Jewish German citizens, and how some of the earliest film stock was calibrated for white skin tones. Today we have technologies like facial recognition software that centers whiteness and often fails to identify Black women, while also being designed to surveil and police Black and Brown communities.

Like emerging technologies, oppressive design practices have only evolved and manifested in new ways. K-12 administrators, educators and tech companies investing in computer science education need to support young people to examine the design, use and harmful consequences of discriminatory technologies.

Reflect and act on our own biases as creators.

It’s crucial for young people to understand how bias is embedded in code, data, policy and other facets of technology. The ability to reflect on how our positionality (shaped by identity and social status) influences the technologies we design is even more critical for young people. At the Stanford d.school, we’ve built a design methodology that can help technology designers think through the first-, second- and third-order implications of their creations before they release them into the world. Our budding technologists should iteratively evaluate their creations and ask themselves:

  • Am I creating this based on my own lived experience and expecting others who are different from me to use it?
  • Who benefits, who is being harmed or who is left out from the technology?
  • Whose stories is this dataset telling? Whose stories is this dataset leaving out? What was the historical context when this dataset was produced?
  • What don’t I know? Who should I ask and learn with?
  • I can design this but should I? What are the implications that need to be considered?

Recognize and make space for multiple perspectives.

The field of design can be an arena for the “pluriverse”, which anthropologist Arturo Escobar defines as multiple ways of knowing, being and thinking that are rooted in specific places and communities.

Young people are curious, and they can be inspired by the diverse ontologies and perspectives among the peoples of the world and in natural systems. Guiding them to channel this inspiration into design practices that shift the power dynamics in technology across race, gender, ability and culture can make our technologies profoundly more equitable. Encouraging them to see what is possible by tackling hyper-local problems and designing solutions with others who have wildly different perspectives is one place to start. Intercultural experiences that challenge them to question why their perspective should be the one held by the world, and that make room for beliefs they may not relate to, are another. These early experiences can enable them to work with others and build technologies that are more inclusive and contextually appropriate.

How do we move forward?

We can gain inspiration from the 1619 Project and the Zinn Education Project, which have provided us with the tools to face our multifaceted histories in the hopes of repairing and shaping our futures. These projects prioritize racial literacy, help young people reflect on bias and recognize multiple perspectives.

We can work with our social studies departments and across other disciplines to ensure our students have a historical understanding of technology. We can celebrate what our students code and build, and ask them to consider the impact their creations might have on others. And we can celebrate and actively engage with different perspectives that challenge dominant voices and narratives in every step of our design process.

If we can apply these practices to computer science education, our young people might create cool technology that serves everyone and upholds a just world.

Reposted from Medium’s The Startup.


If you ask one or more AI assistants today “who discovered America?”, the alarming response you might get is:

“Christopher Columbus… Americans get a day off work on October 10 to celebrate Columbus Day. It’s an annual holiday that commemorates the day on October 12, 1492, when the Italian explorer Christopher Columbus officially set foot in the Americas, and claimed the land for Spain. It has been a national holiday in the United States since 1937.”

It’s Indigenous Peoples’ Day and there are many reasons this response is a problem.

First, let’s start with the facts. Christopher Columbus did not discover America. He initiated the “Columbian exchange”, which established settler colonies through violence and genocide. The dominant Eurocentric narrative about Columbus discovering America has been challenged and delegitimized for decades.

Second, whose narrative is being centered? When this question is asked, the perspective and history of the Indigenous peoples of America, who were in fact here first, is not addressed or recognized by this AI assistant. Why does this seem like a big deal if many of us know that Christopher Columbus didn’t discover America? Because there are still many people, including young people in the United States and across the world, who don’t know, who have been taught to accept this dominant narrative, or who have yet to learn about this period in history. This AI assistant is disseminating and reinforcing a false historical narrative to thousands, if not millions, of people, and this was a design choice.

All the information that disseminates from technologies like AI assistants is the product of design choices made by people, and people can either choose to reinforce oppressive narratives or amplify the histories of those who have long been oppressed.

What if, when someone asks an AI assistant “who discovered America?”, the epistemology, knowledge and perspectives of the Indigenous peoples of the Americas were what disseminated from over a billion AI assistants?

“Who discovered America?” is just one question. What other questions are there for us to discover, unpack and make space for critical discourse? How might we re-center non-dominant perspectives through technology to advance social justice and equity?


Controlling the minds of the masses.

By the year 2026, the AI market is expected to reach $300.26 billion, and one of the primary factors driving that demand is AI assistants like Google Home, Siri, and Alexa. There are already over a billion Google Assistant devices in homes, offices, and other spaces, and that number will only keep growing. These technologies have incredible capabilities and help us do everything from completing mundane tasks to getting timely information. Want the latest news from NPR? Need directions to get to a friend’s place? Want to find out how cold it is outside for your morning run? Ask Google, Alexa, or Siri.

These technologies can perform many convenient functions and are becoming increasingly accessible, but are we assessing how they’re unconsciously shaping our understanding and knowledge of the world? Are we equipping our young people with the skills to recognize the influence of these technologies, question their authority, and push back?

Malcolm X once said:

“The media’s the most powerful entity on earth. They have the power to make the innocent guilty and to make the guilty innocent, and that’s power. Because they control the minds of the masses.”

AI assistants and other technologies are no different from the media or our education system; they’re an extension of this apparatus and wield bias and influence through power. Safiya Noble, Ruha Benjamin, Cathy O’Neil, and other scholars have thoroughly documented the many biases, racist and sexist in particular, perpetuated by emerging technologies. In spite of this scholarship, emerging technologies are positioned by the technology industry as “neutral,” and when there have been incidents of bias, they’ve been written off as innocent “glitches.” Joy Buolamwini, Timnit Gebru, and other prominent computer scientists have uncovered how these technologies and the datasets they use are designed and curated by human beings who encode their own biases, values and identities.

If we return to our AI assistant and Christopher Columbus example, it’s possible that the person(s) who created the algorithm designed it to pull up the top Google search engine result fueled by advertising dollars (VOA News), without taking the time to critically review the information for historical accuracy; or they used autosuggestion; or they manually curated the dataset and believed it to be a good source of information for users, but we really don’t know.
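To make concrete how much of this is a design choice, here is a minimal, hypothetical sketch (the function names, data, and ranking logic are all assumptions, not how any real assistant works) contrasting an assistant that blindly returns the top-ranked search result with one whose designers curate and surface multiple perspectives:

```python
# Hypothetical sketch only: neither function reflects any real assistant's
# internals. It illustrates that "which answer comes back" is a human choice.

# Pretend these are results from a search backend, ranked by popularity/ad spend.
SEARCH_RESULTS = {
    "who discovered america?": [
        {"source": "ad-boosted news site",
         "text": "Christopher Columbus discovered America in 1492."},
        {"source": "encyclopedia",
         "text": "Indigenous peoples lived in the Americas for millennia before European contact."},
    ]
}

# A human-vetted dataset a design team might choose to maintain instead.
CURATED_ANSWERS = {
    "who discovered america?": (
        "Indigenous peoples have lived in the Americas for tens of thousands of years. "
        "Christopher Columbus's 1492 voyage began European colonization, not discovery."
    )
}

def answer_top_result(question: str) -> str:
    """Design choice A: return whatever ranks first, no review."""
    results = SEARCH_RESULTS.get(question.lower(), [])
    return results[0]["text"] if results else "I don't know how to respond to that."

def answer_curated(question: str) -> str:
    """Design choice B: prefer a vetted answer; otherwise surface every source."""
    key = question.lower()
    if key in CURATED_ANSWERS:
        return CURATED_ANSWERS[key]
    results = SEARCH_RESULTS.get(key, [])
    joined = " / ".join(f"[{r['source']}] {r['text']}" for r in results)
    return joined or "I don't know how to respond to that."

print(answer_top_result("Who discovered America?"))
print(answer_curated("Who discovered America?"))
```

Neither function is more “technical” than the other; the difference lies entirely in what the designer decided the assistant should privilege.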


“I don’t know how to respond to that.” — AI Assistant.


Unlike our nightly news anchor whom we can tweet at, our radio station we can call in to, or the editor of our local newspaper to whom we can write a letter or op-ed, emerging technologies maintain fortified black boxes. Actual people remain nameless and faceless, and this prevents the creation of spaces for engagement or discussion.

“How does one escape a cage that doesn’t exist?”, Maeve (a robot) from Westworld ponders in season three, and it’s a question that aptly reflects this dilemma. The invisibility of how decision-making processes are designed and embedded in emerging technologies, and the perceived separation from human bias or error, is what makes their influence so insidious. Many scholars cite how social trust and overdependence on technology prevent us from questioning these black box algorithms and data sources. We believe it’s not our place, we’re not the technical experts, we’re told it’s too complicated, we don’t think about it at all, or we’ve been indoctrinated into believing that everything in our Google search return is accurate and what we need to know (“just GOOGLE it!”).

AI assistants and other emerging technologies are a great case study for Foucault’s knowledge-power theory, which posits that power is everywhere and pervasive, established through accepted forms of knowledge, scientific understanding, and “truth.” Few industries are better at upholding “universal truths” than the technology industry. As we saw in the case of the AI assistant and Christopher Columbus, these “universal truths” prop up dominant narratives that continue to oppress non-dominant peoples. Our consciousness and ability to rebel against these universal truths and dominant narratives is fundamental to dismantling structural inequity.


Why rebelling is even more important for K12 now.

AI assistants are increasingly being used as educational aids by young people to answer questions and fact-check their work outside of school, and these technologies are being positioned as tools to bolster the development of inquiry and curiosity. When schools are at their best, children conduct research on the web at school with the support of teachers and librarians, who are trained educators tasked with supporting them to build their information and media literacy skills. With adult guidance they learn how to evaluate a source, debate the content of that source with their peers and create their own content.

How are AI assistants and black box algorithms altering this dynamic, especially in light of the COVID-19 global pandemic? How might these technologies create even greater harm at scale in K12 education through the dissemination of misinformation and dominant narratives that are prioritized according to which private interest has the biggest budget for search engine optimization?

We at the Stanford d.school are determined to support educators, families, and children to participate: to see what’s not visible, question these technologies, and embrace the role of creator and decision-maker. We’re also determined to equip designers and technologists with the skills to reflect on their own positionality, recognize discriminatory design practices and inflict less harm. If we are serious about equity, we must thoroughly evaluate the implications of our work on society before and iteratively as we design, and those who might be affected should give the greenlight before we set our creations loose in the world.

Good intentions aren’t good enough. This is why we created “Build a Bot.”


A Peek Inside the Prototype: “Build a Bot.”

There are many layers of design involved in creating AI assistants, including how we interact with them, how they select and collect the information they share with us, and what they do with the information we give them (yes, we give them information; sometimes we just don’t know it). In our prototype “Build a Bot,” educators, families, and young people can design their own personalized responses to help requests and contend with the implications of various design choices. If you asked Alexa for directions, how would YOU want Alexa to respond back to you? That’s an intentional design choice that we as designers make and can change.

Educators, families, and young people can explore other design choices that aren’t always made public but are shaping our society and future, and they can sit in the driver’s seat. As you build and craft your own AI assistant, or tinker with one you might have, this learning experience will provoke questions like the following (a minimal code sketch of such a bot appears after the list):

  • If my AI assistant doesn’t understand a question someone asks, how should I design it to respond? What kind of questions should I create so that my AI assistant can answer someone’s question and truly be helpful?
  • When someone asks my AI assistant a question, where should it get the answers or information from? Newspapers, Twitter, Wikipedia? Is one place better than another place? How do I know? Whose perspective is this information positioning and is it propping up a dominant narrative or misinformation? Should I pick a source that presents multiple perspectives?
  • Should my AI assistant be able to listen to every conversation someone has, and is that conversation safe? Where does that data go and should it be saved? Should someone else have access to it?
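To ground those questions, here is a small, hypothetical Python sketch of a configurable bot. This is not the actual Build a Bot prototype; every field and default below is invented for illustration. The point is that the fallback response, the trusted sources, and whether conversations are stored are all explicit choices the builder makes:

```python
# Hypothetical "build your own bot" sketch: every field below is a design
# choice made visible to the builder. Not the actual Build a Bot prototype.
from dataclasses import dataclass, field

@dataclass
class BotConfig:
    fallback_response: str = "I don't know yet. Can you help me learn?"
    allowed_sources: list = field(default_factory=lambda: ["encyclopedia", "local library"])
    store_conversations: bool = False  # Should the bot remember what it hears?

@dataclass
class MyBot:
    config: BotConfig
    answers: dict = field(default_factory=dict)  # question -> (answer, source)
    log: list = field(default_factory=list)

    def teach(self, question: str, answer: str, source: str) -> None:
        """The builder curates answers and must name where each one comes from."""
        if source not in self.config.allowed_sources:
            raise ValueError(f"Source '{source}' is not one the builder chose to trust.")
        self.answers[question.lower()] = (answer, source)

    def ask(self, question: str) -> str:
        if self.config.store_conversations:
            self.log.append(question)  # a privacy decision, made explicit
        found = self.answers.get(question.lower())
        if found is None:
            return self.config.fallback_response
        text, source = found
        return f"{text} (source: {source})"

bot = MyBot(BotConfig())
bot.teach("who discovered america?",
          "Indigenous peoples were here long before Columbus arrived in 1492.",
          "encyclopedia")
print(bot.ask("Who discovered America?"))
print(bot.ask("What's the weather?"))  # falls back to the builder's chosen response
```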

These cards were inspired by the early work of Josie Young on the Feminist PIA (personal intelligent assistant) standards, and the wonderful work of the Feminist Internet and Comuzi on F’xa. This prototype along with more information can be found here.

A number of popular AI assistants were recently updated with data sources to show support for the Black Lives Matter movement (“Black Lives Matter”), and if a person asks “do all lives matter?”, they all express some version of “saying ‘black lives matter’ doesn’t mean that all lives don’t. It means black lives are at risk in ways others are not.”

While it’s encouraging to see this response to the shifts in global discourse around police brutality, what was the response to this query a few months ago? “I don’t know”? “Yes”? It shouldn’t take a civil rights movement to prompt the technology industry to simply do the right thing.

My colleague Manasa Yeturu and I started re-phrasing the popular slogan “design starts with the user” to “design doesn’t start with the user, it starts with YOU,” which not only includes examining our own positionality, but all that we don’t know and all the ways in which we fail to act, fail to learn more about others, and fail to prevent harm.

Everything we do and everything we don’t do is an intentional design choice.

Join us.

What questions do you want to discuss and debate with AI assistants and their creators? What new help requests should we add to this deck of cards? Tweet at us @k12lab.

Over the last few months I’ve been working at the Stanford d.school in their K12 Lab on a special project around emerging tech and equity. Re-posting a blogpost written in collaboration with stellar colleagues Laura McBain, Lisa Kay Solomon, Carissa Carter and Megan Stariha.

 

Technology is power.

It can enable you to share an idea with millions of people around the world in a matter of seconds. And in those same few seconds, it can enable someone else to steal your identity and drain your bank account.

Whether it’s being used to spread information, incite violence, influence elections, or shop for glasses, who should have access to such powers? Who should be able to design and utilize technology to shape the world in their vision and image?

The present reality is that this power is in the hands of very few, and it is manifesting in serious consequences for the most marginalized people in the world. This is why we all need to be technologists. We all have the right to participate in and shape the growing influence technology has on our lives and communities, and to build our digital agency. Whether you are the creator, user or policymaker, we all have a role in designing and deciding the future we all want to live in.

Today many emerging technologies (still in development and/or not yet at commercial scale) like machine learning, wearable tech, synthetic biology and others are often riddled with embedded biases (Ruha Benjamin, 2019). Computer scientist Joy Buolamwini found that three widely used gender recognition tools misidentified the gender of dark-skinned women from a photograph as much as 35 percent of the time, while misidentifying white men less than 1 percent of the time (New York Times, 2018). This is a symptom of how emerging technologies are not created by diverse groups of people who bring different values, life experiences and expertise, and who take responsibility for ensuring all voices are represented in the design process.
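One way designers can surface this kind of bias, assuming they have predictions and demographic labels for an evaluation set (the data below is entirely made up to echo the reported disparity), is to report accuracy per subgroup rather than a single aggregate number:

```python
# Illustrative only: compute accuracy per demographic subgroup instead of one
# aggregate score, so disparities like those Buolamwini documented become visible.
from collections import defaultdict

def accuracy_by_group(records):
    """records: iterable of dicts with 'group', 'predicted', 'actual' keys."""
    correct, total = defaultdict(int), defaultdict(int)
    for r in records:
        total[r["group"]] += 1
        if r["predicted"] == r["actual"]:
            correct[r["group"]] += 1
    return {g: correct[g] / total[g] for g in total}

# Made-up evaluation data, shaped to echo the reported disparity.
sample = (
    [{"group": "darker-skinned women", "predicted": "man", "actual": "woman"}] * 35
    + [{"group": "darker-skinned women", "predicted": "woman", "actual": "woman"}] * 65
    + [{"group": "lighter-skinned men", "predicted": "man", "actual": "man"}] * 99
    + [{"group": "lighter-skinned men", "predicted": "woman", "actual": "man"}] * 1
)

for group, acc in accuracy_by_group(sample).items():
    print(f"{group}: {acc:.0%} accurate")
```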

At the d.school we believe educators are uniquely situated to address this critical issue. Educators have the capacity to shape a future in which all voices are represented and valued. They have the ability to equip students with the skills, mindsets, and dispositions needed to evaluate the ethical implications of technology and prioritize equity-centered design. But educators, particularly those who are serving students furthest from opportunity, need new resources to help students engage and create with emerging technology.

Educators experiment with the “I Love Algorithms” card deck designed by the Stanford d.school Teaching and Learning Studio. Photos courtesy of the Stanford d.school/Patrick Beaudouin.

We believe that design can play an important role in addressing the digital inequities that exist in our K-12 communities and the challenges facing digital inclusion. Building on our ongoing exploration of emerging tech, equity, and design, we are exploring questions like…

  • How are emerging technologies used by different communities?
  • Who is creating emerging technologies like machine learning, blockchain, and synthetic biology?
  • Who is not being represented in the creation and pioneering of these emerging technologies?
  • How are oppressive social structures and practices, like racial profiling, manifesting in the early stages of the creation and application of emerging technologies? Why?
  • How might we equip educators and students with the creative confidence to understand, evaluate, and create with emerging technologies in their communities?

These questions and the research we’ve done are leading us to this design challenge:

How might we leverage emerging technologies to advance equity, empathy, and self-efficacy in K-12 education?

Our design work is grounded in four pillars of understanding, centered around participation and radical access, and built on the early design work from Carissa Carter’s I Love Algorithms:

  1. It’s not about becoming a coder; it’s about knowing what the code can do (Carissa Carter, 2018). We all need to understand what emerging technologies can do, how they’re interlinked, and how they can be designed by increasingly diverse groups of creators and decision makers. This means that each of us should have a basic understanding of how emerging technologies such as blockchain, artificial intelligence, the internet of things, and brain-computer interface technologies work. Does that mean we’re all verifying transactions on a blockchain? No. But it does mean that we understand it’s rooted in decentralization, transparency, and immutability, and why some systems may or may not benefit from using blockchain (a short code sketch after this list illustrates the hash-linking idea).
  2. If we want emerging technology to represent all of us, it needs to be created by all of us (Carissa Carter, 2018). Technology needs to be inclusive. Creation encompasses more than just technical production or programming; it means all of our experiences, perspectives and voices are incorporated in the creation, adaptation, and delivery of the technology. It requires that we all have an understanding of the concepts underlying emerging technology, and that each of us is an integral part of the design process.
  3. Technology is personal. Educators need support with how to cultivate and leverage the valuable digital practices and identities their students bring into the classroom (Matt Rafalow, 2018). To cultivate students’ abilities and support them in connecting with emerging technology, we need to consistently find ways to make technology personal to them. If students don’t recognize themselves or their communities in the technology they are using or designing with, this only further marginalizes them and reinforces embedded bias.
  4. Learning is about lifelong participation and creation, not consumption. Constructionism has shown us that the most powerful learning experiences emerge from creating something from our own curiosity, understanding, knowledge, and experience of the world. There is nothing more rewarding than designing something that solves a problem for you and the people you care about in your community.
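For the blockchain example in the first pillar, a few lines of illustrative Python (a heavy simplification with no mining, networking, or consensus; just hash-linked records) show why such a chain is considered immutable: altering one record breaks every link that follows it.

```python
# Heavily simplified sketch of the "immutability" idea behind a blockchain.
# No mining, networking, or consensus here; only hash-linked records.
import hashlib
import json

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain: list, data: str) -> None:
    previous = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "data": data, "prev_hash": previous})

def is_valid(chain: list) -> bool:
    """Every block must reference the hash of the block before it."""
    for i in range(1, len(chain)):
        if chain[i]["prev_hash"] != block_hash(chain[i - 1]):
            return False
    return True

chain = []
add_block(chain, "Alice pays Bob 5 tokens")
add_block(chain, "Bob pays Cara 2 tokens")
add_block(chain, "Cara pays Dan 1 token")
print(is_valid(chain))          # True

chain[1]["data"] = "Bob pays Cara 200 tokens"  # tamper with history
print(is_valid(chain))          # False: the chain reveals the edit
```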

How we are getting started.

In our pursuit to expand radical access to emerging technology and to cultivate a diverse generation of technology creators, we’ve launched a design project called 10 Tools for 1,000 Schools, a portfolio of design resources, tools, and approaches to help build the creative and digital agency of K-12 communities.

In the toolbox, educators will find engaging activities to help them understand and teach the foundational concepts of emerging technologies, resources on how to integrate them into various academic disciplines, and easy-to-adapt community-based design challenges. We kicked off the playtest of two of our first 10 Tools resources at the first-ever K12 Futures Fest, a gathering of more than 200 educators, students, and other community members who showcased their work and engaged in our new experiments.

Educators participate in a Futures Fest session on Blockchain. Photos courtesy of the Stanford d.school/Patrick Beaudouin.

Educators participated in a session that immersed them in the blockchain concepts of decentralization and transparency by taking on the persona of detectives tasked with cracking unsolved mysteries; in another session, they designed their own dance moves to express different machine learning algorithms. Participants pushed back on the perceived benefits of the technologies, rapidly came up with new ideas for how they might apply these technologies to new design challenges, and asked thought-provoking questions about the potential impacts on their students.

As our prototypes and learning evolve, we aim to share our work on the K12 Lab site. And we hope to encourage more educators to take up this challenge in their own communities by adopting and remixing these resources to fit the diverse needs and identities of their students.

Our collaborators include a crew of pioneering educators: Kwaku Aning, Louka Parry, Jennifer Gaspar-Santos, Akala Francis, and Daniel Ramos. They are each collaborating with us to create, integrate, and adapt these resources in their own contexts.

On the horizon.

In 2020, Karen Ingram, a designer with a special focus on synthetic biology, will join the team as an Emerging Tech Fellow.

How to learn more?

Want to learn more about our work? Read updates here. You can also join our newsletter for updates and events! Follow our progress on Twitter using #10tools4schools.

_ _ _ _ _ _ _ _ _ _ _ _ _ _

References:

  1. Benjamin, R. (2019). Race after Technology: Abolitionist Tools for the New Jim Code. Polity Press.
  2. Lohr, S. (2018, February 9). Facial Recognition Is Accurate, if You’re a White Guy. The New York Times. Retrieved from https://www.nytimes.com/2018/02/09/technology/facial-recognition-race-artificial-intelligence.html.
  3. Green, B. (2019, April 17). Can Predictive Policing Help Stamp Out Racial Profiling? The Boston Globe. Retrieved from https://www.bostonglobe.com/magazine/2019/04/17/can-predictive-policing-help-stamp-out-racial-profiling/7GNaJrScBYu0a5lUr0RaKP/story.html.
  4. Rafalow, M. (2018). Disciplining Play: Digital Youth Culture as Capital at School. American Journal of Sociology, 123(5), 1416–1452.

This past summer I spent a few months working with the Asian Development Bank on evaluating learning technologies for early childhood education. In 2019, they wanted to support the People’s Republic of China in identifying technologies that lead to positive learning outcomes and are contextually appropriate, affordable, and broadly accessible.

I had a chance to meet with Putao ABC in Beijing, an ed tech company which develops a lot of AI-enabled and AR learning applications.

It was a fun mix of reviewing existing literature, conducting interviews with universities in Beijing and Shanghai to learn about the newest research, and visiting ed tech companies to test drive new products. Ultimately I synthesized all this data to create a framework that helps governments determine which technologies are effective, and how and when to use them. The classification broke technologies down into the administration of teaching, which includes student information systems and classroom management and communication tools, and the implementation of teaching, which more broadly encompassed everything from teaching aids and computer-assisted learning to teacher professional development and assessment tools. The framework highlighted the technologies, types of software and hardware, costs, obstacles around implementation, use cases and examples of products.

I was especially delighted by this project with ADB because there’s a critical need for governments, international organizations and NGOs to make informed, evidence-based decisions when procuring ed tech products. All too often technologies are procured and either don’t work for the context or are incredibly expensive (again…don’t work for the context). I hope this classification can provide some insight and guidance, and I will continue to iterate on it as new technologies enter the market.

 

Whoa… it’s been a minute since I’ve had a chance to write, reflect and post anything here, and the holidays are a great time for it.

One of the coolest and most challenging projects I had the opportunity to work on in 2019 was the Kenya Education Cloud. In late 2018, the Kenya Institute of Curriculum Development wanted to expand the content offerings on the Kenya Education Cloud (the country’s national digital learning platform) in an affordable way for students, teachers and school communities. Digital learning content can be costly, and KICD wanted to understand how Open Educational Resources (OER) could be leveraged for the Kenyan context. It was fantastic to join KICD on this journey as a digital learning consultant.

The Challenge

70% of OER globally are available only in English, and because OER emerged to disrupt tertiary education, few OER exist for primary education, particularly in Kiswahili, Luo, Kikuyu and other indigenous languages of Kenya. After an extensive review of available OER, it was established early on that approaching proprietary content developers and building strong partnerships within the local ecosystem would be critical to success; there was simply a dearth of high-quality interactive OER. Through a multi-pronged approach of reaching out to local and global content producers, we mapped about 1,000 resources (a mix of local and global) to Kenya’s new competency-based curriculum. Some producers offered KICD no-fee licenses for either full repositories or select repositories of content. As expected, the bulk of OER was in Mathematics, Science and English Language, with scarcity in KSL (Kenyan Sign Language), Indigenous Languages and Religious Education. Locating interactive OER that were gender-responsive in line with national policies (the Education and Training Sector Gender Policy) was also very challenging. And we hadn’t even gotten to the vetting process yet…

Quality Assurance: Will these hold up to our standards?

Once we had a set of OER, we designed a series of hands-on trainings to build the capacity of KICD curators, the subject matter experts tasked with vetting content across Kenya’s 9 learning areas for grades 1-3 against quality assurance standards. While KICD curators were experts in their domains and in content creation, OER were new terrain for most of them. Curators were taken through Creative Commons licenses, the remix/reuse/share/redistribute OER model, and practical exercises on how to use OER in the classroom. They also engaged in project-based activities to explore OER expected to be available in the KEC, and reviewed them against existing OER standards established by KICD. There were great learning moments around defending decisions to recommend or not recommend content for the KEC, which also made us re-evaluate some of the standards. It was a good time.

This PD expanded into a larger session a few months later with 63 public school teachers from across Nairobi, Garissa, and Turkana on how to integrate OER into the classroom. The professional development took teachers through a cycle of evaluating and selecting an OER, designing a lesson plan with it for their students, and facilitating the lesson dynamically with peer feedback. Many teachers expressed that with more practice, they felt confident they could integrate OER into the classroom. One of the highlights of the session was watching teachers immerse themselves in the design process and recognize their own creativity.

Over the project period, we created a full-circle strategy encompassing interactive content mapping, partnerships and acquisition, all the way through to interactive content use in the classroom.

Key Outcomes, Lessons and New Questions

  1. It can be a fruitful endeavor to source OER, but given the dearth of OER for primary and even secondary education, particularly high-quality OER, it is equally important to mobilize time and resources to build relationships with content producers, donors, community-based organizations and other stakeholders. KICD and international organizations like UNICEF can be key conveners and play a critical role in building a sustainable OER ecosystem that can rapidly meet demand with more supply and ensure there are clear incentives for all involved. For this initiative, that means working with global content producers and actors like the Global Digital Library and coordinating localization sprints for high-quality OER that are not yet in accessible formats or available in Kiswahili and indigenous languages. Creative Commons, which has communities across the world and in Kenya, could also be a great partner. It also means building strong relationships with local content producers who are creating high-quality interactive content, and tapping into the local/regional ed-tech community for information on new resources, technologies and opportunities for collaboration.
  2. Decades of educational research has shown that ICT in Education initiatives often fail due to poor teacher training and low teacher adoption of technology. We learned through this process that teacher training can benefit from undergoing a full design process. This could include follow-up trainings to build on previous skills acquired, classroom observation paired with real-time feedback, establishing digital communities for teachers to share challenges and successes, etc.  I’ve written about this before, and recognize that government institutions sometimes don’t have the resources to make it a priority. How can we do better on this front?
  3. While it may not have been the focus of the teacher PD, incorporating basic digital literacy training for teachers is a must, especially when working with teachers from more marginalized parts of the country. Teachers from certain counties, in particular, struggled with basic digital literacy functions (e.g., opening a new tab in a web browser, locating a file on the hard drive). With basic digital literacy skills, teachers will have more confidence in their ability to use the devices, and it will also accelerate the speed of training so that other important topics can be covered.