As you look through the ALT conference programme, you’ll see as much discussion of
important social justice issues (equity, diversity and inclusion, decolonisation, widening
participation) as of technology (artificial intelligence, VLEs, social media). Running
through so many of these presentations is criticality – critical digital literacy, critical digital
pedagogy, and critical reflection.
But what do we mean by criticality, and why is it important? It’s important because if we’re
not engaging critically, we are simply accepting the status quo. For some people that might be
enough: a quiet life, if you like. But for many (particularly in the ALT community) that’s not
enough. We look around and see issues caused by climate change, by increasing inequalities
and rising discrimination against so many groups, and ask the age-old question – is this
really the world I want to live in, and couldn’t we make it better?
So what about criticality? Stommel (2014) suggests several functions of ‘critical’ including:
- Critical, as in mission-critical, essential.
- Critical, as in literary criticism and critique, providing definitions and interpretation.
- Critical, as in reflective and nuanced thinking about a subject.
- Critical, as in criticizing institutional, corporate, or societal impediments to learning.
Each of the themes of ALT2023 deals with essential issues for the 21st century – leadership,
diversity and inclusion, sustainability and social justice, and emerging technologies and
behaviours. Many of the presentations provide a critique of these issues or reflections on
them. Such critique and reflection helps us to learn from past actions and to understand and
develop new approaches for the future, hopefully helping us to better address these issues
and enhance society.
This critique and reflection is particularly necessary and visible at present in the discussions
around generative artificial intelligence (AI). Questions are being asked about the
development of these tools – how ethical is the labour involved, how equitable is the
training of the AI model, how sustainable is the use of such tools, how do they impact on
social justice? These are important questions, as are the questions being asked about the
role and regulation of the big tech companies who have developed AI, particularly when it is
potentially such a life-changing development. As we discuss these questions in relation to
AI, perhaps we should also turn that critical lens on other big tech companies. Their products
may feel ubiquitous, but we have power as a community to hold them to account too, and to
encourage them to develop more sustainable, equitable and ethical practices. The need to
reflect critically about the design, development and use of learning technology feels more
urgent than ever.
What role does ALT conference play in all this? It gives us a chance to come together and
debate the issues of the day, to hear from people with lived experience and those without,
to explore different perspectives and ideas. Perhaps even to hold some learning
technologies suppliers to account, or at least to ask them some critical questions such as:
- How have you ensured that your product is accessible, and that it is equally effective
regardless of gender and race?
- What steps have you taken to minimise any negative impact on the environment or on
human rights from the development and use of your product?
- How do you ensure that your labour practices are equitable and legal, and that they
promote human rights?
At the ALT conference in 2018, Maren Deepwell said:
“Our practice is political, it’s personal and active participation in any of these
initiatives makes a difference. It helps us articulate a narrative that isn’t dominated
by advocacy alone and expands our personal learning networks beyond those we
already know and feel comfortable with, help burst the filter bubbles that surround
us.”
So I encourage you to come to the ALT 2023 Conference, burst some bubbles and ask some
critical questions of each other, of the technology suppliers and of ourselves.