Guest editorial: Artificial intelligence and composing just education futures

Anna Smith (Illinois State University, Normal, Illinois, USA)
Jennifer Higgs (School of Education, University of California Davis, Davis, California, USA)
José Ramón Lizárraga (University of Colorado Boulder, Boulder, Colorado, USA)
Vaughn W.M. Watson (Department of Teacher Education, Michigan State University College of Education, East Lansing, Michigan, USA)

English Teaching: Practice & Critique

ISSN: 1175-8708

Article publication date: 8 April 2024

Issue publication date: 8 April 2024


Citation

Smith, A., Higgs, J., Lizárraga, J.R. and Watson, V.W.M. (2024), "Guest editorial: Artificial intelligence and composing just education futures", English Teaching: Practice & Critique, Vol. 23 No. 1, pp. 1-5. https://doi.org/10.1108/ETPC-04-2024-202

Publisher

Emerald Publishing Limited

Copyright © 2024, Emerald Publishing Limited


Responsive to the rapid evolution of and spirited public discourses around generative artificial intelligence (AI), we offer this special issue as a conceptual and empirical (re)imagining of pasts, presents and futures of composing, education, and algorithmic lifeworlds (Gilbert, 2018; Habermas, 1987). At the edge of a post-digital era in which everyday life is thoroughly enmeshed with digital technologies (Jandrić et al., 2018; Nichols and Garcia, 2022; Lizárraga, 2023; Stornaiuolo et al., 2017), the next generation of automated systems in the form of generative AI presents a rupture point felt in spaces of creativity and authorship across industries, schools, and beyond. This rupture is marked by converging accelerations that include increased speed in the evolution of natural language processing, large language models, and neural network technologies, alongside the corporate race to integrate, hype and profit from the capacities of automated decision-making encoded across digital platforms.

This acceleration is also felt in the expanding reach of chatbot interfaces that enable publics and communities to access and engage with previously out-of-reach multimedia generation tools. Furthermore, the dazzling speed at which AI renders vibrant videos, images, or written texts can preclude closer examinations of what, exactly, is being produced and reproduced through these processes. Although marketed as novel, outputs are flattened facsimiles of existing training data – incomplete pasts of digitized text and media rife with technoableism (Shew, 2020), intersectional racism (Buolamwini and Gebru, 2018) and techno erasure (Small, 2023). “Generative” in this sense can be misleading. As machine learning is increasingly autonomously recursive in nature, artificial intelligence is a historical rather than future-making act of creation (Hughes-Warrington, 2022). Instead of predicting and generating “new” futures in language, art and media, it calculates and renders a most likely past.

This rupture point thus necessitates attention to the accelerated creation potential that AI technologies offer in the name of revolutionary progress, which cannot be disentangled from the violent disruptions they impose upon the world. Concurrent with the accelerations of creative processes through generative AI are perils such as invisibilized and exploitative labour (Bartholomew, 2023; Roberts, 2014; Stewart and Uanhoro, 2023), significant environmental and economic impacts associated with AI infrastructure (Crawford and Joler, 2018), and disinformation, surveillance, and the amplification at scale of human biases (Buolamwini, 2023; Noble, 2018). In educational settings, we further observe an increasing normalization of student surveillance (LeBlanc et al., 2023; Maughan et al., 2022) encoded into the basic functioning of existing systems, the dominance of commercial EdTech hype around personalized learning, data mining, and learning analytics (Williamson and Eynon, 2020), and the rise of digital platform governance of education systems (Gulson and Witzenberger, 2022; Nichols and Dixon-Román, 2024). These work in tandem with the growing presence of automated writing tutor and assessment systems that position technology as a surefire solution to educational issues, thereby drawing attention away from the cultural and social practices that shape learning (Robinson, 2023; Stornaiuolo et al., 2024).

Scholars have argued that such points of epistemic rupture necessitate paradigmatic shifts, as gestured to by Kuhn (1962) and Ngugi Wa Thiong’o (1986). In such a moment, we must make way for increasingly critical perspectives on writing as a distributed practice that travels across vast sociotechnical systems. In this special issue, scholars forward conceptual and epistemic shifts through (re)imaginings of composing, education, and algorithmic lifeworlds by drawing from multiple fields to explore composition as a sociotechnical and “culturing” (Arola, 2017) enterprise and turning our attention to composition as a set of complex relations among sociocultural, symbolic, and material mediational means (Freedman et al., 2016). The work of this special issue highlights the ethical and socio-political implications of composing with and for imagined and unimaginable audiences (De Kosnik, 2021; Dixon-Román et al., 2020). As Enriquez et al. (this issue) emphasize, this is a moment requiring both recognition of the pasts and presents of algorithms in contemporary society, particularly in relation to how we create, compose, and interpret meaning, and a reimagining of what it means to reconfigure compositional technologies and lived, embodied enactments (Watson and Marciano, 2023) toward more just futures (Pangrazio et al., 2022; Wairegi et al., 2021).

Across the pieces of this special issue, the authors assert the urgency of complicating, playing with and questioning generative artificial intelligence in ways that humanize the activity of creation with emerging technologies, attending to individual and collective needs, dreams, and desires. As the intergenerational authorship team Stornaiuolo et al. (this issue) suggest, playful experimentation with AI composition can become a form of “critical play” (McBride et al., 2023) that engages youth in joyful activity that supports contemplation of everyday dilemmas around the digital. Thinking as both a writer and writing instructor, Li (this issue) explores creative and emancipatory potential in cyborg composing (Haraway, 1991; Lizárraga, 2023), inviting writers and writing instructors to consider how they might more intentionally resist closure, illusions of objectivity, and corpus ideologies. In a similar vein, Jerasa and Burriss (this issue) explore relationships between and across human and algorithmic moments of composing in forums such as TikTok. The authors extend critical and post-humanist literacy theorizing in their analysis of content creators who post to TikTok’s #BookTok subcommunity and navigate the platform’s algorithm as a negotiated endeavour between and across human audiences and machine algorithms.

We support the calls from scholars in this special issue to examine how various learning contexts can be leveraged to complicate futures of artificial intelligence by interrogating power and ethics (Vakil and Higgs, 2019) and engaging speculative framings (Toliver, 2021). As explored by Thrall et al. (this issue), incorporating speculative fiction writing can foster a critical understanding of AI’s broader societal implications among learners. Such an approach not only supports the development of critical digital composition literacies but also encourages a deeper engagement with the ethical dimensions of artificial intelligence use in society more broadly. Fassbender (this issue) notes that ethical deployment of AI in education should also include safeguards for data privacy and mechanisms to uphold academic integrity, while also harnessing AI’s capabilities to enhance educational outcomes and teacher well-being.

The need to recognize and address the unique challenges faced by Black, Latinx/e, LGBTQIA+ and other historically marginalized groups within the rapidly changing technoscape is a critical point of urgency, as emphasized in this collection of articles. Authors underscore the imperative to “engage in knowledge co-creation with a heightened awareness of settler colonial and anti-Black processes” (Stewart and Uanhoro, 2023, p. 121). McKnight and Shipp (this issue), for example, draw attention to the need to interrogate the gendered and racialized power relations embedded in the design and deployment of generative AI writing tools. They also suggest how scholarship shaped by intersectional critical lenses (in their work, theories of social and planetary justice informed by First Nations and feminist epistemologies) contributes novel perspectives that challenge “coloniser tool-thinking.” Relatedly, Tanksley’s (this issue) centring of the experiences of Black high school students in a critical race technology course demonstrates the vital importance of making visible the logics of white supremacy and anti-Blackness to young people, and to teachers, administrators, and school staff who may promote “objective” approaches to artificial intelligence in education. These studies suggest that explicitly focusing on equity, power and community interests facilitates ethical engagement with the complexities of AI and foregrounds empowerment and inclusion. The articles emphasize the creation and use of emerging technologies that support agentive participation, aligning with the broader goals of social justice and community-based activism. A significant objective and challenge of a post-digital age remains forging and sustaining relationships with community stakeholders to ensure that technological implementations are relevant, accessible and responsive to their needs (Elsinbawi et al., 2023; Gutiérrez et al., 2019; Vakil et al., 2023).

These scholars’ works orient us toward four shifts in pedagogies, epistemologies, and practices that can help us begin addressing the rupture points posed by generative artificial intelligence: recognition of algorithms, pasts and presents in contemporary lived experiences; complicating, playing with and questioning generative AI in humanizing ways; leveraging learning contexts to complicate AI futures and ethics; and critically affirming and addressing challenges and hopes of historically marginalized groups across fluid, changing technoscapes. Taken together, these articles push us to consider the ethical integration of emergent technologies in educational settings, recognizing the intertwined nature of technology, society, and learning. We encourage the field to continue exploring conceptually and empirically how integration of AI and other emergent technologies in education can be responsive to the goals, interests and contexts of diverse learning communities, and how emancipatory and decolonizing futures within shifting sociotechnical ecologies may be conceived and actualized.

References

Arola, K.L. (2017), “Composing as culturing: an American Indian approach to digital ethics”, in Mills, K.A., Stornaiuolo, A., Smith, A. and Pandya, J.Z. (Eds), Handbook of Writing, Literacies, and Education in Digital Cultures, Routledge, pp. 275-284.

Bartholomew, J. (2023), “Q&A: uncovering the labor exploitation that powers AI”, Columbia Journalism Review, available at: www.cjr.org/tow_center/qa-uncovering-the-labor-exploitation-that-powers-ai.php

Buolamwini, J. (2023), Unmasking AI: My Mission to Protect What is Human in a World of Machines, Random House.

Buolamwini, J. and Gebru, T. (2018), “Gender shades: intersectional accuracy disparities in commercial gender classification”, in Conference on Fairness, Accountability and Transparency, PMLR, pp. 77-91.

Crawford, K. and Joler, V. (2018), “Anatomy of an AI system [digital image file]”, The Museum of Modern Art, New York, NY, United States, available at: www.moma.org/collection/works/401279

De Kosnik, A. (2021), Rogue Archives: Digital Cultural Memory and Media Fandom, MIT Press.

Dixon-Román, E., Nichols, T.P. and Nyame-Mensah, A. (2020), “The racializing forces of/in AI educational technologies”, Learning, Media and Technology, Vol. 45 No. 3, pp. 236-250.

Elsinbawi, M., Norris, A., Cohen, A. and Paley, M. (2023), “Culturally responsive computing in teacher training: designing towards the transformative learning of girls in STEM”, International Journal of Computer Science Education in Schools, Vol. 6 No. 2, doi: 10.21585/ijcses.v6i2.179.

Freedman, S.W., Hull, G.A., Higgs, J. and Booten, K. (2016), “Teaching writing in a digital and global age: toward access, learning, and development for all”, in Bell, C. and Gitomer, D. (Eds), Handbook of Research on Teaching, 5th ed., American Educational Research Association, pp. 1389-1449.

Gilbert, A.S. (2018), “Algorithmic culture and the colonization of life-worlds”, Thesis Eleven, Vol. 146 No. 1, pp. 87-96, doi: 10.1177/0725513618776699.

Gulson, K.N. and Witzenberger, K. (2022), “Repackaging authority: artificial intelligence, automated governance and education trade shows”, Journal of Education Policy, Vol. 37 No. 1, pp. 145-160, doi: 10.1080/02680939.2020.1785552.

Gutiérrez, K.D., Becker, B., Espinoza, M., Cortes, K., Cortez, A., Lizárraga, J.R., Rivero, E., Villegas, K. and Yin, P. (2019), “Youth as historical actors in the production of possible futures”, Mind, Culture, and Activity, Vol. 26 No. 4, pp. 291-308.

Habermas, J. (1987), The Theory of Communicative Action, Vol. 2, Polity Press.

Haraway, D. (1991), Simians, Cyborgs, and Women: The Reinvention of Nature, Routledge.

Hughes-Warrington, M. (2022), “Toward the recognition of artificial history makers”, History and Theory, doi: 10.1111/hith.12278.

Jandrić, P., Knox, J., Besley, T., Ryberg, T., Suoranta, J. and Hayes, S. (2018), “Postdigital science and education”, Educational Philosophy and Theory, Vol. 50 No. 10, doi: 10.1080/00131857.2018.1454000.

Kuhn, T.S. (1962), The Structure of Scientific Revolutions, University of Chicago Press.

LeBlanc, R., Aguilera, E., Burriss, S., de Roock, R., Fassbender, W., Monea, B., Nichols, T.P., Pandya, J.Z., Robinson, B., Smith, A. and Stornaiuolo, A. (2023), Digital Platforms and the ELA Classroom, National Council of Teachers of English.

Lizárraga, J.R. (2023), “Cyborg sociopolitical reconfigurations: designing for speculative fabulation in learning”, Journal of the Learning Sciences, Vol. 32 No. 1, pp. 21-44.

McBride, C., Smith, A. and Kalir, J.H. (2023), “Tinkering toward teacher learning: a case for critical playful literacies in teacher education”, English Teaching: Practice and Critique, Vol. 22 No. 2, pp. 221-233, doi: 10.1108/ETPC-08-2022-0114.

Maughan, T., Saheli Singh, S., Marshall, L. and Lueng, H. (2022), “#tresdancing [film]”, Screening Surveillance, available at: www.screeningsurveillance.com/

Ngugi Wa Thiong’o (1986), Decolonising the Mind: The Politics of Language in African Literature, J. Currey.

Nichols, T.P. and Dixon-Román, E. (2024), “Platform governance and education policy: power and politics in emerging EdTech ecologies”, Educational Evaluation and Policy Analysis, doi: 10.3102/01623737231202469.

Nichols, T.P. and Garcia, A. (2022), “Platform studies in education”, Harvard Educational Review, Vol. 92 No. 2, pp. 209-230.

Noble, S.U. (2018), Algorithms of Oppression, New York University Press.

Pangrazio, L., Stornaiuolo, A., Nichols, T.P., Garcia, A. and Philip, T.M. (2022), “Datafication meets platformization: materializing data processes in teaching and learning”, Harvard Educational Review, Vol. 92 No. 2, pp. 257-283.

Roberts, S.T. (2014), Behind the Screen: The Hidden Digital Labor of Commercial Content Moderation, doctoral dissertation, University of Illinois at Urbana-Champaign.

Robinson, B. (2023), “Speculative propositions for digital writing under the new autonomous model of literacy”, Postdigital Science and Education, Vol. 5 No. 1, pp. 117-135.

Shew, A. (2020), “Ableism, technoableism, and the future of AI”, IEEE Technology and Society Magazine, Vol. 39 No. 1, doi: 10.1109/MTS.2020.2967492.

Small, Z. (2023), “Black artists say A.I. shows bias, with algorithms erasing their history”, The New York Times, July 3, available at: www.nytimes.com/2023/07/04/arts/design/black-artists-bias-ai.html

Stewart, N.D. and Uanhoro, J. (2023), “The quantitative debt owed to Africa: a call to action”, Alliance for African Partnership Perspectives, Vol. 2 No. 1, pp. 121-129.

Stornaiuolo, A., Higgs, J., Nichols, T.P., LeBlanc, R. and de Roock, R. (2024), “The platformization of writing instruction: considering educational equity in new learning ecologies”, Review of Research in Education.

Stornaiuolo, A., Smith, A. and Phillips, N. (2017), “Developing a transliteracies framework for a connected world”, Journal of Literacy Research, Vol. 49 No. 1, pp. 68-91.

Toliver, S.R. (2021), Recovering Black Storytelling in Qualitative Research: Endarkened Storywork, Routledge.

Vakil, S. and Higgs, J. (2019), “It’s about power: a call to rethink ethics and equity in computing education”, Communications of the ACM, Vol. 62 No. 3, pp. 31-33.

Vakil, S., Reith, A. and Melo, N.A. (2023), “Jamming power: youth agency and community‐driven science in a critical technology learning program”, Journal of Research in Science Teaching, Vol. 60 No. 8, pp. 1723-1754.

Wairegi, A., Omino, M. and Rutenberg, I. (2021), “AI in Africa: framing AI through an African lens”, Communication, Technologies et Développement, No. 10.

Watson, V.W.M. and Marciano, J.E. (2023), “These tellings: explosive love as literacy research”, Journal of Literacy Research, Vol. 55 No. 2, pp. 130-144.

Williamson, B. and Eynon, R. (2020), “Historical threads, missing links, and future directions in AI in education”, Learning, Media and Technology, Vol. 45 No. 3, pp. 223-235, doi: 10.1080/17439884.2020.1798995.


