Integrating ChatGPT as a Learning Tool: Potential Benefits and Critical Considerations

Authors

D. Petrassi

Keywords:

Media education, Artificial Intelligence, Engaging learning, Digital technologies, ChatGPT

Abstract

This article examines the potential integration of OpenAI's ChatGPT into educational contexts, with particular attention to its capacity to enhance student engagement in learning settings. To investigate ChatGPT's potential to deliver personalized educational content to diverse groups of learners, structured interviews were conducted between different types of learners and the chatbot. These interviews were designed to simulate hypothetical elementary-level educational interactions that mirror real-world situations. Although the results indicate that ChatGPT can adapt to a range of educational needs and styles, facilitating the development of more accessible and engaging learning experiences, several limitations were identified, such as the lack of emotional intelligence and the potential to diminish critical thinking, underscoring the need for cautious integration and continuous monitoring of AI technologies in educational settings. The researcher proposes a balanced approach to AI integration, emphasizing the potential synergy between AI tools and traditional learning methods.
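Purely as an illustration (the code below is not drawn from the article; the learner profile, questions, and model name are assumptions), a structured interview of this kind can be scripted so that each simulated learner profile receives the same question sequence, making the chatbot's adaptations comparable across profiles. The sketch assumes the OpenAI Python SDK (openai>=1.0) and an OPENAI_API_KEY set in the environment.

```python
# Hypothetical sketch of one simulated "structured interview" with ChatGPT
# for a single learner profile; not the study's actual protocol.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Illustrative learner profile (assumption, not taken from the article)
learner_profile = (
    "You are tutoring an 8-year-old elementary school pupil who is a visual "
    "learner and is easily discouraged. Use simple language, short sentences, "
    "and concrete everyday examples."
)

# Illustrative question set, reused across simulated learner profiles
interview_questions = [
    "Can you explain why the seasons change?",
    "I still don't get it. Can you explain it in another way?",
]

messages = [{"role": "system", "content": learner_profile}]
for question in interview_questions:
    messages.append({"role": "user", "content": question})
    reply = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=messages,
        temperature=0.7,
    )
    answer = reply.choices[0].message.content
    messages.append({"role": "assistant", "content": answer})
    print(f"Q: {question}\nA: {answer}\n")
```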

Published

2024-08-13

How to Cite

Petrassi, D. (2024). Integrating ChatGPT as a learning tool: Potential benefits and critical considerations. Formazione & Insegnamento, 22, 7328. Retrieved from https://ojs.pensamultimedia.it/index.php/siref/article/view/7328

Issue

Section

Articles

Categories