Jezik, književnost i empatija (2025) (pp. 22-34)
АУТОР(И) / AUTHOR(S): Nune Ayvazyan
DOI: 10.46793/LLE25.022A
САЖЕТАК / ABSTRACT:
This article examines artificial empathy in large language models (LLMs) and its implications for social interaction and translation. Artificial empathy is defined as the simulation of cognitive empathy, that is, the recognition of emotional states and context-appropriate response to them without felt emotion, emerging from pretraining on narrative-rich corpora and from alignment procedures that reward prosocial language. The article shows how politeness and validation function as trust signals, which may explain why users experience LLMs as caring. Two use cases illustrate the benefits and risks: (1) migrants and linguistic minorities, for whom conversational systems can lower barriers to participation in host-society environments; and (2) young users seeking emotional support, for whom simulated empathy may help in low-to-moderate-stakes exchanges but can miscalibrate trust in high-stakes situations. Building on Putnam’s distinction between thin and thick trust, a calibration framework is proposed that links empathic style to confidence through transparency, privacy protection, and human involvement in high-stakes situations. A further argument is that literature is a crucial component of model training data and can shape the quality of simulated empathy.
КЉУЧНЕ РЕЧИ / KEYWORDS:
artificial intelligence, large language models, empathy, communication, translation
ЛИТЕРАТУРА / REFERENCES:
- Ayers, J. W., Poliak, A., Dredze, M., Leas, E. C., Zhu, Z., Kelley, J. B., Faix, D. J., Goodman, A. M., Longhurst, C. A., Hogarth, M., & Smith, D. M. (2023). Comparing Physician and Artificial Intelligence Chatbot Responses to Patient Questions Posted to a Public Social Media Forum. JAMA Internal Medicine, 183(6), 589-596. https://doi.org/10.1001/jamainternmed.2023.1838.
- Ayvazyan, N., & Pym, A. (2018). Mediation Choice in Immigrant Groups. A Study of Russian Speakers in Southern Catalonia. Language Problems and Language Planning, 42(2), 344-364.
- Barker, R. L. (2008). The Social Work Dictionary. Washington, DC: NASW Press.
- Busch, D. (2024). AI Translation and Intercultural Communication: New Questions for a New Field of Research. SocArXiv. https://doi.org/10.31235/osf.io/r3zdx.
- Cuff, B. M. P., Brown, S. J., Taylor, L., & Howat, D. J. (2016). Empathy: A Review of the Concept. Emotion Review, 8(2), 144-153. https://doi.org/10.1177/1754073914558466.
- Davidson, H. (2025). In Taiwan and China, Young People Turn to AI Chatbots for ‘Cheaper, Easier’ Therapy. https://www.theguardian.com/world/2025/may/22/ai-therapy-therapist-chatbot-taiwan-china-mental-health.
- Decety, J. (2010). The Neurodevelopment of Empathy in Humans. Developmental Neuroscience, 32(4), 257-267. https://doi.org/10.1159/000317771.
- Duarte, F. (2025). Number of ChatGPT Users (March 2025). https://explodingtopics.com/blog/chatgpt-users.
- ELIS. (2025). European Language Industry Survey. Trends, Expectations and Concerns of the European Language Industry.
- European Commission. (2020). White Paper on Artificial Intelligence: A European Approach to Excellence and Trust. https://commission.europa.eu/publications/white-paper-artificial-intelligence-european-approach-excellence-and-trust_en.
- Fiedler, S., & Wohlfarth, A. (2018). Language Choices and Practices of Migrants in Germany: An Interview Study. Language Problems and Language Planning, 42(3), 267-287.
- Goldman, A. I. (2006). Simulating Minds: The Philosophy, Psychology, and Neuroscience of Mindreading. New York: Oxford University Press.
- Hoffman, B. D., Oppert M. L., & Owen, M. (2024). Understanding Young Adults’ Attitudes towards Using AI Chatbots for Psychotherapy: The Role of Self-stigma. Computers in Human Behavior: Artificial Humans, 2(2). https://doi.org/10.1016/j.chbah.2024.100086.
- Howarth, J. (2025). Most Visited Websites in the World (August 2025). https://explodingtopics.com/blog/most-visited-websites.
- Hu, K. (2023). ChatGPT Sets Record for Fastest-growing User Base. https://www.reuters.com/technology/chatgpt-sets-record-fastest-growing-user-base-analyst-note-2023-02-01/.
- Hume, D. (1739). A Treatise of Human Nature. https://www.files.ethz.ch/isn/125487/5010_Hume_Treatise_Human_Nature.pdf
- Lopez, R. (2010). Empathy 101. How Do We Empathize? https://www.psychologytoday.com/intl/blog/our-social-brains/201007/empathy-101.
- McLennan, A. (2025). Young Australians Using AI Bots for Therapy. https://www.abc.net.au/news/2025-05-19/young-australians-using-ai-bots-for-therapy/105296348.
- Microsoft. (n.d.). Why Using a Polite Tone with AI Matters. https://www.microsoft.com/en-us/worklab/why-using-a-polite-tone-with-ai-matters.
- Montemayor, C., Halpern, J., & Fairweather A. (2022). In Principle Obstacles for Empathic AI: Why We Can’t Replace Human Empathy in Healthcare. AI and Society, 37, 1353–1359. https://doi.org/10.1007/s00146-021-01230-z.
- Ovsyannikova, D., Oldemburgo de Mello, V., & Inzlicht, M. (2025). Third-party Evaluators Perceive AI as More Compassionate than Expert Humans. Communications Psychology, 3(4). https://doi.org/10.1038/s44271-024-00182-6.
- Pena, C. (2023). Public Service Interpreting and Translation (PSIT) as a Social Integration Tool. New Voices in Translation Studies, 14(1), 74-99.
- Penagos-Corzo, J. C., van-Hasselt, M. C., Escobar, D., Vázquez-Roque, R. A., & Flores, G. (2022). Mirror Neurons and Empathy-related Regions in Psychopathy: Systematic Review, Meta-analysis, and a Working Model. Social Neuroscience, 17(5), 462–479. https://doi.org/10.1080/17470919.2022.2128868.
- Perry, A. (2023). AI Will Never Convey the Essence of Human Empathy. Nature Human Behaviour, 7, 1808–1809. https://doi.org/10.1038/s41562-023-01675-w.
- Pokorn, N. K., & Čibej, J. (2018). Interpreting and Linguistic Inclusion – Friends or Foes? Results from a Field Study. The Translator, 24(2), 111-127.
- Putnam, R. D. (2000). Bowling Alone. The Collapse and Revival of American Community. New York: Simon & Schuster.
- Pym, A. (2025). Risk Management in Translation. Cambridge: Cambridge University Press.
- PytlikZillig, L. M., & Kimbrough C. D. (2016). Consensus on Conceptualizations and Definitions of Trust: Are We There Yet? In E. Shockley, T. M. S. Neal, L. M. PytlikZillig, & B. H. Bornstein (Eds.), Interdisciplinary Perspectives on Trust: Towards Theoretical and Methodological Integration (pp. 17–47). Springer International Publishing/Springer Nature.
- Schaaff, K., Reinig, C., & Schlippe, T. (2023). Exploring ChatGPT’s Empathic Abilities. 11th International Conference on Affective Computing and Intelligent Interaction (ACII). https://arxiv.org/abs/2308.03527.
- Skjuve, M., Følstad, A., & Brandtzaeg, P. B. (2023). The User Experience of ChatGPT: Findings from a Questionnaire Study of Early Users. Proceedings of the 5th International Conference on Conversational User Interfaces (CUI ’23), 2, 1–10. Association for Computing Machinery, New York, USA. https://doi.org/10.1145/3571884.3597144.
- Sorin, V., Brin, D., Barash, Y., Konen, E., Charney, A., Nadkarni, G., & Klang, E. (2024). Large Language Models and Empathy: Systematic Review. Journal of Medical Internet Research, 26, e52597.
- Stenseke, J., & Tagesson, A. (2025). The Prospects of Artificial Empathy: A Question of Attitude? In Seibt, J., Fazekas, P., & Quick, O. S. (Eds.), Social Robots with AI: Prospects, Risks, and Responsible Methods. Frontiers in Artificial Intelligence and Applications.
- Stojiljković, S., Djigić, G., & Zlatković, B. (2012). Empathy and Teachers’ Roles. Procedia – Social and Behavioral Sciences, 69, 960-966. https://doi.org/10.1016/j.sbspro.2012.12.021.
- Stotland, E., Matthews, K. E., Sherman, S., Hansson, R. O., & Richardson, B. Z. (1978). Empathy, Fantasy and Helping. Beverly Hills, CA: Sage.
- Tagesson, A., & Stenseke, J. (2024). Do You Feel Like A(I) Feel? Frontiers in Psychology, 15. https://doi.org/10.3389/fpsyg.2024.1347890.
- Tidy, J. (2024). Character.ai: Young People Turning to AI Therapist Bots. https://www.bbc.com/news/technology-67872693.
- United Nations. (2025). International Migrant Stock 2024. Key Facts and Figures. https://www.un.org/development/desa/pd/sites/www.un.org.development.desa.pd/files/undesa_pd_2025_intlmigstock_2024_key_facts_and_figures_advance-unedited.pdf.
- Vitalaru, B. (2024). PSIT and Communication, Collaboration and Inclusion: Current Needs, Challenges and Proposals. FITISPos International Journal, 11(2), 1-11. https://doi.org/10.37536/FITISPos-IJ.2024.11.2.421.
- Wilkins, J. (2025). Sam Altman Admits That Saying “Please” and “Thank You” to ChatGPT Is Wasting Millions of Dollars in Computing Power. https://futurism.com/altman-please-thanks-chatgpt.
- World Health Organization. (2024). Mental Health of Adolescents. https://www.who.int/news-room/fact-sheets/detail/adolescent-mental-health.
- Yousif, N. (2025). Parents of Teenager Who Took His Own Life Sue OpenAI. BBC News. https://www.bbc.com/news/articles/cgerwp7rdlvo.
- Yuan, Y., Su, M., & Li, X. (2024). What Makes People Say Thanks to AI. In Degen, H., & Ntoa, S. (Eds.), Artificial Intelligence in HCI. HCII 2024. Lecture Notes in Computer Science, vol 14734. Springer, Cham. https://doi.org/10.1007/978-3-031-60606-9_9.
