Korean Journal of Psychology : General


[ Article ]
The Korean Journal of Psychology: General - Vol. 40, No. 4, pp. 487-509
ISSN: 1229-067X (Print)
Print publication date 25 Dec 2021
Received: 14 Dec 2021; Accepted: 20 Dec 2021
DOI: https://doi.org/10.22257/kjp.2021.12.40.4.487


Effect of user characteristics on artificial intelligence acceptability and intention to use artificial intelligence-based products
Namhee Kim1) ; Jong An Choi2)
1)Center for Happiness Studies, Seoul National University
2)Department of Psychology, Kangwon National University
Correspondence to: Jong An Choi, Department of Psychology, College of Social Sciences, Kangwon National University, 1 Gangwondaehak-gil, Chuncheon-si, Gangwon-do 24341, Republic of Korea. Tel: 033-250-6856, E-mail: jonganchoi@kangwon.ak.kr


Abstract

This study examined the effect of user characteristics on the acceptability of artificial intelligence (AI) technology. Specifically, the effects of user perceptions of AI, personality traits, and demographic characteristics on the acceptability of AI technology were tested. The results showed that user perception was the most important factor for AI acceptability; in particular, the degree to which AI devices or services are perceived as useful (performance expectancy) was closely related to acceptance of AI technology. For anxiety about AI, personality traits and user perceptions were more important than users' demographic characteristics, with openness among the Big Five factors and anthropomorphism among the user perceptions emerging as the key predictors. For intention to use AI-based products, as with acceptability, user perception had the greatest influence, and hedonic motivation and social influence were the most important of the user perceptions. Finally, the implications of these findings and suggestions for future research are discussed.
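The abstract's key claims compare the relative importance of correlated predictor blocks (user perceptions, personality traits, demographics) for AI acceptability, AI anxiety, and use intention. As a minimal sketch of how such comparisons are commonly quantified, the Python code below implements Johnson's (2000) relative weight analysis; the variable names and simulated data are hypothetical illustrations, and this is one standard approach rather than necessarily the authors' exact procedure.

import numpy as np

def relative_weights(X, y):
    # Johnson's (2000) relative weights: partition R^2 among correlated predictors.
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float).ravel()
    # Correlations among predictors and between each predictor and the outcome
    Rxx = np.corrcoef(X, rowvar=False)
    rxy = np.array([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
    # Symmetric square root of Rxx via its eigendecomposition
    evals, evecs = np.linalg.eigh(Rxx)
    sqrt_Rxx = evecs @ np.diag(np.sqrt(evals)) @ evecs.T
    beta = np.linalg.solve(sqrt_Rxx, rxy)          # Rxx^(-1/2) @ rxy
    raw = (sqrt_Rxx ** 2) @ (beta ** 2)            # raw weights sum to the model R^2
    return raw, raw / raw.sum()                    # raw and rescaled (share of R^2) weights

# Hypothetical illustration with three correlated predictor composites
rng = np.random.default_rng(0)
n = 500
perception = rng.normal(size=n)
personality = 0.4 * perception + rng.normal(size=n)
demographics = rng.normal(size=n)
acceptability = 0.6 * perception + 0.2 * personality + 0.05 * demographics + rng.normal(size=n)
X = np.column_stack([perception, personality, demographics])
raw, rescaled = relative_weights(X, acceptability)
print(dict(zip(["perception", "personality", "demographics"], np.round(rescaled, 3))))

The rescaled weights express each predictor's share of the explained variance, which is how statements such as "user perception was the most important factor" are typically operationalized when predictors are correlated.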


Keywords: Artificial intelligence acceptance, Anthropomorphism, Personality, Artificial intelligence anxiety, Intention to use AI

Acknowledgments

This research was supported by the Korean Happiness Longitudinal Study 4 (0404-20210003) of the Center for Happiness Studies, Seoul National University, in 2021.

