Found 2 Documents

From Opposition to Collaboration: The Evolution of Participatory Culture in The LinaBell IP Generation
Yi, Liu; Rui, Chen
Mandarinable: Journal of Chinese Studies Vol. 3 No. 1 (2024): MANDARINABLE: Journal of Chinese Studies
Publisher : Published by Confucius Institute UNS

DOI: 10.20961/mandarinable.v3i1.820

Abstract

The rapid rise of LinaBell has been a carnival driven by online user participation. It not only reflects the active agency and creative initiative of audiences in participatory culture but also advances the development of participatory texts, transforming the relationship between producers and recipients from opposition to collaboration. Throughout this process, both parties benefit and fulfill their respective needs: creators generate higher commercial value at lower production cost, while recipients gain immersive and interactive aesthetic enjoyment through their engagement. LinaBell's success is also related to the distinctive form of fan culture in China. Furthermore, it ingeniously sidesteps the long-standing copyright issues that have hindered user participation in creative processes. However, it is important to acknowledge that this type of storyless intellectual property (IP) has both advantages and limitations.
Estimation of Confidence in the Dialogue based on Eye Gaze and Head Movement Information
Dewen, Cui; Akihiro, Matsufuji; Yi, Liu; Sato-Shimokawa, Eri; Yamaguchi, Toru
EMITTER International Journal of Engineering Technology Vol 10 No 2 (2022)
Publisher : Politeknik Elektronika Negeri Surabaya (PENS)

DOI: 10.24003/emitter.v10i2.756

Abstract

In human-robot interaction, human mental states during dialogue have attracted attention in the development of human-friendly robots for educational use. Although mental-state estimation from speech and visual information has been studied, precisely estimating mental states in educational settings remains challenging. In this paper, we propose a method to estimate a human mental state from participants' eye gaze and head movement information, treating participants' confidence in their answers to miscellaneous-knowledge questions as the mental state to be estimated. Participants' non-verbal information, such as eye gaze and head movements during dialogue with a robot, was collected in our experiment using an eye-tracking device. We then collected participants' confidence levels and analyzed the relationship between this mental state and the non-verbal information. Furthermore, we applied a machine learning technique to estimate participants' confidence levels from features extracted from the gaze and head movement information. As a result, the machine learning technique using gaze and head movement information achieved over 80% accuracy in estimating confidence levels. Our research provides insight into developing human-friendly robots that take human mental states in dialogue into account.
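The pipeline the abstract describes (extract gaze/head features, then train a classifier to predict confidence) can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' implementation: the feature names (fixation duration, gaze dispersion, head pitch/yaw variance), the random-forest model, and the data distributions are all assumptions, since the abstract specifies only "gaze and head movement information" and "a machine learning technique".

```python
# Hypothetical sketch of confidence estimation from gaze/head features.
# Features, data, and classifier choice are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 200
# Synthetic features per trial: [fixation duration, gaze dispersion,
# head pitch variance, head yaw variance] -- confident answers assumed
# to show longer fixations and steadier head movement.
X_confident = rng.normal(loc=[0.8, 0.2, 0.1, 0.1], scale=0.15, size=(n, 4))
X_unsure = rng.normal(loc=[0.4, 0.6, 0.4, 0.5], scale=0.15, size=(n, 4))
X = np.vstack([X_confident, X_unsure])
y = np.array([1] * n + [0] * n)  # 1 = confident, 0 = not confident

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
acc = accuracy_score(y_test, clf.predict(X_test))
print(f"test accuracy: {acc:.2f}")
```

On real data, the features would come from eye-tracker logs aligned with the robot-dialogue trials, and accuracy around the paper's reported 80% would depend on the actual feature extraction and labeling.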