This paper describes the use of an affective avatar that detects and shares non-verbal affective data in a 2D remote meeting application. It presents the foundations of the design and compares three communication modes in 2D remote meeting applications (audio-only, video, and affective avatar) with respect to emotion sharing, social presence, and user preference. A prototype application was developed to detect facial expressions and hand gestures, transmit the resulting affective data, and animate these expressions on an affective avatar using emoji cues. A user study with 18 participants was conducted in which participants performed a collaborative drawing task using the prototype application. The study found no notable differences in emotion sharing across the conditions, but both the video and affective-avatar modes fostered a stronger sense of social presence than the audio-only mode. Although there was no significant quantitative difference between the video and affective-avatar modes, participants generally preferred either of them over audio-only. Future work will explore methods for improving detection accuracy under challenging lighting conditions and for improving the performance of the prototype application. As the research progresses, the prototype may be extended with additional channels for sharing non-verbal affective information, such as physiological signals and body movements.