Analysis of the Role of Internal Audit in Enhancing Company's Internal Control
Sari, Dian Kartika
Golden Ratio of Auditing Research Vol. 2 No. 1 (2022): July - January
Publisher : Manunggal Halim Jaya

DOI: 10.52970/grar.v2i1.369

Abstract

This study aims to explore the role of internal audit in enhancing a company's internal control system by examining internal audit practices, perceptions of internal control effectiveness, and the impact of internal audit recommendations. The research employs a qualitative approach, utilizing interviews and document analysis to gather data from participants representing various organizations. Results reveal significant variations in internal audit practices influenced by organizational size, industry dynamics, and regulatory frameworks. Despite these variations, internal audit functions universally serve as guardians of organizational integrity, providing independent assurance regarding internal control effectiveness. Participants expressed diverse perceptions of internal control effectiveness, citing resource constraints, inadequate training, and communication gaps as common challenges. However, internal audit was unanimously recognized as crucial in enhancing internal controls, offering independent assurance and promoting a culture of compliance and ethical behavior. Internal audit recommendations often led to tangible improvements in internal control systems, although challenges related to resource allocation and change management were reported. Theoretical implications highlight the multifaceted nature of internal audit functions and their critical role in organizational governance, while managerial implications emphasize the need for strategic resource allocation, prioritization, and change management initiatives. This research contributes to the understanding of internal audit practices and their impact on organizational effectiveness.
Detection of Vulgarity in Anime Character: Implementation of Detection Transformer
Suciati, Amalia; Sari, Dian Kartika; Yunus, Andi Prademon; Amaliah, Nuuraan Rizqy
JURNAL TEKNIK INFORMATIKA Vol. 18 No. 1
Publisher : Department of Informatics, Universitas Islam Negeri Syarif Hidayatullah

DOI: 10.15408/jti.v18i1.46064

Abstract

Vulgar and pornographic content has become a widespread issue on the internet, appearing in various fields, including anime. Vulgar pornographic content in anime is not limited to the sexuality genre; anime from general genres such as action, adventure, and others also contains vulgar visuals. The main focus of this research is the implementation of the Detection Transformer (DETR) object detection method to identify vulgar parts of anime characters, particularly female characters. DETR is a deep learning model designed for object detection tasks that adapts the attention mechanism of Transformers. The dataset consists of 800 images taken from popular anime, selected by viewership rankings, which were augmented to a total of 1,689 images. The research involved training models with different backbones, specifically ResNet-50 and ResNet-101, each with dilated convolution applied at different stages. The results show that the DETR model with a ResNet-50 backbone and dilated convolution at stage 5 outperformed the other backbones and dilation configurations, achieving a mean Average Precision of 0.479 and  of 0.875. A further finding is that although dilated convolution improves small-object detection by enlarging the receptive field, applying it in early stages tends to reduce spatial detail and harm performance on medium and large objects. However, the primary focus of this research is not solely on achieving the highest performance but on exploring the potential of transformer-based models, such as DETR, for detecting vulgar content in anime. DETR benefits from its ability to understand spatial context through self-attention mechanisms, offering potential for further development with larger datasets, more complex architectures, or training at larger data scales.
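The receptive-field effect the abstract describes can be illustrated with the standard receptive-field recurrence for stacked convolutions: each layer adds (kernel − 1) × dilation × jump pixels, where the jump is the product of the preceding strides. The sketch below is illustrative only (it is not code from the paper) and uses hypothetical layer configurations to show why applying dilation enlarges the receptive field without adding parameters:

```python
def receptive_field(layers):
    """Compute the receptive field of a stack of conv layers.

    layers: list of (kernel_size, stride, dilation) tuples,
    applied in order from input to output.
    """
    rf = 1    # receptive field of the input itself
    jump = 1  # distance (in input pixels) between adjacent output pixels
    for kernel, stride, dilation in layers:
        # a dilated kernel spans dilation * (kernel - 1) + 1 input positions
        rf += (kernel - 1) * dilation * jump
        jump *= stride
    return rf

# Three 3x3 convs, stride 1, no dilation: receptive field of 7 pixels.
plain = receptive_field([(3, 1, 1), (3, 1, 1), (3, 1, 1)])

# Same stack with dilation 2 throughout: receptive field grows to 13
# with the same number of weights, at the cost of sparser sampling.
dilated = receptive_field([(3, 1, 2), (3, 1, 2), (3, 1, 2)])

print(plain, dilated)  # → 7 13
```

This also hints at the trade-off reported above: dilating early layers widens coverage quickly, but the sparser sampling skips neighboring pixels, which is consistent with the loss of spatial detail on medium and large objects.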