Location
Kota Adm. Jakarta Barat,
DKI Jakarta,
Indonesia
ComTech: Computer, Mathematics and Engineering Applications
ISSN: 2087-1244     EISSN: 2476-907X
The journal invites professionals in education, research, and entrepreneurship to disseminate ideas, concepts, new theories, or scientific developments in the fields of Information Systems, Architecture, Civil Engineering, Computer Engineering, Industrial Engineering, Food Technology, Computer Science, Mathematics, and Statistics through this scientific journal.
Articles: 1,591 documents
Comparative Performance Analysis of Object-Oriented Programming and Data-Oriented Programming in TensorFlow
Mangapul Siahaan; Jefriyanto Chandra; Muhamad Dody Firmansyah
ComTech: Computer, Mathematics and Engineering Applications Vol. 17 No. 1 (2026): ComTech
Publisher : Bina Nusantara University

DOI: 10.21512/comtech.v17i1.14648

Abstract

The rapid advancement of deep learning significantly increases computational demands, making performance optimization essential for model scalability and deployment. While numerous studies optimize neural network architectures, the effect of different programming paradigms on computational efficiency remains insufficiently explored. This study aims to compare the Object-Oriented Programming (OOP) and Data-Oriented Programming (DOP) paradigms in TensorFlow-based deep learning workflows, focusing on their performance across four processing phases (build, compile, train, and evaluate) under a controlled experimental environment with repeated iterations and systematic measurements. Both paradigms are implemented using identical Convolutional Neural Network (CNN) architectures trained on the CIFAR-100 image dataset over thirty controlled experimental iterations. A custom profiler integrating the Python System and Process Utilities (psutil) and the NVIDIA Management Library (pynvml) monitors real-time system performance, capturing CPU and GPU utilization as well as memory usage. The results reveal that DOP achieves better resource efficiency, with lower memory usage (549.98 MB versus 676.25 MB), higher GPU utilization (64.68% versus 61.08%), and faster evaluation execution (1.50 seconds versus 2.59 seconds), while also attaining higher model accuracy (32.38% versus 28.08%). In contrast, OOP benefits from TensorFlow's Sequential API optimizations, resulting in faster training times but greater CPU and memory consumption. These findings highlight that DOP provides superior runtime efficiency and offers practical benefits for performance-critical deep learning applications.
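The per-phase measurement described in the abstract can be sketched as a small context-manager-based profiler. The study's profiler uses psutil and pynvml to capture CPU/GPU utilization; the minimal sketch below uses only the Python standard library (`time` and `tracemalloc`) to illustrate the same idea of wrapping each workflow phase and recording elapsed time and peak memory. The phase names and placeholder workloads are illustrative stand-ins, not the study's actual code.

```python
import time
import tracemalloc
from contextlib import contextmanager

# Per-phase measurements collected during a run.
results = {}

@contextmanager
def profile_phase(name):
    """Record wall time and peak Python memory for one workflow phase."""
    tracemalloc.start()
    start = time.perf_counter()
    try:
        yield
    finally:
        elapsed = time.perf_counter() - start
        _, peak = tracemalloc.get_traced_memory()
        tracemalloc.stop()
        results[name] = {"seconds": elapsed, "peak_bytes": peak}

# Placeholder workloads standing in for two of the study's four phases.
with profile_phase("build"):
    model = [[0.0] * 128 for _ in range(256)]

with profile_phase("evaluate"):
    total = sum(sum(row) for row in model)

for phase, stats in results.items():
    print(f"{phase}: {stats['seconds']:.4f} s, peak {stats['peak_bytes']} B")
```

A real profiler in the study's setting would additionally sample `psutil.Process().memory_info()` and pynvml's GPU utilization counters inside each phase; the context-manager structure stays the same.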
