Contact Name
-
Contact Email
-
Phone
-
Journal Mail Official
-
Editorial Address
-
Location
Kota Yogyakarta,
Daerah Istimewa Yogyakarta
INDONESIA
International Journal of Informatics and Communication Technology (IJ-ICT)
ISSN: 2252-8776     EISSN: 2722-2616     DOI: -
Core Subject: Science
International Journal of Informatics and Communication Technology (IJ-ICT) is a common platform for publishing quality research papers as well as other intellectual outputs. The journal is published by the Institute of Advanced Engineering and Science (IAES), whose aim is to promote the dissemination of scientific knowledge and technology in the information and communication technology areas before an international audience of the scientific community, to encourage the progress and innovation of technology for human life, and to provide the best platform for the proliferation of ideas and thought for all scientists, regardless of their locations or nationalities. The journal covers all areas of informatics and communication technology (ICT), focusing on the integration of hardware and software solutions for the storage, retrieval, sharing, manipulation, management, analysis, visualization, and interpretation of data, and their applications for human services programs and practices, and it publishes refereed original research articles and technical notes. It is designed to serve researchers, developers, managers, strategic planners, graduate students, and others interested in state-of-the-art research activities in ICT.
Arjuna Subject: -
Articles: 462 documents
Improving 4G LTE network quality using the automatic cell planning Yuhanef, Afrizal; Yusnita, Sri; Khairani, Redha Anadia
International Journal of Informatics and Communication Technology (IJ-ICT) Vol 13, No 2: August 2024
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijict.v13i2.pp231-238

Abstract

The growing demand for network services increases the traffic load on eNodeBs, degrading network quality and performance and therefore requiring optimization. This research analyzes the results of optimizing the 4G reference signal received power (RSRP), signal-to-interference-plus-noise ratio (SINR), and throughput parameters using the automatic cell planning (ACP) method. ACP has been shown to improve the performance and quality of 4G LTE networks significantly compared with traditional cell planning methods. After ACP optimization, RSRP is dominant in the range of -100 dBm to -85 dBm, with an average value of -98.59 dBm, which falls in the good category. The average SINR increased by 18.23 dB, also in the good category. Throughput is dominant in the 14,000 kbps range, with an average value of 50,241.08 kbps, in the excellent category. The ACP method can thus enhance 4G LTE network performance, coverage, and user experience, and can help operators address unstable network quality caused by poor coverage. This research is relevant to both users and the telecommunications industry.
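As an illustration of the kind of post-optimization analysis the abstract describes, the following sketch bins RSRP drive-test samples into quality categories and computes their average. The thresholds and sample values are assumptions for demonstration only, not figures from the paper.

```python
# Minimal sketch (not from the paper): binning drive-test RSRP samples into
# quality categories and averaging them. The thresholds are assumed, commonly
# cited LTE planning bands, not the authors' exact definitions.

def rsrp_category(rsrp_dbm: float) -> str:
    """Map an RSRP sample (dBm) to an assumed quality band."""
    if rsrp_dbm >= -85:
        return "excellent"
    if rsrp_dbm >= -100:
        return "good"        # the range the abstract reports as dominant
    if rsrp_dbm >= -110:
        return "fair"
    return "poor"

def summarize(samples):
    """Return the mean RSRP and the share of samples per category."""
    counts = {}
    for s in samples:
        cat = rsrp_category(s)
        counts[cat] = counts.get(cat, 0) + 1
    total = len(samples)
    return sum(samples) / total, {k: v / total for k, v in counts.items()}

if __name__ == "__main__":
    after_acp = [-92.1, -97.4, -101.3, -88.0, -99.6]   # illustrative values only
    mean_rsrp, shares = summarize(after_acp)
    print(f"mean RSRP = {mean_rsrp:.2f} dBm, category shares = {shares}")
```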
Adaptive resource allocation in NOMA-enabled backscatter communications systems Das, Deepa; Khadanga, Rajendra Kumar; Rout, Deepak Kumar
International Journal of Informatics and Communication Technology (IJ-ICT) Vol 13, No 1: April 2024
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijict.v13i1.pp67-79

Abstract

The integration of non-orthogonal multiple access (NOMA) with backscatter communication (BackCom) is a promising solution for developing a green future wireless network. However, system performance degrades when multiple backscatter devices (BDs) are deployed in a network. Hence, energy efficiency (EE) maximization with proper resource allocation is among the primary concerns. This paper therefore proposes an adaptive resource allocation method for maximizing EE by simultaneously optimizing the transmission power from the base station (BS), the power allocation coefficients, and the reflection coefficients under the constraints of maximum allowable transmission power and minimum achievable data rate. Specifically, an iterative method based on a parametric transformation approach is adopted for maximizing EE by jointly optimizing the coefficients, in which the power allocation problem for the BDs is solved by an adaptive method based on the improved proportionate normalized least mean square (IPNLMS) algorithm. The system performance is then evaluated and the impact of different parameters is studied; it is observed that EE is significantly improved compared with the existing scheme and is maximum at η = -0.5.
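A "parametric transformation approach" of this kind is commonly realized as a Dinkelbach-type iteration that converts the fractional energy-efficiency objective into a sequence of subtractive problems. The toy sketch below illustrates that iteration for a single assumed link; the channel gain, circuit power, and power limit are invented values, and the paper's joint optimization over NOMA coefficients, reflection coefficients, and the IPNLMS-based power allocation is not reproduced here.

```python
import math

# Toy sketch (assumptions throughout): a Dinkelbach-style parametric transformation
# for energy-efficiency maximization, EE(p) = rate(p) / (p + Pc), in a single-link case.

G = 10.0      # assumed channel-gain-to-noise ratio
PC = 0.5      # assumed static circuit power (W)
P_MAX = 2.0   # assumed maximum transmit power (W)

def rate(p):
    """Shannon-type rate (bits/s/Hz) for transmit power p."""
    return math.log2(1.0 + G * p)

def best_power(lam, step=1e-3):
    """Maximize rate(p) - lam * (p + PC) over a feasible power grid."""
    best_p, best_val = 0.0, float("-inf")
    p = 0.0
    while p <= P_MAX:
        val = rate(p) - lam * (p + PC)
        if val > best_val:
            best_p, best_val = p, val
        p += step
    return best_p, best_val

lam = 0.0                                  # initial EE guess
for _ in range(20):                        # Dinkelbach iterations
    p_star, f_val = best_power(lam)
    lam = rate(p_star) / (p_star + PC)     # updated EE estimate
    if abs(f_val) < 1e-6:                  # converged when the parametric objective ~ 0
        break

print(f"approx. optimal power = {p_star:.3f} W, energy efficiency = {lam:.3f} bits/J/Hz")
```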
Solana blockchain technology: a review Mishra, Debani Prasad; Behera, Sandip Ranjan; Behera, Subhashis Satyabrata; Patro, Aditya Ranjan; Salkuti, Surender Reddy
International Journal of Informatics and Communication Technology (IJ-ICT) Vol 13, No 2: August 2024
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijict.v13i2.pp197-205

Abstract

This review of the Solana blockchain begins by setting the stage for the arguments and evidence that follow, providing context on the current state of blockchain technology and introducing Solana as a potential solution. Blockchain technology has the potential for countless applications, ranging from financial transactions to secure data storage. However, existing blockchain systems suffer from scalability issues, where confirmation times and network congestion limit transaction volumes. This review of the Solana blockchain is valuable for those seeking an in-depth understanding of its design and efficacy. Given the increasing number of blockchain technologies available in the market, potential adopters face the challenge of selecting the most suitable blockchain network for their specific use case. A well-constructed review provides the necessary information on how the technology functions, including its strengths and limitations, and enables readers to compare various blockchain technologies and judge their suitability for specific needs. Reviews like this one therefore play a crucial role in advancing blockchain technology by driving the adoption of superior blockchain networks.
Enhancing PI controller performance in grid-connected hybrid power systems Pavan, Gollapudi; Babu, A. Ramesh
International Journal of Informatics and Communication Technology (IJ-ICT) Vol 13, No 2: August 2024
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijict.v13i2.pp264-271

Abstract

This paper addresses the optimal operation of a microgrid built up from both uncontrollable (solar, wind) and controllable (batteries, diesel generators) electrical energy sources. Variations in the fluctuating power supply caused by load changes are managed through the controllers, and the objective of the paper is to optimize the PI controller gain settings for effective use of electrical energy. The integral time square error (ITSE) criterion is combined with the cuckoo search algorithm (CSA) and the particle swarm algorithm (PSA) to obtain accurate, precise, and appropriate results. This enhances the microgrid's steady-state response compared with trial-and-error tuning techniques, assuring a stable supply of electricity to the load.
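A minimal sketch of this kind of gain tuning is given below: a basic particle swarm search minimizes an ITSE cost computed from a simulated step response. The plant is an assumed first-order lag rather than the paper's microgrid model, and the swarm settings are generic defaults.

```python
import random

# Illustrative only: tuning PI gains (Kp, Ki) with a basic particle swarm search
# against an integral-time-squared-error (ITSE) cost, for an assumed plant.

def itse_cost(kp, ki, dt=0.01, t_end=5.0):
    """Simulate a unit-step response of a first-order plant under PI control."""
    y, integ, cost, t = 0.0, 0.0, 0.0, 0.0
    while t < t_end:
        e = 1.0 - y                     # unit setpoint
        integ += e * dt
        u = kp * e + ki * integ         # PI control law
        y += dt * (-y + u)              # assumed plant: dy/dt = -y + u
        cost += t * e * e * dt          # ITSE accumulation
        t += dt
    return cost

def pso(n=15, iters=40, bounds=(0.0, 10.0)):
    """Generic particle swarm over (Kp, Ki); returns best gains and their cost."""
    pos = [[random.uniform(*bounds) for _ in range(2)] for _ in range(n)]
    vel = [[0.0, 0.0] for _ in range(n)]
    pbest = [p[:] for p in pos]
    pcost = [itse_cost(*p) for p in pos]
    g = min(range(n), key=lambda i: pcost[i])
    gbest, gcost = pbest[g][:], pcost[g]
    for _ in range(iters):
        for i in range(n):
            for d in range(2):
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * random.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[0]), bounds[1])
            c = itse_cost(*pos[i])
            if c < pcost[i]:
                pbest[i], pcost[i] = pos[i][:], c
                if c < gcost:
                    gbest, gcost = pos[i][:], c
    return gbest, gcost

(kp, ki), cost = pso()
print(f"tuned gains: Kp={kp:.2f}, Ki={ki:.2f}, ITSE={cost:.4f}")
```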
A micro size terahertz wheel shaped antenna with non-defected ground structure Swaminathan, Narayanan; Rajendiran, Murugesan
International Journal of Informatics and Communication Technology (IJ-ICT) Vol 13, No 1: April 2024
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijict.v13i1.pp101-107

Abstract

A compact wheel-shaped wide-band terahertz (THz) antenna is proposed in this research. Concentric circular slots are incorporated into a circular patch to form the wheel-shaped patch antenna. The proposed model is designed on a polyimide substrate with a dielectric constant of 4.3 and a thickness of 20 µm. The prototype antenna is very compact, with a size of 210×160 µm². The designed antenna achieves wideband operation from 8.692 THz to 9.772 THz, and its maximum realized gain is 10.2 dBi at 9.0 THz. This high gain is important for a wide range of wireless applications. The radiation pattern, radiation efficiency, reflection coefficient, surface current distribution, and voltage standing wave ratio are examined through simulation results. In the future, video-rate imaging systems, very fast close-range indoor wireless communication, biomedical imaging, homeland defence equipment, security scanning, explosive detection, and materials characterization at THz frequencies will benefit from the proposed THz antenna.
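As a quick illustration of the reported figures, the snippet below computes the fractional bandwidth implied by the 8.692-9.772 THz band and the VSWR corresponding to an assumed reflection-coefficient magnitude; the |S11| value is an assumption, not a result from the paper.

```python
# Back-of-the-envelope check using the band reported in the abstract.
f_low, f_high = 8.692e12, 9.772e12           # Hz, from the abstract
f_center = (f_low + f_high) / 2.0
fractional_bw = (f_high - f_low) / f_center  # roughly 11.7 %

s11_mag = 0.1                                # assumed |S11|, i.e. -20 dB return loss
vswr = (1 + s11_mag) / (1 - s11_mag)

print(f"fractional bandwidth ~ {fractional_bw * 100:.1f} %")
print(f"VSWR for |S11| = 0.1 ~ {vswr:.2f}")
```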
Indonesian generative chatbot model for student services using GPT Priccilia, Shania; Girsang, Abba Suganda
International Journal of Informatics and Communication Technology (IJ-ICT) Vol 13, No 1: April 2024
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijict.v13i1.pp50-56

Abstract

The accessibility of academic information greatly impacts the satisfaction and loyalty of university students. However, limited university resources often hinder students from conveniently accessing information services. To address this challenge, this research proposes digitizing the question-answering process between students and student service staff through the implementation of a generative chatbot, which can provide students with human-like responses to academic inquiries at their convenience. The chatbot was developed using the pre-trained GPT-2 architecture in three different sizes, specifically designed to address practicum-related questions at a private university in Indonesia. The experiment used 1,288 question-answer pairs in Indonesian, and the best model achieved a BLEU score of 0.753, signifying good text-generation accuracy despite dataset limitations.
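For context, a corpus-level BLEU score such as the reported 0.753 can be computed as in the sketch below, here with NLTK's implementation and made-up placeholder sentences rather than the authors' Indonesian dataset.

```python
# Sketch: scoring chatbot replies against reference answers with corpus BLEU.
# The token lists are invented placeholders; NLTK's default 4-gram weights and
# a smoothing function are used.
from nltk.translate.bleu_score import corpus_bleu, SmoothingFunction

references = [
    [["jadwal", "praktikum", "dimulai", "pukul", "delapan", "pagi"]],  # gold answer(s)
]
hypotheses = [
    ["jadwal", "praktikum", "dimulai", "pukul", "delapan"],            # chatbot reply
]

score = corpus_bleu(references, hypotheses,
                    smoothing_function=SmoothingFunction().method1)
print(f"BLEU = {score:.3f}")
```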
Design of an efficient Transformer-XL model for enhanced pseudo code to Python code conversion Kuche, Snehal H.; Gaikwad, Amit K.; Deshmukh, Meghna
International Journal of Informatics and Communication Technology (IJ-ICT) Vol 13, No 2: August 2024
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijict.v13i2.pp223-230

Abstract

The landscape of programming has long been challenged by the task of transforming pseudo code into executable Python code, a process traditionally marred by its labor-intensive nature and the necessity for a deep understanding of both logical frameworks and programming languages. Existing methodologies often grapple with limitations in handling variable-length sequences and maintaining context over extended textual data. Addressing these challenges, this study introduces an innovative approach utilizing the Transformer-XL model, a significant advancement in the domain of deep learning. The Transformer-XL architecture, an evolution of the standard Transformer, adeptly processes variable-length sequences and captures extensive contextual dependencies, thereby surpassing its predecessors in handling natural language processing (NLP) and code synthesis tasks. The proposed model employs a comprehensive process involving data preprocessing, model input encoding, a self-attention mechanism, contextual encoding, language modeling, and a meticulous decoding process, followed by post-processing. The implications of this work are far-reaching, offering a substantial leap in the automation of code conversion. As the field of NLP and deep learning continues to evolve, the Transformer-XL based model is poised to become an indispensable tool in the realm of programming, setting a new benchmark for automated code synthesis.
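The segment-level recurrence that lets Transformer-XL maintain context over extended textual data can be sketched as follows: hidden states from the previous segment are cached and concatenated to the keys and values of the current segment. This is a conceptual single-head illustration with invented dimensions; relative positional encodings, causal masking, multiple heads and layers, and the actual pseudo-code-to-Python training are all omitted.

```python
import torch
import torch.nn.functional as F

# Conceptual sketch of Transformer-XL-style segment recurrence (assumptions throughout).
d_model, seg_len = 64, 8
Wq = torch.nn.Linear(d_model, d_model, bias=False)
Wk = torch.nn.Linear(d_model, d_model, bias=False)
Wv = torch.nn.Linear(d_model, d_model, bias=False)

def attend_with_memory(segment, memory):
    """Single-head attention where keys/values include the cached memory."""
    context = torch.cat([memory, segment], dim=0) if memory is not None else segment
    q, k, v = Wq(segment), Wk(context), Wv(context)
    scores = q @ k.T / d_model ** 0.5
    out = F.softmax(scores, dim=-1) @ v
    new_memory = segment.detach()        # cache this segment (no gradient) for the next one
    return out, new_memory

memory = None
token_stream = torch.randn(3 * seg_len, d_model)   # stand-in for embedded pseudo code tokens
for i in range(0, token_stream.size(0), seg_len):
    out, memory = attend_with_memory(token_stream[i:i + seg_len], memory)
print("last segment output shape:", tuple(out.shape))
```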
One time pad for enhanced steganographic security using least significant bit with spiral pattern Rihartanto, Rihartanto; Budi Utomo, Didi Susilo; Rizal, Ansar; Diartono, Dwi Agus; Februariyanti, Herny
International Journal of Informatics and Communication Technology (IJ-ICT) Vol 13, No 2: August 2024
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijict.v13i2.pp168-177

Abstract

Data is an important commodity in today's digital era and therefore needs adequate security to prevent misuse. A common data security practice in the transmission of information is cryptography. Another approach is steganography, which hides secret messages in other, non-confidential media that can be accessed by the public. In this study, a spiral pattern is used for data placement with the least significant bit (LSB) method, and a 2-bit LSB modification is used to increase the capacity of data that can be hidden. To increase security, the data is first converted into a datastream using random numbers as a one-time pad (OTP); an exclusive-OR (XOR) operation is performed on the datastream and the OTP to obtain the encrypted data to be hidden. The results show that the image quality of the steganographic output at a capacity close to 100% is still fairly good, as indicated by a peak signal-to-noise ratio (PSNR) value greater than 46 dB. Visually, the steganographic image does not look different from the original, and the use of random numbers as the OTP succeeds in changing the hidden data significantly, as indicated by an avalanche effect value above 50%.
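A minimal sketch of the two layers described above, under simplifying assumptions: the message is XOR-encrypted with a random one-time pad, and each 2-bit chunk is then written into the two least significant bits of successive cover bytes. The spiral placement pattern is replaced by plain sequential order, and the cover is a random byte array standing in for image pixels.

```python
import os

def otp_encrypt(message: bytes):
    """XOR the message with a random one-time pad; return (ciphertext, pad)."""
    pad = os.urandom(len(message))
    cipher = bytes(m ^ p for m, p in zip(message, pad))
    return cipher, pad

def embed_2bit_lsb(pixels: bytearray, data: bytes) -> bytearray:
    """Hide 'data' in the 2 LSBs of each cover byte (4 cover bytes per data byte)."""
    assert len(pixels) >= 4 * len(data), "cover too small"
    out = bytearray(pixels)
    i = 0
    for byte in data:
        for shift in (6, 4, 2, 0):                 # four 2-bit chunks, MSB first
            out[i] = (out[i] & 0b11111100) | ((byte >> shift) & 0b11)
            i += 1
    return out

def extract_2bit_lsb(pixels: bytes, n_bytes: int) -> bytes:
    """Recover n_bytes of hidden data from the 2 LSBs of the cover bytes."""
    data = bytearray()
    for b in range(n_bytes):
        value = 0
        for i in range(4):
            value = (value << 2) | (pixels[4 * b + i] & 0b11)
        data.append(value)
    return bytes(data)

secret = b"exam schedule"
cipher, pad = otp_encrypt(secret)
cover = bytearray(os.urandom(4 * len(cipher)))     # stand-in for image pixel bytes
stego = embed_2bit_lsb(cover, cipher)
recovered = bytes(c ^ p for c, p in zip(extract_2bit_lsb(stego, len(cipher)), pad))
print(recovered)  # b'exam schedule'
```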
Blockchain and ML in land registries: a transformative alliance Shukla, Vishnu; Raipurkar, Abhijeet Ramesh; Chandak, Manoj B.
International Journal of Informatics and Communication Technology (IJ-ICT) Vol 13, No 2: August 2024
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijict.v13i2.pp239-247

Abstract

This study presents a novel method for merging blockchain security and machine learning (ML) valuation to modernize land registry systems. The system offers a safe, open, and effective framework for documenting and managing land ownership, addressing issues with conventional land registry procedures. Blockchain technology creates a tamper-proof record by cryptographically combining transactions and time-stamped entries to provide an immutable and decentralized ledger; in addition to building a solid foundation for the land registry system, this strengthens trust. Simultaneously, ML algorithms examine variables such as amenities and location to remove inflated pricing, providing accurate assessments and encouraging openness in the real estate sector. The system has been put into practice and verified in small-scale applications; its features include enhanced data security, expedited ownership transfers, and accurate asset appraisals. Collaboration between governments, regulatory agencies, and technology suppliers is necessary for widespread deployment. Land registration procedures will change as a result of the transformative partnership between blockchain and ML technology, which offers a more effective, safe, and future-ready environment. Adopting this ground-breaking technique establishes a new benchmark for updating land ownership data and is a major step toward a more sophisticated and dependable method in the industry.
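As a toy illustration of the tamper-evidence property the abstract relies on, the sketch below chains time-stamped land-transfer records with SHA-256 hashes so that altering any earlier entry invalidates every later one. The field names are made up, and the ML valuation component is out of scope here.

```python
import hashlib, json, time

def make_block(record: dict, prev_hash: str) -> dict:
    """Create a time-stamped block whose hash covers the record and the previous hash."""
    block = {"timestamp": time.time(), "record": record, "prev_hash": prev_hash}
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

def verify(chain) -> bool:
    """Recompute every hash and check each block points at its predecessor."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != block["hash"]:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = [make_block({"parcel": "A-101", "owner": "genesis"}, prev_hash="0" * 64)]
chain.append(make_block({"parcel": "A-101", "owner": "new owner"}, chain[-1]["hash"]))
print("ledger valid:", verify(chain))          # True
chain[0]["record"]["owner"] = "tampered"       # simulate altering an old entry
print("after tampering:", verify(chain))       # False
```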
ChatGPT's effect on the job market: how automation affects employment in sectors using ChatGPT for customer service Mishra, Debani Prasad; Agarwal, Nandini; Shah, Dhruvi; Salkuti, Surender Reddy
International Journal of Informatics and Communication Technology (IJ-ICT) Vol 13, No 1: April 2024
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijict.v13i1.pp116-122

Abstract

ChatGPT, a large language model created by OpenAI, has gained attention in artificial intelligence (AI) and natural language processing. This research paper aims to provide an in-depth analysis of ChatGPT and its potential impact on the future, including its limitations, pros and cons, and how it came to be. The paper first provides a brief overview of ChatGPT, including its architecture and training process and how it differs from previous language models. It then delves into the model's limitations, such as its lack of common sense and its susceptibility to discrimination or biases present in the data it was trained on. The paper also explores the potential benefits of ChatGPT, such as its ability to generate human-like text, its potential use in customer service, and its potential impact on the job market. It further discusses the ethical and social implications of ChatGPT, such as the potential for the model to perpetuate biases and the need for transparency and accountability in its deployment. Finally, the paper concludes by discussing the future of ChatGPT and similar language models and their potential impact on various industries and society as a whole. Overall, this research paper provides a comprehensive and nuanced survey of the AI tool ChatGPT and its potential impact on the future.