Journal of ICT Research and Applications
Journal of ICT Research and Applications welcomes full research articles in the area of Information and Communication Technology from the following subject areas: Information Theory, Signal Processing, Electronics, Computer Network, Telecommunication, Wireless & Mobile Computing, Internet Technology, Multimedia, Software Engineering, Computer Science, Information System and Knowledge Management.
Articles
302 Documents
Free Model of Sentence Classifier for Automatic Extraction of Topic Sentences
M.L. Khodra;
D.H. Widyantoro;
E.A. Aziz;
B.R. Trilaksono
Journal of ICT Research and Applications Vol. 5 No. 1 (2011)
Publisher : LPPM ITB
DOI: 10.5614/itbj.ict.2011.5.1.2
This research employs a free model that uses only sentential features, without paragraph context, to extract the topic sentence of a paragraph. To find the optimal combination of features, corpus-based classification is used to construct a sentence classifier as the model. The classifier is trained using a Support Vector Machine (SVM). The experiments show that position and meta-discourse features are more important than syntactic features for extracting topic sentences, and that the best performer (80.68%) is the SVM classifier using all features.
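The sentential features the abstract names (position and meta-discourse cues) can be sketched as a feature extractor that would feed such a classifier. The feature names and the marker list below are illustrative assumptions, not the paper's exact feature set, and the SVM training step itself is omitted.

```python
# Illustrative sentential features for topic-sentence classification.
# Marker list and feature names are assumptions, not the paper's set.
META_DISCOURSE_MARKERS = {"in summary", "in conclusion", "this paper", "we propose"}

def sentence_features(sentence: str, index: int, total: int) -> dict:
    """Position and meta-discourse features for one sentence in a paragraph."""
    lowered = sentence.lower()
    return {
        "is_first": int(index == 0),                 # position feature
        "is_last": int(index == total - 1),          # position feature
        "relative_position": index / max(total - 1, 1),
        "has_marker": int(any(m in lowered for m in META_DISCOURSE_MARKERS)),
        "length": len(sentence.split()),             # simple syntactic proxy
    }

paragraph = [
    "This paper proposes a new coder.",
    "The coder counts symbol runs.",
    "Experiments confirm its speed.",
]
features = [sentence_features(s, i, len(paragraph)) for i, s in enumerate(paragraph)]
```

Vectors like these, one per sentence, would then be passed to an SVM trainer along with topic-sentence labels from the corpus.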
The Effectiveness of Chosen Partial Anthropometric Measurements in Individualizing Head-Related Transfer Functions on Median Plane
Hugeng Hugeng;
Wahidin Wahab;
Dadang Gunawan
Journal of ICT Research and Applications Vol. 5 No. 1 (2011)
Publisher : LPPM ITB
DOI: 10.5614/itbj.ict.2011.5.1.3
Individualizing head-related impulse responses (HRIRs) to perfectly suit a particular listener remains an open problem in HRIR modeling. We modeled the full magnitude range of head-related transfer functions (HRTFs) in the frequency domain via principal components analysis (PCA), with 37 subjects exposed to sound sources on the median plane. We found that a linear combination of only 10 orthonormal basis functions was sufficient to satisfactorily model individual magnitude HRTFs. Our goal was to form multiple linear regressions (MLR) between the basis-function weights obtained from PCA and chosen partial anthropometric measurements, in order to individualize a particular listener's HRTFs from his or her own anthropometry. We propose a novel individualization method based on MLR of the basis-function weights, employing only 8 of 27 anthropometric measurements. The experimental results showed that the proposed method, with a mean error of 11.21%, outperformed our previous work on individualizing minimum-phase HRIRs (mean error 22.50%) and magnitude HRTFs on the horizontal plane (mean error 12.17%), as well as similar research. The individualized magnitude HRTFs closely matched the originals, with only a slight error. Thus, the eight chosen anthropometric measurements proved effective in individualizing magnitude HRTFs, particularly on the median plane.
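The PCA-plus-MLR pipeline the abstract describes can be sketched with synthetic data: derive an orthonormal basis for magnitude HRTFs, then regress the per-subject weights on anthropometric measurements. All dimensions and data here are invented for illustration; only the counts (37 subjects, 10 basis functions, 8 measurements) follow the abstract.

```python
import numpy as np

# Toy PCA + MLR individualization sketch; data are random placeholders.
rng = np.random.default_rng(0)
n_subjects, n_freq, n_anthro, n_basis = 37, 64, 8, 10

H = rng.normal(size=(n_subjects, n_freq))    # magnitude HRTFs (placeholder)
A = rng.normal(size=(n_subjects, n_anthro))  # anthropometric measurements

# PCA: centre the HRTFs and keep the leading right-singular vectors.
mean_H = H.mean(axis=0)
U, S, Vt = np.linalg.svd(H - mean_H, full_matrices=False)
basis = Vt[:n_basis]                  # 10 orthonormal basis functions
weights = (H - mean_H) @ basis.T      # per-subject weights

# MLR: predict the weights from anthropometry (with an intercept column).
X = np.hstack([np.ones((n_subjects, 1)), A])
coeffs, *_ = np.linalg.lstsq(X, weights, rcond=None)

# Individualize a listener's HRTF from measurements alone.
new_listener = np.concatenate([[1.0], A[0]])
pred_H = mean_H + (new_listener @ coeffs) @ basis
```

In the paper's setting the regression would be fitted on measured HRTFs and evaluated by the spectral error between predicted and measured magnitudes.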
Digital Dermatoscopy Method for Human Skin Roughness Analysis
Suprijanto Suprijanto;
V. Nadhira;
Dyah A. Lestari;
E. Juliastuti;
Sasanti T. Darijanto
Journal of ICT Research and Applications Vol. 5 No. 1 (2011)
Publisher : LPPM ITB
DOI: 10.5614/itbj.ict.2011.5.1.4
In this study we propose a digital dermatoscopy method to measure human skin roughness, eliminating the use of silicone replicas. Digital dermatoscopy consists of a handheld digital microscope, image processing, and extraction of the skin roughness level. A median filter is used to reduce noise caused by variations in the skin's reflection factor. A Fourier transform then represents the skin texture as a 2D spatial-frequency distribution, and the skin roughness is determined from its entropy, with the roughness level proportional to the entropy. Three types of experiments were performed, evaluating: (i) skin replicas; (ii) young and elderly skin; and (iii) seven volunteers treated with an anti-wrinkle cosmetic over a three-week period. In the first and second experiments our system successfully quantified roughness, while in the third experiment the roughness was successfully identified for six of the seven volunteers.
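The processing chain (median filter, 2D Fourier transform, spectral entropy) can be sketched as follows. The filter window, image size, and the synthetic "skin" images are illustrative assumptions; the paper's acquisition and calibration details are not reproduced.

```python
import numpy as np

def median_filter(img: np.ndarray, k: int = 3) -> np.ndarray:
    """Naive k x k median filter (edge pixels left unfiltered)."""
    out = img.copy()
    r = k // 2
    for i in range(r, img.shape[0] - r):
        for j in range(r, img.shape[1] - r):
            out[i, j] = np.median(img[i - r:i + r + 1, j - r:j + r + 1])
    return out

def spectral_entropy(img: np.ndarray) -> float:
    """Entropy of the 2D power spectrum; higher entropy ~ rougher texture."""
    spectrum = np.abs(np.fft.fft2(median_filter(img))) ** 2
    p = spectrum / spectrum.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(1)
smooth = np.ones((32, 32)) + 0.01 * rng.normal(size=(32, 32))
rough = rng.normal(size=(32, 32))
```

A smooth surface concentrates spectral power at low frequencies (low entropy), while a rough texture spreads power across the spectrum (high entropy), matching the abstract's claim that roughness is proportional to entropy.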
RFID-based Positioning System in Complex Environments
Mudrik Alaydrus
Journal of ICT Research and Applications Vol. 5 No. 2 (2011)
Publisher : LPPM ITB
DOI: 10.5614/itbj.ict.2011.5.2.1
Radio Frequency Identification (RFID) is used in various activities for the effective identification of objects; recently, it has also been used for positioning. We present a scenario of wireless propagation in free space observed by up to eight antennas with different polarizations located at different positions. The polarization and radiation pattern of the antennas play a significant role in the electromagnetic field produced in the region. In a second case, the effects of disturbances in the form of metallic boxes are studied. The position is determined by a fingerprinting procedure.
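Fingerprinting, as mentioned in the abstract, matches a live signal measurement against a pre-recorded radio map. A minimal sketch follows; the radio map, antenna count, and signal values are invented for illustration, and positions are matched by nearest Euclidean distance in signal space.

```python
import math

# Hypothetical radio map: position -> signal-strength vector at 4 antennas.
radio_map = {
    (0.0, 0.0): [-40, -55, -60, -70],
    (1.0, 0.0): [-50, -45, -58, -66],
    (0.0, 1.0): [-55, -60, -42, -64],
    (1.0, 1.0): [-62, -58, -50, -45],
}

def locate(measurement: list) -> tuple:
    """Return the fingerprint position whose signal vector is closest."""
    return min(
        radio_map,
        key=lambda pos: math.dist(radio_map[pos], measurement),
    )

# A noisy reading taken near (1.0, 0.0):
estimate = locate([-49, -46, -57, -65])
```

In the paper's scenario the map entries would come from simulated or measured fields, where antenna polarization and radiation pattern shape each fingerprint.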
Synthesis Optimization on Galois-Field Based Arithmetic Operators for Rijndael Cipher
Petrus Mursanto
Journal of ICT Research and Applications Vol. 5 No. 2 (2011)
Publisher : LPPM ITB
DOI: 10.5614/itbj.ict.2011.5.2.2
A series of experiments has been conducted to show that the FPGA synthesis of Galois-field (GF) based arithmetic operators can be optimized automatically to improve Rijndael cipher throughput. Moreover, it has been demonstrated that efficiency improvements in GF operators do not directly translate into system performance at the application level. The experiments were motivated by the many research works focused on improving the performance of GF operators. Each variant is most efficient in either time (fastest) or space (smallest occupied area) when implemented on FPGA chips. In practice, however, GF operators are not used individually but are integrated with one another to implement algorithms. The contribution of this paper is to raise the issue of GF-based application performance and to suggest alternative aspects that potentially affect it. Instead of focusing on GF operator efficiency, system characteristics are worth considering when optimizing application performance.
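The GF operator at the heart of Rijndael is multiplication in GF(2^8) with the reduction polynomial x^8 + x^4 + x^3 + x + 1 (0x11B). A software sketch of this operator, which the hardware variants in the experiments implement in different time/area trade-offs, looks like this:

```python
def gf_mul(a: int, b: int) -> int:
    """Multiply two bytes in GF(2^8) by shift-and-xor with the
    Rijndael reduction polynomial x^8 + x^4 + x^3 + x + 1."""
    result = 0
    for _ in range(8):
        if b & 1:
            result ^= a            # add (XOR) a when the low bit of b is set
        carry = a & 0x80
        a = (a << 1) & 0xFF        # multiply a by x
        if carry:
            a ^= 0x1B              # reduce modulo the Rijndael polynomial
        b >>= 1
    return result
```

For example, {57} times {83} equals {c1} in this field, the worked example given in the AES specification. Hardware variants trade this bit-serial loop against parallel or table-based forms, which is exactly the time-versus-area design space the experiments explore.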
New Methodology of Block Cipher Analysis Using Chaos Game
Budi Sulistyo;
Budi Rahardjo;
Dimitri Mahayana;
Carmadi Machbub
Journal of ICT Research and Applications Vol. 5 No. 2 (2011)
Publisher : LPPM ITB
DOI: 10.5614/itbj.ict.2011.5.2.3
Block cipher analysis covers randomness analysis and cryptanalysis. This paper proposes a new method potentially usable for both. The method uses the concept of a true random sequence as a reference for measuring the randomness level of a sequence; using this concept, the paper defines bias as the deviation of a random sequence from a true random sequence. The block cipher is treated as the mapping function of a discrete-time dynamical system, a framework that makes it possible to apply the various analysis techniques developed in the dynamical-systems field. The methodology has three main parts: the dynamical-system framework for block cipher analysis, a new chaos game scheme, and an extended measure concept related to the chaos game and fractal analysis. The paper also presents the general procedure of the proposed method: symbolic-dynamics analysis of the discrete dynamical system whose mapping function is the block cipher, random sequence construction, use of the random sequence as input to the chaos game scheme, measurement of the chaos game output using the extended measure concept, and analysis of the measurement results. The analysis of a specific real or sample block cipher, and its results, are beyond the scope of this paper.
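A chaos game driven by a symbol sequence can be sketched as follows: each symbol selects a vertex and the current point jumps halfway toward it, so the visit counts over grid cells form a crude measure of the sequence. The vertex layout, 2-bit symbol alphabet, and grid size are illustrative assumptions, not the paper's construction.

```python
# Toy chaos-game scheme over 2-bit symbols; layout is an assumption.
VERTICES = {0: (0.0, 0.0), 1: (1.0, 0.0), 2: (0.0, 1.0), 3: (1.0, 1.0)}

def chaos_game(bits: list, grid: int = 4) -> dict:
    """Play the chaos game over 2-bit symbols; return visit counts per cell."""
    x, y = 0.5, 0.5
    counts = {}
    for i in range(0, len(bits) - 1, 2):
        symbol = 2 * bits[i] + bits[i + 1]
        vx, vy = VERTICES[symbol]
        x, y = (x + vx) / 2, (y + vy) / 2   # jump halfway to the chosen vertex
        cell = (min(int(x * grid), grid - 1), min(int(y * grid), grid - 1))
        counts[cell] = counts.get(cell, 0) + 1
    return counts

# A maximally biased sequence concentrates its measure in one corner cell;
# a truly random sequence would spread visits across the square.
biased = chaos_game([0, 0] * 50)
```

Under the paper's framework, deviation of the measured cell occupancy from the uniform occupancy of a true random sequence is what the bias quantifies.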
Architecture for the Secret-Key BC3 Cryptography Algorithm
Arif Sasongko;
Hidayat Hidayat;
Yusuf Kurniawan;
Sarwono Sutikno
Journal of ICT Research and Applications Vol. 5 No. 2 (2011)
Publisher : LPPM ITB
DOI: 10.5614/itbj.ict.2011.5.2.4
Cryptography is a very important aspect of data security. Research in this field is shifting its focus from security alone to include implementation aspects. This paper introduces the BC3 algorithm, with a focus on its hardware implementation, and proposes an architecture for it. BC3 is a secret-key cryptographic algorithm developed with two considerations: robustness and implementation efficiency. It has been implemented in software with good performance compared to the AES algorithm. BC3 is an improvement on the BC2 and AE cryptographic algorithms and is expected to offer the same level of robustness while gaining competitive advantages in implementation. The architecture emphasizes (1) resource sharing and (2) a single clock cycle per round, exploiting the regularity of the algorithm. The architecture was implemented on an FPGA; the implementation occupies three times less area than AES but is about five times faster. Furthermore, the BC3 hardware implementation outperforms the BC3 software in both the key-expansion and randomizing stages. In future work, the security of this implementation must be reviewed, especially against side-channel attacks.
Frontal Face Detection using Haar Wavelet Coefficients and Local Histogram Correlation
Iwan Setyawan;
Ivanna K. Timotius;
Andreas A. Febrianto
Journal of ICT Research and Applications Vol. 5 No. 3 (2011)
Publisher : LPPM ITB
DOI: 10.5614/itbj.ict.2011.5.3.1
Face detection is the main building block on which all automatic systems dealing with human faces are built. For example, a face recognition system must rely on face detection to process an input image and determine which areas contain human faces; these areas then become the input to the face recognition system for further processing. This paper presents a face detection system designed to detect frontal faces. The system uses Haar wavelet coefficients and local histogram correlation as differentiating features. The proposed system was trained on 100 training images. Our experiments show that it performed well during testing, achieving a detection rate of 91.5%.
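The Haar wavelet coefficients mentioned above come from decomposing an image window into average and detail sub-bands. A single-level 2D Haar transform can be sketched in pure Python; the toy image, the window scanning, and the histogram-correlation step of the paper's detector are omitted or assumed.

```python
def haar_2d(img: list) -> dict:
    """Return LL/LH/HL/HH sub-bands of a single-level 2D Haar transform
    computed from 2x2 blocks of a grayscale image (nested lists)."""
    h, w = len(img) // 2, len(img[0]) // 2
    bands = {k: [[0.0] * w for _ in range(h)] for k in ("LL", "LH", "HL", "HH")}
    for i in range(h):
        for j in range(w):
            a, b = img[2 * i][2 * j], img[2 * i][2 * j + 1]
            c, d = img[2 * i + 1][2 * j], img[2 * i + 1][2 * j + 1]
            bands["LL"][i][j] = (a + b + c + d) / 4   # local average
            bands["HL"][i][j] = (a - b + c - d) / 4   # vertical-edge detail
            bands["LH"][i][j] = (a + b - c - d) / 4   # horizontal-edge detail
            bands["HH"][i][j] = (a - b - c + d) / 4   # diagonal detail
    return bands

flat = [[10.0] * 4 for _ in range(4)]   # featureless toy "image"
bands = haar_2d(flat)
```

A featureless region yields zero detail coefficients, while eye, nose, and mouth regions produce characteristic detail patterns that a frontal-face detector can test against.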
Lossless Compression Performance of a Simple Counter-Based Entropy Coder
Armein Z.R. Langi
Journal of ICT Research and Applications Vol. 5 No. 3 (2011)
Publisher : LPPM ITB
DOI: 10.5614/itbj.ict.2011.5.3.2
This paper describes the performance of a simple counter-based entropy coder compared to other entropy coders, especially the Huffman coder. Lossless data compressors such as the Huffman and arithmetic coders are designed to perform well over a wide range of data entropies. As a result, these coders require significant computational resources, which can become the performance bottleneck of a compression implementation. In contrast, counter-based coders are designed to be optimal only over a limited entropy range. This paper shows that the encoding and decoding processes of a counter-based coder can be simple and fast, making it well suited to both hardware and software implementations. It also reports that the performance of the designed coder is comparable to that of the much more complex Huffman coder.
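To illustrate why counter-based coding is simple, here is a minimal counter-style coder for a binary source: runs of the more probable bit are replaced by their counts. This is an illustrative stand-in under the assumption of a low-entropy, zero-dominated source; the paper's actual coder is not specified here.

```python
def encode(bits: str) -> list:
    """Encode a bit string as run lengths of '0', one count per run;
    each '1' terminates a run (a leading/trailing run may be empty)."""
    runs, count = [], 0
    for b in bits:
        if b == "0":
            count += 1
        else:
            runs.append(count)
            count = 0
    runs.append(count)   # final run, possibly of length zero
    return runs

def decode(runs: list) -> str:
    """Invert encode(): rebuild the bit string from the run counts."""
    return "1".join("0" * r for r in runs)

msg = "0001000000100"
assert decode(encode(msg)) == msg
```

Encoding and decoding are single passes with one counter and no code tables, which is the structural advantage over Huffman coding that the abstract highlights; the price is that compression is only good when the source entropy stays in the low range the counter scheme targets.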
A Cognitive Skill Classification Based on Multi Objective Optimization Using Learning Vector Quantization for Serious Games
Moh. Aries Syufagi;
Mochamad Hariadi;
Mauridhi Hery Purnomo
Journal of ICT Research and Applications Vol. 5 No. 3 (2011)
Publisher : LPPM ITB
DOI: 10.5614/itbj.ict.2011.5.3.3
Serious games and game technology are poised to transform the way students at all levels are educated and trained. However, the pedagogical value of a game does little to help novice students learn when the game demands too much memorization and the learning process suffers because no information about the player's ability is available. To assess the cognitive level of a player's ability, we propose a Cognitive Skill Game (CSG). CSG builds on this cognitive concept to monitor how players interact with the game, employing Learning Vector Quantization (LVQ) to classify the player's cognitive-skill input. CSG uses teachers' data to obtain the neuron vectors of the supervised cognitive-skill patterns. Three multi-objective target clusters are classified: trial-and-error, careful, and expert cognitive skill. Gameplay experiments with 33 respondent players showed that 61% of the players exhibited predominantly trial-and-error, 21% careful, and 18% expert cognitive skill. CSG can inform the game engine when a player needs help or wants a formidable challenge, so that the engine provides tasks appropriate to the player's ability. This helps balance the players' emotions, so they do not become bored or frustrated.
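The LVQ classification into the three skill clusters can be sketched with the basic LVQ1 update rule: the winning prototype is pulled toward a correctly labeled sample and pushed away otherwise. The feature vectors, prototype positions, and learning rate below are invented for illustration; only the three class names follow the abstract.

```python
import math

# One hypothetical codebook vector per cognitive-skill cluster.
prototypes = {
    "trial_and_error": [0.9, 0.2],
    "careful": [0.4, 0.8],
    "expert": [0.1, 0.95],
}

def nearest(x: list) -> str:
    """Class of the prototype closest to feature vector x."""
    return min(prototypes, key=lambda c: math.dist(prototypes[c], x))

def lvq_update(x: list, label: str, lr: float = 0.3) -> None:
    """LVQ1 rule: pull the winner toward x if its class matches the
    teacher label, push it away otherwise."""
    winner = nearest(x)
    sign = 1.0 if winner == label else -1.0
    prototypes[winner] = [
        p + sign * lr * (xi - p) for p, xi in zip(prototypes[winner], x)
    ]

# Train on two teacher-labeled play-style samples (invented data).
for x, y in [([0.85, 0.25], "trial_and_error"), ([0.15, 0.9], "expert")]:
    lvq_update(x, y)
```

After training, classifying a new player's interaction features is a single nearest-prototype lookup, cheap enough to run inside a game engine's update loop.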