This author has published in the following journal:
Jendela ASWAJA
Dewika, Erna
Unknown Affiliation

Published: 1 document
Articles


Konstruksi Instrumen Tes Kemampuan Proses Sains Mata Pelajaran IPA SD (Construction of a Science Process Skills Test Instrument for Elementary School Science Subjects)
Dewika, Erna; Zulaiha, Fanni; Suganda, Mikkey Anggara
Jurnal Jembatan Efektivitas Ilmu dan Akhlak Ahlussunah Wal Jama'ah, Vol 6 No 2 (2025): June
Publisher : LPPM UNU CIREBON

DOI: 10.52188/jeas.v6i2.1432

Abstract

This study aims to develop a Science Process Skills (SPS) test instrument for elementary school science subjects that meets the criteria of validity and reliability. The background of this research is the low level of students’ SPS, which has not been adequately facilitated due to the limited availability of appropriate evaluation instruments. The research employed a test construction and validation approach, involving fourth-grade students as the trial sample. The instrument was designed in the form of multiple-choice two-tier test items covering eight aspects of SPS: observing, classifying, interpreting data, formulating hypotheses, designing experiments, communicating, measuring, and drawing conclusions. Content validity was examined through expert judgment and readability testing, while empirical validity was analyzed using SPSS. Reliability was measured using Cronbach’s Alpha coefficient. The results showed that out of 32 developed items, 20 items (62.5%) were valid, while 12 items (37.5%) were eliminated. The instrument reliability reached a Cronbach’s Alpha value of 0.659, indicating an adequate level of internal consistency. Thus, the developed instrument can be considered feasible to consistently measure students’ science process skills. This research provides a significant contribution to the availability of authentic evaluation instruments and supports teachers in designing more targeted science learning. Future research is recommended to apply the Item Response Theory (IRT), involve larger and more diverse samples, and integrate the instrument into digital platforms to make it more adaptive to the demands of 21st-century learning.
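The reliability figure reported above (Cronbach's Alpha = 0.659) follows the standard internal-consistency formula, alpha = (k/(k-1)) * (1 - sum of item variances / variance of total scores). A minimal sketch of that computation is shown below; the formula is standard, but the 4-item, 5-student response matrix is invented for illustration and does not reproduce the study's data or its SPSS output.

```python
def cronbach_alpha(items):
    """Cronbach's Alpha for dichotomously scored (0/1) test items.

    items: list of item score lists, one inner list per item,
    all of equal length (one entry per student).
    """
    k = len(items)                      # number of items
    n = len(items[0])                   # number of students
    # Total score per student across all items
    totals = [sum(item[s] for item in items) for s in range(n)]

    def variance(xs):                   # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_var_sum = sum(variance(item) for item in items)
    total_var = variance(totals)
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Illustrative response matrix (rows = items, columns = students;
# 0 = incorrect, 1 = correct). Not the study's data.
scores = [
    [1, 1, 0, 1, 1],
    [1, 0, 0, 1, 1],
    [0, 1, 0, 1, 0],
    [1, 1, 1, 1, 0],
]
print(round(cronbach_alpha(scores), 3))
```

In the study itself this analysis was carried out in SPSS over the 20 retained items; the sketch only makes the underlying arithmetic explicit.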