Journal: Journal of Applied Data Sciences

Volatility Analysis of Cryptocurrencies using Statistical Approach and GARCH Model a Case Study on Daily Percentage Change
Sarmini, Sarmini; Widiawati, Chyntia Raras Ajeng; Febrianti, Diah Ratna; Yuliana, Dwi
Journal of Applied Data Sciences Vol 5, No 3: SEPTEMBER 2024
Publisher : Bright Publisher

DOI: 10.47738/jads.v5i3.261

Abstract

Cryptocurrency has become a significant subject in the global financial market, attracting investors and traders with its high volatility and profit potential. This study analyzes the daily volatility and GARCH volatility of six major cryptocurrencies: Bitcoin (BTC), Ethereum (ETH), Litecoin (LTC), USD Coin (USDC), Tether (USDT), and Ripple (XRP). Daily percentage change data and GARCH volatility are analyzed over specific time periods. The analysis reveals that Bitcoin (BTC) has an average daily percentage change of 0.366%, while Ethereum (ETH) has 0.376%. Litecoin (LTC) shows a daily percentage change of 0.166%, whereas USD Coin (USDC) and Tether (USDT) have very low daily percentage changes, nearly approaching zero. In terms of GARCH volatility, Ethereum (ETH) stands out with a volatility of 0.198, followed by Bitcoin (BTC) with a volatility of 0.121. The study's results indicate that cryptocurrencies are vulnerable to extreme price fluctuations, as evidenced by their asymmetric distributions and high kurtosis. Volatility correlation analysis reveals significant relationships that are important for risk management and portfolio diversification. These findings contribute to understanding cryptocurrency volatility characteristics and aid stakeholders in making informed investment decisions.
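As a rough illustration of the quantities the abstract reports, the following is a minimal sketch of computing daily percentage changes from a price series and running a GARCH(1,1) conditional-variance recursion. The parameters `omega`, `alpha`, and `beta` here are assumed placeholder values, not the authors' estimated coefficients, and the toy price series is invented for demonstration.

```python
# Minimal sketch: daily percentage change and a GARCH(1,1) variance
# recursion. Parameters omega/alpha/beta are illustrative assumptions;
# the study's actual estimation procedure is not reproduced here.

def daily_pct_change(prices):
    """Daily percentage change: 100 * (P_t - P_{t-1}) / P_{t-1}."""
    return [100.0 * (b - a) / a for a, b in zip(prices, prices[1:])]

def garch_variance(returns, omega=0.05, alpha=0.1, beta=0.85):
    """GARCH(1,1): sigma2_t = omega + alpha*e_{t-1}^2 + beta*sigma2_{t-1}."""
    mean = sum(returns) / len(returns)
    resid = [r - mean for r in returns]
    # Initialize at the sample variance of the residuals.
    sigma2 = [sum(e * e for e in resid) / len(resid)]
    for e in resid[:-1]:
        sigma2.append(omega + alpha * e * e + beta * sigma2[-1])
    return sigma2

prices = [100.0, 102.0, 101.0, 105.0, 103.0]  # toy data
rets = daily_pct_change(prices)
vols = garch_variance(rets)
print(rets[0])  # 2.0 (price moved from 100 to 102)
```

In practice a package such as `arch` would estimate the parameters by maximum likelihood; the recursion above only shows how the conditional variance series is built once parameters are fixed.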
Predicting Network Performance Degradation in Wireless and Ethernet Connections Using Gradient Boosting, Logistic Regression, and Multi-Layer Perceptron Models
Widiawati, Chyntia Raras Ajeng; Sarmini, Sarmini; Yuliana, Dwi
Journal of Applied Data Sciences Vol 6, No 1: JANUARY 2025
Publisher : Bright Publisher

DOI: 10.47738/jads.v6i1.519

Abstract

This study explores predicting network performance degradation in wireless and Ethernet connections using three machine learning algorithms: XGBoost, Logistic Regression, and Multi-Layer Perceptron (MLP). Key metrics, including accuracy, precision, recall, F1-score, and AUC-ROC, were employed to evaluate model performance. The MLP classifier achieved the highest accuracy (98.7%) and AUC-ROC (0.9998), with a precision of 1.0000 and recall of 0.8622, resulting in an F1-score of 0.9260. Logistic Regression provided reasonable baseline performance, with an accuracy of 93.67%, AUC-ROC of 0.9565, and an F1-score of 0.5992, but struggled with non-linear dependencies. XGBoost showed limited utility in detecting degradation events, achieving an F1-score of 0 despite a perfect AUC-ROC (1.0), indicating sensitivity to imbalanced data. Through hyperparameter tuning, MLP demonstrated robustness in capturing complex patterns in network latency metrics (local_avg and remote_avg), with remote_avg emerging as the most predictive feature for identifying degradation across both network types. Visualizations of latency dynamics demonstrate the higher predictive relevance of remote latency (remote_avg) in both network types, where spikes in this metric are closely associated with degradation. The findings underscore the effectiveness of using latency metrics and machine learning to anticipate network issues, suggesting that MLP is particularly well-suited for real-time, predictive network monitoring. Integrating such models could enhance network reliability by enabling proactive intervention, crucial for sectors reliant on continuous connectivity. Future work could expand on feature sets, explore adaptive thresholding, and implement these predictive models in live network environments for real-time monitoring and automated response.
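The abstract evaluates classifiers with precision, recall, and F1-score. A minimal sketch of how those metrics are derived from binary predictions is shown below; the label vectors are toy examples, not the study's network data, and the helper name `precision_recall_f1` is ours.

```python
# Minimal sketch: precision, recall, and F1 for a binary classifier,
# the metrics the study reports. Labels below are invented toy data.

def precision_recall_f1(y_true, y_pred):
    """Compute precision, recall, and F1 from 0/1 labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

y_true = [1, 0, 1, 1, 0, 0, 1, 0]  # 1 = degradation event
y_pred = [1, 0, 1, 0, 0, 0, 1, 1]
p, r, f = precision_recall_f1(y_true, y_pred)
```

Note how F1 collapses to 0 whenever a model predicts no positives at all (tp = 0), which is consistent with the abstract's observation that XGBoost scored F1 = 0 on imbalanced data despite a high AUC-ROC.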