Landslides are a destructive geohazard that causes significant economic, environmental, and social damage. Recent advances in landslide detection and monitoring have come from integrating the growing range of Earth Observation (EO) technologies and Deep Learning (DL) methods with traditional mapping approaches. This review examines the combination of EO and DL for landslide detection, synthesizing findings from more than 500 scholarly works. It covers studies that pair satellite remote sensing data, including Synthetic Aperture Radar (SAR) and multispectral imagery, with state-of-the-art DL models, particularly Convolutional Neural Networks (CNNs) and their U-Net variants, and categorizes them by methodological development, spatial extent, and validation technique. EO data extend monitoring toward near-real-time coverage, while DL models automate feature recognition and improve detection accuracy. Three critical problems remain: the scarcity of training data for building stable models, the limited interpretability of model predictions, and poor generalization across diverse geographical settings. We propose a hybrid approach that combines multi-source EO data with physics-informed DL models to improve evaluation and transferability across platforms. Incorporating explainable AI (XAI) and active learning reduces the black-box character of deep learning models, thereby improving the trustworthiness of automated landslide maps. The review also highlights the need for shared datasets, benchmark standards, and interdisciplinary collaboration to advance the field. Future research should combine semi-supervised learning, synthetic data generation, and real-time hazard prediction to optimise the deployment of EO-DL frameworks for landslide risk management. By integrating EO and AI methods, this study points toward landslide surveillance systems that help reduce disasters amid accelerating climate change.
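To make the CNN/U-Net family of methods surveyed above concrete, the following is a minimal sketch, not drawn from any specific reviewed study, of a small U-Net-style encoder-decoder that maps multi-band EO patches to per-pixel landslide probabilities; the patch size, band count, and network depth are illustrative assumptions.

```python
# Minimal sketch (not taken from any reviewed study): a small U-Net-style
# encoder-decoder for pixel-wise landslide mapping from multi-band EO patches.
# Patch size (128x128), band count (e.g., 4 bands such as RGB+NIR) and depth
# are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, Model

def conv_block(x, filters):
    # Two 3x3 convolutions with ReLU, the basic U-Net building block.
    x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
    x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
    return x

def build_unet(input_shape=(128, 128, 4)):
    inputs = layers.Input(shape=input_shape)
    # Encoder: progressively downsample while widening feature maps.
    c1 = conv_block(inputs, 32); p1 = layers.MaxPooling2D()(c1)
    c2 = conv_block(p1, 64);     p2 = layers.MaxPooling2D()(c2)
    c3 = conv_block(p2, 128)     # bottleneck
    # Decoder: upsample and fuse encoder features via skip connections.
    u2 = layers.Conv2DTranspose(64, 2, strides=2, padding="same")(c3)
    c4 = conv_block(layers.concatenate([u2, c2]), 64)
    u1 = layers.Conv2DTranspose(32, 2, strides=2, padding="same")(c4)
    c5 = conv_block(layers.concatenate([u1, c1]), 32)
    # Single-channel sigmoid output: per-pixel landslide probability.
    outputs = layers.Conv2D(1, 1, activation="sigmoid")(c5)
    return Model(inputs, outputs)

model = build_unet()
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(train_patches, train_masks, validation_data=(val_patches, val_masks))
```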
The Agriculture Trading Platform (ATP) represents a significant innovation in the realm of agricultural trade in Malaysia. This web-based platform is designed to address the prevalent inefficiencies and lack of transparency in the current agricultural trading environment. By centralizing real-time data on agricultural production, consumption, and pricing, ATP provides a comprehensive dashboard that facilitates data-driven decision-making for all stakeholders in the agricultural supply chain. The platform employs advanced deep learning algorithms, including Long Short-Term Memory (LSTM) networks and Convolutional Neural Networks (CNN), to forecast market trends and consumption patterns. These predictive capabilities enable producers to optimize their market strategies, negotiate better prices, and access broader markets, thereby enhancing the overall efficiency and transparency of agricultural trading in Malaysia. The ATP’s user-friendly interface and robust analytical tools have the potential to revolutionize the agricultural sector by empowering farmers, reducing reliance on intermediaries, and fostering a more equitable trading environment.
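As a rough illustration of the kind of forecasting component described above, rather than the ATP's actual code, the sketch below trains a small 1D-CNN regressor on sliding windows of daily records to predict next-day consumption; the window length, feature set, and layer sizes are assumptions.

```python
# Illustrative sketch only (not the ATP codebase): a 1D-CNN regressor of the
# kind the platform could use to forecast next-day consumption from a sliding
# window of daily records. Window length, feature count and layer sizes are
# assumptions.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Sequential

WINDOW, FEATURES = 30, 3   # e.g., 30 days of (production, consumption, price)

model = Sequential([
    layers.Input(shape=(WINDOW, FEATURES)),
    layers.Conv1D(32, kernel_size=3, activation="relu"),   # local temporal patterns
    layers.Conv1D(32, kernel_size=3, activation="relu"),
    layers.GlobalAveragePooling1D(),
    layers.Dense(16, activation="relu"),
    layers.Dense(1),                                        # next-day consumption
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])

# Toy data with the expected shapes; real training would use the platform's history.
X = np.random.rand(256, WINDOW, FEATURES).astype("float32")
y = np.random.rand(256, 1).astype("float32")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
```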
Brain tumors are a leading cause of cancer-related deaths globally, and their classification remains a significant research challenge due to variability in tumor intensity, size, and shape, as well as the similar appearance of different tumor types. These factors complicate accurate differentiation, making diagnosis difficult even with advanced imaging techniques such as magnetic resonance imaging (MRI). Recent artificial intelligence (AI) techniques, in particular deep learning (DL), have improved the speed and accuracy of medical image analysis, but they still face challenges such as overfitting and the need for large annotated datasets. This study addresses these challenges by presenting two approaches for brain tumor classification from MRI images. The first fine-tunes cutting-edge pretrained models via transfer learning, including SEResNet, ConvNeXtBase, and ResNet101V2, with global average pooling 2D and dropout layers to minimize overfitting and reduce the need for extensive preprocessing. The second leverages the Vision Transformer (ViT), optimized with the AdamW optimizer and extensive data augmentation. Experiments on the BT-Large-4C dataset demonstrate that SEResNet achieves the highest accuracy of 97.96%, surpassing ViT’s 95.4%. These results suggest that fine-tuned transfer learning models are more effective at addressing overfitting and dataset limitations, ultimately outperforming the Vision Transformer and existing state-of-the-art techniques in brain tumor classification.
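The first approach can be sketched as follows, under stated assumptions rather than the paper's exact settings: a pretrained backbone (ResNet101V2 shown; SEResNet and ConvNeXtBase follow the same pattern) is topped with global average pooling 2D and dropout, the new head is trained first, and the backbone is then unfrozen for fine-tuning at a lower learning rate.

```python
# Minimal sketch of the transfer learning approach under stated assumptions:
# image size, dropout rate and learning rates are illustrative, not the
# paper's exact configuration.
import tensorflow as tf
from tensorflow.keras import layers, Model

NUM_CLASSES, IMG_SIZE = 4, (224, 224)

base = tf.keras.applications.ResNet101V2(
    include_top=False, weights="imagenet", input_shape=IMG_SIZE + (3,))
base.trainable = False  # stage 1: train only the new classification head

inputs = layers.Input(shape=IMG_SIZE + (3,))
x = tf.keras.applications.resnet_v2.preprocess_input(inputs)
x = base(x, training=False)
x = layers.GlobalAveragePooling2D()(x)   # compact pooled features, few parameters
x = layers.Dropout(0.4)(x)               # regularization against overfitting
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)
model = Model(inputs, outputs)

model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=10)

# Stage 2: unfreeze the backbone and fine-tune end to end at a lower learning rate.
base.trainable = True
model.compile(optimizer=tf.keras.optimizers.Adam(1e-5),
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=10)
```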
Accurate drug-drug interaction (DDI) prediction is essential to prevent adverse effects, especially with the increased use of multiple medications during the COVID-19 pandemic. Traditional machine learning methods often miss the complex relationships necessary for effective DDI prediction. This study introduces a deep learning-based classification framework to assess adverse effects from interactions between Fluvoxamine and Curcumin. Our model integrates a wide range of drug-related data (e.g., molecular structures, targets, side effects) and synthesizes them into high-level features through a specialized deep neural network (DNN). This approach significantly outperforms traditional classifiers in accuracy, precision, recall, and F1-score. Additionally, our framework enables real-time DDI monitoring, which is particularly valuable in COVID-19 patient care. The model’s success in accurately predicting adverse effects demonstrates the potential of deep learning to enhance drug safety and support personalized medicine, paving the way for safer, data-driven treatment strategies.
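A minimal sketch of such a classification framework, assuming illustrative feature encodings and layer widths rather than the paper's exact architecture, concatenates the feature vectors of a drug pair and funnels them through fully connected layers into a single adverse-interaction probability.

```python
# Hedged sketch, not the paper's exact architecture: a feed-forward DNN that
# takes concatenated feature vectors of a drug pair (e.g., encodings of
# molecular structure, targets and side-effect profiles) and predicts whether
# the interaction is adverse. Feature dimensions and layer widths are assumptions.
import tensorflow as tf
from tensorflow.keras import layers, Model

STRUCT_DIM, TARGET_DIM, SIDE_DIM = 1024, 200, 300   # assumed encoding sizes

def drug_input():
    # One input per drug: structure + target + side-effect features, concatenated.
    return layers.Input(shape=(STRUCT_DIM + TARGET_DIM + SIDE_DIM,))

drug_a, drug_b = drug_input(), drug_input()
x = layers.concatenate([drug_a, drug_b])
for units in (512, 256, 64):          # funnel into high-level interaction features
    x = layers.Dense(units, activation="relu")(x)
    x = layers.Dropout(0.3)(x)
out = layers.Dense(1, activation="sigmoid")(x)   # probability of an adverse DDI

model = Model([drug_a, drug_b], out)
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy",
                       tf.keras.metrics.Precision(),
                       tf.keras.metrics.Recall()])
```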
The telecommunications services market faces significant challenges in an increasingly flexible, customer-driven environment. Research has shown that monopolization of the spectrum by one operator reduces competition and harms both users and the overall dynamics of the sector. This article presents a proposal to predict the number of users, the level of traffic, and operators’ revenues in the telecommunications market using artificial intelligence. Deep Learning (DL) is implemented through a Long Short-Term Memory (LSTM) network as the prediction technique. The database covers the users, revenues, and traffic of 15 network operators, obtained from the Communications Regulation Commission of the Republic of Colombia. The ability of LSTMs to handle temporal sequences, long-term dependencies, changing conditions, and complex data makes them a strong strategy for predicting and forecasting the telecom market. Although various works combine LSTMs and telecommunications, many prediction questions remain open, and continued research should focus on providing cognitive engines to address further challenges. MATLAB is used for the design and implementation. The low Root Mean Squared Error (RMSE) values and acceptable Mean Absolute Percentage Error (MAPE) levels, especially in an environment characterized by high variability in the number of users, support the conclusion that the implemented model achieves excellent predictive precision in both open-loop and closed-loop operation.
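The article implements its LSTM in MATLAB; as an illustration only, the Python/Keras analogue below fits a univariate LSTM on a toy user-count series, produces a closed-loop forecast by feeding each prediction back as input, and scores it with the RMSE and MAPE metrics mentioned above. Window length, horizon, and layer sizes are assumptions.

```python
# Illustrative Python/Keras analogue of the article's MATLAB LSTM workflow:
# train on sliding windows, forecast in closed loop, evaluate with RMSE/MAPE.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Sequential

WINDOW, HORIZON = 12, 6                        # 12 past steps in, 6 future steps out

def make_windows(series, window):
    X = np.stack([series[i:i + window] for i in range(len(series) - window)])
    return X[..., None], series[window:]       # (samples, window, 1), (samples,)

series = np.cumsum(np.random.randn(120)).astype("float32") + 100.0  # toy user counts
train, test = series[:-HORIZON], series[-HORIZON:]
X, y = make_windows(train, WINDOW)

model = Sequential([layers.Input(shape=(WINDOW, 1)), layers.LSTM(64), layers.Dense(1)])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=20, verbose=0)

# Closed-loop forecast: each prediction becomes part of the next input window.
history = list(train[-WINDOW:])
forecast = []
for _ in range(HORIZON):
    x = np.array(history[-WINDOW:], dtype="float32")[None, :, None]
    yhat = float(model.predict(x, verbose=0)[0, 0])
    forecast.append(yhat)
    history.append(yhat)

rmse = float(np.sqrt(np.mean((test - np.array(forecast)) ** 2)))
mape = float(np.mean(np.abs((test - np.array(forecast)) / test)) * 100)
print(f"RMSE={rmse:.3f}  MAPE={mape:.2f}%")
```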
Recognizing the importance of competition analysis in telecommunications markets is essential to improving conditions for users and companies. Several indices in the literature assess competition in these markets, mainly through market concentration. Artificial Intelligence (AI) emerges as an effective way to process large volumes of data and detect patterns that are difficult to identify manually. This article presents an AI model based on the LINDA indicator to predict whether oligopolies exist, offering a valuable tool for analysts and professionals in the sector. The model uses the traffic produced, the reported revenues, and the number of users as input variables. Its outputs are the LINDA index computed from the information reported by the operators, the Long Short-Term Memory (LSTM) predictions of the input variables, and the LINDA index predicted from those LSTM forecasts. The obtained Mean Absolute Percentage Error (MAPE) levels indicate that the proposed strategy can be an effective tool for forecasting the dynamic fluctuations of the communications market.
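The market-side computation can be illustrated with one common formulation of the LINDA index over the K largest operators, computed here from illustrative revenue shares; the LSTM that forecasts the input variables follows the same pattern as the sketch after the previous abstract.

```python
# Hedged sketch of one common formulation of the LINDA (Linda) index over the
# K largest operators, computed from their market shares (here derived from
# revenues; traffic or user counts could be used the same way).
import numpy as np

def linda_index(shares):
    """Linda index for the K firms under analysis (shares in any positive scale)."""
    s = np.sort(np.asarray(shares, dtype=float))[::-1]   # descending order
    K = len(s)
    A = np.cumsum(s)                 # A_i: cumulative share of the i largest firms
    total = A[-1]
    q = []
    for i in range(1, K):            # i = 1 .. K-1
        top_avg = A[i - 1] / i                       # average share of top i firms
        rest_avg = (total - A[i - 1]) / (K - i)      # average share of remaining K-i
        q.append(top_avg / rest_avg)
    return sum(q) / (K * (K - 1))

# Example with illustrative revenue shares for the four largest operators.
print(linda_index([45.0, 30.0, 15.0, 10.0]))
```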