Early diagnosis of intracranial hemorrhage (ICH) is critical for saving patients’ lives. Non-contrast computed tomography (NCCT) is the most widely used method for diagnosing ICH, owing to its fast acquisition and its availability in medical emergency facilities. Estimating the hemorrhage volume is important for predicting hematoma progression and mortality. Radiologists can delineate the ICH region manually to estimate the hematoma volume, but this process is time-consuming and subject to inter-rater variability. In this paper, we develop and discuss a two-stage, coarse-to-fine model for intracranial hemorrhage segmentation. In the first stage, a 2D DenseNet is trained for coarse segmentation, and its output mask is cascaded into the fine-segmentation stage along with the input training samples. In the second stage, an nnU-Net model is trained using the coarse model’s segmentation labels together with the ground-truth labels. The resulting solution achieves strong performance for intracranial hemorrhage segmentation.
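The cascade described in the abstract can be sketched as follows: the coarse model’s mask is stacked with the CT slice as an extra input channel for the fine model. This is a minimal illustration of the data flow only; the `coarse_model` thresholding and the 0.6 cutoff are placeholders for the trained 2D DenseNet, not the authors’ actual networks.

```python
def coarse_model(ct_slice):
    """Placeholder for the trained 2D DenseNet coarse segmenter: here it
    simply thresholds normalized intensities to produce a rough mask."""
    return [[1.0 if px > 0.6 else 0.0 for px in row] for row in ct_slice]

def stack_channels(ct_slice, coarse_mask):
    """Concatenate the CT slice and the coarse mask channel-wise, mirroring
    how the coarse output is cascaded into the fine (nnU-Net) stage."""
    return [ct_slice, coarse_mask]

# Toy 3x3 "slice" with normalized intensities (hypothetical values).
ct_slice = [
    [0.1, 0.7, 0.2],
    [0.8, 0.9, 0.1],
    [0.0, 0.3, 0.5],
]
coarse = coarse_model(ct_slice)
fine_input = stack_channels(ct_slice, coarse)
print(len(fine_input), len(fine_input[0]), len(fine_input[0][0]))  # 2 3 3
```

In practice the stacking would operate on tensors rather than nested lists, but the two-channel input shape is the essential point of the cascade.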
Recognizing the discipline category of an abstract is of great significance for automatic text recommendation and knowledge mining. This study therefore collected abstracts in the social sciences and natural sciences from the Web of Science (2010–2020) and constructed discipline-classification models using the machine learning model SVM and the deep learning models TextCNN and SCI-BERT. The SCI-BERT model showed the best performance: its precision, recall, and F1 were 86.54%, 86.89%, and 86.71%, respectively, and its F1 was 6.61 and 4.05 percentage points higher than those of SVM and TextCNN. The constructed model can effectively identify the discipline categories of abstracts and provides effective support for automatic subject indexing.
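For reference, the evaluation metrics quoted above relate as follows; the confusion counts in the helper are hypothetical, but the final check confirms that the reported F1 is the harmonic mean of the reported precision and recall.

```python
def precision_recall_f1(tp, fp, fn):
    """Standard classification metrics from confusion counts
    (true positives, false positives, false negatives)."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Internal consistency of the abstract's reported numbers:
p, r = 0.8654, 0.8689
f1 = 2 * p * r / (p + r)
print(round(100 * f1, 2))  # 86.71, matching the reported F1
```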
The objective of this work was to analyze the effect of using ChatGPT in the teaching-learning process of scientific research in engineering. Artificial intelligence (AI) is a topic of great interest in higher education, as it combines hardware, software, and programming languages to implement deep learning procedures. We focused on a specific course on scientific research in engineering, in which we measured competencies, expressed through the indicators of mastery, comprehension, and synthesis capacity, in students who did or did not use ChatGPT to develop and complete their activities. The data were processed with Student’s t-test, and box-and-whisker plots were constructed. The results show that students’ reliance on ChatGPT limits their engagement in acquiring knowledge related to scientific research. This research presents evidence that engineering research students rely on ChatGPT to replace their academic work and consequently do not act dynamically in the teaching-learning process, assuming a static role.
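The group comparison described above can be illustrated with a two-sample t statistic. This sketch uses Welch’s form (a common unequal-variance variant of Student’s t-test) on hypothetical competency scores; the paper’s actual data and exact test variant are not reproduced here.

```python
import math
import statistics

def welch_t(sample_a, sample_b):
    """Welch's two-sample t statistic (unequal-variance t-test)."""
    m1, m2 = statistics.mean(sample_a), statistics.mean(sample_b)
    v1, v2 = statistics.variance(sample_a), statistics.variance(sample_b)
    n1, n2 = len(sample_a), len(sample_b)
    return (m1 - m2) / math.sqrt(v1 / n1 + v2 / n2)

# Hypothetical competency scores: students who did not use ChatGPT
# vs. students who relied on it for their activities.
no_gpt = [14, 15, 16, 15, 17, 16]
with_gpt = [12, 13, 12, 14, 13, 12]
t = welch_t(no_gpt, with_gpt)
print(round(t, 2))  # positive t: the non-user group scores higher
```

A large positive t, compared against the t distribution’s critical value, would support the kind of between-group difference the abstract reports.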
Abrupt changes in environmental temperature, wind, and humidity can pose serious threats to human life. The Gansu marathon disaster in China highlighted the importance of early warning of hypothermia caused by extremely low apparent temperature (AT). Here, a deep convolutional neural network model combined with a statistical downscaling framework is developed to forecast environmental factors 1 to 12 h in advance, in order to evaluate the effectiveness of deep learning for AT prediction at 1 km resolution. The experiments use ERA5 data for temperature, wind speed, and relative humidity, and the results show that the developed deep learning model can predict an upcoming extremely low AT event in the Gansu marathon region several hours in advance, with better accuracy than climatological and persistence forecasting methods. The hypothermia onset time estimated by the deep learning method with a heat-loss model agrees well with the observed estimate at a 3-hour lead time. The developed deep learning forecasting method is therefore effective for short-term AT prediction and hypothermia warnings in local areas.
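Apparent temperature combines the three forecast variables above. As an illustration, one standard formulation is Steadman’s non-radiative AT, as used by the Australian Bureau of Meteorology; the abstract does not specify which AT formula the authors use, so this particular expression is an assumption.

```python
import math

def apparent_temperature(t_c, rh_pct, wind_ms):
    """Steadman's apparent temperature (deg C, non-radiative form):
    AT = T + 0.33*e - 0.70*ws - 4.00, where e is water vapour
    pressure (hPa) from air temperature (deg C) and relative humidity (%),
    and ws is wind speed (m/s)."""
    e = (rh_pct / 100.0) * 6.105 * math.exp(17.27 * t_c / (237.7 + t_c))
    return t_c + 0.33 * e - 0.70 * wind_ms - 4.00

# Cold, windy, humid conditions drive AT well below the air temperature,
# which is exactly the hypothermia-relevant regime the abstract targets.
print(round(apparent_temperature(5.0, 80.0, 15.0), 1))  # well below 5 deg C
```

Note how strongly the wind term dominates in this regime: each additional m/s of wind lowers AT by 0.7 deg C, which is why sudden wind changes matter so much for hypothermia warnings.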
The telecommunications services market faces substantial challenges in an increasingly flexible, customer-adaptable environment. Research has highlighted that monopolization of the spectrum by a single operator reduces competition and negatively impacts users and the overall dynamics of the sector. This article presents a proposal to predict the number of users, the level of traffic, and the operators’ income in the telecommunications market using artificial intelligence. Deep learning (DL) is implemented through a Long Short-Term Memory (LSTM) network as the prediction technique. The database corresponds to the users, revenues, and traffic of 15 network operators, obtained from the Communications Regulation Commission of the Republic of Colombia. The ability of LSTMs to handle temporal sequences, long-term dependencies, adaptability to change, and complex data makes them an excellent strategy for predicting and forecasting the telecom market. Although various works apply LSTMs to telecommunications, many prediction questions remain open, and continued research should focus on providing cognitive engines to address further challenges. MATLAB is used for the design and subsequent implementation. The low Root Mean Squared Error (RMSE) values and the acceptable Mean Absolute Percentage Error (MAPE) levels, especially in an environment characterized by high variability in the number of users, support the conclusion that the implemented model achieves excellent prediction precision in both open-loop and closed-loop modes.
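The two error metrics the abstract relies on, RMSE and MAPE, are defined as follows. The LSTM itself is not reproduced here (the paper implements it in MATLAB), and the traffic values below are hypothetical.

```python
import math

def rmse(actual, predicted):
    """Root Mean Squared Error: penalizes large deviations quadratically."""
    return math.sqrt(
        sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)
    )

def mape(actual, predicted):
    """Mean Absolute Percentage Error: scale-free, in percent."""
    return 100.0 * sum(
        abs((a - p) / a) for a, p in zip(actual, predicted)
    ) / len(actual)

# Hypothetical actual vs. predicted monthly traffic values.
actual = [100.0, 120.0, 130.0, 110.0]
predicted = [98.0, 123.0, 128.0, 111.0]
print(round(rmse(actual, predicted), 3), round(mape(actual, predicted), 3))
```

MAPE is the more interpretable metric when user counts vary widely across operators, since it normalizes each error by the true value; RMSE complements it by flagging occasional large misses.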
Copyright © by EnPress Publisher. All rights reserved.