The human brain has been described as a complex system, and its study by means of neurophysiological signals has revealed both linear and nonlinear interactions. In this context, entropy metrics have been used to characterize brain behavior in the presence and absence of neurological disorders, and entropy mapping is of particular interest for the study of progressive neurodegenerative diseases such as Alzheimer’s disease. The aim of this study was to characterize the dynamics of brain oscillations in Alzheimer’s disease by means of permutation entropy and the amplitude of low-frequency oscillations computed from BOLD signals of the default mode network and the executive control network in Alzheimer’s patients and healthy individuals, using a database extracted from the Open Access Series of Imaging Studies (OASIS). The results revealed a higher discriminative power for permutation entropy than for the amplitude of low-frequency fluctuations (ALFF) and the fractional amplitude of low-frequency fluctuations (fALFF). Increased permutation entropy was found in regions of the default mode network and the executive control network in patients. The posterior cingulate cortex and the precuneus showed differential characteristics between the two groups when assessed with permutation entropy. No significant correlations were found between the metrics and clinical scales. The results demonstrate that permutation entropy can characterize brain function in Alzheimer’s patients and reveals information about nonlinear interactions that is complementary to the characteristics obtained from the amplitude of low-frequency oscillations.
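The core computation behind the entropy metric used above, permutation entropy, maps each short window of a signal to its ordinal (rank) pattern and measures the Shannon entropy of the pattern distribution. The sketch below is a minimal, generic implementation of that idea (Bandt–Pompe ordinal patterns with normalization to [0, 1]); the parameter names and defaults are illustrative, not the study’s exact pipeline:

```python
import math
from collections import Counter

def permutation_entropy(x, m=3, delay=1):
    """Normalized permutation entropy of a sequence x.

    m     -- embedding dimension (length of each ordinal pattern)
    delay -- time delay between samples within a pattern
    Returns a value in [0, 1]; 0 for a fully predictable (e.g. monotone)
    series, values near 1 for patterns that are close to equiprobable.
    """
    counts = Counter()
    for i in range(len(x) - (m - 1) * delay):
        window = x[i : i + m * delay : delay]
        # Ordinal pattern: indices of the window samples sorted by value
        counts[tuple(sorted(range(m), key=window.__getitem__))] += 1
    total = sum(counts.values())
    h = -sum((c / total) * math.log(c / total) for c in counts.values())
    return h / math.log(math.factorial(m))  # normalize by log(m!)

# A monotone ramp has a single ordinal pattern, hence zero entropy,
# while an irregular series yields a strictly positive value.
pe_ramp = permutation_entropy(list(range(20)))          # 0.0
pe_irregular = permutation_entropy([3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8])
```

In practice the choice of m and delay matters (the study’s settings are not given in the abstract), and the metric would be applied per region to the BOLD time series.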
In this study, the nanoscale microstructural evolution of a 6061-T6 aluminum alloy after laser shock processing (LSP) was investigated. A 6061-T6 alloy plate was subjected to multiple LSP impacts. The LSP-treated area was characterized by X-ray diffraction (XRD), and the microstructure of the samples was analyzed by transmission electron microscopy (TEM). Focused ion beam (FIB) tools were used to prepare TEM samples from precisely located areas. It was found that, even though aluminum has a high stacking fault energy, LSP led to the formation of ultrafine grains and deformation defects such as dislocation cells and stacking faults. The stacking fault probability (PSF) of the LSP-treated alloy was obtained from XRD: deformation-induced stacking faults lead to peak position shifts, broadening, and asymmetry of the diffraction peaks. XRD analysis and TEM observations revealed significant densities of stacking faults in the LSP-treated 6061-T6 alloy. The mechanical properties of the LSP-treated alloy were also determined to understand its hardening behavior in the presence of a high concentration of structural defects.
Cardiovascular image analysis is a useful tool for the diagnosis, treatment, and monitoring of cardiovascular diseases. Imaging techniques allow non-invasive quantitative assessment of cardiac function, providing morphological, functional, and dynamic information. Recent technological advances in ultrasound have made it possible to improve the quality of patient care through modern image processing and analysis techniques. However, the acquisition of these dynamic three-dimensional (3D) images produces large volumes of data, from which cardiac structures must be extracted and analyzed over the cardiac cycle. Extraction, 3D visualization, and quantification tools are currently used in the clinical routine, but they still require significant interaction from the physician. These elements justify the development of new efficient and robust algorithms for structure extraction and cardiac motion estimation from 3D images. Making new means of accurately assessing cardiac anatomy and function from 3D images available to clinicians would represent a definite advance toward a complete description of the heart from a single examination. The aim of this article is to review the advances made in 3D cardiac ultrasound imaging and to survey the areas that have been studied under this imaging modality.
The purpose of this study is to investigate the relationship between the use of business intelligence (BI) applications in accounting, particularly in invoice handling, and the resulting disruption and technical challenges. Traditionally a manual process, accounting has changed fundamentally with the incorporation of BI technology, which automates processes and allows for sophisticated data analysis. This study addresses the lack of understanding of the strategic implications and nuances of implementation. Data were collected from 467 accounting-stakeholder surveys and analyzed quantitatively using correlational analysis. Multiple regression was used to investigate the effects of BI adoption and technical sophistication on operational and organizational performance enhancements. The results show a weak association between the use of BI tools and operational enhancements, indicating that invoice processing time has decreased. Challenges due to information privacy and bias had significant negative effects on both operational and organizational performance. This study suggests that successful implementation of BI technology requires an integrated plan that focuses on strategic management, organizational learning, and sound policies. This paper informs practitioners of how accounting is being transformed in the digital age, motivating accountants and policy makers to better understand accounting as it evolves with technology, and encouraging businesses to invest in concomitant advances.
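The correlational step of such an analysis can be sketched generically as computing a Pearson correlation coefficient between an adoption measure and a performance measure. The function below is a standard textbook implementation; the variable names and toy data are purely illustrative and are not the study’s survey data:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical toy data: BI-adoption scores vs. invoice-processing improvement
adoption = [1, 2, 3, 4, 5, 6]
improvement = [2.1, 2.0, 2.6, 2.4, 3.0, 2.9]
r = pearson_r(adoption, improvement)  # positive but well below 1: a weak-to-moderate association
```

A value of r near zero would indicate little linear association, while the multiple-regression step mentioned in the abstract extends this idea to several predictors at once.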
Named Entity Recognition (NER), a core task in Information Extraction (IE) alongside Relation Extraction (RE), identifies and extracts entities such as place and person names across various domains. NER has improved business processes in both the public and private sectors but remains underutilized in government institutions, especially in developing countries such as Indonesia. This study examines which government fields have utilized NER over the past five years, evaluates system performance, identifies common methods, highlights countries with significant adoption, and outlines current challenges. In total, 64 international studies from 15 countries were selected following the PRISMA 2020 guidelines. The findings are synthesized into a preliminary ontology design for government NER.
The range migration algorithm (RMA) is an accurate imaging method for processing synthetic aperture radar (SAR) signals. However, this algorithm requires a large amount of computation when performing Stolt mapping. In high-squint and wide-beamwidth imaging, this operation also requires a large amount of memory to store the resulting spectrum after Stolt mapping, because the spectrum is significantly expanded. A modified Stolt mapping that does not expand the signal spectrum while still maintaining the processing accuracy is proposed in this paper to improve the efficiency of the RMA when processing frequency-modulated continuous-wave (FMCW) SAR signals. The modified RMA has roughly the same computational load and requires the same amount of memory as the range-Doppler algorithm (RDA) when processing FMCW SAR data. In extreme cases, when the original spectrum is significantly modified by the Stolt mapping, the modified RMA achieves better focusing quality than the traditional RMA. Simulated and real data are used to verify the performance of the proposed RMA.
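The costly step the abstract refers to, Stolt mapping, resamples the 2-D spectrum from the radar wavenumber axis kr onto a uniform range-wavenumber axis ky = sqrt(kr^2 - kx^2). The sketch below shows that interpolation for a single azimuth-wavenumber line; it is a generic illustration with linear interpolation (practical implementations typically use sinc or spline kernels), and all names are illustrative rather than the paper’s modified algorithm:

```python
import math

def stolt_interpolate(spectrum, kr, kx, ky_uniform):
    """Resample one azimuth line of a SAR spectrum via the Stolt mapping.

    spectrum   -- spectrum samples at the (uniform) kr positions
    kr         -- radar wavenumber axis of the input samples (ascending)
    kx         -- azimuth wavenumber of this line
    ky_uniform -- desired uniform output grid in ky = sqrt(kr^2 - kx^2)
    """
    # Nonuniform ky positions that the input samples map to
    ky_src = [math.sqrt(max(k * k - kx * kx, 0.0)) for k in kr]
    out = []
    for ky in ky_uniform:
        if ky <= ky_src[0]:          # clamp below the mapped support
            out.append(spectrum[0])
            continue
        if ky >= ky_src[-1]:         # clamp above the mapped support
            out.append(spectrum[-1])
            continue
        # Locate the bracketing source samples and interpolate linearly
        j = next(i for i in range(1, len(ky_src)) if ky_src[i] >= ky)
        t = (ky - ky_src[j - 1]) / (ky_src[j] - ky_src[j - 1])
        out.append((1 - t) * spectrum[j - 1] + t * spectrum[j])
    return out

# With kx = 0 the mapping is the identity, so resampling at the original
# kr positions returns the input spectrum unchanged.
identity = stolt_interpolate([1.0, 2.0, 3.0], [1.0, 2.0, 3.0], 0.0, [1.0, 2.0, 3.0])
```

Because ky < kr whenever kx is nonzero, the mapped spectrum spreads over a wider ky interval at high squint, which is the spectrum expansion (and memory cost) that the proposed modified mapping avoids.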
Copyright © by EnPress Publisher. All rights reserved.