This study comprehensively evaluates system performance through thermodynamic and exergy analyses of hydrogen production by water electrolysis. Energy inputs, hydrogen and oxygen production capacities, the exergy balance, and the losses of the electrolyzer system were examined in detail. The analysis shows that most of the energy losses arise from heat losses and electrochemical conversion processes. It was also observed that increasing the electrical input raises hydrogen and oxygen production, but beyond a certain point the rate of efficiency improvement slows. According to the exergy analysis, electricity was the largest energy input to the system, hydrogen stood out as the main product, and oxygen and exergy losses were important factors affecting system performance. The results, in line with other studies in the literature, show that integrating advanced materials, low-resistance electrodes, heat recovery systems, and renewable energy is critical to increasing the efficiency of electrolyzer systems and minimizing energy losses. The modeling results further indicate that machine learning approaches have significant potential to achieve high accuracy in electrolysis performance estimation and process monitoring. The study aims to contribute to the development of hydrogen generation technologies and to inform global and regional decision-making for sustainable energy policies.
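For orientation, a minimal textbook-style sketch of the exergy balance implied by this abstract is given below. The control volume, the inclusion of feed-water exergy, and the symbols (electrical power input, specific exergies of the product streams, destroyed and lost exergy) are assumptions for illustration and may differ from the formulation used in the study.

```latex
% Generic exergy efficiency and exergy balance of a water electrolyzer (illustrative sketch)
\[
  \eta_{ex} \;=\; \frac{\dot{m}_{H_2}\, ex_{H_2} + \dot{m}_{O_2}\, ex_{O_2}}
                       {\dot{W}_{el} + \dot{m}_{H_2O}\, ex_{H_2O}}, \qquad
  \dot{Ex}_{dest} \;=\; \dot{W}_{el} + \dot{m}_{H_2O}\, ex_{H_2O}
                       - \dot{m}_{H_2}\, ex_{H_2} - \dot{m}_{O_2}\, ex_{O_2} - \dot{Ex}_{loss}
\]
```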
Soil salinization is a pressing challenge for agricultural productivity and environmental sustainability, particularly in arid and semi-arid coastal regions. This study investigates the spatial variability of soil electrical conductivity (EC) and its relationship with key cations and anions (Na⁺, K⁺, Ca²⁺, Mg²⁺, Cl⁻, CO₃²⁻, HCO₃⁻, SO₄²⁻) along the southeastern coast of the Caspian Sea in Iran. Using a combination of field-based soil sampling, laboratory analyses, and Landsat 8 spectral data, linear (Multiple Linear Regression, MLR; Partial Least Squares Regression, PLSR) and nonlinear (Artificial Neural Network, ANN; Support Vector Machine, SVM) modeling approaches were employed to estimate and map soil EC. Results identified Na⁺ and Cl⁻ as the primary contributors to salinity (r = 0.78 and r = 0.88, respectively), with NaCl salts dominating the region’s soil salinity dynamics. Secondary contributions from potassium chloride (KCl) and magnesium chloride (MgCl₂) were also observed. Coastal landforms such as lagoon relicts and coastal plains exhibited the highest salinity levels, attributed to geomorphic processes and anthropogenic activities. Among the predictive models, the SVM algorithm outperformed the others, achieving higher R² values and lower RMSE (RMSE_Test = 27.35 and RMSE_Train = 24.62), underscoring its effectiveness in capturing complex soil-environment interactions. This study highlights the utility of digital soil mapping (DSM) for assessing soil salinity and provides actionable insights for sustainable land management, particularly for mitigating salinity and enhancing agricultural practices in vulnerable coastal systems.
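The sketch below illustrates the nonlinear SVM (support vector regression) workflow described in this abstract: predicting measured EC from Landsat 8 band reflectances and reporting train/test R² and RMSE. The file name, column names, and hyperparameters are hypothetical placeholders, not taken from the study.

```python
# Illustrative SVR workflow for EC prediction from Landsat 8 bands (hypothetical data).
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error, r2_score

samples = pd.read_csv("soil_samples_with_landsat_bands.csv")   # hypothetical sample table
X = samples[["B2", "B3", "B4", "B5", "B6", "B7"]]              # Landsat 8 reflectance bands
y = samples["EC_dS_m"]                                         # laboratory-measured EC

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

# RBF-kernel SVR; standardizing the predictors first is important for SVM-based regression.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
model.fit(X_train, y_train)

for name, Xs, ys in [("Train", X_train, y_train), ("Test", X_test, y_test)]:
    pred = model.predict(Xs)
    rmse = np.sqrt(mean_squared_error(ys, pred))
    print(f"{name}: R2 = {r2_score(ys, pred):.2f}, RMSE = {rmse:.2f}")
```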
The expanding adoption of artificial intelligence systems across high-impact sectors has catalyzed concerns regarding inherent biases and discrimination, leading to calls for greater transparency and accountability. Algorithm auditing has emerged as a pivotal method to assess fairness and mitigate risks in applied machine learning models. This systematic literature review comprehensively analyzes contemporary techniques for auditing the biases of black-box AI systems beyond traditional software testing approaches. An extensive search across technology, law, and social sciences publications identified 22 recent studies exemplifying innovations in quantitative benchmarking, model inspections, adversarial evaluations, and participatory engagements situated in applied contexts like clinical predictions, lending decisions, and employment screenings. A rigorous analytical lens spotlighted considerable limitations in current approaches, including predominant technical orientations divorced from lived realities, lack of transparent value deliberations, overwhelming reliance on one-shot assessments, scarce participation of affected communities, and limited corrective actions instituted in response to audits. At the same time, directions like subsidiarity analyses, human-cent
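As a concrete illustration of the quantitative benchmarking style of audit discussed in this review, the toy sketch below computes two common group-fairness statistics (demographic parity gap and disparate-impact ratio) over the outputs of a black-box decision system. The data are synthetic and the metrics are generic examples, not those used by any specific study in the review.

```python
# Toy quantitative-benchmarking audit of a black-box classifier (synthetic data).
import numpy as np

rng = np.random.default_rng(0)
group = rng.choice(["A", "B"], size=1000)                        # protected attribute
decision = rng.binomial(1, np.where(group == "A", 0.55, 0.40))   # observed black-box decisions

rates = {g: decision[group == g].mean() for g in ("A", "B")}     # positive-outcome rate per group
parity_gap = abs(rates["A"] - rates["B"])
disparate_impact = min(rates.values()) / max(rates.values())

print(f"selection rates: {rates}")
print(f"demographic parity gap: {parity_gap:.3f}")
print(f"disparate impact ratio: {disparate_impact:.3f} (4/5 rule threshold: 0.8)")
```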
Mangrove forests are vital to coastal protection, biodiversity support, and climate regulation. In the Niger Delta, these ecosystems are increasingly threatened by oil spill incidents linked to intensive petroleum activities. This study investigates the extent of mangrove degradation between 1986 and 2022 in the lower Niger Delta, specifically the region between the San Bartolomeo and Imo Rivers, using remote sensing and machine learning. Landsat 5 TM (1986) and Landsat 8 OLI (2022) imagery were classified using the Support Vector Machine (SVM) algorithm. Classification accuracy was high, with overall accuracies of 98% (1986) and 99% (2022) and Kappa coefficients of 0.97 and 0.98. Healthy mangrove cover declined from 2804.37 km² (58%) to 2509.18 km² (52%), while degraded mangroves increased from 72.03 km² (1%) to 327.35 km² (7%), reflecting a 354.46% rise. Water bodies expanded by 101.17 km² (5.61%), potentially due to dredging, erosion, and sea-level rise. Built-up areas declined from 131.85 km² to 61.14 km², possibly reflecting socio-environmental displacement. Statistical analyses, including Chi-square (χ² = 1091.33, p < 0.001) and Kendall’s Tau (τ = 1, p < 0.001), showed strong correlations between oil spills and mangrove degradation. From 2012 to 2022, over 21,914 barrels of oil were spilled, with only 38% recovered. Although paired t-tests and ANOVA results indicated no statistically significant changes at broad scales, localized ecological shifts remain severe. These findings highlight the urgent need for integrated environmental policies and restoration efforts to mitigate mangrove loss and enhance sustainability in the Niger Delta.
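The short sketch below shows the kind of accuracy assessment reported in this abstract: an SVM classifier trained on per-pixel Landsat band values and scored with overall accuracy and Cohen's kappa on held-out reference pixels. The input files, class scheme, and hyperparameters are hypothetical stand-ins for the study's actual workflow.

```python
# Illustrative SVM land-cover classification with overall accuracy and kappa (hypothetical data).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score, cohen_kappa_score

X = np.load("pixel_band_values.npy")     # (n_pixels, n_bands) spectral values, hypothetical file
y = np.load("pixel_class_labels.npy")    # e.g. healthy mangrove, degraded mangrove, water, built-up

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=1
)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=100.0, gamma="scale"))
clf.fit(X_train, y_train)
pred = clf.predict(X_test)

print(f"overall accuracy: {accuracy_score(y_test, pred):.3f}")
print(f"kappa coefficient: {cohen_kappa_score(y_test, pred):.3f}")
```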
The fast-growing field of nanotheranostics is revolutionizing cancer treatment by allowing for precise diagnosis and targeted therapy at the cellular and molecular levels. These nanoscale platforms provide considerable benefits in oncology, including improved specificity of diagnosis and therapy, lower systemic toxicity, and real-time monitoring of therapeutic outcomes. However, the complicated interactions of nanoparticles with biological systems, notably the immune system, present significant obstacles to clinical translation. While certain nanoparticles can elicit favorable anti-tumor immune responses, others cause immunotoxicity, including complement activation-related pseudoallergy (CARPA), cytokine storms, chronic inflammation, and organ damage. Traditional toxicity evaluation approaches are frequently time-consuming, expensive, and insufficient to capture these intricate nanoparticle-biological interactions. Artificial intelligence (AI) and machine learning (ML) have emerged as transformational solutions to these problems. This paper summarizes current achievements in nanotheranostics for cancer, delves into the causes of nanoparticle-induced immunotoxicity, and demonstrates how AI/ML may help anticipate immunotoxic effects and guide the design of safer nanoparticles. Integrating AI/ML with modern computational approaches allows for the detection of potentially hazardous nanoparticle properties, guides the optimization of physicochemical features, and speeds up the development of immune-compatible nanotheranostics tailored to individual patients. The combination of nanotechnology with AI/ML has the potential to fully realize the therapeutic promise of nanotheranostics while assuring patient safety in the age of precision medicine.
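To make the AI/ML screening idea concrete, the toy sketch below trains a classifier to flag potentially immunotoxic nanoparticles from physicochemical descriptors and reports which descriptors drive the predicted risk. The dataset, descriptor names, and model choice are hypothetical illustrations, not the approach of the reviewed work.

```python
# Toy ML screen for nanoparticle immunotoxicity from physicochemical descriptors (hypothetical data).
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

data = pd.read_csv("nanoparticle_descriptors.csv")      # hypothetical descriptor table
features = ["size_nm", "zeta_potential_mV", "surface_coating_code", "aspect_ratio"]
X, y = data[features], data["immunotoxic"]               # binary label, e.g. from in-vitro assays

clf = RandomForestClassifier(n_estimators=300, random_state=0)
scores = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print(f"cross-validated AUC: {scores.mean():.2f} +/- {scores.std():.2f}")

clf.fit(X, y)
for name, imp in sorted(zip(features, clf.feature_importances_), key=lambda t: -t[1]):
    print(f"{name}: {imp:.2f}")   # which physicochemical features most influence predicted risk
```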