Fog computing (FC) has emerged as a modern distributed computing paradigm intended to overcome several limitations of cloud computing while providing a wide range of services. It brings computation and data storage closer to data sources such as sensors, cameras, and mobile devices. The fog computing paradigm is instrumental in scenarios where low latency, real-time processing, and high bandwidth are critical, such as smart cities, industrial IoT, and autonomous vehicles. However, the distributed nature of fog computing introduces complexities in managing and predicting the execution time of tasks across heterogeneous devices with varying computational capabilities. Neural network models have demonstrated exceptional capability in prediction tasks because of their capacity to extract insightful patterns from data. By stacking layers of interconnected nodes, neural networks can capture non-linear interactions and provide accurate predictions across many fields. Choosing the right inputs is therefore essential, since a neural network's predictions depend on the data fed into it. Based on the predicted execution time, a scheduler can select appropriate resources and schedules, improving resource utilization and reducing makespan. In this paper, we propose a neural network model for predicting task execution time in fog computing, with input selection assessed using the Interpretive Structural Modeling (ISM) technique. The proposed model achieved a 23.9% reduction in mean relative error (MRE) compared with state-of-the-art methods.
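As a rough illustration of the prediction component only (not the authors' implementation or their ISM-selected inputs), the sketch below trains a small feed-forward regressor on synthetic task/resource features; the feature names, the synthetic data, and the network size are assumptions for demonstration.

```python
# Hypothetical sketch: feed-forward regression of task execution time in a fog setting.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_percentage_error

rng = np.random.default_rng(0)
n = 2000
X = np.column_stack([
    rng.uniform(100, 10_000, n),   # task length (million instructions) - assumed feature
    rng.uniform(500, 4_000, n),    # fog node speed (MIPS) - assumed feature
    rng.uniform(256, 8_192, n),    # available RAM (MB) - assumed feature
    rng.uniform(1, 100, n),        # link bandwidth (Mbps) - assumed feature
])
# Synthetic ground truth: compute time plus transfer time (illustrative only).
y = X[:, 0] / X[:, 1] + (X[:, 0] * 0.01) / X[:, 3]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=2000, random_state=0)
model.fit(X_tr, y_tr)
mre = mean_absolute_percentage_error(y_te, model.predict(X_te))
print(f"Mean relative error on held-out tasks: {mre:.3f}")
```

A scheduler could then rank candidate fog nodes by the predicted execution time for a given task and dispatch it to the node with the smallest estimate.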
The food supply chain in South Africa faces significant challenges related to transparency, traceability, and consumer trust. As concerns about food safety, quality, and sustainability grow, there is an increasing need for innovative solutions to address these issues. Blockchain technology has emerged as a promising tool to enhance transparency and accountability across various industries, including the food sector. This study explored the potential of blockchain technology to promote transparency and thereby enable a sustainable food supply chain infrastructure in South Africa. The study found that, when applied to the food supply chain, blockchain technology creates an immutable, decentralized ledger of transactions capable of providing real-time, end-to-end visibility of food products from farm to table. This increased transparency can help mitigate risks associated with food fraud, contamination, and inefficiencies in the supply chain. The study further found that blockchain technology can be leveraged to enhance supply chain efficiency and trust among stakeholders. Applied in South Africa, this technology can reshape the agricultural sector by improving production and distribution processes. Its integration into the food supply chain infrastructure can equally improve data management and increase transparency between farmers and food suppliers. Policy-makers and scholars in the fields of service delivery and food security need to conduct more research on blockchain technology and its role in creating a more transparent, efficient, and trustworthy food supply chain infrastructure that addresses food supply problems in South Africa. The paper adopted a qualitative methodology for data collection, and document and content analysis techniques were used to interpret the collected data.
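As a loose illustration of the immutability property described above (not tied to any specific blockchain platform or to the study's design), the following sketch hash-chains hypothetical supply-chain records so that altering an earlier record can be detected from the link stored in later ones; all record fields are invented for the example.

```python
# Hypothetical sketch of hash-chained supply-chain records (farm-to-table events).
import hashlib, json, time

def make_record(prev_hash: str, payload: dict) -> dict:
    # Hash covers prev_hash, timestamp, and payload, linking this record to the previous one.
    body = {"prev_hash": prev_hash, "timestamp": time.time(), "payload": payload}
    body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return body

def verify_link(prev: dict, nxt: dict) -> bool:
    # Recompute the previous record's hash and compare with the link stored downstream.
    recomputed = hashlib.sha256(json.dumps(
        {k: prev[k] for k in ("prev_hash", "timestamp", "payload")},
        sort_keys=True).encode()).hexdigest()
    return recomputed == nxt["prev_hash"]

genesis = make_record("0" * 64, {"producer": "Farm A", "batch": "0001", "event": "harvest"})
shipment = make_record(genesis["hash"], {"batch": "0001", "event": "shipped", "location": "Johannesburg"})
print(verify_link(genesis, shipment))  # True; editing genesis afterwards would make this False
```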
Landslides are destructive geohazards that cause significant economic, environmental, and social damage. State-of-the-art advances in landslide detection and monitoring are made possible by integrating increasingly capable Earth Observation (EO) technologies and Deep Learning (DL) methods with traditional mapping approaches. This review examines the combination of EO and DL for landslide detection, synthesizing knowledge from more than 500 scholarly works. It covers studies that combined satellite remote sensing data, including Synthetic Aperture Radar (SAR) and multispectral imagery, with state-of-the-art DL models, particularly Convolutional Neural Networks (CNNs) and their U-Net variants. The reviewed studies are categorized by methodological development, spatial extent, and validation technique. EO data extend real-time monitoring capabilities, while DL models automate feature recognition and thereby improve detection accuracy. The field faces three critical challenges: the scarcity of training data for building robust models, the limited interpretability of AI predictions, and the difficulty of generalizing across diverse geographical settings. We introduce a combined approach that couples multi-source EO data with physics-informed DL models to improve evaluation and transferability across platforms. Incorporating explainable AI (XAI) techniques and active learning reduces the opacity of deep learning models, thereby improving the trustworthiness of automated landslide maps. The review highlights the need for common datasets, benchmark standards, and interdisciplinary collaboration to advance the field. Future research must combine semi-supervised learning, synthetic data generation, and real-time hazard prediction to optimise the deployment of EO-DL frameworks for landslide risk management. This study integrates EO and AI analysis methods to inform future landslide surveillance systems that help reduce disasters amid accelerating climate change.
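For readers unfamiliar with the U-Net family mentioned above, the sketch below shows a minimal encoder-decoder of that style for pixel-wise landslide segmentation; the band count, patch size, and channel widths are illustrative assumptions, not parameters from any reviewed study.

```python
# Minimal U-Net-style encoder-decoder for per-pixel landslide segmentation (illustrative).
import torch
import torch.nn as nn

def conv_block(c_in, c_out):
    return nn.Sequential(
        nn.Conv2d(c_in, c_out, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(c_out, c_out, 3, padding=1), nn.ReLU(inplace=True),
    )

class TinyUNet(nn.Module):
    def __init__(self, in_ch=4, n_classes=1):   # e.g. 4 spectral bands (assumed)
        super().__init__()
        self.enc1 = conv_block(in_ch, 32)
        self.enc2 = conv_block(32, 64)
        self.pool = nn.MaxPool2d(2)
        self.bottleneck = conv_block(64, 128)
        self.up2 = nn.ConvTranspose2d(128, 64, 2, stride=2)
        self.dec2 = conv_block(128, 64)
        self.up1 = nn.ConvTranspose2d(64, 32, 2, stride=2)
        self.dec1 = conv_block(64, 32)
        self.head = nn.Conv2d(32, n_classes, 1)

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))   # skip connection
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))  # skip connection
        return self.head(d1)                                  # per-pixel landslide logits

patch = torch.randn(1, 4, 128, 128)   # one synthetic 4-band 128x128 patch
print(TinyUNet()(patch).shape)        # -> torch.Size([1, 1, 128, 128])
```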
This study comprehensively evaluates system performance through thermodynamic and exergy analysis of hydrogen production by water electrolysis. Energy inputs, hydrogen and oxygen production capacities, exergy balance, and losses of the electrolyzer system were examined in detail. Most of the energy losses were found to stem from heat losses and electrochemical conversion processes. It was also observed that increasing the electrical input raises hydrogen and oxygen production, but beyond a certain point the rate of efficiency gain slows down. The exergy analysis determined that electricity is the largest energy input to the system, hydrogen is the main product, and oxygen output and exergy losses are important factors affecting system performance. The results, in line with other studies in the literature, show that integrating advanced materials, low-resistance electrodes, heat recovery systems, and renewable energy is critical to increasing the efficiency of electrolyzer systems and minimizing energy losses. The modeling results reveal that machine learning approaches have significant potential to achieve high accuracy in electrolysis performance estimation and process monitoring. This study aims to contribute to the development of hydrogen production technologies and to inform global and regional technological decision-making for sustainable energy policies.
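As a back-of-envelope illustration of the energy and exergy bookkeeping involved (with an assumed specific electricity consumption of 55 kWh per kg of H2, which is not a value from this study), a minimal sketch:

```python
# Illustrative electrolyzer energy/exergy efficiency estimate (assumed input figure).
HHV_H2 = 39.4          # kWh/kg, higher heating value of hydrogen
EXERGY_H2 = 32.5       # kWh/kg, approx. standard chemical exergy of hydrogen

electricity_in = 55.0  # kWh of electricity consumed per kg of H2 produced (assumed)

energy_eff = HHV_H2 / electricity_in
exergy_eff = EXERGY_H2 / electricity_in
exergy_not_recovered = electricity_in - EXERGY_H2   # kWh/kg, losses and destruction

print(f"Energy efficiency (HHV basis): {energy_eff:.1%}")
print(f"Exergy efficiency:             {exergy_eff:.1%}")
print(f"Exergy not recovered per kg H2: {exergy_not_recovered:.1f} kWh")
```

Under these assumed figures, most of the gap between electrical input and the exergy of the hydrogen product corresponds to heat losses and irreversibilities in the electrochemical conversion, which is where heat recovery and low-resistance electrodes act.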
In Nigeria, deforestation has led to an unimaginable loss of genetic variation within tree populations. Regrettably, little is known about the genetic variation of many important indigenous timber species in Nigeria, and specific tools to evaluate the genetic diversity of these species are scarce. Therefore, this study developed species-specific markers for Pterygota macrocarpa using state-of-the-art equipment. Leaf samples were collected from Akure Forest Reserve, Ondo State, Nigeria. DNA isolation, quantification, PCR amplification, gel electrophoresis, post-PCR purification, and sequencing were performed following a standardized protocol. The melting temperatures (Tm) of the DNA fragments ranged from 57.5 ℃ to 60.1 ℃ for primers developed from the MatK gene and 58.7 ℃ to 60.5 ℃ for primers developed from the RuBisCo gene. The characteristics of the ten primers developed fall within the range appropriate for genetic diversity assessment. These species-specific primers are therefore recommended for population evaluation of Pterygota macrocarpa in Nigeria.
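As context for the reported melting temperatures, the sketch below implements two common rule-of-thumb Tm estimates (the Wallace 2+4 rule for short oligos and the basic GC-content formula for longer ones); the example sequence is hypothetical and is not one of the ten primers developed in the study.

```python
# Rule-of-thumb primer melting temperature estimates (illustrative, hypothetical sequence).
def primer_tm(seq: str) -> float:
    """Wallace 2+4 rule for oligos of <=13 nt, basic GC-content formula for longer oligos."""
    seq = seq.upper()
    at = seq.count("A") + seq.count("T")
    gc = seq.count("G") + seq.count("C")
    n = at + gc
    if n <= 13:
        return 2 * at + 4 * gc              # degrees Celsius
    return 64.9 + 41 * (gc - 16.4) / n      # degrees Celsius

print(primer_tm("ATGGCTTACCCATCTGCTTC"))    # hypothetical 20-mer, roughly 51.8 C
```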
To achieve the Paris Agreement's temperature goal, greenhouse gas emissions should be reduced as soon as possible and by as much as possible. CO2 emissions would need to be cut to zero by mid-century, and total greenhouse gas emissions would need to reach net zero shortly thereafter. Achieving carbon neutrality is impossible without carbon dioxide removal from the atmosphere through afforestation/reforestation, and the resulting carbon storage must be ensured for a period of 100 years or more. The study focuses on the theoretical feasibility of an integrated climate project involving carbon storage, emissions reduction, and sequestration through the systemic implementation of plantation forestry of fast-growing eucalyptus species in Brazil, the production of long-life wood building materials, and their long-term deposition. The project defines two performance indicators: a) emission reduction units; and b) financial costs. We identified the baseline scenarios for each stage of the potential climate project and developed different trajectory options for the project scenario. Possible negative environmental and reputational effects, as well as leakages outside the project design, were considered. Over the 7-year plantation life cycle, total CO2 sequestration is expected to reach 403 tCO2·ha⁻¹. As part of the project, we propose to recycle, or deposit for the long term, most of the unused wood residues, which account for 30% of total phytomass. The full project cycle can ensure that up to 95% of the carbon emissions from the grown wood will be sustainably avoided.
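A simple restatement of the quoted figures as arithmetic (illustrative only; the split between recycling and long-term deposition of residues is not modeled here):

```python
# Worked arithmetic from the figures quoted in the abstract (illustrative).
total_sequestration = 403.0   # tCO2 per ha over the 7-year rotation
rotation_years = 7
residue_share = 0.30          # unused wood residues as a fraction of total phytomass
retained_share = 0.95         # upper bound on carbon emissions from grown wood that are avoided

annual_rate = total_sequestration / rotation_years
carbon_retained = total_sequestration * retained_share

print(f"Average uptake: {annual_rate:.1f} tCO2/ha/yr")
print(f"Carbon retained across the full project cycle (upper bound): {carbon_retained:.0f} tCO2/ha")
```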
Copyright © by EnPress Publisher. All rights reserved.