Call for Participation: 9th International Conference on Materials Science and Engineering (MSE 2025).
May 30–31, 2025, Virtual Conference
We invite you to join us at the 9th International Conference on Materials Science and Engineering (MSE 2025).
This conference will act as a major forum for the presentation of innovative ideas, approaches, developments, and research projects in the areas of Materials Science and Engineering. It also aims to provide a platform for exchanging ideas on new emerging trends that need more focus and exposure, and will attempt to publish proposals that strengthen our goals.
Non-Author / Co-Author / Simple Participants (no paper)
100 USD (With proceedings)
Here's where you can reach us by email: mse@mse2025.org or mseconference123@yahoo.com
Multipoint Moving Nodes: The Key to Enhancing Approximate Analytical Solutions of Parabolic Equations
Dalabaev Umurdin and Xasanova Dilfuza, University of World Economy and Diplomacy, Tashkent, Uzbekistan
This article discusses approximate solutions of linear parabolic equations with initial-boundary conditions. The primary focus is on methods that effectively find such solutions by employing a moving finite difference analog of the differential equation. This approach allows us to formulate an approximate analytical solution, significantly simplifying the computation process. By transitioning from the differential equation to an algebraic equation, we obtain a single equation, the solution of which represents an approximate analytical solution to the original problem. However, to achieve higher accuracy in this solution, we apply additional moving nodes, which enhances the results. By using multipoint moving nodes, we can form a system of algebraic equations, the solution of which provides an improved analytical solution. The article also presents numerical experiments that confirm the effectiveness of the proposed method and its advantages over traditional approaches.
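To illustrate the single-node idea on the simplest stationary analog (a sketch under our own assumptions, not an equation quoted from the paper), consider u''(x) = f(x) on [0, 1] with u(0) = A and u(1) = B. Treating x itself as a moving node with steps h_1 = x and h_2 = 1 - x, the difference analog

    \frac{2}{h_1 + h_2}\left( \frac{B - u(x)}{h_2} - \frac{u(x) - A}{h_1} \right) = f(x), \qquad h_1 = x,\; h_2 = 1 - x,

is a single algebraic equation in u(x), whose solution

    u(x) = (1 - x)A + xB - \frac{x(1 - x)}{2}\, f(x)

already serves as an approximate analytical solution (exact for constant f). Placing several moving nodes instead of one yields the system of algebraic equations the abstract describes.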
Boundary conditions, differential equation, multipoint moving nodes, initial-value problem.
The Number π: From Antiquity to Modern Computing with GeoGebra Software
Cloves Rocha Sampaio Júnior, Retired Federal Civil Servant - Independent Researcher
In this work, we explore the number π from a new perspective, using advanced mathematical methods and the GeoGebra software for investigations in Cartesian, isometric and polar coordinates. We focus on proportional relationships in the first quadrant of the unit circle, extending the analyses to the following quadrants with precision and rigorous logic. Traditionally, π is recognized as an irrational number, intrinsic to nature and geometry. However, our research suggests the possibility of understanding π in a rational and repetitive way, through divisions, infinite series, and pertinent theorems. We propose a renewed view on π, questioning established assumptions and enriching mathematical understanding by challenging old concepts and broadening our understanding. The methodology analyzes the relationship between circumference perimeters and their diameters, radian angles, and square roots. The integration of these factors with inscribed and circumscribed polygons indicates a rational pattern for the number π. Advanced computing is essential for calculations that surpass ruler and compass capabilities. Recent studies suggest that, although traditional geometric methods do not determine an exact and repeatable value for π, advanced computational techniques and mathematical models can reveal new aspects of this essential constant.
Number π; GeoGebra Software; Pure trigonometry.
The Vortex Impulse Theory for Finite Wings
Shiang Yu Lee, Chief Scientist, LW Aeroscience, 583 Battery Street Unit 2106N, Seattle WA
Based on the observation that lift-producing circulation is the result of divergent reflective flows circumventing planform edges, this research presents a comprehensive new theory for finite rectangular wing aerodynamics. First, it features an unconventional expression for potential lift, emphasizing geometrical aspects that reduce circulation strengths and lifting capacity. Of fundamental significance is the derivation of a novel "crossflow separation vortex normal force" through the application of the long-overlooked Milne-Thomson "Vortex Impulse Theory." An empirical non-linear pressure drag normal force relationship is also provided to complement linear theories. Since critical experimental evidence shows that potential lift is absent in narrow wing cases, it is excluded in that range and only provides limited contributions in wider configurations. This distinctive three-lifting-element theory is found to provide highly accurate predictions for all experimental cases evaluated.
Aerodynamics, Airplane, Wing, Vortex, Circulation, Navier-Stokes Equations, Lift, Drag.
Did Fishing Nets with Shell Weights Precede the Bow and Arrow? Digitally Edited Photographs Computer-Model Another Use for Prehistoric Punctured Shells
Silvia Stein1 and Susana Pacheco2, 1Independent Communication Researcher on researchgate.net, Florence, Italy, 2Universidade NOVA de Lisboa, Lisbon, Portugal
Digital photographs can produce a mental event reshaping perception [1], generating a use-wear redescription of the use of prehistoric shells as net weights [2]. We rely on string [3] preceding the bow and arrow. String is essential in making fishing nets with net weights [4]. We address how altered photographs can create false memories [5], and show that by strategically reconfiguring digital shell images to align with prehistoric tool-use studies [6], a better visual guide is introduced. This might explain the associated evolution of the precuneus, a brain region involved in bimanual processes and spatial thinking [7], preceding the bow and arrow by about 30,000 years. As etched on the Blombos ochre stone [8], the net had to have a diamond shape to which a string of shells was attached, preventing the shells from entangling. Digital editing then helps us mirror the prehistoric cognitive style and visual grammar [9], describing how the net would look.
Archaeology, Photograph, Precuneus, Shell Weights, String, Synthetic Memory, Use-Wear.
Development of a Statistical Thermodynamic Model for Calculation of Self-diffusion Parameters in Metals
Serhii Bobyr1, Martin Sahlberg1, Joakim Odqvist2, 1Uppsala University, Ångström Laboratory, Lägerhyddsvägen 1, Box 538, SE 751 21, Uppsala, 2KTH Royal Institute of Technology, Department of Materials Science and Engineering, Brinellvägen 23, SE-100 44, Stockholm, Sweden
Diffusion is one of the most important transfer processes of substances and masses in metals. The theory of diffusion in metals is based on the fundamental concepts of physical kinetics and is a very important branch of materials science. The purpose of this project is to develop a statistical thermodynamic model for self-diffusion in metals and to apply it to calculating the diffusion coefficients of atoms in metals and alloys. The relationship between the diffusion flux of vacancies and the gradient of their chemical potential has been obtained from the basic principles of statistical thermodynamics. For a solid solution of vacancies in a metal, an expression for the self-diffusion coefficient has been found. To calculate the activation energy of self-diffusion in metals and simple alloys, a statistical calculation model (SCM) is proposed that combines energy calculations from first principles, statistical processing of experimental data on self-diffusion in metals, and a physicochemical model using the correlation between the activation energy of self-diffusion and the melting temperature of metals. The calculations are compared with the known experimental data on diffusion in iron and other metals, with good agreement between the results. Based on the processing of experimental results on diffusion in metals using the SCM, thermodynamic parameters of self-diffusion activation energies in α- and γ-Fe, Ti, V, Mo, W, Ag, Cu and other metals were established. The SCM for self-diffusion in metals and simple alloys was implemented in a corresponding database, including 43 base metals and silicon.
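For context, the quantities named in this abstract relate through the standard Arrhenius form of the self-diffusion coefficient and the well-known empirical correlation of the activation energy with the melting temperature (generic textbook relations, not formulas quoted from the paper):

    D = D_0 \exp\!\left(-\frac{Q}{RT}\right), \qquad Q \approx K\, T_m,

where D_0 is the pre-exponential factor, Q the activation energy of self-diffusion, R the gas constant, T the absolute temperature, T_m the melting temperature, and K an empirical proportionality constant.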
metals; statistical thermodynamics; self-diffusion; vacancies; pre-exponential factor; activation energy.
On the Origin of Inertia: Implications for Dark Matter and Dark Energy
Konstantinos I. Tsarouchas, Department of Physics, National Technical University of Athens, Greece
In this paper, we present a new theory explaining the origin of inertia, based on two key ideas: gravity as a spin-1 gauge field theory and the relativity of all motion. This theory proposes that inertial mass is influenced by the distribution of matter across the Universe, offering potential insights into dark matter and dark energy. For gravity to be described by a spin-1 gauge field theory, we propose that gravitational mass m, distinct from inertial mass, is a Lorentz invariant and should be replaced by an imaginary mass im for like masses to attract. According to this theory, while gravitational mass is imaginary, inertial mass remains a real quantity. These two key ideas lead to the principle of equivalence and the conclusion that gravity shapes the geometry of spacetime, which is a Finsler-Randers spacetime. For bodies with gravitational mass, this curved spacetime is equivalent to a flat Minkowski spacetime with an added gravitomagnetic field. The theory shows that external inertial forces originate from the entire Universe, while internal forces depend on the body’s structure. In free fall, all bodies experience only external forces, explaining why they fall at the same rate regardless of internal structure.
Gravitomagnetism, Mach’s principle, origin of inertia, dark matter, dark energy.
The Quantistic Nature of Space-Time
Flavio Barbiero, Pisa University, Italy
The invariance of the speed of light in reference frames (RFs) in motion proves that space-time is a quantistic entity, because its “density” is determined only in relation to an observer. The difference in this density between two observers in relative motion is given by a set of transformation equations obtained by analysing the propagation of a flash of light in a 3D RF. They show that the RFs of the two observers are displaced, one with respect to the other, towards an imaginary direction by a spatial component transverse to the motion, and as a consequence their respective “density” is reduced by a ratio √(1 − v²/c²). Therefore the two observers, watching the same physical phenomenon (for example the displacement of an object from point A to point B), would measure different values for space and time, the same value for velocities, but different accelerations and therefore different forces acting on the object.
Invariance of speed of light, relativity of space-time, longitudinal and transverse mass, the ether, gravito-magnetic field.
The Nuclear Second Law
Fred D. Lang, Exergetic Systems Limited, Salt Spring Island, Canada
This paper asserts that nuclear phenomena have had no traditional computational nexus with the Second Law of thermodynamics. Since N Reactor days, there has been no direct nexus between nuclear power and its resultant energy flow delivered to the coolant. N Reactor neutron flux was not well measured, but was the motive force behind delivering 4000 MWt to the Columbia River. It is well known that the neutron flux in a 1270 MWe PWR is approximately 1.0×10^13 neutrons·cm^-2·sec^-1. This flux is a product of Neutron Transport Theory (NTT). However, back-calculating a flux based on the PWR’s rated 3640 MWt divided by average macroscopic cross section, fissile volume and recoverable MeV/fission produces twice the NTT flux. From the time of the Manhattan Project, nuclear engineers have treated flux as a relative parameter: hard to measure, a relative value to be normalized, not uniform, etc. Postulated in this work is that nuclear phenomena are inertial processes, devoid of terrestrial reference. Such phenomena encompass fission and fusion power, astrophysics and radioactivity, provided the reaction produces a mass defect. This approach demands reinterpretation of Einstein’s ΔE = c²Δm by describing his ΔE as an exergetic potential, an ultimate “Free Exergy”. Free Exergy consists of both recoverable and irreversible portions of the MeV release. In transference to a coolant, the recoverable release produces an exergetic increase (ṁΔg) in the fluid; the exergy’s T_Ref derives from an Inertial Conversion Factor (Ξ). Importantly, Ξ also transforms this recoverable nuclear release to an explicit, and consistent, thermal power (ṁΔh).
First and Second Laws, Core Thermal Power, Irreversibilities, Neutrino, Neutron Flux.
Theoretical Derivation of Bohr’s Postulate for the Charge in a Hydrogen Atom. Coulomb’s Law in Logarithmic form with Corrections for Strong Interactions at Small Distances. The Physical Meaning of Planck’s Constant
Vadim Khoruzhenko, France
This article proposes a revolutionary mathematical model that introduces a fifth spatial dimension, “space density”, as a fundamental property determining gravitational, electromagnetic, strong, and weak interactions. The model is based on the hypothesis that changes in space density can lead to phenomena analogous to the known fundamental forces. Through a series of mathematical derivations, it is shown how the distribution of space density around spherical objects influences classical field theories. The main results include:
1. Theoretical Proof of Bohr’s Postulate: For the first time, a theoretical justification is proposed for Bohr’s postulate on the quantization of the electron’s angular momentum in a hydrogen atom, which is key to quantum mechanics.
2. Connection Between Charge and Mass: A novel connection between charge and mass is established, allowing mass to be interpreted as the energy required to compress a clump of space density.
3. Complex Solution and Imaginary Energy: It is shown that the interaction of two clumps of space density has only a complex solution, where the imaginary part determines the resonant frequency of the system.
4. Strong and Weak Interactions: The model offers an explanation for strong and weak interactions through the properties of space density, opening new possibilities for understanding nuclear forces.
This work not only reproduces known physical regularities but also offers a new perspective on the nature of fundamental interactions, linking them to the intrinsic properties of space.
The Information Framework: A Pascal's Wager for 21st Century Science?
Degroo Michaël, Ai2ia.org, Rabat, Morocco
Information is a pivotal concept, central to research and technological advancements. In quantum physics, information plays an active role in shaping reality, moving beyond mere descriptions of matter and energy. Complexity sciences view information as an organizing principle, and artificial intelligence exemplifies its importance through exponential growth. The Information Framework expands current scientific paradigms by recognizing information as a fundamental, irreducible component of reality alongside matter and energy, represented by the formula [ R' = (E=mc²)↦I ]. It advocates that information exists "in itself" with causal power and relational significance, challenging reductionist approaches. While philosophical, the framework is scientifically grounded, proposing a trans-disciplinary perspective that integrates physics, biology, AI, and philosophy to understand reality's interactive dimensions. Still under development, it invites critique, refinement, and testing to address challenges across diverse fields and deepen our understanding of the interplay between matter, energy, and information.
Theoretical Physics, Quantum Physics, Metaphysics, Ontology, Philosophy of Science, Epistemology.
Geometric Unification of Forces in the Continuous Creation Model: Theoretical Framework, Physical Insights, and Experimental Pathways
Brian Hall, Independent Researcher, USA
The Continuous Creation Model (CCM) presents a groundbreaking geometric framework for unifying fundamental forces. By encoding physical constants, quantum phenomena, and spacetime into a dynamic symmetry landscape, the CCM reveals how geometric frustration, memory effects, and spectral projection drive emergent behaviors. This paper outlines the mathematical formalism, quantum foundations, and testable predictions of the CCM, positioning it as a unified theory accessible to experimental validation. With implications for quantum gravity, particle physics, and cosmology, the CCM offers a novel perspective on the interconnectedness of nature’s forces.
Geometric Frustration, Pentagonal Symmetry, Field Theory, Particle Mass Prediction, Oscillatory Model.
Exploring Quantum Gravity at the Nanoscale
Michael Boyd, United States of America
This study recalculates quantum gravity parameters for seven nanobumps and seven nanopits on a spinning hard disk platen, exploring nanoscale gravitational effects. The spacetime curvature metric (G00) matches provided table values, and the antigravity force for nanopits is used to calculate tunneling barrier magnitudes. Using the Puthoff model’s predictions, the analysis examines a 20-nanosecond pre-signal in expanded spacetime (nanopit) and a 20-nanosecond post-signal in denser spacetime (nanobump), calculating their apparent speeds relative to the speed of light. Results are analyzed within a semiconductor-like model, spacetime metric engineering, extra-dimensional gravity, and General Relativity, including frame-dragging effects. The Casimir force is assessed for superposition with gravitational forces. The findings support magnified gravitational interactions at nanoscale distances, with tunneling-like behavior in nanopits indicating a quantum gravity regime and confirm spacetime dynamics predicted by the Puthoff model.
quantum gravity, Casimir force, General Relativity, spacetime metric engineering.
A Strategic Framework for AI-Driven IP Portfolio Development and Evaluation with ISO Standards
Albert van Niekerk, Department of Applied Research, AvNFoundationRsa, RSA
Intellectual property (IP) management is evolving with artificial intelligence (AI) integration, offering enhanced efficiency, accuracy, and strategic decision-making. This paper presents a framework for AI-driven IP portfolio development, positioning prompt engineering as both an intellectual asset and a tool for optimising key processes such as identification, valuation, and monetisation. The framework aligns with international standards, including ISO 56005, ISO 10668, and ISO 31000, ensuring compliance and governance. A structured validation process using weighted scoring and statistical methods enhances the reliability of AI-generated insights. Case studies highlight the framework's speed, scalability, and cost-efficiency benefits while addressing data quality, bias, and regulatory compliance challenges. The paper concludes with recommendations for businesses and policymakers to adopt AI-driven IP strategies and suggests future research directions, contributing to the growing discourse on AI in IP management.
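As an illustration of the weighted-scoring validation step described above, here is a minimal Python sketch; the criteria, weights, scores, and threshold are hypothetical placeholders, not values from the paper:

    # Weighted-scoring validation of an AI-generated IP insight (hypothetical weights).
    CRITERIA_WEIGHTS = {
        "novelty": 0.30,
        "market_relevance": 0.25,
        "legal_robustness": 0.25,
        "monetisation_potential": 0.20,
    }

    def weighted_score(scores: dict[str, float]) -> float:
        """Combine per-criterion scores (0-10) into one weighted score."""
        return sum(w * scores[c] for c, w in CRITERIA_WEIGHTS.items())

    def passes_validation(scores: dict[str, float], threshold: float = 7.0) -> bool:
        """Accept the insight only if its weighted score clears the review threshold."""
        return weighted_score(scores) >= threshold

    candidate = {"novelty": 8.5, "market_relevance": 7.0,
                 "legal_robustness": 6.5, "monetisation_potential": 9.0}
    print(weighted_score(candidate), passes_validation(candidate))  # 7.725 True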
Artificial intelligence (AI), Intellectual property (IP) management, Prompt engineering, IP portfolio development, ISO standards compliance, IP valuation and monetisation, AI-driven decision-making, Risk management, Ethical AI considerations, Innovation strategy.
Evaluating Prompt-Learning-Based API Review Classification Through Pre-trained Models
Xia Li, Allen Kim, The Department of Software Engineering and Game Design and Development, Kennesaw State University, Marietta, USA
To improve the work efficiency and code quality of modern software development, users often reuse Application Programming Interfaces (APIs) provided by third-party libraries and frameworks rather than implementing functionality from scratch. However, due to time constraints in software development, API developers often refrain from providing detailed explanations or usage instructions for APIs, resulting in confusion for users. It is therefore important to categorize API reviews into different groups so they can be used easily. In this paper, we conduct a comprehensive study to evaluate the effectiveness of prompt-based API review classification with various pre-trained models such as BERT, RoBERTa, and BERTOverflow. Our experimental results show that prompts with complete context achieve the best effectiveness, and that RoBERTa outperforms the other two models due to the size of its training corpus. We also utilize the widely used fine-tuning approach LoRA to show that the training overhead can be significantly reduced (e.g., a 50% reduction) without loss of classification effectiveness.
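A minimal sketch of a LoRA fine-tuning setup for such a classifier, assuming the Hugging Face transformers and peft libraries; the label count and the example review below are placeholders, not the paper's configuration:

    from transformers import AutoModelForSequenceClassification, AutoTokenizer
    from peft import LoraConfig, TaskType, get_peft_model

    # RoBERTa with a classification head; num_labels=4 is a placeholder category count.
    model = AutoModelForSequenceClassification.from_pretrained("roberta-base", num_labels=4)
    tokenizer = AutoTokenizer.from_pretrained("roberta-base")

    # LoRA adapters on the attention projections; only the low-rank matrices are trained.
    lora_cfg = LoraConfig(task_type=TaskType.SEQ_CLS, r=8, lora_alpha=16,
                          lora_dropout=0.1, target_modules=["query", "value"])
    model = get_peft_model(model, lora_cfg)
    model.print_trainable_parameters()  # reports the reduced trainable-parameter fraction

    inputs = tokenizer("This API hangs when the session token expires.", return_tensors="pt")
    logits = model(**inputs).logits     # one logit per review category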
Software engineering, API review classification, pre-trained models, fine-tuning.
The Evolution of AI Chatbots in Sustainable Tourism: A Systematic Literature Review
T D C Pushpakumara1 and Fazeela Jameel Ahsan2, 1Department of Civil Engineering, University of Moratuwa, Sri Lanka, 2Department of Marketing, University of Colombo, Sri Lanka
This systematic literature review explores the transformative role of artificial intelligence (AI) chatbots in promoting sustainable tourism, particularly in the ecotourism sector. AI chatbots are pivotal in enhancing operational efficiency, fostering environmental responsibility, and improving tourist engagement. The study identifies their contributions to sustainability by optimizing resource use, reducing environmental impact, and educating tourists about local cultural and ecological practices. Despite these benefits, significant challenges such as data privacy concerns, infrastructural limitations, and cultural biases hinder widespread adoption. The findings emphasize the need for robust digital infrastructure, ethical frameworks, and culturally adaptive chatbot designs to overcome these barriers. By aligning technological innovation with sustainability goals, AI chatbots can significantly advance sustainable tourism practices. Future research should prioritize empirical analyses and inclusive strategies to maximize the potential of AI chatbots in fostering long-term sustainable development in ecotourism.
AI, Chatbots, Ecotourism, Sustainability, Management, Innovation.
Asthma Wellness Care with Personalized and Predictive Support Platform using Artificial Intelligence and Machine Learning
Sona Daison, Department of Computer Science and Engineering, Karunya Institute of Technology and Sciences, India
The worldwide population with asthma experiences ongoing medical issues, because urgent emergency conditions often result in hospital admissions that impact their general quality of health. Asthma control is made harder because triggers emerge suddenly from the interaction between environmental factors and personal health conditions. The Asthma Wellness Care Platform resolves these care difficulties by integrating AI-based prediction, individualized treatment support, and continuous data monitoring. The application processes real-time data by merging Air Quality Index information with sleep-pattern and stress measurements submitted by users, feeding multiple machine learning models, including KNN, Random Forest, XGBoost and Logistic Regression, to forecast asthma attacks. The platform also gives users breathing-exercise tools and an asthma journal record system to enhance asthma management. A chatbot responds immediately to user needs, while emergency alerts immediately contact both emergency responders and healthcare providers in critical situations. The platform provides secure, responsible data-sharing authentication, allowing users to enhance asthma management while decreasing hospital visits and boosting medical care efficiency through its novel features.
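A minimal sketch of the attack-forecasting step with one of the listed models (Random Forest, via scikit-learn); the feature set, labeling rule, and synthetic data below are illustrative assumptions, not the platform's actual dataset:

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    # Hypothetical features per user-day: [AQI, sleep_hours, stress_level (0-10)].
    rng = np.random.default_rng(0)
    X = rng.uniform([0, 3, 0], [300, 9, 10], size=(500, 3))   # synthetic stand-in data
    y = ((X[:, 0] > 150) & (X[:, 2] > 6)).astype(int)         # toy label: 1 = attack risk

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
    print("held-out accuracy:", model.score(X_test, y_test))
    print("attack probability:", model.predict_proba([[180, 5.5, 8]])[0, 1])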
asthma prediction, machine learning, smart healthcare, real-time monitoring, artificial intelligence, personalized management.
Efficient YOLOv5 Circuit Board Defect Detection Method Based on RepVGG and SE Attention Mechanism
Yuxun Chen, Jianlang Deng, Zili Wang, Mingrui Li, Zexuan Pan, Computer Science and Engineering Faculty, South China University of Technology, Guangzhou, China
With the development of intelligent manufacturing and industrial automation, circuit board quality inspection, as a crucial part of industrial production, urgently needs efficient and precise object detection models. This project aims to design and optimize a deep-learning-based object detection model that can quickly and accurately identify defective circuit boards. By introducing the RepVGG structure to improve the YOLOv5 backbone network and integrating multiple attention mechanisms (such as CBAM, SE, and SCA), this research significantly enhances detection performance. Experimental results show that the improved YOLOv5 + RepVGG + SE model achieved an accuracy rate of 89% on the defective circuit board dataset provided by the Beijing University Intelligent Robot Open Laboratory, higher than the other combinations.
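For reference, the SE attention block used here has a standard squeeze-and-excitation form; the PyTorch sketch below illustrates that mechanism, not the authors' exact implementation:

    import torch
    import torch.nn as nn

    class SEBlock(nn.Module):
        """Squeeze-and-Excitation channel attention in its standard form."""
        def __init__(self, channels: int, reduction: int = 16):
            super().__init__()
            self.pool = nn.AdaptiveAvgPool2d(1)            # squeeze: global average pool
            self.fc = nn.Sequential(
                nn.Linear(channels, channels // reduction),
                nn.ReLU(inplace=True),
                nn.Linear(channels // reduction, channels),
                nn.Sigmoid(),                              # excitation: weights in (0, 1)
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            b, c, _, _ = x.shape
            w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
            return x * w                                   # reweight the feature channels

    x = torch.randn(1, 64, 32, 32)
    print(SEBlock(64)(x).shape)                            # torch.Size([1, 64, 32, 32])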
Intelligent Manufacturing, Object Detection, RepVGG, Attention Mechanism, Defect Detection.
Integrating Universal Generative AI Platforms in Educational Labs to Foster Critical Thinking and Digital Literacy
Vasiliy Znamenskiy, Rafael Niyazov, Joel Hernandez, Borough of Manhattan Community College, The City University of New York, USA
This study investigates the educational potential of generative artificial intelligence (GenAI) platforms based on large language models (LLMs), such as ChatGPT, Claude, and Gemini, as tools for student-centered learning. Recognizing the current limitations of GenAI, particularly its propensity for generating inaccurate or misleading information, the paper proposes a novel instructional strategy: an interdisciplinary laboratory designed to foster critical evaluation of GenAI-generated outputs. In this pedagogical model, students engage with GenAI systems by posing questions or solving problems drawn from topics they have already studied and understand. Equipped with correct answers, students are positioned to assess the accuracy and relevance of AI-generated responses across multiple modalities, including text, images, and video. Students design prompts and tasks of sufficient difficulty to compare the intellectual performance of various GenAI platforms. This approach was implemented as a specially designed lab session within a general astronomy course for non-science majors. Multiple student groups completed the lab, demonstrating high levels of engagement, initiative, and critical thinking. Findings suggest that such activities not only deepen students’ comprehension of course content but also cultivate essential skills in digital literacy and critical interaction with AI technologies.
Transforming Deployment and Release Management for Salesforce with Copado’s AI
Vyshnavi Thanneeru1 and Murali Mohan Reddy Seelam2, 1Senior DevOps Engineer, Fidelity Investments, Westlake, USA, 2Senior Software Engineer, Cisco Systems, Dallas, USA
This research article explores the transformative impact of Artificial Intelligence (AI) on the automation capabilities of Copado, improving deployment and change management in Salesforce DevOps. In this paper we outline the AI-based methodologies that automate version control, optimize change management workflows, and improve the accuracy of deployments. Traditional Salesforce DevOps pipelines face many challenges, such as deployment errors, merge conflicts, rollback issues, and dependencies between components. By implementing predictive analytics, machine learning, and automated risk assessment, Copado automation provides improved deployment efficiency, fewer errors, and optimized release velocity in complex Salesforce environments. The results of integrating these AI-driven improvements across different Salesforce instances highlight the critical value of integrating Artificial Intelligence into Salesforce DevOps pipelines. This research paper presents Copado’s AI-powered automation as an important advancement towards scalable, robust, and adaptive Salesforce DevOps implementations.
Salesforce, Artificial Intelligence, Copado, Release Management, Deployment, Change management.
Prospects for the Development of Blockchain Technology in Corporate Information Systems
Yurii Tulashvili, Iurii Lukianchuk and Viktor Kosheliuk, Department of Computer Sciences, Lutsk National Technical University, Lutsk, Ukraine
Over the past decade, blockchain technology has undergone rapid development and is recognized as one of the pivotal information technologies driving industrial transformation. Today, blockchain technology offers a promising solution to the problems faced by corporate information systems. Namely, with appropriate measures for anonymization and preservation of confidentiality, the blockchain enhances data security, reduces the risk of unauthorized access, and ensures user privacy in corporate systems. Blockchain technology increases transparency, allowing users to monitor and verify the content of information and assess the integrity of data stores. In recent years, blockchain has become a subject of interest for state governments, multinational corporations, and major financial institutions, and considerable attention is now paid to the development of private (corporate) blockchains. This article examines the prospects and development of blockchain technology in corporate information systems. The study aims to provide additional clarity regarding the concept of blockchain applications in corporate information systems. Existing enterprise applications cannot operate seamlessly with traditional transactional requirements when integrating blockchain technology; they will therefore have to be modified, with additions to the existing data storage for asynchronous interaction with distributed blockchain nodes. A scheme is proposed for the interaction of storage nodes that supports data synchronization and their current states. This scheme is based on a blockchain structure centered around cloud storage as a corporate document circulation system. Consensus on including a new entry is reached via Byzantine fault tolerance, with blockchain nodes receiving a parametric weight for decision making.
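A minimal Python sketch of such a weighted decision rule; the node names and weights are hypothetical, and the greater-than-two-thirds quorum follows the classical Byzantine fault tolerance bound rather than any scheme specified in the article:

    # Weighted quorum vote on appending a new record to the corporate chain.
    NODE_WEIGHTS = {"hq": 3.0, "branch_a": 2.0, "branch_b": 2.0, "cloud_storage": 1.0}
    TOTAL_WEIGHT = sum(NODE_WEIGHTS.values())

    def accept_record(votes: dict[str, bool]) -> bool:
        """Append only if nodes holding more than 2/3 of total weight approve."""
        approving = sum(NODE_WEIGHTS[n] for n, ok in votes.items() if ok)
        return approving > (2 / 3) * TOTAL_WEIGHT

    print(accept_record({"hq": True, "branch_a": True,
                         "branch_b": False, "cloud_storage": True}))   # True  (6.0 > 5.33)
    print(accept_record({"hq": True, "branch_a": False,
                         "branch_b": False, "cloud_storage": True}))   # False (4.0)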
Blockchain, Consensus, Corporate Systems, Security.
Proactive Health Surveillance System for Soldiers in Combat
S V Adesh, Sanjay M P, G Prerith S Shetty and Priyanshu, Department of Computer Science and Engineering, Sahyadri College of Engineering and Management, Mangalore, Karnataka, India
In modern combat scenarios, the health and safety of soldiers are of paramount importance. Rapid and accurate health monitoring can significantly enhance the ability to provide timely medical interventions, potentially saving lives. This project centers on creating a cutting-edge health monitoring system for soldiers, leveraging Artificial Intelligence and Machine Learning (AI/ML) technologies to analyze critical health parameters in real time. Our system integrates multiple sensors into a wearable jacket that soldiers can comfortably wear during combat operations. These sensors continuously collect vital health data, including Electrocardiogram (ECG) readings, heart rate, and body temperature. The collected data is transmitted to the AWS Cloud, where it is analyzed using sophisticated AI/ML algorithms. The primary objective of the AI/ML component is to determine whether a soldier requires medical attention based on the analyzed health parameters. The system leverages historical health data and patterns to train machine learning models capable of identifying anomalies and predicting health issues. By employing a combination of supervised and unsupervised learning techniques, the system can detect irregularities in real time and alert medical personnel immediately.
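A minimal sketch of the unsupervised anomaly-detection step (Python, scikit-learn); the vital-sign ranges and synthetic baseline below are illustrative assumptions, not the project's data:

    import numpy as np
    from sklearn.ensemble import IsolationForest

    # Hypothetical vitals per reading: [heart_rate (bpm), body_temperature (°C)].
    rng = np.random.default_rng(1)
    baseline = np.column_stack([rng.normal(75, 8, 1000),       # resting heart rates
                                rng.normal(36.8, 0.3, 1000)])  # normal temperatures

    detector = IsolationForest(contamination=0.01, random_state=1).fit(baseline)

    incoming = np.array([[78, 36.9],     # normal reading
                         [165, 39.4]])   # tachycardia plus fever
    for reading, flag in zip(incoming, detector.predict(incoming)):  # -1 = anomaly
        if flag == -1:
            print("ALERT: anomalous vitals", reading)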
Health Monitoring System, Soldier Safety, Artificial Intelligence (AI), Real-Time Analysis, Wearable Technology, Electrocardiogram (ECG), Heart Rate Monitoring, Body Temperature Tracking, Sensor Integration, AWS Cloud, Anomaly Detection, Supervised Learning, Medical Alert System, Cloud-Based Processing.
Boswell Quotient: Test Metrics for Assessing Chatbot Indispensability
Alan Wilhelm1 and Peter Luh2, 1CTO @ Referential.ai, San Francisco, California, USA, 2Retired physicist, San Jose, California, USA
The Boswell Test, designed to measure AI model indispensability, comprises Test-A, assessing deep insight into a host’s nuances, and Test-B, evaluating critical thinking and reasoning. While Test-A remains beyond current AI capabilities, Test-B is implementable today. This paper details the Boswell Quotient, a novel metric methodology for Test-B, and explores avenues for its further refinement.
Boswell Test, Boswell Quotient, Turing Test, Indispensability, Performance, Evaluation, Efficiency, Boswell Test-A, Boswell Test-B, Personal insight, Critical thinking, Truthfulness, Empathy, Adaptability, Ethical alignment.
Population, Urbanization and Economic Growth in Nigeria
Matthew, Uduak Patrick, Department of Economics, University of Uyo, Nigeria
The study examines the effects of population and urbanization on economic growth in Nigeria, using secondary data sourced from the Central Bank of Nigeria Statistical Bulletin for 1986 to 2023. The main objectives of the study were to ascertain the effect of population growth on economic growth in Nigeria and to ascertain the effect of urbanization on economic growth in Nigeria. To achieve these objectives, an autoregressive distributed lag (ARDL) model was employed to examine the effects of the independent variables on the dependent variable. Findings show that population growth has a negative effect on economic growth in Nigeria, while urbanization has been found to have a positive relationship with economic growth. The study therefore recommends that the government should prioritize educational programs and vocational training to equip the growing population with the skills necessary for the job market, thereby reducing unemployment rates and enhancing productivity. Also, to maximize the benefits of urbanization, effective urban planning must be prioritized, including policies that manage urban sprawl, ensure access to basic services, and promote sustainable living conditions in cities.
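For reference, the generic ARDL(p, q, r) specification underlying such an analysis can be written as follows (variable labels are assumed for illustration, not the paper's estimated equation):

    \mathrm{GDP}_t = \alpha + \sum_{i=1}^{p} \beta_i\, \mathrm{GDP}_{t-i} + \sum_{j=0}^{q} \gamma_j\, \mathrm{POP}_{t-j} + \sum_{k=0}^{r} \delta_k\, \mathrm{URB}_{t-k} + \varepsilon_t,

where GDP_t proxies economic growth, POP_t population growth, URB_t urbanization, and ε_t is the error term.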
Population, Urbanization, and Economic growth.
The Afterlife in the Age of AI: A Psychological, Ethical, and Technological Analysis
Fabrizio Degni, AI Ethics and Governance Researcher, GSOM Polimi, Milano, Italy
The convergence of Artificial Intelligence technologies with human conceptions of death and the afterlife presents overlooked and underrated challenges, but also opportunities, for understanding consciousness, identity, and grief. This research provides a comprehensive interdisciplinary analysis of how AI is reshaping our relationship with mortality across domains such as psychological impacts, technological capabilities, ethical considerations, and cultural perspectives. Through analysis of current digital memorial technologies, psychological frameworks of attachment and grief, and philosophical questions of identity, we establish that AI-enabled afterlife simulations introduce complex dynamics that both extend and disrupt traditional mourning processes. We propose a regulatory framework grounded in principles of informed consent, psychological safeguarding, and cultural sensitivity. This is a first analysis that contributes to the emerging discourse on post-mortem digital identity, seeking to establish parameters for the ethically sound development of afterlife technologies.
artificial intelligence, afterlife beliefs, digital immortality, grief processing, consciousness simulation.
The Transition From True Alphabets to Logical and Empirical Methods
Yan Zhou, Ph.D., Independent Researcher, USA
Alphabetic letters are symbols representing phonemes (vowels and consonants), while spellings are arrays of letters denoting the orderly utterance of phonemes. Formal logic and true alphabets share notable similarities. Core elements of classical logic—such as inheritance, conjunction, validity, soundness, linearity, consistency, analysis, and justification—are reflected in true alphabets. Intellectual certainty is achievable through valid inference, fostering empirical experimentation to identify reliable premises for sound inference and argument. Phonemes and spoken words abstracted from communal language are empirical evidence. Regardless of their origins, scientific (logical and empirical) methods have been applied in true alphabets, formally established, and preserved. Logical inference and analysis are traceable and more trustworthy, encouraging inquiries into indefinite effects and ultimate causes. The quasi-logical inference and analysis inherent in other phonographies are deficient to varying degrees but remain preferable to complete ignorance of formal logic. Western societies, traditionally shaped by true alphabets, are inherently inclined to adhere to logical principles and observe factual evidence.
Alphabet, Intrinsic Feature, Logical Inference, Empirical Evidence, Scientific Method
On the "Double Adaptation" of Teaching and Its Comprehensive Effect
Yan Zhou, Ph.D., Independent Researcher, USA
The teaching methods of each subject should not only adapt to the internal needs of students but also to the objective laws of the development of external things, and should combine the two skillfully so that they promote each other, thus achieving the "double adaptation" of teaching. After repeated research and experiments, we found that a "learning guide" implemented according to a "four-step procedure" can comprehensively solve this problem. This can also bring about a high degree of integration of "teaching" and "learning"; of teaching plans, textbooks, assignments and exams; of teaching materials and teaching methods; and of the teaching methods of the various disciplines, as well as a high degree of consistency between school education and social needs. The role, energy and various effects of being "highly integrated" and "highly consistent" cannot be underestimated. It not only ensures that all teaching tasks can be completed easily within the specified teaching time, but also expands the depth and breadth of knowledge on the original basis, creating extremely favorable conditions for the cultivation of talents and even geniuses.
inner need; objective law; double adaptation; four-step procedure; highly integrated