
Search Results (261)

Search Parameters:
Keywords = formal concept analysis

18 pages, 681 KiB  
Article
Building Traceable Redactable Blockchain with Time-Verifiable Chameleon Hash
by Mingliang Chen, Guoqiang Xie, Benren Pan, Jinyan Fang, Zaide Xu and Zhen Zhao
Electronics 2025, 14(5), 846; https://doi.org/10.3390/electronics14050846 - 21 Feb 2025
Viewed by 233
Abstract
Blockchain is a decentralized digital ledger that records transactions across a distributed network of computers, enabling secure and transparent operations without requiring trust in a central authority. While initially developed for Bitcoin, blockchain technology now underpins many cryptocurrencies and other applications. It serves as an open trust layer without central reliance and is widely used in cryptocurrencies such as Bitcoin and Ethereum. However, this public and permanent open storage has raised concerns about its potential misuse for illegal trades or the distribution of unwanted content. In EuroS&P 2017, Ateniese et al. introduced the concept of the redactable blockchain, which utilizes the trapdoor collision function provided by chameleon hash to rewrite block contents without causing hashing inconsistencies. Recent research has continued to propose solutions for redactable blockchains, leveraging cryptographic algorithms such as chameleon hash and attribute-based encryption (ABE). Current solutions often employ sophisticated cryptographic schemes, such as ABE, but lack sufficient focus on developing secure and scalable solution for practical use. In this work, we propose the time-verifiable policy-based chameleon hash (TPCH) as a candidate solution for practical redaction to rewrite blockchain contents. Our solution for redactable blockchains enables the verification of whether a redaction was executed at a specific time, thereby offering time-based traceability for dominant algorithms in TPCH. Additionally, it restricts misbehavior or abuse of redaction powers by introducing a new trapdoor finding algorithm, Update, in addition to the adapt algorithm Adapt. We formally introduce TPCH with both black-box and white-box constructions. Our experimental and theoretical analysis demonstrates the feasibility and practicality of the proposed solution. Full article
(This article belongs to the Special Issue Applied Cryptography and Practical Cryptoanalysis for Web 3.0)
Show Figures
Figure 1: System framework.
Figure 2: Hash costs.
Figure 3: Adapt costs.
Figure 4: Update by policies.
Figure 5: Update by rounds.
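The trapdoor-collision idea behind redactable blockchains can be illustrated with a textbook discrete-log chameleon hash: whoever holds the trapdoor can "adapt" the randomness so that a rewritten message keeps the original hash value. The sketch below uses toy parameters (p = 23) with no security value and is only a generic illustration of the Adapt step, not the time-verifiable policy-based chameleon hash (TPCH) proposed in the paper.

```python
# A minimal, textbook discrete-log chameleon hash (illustrative sketch only; toy
# parameters, no security, and NOT the TPCH construction proposed in the paper).
import secrets

p, q, g = 23, 11, 4   # safe-prime toy group: p = 2q + 1, g generates the order-q subgroup

def keygen():
    x = secrets.randbelow(q - 1) + 1   # trapdoor key
    return x, pow(g, x, p)             # (trapdoor, public hash key)

def chash(hk, m, r):
    # CH(m, r) = g^m * hk^r mod p
    return (pow(g, m % q, p) * pow(hk, r % q, p)) % p

def adapt(x, m, r, m_new):
    # Trapdoor collision: choose r' so that m + x*r = m_new + x*r' (mod q);
    # the redacted block then keeps exactly the same chameleon hash value.
    return (r + (m - m_new) * pow(x, -1, q)) % q

x, hk = keygen()
m, r = 7, secrets.randbelow(q)
m_new = 3                              # "redacted" block content
r_new = adapt(x, m, r, m_new)
assert chash(hk, m, r) == chash(hk, m_new, r_new)
print("hash unchanged after redaction:", chash(hk, m, r))
```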
27 pages, 577 KiB  
Article
Approximate Description of Indefinable Granules Based on Classical and Three-Way Concept Lattices
by Hongwei Wang, Huilai Zhi and Yinan Li
Mathematics 2025, 13(4), 672; https://doi.org/10.3390/math13040672 - 18 Feb 2025
Viewed by 226
Abstract
Granule description is a fundamental problem in granular computing. However, how to describe indefinable granules is still an open, interesting, and important problem. The main objective of this paper is to give a preliminary solution to this problem. Before proceeding, the framework of approximate description is introduced. That is, any indefinable granule is characterized by an ordered pair of formulas, which form an interval set, where the first formula is the β-prior approximate optimal description and the second formula is the α-prior approximate optimal description. More concretely, given an indefinable granule, by exploring the description of its lower approximate granule, its β-prior approximate optimal description is obtained. Likewise, by consulting the description of its upper approximate granule, its α-prior approximate optimal description can also be derived. Following this idea, the descriptions of indefinable granules are investigated. Firstly, ∧-approximate descriptions of indefinable granules are investigated based on the classical concept lattice, and (∧,∨)-approximate descriptions of indefinable granules are given via object pictorial diagrams. And then, it is revealed from some examples that the classical concept lattice is no longer effective and negative attributes must be taken into consideration. Therefore, a three-way concept lattice is adopted instead of the classical concept lattice to study (∧,¬)-approximate descriptions and (∧,∨,¬)-approximate descriptions of indefinable granules. Finally, some discussions are presented to show the differences and similarities between our study and existing ones. Full article
(This article belongs to the Special Issue Recent Advances and Prospects in Formal Concept Analysis (FCA))
Show Figures
Figure 1: Descriptive ability comparison of four types of logic languages.
Figure 2: Concept lattice of the formal context in Table 3.
Figure 3: (H_d, ≤) of the formal context in Table 3.
Figure 4: Three-way concept lattice of the formal context in Table 3.
Figure 5: Experimental results on the number of basic granules.
Figure 6: Experimental results on time consumption.
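For readers new to the underlying machinery, the sketch below shows the standard FCA derivation operators and a brute-force enumeration of formal concepts on a made-up toy context; it illustrates the classical concept lattice the paper starts from, not the authors' approximate-description constructions.

```python
# Standard FCA derivation operators on a toy formal context (illustration of the
# concept-lattice machinery only; the toy objects and attributes are made up here).
from itertools import chain, combinations

# Formal context: objects -> set of attributes they have.
context = {
    "g1": {"a", "b"},
    "g2": {"a", "c"},
    "g3": {"b", "c"},
    "g4": {"a", "b", "c"},
}
attributes = set().union(*context.values())

def extent(attrs):
    """Objects having all attributes in attrs (the prime operator on attribute sets)."""
    return {g for g, atts in context.items() if attrs <= atts}

def intent(objs):
    """Attributes shared by all objects in objs (the prime operator on object sets)."""
    return set.intersection(*(context[g] for g in objs)) if objs else set(attributes)

def concepts():
    """All formal concepts (extent, intent): attribute sets closed under double derivation."""
    result = []
    for attrs in map(set, chain.from_iterable(
            combinations(sorted(attributes), k) for k in range(len(attributes) + 1))):
        ext = extent(attrs)
        if intent(ext) == attrs:          # attrs is a closed intent, so (ext, attrs) is a concept
            result.append((frozenset(ext), frozenset(attrs)))
    return result

for ext, inte in concepts():
    print(sorted(ext), "|", sorted(inte))
```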
22 pages, 6709 KiB  
Article
Photobiomodulation LED Devices for Home Use: Design, Function and Potential: A Pilot Study
by Mark Cronshaw, Steven Parker, Omar Hamadah, Josep Arnabat-Dominguez and Martin Grootveld
Dent. J. 2025, 13(2), 76; https://doi.org/10.3390/dj13020076 - 10 Feb 2025
Viewed by 700
Abstract
Background/Objectives: Many commercial light-emitting diode (LED) devices are available for consumer home usage. The performance characteristics in respect to the dosimetry of many of the devices, currently on direct sale to the public, have not been subject to formal appraisal. In order to ‘bridge the gap’ between the evidence-based photobiomodulation therapy (PBMT) community and other interested parties, an evaluation is made of a selection of torch type hand-held LED PBMT products currently available for home use. Methods: Five randomly chosen intra-oral and hand-held LED PBMT devices were selected. The optical delivery parameters of the devices were measured, including the beam divergence angle, surface area exposure as well as the output power at the level of the LEDs. The surface and sub-surface temperature changes in porcine tissue samples were assessed under standardised conditions. The manufacturer’s patient instructions were correlated to the measured optical parameters. Calculations were made of irradiance and surface radiant exposure. Consumer satisfaction ratings and feedback data were collated, and a relevant statistical analysis conducted. Results: The results were heterogeneous with a wide range of applied wavelengths, output power and irradiance. Power output stability was variable, and, together with a wide beam divergence angle of 74°, the manufacturer’s directions for dosimetry were found to be inconsistent with an accurate dose delivery. Conclusions: The manufacturer’s proposed dosimetry fails to consider the relevance of the beam divergence angle and optical attenuation in view of the scatter and absorption. Appropriate instructions on how best to gain and optimise an acceptable clinical outcome were inconsistent with an evidence-based approach. Subject to validation by well-planned clinical trials, the concept of home PBMT may open interesting new therapeutic approaches. Full article
(This article belongs to the Special Issue Laser Dentistry: The Current Status and Developments)
Show Figures
Figure 1: Test set-up. The LED source irradiates test tissues; thermal measurements are recorded with a FLIR ETS-320 camera and analysed in FLIR Thermal Studio Pro (Teledyne FLIR, Wilsonville, OR, USA). Transillumination thermal measurements are taken of seven sets of samples, and surface measurements immediately before and after radiant exposure at the manufacturer's preset cut-off times of three minutes (Kinreen, device 5) and five minutes (Homesta, device 4).
Figure 2: Names and images of devices 1–5 plus accessories (protective eyewear and glass light guides).
Figure 3: Plot of adjusted χ² contingency-table residuals (differences between observed and expected values) for the Likert-scale scores acquired for each reported study, confirming the association between rows (product studies) and columns (Likert scores). Red and grey denote statistically significant and non-significant adjusted residuals, i.e., raw differences between observed and expected counts divided by their standard error estimates.
Figure 4: Plot of adjusted χ² contingency-table residuals for the Likert-scale scores acquired for each product evaluated (summated values for four of these), confirming the association between rows (LED products) and columns (Likert scores); colour coding as in Figure 3.
Figure 5: Bar diagram plots of mean ± 95% confidence intervals for percentages of consumer score ratings for each of the five products evaluated.
Figure 6: Consumer approval ratings of 4–5 stars for the five devices (numbers of participants in brackets).
Figure 7: Mean ± 95% confidence interval power-loss bar diagrams for the LED devices, plotted by (a) device and (b) sampling time point. Power was measured at 1.00 min intervals, based on 5 sets of readings per device.
Figure 8: Effect of optical safety glasses. Bar diagram of mean ± 95% CIs for the percentage beam transparencies of the three devices tested (means of 5 sets of readings).
Figure 9: Illustration: an LED source 3 cm in diameter with a beam divergence angle of 70° produces, at a distance of 2 cm, an exposure area around 4-fold the spot area at the device (28 cm²).
Figure 10: A laser with a 3 cm diameter beam at the source and a beam divergence angle of 30°, applied at a distance of 2 cm, has a spot size of <2-fold the source (12.6 cm²), giving an average irradiance of ~50% of the source; in contrast, the average irradiance (W/cm²) of the LED at the target is <25% of the emission source.
Figure 11: The wide beam divergence of an LED source increases the area of surface exposure and reduces the peak power concentration in the mid-third of the beam, unlike a laser source; a laser additionally has a narrow spectral range with a coherent waveform.
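The geometric point made in Figures 9 and 10 can be checked with a simple cone approximation, r(d) = r0 + d·tan(θ/2), for the beam radius at distance d. The sketch below applies that approximation to the 3 cm, 70° and 30° examples; it is an illustrative calculation under that assumption, not a dosimetry model taken from the paper.

```python
# Spot area and average irradiance vs. distance for a diverging source, using the
# simple cone approximation r(d) = r0 + d*tan(theta/2). Illustrative sketch only;
# the numbers echo the geometry in Figures 9-10, not a measured device.
import math

def spot_area_cm2(source_diameter_cm, divergence_deg, distance_cm):
    r = source_diameter_cm / 2 + distance_cm * math.tan(math.radians(divergence_deg / 2))
    return math.pi * r ** 2

def avg_irradiance(power_w, source_diameter_cm, divergence_deg, distance_cm):
    return power_w / spot_area_cm2(source_diameter_cm, divergence_deg, distance_cm)

for label, angle in [("LED, 70 deg", 70.0), ("laser, 30 deg", 30.0)]:
    a0 = spot_area_cm2(3.0, angle, 0.0)      # area at the device face (~7.1 cm^2)
    a2 = spot_area_cm2(3.0, angle, 2.0)      # area 2 cm away
    print(f"{label}: {a2:.1f} cm^2 at 2 cm, "
          f"average irradiance falls to about {100 * a0 / a2:.0f}% of the source value")
```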
20 pages, 1611 KiB  
Article
Functional Language Logic
by Vincenzo Manca
Electronics 2025, 14(3), 460; https://doi.org/10.3390/electronics14030460 - 23 Jan 2025
Viewed by 638
Abstract
The formalism of Functional Language Logic (FLL) is presented, which is an extension of a logical formalism already introduced to represent sentences in natural languages. In the FLL framework, a sentence is represented by aggregating primitive predicates corresponding to words of a fixed language (English in the given examples). The FLL formalism constitutes a bridge between mathematical logic (high-order predicate logic) and the classical logical analysis of discourse, rooted in the Western linguistic tradition. Namely, FLL representations reformulate on a rigorous logical basis many fundamental classical concepts (complementation, modification, determination, specification, …), becoming, at the same time, a natural way of introducing mathematical logic through natural language representations, where the logic of linguistic phenomena is analyzed independently from the single syntactical and semantical choices of particular languages. In FLL, twenty logical operators express the mechanisms of logical aggregation underlying meaning constructions. The relevance of FLL in chatbot interaction is considered, and a problem concerning the relationship between embedding vectors in LLM (Large Language Model) transformers and FLL representations is posed. Full article
(This article belongs to the Section Artificial Intelligence)
Show Figures
Figure 1: A graphical representation of the predication Good(a) in a direct way (bottom) and through predicative abstraction (top).
Figure 2: A graphical representation of the sentence given in Table 3.
Figure 3: A graphical representation of "Good(a)" (top) and "Good_Policeman(a)" (bottom).
Figure 4: A graphical representation of the sentence given in Table 9.
Figure 5: A graphical representation of the sentence given in Table 15.
Figure 6: A graphical representation of the sentence given in Table 16.
Figure 7: An FLL representation of the Chinese sentence "Yesterday I was walking along the sea".
Figure 8: A graphical representation of FLL operators (apart from connectives and typing).
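As a generic illustration of how a modified predicate such as "good policeman" (Figure 3) can be built by aggregating primitive predicates, the toy sketch below uses plain Python functions in the classical intersective reading; it is not Manca's FLL formalism or its twenty operators, and the entities and extensions are invented.

```python
# Primitive predicates aggregated into a modified predicate ("good policeman"),
# in the classical intersective reading; a generic illustration, not FLL itself.
from typing import Callable

Entity = str
Predicate = Callable[[Entity], bool]

good: Predicate = lambda x: x in {"alice", "bob"}          # toy extensions
policeman: Predicate = lambda x: x in {"bob", "carol"}

def modify(modifier: Predicate, head: Predicate) -> Predicate:
    """Intersective modification: Good_Policeman(x) = Good(x) and Policeman(x)."""
    return lambda x: modifier(x) and head(x)

good_policeman = modify(good, policeman)
print(good_policeman("bob"))     # True
print(good_policeman("alice"))   # False (good, but not a policeman)
```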
25 pages, 397 KiB  
Article
Sustainable Development of Traditional Business Culture: Merchant Guild Culture and Enterprise Innovation
by Li Ren and Yanping Cheng
Sustainability 2025, 17(3), 853; https://doi.org/10.3390/su17030853 - 22 Jan 2025
Viewed by 694
Abstract
By exploring the positive elements of traditional business culture and combining them with modern enterprise management concepts, this paper aims to realize the sustainable development of enterprises and cultural heritage and innovation. In this context, this study empirically examines the impact and mechanisms of merchant guild culture (MGC) on corporate innovation, using A-share listed companies from 2010 to 2022 as a sample. The findings indicate that MGC positively influences contemporary corporate innovation through both external and internal channels. External channels include alleviating financing constraints and enhancing ESG performance, and internal influence channels such as improving integrity and emphasizing human capital. Additionally, social networks strengthen the relationship between MGC and corporate innovation. Furthermore, using the legal environment as a moderating variable has led to the discovery of a certain substitution relationship between formal and informal institutions. A heterogeneity analysis further shows that the effect of MGC on innovation is more pronounced in enterprises with low-risk preference and a foreign cultural impact. Full article
Show Figures
Figure 1: Latitudinal and longitudinal distribution of places of origin and listed companies.
23 pages, 454 KiB  
Article
New Simplification Rules for Databases with Positive and Negative Attributes
by Domingo López-Rodríguez, Manuel Ojeda-Hernández and Carlos Bejines
Mathematics 2025, 13(2), 309; https://doi.org/10.3390/math13020309 - 18 Jan 2025
Viewed by 553
Abstract
In this paper, new logical equivalences are presented within the simplification logic with mixed attributes paradigm, which allow the obtention of bases of shorter, easier-to-read attribute implications. In addition to the theoretical results which show that the proposed equivalences indeed hold in simplification logic with mixed attributes, experimental results which showcase the effectiveness of this method are also provided. Furthermore, the simplification method presented is iterative and gives sufficiently good results in only one or two iterations, therefore presenting itself as a reasonable procedure in time-sensitive experiments. Full article
Show Figures
Figure 1: The size (left) and cardinality (right) of the resulting implicational system Σ′ as a function of the size and cardinality of the initial set Σ.
Figure 2: The execution time of the three algorithms as a function of the initial size ||Σ||.
Figure 3: The reduction in the proportion of attributes present after the first iterations of each algorithm.
Figure 4: The final size (left) and running time (right) as a function of ||Σ||, including the new versions (v3.1) and (v3.2).
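As background for working with implication bases, the sketch below computes the closure of an attribute set under a set of implications with the standard naive fixed-point algorithm; it is a textbook building block, not the new simplification equivalences introduced in the paper, and the example implications are made up.

```python
# Naive closure of an attribute set under implications A -> B (a standard building
# block when handling implication bases; not the paper's new simplification rules).
def closure(attrs, implications):
    """Smallest superset of attrs closed under every implication (premise, conclusion)."""
    closed = set(attrs)
    changed = True
    while changed:
        changed = False
        for premise, conclusion in implications:
            if premise <= closed and not conclusion <= closed:
                closed |= conclusion
                changed = True
    return closed

# Toy implication set over attributes {a, b, c, d} (made-up example).
sigma = [({"a"}, {"b"}), ({"b", "c"}, {"d"})]
print(closure({"a"}, sigma))        # {'a', 'b'}
print(closure({"a", "c"}, sigma))   # {'a', 'b', 'c', 'd'}
```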
22 pages, 839 KiB  
Article
A Randomized Response Framework to Achieve Differential Privacy in Medical Data
by Andreas Ioannidis, Antonios Litke and Nikolaos K. Papadakis
Electronics 2025, 14(2), 326; https://doi.org/10.3390/electronics14020326 - 15 Jan 2025
Viewed by 580
Abstract
In recent years, differential privacy has gained substantial traction in the medical domain, where the need to balance privacy preservation with data utility is paramount. As medical data increasingly relies on cloud platforms and distributed sharing among multiple stakeholders, such as healthcare providers, researchers, and policymakers, the importance of privacy-preserving techniques has become more pronounced. Trends in the field focus on designing efficient algorithms tailored to high-dimensional medical datasets, incorporating privacy guarantees into federated learning for distributed medical devices, and addressing challenges posed by adversarial attacks. Our work lays a foundation for these emerging applications by emphasizing the role of randomized response within the broader differential privacy framework, paving the way for advancements in secure medical data sharing and analysis. In this paper, we analyze the classical concept of a randomized response and investigate how it relates to the fundamental concept of differential privacy. Our approach is both mathematical and algorithmic in nature, and our purpose is twofold. On the one hand, we provide a formal and precise definition of differential privacy within a natural and convenient probabilistic—statistical framework. On the other hand, we position a randomized response as a special yet significant instance of differential privacy, demonstrating its utility in preserving individual privacy in sensitive data scenarios. To substantiate our findings, we include key theoretical proofs and provide indicative simulations, accompanied by open-access code to facilitate reproducibility and further exploration. Full article
Show Figures
Figure 1: Flow diagram of the RR framework.
Figure 2: The spinner wheel.
Figure 3: Simulation for various sizes of α and n.
Figure 4: Simulation of the binomial distribution for various sizes of κ and n.
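The classical randomized response mechanism the paper analyzes can be sketched directly: each respondent answers truthfully with some probability and otherwise reports a fair coin flip, which yields a standard local differential-privacy bound (ε = ln 3 in the fair-coin case) while still allowing unbiased prevalence estimation. The code below is a generic illustration of that mechanism, not the paper's framework or its released code.

```python
# Classical randomized response (coin-flip style) with its standard local
# differential-privacy guarantee; generic illustration, not the paper's framework.
import math
import random

def randomized_response(true_answer: bool, p_truth: float = 0.5) -> bool:
    """With probability p_truth report the true answer, otherwise report a fair coin flip."""
    if random.random() < p_truth:
        return true_answer
    return random.random() < 0.5

def epsilon(p_truth: float = 0.5) -> float:
    # Pr[report=yes | true=yes] = p_truth + (1 - p_truth)/2 and
    # Pr[report=yes | true=no]  = (1 - p_truth)/2, so the mechanism satisfies
    # ln(ratio)-differential privacy; the fair-coin case gives eps = ln 3.
    p_yes_given_yes = p_truth + (1 - p_truth) / 2
    p_yes_given_no = (1 - p_truth) / 2
    return math.log(p_yes_given_yes / p_yes_given_no)

def estimate_prevalence(reports, p_truth: float = 0.5) -> float:
    """Unbiased estimate of the true 'yes' rate from the noisy reports."""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p_truth) / 2) / p_truth

random.seed(0)
truth = [random.random() < 0.3 for _ in range(100_000)]     # 30% true 'yes'
reports = [randomized_response(t) for t in truth]
print(f"epsilon = {epsilon():.3f}")                          # ln 3 ~= 1.099
print(f"estimated prevalence ~= {estimate_prevalence(reports):.3f}")
```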
25 pages, 4633 KiB  
Review
The Concept of Informal Green Space in Academic Research: A Comprehensive Literature Review on the Terminology Used
by Duy Thong Ta, Huixin Wang and Katsunori Furuya
Land 2025, 14(1), 43; https://doi.org/10.3390/land14010043 - 28 Dec 2024
Viewed by 720
Abstract
Informal green spaces (IGSs) are vital yet under-researched urban areas that enhance biodiversity, provide ecosystem services, and improve the well-being of urban residents. However, the lack of a consistent definition and comprehensive understanding of their multifunctional roles has hindered their effective integration into urban planning. The current literature review aimed to clarify the concept of IGSs, analyze research trends, and identify further research areas. Using a combined bibliometric and systematic analysis approach, 150 articles from the Web of Science database, published from 1996 to 2024, were analyzed. The systematic analysis identified 54 relevant documents on the effects of green areas, revealing a diverse and growing body of research on IGSs, including their types, distribution, and socioeconomic contexts. The findings indicated an increasing trend in collaborative studies, using “informal green space” as the official term. This review proposed a clear and comprehensive definition of IGS, emphasizing its visibility, lack of formal recognition, minimal management, spontaneous vegetation, and temporary nature and underscoring its substantial environmental and social benefits. Furthermore, this review highlighted the need for standardized definitions and interdisciplinary studies to fully harness the potential of IGSs, thereby emphasizing their essential contribution to urban biodiversity and the regulation of urban microclimates. Full article
(This article belongs to the Special Issue Urban Forestry Dynamics: Management and Mechanization)
Show Figures
Figure 1: IGS on a sideroad.
Figure 2: The research flow with five steps and three stages of selection.
Figure 3: The number of publications on IGSs from 1996 to 2024; despite short-run fluctuations, an overall upward trend confirms the increased attention on this subject area.
Figure 4: The geographic distribution of research outputs regarding IGSs, highlighting the frequency of contributions from various regions and a significant concentration of research activity in the USA.
Figure 5: The number of scientific productions (a) and citations (b) related to IGSs by countries worldwide.
Figure 6: Evolving trends in IGS research topics from 2013 to 2024; bubble size indicates each term's frequency, and the shift over the decade is from "space" and "conservation" to "benefits" and "ecosystem services".
Figure 7: The main structure of research themes in IGSs based on keyword co-occurrence; there are five clusters of themes, and the main research theme is "biodiversity" in the "cities".
Figure 8: The number of distinct methods by research purpose.
Figure 9: The IGS definition evolution map from the 17th century until now. Research from 2007 [71] states that the term "urban wilderness" originated in the 17th century; it developed into "urban void" and then "urban wasteland", while "urban spontaneous vegetation" appeared in that period and has been used in parallel with "informal green space" since 2014, with many sub-terms developed since then depending on research purpose.
Figure 10: The connected characteristics of an IGS: spontaneous vegetation grows due to non-management, the overgrown state can evoke fear of crime, insects, and wild animals, and the space may wither temporarily owing to weather or redevelopment, reducing its visibility and formal recognition and, in the absence of policies or regulations, reinforcing non-management.
18 pages, 645 KiB  
Article
Applying the Surge Capacity Components for Capacity-Building Purposes in the Context of the EMT Initiative
by Lina Echeverri, Flavio Salio, Richard Parker, Pryanka Relan, Oleg Storozhenko, Ives Hubloue and Luca Ragazzoni
Int. J. Environ. Res. Public Health 2024, 21(12), 1712; https://doi.org/10.3390/ijerph21121712 - 23 Dec 2024
Viewed by 774
Abstract
Background: On 16 January 2021 (EB148/18 Session), the World Health Organization (WHO) and Member States emphasized the importance of expanding the WHO Emergency Medical Teams (EMT) Initiative, investing in a global health workforce and multidisciplinary teams capable of being rapidly deployed, equipped, and fully trained to respond to all-hazard emergencies effectively. This resulted in the need to define a comprehensive framework. To achieve this, the EMT Initiative proposes the application of the four components of Surge Capacity, known as the 4“S” (Staff, Systems, Supplies, and Structure/Space), to build global capacities and capabilities, ensuring rapid mobilization and efficient coordination of national and international medical teams for readiness and response, complying with crisis standards of care defined in an ethical and evidence-based manner. Methods: A mixed-qualitative research approach was used, incorporating expert consensus through focus group discussions (FGDs), between 2021 and July 2022. This facilitated a detailed process analysis for the application of the surge capacity components to build global capacities and capabilities. This research highlighted the similarities between surge capacity and capacity building from an initial desk review and unified these concepts within the EMT Initiative. A standardized formal pathway was developed to enhance local, regional, and global capacities for emergency readiness and response. Results: The results showed that the framework successfully integrated the essential components of surge capacity and capacity building, making it adaptable to various settings. Conclusions: This framework provides a unified and replicable approach for readiness and response for all-hazards emergencies. Full article
Show Figures
Figure 1: Research steps.
Figure 2: Implementation modalities/pathways.
19 pages, 2632 KiB  
Article
Measuring and Forecasting the Development Concept of the “Green” Macrosystem Using Data Analysis Technologies
by Aleksei I. Shinkevich, Farida F. Galimulina and Naira V. Barsegyan
Sustainability 2024, 16(24), 11152; https://doi.org/10.3390/su162411152 - 19 Dec 2024
Viewed by 587
Abstract
A research framework is formed by the semantics of the “green” macrosystem, supported by a methodological approach, data analysis, and forecasting, with a focus on the dynamics of transition to a qualitatively new state. The purpose of the work is to develop conceptual provisions and methodological tools for assessing the implementation of the concept of a “green” macrosystem. Applying methods of system analysis, content analysis, formalization, statistical analysis (technologies of knowledge discovery in databases and time series analysis), and discriminant analysis contributed to achieving the goal. As a result of the research, the categorical apparatus of a “green” economy was clarified by outlining narrow and broad approaches to defining the “green” macrosystem; the author’s method of assessing the implementation of the concept of the “green” macrosystem was modified by expanding the list of factors of greening the economic system, transforming the condition of the “green” corridor to calculate the developed dynamic coefficient of transition to the concept of the “green” macrosystem (DCGM), and adapting the method to the macrosystem level; the regularities of a transition to the concept of a “green” macrosystem were revealed. The novelty of the study lies in the proposal of an integral DCGM indicator, which avoids the problems of normalization, weighting, and loss of relevant data, incorporates the determinants of the “green” economy (natural resources and pollution), and relies on available data. The formulated provisions develop the theoretical basis on which to transform the macrosystem to the “green” concept and can be taken into account in the implementation’s framework of strategic planning documents for the greening of production and economic systems. Full article
Show Figures
Figure 1: Block diagram of the dynamic model for assessing the level of greening of production systems.
Figure 2: A block diagram of the research work.
Figure 3: Author's interdisciplinary approach to justification of a "green" macrosystem.
Figure 4: Statistical analysis of the dynamic coefficient of transition to the concept of a "green" macrosystem.
20 pages, 1956 KiB  
Article
Enhancing Ontological Metamodel Creation Through Knowledge Extraction from Multidisciplinary Design and Optimization Frameworks
by Esma Karagoz, Olivia J. Pinon Fischer and Dimitri N. Mavris
Systems 2024, 12(12), 555; https://doi.org/10.3390/systems12120555 - 12 Dec 2024
Viewed by 601
Abstract
The design of complex aerospace systems requires a broad multidisciplinary knowledge base and an iterative approach to accommodate changes effectively. Engineering knowledge is commonly represented through engineering analyses and descriptive models with underlying semantics. While guidelines from systems engineering methodologies exist to guide the development of system models, creating a system model from scratch with every new application/system requires research into more adaptable and reusable modeling frameworks. In this context, this research demonstrates how a physics-based multidisciplinary analysis and optimization tool, SUAVE, can be leveraged to develop a system model. By leveraging the existing physics-based knowledge captured within SUAVE, the process benefits from the expertise embedded in the tool. To facilitate the systematic creation of the system model, an ontological metamodel is created in SysML. This metamodel is designed to capture the inner workings of the SUAVE tool, representing its concepts, relationships, and behaviors. By using this ontological metamodel as a modeling template, the process of creating the system model becomes more structured and organized. Overall, this research aims to streamline the process of building system models from scratch by leveraging existing knowledge and utilizing an ontological metamodel as a modeling template. This approach enhances formal knowledge representation and its consistency, and promotes reusability in multidisciplinary design problems. Full article
(This article belongs to the Section Systems Engineering)
Show Figures
Figure 1: Knowledge types in engineering design [14,16].
Figure 2: Projections of knowledge onto the system model [23].
Figure 3: Research methodology.
Figure 4: Structure of SUAVE.
Figure 5: Logical decomposition: in the system model, the logical components are represented by disciplinary analyses within SUAVE.
Figure 6: SUAVE aerodynamics analyses: the decomposition of aerodynamics analyses involves different levels of fidelity.
Figure 7: Fidelity-zero aerodynamics method.
Figure 8: Data flow between the analysis method functions.
Figure 9: Parametric diagram showing the relation between the components and inputs/outputs.
Figure 10: Physical decomposition, reflecting the structure of SUAVE; a high-level view is shown to avoid overcrowded representations.
Figure 11: Ontological metamodel.
Figure 12: A high-level representation of the mission analysis for a Boeing 737-800, including the required analyses and the physical components whose values serve as input for the analysis.
23 pages, 2628 KiB  
Article
Enhanced Feature Selection via Hierarchical Concept Modeling
by Jarunee Saelee, Patsita Wetchapram, Apirat Wanichsombat, Arthit Intarasit, Jirapond Muangprathub, Laor Boongasame and Boonyarit Choopradit
Appl. Sci. 2024, 14(23), 10965; https://doi.org/10.3390/app142310965 - 26 Nov 2024
Viewed by 847
Abstract
The objectives of feature selection include simplifying modeling and making the results more understandable, improving data mining efficiency, and providing clean and understandable data preparation. With big data, it also allows us to reduce computational time, improve prediction performance, and better understand the data in machine learning or pattern recognition applications. In this study, we present a new feature selection approach based on hierarchical concept models using formal concept analysis (FCA) and a decision tree (DT) for selecting a subset of attributes. The presented methods are evaluated based on all learned attributes with 10 datasets from the UCI Machine Learning Repository by using three classification algorithms, namely decision trees, support vector machines (SVM), and artificial neural networks (ANN). The hierarchical concept model is built from a dataset, and it is selected by top-down considering features (attributes) node for each level of structure. Moreover, this study is considered to provide a mathematical feature selection approach with optimization based on a paired-samples t-test. To compare the identified models in order to evaluate feature selection effects, the indicators used were information gain (IG) and chi-squared (CS), while both forward selection (FS) and backward elimination (BS) were tested with the datasets to assess whether the presented model was effective in reducing the number of features used. The results show clearly that the proposed models when using DT or using FCA, needed fewer features than the other methods for similar classification performance. Full article
Show Figures
Figure 1: A conceptual overview of the proposed models.
Figure 2: The experimental design.
Figure 3: The classification accuracy comparison across DT, SVM, and ANN.
Figure 4: Comparison of DT-SVM-ANN classifiers with features selected using DT from the first to the fifth level and with all original features.
Figure 5: The classification accuracy using FCA-based feature selection across levels and classifiers (Table 5).
Figure 6: Comparison of DT-SVM-ANN classifiers with feature selection using FCA from the first to the fifth level and with all original features.
Figure 7: Comparison of average classification accuracies with each classifier model between features selected using DT or FCA.
Figure 8: The classification accuracy across hierarchical levels for selected datasets: (a) feature selection using decision tree; (b) feature selection using FCA.
Figure 9: An example hierarchical structure using a decision tree and the accuracies with selected features.
Figure 10: The classification performance comparison in Table 7.
Figure 11: Comparison of the proposed feature selection using DT at each level with other methods, using DT-SVM-ANN classifiers.
Figure 12: Comparison of the proposed feature selection using FCA at each level with other methods, using DT-SVM-ANN classifiers.
Figure 13: The number of features selected by (a) DT and (b) FCA methods at each level, compared to the other selection methods.
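As a generic point of comparison with the approach the abstract describes, the sketch below ranks features with a decision tree and keeps only the top-k before training an SVM; it assumes scikit-learn is available, uses its bundled breast-cancer data, and is not the hierarchical concept model proposed in the paper.

```python
# Generic decision-tree feature ranking and top-k selection (illustration only;
# this is not the paper's hierarchical concept model, and it assumes scikit-learn).
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Rank features by decision-tree importance and keep the k most informative ones.
tree = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
k = 5
top_k = tree.feature_importances_.argsort()[::-1][:k]

baseline = SVC().fit(X_tr, y_tr).score(X_te, y_te)
reduced = SVC().fit(X_tr[:, top_k], y_tr).score(X_te, y_te)
print(f"all {X.shape[1]} features: {baseline:.3f}   top {k} features: {reduced:.3f}")
```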
25 pages, 397 KiB  
Article
From Natural to Artificial: The Transformation of the Concept of Logical Consequence in Bolzano, Carnap, and Tarski
by Lassi Saario-Ramsay
Philosophies 2024, 9(6), 178; https://doi.org/10.3390/philosophies9060178 - 23 Nov 2024
Viewed by 927
Abstract
Our standard model-theoretic definition of logical consequence is originally based on Alfred Tarski’s (1936) semantic definition, which, in turn, is based on Rudolf Carnap’s (1934) similar definition. In recent literature, Tarski’s definition is described as a conceptual analysis of the intuitive ‘everyday’ concept of consequence or as an explication of it, but the use of these terms is loose and largely unaccounted for. I argue that the definition is not an analysis but an explication, in the Carnapian sense: the replacement of the inexact everyday concept with an exact one. Some everyday intuitions were thus brought into a precise form, others were ignored and forgotten. How exactly did the concept of logical consequence change in this process? I suggest that we could find some of the forgotten intuitions in Bernard Bolzano’s (1837) definition of ‘deducibility’, which is traditionally viewed as the main precursor of Tarski’s definition from a time before formalized languages. It turns out that Bolzano’s definition is subject to just the kind of natural features—paradoxicality of everyday language, Platonism about propositions, and dependence on the external world—that Tarski sought to tame by constructing an artificial concept for the special needs of mathematical logic. Full article
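For reference, the model-theoretic definition that descends from Tarski's 1936 proposal is usually stated as follows (textbook formulation, not a quotation from the paper):

```latex
\[
  \Gamma \models \varphi
  \quad\Longleftrightarrow\quad
  \text{every model of all sentences in } \Gamma \text{ is also a model of } \varphi .
\]
```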
25 pages, 1557 KiB  
Article
Evidential Analysis: An Alternative to Hypothesis Testing in Normal Linear Models
by Brian Dennis, Mark L. Taper and José M. Ponciano
Entropy 2024, 26(11), 964; https://doi.org/10.3390/e26110964 - 10 Nov 2024
Viewed by 1134
Abstract
Statistical hypothesis testing, as formalized by 20th century statisticians and taught in college statistics courses, has been a cornerstone of 100 years of scientific progress. Nevertheless, the methodology is increasingly questioned in many scientific disciplines. We demonstrate in this paper how many of the worrisome aspects of statistical hypothesis testing can be ameliorated with concepts and methods from evidential analysis. The model family we treat is the familiar normal linear model with fixed effects, embracing multiple regression and analysis of variance, a warhorse of everyday science in labs and field stations. Questions about study design, the applicability of the null hypothesis, the effect size, error probabilities, evidence strength, and model misspecification become more naturally housed in an evidential setting. We provide a completely worked example featuring a two-way analysis of variance. Full article
Show Figures
Figure 1: Probability density functions (solid curves) of the noncentral F(q, n−r, λ) distribution for various values of the sample size n and the noncentrality parameter λ, as in Equation (30). Here λ = nδ², the common form for a simple experimental design, where n is the number of observations and δ² is a generalized squared per-observation effect size. The cumulative distribution function, exemplified as the area under each density curve to the left of the dashed vertical line, is a monotone decreasing function of n. Here q = 6, r = 12, δ² = 0.25, and n takes the values 24, 36, 48, and 60; the dashed curve is the density of the F(q, n−r, λ) distribution with n = 24 and δ² = 0 (central F distribution). For a given effect size, the noncentral distribution increasingly diverges from the central distribution as sample size increases.
Figure 2: Estimated cdf of ΔSIC for the citrus tree example (two-factor analysis of variance, Table 1, with model 1 representing no interactions and model 2 representing interactions), using parametric (solid) and nonparametric (dashed) bootstrap with 1024 bootstrap samples; dotted horizontal lines depict the 0.05 and 0.95 levels.
Figure 3: The effect of sample size on the uncertainty of an evidential estimation. Data are simulated from the estimated model 2 (with interactions); for each data set, confidence intervals were generated with 1024 bootstraps, and the confidence points from 1024 simulated data sets are averaged. Vertical lines show average 90% confidence intervals; open circles and dashes show the average location of the 50% confidence point; the solid horizontal line indicates equal evidence for models 1 and 2; the dotted horizontal line indicates the pseudo-true difference of Kullback–Leibler divergences in the simulations.
Figure 4: Interaction plot, a graphical display of the potential magnitude and location of interaction in a linear model. For a two-factor ANOVA, cell means are plotted against one categorical factor, with lines joining cells that share a level of the second factor; parallel lines indicate no interaction, and the stronger the interaction, the greater the deviation from parallelism. Here, 95% confidence intervals on cell means are computed with a pooled standard error estimate (replication is 2 observations per cell), and the maximum and minimum plausible slopes between adjacent intervals give confidence bounds on the slopes at a level equal to the square of the interval level (90.5% for 95% intervals). In the citrus-yield case study, small changes in the cell mean yields, well within the uncertainties in the cell means, could make all lines parallel, matching the quantitative estimate of very low evidence for interactions.
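The ΔSIC quantity appearing in Figures 2 and 3 is built from the Schwarz information criterion; in one common textbook convention (the paper's own sign convention may differ), it reads:

```latex
\[
  \mathrm{SIC}_i \;=\; -2\ln\hat{L}_i + k_i \ln n ,
  \qquad
  \Delta\mathrm{SIC} \;=\; \mathrm{SIC}_1 - \mathrm{SIC}_2 ,
\]
```

where L̂_i is the maximized likelihood of model i, k_i its number of estimated parameters, and n the sample size; under this convention a positive ΔSIC favours model 2 (here, the model with interactions) over model 1.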
20 pages, 698 KiB  
Article
Beyond Human and Machine: An Architecture and Methodology Guideline for Centaurian Design
by Remo Pareschi
Sci 2024, 6(4), 71; https://doi.org/10.3390/sci6040071 - 4 Nov 2024
Viewed by 1597
Abstract
The concept of the centaur, symbolizing the fusion of human and machine intelligence, has intrigued visionaries for decades. Recent advancements in artificial intelligence have made this concept not only realizable but also actionable. This synergistic partnership between natural and artificial intelligence promises superior outcomes by leveraging the strengths of both entities. Tracing its origins back to early pioneers of human–computer interaction in the 1960s, such as J.C.R. Licklider and Douglas Engelbart, the idea initially manifested in centaur chess but faced challenges as technological advances began to overshadow human contributions. However, the resurgence of generative AI in the late 2010s, exemplified by conversational agents and text-to-image chatbots, has rekindled interest in the profound potential of human–AI collaboration. This article formalizes the centaurian model, detailing properties associated with various centaurian designs, evaluating their feasibility, and proposing a design methodology that integrates human creativity with artificial intelligence. Additionally, it compares this model with other integrative theories, such as the Theory of Extended Mind and Intellectology, providing a comprehensive analysis of its place in the landscape of human–machine interaction. Full article
Show Figures
Figure 1: Simon's Cognitive Architecture.
Figure 2: The evolution of the cognitive trading system, from human-operated trading to AI-augmented trading to a centauric cognitive trading system.
Figure 3: Evolution of a centaur NLP system.
Figure 4: From monotonic to non-monotonic.
Figure 5: CreatiChain creativity loop.
Figure 6: Evolution of chess systems: closed/reductionist approach.
Figure 7: Evolution of art systems: open/centauric approach.