Search Results (249)

Search Parameters:
Keywords = formal concept analysis

25 pages, 397 KiB  
Article
From Natural to Artificial: The Transformation of the Concept of Logical Consequence in Bolzano, Carnap, and Tarski
by Lassi Saario-Ramsay
Philosophies 2024, 9(6), 178; https://doi.org/10.3390/philosophies9060178 (registering DOI) - 23 Nov 2024
Viewed by 23
Abstract
Our standard model-theoretic definition of logical consequence is originally based on Alfred Tarski’s (1936) semantic definition, which, in turn, is based on Rudolf Carnap’s (1934) similar definition. In recent literature, Tarski’s definition is described as a conceptual analysis of the intuitive ‘everyday’ concept of consequence or as an explication of it, but the use of these terms is loose and largely unaccounted for. I argue that the definition is not an analysis but an explication, in the Carnapian sense: the replacement of the inexact everyday concept with an exact one. Some everyday intuitions were thus brought into a precise form, others were ignored and forgotten. How exactly did the concept of logical consequence change in this process? I suggest that we could find some of the forgotten intuitions in Bernard Bolzano’s (1837) definition of ‘deducibility’, which is traditionally viewed as the main precursor of Tarski’s definition from a time before formalized languages. It turns out that Bolzano’s definition is subject to just the kind of natural features—paradoxicality of everyday language, Platonism about propositions, and dependence on the external world—that Tarski sought to tame by constructing an artificial concept for the special needs of mathematical logic.
25 pages, 1557 KiB  
Article
Evidential Analysis: An Alternative to Hypothesis Testing in Normal Linear Models
by Brian Dennis, Mark L. Taper and José M. Ponciano
Entropy 2024, 26(11), 964; https://doi.org/10.3390/e26110964 - 10 Nov 2024
Viewed by 490
Abstract
Statistical hypothesis testing, as formalized by 20th century statisticians and taught in college statistics courses, has been a cornerstone of 100 years of scientific progress. Nevertheless, the methodology is increasingly questioned in many scientific disciplines. We demonstrate in this paper how many of the worrisome aspects of statistical hypothesis testing can be ameliorated with concepts and methods from evidential analysis. The model family we treat is the familiar normal linear model with fixed effects, embracing multiple regression and analysis of variance, a warhorse of everyday science in labs and field stations. Questions about study design, the applicability of the null hypothesis, the effect size, error probabilities, evidence strength, and model misspecification become more naturally housed in an evidential setting. We provide a completely worked example featuring a two-way analysis of variance.
Figure 1. Probability density functions (solid curves) of the noncentral F(q, n − r, λ) distribution for various values of the sample size n and the noncentrality parameter λ, as represented in the formula for f(u) in the text, Equation (30). Here, λ = nδ², which is the common form for a simple experimental design, where n is the number of observations and δ² is a generalized squared per-observation effect size. The cumulative distribution function of the noncentral F distribution, exemplified here as the area under each density curve to the left of the dashed vertical line, is a monotone decreasing function of n. Here, q = 6, r = 12, δ² = 0.25, and n takes the values 24, 36, 48, and 60. The dashed curve is the density function for the F(q, n − r, λ) distribution with n = 24 and δ² = 0 (central F distribution). Notice that for a given effect size, the noncentral distribution increasingly diverges from the central distribution as sample size increases.
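The monotonicity claim in this caption is easy to check numerically. Below is a minimal, stdlib-only Monte Carlo sketch (the function name, the cutoff of 2.0, and the sample count are illustrative choices, not taken from the paper) that estimates the noncentral-F cdf by simulating the defining ratio of chi-square variates:

```python
import random

def noncentral_f_cdf_mc(x, dfn, dfd, nc, samples=20000, seed=42):
    """Monte Carlo estimate of P(F <= x) for a noncentral F(dfn, dfd, nc).

    The noncentral chi-square numerator is built from dfn standard
    normals, one of which is shifted to mean sqrt(nc).
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(samples):
        num = rng.gauss(nc ** 0.5, 1.0) ** 2
        num += sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(dfn - 1))
        den = sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(dfd))
        if (num / dfn) / (den / dfd) <= x:
            hits += 1
    return hits / samples

# Caption's setup: q = 6, r = 12, delta^2 = 0.25, lambda = n * delta^2.
q, r, d2 = 6, 12, 0.25
cdfs = [noncentral_f_cdf_mc(2.0, q, n - r, n * d2) for n in (24, 36, 48, 60)]
assert all(a > b for a, b in zip(cdfs, cdfs[1:]))  # cdf decreases with n
```

With λ growing proportionally to n while the denominator degrees of freedom also grow, the mass of the distribution shifts right, so the estimated cdf at the fixed cutoff falls as n increases, matching the caption.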
Figure 2. Curves: estimated cdf of ΔSIC for the citrus tree example (two-factor analysis of variance, Table 1, with model 1 representing no interactions and model 2 representing interactions) using parametric (solid) and nonparametric (dashed) bootstrap with 1024 bootstrap samples. Dotted horizontal lines depict the 0.05 and 0.95 levels.
Figure 3. The effect of sample size on the uncertainty of an evidential estimation. The data are simulated from the estimated model 2 (representing interactions). For each data set, confidence intervals were generated with 1024 bootstraps. To depict the expected behavior of such intervals, the confidence points from 1024 simulated data sets are averaged. The vertical lines indicate the average 90% confidence intervals. The open circles and the dashes indicate the average location of the 50% confidence point. The solid horizontal line indicates equal evidence for model 1 and model 2. The dotted horizontal line indicates the pseudo-true difference of Kullback–Leibler divergences in the simulations.
Figure 4. Interaction plot. An interaction plot is a graphical display of the potential magnitude and location of interaction in a linear model. For a two-factor ANOVA, a basic interaction plot displays a central measure for each cell (generally the mean or median) on the Y-axis plotted against a categorical factor indicated on the X-axis. The second factor is indicated by lines joining cells that share a factor level. If there is no interaction, these lines will be parallel. The stronger an interaction, the greater the deviation from parallelism will be. Of course, some deviation may result from error in the estimation of cell central values. As a consequence, interaction plots often include a display, such as a boxplot or confidence interval, of the uncertainty in the estimate of each cell's central value. In this figure, we plot 95% confidence intervals of cell means. Because replication is low (2 observations per cell), we calculate these intervals using a pooled estimate of the standard error. We further enhance this plot by including confidence intervals on the slopes of the lines. If one considers any value within an interval for a central value a plausible value, a line from any plausible central value to any plausible value in the next interval represents a plausible slope. The maximum plausible slope runs from the lower bound on the left to the upper bound on the right. Similarly, the minimum plausible slope runs from the upper bound on the left to the lower bound on the right. If the intervals on central values are confidence intervals, then these maximum and minimum plausible slopes are themselves a pair of confidence bounds on the slope whose confidence level is equal to the square of the central-value interval's confidence level. Since in the figure we are using 95% intervals on the cell means, the confidence level on slopes is 90.25%. In the case study of citrus yields, the interaction plot readily shows that small changes in the cell mean yields, well within the uncertainties in cell means, could make all lines parallel. This interpretation matches the quantitative estimate of very low evidence for interactions.
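The plausible-slope construction described in this caption reduces to simple arithmetic on the endpoints of the two cell-mean intervals. A minimal sketch (the function name and the interval values are hypothetical, chosen only for illustration):

```python
def plausible_slope_bounds(ci_left, ci_right, dx=1.0):
    """Min/max plausible slopes between two cell-mean confidence intervals.

    The maximum plausible slope joins the left interval's lower bound to
    the right interval's upper bound; the minimum does the opposite.
    """
    lo_l, hi_l = ci_left
    lo_r, hi_r = ci_right
    return (lo_r - hi_l) / dx, (hi_r - lo_l) / dx

# Two hypothetical 95% CIs on adjacent cell means, one factor level apart:
min_s, max_s = plausible_slope_bounds((1.0, 3.0), (4.0, 6.0))
assert (min_s, max_s) == (1.0, 5.0)

# Joint confidence on the slope bounds is the square of the per-cell level:
assert abs(0.95 ** 2 - 0.9025) < 1e-12
```

The squaring follows because both cell-mean intervals must simultaneously cover their true means for the slope bounds to hold.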
20 pages, 698 KiB  
Article
Beyond Human and Machine: An Architecture and Methodology Guideline for Centaurian Design
by Remo Pareschi
Sci 2024, 6(4), 71; https://doi.org/10.3390/sci6040071 - 4 Nov 2024
Viewed by 669
Abstract
The concept of the centaur, symbolizing the fusion of human and machine intelligence, has intrigued visionaries for decades. Recent advancements in artificial intelligence have made this concept not only realizable but also actionable. This synergistic partnership between natural and artificial intelligence promises superior outcomes by leveraging the strengths of both entities. Tracing its origins back to early pioneers of human–computer interaction in the 1960s, such as J.C.R. Licklider and Douglas Engelbart, the idea initially manifested in centaur chess but faced challenges as technological advances began to overshadow human contributions. However, the resurgence of generative AI in the late 2010s, exemplified by conversational agents and text-to-image chatbots, has rekindled interest in the profound potential of human–AI collaboration. This article formalizes the centaurian model, detailing properties associated with various centaurian designs, evaluating their feasibility, and proposing a design methodology that integrates human creativity with artificial intelligence. Additionally, it compares this model with other integrative theories, such as the Theory of Extended Mind and Intellectology, providing a comprehensive analysis of its place in the landscape of human–machine interaction.
Figure 1. Simon’s Cognitive Architecture.
Figure 2. The Evolution of the Cognitive Trading System from Human-Operated Trading to AI-Augmented Trading to Centauric Cognitive Trading System.
Figure 3. Evolution of a Centaur NLP System.
Figure 4. From Monotonic to Non-monotonic.
Figure 5. CreatiChain Creativity Loop.
Figure 6. Evolution of Chess Systems: Closed/Reductionist Approach.
Figure 7. Evolution of Art Systems: Open/Centauric Approach.
18 pages, 417 KiB  
Article
The Connections Between Attribute-Induced and Object-Induced Decision Rules in Incomplete Formal Contexts
by Hongwei Wang, Huilai Zhi, Yinan Li, Daxin Zhu and Jianbing Xiahou
Symmetry 2024, 16(10), 1403; https://doi.org/10.3390/sym16101403 - 21 Oct 2024
Viewed by 907
Abstract
For a given incomplete context, object-induced approximate concepts have been defined, and this type of approximate concept can induce a type of decision rule. Based on the duality principle, another set of approximate concepts can be defined from the perspective of attributes, i.e., attribute-induced approximate concepts. Although object-induced and attribute-induced approximate concepts are symmetrical by the duality principle, their induced decision rules exhibit different properties, and the connections between attribute-induced and object-induced decision rules in incomplete formal contexts are not clear. To this end, a type of attribute-induced approximate concept and a method of extracting attribute-induced decision rules are presented. More importantly, it is revealed that given one type of decision rule, there must be corresponding decision rules of the other type; both can provide useful information, but they are not equivalent to each other. In other words, each type of decision rule provides some unique and irreplaceable information.
(This article belongs to the Topic Mathematical Modeling of Complex Granular Systems)
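For readers new to the underlying formalism: in a complete formal context, the object-induced and attribute-induced sides are connected by the two derivation operators, whose composition yields formal concepts. A minimal sketch of that classical duality follows (toy context and names invented for illustration; the paper's contribution concerns the approximate operators needed for incomplete contexts, which this sketch does not implement):

```python
def intent(objects, context):
    """Attributes shared by every object in the set (object side)."""
    sets = [context[o] for o in objects]
    # Intent of the empty object set is all attributes in the context.
    return set.intersection(*sets) if sets else set().union(*context.values())

def extent(attributes, context):
    """Objects possessing every attribute in the set (attribute side)."""
    return {o for o, attrs in context.items() if attributes <= attrs}

# A toy complete context: objects mapped to their attribute sets.
ctx = {"o1": {"a", "b"}, "o2": {"b", "c"}, "o3": {"b"}}

B = intent({"o1", "o2"}, ctx)   # shared attributes: {"b"}
A = extent(B, ctx)              # all objects with "b": {"o1", "o2", "o3"}
assert intent(A, ctx) == B      # (A, B) closes into a formal concept
```

Applying one operator and then the other is a closure; the fixed points of that closure are exactly the formal concepts, ordered into the concept lattices shown in the paper's figures.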
Figure 1. Concept lattice L(K̲).
Figure 2. Concept lattice L(K̄).
Figure 3. Approximate concept lattice AL(K).
Figure 4. Experimental results caused by variations of objects (|E|/|C| = 1).
Figure 5. Experimental results caused by variations of objects (|E|/|C| = 0.5).
Figure 6. Experimental results caused by variations of objects (|E|/|C| = 2).
Figure 7. Experimental results caused by variations of fill ratios.
16 pages, 578 KiB  
Article
Bridging the Knowledge–Practice Gap: Assessing Climate Change Literacy Among Science Teachers
by Hiya Almazroa
Sustainability 2024, 16(20), 9088; https://doi.org/10.3390/su16209088 - 20 Oct 2024
Viewed by 790
Abstract
This research aimed to investigate the knowledge levels and teaching practices of Saudi science teachers regarding climate change, focusing on exploring the correlation between these aspects. The cross-sectional descriptive survey included teachers at middle and high school levels in public schools. The questionnaire study comprised three sections: collecting demographic data, assessing teachers’ understanding of climate change through factual inquiries, and evaluating teaching practices related to climate change. The findings reveal a promising degree of awareness among teachers, with a majority correctly identifying crucial elements of climate change while also exposing misconceptions and knowledge gaps. While a notable portion of teachers reported teaching climate change-related aspects, some indicated minimal involvement in extracurricular activities linked to climate change. The correlation analysis between science teachers’ climate change knowledge and practices indicates a weak connection between the two variables, suggesting that teachers’ knowledge might not substantially impact their actual teaching practices regarding climate change concepts. Limitations included reliance on self-reported data and a sample size that could impact result generalizability. Future research recommendations include combining quantitative data with qualitative methods, comparing knowledge and practices across regions or demographics, and conducting longitudinal studies. This study’s implications stress the importance of targeted professional development, advocating for climate change education integration into formal curricula, and policy adjustments mandating climate change education.
(This article belongs to the Section Air, Climate Change and Sustainability)
Figure 1. Process diagram to illustrate the research flow.
45 pages, 9737 KiB  
Article
Residential Care Facilities for Users with Alzheimer’s Disease: Characterisation of Their Architectural Typology
by Santiago Quesada-García, Pablo Valero-Flores and María Lozano-Gómez
Buildings 2024, 14(10), 3307; https://doi.org/10.3390/buildings14103307 - 19 Oct 2024
Viewed by 1143
Abstract
The design and construction of residences for persons with Alzheimer’s disease (AD) have been based on the recommendations of design guides, the results of empirical tests with samples of the population, and the experience of architects and planners. The reiteration of certain patterns, criteria, and guidelines has given rise to a new type of building that has not yet been explicitly described. The aim of this paper is to determine the main characteristics of this typology. This research is based on a critical review methodology, analysing 30 care homes built over the last four decades across various global contexts. Detailed surveys of plans, projects, and buildings were carried out, allowing a comparative analysis of the architectural attributes to determine the most influential parameters for these buildings. The results indicate that environments designed with safety, accessibility, and opportunities for social interaction in mind—and, above all, those that are personalised to the needs of this collective—significantly enhance the behaviour, emotional state, and cognitive state of their residents. The main theoretical contributions include identifying and stating the key features of this type, such as small scale, basic cell housing, comprehensible organisation, and sensory stimulation of spaces, among others. The breakthrough of this study that differentiates it from other works in this field is that it provides concrete guidelines to approach the planning, design, and construction of these kinds of residences. The significance of this research lies in the definition of this unique typology, which is not characterised by its morphology, shape, or formal composition but rather focused on promoting an adequate cognitive and physiological reception of the space by the users. This building concept has important management implications, as its construction must provide for and integrate specific care services in a residential setting for people with AD.
(This article belongs to the Special Issue Advances of Healthy Environment Design in Urban Development)
Figure 1. Green House Project. Photo of The Green House Homes at Green Hill. Bottom left, the general plan of the Green House Project for the 2008 National Design Competition. Bottom right, the group-living unit of the same Green House. (Source: Prepared by the authors and adapted from NKArchitects).
Figure 2. Main contributions of various authors over the last forty years to design principles for environments for people with dementia. (Source: Prepared by the authors).
Figure 3. Corinne Dolan Alzheimer Center, designed by Taliesin Associated Architects and built in 1985 in Heather Hill (Cleveland). (Source: Prepared by the authors and adapted from photography by Pete Guerrero and Daniel Ruark).
Figure 4. Woodside Place, designed by the architectural firm Perkins Eastman, built in 1991 in Oakmont (Pennsylvania, USA). (Source: Prepared by the authors and adapted from photography by Robert Ruschak).
Figure 5. Chronology of the Alzheimer’s residential facilities reviewed and analysed in this research. (Source: Prepared by the authors).
Figure 6. Sample selection flow diagram of the Alzheimer’s residential facilities reviewed and analysed in this research. (Source: Prepared by the authors).
Figure 7. On the left, the Boswijk Residence, built in 2010 in Vught (Holland), an extensive single-storey building designed by EGM Architecten. On the right, the Kompetenzzentrum Demenz Nürnberg residence, designed by Feddersen Architekten, built in 2006 in Nuremberg (Germany), an example of a high-rise development of this residential model. (Source: Prepared by the authors).
Figure 8. Norra Vram Nursing Home, designed by Marge Arkitekter, built in 2008 in Billesholm (Sweden), with a homely environment and domestic scale. (Source: Prepared by the authors and adapted from photography by Johan Fowelin).
Figure 9. Functional programme and zoning of the Woodside Place residence. (Source: Prepared by the authors).
Figure 10. Reina Sofía Foundation Alzheimer’s Centre, designed by Estudio Lamela, built in 2007 in Madrid (Spain). (Source: Prepared by the authors and adapted from Estudio Lamela Arquitectos).
Figure 11. Alzheimer’s Respite Centre, designed by Niall McLaughlin, built in 2011 in Dublin (Ireland). (Source: Prepared by the authors and adapted from Niall McLaughlin Architects and photography by Nick Kane).
Figure 12. Characteristics of the typology of residential facilities for Alzheimer’s patients. (Source: Prepared by the authors).
Figure A1. Data collection sheet for the selected buildings in the sample. (Source: Prepared by the authors).
17 pages, 1446 KiB  
Article
Cell Cycle Complexity: Exploring the Structure of Persistent Subsystems in 414 Models
by Stephan Peter, Arun Josephraj and Bashar Ibrahim
Biomedicines 2024, 12(10), 2334; https://doi.org/10.3390/biomedicines12102334 - 14 Oct 2024
Viewed by 663
Abstract
Background: The regulation of cellular proliferation and genomic integrity is controlled by complex surveillance mechanisms known as cell cycle checkpoints. Disruptions in these checkpoints can lead to developmental defects and tumorigenesis. Methods: To better understand these mechanisms, computational modeling has been employed, resulting in a dataset of 414 mathematical models in the BioModels database. These models vary significantly in detail and simulated processes, necessitating a robust analytical approach. Results: In this study, we apply the chemical organization theory (COT) to these models to gain insights into their dynamic behaviors. COT, which handles both ordinary and partial differential equations (ODEs and PDEs), is utilized to analyze the compartmentalized structures of these models. COT’s framework allows for the examination of persistent subsystems within these models, even when detailed kinetic parameters are unavailable. By computing and analyzing the lattice of organizations, we can compare and rank models based on their structural features and dynamic behavior. Conclusions: Our application of the COT reveals that models with compartmentalized organizations exhibit distinctive structural features that facilitate the understanding of phenomena such as periodicity in the cell cycle. This approach provides valuable insights into the dynamics of cell cycle control mechanisms, refining existing models and potentially guiding future research in this area.
(This article belongs to the Section Cell Biology and Pathology)
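As a concrete anchor for one of COT's two defining conditions: a set of species is closed if no reaction among its members produces anything outside the set. A minimal sketch of computing the closure of a species set over a reaction network (toy species and reactions, not taken from any BioModels entry; an organization additionally requires self-maintenance, a flux-balance condition this sketch does not check):

```python
def closure(species, reactions):
    """Smallest closed superset of `species`: repeatedly add the products
    of every reaction whose reactants are already present, until stable.

    `reactions` is a list of (reactant_set, product_set) pairs.
    """
    closed = set(species)
    changed = True
    while changed:
        changed = False
        for reactants, products in reactions:
            if reactants <= closed and not products <= closed:
                closed |= products
                changed = True
    return closed

# Hypothetical two-reaction network: A -> B and B + C -> D.
rxns = [({"A"}, {"B"}), ({"B", "C"}, {"D"})]
assert closure({"A"}, rxns) == {"A", "B"}
assert closure({"A", "C"}, rxns) == {"A", "B", "C", "D"}
```

Closed sets that are also self-maintaining are the organizations; ordering them by set inclusion gives the lattices shown in the figures below.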
Show Figures

Figure 1

Figure 1
<p>A biochemical reaction network illustrating the interactions and transitions between cyclin, cdc2, and their phosphorylated states in cell cycle regulation according to the paper [<a href="#B37-biomedicines-12-02334" class="html-bibr">37</a>]. The plot was obtained from the EBI Biomodels website. The proteins cdc2 (C2, phosphorylated: C2P) and cyclin (Y, phosphorylated: YP) form a heterodimer (maturation-promoting factor) P-cyclin–cdc2 (and P-cyclin–cdc2-P) that controls the major events of the cell cycle. The numbers inside the square green box denote the reaction numbers.</p>
Full article ">Figure 2
<p>Lattice of organizations of the reaction network of Tyson’s model [<a href="#B37-biomedicines-12-02334" class="html-bibr">37</a>] with reaction numbering according to the SBML model in the BioModels database [<a href="#B69-biomedicines-12-02334" class="html-bibr">69</a>]. An illustrative representation of the organizations as subnetworks of the reaction network follows in <a href="#biomedicines-12-02334-f004" class="html-fig">Figure 4</a>.</p>
Full article ">Figure 3
<p>The numerical simulation of limit cycle oscillations in Tyson’s 1991 model [<a href="#B37-biomedicines-12-02334" class="html-bibr">37</a>] illustrates the log-scaled dynamic concentrations of Cdc2 (C2), the cyclin–Cdc2 complex (CP), phosphorylated cyclin–Cdc2 (pM), cyclin (Y), and phosphorylated cyclin (YP) over a period of 90 min. This visualization highlights the regulatory feedback mechanisms that drive cell cycle progression. The initialization values for the variables C2, CP, M, PM, Y, and Yp were 0, 0, 1, 0, 0.25, 0, and 1, respectively. The reaction constants were k<sub>1</sub> = 0.015, k<sub>2</sub> = 0, k<sub>3</sub> = 200, k<sub>4</sub> = 180, k<sub>5</sub> = 0, k<sub>6</sub> = 1, k<sub>7</sub> = 0.6, k<sub>8</sub> = 1,000,000, and k<sub>9</sub> = 1000.</p>
Full article ">Figure 4
<p>(<b>Left</b>) Reaction network of Tyson’s model [<a href="#B37-biomedicines-12-02334" class="html-bibr">37</a>] overlaid with a Venn diagram depicting the various (non-empty) organizations. (<b>Right</b>) Lattice of organizations from <a href="#biomedicines-12-02334-f002" class="html-fig">Figure 2</a>, with arrows indicating the corresponding subsystems within the reaction network shown on the left.</p>
Full article ">Figure 5
<p>Venn diagram (<b>left</b>) and lattice (<b>right</b>) of organizations in Markevich’s model [<a href="#B68-biomedicines-12-02334" class="html-bibr">68</a>]. There are four non-empty organizations. From the smallest to the biggest, these are <math display="inline"><semantics> <msub> <mi>O</mi> <mn>1</mn> </msub> </semantics></math> (light gray) containing four species; <math display="inline"><semantics> <msub> <mi>O</mi> <mn>2</mn> </msub> </semantics></math>, which contains at least two compartments, one that is colored blue (left) and one that is colored orange (right) and includes all of <math display="inline"><semantics> <msub> <mi>O</mi> <mn>1</mn> </msub> </semantics></math> and a blue-colored one; and finally <math display="inline"><semantics> <msub> <mi>O</mi> <mn>3</mn> </msub> </semantics></math> (dark gray), which is the biggest one and includes the whole system.</p>
Full article ">Figure 6
<p>Histogram of cell cycle models categorized by the organisms studied. Some models overlap, particularly those addressing transitions such as S/G2 or G2/M. The majority of the models focus on the M phase, with approximately 160 dedicated to that stage.</p>
Full article ">Figure 7
<p>Reaction network complexity: Scatter plot of the number of species vs. the number of reactions of each model. As expected, there is an overall positive correlation between the two. The number of species ranges from 1 to 189, and the number of reactions from 2 to 316.</p>
Full article ">Figure 8
<p>Lattice of organizations’ complexity: (<b>a</b>) Histogram of the number of models according to their number of organizations and (<b>b</b>) scatter plot of the number of organizations vs. the number of reactions. The most frequent number of organizations per model was two. The model size with regard to the number of reactions was not strongly connected to the number of organizations. (<b>c</b>) Height vs. width scatter plot, with values of each data point given by color. To better represent the correlation, two models were removed by cutting the diagram above a width of 10: one with a width of 56 and a height of 8, and another with a width of 20 and a height of 7. Additionally, 39 models with no species and 10 very large models in their Hasse diagram were excluded from the plot. In total, the full lattice of organizations was calculated for 275 models within the pre-limited time. (<b>d</b>) Histogram showing the values of persistence of different models. (<b>a</b>) Histogram of the model frequencies according to their number of organizations; (<b>b</b>) Number of organizations vs. number of reactions; (<b>c</b>) Height vs. width scatter plot, with values of each data point represented by color; (<b>d</b>) Histogram showing the values of persistence of different models.</p>
Full article ">Figure 9
<p>Compartmentalization complexity: (<b>a</b>) Comparison of models whose organizations all contain only one compartment versus models with at least one organization requiring more than one compartment. (<b>b</b>) Histogram showing the count of the maximum number of required compartments in an organization across models.</p>
Full article ">Figure 10
<p>Time complexity: The range of computation time (between <math display="inline"><semantics> <msup> <mn>10</mn> <mn>15</mn> </msup> </semantics></math> and <math display="inline"><semantics> <mrow> <mn>5</mn> <mo>·</mo> <msup> <mn>10</mn> <mn>16</mn> </msup> </mrow> </semantics></math> milliseconds) required for the majority of models. The simulations were performed on a machine with an Intel Core i9-9300H CPU, a base clock speed of 2.4 GHz and 16 GB of DDR4 RAM. The operating system used was Windows 11 64-bit. (<b>a</b>) Number of reactions vs. time to compute organizations; (<b>b</b>) number of species vs. time to compute organizations; (<b>c</b>) number of organizations vs. time to compute organizations (milliseconds); (<b>d</b>) distribution of the number of models and time (milliseconds) required by them to compile.</p>
Figure 10 Cont.">
Full article ">
17 pages, 367 KiB  
Article
Three-Valued Concept Analysis for 2R Formal Contexts
by Taisheng Zeng, Huilai Zhi, Yinan Li, Daxin Zhu and Jianbing Xiahou
Mathematics 2024, 12(19), 3015; https://doi.org/10.3390/math12193015 - 27 Sep 2024
Viewed by 354
Abstract
Russian Roulette is a well-known, cruel gambling game, and its concepts and methods have been exploited in many research fields for decades. However, the abundant useful information contained in the process of Russian Roulette is seldom studied with an interpretable mathematical [...] Read more.
Russian Roulette is a well-known, cruel gambling game, and its concepts and methods have been exploited in many research fields for decades. However, the abundant useful information contained in the process of Russian Roulette is seldom studied with an interpretable mathematical model. To this end, we define the 2R formal context to model Russian Roulette and carry out 3-valued concept analysis of 2R formal contexts to mine useful information. First, the uniqueness of 2R formal contexts is discussed from a formal concept analysis viewpoint. We then propose 3-valued 2R concepts and discuss their properties and their connections with the basic 2R concepts. Experimental analysis demonstrates that 3-valued 2R concept lattices reveal many more details than basic 2R concept lattices. Finally, a case study on a Chinese herbal medicine demonstrates the feasibility of the proposed model. Full article
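The 2R and 3-valued constructions above extend classical formal concept analysis, in which a concept is a pair (objects, attributes) closed under the two derivation operators. A minimal sketch of that classical machinery, on an invented toy context (not data from the paper):

```python
from itertools import combinations

# Toy cross-table (object -> attribute set); purely illustrative, not the
# paper's 2R data.
context = {
    "g1": {"a", "b"},
    "g2": {"b", "c"},
    "g3": {"a", "b", "c"},
}
attrs = set().union(*context.values())

def intent(objs):
    """Attributes common to all objects in objs (the prime derivation operator)."""
    common = set(attrs)
    for o in objs:
        common &= context[o]
    return frozenset(common)

def extent(atts):
    """Objects possessing every attribute in atts (the dual prime operator)."""
    return frozenset(o for o, row in context.items() if atts <= row)

def concepts():
    """Enumerate all formal concepts by closing every subset of objects."""
    found = set()
    for r in range(len(context) + 1):
        for objs in combinations(sorted(context), r):
            B = intent(objs)
            A = extent(B)  # (A, B) is closed: extent(B) = A and intent(A) = B
            found.add((A, B))
    return sorted(found, key=lambda c: (len(c[0]), sorted(c[0])))

for A, B in concepts():
    print(sorted(A), "<->", sorted(B))
```

Ordered by extent inclusion, these closed pairs form the concept lattice; the 3-valued 2R lattices of the paper refine this picture by distinguishing additional truth values per cell.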
(This article belongs to the Topic New Advances in Granular Computing and Data Mining)
Show Figures

Figure 1
<p><math display="inline"><semantics> <mrow> <mi>P</mi> <mi>L</mi> <mn>2</mn> <mi>R</mi> <mo>(</mo> <mi>K</mi> <mo>)</mo> </mrow> </semantics></math> of Example 2.</p>
Full article ">Figure 2
<p>Three-valued <math display="inline"><semantics> <mrow> <mn>2</mn> <mi>R</mi> </mrow> </semantics></math> concept lattice <math display="inline"><semantics> <mrow> <mn>3</mn> <mi>V</mi> <mi>L</mi> <mn>2</mn> <mi>R</mi> <mo>(</mo> <mi>K</mi> <mo>)</mo> </mrow> </semantics></math>.</p>
Full article ">Figure 3
<p>Three-valued <math display="inline"><semantics> <mrow> <mn>2</mn> <mi>R</mi> </mrow> </semantics></math> concept vs. basic <math display="inline"><semantics> <mrow> <mn>2</mn> <mi>R</mi> </mrow> </semantics></math> concept with fixed number of attributes.</p>
Full article ">Figure 4
<p>Three-valued <math display="inline"><semantics> <mrow> <mn>2</mn> <mi>R</mi> </mrow> </semantics></math> concept vs. basic <math display="inline"><semantics> <mrow> <mn>2</mn> <mi>R</mi> </mrow> </semantics></math> concept with fixed number of objects.</p>
Full article ">
15 pages, 314 KiB  
Article
“We Owe It to Those Who Shall Come After Us”: Considering the Role of Social Work Education in Disrupting Carceral Complicity
by Carly Mychl Murray, Samantha A. Martinez, Alexa Cinque, Yejin Sohn and Grace Newton
Soc. Sci. 2024, 13(9), 491; https://doi.org/10.3390/socsci13090491 - 17 Sep 2024
Viewed by 889
Abstract
Reflecting upon Mary Richmond’s early call for formalized social work training to address the historical struggles of the field, this analysis examines how American social work education has addressed the paradoxes of help and harm present in the field for more than a [...] Read more.
Reflecting upon Mary Richmond’s early call for formalized social work training to address the historical struggles of the field, this analysis examines how American social work education has addressed the paradoxes of help and harm present in the field for more than a century. We examine how, under the guise of benevolence and care, social work has exerted social control and contributed to gendered criminalization. We use the term carceral complicity to extend the concept of carceral social work, illustrating how carceral complicity has contributed to women’s criminalization through the embedding, enacting, and invisibilizing of carceral logics in social work. In addition to describing how carceral complicity has been addressed in social work education, we illustrate the gendered nature of carceral complicity, highlighting how women have historically and contemporarily been positioned as both the proprietors and the recipients of carceral complicity. In line with recent scholarship, we suggest that through a transformative approach to social work education we may disrupt carceral complicity and support liberatory futures. Full article
26 pages, 3780 KiB  
Article
Open-Source Data Formalization through Model-Based Systems Engineering for Concurrent Preliminary Design of CubeSats
by Giacomo Luccisano, Sophia Salas Cordero, Thibault Gateau and Nicole Viola
Aerospace 2024, 11(9), 702; https://doi.org/10.3390/aerospace11090702 - 27 Aug 2024
Viewed by 664
Abstract
Market trends in the space sector suggest a notable increase in satellite operations and market value for the coming decade. In parallel, there has been a shift in the industrial and academic sectors from traditional Document-Based System Engineering to Model-based systems engineering (MBSE) [...] Read more.
Market trends in the space sector suggest a notable increase in satellite operations and market value for the coming decade. In parallel, there has been a shift in the industrial and academic sectors from traditional Document-Based System Engineering to Model-based systems engineering (MBSE) combined with Concurrent engineering (CE) practices. Due to growing demands, the drivers behind this change have been the need for quicker and more cost-effective design processes. A key challenge in this transition remains to determine how to effectively formalize and exchange data during all design stages and across all discipline-specific tools; as representing systems through models can be a complex endeavor. For instance, during the Preliminary design (PD) phase, the integration of system models with external mathematical models for simulations, analyses, and system budgeting is crucial. The introduction of CubeSats and their standard has partly addressed the question of standardization and has aided in reducing overall development time and costs in the space sector. Nevertheless, questions about how to successfully exchange data endure. This paper focuses on formalizing a CubeSat model for use across various stages of the PD phase. The entire process is conducted with the exclusive use of open-source tools, to facilitate the transparency of data integration across the PD phases, and the overall life cycle of a CubeSat. The paper has two primary outcomes: (i) developing a generic CubeSat model using Systems modeling language (SysML) that includes data storage and visualization through the application of Unified modeling language (UML) stereotypes, streamlining in parallel information exchange for integration with various simulation and analysis tools; (ii) creating an end-to-end use case scenario within the Nanostar software suite (NSS), an open-source framework designed to streamline data exchange across different software during CE sessions. 
A case study from a theoretical academic space mission concept illustrates how to apply the proposed formalization and serves as its preliminary validation. The proposed formalization positions the CubeSat SysML model as the central data source throughout the design process. It also supports automated trade-off analyses by combining the benefits of SysML with effective data instantiation across all PD study phases. Full article
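As a loose illustration of the data-centric workflow described above (not the paper's actual SysML/NSS tooling), once operating modes are exported from the system model as plain data, any discipline tool can recompute budgets from them. All mode names, subsystems, and wattages below are invented:

```python
# Invented CubeSat mode table standing in for data exported from a SysML model.
operating_modes = {
    # mode: {subsystem: average power draw in W} -- all values illustrative
    "safe":     {"obc": 0.4, "eps": 0.3, "comms": 0.1},
    "science":  {"obc": 0.4, "eps": 0.3, "comms": 0.1, "payload": 2.5},
    "downlink": {"obc": 0.4, "eps": 0.3, "comms": 4.0},
}

def power_budget(mode, margin=0.2):
    """Sum the subsystem draws of a mode and apply a flat design margin."""
    base = sum(operating_modes[mode].values())
    return round(base * (1.0 + margin), 3)

for mode in operating_modes:
    print(mode, power_budget(mode), "W")
```

Keeping the mode table as the single source of truth is the point: a power-budget tool, a data-budget tool, and a requirements checker can all consume the same structure without re-entering values by hand.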
(This article belongs to the Special Issue Space Systems Preliminary Design)
Show Figures

Figure 1
<p>Proposed Framework Formalization Steps.</p>
Full article ">Figure 2
<p>System UML stereotypes definition.</p>
Full article ">Figure 3
<p>Orbit and Propagation Losses UML stereotypes definition.</p>
Full article ">Figure 4
<p>Ground station UML stereotype.</p>
Full article ">Figure 5
<p>Operating mode UML stereotype.</p>
Full article ">Figure 6
<p>Payload BDD example.</p>
Full article ">Figure 7
<p>TWC output power and data budgets. (<b>a</b>) TWC power budget. (<b>b</b>) TWC data budget.</p>
Full article ">Figure 8
<p>Example of requirement list output.</p>
Full article ">Figure 9
<p>Example of payload relationship graph. Connections are derived from the BDD in <a href="#aerospace-11-00702-f006" class="html-fig">Figure 6</a>.</p>
Full article ">Figure 10
<p>Data flow scheme of an application of the proposed formalization.</p>
Full article ">Figure A1
<p>Example of UML class and object with inherited attributes and operations.</p>
Full article ">Figure A2
<p>Example of UML generalization.</p>
Full article ">Figure A3
<p>UML stereotype examples. (<b>a</b>) Stereotype definition example. (<b>b</b>) Stereotype application example.</p>
Full article ">Figure A4
<p>TWC system-level Block definition diagram implemented in SysML.</p>
Full article ">
12 pages, 281 KiB  
Article
“They Can’t Possibly Understand What I’m Going Through”: Female Farmers’ Perspectives on Barriers to Care in Georgia
by Noah Hopkins, Lauren Ledbetter Griffeth, Chase Reece and Christina Proctor
Int. J. Environ. Res. Public Health 2024, 21(9), 1130; https://doi.org/10.3390/ijerph21091130 - 27 Aug 2024
Viewed by 1709
Abstract
The purpose of this study was to explore female farmers’ perspectives on barriers to engaging with resources for physical and mental healthcare faced by agriculture producers in the state of Georgia. In-depth interviews were conducted with female farm owners and managers (n [...] Read more.
The purpose of this study was to explore female farmers’ perspectives on barriers to engaging with resources for physical and mental healthcare faced by agriculture producers in the state of Georgia. In-depth interviews were conducted with female farm owners and managers (n = 16) across the state. Interviews were recorded and transcribed, and researchers coded interviews separately before thematic analysis was used to identify common themes. Three primary themes were identified: (i) formal healthcare challenges, (ii) stigma, and (iii) cultural norms. Formal healthcare challenges included time constraints, healthcare costs, and a lack of cultural competence from healthcare providers. Both community and self-stigma were identified as barriers to engaging with mental health resources. Cultural norms that acted as a barrier to care included the prioritization of farm operations, self-reliance, pride, and the minimization of health concerns. Interviewees identified gender differences in the impact of stigma and cultural norms, reporting that these sociocultural barriers were more prominent among older, male producers. Central to many of these barriers is the concept of ‘farm identity’, where farmers’ commitment to their operations consistently trumped concerns about physical or mental health. Future efforts to improve health outcomes among farmers should utilize the concept of farm identity as a guide for tailoring interventions and improving cultural competence among rural healthcare providers. Full article
22 pages, 1224 KiB  
Article
Consumers’ Financial Knowledge in Central European Countries in the Light of Consumer Research
by Łukasz Gębski and Georges Daw
J. Risk Financial Manag. 2024, 17(9), 379; https://doi.org/10.3390/jrfm17090379 - 23 Aug 2024
Viewed by 763
Abstract
Consumer protection in the financial market has several dimensions. From a formal point of view, consumer rights are guaranteed by law. Educational programs are implemented in schools and the media to promote knowledge and responsible use of financial products and services. Despite the [...] Read more.
Consumer protection in the financial market has several dimensions. From a formal point of view, consumer rights are guaranteed by law. Educational programs are implemented in schools and the media to promote knowledge and responsible use of financial products and services. Despite the efforts made, the number of incorrect and suboptimal financial decisions is so high that the risk of households falling into excessive debt remains significant. The limited effectiveness of the law led to the claim that only effective education can reduce the risk of suboptimal financial decisions. Unfortunately, the efforts made in this area are not fully satisfactory. The study of consumers’ financial knowledge, conducted in Poland in January 2024, aimed to identify consumer errors and their nature. The study verified not only declared knowledge but also actual knowledge. The researchers’ doubts resulted from a comparison of the results of scientific research in this area with the current market situation. Consumers declare a high level of knowledge of economic and financial concepts. In practice, however, they make mistakes that indicate not only behavioral cognitive errors but also a lack of knowledge. The test questions were constructed to verify the declared knowledge (using verification questions); these showed that the actual level of knowledge was lower than the declared one. A review of the literature and of studies of the financial knowledge and financial competence of consumers in Central European countries was also carried out. Analysis of the results allowed for the formulation of conclusions regarding the educational gap in relation to social characteristics. The conclusions resulting from the study raise questions about the effectiveness of the educational methods used and indicate possible directions of change in consumer regulation policy, the aim of which is to ensure a high level of consumer protection.
Full article
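The article tests the relationship between declared and actual knowledge with Kendall's Tau (Figure 3). As a hedged sketch with invented scores (the survey data is not reproduced here), the tau-a variant can be computed directly from concordant and discordant pairs:

```python
from itertools import combinations

def kendall_tau(x, y):
    """Kendall's tau-a: (concordant - discordant) / total pairs, no tie correction."""
    assert len(x) == len(y) and len(x) > 1
    concordant = discordant = 0
    for i, j in combinations(range(len(x)), 2):
        s = (x[i] - x[j]) * (y[i] - y[j])
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    return (concordant - discordant) / (len(x) * (len(x) - 1) // 2)

# Invented 1-5 scores: self-declared knowledge vs. score on verification questions.
declared = [5, 4, 4, 3, 2]
actual = [3, 4, 2, 3, 1]
print(kendall_tau(declared, actual))  # prints 0.4
```

A tau well below 1 on such data is exactly the pattern the authors describe: declared and actual knowledge are positively related, but the declared level systematically overstates the actual one. (Published analyses typically use the tie-corrected tau-b; this sketch omits that correction for brevity.)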
(This article belongs to the Section Economics and Finance)
Show Figures

Figure 1
<p>Exchange rate changes in Poland (1996–2024). Source: <a href="#B33-jrfm-17-00379" class="html-bibr">National Bank of Poland</a> (<a href="#B33-jrfm-17-00379" class="html-bibr">2024b</a>).</p>
Full article ">Figure 2
<p>Interest rates in Poland (2015–2024). <b>Source:</b> <a href="#B32-jrfm-17-00379" class="html-bibr">National Bank of Poland</a> (<a href="#B32-jrfm-17-00379" class="html-bibr">2024a</a>).</p>
Full article ">Figure 3
<p>The relationship between the declared level of financial knowledge and actual knowledge (tested using Kendall’s Tau). Source: own research.</p>
Full article ">
18 pages, 3351 KiB  
Article
Examining an Evolving Biologically Inspired Design Professional Learning Environment through Conjecture Mapping and Design-Based Research
by Abeera P. Rehmat, Alexandra A. Towner, Meltem Alemdar, Michael E. Helms, Jeffrey H. Rosen, Roxanne A. Moore and Marc J. Weissburg
Biomimetics 2024, 9(8), 468; https://doi.org/10.3390/biomimetics9080468 - 2 Aug 2024
Viewed by 865
Abstract
Biologically inspired design (BID) in engineering is a convergent, systematic approach that uses analogies from biological organisms to develop solutions for human engineering and design problems. Based on outcomes from prior studies of integrating BID in higher education, incorporating BID into pre-college education [...] Read more.
Biologically inspired design (BID) in engineering is a convergent, systematic approach that uses analogies from biological organisms to develop solutions for human engineering and design problems. Based on outcomes from prior studies of integrating BID in higher education, incorporating BID into pre-college education is a logical evolution. For effective BID instruction of these convergent concepts in pre-college education, teachers need to be well-equipped with biological, engineering, and pedagogical knowledge, both general and unique to the convergent, still-evolving discipline. In this paper, we investigate the professional learning environment designed to foster engineering teachers’ understanding of BID integration in engineering and determine to what extent the evolving professional learning environment fostered engineering teachers’ conceptual knowledge of BID across the three-year project. This design study applies conjecture mapping with design-based research (DBR) to examine a professional learning environment that changed over three summers and its impact on teachers’ conceptual understanding of BID integration in engineering. The analysis indicates that a combination of experiential and informal learning experiences, along with engagement in a formal design challenge, promoted teacher enthusiasm and a conceptual understanding of BID across the three years. Professional learning fostered teachers’ understanding of BID integration in engineering and enabled them to integrate BID into their engineering teaching practice. Full article
(This article belongs to the Special Issue Biomimetic Process and Pedagogy: Second Edition)
Show Figures

Figure 1
<p>Generalized conjecture map for educational design research [<a href="#B27-biomimetics-09-00468" class="html-bibr">27</a>].</p>
Full article ">Figure 2
<p>An evaluation cycle using design-based research and conjecture mapping.</p>
Full article ">Figure 3
<p>Embedded design cycles of the professional learning across the three years.</p>
Full article ">Figure 4
<p>Original conjecture map of the professional learning environment.</p>
Full article ">Figure 5
<p>The 2020 professional learning environment conjecture map [<a href="#B11-biomimetics-09-00468" class="html-bibr">11</a>].</p>
Full article ">Figure 6
<p>The 2021 professional learning environment conjecture map [<a href="#B11-biomimetics-09-00468" class="html-bibr">11</a>].</p>
Full article ">Figure 7
<p>The 2022 professional learning environment conjecture map.</p>
Full article ">
24 pages, 1452 KiB  
Article
An Innovative Method for Deterministic Multifactor Analysis Based on Chain Substitution Averaging
by Veselin Mitev and Nikolay Hinov
Mathematics 2024, 12(14), 2215; https://doi.org/10.3390/math12142215 - 15 Jul 2024
Viewed by 732
Abstract
The aims of this paper are to present the methodology, derived mathematical expressions for determining the individual factor influences and the adaptation for the conditions of dynamic deterministic factor analysis and the results of the application of the developed new method for deterministic [...] Read more.
This paper presents the methodology of a newly developed method for deterministic factor analysis, called the averaged chain substitution method: the derived mathematical expressions for determining individual factor influences, their adaptation to the conditions of dynamic deterministic factor analysis, and the results of applying the method. After formulating the concept of the considered approach, all mathematical expressions used to create models containing up to four factor variables are presented and summarized. The scientific novelty of the study lies in the new equations, obtained by the averaged chain substitution method and the method of analogy, for determining the individual factor influences in five-factor additive- or difference-multiplicative models and in five-factor models with an additive or difference part in the numerator of the factor model. The presented mathematical expressions accurately and unambiguously quantify the impact of individual factor influences for all types of factor models and thus significantly expand the applicability of the averaged chain substitution method in the theory and practice of financial-economic analysis. The proposed formalization and algorithmization of the evaluation process make the method easy for economic and financial analysts to apply for the purposes of deterministic factor analysis. The methodology was applied to perform a dynamic deterministic factor analysis of the total liquidity of Monbat AD and ELHIM-ISKRA AD for the period 2017–2021, based on the consolidated annual financial statements of the companies, available on the website of the Bulgarian Stock Exchange. Full article
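The averaging idea behind the method can be sketched numerically: classical chain substitution assigns factor influences that depend on the order in which factors are switched from base to reporting values, so averaging over all orders yields an order-independent, exactly additive decomposition. The closed-form multi-factor equations are the paper's contribution; the brute-force averaging below is only an illustrative sketch with invented values:

```python
from itertools import permutations
from math import factorial

def averaged_chain_substitution(f, base, actual):
    """Average each factor's chain-substitution influence over all n! orders."""
    n = len(base)
    influence = [0.0] * n
    for order in permutations(range(n)):
        current = list(base)
        for k in order:
            before = f(*current)
            current[k] = actual[k]  # switch factor k to its reporting value
            influence[k] += f(*current) - before
    return [v / factorial(n) for v in influence]

# Two-factor multiplicative model y = a * b with invented values.
y = lambda a, b: a * b
infl = averaged_chain_substitution(y, base=(2.0, 5.0), actual=(3.0, 7.0))
print(infl, sum(infl))  # influences sum exactly to y(actual) - y(base)
```

For y = a·b this reproduces the textbook symmetric split, Δa·(b0 + b1)/2 and Δb·(a0 + a1)/2, and the influences always sum exactly to the total change of the resultative indicator, which is the property the method is built around.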
(This article belongs to the Section Financial Mathematics)
Show Figures

Figure 1
<p>Stages of the averaged chain substitution method [<a href="#B21-mathematics-12-02215" class="html-bibr">21</a>] (p. 93).</p>
Full article ">Figure 2
<p>The quantitative impact of absolute changes of the variable factors of value of material inventories, value of current receivables, value of financial assets, value of cash and cash equivalents, and value of current liabilities on the absolute change of the resultative indicator—total liquidity ratio by sub-period of “Monbat” JSC for the period 2017–2022.</p>
Full article ">Figure 3
<p>The quantitative impact of absolute changes of the variable factors of value of material inventories, value of current receivables, value of financial assets, value of cash and cash equivalents, and value of current liabilities on the absolute change of the resultative indicator—total liquidity ratio by sub-period of ELHIM-ISKRA JSC for the period 2017–2022.</p>
Full article ">
13 pages, 1846 KiB  
Article
Multidimensional Representation of Semantic Relations between Physical Theories, Fundamental Constants and Units of Measurement with Formal Concept Analysis
by Mariana Espinosa-Aldama and Sergio Mendoza
Symmetry 2024, 16(7), 899; https://doi.org/10.3390/sym16070899 - 15 Jul 2024
Viewed by 884
Abstract
We propose several hierarchical graphs that represent the semantic relations between physical theories, their fundamental constants and units of measurement. We begin with an alternative representation of Zel’manov’s cube of fundamental constants as a concept lattice. We then propose the inclusion of a [...] Read more.
We propose several hierarchical graphs that represent the semantic relations between physical theories, their fundamental constants and units of measurement. We begin with an alternative representation of Zel’manov’s cube of fundamental constants as a concept lattice. We then propose the inclusion of a new fundamental constant, Milgrom’s critical acceleration, and discuss the implications of such analysis. We then look for the same fundamental constants in a graph that relates magnitudes and units of measurement in the International System of Units. This exercise shows the potential of visualizing hierarchical networks as a tool to better comprehend the symmetries, interrelations and dependencies of physical magnitudes, units and theories. New regimes of application may be deduced, as well as an interesting reflection on our ontologies and corresponding theoretical objects. Full article
Show Figures

Figure 1
<p>Bronstein’s space of physical theories. The image was taken from [<a href="#B2-symmetry-16-00899" class="html-bibr">2</a>], under a Creative Commons Copyright License.</p>
Full article ">Figure 2
<p>Bronstein’s space of physical theories and their correlation to cosmology. “Continuous lines correspond to already existing theories. Dotted lines correspond to still unresolved problems” (Image taken from [<a href="#B2-symmetry-16-00899" class="html-bibr">2</a>] under a Creative Commons Copyright License).</p>
Full article ">Figure 3
<p>Redrawing of Zel’manov’s cube of physical theories in the cG<span class="html-italic">ℏ</span> coordinate system, as described by Gorelik and Frenkel [<a href="#B8-symmetry-16-00899" class="html-bibr">8</a>]. Acronyms are as follows: NG: Newton’s theory of gravity; STR: Special theory of relativity; QM: Quantum mechanics; GTR: General theory of relativity; SRQFT: Special relativistic quantum field theory; GRQT: General relativistic quantum theory. Notice there are two corners without a label.</p>
Full article ">Figure 4
<p>On the left: the cube of physical theories by Okun [<a href="#B7-symmetry-16-00899" class="html-bibr">7</a>], where TOE, NM and QFT appear, and the c axis is now the 1/c axis. The abbreviations in the diagram stand for NM—Newtonian Mechanics; NG—Newtonian Gravity; STR—Special Theory of Relativity; QM—Quantum Mechanics; GTR—General Theory of Relativity; NQG—Non-relativistic Quantum Gravity; QFT—Quantum Field Theory; TOE—Theory of Everything. The figure on the right corresponds to the lattice of physical theories. Unlike a cube with coordinate axes, where Newtonian Mechanics (NM) lies at «the origin» (0,0,0), here it lies at the supreme node while the Theory of Everything (TOE) is at the bottom node. We have further generalized the names: NQG to non-Relativistic Quantum Gravitation (nRQG), QM to non-Relativistic Quantum Physics, and QFT to Relativistic Quantum Physics (RQPh), in order to accommodate other models that would describe such regimes.</p>
Full article ">Figure 5
<p>Lattice of fundamental constants includes Milgrom’s constant <math display="inline"><semantics> <msub> <mi>a</mi> <mn>0</mn> </msub> </semantics></math> and four new model theories: Extended Gravitation (ExG), Extended Relativistic Gravitation (ExRG) and Extended Quantum Gravitation (ExQG). Relativistic Quantum Gravity (QGR) replaces the original Theory Of Everything (TOE), leaving the new TOE in the lowest node, which does not exist either, but would have to include all fundamental constants—that is, all regimes. Blue half nodes indicate a new attribute; black half nodes indicate a new object.</p>
Full article ">Figure 6
<p>Concept lattice with more constants: the electric charge <span class="html-italic">e</span>, Boltzmann’s constant <span class="html-italic">k</span> and Milgrom’s acceleration <math display="inline"><semantics> <msub> <mi>a</mi> <mn>0</mn> </msub> </semantics></math> call for the Electrodynamic Theory, Quantum Relativistic Extended Gravitation and Thermodynamics.</p>
Full article ">Figure 7
<p>Two representations of the relationships between units of the international system through fundamental constants. The left image was taken from Pisanty [<a href="#B49-symmetry-16-00899" class="html-bibr">49</a>] under a Creative Commons Copyright License and the right one is our proposal. The colors used in the lattice follow Pisanty’s selection to facilitate comparison between both diagrams.</p>
Full article ">Figure 8
<p>Hierarchical network for derived magnitudes. Red arrows represent multiplication of basic units. Blue arrows represent a division of basic units. Dotted bold arrows keep the hierarchical relations brought from Pisanty’s network.</p>
Full article ">Figure 9
<p>Hierarchical network for derived magnitudes with fundamental constants (circle nodes).</p>
Full article ">
Back to TopTop