Results for "Information theory"

Showing 20 of ~21,723,317 results · from arXiv, DOAJ, CrossRef, Semantic Scholar

S2 Open Access 2014
From the Phenomenology to the Mechanisms of Consciousness: Integrated Information Theory 3.0

Masafumi Oizumi, L. Albantakis, G. Tononi

This paper presents Integrated Information Theory (IIT) of consciousness 3.0, which incorporates several advances over previous formulations. IIT starts from phenomenological axioms: information says that each experience is specific – it is what it is by how it differs from alternative experiences; integration says that it is unified – irreducible to non-interdependent components; exclusion says that it has unique borders and a particular spatio-temporal grain. These axioms are formalized into postulates that prescribe how physical mechanisms, such as neurons or logic gates, must be configured to generate experience (phenomenology). The postulates are used to define intrinsic information as “differences that make a difference” within a system, and integrated information as information specified by a whole that cannot be reduced to that specified by its parts. By applying the postulates both at the level of individual mechanisms and at the level of systems of mechanisms, IIT arrives at an identity: an experience is a maximally irreducible conceptual structure (MICS, a constellation of concepts in qualia space), and the set of elements that generates it constitutes a complex. According to IIT, a MICS specifies the quality of an experience and integrated information ΦMax its quantity. From the theory follow several results, including: a system of mechanisms may condense into a major complex and non-overlapping minor complexes; the concepts that specify the quality of an experience are always about the complex itself and relate only indirectly to the external environment; anatomical connectivity influences complexes and associated MICS; a complex can generate a MICS even if its elements are inactive; simple systems can be minimally conscious; complicated systems can be unconscious; there can be true “zombies” – unconscious feed-forward systems that are functionally equivalent to conscious complexes.

989 citations · en · Medicine, Computer Science
DOAJ Open Access 2025
Developing a Change Management Framework to Enhance Operational Excellence in Law Enforcement Organizations

Ayda Mussa Yousif Abdulrahman, Rafiduraida binti Abdul Rahman

This research aims to investigate the current operational status of the Ajman Police, focusing on identifying elements and issues that affect operational excellence. Using change management models, including Kotter's 8-Step Model and the ADKAR Model, the paper critically examines the hierarchical structure of the Ajman Police, its specialist groups, and their performance indicators. The problem statement highlights the negative impact of traditional, rigid organizational structures on innovation and responsiveness, and the limitations they impose on implementing effective public safety measures, prevention, and community policing. The research adopts a qualitative methodology: semi-structured interviews were conducted with 10 senior police officers to record their views on operational issues and readiness for change. Results indicate that the Ajman Police has already ventured into technological advancements and civil policing. However, gaps remain in continuous development, innovation, and the implementation of modern change management practices. The research proposes a culturally, operationally, and technologically oriented framework for change management, specifically tailored to the context of the Ajman Police. The study makes a significant contribution to both practice and theory by providing a guideline for a change management roadmap for the Ajman Police and other similar agencies, ensuring operational excellence in fast-changing environments.

Management information systems, Economic history and conditions
DOAJ Open Access 2025
Potęga pustki. Spojrzenie na problem banalizacji z perspektywy myśli buddyjskiej (pamięci Czesława Robotyckiego) [The Power of Emptiness: A Look at the Problem of Trivialization from the Perspective of Buddhist Thought (In Memory of Czesław Robotycki)]

Grzegorz Dąbrowski

This article was inspired by a paper written by Czesław Robotycki, which was entitled O banalizacji tekstów w etnografii [On the Trivialization of Texts in Ethnography]. According to Robotycki, the trivialization boils down to the disproportion between the tools used and the social and cultural reality they describe. This is particularly the case when authors resort to ready‑made and often identical interpretative clichés, usually theories that are popular at a given time. This article is, in a sense, an elaboration of Robotycki’s ideas, although its main aim is to provide key information and categories related to the understanding of the Buddhist concept of emptiness which, from the perspective of Buddhist thought, expresses the complexity of all phenomena while sensitizing us to the fact that every generalization, whether it is a single word or an elaborate theory, is at best a form of image of the reality described in this way.

History of Civilization
DOAJ Open Access 2025
Assessing patient preferences for medical decision making - a comparison of different methods

Jakub Fusiak, Andreas Wolkenstein, Verena S. Hoffmann

Background: Patient preferences are a critical component of shared decision-making (SDM), particularly when choosing between treatment options with differing risks and outcomes. Many methods exist to elicit these preferences, but their complexity, usability, and acceptance vary. Objective: We aim to gain insight into the acceptance, effort, and preferences of participants regarding five different methods of preference assessment. Additionally, we investigate the influence of health status, experiences within the health system, and demographic factors on the results. Methods: We conducted a cross-sectional online survey including five preference elicitation methods: best-worst scaling, direct weighting, PAPRIKA (Potentially All Pairwise Rankings of all Possible Alternatives), time trade-off, and standard gamble. The questionnaire was distributed via academic and patient advocacy mailing lists, reaching both healthy individuals and those with acute or chronic illnesses. Participants rated each method using six standardized statements on a 5-point Likert scale. Additional items assessed general acceptance of algorithm-assisted preference assessments and the clarity of the questionnaire. Results: Of 258 initiated questionnaires, 123 (48%) were completed and included in the analysis. Participants were diverse in age, gender, and health status, but predominantly highly educated and digitally literate. Across all measures, the PAPRIKA method received the highest ratings for clarity, usability, and perceived ability to express preferences. Simpler methods (best-worst scaling, direct weighting) were rated as less useful for capturing nuanced preferences, while abstract utility-based methods (standard gamble, time trade-off) were seen as cognitively demanding. Subgroup analyses showed minimal variation across demographic groups. Most participants (82%) could imagine using at least one of the presented methods in real clinical settings, but also emphasized the importance of physician involvement in interpreting results. Conclusion: The interactive PAPRIKA method best balanced cognitive demand and expressiveness and was preferred by most participants. Structured methods for preference elicitation may enhance SDM when integrated into clinical workflows and supported by healthcare professionals. Further research is needed to evaluate their use in real-world decisions and among more diverse patient populations.

Medicine, Public aspects of medicine
S2 Open Access 2018
A Tutorial for Information Theory in Neuroscience

N. Timme, C. Lapish

Understanding how neural systems integrate, encode, and compute information is central to understanding brain function. Frequently, data from neuroscience experiments are multivariate, the interactions between the variables are nonlinear, and the landscape of hypothesized or possible interactions between variables is extremely broad. Information theory is well suited to address these types of data, as it possesses multivariate analysis tools, it can be applied to many different types of data, it can capture nonlinear interactions, and it does not require assumptions about the structure of the underlying data (i.e., it is model independent). In this article, we walk through the mathematics of information theory along with common logistical problems associated with data type, data binning, data quantity requirements, bias, and significance testing. Next, we analyze models inspired by canonical neuroscience experiments to improve understanding and demonstrate the strengths of information theory analyses. To facilitate the use of information theory analyses, and an understanding of how these analyses are implemented, we also provide a free MATLAB software package that can be applied to a wide range of data from neuroscience experiments, as well as from other fields of study.
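The binning-based estimators the tutorial walks through can be sketched compactly. The snippet below is a minimal plug-in estimator in Python rather than the paper's released MATLAB toolbox; the bin count, sample size, and noise level are illustrative assumptions, and it implements none of the bias-correction or significance-testing procedures the tutorial covers.

```python
import numpy as np

def entropy(x, bins=8):
    """Plug-in Shannon entropy (bits) of a continuous signal after binning."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]                      # drop empty bins: 0*log(0) contributes 0
    return -np.sum(p * np.log2(p))

def mutual_information(x, y, bins=8):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), estimated from a joint histogram."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1)              # marginal of X
    py = pxy.sum(axis=0)              # marginal of Y
    hx = -np.sum(px[px > 0] * np.log2(px[px > 0]))
    hy = -np.sum(py[py > 0] * np.log2(py[py > 0]))
    hxy = -np.sum(pxy[pxy > 0] * np.log2(pxy[pxy > 0]))
    return hx + hy - hxy

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
noise = rng.normal(size=5000)
print(mutual_information(x, x + 0.1 * noise))   # strongly dependent: large MI
print(mutual_information(x, noise))             # independent: MI near zero
```

How both estimates shift with the bin count and sample size is exactly the kind of logistical issue (binning, bias, data quantity) the tutorial addresses.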

214 citations · en · Computer Science, Medicine
arXiv Open Access 2024
On Jacob Ziv's Individual-Sequence Approach to Information Theory

Neri Merhav

This article stands as a tribute to the enduring legacy of Jacob Ziv and his landmark contributions to information theory. Specifically, it delves into the groundbreaking individual-sequence approach -- a cornerstone of Ziv's academic pursuits. Together with Abraham Lempel, Ziv pioneered the renowned Lempel-Ziv (LZ) algorithm, a beacon of innovation in various versions. Beyond its original domain of universal data compression, this article underscores the broad utility of the individual-sequence approach and the LZ algorithm across a wide spectrum of problem areas. As we traverse through the forthcoming pages, it will also become evident how Ziv's visionary approach has left an indelible mark on my own research journey, as well as on those of numerous colleagues and former students. We shall explore, not only the technical power of the LZ algorithm, but also its profound impact on shaping the landscape of information theory and its applications.
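The incremental parsing rule at the heart of this approach is easy to state: scan the sequence and cut a new phrase each time the current suffix has not been seen before; the resulting phrase count is what drives the individual-sequence compressibility bounds. Below is a minimal sketch of the LZ78 variant of the parsing, without the output encoding of (phrase index, next symbol) pairs that the full algorithm emits.

```python
def lz78_phrases(s):
    """Incremental (LZ78) parsing: split s into the sequence of shortest
    phrases not seen before; the phrase count bounds the compressibility
    of the individual sequence."""
    dictionary = {""}        # phrases seen so far (empty phrase is implicit)
    phrases = []
    current = ""
    for ch in s:
        current += ch
        if current not in dictionary:   # shortest new phrase found: cut here
            dictionary.add(current)
            phrases.append(current)
            current = ""
    if current:                         # trailing phrase may repeat an old one
        phrases.append(current)
    return phrases

print(lz78_phrases("aababcbababc"))   # → ['a', 'ab', 'abc', 'b', 'aba', 'bc']
```

A highly repetitive sequence yields few, long phrases, while an incompressible one yields many short phrases, which is how the phrase count serves as a complexity measure for a single sequence with no probabilistic model attached.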

en cs.IT
DOAJ Open Access 2024
Construction of a semi-distributed hydrological model considering the combination of saturation-excess and infiltration-excess runoff space under complex substratum

Yingying Xu, Qiying Yu, Chengshuai Liu et al.

Study region: A typical basin in humid areas of the Huaihe River. Study focus: Accurate flood forecasting is essential for making timely decisions regarding flood control and disaster reduction. The theory of watershed runoff generation and convergence serves as a crucial foundation for flood forecasting, while the calculation of runoff is necessary to simulate flood discharge. Identifying watershed runoff generation mechanisms has been a challenging task, particularly under complex underlying surface conditions. To improve the accuracy of flood simulation, this study examines the underlying surface information in the watershed, such as particle composition and content, soil bulk density, geological slope, land use, and other spatial attributes, aiming to analyze the mechanisms of runoff generation. In the study of sub-watersheds, various combinations of runoff generation mechanisms are identified to determine the patterns of runoff. Subsequently, a semi-distributed hydrological model is developed, which incorporates both saturation-excess and infiltration-excess runoff, utilizing the information obtained from the underlying surface. The model is validated using rainfall-runoff data from 14 events at the Xiagushan watershed. New hydrological insights for the region: The analysis of the fundamental physical conditions of the underlying surface of the watershed revealed that 69.70% of the area is prone to saturation-excess runoff, with the remaining 30.30% being susceptible to infiltration-excess runoff. The model considers the spatial distribution of runoff patterns by incorporating complex underlying surface information and demonstrates high accuracy in simulating flood events (NSE = 0.87, Epeak = 12.08%, Wpeak = 13.16%, Tpeak = 0.14 h, R² = 0.90).
The model is straightforward, practical, and exhibits promising potential in terms of timeliness and applicability, thus lending itself well to further application in other watersheds, contributing to the scientific foundation of flood warning and forecasting efforts.

Physical geography, Geology
DOAJ Open Access 2024
Progressive Optimal Fault-Tolerant Control Combining Active and Passive Control Manners

Dan Du, Zetao Li, Boutaib Dahhou

This study develops a progressive optimal fault-tolerant control method based on insufficient fault information. By combining passive and active fault-tolerant control during the process of fault diagnosis, insufficient fault information is fully used and an optimal fault-tolerant control effect is achieved. In addition, a fault-tolerant control method based on guaranteed-cost robust control is introduced. The proposed progressive optimal fault-tolerant control method considers two aspects. First, as the amount of fault information continually increases, the performance index of the progressive optimal fault-tolerant controller improves. Second, at each moment, optimal fault-tolerant control is achieved according to the currently available fault information and prior knowledge. The process of progressive optimal fault-tolerant control converges to active fault-tolerant control when the fault is completely identified, and the optimal fault-tolerant controller is no longer reconfigured once no more useful fault information can be provided. Furthermore, a progressive optimal fault-tolerant control algorithm based on grid segmentation of the parameter uncertainty domain and the selection of different auxiliary center points is introduced. Simulation results verify the feasibility of the proposed algorithm and the validity of the proposed theory.

Materials of engineering and construction. Mechanics of materials, Production of electric energy or power. Powerplants. Central stations
DOAJ Open Access 2024
Psychometric analysis and the implications for the use of the scoliosis research society questionnaire (SRS-22r English) for individuals with adolescent idiopathic scoliosis

Donna J. Oeffinger, PhD, Henry Iwinski, MD, Vishwas Talwalkar, MD et al.

Background: Despite widespread usage of the SRS-22r questionnaire (Scoliosis Research Society Questionnaire-22r), the English version has only sparingly been subjected to analysis using modern psychometric techniques for patients with adolescent idiopathic scoliosis (AIS). The study purpose was to improve interpretation and clinical utility of the SRS-22r for adolescents with AIS by generating additional robust evidence, using modern statistical techniques. Questions about (1) Structure and (2) Item and Scale Functioning are addressed and interpreted for clinicians and researchers. Methods: This retrospective case review analyzed SRS-22r data collected from 1823 patients (mean age 14.9 ± 2.2 years) with a primary diagnosis of AIS who clinically completed an SRS-22r questionnaire. Individual SRS-22r questions and domain scores were retrieved through data queries. Patient information collected through chart review included diagnosis, age at assessment, sex, race, and radiographic parameters. From 6044 SRS-22r assessments, 1 assessment per patient was randomly selected. Exploratory structural equation modeling (ESEM) and item response theory (IRT) techniques were used for data modeling, item calibration, and reliability assessment. Results: ESEM demonstrated acceptable fit to the data: χ²(130) = 343.73, p < .001; RMSEA = 0.035; CFI = 0.98; TLI = 0.96; SRMR = 0.02. Several items failed to adequately load onto their assigned factor. Item fit was adequate for all items except SRSq10 (Self-Image), SRSq16 (Mental Health), and SRSq20 (Mental Health). IRT models found item discriminations within normal levels for items in psychological measures, except items SRSq1 (Pain), SRSq2 (Pain), and SRSq16 (Mental Health). Estimated reliability of the Function domain (ρ = 0.69) was low; however, the Pain, Self-Image, and Mental Health domains exhibited high (ρ > 0.80) reliability.
Conclusions: A modern psychometric assessment of the SRS-22r in adolescent patients with AIS is presented and interpreted to assist clinicians and researchers in understanding its strengths and limitations. Overall, the SRS-22r demonstrated good psychometric properties in all domains except Function. Cautious interpretation of the total score is suggested, as it does not reflect a single HRQoL construct.

Orthopedic surgery, Neurology. Diseases of the nervous system
DOAJ Open Access 2024
Analogue Computation Converter for Nonhomogeneous Second-Order Linear Ordinary Differential Equation

Gabriel Nicolae Popa, Corina Maria Diniș

Among many other applications, electronic converters can be used with sensors that have analogue outputs (DC voltage). This article presents an analogue computation converter with two DC voltages at the inputs (one input changes the frequency of the output signal, the other changes its amplitude) that provides a periodic sinusoidal signal with variable frequency and amplitude at the output. The analogue computation converter is based on a nonhomogeneous second-order linear ordinary differential equation that is solved in the analogue domain. The converter consists of analogue multipliers and operational amplifiers arranged as seven function circuits: two analogue multiplication circuits, two analogue addition circuits, one non-inverting amplifier, and two integration circuits (with RC time constants). The output of the oscillator is a sinusoidal signal that depends on the DC voltages applied to the two inputs (0 ÷ 10 V): at one input, a DC voltage is applied to linearly change the output frequency (up to tens of kHz, according to the two time constants), and at the other input, a DC voltage is applied to linearly change the amplitude of the oscillator output signal (up to 10 V). The converter can be used with sensors whose DC output voltage must be converted to a sine-wave signal of variable frequency and amplitude for transmitting information over longer distances through wires. This article presents the detailed theory of operation, simulations, and experiments of the analogue computation converter.
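The oscillating core described above, two integrator stages in a loop solving a second-order ODE, can be checked numerically. The sketch below integrates the homogeneous equation y'' = -(k·v_freq)²·y in Python; the gain constant k, the time step, and the semi-implicit Euler scheme are illustrative assumptions, not values or methods from the article.

```python
def oscillator(v_freq, v_amp, k=1000.0, dt=1e-6, steps=20000):
    """Integrate y'' = -(k*v_freq)^2 * y, the homogeneous core realized by
    the converter's two integrator stages: v_freq sets the angular
    frequency omega = k*v_freq, and v_amp scales the output amplitude."""
    omega = k * v_freq
    y, dy = 1.0, 0.0                 # initial condition starts the oscillation
    out = []
    for _ in range(steps):
        dy -= omega**2 * y * dt      # first integrator: accumulate -omega^2*y
        y += dy * dt                 # second integrator (semi-implicit Euler,
        out.append(v_amp * y)        # which keeps the amplitude stable)
    return out

def zero_crossings(sig):
    """Count sign changes; proportional to the oscillation frequency."""
    return sum(1 for a, b in zip(sig, sig[1:]) if (a >= 0) != (b >= 0))

print(zero_crossings(oscillator(1.0, 1.0)), zero_crossings(oscillator(2.0, 1.0)))
```

Counting zero crossings confirms the linear frequency law: doubling the frequency-control voltage roughly doubles the number of crossings in a fixed time window, while the second input only rescales the amplitude.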

Electronic computers. Computer science
DOAJ Open Access 2023
Soft theorems for boosts and other time symmetries

Lam Hui, Austin Joyce, Ilia Komissarov et al.

We derive soft theorems for theories in which time symmetries — symmetries that involve the transformation of time, an example of which are Lorentz boosts — are spontaneously broken. The soft theorems involve unequal-time correlation functions with the insertion of a soft Goldstone in the far past. Explicit checks are provided for several examples, including the effective theory of a relativistic superfluid and the effective field theory of inflation. We discuss how in certain cases these unequal-time identities capture information at the level of observables that cannot be seen purely in terms of equal-time correlators of the field alone. We also discuss when it is possible to phrase these soft theorems as identities involving equal-time correlators.

Nuclear and particle physics. Atomic energy. Radioactivity
DOAJ Open Access 2023
Toward an Environmental Education of Students at the Faculty of Natural Sciences, the University of Namibe

Ubaldo Jorge Augusto de Filipe André, Ana Paula Sarmento do Santos, Onelis Portuondo Savón et al.

Context: Today, quite a few environmental problems require swift responses in terms of adjustment, mitigation, and sustainability. Accordingly, how can university students acquire effective environmental education so they can play their social roles in balance with environmental protection? Aim: To recommend methodological actions that contribute to student education at the Faculty of Natural Sciences, the University of Namibe, Angola. Methods: The study drew on qualitative methods of social research, using methods and techniques such as analysis-synthesis, inductive-deductive reasoning, and documentary review to process information about environmental education and climate change in university education. Results: Five methodological guidelines for environmental education were established. They were inserted into the subject Physics II, in the first year of the Marine Biology bachelor degree, with six general actions that link theory and practice through the teaching process in the degrees of Oceanography, Marine Biology, and Marine Resources. The study demonstrated the fulfillment of learning objectives related to Sustainable Development Goals Nos. 13 and 14, based on UNESCO (2017) guidelines. Conclusions: There is potential for students to acquire environmental information through methodological actions by the staff, in terms of subject preparation at the Faculty of Natural Sciences, the University of Namibe.

Agriculture (General)
DOAJ Open Access 2023
Health Equity Journal: Special Issue Guest Editorial

The practices of contemporary clinical decision-making and care rely heavily on racial biological essentialism, which is a set of ideas originating in modern science that describes populations as comprising distinct subpopulations with unique sets of essential, heritable characteristics and propensities (i.e., races) purportedly due to their biology.1 Racial biological essentialism exaggerates the relevance of biology to health inequities and promotes the misuse of race (e.g., “Black race”) in clinical decision-making, care, and research. Decades of research and scholarship2–4 (e.g., the Human Genome Project, which inadvertently established that more genetic variation exists within each U.S. racial category than between them) have shown that race is fundamentally not biological. A substantial body of evidence clarifies that race is a sociopolitical, not biological, construct.5 Nevertheless, the harmful, unscientific practices of racial biological essentialism persist, which helps explain why the misuse of race in clinical decision-making, research, and education remains pervasive. For many years, medical trainees, health equity scholars, and public health physicians have explained how race consciousness (i.e., racism consciousness), which is the understanding of race as a sociopolitical construct, provides a more useful understanding for medicine and public health than racial biological essentialism does.6 This work has taken many forms, including the de-implementation of race-based algorithms used as clinical decision-making tools. In 2020, the Ways and Means Committee in the U.S. House of Representatives asked professional societies across medical disciplines to rethink their use of race-based clinical algorithms. The Committee sent a “Request for Information” (RFI) to medical professional societies endorsing the elimination of race-based clinical algorithms.
The study findings and RFI responses were captured in a 2021 report, which also included recommendations to improve clinical decision-making.7 The Agency for Healthcare Research and Quality (AHRQ) is also taking on this issue. At the time of this publication, AHRQ is undertaking a systematic review to provide Congress and the public with responses to key questions on the impact of race-based clinical algorithms on health outcomes, and on what can be done to address and/or mitigate racial bias in the development, validation, etc., of clinical algorithms.8 Since 2021, several professional societies have updated their positions on the inclusion of race in clinical algorithms within their respective specialties. The National Kidney Foundation and the American Society of Nephrology officially endorsed an estimated glomerular filtration rate (e-GFR) calculator without a race variable.9 The American College of Obstetricians and Gynecologists no longer endorses a vaginal birth after caesarean calculator that uses race.10 Most recently, the American Thoracic Society issued updated recommendations on spirometry testing and a race-neutral reference equation for all patients, irrespective of race.11 As research continues to elucidate the harms and any benefits of including race in clinical algorithms, the urgency to address race-based algorithms is only intensifying. For instance, 35% of Americans suffering from renal failure are Black, while Black Americans represent only 13% of the population.12 Many social and clinical factors contribute to this stark inequity, including the misuse of race to modify the e-GFR score, which is used in the diagnosis and treatment of chronic kidney disease (CKD).
Race and other social factors have been linked to the e-GFR and other statistics, an association tied to disproportionate suffering from CKD and its sequelae among Black populations (e.g., higher rates of end-stage CKD diagnosis and lower rates of kidney transplantation eligibility among Black populations).13 Health equity experts agree that the implementation of nonrace-based clinical algorithms cannot be subjected to the 10-plus-year timeframe typical for medical research and its adoption into practice.14 To meet the urgency of this moment, the NYC Department of Health and Mental Hygiene launched the Coalition to End Racism in Clinical Algorithms (CERCA). This coalition is a citywide initiative consisting of both safety-net hospitals and academic medical centers representing all five boroughs of NYC. Participation in CERCA requires that each coalition member commit to de-implementing at least one race-based algorithm. Members are also required to furnish work, evaluation, and patient engagement plans regarding their de-implementation of race-based algorithms.15,16 In the summer of 2023, the NYC Department of Health and Mental Hygiene hosted the first annual New York City Anti-racism in Medical Education Symposium in partnership with the Josiah Macy Jr. Foundation, the Association of American Medical Colleges, and the Fund for Public Health NYC. This symposium aimed to identify key stakeholders involved in anti-racism and curriculum development at NYC medical schools and to understand the depth and breadth of anti-racism praxis incorporated into their educational programming.16 This special issue of Health Equity hopes to contribute to this growing body of knowledge regarding the de-implementation efforts needed to holistically address and eradicate race essentialism from practice and education.
Specifically, this issue highlights scholarship in the following areas: historical origins of race adjustment in medicine, clinical decision-making tools, and artificial intelligence tools in medicine; current activities, successes, and challenges around the removal of race from clinical decision-making tools at the institution and system level and its impact on patient outcomes; city, state, and federal policies and policy analysis supporting removal of race adjustment from clinical decision-making tools; and programs, interventions, and policies intended to interrupt or end algorithmic racial discrimination in medicine and health care. We hope this publication captures the latest work in addressing race-based medicine and facilitates the adaptation and implementation of initiatives to correct and mitigate the harmful effects of racial discrimination in health care. As we chart the next steps of this movement (which include equitable transplantation access, federal changes in Medicaid and Medicare policy on the use of race-based algorithms, biases in artificial intelligence, and the application of public health critical race praxis (PHCRP) in research, to name a few emerging areas), remaining abreast of current and needed work will be essential to realizing a more equitable, just, and healthy society. According to PHCRP, which is a health equity offshoot of Critical Race Theory, the first step toward advancing health equity is to acknowledge how the conventions of our field help reinforce inequities, however well-intentioned our efforts may be. The use of arbitrary race corrections in clinical algorithms relies on and reifies racial biological determinism. It undermines the ability of clinicians to uphold their commitment to beneficence, nonmaleficence, and justice in the provision of care. Failure to uphold these commitments harms minoritized patients and communities through, for instance, delayed or missed diagnoses and the exacerbation of racialized stigma.
A substantial body of scholarship and research now exists to promote more equitable clinical decision-making and care. Consistent with the PHCRP principle of disciplinary self-critique, this special issue documents the continued misuse of race in clinical algorithms. It also offers constructive alternatives that can be implemented immediately.17,18

Public aspects of medicine
DOAJ Open Access 2023
Iterative algorithm for accurate superposition of contours with non-uniform sampling step

R.R. Diyazitdinov

In this article, we describe an iterative algorithm for accurate superposition of contours with a non-uniform sampling step. The processed contours have the same shape, but the sampling step is non-uniform, and there is no matching between points of the superposed contours. This makes it impossible to use methods that estimate superposition parameters from matched points. The algorithm proposed herein estimates the offsets and the rotation angle separately. Its idea is the iterative correction of parameters: an estimate of the offsets is used to refine the estimate of the rotation angle and, vice versa, an estimate of the rotation angle is used to refine the estimate of the offsets. The proposed algorithm is faster than a brute-force algorithm and has a lower estimation error than algorithms that analyze contour macroparameters.
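The alternating structure the abstract describes (use the current rotation estimate to update the offsets, then the current offsets to update the rotation) can be illustrated in a deliberately simplified setting with matched points; note that the paper's actual contribution is handling unmatched, non-uniformly sampled contours, which this sketch does not reproduce.

```python
import math

def superpose(p, q, iters=30):
    """Alternately refine the rotation angle and the offsets that map
    contour p onto contour q (lists of (x, y) tuples).
    Simplified illustration: assumes point-to-point correspondence,
    which the published algorithm specifically avoids."""
    theta, tx, ty = 0.0, 0.0, 0.0
    n = len(p)
    for _ in range(iters):
        # offsets from the current rotation estimate
        c, s = math.cos(theta), math.sin(theta)
        tx = sum(qx - (c * px - s * py) for (px, py), (qx, qy) in zip(p, q)) / n
        ty = sum(qy - (s * px + c * py) for (px, py), (qx, qy) in zip(p, q)) / n
        # rotation from the current offset estimate (least-squares angle)
        num = sum((qy - ty) * px - (qx - tx) * py for (px, py), (qx, qy) in zip(p, q))
        den = sum((qx - tx) * px + (qy - ty) * py for (px, py), (qx, qy) in zip(p, q))
        theta = math.atan2(num, den)
    return theta, tx, ty

# noise-free demo: recover a rotation of 0.5 rad and an offset of (2, -1)
p = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 2.0)]
c, s = math.cos(0.5), math.sin(0.5)
q = [(c * x - s * y + 2.0, s * x + c * y - 1.0) for x, y in p]
print(superpose(p, q))   # ≈ (0.5, 2.0, -1.0)
```

Each half-step is optimal given the other parameter, so the pair of estimates contracts toward the true transform, which is the same mutual-correction idea the abstract describes for the unmatched case.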

Information theory, Optics. Light
DOAJ Open Access 2023
Study of the growth mechanism of a self-assembled and ordered multi-dimensional heterojunction at atomic resolution

Zunyu Liu, Chaoyu Zhao, Shuangfeng Jia et al.

Multi-dimensional heterojunction materials have attracted much attention due to their intriguing properties, such as high efficiency, wide band-gap regulation, low-dimensional confinement, versatility, and scalability. To further improve the performance of materials, researchers have combined materials of various dimensions using a wide variety of techniques. However, research on the growth mechanism of such composite materials is still lacking. In this paper, the growth mechanism of a multi-dimensional heterojunction composite material is studied using quasi-two-dimensional (quasi-2D) antimonene and quasi-one-dimensional (quasi-1D) antimony sulfide as examples. These are synthesized by a simple thermal injection method. It is observed that the resulting nanorods are oriented along six-fold symmetric directions on the nanoplate, forming ordered quasi-1D/quasi-2D heterostructures. Comprehensive transmission electron microscopy (TEM) characterizations confirm the chemical information and reveal the orientational relationship between the Sb2S3 nanorods and the Sb nanoplate substrate. Further density functional theory calculations indicate that interfacial binding energy is the primary deciding factor for the self-assembly of ordered structures. These findings may fill the gaps in research on multi-dimensional composite materials with ordered structures and promote their future versatile applications.

Applied optics. Photonics

Page 2 of 1,086,166