R. Magin
Results for "Mathematics"
Showing 20 of ~3,520,922 results · from DOAJ, Semantic Scholar, CrossRef
E. Weintraub
How Economics Became a Mathematical Science. By E. Roy Weintraub. Durham, NC: Duke University Press. 2002. Pp. xiii, 313. $18.95 (paperback).

It is a contemporary truism that if you hope to make a contribution to the field of economics, or even to study it, you had better first get a solid background in mathematics. It should also be evident that this has not always been the case, as anyone who has read Adam Smith or Karl Marx, or for that matter Frank Knight, J. M. Keynes, or F. A. Hayek, well knows. Sometime in the twentieth century economics changed, and changed profoundly. In How Economics Became a Mathematical Science, Roy Weintraub attempts to make some sense of the transformation.

I was not prepared to enjoy this book. I knew Weintraub to be a lively writer, but his topic was daunting. I was half expecting a forthrightly pedantic and in the end (at least, given my own tastes) ploddingly dull monograph that followed the usual formulaic listing of "seminal contributions" on the road to full mathematization. In glum anticipation I imagined the questions to be covered: Shall we start with Cournot? With Bentham and the felicific calculus? Was Walras, as Schumpeter wrote, the greatest economist who ever lived, at least until his twentieth-century successors, Arrow and Debreu and McKenzie, surpassed him? There would of course be the obligatory chapter on Samuelson's Foundations, another on the introduction in economics of fixed point theorems, still another on the fine points of proof strategies. At least for me, reading such a book would be a real challenge. It would be a duty, and a grim one.

To my great good fortune, Roy Weintraub's How Economics Became a Mathematical Science is neither pedantic nor dull. It is, in fact, an altogether extraordinary book. Organized not in terms of a grand narrative, it is instead a series of snapshots. The snapshots are not necessarily of seminal moments, but of representative ones, and they are well chosen. Like all good history, the stories Weintraub recounts are multilayered, complex, even messy, a word he uses (p. 207). It would be unfair to expect that such a collection of vignettes should hang together. Incredibly, though, they do. As a final bonus, there are real surprises to be found in nearly every chapter.

This, then, is an enjoyable book to read. But it is also an important one. I do not think it is an exaggeration to say that How Economics Became a Mathematical Science itself promises to change the way that people view the relationship between mathematics and economics. Weintraub says as much at the start of one of his chapters: "Modern controversies over formalism in economics rest on misunderstandings about the history of mathematics, the history of economics, and the history of the relationship between mathematics and economics" (p. 72). I usually hate such bald statements. It turns out, though, that he's right.

Weintraub starts from a startlingly simple premise: both economics and mathematics have changed over the course of the past 100-plus years, so it makes sense to look at them both. Looking at how they have changed, and at their interaction, might reveal some things that would be lost if one followed the all-too-usual Whig history route, the latter a form of backward induction in which one uses the present state of economics as a guide for picking out which episodes in the past are worthy of attention. The surprises begin in the first chapter.
When most economists think of mathematics, they think of a stable discipline consisting of a set of related subject areas (algebra, geometry, calculus, topology) that yield tools for economists to use in constructing models of varying levels of generality. The tools are out there, as it were, sitting in books on the shelf, only to be learned and applied. This "bookshelf view of mathematical knowledge" is challenged in the very first chapter, one that carries the intriguing title, "Burn the Mathematics (Tripos)." …
Reza Firmansyah Putranto, Novita Kurnia Ningrum
The accumulation of unmanaged organic waste remains a critical environmental issue, highlighting the need for technological support to improve composting efficiency and monitoring. This study proposes an Internet of Things (IoT)-based system for monitoring compost fermentation conditions using temperature and humidity sensors, combined with Fuzzy Logic and R-square (R²) analysis to evaluate fermentation quality. The system employs a DHT11 sensor integrated with an ESP8266 microcontroller to collect temperature and humidity data in real time over a 20-day observation period, resulting in 1,008 data points. Fuzzy Logic is applied through fuzzification, rule-based inference, and defuzzification to classify compost conditions into four categories: poor, good, very good, and cooling needed. The model’s performance is further validated using multiple linear regression, with temperature and humidity as independent variables and average temperature as the dependent variable. The results show that compost temperature ranged between 28–32°C and humidity between 50–87%, indicating that the fermentation process was predominantly in the mesophilic or early composting phase. The fuzzy inference results demonstrate that most conditions fell within the “good” category, while the R² value of 0.87 indicates a strong relationship between the observed variables. These findings confirm that the integration of IoT, Fuzzy Logic, and statistical analysis is effective as a real-time monitoring and decision support system for compost management, while also highlighting the need for additional parameters to achieve a more comprehensive compost quality assessment.
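As a rough illustration of the two analysis steps the abstract describes (fuzzy classification of temperature/humidity readings and an R² check), the sketch below uses made-up membership thresholds, category cutoffs, and data; it is not the authors' rule base or dataset.

```python
# Illustrative sketch only: fuzzy-style classification of compost readings and
# an R^2 computation. All thresholds, categories, and numbers are assumptions,
# not the study's actual rules or data.
import numpy as np

def trapezoid(x, a, b, c, d):
    """Trapezoidal membership function rising on [a, b], plateau [b, c], falling on [c, d]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

def classify(temp_c, humidity_pct):
    """Return the category with the highest combined membership."""
    temp_good = trapezoid(temp_c, 26, 30, 45, 55)          # mesophilic range
    scores = {
        "poor": trapezoid(temp_c, 0, 5, 20, 28),            # too cold to compost well
        "good": min(temp_good, trapezoid(humidity_pct, 40, 50, 70, 90)),
        "very good": min(temp_good, trapezoid(humidity_pct, 50, 55, 65, 75)),
        "cooling needed": trapezoid(temp_c, 55, 60, 90, 95),
    }
    return max(scores, key=scores.get)

def r_squared(y_true, y_pred):
    """Coefficient of determination between observed and predicted values."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

print(classify(30, 60))                             # -> "good"
print(r_squared([28, 30, 32], [28.5, 29.8, 31.9]))  # close to 1 for a good fit
```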
Sadri Alija, Bright Asare, Senad Orhani et al.
Gianluigi Rozza, Oliver Schütze, Nicholas Fantuzzi
This Special Issue comprises the first collection of papers submitted by the Editorial Board Members (EBMs) of the journal Mathematical and Computational Applications (MCA), as well as outstanding scholars working in the core research fields of MCA [...]
Uwe Wolter
Based on a formalization of open formulas as statements in context, the paper presents a fresh and abstract view of logics and specification formalisms. Generalizing concepts like sets of generators in Group Theory, the underlying graph of a sketch in Category Theory, sets of individual names in Description Logic and the underlying graph-based structure of a software model in Software Engineering, we coin an abstract concept of context. We show how to define, in a category independent way, arbitrary first-order statements in arbitrary contexts. Examples of those statements are defining relations in Group Theory, commutative, limit and colimit diagrams in Category Theory, assertional axioms in Description Logic and constraints in Software Engineering. To validate the appropriateness of the newly proposed abstract framework, we prove that our category independent definitions and constructions give us a very broad spectrum of Institutions of Statements. For any Institution of Statements, a specification (presentation) is given by a context together with a set of first-order statements in that context. Since many of our motivating examples are variants of sketches, we will simply use the term sketch for those specifications. We investigate exhaustively different kinds of arrows between sketches and their interrelations. To pave the way for a future development of category independent deduction calculi for sketches, we define arbitrary first-order sketch conditions and corresponding sketch constraints as a generalization of graph conditions and graph constraints, respectively. Sketch constraints are the crucial conceptual tool to describe and reason about the structure of sketches. We close the paper with some vital observations, insights and ideas related to future deduction calculi for sketches. Moreover, we outline that our universal method to define sketch constraints enables us to establish and to work with conceptual hierarchies of sketches.
Hao Xu, Yuntian Chen, Dongxiao Zhang
Abstract The interpretability of deep neural networks has attracted increasing attention in recent years, and several methods have been created to interpret the “black box” model. Fundamental limitations remain, however, that impede the pace of understanding the networks, especially the extraction of understandable semantic space. In this work, the framework of semantic explainable artificial intelligence (S‐XAI) is introduced, which utilizes a sample compression method based on the distinctive row‐centered principal component analysis (PCA) that is different from the conventional column‐centered PCA to obtain common traits of samples from the convolutional neural network (CNN), and extracts understandable semantic spaces on the basis of discovered semantically sensitive neurons and visualization techniques. Statistical interpretation of the semantic space is also provided, and the concept of semantic probability is proposed. The experimental results demonstrate that S‐XAI is effective in providing a semantic interpretation for the CNN, and offers broad usage, including trustworthiness assessment and semantic sample searching.
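To make the row-centered versus column-centered distinction concrete, the minimal numpy sketch below centers each sample (row) by its own mean rather than centering each feature (column). It only illustrates the terminology and is not the S-XAI implementation; the data and component counts are assumptions.

```python
# Minimal numpy sketch contrasting conventional column-centered PCA with a
# row-centered variant (each sample centered by its own mean). Illustration
# of the terminology only, not the S-XAI code.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 16))        # toy data: 100 samples x 16 features

def pca_components(X_centered, k):
    """Top-k principal directions of an already-centered matrix via SVD."""
    _, _, vt = np.linalg.svd(X_centered, full_matrices=False)
    return vt[:k]

# Conventional PCA: subtract the per-feature (column) mean.
X_col = X - X.mean(axis=0, keepdims=True)
components_col = pca_components(X_col, k=3)

# Row-centered PCA: subtract each sample's (row) mean instead.
X_row = X - X.mean(axis=1, keepdims=True)
components_row = pca_components(X_row, k=3)

print(components_col.shape, components_row.shape)   # (3, 16) (3, 16)
```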
Miotk Mateusz, Żyliński Paweł
In this paper, we provide a structural characterization of graphs having a spanning tree with disjoint dominating and 2-dominating sets.
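For reference, the standard definitions of the two notions (general background, not the paper's characterization) are:

```latex
% Standard definitions, stated for background only.
A set $D \subseteq V(G)$ is \emph{dominating} if every vertex of
$V(G) \setminus D$ has at least one neighbor in $D$, i.e.
$|N(v) \cap D| \ge 1$ for all $v \in V(G) \setminus D$.
A set $D_2 \subseteq V(G)$ is \emph{2-dominating} if every vertex of
$V(G) \setminus D_2$ has at least two neighbors in $D_2$, i.e.
$|N(v) \cap D_2| \ge 2$ for all $v \in V(G) \setminus D_2$.
```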
Marc Van Zanten, Marja Van den Heuvel-Panhuizen
Since the late 1960s, a reform in mathematics education, which is currently known under the name Realistic Mathematics Education (RME), has been taking place in the Netherlands. Characteristic for this approach to mathematics education is that mathematics is not seen as ready-made knowledge but as an activity of the learner. Although much has been written about the big ideas and intentions of RME, and multiple RME-oriented textbooks have been published, up to now the development of this approach to mathematics education has not been thoroughly investigated. In the research reported in this article, we traced how RME has evolved over the years. The focus in our study was on early addition and subtraction in primary school. For this, we studied RME core curriculum documents and analyzed RME-oriented textbooks that have been published between the onset of RME and the present. We found that the big ideas and teaching principles of RME were clearly reflected in the learning facilitators for learning early addition and subtraction and were steadily present in curriculum documents over the years, although some were made concrete in further detail. Furthermore, we found all RME learning facilitators also to be present in all RME-oriented textbooks, though in some cases in other ways than originally intended. Our research shows the complexity of a curriculum reform process and its implementation in textbooks.
Shrooq Alsenan, Isra Al-Turaiki, Alaaeldin Hafez
The blood–brain barrier plays a crucial role in regulating the passage of 98% of the compounds that enter the central nervous system (CNS). Compounds with high permeability must be identified to enable the synthesis of brain medications for the treatment of various brain diseases, such as Parkinson’s, Alzheimer’s, and brain tumors. Throughout the years, several models have been developed to solve this problem and have achieved acceptable accuracy scores in predicting compounds that penetrate the blood–brain barrier. However, predicting compounds with “low” permeability has been a challenging task. In this study, we present a deep learning (DL) classification model to predict blood–brain barrier permeability. The proposed model addresses the fundamental issues presented in former models: high dimensionality, class imbalances, and low specificity scores. We address these issues to enhance the high-dimensional, imbalanced dataset before developing the classification model: the imbalanced dataset is addressed using oversampling techniques and the high dimensionality using a non-linear dimensionality reduction technique known as kernel principal component analysis (KPCA). This technique transforms the high-dimensional dataset into a low-dimensional Euclidean space while retaining invaluable information. For the classification task, we developed an enhanced feed-forward deep learning model and a convolutional neural network model. In terms of specificity scores (i.e., predicting compounds with low permeability), the results obtained by the enhanced feed-forward deep learning model outperformed those obtained by other models in the literature that were developed using the same technique. In addition, the proposed convolutional neural network model surpassed models used in other studies in multiple accuracy measures, including overall accuracy and specificity. The proposed approach thus addresses the low specificity, and the resulting high false-positive rate, that earlier models inevitably faced.
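A minimal sketch of the preprocessing sequence the abstract describes (oversampling an imbalanced dataset, then kernel PCA) is shown below on synthetic data. The library choices (imbalanced-learn's SMOTE, scikit-learn's KernelPCA) and every parameter are assumptions for illustration, not the authors' exact pipeline or data.

```python
# Sketch of the described preprocessing: oversample the minority class, then
# project to a low-dimensional space with kernel PCA. Synthetic stand-in data;
# parameters are illustrative assumptions only.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import KernelPCA
from imblearn.over_sampling import SMOTE

# Toy high-dimensional, imbalanced dataset standing in for the BBB data.
X, y = make_classification(n_samples=500, n_features=200, n_informative=30,
                           weights=[0.85, 0.15], random_state=0)

# 1) Address the class imbalance with oversampling.
X_bal, y_bal = SMOTE(random_state=0).fit_resample(X, y)

# 2) Reduce dimensionality with a non-linear (RBF) kernel PCA projection.
kpca = KernelPCA(n_components=32, kernel="rbf", gamma=1e-3, random_state=0)
X_low = kpca.fit_transform(X_bal)

print(X.shape, "->", X_low.shape)   # (500, 200) -> (n_resampled, 32)
```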
Xin Wang, Peng Cao
In this paper, we prove the following result by a perturbation technique: if $q$ is a quasinilpotent element of a Banach algebra and the spectrum of $p + q$ for any other quasinilpotent $p$ contains at most $n$ values, then $q^n = 0$. Applications to C*-algebras are given.
Dmytro Malenko
Most multi-user applications need an access system so that users can log in from different devices. To identify a user, at least one authentication method must be used. The article presents a comparison of the authentication method using NFC technology with other methods that are available for Android devices and differ in both implementation and specific usage.
Mohammed Al-Neima, Amir Mohammed
Cabrera-Mohammed proved that the imbedding of a norm ideal on a Hilbert space into the algebra of quotients with bounded evaluation is continuous, among other properties. In this paper we improve this result by using a complex Banach space instead of a Hilbert space.
Gabriella Bognár, Krisztián Hriczó
The aim of this paper is to investigate the boundary layer of a ferrofluid flow induced by a permeable stretching sheet. The fluid is electrically non-conducting and is subject to a non-uniform magnetic field. The governing non-linear partial differential equations are reduced to non-linear ordinary differential equations by applying a similarity transformation. Numerical solutions are obtained using Maple. The effects of the magnetic field, the Reynolds number and the porosity on the velocity and thermal fields are investigated. The impact of these parameters on the skin friction and the local Nusselt number is examined numerically. The skin friction and heat transfer coefficients decrease as the stretching, the porosity and the ferromagnetic parameter increase.
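For background, a typical similarity transformation for stretching-sheet boundary-layer problems has the Crane-type form below; this is standard textbook material for the basic impermeable, non-magnetic case and may differ from the paper's exact variables and parameters.

```latex
% Standard Crane-type similarity variables for a sheet stretched with velocity
% u_w = c x (background only; the paper's exact transformation may differ).
\[
  \eta = y\sqrt{\frac{c}{\nu}}, \qquad
  u = c\,x\,f'(\eta), \qquad
  v = -\sqrt{c\,\nu}\,f(\eta),
\]
% which reduces the momentum boundary-layer equation, in the basic case, to
\[
  f''' + f\,f'' - (f')^{2} = 0,
  \qquad f(0)=0,\ \ f'(0)=1,\ \ f'(\eta)\to 0 \ \text{as}\ \eta\to\infty .
\]
```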
M. Iftakhar Alam, Jafrin Sultana
One of the most challenging tasks in clinical trials is finding the maximum tolerated dose (MTD) to be tested in the next phase. Ensuring the safety of the patients and recommending a suitable dose for phase II are the main objectives of a phase I trial. The MTD can be identified through various approaches. A non-parametric approach, known as the isotonic design, is explored in our study. The design relies on the monotonicity assumption of the dose-toxicity relationship. Usually the number of patients in a trial has an impact on the adequacy of the dose recommendation. This paper is a humble attempt to assess the impact of cohort size and the total number of cohorts on the isotonic design. It investigates the possibility of improving the current algorithm of the isotonic design for escalation and de-escalation. The paper also proposes a stopping rule to avoid recommending a severely toxic dose as the MTD. The simulation study shows that, along with the total number of cohorts, cohort size also has an appreciable effect on MTD selection. The proposed modification of the algorithm has also been found to work satisfactorily in the majority of cases.
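The monotone (isotonic) estimate of dose-toxicity probabilities that such a design relies on can be sketched as below. This is not the paper's simulation code; the dose levels, cohort outcomes, and target toxicity rate are made up for illustration.

```python
# Sketch of the isotonic (monotone non-decreasing) estimate of dose-toxicity
# probabilities underlying an isotonic design. All numbers are illustrative.
import numpy as np
from sklearn.isotonic import IsotonicRegression

doses = np.array([1, 2, 3, 4, 5])                  # dose levels
n_treated = np.array([6, 6, 6, 6, 3])              # patients treated per dose
n_toxic = np.array([0, 2, 1, 3, 2])                # dose-limiting toxicities

raw_rates = n_toxic / n_treated                    # may violate monotonicity
iso = IsotonicRegression(y_min=0.0, y_max=1.0, increasing=True)
iso_rates = iso.fit_transform(doses, raw_rates, sample_weight=n_treated)

target = 0.30                                      # target toxicity probability
mtd_index = np.argmin(np.abs(iso_rates - target))  # dose closest to the target
print("isotonic toxicity estimates:", np.round(iso_rates, 3))
print("suggested MTD: dose level", doses[mtd_index])
```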
J. R. Newman
Huda Aldweby, Maslina Darus
We derive the Fekete-Szegö theorem for new subclasses of analytic functions that are q-analogues of well-known classes introduced previously.
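For context (standard background, not the paper's new result): for a normalized analytic function on the unit disk, a Fekete-Szegö theorem gives a sharp bound on the functional below in terms of the parameter.

```latex
% Standard background: the Fekete-Szegö functional for a normalized analytic
% function f(z) = z + a_2 z^2 + a_3 z^3 + \cdots on the unit disk.
\[
  \left| a_3 - \mu\, a_2^{2} \right|, \qquad \mu \in \mathbb{R}
  \ (\text{or } \mu \in \mathbb{C}),
\]
% a Fekete-Szegö theorem for a given class of functions provides a sharp upper
% bound for this quantity as a function of \mu.
```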
P.D. Manrique, J.C. Beier, N.F. Johnson
New outbreaks of Zika in the U.S. are imminent. Human nature dictates that many individuals will continue to revisit affected ‘Ground Zero’ patches, whether out of choice, work or family reasons, yet this feature is missing from traditional epidemiological analyses. Here we show that this missing visit-revisit mechanism is by itself capable of explaining quantitatively the 2016 human Zika outbreaks in all three Ground Zero patches. Our findings reveal counterintuitive ways in which this human flow can be managed to tailor any future outbreak’s duration, severity and time-to-peak. Effective public health planning can leverage these results to impact the evolution of future outbreaks via soft control of the overall human flow, as well as to suggest best-practice visitation behavior for local residents.
LIAN Yueyong, ZHANG Chao, LI Qiang
In view of the defects of existing hardware- and software-based time synchronization methods for Local Area Networks (LAN), this paper introduces the principle of time synchronization in a LAN and the method of robust estimation. A model of time comparison is established. A new method for time synchronization in a LAN using robust estimation is proposed, and a clock-retaining strategy is used to reduce CPU overhead. The least squares adjustment method and the robust estimation method are each used to process the measurement data and to analyze the difference in accuracy. Experimental results show that the proposed synchronization method requires only a simple hardware configuration while achieving high synchronization precision, and that it effectively restrains the influence of network delay while maintaining good system stability.
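A toy illustration of why robust estimation helps here (not the paper's implementation): fitting a clock offset/drift line to measurements corrupted by occasional large network delays, comparing ordinary least squares with a Huber-loss fit. The data, noise levels, and estimator choice are assumptions.

```python
# Toy comparison: ordinary least squares vs. a robust (Huber) fit of a clock
# offset/drift line when some measurements contain large delay outliers.
import numpy as np
from sklearn.linear_model import LinearRegression, HuberRegressor

rng = np.random.default_rng(1)
t = np.linspace(0, 100, 200)                       # seconds since start
true_offset, true_drift = 0.5e-3, 2e-6             # 0.5 ms offset, 2 us/s drift
offset = true_offset + true_drift * t + rng.normal(0, 5e-6, t.size)

# Occasional delayed packets produce large one-sided outliers.
spikes = rng.choice(t.size, size=10, replace=False)
offset[spikes] += rng.uniform(1e-3, 5e-3, size=10)

X = t.reshape(-1, 1)
ols = LinearRegression().fit(X, offset)
robust = HuberRegressor().fit(X, offset)

print("true drift      :", true_drift)
print("OLS drift est.  :", ols.coef_[0])
print("Huber drift est.:", robust.coef_[0])        # typically much closer
```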
H. Bass, Jiang-Hua Lu, J. Oesterlé et al.
Page 27 of 176,047