We investigate whether the hidden states of large language models (LLMs) can be used to estimate and impute economic and financial statistics. Focusing on county-level (e.g. unemployment) and firm-level (e.g. total assets) variables, we show that a simple linear model trained on the hidden states of open-source LLMs outperforms the models' text outputs. This suggests that hidden states capture richer economic information than the responses of the LLMs reveal directly. A learning curve analysis indicates that only a few dozen labelled examples are sufficient for training. We also propose a transfer learning method that improves estimation accuracy without requiring any labelled data for the target variable. Finally, we demonstrate the practical utility of hidden-state representations in super-resolution and data imputation tasks.
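The linear model described above can be sketched as closed-form ridge regression on hidden states. Everything below is a synthetic stand-in: the hidden-state dimension, the number of labelled counties, the target, and the regularization strength are our assumptions, not the paper's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 40 labelled counties (a few dozen examples, as in
# the learning-curve analysis), each with a 64-dimensional hidden state;
# the target (e.g. unemployment) is a noisy linear function of the state.
n, d = 40, 64
H = rng.normal(size=(n, d))                 # hidden-state matrix
w_true = rng.normal(size=d)
y = H @ w_true + 0.1 * rng.normal(size=n)   # county-level target

# Simple linear probe: closed-form ridge regression on hidden states.
lam = 1.0
w = np.linalg.solve(H.T @ H + lam * np.eye(d), H.T @ y)

# Impute the target for unlabelled counties from their hidden states.
H_new = rng.normal(size=(5, d))
y_hat = H_new @ w
```

The closed-form solve keeps the sketch dependency-free; in practice any regularized linear regressor over the extracted hidden states plays the same role.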
This study analyzes the economic performance of several parks under different conditions, focusing on operational costs and power load balancing before and after the deployment of energy storage systems. First, the economic performance of the parks without energy storage was analyzed using a random forest model. Taking Park A as an example, cost was most strongly correlated with electricity purchases, followed by photovoltaic output, indicating that solar and wind power output are key factors affecting economic performance. Next, the operation of the parks after the configuration of a 50 kW/100 kWh energy storage system was simulated, and the total cost and operating strategy of the storage system were calculated. The results showed that after the deployment of energy storage, wind and solar power curtailment in each park decreased and operational costs fell. Finally, a genetic algorithm was used to optimize the energy storage configuration of each park, refining the storage operation strategy through fitness evaluation, crossover, and mutation. After optimization, the economic indicators of Parks A, B, and C all improved. The results indicate that by optimizing energy storage configuration, each park can reduce costs, enhance economic benefits, and support the sustainable development of the power system.
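The genetic-algorithm step can be sketched as follows. The cost model, its coefficients, and the search bounds are illustrative assumptions, not the study's actual model; only the structure (fitness-based selection, crossover, mutation over storage power/energy configurations) mirrors what the abstract describes.

```python
import random

random.seed(42)

# Hypothetical cost model: operating cost falls as storage grows (less
# wind/solar curtailment) while capital cost rises; coefficients are
# illustrative, not the study's actual parameters.
def total_cost(p_kw, e_kwh):
    capital = 0.8 * p_kw + 0.4 * e_kwh
    curtailment = 500.0 / (1.0 + 0.02 * p_kw) + 800.0 / (1.0 + 0.01 * e_kwh)
    return capital + curtailment

def evolve(pop_size=30, generations=60):
    # Individuals are (power kW, energy kWh) storage configurations.
    pop = [(random.uniform(10, 200), random.uniform(20, 400))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: total_cost(*ind))
        parents = pop[: pop_size // 2]              # fitness selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)   # crossover
            if random.random() < 0.3:                        # mutation
                child = (child[0] + random.gauss(0, 5),
                         child[1] + random.gauss(0, 10))
            children.append(child)
        pop = parents + children
    return min(pop, key=lambda ind: total_cost(*ind))

best = evolve()
```

Under this toy cost surface the GA should find a configuration cheaper than the fixed 50 kW/100 kWh baseline, which is the qualitative result the abstract reports.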
As Virtual Reality (VR) games become more popular, it is crucial to understand how deceptive game design patterns manifest and impact player experiences in this emerging medium. Our study sheds light on the presence and effects of manipulative design techniques in commercial VR games compared to a traditional computer game. We conducted an autoethnographic study and developed a VR Deceptive Game Design Assessment Guide based on a critical literature review. Using our guide, we compared the deceptive patterns in a popular computer game with those in two commercial VR titles. While VR's technological constraints, such as battery life and limited temporal manipulation, curbed some deceptive patterns, VR's unique sensory immersion amplified the impact of emotional and sensory deception. Current VR games showed similar but evolved forms of deceptive design compared to the computer game, and we forecast more sophisticated player manipulation as VR technology advances. Our findings contribute to a better understanding of how deceptive game design persists and escalates in VR, and we highlight the urgent need to develop ethical design guidelines for the rapidly advancing VR games industry.
Ruijun Hou, Samuel Baker, Stephanie von Hinke
et al.
We study the long-term health and human capital impacts of local economic conditions experienced during the first 1,000 days of life. We combine historical data on monthly unemployment rates in urban England and Wales over 1952-1967 with data from the UK Biobank on later-life outcomes. Leveraging variation in unemployment driven by national industry-specific shocks, weighted by each industry's importance in each area, we find no evidence that small, common fluctuations in local economic conditions during early life affect health or human capital in older age.
Global Navigation Satellite Systems (GNSS) are fundamental in ubiquitously providing position and time to a wide gamut of systems. Jamming remains a realistic threat in many deployment settings, civilian and tactical. In Unmanned Aerial Vehicles (UAVs) in particular, sustained denial raises safety-critical concerns. This work presents a strategy for the detection, localization, and classification, in both the frequency and time domains, of interference signals harmful to navigation. A high-performance Vertical Take-Off and Landing (VTOL) UAV with a single antenna and a commercial GNSS receiver is used to geolocate and characterize RF emitters at long range and to infer the navigation impairment. Raw IQ baseband snapshots from the GNSS receiver make spectral correlation methods applicable without an extra software-defined radio payload, paving the way to spectrum identification and monitoring on airborne platforms for RF situational awareness. Live testing at Jammertest, in Norway, with portable, commercially available GNSS multi-band jammers demonstrates the ability to detect, localize, and characterize harmful interference. Our system pinpointed the transmitter's position to within a few meters and mapped the extent of the affected area at long range, without entering the denied zone. Additionally, spectral content extraction based on spectral correlation techniques is used to accurately identify the jammer frequency, bandwidth, and modulation scheme.
Confining electrons or holes in quantum dots formed in the channel of industry-standard fully depleted silicon-on-insulator CMOS structures is a promising approach to scalable qubit architectures. In this article, we present a calibrated model of a commercial nanostructure built with the simulation tool Quantum TCAD, along with experimental verification of all model predictions. We demonstrate that quantum dots can be formed in the device channel by applying a combination of a common-mode voltage to the source and drain and a back-gate voltage, and that the number of quantum dots can be controlled and modified in this approach. We also report an effective detuning of the energy levels in the quantum dots by varying the barrier gate voltages. Given the importance of scaling to larger numbers of qubits, we demonstrate the feasibility of simulating and improving the design of quantum dot devices before their fabrication in a commercial process.
We suggest employing log-ergodic processes to model the velocity of money in an ergodic manner. Our approach sheds light on economic behavior, policy implications, and financial dynamics by maintaining long-term stability. By bridging theory and practice, the partially ergodic model helps analysts and policymakers understand and forecast the velocity of money. An empirical analysis using historical U.S. GDP and money supply data demonstrates the model's effectiveness in capturing the long-term stability of the velocity of money. Key findings indicate that the log-ergodic model offers superior predictive power compared to traditional models, making it a valuable tool for policymakers seeking to control economic factors in critical situations.
Jan E. Snellman, Rafael A. Barrio, Kimmo K. Kaski
et al.
Pandemics, in addition to affecting the health of populations, can have huge impacts on their social and economic behavior. These factors, in turn, can feed back into and influence the disease spreading. It is important to study these interrelations systematically, to determine which ones have significant effects and whether those effects are adverse or beneficial. Our recently developed epidemic model with agent-based and geographical elements is used here for that purpose. We perform an extensive parameter space exploration of the socio-economic part of the model, including factors such as the attitudes (called values) of the agents towards the disease spreading, health, the economic situation, and regulations by government agents. We search for prominent patterns in the resulting simulated data using basic classification tools, namely self-organizing maps and principal component analysis, seeking to isolate the value parameters of the population and government agents that most influence the speed and pattern of disease spreading. We monitor different quantities of the model output, such as infection rates, the propagation speed of the epidemic, economic activity, government regulations, and the compliance of the population. Of these, the quantities describing the epidemic spreading yielded the most distinctive clustering of the data, and they were selected as the basis of the remaining analysis. We relate the clusters found to three distinct types of disease spreading: wave-like, chaotic, and transitional spreading patterns. The value parameter contributing most to transitions between these phases was found to be the compliance of the population agents with the government regulations.
In the present study, for the first time, an effort-sharing approach based on the Inertia and Capability principles is proposed to assess the distribution of the European Union (EU27) carbon budget among the Member States, within the context of achieving the Green Deal objective and EU27 carbon neutrality by 2050. An in-depth analysis is carried out of the role of Economic Decoupling embedded in the Capability principle, to evaluate the correlation between the expected increase of economic production and the level of carbon intensity in the Member States. As decarbonization is a dynamic process, the study proposes a simple mathematical model as a policy tool to assess and redistribute the Member States' carbon budgets as frequently as necessary, to encourage progress or overcome the difficulties each Member State may face along its decarbonization pathway.
John J. Howard, Yevgeniy B. Sirotin, Jerry L. Tipton
et al.
Human face features can be used to determine individual identity as well as demographic information like gender and race. However, the extent to which black-box commercial face recognition algorithms (CFRAs) use gender and race features to determine identity is poorly understood despite increasing deployments by government and industry. In this study, we quantified the degree to which gender and race features influenced face recognition similarity scores between different people, i.e., non-mated scores. We ran this study using five different CFRAs and a sample of 333 diverse test subjects. As a control, we compared the behavior of these non-mated distributions to a commercial iris recognition algorithm (CIRA). Confirming prior work, all CFRAs produced higher similarity scores for people of the same gender and race, an effect known as "broad homogeneity". No such effect was observed for the CIRA. Next, we applied principal components analysis (PCA) to similarity score matrices. We show that some principal components (PCs) of CFRAs cluster people by gender and race, but the majority do not. Demographic clustering in the PCs accounted for only 10% of the total CFRA score variance. No clustering was observed for the CIRA. This demonstrates that, although CFRAs use some gender and race features to establish identity, most features utilized by current CFRAs are unrelated to gender and race, similar to the iris texture patterns utilized by the CIRA. Finally, reconstruction of similarity score matrices using only PCs that showed no demographic clustering reduced broad homogeneity effects, but also decreased the separation between mated and non-mated scores. This suggests it is possible for CFRAs to operate on features unrelated to gender and race, albeit with somewhat lower recognition accuracy, but that this is not the current commercial practice.
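The PCA-based reconstruction step can be sketched on synthetic data. The similarity matrix, group labels, and noise scale below are fabricated for illustration; only the mechanism (drop the demographically clustered component, rebuild the score matrix, and check that the same-group score premium shrinks) follows the analysis the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical non-mated similarity matrix for 60 subjects in two
# demographic groups: same-group pairs score higher on average
# ("broad homogeneity"), plus idiosyncratic noise.
n = 60
group = np.repeat([0, 1], n // 2)
same = (group[:, None] == group[None, :]).astype(float)
S = 0.5 * same + rng.normal(scale=0.2, size=(n, n))
S = (S + S.T) / 2                         # symmetrize

# PCA of the centered similarity matrix via eigendecomposition.
Sc = S - S.mean(axis=0, keepdims=True)
vals, vecs = np.linalg.eigh(Sc.T @ Sc)
order = np.argsort(vals)[::-1]
vecs = vecs[:, order]

# The leading PC aligns with the group split; reconstruct the matrix
# without it to suppress the demographic signal.
keep = vecs[:, 1:]                        # drop the top component
S_rec = Sc @ keep @ keep.T + S.mean(axis=0, keepdims=True)

def homogeneity(M):
    # Mean same-group score minus mean cross-group score.
    return M[same == 1].mean() - M[same == 0].mean()
```

On this toy matrix the homogeneity gap collapses after the demographically clustered component is removed, mirroring the reduced broad homogeneity the abstract reports.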
There has been considerable public debate about whether the economic impact of the current COVID-19 restrictions is worth the cost. Although the potential impact of COVID-19 has been modelled extensively, very few numbers have been presented in discussions of the potential economic impacts. A good answer to the question - will the restrictions cause as much harm as COVID-19? - requires credible, evidence-based estimates rather than rhetoric. Here we provide preliminary estimates comparing the impact of the current restrictions against the direct impact of the virus. Since most countries are currently taking an approach that reduces the number of COVID-19 deaths, our estimates of deaths from COVID-19 are deliberately taken from the low end of estimates of the infection fatality rate, while estimates of deaths from an economic recession are deliberately computed from double the high end of the confidence interval for severe economic recessions. This ensures an adequate challenge to the status quo of the current restrictions. Our analysis shows that strict restrictions to eradicate the virus are likely to lead to at least eight times fewer total deaths than an immediate return-to-work scenario.
Katherine A. Keith, Christoph Teichmann, Brendan O'Connor
et al.
Methods and applications are inextricably linked in science, and in particular in the domain of text-as-data. In this paper, we examine one such text-as-data application, an established economic index that measures economic policy uncertainty from keyword occurrences in news. This index, which is shown to correlate with firm investment, employment, and excess market returns, has had substantive impact in both the private sector and academia. Yet, as we revisit and extend the original authors' annotations and text measurements we find interesting text-as-data methodological research questions: (1) Are annotator disagreements a reflection of ambiguity in language? (2) Do alternative text measurements correlate with one another and with measures of external predictive validity? We find for this application (1) some annotator disagreements of economic policy uncertainty can be attributed to ambiguity in language, and (2) switching measurements from keyword-matching to supervised machine learning classifiers results in low correlation, a concerning implication for the validity of the index.
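The keyword-matching measurement that the index is built on can be sketched as a dictionary-intersection check. The term sets below are illustrative stand-ins, not the index's actual dictionaries: an article counts towards the index if it contains at least one term from each category.

```python
# Illustrative term sets (assumptions, not the index's real dictionaries):
# an article flags economic policy uncertainty if it matches all three.
ECON = {"economy", "economic"}
POLICY = {"congress", "deficit", "regulation", "legislation"}
UNCERT = {"uncertain", "uncertainty"}

def flags_uncertainty(text: str) -> bool:
    """Keyword-matching measurement: require a hit in every category."""
    tokens = set(text.lower().split())
    return bool(tokens & ECON) and bool(tokens & POLICY) and bool(tokens & UNCERT)
```

The paper's methodological point is that replacing this kind of rule with a supervised classifier changes the resulting measurements substantially; the sketch only fixes ideas about what the keyword baseline does.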
Maximilian Beikirch, Simon Cramer, Martin Frank
et al.
We study the qualitative and quantitative appearance of stylized facts in several agent-based computational economic market (ABCEM) models. We perform our simulations with the SABCEMM (Simulator for Agent-Based Computational Economic Market Models) tool recently introduced by the authors (Trimborn et al. 2019). Furthermore, we present novel ABCEM models created by recombining existing models and study them with respect to stylized facts as well. This can be efficiently performed by the SABCEMM tool thanks to its object-oriented software design. The code is available on GitHub (Trimborn et al. 2018), such that all results can be reproduced by the reader.
Bangladesh was the second fastest-growing economy in the world in 2016, with 7.1% GDP growth. This study undertakes an econometric analysis to examine the relationship between population growth and economic development. The results indicate that population growth is adversely related to per capita GDP growth, meaning that rapid population growth is a real obstacle to the development of Bangladesh.
Maria Letizia Bertotti, Amit K Chattopadhyay, Giovanni Modanese
In this article, we discuss a dynamical stochastic model that represents the time evolution of income distribution of a population, where the dynamics develop from an interplay of multiple economic exchanges in the presence of multiplicative noise. The model remit stretches beyond the conventional framework of a Langevin-type kinetic equation in that our model dynamics is self-consistently constrained by dynamical conservation laws emerging from population and wealth conservation. This model is numerically solved and analyzed to interpret the inequality of income as a function of relevant dynamical parameters like the {\it mobility} $M$ and the {\it total income} $\mu$. In our model, inequality is quantified by the {\it Gini index} $G$. In particular, correlations between any two of the mobility index $M$ and/or the total income $\mu$ with the Gini index $G$ are investigated and compared with the analogous correlations resulting from an equivalent additive noise model. Our findings highlight the importance of a multiplicative noise based economic modeling structure in the analysis of inequality. The model also depicts the nature of correlation between mobility and total income of a population from the perspective of inequality measure.
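The Gini index used to quantify inequality above has a simple closed form over sorted samples. The comparison below uses arbitrary synthetic distributions (log-normal for multiplicative noise, near-normal for additive noise) chosen only to illustrate why multiplicative noise matters for inequality; it is not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(7)

def gini(x):
    """Gini index G from the sorted cumulative sum:
    G = (n + 1 - 2 * sum_i(cum_i / cum_n)) / n."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    cum = np.cumsum(x)
    return (n + 1 - 2 * (cum / cum[-1]).sum()) / n

# Illustrative income samples: multiplicative (log-normal) noise yields
# a heavier tail and higher inequality than additive (normal) noise of
# comparable scale; parameters are arbitrary.
multiplicative = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)
additive = np.abs(rng.normal(loc=1.0, scale=0.3, size=100_000))
```

For a log-normal with $\sigma = 1$ the theoretical Gini is $2\Phi(\sigma/\sqrt{2}) - 1 \approx 0.52$, noticeably above the additive case, which is the qualitative contrast the abstract draws.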
A generalization of the economic model of natural growth that takes into account a power-law memory effect is suggested. The memory effect means that the process depends not only on its current state, but also on the history of its changes in the past. For the mathematical description of an economic process with power-law memory, we use the theory of derivatives of non-integer order and fractional-order differential equations. We propose equations that take into account memory effects with one-parameter power-law damping, and we suggest solutions of these fractional differential equations. We prove that the growth and downturn of output depend on the memory effects: memory can lead to a decrease of output where the memoryless model predicts growth, and to an increase of output where the memoryless model predicts decline.
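A minimal way to write such a memory generalization, consistent with the fractional-calculus framework the abstract names (the specific operator choice and symbols are our illustration, not necessarily the paper's exact formulation):

```latex
% Natural growth with power-law memory via a Caputo fractional
% derivative of order 0 < \alpha \le 1 (\alpha = 1 recovers the
% memoryless exponential model Y(t) = Y(0) e^{\lambda t}):
\[
  (D^{\alpha}_{C} Y)(t) = \lambda\, Y(t), \qquad
  Y(t) = Y(0)\, E_{\alpha}(\lambda t^{\alpha}),
\]
% where E_\alpha is the one-parameter Mittag-Leffler function
% E_\alpha(z) = \sum_{k=0}^{\infty} z^k / \Gamma(\alpha k + 1).
```

The Mittag-Leffler solution interpolates between power-law and exponential behavior as $\alpha$ varies, which is how memory can qualitatively change growth into decline (or vice versa) relative to the memoryless model.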
In this article, we briefly review the different aspects and applications of kinetic exchange models in economics and sociology. Our main aim is to show in what manner the kinetic exchange models for closed economic systems were inspired by the kinetic theory of gas molecules. The simple yet powerful framework of kinetic theory, first proposed in 1738, led to the successful development of the statistical physics of gases towards the end of the 19th century. This framework was successfully adapted to the modeling of wealth distributions in the early 2000s. It was later applied to other areas, such as firm dynamics and opinion formation in society, as well. We have tried to present the flavour of the various models proposed and their applications, intentionally leaving out the intricate mathematical and technical details.
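The gas-molecule analogy at the heart of these models can be shown in a few lines: agents trade like colliding particles, total wealth plays the role of conserved energy, and the stationary distribution is exponential (Boltzmann-Gibbs-like). This is a generic minimal kinetic exchange sketch, not any one model from the review; the population size and step count are arbitrary.

```python
import random

random.seed(3)

# Minimal kinetic exchange model: in each "collision" a random pair of
# agents pools its wealth and resplits it at a uniformly random
# fraction. Total wealth is conserved, like energy in a gas.
N, steps = 1000, 200_000
wealth = [1.0] * N

for _ in range(steps):
    i, j = random.randrange(N), random.randrange(N)
    if i == j:
        continue
    eps = random.random()                 # random split fraction
    pool = wealth[i] + wealth[j]
    wealth[i], wealth[j] = eps * pool, (1 - eps) * pool
```

After enough trades the wealth distribution relaxes to an exponential with the mean fixed by conservation, the closed-economy analogue of the Boltzmann-Gibbs energy distribution the review traces back to kinetic theory.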
In the aftermath of the global financial crisis, much attention has been paid to investigating the appropriateness of current default risk modeling practice in the banking, finance, and insurance industries. An empirical study by Guo et al. (2008) shows that the time difference between the economic and recorded default dates has a significant impact on recovery rate estimates, and Guo et al. (2011) develop a theoretical structural firm asset value model for a firm default process that embeds the distinction between these two default times. To be more consistent with practice, in this paper we assume that market participants cannot observe the firm asset value directly and develop a reduced-form model to characterize the economic and recorded default times. We derive the probability distribution of these two default times. A numerical study of the difference between them shows that our proposed model can both capture the features of, and fit, the empirical data.
Dion Harmon, Marcus A. M. de Aguiar, David D. Chinellato
et al.
Predicting panic is of critical importance in many areas of human and animal behavior, notably in the context of economics; the recent financial crisis is a case in point. Panic may be due to a specific external threat or to self-generated nervousness. Here we show that the recent economic crisis and earlier large single-day panics were preceded by extended periods of high levels of market mimicry: direct evidence of uncertainty and nervousness, and of the comparatively weak influence of external news. High levels of mimicry can be a quite general indicator of the potential for self-organized crises.