City council meetings are vital sites for civic participation where the public can speak directly to their local government. By addressing city officials and calling on them to take action, public commenters can potentially influence policy decisions spanning a broad range of concerns, from housing, to sustainability, to social justice. Yet studies of these meetings have often been limited by the availability of large-scale, geographically diverse data. Relying on local governments' increasing use of YouTube and other technologies to archive their public meetings, we propose a framework that characterizes comments along two dimensions: the local concerns raised (e.g., housing, election administration) and the societal concerns invoked (e.g., functional democracy, anti-racism). Based on a large record of public comments we collect from 15 cities in Michigan, we produce data-driven taxonomies of the local concerns and societal concerns that these comments cover, and employ machine learning methods to scalably apply our taxonomies across the entire dataset. We then demonstrate how our framework allows us to examine the salient local concerns and societal concerns that arise in our data, as well as how these aspects interact.
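The abstract does not specify which machine learning methods are used to apply the taxonomies at scale. The sketch below is only an illustration of the general workflow (train a classifier on a small hand-labeled seed set, then label the full corpus); the seed comments, labels, and category names are hypothetical, and the authors' actual models may differ.

```python
# Illustrative sketch (not the authors' pipeline): train a simple text
# classifier on a small hand-labeled seed set of public comments, then
# apply the learned taxonomy labels to the full comment corpus.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical seed set: comments labeled with a local-concern category.
seed_comments = [
    "Rents downtown have doubled and seniors are being priced out.",
    "Please add protected bike lanes on Main Street.",
    "Absentee ballot drop boxes should stay open until 8 pm.",
    "The new apartment complex will worsen parking on our block.",
]
seed_labels = ["housing", "transportation", "election administration", "housing"]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                    LogisticRegression(max_iter=1000))
clf.fit(seed_comments, seed_labels)

# Scalable application: label every transcribed comment in the corpus.
corpus = ["Our polling place had a two-hour line last November."]
print(clf.predict(corpus))  # prints the predicted taxonomy label for each comment
```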
Yiran Wang, Alicia E. Boyd, Lillian Rountree
et al.
Data-driven decisions shape public health policies and practice, yet persistent disparities in data representation skew insights and undermine interventions. To address this, we advance a structured roadmap that integrates public health data science with computer science and is grounded in reflexivity. We adopt data equity as a guiding concept: ensuring the fair and inclusive representation, collection, and use of data to prevent the introduction or exacerbation of systemic biases that could lead to invalid downstream inference and decisions. To underscore urgency, we present three public health cases where non-representative datasets and skewed knowledge impede decisions across diverse subgroups. These challenges echo themes in two literatures: public health highlights gaps in high-quality data for specific populations, while computer science and statistics contribute criteria and metrics for diagnosing bias in data and models. Building on these foundations, we propose a working definition of public health data equity and a structured self-audit framework. Our framework integrates core computational principles (fairness, accountability, transparency, ethics, privacy, confidentiality) with key public health considerations (selection bias, representativeness, generalizability, causality, information bias) to guide equitable practice across the data life cycle, from study design and data collection to measurement, analysis, interpretation, and translation. Embedding data equity in routine practice offers a practical path for ensuring that data-driven policies, artificial intelligence, and emerging technologies improve health outcomes for all. Finally, we emphasize the critical understanding that, although data equity is an essential first step, it does not inherently guarantee information, learning, or decision equity.
Networked urban systems facilitate the flow of people, resources, and services, and are essential for economic and social interactions. These systems often involve complex processes with unknown governing rules, observed through sensor-based time series. To aid decision-making in industrial and engineering contexts, data-driven predictive models are used to forecast the spatiotemporal dynamics of urban systems. Current models such as graph neural networks have shown promise but face a trade-off between efficacy and efficiency due to their computational demands, so applying them to large-scale networks still requires further effort. This paper addresses this trade-off by drawing inspiration from physical laws to inform essential model designs that align with fundamental principles and avoid architectural redundancy. By understanding both micro- and macro-processes, we present a principled, interpretable neural diffusion scheme based on Transformer-like structures whose attention layers are induced by low-dimensional embeddings. The proposed scalable spatiotemporal Transformer (ScaleSTF), with linear complexity, is validated on large-scale urban systems including traffic flow, solar power, and smart meters, showing state-of-the-art performance and remarkable scalability. Our results offer a fresh perspective on dynamics prediction in large-scale urban networks.
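The abstract only states that the attention layers are induced by low-dimensional embeddings and have linear complexity; the exact ScaleSTF architecture is not reproduced here. As an illustration of how linear (rather than quadratic) cost in the number of nodes can be obtained, the sketch below shows a generic linear-attention layer in PyTorch whose queries and keys come from low-dimensional node embeddings. All module names and dimensions are assumptions for the example.

```python
# Illustrative only: a generic linear-attention layer whose queries/keys are
# produced from low-dimensional node embeddings, giving O(N) cost in the
# number of nodes N instead of the O(N^2) cost of standard attention.
import torch
import torch.nn as nn


class LowRankLinearAttention(nn.Module):
    def __init__(self, node_dim: int, embed_dim: int, model_dim: int):
        super().__init__()
        self.q = nn.Linear(embed_dim, model_dim)   # queries from node embeddings
        self.k = nn.Linear(embed_dim, model_dim)   # keys from node embeddings
        self.v = nn.Linear(node_dim, model_dim)    # values from node features

    def forward(self, x: torch.Tensor, emb: torch.Tensor) -> torch.Tensor:
        # x:   (batch, N, node_dim)   sensor readings per node
        # emb: (N, embed_dim)         learned low-dimensional node embeddings
        q = torch.relu(self.q(emb)) + 1e-6          # positive feature maps
        k = torch.relu(self.k(emb)) + 1e-6
        v = self.v(x)                                # (batch, N, model_dim)
        # Aggregate values through the keys first, so the cost stays linear in N.
        kv = torch.einsum("nd,bne->bde", k, v)       # (batch, model_dim, model_dim)
        z = 1.0 / (q @ k.sum(dim=0))                 # per-node normalisation, shape (N,)
        return torch.einsum("nd,bde,n->bne", q, kv, z)


layer = LowRankLinearAttention(node_dim=8, embed_dim=16, model_dim=32)
readings = torch.randn(4, 1000, 8)        # 4 samples, 1000 nodes, 8 features each
embeddings = torch.randn(1000, 16)
print(layer(readings, embeddings).shape)  # torch.Size([4, 1000, 32])
```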
Electric vehicle (EV) charging infrastructure is crucial for advancing EV adoption, managing charging loads, and ensuring equitable transportation electrification. However, there remains a notable gap in comprehensive accessibility metrics that integrate users' mobility. This study introduces a novel accessibility metric, termed Trajectory-Integrated Public EVCS Accessibility (TI-acs), and uses it to assess public electric vehicle charging station (EVCS) accessibility for approximately 6 million residents in the San Francisco Bay Area based on detailed individual trajectory data over one week. Unlike conventional home-based metrics, TI-acs incorporates the accessibility of EVCS along individuals' travel trajectories, offering insight into a wider range of public charging contexts, including charging near workplaces and charging during grid off-peak periods. As of June 2024, given the current public EVCS network, Bay Area residents have, on average, 7.5 hours and 5.2 hours of access per day during which their stay locations are within 1 km (i.e., a 10-12 minute walk) of a public L2 and DCFC charging port, respectively. Over the past decade, TI-acs has steadily increased with the rapid expansion of the EV market and charging infrastructure. However, spatial disparities remain significant, as reflected in Gini indices of 0.38 (L2) and 0.44 (DCFC) across census tracts. Additionally, our analysis reveals racial disparities in TI-acs, driven not only by variations in charging infrastructure near residential areas but also by differences in residents' mobility patterns.
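The abstract defines TI-acs as hours per day during which a person's stay locations fall within 1 km of a public charging port and summarizes disparities with a Gini index; implementation details are not given. The sketch below illustrates that general idea on hypothetical stay records (the 1 km threshold comes from the abstract; everything else, including the data, is assumed).

```python
# Illustrative sketch of a trajectory-integrated accessibility measure:
# sum the dwell time of stay locations that lie within 1 km of any charger,
# then summarize disparities across groups with a Gini index.
import numpy as np

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres (works with scalar or array inputs)."""
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = np.sin((lat2 - lat1) / 2) ** 2 + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * np.arcsin(np.sqrt(a))

def daily_access_hours(stays, chargers, radius_km=1.0):
    """stays: list of (lat, lon, dwell_hours); chargers: array of (lat, lon) rows."""
    hours = 0.0
    for lat, lon, dwell in stays:
        d = haversine_km(lat, lon, chargers[:, 0], chargers[:, 1])
        if np.any(d <= radius_km):
            hours += dwell
    return hours

def gini(values):
    """Gini index of a non-negative array (standard sorted-rank formula)."""
    v = np.sort(np.asarray(values, dtype=float))
    n = v.size
    return 2 * np.sum(np.arange(1, n + 1) * v) / (n * v.sum()) - (n + 1) / n

# Hypothetical data: one resident's stays for a day and two nearby L2 ports.
stays = [(37.77, -122.42, 9.0), (37.79, -122.40, 8.0), (37.33, -121.89, 2.0)]
ports = np.array([[37.772, -122.424], [37.791, -122.402]])
print(daily_access_hours(stays, ports))   # access hours for this day
print(gini([7.5, 5.2, 1.0, 0.3]))         # disparity across hypothetical tracts
```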
This paper examines public goods and evaluates allocation mechanisms through the lens of game theory. Public goods are characterized by non-excludability and non-rivalry, which creates fundamental challenges for allocation. We analyze why competitive markets undersupply public goods, deriving the inefficiency formally through Nash equilibrium. The paper evaluates theoretical solutions, including Lindahl pricing, Clarke-Groves mechanisms, and voting schemes, and covers their efficiency properties and practical limitations. We show how strategic interaction leads to free-riding behavior using a roommates' dilemma and other examples, and explain why a large household lives in messy conditions not because its members are lazy, but because they are rational players in a Nash equilibrium. We also examine voting mechanisms, the median voter theorem, and recent developments in truth-revealing mechanisms.
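The abstract says the undersupply result is derived formally through Nash equilibrium but does not reproduce the derivation. A standard textbook version of the argument, with n identical agents voluntarily contributing to a public good under quasilinear utility, runs as follows; this is an illustration of the logic, not necessarily the paper's exact setup.

```latex
% Illustration: private provision vs. the Samuelson condition.
% n identical agents, contributions g_i, total provision G = \sum_j g_j.
\[
  u_i(g_i, G) \;=\; v(G) - g_i, \qquad G = \sum_{j=1}^{n} g_j,
  \qquad v' > 0,\; v'' < 0 .
\]
% Best response: each agent treats the others' contributions as fixed, so
\[
  \frac{\partial u_i}{\partial g_i} \;=\; v'(G) - 1 \;=\; 0
  \quad\Longrightarrow\quad v'\!\left(G^{\mathrm{NE}}\right) = 1 .
\]
% Efficiency (the Samuelson condition) sums marginal benefits over all agents:
\[
  n\, v'\!\left(G^{*}\right) = 1
  \quad\Longrightarrow\quad
  v'\!\left(G^{*}\right) = \tfrac{1}{n} \;<\; 1 \;=\; v'\!\left(G^{\mathrm{NE}}\right).
\]
% Since v' is decreasing, G^{NE} < G^{*}: voluntary (market) provision
% undersupplies the public good, and the gap grows with n (free-riding).
```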
The most effective differentially private machine learning algorithms in practice rely on an additional source of purportedly public data. This paradigm is most interesting when the two sources combine to be more than the sum of their parts. However, there are settings such as mean estimation where we have strong lower bounds showing that, when the two data sources have the same distribution, there is no complementary value in combining them. In this work we extend the known lower bounds for public-private learning to the setting where the two data sources exhibit significant distribution shift. Our results apply both to Gaussian mean estimation, where the two distributions have different means, and to Gaussian linear regression, where the two distributions exhibit parameter shift. We find that when the shift is small (relative to the desired accuracy), either public or private data must be sufficiently abundant to estimate the private parameter. Conversely, when the shift is large, public data provides no benefit.
Lindahl equilibrium is a solution concept for allocating a fixed budget across several divisible public goods. It always lies in the weak core, meaning that the equilibrium allocation satisfies desirable stability and proportional fairness properties. We consider a model where agents have separable linear utility functions over the public goods, and the output assigns to each good an amount of spending, summing to at most the available budget. In the uncapped setting, each of the public goods can absorb any amount of funding. In this case, it is known that Lindahl equilibrium is equivalent to maximizing Nash social welfare, and this allocation can be computed by a public-goods variant of the proportional response dynamics. We introduce a new convex programming formulation for computing this solution and show that it is related to Nash welfare maximization through double duality and reformulation. We then show that the proportional response dynamics is equivalent to running mirror descent on our new formulation. Our new formulation has similarities to Shmyrev's convex program for Fisher market equilibrium. In the capped setting, each public good has an upper bound on the amount of funding it can receive, which is a type of constraint that appears in fractional committee selection and participatory budgeting. In this setting, existence of Lindahl equilibrium was only known via fixed-point arguments. The existence of an efficient algorithm computing one has been a long-standing open question. We prove that our new convex program continues to work when the cap constraints are added, and its optimal solutions are Lindahl equilibria. Thus, we establish that approximate Lindahl equilibrium can be efficiently computed. Our result also implies that approximately core-stable allocations can be efficiently computed for the class of separable piecewise-linear concave (SPLC) utilities.
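For the uncapped setting, the abstract states that the Lindahl/Nash-welfare solution can be computed by a public-goods variant of proportional response dynamics. The sketch below simulates one commonly cited form of that update, in which each agent re-splits its budget across goods in proportion to the utility it currently derives from each good; the paper's exact dynamics, step analysis, and convergence guarantees may differ, and the utilities and budgets here are synthetic.

```python
# Illustrative sketch of proportional-response-style dynamics for divisible
# public goods with separable linear utilities (uncapped setting).
import numpy as np

rng = np.random.default_rng(0)
n_agents, n_goods = 5, 3
U = rng.uniform(0.1, 1.0, size=(n_agents, n_goods))  # U[i, j]: agent i's value per unit of good j
budgets = np.ones(n_agents)                           # each agent controls an equal budget share

# b[i, j]: how much of agent i's budget is (notionally) assigned to good j.
b = np.outer(budgets, np.ones(n_goods)) / n_goods

for _ in range(2000):
    y = b.sum(axis=0)                      # total funding of each good
    derived = U * y                        # utility agent i currently derives from good j
    derived /= derived.sum(axis=1, keepdims=True)
    b = budgets[:, None] * derived         # re-split each budget proportionally

print("funding per good:", b.sum(axis=0))  # candidate Nash-welfare / Lindahl allocation
```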
Nataliya A. Balabanova, Manh Hong Duong, The Anh Han
Understanding the emergence and stability of cooperation in public goods games is important due to its applications in fields such as biology, economics, and social science. However, a gap remains in comprehending how mutations, both additive and multiplicative, as well as institutional incentives, influence these dynamics. In this paper, we study the replicator-mutator dynamics, with combined additive and multiplicative mutations, for public goods games in both the absence and presence of institutional incentives. For each model, we identify the possible number of (stable) equilibria, demonstrate their attainability, and analyse their stability properties. We also characterise the dependence of these equilibria on the model's parameters via bifurcation analysis and asymptotic behaviour. Our results offer rigorous and quantitative insights into the role of institutional incentives and the effect of combined additive and multiplicative mutations on the evolution of cooperation in public goods games.
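The exact functional forms of the combined mutations and of the institutional incentives are defined in the paper and not reproduced in the abstract. The sketch below only conveys the kind of dynamics studied: a two-strategy replicator equation for a linear public goods game with an additive (payoff-independent) mutation term, a multiplicative (reproduction-coupled) mutation rate, and an optional reward for cooperators. All functional forms and parameter values here are generic assumptions, not the paper's model.

```python
# Illustrative sketch: replicator-mutator dynamics for cooperation (x) vs.
# defection (1 - x) in a linear public goods game, with an additive mutation
# term, a multiplicative (reproduction-coupled) mutation rate, and an
# institutional reward for cooperators. Forms are generic, not the paper's.
import numpy as np

def payoffs(x, r=3.0, c=1.0, N=5, reward=0.0):
    # Expected payoffs in groups of size N sampled from a well-mixed population
    # with cooperator frequency x (standard linear public goods game).
    pi_D = r * c * x * (N - 1) / N
    pi_C = pi_D + r * c / N - c + reward
    return pi_C, pi_D

def x_dot(x, q_add=0.01, q_mul=0.02, **kw):
    pi_C, pi_D = payoffs(x, **kw)
    avg = x * pi_C + (1 - x) * pi_D
    # Multiplicative mutation: a fraction q_mul of each strategy's offspring
    # switches to the other strategy during reproduction.
    repl = (1 - q_mul) * x * pi_C + q_mul * (1 - x) * pi_D - x * avg
    # Additive mutation: payoff-independent switching between strategies.
    return repl + q_add * (1 - 2 * x)

# Forward-Euler integration from several initial cooperator frequencies.
for x0 in (0.05, 0.5, 0.95):
    x = x0
    for _ in range(20000):
        x = np.clip(x + 0.001 * x_dot(x, reward=0.5), 0.0, 1.0)
    print(f"x0={x0:.2f} -> x*={x:.3f}")
```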
Daniel Lindenschmitt, Benedikt Veith, Khurshid Alam
et al.
The landscape of wireless communication systems is evolving rapidly, with a pivotal role envisioned for dynamic network structures and self-organizing networks in upcoming technologies such as the 6G mobile communications standard. This evolution is fueled by the growing demand from diverse sectors, including industry, manufacturing, agriculture, and the public sector, each with increasingly specific requirements. The establishment of non-public networks in the current 5G standard has laid a foundation, enabling independent operation within certain frequencies and local limitations, notably for Internet of Things applications. This paper explores the progression from non-public networks to nomadic non-public networks and their significance in the context of the forthcoming 6G era. Building on existing work in dynamic network structures, non-public network regulations, and alternative technological solutions, this paper introduces specific use cases enhanced by nomadic networks. In addition, relevant Key Performance Indicators are discussed on the basis of the presented use cases. These serve as a starting point for the definition of requirement clusters and thus for an evaluation metric for nomadic non-public networks. This work lays the groundwork for understanding the potential of nomadic non-public networks in the dynamic landscape of 6G wireless communication systems.
We initiate the study of locally differentially private (LDP) learning with public features. We define semi-feature LDP, where some features are publicly available while the remaining ones, along with the label, require protection under local differential privacy. Under semi-feature LDP, we demonstrate that the minimax convergence rate for non-parametric regression is significantly reduced compared to that of classical LDP. Then we propose HistOfTree, an estimator that fully leverages the information contained in both public and private features. Theoretically, HistOfTree attains the minimax optimal convergence rate. Empirically, HistOfTree achieves superior performance on both synthetic and real data. We also explore scenarios where users have the flexibility to select features for protection manually. In such cases, we propose an estimator and a data-driven parameter tuning strategy, leading to analogous theoretical and empirical results.
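The HistOfTree estimator itself is not described in the abstract. The toy sketch below only illustrates the general idea the semi-feature setting permits: partition users by the public feature (which needs no noise) and average labels that each user perturbs locally with the Laplace mechanism before sending. It is not the authors' estimator, and the data, bin count, and privacy budget are assumptions.

```python
# Toy sketch of regression with a public feature under local differential
# privacy: users bin themselves by the public feature, add Laplace noise to
# their (bounded) label locally, and the server averages noisy labels per bin.
import numpy as np

rng = np.random.default_rng(1)
n, eps = 20000, 1.0
x_pub = rng.uniform(0, 1, n)                   # public feature, released without noise
y = np.clip(np.sin(2 * np.pi * x_pub) * 0.4 + 0.5 + rng.normal(0, 0.05, n), 0, 1)

# Each user releases their label with Laplace noise of scale sensitivity / eps
# (labels are bounded in [0, 1], so the sensitivity is 1).
y_private = y + rng.laplace(scale=1.0 / eps, size=n)

# Server side: histogram estimator partitioned on the public feature.
bins = np.linspace(0, 1, 21)
idx = np.digitize(x_pub, bins) - 1
estimate = np.array([y_private[idx == b].mean() for b in range(20)])

truth = np.array([y[idx == b].mean() for b in range(20)])
print("max abs error:", np.abs(estimate - truth).max())
```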
Introduction: The carbon emissions that cities produce drive the development of low-carbon cities (LCCs) and low-carbon city pilot (LCCP) policies. However, the lack of a comprehensive understanding of the impacts of LCCP policies on natural population growth hampers effective policy design and implementation, thus constraining sustainable development at the city level. Methodology: Extending existing papers, which focus on the relations between low-carbon pilot policies and industrial transformation or economic growth, this research applies several quasi-experimental methods [e.g., Propensity Score Matching-Difference in Differences (PSM-DID)] to investigate the impacts of low-carbon pilot policies on natural population growth, using data from 287 prefecture-level cities in China from 2003 to 2019. Results and Discussion: This research finds that low-carbon pilot policies positively influence low-carbon cities' natural population growth through (a) economic factors, (b) political factors, (c) technological factors, and (d) the living environment. This research establishes a framework for understanding the impact mechanisms of LCCP policies on natural population growth, investigating how industrial structure optimization, policy design and implementation in different regions, technological innovation, and urban green space theoretically affect natural population growth, and proposing characteristics of LCCP policies that governments should attend to. From a practical perspective, this research suggests several policy recommendations. Central and local governments are encouraged to prioritize industrial structure optimization and assess populations' dependence on cultivated land. Providing additional policy support to underdeveloped areas is crucial to promote the balance between economic and environmental development. Furthermore, establishing online public health platforms and urban green spaces is proposed to enhance the population's health and complement the implementation of LCCP policies. This study offers both theoretical and practical insights into the impacts of LCCP policies on natural population growth, and its findings contribute to designing and implementing LCCP policies in China and other developing countries at a similar development stage.
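The abstract names PSM-DID as the main identification strategy but not its specification. The sketch below shows the generic two-step logic on synthetic city-level data: match pilot cities to control cities on pre-treatment covariates via propensity scores, then estimate the policy effect with a difference-in-differences regression. Variable names, covariates, and the matching rule are assumptions for illustration, not the paper's specification.

```python
# Generic PSM-DID sketch on synthetic panel data: (1) match pilot cities to
# control cities on pre-treatment covariates via propensity scores, then
# (2) estimate the treatment effect with a difference-in-differences model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_cities = 300
cov = pd.DataFrame({
    "city": range(n_cities),
    "gdp": rng.normal(0, 1, n_cities),
    "urbanization": rng.normal(0, 1, n_cities),
})
cov["pilot"] = (rng.uniform(size=n_cities) < 1 / (1 + np.exp(-cov.gdp))).astype(int)

# Step 1: propensity scores and 1-nearest-neighbour matching of controls to pilots.
ps_model = LogisticRegression().fit(cov[["gdp", "urbanization"]], cov.pilot)
cov["ps"] = ps_model.predict_proba(cov[["gdp", "urbanization"]])[:, 1]
pilots, controls = cov[cov.pilot == 1], cov[cov.pilot == 0]
matched = [controls.iloc[(controls.ps - p).abs().argmin()].city for p in pilots.ps]
sample = cov[cov.city.isin(set(pilots.city) | set(matched))]

# Step 2: build a two-period panel and run DID: growth ~ pilot * post.
panel = sample.loc[sample.index.repeat(2)].reset_index(drop=True)
panel["post"] = np.tile([0, 1], len(sample))
true_effect = 0.3
panel["growth"] = (0.5 * panel.gdp + true_effect * panel.pilot * panel.post
                   + rng.normal(0, 0.5, len(panel)))
did = smf.ols("growth ~ pilot * post", data=panel).fit()
print(did.params["pilot:post"])  # DID estimate of the pilot-policy effect
```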
Public finances are one of the fundamental mechanisms of economic governance, referring to the financial activities and decisions made by government entities to fund public services, projects, and operations. In today's globalized landscape, even subtle shifts in one nation's public debt can have significant impacts on international finances, necessitating a nuanced understanding of the correlations between international and national markets to help investors make informed investment decisions. Leveraging the capabilities of artificial intelligence, this study uses neural networks to model the correlations between US and international public finances and to predict changes in international public finances from changes in US public finances. The neural network model achieves a Mean Squared Error (MSE) of 2.79, affirming a discernible correlation and allowing the effect of US market volatility on international markets to be plotted. To further test the accuracy and significance of the model, an economic analysis was conducted to relate the changes predicted by the model to historical stock market changes. This model demonstrates significant potential for investors to predict changes in international public finances based on signals from US markets, marking a significant stride in comprehending the intricacies of global public finances and the role of artificial intelligence in decoding their multifaceted patterns for practical forecasting.
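The abstract does not specify the network architecture or input features. The sketch below is a minimal illustration of the stated setup, a small feed-forward regressor mapping changes in US public-finance indicators to the change in an international indicator; the data, feature names, and architecture are all synthetic assumptions.

```python
# Minimal illustration on synthetic data: a small feed-forward network that
# maps changes in US public-finance indicators to the change in an
# international public-finance indicator, evaluated with MSE.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X_us = rng.normal(size=(500, 4))          # e.g. changes in US debt, deficit, yields, spending
y_intl = 0.6 * X_us[:, 0] - 0.3 * X_us[:, 2] + rng.normal(0, 0.2, 500)

X_train, X_test, y_train, y_test = train_test_split(X_us, y_intl, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
model.fit(X_train, y_train)
print("test MSE:", mean_squared_error(y_test, model.predict(X_test)))
```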
Abdelghani Maddi, Emmanuel Monneau, Catherine Gaspare
et al.
The Streetlight Effect is an observation bias that occurs when individuals search for something only where it is easiest to look. Despite the significant development of Post-Publication Peer Review (PPPR) in recent years, facilitated in part by platforms such as PubPeer, the existing literature has not examined whether PPPR is affected by this type of bias, that is, whether PPPR mainly concerns publications to which researchers have direct access (e.g., to analyze image duplications). In this study, we compare the Open Access (OA) structures of publishers and journals among 51,882 publications commented on in PubPeer with those indexed in the OpenAlex database (156,700,177 publications). Our findings indicate that OA journals are 33% more prevalent in PubPeer than in the global total (52% for the most commented journals). This result can be attributed to disciplinary bias in PubPeer, with an overrepresentation of medical and biological research (which exhibits higher levels of openness). However, after normalization, the results reveal that PPPR does not exhibit a Streetlight Effect: within the same discipline, OA publications are on average 16% less prevalent in PubPeer than in the global total. These results suggest that the process of scientific self-correction operates independently of publication access status.
Annisa Indira Putri, Dwi Ananda Rizka Octavia, Mohammad Insan Romadhan
Intense competition in the era of the industrial revolution and the development of digital technology is a challenge for companies seeking an effective and targeted marketing strategy. Integrated Marketing Communication (IMC) is a marketing strategy that every company, including food and beverage (FnB) companies, needs to have. Aiola Eatery is one of the most successful FnB companies in Surabaya, and this success is accompanied by a variety of good marketing strategies. This study aims to find out how Aiola Eatery implemented IMC in its Halal bi Halal promo and to analyze its success. The research uses a qualitative method grounded in a constructivist paradigm. The study was conducted by two researchers, with data collected through in-depth interviews with Aiola Eatery's Marketing Communication staff as primary data, and promo designs and promo recap results as secondary data. The results show that Aiola Eatery uses four IMC strategies: public relations and publicity, digital marketing, sales promotion, and direct marketing. These strategies support Aiola Eatery's success in running promos, one of which is the Halal bi Halal promo, with targeted results. In addition, Aiola Eatery also uses IMC to engage customers both online, through social media, and in person when they dine in.
In cities around the world, locating public parking lots with vacant parking spots is a major problem, costing commuters time and adding to traffic congestion. This work illustrates how a dataset of geo-tagged images from a mobile phone camera can be used to navigate to the most convenient public parking lot in Johannesburg with an available parking space, detected by a neural-network-powered public camera. The images are used to fine-tune a Detectron2 model pre-trained on the ImageNet dataset to demonstrate detection and segmentation of vacant parking spots. We then add each parking lot's longitude and latitude coordinates to recommend the most convenient parking lot to the driver based on the Haversine distance and the number of available parking spots. Using the VGG Image Annotator (VIA), we annotate images from an expanding dataset with polygon outlines of four types of objects of interest: cars, open parking spots, people, and car number plates. We use the segmentation model to ensure number plates can be occluded in production for car registration anonymity. We obtain intersection-over-union scores of 89% on cars and 82% on parking spaces. This work has the potential to reduce the time commuters spend searching for free public parking, easing traffic congestion in and around shopping complexes and other public places, and maximizing people's utility with respect to driving on public roads.
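The abstract describes recommending the most convenient lot based on Haversine distance and the number of detected open spots, but does not give the scoring rule. The sketch below assumes a simple rule (nearest lot with at least one detected vacancy) purely for illustration; the lot names, coordinates, and counts are made up, and the vacancy counts would in practice come from the fine-tuned detection model.

```python
# Illustrative recommendation step (the detection itself is done by the
# fine-tuned Detectron2 model): pick the nearest lot, by Haversine distance,
# that currently has at least one vacant spot. The scoring rule is assumed.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat, dlon = p2 - p1, math.radians(lon2 - lon1)
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(a))

def recommend(driver, lots):
    """driver: (lat, lon); lots: list of dicts with lat, lon, vacant_spots."""
    candidates = [lot for lot in lots if lot["vacant_spots"] > 0]
    return min(candidates,
               key=lambda lot: haversine_km(driver[0], driver[1], lot["lat"], lot["lon"]),
               default=None)

# Hypothetical lots with vacancy counts reported by the camera model.
lots = [
    {"name": "lot A", "lat": -26.144, "lon": 28.041, "vacant_spots": 0},
    {"name": "lot B", "lat": -26.108, "lon": 28.057, "vacant_spots": 12},
    {"name": "lot C", "lat": -25.999, "lon": 28.107, "vacant_spots": 3},
]
choice = recommend((-26.12, 28.05), lots)
print(choice["name"] if choice else "no lot with vacancies nearby")
```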
Changes in the number of publications in a certain field might reflect the dynamics of scientific progress in that field, since an increase in the number of publications can be interpreted as an increase in field-specific knowledge. In this paper, we present a methodological approach to analyse the dynamics of science at lower aggregation levels, i.e., the level of research fields. Our trend analysis approach is able to uncover very recent trends, and the methods used to study the trends are simple for the intended recipients of the results to understand. In order to demonstrate the trend analysis approach, we focus in this study on the annual number of publications (and patents) in chemistry (and related areas) between 2014 and 2020, identifying those fields in chemistry with the highest dynamics (largest rates of change in publication counts). The study is based on the mono-disciplinary literature database CAplus. Our results reveal that the number of publications in the CAplus database has been increasing for many years. Research on optical phenomena and electrochemical technologies was found to be among the emerging topics in recent years.
Online public opinion usually spreads rapidly and widely, so a small incident can evolve into a large social crisis in a very short time and result in heavy reputational or economic losses. We propose a method to rate online public opinion crises based on a multi-level index system that evaluates the impact of events objectively. First, the dissemination mechanism of online public opinion is explained from the perspective of information ecology. According to this mechanism, evaluation indexes are selected through correlation analysis and principal component analysis. Then, a text emotion classification model is trained via deep learning to accurately quantify the emotional indexes in the index system. Finally, based on the multi-level evaluation index system and grey correlation analysis, we propose a method to rate the crisis level of online public opinion. An experiment on a real incident shows that this method can objectively evaluate the emotional tendency of Internet users and rate the crisis at different dissemination stages of online public opinion. It helps realize early warning of online public opinion crises and block their further spread in a timely manner.
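Grey correlation (relational) analysis is named as the aggregation step, but the paper's index system and weights are not reproduced in the abstract. The sketch below shows the standard grey relational coefficient and grade computation on hypothetical normalized index values, which is the usual way such an index system is collapsed into a single rating; the index names and reference sequence are assumptions.

```python
# Standard grey relational analysis on illustrative data: compare each event's
# normalized index vector against a reference (maximal-crisis) sequence and
# aggregate the relational coefficients into a single grade per event.
import numpy as np

def grey_relational_grades(reference, series_matrix, rho=0.5):
    """series_matrix: one row of normalized index values per rated event."""
    delta = np.abs(series_matrix - reference)      # deviation sequences
    d_min, d_max = delta.min(), delta.max()        # global extrema over all events and indexes
    coeff = (d_min + rho * d_max) / (delta + rho * d_max)
    return coeff.mean(axis=1)                      # grey relational grade per event

# Hypothetical normalized indexes per event: spread speed, negative-emotion
# ratio, media coverage, participant scale.
reference = np.ones(4)                             # maximal-crisis reference sequence
events = np.array([
    [0.9, 0.8, 0.95, 0.85],   # a fast-spreading, highly negative incident
    [0.3, 0.2, 0.40, 0.10],   # a minor incident
])
print(grey_relational_grades(reference, events))   # higher grade -> closer to the crisis reference
```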
This study aims to analyze the school's collaboration with industry and the business community (DU/DI) in improving the quality of Vocational High School (SMK) graduates. The study applies a qualitative descriptive approach with a case study design at SMK Negeri 3 Jayapura. The researcher, as the key instrument of the research, conducted in-depth interviews with principals, teachers and public relations staff, students, school committees, supervisors, and representatives of industry and the business community (DU/DI). Participatory observation and document analysis were also utilized to support data gathering. The data were validated by extending the observation period, peer discussion, and triangulation, and analyzed qualitatively by means of data reduction, data display, and drawing conclusions/verification. The results indicate that the school's collaboration with industry and the business community (DU/DI) in improving the quality of SMK Negeri 3 Jayapura graduates is favorable but still needs improvement: the industry-based curriculum needs to be aligned with industry and the business community (DU/DI) from its early development, the presence of industry teachers has not yet touched the substance of industrial practice, and the absorption of graduates is still limited to contract-bound MoUs and to high-achieving graduates.