Machine learning has an increasing influence on business applications, with many solutions already implemented and many more being explored. Since the global financial crisis, risk management in banks has gained greater prominence, with a constant focus on how risks are detected, measured, reported and managed. Considerable research in academia and industry has focused on developments in banking and risk management and on the current and emerging challenges. This paper, through a review of the available literature, seeks to analyse and evaluate machine-learning techniques that have been researched in the context of banking risk management, and to identify areas or problems in risk management that have been inadequately explored and are potential areas for further research. The review shows that the application of machine learning to the management of banking risks such as credit risk, market risk, operational risk and liquidity risk has been explored; however, this work does not appear commensurate with the industry's current level of focus on both risk management and machine learning. Many areas of bank risk management remain that could benefit significantly from studies of how machine learning can be applied to address specific problems.
This paper develops a robust mathematical framework for Constant Function Market Makers (CFMMs) by transitioning from traditional token reserve analyses to a coordinate system defined by price and intrinsic liquidity. We establish a canonical parametrization of the bonding curve that ensures dimensional consistency across diverse trading functions, such as those employed by Uniswap and Balancer, and demonstrate that asset reserves and value functions exhibit a linear dependence on this intrinsic liquidity. This linear structure facilitates a streamlined approach to arbitrage-free pricing, delta hedging, and systematic risk management. By leveraging the Carr-Madan spanning formula, we characterize Impermanent Loss (IL) as a weighted strip of vanilla options, thereby defining a fine-grained implied volatility structure for liquidity profiles. Furthermore, we provide a path-dependent analysis of IL using the last-passage time. Empirical results from Uniswap v3 ETH/USDC pools and Deribit option markets confirm a volatility smile consistent with crypto-asset dynamics, validating the framework's utility in characterizing the risk-neutral fair value of liquidity provision.
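As a concrete reference for the spanning step, the Carr-Madan identity decomposes any twice-differentiable terminal payoff into a bond, a forward, and a strip of out-of-the-money options; the abstract applies it to the IL payoff (the IL-specific weights depend on the liquidity profile and are not reproduced here):

```latex
% Carr-Madan spanning of a twice-differentiable payoff f(S_T)
% about an arbitrary expansion point \kappa
f(S_T) = f(\kappa) + f'(\kappa)\,(S_T - \kappa)
       + \int_0^{\kappa} f''(K)\,(K - S_T)^{+}\,\mathrm{d}K
       + \int_{\kappa}^{\infty} f''(K)\,(S_T - K)^{+}\,\mathrm{d}K
```

Taking risk-neutral expectations prices the payoff as puts weighted by $f''(K)$ below the expansion point $\kappa$ and calls above it, which is what yields the strike-by-strike implied-volatility view of IL.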
Cyberattacks targeting the process industry have become increasingly prevalent in recent years. The ISA TR84.00.09 standard and the CCPS guidelines propose methodologies for conducting process risk assessments against cyberattacks on process facilities, such as attacks on the Basic Process Control System (BPCS) and the Safety Instrumented System (SIS), to ensure robust functional requirement management throughout the plant lifecycle. However, hazard identification and risk assessment techniques addressing process incidents triggered by cyberattacks remain largely unstandardized. Contemporary cybersecurity (CS) risk assessments predominantly focus on general Information Technology (IT) risks within business contexts. A notable contributing factor is the persistent misalignment between IT and Operational Technology (OT), including Process Safety (PS). OT professionals often regard CS as the responsibility of IT personnel, while IT teams typically lack familiarity with OT systems. Consequently, integrated IT-OT risk assessments are not widely implemented. This study explores an effective framework and methodology for conducting CS risk assessments specific to process incidents. The research utilizes a typical LNG plant model as the basis for a detailed CS risk assessment. The findings reveal several potential pathways for cyberattacks that could lead to major process incidents, underscoring the criticality of inherent safety measures and effective coordination between CS and PS disciplines. The CS risk assessment framework and procedural guidance detailed in this study are anticipated to significantly enhance the effectiveness of CS risk evaluations and the precise definition of functional requirements to mitigate cybersecurity risks.
Chemical engineering, Computer engineering. Computer hardware
Maksim S. Maramygin, Natalya B. Boldyreva, Lyudmila G. Reshetnikova
In the new economic reality, air quality protection behaviour of companies remains a burning issue shifting the emphasis towards saving human life and health. The article explores the relationship between the air quality protection behaviour of public companies and stock returns in different economic sectors. The research methodology is based on environmental management theory, financial management theory, and stakeholder theory. The analytical procedures performed on the data involved cross-sectoral economic analysis and econometric analysis. Empirical data are retrieved from the Federal State Statistics Service (Rosstat) and the Moscow Exchange and cover statistics on 45 public joint-stock companies (PAOs) from five sectors of the economy for the period of 2014–2022. Cross-sectoral economic analysis has shown that investors lack interest in PAOs’ air quality protection behaviour and invariably favour companies from the sectoral indices paying higher dividends compared to the Moscow Exchange, such as Chemicals and Petrochemicals, Metals and Mining, Oil and Gas. By integrating air quality protection factors into the Fama-French-Carhart model through econometric modelling, we estimated the dependence of returns of industry portfolios on these factors and classical risk premiums. The study demonstrates a positive impact of market premium and size premium on returns of industry portfolios. No statistically significant impact of air quality protection factors on stock returns was found. Our empirical findings confirm that businesses are poorly motivated to take air environment protection measures and air pollution-reducing behaviour should be encouraged at the state level.
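The factor-augmented regression described above can be sketched in a few lines. This is an illustrative fit on synthetic data, not the study's Rosstat/Moscow Exchange sample; the factor names, loadings, and noise levels are assumptions for demonstration only:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000  # synthetic monthly-style observations

# Synthetic factor returns: market (MKT), size (SMB), value (HML),
# momentum (MOM), and a hypothetical air-quality-protection factor (AQP).
X = rng.normal(0.0, 0.04, size=(n, 5))
true_beta = np.array([1.1, 0.4, -0.2, 0.15, 0.0])  # AQP loading set to zero
alpha = 0.001
y = alpha + X @ true_beta + rng.normal(0.0, 0.01, size=n)

# OLS with an intercept, as in the Fama-French-Carhart specification
# extended by the extra air-quality factor.
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
est_alpha, est_beta = coef[0], coef[1:]
print(est_beta.round(3))
```

With the AQP loading set to zero in the data-generating process, the fitted AQP coefficient comes out statistically indistinguishable from zero, mirroring the study's finding of no significant air-quality effect on returns.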
Risks associated with the use of AI, ranging from algorithmic bias to model hallucinations, have received much attention and extensive research across the AI community, from researchers to end-users. However, a gap exists in the systematic assessment of supply chain risks associated with the complex web of data sources, pre-trained models, agents, services, and other systems that contribute to the output of modern AI systems. This gap is particularly problematic when AI systems are used in critical applications, such as the food supply, healthcare, utilities, law, insurance, and transport. We survey the current state of AI risk assessment and management, with a focus on the supply chain of AI and risks relating to the behavior and outputs of the AI system. We then present a proposed taxonomy specifically for categorizing AI supply chain entities. This taxonomy helps stakeholders, especially those without extensive AI expertise, to "consider the right questions" and systematically inventory dependencies across their organization's AI systems. Our contribution bridges a gap between the current state of AI governance and the urgent need for actionable risk assessment and management of AI use in critical applications.
The success of OpenAI's ChatGPT in 2023 has spurred financial enterprises into exploring Generative AI applications to reduce costs or drive revenue within different lines of business in the financial industry. While these applications offer strong potential for efficiencies, they introduce new model risks, primarily hallucinations and toxicity. As highly regulated entities, financial enterprises (primarily large US banks) are obligated to enhance their model risk frameworks with additional testing and controls to ensure safe deployment of such applications. This paper outlines the key aspects of model risk management for generative AI models, with special emphasis on the additional practices required in model validation.
Based on the manifold implications of the relationship between cyber risk management and digital transformation, the present paper performs a bibliometric analysis highlighting the main topics that have been investigated so far under this overarching research theme. The aim is to conduct a structured examination of cyber risk management in the context of digital transformation, with a focus on management practices. The bibliometric analysis was conducted following the steps in the PRISMA guidelines. 73 sources retrieved from the Scopus database were analysed, covering 37 conference papers, 24 journal articles, 8 book chapters and 1 full book, together with 2 reviews and 1 conference review. In terms of research areas, the majority of studies came from the disciplines of engineering, computer science and social sciences. By examining various approaches to assessing cyber risks, the bibliometric analysis provides a solid framework for understanding and managing threats in a systematic and effective way. Moreover, the analyses reflect discrepancies between the perceived level of cybersecurity requirements and the actual level of preparedness and awareness of cyber risks in various industry sectors. This underlines the need for a more comprehensive and proactive approach to cybersecurity management, taking into account not only technologies and protection methods, but also cultural and organisational aspects.
Economics as a science, Business records management
To determine the susceptibility of gully erosion at a small watershed scale on the Loess Plateau of China, three hybrid models were developed. These models were based on the Multi-Attributive Border Approximation Area Comparison (MABAC), frequency ratio (FR), CatBoost (CB), LightGBM (LG), and extremely randomized trees (ET). Based on Unmanned Aerial Vehicle (UAV) photos, a total of 83 gullies with 12,150 gully pixels and 8 conditioning variables were extracted and used to create the gully inventory database. The correlations between the conditioning parameters and the gully pixels were then determined using FR, and the relative importance of these conditioning factors was quantified using machine learning. Then, for gully erosion susceptibility mapping (GESM), three hybrid gully erosion susceptibility models, MABAC-FR-CB, MABAC-FR-LG, and MABAC-FR-ET, were developed. The performance of the three hybrid models was assessed using the receiver operating characteristic (ROC) curve and the Kappa coefficient. The results indicated that slope steepness greatly influenced gully erosion. MABAC-FR-ET performed most precisely, with an area under the curve (AUC) of 0.998 and a Kappa of 0.952. As a result, MABAC-FR-ET was determined to be the most accurate method for predicting gully erosion susceptibility in the study watershed.
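The two evaluation metrics used above are straightforward to compute from scratch. A minimal sketch with toy labels and susceptibility scores (not the study's UAV-derived data):

```python
import numpy as np

def roc_auc(y_true, score):
    """AUC as the probability that a random positive outscores a random
    negative (Mann-Whitney formulation; ties count as half a win)."""
    y_true, score = np.asarray(y_true), np.asarray(score)
    pos, neg = score[y_true == 1], score[y_true == 0]
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (len(pos) * len(neg))

def cohen_kappa(y_true, y_pred):
    """Kappa = (observed agreement - chance agreement) / (1 - chance)."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    po = np.mean(y_true == y_pred)
    pe = sum(np.mean(y_true == c) * np.mean(y_pred == c)
             for c in np.unique(np.concatenate([y_true, y_pred])))
    return (po - pe) / (1.0 - pe)

# Toy example: 1 = gully pixel, 0 = non-gully pixel.
auc = roc_auc([1, 1, 0, 0], [0.9, 0.4, 0.5, 0.2])
kappa = cohen_kappa([1, 1, 0, 0], [1, 0, 0, 0])
print(auc, kappa)  # 0.75 0.5
```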
Fernando Acebes, José Manuel González-Varona, Adolfo López-Paredes
et al.
Project managers who deal with risk management are often faced with the difficult task of determining the relative importance of the various sources of risk that affect the project. This prioritisation is crucial to direct management efforts and ensure higher project profitability. Risk matrices are tools widely recognised by academics and practitioners in various sectors for assessing and ranking risks according to their likelihood of occurrence and impact on project objectives. However, the existing literature highlights several limitations to the use of risk matrices. In response to these weaknesses, this paper proposes a novel approach for prioritising project risks. Monte Carlo Simulation (MCS) is used to perform a quantitative prioritisation of risks with the simulation software MCSimulRisk. Together with the definition of project activities, the simulation includes the identified risks by modelling their probability and impact on cost and duration. With this novel methodology, a quantitative assessment of the impact of each risk is provided, as measured by the effect that it would have on project duration and on total project cost. This allows the differentiation of critical risks according to their impact on project duration, which may differ if cost is taken as the priority objective. This proposal is interesting for project managers because they will, on the one hand, know the absolute impact of each risk on their project duration and cost objectives and, on the other hand, be able to discriminate the impacts of each risk independently on the duration objective and the cost objective.
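The quantitative prioritisation idea can be illustrated without the MCSimulRisk software. The sketch below simulates a simplified serial project with a hypothetical two-risk register and ranks each risk by its marginal effect on expected duration and on expected cost separately, mirroring the point that the two rankings can differ; all activities, probabilities, and impacts are invented for demonstration:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 20_000  # Monte Carlo runs

# Hypothetical serial project: triangular activity durations (days).
activities = [(8, 10, 14), (4, 5, 8), (9, 12, 16)]

# Hypothetical risk register: (name, probability, duration impact, cost impact).
risks = [("A", 0.5, 10.0, 2000.0), ("B", 0.2, 5.0, 8000.0)]

def simulate(active):
    """Mean project duration and risk cost with a subset of risks active."""
    dur = sum(rng.triangular(lo, mode, hi, N) for lo, mode, hi in activities)
    cost = np.zeros(N)
    for name, p, d_imp, c_imp in risks:
        if name in active:
            hit = rng.random(N) < p
            dur += hit * d_imp
            cost += hit * c_imp
    return dur.mean(), cost.mean()

base_d, base_c = simulate(set())
impact = {}
for name, *_ in risks:
    d, c = simulate({name})
    impact[name] = (d - base_d, c - base_c)  # marginal expected impacts

by_duration = sorted(impact, key=lambda r: impact[r][0], reverse=True)
by_cost = sorted(impact, key=lambda r: impact[r][1], reverse=True)
print(by_duration, by_cost)
```

With these illustrative numbers, risk A dominates the duration ranking (expected impact 0.5 × 10 = 5 days vs 0.2 × 5 = 1 day) while risk B dominates the cost ranking (0.2 × 8000 = 1600 vs 0.5 × 2000 = 1000), so the critical risk depends on which objective is prioritised.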
Fernando Acebes, Javier Pajares, Jose M Gonzalez-Varona
et al.
Project managers need to manage risks throughout the project lifecycle and, thus, need to know how changes in activity durations influence project duration and risk. We propose a new indicator (the Activity Risk Index, ARI) that measures the contribution of each activity to the total project risk while it is underway. In particular, the indicator informs us about what activities contribute the most to the project's uncertainty so that project managers can pay closer attention to the performance of these activities. The main difference between our indicator and other activity sensitivity metrics in the literature (e.g. cruciality, criticality, significance, or schedule sensitivity indices) is that our indicator is based on the Schedule Risk Baseline concept instead of on cost or schedule baselines. The new metric not only provides information at the beginning of the project, but also while it is underway. Furthermore, the ARI is the only one to offer a normalized result: if we add its value for each activity, the total sum is 100%.
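The normalization property is the distinctive feature of the ARI. The sketch below computes a simplified variance-share analogue for a serial activity chain (the actual ARI is defined on the Schedule Risk Baseline, which is not reconstructed here); by construction the per-activity shares sum to 100%:

```python
import numpy as np

rng = np.random.default_rng(7)
N = 50_000

# Hypothetical serial chain of activities with differing duration
# uncertainty (normal durations; means in days, differing std devs).
means = np.array([10.0, 5.0, 12.0])
stds = np.array([1.0, 3.0, 0.5])
d = rng.normal(means, stds, size=(N, 3))
D = d.sum(axis=1)  # project duration for a serial network

# Variance-decomposition index in the spirit of the ARI: each activity's
# covariance share of total project-duration variance, scaled to percent.
contrib = np.array([np.cov(d[:, i], D)[0, 1] for i in range(3)]) / D.var(ddof=1)
contrib_pct = 100.0 * contrib
print(contrib_pct.round(1))
```

Because Cov(d_i, D) summed over activities equals Var(D) exactly, the shares always total 100%, and the activity with the largest duration uncertainty (here the second one) receives the largest index.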
Md. Abdul Moktadir, A. Dwivedi, Nadia Sultana Khan
et al.
In the present competitive business environment and era of globalized marketing, the supply chain (SC) of the leather industry faces a variety of risks. Hence, one of the fundamental concerns in the leather industry supply chain (LISC) is recognizing and prioritizing the various risk factors for attaining sustainability. The present study attempts a comprehensive evaluation of SC risk factors considering the case of the leather industry. Based on a literature search and interviews with domain experts, forty-four risk factors in the context of the LISC are identified. The identified risk factors are further segregated into five dimensions of sustainability (social, environmental, economic, technical, and institutional). A Pareto analysis is performed to discover the most pertinent risk factors. Further, the best-worst method (BWM) is employed to evaluate the importance of each pertinent risk factor for decision-making purposes. The findings reflect that 'inefficient effluent treatment', 'change in consumer preference', 'improper dumping of solid waste', 'volatility of price and cost' and 'fiscal changes' are the crucial risk factors that must be addressed for the successful execution of sustainable supply chain management (SSCM) practices in an emerging economy context. It is expected that the results and findings will assist leather industry managers in decision-making for better administration and alleviation of supply chain risks to achieve sustainability.
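The Pareto screening step can be sketched directly. The scores below are illustrative placeholders, not the study's expert elicitations; the rule keeps the highest-scoring factors until 80% of the cumulative score is covered:

```python
# Minimal Pareto screen: rank risk factors by (hypothetical) expert score
# and keep the "vital few" accounting for 80% of the cumulative score.
factors = {
    "inefficient effluent treatment": 34,
    "change in consumer preference": 27,
    "improper dumping of solid waste": 22,
    "volatility of price and cost": 18,
    "fiscal changes": 15,
    "machinery breakdown": 6,
    "labour strike": 4,
    "port congestion": 3,
}
ranked = sorted(factors.items(), key=lambda kv: kv[1], reverse=True)
total = sum(factors.values())
selected, cum = [], 0.0
for name, score in ranked:
    selected.append(name)
    cum += score
    if cum / total >= 0.8:
        break
print(selected)
```

The surviving short list would then be passed to the BWM stage for pairwise best-versus-worst weighting.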
Antimicrobial resistance (AMR) spans veterinary medicine, food, the environment, human medicine, and other fields. It endangers food safety, international trade, economic development, and health, and has become a major public health problem facing the world. China is the world's largest producer and consumer of antimicrobials, about 60% of which are used in the breeding industry. Due to their widespread use and abuse in the breeding industry, many antimicrobial-resistant bacteria have appeared and spread rapidly. Following the "One Health" strategy, the United Nations has encouraged countries to establish cross-sectoral AMR coordination mechanisms. This research compared the development, framework, and selected monitoring results of foodborne bacterial antimicrobial resistance surveillance systems in China, the United States, and Europe. National surveillance systems basically cover antimicrobial resistance in foodborne bacteria from humans, food animals, and related foods, while also monitoring the use of antimicrobials in human medicine and food animals. The United States and European countries started earlier and have relatively well-developed systems; through nearly 30 years of development, they have established the baseline levels of resistance and drug use for foodborne bacteria. Furthermore, scientific evaluation of surveillance data can support risk management, for example by optimizing surveillance programmes (such as adding surveillance of antimicrobial resistance in pets and environmental monitoring) and proposing interventions to limit the spread of resistant bacteria. China's surveillance system for foodborne bacterial antimicrobial resistance started later: surveillance covering humans, food animals, and related foods has developed rapidly over the past 20 years, while monitoring of antibiotic use in food animals only began in 2018, and further improvement is needed in all respects.
At the same time, the EU monitoring system has implemented a cross-departmental collaboration mechanism to achieve data sharing. At present, foodborne bacterial antimicrobial resistance monitoring systems run by different departments have been established in China. However, a data-sharing mechanism has not yet been realized; data "chimneys" and information islands persist, making it impossible to maximize the utility of existing data resources. Drawing on the experience of the United States and Europe, this research offers the following insights for China's antimicrobial resistance surveillance system: gradually improve the surveillance system, establish a multi-sectoral collaborative governance mechanism, and accelerate the application of new technologies to data mining, so as to comprehensively improve the ability to curb bacterial resistance and protect people's health.
The advent of non-pillar mining technology of self-formed roadway based on roof cutting theory (NPMTSFRRCT) has revolutionized the method of tackling the difficulties posed by hard suspended roofs in mining engineering. The design of roof cutting parameters plays a crucial role in determining the roof fracture characteristics. The principles of roof cutting parameter design have been analysed, with a focus on the key and challenging aspect of the collapse of single-layer thick hard rock under the condition of a thin immediate roof. A mechanical model of roof fracture has been established, and the impact of immediate roof thickness, roof cutting height, roof cutting angle, and main roof thickness on roof fracture has been analysed. A numerical model based on the UDEC software has been created, and the roof fracture, stress, and displacement variation characteristics have been studied using the failure criterion of polygonal blocks. The results of the theoretical analysis have been verified, and it was found that the roof fracture in partial roof cutting occurs in the form of a hinged bite, while complete roof cutting results in step sinking. Engineering practice has shown that the deformation of the roadway surrounding rock has been effectively controlled.
Most geotechnical stability research is linked to "active" failures, in which soil instability occurs due to soil self-weight and external surcharge applications. In contrast, research on passive failure is uncommon, as it is predominantly caused by external loads acting against the soil self-weight. An earlier active trapdoor stability investigation using Terzaghi's three-stability-factor approach was shown to be a feasible method for evaluating cohesive-frictional soil stability. Therefore, this technical note aims to expand "active" trapdoor research to assess drained circular trapdoor passive stability (the blowout condition) in cohesive-frictional soil under axisymmetric conditions. Using numerical finite element limit analysis (FELA) simulations, soil cohesion, surcharge, and soil unit weight effects are considered using three stability factors (Fc, Fs, and Fγ), which are all associated with the cover-depth ratio and soil internal friction angle. Both upper-bound (UB) and lower-bound (LB) results are presented in design charts and tables, and the large dataset is further studied using an artificial neural network (ANN) as a predictive model to produce accurate design equations. The proposed passive trapdoor problem under axisymmetric conditions is significant when considering soil blowout stability owing to faulty underground storage tanks or pipelines with high internal pressures.
Olga E. Bashina, Marina D. Simonova, Lilia V. Matraeva
et al.
Countries with a large energy sector face the issues of forming and developing a state energy policy that takes into account not only sectoral and intersectoral aspects, but also the components of managing significant amounts of rental income. In this regard, any such economic system, on the one hand, has great opportunities associated with the management of energy resources as a factor of development and, on the other hand, is constantly at risk of destabilization of the economic system as a whole. To date, economic history allows us to speak of the accumulation of a sufficient number of observations for conducting a comprehensive study of the features of the development of public energy policies. The study is based on the formalization of historical descriptions of the experience of 24 countries (30 cases). The article describes in detail the experience of 13 of the most striking cases. This made it possible to identify 14 variables for evaluating state energy policy, while outlining three areas of attention for public administration. The choice of variables used in the model was made on the basis of the relative frequencies of mechanism application across the observed population, MNRW-TF recommendations for improving the extractive industry, the contribution of resource industries to the country's socio-economic development, etc. Further cluster analysis led to the identification of both a pronounced polarity in the development of state energy policy and options for combining its areas. JEL Classifications: C82, Q43, Q48, P51, O57, O11.
History of scholarship and learning. The humanities, Social Sciences
Reinforcement learning (RL) based investment strategies have been widely adopted in portfolio management (PM) in recent years. Nevertheless, most RL-based approaches often emphasize pursuing returns while ignoring the risks of the underlying trading strategies, which may lead to great losses, especially under high market volatility. Therefore, a risk-manageable PM investment framework integrating both RL and barrier functions (BF) is proposed to carefully balance the need for high returns against acceptable risk exposure in PM applications. To the best of our knowledge, this work represents the first attempt to combine BF and RL for financial applications. While the involved RL approach may aggressively search for more profitable trading strategies, the BF-based risk controller continuously monitors the market states to dynamically adjust the investment portfolio as a controllable measure for avoiding potential losses, particularly in downtrend markets. Additionally, two adaptive mechanisms are provided to dynamically adjust the impact of the risk controllers so that the proposed framework can be flexibly adapted to uptrend and downtrend markets. The empirical results of our proposed framework clearly reveal such advantages over most well-known RL-based approaches on real-world data sets. More importantly, our proposed framework sheds light on many possible directions for future investigation.
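The risk-controller idea can be illustrated with a much simpler stand-in than the paper's barrier-function formulation: cap the estimated portfolio volatility and shrink the RL policy's risky weights toward cash whenever the cap would be breached. The scaling rule and all numbers below are assumptions for demonstration, not the paper's method:

```python
import numpy as np

def risk_controlled_weights(w, cov, sigma_max):
    """Scale the risky allocation so that estimated portfolio volatility
    stays inside the barrier sigma_max; the residual moves to cash.
    A simplified stand-in for a BF-based risk controller."""
    w = np.asarray(w, dtype=float)
    sigma = float(np.sqrt(w @ cov @ w))
    if sigma <= sigma_max:
        return w, 1.0  # constraint inactive: keep the RL policy's weights
    s = sigma_max / sigma  # shrink toward cash until the barrier binds
    return s * w, s

# Toy two-asset example with an annualized covariance matrix
# (illustrative numbers only).
cov = np.array([[0.09, 0.02],
                [0.02, 0.16]])
w = np.array([0.6, 0.4])          # weights proposed by the RL policy
w_adj, scale = risk_controlled_weights(w, cov, sigma_max=0.15)
sigma_adj = float(np.sqrt(w_adj @ cov @ w_adj))
print(w_adj, round(sigma_adj, 4))
```

Because volatility is homogeneous of degree one in the weights, the linear shrink lands exactly on the barrier; an adaptive mechanism in the spirit of the paper could then vary `sigma_max` between uptrend and downtrend regimes.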
We introduce a decision-making framework tailored for the management of systemic risk in networks. This framework is constructed upon three fundamental components: (1) a set of acceptable network configurations, (2) a set of interventions aimed at risk mitigation, and (3) a cost function quantifying the expenses associated with these interventions. While our discussion primarily revolves around the management of systemic cyber risks in digital networks, we concurrently draw parallels to risk management of other complex systems where analogous approaches may be adequate.