A. Charnes, W. Cooper, G. Symonds
Results for "Production management. Operations management"
Showing 20 of ~6,411,770 results · from CrossRef, DOAJ, arXiv, Semantic Scholar
Subodha Kumar, Rakesh R. Mallipeddi
The new age economy is primarily driven by Industry 4.0 and Industry 5.0, which facilitate smartification of organizations by helping them integrate and automate decision making. Recent advances in information and communication technologies, such as the cloud, big data, the Internet of Things, artificial intelligence, and nanotechnology, have accelerated the adoption of Industry 4.0 and Industry 5.0. Because of these advancements, organizations are now facing new challenges in the form of cybersecurity risks that are partly caused by these technologies. In recent years, there has been a spike in the number of cyberattacks, and organizations are taking steps to minimize the impacts of these attacks. To address this critical issue, in this article, we discuss possible future research directions that production and operations management (POM) researchers can undertake to help organizations, supply chains, and governments develop robust strategies for reducing the number of attacks and their repercussions. In particular, we identify several avenues for future research in the following domains of POM: (1) global operations strategy, (2) healthcare operations management, (3) public policy, (4) management of technology, (5) supply chain management, and (6) disruptive technologies. Research on the topic of cybersecurity is not only an opportunity for operations management researchers but also critical for industry and society to overcome the challenges of cybersecurity risks.
Yang Lu, Xiaoming Yan, Yugang Yu
Popularity bias reflects the positive impact of popularity information on consumer choices. There are two common display formats for popularity information on online retail platforms: the total-based cumulative sales format, where the total sales of a product are displayed since its launch, and the period-based cumulative sales format, where only sales within a specific recent period are shown. Both types of popularity information are continuously updated. This paper focuses on the multiproduct dynamic pricing problem with popularity bias. We employ the widely used multinomial logit model to investigate the impact of popularity bias on consumer choices. In particular, we examine how popularity bias affects marginal revenue, pricing decisions, and market shares. Moreover, we highlight that ignoring popularity bias can lead to a suboptimal outcome. As the multiproduct dynamic pricing problem suffers from the curse of dimensionality, we propose a semi-myopic pricing policy, which is computationally tractable, and demonstrate its asymptotic optimality under both formats. Our numerical simulations further indicate that ignoring popularity bias can result in substantial revenue losses, while the semi-myopic pricing policy consistently outperforms other heuristics under both formats. Finally, empirical tests on real data provide a comprehensive procedure for identifying the most appropriate choice models, which offer practical insights for implementation.
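The choice structure described in this abstract can be sketched in a few lines. The following is a minimal, hypothetical illustration of a multinomial logit model with a popularity term; the utility form, coefficients, and sales figures are invented assumptions, not the paper's calibration:

```python
import math

def mnl_shares(prices, sales, beta_price=1.0, gamma_pop=0.5):
    """Choice probabilities under a multinomial logit model whose utility
    includes a popularity term (log of displayed cumulative sales).
    Hypothetical utility: u_i = -beta_price * p_i + gamma_pop * ln(1 + s_i)."""
    utils = [-beta_price * p + gamma_pop * math.log1p(s)
             for p, s in zip(prices, sales)]
    # Outside (no-purchase) option normalized to utility 0
    denom = 1.0 + sum(math.exp(u) for u in utils)
    return [math.exp(u) / denom for u in utils]

prices = [2.0, 2.0]      # two identically priced products...
sales = [100.0, 10.0]    # ...differing only in displayed cumulative sales
shares = mnl_shares(prices, sales)
```

With the popularity weight set to zero, the two identically priced products split demand evenly; with a positive weight, the product with larger displayed sales captures the larger share, which is the popularity-bias effect the paper models.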
Tinglong Dai, Jayashankar M Swaminathan
Artificial intelligence (AI) is poised to reshape operations across industries. Yet its real-world impact reveals a jagged and uneven implementation frontier. To make sense of this emerging landscape, we develop a foundational framework that synthesizes research and practice at the intersection of AI and Operations Management (OM), anchored in three interdependent pillars: (1) AI for OM, (2) OM for AI, and (3) Human–AI Interaction. First, AI for OM analyzes how AI enhances core operational processes, including design, procurement, production, and delivery. Second, OM for AI argues that scaling AI safely and effectively stands to benefit from core OM principles, including workflow design, capacity management, process control, drift detection, and continuous improvement, all of which are central to AI development and deployment. Third, Human–AI Interaction emphasizes the role of trust, incentives, and organizational design in mediating how humans and machines learn from and collaborate with each other. This triadic framework provides a foundation for organizing research on AI and OM and offers practical guidance for integrating AI into business and societal systems.
Daehoon Jung, Seung Jae Park
This study explores the effects of competition on fair trade operations. Specifically, as a benchmark, we first consider a model with a single fair trade organization (FTO), that is, a monopoly FTO. We then extend the analysis to a setting with two competing FTOs, that is, duopoly FTOs. Although both FTOs share the common goal of alleviating farmers’ poverty, they adopt contrasting strategies: One FTO seeks to lower the barriers to fair trade, aiming to mainstream it, while the other maintains stricter standards to prevent fairwashing (or greenwashing). Following industry practices, we implement two modes of competition and compare them with the monopoly benchmark. We find that, while competition between FTOs increases the aggregate demand for fair trade certified products (mainstreaming), it results in a lower aggregate premium (fairwashing). Furthermore, we consider the potential application of blockchain technology in fair trade operations and demonstrate that it offers a promising solution to mitigate the fairwashing effects from competition between FTOs.
Yasin Alan, Mümin Kurtuluş, Alper Nakkas
We study the impact of planogram design (i.e., placement of products on shelves) and display fees (i.e., fees manufacturers pay retailers for prime shelf space) on a retailer’s category management strategy and interactions with national brand manufacturers (NBMs). We consider a game-theoretic model with one retailer and multiple NBMs. Each NBM offers one product. In addition, the retailer has the option to introduce a store brand (SB) product. If the retailer decides to introduce its SB, it needs to drop one of the national brand (NB) products from its assortment because it has limited shelf space. The retailer’s planogram has one prime shelf space and multiple nonprime shelf spaces. The prime shelf space has a demand-stimulating impact. The NBMs determine their wholesale prices and how much they are willing to pay for the prime shelf space. The retailer makes SB introduction and planogram decisions. It also sets the quantities for each product in its assortment, resulting in market-clearing retail prices. Our analysis leads to three key findings. First, the presence of a prime shelf space may not only prevent the retailer from introducing its SB but also lead to sizable changes in the retail and wholesale prices in the category. For example, an extensive numerical study reveals that receiving the prime shelf space increases an NB product’s retail price by 5.48%, on average. The average increase in the same product’s wholesale price is 2.52%, as the prospect of charging a higher wholesale price incentivizes its manufacturer to pay a display fee for securing the prime shelf space. Second, there are cases in which the retailer uses SB introduction as a strategic lever to intensify the competition for the prime shelf space and thereby increase its revenue from display fees. Third, despite being an additional expense, display fees can increase the NBMs’ profits by allowing them to influence the retailer’s assortment and planogram decisions in their favor.
We discuss the implications of these findings for managers and researchers.
Shuo Zhou, Boyu Liu, Jianquan Wang et al.
Climate change has emerged as one of the most pressing global challenges in recent decades. Agricultural activities significantly influence climate dynamics, necessitating thorough investigation of their emission patterns. Using the FAO datasets, the objectives of this study were to assess agricultural GHG emissions, identify influencing factors, and explore potential mitigation strategies. The results show that emissions related to crop production are strongly correlated with the yields of predominant crops. Maize production had the largest impact on crop emissions (0.023), followed by potato (0.021) and rice (0.007). Notably, these three crops accounted for substantial portions of total crop-related emissions, with maize contributing 11.70%, potatoes (Solanum tuberosum L.) 10.21%, and rice 9.25%. In the livestock sector, cattle herds generated 10.75% of emissions, with pigs and sheep contributing 9.82% and 10.03%, respectively. Multivariate analysis revealed the cattle/buffalo population as the dominant emission driver (0.32), followed by sheep/goat (0.21) and swine (0.10) populations. Simultaneously, emissions from livestock operations were closely associated with the populations of key livestock species. Thus, from a climate mitigation perspective, prioritizing yield-optimized agronomic approaches for maize and potato cultivation, along with strategic population management of cattle and sheep, represents a critical pathway toward achieving emission reduction targets in global agricultural systems.
Amirreza Hosseini, Amro M. Farid
Megaprojects are large-scale, complex, and one-off engineering endeavors that require significant investment from the public or private sector. Such projects generally cost more than a billion dollars, take many years to develop and construct, involve stakeholders in both the public and private sectors, and impact millions of people. Most of the extant megaproject research is concerned with understanding why the engineering management of megaprojects fails so frequently and which dimensions make them so difficult to manage, including size, uncertainty, complexity, urgency, and institutional structure (Denicol et al., 2020). Recently, the literature on megaprojects has advocated for a convergence of the engineering management and production system management literature. To that end, this paper proposes the use of Model-Based Systems Engineering (MBSE) and Hetero-Functional Graph Theory (HFGT), where the latter, quite interestingly, finds its origins in the mass-customized production system literature. More specifically, HFGT was developed so that the physical and informatic parts of production system planning, operations, and decision-making are readily reconfigured to support production customization at scale. As the literature on megaprojects is rapidly evolving with a significant amount of divergence between authors, this paper builds upon the recent and extensive megaproject literature review provided by Denicol et al. (2020). The paper concludes that MBSE and HFGT provide a means for addressing many of the concluding recommendations provided by Denicol et al. MBSE and HFGT not only align with current research on megaprojects but also push the boundaries of how the engineering management of megaprojects can gain a unified theoretical foundation.
Sylvester Willys Namagwa
As life expectancy in Kenya increases, so does the need for efficient pension schemes that can secure a dignified retirement and protect members from old-age poverty. Limited research, however, has explored the efficiency of these schemes under existing governance structures. This study addresses that gap by examining the combined effects of corporate governance, risk management, and industry regulation on pension scheme efficiency in Kenya. Using a quantitative design, we conducted a panel regression analysis on a seven-year secondary dataset of 128 Kenyan pension schemes, totaling 896 observations. Our results reveal that the presence of employee representatives on the board and effective risk management both have a significant positive effect on efficiency. Conversely, independent board members exhibit a significant negative effect. Other factors, including top management representation, female board members, and industry regulation, showed no significant effect on efficiency in the joint model. These findings suggest that the impact of governance and risk management on efficiency is nuanced, with specific factors like employee representation playing a more prominent role. We propose that the electoral process for employee board members may introduce a "self-cleaning mechanism" that progressively enhances scheme efficiency. This mechanism offers a novel theoretical extension of Agency Theory, explaining the convergence of interests between elected trustees and scheme members.
ManMohan S Sodhi, Christopher S Tang
In 2015, the United Nations (UN) countries signed up to achieve 17 sustainable development goals (SDGs) for people, planet, prosperity, peace, and partnership by 2030. However, the trend of progress toward achieving these goals indicates that none of the 17 goals may be achieved by 2030 globally. We first provide a foundation for operations management (OM) researchers to help shape the interventions for countries and companies to help achieve the SDGs by (1) identifying the synergies among the SDGs so that interventions can impact multiple SDGs positively and (2) linking some of the extant OM research with the synergies among the various SDGs. This way, researchers can understand the complexity of the challenges ahead and build on the OM literature to influence the interventions of governments and organizations to maximize the attainment of the SDGs. We also list some research opportunities to help OM researchers develop research agendas.
Reza Zanjirani Farahani, Nasrin Asgari, Luk N. Van Wassenhove
Textile waste is among the most polluting waste streams globally, driven strongly by fast fashion (FF) products. Public pressure has made many FF firms voluntarily collect a small fraction of their preowned items and export them to developing countries for reuse. However, some developing countries are launching import bans on second-hand clothes. In addition, FF firms may soon be forced by extended producer responsibility legislation to collect more preowned items for reuse and recycling. To date, they do not have sufficient capacity to deal with this. Charities have been the key collectors and recyclers of unwanted clothes. Therefore, charities could help FF firms increase their capacity in this reverse supply chain (SC). However, we hardly witness such collaboration, for two main reasons: (i) charities prefer to sell high-quality preowned items in the primary market to generate the highest possible revenue, and FF firms may fear cannibalization; (ii) many charities believe that FF firms generate large quantities of low-quality items that require collection and sorting while being difficult to sell in the primary market. Charities also face competition from many small for-profit organizations selling FF preowned items. While charities have the support of volunteers, they tend to be less efficient. This work urges Operations Management (OM) researchers to suggest innovative business models to help (i) FF firms and charities collaborate to solve the abovementioned issues and (ii) charities improve their traditional practices for competitiveness. This study is primarily a position paper highlighting some challenges and introducing interesting research problems. Although it is not a research paper, it follows a qualitative research method to collect and analyze the required supporting documents to justify arguments and statements. We collected primary and secondary data from the textile reverse SC members to familiarize the OM community with this context.
The current changes in the textile reverse SC offer many great opportunities for impactful OM research.
Sushil Gupta, Hossein Rikhtehgar Berenji, Manish Shukla et al.
We review and analyze the farming (upstream agribusiness supply chain) research literature since 1965 to identify farming research opportunities for operations management (OM) researchers. A majority of reviewed papers in our corpus, until the turn of the 21st century, primarily focus on improving operational efficiency and effectiveness of farming using optimization techniques. However, during the last two decades, farmers’ welfare and the interests of other stakeholders have drawn OM researchers’ attention. This expanded focus on farming research has become possible due to the proliferation of mobile communication devices and the Internet as well as advancements in information technology platforms and social media. Our review also shows that there is a paucity of OM literature that leverages increased data availability from the emergence of precision agriculture and blockchain to address major challenges for the farming sector emanating from climate change, natural disasters, food security, and sustainable and equitable agriculture, among others. Big data, in conjunction with opportunities for field‐based experimentation, artificial intelligence and machine learning, and integration of predictive and prescriptive analytics, can be leveraged by OM scholars engaged in farming research. We zero in on specific questions, issues, and opportunities for research in farming.
Adam Górny
The safety of employees during work processes is one of the important conditions for the efficient execution of production tasks. This does not change with the use of automation and robotization; on the contrary, these may become a source of hazards in new areas, requiring specific actions to limit the possibility of accidents occurring or to limit their consequences. These actions should be taken in a manner appropriate to the nature of the non-conformities. The study, based on a literature analysis and a case study conducted in a production organization, identified the most important factors that may disrupt the efficient execution of production tasks. It presents the key conditions for the development and operation of Industry 4.0, in accordance with the recommendations of the European Union and guidelines indicated in the scientific literature. The results are intended to inform those responsible for the development of I4.0, with particular attention to safety aspects, in industrial enterprises where task execution requires cooperation between humans and machines. Statistical analysis was not included in the study. The author attempts to identify relationships that justify further research into the underlying correlations, in particular research supporting organizational development professionals in identifying potential problems, so as to give their organizations an advantage in the global competitive environment.
Agnieszka Czerwińska-Lubszczyk, Nikola Byrtek
The concept of Work-Life Balance (WLB), which involves finding a harmonious equilibrium between work and personal life, has gained considerable importance across the European Union. In 2019, the European Parliament adopted a directive that requires all member states to integrate the principles of WLB, aiming to promote gender equality and fair treatment in the labor market (the Directive on transparent and predictable working conditions in the European Union and the Directive on work-life balance for parents and carers). This perspective offers significant opportunities and potential, while also posing challenges for entrepreneurs. An analysis of the literature indicates that the issue of WLB should be analyzed in the context of company size. The main objective of the research is to analyze and evaluate WLB among employees in small and medium-sized enterprises (SMEs) and large enterprises in Poland. The main results show that employees of companies in Poland (both SMEs and large enterprises) point to a wide range of tools that, in their opinion, have an impact on WLB. Companies in Poland most commonly utilize tools such as flexible working hours and hybrid work. The findings confirm that WLB is less frequently implemented in the SME sector compared to large enterprises, and that employees in large enterprises have a better-maintained WLB than employees in the SME sector. Implementing WLB remains a challenge for the SME sector.
Elena Mussinelli
In the Italian context, the first law directly affecting the urban planning and building sector dates back approximately 160 years, to Law 2248/1865. It established the administrative unification of the Kingdom of Italy, empowering municipal councils to deliberate on ‘hygiene, building and local police regulations’, and was followed a few months later by Law 2359/1865 on expropriations for public purpose. By contrast, the first regulations for the protection of artistic, historical, archaeological and ethnographic heritage (1089/1938), and natural beauty (1497/1939), are just over 80 years old. From that time onwards, the rules governing planning and design actions have been considerably enriched and developed. Hence, it is worth reflecting on the effectiveness and efficiency of a regulatory framework that has been governing territorial, urban and building transformations in an increasingly articulated and specialised manner with a view to improving the quality and sustainability of natural and anthropic habitats. Moreover, its ability to govern the ways, times and cultural and technical contents of the project production process to carry out high-quality creations is worthy of consideration. Perhaps never before has the issue of standardisation been so much at the centre of attention in all sectors of civil life as it is today: in public administration and scientific research, among economic operators, planners, and citizens themselves. Regulatory systems are increasingly pervasive in regulating design activity and the characteristics of works in response to a general «increase in the variety and complexity of public interests that appear worthy of protection, such as the quality of the environment, the safeguarding of the natural and historical-artistic heritage, the protection of health, the safety of persons, and security […]» (Bassanini et al., 2005).
Changing interests require frequent updates to adapt regulations to rapid socio-economic, cultural, and technological changes. The centres of regulatory production have also multiplied, breaking up into different levels and sectors of regulation, namely with multi-level (international, EU, national, regional, local), sectoral (economy, environment, territory, landscape, infrastructure, cultural heritage, health, etc.) and institutional governance structures, with corresponding different interests (public/private, collective/individual) and complicated relationships of interconnection, conditionality and/or competition (Raveraira, 2009). The scenario is even more complex if we broaden the scope to include, in addition to prescriptive and binding rules, the vast universe of guiding principles, voluntary standards, guidelines, best practices, etc. Moreover, also due to the nature of the legal system model of reference (civil law derived from Roman law, as opposed to the common law of English-speaking countries, founded on the binding force of practice and judgements), Italian legislation has been stratified by an anomalous number of rules, which are often not mutually coordinated, sometimes contradictory or bearing inconsistent definitions. They are either incapable of producing the desired results, or they produce effects diametrically opposed to those expected. The attempt to solve every problem through a special regulation ends up limiting the free and responsible action of citizens (and planners). Indeed, as Marco Romano points out, «to reduce people’s desires to rights codified in the doctrine of planning, imposed by enlightened and pedagogical governments on rebellious citizens unaware of their own good, is to erase what makes them citizens: the diversity of their individual life projects» (Romano, 2013). On the other hand, the discrepancy between this regulatory approach and the reality that surrounds us is evident.
On Alessandro Pizzorno’s death, Fabrizio Schiaffonati recalled how, back in the 1960s, the doyen of Italian political sociology had already warned that in Italy «everything must be regulated so that everything can be conceded», pointing out that «this is still the case nowadays, more than half a century later, much to the detriment of the quality of the project, which is overwhelmed by constraints and contradictory procedures that obstruct a necessary qualitative transformation of the anthropic environment within proper time and costs» (Schiaffonati, 2019). This hypertrophic growth of laws and regulations (a true ‘legislative inflation’ or ‘regulatory pollution’) is accompanied by their rapid variability over time, so much so that a building intervention begun within a given legislative framework risks being completed under a different regulatory framework, which would not have allowed its execution, and vice versa. Not to mention the «badly written, lengthy regulations that are difficult to read and even more difficult to apply, (which) now represent a constant factor with which even the most prepared and motivated operator must come to terms» (Gorlani, 2022), which lead to confusion and interpretative doubts. This makes bureaucratic formalities unnecessarily complex, overloads administrative action, and increases the regulatory and management costs for citizens, businesses and the public institutions themselves, including those dedicated to monitoring and control actions (which, in a context of shrinking public resources, are often the first to be lacking…). Legal uncertainty leads to opaque, if not arbitrary, decisions, facilitates corruption, increases discrimination and social conflict, and limits economic development, sometimes to the point of inhibiting it (Bassanini et al., 2005).
A vulnus with dramatic effects, if it is true that certainty does not have to be of the law, but: «certainty is law, just as, vice versa, law is certainty, if it is true that law […], is constituted for the specific purpose of giving certainty, or rather: certainties» (emphasis added; Ruggeri, 2005). The body of urban planning legislation has expanded considerably, imposing on city and regional planning new objectives and constraints aimed at protecting and improving the quality of the environment and landscape. Strategic environmental and impact assessments, regulations to limit land consumption, to increase climate resilience and to regenerate the built environment have been in use for many years now, with their rich set of analyses and tools to manage knowledge, build scenarios, compare alternatives, and quantify their effects through indicators (environmental, socioeconomic, etc.). And yet, all this does not seem to have produced the expected effects, as witnessed by the continuing degradation of urban suburbs, the continuous increase in soil erosion by new urbanisations and infrastructures, the abandonment of ‘inland areas’, and the hydrogeological instability of the most ‘fragile’ territories. Instead, by moving more and more on the level of so-called policies, planning seems to have lost its technical capacity to shape the quality of spaces, even in their cultural value and use, in a sort of relapse into illiteracy, forgetting the grammatical and syntactic rules of construction of the European city. The disciplinary crisis of the plan is evident: incapable of governing land uses and built forms, as well as the quality of public space, it relies instead on the abstraction of ‘tactical squares’ and social streets totally inadequate to determine an organic configuration of the urban structure.
There is no large city that does not have a plan for climate resilience or sustainable mobility, nor is there a major project that cannot boast top-level environmental and/or energy performance, duly certified even when it plans to replace a tree-lined park of more than 50,000 square metres with green roofs on a shopping centre (for example, San Siro in Milan). Greenwashing operations often characterise the private actions of real estate operators, in the absence of checks and controls by the public authorities. The public works sector has long been searching for a better balance of time, cost and quality of works. «A long journey, which has allowed for advances […] and regulatory innovations during the Nineties» (Schiaffonati, 2006) and which, after thirty years of conjunctural measures (suspensions, temporary derogations, emergency decrees, special procedures and competences, variations of thresholds, etc.) has led to the new Procurement Code (legislative decree no. 36/2023). It features a text of more than 150,000 words, to which the regulatory and procedural innovations introduced by the PNRR must be added, with the related set of regulations, guidelines, explanatory circulars, protocols and technical instructions. It is a seemingly unstoppable process of continuous correction and integration to reform the reform, in the absence of the indispensable monitoring activity that should, instead, verify and assess the effects of the application of the regulation to correctly finalise its amendment. Nevertheless, there has been no lack of significant precedents in this regard, as in the case of the French experimentation with the Spinetta Law on construction insurance systems.
If we apply to the standard the historical notion of “quality as fitness for intended use” (Juran, 1951), or the more recent notion of «the set of properties and characteristics of a product or service that provide the capacity to satisfy expressed or implicit needs» (UNI EN ISO 8402:1995), it clearly appears that the challenge to be faced concerns not so much or only regulatory and administrative simplification, or the replacement of redundant, obsolete or unjustified regulations, but precisely the “quality of regulation”. This is a direction undertaken since 2001 by OECD and APEC countries with Regulatory Reform (reference criteria to ensure quality and transparency in regulatory activity), in line with the obligation to formulate rules that are conceptually and semantically precise, clear and comprehensible in the terms used, in the objectives set, and in the required behaviour (Constitutional Court, ruling no. 364 of 1988) and, above all, with contents derived from consensual and shared planning (Raveraira, 2009). Responsibility, consensus and collaboration are, I believe, the key words for rethinking the relationship between design and regulation. In fact, I agree with Marco Dugato’s observation in this Dossier when he argues that «the fault of normative hypertrophy cannot be attributed to the omnipotence of the regulator by itself, rather it is attributable to the contribution of the ones regulated». If it is true that architectural design is constrained by regulations, it certainly cannot be mechanically determined by them for mere reasons of conformity. Conversely, as Maria Chiara Torricelli emphasises again in the Dossier, the norm is a tool that provides valid and shared knowledge to the project; and the project itself, as a projective activity, contributes proactively to its definition.
There are many examples spanning technical directives regulating the implementation cycles of the INA Casa, the result of design research in support of the political project, and the various procedural and meta design regulations derived from research in the Architectural Technology Field. Such design experiences have unfolded in an experimental manner, in derogation of the regulations and leading to their renewal. Instead, deductive design approaches seem to prevail today, due to the growing availability of algorithmic procedures that do not merely support the design process, but develop it in an almost automated manner through conditioning and prevailing indicators and parameters. These tools legitimise choices where conformity to the standard acts as a screen for the assumption of precise responsibilities. There is a conceptual and operational reversal with respect to creative, responsibly inductive design action, which experiments and innovates, putting the principles of adequate performance and compliance with needs over the criteria of formal conformity. This is evident in the relationship between technical regulations and techno-typological innovation for evolutions that move the parameters of regulatory congruity “forward”, but sometimes even “sideways”. This also counteracts the phenomena of norm obsolescence. In consideration of the pervasiveness of the regulatory systems that rule design action, it is, finally, disturbing to observe the very limited importance assigned to this subject in the education of new designers. The didactics of design, which have long been the focus of Architecture studies, rarely envisage a structured discussion on regulatory and normative aspects, leaving them to the discretion of professors. 
Hence, at the end of the course, a large proportion of students have never heard of the Code of Procurement, environmental impact assessment or minimum environmental criteria… It is, instead, essential to solicit, from the first year, critical attention to the normative paradigm, also for the ethical, social and professional responsibilities it entails, and to encourage the assumption of norms and constraints as factors that nourish the entire design process. The norm thus becomes a «tool for guiding and controlling design choices», which as such «must be assumed in the organisation of the starting data» (Del Nord, 1992). Not to mention the need for qualifying training programmes, as Mario Avagnina points out, so that all those involved in the process, particularly public clients, are able to carry out their tasks. The objective is far from being achieved, and «necessarily passes through the training of the figures involved, starting with the RUPs». These are figures characterised not only by technical know-how of the building process and its rules, but also by a culture of standards and conscious responsibility, which can only derive from a design practice continually verified by comparison with reality, and from design actions marked by an experimental method that finds its arguments in taking on the problems of society.
Nanda Kishor Panda, Simon H. Tindemans
Electric vehicles (EVs) play a crucial role in the transition towards sustainable modes of transportation and are thus critical to the energy transition. As their number grows, managing the aggregate power of EV charging is crucial to maintaining grid stability and mitigating congestion. This study analyses more than 500,000 real charging transactions in the Netherlands to explore the challenges and opportunities that increased charging needs and smart charging flexibility present for the energy system. Specifically, it quantifies the collective ability to provide dependable congestion management services according to the specifications of those services in the Netherlands. A data-driven model of charging behaviour is created to explore the implications of delivering dependable congestion management services at various aggregation levels and for various types of service. The probabilistic ability to offer different flexibility products for congestion management, namely redispatch and capacity limitation, is assessed for different categories of charging stations (CS) and dispatch strategies. These probabilities can help EV aggregators, such as charging point operators, make informed decisions about offering congestion mitigation products in line with relevant regulations, and help distribution system operators assess their potential. Further, it is shown how machine learning models can be incorporated to predict day-ahead consumption and then operationally predict redispatch flexibility. The findings demonstrate that the timing of EV arrivals, departures, and connections plays a crucial role in determining the feasibility of product offerings, and that dependable services can generally be delivered using a sufficiently large number of CSs.
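As a hedged illustration of the kind of probabilistic assessment this abstract describes, the sketch below estimates the empirical probability that a pool of charging stations can deliver a requested power reduction (a redispatch-style product) from per-day historical observations. The function name, data, and threshold are invented for illustration and are not from the study.

```python
# Illustrative sketch (hypothetical data): empirical probability that a
# pool of charging stations can deliver a requested redispatch, i.e. a
# reduction of aggregate charging power at a given hour.

def redispatch_probability(daily_connected_power, requested_kw):
    """Fraction of observed days on which the aggregate power of
    connected (and thus curtailable) EVs meets or exceeds the
    requested reduction."""
    feasible = sum(1 for p in daily_connected_power if p >= requested_kw)
    return feasible / len(daily_connected_power)

# Hypothetical aggregate connected charging power (kW) at 18:00
# over ten observed days.
observed = [120, 95, 140, 60, 110, 130, 80, 150, 100, 90]
prob = redispatch_probability(observed, requested_kw=100)  # -> 0.6
```

An aggregator could compare such probabilities across station categories and hours to decide which congestion products are dependable enough to offer.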
Edward G. Anderson, David R. Keith, Jose Lopez
Operations management (OM) in the public policy context is extremely complex, with many mutually interacting factors characterized by feedback loops, delays and nonlinearities, as well as multiple stakeholders pursuing divergent objectives. Prior researchers have called for a systems approach in these contexts, arguing that standard OM methodologies such as mathematical programming and queuing theory often cannot fully address these problems. One such systems approach, system dynamics, has been employed successfully for decades to study OM problems in public policy because it can address such complexity and can integrate disciplines from outside OM such as political science, epidemiology, and ecology. In this paper, we create a roadmap for researchers—both those who are familiar with system dynamics and those who are not—for the expanded use of system dynamics in studying public policy‐related OM problems. We review and organize the relevant system dynamics literature in both traditional operations management venues and public policy venues unfamiliar to OM audiences. We then identify, by topic, a set of interesting open questions and potential system dynamics building blocks for answering them. Leveraging this review, we describe the conditions under which system dynamics is most appropriate. We then identify several overarching methodological and domain gaps for future research. Finally, we build on previous work to extend a process for using system dynamics alongside traditional operations management methodologies. This process separates model building into two sequential phases, consensus‐building models and detailed operational models, and incorporates scenario planning and feedback from implementation outcomes.
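For readers unfamiliar with the method, a minimal system dynamics model is just a stock updated by feedback-controlled flows with delays. The sketch below, with parameter values that are illustrative assumptions rather than anything from the paper, simulates a single inventory stock with a goal-seeking ordering loop and a first-order supply delay, the kind of structure whose oscillations the abstract argues standard OM tools handle poorly.

```python
# Minimal system-dynamics sketch (illustrative parameters): one stock
# (inventory) with a goal-seeking ordering loop and a supply delay.

def simulate_inventory(steps=40, dt=1.0, target=100.0,
                       adjust_time=4.0, supply_delay=3.0):
    inventory, supply_line = 50.0, 0.0   # initial stock levels
    history = []
    for _ in range(steps):
        demand = 10.0                                 # constant outflow
        # Order to cover demand plus a fraction of the inventory gap.
        orders = max(0.0, (target - inventory) / adjust_time + demand)
        arrivals = supply_line / supply_delay         # first-order delay
        supply_line += (orders - arrivals) * dt
        inventory += (arrivals - demand) * dt
        history.append(inventory)
    return history

traj = simulate_inventory()
# The delay makes inventory overshoot the target before settling,
# a behaviour a static optimization model would not reveal.
```

Real system dynamics studies build far larger stock-and-flow networks, but the feedback-plus-delay mechanism shown here is the core building block.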
Mengshi Lu, Zuo‐Jun Max Shen
Over the past two decades, there has been explosive growth in the application of robust optimization in operations management (robust OM), fueled by both significant advances in optimization theory and a volatile business environment that has led to rising concerns about model uncertainty. We review some common modeling frameworks in robust OM, including the representation of uncertainty and the decision‐making criteria, and sources of model uncertainty that have arisen in the literature, such as demand, supply, and preference. We discuss the successes of robust OM in addressing model uncertainty, enriching decision criteria, generating structural results, and facilitating computation. We also discuss several future research opportunities and challenges.
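One classic construction in the robust OM literature the review surveys is decision making under set-based uncertainty. As a hedged sketch with invented parameters, the snippet below solves a min-max-regret newsvendor in which demand is only known to lie in an interval: it picks the order quantity whose worst-case regret over the interval is smallest.

```python
# Illustrative min-max-regret newsvendor (all numbers are assumptions).

def profit(q, d, price=10.0, cost=4.0):
    return price * min(q, d) - cost * q

def regret(q, d, price=10.0, cost=4.0):
    best = (price - cost) * d            # profit if d were known exactly
    return best - profit(q, d, price, cost)

def minmax_regret_order(d_lo, d_hi):
    # For fixed q, regret is decreasing in d below q and increasing
    # above q, so the worst case over [d_lo, d_hi] is at an endpoint.
    return min(range(d_lo, d_hi + 1),
               key=lambda q: max(regret(q, d) for d in (d_lo, d_hi)))

q_star = minmax_regret_order(50, 100)    # -> 80 with these parameters
```

Unlike the purely pessimistic max-min criterion (which would order only the minimum demand of 50 here), the regret criterion balances overage against underage across the uncertainty set, an example of how the choice of decision criterion shapes robust solutions.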
Afarin Akhavan, Seid Mahdi Ebrahimi, Ali Sadri Esfahani
Purpose: This research aims to propose an integrative approach for selecting suppliers for Nasr Niroo Engineering Company in Yazd. The proposed model applies value engineering in a type-2 fuzzy environment. Design/methodology/approach: First, the purchase value has been calculated for each supplier using type-2 fuzzy data and the experts' opinions. Then, by dividing the value by the cost coefficient, the purchase value function of each supplier has been calculated for each product. The purchase value coefficients have been determined according to the opinions of experts and the buyers of equipment. In this research, verbal data has been collected from the experts, and from this data, type-2 fuzzy numerical data has been extracted, calculated, and applied. Type-2 fuzzy numbers were chosen because of their ability to reduce uncertainty in the collected expert opinions. Following the determination of the purchase value function for each supplier, it has been assumed that quantitative information such as order cost, purchase cost, and each supplier's return rate is also effective in selecting each supplier. Findings: In this study, the subject of supplier selection was examined and solved for four separate product categories - iron, fibre, insulators, and fittings - for which there were two, two, four, and four different suppliers, respectively. After examining and solving the model considering the purchase value index as well as additional parameters, including ordering and purchasing costs, each supplier's capacity, and the rate of return and failure of parts, five suppliers were selected: two for iron, and one for each of the other three products. Research limitations/implications: The application of the proposed model appears to be highly difficult due to the numerous fuzzy-number calculations, and obtaining information in this area is one of the limitations of this model. 
Also, type-2 fuzzy numbers could not be used for the mathematical modelling and solution, because no model has yet been presented for the optimal solution of linear programming problems using type-2 fuzzy numbers. Practical implications: The simultaneous use of value engineering techniques or multi-criteria decision-making methods with mathematical models can improve qualitative factors alongside quantitative ones. This method can be used in project portfolio selection problems and in problems with multiple quantitative and qualitative factors as well as different prices and values. Social implications: The model described in this study can help ensure proper supplier selection for several reasons: it utilizes experts' perspectives on the quality factors affecting the selection of suppliers; it transforms the opinions of experts into type-2 fuzzy numbers, a practical method for reducing errors and uncertainty in judgments; it employs value engineering to analyse viewpoints while keeping in mind the value of purchasing from each provider; and it uses a mathematical model that accounts for costs and for the percentage of failed or defective orders, which may be based on prior experience or on the supplier's announcement. The use of type-2 fuzzy numbers is also one way of facing uncertainty in a system. Originality/value: This study concentrates on measuring the value of purchasing from suppliers as a new indicator, instead of using other indicators such as purchasing risk. The simultaneous review of quantitative and qualitative data for the selection of suppliers can also be considered one of the strengths of the model. Comparison with other applied models suggests that utilizing indicators such as purchase value is far more effective than using multi-criteria decision-making procedures alone. 
The use of type-2 fuzzy numbers to lessen the uncertainty associated with using expert opinions and the development of the value calculation method in the area of supplier selection are the two major innovations of this study.
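The abstract does not give the computational details, but the core "value divided by cost coefficient" ranking step could look roughly like the sketch below, in which an interval type-2 fuzzy score is reduced to a crisp number by averaging the centroids of its lower and upper membership functions (a common simplification of full type-reduction). All supplier names, scores, and cost coefficients are hypothetical.

```python
# Hypothetical sketch of a value-per-cost supplier ranking. Each expert
# score is an interval type-2 fuzzy number approximated by a pair of
# triangular membership functions (lower, upper).

def defuzzify_it2(lower_tri, upper_tri):
    """Centroid of a triangular MF is the mean of its three points;
    average the centroids of the lower and upper MFs."""
    c_low = sum(lower_tri) / 3.0
    c_up = sum(upper_tri) / 3.0
    return (c_low + c_up) / 2.0

def purchase_value(score_it2, cost_coefficient):
    """Crisp value of the fuzzy score per unit of cost."""
    return defuzzify_it2(*score_it2) / cost_coefficient

# supplier -> ((lower triangle, upper triangle), cost coefficient)
suppliers = {
    "A": (((5, 6, 7), (4, 6, 8)), 1.2),
    "B": (((6, 7, 8), (5, 7, 9)), 1.5),
}
ranking = sorted(suppliers,
                 key=lambda s: purchase_value(*suppliers[s]),
                 reverse=True)   # -> ["A", "B"]
```

Here supplier B scores higher in raw fuzzy terms, but its larger cost coefficient pushes its value per cost below supplier A's, which is exactly the trade-off the value-engineering step is meant to surface.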
Ginevičius Romualdas
Various indicators are used to determine the level of company diversification. Their adequacy largely depends on the structure of the production programme, whose essential feature is the comparative weight of the main product in the total scope of the company’s work. In this situation, the intensity of the diversification process is reflected by the decrease in the volume of this product as new products are included in the production programme. The adequacy of a diversification indicator can therefore be assessed by comparing changes in the scale of the main product with changes in the indicator’s value: adequacy is higher when the values of the diversification indicator respond more strongly to changes in the volume of the main product. Four indicators of corporate diversification are the most well known and widely used: the Berry index, the entropy measure, Utton’s measure and the DG index. All of them have both strengths and weaknesses, so it is important to determine the production-programme situations in which each diversification indicator is appropriate to use, i.e., in which its adequacy is greatest. The research has established that if the comparative weight of the main product in the total scope of work is greater than 0.5, the adequacy of the entropy measure and the DG index is higher than that of the Berry index and Utton’s measure; if it is lower than 0.5, the other two diversification indicators should be used. The obtained results will help to manage the diversification process more efficiently as a company’s development strategy.
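Two of the four indicators named in the abstract have simple closed forms: the Berry index 1 − Σ pᵢ² and the entropy measure Σ pᵢ ln(1/pᵢ), both computed over product shares pᵢ of the production programme. The sketch below evaluates both for a hypothetical programme whose main product holds a share above 0.5, the case where the abstract reports the entropy measure and DG index to be more adequate.

```python
import math

# Two common diversification indicators computed over product shares
# p_i of the production programme (shares sum to 1).

def berry_index(shares):
    """Berry index: 1 minus the Herfindahl-style sum of squared shares."""
    return 1.0 - sum(p * p for p in shares)

def entropy_measure(shares):
    """Entropy measure: sum of p_i * ln(1/p_i) over positive shares."""
    return sum(p * math.log(1.0 / p) for p in shares if p > 0)

# Hypothetical programme dominated by one product (share > 0.5).
shares = [0.7, 0.2, 0.1]
b = berry_index(shares)       # 1 - (0.49 + 0.04 + 0.01) = 0.46
h = entropy_measure(shares)   # about 0.80
```

Both indicators are 0 for a single-product programme and grow as output spreads across more products, which is why their relative sensitivity to changes in the main product's share drives the adequacy comparison the abstract describes.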