Abstract The Resource-Constrained Project Scheduling Problem (RCPSP) is a general scheduling problem with a wide variety of applications in manufacturing, production planning, project management, and other areas. The RCPSP has been studied since the 1960s and is NP-hard; as a result, solution methods are primarily heuristic. Over the last two decades, the growing interest in metaheuristics within operations research has produced a general shift from pure metaheuristic methods for solving the RCPSP toward hybrid methods that combine different metaheuristic strategies. The purpose of this paper is to survey these hybrid approaches. For the principal hybrid metaheuristics proposed for the RCPSP over the last two decades, a description of the basic principles of each hybrid is given, followed by a comparison of the results of the different hybrids on the well-known PSPLIB data instances. The distinguishing features of the best hybrids are also discussed.
Abstract The involvement of corporate social responsibility (CSR) in sustainable development (SD) has become a popular topic in both the research and business domains. However, the co-themed research is still rather new and has not been fully studied. An in-depth bibliometric analysis using the CiteSpace software is applied to analyze and visualize the knowledge map of CSR research related to SD. The main findings show that CSR involvement in SD is a long-standing but recently flourishing research topic. The top three most influential journals in this area are Corporate Social Responsibility and Environmental Management, Sustainability, and the Journal of Cleaner Production. Porter ME and Carroll AB, among others, are the most impactful authors. The co-author network is fragmented, while cross-national co-operations occur in groups. Eleven clusters are identified as receiving high attention, among which "stakeholder" and "NGO" have remained active to the present. Thirteen burst terms over 15 years (2005–2019) indicate the evolution of the research frontiers in this field, from the earliest, "sustainability", to "strategy" and "performance", then "stakeholder", "developing country", "disclosure", and "supply chain management", with "climate change" being the newest and strongest. Four stages of this evolution can be identified: an initial phase (1997–2004), a debating phase (2005–2009), a rapid development phase (2010–2013), and a research specialization phase (2014–2019). Finally, contributions, limitations, and further research directions are discussed.
Since the launch of its first satellite in 2009, Tohoku University has continuously developed and operated Earth observation satellites and engineering demonstration satellites in the 50 cm class and the CubeSat class (up to 3U). The 50 cm-class satellite launched in 2021 enabled efficient operations through cloud-based management functions for both the satellite and the ground stations, including automatic command generation. By 2022, up to eight operational satellites were being managed simultaneously on a daily basis using three ground stations (Sendai, Hakodate, and Sweden). This paper presents the operational achievements to date and introduces the system that supports efficient satellite operations.
Transaction cost economics (TCE) is one of the most widely referenced organization theories in operations and supply chain management research. Even though TCE is a broadly applicable theory of governance, one of its specific topics of interest—the make‐or‐buy decision—readily aligns with some of the central research questions on how firms manage supply chains. However, both general management and operations management researchers sometimes misunderstand and misapply TCE's aims, assumptions, and logic. A common mistake is to read TCE as a theory of competence or of power. While TCE relates to both, TCE is essentially a theory of efficient governance of transactions in particular and exchange relationships in general. Our purpose in this study is to review the intellectual and theoretical foundations of TCE, its primary aims, and its applicability as a theory of supply chain efficiency. To this end, we discover much common ground between TCE and research in operations and supply chain management. We close by discussing implications for future research, focusing on how operations and supply chain management researchers could contribute to broader academic conversations on management and governance.
<p style="text-align: left;"><span style="font-family: times new roman, times, serif; font-size: 12pt;">In this research, an optimal supply chain network for the collection and recycling of urban waste is presented using a bi-objective mixed-integer linear programming method, taking into account source separation and the uncertainty of citizens' per capita waste generation. Because of the uncertainty in the problem parameters, the two-stage stochastic programming method is used to model the problem. The objective functions include an economic function that minimizes investment costs and a social objective function that maximizes the amount of recycling. In order to solve the problem exactly on a large scale, the Lagrangian relaxation method is used. To validate the model and confirm its effectiveness, it was implemented on a case study in the city of Karaj. According to the results, increasing the amount of recycling in the waste supply chain network requires greater infrastructural and operational investment. By increasing recycling, the harmful environmental and destructive effects of burying and burning waste are reduced. Lagrangian relaxation proved a suitable method for reducing solution time: in this research, it solved large-scale problems with appropriate accuracy and in less time than the commercial CPLEX solver.</span></p>
<p style="text-align: left;"><span style="font-family: times new roman, times, serif; font-size: 12pt;"> </span></p>
<p style="text-align: left;"><span style="font-family: times new roman, times, serif; font-size: 12pt;"><strong>Key Words:</strong> two-stage stochastic programming, Lagrangian relaxation, linear programming, supply chain, waste management </span></p>
<p style="text-align: left;"><span style="font-family: times new roman, times, serif; font-size: 12pt;"> </span></p>
<p style="text-align: left;"><span style="font-size: 12pt;"><strong><span style="font-family: times new roman, times, serif;">1- Introduction</span></strong></span></p>
<p style="text-align: left;"><span style="font-family: times new roman, times, serif; font-size: 12pt;">In Iran, 50,000 tons of waste are produced daily, of which only about 10% is recycled. In the city of Tehran, approximately 2% of daily urban waste production is separated at the source. Collecting and disposing of urban waste is very expensive owing to the high investment costs of the waste collection and transportation fleet and the significant operational costs involved. Therefore, even small and partial reductions in waste management operating costs lead to large savings for municipalities. On average, between 60 and 80 percent of urban solid waste management costs relate to waste collection and transportation. While worldwide an average of 70% of produced waste is recycled, this figure optimistically reaches about 20% in Iran, meaning that about 16 million tons of waste are buried in the ground without being recycled. One of the important reasons for the low rate of waste recycling in Iran is the lack of source separation for the various types of waste produced in the country. The purpose of this research is to reduce the costs of urban waste management through separation of waste at the source and the creation of dedicated hubs for each type of separated waste.</span></p>
<p style="text-align: left;"><span style="font-size: 12pt;"><strong><span style="font-family: times new roman, times, serif;">2- Literature review</span></strong></span></p>
<p style="text-align: left;"><span style="font-family: times new roman, times, serif; font-size: 12pt;">Among all municipal solid waste management strategies, waste recycling has received more attention than other options because, in addition to protecting the environment and human health, it supports economic growth. Given the need for investment in collection and disposal facilities along with high operating costs, waste collection, recycling, and disposal operations are very costly. Therefore, a slight improvement in this process yields a significant reduction in the costs of municipalities (Babaei et al., 2017). Solid waste management refers to a coherent and systematic set of programs and regulations governing the production, collection, transportation, separation, recycling, and burial of waste, based on the principles of public health, economy, and conservation of biological resources (Akbarpour Shirazi et al., 2015). According to the conducted research, urban solid waste management can be treated as a supply chain network design problem (Mohammadi et al., 2019). This network includes facilities such as waste collection stations, transfer stations, and recycling and disposal facilities. In the household waste collection process, waste collected from local collection stations is first sent to transfer facilities, where it is unloaded from municipal collection trucks and loaded into larger trucks to be transported to landfills in bulk (Habibi et al., 2017). To design an efficient supply chain network for urban waste collection, mathematical programming models can be used to improve the performance of this network by optimizing facility locations and their allocations, making such models valuable tools for improving overall supply chain efficiency (Habibi et al., 2017). 
Since the parameters and information required for designing the waste supply chain network are not always certain, designing the supply chain deterministically decreases its practical efficiency. Therefore, considering uncertainty in the model design is inevitable (Rahimi & Qadavati, 2017). The findings from previous research indicate that, in the majority of studies, waste separation at the source and the establishment of hub centers for each type of separated waste have not been taken into consideration; therefore, in the present study, both of these issues are incorporated into the model design.</span></p>
<p style="text-align: left;"><span style="font-size: 12pt;"><strong><span style="font-family: times new roman, times, serif;">3- Methodology</span></strong></span></p>
<p style="text-align: left;"><span style="font-family: times new roman, times, serif; font-size: 12pt;">Building on the points above, this research designs a multi-level supply chain network for urban waste collection and recycling, focusing on source separation and uncertainty in citizens' per capita waste generation. The network includes urban collection points (tanks for segregated waste), transfer centers or hubs dedicated to each type of separated waste, recycling centers, and disposal centers (landfill and incineration facilities). The material flow is arranged so that waste is separated at the source by citizens and placed in the tank specific to each waste type. Collection trucks then transport the waste to the hub or transfer center for that waste type, from which larger trucks carry it to recycling centers and disposal centers. To design the network, a mixed-integer programming model with two objectives, economic and social, is formulated. The first objective function minimizes initial investment and operating costs, while the second, social objective function maximizes urban waste recycling. To account for the uncertainty in citizens' per capita waste generation, the two-stage stochastic programming method is used. To handle the two objective functions, the epsilon-constraint method is applied, converting the bi-objective model into a single-objective one. A case study in the city of Karaj is used to examine the efficiency of the designed model. The solution method used to solve the model in large dimensions is Lagrangian relaxation, which belongs to the class of exact solution methods.</span></p>
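The epsilon-constraint idea described above can be sketched on a tiny, purely hypothetical selection problem (the plan names, costs, and recycling volumes below are invented for illustration and are not the paper's data): the recycling objective is turned into a constraint, cost is minimized for a sweep of recycling thresholds, and the surviving solutions trace the Pareto frontier.

```python
from math import inf

# Hypothetical candidate network plans: investment cost (to minimize)
# and recycled volume (to maximize). All numbers are illustrative.
plans = {
    "A": {"cost": 100, "recycled": 30},
    "B": {"cost": 160, "recycled": 55},
    "C": {"cost": 240, "recycled": 70},
    "D": {"cost": 300, "recycled": 72},
}

def epsilon_constraint(plans, epsilons):
    """Keep cost as the single objective; turn recycling into the
    constraint 'recycled >= eps' and sweep eps to trace the frontier."""
    frontier = []
    for eps in epsilons:
        best_name, best_cost = None, inf
        for name, p in plans.items():
            if p["recycled"] >= eps and p["cost"] < best_cost:
                best_name, best_cost = name, p["cost"]
        if best_name is not None:
            point = (best_cost, plans[best_name]["recycled"])
            if point not in frontier:  # drop duplicate Pareto points
                frontier.append(point)
    return frontier

pareto = epsilon_constraint(plans, epsilons=[30, 55, 70])
# pareto -> [(100, 30), (160, 55), (240, 70)]
```

In the paper's actual model each epsilon value yields a single-objective mixed-integer program rather than a table lookup, but the scalarization step is the same.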
<p style="text-align: left;"><span style="font-size: 12pt;"><strong><span style="font-family: times new roman, times, serif;">4- Results</span></strong></span></p>
<p style="text-align: left;"><span style="font-family: times new roman, times, serif; font-size: 12pt;">In the present study, with the aim of addressing the existing research gap, a mixed-integer linear programming model was developed for designing the waste collection and recycling network, focusing on source separation and the establishment of hubs for each type of separated waste.</span></p>
<p style="text-align: left;"><span style="font-family: times new roman, times, serif; font-size: 12pt;">In this research, in addition to recycling, the concept of a separation hub was considered in the design of the supply chain. To validate the model, a case study was conducted in Karaj and its results were presented. Collecting suitable data was one of the difficulties of the study, owing to the large amount of data required and the difficulty of accessing some statistics. According to the results, increasing the amount of recycling in the waste supply chain network requires greater infrastructural and operational investment. By increasing recycling, the destructive environmental effects of burying and burning waste are reduced. Lagrangian relaxation can serve as a suitable solution method for reducing solution time on large-scale instances; in this research, it solved such instances with appropriate accuracy and in less time than the CPLEX solver. The innovations of this research therefore include the development of a waste collection and recycling supply chain model under uncertainty that considers separation hubs, the use of Lagrangian relaxation to solve the model, and the case study of the city of Karaj. </span></p>
<p style="text-align: left;"><span style="font-size: 12pt;"><strong><span style="font-family: times new roman, times, serif;">5- Discussion</span></strong></span></p>
<p style="text-align: left;"><span style="font-family: times new roman, times, serif; font-size: 12pt;">The purpose of this study was to design a multi-objective mathematical model to manage municipal and hospital waste. After reviewing the related literature, the gap in existing studies was identified: the studies surveyed did not include a specialized hub for each type of waste, and only a few considered recycling in the design of the supply chain. Also, to the researchers' best knowledge, no article was found in which exact solution methods were used to reduce the difficulty of solving the problem. Based on the collection and recycling supply chain network presented in this research and on the literature review, several issues can be the subject of future research: routing the waste collection and transportation trucks in the network; production and storage planning for recycling centers; combining heuristic and metaheuristic solution methods with exact methods such as Lagrangian relaxation; considering environmental objective functions, such as reducing pollution from waste transportation and recycling; and considering uncertainty in facility capacities.</span></p>
Marta Sidorkiewicz, Karolina Królikowska, Berenika Dyczek et al.
This study examines the role of Artificial Intelligence (AI) in enhancing sustainability and efficiency within the wine industry. It focuses on AI-driven intelligent management in viticulture, wine production, and enotourism. As the wine industry faces environmental and economic challenges, AI offers innovative solutions to optimize resource use, reduce environmental impact, and improve customer engagement. Understanding AI's potential in sustainable winemaking is crucial for fostering responsible and efficient industry practices. The research is based on a questionnaire survey conducted among Polish winemakers, combined with a comprehensive analysis of AI methods applicable to viticulture, production, and tourism. Key AI technologies, including predictive analytics, machine learning, and computer vision, are explored. The findings indicate that AI enhances vineyard monitoring, optimizes irrigation, and streamlines production processes, contributing to sustainable resource management. In enotourism, AI-powered chatbots, recommendation systems, and virtual tastings personalize consumer experiences. The study highlights AI's impact on economic, environmental, and social sustainability, supporting local wine enterprises and cultural heritage.
Keywords: Artificial Intelligence, Sustainable Development, AI-Driven Management, Viticulture, Wine Production, Enotourism, Wine Enterprises, Local Communities
Data centers (DCs), as mission-critical infrastructures, are pivotal in powering the growth of artificial intelligence (AI) and the digital economy. The evolution from Internet DCs to AI DCs has introduced new challenges in operating and managing data centers for improved business resilience and reduced total cost of ownership. As a result, new paradigms beyond the traditional best-practice approaches are needed for future data centers. In this research, we propose and develop a novel Physical AI (PhyAI) framework for advancing DC operations and management. Our system leverages the emerging capabilities of state-of-the-art industrial products and our in-house research and development. Specifically, it comprises three core modules: 1) an industry-grade in-house simulation engine that simulates DC operations with high accuracy, 2) an AI engine built upon NVIDIA PhysicsNemo for training and evaluating physics-informed machine learning (PIML) models, and 3) a digital twin platform built upon NVIDIA Omniverse for our proposed 5-tier digital twin framework. This system presents a scalable and adaptable solution to digitalize, optimize, and automate future data center operations and management by enabling real-time digital twins for future data centers. To illustrate its effectiveness, we present a compelling case study on building a surrogate model that predicts the thermal and airflow profiles of a large-scale DC in real time. Our results demonstrate superior performance over traditional, time-consuming Computational Fluid Dynamics/Heat Transfer (CFD/HT) simulation, with a median absolute temperature prediction error of 0.18 °C. This emerging approach opens the door to several research directions for advancing Physical AI in future DC operations.
With the evolution of process approaches within organizations, the increasing importance of quality management systems (such as ISO 9001), and the recent introduction of ISO 30401 for knowledge management, we examine how these different elements converge within the framework of an Integrated Management System. The article specifically demonstrates how an ISO 30401-compliant knowledge management system can be implemented by deploying the mechanisms of the SECI model through the steps of the PDCA cycle as applied in the processes of the integrated management system.
Anqi Qu, Nur Aminatulmimi Ismail, Jose G. Delgado-Linares et al.
Gas hydrate formation poses a significant challenge in offshore oil and gas production, particularly during cold restarts after extended shut-ins, which can lead to pipeline blockages. Although steady-state models have traditionally been used to predict hydrate formation under continuous production conditions, these models are often inadequate for transient operations due to issues like near-zero fluid flow shear affecting the viscosity calculations of hydrate slurries. This study introduces novel conceptual models for dispersed water-in-crude-oil systems specifically designed for cold restart scenarios. The models are supported by direct observations and various experimental approaches, including bottle tests, rheometer measurements, micromechanical force apparatus, and rocking cell studies, which elucidate the underlying mechanisms of hydrate formation. Additionally, this work introduces a modeling approach to represent these conceptual pictures, incorporating particle settling and yield stress, to determine whether the system will plug upon restart. Validation is provided through transient large-scale flowloop tests, confirming the plugging mechanisms outlined. This comprehensive approach offers insights into conditions that may safely prevent or potentially lead to blockages in the fully dispersed system during field restarts, thereby enhancing the understanding and management of gas hydrate risks in offshore oil and gas operations.
Andrea Gritti, Leonardo Michele Carluccio, Lorenzo Pellegrini
The management of dust hazards within industrial environments remains a critical concern, as tragically demonstrated by catastrophic events in recent history. Dust Hazard Analysis (DHA) is a risk assessment technique used for identifying, managing, and mitigating the risks related to the handling, production, and storage of combustible dusts. This study explores the efficacy of employing a semi-quantitative approach within the framework of DHA, leveraging a Risk Matrix to assess and define the severity and likelihood of potential hazardous events associated with combustible dust.
The semi-quantitative method presented herein integrates qualitative expert judgments with quantitative data, fostering a comprehensive evaluation of various scenarios involving dust-related hazards. Central to this approach is the use of a Risk Matrix, in which severity and likelihood classes are intersected to generate risk levels for the identified dust-related scenarios. This allows the prioritization of mitigation strategies, focusing resources on high-risk scenarios while acknowledging lower-risk occurrences. Furthermore, the use of a common Risk Matrix facilitates the decision-making process, allowing the same benchmark to be applied to the recommendations and actions emerging from different risk assessment techniques (e.g., HazOp).
The study underscores the value of a semi-quantitative approach in DHA, highlighting both the potential and the limitations of the current model. It offers a structured methodology that aids stakeholders in decision-making concerning risk mitigation and control measures, ensuring the safety and integrity of industrial operations.
Chemical engineering, Computer engineering. Computer hardware
Objective To analyze the epidemiological characteristics of foodborne disease outbreaks caused by takeaway food in China's Mainland from 2010 to 2020, and to put forward relevant regulatory suggestions. Methods Through the National Foodborne Disease Outbreak Monitoring System, data on foodborne disease outbreaks caused by takeaway food in China's Mainland from 2010 to 2020 were collected, and a descriptive epidemiological analysis was performed. Results A total of 549 foodborne disease outbreaks caused by takeaway food were reported in China's Mainland (except the Tibet Autonomous Region) from 2010 to 2020, resulting in 9 285 illnesses and 2 deaths. The largest numbers of outbreaks and illnesses occurred in the third quarter, accounting for 41.53% and 44.58% of the totals respectively. Excluding unknown pathogenic factors, microbial pathogenic factors caused the highest numbers of outbreaks and illnesses, accounting for 39.16% and 60.26% of the totals respectively. Excluding unknown suspected foods, mixed food and multiple foods caused the higher numbers, accounting for 21.86% and 15.12% of outbreaks respectively, and 20.58% and 13.10% of illnesses respectively. Excluding unknown food sources, the top 3 food sources were school canteens, collective dining delivery units/central kitchens, and fast food restaurants, which accounted for 20.04%, 15.66%, and 15.48% of outbreaks respectively, and 35.30%, 17.52%, and 10.57% of illnesses respectively. Excluding multiple and unknown factors, improper storage caused the highest share of outbreaks, 8.74%, and improper processing accounted for 7.74% of illnesses. Conclusion Microbial growth and reproduction due to improper storage and processing is the major cause of foodborne disease outbreaks caused by takeaway food.
It is suggested that the food safety supervision and administration departments strengthen whole-process supervision and management of takeaway food, and establish and enforce good hygiene practices for catering services across raw materials, production, transportation, distribution, and other stages, so as to effectively prevent and control the occurrence of foodborne diseases.
Food processing and manufacture, Nutrition. Foods and food supply
Tengku Nur Azila binti Raja Mamat, Nur Ain Arina binti Johan, Fatin Nurina binti Mohamad Khair et al.
Production planning and control (PPC) is a branch of knowledge that discusses the available concepts and methods for operations management. Knowledge and understanding of PPC concepts, together with proper implementation, can help an organization fulfil customer demand at minimum cost. Large organizations are exposed to PPC concepts thanks to the qualifications of their workers and the availability of training. However, among small and medium enterprises (SMEs), awareness and knowledge of PPC concepts are lacking. This study was carried out in an SME, a bakery company located in Selangor, Malaysia, to determine how PPC concepts can be applied in SMEs. The case study found several problems faced by the company, ABC Bakery, relating to supplier selection and workforce planning. To overcome these problems, two PPC techniques are introduced. First, the quantity discount model is proposed to solve the supplier selection issue. Second, the aggregate planning concept is utilized to overcome the workforce issue. From the calculations and analysis conducted, it is found that ABC Bakery can make a sound decision in the supplier selection process when the quantity discount model is implemented, with selection based on the minimum total cost incurred. Next, by applying aggregate planning, specifically the varying-workforce-size and influencing-demand strategies, the required workforce size can be estimated and demand can be further increased in the future. In conclusion, this study shows that PPC concepts are suitable for implementation in SMEs, specifically to solve operations problems, and can help SMEs increase productivity and sales in the long term.
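The quantity discount logic mentioned above can be sketched as an all-units discount EOQ calculation (a standard textbook formulation; the demand, cost, and price-break figures below are invented for illustration and are not ABC Bakery's data): compute each tier's economic order quantity, keep the feasible ones plus the price-break quantities as candidates, and pick the candidate with the lowest total annual cost.

```python
from math import sqrt

# Illustrative inputs: annual demand D, ordering cost S per order, and
# holding-cost rate h (fraction of unit price per year).
D, S, h = 1000, 50.0, 0.2
# Price tiers as (minimum order quantity, unit price) - all-units discount.
tiers = [(1, 5.0), (200, 4.8), (500, 4.6)]

def tier_price(q):
    """Unit price that applies to an order of size q."""
    price = tiers[0][1]
    for min_q, p in tiers:
        if q >= min_q:
            price = p
    return price

def total_cost(q):
    # Annual purchase + ordering + holding cost for order size q.
    p = tier_price(q)
    return D * p + (D / q) * S + (q / 2) * h * p

# Candidates: every price-break quantity, plus each tier's EOQ when it
# actually falls inside that tier's quantity range.
candidates = [min_q for min_q, _ in tiers]
for i, (min_q, p) in enumerate(tiers):
    eoq = sqrt(2 * D * S / (h * p))
    upper = tiers[i + 1][0] if i + 1 < len(tiers) else float("inf")
    if min_q <= eoq < upper:
        candidates.append(eoq)

best_q = min(candidates, key=total_cost)
best_cost = total_cost(best_q)
# best_q -> 500 (ordering at the deepest discount wins), best_cost -> 4930.0
```

Here the cheapest policy is to order at the 500-unit price break even though the unconstrained EOQ is smaller, which is exactly the trade-off the model resolves: a larger order raises holding cost but captures the discount on every unit.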
Production management. Operations management, Business
The article presents the problem of determining the mass efficiency of a rotary feeder as a function of the device's design parameters, such as outer diameter, number of blades, and rotor rotational speed. Existing theoretical methods of calculating feeder efficiency are reviewed, and a new method of determining the device's operating parameters is proposed. For this purpose, the numerical Discrete Element Method (DEM) was used to simulate the transport of limestone powder in a rotary feeder across various design variants. The tests showed that the above design parameters affect the instantaneous efficiency of the feeder and thus the distribution of the dosed material during operation. Depending on the design solution, the simulation results also provided information on the fill factor of the feeders. The study showed the significant potential of DEM simulation in the design of rotary feeders intended for dosing bulk materials.
Synthetic assets are decentralized finance (DeFi) analogues of derivatives in the traditional finance (TradFi) world: financial arrangements that derive value from, and are directly pegged to, fluctuations in the value of an underlying asset (e.g., futures and options). Synthetic assets occupy a unique niche, serving to facilitate currency exchange, giving traders a means to speculate on the value of crypto assets without directly holding them, and powering more complex financial tools such as yield optimizers and portfolio management suites. Unfortunately, the academic literature on this topic is highly disparate and struggles to keep up with rapid changes in the space. We present the first Systematization of Knowledge (SoK) in this area, focusing on presenting the key mechanisms, protocols, and issues in an accessible fashion to highlight risks for participants as well as areas of research interest. This paper takes a broad perspective in establishing a general framework for synthetic assets, from the ideological origins of crypto to the legal barriers facing firms in this space, encapsulating the basic mechanisms underpinning derivatives markets as well as presenting data-driven analyses of major protocols.
The success of any business depends on how well it is able to satisfy customer demand, while remaining financially viable. Globalisation and the growth of e-commerce have resulted in retail businesses having to manage increasing numbers of products. As the number of products sold increases, so does the complexity of the corresponding inventory management problem. High-quality decision making, in this regard, necessitates the utilisation of computerised decision support systems capable of accommodating the complexity presented by a large number of products. Existing frameworks for decision support in inventory management are mainly focused on a single facet of the inventory management problem; generic, integrated frameworks seem to be absent from the literature. In this paper, a holistic framework design is proposed which integrates all of the major components expected to form part of a generic framework for inventory management. The objective of the framework is to provide decision support in respect of various inventory management operations, such as product segmentation, demand forecasting and determining the sizes and timings of replenishment orders, in pursuit of a desirable balance between the conflicting objectives in inventory management.
Traditional portfolio management methods can incorporate specific investor preferences but rely on accurate forecasts of asset returns and covariances. Reinforcement learning (RL) methods do not rely on these explicit forecasts and are better suited to multi-stage decision processes. To address limitations of prior research, experiments were conducted on three markets in different economies with different overall trends. By incorporating specific investor preferences into our RL models' reward functions, a more comprehensive comparison could be made with traditional methods in risk-return space. Transaction costs were also modelled more realistically by including the nonlinear changes introduced by market volatility and trading volume. The results of this study suggest that there can be an advantage to using RL methods compared to traditional convex mean-variance optimisation methods under certain market conditions. Our RL models could significantly outperform traditional single-period optimisation (SPO) and multi-period optimisation (MPO) models in upward trending markets, but only up to specific risk limits. In sideways trending markets, the performance of SPO and MPO models can be closely matched by our RL models for the majority of the excess risk range tested. The specific market conditions under which these models could outperform each other highlight the importance of a more comprehensive comparison of Pareto optimal frontiers in risk-return space. These frontiers give investors a more granular view of which models might provide better performance for their specific risk tolerance or return targets.
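The traditional single-period mean-variance baseline referred to above can be sketched, in its simplest unconstrained form, for two assets (all numbers are invented for illustration and are not the study's data): choose weights w maximising mu'w − gamma·w'Σw, whose closed-form solution is w* = Σ⁻¹mu / (2·gamma).

```python
# Illustrative two-asset single-period mean-variance optimisation (SPO).
mu = [0.08, 0.05]                      # expected returns (assumed)
sigma = [[0.04, 0.01], [0.01, 0.02]]   # return covariance matrix (assumed)
gamma = 2.0                            # risk-aversion coefficient

# Invert the 2x2 covariance matrix by hand.
det = sigma[0][0] * sigma[1][1] - sigma[0][1] * sigma[1][0]
inv = [[ sigma[1][1] / det, -sigma[0][1] / det],
       [-sigma[1][0] / det,  sigma[0][0] / det]]

# Closed-form unconstrained optimum: w* = inv(Sigma) @ mu / (2 * gamma).
w = [(inv[0][0] * mu[0] + inv[0][1] * mu[1]) / (2 * gamma),
     (inv[1][0] * mu[0] + inv[1][1] * mu[1]) / (2 * gamma)]
# w ≈ [0.3929, 0.4286]; the remaining ≈ 0.1786 of the budget stays in cash.
```

The RL approaches compared in the study avoid exactly the inputs this sketch depends on: the point estimates mu and Sigma, which must be forecast and whose errors propagate directly into the weights.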
Subaveerapandiyan A, Ammaji Rajitha, Mohd Amin Dar et al.
E-resources have become indispensable: technology has advanced, and libraries are adopting it, although adoption poses many challenges for library professionals, who must update their skills whenever something new arrives. This study investigated e-resource management and its associated issues from the perspective of Indian library professionals. Data were collected from various academic institute and university libraries in India, including institutes of national importance and central, state, deemed, and private universities. The study finds that the majority of the libraries subscribe to e-journals and e-books, and that LIS professionals face administration-related challenges. The t-test results revealed that a lack of professional skills underlies the issues and challenges of library management.
Context: The importance of dynamic variability management in Dynamic Software Product Lines. Objective: Define a protocol for conducting a systematic mapping study to summarize and synthesize evidence on dynamic variability management for Dynamic Software Product Lines in self-adaptive systems. Method: Application of the protocol to conduct a systematic mapping study according to the guidelines of K. Petersen. Results: A validated protocol for conducting a systematic mapping study. Conclusions: First findings show that it is necessary to envision new ways to manage variability in dynamic software product lines.