Tyson R. Browning
Results for "Industrial engineering. Management engineering"
Showing 20 of ~11,151,277 results · from CrossRef, DOAJ, arXiv, Semantic Scholar
N. Leveson
M. Gruninger
Zhou Wang, A. Bovik
This book is about objective image quality assessment, where the aim is to provide computational models that can automatically predict perceptual image quality. The early years of the 21st century have witnessed a tremendous growth in the use of digital images as a means for representing and communicating information. A considerable percentage of this literature is devoted to methods for improving the appearance of images, or for maintaining the appearance of images that are processed. Nevertheless, the quality of digital images, processed or otherwise, is rarely perfect. Images are subject to distortions during acquisition, compression, transmission, processing, and reproduction. To maintain, control, and enhance the quality of images, it is important for image acquisition, management, communication, and processing systems to be able to identify and quantify image quality degradations. The goals of this book are as follows: a) to introduce the fundamentals of image quality assessment and to explain the relevant engineering problems; b) to give a broad treatment of the current state of the art in image quality assessment, by describing leading algorithms that address these engineering problems; and c) to provide new directions for future research, by introducing recent models and paradigms that significantly differ from those used in the past. The book is written to be accessible to university students curious about the state of the art of image quality assessment, expert industrial R&D engineers seeking to implement image/video quality assessment systems for specific applications, and academic theorists interested in developing new algorithms for image quality assessment or using existing algorithms to design or optimize other image processing applications.
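As a minimal example of the kind of objective metric such computational models start from (an illustration, not taken from the book), peak signal-to-noise ratio scores a distorted image against its reference:

```python
import math

def psnr(reference, distorted, max_val=255.0):
    """Peak signal-to-noise ratio (dB) between two equal-size pixel lists."""
    mse = sum((r - d) ** 2 for r, d in zip(reference, distorted)) / len(reference)
    if mse == 0:
        return float("inf")  # identical images: no distortion
    return 10.0 * math.log10(max_val ** 2 / mse)

ref = [128.0] * 64              # flat 8x8 reference patch
noisy = [p + 4.0 for p in ref]  # uniform error of 4 gray levels -> MSE = 16
print(round(psnr(ref, noisy), 2))  # -> 36.09
```

PSNR is purely error-based; perceptual models of the kind the book surveys (e.g. SSIM) instead compare structural features of the two images.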
G. J. Hahn
The Fourth Industrial Revolution – also known as Industry 4.0 (i4.0) – comprises the digitalisation of the industrial sector. This paper uses the theoretical lens of supply chain innovation (SCI) to investigate the implications of i4.0 on supply chain management. For these purposes, the method of structured content analysis is applied to more than 200 use cases of i4.0-enabled SCI introduced by both established and startup companies. i4.0-enabled SCI manifests along three dimensions: process, technology, and business architecture. The key findings of this study can be summarised as follows: first, i4.0-enabled SCI extends the initial focus on productivity improvements in SC processes towards scalability and flexibility. Second, extant i4.0 solutions rely mostly on analytics and smart things while omitting smart people technology and the human-centric approach associated with the i4.0 paradigm. Third, established companies adopt i4.0 merely to sustain their existing business architectures while startup companies radically change their operating models, relying heavily on data analytics and the platform economy. Consequently, established companies pursue a problem-driven, engineering-based approach to SCI while startup companies follow an ‘asset-light’, business-driven approach. Lastly, there are two distinct approaches to digitalising operational SC processes: platform-based crowdsourcing of standard processes and on-demand provision of customised services.
Sofia Tapias Montana, Ronnie de Souza Santos
This paper investigates how software professionals perceive the economic implications of diversity in software engineering teams. Motivated by a gap in software engineering research, which has largely emphasized socio-technical and process-related outcomes, we adopted a qualitative interview approach to capture practitioners' reasoning about diversity in relation to economic and market-oriented considerations. Based on interviews with ten software professionals, our analysis indicates that diversity is perceived as economically relevant through its associations with cost reduction and containment, revenue generation, time to market, process efficiency, innovation, and market alignment. Participants typically grounded these perceptions in concrete project experiences rather than abstract economic reasoning, framing diversity as a practical resource that supports project delivery, competitiveness, and organizational viability. Our findings provide preliminary empirical insights into how economic aspects of diversity are understood in software engineering practice.
Feng Zhou, Hao Hu, Fengjie Wang et al.
Weather forecast ensembles are commonly used to assess the uncertainty and confidence of weather predictions. Conventional methods in meteorology often employ ensemble mean and standard deviation plots, as well as spaghetti plots, to visualize ensemble data. However, these methods suffer from significant information loss and visual clutter. In this paper, we propose a new approach for uncertainty visualization of weather forecast ensembles, including isovalue selection based on information loss and hierarchical visualization that integrates visual abstraction and detail preservation. Our approach uses non-uniform downsampling to select key-isovalues and provides an interactive visualization method based on hierarchical clustering. Firstly, we sample key-isovalues by contour probability similarity and determine the optimal sampling number using an information loss curve. Then, the corresponding isocontours are presented to guide users in selecting key-isovalues. Once the isovalue is chosen, we perform agglomerative hierarchical clustering on the isocontours based on signed distance fields and generate visual abstractions for each isocontour cluster to avoid visual clutter. We link a bubble tree to the visual abstractions to explore the details of isocontour clusters at different levels. We demonstrate the utility of our approach through two case studies with meteorological experts on real-world data. We further validate its effectiveness by quantitatively assessing information loss and visual clutter. Additionally, we confirm its usability through expert evaluation.
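The clustering step described above can be sketched in miniature (a toy average-linkage agglomeration, assuming each isocontour is represented by a flattened signed-distance field; the paper's actual pipeline is considerably richer):

```python
import math

def l2(a, b):
    """Euclidean distance between two flattened signed-distance fields."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def agglomerate(sdfs, k):
    """Average-linkage agglomerative clustering of SDFs down to k clusters."""
    clusters = [[i] for i in range(len(sdfs))]
    while len(clusters) > k:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # average pairwise distance between the two clusters
                d = sum(l2(sdfs[a], sdfs[b])
                        for a in clusters[i] for b in clusters[j])
                d /= len(clusters[i]) * len(clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] = clusters[i] + clusters[j]  # merge the closest pair
        del clusters[j]
    return clusters

# Four tiny "SDFs": two near zero, two near ten -> expect two clusters
sdfs = [[0.0, 0.1], [0.2, 0.0], [10.0, 9.9], [9.8, 10.1]]
print(sorted(sorted(c) for c in agglomerate(sdfs, 2)))  # -> [[0, 1], [2, 3]]
```

Each resulting cluster would then receive a visual abstraction, with the bubble tree exposing the merge hierarchy for drill-down.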
Christoph Treude, Margaret-Anne Storey
The adoption of large language models (LLMs) and autonomous agents in software engineering marks an enduring paradigm shift. These systems create new opportunities for tool design, workflow orchestration, and empirical observation, while fundamentally reshaping the roles of developers and the artifacts they produce. Although traditional empirical methods remain central to software engineering research, the rapid evolution of AI introduces new data modalities, alters causal assumptions, and challenges foundational constructs such as "developer", "artifact", and "interaction". As humans and AI agents increasingly co-create, the boundaries between social and technical actors blur, and the reproducibility of findings becomes contingent on model updates and prompt contexts. This vision paper examines how the integration of LLMs into software engineering disrupts established research paradigms. We discuss how it transforms the phenomena we study, the methods and theories we rely on, the data we analyze, and the threats to validity that arise in dynamic AI-mediated environments. Our aim is to help the empirical software engineering community adapt its questions, instruments, and validation standards to a future in which AI systems are not merely tools, but active collaborators shaping software engineering and its study.
Reza Rezaeian Farashahi, Mojtaba Fadavi, Soheila Sabbaghian
An addition law for an elliptic curve is complete if it is defined for all possible pairs of input points on the elliptic curve. In Elliptic Curve Cryptography (ECC), a complete addition law provides a natural protection against side-channel attacks based on Simple Power Analysis (SPA). Montgomery curves are a specific family of elliptic curves that play a crucial role in ECC because of their well-known Montgomery ladder, particularly in the Elliptic Curve Diffie-Hellman Key Exchange (ECDHKE) protocol and the Elliptic Curve Factorization Method (ECM). However, the complete addition law for Montgomery curves, as stated in the literature, has a computational cost of 14M + 2D, where M and D denote the costs of a field multiplication and a field multiplication by a constant, respectively. The lack of a competitive complete addition law has led implementers towards twisted Edwards curves, which offer a complete addition law at a lower cost of 8M + 1D for appropriately chosen curve constants. In this paper, we introduce extended Montgomery coordinates as a novel representation for points on Montgomery curves. This coordinate system enables us to define birational multiplication-free maps between the extended twisted Edwards coordinates and extended Montgomery coordinates. Using this map, we can transfer the complete addition laws from twisted Edwards curves to Montgomery curves without incurring additional multiplications or squarings. In addition, we employ a technique known as scaling to refine the addition laws for twisted Edwards curves, which results in: i) complete addition laws with costs varying between 8M + 1D and 9M + 1D for a broader range of twisted Edwards curves; ii) incomplete addition laws for twisted Edwards curves with a cost of 8M. Consequently, by leveraging our birational multiplication-free maps, we present complete addition laws for Montgomery curves with a cost of 8M + 1D.
This represents a significant improvement in the complete addition law for Montgomery curves, reducing the computational cost by 6M + 1D, and makes Montgomery curves a more attractive option for applications where an efficient complete addition law is essential.
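The complete twisted Edwards addition law that the paper transfers to Montgomery curves can be illustrated in affine form (a toy sketch: the inversions below are what the paper's projective 8M + 1D formulas avoid, and the curve parameters are illustrative, not taken from the paper):

```python
P = 101  # a small illustrative prime, not a cryptographic one

def inv(x):
    return pow(x, P - 2, P)  # Fermat inversion modulo the prime P

def on_curve(pt, a=1, d=2):
    """Membership test for the twisted Edwards curve a*x^2 + y^2 = 1 + d*x^2*y^2."""
    x, y = pt
    return (a * x * x + y * y) % P == (1 + d * x * x * y * y) % P

def add(p1, p2, a=1, d=2):
    """Complete twisted Edwards addition: one formula for all input pairs,
    including p1 == p2, since the denominators never vanish for suitable a, d."""
    x1, y1 = p1
    x2, y2 = p2
    t = (d * x1 * x2 * y1 * y2) % P
    x3 = (x1 * y2 + y1 * x2) * inv(1 + t) % P
    y3 = (y1 * y2 - a * x1 * x2) * inv(1 - t) % P
    return (x3, y3)

pt = (1, 0)        # a point of order 4 on x^2 + y^2 = 1 + 2*x^2*y^2 over F_101
dbl = add(pt, pt)  # the same formula doubles points: no separate doubling law
print(on_curve(pt), on_curve(dbl), dbl)  # -> True True (0, 100)
```

The completeness is what the doubling call demonstrates: no special case for equal inputs, which is exactly the SPA-resistance property the abstract highlights.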
Roselane Silva Farias, Iftekhar Ahmed, Eduardo Santana de Almeida
Software Quality Assurance (SQA) Engineers are responsible for assessing a product during every phase of the software development process to ensure that the outcomes of each phase and the final product possess the desired qualities. In general, a great SQA engineer needs to have a different set of abilities from development engineers to effectively oversee the entire product development process from beginning to end. Recent empirical studies identified important attributes of software engineers and managers, but the quality assurance role is overlooked. As software quality aspects have become more of a priority in the life cycle of software development, employers seek professionals that best suit the company's objectives and new graduates desire to make a valuable contribution through their job as an SQA engineer, but what makes them great? We addressed this knowledge gap by conducting 25 semi-structured interviews and collecting 363 survey responses from software quality assurance engineers at different companies around the world. We used the data collected from these activities to derive a comprehensive set of attributes that are considered important. As a result of the interviews, twenty-five attributes were identified and grouped into five main categories: personal, social, technical, management, and decision-making attributes. Through a rating survey, we confirmed that the distinguishing characteristics of great SQA engineers are curiosity, the ability to communicate effectively, and critical thinking skills. This work will guide further studies with SQA practitioners, by considering contextual factors and providing some implications for research and practice.
Egor Klimov, Muhammad Umair Ahmed, Nikolai Sviridov et al.
Bus factor (BF) is a metric that tracks knowledge distribution in a project. It is the minimal number of engineers that have to leave for a project to stall. Despite the fact that there are several algorithms for calculating the bus factor, only a few tools allow easy calculation of bus factor and convenient analysis of results for projects hosted on Git-based providers. We introduce Bus Factor Explorer, a web application that provides an interface and an API to compute, export, and explore the Bus Factor metric via treemap visualization, simulation mode, and chart editor. It supports repositories hosted on GitHub and enables functionality to search repositories in the interface and process many repositories at the same time. Our tool allows users to identify the files and subsystems at risk of stalling in the event of developer turnover by analyzing the VCS history. The application and its source code are publicly available on GitHub at https://github.com/JetBrains-Research/bus-factor-explorer. The demonstration video can be found on YouTube: https://youtu.be/uIoV79N14z8
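The idea behind the metric can be sketched with a simplified greedy heuristic (an illustration only, not necessarily the algorithm Bus Factor Explorer implements): treat a file as orphaned once all of its authors have left, and count how many departures it takes to orphan a majority of files.

```python
def bus_factor(file_authors):
    """Greedy estimate: repeatedly drop the author covering the most files;
    the bus factor is how many departures it takes to orphan > 50% of files."""
    total = len(file_authors)
    remaining = {f: set(authors) for f, authors in file_authors.items()}
    gone = 0
    while True:
        orphaned = sum(1 for authors in remaining.values() if not authors)
        if orphaned * 2 > total:
            return gone
        # find the author currently covering the most still-covered files
        coverage = {}
        for authors in remaining.values():
            for a in authors:
                coverage[a] = coverage.get(a, 0) + 1
        top = max(coverage, key=coverage.get)
        for authors in remaining.values():
            authors.discard(top)  # simulate this author leaving
        gone += 1

# Hypothetical repository: files mapped to the authors of their changes
repo = {
    "core.py": ["alice"],
    "api.py":  ["alice", "bob"],
    "ui.py":   ["bob"],
    "docs.md": ["carol"],
}
print(bus_factor(repo))  # -> 2
```

A real tool would derive `file_authors` from the VCS history and weight contributions rather than counting them equally, which is where the treemap and simulation views come in.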
Ricardo D. Caldas
Resilient cyber-physical systems comprise computing systems able to continuously interact with the physical environment in which they operate, despite runtime errors. The term resilience refers to the ability to cope with unexpected inputs while delivering correct service. Examples of resilient computing systems are Google's PageRank and the Bubblesort algorithm. Engineering for resilient cyber-physical systems requires a paradigm shift, prioritizing adaptability to dynamic environments. Software as a tool for self-management is a key instrument for dealing with uncertainty and embedding resilience in these systems. Yet, software engineers encounter the ongoing challenge of ensuring resilience despite environmental dynamic change. My thesis aims to pioneer an engineering discipline for resilient cyber-physical systems. Over four years, we conducted studies, built methods and tools, delivered software packages, and a website offering guidance to practitioners. This paper provides a condensed overview of the problems tackled, our methodology, key contributions, and results highlights. Seeking feedback from the community, this paper serves both as preparation for the thesis defense and as insight into future research prospects.
Lina Boman, Jonatan Andersson, Francisco Gomes de Oliveira Neto
Women in computing were among the first programmers in the early 20th century and were substantial contributors to the industry. Today, men dominate the software engineering industry. Research and data show that women are far less likely to pursue a career in this industry, and those that do are less likely than men to stay in it. Reasons for women and other underrepresented minorities to leave the industry are a lack of opportunities for growth and advancement, unfair treatment and workplace culture. This research explores how the potential to cultivate or uphold an industry unfavourable to women and non-binary individuals manifests in software engineering education at the university level. For this purpose, the study includes surveys and interviews. We use gender name perception as a survey instrument, and the results show small differences in perceptions of software engineering students based on their gender. Particularly, the survey respondents anchor the values of the male software engineer (Hans) to a variety of technical and non-technical skills, while the same description for a female software engineer (Hanna) is anchored mainly by her managerial skills. With interviews with women and non-binary students, we gain insight on the main barriers to their sense of ambient belonging. The collected data shows that some known barriers from the literature such as tokenism, and stereotype threat, do still exist. However, we find positive factors such as role models and encouragement that strengthen the sense of belonging among these students.
Syeda Fauzia Farheen Zofair, Sumbul Ahmad, Md. Amiruddin Hashmi et al.
We are facing a high risk of exposure to emerging contaminants and increasing environmental pollution with the concomitant growth of industries. Persistence of these pollutants is a major concern for the ecosystem. Laccases, also known as "green catalysts", are multi-copper oxidases which offer an eco-friendly solution for the degradation of these hazardous pollutants to less toxic or non-toxic compounds. Although various other biological methods exist for the treatment of pollutants, the fact that laccases catalyze the oxidation of a broad range of substrates in the presence of molecular oxygen without any additional cofactor and release water as the by-product makes them exceptional. They hold strong potential for utilization in various industries, especially for the purpose of bioremediation. Besides this, they have also been used in medical/health care, the food industry, bio-bleaching, wine stabilization, organic synthesis and biosensors. This review covers the catalytic behaviour of laccases, their immobilization strategies, potential applications in bioremediation of recalcitrant environmental pollutants and their engineering. It provides a comprehensive summary of most factors to consider while working with laccases in an industrial setting. It compares the benefits and drawbacks of the current techniques. Immobilization and mediators, two of the most significant aspects in working with laccases, have been meticulously discussed.
Shiwangi Singh, Sanjay Dhir, Stuart Evans et al.
In the year 2020, the Global Journal of Flexible Systems Management (GJFSM) celebrated two decades of publication. This study is an attempt to commemorate those two decades by presenting an overview of the GJFSM along with the trajectory of flexibility research in various journals. By using multiple bibliometric tools and indicators, the study finds that the GJFSM has grown over the years in terms of total publications and citations. The contributors are from across the globe, i.e., South America, North America, Asia, Europe, Africa, and the Pacific. The Journal publishes several flexibility areas, including information systems flexibility, financial flexibility, supply chain flexibility, technology management flexibility, marketing flexibility, organizational flexibility, strategic flexibility, and manufacturing flexibility. The GJFSM is cited by authors from various countries across the globe. It has been cited across different Scopus categorizations, including "Strategy and Management," "Business and International Management," "General Business, Management and Accounting," "Industrial and Manufacturing Engineering," and "Management Science and Operations Research." Keyword co-occurrence analysis helps to analyze the various groups of keywords cited together in the GJFSM. Co-citation analysis of references helps to identify crucial clusters of GJFSM, i.e., strategic flexibility, manufacturing flexibility, the conceptual framework of flexibility, supply chain flexibility, modeling flexibility, and application of TISM. Overall, GJFSM has seen an increase in both publications and citations, reflecting its increasing presence among the journals publishing flexibility research. The diversity of flexibility research and its contributions to research under one roof make this Journal unique. The paper concludes with the gap areas of flexibility research and the way forward.
Wenshu Zhou, Xiaodan Wei
We consider a diffusive predator–prey model of Leslie–Gower type and obtain a new global stability result by combining the Lyapunov function method with the transformation technique used in Qi and Zhu (2016). Our result partially answers a question proposed in [Y. H. Du and S. B. Hsu, J. Differential Equations 203 (2004) 331–364]. In addition, we extend the result to a class of diffusive systems with a more general type of reaction terms.
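For context, one commonly studied diffusive Leslie–Gower system takes the following form (illustrative; the exact reaction terms treated in the paper may differ):

```latex
% u = prey density, v = predator density; d_1, d_2 diffusion rates.
% The Leslie--Gower feature is the predator growth term involving v/u.
\begin{aligned}
u_t &= d_1 \Delta u + u\,(a_1 - b\,u - c_1\,v), \\
v_t &= d_2 \Delta v + v\,\Bigl(a_2 - c_2\,\frac{v}{u}\Bigr),
\end{aligned}
```

posed on a bounded domain, typically with no-flux (Neumann) boundary conditions; global stability then concerns convergence of all positive solutions to the constant coexistence steady state.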
Irina V. Levchenko, Albina R. Sadykova, Lyudmila I. Kartashova et al.
Problem statement. Currently, various global and national institutions promote mainstreaming artificial intelligence (AI) technology into training programs for school students. The effectiveness of introducing artificial intelligence into school curricula depends on four factors: 1) defining methodological foundations for creating educational content; 2) selecting and structuring appropriate learning content; 3) adapting the content to the needs of different age groups; 4) integrating the content into school programs. The current study provides theoretical foundations for generating learning content for AI lessons aimed at secondary school students and determines possible ways of integrating that content into school programs. Methodology. The empirical part of the study involved 225 secondary school students aged 11-14 (forms 5 to 9) as well as 125 teachers from comprehensive schools located in Moscow and the Moscow region. Analysis, synthesis, testing and sampling-average methods were used. Results. The authors conducted a pilot test of the developed educational materials, measured students' AI-related knowledge and skills, and processed the obtained data using the method of selective averages. The theoretical research revealed leading practices in teaching artificial intelligence at school, mechanisms for developing AI-related learning outcomes for school students, and the possibility of forming the content of AI training on the basis of various approaches. The goals and results of teaching the basics of artificial intelligence within the framework of basic school were determined, and the content of the training was formulated. Conclusion. The research is characterized by scientific and practical novelty, as it helps determine methodological grounds for teaching AI to secondary school students and proposes a detailed unit plan for an AI training course in secondary school.
Mihai-Horia Băieş, Vlad-Dan Cotuţiu, Marina Spînu et al.
Internal parasitic diseases of swine constitute a major welfare and health concern in low-input livestock farming. Due to an increase in chemical resistance, phytotherapeutic remedies have become an alternative for the prophylaxis and therapy of digestive parasitosis, albeit few remedies have been subjected to scientific validation. Low-input swine farming in Romania has adopted the traditional use of phytotherapy for controlling pathogens in livestock. The current study aimed to assess the antiparasitic potential of <i>Calendula officinalis</i> and <i>Satureja hortensis</i> against digestive parasites of swine in two low-input farms. The fecal samples were collected from sows, fatteners, and weaners, and were tested using the following coproparasitological methods: centrifugal sedimentation, flotation (Willis, McMaster egg counting technique), Ziehl–Neelsen stain modified by Henricksen, modified Blagg method, and in vitro nematode larvae/protozoan oocyst cultures. Six species of digestive parasites were diagnosed, namely <i>Ascaris suum</i>, <i>Trichuris suis</i>, <i>Oesophagostomum</i> spp., <i>Balantioides coli</i>, <i>Eimeria</i> spp., and <i>Cryptosporidium</i> spp., in various combinations, dependent on the swine category. A dose of 140 mg/kg bw/day of <i>C. officinalis</i> and 100 mg/kg bw/day of <i>S. hortensis</i> powders administered for 10 consecutive days revealed a strong antiprotozoal and anthelmintic activity on the aforementioned parasites. The curative efficacy can be attributed to the presence of polyphenols, sterols, tocopherols, and methoxylated flavones. In conclusion, our results indicate that <i>S. hortensis</i> and <i>C. officinalis</i> are promising alternatives to the commercially available antiparasitics, enabling their use as natural antiparasitic products against gastrointestinal parasites in pigs.
Khlood Ahmad, Mohamed Abdelrazek, Chetan Arora et al.
[Context] Engineering Artificial Intelligence (AI) software is a relatively new area with many challenges, unknowns, and limited proven best practices. Big companies such as Google, Microsoft, and Apple have provided a suite of recent guidelines to assist engineering teams in building human-centered AI systems. [Objective] The practices currently adopted by practitioners for developing such systems, especially during Requirements Engineering (RE), are little studied and reported to date. [Method] This paper presents the results of a survey conducted to understand current industry practices in RE for AI (RE4AI) and to determine which key human-centered AI guidelines should be followed. Our survey is based on mapping existing industrial guidelines, best practices, and efforts in the literature. [Results] We surveyed 29 professionals and found most participants agreed that all the human-centered aspects we mapped should be addressed in RE. Further, we found that most participants were using UML or Microsoft Office to present requirements. [Conclusion] We identify that most of the tools currently used are not equipped to manage AI-based software, and the use of UML and Office may pose issues to the quality of requirements captured for AI. Also, all human-centered practices mapped from the guidelines should be included in RE.
M. Dehghan Banadaki, H. Navidi
The Tau method based on the Bernoulli polynomials is implemented efficiently to approximate the open-loop Nash equilibrium in non-linear differential games over a finite time horizon. With this treatment, the system of two-point boundary value problems extracted from Pontryagin's maximum principle is transformed into a system of algebraic equations that can be solved by Newton's iteration method. Also, for the mentioned approximation by the Bernoulli polynomials, the convergence analysis and an upper error bound are discussed. To demonstrate the applicability and accuracy of the proposed approach, some illustrative examples are presented at the end.
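The polynomial basis involved can be sketched briefly (an illustration of Bernoulli polynomials only, not the paper's Tau implementation): starting from B_0 = 1, each B_n is fixed by the derivative relation B_n' = n·B_{n-1} together with the normalization that B_n integrates to zero over [0, 1].

```python
from fractions import Fraction

def bernoulli_poly(n):
    """Exact coefficients [c0, c1, ...] of the Bernoulli polynomial B_n(x),
    built via B_n' = n * B_{n-1} and the condition \\int_0^1 B_n dx = 0."""
    coeffs = [Fraction(1)]  # B_0(x) = 1
    for m in range(1, n + 1):
        # antiderivative of m * B_{m-1}, constant term left at zero for now
        new = [Fraction(0)] + [m * c / (k + 1) for k, c in enumerate(coeffs)]
        # choose the constant term so the integral over [0, 1] vanishes
        new[0] = -sum(c / (k + 1) for k, c in enumerate(new))
        coeffs = new
    return coeffs

print(bernoulli_poly(2))  # x^2 - x + 1/6 -> [Fraction(1, 6), Fraction(-1, 1), Fraction(1, 1)]
```

In a Tau scheme, the unknown state and control trajectories are expanded in this basis and the boundary-value system is projected onto it, yielding the algebraic equations handed to Newton's method.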
Page 42 of 557564