Shams El-Adawy, A. R. Piña, Benjamin M. Zwickl
et al.
This report builds upon the Categorization of Roles in the Quantum Industry report by providing detailed profiles for 29 distinct roles across the quantum workforce. While the earlier report established a framework of four major role categories (hardware, software, bridging, and public-facing and business) and their subcategories, the current report expands on this structural framework by characterizing what professionals in each role actually do, particularly by identifying the tasks, knowledge, skills, and abilities (KSAs), and experience typically required for each role. Each role profile follows a standardized structure guided by the Occupational Information Network (O*NET) framework. By presenting a fine-grained view of day-to-day work and qualification expectations, this report serves as a practical resource for educators, students, industry professionals, and policymakers aiming to understand, educate, and support the evolving quantum workforce.
Artificial intelligence (AI) in healthcare has led to many promising developments; however, AI research is increasingly funded by the private sector, leading to potential trade-offs between benefits to patients and benefits to industry. Health AI practitioners should prioritize successful adoption into clinical practice in order to provide meaningful benefits to patients, but translation usually requires collaboration with industry. We discuss three features of AI studies that hamper the integration of AI into clinical practice from the perspective of researchers and clinicians: a lack of clinically relevant metrics, a lack of clinical trials and longitudinal studies to validate results, and a lack of patient and physician involvement in the development process. For partnerships between industry and health research to be sustainable, a balance must be established between patient and industry benefit. We propose three approaches for addressing this gap: improved transparency and explainability of AI models, fostering relationships with industry partners that have a reputation for centering patient benefit in their practices, and prioritization of overall healthcare benefits. With these priorities, we can sooner realize meaningful AI technologies used by clinicians, where mutual benefit is realized.
Sanaa Salama, Ashraf Abuelhaija, Mohammed Hamdan
et al.
This study presents a compact, flexible U-shaped wearable antenna with a defected ground structure (DGS) for biomedical applications. Fabricated on a 1 mm polydimethylsiloxane (PDMS) substrate using conductive adhesive tape, the antenna achieves a wide bandwidth of 1.52–5.38 GHz, covering key ISM bands. The DGS enhances impedance matching and stability, while curvature adaptation to a 46 mm arm diameter ensures practical wearability. Simulations in CST and COMSOL demonstrate robust performance near a multilayer human arm phantom, with specific absorption rate (SAR) values compliant with safety standards (< 2 W/kg). Experimental validation confirms the design’s reliability, making it a promising solution for wireless body area networks (WBANs) and next-generation biomedical wearables.
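The SAR compliance check mentioned above rests on a standard local relation, SAR = σ|E|²/ρ. A minimal sketch of that point calculation, using assumed muscle-like tissue parameters and an assumed local field strength (illustrative values only, not the paper's CST/COMSOL phantom results):

```python
# Illustrative point-SAR calculation: SAR = sigma * |E_rms|^2 / rho.
# All parameter values below are assumptions for demonstration.

def point_sar(sigma_s_per_m, e_rms_v_per_m, rho_kg_per_m3):
    """Local specific absorption rate in W/kg."""
    return sigma_s_per_m * e_rms_v_per_m ** 2 / rho_kg_per_m3

sigma = 1.74   # S/m, assumed muscle conductivity near 2.45 GHz
rho = 1050.0   # kg/m^3, assumed tissue mass density
e_rms = 30.0   # V/m, assumed local RMS electric field

sar = point_sar(sigma, e_rms, rho)
print(f"point SAR = {sar:.3f} W/kg")      # -> point SAR = 1.491 W/kg
print("within 2 W/kg limit:", sar < 2.0)  # -> within 2 W/kg limit: True
```

Regulatory limits are defined over averaged tissue masses (e.g., 1 g or 10 g), so a full compliance check averages this point quantity over a volume, as the field solvers do.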
As Artificial Intelligence (AI) technologies continue to evolve, the gap between academic AI education and real-world industry challenges remains an important area of investigation. This study provides preliminary insights into challenges AI professionals encounter in both academia and industry, based on semi-structured interviews with 14 AI experts - eight from industry and six from academia. We identify key challenges related to data quality and availability, model scalability, practical constraints, user behavior, and explainability. While both groups experience data and model adaptation difficulties, industry professionals more frequently highlight deployment constraints, resource limitations, and external dependencies, whereas academics emphasize theoretical adaptation and standardization issues. These exploratory findings suggest that AI curricula could better integrate real-world complexities, software engineering principles, and interdisciplinary learning, while recognizing the broader educational goals of building foundational and ethical reasoning skills.
Theofanis P. Raptis, Andrea Passarella, Marco Conti
Wireless edge networks in smart industrial environments increasingly operate with advanced sensors and autonomous machines that interact with each other and generate huge amounts of data. These data volumes are bound to make data management (e.g., for processing, storing, computing) a major challenge. Current data management approaches, which rely primarily on centralized data storage, might not be able to cope with the scalability and real-time requirements of Industry 4.0 environments, so distributed solutions are increasingly being explored. In this paper, we introduce the problem of distributed data access in multi-hop wireless industrial edge deployments, whereby a set of consumer nodes needs to access data stored in a set of data cache nodes while satisfying the industrial data access delay requirements and, at the same time, maximizing the network lifetime. We prove that the introduced problem is computationally intractable and, after formulating the objective function, design a two-step algorithm to address it. We conduct an experimental investigation of the algorithm's performance on an open testbed with real devices. We then provide two online improvements, so that the data distribution can dynamically change before the first node in the network runs out of energy. Comparing the methods via simulations for different numbers of network nodes and data consumers, we show significant lifetime prolongation and increased energy efficiency when employing the method that uses only decentralized low-power wireless communication instead of the method that also uses centralized local area wireless communication.
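To make the problem shape concrete, here is a heavily simplified greedy sketch of consumer-to-cache assignment under a delay bound, where lifetime is proxied by the minimum residual energy. This is an illustration of the problem setting only, not the paper's two-step algorithm; the data structures and cost model are assumptions.

```python
# Illustrative greedy assignment: each consumer picks, among caches that
# meet its delay bound, the one with the highest residual energy, so the
# earliest-depleted node stays alive as long as possible. Simplified sketch,
# not the paper's algorithm; `cost` (energy per served consumer) is assumed.

def assign(consumers, caches, delay, energy, cost, delay_bound):
    """consumers/caches: id lists; delay[(c, k)]: access delay from
    consumer c to cache k; energy[k]: residual energy of cache k."""
    residual = dict(energy)
    assignment = {}
    for c in consumers:
        feasible = [k for k in caches if delay[(c, k)] <= delay_bound]
        if not feasible:
            continue  # consumer cannot be served within the delay bound
        best = max(feasible, key=lambda k: residual[k])
        assignment[c] = best
        residual[best] -= cost
    lifetime = min(residual.values())  # proxy: first node to run out of energy
    return assignment, lifetime

consumers = [0, 1]
caches = ["a", "b"]
delay = {(0, "a"): 1, (0, "b"): 1, (1, "a"): 1, (1, "b"): 1}
energy = {"a": 5, "b": 5}
assignment, lifetime = assign(consumers, caches, delay, energy, cost=2, delay_bound=2)
print(assignment, lifetime)  # -> {0: 'a', 1: 'b'} 3
```

The intractability result in the paper implies that such a greedy pass cannot be optimal in general; it only conveys the trade-off between the delay constraint and load-balancing for lifetime.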
With the rise of large language models (LLMs), LLM agents capable of autonomous reasoning, planning, and executing complex tasks have become a frontier in artificial intelligence. However, how to translate the research on general agents into productivity that drives industry transformations remains a significant challenge. To address this, this paper systematically reviews the technologies, applications, and evaluation methods of industry agents based on LLMs. Using an industry agent capability maturity framework, it outlines the evolution of agents in industry applications, from "process execution systems" to "adaptive social systems." First, we examine the three key technological pillars that support the advancement of agent capabilities: Memory, Planning, and Tool Use. We discuss how these technologies evolve from supporting simple tasks in their early forms to enabling complex autonomous systems and collective intelligence in more advanced forms. Then, we provide an overview of the application of industry agents in real-world domains such as digital engineering, scientific discovery, embodied intelligence, collaborative business execution, and complex system simulation. Additionally, this paper reviews the evaluation benchmarks and methods for both fundamental and specialized capabilities, identifying the challenges existing evaluation systems face regarding authenticity, safety, and industry specificity. Finally, we focus on the practical challenges faced by industry agents, exploring their capability boundaries, developmental potential, and governance issues in various scenarios, while providing insights into future directions. By combining technological evolution with industry practices, this review aims to clarify the current state and offer a clear roadmap and theoretical foundation for understanding and building the next generation of industry agents.
In industry, the networking and automation of machines through the Internet of Things (IoT) continues to increase, leading to greater digitalization of production processes. Traditionally, business and production processes are controlled, optimized, and monitored using business process management methods that require process discovery. These methods, however, cannot be fully applied to industrial production processes, even though such processes must likewise be monitored and discovered. The aim of this paper is therefore to develop an approach for adapting existing process discovery methods for application to industrial processes. The adaptations of classic discovery methods are presented as universally applicable guidelines specifically for the Industrial Internet of Things (IIoT). To create an optimal process model based on process evaluation, different methods are combined into a standardized discovery approach that is both efficient and cost-effective.
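A common first step in classic process discovery is building a directly-follows graph from an event log. A minimal sketch of that step on a toy production log (the activity names are invented; the paper's adapted IIoT guidelines go well beyond this):

```python
# Minimal directly-follows-graph discovery: count how often activity b
# immediately follows activity a across all traces in an event log.
from collections import Counter

def directly_follows(event_log):
    """event_log: list of traces, each a list of activity names.
    Returns a Counter mapping (a, b) to its directly-follows frequency."""
    dfg = Counter()
    for trace in event_log:
        for a, b in zip(trace, trace[1:]):
            dfg[(a, b)] += 1
    return dfg

# Toy log with two production traces (activity names are illustrative).
log = [
    ["receive", "machine", "inspect", "ship"],
    ["receive", "machine", "rework", "inspect", "ship"],
]
dfg = directly_follows(log)
print(dfg[("receive", "machine")])  # -> 2
print(dfg[("machine", "rework")])   # -> 1
```

Classic miners (alpha, heuristic, inductive) derive process models from exactly this kind of frequency structure; the IIoT adaptations concern how such logs are obtained and filtered from sensor data.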
Nowadays, electric robots play a big role in many fields, as they can replace humans and/or decrease the load on humans. Several types of robots are present in daily life: some are fully controlled by humans, others are programmed to be self-controlled, and still others are self-controlled with partial human control. Robots can be classified into three major kinds: industrial robots, autonomous robots, and mobile robots. Industrial robots are used in industries and factories to perform human tasks in an easier and faster way, which helps in developing products. Typically, industrial robots perform difficult and dangerous tasks: they lift heavy objects, handle chemicals, and carry out painting and assembly work. They work continuously, hour after hour and day after day, with the same precision, and they do not get tired, which means they do not make errors due to fatigue. Indeed, they are ideally suited to completing repetitive tasks.
A low-profile wide-scanning conformal phased array (CPA), with cavity-backed stacked patch elements, is presented in this paper. To reduce the array profile, a partially dielectric-filled cavity is employed in each element, and to enhance its scanning performance, the cavity walls are deliberately modified. Finally, the designed element operates in a frequency band of 1.45∼1.75 GHz, with a profile of 0.086 λh (λh is the free-space wavelength at 1.75 GHz), and achieves ±60° scanning in the E/H-planes, with the reflection coefficient below −7 dB. The proposed design method and all numerical results are experimentally verified by a 4 × 4 CPA prototype.
The ability to automatically identify industry sector coverage in articles on legal developments, or in any kind of news article, can bring plenty of benefits to both readers and content creators. With articles tagged by industry coverage, readers from around the world would be able to find legal news specific to their region and professional industry. Simultaneously, writers would benefit from understanding which industries potentially lack coverage or which industries readers are currently most interested in, and could thus focus their writing efforts on more inclusive and relevant legal news coverage. In this paper, a Machine Learning-powered industry analysis approach that combines Natural Language Processing (NLP) with statistical and Machine Learning (ML) techniques was investigated. A dataset consisting of over 1,700 annotated legal articles was created for the identification of six industry sectors, and textual and legal features were extracted from the text. Both traditional ML methods (e.g., gradient boosting machine algorithms and decision-tree-based algorithms) and deep neural networks (e.g., transformer models) were applied for performance comparison of predictive models. The system achieved promising results, with area under the receiver operating characteristic curve scores above 0.90 and F-scores above 0.81 across the six industry sectors. The experimental results show that the suggested automated industry analysis, which employs ML techniques, allows the processing of large collections of text data in an easy, efficient, and scalable way. Traditional ML methods perform better than deep neural networks when only a small, domain-specific training dataset is available.
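The traditional-ML route described above (text features plus a gradient-boosting classifier) can be sketched as a one-vs-rest setup per sector. A minimal sketch on invented toy data, using TF-IDF features; the paper's actual feature set, labels, and 1,700-article dataset are not reproduced here:

```python
# Sketch of the traditional-ML route: TF-IDF text features feeding a
# gradient-boosting classifier, one-vs-rest for a single toy sector.
# The articles and labels below are invented for illustration.
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline

articles = [
    "new banking regulation tightens capital requirements",
    "central bank issues guidance for lenders",
    "court rules on hospital negligence claim",
    "new health data privacy rules for clinics",
]
is_finance = [1, 1, 0, 0]  # toy one-vs-rest labels for one sector

model = make_pipeline(TfidfVectorizer(),
                      GradientBoostingClassifier(random_state=0))
model.fit(articles, is_finance)
print(list(model.predict(articles)))  # fits the toy training set
```

With six sectors this pipeline would be trained once per sector (or wrapped in a multi-label meta-estimator), and evaluated with per-sector ROC AUC and F-scores as in the paper.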
The Industrial Internet of Things (IIoT) is a developing research area with potential global Internet connectivity, turning everyday objects into intelligent devices with more autonomous activities. IIoT services and applications are not only being used in smart homes and smart cities, but have also become an essential element of the Industry 4.0 concept. The emergence of the IIoT helps traditional industries simplify production processes, reduce production costs, and improve industrial efficiency. However, the involvement of many heterogeneous devices, the use of third-party software, and the resource-constrained nature of IoT devices bring new security risks to the production chain and expose vulnerabilities in the systems. Among these, Distributed Denial of Service (DDoS) attacks are particularly significant. This article analyzes the threats and attacks in the IIoT and discusses how DDoS attacks impact the production process and disrupt communication with IIoT services and applications. This article also proposes a reference security framework that leverages the advantages of fog computing to demonstrate countermeasures against DDoS attacks and possible strategies to mitigate such attacks at scale.
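One common building block in DDoS mitigation at an edge or fog node is request rate limiting. A minimal token-bucket sketch (a generic illustration of the countermeasure class, not the framework proposed in the article):

```python
# Illustrative token-bucket rate limiter: requests spend one token each;
# tokens refill at a fixed rate up to a burst capacity. Parameters below
# are arbitrary demonstration values.
class TokenBucket:
    def __init__(self, rate, capacity):
        self.rate = rate          # tokens replenished per second
        self.capacity = capacity  # maximum burst size
        self.tokens = capacity
        self.last = 0.0

    def allow(self, now):
        """Return True if a request arriving at time `now` (s) may pass."""
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(rate=10, capacity=5)
# A flood of 20 simultaneous requests at t=0: only the burst allowance passes.
passed = sum(bucket.allow(0.0) for _ in range(20))
print(passed)  # -> 5
```

At fog scale, such per-source limiters would be combined with traffic classification and filtering closer to the attack sources, which is where the fog layer's proximity to devices pays off.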
Past research on software product lines has focused on the initial development of reusable assets and related challenges, such as cost estimation and implementation issues. Naturally, as software product lines are increasingly adopted throughout industry, their ongoing maintenance and evolution are getting more attention as well. However, it is not clear to what degree research is following this trend, and where the interests and demands of the industry lie. In this technical report, we provide a survey and comparison of selected publications on software product line maintenance and evolution at SPLC. In particular, we analyze and discuss similarities and differences of these papers with regard to their affiliation with industry and academia. From this, we infer directions for future research that pave the way for systematic and organized evolution of software product lines, from which industry may benefit as well.
The field of computer science is perhaps uniquely connected with industry. For example, our main publication outlets (i.e. conferences) are regularly sponsored by large technology companies, and much of our research funding is either directly or indirectly provided by industry. In turn, this places potential limitations on academic freedom, which is a profound ethical concern, yet curiously is not directly addressed within existing ethical codes. A field that limits academic freedom presents the risk that the results of the work conducted within it cannot always be relied upon. In the context of a field that is perhaps unique in both its connection to industry and impact on society, special measures are needed to address this problem. This paper discusses the range of protections that could be provided.
In this paper, an automatic antenna design method based on a shape blending algorithm is proposed. The algorithm is used to construct the shape of the wide slot of a CPW-fed antenna. First, two basic shapes are chosen as the initial shape and the target shape. The shape blending process is then applied to them to obtain a series of shapes, which are used as the geometry of the wide slot. In this way, a series of CPW-fed wide-slot antennas with similar but gradually changing characteristics is obtained. The bandwidth ranges are 8.00–9.24 GHz, 7.95–9.05 GHz, 7.05–8.55 GHz, 6.95–8.13 GHz, and 6.55–7.50 GHz, respectively. The overall size of each antenna is 26 mm × 20 mm × 0.6 mm. Experimental results show that the resonant frequencies vary (via translation) with the change of slot shape within a specific frequency band, and also validate that the antennas have omnidirectional radiation characteristics. The radiation gains and aperture efficiencies of the antennas are about 3.8–5.5 dBi and 57.7–83.0%, respectively, at their centre frequencies. These results show that the proposed antennas could be used in C-band and X-band radar applications.
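The core idea of generating a series of intermediate slot geometries between an initial and a target shape can be sketched as vertex interpolation. This is the general principle only; the paper's actual blending algorithm and slot outlines are not reproduced, and the square/diamond shapes below are invented:

```python
# Minimal shape-blending sketch: linearly interpolate between matched
# vertex lists of an initial and a target polygon. The two shapes below
# are illustrative placeholders, not the paper's slot geometries.

def blend(initial, target, t):
    """initial/target: equal-length lists of (x, y) vertices; 0 <= t <= 1.
    t=0 returns the initial shape, t=1 the target shape."""
    return [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
            for (x0, y0), (x1, y1) in zip(initial, target)]

square = [(0, 0), (10, 0), (10, 10), (0, 10)]
diamond = [(5, -2), (12, 5), (5, 12), (-2, 5)]

for t in (0.0, 0.25, 0.5, 0.75, 1.0):  # a series of intermediate slot shapes
    print(blend(square, diamond, t)[0])
```

Each intermediate vertex list would then be exported as the slot outline for one antenna in the series, yielding the gradually shifting resonant behaviour the abstract describes.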
In order to analyze the transmission capacity performance of the cluster flight spacecraft network, two different types of outage performance expressions are derived in this paper. First, by applying the mean value theorem of integrals, the expression for the outage probability of decode-and-forward relaying is derived. Subsequently, based on the Macdonald random variable form, the expression for the outage probability of amplify-and-forward relaying is derived. By simulating the transmission capacity of decode-and-forward relaying, the transmission capacity characteristics of a single hop and dual hops are analyzed. The simulation results show that the transmission capacity performance changes with the time slot in the orbital hyperperiod, and that a dual-hop relay achieves better transmission capacity than a single-hop transmission in the cluster flight spacecraft network.
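For dual-hop decode-and-forward over independent fading hops, the link is in outage whenever either hop's SNR falls below the decoding threshold, so P_out = 1 − (1 − p)² for i.i.d. hops with per-hop outage p. A Monte Carlo sanity check of that basic relation under Rayleigh fading (exponential SNR), which is a generic illustration rather than the paper's mean-value-theorem or Macdonald-form derivations:

```python
# Monte Carlo check of dual-hop DF outage over i.i.d. Rayleigh fading:
# each hop's SNR is exponential; DF is in outage if min(SNR1, SNR2)
# drops below the threshold. Mean SNR and threshold values are assumed.
import math
import random

def outage_df(mean_snr_hop, snr_threshold, trials=200_000, seed=1):
    rng = random.Random(seed)
    outages = 0
    for _ in range(trials):
        snr1 = rng.expovariate(1 / mean_snr_hop)  # Rayleigh fading -> exp SNR
        snr2 = rng.expovariate(1 / mean_snr_hop)
        if min(snr1, snr2) < snr_threshold:
            outages += 1
    return outages / trials

p_hop = 1 - math.exp(-1.0 / 10.0)      # analytic per-hop outage, mean SNR 10
analytic = 1 - (1 - p_hop) ** 2        # dual-hop DF outage
print(f"simulated {outage_df(10.0, 1.0):.4f} vs analytic {analytic:.4f}")
```

The simulated value converges to the closed form as the trial count grows; the paper's contribution is obtaining such closed forms under the time-varying geometry of the orbital hyperperiod.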
This paper presents a production Semi-Supervised Learning (SSL) pipeline based on the student-teacher framework, which leverages millions of unlabeled examples to improve Natural Language Understanding (NLU) tasks. We investigate two questions related to the use of unlabeled data in a production SSL context: 1) how to select samples from a huge unlabeled data pool that are beneficial for SSL training, and 2) how the selected data affect the performance of different state-of-the-art SSL techniques. We compare four widely used SSL techniques, Pseudo-Label (PL), Knowledge Distillation (KD), Virtual Adversarial Training (VAT), and Cross-View Training (CVT), in conjunction with two data selection methods: committee-based selection and submodular optimization based selection. We further examine the benefits and drawbacks of these techniques when applied to intent classification (IC) and named entity recognition (NER) tasks, and provide guidelines specifying when each of these methods might be beneficial for improving large scale NLU systems.
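The Pseudo-Label technique in the student-teacher framework can be sketched in a few lines: a teacher trained on labeled data labels a confident subset of the unlabeled pool, and a student retrains on the union. The toy data, model choice, and confidence cutoff below are all illustrative assumptions, and the selection step here is a plain threshold rather than the paper's committee-based or submodular selection:

```python
# Minimal Pseudo-Label (PL) student-teacher sketch on invented 1-D toy data.
from sklearn.linear_model import LogisticRegression

labeled_X = [[0.0], [0.2], [0.8], [1.0]]
labeled_y = [0, 0, 1, 1]
unlabeled_X = [[0.1], [0.5], [0.9]]  # the "unlabeled pool"

# Teacher: trained on labeled data only.
teacher = LogisticRegression().fit(labeled_X, labeled_y)
proba = teacher.predict_proba(unlabeled_X)

# Accept pseudo-labels only above an assumed confidence cutoff.
threshold = 0.7
pseudo = [(x, int(p.argmax())) for x, p in zip(unlabeled_X, proba)
          if p.max() >= threshold]

# Student: retrained on labeled data plus accepted pseudo-labels.
student_X = labeled_X + [x for x, _ in pseudo]
student_y = labeled_y + [y for _, y in pseudo]
student = LogisticRegression().fit(student_X, student_y)
print(student.predict([[0.05], [0.95]]))
```

In the production pipeline the same loop runs over millions of unlabeled utterances, which is why the data selection step (which samples to pseudo-label at all) becomes the central question.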
Tiago Espinha Gasiba, Kristian Beckers, Santiago Suppan
et al.
Teaching industry staff about cybersecurity issues is a fundamental activity that must be undertaken in order to guarantee the delivery of successful and robust products to market. Much research attention has been devoted to this topic in recent years; however, that research has not focused on developing secure code in industrial environments. In this paper, we look at the constraints and requirements for delivering training, by means of cybersecurity challenges, that covers secure coding topics from an industry perspective. Using requirements engineering, we aim to understand the design requirements for such challenges. Along the way, we give details of our experience of delivering cybersecurity challenges in an industrial setting and present the outcome and lessons learned. The proposed requirements for cybersecurity challenges geared towards software developers in an industrial environment are based on a systematic literature review, interviews with security experts from industry, and a semi-structured evaluation of participant feedback.
Two types of modified crossed-wire antennas are investigated to enhance the circularly polarized (CP) wave bandwidth. The wire length of each antenna is increased to twice that of the original antenna. First, a bent-type antenna is analyzed using the method of moments; its CP wave bandwidth for a 3 dB axial ratio criterion is found to be twice as wide as that of the original antenna. Next, a spiral-type antenna is analyzed; it shows a CP wave bandwidth of 28%, wider than that of the original antenna by a factor of 3.5. The analysis results are validated by experimental work.
Nanomaterials have greatly improved properties compared to their bulk counterparts, which makes them ideal materials for applications in various industries. Among the various nanomaterials, the nanoallotropes of carbon, namely fullerene, carbon nanotubes, and graphene, are the most important, as indicated by the fact that their discoverers received prestigious awards such as the Nobel Prize and the Kavli Prize. Carbon forms different nanoallotropes by varying the nature of its orbital hybridization. Since all nanoallotropes of carbon possess exotic physical and chemical properties, they are extensively used in different applications, especially in the electronics industry.