Logic-based methods for explaining neural network decisions offer formal guarantees of correctness and non-redundancy, but they often suffer from high computational costs, especially for large networks. In this work, we improve the efficiency of such methods by combining bound propagation with constraint simplification. These simplifications, derived from the propagation, tighten neuron bounds and eliminate unnecessary binary variables, making the explanation process more efficient. Our experiments suggest that combining these techniques reduces explanation time by up to 89.26\%, particularly for larger neural networks.
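As a concrete illustration of the bound-propagation idea, here is a minimal sketch (not the paper's implementation): interval bounds pushed through an affine layer reveal ReLUs whose phase is provably fixed, and each such neuron's 0/1 indicator variable can be dropped from the underlying MILP encoding. The layer weights and input box below are illustrative placeholders.

```python
# Minimal sketch: interval bound propagation identifies stable ReLUs,
# whose binary variables can be eliminated from a MILP-based explainer.
import numpy as np

def propagate_bounds(W, b, lb, ub):
    """Propagate interval bounds [lb, ub] through an affine layer Wx + b."""
    W_pos, W_neg = np.maximum(W, 0), np.minimum(W, 0)
    new_lb = W_pos @ lb + W_neg @ ub + b
    new_ub = W_pos @ ub + W_neg @ lb + b
    return new_lb, new_ub

rng = np.random.default_rng(0)
W, b = rng.normal(size=(8, 4)), rng.normal(size=8)
lb, ub = -0.1 * np.ones(4), 0.1 * np.ones(4)   # hypothetical input box

pre_lb, pre_ub = propagate_bounds(W, b, lb, ub)
stable = (pre_lb >= 0) | (pre_ub <= 0)          # ReLUs with a fixed phase
print(f"{stable.sum()}/{len(stable)} binary variables eliminated")
```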
This article delves into the transformative impact of digital technologies on the effectiveness of internal controls within organizations. By examining how tools like process automation, artificial intelligence (AI), and Big Data can both enhance and complicate internal control procedures, we draw on key theories of risk management, technology adoption, and corporate governance. Our analysis highlights the benefits of integrating digital technologies, such as reducing human errors, increasing the speed of verification processes, and enabling continuous real-time monitoring of organizational activities. These technologies also offer advanced data analysis capabilities, facilitating the early detection of anomalies and suspicious behaviors, which supports proactive risk management by identifying and mitigating potential threats before they escalate. However, while digital technologies can significantly strengthen internal controls, they also introduce new complexities. Organizations must address challenges such as data security, privacy protection, and the management of technological incidents. To ensure successful adoption, we recommend implementing continuous employee training, establishing robust security policies, and using integrated and compatible technological solutions. Notably, this study fills a gap in existing research by providing a comprehensive analysis of the impact of digital technologies on internal controls. It offers practical recommendations for organizations seeking to leverage these technologies while minimizing their risks, thereby optimizing monitoring, risk management, and data security.
Andrea Marelli, Luca Magri, Federica Arrigoni, et al.
In industrial settings, weakly supervised (WS) methods are usually preferred over their fully supervised (FS) counterparts, as they do not require costly manual annotations. Unfortunately, the segmentation masks obtained in the WS regime are typically poor in terms of accuracy. In this work, we present a WS method capable of producing accurate masks for semantic segmentation in the case of video streams. More specifically, we build saliency maps that exploit the temporal coherence between consecutive frames in a video, promoting consistency when objects appear in different frames. We apply our method in a waste-sorting scenario, where we perform weakly supervised video segmentation (WSVS) by training an auxiliary classifier that distinguishes between videos recorded before and after a human operator manually removes specific wastes from a conveyor belt. The saliency maps of this classifier identify the materials to be removed, and we modify the classifier training to minimize differences between the saliency map of a central frame and those of adjacent frames, after compensating for object displacement. Experiments on a real-world dataset demonstrate the benefits of integrating temporal coherence directly during the training phase of the classifier. Code and dataset are available upon request.
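A minimal sketch of the temporal-coherence idea, assuming saliency maps `sal_t` for the central frame and `sal_adj` for an adjacent frame, plus a dense displacement field `flow` (e.g. from an off-the-shelf optical-flow estimator). All tensor names are illustrative; the paper's exact loss and warping scheme may differ.

```python
import torch
import torch.nn.functional as F

def warp(sal, flow):
    """Warp a saliency map (B,1,H,W) by a displacement field (B,2,H,W)."""
    B, _, H, W = sal.shape
    ys, xs = torch.meshgrid(torch.arange(H), torch.arange(W), indexing="ij")
    base = torch.stack((xs, ys), dim=0).float().unsqueeze(0)  # (1,2,H,W)
    coords = base + flow
    # Normalise pixel coordinates to [-1, 1] for grid_sample.
    coords[:, 0] = 2 * coords[:, 0] / (W - 1) - 1
    coords[:, 1] = 2 * coords[:, 1] / (H - 1) - 1
    return F.grid_sample(sal, coords.permute(0, 2, 3, 1), align_corners=True)

def temporal_coherence_loss(sal_t, sal_adj, flow):
    """Penalise disagreement between the central saliency map and the
    displacement-compensated saliency map of an adjacent frame."""
    return F.l1_loss(sal_t, warp(sal_adj, flow))

sal_t, sal_adj = torch.rand(2, 1, 64, 64), torch.rand(2, 1, 64, 64)
flow = torch.zeros(2, 2, 64, 64)  # zero displacement for the smoke test
print(temporal_coherence_loss(sal_t, sal_adj, flow).item())
```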
Lexical Simplification (LS) methods use a three-step pipeline: complex word identification, substitute generation, and substitute ranking, each with separate evaluation datasets. We find that large language models (LLMs) can simplify sentences directly with a single prompt, bypassing the traditional pipeline. However, existing LS datasets are not suitable for evaluating these LLM-generated simplified sentences, as they focus on providing substitutes for single complex words without identifying all complex words in a sentence. To address this gap, we propose a new annotation method for constructing an all-in-one LS dataset through human-machine collaboration. Automated methods generate a pool of potential substitutes, which human annotators then assess, suggesting additional alternatives as needed. Additionally, we explore LLM-based methods with single prompts, in-context learning, and chain-of-thought techniques. We introduce a multi-LLM collaboration approach that simulates each step of the LS task. Experimental results demonstrate that the multi-LLM approaches significantly outperform existing baselines.
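A minimal sketch of the multi-LLM collaboration idea: one model per LS step (complex word identification, substitute generation, substitute ranking), each driven by a single prompt. `chat` is a hypothetical wrapper around whatever LLM client is available, and the model names and prompts are illustrative assumptions, not the paper's actual setup.

```python
def chat(model: str, prompt: str) -> str:
    raise NotImplementedError("plug in your LLM client here")  # hypothetical

def lexical_simplify(sentence: str) -> str:
    # Step 1: one LLM identifies the complex words.
    complex_words = chat(
        "identifier-llm",
        f"List every complex word in this sentence, comma-separated:\n{sentence}",
    ).split(",")
    simplified = sentence
    for word in (w.strip() for w in complex_words if w.strip()):
        # Step 2: a second LLM generates candidate substitutes.
        candidates = chat(
            "generator-llm",
            f"Give 5 simpler substitutes for '{word}' in:\n{simplified}",
        )
        # Step 3: a third LLM ranks and picks the best substitute.
        best = chat(
            "ranker-llm",
            f"Pick the substitute that best preserves meaning and is simplest.\n"
            f"Sentence: {simplified}\nWord: {word}\nCandidates: {candidates}",
        ).strip()
        simplified = simplified.replace(word, best)
    return simplified
```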
Improving search efficiency is one of the crucial objectives of Neural Architecture Search (NAS). However, many current approaches ignore the universality of the search strategy and fail to reduce computational redundancy during the search process, especially in one-shot NAS architectures. Moreover, current NAS methods perform invalid reparameterization in non-linear search spaces, leading to poor efficiency in common search spaces like DARTS. In this paper, we propose TopoNAS, a model-agnostic approach for gradient-based one-shot NAS that significantly reduces searching time and memory usage through topological simplification of searchable paths. Firstly, we model the non-linearity in search spaces to reveal the parameterization difficulties. To improve search efficiency, we present a topological simplification method and iteratively apply module-sharing strategies to simplify the topological structure of searchable paths. In addition, a kernel normalization technique is proposed to preserve search accuracy. Experimental results on the NASBench201 benchmark with various search spaces demonstrate the effectiveness of our method: the proposed TopoNAS improves the search efficiency of various architectures while maintaining a high level of accuracy. The project page is available at https://xdedss.github.io/topo_simplification.
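The role of linearity in reparameterization can be seen in a few lines: a weighted sum of parallel convolutions collapses into a single convolution with a merged kernel, so the searchable paths simplify topologically. Names and shapes below are illustrative, not TopoNAS's actual implementation.

```python
# Minimal sketch: alpha-weighted parallel convolutions merge exactly
# into one convolution -- but only in the linear case.
import torch
import torch.nn.functional as F

x = torch.randn(1, 3, 16, 16)
kernels = [torch.randn(8, 3, 3, 3) for _ in range(4)]  # 4 candidate convs
alpha = torch.softmax(torch.randn(4), dim=0)           # architecture weights

# Path-by-path evaluation: one convolution per candidate.
y_paths = sum(a * F.conv2d(x, k, padding=1) for a, k in zip(alpha, kernels))

# Merged evaluation: one convolution with the alpha-weighted kernel.
merged = sum(a * k for a, k in zip(alpha, kernels))
y_merged = F.conv2d(x, merged, padding=1)

print(torch.allclose(y_paths, y_merged, atol=1e-5))  # True: linear case only
```

With a non-linear activation inside each path, the equality breaks; this is exactly the difficulty the paper models before applying its module-sharing and kernel normalization strategies.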
Ho Truong Nam Hai, Thanh Tam Nguyen, Maiko Nishibori, et al.
The persistent existence of plastic waste causes serious problems for the environment, directly and indirectly affecting the health of organisms and humans. Photoreforming is a nature-friendly method that uses only solar energy to convert plastic waste into green hydrogen (H2) and valuable organic products. This study shows that a high-entropy oxynitride (HEON) photocatalyst, synthesized by the addition of nitrogen to a Ti-Zr-Hf-Nb-Ta-containing high-entropy oxide (HEO), exhibits a higher potential for the production of H2, formic acid and acetic acid from polyethylene terephthalate (PET) photoreforming compared to the relevant HEO. Examination of X-ray absorption near edge structure (XANES) and extended X-ray absorption fine structure (EXAFS) by synchrotron light shows that, in addition to hybridization of the 2p orbitals of oxygen and nitrogen, nitrogen atoms distort the structure and completely change the neighborhood of niobium and titanium (a main contributor to the conduction band), expand the atomic bonds of zirconium and tantalum, contract the atomic bonds of hafnium, and decrease the binding energy of titanium, niobium and tantalum. These electronic structure changes lead to a narrower bandgap and diminished electron-hole recombination, enhancing the photoreforming performance. This study introduces HEONs with distorted atomic bond structures as efficient low-bandgap and stable catalysts for transforming plastics into high-value organic chemicals and H2 by photocatalysis.
But how can theory be put into practice? Using the industrialized country Germany as an example, this paper outlines the current state of developments as well as necessary future measures, with a focus on the sustainable use of biogenic residues and wastes. Germany should lead the way in climate neutrality, but is, in fact, light-years away from this at the moment. Current figures of the German Federal Environmental Agency do not show any positive developments (UBA, 2022). In 2021, Germany emitted 762 Mt CO2 eq, which is roughly 33 Mt more than in 2020. First figures for 2022 (761 Mt CO2 eq) even show stagnation, and reduction rates in relation to 1990 are below 39% (Agora Energiewende, 2023). In order to achieve the legally binding goal of climate neutrality by 2045, Germany needs to implement quick-acting, tangible and forceful measures. First and foremost, current energy consumption must be halved. This requires both consistent energy-saving measures and significant increases in energy efficiency. Secondly, Germany’s energy supply must switch completely to renewables, in all sectors, over the coming decades. Wind, solar, geothermal, hydro and bioenergy will need to be seamlessly integrated, also across the sectors of heating/cooling, electricity and mobility, and especially the first three will require massive new installations as well. The share of renewables in Germany’s primary energy consumption was only about 17% in 2022 (Arbeitsgemeinschaft Energiebilanzen, 2023), and roughly 60% of this was supplied by bioenergy (Fachagentur Nachwachsende Rohstoffe, 2022). The energetic use of biomass avoided about 79 Mt CO2 eq of greenhouse gases in 2021 and thus made a substantial contribution to climate protection. In future, bioenergy will especially be needed to fill supply gaps in a system relying entirely on renewable energies. The energetic use of biogenic wastes and residues will see increasing importance, as will negative emissions, which can be generated by storing ‘green’ carbon. Thirdly, we need to develop our ‘linear’ economic system into a truly circular one – again, we are still far from success on this path. As a key prerequisite, industry supply with organic compounds largely needs to switch from fossil-based to biobased raw materials. Biomass therefore both needs to be integrated into a sustainable energy system (‘Energiewende’) and to form the basis of a bioeconomy (‘Rohstoffwende’, ‘raw material turnaround’). Such an ambitious goal can only be reached if biomass is deployed in an efficient and environmentally friendly way, and to maximum economic benefit. This will require new technological concepts as well as combined and cascading uses. Moreover, biomass must be supplied from sustainable production and in the form of wastes and residues. Flexible use of biomass, together with other renewable energies, provides the highest systemic benefit across the different energy sectors. Synergies of bioenergy with other elements of energy and climate politics, such as hydrogen or atmospheric carbon removal (e.g. BECCS, bioenergy with carbon capture and storage), are of increasing importance. With any application, one needs to bear in mind that the use of biogenic resources is not sustainable per se. Suitable methods and tools of applied sustainability assessment are needed to provide comprehensive monitoring of established and new processes, concepts and products in the bioeconomy. A matching regulatory framework is also critical.
The key role an optimized material and energetic use of biogenic wastes and residues will play is increasingly being recognized, and acted upon, by relevant national and international stakeholders in the circular economy. The International Solid Waste Association (ISWA), for example, has numerous ongoing actions on the material as well as energetic utilization of bio-wastes. The topic also plays an important role in the German National Biomass Strategy, which the German government plans to finalize in 2023. Lastly, it also makes up focal areas of research of both the Deutsches Biomasseforschungszentrum gGmbH, German Biomass Research Centre (DBFZ) in Leipzig and the Chair of Waste and Resource Management in Rostock (see also Nelles et al., 2022). In order to assess quantities and availability of biogenic resources – the basis for any evaluation of chances and risks of current and future uses – the DBFZ develops and implements resource monitoring systems for different geographical regions and makes them available in a standardized format. All results and documentation can be accessed free of charge for individual analyses (DBFZ, 2020). As of today, the database already takes into account over 100 different biogenic residues from numerous sectors, for example, agricultural and forestry side products.
Time flies by, but almost 7 years ago we published an editorial motivating the discussion on the (at that time) strategic aspects of the circular economy and the bioeconomy. In particular, the invitation was to discuss the concept of ‘cascading’, a novel concept at the time that intended to evaluate the efforts of the sustainable and strategic utilization of natural resources. Nowadays, the discussion is about ‘circularity’, which in my opinion is a good development, as it brings into the equation many further facets of circular systems. However, even after all these years, the discussion is still ongoing, as circularity has not been sufficiently defined (and, by the way, neither has cascading!), which has led to conflicting discussions when it comes to evaluating the impacts of new technological concepts. First of all, it is not possible to explicitly evaluate the effectiveness of the ongoing measures to achieve circularity. In addition, different stakeholders have come up with their own definitions based on their experiences, visions, needs and goals, deriving methodological approaches whose results often cannot be compared. This makes it hard for decision-makers to track progress over time for ongoing initiatives, or to make comparisons across different sectors and regions, and therefore to have sufficient information to make informed decisions. For this reason, I would like to take two steps back, give an overview of the status quo, and then identify the current challenges in this field, hoping to move the discussion forward. After several years of development, circular systems such as the circular economy and the bioeconomy are currently based on several principles, such as ‘design for circularity’, which aims at designing durable, repairable/low-maintenance but also recyclable products, and takes into account the whole life cycle of the envisaged products. This same systemic thinking enables fostering implementation of the necessary infrastructure for material recovery with the aim of establishing resilient ‘closed-loop systems’, another key principle. In addition, the principle of ‘product life extension’ through reuse, maintenance and repair allows keeping the value of materials within the economic system and minimizing the need for additional raw material inputs. Aligned with the material inputs, another principle of utmost importance is ‘resource efficiency’, which aims at minimizing the environmental impacts of products by increasing overall system efficiency, minimizing waste generation and, at the same time, fostering the use of renewable resources. A further key principle is ‘fostering innovation’, which takes a systemic perspective and therefore fosters innovation not only in novel technological concepts but also in the logistics and services sectors, in the design of products, and in the financing and business models applied in the establishment of new – or improved – value chains. This principle has led to an expansion of what we consider the circular economy and bioeconomy systems in terms of the greater diversity of stakeholders that can be identified as characteristic of both. Finally, taking into account this expansion of the involved stakeholders, ‘collaboration’ can be considered the last key principle for circular systems.
Collaboration is now needed not only in terms of synergistic industrial systems but also through the involvement of business, (local, regional and national) governments, financing bodies and the different interest groups representing the involved communities. But how is circularity measured, if we do not have a clear definition of circularity itself? Good question. For this purpose, a series of indicators have been developed to evaluate a system’s circularity. Considering the life cycle of a system, we first find input indicators, which correlate the amount of resources that enter a system with the efficiency with which these inputs are being used (e.g. materials use, water use, energy use). We then find output indicators, which correlate the amount of waste that is generated with the rates at which materials within the system are being reused, repaired or recycled. Examples of output indicators are waste generation, material recycling rates (e.g. closed-loop recycling) and reuse rates. Finally, we have impact indicators, which denote a measurement of the environmental, social and economic impacts of the activities associated with the system. Examples of impact indicators are the carbon and water footprints and economic benefits. In summary, the definitions and the methodological developments have come a long way. However, challenges remain. In my opinion, the first challenge we face nowadays is the lack of a common definition of circularity. This definition is key to deriving a standardized set of ‘circularity indicators’ that can help us measure the effectiveness of technical concepts as they are established in national and regional infrastructures, or of policy measures that can foster innovation in the circular economy and the bioeconomy. We need to address the ambiguity derived from the lack of a circularity definition to ensure collaboration among all involved stakeholders. The second challenge is inherent to the transitional process we are currently experiencing. Both the circular economy’s and the bioeconomy’s political and behavioural frameworks are changing and developing as well, making it difficult for monitoring systems and evaluation methodologies to reflect all these developments.
Cross-lingual science journalism generates popular science stories from scientific articles, in a language different from the source, for a non-expert audience. Hence, a cross-lingual popular summary must contain the salient content of the input document, and that content should be coherent, comprehensible, and in the local language of the targeted audience. We improve these aspects of cross-lingual summary generation by jointly training two high-level NLP tasks: simplification and cross-lingual summarization. The former task reduces linguistic complexity, and the latter focuses on cross-lingual abstractive summarization. We propose a novel multi-task architecture, SimCSum, consisting of one shared encoder and two parallel decoders that jointly learn simplification and cross-lingual summarization. We empirically investigate the performance of SimCSum by comparing it with several strong baselines over several evaluation metrics and by human evaluation. Overall, SimCSum demonstrates statistically significant improvements over the state of the art on two non-synthetic cross-lingual scientific datasets. Furthermore, we conduct an in-depth investigation into the linguistic properties of the generated summaries and an error analysis.
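A minimal PyTorch sketch of the shared-encoder/two-decoder idea: one encoder feeds two parallel decoders, one trained on simplification targets and one on cross-lingual summaries, with the task losses combined. Dimensions, vocabulary size, layer counts, and the loss weighting are illustrative assumptions, not SimCSum's actual configuration.

```python
import torch
import torch.nn as nn

class SimCSumSketch(nn.Module):
    def __init__(self, vocab=32000, d=512):
        super().__init__()
        self.embed = nn.Embedding(vocab, d)
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d, nhead=8, batch_first=True), 2)
        self.simp_decoder = nn.TransformerDecoder(
            nn.TransformerDecoderLayer(d, nhead=8, batch_first=True), 2)
        self.sum_decoder = nn.TransformerDecoder(
            nn.TransformerDecoderLayer(d, nhead=8, batch_first=True), 2)
        self.out = nn.Linear(d, vocab)

    def forward(self, src, simp_tgt, sum_tgt):
        memory = self.encoder(self.embed(src))          # shared encoding
        simp = self.out(self.simp_decoder(self.embed(simp_tgt), memory))
        summ = self.out(self.sum_decoder(self.embed(sum_tgt), memory))
        return simp, summ

model = SimCSumSketch()
src = torch.randint(0, 32000, (2, 20))
simp_logits, sum_logits = model(src, src[:, :10], src[:, :8])
# Joint training would use a weighted sum of the two task losses
# (the weighting is an assumption, not the paper's reported choice).
```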
Deep learning, especially convolutional neural networks, has triggered accelerated advancements in computer vision, bringing changes into our daily practice. Furthermore, standardized deep learning modules (also known as backbone networks), e.g., ResNet and EfficientNet, have enabled efficient and rapid development of new computer vision solutions. Yet, deep learning methods still suffer from several drawbacks. One of the most concerning problems is the high memory and computational cost, such that dedicated computing units, typically GPUs, have to be used for training and development. Therefore, in this paper, we propose a quantifiable evaluation method, the convolutional kernel redundancy measure, which is based on perceived image differences, for guiding network structure simplification. When applying our method to the chest X-ray image classification problem with ResNet, our method can maintain the performance of the network while reducing the number of parameters from over $23$ million to approximately $128$ thousand (a $99.46\%$ reduction).
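A minimal sketch of quantifying kernel redundancy in a convolutional layer. The paper's measure is based on perceived image differences; as a simple stand-in, the code below flags kernel pairs whose flattened weights are nearly collinear, i.e. filters that respond almost identically. The layer choice and similarity threshold are illustrative assumptions.

```python
import torch
import torch.nn.functional as F
from torchvision.models import resnet18

layer = resnet18(weights=None).layer1[0].conv1   # any convolutional layer
k = layer.weight.detach().flatten(1)             # (out_channels, in*kh*kw)
# Pairwise cosine similarity between flattened kernels.
sim = F.cosine_similarity(k.unsqueeze(1), k.unsqueeze(0), dim=-1)
redundant = (sim.abs() > 0.9).float().triu(diagonal=1)  # near-duplicates
n = sim.shape[0]
print(f"near-duplicate kernel pairs: {int(redundant.sum())} / {n * (n - 1) // 2}")
```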
We present new approximation results on curve simplification and clustering under Fréchet distance. Let $T = \{\tau_i : i \in [n]\}$ be polygonal curves in $\mathbb{R}^d$ of $m$ vertices each. Let $l$ be any integer from $[m]$. We study a generalized curve simplification problem: given error bounds $\delta_i > 0$ for $i \in [n]$, find a curve $\sigma$ of at most $l$ vertices such that $d_F(\sigma,\tau_i) \le \delta_i$ for $i \in [n]$. We present an algorithm that returns a null output or a curve $\sigma$ of at most $l$ vertices such that $d_F(\sigma,\tau_i) \le \delta_i + \varepsilon\delta_{\max}$ for $i \in [n]$, where $\delta_{\max} = \max_{i \in [n]} \delta_i$. If the output is null, there is no curve of at most $l$ vertices within a Fréchet distance of $\delta_i$ from $\tau_i$ for $i \in [n]$. The running time is $\tilde{O}\bigl(n^{O(l)} m^{O(l^2)} (dl/\varepsilon)^{O(dl)}\bigr)$. This algorithm yields the first polynomial-time bicriteria approximation scheme to simplify a curve $\tau$ to another curve $\sigma$, where the vertices of $\sigma$ can be anywhere in $\mathbb{R}^d$, so that $d_F(\sigma,\tau) \le (1+\varepsilon)\delta$ and $|\sigma| \le (1+\alpha) \min\{|c| : d_F(c,\tau) \le \delta\}$ for any given $\delta > 0$ and any fixed $\alpha, \varepsilon \in (0,1)$. The running time is $\tilde{O}\bigl(m^{O(1/\alpha)} (d/(\alpha\varepsilon))^{O(d/\alpha)}\bigr)$. By combining our technique with some previous results in the literature, we obtain an approximation algorithm for $(k,l)$-median clustering. Given $T$, it computes a set $\Sigma$ of $k$ curves, each of $l$ vertices, such that $\sum_{i \in [n]} \min_{\sigma \in \Sigma} d_F(\sigma,\tau_i)$ is within a factor $1+\varepsilon$ of the optimum with probability at least $1-\mu$ for any given $\mu, \varepsilon \in (0,1)$. The running time is $\tilde{O}\bigl(n m^{O(kl^2)} \mu^{-O(kl)} (dkl/\varepsilon)^{O((dkl/\varepsilon)\log(1/\mu))}\bigr)$.
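The paper works with the continuous Fréchet distance; as a self-contained illustration of the metric behind the guarantees above, here is the classic dynamic program for its discrete variant (Eiter and Mannila), which couples the vertices of the two curves in order.

```python
import numpy as np

def discrete_frechet(P: np.ndarray, Q: np.ndarray) -> float:
    """Discrete Fréchet distance between vertex arrays P (n,d) and Q (m,d)."""
    n, m = len(P), len(Q)
    d = np.linalg.norm(P[:, None, :] - Q[None, :, :], axis=-1)
    c = np.full((n, m), np.inf)
    c[0, 0] = d[0, 0]
    for i in range(n):
        for j in range(m):
            if i == 0 and j == 0:
                continue
            prev = min(c[i - 1, j] if i else np.inf,
                       c[i, j - 1] if j else np.inf,
                       c[i - 1, j - 1] if i and j else np.inf)
            c[i, j] = max(prev, d[i, j])
    return float(c[-1, -1])

tau = np.array([[0, 0], [1, 0], [2, 0]], dtype=float)
sigma = np.array([[0, 0.1], [2, 0.1]], dtype=float)  # a 2-vertex simplification
print(discrete_frechet(tau, sigma))
# ~1.005: the discrete variant couples vertices only, so tau's middle vertex
# must map to an endpoint of sigma; the continuous distance here would be 0.1.
```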
Anna Pegels, Jorge Luis Castañeda, C. Humphreys, et al.
In a transdisciplinary project with the Municipality of Trelew (Argentina), we assessed barriers to households disposing of separated waste, developed supportive behavioral interventions, tested the interventions in a randomized controlled trial, and supported the Municipality in upscaling the most successful and cost-effective intervention to a total of 20,000 households. The interventions were designed to address the three main barriers to waste separation detected through a baseline study: a lack of knowledge on how separation works; the additional hassle it represents; and the self-regulation challenge it poses. The interventions consisted of envelopes containing simplifying information, empathetic messages, a magnetic calendar acting as a reminder, or a combination thereof. The interventions roughly halved the prevalence of bags containing unusable mixed waste two weeks after the intervention. This impact was still present after six months. We did not find evidence for an additional effect of empathetic messages or the reminder. Based on these results, the simplified information intervention was rolled out. The results provide evidence of the high potential of using the full range of behavioral methods to increase sustainable behaviors, particularly in the context of limited options to adapt the waste management system as such.
The ZX-calculus is a graphical language for suitably represented tensor networks, called ZX-diagrams. Calculations are performed by transforming ZX-diagrams with rewrite rules. The ZX-calculus has found applications in reasoning about quantum circuits, condensed matter systems, quantum algorithms, quantum error correcting codes, and counting problems. A key notion is the stabiliser fragment of the ZX-calculus, a subfamily of ZX-diagrams for which rewriting can be done efficiently in terms of derived simplifying rewrites. Recently, higher dimensional qudits - in particular, qutrits - have gained prominence within quantum computing research. The main contribution of this work is the derivation of efficient rewrite strategies for the stabiliser fragment of the qutrit ZX-calculus. Notably, this constitutes a first non-trivial step towards the simplification of qutrit quantum circuits. We then give further unexpected areas in which these rewrite strategies provide complexity-theoretic insight; namely, we reinterpret known results about evaluating the Jones polynomial, an important link invariant in knot theory, and counting graph colourings.
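A toy sketch of the central simplifying step, spider fusion: two adjacent spiders of the same colour merge, and their phase labels add. In the qutrit stabiliser fragment, spider phases can be taken as pairs of integers modulo 3; the rewrite strategies derived in the paper involve considerably more rules than this. The dictionary-based graph encoding is an illustrative assumption, not a real ZX library's API.

```python
def fuse_spiders(spiders, edges):
    """spiders: {node: (colour, (a, b))}; edges: set of 2-element frozensets.
    Repeatedly fuse same-coloured adjacent spiders, adding phases mod 3."""
    changed = True
    while changed:
        changed = False
        for e in list(edges):
            u, v = tuple(e)
            if u in spiders and v in spiders and spiders[u][0] == spiders[v][0]:
                (col, (a1, b1)), (_, (a2, b2)) = spiders[u], spiders[v]
                spiders[u] = (col, ((a1 + a2) % 3, (b1 + b2) % 3))
                edges.discard(e)
                # Reconnect v's other neighbours to u, then delete v.
                for f in list(edges):
                    if v in f:
                        w = next(iter(f - {v}))
                        edges.discard(f)
                        if w != u:
                            edges.add(frozenset({u, w}))
                del spiders[v]
                changed = True
                break
    return spiders, edges

spiders = {1: ("Z", (1, 0)), 2: ("Z", (2, 2)), 3: ("X", (0, 1))}
edges = {frozenset({1, 2}), frozenset({2, 3})}
print(fuse_spiders(spiders, edges))  # spiders 1 and 2 fuse: phase (0, 2)
```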
Urban greening is politically fostered as an adaptation strategy to climate change; therefore, the demand for fertile planting substrates increases. Such substrates are usually mixed from mined geogenic resources but should rather be produced from recycled materials. Furthermore, their hydraulic properties should be designed according to their application, e.g., by optimizing the mixing ratio of their components. This study therefore introduces an approach to investigate the water retention curves (WRCs) of soil-like substrates as a function of the mixing ratio of two recycled components, demonstrated here for green waste compost (GWC) and ground bricks (GB) in the sand fraction. Seven mixing ratios of GWC to GB (0/100, 18/82, 28/72, 37/63, 47/53, 68/32, and 100/0) were packed to mixture-specific densities using a newly constructed packing device. The packing density resulted from applying six strokes with a constant momentum of 5.62 × 10−3 N s m−2, chosen according to the German green roof guideline; a standardized compaction was thus assured. The WRCs were measured using the simplified evaporation method in five replicates for each of the seven mixtures. A set of water retention models was parameterized and analyzed with regard to their suitability to represent the full range of binary mixtures. The newly constructed packing device enables cylinders to be packed reproducibly. The densities in the cylinders varied from 0.64 g cm−3 (GWC/GB = 100/0) to 1.35 g cm−3 (GWC/GB = 0/100) with a coefficient of variation of less than 1.3%. The simplified evaporation method delivered homogeneous results for all five replicates of the investigated mixtures. The WRCs of the seven mixtures result from a complex combination of the pore systems of GWC and GB. The multi-modal water retention models of Peters, Durner, and Iden are principally suitable to describe soil-like substrates that are rich in organic matter. The PDI (van Genuchten) and PDI (Fredlund–Xing) models best described the WRCs over the full range of mixing ratios according to the RMSE quality criterion. The study delivers a template for preparing and analyzing soil-like substrates with regard to their WRCs using the simplified evaporation method. Complemented by total porosity and measurements at pF > 4, it is a suitable method to obtain high-resolution WRCs of soil-like substrates. Available water retention models are capable of describing the hydraulic behavior of binary mixtures over the full mixing ratio; it would therefore be possible to model the WRC of binary mixtures as a function of their mixing ratio.
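A minimal sketch of the parameterization step: fitting a water retention curve to evaporation-method data with scipy. The study uses the extended PDI and multi-modal variants; the basic unimodal van Genuchten form below is only meant to illustrate the fitting workflow, and the data values are made up.

```python
import numpy as np
from scipy.optimize import curve_fit

def van_genuchten(h, theta_r, theta_s, alpha, n):
    """Water content as a function of suction head h (positive, in hPa)."""
    m = 1 - 1 / n
    return theta_r + (theta_s - theta_r) / (1 + (alpha * h) ** n) ** m

# Hypothetical (suction, water content) pairs from an evaporation run.
h = np.array([1, 10, 30, 100, 300, 1000, 5000], dtype=float)
theta = np.array([0.58, 0.55, 0.49, 0.40, 0.30, 0.22, 0.15])

popt, _ = curve_fit(van_genuchten, h, theta,
                    p0=[0.05, 0.60, 0.02, 1.4],
                    bounds=([0, 0.3, 1e-4, 1.01], [0.3, 0.8, 1.0, 3.0]))
rmse = np.sqrt(np.mean((van_genuchten(h, *popt) - theta) ** 2))
print(dict(zip(["theta_r", "theta_s", "alpha", "n"], popt)), "RMSE:", rmse)
```

The RMSE computed this way matches the quality criterion the study uses to compare candidate retention models across mixing ratios.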
Dietary supplements are widely used but not always safe. With the rapid development of the Internet, consumers usually seek health information, including dietary supplement information, online. To help consumers access quality online dietary supplement information, we have identified trustworthy dietary supplement information sources and built an evidence-based knowledge base of dietary supplement information, the integrated DIetary Supplement Knowledge base (iDISK), that integrates and standardizes dietary supplement related information across these different sources. However, as information in iDISK was collected from scientific sources, its complex medical jargon is a barrier to consumers' comprehension. To assess how different approaches to simplifying and representing dietary supplement information from iDISK affect lay consumers' comprehension, we used a crowdsourcing platform to recruit participants to read dietary supplement information in four different representations from iDISK: original text, syntactic and lexical text simplification, manual text simplification, and a graph-based visualization. We then assessed how the different simplification and representation strategies affected consumers' comprehension of dietary supplement information in terms of accuracy and response time on a set of comprehension questions. With responses from 690 qualified participants, our experiments confirmed that the manual approach had the best performance for both accuracy and response time, while the graph-based approach ranked second, outperforming the other representations. In some cases, the graph-based representation outperformed the manual approach in terms of response time. A hybrid approach that combines text and graph-based representations might be needed to accommodate consumers' different information needs and information-seeking behavior.
In this paper, we focus on the challenge of learning controllable text simplification in unsupervised settings. While this problem has been previously discussed for supervised learning algorithms, the literature on analogous unsupervised methods is scarce. We propose two unsupervised mechanisms for controlling the output complexity of the generated texts: back translation with control tokens (a learning-based approach) and simplicity-aware beam search (a decoding-based approach). We show that by nudging a back-translation algorithm to understand the relative simplicity of a text in comparison to its noisy translation, the algorithm self-supervises to produce output of the desired complexity. This approach achieves competitive performance on well-established benchmarks: a SARI score of 46.88% and an FKGL of 3.65 on the Newsela dataset.
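A minimal sketch of the decoding-based control: during beam search, the usual log-probability score is combined with a simplicity term so that simpler hypotheses are preferred. The naive FKGL-style scorer and the mixing weight `lam` are illustrative stand-ins for the paper's exact criterion.

```python
import re

def syllables(word: str) -> int:
    # Crude syllable count: runs of vowels (good enough for a sketch).
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fkgl(text: str) -> float:
    """Flesch-Kincaid grade level; lower means simpler text."""
    words = text.split()
    sents = max(1, text.count(".") + text.count("!") + text.count("?"))
    syl = sum(syllables(w) for w in words)
    return 0.39 * len(words) / sents + 11.8 * syl / len(words) - 15.59

def rescore(hypotheses, lam=0.1):
    """hypotheses: list of (text, log_prob). Lower FKGL => higher score."""
    return sorted(hypotheses,
                  key=lambda h: h[1] - lam * fkgl(h[0]),
                  reverse=True)

beams = [("The committee promulgated new regulations.", -4.2),
         ("The committee made new rules.", -4.5)]
print(rescore(beams)[0][0])  # the simpler hypothesis wins despite lower log-prob
```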