Victor Guedes Barbosa, Renato Ribeiro Siman, Luciana Harue Yamane
Results for "Standardization. Simplification. Waste"
Showing 20 of ~454,912 results · from CrossRef, DOAJ, arXiv, Semantic Scholar
Purva Kulkarni, Aravind Shankara Narayanan
Mesh simplification is the process of reducing the number of vertices, edges and triangles in a three-dimensional (3D) mesh while preserving the overall shape and salient features of the mesh. A popular strategy for this is edge collapse, where an edge connecting two vertices is merged into a single vertex. The edge to collapse is chosen based on a cost function that estimates the error introduced by this collapse. This paper presents a comprehensive, implementation-oriented guide to edge collapse for practitioners and researchers seeking both theoretical grounding and practical insight. We review and derive the underlying mathematics and provide reference implementations for foundational cost functions including Quadric Error Metrics (QEM) and Lindstrom-Turk's geometric criteria. We also explain the mathematics behind attribute-aware edge collapse in QEM variants and Hoppe's energy-based method used in progressive meshes. In addition to cost functions, we outline the complete edge collapse algorithm, including the specific sequence of operations and the data structures that are commonly used. To create a robust system, we also cover the necessary programmatic safeguards that prevent issues like mesh degeneracies, inverted normals, and improper handling of boundary conditions. The goal of this work is not only to consolidate established methods but also to bridge the gap between theory and practice, offering a clear, step-by-step guide for implementing mesh simplification pipelines based on edge collapse.
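The QEM cost described above can be sketched in a few lines: each face contributes a fundamental quadric K = n nᵀ of its supporting plane, quadrics are summed per vertex, and a candidate position for the merged vertex is scored by the quadratic form v̄ᵀQv̄. A minimal illustration (not the paper's reference implementation; the helper names are ours):

```python
import numpy as np

def plane_quadric(p0, p1, p2):
    """Fundamental quadric K = n n^T of the supporting plane of a triangle,
    where n = (a, b, c, d) encodes ax + by + cz + d = 0 with unit normal."""
    normal = np.cross(p1 - p0, p2 - p0)
    normal = normal / np.linalg.norm(normal)
    d = -np.dot(normal, p0)
    n = np.append(normal, d)      # homogeneous plane coefficients
    return np.outer(n, n)         # 4x4 symmetric matrix

def qem_cost(Q, v):
    """Sum of squared distances from point v to the planes accumulated in Q."""
    vh = np.append(v, 1.0)        # homogeneous point
    return float(vh @ Q @ vh)

# Accumulate quadrics of the faces incident to an edge, then score candidate
# positions for the merged vertex; the cheapest position wins.
tri = (np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0]))
Q = plane_quadric(*tri)
print(qem_cost(Q, np.array([0.3, 0.3, 0.0])))  # on the plane -> 0.0
print(qem_cost(Q, np.array([0.3, 0.3, 1.0])))  # unit distance -> 1.0
```

In a full pipeline, per-vertex quadrics are precomputed, edge costs go into a priority queue, and quadrics are re-summed after each collapse.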
Rémi Cardon, A. Seza Doğruöz
Readability is a key concept in the current era of abundant written information. To help make texts more readable and information more accessible to everyone, a line of research aims at making texts accessible for their target audience: automatic text simplification (ATS). Lately, there have been studies on the correlations between automatic evaluation metrics in ATS and human judgment. However, the correlations between those two aspects and commonly available readability measures (such as readability formulas or linguistic features) have not received as much attention. In this work, we investigate the place of readability measures in ATS by complementing the existing studies on evaluation metrics and human judgment, on English. We first discuss the relationship between ATS and research in readability; then we report a study on correlations between readability measures and human judgment, and between readability measures and ATS evaluation metrics. We find that, in general, readability measures do not correlate well with automatic metrics or human judgment. We argue that, as the three different angles from which simplification can be assessed tend to exhibit rather low correlations with one another, there is a need for a clear definition of the construct in ATS.
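As a concrete example of the "readability formulas" the abstract refers to, here is the classic Flesch Reading Ease score; the syllable counter is a crude vowel-group heuristic, so the result should be read as an approximation, not a calibrated measure:

```python
import re

def flesch_reading_ease(text):
    """Flesch Reading Ease: 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words).
    Syllables are estimated by counting vowel groups (a rough heuristic)."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    def syllables(word):
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))
    n_words = max(1, len(words))
    n_syll = sum(syllables(w) for w in words)
    return 206.835 - 1.015 * (n_words / sentences) - 84.6 * (n_syll / n_words)

print(flesch_reading_ease("The cat sat on the mat."))  # short monosyllables -> high score
```

Higher scores mean easier text; linguistic-feature measures go beyond such surface counts, which is partly why correlations between the three assessment angles differ.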
Jiaxuan Chen, Ling Zhou Shen, Jinchen Liu
In this article, we discuss the optimization of Shanghai's recycling collection program, where the core task is deciding among alternative options. We present a vivid and comprehensive application of a classical multi-criteria decision model, the Analytic Hierarchy Process (AHP), using the eigenvector method. We also seek key criteria for the sustainable development of human society by assessing the important elements of waste recycling. First, we evaluated quantified scores for the benefits and the costs of recycling household glass waste in Shanghai. For each score, we adopted the AHP method to build a hierarchical structure of the problem: we identified the key assessment criteria from various perspectives, including direct monetary costs and benefits as well as environmental and other indirect considerations. We then distributed questionnaires to our school science teachers and took the geometric mean of their responses to build the pairwise comparison matrix over the criteria. With the theoretical modeling in place, we collected the datasets needed to evaluate each score from official statistics, Internet sources, market information, and news reports; where a required quantity was not directly available, we derived it from other data. Finally, we considered the generalization of our mathematical model from several perspectives, including the extension of the assessment criteria and the dynamic interdependency between wastes inside a transportation container of limited capacity.
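The eigenvector method named above can be sketched as follows: given a reciprocal pairwise comparison matrix, the principal eigenvector yields the priority weights, and the consistency ratio checks the coherence of the judgments. The matrix values below are illustrative, not the study's survey data:

```python
import numpy as np

# Reciprocal pairwise comparison matrix for three criteria A, B, C:
# A judged 3x as important as B and 5x as important as C (illustrative values).
M = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# The principal eigenvector gives the priority weights.
eigvals, eigvecs = np.linalg.eig(M)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()                      # normalize so weights sum to 1

# Saaty's consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI,
# with random index RI = 0.58 for n = 3; CR < 0.1 is conventionally acceptable.
n = M.shape[0]
CI = (eigvals[k].real - n) / (n - 1)
CR = CI / 0.58
print("weights:", w, "CR:", CR)
```

Group judgments are typically aggregated by the element-wise geometric mean of individual matrices before this step, as the study does with its questionnaire responses.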
Tanay Kumar, Hongying Zhao, Xuehua Zhang
Drying concentrated slurry waste is slow, particularly because of the entrapment and limited accessibility of water entrained between the particles in the slurry. A sailboat evaporator with a root-like structure is a new system that enables wind-assisted interfacial evaporation of concentrated particle slurries. In this work, we create access to the disconnected water pockets in concentrated slurry waste, facilitating faster water conduction and efficient evaporation at extremely high solid concentration. The evaporator's long roots effectively extracted water from beneath a 150 cm-deep supernatant water layer. Through replantation of the evaporator to a separate location, an impressive evaporation rate (ER) of 4 kg/(m^2*h) was achieved at close to 80 wt% solid concentration, a 25% increase over a non-replanted sample. Furthermore, long periods of efficient evaporation were achieved even at high solid concentration through hydrodynamic flushing of the roots. Outdoor experiments achieved substantial volumetric reduction, yielding dried residues with over 75 wt% solid concentration. These results underscore the system's reliable performance on highly concentrated slurries that conventional industrial methods, including flocculation and tail-lift drying, have yet to handle effectively. The integration of renewable energy with efficient enhancement techniques makes the sailboat evaporator a scalable and sustainable pathway for industrial wastewater dewatering.
Akio Hayakawa, Stefan Bott, Horacio Saggion
Despite their strong performance, large language models (LLMs) face challenges in real-world application of lexical simplification (LS), particularly in privacy-sensitive and resource-constrained environments. Moreover, since vulnerable user groups (e.g., people with disabilities) are one of the key target groups of this technology, it is crucial to ensure the safety and correctness of the output of LS systems. To address these issues, we propose an efficient framework for LS systems that utilizes small LLMs deployable in local environments. Within this framework, we explore knowledge distillation with synthesized data and in-context learning as baselines. Our experiments in five languages evaluate model outputs both automatically and manually. Our manual analysis reveals that while knowledge distillation boosts automatic metric scores, it also introduces a safety trade-off by increasing harmful simplifications. Importantly, we find that the model's output probability is a useful signal for detecting harmful simplifications. Leveraging this, we propose a filtering strategy that suppresses harmful simplifications while largely preserving beneficial ones. This work establishes a benchmark for efficient and safe LS with small LLMs. It highlights the key trade-offs between performance, efficiency, and safety, and demonstrates a promising approach for safe real-world deployment.
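The probability-based filtering strategy described above can be sketched as a simple threshold on the model's confidence in each proposed substitute; the data structure, scores, and threshold below are illustrative placeholders, not the paper's actual pipeline:

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    word: str         # complex source word
    substitute: str   # proposed simpler substitute
    logprob: float    # model's log-probability for the substitute (illustrative)

def filter_candidates(candidates, threshold=-2.0):
    """Keep substitutes the model assigned high probability; drop low-probability
    ones, which (per the paper's finding) are more likely to be harmful.
    The threshold is a placeholder to be tuned on held-out data."""
    return [c for c in candidates if c.logprob >= threshold]

cands = [Candidate("ubiquitous", "common", -0.4),
         Candidate("benign", "deadly", -5.1)]   # low confidence, likely harmful
kept = filter_candidates(cands)
print([c.substitute for c in kept])
```

The trade-off is recall: raising the threshold suppresses more harmful outputs but also discards some beneficial simplifications, which is the balance the paper measures.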
Jon Crall
Small, amorphous waste objects such as biological droppings and microtrash can be difficult to see, especially in cluttered scenes, yet they matter for environmental cleanliness, public health, and autonomous cleanup. We introduce "ScatSpotter": a new dataset of images annotated with polygons around dog feces, collected to train and study object detection and segmentation systems for small, potentially camouflaged outdoor waste. We gathered data in mostly urban environments using a "before/after/negative" (BAN) protocol: for a given location, we capture an image with the object present, an image from the same viewpoint after removal, and a nearby negative scene that often contains visually similar confusers. Image collection began in 2020; this paper focuses on two dataset checkpoints, from 2024 and 2025. The dataset contains over 9000 images and 6000 polygon annotations. Of the author-captured images, we held out 691 for validation and used the rest for training. Via community participation we obtained a 121-image test set that, while small, is independent of the author-collected images and provides some confidence in generalization across photographers, devices, and locations. Due to its limited size, we report both validation and test results. We explore the difficulty of the dataset using off-the-shelf ViT, Mask R-CNN, YOLOv9, and DINOv2 models. Zero-shot DINO performs poorly, indicating limited foundation-model coverage of this category. Tuned DINO is the best model, with a box-level average precision of 0.69 on the 691-image validation set and 0.70 on the test set. These results establish strong baselines and quantify the remaining difficulty of detecting small, camouflaged waste objects. To support open access to models and data, we compare centralized and decentralized distribution mechanisms and discuss trade-offs for sharing scientific data. Code and project details are hosted on GitHub.
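Box-level average precision, the metric reported above, matches predictions to ground truth by intersection-over-union (IoU); a minimal IoU helper for axis-aligned boxes looks like this (a generic sketch, not the paper's evaluation code):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)   # intersection corners
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    return inter / (area_a + area_b - inter) if inter > 0 else 0.0

print(iou((0, 0, 2, 2), (1, 1, 3, 3)))  # intersection 1, union 7 -> ~0.1429
```

A prediction typically counts as a true positive when its IoU with an unmatched ground-truth box exceeds a threshold such as 0.5; AP then summarizes precision over recall.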
Jordina Francès de Mas, Juliana Bowles
This paper presents a novel simplification calculus for propositional logic derived from Peirce's existential graphs' rules of inference and implication graphs. Our rules can be applied to propositional logic formulae in nested form, are equivalence-preserving, guarantee a monotonically decreasing number of variables, clauses and literals, and maximise the preservation of structural problem information. Our techniques can also be seen as higher-level SAT preprocessing, and we show how one of our rules (TWSR) generalises and streamlines most of the known equivalence-preserving SAT preprocessing methods. In addition, we propose a simplification procedure based on the systematic application of two of our rules (EPR and TWSR) which is solver-agnostic and can be used to simplify large Boolean satisfiability problems and propositional formulae in arbitrary form, and we provide a formal analysis of its algorithmic complexity in terms of space and time. Finally, we show how our rules can be further extended with a novel n-ary implication graph to capture all known equivalence-preserving preprocessing procedures.
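While the paper's EPR and TWSR rules are not reproduced here, a familiar member of the equivalence-preserving SAT preprocessing family that TWSR is said to generalise is clause subsumption, which likewise never increases the number of clauses or literals. A minimal sketch over integer-literal CNF:

```python
def subsume(clauses):
    """Remove clauses subsumed by another clause (C subsumes D when C is a
    subset of D): dropping D preserves logical equivalence, since C already
    forces D to hold. Duplicates collapse to their first occurrence."""
    sets = [frozenset(c) for c in clauses]
    kept = []
    for i, c in enumerate(sets):
        if any(j != i and d <= c and (d < c or j < i) for j, d in enumerate(sets)):
            continue  # c is subsumed by a strictly smaller (or earlier equal) clause
        kept.append(sorted(c))
    return kept

# (x) subsumes (x v y); the duplicated clause collapses to one copy.
cnf = [[1], [1, 2], [-2, 3], [-2, 3]]
print(subsume(cnf))  # -> [[1], [-2, 3]]
```

Production preprocessors use occurrence lists and signatures to avoid this quadratic scan; the point here is only the equivalence-preserving, monotonically shrinking character the paper's rules share.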
Minsu Oh
Heat is an inevitable outcome of energy consumption processes, as more than 65% of input energy is wasted as heat. Generating electricity from waste heat would help minimize the need for fossil fuels in power plants and reduce carbon emissions. One way to generate power from heat is through thermophotovoltaics (TPVs), where photons radiated from thermal emitters are converted into electricity. To optimize TPV performance, it is crucial to design emitters so that their emissivity spectrum matches their operating temperature: higher emissivity is needed at shorter (longer) wavelengths for higher (lower) temperatures. The ability to create wavelength-selective emitters can therefore enable TPV applications over a wider range of temperatures. This research focuses on utilizing metamaterials (2D emitters) and planar thin films (1D emitters) to create such emitters. Simulation, fabrication, material property analysis, and radiation measurements were used to characterize the emitters. Based on simulation, metamaterial emitters exhibit engineerable emissivity owing to the various mechanisms of their optical resonance. Large-area fabrication of 1D emitters (78 cm^2) was also achieved owing to their simple structure, which is required to produce higher TPV power output. The advantages and challenges of each emitter type are discussed in light of these characteristics. Together, the comprehensive results of this research help realize the practical implementation of TPV applications.
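The stated design rule, higher emissivity at shorter wavelengths for hotter emitters, follows from Wien's displacement law, which a quick back-of-envelope check illustrates (the temperatures below are illustrative, not the study's operating points):

```python
# Wien's displacement law: lambda_peak = b / T, with b ~ 2897.8 um*K.
# Hotter emitters radiate with their spectral peak at shorter wavelengths,
# which motivates matching emissivity spectra to operating temperature.
WIEN_B_UM_K = 2897.8  # Wien's displacement constant in um*K

def peak_wavelength_um(temperature_k):
    """Wavelength (in micrometers) of peak blackbody spectral radiance."""
    return WIEN_B_UM_K / temperature_k

for T in (800, 1500, 2400):
    print(f"{T} K -> spectral peak near {peak_wavelength_um(T):.2f} um")
```

A selective emitter for a 1500 K source should thus concentrate emissivity around ~2 µm, whereas a cooler source calls for longer-wavelength selectivity.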
Sweta Agrawal, Marine Carpuat
Text simplification (TS) systems rewrite text to make it more readable while preserving its content. However, what makes a text easy to read depends on the intended readers. Recent work has shown that pre-trained language models can simplify text using a wealth of techniques to control output simplicity, ranging from specifying only the desired reading grade level to directly specifying low-level edit operations. Yet it remains unclear how to set these control parameters in practice. Existing approaches set them at the corpus level, disregarding the complexity of individual inputs and considering only one level of output complexity. In this work, we conduct an empirical study to understand how different control mechanisms impact the adequacy and simplicity of text simplification systems. Based on these insights, we introduce a simple method that predicts the edit operations required for simplifying a text to a specific grade level on an instance-by-instance basis. This approach improves the quality of the simplified outputs over corpus-level search-based heuristics.
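Corpus-level control of the kind critiqued above is commonly realized by prepending control tokens to the input of a seq2seq simplifier. The token names below are hypothetical, and the paper's contribution is precisely to predict such controls per input instance rather than fix them globally:

```python
def add_control_tokens(source, target_grade, length_ratio):
    """Prepend simple control tokens to a source sentence before feeding it
    to a seq2seq simplifier. The token vocabulary here is hypothetical;
    real systems range from a single grade-level token to explicit
    edit-operation labels. Corpus-level approaches fix target_grade and
    length_ratio for every input, regardless of how complex it already is."""
    return f"<grade_{target_grade}> <len_{length_ratio:.2f}> {source}"

print(add_control_tokens("The committee deliberated at length.", 4, 0.80))
```

An instance-level predictor would instead estimate the appropriate length ratio (and other edit controls) from the source sentence and the requested grade before building this prefix.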
Tannon Kew, Alison Chi, Laura Vásquez-Rodríguez et al.
We present BLESS, a comprehensive performance benchmark of the most recent state-of-the-art large language models (LLMs) on the task of text simplification (TS). We examine how well off-the-shelf LLMs can solve this challenging task, assessing a total of 44 models, differing in size, architecture, pre-training methods, and accessibility, on three test sets from different domains (Wikipedia, news, and medical) under a few-shot setting. Our analysis considers a suite of automatic metrics as well as a large-scale quantitative investigation into the types of common edit operations performed by the different models. Furthermore, we perform a manual qualitative analysis on a subset of model outputs to better gauge the quality of the generated simplifications. Our evaluation indicates that the best LLMs, despite not being trained on TS, perform comparably with state-of-the-art TS baselines. Additionally, we find that certain LLMs demonstrate a greater range and diversity of edit operations. Our performance benchmark will be available as a resource for the development of future TS methods and evaluation metrics.
J. Murphy, C. Miller, E. Yu
As today’s laboratories adjust to the ever-changing landscape of growing demands while continuing to provide quality diagnostic testing, the need to evaluate and improve workflow, productivity, and efficiency has never been more important. One way to ensure current processes and technology are fully optimized is to perform a workflow analysis study. Geisinger Medical Center (GMC) recognized an opportunity to consolidate its immunology laboratory for allergy and autoimmune testing, with an annual volume of approximately 138,000. A consolidation solution can bring many improvements for the laboratory, such as improved turnaround times, decreased manual labor, reduced reagent waste, increased employee morale, and enhanced space utilization. Maximizing operational efficiency is and will always be a good business model. We aimed to assess pre- and post-consolidation workflows to determine the impact on laboratory resources. To measure the impact of the instrument consolidation and the integration of Phadia250™ systems, operational data were collected through direct workflow observations, time and motion studies, and targeted interviews of testing personnel at GMC’s immunology laboratory. The initial/baseline study was conducted in May 2022 and the follow-up/post study in October 2022. The baseline study encompassed seven separate platforms that GMC utilized for autoimmune and allergy testing: two DYNEX DSX®, two Werfen BIO-FLASH®, two Siemens IMMULITE® 2000, and one ZEUS IFA™. The workflow assessment resulted in GMC consolidating down to three platforms for autoimmune and allergy testing: two Phadia250 and one ZEUS IFA™. The major benefit of the change was the reduction from three types of technology to one, resulting in standardization of practices.
The decrease in the number of testing systems simplified overall test management (fewer LIS interfaces, process steps, instrument maintenance tasks, reagent management, and contract management) and reduced system footprints. A total of 361 square feet of laboratory space was saved, equating to a 57% improvement over the baseline metric. After consolidating most of the testing to the two Phadia250 systems, total daily manual time went from 4.2 h to just over 2.5 h. The combined workflow assessment resulted in saving a total of one full-time employee (FTE). To keep up with growing testing demands, laboratories must continue to produce high-quality results in an efficient manner. Test assay quality and utilization should also be evaluated when determining test consolidation options. Instrument consolidation is a viable strategy for saving technologist time, space, and costs. This results not only in economic savings for the laboratory but also allows medical technologists to be redirected to other, more needed areas of the laboratory, while the saved space can be used for test expansion. In the climate of a major nationwide shortage of medical technologists, efficient workflows and productivity are important considerations to ensure the continued success of any laboratory.
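The reported figures can be sanity-checked with simple arithmetic (input values are taken from the abstract; the implied baseline footprint and percentage reduction are our derivations):

```python
# Reported consolidation savings from the workflow study.
space_saved_sqft = 361            # laboratory space saved
improvement_fraction = 0.57       # stated 57% improvement over baseline
manual_before_h, manual_after_h = 4.2, 2.5   # daily manual time, before/after

# Implied baseline footprint: 361 sq ft is 57% of the original space.
baseline_sqft = space_saved_sqft / improvement_fraction
daily_hours_saved = manual_before_h - manual_after_h

print(f"implied baseline footprint ~ {baseline_sqft:.0f} sq ft")
print(f"daily manual time saved ~ {daily_hours_saved:.1f} h "
      f"({daily_hours_saved / manual_before_h:.0%} reduction)")
```

The ~40% cut in daily manual time is what accumulates into the one-FTE saving the study reports.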
T. Cajuhi, J. Maßmann, G. Ziefle et al.
Abstract. Understanding complex systems, such as radioactive waste repositories, involves the study of cross-scale coupled processes. We discuss some important concepts and their mutual interactions for interpreting such systems based on complementary model-based analyses at various scales (Fig. 1). These points are linked with practical examples that pertain to the hydromechanical effects and cracking of the Opalinus Clay. In the Federal Ministry of Education and Research (BMBF)-funded project “Geomechanical integrity of host and barrier rocks – experiment, modeling and analysis of discontinuities (GeomInt2)” (Cajuhi et al., 2023a in Kolditz et al., 2023), these effects have been investigated experimentally and numerically, both at laboratory and at field scales. While interpretation influences the conceptualization of experimental and/or numerical models, a clear goal definition is required to delimit the complex system. One goal statement in the context of this contribution is to explain the formation of drying cracks, where essential processes and boundary conditions must be pinned down. Monitoring data are critical for determining the correlation between experimental and numerical setups. This is achieved by quantifying and questioning the assignability of information derived from data obtained from core samples tested in the laboratory to field-scale analyses. Numerical modeling can help validate model ideas by reproducing measured data and making predictions beyond the experimentally observable range. The detailed information gained from these studies can be used for interpretation and simplification. In a recent study, we used the phase-field approach to model the formation and evolution of cracks and stated the conditions under which desiccation cracks will develop at the field scale as well as how deep they propagate into the rock (Cajuhi et al., 2023b). 
This information about the near field can be used to determine how detailed repository far-field models must be and, for example, whether cracks need to be taken into account. Consequently, the cross-scale study of complex systems can lead to more robust analysis results.
A. Brai, C. Vagaggini, C. Pasqualini et al.
Agricultural and industrial waste represent valuable starting materials for creating novel products with added economic value. The winery industry is an important economic sector in Italy, producing tons of by-products every year. Global warming and the increasing demand for food and feed led us to analyse the nutraceutical properties of distillery by-products as possible supplements to feed Tenebrio molitor larvae (TML). Grape pomace (GP) and grape marcs (GM), grape skin pulp, grape seeds, and winery waste sludge were analysed for their antioxidant activity and fatty acid (FA) profile. Even after multiple processing steps, the by-products retained an important content of antioxidant compounds, in particular polyphenols, flavonols, flavonoids, and condensed tannins. Moreover, their high amount of polyunsaturated fatty acids and low percentage of saturated fatty acids make them useful feed supplements. We show that their use as TML feed material was well tolerated over TML development, with a significant mean weight gain of about 25% with respect to the control and no effect on survival rate. Interestingly, total antioxidant activity and the FA profile improved significantly, suggesting that distillery by-products can be used to extend the shelf-life of TML and improve their nutraceutical properties, with possible application in controlled dietary regimens. This work confirmed that by-products largely produced in Europe can be used as TML feed materials, simplifying waste management and reducing rearing costs.
H. Muin, Z. Alias, Adibi M. Nor et al.
Reem Hazim, Hind Saddiki, Bashar Alhafni et al.
This demo paper presents a Google Docs add-on for automatic Arabic word-level readability visualization. The add-on includes a lemmatization component that is connected to a five-level readability lexicon and Arabic WordNet-based substitution suggestions. The add-on can be used for assessing the reading difficulty of a text and identifying difficult words as part of the task of manual text simplification. We make our add-on and its code publicly available.
Asma Sakri, A. Aouabed, A. Nassour et al.
As in many developing countries, municipal solid waste (MSW) management is one of the most significant challenges facing urban communities in Algeria. The effective management of solid waste involves the application of various treatment methods and technologies to ensure the protection of public health and the environment. This research work aimed to examine the potential production and utilization of refuse-derived fuel (RDF) from MSW as a substitute fuel in cement kilns in Algeria. After receiving the input waste, sieves were used to categorize the MSW by size. The waste fractions >80 mm were subjected to a drying process in an open-air area and were turned periodically in order to increase the dry matter (DM) content. A cost study was performed to evaluate the environmental and economic savings of RDF utilization in the cement industry. At the end of the drying process, as a consequence of the reduction in waste moisture, the lower heating value was found to be 16 MJ kg−1 and the DM 87%. Heavy metal concentrations were within the limits set by the European Committee for Standardization (CEN)/TC 343 standard. The chlorine content was around 0.37% to 0.80%. The feasibility study of adding RDF as a substitute fuel in the cement industry showed that with 15% RDF substitution, RDF consumption will be 4.7 metric tonnes (Mt) h−1, saving 4347.2 Nm3 h−1 of natural gas and 0.3 Mt h−1 in carbon dioxide emissions, with a net gas cost saving of 65 USD h−1.
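The reported substitution figures can be turned into per-tonne quantities with simple arithmetic (input values from the study; the derived unit savings are our back-of-envelope calculation, not figures from the paper):

```python
# Reported figures at 15% RDF substitution in the cement kiln.
rdf_rate_t_h = 4.7            # RDF consumption, tonnes per hour
lhv_mj_kg = 16.0              # lower heating value of the dried RDF, MJ/kg
gas_saved_nm3_h = 4347.2      # natural gas displaced, Nm3 per hour
gas_cost_saving_usd_h = 65.0  # net gas cost saving, USD per hour

# Thermal energy delivered by the RDF stream per hour (GJ/h):
energy_gj_h = rdf_rate_t_h * 1000 * lhv_mj_kg / 1000

print(f"RDF energy input ~ {energy_gj_h:.1f} GJ/h")
print(f"~ {gas_saved_nm3_h / rdf_rate_t_h:.0f} Nm3 gas and "
      f"{gas_cost_saving_usd_h / rdf_rate_t_h:.1f} USD saved per tonne of RDF")
```

Checks like this make it easy to rescale the study's hourly savings to other kiln throughputs or substitution rates.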
P. Herrmann, Maria A. Yudina
The following introduces the concept of overlife, not claiming that it is an entirely new idea, but suggesting that it is a suitable term to bring different problems of contemporary societal development together. Broadly speaking, overlife is defined as the simultaneous condensing of patterns of life and of actual living, i.e. intensifying living by establishing patterns of multitasking; however, this occurs at the price of a shallowed concept of life produced by a differentiated system of standardization. Simplification of cognition and education, not least in the context of digitization, is an important factor: the apparent increase in control that everybody experiences goes hand-in-hand with increasing difficulty in understanding – and enjoying – the complexity with which we are confronted. Still, although this seems to be a secular process concerning humanity and humans in general, control and power remain in the hands of a few who, as individuals and corporations, design life and society. Paradoxically, the theoretically gained possibility of answering complex questions and developing long-term perspectives turns, at least under capitalist conditions, into narcissistic idiosyncrasies and the wasting of huge amounts of money on the thrill of egos instead of strategically developing socio-economic strategies addressing major challenges such as poverty, environmental threats, digitization, and new forms of stupidification.
T. McGinley, Thomas Vestergaard, C. Jeong et al.
Architects require the insight of acoustic engineers to understand how to improve and/or optimize the acoustic performance of their buildings. Normally this is supported by the architect providing digital models of the design to the acoustic engineer for analysis in the acoustician’s disciplinary software, for instance Odeon. This current workflow suffers from the following challenges: (1) architects typically require feedback on architectural disciplinary models that contain too much geometric information, unnecessarily complicating the acoustic analysis process; (2) the acoustician then has to waste time simplifying that geometry; and (3) this extra work wastes money that could otherwise be spent on faster design iterations supported by frequent feedback between architects and acousticians early in the design process. This paper focuses on the architect/acoustician workflow; however, similar challenges can be found in other disciplines. OpenBIM workflows provide opportunities to increase the standardization of processes and interfaces between disciplines by reducing the reliance on proprietary, discipline-specific file formats and tools. This paper lays the foundation for an OpenBIM workflow to enable the acoustic engineer to provide near real-time feedback on the acoustic performance of the architectural design. The proposed workflow investigates the use of the international standard IFC as a design format rather than simply an exchange format. The workflow is presented here with the intention that it will be further explored and developed by other researchers, architects, and acousticians.
Mees van de Kerkhof, Irina Kostitsyna, Maarten Löffler
We prove that circle graphs (intersection graphs of circle chords) can be embedded as intersection graphs of rays in the plane with polynomial-size bit complexity. We use this embedding to show that the global curve simplification problem for the directed Hausdorff distance is NP-hard. In this problem, we are given a polygonal curve $P$ and the goal is to find a second polygonal curve $P'$ such that the directed Hausdorff distance from $P'$ to $P$ is at most a given constant, and the complexity of $P'$ is as small as possible.
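For reference, the directed Hausdorff distance used in the problem statement is the maximum, over points of P', of the distance to the nearest point of P. A sketch on sampled points (the actual problem ranges over the continuous polygonal curves, not finite samples):

```python
import math

def directed_hausdorff(points_p_prime, points_p):
    """Directed Hausdorff distance from point set P' to point set P:
    max over p' in P' of (min over p in P of |p' - p|).
    Note the asymmetry: swapping the arguments changes the value."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    return max(min(dist(pp, p) for p in points_p) for pp in points_p_prime)

P  = [(0, 0), (1, 0), (2, 0)]          # samples of the input curve
Pp = [(0, 0), (2, 1)]                  # samples of a simplified curve
print(directed_hausdorff(Pp, P))       # farthest P' point is (2, 1), nearest P point (2, 0) -> 1.0
```

The simplification problem asks for a P' of minimum complexity keeping this one-sided distance (from P' to P) below a given bound, which the paper shows is NP-hard.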
Page 24 of 22746