Results for "Standardization. Simplification. Waste"

Showing 20 of ~425,154 results · from CrossRef, DOAJ, arXiv

arXiv Open Access 2026
GaussianPOP: Principled Simplification Framework for Compact 3D Gaussian Splatting via Error Quantification

Soonbin Lee, Yeong-Gyu Kim, Simon Sasse et al.

Existing 3D Gaussian Splatting simplification methods commonly use importance scores, such as blending weights or sensitivity, to identify redundant Gaussians. However, these scores are not driven by visual error metrics, often leading to suboptimal trade-offs between compactness and rendering fidelity. We present GaussianPOP, a principled simplification framework based on analytical Gaussian error quantification. Our key contribution is a novel error criterion, derived directly from the 3DGS rendering equation, that precisely measures each Gaussian's contribution to the rendered image. By introducing a highly efficient algorithm, our framework enables practical error calculation in a single forward pass. The framework is both accurate and flexible, supporting on-training pruning as well as post-training simplification via iterative error re-quantification for improved stability. Experimental results show that our method consistently outperforms existing state-of-the-art pruning methods across both application scenarios, achieving a superior trade-off between model compactness and high rendering quality.

en cs.CV
arXiv Open Access 2025
Square Packing with Asymptotically Smallest Waste Only Needs Good Squares

Hong Duc Bui

We consider the problem of packing a large square with nonoverlapping unit squares. Let $W(x)$ be the minimum wasted area when a large square of side length $x$ is packed with unit squares. In Roth and Vaughan's paper that proves the lower bound $W(x) \notin o(x^{1/2})$, a good square is defined to be a square with inclination at most $10^{-10}$ with respect to the large square. In this article, we prove that in calculating the asymptotic growth of the wasted space, it suffices to only consider packings with only good squares. This allows the lower bound proof in Roth and Vaughan's paper to be simplified by not having to handle bad squares.

en cs.CG
arXiv Open Access 2025
Artificial Intelligence in the Food Industry: Food Waste Estimation based on Computer Vision, a Brief Case Study in a University Dining Hall

Shayan Rokhva, Babak Teimourpour

Quantifying post-consumer food waste in institutional dining settings is essential for supporting data-driven sustainability strategies. This study presents a cost-effective computer vision framework that estimates plate-level food waste by utilizing semantic segmentation of RGB images taken before and after meal consumption across five Iranian dishes. Four fully supervised models (U-Net, U-Net++, and their lightweight variants) were trained using a capped dynamic inverse-frequency loss and AdamW optimizer, then evaluated through a comprehensive set of metrics, including Pixel Accuracy, Dice, IoU, and a custom-defined Distributional Pixel Agreement (DPA) metric tailored to the task. All models achieved satisfying performance, and for each food type, at least one model approached or surpassed 90% DPA, demonstrating strong alignment in pixel-wise proportion estimates. Lighter models with reduced parameter counts offered faster inference, achieving real-time throughput on an NVIDIA T4 GPU. Further analysis showed superior segmentation performance for dry and more rigid components (e.g., rice and fries), while more complex, fragmented, or viscous dishes, such as stews, showed reduced performance, specifically post-consumption. Despite limitations such as reliance on 2D imaging, constrained food variety, and manual data collection, the proposed framework is pioneering and represents a scalable, contactless solution for continuous monitoring of food consumption. This research lays foundational groundwork for automated, real-time waste tracking systems in large-scale food service environments and offers actionable insights and outlines feasible future directions for dining hall management and policymakers aiming to reduce institutional food waste.

en cs.CV, cs.AI
arXiv Open Access 2024
Impact of high-pressure torsion on hydrogen production from photodegradation of polypropylene plastic wastes

Thanh Tam Nguyen, Kaveh Edalati

Plastic waste entering the environment through landfilling or improper disposal poses substantial risks to ecosystems and human health. Photoreforming is emerging as a clean photocatalytic technology that degrades plastic waste to organic compounds while simultaneously producing hydrogen fuel. This study introduces high-pressure torsion (HPT), a severe plastic deformation (SPD) method, as an innovative technique to enhance the photoreforming of polypropylene (PP) plastic mixed with a brookite TiO2 photocatalyst. Hydrogen production systematically increases with the number of HPT turns, accompanied by the formation of valuable small organic molecules. The enhancement in photocatalytic activity is attributed to strain-induced defect formation in both catalysts and plastics, as well as the creation of catalyst/plastic interphases that enhance charge carrier transport between inorganic and organic phases. These findings reveal a new functional application for SPD in energy conversion and sustainability.

en cond-mat.mtrl-sci
arXiv Open Access 2024
CWF: Consolidating Weak Features in High-quality Mesh Simplification

Rui Xu, Longdu Liu, Ningna Wang et al.

In mesh simplification, common requirements like accuracy, triangle quality, and feature alignment are often considered as a trade-off. Existing algorithms concentrate on just one or a few specific aspects of these requirements. For example, the well-known Quadric Error Metrics (QEM) approach prioritizes accuracy and can preserve strong feature lines/points as well but falls short in ensuring high triangle quality and may degrade weak features that are not as distinctive as strong ones. In this paper, we propose a smooth functional that simultaneously considers all of these requirements. The functional comprises a normal anisotropy term and a Centroidal Voronoi Tessellation (CVT) energy term, with the variables being a set of movable points lying on the surface. The former inherits the spirit of QEM but operates in a continuous setting, while the latter encourages even point distribution, allowing various surface metrics. We further introduce a decaying weight to automatically balance the two terms. We selected 100 CAD models from the ABC dataset, along with 21 organic models, to compare the existing mesh simplification algorithms with ours. Experimental results reveal an important observation: the introduction of a decaying weight effectively reduces the conflict between the two terms and enables the alignment of weak features. This distinctive feature sets our approach apart from most existing mesh simplification methods and demonstrates significant potential in shape understanding.

en cs.GR, cs.CG
arXiv Open Access 2024
Health Text Simplification: An Annotated Corpus for Digestive Cancer Education and Novel Strategies for Reinforcement Learning

Md Mushfiqur Rahman, Mohammad Sabik Irbaz, Kai North et al.

Objective: The reading level of health educational materials significantly influences the understandability and accessibility of the information, particularly for minoritized populations. Many patient educational resources surpass the reading level and complexity of widely accepted standards. There is a critical need for high-performing text simplification models in health information to enhance dissemination and literacy. This need is particularly acute in cancer education, where effective prevention and screening education can substantially reduce morbidity and mortality. Methods: We introduce Simplified Digestive Cancer (SimpleDC), a parallel corpus of cancer education materials tailored for health text simplification research, comprising educational content from the American Cancer Society, Centers for Disease Control and Prevention, and National Cancer Institute. Utilizing SimpleDC alongside the existing Med-EASi corpus, we explore Large Language Model (LLM)-based simplification methods, including fine-tuning, reinforcement learning (RL), reinforcement learning with human feedback (RLHF), domain adaptation, and prompt-based approaches. Our experimentation encompasses Llama 2 and GPT-4. A novel RLHF reward function is introduced, featuring a lightweight model adept at distinguishing between original and simplified texts, thereby enhancing the model's effectiveness with unlabeled data. Results: Fine-tuned Llama 2 models demonstrated high performance across various metrics. Our innovative RLHF reward function surpassed existing RL text simplification reward functions in effectiveness. The results underscore that RL/RLHF can augment fine-tuning, facilitating model training on unlabeled text and improving performance.

en cs.CL, cs.AI
arXiv Open Access 2023
Packed bed thermal energy storage for waste heat recovery in the iron and steel industry: An experimental study on powder hold-up and pressure drop

Paul Schwarzmayr, Felix Birkelbach, Heimo Walter et al.

Waste heat recovery in the energy-intensive industry is one of the most important measures for the mitigation of climate change. The utilization of just a fraction of the theoretically available waste heat potential would lead to a significant reduction of primary energy consumption and hence a reduction of greenhouse gas emissions. The present study examines the integration of a packed bed thermal energy storage for waste heat recovery in the iron and steel industry. Along with the highly fluctuating availability of excess heat, the main difficulty of waste heat recovery in industrial processes is the high amount of powder that is transported by the hot exhaust gases. Therefore, investigations focus on the pressure drop and powder hold-up in a packed bed thermal energy storage that is operated with a gas-powder two-phase exhaust gas as heat transfer fluid, with the ultimate goal of assessing its suitability and robustness under such challenging operational conditions. The results indicate that 98% of the powder that is introduced into the system with the heat transfer fluid during charging accumulates in the packed bed. Remarkably, most of the powder hold-up in the packed bed is concentrated near the surface at which the heat transfer fluid enters the packed bed. When reversing the flow direction of the heat transfer fluid to discharge the storage with a clean single-phase gas, this gas is not contaminated with the powder that has been accumulated in previous charging periods. Further, the radial distribution of the powder hold-up in the packed bed is observed to be even, which indicates that there is no risk of random flow channel formation that could affect the thermal performance (storage capacity, thermal power rate) of the system. The results reinforce the great potential of packed bed thermal energy storage systems for waste heat recovery in the energy-intensive industry.

en physics.app-ph
arXiv Open Access 2023
Production of Porous Glass-foam Materials from Photovoltaic Panel Waste Glass

Bui Khac Thach, Le Nhat Tan, Do Quang Minh et al.

Solar energy production is growing quickly to meet global demand for renewable energy and decrease dependence on fossil fuels. However, disposing of used photovoltaic (PV) panels will be a serious environmental challenge in the coming decades, since the solar panels would eventually become a source of hazardous waste. The potential of waste solar panel glass to generate porous glass material with the addition of CaCO3 and water glass was assessed in this study. The porous glass firing temperature range, from 830°C to 910°C, was determined using a simulation of the heating microscope technique. The created samples have the smallest volumetric density of 0.25 g/cm3 and the largest water absorption of 303.08 wt.%. This indicates that the image analysis of samples during the heating process could be used to identify the firing temperature for better foaming, which was favorably indicated by specific physicochemical parameters. The created glass-foam materials, with an apparent porosity of up to 81.49%, could be used as a water-retaining medium in hydroponic and aquaponic systems.

en cond-mat.soft
arXiv Open Access 2023
Surface Simplification using Intrinsic Error Metrics

Hsueh-Ti Derek Liu, Mark Gillespie, Benjamin Chislett et al.

This paper describes a method for fast simplification of surface meshes. Whereas past methods focus on visual appearance, our goal is to solve equations on the surface. Hence, rather than approximate the extrinsic geometry, we construct a coarse intrinsic triangulation of the input domain. In the spirit of the quadric error metric (QEM), we perform greedy decimation while agglomerating global information about approximation error. In lieu of extrinsic quadrics, however, we store intrinsic tangent vectors that track how far curvature "drifts" during simplification. This process also yields a bijective map between the fine and coarse mesh, and prolongation operators for both scalar- and vector-valued data. Moreover, we obtain hard guarantees on element quality via intrinsic retriangulation - a feature unique to the intrinsic setting. The overall payoff is a "black box" approach to geometry processing, which decouples mesh resolution from the size of matrices used to solve equations. We show how our method benefits several fundamental tasks, including geometric multigrid, all-pairs geodesic distance, mean curvature flow, geodesic Voronoi diagrams, and the discrete exponential map.

en cs.GR
arXiv Open Access 2020
Semi-Supervised Text Simplification with Back-Translation and Asymmetric Denoising Autoencoders

Yanbin Zhao, Lu Chen, Zhi Chen et al.

Text simplification (TS) rephrases long sentences into simplified variants while preserving inherent semantics. Traditional sequence-to-sequence models heavily rely on the quantity and quality of parallel sentences, which limits their applicability in different languages and domains. This work investigates how to leverage large amounts of unpaired corpora in the TS task. We adopt the back-translation architecture from unsupervised neural machine translation (NMT), including denoising autoencoders for language modeling and automatic generation of parallel data by iterative back-translation. However, it is non-trivial to generate appropriate complex-simple pairs if we directly treat the sets of simple and complex corpora as two different languages, since the two types of sentences are quite similar and it is hard for the model to capture the characteristics of each type. To tackle this problem, we propose asymmetric denoising methods for sentences of different complexity. When modeling simple and complex sentences with autoencoders, we introduce different types of noise into the training process. Such a method can significantly improve the simplification performance. Our model can be trained in both unsupervised and semi-supervised manners. Automatic and human evaluations show that our unsupervised model outperforms the previous systems, and with limited supervision, our model can perform competitively with multiple state-of-the-art simplification systems.

en cs.CL
arXiv Open Access 2020
Simple-QE: Better Automatic Quality Estimation for Text Simplification

Reno Kriz, Marianna Apidianaki, Chris Callison-Burch

Text simplification systems generate versions of texts that are easier to understand for a broader audience. The quality of simplified texts is generally estimated using metrics that compare to human references, which can be difficult to obtain. We propose Simple-QE, a BERT-based quality estimation (QE) model adapted from prior summarization QE work, and show that it correlates well with human quality judgments. Simple-QE does not require human references, which makes the model useful in a practical setting where users would need to be informed about the quality of generated simplifications. We also show that we can adapt this approach to accurately predict the complexity of human-written texts.

en cs.CL
arXiv Open Access 2019
Dense 3D Visual Mapping via Semantic Simplification

Luca Morreale, Andrea Romanoni, Matteo Matteucci

Dense 3D visual mapping estimates as many pixel depths as possible for each image. This results in very dense point clouds that often contain redundant and noisy information, especially for surfaces that are roughly planar, for instance, the ground or the walls in the scene. In this paper we leverage semantic image segmentation to discriminate which regions of the scene require simplification and which should be kept at a high level of detail. We propose four different point cloud simplification methods which decimate the perceived point cloud by relying on class-specific local and global statistics, while maintaining more points in the proximity of class boundaries to preserve the intra-class edges and discontinuities. The dense 3D model is obtained by fusing the point clouds in a 3D Delaunay Triangulation to deal with variable point cloud density. In the experimental evaluation we show that, by leveraging semantics, it is possible to simplify the model and diminish the noise affecting the point clouds.

en cs.CV
arXiv Open Access 2019
Topology-Preserving Terrain Simplification

Ulderico Fugacci, Michael Kerber, Hugo Manet

We give necessary and sufficient criteria for elementary operations in a two-dimensional terrain to preserve the persistent homology induced by the height function. These operations are edge flips and removals of interior vertices, re-triangulating the link of the removed vertex. This problem is motivated by topological terrain simplification, which means removing as many critical vertices of a terrain as possible while maintaining geometric closeness to the original surface. Existing methods manage to reduce the maximal possible number of critical vertices, but increase thereby the number of regular vertices. Our method can be used to post-process a simplified terrain, drastically reducing its size and preserving its favorable properties.

en cs.CG
arXiv Open Access 2019
Reference-less Quality Estimation of Text Simplification Systems

Louis Martin, Samuel Humeau, Pierre-Emmanuel Mazaré et al.

The evaluation of text simplification (TS) systems remains an open challenge. As the task has common points with machine translation (MT), TS is often evaluated using MT metrics such as BLEU. However, such metrics require high quality reference data, which is rarely available for TS. TS has the advantage over MT of being a monolingual task, which allows for direct comparisons to be made between the simplified text and its original version. In this paper, we compare multiple approaches to reference-less quality estimation of sentence-level text simplification systems, based on the dataset used for the QATS 2016 shared task. We distinguish three different dimensions: grammaticality, meaning preservation and simplicity. We show that n-gram-based MT metrics such as BLEU and METEOR correlate the most with human judgment of grammaticality and meaning preservation, whereas simplicity is best evaluated by basic length-based metrics.

en cs.CL
arXiv Open Access 2018
Optimal distillation of quantum coherence with reduced waste of resources

Gokhan Torun, Ludovico Lami, Gerardo Adesso et al.

We present an optimal probabilistic protocol to distill quantum coherence. Inspired by a specific entanglement distillation protocol, our main result yields a strictly incoherent operation that produces one of a family of maximally coherent states of variable dimension from any pure quantum state. We also expand this protocol to the case where it is possible, for some initial states, to avert any waste of resources as far as the output states are concerned, by exploiting an additional transformation into a suitable intermediate state. These results provide practical schemes for efficient quantum resource manipulation.

en quant-ph, math-ph
arXiv Open Access 2017
Transmutation prospect of long-lived nuclear waste induced by high-charge electron beam from laser plasma accelerator

X. L. Wang, Z. Y. Xu, W. Luo et al.

Photo-transmutation of long-lived nuclear waste induced by a high-charge relativistic electron beam (e-beam) from a laser plasma accelerator is demonstrated. A collimated relativistic e-beam with a high charge of approximately 100 nC is produced from high-intensity laser interaction with near-critical-density (NCD) plasma. This e-beam impinges on a high-Z convertor and then radiates energetic bremsstrahlung photons with flux approaching 10^{11} per laser shot. Taking the long-lived radionuclide ^{126}Sn as an example, the resulting transmutation reaction yield is on the order of 10^{9} per laser shot, which is two orders of magnitude higher than that obtained in previous studies. It is found that at lower densities, a tightly focused laser irradiating relatively longer NCD plasmas can effectively enhance the transmutation efficiency. Furthermore, the photo-transmutation is generalized by considering mixed-nuclide waste samples, which suggests that the laser-accelerated high-charge e-beam could be an efficient tool to transmute long-lived nuclear waste.

en physics.plasm-ph
arXiv Open Access 2015
Formalization of simplification for context-free grammars

Marcus V. M. Ramos, Ruy J. G. B. de Queiroz

Context-free grammar simplification is a subject of high importance in computer language processing technology as well as in formal language theory. This paper presents a formalization, using the Coq proof assistant, of the fact that general context-free grammars generate languages that can also be generated by simpler and equivalent context-free grammars. Namely, the operations of useless symbol elimination, inaccessible symbol elimination, unit rule elimination and empty rule elimination were described and proven correct with respect to the preservation of the language generated by the original grammar.

en cs.FL
arXiv Open Access 2014
A new concept for safeguarding and labeling of long-term stored waste and its place in the scope of existing tagging techniques

Dina Chernikova, Kåre Axell

The idea of a novel labeling method is suggested as a new way of long-term security identification, inventory tracking, and prevention of falsification and theft of waste casks, copper canisters, spent fuel containers, mercury containers, waste packages and other items. The suggested concept is based on the use of a unique combination of radioisotopes with different predictable half-lives. As an option for applying the radioisotope tag to spent fuel safeguarding, it is suggested to use a mixture of α-emitting isotopes, such as 241Am, with materials that easily undergo α-induced reactions with emission of specific γ-lines. Thus, the existing problem of disposing of smoke detectors or other devices [1] which contain radioisotopes can be addressed, indirectly solving an existing waste problem. The results of the first pilot experiments with two general designs of storage canisters are presented: a steel container of the type commonly used for long-term storage of mercury in Europe and the USA, and a copper canister of the type proposed for nuclear repositories. As one of the options for the new labeling method, it is proposed to use a multidimensional bar code symbology and a tungsten plate with ultrasound techniques. It is shown that the new radioisotope label offers several advantages over existing tagging techniques (an overview is given) and can be implemented even with low-activity sources.

en physics.ins-det, nucl-ex

Page 21 of 21,258