Results for "Standardization. Simplification. Waste"

Showing 20 of ~454,931 results · from CrossRef, DOAJ, arXiv, Semantic Scholar

arXiv Open Access 2025
Simplification of Trajectory Streams

Siu-Wing Cheng, Haoqiang Huang, Le Jiang

While there are software systems that simplify trajectory streams on the fly, few curve simplification algorithms with quality guarantees fit the streaming requirements. We present streaming algorithms for two such problems under the Fréchet distance $d_F$ in $\mathbb{R}^d$ for some constant $d \geq 2$. Consider a polygonal curve $\tau$ in $\mathbb{R}^d$ in a stream. We present a streaming algorithm that, for any $\varepsilon \in (0,1)$ and $\delta > 0$, produces a curve $\sigma$ such that $d_F(\sigma,\tau[v_1,v_i]) \le (1+\varepsilon)\delta$ and $|\sigma| \le 2\,\mathrm{opt}-2$, where $\tau[v_1,v_i]$ is the prefix in the stream so far, and $\mathrm{opt} = \min\{|\sigma'| : d_F(\sigma',\tau[v_1,v_i]) \le \delta\}$. Let $\alpha = 2(d-1)\lfloor d/2 \rfloor^2 + d$. The working storage is $O(\varepsilon^{-\alpha})$. Each vertex is processed in $O(\varepsilon^{-\alpha}\log\frac{1}{\varepsilon})$ time for $d \in \{2,3\}$ and $O(\varepsilon^{-\alpha})$ time for $d \geq 4$. Thus, the whole $\tau$ can be simplified in $O(\varepsilon^{-\alpha}|\tau|\log\frac{1}{\varepsilon})$ time. Ignoring polynomial factors in $1/\varepsilon$, this running time is a factor $|\tau|$ faster than the best static algorithm that offers the same guarantees. We present another streaming algorithm that, for any integer $k \geq 2$ and any $\varepsilon \in (0,\frac{1}{17})$, maintains a curve $\sigma$ such that $|\sigma| \leq 2k-2$ and $d_F(\sigma,\tau[v_1,v_i]) \le (1+\varepsilon) \cdot \min\{d_F(\sigma',\tau[v_1,v_i]) : |\sigma'| \leq k\}$, where $\tau[v_1,v_i]$ is the prefix in the stream so far. The working storage is $O((k\varepsilon^{-1}+\varepsilon^{-(\alpha+1)})\log\frac{1}{\varepsilon})$. Each vertex is processed in $O(k\varepsilon^{-(\alpha+1)}\log^2\frac{1}{\varepsilon})$ time for $d \in \{2,3\}$ and $O(k\varepsilon^{-(\alpha+1)}\log\frac{1}{\varepsilon})$ time for $d \geq 4$.
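The paper's algorithms are involved; as a minimal illustration of the streaming setting only — a naive distance-threshold filter with hypothetical names, not the authors' Fréchet-distance method and with none of its guarantees — a simplifier that processes each vertex once in constant working storage might look like:

```python
import math

def stream_simplify(points, delta):
    """Naive streaming baseline (illustrative only): keep an incoming
    vertex only when it drifts more than delta from the last kept
    vertex. Uses O(1) working storage per stream, one pass."""
    kept = []
    for p in points:
        if not kept or math.dist(p, kept[-1]) > delta:
            kept.append(p)
    return kept
```

The paper's contribution is achieving a comparable one-pass, small-memory profile while guaranteeing a $(1+\varepsilon)\delta$ Fréchet-distance bound and near-optimal output size, which this sketch does not.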

en cs.CG
arXiv Open Access 2025
Hamilton-Jacobi Reachability for Viability Analysis of Constrained Waste-to-Energy Systems under Adversarial Uncertainty

Achraf Bouhmady, Othman Cherkaoui Dekkaki

This paper investigates the problem of maintaining the safe operation of Waste-to-Energy (WtE) systems under operational constraints and uncertain waste inflows. We model this as a robust viability problem, formulated as a zero-sum differential game between a control policy and an adversarial disturbance. Within a Hamilton-Jacobi framework, the viability kernel is characterized as the zero sublevel set of a value function satisfying a constrained Hamilton-Jacobi-Bellman (HJB) equation in the viscosity sense. This formulation provides formal guarantees for ensuring that system trajectories remain within prescribed operational limits under worst-case scenarios. Compared to existing viability studies, this work introduces a rigorous HJB-based characterization explicitly incorporating uncertainty, tailored to nonlinear WtE dynamics. A numerical scheme based on the Local Lax-Friedrichs method is employed to approximate the viability kernel. Numerical experiments illustrate how increasing inflow uncertainty significantly reduces the viability domain, shrinking the safe operating envelope. The proposed method is computationally tractable for systems of moderate dimension and offers a basis for synthesizing robust control policies, contributing to the design of resilient and sustainable WtE infrastructures.

arXiv Open Access 2025
SF-Recon: Simplification-Free Lightweight Building Reconstruction via 3D Gaussian Splatting

Zihan Li, Tengfei Wang, Wentian Gan et al.

Lightweight building surface models are crucial for digital cities, navigation, and fast geospatial analytics, yet conventional multi-view geometry pipelines remain cumbersome and quality-sensitive due to their reliance on dense reconstruction, meshing, and subsequent simplification. This work presents SF-Recon, a method that directly reconstructs lightweight building surfaces from multi-view images without post-hoc mesh simplification. We first train an initial 3D Gaussian Splatting (3DGS) field to obtain a view-consistent representation. Building structure is then distilled by a normal-gradient-guided Gaussian optimization that selects primitives aligned with roof and wall boundaries, followed by multi-view edge-consistency pruning to enhance structural sharpness and suppress non-structural artifacts without external supervision. Finally, a multi-view depth-constrained Delaunay triangulation converts the structured Gaussian field into a lightweight, structurally faithful building mesh. Based on a proposed SF dataset, the experimental results demonstrate that our SF-Recon can directly reconstruct lightweight building models from multi-view imagery, achieving substantially fewer faces and vertices while maintaining computational efficiency. Website: https://lzh282140127-cell.github.io/SF-Recon-project/

en cs.CV
arXiv Open Access 2025
Simplifications are Absolutists: How Simplified Language Reduces Word Sense Awareness in LLM-Generated Definitions

Lukas Ellinger, Miriam Anschütz, Georg Groh

Large Language Models (LLMs) can provide accurate word definitions and explanations for any context. However, the scope of the definition changes for different target groups, like children or language learners. This is especially relevant for homonyms, words with multiple meanings, where oversimplification might risk information loss by omitting key senses, potentially misleading users who trust LLM outputs. We investigate how simplification impacts homonym definition quality across three target groups: Normal, Simple, and ELI5. Using two novel evaluation datasets spanning multiple languages, we test DeepSeek v3, Llama 4 Maverick, Qwen3-30B A3B, GPT-4o mini, and Llama 3.1 8B via LLM-as-Judge and human annotations. Our results show that simplification drastically degrades definition completeness by neglecting polysemy, increasing the risk of misunderstanding. Fine-tuning Llama 3.1 8B with Direct Preference Optimization substantially improves homonym response quality across all prompt types. These findings highlight the need to balance simplicity and completeness in educational NLP to ensure reliable, context-aware definitions for all learners.

en cs.CL
arXiv Open Access 2024
New algorithms for the simplification of multiple trajectories under bandwidth constraints

Gilles Dejaegere, Mahmoud Sakr

This study introduces time-windowed variations of three established trajectory simplification algorithms. These new algorithms are specifically designed for contexts with bandwidth limitations. We present the details of these algorithms and highlight the differences compared to their classical counterparts. To evaluate their performance, we conduct accuracy assessments for varying sizes of time windows, utilizing two different datasets and exploring different compression ratios. The accuracies of the proposed algorithms are compared with those of existing methods. Our findings demonstrate that, for larger time windows, the enhanced version of the bandwidth-constrained STTrace outperforms other algorithms, with the bandwidth-constrained improved version of SQUISH also yielding satisfactory results at a lower computational cost. Conversely, for short time windows, only the bandwidth-constrained version of Dead Reckoning remains satisfactory.
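For orientation, classical Dead Reckoning (the baseline the abstract's best short-window variant builds on) keeps a sample only when the observed position deviates too far from a linear extrapolation of the last kept sample. A minimal sketch on 1-D positions — illustrative only, not the paper's bandwidth-constrained variant:

```python
def dead_reckoning(samples, eps):
    """Classical Dead Reckoning sketch. Each sample is (t, x) with a
    scalar position for simplicity. A sample is emitted when its
    position deviates more than eps from the position predicted by
    the last emitted sample's estimated velocity."""
    if not samples:
        return []
    out = [samples[0]]
    t0, x0 = samples[0]
    v = 0.0  # velocity estimate carried by the last emitted sample
    for t, x in samples[1:]:
        pred = x0 + v * (t - t0)
        if abs(x - pred) > eps:
            v = (x - x0) / (t - t0)  # re-estimate velocity
            t0, x0 = t, x
            out.append((t, x))
    return out
```

The time-windowed variants studied in the paper additionally cap how many samples may be emitted per window to respect a bandwidth budget.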

en cs.OH
arXiv Open Access 2024
Simplification & Incidence: How an Incidence-focused Perspective Patches Category-theoretic Problems in Graph Theory

Will Grilliette

By applying simplification operations to categories of multigraphs, several natural graph operations are shown to demonstrate categorical issues. The replacement of an undirected edge with a directed cycle for digraphs admits both a left and a right adjoint, while the analogous operation for quivers only admits a left adjoint. The clique-replacement graph, intersection graph, and dual hypergraph fail to be functorial with traditional graph homomorphisms. The three failures are remedied by considering weak set-system homomorphisms, which form a category isomorphic to both the category of incidence structures and a lax comma category.

en math.CT, math.CO
arXiv Open Access 2022
ALEXSIS-PT: A New Resource for Portuguese Lexical Simplification

Kai North, Marcos Zampieri, Tharindu Ranasinghe

Lexical simplification (LS) is the task of automatically replacing complex words with simpler ones, making texts more accessible to various target populations (e.g. individuals with low literacy, individuals with learning disabilities, second language learners). To train and test models, LS systems usually require corpora that feature complex words in context along with their candidate substitutions. To continue improving the performance of LS systems we introduce ALEXSIS-PT, a novel multi-candidate dataset for Brazilian Portuguese LS containing 9,605 candidate substitutions for 387 complex words. ALEXSIS-PT has been compiled following the ALEXSIS protocol for Spanish, opening exciting new avenues for cross-lingual models. ALEXSIS-PT is the first LS multi-candidate dataset that contains Brazilian newspaper articles. We evaluated four models for substitute generation on this dataset, namely mDistilBERT, mBERT, XLM-R, and BERTimbau. BERTimbau achieved the highest performance across all evaluation metrics.

en cs.CL, cs.AI
S2 Open Access 2021
Comparison of trace element mobility from MSWI ash before and after plasma vitrification

Wesley N Oehmig, Justin G. Roessler, A. Saleh et al.

A common perception of plasma arc treatment systems for municipal solid waste incineration ash is that the resulting vitrified slag is inert from an environmental perspective. Research was conducted to examine this hypothesis and to assess whether reduced pollutant release results from pollutant depletion during processing of the ash with plasma, or from encapsulation in the glassy vitrified matrix. The concentrations in four discrete municipal solid waste incineration ash samples before and after plasma arc vitrification in a bench-scale unit were compared. Slag and untreated ash samples were leached using several standardized approaches, and mobility among the four metals of interest (i.e. As, Cd, Pb, and Sb) varied across samples but was generally high (as high as 100% for Cd). Comparison across methods did not indicate substantial encapsulation in the vitrified slag, which suggests that reduced pollutant release from plasma arc vitrified slag is due to pollutant depletion by volatilization, not encapsulation. This has significant implications for the management of air pollution control residues from waste-to-energy facilities using plasma arc vitrification.

5 citations en Medicine
S2 Open Access 2021
Building the bridge between safety requirements and numerical modeling: an example considering crack development of Opalinus clay in laboratory and field scales

T. Cajuhi, J. Maßmann, G. Ziefle

Abstract. Salt, crystalline and clay formations are under discussion as potential host rocks for storage of heat-generating radioactive waste. Each of these rocks has a different structure and composition, and consequently a different material behavior. The latter needs to be studied and evaluated with respect to the main aim: to find a place to store the waste in a safe and sustainable manner. Several requirements in the context of the safety of a repository need to be fulfilled, concerning the long-term as well as the operational phase. One key point in this matter is integrity, which refers to retention of the isolating rock zone's containment capabilities. With a focus on experimental and numerical investigations of the excavation-influenced near-field behavior of Opalinus clay (OPA), this contribution aims to illustrate an example of the role of numerical modeling in safety assessment. Once anthropogenic action, e.g. excavation, starts, the natural state of equilibrium in the formation is disturbed. Trying to restore it, the rock deforms (convergence) and/or releases energy in other ways such as cracking. This could lead to loss of integrity, since crack nucleation and propagation can affect the mechanical stability and create paths to transport contaminants. During operation in the excavated rock, environmental changes, e.g. in temperature and humidity, further affect its behavior. The understanding of these dynamic phenomena ideally needs to occur at the in situ scale; however, performing an experiment at the spatial and time scales of interest is not always possible. For this reason, the in situ problem needs to be formulated, abstracted and mathematically modeled. The interpretation of the results must take place with simplifying assumptions, and complementary laboratory-scale experiments can be used to improve understanding of the system.
The real problem is approached stepwise, each step associated with the size of the model and its complexity. The gradually obtained knowledge is necessary to achieve a better understanding of the process and to evaluate the capacities and limitations of the models. This contribution aims to show the basic practical steps for numerical modeling with particular focus on the preparation and interpretation of the models and results, e.g. model calibration, verification and validation. As an example, the OPA at the Mont Terri site is chosen. The material parameters are obtained either experimentally or from the literature. We choose and perform laboratory-scale simulations that are related to nearly the same mechanism as at the in situ scale. To gain a first impression of the latter, a simplified, large-scale numerical model is prepared. The mechanism under study is drying and wetting, which is associated with shrinkage and swelling. We analyze the pore pressure and stress development at both scales. Thus, hydraulic-mechanically coupled approaches are essential. The concept of effective stress is used, which combines the contributions of the solid and fluid phases (gas and liquid). In the current modeling approach, the gas pressure remains constant (atmospheric pressure), and during drying, the liquid pressure induces capillary pressure development and a decrease in saturation. The laboratory-scale simulation is important to evaluate the model of choice and to assess potential numerical problems. Furthermore, it can be used to perform a sensitivity study of material and numerical parameters. This step is necessary during the development or extension of numerical models as well as to evaluate their applicability to new research questions. The simplified in situ scale numerical model is then extended. In this phase the numerical model is evaluated once again, especially with respect to its complexity.
Furthermore, specific questions related to this scale are posed: overall behavior of the rock, influence of the excavation, seasonal and long-term effects. In this contribution we deal with the long-term cyclic deformation (CD-A) experiment. The CD-A experiment has been taking place in the Mont Terri Rock Laboratory since October 2019. It consists of twin niches, a closed and an open niche, subjected to either high air humidity or seasonal humidity changes leading to saturation/desaturation of the OPA during summer/winter, respectively. Several parameters are periodically or continuously measured, including relative air humidity, convergence and crack development. We attempt to transfer the knowledge and numerical models developed at the small scale to the large scale and to evaluate the possibilities and limitations of the chosen approaches by comparing the numerical and experimental results.

4 citations en
arXiv Open Access 2021
Instanced model simplification using combined geometric and appearance-related metric

Sadia Tariq, Anis Ur Rahman, Tahir Azim et al.

The evolution of 3D graphics and graphical worlds has brought issues like content optimization, real-time processing, rendering, and shared storage limitation under consideration. Generally, different simplification approaches are used to make 3D meshes viable for rendering. However, many of these approaches ignore vertex attributes for instanced 3D meshes. In this paper, we implement and evaluate a simple, improved method for simplifying instanced 3D textured models. The approach uses different vertex attributes in addition to geometry to simplify mesh instances. The resulting simplified models demonstrate efficient time-space requirements and better visual quality.

en cs.GR, cs.CC
arXiv Open Access 2021
Exergy of passive states: Waste energy after ergotropy extraction

F. H. Kamin, S. Salimi, Alan C. Santos

Work extraction protocols are a central issue in the context of quantum batteries, where the notion of ergotropy is used to quantify the amount of energy that can be extracted through unitary processes. Given the total amount of energy stored in a quantum system, quantifying the energy wasted after ergotropy extraction is a question to be considered when undesired coupling with thermal reservoirs is taken into account. In this paper, we show that some amount of energy can be lost when we extract ergotropy from a quantum system, and that it can be quantified by the exergy of passive states. Through a particular example, we show that ergotropy extraction can be done while preserving the quantum correlations of a quantum system. Our study opens the perspective for new advances in open-system quantum batteries able to exploit exergy stored as quantum correlations.
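For reference, the ergotropy the abstract relies on has a standard definition (textbook material, not specific to this paper): for a state $\rho$ with Hamiltonian $H$,

```latex
W(\rho) \;=\; \operatorname{Tr}(\rho H) \;-\; \min_{U} \operatorname{Tr}\!\left(U \rho U^{\dagger} H\right)
        \;=\; \operatorname{Tr}(\rho H) \;-\; \operatorname{Tr}(P_{\rho} H),
```

where the minimum runs over all unitaries $U$ and $P_{\rho}$ is the associated passive state, from which no further work can be extracted unitarily. The exergy of passive states then quantifies the energy remaining in $P_{\rho}$ that is inaccessible to unitary extraction.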

en quant-ph
arXiv Open Access 2021
Asymptotic simplification of Aggregation-Diffusion equations towards the heat kernel

José A. Carrillo, David Gómez-Castro, Yao Yao et al.

We give sharp conditions for the large time asymptotic simplification of aggregation-diffusion equations with linear diffusion. As soon as the interaction potential is bounded and its first and second derivatives decay fast enough at infinity, then the linear diffusion overcomes its effect, either attractive or repulsive, for large times independently of the initial data, and solutions behave like the fundamental solution of the heat equation with some rate. The potential $W(x) \sim \log |x|$ for $|x| \gg 1$ appears as the natural limiting case when the intermediate asymptotics change. In order to obtain such a result, we produce uniform-in-time estimates in a suitable rescaled change of variables for the entropy, the second moment, Sobolev norms and the $C^\alpha$ regularity with a novel approach for this family of equations using modulus of continuity techniques.
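For reference, the fundamental solution of the heat equation that solutions approach is the standard heat kernel (textbook material, not specific to this paper):

```latex
G(x,t) \;=\; (4\pi t)^{-d/2} \exp\!\left(-\frac{|x|^{2}}{4t}\right),
\qquad x \in \mathbb{R}^{d},\; t > 0,
```

so "behaving like the fundamental solution" means the rescaled solution converges to this Gaussian profile at some explicit rate.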

en math.AP
arXiv Open Access 2020
Kirchhoff's Circuit Law Applications to Graph Simplification in Search Problems

Jaeho Choi, Joongheon Kim

This paper proposes a new analysis of graphs using the concept of electric potential, and a graph simplification method based on this analysis. Suppose that each node in the weighted graph has its own potential value, and that the start and terminal nodes have maximum and zero potentials, respectively. When the level of each node is defined as the minimum number of edges/hops from the start node to that node, the proper potential of each level can be estimated from a geometric proportionality relationship. Based on the estimated potential for each level, the graph for path-finding problems can be re-designed as an electrical circuit, so that Kirchhoff's Circuit Law can be directly applied to simplify the graph.
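The level-and-potential idea can be sketched in a few lines. This is an illustrative reading only: BFS levels are as the abstract describes, but the linear level-to-potential mapping below is an assumption for illustration (the paper estimates potentials from a geometric proportionality relationship):

```python
from collections import deque

def level_potentials(adj, start, terminal):
    """Assign each node its BFS level (minimum hop count from start),
    then map levels onto potentials so that start has the maximum
    potential (1.0) and terminal has zero. adj maps node -> neighbors.
    The linear interpolation between levels is a simplifying assumption."""
    level = {start: 0}
    q = deque([start])
    while q:
        u = q.popleft()
        for w in adj[u]:
            if w not in level:
                level[w] = level[u] + 1
                q.append(w)
    lmax = level[terminal]  # assumes terminal is reachable and != start
    return {u: 1.0 - l / lmax for u, l in level.items()}
```

Nodes sharing a level then share an estimated potential, which is what lets the circuit analogy merge or prune edges between equipotential nodes.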

en cs.DM, eess.SP
S2 Open Access 2017
Improvement based on standardized work: an implementation case study

Julio Cesar Fin, G. Vidor, I. Cecconello et al.

Standardized work is an effective way to achieve process improvement, especially when it is applied to manual tasks such as assembly lines. This tool is part of the Toyota Production System and is based on waste reduction. Thus, the objectives were to implement standardized work in a medium chassis assembly line and to measure the benefits from optimization of operators' tasks and movement through waste reduction. In order to achieve these results, a single case study was performed in a medium chassis assembly line that is part of a company in South Brazil. The steps involved: defining the object of study; time measurement and takt time definition; creating a production capacity sheet; defining the minimum number of operators and balancing the line; determining minimum work in process; defining the new layout; creating a standardized operations sheet and a standardized operations routine sheet; training and implementing standardized work; and verifying the results. Results show a 36-minute reduction in assembly time and a 200-meter reduction in operators' movement on average. A further contribution is the 9.6% reduction in assembly line downtime.
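The takt-time and minimum-operator steps listed above follow standard lean arithmetic. A minimal sketch with illustrative numbers (the figures below are hypothetical, not taken from the case study):

```python
import math

def takt_time(available_seconds, demand_units):
    """Takt time = available production time / customer demand,
    i.e. the pace at which one unit must leave the line."""
    return available_seconds / demand_units

def min_operators(total_work_content, takt):
    """Theoretical minimum number of operators for line balancing:
    total manual work content divided by takt time, rounded up."""
    return math.ceil(total_work_content / takt)

# Hypothetical example: one 8-hour shift (28,800 s), demand of 96 units
# -> takt of 300 s/unit; 1,450 s of work content -> at least 5 operators.
takt = takt_time(28800, 96)
operators = min_operators(1450, takt)
```

In the case study, these values feed the production capacity sheet and the line-balancing step before the standardized operations sheets are drawn up.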

16 citations en Engineering

Page 29 of 22747