Results for "Standardization. Simplification. Waste"
Showing 20 of ~454,859 results · from DOAJ, CrossRef, arXiv, Semantic Scholar
Suman Kunwar
This study introduces the Garbage Dataset (GD), a publicly available image dataset designed to advance automated waste segregation through machine learning and computer vision. It is a diverse dataset covering 10 categories of common household waste: metal, glass, biological, paper, battery, trash, cardboard, shoes, clothes, and plastic. The dataset comprises 12,259 labeled images collected through multiple methods, including the DWaste mobile app and curated web sources. The methodology included rigorous validation through checksums and outlier detection, analysis of class imbalance and visual separability through PCA/t-SNE, and assessment of background complexity using entropy and saliency measures. The dataset was benchmarked using state-of-the-art deep learning models (EfficientNetV2M, EfficientNetV2S, MobileNet, ResNet50, ResNet101) evaluated on performance metrics and operational carbon emissions. The results indicate that EfficientNetV2S achieved the highest performance, with an accuracy of 95.13% and an F1-score of 0.95, at a moderate carbon cost. Analysis revealed inherent dataset characteristics including class imbalance, a skew toward high-outlier classes (plastic, cardboard, paper), and brightness variations that require consideration. The main conclusion is that GD provides a valuable real-world benchmark for waste classification research while highlighting important challenges such as class imbalance, background complexity, and environmental trade-offs in model selection that must be addressed for practical deployment. The dataset is publicly released to support further research in environmental sustainability applications.
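Since the abstract flags class imbalance as a core dataset characteristic, below is a minimal sketch of how such imbalance could be quantified; the `garbage_dataset` folder name and one-folder-per-class layout are assumptions for illustration, not details from the paper.

```python
from collections import Counter
from pathlib import Path

# Hypothetical layout: one sub-folder per class, e.g. garbage_dataset/plastic/*.jpg
root = Path("garbage_dataset")
counts = Counter(p.parent.name for p in root.glob("*/*") if p.is_file())

total = sum(counts.values())
for cls, n in counts.most_common():
    print(f"{cls:10s} {n:6d} ({n / total:.1%})")

# Imbalance ratio: size of the largest class over the smallest
print(f"imbalance ratio: {max(counts.values()) / min(counts.values()):.2f}")
```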
Moniesha Thilakarathna, Xing Wang, Min Wang et al.
Food waste management is critical for sustainability, yet inorganic contaminants hinder recycling potential. Robotic automation presents a compelling approach to this challenge by accelerating the sorting process through automated contaminant removal. Still, the diverse and unpredictable nature of contaminants creates major challenges for robotic grasping. Benchmarking frameworks are critical for evaluating challenges from various perspectives. However, existing protocols rely on limited simulation datasets, prioritise simple metrics such as success rate, and overlook key object and environment-related pre-grasp conditions. This paper introduces GRAB, a comprehensive Grasping Real-World Article Benchmarking framework that addresses this gap by integrating diverse deformable objects, advanced grasp-pose-estimation vision, and, importantly, pre-grasp conditions, establishing a set of critical graspability metrics. It systematically compares industrial grasping modalities through an in-depth experimental evaluation involving 1,750 food contaminant grasp attempts across four high-fidelity scenes. This large-scale evaluation provides an extensive assessment of grasp performance for food waste sorting, offering a level of depth that has rarely been explored in previous studies. The results reveal distinct gripper strengths and limitations, with object quality emerging as the dominant performance factor in cluttered environments, while vision quality and clutter levels play moderate roles. These findings highlight essential design considerations and reinforce the necessity of developing multimodal gripper technologies capable of robust cross-category performance for effective robotic food waste sorting.
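As a toy illustration of the aggregation such a benchmark implies (the column names and values below are invented, and GRAB's graspability metrics go well beyond plain success rate), attempt logs can be grouped per gripper and scene:

```python
# Hypothetical grasp-attempt log; GRAB's real metrics also cover pre-grasp
# conditions, so treat success rate here as the baseline metric only.
import pandas as pd

log = pd.DataFrame({
    "gripper": ["suction", "suction", "two-finger", "two-finger"],
    "scene":   ["cluttered", "sparse", "cluttered", "sparse"],
    "success": [0, 1, 1, 1],
})
print(log.groupby(["gripper", "scene"])["success"].mean())
```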
Nathaniel LeCompte, Andrew Caratenuto, Yi Zheng
Highly reflective calcium phosphate (CAP) nanoparticles have been obtained from waste chicken and porcine bones. Chicken and pork bones were processed and calcined at temperatures between 600°C and 1200°C to remove organic material, resulting in CAP bio-ceramic compounds with high reflectance. The reflectivity of the materials in the solar wavelength region is on par with chemically synthesized CAP. The high reflectivity, consistently over 90%, as well as the size distribution and packing density of the nanoparticles obtained in these early bone studies, make a strong case for pursuing this avenue to obtain pigment for high solar reflectivity applications, such as passive daytime radiative cooling. The results presented indicate a viable path toward a cost-effective and eco-friendly source of highly reflective cooling pigments. By sourcing calcium phosphates from animal bones, there is also the potential to divert large quantities of bone waste generated by the meat industry from landfills, further contributing to sustainability and energy reduction efforts in the construction industry and beyond.
A. P. Onyena, Donald Chukwudi Aniche, Bright O Ogbolu et al.
Threats emerging from microplastic pollution in the marine environment have received much global attention. This review assessed the sources, fate, and impacts of microplastics in marine ecosystems and identified gaps. Most studies document the ubiquity of microplastics and their associated environmental effects, including impacts on marine ecosystems, risks to biodiversity, and threats to human health. Microplastic leakage into marine ecosystems arises from plastic waste mismanagement and a lack of effective mitigative strategies. This review identified a scarcity of microplastic mitigation strategies across stakeholders. Community involvement in microplastic monitoring and ecosystem conservation is lacking, owing to the limited presence of citizen-science and stakeholder co-management initiatives. Although some management strategies exist for controlling the effects of microplastics (often implemented by local and global environmental groups), a standardized management strategy to mitigate microplastics in coastal areas is urgently required. There is a need to review policy interventions aimed at plastic reduction in or near coastal ecosystems and to evaluate their effectiveness. There is also a need to identify the focal causes of microplastic pollution in the marine environment through further environmental research and governance approaches. These efforts would extend to creating more effective policies, as well as harmonized and extended educational campaigns and incentives for plastic waste reduction, while mandating stringent penalties to help reduce microplastic leakage into the marine environment.
Blanca Carbajo Coronado, Antonio Moreno Sandoval
This paper delves into concept extraction and lexical simplification in the financial domain in Spanish. In our approach, concept extraction involves identifying relevant terms and phrases using AI language models, while lexical simplification aims to make complex financial concepts more accessible. For this study, terms were annotated in the FinT-esp financial corpus and the mT5 neural model was used for accurate term extraction. The model yielded remarkable results: 96% of the detected terms had not been manually annotated before, showcasing its noteworthy generative capability. For lexical simplification, the paper proposes three main strategies: paraphrasing, synonym substitution, and translation, all integrated into an interactive interface that addresses the issue of sentence length. This research significantly contributes to financial concept detection and offers an effective method for simplifying financial language in Spanish.
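As a rough sketch of the seq2seq extraction setup described (the checkpoint below is the public base mT5, not the authors' fine-tuned model, and the prompt format is an assumption):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Base mT5 checkpoint as a stand-in; the paper fine-tunes mT5 on the annotated
# FinT-esp corpus, so outputs here will be meaningless without that training.
model_name = "google/mt5-small"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

sentence = "La entidad amortizó anticipadamente las obligaciones subordinadas."
inputs = tokenizer("extract terms: " + sentence, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```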
Regina Stodden
In this work, we propose EASSE-multi, a framework for easier automatic sentence simplification evaluation for languages other than English. Compared to the original EASSE framework, EASSE-multi does not focus only on English: it contains tokenizers and versions of text simplification evaluation metrics that are suitable for multiple languages. In this paper, we exemplify the usage of EASSE-multi for German text simplification (TS), resulting in EASSE-DE. Further, we compare text simplification results when evaluating with different language or tokenization settings of the metrics. Based on this, we formulate recommendations on how to make the evaluation of (German) TS models more transparent and better comparable. The code of EASSE-multi and its German specialisation (EASSE-DE) can be found at https://github.com/rstodden/easse-de.
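The tokenization sensitivity this abstract points to can be illustrated outside EASSE; the sketch below uses sacreBLEU (a real library, but not the EASSE-multi API) to score the same German hypothesis with two tokenizers, whose handling of Unicode punctuation differs and can shift the score.

```python
# The Unicode-aware 'intl' tokenizer splits typographic quotes that the
# default '13a' tokenizer leaves attached, changing n-gram overlap.
import sacrebleu

hyp = ["Sie sagte: „Das ist einfach“."]
refs = [["Sie sagte: „Das ist ganz einfach“."]]

for tok in ("13a", "intl"):
    bleu = sacrebleu.corpus_bleu(hyp, refs, tokenize=tok)
    print(f"{tok}: {bleu.score:.2f}")
```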
Wei Quan Chin, Yeong Huei Lee, Mugahed Amran et al.
The fabrication of bricks commonly consumes relatively large amounts of natural resources. To reduce the carbon footprint of the brick production industry, repurposing industrial wastes in the making of sustainable bricks is a recent trend in research and application. Local wastes, such as oil palm shell (OPS), palm oil fuel ash (POFA), and quarry dust (QD), are massively produced annually in palm oil-exporting countries. Moreover, QD from mining industries is hazardous to both water and air quality. For better waste management on the path towards sustainability, these wastes should be given a second life as construction materials. Therefore, this paper investigates the possibility of incorporating agro-industrial wastes into the brick mixture by examining their properties by means of several standardized tests. For the mix design, a 100% replacement of coarse aggregate with OPS, a 20% replacement of cement with POFA, limestone as an admixture at 20% of cement weight, and 0 to 50% replacements of fine aggregate with QD are experimentally considered. The optimum mix of these wastes is preliminarily determined by focusing on high compressive strength as an indicator. Other examinations include splitting tensile, flexural strength, water absorption, and efflorescence tests. Although the agro-industrial waste cement brick has an 18% lower strength-to-weight ratio than conventional brick, it is observed to have better late strength development due to the pozzolanic properties of POFA. Moreover, the proposed green cement brick is further checked for compliance with several standards for feasible use in the construction industry. Financially, the cost of the brick with the new mix design is almost equivalent to that of conventional brick. Hence, this green cement brick can reasonably be employed in the construction industry to promote material sustainability for better waste management.
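The replacement percentages translate directly into batch quantities; below is an illustrative calculation in which the baseline masses are invented and only the replacement ratios come from the abstract (the QD level shown is the 50% upper bound, and limestone is taken as 20% of the original cement weight).

```python
# Baseline batch masses (kg) are assumed for illustration only.
base = {"cement": 100.0, "coarse_agg": 300.0, "fine_agg": 250.0}

mix = {
    "OPS":       base["coarse_agg"] * 1.00,  # 100% coarse-aggregate replacement
    "cement":    base["cement"] * 0.80,      # after 20% POFA replacement
    "POFA":      base["cement"] * 0.20,
    "limestone": base["cement"] * 0.20,      # admixture at 20% of cement weight
    "fine_agg":  base["fine_agg"] * 0.50,    # at the 50% QD replacement level
    "QD":        base["fine_agg"] * 0.50,
}
for material, kg in mix.items():
    print(f"{material:10s} {kg:6.1f} kg")
```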
Liam Cripwell, Joël Legrand, Claire Gardent
Automatic evaluation for sentence simplification remains a challenging problem. Most popular evaluation metrics require multiple high-quality references -- something not readily available for simplification -- which makes it difficult to test performance on unseen domains. Furthermore, most existing metrics conflate simplicity with correlated attributes such as fluency or meaning preservation. We propose a new learned evaluation metric (SLE) which focuses on simplicity, outperforming almost all existing metrics in terms of correlation with human judgements.
Muhammad Salman, Armin Haller, Sergio J. Rodríguez Méndez
Text simplification is an area of Natural Language Processing (NLP) that offers an opportunity to make text easier to understand and explore. However, it is hard to understand and retrieve knowledge from unstructured text, which usually takes the form of compound and complex sentences. State-of-the-art neural network-based methods simplify sentences for improved readability by replacing words with plain English substitutes and summarising sentences and paragraphs. In the Knowledge Graph (KG) creation process from unstructured text, summarising long sentences and substituting words is undesirable, since this may lead to information loss. KG creation from text instead requires the extraction of all possible facts (triples) with the same mentions as in the text. In this work, we propose a controlled simplification based on the factual information in a sentence, i.e., triples. We present a classical syntactic dependency-based approach to split and rephrase a compound or complex sentence into a set of simplified sentences. This simplification process retains the original wording, with a simple structure of possible domain facts in each sentence, i.e., triples. The paper also introduces an algorithm to identify and measure a sentence's syntactic complexity (SC), followed by its reduction through a controlled syntactic simplification process. Last, a dataset re-annotation experiment is also conducted with GPT-3; we aim to publish this refined corpus as a resource. This work was accepted and presented at the International Workshop on Learning with Knowledge Graphs (IWLKG) at the WSDM-2023 conference. The code and data are available at www.github.com/sallmanm/SynSim.
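A crude proxy for the paper's syntactic complexity (SC) measure, not its actual algorithm, can be built from clause-introducing dependency relations in a parse:

```python
# Requires: pip install spacy && python -m spacy download en_core_web_sm
# Counting clause-introducing relations is only a stand-in for the paper's SC.
import spacy

nlp = spacy.load("en_core_web_sm")
CLAUSAL = {"ccomp", "xcomp", "advcl", "acl", "relcl", "conj"}

def syntactic_complexity(sentence: str) -> int:
    return sum(1 for tok in nlp(sentence) if tok.dep_ in CLAUSAL)

print(syntactic_complexity("The plant recycles glass."))                     # low
print(syntactic_complexity("The plant, which opened in 2010, recycles glass "
                           "and sorts the metal that residents drop off."))  # higher
```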
Valentin Knappich, Simon Razniewski, Annemarie Friedrich
Automatic simplification can help laypeople comprehend complex scientific text. Language models are frequently applied to this task by translating from complex to simple language. In this paper, we describe our system based on Llama 2, which ranked first in the PLABA shared task addressing the simplification of biomedical text. We find that the large portion of shared tokens between input and output leads to weak training signals and models that edit only conservatively. To mitigate these issues, we propose sentence-level and token-level loss weights. They give higher weight to modified tokens, indicated by edit distance and edit operations, respectively. We conduct an empirical evaluation on the PLABA dataset and find that both approaches lead to simplifications closer to those created by human annotators (+1.8% / +3.5% SARI), simpler language (-1 / -1.1 FKGL), and more edits (1.6x / 1.8x edit distance) compared to the same model fine-tuned with standard cross entropy. We furthermore show that the hyperparameter $\lambda$ in the token-level loss weights can be used to control the edit distance and the simplicity level (FKGL).
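A minimal sketch of token-level loss weighting as we read it from the abstract (not the authors' released code): per-token cross entropy is re-weighted so that tokens marked as modified, e.g. by edit operations against the input, contribute more.

```python
import torch
import torch.nn.functional as F

def weighted_token_ce(logits, target_ids, modified_mask, lam=2.0):
    """logits: (seq, vocab); target_ids: (seq,); modified_mask: bool (seq,)."""
    ce = F.cross_entropy(logits, target_ids, reduction="none")  # per-token loss
    weights = torch.ones_like(ce)
    weights[modified_mask] = lam   # up-weight tokens changed vs. the input
    return (weights * ce).sum() / weights.sum()

logits = torch.randn(5, 100)       # toy sequence of 5 tokens, vocab of 100
targets = torch.randint(0, 100, (5,))
modified = torch.tensor([False, True, True, False, False])  # e.g. from edit ops
print(weighted_token_ce(logits, targets, modified).item())
```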
Zachary W. Taylor, Maximus H. Chu, Junyi Jessy Li
Access to higher education is critical for minority populations and emergent bilingual students. However, the language used by higher education institutions to communicate with prospective students is often too complex; concretely, many institutions in the US publish admissions application instructions far above the average reading level of a typical high school graduate, often near the 13th or 14th grade level. This leads to an unnecessary barrier between students and access to higher education. This work aims to tackle this challenge via text simplification. We present PSAT (Professionally Simplified Admissions Texts), a dataset with 112 admissions instructions randomly selected from higher education institutions across the US. These texts are then professionally simplified, and verified and accepted by subject-matter experts who are full-time employees in admissions offices at various institutions. Additionally, PSAT comes with manual alignments of 1,883 original-simplified sentence pairs. The result is a first-of-its-kind corpus for the evaluation and fine-tuning of text simplification systems in a high-stakes genre distinct from existing simplification resources.
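The 13th-14th grade claim refers to readability formulas such as Flesch-Kincaid; the snippet below checks a sample passage with the textstat package (an illustration, since the abstract does not name the exact tooling).

```python
import textstat  # pip install textstat

passage = ("Applicants must submit official transcripts, standardized test "
           "scores, and a notarized affidavit of residency prior to matriculation.")
print(textstat.flesch_kincaid_grade(passage))  # U.S. grade level estimate
```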
Anilkumar Bohra, Satish Vitta
The amount of waste heat exergy generated globally is 69.058 EJ, which can be divided into low temperature (below 373 K; 30.496 EJ), medium temperature (373 K to 573 K; 14.431 EJ), and high temperature (above 573 K; 24.131 EJ). These exergy values have been used to determine the minimum number of pn junctions required to convert the exergy into electrical power. It is found that the number of junctions required to convert the high temperature exergy increases from 8.22x10^11 to 24.66x10^11 as the aspect ratio of the legs increases from 0.5 cm^-1 to 1.5 cm^-1. To convert the low temperature exergy, 81.76x10^11 to 245.25x10^11 junctions will be required, depending on the leg aspect ratio. The quantity of alloys containing elements such as Pb, Bi, Te, Sb, Se, and Sn required to synthesize these junctions is therefore of the order of millions of tons, which means the required quantity of elements is of similar magnitude. The current world production of these elements, however, falls far short of this requirement, indicating significant supply chain risk. The production of these elements, even if resources are available, will emit millions of tons of CO2, showing that current alloys are non-sustainable for waste heat recovery.
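The quoted junction counts can be sanity-checked with a back-of-envelope calculation, assuming the EJ figures are annual exergy flows (everything except the numbers quoted in the abstract is an assumption):

```python
SECONDS_PER_YEAR = 3.156e7

high_T_exergy_J = 24.131e18        # 24.131 EJ (above 573 K), taken as per year
mean_power_W = high_T_exergy_J / SECONDS_PER_YEAR
print(f"continuous power: {mean_power_W:.3e} W")   # ~7.6e11 W

junctions = 8.22e11                # lower bound quoted in the abstract
print(f"implied power per junction: {mean_power_W / junctions:.2f} W")  # ~0.9 W
```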
Yuanqi Li, Jianwei Guo, Xinran Yang et al.
The growing size of point clouds increases the storage, transmission, and computation costs of 3D scenes. Raw data is redundant, noisy, and non-uniform. Therefore, simplifying point clouds to achieve compact, clean, and uniform points is becoming increasingly important for 3D vision and graphics tasks. Previous learning-based methods aim to generate fewer points for scene understanding, regardless of the quality of surface reconstruction, leading to results with low reconstruction accuracy and poor point distribution. In this paper, we propose a novel point cloud simplification network (PCS-Net) dedicated to high-quality surface mesh reconstruction while maintaining geometric fidelity. We first learn a sampling matrix in a feature-aware simplification module to reduce the number of points. Then we propose a novel double-scale resampling module to refine the positions of the sampled points, to achieve a uniform distribution. To further retain important shape features, an adaptive sampling strategy with a novel saliency loss is designed. With our PCS-Net, an input non-uniform and noisy point cloud can be simplified in a feature-aware manner, i.e., points near salient features are consolidated while remaining locally uniform in distribution. Experiments demonstrate the effectiveness of our method and show that we outperform previous simplification or reconstruction-oriented upsampling methods.
Rolandos Alexandros Potamias, Giorgos Bouritsas, Stefanos Zafeiriou
The recent advances in 3D sensing technology have made it possible to capture point clouds at significantly higher resolution. However, increased detail usually comes at the expense of high storage as well as computational costs in terms of processing and visualization operations. Mesh and point cloud simplification methods aim to reduce the complexity of 3D models while retaining visual quality and relevant salient features. Traditional simplification techniques usually rely on solving a time-consuming optimization problem, hence they are impractical for large-scale datasets. In an attempt to alleviate this computational burden, we propose a fast point cloud simplification method by learning to sample salient points. The proposed method relies on a graph neural network architecture trained to select an arbitrary, user-defined number of points from the input space and to re-arrange their positions so as to minimize the visual perception error. The approach is extensively evaluated on various datasets using several perceptual metrics. Importantly, our method is able to generalize to out-of-distribution shapes, hence demonstrating zero-shot capabilities.
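For contrast with the learned samplers in the two point cloud abstracts above, a classical baseline is farthest point sampling, which greedily keeps the point farthest from everything already selected; a minimal NumPy sketch:

```python
import numpy as np

def farthest_point_sampling(points: np.ndarray, k: int) -> np.ndarray:
    """Greedy FPS: repeatedly pick the point farthest from the selected set."""
    selected = [np.random.randint(points.shape[0])]
    dist = np.linalg.norm(points - points[selected[0]], axis=1)
    for _ in range(k - 1):
        idx = int(dist.argmax())
        selected.append(idx)
        dist = np.minimum(dist, np.linalg.norm(points - points[idx], axis=1))
    return points[selected]

cloud = np.random.rand(10_000, 3)                  # synthetic cloud for illustration
print(farthest_point_sampling(cloud, 256).shape)   # (256, 3)
```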
Vishesh Agarwal, Somak Aditya, Navin Goyal
Symbolic mathematical tasks such as integration often require multiple well-defined steps and an understanding of sub-tasks to reach a solution. To understand Transformers' abilities on such tasks in a fine-grained manner, we deviate from traditional end-to-end settings and explore a step-wise polynomial simplification task. Polynomials can be written in a simple normal form as a sum of monomials ordered lexicographically. For a polynomial which is not necessarily in this normal form, a sequence of simplification steps is applied to reach the fully simplified (i.e., normal-form) polynomial. We propose a synthetic polynomial dataset generation algorithm that generates polynomials with unique proof steps. Through varying coefficient configurations, input representations, proof granularity, and extensive hyper-parameter tuning, we observe that Transformers consistently struggle with numeric multiplication. We explore two ways to mitigate this: Curriculum Learning and a Symbolic Calculator approach (where the numeric operations are offloaded to a calculator). Both approaches provide significant gains over the vanilla Transformer-based baseline.
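The normal form described here can be reproduced with sympy as a one-step oracle (unlike the paper's step-wise task): expand, then read off the monomials in lexicographic order.

```python
import sympy as sp

x, y = sp.symbols("x y")
p = (2 * x + y) * (x - 3 * y) + x * y   # not in normal form

poly = sp.Poly(sp.expand(p), x, y)      # lex monomial order is the default
print(poly.as_expr())                   # 2*x**2 - 4*x*y - 3*y**2
print(poly.terms())                     # [((2, 0), 2), ((1, 1), -4), ((0, 2), -3)]
```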
Ying-Qiu Gu
Most fully developed galaxies have vivid spiral structure, but the formation and evolution of spiral structure remain a mystery that is not fully understood in astrophysics. We find that the currently used equations of galactic dynamics contain some unreasonable components. In this paper, the following three working assumptions are introduced to simplify the galactic structural equations.
1. In the study of large-scale structure, the retarded potential of the gravitational field should be taken into account. The propagation time of the gravitational field from center to border is longer than the revolution periods of the stars near the center of the galaxy. Newton's gravitational potential is unreasonable in such cases, and the weak-field, low-velocity approximation of Einstein's field equation should be adopted.
2. The stars in a fully developed galaxy should be treated as a zero-pressure, inviscid fluid, whose equation of motion differs from that of ordinary continuum mechanics. Stars move along geodesics.
3. The structure of the galaxy is related only to the total mass density distribution. The equation of state of the dark halo differs from that of ordinary luminous interstellar matter, so their trajectories are also very different; dark halo and ordinary matter in a galaxy are automatically separated. The total mass density distribution can be presupposed according to the observational data and then determined by comparing the solution of the equations with the observed data.
These assumptions and treatments are supported by theory and observation. The variables of the simplified galactic dynamic equations are separated from each other, and the equations are well posed and can be solved according to a definite procedure. Therefore, this simplified dynamic equation system provides a more reasonable and practical framework for the further study of galactic structure and can solve many practical problems. Besides, it is closely related to the study of dark energy and dark matter.
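Assumption 1 invokes the standard retarded potential; in the weak-field, low-velocity limit the Newtonian potential is replaced by the textbook form below (a standard formula, not one quoted from the paper):

```latex
\Phi(\mathbf{r}, t) = -G \int \frac{\rho\!\left(\mathbf{r}',\, t - |\mathbf{r}-\mathbf{r}'|/c\right)}{|\mathbf{r}-\mathbf{r}'|}\, \mathrm{d}^{3}r'
```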
Shteryo Nozharov
The main purpose of the study is to develop a model for transaction cost measurement in Collective Waste Recovery Systems. The methodology of New Institutional Economics is used in the research. The impact of the study relates both to extending the limits of the theory of the interaction between transaction costs and social costs and to identifying institutional failures of the European concept of a circular economy. A new model for social cost measurement is developed.
Keywords: circular economy, transaction costs, extended producer responsibility.
JEL: A13, C51, D23, L22, Q53
Mounica Maddela, Wei Xu
Current lexical simplification approaches rely heavily on heuristics and corpus level features that do not always align with human judgment. We create a human-rated word-complexity lexicon of 15,000 English words and propose a novel neural readability ranking model with a Gaussian-based feature vectorization layer that utilizes these human ratings to measure the complexity of any given word or phrase. Our model performs better than the state-of-the-art systems for different lexical simplification tasks and evaluation datasets. Additionally, we also produce SimplePPDB++, a lexical resource of over 10 million simplifying paraphrase rules, by applying our model to the Paraphrase Database (PPDB).
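A minimal sketch of Gaussian-based feature vectorization as described: a scalar feature (such as a normalized word-complexity score) is projected onto a bank of Gaussian bins, producing a smooth vector rather than a single number; bin placement and width here are assumptions.

```python
import numpy as np

def gaussian_vectorize(x: float, mus: np.ndarray, sigma: float) -> np.ndarray:
    """Activate each Gaussian bin according to its distance from x."""
    return np.exp(-((x - mus) ** 2) / (2 * sigma ** 2))

mus = np.linspace(0.0, 1.0, 10)   # 10 evenly spaced bins over a normalized feature
print(np.round(gaussian_vectorize(0.35, mus, sigma=0.1), 3))
```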
Page 19 of 22743