Digital soil mapping (DSM), a sub-discipline of soil science, was first introduced by McBratney et al. in 2003. It has since witnessed many developments and attracted numerous scientific contributions at the global level. DSM aims to create and populate spatial soil information by coupling field and laboratory observations with environmental data through quantitative relationships. The output consists of raster maps of predictions and their uncertainties. The enhanced availability of spatial data, such as digital elevation models and satellite images; increasing computational power to process data; the development of data-mining tools and GIS; and growing global demand for spatial data, including uncertainty assessments, are some of the factors that have led to the success of the field. This paper reviews the development of digital soil mapping through time, the covariates, some modeling examples, and the DSM studies carried out so far in Iran.
The livestock sector globally is highly dynamic. In developing countries, it is evolving in response to rapidly increasing demand for livestock products. In developed countries, demand for livestock products is stagnating, while many production systems are increasing their efficiency and environmental sustainability. Historical changes in the demand for livestock products have been largely driven by human population growth, income growth and urbanization and the production response in different livestock systems has been associated with science and technology as well as increases in animal numbers. In the future, production will increasingly be affected by competition for natural resources, particularly land and water, competition between food and feed and by the need to operate in a carbon-constrained economy. Developments in breeding, nutrition and animal health will continue to contribute to increasing potential production and further efficiency and genetic gains. Livestock production is likely to be increasingly affected by carbon constraints and environmental and animal welfare legislation. Demand for livestock products in the future could be heavily moderated by socio-economic factors such as human health concerns and changing socio-cultural values. There is considerable uncertainty as to how these factors will play out in different regions of the world in the coming decades.
P. Jain, Sean C. P. Coogan, Sriram Ganapathi Subramanian, et al.
Artificial intelligence has been applied in wildfire science and management since the 1990s, with early applications including neural networks and expert systems. Since then, the field has rapidly progressed congruently with the wide adoption of machine learning (ML) methods in the environmental sciences. Here, we present a scoping review of ML applications in wildfire science and management. Our overall objective is to improve awareness of ML methods among wildfire researchers and managers, as well as illustrate the diverse and challenging range of problems in wildfire science available to ML data scientists. To that end, we first present an overview of popular ML approaches used in wildfire science to date and then review the use of ML in wildfire science as broadly categorized into six problem domains, including (i) fuels characterization, fire detection, and mapping; (ii) fire weather and climate change; (iii) fire occurrence, susceptibility, and risk; (iv) fire behavior prediction; (v) fire effects; and (vi) fire management. Furthermore, we discuss the advantages and limitations of various ML approaches relating to data size, computational requirements, generalizability, and interpretability, as well as identify opportunities for future advances in the science and management of wildfires within a data science context. In total, to the end of 2019, we identified 300 relevant publications in which the most frequently used ML methods across problem domains included random forests, MaxEnt, artificial neural networks, decision trees, support vector machines, and genetic algorithms. As such, there exist opportunities to apply more current ML methods — including deep learning and agent-based learning — in the wildfire sciences, especially in instances involving very large multivariate datasets.
We must recognize, however, that despite the ability of ML models to learn on their own, expertise in wildfire science is necessary to ensure realistic modelling of fire processes across multiple scales, while the complexity of some ML methods such as deep learning requires a dedicated and sophisticated knowledge of their application. Finally, we stress that the wildfire research and management communities should play an active role in providing relevant, high-quality, and freely available wildfire data for use by practitioners of ML methods.
Background: There has been increasing interest in the concept that exposures to environmental chemicals may be contributing factors to the epidemics of diabetes and obesity. On 11–13 January 2011, the National Institute of Environmental Health Sciences (NIEHS) Division of the National Toxicology Program (NTP) organized a workshop to evaluate the current state of the science on these topics of increasing public health concern. Objective: The main objective of the workshop was to develop recommendations for a research agenda after completing a critical analysis of the literature for humans and experimental animals exposed to certain environmental chemicals. The environmental exposures considered at the workshop were arsenic, persistent organic pollutants, maternal smoking/nicotine, organotins, phthalates, bisphenol A, and pesticides. High-throughput screening data from Toxicology in the 21st Century (Tox21) were also considered as a way to evaluate potential cellular pathways and generate hypotheses about which chemicals might perturb biological processes related to diabetes and obesity, and how. Conclusions: Overall, the review of the existing literature identified linkages between several of the environmental exposures and type 2 diabetes. There was also support for the “developmental obesogen” hypothesis, which suggests that chemical exposures may increase the risk of obesity by altering the differentiation of adipocytes or the development of neural circuits that regulate feeding behavior. The effects may be most apparent when the developmental exposure is combined with consumption of a high-calorie, high-carbohydrate, or high-fat diet later in life. Research on environmental chemical exposures and type 1 diabetes was very limited. This lack of research was considered a critical data gap.
In this workshop review, we outline the major themes that emerged from the workshop and discuss activities that NIEHS/NTP is undertaking to address research recommendations. This review also serves as an introduction to an upcoming series of articles that review the literature regarding specific exposures and outcomes in more detail.
Background: Synthesizing what is known about the environmental drivers of health is instrumental to taking prevention-oriented action. Methods of research synthesis commonly used in environmental health lag behind systematic review methods developed in the clinical sciences over the past 20 years. Objectives: We sought to develop a proof of concept of the “Navigation Guide,” a systematic and transparent method of research synthesis in environmental health. Discussion: The Navigation Guide methodology builds on best practices in research synthesis in evidence-based medicine and environmental health. Key points of departure from current methods of expert-based narrative review prevalent in environmental health include a prespecified protocol, standardized and transparent documentation including expert judgment, a comprehensive search strategy, assessment of “risk of bias,” and separation of the science from values and preferences. Key points of departure from evidence-based medicine include assigning a “moderate” quality rating to human observational studies and combining diverse evidence streams. Conclusions: The Navigation Guide methodology is a systematic and rigorous approach to research synthesis that has been developed to reduce bias and maximize transparency in the evaluation of environmental health information. Although novel aspects of the method will require further development and validation, our findings demonstrated that improved methods of research synthesis under development at the National Toxicology Program and under consideration by the U.S. Environmental Protection Agency are fully achievable. The institutionalization of robust methods of systematic and transparent review would provide a concrete mechanism for linking science to timely action to prevent harm. Citation: Woodruff TJ, Sutton P. 2014. The Navigation Guide systematic review methodology: a rigorous and transparent method for translating environmental health science into better health outcomes. Environ Health Perspect 122:1007–1014; http://dx.doi.org/10.1289/ehp.1307175
How does the climatic experience of previous generations affect today's attention to environmental questions? Using self-reported beliefs and environmental themes in folklore, we show empirically that the realized intensity of deviations from typical climate conditions in ancestral generations influences how much descendants care about the environment. The effect exhibits a U-shape where more stable and more unstable ancestral climates lead to higher attention today, with a dip for intermediate realizations. We propose a theoretical framework where the value of costly attention to environmental conditions depends on the perceived stability of the environment, prior beliefs about which are shaped through cultural transmission by the experience of ethnic ancestors. The U-shape is rationalized by a double purpose of learning about the environment: optimal utilization of typical conditions and protection against extreme events.