Alexander Alexander, Darmawan Pontan, Efa Ayu Nabila et al.
The Surface Maintenance Operation (SMO) Construction Services Work Unit Rate Earthwork (WUR EW) project in the Rokan Block, Indonesia, uses a unit-price contract model with hypothetical volumes. This contractual model transfers the risk of uncertain work quantities from the project owner to the contractor. This study analyzes the characteristics of this contract and formulates a sound strategy for preparing bid prices and managing construction costs. The research applied quantitative analysis to historical data from two work packages (Package 3 and Mitigation Package 3). The analysis was conducted by calculating key performance indicators (KPIs) such as the Cost Performance Index (CPI) and Budget Variance (BV), together with cost and quantity variance analysis per work item. The results indicate that both projects achieved excellent cost performance, exceeding the planned profit targets (realized CPI > planned CPI). The most effective strategy was to set a high profit percentage on the top 10 Earthwork & Civil items (the Pareto items), which experienced significant volume increases and were the main drivers of total profit growth.
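The KPIs named in the abstract follow standard earned-value conventions. The sketch below, with purely hypothetical figures (none taken from the Rokan Block packages), shows how CPI and Budget Variance can be computed for a unit-price contract in which revenue and cost scale with the re-measured quantities:

```python
# Earned-value KPIs as used in the study: CPI and Budget Variance (BV).
# All figures here are hypothetical examples, not Rokan Block data.

def cost_performance_index(earned_value: float, actual_cost: float) -> float:
    """CPI = EV / AC; values above 1.0 indicate under-budget performance."""
    return earned_value / actual_cost

def budget_variance(budgeted_cost: float, actual_cost: float) -> float:
    """BV = budget - actual; positive means spending below plan."""
    return budgeted_cost - actual_cost

# In a unit-price contract, each item is re-measured: revenue and cost
# scale with the realized (not the hypothetical) quantity.
items = [
    # (unit_rate, planned_qty, realized_qty, actual_unit_cost)
    (12.0, 1000.0, 1400.0, 10.5),  # Pareto earthwork item: volume grew 40 %
    (8.0, 500.0, 520.0, 7.9),
]
ev = sum(rate * real_q for rate, _, real_q, _ in items)   # revenue earned
ac = sum(cost * real_q for _, _, real_q, cost in items)   # cost incurred
cpi = cost_performance_index(ev, ac)
```

Because the Pareto items grow in volume while their unit rates stay fixed, a higher margin on those items compounds directly into the realized CPI.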
The PBR theorem, which implies that the Einsteinian realist view of quantum mechanics (QM) is inconsistent with the predictions of the standard Copenhagen view of QM, has been hailed as one of the most important theorems in the foundations of QM. Here we show that the special measurement used by Pusey et al. to derive the theorem does not exist from the Einsteinian view of QM.
To celebrate this double birthday, the journal Foundations of Physics publishes a topical collection, 'Pilot-wave and beyond', on the developments that have followed the pioneering works of Louis de Broglie and David Bohm on quantum foundations. This topical collection includes contributions from physicists and philosophers around the world debating the scientific legacy of Bohm and de Broglie concerning the interpretation and understanding of quantum mechanics. In these forewords we give a general review of the historical context, explaining how de Broglie and Bohm developed their interpretations of quantum mechanics. We further analyze the relationship between these two great thinkers and emphasize the role of several collaborators and continuators of their ontological approach to physics.
<p>In August 2018, the European Space Agency (ESA) launched the first Doppler wind lidar into space, which has since provided continuous profiles of the horizontal line-of-sight wind component at a global scale. Aeolus data have been successfully assimilated into several numerical weather prediction (NWP) models and have demonstrated a positive impact on the quality of weather forecasts. To provide valuable input data for NWP models, a detailed characterization of the Aeolus instrumental performance, as well as the identification and minimization of systematic error sources,
is crucial. In this paper, Aeolus interferometer spectral drifts and their potential as systematic error sources for the aerosol and wind products
are investigated by means of instrument spectral registration (ISR) measurements that are performed on a weekly basis. During these measurements,
the laser frequency is scanned over a range of 11 <span class="inline-formula">GHz</span> in steps of 25 <span class="inline-formula">MHz</span> and thus spectrally resolves the transmission curves of the
Fizeau interferometer and the Fabry–Pérot interferometers (FPIs) used in Aeolus. Mathematical model functions are derived to analyze the
measured transmission curves by means of non-linear fit procedures. The obtained fit parameters are used to draw conclusions about the Aeolus
instrumental alignment and potentially ongoing drifts. The introduced instrumental functions and analysis tools may also be applied to upcoming missions using similar spectrometers, for instance EarthCARE (ESA), which is based on the Aeolus FPI design.</p>
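The non-linear fit of a measured transmission curve can be illustrated with an idealized Fabry–Pérot model. The Airy-type function, scan parameters, and noise level below are simplified assumptions for illustration, not the actual Aeolus instrumental functions:

```python
import numpy as np
from scipy.optimize import curve_fit

# Idealized Airy-type transmission of a Fabry-Pérot interferometer. The real
# Aeolus ISR model functions are more elaborate; all values here are assumed.
def airy_transmission(nu, peak, nu0, fsr, finesse):
    """nu, nu0 and fsr in MHz; fsr is the free spectral range."""
    coeff = (2.0 * finesse / np.pi) ** 2
    return peak / (1.0 + coeff * np.sin(np.pi * (nu - nu0) / fsr) ** 2)

# Simulate an ISR-like scan: an 11 GHz range sampled in 25 MHz steps.
nu = np.arange(0.0, 11000.0, 25.0)
rng = np.random.default_rng(0)
measured = (airy_transmission(nu, 1.0, 5500.0, 2200.0, 12.0)
            + rng.normal(0.0, 0.01, nu.size))

# Non-linear least-squares fit; the fitted centre nu0 tracks spectral drift.
popt, _ = curve_fit(airy_transmission, nu, measured,
                    p0=[0.95, 5480.0, 2180.0, 11.0])
peak, nu0, fsr, finesse = popt
```

Repeating such a fit on weekly scans and tracking the fitted centre frequency and free spectral range is one way the spectral drifts described above can be quantified.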
<p>HOLODEC, an airborne cloud particle imager, captures holographic images of a fixed volume of cloud to characterize the types and sizes of cloud particles, such as water droplets and ice crystals. Cloud particle properties include position, diameter, and shape. In this work we evaluate the potential for processing HOLODEC data by leveraging a combination of GPU hardware and machine learning with the eventual goal of improving HOLODEC processing speed and performance. We present a hologram processing algorithm, HolodecML, which utilizes a neural network segmentation model and computational parallelization to achieve these goals. HolodecML is trained using synthetically generated holograms based on a model of the instrument, and it predicts masks around particles found within reconstructed images. From these masks, the position and size of the detected particles can be characterized in three dimensions. In order to successfully process real holograms, we find we must apply a series of image corrupting transformations and noise to the synthetic images used in training.</p>
<p>In this evaluation, HolodecML achieved position and size estimation performance comparable to the standard processing method, but it improved particle detection by nearly 20 % on several thousand manually labeled HOLODEC images. However, the particle detection improvement occurred only when image corruption was performed on the simulated images during training, thereby mimicking non-ideal conditions in the actual probe. The trained model also learned to differentiate artifacts and other impurities in the HOLODEC images from the particles, even though no such objects were present in the training data set. By contrast, the standard processing method struggled to separate particles from artifacts. HolodecML also leverages GPUs and parallel computing, which enables large processing speed gains over serial, CPU-only evaluation. Our results demonstrate that the machine-learning-based framework may be a possible path to both improving and accelerating hologram processing. The novelty of the training approach, which leveraged noise as a means of parameterizing non-ideal aspects of the HOLODEC detector, could be applied in other domains where the theoretical model cannot fully describe the real-world operation of the instrument and accurate truth data required for supervised learning cannot be obtained from real-world observations.</p>
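The image-corruption step described above can be sketched as a simple augmentation pipeline. The specific transformations and parameter ranges below are illustrative assumptions, not the settings used in HolodecML:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Illustrative corruption pipeline: degrade ideal synthetic holograms so a
# model trained on them transfers to real instrument images. The transforms
# and parameter ranges are assumptions, not the HolodecML settings.
def corrupt(image: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    out = gaussian_filter(image, sigma=rng.uniform(0.0, 1.5))       # optical blur
    out = out + rng.normal(0.0, rng.uniform(0.0, 0.05), out.shape)  # sensor noise
    gain, offset = rng.uniform(0.8, 1.2), rng.uniform(-0.1, 0.1)
    return np.clip(gain * out + offset, 0.0, 1.0)  # illumination/contrast shift

rng = np.random.default_rng(42)
synthetic = rng.random((64, 64))  # stand-in for a reconstructed hologram plane
augmented = corrupt(synthetic, rng)
```

Sampling the corruption parameters fresh for each training example forces the segmentation model to rely on particle structure rather than on the clean statistics of the simulator.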
We present a quantum mechanical (QM) analysis of Bell's approach to quantum foundations based on his hidden-variable model. We claim, and try to justify, that the Bell model contradicts Heisenberg's uncertainty principle and Bohr's complementarity principle. The aim of this note is to point to the physical seed of these principles: Bohr's quantum postulate, i.e. the existence of an indivisible quantum of action given by the Planck constant h. By contradicting these basic principles of QM, Bell's model implies rejection of this postulate as well. Thus, this hidden-variable model contradicts not only the QM formalism but also the fundamental feature of the quantum world discovered by Planck.
When it comes to responding to the problem of anthropogenic climate change, the overwhelming preference among policy experts is for a system of carbon pricing. This is normally justified on the grounds that it maximizes the welfare of future generations. The objective of this chapter is to provide a philosophical defense of carbon pricing based instead on contractualist foundations. The central claim is that the negative externality of greenhouse gas emissions violates a reciprocity condition that is a central normative constraint in the system of cooperation in our society. A system of carbon taxation is recommended on the grounds that it addresses this externality problem more directly than any other policy alternative. The merits of such a regime are illustrated using the example of agricultural production and the carbon emissions associated with food supply.
Successive urban development in various parts of the world has necessitated further improvement of the infrastructure accompanying constructed facilities. Compacted fine-grained soils are used in infrastructure earthworks such as road embankments, highways and road foundations. Fine-grained soils (especially clayey soils) are considered problematic and can damage roads founded on them, owing to their volume changes, high water content and/or low bearing capacity. Ordinary Portland cement, its components and its residues have been widely used in stabilizing cohesionless soils and some types of problematic soils such as clayey soils. Studies conducted in this field may be classified into three main categories: use of by-products from cement production operations, direct use of cement alone or mixed with other materials, and recycling of cement as concrete waste. The use of cement by-products, especially cement kiln dust, to stabilize or improve clay soil was covered by many studies (Adeyanju & Okeke, 2019; Amadi & Osu, 2018; Miller & Azad, 2000; Naseem et al., 2019). Mixing cement with fly ash has become common as a way to reduce the amount of cement used or to improve specific geotechnical properties of soil (Amu et al., 2008; Chenari et al., 2018; Khemissa & Mahamedi, 2014). Portland cement has also been used with other stabilizing materials to improve soil engineering properties. Lime is used with cement to improve soil strength and reduce swelling and settlement (Amu et al., 2008; Joel & Agbede, 2010; Lemaire et al., 2013; Mousavi & Leong Sing, 2015; Riaz et al., 2014; Saeed et al., 2015; Sharma et al., 2018; Umesha et al., 2009; Wei et al., 2014). Nayak & Sarvade (2012) used cement and quarry dust to improve the shear strength and hydraulic features of lithomarge clay. Ayeldeen & Kitazume (2017) utilized fibers and liquid polymers to enhance the strength of cement-soft clay blends.
The fibers and liquid polymers displayed notable mechanical, economic and environmental prospects as additives to cement for improving soft clay. Organic soils have also become the target of many studies addressing the improvement of their properties by adding cement and other materials (Kalantari & Huat, 2008; Kalantari & Prasad, 2014). Moreover, Osinubi et al. (2011) used an ordinary Portland cement-locust bean waste ash mixture to enhance engineering properties such as the unconfined compressive strength (UCS) and California bearing ratio (CBR) of black cotton clayey soil. Crushed concrete waste, which represents the last form of cement use, has been employed in many studies to improve the properties of clay soils (Abdulnafaa et al., 2019).

Abstract: This research work examines the utilization of cement to improve a low-plasticity clay soil. The soil samples were treated with 2, 4 and 6 % cement and cured for curing times extending to 90 days. Laboratory investigations included unconfined compression, indirect tensile, gas permeability and microstructural tests conducted on the treated samples. The soil-water retention behavior was also investigated. The test results showed that the cement addition improved both the compressive and tensile strength of the soil specimens, and these strengths also increased with curing time. pH and electrical conductivity values were good indicators of the enhancement in strength. The microstructural tests illustrated that the natural soil specimens contain voids and an open structure; they further showed the cementation of soil grains and the filling of voids among soil grains with cementing compounds. The gas permeability and soil-water retention behavior of the soil specimens are strongly related to variations in the soil structure. Further examination illustrated that, at low cement contents, the pore size distribution (PSD) and gas permeability are more sensitive to curing time.
<p>A cloud particle sensor (CPS) sonde is an observing system in which a cloud particle sensor is attached to a radiosonde to observe the vertical structure of cloud properties. The signals obtained from CPS sondes are related to the phase, size, and number of cloud particles. The system offers economic advantages, such as lower personnel and operating costs, compared with aircraft measurements and land-/satellite-based remote sensing. However, the observed information must be appropriately corrected because of several uncertainties. Here we conducted field experiments in the Arctic region, launching approximately 40 CPS sondes between 2018 and 2020. Using these data sets, a practical correction method was proposed to exclude unreliable data, estimate the effective cloud water droplet radius, and determine a correction factor for the total cloud particle count. We applied this method to data obtained in October 2019 over the Arctic Ocean and in March 2020 at Ny-Ålesund, Svalbard, Norway, comparing the results with a particle counter aboard a tethered balloon and with liquid water content retrieved by a microwave radiometer. The estimated total particle count and liquid water content from the CPS sondes generally agree with those data. Although further development and validation of CPS sondes based on dedicated laboratory experiments would be required, the practical correction approach proposed here offers clear advantages in retrieving quantitative information on the vertical distribution of cloud microphysics under conditions of low number concentration.</p>
High-speed working units are widely used in modern construction, for example in the destruction of durable materials and soils. When calculating the power and energy parameters of dynamic working units, one must take into account changes in the nature of the interaction between the cutting element and the medium, as well as the emergence and propagation of soil stresses caused by the action of the cutting element on the boundary of the soil mass. This leads to a stress-strain state in the soil mass that has an oscillatory-wave character.
The nature of this stress-strain state is influenced by the state of the working medium and by the speed of cutting (destruction) of the soil mass.
An unsolved problem in the dynamic destruction of soils is accounting for the kinematic features and work technology of high-speed peripheral and front-end working units. The design parameters of these units must take into account not only the dynamic parameters of the fracture process but also the accumulation of fatigue deformations in the working medium.
<p>Large microwave surface emissivities with a highly heterogeneous
distribution and the relatively small hydrometeor signal over land make it
challenging to use satellite microwave data to retrieve precipitation and to
be assimilated into numerical models. To better understand the microwave
emissivity over land surfaces, we designed and established a ground
observation system for the in situ observation of microwave emissivities
over several typical surfaces. The major components of the system include a
dual-frequency polarized ground microwave radiometer, a mobile observation
platform, and auxiliary sensors to measure the surface temperature and soil
temperature and moisture; moreover, observation fields are designed
comprising five different land surfaces.</p>
<p>Based on the observed data from the mobile system, we preliminarily
investigated the variations in the surface microwave emissivity over
different land surfaces. The results show that the horizontally polarized
emissivity is more sensitive to land surface variability than the
vertically polarized emissivity is: the former decreases to 0.75 over cement
and increases to 0.90 over sand and bare soil and up to 0.97 over grass. The
corresponding emissivity polarization difference is obvious over water
(<span class="inline-formula"><i>></i>0.3</span>) and cement (approximately 0.25) but reduces to 0.1 over
sand and 0.05 over bare soil and almost 0.01 or close to zero over grass;
this trend is similar to that of the <span class="inline-formula"><i>T</i><sub>b</sub></span> polarization difference. At
different elevation angles, the horizontally/vertically polarized
emissivities over land surfaces increase markedly/decrease slightly with
increasing elevation angle but exhibit the opposite trend over water.</p>
<p>The aerosol fine-mode fraction (FMF) is an important optical parameter of aerosols that is difficult to retrieve accurately with traditional satellite remote sensing methods. In this study, FMF retrieval was carried out based on the multiangle polarization data of Polarization and Anisotropy of Reflectances for Atmospheric Science coupled with Observations from Lidar (PARASOL), overcoming the shortcomings of the FMF retrieval algorithm in our previous research. The retrieval was carried out over China and compared with the AErosol RObotic
NETwork (AERONET) ground-based observation results, Moderate Resolution
Imaging Spectroradiometer (MODIS) FMF products, and Generalized Retrieval of
Aerosol and Surface Properties (GRASP) FMF results. In addition, the FMF
retrieval algorithm was applied, a new FMF dataset was produced, and the
annual and quarterly average FMF results from 2006 to 2013 were obtained
for all of China. The research results show that the FMF retrieval results
of this study are comparable with the AERONET ground-based observation
results in China and the correlation coefficient (<span class="inline-formula"><i>r</i></span>), mean absolute error
(MAE), root mean square error (RMSE), and the proportion of results that
fall within the expected error (Within EE) are 0.770, 0.143, 0.170, and
65.01 %, respectively. Compared with the MODIS FMF products, the FMF
results of this study are closer to the AERONET ground-based observations.
Compared with the FMF results of GRASP, the FMF results of this study are
closer to the spatial variation in the ratio of PM<span class="inline-formula"><sub>2.5</sub></span> to PM<span class="inline-formula"><sub>10</sub></span> near
the ground.</p>
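The validation statistics quoted above (r, MAE, RMSE, and the fraction within the expected error) can be computed as in the sketch below; the expected-error envelope used here, ±(0.1 + 15 % of the reference FMF), is an assumed illustration, not necessarily the paper's definition:

```python
import numpy as np

# Validation metrics of the kind reported: correlation coefficient (r), MAE,
# RMSE, and the percentage of retrievals inside an expected-error envelope.
# The EE definition below, +/-(0.1 + 15 % of the reference FMF), is assumed.
def validate(retrieved: np.ndarray, reference: np.ndarray) -> dict:
    err = retrieved - reference
    r = float(np.corrcoef(retrieved, reference)[0, 1])
    mae = float(np.mean(np.abs(err)))
    rmse = float(np.sqrt(np.mean(err ** 2)))
    envelope = 0.1 + 0.15 * reference
    within_ee = float(np.mean(np.abs(err) <= envelope) * 100.0)
    return {"r": r, "MAE": mae, "RMSE": rmse, "WithinEE_%": within_ee}
```

Passing the retrieved FMF values and the collocated AERONET references to such a function yields the four statistics in a single call.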
<p>Frozen hydrometeors are found in a huge range of shapes and sizes, with variability on much smaller scales than those of typical model grid boxes or satellite fields of view. Neither models nor in situ measurements can fully describe this variability, so assumptions have to be made in applications including atmospheric modelling and radiative transfer. In this work, parameter estimation has been used to optimise six different assumptions relevant to frozen hydrometeors in passive microwave radiative transfer. This covers cloud overlap, convective water content and particle size distribution (PSD), the shapes of large-scale snow and convective snow, and an initial exploration of the ice cloud representation (particle shape and PSD combined). These parameters were simultaneously adjusted to find the best fit between simulations from the European Centre for Medium-Range Weather Forecasts (ECMWF) assimilation system and near-global microwave observations covering the frequency range 19 to 190 GHz. The choices for the cloud overlap and the convective particle shape were particularly well constrained (or identifiable), and there was even constraint on the cloud ice PSD. The practical output is a set of improved assumptions to be used in version 13.0 of the Radiative Transfer for TOVS microwave scattering package (RTTOV-SCATT), taking into account newly available particle shapes such as aggregates and hail, as well as additional PSD options. The parameter estimation explored the full parameter space using an efficient assumption of linearly additive perturbations. This helped illustrate issues such as multiple minima in the cost function, and non-Gaussian errors, that would make it hard to implement the same approach in a standard data assimilation system for weather forecasting. Nevertheless, as modelling systems grow more complex, parameter estimation is likely to be a necessary part of the development process.</p>
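The linearly-additive-perturbation assumption mentioned above can be illustrated with a toy search: each parameter's effect on the simulated brightness temperatures is precomputed once, and the full parameter space is then scanned cheaply by summing deltas. All data and dimensions below are synthetic assumptions, not the real ECMWF/RTTOV-SCATT setup:

```python
import itertools
import numpy as np

# Toy parameter estimation under the linearly-additive-perturbation
# assumption: the simulated brightness temperatures for any parameter
# combination are approximated as the control run plus precomputed
# single-parameter deltas. All data here are synthetic; the real system
# compares ECMWF simulations against 19-190 GHz microwave observations.
rng = np.random.default_rng(1)
obs = rng.normal(250.0, 5.0, 100)           # pseudo-observed Tb [K]
control = obs + rng.normal(0.0, 2.0, 100)   # control-run simulated Tb [K]

n_params, n_options = 3, 4
# delta[i, k] is the Tb change when parameter i takes option k
# (option 0 reproduces the control run, i.e. zero delta).
delta = rng.normal(0.0, 1.0, (n_params, n_options, 100))
delta[:, 0, :] = 0.0

def cost(choice):
    sim = control + sum(delta[i, k] for i, k in enumerate(choice))
    return float(np.mean((sim - obs) ** 2))

# Exhaustive scan of the full parameter space (4**3 = 64 combinations).
best = min(itertools.product(range(n_options), repeat=n_params), key=cost)
```

Because each combination costs only a sum of precomputed arrays, the exhaustive scan also exposes multiple minima in the cost surface, one of the issues the abstract highlights.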
As of 2020, the international workshop on Procedural Content Generation enters its second decade. The annual workshop, hosted by the international conference on the Foundations of Digital Games, has collected a corpus of 95 papers published in its first 10 years. This paper provides an overview of the workshop’s activities and surveys the prevalent research topics emerging over the years.
<p>Before the launch of the TROPOspheric Monitoring Instrument (TROPOMI), only two other satellite instruments were able to observe aerosol plume heights globally, the Multi-angle Imaging SpectroRadiometer (MISR) and Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP). The TROPOMI aerosol layer height is a potential game changer, since it has daily global coverage, and the aerosol layer height retrieval is available in near real time. The aerosol layer height can be useful for aviation and air quality alerts, as well as for improving air quality forecasting related to wildfires. Here, TROPOMI's aerosol layer height product is evaluated with MISR and CALIOP observations for wildfire plumes in North America for the 2018 fire season (June to August). Further, observing system simulation experiments were performed to interpret the fundamental differences between the different products. The results show that MISR and TROPOMI are, in theory, very close for aerosol profiles with single plumes. For more complex profiles with multiple plumes, however, different plume heights are retrieved; the MISR plume height represents the top layer, and the plume height retrieved with TROPOMI tends to have an average altitude of several plume layers.</p>
<p>The comparison between TROPOMI and MISR plume heights shows that, on average, the TROPOMI aerosol layer heights are lower, by approximately 600 m, than those from MISR, which is likely due to the different measurement techniques. From the comparison to CALIOP, our results show that the TROPOMI aerosol layer height is more accurate over dark surfaces, for thicker plumes, and for plumes located between approximately 1 and 4.5 km.</p>
<p>MISR and TROPOMI are further used to evaluate the plume height of Environment and Climate Change Canada's operational forecasting system FireWork with fire plume injection height estimates from the Canadian Forest Fire Emissions Prediction System (CFFEPS). The modelled plume heights are similar to the satellite observations but tend to be slightly higher, with average differences of 270–580 and 60–320 m compared to TROPOMI and MISR, respectively.</p>
<p>The development of fast-response analysers for the
measurement of nitrous oxide (N<span class="inline-formula"><sub>2</sub></span>O) has resulted in exciting
opportunities for new experimental techniques beyond commonly used static
chambers and gas chromatography (GC) analysis. For example, quantum cascade
laser (QCL) absorption spectrometers are now being used with eddy covariance
(EC) or automated chambers. However, using a field-based QCL EC system to
also quantify N<span class="inline-formula"><sub>2</sub></span>O concentrations in gas samples taken from static
chambers has not yet been explored. Gas samples from static chambers are
often analysed by GC, a method that requires labour- and time-consuming
procedures off-site. Here, we developed a novel field-based injection
technique that allowed the use of a single QCL for (1) micrometeorological
EC and (2) immediate manual injection of headspace samples taken from static
chambers. To test this approach across a range of low to high N<span class="inline-formula"><sub>2</sub></span>O
concentrations and fluxes, we applied ammonium nitrate (AN) at 0, 300, 600
and 900 kg N ha<span class="inline-formula"><sup>−1</sup></span> (AN<span class="inline-formula"><sub>0</sub></span>, AN<span class="inline-formula"><sub>300</sub></span>, AN<span class="inline-formula"><sub>600</sub></span>, AN<span class="inline-formula"><sub>900</sub>)</span> to
plots on a pasture soil. After analysis, calculated N<span class="inline-formula"><sub>2</sub></span>O fluxes from QCL
(<span class="inline-formula"><i>F</i><sub>N2O_QCL</sub>)</span> were compared with fluxes determined by a
standard method, i.e. laboratory-based GC (<span class="inline-formula"><i>F</i><sub>N2O_GC</sub>)</span>. Subsequently, the comparability of QCL and GC data was tested using
orthogonal regression, Bland–Altman and bioequivalence statistics. For AN-treated plots, mean cumulative N<span class="inline-formula"><sub>2</sub></span>O emissions across the 7 d
campaign were 0.97 (AN<span class="inline-formula"><sub>300</sub>)</span>, 1.26 (AN<span class="inline-formula"><sub>600</sub>)</span> and 2.00 kg N<span class="inline-formula"><sub>2</sub></span>O-N ha<span class="inline-formula"><sup>−1</sup></span> (AN<span class="inline-formula"><sub>900</sub>)</span> for <span class="inline-formula"><i>F</i><sub>N2O_QCL</sub></span> and 0.99
(AN<span class="inline-formula"><sub>300</sub>)</span>, 1.31 (AN<span class="inline-formula"><sub>600</sub>)</span> and 2.03 kg N<span class="inline-formula"><sub>2</sub></span>O-N ha<span class="inline-formula"><sup>−1</sup></span> (AN<span class="inline-formula"><sub>900</sub>)</span> for <span class="inline-formula"><i>F</i><sub>N2O_GC</sub></span>. These <span class="inline-formula"><i>F</i><sub>N2O_QCL</sub></span> and <span class="inline-formula"><i>F</i><sub>N2O_GC</sub></span> were highly correlated (<span class="inline-formula"><i>r</i>=0.996</span>, <span class="inline-formula"><i>n</i>=81</span>) based on orthogonal regression, in agreement following the Bland–Altman approach (i.e. within <span class="inline-formula">±1.96</span> standard deviation of the mean
difference) and shown to be equivalent for all practical purposes. The <span class="inline-formula"><i>F</i><sub>N2O_QCL</sub></span> and <span class="inline-formula"><i>F</i><sub>N2O_GC</sub></span> derived under near-zero flux conditions (AN<span class="inline-formula"><sub>0</sub>)</span> were weakly
correlated (<span class="inline-formula"><i>r</i>=0.306</span>, <span class="inline-formula"><i>n</i>=27</span>) and not found to agree or to be
equivalent. This was likely caused by the calculation of small, but apparent
positive and negative, <span class="inline-formula"><i>F</i><sub>N2O</sub></span> when in fact the actual flux was below the
detection limit of static chambers. Our study demonstrated (1) that the
capability of using one QCL to measure N<span class="inline-formula"><sub>2</sub></span>O at different scales,
including manual injections, offers great potential to advance field
measurements of N<span class="inline-formula"><sub>2</sub></span>O (and other greenhouse gases) in the future and (2) that suitable statistics have to be adopted when formally assessing the
agreement and difference (not only the correlation) between two methods of
measurement.</p>
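The Bland–Altman agreement check used above can be sketched as follows; the function and data are a generic illustration of the method, not the study's implementation:

```python
import numpy as np

# Bland-Altman agreement check between two measurement methods, as used to
# compare the QCL- and GC-derived fluxes. Data here are generic stand-ins.
def bland_altman(a: np.ndarray, b: np.ndarray):
    diff = a - b
    bias = float(diff.mean())        # mean difference between the two methods
    sd = float(diff.std(ddof=1))     # sample SD of the differences
    lower, upper = bias - 1.96 * sd, bias + 1.96 * sd  # limits of agreement
    frac_within = float(np.mean((diff >= lower) & (diff <= upper)))
    return bias, (lower, upper), frac_within
```

Two methods "agree" in the Bland–Altman sense when nearly all paired differences fall inside the ±1.96 SD limits, which is a different (and stricter) question than whether they correlate.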
Quantum Counterfactual Communication is the recently-proposed idea of using quantum physics to send messages between two parties, without any matter/energy transfer associated with the bits sent. While this has excited massive interest, both for potential ‘unhackable’ communication, and insight into the foundations of quantum mechanics, it has been asked whether this process is essentially quantum, or could be performed classically. We examine counterfactual communication, both classical and quantum, and show that the protocols proposed so far for sending signals that don’t involve matter/energy transfer associated with the bits sent must be quantum, insofar as they require wave-particle duality.