896 results for real-scale modelling


Relevance: 30.00%

Abstract:

In the coming decades, the mining industry faces the dual challenge of lowering both its water and energy use. This presents a difficulty, since technological advances that decrease the use of one can increase the use of the other. Historically, energy and water use have been modelled independently, making it difficult to evaluate the true costs and benefits of water and energy improvements. This paper presents a hierarchical systems model that is able to represent interconnected water and energy use at a whole-of-site scale. In order to explore the links between water and energy, four technological advancements have been modelled: the use of dust suppression additives, the adoption of thickened tailings, the transition to dry processing and the incorporation of a treatment plant. The results show a synergy between decreased water and energy use for dust suppression additives, but a trade-off for the others.
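
To make the water-energy coupling concrete, the following is a minimal sketch, assuming a site is modelled as a set of activities that each draw water and energy; the activity names and all figures are illustrative and not taken from the paper's HSM:

```python
# Minimal sketch (not the authors' HSM): a site as a set of activities,
# each drawing water (ML) and energy (GJ), so that a technology change
# can be scored on both resources at once. All figures are illustrative.

def site_totals(activities):
    """Sum water and energy use across all activities on a site."""
    water = sum(a["water"] for a in activities.values())
    energy = sum(a["energy"] for a in activities.values())
    return water, energy

baseline = {
    "dust_suppression": {"water": 400, "energy": 50},
    "tailings":         {"water": 900, "energy": 120},
}

# Hypothetical effect of a dust-suppression additive: less water sprayed
# and fewer truck trips, so water AND energy fall together (a synergy).
with_additive = dict(baseline)
with_additive["dust_suppression"] = {"water": 250, "energy": 40}

for label, acts in [("baseline", baseline), ("additive", with_additive)]:
    w, e = site_totals(acts)
    print(f"{label}: water={w} ML, energy={e} GJ")
```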

Relevance: 30.00%

Abstract:

A single plant cell was modeled with smoothed particle hydrodynamics (SPH) and a discrete element method (DEM) to study the basic micromechanics that govern the cellular structural deformations during drying. This two-dimensional particle-based model consists of two components: a cell fluid model and a cell wall model. The cell fluid was approximated as a highly viscous Newtonian fluid and modeled with SPH. The cell wall was treated as a stiff semi-permeable solid membrane with visco-elastic properties and modeled as a neo-Hookean solid material using DEM. Compared to existing meshfree particle-based plant cell models, we have specifically introduced cell wall–fluid attraction forces and cell wall bending stiffness effects to address the critical shrinkage characteristics of plant cells during drying. Also, a novel moisture-domain-based approach was used to simulate drying mechanisms within the particle scheme. The model performance was found to be mainly influenced by the particle resolution, the initial gap between the outermost fluid particles and the wall particles, and the number of particles in the SPH influence domain. A higher-order smoothing kernel was used with an adaptive smoothing length to improve the stability and accuracy of the model. Cell deformations at different states of cell dryness were qualitatively and quantitatively compared with microscopic experimental findings on apple cells, and a fairly good agreement was observed with some exceptions. The wall–fluid attraction forces and cell wall bending stiffness were found to significantly improve the model predictions. A detailed sensitivity analysis was also performed to further investigate the influence of the wall–fluid attraction forces, cell wall bending stiffness, cell wall stiffness and particle resolution. This novel meshfree modeling approach is highly applicable to cellular-level deformation studies of plant food materials during drying, which involve large deformations.
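
As a pointer to how the SPH side of such a model works, here is a hedged sketch of the standard density summation with a cubic-spline kernel; the paper uses a higher-order kernel with adaptive smoothing length, and the particle positions, masses and smoothing length below are invented:

```python
# Minimal sketch of the SPH idea behind the cell-fluid model: each fluid
# particle's density is a kernel-weighted sum over neighbours within the
# influence domain. The cubic-spline kernel here stands in for the
# higher-order kernel used in the paper; values are illustrative.
import numpy as np

def cubic_spline_w(r, h):
    """2D cubic-spline smoothing kernel with smoothing length h."""
    q = r / h
    sigma = 10.0 / (7.0 * np.pi * h**2)   # 2D normalisation constant
    if q < 1.0:
        return sigma * (1 - 1.5 * q**2 + 0.75 * q**3)
    if q < 2.0:
        return sigma * 0.25 * (2 - q)**3
    return 0.0

def density(positions, masses, h):
    """SPH density summation over all particle pairs."""
    n = len(positions)
    rho = np.zeros(n)
    for i in range(n):
        for j in range(n):
            r = np.linalg.norm(positions[i] - positions[j])
            rho[i] += masses[j] * cubic_spline_w(r, h)
    return rho

pts = np.random.default_rng(1).uniform(0, 1e-4, size=(50, 2))  # 50 particles
print(density(pts, np.full(50, 1e-9), h=2e-5)[:5])
```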

Relevance: 30.00%

Abstract:

Computational fluid dynamics, analytical solutions and mathematical modelling approaches are used to gain insights into the distribution of fumigant gas within farm-scale grain storage silos. Both fan-forced and tablet fumigation are considered in this work, which develops new models for use by researchers, primary producers and silo manufacturers to assist in the eradication of grain storage pests.
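
For the tablet case, where transport is diffusion-driven, a one-dimensional finite-difference sketch conveys the basic idea; the actual work uses full CFD and analytical solutions, and the diffusivity, silo height and boundary conditions below are assumptions for illustration only:

```python
# Illustrative sketch only: 1D diffusion of fumigant spreading upward
# from a tablet at the silo base, solved by explicit finite differences.
import numpy as np

D, height, nz = 1e-5, 10.0, 101       # diffusivity (m^2/s), silo height (m)
dz = height / (nz - 1)
dt = 0.4 * dz**2 / D                  # time step satisfying stability limit
c = np.zeros(nz)
c[0] = 1.0                            # fumigant source at the base

for _ in range(20000):
    c[1:-1] += D * dt / dz**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
    c[0] = 1.0                        # tablet keeps emitting (fixed conc.)
    c[-1] = c[-2]                     # sealed top: zero-flux boundary

print("concentration at mid-height:", round(c[nz // 2], 4))
```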

Relevance: 30.00%

Abstract:

Common method variance (CMV) has received little attention within the field of road safety research despite a heavy reliance on self-report data. Two surveys were completed by 214 motorists over a two-month period, allowing associations between social desirability and key road safety variables, as well as relationships between scales across the two survey waves, to be examined. Social desirability was found to have a strong negative correlation with the Driver Behaviour Questionnaire (DBQ) sub-scales, as well as with age, but not with crashes and offences. Drivers who scored higher on the social desirability scale were also less likely to report aberrant driving behaviours as measured by the DBQ. Controlling for social desirability did not substantially alter the predictive relationship between the DBQ and the crash and offence variables. The strengths of the correlations within and between the two waves were also compared, with the results strongly suggesting that effects associated with CMV were present. Identification of CMV would be enhanced by replicating this study with a larger sample size and by comparing self-report data with official sources.
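
The core of the "controlling for social desirability" step can be sketched as a partial correlation computed from regression residuals; the data below are simulated, not the survey data:

```python
# Sketch of the core analysis on synthetic data: correlate DBQ scores
# with offences, then re-check the relationship with social desirability
# (SD) partialled out via regression residuals.
import numpy as np

rng = np.random.default_rng(0)
n = 214
sd = rng.normal(size=n)                      # social desirability score
dbq = -0.5 * sd + rng.normal(size=n)         # aberrant driving (DBQ)
offences = 0.4 * dbq + rng.normal(size=n)

def residuals(y, x):
    """Residuals of y after regressing out x (adds an intercept)."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

r_raw = np.corrcoef(dbq, offences)[0, 1]
r_partial = np.corrcoef(residuals(dbq, sd), residuals(offences, sd))[0, 1]
print(f"raw r = {r_raw:.2f}, r controlling for SD = {r_partial:.2f}")
```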

Relevance: 30.00%

Abstract:

The mining industry faces three long-term strategic risks in relation to its water and energy use: 1) securing enough water and energy to meet increased production; 2) reducing water use, energy consumption and emissions in response to social, environmental and economic pressures; and 3) understanding the links between water and energy, so that an improvement in one area does not create an adverse effect in another. This project helps the industry analyse these risks through a hierarchical systems model (HSM) that represents water and energy interactions at sub-site, site and regional scales, coupled with a flexible risk framework. The HSM consists of components that represent sources of water and energy, activities that use water and energy, and off-site destinations of water and produced emissions. It can also represent more complex components on a site, with inbuilt examples including tailings dams and water treatment plants. The HSM also allows multiple sites and other infrastructure to be connected together to explore regional water and energy interactions. By representing water and energy as a single interconnected system, the HSM can explore trade-offs and synergies. For example, on a synthetic case study representing a typical site, simulations suggested that while a synergy in terms of water use and energy use could be achieved when chemical additives were used to enhance dust suppression, there were trade-offs when either thickened tailings or dry processing was used. On a regional scale, the HSM was used to simulate various scenarios, including: mines only withdrawing water when needed; achieving economies of scale through use of a single centralised treatment plant rather than smaller decentralised treatment plants; and capturing fugitive emissions for energy generation. The HSM also includes an integrated risk framework for interpreting model output, so that the on-site and off-site impacts of various water and energy management strategies can be compared in a managerial context. The case studies in this report explored company, social and environmental risks for scenarios of regional water scarcity, unregulated saline discharge, and the use of plantation forestry to offset carbon emissions. The HSM was able to represent non-linear causal relationships at the regional scale, such as the forestry scheme offsetting a small percentage of carbon emissions while causing severe regional water shortages. The HSM software developed in this project will be released as an open-source tool to allow industry personnel to easily and inexpensively quantify and explore the links between water use, energy use and carbon emissions. The tool can be easily adapted to represent specific sites or regions. Case studies conducted in this project highlighted the potential complexity of these links between water, energy and carbon emissions, as well as the significance of their cumulative effects over time. A deeper understanding of these links is vital for the mining industry in order to progress to more sustainable operations, and the HSM provides an accessible, robust framework for investigating them.
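
As one hedged illustration of the regional treatment-plant scenario, the sketch below compares decentralised and centralised treatment energy under an assumed economy-of-scale exponent; it is not the released HSM software, and all figures are invented:

```python
# Illustrative sketch (not the HSM): three sites either each run their
# own treatment plant or share one centralised plant, with treatment
# energy assumed sub-linear in plant size (economy of scale).
def treatment_energy(volume_ml, scale_exp=0.8, unit_gj=2.0):
    """Energy (GJ) to treat a volume (ML); sub-linear in plant size."""
    return unit_gj * volume_ml**scale_exp

site_demands = [120.0, 300.0, 80.0]   # ML/yr per site (made-up figures)

decentralised = sum(treatment_energy(v) for v in site_demands)
centralised = treatment_energy(sum(site_demands))
print(f"decentralised: {decentralised:.0f} GJ/yr")
print(f"centralised:   {centralised:.0f} GJ/yr")  # lower if scale_exp < 1
```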

Relevance: 30.00%

Abstract:

Recently, attempts to improve decision making in species management have focussed on uncertainties associated with modelling temporal fluctuations in populations. Reducing model uncertainty is challenging; while larger samples improve estimation of species trajectories and reduce statistical errors, they typically amplify variability in observed trajectories. In particular, traditional modelling approaches aimed at estimating population trajectories usually do not account well for nonlinearities and uncertainties associated with the multi-scale observations characteristic of large spatio-temporal surveys. We present a Bayesian semi-parametric hierarchical model for simultaneously quantifying uncertainties associated with model structure and parameters, and scale-specific variability over time. We estimate uncertainty across a four-tiered spatial hierarchy of coral cover from the Great Barrier Reef. Coral variability is well described; however, our results show that, in the absence of additional model specifications, conclusions regarding coral trajectories become highly uncertain when considering multiple reefs, suggesting that management should focus more at the scale of individual reefs. The approach presented facilitates the description and estimation of population trajectories and associated uncertainties when variability cannot be attributed to specific causes and origins. We argue that our model can unlock value contained in large-scale datasets, provide guidance for understanding sources of uncertainty, and support better-informed decision making.
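
A minimal sketch of the hierarchical idea, written for PyMC under stated assumptions: two tiers (reefs within a region) rather than the paper's four, a plain Normal hierarchy rather than the semi-parametric form, and simulated logit-scale cover data:

```python
# Hedged sketch, not the authors' model: a two-level Bayesian hierarchy
# for logit-transformed coral cover, with a between-reef spread term
# that carries the scale-specific uncertainty.
import numpy as np
import pymc as pm

rng = np.random.default_rng(42)
n_reefs, n_obs = 8, 25
reef_idx = rng.integers(0, n_reefs, size=n_obs)
y = rng.normal(-1.0 + 0.3 * reef_idx / n_reefs, 0.5, size=n_obs)  # fake data

with pm.Model():
    mu = pm.Normal("mu", 0.0, 2.0)                 # regional mean (logit)
    tau = pm.HalfNormal("tau", 1.0)                # between-reef spread
    reef = pm.Normal("reef", mu, tau, shape=n_reefs)
    sigma = pm.HalfNormal("sigma", 1.0)            # within-reef noise
    pm.Normal("obs", reef[reef_idx], sigma, observed=y)
    trace = pm.sample(1000, tune=1000, progressbar=False)

print(trace.posterior["tau"].mean().item())  # uncertainty added per tier
```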

Relevance: 30.00%

Abstract:

Project work can involve multiple people from varying disciplines coming together to solve problems as a group. Large-scale interactive displays present new opportunities to support such interactions with interactive and semantically enabled cooperative work tools, such as intelligent mind maps. In this paper, we present a novel digital, touch-enabled mind-mapping tool as a first step towards achieving such a vision. This first prototype allows an evaluation of the benefits of a digital environment for a task that would otherwise be performed on paper or flat interactive surfaces. Observations and surveys of 12 participants in 3 groups allowed the formulation of several recommendations for further research into: new methods for capturing text input on touch screens; the inclusion of complex structures; multi-user environments and how users make the shift from single-user applications; and how best to navigate large screen real estate in a touch-enabled, co-present multi-user setting.

Relevance: 30.00%

Abstract:

PURPOSE: This paper describes dynamic agent composition, used to support the development of flexible and extensible large-scale agent-based models (ABMs). This approach was motivated by a need to extend and modify, with ease, an ABM with an underlying networked structure as more information becomes available. Flexibility was also sought, so that simulations can be set up with ease, without the need to program. METHODS: The dynamic agent composition approach consists of having agents, whose implementation has been broken into atomic units, come together at runtime to form the complex system representation on which simulations are run. These components capture information at a fine level of detail and provide a vast range of combinations and options for a modeller creating ABMs. RESULTS: A description of dynamic agent composition is given in this paper, as well as details of its implementation within MODAM (MODular Agent-based Model), a software framework applied to the planning of the electricity distribution network. Illustrations of the implementation of dynamic agent composition are given for that domain throughout the paper. It is, however, expected that this approach will be beneficial to other problem domains, especially those with a networked structure, such as water or gas networks. CONCLUSIONS: Dynamic agent composition has many advantages over the way agent-based models are traditionally built, for users and developers as well as for agent-based modelling as a scientific approach. Developers can extend the model without the need to access or modify previously written code; they can develop groups of entities independently and add them to those already defined to extend the model. Users can mix-and-match already implemented components to form large-scale ABMs, allowing them to quickly set up simulations and easily compare scenarios without the need to program. Dynamic agent composition provides a natural simulation space over which ABMs of networked structures are represented, facilitating their implementation, and verification and validation of models are facilitated by quickly setting up alternative simulations.
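
A minimal sketch of the composition idea, under stated assumptions: atomic behaviour units register themselves by name and are mixed together at runtime to assemble an agent. The component names are hypothetical, not MODAM's actual units:

```python
# Sketch of dynamic agent composition: behaviours are registered as
# atomic units and composed at runtime, so new scenarios need no new
# glue code. Component names and figures below are invented.
REGISTRY = {}

def component(name):
    """Decorator registering an atomic unit for runtime composition."""
    def wrap(fn):
        REGISTRY[name] = fn
        return fn
    return wrap

@component("solar_generation")
def solar(agent, hour):
    agent["net_kw"] -= 3.0 if 9 <= hour <= 15 else 0.0

@component("household_load")
def load(agent, hour):
    agent["net_kw"] += 1.5 if 17 <= hour <= 21 else 0.4

def compose(*names):
    """Assemble an agent's step function from named components."""
    parts = [REGISTRY[n] for n in names]
    def step(agent, hour):
        for p in parts:
            p(agent, hour)
    return step

house = {"net_kw": 0.0}
step = compose("household_load", "solar_generation")  # mix-and-match
for h in range(24):
    step(house, h)
print("net daily demand (kW-steps):", round(house["net_kw"], 1))
```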

Relevance: 30.00%

Abstract:

Stochastic modelling is critical in GNSS data processing. Currently, GNSS data processing commonly relies on an empirical stochastic model which may not reflect the actual data quality or noise characteristics. This paper examines real-time GNSS observation noise estimation methods that determine the observation variance from a single receiver's data stream. The methods involve three steps: forming a linear combination, handling the ionosphere and ambiguity biases, and estimating the variance. Two distinct approaches are applied to overcome the ionosphere and ambiguity biases: the time-differenced method and the polynomial prediction method, respectively. The real-time variance estimation methods are compared with the zero-baseline and short-baseline methods. The proposed method requires only single-receiver observations and is thus applicable to both differenced and un-differenced data processing modes. However, the methods may be limited to normal ionospheric conditions and GNSS receivers with low autocorrelation. Experimental results also indicate that the proposed method can result in more realistic parameter precision.
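
The time-differenced step can be sketched as follows, with illustrative numbers: differencing successive epochs cancels the constant ambiguity and most of the slowly varying ionospheric bias, leaving roughly twice the single-epoch noise variance:

```python
# Illustrative sketch of the time-differenced approach: the variance of
# the epoch-to-epoch differences, divided by two, estimates the noise
# variance of a single observation. All signal parameters are made up.
import numpy as np

rng = np.random.default_rng(7)
n = 2000
t = np.arange(n)
iono = 0.05 * np.sin(2 * np.pi * t / 1800)   # slow ionospheric drift (m)
ambiguity = 12.345                           # constant over the arc (m)
noise_std = 0.003                            # true observation noise (m)
phase = ambiguity + iono + rng.normal(0, noise_std, n)

d = np.diff(phase)                 # epoch-to-epoch time differences
est_var = d.var() / 2.0            # each diff holds two noise samples
print(f"estimated sigma = {np.sqrt(est_var)*1000:.2f} mm "
      f"(true {noise_std*1000:.2f} mm)")
```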

Relevance: 30.00%

Abstract:

Epigenetic changes correspond to heritable modifications of the chromosome structure which do not alter the DNA sequence but do affect gene expression. These mechanisms play an important role in normal cell differentiation, but their aberration is also associated with several diseases, including cancer and neural disorders. Despite intensive studies in recent years, however, the contribution of these modifications remains largely unquantified due to overall system complexity and insufficient data. Computational models can provide powerful auxiliary tools to experimentation, not least because they can span scales from the sub-cellular through to cell populations (or networks of genes). In this paper, the challenges to the development of realistic cross-scale models are discussed and illustrated with respect to current work.

Relevance: 30.00%

Abstract:

There is an increasing need in biology and clinical medicine to robustly and reliably measure tens-to-hundreds of peptides and proteins in clinical and biological samples with high sensitivity, specificity, reproducibility and repeatability. Previously, we demonstrated that LC-MRM-MS with isotope dilution has suitable performance for quantitative measurements of small numbers of relatively abundant proteins in human plasma, and that the resulting assays can be transferred across laboratories while maintaining high reproducibility and quantitative precision. Here we significantly extend that earlier work, demonstrating that 11 laboratories using 14 LC-MS systems can develop, determine analytical figures of merit, and apply highly multiplexed MRM-MS assays targeting 125 peptides derived from 27 cancer-relevant proteins and 7 control proteins to precisely and reproducibly measure the analytes in human plasma. To ensure consistent generation of high quality data we incorporated a system suitability protocol (SSP) into our experimental design. The SSP enabled real-time monitoring of LC-MRM-MS performance during assay development and implementation, facilitating early detection and correction of chromatographic and instrumental problems. Low to sub-nanogram/mL sensitivity for proteins in plasma was achieved by one-step immunoaffinity depletion of 14 abundant plasma proteins prior to analysis. Median intra- and inter-laboratory reproducibility was <20%, sufficient for most biological studies and candidate protein biomarker verification. Digestion recovery of peptides was assessed and quantitative accuracy improved using heavy isotope labeled versions of the proteins as internal standards. Using the highly multiplexed assay, participating laboratories were able to precisely and reproducibly determine the levels of a series of analytes in blinded samples used to simulate an inter-laboratory clinical study of patient samples. Our study further establishes that LC-MRM-MS using stable isotope dilution, with appropriate attention to analytical validation and appropriate quality control measures, enables sensitive, specific, reproducible and quantitative measurements of proteins and peptides in complex biological matrices such as plasma.
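
As a hedged sketch of how such reproducibility figures are derived, the snippet below computes intra- and inter-laboratory coefficients of variation from simulated light/heavy peak-area ratios; the laboratory count echoes the study design, but the data are made up:

```python
# Minimal sketch with made-up numbers: stable isotope dilution reduces
# quantification to a light/heavy peak-area ratio per peptide, and
# reproducibility is reported as a coefficient of variation (CV%).
import numpy as np

rng = np.random.default_rng(3)
n_labs, n_reps = 11, 3
true_ratio = 0.80                    # analyte vs heavy-labelled standard

# Simulated ratios: replicate noise within each laboratory plus a small
# lab-to-lab bias.
lab_bias = rng.normal(1.0, 0.05, size=(n_labs, 1))
ratios = true_ratio * lab_bias * rng.normal(1.0, 0.08, size=(n_labs, n_reps))

intra_cv = (ratios.std(axis=1, ddof=1) / ratios.mean(axis=1)) * 100
inter_cv = (ratios.mean(axis=1).std(ddof=1) / ratios.mean()) * 100
print(f"median intra-lab CV: {np.median(intra_cv):.1f}%")
print(f"inter-lab CV:        {inter_cv:.1f}%")
```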

Relevance: 30.00%

Abstract:

Moral vitalism refers to a tendency to view good and evil as actual forces that can influence people and events. We introduce a scale designed to assess the belief in moral vitalism. High scorers on the scale endorse items such as “There are underlying forces of good and evil in this world”. After establishing the reliability and criterion validity of the scale (Studies 1, 2a, 2b), we examined the predictive validity of the moral vitalism scale, showing that “moral vitalists” worry about being possessed by evil (Study 3), being contaminated through contact with evil people (Study 4), and forfeiting their own mental purity (Study 5). We discuss the nature of moral vitalism and the implications of the construct for understanding the role of metaphysical lay theories about the nature of good and evil in moral reasoning.
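
Scale reliability of this kind is conventionally assessed with Cronbach's alpha; the sketch below computes it on simulated item responses and is illustrative only, not an analysis of the study's data:

```python
# Sketch of a standard reliability check (Cronbach's alpha) of the kind
# used to establish a new scale; item responses here are simulated.
import numpy as np

def cronbach_alpha(items):
    """items: (respondents x items) matrix of Likert responses."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(11)
latent = rng.normal(size=(300, 1))                   # belief in moral vitalism
items = latent + rng.normal(0, 0.8, size=(300, 6))   # six correlated items
print(f"alpha = {cronbach_alpha(items):.2f}")        # ~0.7+ suggests reliability
```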

Relevance: 30.00%

Abstract:

3D printing (3Dp) has long been used in the manufacturing sector as a way to automate and accelerate production and to reduce waste materials. It is able to build a wide variety of objects, provided the necessary specifications are given to the printer and the limited range of available materials does not present a problem. With 3Dp becoming cheaper, more reliable and, as a result, more prevalent in the world at large, it may soon make inroads into the construction industry. Little is known, however, of 3Dp's current use in the construction industry and its potential for the future, and this paper seeks to rectify this situation by providing a review of the relevant literature. In doing this, the three main 3Dp methods of contour crafting, concrete printing and D-shape printing are described; as opposed to the traditional construction method of cutting materials down to size, these deliver only what is needed for completion, vastly reducing waste. Also identified is 3Dp's potential to enable buildings to be constructed many times faster and with significantly reduced labour costs. In addition, it is clear that construction 3Dp can allow the further inclusion of Building Information Modelling in the construction process, streamlining and improving the scheduling requirements of a project. However, current 3Dp processes are known to be costly, unsuited to large-scale products and conventional design approaches, and limited in the range of materials that can be used. Moreover, the only successful examples of construction 3Dp in action to date have occurred in controlled laboratory environments and, as real-world trials have yet to be completed, it remains to be seen whether the technology can be equally proficient in practical situations. Key Words: 3D Printing; Contour Crafting; Concrete Printing; D-shape; Building Automation.

Relevance: 30.00%

Abstract:

The most important aspect of modelling a geological variable, such as metal grade, is the spatial correlation. Spatial correlation describes the relationship between realisations of a geological variable sampled at different locations. Any method for spatially modelling such a variable should be capable of accurately estimating the true spatial correlation. Conventional kriged models are the most commonly used in mining for estimating grade or other variables at unsampled locations, and these models use the variogram or covariance function to model the spatial correlations in the process of estimation. However, this usage assumes that the relationships between observations of the variable of interest at nearby locations are influenced only by the vector distance between the locations. This means that these models assume linear spatial correlation of grade. In reality, the relationship with an observation of grade at a nearby location may be influenced by both the distance between the locations and the values of the observations (i.e. non-linear spatial correlation, such as may exist for variables of interest in geometallurgy). Hence, using a kriged model to estimate the grade of unsampled locations may lead to inaccurate estimation of the ore reserve when non-linear spatial correlation is present. Copula-based methods, which are widely used in financial and actuarial modelling to quantify non-linear dependence structures, may offer a solution. This method was introduced into geostatistical modelling by Bárdossy and Li (2008) to quantify the non-linear spatial dependence structure in a groundwater quality measurement network. Their copula-based spatial modelling is applied in this research paper to estimate the grade of 3D blocks. Furthermore, real-world mining data are used to validate this model. These copula-based grade estimates are compared with the results of conventional ordinary and lognormal kriging to assess the reliability of this method.
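
A hedged sketch of the Gaussian-copula building block behind such approaches (in the spirit of Bárdossy and Li, 2008): grades are mapped to normal scores via their ranks, and spatial dependence is then measured on those scores, which can capture dependence that varies with grade value. The lag-pair data below are simulated, not the mining dataset:

```python
# Sketch: rank-based normal-score transform plus dependence on the
# transformed scale, the first step of copula-based geostatistics.
import numpy as np
from scipy.stats import norm

def normal_scores(x):
    """Rank-based transform of a sample to standard normal scores."""
    ranks = np.argsort(np.argsort(x)) + 1
    return norm.ppf(ranks / (len(x) + 1))

rng = np.random.default_rng(5)
n = 500
z1 = rng.normal(size=n)
z2 = 0.6 * z1 + 0.8 * rng.normal(size=n)       # correlated latent field
grade_a, grade_b = np.exp(z1), np.exp(z2)      # skewed (lognormal) grades

u, v = normal_scores(grade_a), normal_scores(grade_b)
print(f"copula (normal-score) correlation: {np.corrcoef(u, v)[0, 1]:.2f}")
```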

Relevance: 30.00%

Abstract:

Pilot and industrial scale dilute acid pretreatment data can be difficult to obtain due to the significant infrastructure investment required. Consequently, models of dilute acid pretreatment by necessity use laboratory-scale data to determine kinetic parameters and make predictions about optimal pretreatment conditions at larger scales. In order for these recommendations to be meaningful, the ability of laboratory-scale models to predict pilot and industrial scale yields must be investigated. A mathematical model of the dilute acid pretreatment of sugarcane bagasse has previously been developed by the authors. This model was able to successfully reproduce the experimental yields of xylose and short-chain xylooligomers obtained at the laboratory scale. In this paper, the ability of the model to reproduce pilot-scale yield and composition data is examined. It was found that, in general, the model over-predicted the pilot-scale reactor yields by a significant margin. Models that appear very promising at the laboratory scale may have limitations when predicting yields at a pilot or industrial scale. It is difficult to comment on whether there are any consistent trends in optimal operating conditions between reactor-scale and laboratory-scale hydrolysis due to the limited reactor datasets available. Further investigation is needed to determine whether the model has some efficacy when the kinetic parameters are re-evaluated by fitting to reactor-scale data; however, this requires the compilation of larger datasets. Alternatively, laboratory-scale mathematical models may have enhanced utility for predicting larger-scale reactor performance if bulk mass transport and fluid flow considerations are incorporated into the fibre-scale equations. This work reinforces the need for appropriate attention to be paid to pilot-scale experimental development when moving from laboratory to pilot and industrial scales for new technologies.
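
The kinetic skeleton behind such pretreatment models is typically a chain of first-order reactions; the sketch below integrates an assumed xylan to xylooligomer to xylose to degradation chain, with invented rate constants rather than the fitted values from the paper:

```python
# Sketch of the usual first-order kinetic chain for dilute acid
# pretreatment: xylan -> xylooligomers -> xylose -> degradation.
import numpy as np
from scipy.integrate import solve_ivp

k1, k2, k3 = 0.08, 0.25, 0.02   # per-minute rate constants (assumed)

def rhs(t, y):
    xylan, oligo, xylose = y
    return [-k1 * xylan,
            k1 * xylan - k2 * oligo,
            k2 * oligo - k3 * xylose]

sol = solve_ivp(rhs, (0, 60), [100.0, 0.0, 0.0])   # start: 100% xylan
xylan, oligo, xylose = sol.y[:, -1]
print(f"after 60 min: xylose yield = {xylose:.1f}% of initial xylan")
```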