512 results for Reliable Computations


Relevance:

10.00%

Publisher:

Abstract:

A numerical study using large eddy simulation is carried out to investigate the heat and toxic gases released from fires in real road tunnels. Tunnel fire disasters over the previous decade have drawn increasing attention from researchers to the design of safe and reliable ventilation systems. In this research, a real tunnel with a 10 MW fire (approximately the heat release rate of a burning bus) at the middle of the tunnel is simulated using FDS (Fire Dynamics Simulator) for different ventilation velocities. Vertical profiles of carbon monoxide concentration and temperature are shown for various locations to explore the flow field. It is found that, as the longitudinal ventilation velocity increases, the vertical profile gradients of both CO concentration and smoke temperature are reduced. However, a relatively large longitudinal ventilation velocity leads to a high similarity between the vertical profile of CO volume concentration and that of temperature rise.

Reliable communication is one of the major concerns in wireless sensor networks (WSNs). Multipath routing is an effective way to improve communication reliability in WSNs. However, most existing multipath routing protocols for sensor networks are reactive and require dynamic route discovery. If there are many sensor nodes between a source and a destination, the route discovery process creates a long end-to-end transmission delay, which causes difficulties in some time-critical applications. To overcome this difficulty, efficient route update and maintenance processes are proposed in this paper. The approach limits routing overhead with a two-tier routing architecture and combines piggyback and trigger updates to replace the periodic update process, which is the main source of unnecessary routing overhead. Simulations demonstrate the effectiveness of the proposed processes in reducing total routing overhead relative to existing popular routing protocols.

Laboratory-based studies of human dietary behaviour benefit from highly controlled conditions; however, this approach can lack ecological validity. Identifying a reliable method to capture and quantify natural dietary behaviours represents an important challenge for researchers. In this study, we scrutinised cafeteria-style meals in the ‘Restaurant of the Future.’ Self-selected meals were weighed and photographed, both before and after consumption. Using standard portions of the same foods, these images were independently coded to produce accurate and reliable estimates of (i) initial self-served portions, and (ii) food remaining at the end of the meal. Plate cleaning was extremely common; in 86% of meals at least 90% of self-selected calories were consumed. Males ate a greater proportion of their self-selected meals than did females. Finally, when participants visited the restaurant more than once, the correspondence between selected portions was better predicted by the weight of the meal than by its energy content. These findings illustrate the potential benefits of meal photography in this context. However, they also highlight significant limitations, in particular, the need to exclude large amounts of data when one food obscures another.

Advances in algorithms for approximate sampling from a multivariate target function have led to solutions to challenging statistical inference problems that would otherwise not be considered by the applied scientist. Such sampling algorithms are particularly relevant to Bayesian statistics, since the target function is the posterior distribution of the unobservables given the observables. In this thesis we develop, adapt and apply Bayesian algorithms, whilst addressing substantive applied problems in biology and medicine as well as other applications. For an increasing number of high-impact research problems, the primary models of interest are often sufficiently complex that the likelihood function is computationally intractable. Rather than discard these models in favour of inferior alternatives, a class of Bayesian "likelihood-free" techniques (often termed approximate Bayesian computation (ABC)) has emerged in recent years, which avoids direct likelihood computation by repeatedly sampling data from the model and comparing observed and simulated summary statistics. In Part I of this thesis we utilise sequential Monte Carlo (SMC) methodology to develop new algorithms for ABC that are more efficient in terms of the number of model simulations required and are almost black-box, since very little algorithmic tuning is required. In addition, we address the issue of deriving appropriate summary statistics to use within ABC via a goodness-of-fit statistic and indirect inference. Another important problem in statistics is the design of experiments: that is, how one should select the values of the controllable variables in order to achieve some design goal. The presence of parameter and/or model uncertainty is a computational obstacle when designing experiments, and can lead to inefficient designs if not accounted for correctly. The Bayesian framework accommodates such uncertainties in a coherent way.
If the amount of uncertainty is substantial, it can be of interest to perform adaptive designs in order to accrue information and make better decisions about future design points. This is of particular interest if the data can be collected sequentially. In a sense, the current posterior distribution becomes the new prior distribution for the next design decision. Part II of this thesis creates new algorithms for Bayesian sequential design that accommodate parameter and model uncertainty using SMC. The algorithms are substantially faster than previous approaches, allowing the simulation properties of various design utilities to be investigated in a more timely manner. Furthermore, the approach offers convenient estimation of Bayesian utilities and other quantities that are particularly relevant in the presence of model uncertainty. Finally, Part III of this thesis tackles a substantive medical problem. A neurological disorder known as motor neuron disease (MND) progressively causes motor neurons to lose the ability to innervate the muscle fibres, causing the muscles to eventually waste away. When this occurs the motor unit effectively ‘dies’. There is no cure for MND, and fatality often results from a lack of muscle strength to breathe. The prognosis for many forms of MND (particularly amyotrophic lateral sclerosis (ALS)) is particularly poor, with patients usually surviving only a small number of years after the initial onset of disease. Measuring the progress of diseases of the motor units, such as ALS, is a challenge for clinical neurologists. Motor unit number estimation (MUNE) is an attempt to assess underlying motor unit loss directly, rather than via indirect techniques such as muscle strength assessment, which is generally unable to detect progression due to the body’s natural attempts at compensation.
Part III of this thesis builds upon a previous Bayesian technique, which developed a sophisticated statistical model that takes into account physiological information about motor unit activation and various sources of uncertainty. More specifically, we develop a more reliable MUNE method by marginalising over latent variables in order to improve the performance of a previously developed reversible jump Markov chain Monte Carlo sampler. We make other subtle changes to the model and algorithm to improve the robustness of the approach.
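The likelihood-free idea behind ABC can be sketched in a few lines. The example below is plain rejection ABC on a toy normal model (the thesis develops far more efficient SMC variants); the model, prior, summary statistic and tolerance here are all illustrative assumptions, not the thesis's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Observed" data: draws from a normal model with unknown mean (truth = 3).
observed = rng.normal(3.0, 1.0, size=100)
obs_summary = observed.mean()          # summary statistic: the sample mean

def simulate(theta):
    """Sample a data set from the model instead of evaluating a likelihood."""
    return rng.normal(theta, 1.0, size=100)

# Rejection ABC: keep prior draws whose simulated summary lands close to the
# observed summary (tolerance epsilon = 0.1).
accepted = []
for _ in range(20_000):
    theta = rng.uniform(-10.0, 10.0)   # draw from a vague uniform prior
    if abs(simulate(theta).mean() - obs_summary) < 0.1:
        accepted.append(theta)

posterior_mean = float(np.mean(accepted))   # concentrates near the true mean
```

Most prior draws are rejected, which is exactly the inefficiency (in the number of model simulations) that the SMC-based ABC algorithms of Part I are designed to reduce.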

Reliable approaches for predicting pollutant build-up are essential for accurate urban stormwater quality modelling. Based on an in-depth investigation of metal build-up on residential road surfaces, this paper presents empirical models for predicting metal loads on these surfaces. The study investigated metals commonly present in the urban environment. The analysis found that the build-up processes for metals primarily originating from anthropogenic sources (copper and zinc) and geogenic sources (aluminium, calcium, iron and manganese) differed. Chromium and nickel were below detection limits. Lead was primarily associated with geogenic sources, but also exhibited a significant relationship with anthropogenic sources. The empirical prediction models developed were validated using an independent data set and found to have relative prediction errors of 12-50%, which is generally acceptable for complex systems such as urban road surfaces. Moreover, the predicted values were very close to the observed values and well within the 95% prediction interval.

Moving fronts of cells are essential features of embryonic development, wound repair and cancer metastasis. This paper describes a set of experiments to investigate the roles of random motility and proliferation in driving the spread of an initially confined cell population. The experiments include an analysis of cell spreading when proliferation was inhibited. Our data have been analysed using two mathematical models: a lattice-based discrete model and a related continuum partial differential equation model. We obtain independent estimates of the random motility parameter, D, and the intrinsic proliferation rate, λ, and we confirm that these estimates lead to accurate modelling predictions of the position of the leading edge of the moving front as well as the evolution of the cell density profiles. Previous work suggests that systems with a high λ/D ratio will be characterized by steep fronts, whereas systems with a low λ/D ratio will lead to shallow diffuse fronts and this is confirmed in the present study. Our results provide evidence that continuum models, based on the Fisher–Kolmogorov equation, are a reliable platform upon which we can interpret and predict such experimental observations.
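The continuum model referred to above can be sketched numerically. Below is a minimal explicit finite-difference solution of the Fisher–Kolmogorov equation du/dt = D d²u/dx² + λu(1 − u); the parameter values, grid and initial condition are illustrative assumptions, not the estimates obtained in the paper.

```python
import numpy as np

# Minimal explicit finite-difference sketch of the Fisher-Kolmogorov equation
#   du/dt = D d2u/dx2 + lam * u * (1 - u)
# D and lam are illustrative values, not the paper's fitted estimates.
D, lam = 1.0, 1.0
L, nx = 100.0, 501
dx = L / (nx - 1)
dt = 0.2 * dx**2 / D                 # comfortably inside the stability limit
x = np.linspace(0, L, nx)

u = np.where(x < 10.0, 1.0, 0.0).astype(float)  # initially confined population

t_end = 20.0
for _ in range(int(t_end / dt)):
    lap = np.zeros_like(u)
    lap[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2
    lap[0], lap[-1] = lap[1], lap[-2]           # crude zero-flux boundaries
    u = u + dt * (D * lap + lam * u * (1 - u))

# Leading edge: rightmost point where density exceeds 0.5. For large times the
# front advances at speed ~ 2*sqrt(lam*D) (here ~2), so by t = 20 it should sit
# well past its initial position x = 10.
front = x[np.max(np.where(u > 0.5))]
```

Raising λ relative to D in this sketch steepens the travelling front, matching the high-λ/D versus low-λ/D behaviour described in the abstract.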

This work presents an assessment of the coprecipitation technique for the reliable production of high-temperature superconducting (HTS) copper-oxide powders in quantities scaled up to 1 kg. This process affords precise control of cation stoichiometry (< 4% relative), occurs rapidly (almost instantaneously) and can be suitably developed for large-scale (e.g. tonne) manufacture of HTS materials. The process is based upon simple control of the chemistry of the cation solution and precipitation with oxalic acid. This coprecipitation method is applicable to all copper-oxides and has been demonstrated in this work using over thirty separate experiments for the following compositions: YBa2Cu3O7-δ, Y2BaCuO5 and YBa2Cu4O8. The precursor powders formed via this coprecipitation process are fine-grained (∼5-10 nm), chemically homogeneous at the nanometre scale and reactive. Conversion to phase-pure HTS powders can therefore occur in minutes at appropriate firing temperatures. © 1995.

HRTEM has been used to examine illite/smectite from the Mancos shale; rectorite from Garland County, Arkansas; illite from Silver Hill, Montana; Na-smectite from Crook County, Wyoming; corrensite from Packwood, Washington; and diagenetic chlorite from the Tuscaloosa formation. Thin specimens were prepared by ion milling, ultra-microtome sectioning and/or grain dispersal on a porous carbon substrate. Some smectite-bearing clays were also examined after intercalation with dodecylamine hydrochloride (DH). Intercalation of smectite with DH proved to be a reliable method of HRTEM imaging of expanded smectite, d(001) = 16 Å, which could then be distinguished from unexpanded illite, d(001) = 10 Å. Lattice fringes of basal spacings of DH-intercalated rectorite and illite/smectite showed 26 Å periodicity. These data support XRD studies which suggest that these samples are ordered, interstratified varieties of illite and smectite. The ion-thinned, unexpanded corrensite sample showed discrete crystallites containing 10 Å and 14 Å basal spacings, corresponding to collapsed smectite and chlorite, respectively. Regions containing disordered layers of chlorite and smectite were also noted. Crystallites containing regular alternations of smectite and chlorite were not common. These HRTEM observations of corrensite did not corroborate the XRD data. Particle sizes parallel to the c axis ranged widely for each sample studied, and many particles showed basal dimensions equivalent to more than five layers. -J.M.H.

A review of 291 catalogued particles on the basis of particle size, shape, bulk chemistry and texture is used to establish a reliable taxonomy. Extraterrestrial materials occur in three defined categories: spheres, aggregates and fragments. Approximately 76% of aggregates are of probable extraterrestrial origin, whereas spheres contain the smallest proportion of extraterrestrial material (approximately 43%). -B.M.

Reliable pollutant build-up prediction plays a critical role in the accuracy of urban stormwater quality modelling outcomes. However, water quality data collection is resource-demanding compared to streamflow data monitoring, for which a greater quantity of data is generally available. Consequently, available water quality data sets span only relatively short time scales, unlike water quantity data. Therefore, the ability to take due consideration of the variability associated with pollutant processes and natural phenomena is constrained. This in turn gives rise to uncertainty in the modelling outcomes, as research has shown that pollutant loadings on catchment surfaces and rainfall within an area can vary considerably over space and time scales. The assessment of model uncertainty is therefore an essential element of informed decision making in urban stormwater management. This paper presents the application of a range of regression approaches, namely ordinary least squares regression, weighted least squares regression and Bayesian weighted least squares regression, for the estimation of uncertainty associated with pollutant build-up prediction using limited data sets. The study outcomes confirmed that the use of ordinary least squares regression with fixed model inputs and limited observational data may not provide realistic estimates. The stochastic nature of the dependent and independent variables needs to be taken into consideration in pollutant build-up prediction. It was found that the Bayesian approach, combined with Monte Carlo simulation, provides a powerful tool that makes the best use of the available knowledge in the prediction, and thereby presents a practical solution to counteract the limitations otherwise imposed on water quality modelling.
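The contrast between a fixed-input least squares fit and a Monte Carlo treatment of uncertainty can be sketched as follows. The synthetic data, the log-linear build-up form, and the residual bootstrap (standing in for the paper's Bayesian weighted least squares) are all illustrative assumptions, not the paper's models or measurements.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative synthetic build-up data: pollutant load vs antecedent dry days.
dry_days = np.array([1, 2, 3, 5, 7, 10, 14, 21], dtype=float)
load = 2.0 * dry_days**0.4 * np.exp(rng.normal(0.0, 0.1, dry_days.size))

# Ordinary least squares in log space yields one fixed parameter set ...
X = np.column_stack([np.ones_like(dry_days), np.log(dry_days)])
y = np.log(load)
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# ... whereas a Monte Carlo treatment propagates uncertainty: here residuals
# are resampled (a simple bootstrap, standing in for the paper's Bayesian
# weighted least squares) to obtain a distribution of predictions.
resid = y - X @ beta_ols
x_new = np.array([1.0, np.log(9.0)])       # predict the load after 9 dry days
preds = []
for _ in range(2000):
    y_boot = X @ beta_ols + rng.choice(resid, size=resid.size, replace=True)
    b, *_ = np.linalg.lstsq(X, y_boot, rcond=None)
    preds.append(float(np.exp(x_new @ b)))
lo, hi = np.percentile(preds, [2.5, 97.5])  # 95% band instead of one number
```

The single OLS prediction says nothing about how far off it may be; the simulated band makes the limited-data uncertainty explicit, which is the practical point the paper argues.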

Secure communications among a large number of sensor nodes that are randomly scattered over a hostile territory necessitate efficient key distribution schemes. However, due to the limited resources at sensor nodes, such schemes cannot be based on post-deployment computations. Instead, pairwise (symmetric) keys are required to be pre-distributed by assigning a list of keys (a.k.a. a key-chain) to each sensor node. If a pair of nodes does not have a common key after deployment, then they must find a key-path with secured links. The objective is to minimize the key-chain size while (i) maximizing pairwise key sharing probability and resilience, and (ii) minimizing average key-path length. This paper presents a deterministic key distribution scheme based on expander graphs. It shows how to map the parameters (e.g., degree, expansion and diameter) of a Ramanujan expander graph to the desired properties of a key distribution scheme for a physical network topology.
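The graph-to-keys mapping described above can be sketched directly: vertices are sensor nodes, each edge carries one pre-distributed pairwise key, and a key-path is a path in the graph. In the sketch below a small circulant graph stands in for the paper's Ramanujan expander; the node count and offsets are illustrative choices, not values from the paper.

```python
from collections import deque

# Deterministic key pre-distribution sketch on a 6-regular circulant graph:
# a node's key-chain is its set of incident edges, two nodes share a key iff
# they are adjacent, and a key-path is a path of secured links.
n = 20
offsets = [1, 3, 7]

def neighbors(v):
    return {(v + d) % n for d in offsets} | {(v - d) % n for d in offsets}

def key_chain(v):
    """Key-chain of node v: one symmetric key per incident edge."""
    return {frozenset({v, w}) for w in neighbors(v)}

def key_path_length(u, v):
    """Shortest secured key-path between u and v (breadth-first search)."""
    seen, queue = {u}, deque([(u, 0)])
    while queue:
        node, dist = queue.popleft()
        if node == v:
            return dist
        for w in neighbors(node):
            if w not in seen:
                seen.add(w)
                queue.append((w, dist + 1))
    return None   # disconnected (cannot happen for these offsets)

# Every key-chain holds exactly 2 * len(offsets) keys (the graph degree), and
# adjacent nodes share exactly one key: the key of their common edge.
assert all(len(key_chain(v)) == 6 for v in range(n))
assert key_chain(0) & key_chain(1) == {frozenset({0, 1})}
```

Here the degree (key-chain size) trades off directly against the diameter (worst-case key-path length); an expander makes that trade-off near-optimal, which is the mapping the paper formalises.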

Background: The 30-item USDI is a self-report measure that assesses depressive symptoms among university students. It consists of three correlated factors: Lethargy, Cognitive-Emotional and Academic Motivation. The current research used confirmatory factor analysis to assess construct validity and determine whether the original factor structure would be replicated in a different sample. Psychometric properties were also examined. Method: Participants were 1148 students (mean age 22.84 years, SD = 6.85) across all faculties of a large Australian metropolitan university. Students completed a questionnaire comprising the USDI, the Depression Anxiety Stress Scale (DASS) and the Life Satisfaction Scale (LSS). Results: The three-correlated-factor model was shown to be an acceptable fit to the data, indicating sound construct validity. Internal consistency of the scale was also demonstrated to be sound, with high Cronbach's alpha values. Temporal stability of the scale was shown to be strong through test-retest analysis. Finally, concurrent and discriminant validity were examined via correlations between the USDI and the DASS subscales as well as the LSS, with sound results further supporting the construct validity of the scale. Cut-off points were also developed to aid total score interpretation. Limitations: Response rates are unclear. In addition, the representativeness of the sample could potentially be improved through targeted recruitment (i.e. reviewing the online sample statistics during data collection, examining representativeness trends and addressing particular faculties within the university that were underrepresented). Conclusions: The USDI provides a valid and reliable method of assessing depressive symptoms among university students.
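The internal-consistency statistic reported above, Cronbach's alpha, is straightforward to compute. The sketch below uses synthetic item scores, not the study's data; the single-factor structure and noise level are illustrative assumptions.

```python
import numpy as np

# Cronbach's alpha, the internal-consistency statistic the abstract reports:
#   alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
def cronbach_alpha(items):
    items = np.asarray(items, dtype=float)   # rows = respondents, cols = items
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Synthetic item scores: six noisy indicators of one underlying factor, so the
# items are strongly correlated and alpha should come out high.
rng = np.random.default_rng(2)
latent = rng.normal(size=(200, 1))
items = latent + 0.5 * rng.normal(size=(200, 6))
alpha = float(cronbach_alpha(items))
```

In a scale like the USDI the same computation would be run per subscale; values well above the conventional 0.7 threshold are what "high Cronbach's alpha values" refers to.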

Nutrient balances, such as the nitrogen and phosphorus balances, are increasingly used as indicators of the environmental performance of the agricultural sector in international and global contexts. However, there is still a lack of harmony among countries in the methods used for estimating nutrient balances, because of disagreement regarding the accuracy and uncertainty of different accounting methods. This lack of harmony further increases uncertainty in international comparisons. This paper provides a new framework for nutrient balance calculation using the farm-gate accounting method. The calculation under this new framework takes advantage of the availability of data from FAO and other reliable national and international sources. As a result, the proposed framework is highly adaptable in many countries, making global comparison feasible. The paper also proposes three criteria (adaptability, accuracy and interpretability) to assess the appropriateness of a nutrient accounting method. Based on these criteria, the paper provides a comprehensive comparison of the farm-gate and soil-surface methods for accounting country-level nutrient balances of agricultural production. The paper identifies some shortcomings of the soil-surface balance and shows that the farm-gate method has greater potential to provide an accurate and meaningful estimation of national nutrient balances.
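A farm-gate balance reduces to nutrients crossing the farm gate inward minus nutrients crossing it outward, usually expressed per hectare. The sketch below illustrates the arithmetic only; the item names and nitrogen figures are made-up placeholders, not FAO data, and the input list is a simplification of a full farm-gate inventory.

```python
# Farm-gate nutrient balance sketch: gate inputs minus gate outputs, per ha.
# All figures are illustrative placeholders, not FAO data.
inputs_kg_n = {
    "mineral_fertiliser": 120_000,   # kg N purchased
    "imported_feed": 45_000,         # kg N in feed brought onto farms
    "biological_fixation": 15_000,   # kg N fixed by legumes
}
outputs_kg_n = {
    "crop_products_sold": 90_000,    # kg N leaving in crop products
    "animal_products_sold": 30_000,  # kg N leaving in animal products
}
area_ha = 1_000

surplus_kg_n = sum(inputs_kg_n.values()) - sum(outputs_kg_n.values())
balance_per_ha = surplus_kg_n / area_ha   # kg N surplus per hectare
```

A soil-surface balance would instead track flows at the soil boundary (e.g. manure applied rather than feed imported), which is where the two methods diverge.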

None of the currently used tonometers produces estimated IOP values that are free of errors. Measurement unreliability arises from the indirect measurement of corneal deformation and the fact that pressure calculations are based on population-averaged parameters of the anterior segment. Reliable IOP values are crucial for understanding and monitoring a number of eye pathologies, e.g. glaucoma. We have combined high-speed swept source OCT with an air-puff chamber. The system provides direct measurement of the deformation of the cornea and the anterior surface of the lens. This paper describes in detail the performance of the air-puff ssOCT instrument. We present different approaches to data presentation and analysis. Changes in deformation amplitude appear to be a good indicator of IOP changes. However, it seems that in order to provide accurate intraocular pressure values, additional information on corneal biomechanics is necessary. We believe that such information could be extracted from data provided by the air-puff ssOCT.

Average speed enforcement is a relatively new approach gaining popularity throughout Europe and Australia. This paper reviews the evidence regarding the impact of this approach on vehicle speeds, crash rates and a number of additional road safety and public health outcomes. The economic and practical viability of the approach as a road safety countermeasure is also explored. A literature review, with an international scope, of both published and grey literature was conducted. There is a growing body of evidence of road safety benefits associated with average speed enforcement, including high rates of compliance with speed limits, reductions in average and 85th percentile speeds, and reduced speed variability between vehicles. Moreover, the approach has been demonstrated to be particularly effective in reducing excessive speeding behaviour. Reductions in crash rates have also been reported in association with average speed enforcement, particularly for fatal and serious injury crashes. In addition, the approach has been shown to improve traffic flow and reduce vehicle emissions, and has been associated with high levels of public acceptance. Average speed enforcement offers a network-wide approach to managing speeds that reduces the impact of the time and distance halo effects associated with other automated speed enforcement approaches. Although comparatively expensive, it represents a highly reliable approach to speed enforcement that produces considerable returns on investment through reduced social and economic costs associated with crashes.