248 results for Statistical physics


Relevance:

20.00%

Publisher:

Abstract:

This thesis explored the development of statistical methods to support the monitoring and improvement of the quality of treatment delivered to patients undergoing coronary angioplasty procedures. To achieve this goal, a suite of outcome measures was identified to characterise the performance of the service, statistical tools were developed to monitor the various indicators, and measures to strengthen governance processes were implemented and validated. Although this work focused on the pursuit of these aims in the context of an angioplasty service located at a single clinical site, development of the tools and techniques was undertaken mindful of their potential application to other clinical specialties and a wider, potentially national, scope.
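The abstract does not name the specific monitoring tools developed. One standard chart for monitoring a stream of binary clinical outcomes, such as procedural complications, is the Bernoulli CUSUM; the sketch below is a generic illustration, not the thesis's method, and the rates `p0`, `p1` and threshold `h` are hypothetical values.

```python
import math

def bernoulli_cusum(outcomes, p0=0.05, p1=0.20, h=3.0):
    """Bernoulli CUSUM over a stream of binary outcomes (1 = adverse event).

    Accumulates log-likelihood-ratio weights comparing an acceptable event
    rate p0 against an unacceptable rate p1. Returns the index of the first
    alarm (statistic >= h), or None if the chart never signals.
    """
    w_event = math.log(p1 / p0)           # weight added when an adverse event occurs
    w_ok = math.log((1 - p1) / (1 - p0))  # (negative) weight for an uneventful case
    s = 0.0
    for i, y in enumerate(outcomes):
        s = max(0.0, s + (w_event if y else w_ok))  # statistic never drops below 0
        if s >= h:
            return i
    return None
```

While events occur near the acceptable rate, the statistic repeatedly drains back to zero; a sustained rise in the event rate pushes it over the threshold within a handful of cases.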

Nitrous oxide emissions from soil are known to be spatially and temporally volatile. Reliable estimation of emissions over a given time and space depends on measuring with sufficient intensity, but deciding on the number of measuring stations and the frequency of observation can be vexing. The question also arises of whether low-frequency manual observations provide results comparable to high-frequency automated sampling. Data collected from a replicated field experiment were studied intensively with the intention of giving statistically robust guidance on these issues. In the experiment, nitrous oxide soil-to-air flux was monitored within 10 m by 2.5 m plots by automated closed chambers at a 3 h average sampling interval and by manual static chambers at a three-day average sampling interval over sixty days. Trends in flux over time observed by the static chambers were mostly within the auto-chamber bounds of experimental error. Cumulative nitrous oxide emissions as measured by each system were also within error bounds. Under the temporal response pattern in this experiment, no significant loss of information was observed after culling the data to simulate results under various low-frequency scenarios. Within the confines of this experiment, observations from the manual chambers were not spatially correlated above distances of 1 m. Statistical power was therefore found to improve with increased replicates per treatment or chambers per replicate. Careful after-action review of experimental data can deliver savings for future work.
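The culling exercise described above can be mimicked on synthetic data: integrate a flux time series with the trapezoidal rule at the full 3 h resolution, thin it to a three-day interval, and compare the cumulative totals. The emission pulse below is an arbitrary smooth curve for illustration, not the experiment's data.

```python
import numpy as np

def cumulative_emission(t_hours, flux):
    """Cumulative emission as the trapezoidal integral of a flux series."""
    t = np.asarray(t_hours, dtype=float)
    f = np.asarray(flux, dtype=float)
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(t)))

# Synthetic smooth emission pulse over 60 days, sampled every 3 h
# (auto chambers) versus every 3 days (manual chambers).
t_auto = np.arange(0.0, 60 * 24 + 1, 3.0)   # hours
flux = 100.0 * np.exp(-t_auto / 200.0)      # arbitrary flux units
t_manual = t_auto[::24]                     # every 72 h = 3 days
flux_manual = flux[::24]

full = cumulative_emission(t_auto, flux)
culled = cumulative_emission(t_manual, flux_manual)
rel_diff = abs(full - culled) / full        # small when the trend is smooth
```

For a smooth temporal pattern like this one, the thinned series recovers the cumulative total to within a percent or two, echoing the experiment's finding that culling cost little information; a spikier pattern would not be so forgiving.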

Cone-beam computed tomography (CBCT) has enormous potential to improve the accuracy of treatment delivery in image-guided radiotherapy (IGRT). To assist radiotherapists in interpreting these images, we use a Bayesian statistical model to label each voxel according to its tissue type. The rich sources of prior information in IGRT are incorporated into a hidden Markov random field model of the 3D image lattice. Tissue densities in the reference CT scan are estimated using inverse regression and then rescaled to approximate the corresponding CBCT intensity values. The treatment planning contours are combined with published studies of physiological variability to produce a spatial prior distribution for changes in the size, shape and position of the tumour volume and organs at risk. The voxel labels are estimated using iterated conditional modes. The accuracy of the method has been evaluated using 27 CBCT scans of an electron density phantom. The mean voxel-wise misclassification rate was 6.2%, with a Dice similarity coefficient of 0.73 for liver, muscle, breast and adipose tissue. By incorporating prior information, we are able to successfully segment CBCT images. This could be a viable approach for automated, online image analysis in radiotherapy.
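Iterated conditional modes itself is simple to sketch. The toy below labels a 2D lattice using a Gaussian intensity likelihood and a Potts smoothing prior; the paper's richer priors (rescaled planning-CT densities, contour-based spatial priors) are not reproduced, and the parameter values are illustrative.

```python
import numpy as np

def icm_segment(image, class_means, sigma, beta=1.0, n_iter=5):
    """Label each pixel by iterated conditional modes (ICM).

    Per-pixel energy = Gaussian negative log-likelihood of the intensity
    under the candidate class, plus beta for each 4-neighbour whose current
    label disagrees (a Potts smoothing prior). Starts from the maximum-
    likelihood labelling and greedily lowers the energy pixel by pixel.
    """
    class_means = np.asarray(class_means, dtype=float)
    labels = np.argmin((image[..., None] - class_means) ** 2, axis=-1)
    H, W = image.shape
    for _ in range(n_iter):
        for i in range(H):
            for j in range(W):
                best_k, best_e = labels[i, j], np.inf
                for k in range(len(class_means)):
                    e = (image[i, j] - class_means[k]) ** 2 / (2 * sigma ** 2)
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < H and 0 <= nj < W and labels[ni, nj] != k:
                            e += beta  # penalty for disagreeing neighbours
                    if e < best_e:
                        best_k, best_e = k, e
                labels[i, j] = best_k
    return labels
```

On a noisy two-class image the smoothing prior flips isolated maximum-likelihood errors back to the locally dominant label, which is the mechanism the paper exploits at full 3D scale.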

This thesis explored the knowledge and reasoning of young children in solving novel statistical problems, and the influence of problem context and design on their solutions. It found that young children's statistical competencies are underestimated, and that problem design and context facilitated children's application of a wide range of knowledge and reasoning skills, none of which had been taught. A qualitative design-based research method, informed by the Models and Modeling perspective (Lesh & Doerr, 2003), underpinned the study. Data modelling activities incorporating picture story books were used to contextualise the problems. Children applied real-world understanding to problem solving, including attribute identification, categorisation and classification skills. Intuitive and metarepresentational knowledge, together with inductive and probabilistic reasoning, was used to make sense of data, and a beginning awareness of statistical variation and informal inference was visible.

The determination of characteristics of articular cartilage such as thickness, stiffness and swelling, especially in a form that can facilitate real-time decisions and diagnostics, is still a matter for research and development. This paper correlates near infrared (NIR) spectroscopy with mechanically measured cartilage thickness to establish a fast, non-destructive, repeatable and precise protocol for determining this tissue property. Statistical correlation was conducted between the thickness of bovine cartilage specimens (n = 97) and regions of their near infrared spectra. Nine regions were established along the full absorption spectrum of each sample and were correlated with the thickness using partial least squares (PLS) regression multivariate analysis. The coefficient of determination (R2) varied between 53 and 93%, with the most predictive region for cartilage thickness (R2 = 93.1%, p < 0.0001) lying in the wavenumber range 5350–8850 cm−1. Our results demonstrate that the thickness of articular cartilage can be measured spectroscopically using NIR light. This protocol is potentially beneficial to clinical practice and surgical procedures in the treatment of joint diseases such as osteoarthritis.
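A PLS regression of thickness on spectra can be sketched with a minimal NIPALS PLS1 implementation. The "spectra" below are synthetic, with thickness linearly encoded in one absorption band; the band location, noise level and component count are illustrative assumptions, not the paper's bovine data.

```python
import numpy as np

def pls1(X, y, n_components=3):
    """Minimal NIPALS PLS1 regression; returns a prediction function."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc                      # weight: covariance direction
        w /= np.linalg.norm(w)
        t = Xc @ w                         # score
        tt = t @ t
        p = Xc.T @ t / tt                  # X loading
        q_a = (yc @ t) / tt                # y loading
        Xc = Xc - np.outer(t, p)           # deflate
        yc = yc - q_a * t
        W.append(w); P.append(p); q.append(q_a)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    B = W @ np.linalg.solve(P.T @ W, q)    # overall regression coefficients
    return lambda Xnew: y_mean + (Xnew - x_mean) @ B

# Synthetic absorbance spectra: thickness drives channels 80-119.
rng = np.random.default_rng(1)
n, p = 97, 200
thickness = rng.uniform(0.5, 2.0, n)
spectra = rng.normal(0.0, 0.05, (n, p))
spectra[:, 80:120] += 0.3 * thickness[:, None]

predict = pls1(spectra, thickness, n_components=3)
resid = thickness - predict(spectra)
r2 = 1.0 - resid.var() / thickness.var()   # in-sample R^2
```

With the signal concentrated in one band, the first latent component already captures most of the variance; in practice the paper's region-by-region R2 values would come from cross-validated fits rather than this in-sample check.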

The thesis was a step forward in predicting the levels and sources of polycyclic aromatic hydrocarbons (PAHs) in sediments of the Brisbane River, especially after the Brisbane floods in 2011. It employed different statistical techniques to provide valuable information that may assist source control and the formulation of pollution mitigation measures for the river.

Asset service organisations often recognise asset management as a core competence to deliver benefits to their business. But how do organisations know whether their asset management processes are adequate? Asset management maturity models, which combine best practices and competencies, provide a useful approach to test the capacity of organisations to manage their assets. Asset management frameworks are required to meet the dynamic challenges of managing assets in contemporary society. Although existing models are subject to wide variations in their implementation and sophistication, they also display a distinct weakness in that they tend to focus primarily on the operational and technical level and neglect the levels of strategy, policy and governance, as well as the social and human resources – the people elements. Moreover, asset management maturity models have to respond to external environmental factors, such as climate change and sustainability, and stakeholder and community demand management. Drawing on five dimensions of effective asset management – spatial, temporal, organisational, statistical, and evaluation – as identified by Amadi Echendu et al. [1], this paper carries out a comprehensive comparative analysis of six existing maturity models to identify the gaps in key process areas. Results suggest incorporating these into an integrated approach to assess the maturity of asset-intensive organisations. It is contended that the adoption of an integrated asset management maturity model will enhance effective and efficient delivery of services.

This chapter argues for the need to restructure children’s statistical experiences from the beginning years of formal schooling. The ability to understand and apply statistical reasoning is paramount across all walks of life, as seen in the variety of graphs, tables, diagrams, and other data representations requiring interpretation. Young children are immersed in our data-driven society, with early access to computer technology and daily exposure to the mass media. With the rate of data proliferation have come increased calls for advancing children’s statistical reasoning abilities, commencing with the earliest years of schooling (e.g., Langrall et al. 2008; Lehrer and Schauble 2005; Shaughnessy 2010; Whitin and Whitin 2011). Several articles (e.g., Franklin and Garfield 2006; Langrall et al. 2008) and policy documents (e.g., National Council of Teachers of Mathematics 2006) have highlighted the need for a renewed focus on this component of early mathematics learning, with children working mathematically and scientifically in dealing with real-world data. One approach to this component in the beginning school years is through data modelling (English 2010; Lehrer and Romberg 1996; Lehrer and Schauble 2000, 2007)...

The utility of a novel technique for determining the ignition delay in a compression ignition engine has been shown. This method utilises statistical modelling in the Bayesian paradigm to accurately resolve the start of combustion from a band-pass filtered in-cylinder pressure signal. Applied to neat diesel and six biofuels, including four fractionations of palm oil of varying carbon chain length and degree of unsaturation, the relationships between ignition delay, cetane number and oxygen content have been explored. The expected negative relationship between ignition delay and cetane number held, as did the positive relationship between ignition delay and oxygen content. The degree of unsaturation was also identified as a potential factor influencing the ignition delay.
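The paper's Bayesian model is not specified in this abstract. As a simplified stand-in for the same idea, the start of combustion can be located as the change point where the variance of the band-passed pressure signal jumps, found here by maximising a two-segment Gaussian profile log-likelihood; this is a maximum-likelihood sketch, not the authors' method.

```python
import numpy as np

def variance_changepoint(x, min_seg=20):
    """Most likely index at which the variance of a zero-mean signal changes.

    For each candidate split k, scores a two-segment Gaussian model with
    segment variances at their maximum-likelihood values; returns the k
    with the highest profile log-likelihood.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    best_k, best_ll = None, -np.inf
    for k in range(min_seg, n - min_seg):
        v1, v2 = np.var(x[:k]), np.var(x[k:])
        ll = -0.5 * (k * np.log(v1) + (n - k) * np.log(v2))
        if ll > best_ll:
            best_k, best_ll = k, ll
    return best_k
```

A full Bayesian treatment would put a prior on the change index and report its posterior rather than a single argmax, which is what allows principled uncertainty on the ignition delay.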

To investigate the correlation between postmenopausal osteoporosis (PMO) and the pathogenesis of periodontitis, ovariectomized rats were generated and experimental periodontitis was induced using a silk ligature. The inflammatory factors and bone metabolic markers were measured in the serum and periodontal tissues of ovariectomized rats using an automatic chemistry analyzer, enzyme-linked immunosorbent assays, and immunohistochemistry. The bone mineral density of the whole body, pelvis, and spine was analyzed using dual-energy X-ray absorptiometry and image analysis. All data were analyzed using SPSS 13.0 statistical software. It was found that ovariectomy could upregulate the expression of interleukin-6 (IL-6), the receptor activator of nuclear factor-κB ligand (RANKL), and osteoprotegerin (OPG) and downregulate IL-10 expression in periodontal tissues, which resulted in progressive alveolar bone loss in experimental periodontitis. This study indicates that changes in cytokines and bone turnover markers in the periodontal tissues of ovariectomized rats contribute to the damage of periodontal tissues.

Statistical methodology was applied to a survey of time-course incidence of four viruses (alfalfa mosaic virus, clover yellow vein virus, subterranean clover mottle virus and subterranean clover red leaf virus) in improved pastures in southern regions of Australia.

In particle-strengthened metallic alloys, fatigue damage incubates at inclusion particles near the surface or at changes of geometry. Micromechanical simulations of inclusions are performed so that the fatigue damage incubation mechanisms can be categorized. As the micro-plasticity gradient field differs around different inclusions, a novel concept for nonlocal evaluation of micro-plasticity intensity is introduced. The effects of void aspect ratio and spatial distribution on fatigue incubation life are quantified in the high-cycle fatigue regime. Finally, these effects are integrated on the basis of the statistics of the inclusions to predict the fatigue life of structural components.

Cyclostationary models for the diagnostic signals measured on faulty rotating machineries have proved to be successful in many laboratory tests and industrial applications. The squared envelope spectrum has been pointed out as the most efficient indicator for the assessment of second order cyclostationary symptoms of damages, which are typical, for instance, of rolling element bearing faults. In an attempt to foster the spread of rotating machinery diagnostics, the current trend in the field is to reach higher levels of automation of the condition monitoring systems. For this purpose, statistical tests for the presence of cyclostationarity have been proposed in recent years. The statistical thresholds proposed in the past for the identification of cyclostationary components were obtained under the hypothesis of a white noise signal when the component is healthy. This assumption, coupled with the non-white nature of real signals, implies the necessity of pre-whitening or filtering the signal in optimal narrow bands, increasing the complexity of the algorithm and the risk of losing diagnostic information or introducing biases in the result. In this paper, the authors introduce an original analytical derivation of the statistical tests for cyclostationarity in the squared envelope spectrum, dropping the hypothesis of white noise from the beginning. The effect of first order and second order cyclostationary components on the distribution of the squared envelope spectrum is quantified and the effectiveness of the newly proposed threshold verified, providing a sound theoretical basis and a practical starting point for efficient automated diagnostics of machine components such as rolling element bearings. The analytical results are verified by means of numerical simulations and by using experimental vibration data of rolling element bearings.
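The squared envelope spectrum itself can be computed with a few FFTs: build the analytic signal by zeroing the negative frequencies, square its magnitude, and take the spectrum of the result. A numpy-only sketch (the normalisation convention is a choice, not the paper's):

```python
import numpy as np

def squared_envelope_spectrum(x, fs):
    """Squared envelope spectrum of a real signal sampled at fs.

    The analytic signal is built by an FFT-based Hilbert transform
    (doubling positive frequencies, zeroing negative ones); peaks in the
    spectrum of the squared envelope reveal cyclic modulation frequencies
    such as bearing fault rates.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    analytic = np.fft.ifft(X * h)
    env2 = np.abs(analytic) ** 2                     # squared envelope
    ses = np.abs(np.fft.fft(env2 - env2.mean())) / n  # drop DC, normalise
    freqs = np.arange(n) * fs / n
    return freqs[: n // 2], ses[: n // 2]
```

For an amplitude-modulated carrier, the dominant line of the squared envelope spectrum sits at the modulation frequency, which is exactly the cyclostationary symptom the statistical thresholds discussed above are designed to test.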

Diagnostics of rolling element bearings involves a combination of different techniques of signal enhancement and analysis. The most common procedure comprises a first step of order tracking and synchronous averaging, which removes from the signal the undesired components synchronous with the shaft harmonics, and a final step of envelope analysis to obtain the squared envelope spectrum. This indicator has been studied thoroughly, and statistically based criteria have been obtained for identifying damaged bearings. The statistical thresholds are valid only if all the deterministic components in the signal have been removed. Unfortunately, in various industrial applications characterized by heterogeneous vibration sources, the first step of synchronous averaging is not sufficient to eliminate the deterministic components completely, and an additional step of pre-whitening is needed before the envelope analysis. Different techniques have been proposed in the past with this aim: the most widespread are linear prediction filters and spectral kurtosis. Recently, a new technique for pre-whitening has been proposed, based on cepstral analysis: the so-called cepstrum pre-whitening. Owing to its low computational requirements and its simplicity, it seems a good candidate for the intermediate pre-whitening step in an automatic damage recognition algorithm. In this paper, the effectiveness of the new technique will be tested on data measured on a full-scale industrial bearing test rig able to reproduce harsh conditions of operation. A benchmark comparison with the traditional pre-whitening techniques will be made as a final step in verifying the potential of cepstrum pre-whitening.
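Part of the appeal of cepstrum pre-whitening is that deleting the log-magnitude content of the signal in the cepstral domain reduces to a one-line frequency-domain operation: dividing the spectrum by its own magnitude. The output keeps the phase of the input (and hence its impulsive transients) but has a flat magnitude spectrum. A minimal sketch:

```python
import numpy as np

def cepstrum_prewhitening(x, eps=1e-12):
    """Cepstrum pre-whitening of a real signal.

    Equivalent to zeroing the real cepstrum at all nonzero quefrencies:
    the spectrum is divided by its own magnitude, flattening the magnitude
    spectrum while preserving phase. eps guards against division by zero.
    """
    X = np.fft.fft(np.asarray(x, dtype=float))
    return np.real(np.fft.ifft(X / (np.abs(X) + eps)))
```

Because the operation touches only magnitudes, strong discrete tones and resonances are suppressed uniformly across the band, without choosing filter orders or narrow-band centre frequencies as linear prediction or spectral kurtosis require.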

The use of Mahalanobis squared distance–based novelty detection in statistical damage identification has become increasingly popular in recent years. The merit of the Mahalanobis squared distance–based method is that it is simple and requires low computational effort, enabling the use of a higher dimensional damage-sensitive feature, which is generally more sensitive to structural changes. Mahalanobis squared distance–based damage identification is also believed to be one of the most suitable methods for modern sensing systems such as wireless sensors. Despite these advantages, the method is rather strict in its input requirements, as it assumes the training data to be multivariate normal, an assumption that cannot always be met, particularly at an early monitoring stage. As a consequence, it may result in an ill-conditioned training model with erroneous novelty detection and damage identification outcomes. To date, there appears to be no study on how to systematically cope with such practical issues, especially in the context of a statistical damage identification problem. To address this need, this article proposes a controlled data generation scheme based upon the Monte Carlo simulation methodology, with the addition of several controlling and evaluation tools to assess the condition of the output data. By evaluating the convergence of the data condition indices, the proposed scheme is able to determine the optimal setups for the data generation process and subsequently avoid unnecessarily excessive data. The efficacy of this scheme is demonstrated via applications to benchmark structure data from the field.
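The core Mahalanobis squared distance computation is compact: fit a mean and covariance on baseline (healthy) features, then score new feature vectors; a threshold can be set from a chi-squared quantile or, as below, empirically from the training scores. A sketch of the detector only, not of the paper's data generation scheme:

```python
import numpy as np

def fit_msd(train):
    """Fit mean and inverse covariance of baseline features.

    Returns a function mapping feature vectors to their Mahalanobis
    squared distance from the baseline distribution; large values flag
    novelty (potential damage).
    """
    mu = train.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(train, rowvar=False))
    def msd(x):
        d = np.atleast_2d(x) - mu
        return np.einsum('ij,jk,ik->i', d, cov_inv, d)  # row-wise d C^-1 d^T
    return msd
```

The ill-conditioning the paper targets shows up precisely here: with too few (or non-normal) training samples the covariance estimate degrades and `np.linalg.inv` amplifies the error, which is why the size and condition of the training set matter.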