249 results for STATISTICAL STRENGTH


Relevance: 20.00%

Publisher:

Abstract:

Purpose. To create a binocular statistical eye model based on previously measured ocular biometric data. Methods. Thirty-nine parameters were determined for a group of 127 healthy subjects (37 male, 90 female; 96.8% Caucasian) with an average age of 39.9 ± 12.2 years and a spherical equivalent refraction of −0.98 ± 1.77 D. These parameters described the biometry of both eyes and the subjects' age. Missing parameters were complemented by data from a previously published study. After confirmation of the Gaussian shape of their distributions, these parameters were used to calculate their mean and covariance matrices, which in turn defined a multivariate Gaussian distribution. From this distribution, arbitrary amounts of random biometric data could be generated and sampled to create a realistic population of random eyes. Results. All parameters had Gaussian distributions, with the exception of those describing total refraction (i.e., three parameters per eye). After these non-Gaussian parameters were omitted from the model, the generated data were found to be statistically indistinguishable from the original data for the remaining 33 parameters (TOST [two one-sided t tests]; P < 0.01). Parameters derived from the generated data were likewise statistically indistinguishable from those calculated from the original data (P > 0.05), with the sole exception of the lens refractive index, for which the generated data had a significantly larger SD. Conclusions. A statistical eye model can describe the biometric variations found in a population and is a useful addition to the classic eye models.
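The generation step described above (sample mean and covariance fitted to measured data, then draws from the resulting multivariate Gaussian) can be sketched with NumPy. The parameter count, means and spreads below are illustrative stand-ins, not the study's 39 measured biometric quantities:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "measured" biometry: n subjects x p parameters. The
# values here are purely illustrative assumptions.
n, p = 127, 5
measured = rng.normal(loc=[43.0, 7.8, 3.6, 23.5, 40.0],
                      scale=[1.5, 0.25, 0.3, 0.9, 12.0],
                      size=(n, p))

# Mean vector and covariance matrix define the multivariate Gaussian.
mu = measured.mean(axis=0)
cov = np.cov(measured, rowvar=False)

# Draw any number of synthetic "eyes" from the fitted distribution;
# correlations between parameters are preserved via the covariance.
synthetic = rng.multivariate_normal(mu, cov, size=1000)
print(synthetic.shape)  # (1000, 5)
```

Because the draws share the fitted covariance, realistic inter-parameter correlations (for example between axial length and refraction) carry over into the synthetic population.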

Relevance: 20.00%

Publisher:

Abstract:

Concerns regarding groundwater contamination with nitrate and the long-term sustainability of groundwater resources have prompted the development of a multi-layered three-dimensional (3D) geological model to characterise the aquifer geometry of the Wairau Plain, Marlborough District, New Zealand. The 3D geological model, which consists of eight litho-stratigraphic units, has subsequently been used to synthesise hydrogeological and hydrogeochemical data for different aquifers in an approach that aims to demonstrate how the integration of water chemistry data within the physical framework of a 3D geological model can help to better understand and conceptualise groundwater systems in complex geological settings. Multivariate statistical techniques (e.g. Principal Component Analysis and Hierarchical Cluster Analysis) were applied to groundwater chemistry data to identify hydrochemical facies which are characteristic of distinct evolutionary pathways and a common hydrologic history of groundwaters. Principal Component Analysis of the hydrochemical data demonstrated that natural water-rock interactions, redox potential and human agricultural impact are the key controls of groundwater quality in the Wairau Plain. Hierarchical Cluster Analysis revealed distinct hydrochemical water quality groups in the Wairau Plain groundwater system. Visualisation of the results of the multivariate statistical analyses and of the distribution of groundwater nitrate concentrations in the context of aquifer lithology highlighted the link between groundwater chemistry and the lithology of the host aquifers. The methodology followed in this study can be applied in a variety of hydrogeological settings to synthesise geological, hydrogeological and hydrochemical data and present them in a format readily understood by a wide range of stakeholders. This enables more efficient communication of the results of scientific studies to the wider community.
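The PCA-plus-HCA workflow described above can be sketched in a few lines; the analyte names and the two built-in "facies" in the synthetic table below are illustrative assumptions, not the Wairau Plain data:

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(1)

# Hypothetical groundwater chemistry table: samples x analytes (say
# Na, Ca, Mg, Cl, NO3); values are synthetic, with two built-in facies.
X = np.vstack([
    rng.normal([10, 40, 8, 15, 1], 2.0, size=(20, 5)),   # "reduced" facies
    rng.normal([30, 20, 5, 40, 12], 2.0, size=(20, 5)),  # "impacted" facies
])

# Standardise, then PCA via SVD of the centred data.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
scores = U * s                       # principal component scores
explained = s**2 / np.sum(s**2)      # variance explained per component

# Hierarchical Cluster Analysis (Ward linkage) on the standardised data.
labels = fcluster(linkage(Z, method="ward"), t=2, criterion="maxclust")
print(explained[0], len(set(labels)))
```

With chemically distinct groups like these, the first principal component aligns with the facies contrast and the cluster labels recover the two water-quality groups, which is the pattern the study then maps back onto aquifer lithology.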

Relevance: 20.00%

Publisher:

Abstract:

With the increasing rate of shipping traffic, the risk of collisions in busy and congested port waters is likely to rise. However, due to low collision frequencies in port waters, it is difficult to analyze such risk in a statistically sound manner. A convenient approach to investigating navigational collision risk is the application of traffic conflict techniques, which have the potential to overcome the difficulty of obtaining statistical soundness. This study aims at examining port water conflicts in order to understand the characteristics of collision risk with regard to the vessels involved, conflict locations, and traffic and kinematic conditions. A hierarchical binomial logit model, which considers the potential correlations between observation units (i.e., vessels) involved in the same conflicts, is employed to evaluate the association of explanatory variables with conflict severity levels. Results show a higher likelihood of serious conflicts for vessels of small gross tonnage or small overall length. The probability of serious conflict also increases at locations where vessels have more varied headings, such as traffic intersections and anchorages, and becomes more critical at night. Findings from this research should assist both navigators operating in port waters and port authorities overseeing navigational management.
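The modelling setup can be sketched as follows. The data are simulated, and the covariate, effect sizes and two-vessels-per-conflict structure are illustrative assumptions; only the fixed effects are actually fitted here, whereas a full hierarchical binomial logit would also integrate over the conflict-level random intercepts:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)

# Synthetic conflict data: one row per vessel involved in a conflict.
# Vessels in the same conflict share a conflict-level random intercept,
# which is what the hierarchical structure accounts for.
n_conflicts = 400
u = np.repeat(rng.normal(0.0, 0.5, n_conflicts), 2)   # shared random effects
length = rng.normal(0.0, 1.0, 2 * n_conflicts)        # standardised vessel length
eta_true = 0.5 - 1.0 * length + u                     # smaller vessels -> more serious
y = (rng.random(2 * n_conflicts) < 1.0 / (1.0 + np.exp(-eta_true))).astype(float)

# Fixed-effects binomial logit fitted by maximum likelihood.
X = np.column_stack([np.ones_like(length), length])

def nll(beta):
    eta = X @ beta
    return np.sum(np.logaddexp(0.0, eta) - y * eta)  # stable -log L

beta_hat = minimize(nll, np.zeros(2), method="BFGS").x
print(beta_hat[1] < 0)  # negative length effect: smaller vessels at higher risk
```

Ignoring the shared intercepts mildly attenuates the coefficients, which is precisely why the hierarchical formulation is preferred when several vessels belong to the same conflict.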

Relevance: 20.00%

Publisher:

Abstract:

The discovery of protein variation is an important strategy in disease diagnosis within the biological sciences. The current benchmark for elucidating information from multiple biological variables is the so-called "omics" disciplines of the biological sciences. Such variability is uncovered by the implementation of multivariable data-mining techniques, which fall into two primary categories: machine learning strategies and statistically based approaches. Typically, proteomic studies can produce hundreds or thousands of variables, p, per observation, n, depending on the analytical platform or method employed to generate the data. Many classification methods are limited by an n ≪ p constraint and, as such, require pre-treatment to reduce the dimensionality prior to classification. Recently, machine learning techniques have gained popularity in the field for their ability to successfully classify unknown samples. One limitation of such methods is the lack of a functional model allowing meaningful interpretation of results in terms of the features used for classification. This problem might be solved using a statistical model-based approach in which not only is the importance of each individual protein explicit, but the proteins are combined into a readily interpretable classification rule without relying on a black-box approach. Here we apply the statistical dimension reduction techniques Partial Least Squares (PLS) and Principal Components Analysis (PCA), followed by both statistical and machine learning classification methods, and compare them to a popular machine learning technique, Support Vector Machines (SVM). Both PLS and SVM demonstrate strong utility for proteomic classification problems.
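The reduce-then-classify idea for n ≪ p data can be sketched as below. The synthetic matrix, the choice of five components, and the nearest-centroid rule are all illustrative assumptions, simple stand-ins for the PLS/PCA pipelines and classifiers compared in the study:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical proteomic matrix: n observations, p >> n variables
# (e.g. peak intensities); all values synthetic.
n, p = 60, 500
y = np.repeat([0, 1], n // 2)
X = rng.normal(size=(n, p))
X[y == 1, :40] += 2.5            # 40 informative "proteins" out of 500

# Even/odd split; centre using training statistics only.
train = np.arange(n) % 2 == 0
mu = X[train].mean(axis=0)

# PCA via SVD of the centred training data: with n << p at most
# n_train - 1 components carry variance, so keep a handful.
_, _, Vt = np.linalg.svd(X[train] - mu, full_matrices=False)
W = Vt[:5].T                                  # p x 5 projection
T_train = (X[train] - mu) @ W
T_test = (X[~train] - mu) @ W

# Nearest-centroid classifier in the reduced space.
c0 = T_train[y[train] == 0].mean(axis=0)
c1 = T_train[y[train] == 1].mean(axis=0)
pred = (np.linalg.norm(T_test - c1, axis=1)
        < np.linalg.norm(T_test - c0, axis=1)).astype(int)
acc = float(np.mean(pred == y[~train]))
print(acc)
```

Note the fit-on-training-only discipline: the projection W is learned without the test rows, which is essential for honest accuracy estimates in n ≪ p settings.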

Relevance: 20.00%

Publisher:

Abstract:

The LiteSteel beam (LSB) is a new cold-formed steel hollow flange channel section produced using a patented manufacturing process involving simultaneous cold-forming and dual electric resistance welding. LSBs are commonly used as floor joists and bearers with web openings in buildings. Their shear strengths are considerably reduced when web openings are included for the purpose of locating building services: shear tests of LSBs with web openings have shown reductions of up to 60% in shear capacity. Hence there is a need to improve the shear capacity of LSBs with web openings. A cost-effective way to eliminate the shear capacity reduction is to stiffen the web openings using suitable stiffeners. Numerical studies were therefore undertaken to investigate the shear capacity of LSBs with stiffened web openings. In this research, finite element models of LSBs with stiffened web openings in shear were developed to simulate their shear behaviour and strength. Various stiffening methods using plate and LSB stiffeners attached to the LSBs by both welding and screw-fastening were investigated. The developed models were validated by comparing their results with experimental results and then used in further studies. Both finite element and experimental results showed that the stiffening arrangements recommended by past research for cold-formed steel channel beams are not adequate to restore the shear strengths of LSBs with web openings. Therefore new stiffener arrangements were proposed for LSBs with web openings. This paper presents the details and results of these numerical studies.

Relevance: 20.00%

Publisher:

Abstract:

This paper presents an experimental investigation of the flexural bond strength of thin bed concrete masonry. The flexural bond strength of masonry depends upon the mortar type, the technique used to disperse the mortar and the surface texture (roughness) of the concrete blocks. There is an abundance of literature on the conventional masonry bond containing 10 mm thick mortar; however, the 2 mm polymer glue mortar bond is not yet well researched. This paper reports a study examining the effect of mortar composition, dispersion method and unit surface texture on the flexural bond strength of thin bed concrete masonry. Three types of polymer modified glue mortars, three surface textures and four techniques of mortar dispersion were used in preparing 108 four-point flexural test specimens. All mortar joints were carefully prepared to achieve an average polymer mortar layer thickness of 2 mm. The results show that the flexural bond strength of thin bed concrete masonry is much higher than that of conventional masonry; moreover, the unit surface texture and the mortar dispersion method are found to have a significant influence on the flexural bond strength.
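Since the specimens are tested in four-point flexure, the reported bond strengths follow from the standard flexure formula. A minimal sketch, assuming third-point loading and a rectangular cross-section; the dimensions and load below are purely illustrative, not the study's specimens:

```python
# Flexural strength from a four-point (third-point loading) bend test,
# using the standard flexure formula sigma = M*c/I for a rectangular
# section:
#   M = (P/2) * (L/3),  c = d/2,  I = b*d**3/12  ->  sigma = P*L/(b*d**2)

def flexural_strength(P_total_N, span_mm, width_mm, depth_mm):
    """Peak bending stress (MPa) at failure for third-point loading."""
    return P_total_N * span_mm / (width_mm * depth_mm ** 2)

# Example: 2 kN total failure load on a 600 mm span, 100 x 100 mm section.
sigma = flexural_strength(2000.0, 600.0, 100.0, 100.0)
print(round(sigma, 2), "MPa")  # 1.2 MPa
```

With consistent N and mm inputs the result comes out directly in MPa, which is the convenient unit for comparing thin bed and conventional bond strengths.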

Relevance: 20.00%

Publisher:

Abstract:

Effective, statistically robust sampling and surveillance strategies form an integral component of large agricultural industries such as the grains industry. Intensive in-storage sampling is essential for pest detection, Integrated Pest Management (IPM), determining grain quality and satisfying importing nations' biosecurity concerns, while surveillance over broad geographic regions ensures that biosecurity risks can be excluded, monitored, eradicated or contained within an area. In the grains industry, a number of qualitative and quantitative methodologies for surveillance and in-storage sampling have been considered. Primarily, research has focussed on developing statistical methodologies for in-storage sampling strategies, concentrating on the detection of pest insects within a grain bulk; however, the need for effective and statistically defensible surveillance strategies has also been recognised. Interestingly, although surveillance and in-storage sampling have typically been considered independently, many techniques and concepts are common to the two fields of research. This review considers the development of statistically based in-storage sampling and surveillance strategies and identifies methods that may be useful for both. We discuss the utility of new quantitative and qualitative approaches, such as Bayesian statistics, fault trees and more traditional probabilistic methods, and show how these methods may be used in both surveillance and in-storage sampling systems.
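One of the more traditional probabilistic methods mentioned above is the binomial detection calculation relating sample size to the probability of finding at least one infested unit. A sketch, assuming independent samples and a fixed prevalence (an illustration of the general idea, not the review's specific methodology):

```python
import math

def detection_probability(n_samples, prevalence):
    """P(detect at least one infested sample) under independent draws."""
    return 1.0 - (1.0 - prevalence) ** n_samples

def samples_for_confidence(confidence, prevalence):
    """Smallest n whose detection probability reaches the target confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - prevalence))

# Example: detect a 1% infestation prevalence with 95% confidence.
n = samples_for_confidence(0.95, 0.01)
print(n)                                   # 299 samples
print(detection_probability(n, 0.01))      # >= 0.95
```

The same calculation underpins both in-storage sampling plans (samples from a grain bulk) and regional surveillance designs (inspection sites across an area), which illustrates why the two fields share so many concepts.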

Relevance: 20.00%

Publisher:

Abstract:

Quality oriented management systems and methods have become the dominant business and governance paradigm. From this perspective, satisfying customers' expectations by supplying reliable, good quality products and services is the key factor for an organization and even a government. During recent decades, Statistical Quality Control (SQC) methods have been developed as the technical core of quality management and the continuous improvement philosophy, and are now applied widely to improve the quality of products and services in the industrial and business sectors. Recently, SQC tools, in particular quality control charts, have been used in healthcare surveillance. In some cases these tools have been modified and developed to better suit the characteristics and needs of the health sector. It seems that some of the work in the healthcare area has evolved independently of the development of industrial statistical process control methods. Therefore, analysing and comparing the paradigms and characteristics of quality control charts and techniques across the different sectors presents opportunities for transferring knowledge and for future development in each sector. Meanwhile, the capabilities of the Bayesian approach, particularly Bayesian hierarchical models and computational techniques in which all uncertainty is expressed as a structure of probability, facilitate decision making and cost-effectiveness analyses. Therefore, this research investigates the use of the quality improvement cycle in a health setting using clinical data from a hospital. The need for clinical data for monitoring purposes is investigated in two respects. A framework and appropriate tools from the industrial context are proposed and applied to evaluate and improve data quality in available datasets and data flow; then a data capturing algorithm using Bayesian decision making methods is developed to determine an economical sample size for statistical analyses within the quality improvement cycle.
Having ensured clinical data quality, some characteristics of control charts in the health context, including the necessity of monitoring attribute data and correlated quality characteristics, are considered. To this end, multivariate control charts from the industrial context are adapted to monitor radiation delivered to patients undergoing diagnostic coronary angiograms, and various risk-adjusted control charts are constructed and investigated for monitoring binary outcomes of clinical interventions as well as post-intervention survival time. Meanwhile, the adoption of a Bayesian approach is proposed as a new framework for estimating the change point following a control chart's signal. This estimate aims to facilitate root-cause analysis within the quality improvement cycle, since it narrows the search for the potential causes of detected changes to a tighter time-frame prior to the signal. This approach yields highly informative estimates of change point parameters, since the results are obtained as probability distributions. Using Bayesian hierarchical models and Markov chain Monte Carlo computational methods, Bayesian estimators of the time and magnitude of various change scenarios, including step changes, linear trends and multiple changes in a Poisson process, are developed and investigated. The benefits of change point investigation are revisited and promoted in monitoring hospital outcomes, where the developed Bayesian estimator reports the true time of shifts, compared with a priori known causes, detected by control charts monitoring the rate of excess usage of blood products and major adverse events during and after cardiac surgery in a local hospital. The development of the Bayesian change point estimators is then extended to healthcare surveillance of processes in which pre-intervention characteristics of patients affect the outcomes.
In this setting, the Bayesian estimator is first extended to capture the patient mix (covariates) through the risk models underlying risk-adjusted control charts. Variations of the estimator are developed to estimate the true time of step changes and linear trends in the odds ratio of intensive care unit outcomes in a local hospital. Secondly, the Bayesian estimator is extended to identify the time of a shift in mean survival time after a clinical intervention monitored by risk-adjusted survival time control charts. In this context the survival time after a clinical intervention is also affected by the patient mix, and the survival function is constructed using a survival prediction model. The simulation studies undertaken in each research component, and the results obtained, strongly recommend the developed Bayesian estimators as an alternative for change point estimation within the quality improvement cycle in healthcare surveillance as well as in industrial and business contexts. The superiority of the proposed Bayesian framework and estimators is enhanced when the probability quantification, flexibility and generalizability of the developed models are also considered. The advantages of the Bayesian approach seen in the general context of quality control may also be extended to the industrial and business domains where quality monitoring was initially developed.
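The change-point idea can be illustrated with a deliberately simplified version of the Poisson step-change case: the before and after rates are treated as known and the prior on the change time is uniform, so the posterior reduces to a normalised likelihood grid. The estimators developed in the thesis instead use hierarchical models fitted by MCMC; this is only a sketch of the underlying logic:

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulated monitored counts (e.g. adverse events per period) with a
# step change in the Poisson rate at tau_true. All values illustrative.
T, tau_true, lam0, lam1 = 100, 60, 4.0, 8.0
counts = np.concatenate([rng.poisson(lam0, tau_true),
                         rng.poisson(lam1, T - tau_true)])

taus = np.arange(1, T)                 # candidate change times
loglik = np.array([
    counts[:t].sum() * np.log(lam0) - lam0 * t
    + counts[t:].sum() * np.log(lam1) - lam1 * (T - t)
    for t in taus
])                                     # log-factorial terms are constant in t
posterior = np.exp(loglik - loglik.max())
posterior /= posterior.sum()           # uniform prior -> normalised likelihood

tau_hat = int(taus[np.argmax(posterior)])
print(tau_hat)  # should lie close to the true change time of 60
```

The full posterior distribution over the change time, not just its mode, is what makes the Bayesian estimate "highly informative": it quantifies how tightly the root-cause search window can be drawn before the control chart's signal.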

Relevance: 20.00%

Publisher:

Abstract:

The addition of lime to soils has been widely used to stabilize expansive sub-grade soils when road pavements are constructed on them. It is common practice to apply half of the required lime amount and allow a certain time period for the lime to react with the soil (the amelioration period) before applying the rest of the lime and compacting the sub-grade. Choosing the optimum amelioration period is essential to minimize construction delay and to achieve higher strength. In this study, two different expansive soils procured from two different locations in the state of Queensland, Australia, were first mixed with different lime contents. Soil mixed with a particular lime content was compacted after different amelioration periods (e.g., 0, 6, 12, 18 and 24 hours) to obtain soil samples for measuring the Unconfined Compressive Strength (UCS). The results suggested that, for a given amelioration period, UCS increased with increasing lime content. The optimum amelioration period was found to lie within 14–17 hours for most of the lime contents in the tested soils. This suggests that the current 24–48 hour amelioration period specified by the Queensland Department of Transport and Main Roads could be reduced.

Relevance: 20.00%

Publisher:

Abstract:

The aim of this study was to determine whether athletes with a history of hamstring strain injury display lower levels of surface EMG (sEMG) activity and median power frequency in the previously injured hamstring muscles during maximal voluntary contractions. Recreational athletes were recruited, 13 with a history of unilateral hamstring strain injury and 15 without prior injury. All athletes undertook isokinetic dynamometry testing of the knee flexors and sEMG assessment of the biceps femoris long head (BF) and medial hamstrings (MH) during concentric and eccentric contractions at ±180 and ±60°·s⁻¹. The knee flexors on the previously injured limb were weaker at all contraction speeds compared with the uninjured limb (+180°·s⁻¹, p = 0.0036; +60°·s⁻¹, p = 0.0013; −60°·s⁻¹, p = 0.0007; −180°·s⁻¹, p = 0.0007), whilst sEMG activity was only lower in the BF during eccentric contractions (−60°·s⁻¹, p = 0.0025; −180°·s⁻¹, p = 0.0003). There were no between-limb differences in MH sEMG activity or in median power frequency from either BF or MH in the injured group. The uninjured group showed no between-limb differences in any of the tested variables. Secondary analysis comparing the between-limb differences in the injured and uninjured groups confirmed that previously injured hamstrings were mostly weaker (+180°·s⁻¹, p = 0.2208; +60°·s⁻¹, p = 0.0379; −60°·s⁻¹, p = 0.0312; −180°·s⁻¹, p = 0.0110) and that deficits in sEMG were confined to the BF during eccentric contractions (−60°·s⁻¹, p = 0.0542; −180°·s⁻¹, p = 0.0473). Previously injured hamstrings were weaker, and BF sEMG activity was lower, than in the contralateral uninjured hamstring. This has implications for hamstring strain injury prevention and rehabilitation, which should consider altered neural function following hamstring strain injury.