555 results for statistical distribution


Relevance:

20.00%

Publisher:

Abstract:

Urbanisation significantly changes the characteristics of a catchment as natural areas are transformed into impervious surfaces such as roads, roofs and parking lots. The increased fraction of impervious surfaces changes the stormwater runoff characteristics, whilst a variety of anthropogenic activities common to urban areas generate a range of pollutants such as nutrients, solids and organic matter. These pollutants accumulate on catchment surfaces and are removed and transported by stormwater runoff, thereby contributing pollutant loads to receiving waters. In summary, urbanisation influences the stormwater characteristics of a catchment, including hydrology and water quality. Due to the growing recognition that stormwater pollution is a significant environmental problem, the implementation of mitigation strategies to improve the quality of stormwater runoff is becoming increasingly common in urban areas. A scientifically robust stormwater quality treatment strategy is an essential requirement for effective urban stormwater management. The efficient design of treatment systems depends closely on the state of knowledge of the primary factors influencing stormwater quality. In this regard, stormwater modelling outcomes provide designers with important guidance and datasets which significantly underpin the design of effective stormwater treatment systems. Therefore, the accuracy of modelling approaches and the reliability of modelling outcomes are of particular concern. This book discusses the inherent complexity and key characteristics of urban hydrology and stormwater quality, based on the influence exerted by a range of rainfall and catchment characteristics. A comprehensive field sampling and testing programme on pollutant build-up, an urban catchment monitoring programme on stormwater quality, and the outcomes of advanced statistical analyses provided the platform for the knowledge creation. Two case studies and two real-world applications illustrate the translation of this knowledge into practice regarding the role of rainfall and catchment characteristics in urban stormwater quality. An innovative rainfall classification based on stormwater quality was developed to support the effective and scientifically robust design of stormwater treatment systems. Underpinned by the rainfall classification methodology, a reliable approach for design rainfall selection is proposed in order to optimise stormwater treatment based on both stormwater quality and quantity. This is a paradigm shift from the common approach in which stormwater treatment systems are designed based solely on stormwater quantity data. Additionally, how pollutant build-up and stormwater runoff quality vary with a range of catchment characteristics was also investigated. Based on the study outcomes, it can be concluded that the use of only a limited number of catchment parameters such as land use and impervious surface percentage, as is the case in current modelling approaches, could result in appreciable error in water quality estimation. Influential catchment characteristics to be incorporated into modelling should also include urban form and the distribution of impervious surface area.
The knowledge created through the research investigations discussed in this monograph is expected to make a significant contribution to engineering practice, including hydrologic and stormwater quality modelling, stormwater treatment design and urban planning, as the study outcomes provide practical approaches and recommendations for urban stormwater quality enhancement. Furthermore, this monograph demonstrates how fundamental knowledge of stormwater quality processes can be translated into guidance for engineering practice, illustrates the comprehensive application of multivariate data analysis techniques, and presents a paradigm for the integrative use of computer models and mathematical models to derive practical outcomes.

Relevance:

20.00%

Publisher:

Abstract:

Integration of small-scale electricity generators, known as distributed generation (DG), into distribution networks has become increasingly popular. This tendency, together with the falling price of synchronous-type generators, gives DG a better chance of participating in the voltage regulation process alongside other devices already available in the system. The voltage control issue is a very challenging problem for distribution engineers, since existing control coordination schemes need to be reconsidered to take DG operation into account. In this paper, we propose a control coordination technique which is able to utilize the ability of DG as a voltage regulator and, at the same time, minimize interaction with other active devices, such as an on-load tap-changing transformer and a voltage regulator. The technique has been developed based on the concepts of control zones, line drop compensation, dead bands, and the choice of controller parameters. Simulations carried out on an Australian system show that the technique is suitable and flexible for any system with multiple regulating devices, including DG.
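
For illustration, here is a minimal sketch of two of the building blocks named above, line drop compensation and a dead band, not the authors' coordination scheme itself; all settings (R_SET, X_SET, V_REF, DEADBAND, TAP_STEP) are hypothetical values chosen only to make the example run:

```python
# Hypothetical line drop compensation (LDC) settings; values are illustrative only.
R_SET, X_SET = 0.6, 1.2          # compensator impedance (ohms, referred to the regulator)
V_REF = 1.0                      # target voltage at the regulation point (per unit)
DEADBAND = 0.0125                # half-width of the dead band (per unit)
TAP_STEP = 0.00625               # per-unit voltage change per tap step

def ldc_tap_decision(v_local: complex, i_line: complex, tap: int) -> int:
    """Estimate the voltage at a remote regulation point via line drop
    compensation, and return a new tap position only if the estimate
    leaves the dead band; otherwise keep the current tap."""
    v_remote = v_local - i_line * complex(R_SET, X_SET)   # compensated voltage estimate
    error = V_REF - abs(v_remote)
    if abs(error) <= DEADBAND:                            # inside dead band: no action
        return tap
    steps = round(error / TAP_STEP)                       # steps needed to re-enter the band
    return tap + steps

# Example: heavy load depresses the compensated voltage, triggering a tap raise.
print(ldc_tap_decision(v_local=1.0 + 0j, i_line=0.02 - 0.01j, tap=0))
```

The dead band is what keeps multiple regulating devices (including a DG unit acting as a voltage regulator) from reacting to the same small disturbance at once; the coordination problem the paper addresses is how to choose these settings jointly.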

Relevance:

20.00%

Publisher:

Abstract:

This project is a step towards developing effective methods to mitigate voltage unbalance in urban residential networks. The proposed method reduces energy losses and improves quality of service in strongly unbalanced low-voltage networks. It is based on phase swapping together with optimal placement and sizing of a Distribution Static Synchronous Compensator (D-STATCOM) using Particle Swarm Optimisation.
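
As a rough illustration of the optimisation machinery, here is a generic particle swarm sketch; the `network_loss` objective is a stand-in for the load-flow-based loss and unbalance evaluation such a project would actually use, and in a real network the placement variable would be a discrete bus index rather than a continuous value:

```python
import numpy as np

rng = np.random.default_rng(0)

def network_loss(x):
    """Stand-in objective: a real study would run a load flow and return
    energy losses plus an unbalance penalty for a D-STATCOM placed at
    bus x[0] with rating x[1]. A smooth toy function is used here."""
    bus, size = x
    return (bus - 7.3) ** 2 + 0.5 * (size - 2.1) ** 2

# Minimal particle swarm optimisation over (bus position, D-STATCOM size).
n_particles, n_iter = 30, 100
lo, hi = np.array([0.0, 0.0]), np.array([20.0, 5.0])      # search bounds
x = rng.uniform(lo, hi, size=(n_particles, 2))            # particle positions
v = np.zeros_like(x)                                      # particle velocities
pbest = x.copy()
pbest_f = np.apply_along_axis(network_loss, 1, x)
gbest = pbest[pbest_f.argmin()].copy()

w, c1, c2 = 0.7, 1.5, 1.5                                 # inertia, cognitive, social
for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, 2))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = np.clip(x + v, lo, hi)
    f = np.apply_along_axis(network_loss, 1, x)
    better = f < pbest_f
    pbest[better], pbest_f[better] = x[better], f[better]
    gbest = pbest[pbest_f.argmin()].copy()

print("best (bus, size):", gbest.round(2))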

Relevance:

20.00%

Publisher:

Abstract:

Background: The capacity to diagnose, quantify and evaluate movement beyond the general confines of a clinical environment, under effectiveness conditions, may alleviate the rampant strain on limited, expensive and highly specialised medical resources. The iPhone 4® contains a three-dimensional accelerometer subsystem supported by highly robust software applications. The present study aimed to evaluate the reliability and concurrent criterion-related validity of accelerations measured with an iPhone 4® during an Extended Timed Get Up and Go test, a clinical test in which the patient gets up from a chair, walks ten meters, turns, and returns to the chair. Methods: A repeated-measures, cross-sectional, analytical study. We assessed the test-retest reliability of the kinematic measurements of the iPhone 4® compared with a standard validated laboratory device. We calculated the Coefficient of Multiple Correlation between the acceleration signals of the two sensors for each subject, in each sub-phase, in each of the three Extended Timed Get Up and Go test trials. To investigate statistical agreement between the two sensors we used the Bland-Altman method. Results: The Coefficients of Multiple Correlation of the five subjects across their triplicate trials were as follows: in the Sit to Stand sub-phase, r = 0.991 to 0.842; in Gait Go, r = 0.967 to 0.852; in Turn, r = 0.979 to 0.798; in Gait Come, r = 0.964 to 0.887; and in Turn to Stand to Sit, r = 0.992 to 0.877. All correlations between the sensors were significant (p < 0.001). The Bland-Altman plots showed a solid tendency to stay close to zero, especially on the y- and x-axes, during the five phases of the Extended Timed Get Up and Go test. Conclusions: The inertial sensor in the iPhone 4® is sufficiently reliable and accurate to evaluate and identify kinematic patterns in an Extended Timed Get Up and Go test. While the analysis and interpretation of 3D kinematic data remain dauntingly complex, the iPhone 4® makes the task of acquiring the data relatively inexpensive and easy.
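
The Bland-Altman agreement analysis mentioned above reduces to a simple computation of bias and limits of agreement; a minimal sketch on synthetic paired data (not the study's recordings):

```python
import numpy as np

def bland_altman(a: np.ndarray, b: np.ndarray):
    """Return the bias and 95% limits of agreement between two paired
    measurement series (e.g. phone sensor vs. laboratory sensor).
    In the usual plot, (a+b)/2 goes on the x-axis and a-b on the y-axis."""
    diff = a - b
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)
    return bias, bias - loa, bias + loa

# Illustrative paired acceleration samples (synthetic, not study data).
rng = np.random.default_rng(1)
lab = rng.normal(0.0, 1.0, 500)
phone = lab + rng.normal(0.02, 0.05, 500)   # small bias plus measurement noise

bias, lower, upper = bland_altman(phone, lab)
print(f"bias={bias:.3f}, limits of agreement=({lower:.3f}, {upper:.3f})")
```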

Relevance:

20.00%

Publisher:

Abstract:

Long-term exposure to organic pollutants, both inside and outside school buildings, may affect children's health and influence their learning performance. Since children spend a significant amount of time in school, air quality, especially in classrooms, plays a key role in determining the health risks associated with exposure at school. Within this context, the present study investigated the ambient concentrations of volatile organic compounds (VOCs) in 25 primary schools in Brisbane, with the aims of quantifying indoor and outdoor VOC concentrations, identifying VOC sources and their contributions, and, based on these, proposing mitigation measures to reduce VOC exposure in schools. One of the most important findings is the occurrence of indoor sources, indicated by an I/O ratio > 1 in 19 schools. Principal Component Analysis with Varimax rotation was used to identify common sources of VOCs, and source contributions were calculated using the Absolute Principal Component Scores technique. The results showed that, outdoors, petrol vehicle exhaust contributed 47% of VOCs, whereas indoors cleaning products had the highest overall contribution at 41%, followed by air fresheners and arts and crafts activities. These findings point to the need for a range of basic precautions during the selection, use and storage of cleaning products and materials to reduce the risk from these sources.
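
A sketch of the two screening steps named here, the I/O ratio and PCA with Varimax rotation, on synthetic concentration data; the Absolute Principal Component Scores step is omitted, and the varimax routine below is the textbook algorithm rather than the study's exact implementation:

```python
import numpy as np

def varimax(loadings, gamma=1.0, n_iter=100):
    """Classic varimax rotation of a loading matrix (columns = components)."""
    p, k = loadings.shape
    R = np.eye(k)
    for _ in range(n_iter):
        L = loadings @ R
        u, _, vt = np.linalg.svd(
            loadings.T @ (L**3 - (gamma / p) * L @ np.diag((L**2).sum(axis=0))))
        R = u @ vt
    return loadings @ R

# Synthetic VOC concentration matrices: rows = samples, columns = compounds.
rng = np.random.default_rng(2)
indoor = rng.lognormal(0.0, 0.5, size=(100, 6))
outdoor = rng.lognormal(-0.2, 0.5, size=(100, 6))

# Step 1: indoor/outdoor ratio -- values > 1 suggest indoor sources.
io_ratio = indoor.mean(axis=0) / outdoor.mean(axis=0)
print("compounds with likely indoor sources:", np.where(io_ratio > 1)[0])

# Step 2: PCA on standardised indoor data, then varimax-rotate the loadings
# so each retained component loads strongly on few compounds (a 'source').
z = (indoor - indoor.mean(axis=0)) / indoor.std(axis=0)
eigval, eigvec = np.linalg.eigh(np.cov(z, rowvar=False))
order = eigval.argsort()[::-1][:2]                    # keep two components
loadings = eigvec[:, order] * np.sqrt(eigval[order])
print("rotated loadings:\n", varimax(loadings).round(2))
```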

Relevance:

20.00%

Publisher:

Abstract:

With the extensive use of rating systems on the web, and their significance in users' decision-making processes, the need for more accurate aggregation methods has emerged. The naïve aggregation method, which uses the simple mean, is no longer adequate for providing accurate reputation scores for items [6]; hence, several studies have been conducted to provide more accurate alternative aggregation methods. Most current reputation models do not consider the distribution of ratings across the different possible rating values. In this paper, we propose a novel reputation model which generates more accurate reputation scores for items by deploying the normal distribution over ratings. Experiments show promising results for our proposed model over state-of-the-art ones on both sparse and dense datasets.
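
The paper's exact model is not reproduced here, but a minimal sketch of the general idea, weighting each rating by a normal density fitted over the ratings so that outliers count less than the consensus, might look like this:

```python
import math

def naive_reputation(ratings):
    """Baseline: the simple mean of the ratings."""
    return sum(ratings) / len(ratings)

def normal_weighted_reputation(ratings):
    """Illustrative normal-distribution aggregation: weight each rating by
    the normal density at that value, so ratings near the centre of the
    distribution count more and outliers count less. This is a sketch of
    the general idea, not the exact model proposed in the paper."""
    mu = naive_reputation(ratings)
    sigma = max((sum((r - mu) ** 2 for r in ratings) / len(ratings)) ** 0.5, 1e-9)
    weights = [math.exp(-0.5 * ((r - mu) / sigma) ** 2) for r in ratings]
    return sum(w * r for w, r in zip(weights, ratings)) / sum(weights)

ratings = [5, 5, 4, 5, 1, 5, 4]                # one outlier drags the mean down
print(naive_reputation(ratings))               # about 4.14
print(normal_weighted_reputation(ratings))     # about 4.58, closer to the 4-5 consensus
```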

Relevance:

20.00%

Publisher:

Abstract:

In an estuary, mixing and dispersion result from a combination of large-scale advection and small-scale turbulence, which are complex to estimate. Predictions of scalar transport and mixing are often inferred and rarely accurate, due to inadequate understanding of the contributions of these different scales to estuarine recirculation. A multi-device field study was conducted in a small sub-tropical estuary under neap tide conditions, with near-zero freshwater discharge, for about 48 hours. During the study, acoustic Doppler velocimeters (ADVs) sampled at high frequency (50 Hz), while an acoustic Doppler current profiler (ADCP) and global positioning system (GPS) tracked drifters were used to obtain lower-frequency spatial distributions of the flow parameters within the estuary. The velocity measurements were complemented by continuous measurements of water depth, conductivity, temperature and other physicochemical parameters. Thorough quality control was carried out by applying relevant error-removal filters to each data set to intercept spurious data. A triple decomposition (TD) technique was introduced to assess the contributions of tides, resonance and 'true' turbulence to the flow field. The time series of mean flow measurements from both the ADCP and the drifters were consistent with the mean ADV data when sampled within a similar spatial domain. The tidal-scale fluctuations of velocity and water level were used to examine the response of the estuary to tidal inertial currents. The channel exhibited a mixed-type wave with a typical phase lag between 0.035π and 0.116π. A striking feature of the ADV velocity data was the slow fluctuations, which exhibited large amplitudes of up to 50% of the tidal amplitude, particularly at slack water. Such slow fluctuations were simultaneously observed in a number of physicochemical properties of the channel. The ensuing turbulence field showed some degree of anisotropy. For all ADV units, the horizontal turbulence ratio ranged between 0.4 and 0.9 and decreased towards the bed, while the vertical turbulence ratio was on average unity at z = 0.32 m and approximately 0.5 for the upper ADV (z = 0.55 m). The statistical analysis suggested that the ebb-phase turbulence field was dominated by eddies that evolved from ejection-type processes, while that of the flood phase contained mixed eddies, a significant proportion of which were related to sweep-type processes. Over 65% of the skewness values fell within the range expected of a finite Gaussian distribution, and the bulk of the excess kurtosis values (over 70%) fell between -0.5 and +2. The TD technique described herein allowed the characterisation of a broader temporal scale of fluctuations in high-frequency data sampled over a few tidal cycles. The study characterises the ranges of fluctuation required for accurate modelling of shallow-water dispersion and mixing in a sub-tropical estuary.
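
A minimal sketch of a triple decomposition by successive low-pass filtering: the record is split into a tidal-scale component, a slow fluctuation and a turbulent residual. The fixed window lengths are illustrative assumptions; the paper's TD technique identifies its temporal scales from the data rather than fixing them in advance:

```python
import numpy as np

def moving_average(x, window):
    """Centred moving average (cumulative-sum form, O(n)) used as a
    simple low-pass filter, edge-padded back to the input length."""
    c = np.cumsum(np.insert(x, 0, 0.0))
    ma = (c[window:] - c[:-window]) / window
    pad_l = (len(x) - len(ma)) // 2
    return np.pad(ma, (pad_l, len(x) - len(ma) - pad_l), mode="edge")

def triple_decompose(u, fs, tidal_window_s=3600.0, slow_window_s=60.0):
    """Split a velocity record u (sampled at fs Hz) into tidal-scale,
    slow-fluctuation and 'true' turbulence components by successive
    averaging at two assumed time scales."""
    tidal = moving_average(u, int(tidal_window_s * fs))
    slow = moving_average(u - tidal, int(slow_window_s * fs))
    return tidal, slow, u - tidal - slow

# Synthetic 50 Hz velocity record: tide + slow oscillation + noise.
fs = 50.0
t = np.arange(0, 3 * 3600, 1 / fs)                       # three hours
u = (0.5 * np.sin(2 * np.pi * t / (12.42 * 3600))        # semi-diurnal tide
     + 0.1 * np.sin(2 * np.pi * t / 300)                 # slow fluctuation
     + 0.02 * np.random.default_rng(3).standard_normal(t.size))
tidal, slow, turb = triple_decompose(u, fs)
print(tidal.std(), slow.std(), turb.std())
```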

Relevance:

20.00%

Publisher:

Abstract:

A new transdimensional Sequential Monte Carlo (SMC) algorithm called SMCVB is proposed. In an SMC approach, a weighted sample of particles is generated from a sequence of probability distributions which 'converge' to the target distribution of interest, in this case a Bayesian posterior distribution. The approach is based on the use of variational Bayes to propose new particles at each iteration of the SMCVB algorithm in order to target the posterior more efficiently. The variational-Bayes-generated proposals are not limited to a fixed dimension, which means that the weighted particle sets that arise can have varying dimensions, allowing us to also estimate an appropriate dimension for the model. This novel algorithm is outlined within the context of finite mixture model estimation, and provides a less computationally demanding alternative to using reversible jump Markov chain Monte Carlo kernels within an SMC approach. We illustrate these ideas in a simulated data analysis and in applications.
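
As a toy illustration of the SMC machinery involved (not SMCVB itself), the sketch below tempers a particle cloud toward a fixed-dimension 1-D mixture posterior; a Gaussian fitted to the current cloud stands in for the variational Bayes proposal, and the transdimensional aspect is not shown:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

def log_post(x):
    """Toy 1-D posterior: a two-component Gaussian mixture."""
    return np.logaddexp(np.log(0.4) + stats.norm.logpdf(x, -2.0, 0.7),
                        np.log(0.6) + stats.norm.logpdf(x, 3.0, 1.0))

def log_init(x):
    """Diffuse starting distribution for the particle cloud."""
    return stats.norm.logpdf(x, 0.0, 5.0)

# Tempered SMC: move the cloud through pi_b ∝ init^(1-b) * post^b.
n = 2000
x = rng.normal(0.0, 5.0, n)
logw = np.zeros(n)
prev_b = 0.0
for b in np.linspace(0.05, 1.0, 20):
    logw += (b - prev_b) * (log_post(x) - log_init(x))   # incremental weights
    prev_b = b
    w = np.exp(logw - logw.max())
    w /= w.sum()
    if 1.0 / np.sum(w**2) < n / 2:                       # effective sample size low?
        x = rng.choice(x, size=n, p=w)                   # resample
        logw[:] = 0.0
        mu, sd = x.mean(), x.std()                       # proposal fit (VB stand-in)
        prop = rng.normal(mu, sd, n)                     # independence proposal
        def log_pi(y):                                   # current tempered target
            return (1 - b) * log_init(y) + b * log_post(y)
        log_acc = (log_pi(prop) - log_pi(x)              # Metropolis-Hastings correction
                   + stats.norm.logpdf(x, mu, sd) - stats.norm.logpdf(prop, mu, sd))
        accept = np.log(rng.random(n)) < log_acc
        x[accept] = prop[accept]

w = np.exp(logw - logw.max())
w /= w.sum()
print("posterior mean estimate:", np.sum(w * x))         # true value: 0.4*(-2) + 0.6*3 = 1.0
```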

Relevance:

20.00%

Publisher:

Abstract:

Provides an accessible foundation to Bayesian analysis using real-world models. This book presents an introduction to Bayesian modelling and computation by considering real case studies drawn from diverse fields spanning ecology, health, genetics and finance. Each chapter comprises a description of the problem, the corresponding model, the computational method, results and inferences, as well as the issues that arise in the implementation of these approaches. Case Studies in Bayesian Statistical Modelling and Analysis:
• Illustrates how to do Bayesian analysis in a clear and concise manner using real-world problems.
• Focuses each chapter on a real-world problem and describes the way in which the problem may be analysed using Bayesian methods.
• Features approaches that can be used in a wide range of application areas, such as health, the environment, genetics, information science, medicine, biology, industry and remote sensing.
Case Studies in Bayesian Statistical Modelling and Analysis is aimed at statisticians, researchers and practitioners who have some expertise in statistical modelling and analysis and some understanding of the basics of Bayesian statistics, but little experience in their application. Graduate students of statistics and biostatistics will also find this book beneficial.

Relevance:

20.00%

Publisher:

Abstract:

Electrical impedance tomography is a novel technology capable of quantifying the distribution of ventilation in the lung in real time during various therapeutic manoeuvres. The technique requires changes to the patient's position in order to place the electrical impedance tomography electrodes circumferentially around the thorax. The impact of these position changes on the time taken for the regional distribution of ventilation determined by electrical impedance tomography to stabilise is unknown. This study aimed to determine that stabilisation time. Eight healthy male volunteers were connected to electrical impedance tomography and a pneumotachometer. After 30 minutes of stabilisation in the supine position, participants were moved into a 60-degree Fowler's position and then returned to supine, spending thirty minutes in each position. Concurrent readings of ventilation distribution and tidal volumes were taken every five minutes. A mixed regression model with a random intercept was used to compare the positions and changes over time. The anterior-posterior distribution stabilised after ten minutes in Fowler's position and ten minutes after returning to supine. Left-right stabilisation was achieved after 15 minutes in both Fowler's position and supine. A minimum of 15 minutes of stabilisation should therefore be allowed when assessing ventilation distribution in spontaneously breathing individuals, allowing stabilisation in both the anterior-posterior and left-right directions.
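
A random-intercept mixed model of the kind described can be fitted in a few lines; the sketch below uses synthetic data with illustrative column names, not the study dataset:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic long-format data: 8 participants x 2 positions x 6 time points.
rng = np.random.default_rng(5)
rows = []
for subj in range(8):
    subj_offset = rng.normal(0, 2)                   # random intercept per subject
    for position in ("supine", "fowlers"):
        for minute in range(5, 35, 5):
            anterior_pct = (55 + subj_offset
                            + (3 if position == "fowlers" else 0)
                            - 0.05 * minute + rng.normal(0, 1))
            rows.append((subj, position, minute, anterior_pct))
df = pd.DataFrame(rows, columns=["subject", "position", "minute", "anterior_pct"])

# Random-intercept mixed model: position and time as fixed effects,
# subject as the grouping factor, as described in the abstract.
model = smf.mixedlm("anterior_pct ~ position + minute", df, groups=df["subject"])
print(model.fit().summary())
```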

Relevance:

20.00%

Publisher:

Abstract:

This work examined a new method of detecting small water-filled cracks in underground cable insulation ('water trees') using data from commercially available non-destructive testing equipment. A testing facility was constructed and a computer simulation of the insulation designed in order to test the proposed ageing factor: the degree of non-linearity. This was a large industry-backed project involving an ARC Linkage grant, Ergon Energy and the University of Queensland, as well as the Queensland University of Technology.
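
The project summary does not define its metric, but one common proxy for the degree of non-linearity of an electrical response is total harmonic distortion of the measured current; the sketch below rests on that assumption and is not the project's actual ageing factor:

```python
import numpy as np

def degree_of_nonlinearity(current, fs, f0=50.0, n_harmonics=10):
    """Total harmonic distortion of a response current, used here as an
    assumed proxy for waveform non-linearity: harmonic amplitudes are
    read off an FFT and compared against the fundamental."""
    spectrum = np.abs(np.fft.rfft(current * np.hanning(current.size)))
    freqs = np.fft.rfftfreq(current.size, 1 / fs)
    fund = spectrum[np.argmin(np.abs(freqs - f0))]
    harm = [spectrum[np.argmin(np.abs(freqs - k * f0))]
            for k in range(2, n_harmonics + 1)]
    return np.sqrt(np.sum(np.square(harm))) / fund

# Synthetic test: a distorted 50 Hz current scores higher than a clean one.
fs = 10_000.0
t = np.arange(0, 1.0, 1 / fs)
clean = np.sin(2 * np.pi * 50 * t)
aged = clean + 0.08 * np.sin(2 * np.pi * 150 * t)        # 3rd-harmonic distortion
print(degree_of_nonlinearity(clean, fs), degree_of_nonlinearity(aged, fs))
```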

Relevance:

20.00%

Publisher:

Abstract:

Background: Foot dorsiflexion plays an essential role in controlling both balance and human gait. Electromyography (EMG) and sonomyography (SMG) can provide information on several aspects of muscle function. The aim was to establish the relationship between EMG and SMG variables during isotonic contractions of the foot dorsiflexors. Methods: Twenty-seven healthy young adults performed the foot dorsiflexion test on a device designed ad hoc. The EMG variables were the maximum peak and the area under the curve. The muscle architecture variables were muscle thickness and pennation angle. Descriptive statistical analysis, inferential analysis and a multivariate linear regression model were carried out. The confidence level was established with a statistically significant p-value of less than 0.05. Results: The correlation between the EMG and SMG variables was r = 0.462 (p < 0.05). The linear regression model for the dependent variable "normalised peak of tibialis anterior (TA)" with the independent variables "pennation angle and thickness" was significant (p = 0.002), with an explained variance of R² = 0.693 and SEE = 0.16. Conclusions: There is a significant relationship between EMG and SMG variables during isotonic contractions of the TA muscle. Our results suggest that EMG and SMG can be feasible tools for the monitoring and assessment of the foot dorsiflexors. Parameterisation and assessment of the TA muscle are relevant for establishing how increased strength accelerates the recovery of lower limb injuries.
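
A regression of this form is straightforward to reproduce; a sketch on synthetic stand-in data (the coefficients, units and sample values are illustrative, not the study's):

```python
import numpy as np
import statsmodels.api as sm

# Synthetic stand-ins for the study variables (n = 27, as in the paper).
rng = np.random.default_rng(6)
pennation_angle = rng.normal(11, 2, 27)          # degrees (illustrative)
thickness = rng.normal(2.6, 0.3, 27)             # cm (illustrative)
peak_ta = (0.03 * pennation_angle + 0.25 * thickness
           + rng.normal(0, 0.15, 27))            # normalised EMG peak

# Linear model: peak_ta ~ pennation_angle + thickness,
# mirroring the regression described in the abstract.
X = sm.add_constant(np.column_stack([pennation_angle, thickness]))
fit = sm.OLS(peak_ta, X).fit()
print(fit.summary())                             # reports R², standard error, p-values
```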

Relevance:

20.00%

Publisher:

Abstract:

We define a new statistical fluid registration method based on Lagrangian mechanics. Although several authors have suggested that empirical statistics on brain variation should be incorporated into the registration problem, few algorithms have included this information; most instead use regularisers that simply guarantee diffeomorphic mappings. Here we combine the advantages of a large-deformation fluid matching approach with empirical statistics on population variability in anatomy. We reformulated the Riemannian fluid algorithm developed in [4], using a Lagrangian framework to incorporate 0th- and 1st-order statistics in the regularisation process. Ninety-two 2D midline corpus callosum traces from a twin MRI database were fluidly registered using the non-statistical version of the algorithm (algorithm 0), giving initial vector fields and deformation tensors. Covariance matrices were computed for both distributions and incorporated either separately (algorithms 1 and 2) or together (algorithm 3) in the registration. We computed heritability maps and two vector- and tensor-based distances to compare the power and robustness of the algorithms.
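
As a conceptual sketch of how an empirical covariance can enter a registration regulariser, the snippet below evaluates a Mahalanobis-type penalty on a displacement field; the paper's actual Lagrangian formulation, which also incorporates 1st-order (gradient) statistics, is considerably richer than this:

```python
import numpy as np

def statistical_regularization_energy(v, cov):
    """Energy of a displacement vector field v (shape: n_points x d) under
    a Mahalanobis-type penalty sum_n v_n^T inv(Σ) v_n, where Σ is an
    empirical covariance of population deformations. This sketches how
    0th-order statistics can enter a fluid registration regulariser."""
    cov_inv = np.linalg.inv(cov)
    return float(np.einsum("ni,ij,nj->", v, cov_inv, v))

# Toy example: displacements along the population's low-variance direction
# are penalised more heavily than along its high-variance direction.
rng = np.random.default_rng(7)
population_fields = rng.normal(0, 1, (50, 2)) * np.array([2.0, 0.3])
cov = np.cov(population_fields, rowvar=False)     # empirical covariance Σ
v = np.full((100, 2), 0.5)                        # candidate displacement field
print(statistical_regularization_energy(v, cov))
```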

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we use a nonconservative Lagrangian mechanics approach to formulate a new statistical algorithm for the fluid registration of 3-D brain images. This algorithm is named SAFIRA, an acronym for statistically-assisted fluid image registration algorithm. A nonstatistical version of this algorithm was implemented first, in which the deformation was regularised by penalising deviations from a zero rate of strain. In previous work, the terms regularising the deformation included the covariance of the deformation matrices Σ and of the vector fields q. Here, we used a Lagrangian framework to reformulate this algorithm, showing that the regularising terms essentially allow nonconservative work to occur during the flow. Given 3-D brain images from a group of subjects, vector fields and their corresponding deformation matrices are computed in a first round of registrations using the nonstatistical implementation. Covariance matrices for both the deformation matrices and the vector fields are then obtained and incorporated (separately or jointly) into the nonconservative terms, creating four versions of SAFIRA. We evaluated and compared the algorithms' performance on 92 3-D brain scans from healthy monozygotic and dizygotic twins; 2-D validations are also shown for corpus callosum shapes delineated at the midline in the same subjects. After preliminary tests to demonstrate each method, we compared their detection power using tensor-based morphometry (TBM), a technique for analysing local volumetric differences in brain structure, and compared the accuracy of each algorithm variant using various statistical metrics derived from the images and deformation fields. All these tests were also run with a traditional fluid method, which has been widely used in TBM studies. The versions incorporating vector-based empirical statistics on brain variation were consistently more accurate than their counterparts when used for automated volumetric quantification in new brain images, suggesting the advantages of this approach for large-scale neuroimaging studies.

Relevance:

20.00%

Publisher:

Abstract:

Robust and automatic non-rigid registration depends on many parameters that have not yet been systematically explored. Here we determined how tissue classification influences non-linear fluid registration of brain MRI. Twin data are ideal for studying this question, as volumetric correlations between corresponding brain regions that are under genetic control should be higher in monozygotic (MZ) twins, who share 100% of their genes, than in dizygotic (DZ) twins, who share half their genes on average. When these substructure volumes are quantified using tensor-based morphometry, improved registration can be defined as whichever method gives higher MZ twin correlations relative to DZ correlations, as registration errors tend to deplete these correlations. In a study of 92 subjects, higher effect sizes were found in the cumulative distribution functions derived from statistical maps when tissue classification was performed before fluid registration, versus fluidly registering the raw images. This gives empirical evidence in favour of pre-segmenting images for tensor-based morphometry.
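
Comparing cumulative distribution functions of voxelwise p-values is a standard way to rank such methods; a minimal sketch on synthetic p-value maps (the beta/uniform generators are illustrative stand-ins, not study results):

```python
import numpy as np

def pvalue_cdf(pvalues, thresholds=None):
    """Empirical cumulative distribution of voxelwise p-values. In TBM
    studies, curves lying further above the null line y = x indicate
    greater detection power, which is how registration variants are
    typically compared."""
    p = np.sort(np.ravel(pvalues))
    if thresholds is None:
        thresholds = np.linspace(0, 1, 101)
    return thresholds, np.searchsorted(p, thresholds, side="right") / p.size

# Synthetic maps: one method yields systematically smaller p-values.
rng = np.random.default_rng(8)
p_segmented = rng.beta(0.6, 1.0, 10_000)    # stand-in: tissue-classified first
p_raw = rng.uniform(0, 1, 10_000)           # stand-in: raw-image registration
for name, p in (("segmented", p_segmented), ("raw", p_raw)):
    thresholds, cdf = pvalue_cdf(p)
    print(name, "fraction of voxels with p <= 0.05:", round(cdf[5], 3))
```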