959 results for Statistical parameters
Abstract:
A prospective, consecutive series of 106 patients receiving endoscopic anterior scoliosis correction. The aim was to analyse changes in radiographic parameters and rib hump in the two years following surgery. Endoscopic anterior scoliosis correction is a level-sparing approach, so it is important to assess the amount of decompensation that occurs after surgery. All patients received a single anterior rod and vertebral body screws using a standard compression technique. Cleared disc spaces were packed with either mulched femoral head allograft or rib head/iliac crest autograft. Radiographic parameters (major, instrumented and minor Cobb angles; T5-T12 kyphosis) and rib hump were measured at 2, 6, 12 and 24 months after surgery. Paired t-tests and Wilcoxon signed-rank tests were used to assess the statistical significance of changes between adjacent time intervals. Results: Mean loss of major curve correction from 2 to 24 months after surgery was 4 degrees. Mean loss of rib hump correction was 1.4 degrees. Mean sagittal kyphosis increased from 27 degrees at 2 months to 30.6 degrees at 24 months. Patients with rod fractures or screw-related complications achieved several degrees less correction than patients without complications, but overall there was no clinically significant decompensation following complications. The study concluded that there are small changes in deformity measures after endoscopic anterior scoliosis surgery, which are statistically significant but not clinically significant.
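The comparisons between adjacent time points rest on paired tests. The minimal sketch below illustrates that step in Python with hypothetical Cobb-angle values; the study's patient data are not reproduced here.

```python
# A minimal sketch of the paired significance testing described above, using
# hypothetical Cobb-angle measurements (degrees) for illustration only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_patients = 106

# Hypothetical major Cobb angles at 2 and 24 months post-surgery.
cobb_2m = rng.normal(loc=22.0, scale=6.0, size=n_patients)
cobb_24m = cobb_2m + rng.normal(loc=4.0, scale=2.0, size=n_patients)  # ~4 deg mean loss

# Paired t-test (parametric) on the change between the two time points.
t_stat, t_p = stats.ttest_rel(cobb_24m, cobb_2m)

# Wilcoxon signed-rank test (non-parametric alternative).
w_stat, w_p = stats.wilcoxon(cobb_24m, cobb_2m)

print(f"paired t-test: t = {t_stat:.2f}, p = {t_p:.4g}")
print(f"Wilcoxon signed-rank: W = {w_stat:.1f}, p = {w_p:.4g}")
```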
Abstract:
With the advent of Service Oriented Architecture, Web Services have gained tremendous popularity. Due to the availability of a large number of Web services, finding an appropriate Web service according to the requirement of the user is a challenge. This warrants the need to establish an effective and reliable process of Web service discovery. A considerable body of research has emerged to develop methods to improve the accuracy of Web service discovery to match the best service. The process of Web service discovery results in suggesting many individual services that partially fulfil the user's interest. Considering the semantic relationships of the words used in describing the services, as well as their input and output parameters, can lead to accurate Web service discovery. Appropriate linking of individually matched services should then fully satisfy the requirements the user is looking for. This research proposes to integrate a semantic model and a data mining technique to enhance the accuracy of Web service discovery. A novel three-phase Web service discovery methodology has been proposed. The first phase performs match-making to find semantically similar Web services for a user query. In order to perform semantic analysis on the content of the Web service description language document, a support-based latent semantic kernel is constructed using an innovative concept of binning and merging on a large quantity of text documents covering diverse domains of knowledge. The use of a generic latent semantic kernel constructed with a large number of terms helps to find the hidden meaning of query terms which otherwise could not be found. Sometimes a single Web service is unable to fully satisfy the requirement of the user. In such cases, a composition of multiple inter-related Web services is presented to the user. The task of checking the possibility of linking multiple Web services is done in the second phase. Once the feasibility of linking Web services is checked, the objective is to provide the user with the best composition of Web services. In the link analysis phase, the Web services are modelled as nodes of a graph and an all-pair shortest-path algorithm is applied to find the optimum path at the minimum cost of traversal. The third phase, system integration, integrates the results from the preceding two phases using an original fusion algorithm in the fusion engine. Finally, the recommendation engine, which is an integral part of the system integration phase, makes the final recommendations, including individual and composite Web services, to the user. To evaluate the performance of the proposed method, extensive experimentation has been performed. Results of the proposed support-based semantic kernel method of Web service discovery are compared with the results of a standard keyword-based information-retrieval method and a clustering-based machine-learning method of Web service discovery. The proposed method outperforms both the information-retrieval and machine-learning based methods. Experimental results and statistical analysis also show that the best Web service compositions are obtained by considering 10 to 15 of the Web services found in phase-I for linking. Empirical results also confirm that the fusion engine boosts the accuracy of Web service discovery by combining the inputs from both the semantic analysis (phase-I) and the link analysis (phase-II) in a systematic fashion.
Overall, the accuracy of Web service discovery with the proposed method shows a significant improvement over traditional discovery methods.
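The link-analysis phase lends itself to a compact illustration. The toy sketch below, with invented service names and linking costs, models the matched services as graph nodes and applies the Floyd-Warshall all-pair shortest-path algorithm to recover the cheapest composition between two services; it is a sketch of the general idea, not the thesis's implementation.

```python
# Toy link-analysis sketch: services found in phase I become graph nodes,
# edges carry a hypothetical linking cost (e.g. based on input/output
# compatibility), and Floyd-Warshall gives the cheapest composition
# between every pair of services.
import math

services = ["S1", "S2", "S3", "S4"]          # hypothetical matched services
INF = math.inf
# cost[i][j] = cost of chaining service i's output into service j's input.
cost = [
    [0.0, 1.0, 4.0, INF],
    [INF, 0.0, 2.0, 5.0],
    [INF, INF, 0.0, 1.0],
    [INF, INF, INF, 0.0],
]

n = len(services)
dist = [row[:] for row in cost]
nxt = [[j if cost[i][j] < INF else None for j in range(n)] for i in range(n)]

# Floyd-Warshall: relax every pair of nodes through every intermediate node.
for k in range(n):
    for i in range(n):
        for j in range(n):
            if dist[i][k] + dist[k][j] < dist[i][j]:
                dist[i][j] = dist[i][k] + dist[k][j]
                nxt[i][j] = nxt[i][k]

def composition(i, j):
    """Reconstruct the cheapest chain of services from node i to node j."""
    if nxt[i][j] is None:
        return None
    path = [i]
    while i != j:
        i = nxt[i][j]
        path.append(i)
    return [services[p] for p in path]

print(dist[0][3], composition(0, 3))   # 4.0 ['S1', 'S2', 'S3', 'S4']
```

An all-pair algorithm is a natural fit here because the cost and path of every candidate composition become available at once, which suits a recommendation engine that ranks several compositions for the user.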
Abstract:
Background: The problem of silent multiple comparisons is one of the most difficult statistical problems faced by scientists. It is a particular problem for investigating a one-off cancer cluster reported to a health department because any one of hundreds, or possibly thousands, of neighbourhoods, schools, or workplaces could have reported a cluster, which could have been for any one of several types of cancer or any one of several time periods. Methods: This paper contrasts the frequentist approach with a Bayesian approach for dealing with silent multiple comparisons in the context of a one-off cluster reported to a health department. Two published cluster investigations were re-analysed using the Dunn-Sidak method to adjust frequentist p-values and confidence intervals for silent multiple comparisons. Bayesian methods were based on the Gamma distribution. Results: Bayesian analysis with non-informative priors produced results similar to the frequentist analysis, and suggested that both clusters represented a statistical excess. In the frequentist framework, the statistical significance of both clusters was extremely sensitive to the number of silent multiple comparisons, which can only ever be a subjective "guesstimate". The Bayesian approach is also subjective: whether there is an apparent statistical excess depends on the specified prior. Conclusion: In cluster investigations, the frequentist approach is just as subjective as the Bayesian approach, but the Bayesian approach is less ambitious in that it treats the analysis as a synthesis of data and personal judgements (possibly poor ones), rather than objective reality. Bayesian analysis is (arguably) a useful tool to support complicated decision-making, because it makes the uncertainty associated with silent multiple comparisons explicit.
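Both approaches can be sketched in a few lines. The example below uses made-up observed and expected counts: a one-sided Poisson p-value adjusted for k silent comparisons with the Dunn-Sidak formula, and a Gamma-Poisson posterior for the standardised incidence ratio as a stand-in for the paper's Gamma-based Bayesian analysis.

```python
# Hedged sketch of the two approaches contrasted above, with made-up numbers.
from scipy import stats

observed = 6        # hypothetical observed cancer cases in the cluster
expected = 2.0      # hypothetical expected cases from background rates

# Frequentist: one-sided Poisson p-value, P(X >= observed | expected).
p_raw = stats.poisson.sf(observed - 1, expected)

# Dunn-Sidak adjustment for k silent comparisons (k is a subjective guesstimate).
for k in (1, 100, 1000):
    p_adj = 1.0 - (1.0 - p_raw) ** k
    print(f"k={k:5d}  adjusted p = {p_adj:.3f}")

# Bayesian: a Gamma(a, b) prior on the standardised incidence ratio combined
# with a Poisson likelihood gives a Gamma(a + observed, b + expected) posterior.
a, b = 0.001, 0.001          # roughly non-informative prior (an assumption)
posterior = stats.gamma(a + observed, scale=1.0 / (b + expected))
print("P(SIR > 1 | data) =", round(posterior.sf(1.0), 3))
```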
Abstract:
Harmful Algal Blooms (HABs) are a worldwide problem that has been increasing in frequency and extent over the past several decades. HABs severely damage aquatic ecosystems by destroying benthic habitat, reducing invertebrate and fish populations and affecting larger species such as dugong that rely on seagrasses for food. Few statistical models for predicting HAB occurrences have been developed, and in common with most predictive models in ecology, those that have been developed do not fully account for uncertainties in parameters and model structure. This makes management decisions based on these predictions more risky than might be supposed. We used a probit time series model and Bayesian Model Averaging (BMA) to predict occurrences of blooms of Lyngbya majuscula, a toxic cyanophyte, in Deception Bay, Queensland, Australia. We found a suite of useful predictors for HAB occurrence, with temperature figuring prominently in the models holding the majority of posterior support; the model consisting of the single covariate average monthly minimum temperature showed by far the greatest posterior support. Two alternative model averaging strategies were compared: one using the full posterior distribution, and a simpler approach that used the majority of the posterior distribution for predictions but with vastly fewer models. Both BMA approaches showed excellent predictive performance with little difference in their predictive capacity. Applications of BMA are still rare in ecology, particularly in management settings. This study demonstrates the power of BMA as an important management tool that is capable of high predictive performance while fully accounting for both parameter and model uncertainty.
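As a rough illustration of the modelling strategy (not the paper's actual implementation), the sketch below fits probit models for every subset of a few synthetic covariates and combines them with BIC-approximated posterior model weights, a common shortcut for Bayesian Model Averaging.

```python
# Simplified BMA-over-probit-models sketch with synthetic data; BIC weights
# stand in for the posterior model probabilities used in the study.
from itertools import combinations

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 120
min_temp = rng.normal(22, 3, n)          # average monthly minimum temperature
rainfall = rng.gamma(2.0, 40.0, n)       # hypothetical covariate
light = rng.normal(200, 50, n)           # hypothetical covariate
# Synthetic bloom occurrences driven mainly by temperature.
p_bloom = 1 / (1 + np.exp(-0.6 * (min_temp - 22)))
bloom = rng.binomial(1, p_bloom)

covariates = {"min_temp": min_temp, "rainfall": rainfall, "light": light}
models, bics = [], []
for r in range(1, len(covariates) + 1):
    for names in combinations(covariates, r):
        X = sm.add_constant(np.column_stack([covariates[c] for c in names]))
        fit = sm.Probit(bloom, X).fit(disp=0)
        models.append((names, fit))
        bics.append(fit.bic)

# Approximate posterior model probabilities from BIC differences.
bics = np.array(bics)
weights = np.exp(-0.5 * (bics - bics.min()))
weights /= weights.sum()
for (names, _), w in sorted(zip(models, weights), key=lambda t: t[1], reverse=True):
    print(f"{'+'.join(names):30s} approx. posterior weight = {w:.3f}")

# BMA prediction: weight each model's fitted bloom probability.
bma_prob = sum(w * fit.predict() for (_, fit), w in zip(models, weights))
print("mean BMA-predicted bloom probability:", round(bma_prob.mean(), 3))
```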
Abstract:
This study aimed to describe wandering using new parameters and to evaluate those parameters as a function of cognitive impairment and mobility. Forty-four wanderers in long-term care settings were videotaped 12 times. Rate and duration of wandering episodes were plotted and used to derive parameters from values above and below case medians, proportion of hours spent wandering, and time of day. Participants wandered during 47% of observations; on average, the hourly rate was 4.3 episodes, the peak hourly rate was 18 episodes, and the peak hourly duration was 19.9 minutes. Mini-Mental State Examination (MMSE) scores were negatively correlated with overall duration and with the number of observations during which duration exceeded 15 minutes per hour, were positively correlated with the number of observations without wandering, and were not significantly correlated with rate-related parameters. Mobility correlated positively with rate and duration parameters. The interaction of MMSE score and mobility was the strongest predictor of wandering duration. Parameters derived from repeated measures provide a new view of daytime wandering and insight into the relationships of MMSE score and mobility status with specific parameters of wandering.
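A small synthetic-data sketch of the reported analyses, correlating MMSE scores with a duration parameter and fitting a regression with an MMSE-by-mobility interaction; the variable names and values are invented for illustration.

```python
# Illustrative correlation and interaction-regression sketch (synthetic data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

rng = np.random.default_rng(2)
n = 44
mmse = rng.integers(0, 24, n)                 # hypothetical MMSE scores
mobility = rng.uniform(0, 1, n)               # hypothetical mobility index
duration = (10 + 5 * mobility - 0.3 * mmse + 0.4 * mmse * mobility
            + rng.normal(0, 2, n))            # hourly wandering minutes

r, p = stats.pearsonr(mmse, duration)
print(f"MMSE vs duration: r = {r:.2f}, p = {p:.3f}")

df = pd.DataFrame({"mmse": mmse, "mobility": mobility, "duration": duration})
# 'mmse * mobility' expands to both main effects plus their interaction term.
model = smf.ols("duration ~ mmse * mobility", data=df).fit()
print(model.summary().tables[1])
```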
Abstract:
Light Detection and Ranging (LIDAR) has great potential to assist vegetation management in power line corridors by providing more accurate geometric information about the power line assets and the vegetation along the corridors. However, the development of algorithms for the automatic processing of LIDAR point cloud data, in particular for feature extraction and classification of raw point cloud data, is still in its infancy. In this paper, we take advantage of LIDAR intensity and try to classify ground and non-ground points by statistically analyzing the skewness and kurtosis of the intensity data. Moreover, the Hough transform is employed to detect power lines from the filtered object points. The experimental results show the effectiveness of our methods and indicate that better results were obtained by using LIDAR intensity data than elevation data.
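A rough sketch of the kind of pipeline described, using synthetic points and placeholder thresholds: intensities are trimmed until their skewness balances, to separate ground from object returns, and a Hough transform is then applied to the rasterised object points to look for linear power-line structures. The skewness-balancing rule, trimming step and grid resolution here are illustrative assumptions, not the paper's exact algorithm.

```python
# Intensity-based ground/object separation plus line detection (sketch only).
import numpy as np
from scipy.stats import skew, kurtosis
from skimage.transform import hough_line, hough_line_peaks

rng = np.random.default_rng(3)
# Synthetic point cloud: x, y, intensity (ground returns low, objects high).
ground = np.column_stack([rng.uniform(0, 100, 5000),
                          rng.uniform(0, 100, 5000),
                          rng.normal(30, 5, 5000)])
line_x = rng.uniform(0, 100, 500)              # a synthetic "power line"
objects = np.column_stack([line_x,
                           0.5 * line_x + 20 + rng.normal(0, 0.3, 500),
                           rng.normal(80, 5, 500)])
points = np.vstack([ground, objects])

# Skewness balancing: peel off the highest intensities until skewness <= 0.
order = np.argsort(points[:, 2])
sorted_pts = points[order]
cut = len(sorted_pts)
while cut > 10 and skew(sorted_pts[:cut, 2]) > 0:
    cut -= 50
ground_pts, object_pts = sorted_pts[:cut], sorted_pts[cut:]
print("kurtosis of retained ground intensities:",
      round(kurtosis(ground_pts[:, 2]), 2))

# Rasterise candidate object points and search for lines with a Hough transform.
grid, _, _ = np.histogram2d(object_pts[:, 0], object_pts[:, 1],
                            bins=200, range=[[0, 100], [0, 100]])
h, theta, d = hough_line(grid > 0)
_, angles, dists = hough_line_peaks(h, theta, d, num_peaks=1)
print("detected line: angle=%.2f rad, offset=%.1f" % (angles[0], dists[0]))
```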
Abstract:
Intimate partner violence (IPV) is not only a problem for heterosexual couples. Although research in the area is beset by methodological and definitional problems, studies generally demonstrate that IPV also affects those who identify as non-heterosexual; that is, those sexualities that are typically categorized as lesbian, gay, bisexual, transgender, or intersex (LGBTI). IPV appears to be at least as prevalent in LGBTI relationships as it is in heterosexual couples, and follows similar patterns (e.g. Australian Research Centre on Sex, Health and Society 2006; Donovan et al. 2006; Chan 2005; Craft and Serovich 2005; Burke et al. 2002; Jeffries and Ball 2008; Kelly and Warshafsky 1987; Letellier 1994; Turrell 2000; Ristock 2003; Vickers 1996). There is, however, little in the way of specific community or social services support available to either victims or perpetrators of violence in same-sex relationships (see Vickers 1996). In addition, there are important differences in the experience of IPV between LGBTI and non-LGBTI victims, and even among LGBTI individuals; for example, among transgender populations (Chan 2005), and those who are HIV sero-positive (Craft and Serovich 2005). These different experiences of IPV include the use of HIV and the threat of “outing” a partner as tools of control, as just two examples (Jeffries and Ball 2008; Salyer 1999; WA Government 2008b). Such differences impact on how LGBTI victims respond to the violence, including whether or not and how they seek help, what services they are able to avail themselves of, and how likely they are to remain with, or return to, their violent partners (Burke et al. 2002). This chapter explores the prevalent heteronormative discourses that surround IPV, both within the academic literature, and in general social and government discourses. It seeks to understand how same-sex IPV remains largely invisible, and suggests that these dominant discourses play a major role in maintaining this invisibility. In many respects, it builds on work by a number of scholars who have begun to interrogate the criminal justice and social discourses surrounding violent crime, primarily sexual violence, and who problematize these discourses (see for example Carmody 2003; Carmody and Carrington 2000; Marcus 1992). It will begin by outlining these dominant discourses, and then problematize these by identifying some of the important differences between LGBTI IPV and IPV in heterosexual relationships. In doing so, this chapter will suggest some possible reasons for the silence regarding IPV in LGBTI relationships, and the effects that this can have on victims. Although an equally important area of research, and another point at which the limitations of dominant social discourses surrounding IPV can be brought to light, this chapter will not examine violence experienced by heterosexual men at the hands of their intimate female partners. Instead, it will restrict itself to IPV perpetrated within same-sex relationships.
Abstract:
Aim: To estimate the economic consequences of pressure ulcers attributable to malnutrition. Method: Statistical models were developed to predict the number of cases of pressure ulcer, the associated bed days lost and the dollar value of these losses in public hospitals in 2002/2003 in Queensland, Australia. The following input parameters were specified and appropriate probability distributions fitted:
• Number of at-risk discharges per annum
• Incidence rate for pressure ulcer
• Attributable fraction of malnutrition in the development of pressure ulcer
• Independent effect of pressure ulcer on length of hospital stay
• Opportunity cost of a hospital bed day
One thousand random re-samples were made and the results expressed as (output) probabilistic distributions. Results: The model predicts a mean 16 060 (SD 5 671) bed days lost and a corresponding mean economic cost of AU$12 968 668 (SD AU$4 924 148) (EUR 6 925 268, SD 2 629 495; US$7 288 391, SD 2 767 371) for pressure ulcers attributable to malnutrition in 2002/2003 in public hospitals in Queensland, Australia. Conclusion: The costs of pressure ulcers attributable to malnutrition, in bed days and dollar terms, are substantial. The model only considers the costs of increased length of stay associated with pressure ulcer and not other factors associated with care.
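The structure of the probabilistic model can be sketched as a simple Monte Carlo simulation over the listed inputs. All distributions and parameter values below are illustrative placeholders rather than the fitted inputs reported in the study.

```python
# Schematic re-creation of the probabilistic cost model: sample each input
# from an assumed distribution, combine, and summarise the outputs.
import numpy as np

rng = np.random.default_rng(4)
n_samples = 1000                                               # 1000 re-samples

at_risk_discharges = rng.normal(340_000, 20_000, n_samples)    # per annum (placeholder)
incidence_rate = rng.beta(2, 18, n_samples)                    # pressure ulcer (placeholder)
attrib_fraction = rng.beta(3, 7, n_samples)                    # due to malnutrition (placeholder)
extra_los_days = rng.gamma(2.0, 2.0, n_samples)                # extra bed days per case (placeholder)
cost_per_bed_day = rng.normal(800, 100, n_samples)             # AU$ opportunity cost (placeholder)

cases = at_risk_discharges * incidence_rate * attrib_fraction
bed_days_lost = cases * extra_los_days
cost = bed_days_lost * cost_per_bed_day

print(f"bed days lost: mean {bed_days_lost.mean():,.0f} (SD {bed_days_lost.std():,.0f})")
print(f"economic cost: mean AU${cost.mean():,.0f} (SD AU${cost.std():,.0f})")
```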
Abstract:
Multicarrier code division multiple access (MC-CDMA) is a very promising candidate for the multiple access scheme in fourth generation wireless communication systems. During asynchronous transmission, multiple access interference (MAI) is a major challenge for MC-CDMA systems and significantly affects their performance. The main objectives of this thesis are to analyze the MAI in asynchronous MC-CDMA, and to develop robust techniques to reduce the MAI effect. Focus is first on the statistical analysis of MAI in asynchronous MC-CDMA. A new statistical model of MAI is developed. In the new model, the derivation of MAI can be applied to different distributions of timing offset, and the MAI power is modelled as a Gamma distributed random variable. By applying the new statistical model of MAI, a new computer simulation model is proposed. This model is based on the modelling of a multiuser system as a single user system followed by an additive noise component representing the MAI, which enables the new simulation model to significantly reduce the computation load during computer simulations. MAI reduction using the slow frequency hopping (SFH) technique is the topic of the second part of the thesis. Two subsystems are considered. The first subsystem involves subcarrier frequency hopping as a group, which is referred to as GSFH/MC-CDMA. In the second subsystem, the condition of group hopping is dropped, resulting in a more general system, namely individual subcarrier frequency hopping MC-CDMA (ISFH/MC-CDMA). This research found that with the introduction of SFH, both GSFH/MC-CDMA and ISFH/MC-CDMA systems generate less MAI power than the basic MC-CDMA system during asynchronous transmission. Because of this, both SFH systems are shown to outperform MC-CDMA in terms of BER. This improvement, however, is at the expense of spectral widening. In the third part of this thesis, base station polarization diversity, as another MAI reduction technique, is introduced to asynchronous MC-CDMA. The combined system is referred to as Pol/MC-CDMA. In this part a new optimum combining technique, namely maximal signal-to-MAI ratio combining (MSMAIRC), is proposed to combine the signals from two base station antennas. With the application of MSMAIRC and in the absence of additive white Gaussian noise (AWGN), the resulting signal-to-MAI ratio (SMAIR) is not only maximized but also independent of the cross polarization discrimination (XPD) and antenna angle. When AWGN is present, the performance of MSMAIRC is still affected by the XPD and antenna angle, but to a much lesser degree than traditional maximal ratio combining (MRC). Furthermore, this research found that the BER performance of Pol/MC-CDMA can be further improved by changing the angle between the two receiving antennas. Hence the optimum antenna angles for both MSMAIRC and MRC are derived and their effects on the BER performance are compared. With the derived optimum antenna angle, the Pol/MC-CDMA system is able to obtain the lowest BER for a given XPD.
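The proposed simulation shortcut, replacing the full multiuser system with a single user plus an additive MAI term whose power is Gamma distributed, can be sketched as follows; the BPSK signalling, Gamma shape/scale values and Eb/N0 level are placeholder assumptions, not parameters from the thesis.

```python
# Bare-bones sketch: single-user link plus an additive noise term whose power
# is drawn from a Gamma distribution standing in for the MAI.
import numpy as np

rng = np.random.default_rng(5)
n_bits = 100_000
bits = rng.integers(0, 2, n_bits)
symbols = 2 * bits - 1                         # BPSK mapping {0,1} -> {-1,+1}

# MAI power modelled as a Gamma random variable (placeholder shape/scale).
mai_power = rng.gamma(shape=2.0, scale=0.05, size=n_bits)
mai = rng.normal(0.0, np.sqrt(mai_power))      # Gaussian MAI with that power

ebn0_db = 8.0                                  # assumed AWGN level for the user
noise_var = 10 ** (-ebn0_db / 10) / 2
awgn = rng.normal(0.0, np.sqrt(noise_var), n_bits)

received = symbols + mai + awgn
decisions = (received > 0).astype(int)
ber = np.mean(decisions != bits)
print(f"simulated BER with Gamma-modelled MAI: {ber:.4f}")
```

The attraction of this shortcut is computational: only one user's waveform is generated per run, so the cost no longer scales with the number of interfering users.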
Abstract:
This thesis details a methodology to estimate urban stormwater quality based on a set of easy-to-measure physico-chemical parameters. These parameters can be used as surrogate parameters to estimate other key water quality parameters. The key pollutants considered in this study are nitrogen compounds, phosphorus compounds and solids. The use of surrogate parameter relationships to evaluate urban stormwater quality will reduce the cost of monitoring, giving scientists added capability to generate large amounts of data for more rigorous analysis of the key urban stormwater quality processes, namely pollutant build-up and wash-off. This in turn will assist in the development of more stringent stormwater quality mitigation strategies. The research methodology was based on a series of field investigations, laboratory testing and data analysis. Field investigations were conducted to collect pollutant build-up and wash-off samples from residential roads and roof surfaces. Past research has identified that these impervious surfaces are the primary pollutant sources to urban stormwater runoff. A specially designed vacuum system and rainfall simulator were used in the collection of pollutant build-up and wash-off samples. The collected samples were tested for a range of physico-chemical parameters. Data analysis was conducted using both univariate and multivariate data analysis techniques. Analysis of build-up samples showed that pollutant loads accumulated on road surfaces are higher than the pollutant loads on roof surfaces. Furthermore, it was found that the fraction of solids smaller than 150 µm is the most polluted particle size fraction in solids build-up on both road and roof surfaces. The analysis of wash-off data confirmed that the simulated wash-off process adopted for this research agrees well with the general understanding of the wash-off process on urban impervious surfaces. The observed pollutant concentrations in wash-off from road surfaces were different to pollutant concentrations in wash-off from roof surfaces. Therefore, firstly, the identification of surrogate parameters was undertaken separately for road and roof surfaces. Secondly, a common set of surrogate parameter relationships was identified for both surfaces together to evaluate urban stormwater quality. Surrogate parameters were identified for nitrogen, phosphorus and solids separately. Electrical conductivity (EC), total organic carbon (TOC), dissolved organic carbon (DOC), total suspended solids (TSS), total dissolved solids (TDS), total solids (TS) and turbidity (TTU) were selected as the relatively easy-to-measure parameters. Consequently, surrogate parameters for nitrogen and phosphorus were identified from the set of easy-to-measure parameters for both road surfaces and roof surfaces. Additionally, surrogate parameters for TSS, TDS and TS, which are key indicators of solids, were obtained from EC and TTU, which can be direct field measurements. The regression relationships developed between the surrogate parameters and each key parameter of interest were of a similar format for road and roof surfaces, namely simple linear regression equations. The identified relationships for road surfaces were DTN-TDS:DOC, TP-TS:TOC, TSS-TTU, TDS-EC and TS-TTU:EC. The identified relationships for roof surfaces were DTN-TDS and TS-TTU:EC. Some of the relationships developed had a higher confidence interval whilst others had a relatively low confidence interval.
The relationships obtained for DTN-TDS, DTN-DOC, TP-TS and TS-EC for road surfaces demonstrated good near-site portability potential. Currently, best management practices focus on providing treatment measures for stormwater runoff at catchment outlets, where road and roof runoff are not separated. In this context, it is important to find a common set of surrogate parameter relationships for road surfaces and roof surfaces to evaluate urban stormwater quality. Consequently, the DTN-TDS, TS-EC and TS-TTU relationships were identified as the common relationships capable of providing estimates of DTN and TS irrespective of the surface type.
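One surrogate relationship of the common set, TS estimated from EC and TTU, can be illustrated with a simple linear regression fit of the form used in the thesis; the values below are synthetic stand-ins for the field and laboratory measurements.

```python
# Illustrative surrogate-parameter regression: TS estimated from EC and TTU.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 60
ec = rng.uniform(50, 400, n)          # electrical conductivity (uS/cm), synthetic
ttu = rng.uniform(5, 120, n)          # turbidity (NTU), synthetic
ts = 1.2 * ec + 3.5 * ttu + rng.normal(0, 25, n)   # total solids (mg/L), synthetic

X = sm.add_constant(np.column_stack([ec, ttu]))
fit = sm.OLS(ts, X).fit()
print(fit.params)                      # intercept and slopes for EC and TTU
print(f"R-squared = {fit.rsquared:.3f}")

# Predict TS for a new field measurement (hypothetical EC and TTU values).
new = sm.add_constant(np.array([[220.0, 40.0]]), has_constant="add")
print("estimated TS:", fit.predict(new)[0])
```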