955 results for statistical potentials
Abstract:
This article reports on the results of a study undertaken by the author together with her research assistant, Heather Green. The study collected and analysed data from all disciplinary tribunal decisions heard in Queensland since 1930 in an attempt to provide empirical information which has previously been lacking. This article will outline the main features of the disciplinary system in Queensland, describe the research methodology used in the present study and then report on some findings from the study. Reported findings include a profile of solicitors who have appeared before a disciplinary hearing, the types of matters which have attracted formal discipline and the types of orders made by the tribunal. Much of the data is then presented on a time scale so as to reveal any changes over time.
Abstract:
The monitoring of infection control indicators, including hospital-acquired infections, is an established part of quality maintenance programmes in many health-care facilities. However, the use of surveillance data can be frustrated by the infrequent nature of many infections. Traditional methods of analysis often provide delayed identification of increasing infection occurrence, placing patients at preventable risk. The application of Shewhart, Cumulative Sum (CUSUM) and Exponentially Weighted Moving Average (EWMA) statistical process control charts to the monitoring of indicator infections allows continuous real-time assessment. The Shewhart chart will detect large changes, while CUSUM and EWMA methods are more suited to recognition of small to moderate sustained change. When used together, Shewhart and EWMA methods are ideal for monitoring bacteraemia and multiresistant organism rates. Shewhart and CUSUM charts are suitable for surgical infection surveillance.
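As an illustration of how these three chart statistics are typically computed, the sketch below applies Shewhart limits, an EWMA and an upper CUSUM to a hypothetical series of monthly infection counts; the counts, baseline period and control parameters (lambda, k, h) are illustrative assumptions, not values from the article.

```python
# Minimal sketch of the three chart statistics described above, applied to a
# hypothetical series of monthly infection counts (values are illustrative).
import numpy as np

counts = np.array([4, 3, 5, 2, 4, 3, 6, 4, 5, 7, 8, 9], dtype=float)
mu, sigma = counts[:8].mean(), counts[:8].std(ddof=1)  # assumed baseline period

# Shewhart chart: flag any point outside mean +/- 3 standard deviations.
shewhart_alarm = np.abs(counts - mu) > 3 * sigma

# EWMA chart: exponentially weighted moving average with smoothing lambda.
lam = 0.2
ewma = np.zeros_like(counts)
ewma[0] = mu
for t in range(1, len(counts)):
    ewma[t] = lam * counts[t] + (1 - lam) * ewma[t - 1]
ewma_limit = mu + 3 * sigma * np.sqrt(lam / (2 - lam))  # asymptotic upper limit

# Upper CUSUM: accumulate deviations above a reference value k, alarm at h.
k, h = 0.5 * sigma, 4 * sigma
cusum = np.zeros_like(counts)
for t in range(1, len(counts)):
    cusum[t] = max(0.0, cusum[t - 1] + counts[t] - mu - k)

print(shewhart_alarm, ewma > ewma_limit, cusum > h)
```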
Abstract:
This paper proposes a template for modelling complex datasets that integrates traditional statistical modelling approaches with more recent advances in statistics and modelling through an exploratory framework. Our approach builds on the well-known and long-standing traditional idea of 'good practice in statistics' by establishing a comprehensive framework for modelling that focuses on exploration, prediction, interpretation and reliability assessment, a relatively new idea that allows individual assessment of predictions. The integrated framework we present comprises two stages. The first involves the use of exploratory methods to help visually understand the data and identify a parsimonious set of explanatory variables. The second encompasses a two-step modelling process, where the use of non-parametric methods such as decision trees and generalized additive models is promoted to identify important variables and their modelling relationship with the response before a final predictive model is considered. We focus on fitting the predictive model using parametric, non-parametric and Bayesian approaches. This paper is motivated by a medical problem where interest focuses on developing a risk stratification system for morbidity of 1,710 cardiac patients given a suite of demographic, clinical and preoperative variables. Although the methods we use are applied specifically to this case study, these methods can be applied across any field, irrespective of the type of response.
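A minimal sketch of the two-step modelling process described above is given below, using a decision tree to screen for important variables and a logistic regression as the final parametric predictive model; the synthetic data, library choice (scikit-learn) and parameter settings are assumptions for illustration and do not reproduce the cardiac case study.

```python
# Illustrative sketch of the two-stage process: a non-parametric screen
# (decision tree feature importances) followed by a parametric predictive
# model. Data and variable names are hypothetical, not the cardiac dataset.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, n_informative=5,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Stage 1: use a tree to identify a parsimonious set of candidate predictors.
tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)
keep = np.argsort(tree.feature_importances_)[::-1][:5]

# Stage 2: fit the final parametric (here logistic) model on those predictors.
model = LogisticRegression(max_iter=1000).fit(X_tr[:, keep], y_tr)
print("held-out accuracy:", model.score(X_te[:, keep], y_te))
```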
Abstract:
Water resources management requires the integration of physical and chemical criteria with biotic aspects, which make it possible to identify the combined effects of substances and assess their influence. The Allium cepa and Tradescantia pallida test systems are used to study aquatic pollution from a cytogenetic standpoint. In addition to these biomarkers, chlorophyll contents are also used in stress studies because they respond to multiple factors. The aim of this work was therefore to assess the water quality of the Juara lagoon (municipality of Serra/ES) through an integrated analysis of physical, chemical and ecotoxicological aspects, based on cytogenetic studies in A. cepa and T. pallida and on photosynthetic studies in the latter species. Three sampling stations were defined along the lagoon, and parameters such as conductivity, dissolved oxygen, nutrient concentrations and metals were analysed in the water samples. Metals were determined by mass spectrometry. The A. cepa test was performed on seeds germinated in lagoon water samples. With T. pallida plants, the root-tip mitosis assay was carried out and the chloroplast pigment contents were quantified in fully expanded leaves. To this end, an assay was performed using lagoon water as the solvent for a Hoagland solution in which previously rooted T. pallida cuttings were exposed for 24 hours and 40 days for the cytogenetic and photosynthetic evaluations, respectively. A new A. cepa test was carried out on the lagoon water after the 40-day assay to verify that the chemical properties of the samples had been maintained. The cytogenetic evaluation in both species involved the analysis of the mitotic index (MI), the chromosomal aberration index (CA) and the micronucleus frequency (MN). For the statistical analysis, analysis of variance was used, followed by Tukey's test (p < 0.05) to compare treatments within the same sampling campaign and by Bonferroni's test (p < 0.05) to compare campaigns. The physical and chemical results show that the Juara lagoon presents signs of artificial eutrophication. Two sampling stations, in at least one sampling campaign, showed cytotoxic, genotoxic and mutagenic potential. However, this potential shows no relationship with the quantified Fe and Mn contents, suggesting that these sites contain other potential pollutants. The cytogenetic damage observed was greater during the second campaign, demonstrating the effect of the rainy season in intensifying pollution in this environment. The study of photosynthetic metabolism in T. pallida showed that chloroplast pigment contents were related to the high nutrient input at stations J2 and J3, the excess of which is probably responsible for the lower pigment content at J3. The assays with A. cepa and T. pallida responded reliably to the potential risk of the environment, complementing the physical and chemical analyses usually employed in assessing the water quality of lacustrine environments.
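The statistical comparison described in the abstract (analysis of variance followed by Tukey's test at p < 0.05) can be sketched as below; the mitotic-index values and the station groupings are made up for illustration only.

```python
# Sketch of the statistical comparison described above: one-way ANOVA across
# three sampling stations followed by Tukey's HSD. The mitotic-index values
# below are hypothetical.
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

j1 = np.array([42.1, 39.8, 44.0, 41.5, 40.2])
j2 = np.array([35.6, 33.9, 36.8, 34.1, 35.0])
j3 = np.array([30.2, 31.5, 29.8, 32.0, 30.9])

f, p = stats.f_oneway(j1, j2, j3)
print(f"ANOVA: F = {f:.2f}, p = {p:.4f}")

values = np.concatenate([j1, j2, j3])
groups = ["J1"] * len(j1) + ["J2"] * len(j2) + ["J3"] * len(j3)
print(pairwise_tukeyhsd(values, groups, alpha=0.05))
```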
Abstract:
A growing number of corporate failure prediction models has emerged since the 1960s. The economic and social consequences of business failure can be dramatic, so it is no surprise that the issue has attracted growing interest in academic research as well as in the business context. The main purpose of this study is to compare the predictive ability of five models: three based on statistical techniques (discriminant analysis, logit and probit) and two based on artificial intelligence (neural networks and rough sets). The five models were applied to a dataset of 420 non-bankrupt firms and 125 bankrupt firms belonging to the textile and clothing industry over the period 2003–09. Results show that all the models performed well, with an overall correct classification level higher than 90% and a type II error always below 2%. The type I error increases as we move away from the year prior to failure. Our models contribute to the discussion of the causes of corporate financial distress. Moreover, they can be used to assist the decisions of creditors, investors and auditors. Additionally, this research can be of great value to devisers of national economic policies that aim to reduce industrial unemployment.
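By way of illustration, the sketch below fits one of the three statistical techniques compared in the study (the logit model) to synthetic financial ratios and reports the type I and type II errors; the ratios, their distributions and the library choice are assumptions, since the original 545-firm dataset is not reproduced here.

```python
# Sketch of a logit failure-prediction model on synthetic financial ratios;
# the two ratios (liquidity, leverage) and their distributions are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)
n_healthy, n_failed = 420, 125
X = np.vstack([
    rng.normal([1.8, 0.4], 0.3, size=(n_healthy, 2)),  # non-bankrupt firms
    rng.normal([0.9, 0.8], 0.3, size=(n_failed, 2)),   # bankrupt firms
])
y = np.r_[np.zeros(n_healthy), np.ones(n_failed)]      # 1 = failed

logit = LogisticRegression().fit(X, y)
tn, fp, fn, tp = confusion_matrix(y, logit.predict(X)).ravel()
print("type I error (failed firm classified healthy):", fn / (fn + tp))
print("type II error (healthy firm classified failed):", fp / (fp + tn))
```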
Abstract:
Low noise surfaces have been increasingly considered as a viable and cost-effective alternative to acoustical barriers. However, road planners and administrators frequently lack information on the correlation between the type of road surface and the resulting noise emission profile. To address this problem, a method to identify and classify different types of road pavements was developed, whereby near-field road noise is analyzed using statistical learning methods. The vehicle rolling sound signal near the tires and close to the road surface was acquired by two microphones in a special arrangement which implements the Close-Proximity method. A set of features, characterizing the properties of the road pavement, was extracted from the corresponding sound profiles. A feature selection method was used to automatically select those that are most relevant in predicting the type of pavement, while reducing the computational cost. A set of road segments with different types of pavement was tested and the performance of the classifier was evaluated. Results of pavement classification performed during a road journey are presented on a map, together with geographical data. This procedure leads to a considerable improvement in the quality of road pavement noise data, thereby increasing the accuracy of road traffic noise prediction models.
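A sketch of such a classification pipeline is shown below: features extracted from the near-field sound are filtered by an automatic feature selection step before a classifier is trained; the random feature matrix, the number of pavement classes and the choice of selector and classifier are illustrative assumptions, not the method's actual configuration.

```python
# Sketch of the pipeline described above: feature selection followed by a
# classifier. The features are random stand-ins for real CPX sound features.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_segments, n_features = 300, 40          # e.g. spectral band levels (assumed)
X = rng.normal(size=(n_segments, n_features))
y = rng.integers(0, 4, size=n_segments)   # four pavement types (illustrative)

pipe = Pipeline([
    ("select", SelectKBest(f_classif, k=10)),  # keep the most relevant features
    ("clf", SVC(kernel="rbf")),
])
print("cross-validated accuracy:", cross_val_score(pipe, X, y, cv=5).mean())
```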
Abstract:
Wyner-Ziv (WZ) video coding is a particular case of distributed video coding (DVC), the recent video coding paradigm based on the Slepian-Wolf and Wyner-Ziv theorems which exploits the source temporal correlation at the decoder and not at the encoder as in predictive video coding. Although some progress has been made in recent years, WZ video coding is still far from the compression performance of predictive video coding, especially for high and complex motion contents. The WZ video codec adopted in this study is based on a transform domain WZ video coding architecture with feedback channel-driven rate control, whose modules have been improved with some recent coding tools. This study proposes a novel motion learning approach to successively improve the rate-distortion (RD) performance of the WZ video codec as the decoding proceeds, making use of the already decoded transform bands to improve the decoding process for the remaining transform bands. The results obtained reveal gains of up to 2.3 dB in the RD curves against the performance of the same codec without the proposed motion learning approach, for high motion sequences and long group of pictures (GOP) sizes.
Abstract:
The mechanisms of speech production are complex and have been attracting attention from researchers in both the medical and computer vision fields. Within the speech production mechanism, the study of the articulators is a complex issue, since they have a high degree of freedom during this process; this is particularly true of the tongue, whose control and observation are difficult. In this work, the shape of the tongue during the articulation of the oral vowels of European Portuguese is automatically characterized by applying statistical modelling to MR images. A point distribution model, which captures the main characteristics of the tongue's motion, is built from a set of images collected during artificially sustained articulations of European Portuguese sounds. The model built in this work allows a clearer understanding of the dynamic speech events involved in sustained articulations. The tongue shape model can also be useful for speech rehabilitation purposes, specifically to recognize the compensatory movements of the articulators during speech production.
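A point distribution model of the kind described above can be sketched as follows: landmark coordinates from each image are stacked into vectors, and principal component analysis yields the mean shape plus the main modes of variation; the synthetic landmarks and the number of retained modes are assumptions for illustration.

```python
# Sketch of a point distribution model: landmark coordinates from each image
# are stacked into vectors and PCA gives the mean shape plus the main modes
# of variation. Landmarks here are synthetic; the real ones come from MR images.
import numpy as np

rng = np.random.default_rng(2)
n_shapes, n_landmarks = 50, 25
# Each shape: 25 (x, y) tongue-contour landmarks flattened into one row.
shapes = rng.normal(size=(n_shapes, 2 * n_landmarks))

mean_shape = shapes.mean(axis=0)
cov = np.cov(shapes, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]          # sort modes by explained variance
P, lam = eigvecs[:, order[:5]], eigvals[order[:5]]

# Any plausible shape is approximated as mean + P @ b, with the shape
# parameters b usually constrained to about +/- 3 * sqrt(lambda_i).
b = np.array([2.0, -1.0, 0.5, 0.0, 0.0]) * np.sqrt(lam)
new_shape = mean_shape + P @ b
print(new_shape.reshape(n_landmarks, 2)[:3])
```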
Abstract:
Intensity Modulated Radiotherapy (IMRT) is a technique introduced to shape dose distributions to the tumour more precisely, allowing a higher dose escalation in the volume to be irradiated while simultaneously decreasing the dose in the organs at risk, which consequently reduces treatment toxicity. This technique is widely used in prostate and head and neck (H&N) tumours. Given the complexity of this technique and the high doses involved, it is necessary to ensure safe and secure administration of the treatment through quality control programmes for IMRT. The purpose of this study was to statistically evaluate the quality control measurements made for IMRT plans of prostate and H&N patients before the beginning of treatment, analysing their variations, the percentage of rejected and repeated measurements, the averages, standard deviations and proportion relations.
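The descriptive statistics mentioned above can be computed as in the short sketch below, applied to a hypothetical set of per-plan verification measurements; the gamma passing rates and the 95% acceptance threshold are assumptions, not values from the study.

```python
# Minimal sketch of the descriptive statistics mentioned above, applied to a
# hypothetical set of per-plan passing rates; the 95% threshold is assumed.
import numpy as np

passing_rates = np.array([98.2, 97.5, 99.1, 94.3, 96.8, 99.4, 93.7, 98.0])
threshold = 95.0

rejected = passing_rates < threshold
print("mean:", passing_rates.mean())
print("standard deviation:", passing_rates.std(ddof=1))
print("percentage of rejected measurements:", 100 * rejected.mean())
```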
Abstract:
In this work, we investigated the structural, morphological, electrical, and optical properties of a set of Cu2ZnSnS4 thin films grown by sulfurization of metallic precursors deposited on soda lime glass substrates, with or without a molybdenum coating. X-ray diffraction and Raman spectroscopy measurements revealed the formation of single-phase Cu2ZnSnS4 thin films. Good crystallinity and grain compactness of the films were found by scanning electron microscopy. The grown films are poor in copper and rich in zinc, a composition close to that of the Cu2ZnSnS4 solar cells with the best reported efficiency. Electrical conductivity and Hall effect measurements showed a high doping level and a strong compensation. The temperature dependence of the free hole concentration showed that the films are nondegenerate. Photoluminescence spectroscopy showed an asymmetric broadband emission. The experimental behavior with increasing excitation power or temperature cannot be explained by donor-acceptor pair transitions. A model of radiative recombination of an electron with a hole bound to an acceptor level, broadened by potential fluctuations of the valence-band edge, was proposed. An ionization energy for the acceptor level in the range 29–40 meV was estimated, and a value of 172 ± 2 meV was obtained for the potential fluctuation in the valence-band edge.
Abstract:
Modern real-time systems, with a more flexible and adaptive nature, demand approaches for timeliness evaluation based on probabilistic measures of meeting deadlines. In this context, simulation can emerge as an adequate solution to understand and analyse the timing behaviour of actual systems. However, care must be taken with the obtained outputs, otherwise the results may lack credibility. It is particularly important to consider that we are more interested in values from the tail of a probability distribution (near worst-case probabilities) than in deriving confidence on mean values. We approach this subject by considering the random nature of simulation output data. We start by discussing well-known approaches for estimating distributions from simulation output, and the confidence which can be attached to their mean values. This is the basis for a discussion on the applicability of such approaches to derive confidence on the tail of distributions, where the worst case is expected to lie.
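One standard, distribution-free way to attach confidence to a tail quantile of simulation output uses order statistics, as sketched below; this illustrates the issue under discussion and is not necessarily the approach the paper ultimately advocates. The simulated response times and confidence levels are illustrative.

```python
# Sketch of a distribution-free upper confidence bound on a high quantile of
# simulated response times, via order statistics and a binomial argument.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
samples = np.sort(rng.lognormal(mean=1.0, sigma=0.5, size=2000))  # fake response times
p, conf = 0.99, 0.95

# Smallest order-statistic index k such that P(X_(k) >= q_p) >= conf, where
# the number of samples below the true quantile q_p is Binomial(n, p).
n = len(samples)
k = int(stats.binom.ppf(conf, n, p)) + 1
print("point estimate of the 99th percentile:", np.quantile(samples, p))
print(f"{conf:.0%} upper confidence bound:", samples[min(k, n) - 1])
```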