974 results for Bit Error Rate (BER)


Relevance:

100.00%

Publisher:

Abstract:

Seamless phase II/III clinical trials in which an experimental treatment is selected at an interim analysis have been the focus of much recent research interest. Many of the methods proposed are based on the group sequential approach. This paper considers designs of this type in which the treatment selection can be based on short-term endpoint information for more patients than have primary endpoint data available. We show that in such a case, the familywise type I error rate may be inflated if previously proposed group sequential methods are used and the treatment selection rule is not specified in advance. A method is proposed to avoid this inflation by considering the treatment selection that maximises the conditional error given the data available at the interim analysis. A simulation study is reported that illustrates the type I error rate inflation and compares the power of the new approach with two other methods: a combination testing approach and a group sequential method that does not use the short-term endpoint data, both of which also strongly control the type I error rate. The new method is also illustrated through application to a study in Alzheimer's disease. © 2015 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
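
As a hedged illustration of the conditional error idea (a minimal sketch, not the authors' procedure): assume a two-stage design in which the final test rejects when the pooled z-statistic on N = n1 + n2 observations exceeds a critical value c. The conditional type I error of carrying treatment j forward, given its interim z-statistic, then has a closed form, and spending the maximum over all candidate treatments guards against any unspecified selection rule:

```python
import numpy as np
from scipy.stats import norm

def conditional_error(z1, n1, n2, c):
    """Conditional probability of rejecting at the final analysis,
    given an interim z-statistic z1 on n1 observations, with n2
    further observations and final critical value c (under H0)."""
    N = n1 + n2
    return norm.sf((c * np.sqrt(N) - z1 * np.sqrt(n1)) / np.sqrt(n2))

# Interim z-statistics for three experimental treatments (toy values).
z1 = np.array([1.1, 0.4, 1.8])
ces = conditional_error(z1, n1=50, n2=50, c=1.96)

# Guarding against any selection rule: spend the *maximum*
# conditional error, whichever treatment is actually carried forward.
print(ces, ces.max())
```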

Relevance:

100.00%

Publisher:

Abstract:

Predictive performance evaluation is a fundamental issue in the design, development, and deployment of classification systems. As predictive performance evaluation is a multidimensional problem, single scalar summaries such as error rate, although quite convenient due to their simplicity, can seldom evaluate all the aspects that a complete and reliable evaluation must consider. For this reason, various graphical performance evaluation methods are increasingly drawing the attention of the machine learning, data mining, and pattern recognition communities. The main advantage of these methods resides in their ability to depict the trade-offs between evaluation aspects in a multidimensional space rather than reducing these aspects to an arbitrarily chosen (and often biased) single scalar measure. Furthermore, to select a suitable graphical method for a given task, it is crucial to identify its strengths and weaknesses. This paper surveys various graphical methods often used for predictive performance evaluation. By presenting these methods within the same framework, we hope this paper may shed some light on deciding which methods are more suitable to use in different situations.
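
As a concrete contrast between a scalar summary and a graphical method, a minimal scikit-learn sketch on toy data (the ROC curve is one of the graphical methods surveys of this kind typically cover):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_curve, auc
from sklearn.model_selection import train_test_split

# Toy binary problem: compare a scalar summary (error rate) with a
# graphical one (the ROC curve) for the same classifier.
X, y = make_classification(n_samples=1000, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(Xtr, ytr)
error_rate = 1.0 - clf.score(Xte, yte)          # single scalar
fpr, tpr, _ = roc_curve(yte, clf.predict_proba(Xte)[:, 1])
print(f"error rate = {error_rate:.3f}, AUC = {auc(fpr, tpr):.3f}")
# (fpr, tpr) traces the whole trade-off curve that the scalar hides.
```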

Relevance:

100.00%

Publisher:

Abstract:

The coexistence of different types of templates has been the solution of choice to the information crisis of prebiotic evolution, triggered by the finding that a single RNA-like template cannot carry enough information to code for any useful replicase. In principle, confining d distinct templates of length L in a package or protocell, whose survival depends on the coexistence of the templates it holds, could resolve this crisis provided that d is made sufficiently large. Here we review the prototypical package model of Niesert et al. [1981. Origin of life between Scylla and Charybdis. J. Mol. Evol. 17, 348-353], which guarantees the greatest possible region of viability of the protocell population, and show that this model, and hence the entire package approach, does not resolve the information crisis. In particular, we show that the total information stored in a viable protocell (Ld) tends to a constant value that depends only on the spontaneous error rate per nucleotide of the template replication mechanism. As a result, an increase of d must be followed by a decrease of L, so that the net information gain is null. (C) 2008 Elsevier Ltd. All rights reserved.
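
The information ceiling can be made concrete with a standard error-threshold argument (a sketch consistent with the abstract's claim, not the paper's derivation). With a per-nucleotide error rate u, the probability that all d templates of length L replicate without error is

```latex
% Error-free replication of all d templates of length L,
% with per-nucleotide error rate u:
Q = (1-u)^{Ld} \approx e^{-uLd}
% Viability requires Q \ge Q_{\min}, hence a ceiling on total information:
Ld \le \frac{\ln(1/Q_{\min})}{u}
```

Keeping Q above a fixed viability threshold therefore bounds Ld by a constant that depends only on u, so any increase in d forces a compensating decrease in L.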

Relevance:

100.00%

Publisher:

Abstract:

The objective of this research was to identify employees' perceptions of the relationship between procurement legislation and the performance of Embrapa Semiárido, based on the efficiency criterion, after the adoption of the Pregão (reverse auction) as a new bidding modality. Bibliographic, documentary, and field research was carried out, and to ensure the validity of the information a triangulation of data collection techniques was used: document analysis, direct observation, and semi-structured interviews. The interviews were conducted with employees who work or have worked in the Procurement Section for more than 10 years and who had served as auctioneers (pregoeiros), and with researchers of the Unit, also with more than 10 years of experience, who had approved projects funded by the National Treasury and executed both before and after the adoption of the Pregão. According to the documentation analyzed, and in the view of the Section's employees, after the adoption of the Pregão the unit has achieved savings in its contracts averaging 20% below the reference value established in the bidding notices, which could indicate efficiency. However, it became evident that the error rate in the procurement processes became much higher than in the processes carried out before the Pregão, and that although this modality is regarded as beneficial, the impression is that it is being used inappropriately and indiscriminately for each and every contract, without a prior analysis of the most suitable bidding modality. Along the same lines, the research reports and the researchers' opinions showed that budget losses have occurred because processes submitted at the end of the year were not completed, implying non-fulfilment of goals, stages, or tasks of the experiments, as well as many late deliveries and waste of resources due to poor-quality purchases that are unsuitable for research work. Finally, as a contribution, this study found that the current public procurement legislation directly influences the performance of Embrapa Semiárido and does not meet its needs as a research institution, indicating that the institution cannot follow the same rules, norms, and legislation as the acquisition processes of other public administration bodies, and needs a new legal framework that gives it greater flexibility to carry out its projects and fulfil its institutional mission.

Relevance:

100.00%

Publisher:

Abstract:

This study aims to develop a portable, restricted-radiation radio communication system for short-range digital biotelemetry applied to the Six-Minute Walk Test (6MWT) in patients with chronic obstructive pulmonary disease or pulmonary hypertension. Peripheral hemoglobin oxygen saturation (SpO2) and heart rate (HR) are monitored in real time. The system uses the industrial, scientific, and medical (ISM) band, with a carrier frequency of 916 MHz and a transmission power of 0.75 mW. It was designed to operate over a half-duplex link with Manchester NRZ coding, incorporating an automatic repeat request (ARQ) error-control protocol that uses a CRC-16 code for error detection. The maximum data transmission rate is 115.2 kbps. The system consists of three parts: a portable unit (Master), a stationary unit (Slave), and real-time display software. The portable unit receives the monitored parameters from the oximeter and transmits them over the radio-frequency link. The stationary unit interfaces with the software through a standard RS-232 serial port. Laboratory and field tests showed that the biotelemetry system is suitable for performing the 6MWT, with SpO2 accuracy of ±3 digits (within ±1 standard deviation) and HR accuracy of ±3%, both with a frame error rate below 10^-4 (0.01%), without restricting the user's movements during monitoring.
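
The error-detection step lends itself to a short sketch. The abstract does not specify which CRC-16 variant the protocol uses; the CRC-16/CCITT polynomial (0x1021, initial value 0xFFFF) below is an assumption for illustration, as is the telemetry frame content:

```python
def crc16_ccitt(data: bytes, poly: int = 0x1021, crc: int = 0xFFFF) -> int:
    """Bitwise CRC-16 (CCITT polynomial assumed; the paper's exact
    variant is not specified). Returns the 16-bit checksum of `data`."""
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            crc = ((crc << 1) ^ poly) if (crc & 0x8000) else (crc << 1)
            crc &= 0xFFFF
    return crc

frame = b"SpO2=97;HR=88"            # hypothetical telemetry frame
checksum = crc16_ccitt(frame)
# Receiver recomputes the CRC; a mismatch triggers an ARQ retransmission.
assert crc16_ccitt(frame) == checksum
```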

Relevance:

100.00%

Publisher:

Abstract:

This dissertation describes the implementation of a WirelessHART network simulation module for Network Simulator 3 (ns-3), aiming at its acceptance both in current networking research and in industry. To validate the module, tests were implemented for attenuation, packet error rate, information transfer success rate, and battery duration per station.
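
Of the metrics tested, packet error rate has a simple closed form in terms of the bit error rate when bit errors are assumed independent (an assumption of this sketch, not a statement from the dissertation):

```python
def packet_error_rate(ber: float, payload_bytes: int) -> float:
    """PER for a packet of `payload_bytes` bytes, assuming independent
    bit errors with probability `ber` and no error correction."""
    n_bits = 8 * payload_bytes
    return 1.0 - (1.0 - ber) ** n_bits

# e.g. a 127-byte frame (the largest IEEE 802.15.4 frame, the PHY
# WirelessHART builds on) at a bit error rate of 1e-5:
print(f"{packet_error_rate(1e-5, 127):.4f}")  # ~0.0101
```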

Relevance:

100.00%

Publisher:

Abstract:

Nowadays, classifying proteins into structural classes, which concerns the inference of patterns in their 3D conformation, is one of the most important open problems in Molecular Biology. The main reason for this is that the function of a protein is intrinsically related to its spatial conformation. However, such conformations are very difficult to obtain experimentally in the laboratory, so this problem has drawn the attention of many researchers in Bioinformatics. Considering the great difference between the number of known protein sequences and the number of experimentally determined three-dimensional structures, the demand for automated techniques for structural classification of proteins is very high. In this context, computational tools, especially Machine Learning (ML) techniques, have become essential to deal with this problem. In this work, ML techniques are used in the recognition of protein structural classes: Decision Trees, k-Nearest Neighbor, Naive Bayes, Support Vector Machine, and Neural Networks. These methods were chosen because they represent different learning paradigms and have been widely used in the Bioinformatics literature. Aiming to improve the performance of these techniques (individual classifiers), homogeneous (Bagging and Boosting) and heterogeneous (Voting, Stacking, and StackingC) multi-classification systems are used. Moreover, since the protein database used in this work presents the problem of imbalanced classes, techniques for artificial class balancing (Random Undersampling, Tomek Links, CNN, NCL, and OSS) are used to minimize this problem. To evaluate the ML methods, a cross-validation procedure is applied in which the accuracy of the classifiers is measured as the mean classification error rate on independent test sets. These means are compared, two by two, by a hypothesis test to assess whether there is a statistically significant difference between them. Among the individual classifiers, the Support Vector Machine presented the best accuracy. The multi-classification systems (homogeneous and heterogeneous) showed, in general, performance superior or similar to that of the individual classifiers, especially Boosting with Decision Trees and StackingC with Linear Regression as the meta-classifier. The Voting method, despite its simplicity, proved adequate for the problem addressed in this work. The class balancing techniques, on the other hand, did not produce a significant improvement in the global classification error. Nevertheless, their use did reduce the classification error for the minority class; in this context, the NCL technique proved the most appropriate.
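
A minimal sketch of the evaluation protocol described: the cross-validated error rate of an SVM, on toy data standing in for the protein dataset used in the work:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Toy stand-in for the protein structural-class dataset.
X, y = make_classification(n_samples=500, n_classes=4, n_informative=8,
                           random_state=0)
# 10-fold cross-validation; error rate = 1 - accuracy per fold.
scores = cross_val_score(SVC(kernel="rbf"), X, y, cv=10)
error_rates = 1.0 - scores
print(f"mean error rate: {error_rates.mean():.3f} "
      f"(+/- {error_rates.std():.3f})")
```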

Relevance:

100.00%

Publisher:

Abstract:

Wavelet coding has emerged as an alternative coding technique for mitigating the fading effects of wireless channels. This work evaluates the performance of wavelet coding, in terms of bit error probability, over time-varying, frequency-selective multipath Rayleigh fading channels. The adopted propagation model follows the COST 207 standard, the main international reference for GSM, UMTS, and EDGE applications. The results show the efficiency of wavelet coding against the intersymbol interference that characterizes these communication scenarios. This robustness enables the technique's use in different environments, bringing it one step closer to application in practical wireless communication systems.
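
For context, a minimal Monte Carlo sketch of the uncoded BPSK bit error probability over a flat Rayleigh fading channel; this is a simplification (the paper's channels are frequency-selective COST 207 models, and wavelet coding itself is not implemented here):

```python
import numpy as np

rng = np.random.default_rng(0)
n_bits, ebn0_db = 1_000_000, 10.0
ebn0 = 10 ** (ebn0_db / 10)

bits = rng.integers(0, 2, n_bits)
symbols = 1 - 2 * bits                      # BPSK: 0 -> +1, 1 -> -1
h = (rng.normal(size=n_bits) + 1j * rng.normal(size=n_bits)) / np.sqrt(2)
noise = ((rng.normal(size=n_bits) + 1j * rng.normal(size=n_bits))
         * np.sqrt(1 / (2 * ebn0)))
received = h * symbols + noise
# Coherent detection with perfect channel knowledge.
decisions = (np.real(np.conj(h) * received) < 0).astype(int)

ber = np.mean(decisions != bits)
print(f"simulated BER = {ber:.4f}, "
      f"theory = {0.5 * (1 - np.sqrt(ebn0 / (1 + ebn0))):.4f}")
```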

Relevance:

100.00%

Publisher:

Abstract:

Context-aware applications are typically dynamic and use services provided by several sources, with different quality levels. Context information quality is expressed in terms of Quality of Context (QoC) metadata, such as precision, correctness, refreshment, and resolution. Service quality, in turn, is expressed via Quality of Service (QoS) metadata such as response time, availability, and error rate. To assure that an application is using services and context information that meet its requirements, it is essential to continuously monitor this metadata. For this purpose, a QoS and QoC monitoring mechanism is needed that meets the following requirements: (i) support for the measurement and monitoring of QoS and QoC metadata; (ii) support for synchronous and asynchronous operation, enabling the application both to periodically gather the monitored metadata and to be asynchronously notified whenever a given metadata item becomes available; (iii) the use of ontologies to represent information, in order to avoid ambiguous interpretation. This work presents QoMonitor, a module for QoS and QoC metadata monitoring that meets the above requirements. The architecture and implementation of QoMonitor are discussed. To support asynchronous communication, QoMonitor uses two protocols: JMS and Light-PubSubHubbub. To illustrate QoMonitor in the development of ubiquitous applications, it was integrated with OpenCOPI (Open COntext Platform Integration), a middleware platform that integrates several context-provision middleware systems. To validate QoMonitor, two applications were used as proofs of concept: an oil and gas monitoring application and a healthcare application. This work also presents a performance evaluation of QoMonitor for both synchronous and asynchronous requests.
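
A purely hypothetical sketch of the synchronous/asynchronous contract in requirement (ii); none of these names are QoMonitor's actual API:

```python
from typing import Callable, Dict, List

class MetadataMonitor:
    """Illustrative monitor: the same metadata can be polled
    synchronously or delivered asynchronously via callback."""

    def __init__(self):
        self._subscribers: Dict[str, List[Callable[[float], None]]] = {}
        self._latest: Dict[str, float] = {}

    def get(self, metric: str) -> float:
        """Synchronous access: poll the latest observed value."""
        return self._latest[metric]

    def subscribe(self, metric: str, callback: Callable[[float], None]):
        """Asynchronous access: be notified when a value arrives."""
        self._subscribers.setdefault(metric, []).append(callback)

    def publish(self, metric: str, value: float):
        """Called by the measurement layer (e.g. a response-time probe)."""
        self._latest[metric] = value
        for cb in self._subscribers.get(metric, []):
            cb(value)

monitor = MetadataMonitor()
monitor.subscribe("error_rate", lambda v: print(f"QoS alert: {v:.2%}"))
monitor.publish("error_rate", 0.03)   # -> QoS alert: 3.00%
```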

Relevance:

100.00%

Publisher:

Abstract:

Objective: To examine the correlation between clinical diagnoses and autopsy findings in adult patients who died in an intensive care unit (ICU), to determine the rate of agreement on the basic and terminal causes of death, and to classify the types of errors in order to improve quality control of future care. Design: Retrospective study. Setting: Adult ICU in a university hospital. Patients: 30 adult patients who died in the ICU, excluding medicolegal cases. Methods and main results: Anatomo-clinical meetings were held to analyze the pre- and postmortem correlations in 30 consecutive autopsies at the ICU of the University Hospital, School of Medicine of Botucatu/UNESP, from January 1994 to January 1997. The rate of correct clinical diagnoses of the basic cause was 66.7%; in 23.3% of cases, had the correct diagnosis been made, management would have been different, as would the patient's course (Class I error); in 10% of cases the error would not have led to a change in management (Class II error). The rate of correct clinical diagnoses of the terminal cause was 80%. Conclusions: The rate of recognition of the basic cause was 66.7%, which is consistent with the literature, but the Class I error rate was higher than that reported in the literature.

Relevance:

100.00%

Publisher:

Abstract:

This was a prospective study of 43 septic neonates at the NICU of the School of Medicine of Botucatu, São Paulo State University. Clinical and laboratory data on sepsis were analyzed by outcome, divided into two groups: survival and death. We calculated the discriminatory power of the relevant variables for the diagnosis of sepsis in each group and, using discriminant analysis software, proposed a function. There were 43 septic cases, with 31 survivors and 12 deaths. The variables with the highest discriminatory power were the number of compromised systems, the SNAP, FiO2, and (A-a)O2. The study of these and other variables, such as birth weight, number of risk factors, and pH, using a Linear Discriminant Function (LDF), allowed us to identify neonates at high risk of death with a low error rate (8.33%). The LDF was: F = 0.00043 (birth weight) + 0.30367 (number of risk factors) - 0.1171 (number of compromised systems) + 0.33223 (SNAP) + 2.27972 (pH) - 14.96511 (FiO2) + 0.01814 ((A-a)O2). If F > 22.77, there was a high risk of death. This study suggests that the LDF at the onset of sepsis is useful for the early identification of high-risk neonates who need special clinical and laboratory surveillance.
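
The published function translates directly into code. A sketch with the coefficients exactly as reported; variable units follow the original study (birth weight in grams is assumed) and the example values are hypothetical:

```python
def ldf_score(birth_weight_g: float, n_risk_factors: int,
              n_compromised_systems: int, snap: float, ph: float,
              fio2: float, aa_o2: float) -> float:
    """Linear Discriminant Function as reported in the abstract.
    Units follow the original study (birth weight in grams assumed)."""
    return (0.00043 * birth_weight_g
            + 0.30367 * n_risk_factors
            - 0.1171 * n_compromised_systems
            + 0.33223 * snap
            + 2.27972 * ph
            - 14.96511 * fio2
            + 0.01814 * aa_o2)

# Hypothetical neonate; F > 22.77 flags a high risk of death.
f = ldf_score(1800, 4, 3, 20, 7.25, 0.6, 200)
print(f, f > 22.77)
```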

Relevance:

100.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

100.00%

Publisher:

Abstract:

The objective of the present study, developed in a mountainous region of Brazil where many landslides occur, is to present a method for detecting landslide scars that couples image processing techniques with spatial analysis tools. An IKONOS image was initially segmented and then classified with a Bhattacharyya classifier, with an acceptance limit of 99%, resulting in 216 polygons whose spectral response was similar to that of landslide scars. Spatial analysis tools that took into account a susceptibility map, a map of local drainage channels and highways, and the maximum expected size of scars in the study area were then used to exclude features misinterpreted as scars. The 43 resulting features were compared with visually interpreted landslide scars and field observations. The proposed method can be reproduced and enhanced by adding filtering criteria, and was able to find new scars in the image, with a final error rate of 2.3%.
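
A hypothetical sketch of the spatial filtering step; the field names and size threshold are illustrative stand-ins for the susceptibility, drainage/highway, and maximum-scar-size criteria the paper applies:

```python
# Hypothetical candidate polygons from the spectral classification step.
candidates = [
    {"id": 1, "area_m2": 1200,  "near_drainage": True,  "susceptible": True},
    {"id": 2, "area_m2": 90000, "near_drainage": False, "susceptible": True},
    {"id": 3, "area_m2": 800,   "near_drainage": True,  "susceptible": False},
]

MAX_SCAR_AREA_M2 = 50000   # illustrative maximum expected scar size

def is_scar(poly: dict) -> bool:
    """Keep a polygon only if it satisfies all spatial criteria."""
    return (poly["area_m2"] <= MAX_SCAR_AREA_M2
            and poly["near_drainage"]
            and poly["susceptible"])

scars = [p for p in candidates if is_scar(p)]
print([p["id"] for p in scars])   # -> [1]
```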

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND AND OBJECTIVES: Maintaining an approximately constant target-controlled blood concentration of propofol is a technique that can be employed in a simplified way in the operating room. The purpose of this study was to compare, clinically and through laboratory measurements, propofol infusion in children using the pharmacokinetic parameter sets of Short and of Marsh. METHODS: Forty-one patients of both sexes, aged 4 to 12 years, physical status ASA I or II, were studied, distributed into two groups: S (20 patients) and M (21 patients). Group S used Short's pharmacokinetic parameters and Group M used Marsh's. Anesthesia was induced with intravenous boluses of alfentanil 30 µg.kg-1, propofol 3 mg.kg-1, and pancuronium 0.08 mg.kg-1, followed by tracheal intubation and maintenance with N2O/O2 (60%) under controlled mechanical ventilation. In Group S, propofol was infused at 254 µg.kg-1.min-1 for 30 min and then at 216 µg.kg-1.min-1 for another 30 min. In Group M, propofol was infused at 208 µg.kg-1.min-1 for 30 min and then at 170 µg.kg-1.min-1 for another 30 min. Through the pharmacokinetic parameter set specific to each group, the goal was to reach a target propofol concentration of 4 µg.mL-1. Three blood samples were collected (at 20, 40, and 60 minutes) for propofol assay by high-performance liquid chromatography. RESULTS: Groups S and M were similar in age, height, weight, and sex (p > 0.05). There was no statistically significant difference between the two groups in SBP, DBP, HR, FiN2O, hemoglobin SpO2, or end-tidal CO2 (PETCO2). The between-group comparison of the number of repeated alfentanil boluses was not statistically significant. The bispectral index (BIS) showed no statistically significant difference between M0 (awake) and the other time points in either group. The median performance error (MPE) and the median absolute performance error (MAPE) showed statistically significant differences between the groups at 60 minutes. Median blood propofol concentrations (µg.mL-1) showed statistically significant differences between Groups M and S at 60 minutes, and between 40 and 60 minutes within Group S. CONCLUSIONS: Propofol anesthesia using Marsh's pharmacokinetic parameters (Group M) showed a smaller error relative to the 4 µg.mL-1 propofol target concentration and used less propofol to obtain similar clinical results. For these reasons, it should be preferred for ASA I children aged 4 to 12 years.
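
The abstract does not restate how MPE and MAPE are computed; the sketch below assumes the conventional (Varvel) performance error definition, PE = (Cmeasured - Cpredicted) / Cpredicted × 100, with sample values that are purely hypothetical:

```python
import numpy as np

def performance_errors(measured, predicted):
    """Percent performance errors PE = (Cm - Cp) / Cp * 100,
    the conventional (Varvel) definition, assumed here."""
    measured, predicted = np.asarray(measured), np.asarray(predicted)
    pe = (measured - predicted) / predicted * 100.0
    return np.median(pe), np.median(np.abs(pe))  # MPE, MAPE

# Hypothetical samples at 20, 40, 60 min against a 4 ug/mL target.
mpe, mape = performance_errors([3.6, 4.5, 5.2], [4.0, 4.0, 4.0])
print(f"MPE = {mpe:.1f}%, MAPE = {mape:.1f}%")
```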

Relevance:

100.00%

Publisher:

Abstract:

The aim of this study was to assess and apply a microsatellite multiplex system for parentage determination in alpacas. An approach to parentage testing based on 10 microsatellites was evaluated in a population of 329 unrelated alpacas from different geographical zones of Peru. All microsatellite markers, which amplified in two multiplex reactions, were highly polymorphic, with a mean of 14.5 alleles per locus (6 to 28 alleles per locus) and an average expected heterozygosity (HE) of 0.8185 (range 0.698-0.946). The total parentage exclusion probability was 0.999456 for excluding a candidate parent from parentage of an arbitrary offspring given only the genotype of the offspring, and 0.999991 given the genotypes of the offspring and the other parent. In a test case of parentage assignment, the microsatellite panel assigned 38 (of 45 cases) offspring to 10 sires, with LOD scores ranging from 2.19 × 10^13 to 1.34 × 10^15 and Δ values ranging from 2.80 × 10^12 to 1.34 × 10^15, with an estimated pedigree error rate of 15.5%. The performance of this multiplex panel suggests that it will be useful in parentage testing of alpacas.
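
For reference, the LOD score reported for each assignment is the base-10 logarithm of a likelihood ratio, and Δ is the difference between the LOD scores of the two most likely candidate sires; these are the standard definitions used by likelihood-based parentage software, assumed here since the abstract does not restate them:

```latex
% LOD score of a candidate sire, given the observed genotypes:
\mathrm{LOD} = \log_{10}
\frac{L(\text{candidate is the true sire} \mid \text{genotypes})}
     {L(\text{candidate is unrelated} \mid \text{genotypes})},
\qquad
% Delta: margin between the two most likely candidates:
\Delta = \mathrm{LOD}_{(1)} - \mathrm{LOD}_{(2)}
```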