Abstract:
This work presents the design methodology and experimental test results of a power system stabilizer (PSS) implemented on a 10 kVA reduced-scale generation system located at the Control and Power Systems Laboratory (LACSPOT) of the Federal University of Pará (UFPA). The PSS design is based on a robust control strategy focused on structured parametric uncertainties, which are handled with tools from interval analysis. These uncertainties stem from changes in the system's operating point, which cause variations in the parameters of a linearized mathematical model describing the dynamic behavior of the electric power system at that operating point. For the design of the robust interval PSS, a series of experiments is carried out to estimate the parameters of linearized plant models that satisfactorily capture the dynamics of the poorly damped modes of the interconnected generation system. The identification method is a parametric, least-squares-based technique: from a set of input-output data at each operating point, a linear autoregressive model with exogenous input (ARX) is estimated for use in the PSS design. Finally, a series of experimental tests is performed on the generation system connected to the local power grid to verify the effectiveness of the proposed robust interval control technique for PSS tuning. Analysis of the cost functions of the electric power deviation error signal at the synchronous generator output and of the PSS control signal experimentally confirms the good performance of the proposed technique in comparison with a classical control design.
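A minimal sketch of the non-recursive least-squares ARX estimation step described in this abstract, with synthetic data and hypothetical model orders standing in for the LACSPOT measurements (all names and values below are illustrative assumptions, not the paper's):

```python
import numpy as np

def estimate_arx(u, y, na=2, nb=2):
    """Fit an ARX model
        y[k] = -a1*y[k-1] - ... - a_na*y[k-na] + b1*u[k-1] + ... + b_nb*u[k-nb]
    by ordinary (non-recursive) least squares."""
    n = max(na, nb)
    rows = []
    for k in range(n, len(y)):
        # Regressor: past outputs (negated) and past inputs, most recent first.
        rows.append(np.concatenate([-y[k - na:k][::-1], u[k - nb:k][::-1]]))
    Phi = np.asarray(rows)
    theta, *_ = np.linalg.lstsq(Phi, y[n:], rcond=None)
    return theta[:na], theta[na:]  # (a coefficients, b coefficients)

# Synthetic second-order plant standing in for one operating point.
rng = np.random.default_rng(0)
u = rng.standard_normal(500)
y = np.zeros(500)
for k in range(2, 500):
    y[k] = 1.5*y[k-1] - 0.7*y[k-2] + 0.5*u[k-1] + 0.2*u[k-2]

a, b = estimate_arx(u, y)
print("a:", a)  # expected close to [-1.5, 0.7]
print("b:", b)  # expected close to [0.5, 0.2]
```

Repeating this fit at several operating points yields the family of ARX models whose parameter spread the interval-based PSS design is meant to cover.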
Abstract:
Belt conveyor systems are essential to large companies, yet this equipment has a high degree of criticality: an unplanned stoppage can cause huge losses or even halt the entire production process. Given this criticality, it is necessary to monitor the equipment properly and to detect any fault in the system as early as possible. Aiming to reduce unplanned stoppages, this dissertation investigates the modeling of a belt conveyor system for use in monitoring and fault diagnosis of this type of system. First, a phenomenological model of the process is discussed, based on the laws of mechanics and considering the various types of forces opposing the motion of a conveyor belt. The main conveyor parameters were estimated using identification techniques based on non-recursive least squares. Next, a fault detection algorithm using interval analysis was developed and implemented, making it possible to detect inadequate operating conditions. In order to evaluate the performance of the proposed algorithm, a prototype emulating the typical operating conditions of a real belt conveyor system was designed and built. The experimental results confirm the good performance of the proposed methodology.
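A minimal sketch of how an interval-analysis fault test of this kind can work: the estimated parameters are widened into intervals, the model output is bounded by interval arithmetic, and a measurement falling outside the envelope is flagged. The parameter bounds and regressor below are illustrative assumptions, not the dissertation's values.

```python
import numpy as np

def predict_interval(u, theta_lo, theta_hi):
    """Given elementwise bounds on the regression parameters, compute
    guaranteed bounds on the model output for a known regressor u
    (interval arithmetic for a dot product, split by the sign of u)."""
    lo = np.where(u >= 0, theta_lo, theta_hi) @ u
    hi = np.where(u >= 0, theta_hi, theta_lo) @ u
    return lo, hi

def detect_fault(y_measured, u, theta_lo, theta_hi):
    """Flag a fault whenever the measurement leaves the interval envelope."""
    lo, hi = predict_interval(u, theta_lo, theta_hi)
    return not (lo <= y_measured <= hi)

# Illustrative parameter intervals for a simple resistive-torque model:
theta_lo = np.array([0.9, 0.04])   # lower parameter bounds
theta_hi = np.array([1.1, 0.06])   # upper parameter bounds
u = np.array([10.0, 50.0])         # regressor, e.g. speed and load terms
print(detect_fault(12.0, u, theta_lo, theta_hi))  # inside [11, 14] -> False
print(detect_fault(20.0, u, theta_lo, theta_hi))  # outside envelope -> True
```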
Abstract:
Background: Catheter ablation (CA) of ventricular tachycardia (VT) is an important treatment option in patients with structural heart disease (SHD) and an implantable cardioverter defibrillator (ICD). A subset of patients requires epicardial CA for VT. Objective: The purpose of the study was to assess the significance of epicardial CA in these patients after a systematic sequential endocardial approach. Methods: CA procedures for VT performed between January 2009 and October 2012 were analyzed. A sequential CA approach guided by earliest ventricular activation, pace mapping, entrainment, and stimulus-to-QRS interval analysis was used. Acute CA success was assessed by programmed ventricular stimulation. ICD interrogation and 24-h Holter ECG were used to evaluate long-term success. Results: One hundred sixty VT ablation procedures were performed in 126 consecutive patients (114 men; age 65 ± 12 years). Endocardial CA succeeded in 250 (94%) of 265 treated VT. For 15 (6%) VT an additional epicardial CA was performed, succeeding in 9 of these 15 VT. Long-term follow-up (25 ± 18.2 months) showed freedom from VT in 104 patients (82%) after 1.2 ± 0.5 procedures; 11 (9%) suffered repeated ICD shocks and 11 (9%) died of worsening heart failure. Conclusions: Despite a heterogeneous substrate for VT in SHD, endocardial CA alone results in high acute success rates. In this study, additional epicardial CA following a sequential endocardial mapping and CA approach was performed in 6% of VT. Thus, given the possible complications, epicardial CA should be considered only if endocardial CA fails.
Abstract:
Play is the primary occupation of childhood and provides a potentially powerful means of assessing and treating children with autistic disorder (AD). This study used a cross-sectional comparison design to investigate the nature of play engagement in children with AD (n = 24) relative to typically developing children (n = 34) matched for chronological age. Play behaviours were recorded in a clinical play environment. Videotapes comprising 15 minutes of each child's spontaneous play were analysed using time-interval analysis, and the particular play behaviours observed and play objects used were coded. Differences in play behaviours (p < 0.0001) and play object preferences (p < 0.0001) were identified between the groups. The findings on play behaviour contribute to the ongoing debate in the literature surrounding functional and symbolic play, and explanations for the play object preferences are postulated. Recommendations are made for the clinical application of the findings, in terms of enhancing assessment and intervention by augmenting motivation.
Abstract:
An approximate number is an ordered pair consisting of a (real) number and an error bound, briefly an error, which is a non-negative real number. To compute with approximate numbers, the arithmetic operations on errors must be well understood. To model computations with errors, one should suitably define and study arithmetic operations and order relations over the set of non-negative numbers. In this work we discuss the algebraic properties of non-negative numbers, starting from familiar properties of the real numbers, and focus on certain operations on errors that seem not to have been sufficiently studied algebraically. We restrict ourselves to the arithmetic operations on errors related to addition and multiplication by scalars, paying special attention to subtractability-like properties of errors and the induced “distance-like” operation. This operation is implicitly used under different names in several contemporary fields of applied mathematics (inner subtraction and inner addition in interval analysis, the generalized Hukuhara difference in fuzzy set theory, etc.). Here we present some new results on the algebraic properties of this operation.
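For intervals $A = [a^-, a^+]$ and $B = [b^-, b^+]$, the operation alluded to here has the standard closed form of the generalized Hukuhara difference (quoted from the interval-analysis literature, not derived in this abstract):

$$A \ominus_{gH} B = \bigl[\min(a^- - b^-,\ a^+ - b^+),\ \max(a^- - b^-,\ a^+ - b^+)\bigr],$$

which satisfies either $A = B + C$ or $B = A + (-1)\cdot C$, where $C = A \ominus_{gH} B$.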
Abstract:
Survival analysis is used in several fields to analyze the time elapsed between two events. What distinguishes survival analysis from other areas of statistics is that the data are usually censored. Interval censoring arises when the final event of interest is not directly observable and it is only known that the failure time lies within a specific interval. An even more complex censoring scheme arises when both the initial and the final times are interval-censored; this situation is called double censoring. In this article we give a formal description of a parametric Bayesian method for the analysis of interval-censored and doubly-censored data, together with clear guidance on its practical use. The proposed methodology is illustrated with data from a cohort of hemophilia patients who were infected with HIV in the early 1980s.
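As a sketch of the core likelihood ingredient for interval-censored data (standard in this literature; the article's parametric families and the doubly-censored extension add further structure): if the failure time $T$ follows a parametric distribution with cdf $F(\cdot;\theta)$ and is only known to lie in $(L_i, R_i]$, each observation contributes $P(L_i < T \le R_i)$, so

$$\mathcal{L}(\theta) = \prod_{i=1}^{n}\bigl[F(R_i;\theta) - F(L_i;\theta)\bigr],$$

and a Bayesian analysis combines this likelihood with a prior on $\theta$.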
Abstract:
The objective of this paper is to propose a protocol for analyzing blood samples from yellow fever 17DD vaccinees who developed serious adverse events. We investigated whether the time between sample collection and sample processing could interfere with lymphocyte subset percentages, since it is often impossible to analyze blood samples immediately after collection due to transport delays from collection sites to the flow cytometry facility. CD4+CD38+ T, CD8+CD38+ T, CD3+ T, and CD19+ B lymphocyte subsets were analyzed by flow cytometry in nine healthy volunteers immediately after blood collection and after intervals of 24 and 48 h. The whole-blood lysis method and gradient sedimentation by Histopaque were applied to isolate peripheral blood mononuclear cells for the flow cytometry analyses. With the lysis method, there was no significant change in lymphocyte subset percentages between the two time intervals (24 and 48 h). In contrast, when blood samples were processed by Histopaque gradient sedimentation, the sample-processing interval influenced the percentages of the T lymphocyte subsets but not of the B cells. From these results, we conclude that the whole-blood lysis method is more appropriate than Histopaque gradient sedimentation for immunophenotyping blood samples collected after serious adverse events, owing to the smaller variation in lymphocyte subset levels over time.
Abstract:
No reports testing the efficacy of the QT/RR ratio ≤ 1/2 for detecting a normal QTc interval were found in the literature. The objective of the present study was to determine whether a QT/RR ratio ≤ 1/2 can be considered equivalent to a normal QTc, and to compare the QT and QTc measured and calculated clinically and by a computerized electrocardiograph. One hundred forty QT/RR ratios from 28 successive electrocardiograms obtained from 28 consecutive patients in a tertiary-level teaching hospital were analyzed clinically by 5 independent observers and by a computerized electrocardiograph. The QT/RR ratio provided 56% sensitivity and 78% specificity, with an area under the receiver operating characteristic curve of 75.8% (95%CI: 0.68 to 0.84). The divergences in QT and QTc interval measurements between the clinical and computerized evaluations were 0.01 ± 0.03 s (95%CI: 0.04-0.02) and 0.01 ± 0.04 s (95%CI: -0.05-0.03), respectively. The QT and QTc values measured clinically and by the computerized electrocardiograph were similar. The QT/RR ratio ≤ 1/2 was not a satisfactory index for QTc evaluation because it could not predict a normal QTc value.
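A minimal sketch of the two quantities compared in this study, assuming Bazett's correction $QT_c = QT/\sqrt{RR}$ (the abstract does not state which correction formula was used, so treat that as an assumption):

```python
import math

def qtc_bazett(qt_s, rr_s):
    """Heart-rate-corrected QT interval (Bazett): QTc = QT / sqrt(RR), in seconds."""
    return qt_s / math.sqrt(rr_s)

def qt_rr_rule(qt_s, rr_s):
    """The bedside rule under test: call the QTc 'normal' when QT/RR <= 1/2."""
    return qt_s / rr_s <= 0.5

# Illustrative beat: QT = 0.40 s, RR = 0.75 s (80 bpm)
qt, rr = 0.40, 0.75
print(round(qtc_bazett(qt, rr), 3))  # 0.462 s, a borderline-prolonged QTc
print(qt_rr_rule(qt, rr))            # False: QT/RR = 0.533 > 1/2
```

Disagreement between the two outputs across a range of beats is exactly the kind of mismatch the study quantifies with sensitivity and specificity.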
Abstract:
One of the earliest accounts of duration perception, by Karl von Vierordt, implied a common process underlying the timing of intervals in the sub-second and the second range. To date, there are two major explanatory approaches to the timing of brief intervals: the common timing hypothesis and the distinct timing hypothesis. While the common timing hypothesis proceeds from a unitary timing process, the distinct timing hypothesis posits two dissociable, independent mechanisms for the timing of intervals in the sub-second and the second range, respectively. In the present paper, we introduce confirmatory factor analysis (CFA) to elucidate the internal structure of interval timing in the sub-second and the second range. Our results indicate that the assumption of two mechanisms underlying the processing of intervals in the second and the sub-second range may be more appropriate than the assumption of a unitary timing mechanism. In contrast to the basic assumption of the distinct timing hypothesis, however, these two timing mechanisms are closely associated with each other and share 77% of their variance. This finding suggests either a strong functional relationship between the two timing mechanisms or a hierarchically organized internal structure. The findings are discussed in the light of existing psychophysical and neurophysiological data.
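For orientation, if the shared variance is read as the squared correlation between the two latent timing factors (an interpretive assumption), 77% common variance corresponds to a factor correlation of $r = \sqrt{0.77} \approx 0.88$.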
Abstract:
Many statistical studies feature data with both exact-time and interval-censored events. While a number of methods currently exist to handle interval-censored events and multivariate exact-time events separately, few techniques exist to deal with their combination. This thesis develops a theoretical framework for analyzing a multivariate endpoint comprising a single interval-censored event plus an arbitrary number of exact-time events. The approach fuses the exact-time events, modeled using the marginal method of Wei, Lin, and Weissfeld, with a piecewise-exponential interval-censored component. The resulting model incorporates more of the information in the data and also removes some of the biases associated with the exclusion of interval-censored events. A simulation study demonstrates that our approach produces reliable estimates of the model parameters and their variance-covariance matrix. As a real-world example, we apply the technique to the Systolic Hypertension in the Elderly Program (SHEP) clinical trial, which features three correlated events: clinical non-fatal myocardial infarction and fatal myocardial infarction (two exact-time events), and silent myocardial infarction (one interval-censored event).
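A sketch of how the interval-censored component can enter such a model under a piecewise-exponential hazard (a standard construction; the thesis's exact parametrization may differ): with hazard $\lambda_j$ constant on the $j$-th cut interval and $\Delta_j(t)$ the time spent in that interval before $t$, the survival function is

$$S(t) = \exp\Bigl(-\sum_j \lambda_j\,\Delta_j(t)\Bigr),$$

so a silent myocardial infarction known only to occur in $(L_i, R_i]$ contributes $S(L_i) - S(R_i)$ to the likelihood, while the exact-time events enter through the Wei-Lin-Weissfeld marginal models.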