43 results for Data sources detection


Relevance: 90.00%

Abstract:

Precipitation and temperature climate indices are calculated using the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) reanalysis and validated against observational data from stations in Brazil and other data sources. The spatial patterns of the climate index trends are analyzed for the period 1961-1990 over South America. In addition, correlation and linear regression coefficients for some specific stations were obtained for comparison with the reanalysis data. In general, the results suggest that the NCEP/NCAR reanalysis can provide useful information about minimum temperature and consecutive dry days indices at individual grid cells in Brazil. However, some regional differences in the climate index trends are observed when different data sets are compared. For instance, the NCEP/NCAR reanalysis shows a reversed signal for all annual rainfall indices and the cold night index over Argentina. Despite these differences, maps of the trends for most of the annual climate indices obtained from the NCEP/NCAR reanalysis and the BRANT analysis are generally in good agreement with other available data sources and previous findings in the literature for large areas of southern South America. The pattern of trends for the annual precipitation indices over the 30 years analyzed indicates a change to wetter conditions over southern and southeastern Brazil, Paraguay, Uruguay, central and northern Argentina, and parts of Chile, and a decrease over southwestern South America. All over South America, the climate indices related to minimum temperature (warm or cold nights) clearly show a warming tendency; however, no consistent changes in maximum temperature extremes (warm and cold days) have been observed. Therefore, one must be careful before suggesting any trends for warm or cold days.
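
Since the validation described above rests on simple trend and correlation statistics, a minimal sketch of that computation is given below; the data arrays are hypothetical placeholders, not the study's station or reanalysis series.

```python
import numpy as np

years = np.arange(1961, 1991)                            # the 1961-1990 analysis period
rng = np.random.default_rng(0)
station = rng.normal(10.0, 2.0, years.size)              # hypothetical station index (e.g., CDD)
reanalysis = station + rng.normal(0.0, 1.0, years.size)  # hypothetical nearest grid cell

# Least-squares linear trend (index units per year) of each series
trend_station = np.polyfit(years, station, 1)[0]
trend_reanalysis = np.polyfit(years, reanalysis, 1)[0]

# Pearson correlation between station and reanalysis series, as used for validation
r = np.corrcoef(station, reanalysis)[0, 1]
print(f"trends: {trend_station:.3f} vs {trend_reanalysis:.3f} per year, r = {r:.2f}")
```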

Relevance: 80.00%

Abstract:

INTRODUCTION: Autochthonous malaria in the State of São Paulo (ESP), Brazil, is characterized by sporadic outbreaks in the western region and persistent transmission in the eastern region, where oligosymptomatic cases with low Plasmodium vivax parasitemia occur. The objectives of this study were: to analyze the completeness of the individual notification forms (FIN) for autochthonous malaria; to estimate the trend in the incidence of autochthonous cases in the ESP from 1980 to 2007; and to analyze the clinical and epidemiological behavior of the cases in two regions of autochthonous transmission over this period. METHODS: A descriptive study was carried out with 18 variables from the malaria FIN of the ESP, analyzed for two regions and two periods (1980-1993 and 1994-2007). Data sources: SUCEN/SES/SP, SINAN/CVE/SES/SP and DATASUS. RESULTS: Completeness was above 85% for 11 variables. The incidence trend was decreasing. A total of 821 autochthonous cases were reported, 91.6% of them in the eastern region, with Plasmodium vivax predominating. Asymptomatic infection accounted for a higher percentage of cases in the second period (p<0.001). CONCLUSIONS: The completeness of the information was considered satisfactory. The clinical differences found deserve the attention of epidemiological surveillance, which must deal with the challenge of asymptomatic Plasmodium infection.

Relevance: 80.00%

Abstract:

OBJECTIVE: To estimate the prevalence of congenital defects (CD) in a cohort of live births (LB) by linking the databases of the Mortality Information System (SIM) and the Live Birth Information System (SINASC). METHODS: A descriptive study to evaluate live birth certificates as a source of information on CD. The study population is a cohort of in-hospital LB from the first half of 2006 (01/01/2006 to 30/06/2006), born in the city of São Paulo to mothers residing there, obtained by linking the database of live birth certificates with the neonatal deaths from the cohort. RESULTS: The most prevalent CD according to SINASC were: congenital malformations (CM) and deformities of the musculoskeletal system (44.7%), CM of the nervous system (10.0%) and chromosomal anomalies (8.6%). After linkage, 80.0% of the individuals with CD of the circulatory system, 73.3% with CD of the respiratory system and 62.5% with CD of the digestive system were recovered. SINASC accounted for 55.2% of the CD notifications and SIM for 44.8%, proving important for the recovery of CD information. According to SINASC, the CD prevalence rate in the cohort was 75.4 per 10,000 LB; with the data linked to SIM, this rate rose to 86.2 per 10,000 LB. CONCLUSIONS: The complementary data obtained by the SIM/SINASC linkage provide a more realistic profile of CD prevalence than that recorded by SINASC alone, which identifies the most visible CD, while SIM identifies the most lethal ones, showing the importance of using the two data sources together.
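
The linkage step described above can be illustrated with a minimal sketch; the field names and keys here are hypothetical (real SIM/SINASC linkage uses the live birth certificate number and further identifying fields), and the point is only how a defect reported in either system is recovered.

```python
# Hypothetical records keyed by a shared certificate number (placeholder fields).
sinasc = {  # live birth records (SINASC)
    "DN123": {"birth_date": "2006-02-10", "cd_code": "Q24"},
    "DN456": {"birth_date": "2006-03-05", "cd_code": None},
    "DN789": {"birth_date": "2006-05-20", "cd_code": None},
}
sim = {     # neonatal death records (SIM)
    "DN456": {"death_date": "2006-03-09", "cd_code": "Q33"},
}

# A congenital defect counts if reported on either certificate.
linked = {}
for key, rec in sinasc.items():
    cd = rec["cd_code"] or sim.get(key, {}).get("cd_code")
    linked[key] = {**rec, "cd_any_source": cd}

n_cd = sum(1 for r in linked.values() if r["cd_any_source"])
print(10_000 * n_cd / len(linked), "per 10,000 live births (toy figures)")
```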

Relevance: 80.00%

Abstract:

Regionalization has been identified as one of the main challenges for achieving equity and comprehensiveness in the SUS (the Brazilian Unified Health System). This article aims to evaluate the implementation of a project for organizing health regions in the city of São Paulo. To this end, a case study was carried out in a selected region of the city, based on the framework of implementation analysis, using management documents and semi-structured interviews with key informants from the 2005-2008 municipal administration as data sources. Thematic analysis showed that the regionalization project conceived at the beginning of the administration was not effectively implemented. Among the factors contributing to this failure were: a) the Municipal Health Secretariat (SMS), in addition to its centralizing character, maintained independent political-administrative structures for managing primary care and hospital care; b) the SMS did not effectively take over the management of state outpatient clinics and hospitals; c) the institutional power of the hospitals and their resistance to integrating into the health system. The article also discusses the need to advance the intramunicipal decentralization of the SUS and to seek new strategies for building agreements capable of overcoming resistance and articulating historically consolidated institutions, aiming at a cooperative and solidary regionalization.

Relevance: 80.00%

Abstract:

Access to medium-complexity services has been identified, by managers and researchers, as one of the obstacles to achieving comprehensive care in the SUS. This article aimed to evaluate the mechanisms used by SUS management in the city of São Paulo to guarantee access to medium-complexity care between 2005 and 2008. A case study strategy was chosen, using the following sources of evidence: interviews with managers, a focus group with users, and participant observation. Thematic analysis was applied, based on the theoretical framework of comprehensive care, in the dimension of service organization. The study sought to describe the paths taken by users to access medium-complexity services, from the perspective of both managers and the users themselves. Medium complexity was identified by managers as the "bottleneck" of the SUS and one of the main obstacles to building comprehensive care. To address this situation, the municipal manager invested in the computerization of services, as an isolated measure and without considering users' needs. As a result, this technological incorporation had little impact on improving access, which was confirmed by the users' reports. The article argues that tackling such a complex problem requires articulated actions in health policy and in the organization of services, as well as the (re)organization of the work process at all levels of the health system.

Relevance: 80.00%

Abstract:

Background: In areas with limited structure in place for microscopy diagnosis, rapid diagnostic tests (RDT) have been demonstrated to be effective. Method: The cost-effectiveness of the OptiMAL® RDT and thick smear microscopy was estimated and compared. Data were collected in remote areas of 12 municipalities in the Brazilian Amazon. Data sources included the National Malaria Control Programme of the Ministry of Health, the National Healthcare System reimbursement table, hospitalization records, primary data collected from the municipalities, and the scientific literature. The perspective was that of the Brazilian public health system, the analytical horizon extended from the onset of fever until the diagnostic result was provided to the patient, and the temporal reference was the year 2006. The results were expressed as costs per adequately diagnosed case in 2006 US dollars. Sensitivity analysis was performed on key model parameters. Results: In the base case scenario, considering 92% and 95% sensitivity of thick smear microscopy for Plasmodium falciparum and Plasmodium vivax, respectively, and 100% specificity for both species, thick smear microscopy is more costly and more effective, with an incremental cost estimated at US$ 549.9 per adequately diagnosed case. In the sensitivity analysis, when the sensitivity and specificity of microscopy for P. vivax were 0.90 and 0.98, respectively, and when its sensitivity for P. falciparum was 0.83, the RDT was more cost-effective than microscopy. Conclusion: Microscopy is more cost-effective than OptiMAL® in these remote areas if its high accuracy is maintained in the field. The decision regarding the use of rapid tests for malaria diagnosis in these areas depends on the current accuracy of field microscopy.
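
The incremental cost reported above is an incremental cost-effectiveness ratio (ICER): the extra cost divided by the extra number of adequately diagnosed cases. A minimal sketch, with hypothetical figures chosen only to reproduce the arithmetic:

```python
def icer(cost_a, effect_a, cost_b, effect_b):
    """Incremental cost per additional adequately diagnosed case of B over A."""
    return (cost_b - cost_a) / (effect_b - effect_a)

# Hypothetical figures: RDT (strategy A) vs thick smear microscopy (strategy B)
print(icer(cost_a=10_000.0, effect_a=880, cost_b=15_499.9, effect_b=890))
# -> 549.99 dollars per additional adequately diagnosed case
```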

Relevance: 80.00%

Abstract:

Background: The inference of gene regulatory networks (GRNs) from large-scale expression profiles is nowadays one of the most challenging problems in Systems Biology. Many techniques and models have been proposed for this task. However, it is generally not possible to recover the original topology with great accuracy, mainly due to the short time series data in face of the high complexity of the networks and the intrinsic noise of the expression measurements. In order to improve the accuracy of entropy-based (mutual information) GRN inference methods, a new criterion function is proposed here. Results: In this paper we introduce the use of the generalized entropy proposed by Tsallis for the inference of GRNs from time series expression profiles. The inference process is based on a feature selection approach, with the conditional entropy applied as the criterion function. To assess the proposed methodology, the algorithm is applied to recover the network topology from temporal expression data generated by an artificial gene network (AGN) model as well as from the DREAM challenge. The adopted AGN is based on theoretical models of complex networks, and its gene transfer functions are obtained by random draws from the set of possible Boolean functions, thus creating its dynamics. The DREAM time series data, in turn, vary in network size, and their topologies are based on real networks; the dynamics are generated by continuous differential equations with noise and perturbation. By adopting both data sources, it is possible to estimate the average quality of the inference with respect to different network topologies, transfer functions and network sizes. Conclusions: A remarkable improvement in accuracy was observed in the experimental results, with the non-Shannon entropy reducing the number of false connections in the inferred topology. The best value obtained for the free parameter of the Tsallis entropy was on average in the range 2.5 <= q <= 3.5 (hence, subextensive entropy), which opens new perspectives for GRN inference methods based on information theory and for investigating the nonextensivity of such networks. The inference algorithm and criterion function proposed here were implemented and included in the DimReduction software, which is freely available at http://sourceforge.net/projects/dimreduction and http://code.google.com/p/dimreduction/.
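
For readers unfamiliar with the criterion, a minimal sketch of the Tsallis entropy and a conditional-entropy criterion over discretized expression values follows; this is not the DimReduction implementation, and the tiny arrays are placeholders.

```python
import numpy as np

def tsallis(p, q=2.5):
    """Tsallis entropy S_q = (1 - sum(p^q)) / (q - 1); tends to Shannon as q -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if abs(q - 1.0) < 1e-12:
        return -np.sum(p * np.log(p))          # Shannon limit
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def conditional_tsallis(x, y, q=2.5):
    """H_q(Y | X) = sum over x of P(x) * S_q(Y | X = x), for discrete arrays."""
    x, y = np.asarray(x), np.asarray(y)
    h = 0.0
    for xv in np.unique(x):
        mask = x == xv
        _, counts = np.unique(y[mask], return_counts=True)
        h += mask.mean() * tsallis(counts / counts.sum(), q)
    return h

# Feature selection idea: keep the predictor gene x that minimizes H_q(target | x)
x = np.array([0, 0, 1, 1, 0, 1])
y = np.array([0, 0, 1, 1, 0, 0])
print(conditional_tsallis(x, y, q=3.0))
```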

Relevance: 80.00%

Abstract:

Among several sources of process variability, valve friction and inadequate controller tuning are believed to be two of the most prevalent. Friction quantification methods can be applied to the development of model-based compensators or to diagnose valves that need repair, whereas accurate process models can be used in controller retuning. This paper extends existing methods that jointly estimate the friction and process parameters, so that a nonlinear structure is adopted to represent the process model. The developed estimation algorithm is tested with three different data sources: a simulated first-order-plus-dead-time process, a hybrid setup (composed of a real valve and a simulated pH neutralization process) and three industrial datasets corresponding to real control loops. The results demonstrate that friction is accurately quantified and that "good" process models are estimated in several situations. Furthermore, when a nonlinear process model is considered, the proposed extension presents significant advantages: (i) greater accuracy in friction quantification and (ii) reasonable estimates of the nonlinear steady-state characteristics of the process.
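
A minimal sketch of the kind of simulated case mentioned above, a first-order-plus-dead-time (FOPDT) process driven by a valve with a crude deadband friction nonlinearity, is shown below; parameter values are hypothetical, and the paper's actual contribution, the joint estimation of friction and process parameters from loop data, is not reproduced here.

```python
import numpy as np

K, tau, theta, dt = 2.0, 10.0, 3.0, 1.0   # gain, time constant, dead time, step size
d = 0.5                                    # friction deadband on valve travel

def simulate(op):
    """Valve position under a crude stick/slip deadband, then the FOPDT response."""
    mv = np.zeros_like(op)                 # actual valve position
    for k in range(1, op.size):
        # valve only moves when the requested change exceeds the deadband
        mv[k] = op[k] if abs(op[k] - mv[k - 1]) > d else mv[k - 1]
    y, delay = np.zeros_like(op), int(theta / dt)
    for k in range(1, op.size):
        u = mv[k - delay] if k >= delay else 0.0
        y[k] = y[k - 1] + dt / tau * (K * u - y[k - 1])   # Euler step of the FOPDT model
    return y

op = np.concatenate([np.zeros(5), np.ones(45)])  # step in controller output
print(simulate(op)[-1])                          # approaches K * 1.0 = 2.0
```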

Relevance: 40.00%

Abstract:

Due to the imprecise nature of biological experiments, biological data are often characterized by the presence of redundant and noisy values. This may be due to errors that occurred during data collection, such as contamination of laboratory samples. This is the case for gene expression data, where the equipment and tools currently used frequently produce noisy measurements. Machine Learning algorithms have been successfully used in gene expression data analysis. Although many Machine Learning algorithms can deal with noise, detecting and removing noisy instances from the training data set can help the induction of the target hypothesis. This paper evaluates the use of distance-based pre-processing techniques for noise detection in gene expression data classification problems. The evaluation analyzes how effectively the investigated techniques remove noisy data, measured by the accuracy obtained by different Machine Learning classifiers on the pre-processed data.
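
One representative distance-based technique of this family is the edited nearest-neighbour filter, which discards training instances whose nearest neighbours disagree with their label; the sketch below, with toy data, illustrates the idea without claiming it is the exact set of techniques evaluated in the paper.

```python
import numpy as np

def enn_filter(X, y, k=3):
    """Keep only instances whose k nearest neighbours agree with their label."""
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    keep = []
    for i in range(len(X)):
        dist = np.linalg.norm(X - X[i], axis=1)
        dist[i] = np.inf                        # exclude the instance itself
        nn = np.argsort(dist)[:k]               # indices of the k nearest neighbours
        majority = np.bincount(y[nn]).argmax()  # majority label among them
        keep.append(majority == y[i])
    keep = np.array(keep)
    return X[keep], y[keep]

X = [[0, 0], [0, 1], [1, 0], [1, 1], [5, 5], [5, 6], [6, 5], [6, 6], [0.5, 0.5]]
y = np.array([0, 0, 0, 0, 1, 1, 1, 1, 1])       # the last label is likely noise
print(enn_filter(X, y, k=3)[1])                  # the mislabelled point is removed
```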

Relevance: 40.00%

Abstract:

The protozoan parasites Giardia and Cryptosporidium have been described as important waterborne pathogens and are associated with severe gastrointestinal illness. The objective of this paper was to investigate the presence of Giardia cysts and Cryptosporidium oocysts in samples from watershed catchments and treated water sources. A total of 25 water samples were collected and examined according to EPA Method 1623 (2005), consisting of 12 drinking water and 13 raw water samples. Among the raw water samples, 46.1% were positive for Giardia cysts and 7.6% for Cryptosporidium oocysts. In finished water, 41.7% of the samples were positive for Giardia cysts and 25% for Cryptosporidium oocysts. Concentrations of Giardia cysts in raw water samples ranged from "not detected" to 0.1 cysts/L, whereas concentrations of Cryptosporidium oocysts ranged from "not detected" to 0.1 oocysts/L. In finished water, Giardia concentrations ranged from "not detected" to 0.06 cysts/L, and Cryptosporidium oocyst concentrations remained low in the samples analyzed. Nevertheless, the results of this study highlight the need to monitor these organisms in both raw and drinking water.

Relevance: 40.00%

Abstract:

This work proposes a method based on pre-processing and data mining with the objective of identifying harmonic current sources in residential consumers. In addition, the methodology can also be applied to identify linear and nonlinear loads. It should be emphasized that the entire database was obtained through laboratory assays, i.e., real data were acquired from residential loads. The residential system created in the laboratory was fed by a configurable power source, and the loads and power quality analyzers were placed at its output (all measurements were stored in a microcomputer). The data were then submitted to pre-processing based on attribute selection techniques in order to reduce the complexity of identifying the loads. A new database was generated retaining only the selected attributes, and Artificial Neural Networks were trained to identify the loads. To validate the proposed methodology, the loads were fed both under ideal conditions (without harmonics) and with harmonic voltages within pre-established limits. These limits are in accordance with IEEE Std. 519-1992 and PRODIST (the energy distribution procedures adopted by Brazilian utilities). The results validate the proposed methodology and furnish a method that can serve as an alternative to conventional methods.
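
A minimal sketch of the described pipeline, attribute selection followed by a neural network classifier, is given below; the attribute names and data are synthetic placeholders standing in for the laboratory measurements, and scikit-learn is used for brevity.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
# Hypothetical attributes per sample: [current THD, 3rd harmonic, 5th harmonic, power factor]
X = rng.normal(size=(200, 4))
y = (X[:, 0] + X[:, 2] > 0).astype(int)          # 1 = nonlinear load, 0 = linear (toy rule)

selector = SelectKBest(f_classif, k=2).fit(X, y)  # keep only the 2 most relevant attributes
X_sel = selector.transform(X)

clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(X_sel[:150], y[:150])                     # train the neural network on the reduced data
print(clf.score(X_sel[150:], y[150:]))            # hold-out load-identification accuracy
```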

Relevance: 40.00%

Abstract:

Functional MRI (fMRI) data often have a low signal-to-noise ratio (SNR) and are contaminated by strong interference from other physiological sources. A promising tool for extracting signals, even under low-SNR conditions, is blind source separation (BSS), or independent component analysis (ICA). BSS is based on the assumption that the detected signals are a mixture of a number of independent source signals that are linearly combined via an unknown mixing matrix; it seeks to determine the mixing matrix in order to recover the source signals based on principles of statistical independence. In most cases, extraction of all sources is unnecessary; instead, a priori information can be applied to extract only the signal of interest. Herein we propose an algorithm based on a variation of ICA, called Dependent Component Analysis (DCA), in which the signal of interest is extracted using a time delay obtained from an autocorrelation analysis. We applied this method to fMRI data, aiming to find the hemodynamic response that follows neuronal activation from auditory stimulation in human subjects. The method localized significant signal modulation in cortical regions corresponding to the primary auditory cortex. The results obtained by DCA were also compared to those of the General Linear Model (GLM), the most widely used method for analyzing fMRI datasets.
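
As a rough illustration of extraction guided by temporal structure, the sketch below recovers, from a synthetic mixture, the component with the strongest lag-tau autocovariance via a whitening-plus-lagged-covariance eigendecomposition (an AMUSE-style relative of the DCA idea, not the paper's algorithm; there, the delay comes from an autocorrelation analysis of the expected response).

```python
import numpy as np

def extract_delayed(X, tau):
    """X: (channels, samples). Returns components sorted by lag-tau structure."""
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(X))                   # whiten the observations
    Z = (E @ np.diag(d ** -0.5) @ E.T) @ X
    # Symmetrized lagged covariance; its eigenvectors give the unmixing rotation
    C = Z[:, :-tau] @ Z[:, tau:].T / (Z.shape[1] - tau)
    _, V = np.linalg.eigh((C + C.T) / 2)
    return V.T @ Z                                      # last row: strongest lag-tau structure

t = np.arange(2000)
s1 = np.sin(2 * np.pi * t / 100)                        # periodic "response" source
s2 = np.random.default_rng(0).normal(size=t.size)       # noise source
X = np.array([[1.0, 0.6], [0.4, 1.0]]) @ np.vstack([s1, s2])   # synthetic mixture

Y = extract_delayed(X, tau=100)                         # tau from the autocorrelation peak
print(abs(np.corrcoef(Y[-1], s1)[0, 1]).round(2))       # ~1.0: the source is recovered
```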

Relevance: 40.00%

Abstract:

This work describes two similar methods for calculating gamma transition intensities from multidetector coincidence measurements. In the first, applicable to experiments where the angular correlation function is explicitly fitted, the normalization parameter from this fit is used to determine the gamma transition intensities. In the second, which can be used in both angular correlation and DCO measurements, the spectra obtained for all detector pairs are summed up in order to obtain the best detection statistics possible, and the analysis of the resulting bidimensional spectrum is used to calculate the transition intensities; in this method, the summation of data corresponding to different angles minimizes the influence of the angular correlation coefficients. Both methods are tested in the calculation of intensities for well-known transitions from a (152)Eu standard source, as well as of intensities obtained in beta-decay experiments with (193)Os and (155)Sm sources, yielding excellent results in all cases.
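
For reference, the fit in the first method presumably takes the textbook gamma-gamma angular correlation form sketched below (notation assumed here, not quoted from the paper); the fitted normalization N scales with the coincidence intensity and, after detector-efficiency correction, yields the transition intensity.

```latex
% Textbook gamma-gamma angular correlation form (notation assumed):
W(\theta) = N \left[ 1 + a_{2}\, P_{2}(\cos\theta) + a_{4}\, P_{4}(\cos\theta) \right]
% with P_k the Legendre polynomials; the normalization scales as
%   N \propto \varepsilon_{1}\, \varepsilon_{2}\, I_{\gamma_{1}\gamma_{2}},
% so the transition intensity follows from N after efficiency correction.
```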

Relevance: 30.00%

Abstract:

The purpose of this study was to compare the polymerization shrinkage stress of composite resins (microfilled, microhybrid and hybrid) photoactivated by quartz-tungsten halogen light (QTH) and light-emitting diode (LED) units. Glass rods (5.0 mm x 5.0 cm) were fabricated and had one of their surfaces air-abraded with aluminum oxide and coated with a layer of an adhesive system, which was photoactivated with the QTH unit. The glass rods were vertically assembled, in pairs, in a universal testing machine and the composites were applied to the lower rod. The upper rod was brought to within 2 mm, and an extensometer was attached to the rods. Twenty specimens of each composite were polymerized, with either the QTH (n=10) or the LED (n=10) curing unit. Polymerization was carried out using two devices positioned on opposite sides, which were activated simultaneously for 40 s. Shrinkage stress was analyzed twice: shortly after polymerization (t40s) and 10 min later (t10min). Data were analyzed statistically by two-way ANOVA and Tukey's test (α=0.05). The shrinkage stress of all composites was higher at t10min than at t40s, regardless of the activation source. The microfilled composite resin showed lower shrinkage stress values than the other composite resins. The light source had no influence on the shrinkage stress of the hybrid and microhybrid composite resins; for the microfilled composite, an influence was observed only at t10min. It may be concluded that the composition of the composite resin is the factor with the strongest influence on shrinkage stress.