951 results for analyses


Relevance: 20.00%

Abstract:

Integrated master's dissertation in Biomedical Engineering (specialization in Clinical Engineering)

Relevance: 20.00%

Abstract:

OBJECTIVE: The aim of this study was to compare the performance of the current conventional Pap smear with liquid-based cytology (LBC) preparations. STUDY DESIGN: Women routinely undergoing cytopathological and histopathological examinations at Fundação Oncocentro de São Paulo (FOSP) were recruited for LBC. Conventional smears were analyzed from women from other areas of the State of São Paulo with similar sociodemographic characteristics. RESULTS: A total of 218,594 cases were analyzed, consisting of 206,999 conventional smears and 11,595 LBC preparations. Among the conventional smears, 3.0% were unsatisfactory, whereas unsatisfactory LBC preparations accounted for only 0.3%. The frequency of ASC-H (atypical squamous cells - cannot exclude high-grade squamous intraepithelial lesion) did not differ between the two methods. In contrast, ASC-US (atypical squamous cells of undetermined significance) was almost twice as frequent in LBC as in conventional smears (2.9% vs. 1.6%). The two methods detected an equal percentage of high-grade squamous intraepithelial lesions, but low-grade squamous intraepithelial lesions were observed significantly more often in LBC preparations than in conventional smears (2.2% vs. 0.7%). The positivity index increased markedly, from 3.0% with conventional smears to 5.7% with LBC. CONCLUSIONS: LBC performed better than conventional smears, and we are confident that LBC can improve public health strategies aimed at reducing cervical lesions through prevention programs.
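
The abstract does not state which statistical test was used to compare the two preparation methods. As a minimal illustrative sketch only (not the authors' analysis), the unsatisfactory-preparation rates could be compared with a chi-square test on the 2x2 counts implied by the reported percentages:

```python
# Illustrative only: compares the reported unsatisfactory-preparation rates
# (3.0% of 206,999 conventional smears vs. 0.3% of 11,595 LBC preparations).
# The counts are reconstructed from the rounded percentages in the abstract,
# and the choice of test is an assumption, not the authors' stated method.
from scipy.stats import chi2_contingency

n_conv, n_lbc = 206_999, 11_595
unsat_conv = round(0.030 * n_conv)   # ~6,210 unsatisfactory conventional smears
unsat_lbc = round(0.003 * n_lbc)     # ~35 unsatisfactory LBC preparations

table = [
    [unsat_conv, n_conv - unsat_conv],
    [unsat_lbc, n_lbc - unsat_lbc],
]
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2e}")
```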

Relevance: 20.00%

Abstract:

Background: Clinical in-stent restenosis (CISR) is the main limitation of coronary angioplasty with stent implantation. Objective: To describe the clinical and angiographic characteristics of CISR and the outcomes over a minimum follow-up of 12 months after its diagnosis and treatment. Methods: In 110 consecutive patients with CISR, we analyzed the clinical presentation, angiographic characteristics, treatment, and the combined primary outcome (cardiovascular death, nonfatal acute myocardial infarction [AMI]) and combined secondary outcome (unstable angina with hospitalization, target vessel revascularization, and target lesion revascularization) during a minimum follow-up of one year. Results: Mean age was 61 ± 11 years (68.2% males). The clinical presentation was an acute coronary syndrome (ACS) in 62.7% of cases, and the ISR was proliferative in 34.5%. CISR was treated with implantation of drug-eluting stents (DES) in 36.4%, bare-metal stents (BMS) in 23.6%, myocardial revascularization surgery in 18.2%, balloon angioplasty in 15.5%, and clinical treatment in 6.4%. During a median follow-up of 19.7 months, the primary outcome occurred in 18 patients, including 6 (5.5%) deaths and 13 (11.8%) AMI events. Twenty-four patients presented a secondary outcome. Predictors of the primary outcome were treatment of CISR with DES (HR = 4.36 [1.44–12.85]; p = 0.009) and clinical treatment of CISR (HR = 10.66 [2.53–44.87]; p = 0.001). Treatment of CISR with BMS (HR = 4.08 [1.75–9.48]; p = 0.001) and clinical treatment (HR = 6.29 [1.35–29.38]; p = 0.019) emerged as predictors of the secondary outcome. Conclusion: Patients with CISR most often present with an ACS and have a high frequency of adverse events during medium-term follow-up.
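
The hazard ratios and confidence intervals above come from survival modelling. The sketch below shows, in general terms, how such predictors might be fitted with a Cox proportional-hazards model; the data frame, column names, and synthetic values are hypothetical placeholders, not the study's dataset or its exact model specification.

```python
# Minimal sketch of a Cox proportional-hazards fit of the kind that yields
# hazard ratios with 95% CIs. All data below are synthetic placeholders.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 110  # same order of magnitude as the study cohort
df = pd.DataFrame({
    "treated_with_des": rng.integers(0, 2, n),    # hypothetical indicator covariates
    "clinical_treatment": rng.integers(0, 2, n),
})
baseline = rng.exponential(scale=24.0, size=n)    # synthetic follow-up times (months)
df["months_to_event"] = baseline / np.exp(0.5 * df["treated_with_des"])
df["primary_outcome"] = rng.integers(0, 2, n)     # 1 = event (e.g. CV death or AMI)

cph = CoxPHFitter()
cph.fit(df, duration_col="months_to_event", event_col="primary_outcome")
cph.print_summary()  # the exp(coef) column is the hazard ratio for each covariate
```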

Relevance: 20.00%

Abstract:

We present experimental and theoretical analyses of the data requirements of haplotype inference algorithms. Our experiments include a broad range of problem sizes under two standard models of tree distribution and were designed to yield statistically robust results despite the size of the sample space. Our results validate Gusfield's conjecture that a population size of n log n is required to give (with high probability) sufficient information to deduce the n haplotypes and their complete evolutionary history. These experimental findings motivated complementary theoretical bounds on the population size. We also analyze the population size required to deduce some fixed fraction of the evolutionary history of a set of n haplotypes, and we establish linear bounds on the required sample size; these linear bounds are also shown theoretically.
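
As a rough illustration of the two scaling regimes discussed above (an n log n population size for recovering the complete history versus linear bounds for a fixed fraction of it), the sketch below tabulates both quantities for a few values of n. The constants c1 and c2 are arbitrary assumptions, since the abstract reports only the asymptotic forms.

```python
# Illustration of the two sample-size regimes discussed above.
# c1 and c2 are arbitrary placeholders; the abstract gives the asymptotic
# forms (n log n vs. linear) but not the constants involved.
import math

c1, c2 = 1.0, 1.0

def full_history_samples(n: int) -> float:
    """Population size conjectured to suffice for the complete history: ~ c1 * n * ln(n)."""
    return c1 * n * math.log(n)

def partial_history_samples(n: int) -> float:
    """Population size for a fixed fraction of the history: linear, ~ c2 * n."""
    return c2 * n

for n in (10, 100, 1_000, 10_000):
    print(f"n = {n:6d}   full history ~ {full_history_samples(n):10.0f}   "
          f"fixed fraction ~ {partial_history_samples(n):8.0f}")
```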

Relevance: 20.00%

Abstract:

MicroRNAs (miRNAs) have been shown to play important roles in both brain development and the regulation of adult neural cell functions. However, a systematic analysis of brain miRNA functions has been hindered by a lack of comprehensive information regarding the distribution of miRNAs in neuronal versus glial cells. To address this issue, we performed microarray analyses of miRNA expression in the four principal cell types of the CNS (neurons, astrocytes, oligodendrocytes, and microglia) using primary cultures from postnatal day 1 rat cortex. These analyses revealed that neural miRNA expression is highly cell-type specific, with 116 of the 351 miRNAs examined being differentially expressed fivefold or more across the four cell types. We also demonstrate that individual neuron-enriched or neuron-diminished miRNAs had a significant impact on the specification of neuronal phenotype: overexpression of the neuron-enriched miRNAs miR-376a and miR-434 increased the differentiation of neural stem cells into neurons, whereas the opposite effect was observed for the glia-enriched miRNAs miR-223, miR-146a, miR-19, and miR-32. In addition, glia-enriched miRNAs were shown to inhibit aberrant glial expression of neuronal proteins and phenotypes, as exemplified by miR-146a, which inhibited neuroligin 1-dependent synaptogenesis. This study identifies new nervous system functions of specific miRNAs, reveals the global extent to which the brain may use differential miRNA expression to regulate neural cell-type-specific phenotypes, and provides an important data resource that defines the compartmentalization of brain miRNAs across different cell types.
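
The fivefold cell-type-specificity criterion described above can be expressed as a simple filter on a miRNA-by-cell-type expression matrix. The sketch below is a generic illustration with invented values (including the placeholder name "miR-x"), not the authors' microarray pipeline or normalization procedure.

```python
# Generic sketch of the "differentially expressed fivefold or more across
# the four cell types" criterion. Expression values are invented; a real
# analysis would start from normalized microarray intensities.
import pandas as pd

expr = pd.DataFrame(
    {
        "neuron": [120.0, 15.0, 400.0],
        "astrocyte": [20.0, 14.0, 390.0],
        "oligodendrocyte": [18.0, 16.0, 410.0],
        "microglia": [22.0, 90.0, 405.0],
    },
    index=["miR-376a", "miR-223", "miR-x"],  # miR-x is a placeholder name
)

fold_range = expr.max(axis=1) / expr.min(axis=1)
cell_type_specific = expr[fold_range >= 5]
print(cell_type_specific)  # miRNAs whose max/min expression ratio is >= 5
```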

Relevance: 20.00%

Abstract:

Eukaryotic cells generate energy in the form of ATP through a network of mitochondrial complexes and electron carriers known as the oxidative phosphorylation system. In mammals, mitochondrial complex I (CI) is the largest component of this system, comprising 45 different subunits encoded by mitochondrial and nuclear DNA. Humans carrying mutations in the nuclear gene NDUFS4, which encodes the CI subunit NADH dehydrogenase (ubiquinone) Fe-S protein 4, typically suffer from Leigh syndrome, a neurodegenerative disease with onset in infancy or early childhood. Mitochondria from NDUFS4 patients usually lack detectable NDUFS4 protein and show a CI stability/assembly defect. Here, we describe a recessive mouse phenotype caused by the insertion of a transposable element into Ndufs4, identified by a novel combined linkage and expression analysis. Designated Ndufs4(fky), the mutation leads to aberrant transcript splicing and absence of NDUFS4 protein in all tested tissues of homozygous mice. Physical and behavioral symptoms displayed by Ndufs4(fky/fky) mice include temporary fur loss, growth retardation, unsteady gait, and abnormal body posture when suspended by the tail. Analysis of CI in Ndufs4(fky/fky) mice using blue native PAGE revealed the presence of a faster-migrating crippled complex. This crippled CI was shown to lack subunits of the "N assembly module", which contains the NADH binding site, but to contain two assembly factors not present in intact CI. Metabolomic analysis of the blood by tandem mass spectrometry showed increased hydroxyacylcarnitine species, implying that the CI defect leads to an imbalanced NADH/NAD(+) ratio that inhibits mitochondrial fatty acid β-oxidation.

Relevance: 20.00%

Abstract:

Generalized multiresolution analyses (GMRAs) are increasing sequences of subspaces of a Hilbert space H that fail to be multiresolution analyses (MRAs) in the sense of wavelet theory because the core subspace does not have an orthonormal basis generated by a fixed scaling function. Previous authors have studied a multiplicity function m which, loosely speaking, measures the failure of the GMRA to be an MRA. When the Hilbert space H is L2(Rn), the possible multiplicity functions have been characterized by Baggett and Merrill. Here we start with a function m satisfying a consistency condition that is known to be necessary, and we build a GMRA in an abstract Hilbert space with multiplicity function m.
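
For orientation only, the kind of consistency condition referred to above can be written down explicitly in the classical one-dimensional dyadic case. The display below is a sketch of that special case (under the usual identification of the dual of the integer translation group with [0,1)), not the general abstract-Hilbert-space condition used in the paper.

```latex
% One-dimensional dyadic special case, for orientation only.
% V_0 is the core subspace, V_1 = D V_0 \supseteq V_0, and m is the
% multiplicity function of the integer-translation representation on V_0.
% Because V_1 contains V_0, the multiplicity function of V_1, which works
% out to m(x/2) + m((x+1)/2), must dominate that of V_0:
\[
  m(x) \;\le\; m\!\left(\frac{x}{2}\right) + m\!\left(\frac{x+1}{2}\right)
  \qquad \text{for a.e. } x \in [0,1).
\]
```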

Relevance: 20.00%

Abstract:

Rigorous organization and quality control (QC) are necessary to facilitate successful genome-wide association meta-analyses (GWAMAs) of statistics aggregated across multiple genome-wide association studies. This protocol provides guidelines for (i) organizational aspects of GWAMAs and (ii) QC at the study file level, the meta-level across studies, and the meta-analysis output level. Real-world examples highlight issues experienced, and solutions developed, by the GIANT Consortium, which has conducted meta-analyses including data from 125 studies comprising more than 330,000 individuals. We provide a general protocol for conducting GWAMAs and carrying out QC to minimize errors and to guarantee maximum use of the data. We also include details on the use of a powerful and flexible software package called EasyQC. Precise timings will be greatly influenced by consortium size; for consortia of a size comparable to the GIANT Consortium, this protocol takes a minimum of about 10 months to complete.
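
The sketch below gives a generic flavour of the study-file-level QC described in the protocol, such as range checks and the comparison of reported p-values with those implied by the effect estimate and standard error. It is not EasyQC itself (which has its own configuration-file workflow), and the column names and values are hypothetical stand-ins for one study's summary-statistics file.

```python
# Generic illustration of study-file-level QC checks; hypothetical data.
import numpy as np
import pandas as pd
from scipy.stats import norm

sumstats = pd.DataFrame({
    "SNP":  ["rs1", "rs2", "rs3", "rs4"],
    "EAF":  [0.31, 0.02, 1.20, 0.45],      # effect-allele frequency
    "BETA": [0.021, -0.150, 0.010, 0.500],
    "SE":   [0.010, 0.055, 0.012, 0.020],
    "P":    [0.036, 0.006, 0.400, 0.900],  # last row: inconsistent with BETA/SE
})

# 1. Remove variants with missing or out-of-range key fields.
clean = sumstats.dropna(subset=["EAF", "BETA", "SE", "P"])
clean = clean[(clean["EAF"] > 0) & (clean["EAF"] < 1) & (clean["SE"] > 0)]

# 2. P-Z check: reported p-values should agree with those implied by BETA/SE.
z = clean["BETA"] / clean["SE"]
p_from_z = 2 * norm.sf(np.abs(z))
discordant = np.abs(np.log10(p_from_z) - np.log10(clean["P"])) > 0.5
print(clean.loc[discordant, "SNP"].tolist())  # variants flagged for follow-up
```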

Relevance: 20.00%

Abstract:

OBJECTIVE: To investigate the planning of subgroup analyses in protocols of randomised controlled trials and the agreement with corresponding full journal publications. DESIGN: Cohort of protocols of randomised controlled trials and subsequent full journal publications. SETTING: Six research ethics committees in Switzerland, Germany, and Canada. DATA SOURCES: 894 protocols of randomised controlled trials involving patients, approved by participating research ethics committees between 2000 and 2003, and 515 subsequent full journal publications. RESULTS: Of 894 protocols of randomised controlled trials, 252 (28.2%) included one or more planned subgroup analyses. Of those, 17 (6.7%) provided a clear hypothesis for at least one subgroup analysis, 10 (4.0%) anticipated the direction of a subgroup effect, and 87 (34.5%) planned a statistical test for interaction. Industry sponsored trials more often planned subgroup analyses than investigator sponsored trials (195/551 (35.4%) v 57/343 (16.6%), P<0.001). Of 515 identified journal publications, 246 (47.8%) reported at least one subgroup analysis. In 81 (32.9%) of the 246 publications reporting subgroup analyses, the authors stated that the subgroup analyses were prespecified, but in 28 (34.6%) of these cases this was not supported by the corresponding protocol. In 86 publications the authors claimed a subgroup effect, but only 36 (41.9%) of the corresponding protocols reported a planned subgroup analysis. CONCLUSIONS: Subgroup analyses are insufficiently described in the protocols of randomised controlled trials submitted to research ethics committees, and investigators rarely specify the anticipated direction of subgroup effects. More than one third of statements in publications of randomised controlled trials about subgroup prespecification had no documentation in the corresponding protocols. Definitive judgments regarding credibility of claimed subgroup effects are not possible without access to protocols and analysis plans of randomised controlled trials.
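
Only about a third of the planned subgroup analyses included a statistical test for interaction. As a minimal sketch of what such a prespecified test looks like (with entirely hypothetical variable names and simulated data, not any trial from this cohort), an interaction can be specified as a treatment-by-subgroup product term in a regression model:

```python
# Minimal sketch of a treatment-by-subgroup interaction test. Data and
# variable names are hypothetical; a real trial analysis would follow the
# trial's own statistical analysis plan.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 400
df = pd.DataFrame({
    "treatment": rng.integers(0, 2, n),
    "subgroup": rng.integers(0, 2, n),   # e.g. sex or diabetes status
})
logit = -0.5 + 0.4 * df["treatment"] + 0.2 * df["subgroup"]
df["outcome"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))  # simulated binary outcome

model = smf.logit("outcome ~ treatment * subgroup", data=df).fit(disp=0)
print(model.summary())  # the 'treatment:subgroup' row is the interaction test
```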

Relevance: 20.00%

Abstract:

In previous work we have applied the environmental multi-region input-output (MRIO) method proposed by Turner et al (2007) to examine the ‘CO2 trade balance’ between Scotland and the Rest of the UK. In McGregor et al (2008) we construct an interregional economy-environment input-output (IO) and social accounting matrix (SAM) framework that allows us to investigate methods of attributing responsibility for pollution generation in the UK at the regional level. This facilitates analysis of the nature and significance of environmental spillovers and the existence of an environmental ‘trade balance’ between regions. While the existence of significant data problems means that the quantitative results of this study should be regarded as provisional, we argue that the use of such a framework allows us to begin to consider questions such as the extent to which a devolved authority like the Scottish Parliament can and should be responsible for contributing to national targets for reductions in emissions levels (e.g. the UK commitment to the Kyoto Protocol) when it is limited in the way it can control emissions, particularly with respect to changes in demand elsewhere in the UK. However, while such analysis is useful in terms of accounting for pollution flows in the single time period that the accounts relate to, it is limited when the focus is on modelling the impacts of any marginal change in activity. This is because a conventional demand-driven IO model assumes an entirely passive supply side in the economy (i.e. all supply is infinitely elastic) and is further restricted by the assumption of universal Leontief (fixed proportions) technology implied by the use of the A and multiplier matrices. In this paper we argue that where analysis of marginal changes in activity is required, a more flexible interregional computable general equilibrium (CGE) approach, which models behavioural relationships in a more realistic and theory-consistent manner, is more appropriate and informative. To illustrate our analysis, we compare the results of introducing a positive demand stimulus in the UK economy using both IO and CGE interregional models of Scotland and the rest of the UK. In the case of the latter, we demonstrate how more theory-consistent modelling of both demand and supply side behaviour at the regional and national levels affects model results, including the impact on the interregional CO2 ‘trade balance’.
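
The fixed-proportions (Leontief) technology mentioned above enters through the demand-driven IO identity x = (I - A)^(-1) f. The sketch below illustrates how a demand stimulus translates into output and CO2 changes in such a model; the two-region coefficient matrix and emission intensities are invented numbers for illustration, not the Scotland-RUK accounts used in the paper.

```python
# Demand-driven interregional IO sketch: output and CO2 response to a demand
# stimulus under fixed-proportions (Leontief) technology. The coefficient
# matrix and emission intensities are illustrative placeholders.
import numpy as np

# 2x2 interregional technical-coefficient matrix A (rows/cols: Scotland, RUK)
A = np.array([[0.20, 0.05],
              [0.10, 0.30]])
leontief_inverse = np.linalg.inv(np.eye(2) - A)

delta_f = np.array([0.0, 100.0])      # demand stimulus applied to RUK final demand
delta_x = leontief_inverse @ delta_f  # gross output response in both regions

co2_intensity = np.array([0.8, 0.6])  # tonnes CO2 per unit of gross output
delta_co2 = co2_intensity * delta_x   # pollution generated in each region
print(delta_x, delta_co2)
```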

Relevance: 20.00%

Abstract:

The aim of the paper is to identify the added value from using general equilibrium techniques to consider the economy-wide impacts of increased efficiency in household energy use. We take as an illustrative case study the effect of a 5% improvement in household energy efficiency on the UK economy. This impact is measured through simulations using models with increasing degrees of endogeneity, all calibrated on a common data set. That is to say, we calculate rebound effects for models that progress from the most basic partial equilibrium approach to a fully specified general equilibrium treatment. The size of the rebound effect on total energy use depends upon: the elasticity of substitution of energy in household consumption; the energy intensity of the different elements of household consumption demand; and the impact of changes in income, economic activity and relative prices. A general equilibrium model is required to capture these final three impacts.
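
The rebound effect referred to above is conventionally reported as the share of the potential (engineering) energy saving that is eroded by behavioural and economy-wide responses. A minimal sketch of that calculation follows; the numbers are invented, whereas in the paper the actual energy change would come from the partial equilibrium, IO, or CGE simulations.

```python
# Minimal rebound-effect calculation with invented numbers.
efficiency_gain = 0.05                # 5% improvement in household energy efficiency
baseline_household_energy = 100.0     # arbitrary units
potential_saving = efficiency_gain * baseline_household_energy  # engineering saving

actual_total_energy_change = -3.2     # simulated change in total energy use (invented)
actual_saving = -actual_total_energy_change

rebound = 1.0 - actual_saving / potential_saving
print(f"rebound effect = {rebound:.0%}")  # share of the potential saving not realised
```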