947 results for gee analyses


Relevance: 20.00%

Abstract:

Collections of solid particles from the Earth's stratosphere by high-flying aircraft have been reported since 1965, with the initial primary objective of understanding the nature of the aerosol layer that occurs in the lower stratosphere. With the advent of efficient collection procedures and sophisticated electron- and ion-beam techniques, the primary aim of current stratospheric collections has been to study specific particle types that are extraterrestrial in origin and have survived atmospheric entry processes. The collection program provided by NASA at Johnson Space Center (JSC) has conducted many flights over the past 4 years and retrieved a total of 99 collection surfaces (flags) suitable for detailed study. Most of these collections are part of dedicated flights and have occurred during volcanically quiescent periods, although solid particles from the El Chichon eruptions have also been collected. Over 800 individual particles (or representative samples from larger aggregates) have been picked from these flags, examined in a preliminary fashion by SEM and EDS, and cataloged in a manner suitable for selection and study by the wider scientific community.

Relevance: 20.00%

Abstract:

Preliminary data are presented on a detailed statistical analysis of k-factor determination for a single class of minerals (amphiboles) that contain a wide range of element concentrations. These amphiboles are homogeneous, contain few (if any) subsolidus microstructures and can be readily prepared for thin-film analysis. In previous studies, element loss during the period of irradiation has been assumed negligible for the determination of k-factors. Since this phenomenon may be significant for certain mineral systems, we also report on the effect of temperature on k-factor determination for various elements using small probe sizes (approx. 20 nm).
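The k-factors at issue here are the Cliff-Lorimer sensitivity factors of thin-film EDS analysis, which relate measured X-ray intensity ratios to concentration ratios. A minimal sketch of how such a factor and its scatter can be derived from replicate spectra of a standard of known composition; the function name and the numbers are illustrative, not taken from the paper:

```python
import numpy as np

def cliff_lorimer_k(conc_a, conc_b, counts_a, counts_b):
    """Cliff-Lorimer relation C_A/C_B = k_AB * (I_A/I_B), rearranged to give
    k_AB from a standard of known composition and measured peak intensities."""
    return (conc_a / conc_b) * (np.asarray(counts_b) / np.asarray(counts_a))

# Replicate spectra from a homogeneous amphibole standard (illustrative numbers)
i_si = np.array([10520, 10390, 10610, 10480])   # Si K-alpha counts
i_fe = np.array([2310, 2270, 2350, 2295])       # Fe K-alpha counts
k_fe_si = cliff_lorimer_k(conc_a=0.12, conc_b=0.25, counts_a=i_fe, counts_b=i_si)

print(f"k(Fe/Si) = {k_fe_si.mean():.3f} +/- {k_fe_si.std(ddof=1):.3f}")
```

Repeating this over acquisition conditions (probe size, temperature, irradiation time) gives the kind of statistical comparison of k-factors that the abstract describes.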

Relevance: 20.00%

Abstract:

Historical information can be used, in addition to pedigree, traits and genotypes, to map quantitative trait loci (QTL) in general populations via maximum likelihood estimation of variance components. This analysis is known as linkage disequilibrium (LD) and linkage mapping, because it exploits both linkage in families and LD at the population level. The search for QTL in the wild population of Soay sheep on St. Kilda is a proof of principle. We analysed the data from a previous study and confirmed some of the QTLs reported. The most striking result was the confirmation of a QTL affecting birth weight that had been reported using association tests but not when using linkage-based analyses. Copyright © Cambridge University Press 2010.
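A minimal sketch of the variance-component likelihood that underlies this kind of LD-and-linkage QTL mapping, assuming a phenotype vector and an identity-by-descent (IBD) matrix at the test position are already available. The helper names are hypothetical, and a real analysis would also fit fixed effects and a polygenic relationship matrix:

```python
import numpy as np
from scipy.optimize import minimize

def loglik(log_var, y, K):
    """ML log-likelihood of y = mu + u + e, with u ~ N(0, s2_u * K), e ~ N(0, s2_e * I)."""
    s2_u, s2_e = np.exp(log_var)                 # log scale keeps variances positive
    n = len(y)
    V = s2_u * K + s2_e * np.eye(n)
    ones = np.ones(n)
    mu = (ones @ np.linalg.solve(V, y)) / (ones @ np.linalg.solve(V, ones))  # GLS mean
    r = y - mu
    _, logdet = np.linalg.slogdet(V)
    return -0.5 * (n * np.log(2 * np.pi) + logdet + r @ np.linalg.solve(V, r))

def qtl_lrt(y, K_ibd):
    """Likelihood-ratio statistic: QTL variance component (IBD matrix at the
    test position) versus a residual-only null model."""
    full = minimize(lambda p: -loglik(p, y, K_ibd),
                    x0=np.zeros(2), method="Nelder-Mead")
    sigma2_null = np.mean((y - y.mean()) ** 2)   # ML residual variance under the null
    null_ll = -0.5 * len(y) * (np.log(2 * np.pi * sigma2_null) + 1.0)
    return 2.0 * (-full.fun - null_ll)           # compare to a chi-square mixture (boundary case)
```

Evaluating this statistic at successive map positions and taking its maximum gives the best-supported QTL location.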

Relevance: 20.00%

Abstract:

Motivation: Unravelling the genetic architecture of complex traits requires large amounts of data, sophisticated models and large computational resources. The lack of user-friendly software incorporating all these requisites is delaying progress in the analysis of complex traits. Methods: Linkage disequilibrium and linkage analysis (LDLA) is a high-resolution gene mapping approach based on sophisticated mixed linear models, applicable to any population structure. LDLA can use population history information in addition to pedigree and molecular markers to decompose traits into genetic components. Analyses are distributed in parallel over a large public grid of computers in the UK. Results: We have demonstrated the performance of LDLA with analyses of simulated data. There are real gains in statistical power to detect quantitative trait loci when using historical information compared with traditional linkage analysis. Moreover, the use of a grid of computers significantly increases computational speed, hence allowing analyses that would have been prohibitive on a single computer. © The Author 2009. Published by Oxford University Press. All rights reserved.
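The grid computing step exploits the fact that analyses at different map positions are mutually independent. A local sketch of that divide-and-gather pattern using only the Python standard library; `analyse_position` is a hypothetical stand-in for one LDLA variance-component run, not part of the LDLA software itself:

```python
from concurrent.futures import ProcessPoolExecutor

def analyse_position(position_cm):
    """Hypothetical placeholder for a single LDLA analysis at one map position;
    in practice this would fit the mixed model and return a test statistic."""
    return {"position_cm": position_cm, "lrt": 0.0}

def scan_positions(positions_cm, workers=8):
    """Run independent per-position analyses in parallel on local cores; a
    compute grid applies the same divide-and-gather pattern across machines."""
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(analyse_position, positions_cm))

if __name__ == "__main__":
    results = scan_positions([float(p) for p in range(0, 100, 5)])
    print(len(results), "positions analysed")
```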

Relevance: 20.00%

Abstract:

There exists an important tradition of content analyses of aggression in sexually explicit material. The majority of these analyses use a definition of aggression that excludes consent. This article identifies three problems with this approach. First, it does not distinguish between aggression and some positive acts. Second, it excludes a key element of healthy sexuality. Third, it can lead to heteronormative definitions of healthy sexuality. It would be better to use a definition of aggression in our content analyses, such as Baron and Richardson's (1994), that includes a consideration of consent. A number of difficulties have been identified with attending to consent, but this article offers solutions to each of these.

Relevance: 20.00%

Abstract:

X-ray microtomography (micro-CT) with micron resolution enables new ways of characterizing microstructures and opens pathways for forward calculations of multiscale rock properties. A quantitative characterization of the microstructure is the first step in this challenge. We developed a new approach to extract scale-dependent characteristics of porosity, percolation, and anisotropic permeability from 3-D microstructural models of rocks. The Hoshen-Kopelman algorithm of percolation theory is employed for a standard percolation analysis. The anisotropy of permeability is calculated by means of the star volume distribution approach. The local porosity distribution and local percolation probability are obtained by using the local porosity theory. Additionally, the local anisotropy distribution is defined and analyzed through two empirical probability density functions, the isotropy index and the elongation index. For such a high-resolution data set, the typical data sizes of the CT images are on the order of gigabytes to tens of gigabytes; thus an extremely large number of calculations is required. To address this large memory and processing demand, parallelization with OpenMP was used to harness the shared-memory infrastructure of cache-coherent Non-Uniform Memory Access architecture machines such as the iVEC SGI Altix 3700Bx2 supercomputer. We see adequate visualization of the results as an important element in this first pioneering study.
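A minimal sketch of the porosity and percolation step on a segmented (binary) micro-CT volume. It uses `scipy.ndimage.label` for connected-component labelling in place of a hand-written Hoshen-Kopelman routine, and declares the sample percolating along an axis if any pore cluster touches both opposite faces:

```python
import numpy as np
from scipy import ndimage

def porosity_and_percolation(pore, axis=0):
    """pore: 3-D boolean array, True for pore voxels in the segmented image.
    Returns total porosity and whether the pore space percolates along `axis`."""
    porosity = pore.mean()
    labels, _ = ndimage.label(pore)              # 6-connected cluster labelling
    first = np.take(labels, 0, axis=axis)        # inlet face
    last = np.take(labels, -1, axis=axis)        # outlet face
    spanning = (set(np.unique(first)) & set(np.unique(last))) - {0}
    return porosity, len(spanning) > 0

# Example on a random toy volume (real data would come from a segmented CT image)
rng = np.random.default_rng(0)
volume = rng.random((64, 64, 64)) < 0.3          # roughly 30% pore fraction
phi, percolates = porosity_and_percolation(volume, axis=2)
print(f"porosity = {phi:.3f}, percolates along z: {percolates}")
```

Applying the same function to overlapping subvolumes of increasing size yields the scale-dependent local porosity and local percolation probability described in the abstract.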

Relevance: 20.00%

Abstract:

Background/aims: Remote monitoring for heart failure has been evaluated not only in a large number of randomised controlled trials but also in many systematic reviews and meta-analyses. The aim of this meta-review was to identify, appraise and synthesise existing systematic reviews that have evaluated the effects of remote monitoring in heart failure. Methods: Using a Cochrane methodology, we electronically searched all relevant online databases and search engines, performed a forward citation search and hand-searched bibliographies. Only fully published systematic reviews of invasive and/or non-invasive remote monitoring interventions were included. Two reviewers independently extracted data. Results: Sixty-five publications from 3333 citations were identified. Seventeen fulfilled the inclusion and exclusion criteria. Quality varied, with A Measurement Tool to Assess Systematic Reviews (AMSTAR) scores ranging from 2 to 11 (mean 5.88). Seven reviews (41%) pooled results from individual studies for meta-analysis. Eight (47%) considered all non-invasive remote monitoring strategies. Four (24%) focused specifically on telemonitoring. Four (24%) included studies investigating both non-invasive and invasive technologies. Population characteristics of the included studies were not reported consistently. Mortality and hospitalisations were the most frequently reported outcomes (12 reviews; 70%). Only five reviews (29%) reported healthcare costs and compliance. A high degree of heterogeneity was reported in many of the meta-analyses. Conclusions: These results should be considered in the context of two negative RCTs of remote monitoring for heart failure that have been published since the meta-analyses (TIM-HF and Tele-HF). However, high-quality reviews demonstrated improvements in mortality and quality of life, and reductions in hospitalisations and healthcare costs.
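For context on the pooling step mentioned above, a minimal sketch of DerSimonian-Laird random-effects meta-analysis with the usual heterogeneity statistics (Q, tau-squared, I-squared). The inputs would be per-study effect estimates (e.g. log risk ratios) and their variances; this is a generic illustration, not a re-analysis of the reviews discussed here:

```python
import numpy as np

def random_effects_pool(effects, variances):
    """DerSimonian-Laird random-effects pooling of per-study effect sizes."""
    y = np.asarray(effects, float)
    v = np.asarray(variances, float)
    w = 1.0 / v
    fixed = np.sum(w * y) / np.sum(w)                     # fixed-effect pooled estimate
    k = len(y)
    Q = np.sum(w * (y - fixed) ** 2)                      # Cochran's Q
    tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    i2 = max(0.0, (Q - (k - 1)) / Q) if Q > 0 else 0.0    # proportion of variance from heterogeneity
    w_star = 1.0 / (v + tau2)                             # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se, tau2, i2
```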

Relevance: 20.00%

Abstract:

There is a song at the beginning of the musical West Side Story where the character Tony sings that “something’s coming, something good.” The song is an anthem of optimism, brimming with promise. This paper is about the long-held promise of information and communication technology (ICT) to transform teaching and learning, to modernise the learning environment of the classroom, and to create a new digital pedagogy. But much of our experience to date in the schooling sector tells more of resistance and reaction than revolution, of more of the same but with a computer in the corner, and of ICT activities as unwelcome time-fillers or time-wasters. Recently, a group of pre-service teachers in a postgraduate primary education degree at an Australian university were introduced to learning objects in an ICT immersion program. Their analyses and related responses, as recorded in online journals, have here been interpreted in terms of TPACK (Technological Pedagogical and Content Knowledge). In contrast to much contemporary observation, these students generally displayed high levels of competence and highly positive dispositions towards the integration of ICT in their future classrooms. In short, they displayed the same optimism and confidence as the fictional “Tony” in believing that something good was coming.

Relevance: 20.00%

Abstract:

5-Hydroxytryptamine (5HT), commonly known as serotonin, which predominantly serves as an inhibitory neurotransmitter in the brain, has long been implicated in migraine pathophysiology. This study tested an MspI polymorphism in the human 5HT2A receptor gene (HTR2A) and a closely linked microsatellite marker (D13S126) for linkage and association with common migraine. In the association analyses, no significant differences were found between the migraine and control populations for either the MspI polymorphism or the D13S126 microsatellite marker. The linkage studies, involving three families comprising 36 affected members, were analysed using both parametric (FASTLINK) and non-parametric (MFLINK and APM) techniques. Significant close linkage was indicated between the MspI polymorphism and the D13S126 microsatellite marker at a recombination fraction (θ) of zero (lod score = 7.15). Linkage results for the MspI polymorphism were not very informative in the three families, producing maximum and minimum lod scores of only 0.35 and 0.39 at recombination fractions (θ) of 0.2 and 0.00, respectively. However, linkage analysis between the D13S126 marker and migraine indicated significant non-linkage (lod < −2) up to a recombination fraction (θ) of 0.028. Results from this study exclude the HTR2A gene, which has been localized to chromosome 13q14-q21, from involvement with common migraine.
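For reference, the two-point lod score quoted in this abstract is the log10 ratio of the likelihood of the observed meioses at a given recombination fraction θ to the likelihood at θ = 0.5 (free recombination). A minimal sketch for the simplest phase-known, fully informative case; the FASTLINK/MFLINK analyses in the study handle far more general pedigrees:

```python
import numpy as np

def lod_score(recombinants, nonrecombinants, theta):
    """Two-point lod score for phase-known, fully informative meioses:
    LOD(theta) = log10[ theta^R * (1-theta)^NR / 0.5^(R+NR) ]."""
    n = recombinants + nonrecombinants
    return (recombinants * np.log10(theta)
            + nonrecombinants * np.log10(1.0 - theta)
            - n * np.log10(0.5))

# Profile the lod score over a grid of recombination fractions (illustrative counts)
thetas = np.linspace(0.01, 0.5, 50)
scores = lod_score(recombinants=2, nonrecombinants=18, theta=thetas)
print(f"max lod = {scores.max():.2f} at theta = {thetas[scores.argmax()]:.2f}")
```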

Relevance: 20.00%

Abstract:

Few studies have formally examined the relationship between meteorological factors and the incidence of child pneumonia in the tropics, despite the fact that most child pneumonia deaths occur there. We examined the association between four meteorological exposures (rainy days, sunshine, relative humidity, temperature) and the incidence of clinical pneumonia in young children in the Philippines using three time-series methods: correlation of seasonal patterns, distributed lag regression, and case-crossover. Lack of sunshine was most strongly associated with pneumonia in both lagged regression [overall relative risk over the following 60 days for a 1-h increase in sunshine per day was 0.67 (95% confidence interval (CI) 0.51–0.87)] and case-crossover analysis [odds ratio for a 1-h increase in mean daily sunshine 8–14 days earlier was 0.95 (95% CI 0.91–1.00)]. This association is well known in temperate settings but has not been noted previously in the tropics. Further research to assess causality is needed.
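A minimal sketch of the distributed-lag idea used here: regress daily case counts on lagged copies of a daily exposure (e.g. sunshine hours) in a Poisson model and sum the lag coefficients for an overall effect. It assumes a pandas DataFrame with `cases` and `sunshine_hours` columns indexed by date; the authors' actual models also handled seasonality and other covariates:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def distributed_lag_poisson(df, exposure="sunshine_hours", outcome="cases", max_lag=14):
    """Poisson regression of daily counts on lags 0..max_lag of the exposure;
    the summed lag coefficients approximate the overall log relative risk."""
    lagged = pd.concat(
        {f"lag{k}": df[exposure].shift(k) for k in range(max_lag + 1)}, axis=1
    ).dropna()
    y = df.loc[lagged.index, outcome]
    X = sm.add_constant(lagged)
    fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
    overall_log_rr = fit.params.drop("const").sum()   # effect of a 1-unit rise at every lag
    return fit, np.exp(overall_log_rr)
```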

Relevance: 20.00%

Abstract:

The present study examined polymorphisms of genes that might be involved in the onset of essential hypertension (HT). These included (i) the growth hormone gene (GH1), whose locus has recently been linked to elevated blood pressure (BP) in the stroke-prone SHR, although recent sib-pair analysis of a polymorphism near the human chorionic somatomammotropin gene (a member of the GH cluster) was unable to show linkage with HT; (ii) the renal kallikrein gene (KLK1); and (iii) the atrial natriuretic factor gene (ANF), where a primary defect in the production or activity of kallikrein or ANF could cause NaCl retention and vasoconstriction. Association analyses were conducted to compare restriction fragment length polymorphisms (RFLPs) of each gene in 85 HT and 95 normotensive (NT) Caucasian subjects whose parents had a similar BP status at age ≥50 years. The frequency of the minor allele of (i) a RsaI RFLP in the promoter of GH1, amplified from leukocyte DNA by the polymerase chain reaction, was 0.15 in the HT group and 0.14 in the NT group (χ² = 0.34, P = 0.55); (ii) a TaqI RFLP for KLK1 was 0.035 in the HT group and 0.015 in the NT group (χ² = 1.5, P = 0.21); and (iii) a XhoI RFLP for ANF was 0.50 in HTs and 0.46 in NTs (χ² = 0.20, P = 0.65). Studies of HT pedigrees found one family in which the ANF locus and HT were not linked, owing to an obligate recombinant. The present data thus provide no evidence for involvement of the growth hormone, renal kallikrein or ANF genes in the causation of essential hypertension.
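The association tests reported here compare allele counts between hypertensive and normotensive groups with a chi-square test. A minimal sketch with scipy; the counts are illustrative, not the study's data:

```python
from scipy.stats import chi2_contingency

# 2x2 table of allele counts: rows = HT / NT groups, columns = major / minor allele
# (illustrative numbers; each subject contributes two alleles)
table = [[145, 25],    # hypertensive: 85 subjects -> 170 alleles
         [163, 27]]    # normotensive: 95 subjects -> 190 alleles

chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi-square = {chi2:.2f}, df = {dof}, P = {p:.2f}")
```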

Relevance: 20.00%

Abstract:

Essential hypertension is a highly heritable disorder in which genetic influences predominate over environmental factors. The molecular genetic profiles which predispose to essential hypertension are not known. In rats with genetic hypertension, there is some recent evidence pointing to linkage of renin gene alleles with blood pressure. The genes for renin and antithrombin III belong to a conserved synteny group which, in humans, spans the q21.3-32.3 region of chromosome 1 and, in rats, is linkage group X on chromosome 13. The present study examined the association of particular human renin gene (REN) and antithrombin III gene (AT3) polymorphisms with essential hypertension by comparing the frequency of specific alleles for each of these genes in 50 hypertensive offspring of hypertensive parents and 91 normotensive offspring of normotensive parents. In addition, linkage relationships were examined in hypertensive pedigrees with multiple affected individuals. Alleles of a REN HindIII restriction fragment length polymorphism (RFLP) were detected using a genomic clone, λHR5, to probe Southern blots of HindIII-cut leucocyte DNA, and those for an AT3 PstI RFLP were detected with the phATIII 113 complementary DNA probe. The frequencies of each REN allele in the hypertensive group were 0.76 and 0.24, compared with 0.74 and 0.26 in the normotensive group. For AT3, hypertensive allele frequencies were 0.49 and 0.51, compared with normotensive values of 0.54 and 0.46. These differences were not significant by χ² analysis (P > 0.2). Linkage analysis of a family (data from 16 family members, 10 of whom were hypertensive) informative for both markers, without an age-of-onset correction and assuming dominant inheritance of hypertension, complete penetrance and a disease frequency of 20%, did not indicate linkage of REN with hypertension, but gave a positive, although not significant, lod score of 0.784 at a recombination fraction of 0 for linkage of AT3 to hypertension. In conclusion, the present study could find no evidence for an association of a REN HindIII RFLP with essential hypertension or for linkage of the locus defined by this RFLP in a family segregating for hypertension. In the case of an AT3 PstI RFLP, although the association analysis was negative, linkage analysis suggested possible involvement (odds of 6:1 in favour) of a gene located near the 1q23 locus with hypertension in one informative family.

Relevance: 20.00%

Abstract:

Background: Heat-related impacts may have greater public health implications as climate change continues. It is important to appropriately characterize the relationship between heatwaves and health outcomes. However, it is unclear whether a case-crossover design can be effectively used to assess event- or episode-related health effects. This study examined the association between exposure to heatwaves and mortality and emergency hospital admissions (EHAs) from non-external causes in Brisbane, Australia, using both case-crossover and time series analysis approaches. Methods: Poisson generalised additive model (GAM) and time-stratified case-crossover analyses were used to assess the short-term impact of heatwaves on mortality and EHAs, adjusting for air pollution, day of the week, and season. Results: Heatwaves exhibited a significant impact on mortality and EHAs. For the time-stratified case-crossover analysis, odds ratios of mortality and EHAs during heatwaves were 1.62 (95% confidence interval (CI): 1.36–1.94) and 1.22 (95% CI: 1.14–1.30) at lag 1, respectively. Time series GAM models gave similar results. Relative risks of mortality and EHAs ranged from 1.72 (95% CI: 1.40–2.11) to 1.81 (95% CI: 1.56–2.10) and from 1.14 (95% CI: 1.06–1.23) to 1.28 (95% CI: 1.21–1.36) at lag 1, respectively. The risk estimates gradually attenuated after a lag of one day for both case-crossover and time series analyses. Conclusions: The risk estimates from both case-crossover and time series models were consistent and comparable. This finding may have implications for future research on the assessment of event- or episode-related (e.g., heatwave) health effects.
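A minimal sketch of how the time-stratified case-crossover referent structure can be assembled: each event day is matched to control days falling on the same day of the week within the same calendar month, and heatwave exposure is attached to both. The resulting strata would then be analysed with conditional logistic regression; the column names and DataFrame layout below are assumptions for illustration, not the authors' code:

```python
import pandas as pd

def time_stratified_referents(event_dates, exposure):
    """event_dates: DatetimeIndex of event (death/admission) days.
    exposure: daily Series indexed by date, 1 on heatwave days, 0 otherwise.
    Returns long-format rows: each case day plus its same weekday/month referents."""
    rows = []
    for i, day in enumerate(event_dates):
        stratum = exposure.index[(exposure.index.month == day.month)
                                 & (exposure.index.year == day.year)
                                 & (exposure.index.dayofweek == day.dayofweek)]
        for ref in stratum:
            rows.append({"set_id": i, "date": ref,
                         "case": int(ref == day), "heatwave": int(exposure[ref])})
    return pd.DataFrame(rows)
```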

Relevance: 20.00%

Abstract:

In recent years, there has been a significant increase in the popularity of ontological analysis of conceptual modelling techniques. To date, related research has explored the ontological deficiencies of classical techniques such as ER or UML modelling, business process modelling techniques such as ARIS, and even Web Services standards such as BPEL4WS, BPML, ebXML, BPSS and WSCI. While the ontologies that form the basis of these analyses are reasonably mature, the actual process of an ontological analysis still lacks rigour. The current procedure is prone to individual interpretation, which is one reason for criticism of ontological analysis as a whole. This paper presents a procedural model for ontological analysis based on the use of meta models, multiple coders and metrics. The model is supported by examples from various ontological analyses.
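Where multiple coders independently map modelling constructs to ontological concepts, inter-coder agreement is a natural metric; Cohen's kappa is one standard choice (its use here is an assumption for illustration, as the paper does not prescribe a specific metric):

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders assigning categories to the same items."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)  # chance agreement
    return (observed - expected) / (1 - expected)

# Two hypothetical coders classifying UML constructs as ontologically 'complete' or 'deficit'
a = ["complete", "deficit", "complete", "complete", "deficit", "complete"]
b = ["complete", "deficit", "deficit", "complete", "deficit", "complete"]
print(f"kappa = {cohens_kappa(a, b):.2f}")
```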

Relevance: 20.00%

Abstract:

Rapidly developing proteomic tools are improving detection of deregulated kallikrein-related peptidase (KLK) expression, at the protein level, in prostate and ovarian cancer, as well as facilitating the determination of downstream functional consequences. Mass spectrometry (MS)-driven proteomics uniquely allows for the detection, identification and quantification of thousands of proteins in a complex protein pool, and this has served to identify certain KLKs as biomarkers for these diseases. In this review we describe applications of this technology in KLK biomarker discovery, and we outline MS-based techniques that have been used for unbiased, global screening of KLK substrates within complex protein pools. Although MS-based KLK degradomic studies are limited to date, they have helped to discover an array of novel KLK substrates. Substrates identified by MS-based degradomics are reported with improved confidence over those determined by incubating a purified or recombinant substrate with the protease of interest in vitro. We propose that these novel proteomic approaches represent the way forward for KLK research, in order to correlate proteolysis of biological substrates with tissue-related consequences, towards the clinical targeting of KLK expression and function for cancer diagnosis, prognosis and therapies.