989 results for Human data
Abstract:
The article proposes granular computing as a theoretical, formal, and methodological basis for the newly emerging research field of human–data interaction (HDI). We argue that the ability to represent and reason with information granules is a prerequisite for data legibility. As such, it allows the research agenda of HDI to be extended to encompass the topic of collective intelligence amplification, which is seen as an opportunity afforded by today's increasingly pervasive computing environments. As an example of collective intelligence amplification in HDI, we introduce a collaborative urban planning use case in a cognitive city environment and show how an iterative process of user input and human-oriented automated data processing can support collective decision making. As a basis for automated human-oriented data processing, we use the spatial granular calculus of granular geometry.
Abstract:
Despite their limited proliferation capacity, regulatory T cells (Tregs) constitute a population maintained over the entire lifetime of a human organism. The means by which Tregs sustain a stable pool in vivo are controversial. Using a mathematical model, we address this issue by evaluating several biological scenarios of the origins and the proliferation capacity of two subsets of Tregs: precursor CD4+CD25+CD45RO- and mature CD4+CD25+CD45RO+ cells. The lifelong dynamics of Tregs are described by a set of ordinary differential equations, driven by a stochastic process representing the major immune reactions involving these cells. The model dynamics are validated using data from human donors of different ages. Analysis of the data led to the identification of two properties of the dynamics: (1) the equilibrium in the CD4+CD25+FoxP3+ Treg population is maintained over both precursor and mature Treg pools together, and (2) the ratio between precursor and mature Tregs is inverted in the early years of adulthood. Then, using the model, we identified three biologically relevant scenarios that have the above properties: (1) the unique source of mature Tregs is the antigen-driven differentiation of precursors that acquire the mature profile in the periphery, and the proliferation of Tregs is essential for the development and maintenance of the pool; there exist other sources of mature Tregs, such as (2) a homeostatic density-dependent regulation or (3) thymus- or effector-derived Tregs, and in both cases antigen-induced proliferation is not necessary for the development of a stable pool of Tregs. This is the first time that a mathematical model built to describe the in vivo dynamics of regulatory T cells has been validated using human data.
The application of this model provides an invaluable tool for estimating the number of regulatory T cells as a function of time in the blood of patients who have received a solid organ transplant or who suffer from an autoimmune disease.
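The lifelong dynamics described above, a precursor pool feeding a mature pool through antigen-driven differentiation with proliferation and death in each compartment, can be illustrated by a minimal two-compartment ODE sketch. All rate constants below are illustrative placeholders rather than the paper's fitted values, and the stochastic immune-reaction process driving the actual model is omitted:

```python
# Minimal two-compartment sketch: precursor (P) and mature (M) Tregs.
# All rate constants are illustrative placeholders, not fitted values.
def simulate_tregs(t_end=500.0, dt=0.01, s=1.0, d=0.05,
                   p_P=0.01, mu_P=0.02, p_M=0.03, mu_M=0.04):
    P, M = 10.0, 1.0                          # initial pool sizes (arbitrary units)
    for _ in range(int(t_end / dt)):
        dP = s + p_P * P - d * P - mu_P * P   # thymic input + proliferation - differentiation - death
        dM = d * P + p_M * M - mu_M * M       # influx from precursors + proliferation - death
        P += dP * dt
        M += dM * dt
    return P, M

P, M = simulate_tregs()
# Analytic steady state: P* = s/(d + mu_P - p_P), M* = d*P*/(mu_M - p_M)
print(round(P, 2), round(M, 2))
```

With these placeholder rates the total pool settles to a stable equilibrium, mirroring the paper's observation that the equilibrium is maintained over both pools together.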
Comparison of three commercially available radio frequency coils for human brain imaging at 3 Tesla.
Abstract:
OBJECTIVE: To evaluate a transverse electromagnetic (TEM), a circularly polarized (CP, birdcage), and a 12-channel phased array head coil at the clinical field strength of B0 = 3 T in terms of signal-to-noise ratio (SNR), signal homogeneity, and maps of the effective flip angle alpha. MATERIALS AND METHODS: SNR measurements were performed on low flip angle gradient echo images. In addition, flip angle maps were generated for alpha(nominal) = 30 degrees using the double angle method. These evaluation steps were performed on phantom and human brain data acquired with each coil. Moreover, the signal intensity variation was computed for phantom data using five different regions of interest. RESULTS: In terms of SNR, the TEM coil performs slightly better than the CP coil, but is second to the smaller 12-channel coil for human data. As expected, both the TEM and the CP coils show superior image intensity homogeneity compared with the 12-channel coil, and achieve larger mean effective flip angles than the combination of body and 12-channel coil, with reduced radio frequency power deposition. CONCLUSION: At 3 T the benefits of TEM coil design over conventional lumped-element coil design start to emerge, though the phased array coil retains an advantage with respect to SNR performance.
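The double angle method used above for the flip angle maps estimates the effective flip angle from two magnitude images acquired at nominal angles alpha and 2*alpha. A minimal sketch with synthetic signal values, assuming the long-TR regime where the signal is proportional to sin(alpha):

```python
import numpy as np

def double_angle_flip_map(S1, S2):
    """Effective flip angle (degrees) from two magnitude images acquired at
    nominal angles alpha and 2*alpha, assuming long TR so signal ~ sin(alpha):
    S2/S1 = sin(2a)/sin(a) = 2*cos(a)  =>  a = arccos(S2 / (2*S1))."""
    ratio = np.clip(S2 / (2.0 * S1), -1.0, 1.0)
    return np.degrees(np.arccos(ratio))

# Synthetic check with a "true" effective flip angle of 30 degrees
alpha_true = np.deg2rad(30.0)
S1 = np.sin(alpha_true)       # image acquired at alpha
S2 = np.sin(2 * alpha_true)   # image acquired at 2*alpha
alpha_est = double_angle_flip_map(S1, S2)
print(alpha_est)  # ~30.0
```

Applied pixelwise to the two acquisitions, this yields the effective flip angle maps used to compare the coils.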
Abstract:
Genome-wide association studies have been instrumental in identifying genetic variants associated with complex traits such as human disease or gene expression phenotypes. It has been proposed that extending existing analysis methods by considering interactions between pairs of loci may uncover additional genetic effects. However, the large number of possible two-marker tests presents significant computational and statistical challenges. Although several strategies to detect epistasis effects have been proposed and tested for specific phenotypes, so far there has been no systematic attempt to compare their performance using real data. We made use of thousands of gene expression traits from linkage and eQTL studies to compare the performance of different strategies. We found that using information from marginal associations between markers and phenotypes to detect epistatic effects yielded a lower false discovery rate (FDR) than a strategy solely using biological annotation in yeast, whereas results from human data were inconclusive. For future studies whose aim is to discover epistatic effects, we recommend incorporating information about marginal associations between SNPs and phenotypes instead of relying solely on biological annotation. Improved methods to discover epistatic effects will result in a more complete understanding of complex genetic effects.
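The marginal-association strategy recommended above can be sketched as a screening step: rank SNPs by single-marker association with the phenotype and restrict pairwise (epistasis) tests to the strongest candidates, shrinking the number of two-marker tests. The data, effect sizes, and threshold below are entirely synthetic:

```python
import itertools
import numpy as np

def marginal_screen(genotypes, phenotype, top_k=10):
    """Rank SNPs by absolute correlation with the phenotype and keep the
    strongest top_k; pairwise epistasis tests are then run only within
    this reduced set."""
    r = np.array([abs(np.corrcoef(genotypes[:, j], phenotype)[0, 1])
                  for j in range(genotypes.shape[1])])
    return np.argsort(r)[::-1][:top_k]

rng = np.random.default_rng(0)
n_samples, n_snps = 200, 500
G = rng.integers(0, 3, size=(n_samples, n_snps)).astype(float)  # genotypes 0/1/2
# Synthetic phenotype: marginal effects of SNPs 3 and 7 plus their interaction
y = 0.5 * G[:, 3] + 0.5 * G[:, 7] + 0.8 * G[:, 3] * G[:, 7] + rng.normal(size=n_samples)

keep = marginal_screen(G, y, top_k=10)
pairs = list(itertools.combinations(keep, 2))
print(len(pairs))  # 45 two-marker tests instead of 500*499/2 = 124,750
```

Because the interacting SNPs also carry marginal signal here, they survive the screen; the trade-off is that purely epistatic pairs with no marginal effect would be missed.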
Abstract:
Any functionally important mutation is embedded in an evolutionary matrix of other mutations. Cladistic analysis, based on this, is a method of investigating gene effects using a haplotype phylogeny to define a set of tests which localize causal mutations to branches of the phylogeny. Previous implementations of cladistic analysis have not addressed the issue of analyzing data from related individuals, though in human studies, family data are usually needed to obtain unambiguous haplotypes. In this study, a method of cladistic analysis is described in which haplotype effects are parameterized in a linear model which accounts for familial correlations. The method was used to study the effect of apolipoprotein (Apo) B gene variation on total-, LDL-, and HDL-cholesterol, triglyceride, and Apo B levels in 121 French families. Five polymorphisms defined Apo B haplotypes: the signal peptide insertion/deletion, Bsp1286I, XbaI, MspI, and EcoRI. Eleven haplotypes were found, and a haplotype phylogeny was constructed and used to define a set of tests of haplotype effects on lipid and Apo B levels. This new method of cladistic analysis, the parametric method, found significant effects for single haplotypes for all variables. For HDL-cholesterol, 3 clusters of evolutionarily related haplotypes affecting levels were found. Haplotype effects accounted for about 10% of the genetic variance of triglyceride and HDL-cholesterol levels. The results of the parametric method were compared to those of a method of cladistic analysis based on permutational testing. The permutational method detected fewer haplotype effects, even when modified to account for correlations within families. Simulation studies exploring these differences found evidence of systematic errors in the permutational method due to the process by which haplotype groups were selected for testing. The applicability of cladistic analysis to human data was shown. The parametric method is suggested as an improvement over the permutational method. This study has identified candidate haplotypes for sequence comparisons in order to locate the functional mutations in the Apo B gene which may influence plasma lipid levels.
Abstract:
An important competence of human data analysts is to interpret and explain the meaning of the results of data analysis to end-users. However, existing automatic solutions for intelligent data analysis provide limited help to interpret and communicate information to non-expert users. In this paper we present a general approach to generating explanatory descriptions about the meaning of quantitative sensor data. We propose a type of web application: a virtual newspaper with automatically generated news stories that describe the meaning of sensor data. This solution integrates a variety of techniques from intelligent data analysis into a web-based multimedia presentation system. We validated our approach in a real-world problem and demonstrate its generality using data sets from several domains. Our experience shows that this solution can facilitate the use of sensor data by general users and, therefore, can increase the utility of sensor network infrastructures.
Abstract:
INTRODUCTION: This study sought to describe the profile and geographic distribution of reported cases of visceral leishmaniasis (VL) in the City of Campo Grande, State of Mato Grosso do Sul (MS), Brazil, from 2002 to 2009. METHODS: Human data were collected from the Brazilian National Information System for Notifiable Diseases. Canine cases and entomological data were obtained from the Information Service for Canine Visceral Leishmaniasis Control/Campo Grande, MS. RESULTS: A total of 951 records from 2002 to 2009 were investigated. The number of reported cases of VL in males was significantly higher (p < 0.0001) than that in females. The higher frequency observed among males was associated with age (p < 0.0001), increasing in individuals aged 40 years and older. The overall fatality rate was 7.4%. Entomological surveys conducted in 2006, 2007, and 2009 showed the insect vector Lutzomyia longipalpis to be present in all urban regions of the county. CONCLUSIONS: VL cases in humans and dogs, as well as in vectors, occur in all urban regions of Campo Grande. Although no overall trend of increase or reduction in disease incidence was observed, incidence among men was higher in those aged 40 years or older.
Abstract:
Eosinophilic esophagitis (EoE) is a clinicopathologic condition of increasing recognition and prevalence. In 2007, a consensus recommendation provided clinical and histopathologic guidance for the diagnosis and treatment of EoE; however, only a minority of physicians use the 2007 guidelines, which require fulfillment of both histologic and clinical features. Since 2007, the number of EoE publications has doubled, providing new disease insight. Accordingly, a panel of 33 physicians with expertise in pediatric and adult allergy/immunology, gastroenterology, and pathology conducted a systematic review of the EoE literature (since September 2006) using electronic databases. Based on the literature review and the expertise of the panel, information and recommendations were provided in each of the following areas of EoE: diagnostics, genetics, allergy testing, therapeutics, and disease complications. Because accumulating animal and human data have provided evidence that EoE appears to be an antigen-driven immunologic process that involves multiple pathogenic pathways, a new conceptual definition is proposed, highlighting that EoE represents a chronic, immune/antigen-mediated disease characterized clinically by symptoms related to esophageal dysfunction and histologically by eosinophil-predominant inflammation. The diagnostic guidelines continue to define EoE as an isolated chronic disorder of the esophagus, requiring both clinical and pathologic features for diagnosis. Patients commonly have high rates of concurrent allergic diatheses, especially food sensitization, compared with the general population. Proven therapeutic options include chronic dietary elimination, topical corticosteroids, and esophageal dilation.
Important additions since 2007 include genetic underpinnings that implicate EoE susceptibility caused by polymorphisms in the thymic stromal lymphopoietin protein gene and the description of a new potential disease phenotype, proton pump inhibitor-responsive esophageal eosinophilia. Further advances and controversies regarding diagnostic methods, surrogate disease markers, allergy testing, and treatment approaches are discussed.
Abstract:
In the present review, microvascular remodelling refers to alterations in the structure of resistance vessels contributing to elevated systemic vascular resistance in hypertension. We start with some historical aspects, underscoring the importance of Folkow's contribution made half a century ago. We then move to some basic concepts on the biomechanics of blood vessels, and make explicit the definitions proposed by Mulvany for specific forms of remodelling, especially inward eutrophic and inward hypertrophic. The available evidence for the existence of remodelled resistance vessels in hypertension comes next, with relatively more weight given to human, in comparison with animal, data. Mechanisms are discussed. The impact of antihypertensive drug treatment on remodelling is described, again with emphasis on human data. Some details are given on the three studies to date which point to remodelling of subcutaneous resistance arteries as an independent predictor of cardiovascular risk in hypertensive patients. We conclude by considering the potential role of remodelling in the pathogenesis of end-organ damage and in the perpetuation of hypertension.
Abstract:
Animal studies suggest that renal tissue hypoxia plays an important role in the development of renal damage in hypertension and renal diseases, yet human data have been scarce due to the lack of noninvasive methods. Over the last decade, blood oxygenation level-dependent magnetic resonance imaging (BOLD-MRI), detecting deoxyhemoglobin in hypoxic renal tissue, has become a powerful tool to assess kidney oxygenation noninvasively in humans. This paper provides an overview of BOLD-MRI studies performed in patients suffering from essential hypertension or chronic kidney disease (CKD). In line with animal studies, acute changes in cortical and medullary oxygenation have been observed after the administration of medication (furosemide, blockers of the renin-angiotensin system) or alterations in sodium intake in these patient groups, underlining the important role of renal sodium handling in kidney oxygenation. In contrast, no BOLD-MRI studies have convincingly demonstrated that renal oxygenation is chronically reduced in essential hypertension or in CKD, or chronically altered after long-term medication intake. More studies are required to clarify this discrepancy and to further unravel the role of renal oxygenation in the development and progression of essential hypertension and CKD in humans.
Resting-state temporal synchronization networks emerge from connectivity topology and heterogeneity.
Abstract:
Spatial patterns of coherent activity across different brain areas have been identified during the resting-state fluctuations of the brain. However, recent studies indicate that resting-state activity is not stationary, but shows complex temporal dynamics. We were interested in the spatiotemporal dynamics of the phase interactions among resting-state fMRI BOLD signals from human subjects. We found that the global phase synchrony of the BOLD signals evolves on a characteristic ultra-slow (<0.01 Hz) time scale, and that its temporal variations reflect the transient formation and dissolution of multiple communities of synchronized brain regions. Synchronized communities reoccurred intermittently in time and across scanning sessions. We found that the synchronization communities relate to previously defined functional networks known to be engaged in sensory-motor or cognitive function, called resting-state networks (RSNs), including the default mode network, the somato-motor network, the visual network, the auditory network, the cognitive control networks, the self-referential network, and combinations of these and other RSNs. We studied the mechanism underlying the observed spatiotemporal synchronization dynamics by using a network model of phase oscillators connected through the brain's anatomical connectivity, estimated from human diffusion imaging data. The model consistently approximates the temporal and spatial synchronization patterns of the empirical data, and reveals that multiple clusters that transiently synchronize and desynchronize emerge from the complex topology of anatomical connections, provided that oscillators are heterogeneous.
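The network model described above couples heterogeneous phase oscillators through an anatomical connectivity matrix. A minimal Kuramoto-type sketch, using a random symmetric matrix as a stand-in for the diffusion-imaging connectivity and illustrative parameters throughout:

```python
import numpy as np

def kuramoto_step(theta, omega, K, C, dt):
    """One Euler step of Kuramoto dynamics on a connectivity matrix C:
    dtheta_i/dt = omega_i + K * sum_j C_ij * sin(theta_j - theta_i)."""
    coupling = (C * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
    return theta + dt * (omega + K * coupling)

rng = np.random.default_rng(1)
n = 50
C = rng.random((n, n))
C = (C + C.T) / 2                       # symmetric stand-in for anatomical connectivity
np.fill_diagonal(C, 0)
C /= C.sum(axis=1, keepdims=True)       # row-normalized coupling
omega = rng.normal(1.0, 0.1, n)         # heterogeneous natural frequencies
theta = rng.uniform(0, 2 * np.pi, n)    # random initial phases

for _ in range(5000):
    theta = kuramoto_step(theta, omega, K=1.0, C=C, dt=0.01)

R = abs(np.exp(1j * theta).mean())      # global phase synchrony (order parameter)
print(R)
```

With coupling well above the synchronization threshold the order parameter R approaches 1; lowering K or using a sparser, modular connectivity matrix instead yields the partially synchronized cluster regime the study describes.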
Abstract:
Performance standards for positron emission tomography (PET) were developed to allow comparison of systems from different generations and manufacturers. This resulted in the NEMA methodology in North America and the IEC methodology in Europe. In practice, NEMA NU 2-2001 is the method of choice today. These standardized methods allow assessment of the physical performance of new commercial dedicated PET/CT tomographs. The point spread in image formation is one of the factors that blur the image; the phenomenon is often called the partial volume effect. Several methods for correcting for partial volume are under research, but no real agreement exists on how to solve it. The influence of the effect varies in different clinical settings, and it is likely that new methods are needed to solve this problem. Most clinical PET work is done in the field of oncology, where whole-body PET combined with CT is the standard investigation today. Despite progress in PET imaging techniques, visualization, and especially quantification, of small lesions remains a challenge. In addition to partial volume, movement of the object is a significant source of error; the main causes of movement are respiratory and cardiac motion. Most new commercial scanners are capable of respiratory gating in addition to cardiac gating, and this technique has been used in patients with cancer of the thoracic region and in patients studied for radiation therapy planning. For routine cardiac applications, such as assessment of viability and perfusion, only cardiac gating has been used. However, new targets such as plaque or molecular imaging of new therapies require better control of cardiac motion, which is also affected by respiratory motion. To overcome these problems in cardiac work, a dual gating approach has been proposed.
In this study we investigated the physical performance of a new whole-body PET/CT scanner with the NEMA standard, compared methods for partial volume correction in PET studies of the brain, and developed and tested a new robust method for dual cardiac-respiratory gated PET with phantom, animal, and human data. Results from performance measurements showed the feasibility of the new scanner design in 2D and 3D whole-body studies. Partial volume was corrected, but there was no single best method among those tested, as the correction also depends on the radiotracer and its distribution; new methods need to be developed for proper correction. The dual gating algorithm developed is shown to handle dual-gated data, preserving quantification and clearly eliminating the majority of contraction and respiration movement.