972 results for "Source to sinks"
Abstract:
The identification of associations between interleukin-28B (IL-28B) variants and the spontaneous clearance of hepatitis C virus (HCV) raises the issues of causality and the net contribution of host genetics to the trait. To estimate more precisely the net effect of IL-28B genetic variation on HCV clearance, we optimized genotyping and compared the host contributions in multiple- and single-source cohorts to control for viral and demographic effects. The analysis included individuals with chronic or spontaneously cleared HCV infections from a multiple-source cohort (n = 389) and a single-source cohort (n = 71). We performed detailed genotyping in the coding region of IL-28B and searched for copy number variations to identify the genetic variant or haplotype carrying the strongest association with viral clearance. This analysis was used to compare the effects of IL-28B variation in the two cohorts. Haplotypes characterized by carriage of the major alleles at IL-28B single-nucleotide polymorphisms (SNPs) were highly overrepresented in individuals with spontaneous clearance versus those with chronic HCV infections (66.1% versus 38.6%, P = 6 × 10⁻⁹). The odds ratios for clearance were 2.1 [95% confidence interval (CI) = 1.6-3.0] and 3.9 (95% CI = 1.5-10.2) in the multiple- and single-source cohorts, respectively. Protective haplotypes were in perfect linkage (r² = 1.0) with a nonsynonymous coding variant (rs8103142). Copy number variants were not detected. We identified IL-28B haplotypes highly predictive of spontaneous HCV clearance. The high linkage disequilibrium between IL-28B SNPs indicates that association studies need to be complemented by functional experiments to identify single causal variants. The point estimate for the genetic effect was higher in the single-source cohort, which was used to effectively control for viral diversity, sex, and coinfections and, therefore, offered a precise estimate of the net host genetic contribution.
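Odds ratios with Wald confidence intervals of the kind quoted above can be computed from a 2×2 table of outcome versus genotype. A minimal sketch follows; the counts used in the example are hypothetical, not the cohort data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Wald 95% CI from a 2x2 table:
    a = exposed with outcome,   b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome.
    """
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) is sqrt of summed reciprocal counts
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Illustrative counts only (not from the study):
or_, lo, hi = odds_ratio_ci(10, 5, 4, 8)
```

With these toy counts the point estimate is 4.0, with the interval spanning roughly 0.8 to 20; real cohort tables would of course give the intervals reported in the abstract.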
Abstract:
Hydrophilic nanocarriers formed by electrostatic interaction of chitosan with oppositely charged macromolecules have a high potential as vectors in biomedical and pharmaceutical applications. However, comprehensive information about the fate of such nanomaterials in biological environments is lacking. We used chitosan from both animal and fungal sources to form well-characterized chitosan-pentasodium triphosphate (TPP)//alginate nanogels suitable for comparative studies. Upon exposure of human colon cancer cells (HT29 and CaCo2), breast cancer cells (MDA-MB-231 and MCF-7), glioblastoma cells (LN229), lung cancer cells (A549), and brain-derived endothelial cells (HCEC) to chitosan-(TPP)//alginate nanogels, cell type-, nanogel dosage-, and exposure time-dependent responses were observed. Comparing chitosan-TPP//alginate nanogels prepared from either animal or fungal sources in terms of nanogel formation, cell uptake, reactive oxygen species production, and metabolic cell activity, no significant differences were apparent. The results identify fungal chitosan as an alternative to animal chitosan, in particular when biomedical/pharmaceutical applications are intended.
Abstract:
We present an open-source ITK implementation of a direct Fourier method for tomographic reconstruction, applicable to parallel-beam x-ray images. Direct Fourier reconstruction makes use of the central-slice theorem to build a polar 2D Fourier space from the 1D transformed projections of the scanned object, which is then resampled onto a Cartesian grid. An inverse 2D Fourier transform finally yields the reconstructed image. Additionally, we provide a complex wrapper to the BSplineInterpolateImageFunction to overcome ITK's current lack of image interpolators dealing with complex data types. A sample application is presented and extensively illustrated on the Shepp-Logan head phantom. We show that appropriate input zero-padding and 2D-DFT oversampling rates, together with radial cubic b-spline interpolation, improve 2D-DFT interpolation quality and are efficient remedies to reduce reconstruction artifacts.
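The pipeline described above can be sketched outside ITK in a few lines of numpy/scipy. This is a minimal illustration of the central-slice idea, not the authors' implementation: it uses scattered-data cubic interpolation in place of their radial b-spline scheme, and interpolates real and imaginary parts separately because scipy's interpolators are real-valued (the role played by the complex wrapper in the ITK version):

```python
import numpy as np
from scipy.interpolate import griddata

def direct_fourier_reconstruct(sinogram, thetas, pad_factor=2):
    """Direct Fourier reconstruction of parallel-beam projections.

    sinogram: (n_angles, n_det) array of 1D projections, thetas in radians.
    Steps: zero-pad and 1D-FFT each projection (by the central-slice
    theorem each row is a radial line of the object's 2D spectrum),
    resample the polar samples onto a Cartesian frequency grid, then
    apply an inverse 2D FFT.
    """
    n_angles, n_det = sinogram.shape
    n_pad = pad_factor * n_det
    pad = (n_pad - n_det) // 2
    sino_p = np.pad(sinogram, ((0, 0), (pad, n_pad - n_det - pad)))
    # 1D FFTs: each row is one radial slice of the 2D Fourier space
    proj_fft = np.fft.fftshift(
        np.fft.fft(np.fft.ifftshift(sino_p, axes=1), axis=1), axes=1)
    freqs = np.fft.fftshift(np.fft.fftfreq(n_pad))
    # Polar coordinates of every spectral sample
    r = np.tile(freqs, n_angles)
    t = np.repeat(thetas, n_pad)
    pts = (r * np.cos(t), r * np.sin(t))
    gx, gy = np.meshgrid(freqs, freqs)
    # Interpolate real and imaginary parts separately onto the grid
    vals = proj_fft.ravel()
    F = (griddata(pts, vals.real, (gx, gy), method='cubic', fill_value=0.0)
         + 1j * griddata(pts, vals.imag, (gx, gy), method='cubic',
                         fill_value=0.0))
    # Inverse 2D FFT yields the image; crop away the padding
    img = np.real(np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(F))))
    return img[pad:pad + n_det, pad:pad + n_det]
```

The `pad_factor` argument corresponds to the input zero-padding / oversampling the abstract identifies as a remedy for interpolation artifacts: a denser spectral grid makes the polar-to-Cartesian resampling step better conditioned.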
Abstract:
Antonio Damasio's works have brought emotions into line with current trends in neuroscience. They are conceived as the addition, to a perception, of the somatic effects it has induced. Nevertheless, this continuous and relatively steady process of body perception has also led to the less-known hypothesis of the "neural self." Behind the explicit and apparently contradictory reference to William James and Sigmund Freud, there lies a common source: Theodor Meynert's conception of a "cortical self." Our aim is to bring to light a stream of thought unified around what we call here the "cerebral self." The Self is thus considered as the cerebral projection or presentation of the body. The specificity of this notion is particularly highlighted by confronting it with the closely related, yet disembodied, notion of the "cerebral subject." To cite this journal: Psychiatr. Sci. Hum. Neurosci. 9 (2011).
Abstract:
The ability to adapt to marginal habitats, in which survival and reproduction are initially poor, plays a crucial role in the evolution of ecological niches and species ranges. Adaptation to marginal habitats may be limited by genetic, developmental, and functional constraints, but also by consequences of demographic characteristics of marginal populations. Marginal populations are often sparse, fragmented, prone to local extinctions, or are demographic sinks subject to high immigration from high-quality core habitats. This makes them demographically and genetically dependent on core habitats and prone to gene flow counteracting local selection. Theoretical and empirical research in the past decade has advanced our understanding of conditions that favor adaptation to marginal habitats despite those limitations. This review is an attempt at synthesis of those developments and of the emerging conceptual framework.
Abstract:
The development and tests of an iterative reconstruction algorithm for emission tomography based on Bayesian statistical concepts are described. The algorithm uses the entropy of the generated image as a prior distribution, can be accelerated by the choice of an exponent, and converges uniformly to feasible images by the choice of one adjustable parameter. A feasible image has been defined as one that is consistent with the initial data (i.e. it is an image that, if truly a source of radiation in a patient, could have generated the initial data by the Poisson process that governs radioactive disintegration). The fundamental ideas of Bayesian reconstruction are discussed, along with the use of an entropy prior with an adjustable contrast parameter, the use of likelihood with data increment parameters as conditional probability, and the development of the new fast maximum a posteriori with entropy (FMAPE) algorithm by the successive substitution method. It is shown that in the maximum likelihood estimator (MLE) and FMAPE algorithms, the only correct choice of initial image for the iterative procedure in the absence of a priori knowledge about the image configuration is a uniform field.
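The FMAPE update rule itself is not reproduced in the abstract, but the family of fixed-point iterations it accelerates can be illustrated with the classical MLEM update for Poisson emission data, started from a uniform image as the text prescribes. The system matrix and counts below are toy values for illustration only:

```python
import numpy as np

def mlem(A, y, n_iter=100, x0=None):
    """Classical MLEM iteration for emission tomography
    (Poisson likelihood): x <- x * A^T(y / Ax) / A^T 1.

    A: (m, n) system matrix; y: (m,) measured counts.
    Starts from a uniform image, the only correct choice absent
    a priori knowledge, as the abstract notes.
    """
    n = A.shape[1]
    x = np.ones(n) if x0 is None else np.asarray(x0, dtype=float).copy()
    sens = A.sum(axis=0)  # sensitivity image A^T 1
    for _ in range(n_iter):
        proj = A @ x
        # Guard against division by zero in empty projection bins
        ratio = np.divide(y, proj, out=np.zeros_like(proj), where=proj > 0)
        x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
    return x
```

The multiplicative form keeps the image nonnegative at every iteration, which is one reason updates of this shape are standard in emission tomography; FMAPE adds an entropy prior and an acceleration exponent on top of this basic structure.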
Abstract:
OBJECTIVE: HIV-1 post-exposure prophylaxis (PEP) is frequently prescribed after exposure to source persons with an undetermined HIV serostatus. To reduce unnecessary use of PEP, we implemented a policy including active contacting of source persons and the availability of free, anonymous HIV testing ('PEP policy'). METHODS: All consultations for potential non-occupational HIV exposures (i.e., outside the medical environment) were prospectively recorded. The impact of the PEP policy on PEP prescription and costs was analysed and modelled. RESULTS: Among 146 putative exposures, 47 involved a source person already known to be HIV positive and 23 had no indication for PEP. The remaining 76 exposures involved a source person of unknown HIV serostatus. Of 33 (43.4%) exposures for which the source person could be contacted and tested, PEP was avoided in 24 (72.7%), initiated and discontinued in seven (21.2%), and prescribed and completed in two (6.1%). In contrast, of 43 (56.6%) exposures for which the source person could not be tested, PEP was prescribed in 35 (81.4%), P < 0.001. Upon modelling, the PEP policy allowed a 31% reduction of cost for management of exposures to source persons of unknown HIV serostatus. The policy was cost-saving for HIV prevalence of up to 70% in the source population. The availability of all the source persons for testing would have reduced cost by 64%. CONCLUSION: In the management of non-occupational HIV exposures, active contacting and free, anonymous testing of source persons proved feasible. This policy resulted in a decrease in prescription of PEP, proved to be cost-saving, and presumably helped to avoid unnecessary toxicity and psychological stress.
Abstract:
The COMPTEL unidentified source GRO J1411-64 was observed by INTEGRAL, and its central part, also by XMM-Newton. The data analysis shows no hint of new detections at hard X-rays. The upper limits in flux herein presented constrain the energy spectrum of whatever was producing GRO J1411-64, imposing, in the framework of earlier COMPTEL observations, the existence of a peak in power output located somewhere between 300 and 700 keV for the so-called low state. The Circinus Galaxy is the only source detected within the 4$\sigma$ location error of GRO J1411-64, but it can be safely excluded as the possible counterpart: the extrapolation of its energy spectrum is well below that of GRO J1411-64 at MeV energies. Twenty-two significant sources (likelihood $> 10$) were extracted and analyzed from XMM-Newton data. Only one of these sources, XMMU J141255.6-635932, is spectrally compatible with GRO J1411-64, although the fact that the soft X-ray observations do not cover the full extent of the COMPTEL source position uncertainty makes an association hard to quantify and thus risky. The unique peak of the power output at high energies (hard X-rays and gamma-rays) resembles that found in the SEDs of blazars or microquasars. However, an analysis using a microquasar model consisting of a magnetized conical jet filled with relativistic electrons, which radiate through synchrotron and inverse Compton scattering with star, disk, corona and synchrotron photons, shows that it is hard to comply with all observational constraints. This and the non-detection at hard X-rays place an a posteriori question mark over the physical reality of this source, which is discussed in some detail.
Abstract:
Iowa agriculture depends on anhydrous ammonia as a low-cost form of nitrogen fertilizer on 61 percent of Iowa’s 12.4 million acres of corn. Now we find a threat to that source of nutrient: the theft of anhydrous ammonia for use in making a powerful, illegal narcotic called methamphetamine. Naturally, the fertilizer industry is outraged by the illicit use of our products. We want to play a role in preventing abuse in the future. By raising awareness, knowing how to respond and using the Meth Inhibitor, fertilizer dealers can assist law enforcement in combating this illicit use of a product important to Iowa farmers.
Abstract:
Frequently the choice of a library management program is conditioned by social, economic and/or political factors that result in the selection of a system that is not altogether suitable for the library’s needs, characteristics and functions. Open source software is quickly becoming a preferred solution, owing to the freedom to copy, modify and distribute it and the freedom from contracts, as well as to greater opportunities for interoperability with other applications. These new trends regarding open source software in libraries are also reflected in LIS studies, as evidenced by the different courses addressing automated programs and repository management, including the Linux/GNU operating system, among others. The combination of the needs of the centres and the new trends for open source software is the focus of a virtual laboratory for the use of open source software for library applications. The laboratory was the result of a project, aimed at making a useful contribution to the library community, carried out by a group of professors of the School of Library and Information Science of the University of Barcelona, together with a group of students, members of a Working Group on Open Source Software for Information Professionals of the Professional Library Association of Catalonia.
Abstract:
Myoblast transfer therapy has been extensively studied for a wide range of clinical applications, such as tissue engineering for muscular loss, cardiac surgery or Duchenne Muscular Dystrophy treatment. However, this approach has been hindered by numerous limitations, including early myoblast death after injection and specific immune response after transplantation with allogenic cells. Different cell sources have been analyzed to overcome some of these limitations. The object of our study was to investigate the growth potential, characterization and in vivo integration of human primary fetal skeletal muscle cells. Together, these data show the potential for the creation of a cell bank to be used as a cell source for muscle cell therapy and tissue engineering. For this purpose, we developed primary muscular cell cultures from biopsies of human male thigh muscle from a 16-week-old fetus and from donors aged 13 and 30 years. We show that fetal myogenic cells can be successfully isolated and expanded in vitro from human fetal muscle biopsies, and that fetal cells have higher growth capacities when compared to young and adult cells. We confirm lineage specificity by comparing fetal muscle cells to fetal skin and bone cells in vitro by immunohistochemistry with desmin and 5.1 H11 antibodies. For the feasibility of the cell bank, we ensured that fetal muscle cells retained intrinsic characteristics after 5 years of cryopreservation. Finally, human fetal muscle cells marked with PKH26 were injected into normal C57BL/6 mice and were found to be present for up to 4 days. In conclusion, we estimate that a human fetal skeletal muscle cell bank can be created for potential muscle cell therapy and tissue engineering.
Abstract:
Description: pKM-19 is a 1.0 kb EcoRI human genomic fragment inserted in pUC13 that detects a ScrFI (CC/NGG) RFLP (1, 2). We report here the primer sequences suitable for the detection of this RFLP by PCR...
Abstract:
Background. Human immunodeficiency virus type 1 (HIV-1) transmitted drug resistance (TDR) can compromise antiretroviral therapy (ART) and thus represents an important public health concern. Typically, sources of TDR remain unknown, but they can be characterized with molecular epidemiologic approaches. We used the highly representative Swiss HIV Cohort Study (SHCS) and linked drug resistance database (SHCS-DRDB) to analyze sources of TDR. Methods. ART-naive men who have sex with men with infection date estimates between 1996 and 2009 were chosen for surveillance of TDR in HIV-1 subtype B (N = 1674), as the SHCS-DRDB contains pre-ART genotypic resistance tests for >69% of this surveillance population. A phylogeny was inferred using pol sequences from surveillance patients and all subtype B sequences from the SHCS-DRDB (6934 additional patients). Potential sources of TDR were identified based on phylogenetic clustering, shared resistance mutations, genetic distance, and estimated infection dates. Results. One hundred forty of 1674 (8.4%) surveillance patients carried virus with TDR; 86 of 140 (61.4%) were assigned to clusters. Potential sources of TDR were found for 50 of 86 (58.1%) of these patients. ART-naive patients constituted 56 of 66 (84.8%) potential sources and were significantly overrepresented among sources (odds ratio, 6.43 [95% confidence interval, 3.22-12.82]; P < .001). Particularly large transmission clusters were observed for the L90M mutation, and the spread of L90M continued even after the near cessation of antiretroviral use selecting for that mutation. Three clusters showed evidence of reversion of K103N or T215Y/F. Conclusions. Many individuals harboring viral TDR belonged to transmission clusters with other Swiss patients, indicating substantial domestic transmission of TDR in Switzerland. Most TDR in clusters could be linked to sources, indicating good surveillance of TDR in the SHCS-DRDB. Most TDR sources were ART-naive.
This, and the presence of long TDR transmission chains, suggests that resistance mutations are frequently transmitted among untreated individuals, highlighting the importance of early diagnosis and treatment.
Abstract:
High wheat yields require good N fertilization management. The objective of this study was to evaluate the effects of different N applications using Entec (N source with nitrification inhibitor) applied at sowing and urea (traditional N source) applied as top-dressing, on four wheat cultivars. The experiment was conducted in a randomized block design in a factorial scheme, with four replications, at the Experimental Station of the Faculdade de Engenharia de Ilha Solteira - UNESP, on a dystrophic, epi-eutrophic alic Red Latosol with loamy texture, formerly under savannah vegetation. Four N rates (0, 60, 120, and 180 kg ha-1) were tested, applied at sowing in the case of Entec and top-dressed 40 days after plant emergence in the case of urea, on the four wheat cultivars E 21, E 22, E 42, and IAC 370. The yield of the wheat cultivars E 21 and E 42 was highest. Plant height and lodging index of cultivar E 22 were greatest, with consequently the lowest grain yield. There was no significant difference between Entec (applied at sowing) and urea (top-dressed) in terms of grain yield and yield components. Nevertheless, urea resulted in a higher leaf N content, and Entec in a larger number of undeveloped spikelets. High nitrogen rates influenced the hectoliter mass negatively, affecting wheat grain quality. Grain yield increased with N rates of up to 82 kg ha-1, whether applied as Entec at sowing or as top-dressed urea.
Abstract:
This article analyses and discusses issues that pertain to the choice of relevant databases for assigning values to the components of evaluative likelihood ratio procedures at source level. Although several formal likelihood ratio developments currently exist, both case practitioners and recipients of expert information (such as the judiciary) may be reluctant to consider them as a framework for evaluating scientific evidence in context. The recent ruling R v T and ensuing discussions in many forums provide illustrative examples of this. In particular, it is often felt that likelihood ratio-based reasoning is an approach that requires extensive quantitative information, along with means for dealing with technicalities related to the algebraic formulation of these approaches. With regard to this objection, this article proposes two distinct discussions. In the first part, it is argued that, from a methodological point of view, there are additional levels of qualitative evaluation that are worth considering prior to focusing on particular numerical probability assignments. Analyses are proposed that intend to show that, under certain assumptions, relative numerical values, as opposed to absolute values, may be sufficient to characterize a likelihood ratio for practical and pragmatic purposes. The feasibility of such qualitative considerations shows that the availability of hard numerical data is not a necessary requirement for implementing a likelihood ratio approach in practice. It is further argued that, even if numerical evaluations can be made, qualitative considerations may be valuable because they can further the understanding of the logical underpinnings of an assessment. In the second part, the article draws a parallel to R v T by concentrating on a practical footwear mark case received at the authors' institute.
This case serves to exemplify the possible use of data from various sources in casework and helps illustrate the difficulty of reconciling the depth of theoretical likelihood ratio developments with the limits on the degree to which these developments can actually be applied in practice.
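The article's point that relative rather than absolute values can suffice is easy to see in code. A minimal sketch, using placeholder probability assignments rather than case data:

```python
def likelihood_ratio(p_e_given_hp, p_e_given_hd):
    """Evaluative likelihood ratio at source level:
    LR = P(E | Hp) / P(E | Hd), the degree to which the evidence E
    supports one proposition (Hp) over the alternative (Hd)."""
    return p_e_given_hp / p_e_given_hd

# Rescaling both probability assignments by a common factor leaves
# the LR unchanged, so only their relative magnitudes matter:
lr = likelihood_ratio(0.8, 0.01)            # about 80
lr_scaled = likelihood_ratio(0.08, 0.001)   # same LR, smaller absolutes
```

This invariance under common rescaling is exactly why, as the article argues, relative numerical values can characterize a likelihood ratio without requiring hard absolute frequency data.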