188 results for GAUSSIAN-BASIS SET


Relevance: 20.00%

Abstract:

HIV virulence, i.e. the time of progression to AIDS, varies greatly among patients. As for other rapidly evolving pathogens of humans, it is difficult to know if this variance is controlled by the genotype of the host or that of the virus because the transmission chain is usually unknown. We apply the phylogenetic comparative approach (PCA) to estimate the heritability of a trait from one infection to the next, which indicates the control of the virus genotype over this trait. The idea is to use viral RNA sequences obtained from patients infected by HIV-1 subtype B to build a phylogeny, which approximately reflects the transmission chain. Heritability is measured statistically as the propensity for patients close in the phylogeny to exhibit similar infection trait values. The approach reveals that up to half of the variance in set-point viral load, a trait associated with virulence, can be heritable. Our estimate is significant and robust to noise in the phylogeny. We also check for the consistency of our approach by showing that a trait related to drug resistance is almost entirely heritable. Finally, we show the importance of taking into account the transmission chain when estimating correlations between infection traits. The fact that HIV virulence is, at least partially, heritable from one infection to the next has clinical and epidemiological implications. The difference between earlier studies and ours comes from the quality of our dataset and from the power of the PCA, which can be applied to large datasets and accounts for within-host evolution. The PCA opens new perspectives for approaches linking clinical data and evolutionary biology because it can be extended to study other traits or other infectious diseases.
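The core idea of the approach, that a heritable trait makes patients close in the phylogeny exhibit similar trait values, can be illustrated with a toy Mantel-style correlation between pairwise phylogenetic distance and pairwise trait difference. This is a simplification of the authors' phylogenetic comparative approach, not their actual estimator, and all data below are invented:

```python
import math

def pearson(xs, ys):
    # plain Pearson correlation coefficient
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def phylo_signal(dist, traits):
    # Mantel-style toy statistic: correlate pairwise phylogenetic distance
    # with pairwise trait difference over all patient pairs.
    n = len(traits)
    d, t = [], []
    for i in range(n):
        for j in range(i + 1, n):
            d.append(dist[i][j])
            t.append(abs(traits[i] - traits[j]))
    return pearson(d, t)

# Hypothetical example: 4 patients; the trait closely tracks the phylogeny.
dist = [[0, 1, 4, 5],
        [1, 0, 4, 5],
        [4, 4, 0, 2],
        [5, 5, 2, 0]]
traits = [3.0, 3.1, 5.0, 5.2]  # e.g. log set-point viral load
r = phylo_signal(dist, traits)
```

A strongly positive `r` indicates that phylogenetically close patients carry similar trait values, the signature of a virus-controlled trait.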

Relevance: 20.00%

Abstract:

Geography as a school subject is conceived specifically for and by schools. Its current contents do not reflect the concerns and the evolution of the academic discipline. Nevertheless, official curricula set school objectives that address issues affecting the world and people's lives. These issues are consistent with those addressed by geography as a social science, that is, the study of how people and their environment interact and how societies are interconnected through space. In everyday practice, however, school geography is most of the time reduced to the accumulation of knowledge outside any given context. This knowledge may even be partially untrue or outdated, and the related activities make only low-level cognitive demands. Such practices do not help learners understand the world because they do not allow them to build a geographical competence, which they will need as future citizens in order to make responsible choices when confronted with questions about how the locations of human and physical features influence each other and how they interact across space. The central part of the text relies on the ideas and processes discussed in the publications that constitute the published file; it is divided into two parts. The first part (chapter 4) presents a didactic approach that gives meaningful insights into geography as a school subject, together with a brief account of the theoretical background that supports it. This socio-constructivist approach relies on the following main features: a priming stage (élément déclencheur), which presents geographical knowledge as an issue to be explored, discussed or solved, and hands that issue over to the learners; the planning of the teaching-learning sequence in small units launched by the main issue of the priming stage; the interconnection of geographical knowledge with integrative concepts; and a synthetic or reporting stage where final concepts and knowledge are put together in order to be learned. Such an approach allows learners to reinvest the knowledge they have built themselves. This knowledge is organised around geographical integrative concepts, which are true operative thinking tools with which the key issues of geographical thinking are associated. The second part of the text (chapter 5) presents the didactic principles that governed the conception of the new initial training course for future upper secondary school teachers at the HEP Vaud. The ambition of this course is to prepare future teachers to plan and deliver geography teaching that provides pupils with the tools to better understand how people and their environment interact and how societies are interconnected through space. One of these tools is the conceptual framework, whose most salient interest is its relevance at every stage of the preparation and planning of teaching, including the epistemological reflection that should always be present. The synthesis of the text starts with a short account of the first evaluation of the new course and closes with various reflections on the future concerns and issues that the didactics and methodology of geography will confront.

Relevance: 20.00%

Abstract:

Abstract This thesis proposes a set of adaptive broadcast solutions and an adaptive data replication solution to support the deployment of P2P applications. P2P applications are an emerging type of distributed application that runs on top of P2P networks; typical examples are video streaming and file sharing. While interesting because they are fully distributed, P2P applications suffer from several deployment problems due to the nature of the environment in which they run. Indeed, defining an application on top of a P2P network often means defining an application where peers contribute resources in exchange for their ability to use the P2P application. For example, in a P2P file-sharing application, while the user is downloading some file, the P2P application is in parallel serving that file to other users. Such peers may have limited hardware resources, e.g., CPU, bandwidth and memory, or the end user may decide a priori to limit the resources dedicated to the P2P application. In addition, a P2P network is typically immersed in an unreliable environment, where communication links and processes are subject to message losses and crashes, respectively. To support P2P applications, this thesis proposes a set of services that address some underlying constraints related to the nature of P2P networks. The proposed services include a set of adaptive broadcast solutions and an adaptive data replication solution that can be used as the basis of several P2P applications. Our data replication solution makes it possible to increase availability and to reduce the communication overhead. The broadcast solutions aim at providing a communication substrate encapsulating one of the key communication paradigms used by P2P applications: broadcast. They typically aim at offering reliability and scalability to some upper layer, be it an end-to-end P2P application or another system-level layer, such as a data replication layer.
Our contributions are organized in a protocol stack made of three layers. In each layer, we propose a set of adaptive protocols that address specific constraints imposed by the environment, and each protocol is evaluated through a set of simulations. The adaptiveness of our solutions lies in the fact that they take the constraints of the underlying system into account in a proactive manner. To model these constraints, we define an environment approximation algorithm that yields an approximate view of the system or part of it; this view includes the topology and the reliability of components expressed in probabilistic terms. To adapt to the underlying system constraints, the proposed broadcast solutions route messages through tree overlays that maximize the broadcast reliability, where reliability is expressed as a function of the reliability of the selected paths and of the use of available resources. These resources are modeled as message quotas reflecting the receiving and sending capacities of each node. To allow deployment in a large-scale system, we take the available memory at processes into account by limiting the view each process has to maintain about the system. Using this partial view, we propose three scalable broadcast algorithms, based on a propagation overlay that converges toward the global tree overlay and adapts to constraints of the underlying system. At a higher level, this thesis also proposes a data replication solution that is adaptive both in terms of replica placement and in terms of request routing. At the routing level, this solution takes the unreliability of the environment into account in order to maximize reliable delivery of requests. At the replica placement level, the dynamically changing origin and frequency of read/write requests are analyzed in order to define a set of replicas that minimizes the communication cost.
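One standard way to build the reliability-maximizing tree overlays described above is to run Dijkstra's algorithm on edge weights of -log(link reliability), so that the shortest additive cost corresponds to the highest multiplicative delivery probability. This is a sketch of the general technique, not the thesis's actual protocol, and the topology and probabilities are invented:

```python
import heapq
import math

def max_reliability_tree(links, root):
    """Build a spanning tree maximizing the probability that a broadcast
    from `root` reaches each node: Dijkstra on -log(link reliability)."""
    graph = {}
    for u, v, p in links:
        graph.setdefault(u, []).append((v, p))
        graph.setdefault(v, []).append((u, p))
    best = {root: 0.0}            # -log reliability of best known path
    parent = {root: None}
    heap = [(0.0, root)]
    while heap:
        cost, u = heapq.heappop(heap)
        if cost > best.get(u, math.inf):
            continue              # stale heap entry
        for v, p in graph[u]:
            c = cost - math.log(p)
            if c < best.get(v, math.inf):
                best[v] = c
                parent[v] = u
                heapq.heappush(heap, (c, v))
    reliability = {n: math.exp(-c) for n, c in best.items()}
    return parent, reliability

# Hypothetical 4-node overlay with per-link delivery probabilities.
links = [("A", "B", 0.9), ("A", "C", 0.5), ("B", "C", 0.9), ("C", "D", 0.8)]
parent, rel = max_reliability_tree(links, "A")
```

Note that the two-hop route A→B→C (0.9 × 0.9 = 0.81) beats the direct but lossy link A→C (0.5), which is exactly the kind of path selection the reliability objective induces.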

Relevance: 20.00%

Abstract:

The development of model observers for mimicking human detection strategies has progressed from symmetric signals in simple noise to increasingly complex backgrounds. In this study we implement different model observers for the complex task of detecting a signal in a 3D image stack. The backgrounds come from real breast tomosynthesis acquisitions and the signals were simulated and reconstructed within the volume. Two tasks relevant to the early detection of breast cancer were considered: detecting an 8 mm mass and detecting a cluster of microcalcifications. The model observers were a channelized Hotelling observer (CHO) with dense difference-of-Gaussian channels and a modified partial-prewhitening (PPW) observer adapted to realistic signals that are not circularly symmetric. The sustained temporal sensitivity function was used to filter the images before applying the spatial templates. At a frame rate of five frames per second, the CHO performed worse than the human observers in a 4-AFC experiment, while the PPW variants outperformed the human observers in every case. At this rather low speed, the temporal filtering did not affect the results compared with a data set in which no human temporal effects were taken into account. We therefore also investigated two higher speeds, 15 and 30 frames per second. For large masses, both types of model observer outperformed the human observers and would be suitable with the appropriate addition of internal noise; for microcalcifications, however, only the PPW observer consistently outperformed the humans. The study demonstrates the possibility of using a model observer that takes into account the temporal effects of scrolling through an image stack while effectively detecting a range of mass sizes and distributions.
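At the heart of a Hotelling observer is the template w = S⁻¹(mean signal − mean background), computed in channel space, with the decision statistic given by the dot product of the template with an image's channel outputs. A minimal two-channel sketch with synthetic data follows; the actual observers use dense difference-of-Gaussian channel outputs from tomosynthesis stacks, so everything below is an illustrative stand-in:

```python
import random

def mean2(vs):
    n = len(vs)
    return [sum(v[0] for v in vs) / n, sum(v[1] for v in vs) / n]

def cov2(vs, m):
    # sample covariance of 2-D channel outputs
    n = len(vs)
    sxx = sum((v[0] - m[0]) ** 2 for v in vs) / (n - 1)
    sxy = sum((v[0] - m[0]) * (v[1] - m[1]) for v in vs) / (n - 1)
    syy = sum((v[1] - m[1]) ** 2 for v in vs) / (n - 1)
    return [[sxx, sxy], [sxy, syy]]

def hotelling_template(sig, bkg):
    # w = S^-1 (mean_signal - mean_background), S = pooled covariance
    ms, mb = mean2(sig), mean2(bkg)
    cs, cb = cov2(sig, ms), cov2(bkg, mb)
    S = [[(cs[i][j] + cb[i][j]) / 2 for j in range(2)] for i in range(2)]
    det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
    inv = [[S[1][1] / det, -S[0][1] / det],
           [-S[1][0] / det, S[0][0] / det]]
    d = [ms[0] - mb[0], ms[1] - mb[1]]
    return [inv[0][0] * d[0] + inv[0][1] * d[1],
            inv[1][0] * d[0] + inv[1][1] * d[1]]

random.seed(0)
# synthetic 2-channel outputs: signal-present images have shifted means
background = [[random.gauss(0, 1), random.gauss(0, 1)] for _ in range(200)]
signal = [[random.gauss(1.5, 1), random.gauss(0.5, 1)] for _ in range(200)]
w = hotelling_template(signal, background)
mean_sig = sum(w[0] * v[0] + w[1] * v[1] for v in signal) / 200
mean_bkg = sum(w[0] * v[0] + w[1] * v[1] for v in background) / 200
```

The separation between `mean_sig` and `mean_bkg`, relative to the score variance, is what determines the observer's detectability index.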

Relevance: 20.00%

Abstract:

Background: The variety of DNA microarray formats and datasets presently available offers an unprecedented opportunity to perform insightful comparisons of heterogeneous data. Cross-species studies, in particular, have the power of identifying conserved, functionally important molecular processes. Validation of discoveries can now often be performed in readily available public data, which frequently requires cross-platform studies. Cross-platform and cross-species analyses require matching probes on different microarray formats. This can be achieved using the information in microarray annotations and additional molecular biology databases, such as orthology databases. Although annotations and other biological information are stored using modern database models (e.g., relational), they are very often distributed and shared as tables in text files, i.e., flat file databases. This common flat database format thus provides a simple and robust solution to flexibly integrate various sources of information and a basis for the combined analysis of heterogeneous gene expression profiles. Results: We provide annotationTools, a Bioconductor-compliant R package to annotate microarray experiments and integrate heterogeneous gene expression profiles using annotation and other molecular biology information available as flat file databases. First, annotationTools contains a specialized set of functions for mining this widely used database format in a systematic manner. It thus offers a straightforward solution for annotating microarray experiments. Second, building on these basic functions and relying on the combination of information from several databases, it provides tools to easily perform cross-species analyses of gene expression data. Here, we present two example applications of annotationTools that are of direct relevance for the analysis of heterogeneous gene expression profiles, namely a cross-platform mapping of probes and a cross-species mapping of orthologous probes using different orthology databases. We also show how to perform an explorative comparison of disease-related transcriptional changes in human patients and in a genetic mouse model. Conclusion: The R package annotationTools provides a simple solution to handle microarray annotation and orthology tables, as well as other flat molecular biology databases, thereby allowing easy integration and analysis of heterogeneous microarray experiments across different technological platforms or species.
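The cross-species probe mapping amounts to chaining lookups across flat-file tables: probe → gene on one platform, gene → ortholog, ortholog → probe on the other platform. The real package is written in R for Bioconductor; the Python sketch below only illustrates the table-chaining idea, and all table contents, column names, and probe identifiers are invented:

```python
import csv
import io

# Hypothetical tab-delimited flat-file tables in the spirit of the package.
human_annotation = "probe\tgene\nhp1\tTP53\nhp2\tBRCA1\n"
orthologs = "human_gene\tmouse_gene\nTP53\tTrp53\nBRCA1\tBrca1\n"
mouse_annotation = "probe\tgene\nmp9\tTrp53\nmp7\tBrca1\n"

def load_table(text, key, value):
    # read a tab-delimited flat file into a key -> value dictionary
    reader = csv.DictReader(io.StringIO(text), delimiter="\t")
    return {row[key]: row[value] for row in reader}

probe2gene = load_table(human_annotation, "probe", "gene")
human2mouse = load_table(orthologs, "human_gene", "mouse_gene")
gene2probe = {g: p for p, g in load_table(mouse_annotation, "probe", "gene").items()}

def map_probe(human_probe):
    # human probe -> human gene -> mouse ortholog -> mouse probe
    return gene2probe[human2mouse[probe2gene[human_probe]]]
```

Keeping each hop as an explicit table is what makes the flat-file format easy to swap, e.g. substituting a different orthology database changes only the middle lookup.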

Relevance: 20.00%

Abstract:

In this article we present a novel approach for diffusion MRI global tractography. Our formulation models the signal in each voxel as a linear combination of fiber-tract basis functions, which consist of a comprehensive set of plausible fiber tracts that are locally compatible with the measured MR signal. This large dictionary of candidate fibers is directly estimated from the data and, subsequently, efficient convex optimization techniques are used for recovering the smallest subset globally best fitting the measured signal. Experimental results conducted on a realistic phantom demonstrate that our approach significantly reduces the computational cost of global tractography while still attaining a reconstruction quality at least as good as the state-of-the-art global methods.
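Recovering the smallest subset of dictionary atoms that fits the signal is typically posed as a sparsity-regularized least-squares problem. The sketch below solves a nonnegative ℓ1-regularized least squares by proximal gradient descent (ISTA) on a toy dictionary; it is a generic stand-in for the paper's convex program, and the matrix, signal, and parameters are all invented:

```python
def ista_nonneg(A, y, lam=0.1, lr=0.1, steps=500):
    """Minimize 0.5*||Ax - y||^2 + lam*sum(x) subject to x >= 0
    by proximal gradient descent (ISTA with a nonnegativity prox)."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(steps):
        # residual r = Ax - y and gradient g = A^T r
        r = [sum(A[i][j] * x[j] for j in range(n)) - y[i] for i in range(m)]
        g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        # gradient step, then the prox of lam*||x||_1 over the nonneg orthant
        x = [max(0.0, x[j] - lr * g[j] - lr * lam) for j in range(n)]
    return x

# Toy dictionary: 3 atoms (columns); the signal mixes atoms 0 and 2 only.
A = [[1.0, 0.0, 0.2],
     [0.0, 1.0, 0.2],
     [0.0, 0.0, 1.0]]
y = [2.2, 0.2, 1.0]   # = 2*atom0 + 1*atom2
x = ista_nonneg(A, y)
```

The ℓ1 penalty drives the coefficient of the unused atom to zero while slightly shrinking the active coefficients, which is the mechanism that selects a small subset of candidate fibers.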

Relevance: 20.00%

Abstract:

The arbuscular mycorrhizal symbiosis is arguably the most ecologically important eukaryotic symbiosis, yet it is poorly understood at the molecular level. To provide novel insights into the molecular basis of symbiosis-associated traits, we report the first genome-wide analysis of the transcriptome from Glomus intraradices DAOM 197198. We generated a set of 25,906 nonredundant virtual transcripts (NRVTs) transcribed in germinated spores, extraradical mycelium and symbiotic roots using Sanger and 454 sequencing. NRVTs were used to construct an oligoarray for investigating gene expression. We identified transcripts coding for the meiotic recombination machinery, as well as meiosis-specific proteins, suggesting that the lack of a known sexual cycle in G. intraradices is not a result of major deletions of genes essential for sexual reproduction and meiosis. Induced expression of genes encoding membrane transporters and small secreted proteins in intraradical mycelium, together with the lack of expression of hydrolytic enzymes acting on plant cell wall polysaccharides, are all features of G. intraradices that are shared with ectomycorrhizal symbionts and obligate biotrophic pathogens. Our results illuminate the genetic basis of symbiosis-related traits of the most ancient lineage of plant biotrophs, advancing future research on these agriculturally and ecologically important symbionts.

Relevance: 20.00%

Abstract:

The recommended dietary allowances of many expert committees (UK DHSS 1979, FAO/WHO/UNU 1985, USA NRC 1989) have set out the extra energy requirements necessary to support lactation on the basis of an efficiency of 80 per cent for human milk production. The metabolic efficiency of milk synthesis can be derived from the measurements of resting energy expenditure in lactating women and in a matched control group of non-pregnant non-lactating women. The results of the present study in Gambian women, as well as a review of human studies on energy expenditure during lactation performed in different countries, suggest an efficiency of human milk synthesis greater than the value currently used by expert committees. We propose that an average figure of 95 per cent would be more appropriate to calculate the energy cost of human lactation.
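The practical consequence of the efficiency figure is a one-line calculation: the maternal energy cost equals the energy secreted in milk divided by the synthetic efficiency. The daily milk energy output used below is an illustrative number, not a value from the study:

```python
def energy_cost_of_lactation(milk_energy_kcal, efficiency):
    # maternal energy cost = energy secreted in milk / synthetic efficiency
    return milk_energy_kcal / efficiency

milk = 500.0  # kcal/day secreted in milk (illustrative value)
cost_80 = energy_cost_of_lactation(milk, 0.80)  # committees' assumption
cost_95 = energy_cost_of_lactation(milk, 0.95)  # efficiency proposed here
```

Raising the assumed efficiency from 80% to 95% lowers the estimated extra energy requirement of lactation accordingly.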

Relevance: 20.00%

Abstract:

This paper proposes a novel approach for the analysis of illicit tablets based on their visual characteristics. In particular, the paper concentrates on the problem of ecstasy pill seizure profiling and monitoring. The presented method extracts the visual information from pill images and builds a representation of it, i.e., a pill profile based on the pill's visual appearance. Different visual features are used to build different image similarity measures, which are the basis for a pill monitoring strategy built on both a discriminative model and a clustering model. The discriminative model makes it possible to infer whether two pills come from the same seizure, while the clustering model groups pills that share similar visual characteristics. The resulting clustering structure allows a visual identification of the relationships between different seizures. The proposed approach was evaluated using a data set of 621 ecstasy pill pictures. The results demonstrate that this is a feasible and cost-effective method for performing pill profiling and monitoring.
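A minimal example of an image similarity measure of the kind such a strategy builds on is a quantized color histogram compared by histogram intersection. The paper's actual features and measures are richer; the pixel data and bin count below are invented for illustration:

```python
def color_histogram(pixels, bins=4):
    """Quantize RGB pixels into a joint color histogram normalized to sum to 1
    (a tiny stand-in for a real visual-feature extractor)."""
    hist = {}
    for r, g, b in pixels:
        key = (r * bins // 256, g * bins // 256, b * bins // 256)
        hist[key] = hist.get(key, 0) + 1
    total = sum(hist.values())
    return {k: v / total for k, v in hist.items()}

def similarity(h1, h2):
    # histogram intersection: 1.0 for identical distributions, 0.0 for disjoint
    return sum(min(h1.get(k, 0.0), h2.get(k, 0.0)) for k in set(h1) | set(h2))

# Invented pixel data for three pills.
pill_a = [(250, 250, 250)] * 90 + [(200, 30, 30)] * 10   # white pill, red logo
pill_b = [(250, 250, 250)] * 85 + [(200, 30, 30)] * 15   # visually similar batch
pill_c = [(30, 30, 200)] * 100                           # blue pill

h_a, h_b, h_c = (color_histogram(p) for p in (pill_a, pill_b, pill_c))
same_batch = similarity(h_a, h_b)
different = similarity(h_a, h_c)
```

Thresholding such a similarity score is one simple way a discriminative model could decide whether two pills plausibly come from the same seizure.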

Relevance: 20.00%

Abstract:

One of the most conserved features of all cancers is a profound reprogramming of cellular metabolism, favoring biosynthetic processes and limiting catabolic processes. With the acquired knowledge of some of these important changes, we have designed a combination therapy that forces cancer cells to use a particular metabolic pathway which ultimately results in the accumulation of toxic products. This innovative approach consists of blocking lipid synthesis while, at the same time, forcing the cell, through the inhibition of AMP-activated kinase, to accumulate toxic intermediates such as malonyl-coenzyme A (malonyl-CoA) or nicotinamide adenine dinucleotide phosphate. This results in excess oxidative stress and cancer cell death. Our new therapeutic strategy, based on the manipulation of metabolic pathways, will lay the basis for upcoming studies defining a new paradigm of cancer treatment.

Relevance: 20.00%

Abstract:

PURPOSE: In the radiopharmaceutical therapy approach to the fight against cancer, in particular when it comes to translating laboratory results to the clinical setting, modeling has served as an invaluable tool for guidance and for understanding the processes operating at the cellular level and how these relate to macroscopic observables. Tumor control probability (TCP) is the dosimetric end point quantity of choice which relates to experimental and clinical data: it requires knowledge of individual cellular absorbed doses since it depends on the assessment of the treatment's ability to kill each and every cell. Macroscopic tumors, seen in both clinical and experimental studies, contain too many cells to be modeled individually in Monte Carlo simulation; yet, in particular for low ratios of decays to cells, a cell-based model that does not smooth away statistical considerations associated with low activity is a necessity. The authors present here an adaptation of the simple sphere-based model from which cellular-level dosimetry for macroscopic tumors and their end point quantities, such as TCP, may be extrapolated more reliably. METHODS: Ten homogeneous spheres representing tumors of different sizes were constructed in GEANT4. The radionuclide 131I was randomly allowed to decay for each model size and for seven different ratios of the number of decays to the number of cells, Nr: 1000, 500, 200, 100, 50, 20, and 10 decays per cell. The deposited energy was collected in radial bins and divided by the bin mass to obtain the average bin absorbed dose. To simulate a cellular model, the number of cells present in each bin was calculated and each cell was attributed an absorbed dose equal to the bin average absorbed dose with a randomly determined adjustment based on a Gaussian probability distribution whose width equals the statistical uncertainty consistent with the ratio of decays to cells, i.e., Nr^(-1/2).
From dose volume histograms, the surviving fraction of cells, the equivalent uniform dose (EUD), and the TCP for the different scenarios were calculated. Comparably sized spherical models containing individual spherical cells (15 μm diameter) in hexagonal lattices were constructed, and Monte Carlo simulations were executed for all of the same scenarios. The dosimetric quantities were calculated and compared to the adjusted simple sphere model results. The model was then applied to the Bortezomib-induced enzyme-targeted radiotherapy (BETR) strategy of targeting Epstein-Barr virus (EBV)-expressing cancers. RESULTS: The TCP values agreed to within 2% between the adjusted simple sphere and full cellular models. Models were also generated for a nonuniform distribution of activity, and the adjusted spherical and cellular models showed similar agreement. The TCP values predicted for macroscopic tumors were consistent with the experimental observations for BETR-treated 1 g EBV-expressing lymphoma tumors in mice. CONCLUSIONS: The adjusted spherical model presented here provides more accurate TCP values than simple spheres, on par with full cellular Monte Carlo simulations, while maintaining the simplicity of the simple sphere model. This model provides a basis for complementing and understanding laboratory and clinical results pertaining to radiopharmaceutical therapy.
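The per-cell Gaussian adjustment described in METHODS can be sketched directly. The survival model used below (exponential, S = exp(-D/D0)) and every numeric value are our own simplifying assumptions, chosen only to show how a low decays-per-cell ratio widens the per-cell dose distribution and depresses TCP; they are not the paper's parameters:

```python
import math
import random
from statistics import pstdev

def cell_doses(bin_dose, n_cells, decays_per_cell, rng):
    """Perturb the bin-average dose per cell by a Gaussian whose relative
    width is Nr^(-1/2), the statistical uncertainty for Nr decays per cell."""
    sigma = bin_dose / math.sqrt(decays_per_cell)
    return [bin_dose + rng.gauss(0.0, sigma) for _ in range(n_cells)]

def tcp(doses, d0=1.0):
    """Toy end point: exponential survival S = exp(-D/D0) (a simplifying
    assumption); control requires killing every cell, so TCP = prod(1 - S_i)."""
    p = 1.0
    for d in doses:
        p *= 1.0 - math.exp(-max(d, 0.0) / d0)
    return p

rng = random.Random(1)
low_stat = cell_doses(bin_dose=5.0, n_cells=1000, decays_per_cell=10, rng=rng)
high_stat = cell_doses(bin_dose=5.0, n_cells=1000, decays_per_cell=1000, rng=rng)
spread_low, spread_high = pstdev(low_stat), pstdev(high_stat)
tcp_low, tcp_high = tcp(low_stat), tcp(high_stat)
```

With only 10 decays per cell the dose spread is ten times wider than with 1000, and the resulting low-dose tail of surviving cells is exactly the statistical effect that a smoothed, purely average-dose model would miss.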

Relevance: 20.00%

Abstract:

Fatty acids are the basis of the so-called stearates that are frequently used as lubricants in the production of ecstasy tablets. Because they are added at the initial tablet production step, their composition does not change once compression has been performed. The analysis of fatty acids can therefore provide useful information for drug intelligence purposes. In this context, an appropriate analytical method was developed to improve upon results already obtained by routine analyses. Considering the small quantity of fatty acids in ecstasy tablets (~3%), the research focused on their extraction and concentration. Two different procedures were tested: (1) liquid/liquid extraction using dichloromethane followed by derivatisation, and (2) in situ transesterification using boron trifluoride. Analyses were performed by GC-MS. The two procedures were optimized and applied to eight ecstasy seizures in order to choose one of them for application to a large ecstasy sample set. They were compared in terms of the number of peaks detected, the sample amount needed, reproducibility and other technical aspects.