100 results for Robustness
at Université de Lausanne, Switzerland
Abstract:
The weak selection approximation of population genetics has made possible the analysis of social evolution under a considerable variety of biological scenarios. Despite its extensive usage, it remains unclear how accurately weak selection predicts the emergence of altruism under limited dispersal as selection intensity increases. Here, we derive the condition for the spread of an altruistic mutant in the infinite island model of dispersal under a Moran reproductive process and arbitrary strength of selection. The simplicity of the model allows us to compare the weak and strong selection regimes analytically. Our results demonstrate that the weak selection approximation is robust to moderate increases in selection intensity and therefore provides a good approximation for understanding the invasion of altruism in spatially structured populations. In particular, we find that the weak selection approximation is excellent even when selection is very strong, provided that either migration is much stronger than selection or patches are large. Importantly, we emphasize that the weak selection approximation describes the most favorable condition for the invasion of altruism: increasing selection intensity will impede the emergence of altruism. We argue that this should also hold for more complicated life cycles and for culturally transmitted altruism. Using the weak selection approximation is therefore unlikely to miss any demographic scenario that leads to the evolution of altruism under limited dispersal.
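To make the approximation being tested concrete, here is a minimal sketch of the generic weak-selection logic for a Moran process, assuming a single mutant in a total population of size $N_T$ with fixation probability $\rho(\delta)$ at selection intensity $\delta$; the paper's actual invasion condition for the island model is derived in the full text.

```latex
% Generic weak-selection expansion (illustrative; not the paper's derivation):
\[
  \rho(\delta) = \rho(0) + \delta\,\rho'(0) + O(\delta^{2}),
  \qquad \rho(0) = \frac{1}{N_T},
\]
\[
  \text{invasion favored} \iff \rho(\delta) > \frac{1}{N_T}
  \;\Longleftrightarrow\; \rho'(0) > 0 \quad (\text{to first order in } \delta).
\]
```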
Abstract:
MOTIVATION: Understanding gene regulation in biological processes and modeling the robustness of the underlying regulatory networks is an important problem currently being addressed by computational systems biologists. Lately, there has been renewed interest in Boolean modeling techniques for gene regulatory networks (GRNs). However, due to their deterministic nature, it is often difficult to identify whether these modeling approaches are robust to the stochastic noise that is widespread in gene regulatory processes. Stochasticity in Boolean models of GRNs has been addressed relatively sparingly in the past, mainly by flipping the expression of genes between different expression levels with a predefined probability. This stochasticity in nodes (SIN) model leads to over-representation of noise in GRNs and hence to non-correspondence with biological observations. RESULTS: In this article, we introduce the stochasticity in functions (SIF) model for simulating stochasticity in Boolean models of GRNs. By providing the biological motivation behind the use of the SIF model and applying it to the T-helper and T-cell activation networks, we show that the SIF model provides more biologically robust results than the existing SIN model of stochasticity in GRNs. AVAILABILITY: Algorithms are made available in our Boolean modeling toolbox, GenYsis. The software binaries can be downloaded from http://si2.epfl.ch/~garg/genysis.html.
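To illustrate the contrast the abstract draws, here is a minimal Python sketch, assuming a hypothetical three-gene network and one plausible reading of the two noise models (the precise SIF definition is given in the article; this is not the GenYsis implementation):

```python
import random

# Toy 3-gene Boolean network; the update rules are hypothetical.
def update_rules(state):
    a, b, c = state
    return (a or b,        # gene A
            not c,         # gene B
            a and not b)   # gene C

def step_sin(state, p=0.05):
    """SIN-style update: apply the deterministic rules, then flip each
    gene's state independently with probability p (noise in nodes)."""
    nxt = update_rules(state)
    return tuple((not g) if random.random() < p else g for g in nxt)

def step_sif(state, p=0.05):
    """SIF-style update (one plausible reading): with probability p a
    gene's update *function* fails to act, so the gene keeps its previous
    value instead of the computed one (noise in functions)."""
    nxt = update_rules(state)
    return tuple(prev if random.random() < p else new
                 for prev, new in zip(state, nxt))

state = (True, False, False)
for _ in range(10):
    state = step_sif(state)
print(state)
```

The key difference is where noise can act: SIN perturbs every node at every step, whereas SIF ties the perturbation to the regulatory functions, which is why it injects less spurious noise overall.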
Abstract:
Introduction: With the launch of the new Athlete's Biological Passport antidoping programme, novel guidelines have been introduced to guarantee results beyond reproach. In this context, we investigated the effect of storage time on the variables commonly measured for the haematological passport. We also assessed the within- and between-analyzer variation for these variables. Methods: Blood samples were obtained from top-level male professional cyclists (27 samples for the first part of the study and 102 for the second part) taking part in major stage races. After collection, the samples were transported under refrigerated conditions (2 °C < T < 12 °C), delivered to the antidoping laboratory, analysed and then stored at approximately 4 °C so that analyses could be conducted at different time points up to 72 h after delivery. A mixed-model procedure was used to determine the stability of the different variables. Results: As expected, haemoglobin concentration was not affected by storage and remained stable for at least 72 h. Under the conditions of our investigation, the reticulocyte percentage showed much better stability (> 48 h) than previously published data suggested, and the technical comparison of the haematology analyzers demonstrated excellent results. Conclusion: Our data clearly demonstrate that as long as the World Anti-Doping Agency's guidelines are followed rigorously, all blood results reach the quality level required in the antidoping context.
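As a concrete illustration of the mixed-model stability analysis mentioned above, here is a minimal Python sketch using statsmodels; the data, column names and effect sizes are synthetic, not the study's:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic illustration only: repeated measurements of one blood
# variable per sample across storage times.
rng = np.random.default_rng(0)
rows = []
for s in range(27):                              # 27 hypothetical samples
    baseline = rng.normal(1.0, 0.3)              # per-sample reticulocyte %
    for t in [0, 24, 48, 72]:                    # storage time (h)
        rows.append({"sample_id": s,
                     "storage_h": t,
                     "retic_pct": baseline + 0.001 * t + rng.normal(0, 0.05)})
data = pd.DataFrame(rows)

# Random-intercept mixed model: a fixed storage-time effect near zero
# indicates the variable is stable over the storage period.
model = smf.mixedlm("retic_pct ~ storage_h", data, groups=data["sample_id"])
print(model.fit().summary())
```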
Abstract:
Within a developing organism, cells require information on where they are in order to differentiate into the correct cell type. Pattern formation is the process by which cells acquire and process positional cues and thus determine their fate. This can be achieved by the production and release of a diffusible signaling molecule, called a morphogen, which forms a concentration gradient: exposure to different morphogen levels leads to the activation of specific signaling pathways. Thus, in response to the morphogen gradient, cells start to express different sets of genes, forming domains characterized by a unique combination of differentially expressed genes. As a result, a pattern of cell fates and specification emerges. Though morphogens have been known for decades, it is not yet clear how these gradients form and are interpreted in order to yield highly robust patterns of gene expression. During my PhD thesis, I investigated the properties of Bicoid (Bcd) and Decapentaplegic (Dpp), two morphogens involved in the patterning of the anterior-posterior axis of the Drosophila embryo and wing primordium, respectively. In particular, I have been interested in understanding how pattern proportions are maintained across embryos of different sizes or within a growing tissue. This property is commonly referred to as scaling and is essential for yielding functional organs or organisms. In order to tackle these questions, I analysed fluorescence images showing the pattern of gene expression domains in the early embryo and the wing imaginal disc. After characterizing the extent of these domains in a quantitative and systematic manner, I introduced and applied a new scaling measure in order to assess how well proportions are maintained. I found that scaling emerged as a universal property both in early embryos (at least far away from the Bcd source) and in wing imaginal discs (across different developmental stages). Since we were also interested in understanding the mechanisms underlying scaling and how it is transmitted from the morphogen to the target genes downstream in the signaling cascade, I also quantified scaling in mutant flies where this property could be disrupted. While scaling is largely conserved in embryos with altered bcd dosage, my modeling suggests that Bcd trapping by the nuclei, as well as pre-steady-state decoding of the morphogen gradient, is essential to ensure precise and scaled patterning of the Bcd signaling cascade. In the wing imaginal disc, it appears that as the disc grows, the Dpp response expands and scales with the tissue size. Interestingly, scaling is not perfect at all positions in the field. The scaling of the target gene domains is best where they have a function; Spalt, for example, scales best at the position in the anterior compartment where it helps to form one of the anterior veins of the wing. Analysis of mutants for pentagone, a transcriptional target of Dpp that encodes a secreted feedback regulator of the pathway, indicates that Pentagone plays a key role in scaling the Dpp gradient activity.
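A common way to formalize the scaling property discussed above is the following textbook idealization, assuming a simple exponential gradient (the thesis's actual scaling measure is defined in the full text): a morphogen of amplitude $C_0$ and decay length $\lambda$ sets a target-gene boundary where the concentration crosses a threshold $\theta$, and the boundary position stays proportional to tissue length $L$ only if $\lambda$ itself grows with $L$.

```latex
\[
  C(x) = C_0\, e^{-x/\lambda}, \qquad
  x_\theta = \lambda \ln\!\frac{C_0}{\theta},
\]
\[
  \text{scaling: } \frac{x_\theta}{L} \text{ constant across tissues}
  \;\Longleftrightarrow\; \lambda \propto L
  \quad (\text{for fixed } C_0 \text{ and } \theta).
\]
```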
Abstract:
Occupational exposure modeling is widely used in the context of the E.U. regulation on the registration, evaluation, authorization, and restriction of chemicals (REACH). First-tier tools, such as the European Centre for Ecotoxicology and Toxicology of Chemicals (ECETOC) targeted risk assessment (TRA) or Stoffenmanager, are used to screen a wide range of substances. Those of concern are investigated further using second-tier tools, e.g., the Advanced REACH Tool (ART). Local sensitivity analysis (SA) methods are used here to determine the dominant factors for three models commonly used within the REACH framework: ECETOC TRA v3, Stoffenmanager 4.5, and ART 1.5. Based on the results of the SA, the robustness of the models is assessed. For ECETOC TRA, the process category (PROC) is the most important factor, and a failure to identify the correct PROC has severe consequences for the exposure estimate. Stoffenmanager is the most balanced model, and decision-making uncertainties in one modifying factor are less severe in Stoffenmanager. ART requires a careful evaluation of the decisions in the source compartment, since it constitutes ∼75% of the total exposure range, which corresponds to an exposure estimate spanning 20-22 orders of magnitude. Our results indicate that there is a trade-off between the accuracy and the precision of the models. Previous studies suggested that ART may lead to more accurate results in well-documented exposure situations. However, the choice of the adequate model should ultimately be determined by the quality of the available exposure data: if the practitioner is uncertain concerning two or more decisions in the entry parameters, Stoffenmanager may be more robust than ART.
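A minimal sketch of the one-at-a-time local sensitivity analysis logic on a toy exposure model (the factor names, functional form and values below are hypothetical; the real TRA, Stoffenmanager and ART equations differ):

```python
import math

# Hypothetical exposure model mixing multiplicative and nonlinear terms.
def exposure(f):
    return f["emission"] * math.sqrt(f["duration"]) / (f["ventilation"] + 0.1)

baseline = {"emission": 3.0, "duration": 2.0, "ventilation": 0.5}

base_out = exposure(baseline)
for name in baseline:
    perturbed = dict(baseline)
    perturbed[name] *= 1.10                      # +10% one-at-a-time perturbation
    # Relative sensitivity: % change in output per % change in input.
    sens = (exposure(perturbed) - base_out) / base_out / 0.10
    print(f"{name:12s} local sensitivity = {sens:+.2f}")
```

Factors with the largest relative sensitivities dominate the exposure estimate, which is the sense in which the paper ranks PROC choice, modifying factors, and the source compartment.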
Using life strategies to explore the vulnerability of ecosystem services to invasion by alien plants
Abstract:
Invasive plants can have different effects on ecosystem functioning and on the provision of ecosystem services, ranging from strongly deleterious impacts to positive effects. The nature and intensity of such effects depend on the service and ecosystem being considered, but also on the life-strategy features of invaders that influence their invasiveness as well as their influence on key processes of receiving ecosystems. To address the combined effect of these various factors, we developed a robust and efficient methodological framework for identifying areas of possible conflict between ecosystem services and invasive alien plants, considering interactions between landscape invasibility and species invasiveness. Our framework combines the statistical robustness of multi-model inference, efficient techniques to map ecosystem services, and life strategies as a functional link between invasion, functional changes, and the potential provision of services by invaded ecosystems. The framework was applied to a test region in Portugal, for which we successfully predicted the current patterns of plant invasion, of ecosystem service provision, and finally of probable conflict (expressing concern for negative impacts, and value for positive impacts on services) between alien species richness (total and per plant life strategy) and the potential provision of selected services. Potential conflicts were identified for all combinations of plant strategy and ecosystem service, with an emphasis on conflicts concerning carbon sequestration, water regulation and wood production. Lower levels of conflict were obtained between invasive plant strategies and the habitat-for-biodiversity supporting service. The added value of the proposed framework in the context of landscape management and planning is discussed with respect to anticipating conflicts, mitigating negative impacts, and potentiating the positive effects of plant invasions on ecosystems and their services.
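For context on the multi-model inference component: a minimal sketch of Akaike-weight model averaging in Python, a standard way (Burnham and Anderson style) to weight candidate models; the AIC values here are hypothetical, not those of the study:

```python
import numpy as np

# Hypothetical AIC scores for four candidate species-richness models.
aic = np.array([812.4, 815.1, 819.8, 823.0])
delta = aic - aic.min()                  # AIC differences from the best model
weights = np.exp(-0.5 * delta)
weights /= weights.sum()                 # Akaike weights: relative support
print(weights)                           # used to average model predictions
```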
Abstract:
BACKGROUND: Cone-beam computed tomography (CBCT) image-guided radiotherapy (IGRT) systems are widely used tools for verifying and correcting the target position before each fraction, allowing treatment accuracy and precision to be maximized. In this study, we evaluate automatic three-dimensional intensity-based rigid registration (RR) methods for prostate setup correction using CBCT scans, and we study the impact of rectal distension on registration quality. METHODS: We retrospectively analyzed 115 CBCT scans of 10 prostate patients. CT-to-CBCT registration was performed using (a) global RR, (b) bony RR, or (c) bony RR refined by a local prostate RR using the CT clinical target volume (CTV) expanded with margins varying from 1 to 20 mm. After propagation of the manual CT contours, automatic CBCT contours were generated. For evaluation, a radiation oncologist manually delineated the CTV on the CBCT scans. The propagated and manual CBCT contours were compared using the Dice similarity coefficient and a measure based on the bidirectional local distance (BLD). We also conducted a blind visual assessment of the quality of the propagated segmentations. Moreover, we automatically quantified rectal distension between the CT and CBCT scans without using the manual CBCT contours, and we investigated its correlation with the registration failures. To improve registration quality, the air in the rectum was replaced with soft tissue using a filter. The results with and without filtering were compared. RESULTS: The statistical analysis of the Dice coefficients and the BLD values showed highly significant differences (p < 10⁻⁶) for the 5-mm and 8-mm local RRs vs. the global, bony and 1-mm local RRs. The 8-mm local RR provided the best compromise between accuracy and robustness (median Dice coefficient of 0.814 and a 97% success rate with filtering of the air in the rectum). We observed that all failures were due to high rectal distension. Moreover, the visual assessment confirmed the superiority of the 8-mm local RR over the bony RR. CONCLUSION: The most successful CT-to-CBCT RR method proved to be the 8-mm local RR. We have shown the correlation between its registration failures and rectal distension. Furthermore, we have provided a simple (easily applicable in routine) and automatic method to quantify rectal distension and to predict registration failure using only the manual CT contours.
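The Dice similarity coefficient used for evaluation is straightforward to compute; here is a minimal sketch on toy binary masks (the 2D arrays below merely stand in for the study's 3D contour masks):

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice similarity coefficient between two binary segmentation masks:
    2|A and B| / (|A| + |B|), ranging from 0 (disjoint) to 1 (identical)."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Toy 2D masks standing in for propagated vs. manual CTV contours.
manual = np.zeros((64, 64), dtype=bool)
manual[20:44, 20:44] = True
propagated = np.zeros((64, 64), dtype=bool)
propagated[23:47, 22:46] = True
print(f"Dice = {dice(manual, propagated):.3f}")
```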
Abstract:
Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculation currently represents the gold-standard TDM approach, but it requires computational assistance. In recent decades, computer programs have been developed to assist clinicians in this task. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each program. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. The number of drugs handled by the software varies widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentrations (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most of the other programs evaluated had good potential while being less sophisticated or less user-friendly. Programs vary in complexity and might not fit all healthcare settings. Each software tool must therefore be regarded with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast to use for routine activities, including for non-experienced users. Computer-assisted TDM is gaining growing interest and should improve further, especially in terms of information-system interfacing, user-friendliness, data storage capability and report generation.
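To make the a posteriori (Bayesian) adjustment concrete: a minimal sketch of the maximum a posteriori step that such programs typically perform, here estimating an individual clearance for a one-compartment IV bolus model; every numerical value, and the model itself, is a hypothetical simplification:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical one-compartment IV bolus setting (all values illustrative).
dose, V = 500.0, 40.0                  # dose (mg), volume (L, fixed here)
cl_pop, omega = 5.0, 0.30              # population clearance (L/h), log-SD
sigma = 0.20                           # residual log-error SD
t_obs, c_obs = 12.0, 3.1               # sampling time (h), measured conc (mg/L)

def neg_log_posterior(log_cl):
    cl = np.exp(log_cl)
    c_pred = (dose / V) * np.exp(-cl / V * t_obs)   # model prediction
    prior = (log_cl - np.log(cl_pop)) ** 2 / (2 * omega ** 2)
    likelihood = (np.log(c_obs) - np.log(c_pred)) ** 2 / (2 * sigma ** 2)
    return prior + likelihood

res = minimize_scalar(neg_log_posterior, bounds=(np.log(0.5), np.log(50)),
                      method="bounded")
print(f"MAP clearance: {np.exp(res.x):.2f} L/h")
# The individualized clearance is then used to recompute the dosage regimen.
```

The a priori mode mentioned in the survey corresponds to skipping the likelihood term and dosing from covariate-predicted population parameters alone.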
Abstract:
This paper extends previous research and discussion on the use of multivariate continuous data, which are about to become more prevalent in forensic science. As an illustrative example, attention is drawn here to the area of comparative handwriting examinations. Multivariate continuous data can be obtained in this field by analysing the contour shape of loop characters through Fourier analysis. This methodology, based on existing research in the area, allows one to describe in detail the morphology of character contours through a set of variables. This paper uses data collected from female and male writers to conduct a comparative analysis of likelihood ratio based evidence assessment procedures in both evaluative and investigative proceedings. While the use of likelihood ratios in the former situation is now rather well established (typically, in order to discriminate between propositions of authorship by a given individual versus another, unknown individual), the investigative setting has remained largely beyond practical consideration. This paper seeks to highlight that investigative settings, too, can represent an area of application for which the likelihood ratio can offer logical support. As an example, the inference of the gender of the writer of an incriminated handwritten text is put forward, analysed and discussed in this paper. The more general viewpoint according to which likelihood ratio analyses can be helpful in investigative proceedings is supported here through various simulations. These offer a characterisation of the robustness of the proposed likelihood ratio methodology.
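As an illustration of how Fourier analysis turns a loop character's contour into multivariate continuous data: a minimal Python sketch computing simple Fourier shape descriptors (the sampled ellipse stands in for a digitized character contour, and the published methodology may differ in detail):

```python
import numpy as np

def fourier_descriptors(contour_xy, n_harmonics=8):
    """Simplified contour shape descriptors: FFT of the contour treated
    as a complex signal, normalized for position and scale."""
    z = contour_xy[:, 0] + 1j * contour_xy[:, 1]
    coeffs = np.fft.fft(z)
    coeffs[0] = 0                       # remove dependence on position
    coeffs /= np.abs(coeffs[1])         # normalize for scale
    return np.abs(coeffs[1:n_harmonics + 1])

# Toy closed loop standing in for a sampled character contour.
theta = np.linspace(0, 2 * np.pi, 128, endpoint=False)
loop = np.column_stack([np.cos(theta), 1.4 * np.sin(theta)])
print(fourier_descriptors(loop))        # one multivariate observation
```

Each contour thus yields a fixed-length feature vector, and the likelihood ratio is then evaluated on these vectors under the competing propositions.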
Abstract:
In cognition, common factors play a crucial role. For example, different types of intelligence are highly correlated, pointing to a common factor, which is often called g. One might expect that a similar common factor would also exist for vision. Surprisingly, no one in the field has addressed this issue. Here, we provide the first evidence that there is no common factor for vision. We tested 40 healthy students' performance in six basic visual paradigms: visual acuity, vernier discrimination, two visual backward masking paradigms, Gabor detection, and bisection discrimination. One might expect performance levels on these tasks to be highly correlated because some individuals generally have better vision than others, owing to superior optics, better retinal or cortical processing, or enriched visual experience. However, only four out of 15 correlations were significant, and only two of these were nontrivial. These results cannot be explained by high intraobserver variability or ceiling effects, because test-retest reliability was high and the variance in our student population is commensurate with that of other studies with well-sighted populations. Using a variety of tests (e.g., principal components analysis, Bayes' theorem, test-retest reliability), we show the robustness of our null results. We suggest that neuroplasticity operates during everyday experience to generate marked individual differences. Our results apply only to the normally sighted population (i.e., restricted-range sampling). For the entire population, including those with degenerate vision, we expect different results.
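A minimal sketch of the principal-components logic behind such a null result, on synthetic data (40 observers, six tasks, all values fabricated for illustration): a strong common factor would concentrate variance in the first component, whereas independent tasks spread it roughly evenly.

```python
import numpy as np

rng = np.random.default_rng(1)
scores = rng.normal(size=(40, 6))          # 40 observers x 6 independent tasks

corr = np.corrcoef(scores, rowvar=False)   # inter-task correlation matrix
eigvals = np.linalg.eigvalsh(corr)[::-1]   # eigenvalues, sorted descending
print(f"PC1 explains {eigvals[0] / 6:.0%} of the variance")
# Without a common factor this stays modest (inflated only by sampling
# noise); a g-like factor would push the PC1 share far higher.
```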
Abstract:
This study presents a classification criterion for two-class Cannabis seedlings. As the cultivation of drug-type cannabis is forbidden in Switzerland, law enforcement authorities regularly ask laboratories to determine the chemotype of cannabis plants from seized material in order to ascertain whether a plantation is legal or not. In this study, the classification analysis is based on data obtained from the relative proportions of three major leaf compounds measured by gas chromatography interfaced with mass spectrometry (GC-MS). The aim is to discriminate between drug-type (illegal) and fiber-type (legal) cannabis at an early stage of growth. A Bayesian procedure is proposed: a Bayes factor is computed and classification is performed on the basis of the decision maker's specifications (i.e., prior probability distributions on the cannabis type and consequences of classification measured by losses). Classification rates are computed with two statistical models and the results are compared. A sensitivity analysis is then performed to assess the robustness of the classification criterion.
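A minimal sketch of the Bayes factor decision step described above, for a single measured feature with Gaussian class-conditional models (all distribution parameters, the measured value and the feature itself are hypothetical):

```python
from scipy.stats import norm

# Hypothetical class-conditional models for one leaf-compound ratio.
x = 0.9                                   # measured value for a seedling
bf = (norm.pdf(x, loc=1.2, scale=0.3)     # likelihood under drug type
      / norm.pdf(x, loc=0.4, scale=0.2))  # likelihood under fiber type

prior_odds = 1.0                          # decision maker's prior specification
posterior_odds = bf * prior_odds
print(f"BF = {bf:.2f}; classify as drug type: {posterior_odds > 1}")
```

In the full procedure the threshold on the posterior odds would also reflect the losses attached to each misclassification, not just a greater-than-one rule.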
Abstract:
Lipids available in fingermark residue represent important targets for enhancement and dating techniques. While it is well known that lipid composition varies among fingermarks of the same donor (intra-variability) and between fingermarks of different donors (inter-variability), the extent of this variability remains uncharacterised. Thus, this work aimed at studying, qualitatively and quantitatively, the initial lipid composition of the fingermark residue of 25 different donors. Among the 104 detected lipids, 43 were reported for the first time in the literature. Furthermore, palmitic acid, squalene, cholesterol, myristyl myristate and myristyl myristoleate were quantified, and their correlation within fingermark residue was highlighted. Ten compounds were then selected and further studied as potential targets for dating or enhancement techniques. It was shown that their relative standard deviation was significantly lower for the intra-variability than for the inter-variability. Moreover, the use of data pretreatments could significantly reduce this variability. Based on these observations, an objective donor classification model was proposed. Hierarchical cluster analysis was conducted on the pre-treated data, and the fingermarks of the 25 donors were classified into two main groups, corresponding to "poor" and "rich" lipid donors. The robustness of this classification was tested using fingermark replicates of selected donors; 86% of these replicates were correctly classified, showing the potential of such a donor classification model for research purposes, in order to select representative donors based on compounds of interest.
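A minimal sketch of the hierarchical clustering step behind such a donor classification, using SciPy on synthetic lipid quantities (donor counts, compound counts and values are fabricated; the study's pretreatments and linkage choices may differ):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Synthetic pre-treated lipid quantities for 25 donors x 5 compounds.
rng = np.random.default_rng(2)
rich = rng.normal(2.0, 0.4, size=(12, 5))        # "rich" lipid donors
poor = rng.normal(0.8, 0.3, size=(13, 5))        # "poor" lipid donors
X = np.vstack([rich, poor])

Z = linkage(X, method="ward")                    # agglomerative clustering
labels = fcluster(Z, t=2, criterion="maxclust")  # cut the tree into 2 groups
print(labels)                                    # "rich" vs "poor" assignment
```

Classifying replicate fingermarks then amounts to checking whether each replicate falls into its donor's original group, which is how a robustness figure such as 86% can be obtained.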
Abstract:
Persistent viruses are kept in check by specific lymphocytes. The clonal T cell receptor (TCR) repertoire against Epstein-Barr virus (EBV), once established following primary infection, exhibits robust stability over time. However, the determinants contributing to this long-term persistence are still poorly characterized. Taking advantage of an in vivo clinical setting in which lymphocyte homeostasis was transiently perturbed, we studied EBV antigen-specific CD8 T cells before and after non-myeloablative lympho-depleting chemotherapy of melanoma patients. Despite more advanced T cell differentiation, the patients' T cells showed a clonal composition comparable to that of healthy individuals, sharing a preference for TRBV20 and TRBV29 gene segment usage and several co-dominant public TCR clonotypes. Moreover, our data revealed the presence of relatively few dominant EBV antigen-specific T cell clonotypes, which mostly persisted following transient lympho-depletion (TLD) and lymphocyte recovery, likely related to the absence of EBV reactivation and de novo T cell priming in these patients. Interestingly, persisting clonotypes frequently co-expressed memory/homing-associated genes (CD27, IL7R, EOMES, CD62L/SELL and CCR5), supporting the notion that they are particularly important for long-lasting CD8 T cell responses. Overall, the clonal composition of EBV-specific CD8 T cells was preserved over time, with the same dominant clonotypes present after non-myeloablative chemotherapy. The observed clonotype persistence demonstrates the high robustness of CD8 T cell homeostasis and reconstitution.