960 results for "dynamic causal modeling"
Abstract:
Recent studies have pointed out a similarity between tectonic structures and structures induced by slope tectonics. Numerous studies have demonstrated that structures and fabrics previously interpreted as being of purely geodynamic origin are instead the result of large slope deformations, which has led to erroneous interpretations in the past. Nevertheless, the boundary between the two is not clearly defined and appears to be transitional. Some studies point out a continuity between failures developing at the surface and movements of the upper crust. In this contribution, the main studies examining the link between rock structures and slope movements are reviewed. Aspects regarding the model and scale of observation are discussed, together with the role of pre-existing weaknesses in the rock mass. Because slope failures can develop through progressive failure, structures and their changes in time and space can be recognized. Furthermore, recognizing the origin of these structures can help avoid misinterpretations of regional geology. This also suggests the importance of integrating the different slope-movement classifications, based on the distribution and pattern of deformation, with the application of structural geology techniques. A structural geology approach within the landslide community is a tool that can greatly support the quantification of hazard and related risks, since most of the physical parameters used for landslide modeling are derived from geotechnical tests or from emerging geophysical approaches.
Abstract:
Natural selection is typically exerted at specific life stages. If natural selection takes place before a trait can be measured, using conventional models can lead to incorrect inferences about population parameters. When the missing-data process is related to the trait of interest, valid inference requires explicit modeling of the missing process. We propose a joint modeling approach, a shared parameter model, to account for nonrandom missing data. It consists of an animal model for the phenotypic data and a logistic model for the missing process, linked by the additive genetic effects. A Bayesian approach is taken and inference is made using integrated nested Laplace approximations. From a simulation study we find that wrongly assuming that missing data are missing at random can result in severely biased estimates of additive genetic variance. Using real data from a wild population of Swiss barn owls (Tyto alba), our model indicates that the missing individuals would display large black spots, and we conclude that genes affecting this trait are already under selection before it is expressed. Our model is a tool for correctly estimating the magnitude of both natural selection and additive genetic variance.
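A minimal sketch of such a shared parameter model (the notation below is illustrative and not taken from the paper): the animal model for the phenotype is
\[
y_i = \mu + a_i + e_i, \qquad \mathbf{a} \sim N(\mathbf{0}, \sigma_A^2 \mathbf{A}), \qquad e_i \sim N(0, \sigma_E^2),
\]
and the missing process is modelled as
\[
\operatorname{logit}\{\Pr(y_i \text{ is observed})\} = \alpha_0 + \gamma\, a_i,
\]
where \(\mathbf{A}\) is the additive genetic relationship matrix and the breeding values \(a_i\) are shared between the two submodels; \(\gamma \neq 0\) corresponds to missingness that is not at random with respect to the genetic basis of the trait.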
Abstract:
The final objective of this project is to build a bug tracking system, but perhaps more important is the objective of learning new technologies, which are often available to users yet unknown to them.
Abstract:
Axial deflection of DNA molecules in solution results from thermal motion and from intrinsic curvature related to the DNA sequence. In order to measure directly the contribution of thermal motion, we constructed intrinsically straight DNA molecules and measured their persistence length by cryo-electron microscopy. The persistence length of such intrinsically straight DNA molecules suspended in thin layers of cryo-vitrified solutions is about 80 nm. In order to test our experimental approach, we measured the apparent persistence length of DNA molecules with natural "random" sequences. The result of about 45 nm is consistent with the generally accepted value of the apparent persistence length of natural DNA sequences. By comparing the apparent persistence length of intrinsically straight DNA with that of natural DNA, it is possible to determine both the dynamic and the static contributions to the apparent persistence length.
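Assuming the commonly used decomposition in which dynamic (thermal) and static (sequence-dependent) bending contribute independently to the apparent flexibility (a standard approximation stated here for illustration, not quoted from the paper), the three quantities are related by
\[
\frac{1}{P_{\mathrm{app}}} = \frac{1}{P_{\mathrm{dynamic}}} + \frac{1}{P_{\mathrm{static}}},
\]
so that, taking \(P_{\mathrm{dynamic}} \approx 80\) nm (intrinsically straight DNA) and \(P_{\mathrm{app}} \approx 45\) nm (natural sequences), the implied static contribution is \(P_{\mathrm{static}} \approx (1/45 - 1/80)^{-1} \approx 100\) nm.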
Abstract:
The pathogenesis of Schistosoma mansoni infection is largely determined by host T-cell-mediated immune responses, such as the granulomatous response to tissue-deposited eggs and the subsequent fibrosis. The major egg antigens have a valuable role in desensitizing the CD4+ Th cells that mediate granuloma formation, which may prevent or ameliorate clinical signs of schistosomiasis. The S. mansoni major egg antigen Smp40 was expressed and completely purified, and the expressed Smp40 was found to react specifically with an anti-Smp40 monoclonal antibody in Western blotting. A three-dimensional structure was elucidated by molecular modeling, based on the similarity of Smp40 with the small heat shock protein deposited in the protein database as 1SHS, which served as the template. The model indicates that the C-terminal part of the Smp40 protein (residues 130 onward) contains two alpha-crystallin domains; the fold consists of eight beta strands sandwiched in two sheets forming a Greek key. The purified Smp40 was used for in vitro stimulation of peripheral blood mononuclear cells from patients infected with S. mansoni, with the phytohemagglutinin mitogen as a positive control. The results showed no statistically significant difference in interferon-gamma, interleukin (IL)-4 and IL-13 levels obtained with Smp40 stimulation compared with the control group (P > 0.05 for each). On the other hand, there were significant differences after Smp40 stimulation in IL-5 (P = 0.006) and IL-10 levels (P < 0.001) compared with the control group. A review of the literature indicates that the overall cytokine profile obtained with Smp40 stimulation is reported to be associated with reduced collagen deposition, decreased fibrosis, and inhibition of granuloma formation. This may reflect its future prospect as a leading anti-pathology schistosomal vaccine candidate.
Abstract:
General introduction

The Human Immunodeficiency Virus/Acquired Immunodeficiency Syndrome (HIV/AIDS) epidemic, despite recent encouraging announcements by the World Health Organization (WHO), is still today one of the world's major health care challenges. The present work lies in the field of health care management; in particular, we aim to evaluate behavioural and non-behavioural interventions against HIV/AIDS in developing countries, in both human and economic terms, through a deterministic simulation model. We focus on assessing the effectiveness of antiretroviral therapies (ART) in heterosexual populations living in less developed countries where the epidemic has generalized (formerly defined by the WHO as type II countries). The model is calibrated using Botswana as a case study; however, it can be adapted to other countries with similar transmission dynamics.

The first part of this thesis reviews the main mathematical concepts describing the transmission of infectious agents in general, with a focus on human immunodeficiency virus (HIV) transmission. We also review deterministic models assessing HIV interventions, with a focus on models aimed at African countries. This review helps us recognize the need for a generic model and allows us to define a typical structure for such a generic deterministic model.

The second part describes the main feedback loops underlying the dynamics of HIV transmission. These loops represent the foundation of our model. This part also provides a detailed description of the model, including the various infected and non-infected population groups, the types of sexual relationships, the infection matrices, and important factors impacting HIV transmission such as condom use, other sexually transmitted diseases (STDs) and male circumcision. We also included in the model a dynamic life expectancy calculator which, to our knowledge, is a unique feature allowing more realistic cost-efficiency calculations. Various intervention scenarios are evaluated using the model, each of them including ART in combination with other interventions, namely circumcision, campaigns aimed at behavioural change (Abstain, Be faithful or use Condoms, also called ABC campaigns), and treatment of other STDs. A cost-efficiency analysis (CEA) is performed for each scenario; the CEA consists of measuring the cost per disability-adjusted life year (DALY) averted. This part also describes the model calibration and validation, including a sensitivity analysis.

The third part reports the results and discusses the model limitations. In particular, we argue that the combinations of ART with ABC campaigns and of ART with treatment of other STDs are the most cost-efficient interventions through 2020. The main model limitations include the modeling of the complexity of sexual relationships, the omission of international migration, and ignoring variability in infectiousness according to the AIDS stage.

The fourth part reviews the major contributions of the thesis and discusses model generalizability and flexibility. Finally, we conclude that by selecting an adequate mix of interventions, policy makers can significantly reduce adult prevalence in Botswana in the coming twenty years, provided the country and its donors can bear the cost involved.

Part I: Context and literature review

In this section, after a brief introduction to the general literature, we focus in section two on the key mathematical concepts describing the transmission of infectious agents in general, with a focus on HIV transmission. Section three provides a description of HIV policy models, with a focus on deterministic models. This leads us, in section four, to envision the need for a generic deterministic HIV policy model and to briefly describe the structure of such a generic model applicable to countries with a generalized HIV/AIDS epidemic, also defined as pattern II countries by the WHO.
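As an illustration of the general modelling approach described above, the sketch below sets up a deliberately minimal deterministic transmission model and a crude cost-per-DALY-averted calculation. Every compartment, rate and cost figure is a placeholder assumption; none of it reproduces the thesis model, which is far richer (multiple population groups, sexual-mixing matrices, STD cofactors, circumcision, condom use, and a dynamic life-expectancy calculator).

```python
# Minimal, self-contained sketch of a deterministic HIV transmission model
# with a crude cost-per-DALY-averted comparison (all values are assumptions).

def simulate(years=20, art_coverage=0.0, beta=0.35):
    """Integrate a two-compartment susceptible/infected model with yearly Euler steps."""
    S, I = 0.75, 0.25            # initial population fractions (assumed)
    mu, alpha = 0.02, 0.08       # background and AIDS-related mortality rates (assumed)
    art_effect = 0.6             # assumed reduction in infectiousness while on ART
    aids_deaths_total = 0.0
    person_years_infected = 0.0
    for _ in range(years):
        n = S + I
        eff_beta = beta * (1.0 - art_effect * art_coverage)
        new_infections = eff_beta * S * I / n
        aids_deaths = alpha * (1.0 - 0.5 * art_coverage) * I  # ART assumed to halve mortality at full coverage
        S += -new_infections - mu * S + mu * n                # births assumed to balance background deaths
        I += new_infections - mu * I - aids_deaths
        aids_deaths_total += aids_deaths
        person_years_infected += I
    return aids_deaths_total, person_years_infected

def dalys(deaths, py_infected, yll_per_death=20.0, disability_weight=0.1):
    """Crude DALY proxy: years of life lost plus weighted years lived with infection."""
    return deaths * yll_per_death + disability_weight * py_infected

baseline = dalys(*simulate(art_coverage=0.0))
with_art = dalys(*simulate(art_coverage=0.6))
programme_cost = 5000.0          # hypothetical ART programme cost, arbitrary units
print("cost per DALY averted:", programme_cost / (baseline - with_art))
```

The same comparison, repeated for each intervention mix, is what a cost-efficiency ranking of scenarios rests on; the thesis model additionally feeds the mortality side through its dynamic life expectancy calculator rather than a fixed years-of-life-lost figure.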
Abstract:
Inconsistencies about dynamic asymmetry between the on- and off-transient responses in VO2 are found in the literature. Therefore the purpose of this study was to examine VO2 on- and off-transients during moderate- and heavy-intensity cycling exercise in trained subjects. Ten men underwent an initial incremental test for the estimation of ventilatory threshold (VT) and, on different days, two bouts of square-wave exercise at moderate (<VT) and heavy (>VT) intensities. VO2 kinetics in exercise and recovery were better described by a single exponential model (<VT), or by a double exponential with two time delays (>VT). For moderate exercise, we found a symmetry of VO2 kinetics between the on- and off-transients (i.e., fundamental component), consistent with a system manifesting linear control dynamics. For heavy exercise, a slow component superimposed on the fundamental phase was expressed in both the exercise and recovery, with similar parameter estimates. But the on-transient values of the time constant were appreciably faster than the associated off-transient, and independent of the work rate imposed (<VT and >VT). Our results do not support a dynamically linear system model of VO2 during cycling exercise in the heavy-intensity domain.
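For reference, the models referred to above are commonly written as follows (the notation is assumed here, not quoted from the paper). Below VT the response is described by a single exponential,
\[
\dot{V}\mathrm{O}_2(t) = \dot{V}\mathrm{O}_{2,\mathrm{base}} + A_1\bigl(1 - e^{-(t-\delta_1)/\tau_1}\bigr),
\]
and above VT by a double exponential with two time delays,
\[
\dot{V}\mathrm{O}_2(t) = \dot{V}\mathrm{O}_{2,\mathrm{base}} + A_1\bigl(1 - e^{-(t-\delta_1)/\tau_1}\bigr) + A_2\bigl(1 - e^{-(t-\delta_2)/\tau_2}\bigr),
\]
where \(A\) is an amplitude, \(\delta\) a time delay and \(\tau\) a time constant; the second term captures the slow component superimposed on the fundamental phase.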
Abstract:
Many multivariate methods that are apparently distinct can be linked by introducing one or more parameters in their definition. Methods that can be linked in this way are correspondence analysis, unweighted or weighted logratio analysis (the latter also known as "spectral mapping"), nonsymmetric correspondence analysis, principal component analysis (with and without logarithmic transformation of the data) and multidimensional scaling. In this presentation I will show how several of these methods, which are frequently used in compositional data analysis, may be linked through parametrizations such as power transformations, linear transformations and convex linear combinations. Since the methods of interest here all lead to visual maps of data, a "movie" can be made in which the linking parameter is allowed to vary in small steps: the results are recalculated "frame by frame" and one can see the smooth change from one method to another. Several of these "movies" will be shown, giving a deeper insight into the similarities and differences between these methods.
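As an illustration of such a parametrization (the specific forms below are generic examples, not quoted from the abstract), two methods whose solutions are obtained from matrices \(A\) and \(B\) can be joined by the convex linear combination
\[
M(\lambda) = (1-\lambda)\,A + \lambda\,B, \qquad 0 \le \lambda \le 1,
\]
while a Box-Cox-type power transformation
\[
f_\alpha(x) = \frac{x^\alpha - 1}{\alpha}, \qquad f_\alpha(x) \to \log x \ \text{as } \alpha \to 0,
\]
links analyses of raw and log-transformed data. Varying \(\lambda\) or \(\alpha\) in small steps and recomputing the map at each value produces the "movie" frames described above.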
Abstract:
Observations in daily practice are sometimes registered as positive values larger than a given threshold α. The sample space is in this case the interval (α, +∞), α ≥ 0, which can be structured as a real Euclidean space in different ways. This fact opens the door to alternative statistical models that depend not only on the assumed distribution function, but also on the metric considered appropriate, i.e. the way differences, and thus variability, are measured.
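One standard way to give the interval \((\alpha, +\infty)\) a real Euclidean space structure, stated here for illustration, is through the bijection \(h(x) = \ln(x - \alpha)\), which induces the operations
\[
x \oplus y = \alpha + (x-\alpha)(y-\alpha), \qquad \lambda \odot x = \alpha + (x-\alpha)^{\lambda},
\]
and the distance
\[
d(x, y) = \left| \ln\frac{x-\alpha}{y-\alpha} \right|,
\]
under which differences, and hence variability, are measured in relative rather than absolute terms; the usual structure inherited from the real line, with the ordinary Euclidean distance, is the other natural choice.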
Abstract:
This paper is a first draft of the principle of statistical modelling on coordinates. Several causes, which would take too long to detail, have led to this situation close to the deadline for submitting papers to CODAWORK'03. The main one is the fast development of the approach over the last months, which makes previous drafts appear obsolete. The present paper contains the essential parts of the state of the art of this approach from my point of view. I would like to acknowledge many clarifying discussions with the group of people working in this field in Girona, Barcelona, Carrick Castle, Firenze, Berlin, Göttingen, and Freiberg. They have given a lot of suggestions and ideas. Nevertheless, there might still be errors or unclear aspects which are exclusively my fault. I hope this contribution serves as a basis for further discussions and new developments.
Abstract:
In the context of the investigation of the use of automated fingerprint identification systems (AFIS) for the evaluation of fingerprint evidence, the current study examines the variability of scores from an AFIS system when fingermarks from a known donor are compared to fingerprints that are not from the same source. The ultimate goal is to propose a model, based on likelihood ratios (LRs), which allows the evaluation of mark-to-print comparisons. In particular, through its use of AFIS technology, this model benefits from the possibility of using a large amount of data, as well as from an already built-in proximity measure, the AFIS score. More precisely, the numerator of the LR is obtained from scores issued from comparisons between impressions from the same source and showing the same minutia configuration. The denominator of the LR is obtained by extracting scores from comparisons of the questioned mark with a database of non-matching sources. This paper focuses solely on the assignment of the denominator of the LR, which we refer to by the generic term of between-finger variability. The issues addressed in relation to between-finger variability are the required sample size, the influence of the finger number and general pattern, as well as that of the number of minutiae included and their configuration on a given finger. Results show that reliable estimation of between-finger variability is feasible with 10,000 scores. These scores should come from the appropriate finger number/general pattern combination as defined by the mark. Furthermore, strategies for obtaining between-finger variability when these elements cannot be conclusively determined from the mark (or, for the finger number, from its position with respect to other marks) are presented. These results immediately allow case-by-case estimation of between-finger variability in an operational setting.
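A minimal sketch of how the denominator of such a score-based LR might be assigned from a sample of between-finger scores is shown below. The use of a Gaussian kernel density estimate, the simulated score distribution and all parameter values are assumptions for illustration; they do not reproduce the authors' implementation.

```python
import numpy as np
from scipy.stats import gaussian_kde

def lr_denominator(questioned_score, between_finger_scores):
    """Probability density of the questioned mark-to-print score under the
    'different source' proposition, estimated from the scores of the questioned
    mark compared against a database of non-matching fingers."""
    kde = gaussian_kde(between_finger_scores)  # smooth the empirical between-finger score distribution
    return kde(questioned_score)[0]

# Illustrative use only: ~10,000 simulated scores stand in for real AFIS output
# drawn from the finger-number / general-pattern combination defined by the mark.
rng = np.random.default_rng(0)
between_finger_scores = rng.gamma(shape=2.0, scale=50.0, size=10_000)  # placeholder distribution
density = lr_denominator(questioned_score=400.0, between_finger_scores=between_finger_scores)
print("estimated denominator density:", density)
```

In casework the same estimate would be recomputed per case, using scores of the actual questioned mark against the relevant non-matching database, consistent with the case-by-case estimation described above.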