971 results for mathematical modeling


Abstract:

The pathogenesis of Schistosoma mansoni infection is largely determined by host T-cell mediated immune responses, such as the granulomatous response to tissue-deposited eggs and the subsequent fibrosis. The major egg antigens have a valuable role in desensitizing the CD4+ Th cells that mediate granuloma formation, which may prevent or ameliorate clinical signs of schistosomiasis. The S. mansoni major egg antigen Smp40 was expressed and purified, and the expressed Smp40 reacted specifically with an anti-Smp40 monoclonal antibody in Western blotting. Its three-dimensional structure was elucidated by molecular modeling, using the small heat shock protein deposited in the protein database as entry 1SHS as a template on the basis of its similarity to Smp40. The modeling showed that the C-terminal region of Smp40 (residues 130 onward) contains two alpha-crystallin domains; the fold consists of eight beta strands sandwiched in two sheets forming a Greek key motif. The purified Smp40 was used for in vitro stimulation of peripheral blood mononuclear cells from patients infected with S. mansoni, with the mitogen phytohemagglutinin as a positive control. There was no statistically significant difference in interferon-gamma, interleukin (IL)-4 or IL-13 levels between Smp40 stimulation and the control group (P > 0.05 for each). In contrast, there were significant differences in IL-5 (P = 0.006) and IL-10 (P < 0.001) levels after Smp40 stimulation compared with the control group. A review of the literature indicates that this overall cytokine profile is associated with reduced collagen deposition, decreased fibrosis and inhibition of granuloma formation, which supports the prospect of Smp40 as a leading anti-pathology schistosomal vaccine candidate.

Abstract:

General introduction
The Human Immunodeficiency Virus/Acquired Immunodeficiency Syndrome (HIV/AIDS) epidemic, despite recent encouraging announcements by the World Health Organization (WHO), remains one of the world's major health care challenges.

The present work lies in the field of health care management. In particular, we aim to evaluate, in both human and economic terms, behavioural and non-behavioural interventions against HIV/AIDS in developing countries through a deterministic simulation model. We focus on assessing the effectiveness of antiretroviral therapy (ART) in heterosexual populations living in less developed countries where the epidemic has generalized (formerly defined by the WHO as pattern II countries). The model is calibrated using Botswana as a case study; however, it can be adapted to other countries with similar transmission dynamics.

The first part of this thesis reviews the main mathematical concepts describing the transmission of infectious agents in general, with a focus on human immunodeficiency virus (HIV) transmission. We also review deterministic models assessing HIV interventions, with a focus on models aimed at African countries. This review helps us recognize the need for a generic model and allows us to define the typical structure of such a generic deterministic model.

The second part describes the main feedback loops underlying the dynamics of HIV transmission. These loops represent the foundation of our model. This part also provides a detailed description of the model, including the various infected and non-infected population groups, the types of sexual relationships, the infection matrices, and important factors affecting HIV transmission such as condom use, other sexually transmitted diseases (STDs) and male circumcision. We also included in the model a dynamic life-expectancy calculator which, to our knowledge, is a unique feature allowing more realistic cost-efficiency calculations. Various intervention scenarios are evaluated with the model, each of them combining ART with other interventions, namely circumcision, campaigns aimed at behavioural change (Abstain, Be faithful or use Condoms, also known as ABC campaigns), and treatment of other STDs. A cost-efficiency analysis (CEA) is performed for each scenario; the CEA measures the cost per disability-adjusted life year (DALY) averted. This part also describes the model calibration and validation, including a sensitivity analysis.

The third part reports the results and discusses the model's limitations. In particular, we argue that the combinations of ART with ABC campaigns and of ART with treatment of other STDs are the most cost-efficient interventions through 2020. The main limitations include modeling the complexity of sexual relationships, the omission of international migration, and ignoring variability in infectiousness across the stages of AIDS.

The fourth part reviews the major contributions of the thesis and discusses the model's generalizability and flexibility. Finally, we conclude that by selecting an adequate mix of interventions, policy makers can significantly reduce adult HIV prevalence in Botswana over the coming twenty years, provided the country and its donors can bear the cost involved.

Part I: Context and literature review
In this section, after a brief introduction to the general literature, section two focuses on the key mathematical concepts describing the transmission of infectious agents in general, with a focus on HIV transmission. Section three provides a description of HIV policy models, with a focus on deterministic models. This leads us in section four to envision the need for a generic deterministic HIV policy model and to briefly describe the structure of such a generic model applicable to countries with a generalized HIV/AIDS epidemic, also defined as pattern II countries by the WHO.
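
To make the deterministic, compartmental structure of such models concrete, the following minimal Python sketch integrates a three-compartment system (susceptible, infected, on ART) with an Euler scheme; the compartments, the treatment-efficacy factor and every parameter value are illustrative assumptions, not the calibrated Botswana model described in the thesis.

    # Hypothetical three-compartment model: S (susceptible), I (infected,
    # untreated), T (infected, on ART). All rates are illustrative only.
    def simulate(years=20, dt=0.01, beta=0.35, mu=0.02, nu=0.10,
                 tau=0.20, eps=0.6, S0=0.75, I0=0.25, T0=0.0):
        S, I, T = S0, I0, T0
        for _ in range(int(years / dt)):
            N = S + I + T
            # ART is assumed to reduce infectiousness by a factor eps
            force = beta * (I + (1.0 - eps) * T) / N
            dS = mu * N - force * S - mu * S          # births balance deaths
            dI = force * S - (mu + nu + tau) * I      # nu: excess mortality
            dT = tau * I - (mu + 0.5 * nu) * T        # tau: ART uptake rate
            S, I, T = S + dS * dt, I + dI * dt, T + dT * dt
        return S, I, T

    S, I, T = simulate()
    print(f"adult HIV prevalence after 20 years: {(I + T) / (S + I + T):.1%}")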

Abstract:

Background: The 'database search problem', that is, the strengthening of a case (in terms of probative value) against an individual who is found as a result of a database search, has been approached over the last two decades with substantial mathematical analyses, accompanied by lively debate and sharply opposing conclusions. This represents a challenging obstacle in teaching and also hinders a balanced and coherent discussion of the topic within the wider scientific and legal community. This paper revisits and tracks the associated mathematical analyses in terms of Bayesian networks. Their derivation and discussion for capturing the probabilistic arguments that explain the database search problem are outlined in detail. The resulting Bayesian networks offer a distinct and clearer view of the main debated issues. Methods: As a general framework for representing and analyzing formal arguments in probabilistic reasoning about uncertain target propositions (that is, whether or not a given individual is the source of a crime stain), this paper relies on graphical probability models, in particular Bayesian networks. This graphical probability modeling approach is used to capture, within a single model, a series of key variables, such as the number of individuals in a database, the size of the population of potential crime stain sources, and the rarity of the corresponding analytical characteristics in a relevant population. Results: This paper demonstrates the feasibility of deriving Bayesian network structures for analyzing, representing, and tracking the database search problem. The output of the proposed models can be shown to agree with existing, exclusively formulaic approaches. Conclusions: The proposed Bayesian networks allow one to capture and analyze the currently best-supported but reputedly counter-intuitive and difficult solution to the database search problem in a way that goes beyond traditional, purely formulaic expressions. The method's graphical environment, along with its computational and probabilistic architecture, represents a rich package that offers analysts and discussants additional modes of interaction, concise representation, and coherent communication.
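
For readers new to the debate, the contested quantity can be illustrated numerically without any Bayesian network machinery. The Python sketch below uses a textbook simplification (a uniform prior over N potential sources and independent matches with probability gamma for non-sources) to show how excluding the n - 1 non-matching database members slightly strengthens, rather than weakens, the case against the matching individual; the numbers are arbitrary and the calculation is a simplified stand-in for the paper's network models.

    # Simplified illustration of the database search problem, not the paper's
    # Bayesian networks. Assumptions: uniform prior over N potential sources,
    # non-sources match independently with probability gamma, and exactly one
    # matching individual is found.
    def posterior_source_probability(N, gamma, excluded=0):
        # Posterior probability that the matching individual is the source,
        # once `excluded` alternative sources have been ruled out by the search.
        return 1.0 / (1.0 + (N - 1 - excluded) * gamma)

    N, n, gamma = 1_000_000, 10_000, 1e-6
    no_search = posterior_source_probability(N, gamma)
    after_search = posterior_source_probability(N, gamma, excluded=n - 1)
    print(f"without a database search: {no_search:.4f}")
    print(f"after a database search:   {after_search:.4f}")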

Abstract:

In the context of investigating the use of automated fingerprint identification systems (AFIS) for the evaluation of fingerprint evidence, the current study examines the variability of scores from an AFIS system when fingermarks from a known donor are compared to fingerprints that are not from the same source. The ultimate goal is to propose a model, based on likelihood ratios (LRs), that allows the evaluation of mark-to-print comparisons. Through its use of AFIS technology, this model benefits from the possibility of using a large amount of data, as well as from an already built-in proximity measure, the AFIS score. More precisely, the numerator of the LR is obtained from scores of comparisons between impressions from the same source showing the same minutia configuration. The denominator of the LR is obtained by extracting scores from comparisons of the questioned mark against a database of non-matching sources. This paper focuses solely on the assignment of the denominator of the LR, which we refer to by the generic term between-finger variability. The issues addressed here in relation to between-finger variability are the required sample size and the influence of the finger number and general pattern, as well as of the number of minutiae included and their configuration on a given finger. Results show that reliable estimation of between-finger variability is feasible with 10,000 scores. These scores should come from the appropriate finger number/general pattern combination as defined by the mark. Furthermore, strategies for obtaining between-finger variability when these elements cannot be conclusively determined from the mark (and, for finger number, from its position with respect to other marks) are presented. These results immediately allow case-by-case estimation of between-finger variability in an operational setting.
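
As a sketch of how an LR denominator might be assigned from such scores, the short Python example below fits a kernel density to a set of non-match scores and evaluates it at the score of the questioned comparison; the log-normal scores are synthetic stand-ins, since the study's 10,000 scores come from an actual AFIS and the appropriate finger number/general pattern combination.

    # Minimal sketch of estimating the LR denominator (between-finger
    # variability) from non-match AFIS scores; scores below are synthetic.
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(0)
    non_match_scores = rng.lognormal(mean=3.0, sigma=0.4, size=10_000)

    density = gaussian_kde(non_match_scores)   # smooth model of non-match scores
    observed_score = 120.0                     # hypothetical mark-to-print score

    denominator = density(observed_score)[0]   # density at the observed score
    print(f"estimated LR denominator: {denominator:.3e}")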

Abstract:

This research work deals with the modeling and design of a low-level speed controller for the mobile robot PRIM. The main objective is to develop an effective educational tool. On one hand, the interest in using the open mobile platform PRIM lies in integrating, in an educational context, several subjects closely related to automatic control theory, embracing communications, signal processing, sensor fusion and hardware design, among others. On the other hand, the idea is to implement useful navigation strategies so that the robot can serve as a mobile multimedia information point. It is in this context, where navigation strategies are oriented to goal achievement, that a local model predictive control scheme is adopted. Such a scheme is therefore presented as a very interesting control strategy for developing the future capabilities of the system.
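
The sketch below illustrates, under stated assumptions, what a local model predictive speed controller of this kind can look like: a first-order discrete-time speed model, a finite prediction horizon, and a receding-horizon search over bounded inputs. The model v[k+1] = a*v[k] + b*u[k] and all numeric values are illustrative, not PRIM's identified dynamics.

    # Toy receding-horizon (MPC-style) speed controller with assumed dynamics.
    import numpy as np

    a, b = 0.9, 0.1                           # assumed discrete-time speed model
    horizon = 10                              # prediction horizon (steps)
    candidates = np.linspace(-1.0, 1.0, 41)   # admissible (bounded) inputs

    def predict_cost(v, u, v_ref):
        # Cost of holding input u over the horizon: tracking error + effort.
        cost = 0.0
        for _ in range(horizon):
            v = a * v + b * u
            cost += (v - v_ref) ** 2 + 0.01 * u ** 2
        return cost

    def mpc_step(v, v_ref):
        # Evaluate every candidate input, apply only the best first move.
        return min(candidates, key=lambda u: predict_cost(v, u, v_ref))

    v, v_ref = 0.0, 0.5                       # current speed and set point (m/s)
    for _ in range(30):
        u = mpc_step(v, v_ref)
        v = a * v + b * u
    print(f"speed after 30 steps: {v:.3f} m/s")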

Abstract:

MOTIVATION: Understanding gene regulation in biological processes and modeling the robustness of the underlying regulatory networks is an important problem currently being addressed by computational systems biologists. Lately, there has been a renewed interest in Boolean modeling techniques for gene regulatory networks (GRNs). However, due to their deterministic nature, it is often difficult to identify whether these modeling approaches are robust to the addition of the stochastic noise that is widespread in gene regulatory processes. Stochasticity in Boolean models of GRNs has been addressed relatively sparingly in the past, mainly by flipping the expression of genes between different expression levels with a predefined probability. This stochasticity in nodes (SIN) model leads to an over-representation of noise in GRNs and hence does not correspond well with biological observations. RESULTS: In this article, we introduce the stochasticity in functions (SIF) model for simulating stochasticity in Boolean models of GRNs. By providing the biological motivation behind the use of the SIF model and applying it to the T-helper and T-cell activation networks, we show that the SIF model provides more biologically robust results than the existing SIN model. AVAILABILITY: The algorithms are made available in our Boolean modeling toolbox, GenYsis. The software binaries can be downloaded from http://si2.epfl.ch/~garg/genysis.html.
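
The SIN scheme criticized above is easy to reproduce: after each synchronous Boolean update, every gene's state is flipped with a predefined probability. The toy Python example below does exactly that for an arbitrary three-gene network (the network and the flip probability are invented for illustration; the SIF alternative instead perturbs the update functions themselves and is not shown).

    # SIN-style noise on a toy synchronous Boolean network.
    import random

    def update(state):
        a, b, c = state
        return (not c, a and not b, a or b)      # arbitrary Boolean update rules

    def sin_step(state, flip_prob=0.01):
        new = list(update(state))
        for i in range(len(new)):
            if random.random() < flip_prob:      # SIN: flip node state with p
                new[i] = not new[i]
        return tuple(new)

    random.seed(1)
    state = (True, False, False)
    for _ in range(50):
        state = sin_step(state)
    print(state)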

Abstract:

Aim: The aim of the study was to investigate the influence of dietary intake of a commercial hydrolyzed collagen (Gelatine Royal®) on bone remodeling in pre-pubertal children. Methods: A randomized double-blind study was carried out in 60 children (9.42 ± 1.31 years) divided into three groups according to the supplement of partially hydrolyzed collagen taken daily for 4 months: placebo (G-I, n = 18), collagen (G-II, n = 20) and collagen + calcium (G-III, n = 22). The following biochemical markers were analyzed: total and bone alkaline phosphatase (tALP and bALP), osteocalcin, tartrate-resistant acid phosphatase (TRAP), type I collagen carboxy-terminal telopeptide, lipids, calcium, 25-hydroxyvitamin D, insulin-like growth factor 1 (IGF-1), thyroid-stimulating hormone, free thyroxine and intact parathormone. Results: There was a significantly greater increase in serum IGF-1 in G-III than in G-II (p < 0.01) or G-I (p < 0.05) during the study period, and a significantly greater increase in plasma tALP in G-III than in G-I (p < 0.05). The behavior of serum bALP differed significantly (p < 0.05) between G-II (increase) and G-I (decrease). The behavior of plasma TRAP differed significantly between G-II and G-I (p < 0.01) and between G-III and G-II (p < 0.05). Conclusion: Daily dietary intake of hydrolyzed collagen seems to have a potential role in enhancing bone remodeling at key stages of growth and development.

Abstract:

Metabolic problems lead to numerous failures during clinical trials, and much effort is now devoted to developing in silico models that predict metabolic stability and metabolites. Such models are well established for cytochromes P450 and some transferases, whereas less has been done to predict the activity of human hydrolases. The present study was undertaken to develop a computational approach able to predict the hydrolysis of novel esters by the human carboxylesterase hCES2. The study first involved homology modeling of the hCES2 protein based on the model of hCES1, since the two proteins share a high degree of homology (approximately 73%). A set of 40 known substrates of hCES2 was taken from the literature; the ligands were docked in both their neutral and ionized forms using GriDock, a parallel tool based on the AutoDock 4.0 engine that can perform efficient virtual screening of large molecular databases by exploiting multi-core architectures. Useful statistical models (e.g., r^2 = 0.91 for substrates in their unprotonated state) were obtained by correlating experimental pK_m values with the distance between the carbon atom of the substrate's ester group and the hydroxyl function of Ser228. Additional parameters in the equations accounted for hydrophobic and electrostatic interactions between substrates and contributing residues. The negatively charged residues in the hCES2 cavity explain the preference of the enzyme for neutral substrates and, more generally, suggest that ligands which interact too strongly through ionic bonds (e.g., ACE inhibitors) cannot be good CES2 substrates because they are trapped in the cavity in unproductive modes and behave as inhibitors. The effects of protonation on substrate recognition and the contrasting behavior of substrates and products were finally investigated by MD simulations of selected CES2 complexes.
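
The statistical models mentioned above are, in essence, regressions of experimental affinity against a docking-derived geometric descriptor. The sketch below reproduces that kind of single-descriptor fit in Python with synthetic numbers; the distances, pK_m values and resulting coefficients are placeholders, not the 40-substrate hCES2 data set.

    # Single-descriptor linear fit of pK_m against an ester-carbon/Ser228 distance.
    import numpy as np

    distance = np.array([3.1, 3.4, 3.6, 4.0, 4.3, 4.8, 5.1, 5.6])   # angstroms (synthetic)
    pKm      = np.array([5.2, 5.0, 4.8, 4.5, 4.2, 3.9, 3.6, 3.2])   # synthetic values

    slope, intercept = np.polyfit(distance, pKm, 1)
    predicted = slope * distance + intercept
    ss_res = np.sum((pKm - predicted) ** 2)
    ss_tot = np.sum((pKm - pKm.mean()) ** 2)
    r_squared = 1.0 - ss_res / ss_tot

    print(f"pK_m = {slope:.2f} * distance + {intercept:.2f},  r^2 = {r_squared:.2f}")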

Abstract:

Since 1986, several near-vertical seismic reflection profiles have been recorded in Switzerland in order to map the deep geologic structure of the Alps. One objective of this endeavour has been to determine the geometries of the autochthonous basement and of the external crystalline massifs, important elements for understanding the geodynamics of the Alpine orogeny. The PNR-20 seismic line W1, located in the Rawil depression of the western Swiss Alps, provides important information on this subject. It extends northward from the 'Penninic front' across the Helvetic nappes to the Prealps. The crystalline massifs do not outcrop along this profile, so the interpretation of 'near-basement' reflections has to be constrained by down-dip projections of surface geology, 'true amplitude' processing, rock physical property studies and modelling. 3-D seismic modelling has been used to evaluate the seismic response of two alternative down-dip projection models. To constrain the interpretation in the southern part of the profile, 'true amplitude' processing has provided information on the strength of the reflections. Density and velocity measurements on core samples collected up-dip from the region of the seismic line have been used to evaluate the reflection coefficients of typical lithologic boundaries in the region. The cover-basement contact itself is not a source of strong reflections, but strong reflections do arise from within the overlying metasedimentary cover sequence, allowing the geometry of the top of the basement to be determined on the basis of 'near-basement' reflections. The front of the external crystalline massifs is shown to extend beneath the Prealps, about 6 km north of the expected position. A 2-D model whose seismic response shows reflection patterns very similar to those observed is proposed.
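
The reflection-coefficient evaluation referred to above reduces, at normal incidence, to comparing acoustic impedances across a boundary. The short Python sketch below applies the standard formula R = (rho2*v2 - rho1*v1) / (rho2*v2 + rho1*v1); the rock properties are illustrative values, not the measured core-sample data, and are chosen to show how similar impedances yield the weak cover-basement reflection described in the text.

    # Normal-incidence reflection coefficient from density and P-wave velocity.
    def reflection_coefficient(rho1, v1, rho2, v2):
        z1, z2 = rho1 * v1, rho2 * v2            # acoustic impedances
        return (z2 - z1) / (z2 + z1)

    # Illustrative cover (limestone-like) over crystalline basement values
    R = reflection_coefficient(rho1=2700, v1=6000, rho2=2650, v2=5800)
    print(f"cover/basement reflection coefficient: {R:+.3f}")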

Abstract:

Quantitative or algorithmic trading is the automation of investment decisions obeying a fixed or dynamic set of rules to determine trading orders. It has increasingly grown to account for up to 70% of the trading volume of some of the biggest financial markets, such as the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, owing to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject where publications are scarce and infrequent. We review the basic mathematical concepts needed for modeling financial markets, namely stochastic processes, stochastic integration, and basic models for price and spread dynamics required for building quantitative strategies. We also contrast these models with real market data sampled at one-minute frequency from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behavior: trend following or mean reversion. The former are grouped under so-called technical models and the latter under so-called pairs trading. Technical models have been dismissed by financial theoreticians, but we show that they can be properly cast as well-defined scientific predictors if the signal they generate passes the test of being a Markov time; that is, we can tell whether the signal has occurred or not by examining the information up to the current time, or, more technically, the event is F_t-measurable. The concept of pairs trading, or market-neutral strategy, is by contrast fairly simple, yet it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, through a co-integration framework, to stochastic differential equations such as the well-known mean-reverting Ornstein-Uhlenbeck process and its variations. A model for forecasting any economic or financial magnitude could be defined with scientific rigor yet lack any economic value and be considered useless from a practical point of view. This is why this project could not be complete without a backtest of the aforementioned strategies. Conducting a useful and realistic backtest is by no means a trivial exercise, since the 'laws' that govern financial markets are constantly evolving in time. For this reason we emphasize the calibration of the strategies' parameters to adapt to the prevailing market conditions. We find that the parameters of technical models are more volatile than their counterparts for market-neutral strategies, and that calibration must be done with high-frequency sampling in order to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtest with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numerical computations and graphics used and shown in this project were implemented in MATLAB from scratch as part of this thesis; no other mathematical or statistical software was used.
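
As a small illustration of the mean-reversion side of the discussion, the Python sketch below simulates an Ornstein-Uhlenbeck spread with an Euler scheme and applies a naive one-standard-deviation entry rule; the parameters and the trading rule are arbitrary illustrations, not the calibrated strategies backtested in the thesis (which were implemented in MATLAB).

    # Ornstein-Uhlenbeck spread, Euler-discretized, with a toy entry rule.
    import numpy as np

    rng = np.random.default_rng(42)
    theta, mu, sigma, dt = 2.0, 0.0, 0.3, 1.0 / 252   # reversion speed, level, vol

    n = 252
    spread = np.empty(n)
    spread[0] = 0.5
    for t in range(1, n):
        # dX = theta * (mu - X) dt + sigma dW
        spread[t] = (spread[t - 1] + theta * (mu - spread[t - 1]) * dt
                     + sigma * np.sqrt(dt) * rng.standard_normal())

    band = spread.std()
    position = np.where(spread > mu + band, -1, np.where(spread < mu - band, 1, 0))
    print(f"days short: {(position == -1).sum()}, days long: {(position == 1).sum()}")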

Abstract:

In the first part of this research, three stages were defined for a program to increase the information extracted from ink evidence and maximise its usefulness to the criminal and civil justice system. These stages are: (a) develop a standard methodology for analysing ink samples by high-performance thin-layer chromatography (HPTLC) in a reproducible way, even when ink samples are analysed at different times, in different locations and by different examiners; (b) compare ink samples automatically and objectively; and (c) define and evaluate a theoretical framework for the use of ink evidence in a forensic context. This report focuses on the second of the three stages. Using the calibration and acquisition process described in the previous report, mathematical algorithms are proposed to compare ink samples automatically and objectively. The performance of these algorithms is systematically studied under various chemical and forensic conditions using standard performance tests commonly employed in biometric studies. The results show that different algorithms are best suited to different tasks. Finally, this report demonstrates how modern analytical and computer technology can be used in the field of ink examination, and how tools developed and successfully applied in other fields of forensic science can help maximise its impact within the field of questioned documents.
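
One plausible building block for such comparison algorithms is a similarity score between densitometric profiles. The Python sketch below compares synthetic HPTLC-like intensity curves with a Pearson correlation; the profiles and the choice of metric are illustrative assumptions, not the algorithms evaluated in the report.

    # Correlation-based similarity between synthetic HPTLC intensity profiles.
    import numpy as np

    x = np.linspace(0, 1, 200)                        # retention-factor axis
    profile_a = (np.exp(-((x - 0.3) / 0.05) ** 2)
                 + 0.6 * np.exp(-((x - 0.7) / 0.04) ** 2))
    profile_b = profile_a + np.random.default_rng(0).normal(0, 0.02, x.size)  # same ink, noisy
    profile_c = np.exp(-((x - 0.5) / 0.05) ** 2)      # different ink

    def similarity(p, q):
        return np.corrcoef(p, q)[0, 1]                # Pearson correlation score

    print(f"same ink:      {similarity(profile_a, profile_b):.3f}")
    print(f"different ink: {similarity(profile_a, profile_c):.3f}")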

Abstract:

Objectives: We are interested in the numerical simulation of the anastomotic region between the outflow cannula of a left ventricular assist device (LVAD) and the aorta. Segmentation, geometry reconstruction and grid generation from patient-specific data remain an issue because of the variable quality of DICOM images, in particular CT scans (e.g. metallic noise from the device, non-aortic contrast phase). We propose a general framework to overcome this problem and create grids suitable for numerical simulations.
Methods: Preliminary treatment of the images is performed by reducing the window level and enhancing the contrast of the greyscale image using contrast-limited adaptive histogram equalization. A gradient anisotropic diffusion filter is then applied to reduce the noise. Watershed segmentation algorithms and mathematical morphology filters subsequently allow the patient geometry to be reconstructed. This is done using the Insight Toolkit library (www.itk.org). Finally, the Vascular Modeling Toolkit (www.vmtk.org) and gmsh (www.geuz.org/gmsh) are used to create the meshes for the fluid (blood) and the structure (arterial wall, outflow cannula) and to identify the boundary layers a priori. The method is tested on five different patients with left ventricular assist devices who underwent a CT-scan exam.
Results: The method produced good results in four patients: the anastomosis area is recovered and the generated grids are suitable for numerical simulations. In one patient the method failed to produce a good segmentation because of the small dimension of the aortic arch relative to the image resolution.
Conclusions: The described framework allows the use of data that could not otherwise be segmented by standard automatic segmentation tools. In particular, the computational grids that were generated are suitable for simulations that take fluid-structure interaction into account. Finally, the presented method offers good reproducibility and is fast to apply.
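
The preprocessing and segmentation chain described in the Methods can be mirrored, in two dimensions and on a synthetic image, with open-source Python tools. The sketch below uses scikit-image's CLAHE, a hand-rolled Perona-Malik diffusion step in place of ITK's gradient anisotropic diffusion filter, and marker-based watershed; it is an illustrative stand-in for the ITK/VMTK pipeline used in the study, not a reproduction of it.

    # 2-D toy version of the CLAHE -> anisotropic diffusion -> watershed chain.
    import numpy as np
    from skimage import exposure, filters, segmentation

    rng = np.random.default_rng(0)
    yy, xx = np.mgrid[0:128, 0:128]
    vessel = ((xx - 64) ** 2 + (yy - 64) ** 2 < 20 ** 2).astype(float)
    image = np.clip(0.6 * vessel + 0.2 + rng.normal(0, 0.05, vessel.shape), 0, 1)

    def anisotropic_diffusion(img, n_iter=10, kappa=0.1, dt=0.15):
        # Perona-Malik smoothing, standing in for ITK's gradient anisotropic diffusion.
        img = img.astype(float).copy()
        for _ in range(n_iter):
            gy, gx = np.gradient(img)
            c = np.exp(-((gx ** 2 + gy ** 2) / kappa ** 2))
            img += dt * (np.gradient(c * gy, axis=0) + np.gradient(c * gx, axis=1))
        return img

    enhanced = exposure.equalize_adapthist(image, clip_limit=0.03)   # CLAHE
    smoothed = anisotropic_diffusion(enhanced)
    gradient = filters.sobel(smoothed)

    hi, lo = np.percentile(smoothed, [95, 5])         # crude foreground/background seeds
    markers = np.zeros_like(smoothed, dtype=int)
    markers[smoothed >= hi] = 2
    markers[smoothed <= lo] = 1
    labels = segmentation.watershed(gradient, markers)
    print("segmented vessel pixels:", int((labels == 2).sum()))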

Abstract:

BACKGROUND: The prevalence of obesity has increased in societies of all socio-cultural backgrounds. To date, guidelines set forward to prevent obesity have universally emphasized optimal levels of physical activity. However, empirical data supporting the assertion that low levels of activity-related energy expenditure are a causal factor in the current obesity epidemic are very limited. METHODS: The Modeling the Epidemiologic Transition Study (METS) is a cohort study designed to assess the association between physical activity levels and relative weight, weight gain, and the risk of diabetes and cardiovascular disease in five population-based samples at different stages of economic development. Twenty-five hundred young adults, aged 25-45, were enrolled in the study: 500 from each of the sites in Ghana, South Africa, Seychelles, Jamaica and the United States. At baseline, physical activity levels were assessed using accelerometry and a questionnaire in all participants, and by doubly labeled water in a subsample of 75 per site. We assessed dietary intake using two separate 24-hour recalls, body composition using bioelectrical impedance analysis, and health history and social and economic indicators by questionnaire. Blood pressure was measured, and blood samples were collected for measurement of lipids, glucose, insulin and adipokines. A full examination, including physical activity by accelerometry, anthropometric data and fasting glucose, will take place at 12 and 24 months. The distributions of the main variables and the associations between physical activity (independent of energy intake), glucose metabolism and anthropometric measures will be assessed using cross-sectional and longitudinal analyses within and between sites. DISCUSSION: METS will provide insight into the relative contributions of physical activity and diet to excess weight, age-related weight gain and incident glucose impairment in five population-based samples of young adults at different stages of economic development. These data should be useful for the development of empirically based public health policy aimed at the prevention of obesity and associated chronic diseases.