881 results for Interaction modeling. Model-based development. Interaction evaluation.
Abstract:
Modeling of water movement in unsaturated soil usually requires a large number of parameters and variables, such as initial soil water content, saturated water content and saturated hydraulic conductivity, only some of which can be assessed relatively easily. One-dimensional flow of water in the soil is usually modeled by a nonlinear partial differential equation known as the Richards equation. Since this equation cannot be solved analytically in certain cases, one way to approach its solution is by numerical algorithms. The success of numerical models in describing the dynamics of water in the soil is closely related to the accuracy with which the water-physical parameters are determined. This has been a major challenge in the use of numerical models, because these parameters are generally difficult to determine and present great spatial variability in the soil. It is therefore necessary to develop and use methods that properly incorporate the uncertainties inherent to water displacement in soils. In this paper, a model based on fuzzy logic is used as an alternative to describe water flow in the vadose zone. The fuzzy model was developed to simulate the displacement of water in a non-vegetated crop soil during the period called the emergence phase. The model consists of a Mamdani fuzzy rule-based system in which the rules are based on the moisture content of adjacent soil layers. The performance of the fuzzy system was evaluated by comparing the simulated evolution of moisture profiles over time with profiles measured in the field. The fuzzy model reproduced the soil moisture profiles satisfactorily.
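A Mamdani rule-based system of the kind described can be sketched in a few lines. The membership functions, linguistic terms and the two rules below are illustrative assumptions, not the paper's calibrated rule base:

```python
def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def mamdani_flux(theta_upper, theta_lower):
    """Fuzzy moisture-flux term inferred from adjacent-layer water contents.

    Hypothetical rules:
      R1: IF upper layer WET AND lower layer DRY THEN flux HIGH
      R2: IF upper layer DRY OR  lower layer WET THEN flux LOW
    """
    wet_u, dry_u = tri(theta_upper, 0.2, 0.4, 0.5), tri(theta_upper, 0.0, 0.1, 0.25)
    wet_l, dry_l = tri(theta_lower, 0.2, 0.4, 0.5), tri(theta_lower, 0.0, 0.1, 0.25)
    w1 = min(wet_u, dry_l)          # fuzzy AND -> min
    w2 = max(dry_u, wet_l)          # fuzzy OR  -> max

    # Mamdani implication: clip the output sets by the rule strengths,
    # aggregate with max, then defuzzify by centroid over a discrete universe.
    zs = [i / 200.0 for i in range(201)]
    agg = [max(min(tri(z, 0.5, 0.8, 1.0), w1),
               min(tri(z, 0.0, 0.2, 0.5), w2)) for z in zs]
    den = sum(agg)
    return sum(z * a for z, a in zip(zs, agg)) / den if den else 0.0
```

A wet layer above a dry layer then yields a high flux value, and the reverse configuration a low one.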
Abstract:
Chronic pain is a complex, disabling experience that negatively affects cognitive, affective and physical function as well as behavior. Although the interaction between chronic pain and physical functioning is a well-accepted paradigm in clinical research, understanding how pain affects individuals' daily-life behavior remains a challenging task. Here we develop a methodological framework for objectively documenting disruptive, pain-related interference with real-life physical activity. The results reveal that meaningful information is contained in the temporal dynamics of activity patterns and that an analytical model based on the theory of bivariate point processes can be used to describe physical activity behavior. The model parameters capture the dynamic interdependence between periods and events and determine a 'signature' of the activity pattern. The study is likely to contribute to the clinical understanding of complex pain- and disease-related behaviors and to establish a unified mathematical framework for quantifying the complex dynamics of various human activities.
Abstract:
Context: To date, the testosterone/epitestosterone (T/E) ratio has been the main marker for detecting testosterone (T) misuse in athletes. As this marker can be influenced by a number of confounding factors, additional steroid profile parameters indicating T misuse can provide substantiating evidence of doping with endogenous steroids. The evaluation of a steroid profile is currently based upon population statistics. Since large inter-individual variations exist, a paradigm shift towards subject-based references is ongoing in doping analysis. Objective: To propose new biomarkers for the detection of testosterone misuse in sports using extensive steroid profiling and an adaptive model based upon Bayesian inference. Subjects: Six healthy male volunteers were administered testosterone undecanoate. Population statistics were performed upon steroid profiles from 2014 male Caucasian athletes participating in official sport competition. Design: An extended search for new biomarkers in a comprehensive steroid profile, combined with Bayesian inference techniques as used in the Athlete Biological Passport, resulted in a selection of additional biomarkers that may improve detection of testosterone misuse in sports. Results: Apart from T/E, 4 other steroid ratios (6α-OH-androstenedione/16α-OH-dehydroepiandrostenedione, 4-OH-androstenedione/16α-OH-androstenedione, 7α-OH-testosterone/7β-OH-dehydroepiandrostenedione and dihydrotestosterone/5β-androstane-3α,17β-diol) were identified as sensitive urinary biomarkers for T misuse. These new biomarkers were rated according to relative response, parameter stability, detection time and discriminative power. Conclusion: The newly selected biomarkers were found suitable for individual referencing within the concept of the Athlete Biological Passport. The parameters showed improved detection time and discriminative power compared to the T/E ratio. Such biomarkers can support the evidence of doping with small oral doses of testosterone.
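The subject-based referencing idea behind the Athlete Biological Passport can be illustrated with a conjugate normal-normal Bayesian update, where each new test narrows the athlete's individual reference range. All numbers here are invented for illustration, not WADA values:

```python
def update_normal(prior_mean, prior_var, obs, obs_var):
    """Posterior for a normal mean after one observation (known variances)."""
    precision = 1.0 / prior_var + 1.0 / obs_var
    post_var = 1.0 / precision
    post_mean = post_var * (prior_mean / prior_var + obs / obs_var)
    return post_mean, post_var

# Start from a population prior for a log-transformed steroid ratio,
# then individualize it with this athlete's own baseline tests.
mean, var = 0.0, 1.0                    # illustrative population prior
for log_ratio in [0.1, 0.15, 0.05]:     # hypothetical baseline measurements
    mean, var = update_normal(mean, var, log_ratio, obs_var=0.2)
```

With each observation the posterior variance shrinks, so the individual reference becomes tighter than the population-based one.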
Abstract:
In this work, the calcium-induced aggregation of phosphatidylserine liposomes is probed by analyzing the kinetics of the process as well as the aggregate morphology. This novel characterization of liposome aggregation uses static and dynamic light-scattering techniques to obtain kinetic exponents and fractal dimensions. For salt concentrations larger than 5 mM, a diffusion-limited aggregation regime is observed and the Brownian kernel properly describes the time evolution of the diffusion coefficient. For slow kinetics, a slightly modified multiple-contact kernel is required. In either case, a time-evolution model based on the numerical resolution of Smoluchowski's equation is proposed in order to establish a theoretical description of the aggregating system. Such a model provides an alternative procedure for determining the dimerization constant, which may supply valuable information about interaction mechanisms between phospholipid vesicles.
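A minimal numerical resolution of Smoluchowski's coagulation equation can be sketched as below. For clarity the kernel is taken constant (the size-independent limit of the Brownian kernel), and the rate constant is arbitrary rather than fitted to the liposome data:

```python
def smoluchowski_step(n, k, dt):
    """One explicit Euler step; n[i] is the concentration of (i+1)-mers."""
    m = len(n)
    dn = [0.0] * m
    for i in range(m):
        for j in range(m):
            rate = k * n[i] * n[j]           # collisions of (i+1)- and (j+1)-mers
            dn[i] -= rate                    # (i+1)-mers consumed
            if i + j + 1 < m:
                dn[i + j + 1] += 0.5 * rate  # larger aggregate formed
    return [ni + dt * dni for ni, dni in zip(n, dn)]

# Start from monomers only; the early-time growth of the dimer population is
# governed by the dimerization constant, which a fit to light-scattering data yields.
n = [1.0] + [0.0] * 9
for _ in range(100):
    n = smoluchowski_step(n, k=1.0, dt=0.01)
```

Total mass is conserved by the scheme up to truncation of the largest cluster sizes, which is a useful sanity check on any implementation.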
Abstract:
Pharmacokinetic variability in drug levels represents, for some drugs, a major determinant of treatment success, since concentrations outside the therapeutic range may lead to toxic reactions, treatment discontinuation or inefficacy. This is true for most antiretroviral drugs, which exhibit high inter-patient variability in their pharmacokinetics that has been only partially explained by genetic and non-genetic factors. The population pharmacokinetic approach is a very useful tool for describing the dose-concentration relationship, quantifying variability in the target population of patients and identifying influencing factors. It can thus be used to make predictions and to optimize dosage adjustment based on Bayesian therapeutic drug monitoring (TDM). This approach was used to characterize the pharmacokinetics of nevirapine (NVP) in 137 HIV-positive patients followed within the frame of a TDM program. Among the tested covariates, body weight, co-administration of a cytochrome P450 (CYP) 3A4 inducer or of boosted atazanavir, as well as elevated aspartate transaminases, showed an effect on NVP elimination. In addition, genetic polymorphism in CYP2B6 was associated with reduced NVP clearance. Altogether, these factors could explain 26% of NVP variability. Model-based simulations were used to compare the adequacy of different dosage regimens with respect to the therapeutic target associated with treatment efficacy. In conclusion, the population approach is very useful for characterizing the pharmacokinetic profile of drugs in a population of interest. Quantifying and identifying the sources of variability is a rational approach to making optimal dosage decisions for certain drugs administered chronically.
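Model-based simulations of the kind mentioned typically rest on a structural PK model. A minimal sketch is a one-compartment model with first-order oral absorption; the parameter values below are illustrative, not the nevirapine estimates from the study:

```python
import math

def conc(t, dose, ka, cl, v):
    """Plasma concentration after a single oral dose (one-compartment model).

    ka: absorption rate constant (1/h); cl: clearance (L/h); v: volume (L).
    """
    ke = cl / v                              # elimination rate constant
    return dose * ka / (v * (ka - ke)) * (math.exp(-ke * t) - math.exp(-ka * t))

# Compare two hypothetical doses at a fixed sampling time (12 h post-dose).
c_200 = conc(12.0, dose=200.0, ka=1.0, cl=3.0, v=80.0)
c_400 = conc(12.0, dose=400.0, ka=1.0, cl=3.0, v=80.0)
```

In a population analysis, `cl` and `v` would carry covariate effects (e.g. body weight, CYP2B6 genotype) and random inter-patient variability on top of this structural model.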
Abstract:
Excitation-continuous music instrument control patterns are often not explicitly represented in current sound synthesis techniques when applied to automatic performance. Both physical-model-based and sample-based synthesis paradigms would benefit from a flexible and accurate instrument control model, enabling improved naturalness and realism. We present a framework for modeling bowing control parameters in violin performance. Nearly non-intrusive sensing techniques allow for accurate acquisition of relevant timbre-related bowing control parameter signals. We model the temporal contour of bow velocity, bow pressing force, and bow-bridge distance as sequences of short cubic Bézier curve segments. Considering different articulations, dynamics, and performance contexts, a number of note classes are defined. Contours of bowing parameters in a performance database are analyzed at the note level by following a predefined grammar that dictates the characteristics of curve segment sequences for each of the classes under consideration. As a result, contour analysis of the bowing parameters of each note yields an optimal representation vector that is sufficient for reconstructing the original contours with significant fidelity. From the resulting representation vectors, we construct a statistical model based on Gaussian mixtures suitable for both the analysis and synthesis of bowing parameter contours. Using the estimated models, synthetic contours can be generated through a bow planning algorithm able to reproduce constraints imposed by the finite length of the bow. Rendered contours are successfully used in two preliminary synthesis frameworks: digital waveguide-based bowed-string physical modeling and sample-based spectral-domain synthesis.
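A single contour segment of the kind described can be evaluated directly from the cubic Bernstein basis; the control values below are invented for illustration, not taken from the performance database:

```python
def bezier3(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier segment at t in [0, 1] (Bernstein form)."""
    s = 1.0 - t
    return s**3 * p0 + 3 * s**2 * t * p1 + 3 * s * t**2 * p2 + t**3 * p3

# A hypothetical bow-velocity contour: one segment sampled at 11 points.
# The curve starts at p0, ends at p3, and is shaped by the inner controls.
contour = [bezier3(0.0, 0.6, 0.9, 0.2, i / 10) for i in range(11)]
```

A full note contour would concatenate several such segments, with the grammar constraining how segment endpoints and control points join across the sequence.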
Abstract:
Introduction: Ankle arthrodesis (AD) and total ankle replacement (TAR) are typical treatments for ankle osteoarthritis (AO). Despite clinical interest, their outcomes are rarely evaluated using objective criteria. Gait analysis and plantar pressure assessment are appropriate for detecting pathologies in orthopaedics, but they are mostly used in the lab with few gait cycles. In this study, we propose an ambulatory device based on inertial and plantar pressure sensors to compare gait during long-distance trials between healthy subjects (H) and patients with AO or treated by AD or TAR. Methods: Our study included four groups: 11 patients with AO, 9 treated by TAR, 7 treated by AD and 6 control subjects. An ambulatory system (Physilog®, CH) was used for gait analysis; plantar pressure measurements were made using a portable insole (Pedar®-X, DE). The subjects were asked to walk 50 meters in two trials. The mean value and coefficient of variation of spatio-temporal gait parameters were calculated for each trial. Pressure distribution was analyzed in ten subregions of the foot. All parameters were compared among the four groups using multi-level model-based statistical analysis. Results: A significant difference (p < 0.05) from controls was observed for AO patients in maximum force in the medial hindfoot and forefoot and in the central forefoot. These differences were no longer significant in the TAR and AD groups. Cadence and speed of all pathologic groups differed significantly from controls. Both treatments showed a significant improvement in double support and stance. TAR decreased variability in speed, stride length and knee ROM. Conclusions: In spite of the small sample size, this study showed that ankle function after AO treatment can be evaluated objectively based on plantar pressure and spatio-temporal gait parameters measured during unconstrained walking outside the lab. The combination of these two ambulatory techniques provides a promising way to evaluate foot function in clinics.
Abstract:
We present a framework for modeling right-hand gestures in bowed-string instrument playing, applied to the violin. Nearly non-intrusive sensing techniques allow for accurate acquisition of relevant timbre-related bowing gesture parameter cues. We model the temporal contour of bow transversal velocity, bow pressing force, and bow-bridge distance as sequences of short segments, in particular cubic Bézier curve segments. Considering different articulations, dynamics, and contexts, a number of note classes are defined. Gesture parameter contours of a performance database are analyzed at the note level by following a predefined grammar that dictates the characteristics of curve segment sequences for each of the classes under consideration. Based on dynamic programming, gesture parameter contour analysis provides an optimal curve parameter vector for each note. The information present in such a parameter vector is sufficient for reconstructing the original gesture parameter contours with significant fidelity. From the resulting representation vectors, we construct a statistical model based on Gaussian mixtures, suitable for both analysis and synthesis of bowing gesture parameter contours. We show the potential of the model by synthesizing bowing gesture parameter contours from an annotated input score. Finally, we point out promising applications and developments.
Abstract:
Spatial data analysis, mapping and visualization are of great importance in various fields: environment, pollution, natural hazards and risks, epidemiology, spatial econometrics, etc. A basic task of spatial mapping is to make predictions based on some empirical data (measurements). A number of state-of-the-art methods can be used for the task: deterministic interpolations; methods of geostatistics, i.e. the family of kriging estimators (Deutsch and Journel, 1997); machine learning algorithms such as artificial neural networks (ANN) of different architectures; hybrid ANN-geostatistics models (Kanevski and Maignan, 2004; Kanevski et al., 1996); etc. All the methods mentioned above can be used for solving the problem of spatial data mapping. Environmental empirical data are always contaminated/corrupted by noise, often of unknown nature. This is one reason why deterministic models can be inconsistent, since they treat the measurements as values of some unknown function that should be interpolated. Kriging estimators treat the measurements as the realization of a spatial random process. To obtain an estimate with kriging, one has to model the spatial structure of the data: the spatial correlation function or (semi-)variogram. This task can be complicated if the number of measurements is insufficient, and the variogram is sensitive to outliers and extremes. ANNs are a powerful tool, but they also suffer from a number of drawbacks. ANNs of a special type, multilayer perceptrons, are often used as a detrending tool in hybrid (ANN + geostatistics) models (Kanevski and Maignan, 2004). Therefore, the development and adaptation of a method that is nonlinear and robust to noise in the measurements, that can deal with small empirical datasets and that has a solid mathematical background is of great importance. The present paper deals with such a model, based on Statistical Learning Theory (SLT): Support Vector Regression.
SLT is a general mathematical framework devoted to the problem of estimating dependencies from empirical data (Hastie et al., 2004; Vapnik, 1998). SLT models for classification, Support Vector Machines, have shown good results on different machine learning tasks. The results of SVM classification of spatial data are also promising (Kanevski et al., 2002). The properties of the SVM for regression, Support Vector Regression (SVR), are less studied. The first results of applying SVR to the spatial mapping of physical quantities were obtained by the authors for the mapping of medium porosity (Kanevski et al., 1999) and for the mapping of radioactively contaminated territories (Kanevski and Canu, 2000). The present paper is devoted to further understanding the properties of the SVR model for spatial data analysis and mapping. A detailed description of SVR theory can be found in (Cristianini and Shawe-Taylor, 2000; Smola, 1996), and the basic equations for nonlinear modeling are given in Section 2. Section 3 discusses the application of SVR to spatial data mapping on a real case study: soil pollution by the Cs137 radionuclide. Section 4 discusses the properties of the model applied to noisy data and data with outliers.
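As an illustration of the SVR idea (though not the authors' implementation), a nonlinear spatial field can be regressed from noisy samples with an RBF-kernel Support Vector Regression, e.g. via scikit-learn; the field and noise level below are synthetic:

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
xy = rng.uniform(0, 1, size=(200, 2))                # sample locations
z = np.sin(3 * xy[:, 0]) * np.cos(3 * xy[:, 1])      # underlying spatial field
z_noisy = z + rng.normal(0, 0.1, size=200)           # noisy measurements

# The epsilon-insensitive loss and the C penalty control the trade-off
# between robustness to noise and fidelity to the data.
model = SVR(kernel="rbf", C=10.0, epsilon=0.05).fit(xy, z_noisy)
pred = model.predict(xy)
```

The epsilon-insensitive loss is what gives SVR its robustness to measurement noise, which is the property the paper exploits for environmental data.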
Abstract:
To compare the cost and effectiveness of the levonorgestrel-releasing intrauterine system (LNG-IUS) versus combined oral contraception (COC) and progestogens (PROG) in first-line treatment of dysfunctional uterine bleeding (DUB) in Spain. STUDY DESIGN: A cost-effectiveness and cost-utility analysis of LNG-IUS, COC and PROG was carried out using a Markov model based on clinical data from the literature and expert opinion. The population studied were women with a previous diagnosis of idiopathic heavy menstrual bleeding. The analysis was performed from the National Health System perspective, discounting both costs and future effects at 3%. In addition, a sensitivity analysis (univariate and probabilistic) was conducted. RESULTS: The results show that the greater efficacy of LNG-IUS translates into a gain of 1.92 and 3.89 symptom-free months (SFM) after six months of treatment versus COC and PROG, respectively (which represents an increase of 33% and 60% of symptom-free time). Regarding costs, LNG-IUS produces savings of 174.2-309.95 and 230.54-577.61 versus COC and PROG, respectively, after 6 months-5 years. Apart from cost savings and gains in SFM, quality-adjusted life months (QALM) are also favourable to LNG-IUS in all scenarios, with a range of gains between 1 and 2 QALM compared to COC and PROG. CONCLUSIONS: The results indicate that first-line use of the LNG-IUS is the dominant therapeutic option (less costly and more effective) in comparison with first-line use of COC or PROG for the treatment of DUB in Spain. LNG-IUS as first line is also the option that provides greatest health-related quality of life to patients.
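The general mechanics of such a Markov cost-effectiveness model can be sketched with a two-state cohort simulation with discounting; the transition probabilities, costs and cycle counts below are invented for illustration, not the study's inputs:

```python
def markov_cohort(p_stay, cost_per_cycle, n_cycles, discount=0.03):
    """Discounted cost and symptom-free time for one strategy (2-state model)."""
    symptomatic, total_cost, symptom_free = 1.0, 0.0, 0.0
    for cycle in range(n_cycles):
        d = 1.0 / (1.0 + discount) ** cycle           # per-cycle discounting
        total_cost += d * symptomatic * cost_per_cycle
        symptom_free += d * (1.0 - symptomatic)       # discounted SFM analogue
        symptomatic *= p_stay                         # some patients respond
    return total_cost, symptom_free

# Hypothetical comparison: a more effective first-line option (lower p_stay).
cost_a, sfm_a = markov_cohort(p_stay=0.6, cost_per_cycle=10.0, n_cycles=12)
cost_b, sfm_b = markov_cohort(p_stay=0.8, cost_per_cycle=10.0, n_cycles=12)
```

A strategy that is both cheaper and more effective, as LNG-IUS is reported to be here, is called dominant; in this toy model the lower `p_stay` arm dominates.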
Abstract:
Advancements in high-throughput technologies to measure increasingly complex biological phenomena at the genomic level are rapidly changing the face of biological research from the single-gene single-protein experimental approach to studying the behavior of a gene in the context of the entire genome (and proteome). This shift in research methodologies has resulted in a new field of network biology that deals with modeling cellular behavior in terms of network structures such as signaling pathways and gene regulatory networks. In these networks, different biological entities such as genes, proteins, and metabolites interact with each other, giving rise to a dynamical system. Even though there exists a mature field of dynamical systems theory to model such network structures, some technical challenges are unique to biology, such as the inability to measure precise kinetic information on gene-gene or gene-protein interactions and the need to model increasingly large networks comprising thousands of nodes. These challenges have renewed interest in developing new computational techniques for modeling complex biological systems. This chapter presents a modeling framework based on Boolean algebra and finite-state machines that is reminiscent of the approach used for digital circuit synthesis and simulation in the field of very-large-scale integration (VLSI). The proposed formalism enables a common mathematical framework to develop computational techniques for modeling different aspects of the regulatory networks such as steady-state behavior, stochasticity, and gene perturbation experiments.
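The Boolean/finite-state formalism can be made concrete with a toy three-gene network under synchronous update; the update rules below are hypothetical, not drawn from a real regulatory network:

```python
def step(state):
    """Synchronous Boolean update of a hypothetical three-gene network."""
    a, b, c = state
    return (b and not c,   # A is activated by B and repressed by C
            a,             # B simply follows A
            a or b)        # C integrates A and B

def attractor(state, max_steps=16):
    """Iterate the update until a state repeats; returns the revisited state."""
    seen = []
    while state not in seen and len(seen) < max_steps:
        seen.append(state)
        state = step(state)
    return state
```

Because the state space is finite (2^n states), every trajectory must eventually enter a fixed point or cycle; those attractors are the steady-state behaviors the chapter's framework analyzes.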
Abstract:
Design and manufacturing functions are becoming separated in the sheet-metal industry as well. To improve competitiveness, manufacturing is being transferred to subcontractors in countries where labour is cheap and large markets are close, while product development and design remain in Finland or elsewhere in Western Europe. The synergy benefits of the old, local operating model are then no longer available, and assessing the manufacturability of a product, managing quality and ensuring the compatibility of components become very challenging. This study sought answers to the challenges posed by the new operating model in feature-based modeling. In line with this objective, the manufacturing-related features of a sheet-metal product were identified and a corresponding feature diagram was created. Feature identification required research into modern design methods as well as into manufacturing methods and sheet-metal materials. In addition, the study examined what effects the features have on the manufacturability of a product and how these should be taken into account in product design.
Abstract:
The aim of this thesis is to develop a valuation model based on the Microsoft Excel spreadsheet program. With the model, analysts and investors conducting equity research can determine the fundamental value of a share. The model is developed especially as a tool for small investors. The second aim is to apply the developed valuation model to the valuation of the case company, F-Secure, and to determine whether F-Secure's share is correctly priced on the stock exchange relative to its fundamentals. The theoretical part presents the uses and history of valuation, the stages of the valuation process (strategic analysis, financial statement analysis, forecasting, calculating the value of the company), the determination of the cost of capital, and the investor's different valuation methods, which comprise models based on discounted cash flow as well as the multiples of relative valuation. The empirical part covers the development of the valuation model, a description of its structure, and the valuation process of F-Secure. Although F-Secure's future looks quite bright, the share is currently (23.02.2006) priced higher on the market than these expectations would justify. The different methods give the share values between EUR 2.25 and EUR 2.97. As the median of the different methods, the developed Excel model sets a target price of EUR 2.29 for the F-Secure share. The study concludes that the F-Secure share can be considered overvalued, since its price on the stock exchange is EUR 3.05.
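The core of such a spreadsheet valuation is a discounted-cash-flow calculation with a terminal value; the cash flows, discount rate and growth rate below are invented and are not the thesis's F-Secure inputs:

```python
def dcf_value(cash_flows, wacc, terminal_growth):
    """Present value of explicit cash flows plus a Gordon-growth terminal value."""
    pv = sum(cf / (1 + wacc) ** t for t, cf in enumerate(cash_flows, start=1))
    terminal = cash_flows[-1] * (1 + terminal_growth) / (wacc - terminal_growth)
    pv += terminal / (1 + wacc) ** len(cash_flows)
    return pv

# Illustrative: three years of explicit forecasts, then perpetual growth.
value = dcf_value([10.0, 10.0, 10.0], wacc=0.10, terminal_growth=0.02)
```

Dividing such an enterprise or equity value by the share count gives a per-share target price, which is then compared to the market quote, as the thesis does for F-Secure.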
Abstract:
The fast development of new technologies such as digital medical imaging has brought about an expansion of brain functional studies. One of the key methodological issues in brain functional studies is to compare neuronal activation between individuals. In this context, the great variability of brain size and shape is a major problem. Current methods allow inter-individual comparisons by normalising subjects' brains with respect to a standard brain. The most widely used standard brains are the proportional grid of Talairach and Tournoux and the Montreal Neurological Institute standard brain (SPM99). However, these registration methods are not precise enough to superpose the more variable portions of the cerebral cortex (e.g., the neocortex and the perisylvian zone) or brain regions that are highly asymmetric between the two cerebral hemispheres (e.g., the planum temporale). The aim of this thesis is to evaluate a new image processing technique based on non-rigid, model-based registration. Contrary to intensity-based registration, model-based registration uses spatial rather than intensity information to fit one image to another. We extract identifiable anatomical features (point landmarks) in both the deforming and the target images, and from their correspondence we determine the appropriate deformation in 3D. As landmarks, we use six control points situated bilaterally: one on Heschl's gyrus, one on the motor hand area, and one on the sylvian fissure.
The evaluation of this model-based approach is performed on MRI and fMRI images of nine of the eighteen subjects who participated in the earlier study by Maeder et al. Results on anatomical (MRI) images show the movement of the deforming brain's control points to the locations of the reference brain's control points. The distance from the deforming brain to the reference brain is smaller after registration than before. Registration of the functional (fMRI) images does not show a significant variation: the small number of registration landmarks (six) is evidently not sufficient to produce significant modifications of the fMRI statistical maps. This thesis opens the way to a new computational technique for cortex registration, whose main direction will be to improve the registration algorithm by using not a single point as a landmark but many points representing a particular sulcus.
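With only six point landmarks, one simple form of landmark-based registration is a least-squares affine fit between corresponding points. This is a sketch, not the thesis's deformation model, and the landmark coordinates are invented:

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares affine map (4x3 parameters, homogeneous form) src -> dst."""
    src_h = np.hstack([src, np.ones((len(src), 1))])
    sol, *_ = np.linalg.lstsq(src_h, dst, rcond=None)
    return sol

def apply_affine(sol, pts):
    """Apply the fitted affine map to an array of 3D points."""
    return np.hstack([pts, np.ones((len(pts), 1))]) @ sol

# Six invented landmark pairs (standing in for Heschl's gyrus, motor hand
# area and sylvian fissure, bilaterally), related by a scale-and-shift map.
src = np.array([[0., 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 0], [0, 1, 1]])
dst = src * 1.1 + np.array([1.0, 2.0, 3.0])
A = fit_affine(src, dst)
```

An affine map has only 12 degrees of freedom; capturing sulcal variability, as the thesis proposes, requires many more landmarks and a non-rigid (e.g. spline-based) deformation.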