968 results for Numerical Modelling
Abstract:
Given the very large amount of data obtained every day through population surveys, much new research could reuse this information instead of collecting new samples. Unfortunately, relevant data are often disseminated across different files obtained through different sampling designs. Data fusion is a set of methods used to combine information from different sources into a single dataset. In this article, we are interested in a specific problem: the fusion of two data files, one of which is quite small. We propose a model-based procedure combining logistic regression with an Expectation-Maximization algorithm. Results show that despite the lack of data, this procedure can perform better than standard matching procedures.
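The article's exact EM procedure is not reproduced in the abstract; as a minimal numpy-only sketch of the core model-based matching step, one can fit a logistic model of the variable observed only in the small file and use it to impute that variable in the large file. All variable names, parameter values, and the gradient-ascent fit below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_logistic(X, y, lr=0.5, iters=2000):
    """Plain gradient-ascent logistic regression (intercept included)."""
    Xb = np.column_stack([np.ones(len(X)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w += lr * Xb.T @ (y - p) / len(y)
    return w

def predict_proba(w, X):
    Xb = np.column_stack([np.ones(len(X)), X])
    return 1.0 / (1.0 + np.exp(-Xb @ w))

# Large file: common covariate X only; small file: X plus the binary Z to be fused.
x_large = rng.normal(size=2000)
x_small = rng.normal(size=150)
z_small = (x_small + 0.3 * rng.normal(size=150) > 0).astype(float)

w = fit_logistic(x_small[:, None], z_small)             # model Z | X on the small file
z_imputed = predict_proba(w, x_large[:, None]) > 0.5    # fuse Z into the large file
```

An EM loop would additionally alternate this fit with an expectation step over incomplete cases; the sketch shows only the matching step shared between the two files.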
Abstract:
In this work we develop a viscoelastic bar element that can handle multiple rheological laws with non-linear elastic and non-linear viscous material models. The bar element is built by joining in series an elastic and a viscous bar, constraining the middle node position to the bar axis with a reduction method, and statically condensing the internal degrees of freedom. We apply the methodology to the modelling of reversible softening with stiffness recovery in both 2D and 3D, a phenomenology also experimentally observed during stretching cycles on epithelial lung cell monolayers.
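In 1D, the series elastic-viscous arrangement the element is built from reduces to the classical Maxwell model. A minimal sketch (illustrative parameter values, explicit Euler time stepping, not the authors' finite-element formulation) shows the resulting stress relaxation under a held strain, with the viscous strain playing the role of the condensed internal degree of freedom:

```python
import numpy as np

k, eta = 2.0, 1.0        # spring stiffness, dashpot viscosity (illustrative units)
eps_total = 1.0          # step strain, held constant
dt, T = 1e-3, 1.0

eps_v = 0.0              # internal (viscous) strain: the condensed DOF in 1D
ts = np.arange(0.0, T, dt)
stress = np.empty_like(ts)
for n, t in enumerate(ts):
    sigma = k * (eps_total - eps_v)   # elastic branch carries the stress
    eps_v += dt * sigma / eta         # dashpot flows at rate sigma / eta
    stress[n] = sigma

# Classical Maxwell relaxation: sigma(t) = k * eps0 * exp(-k t / eta)
analytic = k * eps_total * np.exp(-k * ts / eta)
```

With a small step the explicit integration tracks the analytic exponential closely; a non-linear law would simply replace the two constitutive lines inside the loop.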
Abstract:
In this paper we propose a parsimonious regime-switching approach to model the correlations between assets, the threshold conditional correlation (TCC) model. This method allows the dynamics of the correlations to change from one state (or regime) to another as a function of observable transition variables. Our model is similar in spirit to Silvennoinen and Teräsvirta (2009) and Pelletier (2006) but with the appealing feature that it does not suffer from the curse of dimensionality. In particular, estimation of the parameters of the TCC involves a simple grid search procedure. In addition, it is easy to guarantee a positive definite correlation matrix because the TCC estimator is given by the sample correlation matrix, which is positive definite by construction. The methodology is illustrated by evaluating the behaviour of international equities, government bonds and major exchange rates, first separately and then jointly. We also test and allow for different parts of the correlation matrix to be governed by different transition variables. For this, we estimate a multi-threshold TCC specification. Further, we evaluate the economic performance of the TCC model against a constant conditional correlation (CCC) estimator using a Diebold-Mariano type test. We conclude that threshold correlation modelling gives rise to a significant reduction in the portfolio's variance.
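The grid-search idea can be sketched on synthetic data: for each candidate threshold on the transition variable, compute the sample correlation in each regime (positive definite by construction) and keep the threshold maximizing a Gaussian likelihood. The two-asset setup, unit variances, and all numbers below are illustrative simplifications, not the paper's estimator:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4000
z = rng.uniform(0.0, 1.0, n)              # observable transition variable
rho_true = np.where(z > 0.5, 0.8, 0.0)    # correlation switches at z = 0.5
e1, e2 = rng.normal(size=n), rng.normal(size=n)
y1 = e1
y2 = rho_true * e1 + np.sqrt(1.0 - rho_true**2) * e2

def regime_loglik(a, b, rho):
    """Bivariate normal log-likelihood, unit variances, constants dropped."""
    q = (a**2 - 2.0 * rho * a * b + b**2) / (1.0 - rho**2)
    return -0.5 * (len(a) * np.log(1.0 - rho**2) + q.sum())

best_c, best_ll, best_r = None, -np.inf, None
for c in np.quantile(z, np.linspace(0.1, 0.9, 41)):   # grid search on the threshold
    lo, hi = z <= c, z > c
    r_lo = np.corrcoef(y1[lo], y2[lo])[0, 1]          # sample correlation per regime
    r_hi = np.corrcoef(y1[hi], y2[hi])[0, 1]
    ll = regime_loglik(y1[lo], y2[lo], r_lo) + regime_loglik(y1[hi], y2[hi], r_hi)
    if ll > best_ll:
        best_c, best_ll, best_r = c, ll, (r_lo, r_hi)
```

With enough observations the recovered threshold lands near the true switch point, and the two regime correlations separate cleanly.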
Abstract:
It can be assumed that the composition of Mercury's thin gas envelope (exosphere) is related to the composition of the planet's crustal materials. If this relationship is true, then inferences regarding the bulk chemistry of the planet might be made from a thorough exospheric study. The most vexing of all unsolved problems is the uncertainty in the source of each component. Historically, it has been believed that H and He come primarily from the solar wind, while Na and K originate from volatilized materials partitioned between Mercury's crust and meteoritic impactors. The processes that eject atoms and molecules into the exosphere of Mercury are generally considered to be thermal vaporization, photon-stimulated desorption (PSD), impact vaporization, and ion sputtering. Each of these processes has its own temporal and spatial dependence. The exosphere is strongly influenced by Mercury's highly elliptical orbit and rapid orbital speed. As a consequence, the surface undergoes large fluctuations in temperature and experiences differences of insolation with longitude. We will discuss these processes but focus more on the expected surface composition and on solar wind particle sputtering, which releases material such as Ca and other elements from the surface minerals, and discuss the relevance of composition modelling.
Abstract:
The main objective of this paper is to develop a methodology that takes into account the human factor extracted from the database used by recommender systems, and which makes it possible to address specific prediction and recommendation problems. In this work, we propose to extract the user's human values scale from the user database in order to improve suitability in open environments, such as recommender systems. For this purpose, the methodology is applied to the user's data after interaction with the system. The methodology is exemplified with a case study.
Abstract:
Reverse transcriptase (RT) is a multifunctional enzyme in the human immunodeficiency virus (HIV)-1 life cycle and represents a primary target for drug discovery efforts against HIV-1 infection. Two classes of RT inhibitors, the nucleoside RT inhibitors (NRTIs) and the non-nucleoside RT inhibitors, are prominently used in highly active antiretroviral therapy in combination with other anti-HIV drugs. However, the rapid emergence of drug-resistant viral strains has limited the success rate of anti-HIV agents. Computational methods are a significant part of the drug design process and are indispensable for studying drug resistance. In this review, recent advances in computer-aided drug design for the rational design of new compounds against HIV-1 RT are discussed, covering methods such as molecular docking, molecular dynamics, free energy calculations, quantitative structure-activity relationships, pharmacophore modelling, and absorption, distribution, metabolism, excretion and toxicity prediction. Successful applications of these methodologies are also highlighted.
Abstract:
This article summarizes the results published in a December 2006 report of the ISS (Istituto Superiore di Sanità) on a mathematical model developed by a working group that includes researchers from the Universities of Trento, Pisa and Rome and the National Institutes of Health (Istituto Superiore di Sanità, ISS), to evaluate and measure the impact of the transmission and control of pandemic influenza.
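The report's model is not specified in this summary; as a generic illustration of the kind of transmission model such working groups build on, here is a minimal SIR system integrated with explicit Euler. All parameter values are assumptions chosen for illustration, not the report's estimates:

```python
# Minimal SIR epidemic model, explicit Euler integration.
beta, gamma = 0.3, 0.1        # transmission and recovery rates (R0 = 3, illustrative)
N = 1_000_000.0               # closed population
S, I, R = N - 100.0, 100.0, 0.0
dt, days = 0.1, 300

history = []
for _ in range(int(days / dt)):
    new_inf = beta * S * I / N    # force of infection times susceptibles
    rec = gamma * I               # recoveries
    S, I, R = S - dt * new_inf, I + dt * (new_inf - rec), R + dt * rec
    history.append(I)
```

Because the three derivatives sum to zero, the total population S + I + R is conserved at every step, which is a useful sanity check on any such implementation.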
Abstract:
Objectives: We are interested in the numerical simulation of the anastomotic region comprised between the outflow cannula of an LVAD and the aorta. Segmentation, geometry reconstruction and grid generation from patient-specific data remain an issue because of the variable quality of DICOM images, in particular CT scans (e.g. metallic noise of the device, non-aortic contrast phase). We propose a general framework to overcome this problem and create suitable grids for numerical simulations.

Methods: Preliminary treatment of the images is performed by reducing the level window and enhancing the contrast of the greyscale image using contrast-limited adaptive histogram equalization. A gradient anisotropic diffusion filter is applied to reduce the noise. Then, watershed segmentation algorithms and mathematical morphology filters allow reconstructing the patient geometry. This is done using the InsightToolKit library (www.itk.org). Finally, the Vascular Modeling ToolKit (www.vmtk.org) and gmsh (www.geuz.org/gmsh) are used to create the meshes for the fluid (blood) and the structure (arterial wall, outflow cannula) and to identify the boundary layers a priori. The method is tested on five different patients with left ventricular assistance who underwent a CT-scan exam.

Results: This method produced good results in four patients. The anastomosis area is recovered and the generated grids are suitable for numerical simulations. In one patient the method failed to produce a good segmentation because of the small dimension of the aortic arch with respect to the image resolution.

Conclusions: The described framework allows the use of data that could not otherwise be segmented by standard automatic segmentation tools. In particular, the computational grids that have been generated are suitable for simulations that take fluid-structure interactions into account. Finally, the presented method features good reproducibility and fast application.
Abstract:
Rare species have restricted geographic ranges, habitat specialization, and/or small population sizes. Datasets on rare species distributions usually have few observations, limited spatial accuracy and no valid absences; conversely, they provide comprehensive views of species distributions, allowing most of their realized environmental niche to be captured realistically. Rare species are the most in need of predictive distribution modelling but also the most difficult to model. We refer to this contrast as the "rare species modelling paradox" and propose as a solution modelling approaches that handle a sufficiently large set of predictors while ensuring that the statistical models are not overfitted. Our novel approach fulfils this condition by fitting a large number of bivariate models and averaging them with a weighted ensemble approach. We further propose that this ensemble forecasting be conducted within a hierarchical multi-scale framework. We present two ensemble models for a test species, one at regional and one at local scale, each based on the combination of 630 models. In both cases, we obtained excellent spatial projections, which is unusual when modelling rare species. The model results highlight, from a statistically sound approach, the effects of multiple drivers within a single modelling framework and at two distinct scales. From this added information, regional models can support accurate forecasts of range dynamics under climate change scenarios, whereas local models allow the assessment of isolated or synergistic impacts of changes in multiple predictors. This novel framework provides a baseline for adaptive conservation, management and monitoring of rare species at distinct spatial and temporal scales.
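The bivariate-ensemble idea can be sketched in a few lines: fit one simple model per pair of predictors, score each member, and average the member predictions with those scores as weights. The logistic members, the in-sample accuracy weights, and the synthetic data below are stand-ins for the statistical models and evaluation scores the framework would actually use:

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(2)
n, p = 400, 4
X = rng.normal(size=(n, p))                       # environmental predictors
y = (X[:, 0] + X[:, 1] + 0.3 * rng.normal(size=n) > 0).astype(float)  # presence/absence

def fit_logistic(X, y, lr=0.3, iters=1500):
    """Gradient-ascent logistic regression with intercept (illustrative member model)."""
    Xb = np.column_stack([np.ones(len(X)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(iters):
        pr = 1.0 / (1.0 + np.exp(-Xb @ w))
        w += lr * Xb.T @ (y - pr) / len(y)
    return w

def proba(w, X):
    Xb = np.column_stack([np.ones(len(X)), X])
    return 1.0 / (1.0 + np.exp(-Xb @ w))

members = []
for i, j in combinations(range(p), 2):            # every bivariate predictor pair
    w = fit_logistic(X[:, [i, j]], y)
    pr = proba(w, X[:, [i, j]])
    score = np.mean((pr > 0.5) == y)              # member weight = evaluation score
    members.append((score, pr))

weights = np.array([s for s, _ in members])
preds = np.array([pr for _, pr in members])
ensemble = weights @ preds / weights.sum()        # weighted ensemble average
```

Each member sees only two predictors, so no single fit can overfit a large predictor set, while the weighted average still exploits all of them.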
Abstract:
The identification of compositional changes in fumarolic gases of active and quiescent volcanoes is one of the most important targets in monitoring programs. From a general point of view, many systematic (often cyclic) and random processes control the chemistry of gas discharges, making it difficult to produce a convincing mathematical-statistical model. Changes in the chemical composition of volcanic gases sampled at Vulcano Island (Aeolian Arc, Sicily, Italy) from eight different fumaroles located in the northern sector of the summit crater (La Fossa) have been analysed by considering their dependence on time over the period 2000-2007. Each intermediate chemical composition has been considered as potentially derived from the contributions of the two temporal extremes, represented by the 2000 and 2007 samples respectively, using inverse modelling methodologies for compositional data. Data pertaining to fumaroles F5 and F27, located on the rim and in the inner part of La Fossa crater, respectively, have been used to achieve the proposed aim. The statistical approach has allowed us to highlight the presence of random and non-random fluctuations, features useful for understanding how the volcanic system works, opening new perspectives for sampling strategies and for the evaluation of the natural risk related to a quiescent volcano.
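The two-endmember idea can be sketched as a least-squares estimate of the mixing proportion between the 2000 and 2007 compositions. Plain linear mixing is used here for brevity, whereas proper inverse modelling of compositional data works in Aitchison (log-ratio) geometry, so treat this as a simplified stand-in with made-up numbers:

```python
import numpy as np

# Hypothetical 3-part gas compositions (fractions summing to 1).
x2000 = np.array([0.80, 0.15, 0.05])   # temporal endmember A (2000 sample)
x2007 = np.array([0.60, 0.25, 0.15])   # temporal endmember B (2007 sample)

def mixing_proportion(obs, a, b):
    """Closed-form least-squares alpha in obs ~ alpha * a + (1 - alpha) * b."""
    d = a - b
    return float(np.dot(obs - b, d) / np.dot(d, d))

obs = 0.3 * x2000 + 0.7 * x2007        # synthetic intermediate sample
alpha = mixing_proportion(obs, x2000, x2007)
```

Departures of real samples from the fitted mixing line are then the residual fluctuations, random or systematic, that the statistical analysis interrogates.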
Abstract:
A four-compartment model of the cardiovascular system is developed. To allow for easy interpretation and to minimise the number of parameters, an effort was made to keep the model as simple as possible. A sensitivity analysis is first carried out to determine which model parameters are most important for characterising the blood pressure signal. A four-stage process that accurately determines all parameter values is then described. This process is applied to data from three patients, and good agreement is shown in all cases.
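As a much-reduced stand-in for the four-compartment model, a two-element Windkessel with a finite-difference sensitivity of mean pressure to each parameter illustrates the sensitivity-analysis step. The inflow waveform, parameter values, and units are all illustrative assumptions:

```python
import numpy as np

def mean_pressure(R, C, dt=1e-3, T=20.0):
    """2-element Windkessel dP/dt = (Q(t) - P/R)/C; mean P after the transient."""
    t = np.arange(0.0, T, dt)
    Q = 5.0 * (1.0 + np.sin(2 * np.pi * t))   # pulsatile inflow (illustrative)
    P = np.empty_like(t)
    P[0] = 0.0
    for i in range(1, len(t)):
        P[i] = P[i - 1] + dt * (Q[i - 1] - P[i - 1] / R) / C
    return P[t > 10.0].mean()                  # discard the start-up transient

R0, C0, h = 1.0, 1.5, 1e-4
base = mean_pressure(R0, C0)
dP_dR = (mean_pressure(R0 + h, C0) - base) / h   # finite-difference sensitivities
dP_dC = (mean_pressure(R0, C0 + h) - base) / h
```

Mean pressure tracks R times mean inflow, so its sensitivity to R is large while C mainly shapes the pulse amplitude; ranking such sensitivities is what singles out the parameters worth estimating carefully.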
Abstract:
One of the world's largest wollastonite deposits was formed at the contact of the northern Hunter Mountain Batholith (California, USA) in Paleozoic sediments. Wollastonite occurs as zones of variable thickness surrounding layers or nodules of quartzite in limestones. A minimum formation temperature of 650 degrees C is estimated from isolated periclase-bearing lenses in that area. Contact metamorphism of siliceous carbonates has produced mineral assemblages that are consistent with heterogeneous and partly limited infiltration of water-rich fluids, compatible with the O-18/O-16 and C-13/C-12 isotopic patterns recorded in carbonates. Oxygen isotope compositions of wollastonites in the study area also do not necessarily require infiltration of large quantities of externally-derived fluids that were out of equilibrium with the rocks. Delta O-18 values of wollastonite are high (14.8 parts per thousand to 25.0 parts per thousand; median: 19.7 parts per thousand) and close to those of the host limestone (19.7 parts per thousand to 28 parts per thousand; median: 24.9 parts per thousand) and quartz (18.0 parts per thousand to 29.1 parts per thousand; median: 22.6 parts per thousand). Isotopic disequilibrium exists at quartz/wollastonite and wollastonite/calcite boundaries. Therefore, classical batch/Rayleigh fractionation models based on reactant and product equilibrium are not applicable to the wollastonite rims. An approach that relies on local instantaneous mass balance for the reactants, based on the wollastonite-forming reaction, is suggested as an alternative way to model wollastonite reaction rims. This model reproduces many of the measured delta O-18 values of wollastonite reaction rims of the current study to within +/- 1 parts per thousand, even though the wollastonite compositions vary by almost 10 parts per thousand.