892 results for IN-VARIABLES MODELS
Abstract:
In this work we address the problem of finding efficient and reliable analytical approximation formulas for the forward implied volatility in LSV models, a problem which reduces to computing option prices as an expansion around the price of the same financial asset under Black-Scholes dynamics. Our approach involves an expansion of the differential operator whose solution represents the price under local stochastic volatility dynamics. Further calculations then yield an expansion of the implied volatility that requires neither special functions nor computationally expensive procedures, giving explicit formulas that are fast to evaluate yet as accurate as possible.
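The paper's expansion formulas are not reproduced in the abstract, but the baseline object being approximated can be made concrete: the implied volatility is the sigma at which the Black-Scholes price matches an observed price. A minimal Newton-inversion sketch in Python (function names are illustrative, not from the paper):

```python
from math import log, sqrt, exp, erf, pi

def norm_cdf(x):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    # Black-Scholes price of a European call option
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

def implied_vol(price, S, K, T, r, sigma0=0.2, tol=1e-10):
    # Newton iteration on sigma; vega is the derivative of the
    # price with respect to sigma (always positive for a call)
    sigma = sigma0
    for _ in range(100):
        d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
        vega = S * sqrt(T) * exp(-0.5 * d1 ** 2) / sqrt(2.0 * pi)
        diff = bs_call(S, K, T, r, sigma) - price
        if abs(diff) < tol:
            return sigma
        sigma -= diff / vega
    return sigma
```

The expansion approach described in the abstract replaces exactly this kind of iterative inversion with closed-form approximations.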
Abstract:
The uncertainties in the determination of the stratigraphic profile of natural soils are one of the main problems in geotechnics, in particular for landslide characterization and modeling. The study deals with a new approach in geotechnical modeling which relies on the stochastic generation of different soil layer distributions following a Boolean logic; the method has thus been called BoSG (Boolean Stochastic Generation). In this way, it is possible to randomize the presence of a specific material interdigitated in a uniform matrix. In building a geotechnical model it is common to discard some stratigraphic data in order to simplify the model itself, assuming that the significance of the modeling results will not be affected. With the proposed technique it is possible to quantify the error associated with this simplification. Moreover, it can be used to determine the zones where further investigations and surveys would be most effective in refining the geotechnical model of the slope. The commercial software FLAC was used for the 2D and 3D geotechnical models. The distribution of the materials was randomized through a specifically coded MATLAB program that automatically generates text files, each representing a specific soil configuration. In addition, a routine was designed to automate the FLAC computations over the different data files in order to maximize the sample size. The methodology is applied to a simplified slope in 2D, a simplified slope in 3D and an actual landslide, namely the Mortisa mudslide (Cortina d'Ampezzo, BL, Italy). However, it could be extended to numerous other cases, especially for hydrogeological analyses and landslide stability assessment, in different geological and geomorphological contexts.
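The MATLAB code behind BoSG is not given in the abstract. The following toy Python sketch shows the general idea of Boolean stochastic generation: randomizing a lens material inside a uniform matrix on a grid and serializing each configuration to text, analogous to the input files fed to FLAC (all names and the file format are illustrative):

```python
import random

def generate_configuration(rows, cols, fraction, seed=None):
    # Randomly mark cells as the interdigitated material (True)
    # inside a uniform matrix (False), following a Boolean logic
    rng = random.Random(seed)
    return [[rng.random() < fraction for _ in range(cols)]
            for _ in range(rows)]

def to_text(grid):
    # One character per cell: 'L' = lens material, 'M' = matrix;
    # a stand-in for the per-configuration text files
    return "\n".join("".join("L" if cell else "M" for cell in row)
                     for row in grid)

# Generate many configurations to maximize the sample size,
# as the automated FLAC routine in the study does
configs = [generate_configuration(4, 10, 0.3, seed=i) for i in range(100)]
```

Each configuration would then be handed to the geotechnical solver, and the spread of the results quantifies the error introduced by assuming a uniform stratigraphy.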
Abstract:
Ordinary matter accounts for only a few percent of the total mass-energy of the Universe, which is instead largely dominated by "dark" components. The standard model used to describe them is the LambdaCDM model. Although it appears consistent with most of the currently available data, it suffers from some fundamental problems that remain unsolved to this day, leaving room for the study of alternative cosmological models. This Thesis aims to study a recently proposed model, called "Multi-coupled Dark Energy" (McDE), which features modified interactions with respect to the LambdaCDM model. In particular, Dark Matter is composed of two different particle species with opposite couplings to a scalar field responsible for Dark Energy. The background evolution and the linear perturbations turn out to be indistinguishable from those of the LambdaCDM model. In this Thesis a series of "zoomed" numerical simulations is presented for the first time. They feature several regions at different resolutions, centered on a single cluster of interest, which make it possible to study a single structure in detail without excessively increasing the required computing time. A code called ZInCo, which I developed specifically for this Thesis, is also presented for the first time. The code produces initial conditions suitable for cosmological simulations, with different resolution regions, independent of the chosen cosmological model and preserving all the features of the power spectrum imposed on them. The ZInCo code was used to produce initial conditions for a series of numerical simulations of the McDE model which, thanks to the high resolution achieved, show for the first time that the cluster segregation effect occurs significantly earlier than previously estimated.
Moreover, the radial density profiles obtained show a central flattening in the early stages of segregation. This latter effect could help resolve the "cusp-core" problem of the LambdaCDM model and constrain the possible coupling values.
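ZInCo's actual algorithm is not described here. As a hedged illustration of the zoom idea only, one can assign each particle a resolution level by its distance from the cluster of interest, finest inside the innermost radius (a toy sketch, not the ZInCo code):

```python
def zoom_levels(positions, center, radii):
    # Assign each particle a resolution level: 0 = finest region
    # (closest to the cluster of interest), increasing outward.
    # radii must be sorted in increasing order.
    levels = []
    for p in positions:
        d = sum((a - b) ** 2 for a, b in zip(p, center)) ** 0.5
        level = len(radii)  # coarsest level outside all shells
        for i, r in enumerate(radii):
            if d <= r:
                level = i
                break
        levels.append(level)
    return levels
```

In a real zoomed initial-conditions code, the coarse regions are additionally degraded in particle number while keeping the imposed power spectrum intact, which is the harder part of the problem.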
Abstract:
Deep vein thrombosis (DVT) and its complication, pulmonary embolism, are frequent causes of disability and mortality. Although blood flow disturbance is considered an important triggering factor, the mechanism of DVT initiation remains elusive. Here we show that 48-hour flow restriction in the inferior vena cava (IVC) results in the development of thrombi structurally similar to human deep vein thrombi. von Willebrand factor (VWF)-deficient mice were protected from thrombosis induced by complete (stasis) or partial (stenosis) flow restriction in the IVC. Mice with half normal VWF levels were also protected in the stenosis model. Besides promoting platelet adhesion, VWF carries Factor VIII. Repeated infusions of recombinant Factor VIII did not rescue thrombosis in VWF(-/-) mice, indicating that impaired coagulation was not the primary reason for the absence of DVT in VWF(-/-) mice. Infusion of GPG-290, a mutant glycoprotein Ibα-immunoglobulin chimera that specifically inhibits interaction of the VWF A1 domain with platelets, prevented thrombosis in wild-type mice. Intravital microscopy showed that platelet and leukocyte recruitment in the early stages of DVT was dramatically higher in wild-type than in VWF(-/-) IVC. Our results demonstrate a pathogenetic role for VWF-platelet interaction in flow disturbance-induced venous thrombosis.
Abstract:
http://www.ncbi.nlm.nih.gov/pubmed/20864016
Abstract:
In the simultaneous estimation of a large number of related quantities, multilevel models provide a formal mechanism for efficiently using the ensemble of information to derive individual estimates. In this article we investigate the ability of the likelihood to identify the relationship between signal and noise in multilevel linear mixed models. Specifically, we consider the ability of the likelihood to diagnose conjugacy or independence between the signals and noises. Our work was motivated by the analysis of data from high-throughput experiments in genomics. The proposed model leads to a more flexible family. However, we further demonstrate that adequately capitalizing on the benefits of a well-fitting, fully specified likelihood in terms of gene ranking is difficult.
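The authors' likelihood analysis is not reproduced in the abstract. As a hedged illustration of how multilevel models pool ensemble information into individual estimates, here is a toy empirical-Bayes shrinkage sketch (a stand-in for the idea, not the authors' model):

```python
from statistics import mean, variance

def shrink_estimates(group_means, noise_var):
    # Empirical-Bayes shrinkage toward the grand mean: each group
    # estimate is pulled in by a weight determined by the ratio of
    # signal variance to total (signal + noise) variance.
    grand = mean(group_means)
    total_var = variance(group_means)           # sample variance across groups
    signal_var = max(total_var - noise_var, 0.0)
    denom = signal_var + noise_var
    weight = signal_var / denom if denom > 0 else 0.0
    return [grand + weight * (m - grand) for m in group_means]
```

The point the abstract makes is that estimating this signal-to-noise split from the likelihood alone is harder than it looks, especially when the goal is ranking (e.g. of genes) rather than point estimation.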
Abstract:
BACKGROUND: Several epidemiological studies show that inhalation of particulate matter may cause increased pulmonary morbidity and mortality. Of particular interest are ultrafine particles, which are especially toxic. In addition, more and more nanoparticles are being released into the environment; however, the potential health effects of these nanoparticles are as yet unknown. OBJECTIVES: To avoid animal studies of particle toxicity, many cell culture models have been developed over the past years. METHODS: This review focuses on the most commonly used in vitro epithelial airway and alveolar models for studying particle-cell interactions and particle toxicity, and highlights the advantages and disadvantages of the different models. RESULTS/CONCLUSION: There are many lung cell culture models, but none of them is perfect. Nevertheless, they can be valuable tools for basic research and toxicity testing. The focus here is on 3D and co-culture models, which appear more realistic than monocultures.
Abstract:
BACKGROUND: Activation of endothelial cells (EC) in xenotransplantation is mostly induced through binding of antibodies (Ab) and activation of the complement system. Activated EC lose their heparan sulfate proteoglycan (HSPG) layer and exhibit a procoagulant and pro-inflammatory cell surface. We have recently shown that the semi-synthetic proteoglycan analog dextran sulfate (DXS, MW 5000) blocks activation of the complement cascade and acts as an EC-protectant both in vitro and in vivo. However, DXS is a strong anticoagulant and systemic use of this substance in a clinical setting might therefore be compromised. It was the aim of this study to investigate a novel, fully synthetic EC-protectant with reduced inhibition of the coagulation system. METHOD: By screening with standard complement (CH50) and coagulation assays (activated partial thromboplastin time, aPTT), a conjugate of tyrosine sulfate to a polymer-backbone (sTyr-PAA) was identified as a candidate EC-protectant. The pathway-specificity of complement inhibition by sTyr-PAA was tested in hemolytic assays. To further characterize the substance, the effects of sTyr-PAA and DXS on complement deposition on pig cells were compared by flow cytometry and cytotoxicity assays. Using fluorescein-labeled sTyr-PAA (sTyr-PAA-Fluo), the binding of sTyr-PAA to cell surfaces was also investigated. RESULTS: Of all tested compounds, sTyr-PAA was the most effective substance in inhibiting all three pathways of complement activation. Its capacity to inhibit the coagulation cascade was significantly reduced as compared with DXS. sTyr-PAA also dose-dependently inhibited deposition of human complement on pig cells and this inhibition correlated with the binding of sTyr-PAA to the cells. Moreover, we were able to demonstrate that sTyr-PAA binds preferentially and dose-dependently to damaged EC. 
CONCLUSIONS: We showed that sTyr-PAA acts as an EC-protectant by binding to the cells and protecting them from complement-mediated damage. It has less effect on the coagulation system than DXS and may therefore have potential for in vivo application.
Abstract:
Background: The literature on the applications of homeopathy for controlling plant diseases in both plant pathological models and field trials was first reviewed by Scofield in 1984. No other review on homeopathy in plant pathology has been published since, though much new research has subsequently been carried out using more advanced methods. Objectives: To conduct an up-to-date review of the existing literature on basic research in homeopathy using phytopathological models and experiments in the field. Methods: A literature search was carried out on publications from 1969 to 2009, for papers that reported experiments on homeopathy using phytopathological models (in vitro and in planta) and field trials. The selected papers were summarized and analysed on the basis of a Manuscript Information Score (MIS) to identify those that provided sufficient information for proper interpretation (MIS ≥ 5). These were then evaluated using a Study Methods Evaluation Procedure (SMEP). Results: A total of 44 publications on phytopathological models were identified: 19 papers with statistics, 6 studies with MIS ≥ 5. Nine publications on field trials were identified, 6 with MIS ≥ 5. In general, significant and reproducible effects with decimal and centesimal potencies were reported, including dilution levels beyond Avogadro's number. Conclusions: The prospects for homeopathic treatments in agriculture are promising, but much more experimentation is needed, especially at the field level, and on potentisation techniques, effective potency levels and conditions for reproducibility. Phytopathological models may also develop into useful tools to answer pharmaceutical questions.
Abstract:
In the laboratory of Dr. Dieter Jaeger at Emory University, we use computer simulations to study how the biophysical properties of neurons—including their three-dimensional structure, passive membrane resistance and capacitance, and active membrane conductances generated by ion channels—affect the way that the neurons transfer synaptic inputs into the action potential streams that represent their output. Because our ultimate goal is to understand how neurons process and relay information in a living animal, we try to make our computer simulations as realistic as possible. As such, the computer models reflect the detailed morphology and all of the ion channels known to exist in the particular neuron types being simulated, and the model neurons are tested with synaptic input patterns that are intended to approximate the inputs that real neurons receive in vivo. The purpose of this workshop tutorial was to explain what we mean by ‘in vivo-like’ synaptic input patterns, and how we introduce these input patterns into our computer simulations using the freely available GENESIS software package (http://www.genesis-sim.org/GENESIS). The presentation was divided into four sections: first, an explanation of what we are talking about when we refer to in vivo-like synaptic input patterns
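The GENESIS scripts themselves are not shown in the abstract. As a language-neutral illustration, "in vivo-like" synaptic input is often approximated by Poisson spike trains; a minimal Python sketch of such an input generator (a simplification of the in vivo statistics the tutorial discusses):

```python
import random

def poisson_spike_train(rate_hz, duration_s, seed=None):
    # Generate spike times from a homogeneous Poisson process by
    # drawing exponential inter-spike intervals, a common stand-in
    # for the irregular synaptic bombardment a neuron receives in vivo
    rng = random.Random(seed)
    times, t = [], 0.0
    while True:
        t += rng.expovariate(rate_hz)
        if t >= duration_s:
            break
        times.append(t)
    return times

# Example: 2 seconds of input at 50 Hz, driving one model synapse
train = poisson_spike_train(rate_hz=50.0, duration_s=2.0, seed=42)
```

In a detailed compartmental model, one such train would be generated per synapse, with rates and correlations tuned to match recorded in vivo activity.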