909 results for 010406 Stochastic Analysis and Modelling
Abstract:
Among the underlying assumptions of the Black-Scholes option pricing model, those of a fixed volatility of the underlying asset and of a constant short-term riskless interest rate cause the largest empirical biases. Only recently has attention been paid to the simultaneous effects of the stochastic nature of both variables on the pricing of options. This paper has tried to estimate the effects of a stochastic volatility and a stochastic interest rate in the Spanish option market. A discrete approach was used. Symmetric and asymmetric GARCH models were tried. The presence of in-the-mean and seasonality effects was allowed. The stochastic processes of the MIBOR90, a Spanish short-term interest rate, from March 19, 1990 to May 31, 1994, and of the volatility of the returns of the most important Spanish stock index (IBEX-35) from October 1, 1987 to January 20, 1994, were estimated. These estimators were used in pricing call options on the stock index from November 30, 1993 to May 30, 1994. Hull-White and Amin-Ng pricing formulas were used. These prices were compared with actual prices and with those derived from the Black-Scholes formula, trying to detect the biases reported previously in the literature. Whereas the conditional variance of the MIBOR90 interest rate seemed to be free of ARCH effects, an asymmetric GARCH with in-the-mean and seasonality effects and some evidence of persistence in variance (IEGARCH(1,2)-M-S) was found to be the model that best represents the behavior of the stochastic volatility of the IBEX-35 stock returns. All the biases reported previously in the literature were found. All the formulas overpriced the options in the near-the-money case and underpriced the options otherwise. Furthermore, in most option trading, Black-Scholes overpriced the options and, because of the time-to-maturity effect, implied volatility computed from the Black-Scholes formula underestimated the actual volatility.
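A minimal sketch of the benchmark side of this comparison, assuming nothing beyond the standard formulas: the Black-Scholes call price and the implied-volatility inversion that the abstract reports as underestimating the actual (GARCH-estimated) volatility. The inputs are hypothetical placeholders, not the paper's MIBOR90/IBEX-35 data.

```python
# Hedged sketch: Black-Scholes call pricing and implied-volatility inversion,
# the benchmark against which the GARCH-based prices are compared.
# All inputs are hypothetical placeholders.
from math import exp, log, sqrt

from scipy.optimize import brentq
from scipy.stats import norm


def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm.cdf(d1) - K * exp(-r * T) * norm.cdf(d2)


def implied_vol(price, S, K, T, r):
    """Volatility that makes the Black-Scholes price match a quoted price."""
    return brentq(lambda s: bs_call(S, K, T, r, s) - price, 1e-6, 5.0)


S, K, T, r = 3300.0, 3400.0, 0.25, 0.09   # index level, strike, maturity (yr), rate
print(bs_call(S, K, T, r, 0.20))          # model price at 20% volatility
print(implied_vol(95.0, S, K, T, r))      # implied volatility of a quoted price
```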
Abstract:
Short description of the proposed presentation (less than 100 words): This paper describes the interdisciplinary work done in Uspantán, Guatemala, a city vulnerable to natural hazards. We investigated local responses to landslides that happened in 2007 and 2010 and had a strong impact on the local community. We show a complete example of a systemic approach that incorporates physical, social and environmental aspects in order to understand risks. The objective of this work is to present the combination of social and geological data (mapping) and to describe the methodology used for identification and assessment of risk. The article discusses both the limitations and the methodological challenges encountered when conducting interdisciplinary research. Why it is important to present this topic at the Global Platform (less than 50 words): This work shows the benefits of addressing risk from an interdisciplinary perspective, in particular how integrating social sciences can help identify new phenomena and natural hazards and assess risk. It gives a practical example of how one can integrate data from different fields. What is innovative about this presentation: The use of mapping to combine qualitative and quantitative data. By coupling approaches, we could associate a hazard map with qualitative data gathered through interviews with the population. This map is an important document for the authorities. Indeed, it allows them to be aware of the most dangerous zones, the affected families and the places where it is most urgent to intervene.
Abstract:
The aim of this study was to assess whether Neisseria meningitidis, Listeria monocytogenes, Streptococcus pneumoniae and Haemophilus influenzae can be identified using the polymerase chain reaction technique in the cerebrospinal fluid of severely decomposed bodies with known, noninfectious causes of death or whether postmortem changes can lead to false positive results and thus erroneous diagnostic information. Biochemical investigations, postmortem bacteriology and real-time polymerase chain reaction analysis in cerebrospinal fluid were performed in a series of medico-legal autopsies that included noninfectious causes of death with decomposition, bacterial meningitis without decomposition, bacterial meningitis with decomposition, lower respiratory tract infections with decomposition and abdominal infections with decomposition. In noninfectious causes of death with decomposition, postmortem investigations failed to reveal results consistent with generalized inflammation or bacterial infections at the time of death. Real-time polymerase chain reaction analysis in cerebrospinal fluid did not identify the studied bacteria in any of these cases. The results of this study highlight the usefulness of molecular approaches in bacteriology as well as the use of alternative biological samples in postmortem biochemistry in order to obtain suitable information even in corpses with severe decompositional changes.
Abstract:
Isolates of the Trichophyton mentagrophytes complex vary phenotypically. Whether the closely related zoophilic and anthropophilic anamorphs currently associated with Arthroderma vanbreuseghemii have to be considered as members of the same biological species remains an open question. In order to better delineate species in the T. mentagrophytes complex, we performed a mating analysis of freshly collected isolates from humans and animals with A. benhamiae and A. vanbreuseghemii reference strains, in comparison to internal transcribed spacer (ITS) and 28S rDNA sequencing. Mating experiments as well as ITS and 28S sequencing unambiguously allowed the distinction of A. benhamiae and A. vanbreuseghemii. We have also shown that all the isolates from tinea pedis and tinea unguium identified as T. interdigitale based on ITS sequences mated with A. vanbreuseghemii tester strains, but had lost their ability to give fertile cleistothecia. Therefore, T. interdigitale has to be considered as a humanized species derived from the sexual relative A. vanbreuseghemii.
Abstract:
We show the equivalence between the use of correspondence analysis (CA) of concatenated tables and the application of a particular version of conjoint analysis called categorical conjoint measurement (CCM). The connection is established using canonical correlation (CC). The second part introduces the interaction effects in all three variants of the analysis and shows how to pass between the results of each analysis.
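A brief illustration of the canonical-correlation bridge mentioned above, assuming dummy-coded (indicator) tables; this is a generic sketch with synthetic data, not the authors' procedure. The canonical correlations are obtained as the singular values of the whitened cross-product between the two indicator matrices.

```python
# Hedged sketch: canonical correlations between two dummy-coded categorical
# tables, the quantity through which CA of concatenated tables and CCM are
# connected. Data below are synthetic placeholders.
import numpy as np
import pandas as pd


def canonical_correlations(X, Y):
    """Singular values of the whitened cross-product = canonical correlations."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)

    def inv_sqrt(S):
        w, V = np.linalg.eigh(S)
        inv = np.array([1.0 / np.sqrt(v) if v > 1e-10 else 0.0 for v in w])
        return V @ np.diag(inv) @ V.T

    M = inv_sqrt(X.T @ X) @ (X.T @ Y) @ inv_sqrt(Y.T @ Y)
    return np.linalg.svd(M, compute_uv=False)


rng = np.random.default_rng(0)
df = pd.DataFrame({
    "attribute_a": rng.choice(["a1", "a2", "a3"], size=200),
    "attribute_b": rng.choice(["b1", "b2"], size=200),
})
X = pd.get_dummies(df["attribute_a"], drop_first=True).to_numpy(float)
Y = pd.get_dummies(df["attribute_b"], drop_first=True).to_numpy(float)
print(canonical_correlations(X, Y))
```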
Abstract:
PURPOSE: To report our results of endovascular aneurysm repair (EVAR) over a 10-year period using systematic preoperative collateral artery embolization. METHODS: From 1999 until 2009, 124 patients (117 men; mean age 70.8 years) with abdominal aortic aneurysm (AAA) underwent embolization of patent lumbar and/or inferior mesenteric arteries prior to elective EVAR procedures. Embolization was systematically attempted and, whenever possible, performed using microcoils and a coaxial technique. Follow-up included computed tomography and/or magnetic resonance imaging and abdominal radiography. RESULTS: The technical success for EVAR was 96% (119/124), with 4 patients dying within 30 days (3.2% perioperative mortality) and 1 type III endoleak accounting for the failures. Collateral arteries were occluded spontaneously or by embolization in 60 (48%) of 124 patients. The endoleak rate was 50.9% (74 in 61 patients), most of which were type II (19%). Over a mean clinical follow-up of 60.5±34.1 months (range 1-144), aneurysm sac dimensions decreased in 66 patients, increased in 19 patients, and were stable in 35. The endoleak rate was significantly higher in the patients with increasing sac diameter (p<0.001). Among the patients with patent collateral arteries, 38/64 (59.3%) developed 46 leaks, while 28 leaks appeared in 23 (41%) of 56 patients with collateral artery occlusion (p=0.069). The type II endoleak rate significantly differed between these two groups (47.8% vs. 3.6%, p<0.001). CONCLUSION: Preoperative collateral embolization seems to be a valid method of reducing the incidence of type II endoleak, improving the long-term outcome.
Abstract:
Projective homography sits at the heart of many problems in image registration. In addition to many methods for estimating the homography parameters (R.I. Hartley and A. Zisserman, 2000), analytical expressions to assess the accuracy of the transformation parameters have been proposed (A. Criminisi et al., 1999). We show that these expressions provide less accurate bounds than those based on the earlier results of Weng et al. (1989). The discrepancy becomes more critical in applications involving the integration of frame-to-frame homographies and their uncertainties, as in the reconstruction of terrain mosaics and the camera trajectory from flyover imagery. We demonstrate these issues through selected examples.
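As a hedged sketch of the kind of uncertainty assessment discussed above, the following code estimates a homography with the standard DLT and checks the spread of its parameters by Monte Carlo perturbation of the correspondences. The true homography and noise level are hypothetical, and this is not the analytical bound of Criminisi et al. or Weng et al.

```python
# Hedged sketch: DLT homography estimation plus a Monte Carlo check of the
# spread of its parameters under pixel noise. H_true and the noise level are
# hypothetical placeholders.
import numpy as np


def dlt_homography(src, dst):
    """Estimate a 3x3 homography from 2D correspondences (basic DLT)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    H = Vt[-1].reshape(3, 3)          # null-space vector of the DLT system
    return H / H[2, 2]


rng = np.random.default_rng(1)
H_true = np.array([[1.02, 0.01, 5.0], [-0.02, 0.98, -3.0], [1e-4, 2e-4, 1.0]])
src = rng.uniform(0, 640, size=(20, 2))
dst_h = np.hstack([src, np.ones((20, 1))]) @ H_true.T
dst = dst_h[:, :2] / dst_h[:, 2:3]

# Empirical standard errors of the 9 entries under 0.5-pixel noise.
samples = [dlt_homography(src, dst + rng.normal(0, 0.5, dst.shape)).ravel()
           for _ in range(500)]
print(np.sqrt(np.diag(np.cov(np.asarray(samples).T))))
```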
Abstract:
MHC-peptide multimers containing biotinylated MHC-peptide complexes bound to phycoerythrin (PE)-streptavidin (SA) are widely used for analyzing and sorting antigen-specific T cells. Here we describe alternative T cell-staining reagents that are superior to conventional reagents. They are built on reversible chelate complexes of Ni(2+)-nitrilotriacetic acid (NTA) with oligohistidines. We synthesized biotinylated linear mono-, di-, and tetra-NTA compounds using conventional solid phase peptide chemistry and studied their interaction with HLA-A*0201-peptide complexes containing a His(6), His(12), or 2×His(6) tag by surface plasmon resonance on SA-coated sensor chips and equilibrium dialysis. The binding avidity increased in the order His(6) < His(12) < 2×His(6) and NTA(1) < NTA(2) < NTA(4), respectively, depending on the configuration of the NTA moieties and increased to picomolar K(D) for the combination of a 2×His(6) tag and a 2×Ni(2+)-NTA(2). We demonstrate that HLA-A2-2×His(6)-peptide multimers built either with Ni(2+)-NTA(4)-biotin and PE-SA or with PE-NTA(4) stained influenza- and Melan A-specific CD8+ T cells as well as or better than conventional multimers. Although these complexes were highly stable, they very rapidly dissociated in the presence of imidazole, which allowed sorting of bona fide antigen-specific CD8+ T cells without inducing T cell death as well as assessment of HLA-A2-peptide monomer dissociation kinetics on CD8+ T cells.
Abstract:
Thermal analysis, powder diffraction, and Raman scattering as a function of temperature were carried out on K2BeF4. Moreover, the crystal structure was determined at 293 K from powder diffraction. The compound shows a transition from the Pna21 to the Pnam space group at 921 K, with a transition enthalpy of 5 kJ/mol. The transition is assumed to be first order because the compound shows metastability. Structurally and spectroscopically, the transition is similar to that observed in (NH4)2SO4, which suggests that the low-temperature phase is ferroelectric. To confirm this, the spontaneous polarization has been computed using an ionic model.
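A minimal point-charge sketch of how a spontaneous polarization can be computed in an ionic model, P = (1/V) Σ_i q_i Δr_i; the cell dimensions, formal charges and displacements below are illustrative placeholders, not the refined K2BeF4 structure.

```python
# Hedged point-charge sketch: spontaneous polarization as the charge-weighted
# sum of ionic displacements per unit cell, P = (1/V) * sum_i q_i * dr_i.
# Cell, charges and displacements are illustrative placeholders only.
import numpy as np

E = 1.602176634e-19                                  # elementary charge, C
lattice = np.diag([7.7e-10, 5.9e-10, 10.1e-10])      # hypothetical cell vectors, m
volume = abs(np.linalg.det(lattice))

# (formal charge / e, displacement from the centrosymmetric reference, fractional)
ions = [
    (+1, [0.000, 0.000, 0.010]),   # K
    (+1, [0.000, 0.000, 0.010]),   # K
    (+2, [0.000, 0.000, -0.005]),  # Be
    (-1, [0.000, 0.000, -0.004]),  # F
    (-1, [0.000, 0.000, -0.004]),  # F
    (-1, [0.000, 0.000, -0.004]),  # F
    (-1, [0.000, 0.000, -0.004]),  # F
]

P = sum(q * E * (lattice @ np.asarray(dr)) for q, dr in ions) / volume
print(P)                                             # polarization vector, C/m^2
```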
Abstract:
Leakage detection is an important issue in many chemical sensing applications. Leakage detection by thresholds suffers from important drawbacks when sensors have serious drifts or are affected by cross-sensitivities. Here we present an adaptive method based on Dynamic Principal Component Analysis that models the relationships between the sensors in the array. In normal conditions, a certain variance distribution characterizes the sensor signals. However, in the presence of a new source of variance, the PCA decomposition changes drastically. In order to prevent the influence of sensor drifts, the model is adaptive and is calculated in a recursive manner with minimum computational effort. The behavior of this technique is studied with synthetic signals and with real signals arising from oil vapor leakages in an air compressor. Results clearly demonstrate the efficiency of the proposed method.
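The sketch below illustrates, under generic assumptions (it is not the authors' implementation), the ingredients named in the abstract: time-lagged sensor vectors (dynamic PCA), a recursively updated, exponentially forgotten covariance model that absorbs slow drift, and a squared prediction error (SPE) statistic that reacts when a new source of variance appears. Data, model sizes and threshold are synthetic placeholders.

```python
# Hedged sketch of adaptive dynamic PCA for leak detection: lagged sensor
# vectors, an exponentially forgotten covariance (absorbs slow drift), and a
# squared prediction error (SPE) alarm. Data and threshold are synthetic.
import numpy as np


class AdaptiveDPCA:
    def __init__(self, n_features, lags=2, n_components=3, forgetting=0.995):
        d = n_features * (lags + 1)
        self.k, self.lam = n_components, forgetting
        self.mean, self.cov = np.zeros(d), np.eye(d)

    def spe(self, x):
        """SPE of x against the current principal subspace, then adapt."""
        w, V = np.linalg.eigh(self.cov)
        P = V[:, np.argsort(w)[::-1][:self.k]]        # leading eigenvectors
        dx = x - self.mean
        resid = dx - P @ (P.T @ dx)
        # recursive mean/covariance update (adaptation to drift)
        self.mean = self.lam * self.mean + (1 - self.lam) * x
        d2 = x - self.mean
        self.cov = self.lam * self.cov + (1 - self.lam) * np.outer(d2, d2)
        return float(resid @ resid)


rng = np.random.default_rng(0)
T, n, lags = 1000, 4, 2
data = rng.normal(0, 0.1, (T, n)) + np.linspace(0, 0.5, T)[:, None]  # slow drift
data[800:, 0] += 1.0                                  # leak-like step on sensor 0

model = AdaptiveDPCA(n_features=n, lags=lags)
for t in range(lags, T):
    q = model.spe(data[t - lags:t + 1].ravel())       # time-lagged sample
    if t > 100 and q > 0.5:                           # illustrative threshold
        print("possible leak flagged at t =", t)
        break
```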
Abstract:
This article analyses and discusses issues that pertain to the choice of relevant databases for assigning values to the components of evaluative likelihood ratio procedures at source level. Although several formal likelihood ratio developments currently exist, both case practitioners and recipients of expert information (such as the judiciary) may be reluctant to consider them as a framework for evaluating scientific evidence in context. The recent ruling R v T and the ensuing discussions in many forums provide illustrative examples of this. In particular, it is often felt that likelihood ratio-based reasoning amounts to an application that requires extensive quantitative information along with means for dealing with technicalities related to the algebraic formulation of these approaches. With regard to this objection, this article proposes two distinct discussions. In the first part, it is argued that, from a methodological point of view, there are additional levels of qualitative evaluation that are worth considering prior to focusing on particular numerical probability assignments. Analyses will be proposed that intend to show that, under certain assumptions, relative numerical values, as opposed to absolute values, may be sufficient to characterize a likelihood ratio for practical and pragmatic purposes. The feasibility of such qualitative considerations points out that the availability of hard numerical data is not a necessary requirement for implementing a likelihood ratio approach in practice. It is further argued that, even if numerical evaluations can be made, qualitative considerations may be valuable because they can further the understanding of the logical underpinnings of an assessment. In the second part, the article will draw a parallel to R v T by concentrating on a practical footwear mark case received at the authors' institute. This case will serve the purpose of exemplifying the possible use of data from various sources in casework and help to discuss the difficulty associated with reconciling the depth of theoretical likelihood ratio developments and limitations in the degree to which these developments can actually be applied in practice.
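A small numerical illustration of the claim that relative, rather than absolute, probability assignments can suffice to characterize a likelihood ratio; the figures are hypothetical and not taken from the case discussed in the article.

```python
# Hedged illustration: a likelihood ratio depends only on the relative sizes of
# the two probability assignments, so rescaling both by a common (possibly
# unknown) factor leaves it unchanged. Figures are hypothetical.
def likelihood_ratio(p_e_given_hp, p_e_given_hd):
    """LR = Pr(E | Hp) / Pr(E | Hd)."""
    return p_e_given_hp / p_e_given_hd

# Absolute assignments versus a purely relative statement
# ("the findings are 100 times more probable under Hp than under Hd"):
print(likelihood_ratio(0.8, 0.008))            # ~100
print(likelihood_ratio(100 * 0.008, 0.008))    # ~100
c = 0.37                                       # arbitrary common scaling factor
print(likelihood_ratio(c * 0.8, c * 0.008))    # still ~100
```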
Abstract:
The biological properties of wild-type A75/17 and cell culture-adapted Onderstepoort canine distemper virus differ markedly. To learn more about the molecular basis for these differences, we have isolated and sequenced the protein-coding regions of the attachment and fusion proteins of wild-type canine distemper virus strain A75/17. In the attachment protein, a total of 57 amino acid differences were observed between the Onderstepoort strain and strain A75/17, and these were distributed evenly over the entire protein. Interestingly, the attachment protein of strain A75/17 contained an extension of three amino acids at the C terminus. Expression studies showed that the attachment protein of strain A75/17 had a higher apparent molecular mass than the attachment protein of the Onderstepoort strain, in both the presence and absence of tunicamycin. In the fusion protein, 60 amino acid differences were observed between the two strains, of which 44 were clustered in the much smaller F2 portion of the molecule. Significantly, the AUG that has been proposed as a translation initiation codon in the Onderstepoort strain is an AUA codon in strain A75/17. Detailed mutation analyses showed that both the first and second AUGs of strain A75/17 are the major translation initiation sites of the fusion protein. Similar analyses demonstrated that, also in the Onderstepoort strain, the first two AUGs are the translation initiation codons which contribute most to the generation of precursor molecules yielding the mature form of the fusion protein.
Abstract:
Interaction analysis is not a prerogative of any discipline in social sciences. It has its own history within each disciplinary field and is related to specific research objects. From the standpoint of psychology, this article first draws upon a distinction between factorial and dialogical conceptions of interaction. It then briefly presents the basis of a dialogical approach in psychology and focuses upon four basic assumptions. Each of them is examined on a theoretical and on a methodological level with a leading question: to what extent is it possible to develop analytical tools that are fully coherent with dialogical assumptions? The conclusion stresses the difficulty of developing methodological tools that are fully consistent with dialogical assumptions and argues that there is an unavoidable tension between accounting for the complexity of an interaction and using methodological tools which necessarily "monologise" this complexity.