914 results for Copula Technique Analysis


Relevance: 40.00%

Abstract:

Ice cores provide a robust reconstruction of past climate. However, developing timescales by annual-layer counting, which is essential to detailed climate reconstruction and interpretation, is problematic for ice cores collected at low-accumulation sites or in regions of compressed ice because the layers are closely spaced. Ice-core analysis by laser ablation–inductively coupled plasma–mass spectrometry (LA-ICP-MS) provides sub-millimeter sampling resolution (on the order of 100 μm in this study) and the low detection limits (ng L⁻¹) necessary to measure the chemical constituents preserved in ice cores. We present a newly developed cryocell that can hold a 1 m long section of ice core, and an alternative calibration strategy. Using ice-core samples from central Greenland, we demonstrate the repeatability of multiple ablation passes, highlight the improved sampling resolution, verify the calibration technique, and identify annual layers in the chemical profile of a deep section of an ice core where annual layers had not previously been identified using chemistry. In addition, using sections of cores from the Swiss/Italian Alps, we illustrate the relationship between Ca, Na and Fe and particle concentration and conductivity, and validate the LA-ICP-MS Ca profile through a direct comparison with continuous flow analysis (CFA) results.
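A minimal sketch of the kind of comparison described in the last sentence, assuming both profiles are available as depth-indexed arrays (variable names, resolutions and the binning scheme are illustrative, not the authors' procedure): the ~100 μm LA-ICP-MS samples are averaged onto the coarser CFA depth grid and the two series are correlated.

```python
# Hypothetical sketch: validate a high-resolution LA-ICP-MS Ca profile against coarser
# continuous flow analysis (CFA) data by binning the laser-ablation signal onto the
# CFA depth grid and computing a Pearson correlation. All names are illustrative.
import numpy as np

def compare_ca_profiles(la_depth_m, la_ca, cfa_depth_m, cfa_ca):
    """Bin the fine LA-ICP-MS samples onto the CFA depth grid and correlate."""
    la_depth_m, la_ca = np.asarray(la_depth_m), np.asarray(la_ca)
    cfa_depth_m, cfa_ca = np.asarray(cfa_depth_m), np.asarray(cfa_ca)
    step = np.median(np.diff(cfa_depth_m))                      # assume near-uniform CFA spacing
    edges = np.concatenate([cfa_depth_m - step / 2.0, [cfa_depth_m[-1] + step / 2.0]])
    idx = np.digitize(la_depth_m, edges) - 1                    # CFA interval for each LA point
    binned = np.array([la_ca[idx == i].mean() if np.any(idx == i) else np.nan
                       for i in range(len(cfa_depth_m))])
    ok = ~np.isnan(binned)
    return np.corrcoef(binned[ok], cfa_ca[ok])[0, 1]            # correlation of the two profiles
```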

Relevance: 40.00%

Abstract:

In this paper, we extend the debate concerning Credit Default Swap valuation to include time-varying correlations and covariances. Traditional multivariate techniques treat the correlations between covariates as constant over time; however, this view is not supported by the data. Secondly, since financial data do not follow a normal distribution because of their heavy tails, modeling the data using a Generalized Linear Model (GLM) incorporating copulas emerges as a more robust technique than traditional approaches. This paper also includes an empirical analysis of the regime-switching dynamics of credit risk in the presence of liquidity, following the general practice of assuming that credit and market risk follow a Markov process. The study was based on Credit Default Swap data obtained from Bloomberg spanning the period 1 January 2004 to 8 August 2006. The empirical examination of the regime-switching tendencies provided quantitative support for the anecdotal view that liquidity decreases as credit quality deteriorates. The analysis also examined the joint probability distribution of the credit-risk determinants across credit quality through the use of a copula function, which disaggregates the behavior embedded in the marginal gamma distributions so as to isolate the level of dependence captured in the copula itself. The results suggest that the time-varying joint correlation matrix performed far better than the constant correlation matrix that is the centerpiece of linear regression models.
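As a rough illustration of the copula step described above (an assumed sketch, not the paper's estimator), the snippet below fits gamma marginals to two credit-risk covariates, maps the observations to uniforms through the fitted CDFs, and measures dependence with a Gaussian copula; a rolling window gives a simple time-varying version. The GLM and regime-switching machinery are omitted.

```python
# Illustrative Gaussian-copula-with-gamma-marginals sketch; names and window are assumptions.
import numpy as np
from scipy import stats

def gaussian_copula_corr(x, y):
    """Correlation of the normal scores implied by fitted gamma marginals."""
    ux = stats.gamma.cdf(x, *stats.gamma.fit(x, floc=0))   # probability integral transform
    uy = stats.gamma.cdf(y, *stats.gamma.fit(y, floc=0))
    zx = stats.norm.ppf(ux.clip(1e-6, 1 - 1e-6))           # map uniforms to normal scores
    zy = stats.norm.ppf(uy.clip(1e-6, 1 - 1e-6))
    return np.corrcoef(zx, zy)[0, 1]

def rolling_copula_corr(x, y, window=60):
    """Time-varying dependence: re-estimate the copula correlation in a moving window."""
    x, y = np.asarray(x), np.asarray(y)
    return np.array([gaussian_copula_corr(x[t - window:t], y[t - window:t])
                     for t in range(window, len(x) + 1)])
```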

Relevance: 40.00%

Abstract:

This paper presents the security evaluation, energy-consumption optimization, and spectrum-scarcity analysis of artificial-noise techniques for increasing physical-layer security in Cognitive Wireless Sensor Networks (CWSNs). These techniques introduce noise into the spectrum in order to hide the real information. Nevertheless, they directly affect two important parameters in CWSNs, energy consumption and spectrum utilization, because both the number of packets transmitted by the network and the active period of the nodes increase. The security evaluation demonstrates that these techniques are effective against eavesdropping attacks, while the optimization allows these approaches to be implemented in low-resource networks such as CWSNs. In this work, the scenario is formally modeled, the optimization is derived from the simulation results, and the impact on the frequency spectrum is analyzed.
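A back-of-the-envelope sketch of the trade-off described above (all parameter values and the linear energy model are assumptions for illustration, not the paper's model): overhead grows directly with the number of artificial-noise packets injected per real packet.

```python
# Simplified, assumed model of artificial-noise overhead in a sensor network.
def artificial_noise_overhead(real_packets, noise_per_real, tx_energy_mj=0.3,
                              packet_airtime_ms=4.0, period_ms=1000.0):
    noise_packets = real_packets * noise_per_real
    total_packets = real_packets + noise_packets
    energy_mj = total_packets * tx_energy_mj                     # transmit energy only
    occupancy = total_packets * packet_airtime_ms / period_ms    # fraction of the period on air
    return energy_mj, occupancy

# Example: one noise packet per real packet doubles both energy use and channel occupancy.
print(artificial_noise_overhead(real_packets=50, noise_per_real=1))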

Relevance: 40.00%

Abstract:

In this paper, a reverse-transcriptase PCR-based protocol suitable for efficient expression analysis of multigene families is presented. The method combines restriction fragment length polymorphism (RFLP) technology with a gene-family-specific version of mRNA differential display and hence is called "RFLP-coupled domain-directed differential display". With this method, expression of all members of a multigene family at many different developmental stages, in diverse tissues and even in different organisms can be displayed on one gel. Moreover, bands of interest, representing gene family members, are directly accessible to sequence analysis without the need for subcloning. The method thus enables a detailed, high-resolution expression analysis of known gene family members as well as the identification and characterization of new ones. Here the technique was used to analyze differential expression of MADS-box genes in male and female inflorescences of maize (Zea mays ssp. mays). Six different MADS-box genes were identified, each either specifically expressed in the female sex or preferentially expressed in male or female inflorescences. Other possible applications of the method are discussed.

Relevance: 40.00%

Abstract:

This paper presents the results of an experimental analysis of the bell tower of the "Chiesa della Maddalena" (Mola di Bari, Italy), carried out to better understand the structural behavior of slender masonry structures. The research aims to calibrate a numerical model by means of Operational Modal Analysis (OMA), so that realistic conclusions about the dynamic behavior of the structure can be drawn. The choice of OMA derives from the need to determine the modal parameters of a structure through non-destructive testing, which is especially important for structures of cultural and historical value. By means of an easy and accurate process it is possible to acquire in-situ ambient vibrations, and the data collected are used to estimate the mode shapes, natural frequencies and damping ratios of the structure. To analyze the monitoring data, the Peak Picking method was applied to the Fast Fourier Transforms (FFT) of the signals in order to identify the effective natural frequencies and damping factors of the structure. The main frequencies and damping ratios were determined from measurements at several relevant locations. The responses were then extrapolated and extended to the entire tower through a 3-D Finite Element Model. In this way, knowing the modes of vibration, it was possible to understand the overall dynamic behavior of the structure.
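A minimal sketch of the Peak Picking step (an assumed implementation, not the authors' processing chain): take the FFT of one ambient-vibration channel, pick spectral peaks as natural-frequency estimates, and approximate each damping ratio from the half-power bandwidth around the corresponding peak.

```python
# Illustrative Peak Picking on a single acceleration channel; thresholds are assumptions.
import numpy as np
from scipy.signal import find_peaks

def half_power_damping(spectrum, freqs, p):
    """Approximate damping ratio from the half-power bandwidth around peak index p."""
    target = spectrum[p] / np.sqrt(2.0)
    lo = p
    while lo > 0 and spectrum[lo] > target:
        lo -= 1
    hi = p
    while hi < len(spectrum) - 1 and spectrum[hi] > target:
        hi += 1
    return (freqs[hi] - freqs[lo]) / (2.0 * freqs[p])

def peak_picking(accel, fs, prominence_frac=0.1):
    """Return (natural frequency [Hz], damping ratio) pairs from one channel sampled at fs."""
    accel = np.asarray(accel)
    windowed = accel * np.hanning(len(accel))                    # reduce spectral leakage
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(accel), d=1.0 / fs)
    peaks, _ = find_peaks(spectrum, prominence=spectrum.max() * prominence_frac)
    return [(freqs[p], half_power_damping(spectrum, freqs, p)) for p in peaks]
```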

Relevance: 40.00%

Abstract:

Aerobic Gymnastics is the ability to perform complex movements derived from traditional aerobic exercises, in a continuous manner, with high intensity, perfectly integrated with the soundtrack. The sport is performed under aerobic/anaerobic lactacid conditions and requires the execution of complex movements derived from traditional aerobic exercises, integrated with difficulty elements performed at a high technical level. The name "aerobic" itself is somewhat misleading, because Aerobic Gymnastics does not rely solely on aerobic work during competition: routines last between 1'30" and 1'45" at a high tempo. Competitive Aerobics exploits the basic movements and coordination schemes of amateur Aerobics, yet it is so much more intense that it requires a completely different mix of energy systems. Because of the complexity and speed with which the technical elements of Aerobic Gymnastics are performed, the introduction of video analysis is essential for a qualitative and quantitative evaluation of athletes' performance during training. Performance analysis allows the evolution and dynamics of a sporting and motor phenomenon to be analyzed and explained accurately. Notational analysis is used by technicians to obtain an objective analysis of performance: tactics, technique and individual movements can be analyzed to help coaches and athletes re-evaluate their performance and gain an advantage during competition. The purpose of the following experimental work is to provide a starting point for analyzing athletes' performance in an objective way, not only during competitions but especially during training. It is, therefore, advisable to introduce video analysis and notational analysis for a more quantitative and qualitative examination of technical movements, with the goal of improving both the athlete's technique and the coach's teaching.

Relevance: 40.00%

Abstract:

"PB-274008".

Relevance: 40.00%

Abstract:

Includes bibliographical references.

Relevance: 40.00%

Abstract:

Web transaction data between Web visitors and Web functionalities usually convey task-oriented user behavior patterns, and mining this type of click-stream data captures usage-pattern information. Web usage mining has become one of the most widely used methods for Web recommendation, which tailors Web content to a user's preferred style. Traditional Web usage mining techniques, such as Web user session or Web page clustering, association rules and frequent navigational path mining, can only discover usage patterns explicitly; they cannot reveal the underlying navigational activities or identify the latent relationships among Web users and Web pages that are associated with those patterns. In this work, we propose a Web recommendation framework that incorporates a Web usage mining technique based on the Probabilistic Latent Semantic Analysis (PLSA) model. The main advantage of this method is that it not only discovers usage-based access patterns but also reveals the underlying latent factors. With the discovered user access patterns, we then present users with more relevant content via collaborative recommendation. To validate the effectiveness of the proposed approach, we conduct experiments on real-world datasets and make comparisons with some existing traditional techniques. The preliminary experimental results demonstrate the usability of the proposed approach.
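For concreteness, here is a hedged sketch of the PLSA machinery such a framework builds on (illustrative, not the authors' implementation): EM estimation of P(z), P(session|z) and P(page|z) from a session-by-page co-occurrence matrix N, where z indexes the latent usage factors.

```python
# Plain-vanilla PLSA via EM on a session-by-page count matrix; sizes are assumptions.
import numpy as np

def plsa(N, n_factors=4, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    n_sessions, n_pages = N.shape
    pz = np.full(n_factors, 1.0 / n_factors)                       # P(z)
    ps_z = rng.random((n_factors, n_sessions)); ps_z /= ps_z.sum(1, keepdims=True)  # P(s|z)
    pp_z = rng.random((n_factors, n_pages));    pp_z /= pp_z.sum(1, keepdims=True)  # P(p|z)
    for _ in range(n_iter):
        # E-step: posterior P(z | session, page) for every pair
        joint = pz[:, None, None] * ps_z[:, :, None] * pp_z[:, None, :]   # shape (z, s, p)
        pz_sp = joint / joint.sum(0, keepdims=True).clip(1e-12)
        # M-step: reweight by the observed counts
        weighted = N[None, :, :] * pz_sp
        pz = weighted.sum((1, 2)); pz /= pz.sum()
        ps_z = weighted.sum(2); ps_z /= ps_z.sum(1, keepdims=True).clip(1e-12)
        pp_z = weighted.sum(1); pp_z /= pp_z.sum(1, keepdims=True).clip(1e-12)
    return pz, ps_z, pp_z
```

The latent factors P(page|z) can then be read off as prototypical navigation "tasks", and P(session|z) scores each user session against those tasks for collaborative recommendation.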

Relevance: 40.00%

Abstract:

This paper presents some initial attempts to mathematically model the dynamics of a continuous estimation of distribution algorithm (EDA) based on a Gaussian distribution and truncation selection. Case studies are conducted on both unimodal and multimodal problems to highlight the effectiveness of the proposed technique and explore some important properties of the EDA. Under some general assumptions, we show that, for 1-D unimodal problems with the (mu, lambda) scheme: (1) the behaviour of the EDA depends only on the general shape of the test function, rather than on its specific form; (2) when initialized far from the global optimum, the EDA has a tendency to converge prematurely; (3) given a certain selection pressure, there is a unique value of the proposed amplification parameter that helps the EDA achieve desirable performance. For 1-D multimodal problems: (1) the EDA could get stuck with the (mu, lambda) scheme; (2) the EDA will never get stuck when elitism is added to the (mu, lambda) scheme.
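A minimal sketch of the kind of algorithm being modeled (assumed and simplified; the name and default of the amplification parameter `a` are illustrative): a continuous Gaussian EDA with truncation (mu, lambda) selection on a 1-D objective.

```python
# Toy continuous Gaussian EDA with truncation selection; all defaults are assumptions.
import numpy as np

def gaussian_eda(f, x0, sigma0=1.0, lam=100, mu=20, a=1.5, generations=200, seed=0):
    """Minimize a 1-D function f. Returns the final distribution mean."""
    rng = np.random.default_rng(seed)
    mean, sigma = float(x0), float(sigma0)
    for _ in range(generations):
        pop = rng.normal(mean, sigma, size=lam)          # sample lambda offspring
        best = pop[np.argsort(f(pop))[:mu]]              # truncation selection: keep the mu best
        mean = best.mean()                                # re-estimate the Gaussian model
        sigma = a * best.std() + 1e-12                    # amplified std to counter premature convergence
    return mean

# Example on a 1-D unimodal test function, initialized far from the optimum at x = 3:
print(gaussian_eda(lambda x: (x - 3.0) ** 2, x0=-10.0))
```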

Relevance: 40.00%

Abstract:

This paper surveys the context of feature extraction by neural network approaches, and compares and contrasts their behaviour as prospective data visualisation tools in a real-world problem. We also introduce and discuss a hybrid approach that allows us to control the degree of discriminatory and topographic information in the extracted feature space.

Relevance: 40.00%

Abstract:

A new surface analysis technique has been developed which has a number of benefits compared to conventional Low Energy Ion Scattering Spectrometry (LEISS). A major potential advantage, arising from the absence of charge-exchange complications, is the possibility of quantification. The instrumentation that has been developed also offers the possibility of unique studies of the interaction between low-energy ions and atoms and solid surfaces; from these studies it may also be possible, in principle, to generate sensitivity factors to quantify LEISS data. The instrumentation, referred to as a Time-of-Flight Fast Atom Scattering Spectrometer, was developed to investigate these conjectures in practice. The development involved a number of modifications to an existing instrument, allowing samples to be bombarded with a monoenergetic pulsed beam of either atoms or ions, and providing the capability to analyse the spectra of scattered atoms and ions separately. In addition, a system was designed and constructed to allow the incident, exit and azimuthal angles of the particle beam to be varied independently. The key development was that of a pulsed, mass-filtered atom source, which was developed by a cyclic process of design, modelling and experimentation. Although it was possible to demonstrate the unique capabilities of the instrument, problems relating to surface contamination prevented the measurement of the neutralisation probabilities. However, these problems appear to be technical rather than scientific in nature, and could be readily resolved given the appropriate resources. Experimental spectra obtained from a number of samples demonstrate some fundamental differences between the scattered-ion and neutral spectra. For practical non-ordered surfaces the ToF spectra are more complex than their LEISS counterparts. This is particularly true for helium scattering, where it appears, in the absence of detailed computer simulation, that quantitative analysis is limited to ordered surfaces. Despite this limitation, the ToFFASS instrument opens the way for quantitative analysis of the 'true' surface region for a wider range of surface materials.
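The data reduction behind such a spectrometer rests on standard binary-collision kinematics. As a hedged, instrument-independent sketch (numerical values are examples only, not instrument parameters), a measured flight time over the field-free path is converted to scattered-particle energy, and the elastic kinematic factor links that energy to the mass of the surface atom struck.

```python
# Standard time-of-flight and binary-collision kinematics; example values are assumptions.
import math

AMU = 1.66053906660e-27   # kg
EV  = 1.602176634e-19     # J

def tof_to_energy_ev(flight_time_s, path_length_m, projectile_mass_amu):
    """E = (1/2) m (L/t)^2 for a particle drifting over the field-free path."""
    v = path_length_m / flight_time_s
    return 0.5 * projectile_mass_amu * AMU * v**2 / EV

def kinematic_factor(m_projectile_amu, m_target_amu, scattering_angle_deg):
    """E1/E0 for single elastic binary scattering (valid for target heavier than projectile)."""
    theta = math.radians(scattering_angle_deg)
    r = m_target_amu / m_projectile_amu
    return ((math.cos(theta) + math.sqrt(r**2 - math.sin(theta)**2)) / (1.0 + r)) ** 2

# Example: fraction of energy retained by He scattered through 135 degrees off a Ni atom.
print(kinematic_factor(4.0026, 58.693, 135.0))
```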

Relevance: 40.00%

Abstract:

The primary objective of this research was to determine the potential of fluorescence spectroscopy as a method for analysing surface deposition on contact lenses. In order to achieve this, it was first necessary to ascertain whether fluorescence analysis could detect and distinguish between protein and lipid deposited on a lens surface, and to determine the specific excitation wavelengths at which these deposited species were detected with the greatest sensitivity. Experimental observations showed that an excitation wavelength of 360 nm detects lipid deposited on a lens surface, while an excitation wavelength of 280 nm detects and distinguishes between protein and lipid deposited on a contact lens. It was also important to determine whether clean, unspoilt lenses showed significant levels of fluorescence themselves; spectra recorded from a variety of unworn contact lenses at excitation wavelengths of 360 nm and 280 nm indicated that most contact lens materials do not themselves fluoresce to any great extent. Following these initial experiments, various clinically and laboratory-based studies were performed using fluorescence spectroscopy as a method of analysing contact lens deposition levels. The clinically based studies enabled contact lenses with known wear histories to be rapidly and individually analysed following discontinuation of wear. Deposition levels in the early stages of lens wear were determined for various lens materials, and the effect of surfactant cleaning on deposition levels was also investigated. The laboratory-based studies involved comparing some of the in vivo results with those of identical lenses that had been spoilt using an in vitro method. Finally, an examination was made of lysozyme migration into and out of stored ionic, high-water-content contact lenses.