890 results for real option analysis


Relevance:

30.00%

Publisher:

Abstract:

Large quantities of teleseismic short-period seismograms recorded at SCARLET provide travel time, apparent velocity and waveform data for study of upper mantle compressional velocity structure. Relative array analysis of arrival times from distant (30° < Δ < 95°) earthquakes at all azimuths constrains lateral velocity variations beneath southern California. We compare dT/dΔ, back azimuth and averaged arrival time estimates from the entire network for 154 events to the same parameters derived from small subsets of SCARLET. Patterns of mislocation vectors for over 100 overlapping subarrays delimit the spatial extent of an east-west striking, high-velocity anomaly beneath the Transverse Ranges. Thin lens analysis of the averaged arrival time differences, called 'net delay' data, requires the mean depth of the corresponding lens to be more than 100 km. Our results are consistent with the PKP-delay times of Hadley and Kanamori (1977), who first proposed the high-velocity feature, but we place the anomalous material at substantially greater depths than their 40-100 km estimate.
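The relative array analysis described above amounts, at its core, to fitting a plane wave to the arrival times across the network: the least-squares slope of arrival time against station position gives the horizontal slowness (dT/dΔ) and the back azimuth. A minimal sketch of that fit, in pure Python with a hypothetical five-station geometry (not SCARLET coordinates):

```python
import math

def plane_wave_fit(stations, times):
    """Least-squares fit of t = t0 + px*x + py*y to array arrival times.

    stations: list of (x_east_km, y_north_km); times: arrivals in seconds.
    Returns (horizontal slowness in s/km, back azimuth in degrees)."""
    n = len(stations)
    G = [[1.0, x, y] for x, y in stations]
    # Normal equations G^T G m = G^T d for m = (t0, px, py).
    A = [[sum(G[i][r] * G[i][c] for i in range(n)) for c in range(3)]
         for r in range(3)]
    b = [sum(G[i][r] * times[i] for i in range(n)) for r in range(3)]
    # Gaussian elimination with partial pivoting on the 3x3 system.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    m = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        m[r] = (b[r] - sum(A[r][c] * m[c] for c in range(r + 1, 3))) / A[r][r]
    _, px, py = m
    slowness = math.hypot(px, py)                     # apparent dT/dX, s/km
    baz = math.degrees(math.atan2(-px, -py)) % 360.0  # direction to source
    return slowness, baz

# Synthetic plane wave with px = 0.04, py = 0.03 s/km over a made-up array.
stations = [(0.0, 0.0), (50.0, 10.0), (-30.0, 40.0), (20.0, -60.0), (-45.0, -25.0)]
times = [10.0 + 0.04 * x + 0.03 * y for x, y in stations]
slowness, baz = plane_wave_fit(stations, times)   # 0.05 s/km, ~233 deg
```

Mislocation vectors of the kind used to map the Transverse Ranges anomaly are then the differences between the slowness and back-azimuth estimates of a subarray and those of the full network.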

Detailed analysis of travel time, ray parameter and waveform data from 29 events occurring in the distance range 9° to 40° reveals the upper mantle structure beneath an oceanic ridge to depths of over 900 km. More than 1400 digital seismograms from earthquakes in Mexico and Central America yield 1753 travel times and 58 dT/dΔ measurements as well as high-quality, stable waveforms for investigation of the deep structure of the Gulf of California. The result of a travel time inversion with the tau method (Bessonova et al., 1976) is adjusted to fit the p(Δ) data, then further refined by incorporation of relative amplitude information through synthetic seismogram modeling. The application of a modified wave field continuation method (Clayton and McMechan, 1981) to the data confirms that the final model, GCA, is consistent with the entire data set and also provides an estimate of the data resolution in velocity-depth space. We discover that the upper mantle under this spreading center has anomalously slow velocities to depths of 350 km, and place new constraints on the shape of the 660 km discontinuity.

Seismograms from 22 earthquakes along the northeast Pacific rim recorded in southern California form the data set for a comparative investigation of the upper mantle beneath the Cascade Ranges-Juan de Fuca region, an ocean-continent transition. These data consist of 853 seismograms (6° < Δ < 42°) which produce 1068 travel times and 40 ray parameter estimates. We use the spreading center model initially in synthetic seismogram modeling, and perturb GCA until the Cascade Ranges data are matched. Wave field continuation of both data sets with a common reference model confirms that real differences exist between the two suites of seismograms, implying lateral variation in the upper mantle. The ocean-continent transition model, CJF, features velocities between 200 and 350 km that are intermediate between GCA and T7 (Burdick and Helmberger, 1978), a model for the inland western United States. Models of continental shield regions (e.g., King and Calcagnile, 1976) have higher velocities in this depth range, but all four model types are similar below 400 km. This variation in rate of velocity increase with tectonic regime suggests an inverse relationship between velocity gradient and lithospheric age above 400 km depth.



The laminar to turbulent transition process in boundary layer flows in thermochemical nonequilibrium at high enthalpy is measured and characterized. Experiments are performed in the T5 Hypervelocity Reflected Shock Tunnel at Caltech, using a 1 m long, 5-degree half-angle axisymmetric cone instrumented with 80 fast-response annular thermocouples, complemented by boundary layer stability computations using the STABL software suite. A new mixing tank is added to the shock tube fill apparatus for premixed freestream gas experiments, and a new cleaning procedure results in more consistent transition measurements. Transition location is nondimensionalized using a scaling with the boundary layer thickness, which is correlated with the acoustic properties of the boundary layer, and compared with parabolized stability equation (PSE) analysis. In these nondimensionalized terms, transition delay with increasing CO2 concentration is observed: tests in 100% and 50% CO2, by mass, transition up to 25% and 15% later, respectively, than air experiments. These results are consistent with previous work indicating that CO2 molecules at elevated temperatures absorb acoustic instabilities in the MHz range, which is the expected frequency of the Mack second-mode instability at these conditions, and also consistent with predictions from PSE analysis. A strong unit Reynolds number effect is observed, which is believed to arise from tunnel noise. Transition N factors (N_Tr) for air from 5.4 to 13.2 are computed, substantially higher than previously reported for noisy facilities. Time- and spatially-resolved heat transfer traces are used to track the propagation of turbulent spots, and convection rates at 90%, 76%, and 63% of the boundary layer edge velocity, respectively, are observed for the leading edge, centroid, and trailing edge of the spots. A model constructed with these spot propagation parameters is used to infer spot generation rates from the measured transition onset to completion distance.
Finally, a novel method to control transition location with boundary layer gas injection is investigated. An appropriate porous-metal injector section for the cone is designed and fabricated, and the efficacy of injected CO2 for delaying transition is gauged at various mass flow rates, and compared with both no injection and chemically inert argon injection cases. While CO2 injection seems to delay transition, and argon injection seems to promote it, the experimental results are inconclusive and matching computations do not predict a reduction in N factor from any CO2 injection condition computed.
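The turbulent-spot convection rates quoted above (90%, 76% and 63% of the boundary layer edge velocity for the leading edge, centroid and trailing edge) imply a simple kinematic picture: once a spot is born, the time window during which a downstream gauge reads turbulent follows directly from the edge speeds. A toy sketch with illustrative numbers (made up, not T5 conditions):

```python
def spot_window(x0_m, t0_s, x_gauge_m, u_edge_ms,
                c_lead=0.90, c_trail=0.63):
    """Arrival window of a turbulent spot at a downstream heat-transfer
    gauge: the spot's edges convect at fixed fractions of the boundary
    layer edge velocity, so the fast leading edge arrives first and the
    slow trailing edge last."""
    dx = x_gauge_m - x0_m
    t_lead = t0_s + dx / (c_lead * u_edge_ms)
    t_trail = t0_s + dx / (c_trail * u_edge_ms)
    return t_lead, t_trail

# Illustrative numbers: spot born 0.3 m from the tip, gauge at 0.8 m,
# edge velocity 4 km/s.
t_lead, t_trail = spot_window(0.3, 0.0, 0.8, 4000.0)
duration = t_trail - t_lead   # time the gauge reads turbulent
```

Summing such windows from a population of spots born at an assumed generation rate gives the intermittency distribution that the spot-generation inference in the abstract works backward from.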


Raman spectroscopy on single, living epithelial cells captured in a laser trap is shown to have diagnostic power for colorectal cancer. This new single-cell technology comprises three major components: primary culture processing of human tissue samples to produce single-cell suspensions, Raman detection on singly trapped cells, and diagnoses of the cells by artificial neural network classifications. It is compared with DNA flow cytometry for similarities and differences. Its advantages over tissue Raman spectroscopy are also discussed. In the actual construction of a diagnostic model for colorectal cancer, real patient data were taken to generate a training set of 320 Raman spectra and a test set of 80. By incorporating outlier corrections into a conventional binary neural classifier, our network accomplished significantly better predictions than logistic regressions, with sensitivity improved from 77.5% to 86.3% and specificity improved from 81.3% to 86.3% for the training set, and moderate improvements for the test set. Most importantly, the network approach enables a sensitivity map analysis to quantitate the relevance of each Raman band to the normal-to-cancer transformation at the cell level. Our technique has direct clinical applications for diagnosing cancers and basic science potential in the study of the cell dynamics of carcinogenesis. (C) 2007 Society of Photo-Optical Instrumentation Engineers.
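The sensitivity-map idea — scoring each Raman band by how strongly it drives the classifier output — can be illustrated on a single logistic unit, where the map is just the gradient of the output with respect to the input spectrum. A toy sketch with made-up weights and a three-band "spectrum" (the paper's trained multilayer network would backpropagate the same gradient through its layers):

```python
import math

def logistic_output(weights, bias, spectrum):
    """Single logistic unit: sigma(bias + w . s)."""
    z = bias + sum(w * s for w, s in zip(weights, spectrum))
    return 1.0 / (1.0 + math.exp(-z))

def sensitivity_map(weights, bias, spectrum):
    """Gradient of the unit's output with respect to each input band:
    d(out)/d(band_i) = w_i * y * (1 - y).  Bands with large magnitude
    contribute most to the normal-vs-cancer call; the sign says which
    direction a stronger band pushes the decision."""
    y = logistic_output(weights, bias, spectrum)
    return [w * y * (1.0 - y) for w in weights]

# Made-up 3-band example: the third band dominates the decision.
weights, bias = [0.5, -1.0, 2.0], 0.0
spectrum = [1.0, 1.0, 1.0]
smap = sensitivity_map(weights, bias, spectrum)
```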



The analysis of the evolution of the M3 money aggregate is an important element in the definition and implementation of monetary policy for the ECB. A well-defined and stable long-run demand function is an essential requisite for M3 to be a valid monetary tool. Therefore, this paper analyzes, based on cointegration techniques, the existence of a long-run money demand, estimating it and testing its stability for the Euro Area and for ten of its member countries. Specifically, bearing in mind the high degree of monetary instability that the current economic crisis has created in the Euro Area, we also test whether this has had a noticeable impact on the cointegration between real money demand and its determinants. The analysis gives evidence of the existence of a long-run relationship when the aggregate Euro Area and six of the ten countries are considered. However, these relationships have been highly unstable since the outbreak of the financial crisis, in some cases even leading to rejection of cointegration. All this suggests that the ECB's strategy of focusing on the M3 monetary aggregate may not be a convenient approach under the current circumstances.
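A residual-based cointegration check of the kind the paper relies on can be sketched in the Engle–Granger style: regress real money on its determinant, then run a Dickey–Fuller regression on the residual to test for mean reversion. A toy sketch on synthetic data (one regressor, no lag augmentation; the critical value in the comment is approximate, and the series are simulated, not Euro Area data):

```python
import random

def ols(x, y):
    """OLS of y on a constant and x; returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return my - slope * mx, slope

def df_tstat(u):
    """Dickey-Fuller t-statistic: regress du_t on u_{t-1}, no constant.
    Strongly negative values mean the residual mean-reverts."""
    lag = u[:-1]
    du = [u[t] - u[t - 1] for t in range(1, len(u))]
    rho = sum(l * d for l, d in zip(lag, du)) / sum(l * l for l in lag)
    resid = [d - rho * l for l, d in zip(lag, du)]
    s2 = sum(e * e for e in resid) / (len(du) - 1)
    se = (s2 / sum(l * l for l in lag)) ** 0.5
    return rho / se

random.seed(0)
# Synthetic cointegrated pair: x is a random walk (the determinant),
# y = 2x + stationary noise (the "money demand").
x = [0.0]
for _ in range(499):
    x.append(x[-1] + random.gauss(0.0, 1.0))
y = [2.0 * xi + random.gauss(0.0, 1.0) for xi in x]
intercept, slope = ols(x, y)
u = [yi - intercept - slope * xi for xi, yi in zip(x, y)]
tstat = df_tstat(u)
# A t-statistic far below the (roughly -3.4) Engle-Granger critical
# value rejects "no cointegration"; after a structural break the
# statistic moves toward zero and the test fails to reject.
```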


The Laser Interferometer Gravitational-Wave Observatory (LIGO) consists of two complex large-scale laser interferometers designed for direct detection of gravitational waves from distant astrophysical sources in the frequency range 10 Hz to 5 kHz. Direct detection of space-time ripples will support Einstein's general theory of relativity and provide invaluable information and new insight into the physics of the Universe.

The initial phase of LIGO started in 2002, and since then data were collected during six science runs. Instrument sensitivity improved from run to run due to the efforts of the commissioning team. Initial LIGO reached its design sensitivity during the last science run, which ended in October 2010.

In parallel with commissioning and data analysis on the initial detector, the LIGO group worked on research and development of the next generation of detectors. The major instrument upgrade from initial to Advanced LIGO started in 2010 and lasted until 2014.

This thesis describes the results of commissioning work done at the LIGO Livingston site from 2013 until 2015, in parallel with and after the installation of the instrument. It also discusses new techniques and tools developed at the 40m prototype, including adaptive filtering, estimation of quantization noise in digital filters, and the design of isolation kits for ground seismometers.

The first part of this thesis is devoted to the description of methods for bringing the interferometer to the linear regime, where collection of data becomes possible. The states of the longitudinal and angular controls of the interferometer degrees of freedom during the lock acquisition process and in the low-noise configuration are discussed in detail.

Once the interferometer is locked and transitioned to the low-noise regime, the instrument produces astrophysical data that must be calibrated to units of meters or strain. The second part of this thesis describes the online calibration technique set up at both observatories to monitor the quality of the collected data in real time. A sensitivity analysis was performed to understand and eliminate noise sources of the instrument.

The coupling of noise sources to the gravitational wave channel can be reduced if robust feedforward and optimal feedback control loops are implemented. The last part of this thesis describes static and adaptive feedforward noise cancellation techniques applied to the Advanced LIGO interferometers and tested at the 40m prototype. Applications of optimal time-domain feedback control techniques and estimators to aLIGO control loops are also discussed.
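The adaptive feedforward idea mentioned above can be illustrated with a standard LMS filter: a witness channel that sees the noise is adaptively filtered and subtracted from the target channel, leaving the signal behind. A minimal sketch on synthetic data (illustrative signals, path and step size — not LIGO channels or the thesis's actual filter):

```python
import math
import random

def lms_cancel(primary, witness, n_taps=8, mu=0.01):
    """LMS adaptive feedforward canceller: learn an FIR filter that maps
    the witness (noise-only) channel onto the noise in the primary
    channel, and subtract the estimate sample by sample."""
    w = [0.0] * n_taps            # adaptive FIR weights
    buf = [0.0] * n_taps          # recent witness samples
    cleaned = []
    for d, xk in zip(primary, witness):
        buf = [xk] + buf[:-1]
        y = sum(wi * bi for wi, bi in zip(w, buf))   # noise estimate
        e = d - y                                    # cleaned output
        w = [wi + 2.0 * mu * e * bi for wi, bi in zip(w, buf)]
        cleaned.append(e)
    return cleaned

random.seed(1)
n = 5000
witness = [random.gauss(0.0, 1.0) for _ in range(n)]
signal = [0.1 * math.sin(2.0 * math.pi * 0.01 * t) for t in range(n)]
# The noise reaches the primary channel through an unknown 2-tap path.
noise = [0.7 * witness[t] + (0.3 * witness[t - 1] if t > 0 else 0.0)
         for t in range(n)]
primary = [s + v for s, v in zip(signal, noise)]
cleaned = lms_cancel(primary, witness)
# After convergence the residual power approaches the signal power alone.
```

A static feedforward path is the same structure with the weights frozen at a previously fitted value; the adaptive version retunes them continuously as couplings drift.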

Commissioning work is still ongoing at the sites. The first science run of Advanced LIGO is planned for September 2015 and will last for 3-4 months. This run will be followed by a set of small instrument upgrades installed on a time scale of a few months. The second science run will start in spring 2016 and last for about 6 months. Since the current sensitivity of Advanced LIGO is already more than a factor of 3 higher than that of the initial detectors and keeps improving on a monthly basis, the upcoming science runs have a good chance of yielding the first direct detection of gravitational waves.


I. Crossing transformations constitute a group of permutations under which the scattering amplitude is invariant. Using Mandelstam's analyticity, we decompose the amplitude into irreducible representations of this group. The usual quantum numbers, such as isospin or SU(3), are "crossing-invariant". Thus no higher symmetry is generated by crossing itself. However, elimination of certain quantum numbers in intermediate states is not crossing-invariant, and higher symmetries have to be introduced to make it possible. The current literature on exchange degeneracy is a manifestation of this statement. To exemplify application of our analysis, we show how, starting with SU(3) invariance, one can use crossing and the absence of exotic channels to derive the quark-model picture of the tensor nonet. No detailed dynamical input is used.

II. A dispersion relation calculation of the real parts of forward π±p and K±p scattering amplitudes is carried out under the assumption of constant total cross sections in the Serpukhov energy range. Comparisons with existing experimental results, as well as predictions for future high-energy experiments, are presented and discussed. Electromagnetic effects are found to be too small to account for the expected difference between the π-p and π+p total cross sections at higher energies.


[ES] This final degree project consists of two parts: the first presents the theory of the networks that will be used in the second part, which consists of the analysis of a real case to which the networks studied in the first part are applied.


In this paper, a real-time sliding mode control scheme for a variable speed wind turbine that incorporates a doubly fed induction generator is described. In this design, the so-called vector control theory is applied in order to simplify the system electrical equations. The proposed control scheme involves a low computational cost and therefore can be implemented in real-time applications using a low-cost Digital Signal Processor (DSP). The stability analysis of the proposed sliding mode controller under disturbances and parameter uncertainties is provided using Lyapunov stability theory. A new experimental platform has been designed and constructed in order to analyze the real-time performance of the proposed controller in a real system. Finally, the experimental validation carried out on the experimental platform shows, on the one hand, that the proposed controller provides high-performance dynamic characteristics, and on the other hand, that this scheme is robust with respect to the uncertainties that usually appear in real systems.
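The Lyapunov argument behind a sliding mode loop can be illustrated on a scalar plant: take the sliding variable s as the tracking error and a switching gain k larger than the disturbance bound, so that V = s²/2 satisfies dV/dt = s·ds/dt ≤ -(k - d_max)|s| and the surface s = 0 is reached in finite time. A toy sketch (scalar first-order plant with made-up parameters; the paper's controller acts on the full DFIG vector model):

```python
import math

def simulate(k=2.0, dt=1e-4, t_end=2.0):
    """Scalar plant dx/dt = a*x + b*u + d(t) with |d(t)| <= 1.
    Sliding variable s = x - x_ref; the control cancels the known
    dynamics and adds the switching term -k*sgn(s).  For k > max|d|,
    s*ds/dt <= -(k - 1)|s|, so V = s**2/2 decays and the sliding mode
    is reached in finite time despite the unmodeled disturbance."""
    a, b = -1.0, 2.0          # made-up plant parameters
    x, hist = 0.5, []         # start with a 0.5 tracking error
    for i in range(int(t_end / dt)):
        t = i * dt
        x_ref, dx_ref = math.sin(t), math.cos(t)
        s = x - x_ref
        sgn = 1.0 if s > 0 else (-1.0 if s < 0 else 0.0)
        u = (-a * x + dx_ref - k * sgn) / b   # equivalent + switching term
        d = math.sin(5.0 * t)                 # bounded disturbance
        x += (a * x + b * u + d) * dt         # forward-Euler step
        hist.append(abs(s))
    return hist

err = simulate()   # |s| decays to a small chattering band near zero
```

The chattering band scales with the time step times the switching gain; in practice (and on a DSP) the sign function is often smoothed into a boundary layer to limit it.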


[ES] Many vehicle parts operate under random stresses, so the cumulative fatigue damage of the materials must be taken into account in the design. However, there is no specific calculation method in this field; rather, several possible choices are available, which creates the need for a comparison between them in order to establish a selection criterion. Providing that criterion is the purpose of this report, which will be supported by the analysis of an aircraft landing gear component, chosen both for the type of loads to which it is subjected and for the growing industrial importance of the aeronautical sector, in which design is of special interest.


[EU] In this work, the design of a motorized skateboard has been carried out from scratch, using the knowledge acquired in the Degree in Industrial Technology Engineering. The skateboard is a means of transport widely used by young people for leisure, and, with simplicity and economy in mind, this design combines leisure with a commitment to the environment. The work follows a clear path: first the state of the market is analyzed, then several possible designs are proposed, and from these one option is developed. Starting from the beginning, the work has the following structure. First, the state of the motorized skateboard market is analyzed; then, bearing the objectives in mind, several possible designs are proposed, which appear in the analysis-of-alternatives section. These proposals are examined, weighing their advantages and disadvantages, and one solution is selected. A CAD model of the chosen design is built, in order to then model the physical situation mathematically and carry out the necessary calculations, which appear in the methodology section and in the annexes: for example, the power required to achieve a given acceleration, and the transmission calculations needed to decide which elements to install. Along the way, in addition to the design developed, a more optimized and complex design is proposed, with the research path left open and oriented more toward the business world. Finally, a description and duration of each of the tasks carried out, the budget and statement of expenses, a risk analysis, and the conclusions drawn from this project are given.