31 results for Magnitude
Abstract:
Half-lives of radionuclides span more than 50 orders of magnitude. We characterize the probability distribution of this broad-range data set while also exploring a method for fitting power laws and testing goodness of fit. We find that the procedure recently proposed by Clauset et al. [SIAM Rev. 51, 661 (2009)] does not perform well, as it rejects the power-law hypothesis even for synthetic power-law data. In contrast, we establish the existence of a power-law exponent with a value around 1.1 for the half-life density, which can be explained by the sharp relationship between decay rate and released energy for the different disintegration types. For the case of alpha emission, this relationship constitutes an original mechanism of power-law generation.
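The fitting machinery this abstract refers to is compact enough to sketch. Below is a minimal, illustrative Python version of the continuous power-law maximum-likelihood estimator and the Kolmogorov-Smirnov distance used for goodness of fit in the Clauset et al. recipe; the function names and the synthetic-data parameters (alpha = 2.1, xmin = 1) are my own choices for illustration, not values from the paper.

```python
import math
import random

def fit_powerlaw_alpha(data, xmin):
    """Continuous maximum-likelihood (Hill) estimate of the power-law exponent."""
    tail = [x for x in data if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

def ks_distance(data, xmin, alpha):
    """Max gap between the empirical CDF of the tail and the fitted power-law CDF."""
    tail = sorted(x for x in data if x >= xmin)
    n = len(tail)
    return max(
        abs((i + 1) / n - (1.0 - (x / xmin) ** (1.0 - alpha)))
        for i, x in enumerate(tail)
    )

# Synthetic power-law sample by inverse-transform sampling: P(X > x) = (x/xmin)^(1-alpha)
random.seed(42)
alpha_true, xmin = 2.1, 1.0
sample = [xmin * (1.0 - random.random()) ** (-1.0 / (alpha_true - 1.0))
          for _ in range(20000)]

alpha_hat = fit_powerlaw_alpha(sample, xmin)
d = ks_distance(sample, xmin, alpha_hat)
```

On clean synthetic data the estimated exponent lands close to the true one and the KS distance is small; the full Clauset et al. test additionally compares this distance against Monte Carlo replicas to obtain a p-value.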
Abstract:
Research project carried out during a stay at the State University of New York at Stony Brook, USA, in August and September 2008. It has been shown how the difference in lasing performance between two highly ytterbium-doped (20 at. %) samples, namely Yb-doped epitaxial layers of KY(WO4)2 (KYW) grown on a KYW substrate and Yb-doped epitaxial layers of KLu(WO4)2 (KLuW) grown on a KLuW substrate, is related to the presence of structural stress in the epitaxial layers, investigated by Synchrotron White Beam X-Ray Topography. From the results obtained it is clear that the samples showing the larger amount of structural stress, the Yb-doped KYW epitaxies grown on KYW substrates, lead to a lower lasing efficiency. This allows a direct correlation to be established between the existence and magnitude of this structural stress and the loss of laser performance in these epitaxial layers, which are otherwise spectroscopically equivalent.
Abstract:
With the advent of high-performance computing, it is now possible to achieve orders-of-magnitude gains in performance and computational efficiency over conventional computer architectures. This thesis explores the potential of using high-performance computing to accelerate whole genome alignment (WGA). A parallel technique is applied to an algorithm for whole genome alignment; the technique is explained and experiments were carried out to test it. It is based on a fair usage of the available resources to execute genome alignment and on how this can be exploited in HPC clusters. This work is a first approximation to whole genome alignment; it shows the advantages of parallelism and some of the drawbacks of our technique. It describes the resource limitations of current WGA applications when dealing with large quantities of sequences, and proposes a parallel heuristic to distribute the load and to ensure that alignment quality is maintained.
Abstract:
The control of optical fields on the nanometre scale is becoming an increasingly important tool in many fields, ranging from channelling light delivery in photovoltaics and light-emitting diodes to increasing the sensitivity of chemical sensors down to the single-molecule level. The ability to design and manipulate light fields with specific frequency and spatial characteristics is explored in this project. We present an alternative realisation of Extraordinary Optical Transmission (EOT) that requires only a single aperture and a coupled waveguide, and we show how this waveguide-resonant EOT improves the transmissivity of single apertures. An important technique in imaging is Near-Field Scanning Optical Microscopy (NSOM); we show how waveguide-resonant EOT and a novel probe design improve the efficiency of NSOM probes by two orders of magnitude and allow the imaging of single molecules with an optical resolution as good as 50 nm. We also show how optical antennas are fabricated at the apex of sharp tips and can be used in a near-field configuration.
Abstract:
We explore in depth the validity of a recently proposed scaling law for earthquake inter-event time distributions in the case of Southern California, using the waveform cross-correlation catalog of Shearer et al. Two statistical tests are used. On the one hand, the standard two-sample Kolmogorov-Smirnov test is in agreement with the scaling of the distributions. On the other hand, the one-sample Kolmogorov-Smirnov statistic, complemented with Monte Carlo simulation of the inter-event times as done by Clauset et al., supports the validity of the gamma distribution as a simple model of the scaling function appearing in the scaling law, for rescaled inter-event times above 0.01, except for the largest data set (magnitude greater than 2). A discussion of these results is provided.
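The two-sample test mentioned here is simple enough to sketch. The following stdlib-only Python snippet computes the two-sample Kolmogorov-Smirnov statistic on inter-event times rescaled by their mean rate, with synthetic exponential data standing in for the earthquake catalog; all names and parameters are illustrative assumptions, not the paper's.

```python
import random

def ks_two_sample(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: max gap between empirical CDFs."""
    a, b = sorted(a), sorted(b)
    i = j = 0
    d = 0.0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            i += 1
        else:
            j += 1
        d = max(d, abs(i / len(a) - j / len(b)))
    return d

# Two synthetic catalogs with very different mean rates; after rescaling each
# series by its own mean inter-event time, the distributions should collapse.
random.seed(0)
slow = [random.expovariate(0.2) for _ in range(5000)]  # mean inter-event time 5
fast = [random.expovariate(5.0) for _ in range(5000)]  # mean inter-event time 0.2
rescaled_slow = [t / (sum(slow) / len(slow)) for t in slow]
rescaled_fast = [t / (sum(fast) / len(fast)) for t in fast]

d_raw = ks_two_sample(slow, fast)
d_rescaled = ks_two_sample(rescaled_slow, rescaled_fast)
```

The raw series are far apart in KS distance, while the rescaled series are statistically indistinguishable, which is the kind of collapse the scaling law predicts.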
Abstract:
The literature on local services has focused on the effects of privatization and, if anything, has compared the effects of private and mixed public-private systems versus public provision. However, alternative forms of provision such as cooperatives, which can be very prevalent in many developing countries, have been completely ignored. In this paper, we investigate the effects of communal water provision (Comités Vecinales and Juntas Administrativas de Servicios de Saneamiento) on child health in Peru. Using detailed survey data at the household and child level for the years 2006-2010, we exploit the cross-section variability to assess the differential impact of this form of provision. Despite controlling for a wide range of household and local characteristics, the municipalities served by communal organizations are more likely to have poorer health indicators, which would result in a downward bias on the absolute magnitude of the effect of cooperatives. We rely on an instrumental variable strategy to deal with this potential endogeneity problem, using the personnel resources and the administrative urban/rural classification of the municipalities as instruments for the provision type. The results show a negative and significant effect of communal water provision on diarrhea among under-five-year-old children. Keywords: water utilities, cooperatives, child health, regulation, Peru. JEL Classification Numbers: L33; L50; L95.
Abstract:
This work analyses the steric contribution of molecules to their chemical and physical properties by evaluating their volume and their similarity measure, from now on defined as first-order molecular descriptors. The difference between these two concepts has been clarified: whereas the volume is the magnitude of the space occupied by the molecule as a global entity, the similarity measure gives an idea of how the electron density is distributed throughout this volume, and better reflects the existing local differences. The use of several approximations to obtain both values has been analysed for different classes of isomers.
Abstract:
Existing data on the progressive increase of human immunodeficiency virus (HIV) infection among intravenous drug users (IDUs) and their partners and children raise the urgent need to develop preventive programmes that are as effective as possible. In the present work we pursue three objectives: 1) To point out some shortcomings observed in the prevention models applied to the case of AIDS. 2) To place special emphasis on the influence on AIDS-preventive behaviours of certain factors that are generally not sufficiently taken into account in current models, such as the magnitude of the reinforcement contingent on a given behaviour and the delay with which it is received. 3) To present the results of a study carried out with intravenous drug users (Planes, 1991), whose objectives were to determine the relationships between the magnitude and delay of the reinforcement contingent on preventive sexual behaviours and the frequency of those behaviours.
Abstract:
This paper describes a method for extracting the most relevant contours of an image. The proposed method integrates the information of the local contours from the chromatic components H, S and I, taking into account the coherence of the local contour orientation values obtained from each of these components. The process is based on parametrizing, pixel by pixel, the local contours (magnitude and orientation values) of the H, S and I images. This is carried out individually for each chromatic component; if the dispersion of the obtained orientation values is high, that chromatic component loses relevance. A final processing stage integrates the extracted contours of the three chromatic components, generating the so-called integrated contours image.
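The per-pixel parametrization of a local contour into magnitude and orientation can be illustrated with a standard Sobel operator on a single channel. The abstract does not specify which gradient operator is used, so this is only an assumed, minimal sketch; the dispersion-based weighting and the final integration of the H, S and I contours are not shown.

```python
import math

SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def local_contour(img, r, c):
    """Gradient magnitude and orientation at an interior pixel of a 2-D list."""
    gx = sum(SOBEL_X[i][j] * img[r - 1 + i][c - 1 + j]
             for i in range(3) for j in range(3))
    gy = sum(SOBEL_Y[i][j] * img[r - 1 + i][c - 1 + j]
             for i in range(3) for j in range(3))
    return math.hypot(gx, gy), math.atan2(gy, gx)

# A vertical step edge in one channel (think of the intensity component I):
# the gradient should point horizontally (orientation 0) with nonzero magnitude.
channel = [[0, 0, 1, 1, 1] for _ in range(5)]
mag, orient = local_contour(channel, 2, 2)
```

Running the same parametrization on each of H, S and I and comparing the per-component orientation fields is what the coherence criterion above operates on.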
Abstract:
A decentralized model reference controller is designed to reduce the magnitude of the transversal vibration of a flexible cable-stayed beam structure induced by a seismic excitation. The controller design is based on the principle of sliding mode such that a priori knowledge
Abstract:
Given an observed test statistic and its degrees of freedom, one may compute the observed P value with most statistical packages. It is unknown to what extent test statistics and P values are congruent in published medical papers. Methods: We checked the congruence of statistical results reported in all the papers of volumes 409–412 of Nature (2001) and a random sample of 63 results from volumes 322–323 of BMJ (2001). We also tested whether the frequencies of the last digit of a sample of 610 test statistics deviated from a uniform distribution (i.e., equally probable digits). Results: 11.6% (21 of 181) and 11.1% (7 of 63) of the statistical results published in Nature and BMJ respectively during 2001 were incongruent, probably mostly due to rounding, transcription, or typesetting errors. At least one such error appeared in 38% and 25% of the papers of Nature and BMJ, respectively. In 12% of the cases, the significance level might change by one or more orders of magnitude. The frequencies of the last digit of the statistics deviated from the uniform distribution and suggested digit preference in rounding and reporting. Conclusions: This incongruence of test statistics and P values is another example that statistical practice is generally poor, even in the most renowned scientific journals, and that the quality of papers should be more controlled and valued.
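The congruence check described in this abstract amounts to recomputing the P value implied by each reported statistic and comparing it with the published one. A minimal sketch for the standard-normal case is below (a t or chi-square statistic would need the corresponding distribution function); the helper names and the two-decimal rounding convention are my assumptions, not the paper's protocol.

```python
import math

def two_sided_p_from_z(z):
    """Two-sided P value for a standard-normal test statistic."""
    return math.erfc(abs(z) / math.sqrt(2.0))

def congruent(z, reported_p, decimals=2):
    """Does the reported (rounded) P value match the one implied by z?"""
    return round(two_sided_p_from_z(z), decimals) == round(reported_p, decimals)

p = two_sided_p_from_z(1.96)      # close to the conventional 0.05 threshold
ok = congruent(1.96, 0.05)        # congruent report
bad = congruent(1.96, 0.005)      # an order-of-magnitude slip, as found in 12% of cases
```

Applied to a list of (statistic, reported P) pairs, this flags exactly the kind of rounding and transcription errors the study quantifies.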
Abstract:
Quantitative or algorithmic trading is the automation of investment decisions obeying a fixed or dynamic set of rules to determine trading orders. It has increasingly made its way up to 70% of the trading volume of some of the biggest financial markets, such as the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, owing to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject where publications are scarce and infrequent. We review the basic and fundamental mathematical concepts needed for modeling financial markets, such as stochastic processes, stochastic integration, and basic models for price and spread dynamics necessary for building quantitative strategies. We also contrast these models with real market data, sampled at one-minute frequency, from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behavior: trend following or mean reversion. The former is grouped in the so-called technical models and the latter in so-called pairs trading. Technical models have been discarded by financial theoreticians, but we show that they can be properly cast into a well-defined scientific predictor if the signal they generate passes the test of being a Markov time; that is, we can tell whether the signal has occurred or not by examining the information up to the current time, or, more technically, if the event is F_t-measurable. On the other hand, the concept of pairs trading, or market-neutral strategy, is fairly simple. However, it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, to a co-integration framework, to stochastic differential equations such as the well-known Ornstein-Uhlenbeck mean-reverting process and its variations.
A model for forecasting any economic or financial magnitude could be properly defined with scientific rigor and yet lack any economic value, being considered useless from a practical point of view. This is why this project could not be complete without a backtesting of the mentioned strategies. Conducting a useful and realistic backtesting is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving in time. This is why we place emphasis on the calibration of the strategies' parameters to adapt to the given market conditions. We find that the parameters of technical models are more volatile than their counterparts from market-neutral strategies, and that calibration must be done at a high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtesting with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numeric computations and graphics used and shown in this project were implemented in MATLAB from scratch as part of this thesis. No other mathematical or statistical software was used.
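One of the pairs-trading models mentioned above, the Ornstein-Uhlenbeck process, can be calibrated from an observed spread via a lag-1 autoregression, since the exact discretisation of an OU process is AR(1). The sketch below (in Python rather than the project's MATLAB) simulates a spread and recovers the mean-reversion speed; every parameter value here is an illustrative assumption, not a result from the thesis.

```python
import math
import random

def calibrate_ou(spread, dt):
    """Estimate the OU mean-reversion speed theta from a spread series
    via the lag-1 regression x[t+1] ~ slope * x[t], slope = exp(-theta*dt)."""
    x, y = spread[:-1], spread[1:]
    mx = sum(x) / len(x)
    my = sum(y) / len(y)
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    slope = cov / var
    return -math.log(slope) / dt

# Simulate an OU spread with its exact discretisation x[t+1] = a*x[t] + noise.
random.seed(1)
theta_true, sigma, dt, n = 2.0, 0.5, 0.01, 20000
a = math.exp(-theta_true * dt)
noise_sd = sigma * math.sqrt((1.0 - a * a) / (2.0 * theta_true))
x = [0.0]
for _ in range(n):
    x.append(a * x[-1] + random.gauss(0.0, noise_sd))

theta_hat = calibrate_ou(x, dt)
```

Re-running this calibration on a rolling window is one way to track the parameter drift that the backtesting discussion emphasizes.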
Abstract:
This paper studies the limits of discrete-time repeated games with public monitoring. We solve and characterize the Abreu, Milgrom and Pearce (1991) problem. We find that for the "bad" ("good") news model the lower- (higher-) magnitude events suggest cooperation, i.e., zero punishment probability, while the higher- (lower-) magnitude events suggest defection, i.e., punishment with probability one. Public correlation is used to connect these two sets of signals and to make the enforceability constraint bind. The dynamic and limit behavior of the punishment probabilities for variations in ... (the discount rate) and ... (the time interval) are characterized, as well as the limit payoffs for all these scenarios (we also introduce uncertainty in the time domain). The obtained ... limits are, to the best of my knowledge, new. The obtained ... limits coincide with Fudenberg and Levine (2007) and Fudenberg and Olszewski (2011), with the exception that we clearly state the precise informational conditions that cause the limit to converge from above, to converge from below, or to degenerate. JEL: C73, D82, D86. Keywords: Repeated Games, Frequent Monitoring, Random Public Monitoring, Moral Hazard, Stochastic Processes.
Abstract:
Three multivariate statistical tools (principal component analysis, factor analysis, and discriminant analysis) have been tested to characterize and model the sags registered in distribution substations. The models use several features, obtained from the voltage and current waveforms, to represent the magnitude, duration and degree of unbalance of the sags. The techniques are tested and compared using 69 sag registers, and the advantages and drawbacks of each technique are listed.
Abstract:
The main objective of this ex post facto study is to compare the differences in cognitive functions, and their relation to schizotypal personality traits, between a group of unaffected parents of schizophrenic patients and a control group. A total of 52 unaffected biological parents of schizophrenic patients and 52 unaffected parents of unaffected subjects were assessed on measures of attention (Continuous Performance Test - Identical Pairs Version, CPT-IP), memory and verbal learning (California Verbal Learning Test, CVLT), as well as schizotypal personality traits (Oxford-Liverpool Inventory of Feelings and Experiences, O-LIFE). The parents of the patients with schizophrenia differ from the parents of the control group in omission errors on the Continuous Performance Test - Identical Pairs, on a measure of recall, and on two contrast measures of the California Verbal Learning Test. The associations between neuropsychological variables and schizotypal traits are of low magnitude. There is no defined pattern in the relationship between cognitive measures and schizotypal traits.