948 results for Quasi-Arithmetic Mean
Abstract:
The marine atmospheric boundary layer (MABL) plays a vital role in the transport of momentum and heat from the surface of the ocean into the atmosphere. A detailed study of the MABL characteristics was carried out using high-resolution surface-wind data measured by the QuikSCAT (Quick Scatterometer) satellite. Spatial variations in the surface wind, frictional velocity, roughness parameter and drag coefficient for the different seasons were studied. The surface wind was strong during the southwest monsoon season due to the modulation induced by the Low Level Jetstream. The drag coefficient was larger during this season, due to the strong winds, and was lower during the winter months. The spatial variations in the frictional velocity over the seas were small during the post-monsoon season (~0.2 m s⁻¹). The maximum spatial variation in the frictional velocity was found over the south Arabian Sea (0.3 to 0.5 m s⁻¹) during the southwest monsoon period, followed by the pre-monsoon over the Bay of Bengal (0.1 to 0.25 m s⁻¹). The mean wind-stress curl during the winter was positive over the equatorial region, with a maximum value of 1.5×10⁻⁷ N m⁻³, but on either side of the equatorial belt a negative wind-stress curl dominated. The area averages of the frictional velocity and drag coefficient over the Arabian Sea and the Bay of Bengal were also studied. The values of the frictional velocity show a variability similar to the intraseasonal oscillation (ISO), and this was confirmed via wavelet analysis. In the case of the drag coefficient, the prominent oscillations were the ISO and the quasi-biweekly mode (QBM). The interrelationship of the drag coefficient and the frictional velocity with wind speed in both the Arabian Sea and the Bay of Bengal was also studied.
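The bulk aerodynamic relations behind these quantities are standard: Cd = (u*/U₁₀)², with the roughness length obtained from the logarithmic wind profile. As a minimal sketch of how u*, z0 and Cd can be derived from 10 m scatterometer winds — the neutral-stability assumption and the Charnock closure are assumptions made here for illustration, not taken from the abstract:

```python
import numpy as np

KAPPA = 0.4     # von Karman constant
G = 9.81        # gravitational acceleration, m s^-2
ALPHA = 0.011   # Charnock parameter, typical open-ocean value (assumed)
Z_REF = 10.0    # reference height of scatterometer winds, m

def bulk_parameters(u10, n_iter=20):
    """Friction velocity u*, roughness length z0 and neutral drag
    coefficient Cd from 10 m wind speed (u10 > 0), iterating the
    log law U = (u*/kappa) ln(z/z0) with Charnock roughness z0 = a u*^2/g."""
    u10 = np.asarray(u10, dtype=float)
    ustar = 0.035 * u10                    # first guess
    for _ in range(n_iter):
        z0 = ALPHA * ustar**2 / G          # Charnock relation
        ustar = KAPPA * u10 / np.log(Z_REF / z0)
    cd = (ustar / u10)**2                  # Cd = (u*/U10)^2
    return ustar, z0, cd

# e.g. bulk_parameters(10.0) -> u* ~ 0.36 m/s, Cd ~ 1.3e-3
```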
Abstract:
In this paper, we study the relationship between the failure rate and the mean residual life of doubly truncated random variables. Accordingly, we develop characterizations for the exponential, Pareto II and beta distributions. Further, we generalize the identities for the Pearson and the exponential family of distributions given respectively in Nair and Sankaran (1991) and Consul (1995). Applications of these measures in the context of length-biased models are also explored.
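For context, the classical untruncated identity that results of this kind generalize (standard reliability theory, not the paper's doubly truncated formulas):

```latex
% Survival function \bar F, density f, lifetime X untruncated:
\[
  m(x) = \mathbb{E}\left[X - x \mid X > x\right]
       = \frac{1}{\bar F(x)} \int_x^{\infty} \bar F(t)\,\mathrm{d}t,
  \qquad
  h(x) = \frac{f(x)}{\bar F(x)},
\]
% differentiating m gives m'(x) = h(x)\,m(x) - 1, hence
\[
  h(x) = \frac{1 + m'(x)}{m(x)}.
\]
```

A constant mean residual life m(x) ≡ 1/λ then forces h(x) ≡ λ, i.e. the exponential law — the prototype of the characterizations the abstract develops for the doubly truncated setting.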
Abstract:
Globalization and liberalization, with the entry of many prominent foreign manufacturers, have changed the automobile scenario in India since the early 1990s. World leaders in automobile manufacturing such as Ford, General Motors, Honda, Toyota, Suzuki, Hyundai, Renault, Mitsubishi, Benz, BMW, Volkswagen and Nissan set up manufacturing units in India in joint ventures with their Indian counterpart companies, making use of the Foreign Direct Investment policy of the Government of India. These manufacturers started capturing the hearts of Indian car customers with their choice of technological and innovative product features, quality and reliability. The multiplicity of choices available to Indian passenger car buyers drastically changed the car purchase scenario in India, and particularly in the State of Kerala, transforming the automobile scene from a sellers' market to a buyers' market. Car customers started developing their own personal preferences and purchasing patterns, which were hitherto unknown in the Indian automobile segment. The main purpose of this paper is to identify possible parameters that influence the consumer purchase behaviour of passenger car owners in the State of Kerala and to develop a framework, so that further research can build on the framework and the identified parameters.
Abstract:
Adaptive filtering is a primary method for denoising the electrocardiogram (ECG) because it does not require knowledge of the signal's statistical characteristics. In this paper, an adaptive filtering technique for denoising the ECG based on a Genetic Algorithm (GA) tuned Sign-Data Least Mean Square (SD-LMS) algorithm is proposed. This technique minimizes the mean-squared error between the primary input, which is a noisy ECG, and a reference input, which can be either noise that is correlated in some way with the noise in the primary input or a signal that is correlated only with the ECG in the primary input. Noise is used as the reference signal in this work. The algorithm was applied to records from the MIT-BIH Arrhythmia database for removing baseline wander and 60 Hz power line interference. The proposed algorithm gave an average signal-to-noise ratio improvement of 10.75 dB for baseline wander and 24.26 dB for power line interference, which is better than previously reported work.
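A minimal sketch of the SD-LMS noise canceller the abstract describes (in the paper a genetic algorithm tunes the parameters; here the step size mu and the filter order are fixed illustrative values):

```python
import numpy as np

def sd_lms_denoise(primary, reference, order=8, mu=1e-3):
    """Adaptive noise cancellation with the sign-data LMS update.
    primary   -- noisy ECG (clean ECG + noise)
    reference -- input correlated with the noise in `primary`
    Returns e(n) = primary - filtered reference, the denoised ECG."""
    w = np.zeros(order)              # adaptive filter weights
    x = np.zeros(order)              # tap-delay line of the reference
    e = np.zeros(len(primary))
    for n in range(len(primary)):
        x = np.roll(x, 1)
        x[0] = reference[n]
        y = w @ x                    # current noise estimate
        e[n] = primary[n] - y        # error = cleaned sample
        w += mu * e[n] * np.sign(x)  # sign-data LMS: sign of the data vector
    return e
```

The sign-data variant replaces the data vector in the plain LMS update by its sign, which reduces per-sample multiplications at the cost of slower, step-size-sensitive convergence — which is presumably why the step size is worth tuning with a GA.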
Abstract:
Theory Division, Department of Physics
Abstract:
The starting point of this dissertation is a method developed by V. Maz'ya for approximating a given function f : ℝⁿ → ℝ by a linear combination f_h of radial, smooth, exponentially decaying basis functions which, in contrast to splines, form only an approximate partition of unity and therefore define a scheme that does not converge as h → 0. This method became known under the name Approximate Approximations. It turns out, however, that this lack of convergence is irrelevant in practice, since the error between f and the approximation f_h can be pushed below the machine precision of today's computers by a suitable choice of parameters. Moreover, the method has great advantages in the numerical solution of Cauchy problems of the form Lu = f with a suitable linear partial differential operator L in ℝⁿ. If the right-hand side f is approximated by f_h, explicit formulas for the corresponding approximate volume potentials u_h can be given in many cases, involving only a one-dimensional integration (e.g. the error function). For the numerical solution of boundary value problems, Maz'ya's method had not yet been used, apart from heuristic and experimental investigations of the so-called boundary point method. This is where the dissertation comes in. On the basis of radial basis functions, a new approximation method is developed which carries the advantages of the method Maz'ya devised for Cauchy problems over to the numerical solution of boundary value problems. As representative cases, the interior Dirichlet problem for the Laplace equation and for the Stokes equations in ℝ² is treated, with convergence investigations carried out and error estimates given for each of the individual approximation steps.
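For orientation, a minimal 1-D sketch of the Gaussian quasi-interpolant underlying approximate approximations (the standard Maz'ya-type formula; parameter names and default values are illustrative):

```python
import numpy as np

def quasi_interpolant(f, x, h=0.05, D=4.0, margin=6.0):
    """1-D Gaussian quasi-interpolant in the spirit of approximate
    approximations:
        M_h f(x) = (pi*D)^(-1/2) * sum_m f(m*h) exp(-(x - m*h)^2 / (D*h^2)).
    The Gaussians form only an approximate partition of unity, so the
    scheme does not converge as h -> 0; the saturation error decays like
    exp(-pi^2 * D) and can be pushed below machine precision via D."""
    x = np.atleast_1d(np.asarray(x, dtype=float))
    pad = margin * np.sqrt(D) * h                 # truncate far-away nodes
    m = np.arange(np.floor((x.min() - pad) / h),
                  np.ceil((x.max() + pad) / h) + 1)
    nodes = m * h
    vals = f(nodes) / np.sqrt(np.pi * D)          # f must be vectorized
    return np.array([np.sum(vals * np.exp(-(xi - nodes)**2 / (D * h**2)))
                     for xi in x])

# e.g. quasi_interpolant(np.sin, np.linspace(0.0, 1.0, 5))
```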
Abstract:
We discuss the possibility of identifying superheavy elements from the observation of their M-shell x-ray spectra, which might occur during the collision of a superheavy element with a heavy target. The same question is discussed for the possible observation of the x-rays from the quasimolecule (quasi-superheavy element) which is formed during such a heavy-ion collision. It is shown that it is very difficult, if not impossible, to determine any information about the interesting quantum electrodynamical effects from the M-shell x-ray spectra of these quasimolecules.
Abstract:
Due to the tremendous spin-orbit splitting of quasi-molecular levels in superheavy collision systems (Z = Z₁ + Z₂ ≳ 137) at bombarding energies of 0.5–6 MeV per nucleon, unusual couplings may occur around Z ≈ 165. Experimental evidence for such a theoretically predicted coupling is discussed.
Abstract:
The potential energy curve of the Ne-Ne system is calculated for small internuclear distances from 0.005 to 3.0 au using a newly developed relativistic molecular Dirac-Fock-Slater code. A significant structure in the potential energy curve is found which leads to nearly complete agreement with experimental differential elastic scattering cross sections. This demonstrates the presence of quasi-molecular effects in elastic ion-atom collisions at keV energies.
Abstract:
Kriging is an interpolation technique whose optimality criteria are based on normality assumptions either for observed or for transformed data. This is the case for normal, lognormal and multigaussian kriging. When kriging is applied to transformed scores, optimality of the obtained estimators becomes a cumbersome concept: back-transformed optimal interpolations in transformed scores are not optimal in the original sample space, and vice versa. This lack of compatible optimality criteria induces a variety of problems in both point and block estimates. For instance, lognormal kriging, widely used to interpolate positive variables, has no straightforward way to build consistent and optimal confidence intervals for estimates. These problems are ultimately linked to the assumed space structure of the data support: for instance, positive values, when modelled with lognormal distributions, are assumed to be embedded in the whole real space, with the usual real space structure and Lebesgue measure.
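To make the incompatibility concrete, consider standard simple lognormal kriging facts (not specific to this paper): with Z = exp(Y), Y Gaussian, Y* the simple-kriging estimate at x₀ and σ²_K its kriging variance,

```latex
\[
  \mathbb{E}\left[e^{Y}\right] = \exp\left(\mu + \tfrac{1}{2}\sigma^{2}\right)
  \quad\Longrightarrow\quad
  Z^{*}(x_0) = \exp\left(Y^{*}(x_0) + \tfrac{1}{2}\sigma_{K}^{2}\right).
\]
```

The naive back-transform exp(Y*) is optimal in log space but biased for Z; the corrected estimator Z* is unbiased for Z but is no longer the back-transform of the log-space optimum; and exp(Y* ± 1.96 σ_K) gives a valid 95% interval for Z that is centred on neither estimate — exactly the incompatibility of optimality criteria described above.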
Abstract:
There is hardly a case in exploration geology where the studied data do not include below-detection-limit and/or zero values, and since most geological data follow lognormal distributions, these "zero data" represent a mathematical challenge for interpretation. We need to start by recognizing that there are zero values in geology. For example, the amount of quartz in a foyaite (nepheline syenite) is zero, since quartz cannot co-exist with nepheline. Another common essential zero is a North azimuth; however, we can always change that zero to the value of 360°. These are known as "essential zeros", but what can we do with "rounded zeros" that result from values below the detection limit of the equipment? Amalgamation, e.g. adding Na2O and K2O as total alkalis, is a solution, but sometimes we need to differentiate between a sodic and a potassic alteration. Pre-classification into groups requires a good knowledge of the distribution of the data and of the geochemical characteristics of the groups, which is not always available. Setting the zero values equal to the detection limit of the equipment used will generate spurious distributions, especially in ternary diagrams. The same will occur if we replace the zero values by a small amount using non-parametric or parametric techniques (imputation). The method that we propose takes into consideration the well-known relationships between some elements. For example, in copper porphyry deposits there is always a good direct correlation between the copper values and the molybdenum values, but while copper will always be above the limit of detection, many of the molybdenum values will be "rounded zeros". So we take the lower quartile of the real molybdenum values, establish a regression equation with copper, and then estimate the "rounded" zero values of molybdenum from their corresponding copper values. The method can be applied to any type of data, provided we first establish their correlation dependency. One of the main advantages of this method is that we do not obtain a fixed value for the "rounded zeros", but one that depends on the value of the other variable. Key words: compositional data analysis, treatment of zeros, essential zeros, rounded zeros, correlation dependency
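A minimal sketch of the regression-based imputation described above, under the assumption of a log-log linear Cu-Mo relation (lognormal grades); variable names and the censoring convention are illustrative:

```python
import numpy as np

def impute_rounded_zeros(cu, mo, detection_limit):
    """Predict below-detection ("rounded zero") Mo values from Cu.
    Fits a log-log regression on the lower quartile of the observed Mo
    values, then estimates each censored Mo from its own Cu value, so
    the imputed values are not a single constant."""
    cu = np.asarray(cu, dtype=float)
    mo = np.asarray(mo, dtype=float)
    observed = mo >= detection_limit
    q1 = np.quantile(mo[observed], 0.25)          # lower quartile of real Mo
    fit = observed & (mo <= q1)
    slope, intercept = np.polyfit(np.log(cu[fit]), np.log(mo[fit]), 1)
    out = mo.copy()
    out[~observed] = np.exp(intercept + slope * np.log(cu[~observed]))
    return out
```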
Abstract:
In this research we studied the relationship between two subsystems of working memory (the phonological loop and the visuo-spatial sketchpad) and calculation performance in a sample of 94 Spanish children aged 7-8 years. We administered two calculation tests designed for this study and six simple working-memory measures (of verbal, numerical and spatial content) from the "Batería de Tests de Memòria de Treball" of Pickering, Baqués and Gathercole (1999), plus two complementary visual tests. The results show a substantial correlation between the verbal- and numerical-content measures and calculation performance. In contrast, we found no relationship with the spatial measures. We therefore conclude that in Spanish schoolchildren there is a substantial relationship between the phonological loop and performance in calculation tasks, whereas the role of the visuo-spatial sketchpad is null.
Abstract:
Since Hitch (1978) published the first study on the role of working memory in calculation, research in this field has kept growing. Many studies have examined a single subsystem, but our aim is to identify which subsystem of working memory (phonological loop, visuo-spatial sketchpad or central executive) is most involved in mental calculation. To this end we carried out a correlational study in which we administered two arithmetic tests and nine tests from the "Bateria de Test de Memòria de Treball" of Pickering, Baqués and Gathercole (1999) to a sample of 94 Spanish children aged 7-8 years. Our results indicate that the phonological loop and, above all, the central executive have a statistically significant effect on arithmetic performance.