915 results for Trivariate Normal Distribution
Abstract:
This study presents the findings of an empirical channel characterisation for an ultra-wideband off-body optic-fibre-fed multiple-antenna array within an office and corridor environment. The results show that for received power experiments, the office and corridor were best modelled by lognormal and Rician distributions, respectively [for both line-of-sight (LOS) and non-LOS (NLOS) scenarios]. In the office, LOS measurements for τ and τRMS were both described by the normal distribution for all channels, whereas NLOS measurements for τ and τRMS were Nakagami and Weibull distributed, respectively. For the corridor measurements, LOS for τ and τRMS were either Nakagami or normally distributed for all channels, with NLOS measurements for τ and τRMS being Nakagami and normally distributed, respectively. This work also shows that the achievable diversity gain was influenced by both mutual coupling and cross-correlation coefficients. Although the best diversity gain was 1.8 dB for three-channel selective diversity combining, the authors present recommendations for improving these results. © The Institution of Engineering and Technology 2013.
Abstract:
A randomly distributed multi-particle model considering the effects of the particle/matrix interface and the strengthening mechanisms introduced by the particles has been constructed. Particle shape, distribution, volume fraction and the particle/matrix interface, arising from factors such as element diffusion, were considered in the model. The effects of the strengthening mechanisms caused by the introduction of particles on the mechanical properties of the composites, including grain-refinement strengthening, dislocation strengthening and Orowan strengthening, are incorporated. In the model, the particles are assumed to have a spheroidal shape, with uniform distributions of the centre, long-axis length and inclination angle. The axis ratio follows a right half-normal distribution. Using the Monte Carlo method, the location and shape parameters of the spheroids are randomly selected. The particle volume fraction is calculated from the area ratio of the spheroids. The effects of the particle/matrix interface and strengthening mechanisms on the distribution of Mises stress and equivalent strain, and on the flow behaviour of the composites, are then discussed.
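The random-spheroid construction described above can be sketched as a short Monte Carlo routine. All parameter ranges below (domain size, long-axis range, half-normal scale) are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_spheroids(n, domain=100.0, a_max=5.0, sigma=0.3):
    """Monte Carlo draw of n random spheroid sections inside a square domain."""
    centres = rng.uniform(0.0, domain, size=(n, 2))    # uniform centres
    long_axes = rng.uniform(1.0, a_max, size=n)        # uniform long semi-axis
    angles = rng.uniform(0.0, np.pi, size=n)           # uniform inclination angle
    ratios = np.abs(rng.normal(0.0, sigma, size=n))    # right half-normal axis ratio
    ratios = np.clip(ratios, 0.0, 1.0)                 # axis ratio cannot exceed 1
    short_axes = ratios * long_axes
    # volume fraction estimated from the area ratio of the ellipse sections
    area_fraction = np.pi * np.sum(long_axes * short_axes) / domain**2
    return centres, long_axes, short_axes, angles, area_fraction

centres, a, b, theta, vf = sample_spheroids(200)
```

A finite-element mesh would then be built around these sections; the sketch only covers the geometric sampling step.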
Abstract:
To resolve outstanding questions on heating of coronal loops, we study intensity fluctuations in inter-moss portions of active region core loops as observed with AIA/SDO. The 94Å fluctuations (Figure 1) have structure on timescales shorter than radiative and conductive cooling times. Each of several strong 94Å brightenings is followed after ~8 min by a broader peak in the cooler 335Å emission. This indicates that we see emission from the hot component of the 94Å contribution function. No hotter contributions appear, and we conclude that the 94Å intensity can be used as a proxy for energy injection into the loop plasma. The probability density function of the observed 94Å intensity has 'heavy tails' that approach zero more slowly than the tails of a normal distribution. Hence, large fluctuations dominate the behavior of the system. The resulting 'intermittence' is associated with power-law or exponential scaling of the related variables, and these in turn are associated with turbulent phenomena. The intensity plots in Figure 1 resemble multifractal time series, which are common to various forms of turbulent energy dissipation. In these systems a single fractal dimension is insufficient to describe the dynamics and instead there is a spectrum of fractal dimensions that quantify the self-similar properties. Figure 2 shows the multifractal spectrum from our data to be invariant over timescales from 24 s to 6.4 min. We compare these results to outputs from theoretical energy dissipation models based on MHD turbulence, and in some cases we find substantial agreement, in terms of intermittence, multifractality and scale invariance. Figure 1. Time traces of 94Å intensity in the inter-moss portions of four AR core loops. Figure 2. Multifractal spectra showing timescale invariance. The four cases of Figure 1 are included.
Abstract:
Most cryptographic devices must be resistant to the threat of side-channel attacks, and masking and hiding schemes have been proposed as countermeasures since 1999. The security validation of these countermeasures is an ongoing research topic, as a wider range of new and existing attack techniques is tested against them. This paper examines the side-channel security of the balanced encoding countermeasure, whose aim is to process the secret key-related data under constant Hamming-weight and/or Hamming-distance leakage. Unlike previous works, we assume that the leakage model coefficients conform to a normal distribution, producing a model with closer fidelity to real-world implementations. We perform analysis on the balanced encoded PRINCE block cipher with a simulated leakage model and also on an implementation on an AVR board. We consider both standard correlation power analysis (CPA) and bit-wise CPA. We confirm the resistance of the countermeasure against standard CPA; however, with bit-wise CPA we find that the key can be revealed with only a few thousand traces.
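The leakage assumption above — per-bit coefficients drawn from a normal distribution rather than the ideal unit Hamming-weight coefficients — can be sketched as follows. The coefficient spread and noise level are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Each bit of an 8-bit intermediate contributes a coefficient alpha_i drawn
# from a normal distribution, instead of the ideal weight 1 of the pure
# Hamming-weight model; sigma = 0.2 is an illustrative choice.
alpha = rng.normal(loc=1.0, scale=0.2, size=8)

def leakage(value, noise_sigma=0.0):
    """Weighted-bit leakage of an 8-bit value plus optional Gaussian noise."""
    bits = np.array([(value >> i) & 1 for i in range(8)], dtype=float)
    return float(alpha @ bits) + rng.normal(0.0, noise_sigma)

# Under the ideal model a balanced encoding leaks a constant Hamming weight;
# with unequal alpha_i the leakage still varies with WHICH bits are set,
# which is the property a bit-wise CPA can exploit.
```

For example, `leakage(0b00000001)` and `leakage(0b10000000)` have the same Hamming weight but differ under this model, because they pick out different coefficients.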
Abstract:
Beyond the classical statistical approaches (determination of basic statistics, regression analysis, ANOVA, etc.), a new set of applications of different statistical techniques has increasingly gained relevance in the analysis, processing and interpretation of data concerning the characteristics of forest soils. This can be seen in some recent publications in the context of multivariate statistics. These new methods require additional care that is not always taken, or even mentioned, in some approaches. In the particular case of geostatistical applications it is necessary not only to geo-reference all data acquisition, but also to collect the samples on regular grids and in sufficient quantity, so that the variograms can reflect the spatial distribution of soil properties in a representative manner. As for the great majority of multivariate statistical techniques (principal component analysis, correspondence analysis, cluster analysis, etc.), although in most cases they do not require the assumption of a normal distribution, they nevertheless need a proper and rigorous strategy for their use. In this work, some reflections on these methodologies will be presented, in particular on the main constraints that often occur during the data collection process and on the various ways in which these different techniques can be linked. Finally, illustrations of some particular applications of these statistical methods will also be presented.
Abstract:
The exposure to dust and polynuclear aromatic hydrocarbons (PAH) of 15 truck drivers from Geneva, Switzerland, was measured. The drivers were divided between "long-distance" drivers and "local" drivers and between smokers and nonsmokers, and were compared with a control group of 6 office workers who were also divided into smokers and nonsmokers. Dust was measured on 1 workday both by a direct-reading instrument and by sampling. The local drivers showed higher exposure to dust (0.3 mg/m3) and PAH than the long-distance drivers (0.1 mg/m3), who showed no difference with the control group. This observation may be due to the fact that the local drivers spend more time in more polluted areas, such as streets with heavy traffic and construction sites, than do the long-distance drivers. Smoking does not influence exposure to dust and PAH of professional truck drivers, as measured in this study, probably because the ventilation rate of the truck cabins is relatively high even during cold days (11-15 r/h). The distribution of dust concentrations was shown in some cases to be quite different from the expected log-normal distribution. The contribution of diesel exhaust to these exposures could not be estimated since no specific tracer was used. However, the relatively low level of dust exposure does not support the hypothesis that present-day levels of diesel exhaust particulates play a significant role in the excess occurrence of lung cancer observed in professional truck drivers.
Abstract:
This study used Q methodology to measure the extent to which individuals with five educational roles (student teacher, elementary music teacher, principal, high school music teacher, and music consultant) held five proposed philosophies of music education (hedonic, utilitarian, aesthetic cognitivism, aesthetic formalist, and praxial). Twenty-seven subjects participated in the Q study. These subjects were a convenience sample based on their educational role, accessibility, and willingness to participate. Participants completed a background sheet which indicated their background in music and their responsibility for teaching music. The subjects in this Q study rank-ordered a set of 60 Q-sort items (each item representing a proposed philosophical position) twice: Sort P to reflect current practice, and Sort I to reflect the ideal situation. The results of the sorting procedures were recorded by the participant on the response page, which organized the rankings according to an approximated normal distribution as required by Q methodology. The analysis of the data suggested that the comparison across philosophical positions was significant and that the results of the interaction between philosophical position and educational role were significant, although educational role alone was not significant. Post-hoc analysis of the data was used to determine the significant differences between the levels of the independent variables used in the model: philosophical position, educational role, and music background. A model of the association of the five philosophical positions was presented and discussed in relation to the Q study results. Further research could refine the Q-sort items to better reflect each philosophical position.
Abstract:
This paper studies seemingly unrelated linear models with integrated regressors and stationary errors. By adding leads and lags of the first differences of the regressors and estimating this augmented dynamic regression model by feasible generalized least squares using the long-run covariance matrix, we obtain an efficient estimator of the cointegrating vector that has a limiting mixed normal distribution. Simulation results suggest that this new estimator compares favorably with others already proposed in the literature. We apply these new estimators to the testing of purchasing power parity (PPP) among the G-7 countries. The test based on the efficient estimates rejects the PPP hypothesis for most countries.
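A minimal single-equation sketch of the leads-and-lags idea — augmenting a cointegrating regression with leads and lags of the first differences of the integrated regressor — might look as follows. This is a simplified dynamic-OLS illustration on simulated data, not the paper's full seemingly-unrelated-regressions FGLS estimator, and all sample sizes and noise levels are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

T, k = 500, 2                                  # sample size; number of leads/lags
x = np.cumsum(rng.normal(size=T))              # integrated (random-walk) regressor
y = 2.0 * x + rng.normal(scale=0.1, size=T)    # true cointegrating coefficient = 2

dx = np.diff(x, prepend=x[0])                  # first differences of the regressor
rows = np.arange(k, T - k)                     # trim k observations at each end
# regressor matrix: level of x plus k leads and k lags of its first difference
X = np.column_stack([x[rows]] + [dx[rows + j] for j in range(-k, k + 1)])
beta = np.linalg.lstsq(X, y[rows], rcond=None)[0]
# beta[0] estimates the cointegrating coefficient and should be close to 2
```

In the paper, the augmented system is instead estimated jointly across equations by feasible GLS using the long-run covariance matrix, which is what yields the limiting mixed normal distribution.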
Abstract:
In this dissertation we first present a generalization of the usual Sturm-Liouville problems with symmetric solutions and describe a more comprehensive class. We then introduce several new classes of orthogonal polynomials and special functions that can be derived from this symmetric generalization. As a particular consequence of this generalization, we introduce a polynomial system with four free parameters and show that almost all classical symmetric orthogonal polynomials are contained in this system: the Legendre polynomials, the Chebyshev polynomials of the first and second kind, the Gegenbauer polynomials, the generalized Gegenbauer polynomials, the Hermite polynomials, the generalized Hermite polynomials, and two further new finite systems of orthogonal polynomials. All of these polynomials can be expressed directly through the newly introduced system. Furthermore, we determine all standard properties of the new system, in particular an explicit representation, a second-order differential equation, a generic orthogonality relation and a generic three-term recurrence. In addition, we use this extension to generalize the associated Legendre functions, which have many applications in physics and engineering, and we show that this generalization preserves the orthogonality property and interval. In a further chapter of the dissertation we study in detail the standard properties of finite orthogonal polynomial systems arising from the usual Sturm-Liouville theory, and we show that they are orthogonal with respect to Fisher's F-distribution, the inverse gamma distribution and the generalized t-distribution. In the next part of the dissertation we consider a four-parameter generalization of Student's t-distribution.
We show that this distribution converges to the normal distribution as the sample size tends to infinity. A similar generalization of Fisher's F-distribution converges to the chi-squared distribution. Furthermore, in the last part of the dissertation we introduce several new sequences of special functions that have applications in the solution, in spherical coordinates, of the classical potential equation, the heat equation and the wave equation. Finally, we describe two new classes of rational orthogonal hypergeometric functions, and, using the Fourier transform and Parseval's identity, we show that they form finite orthogonal systems with gamma-type weight functions.
Abstract:
Aitchison and Bacon-Shone (1999) considered convex linear combinations of compositions. In other words, they investigated compositions of compositions, where the mixing composition follows a logistic Normal distribution (or a perturbation process) and the compositions being mixed also follow a logistic Normal distribution. In this paper, I investigate the extension to situations where the mixing composition varies with a number of dimensions. Examples would be where the mixing proportions vary with time or distance, or a combination of the two. Practical situations include a river where the mixing proportions vary along the river, or across a lake, possibly with a time trend. This is illustrated with a dataset similar to that used in the Aitchison and Bacon-Shone paper, which looked at how pollution in a loch depended on the pollution in the three rivers that feed the loch. Here, I explicitly model the variation in the linear combination across the loch, assuming that the mean of the logistic Normal distribution depends on the river flows and the relative distance from the source origins.
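A logistic Normal composition can be simulated by applying the inverse additive-logratio (softmax-like) map to a multivariate normal draw. The mean and covariance below are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)

def rlogistic_normal(n, mu, cov):
    """Draw n D-part compositions whose additive log-ratios are N(mu, cov)."""
    z = rng.multivariate_normal(mu, cov, size=n)     # (n, D-1) normal vectors
    e = np.exp(np.column_stack([z, np.zeros(n)]))    # append the reference part
    return e / e.sum(axis=1, keepdims=True)          # normalise onto the simplex

# 3-part compositions (e.g. contributions of three rivers), illustrative parameters
comp = rlogistic_normal(1000, mu=[0.5, -0.2], cov=0.1 * np.eye(2))
```

In the spatial extension described above, `mu` would itself be modelled as a function of covariates such as river flows and relative distance from the source origins.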
Abstract:
In a seminal paper, Aitchison and Lauder (1985) introduced classical kernel density estimation techniques in the context of compositional data analysis. Indeed, they gave two options for the choice of the kernel to be used in the kernel estimator. One of these kernels is based on the use of the alr transformation on the simplex SD jointly with the normal distribution on RD-1. However, these authors themselves recognized that this method has some deficiencies. A method for overcoming these difficulties, based on recent developments in compositional data analysis and multivariate kernel estimation theory, combining the ilr transformation with the use of the normal density with a full bandwidth matrix, was recently proposed in Martín-Fernández, Chacón and Mateu-Figueras (2006). Here we present an extensive simulation study that compares both methods in practice, thus exploring the finite-sample behaviour of both estimators.
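A minimal sketch of the ilr-based approach: map 3-part compositions to R2 with an isometric logratio (ilr) transform and apply an ordinary Gaussian kernel estimator there. The contrast matrix, the scalar bandwidth (the cited method uses a full bandwidth matrix) and the Dirichlet test data are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)

# An orthonormal, zero-sum contrast matrix defining one possible ilr basis
V = np.array([[1 / np.sqrt(2), -1 / np.sqrt(2), 0.0],
              [1 / np.sqrt(6), 1 / np.sqrt(6), -2 / np.sqrt(6)]])

def ilr(x):
    """Isometric logratio transform of composition(s) x onto R^2."""
    return np.log(x) @ V.T

def kde(points, query, h=0.3):
    """Gaussian kernel density estimate at `query` from 2-D `points`."""
    d2 = np.sum((query[None, :] - points) ** 2, axis=1)
    return np.mean(np.exp(-d2 / (2 * h**2))) / (2 * np.pi * h**2)

data = rng.dirichlet([5, 3, 2], size=200)            # synthetic 3-part compositions
z = ilr(data)                                        # coordinates in R^2
density = kde(z, ilr(np.array([0.5, 0.3, 0.2])))     # estimate at one composition
```

The alr-based estimator of Aitchison and Lauder would differ only in the transform used; the simulation study compares the two in finite samples.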
Abstract:
The preceding two editions of CoDaWork included talks on the possible consideration of densities as infinite compositions: Egozcue and Díaz-Barrero (2003) extended the Euclidean structure of the simplex to a Hilbert space structure of the set of densities within a bounded interval, and van den Boogaart (2005) generalized this to the set of densities bounded by an arbitrary reference density. From the many variations of the Hilbert structures available, we work with three cases. For bounded variables, a basis derived from Legendre polynomials is used. For variables with a lower bound, we standardize them with respect to an exponential distribution and express their densities as coordinates in a basis derived from Laguerre polynomials. Finally, for unbounded variables, a normal distribution is used as reference, and coordinates are obtained with respect to a basis derived from Hermite polynomials. To get the coordinates, several approaches can be considered. A numerical accuracy problem occurs if one estimates the coordinates directly by using discretized scalar products. Thus we propose to use a weighted linear regression approach, where all k-order polynomials are used as predictor variables and weights are proportional to the reference density. Finally, for the case of second-order Hermite polynomials (normal reference) and first-order Laguerre polynomials (exponential reference), one can also derive the coordinates from their relationships to the classical mean and variance. Apart from these theoretical issues, this contribution focuses on the application of this theory to two main problems in sedimentary geology: the comparison of several grain-size distributions, and the comparison, among different rocks, of the empirical distribution of a property measured on a batch of individual grains from the same rock or sediment, such as their composition.
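The weighted-regression step described above can be sketched for the normal-reference (Hermite) case: express the log-ratio of a target density against the standard normal reference in a basis of probabilists' Hermite polynomials, estimating the coordinates by least squares weighted by the reference density. The target density (a shifted normal) and the truncation order are illustrative choices:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermeval

x = np.linspace(-4.0, 4.0, 400)                       # discretization grid
ref = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)          # standard normal reference
target = np.exp(-(x - 0.3) ** 2 / 2) / np.sqrt(2 * np.pi)  # shifted normal target

order = 3
# Design matrix: Hermite polynomials He_0..He_order evaluated on the grid
H = np.column_stack([hermeval(x, np.eye(order + 1)[k]) for k in range(order + 1)])
w = np.sqrt(ref)                                      # weights ∝ reference density
f = np.log(target) - np.log(ref)                      # log-ratio to be expanded
coef, *_ = np.linalg.lstsq(H * w[:, None], f * w, rcond=None)
# Here log(target/ref) = 0.3*x - 0.045 exactly, i.e. -0.045*He_0 + 0.3*He_1,
# so the He_1 coordinate recovers the mean shift and higher orders vanish.
```

This illustrates the remark above: for a shifted-normal target the low-order Hermite coordinates reduce to the classical mean (and variance) information, so they can be read off directly instead of regressed.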
Abstract:
Exam questions and solutions in PDF
Abstract:
Lecture notes in LaTeX
Abstract:
Exam questions and solutions in LaTeX. Diagrams for the questions are all together in the support.zip file, as .eps files.