978 results for Empirical orthogonal functions
Abstract:
One of the most popular explanations for post-9/11 anti-Americanism argues that resentment against America and Americans is mainly a function of the US government’s unpopular actions. The present article challenges this interpretation: first, it argues that neither the vitality of the resentment in times when the United States had no influence in the respective parts of the world nor its recent radical manifestations are accounted for in a political reductionist framework. In fact, specific traditions of anti-Americanism have an influence on the negative attitudes observed today, as a comparison between Britain, France, Germany, and Poland reveals. Second, this article suggests an alternative theoretical approach. Anti-Americanism can be explained by two basic mechanisms: it functions as a strategy to project denied and disliked self-concepts onto an external object, and it offers an interpretive frame for complex social processes that allows cognitive dissonance to be reduced. Multivariate analyses based on empirical data collected in the Pew surveys of 2002 and 2007 show the fruitfulness of our theoretical approach.
Abstract:
Traumatic brain injuries (TBIs) occur frequently in childhood and entail broad cognitive deficits, particularly in the domain of executive functions (EF). Concerning mild TBI (mTBI), little empirical evidence is available on acute and postacute performance in EF. Given that EF are linked to school adaptation and achievement, even subtle deficits in performance may affect children's academic careers. The present study assessed performance in the EF components of inhibition, working memory (WM), and switching in children after mTBI. Regarding both acute and postacute consequences, performance trajectories were measured in 13 patients aged between 5 and 10 years and 13 controls who were closely matched in terms of sex, age, and education. Performance in the EF components of inhibition, switching, and WM was assessed in a short-term longitudinal design at 2, 6, and 12 weeks after the mTBI. Results indicate subtle deficits after mTBI, which became apparent in the longitudinal trajectory in the EF components of switching and WM. Compared with controls, children who sustained mTBI displayed weaker performance enhancement across testing sessions in the first 6 weeks after the injury in switching and WM, resulting in a delayed deficit in the EF component of WM 12 weeks after the injury. Results are interpreted as mTBI-related deficits that become evident in terms of an inability to profit from previous learning opportunities, a finding that is potentially important for children's mastery of their daily lives.
Abstract:
This study investigated the empirical differentiation of prospective memory, executive functions, and metacognition and their structural relationships in 119 elementary school children (M = 95 months, SD = 4.8 months). These cognitive abilities share many characteristics on the theoretical level and are all highly relevant in many everyday contexts when intentions must be executed. Nevertheless, their empirical relationships have not been examined on the latent level, although an empirical approach would contribute to our knowledge concerning the differentiation of cognitive abilities during childhood. We administered a computerized event-based prospective memory task, three executive function tasks (updating, inhibition, shifting), and a metacognitive control task in the context of spelling. Confirmatory factor analysis revealed that the three cognitive abilities are already empirically differentiable in young elementary school children. At the same time, prospective memory and executive functions were found to be strongly related, and there was also a close link between prospective memory and metacognitive control. Furthermore, executive functions and metacognitive control were marginally significantly related. The findings are discussed within a framework of developmental differentiation and conceptual similarities and differences.
Abstract:
Succeeding in everyday activities often requires executive functioning (EF), metacognitive abilities (MC) and memory skills such as prospective memory (PM) and retrospective memory (RM). These cognitive abilities seem to develop gradually in childhood, possibly influencing each other during development. From a theoretical point of view, it is likely that they are closely interrelated, especially in children. Their empirical relation, however, is less clear. A model that links these cognitive abilities can help to better understand the relation between PM and RM and other cognitive processes. In this project we studied the longitudinal development of PM, RM, EF, and MC in 7- to 8-year-old elementary school children across half a year. 119 second graders (MT1 = 95 months, SDT1 = 4.8 months) completed the same PM, RM, EF, and MC tasks twice with a time lag of 7 months. The developmental progression was analysed using paired t-tests; the longitudinal relationships were analysed using confirmatory factor analysis, with all fit indices evaluated in accordance with Hu and Bentler (1998). In general, performance improved significantly (ps < .001) and effect sizes ranged from .45 to .62 (Cohen’s d). CFA revealed a good model fit, χ2(227, N = 119) = 242.56, p = .23, TLI = .973, CFI = .979, RMSEA = .024. At T1, significant cross-sectional links were found between PM T1 and RM T1, between PM T1 and EF T1, and between EF T1 and MC T1. Moreover, significant longitudinal links were found between EF T1 and PM T2 and between EF T1 and MC T2; EF T1 and RM T2 were marginally linked. Results underline previous findings showing that PM, RM, EF, and MC develop significantly during childhood, even within this short time period. Results also indicate that these cognitive abilities are linked not only cross-sectionally but also longitudinally. Most relevant, however, is the predictive role of EF for both metacognition and memory.
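The effect sizes above are reported as Cohen's d for a repeated-measures design. One common convention for paired data (sometimes called d_z) divides the mean T1-to-T2 change by the standard deviation of the difference scores; the abstract does not state which convention was used, so the sketch below, with entirely synthetic data, is illustrative only:

```python
import numpy as np

def cohens_d_paired(t1, t2):
    """Cohen's d for a paired design: mean of the difference scores
    divided by their standard deviation (the d_z convention)."""
    diff = np.asarray(t2, dtype=float) - np.asarray(t1, dtype=float)
    return diff.mean() / diff.std(ddof=1)

# Synthetic example: 119 children improving between two sessions
# (hypothetical scores, not the study's data).
rng = np.random.default_rng(42)
t1_scores = rng.normal(100, 15, size=119)
t2_scores = t1_scores + rng.normal(4, 8, size=119)
print(f"d = {cohens_d_paired(t1_scores, t2_scores):.2f}")
```

Other conventions (e.g. dividing by the pooled or pretest SD) yield different values for the same data, which is worth checking when comparing effect sizes across studies.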
Abstract:
Prospective memory (PM), executive functions (EF) and metacognition (MC) are relevant cognitive abilities for everyday functioning. They all seem to develop gradually in childhood and appear to be theoretically closely related; however, their empirical links remain unclear, especially in children. As a recent study revealed significant cross-sectional links between PM and EF, and a weaker but close link between PM and MC in 2nd graders (Spiess, Meier, & Roebers, submitted), this study focused on their short-term relationships and on their development. 119 children (MT1 = 95 months, SDT1 = 4.8 months) completed the same tasks (one PM, three EF, one MC task) twice with a time lag of 7 months. Paired t-tests showed significant improvements in all tasks except the updating task. Different structural equation models were contrasted (AMOS); the best-fitting model revealed that PM T2 was similarly predicted by PM T1 (r = .33) and EF T1 (r = .34). Additionally, EF T1 predicted MC T2 (r = .44), χ2(118, N = 119) = 128.91, p = .23, TLI = .968, CFI = .978, RMSEA = .028. Results show that PM, EF, and MC develop during childhood and also demonstrate that they are linked not only cross-sectionally but also longitudinally. Findings are discussed in a broader developmental framework.
Abstract:
Traditionally, researchers have discussed executive function and metacognition independently. However, more recently, theoretical frameworks linking these two groups of higher order cognitive processes have been advanced. In this article, we explore the relationship between executive function and procedural metacognition, and summarize theoretical similarities. From a developmental perspective, the assumed theoretical resemblances seem to be supported, considering developmental trajectories and their substantial impact on areas that include learning and memory. Moreover, empirical evidence suggests direct relationships on the task level, on the level of latent variables, and in terms of involved brain regions. However, research linking the two concepts directly remains rare. We discuss evidence and developmental mechanisms, and propose ways researchers can investigate links between executive function and procedural metacognition.
Abstract:
The purpose of this study was to examine, in the context of an economic model of health production, the relationship between inputs (health-influencing activities) and fitness. Primary data were collected from 204 employees of a large insurance company at the time of their enrollment in an industrially based health promotion program. The inputs of production included medical care use, exercise, smoking, drinking, eating, coronary disease history, and obesity. The variables of age, gender, and education, known to affect the production process, were also examined. Two estimates of fitness were used: self-report and a physiologic estimate based on exercise treadmill performance. Ordinary least squares and two-stage least squares regression analyses were used to estimate the fitness production functions. In the production of self-reported fitness status, the coefficients for the exercise, smoking, eating, and drinking production inputs, and the control variable of gender, were statistically significant and possessed theoretically correct signs. In the production of physiologic fitness, exercise, smoking, and gender were statistically significant. Exercise and gender were theoretically consistent while smoking was not. Results are compared with previous analyses of health production.
Abstract:
With the purpose of assessing the absorption coefficients of quantum dot solar cells, symmetry considerations are introduced into a Hamiltonian whose eigenvalues are empirical. In this way, the proper transformation from the Hamiltonian's diagonalized form to the form that relates it with Γ-point exact solutions through k.p envelope functions is built accounting for symmetry. Forbidden transitions are thus determined reducing the calculation burden and permitting a thoughtful discussion of the possible options for this transformation. The agreement of this model with the measured external quantum efficiency of a prototype solar cell is found to be excellent.
Abstract:
In this work, a unified algorithm-architecture-circuit co-design environment for complex FPGA system development is presented. The main objective is to find an efficient methodology for designing a configurable, optimized FPGA system with as little verification effort as possible, so as to shorten the development cycle. A proposed high-performance FFT/iFFT processor for a Multiband Orthogonal Frequency Division Multiplexing Ultra Wideband (MB-OFDM UWB) system is given as an example to demonstrate the proposed methodology. This design methodology has been tested and is considered suitable for almost all types of complex FPGA system designs and verifications.
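The FFT/iFFT core of an OFDM modem is built from the Cooley-Tukey butterfly structure. As an illustrative software reference model (not the paper's FPGA design, whose architecture is not described in the abstract), a radix-2 decimation-in-time FFT can be sketched as:

```python
import cmath

def fft(x):
    """Recursive radix-2 Cooley-Tukey FFT; the input length must be a
    power of two. Hardware FFT cores pipeline this same butterfly."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])   # DFT of even-indexed samples
    odd = fft(x[1::2])    # DFT of odd-indexed samples
    twiddle = [cmath.exp(-2j * cmath.pi * k / n) for k in range(n // 2)]
    # Combine the half-size DFTs with the twiddle factors.
    return ([even[k] + twiddle[k] * odd[k] for k in range(n // 2)] +
            [even[k] - twiddle[k] * odd[k] for k in range(n // 2)])

# The DFT of a unit impulse is flat across all bins.
print(fft([1, 0, 0, 0]))
```

A production FPGA implementation would unroll this recursion into pipelined butterfly stages with fixed-point twiddle ROMs, but the dataflow is identical.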
Abstract:
A mathematical model of the process employed by a sonic anemometer to build up the measured wind vector in a steady flow is presented to illustrate how the geometry of these sensors, as well as the characteristics of aerodynamic disturbance on the acoustic path, can lead to singularities in the transformation function that relates the measured (disturbed) wind vector to the real (corrected) wind vector, impeding the application of correction/calibration functions for some wind conditions. An implicit function theorem allows for the identification of those combinations of real wind conditions and design parameters that lead to undefined correction/calibration functions. In general, orthogonal-path sensors do not show problematic combinations of parameters. However, some geometric sonic sensor designs available on the market, with paths forming smaller angles, could lead to undefined correction functions for some levels of aerodynamic disturbance and for certain wind directions. The parameters studied strongly influence both the existence and the number of singularities in the correction/calibration function. Some conclusions concerning good design practices are included.
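In one dimension the issue reduces to the implicit function theorem's non-degeneracy condition: if the measured speed is m = f(u), a local correction u = f⁻¹(m) is undefined wherever f′(u) = 0. The toy transfer function below, with a hypothetical periodic disturbance term (not a model of any specific sensor), shows how increasing the disturbance amplitude creates such singular points:

```python
import numpy as np

# Toy transfer function: measured speed m = f(u), where u is the real
# speed and a is the (hypothetical) aerodynamic-disturbance amplitude.
def f(u, a=0.4):
    return u * (1 - a * np.cos(3 * u))

u = np.linspace(0.0, 2 * np.pi, 200_000)
dfdu = np.gradient(f(u), u)

# Sign changes of f'(u) mark points where the implicit function
# theorem fails, so no local correction function u(m) exists there.
singular_u = u[:-1][np.diff(np.sign(dfdu)) != 0]
print(f"{len(singular_u)} singular points found in [0, 2*pi]")
```

With a = 0 (no disturbance) the derivative never vanishes and the correction is globally well defined, mirroring the abstract's observation that the disturbance level and geometry jointly control whether singularities appear.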
Abstract:
Context. The Gaia-ESO Survey (GES) is a large public spectroscopic survey at the European Southern Observatory Very Large Telescope. Aims. A key aim is to provide precise radial velocities (RVs) and projected equatorial velocities (vsini) for representative samples of Galactic stars, which will complement information obtained by the Gaia astrometry satellite. Methods. We present an analysis to empirically quantify the size and distribution of uncertainties in RV and vsini using spectra from repeated exposures of the same stars. Results. We show that the uncertainties vary as simple scaling functions of signal-to-noise ratio (S/N) and vsini, that the uncertainties become larger with increasing photospheric temperature, but that the dependence on stellar gravity, metallicity and age is weak. The underlying uncertainty distributions have extended tails that are better represented by Student’s t-distributions than by normal distributions. Conclusions. Parametrised results are provided, which enable estimates of the RV precision for almost all GES measurements, and estimates of the vsini precision for stars in young clusters, as a function of S/N, vsini and stellar temperature. The precision of individual high S/N GES RV measurements is 0.22–0.26 km s⁻¹, dependent on instrumental configuration.
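The core of such an empirical approach, estimating precision from repeated exposures of the same stars, can be sketched as follows: the true RV cancels in the difference of two repeats, so the scatter of the differences divided by √2 recovers the single-exposure uncertainty. The data here are synthetic, and the paper's scaling with S/N, vsini, and temperature is not modelled:

```python
import numpy as np

# Synthetic repeat RV measurements (km/s): each star's true RV is
# observed twice with independent noise of sigma = 0.25 km/s.
rng = np.random.default_rng(7)
n_stars = 2000
true_rv = rng.uniform(-100, 100, n_stars)
sigma = 0.25
exposure1 = true_rv + rng.normal(0, sigma, n_stars)
exposure2 = true_rv + rng.normal(0, sigma, n_stars)

# The true RV cancels in the difference, so
# std(diff) = sqrt(2) * sigma_single.
diff = exposure1 - exposure2
sigma_single = diff.std(ddof=1) / np.sqrt(2)
print(f"recovered per-exposure precision: {sigma_single:.3f} km/s")
```

In practice one would bin the repeat pairs by S/N and vsini before computing the scatter, which is how scaling functions like those in the paper are fitted.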
Abstract:
Most models in classical statistics rest on an assumption about the distribution of the data, or about a distribution underlying the data. The validity of this assumption makes it possible to draw inferences, construct confidence intervals, or test the reliability of the model. Goodness-of-fit testing aims to assess the conformity or coherence of this assumption with the available data. In this thesis, we propose goodness-of-fit tests for normality in the setting of univariate and vector time series. We restrict attention to a class of linear time series, namely autoregressive moving-average models (ARMA, or VARMA in the vector case). First, in the univariate case, we propose a generalization of the work of Ducharme and Lafaye de Micheaux (2004) to the case where the mean is unknown and estimated. We estimate the parameters by a method rarely used in the literature yet asymptotically efficient. Indeed, we rigorously show that the estimator proposed by Brockwell and Davis (1991, Section 10.8) converges almost surely to the true unknown parameter value. Moreover, we provide a rigorous proof of the invertibility of the variance-covariance matrix of the test statistic, based on certain linear-algebra properties. The result also applies to the case where the mean is assumed known and equal to zero. Finally, we propose an AIC-type method for selecting the dimension of the family of alternatives, and we study the asymptotic properties of this method. The tool proposed here is based on a specific family of orthogonal polynomials, namely the Legendre polynomials. Second, in the vector case, we propose a goodness-of-fit test for autoregressive moving-average models with a structured parametrization.
The structured parametrization makes it possible to reduce the large number of parameters in these models, or to account for particular constraints. This project includes the standard case without parametrization. The proposed test applies to an arbitrary family of orthogonal functions. We illustrate this in the particular cases of the Legendre and Hermite polynomials. In the particular case of the Hermite polynomials, we show that the resulting test is invariant under affine transformations and is in fact a generalization of many existing tests in the literature. This project can be viewed as a generalization of the first in three directions: the move from the univariate to the multivariate setting; the choice of an arbitrary family of orthogonal functions; and the possibility of specifying relations or constraints in the VARMA formulation. In each project, we carried out a simulation study to assess the level and power of the proposed tests and to compare them with existing tests. Applications to real data are also provided. We applied the tests to the forecasting of the global annual mean temperature (univariate) and to Canadian labour-market data (bivariate). This work has been presented at several conferences (see, e.g., Tagne, Duchesne and Lafaye de Micheaux (2013a, 2013b, 2014) for details). An article based on the first project has also been submitted to a peer-reviewed journal (see Duchesne, Lafaye de Micheaux and Tagne (2016)).
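The Legendre polynomials used in the first project are orthogonal on [-1, 1], with ∫ P_m(x) P_n(x) dx equal to 0 for m ≠ n and 2/(2n+1) for m = n. This property, which underlies smooth goodness-of-fit test constructions, can be checked numerically:

```python
import numpy as np
from numpy.polynomial import legendre as L

def legendre_inner(m, n):
    """Inner product of P_m and P_n over [-1, 1], computed by
    multiplying and integrating in the Legendre coefficient basis."""
    cm = np.zeros(m + 1); cm[m] = 1.0
    cn = np.zeros(n + 1); cn[n] = 1.0
    antideriv = L.legint(L.legmul(cm, cn))
    return L.legval(1.0, antideriv) - L.legval(-1.0, antideriv)

print(legendre_inner(2, 3))          # ~0: distinct degrees are orthogonal
print(legendre_inner(3, 3), 2 / 7)   # diagonal term matches 2/(2n+1)
```

Hermite polynomials, used in the second project, satisfy an analogous orthogonality relation but with a Gaussian weight over the whole real line, which is what makes them natural for tests of normality.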
Abstract:
Also published as a thesis (Ph.D.), Columbia University, 1921.
Abstract:
Complex numbers appear in the Hilbert space formulation of quantum mechanics, but not in the formulation in phase space. Quantum symmetries are described by complex, unitary or antiunitary operators defining ray representations in Hilbert space, whereas in phase space they are described by real, true representations. Equivalence of the formulations requires that the former representations can be obtained from the latter and vice versa. Examples are given. Equivalence of the two formulations also requires that complex superpositions of state vectors can be described in the phase space formulation, and it is shown that this leads to a nonlinear superposition principle for orthogonal, pure-state Wigner functions. It is concluded that the use of complex numbers in quantum mechanics can be regarded as a computational device to simplify calculations, as in all other applications of mathematics to physical phenomena.
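For reference, the standard definition of the Wigner function of a pure state ψ and the overlap rule behind the orthogonality statement are (textbook formulas, not derived in the abstract):

```latex
W_\psi(q,p) = \frac{1}{\pi\hbar} \int_{-\infty}^{\infty}
  \psi^*(q+y)\,\psi(q-y)\, e^{2ipy/\hbar}\, \mathrm{d}y ,
\qquad
\left|\langle \psi_1 \mid \psi_2 \rangle\right|^2
  = 2\pi\hbar \iint W_{\psi_1}(q,p)\, W_{\psi_2}(q,p)\, \mathrm{d}q\, \mathrm{d}p .
```

Orthogonal pure states thus have Wigner functions whose product integrates to zero, which forces at least one of them to take negative values somewhere; this is the structural fact that makes the phase-space superposition principle for such states nonlinear.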