59 results for Pilot-scale
Abstract:
A select-divide-and-conquer variational method to approximate configuration interaction (CI) is presented. Given an orthonormal set made up of occupied orbitals (Hartree-Fock or similar) and suitable correlation orbitals (natural or localized orbitals), a large N-electron target space S is split into subspaces S0, S1, S2, ..., SR. S0, of dimension d0, contains all configurations K with attributes (energy contributions, etc.) above thresholds T0 = {T0^egy, T0^etc.}; the CI coefficients in S0 remain always free to vary. S1 accommodates Ks with attributes above T1 ≤ T0. An eigenproblem of dimension d0+d1 for S0+S1 is solved first, after which the last d1 rows and columns are contracted into a single row and column, thus freezing the last d1 CI coefficients hereinafter. The process is repeated with successive Sj (j ≥ 2) chosen so that the corresponding CI matrices fit in random access memory (RAM). Davidson's eigensolver is used R times. The final energy eigenvalue (lowest or excited one) is always above the corresponding exact eigenvalue in S. Threshold values {Tj; j = 0, 1, 2, ..., R} regulate accuracy; for large-dimensional S, high accuracy requires S0+S1 to be solved outside RAM. From there on, however, usually only a few Davidson iterations in RAM are needed for each step, so that Hamiltonian matrix-element evaluation becomes rate determining. One-μhartree accuracy is achieved for an eigenproblem of order 24 × 10⁶, involving 1.2 × 10¹² nonzero matrix elements and 8.4 × 10⁹ Slater determinants.
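The contraction step lends itself to a compact numerical illustration. The sketch below is not the authors' code: it assumes a dense symmetric CI matrix small enough to hold in memory, replaces Davidson's iterative eigensolver with numpy's dense eigh, and follows the lowest root only; the subspace sizes d0, d1, ..., dR are passed as a list.

    import numpy as np

    def select_divide_conquer(H, dims):
        """Toy sketch of the select-divide-and-conquer scheme.
        H    : full symmetric CI matrix in the configuration basis (dense here).
        dims : [d0, d1, ..., dR], sizes of the subspaces S0, S1, ..., SR.
        A dense eigensolver stands in for Davidson's method."""
        n, d0 = H.shape[0], dims[0]
        # T maps the current working basis (free configurations of S0 plus one
        # contracted column per already-frozen subspace) to the configuration basis.
        T = np.zeros((n, d0))
        T[:d0, :d0] = np.eye(d0)
        offset, energy = d0, None
        for dj in dims[1:]:
            # append the dj configurations of the next subspace Sj
            Tj = np.zeros((n, dj))
            Tj[offset:offset + dj, :] = np.eye(dj)
            B = np.hstack([T, Tj])
            Heff = B.T @ H @ B                     # effective CI matrix (fits in RAM)
            evals, evecs = np.linalg.eigh(Heff)    # "Davidson" diagonalization
            energy, vec = evals[0], evecs[:, 0]
            # contract the last dj rows/columns into one frozen column of T
            frozen = Tj @ vec[-dj:]
            frozen /= np.linalg.norm(frozen)
            T = np.hstack([T, frozen[:, None]])
            offset += dj
        return energy                              # upper bound to the exact eigenvalue

    # tiny demonstration with a random symmetric "Hamiltonian"
    rng = np.random.default_rng(0)
    A = rng.standard_normal((12, 12)); H = (A + A.T) / 2
    print(select_divide_conquer(H, [4, 4, 4]), np.linalg.eigh(H)[0][0])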
Abstract:
The project aims at advancing the state of the art in the use of context information for the classification of image and video data. The use of context in image classification has been shown to be of great importance for improving the performance of current object recognition systems. In this project we proposed the concept of Multi-scale Feature Labels as a general and compact method to exploit both local and global context. The extraction of features from the discriminative probability or classification-confidence label field is highly novel. Moreover, the use of a multi-scale representation of the feature labels leads to a compact and efficient description of the context. A further goal of the project has been to provide a general-purpose method and to prove its suitability in different image/video analysis problems. The two-year project generated 5 journal publications (plus 2 under submission), 10 conference publications (plus 2 under submission) and one patent (plus 1 pending). Of these publications, a significant number make use of the main result of this project to improve detection and/or segmentation of objects.
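As a rough illustration of the idea (the function names, pooling scheme and parameters below are hypothetical stand-ins, not the project's actual construction), a per-pixel classification-confidence map can be turned into a compact multi-scale context descriptor like this:

    import numpy as np

    def label_pyramid(conf_map, n_scales=3):
        """Build a multi-scale pyramid from a per-pixel classification-confidence
        map by repeated 2x2 average pooling (illustrative stand-in)."""
        pyramid = [conf_map]
        for _ in range(n_scales - 1):
            p = pyramid[-1]
            h, w = (p.shape[0] // 2) * 2, (p.shape[1] // 2) * 2
            pyramid.append(p[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3)))
        return pyramid

    def context_descriptor(pyramid, y, x):
        """Concatenate the confidence values seen at (y, x) across all scales,
        mixing local context with increasingly global context."""
        feats = []
        for level, p in enumerate(pyramid):
            yy = min(y >> level, p.shape[0] - 1)
            xx = min(x >> level, p.shape[1] - 1)
            feats.append(p[yy, xx])
        return np.array(feats)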
Abstract:
Background: Despite the fact that labour market flexibility has resulted in an expansion of precarious employment in industrialized countries, to date there is limited empirical evidence about its health consequences. The Employment Precariousness Scale (EPRES) is a newly developed, theory-based, multidimensional questionnaire specifically devised for epidemiological studies among waged and salaried workers. Objective: To assess acceptability, reliability and construct validity of EPRES in a sample of waged and salaried workers in Spain. Methods: Cross-sectional study, using a sub-sample of 6,968 temporary and permanent workers from a population-based survey carried out in 2004-2005. The survey questionnaire was interviewer administered and included the six EPRES subscales, measures of the psychosocial work environment (COPSOQ ISTAS21), and perceived general and mental health (SF-36). Results: A high response rate to all EPRES items indicated good acceptability; Cronbach’s alpha coefficients, over 0.70 for all subscales and the global score, demonstrated good internal consistency reliability; exploratory factor analysis using principal axis analysis and varimax rotation confirmed the six-subscale structure and the theoretical allocation of all items. Patterns across known groups and correlation coefficients with psychosocial work environment measures and perceived health demonstrated the expected relations, providing evidence of construct validity. Conclusions: Our results provide evidence in support of the psychometric properties of EPRES, which appears to be a promising tool for the measurement of employment precariousness in public health research.
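For reference, the internal-consistency statistic quoted above (Cronbach's alpha over 0.70 for each subscale and the global score) is computed from the item scores of a subscale; a minimal sketch, not the survey's analysis code:

    import numpy as np

    def cronbach_alpha(items):
        """Cronbach's alpha for an (n_respondents, n_items) array of subscale items;
        alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_vars / total_var)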
Abstract:
The optimization of the pilot overhead in single-user wireless fading channels is investigated, and the dependence of this overhead on various system parameters of interest (e.g., fading rate, signal-to-noise ratio) is quantified. The achievable pilot-based spectral efficiency is expanded with respect to the fading rate about the no-fading point, which leads to an accurate order expansion for the pilot overhead. This expansion identifies that the pilot overhead, as well as the spectral efficiency penalty with respect to a reference system with genie-aided CSI (channel state information) at the receiver, depend on the square root of the normalized Doppler frequency. It is also shown that the widely-used block fading model is a special case of more accurate continuous fading models in terms of the achievable pilot-based spectral efficiency. Furthermore, it is established that the overhead optimization for multiantenna systems is effectively the same as for single-antenna systems with the normalized Doppler frequency multiplied by the number of transmit antennas.
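The key scaling result can be stated very compactly. The snippet below only illustrates that square-root law; the proportionality constant depends on the SNR and is not taken from the paper:

    import numpy as np

    # Illustrative only: per the expansion described above, the optimal pilot
    # overhead (and the CSI penalty) scales as the square root of the normalized
    # Doppler frequency f_D, with f_D multiplied by the number of transmit antennas
    # in the multiantenna case. The constant c_snr is NOT taken from the paper.
    def approx_pilot_overhead(f_D, c_snr=1.0, n_tx=1):
        return np.sqrt(c_snr * n_tx * f_D)

    for f_D in (1e-3, 4e-3, 1.6e-2):
        print(f_D, approx_pilot_overhead(f_D))   # overhead doubles when f_D quadruples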
Abstract:
The optimization of the pilot overhead in wireless fading channels is investigated, and the dependence of this overhead on various system parameters of interest (e.g., fading rate, signal-to-noise ratio) is quantified. The achievable pilot-based spectral efficiency is expanded with respect to the fading rate about the no-fading point, which leads to an accurate order expansion for the pilot overhead. This expansion identifies that the pilot overhead, as well as the spectral efficiency penalty with respect to a reference system with genie-aided CSI (channel state information) at the receiver, depend on the square root of the normalized Doppler frequency. It is also shown that the widely-used block fading model is a special case of more accurate continuous fading models in terms of the achievable pilot-based spectral efficiency. Furthermore, it is established that the overhead optimization for multiantenna systems is effectively the same as for single-antenna systems with the normalized Doppler frequency multiplied by the number of transmit antennas.
Abstract:
This working paper is a preliminary study whose aim is to analyse several phonetic variables acoustically for the forensic purpose of speaker identification: the frequencies of the first two formants of the vowel [ə] when used as a filler, the duration of [m] segments taking the syllabic context into account and, finally, the frequency peaks of voiceless strident fricatives -[s]- using LPC analysis. The results reveal statistically significant differences between the speakers.
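The LPC-based measurement of spectral peaks can be sketched generically as follows; this uses librosa's LPC routine with illustrative settings, not the study's exact procedure:

    import numpy as np
    import librosa

    def lpc_peak_frequencies(y, sr, order=12):
        """Estimate spectral peak (formant-like) frequencies from the roots of an
        LPC polynomial fitted to a voiced or fricative segment."""
        a = librosa.lpc(y, order=order)              # LPC coefficients [1, a1, ..., ap]
        roots = np.roots(a)
        roots = roots[np.imag(roots) > 0]            # keep one root of each conjugate pair
        freqs = np.angle(roots) * sr / (2 * np.pi)   # pole angles -> frequencies in Hz
        return np.sort(freqs)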
Abstract:
This study develops a proposed translation for the dubbing of the pilot episode of The Big Bang Theory, which combines colloquial and scientific language. The aim is twofold: to produce colloquial language that is believable yet genuine, and to use appropriate Catalan equivalents for the original scientific terms.
Abstract:
This paper studies two important reasons why people violate procedure invariance: loss aversion and scale compatibility. The paper extends previous research on loss aversion and scale compatibility by studying them simultaneously, by looking at a new decision domain, medical decision analysis, and by examining their effect on "well-contemplated preferences." We find significant evidence of both loss aversion and scale compatibility. However, the sizes of the biases due to loss aversion and scale compatibility vary over trade-offs, and most participants do not behave consistently according to loss aversion or scale compatibility. In particular, the effect of loss aversion in medical trade-offs decreases with duration. These findings are encouraging for utility measurement and prescriptive decision analysis. There appear to exist decision contexts in which the effects of loss aversion and scale compatibility can be minimized and utilities can be measured that do not suffer from these distorting factors.
Abstract:
We consider two fundamental properties in the analysis of two-way tables of positive data: the principle of distributional equivalence, one of the cornerstones of correspondence analysis of contingency tables, and the principle of subcompositional coherence, which forms the basis of compositional data analysis. For an analysis to be subcompositionally coherent, it suffices to analyse the ratios of the data values. The usual approach to dimension reduction in compositional data analysis is to perform principal component analysis on the logarithms of ratios, but this method does not obey the principle of distributional equivalence. We show that by introducing weights for the rows and columns, the method achieves this desirable property. This weighted log-ratio analysis is theoretically equivalent to spectral mapping, a multivariate method developed almost 30 years ago for displaying ratio-scale data from biological activity spectra. The close relationship between spectral mapping and correspondence analysis is also explained, as well as their connection with association modelling. The weighted log-ratio methodology is applied here to frequency data in linguistics and to chemical compositional data in archaeology.
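A compact sketch of the weighted log-ratio analysis described above: row and column masses taken from the table margins, weighted double-centering of the log-transformed table, then an SVD. This is a generic reconstruction of that recipe, not the authors' implementation:

    import numpy as np

    def weighted_logratio_analysis(N):
        """Weighted log-ratio analysis of a strictly positive two-way table N."""
        P = N / N.sum()
        r, c = P.sum(axis=1), P.sum(axis=0)          # row and column masses
        L = np.log(N)
        col_mean = r @ L                             # weighted mean of each column
        row_mean = L @ c                             # weighted mean of each row
        grand = r @ L @ c
        # weighted double-centering removes row/column "size", leaving ratio information
        Lc = L - row_mean[:, None] - col_mean[None, :] + grand
        S = np.sqrt(r)[:, None] * Lc * np.sqrt(c)[None, :]
        U, sv, Vt = np.linalg.svd(S, full_matrices=False)
        row_coords = U * sv / np.sqrt(r)[:, None]    # principal row coordinates
        col_coords = Vt.T * sv / np.sqrt(c)[:, None]
        return sv, row_coords, col_coords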
Abstract:
We estimate an open economy dynamic stochastic general equilibrium (DSGE) model of Australia with a number of shocks, frictions and rigidities, matching a large number of observable time series. We find that both foreign and domestic shocks are important drivers of the Australian business cycle. We also find that the initial impact on inflation of an increase in demand for Australian commodities is negative, due to an improvement in the real exchange rate, though there is a persistent positive effect on inflation that dominates at longer horizons.
Abstract:
The classical binary classification problem is investigated when it is known in advance that the posterior probability function (or regression function) belongs to some class of functions. We introduce and analyze a method which effectively exploits this knowledge. The method is based on minimizing the empirical risk over a carefully selected "skeleton" of the class of regression functions. The skeleton is a covering of the class based on a data-dependent metric, especially fitted for classification. A new scale-sensitive dimension is introduced which is more useful for the studied classification problem than other, previously defined, dimension measures. This fact is demonstrated by performance bounds for the skeleton estimate in terms of the new dimension.
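To make the construction concrete, here is a toy sketch of the skeleton idea for a finite set of candidate regression functions: build an eps-separated cover under the empirical L1 distance on the sample, then minimize the empirical 0-1 risk over that cover. The covering, the metric and the threshold eps used in the paper are more refined; names here are illustrative:

    import numpy as np

    def skeleton_classifier(candidates, X, y, eps=0.05):
        """Pick the skeleton member with smallest empirical 0-1 risk.
        candidates : list of functions f(X) -> estimates of P(Y=1|X)
        X, y       : sample inputs and 0/1 labels."""
        preds = np.array([f(X) for f in candidates])
        skeleton_idx = []
        for i in range(len(candidates)):
            d = np.abs(preds[i] - preds[skeleton_idx]).mean(axis=1) if skeleton_idx else []
            if len(skeleton_idx) == 0 or np.min(d) > eps:   # keep only eps-separated members
                skeleton_idx.append(i)
        risks = [((preds[i] >= 0.5).astype(int) != y).mean() for i in skeleton_idx]
        return candidates[skeleton_idx[int(np.argmin(risks))]]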
Abstract:
Geological and geomorphological mapping at a scale of 1:10,000, besides being an important source of scientific information, is also a necessary tool for municipal bodies to make proper decisions when dealing with geo-environmental problems concerning integral territorial development. In this work, detailed information is given on the contents of such maps and their social and economic applications, together with an assessment of the balance between the investment they require and the gains derived from them.
Abstract:
In this paper, we present a method to deal with the constraints of the underwater medium when finding changes between sequences of underwater images. One of the main problems the underwater medium poses for automatic change detection is the low altitude of the camera when the pictures are taken. This emphasises the parallax effect between the images, as they are not taken at exactly the same position. In order to solve this problem, we geometrically register the images together, taking into account the relief of the scene.
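As a baseline for what registering the images involves, the sketch below matches features and warps one image onto the other with a single planar homography using OpenCV. The method described above additionally accounts for the relief of the scene, which a single homography cannot do, so this is only the generic starting point:

    import cv2
    import numpy as np

    def register_and_diff(img_ref, img_new):
        """Register img_new onto img_ref with ORB matching + RANSAC homography,
        then difference the aligned images (large values = candidate changes)."""
        orb = cv2.ORB_create(2000)
        k1, d1 = orb.detectAndCompute(img_ref, None)
        k2, d2 = orb.detectAndCompute(img_new, None)
        matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)
        src = np.float32([k2[m.trainIdx].pt for m in matches])   # points in img_new
        dst = np.float32([k1[m.queryIdx].pt for m in matches])   # points in img_ref
        H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
        warped = cv2.warpPerspective(img_new, H, img_ref.shape[1::-1])
        return cv2.absdiff(img_ref, warped)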
Abstract:
In dealing with systems as complex as the cytoskeleton, we need organizing principles or, short of that, an empirical framework into which these systems fit. We report here unexpected invariants of cytoskeletal behavior that comprise such an empirical framework. We measured elastic and frictional moduli of a variety of cell types over a wide range of time scales and using a variety of biological interventions. In all instances elastic stresses dominated at frequencies below 300 Hz, increased only weakly with frequency, and followed a power law; no characteristic time scale was evident. Frictional stresses paralleled the elastic behavior at frequencies below 10 Hz but approached a Newtonian viscous behavior at higher frequencies. Surprisingly, all data could be collapsed onto master curves, the existence of which implies that elastic and frictional stresses share a common underlying mechanism. Taken together, these findings define an unanticipated integrative framework for studying protein interactions within the complex microenvironment of the cell body, and appear to set limits on what can be predicted about integrated mechanical behavior of the matrix based solely on cytoskeletal constituents considered in isolation. Moreover, these observations are consistent with the hypothesis that the cytoskeleton of the living cell behaves as a soft glassy material, wherein cytoskeletal proteins modulate cell mechanical properties mainly by changing an effective temperature of the cytoskeletal matrix. If so, then the effective temperature becomes an easily quantified determinant of the ability of the cytoskeleton to deform, flow, and reorganize.
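The weak power-law behaviour of the elastic modulus, and the soft-glassy reading of its exponent as an effective temperature, can be summarized by a simple fit. The sketch below assumes measurements of the elastic modulus G' over a range of frequencies and is illustrative rather than the study's fitting code:

    import numpy as np

    def fit_power_law(freq, g_elastic, f0=1.0):
        """Fit G'(f) ~ G0 * (f/f0)**(x-1) by linear regression in log-log space.
        In the soft-glassy picture the exponent x acts as an effective temperature
        of the cytoskeletal matrix (x -> 1 means purely elastic behaviour)."""
        slope, intercept = np.polyfit(np.log(freq / f0), np.log(g_elastic), 1)
        x = slope + 1.0                    # power-law exponent
        g0 = np.exp(intercept)             # modulus at the reference frequency f0
        return x, g0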