79 results for series-parallel
Abstract:
Initial convergence of the perturbation series expansion for vibrational nonlinear optical (NLO) properties was analyzed. The zero-point vibrational average (ZPVA) was obtained through first order in mechanical plus electrical anharmonicity. Results indicated that higher-order terms in electrical and mechanical anharmonicity can make substantial contributions to the pure vibrational polarizability of typical NLO molecules.
Abstract:
The mutual information of independent parallel Gaussian-noise channels is maximized, under an average power constraint, by independent Gaussian inputs whose power is allocated according to the waterfilling policy. In practice, discrete signalling constellations with limited peak-to-average ratios (m-PSK, m-QAM, etc.) are used in lieu of the ideal Gaussian signals. This paper gives the power allocation policy that maximizes the mutual information over parallel channels with arbitrary input distributions. Such a policy admits a graphical interpretation, referred to as mercury/waterfilling, which generalizes the waterfilling solution and retains some of its intuition. The relationship between the mutual information of Gaussian channels and the nonlinear minimum mean-square error proves key to solving the power allocation problem.
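As context for the mercury/waterfilling generalization, the classical waterfilling baseline it extends can be sketched in a few lines. This is a minimal illustration, not the paper's mercury/waterfilling algorithm: the noise levels, total power budget, and bisection tolerance below are made up for the example.

```python
def waterfill(noise, total_power, tol=1e-9):
    """Classical waterfilling: pour total_power over parallel channels
    with noise levels n_i so that p_i = max(mu - n_i, 0); the water
    level mu is found by bisection on the total power used."""
    lo, hi = min(noise), max(noise) + total_power
    while hi - lo > tol:
        mu = (lo + hi) / 2.0
        if sum(max(mu - n, 0.0) for n in noise) > total_power:
            hi = mu  # water level too high, power budget exceeded
        else:
            lo = mu
    mu = (lo + hi) / 2.0
    return [max(mu - n, 0.0) for n in noise]

# Three parallel channels with noise levels 1, 2, 3 and total power 3:
# the water level settles at 3, giving allocations (2, 1, 0).
powers = waterfill([1.0, 2.0, 3.0], 3.0)
```

The mercury/waterfilling policy of the paper modifies the level each channel fills to (via the MMSE of the actual input constellation), but the bisection-on-a-budget structure carries over.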
Abstract:
Intuitively, music has both predictable and unpredictable components. In this work we assess this qualitative statement in a quantitative way using common time series models fitted to state-of-the-art music descriptors. These descriptors cover different musical facets and are extracted from a large collection of real audio recordings comprising a variety of musical genres. Our findings show that music descriptor time series exhibit a certain predictability not only for short time intervals, but also for mid-term and relatively long intervals. This fact is observed independently of the descriptor, musical facet and time series model we consider. Moreover, we show that our findings are not only of theoretical relevance but can also have practical impact. To this end we demonstrate that music predictability at relatively long time intervals can be exploited in a real-world application, namely the automatic identification of cover songs (i.e. different renditions or versions of the same musical piece). Importantly, this prediction strategy yields a parameter-free approach for cover song identification that is substantially faster, requires less storage, and still achieves highly competitive accuracies compared to state-of-the-art systems.
Abstract:
This article presents new series on the factor distribution of income between 1950 and 2000 for 14 Latin American countries, based on a harmonization of the employee-compensation data recorded in the national accounts. It also presents estimates of the remuneration of self-employed workers, discussing the different estimation methodologies available and the data limitations that affect them. The analysis of these consistent estimates allows us to reach some preliminary conclusions. First, the estimates vary considerably across countries. Second, at the regional level, they show both cyclical and long-run variations, supporting the studies that question the long-run stability of the factor distribution of income. Third, our estimates of the labor share, once corrected to include an estimate of non-salaried work, remain substantially below those of developed countries, thus calling into question the studies claiming that such differences disappear when this correction is applied.
Abstract:
This paper analyzes the construction of lived space in five television series produced by the US network HBO: Carnivàle, Deadwood, The Sopranos, The Wire and Treme. The research starts from the conviction that the history of American domestic space can be reworked through the mise-en-scène and the symbolic potential of serial space. The paper thus proposes a hermeneutic itinerary that analyzes the evolution of narrative space, and its morphological evolution as a function of the diegetic needs of each of the works analyzed.
Abstract:
Over the past two decades, technological progress has been biased towards making skilled labor more productive. The evidence for this finding is based on the persistent parallel increase in the skill premium and the supply of skilled workers. What are the implications of skill-biased technological change for the business cycle? To answer this question, we use the CPS outgoing rotation groups to construct quarterly series for the price and quantity of skill. The unconditional correlation of the skill premium with the cycle is zero. However, using a structural VAR with long-run restrictions, we find that technology shocks substantially increase the premium. Investment-specific technology shocks are not skill-biased, and our findings suggest that capital and skill are (mildly) substitutable in aggregate production.
Abstract:
We present simple procedures for the prediction of a real-valued sequence. The algorithms are based on a combination of several simple predictors. We show that if the sequence is a realization of a bounded stationary and ergodic random process, then the average of squared errors converges, almost surely, to that of the optimum, given by the Bayes predictor. We offer an analogous result for the prediction of stationary Gaussian processes.
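The combination-of-predictors idea can be sketched with exponential weighting on squared error. This is a generic aggregation sketch under assumed toy experts (predict zero, repeat the last value) and a made-up learning rate, not necessarily the authors' exact procedure.

```python
import math

def predict_real(sequence, experts, eta=0.1):
    """Aggregate simple predictors with exponential weights on squared
    error: the forecast is the weighted average of the experts'
    predictions, and each weight decays with that expert's squared loss."""
    weights = [1.0] * len(experts)
    preds, history = [], []
    for y in sequence:
        guesses = [e(history) for e in experts]
        total = sum(weights)
        forecast = sum(w * g for w, g in zip(weights, guesses)) / total
        preds.append(forecast)
        # Exponential weight update: heavier loss, faster decay.
        weights = [w * math.exp(-eta * (g - y) ** 2)
                   for w, g in zip(weights, guesses)]
        history.append(y)
    return preds

# Two toy experts: always predict 0, or repeat the last observation.
experts = [lambda h: 0.0, lambda h: h[-1] if h else 0.0]
preds = predict_real([1.0] * 30, experts)
```

On a constant sequence the last-value expert incurs zero loss after the first step, so the aggregated forecast drifts toward the true value.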
Abstract:
Theorem 1 of Euler's 1737 paper 'Variae Observationes Circa Series Infinitas' states the astonishing result that the series of all unit fractions whose denominators are perfect powers of integers minus unity has sum one. Euler attributes the theorem to Goldbach. The proof is one of those examples of the misuse of divergent series to obtain correct results, so frequent during the seventeenth and eighteenth centuries. We examine this proof closely and, with the help of some insight provided by a modern (and completely different) proof of the Goldbach-Euler theorem, we present a rational reconstruction in terms that could be considered rigorous by modern Weierstrassian standards. At the same time, with a few ideas borrowed from nonstandard analysis, we see how the same reconstruction can also be considered rigorous by modern Robinsonian standards. This last approach, though, is completely in tune with Goldbach and Euler's proof. We hope to convince the reader of how a few simple ideas from nonstandard analysis vindicate Euler's work.
Abstract:
Confidence intervals in econometric time series regressions suffer from notorious coverage problems. This is especially true when the dependence in the data is noticeable and sample sizes are small to moderate, as is often the case in empirical studies. This paper suggests using the studentized block bootstrap and discusses practical issues, such as the choice of the block size. A particular data-dependent method is proposed to automate the method. As a side note, it is pointed out that symmetric confidence intervals are preferred over equal-tailed ones, since they exhibit improved coverage accuracy. The improvements in small-sample performance are supported by a simulation study.
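The core resampling step underlying any block bootstrap can be sketched as follows. This shows only the basic moving-block resampling; the paper's method additionally studentizes the statistic and chooses the block size in a data-dependent way, neither of which is shown here, and the series, block size, and seed are made up for the example.

```python
import random

def moving_block_bootstrap(series, block_size, seed=0):
    """Moving-block bootstrap: rebuild a series of the same length by
    concatenating randomly chosen contiguous blocks, so that short-range
    dependence is preserved inside each block."""
    rng = random.Random(seed)
    n = len(series)
    starts = list(range(n - block_size + 1))  # valid block start points
    out = []
    while len(out) < n:
        s = rng.choice(starts)
        out.extend(series[s:s + block_size])
    return out[:n]  # trim the last block to the original length

resampled = moving_block_bootstrap(list(range(20)), block_size=5)
```

Repeating this resampling many times and recomputing the statistic on each pseudo-series yields the bootstrap distribution from which the confidence interval is read off.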
Abstract:
We address the problem of scheduling a multiclass $M/M/m$ queue with Bernoulli feedback on $m$ parallel servers to minimize time-average linear holding costs. We analyze the performance of a heuristic priority-index rule, which extends Klimov's optimal solution to the single-server case: servers select preemptively customers with larger Klimov indices. We present closed-form suboptimality bounds (approximate optimality) for Klimov's rule, which imply that its suboptimality gap is uniformly bounded above with respect to (i) external arrival rates, as long as they stay within system capacity; and (ii) the number of servers. It follows that its relative suboptimality gap vanishes in a heavy-traffic limit, as external arrival rates approach system capacity (heavy-traffic optimality). We obtain simpler expressions for the special no-feedback case, where the heuristic reduces to the classical $c \mu$ rule. Our analysis is based on comparing the expected cost of Klimov's rule to the value of a strong linear programming (LP) relaxation of the system's region of achievable performance of mean queue lengths. In order to obtain this relaxation, we derive and exploit a new set of work decomposition laws for the parallel-server system. We further report on the results of a computational study on the quality of the $c \mu$ rule for parallel scheduling.
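For the no-feedback case the priority rule mentioned above is just an index computation. A minimal sketch of the classical $c\mu$ ordering, with holding costs and service rates made up for the example (the full Klimov indices for the feedback case are more involved and are not shown):

```python
def cmu_order(holding_costs, service_rates):
    """The c-mu rule: compute the index c_i * mu_i for each customer
    class and serve classes in decreasing index order (a higher-index
    class preempts lower-index ones)."""
    idx = [c * mu for c, mu in zip(holding_costs, service_rates)]
    return sorted(range(len(idx)), key=lambda i: -idx[i])

# Three classes: indices are 1*3 = 3, 4*0.5 = 2 and 2*2 = 4,
# so class 2 gets top priority, then class 0, then class 1.
order = cmu_order([1.0, 4.0, 2.0], [3.0, 0.5, 2.0])
```

The intuition: serving the class whose holding cost drains fastest per unit of service effort minimizes the instantaneous cost rate, which is exactly optimal for a single server without feedback.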
Abstract:
We present a simple randomized procedure for the prediction of a binary sequence. The algorithm uses ideas from recent developments of the theory of the prediction of individual sequences. We show that if the sequence is a realization of a stationary and ergodic random process then the average number of mistakes converges, almost surely, to that of the optimum, given by the Bayes predictor.
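One standard instantiation of randomized individual-sequence prediction is the randomized weighted-majority forecaster; the sketch below illustrates that idea with made-up experts, learning rate, and seed, and is not necessarily the authors' exact procedure.

```python
import random

def predict_bits(bits, experts, eta=0.5, seed=0):
    """Randomized weighted-majority forecaster: each expert keeps a
    weight, the prediction 1 is drawn with probability equal to the
    weighted vote for 1, and experts that err have their weight
    multiplied by (1 - eta)."""
    rng = random.Random(seed)
    weights = [1.0] * len(experts)
    mistakes = 0
    for t, bit in enumerate(bits):
        votes = [e(t) for e in experts]
        p1 = sum(w for w, v in zip(weights, votes) if v == 1) / sum(weights)
        guess = 1 if rng.random() < p1 else 0
        mistakes += int(guess != bit)
        # Multiplicative penalty for every expert that voted wrong.
        weights = [w if v == bit else w * (1 - eta)
                   for w, v in zip(weights, votes)]
    return mistakes

# Two trivial experts on an all-ones sequence: the always-1 expert
# quickly dominates, so the forecaster's mistake count stays small.
mistakes = predict_bits([1] * 50, [lambda t: 0, lambda t: 1])
```

The randomization is essential: against arbitrary individual sequences a deterministic forecaster can be forced to err every round, while the randomized version's expected mistake rate tracks the best expert.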
Abstract:
This work proposes novel network analysis techniques for multivariate time series. We define the network of a multivariate time series as a graph where vertices denote the components of the process and edges denote non-zero long-run partial correlations. We then introduce a two-step LASSO procedure, called NETS, to estimate high-dimensional sparse long-run partial correlation networks. This approach is based on a VAR approximation of the process and allows the long-run linkages to be decomposed into the contributions of the dynamic and contemporaneous dependence relations of the system. The large-sample properties of the estimator are analysed and we establish conditions for consistent selection and estimation of the non-zero long-run partial correlations. The methodology is illustrated with an application to a panel of U.S. blue chips.
Abstract:
Interdisciplinary frameworks for studying natural hazards and their temporal trends have an important potential in data generation for risk assessment, land use planning, and therefore the sustainable management of resources. This paper focuses on the adjustments required because of the wide variety of scientific fields involved in the reconstruction and characterisation of flood events for the past 1000 years. The aim of this paper is to describe various methodological aspects of the study of flood events in their historical dimension, including the critical evaluation of old documentary and instrumental sources, flood-event classification and hydraulic modelling, and homogeneity and quality control tests. Standardized criteria for flood classification have been defined and applied to the Isère and Drac floods in France, from 1600 to 1950, and to the Ter, the Llobregat and the Segre floods, in Spain, from 1300 to 1980. The analysis of the Drac and Isère data series from 1600 to the present day showed that extraordinary and catastrophic floods were not distributed uniformly in time. However, the largest floods (general catastrophic floods) were homogeneously distributed in time within the period 1600-1900. No major flood occurred during the 20th century in these rivers. From 1300 to the present day, no homogeneous behaviour was observed for extraordinary floods in the Spanish rivers. The largest floods were uniformly distributed in time within the period 1300-1900 for the Segre and Ter rivers.
Abstract:
The geochemical characteristics (major and trace elements) of the analyzed rocks are similar to those of the Kermadec volcanic arc in the SW Pacific. Finally, the low REE contents, the flat REE pattern, and the low contents of incompatible elements (K, Rb, Zr, Th) are similar to those of the IAT-type series present in the Caribbean volcanic arc. These new data on the Paleogene volcanism of the Sierra Maestra suggest that the plate-tectonic models that have been proposed to explain the origin of the Sierra Maestra volcanic arc should be revised.
Abstract:
This paper analyzes whether the Spanish Quarterly National Accounts series are excessively smooth and, therefore, whether they are really informative about the short-run evolution of the Spanish economy. Using spectral analysis techniques, we observe that the Spanish quarterly series show greater variability than those of other OECD countries in the lowest frequency band (associated with the long-run behavior of the series) and lower variability in the highest frequency band (associated with the noise contained in the series). The reason for this differential behavior of the Spanish quarterly series lies in the method used by the Instituto Nacional de Estadística to estimate the trend-cycle signal of the indicators used as a reference, namely the so-called modified airline filter (LAM).