114 results for Processor power estimation


Relevance: 20.00%

Abstract:

Research project carried out during a stay at the National Oceanography Centre, Southampton (NOCS), Great Britain, between May and July 2006. The ability to obtain an accurate estimate of sea surface salinity (SSS) is important for investigating and predicting the extent of climate change. The Soil Moisture and Ocean Salinity (SMOS) mission was selected by the European Space Agency (ESA) to obtain sea surface salinity maps on a global scale and with a short revisit time. Before the launch of SMOS, the horizontal variability of SSS and the potential of data retrieved from SMOS measurements to reproduce known oceanographic behaviour are to be analysed. The overall objective is to fill the gap between reliable input/auxiliary data sources and the tools developed to simulate and process data acquired under the SMOS configuration. The SMOS End-to-end Performance Simulator (SEPS) is an ad hoc simulator developed by the Universitat Politècnica de Catalunya (UPC) to generate data under the SMOS configuration. SEPS input data came from the Ocean Circulation and Climate Advanced Modeling (OCCAM) project, used at NOCS, at different spatial resolutions. By modifying SEPS to accept OCCAM data as input, one month of simulated brightness temperature data was obtained for different ascending passes covering the selected area. The tasks carried out during the stay at NOCS aimed to provide a reliable technique for external calibration, and hence for cancelling the bias, a methodology for temporally averaging the different acquisitions during ascending passes, and the best configuration of the cost function, before exploiting and investigating the potential of the SEPS/OCCAM data to derive retrieved SSS with high-resolution patterns.

Relevance: 20.00%

Abstract:

This article provides a fresh methodological and empirical approach for assessing price level convergence and its relation to purchasing power parity (PPP) using annual price data for seventeen US cities. We suggest a new procedure that can handle a wide range of PPP concepts in the presence of multiple structural breaks using all possible pairs of real exchange rates. To deal with cross-sectional dependence, we use both cross-sectionally demeaned data and a parametric bootstrap approach. In general, we find more evidence for stationarity when the parity restriction is not imposed, while imposing the parity restriction leads toward the rejection of panel stationarity. Our results can be embedded in the view of the Balassa-Samuelson approach, but where the slope of the time trend is allowed to change in the long run. The median half-life point estimates are found to be lower than the consensus view regardless of the parity restriction.
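The half-life reported above is the standard persistence summary: if real-exchange-rate deviations follow an AR(1) process with coefficient rho, the half-life of a shock is ln(0.5)/ln(rho). A minimal sketch (the rho values are illustrative, not the paper's estimates):

```python
import numpy as np

def half_life(rho: float) -> float:
    """Half-life of a deviation under an AR(1) process q_t = rho * q_{t-1} + e_t."""
    return np.log(0.5) / np.log(rho)

# Illustrative persistence values: the consensus PPP half-life of 3-5 years
# corresponds to an annual AR(1) coefficient of roughly 0.79-0.87.
for rho in (0.87, 0.79, 0.65):
    print(f"rho = {rho:.2f} -> half-life = {half_life(rho):.2f} years")
```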

Relevance: 20.00%

Abstract:

Lean meat percentage (LMP) is an important carcass quality parameter. The aim of this work is to obtain a calibration equation for Computed Tomography (CT) scans using the Partial Least Squares Regression (PLS) technique in order to predict the LMP of the carcass and of the different cuts, and to study and compare two different methodologies for selecting the variables (Variable Importance for Projection, VIP, and stepwise) to be included in the prediction equation. The root mean square error of prediction with cross-validation (RMSEPCV) of the LMP obtained with PLS and VIP-based selection was 0.82%, and for stepwise selection it was 0.83%. Predicting the LMP by scanning only the ham gave an RMSEPCV of 0.97%; if both the ham and the loin were scanned, the RMSEPCV was 0.90%. Results indicate that for CT data both VIP and stepwise selection are good methods. Moreover, scanning only the ham allowed us to obtain a good prediction of the LMP of the whole carcass.
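As a sketch of the VIP-based selection compared here, the following computes VIP scores from a fitted PLS model and keeps variables with VIP > 1 (a common rule of thumb). The data are synthetic stand-ins, not the CT measurements used in the paper:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def vip_scores(pls: PLSRegression) -> np.ndarray:
    """Variable Importance for Projection scores of a fitted PLS model."""
    W = pls.x_weights_            # (p, A) weight vectors
    T = pls.x_scores_             # (n, A) latent scores
    Q = pls.y_loadings_           # (m, A) y-loadings
    p, A = W.shape
    # Variance of y explained by each latent component
    ssy = np.sum(T ** 2, axis=0) * np.sum(Q ** 2, axis=0)
    Wn = W / np.linalg.norm(W, axis=0, keepdims=True)
    return np.sqrt(p * (Wn ** 2 @ ssy) / ssy.sum())

# Hypothetical data: 40 CT-derived predictors, LMP driven by the first 5.
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 40))
y = X[:, :5].sum(axis=1) + 0.5 * rng.normal(size=120)
pls = PLSRegression(n_components=3).fit(X, y)
keep = vip_scores(pls) > 1.0      # retain variables with VIP > 1
print(keep.nonzero()[0])
```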

Relevance: 20.00%

Abstract:

This article investigates the history of land and water transformations in Matadepera, a wealthy suburb of metropolitan Barcelona. The analysis is informed by theories of political ecology and methods of environmental history; although very relevant, these have received relatively little attention within ecological economics. Empirical material includes communications from the City Archives of Matadepera (1919-1979), 17 interviews with locals born between 1913 and 1958, and an exhaustive review of grey historical literature. Existing water histories of Barcelona and its outskirts portray a battle against natural water scarcity, hard won by heroic engineers and politicians acting for the good of the community. Our research in Matadepera tells a very different story. We reveal the production of a highly uneven landscape and waterscape through fierce political and power struggles. The evolution of Matadepera from a small rural village to an elite suburb was anything but spontaneous or peaceful. It was a socio-environmental project intended by landowning elites and fiercely contested by others. The struggle for the control of water went hand in hand with the land and political struggles that culminated, and were violently resolved, in the Spanish Civil War. The displacement of the economic and environmental costs of water use from the few to the many continues to this day and is constitutive of Matadepera's uneven and unsustainable landscape. By unravelling the relations of power that are inscribed in the urbanization of nature (Swyngedouw, 2004), we question the received wisdom of contemporary water policy debates, particularly the notion of a natural scarcity that merits a technical or economic response. We argue that the water question is fundamentally a political question of environmental justice; it is about negotiating alternative visions of the future and deciding whose visions will be produced.

Relevance: 20.00%

Abstract:

As computer chip implementation technologies evolve to deliver more performance, chips use smaller components, higher transistor densities, and lower supply voltages. All these factors make chips less robust and increase the probability of transient faults. A transient fault may occur once and never recur in the same way during a computer system's lifetime. The consequences of a transient fault vary: the operating system might abort execution if the change produced by the fault is detected through bad application behavior, but the biggest risk is that the fault produces an undetected data corruption that modifies the application's final result without warning (for example, a bit flip in some crucial data). With the objective of researching transient faults in a computer system's processor registers and memory, we have developed an extension of COTSon, HP and AMD's joint full-system simulation environment. This extension allows the injection of faults that change a single bit in the processor registers or memory of the simulated computer. The developed fault injection system makes it possible to evaluate the effects of single-bit-flip transient faults on an application, analyze an application's robustness against such faults, and validate fault detection mechanisms and strategies.
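The fault model described, a single bit flip in a register or memory word, is simple to state in code. The sketch below is purely illustrative and does not use COTSon's actual interfaces:

```python
import random

def flip_bit(word: int, width: int = 64) -> int:
    """Single-event upset model: flip one uniformly chosen bit of a register/memory word."""
    return word ^ (1 << random.randrange(width))

# In a full campaign (as in the COTSon extension described above) the workload is
# re-run after each injection and its output compared against a fault-free
# "golden" run to classify the outcome: masked, detected, or silent corruption.
random.seed(1)
value = 0x0000_00FF_DEAD_BEEF
print(f"before: {value:#018x}")
print(f"after:  {flip_bit(value):#018x}")
```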

Relevance: 20.00%

Abstract:

Properties of GMM estimators for panel data, which have become very popular in the empirical economic growth literature, are not well known when the number of individuals is small. This paper analyses, through Monte Carlo simulations, the properties of various GMM and other estimators when the number of individuals is that typically available in country growth studies. It is found that, provided some persistence is present in the series, the system GMM estimator has lower bias and higher efficiency than all the other estimators analysed, including the standard first-differences GMM estimator.
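As an illustration of the kind of Monte Carlo experiment described, the sketch below simulates a small-N dynamic panel with fixed effects and applies the Anderson-Hsiao IV estimator, a simple precursor of the first-differences GMM estimator (all parameter values are hypothetical, not the paper's design):

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_panel(N=35, T=25, rho=0.8):
    """Dynamic panel y_it = rho*y_{i,t-1} + eta_i + eps_it with fixed effects eta_i."""
    eta = rng.normal(size=N)
    y = np.zeros((N, T))
    y[:, 0] = eta / (1 - rho)                 # start near the stationary mean
    for t in range(1, T):
        y[:, t] = rho * y[:, t - 1] + eta + rng.normal(size=N)
    return y

def anderson_hsiao(y):
    """IV estimate of rho: instrument dy_{t-1} with y_{t-2} in first differences."""
    dy, dy_lag, z = y[:, 2:] - y[:, 1:-1], y[:, 1:-1] - y[:, :-2], y[:, :-2]
    dy, dy_lag, z = dy.ravel(), dy_lag.ravel(), z.ravel()
    return (z @ dy) / (z @ dy_lag)

estimates = [anderson_hsiao(simulate_panel()) for _ in range(200)]
print(f"mean estimate = {np.mean(estimates):.3f} (true rho = 0.8)")
```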

Relevance: 20.00%

Abstract:

"Vegeu el resum a l'inici del document del fitxer adjunt."

Relevance: 20.00%

Abstract:

Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the parameter. Because conditional moments are calculated using kernel smoothing rather than simple averaging, it is not necessary that the model be simulable subject to the conditioning information that is used to define the moment conditions. For this reason, the proposed estimator is applicable to general dynamic latent variable models. It is shown that as the number of simulations diverges, the estimator is consistent and a higher-order expansion reveals the stochastic difference between the infeasible GMM estimator based on the same moment conditions and the simulated version. In particular, we show how to adjust standard errors to account for the simulations. Monte Carlo results show how the estimator may be applied to a range of dynamic latent variable (DLV) models, and that it performs well in comparison to several other estimators that have been proposed for DLV models.
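A minimal sketch of the central idea, conditional moments computed by Nadaraya-Watson kernel smoothing over a long unconditional simulation, under a toy model (not the paper's estimator or tuning choices):

```python
import numpy as np

def kernel_conditional_mean(x_sim, y_sim, x_eval, h):
    """Nadaraya-Watson estimate of E[y | x] from simulated draws (Gaussian kernel)."""
    w = np.exp(-0.5 * ((x_eval[:, None] - x_sim[None, :]) / h) ** 2)
    return (w @ y_sim) / w.sum(axis=1)

# Toy latent-variable model (hypothetical): simulate (x, y) jointly at a trial
# parameter value; conditioning on observed x is done by the kernel, not the model.
rng = np.random.default_rng(0)
theta_trial = 1.5
x_sim = rng.normal(size=50_000)
y_sim = theta_trial * x_sim + rng.standard_t(df=5, size=50_000)

x_obs = np.array([-1.0, 0.0, 1.0])
m_hat = kernel_conditional_mean(x_sim, y_sim, x_obs, h=0.1)
print(m_hat)   # approx theta_trial * x_obs; moment condition: y_obs - m_hat
```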

Relevance: 20.00%

Abstract:

This paper presents an analysis of motor vehicle insurance claims relating to vehicle damage and to associated medical expenses. We use univariate severity distributions estimated with parametric and non-parametric methods. The methods are implemented using the statistical package R. The parametric analysis is limited to the estimation of normal and lognormal distributions for each of the two claim types. The non-parametric analysis involves kernel density estimation. We illustrate the benefits of applying transformations to data prior to employing kernel-based methods, using a log-transformation and an optimal transformation from a class of transformations that produces symmetry in the data. The central aim of this paper is to provide educators with material that can be used in the classroom to teach statistical estimation methods, goodness-of-fit analysis and, importantly, statistical computing in the context of insurance and risk management. To this end, we have included in the Appendix all the R code used in the analysis, so that readers, both students and educators, can fully explore the techniques described.
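The paper's appendix provides the full R code; the Python sketch below illustrates only the log-transformation idea with a Gaussian kernel density estimate (synthetic severities, not the paper's data):

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(7)
claims = rng.lognormal(mean=7.0, sigma=1.2, size=500)   # hypothetical severity data

# Direct KDE on heavily skewed data oversmooths the body and the tail;
# estimating on the log scale and transforming back is often preferable.
log_kde = gaussian_kde(np.log(claims))
grid = np.linspace(claims.min(), claims.max(), 200)
# Back-transform: f_X(x) = f_log(log x) / x  (change of variables, Jacobian 1/x)
density = log_kde(np.log(grid)) / grid
print(f"estimated mode of the severity distribution: {grid[density.argmax()]:.0f}")
```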

Relevance: 20.00%

Abstract:

We analyze the statistics of rain-event sizes, rain-event durations, and dry-spell durations in a network of 20 rain gauges scattered over an area close to the NW Mediterranean coast. Power-law distributions emerge clearly for the dry-spell durations, with an exponent around 1.50 ± 0.05, although for event sizes and durations the power-law ranges are rather limited in some cases. Deviations from power-law behavior are attributed to finite-size effects. A scaling analysis helps to elucidate the situation, providing support for the existence of scale invariance in these distributions. It is remarkable that rain data of only moderate resolution yield findings in agreement with self-organized critical phenomena.
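For power-law fits of this kind, the standard maximum-likelihood exponent estimator (Clauset-Shalizi-Newman, alpha = 1 + n / sum(ln(x_i / x_min))) can be sketched as follows; the synthetic sample merely stands in for the dry-spell durations:

```python
import numpy as np

def powerlaw_mle(x, x_min):
    """MLE exponent for a continuous power law p(x) ~ x^(-alpha), x >= x_min,
    together with the standard error of alpha."""
    tail = x[x >= x_min]
    n = tail.size
    alpha = 1.0 + n / np.sum(np.log(tail / x_min))
    return alpha, (alpha - 1.0) / np.sqrt(n)

# Synthetic check: inverse-CDF sampling from a power law with alpha = 1.5,
# i.e. x = x_min * (1 - U)^(-1/(alpha - 1)).
rng = np.random.default_rng(3)
x = (1.0 - rng.random(10_000)) ** (-1.0 / 0.5)
print(powerlaw_mle(x, x_min=1.0))   # approx (1.5, small s.e.)
```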

Relevance: 20.00%

Abstract:

This paper analyses the impact of using different correlation assumptions between lines of business when estimating the risk-based capital reserve, the Solvency Capital Requirement (SCR), under Solvency II regulations. A case study is presented and the SCR is calculated according to the Standard Model approach. Alternatively, the requirement is then calculated using an Internal Model based on a Monte Carlo simulation of the net underwriting result at a one-year horizon, with copulas being used to model the dependence between lines of business. To address the impact of these model assumptions on the SCR we conduct a sensitivity analysis. We examine changes in the correlation matrix between lines of business and address the choice of copulas. Drawing on aggregate historical data from the Spanish non-life insurance market between 2000 and 2009, we conclude that modifications of the correlation and dependence assumptions have a significant impact on SCR estimation.
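A minimal sketch of the Internal Model ingredient described, simulating two lines of business joined by a copula (here Gaussian, with hypothetical marginals, not the Spanish market data) and reading the SCR off the 99.5% quantile:

```python
import numpy as np
from scipy.stats import norm, lognorm, gamma

rng = np.random.default_rng(11)
n, rho = 100_000, 0.5

# Gaussian copula: correlate two standard normals, then map to uniforms.
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
u = norm.cdf(z)

# Hypothetical marginal loss distributions for the two lines of business.
loss1 = lognorm.ppf(u[:, 0], s=0.9, scale=np.exp(5.0))
loss2 = gamma.ppf(u[:, 1], a=2.0, scale=80.0)
total = loss1 + loss2

# SCR at the one-year horizon: 99.5% VaR of aggregate losses over the expected loss.
scr = np.quantile(total, 0.995) - total.mean()
print(f"SCR estimate = {scr:,.0f}")
```

Repeating the exercise with a different correlation matrix or copula family (e.g. a t or Clayton copula) is exactly the sensitivity analysis the paper conducts.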

Relevance: 20.00%

Abstract:

This paper deals with the development of momentum and thermal boundary layers when a power-law fluid flows over a flat plate. At the plate we impose either constant temperature, constant flux, or a Newton cooling condition. The problem is analysed using similarity solutions, integral momentum and energy equations, and an approximation technique that is a form of the Heat Balance Integral Method. The fluid properties are assumed to be independent of temperature, hence the momentum equation uncouples from the thermal problem. We first derive the similarity equations for the velocity and present exact solutions for the case where the power-law index n = 2. The similarity solutions are used to validate the new approximation method. This new technique is then applied to the thermal boundary layer, where a similarity solution can only be obtained for the case n = 1.
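For reference, in the Newtonian case n = 1 the momentum similarity problem reduces to the classical Blasius equation f''' + (1/2) f f'' = 0 with f(0) = f'(0) = 0 and f'(inf) = 1, which a shooting method solves in a few lines. This is the textbook special case, not the paper's power-law formulation:

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

def blasius(eta, y):
    # y = (f, f', f''); Blasius: f''' = -0.5 * f * f''
    return [y[1], y[2], -0.5 * y[0] * y[2]]

def shoot(fpp0):
    """Integrate with trial f''(0); return far-field mismatch f'(eta_max) - 1."""
    sol = solve_ivp(blasius, (0.0, 10.0), [0.0, 0.0, fpp0], rtol=1e-8, atol=1e-8)
    return sol.y[1, -1] - 1.0

fpp0 = brentq(shoot, 0.1, 1.0)
print(f"f''(0) = {fpp0:.5f}")   # classical value, approximately 0.33206
```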

Relevance: 20.00%

Abstract:

The Great Tohoku-Kanto earthquake and the resulting tsunami have brought considerable attention to the issue of the construction of new power plants. We argue in this paper that nuclear power is not a sustainable solution to energy problems. First, we explore the stock of uranium-235 and the different schemes developed by the nuclear power industry to exploit this resource. Second, we show that these methods, fast breeder and MOX fuel reactors, are not feasible. Third, we show that the argument that nuclear energy can be used to reduce CO2 emissions is false: the emissions from the increased water evaporation caused by nuclear power generation must be accounted for. In the case of Japan, water from nuclear power plants is drained into the surrounding sea, raising the water temperature, which has an adverse effect on the immediate ecosystem as well as increasing CO2 emissions through greater evaporation from the sea. Next, a short exercise is used to show that nuclear power is not even needed to meet consumer demand in Japan. Such an exercise should be performed for any country considering the construction of additional nuclear power plants. Lastly, the paper concludes with a discussion of the implications of our findings.

Relevance: 20.00%

Abstract:

Report on the scientific stay at the Philipps-Universität Marburg, Germany, from September to December 2007. In the first project, we employed Energy Decomposition Analysis (EDA) to investigate aromaticity in Fischer carbenes as it relates to the reaction mechanisms studied in my PhD thesis. This powerful tool, compared with other well-known aromaticity indices in the literature such as NICS, is useful not only for quantitative results but also for measuring the degree of conjugation or hyperconjugation in molecules. For the annelated benzenoid systems studied here, our results showed that electron density is more concentrated in the outer rings than in the central one. Strain-induced bond localization plays a major role as the driving force that keeps the more substituted ring the less aromatic. The discussion presented in this work was contrasted at different levels of theory to calibrate the method and ensure the consistency of our results. We think these conclusions can also be extended to arene chemistry to explain the aromaticity and regioselectivity of reactions found in those systems. In the second project, we employed the Turbomole program package and density functionals of the best performance in the current state of the art to explore reaction mechanisms in noble gas chemistry. In particular, we were interested in compounds of the form H-Ng-Ng-F (where Ng (noble gas) = Ar, Kr or Xe) and investigated the relative stability of these species. Our quantum chemical calculations predict that the dixenon compound HXeXeF has an activation barrier for decomposition of 11 kcal/mol, which should be large enough to identify the molecule in a low-temperature matrix. The other noble gases present lower activation barriers and are therefore more labile and harder to observe experimentally.

Relevance: 20.00%

Abstract:

This paper examines why a financial entity's solvency capital might be underestimated if the total amount required is obtained directly from a risk measurement. Using Monte Carlo simulation, we show that, in some instances, a common risk measure such as Value-at-Risk is not subadditive when certain dependence structures are considered. Higher risk evaluations are obtained under independence between random variables than under comonotonicity. The paper stresses, therefore, the relationship between dependence structures and capital estimation.
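A minimal Monte Carlo sketch of the phenomenon described: with sufficiently heavy-tailed marginals (hypothetical Pareto risks, not the paper's setup), the 99% VaR of an independent sum exceeds the comonotonic, additive benchmark, so VaR fails subadditivity:

```python
import numpy as np

rng = np.random.default_rng(5)
n, level = 1_000_000, 0.99

def pareto(u, alpha=0.8):
    """Inverse CDF of a Pareto(alpha) risk; the mean is infinite for alpha < 1."""
    return (1.0 - u) ** (-1.0 / alpha)

u1, u2 = rng.random(n), rng.random(n)
x_ind, y_ind = pareto(u1), pareto(u2)     # independent risks
x_com, y_com = pareto(u1), pareto(u1)     # comonotonic risks (same uniform)

def var(z):
    return np.quantile(z, level)

print(f"independent: VaR(X+Y) = {var(x_ind + y_ind):8.1f} "
      f"vs VaR(X)+VaR(Y) = {var(x_ind) + var(y_ind):8.1f}")
print(f"comonotonic: VaR(X+Y) = {var(x_com + y_com):8.1f} (additive by construction)")
```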