993 results for "Pyroelectric coefficients"


Relevance: 10.00%

Abstract:

This paper investigates the role of institutions in determining per capita income levels and growth. It contributes to the empirical literature by using different variables as proxies for institutions and by developing a deeper analysis of the issues arising from the use of weak, and too many, instruments in per capita income and growth regressions. The cross-section estimation suggests that institutions seem to matter, regardless of whether they are the only explanatory variable or are combined with geographical and integration variables, although most models suffer from weak instruments. The growth models provide some interesting results: the evidence on the role of institutions is mixed, and such evidence is most closely associated with law and order and the investment profile; government spending is an important policy variable; and collapsing the number of instruments results in fewer significant coefficients on institutions.

Relevance: 10.00%

Abstract:

In an effort to meet its obligations under the Kyoto Protocol, in 2005 the European Union introduced a cap-and-trade scheme where mandated installations are allocated permits to emit CO2. Financial markets have developed that allow companies to trade these carbon permits. For the EU to achieve reductions in CO2 emissions at a minimum cost, it is necessary that companies make appropriate investments and policymakers design optimal policies. In an effort to clarify the workings of the carbon market, several recent papers have attempted to statistically model it. However, the European carbon market (EU ETS) has many institutional features that potentially impact on daily carbon prices (and associated financial futures). As a consequence, the carbon market has properties that are quite different from conventional financial assets traded in mature markets. In this paper, we use dynamic model averaging (DMA) in order to forecast in this newly-developing market. DMA is a recently-developed statistical method which has three advantages over conventional approaches. First, it allows the coefficients on the predictors in a forecasting model to change over time. Second, it allows for the entire forecasting model to change over time. Third, it surmounts statistical problems which arise from the large number of potential predictors that can explain carbon prices. Our empirical results indicate that there are both important policy and statistical benefits with our approach. Statistically, we present strong evidence that there is substantial turbulence and change in the EU ETS market, and that DMA can model these features and forecast accurately compared to conventional approaches. From a policy perspective, we discuss the relative and changing role of different price drivers in the EU ETS. Finally, we document the forecast performance of DMA and discuss how this relates to the efficiency and maturity of this market.
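The DMA recursion described in the abstract can be sketched compactly: each candidate model is a forgetting-factor Kalman filter, and the model probabilities are themselves discounted by a forgetting factor before being updated with each model's predictive likelihood. The following is a minimal illustrative sketch, not the authors' implementation; the forgetting factors `alpha` and `lam` and the measurement variance `v0` are illustrative choices.

```python
import numpy as np

def dma_forecast(y, X_list, alpha=0.99, lam=0.99, v0=1.0):
    """Dynamic model averaging over a set of candidate regressions.

    y      : (T,) target series
    X_list : list of (T, k_m) regressor matrices, one per candidate model
    alpha  : forgetting factor for model probabilities
    lam    : forgetting factor for coefficients (inflates state covariance)
    """
    M, T = len(X_list), len(y)
    probs = np.full(M, 1.0 / M)                  # model probabilities
    thetas = [np.zeros(X.shape[1]) for X in X_list]
    Ps = [np.eye(X.shape[1]) for X in X_list]
    preds = np.zeros(T)
    for t in range(T):
        w = probs ** alpha                       # forgetting step for probabilities
        w /= w.sum()
        yhat_m, lik = np.zeros(M), np.zeros(M)
        for m, X in enumerate(X_list):
            x = X[t]
            P = Ps[m] / lam                      # coefficient forgetting
            yhat_m[m] = x @ thetas[m]
            F = x @ P @ x + v0                   # one-step predictive variance
            e = y[t] - yhat_m[m]
            lik[m] = np.exp(-0.5 * e**2 / F) / np.sqrt(2 * np.pi * F)
            K = P @ x / F                        # Kalman gain
            thetas[m] = thetas[m] + K * e
            Ps[m] = P - np.outer(K, x @ P)
        preds[t] = w @ yhat_m                    # DMA point forecast at t
        probs = w * (lik + 1e-300)               # Bayes update of model weights
        probs /= probs.sum()
    return preds, probs
```

Over time the probability mass migrates toward whichever model is currently forecasting best, which is what lets DMA track a turbulent market.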

Relevance: 10.00%

Abstract:

In this paper we develop methods for estimation and forecasting in large time-varying parameter vector autoregressive models (TVP-VARs). To overcome computational constraints with likelihood-based estimation of large systems, we rely on Kalman filter estimation with forgetting factors. We also draw on ideas from the dynamic model averaging literature and extend the TVP-VAR so that its dimension can change over time. A final extension lies in the development of a new method for estimating, in a time-varying manner, the parameter(s) of the shrinkage priors commonly-used with large VARs. These extensions are operationalized through the use of forgetting factor methods and are, thus, computationally simple. An empirical application involving forecasting inflation, real output, and interest rates demonstrates the feasibility and usefulness of our approach.
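The forgetting-factor device that keeps these filters computationally simple can be shown in a single time-varying-parameter regression: rather than specifying a state-noise covariance, the one-step-ahead state covariance is inflated by 1/λ each period. A minimal sketch, not the paper's TVP-VAR code; the initial covariance and measurement variance are illustrative values.

```python
import numpy as np

def tvp_kalman(y, X, lam=0.99, v=1.0):
    """Kalman filter for y_t = x_t' theta_t + e_t with random-walk theta_t.
    The predicted state covariance is P/lam (forgetting factor), replacing
    an explicit state-noise covariance matrix."""
    T, k = X.shape
    theta = np.zeros(k)
    P = 10.0 * np.eye(k)          # diffuse-ish initial covariance
    path = np.zeros((T, k))
    for t in range(T):
        x = X[t]
        P = P / lam               # prediction step via forgetting
        F = x @ P @ x + v         # forecast error variance
        K = P @ x / F             # Kalman gain
        theta = theta + K * (y[t] - x @ theta)
        P = P - np.outer(K, x @ P)
        path[t] = theta
    return path
```

Smaller λ discounts old observations faster, so the filtered coefficients adapt more quickly to structural change at the cost of noisier estimates.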

Relevance: 10.00%

Abstract:

This paper is inspired by articles in the last decade or so that have argued for more attention to theory, and to empirical analysis, within the well-known, and long-lasting, contingency framework for explaining the organisational form of the firm. Its contribution is to extend contingency analysis in three ways: (a) by empirically testing it, using explicit econometric modelling (rather than case study evidence) involving estimation by ordered probit analysis; (b) by extending its scope from large firms to SMEs; (c) by extending its applications from Western economic contexts to an emerging economy context, using fieldwork evidence from China. It calibrates organisational form in a new way, as an ordinal dependent variable, and also utilises new measures of familiar contingency factors from the literature (i.e. Environment, Strategy, Size and Technology) as the independent variables. An ordered probit model of contingency was constructed, and estimated by maximum likelihood, using a cross section of 83 private Chinese firms. The probit was found to be a good fit to the data, and displayed significant coefficients with plausible interpretations for key variables under all four categories of contingency analysis, namely Environment, Strategy, Size and Technology. Thus we have generalised the contingency model, in terms of specification, interpretation and applications area.
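An ordered probit of the kind estimated here treats the ordinal outcome as a discretized latent variable, with category probabilities given by differences of normal CDFs evaluated at estimated cutpoints. A minimal maximum-likelihood sketch, illustrative rather than the authors' specification; the cutpoints are parameterized through log-gaps so they stay increasing during optimization.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def ordered_probit_fit(y, X):
    """Maximum-likelihood ordered probit: y in {0, ..., J-1},
    P(y <= j | x) = Phi(c_j - x'beta) with increasing cutpoints c_j."""
    J = int(y.max()) + 1
    n, k = X.shape

    def unpack(p):
        beta = p[:k]
        # first cutpoint free; remaining gaps exponentiated to stay positive
        c = np.concatenate(([p[k]], p[k] + np.cumsum(np.exp(p[k + 1:]))))
        return beta, c

    def negll(p):
        beta, c = unpack(p)
        xb = X @ beta
        cdf = np.column_stack([norm.cdf(cj - xb) for cj in c])
        cdf = np.column_stack([np.zeros(n), cdf, np.ones(n)])
        pr = cdf[np.arange(n), y + 1] - cdf[np.arange(n), y]
        return -np.sum(np.log(np.clip(pr, 1e-12, None)))

    res = minimize(negll, np.zeros(k + J - 1), method="BFGS")
    return unpack(res.x)
```

With a well-specified latent model, the estimated slope and cutpoints recover the data-generating values as the sample grows.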

Relevance: 10.00%

Abstract:

This paper considers Bayesian variable selection in regressions with a large number of possibly highly correlated macroeconomic predictors. I show that acknowledging the correlation structure in the predictors can improve forecasts over existing popular Bayesian variable selection algorithms.
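One standard way to make Bayesian variable selection concrete is to enumerate predictor subsets under Zellner's g-prior, whose marginal likelihood has a closed form depending only on each subset's R². This sketch illustrates the generic machinery, not the specific algorithm proposed in the paper; the choice g = n and the flat model prior are illustrative defaults.

```python
import numpy as np
from itertools import combinations

def g_prior_model_probs(y, X, g=None):
    """Posterior model probabilities over all predictor subsets under
    Zellner's g-prior (default g = n) and a flat prior over models."""
    n, k = X.shape
    g = float(n) if g is None else g
    yc = y - y.mean()
    Xc = X - X.mean(axis=0)
    tss = yc @ yc
    models, logml = [], []
    for size in range(k + 1):
        for S in combinations(range(k), size):
            if size == 0:
                r2 = 0.0
            else:
                bhat = np.linalg.lstsq(Xc[:, S], yc, rcond=None)[0]
                resid = yc - Xc[:, S] @ bhat
                r2 = 1.0 - (resid @ resid) / tss
            # log marginal likelihood (up to a constant common to all models)
            lm = (0.5 * (n - 1 - size) * np.log(1 + g)
                  - 0.5 * (n - 1) * np.log(1 + g * (1 - r2)))
            models.append(S)
            logml.append(lm)
    logml = np.array(logml)
    w = np.exp(logml - logml.max())
    return models, w / w.sum()
```

Full enumeration is feasible only for modest k; with many correlated macro predictors, stochastic search over this same posterior is the usual workaround.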

Relevance: 10.00%

Abstract:

JPEG2000 is an image compression standard that applies a wavelet transform followed by uniform dead-zone quantization of the coefficients. Wavelet coefficients exhibit both statistical and visual dependencies. The statistical dependencies are accounted for in the JPEG2000 scheme; the visual dependencies, however, are not. This work aims to find a representation better adapted to the visual system than the one JPEG2000 provides directly. To obtain it, we use divisive normalization of the coefficients, a technique that has already proven effective both for statistical decorrelation of coefficients and perceptually. Ideally, one would map the coefficients into a space of values in which a larger coefficient value implies a larger visual contribution, and use that space for coding. In practice, however, we want our coding system to be integrated into a standard. For this reason we use JPEG2000, an ITU standard that allows a choice of the distortion measure used in coding, and we use the distortion in the normalized-coefficient domain as the distortion measure for choosing which data are sent first.
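Divisive normalization, the technique referred to above, divides each coefficient by a local energy estimate pooled over its neighbourhood, which reduces the dependencies between wavelet coefficients. A minimal sketch; the Gaussian pooling, the saturation constant `b`, and the squared-energy form are illustrative choices, not the parameters used in this work.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def divisive_normalize(coeffs, b=0.1, sigma=1.0):
    """Divisively normalize a 2-D band of wavelet coefficients:
    each coefficient is divided by the square root of a
    Gaussian-weighted local energy estimate plus a constant b."""
    energy = gaussian_filter(coeffs ** 2, sigma=sigma)
    return coeffs / np.sqrt(b + energy)
```

The normalized responses are bounded and far less heavy-tailed than raw wavelet coefficients, which is what makes distortion measured in this domain better aligned with visibility.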

Relevance: 10.00%

Abstract:

In contrast to previous results combining all ages, we find positive effects of comparison income on happiness for the under-45s and negative effects for those over 45. In the BHPS these coefficients are several times the magnitude of the own-income effects. In the GSOEP they cancel, giving no effect of comparison income on life satisfaction in the whole sample when controlling for fixed effects, time-in-panel, and flexible age-group dummies. The residual age-happiness relationship is hump-shaped in all three countries. The results are consistent with a simple life-cycle model of relative income under uncertainty.

Relevance: 10.00%

Abstract:

This paper discusses the challenges faced by the empirical macroeconomist and methods for surmounting them. These challenges arise due to the fact that macroeconometric models potentially include a large number of variables and allow for time variation in parameters. These considerations lead to models which have a large number of parameters to estimate relative to the number of observations. A wide range of approaches are surveyed which aim to overcome the resulting problems. We stress the related themes of prior shrinkage, model averaging and model selection. Subsequently, we consider a particular modelling approach in detail. This involves the use of dynamic model selection methods with large TVP-VARs. A forecasting exercise involving a large US macroeconomic data set illustrates the practicality and empirical success of our approach.

Relevance: 10.00%

Abstract:

We use factor-augmented vector autoregressive models with time-varying coefficients to construct a financial conditions index (FCI). The time variation in the parameters allows the weights attached to each financial variable in the index to evolve over time. Furthermore, we develop methods for dynamic model averaging or selection which allow the financial variables entering the FCI to change over time. We discuss why such extensions of the existing literature are important and show them to be so in an empirical application involving a wide range of financial variables.

Relevance: 10.00%

Abstract:

Using survey expectations data and Markov-switching models, this paper evaluates the characteristics and evolution of investors' forecast errors about the yen/dollar exchange rate. Since our model is derived from the uncovered interest rate parity (UIRP) condition and our data cover a period of low interest rates, this study is also related to the forward premium puzzle and the currency carry trade strategy. We obtain the following results. First, for the same forecast horizon, exchange rate forecasts are homogeneous among different industry types, but within the same industry, exchange rate forecasts differ if the forecast horizon is different. In particular, investors tend to undervalue the future exchange rate over long-term forecast horizons; in the short run, however, they tend to overvalue it. Second, while forecast errors are found to be partly driven by interest rate spreads, evidence against UIRP is found regardless of the forecast horizon; the forward premium puzzle is more pronounced in shorter-term forecast errors. Consistent with this finding, our coefficients on interest rate spreads provide indirect evidence of the yen carry trade over only a short-term forecast horizon. Furthermore, the carry trade appears to be active when there is a clear indication that interest rates will remain low in the future.

Relevance: 10.00%

Abstract:

Purpose: Revolutionary endovascular treatments are on the verge of becoming available for the management of ascending aortic diseases. Morphometric measurements of the ascending aorta have already been performed with ECG-gated MDCT to support such therapeutic development. However, the reliability of these measurements remains unknown. The objective of this work was to compare the intraobserver and interobserver variability of CAD (computer-aided diagnosis) versus manual measurements in the ascending aorta. Methods and materials: Twenty-six consecutive patients referred for ECG-gated CT thoracic angiography (64-row CT scanner) were evaluated. Measurements of the maximum and minimum ascending aorta diameters at mid-distance between the brachiocephalic artery and the aortic valve were obtained automatically with a commercially available CAD and manually by two observers separately. Both observers repeated the measurements in a separate session at least one month after the first. Intraclass correlation coefficients as well as the Bland-Altman method were used for comparison between measurements. Paired t-tests were used to determine the significance of intraobserver and interobserver differences (alpha = 0.05). Results: There was a significant difference between CAD and manual measurements of the maximum diameter (p = 0.004) for the first observer, whereas the difference was significant for the minimum diameter between the second observer and the CAD (p < 0.001). Interobserver agreement was weak when measurements were done manually. Intraobserver variability was lower with the CAD than with manual measurements (limits of variability: -0.7 to 0.9 mm for the former and -1.2 to 1.3 mm for the latter). Conclusion: To improve the reproducibility of measurements whenever needed, pre- and post-therapeutic management of the ascending aorta may benefit from follow-up performed by a single observer with the help of CAD.
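The Bland-Altman method used in this study summarizes agreement between two sets of paired measurements by the mean difference (bias) and its 95% limits of agreement. A minimal sketch with simulated paired diameter measurements; the data below are synthetic, not the study's.

```python
import numpy as np

def bland_altman_limits(a, b):
    """Bland-Altman bias and 95% limits of agreement
    between two sets of paired measurements."""
    d = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    bias = d.mean()
    sd = d.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```

Narrow limits around a near-zero bias indicate good agreement; a systematic offset shows up as a bias shifted away from zero, as in the simulated example below.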

Relevance: 10.00%

Abstract:

We analyse the role of time variation in coefficients and other sources of uncertainty in exchange rate forecasting regressions. Our techniques incorporate the notion that the relevant set of predictors, and their corresponding weights, change over time. We find that predictive models which allow for sudden, rather than smooth, changes in coefficients significantly beat the random walk benchmark in an out-of-sample forecasting exercise. Using an innovative variance decomposition scheme, we identify uncertainty in the estimation of the coefficients, and uncertainty about the precise degree of their variability, as the main factors hindering the models' forecasting performance. The uncertainty regarding the choice of predictor is small.

Relevance: 10.00%

Abstract:

OBJECTIVES: Advances in biopsychosocial science have underlined the importance of taking social history and a life course perspective into consideration in primary care. For both clinical and research purposes, this study aims to develop and validate a standardised instrument measuring both material and social deprivation at an individual level. METHODS: We identified relevant potential questions regarding deprivation using a systematic review, structured interviews, focus group interviews and a think-aloud approach. Item response theory analysis was then used to reduce the length of the 38-item questionnaire and derive the deprivation in primary care questionnaire (DiPCare-Q) index, using data obtained from a random sample of 200 patients during their planned visits to an ambulatory general internal medicine clinic. Patients completed the questionnaire a second time over the phone 3 days later to enable us to assess reliability. Content validity of the DiPCare-Q was then assessed by 17 general practitioners. Psychometric properties and validity of the final instrument were investigated in a second set of patients. The DiPCare-Q was administered to a random sample of 1898 patients attending one of 47 different private primary care practices in western Switzerland, along with questions on subjective social status, education, source of income, welfare status and subjective poverty. RESULTS: Deprivation was defined in three distinct dimensions: material (eight items), social (five items) and health deprivation (three items). Item consistency was high in both the derivation set (Kuder-Richardson Formula 20 (KR20) = 0.827) and the validation set (KR20 = 0.778). The DiPCare-Q index was reliable (intraclass correlation coefficient = 0.847) and was correlated with subjective social status (r(s) = -0.539). CONCLUSION: The DiPCare-Q is a rapid, reliable and validated instrument that may prove useful for measuring both material and social deprivation in primary care.
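The KR20 statistic reported above measures internal consistency for binary items: it compares the sum of the item variances p(1-p) with the variance of the total score. A minimal sketch on synthetic data; the latent-trait construction, thresholds and noise levels below are illustrative, not the DiPCare-Q data.

```python
import numpy as np

def kr20(items):
    """Kuder-Richardson Formula 20 for a respondents-by-items
    matrix of binary (0/1) scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    p = items.mean(axis=0)                       # item difficulties
    total_var = items.sum(axis=1).var(ddof=1)    # variance of total score
    return (k / (k - 1)) * (1 - np.sum(p * (1 - p)) / total_var)
```

Items driven by a shared latent trait yield a high KR20, while unrelated random items yield a value near zero.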

Relevance: 10.00%

Abstract:

It has recently been emphasized that, if individuals have heterogeneous dynamics, estimates of shock persistence based on aggregate data are significantly higher than those derived from their disaggregate counterparts. However, a careful examination of the implications of this statement for the various tools routinely employed to measure persistence is missing from the literature. This paper formally examines this issue. We consider a disaggregate linear model with heterogeneous dynamics and compare the values of several measures of persistence across aggregation levels. Interestingly, we show that the average persistence of aggregate shocks, as measured by the impulse response function (IRF) of the aggregate model or by the average of the individual IRFs, is identical at all horizons. This result remains true even in situations where the units are (short-memory) stationary but the aggregate process is long-memory or even nonstationary. In contrast, other popular persistence measures, such as the sum of the autoregressive coefficients or the largest autoregressive root, tend to be higher the higher the aggregation level. We argue, however, that this should be seen more as an undesirable property of these measures than as evidence of different average persistence across aggregation levels. The results are illustrated in an application using U.S. inflation data.
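The persistence measures compared in the paper can all be computed directly for an AR(p): the impulse response function via the MA(∞) recursion, the sum of the AR coefficients, and the largest autoregressive root in modulus. A minimal sketch of the three measures; this is illustrative, not the paper's aggregation exercise.

```python
import numpy as np

def persistence_measures(phi, horizon=20):
    """Three persistence measures for an AR(p) with coefficients phi:
    the impulse response function, the sum of the AR coefficients,
    and the largest autoregressive root (in modulus)."""
    phi = np.asarray(phi, dtype=float)
    p = len(phi)
    irf = np.zeros(horizon + 1)
    irf[0] = 1.0
    for h in range(1, horizon + 1):            # MA(inf) recursion
        for j in range(1, min(h, p) + 1):
            irf[h] += phi[j - 1] * irf[h - j]
    # roots of z^p - phi_1 z^(p-1) - ... - phi_p
    roots = np.roots(np.concatenate(([1.0], -phi)))
    return irf, phi.sum(), np.abs(roots).max()
```

For an AR(1) all three measures tell the same story (IRF decays at rate equal to the coefficient); for richer dynamics, and especially after aggregation, they can diverge, which is the paper's point.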

Relevance: 10.00%

Abstract:

Therapeutic drug monitoring (TDM) may contribute to optimizing the efficacy and safety of antifungal therapy because of the large variability in drug pharmacokinetics. Rapid, sensitive, and selective laboratory methods are needed for efficient TDM. Quantification of several antifungals in a single analytical run may best fulfill these requirements. We therefore developed a multiplex ultra-performance liquid chromatography-tandem mass spectrometry (UPLC-MS/MS) method requiring 100 μl of plasma for simultaneous quantification within 7 min of fluconazole, itraconazole, hydroxyitraconazole, posaconazole, voriconazole, voriconazole-N-oxide, caspofungin, and anidulafungin. Protein precipitation with acetonitrile was used in a single extraction procedure for eight analytes. After reverse-phase chromatographic separation, antifungals were quantified by electrospray ionization-triple-quadrupole mass spectrometry by selected reaction monitoring detection using the positive mode. Deuterated isotopic compounds of azole antifungals were used as internal standards. The method was validated based on FDA recommendations, including assessment of extraction yields, matrix effect variability (<9.2%), and analytical recovery (80.1 to 107%). The method is sensitive (lower limits of azole quantification, 0.01 to 0.1 μg/ml; those of echinocandin quantification, 0.06 to 0.1 μg/ml), accurate (intra- and interassay biases of -9.9 to +5% and -4.0 to +8.8%, respectively), and precise (intra- and interassay coefficients of variation of 1.2 to 11.1% and 1.2 to 8.9%, respectively) over clinical concentration ranges (upper limits of quantification, 5 to 50 μg/ml). Thus, we developed a simple, rapid, and robust multiplex UPLC-MS/MS assay for simultaneous quantification of plasma concentrations of six antifungals and two metabolites. 
Through optimized, cost-effective use of laboratory resources, this assay offers an efficient tool for routine daily TDM aimed at maximizing the real-time efficacy and safety of the different recommended single-drug antifungal regimens and combination salvage therapies, as well as a tool for clinical research.