993 results for Large Extra Dimensions


Relevance: 20.00%

Abstract:

This paper is motivated by the recent interest in the use of Bayesian VARs for forecasting, even in cases where the number of dependent variables is large. In such cases, factor methods have traditionally been used, but recent work using a particular prior suggests that Bayesian VAR methods can forecast better. In this paper, we consider a range of alternative priors which have been used with small VARs, discuss the issues which arise when they are used with medium and large VARs, and examine their forecast performance using a US macroeconomic data set containing 168 variables. We find that Bayesian VARs do tend to forecast better than factor methods and provide an extensive comparison of the strengths and weaknesses of various approaches. Our empirical results show the importance of using forecast metrics which use the entire predictive density, instead of using only point forecasts.
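The abstract's final point, that forecast metrics should use the entire predictive density rather than point forecasts alone, can be illustrated with a toy comparison (not the paper's metric or data): two forecasters with identical point forecasts, and hence identical root mean squared error, are separated by the average Gaussian log predictive score, which rewards well-calibrated density spread.

```python
import math

def rmse(actuals, forecasts):
    """Point-forecast metric: root mean squared forecast error."""
    return math.sqrt(sum((a - f) ** 2 for a, f in zip(actuals, forecasts)) / len(actuals))

def avg_log_score(actuals, means, stds):
    """Density metric: average Gaussian log predictive likelihood of the realised values."""
    total = 0.0
    for a, m, s in zip(actuals, means, stds):
        total += -0.5 * math.log(2 * math.pi * s * s) - (a - m) ** 2 / (2 * s * s)
    return total / len(actuals)

# Two forecasters with identical point forecasts but different predictive spreads.
actuals = [0.1, -0.2, 0.3, 0.0]
means = [0.0, 0.0, 0.0, 0.0]
print(rmse(actuals, means))                        # identical for both forecasters
print(avg_log_score(actuals, means, [0.2] * 4))    # well-calibrated density
print(avg_log_score(actuals, means, [5.0] * 4))    # overly wide density scores worse
```

The RMSE cannot distinguish the two forecasters, while the log score penalises the needlessly diffuse predictive density.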

Relevance: 20.00%

Abstract:

In this paper we develop methods for estimation and forecasting in large time-varying parameter vector autoregressive models (TVP-VARs). To overcome computational constraints with likelihood-based estimation of large systems, we rely on Kalman filter estimation with forgetting factors. We also draw on ideas from the dynamic model averaging literature and extend the TVP-VAR so that its dimension can change over time. A final extension lies in the development of a new method for estimating, in a time-varying manner, the parameter(s) of the shrinkage priors commonly used with large VARs. These extensions are operationalized through the use of forgetting factor methods and are, thus, computationally simple. An empirical application involving forecasting inflation, real output, and interest rates demonstrates the feasibility and usefulness of our approach.
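The forgetting-factor trick that keeps the filtering computationally simple can be sketched for a single time-varying coefficient (a minimal stand-in, not the authors' multivariate TVP-VAR; the forgetting factor 0.95 and the noise variance are illustrative):

```python
def forgetting_kalman(ys, xs, lam=0.95, obs_var=1.0):
    """Track a time-varying coefficient b_t in y_t = b_t * x_t + e_t.

    Instead of specifying a state-noise variance Q, the predicted state
    variance is inflated by 1/lam each period (the forgetting factor),
    which keeps the recursions trivial to compute.
    """
    b, p = 0.0, 1.0                      # state mean and variance
    path = []
    for y, x in zip(ys, xs):
        p_pred = p / lam                 # predict: inflate variance, no Q needed
        s = x * p_pred * x + obs_var     # one-step forecast-error variance
        k = p_pred * x / s               # Kalman gain
        b = b + k * (y - x * b)          # update coefficient estimate
        p = (1.0 - k * x) * p_pred       # update state variance
        path.append(b)
    return path

# The true coefficient jumps from 1 to 2 halfway through the sample.
xs = [1.0] * 200
ys = [1.0] * 100 + [2.0] * 100
path = forgetting_kalman(ys, xs)
print(round(path[99], 3), round(path[-1], 3))   # tracks 1, then adapts to 2
```

Smaller values of the forgetting factor discount old observations faster, so the filter adapts more quickly to parameter change at the cost of noisier estimates.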

Relevance: 20.00%

Abstract:

We provide field experimental evidence of the effects of monitoring in a context where productivity is multi-dimensional and only one dimension is monitored and incentivised. We hire students to do a job for us. The job consists of identifying euro coins. We study the effects of monitoring and penalising mistakes on work quality, and evaluate spillovers on non-incentivised dimensions of productivity (punctuality and theft). We find that monitoring improves work quality only if incentives are large, but reduces punctuality substantially irrespective of the size of incentives. Monitoring does not affect theft, with ten per cent of participants stealing overall. Our setting also allows us to disentangle the possible theoretical mechanisms driving the adverse effects of monitoring. Our findings are supportive of a reciprocity mechanism, whereby workers retaliate for being distrusted.

Relevance: 20.00%

Abstract:

This paper proposes full-Bayes priors for time-varying parameter vector autoregressions (TVP-VARs) which are more robust and objective than existing choices proposed in the literature. We formulate the priors so that they allow for straightforward posterior computation, require minimal input from the user, and result in shrinkage posterior representations, making them appropriate for models of large dimensions. A comprehensive forecasting exercise involving TVP-VARs of different dimensions establishes the usefulness of the proposed approach.

Relevance: 20.00%

Abstract:

Vector Autoregressive Moving Average (VARMA) models have many theoretical properties which should make them popular among empirical macroeconomists. However, they are rarely used in practice due to over-parameterization concerns, difficulties in ensuring identification and computational challenges. With the growing interest in multivariate time series models of high dimension, these problems with VARMAs become even more acute, accounting for the dominance of VARs in this field. In this paper, we develop a Bayesian approach for inference in VARMAs which surmounts these problems. It jointly ensures identification and parsimony in the context of an efficient Markov chain Monte Carlo (MCMC) algorithm. We use this approach in a macroeconomic application involving up to twelve dependent variables. We find our algorithm to work successfully and provide insights beyond those provided by VARs.

Relevance: 20.00%

Abstract:

Acute cardiovascular dysfunction occurs perioperatively in more than 20% of cardiosurgical patients, yet current acute heart failure (HF) classification is not applicable to this period. Indicators of major perioperative risk include unstable coronary syndromes, decompensated HF, significant arrhythmias and valvular disease. Clinical risk factors include history of heart disease, compensated HF, cerebrovascular disease, presence of diabetes mellitus, renal insufficiency and high-risk surgery. EuroSCORE reliably predicts perioperative cardiovascular alteration in patients aged less than 80 years. Preoperative B-type natriuretic peptide level is an additional risk stratification factor. Aggressively preserving heart function during cardiosurgery is a major goal. Volatile anaesthetics and levosimendan seem to be promising cardioprotective agents, but large trials are still needed to assess the best cardioprotective agent(s) and optimal protocol(s). The aim of monitoring is early detection and assessment of mechanisms of perioperative cardiovascular dysfunction. Ideally, volume status should be assessed by 'dynamic' measurement of haemodynamic parameters. Assess heart function first by echocardiography, then using a pulmonary artery catheter (especially in right heart dysfunction). If volaemia and heart function are in the normal range, cardiovascular dysfunction is very likely related to vascular dysfunction. In treating myocardial dysfunction, consider the following options, either alone or in combination: low-to-moderate doses of dobutamine and epinephrine, milrinone or levosimendan. In vasoplegia-induced hypotension, use norepinephrine to maintain adequate perfusion pressure. Exclude hypovolaemia in patients under vasopressors, through repeated volume assessments. Optimal perioperative use of inotropes/vasopressors in cardiosurgery remains controversial, and further large multinational studies are needed. 
Cardiosurgical perioperative classification of cardiac impairment should be based on time of occurrence (precardiotomy, failure to wean, postcardiotomy) and haemodynamic severity of the patient's condition (crash and burn, deteriorating fast, stable but inotrope dependent). In heart dysfunction with suspected coronary hypoperfusion, an intra-aortic balloon pump is highly recommended. A ventricular assist device should be considered before end organ dysfunction becomes evident. Extra-corporeal membrane oxygenation is an elegant solution as a bridge to recovery and/or decision making. This paper offers practical recommendations for management of perioperative HF in cardiosurgery based on European experts' opinion. It also emphasizes the need for large surveys and studies to assess the optimal way to manage perioperative HF in cardiac surgery.

Relevance: 20.00%

Abstract:

There is a vast literature that specifies Bayesian shrinkage priors for vector autoregressions (VARs) of possibly large dimensions. In this paper I argue that many of these priors are not appropriate for multi-country settings, which motivates me to develop priors for panel VARs (PVARs). The parametric and semi-parametric priors I suggest not only perform valuable shrinkage in large dimensions, but also allow for soft clustering of variables or countries which are homogeneous. I discuss the implications of these new priors for modelling interdependencies and heterogeneities among different countries in a panel VAR setting. Monte Carlo evidence and an empirical forecasting exercise show clear and important gains of the new priors compared to existing popular priors for VARs and PVARs.

Relevance: 20.00%

Abstract:

This observational study analyzed imatinib pharmacokinetics and response in 2478 chronic myeloid leukemia (CML) patients. Data were obtained through centralized therapeutic drug monitoring (TDM) at a median treatment duration of ≥2 years. First, individual initial trough concentrations (Cmin(400 mg)) under a 400 mg/day imatinib starting dose were estimated. Second, their correlation with reported treatment response was verified. Low imatinib levels were predicted in young male patients and those receiving P-gp/CYP3A4 inducers. These patients also had lower response rates (7% lower 18-month MMR in male patients, 17% lower 1-year CCyR in young patients; Kaplan-Meier estimates). Time-point-independent multivariate regression confirmed a correlation of individual Cmin(400 mg) with response and adverse events. Possibly due to confounding factors (e.g. dose modifications, patient selection bias), the relationship seemed, however, flatter than previously reported from prospective controlled studies. Nonetheless, these observational results strongly suggest that a subgroup of patients could benefit from early dosage optimization assisted by TDM, because of lower imatinib concentrations and lower response rates.

Relevance: 20.00%

Abstract:

When dealing with sustainability we are concerned with the biophysical as well as the monetary aspects of economic and ecological interactions. This multidimensional approach requires that special attention is given to dimensional issues in relation to curve fitting practice in economics. Unfortunately, many empirical and theoretical studies in economics, as well as in ecological economics, apply dimensional numbers in exponential or logarithmic functions. We show that it is an analytical error to put a dimensional unit x into exponential functions (a^x) and logarithmic functions (log_a x). Second, we investigate the conditions of data sets under which a particular logarithmic specification is superior to the usual regression specification. This analysis shows that logarithmic specification superiority in terms of the least squares norm is heavily dependent on the available data set. The last section deals with economists’ “curve fitting fetishism”. We propose that a distinction be made between curve fitting over past observations and the development of a theoretical or empirical law capable of maintaining its fitting power for any future observations. Finally, we conclude the paper with several epistemological issues in relation to dimensions and curve fitting practice in economics.
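The claim about dimensional units in logarithms is easy to verify numerically: rescaling the unit of a dimensioned quantity x shifts log x by an additive constant, so any coefficient attached to log x depends on the arbitrary unit choice, and only a dimensionless ratio is unit-invariant. A quick check (the lengths and reference value are illustrative):

```python
import math

# The same length measured in two units: 12.5 m = 1250 cm.
x_m, x_cm = 12.5, 1250.0

# The log of a dimensioned number depends on the arbitrary unit choice...
print(math.log(x_m), math.log(x_cm))

# ...and the gap is exactly the log of the unit-conversion factor.
assert math.isclose(math.log(x_cm) - math.log(x_m), math.log(100.0))

# Only a dimensionless ratio (here, relative to a reference length of 1 m,
# expressed consistently in each unit) gives a unit-free answer.
ref_m, ref_cm = 1.0, 100.0
assert math.isclose(math.log(x_m / ref_m), math.log(x_cm / ref_cm))
```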

Relevance: 20.00%

Abstract:

In the present paper, we study the geometric discrepancy with respect to families of rotated rectangles. The well-known extremal cases are the axis-parallel rectangles (logarithmic discrepancy) and rectangles rotated in all possible directions (polynomial discrepancy). We study several intermediate situations: lacunary sequences of directions, lacunary sets of finite order, and sets with small Minkowski dimension. In each of these cases, extensions of a lemma due to Davenport allow us to construct appropriate rotations of the integer lattice which yield small discrepancy.

Relevance: 20.00%

Abstract:

BACKGROUND: Superinfection with drug-resistant HIV strains could potentially contribute to compromised therapy in patients initially infected with drug-sensitive virus and receiving antiretroviral therapy. To investigate the importance of this potential route to drug resistance, we developed a bioinformatics pipeline to detect superinfection from routinely collected genotyping data, and assessed whether superinfection contributed to increased drug resistance in a large European cohort of viremic, drug-treated patients. METHODS: We used sequence data from routine genotypic tests spanning the protease and partial reverse transcriptase regions in the Virolab and EuResist databases, which collated data from five European countries. Superinfection was indicated when sequences of a patient failed to cluster together in phylogenetic trees constructed with selected sets of control sequences. A subset of the indicated cases was validated by re-sequencing pol and env regions from the original samples. RESULTS: 4425 patients had at least two sequences in the database, with a total of 13,816 distinct sequence entries (of which 86% belonged to subtype B). We identified 107 patients with phylogenetic evidence for superinfection. In 14 of these cases, we analyzed newly amplified sequences from the original samples for validation purposes: only 2 cases were verified as superinfections in the repeated analyses; the other 12 cases turned out to involve sample or sequence misidentification. Resistance to drugs used at the time of strain replacement did not change in these two patients. A third case could not be validated by re-sequencing, but was supported as superinfection by an intermediate sequence with a high degenerate base pair count within the time frame of strain switching. Drug resistance increased in this single patient.
CONCLUSIONS: Routine genotyping data are informative for the detection of HIV superinfection; however, most cases of non-monophyletic clustering in patient phylogenies arise from sample or sequence mix-up rather than from superinfection, which emphasizes the importance of validation. Non-transient superinfection was rare in our mainly treatment-experienced cohort, and we found a single case of possible transmitted drug resistance by this route. We therefore conclude that in our large cohort, superinfection with drug-resistant HIV did not compromise the efficiency of antiretroviral treatment.
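The pipeline's core phylogenetic test, whether a patient's sequences cluster together, amounts to a monophyly check on a tree. A toy sketch under assumed inputs (the tree structures and sequence labels are hypothetical, and this is not the actual Virolab/EuResist pipeline):

```python
def leaves_under(children, node):
    """Set of leaf labels in the subtree rooted at `node`."""
    if node not in children:
        return {node}
    leaves = set()
    for child in children[node]:
        leaves |= leaves_under(children, child)
    return leaves

def is_monophyletic(children, root, tips):
    """True if some node's leaf set equals `tips`, i.e. the tips form one clade.

    Miniature version of the pipeline's test: a patient's sequences should
    cluster together, and failure suggests superinfection (or sample mix-up).
    """
    stack = [root]
    while stack:
        node = stack.pop()
        if leaves_under(children, node) == tips:
            return True
        stack.extend(children.get(node, []))
    return False

# Hypothetical rooted trees given as child lists; leaves are sequence labels.
split = {"root": ["A", "n1"], "n1": ["P_seq1", "n2"], "n2": ["P_seq2", "B"]}
clade = {"root": ["A", "n1"], "n1": ["P_seq1", "P_seq2"]}
print(is_monophyletic(split, "root", {"P_seq1", "P_seq2"}))   # False: B intervenes
print(is_monophyletic(clade, "root", {"P_seq1", "P_seq2"}))   # True: one clade
```

As the CONCLUSIONS stress, a failed monophyly test alone is not proof of superinfection; in the study most such cases traced back to sample or sequence mix-up.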

Relevance: 20.00%

Abstract:

The dynamical analysis of large biological regulatory networks requires the development of scalable methods for mathematical modeling. Following the approach initially introduced by Thomas, we formalize the interactions between the components of a network in terms of discrete variables, functions, and parameters. Model simulations result in directed graphs, called state transition graphs. We are particularly interested in reachability properties and asymptotic behaviors, which correspond to terminal strongly connected components (or "attractors") in the state transition graph. A well-known problem is the exponential increase of the size of state transition graphs with the number of network components, in particular when using the biologically realistic asynchronous updating assumption. To address this problem, we have developed several complementary methods enabling the analysis of the behavior of large and complex logical models: (i) the definition of transition priority classes to simplify the dynamics; (ii) a model reduction method preserving essential dynamical properties; and (iii) a novel algorithm to compact state transition graphs and directly generate compressed representations, emphasizing relevant transient and asymptotic dynamical properties. The power of an approach combining these different methods is demonstrated by applying them to a recent multilevel logical model for the network controlling CD4+ T helper cell response to antigen presentation and to a dozen cytokines. This model accounts for the differentiation of canonical Th1 and Th2 lymphocytes, as well as of inflammatory Th17 and regulatory T cells, along with many hybrid subtypes. All these methods have been implemented in the software GINsim, which enables the definition, the analysis, and the simulation of logical regulatory graphs.
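The notions of asynchronous updating and of attractors in a state transition graph can be illustrated on a two-gene toggle switch (a minimal Python sketch, not GINsim; the Boolean rules are illustrative, and only stable states are detected here, whereas cyclic attractors would additionally require a terminal-SCC decomposition):

```python
from itertools import product

# Illustrative two-gene logical model (a toggle switch): each gene represses the other.
rules = {
    "x": lambda s: int(not s["y"]),
    "y": lambda s: int(not s["x"]),
}

def async_successors(state):
    """Asynchronous updating: at most one component changes per transition."""
    succ = []
    for gene, rule in rules.items():
        target = rule(state)
        if target != state[gene]:
            nxt = dict(state)
            nxt[gene] = target
            succ.append(nxt)
    return succ

def state_transition_graph():
    """Enumerate all Boolean states and their asynchronous successors."""
    genes = sorted(rules)
    stg = {}
    for values in product([0, 1], repeat=len(genes)):
        state = dict(zip(genes, values))
        stg[values] = [tuple(s[g] for g in genes) for s in async_successors(state)]
    return stg

stg = state_transition_graph()
stable = sorted(s for s, succ in stg.items() if not succ)   # states with no outgoing edge
print(stable)   # the two mutually exclusive expression patterns: [(0, 1), (1, 0)]
```

Even this four-state example hints at the scaling problem the abstract describes: with n Boolean components the state transition graph has 2^n nodes, which motivates the priority classes, model reduction, and graph compression methods.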