930 results for Random effect model
Abstract:
2000 Mathematics Subject Classification: 62P10, 92D10, 92D30, 62F03
Abstract:
This paper provides the most comprehensive evidence to date on whether or not monetary aggregates are valuable for forecasting US inflation in the early to mid 2000s. We explore a wide range of different definitions of money, including different methods of aggregation and different collections of included monetary assets. In our forecasting experiment we use two nonlinear techniques, namely recurrent neural networks and kernel recursive least squares regression, techniques that are new to macroeconomics. Recurrent neural networks operate with potentially unbounded input memory, while the kernel regression technique is a finite memory predictor. The two methodologies compete to find the best-fitting US inflation forecasting models and are then compared to forecasts from a naive random walk model. The best models were nonlinear autoregressive models based on kernel methods. Our findings do not provide much support for the usefulness of monetary aggregates in forecasting inflation. Beyond its economic findings, our study is in the tradition of physicists' long-standing interest in the interconnections among statistical mechanics, neural networks, and related nonparametric statistical methods, and suggests potential avenues of extension for such studies. © 2010 Elsevier B.V. All rights reserved.
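The forecasting comparison described above can be illustrated with a minimal sketch. All data here are simulated, and an ordinary AR(1) fit stands in for the paper's kernel autoregressive models; only the benchmark logic (competing model vs. naive random walk, compared by RMSE) follows the abstract:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical inflation-like series: an AR(1) process (illustrative only).
y = np.empty(200)
y[0] = 2.0
for t in range(1, 200):
    y[t] = 0.8 * y[t - 1] + rng.normal(scale=0.3)

# Naive random-walk forecast: next value equals the current one.
rw_forecast = y[:-1]
rw_rmse = np.sqrt(np.mean((y[1:] - rw_forecast) ** 2))

# Simple AR(1) forecast fit by least squares, a deliberately simple
# stand-in for the kernel recursive least squares models in the paper.
X = y[:-1]
phi = (X @ y[1:]) / (X @ X)
ar_rmse = np.sqrt(np.mean((y[1:] - phi * X) ** 2))

print(f"random walk RMSE: {rw_rmse:.3f}, AR(1) RMSE: {ar_rmse:.3f}")
```

In-sample, the fitted model cannot do worse than the random walk (the random walk is the special case phi = 1); the paper's point is that out of sample, on real inflation data, beating the random walk is much harder.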
Abstract:
Points of transition, when major life roles undergo change, tend to be associated with an increased need for social support. The transition from adolescence to adulthood is ideal for examining the effect of normative stress on the development and functioning of social networks. A questionnaire was designed based on the convoy model to assess the influence of personal and situational characteristics on the utilization of support in the prediction of post-transition adjustment. Data were initially collected for a multi-ethnic sample of 741 sophomores and seniors in high school. Surveys were mailed to participants two years later, and once again the following year. The current study is based on data for 310 participants with complete data for all three time periods. A series of hierarchical regressions were conducted to compare three explanatory models of support: main effect, mediation, and moderation. A main effect model of support on post-transition adjustment was confirmed, a mediator model was not confirmed, and a moderator model was marginally confirmed. Family and friend support was related to significantly lower levels of loneliness, particularly for those with less adaptable temperaments.
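The moderation step of such a hierarchical regression can be sketched as follows. Everything here is hypothetical (simulated scores, made-up effect sizes, n = 310 borrowed only as a sample size); it shows only the generic mechanics of testing a moderator by adding an interaction term after the main effects:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 310  # sample size matching the study's final panel

support = rng.normal(size=n)      # hypothetical support score
temperament = rng.normal(size=n)  # hypothetical adaptability score
# Simulated outcome with a small support x temperament interaction.
loneliness = (-0.5 * support + 0.2 * temperament
              - 0.3 * support * temperament + rng.normal(size=n))

# Hierarchical step: main effects first, then add the interaction term.
X_main = np.column_stack([np.ones(n), support, temperament])
X_full = np.column_stack([X_main, support * temperament])

def r_squared(X, y):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

delta_r2 = r_squared(X_full, loneliness) - r_squared(X_main, loneliness)
print(f"R^2 gain from interaction term: {delta_r2:.3f}")
```

A nonzero R² gain from the interaction block is the usual evidence for moderation; in practice an F-test on that increment decides significance.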
Abstract:
Exchange rate economics has achieved substantial development in the past few decades. Despite extensive research, a large number of unresolved problems remain in the exchange rate debate. This dissertation studied three puzzling issues aiming to improve our understanding of exchange rate behavior. Chapter Two used advanced econometric techniques to model and forecast exchange rate dynamics. Chapter Three and Chapter Four studied issues related to exchange rates using the theory of New Open Economy Macroeconomics. Chapter Two empirically examined the short-run forecastability of nominal exchange rates. It analyzed important empirical regularities in daily exchange rates. Through a series of hypothesis tests, a best-fitting fractionally integrated GARCH model with a skewed Student-t error distribution was identified. The forecasting performance of the model was compared with that of a random walk model. Results supported the contention that nominal exchange rates seem to be unpredictable over the short run, in the sense that the best-fitting model cannot beat the random walk model in forecasting exchange rate movements. Chapter Three assessed the ability of dynamic general-equilibrium sticky-price monetary models to generate volatile foreign exchange risk premia. It developed a tractable two-country model where agents face a cash-in-advance constraint and set prices to the local market; the exogenous money supply process exhibits time-varying volatility. The model yielded approximate closed-form solutions for risk premia and real exchange rates. Numerical results provided quantitative evidence that volatile risk premia can endogenously arise in a new open economy macroeconomic model. Thus, the model had potential to rationalize the Uncovered Interest Parity Puzzle.
Chapter Four sought to resolve the consumption-real exchange rate anomaly, which refers to the inability of most international macro models to generate negative cross-correlations between real exchange rates and relative consumption across two countries, as observed in the data. While maintaining the assumption of complete asset markets, this chapter introduced endogenously segmented asset markets into a dynamic sticky-price monetary model. Simulation results showed that such a model could replicate the stylized fact that real exchange rates tend to move in an opposite direction with respect to relative consumption.
Abstract:
Random walk models with temporal correlation (i.e. memory) are of interest in the study of anomalous diffusion phenomena. The random walk and its generalizations hold a prominent place in the characterization of various physical, chemical and biological phenomena. Temporal correlation is an essential feature of anomalous diffusion models. Models with long-range temporal correlation can be called non-Markovian; their short-range counterparts are Markovian. Within this context, we review the existing models with temporal correlation: full memory (the elephant walk model) and partial memory (the Alzheimer walk model and the walk model with a Gaussian memory profile). These models show superdiffusion with a Hurst exponent H > 1/2. In this work we study a superdiffusive random walk model with exponentially decaying memory. This seems to be a self-contradictory statement, since it is well known that random walks with exponentially decaying temporal correlations can be approximated arbitrarily well by Markov processes, and that central limit theorems prohibit superdiffusion for Markovian walks with finite variance of step sizes. The solution to the apparent paradox is that the model is genuinely non-Markovian, owing to a time-dependent decay constant associated with the exponential behavior. We conclude by discussing ideas for future investigations.
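The full-memory case reviewed above, the elephant random walk, is easy to simulate: each new step recalls a uniformly chosen earlier step and repeats it with probability p (reversing it otherwise), and for p > 3/4 the walk is superdiffusive. A minimal sketch (the first step is fixed to +1 for simplicity; parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

def elephant_walk(n_steps, p, rng):
    # Elephant random walk: step t recalls a uniformly chosen earlier
    # step and repeats it with probability p, reverses it otherwise.
    steps = np.empty(n_steps, dtype=int)
    steps[0] = 1  # first step fixed for simplicity
    for t in range(1, n_steps):
        recalled = steps[rng.integers(t)]
        steps[t] = recalled if rng.random() < p else -recalled
    return np.cumsum(steps)

# Mean squared displacement over many walks. For ordinary diffusion
# MSD(t) = t; for p = 0.9 > 3/4 the walk is strongly superdiffusive
# and the MSD grows much faster than linearly in time.
finals = np.array([elephant_walk(500, 0.9, rng)[-1] for _ in range(200)])
msd = np.mean(finals.astype(float) ** 2)
print(f"MSD at t=500: {msd:.0f} (diffusive reference: 500)")
```

The exponentially decaying memory studied in the work above replaces the uniform recall with a recency-weighted one whose decay constant itself depends on time, which is what keeps the model genuinely non-Markovian.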
Abstract:
Doctoral essay presented to the Faculté des Arts et des Sciences in fulfillment of the requirements for the degree of Doctor of Clinical Psychology (D.Psy.)
Abstract:
As part of its single technology appraisal (STA) process, the National Institute for Health and Care Excellence (NICE) invited the company that manufactures cabazitaxel (Jevtana®, Sanofi, UK) to submit evidence for the clinical and cost effectiveness of cabazitaxel for treatment of patients with metastatic hormone-relapsed prostate cancer (mHRPC) previously treated with a docetaxel-containing regimen. The School of Health and Related Research Technology Appraisal Group at the University of Sheffield was commissioned to act as the independent Evidence Review Group (ERG). The ERG produced a critical review of the evidence for the clinical and cost effectiveness of the technology based upon the company's submission to NICE. Clinical evidence for cabazitaxel was derived from a multinational randomised open-label phase III trial (TROPIC) of cabazitaxel plus prednisone or prednisolone compared with mitoxantrone plus prednisone or prednisolone, which was assumed to represent best supportive care. The NICE final scope identified a further three comparators: abiraterone in combination with prednisone or prednisolone; enzalutamide; and radium-223 dichloride for the subgroup of people with bone metastasis only (no visceral metastasis). The company did not consider radium-223 dichloride to be a relevant comparator. Neither abiraterone nor enzalutamide has been directly compared in a trial with cabazitaxel. Instead, clinical evidence was synthesised within a network meta-analysis (NMA). Results from TROPIC showed that cabazitaxel was associated with a statistically significant improvement in both overall survival and progression-free survival compared with mitoxantrone. Results from a random-effects NMA, as conducted by the company and updated by the ERG, indicated that there was no statistically significant difference between the three active treatments for both overall survival and progression-free survival. 
Utility data were not collected as part of the TROPIC trial, and were instead taken from the company's UK early access programme. Evidence on resource use came from the TROPIC trial, supplemented by both expert clinical opinion and a UK clinical audit. List prices were used for mitoxantrone, abiraterone and enzalutamide as directed by NICE, although commercial in-confidence patient-access schemes (PASs) are in place for abiraterone and enzalutamide. The confidential PAS was used for cabazitaxel. Sequential use of the advanced hormonal therapies (abiraterone and enzalutamide) does not usually occur in clinical practice in the UK. Hence, cabazitaxel could be used within two pathways of care: either when an advanced hormonal therapy was used pre-docetaxel, or when one was used post-docetaxel. The company believed that the former pathway was more likely to represent standard National Health Service (NHS) practice, and so their main comparison was between cabazitaxel and mitoxantrone, with effectiveness data from the TROPIC trial. Results of the company's updated cost-effectiveness analysis estimated a probabilistic incremental cost-effectiveness ratio (ICER) of £45,982 per quality-adjusted life-year (QALY) gained, which the committee considered to be the most plausible value for this comparison. Cabazitaxel was estimated to be both cheaper and more effective than abiraterone. Cabazitaxel was estimated to be cheaper but less effective than enzalutamide, resulting in an ICER of £212,038 per QALY gained for enzalutamide compared with cabazitaxel. The ERG noted that radium-223 is a valid comparator (for the indicated sub-group), and that it may be used in either of the two care pathways. Hence, its exclusion leads to uncertainty in the cost-effectiveness results. 
In addition, the company assumed that there would be no drug wastage when cabazitaxel was used, with cost-effectiveness results being sensitive to this assumption: modelling drug wastage increased the ICER comparing cabazitaxel with mitoxantrone to over £55,000 per QALY gained. The ERG updated the company's NMA and used a random effects model to perform a fully incremental analysis between cabazitaxel, abiraterone, enzalutamide and best supportive care using PASs for abiraterone and enzalutamide. Results showed that both cabazitaxel and abiraterone were extendedly dominated by the combination of best supportive care and enzalutamide. Preliminary guidance from the committee, which included wastage of cabazitaxel, did not recommend its use. In response, the company provided both a further discount to the confidential PAS for cabazitaxel and confirmation from NHS England that it is appropriate to supply and purchase cabazitaxel in pre-prepared intravenous-infusion bags, which would remove the cost of drug wastage. As a result, the committee recommended use of cabazitaxel as a treatment option in people with an Eastern Cooperative Oncology Group performance status of 0 or 1 whose disease had progressed during or after treatment with at least 225 mg/m² of docetaxel, as long as it was provided at the discount agreed in the PAS and purchased in either pre-prepared intravenous-infusion bags or in vials at a reduced price to reflect the average per-patient drug wastage.
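The random-effects pooling underlying such an analysis can be sketched in its simplest pairwise form, the DerSimonian-Laird estimator (a full NMA generalizes this across a treatment network). The effect sizes and standard errors below are made-up illustrative numbers, not the TROPIC or NMA data:

```python
import numpy as np

# Hypothetical log hazard ratios and standard errors from three trials
# (illustrative numbers only).
effects = np.array([-0.45, -0.05, -0.30])
se = np.array([0.10, 0.12, 0.15])

# DerSimonian-Laird random-effects pooling.
w_fixed = 1.0 / se**2
mu_fixed = np.sum(w_fixed * effects) / np.sum(w_fixed)
Q = np.sum(w_fixed * (effects - mu_fixed) ** 2)  # heterogeneity statistic
df = len(effects) - 1
c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
tau2 = max(0.0, (Q - df) / c)  # between-study variance estimate

w_random = 1.0 / (se**2 + tau2)
mu_random = np.sum(w_random * effects) / np.sum(w_random)
se_random = np.sqrt(1.0 / np.sum(w_random))
print(f"pooled effect: {mu_random:.3f} +/- {1.96 * se_random:.3f}")
```

The between-study variance tau² widens the pooled confidence interval relative to a fixed-effect analysis, which is why random-effects NMAs, like the one above, more readily find no statistically significant difference between active treatments.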
Abstract:
This work investigates the relationship between entrepreneurship and the incidence of bureaucratic corruption in the Brazilian states and the Federal District. The main hypothesis of this study is that the opening of a business in the Brazilian states is negatively affected by the incidence of corruption. The theoretical framework is divided into entrepreneurship and bureaucratic corruption, with an emphasis on the materialistic (objectivist) perspective of entrepreneurship and the effects of bureaucratic corruption on entrepreneurial activity. Using panel-data regression, we estimated pooled, fixed-effects and random-effects models. To measure corruption, we used the General Index of Corruption for the Brazilian states (BOLL, 2010), and to represent entrepreneurship, firm entry per capita by state. Tests (Chow, Hausman and Breusch-Pagan) indicate that the random effects model is more appropriate, and the preliminary results indicate a positive impact of bureaucratic corruption on entrepreneurial activity, contradicting the hypothesis expected and found in previous articles on Brazil, and corroborating the proposition of Dreher and Gassebner (2011) that, in countries with heavy regulation, bureaucratic corruption can act as grease in the wheels of entrepreneurship.
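The panel-data machinery used here can be illustrated with the fixed-effects (within) estimator, which removes unobserved state effects by demeaning within each state; the Hausman test then compares this estimate with the random-effects one. The panel below is fully simulated (states, coefficient 0.4 and all scores are hypothetical, not the actual corruption index or firm-entry data):

```python
import numpy as np

rng = np.random.default_rng(3)
n_states, n_years = 27, 10  # 26 states plus the Federal District
state = np.repeat(np.arange(n_states), n_years)

# Simulated panel: state-specific intercepts plus a regressor effect.
alpha = rng.normal(size=n_states)  # unobserved state effects
corruption = rng.normal(size=n_states * n_years)
entry = (alpha[state] + 0.4 * corruption
         + rng.normal(scale=0.5, size=state.size))

# Fixed-effects (within) estimator: demean by state before OLS.
def demean_by_group(x, g):
    means = np.bincount(g, weights=x) / np.bincount(g)
    return x - means[g]

y_w = demean_by_group(entry, state)
x_w = demean_by_group(corruption, state)
beta_fe = (x_w @ y_w) / (x_w @ x_w)
print(f"within estimate of the corruption coefficient: {beta_fe:.3f}")
```

The random-effects estimator instead treats the state intercepts as random draws and uses a GLS weighting; when the two estimates agree (as the Hausman test checks), random effects is preferred for its efficiency, which is the conclusion the study reaches.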
Abstract:
The goal of image retrieval and matching is to find and locate object instances in images from a large-scale image database. While visual features are abundant, how to combine them to improve on the performance of individual features remains a challenging task. In this work, we focus on leveraging multiple features for accurate and efficient image retrieval and matching. We first propose two graph-based approaches to rerank initially retrieved images for generic image retrieval. In the graph, vertices are images while edges are similarities between image pairs. Our first approach employs a mixture Markov model, based on a random walk model over multiple graphs, to fuse the graphs. We introduce a probabilistic model to compute the importance of each feature for graph fusion under a naive Bayesian formulation, which requires statistics of similarities from a manually labeled dataset containing irrelevant images. To reduce human labeling, we further propose a fully unsupervised reranking algorithm based on a submodular objective function that can be efficiently optimized by a greedy algorithm. By maximizing an information gain term over the graph, our submodular function favors a subset of database images that are similar to the query images and resemble each other. The function also exploits the rank relationships of images from multiple ranked lists obtained by different features. We then study a more well-defined application, person re-identification, where the database contains labeled images of human bodies captured by multiple cameras. Re-identifications from multiple cameras are regarded as related tasks to exploit shared information. We apply a novel multi-task learning algorithm using both low-level features and attributes. A low-rank attribute embedding is jointly learned within the multi-task learning formulation to map the original binary attributes to a continuous attribute space, where incorrect and incomplete attributes are rectified and recovered.
To locate objects in images, we design an object detector based on object proposals and deep convolutional neural networks (CNNs), in view of the emergence of deep networks. We improve the Fast R-CNN framework and investigate two new strategies to detect objects accurately and efficiently: scale-dependent pooling (SDP) and cascaded rejection classifiers (CRC). SDP improves detection accuracy by exploiting appropriate convolutional features depending on the scale of input object proposals. CRC effectively utilizes convolutional features and eliminates a large share of negative proposals in a cascaded manner, while maintaining a high recall for true objects. The two strategies together improve the detection accuracy and reduce the computational cost.
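The graph-based reranking idea above can be sketched with a random walk with restart on a fused similarity graph. This is a simplified stand-in, not the paper's mixture Markov model: the two "feature" graphs are random, the fusion weights 0.6/0.4 are arbitrary, and node 0 plays the role of the query's top match:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 6  # tiny hypothetical database of images

def random_similarity(n, rng):
    # Symmetric nonnegative similarity matrix with zero self-similarity,
    # standing in for one visual feature's pairwise similarities.
    s = rng.random((n, n))
    s = (s + s.T) / 2
    np.fill_diagonal(s, 0)
    return s

# Fuse two feature graphs by a weighted sum (weights are illustrative).
W = 0.6 * random_similarity(n, rng) + 0.4 * random_similarity(n, rng)
P = W / W.sum(axis=1, keepdims=True)  # row-stochastic transition matrix

# Random walk with restart, personalized on the query's top match (node 0).
restart = np.zeros(n)
restart[0] = 1.0
alpha = 0.85  # probability of following the graph vs. restarting
scores = np.full(n, 1.0 / n)
for _ in range(100):
    scores = alpha * scores @ P + (1 - alpha) * restart
order = np.argsort(-scores)  # reranked database images
print("reranked order:", order.tolist())
```

Stationary scores of such a walk concentrate on images well connected to the query under both features at once, which is what makes graph fusion improve on either feature's individual ranked list.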
Abstract:
Facing widespread poverty and land degradation, Vietnam started a land reform in 1993 as part of its renovation policy package known as “Doi Moi”. This paper examines the impacts of improved land tenure security, via this land reform, on manure use by farm households. As manure potentially improves soil fertility by adding organic matter and nutrients to the soil surface, it might contribute to improving soil productive capacity and reversing land degradation. Random effect regression models are applied to a panel dataset of 133 farm households in the Northern Uplands of Vietnam collected in 1993, 1998, and 2006. The results confirm that land tenure security has positive effects on manure use, but the levels of influence differ depending on whether the land has been privatized or whether the land title has already been issued. In addition, manure use is also influenced by the number of cattle and pigs, the education level and ethnicity of household heads, farm land size and non-farm income. The findings suggest that speeding up land privatization and titling, encouraging cattle and pig rearing, and improving education would promote manure use in farm production. However, careful interpretation of our research findings is required as land privatization, together with economic growth and population pressure, might lead to overuse of farm inputs.
Abstract:
Doctorate in Management
Abstract:
Biotechnology, global warming, natural resources and ecosystem management are all representative of the "new politics of nature" (Hajer 2003), a term encompassing issues marked by high scientific uncertainty and a regulatory framework ill-suited to new realities, thereby generating unusually intense political conflict. Hoping to reduce these tensions and generate consensual knowledge, many governments turn to ad hoc scientific institutions to inform policy-making and respond to stakeholder concerns. But do these scientific assessments actually create a common understanding shared by polarized political actors? While one might expect them to foster a unifying climate of collective learning, a conflictual political environment makes learning between opponents extremely unlikely. This research therefore documents the conciliatory potential of scientific assessments using the case of Quebec shale gas (2010-2014). In doing so, it mobilizes the literature on the political dimensions of knowledge and science to conceptualize the role of scientific assessments within a theory of scientific brokerage. A social network analysis (SNA) of the 5751 references contained in the documents submitted by 268 organizations participating in the 2010 and 2014 public consultations constitutes the core of the empirical demonstration. Specifically, it shows how a scientific broker can redirect the flow of information to counteract the incompatibility between collective learning and political conflict.
The argument mobilizes the cognitive mechanisms traditionally present in the policy broker literature, but also introduces the power games fundamental to the circulation of knowledge among political actors.