989 results for DSGE, Monte Carlo, Misspecification
Abstract:
For 1997 the following tasks are proposed: Theory of Null Surfaces in General Relativity. * Study the formation of caustics and singularities of null surfaces, both kinematically and dynamically. * Analyze the dynamics of characteristic surfaces up to second order in a perturbative scheme. * Formalize the theory of Null Surfaces using fiber bundles. Asymptotic quantization of the gravitational field. Continue the study of the asymptotic Hilbert space for the gravitational field; it is worth recalling that the quantization of this field is one of the most important unsolved theoretical problems. Quantization of Null Surfaces. Continue the study of the quantization of the gravitational field using the null-surface formalism. The main result obtained so far was the proof that not only the gravitational field but also the points of spacetime become operators with commutation relations. This concretely demonstrates Wheeler's speculation that in quantum gravity the points of spacetime are not well defined. Simulation of a Gamma-ray detector. (...) The following stages are planned for this year: * The effects of the polarization of the incident photon on the cross section, and a possible method for detecting it, will be studied. * The corresponding subroutines will be implemented in the Monte Carlo simulations to incorporate polarized cross sections into the GEANT code. In addition, the design of a new Gamma-ray detector is planned, in an energy range (0.1 to 1 MeV) far below the previous one, to be used as an ultra-sensitive Gamma camera.
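A minimal sketch of what such a polarized-cross-section subroutine might look like: rejection sampling of the linearly polarized Klein-Nishina angular distribution, whose -2 sin²θ cos²φ term makes Compton scattering favor azimuthal angles perpendicular to the polarization vector. This is an illustrative stand-alone Python routine, not the project's GEANT code; the test energy and the rejection bound are assumptions checked only for this particular functional form.

```python
# Illustrative sketch: rejection sampling of the polarized Klein-Nishina
# cross section, the kind of subroutine one would plug into a GEANT-style
# Monte Carlo. Not the project's code; energy and bound are assumptions.
import numpy as np

rng = np.random.default_rng(0)
ME_C2 = 0.511  # electron rest energy in MeV

def kn_polarized(cos_theta, phi, k):
    """Polarized Klein-Nishina angular factor (arbitrary units).
    phi is measured from the incident photon's polarization vector."""
    ratio = 1.0 / (1.0 + (k / ME_C2) * (1.0 - cos_theta))  # k'/k
    sin2 = 1.0 - cos_theta**2
    return ratio**2 * (ratio + 1.0 / ratio - 2.0 * sin2 * np.cos(phi)**2)

def sample_compton(k, n=1):
    """Draw (theta, phi) pairs for a linearly polarized photon of energy k (MeV)."""
    out = []
    fmax = 2.0  # valid bound: f = r^3 + r - 2 r^2 sin2 cos^2(phi) <= 2 for r in (0, 1]
    while len(out) < n:
        ct = rng.uniform(-1.0, 1.0)
        phi = rng.uniform(0.0, 2.0 * np.pi)
        if rng.uniform(0.0, fmax) < kn_polarized(ct, phi, k):
            out.append((np.arccos(ct), phi))
    return out

# At 0.5 MeV, sampled azimuthal angles concentrate near phi = pi/2 and
# 3*pi/2, i.e. perpendicular to the polarization, as the cos^2 term predicts.
print(sample_compton(0.5, 3))
```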
Abstract:
In the area of basic studies in radiation physics, the Radiation Spectrometry Laboratory (L.E.R.) of the U.N.C. develops lines of research whose results are of direct use in applied radiation physics. (...) General and specific objectives: * Studies of ionizing-radiation detectors: the studies in Argon continue, with the specific objective of completing and validating Monte Carlo simulation techniques for detectors with cylindrical symmetry and an electric-field gradient. * X-ray fluorescence analysis: the specific objectives are devoted to evaluating designs that increase the efficiency of X-ray polarization. As for elemental analysis, possible experimental improvements will be considered by increasing the signal-to-noise ratio in fluorescence with polarized X-rays, thereby lowering the detection thresholds for element concentrations in samples composed of many elements. * Dosimetry of ionizing radiation: for LiF dosimetry, the plan is to continue studying the diffusion of positrons in crystals with a low content of color centers, and the linearity of the mean-lifetime response as a function of radiation dose for different classes of thermoluminescent crystals. For dosimetry with plastic scintillators, the next step is the final tuning of the dosimeter and electrometer developed so far, as well as their calibration for subsequent testing in radiotherapy centers. * Analysis of the composition of chemical elements in the human body: in the medium term, we expect to complete the studies of a new technique for determining the concentration of minor and major elements in organs and limbs by incoherent X-ray scattering. This technique envisages the subsequent in vivo determination of the composition of the human body.
Abstract:
OBJECTIVES: To compare the cost-effectiveness ratios of the rapamycin-coated stent (SR) with the conventional stent (SC) from two perspectives: private health insurance and the public system (SUS). METHODS: An analytic decision model with three strategies for treating a coronary lesion: percutaneous coronary intervention (PCI) with SC; with rapamycin-coated SR; and SC followed by SR for the management of symptomatic restenosis. The outcomes were one-year event-free survival and life expectancy. The decision trees were built from the results of published registries and clinical trials. RESULTS: One-year restenosis-free survival was 92.7% with SR and 78.8% with SC. The estimated life expectancy of the strategies was very similar, between 18.5 and 19 years. From the non-public perspective, the first-year cost difference between SC and SR was R$ 3,816, with an incremental cost-effectiveness ratio of R$ 27,403 per event avoided in one year. From the SUS perspective, the cost per event avoided in one year was R$ 47,529. In the sensitivity analysis, the relevant predictors were the probability of restenosis, the expected risk reduction with SR, the cost of the stent, and the cost of managing restenosis. The results per life-year gained showed very high cost-effectiveness ratios in the Monte Carlo simulation. CONCLUSION: The cost-effectiveness ratios of the rapamycin-coated stent were high in this Brazilian model. The use of SR was more favorable in patients at high risk of restenosis, when the cost of managing restenosis is high, and from the non-public perspective.
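The incremental cost-effectiveness arithmetic behind these numbers is simple enough to show directly. A minimal sketch using the rounded figures quoted in the abstract (the small gap with the reported R$ 27,403 reflects rounding of the published inputs):

```python
# Incremental cost-effectiveness ratio (ICER) from the abstract's rounded
# figures, non-public perspective. Illustrative arithmetic only.
cost_sc, cost_sr = 0.0, 3816.0   # first-year incremental cost of SR over SC (R$)
efs_sr, efs_sc = 0.927, 0.788    # one-year restenosis-free survival

delta_cost = cost_sr - cost_sc
delta_effect = efs_sr - efs_sc   # events avoided per patient in one year
icer = delta_cost / delta_effect
print(f"ICER ~ R$ {icer:,.0f} per event avoided")  # close to the reported R$ 27,403
```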
Abstract:
The research tasks involve three interdependent subprojects: A. Characterization of bulk samples. B. Characterization of thin and multilayer samples. C. Characterization of particles and application to environmental pollution. The Monte Carlo technique will be used to predict XRF spectra with an energy-dispersive system. The same method will be used to describe the angular distribution of backscattered electrons for normal and non-normal incidence, and also to improve the description of the ionization distribution function f(ρz) using realistic cross sections. In parallel, models will be developed for the parameters f0, g0 and h that determine the ionization distribution function f(ρz). Likewise, the characterization of thin and stratified samples will be studied. To this end, Monte Carlo simulations will be performed using the results anticipated above, and experimental determinations will be made with the electron microprobe on simple and multicomponent multilayer systems. The applications of the MULTI program developed in our group for the quantification of bulk samples, thin films, and particles will be extended. In this regard, an algorithm will be implemented to quantify samples with undetectable peaks, the errors in the calculated concentrations will be evaluated, and fluorescence enhancement by the K-beta lines will be included. In addition, a data-processing method will be implemented for particle analysis in EPMA and applied to the study of environmental pollution, using image processing for the morphological and chemical characterization of suspended particles, together with cluster analysis. The final objective is the identification and characterization of pollution sources. This task will be complemented by a global analysis of the chemical composition of particulate-matter samples by XRF, and by the study of their physical parameters using the Monte Carlo simulation method. The experimental infrastructure of our laboratory will be improved by bringing into operation a Cambridge Stereoscan electron microscope donated by the University of Barcelona, together with the analytical capability acquired to extend its range of applications, and by installing an XRF system developed in our group with recently acquired accessories. The development and solution of the proposed problems will improve the overall training of students at different stages of their doctoral studies, and of recent doctoral graduates, since the problems have both a basic and an applied aspect, being aimed at solving concrete situations of biological, environmental, and technological interest.
Abstract:
Model uncertainties, parameter uncertainties, epistemic, aleatory, Monte Carlo method, heat exchanger, wastewater treatment plant, information, plant design, plant safety
Abstract:
Electromagnetic compatibility, lightning, crosstalk surge voltages, Monte Carlo simulation, accident initiator
Abstract:
Count data on juvenile blue crabs (Callinectes sapidus Rathbun, 1896) collected in two estuaries of Rio Grande do Sul are the object of the present study. Because these data are zero-inflated, they motivated the formulation of hierarchical models that quantify the effect of the categorical covariates month and site on the probability of occurrence and the density of these populations, taking imperfect detection into account. Non-hierarchical models were also developed for comparison. A Bayesian approach was adopted to estimate the model parameters by Markov chain Monte Carlo (MCMC) simulation. Model comparison was carried out with the Deviance Information Criterion (DIC). The hierarchical models fitted better than the conventional models, mitigated the excess-zeros problem, and made it possible to analyze the probability of occurrence and the density of juvenile blue crabs simultaneously. In the Lagoa dos Patos estuary, the probability of occurrence of Class 2 juveniles increases with distance from the mouth, whereas in Tramandaí the intermediate sites show the highest probabilities. In both estuaries, occurrence is most likely in the summer and winter months. The density of Class 2 juveniles varies markedly across the months of the year and is, in general, higher in the Tramandaí estuary.
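As an illustration of the estimation machinery, here is a minimal sketch of Bayesian fitting of a zero-inflated Poisson model by random-walk Metropolis MCMC. It is a deliberately simplified stand-in for the hierarchical site-occupancy models of the study: the data are simulated, the priors are flat, and no covariates or detection submodel are included.

```python
# Minimal sketch: Bayesian zero-inflated Poisson (ZIP) via random-walk
# Metropolis. Simplified stand-in for the paper's hierarchical MCMC.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(1)

# Simulated counts: occupancy probability psi, Poisson density lam.
psi_true, lam_true, n = 0.6, 3.0, 300
occupied = rng.random(n) < psi_true
y = np.where(occupied, rng.poisson(lam_true, n), 0)

def log_post(psi, lam, y):
    """ZIP log-likelihood with flat priors on psi in (0,1) and lam > 0."""
    if not (0 < psi < 1) or lam <= 0:
        return -np.inf
    p0 = (1 - psi) + psi * np.exp(-lam)          # P(y = 0)
    ll = np.where(y == 0, np.log(p0),
                  np.log(psi) + poisson.logpmf(y, lam))
    return ll.sum()

# Random-walk Metropolis over (psi, lam).
draws, cur = [], np.array([0.5, 1.0])
cur_lp = log_post(*cur, y)
for _ in range(20000):
    prop = cur + rng.normal(0, [0.05, 0.15])
    prop_lp = log_post(*prop, y)
    if np.log(rng.random()) < prop_lp - cur_lp:  # accept/reject step
        cur, cur_lp = prop, prop_lp
    draws.append(cur.copy())

post = np.array(draws[5000:])                    # discard burn-in
print("posterior means (psi, lam):", post.mean(axis=0))
```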
Abstract:
This note describes ParallelKnoppix, a bootable CD that allows econometricians with average knowledge of computers to create and begin using a high-performance computing cluster for parallel computing in very little time. The computers used may be heterogeneous machines, and clusters of up to 200 nodes are supported. When the cluster is shut down, all machines are in their original state, so their temporary use in the cluster does not interfere with their normal uses. An example shows how a Monte Carlo study of a bootstrap test procedure may be done in parallel. Using a cluster of 20 nodes, the example runs approximately 20 times faster than it does on a single computer.
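The serial core of such a study is an embarrassingly parallel loop over independent replications. A minimal sketch, with an illustrative statistic (a bootstrap t-test of a zero mean under the null) rather than the procedure actually studied in the note:

```python
# Minimal sketch: Monte Carlo study of a bootstrap test, serial version.
# The statistic and design are illustrative, not those of the note.
import numpy as np

def one_replication(seed, n=50, b=199):
    """One Monte Carlo replication: bootstrap t-test of H0: mean = 0."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n)                 # data generated under H0
    t_obs = x.mean() / (x.std(ddof=1) / np.sqrt(n))
    centered = x - x.mean()                    # impose the null for resampling
    t_boot = np.empty(b)
    for i in range(b):
        xb = rng.choice(centered, n, replace=True)
        t_boot[i] = xb.mean() / (xb.std(ddof=1) / np.sqrt(n))
    p = (np.abs(t_boot) >= abs(t_obs)).mean()  # two-sided bootstrap p-value
    return p < 0.05                            # reject at the 5% level?

rejections = [one_replication(s) for s in range(1000)]
print("empirical size:", np.mean(rejections))  # should be near 0.05
```

Because each replication depends only on its seed, the loop at the bottom is exactly the part a cluster can split across nodes.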
Abstract:
This paper shows how a high-level matrix programming language may be used to perform Monte Carlo simulation, bootstrapping, estimation by maximum likelihood and GMM, and kernel regression in parallel on symmetric multiprocessor computers or clusters of workstations. Parallelization is implemented so that an investigator can use the programs without any knowledge of parallel programming. A bootable CD that allows rapid creation of a cluster for parallel computing is introduced. Examples show that parallelization can lead to important reductions in computational time. A detailed discussion of how the Monte Carlo problem was parallelized is included as an example for learning to write parallel programs for Octave.
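A minimal sketch of the parallelization pattern, using Python's multiprocessing as a stand-in for the Octave/MPI tooling the paper describes: each worker receives an independent seed, runs one replication, and the master pools the results.

```python
# Minimal sketch: parallelizing a Monte Carlo loop. Stand-in for the
# Octave/MPI approach of the paper; pattern, not implementation.
from multiprocessing import Pool
import numpy as np

def one_replication(seed, n=50, b=199):
    """One replication of the bootstrap-test study from the previous sketch."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n)
    t_obs = x.mean() / (x.std(ddof=1) / np.sqrt(n))
    xc = x - x.mean()                            # impose the null
    t_boot = np.array([(s := rng.choice(xc, n)).mean() / (s.std(ddof=1) / np.sqrt(n))
                       for _ in range(b)])
    return (np.abs(t_boot) >= abs(t_obs)).mean() < 0.05

if __name__ == "__main__":                       # required on platforms that spawn
    with Pool(processes=8) as pool:              # near-linear speedup in workers
        rejections = pool.map(one_replication, range(1000))
    print("empirical size:", np.mean(rejections))
```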
Abstract:
Ever since the appearance of the ARCH model [Engle (1982a)], an impressive array of variance specifications belonging to the same class of models has emerged [e.g. Bollerslev's (1986) GARCH; Nelson's (1990) EGARCH]. This line of research has developed very successfully. Nevertheless, several empirical studies suggest that the performance of such models is not always adequate [Boulier (1992)]. In this paper we propose a new specification: the Quadratic Moving Average Conditional Heteroskedasticity (QMACH) model. Its statistical properties, such as the kurtosis and the symmetry, as well as two estimators (Method of Moments and Maximum Likelihood), are studied. Two statistical tests are presented: the first tests for homoskedasticity, and the second discriminates between the ARCH and QMACH specifications. A Monte Carlo study is presented in order to illustrate some of the theoretical results. An empirical study is undertaken for the DM-US exchange rate.
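The first of the two tests has a standard counterpart that is easy to sketch: Engle's LM test for homoskedasticity against ARCH, in which T·R² from a regression of squared residuals on their lags is asymptotically χ²(q) under the null. The QMACH-specific test is not reproduced here; the data-generating process below is an assumed ARCH(1).

```python
# Minimal sketch: Engle's LM test for ARCH effects, a standard version of
# a homoskedasticity test; not the paper's QMACH-specific procedure.
import numpy as np
from scipy.stats import chi2

def arch_lm_test(e, q=1):
    """LM test of H0: no ARCH effects in residuals e, using q lags."""
    e2 = e**2
    y = e2[q:]
    X = np.column_stack([np.ones(len(y))] +
                        [e2[q - j - 1:len(e2) - j - 1] for j in range(q)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    r2 = 1 - resid.var() / y.var()
    stat = len(y) * r2                  # T * R^2 ~ chi2(q) under H0
    return stat, chi2.sf(stat, q)

# Simulate an ARCH(1) process: the test should reject H0.
rng = np.random.default_rng(2)
T, omega, alpha = 2000, 0.2, 0.5
e = np.zeros(T)
for t in range(1, T):
    e[t] = rng.standard_normal() * np.sqrt(omega + alpha * e[t - 1]**2)
print(arch_lm_test(e, q=1))             # large statistic, tiny p-value
```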
Abstract:
Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the parameter. Since conditional moments are calculated using kernel smoothing rather than simple averaging, it is not necessary that the model be simulable subject to the conditioning information that is used to define the moment conditions. For this reason, the proposed estimator is applicable to general dynamic latent variable models. Monte Carlo results show that the estimator performs well in comparison to other estimators that have been proposed for estimation of general DLV models.
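A minimal sketch of the key ingredient: a Nadaraya-Watson kernel estimate of a conditional moment computed from one long simulation of a latent-variable model. The toy stochastic-volatility process, the conditioning variable, and the bandwidth are all assumptions for illustration; the paper's estimator would feed such smoothed moments into a method-of-moments criterion.

```python
# Minimal sketch: kernel-smoothed conditional moments from a long
# simulation of a latent-variable model. Toy model and bandwidth assumed.
import numpy as np

rng = np.random.default_rng(3)

def simulate(theta, T):
    """Toy stochastic-volatility model: y_t observed, h_t latent."""
    h = np.zeros(T)
    y = np.zeros(T)
    for t in range(1, T):
        h[t] = theta * h[t - 1] + 0.5 * rng.standard_normal()
        y[t] = np.exp(h[t] / 2) * rng.standard_normal()
    return y

def kernel_cond_moment(x_sim, y_sim, x_eval, bw):
    """Nadaraya-Watson estimate of E[y | x = x_eval], Gaussian kernel."""
    w = np.exp(-0.5 * ((x_eval[:, None] - x_sim[None, :]) / bw) ** 2)
    return (w * y_sim).sum(axis=1) / w.sum(axis=1)

y = simulate(theta=0.9, T=50000)
x_sim, y2_sim = np.abs(y[:-1]), y[1:] ** 2   # condition on |y_{t-1}|, moment y_t^2
grid = np.linspace(0.1, 2.0, 5)
print(kernel_cond_moment(x_sim, y2_sim, grid, bw=0.1))
```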
Abstract:
Assuming that the role of debt management is to provide hedging against fiscal shocks, we consider three questions: i) what indicators can be used to assess the performance of debt management? ii) how well have historical debt management policies performed? and iii) how is that performance affected by variations in debt issuance? We consider these questions using OECD data on the market value of government debt between 1970 and 2000. Motivated by both the optimal taxation literature and broad considerations of debt stability, we propose a range of performance indicators for debt management. We evaluate these using Monte Carlo analysis and find that those based on the relative persistence of debt perform best. Calculating these measures for OECD data provides only limited evidence that debt management has helped insulate policy against unexpected fiscal shocks. We also find that the degree of fiscal insurance achieved is not well connected to cross-country variations in debt issuance patterns. Given the limited volatility observed in the yield curve, the relatively small dispersion of debt management practices across countries makes little difference to the realised degree of fiscal insurance.
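One way to make a persistence-based indicator concrete, evaluated by Monte Carlo, is sketched below. The fiscal rule is an assumption invented for illustration: better fiscal insurance is modeled as a faster effective retirement rate of debt, which the fitted AR(1) coefficient of the debt series then picks up as lower persistence.

```python
# Illustrative sketch only: a persistence indicator for debt, evaluated by
# Monte Carlo. The fiscal rule and all parameters are invented assumptions.
import numpy as np

rng = np.random.default_rng(4)

def debt_path(T, kappa):
    """Debt absorbs a fiscal shock each period and is retired at rate kappa;
    better fiscal insurance is modeled (by assumption) as a higher kappa."""
    d = np.zeros(T)
    for t in range(1, T):
        d[t] = (1 - kappa) * d[t - 1] + rng.standard_normal()
    return d

def ar1_persistence(d):
    """OLS AR(1) coefficient of the debt series: the persistence indicator."""
    x, y = d[:-1] - d[:-1].mean(), d[1:] - d[1:].mean()
    return (x @ y) / (x @ x)

# Monte Carlo evaluation: the indicator separates the two regimes clearly.
for kappa in (0.02, 0.5):
    rhos = [ar1_persistence(debt_path(200, kappa)) for _ in range(500)]
    print(f"kappa={kappa}: mean persistence {np.mean(rhos):.2f}")
```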
Abstract:
BACKGROUND: Lipid-lowering therapy is costly but effective at reducing coronary heart disease (CHD) risk. OBJECTIVE: To assess the cost-effectiveness and public health impact of Adult Treatment Panel III (ATP III) guidelines and compare with a range of risk- and age-based alternative strategies. DESIGN: The CHD Policy Model, a Markov-type cost-effectiveness model. DATA SOURCES: National surveys (1999 to 2004), vital statistics (2000), the Framingham Heart Study (1948 to 2000), other published data, and a direct survey of statin costs (2008). TARGET POPULATION: U.S. population age 35 to 85 years. TIME HORIZON: 2010 to 2040. PERSPECTIVE: Health care system. INTERVENTION: Lowering of low-density lipoprotein cholesterol with HMG-CoA reductase inhibitors (statins). OUTCOME MEASURE: Incremental cost-effectiveness. RESULTS OF BASE-CASE ANALYSIS: Full adherence to ATP III primary prevention guidelines would require starting (9.7 million) or intensifying (1.4 million) statin therapy for 11.1 million adults and would prevent 20,000 myocardial infarctions and 10,000 CHD deaths per year at an annual net cost of $3.6 billion ($42,000/QALY) if low-intensity statins cost $2.11 per pill. The ATP III guidelines would be preferred over alternative strategies if society is willing to pay $50,000/QALY and statins cost $1.54 to $2.21 per pill. At higher statin costs, ATP III is not cost-effective; at lower costs, more liberal statin-prescribing strategies would be preferred; and at costs less than $0.10 per pill, treating all persons with low-density lipoprotein cholesterol levels greater than 3.4 mmol/L (>130 mg/dL) would yield net cost savings. RESULTS OF SENSITIVITY ANALYSIS: Results are sensitive to the assumptions that LDL cholesterol becomes less important as a risk factor with increasing age and that little disutility results from taking a pill every day. LIMITATION: Randomized trial evidence for statin effectiveness is not available for all subgroups. CONCLUSION: The ATP III guidelines are relatively cost-effective and would have a large public health impact if implemented fully in the United States. Alternate strategies may be preferred, however, depending on the cost of statins and how much society is willing to pay for better health outcomes. FUNDING: Flight Attendants' Medical Research Institute and the Swanson Family Fund. The Framingham Heart Study and Framingham Offspring Study are conducted and supported by the National Heart, Lung, and Blood Institute.
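A Markov-type cost-effectiveness model of this kind can be sketched in miniature. Everything below, apart from the $2.11 pill price quoted in the abstract, is an invented toy: three states (well, CHD, dead) with assumed transition probabilities, costs, and utilities, plus 3% discounting, just to show how an incremental cost per QALY emerges from such a model.

```python
# Toy three-state Markov cohort model in the spirit of the CHD Policy
# Model. All inputs except the $2.11 pill price are invented assumptions.
import numpy as np

def run_cohort(p_chd, cycles=30, pill_cost=0.0):
    """States: well, CHD, dead. Returns discounted (cost, QALYs) per person."""
    P = np.array([[1 - p_chd - 0.01, p_chd, 0.01],   # from well
                  [0.0,              0.93,  0.07],   # from CHD
                  [0.0,              0.0,   1.0]])   # dead is absorbing
    cost_state = np.array([pill_cost * 365, 5000.0, 0.0])  # cost per yearly cycle
    qaly_state = np.array([1.0, 0.8, 0.0])                  # utility per cycle
    dist = np.array([1.0, 0.0, 0.0])                        # cohort starts well
    cost = qaly = 0.0
    for t in range(cycles):
        disc = 1.03 ** -t                            # 3% annual discounting
        cost += disc * dist @ cost_state
        qaly += disc * dist @ qaly_state
        dist = dist @ P                              # advance the cohort one cycle
    return cost, qaly

c0, q0 = run_cohort(p_chd=0.020)                     # no statin
c1, q1 = run_cohort(p_chd=0.014, pill_cost=2.11)     # statin lowers CHD risk
print(f"ICER: ${(c1 - c0) / (q1 - q0):,.0f} per QALY")
```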
Abstract:
Background: Over the last two decades, mortality from coronary heart disease (CHD) and cerebrovascular disease (CVD) declined by about 30% in the European Union (EU). Design: We analyzed trends in CHD (ICD-10 codes: I20-I25) and CVD (ICD-10 codes: I60-I69) mortality in young adults (age 35-44 years) in the EU as a whole and in 12 selected European countries, over the period 1980-2007. Methods: Data were derived from the World Health Organization mortality database. With joinpoint regression analysis, we identified significant changes in trends and estimated average annual percent changes (AAPC). Results: CHD mortality rates at ages 35-44 years have decreased in both sexes since the 1980s for most countries, except for Russia (130/100,000 men and 24/100,000 women, in 2005-7). The lowest rates (around 9/100,000 men, 2/100,000 women) were in France, Italy and Sweden. In men, the steepest declines in mortality were in the Czech Republic (AAPC = -6.1%), the Netherlands (-5.2%), Poland (-4.5%), and England and Wales (-4.5%). Patterns were similar in women, though with appreciably lower rates. The AAPC in the EU was -3.3% for men (rate = 16.6/100,000 in 2005-7) and -2.1% for women (rate = 3.5/100,000). For CVD, Russian rates in 2005-7 were 40/100,000 men and 16/100,000 women, 5- to 10-fold higher than in most western European countries. The steepest declines were in the Czech Republic and Italy for men, and in Sweden and the Czech Republic for women. The AAPC in the EU was -2.5% in both sexes, with steeper declines after the mid-to-late 1990s (rates = 6.4/100,000 men and 4.3/100,000 women in 2005-7). Conclusions: CHD and CVD mortality steadily declined in Europe, except in Russia, whose rates were 10- to 15-fold higher than those of France, Italy or Sweden. Hungary and Poland, together with Scotland, where CHD trends were less favourable than in other western European countries, also emerge as priorities for preventive interventions.
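The AAPC reported here comes from joinpoint regression: a log-linear fit of rates on calendar year within each segment gives an annual percent change (APC) of 100·(e^b − 1), and the AAPC is the segment-length-weighted average of the segment slopes. A minimal sketch with made-up rates and fixed joinpoints (a real joinpoint analysis also searches for the break years):

```python
# Minimal sketch: APC and AAPC computations of the kind joinpoint
# regression reports. Rates and the joinpoint location are made up.
import numpy as np

def apc(years, rates):
    """APC = 100*(exp(b) - 1), b the slope of log(rate) on year."""
    b = np.polyfit(years, np.log(rates), 1)[0]
    return 100.0 * (np.exp(b) - 1.0)

def aapc(segments):
    """AAPC: weight each segment's log-slope by its length in years."""
    slopes = [(np.polyfit(y, np.log(r), 1)[0], y[-1] - y[0]) for y, r in segments]
    w = sum(n for _, n in slopes)
    return 100.0 * (np.exp(sum(b * n for b, n in slopes) / w) - 1.0)

years1 = np.arange(1980, 1996); rates1 = 30 * 0.98 ** (years1 - 1980)
years2 = np.arange(1995, 2008); rates2 = rates1[-1] * 0.95 ** (years2 - 1995)
print(f"APC 1980-95:   {apc(years1, rates1):+.1f}%")
print(f"AAPC 1980-2007: {aapc([(years1, rates1), (years2, rates2)]):+.1f}%")
```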
Abstract:
Least squares estimators are notorious for generating sub-optimal exercise decisions when determining the optimal stopping time. The consequence is that the price of the option is underestimated. We show how variance-reduction methods can be implemented to obtain more accurate option prices. We also extend the Longstaff and Schwartz (2001) method to price American options under stochastic volatility. These are two important contributions that are particularly relevant for practitioners. Finally, we extend the Glasserman and Yu (2004b) methodology to price Asian options and basket options.
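A minimal sketch of the baseline these extensions build on: Longstaff-Schwartz least-squares Monte Carlo for an American put, with antithetic variates as a simple variance-reduction device. The parameters and the quadratic regression basis are illustrative assumptions; the paper's stochastic-volatility and Glasserman-Yu extensions are not reproduced.

```python
# Minimal sketch: Longstaff-Schwartz least-squares Monte Carlo for an
# American put with antithetic variates. Parameters and basis assumed.
import numpy as np

rng = np.random.default_rng(5)
S0, K, r, sigma, T, steps, n = 100.0, 100.0, 0.05, 0.2, 1.0, 50, 20000
dt = T / steps
disc = np.exp(-r * dt)

# Antithetic pairs: concatenate z and -z so path noise partially cancels.
z = rng.standard_normal((n // 2, steps))
z = np.vstack([z, -z])
S = S0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z,
                          axis=1))

payoff = np.maximum(K - S[:, -1], 0.0)           # cash flow if held to expiry
for t in range(steps - 2, -1, -1):
    payoff *= disc                               # discount back one step
    itm = K - S[:, t] > 0                        # regress only in-the-money paths
    X = S[itm, t]
    coef = np.polyfit(X, payoff[itm], 2)         # continuation ~ quadratic in S
    cont = np.polyval(coef, X)
    exercise = (K - X) > cont                    # exercise if immediate > continuation
    payoff[itm] = np.where(exercise, K - X, payoff[itm])

price = disc * payoff.mean()                     # discount the first step to time 0
print(f"American put ~ {price:.3f}")
```

The regression-then-compare loop is exactly where the abstract's concern arises: an imperfect regression basis yields sub-optimal exercise decisions and hence a downward-biased price, which variance reduction and better bases mitigate.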