990 results for Dynamic Panel Estimations
Abstract:
BACKGROUND: The quality of colon cleansing is a major determinant of the quality of colonoscopy. To our knowledge, the impact of bowel preparation on the quality of colonoscopy has not been assessed prospectively in a large multicenter study. Therefore, this study assessed the factors that determine colon-cleansing quality and the impact of cleansing quality on the technical performance and diagnostic yield of colonoscopy. METHODS: Twenty-one centers from 11 countries participated in this prospective observational study. Colon-cleansing quality was assessed on a 5-point scale and was categorized on 3 levels. The clinical indication for colonoscopy, diagnoses, and technical parameters related to colonoscopy were recorded. RESULTS: A total of 5832 patients were included in the study (48.7% men; mean age 57.6 [SD 15.9] years). Cleansing quality was lower in elderly patients and in hospitalized patients. Procedures in poorly prepared patients were longer, more difficult, and more often incomplete. The detection of polyps of any size depended on cleansing quality: odds ratio (OR) 1.73 (95% confidence interval [CI] 1.28-2.36) for intermediate-quality compared with low-quality preparation, and OR 1.46 (95% CI 1.11-1.93) for high-quality compared with low-quality preparation. For polyps >10 mm in size, the corresponding ORs were 1.0 for low-quality cleansing, 1.83 (95% CI 1.11-3.05) for intermediate-quality cleansing, and 1.72 (95% CI 1.11-2.67) for high-quality cleansing. Cancers were not detected less frequently in the case of poor preparation. CONCLUSIONS: Cleansing quality critically determines the quality, difficulty, speed, and completeness of colonoscopy, and is lower in hospitalized patients and in patients with higher levels of comorbid conditions. The proportion of patients who undergo polypectomy increases with higher cleansing quality, whereas colon cancer detection does not seem to depend critically on the quality of bowel preparation.
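The odds ratios and confidence intervals reported above follow the standard 2x2-table construction. As a minimal sketch with made-up counts (not the study's data), an OR and its 95% CI can be computed like this:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: polyp detected vs. not, by preparation quality.
or_, lo, hi = odds_ratio_ci(300, 700, 200, 800)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # 1.71 1.4 2.11
```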
Abstract:
Many multivariate methods that are apparently distinct can be linked by introducing one or more parameters in their definition. Methods that can be linked in this way are correspondence analysis, unweighted or weighted logratio analysis (the latter also known as "spectral mapping"), nonsymmetric correspondence analysis, principal component analysis (with and without logarithmic transformation of the data) and multidimensional scaling. In this presentation I will show how several of these methods, which are frequently used in compositional data analysis, may be linked through parametrizations such as power transformations, linear transformations and convex linear combinations. Since the methods of interest here all lead to visual maps of data, a "movie" can be made where the linking parameter is allowed to vary in small steps: the results are recalculated "frame by frame" and one can see the smooth change from one method to another. Several of these "movies" will be shown, giving a deeper insight into the similarities and differences between these methods.
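One concrete example of such a linking parameter (an illustrative sketch, not taken from the presentation itself) is the Box-Cox power transformation, which connects analyses of power-transformed data to logratio analysis: as the power alpha shrinks toward 0, (x^alpha - 1)/alpha converges to ln x, so sliding alpha in small steps morphs one map smoothly into the other.

```python
import math

def box_cox(x, alpha):
    """Box-Cox power transform; tends to ln(x) as alpha -> 0."""
    return (x**alpha - 1) / alpha

# Sliding the linking parameter toward zero approaches the log transform.
for alpha in (1.0, 0.5, 0.1, 0.001):
    print(alpha, box_cox(2.0, alpha))

print("limit:", math.log(2.0))
```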
Abstract:
This paper examines factors explaining subcontracting decisions in the construction industry. Rather than relying on the more common cross-sectional analyses, we use panel data to evaluate the influence of all relevant variables. We design and use a new index of the closeness to small-numbers situations to estimate the extent of hold-up problems. Results show that as specificity grows, firms tend to subcontract less. The opposite happens when output heterogeneity and the use of intangible assets and capabilities increase. Neither temporary shortage of capacity nor geographical dispersion of activities seems to affect the extent of subcontracting. Finally, proxies for uncertainty do not show any clear effect.
Abstract:
We study the interaction between insurance and capital markets within a single but general framework. We show that capital markets greatly enhance the risk-sharing capacity of insurance markets and the scope of risks that are insurable, because efficiency does not depend on the number of agents at risk, on risks being independent, or on the preferences and endowments of agents at risk being the same. We show that agents share risks by buying full coverage for their individual risks and provide insurance capital through stock markets. We show that aggregate risk enters private insurance as a positive loading on insurance prices and that, despite this, agents will buy full coverage. The loading is determined by the risk premium of investors in the stock market and hence does not depend on the agents' willingness to pay. Agents provide insurance capital by trading an equally weighted portfolio of insurance company shares and a riskless asset. We are able to construct agents' optimal trading strategies explicitly and for very general preferences.
Abstract:
The second meeting of the panel of experts continues its studies of fishery resources, taking the Peruvian anchoveta as its main topic. This research builds on the work carried out the previous year at the panel's first meeting.
Abstract:
The third session of the panel of experts reviews the oceanographic conditions between 1971 and 1972 in relation to the anchoveta fishery.
Abstract:
A review is presented of developments in the fisheries since the previous meeting of the Panel in July 1972 and of what is thought to have taken place in the resource and its environment; several lines of stock research are indicated. Collateral evidence about the stocks and their environment is summarized; it relates to the El Niño phenomenon, the guano birds, the increase in sardine catches, the fat content of the anchoveta, and its maturation stages. Each of these lines of evidence indicates that special conditions have recently prevailed in the anchoveta's environment, in the associated biota, and in certain aspects of its physiology. A detailed review was made of several types of research on the anchoveta populations and of the evidence about the state of the resource that can be derived from such research. The evidence obtained from the use of acoustic equipment in studying the distribution of the stocks and estimating their abundance was examined; the work was carried out through surveys by several vessels (Eureka surveys) and special surveys by the research vessel SNP-1; echo-integration equipment was also used. The sources of systematic error in that work are discussed. It is concluded that this work indicated that around 4 million tons of anchoveta existed in February 1973.
Abstract:
In the report of its fourth meeting, the Panel of Experts on Dynamics reviewed the data provided by the Instituto del Mar del Perú on the current state of the anchoveta stock. These data clearly showed that the anchoveta stock was in a critical situation at the beginning of 1973. The best estimates indicated that the largest catch that could be taken in 1973 would be around … million tons, but that it would be advisable to keep catches well below this level, especially before the main spawning season in August-September. Recruitment in 1974 could be very low, especially if the spawning stock continued to be depleted by fishing, so that catches in 1974 could be far below average even if there were no restrictions on fishing.
Abstract:
In models where privately informed agents interact, agents may need to form higher order expectations, i.e. expectations of other agents' expectations. This paper develops a tractable framework for solving and analyzing linear dynamic rational expectations models in which privately informed agents form higher order expectations. The framework is used to demonstrate that the well-known problem of the infinite regress of expectations identified by Townsend (1983) can be approximated to an arbitrary accuracy with a finite dimensional representation under quite general conditions. The paper is constructive and presents a fixed point algorithm for finding an accurate solution and provides weak conditions that ensure that a fixed point exists. To help intuition, Singleton's (1987) asset pricing model with disparately informed traders is used as a vehicle for the paper.
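The fixed point algorithm itself is not reproduced here, but the generic scheme such solvers follow can be sketched on a scalar toy problem (purely illustrative, with made-up dynamics): apply the update map repeatedly until successive solutions stop changing.

```python
def solve_fixed_point(update, x0, tol=1e-12, max_iter=10_000):
    """Iterate x <- update(x) until convergence (assumes a contraction)."""
    x = x0
    for _ in range(max_iter):
        x_new = update(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    raise RuntimeError("fixed point iteration did not converge")

# Toy update map x -> a*x*a + q with |a| < 1, mimicking how a candidate
# law of motion is updated until it reproduces itself.
a, q = 0.9, 1.0
p = solve_fixed_point(lambda x: a * x * a + q, 0.0)
print(p)  # close to q / (1 - a*a), i.e. about 5.263
```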
Abstract:
Presents the results of the Panel's work, following the sequence established in the terms of reference: estimate the status of the Peruvian hake stock; provide projections of yield and stock development under a fishing scenario appropriate for the Peruvian hake stock; and comment on appropriate short- and medium-term management measures for the Peruvian hake.
Abstract:
Evaluates the results of the Panel's work according to the terms of reference: update the assessment and estimate the status of the Peruvian hake stock; assess the spawning stock and the maturity ogive; propose a quota for 2004; review the survey data to determine whether hake exist outside the sampled geographic area; and provide projections of yield and stock development under fishing regimes appropriate for the Peruvian hake.
Abstract:
Studies the information on the biological processes of the anchoveta and its population dynamics in the face of environmental variability, within an ecosystem-based framework, in order to characterize the current role of the anchoveta in the upwelling ecosystem off Peru and to provide further elements for the sustainable development of its fishery.
Abstract:
This paper demonstrates that, contrary to conventional wisdom, measurement error biases in panel data estimation of convergence using OLS with fixed effects are huge, not trivial. It does so by way of the "skipping estimation": taking data from every m years of the sample (where m is an integer greater than or equal to 2), as opposed to every single year. It is shown that the estimated speed of convergence from OLS with fixed effects is biased upwards by as much as 7 to 15%.
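The "skipping" sampling scheme can be sketched on synthetic data (an illustration of the scheme only, not the paper's actual estimator or dataset): sample the panel every m years and run within (fixed-effects) OLS of current on m-years-lagged log income.

```python
import random
random.seed(0)

def simulate_panel(n_units=200, T=41, rho=0.95, sd=0.1):
    """Synthetic panel: log income follows an AR(1) around a unit fixed effect."""
    panel = []
    for _ in range(n_units):
        mu = random.gauss(0.0, 1.0)
        y = [mu + random.gauss(0.0, sd)]
        for _ in range(T - 1):
            y.append(mu + rho * (y[-1] - mu) + random.gauss(0.0, sd))
        panel.append(y)
    return panel

def fe_slope(panel, m):
    """Within (fixed-effects) OLS of y_t on y_{t-m}, using every m-th year."""
    xs, ys = [], []
    for y in panel:
        s = y[::m]                       # the "skipping" step
        x_i, y_i = s[:-1], s[1:]
        mx, my = sum(x_i) / len(x_i), sum(y_i) / len(y_i)
        xs += [v - mx for v in x_i]      # demeaning removes the fixed effect
        ys += [v - my for v in y_i]
    return sum(a * b for a, b in zip(xs, ys)) / sum(a * a for a in xs)

panel = simulate_panel()
for m in (1, 5):
    b = fe_slope(panel, m)
    print(m, b)   # implied convergence speed per year: -ln(b) / m
```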
Abstract:
Game theory is a branch of applied mathematics used to analyze situations where two or more agents are interacting. Originally it was developed as a model for conflicts and collaborations between rational and intelligent individuals. Now it finds applications in social sciences, economics, biology (particularly evolutionary biology and ecology), engineering, political science, international relations, computer science, and philosophy. Networks are an abstract representation of interactions, dependencies or relationships. Networks are extensively used in all the fields mentioned above and in many more. Much useful information about a system can be discovered by analyzing the current state of a network representation of that system. In this work we will apply some of the methods of game theory to populations of agents that are interconnected. A population is in fact represented by a network of players, where one player can only interact with another if there is a connection between them. In the first part of this work we will show that the structure of the underlying network has a strong influence on the strategies that the players adopt to maximize their utility. We will then introduce a supplementary degree of freedom by allowing the structure of the population to be modified during the simulations. This modification allows the players to change the structure of their environment to optimize the utility that they can obtain.
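A minimal sketch of the kind of simulation described above (a hypothetical ring network, standard Prisoner's Dilemma payoffs, and a simple imitate-the-best update rule; the work's actual models and parameters may differ):

```python
import random
random.seed(1)

N = 20
# Ring network: player i is connected to its two nearest neighbours.
neighbours = {i: [(i - 1) % N, (i + 1) % N] for i in range(N)}
# Prisoner's Dilemma payoffs: (my move, opponent's move) -> my payoff.
PAYOFF = {('C', 'C'): 3, ('C', 'D'): 0, ('D', 'C'): 5, ('D', 'D'): 1}

strategy = {i: random.choice('CD') for i in range(N)}

def total_payoff(i, strat):
    """Sum of payoffs from playing each neighbour once."""
    return sum(PAYOFF[(strat[i], strat[j])] for j in neighbours[i])

def imitate_best(strat):
    """Each player copies the best-scoring player in its closed neighbourhood."""
    scores = {i: total_payoff(i, strat) for i in strat}
    return {i: strat[max([i] + neighbours[i], key=scores.get)]
            for i in strat}

for _ in range(10):
    strategy = imitate_best(strategy)
print(sum(1 for s in strategy.values() if s == 'C'), "cooperators remain")
```

Changing `neighbours` to a different topology is all it takes to study the influence of network structure on the strategies that survive.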