977 results for Instrumental-variable Methods


Relevance:

80.00%

Publisher:

Abstract:

Peer effects in adolescent cannabis use are difficult to estimate, due in part to the lack of appropriate data on behaviour and social ties. This paper exploits survey data that have many desirable properties and have not previously been used for this purpose. The data set, collected from teenagers in three annual waves from 2002 to 2004, contains longitudinal information about friendship networks within schools (N = 5,020). We exploit these data on network structure to estimate peer effects on adolescents from their nominated friends within school, using two alternative approaches to identification. First, we present a cross-sectional instrumental variable (IV) estimate of peer effects that exploits network structure at the second degree, i.e. using information on friends of friends who are not themselves ego’s friends to instrument for the cannabis use of friends. Second, we present an individual fixed effects estimate of peer effects using the full longitudinal structure of the data. Both innovations allow a greater degree of control for correlated effects than is commonly the case in the substance-use peer effects literature, improving our chances of obtaining estimates of peer effects that can be plausibly interpreted as causal. Both estimates suggest positive peer effects of non-trivial magnitude, although the IV estimate is imprecise. Furthermore, when we specify identical models with the behaviour and characteristics of randomly selected school peers in place of friends’, we find effectively zero effect from these ‘placebo’ peers, lending credence to our main estimates. We conclude that cross-sectional data can be used to estimate plausible positive peer effects on cannabis use where network structure information is available and appropriately exploited.
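The friends-of-friends identification described above amounts to a two-stage least squares (2SLS) design. Below is a minimal sketch of that logic, with hypothetical column names (own_use, friends_use, fof_use for the friends-of-friends instrument, plus controls); it illustrates the estimator in general, not the paper's actual code, and the naive second-stage standard errors would need the usual 2SLS correction.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical data frame: one row per adolescent.
# own_use      - ego's cannabis use
# friends_use  - mean cannabis use of nominated friends (endogenous regressor)
# fof_use      - mean use of friends-of-friends who are not ego's friends (instrument)
df = pd.read_csv("network_panel.csv")   # hypothetical file
controls = ["age", "female"]            # hypothetical control set

# First stage: predict friends' use from the friends-of-friends instrument.
X1 = sm.add_constant(df[["fof_use"] + controls])
first_stage = sm.OLS(df["friends_use"], X1).fit()
df["friends_use_hat"] = first_stage.fittedvalues

# Second stage: regress own use on the fitted (exogenous) part of friends' use.
# NB: a dedicated 2SLS routine (e.g. linearmodels.iv.IV2SLS) would correct the
# standard errors; this manual version only reproduces the point estimate.
X2 = sm.add_constant(df[["friends_use_hat"] + controls])
second_stage = sm.OLS(df["own_use"], X2).fit()
print(first_stage.params["fof_use"], second_stage.params["friends_use_hat"])
```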

Relevance:

80.00%

Publisher:

Abstract:

A mediator is a dependent variable, m (e.g., charisma), that is thought to channel the effect of an independent variable, x (e.g., receiving training or not), on another dependent variable, y (e.g., subordinate satisfaction). In experimental settings x is manipulated (subjects are randomized to treatment) to isolate the causal effect of x on other variables. If m is not or cannot be manipulated, which is often the case, its causal effect on other variables cannot be determined; thus, standard mediation tests cannot inform policy or practice. I will show how an econometric procedure, called instrumental-variable estimation, can examine mediation in such cases.
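One way to make the idea concrete is the just-identified case in which the randomized treatment x serves as the instrument for the unmanipulated mediator m. Under the exclusion restriction that x affects y only through m, the IV (Wald) estimate of the mediator's effect is the ratio of two reduced-form effects; this is a standard textbook formulation rather than a detail taken from the article:

\[
\hat{\beta}_{IV} \;=\; \frac{\widehat{\operatorname{Cov}}(x,\,y)}{\widehat{\operatorname{Cov}}(x,\,m)}
\;=\; \frac{\text{effect of } x \text{ on } y}{\text{effect of } x \text{ on } m}.
\]

For example (made-up numbers), a training manipulation that raises charisma by 0.5 points and subordinate satisfaction by 0.2 points implies an IV estimate of 0.2 / 0.5 = 0.4 for the effect of charisma on satisfaction.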

Relevance:

80.00%

Publisher:

Abstract:

The purpose of this paper is to examine the role played by built heritage and cultural environments, alongside other locational factors, in explaining the growth of human capital in Sweden. We distinguish between urban, natural and cultural qualities as different sources of regional attractiveness and estimate their influence on the observed growth of individuals with at least three years of higher education during 2001–2010. Neighborhood-level data are used, and unobserved heterogeneity and spatial dependencies are modeled by employing random effects estimations and an instrumental variable approach. Our findings indicate that the local supply of built heritage and cultural environments explains a significant part of human capital growth in Sweden. The results suggest that these types of cultural heritage are important place-based resources with the potential to contribute to improved regional attractiveness and growth.

Relevance:

80.00%

Publisher:

Abstract:

Methodology: an instrumental-variable quantile regression model for panel data using the partial production function.

Relevance:

80.00%

Publisher:

Abstract:

This paper analyses the determinants of the intermediation margin in the Colombian financial system between 1989 and 2003. Using a dynamic estimation of the effects generated by activity-specific variables, taxes and market structure, it tracks the financial intermediation margin over a period that includes both liberalization and crisis.

Relevance:

80.00%

Publisher:

Abstract:

What are the effects of war on political behaviour? Colombia is an interesting case in which conflict and elections coexist and illegal armed groups intentionally affect electoral outcomes. The groups, however, use different strategies to alter those outcomes. This article argues that the differential effects of violence on electoral outcomes are the result of deliberate strategies of the illegal groups, which in turn follow from military conditions that differ between them. Using panel data from the Senate elections of 1994 to 2006 and an instrumental-variables approach to address potential endogeneity problems, the article shows that guerrilla violence decreases electoral turnout, whereas paramilitary violence has no effect on turnout but reduces electoral competition and benefits new, non-traditional parties. This is consistent with the hypothesis that the guerrillas' strategy is to sabotage elections, while the paramilitaries establish alliances with particular candidates.

Relevance:

80.00%

Publisher:

Abstract:

Considering different perspectives, this thesis investigates how to improve healthcare resource allocation and provision efficiency for hip surgery, a resource-intensive operation that is among the most frequently performed on the elderly and whose volume is increasing over time owing to population ageing. Firstly, the effect of Time-To-Surgery (TTS) on mortality for hip fracture patients is investigated. The analysis attempts to account for TTS endogeneity due to the inability to fully control for variables affecting patient delay, such as patient severity. Exploiting an instrumental variable model, in which being admitted on a Friday or Saturday predicts a longer TTS, the findings show that exogenous TTS does not have a significant effect on mortality, suggesting that surgeons prioritize patients effectively and thereby neutralize the adverse impact of longer TTS. Then, the volume-outcome relation for total hip replacement surgery is analyzed, seeking to account for selective referral, which may be present in the elective surgery context and induce a reverse-causality problem in the volume-outcome relation. The analysis employs a conditional choice model in which patient travel distance from all the regions' hospitals is used as a predictor of hospital choice. Findings show that exogenous hospital volume significantly decreases the probability of adverse outcomes, especially in the short run. Finally, the change in public procurement design enforced in the Romagna LHA (Italy) is exploited to assess its impact on hip prosthesis costs, surgeons' implant choices, and patient health outcomes. Hip prostheses are the major cost driver of hip replacement surgery; hence it is crucial to design the public tender so that implant prices are minimized, but cost-containment policies have to be weighed against patient well-being. Evidence shows that a cost reduction occurred without a significant impact on surgeons' choices. A positive or null effect of surgeon specialization on patient outcomes is found after the introduction of the new procurement design.
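The admission-day instrument lends itself to a simple first-stage check. The sketch below constructs a Friday/Saturday admission dummy and regresses time-to-surgery on it; the file and column names (admission_date, tts_hours) are hypothetical, and the real analysis would add patient controls and a proper 2SLS second stage.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical patient-level data with admission timestamps and time-to-surgery.
df = pd.read_csv("hip_fracture_admissions.csv", parse_dates=["admission_date"])

# Instrument: admitted on Friday (weekday 4) or Saturday (weekday 5), which
# mechanically pushes surgery past the weekend and lengthens TTS.
df["weekend_adm"] = df["admission_date"].dt.dayofweek.isin([4, 5]).astype(int)

# First stage: does the instrument actually shift TTS?
first_stage = sm.OLS(df["tts_hours"],
                     sm.add_constant(df[["weekend_adm"]])).fit()
print(first_stage.summary())  # inspect the weekend_adm coefficient and its significance
```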

Relevance:

40.00%

Publisher:

Abstract:

Negative-ion mode electrospray ionization, ESI(-), with Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR MS) was coupled with Partial Least Squares (PLS) regression and variable selection methods to estimate the total acid number (TAN) of Brazilian crude oil samples. Generally, ESI(-)-FT-ICR mass spectra present a resolving power of ca. 500,000 and a mass accuracy of less than 1 ppm, producing a data matrix containing over 5700 variables per sample. These variables correspond to heteroatom-containing species detected as deprotonated molecules, [M - H](-) ions, which are identified primarily as naphthenic acids, phenols and carbazole analog species. The TAN values for all samples ranged from 0.06 to 3.61 mg of KOH g(-1). To facilitate the spectral interpretation, three methods of variable selection were studied: variable importance in the projection (VIP), interval partial least squares (iPLS) and elimination of uninformative variables (UVE). The UVE method seems to be the most appropriate for selecting important variables, reducing the number of variables to 183 and producing a root mean square error of prediction of 0.32 mg of KOH g(-1). By reducing the size of the data, it was possible to relate the selected variables to their corresponding molecular formulas, thus identifying the main chemical species responsible for the TAN values.
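To illustrate the kind of pipeline described, here is a minimal sketch of fitting a PLS regression to a peak-intensity matrix and ranking variables by VIP scores (one of the three selection criteria mentioned); the input files and the number of latent variables are assumptions, and the UVE and iPLS steps are not shown.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Hypothetical inputs: X is the matrix of assigned-peak intensities
# (samples x ~5700 molecular formulas), y is the measured TAN (mg KOH/g).
X = np.load("peak_intensities.npy")
y = np.load("tan_values.npy")

pls = PLSRegression(n_components=5)   # number of latent variables is an assumption
pls.fit(X, y)

# Variable Importance in the Projection (VIP) scores.
W = pls.x_weights_        # (n_features, n_components)
T = pls.x_scores_         # (n_samples, n_components)
Q = pls.y_loadings_       # (1, n_components)
ss = (T ** 2).sum(axis=0) * (Q ** 2).ravel()    # y-variance explained per component
w_norm = (W / np.linalg.norm(W, axis=0)) ** 2
vip = np.sqrt(X.shape[1] * (w_norm @ ss) / ss.sum())

top = np.argsort(vip)[::-1][:20]      # indices of the 20 most informative peaks
print(top, vip[top])
```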

Relevance:

40.00%

Publisher:

Abstract:

In this work we perform a comparison of two different numerical schemes for the solution of the time-fractional diffusion equation with a variable diffusion coefficient and a nonlinear source term. The two methods are the implicit numerical scheme presented in [M.L. Morgado, M. Rebelo, Numerical approximation of distributed order reaction-diffusion equations, Journal of Computational and Applied Mathematics 275 (2015) 216-227], adapted to our type of equation, and a collocation method in which Chebyshev polynomials are used to reduce the fractional differential equation to a system of ordinary differential equations.
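For orientation, the equation class in question can be written schematically as below, assuming a Caputo time derivative of order 0 < α < 1; the exact forms of the diffusion coefficient d and source f studied in the work are not specified here.

\[
{}^{C}\!D_t^{\alpha}\,u(x,t) \;=\; \frac{\partial}{\partial x}\!\left( d(x,t)\,\frac{\partial u(x,t)}{\partial x} \right) + f\big(x,t,u(x,t)\big), \qquad 0<\alpha<1,
\]

where \({}^{C}\!D_t^{\alpha}\) denotes the Caputo fractional derivative,

\[
{}^{C}\!D_t^{\alpha} u(x,t) \;=\; \frac{1}{\Gamma(1-\alpha)}\int_0^t (t-s)^{-\alpha}\,\frac{\partial u(x,s)}{\partial s}\,ds .
\]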

Relevance:

40.00%

Publisher:

Abstract:

"Most quantitative empirical analyses are motivated by the desire to estimate the causal effect of an independent variable on a dependent variable. Although the randomized experiment is the most powerful design for this task, in most social science research done outside of psychology, experimental designs are infeasible. (Winship & Morgan, 1999, p. 659)." This quote from earlier work by Winship and Morgan, which was instrumental in setting the groundwork for their book, captures the essence of our review of Morgan and Winship's book: It is about causality in nonexperimental settings.

Relevance:

40.00%

Publisher:

Abstract:

Fluid handling systems account for a significant share of the global consumption of electrical energy. They also suffer from problems that reduce their energy efficiency and increase life-cycle costs. Detecting or predicting these problems in time can make fluid handling systems more environmentally and economically sustainable to operate. In this Master’s Thesis, significant problems in fluid systems were studied and the possibilities for developing variable-speed-drive-based detection methods for them were discussed. A literature review was conducted to find significant problems occurring in fluid handling systems containing pumps, fans and compressors. To find case examples for evaluating the feasibility of variable-speed-drive-based methods, queries were sent to industrial companies. As a result, the possibility of detecting heat exchanger fouling with a variable-speed drive was analysed with data from three industrial cases. It was found that a mass flow rate estimate, which can be generated with a variable-speed drive, can be used together with temperature measurements to monitor a heat exchanger’s thermal performance. Secondly, it was found that the fouling-related increase in the pressure drop of a heat exchanger can be monitored with a variable-speed drive. Lastly, for systems where the flow device is speed-controlled based on a pressure measurement, it was concluded that an increasing rotational speed can be interpreted as progressing fouling in the heat exchanger.
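A minimal sketch of the first monitoring idea (drive-estimated mass flow plus terminal temperature measurements) is given below; the variable names, the constant specific heat and the use of a log-mean temperature difference are assumptions made for illustration, not details taken from the thesis.

```python
import numpy as np

def heat_exchanger_health(m_dot, cp, t_hot_in, t_hot_out, t_cold_in, t_cold_out):
    """Return heat duty Q [W] and an overall UA estimate [W/K] for one operating point.

    m_dot : hot-side mass flow rate estimated by the variable-speed drive [kg/s]
    cp    : specific heat of the hot-side fluid [J/(kg K)] (assumed constant)
    t_*   : measured terminal temperatures [degC], counter-current arrangement assumed
    """
    q = m_dot * cp * (t_hot_in - t_hot_out)            # heat duty from the hot side
    dt1 = t_hot_in - t_cold_out
    dt2 = t_hot_out - t_cold_in
    lmtd = dt1 if np.isclose(dt1, dt2) else (dt1 - dt2) / np.log(dt1 / dt2)
    ua = q / lmtd                                      # a declining UA trend suggests fouling
    return q, ua

# Example operating point (made-up numbers); in practice UA would be trended over weeks.
print(heat_exchanger_health(m_dot=12.0, cp=4180.0,
                            t_hot_in=80.0, t_hot_out=60.0,
                            t_cold_in=30.0, t_cold_out=45.0))
```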

Relevance:

40.00%

Publisher:

Abstract:

The study of variable stars is an important topic in modern astrophysics. Since the advent of powerful telescopes and high-resolution CCDs, variable star data have been accumulating on the order of petabytes. Such huge amounts of data require automated methods as well as human experts. This thesis is devoted to the analysis of variable star time series data and hence belongs to the interdisciplinary field of astrostatistics.

For an observer on Earth, stars whose apparent brightness changes over time are called variable stars. The variation in brightness may be regular (periodic), quasi-periodic (semi-periodic) or irregular (aperiodic), and it has various causes. In some cases the variation is due to internal thermonuclear processes; such stars are generally known as intrinsic variables. In other cases it is due to external processes, such as eclipses or rotation; these stars are known as extrinsic variables. Intrinsic variables can be further grouped into pulsating variables, eruptive variables and flare stars. Extrinsic variables are grouped into eclipsing binary stars and chromospheric stars. Pulsating variables can again be classified into Cepheid, RR Lyrae, RV Tauri, Delta Scuti, Mira, etc. The eruptive or cataclysmic variables, such as novae and supernovae, occur rarely and are not periodic phenomena. Most of the other variations are periodic in nature.

Variable stars can be observed in many ways, such as photometry, spectrophotometry and spectroscopy. A sequence of photometric observations of a variable star produces time series data containing time, magnitude and error. The plot of a variable star’s apparent magnitude against time is known as a light curve. If the time series is folded on a period, the plot of apparent magnitude against phase is known as a phased light curve. The unique shape of the phased light curve is characteristic of each type of variable star. One way to identify the type of a variable star and to classify it is for an expert to inspect the phased light curve visually. For the last several years, automated algorithms have been used to classify groups of variable stars with the help of computers.

Research on variable stars can be divided into stages such as observation, data reduction, data analysis, modeling and classification. Modeling of variable stars helps to determine their short-term and long-term behaviour, to construct theoretical models (e.g. the Wilson-Devinney model for eclipsing binaries) and to derive stellar properties such as mass, radius, luminosity, temperature, internal and external structure, chemical composition and evolution. Classification requires the determination of basic parameters such as period, amplitude and phase, together with other derived parameters. Of these, the period is the most important, since a wrong period leads to sparse phased light curves and misleading information.

Time series analysis applies mathematical and statistical tests to data in order to quantify the variation, understand the nature of time-varying phenomena, gain physical understanding of the system and predict its future behaviour. Astronomical time series usually suffer from unevenly spaced time instants, varying error conditions and the possibility of large gaps. For ground-based observations this is due to the daily daylight cycle and varying weather conditions, while observations from space may suffer from the impact of cosmic-ray particles.

Many large-scale astronomical surveys, such as MACHO, OGLE, EROS, ROTSE, PLANET, Hipparcos, MISAO, NSVS, ASAS, Pan-STARRS, Kepler, ESA, Gaia, LSST and CRTS, provide variable star time series data, even though their primary purpose is not variable star observation. The Center for Astrostatistics at Pennsylvania State University was established to help the astronomical community with statistical tools for harvesting and analysing archival data. Most of these surveys release their data to the public for further analysis.

There are many period search algorithms for astronomical time series analysis, which can be classified into parametric methods (which assume some underlying distribution for the data) and non-parametric methods (which do not assume any statistical model, such as a Gaussian). Many of the parametric methods are based on variations of the discrete Fourier transform, such as the Generalised Lomb-Scargle periodogram (GLSP) by Zechmeister (2009) and the Significant Spectrum (SigSpec) method by Reegen (2007). Non-parametric methods include Phase Dispersion Minimisation (PDM) by Stellingwerf (1978) and the cubic spline method by Akerlof (1994). Even though most of these methods can be automated, none of them fully recovers the true periods. Incorrect period detection can have several causes, such as power leakage to other frequencies due to the finite total time span, the finite sampling interval and the finite amount of data. Another problem is aliasing, caused by regular sampling. Spurious periods also appear because of long gaps, and power flowing into harmonic frequencies is an inherent problem of Fourier methods. Hence obtaining the exact period of a variable star from its time series data is still a difficult problem for huge databases when subjected to automation. As Matthew Templeton (AAVSO) states, “Variable star data analysis is not always straightforward; large-scale, automated analysis design is non-trivial”. Derekas et al. (2007) and Deb et al. (2010) state that “The processing of huge amount of data in these databases is quite challenging, even when looking at seemingly small issues such as period determination and classification”.

It would benefit the variable star astronomical community if basic parameters such as period, amplitude and phase were obtained more accurately when huge time series databases are subjected to automation. In the present thesis work, the theories of four popular period search methods are studied, the strengths and weaknesses of these methods are evaluated by applying them to two survey databases, and finally a modified form of the cubic spline method is introduced to confirm the exact period of a variable star. For the classification of newly discovered variable stars and their entry in the “General Catalogue of Variable Stars” or other databases such as the “Variable Star Index”, the characteristics of the variability have to be quantified in terms of variable star parameters.
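As an illustration of the period-search and phase-folding steps discussed above, here is a minimal sketch using the generalised Lomb-Scargle periodogram as implemented in astropy; the input file and column layout are assumptions, and the thesis' own modified cubic spline method is not reproduced here.

```python
import numpy as np
from astropy.timeseries import LombScargle

# Hypothetical photometric time series: time [days], magnitude, magnitude error.
t, mag, err = np.loadtxt("light_curve.dat", unpack=True)

# Generalised Lomb-Scargle periodogram on an automatically chosen frequency grid.
frequency, power = LombScargle(t, mag, err).autopower()
best_period = 1.0 / frequency[np.argmax(power)]
print("candidate period [days]:", best_period)

# Phase-fold the light curve on the candidate period to inspect its shape.
phase = (t / best_period) % 1.0
order = np.argsort(phase)
folded = np.column_stack([phase[order], mag[order]])  # phased light curve (phase, magnitude)
```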

Relevance:

40.00%

Publisher:

Abstract:

Synopsis

Background: Cellulite refers to skin relief alterations in women’s thighs and buttocks, causing dissatisfaction and a search for treatment. Its physiopathology is complex and not completely understood. Many therapeutic options have been reported with no scientific evidence of benefit. The majority of studies are neither controlled nor randomized; most efficacy endpoints are subjective, such as poorly standardized photographs and investigator opinion. Objective measures could improve severity assessment. Our purpose was to correlate non-invasive instrumental measures with standardized clinical evaluation.

Methods: Twenty-six women presenting cellulite on the buttocks, aged 25 to 41, were evaluated by: body mass index; standardized photography analysis (10-point severity and 5-point photonumeric scales) by five dermatologists; cutometry; and high-frequency ultrasonography (dermal density and dermis/hypodermis interface length). Quality-of-life impact was assessed. Correlations between clinical and instrumental parameters were performed.

Results: Good agreement between the dermatologists’ and the main investigator’s perceptions was detected. Positive correlations: body mass index and clinical scores; ultrasonographic measures. Negative correlation: cutometry and clinical scores. The quality-of-life score was correlated with dermal collagen density.

Conclusion: Cellulite caused an impact on quality of life. Poor correlation between objective measures and clinical evaluation was detected. Cellulite severity assessment is a challenge, and objective parameters should be optimized for clinical trials.

Relevance:

40.00%

Publisher:

Abstract:

The consumer demand for natural, minimally processed, fresh-like and functional food has led to an increasing interest in emerging technologies. The aim of this PhD project was to study three innovative food processing technologies currently used in the food sector. Ultrasound-assisted freezing, vacuum impregnation and pulsed electric fields were investigated through laboratory-scale systems and semi-industrial pilot plants. Furthermore, analytical and sensory techniques were developed to evaluate the quality of foods and vegetable matrices obtained by traditional and emerging processes. Ultrasound was found to be a valuable technique for improving the freezing process of potatoes, bringing forward the onset of nucleation, mainly when applied during the supercooling phase. A study of the effects of pulsed electric fields on the phenol and enzymatic profile of melon juice was carried out, and the statistical treatment of the data was performed using a response surface method. Next, flavour enrichment of apple sticks was carried out using different techniques: atmospheric, vacuum and ultrasound technologies and their combinations. The second section of the thesis deals with the development of analytical methods for the discrimination and quantification of phenol compounds in vegetable matrices, such as chestnut bark extracts and olive mill waste water. The management of waste disposal in the milling sector was approached with the aim of reducing the amount of waste while recovering valuable by-products to be used in different industrial sectors. Finally, the sensory analysis of boiled potatoes was carried out through the development of a quantitative descriptive procedure for the study of Italian and Mexican potato varieties. An update on flavour development in fresh and cooked potatoes was produced, and a sensory glossary, including general and specific definitions related to organic products, used in the European project Ecropolis, was drafted.