960 results for Hybrid semi-parametric modeling


Relevance:

100.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

100.00%

Publisher:

Abstract:

Novel brominated amorphous hydrogenated carbon (a-C:H:Br) films were produced by the plasma polymerization of acetylene-bromoform mixtures. The main parameter of interest was the degree of bromination, which depends on the partial pressure of bromoform in the plasma feed, expressed as a percentage of the total pressure, R_B. When bromoform is present in the feed, deposition rates of up to about 110 nm min⁻¹ may be obtained. The structure and composition of the films were characterized by Transmission Infrared Reflection Absorption Spectroscopy (IRRAS) and X-ray Photoelectron Spectroscopy (XPS). The latter revealed that films with atomic ratios Br:C of up to 0.58 may be produced. Surface contact angles, measured using goniometry, could be increased from approximately 63° (for an unbrominated film) to approximately 90° for R_B of 60 to 80%. Film surface roughness, measured using a profilometer, does not depend strongly on R_B. Optical properties (the refractive index n; the absorption coefficient α(E), where E is the photon energy; and the optical gap E_g) were determined from film thicknesses and data obtained by Transmission Ultraviolet-Visible-Near-Infrared Spectroscopy (UVS). Control of n was possible via selection of R_B. The measured optical gap increases with increasing F_BC, the atomic ratio of Br to C in the film, and semi-empirical modeling accounts for this tendency. A typical hardness of the brominated films, determined via nano-indentation, was approximately 0.5 GPa.

Relevance:

100.00%

Publisher:

Abstract:

Air Pollution and Health: Bridging the Gap from Sources to Health Outcomes, an international specialty conference sponsored by the American Association for Aerosol Research, was held to address key uncertainties in our understanding of adverse health effects related to air pollution and to integrate and disseminate results from recent scientific studies that cut across a range of air pollution-related disciplines. The Conference addressed the science of air pollution and health within a multipollutant framework (herein "multipollutant" refers to gases and particulate matter mass, components, and physical properties), focusing on five key science areas: sources, atmospheric sciences, exposure, dose, and health effects. Eight key policy-relevant science questions integrated across various parts of the five science areas and a ninth question regarding findings that provide policy-relevant insights served as the framework for the meeting. Results synthesized from this Conference provide new evidence, reaffirm past findings, and offer guidance for future research efforts that will continue to incrementally advance the science required for reducing uncertainties in linking sources, air pollutants, human exposure, and health effects. This paper summarizes the Conference findings organized around the science questions. A number of key points emerged from the Conference findings. First, there is a need for greater focus on multipollutant science and management approaches that include more direct studies of the mixture of pollutants from sources with an emphasis on health studies at ambient concentrations. Further, a number of research groups reaffirmed a need for better understanding of biological mechanisms and apparent associations of various health effects with components of particulate matter (PM), such as elemental carbon, certain organic species, ultrafine particles, and certain trace elements such as Ni, V, and Fe(II), as well as some gaseous pollutants. 
Although much debate continues in this area, generation of reactive oxygen species induced by these and other species present in air pollution, and the resulting oxidative stress and inflammation, were reiterated as key pathways leading to respiratory and cardiovascular outcomes. The Conference also underscored significant advances in understanding the susceptibility of populations, including the role of genetics and epigenetics and the influence of socioeconomic and other confounding factors and their synergistic interactions with air pollutants. Participants also pointed out that short- and long-term intervention episodes that reduce pollution from sources and improve air quality continue to indicate that when pollution decreases, so do reported adverse health effects. In the limited number of cases where specific sources or PM2.5 species were included in investigations, specific species are often associated with the decrease in effects. Other recent advances toward improved exposure estimates for epidemiological studies include new technologies such as microsensors combined with cell phones and integrated into real-time communications, hybrid air quality modeling such as combined receptor- and emission-based models, and surface observations used with remote sensing such as satellite data.

Relevance:

100.00%

Publisher:

Abstract:

The thesis studies the economic and financial conditions of Italian households, using microeconomic data from the Survey on Household Income and Wealth (SHIW) over the period 1998-2006. It develops along two lines of enquiry. First, it studies the determinants of households' holdings of assets and liabilities and estimates their degree of correlation. After a review of the literature, it estimates two non-linear multivariate models of the interactions between assets and liabilities on repeated cross-sections. Second, it analyses households' financial difficulties. It defines a quantitative measure of financial distress and tests, by means of non-linear dynamic probit models, whether the probability of experiencing financial difficulties is persistent over time. Chapter 1 provides a critical review of the theoretical and empirical literature on the estimation of asset and liability holdings, on their interactions, and on households' net wealth. The review stresses that a large part of the literature explains households' debt holdings as a function of, among other things, net wealth, an assumption that runs into possible endogeneity problems. Chapter 2 defines two non-linear multivariate models to study the interactions between assets and liabilities held by Italian households. Estimation refers to a pooling of SHIW cross-sections. The first model is a bivariate tobit that estimates factors affecting assets and liabilities and their degree of correlation, with results consistent with theoretical expectations. To tackle the presence of non-normality and heteroskedasticity in the error term, which render the tobit estimators inconsistent, semi-parametric estimates are provided that confirm the results of the tobit model. The second model is a quadrivariate probit on three different assets (safe, risky and real) and total liabilities; the results show the patterns of interdependence suggested by theoretical considerations.
Chapter 3 reviews the methodologies for estimating non-linear dynamic panel data models, drawing attention to the problems that must be dealt with to obtain consistent estimators. Specific attention is given to the initial conditions problem raised by the inclusion of the lagged dependent variable in the set of explanatory variables. The advantage of dynamic panel data models lies in the fact that they allow one to account simultaneously for true state dependence, via the lagged variable, and for unobserved heterogeneity, via the specification of individual effects. Chapter 4 applies the models reviewed in Chapter 3 to analyse the financial difficulties of Italian households, using information on net wealth as provided in the panel component of the SHIW. The aim is to test whether households persistently experience financial difficulties over time. A thorough discussion is provided of the alternative approaches proposed in the literature (subjective/qualitative indicators versus quantitative indexes) to identify households in financial distress. Households in financial difficulties are identified as those holding amounts of net wealth lower than the value corresponding to the first quartile of the net wealth distribution. Estimation is conducted via four different methods: the pooled probit model, the random effects probit model with exogenous initial conditions, the Heckman model and the recently developed Wooldridge model. Results from all estimators support the hypothesis of true state dependence and show that, in line with the literature, less sophisticated models, namely the pooled and exogenous models, over-estimate this persistence.
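
The persistence test above hinges on the coefficient of the lagged distress indicator. As a minimal sketch (simulated data and a pooled probit only, not the thesis's full set of estimators or its SHIW variables), the following illustrates why a pooled model that ignores unobserved household heterogeneity over-states state dependence:

```python
# Sketch, assuming simulated panel data: a pooled probit of current distress
# on lagged distress absorbs the ignored household effect into the lag
# coefficient, inflating the apparent persistence.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
N, T = 500, 6
alpha = rng.normal(0, 0.5, N)           # unobserved household heterogeneity
gamma_true = 0.8                        # true state dependence
y = np.zeros((N, T), dtype=int)
y[:, 0] = (alpha + rng.normal(0, 1, N) > 0).astype(int)
for t in range(1, T):
    ystar = gamma_true * y[:, t - 1] + alpha + rng.normal(0, 1, N)
    y[:, t] = (ystar > 0).astype(int)

# Pooled probit of y_t on a constant and y_{t-1}
ylag = y[:, :-1].ravel().astype(float)
ycur = y[:, 1:].ravel()
X = np.column_stack([np.ones_like(ylag), ylag])

def negloglik(beta):
    p = np.clip(norm.cdf(X @ beta), 1e-12, 1 - 1e-12)
    return -np.sum(ycur * np.log(p) + (1 - ycur) * np.log(1 - p))

res = minimize(negloglik, x0=np.zeros(2), method="Nelder-Mead")
print(round(res.x[1], 2))   # pooled lag coefficient, biased above gamma_true
```

The random-effects estimators with proper initial-condition treatment (Heckman, Wooldridge) are designed precisely to separate the two sources of persistence that this pooled estimate conflates.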

Relevance:

100.00%

Publisher:

Abstract:

The objective of this thesis work is the refined estimation of earthquake source parameters. To this end we used two different approaches, one in the frequency domain and the other in the time domain. In the frequency domain, we analyzed the P- and S-wave displacement spectra to estimate spectral parameters, that is, corner frequencies and low-frequency spectral amplitudes. We used a parametric modeling approach combined with a multi-step, non-linear inversion strategy that includes corrections for attenuation and site effects. The iterative multi-step procedure was applied to about 700 microearthquakes in the moment range 10^11-10^14 N·m recorded at the dense, wide-dynamic-range seismic networks operating in the Southern Apennines (Italy). The analysis of source parameters is often complicated when we are not able to model the propagation accurately. In this case the empirical Green function approach is a very useful tool for studying seismic source properties: Empirical Green Functions (EGFs) allow one to represent the contribution of propagation and site effects to the signal without using approximate velocity models. An EGF is a recorded three-component set of time histories of a small earthquake whose source mechanism and propagation path are similar to those of the master event. Thus, in the time domain, the deconvolution method of Vallée (2004) was applied to calculate relative source time functions (RSTFs) and to accurately estimate source size and rupture velocity. This technique was applied to 1) a large event, the Mw 6.3 2009 L'Aquila mainshock (Central Italy); 2) moderate events, a cluster of earthquakes of the 2009 L'Aquila sequence with moment magnitudes between 3 and 5.6; and 3) a small event, the Mw 2.9 Laviano mainshock (Southern Italy).
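
The frequency-domain step can be illustrated with a Brune-type omega-square spectral model. The synthetic spectrum, noise level, and parameter values below are assumptions for illustration; the thesis procedure additionally corrects for attenuation and site effects before fitting:

```python
# Sketch: fit Omega(f) = Omega0 / (1 + (f/fc)^2) to a displacement spectrum
# to recover the low-frequency level Omega0 and corner frequency fc.
import numpy as np
from scipy.optimize import curve_fit

def log_brune(f, log_omega0, fc):
    # log of the omega-square model; fitting in log space keeps the
    # high-frequency fall-off from being swamped by the low-frequency level
    return log_omega0 - np.log(1.0 + (f / fc) ** 2)

rng = np.random.default_rng(1)
f = np.logspace(-1, 2, 200)                      # 0.1-100 Hz
true_omega0, true_fc = 1e-6, 5.0                 # illustrative values
spec = true_omega0 / (1.0 + (f / true_fc) ** 2)
spec *= rng.lognormal(0.0, 0.1, f.size)          # multiplicative noise

popt, _ = curve_fit(log_brune, f, np.log(spec), p0=[np.log(1e-7), 1.0])
omega0_hat, fc_hat = np.exp(popt[0]), abs(popt[1])
print(omega0_hat, fc_hat)
```

The recovered Ω0 scales with the seismic moment, and for the Brune model the source radius scales inversely with fc, which is why accurate corner-frequency estimates translate into accurate source sizes.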

Relevance:

100.00%

Publisher:

Abstract:

The research aims at developing a framework for semantic-based digital survey of architectural heritage. Rooted in knowledge-based modeling, which extracts mathematical constraints on geometry from architectural treatises, as-built information about the architecture obtained from image-based modeling is integrated with the ideal model in a BIM platform. The knowledge-based modeling transforms the geometry and parametric relations of architectural components from 2D drawings to 3D digital models, and creates a large number of variations based on shape grammar in real time thanks to parametric modeling. It also provides prior knowledge for semantically segmenting unorganized survey data. The emergence of SfM (Structure from Motion) makes it possible to reconstruct large, complex architectural scenes with high flexibility, low cost and full automation, but with low metric reliability. We address this problem by combining photogrammetric approaches consisting of camera configuration, image enhancement, bundle adjustment, etc. Experiments show that the accuracy of image-based modeling following our workflow is comparable to that of range-based modeling. We also demonstrate positive results of our optimized approach in the digital reconstruction of a portico, where low-texture vaults and dramatic transitions of illumination create great difficulties for the workflow without optimization. Once the as-built model is obtained, it is integrated with the ideal model in a BIM platform that allows multiple forms of data enrichment. In spite of its promising prospects in the AEC industry, BIM has been developed with limited consideration of reverse engineering from survey data. Besides representing the architectural heritage in parallel ways (ideal model and as-built model) and comparing their differences, we address how to create the as-built model in BIM software, which is still an open problem.
The research is intended to be fundamental for research on architectural history, for the documentation and conservation of architectural heritage, and for the renovation of existing buildings.
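
As a toy illustration of the knowledge-based, parametric step described above, the sketch below parameterizes a classical column by its module (base diameter), deriving component sizes from treatise-style ratios. The specific ratios and the class design are illustrative assumptions, not taken from the research or from any particular treatise:

```python
# Sketch: parametric component modeling - changing one driving parameter
# (the module) regenerates a whole family of shape-grammar variants.
from dataclasses import dataclass

@dataclass
class Column:
    module: float  # base diameter, the single driving parameter

    @property
    def shaft_height(self):
        return 8.0 * self.module      # assumed treatise-style ratio

    @property
    def base_height(self):
        return 0.5 * self.module      # assumed ratio

    @property
    def capital_height(self):
        return 0.5 * self.module      # assumed ratio

    @property
    def total_height(self):
        return self.shaft_height + self.base_height + self.capital_height

# One parameter change regenerates the variant family in real time
variants = [Column(module=m) for m in (0.4, 0.5, 0.6)]
print([round(col.total_height, 2) for col in variants])  # → [3.6, 4.5, 5.4]
```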

Relevance:

100.00%

Publisher:

Abstract:

In the first chapter we develop a theoretical model investigating food consumption and body weight with a novel assumption regarding human caloric expenditure (i.e., metabolism), in order to investigate why individuals can be rationally trapped in an excessive-weight equilibrium and why they struggle to lose weight even when offered incentives for weight loss. This assumption allows the theoretical model to have multiple equilibria and to explain why losing weight is so difficult even in the presence of incentives, without relying on rational addiction, time-inconsistent preferences or bounded rationality. In addition to this result, we are able to characterize the circumstances under which a temporary incentive can create a persistent weight loss. In the second chapter we investigate the possible contributions of social norms and peer effects to the spread of obesity. In recent literature, peer effects and social norms have been characterized as important pathways for the biological and behavioral spread of body weight, along with decreased food prices and physical activity. We add to this literature by proposing a novel concept of social norm related to what we define as social distortion in weight perception. The theoretical model shows that, in equilibrium, the effect of an increase in peers' weight on an individual's weight is unrelated to health concerns and is mainly associated with social concerns. Using regional data from England, we show that this social component significantly influences individual weight. In the last chapter we investigate the relationship between body weight and employment probability. Using a semi-parametric regression, we show that men's and women's employment probabilities do not follow a linear relationship with body mass index (BMI) but rather an inverted-U-shaped one, peaking at a BMI well above the clinical threshold for overweight.
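
The inverted-U finding of the last chapter can be sketched with simulated data and a simple quadratic approximation (the actual study uses a semi-parametric regression; the functional form, peak location, and sample below are illustrative assumptions):

```python
# Sketch: if employment probability is an inverted U in BMI, a quadratic
# fit a*BMI^2 + b*BMI + c peaks at BMI = -b / (2a).
import numpy as np

rng = np.random.default_rng(2)
bmi = rng.uniform(18, 45, 2000)
true_peak = 29.0                                 # assumed, above the overweight cutoff of 25
p_emp = 0.85 - 0.001 * (bmi - true_peak) ** 2    # assumed inverted-U probability
employed = (rng.random(2000) < p_emp).astype(float)

a, b, c = np.polyfit(bmi, employed, 2)           # quadratic approximation
peak_bmi = -b / (2 * a)                          # estimated turning point
print(round(peak_bmi, 1))
```

A negative quadratic coefficient with a turning point well above 25 is exactly the pattern the chapter reports: employment probability keeps rising past the clinical overweight threshold before declining.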

Relevance:

100.00%

Publisher:

Abstract:

We analyze three sets of doubly-censored cohort data on incubation times, estimating incubation distributions using semi-parametric methods and assessing the comparability of the estimates. Weibull models appear to be inappropriate for at least one of the cohorts, and the estimates for the different cohorts are substantially different. We use these estimates as inputs for backcalculation, using a nonparametric method based on maximum penalized likelihood. The different incubation distributions all produce fits to the reported AIDS counts that are as good as the fit from a nonstationary incubation distribution that models treatment effects, but the estimated infection curves are very different. We also develop a method for estimating nonstationarity as part of the backcalculation procedure and find that such estimates also depend very heavily on the assumed incubation distribution. We conclude that incubation distributions are so uncertain that meaningful error bounds are difficult to place on backcalculated estimates, and that backcalculation may be too unreliable to be used without being supplemented by other sources of information on HIV prevalence and incidence.
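
The identity backcalculation relies on is a convolution: expected diagnoses are past infections smoothed through the incubation distribution, d(t) = Σ_s i(s) f(t−s). The Weibull parameters and infection curve below are illustrative assumptions, not the paper's estimates:

```python
# Sketch of the forward model behind backcalculation: convolving an assumed
# infection curve with a discretized Weibull incubation distribution.
import numpy as np

T = 20                                               # years
t = np.arange(T)
infections = 100 * np.exp(-0.5 * (t - 6) ** 2 / 4)   # assumed epidemic curve

# Discretized Weibull incubation density, shape k, scale lam (years, assumed)
k, lam = 2.3, 10.0
cdf = 1 - np.exp(-((np.arange(T + 1) / lam) ** k))
incubation = np.diff(cdf)        # P(diagnosed in year j after infection)

# Diagnoses = infections convolved with incubation, truncated to the horizon
diagnoses = np.convolve(infections, incubation)[:T]
print(diagnoses.round(1))
```

Backcalculation inverts this convolution, which is ill-posed: very different infection curves, pushed through different plausible incubation distributions, can reproduce the same diagnosis counts. That is precisely the sensitivity the paper documents.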

Relevance:

100.00%

Publisher:

Abstract:

In this paper we focus on a model for two types of tumors. Tumor development can be described by four death rates and four tumor transition rates. We present a general semi-parametric model to estimate the tumor transition rates based on data from survival/sacrifice experiments. In the model, we assume the tumor transition rates are proportional to a common parametric function but make no assumption about the death rates from any state. We derive the likelihood function of the data observed in such an experiment and an EM algorithm that simplifies the estimation procedure. This article extends work on semi-parametric models for one type of tumor (see Portier and Dinse, and Dinse) to two types of tumors.

Relevance:

100.00%

Publisher:

Abstract:

Time series models relating short-term changes in air pollution levels to daily mortality counts typically assume that the effects of air pollution on the log relative rate of mortality do not vary with time. However, these short-term effects might plausibly vary by season: changes in the sources of air pollution and in meteorology can change the characteristics of the air pollution mixture across seasons. The authors develop Bayesian semi-parametric hierarchical models for estimating time-varying effects of pollution on mortality in multi-site time series studies. The methods are applied to the updated National Morbidity and Mortality Air Pollution Study database for the period 1987-2000, which includes data for 100 U.S. cities. At the national level, a 10 µg/m³ increase in PM10 at lag 1 is associated with a 0.15 (95% posterior interval: -0.08, 0.39), 0.14 (-0.14, 0.42), 0.36 (0.11, 0.61), and 0.14 (-0.06, 0.34) percent increase in mortality for winter, spring, summer, and fall, respectively. An analysis by geographical region finds a strong seasonal pattern in the northeast (with a peak in summer) and little seasonal variation in the southern regions of the country. These results provide useful information for understanding particle toxicity and guiding future analyses of particle constituent data.
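
The reported percent increases follow directly from the log-linear model: for a log relative rate β per unit of PM10, a 10 µg/m³ increase raises mortality by 100·(exp(10β) − 1) percent, which for small effects is close to 100·10β. A quick check using the summer estimate from the abstract:

```python
# Arithmetic behind the reported seasonal effects (illustrative check).
import math

beta = 0.00036             # log relative rate per 1 microgram/m^3 (summer)
pct = 100 * (math.exp(beta * 10) - 1)
print(round(pct, 2))       # → 0.36
```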

Relevance:

100.00%

Publisher:

Abstract:

At a time when at least two-thirds of US states have already mandated some form of seller's property condition disclosure statement and there is a movement in this direction nationally, this paper examines the impact of seller's property condition disclosure laws on residential real estate values and on the information asymmetry in housing transactions, including the shift of risk from buyers and brokers to sellers, and attempts to ascertain the factors that lead to adoption of the disclosure law. The analytical structure employs parametric panel data models, semi-parametric propensity score matching models, and an event study framework using a unique set of economic and institutional attributes for a quarterly panel of 291 US Metropolitan Statistical Areas (MSAs) and 50 US states spanning 21 years, from 1984 to 2004. Exploiting the MSA-level variation in house prices, the study finds that the average seller may be able to fetch a higher price (about three to four percent) for the house if she furnishes a state-mandated seller's property condition disclosure statement to the buyer.

Relevance:

100.00%

Publisher:

Abstract:

We examine the impact of seller's Property Condition Disclosure Laws on residential real estate values. A disclosure law may address the information asymmetry in housing transactions, shifting risk from buyers and brokers to sellers and raising housing prices as a result. We combine propensity score techniques from the treatment effects literature with a traditional event study approach. We assemble a unique set of economic and institutional attributes for a quarterly panel of 291 US Metropolitan Statistical Areas (MSAs) and 50 US states spanning 21 years, from 1984 to 2004, and use it to exploit the MSA-level variation in house prices. The study finds that the average seller may be able to fetch a higher price (about three to four percent) for the house if she furnishes a state-mandated seller's property condition disclosure statement to the buyer. When we compare the results from parametric and semi-parametric event analyses, we find that the semi-parametric (propensity score) analysis yields moderately larger estimated effects of the law on housing prices.
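
The propensity-score step can be sketched in miniature (simulated units rather than the study's MSA panel; the covariates, effect size, and matching rule below are illustrative assumptions): estimate the probability of adopting the law given covariates, match each treated unit to the control with the nearest score, and average the matched outcome differences.

```python
# Sketch: propensity score estimation by logistic regression (Newton-Raphson)
# followed by 1-nearest-neighbor matching with replacement, giving the
# average treatment effect on the treated (ATT).
import numpy as np

rng = np.random.default_rng(3)
n = 1000
x = rng.normal(0, 1, (n, 2))                         # covariates (assumed)
ps_true = 1 / (1 + np.exp(-(0.5 * x[:, 0] - 0.3 * x[:, 1])))
treated = rng.random(n) < ps_true                    # law adoption
y = 10 + 0.04 * treated + 0.2 * x[:, 0] + rng.normal(0, 0.1, n)  # ~4% effect

# Logistic regression for the propensity score
X = np.column_stack([np.ones(n), x])
beta = np.zeros(3)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    W = p * (1 - p)
    beta += np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (treated - p))
score = 1 / (1 + np.exp(-X @ beta))

# Match each treated unit to the control with the nearest score
t_idx, c_idx = np.where(treated)[0], np.where(~treated)[0]
matches = c_idx[np.abs(score[t_idx][:, None] - score[c_idx][None, :]).argmin(axis=1)]
att = np.mean(y[t_idx] - y[matches])
print(round(att, 3))
```

Matching on the score balances the covariates between adopters and non-adopters before comparing prices, which is why the semi-parametric estimate need not agree with a purely parametric panel regression.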

Relevance:

100.00%

Publisher:

Abstract:

An extension of k-ratio multiple comparison methods to rank-based analyses is described. The new method is analogous to the Duncan-Godbold approximate k-ratio procedure for unequal sample sizes or correlated means. The close parallel of the new methods to the Duncan-Godbold approach is shown by demonstrating that they are based upon different parameterizations as starting points. A semi-parametric basis for the new methods is established by starting from the Cox proportional hazards model, using Wald statistics; from there, the log-rank and Gehan-Breslow-Wilcoxon methods may be seen as score-statistic-based methods. Simulations and analysis of a published data set are used to show the performance of the new methods.