914 results for Dynamic data analysis
Abstract:
The aim of this work was to evaluate the effect of storage time on the thermal properties of triethylene glycol dimethacrylate/2,2-bis[4-(2-hydroxy-3-methacryloxy-prop-1-oxy)-phenyl]propane (bisphenol A glycidyl ether dimethacrylate) (TB) copolymers used in formulations of dental resins after photopolymerization. The TB copolymers were prepared by photopolymerization with an Ultrablue IS light-emitting diode, stored in the dark for 160 days at 37 degrees C, and characterized with differential scanning calorimetry (DSC), dynamic mechanical analysis (DMA), and Fourier transform infrared spectroscopy with attenuated total reflection. DSC curves indicated the presence of an exothermic peak, confirming that the reaction was not completed during the photopolymerization process. This exothermic peak became smaller as a function of the storage time and shifted to higher temperatures. In DMA studies, a plot of the loss tangent versus temperature initially showed two well-defined peaks, confirming the presence of residual monomers that were not converted during the photopolymerization process. (C) 2009 Wiley Periodicals, Inc. J Appl Polym Sci 112: 679-684, 2009
Abstract:
This paper investigates the factors that affect the destination choice of Jordanian tourists among eight countries (Oman, Saudi Arabia, Syria, Tunisia, Yemen, Egypt, Lebanon, and Bahrain) using panel data analysis. The number of outbound tourists is the dependent variable, which is regressed on five explanatory variables in a fixed effects model. The finding of this paper is that Jordanian tourists have weak demand for outbound tourism; the Jordanian decision to travel abroad is determined by comparing the cost of traveling to the different destinations and choosing the cheapest alternative.
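As a minimal illustration of the fixed effects estimator used in studies like this one, the sketch below applies the within (demeaning) transformation to a simulated panel. The destination count matches the abstract, but the variable names, coefficient values, and data are hypothetical, not taken from the paper:

```python
import numpy as np

def fixed_effects(y, X, groups):
    """Within (fixed-effects) estimator: demean y and X inside each group,
    then run pooled OLS on the demeaned data."""
    y = np.asarray(y, dtype=float).copy()
    X = np.asarray(X, dtype=float).copy()
    for g in np.unique(groups):
        m = groups == g
        y[m] -= y[m].mean()
        X[m] -= X[m].mean(axis=0)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Simulated panel: 8 destinations x 20 years, one regressor (travel cost)
rng = np.random.default_rng(0)
n_dest, n_years = 8, 20
groups = np.repeat(np.arange(n_dest), n_years)
cost = rng.normal(size=n_dest * n_years)
alpha = rng.normal(size=n_dest)[groups]          # destination fixed effects
tourists = alpha - 1.5 * cost + 0.1 * rng.normal(size=n_dest * n_years)
beta = fixed_effects(tourists, cost[:, None], groups)  # estimate close to -1.5
```

Demeaning within each destination removes the fixed effects, so the pooled OLS on the transformed data recovers the cost coefficient without estimating the destination intercepts.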
Abstract:
Enriquillo and Azuei are saltwater lakes located in a closed water basin in the southwestern region of the island of La Hispaniola; they have been experiencing dramatic changes in total lake-surface area during the period 1980-2012. Lake Enriquillo had a surface area of approximately 276 km2 in 1984, gradually decreasing to 172 km2 in 1996. The surface area of the lake reached its lowest point in the satellite observation record in 2004, at 165 km2. The lake then began its recent growth, reaching its 1984 size by 2006. Based on surface area measurements for June and July 2013, Lake Enriquillo has a surface area of ~358 km2. Lake Azuei measured 116 km2 in 1984 and 134 km2 in 2013, an overall 15.8% increase in 30 years. Determining the causes of lake surface area changes is of extreme importance due to its environmental, social, and economic impacts. The overall goal of this study is to quantify the changing water balance in these lakes and their catchment area using satellite and ground observations and a regional atmospheric-hydrologic modeling approach. Data analyses of environmental variables in the region reflect a hydrological imbalance of the lakes due to changing regional hydro-climatic conditions. Historical data show precipitation, land surface temperature and humidity, and sea surface temperature (SST) increasing over the region during the past decades. Salinity levels have also decreased by more than 30% from previously reported baseline levels. Here we present a summary of the historical data obtained, the new sensors deployed in the surrounding sierras and the lakes, and the integrated modeling exercises, as well as the challenges of gathering, storing, sharing, and analyzing this large volume of data from such a diverse number of sources in a remote location.
Abstract:
The aim of this article is to assess the role of real effective exchange rate (RER) volatility in long-run economic growth for a set of 82 advanced and emerging economies, using a panel data set ranging from 1970 to 2009. With an accurate measure of exchange rate volatility, the results for the two-step system GMM panel growth models show that a more (less) volatile RER has a significant negative (positive) impact on economic growth, and the results are robust across model specifications. In addition, exchange rate stability seems to be more important for fostering long-run economic growth than exchange rate misalignment.
Abstract:
Housing is an important component of wealth for the typical household in many countries. The objective of this paper is to investigate the effect of real-estate price variation on welfare, trying to close a gap between the welfare literature in Brazil and that in the U.S., the U.K., and other developed countries. Our first motivation is that real estate is probably more important in Brazil than elsewhere as a proportion of wealth, which potentially makes the impact of a price change bigger. Our second motivation is that real-estate prices boomed in Brazil in the last five years: prime real estate in Rio de Janeiro and São Paulo has tripled in value in that period, and a smaller but generalized increase has been observed throughout the country. Third, Brazil has also seen a consumption boom in the last five years; indeed, the recent rise of some of the poor to middle-income status is well documented not only for Brazil but for other emerging countries as well. Regarding consumption and real-estate prices in Brazil, one cannot infer causality from correlation, but one can do causal inference with an appropriate structural model and proper inference, or with proper inference in a reduced-form setup. Our last motivation is the complete absence of studies of this kind for Brazil, which makes ours a pioneering study. We assemble a panel-data set for the determinants of non-durable consumption growth by Brazilian states, merging the techniques and ideas in Campbell and Cocco (2007) and in Case, Quigley and Shiller (2005). With appropriate controls and panel-data methods, we investigate whether house-price variation has a positive effect on non-durable consumption. The results show a non-negligible, significant impact of changes in real-estate prices on welfare (consumption), although smaller than what Campbell and Cocco found.
Our findings support the view that the channel through which house prices affect consumption is a financial one.
Abstract:
There are four hypotheses analyzed in the literature to explain deunionization, namely: a decrease in the demand for union representation by workers; the impact of globalization on unionization rates; technical change; and changes in legal and political systems against unions. This paper aims to test all of them. We estimate a logistic regression using a panel data procedure with 35 industries from 1973 to 1999 and conclude that the four hypotheses cannot be rejected by the data. We also use a variance decomposition analysis to study the impact of these variables on the drop in unionization rates. In the model with no demographic variables, the results show that these (tested) economic variables can account for 10% to 12% of the drop in unionization. However, when we include demographic variables, the tested variables can account for 10% to 35% of the total variation in unionization rates. In this case, the four hypotheses tested can explain up to 50% of the total drop in unionization rates explained by the model.
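The logistic regression at the core of such a test can be sketched with plain full-batch gradient descent on simulated data; the single regressor, sample size, and coefficient values below are hypothetical stand-ins for the industry panel, not the paper's specification:

```python
import numpy as np

def fit_logistic(X, y, lr=1.0, n_iter=5000):
    """Logistic regression fitted by plain full-batch gradient descent."""
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ w))     # predicted probabilities
        w -= lr * X.T @ (p - y) / len(y)     # cross-entropy gradient step
    return w

# Simulated data: intercept plus one hypothetical regressor
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(500), rng.normal(size=500)])
true_w = np.array([0.5, -2.0])
y = (rng.random(500) < 1.0 / (1.0 + np.exp(-X @ true_w))).astype(float)
w_hat = fit_logistic(X, y)                   # roughly recovers true_w
```

The cross-entropy loss is convex in `w`, so gradient descent with a modest step size reliably approaches the maximum-likelihood estimate.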
Abstract:
We investigate whether there was a stable money demand function for Japan in the 1990s using both aggregate and disaggregate time series data. The aggregate data appear to support the contention that there was no stable money demand function, while the disaggregate data show that there was. Neither was there any indication of a liquidity trap. Possible sources of the discrepancy are explored, and the diametrically opposite results between the aggregate and disaggregate analyses are attributed to neglected heterogeneity among micro units. We also conduct a simulation analysis to show that, when heterogeneity among micro units is present, the prediction of aggregate outcomes using aggregate data is less accurate than prediction based on micro equations. Moreover, policy evaluation based on aggregate data can be grossly misleading.
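The aggregation point can be reproduced in a small simulation: when micro units have heterogeneous dynamics, a single AR(1) fitted to the aggregate series forecasts worse than summing unit-level AR(1) forecasts. The persistence values, sample sizes, and innovation scale below are illustrative assumptions, not the paper's design:

```python
import numpy as np

rng = np.random.default_rng(2)
n, T = 50, 600
rho = np.where(np.arange(n) < n // 2, 0.95, 0.0)   # two very different groups
y = np.zeros((n, T))
for t in range(1, T):
    y[:, t] = rho * y[:, t - 1] + rng.normal(scale=0.5, size=n)

def ar1_slope(x):
    """OLS slope of x[t] on x[t-1] (no intercept)."""
    return (x[:-1] * x[1:]).sum() / (x[:-1] ** 2).sum()

split = T // 2
# Micro route: fit one AR(1) per unit on the first half, sum the forecasts
rho_hat = np.array([ar1_slope(y[i, :split]) for i in range(n)])
micro_pred = (rho_hat[:, None] * y[:, split - 1:-1]).sum(axis=0)

# Aggregate route: fit a single AR(1) on the aggregate series
Y = y.sum(axis=0)
agg_pred = ar1_slope(Y[:split]) * Y[split - 1:-1]

actual = Y[split:]
mse_micro = ((micro_pred - actual) ** 2).mean()
mse_agg = ((agg_pred - actual) ** 2).mean()   # larger: aggregation loses info
```

The aggregate of heterogeneous AR(1) processes is not itself an AR(1), so the single aggregate model is misspecified while each micro equation is correctly specified.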
Abstract:
In this paper, a set of representative Brazilian commercial gasoline samples from São Paulo State, selected by HCA, plus six samples obtained directly from refineries, were analysed by a highly sensitive gas chromatographic (GC) method (ASTM D6733). The levels of saturated hydrocarbons and anhydrous ethanol obtained by GC were correlated with the quality standards of the Brazilian National Agency of Petroleum, Natural Gas and Biofuels (ANP) through exploratory analysis (HCA and PCA). This correlation showed that the GC method, together with HCA and PCA, could be employed as a screening technique to determine compliance of Brazilian gasoline with the prescribed legal standards.
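A minimal sketch of the PCA step of such exploratory screening, on made-up composition data: the concentration values and the "suspect" group below are hypothetical illustrations, not ANP figures or the paper's measurements:

```python
import numpy as np

def pca_scores(X, n_comp=1):
    """Principal-component scores via SVD of the mean-centred data matrix."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_comp].T

# Hypothetical composition data: [saturated hydrocarbons %, ethanol %]
rng = np.random.default_rng(3)
conforming = rng.normal([45.0, 25.0], [1.0, 0.5], size=(30, 2))
suspect = rng.normal([35.0, 35.0], [1.0, 0.5], size=(3, 2))
X = np.vstack([conforming, suspect])
scores = pca_scores(X)
# Conforming and suspect samples separate along the first component
gap = abs(scores[:30, 0].mean() - scores[30:, 0].mean())
```

Samples whose scores fall far from the conforming cluster can then be flagged for full compliance testing, which is the screening idea described above.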
Abstract:
Linear mixed effects models are frequently used to analyse longitudinal data, due to their flexibility in modelling the covariance structure between and within observations. Further, it is easy to deal with unbalanced data, either with respect to the number of observations per subject or per time period, and with varying time intervals between observations. In most applications of mixed models to biological sciences, a normal distribution is assumed both for the random effects and for the residuals. This, however, makes inferences vulnerable to the presence of outliers. Here, linear mixed models employing thick-tailed distributions for robust inferences in longitudinal data analysis are described. Specific distributions discussed include the Student-t, the slash and the contaminated normal. A Bayesian framework is adopted, and the Gibbs sampler and the Metropolis-Hastings algorithms are used to carry out the posterior analyses. An example with data on orthodontic distance growth in children is discussed to illustrate the methodology. Analyses based on either the Student-t distribution or on the usual Gaussian assumption are contrasted. The thick-tailed distributions provide an appealing robust alternative to the Gaussian process for modelling distributions of the random effects and of residuals in linear mixed models, and the MCMC implementation allows the computations to be performed in a flexible manner.
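A compact univariate sketch of the Gibbs idea described above: the Student-t likelihood is written as a normal/gamma scale mixture, so the sampler alternates between per-observation weights and the location parameter. This is a toy version (known scale, flat prior on the mean), not the paper's full mixed model:

```python
import numpy as np

def gibbs_t_mean(y, nu=4.0, n_iter=2000, burn=500, seed=0):
    """Gibbs sampler for the mean of Student-t data via the normal/gamma
    scale-mixture representation (scale fixed at 1, flat prior on the mean)."""
    rng = np.random.default_rng(seed)
    mu, draws = y.mean(), []
    for it in range(n_iter):
        # Weights: lambda_i | mu ~ Gamma((nu+1)/2, rate=(nu + (y_i-mu)^2)/2)
        lam = rng.gamma((nu + 1.0) / 2.0, 2.0 / (nu + (y - mu) ** 2))
        # Location given weights: precision-weighted normal update
        mu = rng.normal((lam * y).sum() / lam.sum(), 1.0 / np.sqrt(lam.sum()))
        if it >= burn:
            draws.append(mu)
    return np.array(draws)

rng = np.random.default_rng(4)
y = rng.standard_t(4, size=200) + 3.0    # heavy-tailed data, true mean 3
y[:5] += 15.0                             # a few gross outliers
draws = gibbs_t_mean(y)                   # posterior mean stays near 3
```

Observations far from the current mean receive small weights, which is exactly the mechanism by which the thick-tailed model downweights outliers relative to the Gaussian assumption.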
Abstract:
In this work, initial crystallographic studies of human haemoglobin (Hb) crystallized in isoionic and oxygen-free PEG solution are presented. Under these conditions, functional measurements of the O2-linked binding of water molecules and release of protons have evidenced that Hb assumes an unforeseen new allosteric conformation. The determination of the high-resolution structure of the crystal of human deoxy-Hb fully stripped of anions may provide a structural explanation for the role of anions in the allosteric properties of Hb and, particularly, for the influence of chloride on the Bohr effect, the mechanism by which Hb oxygen affinity is regulated by pH. X-ray diffraction data were collected to 1.87 Angstrom resolution using a synchrotron-radiation source. Crystals belong to the space group P2(1)2(1)2 and preliminary analysis revealed the presence of one tetramer in the asymmetric unit. The structure is currently being refined using maximum-likelihood protocols.
Abstract:
Hemoglobin remains, despite the enormous amount of research involving this molecule, a prototype for allosteric models and new conformations. Functional studies carried out on Hemoglobin-I from the South American catfish Liposarcus anisitsi [1] suggest the existence of conformational states beyond those already described for human hemoglobin, which could be confirmed crystallographically. The present work represents the initial steps towards that goal.
Abstract:
The present study introduces a multi-agent architecture designed to automate data integration and intelligent data analysis. Unlike other approaches, the multi-agent architecture was designed using an agent-based methodology, Tropos. Based on the proposed architecture, we describe a Web-based application in which agents analyse petroleum well drilling data to identify possible abnormalities. The intelligent data analysis method used was a neural network.
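A minimal sketch of the neural-network classification step on simulated drilling readings; the feature names, cluster positions, and network size are assumptions for illustration, not the application's actual design:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical drilling features: [hook load, torque]; label 1 = abnormal
normal = rng.normal([0.0, 0.0], 0.5, size=(100, 2))
abnormal = rng.normal([2.0, 2.0], 0.5, size=(100, 2))
X = np.vstack([normal, abnormal])
y = np.r_[np.zeros(100), np.ones(100)]

# One-hidden-layer network trained with plain gradient descent
W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(2000):
    H = np.tanh(X @ W1 + b1)                 # hidden layer
    p = sigmoid(H @ W2 + b2).ravel()         # abnormality probability
    g = (p - y)[:, None] / len(y)            # dLoss/dlogit (cross-entropy)
    gH = (g @ W2.T) * (1 - H ** 2)           # backprop through tanh
    W2 -= 0.5 * H.T @ g;  b2 -= 0.5 * g.sum(axis=0)
    W1 -= 0.5 * X.T @ gH; b1 -= 0.5 * gH.sum(axis=0)

p = sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2).ravel()
acc = ((p > 0.5) == y).mean()                # training accuracy
```

In the described application the same idea applies at larger scale: the network scores incoming well readings, and high abnormality probabilities trigger the agents' alerts.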
Abstract:
In this paper we report the use of chromatographic profiles of volatiles to determine disease markers in plants - in this case, leaves of Eucalyptus globulus infected by the necrotrophic fungus Teratosphaeria nubilosa. The volatile fraction was isolated by headspace solid phase microextraction (HS-SPME) and analyzed by comprehensive two-dimensional gas chromatography-fast quadrupole mass spectrometry (GC×GC-qMS). For the correlation between the metabolic profile described by the chromatograms and the presence of the infection, unfolded partial least squares discriminant analysis (U-PLS-DA) with orthogonal signal correction (OSC) was employed. The proposed method was shown to be independent of factors such as the age of the harvested plants. Manipulation of the mathematical model also yielded graphic representations similar to real chromatograms, which allowed the tentative identification of more than 40 compounds potentially useful as disease biomarkers for this plant/pathogen pair. The proposed methodology can be considered highly reliable, since the diagnosis is based on the whole chromatographic profile rather than on the detection of a single analyte. © 2013 Elsevier B.V.
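The discriminant idea can be sketched with a one-component PLS-DA on simulated "chromatograms"; the channel indices, peak sizes, and sample counts below are invented for illustration, and the paper's full U-PLS-DA with OSC is not reproduced:

```python
import numpy as np

def pls_da_1comp(X, y):
    """One-component PLS-DA: the weight vector is the covariance of the
    centred data with the (centred) class labels; scores are projections."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    w = Xc.T @ yc
    w /= np.linalg.norm(w)
    return w, Xc @ w

# Hypothetical 'chromatograms': 60 channels; infected leaves carry two
# extra marker peaks at channels 10 and 40
rng = np.random.default_rng(6)
healthy = rng.normal(0.0, 0.1, size=(20, 60))
infected = rng.normal(0.0, 0.1, size=(20, 60))
infected[:, [10, 40]] += 1.0
X = np.vstack([healthy, infected])
y = np.r_[np.zeros(20), np.ones(20)]
w, scores = pls_da_1comp(X, y)
acc = ((scores > 0).astype(float) == y).mean()
markers = set(np.argsort(np.abs(w))[-2:])   # channels driving the separation
```

Inspecting the largest weights recovers the channels responsible for the class separation, which mirrors how the paper reads candidate biomarkers off the model rather than from a single analyte.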
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)