22 results for Power series models
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo
Abstract:
Stochastic methods based on time-series modeling combined with geostatistics can be useful tools to describe the variability of water-table levels in time and space and to account for uncertainty. Water-level monitoring networks can give information about the dynamics of the aquifer domain in both dimensions. Time-series modeling is an elegant way to treat monitoring data without the complexity of physical mechanistic models. Time-series model predictions can be interpolated spatially, with the spatial differences in water-table dynamics determined by the spatial variation in the system properties and the temporal variation driven by the dynamics of the inputs into the system. An integration of stochastic methods is presented, based on time-series modeling and geostatistics, as a framework to predict water levels for decision making in groundwater management and land-use planning. The methodology is applied in a case study in a Guarani Aquifer System (GAS) outcrop area located in southeastern Brazil. Communication of results in a clear and understandable form, via simulated scenarios, is discussed as an alternative when translating scientific knowledge into applications of stochastic hydrogeology in large aquifers with limited monitoring-network coverage, such as the GAS.
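The two-stage pipeline the abstract describes — fit a time-series model at each monitored well, then interpolate the model predictions spatially — can be sketched as follows. This is an illustrative stand-in (AR(1) fits and inverse-distance weighting on synthetic data), not the authors' actual model or the GAS monitoring data.

```python
import numpy as np

rng = np.random.default_rng(0)

def ar1_predict(series):
    """Fit an AR(1) model by least squares and predict one step ahead."""
    x, y = series[:-1], series[1:]
    xc, yc = x - x.mean(), y - y.mean()
    phi = np.dot(xc, yc) / np.dot(xc, xc)
    c = y.mean() - phi * x.mean()
    return c + phi * series[-1]

def idw(coords, values, target, p=2.0):
    """Inverse-distance-weighted spatial interpolation at `target`."""
    d = np.linalg.norm(coords - target, axis=1)
    w = 1.0 / np.maximum(d, 1e-9) ** p
    return float(np.sum(w * values) / np.sum(w))

# Synthetic monitoring network: 3 wells (coordinates in km), levels in m
coords = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
levels = np.array([10.0 + off + np.cumsum(rng.normal(0, 0.1, 60))
                   for off in (0.0, 1.0, 2.0)])

# Step 1: temporal prediction at each well; step 2: spatial interpolation
preds = np.array([ar1_predict(s) for s in levels])
level_at_new_site = idw(coords, preds, target=np.array([0.5, 0.5]))
```

A geostatistical treatment would replace `idw` with kriging, which also yields the prediction uncertainty the abstract emphasizes.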
Abstract:
The theoretical E-curve for the laminar flow of non-Newtonian fluids in circular tubes may not be accurate for real tubular systems with diffusion, mechanical vibration, wall roughness, pipe fittings, curves, coils, or corrugated walls. Deviations from the idealized laminar flow reactor (LFR) cannot be well represented using the axial dispersion or tanks-in-series models of residence time distribution (RTD). In this work, four RTD models derived from non-ideal velocity profiles in segregated tube flow are proposed. They were used to represent the RTD of three tubular systems working with Newtonian and pseudoplastic fluids. Other RTD models were considered for comparison. The proposed models provided good fits, and it was possible to determine the active volumes. These models are expected to be useful for the analysis of LFRs and for the evaluation of continuous thermal processing of viscous foods.
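For reference, the idealized baseline that real systems deviate from — segregated laminar flow of a Newtonian fluid in a circular tube — has the closed-form dimensionless RTD E(θ) = 1/(2θ³) for θ ≥ 1/2, with θ = t/t̄. A minimal numerical check that this E-curve integrates to one:

```python
import numpy as np

def e_curve_laminar(theta):
    """Dimensionless RTD of ideal segregated laminar Newtonian tube flow:
    E(θ) = 1/(2 θ³) for θ >= 0.5, and zero before θ = 0.5, the earliest
    exit time of the fastest (centerline) fluid element; θ = t / t_mean."""
    theta = np.asarray(theta, dtype=float)
    return np.where(theta >= 0.5, 1.0 / (2.0 * theta**3), 0.0)

# E(θ) is a probability density, so its area must be 1
theta = np.linspace(0.01, 200.0, 400_000)
area = np.trapz(e_curve_laminar(theta), theta)
```

The paper's four proposed models replace this ideal velocity profile with non-ideal ones; the check above only fixes the textbook reference case.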
Abstract:
Assessing variance components is essential when deciding on the inclusion of random effects in the context of mixed models. In this work we discuss this problem for nonlinear elliptical models for correlated data, using the score-type test proposed in Silvapulle and Silvapulle (1995). Being asymptotically equivalent to the likelihood ratio test and requiring estimation only under the null hypothesis, this test provides an easily computable alternative for assessing one-sided hypotheses in the context of the marginal model. To allow for possible non-normality, we assume that the joint distribution of the response variable and the random effects lies in the elliptical class, which includes light-tailed and heavy-tailed distributions such as the Student-t, power exponential, logistic, generalized Student-t, generalized logistic, contaminated normal, and the normal itself, among others. We compare the sensitivity of the score-type test under normal, Student-t, and power exponential models for the kinetics data set discussed in Vonesh and Carter (1992) and fitted using the model presented in Russo et al. (2009). A simulation study is also performed to analyze the consequences of kurtosis misspecification.
Abstract:
We discuss a new interacting model for the cosmological dark sector in which the attenuated dilution of cold dark matter scales as a⁻³f(a), where f(a) is an arbitrary function of the cosmic scale factor a. From thermodynamic arguments, we show that f(a) is proportional to the entropy source of the particle creation process. In order to investigate the cosmological consequences of this kind of interacting model, we expand f(a) in a power series, and viable cosmological solutions are obtained. Finally, we use current observational data to place constraints on the interacting function f(a).
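The power-series idea can be illustrated schematically. The expansion variable and coefficients below are hypothetical (the paper's specific parametrization is not reproduced); the sketch only shows how a truncated series deforms the standard a⁻³ dilution while keeping f(1) = 1 today.

```python
import numpy as np

def f_series(a, coeffs):
    """Truncated power series f(a) = 1 + sum_n c_n (1 - a)^(n+1),
    normalized so that f(a=1) = 1 (hypothetical parametrization)."""
    a = np.asarray(a, dtype=float)
    return 1.0 + sum(c * (1.0 - a) ** (n + 1) for n, c in enumerate(coeffs))

def rho_dm(a, rho0, coeffs):
    """Attenuated cold-dark-matter dilution: rho(a) = rho0 * a⁻³ * f(a)."""
    return rho0 * np.asarray(a, dtype=float) ** -3.0 * f_series(a, coeffs)
```

With `coeffs = []` this reduces to the ordinary a⁻³ dilution of non-interacting cold dark matter, which is the limit the observational constraints are measured against.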
Abstract:
This paper presents an extension of the Eneström-Kakeya theorem concerning the roots of a polynomial that arises from the analysis of the stability of Brown (K, L) methods. The generalization relaxes one of the inequalities on the coefficients of the polynomial. Two results concerning the zeros of polynomials are proved, one of them providing a partial answer to a conjecture by Meneguette (1994) [6]. © 2011 Elsevier Inc. All rights reserved.
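The classical Eneström-Kakeya theorem that the extension starts from states: if 0 < a₀ ≤ a₁ ≤ … ≤ aₙ, then every zero of Σ aₖ zᵏ lies in the closed unit disk. A quick numerical check of this baseline statement (not of the paper's relaxed version):

```python
import numpy as np

def ek_zeros_in_unit_disk(coeffs_low_to_high):
    """Numerically verify the Eneström-Kakeya conclusion: for coefficients
    0 < a_0 <= a_1 <= ... <= a_n, all zeros of sum a_k z^k satisfy |z| <= 1."""
    a = np.asarray(coeffs_low_to_high, dtype=float)
    if not (np.all(a > 0) and np.all(np.diff(a) >= 0)):
        raise ValueError("coefficients must be positive and nondecreasing")
    roots = np.roots(a[::-1])      # np.roots expects highest degree first
    return bool(np.all(np.abs(roots) <= 1.0 + 1e-9))
```

The paper's generalization relaxes one of the monotonicity inequalities, so the hypothesis check above would be weakened accordingly.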
Abstract:
To estimate causal relationships, time-series econometricians must be aware of spurious correlation, a problem first mentioned by Yule (1926). To deal with this problem, one can work either with differenced series or with multivariate models: VAR (VEC or VECM) models. These models usually include at least one cointegration relation. Although the Bayesian literature on VAR/VEC models is quite advanced, Bauwens et al. (1999) highlighted that "the topic of selecting the cointegrating rank has not yet given very useful and convincing results". The present article applies the Full Bayesian Significance Test (FBST), especially designed to deal with sharp hypotheses, to cointegration rank selection tests in VECM time-series models. It illustrates the FBST implementation using both simulated data sets and data sets available in the literature. For illustration, standard noninformative priors are used.
Abstract:
In this paper we obtain asymptotic expansions, up to order n^(-1/2) and under a sequence of Pitman alternatives, for the nonnull distribution functions of the likelihood ratio, Wald, score, and gradient test statistics in the class of symmetric linear regression models. This is a wide class of models which encompasses the t model and several other symmetric distributions with longer-than-normal tails. The asymptotic distributions of all four statistics are obtained for testing a subset of regression parameters. Furthermore, in order to compare the finite-sample performance of these tests in this class of models, Monte Carlo simulations are presented. An empirical application to a real data set is considered for illustrative purposes. © 2011 Elsevier B.V. All rights reserved.
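As a concrete baseline for one of the four statistics, the likelihood-ratio test of a subset of regression coefficients under normal errors (the simplest member of the symmetric class) can be computed as below; a t-model version would swap in the Student-t likelihood. The data are synthetic.

```python
import numpy as np
from scipy import stats

def lr_test_subset(X_full, X_reduced, y):
    """Likelihood-ratio test of H0: the coefficients in X_full but not in
    X_reduced are zero, under Gaussian errors.
    LR = n * log(RSS_reduced / RSS_full) ~ chi2(df) asymptotically under H0."""
    n = len(y)
    def rss(X):
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return np.sum((y - X @ beta) ** 2)
    lr = n * np.log(rss(X_reduced) / rss(X_full))
    df = X_full.shape[1] - X_reduced.shape[1]
    return lr, stats.chi2.sf(lr, df)

rng = np.random.default_rng(3)
n = 200
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 1.0 + 2.0 * x1 + 1.5 * x2 + rng.normal(size=n)
X_full = np.column_stack([np.ones(n), x1, x2])
X_red = np.column_stack([np.ones(n), x1])
lr, p = lr_test_subset(X_full, X_red, y)   # x2 matters, so p should be tiny
```

The paper's expansions refine exactly this kind of chi-squared approximation under local (Pitman) alternatives.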
Abstract:
Aldolase has emerged as a promising molecular target for the treatment of human African trypanosomiasis. In recent years, due to the increasing number of patients infected with Trypanosoma brucei, there is an urgent need for new drugs to treat this neglected disease. In the present study, two-dimensional fragment-based quantitative structure-activity relationship (QSAR) models were generated for a series of inhibitors of aldolase. Through the application of leave-one-out and leave-many-out cross-validation procedures, significant correlation coefficients were obtained (r² = 0.98 and q² = 0.77), indicating the internal and external statistical consistency of the models. The best model was employed to predict pK_i values for a series of test set compounds, and the predicted values were in good agreement with the experimental results, showing the predictive power of the model for untested compounds. Moreover, structure-based molecular modeling studies were performed to investigate the binding mode of the inhibitors in the active site of the parasitic target enzyme. The structural and QSAR results provided useful molecular information for the design of new aldolase inhibitors within this structural class.
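The leave-one-out q² statistic quoted above can be computed generically as below. This is plain least squares on synthetic descriptors, an illustrative stand-in; the paper's fragment-based descriptors and fitted model are not reproduced.

```python
import numpy as np

def loo_q2(X, y):
    """Leave-one-out cross-validated q²: refit the model n times, each time
    predicting the held-out compound, then score against the response mean."""
    n = len(y)
    preds = np.empty(n)
    for i in range(n):
        keep = np.arange(n) != i
        beta, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
        preds[i] = X[i] @ beta
    return 1.0 - np.sum((y - preds) ** 2) / np.sum((y - y.mean()) ** 2)

rng = np.random.default_rng(7)
n, p = 40, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p))])
beta_true = np.array([6.0, 1.0, -0.5, 0.8])   # synthetic pKi-like response
y = X @ beta_true + rng.normal(0.0, 0.2, size=n)
q2 = loo_q2(X, y)
```

Leave-many-out validation works the same way, holding out a random group of compounds per refit instead of one.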
Abstract:
We derive asymptotic expansions for the nonnull distribution functions of the likelihood ratio, Wald, score, and gradient test statistics in the class of dispersion models, under a sequence of Pitman alternatives. The asymptotic distributions of these statistics are obtained for testing a subset of regression parameters and for testing the precision parameter. Based on these nonnull asymptotic expansions, the power of all four tests, which are equivalent to first order, is compared. Furthermore, in order to compare the finite-sample performance of these tests in this class of models, Monte Carlo simulations are presented. An empirical application to a real data set is considered for illustrative purposes. © 2012 Elsevier B.V. All rights reserved.
Abstract:
In this work we compare the parameter estimates of ARCH models obtained using a complete Bayesian method and an empirical Bayesian method, adopting a non-informative prior distribution and an informative prior distribution, respectively. We also consider a reparameterization of those models that maps the parameter space into real space; this procedure permits choosing normal prior distributions for the transformed parameters. The posterior summaries were obtained using Markov chain Monte Carlo (MCMC) methods. The methodology was evaluated on the Telebras series from the Brazilian financial market. The results show that both methods are able to fit ARCH models with different numbers of parameters. The empirical Bayesian method provided a more parsimonious model and a better fit to the data than the complete Bayesian method.
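The reparameterization idea — map the constrained ARCH parameters to the real line so that normal priors can be placed on the transformed scale — can be sketched for an ARCH(1) model with a random-walk Metropolis sampler. The simulated data, priors, and tuning below are illustrative, not the Telebras analysis.

```python
import numpy as np

rng = np.random.default_rng(42)

def to_natural(psi):
    """Map unconstrained psi to ARCH(1) parameters:
    omega = exp(psi[0]) > 0, alpha = logistic(psi[1]) in (0, 1)."""
    return np.exp(psi[0]), 1.0 / (1.0 + np.exp(-psi[1]))

def arch1_loglik(y, omega, alpha):
    """Gaussian ARCH(1) log-likelihood: sigma_t^2 = omega + alpha*y_{t-1}^2."""
    s2 = omega + alpha * y[:-1] ** 2
    return -0.5 * np.sum(np.log(2.0 * np.pi * s2) + y[1:] ** 2 / s2)

# Simulate an ARCH(1) series with omega = 0.2, alpha = 0.5
n = 2000
y = np.zeros(n)
for t in range(1, n):
    y[t] = rng.normal() * np.sqrt(0.2 + 0.5 * y[t - 1] ** 2)

# Random-walk Metropolis with N(0, 10^2) priors on the transformed scale
def log_post(psi):
    omega, alpha = to_natural(psi)
    return arch1_loglik(y, omega, alpha) - np.sum(psi ** 2) / (2.0 * 100.0)

psi = np.zeros(2)
lp = log_post(psi)
draws = []
for _ in range(4000):
    prop = psi + 0.15 * rng.normal(size=2)
    lpp = log_post(prop)
    if np.log(rng.random()) < lpp - lp:   # Metropolis accept/reject
        psi, lp = prop, lpp
    draws.append(to_natural(psi))
post = np.array(draws[1000:])             # discard burn-in
omega_hat, alpha_hat = post.mean(axis=0)
```

Shrinking the prior variance toward values estimated from the data would turn this complete-Bayes sketch into the empirical-Bayes variant compared in the paper.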
Abstract:
We study a model of fast magnetic reconnection in the presence of weak turbulence proposed by Lazarian and Vishniac (1999) using three-dimensional direct numerical simulations. The model has already been successfully tested in Kowal et al. (2009), confirming the dependencies of the reconnection speed V_rec on the turbulence injection power P_inj and the injection scale l_inj expressed by the constraint V_rec ∝ P_inj^(1/2) l_inj^(3/4), with no observed dependence on Ohmic resistivity. In Kowal et al. (2009), in order to drive turbulence, we injected velocity fluctuations in Fourier space with frequencies concentrated around k_inj = 1/l_inj, as described in Alvelius (1999). In this paper, we extend our previous studies by comparing fast magnetic reconnection under different mechanisms of turbulence injection, introducing a new way of driving turbulence. The new method injects velocity or magnetic eddies with a specified amplitude and scale at random locations directly in real space. We provide exact relations between the eddy parameters and the turbulent power and injection scale. We performed simulations with the new forcing in order to study the dependencies on turbulent power and injection scale. The results show no discrepancy between models with the two different methods of turbulence driving, which exhibit the same scalings in both cases. This is in agreement with the predictions of Lazarian and Vishniac (1999). In addition, we performed a series of models with varying viscosity ν. Although Lazarian and Vishniac (1999) do not provide any prediction for this dependence, we report a weak relation between the reconnection speed and viscosity, V_rec ∝ ν^(-1/4).
Abstract:
The objective of this work was to evaluate extreme water-table depths in a watershed, using methods for geographical spatial data analysis. Groundwater spatio-temporal dynamics were evaluated in an outcrop of the Guarani Aquifer System. Water-table depths were estimated from the monitoring of water levels in 23 piezometers and from time-series modeling of data available from April 2004 to April 2011. For the generation of spatial scenarios, geostatistical techniques were used that incorporated into the prediction ancillary information related to the geomorphological patterns of the watershed, using a digital elevation model. This procedure improved the estimates, due to the high correlation between water levels and elevation, and added physical meaning to the predictions. The scenarios showed differences regarding the extreme levels (too deep or too shallow) and can support water planning, efficient water use, and sustainable water management in the watershed.
Abstract:
In this paper, a modeling technique for small-signal stability assessment of unbalanced power systems is presented. Since power distribution systems are inherently unbalanced, due to the characteristics of their lines and loads, and the penetration of distributed generation into these systems is increasing, such a tool is needed in order to ensure their secure and reliable operation. The main contribution of this paper is the development of a phasor-based model for the study of dynamic phenomena in unbalanced power systems. Using an assumption on the net torque of the generator, it is possible to precisely define an equilibrium point for the phasor model of the system, thus enabling its linearization around this point and, consequently, its eigenvalue/eigenvector analysis for small-signal stability assessment. The modeling technique presented here was compared to the dynamic behavior observed in ATP simulations, and the results show that, for the generator and controller models used, the proposed modeling approach is adequate and yields reliable and precise results.
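The linearize-then-eigenanalyze step can be illustrated with the classical (balanced, single-machine) swing equation rather than the paper's unbalanced phasor model: find the equilibrium point, form the Jacobian there, and inspect the real parts of its eigenvalues. All parameter values below are illustrative.

```python
import numpy as np

# Classical swing model (illustrative stand-in, not the paper's model):
#   dδ/dt = ω
#   M dω/dt = Pm - Pmax*sin(δ) - D*ω
M, D, Pm, Pmax = 0.1, 0.05, 0.8, 1.0

# Equilibrium point: Pm = Pmax*sin(δ), ω = 0
delta_eq = np.arcsin(Pm / Pmax)

# Jacobian of the right-hand side evaluated at the equilibrium
J = np.array([[0.0, 1.0],
              [-Pmax * np.cos(delta_eq) / M, -D / M]])

# Small-signal stability: all eigenvalues in the open left half-plane
eigs = np.linalg.eigvals(J)
small_signal_stable = bool(np.all(eigs.real < 0.0))
```

In the paper's unbalanced setting the nontrivial step is precisely defining `delta_eq`: the net-torque assumption is what makes a well-defined equilibrium, and hence this linearization, possible.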
Abstract:
The scope of this paper was to analyze the association between homicides and public security indicators in São Paulo between 1996 and 2008, after controlling for the unemployment rate and the proportion of youths in the population. A time-series ecological study for 1996-2008 was conducted with São Paulo as the unit of analysis. Dependent variable: number of deaths by homicide per year. Main independent variables: arrest-incarceration rate, access to firearms, and police activity. Data analysis was conducted using Stata IC 10.0 software. Simple and multivariate negative binomial regression models were created. Deaths by homicide and arrest-incarceration, as well as police activity, were significantly associated in the simple regression analysis. Access to firearms was not significantly associated with the reduction in the number of deaths by homicide (p > 0.05). After adjustment, the associations with both public security indicators were not significant. In São Paulo, the role of public security indicators is less important as an explanatory factor for the reduction in homicide rates, after adjustment for the unemployment rate and the reduction in the proportion of youths. The results reinforce the importance of socioeconomic and demographic factors for the change in the public security scenario in São Paulo.
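A negative binomial (NB2) regression of the kind used in the paper can be fit by maximum likelihood as sketched below on synthetic counts; the Stata analysis and the homicide data are not reproduced.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

def nb2_negloglik(params, X, y):
    """Negative log-likelihood of NB2 regression: mean mu = exp(X @ beta),
    variance mu + alpha*mu^2, with alpha = exp(params[-1]) > 0."""
    beta, alpha = params[:-1], np.exp(params[-1])
    mu = np.exp(X @ beta)
    inv_a = 1.0 / alpha
    ll = (gammaln(y + inv_a) - gammaln(inv_a) - gammaln(y + 1)
          + inv_a * np.log(inv_a / (inv_a + mu))
          + y * np.log(mu / (inv_a + mu)))
    return -np.sum(ll)

rng = np.random.default_rng(1)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true, alpha_true = np.array([1.0, 0.5]), 0.3
mu = np.exp(X @ beta_true)
# numpy's NB(n, p) with n = 1/alpha, p = 1/(1 + alpha*mu) has mean mu
y = rng.negative_binomial(1.0 / alpha_true, 1.0 / (1.0 + alpha_true * mu))

res = minimize(nb2_negloglik, x0=np.zeros(3), args=(X, y), method="BFGS")
beta_hat, alpha_hat = res.x[:2], np.exp(res.x[2])
```

The log-transformed dispersion keeps the optimization unconstrained, the same trick used for variance parameters in most count-regression software.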
Abstract:
This work assessed the homogeneity of the climate series of the Institute of Astronomy, Geophysics and Atmospheric Sciences (IAG) weather station, using various statistical techniques. The record from this target station is one of the longest in Brazil, having commenced in 1933 with observations of precipitation, with temperatures and other variables added later, in 1936. Thus, it is one of the few stations in Brazil with enough data for long-term climate variability and climate change studies. There is, however, a possibility that its data may have been contaminated by some artifacts over time. Admittedly, there was an intervention in the observations in 1958, with the replacement of instruments, whose impact has not yet been evaluated. The station's surroundings also changed over time from rural to urban, which may have influenced the homogeneity of the observations and makes the station less representative for climate studies over larger spatial scales. The homogeneity of the target station was assessed by applying both absolute (single-station) tests and tests relative to the regional climate, on an annual scale, for daily precipitation, relative humidity, and maximum (TMax), minimum (TMin), and wet bulb temperatures. Among these quantities, only precipitation does not exhibit any inhomogeneity. A clear signal of the change of instruments in 1958 was detected in the TMax and relative humidity data, the latter certainly because of its strong dependence on temperature. This signal is not as clear in TMin, which instead presents non-climatic discontinuities around 1953 and around 1970. A significant homogeneity break is found around 1990 for TMax and the wet bulb temperature. The discontinuities detected after 1958 may have been caused by urbanization, as the observed warming trend at the station is considerably greater than that of the regional climate.
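One standard absolute (single-station) homogeneity test, Pettitt's nonparametric change-point test, can be implemented compactly. It is shown here on a synthetic annual series with an instrument-change-like shift; the IAG data are not reproduced, and the paper does not specify that this particular test was among those applied.

```python
import numpy as np

def pettitt_test(x):
    """Pettitt's nonparametric change-point test for a single series.
    Returns (index of the most probable break, approximate p-value)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = np.sign(x[None, :] - x[:, None])           # s[i, j] = sign(x_j - x_i)
    # U_t sums the signs across the candidate split after position t
    U = np.array([s[: t + 1, t + 1:].sum() for t in range(n - 1)])
    t_hat = int(np.argmax(np.abs(U)))
    K = np.abs(U).max()
    p = min(1.0, 2.0 * np.exp(-6.0 * K**2 / (n**3 + n**2)))
    return t_hat, p

rng = np.random.default_rng(5)
# 80 synthetic "annual means" with a 1.5-unit shift after year 40
x = np.concatenate([rng.normal(20.0, 0.5, 40), rng.normal(21.5, 0.5, 40)])
t_hat, p = pettitt_test(x)
```

Relative tests work analogously but are applied to the difference (or ratio) between the target station and a regional reference series, which removes the shared climate signal.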