946 results for user data
Abstract:
This work proposes a method based on preprocessing and data mining to identify harmonic current sources in residential consumers. The methodology can also be applied to identify linear and nonlinear loads. It should be emphasized that the entire database was obtained through laboratory tests, i.e., real data were acquired from residential loads. The residential system created in the laboratory was fed by a configurable power source, and the loads and power quality analyzers were placed at its output (all measurements were stored on a microcomputer). The data were then submitted to preprocessing based on attribute selection techniques in order to reduce the complexity of identifying the loads. A new database was generated containing only the selected attributes, and Artificial Neural Networks were trained to perform the load identification. To validate the proposed methodology, the loads were fed both under ideal conditions (without harmonics) and with harmonic voltages within pre-established limits. These limits are in accordance with IEEE Std. 519-1992 and PRODIST (the procedures for energy delivery employed by Brazilian utilities). The results validate the proposed methodology and furnish a method that can serve as an alternative to conventional approaches.
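The attribute-selection step described in this abstract can be sketched as below; the scoring criterion (a simple class-separability ratio) and all names are illustrative assumptions, not the paper's actual method or data.

```python
import numpy as np

def rank_attributes(X, y):
    """Rank attributes by a simple class-separability score:
    variance of the per-class means divided by the mean
    within-class variance (higher = more discriminative)."""
    classes = np.unique(y)
    scores = []
    for j in range(X.shape[1]):
        col = X[:, j]
        class_means = np.array([col[y == c].mean() for c in classes])
        within = np.mean([col[y == c].var() + 1e-12 for c in classes])
        scores.append(class_means.var() / within)
    return np.argsort(scores)[::-1]  # best attribute first

# Synthetic example: attribute 0 separates the two load classes,
# attribute 1 is pure noise.
rng = np.random.default_rng(0)
y = np.repeat([0, 1], 50)
X = np.column_stack([y + 0.1 * rng.standard_normal(100),
                     rng.standard_normal(100)])
order = rank_attributes(X, y)
print(order[0])  # the informative attribute ranks first
```

A reduced database keeping only the top-ranked attributes would then be used to train the classifier.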
Abstract:
The central issue for pillar design in underground coal mining is the in situ uniaxial compressive strength (sigma_cm). The paper proposes a new method for estimating the in situ uniaxial compressive strength of coal seams based on laboratory strength and P-wave propagation velocity. It describes the collection of samples in the Bonito coal seam, Fontanella Mine, southern Brazil, the techniques used for the structural mapping of the coal seam and the determination of seismic wave propagation velocity, as well as the laboratory procedures used to determine the strength and ultrasonic wave velocity. The results obtained using the new methodology are compared with those from seven other techniques for estimating in situ rock mass uniaxial compressive strength.
Abstract:
Artificial neural networks have been used to analyze a number of engineering problems, including settlement caused by different tunneling methods in various types of ground mass. This paper focuses on settlement over shotcrete-supported tunnels on Sao Paulo subway line 2 (West Extension) that were excavated in Tertiary sediments using the sequential excavation method. The adjusted network is a good tool for predicting settlement above new tunnels to be excavated in similar conditions. The influence of network training parameters on the quality of results is also discussed. (C) 2007 Elsevier Ltd. All rights reserved.
Abstract:
One goal of e-learning environments is to address the individual needs of students during the learning process. The adaptation of contents, activities and tools into different visualizations or into a variety of content types is an important feature of such environments, giving users the sense that the system offers workplaces suited to their profiles. Nevertheless, achieving an efficient personalization process requires investigating aspects of student behaviour, considering the context in which the interaction happens. The goal of this paper is to present an approach to identifying the student learning profile by analyzing the context of interaction. Moreover, analyzing the learning profile along different dimensions allows the system to deal with the different foci of learning.
Abstract:
This paper presents results of laboratory testing of unrestrained drying shrinkage, over a period of 154 days, of different concrete mixtures from the Brazilian production line that utilize ground granulated blast-furnace slag in their compositions. Three concrete mixtures, with water/cement ratios of 0.78 (M1), 0.41 (M2), and 0.37 (M3), were studied. The experimental data were compared with analytical results from prediction models available in the literature: the ACI 209 model (ACI), the B3 model (B3), the Eurocode 2 model (EC2), the GL 2000 model (GL), and the Brazilian NBR 6118 model (NBR), and the efficacy of these models was analyzed against the experimental data. In addition, the development of the mechanical properties (compressive strength and modulus of elasticity) of the studied concrete mixtures was measured in the laboratory up to 126 days. From this study, it could be concluded that the ACI and the GL models best approximated the experimental drying shrinkage data measured during the analyzed period.
Abstract:
This paper describes the development of an optimization model for the management and operation of a large-scale, multireservoir water supply distribution system with preemptive priorities. The model considers multiobjectives and hedging rules. During periods of drought, when water supply is insufficient to meet the planned demand, appropriate rationing factors are applied to reduce water supply. In this paper, a water distribution system is formulated as a network and solved by the GAMS modeling system for mathematical programming and optimization. A user-friendly interface is developed to facilitate the manipulation of data and to generate graphs and tables for decision makers. The optimization model and its interface form a decision support system (DSS), which can be used to configure a water distribution system to facilitate capacity expansion and reliability studies. Several examples are presented to demonstrate the utility and versatility of the developed DSS under different supply and demand scenarios, including applications to one of the largest water supply systems in the world, the Sao Paulo Metropolitan Area Water Supply Distribution System in Brazil.
Abstract:
Solid-liquid phase equilibrium modeling of triacylglycerol mixtures is essential for lipid design. Considering the alpha polymorph and the liquid phase as ideal, the Margules 2-suffix excess Gibbs energy model with predictive binary parameter correlations describes the non-ideal beta and beta' solid polymorphs. Solving by direct optimization of the Gibbs free energy enables one to predict, from a bulk mixture composition, the phase compositions at a given temperature and thus the SFC curve, the melting profile and the Differential Scanning Calorimetry (DSC) curve, which are related to end-user lipid properties. Phase diagram, SFC and DSC curve experimental data are qualitatively and quantitatively well predicted for the binary mixture of 1,3-dipalmitoyl-2-oleoyl-sn-glycerol (POP) and 1,2,3-tripalmitoyl-sn-glycerol (PPP), the ternary mixture of 1,3-dimyristoyl-2-palmitoyl-sn-glycerol (MPM), 1,2-distearoyl-3-oleoyl-sn-glycerol (SSO) and 1,2,3-trioleoyl-sn-glycerol (OOO), and for palm oil and cocoa butter. Then, the addition of Medium-Long-Medium type structured lipids to palm oil is evaluated, using caprylic acid as the medium chain and long-chain fatty acids (EPA, eicosapentaenoic acid; DHA, docosahexaenoic acid; gamma-linolenic, octadecatrienoic acid; and AA, arachidonic acid) as sn-2 substitutes. EPA, DHA and AA increase the melting range on both the fusion and crystallization sides. Gamma-linolenic acid shifts the melting range upwards. This predictive tool is useful for the pre-screening of lipids matching desired properties set a priori.
Abstract:
Thermodynamic properties of bread dough (fusion enthalpy, apparent specific heat, initial freezing point and unfreezable water) were measured at temperatures from −40 °C to 35 °C using differential scanning calorimetry. The initial freezing point was also calculated from the water activity of the dough. The apparent specific heat varied as a function of temperature: in the freezing region it ranged from 1.7 to 23.1 J g⁻¹ °C⁻¹, and was constant at temperatures above freezing (2.7 J g⁻¹ °C⁻¹). Unfreezable water content varied from 0.174 to 0.182 g/g of total product. Values of heat capacity as a function of temperature were correlated using thermodynamic models. A modification for low-moisture foodstuffs (such as bread dough) was successfully applied to the experimental data. (C) 2010 Elsevier Ltd. All rights reserved.
Abstract:
This paper presents an analysis of the performance of a baseband multiple-input single-output (MISO) time reversal ultra-wideband system (TR-UWB) incorporating a symbol spaced decision feedback equalizer (DFE). A semi-analytical performance analysis based on a Gaussian approach is considered, which matched well with simulation results, even for the DFE case. The channel model adopted is based on the IEEE 802.15.3a model, considering correlated shadowing across antenna elements. In order to provide a more realistic analysis, channel estimation errors are considered for the design of the TR filter. A guideline for the choice of equalizer length is provided. The results show that the system's performance improves with an increase in the number of transmit antennas and when a symbol spaced equalizer is used with a relatively small number of taps compared to the number of resolvable paths in the channel impulse response. Moreover, it is possible to conclude that due to the time reversal scheme, the error propagation in the DFE does not play a role in the system's performance.
Abstract:
This work proposes the use of evolutionary computation to jointly solve the multiuser channel estimation (MuChE) and maximum-likelihood detection problems in direct sequence code division multiple access (DS/CDMA) systems. The effectiveness of the proposed heuristic approach is demonstrated by comparing performance and complexity figures of merit with those obtained by traditional methods found in the literature. Simulation results for a genetic algorithm (GA) applied to multipath DS/CDMA channel estimation and multi-user detection (MuD) show that the proposed genetic algorithm multi-user channel estimation (GAMuChE) yields a normalized mean square estimation error (nMSE) below 11% under slowly varying multipath fading channels, a large range of Doppler frequencies and medium system load, while exhibiting lower complexity than both maximum likelihood multi-user channel estimation (MLMuChE) and the gradient descent method (GrdDsc). A near-optimum multi-user detector based on the genetic algorithm (GAMuD), also proposed in this work, provides a significant reduction in computational complexity compared to the optimum multi-user detector (OMuD). In addition, the complexity of the GAMuChE and GAMuD algorithms was analyzed jointly in terms of the number of operations necessary to reach convergence, and compared to other joint MuChE and MuD strategies. The joint GAMuChE-GAMuD scheme can be regarded as a promising alternative for implementing third-generation (3G) and fourth-generation (4G) wireless systems in the near future. Copyright (C) 2010 John Wiley & Sons, Ltd.
Abstract:
In this work, an axisymmetric two-dimensional finite element model was developed to simulate instrumented indentation testing of thin ceramic films deposited onto hard steel substrates. The level of film residual stress (sigma(r)), the film elastic modulus (E) and the film work hardening exponent (n) were varied to analyze their effects on indentation data. These numerical results were used to analyze experimental data obtained with titanium nitride coated specimens, in which the substrate bias applied during deposition was modified to obtain films with different levels of sigma(r). Good qualitative correlation was obtained when numerical and experimental results were compared, as long as all film properties were considered in the analyses, and not only sigma(r). The numerical analyses were also used to further understand the effect of sigma(r) on the mechanical properties calculated from instrumented indentation data. In this case, the hardness values obtained based on real or calculated contact areas are similar only when sink-in occurs, i.e. with high n or a high Y/E ratio, where Y is the yield strength of the film. In an additional analysis, four ratios (R/h(max)) between indenter tip radius and maximum penetration depth were simulated to analyze the combined effects of R and sigma(r) on the indentation load-displacement curves. In this case, sigma(r) did not significantly affect the load curve exponent, which was affected only by the indenter tip radius. On the other hand, the proportional curvature coefficient was significantly affected by sigma(r) and n. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
For the first time, we introduce and study some mathematical properties of the Kumaraswamy Weibull distribution, which is a quite flexible model for analyzing positive data. It contains as special sub-models the exponentiated Weibull, exponentiated Rayleigh, exponentiated exponential and Weibull distributions, and also the new Kumaraswamy exponential distribution. We provide explicit expressions for the moments and the moment generating function. We examine the asymptotic distributions of the extreme values. Explicit expressions are derived for the mean deviations, Bonferroni and Lorenz curves, reliability and Renyi entropy. The moments of the order statistics are calculated. We also discuss estimation of the parameters by maximum likelihood and obtain the expected information matrix. We provide applications involving two real data sets on failure times. Finally, some multivariate generalizations of the Kumaraswamy Weibull distribution are discussed. (C) 2010 The Franklin Institute. Published by Elsevier Ltd. All rights reserved.
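The Kumaraswamy-G construction this abstract builds on takes a baseline CDF G and forms F(x) = 1 − (1 − G(x)^a)^b. A minimal sketch, assuming a Weibull baseline G(x) = 1 − exp(−(λx)^c) (the specific parameterization is an assumption here):

```python
import math

def kw_weibull_cdf(x, a, b, lam, c):
    """Kumaraswamy Weibull CDF: F(x) = 1 - (1 - G(x)**a)**b,
    with Weibull baseline G(x) = 1 - exp(-(lam*x)**c)."""
    if x <= 0:
        return 0.0
    g = 1.0 - math.exp(-(lam * x) ** c)
    return 1.0 - (1.0 - g ** a) ** b

# Sanity check: with a = b = 1 the model reduces to the
# ordinary Weibull CDF, one of the special sub-models above.
x, lam, c = 2.0, 0.5, 1.5
weibull = 1.0 - math.exp(-(lam * x) ** c)
print(abs(kw_weibull_cdf(x, 1.0, 1.0, lam, c) - weibull) < 1e-12)  # True
```

The extra shape parameters a and b are what give the family its flexibility over the plain Weibull.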
Abstract:
Estimation of Taylor's power law for species abundance data may be performed by linear regression of the log empirical variances on the log means, but this method suffers from a problem of bias for sparse data. We show that the bias may be reduced by using a bias-corrected Pearson estimating function. Furthermore, we investigate a more general regression model allowing for site-specific covariates. This method may be efficiently implemented using a Newton scoring algorithm, with standard errors calculated from the inverse Godambe information matrix. The method is applied to a set of biomass data for benthic macrofauna from two Danish estuaries. (C) 2011 Elsevier B.V. All rights reserved.
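Taylor's power law states that variance ≈ a·mean^b, so the naive estimator the abstract criticizes is an ordinary least-squares fit of log variance on log mean. A minimal sketch of that baseline (not the bias-corrected Pearson estimating function the paper proposes):

```python
import numpy as np

def taylor_loglog_fit(means, variances):
    """Naive Taylor's power law fit: regress log(variance)
    on log(mean). Returns (log_a, b) for variance = a * mean**b."""
    b, log_a = np.polyfit(np.log(means), np.log(variances), 1)
    return log_a, b

# Synthetic check: variances generated exactly as 2 * mean**1.8,
# so the fit should recover b = 1.8 and a = 2.
means = np.array([1.0, 2.0, 5.0, 10.0, 20.0])
variances = 2.0 * means ** 1.8
log_a, b = taylor_loglog_fit(means, variances)
print(round(b, 3), round(float(np.exp(log_a)), 3))  # 1.8 2.0
```

On sparse data the empirical log variances are biased, which is exactly what motivates the paper's corrected estimating-function approach.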
Abstract:
Interval-censored survival data, in which the event of interest is not observed exactly but is only known to occur within some time interval, arise very frequently. In some situations, event times might be censored into different, possibly overlapping intervals of variable widths; in other situations, information is available for all units at the same observed visit time. In the latter case, interval-censored data are termed grouped survival data. Here we present alternative approaches for analyzing interval-censored data. We illustrate these techniques using a survival data set involving mango tree lifetimes; this study is an example of grouped survival data.
Abstract:
This paper proposes a regression model based on the modified Weibull distribution, which can be used to model bathtub-shaped failure rate functions. Assuming censored data, we consider maximum likelihood and jackknife estimators for the model parameters. We derive the appropriate matrices for assessing local influence on the parameter estimates under different perturbation schemes, and we also present some ways to perform global influence analysis. In addition, for different parameter settings, sample sizes and censoring percentages, various simulations are performed, and the empirical distribution of the modified deviance residual is displayed and compared with the standard normal distribution. These studies suggest that the residual analysis usually performed in normal linear regression models can be straightforwardly extended to a martingale-type residual in log-modified Weibull regression models with censored data. Finally, we analyze a real data set under log-modified Weibull regression models. A diagnostic analysis and a model check based on the modified deviance residual are performed to select appropriate models. (c) 2008 Elsevier B.V. All rights reserved.
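The bathtub-shaped failure rate mentioned above can be illustrated with the modified Weibull hazard; the sketch below assumes the common Lai-Xie-Murthy parameterization F(t) = 1 − exp(−a t^b e^{λt}) (an assumption, since the abstract does not state one), whose hazard is bathtub-shaped when b < 1 and λ > 0.

```python
import math

def modified_weibull_hazard(t, a, b, lam):
    """Hazard of the modified Weibull distribution in the form
    F(t) = 1 - exp(-a * t**b * exp(lam*t)):
    h(t) = a * (b + lam*t) * t**(b-1) * exp(lam*t)."""
    return a * (b + lam * t) * t ** (b - 1) * math.exp(lam * t)

# With b < 1 and lam > 0 the hazard first decreases (infant
# mortality), passes through a minimum, then increases (wear-out).
a, b, lam = 0.1, 0.5, 0.2
early, middle, late = (modified_weibull_hazard(t, a, b, lam)
                       for t in (0.01, 1.0, 10.0))
print(early > middle and late > middle)  # True
```

A regression model would then link covariates to these parameters, typically on the log-lifetime scale as in the paper's log-modified Weibull formulation.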