180 results for Lifetime warranties, Warranty policies, Cost models
Abstract:
Distribution of timing signals is an essential factor in the development of digital systems for telecommunication networks, integrated circuits and manufacturing automation. Originally, this distribution was implemented using a master-slave architecture, with a precise master clock generator sending signals to phase-locked loops (PLLs) working as slave oscillators. Nowadays, wireless networks with dynamical connectivity and the increase in size and operating frequency of integrated circuits suggest that the distribution of clock signals could be more efficient if mutually connected architectures were used. Here, mutually connected PLL networks are studied and conditions for the existence of synchronous states are analytically derived, depending on individual node parameters and network connectivity, considering that the nodes are nonlinear oscillators with nonlinear coupling. An expression for the network synchronisation frequency is obtained. The lock-in range and the transmission error bounds are analysed, providing guidance for the design of this kind of clock distribution system.
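As an illustration of the kind of dynamics involved, the sketch below simulates a fully connected network of phase oscillators as a stand-in for a mutually connected PLL network. The Kuramoto-style coupling, the all-to-all topology and every parameter value are assumptions chosen for illustration; the abstract does not give the paper's actual node or coupling equations.

```python
import numpy as np

# Minimal sketch: N mutually coupled phase oscillators as a stand-in for a
# fully connected PLL network (illustrative assumption; the paper's exact
# nonlinear node/coupling equations are not given in the abstract).
rng = np.random.default_rng(0)
N, K, dt, steps = 8, 2.0, 1e-3, 50_000
omega = rng.normal(2 * np.pi, 0.1, N)          # free-running frequencies (rad/s)
A = np.ones((N, N)) - np.eye(N)                # all-to-all connectivity
theta = rng.uniform(0, 2 * np.pi, N)

for _ in range(steps):                         # explicit Euler integration
    coupling = (A * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
    theta += dt * (omega + (K / N) * coupling)

# In a synchronous state all instantaneous node frequencies agree on a
# common network synchronisation frequency.
freq = omega + (K / N) * (A * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
print("spread of node frequencies:", freq.max() - freq.min())
print("mean free-running frequency:", omega.mean())
```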
Abstract:
Eight different models to represent the effect of friction in control valves are presented: four based on physical principles and four empirical. The physical models, both static and dynamic, share the same structure. The models are implemented in Simulink/MATLAB and compared using different friction coefficients and input signals. Three of the models were able to reproduce the stick-slip phenomenon and passed all the tests, which were applied following ISA standards.
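For concreteness, below is a minimal sketch of one classical physical friction structure, a Karnopp-style law with static, Coulomb and viscous terms. The function name, parameter values and dead-band treatment are illustrative assumptions; the abstract does not reproduce the eight models' equations.

```python
import math

# Karnopp-style friction law, a classical physical structure for valve
# stiction (illustrative sketch only). F_s > F_c is what produces the
# stick-slip behaviour mentioned in the abstract.
def friction_force(v, F_applied, F_s=120.0, F_c=100.0, F_v=5.0, dv=1e-4):
    """v: stem velocity (m/s); F_applied: net external force while sticking (N);
    F_s/F_c: static and Coulomb levels (N); F_v: viscous coefficient (N s/m);
    dv: Karnopp dead-band, |v| < dv is treated as stuck."""
    if abs(v) < dv:                              # stuck: friction balances load
        return -max(-F_s, min(F_s, F_applied))   # up to the breakaway level F_s
    return -math.copysign(F_c, v) - F_v * v      # slipping: Coulomb + viscous

print(friction_force(0.0, 80.0))   # -80.0: load fully balanced, valve stuck
print(friction_force(0.05, 80.0))  # -100.25: friction drops to Coulomb level
```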
Abstract:
This paper presents two strategies for upgrading set-up generation systems for tandem cold mills. Even though these mills have been modernized mainly to meet quality requirements, their upgrades may also aim to replace pre-calculated reference tables. For this case, the Bryant and Osborn mill model without adaptation is proposed. For a more demanding modernization, the Bland and Ford model including adaptation is recommended, although it requires more complex computational hardware. Advantages and disadvantages of the two systems are compared and discussed, and experimental results obtained from an industrial cold mill are shown.
Abstract:
Air transport has become a vital component of the global economy. However, greenhouse-gas emissions from this sector have a significant impact on global climate, being responsible for over 3.5% of all anthropogenic radiative forcing. Also, the increased visibility of aircraft emissions greatly affects the public image of the industry. In this context, incentive-based regulations, in the form of price or quantity controls, can be envisaged as alternatives to mitigate these emissions. The use of environmental charges in air transport, and the inclusion of the sector in the European Union Emissions Trading Scheme (EU ETS), are considered under a range of scenarios. The impacts of these measures on demand are estimated, and results suggest that they are likely to be minimal, mainly due to the high willingness to pay for air transport. In particular, in the EU ETS scenario currently favoured by the EU, demand reductions are less than 2%. This may not be true in the longer run, for short trips, or if future caps become more stringent. Furthermore, given current estimates of the social cost of CO2 as well as typical EU ETS prices, supply-side abatement would be too costly to be encouraged by these policies in the short term. The magnitude of aviation CO2 emissions in the EU is estimated, both in physical and monetary terms; the results are consistent with Eurocontrol estimates and, for the EU-25, the total social cost of these emissions represents only 0.03% of the region's GDP. It is concluded that the use of multisector policies, such as the EU ETS, is unsuitable for curbing emissions from air transport, and that stringent emission charges or an isolated ETS would be better instruments. However, the inclusion of aviation in the EU ETS has advantages under target-oriented post-2012 scenarios, such as policy-cost dilution, certainty in reductions, and flexibility in abatement allocation. This solution is also attractive to airlines, as it would improve their public image but require virtually no reduction of their own emissions, as they would be fully capable of passing on policy costs to their customers.
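The scale of the demand effect can be illustrated with a back-of-the-envelope pass-through calculation. Every number below (allowance price, per-passenger emissions, fare, elasticity) is a hypothetical assumption, not a figure from the paper.

```python
# Back-of-the-envelope illustration of why the estimated demand impact is
# small (all numbers below are hypothetical, not taken from the paper).
allowance_price = 25.0      # EUR per tonne CO2 (assumed EU ETS price)
emissions_per_pax = 0.11    # tonnes CO2 per passenger, short-haul (assumed)
avg_fare = 150.0            # EUR (assumed)
pass_through = 1.0          # airlines pass the full policy cost to customers
elasticity = -0.8           # assumed own-price elasticity of demand

fare_increase = pass_through * allowance_price * emissions_per_pax
demand_change = elasticity * fare_increase / avg_fare
print(f"fare increase: {fare_increase:.2f} EUR "
      f"({100 * fare_increase / avg_fare:.1f}% of fare)")
print(f"approx. demand change: {100 * demand_change:.1f}%")
```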
Abstract:
This article discusses the impact on the profitability of firms of Complementary Law 102/2000 (which amended the Kandir Law, Complementary Law 87/96), allowing the appropriation of ICMS credits due to investment in fixed assets at a ratio of 1/48 per month. The paper seeks to demonstrate how this new system, which transformed the ICMS from a consumption-type value added tax (VAT) into an income-type one, leads to a loss of approximately 30% of the value of the credits to be recovered, and the effect this generates on the cost of investment and on the profits of small, medium and large firms. From the methodological point of view, this is a descriptive, quantitative study, which proceeded in three stages. Initially, we obtained estimates of net sales and investment volume based on the report Painel de Competitividade prepared by the Federação das Indústrias do Estado de São Paulo (Fiesp/Serasa). Based on this information, it was possible to estimate the factors generating ICMS debits and credits, using the Credit Control of Fixed Assets (CIAP) model. Finally, we calculated three indicators: (i) present value of debt recovery / value of credits; (ii) present value of debt recovery / investment value; (iii) present value of debt recovery / sales profitability. We conclude that the system introduced by Complementary Law 102/2000 implies a great opportunity cost for firms and that the legislation should be reviewed from this perspective, aiming to ensure lower costs associated with investment projects.
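The roughly 30% loss can be reproduced with a simple present-value calculation over the 48 monthly installments. The 1.5% monthly discount rate below is an assumption chosen for illustration, since the abstract does not state the rate the authors used.

```python
# Present value of ICMS credits recovered at 1/48 per month versus immediate
# recovery. The ~30% loss reported in the abstract is reproduced here with an
# assumed monthly discount rate of 1.5% (the paper's rate is not stated).
credit = 1.0   # total ICMS credit, normalized
n = 48         # monthly installments under LC 102/2000
r = 0.015      # assumed monthly discount rate

pv = sum((credit / n) / (1 + r) ** t for t in range(1, n + 1))
print(f"present value of recovered credits: {pv:.3f}")
print(f"loss relative to immediate recovery: {100 * (1 - pv):.1f}%")
# prints a loss of about 29%, in line with the ~30% the abstract reports
```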
Abstract:
In this paper we propose a new two-parameter lifetime distribution with increasing failure rate. The new distribution arises from a latent complementary risk problem. The properties of the proposed distribution are discussed, including a formal proof of its probability density function and explicit algebraic formulae for its reliability and failure rate functions, quantiles and moments, including the mean and variance. A simple EM-type algorithm for iteratively computing maximum likelihood estimates is presented. The Fisher information matrix is derived analytically in order to obtain the asymptotic covariance matrix. The methodology is illustrated on a real data set.
Abstract:
Since computer viruses pose a serious problem to individual and corporate computer systems, a lot of effort has been dedicated to studying how to avoid their deleterious actions, trying to create anti-virus programs acting as vaccines in personal computers or in strategic network nodes. Another way to combat virus propagation is to establish preventive policies based on the whole operation of a system, which can be modeled with population models similar to those used in epidemiological studies. Here, a modified version of the SIR (Susceptible-Infected-Removed) model is presented, and how its parameters are related to network characteristics is explained. Then, disease-free and endemic equilibrium points are calculated, stability and bifurcation conditions are derived, and some numerical simulations are shown. The relations among the model parameters in the several bifurcation conditions allow a network design that minimizes virus risks.
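As a baseline for the modified model, the sketch below integrates the classical SIR equations. The parameter values are assumptions, and the paper's modifications to the SIR structure are not reproduced here since the abstract does not give them.

```python
# Classical SIR dynamics as a baseline (the paper uses a modified SIR whose
# extra terms are not given in the abstract); beta and gamma here map to
# network characteristics such as contact topology and disinfection rate.
beta, gamma = 0.5, 0.2      # infection and removal rates (assumed values)
S, I, R = 0.99, 0.01, 0.0   # fractions of network hosts
dt, T = 0.01, 100.0

for _ in range(int(T / dt)):  # explicit Euler integration
    dS = -beta * S * I
    dI = beta * S * I - gamma * I
    dR = gamma * I
    S, I, R = S + dt * dS, I + dt * dI, R + dt * dR

# R0 = beta/gamma > 1 signals that an outbreak can take off; keeping the
# effective R0 below 1 is the design goal a bifurcation analysis suggests.
print(f"R0 = {beta / gamma:.2f}, final susceptible fraction = {S:.3f}")
```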
Abstract:
Computer viruses are an important risk to computational systems, endangering both corporations of all sizes and personal computers used for domestic applications. Here, classical epidemiological models for disease propagation are adapted to computer networks and, by using simple system identification techniques, a model called SAIC (Susceptible, Antidotal, Infectious, Contaminated) is developed. Real data about computer viruses are used to validate the model.
Abstract:
The inverse Weibull distribution has the ability to model failure rates that are quite common in reliability and biological studies. A three-parameter generalized inverse Weibull distribution with decreasing and unimodal failure rate is introduced and studied. We provide a comprehensive treatment of the mathematical properties of the new distribution, including expressions for the moment generating function and the rth generalized moment. The mixture model of two generalized inverse Weibull distributions is investigated and its identifiability property is demonstrated. For the first time, we propose a location-scale regression model based on the log-generalized inverse Weibull distribution for modeling lifetime data. In addition, we develop some diagnostic tools for sensitivity analysis. Two applications to real data are given to illustrate the potential of the proposed regression model.
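The unimodal failure rate can already be seen in the standard two-parameter inverse Weibull, sketched below. The parameter values are illustrative, and the paper's third (generalization) parameter is not included.

```python
import math

# Unimodal failure (hazard) rate of the standard inverse Weibull distribution,
# F(x) = exp(-(beta/x)**alpha); the three-parameter generalization in the
# paper adds a further shape parameter not reproduced here (sketch only).
def hazard(x, alpha=2.0, beta=1.0):
    F = math.exp(-((beta / x) ** alpha))             # cdf
    f = alpha * beta**alpha * x ** (-alpha - 1) * F  # pdf
    return f / (1.0 - F)

for x in (0.2, 0.5, 1.0, 2.0, 5.0):
    print(f"x = {x:4.1f}  h(x) = {hazard(x):.4f}")
# h(x) rises and then falls: the unimodal shape mentioned in the abstract.
```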
Abstract:
In this paper, we compare three residuals to assess departures from the error assumptions as well as to detect outlying observations in log-Burr XII regression models with censored observations. These residuals can also be used for the log-logistic regression model, which is a special case of the log-Burr XII regression model. For different parameter settings, sample sizes and censoring percentages, various simulation studies are performed and the empirical distribution of each residual is displayed and compared with the standard normal distribution. These studies suggest that the residual analysis usually performed in normal linear regression models can be straightforwardly extended to the modified martingale-type residual in log-Burr XII regression models with censored data.
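For concreteness, the sketch below computes martingale-type residuals, r_i = delta_i + log S(t_i), for censored log-logistic data. Using the true parameters in place of fitted ones, and all parameter values, are simplifying assumptions for illustration; the paper's three residuals are not reproduced.

```python
import numpy as np

# Martingale-type residuals r_i = delta_i + log S(t_i) for censored data,
# here with a log-logistic survival function and the true parameters standing
# in for fitted ones (a simplifying assumption for illustration).
rng = np.random.default_rng(1)
n, alpha, gam = 200, 1.0, 2.0           # log-logistic scale and shape
u = rng.uniform(size=n)
t = alpha * (u / (1 - u)) ** (1 / gam)  # log-logistic lifetimes
c = rng.exponential(3.0, size=n)        # independent censoring times
y, delta = np.minimum(t, c), (t <= c).astype(float)

S = 1.0 / (1.0 + (y / alpha) ** gam)    # log-logistic survival function
r_mart = delta + np.log(S)              # martingale-type residuals
print(f"censored: {100 * (1 - delta.mean()):.0f}%  "
      f"residual range: [{r_mart.min():.2f}, {r_mart.max():.2f}]")
```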
Abstract:
A bathtub-shaped failure rate function is very useful in survival analysis and reliability studies, but the well-known lifetime distributions do not have this property. For the first time, we propose a location-scale regression model based on the logarithm of an extended Weibull distribution which is able to accommodate bathtub-shaped failure rate functions. We use the method of maximum likelihood to estimate the model parameters, and some inferential procedures are presented. We reanalyze a real data set under the new model and the log-modified Weibull regression model. We perform a model check based on martingale-type residuals and generated envelopes, and use the AIC and BIC statistics to select appropriate models.
Abstract:
Mixed models have become important in analyzing the results of experiments, particularly those that require more complicated models (e.g., those that involve longitudinal data). This article describes a method for deriving the terms in a mixed model. Our approach extends an earlier method by Brien and Bailey to explicitly identify terms for which autocorrelation and smooth trend arising from longitudinal observations need to be incorporated in the model. At the same time we retain the principle that the model used should include, at least, all the terms that are justified by the randomization. This is done by dividing the factors into sets, called tiers, based on the randomization and determining the crossing and nesting relationships between factors. The method is applied to formulate mixed models for a wide range of examples. We also describe the mixed model analysis of data from a three-phase experiment to investigate the effect of time of refinement on Eucalyptus pulp from four different sources. Cubic smoothing splines are used to describe differences in the trend over time and unstructured covariance matrices between times are found to be necessary.
Abstract:
This paper proposes a regression model based on the modified Weibull distribution, which can be used to model bathtub-shaped failure rate functions. Assuming censored data, we consider maximum likelihood and jackknife estimators for the parameters of the model. We derive the appropriate matrices for assessing local influence on the parameter estimates under different perturbation schemes, and we also present some ways to perform global influence analysis. In addition, for different parameter settings, sample sizes and censoring percentages, various simulations are performed, and the empirical distribution of the modified deviance residual is displayed and compared with the standard normal distribution. These studies suggest that the residual analysis usually performed in normal linear regression models can be straightforwardly extended to a martingale-type residual in log-modified Weibull regression models with censored data. Finally, we analyze a real data set under log-modified Weibull regression models. A diagnostic analysis and a model check based on the modified deviance residual are performed to select appropriate models.
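A sketch of the bathtub-shaped hazard of one common form of the modified Weibull distribution (Lai et al.), assumed here to be the form intended, is given below. The parameter values are illustrative, and the paper's regression and residual machinery is not reproduced.

```python
import math

# Hazard of the modified Weibull distribution with
# F(t) = 1 - exp(-a * t**b * exp(lam*t)), so that
# h(t) = a * (b + lam*t) * t**(b-1) * exp(lam*t);
# for 0 < b < 1 and lam > 0 it is bathtub-shaped (values are illustrative).
def hazard(t, a=1.0, b=0.5, lam=0.1):
    return a * (b + lam * t) * t ** (b - 1) * math.exp(lam * t)

for t in (0.1, 0.5, 2.0, 8.0, 15.0):
    print(f"t = {t:5.1f}  h(t) = {hazard(t):.3f}")
# High early ("infant mortality"), low in mid-life, rising at the end:
# the bathtub shape the regression model is designed to capture.
```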
Abstract:
The zero-inflated negative binomial model is used to account for overdispersion detected in data that are initially analyzed under the zero-inflated Poisson model. A frequentist analysis, a jackknife estimator and a non-parametric bootstrap for parameter estimation of zero-inflated negative binomial regression models are considered. In addition, an EM-type algorithm is developed for performing maximum likelihood estimation. Then, the appropriate matrices for assessing local influence on the parameter estimates under different perturbation schemes, and some ways to perform global influence analysis, are derived. In order to study departures from the error assumption as well as the presence of outliers, residual analysis based on the standardized Pearson residuals is discussed. The relevance of the approach is illustrated with a real data set, where it is shown that zero-inflated negative binomial regression models seem to fit the data better than the Poisson counterpart.
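The mixture being estimated can be written down directly. The sketch below evaluates a zero-inflated negative binomial log-likelihood under an assumed parameterization with toy data; it is illustrative only and not the paper's estimation procedure.

```python
import numpy as np
from scipy.stats import nbinom

# Zero-inflated negative binomial log-likelihood: with probability pi an
# observation is a structural zero, otherwise it comes from NB(size, p).
# A sketch of the mixture (parameterization chosen for illustration).
def zinb_loglik(y, pi, size, p):
    y = np.asarray(y)
    p0 = pi + (1 - pi) * nbinom.pmf(0, size, p)  # zeros from both sources
    pk = (1 - pi) * nbinom.pmf(y, size, p)       # counts from the NB part
    return np.sum(np.where(y == 0, np.log(p0), np.log(pk)))

y = np.array([0, 0, 0, 0, 1, 2, 0, 5, 0, 3, 0, 1])  # toy overdispersed counts
print(zinb_loglik(y, pi=0.3, size=1.5, p=0.4))
```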
Abstract:
Corn ethanol produced in the US and sugarcane ethanol produced in Brazil are the world's leading sources of biofuel. Current US biofuel policies create both incentives and constraints for the import of ethanol from Brazil and, together with the cost competitiveness and greenhouse gas intensity of sugarcane ethanol compared to corn ethanol, will determine the extent of these imports. This study analyzes the supply-side determinants of cost competitiveness and compares the greenhouse gas intensity of corn ethanol and sugarcane ethanol delivered to US ports. We find that while the cost of sugarcane ethanol production in Brazil is lower than that of corn ethanol in the US, the inclusion of transportation costs for the former and co-product credits for the latter changes their relative competitiveness. We also find that the relative cost of ethanol in the US and Brazil is highly sensitive to the prevailing exchange rate and feedstock prices. At an exchange rate of US$1 = R$2.15, the cost of corn ethanol is 15% lower than the delivered cost of sugarcane ethanol at a US port. Sugarcane ethanol has lower GHG emissions than corn ethanol, but a price of over $113 per ton of CO2 is needed to affect competitiveness.
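The exchange-rate sensitivity follows from the two costs being incurred in different currencies. The sketch below illustrates how competitiveness can flip with the exchange rate, using hypothetical cost figures that are not the paper's estimates.

```python
# Exchange-rate sensitivity: sugarcane ethanol cost is incurred in BRL,
# corn ethanol in USD. All figures below are hypothetical placeholders,
# not the paper's estimates.
def delivered_cost_usd(cost_brl_per_l, shipping_usd_per_l, fx_brl_per_usd):
    return cost_brl_per_l / fx_brl_per_usd + shipping_usd_per_l

corn_usd = 0.45                # US corn ethanol, net of co-product credit
for fx in (1.80, 2.15, 2.50):  # BRL per USD
    cane = delivered_cost_usd(cost_brl_per_l=0.75, shipping_usd_per_l=0.12,
                              fx_brl_per_usd=fx)
    cheaper = "corn" if corn_usd < cane else "sugarcane"
    print(f"FX {fx:.2f}: delivered cane = ${cane:.3f}/L -> {cheaper} cheaper")
```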