999 results for Bivariate Function
Abstract:
Using monotone bifunctions, we introduce a recession concept for general equilibrium problems that relies on a notion of variational convergence. The main purpose is to extend some results of P. L. Lions on variational problems. In the process we generalize results of H. Brezis and H. Attouch on the convergence of the resolvents associated with maximal monotone operators.
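For context, a standard formulation of the equilibrium problem governed by a bifunction, and of the monotonicity condition it relies on, is as follows (textbook background, not quoted from the paper):

    \text{find } \bar{x} \in K \text{ such that } F(\bar{x}, y) \ge 0 \ \text{ for all } y \in K,
    \qquad F : K \times K \to \mathbb{R}, \quad F(x, x) = 0,

    F \text{ is monotone if } \quad F(x, y) + F(y, x) \le 0 \ \text{ for all } x, y \in K.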
Abstract:
Technology changes rapidly over the years, continuously providing more computing options and making life easier for economic and other transactions. However, the introduction of new technology pushes old Information and Communication Technology (ICT) products out of use. E-waste is defined as the quantity of ICT products no longer in use and is a bivariate function of the quantities sold and the probability that a given quantity of computers will be regarded as obsolete. In this paper, an e-waste generation model is presented and applied to the following regions: Western and Eastern Europe, Asia/Pacific, Japan/Australia/New Zealand, and North and South America. Furthermore, cumulative computer sales were retrieved for selected countries of these regions in order to compute obsolete computer quantities. To provide robust forecasts, a selection of forecasting models, namely (i) Bass, (ii) Gompertz, (iii) Logistic, (iv) Trend model, (v) Level model, (vi) AutoRegressive Moving Average (ARMA), and (vii) Exponential Smoothing, was applied, selecting for each country the model that gave the best in-sample performance in terms of minimum error indices (Mean Absolute Error and Mean Square Error). Because new technology does not diffuse in all regions of the world at the same speed, owing to different socio-economic factors, the lifespan distribution, which gives the probability that a certain quantity of computers is considered obsolete, is not adequately modeled in the literature. The forecast horizon is 2014-2030, and the results show a very sharp increase in the USA and the United Kingdom, owing to decreasing computer lifespan and increasing sales.
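A minimal sketch of the kind of model described, assuming the obsolete quantity in year t is the sum of past sales weighted by the probability that a unit sold in year s reaches the end of its lifespan during year t (the Weibull lifespan distribution and all parameter values below are hypothetical; the paper's actual lifespan model is not reproduced here):

    import numpy as np
    from scipy.stats import weibull_min

    def obsolete_quantity(sales, shape=1.8, scale=5.0):
        """E-waste generated each year as a bivariate function of past sales
        and the probability that units sold in year s are obsolete by year t."""
        years = len(sales)
        ewaste = np.zeros(years)
        for t in range(years):
            for s in range(t + 1):
                age_lo, age_hi = t - s, t - s + 1
                # probability a unit sold in year s becomes obsolete during year t
                p_obsolete = (weibull_min.cdf(age_hi, shape, scale=scale)
                              - weibull_min.cdf(age_lo, shape, scale=scale))
                ewaste[t] += sales[s] * p_obsolete
        return ewaste

    # hypothetical annual computer sales (millions of units)
    sales = np.array([10, 12, 15, 18, 22, 25, 28, 30])
    print(obsolete_quantity(sales))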
Abstract:
So far, in the bivariate set-up, the analysis of lifetime (failure time) data with multiple causes of failure has been done by treating each cause of failure separately, with failures from other causes considered as independent censoring. This approach is unrealistic in many situations. For example, in the analysis of mortality data on married couples one would be interested in comparing the hazards for the same cause of death as well as in checking whether death due to one cause is more important for the partner's risk of death from other causes. In reliability analysis, one often has systems with more than one component, and many systems, subsystems, and components have more than one cause of failure. Design of high-reliability systems generally requires that the individual system components have extremely high reliability even after long periods of time. Knowledge of the failure behaviour of a component can lead to savings in its cost of production and maintenance and, in some cases, to the preservation of human life. For the purpose of improving reliability, it is necessary to identify the cause of failure down to the component level. By treating each cause of failure separately with failures from other causes considered as independent censoring, the analysis of lifetime data would be incomplete. Motivated by this, we introduce a new approach for the analysis of bivariate competing risk data using the bivariate vector hazard rate of Johnson and Kotz (1975).
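For reference, the vector hazard rate of Johnson and Kotz (1975) for a bivariate survival function S(x1, x2) is usually written as follows (a standard statement, not quoted from the paper):

    h(x_1, x_2) = \big( h_1(x_1, x_2),\; h_2(x_1, x_2) \big), \qquad
    h_i(x_1, x_2) = -\frac{\partial}{\partial x_i} \log S(x_1, x_2), \quad i = 1, 2.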
Abstract:
Optimal design for generalized linear models has primarily focused on univariate data. Often experiments are performed that have multiple dependent responses described by regression-type models, and it is of interest and of value to design the experiment for all these responses. This requires a multivariate distribution underlying a pre-chosen model for the data. Here, we consider the design of experiments for bivariate binary data which are dependent. We explore copula functions, which provide a rich and flexible class of structures for deriving joint distributions for bivariate binary data. We present methods for deriving optimal experimental designs for dependent bivariate binary data using copulas, and demonstrate that, by including the dependence between responses in the design process, more efficient parameter estimates are obtained than by the usual practice of simply designing for a single variable only. Further, we investigate the robustness of designs with respect to initial parameter estimates and copula function, and also show the performance of compound criteria within this bivariate binary setting.
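As an illustration of the general idea (not the paper's specific construction), the joint cell probabilities of two dependent Bernoulli responses with marginal success probabilities p1 and p2 can be obtained from a copula C, here a Clayton copula chosen purely for concreteness:

    import numpy as np

    def clayton_copula(u, v, theta):
        # Clayton copula C(u, v) = (u^-theta + v^-theta - 1)^(-1/theta), theta > 0
        return (u**(-theta) + v**(-theta) - 1.0)**(-1.0 / theta)

    def bivariate_binary_probs(p1, p2, theta):
        """Joint probabilities of (Y1, Y2) with Bernoulli(p1) and Bernoulli(p2)
        marginals and dependence induced by the copula: P(Y1=1, Y2=1) = C(p1, p2)."""
        p11 = clayton_copula(p1, p2, theta)
        p10 = p1 - p11
        p01 = p2 - p11
        p00 = 1.0 - p11 - p10 - p01
        return {"p11": p11, "p10": p10, "p01": p01, "p00": p00}

    print(bivariate_binary_probs(0.3, 0.6, theta=2.0))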
Abstract:
Purpose: In this study we examined neuroretinal function in five amblyopes, who had been shown in previous functional MRI (fMRI) studies to have compromised function of the lateral geniculate nucleus (LGN), to determine whether the fMRI deficit in amblyopia may have its origin at the retinal level. Methods: We used the slow-flash multifocal ERG (mfERG) and compared averaged responses from five rings for the amblyopic and fellow eyes across a 35 deg field. Central responses were also assessed over a field about 6.3 deg in diameter. We measured central retinal thickness using optical coherence tomography. Central fields were measured using the MP1-Microperimeter, which also assesses ocular fixation during perimetry. MfERG data were compared with fMRI results from a previous study. Results: Amblyopic eyes had reduced response density amplitudes (first major negative to first positive (N1-P1) responses) for the central and paracentral retina (up to 18 deg diameter) but not for the mid-periphery (from 18 to 35 deg). Retinal thickness was within normal limits for all eyes and did not differ between amblyopic and fellow eyes. Fixation was maintained within the central 4° more than 80% of the time by four of the five participants; fixation assessed using bivariate contour ellipse areas (BCEA) gave rankings similar to those of the MP-1 system. There was no significant relationship between BCEA and mfERG response for either the amblyopic or the fellow eye. There was no significant relationship between the central mfERG eye response difference and the selective blood oxygen level dependent (BOLD) LGN eye response difference previously seen in these participants. Conclusions: Retinal responses in amblyopes can be reduced within the central field without an obvious anatomical basis. Additionally, this retinal deficit may not be the reason why LGN BOLD responses are reduced for amblyopic eye stimulation.
Abstract:
The recent focus of flood frequency analysis (FFA) studies has been on the development of methods to model joint distributions of variables such as peak flow, volume, and duration that characterize a flood event, as comprehensive knowledge of the flood event is often necessary in hydrological applications. A diffusion process based adaptive kernel (D-kernel) is suggested in this paper for this purpose. It is data driven, flexible and, unlike most kernel density estimators, always yields a bona fide probability density function. It overcomes shortcomings associated with the use of conventional kernel density estimators in FFA, such as the boundary leakage problem and the normal reference rule. The potential of the D-kernel is demonstrated by application to synthetic samples of various sizes drawn from known unimodal and bimodal populations, and to five typical peak flow records from different parts of the world. It is shown to be effective when compared to the conventional Gaussian kernel and the best of seven commonly used copulas (Gumbel-Hougaard, Frank, Clayton, Joe, Normal, Plackett, and Student's t) in estimating the joint distribution of peak flow characteristics and extrapolating beyond historical maxima. Selection of the optimum number of bins is found to be critical in modeling with the D-kernel.
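For orientation, the conventional Gaussian kernel baseline mentioned above can be sketched as follows for a joint density of peak flow and volume (the paired data below are synthetic and illustrative; the D-kernel itself, a diffusion-based adaptive estimator, is not reproduced here):

    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(0)
    # hypothetical paired flood characteristics: peak flow (m^3/s) and volume (10^6 m^3)
    peak = rng.gamma(shape=3.0, scale=200.0, size=200)
    volume = 0.05 * peak + rng.gamma(shape=2.0, scale=5.0, size=200)

    kde = gaussian_kde(np.vstack([peak, volume]))   # Scott's rule bandwidth (a normal-reference rule) by default
    joint_density = kde([[600.0], [40.0]])          # density at peak = 600, volume = 40
    print(joint_density)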
Abstract:
Multivariate lifetime data arise in various forms, including recurrent event data, when individuals are followed to observe the sequence of occurrences of a certain type of event, and correlated lifetimes, when an individual is followed for the occurrence of two or more types of events or when distinct individuals have dependent event times. In most studies there are covariates such as treatments, group indicators, individual characteristics, or environmental conditions whose relationship to lifetime is of interest. This leads to a consideration of regression models. The well-known Cox proportional hazards model and its variations, using the marginal hazard functions employed for the analysis of multivariate survival data in the literature, are not sufficient to explain the complete dependence structure of a pair of lifetimes on the covariate vector. Motivated by this, in Chapter 2 we introduce a bivariate proportional hazards model using the vector hazard function of Johnson and Kotz (1975), in which the covariates under study have different effects on the two components of the vector hazard function. The proposed model is useful in real-life situations for studying the dependence structure of a pair of lifetimes on the covariate vector. The well-known partial likelihood approach is used for the estimation of the parameter vectors. We then introduce a bivariate proportional hazards model for gap times of recurrent events in Chapter 3. The model incorporates both marginal and joint dependence of the distribution of gap times on the covariate vector. In many fields of application, the mean residual life function is considered a more informative concept than the hazard function. Motivated by this, in Chapter 4 we consider a new semi-parametric model, the bivariate proportional mean residual life time model, to assess the relationship between mean residual life and covariates for gap times of recurrent events. The counting process approach is used for the inference procedures for the gap times of recurrent events. In many survival studies, the distribution of lifetime may depend on the distribution of censoring time. In Chapter 5, we introduce a proportional hazards model for duration times and develop inference procedures under dependent (informative) censoring. In Chapter 6, we introduce a bivariate proportional hazards model for competing risks data under right censoring. The asymptotic properties of the estimators of the parameters of the different models developed in previous chapters are studied. The proposed models are applied to various real-life situations.
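A plausible form of such a bivariate proportional hazards model built on the vector hazard, as I read the abstract (not a quotation from the thesis), lets the covariate vector z act differently on each component:

    h_i(t_1, t_2 \mid z) = h_{0i}(t_1, t_2)\, \exp(\beta_i' z), \qquad i = 1, 2,

where h_{01}, h_{02} are baseline components of the vector hazard and \beta_1, \beta_2 are distinct regression coefficient vectors.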
Abstract:
In this paper the class of continuous bivariate distributions that have a form-invariant weighted distribution with weight function w(x_1, x_2) = x_1^{a_1} x_2^{a_2} is identified. It is shown that the class includes some well-known bivariate models. Bayesian inference on the parameters of the class is considered, and it is shown that natural conjugate priors exist for the parameters.
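For reference, the weighted version of a bivariate density f under a weight function w is commonly defined as follows (standard background, not specific to this paper):

    f_w(x_1, x_2) = \frac{w(x_1, x_2)\, f(x_1, x_2)}{E\left[ w(X_1, X_2) \right]},

and form-invariance means that f_w belongs to the same parametric family as f; here w(x_1, x_2) = x_1^{a_1} x_2^{a_2}.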
Abstract:
In this paper, the residual Kullback–Leibler discrimination information measure is extended to conditionally specified models. The extension is used to characterize some bivariate distributions. These distributions are also characterized in terms of proportional hazard rate models and weighted distributions. Moreover, we obtain some bounds for this dynamic discrimination function by using the likelihood ratio order and some preceding results.
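As background (a standard formulation, not quoted from the paper), the residual Kullback–Leibler discrimination between lifetimes X and Y with densities f and g and survival functions \bar{F} and \bar{G}, evaluated at age t, is:

    KL(X, Y; t) = \int_t^{\infty} \frac{f(x)}{\bar{F}(t)} \log\!\left( \frac{f(x)/\bar{F}(t)}{g(x)/\bar{G}(t)} \right) dx.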
Abstract:
In this paper, Bayesian decision procedures are developed for dose-escalation studies based on binary measures of undesirable events and continuous measures of therapeutic benefit. The methods generalize earlier approaches where undesirable events and therapeutic benefit are both binary. A logistic regression model is used to model the binary responses, while a linear regression model is used to model the continuous responses. Prior distributions for the unknown model parameters are suggested. A gain function is discussed and an optional safety constraint is included.
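A minimal sketch of the two model components described, with hypothetical dose levels and parameter values (the paper's prior distributions and gain function are not reproduced here):

    import numpy as np
    from scipy.special import expit

    rng = np.random.default_rng(1)
    doses = np.array([1.0, 2.0, 4.0, 8.0, 16.0])            # hypothetical dose levels

    # binary undesirable event: logistic regression in log-dose
    theta0, theta1 = -3.0, 1.2
    p_event = expit(theta0 + theta1 * np.log(doses))

    # continuous therapeutic benefit: linear regression in log-dose
    beta0, beta1, sigma = 0.5, 0.8, 0.3
    benefit = beta0 + beta1 * np.log(doses) + rng.normal(0.0, sigma, size=doses.size)

    print(np.column_stack([doses, p_event, benefit]))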
Abstract:
The aim of this study was to analyze weight at birth (BW) and weights adjusted to 205 (W205), 365 (W365), and 550 (W550) days of age in beef buffaloes from Brazil, using two approaches, parametric (normal distribution) and non-parametric (kernel function), and to estimate the genetic, environmental, and phenotypic correlations among traits. Information on 5,169 animals at birth (BW), 3,792 at 205 days (W205), 3,883 at 365 days (W365), and 1,524 at 550 days of age (W550) was used. The birth weight distribution showed an evident discrepancy from the normal distribution, whereas W205, W365, and W550 presented normal distributions. Birth weight showed weak genetic, environmental, and phenotypic associations with the other weight measurements. On the other hand, the weight traits at 205, 365, and 550 days of age showed high genetic correlations.
Abstract:
In this article we introduce a three-parameter extension of the bivariate exponential-geometric (BEG) law (Kozubowski and Panorska, 2005) [4]. We refer to this new distribution as the bivariate gamma-geometric (BGG) law. A bivariate random vector (X, N) follows the BGG law if N has a geometric distribution and X may be represented (in law) as a sum of N independent and identically distributed gamma variables, where these variables are independent of N. Statistical properties such as the moment generating and characteristic functions, moments, and the variance-covariance matrix are provided. The marginal and conditional laws are also studied. We show that the BGG distribution is infinitely divisible, just as the BEG model is. Further, we provide alternative representations for the BGG distribution and show that it enjoys a geometric stability property. Maximum likelihood estimation and inference are discussed and a reparametrization is proposed in order to obtain orthogonality of the parameters. We present an application to a real data set where our model provides a better fit than the BEG model. Our bivariate distribution induces a bivariate Lévy process with correlated gamma and negative binomial processes, which extends the bivariate Lévy motion proposed by Kozubowski et al. (2008) [6]. The marginals of our Lévy motion are a mixture of gamma and negative binomial processes, and we name it the BMixGNB motion. Basic properties such as stochastic self-similarity and the covariance matrix of the process are presented. The bivariate distribution at fixed time of our BMixGNB process is also studied and some results are derived, including a discussion about maximum likelihood estimation and inference.
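The stochastic representation stated above translates directly into a simulation sketch (parameter names and values below are illustrative, not taken from the paper):

    import numpy as np

    def sample_bgg(n_samples, p, alpha, beta, seed=None):
        """Draw (X, N) pairs: N ~ Geometric(p) on {1, 2, ...}, and
        X | N = n is a sum of n i.i.d. Gamma(alpha, scale=beta) variables."""
        rng = np.random.default_rng(seed)
        N = rng.geometric(p, size=n_samples)
        X = np.array([rng.gamma(alpha, beta, size=n).sum() for n in N])
        return X, N

    X, N = sample_bgg(5, p=0.4, alpha=2.0, beta=1.5, seed=0)
    print(np.column_stack([X, N]))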
Abstract:
A large number of proposals for estimating the bivariate survival function under random censoring have been made. In this paper we discuss nonparametric maximum likelihood estimation and the bivariate Kaplan-Meier estimator of Dabrowska. We show how these estimators are computed, present their intuitive background, and compare their practical performance under different levels of dependence and censoring, based on extensive simulation results, leading to practical advice.
Abstract:
Estimation for bivariate right-censored data is a problem that has been studied extensively over the past 15 years. In this paper we propose a new class of estimators for the bivariate survival function based on locally efficient estimation. We introduce the locally efficient estimator for bivariate right-censored data, present an asymptotic theorem, report the results of simulation studies, and perform a brief data analysis illustrating the use of the locally efficient estimator.