976 results for zero
Abstract:
Fixed transaction costs that prohibit exchange engender bias in supply analysis due to censoring of the sample observations. The associated bias in conventional regression procedures applied to censored data, and the construction of robust methods for mitigating that bias, have been preoccupations of applied economists since Tobin [Econometrica 26 (1958) 24]. This literature assumes that the true point of censoring in the data is zero; when this is not the case, a bias is imparted to the parameter estimates of the censored regression model. We conjecture that this bias can be significant; affirm this through experiments; and suggest techniques for mitigating it using Bayesian procedures. The bias-mitigating procedures are based on modifications of the key step that facilitates Bayesian estimation of the censored regression model; they are easy to implement, work well in both small and large samples, and lead to significantly improved inference in the censored regression model. These findings are important in light of the widespread use of the zero-censored Tobit regression, and we investigate their consequences using data on milk-market participation in the Ethiopian highlands.
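In standard data-augmentation treatments of the Tobit model, the "key step" referred to above is the Gibbs draw of the latent responses for the censored observations. The sketch below is a minimal, hypothetical illustration of such a sampler, not the authors' procedure: the function name, the flat/improper priors and the explicit censoring-point argument `c` are my own assumptions, with `c=0` recovering the conventional zero-censored Tobit and a non-zero `c` showing where a known, non-zero censoring point would enter.

```python
import numpy as np
from scipy.stats import truncnorm, invgamma

def tobit_gibbs(y, X, c=0.0, n_iter=2000, seed=0):
    """Gibbs sampler for a Tobit model y* = X b + e, e ~ N(0, s2),
    with observed y = max(y*, c).  The censoring point c is zero in the
    classical Tobit; passing the true c is where the bias fix enters."""
    rng = np.random.default_rng(seed)
    n, k = X.shape
    cens = y <= c                          # censored observations
    beta = np.zeros(k)
    s2 = 1.0
    ystar = y.astype(float).copy()
    XtX_inv = np.linalg.inv(X.T @ X)
    draws = []
    for _ in range(n_iter):
        # 1. Data augmentation: draw latent y* for censored cases
        #    from a normal distribution truncated above at c.
        mu = X[cens] @ beta
        upper = (c - mu) / np.sqrt(s2)
        ystar[cens] = truncnorm.rvs(-np.inf, upper, loc=mu,
                                    scale=np.sqrt(s2), random_state=rng)
        # 2. Conjugate update for beta | y*, s2 (flat prior for simplicity).
        beta_hat = XtX_inv @ X.T @ ystar
        beta = rng.multivariate_normal(beta_hat, s2 * XtX_inv)
        # 3. Conjugate update for s2 | y*, beta (improper prior 1/s2).
        resid = ystar - X @ beta
        s2 = invgamma.rvs(a=n / 2, scale=resid @ resid / 2, random_state=rng)
        draws.append(beta.copy())
    return np.array(draws)

# Illustrative usage on synthetic data censored at c = 0.5:
# rng = np.random.default_rng(1)
# X = np.column_stack([np.ones(500), rng.normal(size=500)])
# y = np.maximum(X @ np.array([1.0, 2.0]) + rng.normal(size=500), 0.5)
# posterior_betas = tobit_gibbs(y, X, c=0.5)
```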
Abstract:
Estimation of population size with a missing zero-class is an important problem encountered in epidemiological assessment studies. Fitting a Poisson model to the observed data by the method of maximum likelihood and estimating the population size from this fit is an approach that has been widely used for this purpose. In practice, however, the Poisson assumption is seldom satisfied. Zelterman (1988) proposed a robust estimator for unclustered data that works well in a wide class of distributions applicable to count data. In the work presented here, we extend this estimator to clustered data. The estimator requires fitting a zero-truncated homogeneous Poisson model by maximum likelihood and then using a Horvitz-Thompson estimator of population size. This was found to work well when the data follow the hypothesized homogeneous Poisson model. However, when the true distribution deviates from the hypothesized model, the population size was found to be underestimated. In the search for a more robust estimator, we focused on three models that use all clusters with exactly one case, those with exactly two cases and those with exactly three cases to estimate the probability of the zero-class, and thereby use data collected on all the clusters in the Horvitz-Thompson estimator of population size. The loss in efficiency associated with the gain in robustness was examined in a simulation study. As a trade-off between gain in robustness and loss in efficiency, the model that uses data collected on clusters with at most three cases to estimate the probability of the zero-class was found to be preferred in general. In applications, we recommend obtaining estimates from all three models and making a choice by considering the estimates from the three models, their robustness and the loss in efficiency.
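For the unclustered, homogeneous-Poisson baseline described above, the two-step recipe (a zero-truncated Poisson maximum-likelihood fit, then a Horvitz-Thompson correction for the unseen zero-class) can be sketched as follows. This is a minimal illustration under those assumptions only, not the clustered extension developed in the paper; the function names and the example counts are hypothetical.

```python
import numpy as np
from scipy.optimize import brentq

def zt_poisson_mle(counts):
    """ML estimate of lambda for a zero-truncated Poisson sample.
    The truncated mean satisfies E[Y | Y > 0] = lambda / (1 - exp(-lambda)),
    so the MLE is the root of that equation at the sample mean (mean > 1 required)."""
    ybar = np.mean(counts)
    return brentq(lambda lam: lam / (1.0 - np.exp(-lam)) - ybar, 1e-8, ybar + 10.0)

def horvitz_thompson_size(counts):
    """Population-size estimate: each unit is observed with probability
    1 - P(Y = 0) = 1 - exp(-lambda), so N_hat = n_observed / (1 - exp(-lambda_hat))."""
    lam = zt_poisson_mle(counts)
    return len(counts) / (1.0 - np.exp(-lam))

# Example: counts per observed unit (the zero-class is missing by design).
counts = np.array([1, 1, 2, 1, 3, 1, 2, 1, 1, 4])
print(horvitz_thompson_size(counts))   # estimate of total N including the zero-class
```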
Abstract:
A sparse kernel density estimator is derived based on a zero-norm constraint, in which the zero-norm of the kernel weights is incorporated to enhance model sparsity. The classical Parzen window estimate is adopted as the desired response for density estimation, and an approximate function of the zero-norm is used to achieve mathematical tractability and algorithmic efficiency. Under the mild condition that the design matrix is positive definite, the kernel weights of the proposed density estimator based on the zero-norm approximation can be obtained using the multiplicative nonnegative quadratic programming algorithm. Using the -optimality based selection algorithm as a preprocessing step to select a small, significant subset design matrix, the proposed zero-norm based approach offers an effective means of constructing very sparse kernel density estimates with excellent generalisation performance.
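A simplified sketch of the fitting idea is given below: the kernel weights are matched to the Parzen window target by nonnegative least squares using Lee-Seung style multiplicative updates, with negligible weights then pruned. This is an illustrative stand-in under my own assumptions (Gaussian kernels, hard thresholding for sparsity), not the authors' exact MNQP formulation with the zero-norm approximation term; all function names are hypothetical.

```python
import numpy as np

def gaussian_gram(X, centers, h):
    """Gaussian kernel matrix Phi[i, j] = K_h(x_i, c_j) for bandwidth h."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * h ** 2)) / ((2 * np.pi * h ** 2) ** (X.shape[1] / 2))

def sparse_kde_weights(X, h, n_iter=500, tol=1e-10):
    """Fit kernel weights to the Parzen-window target by nonnegative least
    squares with multiplicative updates; small weights are then pruned."""
    Phi = gaussian_gram(X, X, h)            # kernels centred on the data points
    p_target = Phi.mean(axis=1)             # Parzen window estimate at the data points
    B = Phi.T @ Phi                         # nonnegative quadratic term
    v = Phi.T @ p_target                    # nonnegative linear term
    w = np.full(len(X), 1.0 / len(X))
    for _ in range(n_iter):
        w_new = w * v / (B @ w + 1e-12)     # multiplicative nonnegative update
        if np.max(np.abs(w_new - w)) < tol:
            w = w_new
            break
        w = w_new
    w[w < 1e-6] = 0.0                       # prune negligible kernels -> sparsity
    return w / w.sum()                      # renormalise so the density integrates to one
```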
Abstract:
The optimal and the zero-forcing beamformers are two commonly used algorithms in subspace-based blind beamforming. The optimal beamformer is regarded as the algorithm with the best output SINR, whereas the zero-forcing algorithm emphasizes co-channel interference cancellation. This paper compares the performance of the two algorithms under practical conditions: the effect of finite data length and the presence of angle-estimation error. The investigation reveals that the zero-forcing algorithm can be more robust in practical environments than the optimal algorithm.
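For concreteness, the two weight designs being compared can be written down for a uniform linear array as follows. This is a generic textbook-style sketch (MVDR/max-SINR weights from the array covariance versus minimum-norm weights that place exact nulls on the interferers) under my own assumptions about array geometry and naming; it is not the paper's specific subspace-based blind formulation.

```python
import numpy as np

def steering(theta_deg, n_ant, d=0.5):
    """Uniform linear array steering vector, element spacing d wavelengths."""
    k = np.arange(n_ant)
    return np.exp(2j * np.pi * d * k * np.sin(np.deg2rad(theta_deg)))

def zero_forcing_weights(angles_deg, n_ant):
    """Unit gain towards the first angle, exact nulls towards the others:
    solve A^H w = e_1, minimum-norm solution w = A (A^H A)^{-1} e_1."""
    A = np.column_stack([steering(t, n_ant) for t in angles_deg])
    e1 = np.zeros(len(angles_deg))
    e1[0] = 1.0
    return A @ np.linalg.solve(A.conj().T @ A, e1)

def optimal_weights(signal_angle_deg, R, n_ant):
    """MVDR / max-SINR weights from the array covariance R: w = R^{-1} a / (a^H R^{-1} a)."""
    a = steering(signal_angle_deg, n_ant)
    w = np.linalg.solve(R, a)
    return w / (a.conj().T @ w)

def output_sinr_db(w, a_s, p_s, R_in):
    """Output SINR: signal power over interference-plus-noise power at the beamformer output."""
    sig = p_s * np.abs(w.conj().T @ a_s) ** 2
    inr = np.real(w.conj().T @ R_in @ w)
    return 10 * np.log10(sig / inr)
```

In a finite-sample comparison of the kind described above, R would be replaced by a sample covariance and the angles by estimates, which is where the relative robustness of the two designs shows up.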
Abstract:
A new sparse kernel probability density function (pdf) estimator based on a zero-norm constraint is constructed using the classical Parzen window (PW) estimate as the target function. The so-called zero-norm of the parameters is used in order to achieve enhanced model sparsity, and it is suggested to minimize an approximate function of the zero-norm. It is shown that, under certain conditions, the kernel weights of the proposed pdf estimator based on the zero-norm approximation can be updated using the multiplicative nonnegative quadratic programming algorithm. Numerical examples are employed to demonstrate the efficacy of the proposed approach.
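A usage-level sketch of how such an estimator might be evaluated is shown below, reusing the hypothetical gaussian_gram and sparse_kde_weights helpers sketched earlier: the sparse estimate is compared against the full Parzen window estimate on held-out data. The data, bandwidth and thresholds are arbitrary illustrative choices, not the paper's numerical examples.

```python
import numpy as np

# Assumes gaussian_gram and sparse_kde_weights from the earlier sketch are in scope.
rng = np.random.default_rng(1)
X_train = rng.normal(0.0, 1.0, size=(200, 1))
X_test = rng.normal(0.0, 1.0, size=(100, 1))
h = 0.3

w = sparse_kde_weights(X_train, h)        # sparse: many weights exactly zero
Phi_test = gaussian_gram(X_test, X_train, h)

parzen_pdf = Phi_test.mean(axis=1)        # full Parzen window: equal weights 1/N
sparse_pdf = Phi_test @ w                 # only the retained kernels contribute

print("kernels retained:", int((w > 0).sum()), "of", len(X_train))
print("mean test log-likelihood (Parzen):", np.log(parzen_pdf).mean())
print("mean test log-likelihood (sparse):", np.log(sparse_pdf).mean())
```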
Abstract:
Following the US model, the UK has seen considerable innovation in the funding, finance and procurement of real estate over the last decade. In the growing CMBS market, asset-backed securitisations have included $2.25 billion secured on the Broadgate office development and issues secured on Canary Wharf and the Trafford Centre regional mall. Major occupiers (the retailer Sainsbury’s and the retail bank Abbey National) have engaged in innovative sale-and-leaseback and outsourcing schemes. Strong claims are made concerning the benefits of such schemes; for example, British Land were reported to have reduced their weighted cost of debt by 150bp as a result of the Broadgate issue. The paper reports preliminary findings from a project funded by the Corporation of London and the RICS Research Foundation examining a number of innovative schemes to identify, within a formal finance framework, sources of added value and hidden costs. The analysis indicates that many of the claimed gains conceal costs, in terms of the market value of debt or flexibility of management, while others result from unusual firm or market conditions (for example, utilising the UK long lease and the unusual shape of the yield curve). Nonetheless, there are real gains resulting from the innovations, reflecting arbitrage and institutional constraints in the direct (private) real estate market.
Abstract:
In October 2008 the UK government announced a very ambitious commitment to reduce greenhouse gas emissions by at least 34% by 2020 and by 80% by 2050 against a 1990 baseline. Consequently, the government declared that new homes should be built to high environmental standards, which means that from 2016 new homes will have to be built to a Zero Carbon standard. The paper presents a UK zero-carbon residential development achieving Level 6, the highest level of the Code for Sustainable Homes standard. Comprehensive information is provided about various environmental aspects of the housing development. Special attention is given to the energy-efficiency features of the houses and to the low-carbon district heating solution, which includes a biomass boiler, heat pumps, solar collectors and photovoltaic panels. The paper also presents the challenges which designers and builders had to face in delivering these houses of the future.
Abstract:
Housing in the UK accounts for 30.5% of all energy consumed and is responsible for 25% of all carbon emissions. The UK Government’s Code for Sustainable Homes requires all new homes to be zero carbon by 2016. The development and widespread diffusion of low and zero carbon (LZC) technologies is recognised as a key means for housing developers to deliver against this zero-carbon agenda. The innovation challenge of designing and incorporating these technologies into housing developers’ standard design and production templates will usher in significant technical and commercial risks. In this paper we report early results from an ongoing Engineering and Physical Sciences Research Council project looking at the innovation logic and trajectory of LZC technologies in new housing. The principal theoretical lens for the research is the socio-technical network approach, which considers actors’ interests and interpretative flexibilities of technologies, and how they negotiate and reproduce ‘acting spaces’ to shape, in this case, the selection and adoption of LZC technologies. The initial findings reveal that the technology networks forming around new housing developments are very complex in form and operation, involving a range of actors and viewpoints that vary for each housing development.
Abstract:
The UK Government is committed to all new homes being zero-carbon from 2016. The use of low and zero carbon (LZC) technologies is recognised by housing developers as being a key part of the solution to deliver against this zero-carbon target. The paper takes as its starting point that the selection of new technologies by firms is not a phenomenon which takes place within a rigid sphere of technical rationality (for example, Rip and Kemp, 1998). Rather, technology forms and diffusion trajectories are driven and shaped by myriad socio-technical structures, interests and logics. A literature review is offered to contribute to a more critical and systemic foundation for understanding the socio-technical features of the selection of LZC technologies in new housing. The problem is investigated through a multidisciplinary lens consisting of two perspectives: technological and institutional. The synthesis of the perspectives crystallises the need to understand that the selection of LZC technologies by housing developers is not solely dependent on technical or economic efficiency, but on the emergent ‘fit’ between the intrinsic properties of the technologies, institutional logics and the interests and beliefs of various actors in the housing development process.