39 results for minimum contrast estimator
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
This work provides a general framework for the design of second-order blind estimators without adopting any approximation about the observation statistics or the a priori distribution of the parameters. The proposed solution is obtained by minimizing the estimator variance subject to some constraints on the estimator bias. The resulting optimal estimator is found to depend on the observation fourth-order moments that can be calculated analytically from the known signal model. Unfortunately, in most cases, the performance of this estimator is severely limited by the residual bias inherent to nonlinear estimation problems. To overcome this limitation, the second-order minimum variance unbiased estimator is deduced from the general solution by assuming accurate prior information on the vector of parameters. This small-error approximation is adopted to design iterative estimators or trackers. It is shown that the associated variance constitutes the lower bound for the variance of any unbiased estimator based on the sample covariance matrix. The paper formulation is then applied to track the angle-of-arrival (AoA) of multiple digitally-modulated sources by means of a uniform linear array. The optimal second-order tracker is compared with the classical maximum likelihood (ML) blind methods that are shown to be quadratic in the observed data as well. Simulations have confirmed that the discrete nature of the transmitted symbols can be exploited to improve considerably the discrimination of near sources in medium-to-high SNR scenarios.
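The following minimal Python sketch only illustrates, in generic terms, what a "second-order" (covariance-based) direction-of-arrival method looks like for a uniform linear array: it forms the sample covariance matrix from snapshots of digitally modulated sources and scans a quadratic beamforming criterion over candidate angles. It is not the paper's optimal second-order estimator; the array size, SNR, and source angles are invented for the example.

```python
# Generic covariance-based AoA scan for a ULA (illustrative only; not the
# paper's optimal second-order estimator). It shows why such methods are
# "quadratic in the observed data": everything enters through R = X X^H / N.
import numpy as np

def ula_steering(theta_deg, n_sensors, spacing=0.5):
    """Steering vector of a uniform linear array with half-wavelength spacing."""
    theta = np.deg2rad(theta_deg)
    k = np.arange(n_sensors)
    return np.exp(2j * np.pi * spacing * k * np.sin(theta))

rng = np.random.default_rng(0)
n_sensors, n_snapshots, snr_db = 8, 200, 10
true_angles = [-10.0, 15.0]                      # assumed source directions

# QPSK-like discrete symbols impinging on the array plus white noise.
A = np.column_stack([ula_steering(a, n_sensors) for a in true_angles])
symbols = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j],
                     size=(len(true_angles), n_snapshots)) / np.sqrt(2)
noise = (rng.standard_normal((n_sensors, n_snapshots))
         + 1j * rng.standard_normal((n_sensors, n_snapshots))) / np.sqrt(2)
x = A @ symbols + noise * 10 ** (-snr_db / 20)

# Sample covariance matrix: the second-order statistic shared by these methods.
R = x @ x.conj().T / n_snapshots

# Quadratic criterion scanned over candidate angles (conventional beamformer).
grid = np.linspace(-60, 60, 601)
spectrum = [np.real(ula_steering(a, n_sensors).conj() @ R @ ula_steering(a, n_sensors))
            for a in grid]
print("spectrum peak at %.1f degrees (true angles: %s)"
      % (grid[np.argmax(spectrum)], true_angles))
```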
Abstract:
A new parametric minimum distance time-domain estimator for ARFIMA processes is introduced in this paper. The proposed estimator minimizes the sum of squared correlations of residuals obtained after filtering a series through ARFIMA parameters. The estimator is easy to compute and is consistent and asymptotically normally distributed for fractionally integrated (FI) processes with an integration order d strictly greater than -0.75. Therefore, it can be applied to both stationary and non-stationary processes. Deterministic components are also allowed in the DGP. Furthermore, as a by-product, the estimation procedure provides an immediate check on the adequacy of the specified model. This is so because the criterion function, when evaluated at the estimated values, coincides with the Box-Pierce goodness of fit statistic. Empirical applications and Monte-Carlo simulations supporting the analytical results and showing the good performance of the estimator in finite samples are also provided.
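As a hedged illustration of the criterion for the simplest case, a pure FI(d) process with no AR or MA parts and no deterministic components, the sketch below filters the series by fractional differencing at a candidate d and minimizes the sum of squared residual autocorrelations, i.e. a Box-Pierce-type objective. It is not the authors' implementation, and the simulated series and lag count are invented.

```python
# Minimum-distance estimation of d for a pure FI(d) process (illustrative sketch).
import numpy as np
from scipy.optimize import minimize_scalar

def frac_diff(y, d):
    """Apply the (1-L)^d filter via its binomial expansion coefficients."""
    n = len(y)
    coeffs = np.empty(n)
    coeffs[0] = 1.0
    for k in range(1, n):
        coeffs[k] = coeffs[k - 1] * (k - 1 - d) / k
    return np.array([coeffs[:t + 1][::-1] @ y[:t + 1] for t in range(n)])

def criterion(d, y, n_lags=20):
    """Sum of squared autocorrelations of the residuals filtered at candidate d."""
    e = frac_diff(y, d)
    e = e - e.mean()
    denom = e @ e
    acf = np.array([e[k:] @ e[:-k] / denom for k in range(1, n_lags + 1)])
    return float(acf @ acf)

# Simulate an FI(d) series with d = 0.3 (invented), then estimate d.
rng = np.random.default_rng(1)
eps = rng.standard_normal(500)
y = frac_diff(eps, -0.3)          # applying (1-L)^(-0.3) integrates the noise
d_hat = minimize_scalar(criterion, bounds=(-0.49, 0.49), args=(y,),
                        method="bounded").x
print("estimated d:", round(d_hat, 3))
```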
Abstract:
The harmful dinoflagellate Prorocentrum minimum has different effects upon various species of grazing bivalves, and these effects also vary with life-history stage. Possible effects of this dinoflagellate upon mussels have not been reported; therefore, experiments exposing adult blue mussels, Mytilus edulis, to P. minimum were conducted. Mussels were exposed to cultures of toxic P. minimum or benign Rhodomonas sp. in glass aquaria. After a short period of acclimation, samples were collected on day 0 (before the exposure) and after 3, 6, and 9 days of continuous exposure. Hemolymph was extracted for flow-cytometric analyses of hemocyte immune-response functions, and soft tissues were excised for histopathology. Mussels responded to P. minimum exposure with diapedesis of hemocytes into the intestine, presumably to isolate P. minimum cells within the gut, thereby minimizing damage to other tissues. This immune response appeared to have been sustained throughout the 9-day exposure period, as circulating hemocytes retained hematological and functional properties. Bacteria proliferated in the intestines of the P. minimum-exposed mussels. Hemocytes within the intestine appeared to be either overwhelmed by the large number of bacteria or fully occupied in the encapsulating response to P. minimum cells; once hemocytes reached the intestinal lumina, they underwent apoptosis and bacterial degradation. This experiment demonstrated that M. edulis is affected by ingestion of toxic P. minimum; however, the specific responses observed in the blue mussel differed from those reported for other bivalve species. This finding highlights the need to study the effects of HABs on different bivalve species, rather than inferring that results from one species reflect the exposure responses of all bivalves.
Abstract:
In a distribution problem, and specifically in bankruptcy issues, the Proportional (P) and the Egalitarian (EA) divisions are two of the most popular ways to resolve the conflict. The Constrained Equal Awards rule (CEA) is introduced in bankruptcy literature to ensure that no agent receives more than her claim, a problem that can arise when using the egalitarian division. We propose an alternative modification, by using a convex combination of P and EA. The recursive application of this new rule finishes at the CEA rule. Our solution concept ensures a minimum amount to each agent, and distributes the remaining estate in a proportional way. Keywords: Bankruptcy problems, Proportional rule, Equal Awards, Convex combination of rules, Lorenz dominance. JEL classification: C71, D63, D71.
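The divisions named in the abstract are straightforward to state in code. The sketch below implements the Proportional, Egalitarian, and Constrained Equal Awards rules together with a generic convex combination of P and EA; the combination shown is purely illustrative and need not coincide with the paper's recursive rule.

```python
# Basic bankruptcy-division rules (illustrative sketch).
from typing import List

def proportional(estate: float, claims: List[float]) -> List[float]:
    """Each agent receives a share proportional to her claim."""
    total = sum(claims)
    return [estate * c / total for c in claims]

def egalitarian(estate: float, claims: List[float]) -> List[float]:
    """Equal split, ignoring claims (an agent may get more than she claims)."""
    return [estate / len(claims)] * len(claims)

def cea(estate: float, claims: List[float], tol: float = 1e-12) -> List[float]:
    """Constrained Equal Awards: a common award, capped at each agent's claim."""
    lo, hi = 0.0, max(claims)
    while hi - lo > tol:                      # bisection on the common award
        lam = (lo + hi) / 2
        if sum(min(c, lam) for c in claims) < estate:
            lo = lam
        else:
            hi = lam
    return [min(c, lo) for c in claims]

def convex_combination(estate: float, claims: List[float], lam: float) -> List[float]:
    """lam * P + (1 - lam) * EA (an illustrative mixture of the two divisions)."""
    p = proportional(estate, claims)
    ea = egalitarian(estate, claims)
    return [lam * a + (1 - lam) * b for a, b in zip(p, ea)]

claims, estate = [10.0, 30.0, 60.0], 60.0     # invented example
print("P  :", proportional(estate, claims))   # [6, 18, 36]
print("EA :", egalitarian(estate, claims))    # agent 1 gets more than her claim
print("CEA:", cea(estate, claims))            # [10, 25, 25]
print("mix:", convex_combination(estate, claims, lam=0.5))
```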
Abstract:
One of the most effective techniques offering QoS routing is minimum interference routing. However, it is complex in terms of computation time and is not oriented toward improving the network protection level. In order to include better levels of protection, new minimum interference routing algorithms are necessary. Minimizing the failure recovery time is also a complex process involving different failure recovery phases. Some of these phases depend completely on correct routing selection, such as minimizing the failure notification time. The level of protection also involves other aspects, such as the amount of resources used; in this case, shared backup techniques should be considered. Therefore, minimum interference techniques should also be modified in order to include resource sharing for protection among their objectives. These aspects are reviewed and analyzed in this article, and a new proposal combining minimum interference with fast protection using shared segment backups is introduced. Results show that the proposed method reduces both the request rejection ratio and the percentage of bandwidth allocated to backup paths in networks with low and medium protection requirements.
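As a toy illustration of the minimum-interference idea only, without the protection or shared-backup machinery the article introduces, the sketch below weights each link by a crude criticality proxy (inverse residual capacity) and routes each request on the least-critical path, so that links likely to be needed by future demands are avoided. The topology, capacities, and demands are invented.

```python
# Toy minimum-interference-flavoured routing (illustrative; not the article's algorithm).
import networkx as nx

G = nx.Graph()
for u, v, cap in [("A", "B", 10), ("B", "D", 10), ("A", "C", 10),
                  ("C", "D", 10), ("B", "C", 2)]:
    G.add_edge(u, v, capacity=cap, used=0.0)

def route(G, src, dst, demand):
    """Route the demand on the path minimizing summed link criticality."""
    for u, v, data in G.edges(data=True):
        residual = data["capacity"] - data["used"]
        data["criticality"] = float("inf") if residual < demand else 1.0 / residual
    path = nx.shortest_path(G, src, dst, weight="criticality")
    for u, v in zip(path, path[1:]):
        G[u][v]["used"] += demand
    return path

print(route(G, "A", "D", demand=3))   # avoids the scarce B-C link
print(route(G, "A", "D", demand=3))   # second request shifts to the other path
```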
Abstract:
The R-package "compositions" is a tool for advanced compositional analysis. Its basic functionality has seen some conceptual improvement, containing now some facilities to work with and represent ilr bases built from balances, and an elaborated subsystem for dealing with several kinds of irregular data: (rounded or structural) zeroes, incomplete observations and outliers. The general approach to these irregularities is based on subcompositions: for an irregular datum, one can distinguish a "regular" subcomposition (where all parts are actually observed and the datum behaves typically) and a "problematic" subcomposition (with those unobserved, zero or rounded parts, or else where the datum shows an erratic or atypical behaviour). Systematic classification schemes are proposed for both outliers and missing values (including zeros), focusing on the nature of irregularities in the datum subcomposition(s). To compute statistics with values missing at random and structural zeros, a projection approach is implemented: a given datum contributes to the estimation of the desired parameters only on the subcomposition where it was observed. For data sets with values below the detection limit, two different approaches are provided: the well-known imputation technique, and also the projection approach. To compute statistics in the presence of outliers, robust statistics are adapted to the characteristics of compositional data, based on the minimum covariance determinant approach. The outlier classification is based on four different models of outlier occurrence and Monte-Carlo-based tests for their characterization. Furthermore, the package provides special plots helping to understand the nature of outliers in the dataset. Keywords: coda-dendrogram, lost values, MAR, missing data, MCD estimator, robustness, rounded zeros
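The package itself is written in R, but the core idea of robust outlier detection for compositional data can be sketched in a few lines of Python: move the compositions into log-ratio coordinates and apply a minimum covariance determinant (MCD) fit. The additive log-ratio transform used below is a simplification standing in for the package's ilr balances, and the simulated three-part data are invented.

```python
# MCD-based outlier screening for compositional data (illustrative sketch).
import numpy as np
from scipy.stats import chi2
from sklearn.covariance import MinCovDet

def alr(x):
    """Additive log-ratio transform: log of each part relative to the last one."""
    return np.log(x[:, :-1] / x[:, -1:])

rng = np.random.default_rng(0)
# Simulated 3-part compositions: mostly regular rows, a few atypical rows appended.
regular = rng.dirichlet([8.0, 4.0, 2.0], size=200)
atypical = rng.dirichlet([1.0, 1.0, 20.0], size=5)
data = np.vstack([regular, atypical])

z = alr(data)                                   # full-rank log-ratio coordinates
mcd = MinCovDet(random_state=0).fit(z)
d2 = mcd.mahalanobis(z)                         # robust squared Mahalanobis distances
cutoff = chi2.ppf(0.975, df=z.shape[1])
print("rows flagged as outliers:", np.where(d2 > cutoff)[0])
```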
Abstract:
PURPOSE: To study the effect of LASIK surgery on straylight and contrast sensitivity. METHODS: Twenty-eight patients were treated with LASIK. Visual quality was assessed before the operation and two months afterwards. RESULTS: Mean straylight and contrast sensitivity measured before the operation had not changed two months afterwards. Only one eye showed a marked increase in straylight. Nine eyes showed a slight decrease in contrast sensitivity. Two complications were found. CONCLUSION: After LASIK, most patients (80%) had no complications and maintained their visual quality. A few patients (16%) had somewhat reduced visual quality. Very few (4%) had clinical complications with reduced visual quality.
Abstract:
An overview is given of a study which showed that the generalized maximum hardness principle (GMHP) and the generalized minimum polarizability principle (GMPP) may not be obeyed, not only in chemical reactions but also in the favorable case of nontotally symmetric vibrations, where the chemical and external potentials remain approximately constant. A method that allows an accurate determination of the nontotally symmetric molecular distortions with more marked GMPP or anti-GMPP character, through diagonalization of the polarizability Hessian matrix, is introduced.
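The diagonalization step can be illustrated numerically. In the sketch below, a symmetric Hessian of the mean polarizability with respect to nontotally symmetric normal coordinates (an invented 3x3 matrix; in practice it would come from quantum-chemical derivative calculations) is diagonalized with numpy, and, as an assumption of this sketch, the sign of each eigenvalue is read as indicating whether polarizability increases (GMPP-consistent) or decreases (anti-GMPP character) along the corresponding distortion.

```python
# Diagonalizing an illustrative polarizability Hessian over nontotally
# symmetric normal coordinates Q_1..Q_3 (values invented; sign reading is an
# assumption of this sketch, not taken from the paper).
import numpy as np

alpha_hessian = np.array([
    [ 2.1, -0.4,  0.0],
    [-0.4,  1.3,  0.2],
    [ 0.0,  0.2, -0.7],
])  # d^2(alpha)/dQ_i dQ_j in arbitrary units

eigvals, eigvecs = np.linalg.eigh(alpha_hessian)
for lam, vec in zip(eigvals, eigvecs.T):
    trend = ("polarizability increases (GMPP-consistent)" if lam > 0
             else "polarizability decreases (anti-GMPP character)")
    print(f"eigenvalue {lam:+.2f}, distortion {np.round(vec, 2)} -> {trend}")
```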
Abstract:
The hypothesis of minimum entropy production is applied to a simple one-dimensional energy balance model and is analysed for different values of the radiative forcing due to greenhouse gases. The extremum principle is used to determine the planetary “conductivity” and to avoid the “diffusive” approximation, which is commonly assumed in this type of model. For present conditions the result at minimum radiative entropy production is similar to that obtained by applying the classical model. Other climatic scenarios show visible differences, with better behaviour for the extremal case.
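For context, the sketch below implements the kind of classical one-dimensional energy balance model the abstract takes as its reference point, using a standard Budyko-type transport closure C*(T - T_mean), i.e. an assumed "conductivity" of the sort the paper's minimum-entropy-production condition is meant to determine instead. All parameter values are conventional textbook-style choices, not the paper's.

```python
# Classical 1-D (zonally averaged) energy balance model with an assumed
# transport coefficient (illustrative baseline; not the paper's extremum calculation).
import numpy as np

A, B, C = 204.0, 2.17, 3.8          # OLR = A + B*T (W/m^2, T in deg C); transport coeff.
lat = np.deg2rad(np.linspace(-85, 85, 18))
# Absorbed solar radiation: global mean ~238 W/m^2 with a crude P2(sin lat) shape.
S = 340.0 * (1 - 0.3) * (1 - 0.48 * (1.5 * np.sin(lat) ** 2 - 0.5))

# Global mean balance (area weights ~ cos latitude): mean(S) = A + B*T_mean.
w = np.cos(lat) / np.cos(lat).sum()
T_mean = (np.sum(w * S) - A) / B

# Local balance: S = A + B*T + C*(T - T_mean)  =>  T = (S - A + C*T_mean) / (B + C)
T = (S - A + C * T_mean) / (B + C)
print("global mean temperature (deg C):", round(T_mean, 1))
print("equator-pole contrast (deg C):", round(T.max() - T.min(), 1))
```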
Abstract:
Tone mapping is the problem of compressing the range of a high-dynamic-range image so that it can be displayed on a low-dynamic-range screen without losing or introducing novel details: the final image should produce in the observer a sensation as close as possible to the perception produced by the real-world scene. We propose a tone mapping operator with two stages. The first stage is a global method that implements visual adaptation, based on experiments on human perception; in particular, we point out the importance of cone saturation. The second stage performs local contrast enhancement, based on a variational model inspired by color vision phenomenology. We evaluate this method with a metric validated by psychophysical experiments and, in terms of this metric, our method compares very well with the state of the art.
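The two-stage structure can be illustrated with a toy tone mapper: a global Naka-Rushton-type adaptation curve followed by a local contrast boost around a Gaussian-blurred mean. Both stages are simplified stand-ins, not the operator described in the abstract, and the synthetic HDR image is invented.

```python
# Toy two-stage tone mapper: global adaptation, then local contrast enhancement.
import numpy as np
from scipy.ndimage import gaussian_filter

def tone_map(hdr_luminance, sigma_local=5.0, gain=0.4):
    L = np.asarray(hdr_luminance, dtype=float)

    # Stage 1: global adaptation. A Naka-Rushton-type response compresses the
    # dynamic range, with the semi-saturation constant tied to the scene's
    # log-average luminance.
    semi_sat = np.exp(np.mean(np.log(L + 1e-6)))
    global_stage = L / (L + semi_sat)

    # Stage 2: local contrast enhancement by amplifying deviations from a
    # Gaussian-blurred local mean (a crude surrogate for a variational method).
    local_mean = gaussian_filter(global_stage, sigma_local)
    out = global_stage + gain * (global_stage - local_mean)
    return np.clip(out, 0.0, 1.0)

# Synthetic HDR test image spanning several orders of magnitude.
rng = np.random.default_rng(0)
hdr = 10 ** rng.uniform(-2, 3, size=(64, 64))
ldr = tone_map(hdr)
print("input range:", hdr.min(), "-", hdr.max(), "| output range:", ldr.min(), "-", ldr.max())
```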
Abstract:
The paper contrasts empirically the results of alternative methods for estimating the value and the depreciation of mineral resources. The historical data of Mexico and Venezuela, covering the period 1920s-1980s, is used to contrast the results of several methods. These are the present value, the net price method, the user cost method and the imputed income method. The paper establishes that the net price and the user cost are not competing methods as such, but alternative adjustments to different scenarios of closed and open economies. The results prove that the biases of the methods, as commonly described in the theoretical literature, only hold under the most restricted scenario of constant rents over time. It is argued that the difference between what is expected to happen and what actually did happen is for the most part due to a missing variable, namely technological change. This is an important caveat to the recommendations made based on these models.
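A small worked example helps separate two of the methods named above. With invented figures, the net price method charges the whole resource rent as depreciation, while El Serafy's user cost method counts only the non-sustainable part of the rent, R/(1+r)^(n+1), as depreciation; the comparison below is illustrative only.

```python
# Net price vs. user cost (El Serafy) depreciation on made-up figures.
price, unit_cost = 50.0, 30.0        # per unit extracted, hypothetical
extraction = 1_000_000               # units extracted this year
discount_rate = 0.05
reserve_life = 20                    # remaining years of extraction at this rate

rent = (price - unit_cost) * extraction          # total resource rent R

# Net price method: depreciation equals the whole rent.
dep_net_price = rent

# User cost method (El Serafy): sustainable income X = R * (1 - 1/(1+r)^(n+1)),
# so depreciation (the user cost) is R - X = R / (1+r)^(n+1).
sustainable_income = rent * (1 - 1 / (1 + discount_rate) ** (reserve_life + 1))
dep_user_cost = rent - sustainable_income

print(f"resource rent:           {rent:,.0f}")
print(f"net price depreciation:  {dep_net_price:,.0f}")
print(f"user cost depreciation:  {dep_user_cost:,.0f}")
```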
Abstract:
In this paper we explore the effects of the minimum pension program on welfare and retirement in Spain. This is done with a stylized life-cycle model which provides a convenient analytical characterization of optimal behavior. We use data from the Spanish Social Security to estimate the behavioral parameters of the model and then simulate the changes induced by the minimum pension in aggregate retirement patterns. The impact is substantial: there is a threefold increase in retirement at 60 (the age of first entitlement) with respect to the economy without minimum pensions, and total early retirement (before or at 60) is almost 50% larger.
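The mechanism can be made concrete with a deliberately crude toy, which is not the paper's estimated life-cycle model: once a pension floor is claimable at 60, workers whose earned benefit falls below the floor gain nothing from additional contribution years and cluster at the first entitlement age. All parameters below are invented, and only the qualitative spike at 60 should be read from the output.

```python
# Toy retirement-age choice with and without a minimum pension floor (illustrative).
import numpy as np

START, FIRST_CLAIM, DEATH = 20, 60, 80
ACCRUAL, WORK_DISUTILITY, FLOOR = 0.02, 0.35, 9.0   # invented parameters

def optimal_retirement_age(wage, minimum_pension):
    best_age, best_value = None, -np.inf
    for age in range(FIRST_CLAIM - 5, DEATH):        # allowed retirement ages 55..79
        benefit = ACCRUAL * wage * (age - START)     # earned pension
        if minimum_pension and age >= FIRST_CLAIM:
            benefit = max(benefit, FLOOR)            # floor claimable only from 60
        value = ((age - START) * wage * (1 - WORK_DISUTILITY)
                 + (DEATH - age) * benefit)          # crude lifetime payoff
        if value > best_value:
            best_age, best_value = age, value
    return best_age

rng = np.random.default_rng(0)
wages = rng.lognormal(mean=2.3, sigma=0.4, size=10_000)

for floor_on in (False, True):
    ages = np.array([optimal_retirement_age(w, floor_on) for w in wages])
    share_at_60 = np.mean(ages == FIRST_CLAIM)
    print(f"minimum pension {'on ' if floor_on else 'off'}: "
          f"share retiring at 60 = {share_at_60:.2%}")
```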
Abstract:
We analyze the impact of a minimum price variation (tick) and time priority on the dynamics of quotes and the trading costs when competition for the order flow is dynamic. We find that convergence to competitive outcomes can take time and that the speed of convergence is influenced by the tick size, the priority rule and the characteristics of the order arrival process. We show also that a zero minimum price variation is never optimal when competition for the order flow is dynamic. We compare the trading outcomes with and without time priority. Time priority is shown to guarantee that uncompetitive spreads cannot be sustained over time. However it can sometimes result in higher trading costs. Empirical implications are proposed. In particular, we relate the size of the trading costs to the frequency of new offers and the dynamics of the inside spread to the state of the book.
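A toy undercutting simulation makes the tick-size effect concrete: if dealers improve the best quote by one tick per arrival until the competitive price is reached, a smaller tick mechanically means more quote revisions before spreads become competitive. This is only an illustration of the mechanism with invented numbers, not the paper's model.

```python
# Toy one-tick undercutting dynamics (illustrative; not the paper's model).
def rounds_to_competitive(initial_ask: float, competitive_ask: float, tick: float) -> int:
    """Number of one-tick improvements needed to reach the competitive quote."""
    ask, rounds = initial_ask, 0
    while ask - competitive_ask >= tick:     # a dealer undercuts while it stays profitable
        ask -= tick
        rounds += 1
    return rounds

for tick in (0.25, 0.05, 0.01):
    n = rounds_to_competitive(initial_ask=101.0, competitive_ask=100.0, tick=tick)
    print(f"tick {tick:>5}: {n} quote revisions before the spread becomes competitive")
```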