977 results for Covariance matrix estimation
Abstract:
Traditional mosquito control strategies rely heavily on the use of chemical insecticides. However, concerns about the efficacy of traditional control methods, their environmental impact and emerging insecticide resistance have highlighted the need to develop innovative tools for mosquito control. Some novel strategies, including the release of insects carrying a dominant lethal gene (RIDL®), rely on the sustained release of modified male mosquitoes and therefore benefit from a thorough understanding of the biology of the male of the species. In this report we present the results of a mark-release-recapture study aimed at: (i) establishing the survival in the field of laboratory-reared, wild-type male Aedes aegypti and (ii) estimating the size of the local adult Ae. aegypti population. The study took place in Panama, a country where recent increases in the incidence and severity of dengue cases have prompted health authorities to evaluate alternative strategies for vector control. Results suggest a life expectancy of 2.3 days for released male mosquitoes (confidence interval: 1.78-2.86). Overall, the male mosquito population was estimated at 58 males/ha (range 12-81 males/ha), which can be extrapolated to an average of 0.64 pupae/person for the study area. The practical implications of these results are discussed.
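The abstract reports the estimates but not the estimators. As a hedged illustration only, the Python sketch below fits a log-linear decay to hypothetical daily recapture counts to obtain a daily survival probability p and the implied life expectancy -1/ln(p), and applies the classical Lincoln-Petersen index for population size; every number in it is invented, not taken from the study.

```python
import numpy as np

# Hypothetical recapture counts on successive days after release.
days = np.array([1, 2, 3, 4, 5, 6])
recaptures = np.array([40, 25, 16, 10, 6, 4])

# Log-linear decay: log(recaptures) = a + b*day, so daily survival p = exp(b).
b, a = np.polyfit(days, np.log(recaptures), 1)
p = np.exp(b)                        # estimated daily survival probability
life_expectancy = -1.0 / np.log(p)   # mean lifespan under exponential survival

# Lincoln-Petersen index: N ~ marked released * total captured / marked recaptured.
released, captured, marked_recaptured = 5000, 600, 101
N_hat = released * captured / marked_recaptured

print(f"p = {p:.2f}, life expectancy = {life_expectancy:.1f} d, N = {N_hat:.0f}")
```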
Abstract:
In this paper we propose a parsimonious regime-switching approach to modelling the correlations between assets: the threshold conditional correlation (TCC) model. This method allows the dynamics of the correlations to change from one state (or regime) to another as a function of observable transition variables. Our model is similar in spirit to Silvennoinen and Teräsvirta (2009) and Pelletier (2006), but with the appealing feature that it does not suffer from the curse of dimensionality. In particular, estimation of the parameters of the TCC involves a simple grid search procedure. In addition, it is easy to guarantee a positive definite correlation matrix, because the TCC estimator is given by the sample correlation matrix, which is positive definite by construction. The methodology is illustrated by evaluating the behaviour of international equities, government bonds and major exchange rates, first separately and then jointly. We also test and allow for different parts of the correlation matrix to be governed by different transition variables. To this end, we estimate a multi-threshold TCC specification. Further, we evaluate the economic performance of the TCC model against a constant conditional correlation (CCC) estimator using a Diebold-Mariano-type test. We conclude that threshold correlation modelling gives rise to a significant reduction in portfolio variance.
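The abstract specifies that the TCC estimator is the regime-wise sample correlation matrix and that thresholds are found by a simple grid search; the choices below (a Gaussian quasi-likelihood as grid-search criterion, a quantile-based threshold grid, a minimum per-regime sample size) are assumptions for a minimal single-threshold sketch, not details from the paper.

```python
import numpy as np

def gaussian_corr_loglik(u, R):
    """Gaussian quasi-log-likelihood of standardized returns u (T x k) under correlation R."""
    _, logdet = np.linalg.slogdet(R)
    quad = np.einsum('ti,ij,tj->t', u, np.linalg.inv(R), u)
    return -0.5 * (len(u) * logdet + quad.sum())

def tcc_fit(u, z, q_grid=np.linspace(0.15, 0.85, 15), min_obs=30):
    """Grid search over thresholds c on the transition variable z.
    Each regime's correlation matrix is the sample correlation matrix,
    hence positive (semi)definite by construction."""
    best = (-np.inf, None, None, None)
    for q in q_grid:
        c = np.quantile(z, q)
        lo, hi = z <= c, z > c
        if lo.sum() < min_obs or hi.sum() < min_obs:
            continue
        R1, R2 = np.corrcoef(u[lo].T), np.corrcoef(u[hi].T)
        ll = gaussian_corr_loglik(u[lo], R1) + gaussian_corr_loglik(u[hi], R2)
        if ll > best[0]:
            best = (ll, c, R1, R2)
    return best[1:]   # threshold and the two regime correlation matrices
```

A multi-threshold specification, as in the paper, would nest this search over several transition variables.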
Abstract:
This paper focuses on the problem of locating single-phase faults in mixed distribution electric systems, with overhead lines and underground cables, using voltage and current measurements at the sending end and a sequence model of the network. Since calculating the series impedance of underground cables is not as simple as in the case of overhead lines, the paper proposes a methodology to estimate the zero-sequence impedance of underground cables from previous single-phase faults that occurred in the system, in which an electric arc was present at the fault location. For this reason, the signal is first pre-processed to remove its voltage peaks, so that the analysis can work with a signal as close to a sine wave as possible.
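As background to the sequence-model formulation, the sketch below computes the symmetrical (Fortescue) components from sending-end phase phasors and a crude apparent zero-sequence impedance V0/I0. The phasor values are hypothetical, and the paper's actual estimation procedure, which exploits the arc at the fault, is more elaborate.

```python
import numpy as np

a = np.exp(2j * np.pi / 3)                 # Fortescue rotation operator
A_INV = np.array([[1, 1,     1],
                  [1, a,     a**2],
                  [1, a**2,  a]]) / 3

def sequence_components(abc):
    """Zero-, positive- and negative-sequence phasors from phase quantities."""
    return A_INV @ np.asarray(abc, dtype=complex)

# Hypothetical sending-end phasors during an A-to-ground fault (per unit).
V_abc = [0.35 * np.exp(0.1j), np.exp(-2j * np.pi / 3), np.exp(2j * np.pi / 3)]
I_abc = [2.4 * np.exp(-1.2j), 0.1, 0.1]

V0, V1, V2 = sequence_components(V_abc)
I0, I1, I2 = sequence_components(I_abc)
Z0_apparent = V0 / I0    # crude apparent zero-sequence impedance at the sending end
print(Z0_apparent)
```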
Abstract:
The quantitative estimation of sea surface temperatures (SST) from fossil assemblages is a fundamental issue in palaeoclimatic and palaeoceanographic investigations. The Modern Analogue Technique, a widely adopted method based on direct comparison of fossil assemblages with modern core-top samples, was revised with the aim of conforming it to compositional data analysis. The new CODAMAT method was developed by adopting the Aitchison metric as the distance measure. Modern core-top datasets are characterised by a large number of zeros. The zero replacement was carried out by a Bayesian approach based on posterior estimation of the parameter of the multinomial distribution. The number of modern analogues from which to reconstruct the SST was determined by a multiple approach, considering the proxies correlation matrix, the Standardized Residual Sum of Squares and the Mean Squared Distance. The new CODAMAT method was applied to the planktonic foraminiferal assemblages of a core recovered in the Tyrrhenian Sea. Key words: Modern analogues, Aitchison distance, Proxies correlation matrix, Standardized Residual Sum of Squares
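A minimal sketch of the two compositional ingredients named above, under assumptions the abstract does not spell out (a symmetric Dirichlet prior for the Bayesian zero replacement; the clr form of the Aitchison distance):

```python
import numpy as np

def bayes_zero_replace(counts, alpha=0.5):
    """Posterior-mean multinomial proportions under a symmetric Dirichlet(alpha)
    prior; observed zeros become small positive values."""
    counts = np.asarray(counts, dtype=float)
    return (counts + alpha) / (counts.sum() + alpha * counts.size)

def clr(x):
    """Centred log-ratio transform of a composition."""
    logx = np.log(x)
    return logx - logx.mean()

def aitchison_distance(x, y):
    """Aitchison distance = Euclidean distance between clr-transformed compositions."""
    return np.linalg.norm(clr(x) - clr(y))

def nearest_analogues(fossil_counts, coretop_counts, k=5):
    """Indices of the k modern core-top samples closest to the fossil assemblage
    under the Aitchison metric."""
    f = bayes_zero_replace(fossil_counts)
    d = [aitchison_distance(f, bayes_zero_replace(c)) for c in coretop_counts]
    return np.argsort(d)[:k]
```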
Abstract:
By using both conventional and confocal laser scanning microscopy with three monoclonal antibodies recognizing nuclear matrix proteins, we have investigated by means of indirect fluorescence whether an incubation of isolated nuclei at the physiological temperature of 37 degrees C induces a redistribution of nuclear components in human K562 erythroleukemia cells. Upon incubation of isolated nuclei for 45 min at 37 degrees C, we found that two of the antibodies, directed against proteins of the inner matrix network (M(r) 125 and 160 kDa), gave a fluorescent pattern different from that observed in permeabilized cells. By contrast, the fluorescent pattern did not change if nuclei were kept at 0 degrees C. The difference was more marked in the case of the 160-kDa polypeptide. The fluorescent pattern detected by the third antibody, which recognizes the 180-kDa nucleolar isoform of DNA topoisomerase II, was unaffected by heat exposure of isolated nuclei. When isolated nuclear matrices prepared from heat-stabilized nuclei were stained by means of the same three antibodies, the distribution of the 160-kDa matrix protein no longer corresponded to that observable in permeabilized cells, whereas the fluorescent pattern given by the antibody to the 125-kDa polypeptide resembled that detectable in permeabilized cells. The 180-kDa isoform of topoisomerase II was still present in the matrix nucleolar remnants. We conclude that a 37 degrees C incubation of isolated nuclei induces a redistribution of some nuclear matrix antigens and cannot prevent the rearrangement in the spatial organization of one of these antigens that takes place during matrix isolation in human erythroleukemia cells. The practical relevance of these findings is discussed.
Abstract:
The objective of the EU-funded integrated project "ACuteTox" is to develop a strategy in which general cytotoxicity, together with organ-specific endpoints and biokinetic features, is taken into consideration in the in vitro prediction of oral acute systemic toxicity. With regard to the nervous system, the effects of 23 reference chemicals were tested with approximately 50 endpoints, using a neuronal cell line, primary neuronal cell cultures, brain slices and aggregated brain cell cultures. Comparison of the in vitro neurotoxicity data with general cytotoxicity data generated in a non-neuronal cell line, and with in vivo data such as acute human lethal blood concentration, revealed that GABA(A) receptor function, acetylcholine esterase activity, cell membrane potential, glucose uptake, total RNA expression and altered gene expression of NF-H, GFAP, MBP, HSP32 and caspase-3 were the best endpoints to use for further testing with 36 additional chemicals. The results of the second analysis showed that no single neuronal endpoint on its own markedly improved the in vitro-in vivo correlation, indicating that several specific endpoints need to be analysed and combined with biokinetic data to obtain the best correlation with in vivo acute toxicity.
Abstract:
In this paper a novel rank estimation technique for trajectory-based motion segmentation within the Local Subspace Affinity (LSA) framework is presented. This technique, called Enhanced Model Selection (EMS), is based on the relationship between the estimated rank of the trajectory matrix and the affinity matrix built by LSA. The results on synthetic and real data show that, without any a priori knowledge, EMS automatically provides an accurate and robust rank estimation, improving the accuracy of the final motion segmentation.
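The abstract does not give the EMS criterion itself; as a hedged illustration of SVD-based rank estimation for a trajectory matrix, the sketch below uses a Costeira-Kanade-style model-selection cost, with an arbitrary penalty constant kappa:

```python
import numpy as np

def estimate_rank(W, kappa=1e-3):
    """Rank estimate for a trajectory matrix W (2F x P): choose r minimizing
    sigma_{r+1}^2 / sum_{i<=r} sigma_i^2 + kappa * r."""
    s = np.linalg.svd(W, compute_uv=False)
    costs = [s[r] ** 2 / np.sum(s[:r] ** 2) + kappa * r for r in range(1, len(s))]
    return int(np.argmin(costs)) + 1
```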
Abstract:
One of the tantalising remaining problems in compositional data analysis lies in how to deal with data sets in which there are components which are essential zeros. By an essential zero we mean a component which is truly zero, not something recorded as zero simply because the experimental design or the measuring instrument has not been sufficiently sensitive to detect a trace of the part. Such essential zeros occur in many compositional situations, such as household budget patterns, time budgets, palaeontological zonation studies and ecological abundance studies. Devices such as non-zero replacement and amalgamation are almost invariably ad hoc and unsuccessful in such situations. From consideration of such examples it seems sensible to build up a model in two stages, the first determining where the zeros will occur and the second how the unit available is distributed among the non-zero parts. In this paper we suggest two such models: an independent binomial conditional logistic normal model and a hierarchical dependent binomial conditional logistic normal model. The compositional data in such modelling consist of an incidence matrix and a conditional compositional matrix. Interesting statistical problems arise, such as the estimability of parameters, the nature of the computational process for estimating both the incidence and compositional parameters given the complexity of the subcompositional structure, the formation of meaningful hypotheses, and the devising of suitable testing methodology within a lattice of such essential-zero compositional hypotheses. The methodology is illustrated by application to both simulated and real compositional data.
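As an illustration of the two-stage structure only (not the authors' estimation procedure), the sketch below simulates from an independent binomial conditional logistic normal model: Bernoulli incidence for each part, then an additive logistic normal composition on the parts that turn out to be present. Matching the leading coordinates of mu and Sigma to whichever parts are present is a simplification for the sake of the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_zero_composition(p_incidence, mu, Sigma):
    """p_incidence: length-D incidence probabilities; mu, Sigma: mean and
    covariance of the (D-1)-dimensional additive log-ratio coordinates."""
    D = len(p_incidence)
    present = rng.random(D) < p_incidence
    while not present.any():            # at least one part must be non-zero
        present = rng.random(D) < p_incidence
    k = present.sum()
    # Stage 2: logistic normal on the k present parts (k-1 alr coordinates).
    if k > 1:
        y = rng.multivariate_normal(mu[:k - 1], Sigma[:k - 1, :k - 1])
    else:
        y = np.array([])
    expy = np.append(np.exp(y), 1.0)    # last present part is the alr reference
    x = np.zeros(D)
    x[present] = expy / expy.sum()
    return x                            # a composition with essential zeros
```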
Abstract:
The MDRD (Modification of Diet in Renal Disease) equation enables glomerular filtration rate (GFR) estimation from serum creatinine only. Thus, the laboratory can report an estimated GFR (eGFR) with each serum creatinine assessment, thereby increasing the recognition of renal failure. The predictive performance of the MDRD equation is better for GFR < 60 mL/min/1.73 m². A normal or near-normal renal function is often underestimated by this equation. Overall, MDRD provides more reliable estimates of renal function than the Cockcroft-Gault (C-G) formula, but both lack precision. MDRD is not superior to C-G for drug dosing. Because it is indexed to 1.73 m², the MDRD eGFR has to be adjusted back to the patient's body surface area for drug dosing. Besides, C-G has the advantages of greater simplicity and longer-established use.
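For concreteness, the standard formulas behind the abstract are sketched below (serum creatinine in mg/dL; the 186 coefficient is the original 4-variable MDRD, 175 in the IDMS-traceable version):

```python
def mdrd_egfr(scr, age, female, black=False):
    """4-variable MDRD eGFR, mL/min/1.73 m^2."""
    egfr = 186.0 * scr ** -1.154 * age ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

def cockcroft_gault(scr, age, weight_kg, female):
    """Cockcroft-Gault creatinine clearance, mL/min (not surface-indexed)."""
    crcl = (140 - age) * weight_kg / (72.0 * scr)
    return 0.85 * crcl if female else crcl

def deindex_egfr(egfr_1_73, bsa_m2):
    """Back-adjust an MDRD eGFR to the patient's body surface area for dosing."""
    return egfr_1_73 * bsa_m2 / 1.73
```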
Abstract:
Objectives: Polychlorinated biphenyls (PCBs) are considered probable human carcinogens by the International Agency for Research on Cancer, and one congener, PCB126, has been rated as a known human carcinogen. A period-specific job exposure matrix (JEM) was developed for former PCB-exposed capacitor manufacturing workers (n=12,605) (1938-1977). Methods: A detailed exposure assessment for this plant was based on a number of exposure determinants (proximity, degree of contact with PCBs, temperature, ventilation, process control, job mobility). The intensity and frequency of PCB exposures by job for both inhalation and dermal exposures, and additional chemical exposures, were reviewed. The JEM was developed in nine steps: (1) all unique jobs (n=1,684) were assessed using (2) defined PCB exposure determinants; (3) the exposure determinants were used to develop exposure profiles; (4) similar exposure profiles were combined into categories having similar PCB exposures; (5) qualitative intensity (high-medium-low-baseline) and frequency (continuous-intermittent) ratings were developed and (6) used to qualitatively rate inhalation and dermal exposure separately for each category; (7) quantitative intensity ratings based on available air concentrations were developed for inhalation and dermal exposures, assuming equal importance of both routes of exposure; (8) adjustments were made for overall exposure; and (9) for each category the product of intensity and frequency was calculated, and exposure in the earlier era was weighted. Results: A period-specific JEM was produced, modified for two eras of stable PCB exposure conditions. Conclusions: These exposure estimates, derived from a systematic and rigorous use of the exposure determinant data, enable cumulative PCB exposure-response analyses in the epidemiological cancer mortality and incidence studies of this cohort. [Authors]
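As a hedged sketch of how such a JEM can feed cumulative exposure estimates (step 9 above): exposure per job spell is the product of intensity and frequency times duration, with an extra weight for the earlier era. The data structures and weight values below are illustrative, not the authors'.

```python
# Hypothetical era weights; the paper weights the earlier era but the abstract
# does not publish a number.
ERA_WEIGHT = {"early": 1.5, "late": 1.0}

def cumulative_exposure(work_history, jem):
    """work_history: iterable of (job_code, era, years).
    jem: maps (job_code, era) -> (intensity, frequency) ratings."""
    total = 0.0
    for job, era, years in work_history:
        intensity, frequency = jem[(job, era)]
        total += intensity * frequency * years * ERA_WEIGHT[era]
    return total

jem = {("degreaser", "early"): (3.0, 1.0), ("assembler", "late"): (1.0, 0.5)}
history = [("degreaser", "early", 4), ("assembler", "late", 10)]
print(cumulative_exposure(history, jem))   # 3*1*4*1.5 + 1*0.5*10*1.0 = 23.0
```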
Abstract:
In this paper, we propose a new paradigm to carry out the registration task with a dense deformation field derived from the optical flow model and the active contour method. The proposed framework merges different tasks such as segmentation, regularization, incorporation of prior knowledge and registration into a single framework. The active contour model is at the core of our framework, even if it is used in a different way than the standard approaches. Indeed, active contours are a well-known technique for image segmentation. This technique consists in finding the curve which minimizes an energy functional designed to be minimal when the curve has reached the object contours. That way, we get accurate and smooth segmentation results. So far, the active contour model has been used to segment objects lying in images from boundary-based, region-based or shape-based information. Our registration technique profits from all these families of active contours to determine a dense deformation field defined on the whole image. A well-suited application of our model is atlas registration in medical imaging, which consists in automatically delineating anatomical structures. We present results on 2D synthetic images to show the performance of our non-rigid deformation field based on a natural registration term. We also present registration results on real 3D medical data with a large space-occupying tumor substantially deforming surrounding structures, which constitutes a highly challenging problem.
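The abstract does not state the energy functional; for orientation, the classical active contour (snake) energy of Kass, Witkin and Terzopoulos, which the segmentation step minimizes in spirit, has the form

E(C) = \int_0^1 \left( \alpha\,|C'(s)|^2 + \beta\,|C''(s)|^2 \right) ds \;+\; \lambda \int_0^1 g\!\left( |\nabla I(C(s))| \right) ds, \qquad g(r) = \frac{1}{1 + r^2},

where the first integral enforces smoothness of the curve C and the second, with g decreasing in the image gradient, is minimal when C sits on object contours.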
Abstract:
The applicability of the protein phosphatase inhibition assay (PPIA) to the determination of okadaic acid (OA) and its acyl derivatives in shellfish samples has been investigated, using a recombinant PP2A and a commercial one. Mediterranean mussel, wedge clam, Pacific oyster and flat oyster were chosen as model species. Shellfish matrix loading limits for the PPIA have been established, according to the shellfish species and the enzyme source. A synergistic inhibitory effect was observed in the presence of OA and shellfish matrix, which was overcome by the application of a correction factor (0.48). Finally, Mediterranean mussel samples obtained from Ría de Arousa during a DSP closure associated with Dinophysis acuminata, determined as positive by the mouse bioassay, were analysed with the PPIAs. The OA equivalent contents provided by the PPIAs correlate satisfactorily with those obtained by liquid chromatography-tandem mass spectrometry (LC-MS/MS).
Abstract:
Summary: Discrete data arise in various research fields, typically when the observations are count data. I propose a robust and efficient parametric procedure for the estimation of discrete distributions. The estimation is done in two phases. First, a very robust, but possibly inefficient, estimate of the model parameters is computed and used to identify outliers. Then the outliers are either removed from the sample or given low weights, and a weighted maximum likelihood estimate (WML) is computed. The weights are determined via an adaptive process such that if the data follow the model, then asymptotically no observation is downweighted. I prove that the final estimator inherits the breakdown point of the initial one, and that its influence function at the model is the same as the influence function of the maximum likelihood estimator, which strongly suggests that it is asymptotically fully efficient. The initial estimator is a minimum disparity estimator (MDE). MDEs can be shown to have full asymptotic efficiency, and some MDEs have very high breakdown points and very low bias under contamination. Several initial estimators are considered, and the performances of the WMLs based on each of them are studied. It turns out that in a great variety of situations the WML substantially improves on the initial estimator, both in terms of finite-sample mean squared error and in terms of bias under contamination. Besides, the performance of the WML is rather stable under a change of the MDE, even if the MDEs have very different behaviours. Two examples of application of the WML to real data are considered. In both of them, the necessity for a robust estimator is clear: the maximum likelihood estimator is badly corrupted by the presence of a few outliers. This procedure is particularly natural in the discrete distribution setting, but could be extended to the continuous case, for which a possible procedure is sketched.
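As a much-simplified, hedged Python sketch of the two-phase idea for a Poisson model: a median-based initial fit stands in for the minimum disparity estimator, and fixed Huber-type weights on Pearson residuals stand in for the adaptive weighting (the thesis's weights are adaptive, so that asymptotically no observation is downweighted under the model; this sketch does not reproduce that property).

```python
import numpy as np

def weighted_ml_poisson(x, c=2.5, tol=1e-8, max_iter=50):
    """Phase 1: crude robust initial estimate; phase 2: iterated weighted ML
    with Huber-type downweighting of large Pearson residuals."""
    x = np.asarray(x, dtype=float)
    lam = max(np.median(x), 0.1)      # robust, possibly inefficient start
    for _ in range(max_iter):
        r = np.abs(x - lam) / np.sqrt(lam)     # Pearson residuals
        w = c / np.maximum(r, c)               # 1 if r <= c, else c / r
        lam_new = np.sum(w * x) / np.sum(w)    # weighted ML for the Poisson mean
        if abs(lam_new - lam) < tol:
            break
        lam = lam_new
    return lam

rng = np.random.default_rng(1)
x = np.concatenate([rng.poisson(3.0, size=200), [40, 55]])  # two gross outliers
print(weighted_ml_poisson(x))   # stays near 3, while x.mean() is pulled upward
```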