970 results for mean-variance estimation


Relevance: 20.00%

Abstract:

A method to estimate an extreme quantile that requires no distributional assumptions is presented. The approach is based on transformed kernel estimation of the cumulative distribution function (cdf) and consists of a double-transformation kernel estimation. We derive optimal bandwidth selection methods that yield a direct expression for the smoothing parameter, and the bandwidth can adapt to the given quantile level. The procedure is useful for large data sets and improves quantile estimation for heavy-tailed distributions compared to other methods. Implementation is straightforward and R programs are available.
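A minimal sketch of the transformed-kernel idea in Python, using a single log transform and a rule-of-thumb bandwidth instead of the paper's double transformation and direct plug-in bandwidth expressions; the function name and test data are illustrative only:

```python
import numpy as np
from scipy import stats

def transformed_kernel_quantile(x, p, bandwidth=None):
    """Estimate the p-quantile of heavy-tailed data by kernel smoothing
    of the cdf after a tail-shortening transform."""
    y = np.log(x)  # single transform to a lighter-tailed scale
    n = len(y)
    # Rule-of-thumb cdf bandwidth (an assumption; the paper derives
    # direct plug-in expressions that adapt to the quantile level).
    h = bandwidth if bandwidth is not None else y.std() * n ** (-1 / 3)
    grid = np.linspace(y.min() - 4 * h, y.max() + 4 * h, 1001)
    # Kernel cdf estimate: average of Gaussian cdfs centred at the data.
    F = stats.norm.cdf((grid[:, None] - y[None, :]) / h).mean(axis=1)
    y_p = np.interp(p, F, grid)  # invert the smoothed cdf at level p
    return np.exp(y_p)           # back-transform to the original scale

x = stats.pareto.rvs(2.5, size=2000, random_state=1)  # heavy-tailed sample
# For comparison, the true 0.999 quantile is (0.001)**(-1/2.5), about 15.8.
print(transformed_kernel_quantile(x, 0.999))
```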

Relevance: 20.00%

Abstract:

BACKGROUND The demographic structure has a significant influence on the use of healthcare services, as does the size of the population denominators. Very few studies have been published on methods for estimating the real population of areas such as tourist resorts, and this gap extends to the behaviour of population denominators (the floating population, or tourist load) and their effect on the use of healthcare services. The objectives of the study were: a) to determine the Municipal Solid Waste (MSW) ratio, per person per day, among populations of known size; b) to estimate, by means of this ratio, the real population in an area where tourist numbers are very significant; and c) to determine the impact of the registered population, in comparison to the non-resident population, on the utilisation of hospital emergency healthcare services in two areas where tourist numbers are very significant. METHODS An ecological study design was employed. We analysed the healthcare districts of the Costa del Sol and the island of Menorca, both Spanish territories in the Mediterranean region. RESULTS In the two areas analysed, the correlation coefficient between the MSW ratio and admissions to hospital emergency departments exceeded 0.9 (p < 0.001). On the basis of MSW generation ratios obtained for a control zone, and also measured in neighbouring countries, we estimated the real population. For the summer months, when tourist activity is greatest and demand for hospital emergency healthcare is highest, this value was found to be double the registered population. CONCLUSION The MSW indicator, which is both ecological and indirect, can be used to estimate the real population in areas where population levels vary significantly during the year. This parameter is of interest in planning and dimensioning the provision of healthcare services.
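The core arithmetic behind objective b) is a one-line calculation; the sketch below uses invented figures, not the study's data:

```python
# Minimal sketch of the paper's core arithmetic: the real population is
# the waste collected per day divided by a per-capita waste-generation
# ratio measured in a control zone of known population.
# All figures below are illustrative, not the study's data.

msw_ratio_kg_per_person_day = 1.3     # hypothetical control-zone ratio
msw_collected_kg_per_day = 390_000    # hypothetical August collection
registered_population = 150_000

real_population = msw_collected_kg_per_day / msw_ratio_kg_per_person_day
tourist_load = real_population - registered_population

print(f"estimated real population: {real_population:,.0f}")
print(f"floating population (tourist load): {tourist_load:,.0f}")
```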

Relevance: 20.00%

Abstract:

Traditional mosquito control strategies rely heavily on chemical insecticides. However, concerns about the efficiency of traditional control methods, environmental impact and emerging pesticide resistance have highlighted the need to develop innovative tools for mosquito control. Some novel strategies, including the release of insects carrying a dominant lethal gene (RIDL®), rely on the sustained release of modified male mosquitoes and therefore benefit from a thorough understanding of the biology of the male of the species. In this report we present the results of a mark-release-recapture study aimed at: (i) establishing the field survival of laboratory-reared, wild-type male Aedes aegypti; and (ii) estimating the size of the local adult Ae. aegypti population. The study took place in Panama, a country where recent increases in the incidence and severity of dengue cases have prompted health authorities to evaluate alternative strategies for vector control. Results suggest a life expectancy of 2.3 days (confidence interval: 1.78-2.86) for released male mosquitoes. Overall, the male mosquito population was estimated at 58 males/ha (range 12-81 males/ha), which can be extrapolated to an average of 0.64 pupae/person for the study area. The practical implications of these results are discussed.
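The abstract does not spell out the estimators used, so the sketch below illustrates two standard mark-release-recapture computations with invented numbers: an exponential-decay fit of recaptures for survival, and a Lincoln-index-style ratio for population size:

```python
import numpy as np

# (i) Daily survival from the exponential decay of recaptures over days.
days = np.array([1, 2, 3, 4, 5, 6])
recaptures = np.array([40, 26, 17, 11, 7, 5])        # hypothetical counts
slope, intercept = np.polyfit(days, np.log(recaptures), 1)
daily_survival = np.exp(slope)
life_expectancy = -1.0 / np.log(daily_survival)       # in days
print(f"daily survival ~ {daily_survival:.2f}, "
      f"life expectancy ~ {life_expectancy:.1f} days")

# (ii) Population size via a Lincoln-index-style ratio: wild males ~
# marked released * (unmarked captured / marked recaptured).
released, captured, recaptured_total = 5000, 320, 106  # hypothetical
wild_males = released * (captured - recaptured_total) / recaptured_total
print(f"estimated wild male population ~ {wild_males:,.0f}")
```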

Relevance: 20.00%

Abstract:

In this paper we propose a parsimonious regime-switching approach to model the correlations between assets: the threshold conditional correlation (TCC) model. This method allows the dynamics of the correlations to change from one state (or regime) to another as a function of observable transition variables. Our model is similar in spirit to Silvennoinen and Teräsvirta (2009) and Pelletier (2006), but with the appealing feature that it does not suffer from the curse of dimensionality. In particular, estimation of the parameters of the TCC involves a simple grid search procedure. In addition, it is easy to guarantee a positive definite correlation matrix, because the TCC estimator is given by the sample correlation matrix, which is positive definite by construction. The methodology is illustrated by evaluating the behaviour of international equities, government bonds and major exchange rates, first separately and then jointly. We also test and allow for different parts of the correlation matrix to be governed by different transition variables; for this, we estimate a multi-threshold TCC specification. Further, we evaluate the economic performance of the TCC model against a constant conditional correlation (CCC) estimator using a Diebold-Mariano type test. We conclude that threshold correlation modelling gives rise to a significant reduction in portfolio variance.
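A minimal sketch of the grid-search estimation step described above, assuming the returns have already been reduced to standardized residuals (the abstract does not detail that step) and using a Gaussian likelihood to score each candidate threshold:

```python
import numpy as np

def tcc_fit(std_resid, z, n_grid=50, min_frac=0.15):
    """Sketch of a threshold conditional correlation (TCC) fit: grid-search
    a threshold c on the transition variable z, estimating one sample
    correlation matrix per regime (positive definite by construction).
    std_resid is assumed to hold standardized residuals, e.g. from
    univariate GARCH fits."""
    candidates = np.quantile(z, np.linspace(min_frac, 1 - min_frac, n_grid))
    best_ll, best_c = -np.inf, None
    for c in candidates:
        ll = 0.0
        for seg in (std_resid[z <= c], std_resid[z > c]):
            R = np.corrcoef(seg, rowvar=False)
            _, logdet = np.linalg.slogdet(R)
            Rinv = np.linalg.inv(R)
            # Gaussian log-likelihood of the correlation model (constants dropped).
            ll -= 0.5 * (len(seg) * logdet
                         + np.einsum('ij,jk,ik->', seg, Rinv, seg))
        if ll > best_ll:
            best_ll, best_c = ll, c
    low, high = std_resid[z <= best_c], std_resid[z > best_c]
    return best_c, np.corrcoef(low, rowvar=False), np.corrcoef(high, rowvar=False)

rng = np.random.default_rng(0)
ret = rng.standard_normal((1000, 3))   # placeholder standardized residuals
z = rng.standard_normal(1000)          # observable transition variable
c, R_low, R_high = tcc_fit(ret, z)
```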

Relevance: 20.00%

Abstract:

BACKGROUND Compared to food patterns, nutrient patterns have rarely been used, particularly at the international level. In the context of a multi-center study with heterogeneous data, we studied the methodological challenges of pattern analyses. METHODOLOGY/PRINCIPAL FINDINGS We identified nutrient patterns from food frequency questionnaires (FFQ) in the European Prospective Investigation into Cancer and Nutrition (EPIC) study and used 24-hour dietary recall (24-HDR) data to validate and describe the nutrient patterns and their related food sources. Associations between lifestyle factors and the nutrient patterns were also examined. Principal component analysis (PCA) was applied to 23 nutrients derived from country-specific FFQ, combining data from all EPIC centers (N = 477,312). Harmonized 24-HDRs, available for a representative sample of the EPIC populations (N = 34,436), provided accurate mean group estimates of nutrients and foods by quintiles of pattern scores, presented graphically. An overall PCA combining all data captured a good proportion of the variance explained in each EPIC center. Four nutrient patterns were identified, explaining 67% of the total variance: principal component (PC) 1 was characterized by a high contribution of nutrients from plant food sources and a low contribution of nutrients from animal food sources; PC2 by a high contribution of micronutrients and proteins; PC3 by polyunsaturated fatty acids and vitamin D; and PC4 by calcium, proteins, riboflavin, and phosphorus. The nutrients with high loadings on a particular pattern, as derived from country-specific FFQ, also showed high deviations in their mean EPIC intakes by quintiles of pattern scores when estimated from 24-HDR. Center and energy intake explained most of the variability in pattern scores. CONCLUSION/SIGNIFICANCE The use of 24-HDR enabled internal validation and facilitated the interpretation of the nutrient patterns derived from FFQs in terms of food sources. These outcomes open research opportunities and perspectives for using nutrient patterns in future studies, particularly at the international level.
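A hedged sketch of the pattern-extraction step: PCA on a subjects-by-nutrients matrix, keeping four components as in the paper. The random data and variable names are placeholders:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
nutrients = rng.gamma(2.0, 1.0, size=(1000, 23))   # stand-in for FFQ intakes

X = StandardScaler().fit_transform(nutrients)      # standardize nutrients
pca = PCA(n_components=4).fit(X)
scores = pca.transform(X)                          # per-subject pattern scores

print("variance explained:", pca.explained_variance_ratio_.sum())
# Nutrients with the largest absolute loadings characterize each pattern.
for k, load in enumerate(pca.components_, start=1):
    top = np.argsort(np.abs(load))[::-1][:4]
    print(f"PC{k} driven by nutrient columns {top.tolist()}")
```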

Relevance: 20.00%

Abstract:

OBJECTIVE: Gaining postpyloric access in ventilated, sedated ICU patients usually requires time-consuming procedures such as endoscopy. Recently, a feeding tube has been introduced that migrates spontaneously into the jejunum in surgical patients. The study aimed at assessing the rate of migration of this tube in critically ill patients. DESIGN: Prospective descriptive trial. SETTING: Surgical ICU in a tertiary university hospital. PATIENTS: One hundred and five consecutive surgical ICU patients requiring enteral feeding were enrolled, resulting in 128 feeding-tube placement attempts. METHODS: A self-propelled tube was used and followed up for 3 days; progression was assessed by daily X-ray after contrast injection. Severity of illness was assessed with the SAPS II score and organ failure with the SOFA score. RESULTS: The patients were aged 55+/-19 years (mean+/-SD), with a SAPS II score of 45+/-18. Of the 128 tube placement attempts, 12 tubes could not be placed in the stomach, and eight were accidentally pulled out while in gastric position because fixation had to be avoided during the progression phase. Among organ failures, respiratory failure predominated, followed by cardiovascular failure. By day 3, the postpyloric progression rate was 63/128 tubes (49%). There was no association between migration and age or SAPS II score, but the progression rate was significantly poorer in patients with hemodynamic failure. Use of norepinephrine and morphine was negatively associated with tube progression (P<0.001), while abdominal surgery was not. In ten patients, jejunal tubes were placed by endoscopy. CONCLUSION: Self-propelled feeding tubes progressed from the stomach to the postpyloric position in 49% of patients, reducing the number of endoscopic placements; these tubes may facilitate enteral nutrient delivery in the ICU.

Relevance: 20.00%

Abstract:

In epidemiologic studies, measurement error in dietary variables often attenuates the association between dietary intake and disease occurrence. To adjust for the attenuation caused by error in dietary intake, regression calibration is commonly used, which requires unbiased reference measurements. Short-term reference measurements for foods that are not consumed daily contain excess zeroes that pose challenges in the calibration model. We adapted the two-part regression calibration model, initially developed for multiple replicates of reference measurements per individual, to a single-replicate setting. We showed how to handle excess zero reference measurements with a two-step modeling approach, how to explore heteroscedasticity in the consumed amount with a variance-mean graph, how to explore nonlinearity with generalized additive modeling (GAM) and empirical logit approaches, and how to select covariates for the calibration model. The performance of the two-part calibration model was compared with its one-part counterpart. We used vegetable intake and mortality data from the European Prospective Investigation into Cancer and Nutrition (EPIC) study, in which reference measurements were taken with 24-hour recalls. For each of the three vegetable subgroups assessed separately, correcting for error with an appropriately specified two-part calibration model resulted in an approximately threefold increase in the strength of the association with all-cause mortality, as measured by the log hazard ratio. We further found that the standard way of including covariates in the calibration model can lead to overfitting the two-part calibration model, and that the extent of adjustment for error is influenced by the number and forms of covariates in the calibration model. For episodically consumed foods, we advise researchers to pay special attention to the response distribution, nonlinearity, and covariate inclusion when specifying the calibration model.
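A minimal sketch of a two-part calibration model in Python, assuming a logistic first part, a log-linear second part and a lognormal back-transform correction; the covariates, distributions and helper name are illustrative, not the paper's specification:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression, LinearRegression

def two_part_calibration(ffq, ref24h, covariates):
    """Two-part calibration sketch for an episodically consumed food:
    part 1 models the probability that the 24-HDR reference is non-zero;
    part 2 models the log amount among consumers. The calibrated intake
    is their product."""
    X = np.column_stack([ffq, covariates])
    consumed = ref24h > 0

    part1 = LogisticRegression().fit(X, consumed)
    p_consume = part1.predict_proba(X)[:, 1]

    part2 = LinearRegression().fit(X[consumed], np.log(ref24h[consumed]))
    resid_var = np.var(np.log(ref24h[consumed]) - part2.predict(X[consumed]))
    # Lognormal back-transform correction (an assumed simplification).
    amount = np.exp(part2.predict(X) + 0.5 * resid_var)

    return p_consume * amount  # calibrated usual intake

rng = np.random.default_rng(0)
n = 2000
ffq = rng.gamma(2.0, 50.0, n)                      # FFQ vegetable intake
covs = rng.normal(size=(n, 2))                     # e.g. age, BMI (invented)
consume = rng.random(n) < 0.4                      # episodic consumption
ref = np.where(consume, rng.lognormal(np.log(ffq + 1) / 2, 0.8), 0.0)
usual = two_part_calibration(ffq, ref, covs)
```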

Relevance: 20.00%

Abstract:

This paper focuses on the problem of locating single-phase faults in mixed distribution electric systems, with overhead lines and underground cables, using voltage and current measurements at the sending-end and a sequence model of the network. Since calculating the series impedance of underground cables is not as simple as for overhead lines, the paper proposes a methodology for estimating the zero-sequence impedance of underground cables from previous single-phase faults that occurred in the system, in which an electric arc appeared at the fault location. Because of the arc, the signal is first pretreated to eliminate its voltage peaks, so that the analysis can work with a signal as close to a sine wave as possible.
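As a hedged sketch of one raw ingredient of such an estimation, the snippet below extracts the zero-sequence voltage/current ratio from sending-end phase phasors via the Fortescue transform; the paper's full methodology additionally handles the arc-distorted waveform, and all numbers here are invented:

```python
import numpy as np

a = np.exp(2j * np.pi / 3)
# Fortescue transform: phase phasors -> (zero, positive, negative) sequence.
A_inv = np.array([[1, 1, 1],
                  [1, a, a**2],
                  [1, a**2, a]]) / 3

def zero_sequence_ratio(v_abc, i_abc):
    """Return V0/I0 from phase voltage/current phasors recorded during a
    single-phase fault -- one raw ingredient for estimating the
    zero-sequence impedance, not the paper's complete procedure."""
    v0 = (A_inv @ v_abc)[0]
    i0 = (A_inv @ i_abc)[0]
    return v0 / i0

v_abc = np.array([0.35 + 0.02j, -0.55 - 0.83j, -0.52 + 0.86j])  # p.u., invented
i_abc = np.array([2.10 - 0.90j, 0.10 + 0.05j, 0.08 - 0.04j])
print(zero_sequence_ratio(v_abc, i_abc))
```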

Relevance: 20.00%

Abstract:

In a seminal paper, Aitchison and Lauder (1985) introduced classical kernel density estimation techniques in the context of compositional data analysis. Indeed, they gave two options for the choice of the kernel to be used in the kernel estimator. One of these kernels is based on the use of the alr transformation on the simplex S^D jointly with the normal distribution on R^(D-1). However, these authors themselves recognized that this method has some deficiencies. A method for overcoming these difficulties, based on recent developments in compositional data analysis and multivariate kernel estimation theory, combining the ilr transformation with the use of the normal density with a full bandwidth matrix, was recently proposed in Martín-Fernández, Chacón and Mateu-Figueras (2006). Here we present an extensive simulation study that compares both methods in practice, thus exploring the finite-sample behaviour of both estimators.
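A minimal sketch of the ilr-based estimator in Python: transform the compositions to R^(D-1) and apply a normal kernel there. scipy's gaussian_kde (Scott's-rule full covariance bandwidth) stands in for the bandwidth matrices studied in the cited work, and the density is reported in ilr coordinates (no Jacobian back to the simplex):

```python
import numpy as np
from scipy.stats import gaussian_kde

def ilr(x):
    """Isometric log-ratio transform of compositions (rows of x in S^D),
    using the default sequential binary partition basis."""
    D = x.shape[1]
    logx = np.log(x)
    return np.column_stack([
        np.sqrt(j / (j + 1)) * (logx[:, :j].mean(axis=1) - logx[:, j])
        for j in range(1, D)
    ])

rng = np.random.default_rng(0)
comp = rng.dirichlet([4, 2, 3], size=500)        # simulated compositions in S^3
kde = gaussian_kde(ilr(comp).T)                  # normal kernel, full bandwidth
print(kde(ilr(np.array([[0.5, 0.2, 0.3]])).T))   # density in ilr coordinates
```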

Relevance: 20.00%

Abstract:

The quantitative estimation of sea surface temperatures (SST) from fossil assemblages is a fundamental issue in palaeoclimatic and palaeoceanographic investigations. The Modern Analogue Technique, a widely adopted method based on direct comparison of fossil assemblages with modern coretop samples, was revised with the aim of conforming it to compositional data analysis. The new CODAMAT method was developed by adopting the Aitchison metric as the distance measure. Modern coretop datasets are characterised by a large number of zeros; zero replacement was carried out with a Bayesian approach based on posterior estimation of the parameter of the multinomial distribution. The number of modern analogues from which to reconstruct the SST was determined by a multiple-criteria approach considering the proxies correlation matrix, the standardized residual sum of squares and the mean squared distance. The new CODAMAT method was applied to the planktonic foraminiferal assemblages of a core recovered in the Tyrrhenian Sea. Key words: modern analogues, Aitchison distance, proxies correlation matrix, standardized residual sum of squares
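A hedged sketch of the analogue-matching step with the Aitchison metric; the choice of k and the data are placeholders, and the paper's multiple-criteria selection of the number of analogues is not reproduced:

```python
import numpy as np

def aitchison_distance(x, y):
    """Aitchison distance between two compositions (no zeros allowed;
    the paper applies a Bayesian zero replacement first)."""
    lx, ly = np.log(x), np.log(y)
    d = (lx - lx.mean()) - (ly - ly.mean())   # difference of clr transforms
    return np.linalg.norm(d)

def codamat_sst(fossil, modern_assemblages, modern_sst, k=10):
    """Average the SSTs of the k modern coretop samples closest to the
    fossil assemblage in Aitchison distance."""
    dists = np.array([aitchison_distance(fossil, m) for m in modern_assemblages])
    nearest = np.argsort(dists)[:k]
    return modern_sst[nearest].mean()

rng = np.random.default_rng(0)
modern = rng.dirichlet(np.ones(5) * 2, size=300)   # coretop assemblages
sst = rng.uniform(5, 28, size=300)                 # their observed SSTs
fossil = rng.dirichlet(np.ones(5) * 2)
print(codamat_sst(fossil, modern, sst))
```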

Relevance: 20.00%

Abstract:

We examined drivers of article citations using 776 articles published from 1990 to 2012 in a broad-based and high-impact social sciences journal, The Leadership Quarterly. These articles had 1,191 unique authors who, at the time of their most recent article in our dataset, had published a total of 16,817 articles and received 284,777 citations. Our models explained 66.6% of the variance in citations and showed that quantitative, review, method, and theory articles were cited significantly more than qualitative articles or agent-based simulations. For quantitative articles, which constituted the majority of the sample, our model explained 80.3% of the variance in citations; some methods (e.g., use of SEM) and designs (e.g., meta-analysis), as well as theoretical approaches (e.g., use of transformational, charismatic, or visionary-type leadership theories), predicted higher article citations. Regarding the statistical conclusion validity of quantitative articles, articles with endogeneity threats received significantly fewer citations than did those using a more robust design or an estimation procedure that ensured correct causal estimation. We make several general recommendations on how to improve research practice and article citations.
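For illustration, a regression of this kind can be sketched as an OLS fit of citations on article characteristics; the features, coefficients and data below are invented placeholders, not the paper's specification:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 776
X = np.column_stack([
    rng.integers(0, 2, n),    # quantitative article? (placeholder)
    rng.integers(0, 2, n),    # uses SEM? (placeholder)
    rng.integers(0, 2, n),    # endogeneity threat present? (placeholder)
])
# Simulated log citations with invented effect sizes.
y = 1.0 + 0.8 * X[:, 0] + 0.5 * X[:, 1] - 0.6 * X[:, 2] + rng.normal(0, 1, n)

model = sm.OLS(y, sm.add_constant(X)).fit()
print(model.rsquared, model.params)
```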

Relevance: 20.00%

Abstract:

The objective of the EU-funded integrated project "ACuteTox" is to develop a strategy in which general cytotoxicity, together with organ-specific endpoints and biokinetic features, is taken into consideration in the in vitro prediction of oral acute systemic toxicity. With regard to the nervous system, the effects of 23 reference chemicals were tested with approximately 50 endpoints, using a neuronal cell line, primary neuronal cell cultures, brain slices and aggregated brain cell cultures. Comparison of the in vitro neurotoxicity data with general cytotoxicity data generated in a non-neuronal cell line, and with in vivo data such as acute human lethal blood concentration, revealed that GABA(A) receptor function, acetylcholinesterase activity, cell membrane potential, glucose uptake, total RNA expression and altered gene expression of NF-H, GFAP, MBP, HSP32 and caspase-3 were the best endpoints to use for further testing with 36 additional chemicals. The results of the second analysis showed that no single neuronal endpoint could give a perfect improvement in the in vitro-in vivo correlation, indicating that several specific endpoints need to be analysed and combined with biokinetic data to obtain the best correlation with in vivo acute toxicity.

Relevance: 20.00%

Abstract:

A novel technique for estimating the rank of the trajectory matrix in the local subspace affinity (LSA) motion segmentation framework is presented. The new rank estimation is based on the relationship between the estimated rank of the trajectory matrix and the affinity matrix built by LSA. The result is an enhanced model selection technique for trajectory matrix rank estimation that makes it possible to automate LSA, without requiring any a priori knowledge, and to improve the final segmentation.
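As a point of reference, the generic model-selection rank estimate that LSA-style pipelines commonly apply to the trajectory matrix (not the paper's affinity-based refinement) can be sketched as:

```python
import numpy as np

def estimate_rank(W, kappa=1e-3):
    """Model-selection rank estimate for a trajectory matrix W (2F x P):
    pick the rank r minimizing a trade-off between the discarded singular
    energy and model complexity."""
    s = np.linalg.svd(W, compute_uv=False)
    energy = s ** 2 / (s ** 2).sum()
    costs = [energy[r:].sum() + kappa * r for r in range(1, len(s))]
    return int(np.argmin(costs)) + 1

rng = np.random.default_rng(0)
W = rng.standard_normal((60, 4)) @ rng.standard_normal((4, 100))  # rank-4 data
W += 0.01 * rng.standard_normal((60, 100))                         # noise
print(estimate_rank(W))   # expect 4
```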

Relevance: 20.00%

Abstract:

One of the tantalising remaining problems in compositional data analysis lies in how to deal with data sets in which some components are essential zeros. By an essential zero we mean a component which is truly zero, not something recorded as zero simply because the experimental design or the measuring instrument has not been sufficiently sensitive to detect a trace of the part. Such essential zeros occur in many compositional situations, such as household budget patterns, time budgets, palaeontological zonation studies and ecological abundance studies. Devices such as non-zero replacement and amalgamation are almost invariably ad hoc and unsuccessful in such situations. From consideration of such examples it seems sensible to build up a model in two stages, the first determining where the zeros will occur and the second determining how the available unit is distributed among the non-zero parts. In this paper we suggest two such models: an independent binomial conditional logistic normal model and a hierarchical dependent binomial conditional logistic normal model. The compositional data in such modelling consist of an incidence matrix and a conditional compositional matrix. Interesting statistical problems arise, such as the estimability of parameters, the nature of the computational process for estimating both the incidence and compositional parameters given the complexity of the subcompositional structure, the formation of meaningful hypotheses, and the devising of suitable testing methodology within a lattice of such essential-zero compositional hypotheses. The methodology is illustrated by application to both simulated and real compositional data.
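A minimal generative sketch of the two-stage idea, in the spirit of the independent binomial variant: stage 1 draws the incidence pattern, stage 2 distributes the unit among the non-zero parts via a logistic-normal draw. All parameters are illustrative placeholders:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_essential_zeros(n, D, incidence_p, mu, cov):
    """Two-stage sketch: stage 1 draws which parts are non-zero
    (independent binomials); stage 2 fills the non-zero subcomposition
    with an additive-logistic-normal draw."""
    data = np.zeros((n, D))
    for i in range(n):
        # Stage 1: which parts occur (resample until at least one occurs).
        z = rng.binomial(1, incidence_p)
        while z.sum() == 0:
            z = rng.binomial(1, incidence_p)
        idx = np.flatnonzero(z)
        k = len(idx)
        if k == 1:
            data[i, idx] = 1.0
            continue
        # Stage 2: logistic-normal composition on the non-zero parts.
        y = rng.multivariate_normal(mu[:k - 1], cov[:k - 1, :k - 1])
        expy = np.exp(np.append(y, 0.0))       # alr back-transform
        data[i, idx] = expy / expy.sum()
    return data

comp = simulate_essential_zeros(
    200, 4, incidence_p=np.array([0.9, 0.7, 0.5, 0.8]),
    mu=np.zeros(3), cov=np.eye(3))
print(comp[:3].round(3))
```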