963 results for calibration estimation


Relevance: 20.00%

Abstract:

BACKGROUND The demographic structure has a significant influence on the use of healthcare services, as does the size of the population denominators. Very few studies have been published on methods for estimating the real population of areas such as tourist resorts, so little is known about the behaviour of population denominators (the floating population, or tourist load) and their effect on the use of healthcare services. The objectives of the study were: a) to determine the Municipal Solid Waste (MSW) ratio, per person per day, among populations of known size; b) to estimate, by means of this ratio, the real population in an area where tourist numbers are very significant; and c) to compare the impact of the registered and the non-resident populations on the utilisation of hospital emergency healthcare services in two areas where tourist numbers are very significant. METHODS An ecological study design was employed. We analysed the Healthcare Districts of the Costa del Sol and the island of Menorca, both Spanish territories in the Mediterranean region. RESULTS In the two areas analysed, the correlation coefficient between the MSW ratio and admissions to hospital emergency departments exceeded 0.9, with p < 0.001. On the basis of MSW generation ratios obtained for a control zone, and also measured in neighbouring countries, we estimated the real population. For the summer months, when tourist activity is greatest and demand for emergency hospital healthcare is highest, this value was found to be double that of the registered population. CONCLUSION The MSW indicator, which is both ecological and indirect, can be used to estimate the real population in areas where population levels vary significantly during the year. This parameter is of interest for planning and dimensioning the provision of healthcare services.
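
At the heart of the method is a simple ratio estimate: the real (de facto) population equals the area's daily MSW output divided by the per-capita generation ratio measured among populations of known size. A minimal sketch in Python, with invented figures (the abstract does not report the ratio or tonnages actually used):

```python
# Hypothetical sketch of the ratio estimate described in the abstract:
# real population ~= daily MSW tonnage / per-capita MSW ratio from a control zone.
def estimate_real_population(msw_tonnes_per_day: float,
                             kg_per_person_per_day: float) -> float:
    """Estimated real (de facto) population of an area."""
    return msw_tonnes_per_day * 1000.0 / kg_per_person_per_day

# Invented numbers: 450 t/day of MSW and a control-zone ratio of
# 1.5 kg/person/day imply roughly 300,000 people present.
print(round(estimate_real_population(450.0, 1.5)))
```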

Relevance: 20.00%

Abstract:

Traditional mosquito control strategies rely heavily on the use of chemical insecticides. However, concerns about the efficiency of traditional control methods, environmental impact and emerging pesticide resistance have highlighted the need to develop innovative tools for mosquito control. Some novel strategies, including the release of insects carrying a dominant lethal gene (RIDL®), rely on the sustained release of modified male mosquitoes and therefore benefit from a thorough understanding of the biology of the male of the species. In this report we present the results of a mark-release-recapture study aimed at: (i) establishing the field survival of laboratory-reared, wild-type male Aedes aegypti; and (ii) estimating the size of the local adult Ae. aegypti population. The study took place in Panama, a country where recent increases in the incidence and severity of dengue cases have prompted health authorities to evaluate alternative strategies for vector control. Results suggest a life expectancy of 2.3 days for released male mosquitoes (confidence interval: 1.78-2.86). Overall, the male mosquito population was estimated at 58 males/ha (range 12-81 males/ha), which can be extrapolated to an average of 0.64 pupae/person for the study area. The practical implications of these results are discussed.
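
The abstract does not spell out the estimators used; two standard quantities in a mark-release-recapture study of this kind are the Lincoln index for population size and the life expectancy implied by a constant daily survival probability. A hedged sketch with invented numbers:

```python
import math

# Hedged sketch: two standard mark-release-recapture quantities. The study's
# actual estimators are not given in the abstract; these are common choices.

def lincoln_index(marked_released: int, captured: int, recaptured: int) -> float:
    """Closed-population size estimate N = M * C / R."""
    return marked_released * captured / recaptured

def life_expectancy_days(daily_survival: float) -> float:
    """Average life expectancy 1 / (-ln p) under constant daily survival p."""
    return 1.0 / (-math.log(daily_survival))

# Invented numbers: releasing 1000 marked males and recapturing 25 of them
# among 200 caught males gives N ~= 8000; a daily survival of 0.65
# corresponds to a life expectancy of about 2.3 days, as in the abstract.
print(lincoln_index(1000, 200, 25))
print(round(life_expectancy_days(0.65), 1))
```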

Relevance: 20.00%

Abstract:

In epidemiologic studies, measurement error in dietary variables often attenuates the association between dietary intake and disease occurrence. To adjust for the attenuation caused by error in dietary intake, regression calibration is commonly used. To apply regression calibration, unbiased reference measurements are required. Short-term reference measurements for foods that are not consumed daily contain excess zeroes that pose challenges in the calibration model. We adapted the two-part regression calibration model, initially developed for multiple replicates of reference measurements per individual, to a single-replicate setting. We showed how to handle excess zero reference measurements by a two-step modeling approach, how to explore heteroscedasticity in the consumed amount with a variance-mean graph, how to explore nonlinearity with generalized additive modeling (GAM) and empirical logit approaches, and how to select covariates for the calibration model. The performance of the two-part calibration model was compared with its one-part counterpart, using vegetable intake and mortality data from the European Prospective Investigation on Cancer and Nutrition (EPIC) study, in which reference measurements were taken with 24-hour recalls. For each of the three vegetable subgroups assessed separately, correcting for error with an appropriately specified two-part calibration model resulted in an approximately threefold increase in the strength of association with all-cause mortality, as measured by the log hazard ratio. We further found that the standard way of including covariates in the calibration model can lead to overfitting the two-part calibration model, and that the extent of the error adjustment is influenced by the number and form of the covariates in the calibration model. For episodically consumed foods, we advise researchers to pay special attention to the response distribution, nonlinearity, and covariate inclusion when specifying the calibration model.
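
To make the two-part idea concrete, here is a hedged sketch on simulated data: part one models the probability that the short-term reference instrument records any consumption, part two models the amount among consumers, and the calibrated intake is the product of the two predictions. Variable names and the data generation are illustrative and not taken from EPIC:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression, LinearRegression

# Hedged sketch of a two-part calibration for an episodically consumed food.
rng = np.random.default_rng(0)
n = 2000
ffq = rng.gamma(2.0, 50.0, n)                        # self-reported intake (g/day)
p_consume = 1 / (1 + np.exp(-(-1.0 + 0.01 * ffq)))   # simulated consumption probability
consumed = rng.random(n) < p_consume
recall = np.where(consumed, 20 + 0.5 * ffq + rng.normal(0, 15, n), 0.0)
recall = np.clip(recall, 0, None)                    # excess zeros on non-consumption days

X = ffq.reshape(-1, 1)
part1 = LogisticRegression().fit(X, recall > 0)                     # P(intake > 0 | FFQ)
part2 = LinearRegression().fit(X[recall > 0], recall[recall > 0])   # E[intake | intake > 0, FFQ]

# Calibrated (predicted usual) intake = probability part * amount part.
calibrated = part1.predict_proba(X)[:, 1] * part2.predict(X)
print(calibrated[:5].round(1))
```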

Relevance: 20.00%

Abstract:

This paper focuses on the problem of locating single-phase faults in mixed distribution electric systems, with overhead lines and underground cables, using voltage and current measurements at the sending end and the sequence model of the network. Since calculating the series impedance of underground cables is not as simple as in the case of overhead lines, the paper proposes a methodology for estimating the zero-sequence impedance of underground cables from previous single-phase faults that occurred in the system, in which an electric arc was present at the fault location. For this reason, the signal is first pre-processed to eliminate its voltage peaks, so that the analysis can be carried out on a signal as close to a sine wave as possible.
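
As a hedged illustration of the kind of quantity involved (not the paper's full methodology), the zero-sequence voltage and current follow from the symmetrical-components transform of the sending-end phase measurements during a single-phase fault, and their ratio gives an apparent zero-sequence impedance seen from the sending end; the phasor values below are invented:

```python
import numpy as np

def zero_sequence(phasors_abc: np.ndarray) -> complex:
    """Zero-sequence component: (Xa + Xb + Xc) / 3."""
    return complex(np.sum(phasors_abc) / 3.0)

# Invented sending-end phasors during a phase-a-to-ground fault.
v_abc = np.array([8000 * np.exp(1j * 0.0),               # sagged faulted-phase voltage
                  13200 * np.exp(1j * (-2 * np.pi / 3)),
                  13200 * np.exp(1j * (2 * np.pi / 3))])
i_abc = np.array([900 * np.exp(-1j * 1.2),                # fault current on phase a
                  60 * np.exp(1j * (-2 * np.pi / 3)),
                  60 * np.exp(1j * (2 * np.pi / 3))])

z0_seen = zero_sequence(v_abc) / zero_sequence(i_abc)
print(f"apparent zero-sequence impedance: {z0_seen:.1f} ohm")
```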

Relevance: 20.00%

Abstract:

In a seminal paper, Aitchison and Lauder (1985) introduced classical kernel density estimation techniques in the context of compositional data analysis. Indeed, they gave two options for the choice of the kernel to be used in the kernel estimator. One of these kernels is based on the use of the alr transformation on the simplex S^D jointly with the normal distribution on R^(D-1). However, these authors themselves recognized that this method has some deficiencies. A method for overcoming these difficulties, based on recent developments in compositional data analysis and multivariate kernel estimation theory and combining the ilr transformation with the use of the normal density with a full bandwidth matrix, was recently proposed in Martín-Fernández, Chacón and Mateu-Figueras (2006). Here we present an extensive simulation study that compares both methods in practice, thus exploring the finite-sample behaviour of both estimators.
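
A minimal sketch of the ilr-plus-normal-kernel route compared in the study, using scipy's data-driven full-bandwidth Gaussian kernel as a stand-in for the bandwidth matrix choice discussed by Martín-Fernández, Chacón and Mateu-Figueras (2006); the Dirichlet data are simulated for illustration:

```python
import numpy as np
from scipy.stats import gaussian_kde

def ilr(x: np.ndarray) -> np.ndarray:
    """Isometric log-ratio coordinates for D-part compositions (rows sum to 1)."""
    d = x.shape[1]
    clr = np.log(x) - np.log(x).mean(axis=1, keepdims=True)
    # Any orthonormal basis of the subspace orthogonal to (1, ..., 1) gives
    # valid ilr coordinates; here one is obtained from the centering matrix.
    v = np.linalg.svd(np.eye(d) - np.ones((d, d)) / d)[0][:, : d - 1]
    return clr @ v

rng = np.random.default_rng(1)
raw = rng.dirichlet([4.0, 2.0, 3.0], size=500)    # simulated 3-part compositions
z = ilr(raw)                                      # points in R^(D-1)
kde = gaussian_kde(z.T)                           # normal kernel, full bandwidth matrix
print(kde(ilr(np.array([[0.4, 0.2, 0.4]])).T))    # density at one composition (ilr scale)
```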

Relevance: 20.00%

Abstract:

The objective of the EU funded integrated project "ACuteTox" is to develop a strategy in which general cytotoxicity, together with organ-specific endpoints and biokinetic features, are taken into consideration in the in vitro prediction of oral acute systemic toxicity. With regard to the nervous system, the effects of 23 reference chemicals were tested with approximately 50 endpoints, using a neuronal cell line, primary neuronal cell cultures, brain slices and aggregated brain cell cultures. Comparison of the in vitro neurotoxicity data with general cytotoxicity data generated in a non-neuronal cell line and with in vivo data such as acute human lethal blood concentration, revealed that GABA(A) receptor function, acetylcholine esterase activity, cell membrane potential, glucose uptake, total RNA expression and altered gene expression of NF-H, GFAP, MBP, HSP32 and caspase-3 were the best endpoints to use for further testing with 36 additional chemicals. The results of the second analysis showed that no single neuronal endpoint could give a perfect improvement in the in vitro-in vivo correlation, indicating that several specific endpoints need to be analysed and combined with biokinetic data to obtain the best correlation with in vivo acute toxicity.

Relevance: 20.00%

Abstract:

A novel technique for estimating the rank of the trajectory matrix in the local subspace affinity (LSA) motion segmentation framework is presented. This new rank estimation is based on the relationship between the estimated rank of the trajectory matrix and the affinity matrix built with LSA. The result is an enhanced model selection technique for trajectory matrix rank estimation by which it is possible to automate LSA, without requiring any a priori knowledge, and to improve the final segmentation.
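
The abstract does not reproduce the model selection rule, but a common form in this literature penalises the unexplained singular-value energy of the trajectory matrix by a term proportional to the candidate rank; the sketch below, with an arbitrary penalty constant kappa, illustrates exactly the kind of hand-tuned criterion the proposed technique is designed to replace:

```python
import numpy as np

def estimate_rank(W: np.ndarray, kappa: float = 1e-3) -> int:
    """Rank of a trajectory matrix W (2F x P) by penalised singular-value energy."""
    s = np.linalg.svd(W, compute_uv=False)
    energy = np.cumsum(s**2) / np.sum(s**2)
    costs = [(1.0 - energy[r - 1]) + kappa * r for r in range(1, len(s) + 1)]
    return int(np.argmin(costs)) + 1

# Toy check: a rank-4 matrix plus small noise should come out as rank 4
# at this noise level and penalty.
rng = np.random.default_rng(2)
W = rng.normal(size=(40, 4)) @ rng.normal(size=(4, 30)) + 0.01 * rng.normal(size=(40, 30))
print(estimate_rank(W))
```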

Relevance: 20.00%

Abstract:

BACKGROUND: Prevention of cardiovascular disease (CVD) at the individual level should rely on the assessment of absolute risk using population-specific risk tables. OBJECTIVE: To compare the predictive accuracy of the original and the calibrated SCORE functions regarding 10-year cardiovascular risk in Switzerland. DESIGN: Cross-sectional, population-based study (5773 participants aged 35-74 years). METHODS: The SCORE equation for low-risk countries was calibrated based on the Swiss CVD mortality rates and on the CVD risk factor levels from the study sample. The predicted number of CVD deaths after a 10-year period was computed from the original and the calibrated equations and from the observed cardiovascular mortality for 2003. RESULTS: According to the original and calibrated functions, 16.3% and 15.8% of men and 8.2% and 8.9% of women, respectively, had a 10-year CVD risk ≥5%. The concordance correlation coefficient between the two functions was 0.951 for men and 0.948 for women, both P < 0.001. Both risk functions adequately predicted the 10-year cumulative number of CVD deaths: in men, 71 (original) and 74 (calibrated) deaths versus 73 deaths when using the CVD mortality rates; in women, 44 (original), 45 (calibrated) and 45 (CVD mortality rates), respectively. Compared with the original function, the calibrated function classified more women and fewer men as high risk. Moreover, the calibrated function gave better risk estimates among participants aged over 65 years. CONCLUSION: The original SCORE function adequately predicts CVD death in Switzerland, particularly for individuals aged less than 65 years. The calibrated function provides more reliable estimates for older individuals.
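
Calibration of a risk function to a new population generally follows the same recipe: keep the original coefficients but plug in the local baseline survival and the local mean risk-factor levels. A minimal, hedged sketch of that generic recipe, with invented numbers rather than the Swiss values:

```python
import math

# Hedged sketch of the generic recalibration idea used to adapt risk equations
# (such as SCORE) to a local population. L is the individual's linear predictor
# from the original equation, S_local the observed 10-year CVD-death-free
# survival in the target population, and L_mean_local the local mean linear
# predictor; all three values below are invented for illustration.

def recalibrated_risk(L: float, S_local: float, L_mean_local: float) -> float:
    """10-year risk = 1 - S_local ** exp(L - L_mean_local)."""
    return 1.0 - S_local ** math.exp(L - L_mean_local)

print(round(recalibrated_risk(L=1.2, S_local=0.97, L_mean_local=0.8), 3))
```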

Relevance: 20.00%

Abstract:

The MDRD (Modification of Diet in Renal Disease) equation enables glomerular filtration rate (GFR) estimation from serum creatinine only. The laboratory can thus report an estimated GFR (eGFR) with each serum creatinine assessment, thereby increasing the recognition of renal failure. The predictive performance of the MDRD equation is better for GFR < 60 ml/min/1.73 m2; a normal or near-normal renal function is often underestimated by this equation. Overall, MDRD provides more reliable estimates of renal function than the Cockcroft-Gault (C-G) formula, but both lack precision. MDRD is not superior to C-G for drug dosing. Being indexed to 1.73 m2, the MDRD eGFR has to be back-adjusted to the patient's body surface area for drug dosing. In addition, C-G has the advantages of greater simplicity and longer-established use.
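
For reference, a hedged sketch of the equations discussed, using the widely published constants (4-variable MDRD with the 186 factor, Cockcroft-Gault, Du Bois body surface area) and showing the back-adjustment to the patient's body surface area; creatinine calibration and units should be checked before any practical use:

```python
# Hedged sketch; constants are the commonly published ones, not clinical advice.

def mdrd_egfr(scr_mg_dl: float, age: float, female: bool, black: bool = False) -> float:
    """eGFR in ml/min/1.73 m2 (4-variable MDRD, original 186 constant)."""
    egfr = 186.0 * scr_mg_dl ** -1.154 * age ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

def cockcroft_gault(scr_mg_dl: float, age: float, weight_kg: float, female: bool) -> float:
    """Creatinine clearance in ml/min."""
    crcl = (140.0 - age) * weight_kg / (72.0 * scr_mg_dl)
    return crcl * 0.85 if female else crcl

def du_bois_bsa(weight_kg: float, height_cm: float) -> float:
    """Body surface area in m2 (Du Bois formula)."""
    return 0.007184 * weight_kg ** 0.425 * height_cm ** 0.725

# Back-adjusting the indexed MDRD value to the patient's own body surface area,
# as recommended for drug dosing; patient values are invented.
egfr_indexed = mdrd_egfr(1.4, 70, female=False)
egfr_absolute = egfr_indexed * du_bois_bsa(60, 160) / 1.73
print(round(egfr_indexed), round(egfr_absolute), round(cockcroft_gault(1.4, 70, 60, False)))
```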

Relevance: 20.00%

Abstract:

In this paper, we propose a new paradigm to carry out the registration task with a dense deformation field derived from the optical flow model and the active contour method. The proposed framework merges different tasks such as segmentation, regularization, incorporation of prior knowledge and registration into a single framework. The active contour model is at the core of our framework, even if it is used in a different way than in the standard approaches. Indeed, active contours are a well-known technique for image segmentation. This technique consists in finding the curve that minimizes an energy functional designed to be minimal when the curve has reached the object contours. That way, we get accurate and smooth segmentation results. So far, the active contour model has been used to segment objects lying in images from boundary-based, region-based or shape-based information. Our registration technique benefits from all these families of active contours to determine a dense deformation field defined on the whole image. A well-suited application of our model is atlas registration in medical imaging, which consists in automatically delineating anatomical structures. We present results on 2D synthetic images to show the performance of our non-rigid deformation field based on a natural registration term. We also present registration results on real 3D medical data with a large space-occupying tumor substantially deforming surrounding structures, which constitutes a highly challenging problem.
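
The abstract does not reproduce the functional, but the energy minimised in such optical-flow-plus-active-contour registration schemes typically combines a data term, a smoothness term on the deformation field and a contour term; a generic stand-in (not the paper's exact formulation) is:

```latex
E(\mathbf{u}, C) =
\underbrace{\int_{\Omega} \bigl(I_{\text{atlas}}(\mathbf{x}+\mathbf{u}(\mathbf{x})) - I_{\text{patient}}(\mathbf{x})\bigr)^{2}\, d\mathbf{x}}_{\text{optical-flow data term}}
+ \lambda \underbrace{\int_{\Omega} \lVert \nabla \mathbf{u}(\mathbf{x}) \rVert^{2}\, d\mathbf{x}}_{\text{regularization}}
+ \mu \underbrace{\oint_{C} g\bigl(\lvert \nabla I_{\text{patient}}(\mathbf{c}(s)) \rvert\bigr)\, ds}_{\text{active-contour term}}
```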

Relevance: 20.00%

Abstract:

New zircon U-Pb ages are proposed for late Early and Middle Triassic volcanic ash layers from the Luolou and Baifeng formations (northwestern Guangxi, South China). These ages are based on analyses of single, thermally annealed and chemically abraded zircons. Calibration with ammonoid ages indicates a 250.6 +/- 0.5 Ma age for the early Spathian Tirolites/Columbites beds, a 248.1 +/- 0.4 Ma age for the late Spathian Neopopanoceras haugi Zone, a 246.9 +/- 0.4 Ma age for the early middle Anisian Acrochordiceras hyatti Zone, and a 244.6 +/- 0.5 Ma age for the late middle Anisian Balatonites shoshonensis Zone. The new dates and previously published U-Pb ages indicate a duration of ca. 3 My for the Spathian, and minimal durations of 4.5 +/- 0.6 My for the Early Triassic and of 6.6 +0.7/-0.9 My for the Anisian. The new Spathian dates are in better agreement with a 252.6 +/- 0.2 Ma age than with a 251.4 +/- 0.3 Ma age for the Permian-Triassic boundary. These dates also highlight the extremely uneven durations of the four Early Triassic substages (Griesbachian, Dienerian, Smithian, and Spathian), of which the Spathian alone exceeds half of the duration of the entire Early Triassic. The simplistic assumption of equal duration of the four Early Triassic subdivisions is no longer tenable for the reconstruction of recovery patterns following the end-Permian mass extinction.
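
For instance, the quoted minimal Early Triassic duration is consistent with taking the preferred Permian-Triassic boundary age and the late Spathian age and combining the uncertainties additively:

```latex
\Delta t_{\text{Early Triassic}} \;\geq\; (252.6 \pm 0.2)\ \text{Ma} - (248.1 \pm 0.4)\ \text{Ma} \;=\; 4.5 \pm 0.6\ \text{My}
```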

Relevance: 20.00%

Abstract:

Measurement of the three-dimensional (3D) knee joint angle outside a laboratory is of benefit in clinical examination and for comparing therapeutic treatments. Although several motion capture devices exist, there is a need for an ambulatory system that could be used in routine practice. To date, inertial measurement units (IMUs) have proven suitable for unconstrained measurement of the knee joint differential orientation. Nevertheless, this differential orientation should be converted into three reliable and clinically interpretable angles. Thus, the aim of this study was to propose a new calibration procedure adapted to the joint coordinate system (JCS), which requires only IMU data. The repeatability of the calibration procedure, as well as the errors in the measurement of the 3D knee angle during gait in comparison to a reference system, were assessed on eight healthy subjects. The new procedure, relying on active and passive movements, showed high repeatability of the mean values (offset < 1 degree) and angular patterns (SD < 0.3 degrees and CMC > 0.9). In comparison to the reference system, this functional procedure showed high precision (SD < 2 degrees and CC > 0.75) and moderate accuracy (between 4.0 degrees and 8.1 degrees) for the three knee angles. The combination of the inertial-based system with the functional calibration procedure proposed here results in a promising tool for the measurement of the 3D knee joint angle. Moreover, this method could be adapted to measure other complex joints, such as the ankle or elbow.
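
As a hedged sketch of the final computation (not of the calibration procedure itself, which is the paper's contribution), the knee's differential orientation is the thigh-to-shank relative rotation of the two IMUs, and a JCS-style decomposition can be approximated by a Cardan angle sequence; the quaternions below are invented:

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def knee_angles_deg(q_thigh_wxyz, q_shank_wxyz):
    """Return (flexion, adduction, internal rotation) angles in degrees."""
    r_thigh = R.from_quat(np.roll(q_thigh_wxyz, -1))   # scipy expects x, y, z, w order
    r_shank = R.from_quat(np.roll(q_shank_wxyz, -1))
    r_knee = r_thigh.inv() * r_shank                   # differential orientation
    return r_knee.as_euler("ZXY", degrees=True)        # Cardan decomposition (JCS-like)

# Invented orientations: thigh flexed ~30 deg, shank ~40 deg about the same axis,
# so the output should be close to [10, 0, 0] (10 deg of flexion).
print(knee_angles_deg([0.966, 0.0, 0.0, 0.259],
                      [0.940, 0.0, 0.0, 0.342]))
```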

Relevance: 20.00%

Abstract:

Discrete data arise in various research fields, typically when the observations are count data. I propose a robust and efficient parametric procedure for the estimation of discrete distributions. The estimation is done in two phases. First, a very robust, but possibly inefficient, estimate of the model parameters is computed and used to identify outliers. Then the outliers are either removed from the sample or given low weights, and a weighted maximum likelihood estimate (WML) is computed. The weights are determined via an adaptive process such that, if the data follow the model, then asymptotically no observation is downweighted.

I prove that the final estimator inherits the breakdown point of the initial one, and that its influence function at the model is the same as the influence function of the maximum likelihood estimator, which strongly suggests that it is asymptotically fully efficient.

The initial estimator is a minimum disparity estimator (MDE). MDEs can be shown to have full asymptotic efficiency, and some MDEs have very high breakdown points and very low bias under contamination. Several initial estimators are considered, and the performance of the WMLs based on each of them is studied. In a great variety of situations the WML substantially improves on the initial estimator, both in terms of finite-sample mean square error and in terms of bias under contamination. Moreover, the performance of the WML remains rather stable under a change of the MDE, even when the MDEs themselves behave very differently.

Two examples of application of the WML to real data are considered. In both of them, the necessity for a robust estimator is clear: the maximum likelihood estimator is badly corrupted by the presence of a few outliers. The procedure is particularly natural in the discrete distribution setting, but could be extended to the continuous case, for which a possible procedure is sketched.
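
A minimal, hedged sketch of the two-phase idea for a Poisson model: a crude median-based fit stands in for the minimum disparity estimator, implausible observations are downweighted to zero, and the weighted maximum likelihood estimate is computed; the fixed tail cut-off is an arbitrary illustrative choice rather than the adaptive weighting described above:

```python
import numpy as np
from scipy.stats import poisson

def robust_poisson_mean(x: np.ndarray, tail_prob: float = 0.01) -> float:
    """Two-phase estimate of a Poisson mean: robust initial fit, then weighted ML."""
    lam0 = np.median(x)                               # crude but robust initial fit
    # Phase 2: weight 0 for points in the extreme tails of the initial fit.
    lower, upper = poisson.ppf(tail_prob, lam0), poisson.ppf(1 - tail_prob, lam0)
    w = ((x >= lower) & (x <= upper)).astype(float)
    # The weighted ML estimate of a Poisson mean is the weighted sample mean.
    return float(np.sum(w * x) / np.sum(w))

rng = np.random.default_rng(3)
clean = rng.poisson(4.0, 200)
contaminated = np.concatenate([clean, np.full(10, 50)])   # a few gross outliers
# The plain mean is pulled towards the outliers; the weighted estimate is not.
print(round(np.mean(contaminated), 2), round(robust_poisson_mean(contaminated), 2))
```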