2 results for Multivariate Monitoring
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
OBJECTIVES: Premature babies require supplementation with calcium and phosphorus to prevent metabolic bone disease of prematurity. To guide mineral supplementation, two methods of monitoring urinary excretion of calcium and phosphorus are used: urinary calcium or phosphorus concentration, and calcium/creatinine or phosphorus/creatinine ratios. We compared these two methods with regard to their agreement on the need for mineral supplementation. METHODS: Retrospective chart review of 230 premature babies with birthweight <1500 g who underwent screening of urinary spot samples from day 21 of life and fortnightly thereafter. Hypothetical cut-off values for urinary calcium or phosphorus concentration (1 mmol/l), urinary calcium/creatinine ratio (0.5 mol/mol), and phosphorus/creatinine ratio (4 mol/mol) were applied to the sample results, and the two methods were compared on whether or not they indicated supplementation of the respective mineral. Multivariate general linear models were used to identify patient characteristics predicting discordant results. RESULTS: The two methods disagreed on the indication for supplementation in 24.8% of cases for calcium and 8.8% for phosphorus. Total daily calcium intake was the only patient characteristic associated with discordant results. CONCLUSIONS: With respect to the decision to supplement, agreement between urinary mineral concentration and the mineral/creatinine ratio is moderate for calcium and good for phosphorus. The results do not allow either method to be identified as superior for deciding which babies require calcium and/or phosphorus supplements.
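As a rough illustration of the comparison, the sketch below applies the abstract's hypothetical cut-offs to spot-sample results and computes the percentage agreement between the two methods. The column names, the example data, and the assumption that a value below the cut-off indicates a need to supplement are illustrative, not taken from the study.

# Minimal sketch: classify urine spot samples against the abstract's hypothetical
# cut-offs and quantify agreement between the two monitoring methods.
# Example data and the direction of the decision rule are assumptions.
import pandas as pd

# Cut-offs stated in the abstract; here a result BELOW the cut-off is assumed
# to indicate a need to supplement the respective mineral.
CUTOFFS = {
    "ca_conc": 1.0,      # urinary calcium concentration, mmol/l
    "p_conc": 1.0,       # urinary phosphorus concentration, mmol/l
    "ca_cr_ratio": 0.5,  # calcium/creatinine ratio, mol/mol
    "p_cr_ratio": 4.0,   # phosphorus/creatinine ratio, mol/mol
}

def agreement(df: pd.DataFrame, conc_col: str, ratio_col: str) -> float:
    """Fraction of samples where the concentration and ratio methods agree
    on the indication to supplement (both below, or both at/above, cut-off)."""
    by_conc = df[conc_col] < CUTOFFS[conc_col]
    by_ratio = df[ratio_col] < CUTOFFS[ratio_col]
    return (by_conc == by_ratio).mean()

# Illustrative spot-sample results (mmol/l and mol/mol), not study data.
samples = pd.DataFrame({
    "ca_conc": [0.4, 1.8, 0.9, 2.2],
    "ca_cr_ratio": [0.3, 0.7, 0.6, 1.1],
    "p_conc": [0.5, 3.1, 0.8, 5.0],
    "p_cr_ratio": [2.0, 6.5, 5.1, 7.2],
})

print(f"Calcium agreement:    {agreement(samples, 'ca_conc', 'ca_cr_ratio'):.0%}")
print(f"Phosphorus agreement: {agreement(samples, 'p_conc', 'p_cr_ratio'):.0%}")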
Abstract:
Syndromic surveillance (SyS) systems currently exploit various sources of health-related data, most of which are collected for purposes other than surveillance (e.g. economic purposes). Several European SyS systems use data collected during meat inspection for syndromic surveillance of animal health, as some diseases may be more easily detected post-mortem than at their point of origin or during the ante-mortem inspection on arrival at the slaughterhouse. In this paper we use simulation to evaluate the performance of a quasi-Poisson regression algorithm (also known as the improved Farrington algorithm) for detecting disease outbreaks during post-mortem inspection of slaughtered animals. When the algorithm was parameterized from a retrospective analysis of 6 years of historic data, the probability of detection was satisfactory for large outbreaks (range 83-445 cases) but poor for small ones (range 20-177 cases). Varying the amount of historical data used to fit the algorithm can help increase the probability of detecting small outbreaks. However, while the use of a 0.975 quantile kept the false-positive rate low, in most cases more than 50% of outbreak cases had already occurred by the time of detection. The high variance observed in the whole-carcass condemnation time series, and the lack of flexibility in the temporal distribution of simulated outbreaks resulting from the low (monthly) reporting frequency, constitute major challenges for early detection of outbreaks in the livestock population from meat inspection data. Reporting frequency should be increased in the future to improve the timeliness of the SyS system, while increased sensitivity may be achieved by integrating meat inspection data into a multivariate system that simultaneously evaluates multiple sources of data on livestock health.
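For illustration, the sketch below raises an alarm when the current count exceeds the 0.975 quantile of a quasi-Poisson baseline fitted to historic counts. It is a simplified stand-in for the improved Farrington algorithm, which among other refinements uses a power transformation and downweights past outbreaks; the simulated data and the intercept-plus-trend model are assumptions made for the example.

# Simplified quasi-Poisson outbreak-detection sketch, in the spirit of the
# improved Farrington approach but NOT the algorithm as implemented in the
# study; baseline data and the trend model are illustrative assumptions.
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(0)

# Illustrative monthly whole-carcass condemnation counts: 6 years (72 months)
# of history, plus one current month to be tested for an outbreak signal.
history = rng.poisson(lam=40, size=72).astype(float)
current = 65

# Fit an intercept-plus-linear-trend quasi-Poisson GLM to the baseline;
# scale="X2" estimates overdispersion from the Pearson chi-squared statistic.
t = np.arange(len(history))
fit = sm.GLM(history, sm.add_constant(t),
             family=sm.families.Poisson()).fit(scale="X2")

# Baseline prediction for the current month and an upper alarm threshold at
# the 0.975 quantile (normal approximation with quasi-Poisson variance; the
# published algorithm uses a 2/3-power transformation instead).
x_new = sm.add_constant(np.array([len(history)], dtype=float),
                        has_constant="add")
mu = fit.predict(x_new)[0]
threshold = mu + norm.ppf(0.975) * np.sqrt(fit.scale * mu)

print(f"baseline={mu:.1f}  threshold={threshold:.1f}  observed={current}")
if current > threshold:
    print("Alarm: observed count exceeds the 0.975 quantile threshold.")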