953 results for function estimation


Relevance: 40.00%

Abstract:

This work addresses the problem of deriving F0 from distant-talking speech signals acquired by a microphone network. The proposed method exploits the redundancy across the channels by jointly processing the different signals. To this end, a multi-microphone periodicity function is derived from the magnitude spectra of all the channels. This function allows F0 to be estimated reliably, even under reverberant conditions, without the need for any post-processing or smoothing technique. Experiments conducted on real data showed that the proposed frequency-domain algorithm is better suited to this task than time-domain alternatives.
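The abstract does not give the exact form of the periodicity function, but the idea of pooling the magnitude spectra of all channels and scoring F0 candidates by harmonic summation can be sketched as follows. This is an illustrative assumption, not the authors' implementation; the function names, FFT size, and number of harmonics are all choices made here.

```python
import numpy as np

def multichannel_periodicity(frames, sr, f0_grid, n_harmonics=5):
    """Sketch of a multi-microphone periodicity function: the magnitude
    spectra of all channels are pooled, then each F0 candidate is scored
    by summing the pooled spectral energy at its first few harmonics.
    `frames` has shape (n_channels, frame_len)."""
    n_fft = 4096
    # Joint processing: sum magnitude spectra across channels.
    spectrum = np.abs(np.fft.rfft(frames, n=n_fft, axis=1)).sum(axis=0)
    freqs = np.fft.rfftfreq(n_fft, d=1.0 / sr)
    scores = np.zeros(len(f0_grid))
    for i, f0 in enumerate(f0_grid):
        for h in range(1, n_harmonics + 1):
            bin_idx = np.argmin(np.abs(freqs - h * f0))
            scores[i] += spectrum[bin_idx]
    return scores

# Two synthetic "channels" carrying the same 200 Hz harmonic signal.
sr = 16000
t = np.arange(2048) / sr
sig = sum(np.sin(2 * np.pi * 200 * h * t) for h in (1, 2, 3))
frames = np.stack([sig, 0.8 * sig])          # second mic: attenuated copy
f0_grid = np.arange(80.0, 400.0, 1.0)
f0_hat = f0_grid[np.argmax(multichannel_periodicity(frames, sr, f0_grid))]
```

Because the score at the true F0 accumulates energy from several harmonics at once, a coarse maximum over the candidate grid is already stable, which is consistent with the abstract's claim that no smoothing stage is needed.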

Relevance: 40.00%

Abstract:

QUESTIONS UNDER STUDY AND PRINCIPLES: Estimating glomerular filtration rate (GFR) in hospitalised patients with chronic kidney disease (CKD) is important for drug prescription, but it remains a difficult task. The purpose of this study was to investigate the reliability of selected algorithms based on serum creatinine, cystatin C and beta-trace protein to estimate GFR, and the potential added advantage of measuring muscle mass by bioimpedance. In a prospective unselected group of patients with CKD hospitalised in a general internal medicine ward, GFR was evaluated using inulin clearance as the gold standard and the algorithms of Cockcroft, MDRD, Larsson (cystatin C), White (beta-trace) and MacDonald (creatinine and muscle mass by bioimpedance). Sixty-nine patients were included in the study. Median age (interquartile range) was 80 years (73-83); weight 74.7 kg (67.0-85.6), appendicular lean mass 19.1 kg (14.9-22.3), serum creatinine 126 μmol/l (100-149), cystatin C 1.45 mg/l (1.19-1.90), beta-trace protein 1.17 mg/l (0.99-1.53) and GFR measured by inulin 30.9 ml/min (22.0-43.3). The errors in the estimation of GFR and the areas under the ROC curves (95% confidence interval) relative to inulin were, respectively: Cockcroft 14.3 ml/min (5.55-23.2) and 0.68 (0.55-0.81), MDRD 16.3 ml/min (6.4-27.5) and 0.76 (0.64-0.87), Larsson 12.8 ml/min (4.50-25.3) and 0.82 (0.72-0.92), White 17.6 ml/min (11.5-31.5) and 0.75 (0.63-0.87), MacDonald 32.2 ml/min (13.9-45.4) and 0.65 (0.52-0.78). Currently used algorithms overestimate GFR in hospitalised patients with CKD. As a consequence, eGFR-targeted prescriptions of renally cleared drugs might expose patients to overdosing. The best results were obtained with the Larsson algorithm. The determination of muscle mass by bioimpedance did not contribute significantly.
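The Cockcroft algorithm referred to above is the standard Cockcroft-Gault formula. A minimal sketch, applied to the study's median patient, illustrates how the overestimation reported in the abstract can arise (the helper name and the unit conversion factor 88.4 µmol/l per mg/dl are the standard convention, not taken from the paper):

```python
def cockcroft_gault(age_years, weight_kg, scr_umol_per_l, female=False):
    """Cockcroft-Gault creatinine clearance estimate (ml/min).
    Serum creatinine is converted from µmol/l to mg/dl (divide by 88.4)."""
    scr_mg_dl = scr_umol_per_l / 88.4
    crcl = (140 - age_years) * weight_kg / (72.0 * scr_mg_dl)
    return 0.85 * crcl if female else crcl

# Median patient from the study: 80 years, 74.7 kg, creatinine 126 µmol/l.
est = cockcroft_gault(80, 74.7, 126)
```

For these median values the formula returns roughly 44 ml/min, well above the median inulin-measured GFR of 30.9 ml/min, in line with the reported mean Cockcroft error of 14.3 ml/min.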

Relevance: 40.00%

Abstract:

The attached file was created with Scientific WorkPlace (LaTeX).

Relevance: 40.00%

Abstract:

So far, in the bivariate set-up, the analysis of lifetime (failure time) data with multiple causes of failure has been done by treating each cause of failure separately, with failures from other causes considered as independent censoring. This approach is unrealistic in many situations. For example, in the analysis of mortality data on married couples, one would be interested in comparing the hazards for the same cause of death as well as in checking whether death due to one cause is more important for the partner's risk of death from other causes. In reliability analysis, one often has systems with more than one component, and many systems, subsystems and components have more than one cause of failure. Design of high-reliability systems generally requires that the individual system components have extremely high reliability, even after long periods of time. Knowledge of the failure behaviour of a component can lead to savings in its cost of production and maintenance and, in some cases, to the preservation of human life. For the purpose of improving reliability, it is necessary to identify the cause of failure down to the component level. By treating each cause of failure separately, with failures from other causes considered as independent censoring, the analysis of lifetime data would be incomplete. Motivated by this, we introduce a new approach for the analysis of bivariate competing risk data using the bivariate vector hazard rate of Johnson and Kotz (1975).
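The vector hazard rate mentioned above has, in the usual multivariate survival notation, the following form. This is a sketch of the standard definition, not the paper's own competing-risks extension:

```latex
\mathbf{h}(t_1,t_2)
  = \bigl(h_1(t_1,t_2),\, h_2(t_1,t_2)\bigr),
\qquad
h_i(t_1,t_2)
  = -\frac{\partial}{\partial t_i}\,\log \bar F(t_1,t_2),
\quad i = 1,2,
```

where $\bar F(t_1,t_2) = P(T_1 > t_1,\, T_2 > t_2)$ is the joint survival function of the two lifetimes. Each component $h_i$ is the instantaneous failure rate of one unit given that both have survived to $(t_1,t_2)$, which is what allows dependence between the partners' failure times to be modelled jointly rather than treated as independent censoring.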

Relevance: 40.00%

Abstract:

The present research sought to investigate the role of the basal ganglia in timing of sub- and supra-second intervals via an examination of the ability of people with Parkinson's disease (PD) to make temporal judgments in two ranges, 100-500 ms, and 1-5 s. Eighteen nondemented medicated patients with PD were compared with 14 matched controls on a duration-bisection task in which participants were required to discriminate auditory and visual signal durations within each time range. Results showed that patients with PD exhibited more variable duration judgments across both signal modality and duration range than controls, although closer analyses confirmed a timing deficit in the longer duration range only. The findings presented here suggest the bisection procedure may be a useful tool in identifying timing impairments in PD and, more generally, reaffirm the hypothesised role of the basal ganglia in temporal perception at the level of the attentionally mediated internal clock as well as memory retrieval and/or decision-making processes. (c) 2007 Elsevier Inc. All rights reserved.
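The duration-bisection task described above yields, for each participant, the proportion of "long" responses at each probe duration; the bisection point is the duration at which that proportion crosses 50%. A minimal sketch of that readout, on hypothetical psychometric data (the probe values and proportions below are invented for illustration, not taken from the study):

```python
import numpy as np

def bisection_point(durations, p_long):
    """Linearly interpolate the duration at which 'long' responses reach
    50% -- the bisection point (point of subjective equality)."""
    durations = np.asarray(durations, dtype=float)
    p_long = np.asarray(p_long, dtype=float)   # must be non-decreasing
    return float(np.interp(0.5, p_long, durations))

# Hypothetical data for the 1-5 s range: probes (s) and proportion "long".
probes = [1.0, 1.78, 2.37, 3.16, 4.21, 5.0]
p_long = [0.02, 0.15, 0.40, 0.70, 0.92, 0.99]
pse = bisection_point(probes, p_long)
```

A flatter psychometric slope around this point corresponds to more variable duration judgments, which is the measure on which the patients with PD differed from controls.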

Relevance: 40.00%

Abstract:

We investigate whether there was a stable money demand function for Japan in the 1990s, using both aggregate and disaggregate time series data. The aggregate data appear to support the contention that there was no stable money demand function, whereas the disaggregate data show that there was. Nor was there any indication of the presence of a liquidity trap. Possible sources of this discrepancy are explored, and the diametrically opposite results between the aggregate and disaggregate analyses are attributed to neglected heterogeneity among micro units. We also conduct a simulation analysis to show that when heterogeneity among micro units is present, the prediction of aggregate outcomes using aggregate data is less accurate than prediction based on micro equations. Moreover, policy evaluation based on aggregate data can be grossly misleading.
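The simulation argument can be sketched in a few lines: give two micro units different slopes, fit one equation to the aggregate and one equation per unit, and compare out-of-sample prediction errors for the aggregate outcome. The data-generating process below (random-walk regressors, slopes 0.3 and 1.2) is an illustrative assumption, not the paper's design:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 200
# Two micro units with heterogeneous slopes (e.g. income elasticities).
b = np.array([0.3, 1.2])
x = rng.normal(size=(T, 2)).cumsum(axis=0)   # unit-level regressors
y = x * b + rng.normal(scale=0.1, size=(T, 2))

X_agg, Y_agg = x.sum(axis=1), y.sum(axis=1)
train, test = slice(0, 150), slice(150, T)

# Aggregate equation: regress aggregate y on aggregate x.
b_agg = np.polyfit(X_agg[train], Y_agg[train], 1)
pred_agg = np.polyval(b_agg, X_agg[test])

# Micro equations: one slope per unit, then sum the unit predictions.
b_micro = [np.polyfit(x[train, i], y[train, i], 1) for i in range(2)]
pred_micro = sum(np.polyval(b_micro[i], x[test, i]) for i in range(2))

err_agg = np.mean((pred_agg - Y_agg[test]) ** 2)
err_micro = np.mean((pred_micro - Y_agg[test]) ** 2)
```

Because the single aggregate slope cannot track both micro slopes at once, its residual grows with the divergence between the two regressors, while the micro equations recover the true slopes and predict the aggregate almost exactly.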

Relevance: 40.00%

Abstract:

Doctorate in Economics.

Relevance: 40.00%

Abstract:

Recurrent event data are largely characterized by the rate function, but smoothing techniques for estimating the rate function have never been rigorously developed or studied in the statistical literature. This paper considers the moment and least squares methods for estimating the rate function from recurrent event data. Under an independent censoring assumption on the recurrent event process, we study statistical properties of the proposed estimators and propose bootstrap procedures for bandwidth selection and for the approximation of confidence intervals in the estimation of the occurrence rate function. It is shown that the moment method, without resmoothing via a smaller bandwidth, produces curves with nicks occurring at the censoring times, whereas there is no such problem with the least squares method. Furthermore, the asymptotic variance of the least squares estimator is shown to be smaller under regularity conditions. However, in the implementation of the bootstrap procedures, the moment method is computationally more efficient than the least squares method because the former uses condensed bootstrap data. The performance of the proposed procedures is studied through Monte Carlo simulations and an epidemiological example on intravenous drug users.
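A moment-type estimator of the kind discussed above can be sketched as a kernel-weighted event count divided by the number of subjects still under observation. The Gaussian kernel, toy data, and function name are illustrative assumptions, not the paper's exact estimator:

```python
import numpy as np

def smoothed_rate(t_grid, event_times, censor_times, bandwidth):
    """Moment-type kernel estimate of the occurrence rate function: at
    each t, kernel-weighted event counts are divided by the number of
    subjects whose follow-up covers t (C_i >= t)."""
    censor_times = np.asarray(censor_times, dtype=float)
    rate = np.zeros_like(t_grid, dtype=float)
    for k, t in enumerate(t_grid):
        at_risk = np.sum(censor_times >= t)
        if at_risk == 0:
            continue                          # no one under observation
        total = 0.0
        for times in event_times:             # one event array per subject
            u = (t - np.asarray(times)) / bandwidth
            total += np.sum(np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi))
        rate[k] = total / (bandwidth * at_risk)
    return rate

# Toy data: recurrent events for three subjects, censored at C_i.
events = [np.array([0.5, 1.4, 2.2]), np.array([0.9, 2.8]), np.array([1.1])]
censor = [3.0, 3.0, 1.5]
grid = np.linspace(0.0, 3.0, 31)
lam_hat = smoothed_rate(grid, events, censor, bandwidth=0.5)
```

Note that the at-risk denominator drops from 3 to 2 at the earliest censoring time (1.5 here), which is exactly the mechanism behind the "nicks" at censoring times that the abstract attributes to the unresmoothed moment estimator.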