992 results for Efficient estimation


Relevance:

20.00%

Publisher:

Abstract:

PURPOSE: To use measurements from cycling power meters (Pmes) to evaluate the accuracy of models commonly used to estimate uphill cycling power (Pest). Experiments were designed to explore the influence of wind speed and steepness of climb on the accuracy of Pest. The authors hypothesized that the random error in Pest would be largely influenced by windy conditions, that the bias would be diminished on steeper climbs, and that windy conditions would induce larger bias in Pest. METHODS: Sixteen well-trained cyclists performed 15 uphill-cycling trials (length 1.3-6.3 km, slope 4.4-10.7%) in random order. Trials included different riding positions in a group (lead or follow) and different wind speeds. Pmes was quantified using a power meter, and Pest was calculated with the methodology used by journalists reporting on the Tour de France. RESULTS: Overall, the difference between Pmes and Pest was -0.95% (95% CI: -10.4%, +8.5%) across all trials and 0.24% (-6.1%, +6.6%) in conditions without wind (<2 m/s). The relationship between percent slope and the error between Pest and Pmes was trivial. CONCLUSIONS: Aerodynamic drag (affected by wind velocity and orientation, frontal area, drafting, and speed) is the main confounding factor. The mean estimated values are close to the power-output values measured by power meters, but the random error is between ±6% and ±10%. Moreover, at the power outputs (>400 W) produced by professional riders, this error is likely to be higher. This observation calls into question the validity of releasing individual values without reporting the range of random error.
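The kind of model behind such estimates is a steady-state force balance. A minimal sketch only, assuming still air and constant speed; the study does not publish its exact formula, and the rolling-resistance coefficient, drag area, and air density used as defaults below are illustrative assumptions, not values from the paper:

```python
import math

def estimate_uphill_power(mass_kg, speed_ms, slope,
                          crr=0.004, cda=0.3, rho=1.2, g=9.81):
    """Rough uphill cycling power (W) from a simple physical model.

    Assumes steady speed and still air; slope is rise/run (0.08 = 8%).
    crr, cda and rho are illustrative defaults, not values from the study.
    """
    theta = math.atan(slope)
    p_gravity = mass_kg * g * math.sin(theta) * speed_ms        # lifting rider + bike
    p_rolling = mass_kg * g * math.cos(theta) * crr * speed_ms  # rolling resistance
    p_aero = 0.5 * rho * cda * speed_ms ** 3                    # drag, no-wind case
    return p_gravity + p_rolling + p_aero
```

The cubic dependence of the drag term on air speed illustrates why unmodelled wind and drafting dominate the random error reported above.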


The study reports a set of forty proteinogenic histidine-containing dipeptides as potential carbonyl quenchers. The peptides were chosen to cover the accessible chemical space as exhaustively as possible, and their quenching activities toward 4-hydroxy-2-nonenal (HNE) and pyridoxal were evaluated by HPLC analyses. The peptides were capped at the C-terminus as methyl esters or amides to favor their resistance to proteolysis, and diastereoisomeric pairs were considered to reveal the influence of configuration on quenching. On average, the examined dipeptides are less active than the parent compound carnosine (βAla + His), emphasizing the unfavorable effect of shortening the βAla residue, as confirmed by the control dipeptide Gly-His. Nevertheless, some peptides show promising activities toward HNE combined with remarkable selectivity. The results emphasize the beneficial role of aromatic and positively charged residues, while negatively charged and H-bonding side chains have a detrimental effect on quenching. As a trend, ester derivatives are slightly more active than amides, and heterochiral peptides are more active than their homochiral diastereoisomers. Overall, the results reveal that quenching activity strongly depends on conformational effects and vicinal residues (as evidenced by the reported QSAR analysis), offering insightful clues for the design of improved carbonyl quenchers and for rationalizing the specific reactivity of histidine residues within proteins.


In the current issue of Epidemiology, Danaei and colleagues elegantly estimated both the direct effect and the indirect effect (that is, the effect mediated by blood pressure, cholesterol, glucose, fibrinogen, and high-sensitivity C-reactive protein [hs-CRP]) of body mass index (BMI) on the risk of coronary heart disease (CHD). They analyzed data from 9 cohort studies including 58,322 patients and 9,459 CHD events, with baseline measurements between 1954 and 2001. Using sophisticated, cutting-edge methods for direct- and indirect-effect estimation, the authors estimated that half of the risk associated with overweight and obesity would be mediated by blood pressure, cholesterol, and glucose; a few additional percentage points of the risk would be mediated by fibrinogen and hs-CRP. How should we understand these estimates? Can we say that if obese persons reduce their body weight and reach a normal body weight, their excess risk of CHD would be reduced by half through an improvement in these mediators and by half through the reduction in BMI itself? Is that also true if these individuals are prevented from becoming obese in the first place? Can we also conclude that if these mediators are well controlled in obese individuals by means other than body weight reduction, their excess risk of CHD would be reduced by half? Let us confront these estimates with observations from studies evaluating two interventions to reduce body weight: bariatric surgery in patients with severe obesity and intensive lifestyle intervention in overweight patients with diabetes.


The objective of this study was to assess the accuracy of spiral CT (SCT) aortography for diagnosing acute aortic lesions in blunt thoracic trauma patients. Between October 1992 and June 1997, 487 SCT scans of the chest were performed on blunt thoracic trauma patients. To assess aortic injury, the following SCT criteria were considered: hemomediastinum, peri-aortic hematoma, irregular aspect of the aortic wall, aortic pseudodiverticulum, intimal flap, and traumatic dissection. Aortic injury was diagnosed on 14 SCT examinations (2.9%); five of these patients had an additional digital aortography that confirmed the aortic trauma. Twelve subjects underwent surgical repair of the thoracic aorta, which confirmed the aortic injury in all but one case. Two patients died before surgery from severe brain lesions; their aortic lesions were confirmed at autopsy. According to the follow-up of the other 473 patients, we are aware of no false-negative SCT examination. Our limited series shows a sensitivity of 100% and a specificity of 99.8% for SCT aortography in the diagnosis of aortic injury. We conclude that SCT aortography is an accurate diagnostic method for the assessment of aortic injury in blunt thoracic trauma patients.
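The reported figures follow from standard diagnostic-test arithmetic. The counts below are reconstructed from the abstract (13 confirmed injuries, 1 unconfirmed positive finding, 473 negatives with no known false negative); they are an interpretation, not a published table:

```python
def sensitivity_specificity(tp, fp, tn, fn):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Counts reconstructed from the abstract -- an assumption, not a published table.
sens, spec = sensitivity_specificity(tp=13, fp=1, tn=473, fn=0)
```

With these counts, sensitivity is 13/13 = 100% and specificity 473/474 ≈ 99.8%, matching the reported values.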


The objective of this work was to evaluate an estimation system for rice yield in Brazil, based on simple agrometeorological models and on the technological level of production systems. This system incorporates the conceptual basis proposed by Doorenbos & Kassam for potential and attainable yields, with empirical adjustments for maximum yield and crop sensitivity to water deficit, considering five categories of rice yield. Rice yield was estimated from 2000/2001 to 2007/2008 and compared to IBGE yield data. Regression analyses between model estimates and IBGE survey data resulted in significant coefficients of determination, with less dispersion in the South than in the North and Northeast regions of the country. The index of model efficiency (E1') ranged from 0.01 in the lower yield classes to 0.45 in the higher ones, and the mean absolute error ranged from 58 to 250 kg ha-1, respectively.
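The Doorenbos & Kassam water-deficit response underlying such systems relates relative yield loss to relative evapotranspiration deficit. A minimal sketch under that relation; the paper's empirical adjustments per yield category are not reproduced here, and the numbers in the example below are illustrative:

```python
def attainable_yield(ym, ky, eta, etm):
    """Doorenbos & Kassam yield response to water deficit.

    (1 - Ya/Ym) = ky * (1 - ETa/ETm)  =>  Ya = Ym * (1 - ky * (1 - ETa/ETm))
    ym: maximum yield; ky: crop sensitivity factor;
    eta, etm: actual and maximum crop evapotranspiration.
    """
    return ym * (1.0 - ky * (1.0 - eta / etm))
```

When actual evapotranspiration equals the maximum (no water deficit), attainable yield equals the maximum yield.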


Many audio watermarking schemes divide the audio signal into several blocks such that part of the watermark is embedded into each of them. One of the key issues in these block-oriented watermarking schemes is to preserve synchronisation, i.e. to recover the exact position of each block in the mark recovery process. In this paper, a novel time domain synchronisation technique is presented together with a new blind watermarking scheme which works in the Discrete Fourier Transform (DFT or FFT) domain. The combined scheme provides excellent imperceptibility results whilst achieving robustness against typical attacks. Furthermore, the execution of the scheme is fast enough to be used in real-time applications. The excellent transparency of the embedding algorithm makes it particularly useful for professional applications, such as the embedding of monitoring information in broadcast signals. The scheme is also compared with recent results from the literature.
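The abstract does not specify the authors' embedding rule. As a generic illustration of block-oriented, blind FFT-domain marking only, one classical option encodes each bit in the ordering of two mid-band bin magnitudes; the bin indices and strength factor below are arbitrary choices, not the paper's parameters:

```python
import numpy as np

def embed_bit_block(block, bit, k1=8, k2=9, alpha=1.5):
    """Embed one bit in an audio block: |X[k1]| > |X[k2]| encodes 1."""
    X = np.fft.rfft(block)
    m1, m2 = abs(X[k1]), abs(X[k2])
    hi, lo = max(m1, m2) * alpha, min(m1, m2) / alpha
    big, small = (k1, k2) if bit == 1 else (k2, k1)
    X[big] = hi * np.exp(1j * np.angle(X[big]))      # keep phase, force magnitude
    X[small] = lo * np.exp(1j * np.angle(X[small]))
    return np.fft.irfft(X, n=len(block))

def extract_bit_block(block, k1=8, k2=9):
    """Blind extraction: only the bin indices are needed, not the original signal."""
    X = np.fft.rfft(block)
    return 1 if abs(X[k1]) > abs(X[k2]) else 0
```

Synchronisation (recovering each block's start before extraction) is the part addressed by the paper's time-domain technique and is not sketched here.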


The paper deals with the development and application of a methodology for automatic mapping of pollution/contamination data. The General Regression Neural Network (GRNN) is considered in detail and is proposed as an efficient tool for solving this problem. Automatic tuning of isotropic and anisotropic GRNN models using a cross-validation procedure is presented. Results are compared with a k-nearest-neighbours interpolation algorithm using an independent validation data set. The quality of mapping is controlled by analysis of the raw data and of the residuals using variography. Maps of the probability of exceeding a given decision level and "thick" isoline visualization of the uncertainties are presented as examples of decision-oriented mapping. The real case study is based on mapping of radioactively contaminated territories.
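A GRNN prediction is a Gaussian-kernel weighted average of the training values (the Nadaraya-Watson estimator), with the kernel width tuned by cross-validation as the paper describes. A minimal isotropic sketch; the data layout and leave-one-out criterion below are illustrative assumptions:

```python
import math

def grnn_predict(samples, query, sigma):
    """GRNN / Nadaraya-Watson: Gaussian-weighted average of training values.

    samples: list of ((x, y), value); query: (x, y); sigma: kernel width.
    """
    num = den = 0.0
    for (x, y), value in samples:
        d2 = (x - query[0]) ** 2 + (y - query[1]) ** 2
        w = math.exp(-d2 / (2.0 * sigma ** 2))
        num += w * value
        den += w
    return num / den

def loo_cv_error(samples, sigma):
    """Leave-one-out cross-validation MSE, used to tune sigma automatically."""
    err = 0.0
    for i, (point, value) in enumerate(samples):
        rest = samples[:i] + samples[i + 1:]
        err += (grnn_predict(rest, point, sigma) - value) ** 2
    return err / len(samples)
```

Scanning `sigma` over a grid and keeping the value that minimises `loo_cv_error` is one simple form of the automatic tuning mentioned above.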


Summary: Heritabilities and repeatabilities of harness racing performance measures based on individual race results


This paper deals with the goodness of the Gaussian assumption when designing second-order blind estimation methods in the context of digital communications. The low- and high-signal-to-noise ratio (SNR) asymptotic performance of the maximum likelihood estimator (derived assuming Gaussian transmitted symbols) is compared with the performance of the optimal second-order estimator, which exploits the actual distribution of the discrete constellation. The asymptotic study concludes that the Gaussian assumption leads to the optimal second-order solution if the SNR is very low or if the symbols belong to a multilevel constellation such as quadrature-amplitude modulation (QAM) or amplitude-phase-shift keying (APSK). On the other hand, the Gaussian assumption can yield important losses at high SNR if the transmitted symbols are drawn from a constant-modulus constellation such as phase-shift keying (PSK) or continuous-phase modulations (CPM). These conclusions are illustrated for the problem of direction-of-arrival (DOA) estimation of multiple digitally modulated signals.


This paper analyzes the asymptotic performance of maximum likelihood (ML) channel estimation algorithms in wideband code division multiple access (WCDMA) scenarios. We concentrate on systems with periodic spreading sequences (period larger than or equal to the symbol span) where the transmitted signal contains a code division multiplexed pilot for channel estimation purposes. First, the asymptotic covariances of the training-only, semi-blind conditional maximum likelihood (CML), and semi-blind Gaussian maximum likelihood (GML) channel estimators are derived. Then, these formulas are further simplified assuming randomized spreading and training sequences under the approximation of high spreading factors and a high number of codes. The results provide a useful tool to describe the performance of the channel estimators as a function of basic system parameters such as the number of codes, spreading factors, or traffic-to-training power ratio.


In this paper, the theory of hidden Markov models (HMM) is applied to the problem of blind (without training sequences) channel estimation and data detection. Within an HMM framework, the Baum-Welch (BW) identification algorithm is frequently used to find maximum-likelihood (ML) estimates of the corresponding model. However, such a procedure assumes the model (i.e., the channel response) to be static throughout the observation sequence. By means of introducing a parametric model for time-varying channel responses, a version of the algorithm which is more appropriate for mobile channels [time-dependent Baum-Welch (TDBW)] is derived. Aiming to compare algorithm behavior, a set of computer simulations for a GSM scenario is provided. Results indicate that, in comparison to other BW versions of the algorithm, the TDBW approach attains a remarkable enhancement in performance. For that purpose, only a moderate increase in computational complexity is needed.
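The BW algorithm rests on forward-backward recursions over the HMM trellis. As a minimal sketch of the forward pass used for likelihood evaluation in the static-channel case (not the authors' TDBW variant; the toy model in the test below is arbitrary):

```python
def hmm_forward(pi, A, B, obs):
    """Forward algorithm: P(obs | HMM) via the alpha recursion.

    pi[i]: initial state probabilities; A[i][j]: transition probabilities;
    B[i][o]: emission probabilities; obs: sequence of observation indices.
    """
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]        # initialisation
    for o in obs[1:]:                                       # induction step
        alpha = [sum(alpha[i] * A[i][j] for i in range(n)) * B[j][o]
                 for j in range(n)]
    return sum(alpha)                                       # termination
```

Baum-Welch re-estimates `A` and `B` from these quantities; the TDBW variant additionally lets the channel response vary with time through a parametric model.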