932 results for "Challenge posed by omics data to compositional analysis - paucity of independent samples (n)"


Relevance: 100.00%

Abstract:

Background: Heart failure is a severe complication associated with doxorubicin (DOX) use. Strain, assessed by two-dimensional speckle-tracking echocardiography (2D-STE), has been shown to be useful in identifying subclinical ventricular dysfunction. Objectives: a) To investigate the role of strain in the identification of subclinical ventricular dysfunction in patients who used DOX; b) to investigate determinants of the strain response in these patients. Methods: Cross-sectional study with 81 participants: 40 patients who had used DOX approximately 2 years before the study and 41 controls. All participants had a left ventricular ejection fraction (LVEF) ≥ 55%. The total dose of DOX was 396 mg (242 mg/m²). LV systolic function was evaluated by LVEF (Simpson's method), as well as by longitudinal (εLL), circumferential (εCC), and radial (εRR) strain. Multivariate linear regression (MLR) analysis was performed using εLL (model 1) and εCC (model 2) as dependent variables. Results: Systolic and diastolic blood pressure values were higher in the control group (p < 0.05). εLL was lower in the DOX group (-12.4 ± 2.6%) than in controls (-13.4 ± 1.7%; p = 0.044). The same occurred with εCC: -12.1 ± 2.7% (DOX) versus -16.7 ± 3.6% (controls; p < 0.001). The S' wave velocity was lower in the DOX group (p = 0.035). On MLR, DOX was an independent predictor of reduced εCC (B = -4.429, p < 0.001), while DOX (B = -1.289, p = 0.012) and age (B = -0.057, p = 0.029) were independent markers of reduced εLL. Conclusion: a) εLL, εCC, and the S' wave are reduced in patients who used DOX approximately 2 years prior to the study despite a normal LVEF, suggesting the presence of subclinical ventricular dysfunction; b) DOX was an independent predictor of reduced εCC; c) prior use of DOX and age were independent markers of reduced εLL.
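To illustrate the regression step described in this abstract, the following is a minimal sketch of a multivariate linear regression with strain as the dependent variable and DOX exposure and age as predictors. The file name, column names, and model specification are assumptions for illustration only, not the authors' actual analysis code.

```python
# Minimal sketch of the MLR step (model 2: circumferential strain, εCC).
# Assumptions: a CSV with one row per participant and hypothetical columns
# 'ecc' (circumferential strain, %), 'dox' (1 = prior DOX use, 0 = control),
# and 'age' (years). Illustrative only, not the study's actual code.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("strain_data.csv")  # hypothetical data file

# Ordinary least squares: εCC explained by DOX exposure and age.
model = smf.ols("ecc ~ dox + age", data=df).fit()

# The coefficient on 'dox' plays the role of the B value reported in the
# abstract: an independent association between DOX use and reduced strain.
print(model.summary())
```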

Relevance: 100.00%

Abstract:

This paper is devoted to the evaluation of performance bottlenecks and algorithmic deficiencies in contemporary reliable multicast networking. Specifically, the impact of packet delay jitter on the end-to-end performance of multicast IP data transport is investigated. A series of tests was performed and analyzed with the two most significant open-source implementations of reliable multicast: the UDP-based File Transfer Protocol (UFTP) and NACK-Oriented Reliable Multicast (NORM). The tests were designed to simulate a content-distribution scenario in WAN-sized Content Delivery Networks (CDNs). The results were then grouped and averaged by round-trip time (RTT) and packet loss rate, which made it possible to observe the influence of jitter independently of RTT and packet loss. The influence of jitter was characterized for different network conditions, confirming that even small jitter causes a significant reduction in data rate.
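The grouping-and-averaging step described above could look roughly like the following sketch, which aggregates per-run throughput measurements by RTT, packet loss rate, and jitter so that the effect of jitter can be read off within each fixed (RTT, loss) combination. The column names and file layout are assumptions, not the paper's actual tooling.

```python
# Sketch of grouping test results by RTT and packet loss and averaging data rate.
# Assumptions: a CSV with hypothetical columns 'protocol' (UFTP or NORM),
# 'rtt_ms', 'loss_pct', 'jitter_ms', and 'rate_mbps', one row per test run.
import pandas as pd

results = pd.read_csv("multicast_tests.csv")  # hypothetical results file

# Average achieved data rate per protocol for each (RTT, loss, jitter) setting,
# so jitter's impact is visible with RTT and loss held fixed.
summary = (
    results
    .groupby(["protocol", "rtt_ms", "loss_pct", "jitter_ms"])["rate_mbps"]
    .mean()
    .reset_index()
)
print(summary.to_string(index=False))
```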