994 results for BLOOD-STREAM
Abstract:
Aim: To determine the effects of imperfect adherence (i.e. occasionally missing prescribed doses), and the influence of the rate of loss of antihypertensive effect during treatment interruption, on the predicted clinical effectiveness of antihypertensive drugs in reducing mean systolic blood pressure (SBP) and cardiovascular disease (CVD) risk. Method: The effects of imperfect adherence to antihypertensive treatment regimens were estimated using published patterns of missed doses, taking into account the rate of loss of antihypertensive effect when doses are missed (loss of BP reduction in mmHg/day; the off-rate), which varies between drugs. Outcome measures were the predicted mean SBP reduction and CVD risk, determined from the Framingham Risk Equation for CVD. Results: In patients taking 75% of prescribed doses (typical of clinical practice), only long-acting drugs with an off-rate of approximately 1 mmHg/day were predicted to maintain almost the full mean SBP-lowering effect throughout the modelled period. In such patients, using shorter-acting drugs (e.g. an off-rate of approximately 5-6 mmHg/day) was predicted to lead to a clinically relevant loss of mean SBP reduction of >2 mmHg. This change also influenced the predicted CVD risk reduction: in patients with a baseline 10-year CVD risk of 27.0% who were taking 75% of prescribed doses, a difference in off-rate from 1 to 5 mmHg/day led to a predicted 0.5% absolute increase in 10-year CVD risk. Conclusions: In patients who occasionally miss doses of antihypertensives, modest differences in the rate of loss of antihypertensive effect following treatment interruption may have a clinically relevant impact on SBP reduction and CVD risk. While clinicians should make every effort to counsel and encourage each patient to adhere to their prescribed medication, it may also be prudent to prescribe drugs with a low off-rate to mitigate the potential consequences of missed doses.
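The off-rate concept in this abstract can be sketched numerically. The following is a minimal illustrative model, not the paper's actual simulation: the treatment effect is assumed to reset to its full value on each dosed day and to decay linearly at the off-rate on missed days. The full effect of 10 mmHg, the 28-day window, and the every-fourth-dose-missed pattern are all hypothetical choices for illustration.

```python
def mean_sbp_reduction(adherence_pattern, full_effect=10.0, off_rate=1.0):
    """Mean SBP reduction (mmHg) over a period, given a boolean
    dose-taken pattern. Toy model: the effect resets to full_effect
    on a dosed day and declines by off_rate mmHg for each consecutive
    missed day, floored at zero."""
    effect = full_effect
    total = 0.0
    for took_dose in adherence_pattern:
        if took_dose:
            effect = full_effect
        else:
            effect = max(0.0, effect - off_rate)
        total += effect
    return total / len(adherence_pattern)

# 75% adherence: every fourth dose missed, over a 28-day window.
pattern = [i % 4 != 3 for i in range(28)]
long_acting = mean_sbp_reduction(pattern, off_rate=1.0)   # 9.75 mmHg mean reduction
short_acting = mean_sbp_reduction(pattern, off_rate=5.0)  # 8.75 mmHg mean reduction
```

Even with this simple missed-dose pattern, the higher off-rate drug loses a larger share of its mean SBP-lowering effect, which is the qualitative point of the abstract; longer treatment interruptions would widen the gap further.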
Abstract:
This article extends the existing literature on probabilistic inference and decision making for the continuous hypotheses that are prevalent in forensic toxicology. Its main aim is to investigate the properties of a widely followed approach for quantifying the level of toxic substances in blood samples, and to compare this procedure with a Bayesian probabilistic approach. As an example, attention is confined to the presence of toxic substances, such as THC, in blood from car drivers. In this context, the interpretation of results from laboratory analyses needs to take into account legal requirements for establishing the 'presence' of target substances in blood. In a first part, the performance of the proposed Bayesian model for estimating an unknown parameter (here, the amount of a toxic substance) is illustrated and compared with the currently used method. In a second part, the model is used to approach, in a rational way, the decision component of the problem, that is, judicial questions of the kind 'Is the quantity of THC measured in the blood over the legal threshold of 1.5 μg/l?'. This is illustrated through a practical example.
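The decision question in this abstract, whether the true concentration exceeds a legal threshold, can be sketched with a toy conjugate-normal computation. This is a hedged illustration, not the paper's model: the flat prior, the measurement standard deviation `sigma`, and the example readings are all assumptions made here for illustration.

```python
import math

def prob_above_threshold(measurements, sigma=0.1, threshold=1.5):
    """Posterior probability that the true concentration exceeds the
    threshold, under a normal measurement model with known standard
    deviation sigma and a flat prior on the true concentration.
    With a flat prior, the posterior is Normal(xbar, sigma/sqrt(n))."""
    n = len(measurements)
    xbar = sum(measurements) / n
    post_sd = sigma / math.sqrt(n)
    z = (threshold - xbar) / post_sd
    # P(theta > threshold) = 1 - Phi(z), via the error function
    return 0.5 * (1.0 - math.erf(z / math.sqrt(2.0)))

# Hypothetical replicate readings in μg/l, slightly above the 1.5 limit
p = prob_above_threshold([1.6, 1.7, 1.55], sigma=0.1)
```

Framing the question as a posterior probability, rather than a point estimate compared to the limit, is what lets the decision step incorporate measurement uncertainty explicitly.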
Abstract:
Background: Numerous studies have shown a negative association between birth weight (BW) and blood pressure (BP) later in life. To estimate the direct effect of BW on BP, it is conventional to condition on current weight (CW). However, such conditioning can induce collider stratification bias in the estimate of the direct effect. Objective: To bound the potential bias due to U, an unmeasured common cause of CW and BP, on the estimate of the (controlled) direct effect of BW on BP. Methods: Data from a school-based study in Switzerland were used (N = 4,005; 2,010 B/1,995 G; mean age: 12.3 yr [range: 10.1-14.9]). Measured common causes of BW-BP (SES, smoking, body weight, and hypertension status of the mother) and CW-BP (breastfeeding and child's physical activity and diet) were identified with DAGs. Linear regression models were fitted to estimate the association between BW and BP. Sensitivity analyses were conducted to assess the potential effect of U on the association between BW and BP. U was assumed 1) to be a binary variable that affected BP by the same magnitude in low BW and in normal BW children and 2) to have a different prevalence in low BW children and in normal BW children for a given CW. Results: A small negative association was observed between BW and BP [beta: -0.3 mmHg/kg (95% CI: -0.9 to 0.3)]. The association was strengthened upon conditioning on CW [beta: -1.5 mmHg/kg (95% CI: -2.1 to -0.9)]. Upon further conditioning on common causes of BW-BP and CW-BP, the association did not change substantially [beta: -1.4 mmHg/kg (95% CI: -2.0 to -0.8)]. The negative association could be explained by U only if U was strongly associated with BP and if there was a large difference in the prevalence of U between low BW and normal BW children. Conclusion: The observed negative association between BW and BP upon adjustment for CW was not easily explained by an unmeasured common cause of CW and BP.
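The sensitivity analysis described in this abstract can be reduced to a back-of-envelope bound. The sketch below assumes, as in the abstract's setup, a binary U with a constant effect on BP and a differing prevalence across BW strata at a given CW; the difference-in-prevalence formula and the example numbers are illustrative and are not taken from the paper.

```python
def bias_from_unmeasured_u(gamma_mmhg, prev_u_low_bw, prev_u_normal_bw):
    """Approximate bias in the CW-adjusted BW-BP contrast induced by a
    binary unmeasured common cause U: the effect of U on BP times the
    difference in U's prevalence between low-BW and normal-BW strata."""
    return gamma_mmhg * (prev_u_low_bw - prev_u_normal_bw)

# Hypothetical scenario: U raises BP by 5 mmHg and, at a given current
# weight, is present in 40% of low-BW vs 10% of normal-BW children.
bias = bias_from_unmeasured_u(5.0, 0.40, 0.10)  # ≈ 1.5 mmHg
```

A bias of this size would require both a strong U-BP effect and a large prevalence difference, which mirrors the abstract's conclusion that only an extreme unmeasured common cause could explain away the observed association.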
Abstract:
The availability of stored red blood cells (RBCs) for transfusion remains an important aspect of the treatment of polytrauma, acute anemia, and major bleeding. RBCs are prepared by blood banks from whole-blood donations and stored in the cold in additive solutions for typically six weeks. These far-from-physiological storage conditions result in the so-called red cell storage lesion, which is of importance both to blood bankers and to clinical practitioners. Here we review the current state of knowledge about the red cell storage lesion from a proteomic perspective. In particular, we describe the current models accounting for RBC aging and response to lethal stresses, review the published proteomic studies carried out to uncover the molecular basis of the RBC storage lesion, and conclude by suggesting a few possible proteomic studies that would provide further knowledge of the molecular alterations carried by RBCs stored in the cold for six weeks.