971 results for blood sampling
Abstract:
The increased use of vancomycin in hospitals has made monitoring of serum vancomycin levels standard practice because of possible nephrotoxicity. However, routine monitoring of vancomycin serum concentration is under criticism, and its cost effectiveness is in question, because frequent monitoring neither increases efficacy nor decreases nephrotoxicity. The purpose of the present study was to determine factors that may place patients at increased risk of developing vancomycin-induced nephrotoxicity and for whom monitoring may be most beneficial. From September to December 1992, 752 consecutive inpatients at The University of Texas M. D. Anderson Cancer Center, Houston, were prospectively evaluated for nephrotoxicity in order to describe predictive risk factors for developing vancomycin-related nephrotoxicity. Ninety-five patients (13 percent) developed nephrotoxicity. A total of 299 patients (40 percent) were considered monitored (vancomycin serum levels determined during the course of therapy), and 346 patients (46 percent) were receiving concurrent moderately to highly nephrotoxic drugs. Factors found to be significantly associated with nephrotoxicity in univariate analysis were: gender, baseline serum creatinine greater than 1.5 mg/dl, monitoring, leukemia, concurrent moderately to highly nephrotoxic drugs, and APACHE III scores of 40 or more.
Significant factors from the univariate analysis were then entered into a stepwise logistic regression analysis to determine independent predictive risk factors for vancomycin-induced nephrotoxicity. Factors selected by stepwise logistic regression as predictive of vancomycin-induced nephrotoxicity, with their corresponding odds ratios and 95% confidence limits, were: concurrent therapy with moderately to highly nephrotoxic drugs (2.89; 1.76-4.74), APACHE III scores of 40 or more (1.98; 1.16-3.38), and male gender (1.98; 1.04-2.71). Subgroup (monitored and non-monitored) analysis showed that male gender (OR = 1.87; 95% CI = 1.01, 3.45) and moderately to highly nephrotoxic drugs (OR = 4.58; 95% CI = 2.11, 9.94) were significant for nephrotoxicity in monitored patients. However, only APACHE III score (OR = 2.67; 95% CI = 1.13, 6.29) was significant for nephrotoxicity in non-monitored patients. The conclusion drawn from this study is that not every patient receiving vancomycin therapy needs frequent monitoring of vancomycin serum levels. Such routine monitoring may be appropriate in patients with one or more of the identified risk factors, while low-risk patients need not be subjected to the discomfort and added cost of multiple blood sampling. Such prudent selection of patients to monitor may decrease costs to patients and the hospital.
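Univariate odds ratios and Wald confidence intervals of the kind quoted above can be computed directly from a 2x2 exposure-outcome table. A minimal Python sketch, using hypothetical counts (not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts, for illustration only:
or_, lo, hi = odds_ratio_ci(60, 286, 35, 371)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # → 2.22 1.43 3.47
```

An interval excluding 1.0, as here, corresponds to a factor deemed significant in the univariate screen.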
Abstract:
Until recently, measurements of energy expenditure (EE; herein defined as heat production) in respiration chambers did not account for the extra energy requirements of grazing dairy cows on pasture. As energy is first limiting in most pasture-based milk production systems, its efficient use is important. Therefore, the aim of the present study was to compare EE, which can be affected by differences in body weight (BW), body composition, grazing behavior, physical activity, and milk production level, in 2 Holstein cow strains. Twelve Swiss Holstein-Friesian (HCH; 616 kg of BW) and 12 New Zealand Holstein-Friesian (HNZ; 570 kg of BW) cows in the third stage of lactation were paired according to their stage of lactation and kept in a rotational, full-time grazing system without concentrate supplementation. After adaptation, daily milk yield, grass intake (using the alkane double-indicator technique), nutrient digestibility, physical activity, and grazing behavior (recorded by an automatic jaw movement recorder) were investigated over 7 d. Using the ¹³C bicarbonate dilution technique in combination with an automatic blood sampling system, EE based on measured carbon dioxide production was determined in 1 cow pair per day between 0800 and 1400 h. The HCH were heavier and had a lower body condition score compared with the HNZ, but the difference in BW was smaller than in former studies. Milk production, grass intake, and nutrient digestibility did not differ between the 2 cow strains, but the HCH grazed for a longer time during the 6-h measurement period and performed more grazing mastications than the HNZ. No difference was found between the 2 cow strains with regard to EE (291 ± 15.6 kJ per kilogram of metabolic BW), mainly due to high between-animal variation in EE.
As efficiency and energy use are important in sustainable, pasture-based, organic milk production systems, the determining factors for EE, such as methodology, genetics, physical activity, grazing behavior, and pasture quality, should be investigated and quantified in more detail in future studies.
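The "per kilogram of metabolic BW" scaling used above is the standard Kleiber exponent, BW^0.75. A small illustrative Python sketch using the body weights reported in the abstract (the daily-EE argument in the helper is a placeholder, not a study value):

```python
def metabolic_bw(bw_kg):
    """Metabolic body weight (Kleiber scaling), BW^0.75, in kg^0.75."""
    return bw_kg ** 0.75

def ee_per_mbw(ee_kj, bw_kg):
    """Energy expenditure normalised to metabolic body weight (kJ/kg^0.75)."""
    return ee_kj / metabolic_bw(bw_kg)

# Body weights from the abstract:
print(round(metabolic_bw(616), 1))  # HCH → 123.6
print(round(metabolic_bw(570), 1))  # HNZ → 116.7
```

Normalising to BW^0.75 rather than raw BW is what makes EE comparable between strains of different size.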
Abstract:
BACKGROUND Venous thromboembolism (VTE) and subclinical thyroid dysfunction (SCTD) are both common in elderly patients. SCTD has been related to a hypercoagulable state and increased thromboembolic risk. However, prospective data on the relationship between SCTD and VTE are lacking. OBJECTIVES To investigate the relationship between SCTD and recurrent VTE (rVTE), all-cause mortality, and thrombophilic biomarkers. PATIENTS Elderly participants with VTE. METHODS In a prospective multicenter cohort, thyroid hormones and thrombophilic biomarkers were measured 1 year after acute VTE, as both may be influenced by acute thrombosis. We defined subclinical hypothyroidism (SHypo) as elevated thyroid-stimulating hormone levels (TSH 4.50-19.99 mIU/L) and subclinical hyperthyroidism (SHyper) as TSH < 0.45 mIU/L, both with normal free thyroxine levels. Outcomes were the incidence of rVTE and overall mortality during follow-up starting after the 1-year blood sampling. RESULTS Of 561 participants (58% with anticoagulation), 6% had SHypo and 5% SHyper. After a mean follow-up of 20.8 months, 9% developed rVTE and 10% died. The rVTE incidence rate was 7.2 (95% confidence interval: 2.7-19.2) per 100 patient-years in SHypo, 0.0 (0.0-7.6) in SHyper and 5.9 (4.4-7.8) in euthyroid participants. In multivariate analyses, the sub-hazard ratio (SHR) for rVTE was 0.00 (0.00-0.58) in SHyper and 1.50 (0.52-4.34) in SHypo compared with euthyroid participants, without increased thrombophilic biomarkers. SHyper (HR 0.80, 0.23-2.81) and SHypo (HR 0.99, 0.30-3.29) were not associated with mortality. CONCLUSION In elderly patients, SHyper may be associated with lower rVTE risk. SHypo showed a non-statistically significant pattern of association with rVTE, without increased mortality or differences in thrombophilic biomarkers.
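The incidence rates per 100 patient-years quoted above are crude rates: event counts divided by accumulated follow-up time. A minimal Python sketch with hypothetical counts (the event and person-year figures below are invented for illustration):

```python
def incidence_rate(events, person_years, per=100):
    """Crude incidence rate per `per` person-years of follow-up."""
    return events / person_years * per

# Hypothetical subgroup: 3 recurrent events over 41.5 patient-years
print(round(incidence_rate(3, 41.5), 1))  # → 7.2
```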
Abstract:
We present the first reference ranges for hematology (n = 35 animals), serum biochemistry (n = 62), and serum protein electrophoresis (n = 32) in physically restrained free-ranging roe deer (Capreolus capreolus). Animals were captured in box traps and physically restrained for blood sampling during the winter in Sweden, 2011-13. No clinically significant sex or age differences were found.
Abstract:
Background Malabsorptive bariatric surgery requires lifelong micronutrient supplementation. Based on the recommendations, we assessed the number of adjustments of micronutrient supplementation and the prevalence of vitamin and mineral deficiencies at a minimum follow-up of 5 years after biliopancreatic diversion with duodenal switch (BPD-DS). Methods Between October 2010 and December 2013, a total of 51 patients at a minimum follow-up of 5 years after BPD-DS were invited for a clinical check-up with a nutritional blood screening test for vitamins and minerals. Results Forty-three of fifty-one patients (84.3%) completed the blood sampling, with a median follow-up of 71.2 (range 60-102) months after BPD-DS. At that time, all patients were supplemented with at least one multivitamin. However, 35 patients (81.4%) showed either a vitamin or a mineral deficiency, or a combination of both. Nineteen patients (44.1%) were anemic, and 17 patients (39.5%) had an iron deficiency. High deficiency rates for the fat-soluble vitamins were also present: 23.2% for vitamin A, 76.7% for vitamin D, 7.0% for vitamin E, and 11.6% for vitamin K. Conclusions The results of our study show that the prevalence of vitamin and mineral deficiencies after BPD-DS is 81.4% at a minimum follow-up of 5 years. The initial prescription of micronutrient supplementation and further adjustments during the first follow-up were insufficient to avoid long-term micronutrient deficiencies. Life-long monitoring of micronutrients at a specialized bariatric center, and possibly better micronutrient supplementation, is crucial to avoid a deficient micronutrient status at every stage after malabsorptive bariatric surgery.
Abstract:
Existing methods for assessing protein synthetic rates (PSRs) in human skeletal muscle are invasive and do not readily provide information about individual muscle groups. Recent studies in canine skeletal muscle yielded PSRs similar to results of simultaneous stable isotope measurements using l-[1-13C, methyl-2H3]methionine, suggesting that positron-emission tomography (PET) with l-[methyl-11C]methionine could be used along with blood sampling and a kinetic model to provide a less invasive, regional assessment of PSR. We have extended and refined this method in an investigation with healthy volunteers studied in the postabsorptive state. They received ≈25 mCi of l-[methyl-11C]methionine with serial PET imaging of the thighs and arterial blood sampling for a period of 90 min. Tissue and metabolite-corrected arterial blood time activity curves were fitted to a three-compartment model. PSR (nmol methionine⋅min−1⋅g muscle tissue−1) was calculated from the fitted parameter values and the plasma methionine concentrations, assuming equal rates of protein synthesis and degradation. Pooled mean PSR for the anterior and posterior sites was 0.50 ± 0.040. When converted to a fractional synthesis rate for mixed proteins in muscle, assuming a protein-bound methionine content of muscle tissue, the value of 0.125 ± 0.01%⋅h−1 compares well with estimates from direct tracer incorporation studies, which generally range from ≈0.05 to 0.09%⋅h−1. We conclude that PET can be used to estimate skeletal muscle PSR in healthy human subjects and that it holds promise for future in vivo, noninvasive studies of the influences of physiological factors, pharmacological manipulations, and disease states on this important component of muscle protein turnover and balance.
Abstract:
The micronutrient selenium is essential to human physiology. As the amino acid selenocysteine, it is incorporated into selenoproteins with a wide range of functions, including antioxidant capacity, thyroid hormone metabolism, immune function, brain function, fertility and reproduction. Low selenium status has been associated with increased risk for chronic diseases such as cancer, type-2 diabetes and cardiovascular disease. In this context, several studies have been conducted to investigate whether selenium supplementation could reduce the risk of such diseases. However, genetic variation may interfere with the response of individuals to a dietary intervention and must be considered an important source of inter-individual variation. Therefore, this study was conducted to investigate the influence of genetic variation in selenoprotein genes on the response to an intervention with Brazil nuts, the richest known dietary source of selenium. The study included 130 healthy volunteers of both sexes, aged 20 to 60 years, recruited at the University of São Paulo. They received nuts for 8 weeks, eating one nut a day, followed by a washout period of a further 8 weeks. All volunteers had blood samples collected every 4 weeks over 4 months, for a total of five collections. The following analyses were performed: anthropometric measurements, lipid profile, plasma malondialdehyde, plasma and erythrocyte Se, selenoprotein P, plasma and erythrocyte GPx activity, and gene expression of GPX1, SEPP1, SELS and SEP15. The volunteers were also genotyped for SNPs rs1050450, rs3811699, rs1800699, rs713041, rs3877899, rs7579, rs34713741 and rs5845. Each Brazil nut provided an average of 300 µg of selenium. All 130 volunteers completed the protocol. The concentrations of total cholesterol and glucose decreased after 8 weeks of supplementation. Moreover, HDL concentrations were higher for carriers of the variant T allele for GPX4_rs713041.
The frequencies of the variant genotypes were 5.4% for rs1050450, rs3811699 and rs1088668, 10% for rs3877899, 19.2% for rs713041 and rs7579, 11.5% for rs5845, and 8.5% for rs34713741. The levels of the five biomarkers increased significantly after supplementation. In addition, erythrocyte GPx activity was influenced by rs1050450, rs713041 and rs5845; erythrocyte selenium was influenced by rs5845, and plasma selenium by rs3877899. Gene expression of GPX1, SEPP1 and SEP15 was higher after supplementation. The SNP rs1050450 influenced GPX1 mRNA expression, and rs7579 influenced SEPP1 mRNA expression. It can therefore be concluded that supplementation with one Brazil nut a day for 8 weeks was efficient in reducing total cholesterol and glucose levels and in increasing the concentrations of the main biomarkers of selenium status in healthy adults. Furthermore, our results suggest that GPX4_rs713041 may influence HDL concentrations and GPx1 activity, GPX1_rs1050450 may influence GPx1 activity, SEP15_rs5845 may influence GPx1 activity and erythrocyte selenium, and SEPP1_rs3877899 may influence plasma Se levels. The effect of genetic variation should therefore be considered in future nutritional interventions evaluating the response to Brazil nut supplementation.
Abstract:
Recently, methods for computing D-optimal designs for population pharmacokinetic studies have become available. However, there are few publications that have prospectively evaluated the benefits of D-optimality in population or single-subject settings. This study compared a population optimal design with an empirical design for estimating the base pharmacokinetic model for enoxaparin in a stratified randomized setting. The population pharmacokinetic D-optimal design for enoxaparin was estimated using the PFIM function (MATLAB version 6.0.0.88). The optimal design was based on a one-compartment model with lognormal between-subject variability and proportional residual variability, and consisted of a single design with three sampling windows (0-30 min, 1.5-5 hr and 11-12 hr post-dose) for all patients. The empirical design consisted of three sample time windows per patient from a total of nine windows that collectively represented the entire dose interval. Each patient was assigned to have one blood sample taken from three different windows. Windows for blood sampling times were also provided for the optimal design. Ninety-six patients who were currently receiving enoxaparin therapy were recruited into the study. Patients were randomly assigned to either the optimal or the empirical sampling design, stratified by body mass index. The exact times of blood samples and doses were recorded. Analysis was undertaken using NONMEM (version 5). The empirical design supported a one-compartment linear model with additive residual error, while the optimal design supported a two-compartment linear model with additive residual error, as did the model derived from the full data set. A posterior predictive check was performed in which the models arising from the empirical and optimal designs were used to predict into the full data set.
This revealed that the model derived from the optimal design was superior to the empirical design model in terms of precision and was similar to the model developed from the full dataset. This study suggests that optimal design techniques may be useful, even when the optimized design is based on a model that is misspecified in terms of the structural and statistical models, and when the implementation of the optimally designed study deviates from the nominal design.
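The base one-compartment model with first-order elimination on which the optimal design was built predicts C(t) = (Dose/V)·exp(-(CL/V)·t) after an IV bolus. A sketch in Python with purely hypothetical parameter values (not estimates from this study), evaluated near the optimal design's three sampling windows:

```python
import math

def conc_one_cmt(dose_mg, cl_l_per_hr, v_l, t_hr):
    """One-compartment IV bolus model: C(t) = (dose/V) * exp(-(CL/V) * t)."""
    return dose_mg / v_l * math.exp(-(cl_l_per_hr / v_l) * t_hr)

# Hypothetical dose 40 mg, CL 1.0 L/hr, V 5.0 L; times near the
# 0-30 min, 1.5-5 hr and 11-12 hr sampling windows:
for t in (0.25, 3.0, 11.5):
    print(round(conc_one_cmt(40, 1.0, 5.0, t), 2))
```

Sampling early, mid-interval and late in this way is what lets the design discriminate distribution from elimination phases.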
Abstract:
The aim of this review is to analyse critically the recent literature on the clinical pharmacokinetics and pharmacodynamics of tacrolimus in solid organ transplant recipients. Dosage and target concentration recommendations for tacrolimus vary from centre to centre, and large pharmacokinetic variability makes it difficult to predict what concentration will be achieved with a particular dose or dosage change. Therapeutic ranges have not been based on statistical approaches. The majority of pharmacokinetic studies have involved intense blood sampling in small homogeneous groups in the immediate post-transplant period. Most have used nonspecific immunoassays and provide little information on pharmacokinetic variability. Demographic investigations seeking correlations between pharmacokinetic parameters and patient factors have generally looked at one covariate at a time and have involved small patient numbers. Factors reported to influence the pharmacokinetics of tacrolimus include the patient group studied, hepatic dysfunction, hepatitis C status, time after transplantation, patient age, donor liver characteristics, recipient race, haematocrit and albumin concentrations, diurnal rhythm, food administration, corticosteroid dosage, diarrhoea and cytochrome P450 (CYP) isoenzyme and P-glycoprotein expression. Population analyses are adding to our understanding of the pharmacokinetics of tacrolimus, but such investigations are still in their infancy. A significant proportion of model variability remains unexplained. Population modelling and Bayesian forecasting may be improved if CYP isoenzymes and/or P-glycoprotein expression could be considered as covariates. Reports have been conflicting as to whether low tacrolimus trough concentrations are related to rejection. Several studies have demonstrated a correlation between high trough concentrations and toxicity, particularly nephrotoxicity. 
The best predictor of pharmacological effect may be drug concentrations in the transplanted organ itself. Researchers have started to question current reliance on trough measurement during therapeutic drug monitoring, with instances of toxicity and rejection occurring when trough concentrations are within 'acceptable' ranges. The correlation between blood concentration and drug exposure can be improved by use of non-trough timepoints. However, controversy exists as to whether this will provide any great benefit, given the added complexity in monitoring. Investigators are now attempting to quantify the pharmacological effects of tacrolimus on immune cells through assays that measure in vivo calcineurin inhibition and markers of immunosuppression such as cytokine concentrations. To date, no studies have correlated pharmacodynamic marker assay results with immunosuppressive efficacy, as determined by allograft outcome, or investigated the relationship between calcineurin inhibition and drug adverse effects. Little is known about the magnitude of the pharmacodynamic variability of tacrolimus.
Abstract:
Background: Fetal scalp lactate testing has been shown to be as useful as pH, with added benefits. One remaining question is: 'What level of lactate should trigger intervention in the first stage of labour?' Aims: This study aimed to establish the lactate level in the first stage of labour that indicates the need for intervention to ensure satisfactory outcomes for both babies and mothers. Methods: A prospective study at Mater Mothers' Hospital, Brisbane, Australia, a tertiary referral centre. One hundred and forty women in labour, with non-reassuring fetal heart rate traces, were tested using fetal scalp blood sampling of 5 µL of capillary blood tested on an Accusport (Boehringer Mannheim, East Sussex, UK) lactate meter. The decision to intervene in labour was based on clinical assessment plus a predetermined cut-off. Main outcome measures were Apgar scores, cord arterial pH, meconium-stained liquor and Intensive Care Nursery admission. Results: Two-graph receiver operating characteristic (TG-ROC) analysis showed that optimal specificity and sensitivity for predicting adverse neonatal outcomes was achieved at a scalp lactate level above 4.2 mmol/L. Conclusions: Fetal blood sampling remains the standard for further investigating non-reassuring cardiotocograph (CTG) traces. Even so, it is a poor predictor of fetal outcomes. Scalp lactate has been shown to be at least as good a predictor as scalp pH, with the advantages of being easier, cheaper and having a lower rate of technical failure. Our study found that a cut-off fetal scalp lactate level of 4.2 mmol/L, in combination with an assessment of the entire clinical picture, is a useful tool in identifying those women who need intervention.
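ROC-based cut-off selection of the kind used above amounts to scanning candidate thresholds for the best sensitivity/specificity trade-off. A simplified sketch using Youden's J statistic (a common criterion, though not necessarily the TG-ROC procedure of this study) on invented lactate values:

```python
def best_cutoff(values, outcomes):
    """Pick the cut-off maximising Youden's J (sensitivity + specificity - 1)
    over observed values. `outcomes` are 1 for adverse outcome, 0 otherwise."""
    best_j, best_c = -1.0, None
    for c in sorted(set(values)):
        tp = sum(1 for v, o in zip(values, outcomes) if v >= c and o == 1)
        fn = sum(1 for v, o in zip(values, outcomes) if v < c and o == 1)
        tn = sum(1 for v, o in zip(values, outcomes) if v < c and o == 0)
        fp = sum(1 for v, o in zip(values, outcomes) if v >= c and o == 0)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1
        if j > best_j:
            best_j, best_c = j, c
    return best_c, best_j

# Hypothetical scalp-lactate values (mmol/L) and adverse-outcome flags:
vals = [2.1, 3.0, 3.8, 4.2, 4.6, 5.1, 5.9]
outs = [0,   0,   0,   1,   0,   1,   1]
print(best_cutoff(vals, outs))  # → (4.2, 0.75)
```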
Abstract:
Relaxation of the upper age limits for solid organ transplantation coupled with improvements in post-transplant survival have resulted in greater numbers of elderly patients receiving immunosuppressant drugs such as tacrolimus. Tacrolimus is a potent agent with a narrow therapeutic window and large inter- and intraindividual pharmacokinetic variability. Numerous physiological changes occur with aging that could potentially affect the pharmacokinetics of tacrolimus and, hence, patient dosage requirements. Tacrolimus is primarily metabolised by cytochrome P450 (CYP) 3A enzymes in the gut wall and liver. It is also a substrate for P-glycoprotein, which counter-transports diffused tacrolimus out of intestinal cells and back into the gut lumen. Age-associated alterations in CYP3A and P-glycoprotein expression and/or activity, along with liver mass and body composition changes, would be expected to affect the pharmacokinetics of tacrolimus in the elderly. However, interindividual variation in these processes may mask any changes caused by aging. More investigation is needed into the impact aging has on CYP and P-glycoprotein activity and expression. No single-dose, intense blood-sampling study has specifically compared the pharmacokinetics of tacrolimus across different patient age groups. However, five population pharmacokinetic studies, one in kidney, one in bone marrow and three in liver transplant recipients, have investigated age as a co-variate. None found a significant influence for age on tacrolimus bioavailability, volume of distribution or clearance. The number of elderly patients included in each study, however, was not documented and may have been only small. It is likely that inter- and intraindividual pharmacokinetic variability associated with tacrolimus increase in elderly populations. 
In addition to pharmacokinetic differences, donor organ viability, multiple co-morbidities, polypharmacy and immunological changes need to be considered when using tacrolimus in the elderly. Aging is associated with decreased immunoresponsiveness, a slower body repair process and increased drug adverse effects. Elderly liver and kidney transplant recipients are more likely to develop new-onset diabetes mellitus than younger patients. Elderly transplant recipients exhibit higher mortality from infectious and cardiovascular causes than younger patients but may be less likely to develop acute rejection. Elderly kidney recipients have a higher potential for chronic allograft nephropathy, and a single rejection episode can be more devastating. There is a paucity of information on optimal tacrolimus dosage and target trough concentrations in the elderly. The therapeutic window for tacrolimus concentrations may be narrower. Further integrated pharmacokinetic-pharmacodynamic studies of tacrolimus are required. It would appear reasonable, based on current knowledge, to commence tacrolimus at similar doses to those used in younger patients. Maintenance dose requirements over the longer term may be lower in the elderly, but the increased variability in kinetics and the variety of factors that impact on dosage suggest that patient care needs to be based around more frequent monitoring in this age group.
Abstract:
This study aims to assess oxidative stress in leprosy patients under multidrug therapy (MDT; dapsone, clofazimine and rifampicin) by evaluating nitric oxide (NO) concentration, catalase (CAT) and superoxide dismutase (SOD) activities, glutathione (GSH) levels, total antioxidant capacity, lipid peroxidation, and methemoglobin formation. For this, we analyzed 23 leprosy patients and 20 healthy individuals from the Amazon region, Brazil, aged between 20 and 45 years. Blood sampling enabled the evaluation of leprosy patients prior to starting multidrug therapy (MDT 0) and through the third month of multidrug therapy (MDT 3). With regard to dapsone (DDS) plasma levels, we showed that there was no statistical difference between multibacillary (0.518±0.029 μg/mL) and paucibacillary (0.662±0.123 μg/mL) patients. Methemoglobin levels and numbers of Heinz bodies were significantly enhanced after the third MDT-supervised dose, but treatment did not significantly change lipid peroxidation or NO levels in these leprosy patients. In addition, CAT activity was significantly reduced in MDT-treated leprosy patients, while GSH content was increased in these patients. However, SOD and Trolox equivalent antioxidant capacity levels were similar in patients with and without treatment. These data suggest that MDT can reduce the activity of some antioxidant enzymes and influence ROS accumulation, which may induce hematological changes, such as methemoglobinemia, in patients with leprosy. We also explored some redox mechanisms associated with DDS and its main oxidative metabolite, DDS-NHOH, and the possible binding of DDS to the active site of CYP2C19 with the aid of molecular modeling software. © 2014 Schalcher et al.
Abstract:
Introduction: Polycystic ovary syndrome (PCOS), whose classic features are menstrual irregularity of the oligo-/amenorrhea type, chronic anovulation, infertility, and clinical and/or biochemical hyperandrogenism, is associated with aspects of the metabolic syndrome (MS), such as obesity and insulin resistance. The degree of obesity determines different levels of inflammation, increasing cytokines that participate in metabolic and endocrine functions and modulate the immune response. Metabolic changes, added to the imbalance of sex hormones underlying the irregular menstruation observed in PCOS, can trigger allergic processes, and elevation of total and specific IgE antibodies indicates that a sensitization process has begun. Objective: To evaluate the influence of PCOS on biochemical parameters and levels of total and specific IgE to aeroallergens in obese women. Methods: After approval by the Research Ethics Committee, 80 volunteers with BMI ≥ 30 kg/m2 and age between 18 and 45 years were recruited. Among these, 40 had PCOS according to the Rotterdam criteria and 40 were women without PCOS (control group). All participants were assessed with regard to anthropometric, clinical and gynecological parameters, interviewed using a questionnaire, and underwent blood sampling for clinical biochemistry tests (total cholesterol, LDL-cholesterol, HDL-cholesterol, triglycerides, fasting glucose, urea, creatinine, aspartate aminotransferase (AST) and alanine aminotransferase (ALT)) and immunological tests (total and specific IgE to Dermatophagoides pteronyssinus, Blomia tropicalis, Dermatophagoides farinae and Dermatophagoides microceras). Statistical analysis was performed using SPSS 15.0 software with the chi-square, Fisher and Student t tests and binary logistic regression, with a significance level of p < 0.05. Results: In the group of obese women with PCOS, 29 (72.5%) had variable menstrual cycles and 27 (67.5%) had difficulty getting pregnant.
According to the waist-hip ratio, a higher average was also observed in obese women with PCOS (0.87). Blood levels of HDL (36.9 mg/dL) and ALT (29.3 U/L) were altered relative to normal levels in obese women with PCOS, with a statistically significant relationship. In the analysis of total and specific IgE to D. pteronyssinus, high results were also prevalent in obese women with PCOS, with blood levels of 365.22 IU/mL and 6.83 kU/L, respectively, also statistically significant. Conclusions: A predominance of cases with high levels of total IgE was observed in the group of obese women with PCOS, in 28 (70%) of the participants, whose mean blood concentration was 365.22 IU/mL. In the analysis of specific IgE between the groups, the allergen Dermatophagoides pteronyssinus showed the greatest dispersion and the highest average sensitization results in the group of obese women with PCOS, whose mean blood concentration was 6.83 kU/L. Keywords: Obesity, Allergens, Polycystic Ovary Syndrome
Abstract:
La circulation extracorporelle (CEC) est une technique utilisée en chirurgie cardiaque effectuée des milliers de fois chaque jour à travers le monde. L’instabilité hémodynamique associée au sevrage de la CEC difficile constitue la principale cause de mortalité en chirurgie cardiaque et l’hypertension pulmonaire (HP) a été identifiée comme un des facteurs de risque les plus importants. Récemment, une hypothèse a été émise suggérant que l'administration prophylactique (avant la CEC) de la milrinone par inhalation puisse avoir un effet préventif et faciliter le sevrage de la CEC chez les patients atteints d’HP. Toutefois, cette indication et voie d'administration pour la milrinone n'ont pas encore été approuvées par les organismes réglementaires. Jusqu'à présent, la recherche clinique sur la milrinone inhalée s’est principalement concentrée sur l’efficacité hémodynamique et l'innocuité chez les patients cardiaques, bien qu’aucun biomarqueur n’ait encore été établi. La dose la plus appropriée pour l’administration par nébulisation n'a pas été déterminée, de même que la caractérisation des profils pharmacocinétiques (PK) et pharmacodynamiques (PD) suite à l'inhalation. L'objectif de notre recherche consistait à caractériser la relation exposition-réponse de la milrinone inhalée administrée chez les patients subissant une chirurgie cardiaque sous CEC. Une méthode analytique par chromatographie liquide à haute performance couplée à un détecteur ultraviolet (HPLC-UV) a été optimisée et validée pour le dosage de la milrinone plasmatique suite à l’inhalation et s’est avérée sensible et précise. La limite de quantification (LLOQ) était de 1.25 ng/ml avec des valeurs de précision intra- et inter-dosage moyennes (CV%) <8%. 
Des patients souffrant d’HP pour lesquels une chirurgie cardiaque sous CEC était prévue ont d’abord été recrutés pour une étude pilote (n=12) et, par la suite, pour une étude à plus grande échelle (n=28) où la milrinone (5 mg) était administrée par inhalation pré-CEC. Dans l'étude pilote, nous avons comparé l'exposition systémique de la milrinone peu après son administration avec un nébuliseur pneumatique ou un nébuliseur à tamis vibrant. L’efficacité des nébuliseurs en termes de dose émise et dose inhalée a également été déterminée in vitro. Dans l'étude à plus grande échelle conduite en utilisant exclusivement le nébuliseur à tamis vibrant, la dose inhalée in vivo a été estimée et le profil pharmacocinétique de la milrinone inhalée a été pleinement caractérisé aux niveaux plasmatique et urinaire. Le ratio de la pression artérielle moyenne sur la pression artérielle pulmonaire moyenne (PAm/PAPm) a été choisi comme biomarqueur PD. La relation exposition-réponse de la milrinone a été caractérisée pendant la période d'inhalation en étudiant la relation entre l'aire sous la courbe de l’effet (ASCE) et l’aire sous la courbe des concentrations plasmatiques (ASC) de chacun des patients. Enfin, le ratio PAm/PAPm a été exploré comme un prédicteur potentiel de sortie de CEC difficile dans un modèle de régression logistique. Les expériences in vitro ont démontré que les doses émises étaient similaires pour les nébuliseurs pneumatique (64%) et à tamis vibrant (68%). Cependant, la dose inhalée était 2-3 fois supérieure (46% vs 17%) avec le nébuliseur à tamis vibrant, et ce, en accord avec les concentrations plasmatiques. Chez les patients, en raison des variations au niveau des facteurs liés au circuit et au ventilateur causant une plus grande dose expirée, la dose inhalée a été estimée inférieure (30%) et cela a été confirmé après récupération de la dose de milrinone dans l'urine 24 h (26%). 
Maximum plasma concentrations (Cmax: 41-189 ng/ml) and the magnitude of the maximum response, ΔRmax-R0 (0-65%), were observed at the end of inhalation (10-30 min). The PK results are consistent with published data for intravenous milrinone. After the inhalation period, individual AUCEs were directly related to AUCs (P=0.045). Finally, our PD biomarker and CPB duration were identified as significant predictors of difficult weaning from CPB. Comparison of AUCs with their corresponding AUCEs provided preliminary data supporting a proof of concept for the use of the MAP/MPAP ratio as a promising PD biomarker and justifies future PK/PD studies. We were able to demonstrate that the change in the MAP/MPAP ratio in response to inhaled milrinone contributes to preventing difficult weaning from CPB.
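The exposure (AUC) and response (AUCE) metrics used in this kind of analysis are typically computed by the linear trapezoidal rule over the sampled time points. A minimal sketch, with entirely hypothetical concentration and effect data (the values below are illustrative, not the study's):

```python
def trapezoid_auc(times, values):
    """Area under the curve by the linear trapezoidal rule."""
    return sum((t2 - t1) * (v1 + v2) / 2
               for (t1, v1), (t2, v2) in zip(zip(times, values),
                                             zip(times[1:], values[1:])))

# Hypothetical sampling times (min) during and after a 10-min inhalation
t = [0, 5, 10, 20, 30]
conc = [0, 60, 150, 90, 40]    # plasma milrinone, ng/ml (illustrative)
effect = [0, 10, 35, 25, 12]   # % change in MAP/MPAP from baseline (illustrative)

auc = trapezoid_auc(t, conc)     # exposure metric (AUC)
auce = trapezoid_auc(t, effect)  # response metric (AUCE)
print(auc, auce)
```

Each patient contributes one (AUC, AUCE) pair, and the exposure-response relationship is then examined across patients.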
Associations between exposure to viruses and bovine respiratory disease in Australian feedlot cattle
Abstract:
Bovine respiratory disease (BRD) is the most important cause of clinical disease and death in feedlot cattle. Respiratory viral infections are key components in predisposing cattle to the development of this disease. To quantify the contribution of four viruses commonly associated with BRD, a case-control study was conducted nested within the National Bovine Respiratory Disease Initiative project population in Australian feedlot cattle. Effects of exposure to Bovine viral diarrhoea virus 1 (BVDV-1), Bovine herpesvirus 1 (BoHV-1), Bovine respiratory syncytial virus (BRSV) and Bovine parainfluenza virus 3 (BPIV-3), and to combinations of these viruses, were investigated. Based on weighted seroprevalences at induction (when animals were enrolled and initial samples collected), the percentages of the project population estimated to be seropositive were 24% for BoHV-1, 69% for BVDV-1, 89% for BRSV and 91% for BPIV-3. For each of the four viruses, seropositivity at induction was associated with reduced risk of BRD (OR: 0.6–0.9), and seroincrease from induction to second blood sampling (35–60 days after induction) was associated with increased risk of BRD (OR: 1.3–1.5). Compared to animals that were seropositive for all four viruses at induction, animals were at progressively increased risk with increasing number of viruses for which they were seronegative; those seronegative for all four viruses were at greatest risk (OR: 2.4). Animals that seroincreased for one or more viruses from induction to second blood sampling were at increased risk (OR: 1.4–2.1) of BRD compared to animals that did not seroincrease for any viruses. Collectively these results confirm that prior exposure to these viruses is protective while exposure at or after feedlot entry increases the risk of development of BRD in feedlots. 
However, the modest increases in risk associated with seroincrease for each virus separately, and the progressive increases in risk with multiple viral exposures, highlight the importance of concurrent infections in the aetiology of the BRD complex. These findings indicate that, while efficacious vaccines could aid in the control of BRD, vaccination against one of these viruses would not have large effects on population BRD incidence, whereas vaccination against multiple viruses would be expected to yield greater reductions in incidence. The findings also confirm the multifactorial nature of BRD development, and indicate that multifaceted approaches, in addition to efficacious vaccines against viruses, will be required for substantial reductions in BRD incidence.
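The odds ratios quoted in this abstract come from a nested case-control design (the study itself fitted multivariable models). The underlying crude odds-ratio arithmetic from a 2×2 exposure table, with Woolf confidence limits, can be sketched as follows; the counts below are invented for illustration, chosen only so the crude OR comes out at 2.4 like the headline estimate:

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Woolf 95% CI from a 2x2 table:
    a = exposed cases,   b = exposed controls,
    c = unexposed cases, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR), Woolf's method
    lo, hi = exp(log(or_) - z * se), exp(log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: BRD cases vs controls, seronegative for all four
# viruses at induction (exposed) vs seropositive for at least one (unexposed)
print(odds_ratio_ci(a=60, b=30, c=100, d=120))
```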