917 results for days to eating soft (DTES)
Abstract:
PURPOSE: To retrospectively evaluate the midterm patency rate of the nitinol (Viatorr, W.L. Gore and Associates, Flagstaff, Ariz) stent-graft for direct intrahepatic portacaval shunt (DIPS) creation. MATERIALS AND METHODS: Institutional Review Board approval for this retrospective HIPAA-compliant study was obtained with a waiver of informed consent. DIPS was created in 18 men and one woman (median age, 54 years; range, 45-65 years) by using nitinol polytetrafluoroethylene (PTFE)-covered stent-grafts. The primary indications were intractable ascites (n = 14), acute variceal bleeding (n = 3), and hydrothorax (n = 2). Follow-up included Doppler ultrasonography at 1, 6, and 12 months and venography with manometry at 6-month intervals after the procedure. Shunt patency and cumulative survival were evaluated by using the Kaplan-Meier method, and survival curves were plotted. Differences in mean portosystemic gradients (PSGs) were evaluated by using the Student t test. Multiple regression analyses for survival and DIPS patency were performed for the following parameters: Child-Pugh class, Model for End-Stage Liver Disease (MELD) score, pre- and post-DIPS PSGs, pre-DIPS liver function tests, and pre-DIPS creatinine levels. RESULTS: DIPS creation was successful in all patients. Effective portal decompression and free antegrade shunt flow were achieved in all patients. Intraperitoneal bleeding occurred in one patient during the procedure and was controlled during the same procedure by placing a second nitinol stent-graft. The primary patency rate was 100% at all times during the follow-up period (range, 2 days to 30 months; mean, 256 days; median, 160 days). Flow restrictors were deployed in two (11%) of 19 patients. The 1-year mortality rate was 37% (seven of 19). CONCLUSION: Patency after DIPS creation with the nitinol PTFE-covered stent-graft was superior to that after TIPS with the nitinol stent-graft.
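The abstract reports Kaplan-Meier estimates of shunt patency and cumulative survival. As a minimal, hypothetical sketch of that kind of analysis (the patient-level follow-up data, the column names, and the choice of the lifelines library are illustrative assumptions, not taken from the study), the patency curve could be estimated like this:

    # Hypothetical sketch: Kaplan-Meier estimate of shunt patency.
    # The follow-up data below are made up for illustration only.
    import pandas as pd
    from lifelines import KaplanMeierFitter

    follow_up = pd.DataFrame({
        "days":     [2, 45, 160, 256, 400, 620, 900],   # follow-up per patient (illustrative)
        "occluded": [0,  0,   0,   0,   0,   0,   0],   # 0 = censored, i.e. shunt still patent
    })

    kmf = KaplanMeierFitter()
    kmf.fit(durations=follow_up["days"], event_observed=follow_up["occluded"])
    print(kmf.survival_function_)   # with no occlusion events, estimated patency stays at 100%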
Abstract:
It is well established that local muscle tissue hypoxia is an important consequence and possibly a relevant adaptive signal of endurance exercise training in humans. It has been reasoned that it might be advantageous to increase this exercise stimulus by working in hypoxia. However, as long-term exposure to severe hypoxia has been shown to be detrimental to muscle tissue, experimental protocols were developed that expose subjects to hypoxia only for the duration of the exercise session and allow recovery in normoxia (live low-train high or hypoxic training). This overview reports data from 27 controlled studies using some implementation of hypoxic training paradigms. Hypoxia exposure varied between 2300 and 5700 m and training duration ranged from 10 days to 8 weeks. A similar number of studies was carried out on untrained and on trained subjects. Muscle structural, biochemical and molecular findings point to a specific role of hypoxia in endurance training. However, based on the available data on global estimates of performance capacity such as maximal oxygen uptake (VO2max) and maximal power output (Pmax), hypoxia as a supplement to training is not consistently found to be of advantage for performance at sea level. There is some evidence mainly from studies on untrained subjects for an advantage of hypoxic training for performance at altitude. Live low-train high may be considered when altitude acclimatization is not an option.
Abstract:
BACKGROUND: This study reviews our experience with the Ross procedure in infants and young children. METHODS: From September 1993 to September 2004, 52 children less than 15 years of age underwent a Ross procedure. The patients ranged in age from 4 days to 15 years (median, 5 years). Fifteen patients (29%) were less than 2 years of age. The predominant indication for the Ross procedure was aortic stenosis. Sixteen patients underwent a Ross-Konno procedure for severe left ventricular outflow tract obstruction. Thirty-four patients had 48 previous interventions. Preoperatively, 6 patients showed severe left ventricular dysfunction, and 2 of these patients required ventilation and inotropic support. Concomitant procedures were performed in 8 patients: 3 patients had a mitral valve replacement, 2 patients had a ventricular septal defect closure and an aortic arch reconstruction, 2 patients had aortic arch reconstructions, and 1 patient had resection of a coarctation and a ventricular septal defect closure. RESULTS: Patients were followed up for a median of 43 months (range, 1 to 130 months). Overall survival was 85% +/- 5% at 1 year and 82% +/- 5% at 2, 5, and 10 years. Hospital mortality was 5 of 52 patients (9.6%). All deaths occurred in neonates or infants less than 2 months of age who needed urgent surgery. Three patients died late of noncardiac causes. At last follow-up, all patients were classified in New York Heart Association functional class I or II. No patient had endocarditis of the autograft or the right ventricular outflow tract replacement. During follow-up, no thromboembolic events were observed, and no patient required the insertion of a permanent pacemaker. Overall freedom from reoperation was 57% +/- 15% at 10 years. One patient required replacement of the autograft at 6 months postoperatively. The development of mild aortic insufficiency was observed in 24 patients, and moderate aortic insufficiency in 1 patient, during follow-up. Freedom from reoperation for the right ventricular outflow tract replacement was 60% +/- 15% at 10 years. CONCLUSIONS: The Ross procedure represents an attractive approach to aortic valve disease in young children. However, a high early mortality rate has to be considered when performing this procedure in neonates or infants who present in critical preoperative condition.
Abstract:
OBJECTIVES: To evaluate the potential improvement of antimicrobial treatment by utilizing a new multiplex polymerase chain reaction (PCR) assay that identifies sepsis-relevant microorganisms in blood. DESIGN: Prospective, observational, international multicenter trial. SETTING: University hospitals in Germany (n = 2), Spain (n = 1), and the United States (n = 1), and one Italian tertiary general hospital. PATIENTS: 436 sepsis patients with 467 episodes of antimicrobial treatment. METHODS: Whole blood for PCR and blood culture (BC) analysis was sampled independently for each episode. The potential impact of reporting microorganisms by PCR on the adequacy and timeliness of antimicrobial therapy was analyzed, and the number of gainable days on early adequate antimicrobial treatment attributable to PCR findings was assessed. MEASUREMENTS AND MAIN RESULTS: Sepsis criteria, days on antimicrobial therapy, antimicrobial substances administered, microorganisms identified by PCR and BC, and susceptibility tests. RESULTS: BC diagnosed 117 clinically relevant microorganisms; PCR identified 154. Ninety-nine episodes were BC positive (BC+); 131 episodes were PCR positive (PCR+). Overall, 127.8 days of clinically inadequate empirical antibiotic treatment were observed in the 99 BC+ episodes. Utilization of PCR-aided diagnostics would have translated into a potential reduction of 106.5 clinically inadequate treatment days. The ratio of gainable early adequate treatment days to the number of PCR tests performed was 22.8 days/100 tests overall (confidence interval, 15-31) and 36.4 days/100 tests in the intensive care and surgical ward populations (confidence interval, 22-51). CONCLUSIONS: Rapid PCR identification of microorganisms may contribute to a reduction of early inadequate antibiotic treatment in sepsis.
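The headline ratio can be reproduced from the figures given above, assuming the number of PCR tests equals the 467 treatment episodes (an assumption; the abstract does not state the test count explicitly):

    # Back-of-the-envelope check of the reported ratio. Assumes one PCR test per
    # treatment episode (467 tests) -- an assumption, not stated in the abstract.
    gainable_days = 106.5            # potential reduction in inadequate treatment days
    n_tests = 467                    # assumed number of PCR tests
    ratio_per_100_tests = gainable_days / n_tests * 100
    print(f"{ratio_per_100_tests:.1f} gainable days per 100 tests")   # ~22.8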
Abstract:
Bloch, Konrad E., Alexander J. Turk, Marco Maggiorini, Thomas Hess, Tobias Merz, Martina M. Bosch, Daniel Barthelmes, Urs Hefti, Jacqueline Pichler, Oliver Senn, and Otto D. Schoch. Effect of ascent protocol on acute mountain sickness and success at Muztagh Ata, 7546 m. High Alt. Med. Biol. 10:25-32, 2009. Data on acclimatization during expedition-style climbing to >5000 m are scant. We evaluated the hypothesis that minor differences in ascent protocol influence acute mountain sickness (AMS) symptoms and mountaineering success in climbers to Muztagh Ata (7546 m), Western China. We performed a randomized, controlled trial during a high altitude medical research expedition to Muztagh Ata. Thirty-four healthy mountaineers (mean age 45 yr, 7 women) were randomized to follow one of two protocols, ascending within 15 or 19 days, respectively, to the summit of Muztagh Ata at 7546 m. The main outcome measures, AMS symptom scores and the number of proceeding climbers, were assessed daily. Mean +/- SD AMS-C scores of 16 climbers randomized to slow ascent were 0.06 +/- 0.18, 0.26 +/- 0.08, 0.41 +/- 0.45, and 0.53 +/- 0.77 at camps I (5533 m), II (6265 m), III (6865 m), and the summit (7546 m), respectively. Corresponding values in 18 climbers randomized to fast ascent were significantly higher: 0.17 +/- 0.23, 0.43 +/- 0.75, 0.49 +/- 0.36, and 0.69 +/- 0.54 (p < 0.008 vs. slow ascent in a regression analysis accounting for weather-related protocol deviation). Climbers randomized to slow ascent were able to ascend according to the protocol without AMS for significantly more days than climbers randomized to fast ascent (p = 0.04, Kaplan-Meier analysis). More climbers randomized to slow ascent were successful in reaching the highest camp at 6865 m without AMS (odds ratio 9.5; 95% confidence interval 1.02 to 89). In climbers ascending to very high altitudes, differences of a few days in acclimatization have a significant impact on symptom severity, the prevalence of AMS, and mountaineering success. ClinicalTrials.gov identifier NCT00603122.
Abstract:
Minimally invasive vertebral augmentation-based techniques have been used for the treatment of spinal fractures (osteoporotic and malignant) for approximately 25 years. In this review, we give an overview of the current spectrum of percutaneous augmentation techniques, safety aspects and indications. Crucial factors for success are careful patient selection, proper technique and choice of the ideal cement augmentation option. Most compression fractures follow a favourable natural course, with reduction of pain and recovery of mobility within a few days to several weeks, whereas other patients experience progressive collapse and persisting pain. In this situation, percutaneous cement augmentation is an effective treatment option with regard to pain and disability reduction, improvement of quality of life, and ambulatory and pulmonary function.
Abstract:
A feeding trial was conducted with 790-lb yearling heifers fed for an average of 121 days to evaluate replacing cracked corn and supplemental urea with wet distillers grains or condensed distillers solubles. Wet distillers grains were evaluated at 16%, 28% and 40% of diet dry matter. Condensed distillers solubles were added at 6.5% of diet dry matter. Control diets were supplemented with urea or a combination of urea and soybean meal. Feeding 16% wet distillers grains or condensed distillers solubles increased gain of heifers compared with those fed the control urea diet. Increasing the amount of wet distillers grains tended to decrease feed intake and reduce gain. The apparent net energy calculated from heifer gains was greatest for the heifers fed 16% wet distillers grains, and the apparent energy of the wet distillers grains declined as the quantity fed was increased. The calculated net energy values were 1.09 and 1.35 Mcal/lb of dry matter for the average of the three concentrations of wet distillers grains and for condensed distillers solubles, respectively. These results confirm the high energy value of wet distillers grains relative to cracked corn observed in a previous steer feeding trial.
Abstract:
A feeding trial was conducted with 940-lb yearling steers fed for 113 days to determine the feeding value of distillers grains relative to corn grain. Replacing corn and urea with wet distillers grains for 20% of the diet dry matter tended to increase gain with no increase in feed consumption, resulting in improved feed conversion. Replacing 40% of diet dry matter with wet distillers grains decreased feed intake without affecting gains and improved feed efficiency. The overall average estimated net energy value of wet distillers grains was 1.20 Mcal NEg per pound of dry matter. This experiment confirmed the observation from previous cattle feeding experiments that, for finishing cattle, wet distillers grains have a high energy value compared with cracked corn grain. Another objective of the study was to determine whether cattle being fed wet distillers grains could be suddenly changed to a different diet if the supply of wet feed were suddenly disrupted. It was found that, if intake is managed during the change, the distillers grains portion of the diet can be changed suddenly from wet to dry and then back to wet after a week without sacrificing performance of the cattle.
Abstract:
A feeding trial was conducted with 860-lb yearling steers fed for 121 days to evaluate Condensed Porcine Solubles (Porcine Solubles) as a source of supplemental nitrogen for finishing cattle. Diets supplemented with 5% soybean meal, 1.46% urea, or 2% or 4% Porcine Solubles were compared. When first offered, cattle were reluctant to consume feed containing the Porcine Solubles; following adaptation, feed containing up to 4% Porcine Solubles was readily consumed. During the first 56 days, steers fed soybean meal gained faster and were more efficient than steers fed urea or Porcine Solubles. At the end of the trial there were no differences among the nitrogen supplements in feed intake, gain, or feed conversion, and there were no significant differences in carcass weight or measures of carcass quality.
Abstract:
A retrospective study of 2,146 feedlot cattle in 17 feedlot tests from 1988 to 1997 was conducted to determine the impact of bovine respiratory disease (BRD) on veterinary treatment costs, average daily gain, carcass traits, mortality, and net profit. Morbidity caused by BRD was 20.6%. The average cost to treat each case of BRD was $12.39. The mortality rate of calves diagnosed and treated for BRD was 5.9% vs. 0.35% for those not diagnosed with BRD. Average daily gain differed between treated and non-treated steers during the first 28 days on feed but did not differ from 28 days to harvest. Net profit was $57.48 lower for treated steers. Eighty-two percent of this difference was due to a combination of mortality and treatment costs; eighteen percent was due to improved performance and carcass value of the non-treated steers. Data from 496 steers and heifers in nine feedlot tests were used to determine the effects of age, weaning, and use of modified live virus or killed vaccines prior to the test as predictors of BRD. Younger calves, non-weaned calves, and calves vaccinated with killed vaccines prior to the test had higher BRD morbidity than those that were older, weaned, or vaccinated with modified live virus vaccines, respectively. Treatment regimens that precluded relapse and re-treatment prevented reduced performance and loss of carcass value. Using modified live virus vaccines and weaning calves 30 days prior to shipment reduced the incidence of BRD.
Abstract:
A feeding trial was conducted with 870-lb steers fed for 137 days to evaluate replacing cracked corn with dry and wet distillers grains with solubles (DGS) as feed for finishing cattle. Dry DGS was evaluated at 16% of diet dry matter. Wet DGS (WDGS) was evaluated at 14.6%, 26.2%, and 37.5% of diet dry matter. Control diets were supplemented with urea or a combination of urea and soybean meal. Feeding 16% dry DGS or 14.6% wet DGS increased rate of gain and tended to increase carcass fatness. Increasing the amount of wet DGS in the diet decreased feed intake, reduced gain, and improved feed conversion. The calculated net energy for gain values for dry and wet DGS were 0.92 and 1.5 times the energy value of corn grain, respectively. Economic returns declined slightly as the percentage of wet DGS increased in the diet, but remained above those of the two diets without DGS. The benefits from feeding wet DGS averaged $25, $21, and $19 per head for steers fed 14.6%, 26.2%, and 37.5%, respectively, based on a formula price for wet DGS related to the price of corn and including a charge for transportation of the wet feed.
Abstract:
During the last few years, γ-hydroxybutyric acid (GHB) and γ-butyrolactone (GBL) have attracted much interest as recreational drugs and knock-out drops in drug-facilitated sexual assaults. This experiment aimed to gain insight into the pharmacokinetics of GHB after intake of GBL. Two volunteers therefore took a single dose of 1.5 ml GBL, which had been spiked into a soft drink. Assuming that GBL was completely metabolized to GHB, the corresponding amount of GHB was 2.1 g. Blood and urine samples were collected over 5 h and 24 h after ingestion, respectively. Additionally, hair samples (head hair and beard hair) were taken within four to five weeks after intake of GBL. Samples were analyzed by liquid chromatography-tandem mass spectrometry (LC-MS/MS) after protein precipitation with acetonitrile. The following observations were made: spiked into a soft drink, GBL, which tastes very bitter, formed a liquid layer at the bottom of the glass that disappeared only upon stirring. Both volunteers reported weak central effects after approximately 15 min, which disappeared completely half an hour later. Maximum concentrations of GHB in serum were measured after 20 min (95 µg/ml and 106 µg/ml); after 4-5 h the serum concentrations had already decreased below 1 µg/ml. In urine, maximum GHB concentrations (140 µg/ml and 120 µg/ml) were measured after 1-2 h and decreased to less than 1 µg/ml within 8-10 h. The ratio of GHB in serum versus blood was 1.2 and 1.6.
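The stated dose equivalence (1.5 ml GBL corresponding to about 2.1 g GHB on complete conversion) follows from simple stoichiometry. The sketch below reproduces the calculation; the density of GBL is an assumed approximate literature value, not a figure taken from the abstract:

    # Rough stoichiometric check: 1.5 ml GBL -> equivalent mass of GHB,
    # assuming complete conversion. The GBL density is an assumed value.
    volume_gbl_ml = 1.5
    density_gbl = 1.13           # g/ml, approximate literature value (assumption)
    m_gbl = 86.09                # g/mol, C4H6O2 (GBL)
    m_ghb = 104.10               # g/mol, C4H8O3 (GHB, free acid)

    mass_gbl = volume_gbl_ml * density_gbl      # ~1.70 g GBL
    mass_ghb = mass_gbl * m_ghb / m_gbl         # ~2.05 g GHB
    print(f"{mass_ghb:.2f} g GHB")              # close to the reported 2.1 g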
Abstract:
Objective. This study examines the structure, processes, and data necessary to assess the outcome variables, length of stay and total cost, for a pediatric practice guideline. The guideline was developed by a group of physicians and ancillary staff members representing the services that most commonly provide treatment for asthma patients at Texas Children's Hospital, as a means of standardizing care. Outcomes needed to be assessed to determine the practice guideline's effectiveness. Data sources and study design. Data for the study were collected retrospectively from multiple hospital databases and from inpatient chart reviews. All patients in this quasi-experimental study had a diagnosis of asthma (ICD-9-CM code 493.91) at the time of admission. The study examined data for 100 patients admitted between September 15, 1995 and November 15, 1995, whose physician had elected to apply the asthma practice guideline at the time of the patient's admission, and for 66 inpatients admitted in the same period whose physician elected not to apply the guideline. The principal outcome variables were identified as "Length of Stay" and "Cost". Principal findings. The mean length of stay was 2.3 days for the group in which the practice guideline was applied and 3.1 days for the comparison group, which did not receive care directed by the practice guideline. The difference was statistically significant (p = 0.008). There was no demonstrable difference in risk factors, health status, or quality of care between the groups. Although not statistically significant in the univariate analysis, private insurance showed a significant effect in the logistic regression models, with an elevated odds ratio (odds ratio = 2.2 for a hospital stay ≤2 days, rising to 4.7 for a hospital stay ≤3 days), indicating that patients with private insurance had a greater likelihood of a shorter hospital stay than patients with public insurance in each model. Public insurance included Medicaid, Medicare, and charity cases; private insurance included private insurance policies, whether group, individual, or managed care. The cost of an admission was significantly lower for the group in which the practice guideline was applied, with a mean difference between the two groups of $1307 per patient. Conclusion. The implementation and utilization of a pediatric practice guideline for asthma inpatients at Texas Children's Hospital had a significant impact in reducing both the total cost and the length of the hospital stay for asthma patients admitted to Texas Children's Hospital.
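The insurance effect described above comes from logistic regression models fitted at different length-of-stay cutoffs. The sketch below shows how such models could be set up; the data, column names, covariates, and the use of statsmodels are illustrative assumptions, not the study's actual analysis:

    # Hypothetical sketch of the short-stay logistic regressions
    # (made-up data; column names and model form are assumptions).
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.DataFrame({
        "length_of_stay":    [1, 2, 4, 2, 5, 2, 4, 1, 3, 6],   # days, illustrative
        "private_insurance": [1, 1, 1, 1, 1, 0, 0, 0, 0, 0],   # 1 = private, 0 = public
        "guideline":         [1, 1, 1, 0, 0, 1, 1, 0, 0, 0],   # 1 = guideline applied
    })

    for cutoff in (2, 3):       # one model per length-of-stay cutoff, as in the abstract
        df["short_stay"] = (df["length_of_stay"] <= cutoff).astype(int)
        fit = smf.logit("short_stay ~ private_insurance + guideline", data=df).fit(disp=0)
        print(cutoff, fit.params)     # exponentiate the coefficients to get odds ratios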
Abstract:
The aim of the present study was to investigate the effects of different speech tasks (recitation of prose (PR), alliteration (AR) and hexameter (HR) verses) and a control task (mental arithmetic (MA) with voicing of the result) on end-tidal CO2 (ET-CO2) and cerebral hemodynamics, i.e. total hemoglobin (tHb) and tissue oxygen saturation (StO2). tHb and StO2 were measured with a frequency-domain near-infrared spectrophotometer (ISS Inc., USA) and ET-CO2 with a gas analyzer (Nellcor N1000). Measurements were performed in 24 adult volunteers (11 female, 13 male; age range 22 to 64 years) during task performance in a randomized order on 4 different days to avoid potential carry-over effects. Statistical analysis was applied to test differences between the baseline, the 2 recitation and the 5 recovery periods. The two brain hemispheres and the 4 tasks were tested separately. Data analysis revealed that during the recitation tasks (PR, AR and HR) StO2 decreased statistically significantly (p < 0.05) during PR and AR in the right prefrontal cortex (PFC) and during AR and HR in the left PFC. tHb showed a significant decrease during HR in the right PFC and during PR, AR and HR in the left PFC. During the MA task, StO2 increased significantly. A significant decrease in ET-CO2 was found during all 4 tasks, with the smallest decrease during the MA task. In conclusion, we hypothesize that the observed changes in tHb and StO2 are mainly caused by altered breathing during the tasks, which lowered the CO2 content of the blood and provoked a cerebral CO2 reaction, i.e. vasoconstriction of blood vessels due to decreased CO2 pressure and thereby a decrease in cerebral blood volume. Therefore, breathing changes should be monitored during brain studies involving speech when using functional near-infrared spectroscopy (fNIRS) to ensure a correct interpretation of changes in hemodynamics and oxygenation.
Abstract:
Purpose: Homeopathic preparations are used in homeopathy and anthroposophically extended medicine. Previous studies described differences in UV transmission between homeopathic preparations of CuSO4 and controls. The aim of the present study was to investigate whether statistically significant differences can be found between homeopathic verum and placebo globules by UV spectroscopy. Methods: Verum (aconitum 30c, calcium carbonate/quercus e cortice) and placebo globules used in two previous clinical trials were dissolved in distilled water at 10 mg/ml 20-23 h prior to the measurements. Absorbance was measured at 190-340 nm with a Shimadzu UV-1800 double-beam spectrophotometer. Duplicates of each sample were measured in a randomized order 4 times on each of the 5 measurement days. To correct for differences between measurement days, the average absorbance of all samples measured on a given day was subtracted from the absorbance of the individual samples. The Kruskal-Wallis test was used to determine group differences between the samples, and finally the coding of the samples was revealed. Results: A first analysis showed significant differences (p ≤ 0.05) in average UV absorbance at 200-290 nm between the samples and a tendency toward a correlation (p ≤ 0.1) between absorbance and globule weight. More results will be presented at the conference. Conclusion: Since the absorbance of the samples at wavelengths between 200 and 290 nm was small, a number of aspects had to be considered and should be corrected for, if present, when performing UV spectroscopy on homeopathic globules: 1. exact weighing of the globules; 2. measurement error of the spectrophotometer at small absorbances; 3. drift of the spectrophotometer during a measurement day; 4. differences between measurement days. The question remains what caused the differences in absorbance found in these experiments: the use of the original material for the production of the verum globules, differences in the production of verum and placebo globules, or other context factors.
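The day-to-day correction and the group comparison described in the Methods can be sketched as follows; the data layout, column names, and the use of pandas and SciPy are illustrative assumptions, not the authors' actual analysis code:

    # Illustrative sketch of the between-day correction and the Kruskal-Wallis test
    # (hypothetical column names and made-up absorbance values).
    import pandas as pd
    from scipy.stats import kruskal

    df = pd.DataFrame({
        "day":        [1, 1, 1, 2, 2, 2, 3, 3, 3],
        "group":      ["verum", "placebo", "verum", "placebo", "verum", "placebo",
                       "verum", "placebo", "verum"],
        "absorbance": [0.021, 0.018, 0.022, 0.025, 0.027, 0.024, 0.019, 0.017, 0.020],
    })

    # subtract each day's mean absorbance so between-day drift does not mask group differences
    df["corrected"] = df["absorbance"] - df.groupby("day")["absorbance"].transform("mean")

    # non-parametric comparison of the corrected absorbances between sample groups
    groups = [g["corrected"].values for _, g in df.groupby("group")]
    stat, p = kruskal(*groups)
    print(f"H = {stat:.2f}, p = {p:.3f}")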