202 results for Regimen Sanitatis Salernitanum.
Abstract:
Hyper- and hyponatremia are frequently observed in patients after subarachnoid hemorrhage and are potentially related to worse outcomes. We hypothesized that the fluid regimen in these patients is associated with distinct changes in serum electrolytes, acid-base disturbances, and fluid balance.
Abstract:
OBJECTIVE: To compare oral administration of lomustine and prednisolone with oral administration of prednisolone alone as treatment for granulomatous meningoencephalomyelitis (GME) or necrotizing encephalitis (NE) in dogs. DESIGN: Retrospective cohort study. ANIMALS: 25 dogs with GME and 18 dogs with NE (diagnosis confirmed in 8 and 5 dogs, respectively). PROCEDURES: Records of dogs with GME or NE were reviewed for results of initial neurologic assessments and clinicopathologic findings, treatment, follow-up clinicopathologic findings (for lomustine-treated dogs), and survival time. Dogs with GME or NE treated with lomustine and prednisolone were assigned to groups 1 (n = 14) and 3 (10), respectively; those treated with prednisolone alone were assigned to groups 2 (11) and 4 (8), respectively. RESULTS: Prednisolone was administered orally every 12 hours to all dogs. In groups 1 and 3, the mean lomustine dosage was 60.3 mg/m², PO, every 6 weeks. Median survival times in groups 1 through 4 were 457, 329, 323, and 91 days, respectively (no significant difference between groups 1 and 2 or between groups 3 and 4). Within the initial 12 months of treatment, median prednisolone dosage was reduced in all groups; the dosage reduction in group 1 was significantly larger than that in group 2 at 6, 9, and 12 months. Combination treatment most frequently caused leukopenia but had no significant effect on liver enzyme activities. CONCLUSIONS AND CLINICAL RELEVANCE: In dogs with GME and NE, oral administration of lomustine and prednisolone or of prednisolone alone had similar efficacy. Inclusion of lomustine in the treatment regimen was generally well tolerated.
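The lomustine dosage above is expressed per square metre of body surface area. As a hypothetical illustration (not taken from the study), the conversion from a dog's body weight to an absolute dose can be sketched with the common allometric BSA formula for dogs (constant K = 10.1):

```python
def canine_bsa_m2(weight_kg: float, k: float = 10.1) -> float:
    """Estimate canine body surface area (m^2) from body weight using
    the common allometric formula BSA = K * (weight in g)**(2/3) * 1e-4,
    with K = 10.1 for dogs."""
    return k * (weight_kg * 1000.0) ** (2.0 / 3.0) * 1e-4

def dose_mg(dosage_mg_per_m2: float, weight_kg: float) -> float:
    """Absolute dose in mg for a dosage expressed per m^2 of body surface."""
    return dosage_mg_per_m2 * canine_bsa_m2(weight_kg)

# The mean lomustine dosage reported above (60.3 mg/m^2) applied to a
# hypothetical 10 kg dog.
print(round(canine_bsa_m2(10.0), 3))  # ~0.469 m^2
print(round(dose_mg(60.3, 10.0), 1))  # ~28.3 mg
```

The weight of 10 kg is an invented example; actual dosing would follow the treating clinician's protocol.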
Abstract:
Milk fatty acid (FA) profile is a dynamic pattern influenced by lactational stage, energy balance and dietary composition. In the first part of this study, the effects of energy balance during the ongoing lactation [weeks 1-21 post partum (pp)] on the milk FA profile of 30 dairy cows were evaluated under a constant feeding regimen. In the second part, the effects of a negative energy balance (NEB) induced by feed restriction on milk FA profile were studied in 40 multiparous dairy cows (20 feed-restricted and 20 controls). Feed restriction (energy balance of -63 MJ NEL/d, restriction of 49% of energy requirements) lasted 3 weeks, starting at around 100 days in milk. Milk FA profile changed markedly from week 1 pp up to week 12 pp and remained unchanged thereafter. The proportion of saturated FA (predominantly 10:0, 12:0, 14:0 and 16:0) increased from week 1 pp up to week 12 pp, whereas that of monounsaturated FA, predominantly 18:1,9c, decreased as the NEB in early lactation became less severe. During the induced NEB, milk FA profile showed a similarly directed pattern as during the NEB in early lactation, although changes were less marked for most FA. Milk FA composition changed rapidly within one week after initiation of feed restriction and tended to return to the initial composition despite maintenance of a high NEB. 18:1,9c was increased significantly during the induced NEB, indicating mobilization of a considerable amount of adipose tissue. Besides 18:1,9c, changes in saturated FA, monounsaturated FA, and de novo synthesized and preformed FA (sum of FA >C16) reflected energy status in dairy cows and indicated both the NEB in early lactation and the NEB induced by feed restriction.
Abstract:
We investigated the effects of different dietary vitamin D regimens on selected blood parameters in laying hens. Supplementation with vitamin D-3 only was compared with a combination of vitamin D-3 and its metabolite 25-hydroxycholecalciferol (25(OH)D-3). Blood concentrations of total calcium, phosphate and 25(OH)D-3 were determined. Four thousand one-day-old LSL chicks were split into two treatment groups and distributed to eight pens. The control group was given a commercial diet containing 2800 IU synthetic vitamin D-3 in the starter feed and 2000 IU synthetic vitamin D-3 in the pullet feed. The experimental group was fed the same commercial diet in which half the synthetic vitamin D-3 content had been substituted with 25(OH)D-3 (HyD®). At 18 weeks of age, pullets were transferred to the layer house. At the ages of 11, 18 and 34 weeks, between 120 and 160 blood samples were collected from each of the control and experimental groups. The experimental group had higher levels of 25(OH)D-3 than the control group at all three ages. Serum calcium levels did not differ between the treatment groups at any age. With the onset of laying, calcium levels rose significantly: whereas blood serum concentration at 18 weeks was 3 mmol/L in both treatment groups, it increased to 8.32 mmol/L in the control group and to 8.66 mmol/L in the experimental group at week 34. At weeks 11 and 34, phosphate was significantly lower in the experimental group. In conclusion, HyD® significantly affected serum phosphate and 25(OH)D-3 levels. No effects of 25(OH)D-3 supplementation on performance, shell quality or keel bone fractures were found.
Abstract:
Background Changes in CD4 cell counts are poorly documented in individuals with low or moderate-level viremia while on antiretroviral treatment (ART) in resource-limited settings. We assessed the impact of ongoing HIV-RNA replication on CD4 cell count slopes in patients treated with a first-line combination ART. Methods Naïve patients on a first-line ART regimen with at least two measures of HIV-RNA available after ART initiation were included in the study. The relationships between mean CD4 cell count change and HIV-RNA at 6 and 12 months after ART initiation (M6 and M12) were assessed by linear mixed models adjusted for gender, age, clinical stage, and year of starting ART. Results 3,338 patients were included (14 cohorts, 64% female), with a median follow-up time of 1.6 years, a median age of 34 years, and a median CD4 cell count at ART initiation of 107 cells/μL. All patients with suppressed HIV-RNA at M12 had a continuous increase in CD4 cell count up to 18 months after treatment initiation. By contrast, any degree of HIV-RNA replication at both M6 and M12 was associated with a flat or decreasing CD4 cell count slope. Multivariable analysis using HIV-RNA thresholds of 10,000 and 5,000 copies confirmed the significant effect of HIV-RNA on CD4 cell counts at both M6 and M12. Conclusion In routinely monitored patients on an NNRTI-based first-line ART, ongoing low-level HIV-RNA replication was associated with a poor immune outcome in patients who had detectable levels of the virus after one year of ART.
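The study estimated CD4 slopes with linear mixed models; as a much simpler, purely illustrative sketch of the underlying idea, a per-patient CD4 slope can be estimated by ordinary least squares over the follow-up visits (the trajectories below are invented, not study data):

```python
def cd4_slope(months, cd4_counts):
    """Ordinary least-squares slope (cells/uL per month) of one patient's
    CD4 counts over time: cov(t, y) / var(t)."""
    n = len(months)
    mean_t = sum(months) / n
    mean_y = sum(cd4_counts) / n
    cov = sum((t - mean_t) * (y - mean_y) for t, y in zip(months, cd4_counts))
    var = sum((t - mean_t) ** 2 for t in months)
    return cov / var

# Hypothetical trajectories: a suppressed patient gaining CD4 cells
# versus a patient with ongoing replication and an essentially flat slope.
suppressed = cd4_slope([0, 6, 12, 18], [107, 190, 260, 330])
replicating = cd4_slope([0, 6, 12, 18], [107, 115, 110, 108])
print(round(suppressed, 1))  # ~12.3 cells/uL per month
print(round(replicating, 2))  # near zero
```

A mixed model additionally pools slopes across patients and adjusts for covariates, which simple per-patient OLS does not do.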
Abstract:
Osteoporosis is characterised by a progressive loss of bone mass and microarchitecture, which leads to increased fracture risk. Some of the drugs available to date have shown reductions in vertebral and non-vertebral fracture risk. However, in the ageing population of industrialised countries, more fractures still happen today than are avoided, which highlights the large medical need for new treatment options, models, and strategies. Recent insights into bone biology have led to a better understanding of bone cell functions and of the crosstalk between osteoblasts, osteoclasts, and osteocytes at the molecular level. In the future, the armamentarium against osteoporotic fractures will likely be enriched by (1) new bone anabolic substances such as antibodies directed against the endogenous inhibitors of bone formation sclerostin and dickkopf-1, PTH and PTHrP analogues, and possibly calcilytics; (2) new inhibitors of bone resorption such as cathepsin K inhibitors, which may suppress osteoclast function without impairing osteoclast viability and thus maintain bone formation by preserving osteoclast-osteoblast crosstalk, and denosumab, an already widely available antibody against RANKL that inhibits osteoclast formation, function, and survival; and (3) new therapeutic strategies based on an extended understanding of the pathophysiology of osteoporosis, which may include sequential therapies with two or more bone-active substances aimed at optimising the management of bone capital acquired during adolescence and maintained during adulthood in terms of both quantity and quality. Finally, one of the future challenges will be to identify those patients and patient populations expected to benefit most from a given drug therapy or regimen. The WHO fracture risk assessment tool FRAX® and improved access to bone mineral density measurements by DXA will play a key role in this regard.
Abstract:
This phase I trial was designed to develop a new effective and well-tolerated regimen for patients with aggressive B cell lymphoma not eligible for front-line anthracycline-based chemotherapy or aggressive second-line treatment strategies. The combination of rituximab (375 mg/m² on day 1), bendamustine (70 mg/m² on days 1 and 2), and lenalidomide was tested with a dose escalation of lenalidomide at three dose levels (10, 15, or 20 mg/day) using a 3 + 3 design. Courses were repeated every 4 weeks. The recommended dose was defined as one level below the dose level at which ≥2/6 patients had a dose-limiting toxicity (DLT) during the first cycle. Thirteen patients were eligible for analysis. Median age was 77 years. WHO performance status was 0 or 1 in 12 patients. The Charlson Comorbidity Index showed relevant comorbidities in all patients. Two DLTs occurred at the second dose level (15 mg/day) within the first cycle: one patient had prolonged grade 3 neutropenia, and one patient experienced a grade 4 cardiac adverse event (myocardial infarction). Additional grade 3 and 4 toxicities were as follows: neutropenia (31%), thrombocytopenia (23%), cardiac toxicity (31%), fatigue (15%), and rash (15%). A lenalidomide dose of 10 mg/day was recommended for a subsequent phase II trial in combination with rituximab 375 mg/m² on day 1 and bendamustine 70 mg/m² on days 1 and 2.
Abstract:
The efficacy of an oncological treatment regimen is often assessed by morphological criteria such as tumour size evaluated by cross-sectional imaging, or by laboratory measurements of plasma biomarkers. Because these types of measures typically allow for assessment of treatment response only several weeks or even months after the start of therapy, earlier response assessment that provides insight into tumour function is needed. This is particularly urgent for the evaluation of newer targeted therapies and for fractionated therapies delivered over a period of weeks, where it would allow a change of treatment in non-responding patients. Diffusion-weighted MRI (DW-MRI) is a non-invasive imaging tool that does not involve radiation or contrast media and is sensitive to tissue microstructure and function at a cellular level. DW-MRI parameters have shown sensitivity to treatment response in a growing number of tumour types and organ sites, with additional potential as predictive parameters for treatment outcome. A brief overview of DW-MRI principles is provided here, followed by a review of recent literature in which DW-MRI has been used to monitor and predict tumour response to various therapeutic regimens.
Abstract:
Traditionally, the routine artificial digestion test is applied to assess the presence of Trichinella larvae in pigs. However, this diagnostic method has a low sensitivity compared with serological tests. Results from artificial digestion tests in Switzerland were evaluated over a period of 15 years to determine the point at which freedom from infection could be confirmed on the basis of these data. Freedom was defined as a 95% probability that the prevalence of infection was below 0.0001%. Freedom was demonstrated after at most 12 years. A new risk-based surveillance approach based on serology was then developed and likewise assessed over 15 years, starting in 2010. It was shown that with this design the sample size could be reduced by at least a factor of 4 compared with the traditional testing regimen, without lowering the level of confidence in the Trichinella-free status of the pig population.
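The freedom-from-infection criterion above (95% probability that prevalence is below 0.0001%) corresponds, in the simplest textbook approximation that assumes a perfect test and a very large population, to the sample-size calculation sketched below. The abstract does not state its actual surveillance model, so this is illustrative only:

```python
import math

def sample_size_for_freedom(design_prevalence: float, confidence: float = 0.95) -> int:
    """Smallest n such that testing n animals, all negative, gives
    `confidence` that true prevalence is below `design_prevalence`:
    solve (1 - p)**n <= 1 - confidence for n (perfect test assumed)."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - design_prevalence))

# Design prevalence of 0.0001% (1e-6) at 95% confidence, as in the abstract.
print(sample_size_for_freedom(1e-6, 0.95))
# For comparison, the classic "1% prevalence" case:
print(sample_size_for_freedom(0.01, 0.95))  # 299 animals
```

Risk-based designs reduce these numbers by weighting high-risk strata instead of sampling uniformly, which is what allows the factor-of-4 reduction reported above.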
Abstract:
BACKGROUND: Bluetongue virus serotype 8 (BTV-8) has caused disease in domestic ruminants in several countries of northern Europe since 2006. In 2008 a mass-vaccination program was launched in most affected countries using whole-virus inactivated vaccines. OBJECTIVE: To evaluate 2 inactivated vaccines (Bovilis BTV 8; BTVPUR AlSap8) for immunogenicity and safety against BTV-8 in South American camelids (SAC) in a field trial. ANIMALS: Forty-two SAC (25 alpacas, 17 llamas) aged between 1 and 16 years. METHODS: The animals were vaccinated twice at an interval of 21 days. They were observed clinically for local or systemic adverse reactions throughout the trial. Blood samples collected on days 0, 14, 21, 43, and 156 after vaccination were tested for the presence of BTV-8 virus by real-time polymerase chain reaction and for specific antibodies by competitive ELISA and a serum neutralization test. RESULTS: All vaccinated animals developed antibodies to BTV-8 after the 2nd administration of the vaccine. No adverse effects were observed except for moderate local swellings at the injection site, which disappeared within 21 days. Slightly increased body temperatures were observed only in the first 2 days after vaccination. BTV was not detected in any of the samples analyzed. CONCLUSIONS AND CLINICAL IMPORTANCE: The administration of the 2 inactivated commercial vaccines was safe and induced seroconversion against BTV-8 in all vaccinated animals. The results of this study suggest that 2 doses injected 3 weeks apart are a suitable vaccination regimen for SAC.
Abstract:
BACKGROUND Current guidelines give recommendations for preferred combination antiretroviral therapy (cART). We investigated factors influencing the choice of initial cART in clinical practice and its outcome. METHODS We analyzed treatment-naive adults with human immunodeficiency virus (HIV) infection participating in the Swiss HIV Cohort Study and starting cART from January 1, 2005, through December 31, 2009. The primary end point was the choice of the initial antiretroviral regimen. Secondary end points were virologic suppression, the increase in CD4 cell counts from baseline, and treatment modification within 12 months after starting treatment. RESULTS A total of 1957 patients were analyzed. Tenofovir-emtricitabine (TDF-FTC)-efavirenz was the most frequently prescribed cART (29.9%), followed by TDF-FTC-lopinavir/r (16.9%), TDF-FTC-atazanavir/r (12.9%), zidovudine-lamivudine (ZDV-3TC)-lopinavir/r (12.8%), and abacavir/lamivudine (ABC-3TC)-efavirenz (5.7%). Differences in prescription were noted among different Swiss HIV Cohort Study sites (P < .001). In multivariate analysis, compared with TDF-FTC-efavirenz, starting TDF-FTC-lopinavir/r was associated with prior AIDS (relative risk ratio, 2.78; 95% CI, 1.78-4.35), HIV-RNA greater than 100 000 copies/mL (1.53; 1.07-2.18), and CD4 greater than 350 cells/μL (1.67; 1.04-2.70); TDF-FTC-atazanavir/r with a depressive disorder (1.77; 1.04-3.01), HIV-RNA greater than 100 000 copies/mL (1.54; 1.05-2.25), and an opiate substitution program (2.76; 1.09-7.00); and ZDV-3TC-lopinavir/r with female sex (3.89; 2.39-6.31) and CD4 cell counts greater than 350 cells/μL (4.50; 2.58-7.86). At 12 months, 1715 patients (87.6%) achieved viral load less than 50 copies/mL and CD4 cell counts increased by a median (interquartile range) of 173 (89-269) cells/μL. Virologic suppression was more likely with TDF-FTC-efavirenz, and CD4 increase was higher with ZDV-3TC-lopinavir/r. 
No differences in outcome were observed among Swiss HIV Cohort Study sites. CONCLUSIONS Large differences in prescription, but not in outcome, were observed among study sites. A trend toward individualized cART was noted, suggesting that the initial cART is significantly influenced by physician preference and patient characteristics. Our study highlights the need for evidence-based data to determine the best initial regimen for different HIV-infected persons.
Abstract:
The aim of this study was to evaluate the difference between the effects of a 5-day and a 1-day course of antibiotics on the incidence of postoperative infection after displaced fractures of the orbit. A total of 62 patients with orbital blow-out fractures were randomly assigned to two groups, both of which were given amoxicillin/clavulanic acid 1.2 g intravenously every 8 h from the time of admission to 24 h postoperatively. The 5-day group were then given amoxicillin/clavulanic acid 625 mg orally every 8 h for 4 further days. The 1-day group were given placebo orally at the same time intervals. Follow-up appointments were at 1, 2, 4, 6, and 12 weeks, and 6 months, postoperatively. An infection in the orbital region was the primary end point. Sixty of the 62 patients completed the study. Two of the 29 patients in the 5-day group (6.8%) and 1 of the 31 patients in the 1-day group (3.2%) developed local infections. In the 5-day group one patient developed diarrhoea; in the 1-day group one patient developed a rash on the trunk. There were no significant differences in the incidence of infection or side effects between the groups. We conclude that in displaced orbital fractures a postoperative 1-day course of antibiotics is as effective in preventing infective complications as a 5-day regimen.
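With counts as small as 2/29 versus 1/31, Fisher's exact test is the usual way to compare the two infection rates; the abstract does not name the test it used, so the following stdlib-only sketch is illustrative:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric probabilities of every table with the same
    margins whose probability does not exceed that of the observed table.
    """
    row1, row2 = a + b, c + d
    col1, n = a + c, a + b + c + d

    def prob(k):  # probability of k events in row 1, margins fixed
        return comb(row1, k) * comb(row2, col1 - k) / comb(n, col1)

    p_obs = prob(a)
    return sum(prob(k)
               for k in range(max(0, col1 - row2), min(col1, row1) + 1)
               if prob(k) <= p_obs + 1e-12)

# Infection vs no infection: 2 of 29 (5-day group) vs 1 of 31 (1-day group).
p_value = fisher_exact_two_sided(2, 27, 1, 30)
print(round(p_value, 3))  # ~0.606, consistent with "no significant difference"
```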
Abstract:
Transplantation is the treatment of choice for many different organ failures. Despite growing experience in surgery and immunosuppression protocols, long-term mortality after the procedure remains much higher than in the general population. Second only to cardiovascular disease as a cause of death in organ transplant recipients, cancer is now known to be at least partly related to the immunosuppression regimen. Nevertheless, while calcineurin inhibitors have a demonstrated pro-oncogenic effect, other classes, such as mTOR inhibitors, are antiproliferative and have even proven to be an efficient therapy in some advanced oncological situations. Therapy protocols are therefore evolving towards individualized medicine based on each transplant recipient's cardiovascular, infectious and oncological risk factors. As the skin is the organ most frequently affected by tumours, many different guidelines have been published to adapt therapy to the occurrence of a new lesion. For example, limited actinic keratoses or a first episode of non-melanoma skin cancer usually require no change in immunosuppressive therapy, only specialized local care and frequent clinical follow-up, whereas more advanced lesions imply adaptation of the drug regimen. In any case, collaboration between general practitioners, dermatologists and the transplantation team is mandatory.
Abstract:
A variety of chronic kidney diseases tend to progress towards end-stage kidney disease. Progression is largely due to factors unrelated to the initial disease, including arterial hypertension and proteinuria. Intensive treatment of these two factors can potentially slow the progression of kidney disease. Blockers of the renin-angiotensin-aldosterone system, either converting-enzyme inhibitors or angiotensin II receptor antagonists, reduce both blood pressure and proteinuria and appear superior to a conventional antihypertensive treatment regimen in preventing progression to end-stage kidney disease. The most recent recommendations state that in children with chronic kidney disease without proteinuria the blood pressure goal is the 75th centile for body length, age and gender, whereas the 50th centile should be targeted in children with chronic kidney disease and pathologically increased proteinuria.
Abstract:
OBJECTIVE: To compare regimens consisting of either efavirenz or nevirapine and two or more nucleoside reverse transcriptase inhibitors (NRTIs) among HIV-infected, antiretroviral-naive, and AIDS-free individuals with respect to clinical, immunologic, and virologic outcomes. DESIGN: Prospective studies of HIV-infected individuals in Europe and the US included in the HIV-CAUSAL Collaboration. METHODS: Antiretroviral therapy-naive and AIDS-free individuals were followed from the time they started an NRTI, efavirenz, or nevirapine; classified as following one or both types of regimens at baseline; and censored when they started an ineligible drug or at 6 months if their regimen was not yet complete. We estimated the 'intention-to-treat' effect for nevirapine versus efavirenz regimens on clinical, immunologic, and virologic outcomes. Our models included baseline covariates and adjusted for potential bias introduced by censoring via inverse probability weighting. RESULTS: A total of 15 336 individuals initiated an efavirenz regimen (274 deaths, 774 AIDS-defining illnesses) and 8129 individuals initiated a nevirapine regimen (203 deaths, 441 AIDS-defining illnesses). The intention-to-treat hazard ratios [95% confidence interval (CI)] for nevirapine versus efavirenz regimens were 1.59 (1.27, 1.98) for death and 1.28 (1.09, 1.50) for AIDS-defining illness. Individuals on nevirapine regimens experienced a 12-month increase in CD4 cell count that was smaller by 11.49 cells/μL and were 52% more likely to have virologic failure at 12 months than those on efavirenz regimens. CONCLUSIONS: Our intention-to-treat estimates are consistent with lower mortality, a lower incidence of AIDS-defining illness, a larger 12-month increase in CD4 cell count, and a smaller risk of virologic failure at 12 months for efavirenz compared with nevirapine.
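The censoring adjustment mentioned above relies on inverse probability weighting: each uncensored observation is up-weighted by the inverse of its estimated probability of remaining uncensored. A minimal sketch of stabilized censoring weights, with invented probabilities standing in for the fitted numerator and denominator models:

```python
def stabilized_censoring_weights(p_marginal, p_conditional):
    """Stabilized inverse-probability-of-censoring weights.

    p_marginal[t]    -- probability of remaining uncensored at visit t,
                        ignoring covariates (numerator model)
    p_conditional[t] -- the same probability given covariates
                        (denominator model)

    The weight at visit t is the cumulative product of the ratios up to t.
    """
    weights, num, den = [], 1.0, 1.0
    for pm, pc in zip(p_marginal, p_conditional):
        num *= pm
        den *= pc
        weights.append(num / den)
    return weights

# Hypothetical patient who is slightly more likely to stay under
# follow-up than the marginal model predicts: weights drift below 1.
w = stabilized_censoring_weights([0.95, 0.95, 0.95], [0.97, 0.96, 0.98])
print([round(x, 3) for x in w])
```

Stabilization (using the marginal probability in the numerator rather than 1) keeps the weights near 1 and reduces their variance, which is why it is standard practice in marginal structural models.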