Abstract:
STUDY AIM: A pilot study was conducted to implement and evaluate a routine gradual psycho-diagnostic programme to improve the diagnosis and treatment of mental disorders in somatic rehabilitation centres. First, implementation strategies were developed in training sessions together with psychologists and physicians. The psycho-diagnostic programme consists of a screening instrument (PHQ-9) designed to permit time-effective detection of comorbid mental disorders. Besides evaluating the training, the aim of the study was to analyze the extent to which the routine gradual psycho-diagnostic programme can be implemented in practice. Additionally, it was intended to identify beneficial and obstructive conditions for implementation. METHODOLOGY: The pilot study was conducted in two orthopaedic and one cardiological rehabilitation centre. The training was evaluated directly after its completion using a questionnaire. Three months after its introduction, the implementation of the psycho-diagnostic programme was evaluated in interviews with n=11 physicians and psychologists. RESULTS: The training was rated positively by the participants. Implementation of the entire gradual psycho-diagnostic programme was possible in one centre and partially possible in the other two. An open organisational climate, sufficient time resources, and physicians' biopsychosocial understanding of disease were beneficial for implementation. A dismissive attitude towards psycho-diagnostics, little communication between staff members, little perceived benefit for one's own work, and fear of stigmatising patients with psychiatric diagnoses were obstructive. CONCLUSION: Essential for successful implementation are sufficient time and personnel resources, motivation for change among staff and centre management, and a positive attitude towards psycho-diagnostics among clinic staff. Furthermore, flexibility in implementation strategies and the opportunity to participate in the implementation process are important.
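A minimal sketch of the PHQ-9 screening step described above, assuming the standard scoring scheme (nine items rated 0-3, total score 0-27). The cutoff of >= 10 for flagging a possible comorbid mental disorder is a common convention; the abstract does not state which threshold the programme used.

```python
def phq9_screen(item_scores: list[int], cutoff: int = 10) -> bool:
    """Return True if the patient should proceed to in-depth diagnostics.

    Assumes standard PHQ-9 scoring; the cutoff of 10 is a common convention,
    not taken from the abstract.
    """
    if len(item_scores) != 9 or any(s not in (0, 1, 2, 3) for s in item_scores):
        raise ValueError("PHQ-9 requires nine item scores, each 0-3")
    return sum(item_scores) >= cutoff

# Example: a patient scoring mostly 1s and 2s crosses the screening threshold.
print(phq9_screen([1, 2, 1, 2, 1, 2, 1, 1, 1]))  # True (total score 12)
```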
Abstract:
BACKGROUND: In industrialized countries vaccination coverage remains suboptimal, partly because of perception of an increased risk of asthma. Epidemiologic studies of the association between childhood vaccinations and asthma have provided conflicting results, possibly for methodologic reasons such as unreliable vaccination data, biased reporting, and reverse causation. A recent review stressed the need for additional, adequately controlled large-scale studies. OBJECTIVE: Our goal was to determine if routine childhood vaccination against pertussis was associated with subsequent development of childhood wheezing disorders and asthma in a large population-based cohort study. METHODS: In 6811 children from the general population born between 1993 and 1997 in Leicestershire, United Kingdom, respiratory symptom data from repeated questionnaire surveys up to 2003 were linked to independently collected vaccination data from the National Health Service database. We compared incident wheeze and asthma between children of different vaccination status (complete, partial, and no vaccination against pertussis) by computing hazard ratios. Analyses were based on 6048 children, 23,201 person-years of follow-up, and 2426 cases of new-onset wheeze. RESULTS: There was no evidence for an increased risk of wheeze or asthma in children vaccinated against pertussis compared with nonvaccinated children. Adjusted hazard ratios comparing fully and partially vaccinated with nonvaccinated children were close to one for both incident wheeze and asthma. CONCLUSION: This study provides no evidence of an association between vaccination against pertussis in infancy and an increased risk of later wheeze or asthma and does not support claims that vaccination against pertussis might significantly increase the risk of childhood asthma.
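A hedged sketch of a hazard-ratio analysis of the kind reported, using the lifelines library with a Cox proportional hazards model. The data frame, column names, and values are hypothetical; the original analysis may have used different software and additional covariates.

```python
import pandas as pd
from lifelines import CoxPHFitter

# One row per child: follow-up time to first wheeze (years), event indicator,
# and dummy-coded vaccination status (reference: no pertussis vaccination).
# All values are invented for illustration.
df = pd.DataFrame({
    "followup_years": [3.2, 5.0, 1.8, 4.1, 2.7, 5.0, 0.9, 4.6],
    "wheeze":         [1,   0,   1,   0,   1,   0,   0,   1],
    "fully_vacc":     [1,   1,   0,   1,   0,   0,   0,   0],
    "partially_vacc": [0,   0,   1,   0,   0,   1,   0,   0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="wheeze")
cph.print_summary()  # the exp(coef) column gives the hazard ratios
```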
Abstract:
OBJECTIVE: Multiple organ failure is a common complication of acute circulatory and respiratory failure. We hypothesized that therapeutic interventions used routinely in intensive care can interfere with the perfusion of the gut and the liver, and thereby increase the risk of mismatch between oxygen supply and demand. DESIGN: Prospective, observational study. SETTING: Interdisciplinary intensive care unit (ICU) of a university hospital. PATIENTS: Thirty-six patients on mechanical ventilation with acute respiratory or circulatory failure or severe infection were included. INTERVENTIONS: Insertion of a hepatic venous catheter. MEASUREMENTS AND MAIN RESULTS: Daily nursing procedures were recorded. A decrease of ≥5% in hepatic venous oxygen saturation (ShO2) was considered relevant. Observation time was 64 (29-104) hours (median [interquartile range]). The ICU stay was 11 (8-15) days, and hospital mortality was 35%. The number of periods with procedures/patient was 170 (98-268), the number of procedure-related decreases in ShO2 was 29 (13-41), and the number of decreases in ShO2 unrelated to procedures was 9 (4-19). Accordingly, procedure-related ShO2 decreases occurred 11 (7-17) times per day. Median ShO2 decrease during the procedures was 7 (5-10)%, and median increase in the gradient between mixed and hepatic venous oxygen saturation was 6 (4-9)%. Procedures that caused most ShO2 decreases were airway suctioning, assessment of level of sedation, and changing patients' position. ShO2 decreases were associated with small but significant increases in heart rate and intravascular pressures. Maximal Sequential Organ Failure Assessment scores in the ICU correlated with the number of ShO2 decreases (r=0.56; p<0.001) and with the number of procedure-related ShO2 decreases (r=0.60; p<0.001). CONCLUSIONS: Patients are exposed to repeated episodes of impaired splanchnic perfusion during routine nursing procedures. More research is needed to examine the correlation, if any, between nursing procedures and hepatic venous desaturation.
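A sketch of the reported correlation between maximal SOFA score and the number of ShO2 decreases. The abstract does not say whether a Pearson or Spearman coefficient was used; Spearman is assumed here because both variables are ordinal counts, and the data values are invented for illustration.

```python
from scipy.stats import spearmanr

sofa_max = [6, 9, 4, 12, 8, 15, 7, 10]      # maximal SOFA score in the ICU
n_desats = [10, 25, 5, 38, 20, 45, 14, 30]  # procedure-related ShO2 decreases

rho, p = spearmanr(sofa_max, n_desats)
print(f"rho = {rho:.2f}, p = {p:.3f}")
```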
Abstract:
OBJECTIVE: Nursing in 'live islands' and routine high-dose intravenous immunoglobulins after allogeneic hematopoietic stem cell transplantation were abandoned by many teams in view of limited evidence and high costs. METHODS: This retrospective single-center study examines the impact of the change from nursing in 'live islands' to care in single rooms (SR) and from high-dose to targeted intravenous immunoglobulins (IVIG) on mortality and infection rate of adult patients receiving an allogeneic stem cell or bone marrow transplantation, in two steps and three time cohorts (1993-1997, 1997-2000, 2000-2003). RESULTS: Two hundred forty-eight allogeneic hematopoietic stem cell transplantations were performed in 227 patients. Patient characteristics were comparable in the three cohorts for gender, median age, underlying disease and disease stage, prophylaxis for graft versus host disease (GvHD), and cytomegalovirus constellation. The incidence of infections (78.4%) and infection rates remained stable (rates per 1000 days of neutropenia: 17.61 for sepsis, 6.76 for pneumonia). Cumulative incidence of GvHD and transplant-related mortality did not change over time. CONCLUSIONS: The change from nursing in 'live islands' to SR and the reduction from high-dose to targeted IVIG did not result in increased infection rates or mortality despite an increase in patient age. These results support the current practice.
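A short sketch of how a rate per 1,000 days of neutropenia (as quoted for sepsis, 17.61) is derived: events divided by total days at risk, scaled by 1,000. The raw counts below are invented to reproduce a rate of the same magnitude; the abstract does not give them.

```python
def rate_per_1000(events: int, days_at_risk: int) -> float:
    """Incidence rate per 1,000 days of neutropenia."""
    return 1000 * events / days_at_risk

# Illustrative counts only: 62 sepsis episodes over 3,520 neutropenic days.
print(rate_per_1000(62, 3520))  # ~17.61 episodes per 1,000 neutropenic days
```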
Abstract:
Pasteurellaceae are bacteria with an important role as primary or opportunistic, mainly respiratory, pathogens in domestic and wild animals. Some species of Pasteurellaceae cause severe diseases with high economic losses in commercial animal husbandry and are of great diagnostic concern. Because of new data on the phylogeny of Pasteurellaceae, their taxonomy has recently been revised profoundly, thus requiring an improved phenotypic differentiation procedure to identify the individual species of this family. A new and simplified procedure to identify species of Actinobacillus, Avibacterium, Gallibacterium, Haemophilus, Mannheimia, Nicoletella, and Pasteurella, which are most commonly isolated from clinical samples of diseased animals in veterinary diagnostic laboratories, is presented in the current study. The identification procedure was evaluated with 40 type and reference strains and with 267 strains from routine diagnostic analysis of various animal species, including 28 different bacterial species. Type, reference, and field strains were analyzed by 16S ribosomal RNA (rrs) and rpoB gene sequencing for unambiguous species determination as a basis to evaluate the phenotypic differentiation schema. Primary phenotypic differentiation is based on beta-nicotinamide adenine dinucleotide (beta-NAD) dependence and hemolysis, which are readily determined on the isolation medium. The procedure divides the 28 species into 4 groups for which particular biochemical reactions were chosen to identify the bacterial species. The phenotypic identification procedure allowed researchers to determine the species of 240 out of 267 field strains. The procedure is an easy and cost-effective system for the rapid identification of species of the Pasteurellaceae family isolated from clinical specimens of animals.
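A sketch of the primary branching of the identification procedure: isolates are first sorted into one of four groups by beta-NAD dependence and hemolysis on the isolation medium. The group numbering here is arbitrary, and the species-level biochemical reactions within each group are omitted.

```python
def primary_group(nad_dependent: bool, hemolytic: bool) -> int:
    """Assign an isolate to one of the four primary differentiation groups.

    Group numbering is illustrative; the study's own labels are not given
    in the abstract.
    """
    groups = {
        (True,  True):  1,  # beta-NAD-dependent, hemolytic
        (True,  False): 2,  # beta-NAD-dependent, non-hemolytic
        (False, True):  3,  # beta-NAD-independent, hemolytic
        (False, False): 4,  # beta-NAD-independent, non-hemolytic
    }
    return groups[(nad_dependent, hemolytic)]

print(primary_group(nad_dependent=True, hemolytic=False))  # group 2
```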
Abstract:
In 2010 more than 600 radiocarbon samples were measured with the gas ion source at the MIni CArbon DAting System (MICADAS) at ETH Zurich, and the number of measurements is rising quickly. While most samples contain less than 50 µg C at present, the gas ion source is attractive for larger samples as well because the time-consuming graphitization is omitted. Additionally, modern samples are now measured down to 5‰ counting statistics in less than 30 min with the recently improved gas ion source. In the versatile gas handling system, a stepping-motor-driven syringe presses a mixture of helium and sample CO2 into the gas ion source, allowing continuous and stable measurements of different kinds of samples. CO2 can be provided to the versatile gas interface in four different ways. As the primary method, CO2 is delivered in glass or quartz ampoules; in this case, the CO2 is released in an automated ampoule cracker with 8 positions for individual samples. Secondly, OX-1 and blank gas in helium can be provided to the syringe by directly connecting gas bottles to the gas interface at the stage of the cracker. Thirdly, solid samples can be combusted in an elemental analyzer or in a thermo-optical OC/EC aerosol analyzer, where the produced CO2 is transferred to the syringe via a zeolite trap for gas concentration. As a fourth method, CO2 is released from carbonates with phosphoric acid in septum-sealed vials and loaded onto the same trap used for the elemental analyzer. All four methods allow complete automation of the measurement, even though minor user input is presently still required. Details on the setup, versatility, and applications of the gas handling system are given.
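The quoted "5‰ counting statistics" is the relative Poisson uncertainty of the accumulated 14C counts, sigma_rel = 1/sqrt(N). Solving for N shows why modern samples finish in under 30 minutes; the count rate assumed below is illustrative, not taken from the abstract.

```python
# Required counts for a target relative Poisson uncertainty: N = (1/sigma)^2.
required_counts = (1 / 0.005) ** 2        # 5 per mil -> 40,000 counts

count_rate_cps = 25                       # assumed 14C count rate, counts/s
minutes = required_counts / count_rate_cps / 60
print(required_counts, round(minutes, 1))  # 40000.0 counts, ~26.7 min
```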
Abstract:
Reliable detection of JAK2-V617F is critical for accurate diagnosis of myeloproliferative neoplasms (MPNs); in addition, sensitive mutation-specific assays can be applied to monitor disease response. However, there has been no consistent approach to JAK2-V617F detection, with assays varying markedly in performance, affecting clinical utility. Therefore, we established a network of 12 laboratories from seven countries to systematically evaluate nine different DNA-based quantitative PCR (qPCR) assays, including those in widespread clinical use. Seven quality control rounds involving over 21,500 qPCR reactions were undertaken using centrally distributed cell line dilutions and plasmid controls. The two best-performing assays were tested on normal blood samples (n=100) to evaluate assay specificity, followed by analysis of serial samples from 28 patients transplanted for JAK2-V617F-positive disease. The most sensitive assay, which performed consistently across a range of qPCR platforms, predicted outcome following transplant, with the mutant allele detected a median of 22 weeks (range 6-85 weeks) before relapse. Four of seven patients achieved molecular remission following donor lymphocyte infusion, indicative of a graft vs MPN effect. This study has established a robust, reliable assay for sensitive JAK2-V617F detection, suitable for assessing response in clinical trials, predicting outcome and guiding management of patients undergoing allogeneic transplant.
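A sketch of standard-curve quantification as commonly used in qPCR assays of this kind: Ct values of plasmid dilutions with known JAK2-V617F copy numbers define a line Ct = m·log10(copies) + b, from which unknowns are read off. The calibration constants and sample Ct values below are illustrative only; the study's actual assay parameters are not given in the abstract.

```python
# Assumed calibration: slope near -3.32 corresponds to ~100% PCR efficiency.
m, b = -3.32, 38.0

def copies_from_ct(ct: float) -> float:
    """Invert the standard curve Ct = m*log10(copies) + b."""
    return 10 ** ((ct - b) / m)

mutant = copies_from_ct(31.5)  # Ct from the mutation-specific reaction
total = copies_from_ct(27.2)   # Ct from a total-JAK2 control reaction
print(f"mutant allele burden ~ {100 * mutant / total:.1f}%")  # ~5%
```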
Abstract:
Screening tests for drugs of abuse are regularly used in the clinical routine. These tests identify the targeted substances very differently depending on the manufacturer, and sometimes also react positive after the intake of drugs they are not intended to detect. Implausible results therefore have to be questioned. A test result can be falsely negative if a patient has taken a compound that is not detected by the antibody used in the test system. Chromatographic confirmation and screening assays are more laborious to perform and more demanding to interpret, and are therefore only offered by a few specialized clinical laboratories. However, their specificity is excellent, and many different compounds can be detected, depending on the number of compounds included in the mass spectra library used. If the clinical evaluation results in the differential diagnosis of an acute intoxication, screening tests for drugs of abuse can help to identify a single compound or a group of substances. The clinical picture, however, can usually not be explained by a qualitative test result. In addition, there are no published data demonstrating that these tests meaningfully influence triage, treatment, diagnosis, or further therapy of a poisoned patient. The quantitative determination of specific compounds in the blood allows, for example, an appraisal of the prognosis and helps to establish the indication for a specific therapy after intake of acetaminophen or methanol. New designer drugs cannot be detected at all by the classic screening tests for drugs of abuse; they have to be identified by chromatographic methods.
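An illustrative sketch of how a quantitative blood level guides therapy after acetaminophen intake: the widely used Rumack-Matthew treatment line starts at 150 µg/mL at 4 h post-ingestion and falls with a 4-h half-life; a measured level above the line indicates antidote treatment. This nomogram is general toxicology knowledge, not taken from the abstract.

```python
def above_treatment_line(level_ug_ml: float, hours_post_ingestion: float) -> bool:
    """Rumack-Matthew check: 150 ug/mL at 4 h, decaying with a 4-h half-life."""
    if not 4 <= hours_post_ingestion <= 24:
        raise ValueError("nomogram is defined for 4-24 h post-ingestion")
    threshold = 150 * 0.5 ** ((hours_post_ingestion - 4) / 4)
    return level_ug_ml >= threshold

print(above_treatment_line(80, 8))  # True: the threshold at 8 h is 75 ug/mL
```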
Abstract:
Objective: There is an ongoing debate concerning how outcome variables change during the course of psychotherapy. We compared the dose–effect model, which posits diminishing effects of additional sessions in later treatment phases, against a model that assumes a linear and steady treatment progress through termination. Method: Session-by-session outcome data of 6,375 outpatients were analyzed, and participants were categorized according to treatment length. Linear and log-linear (i.e., negatively accelerating) latent growth curve models (LGCMs) were estimated and compared for different treatment length categories. Results: When comparing the fit of the various models, the log-linear LGCMs assuming negatively accelerating treatment progress consistently outperformed the linear models irrespective of treatment duration. The rate of change was found to be inversely related to the length of treatment. Conclusion: As proposed by the dose–effect model, the expected course of improvement in psychotherapy appears to follow a negatively accelerated pattern of change, irrespective of the duration of the treatment. However, our results also suggest that the rate of change is not constant across various treatment lengths. As proposed by the “good enough level” model, longer treatments are associated with less rapid rates of change.
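A simplified analogue of the reported model comparison: fitting a linear and a log-linear curve to session-by-session outcome scores and comparing residual error. This is an aggregate least-squares sketch, not a full latent growth curve model, and the scores below are invented (lower = less distress).

```python
import numpy as np

sessions = np.arange(1, 13)
scores = np.array([3.0, 2.5, 2.2, 2.0, 1.9, 1.8, 1.75,
                   1.7, 1.68, 1.66, 1.65, 1.64])

def sse(design: np.ndarray) -> float:
    """Sum of squared residuals of an ordinary least-squares fit."""
    _, residuals, *_ = np.linalg.lstsq(design, scores, rcond=None)
    return residuals[0]

ones = np.ones_like(sessions, dtype=float)
linear = np.column_stack([ones, sessions])          # steady progress
loglin = np.column_stack([ones, np.log(sessions)])  # negatively accelerating
print(f"SSE linear: {sse(linear):.4f}, SSE log-linear: {sse(loglin):.4f}")
# The log-linear curve fits the decelerating trajectory markedly better.
```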
Abstract:
Patients with ilio-femoral deep-vein thrombosis (DVT) are at high risk of developing the post-thrombotic syndrome (PTS). In comparison to anticoagulation therapy alone, extended venography-guided catheter-directed thrombolysis without routine stenting of venous stenosis in patients with ilio-femoral DVT is associated with an increased risk of bleeding and a moderate reduction of PTS. We performed a prospective single-centre study to investigate safety, patency, and incidence of PTS in patients with acute ilio-femoral DVT treated with fixed-dose ultrasound-assisted catheter-directed thrombolysis (USAT; 20 mg rt-PA during 15 hours) followed by routine stenting of venous stenosis, defined as residual luminal narrowing >50%, absent antegrade flow, or presence of collateral flow at the site of suspected stenosis. A total of 87 patients (age 46 ± 21 years, 60% women) were included. At 15 hours, thrombolysis success ≥50% was achieved in 67 (77%) patients. Venous stenting (mean 1.9 ± 1.3 stents) was performed in 70 (80%) patients, with the common iliac vein as the most frequent stenting site (83%). One major (1%; 95% CI, 0-6%) and 6 minor bleedings (7%; 95% CI, 3-14%) occurred. Primary and secondary patency rates at 1 year were 87% (95% CI, 74-94%) and 96% (95% CI, 88-99%), respectively. At three months, 88% (95% CI, 78-94%) of patients were free from PTS according to the Villalta scale, with a similar rate at one year (94%; 95% CI, 81-99%). In conclusion, a fixed-dose USAT regimen followed by routine stenting of underlying venous stenosis in patients with ilio-femoral DVT was associated with a low bleeding rate, high patency rates, and a low incidence of PTS.
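Patency rates with confidence intervals at one year are typically Kaplan-Meier estimates; the abstract does not name the estimator, so that is an assumption here, as are the lifelines library and the invented follow-up data (months to loss of primary patency, censored at last follow-up).

```python
from lifelines import KaplanMeierFitter

months = [12, 12, 3, 12, 8, 12, 12, 5, 12, 12]  # follow-up per patient
patency_lost = [0, 0, 1, 0, 1, 0, 0, 1, 0, 0]   # 1 = primary patency lost

kmf = KaplanMeierFitter()
kmf.fit(months, event_observed=patency_lost)
print(kmf.survival_function_)    # estimated primary patency over time
print(kmf.confidence_interval_)  # pointwise 95% confidence band
```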
Abstract:
The performance of high-resolution CZE for determination of carbohydrate-deficient transferrin (CDT) in human serum, based on internal and external quality data gathered over a 10-year period, is reported. The assay comprises mixing of serum with an Fe(III) ion-containing solution prior to analysis of the iron-saturated mixture in a dynamically double-coated capillary using a commercial buffer at alkaline pH. CDT values obtained with a human serum of a healthy individual and commercial quality control sera are shown to vary less than 10%. Values of a control from a specific lot were found to slowly decrease as a function of time (less than 10% per year). Furthermore, for unknown reasons, gradual changes in the monitored pattern around pentasialo-transferrin were detected, which limit the use of commercial control sera of the same lot to less than 2 years. Analysis of external quality control sera revealed correct classification of the samples over the entire 10-year period. Data obtained compare well with those of HPLC and CZE assays of other laboratories. The data gathered over a 10-year period demonstrate the robustness of the high-resolution CZE assay. This is the first account of a CZE-based CDT assay with complete internal and external quality assessment over an extended time period.
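The "less than 10%" figure is a between-run coefficient of variation (CV) of the control sera. A sketch of the computation; the %CDT control values below are invented to illustrate a CV under the stated limit.

```python
import numpy as np

# Repeated measurements of the same control serum across runs (%CDT).
cdt_controls = np.array([2.1, 2.0, 2.3, 2.2, 1.9, 2.1, 2.2, 2.0])

cv = 100 * cdt_controls.std(ddof=1) / cdt_controls.mean()
print(f"between-run CV = {cv:.1f}%")  # well below the 10% acceptance limit
```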
Abstract:
OBJECTIVE: The presence of minority nonnucleoside reverse transcriptase inhibitor (NNRTI)-resistant HIV-1 variants prior to antiretroviral therapy (ART) has been linked to virologic failure in treatment-naïve patients. DESIGN: We performed a large retrospective study to determine the number of treatment failures that could have been prevented by implementing minority drug-resistant HIV-1 variant analyses in ART-naïve patients in whom no NNRTI resistance mutations were detected by routine resistance testing. METHODS: Of 1608 patients in the Swiss HIV Cohort Study who had initiated first-line ART with two nucleoside reverse transcriptase inhibitors (NRTIs) and one NNRTI before July 2008, 519 patients were eligible on the basis of HIV-1 subtype, viral load, and sample availability. The key NNRTI drug resistance mutations K103N and Y181C were measured by allele-specific PCR in 208 of 519 randomly chosen patients. RESULTS: Minority K103N and Y181C drug resistance mutations were detected in five out of 190 (2.6%) and 10 out of 201 (5%) patients, respectively. Focusing on 183 patients for whom virologic success or failure could be examined, virologic failure occurred in seven out of 183 (3.8%) patients; minority K103N and/or Y181C variants were present prior to ART initiation in only two of those patients. The NNRTI-containing first-line ART was effective in 10 patients with preexisting minority NNRTI-resistant HIV-1 variants. CONCLUSION: As shown in case-control studies, minority NNRTI-resistant HIV-1 variants can have an impact on ART. However, the sole implementation of minority NNRTI-resistant HIV-1 variant analysis in addition to genotypic resistance testing (GRT) cannot be recommended in routine clinical settings. Additional associated risk factors need to be discovered.
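The abstract's counts permit a simple 2x2 association test between preexisting minority variants and virologic failure: of the 183 evaluable patients, 12 carried minority variants (2 failures, 10 successes) and, by subtraction, 171 did not (5 failures, 166 successes). The abstract does not report such a test; it is sketched here only to illustrate the reasoning behind the conclusion.

```python
from scipy.stats import fisher_exact

table = [[2, 10],    # minority variant present: failed, succeeded
         [5, 166]]   # no minority variant:      failed, succeeded

odds_ratio, p = fisher_exact(table)
print(f"OR = {odds_ratio:.1f}, p = {p:.3f}")  # wide uncertainty with 7 failures
```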