9 results for graphics cards

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

20.00%

Publisher:

Abstract:

Epidemiological, clinical, and experimental evidence accumulated over recent decades suggests that high-density lipoproteins (HDLs) may protect against atherosclerosis and its clinical consequences. However, more than 55 years after the first description of the link between HDL and heart attacks, many facets of the biochemistry, function, and clinical significance of HDL remain enigmatic. This applies particularly to the completely unexpected results of some recent clinical trials of nicotinic acid and of inhibitors of cholesteryl ester transfer protein (CETP). The concept that raising HDL cholesterol by pharmacological means would decrease the risk of vascular disease has therefore been challenged.

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: The adequacy of thromboprophylaxis prescriptions in acutely ill hospitalized medical patients needs improvement. OBJECTIVE: To prospectively assess the efficacy of various clinical decision support systems (CDSS) in improving thromboprophylaxis adequacy, with the aim of increasing the use of explicit criteria for thromboprophylaxis prescription in nine Swiss medical services. METHODS: We randomly assigned medical services to a pocket digital assistant program (PDA), pocket cards (PC), or no CDSS (controls). In centers using an electronic chart, an e-alert system (eAlerts) was developed. After 4 months, we compared post-CDSS with baseline thromboprophylaxis adequacy for the various CDSS and control groups. RESULTS: Overall, 1085 patients were included (395 controls, 196 PC, 168 PDA, 326 eAlerts), 651 pre- and 434 post-CDSS implementation: 472 (43.5%) presented a risk of VTE justifying thromboprophylaxis (31.8% pre, 61.1% post) and 556 (51.2%) received thromboprophylaxis (54.2% pre, 46.8% post). Overall adequacy (the percentage of patients with an adequate prescription) pre- and post-CDSS implementation was 56.2% and 50.7% for controls (P = 0.29), 67.3% and 45.3% for PC (P = 0.002), 66.0% and 64.9% for PDA (P = 0.99), and 50.5% and 56.2% for eAlerts (P = 0.37), respectively. eAlerts limited overprescription (56% pre, 31% post, P = 0.01). CONCLUSION: While pocket cards and handhelds did not improve thromboprophylaxis adequacy, eAlerts had a modest effect, particularly on the reduction of overprescription. This effect contributes only partially to the improvement of patient safety, and more work is needed towards institution-tailored tools.
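The adequacy figures and P-values above come from comparing, within each study arm, the proportion of adequately treated patients before and after CDSS implementation. As a minimal sketch (not the authors' actual analysis), the code below compares two adequacy proportions with Fisher's exact test; the patient counts are invented for illustration, since the abstract reports only percentages and P-values.

```python
# Minimal sketch: comparing thromboprophylaxis adequacy before vs. after a CDSS.
# The counts below are hypothetical -- the paper reports only percentages and P-values.
from scipy.stats import fisher_exact

def compare_adequacy(adequate_pre, n_pre, adequate_post, n_post):
    """Return pre/post adequacy proportions and a Fisher exact P-value."""
    table = [
        [adequate_pre, n_pre - adequate_pre],     # pre:  adequate / not adequate
        [adequate_post, n_post - adequate_post],  # post: adequate / not adequate
    ]
    _, p_value = fisher_exact(table)
    return adequate_pre / n_pre, adequate_post / n_post, p_value

# Hypothetical pocket-card arm: adequacy drops from roughly 67% to roughly 45%
pre_prop, post_prop, p = compare_adequacy(adequate_pre=66, n_pre=98,
                                           adequate_post=44, n_post=98)
print(f"pre {pre_prop:.1%}, post {post_prop:.1%}, P = {p:.3f}")
```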

Relevance:

10.00%

Publisher:

Abstract:

ABSTRACT: BACKGROUND: Sierra Leone underwent a decade of civil war from 1991 to 2001. Few data on immunization coverage are available from this period, and conflict-related delays in immunization according to the Expanded Programme on Immunization (EPI) schedule have not been investigated. We aimed to study delays in childhood immunization in the context of civil war in a Sierra Leonean community. METHODS: We conducted an immunization survey in Kissy Mess-Mess in the Greater Freetown area in 1998/99 using a two-stage sampling method. Based on immunization cards and verbal history, we collected data on immunization for tuberculosis, diphtheria, tetanus, pertussis, polio, and measles by age group (0-8/9-11/12-23/24-35 months). We studied differences between age groups and explored temporal associations with war-related hostilities taking place in the community. RESULTS: We included 286 children who received 1690 vaccine doses; card retention was 87%. In 243 children (85%, 95% confidence interval (CI): 80-89%) immunization was up-to-date. In 161 of these children (56%, 95% CI: 50-62%) full age-appropriate immunization was achieved; in 82 (29%, 95% CI: 24-34%) immunization was not appropriate for age. Among the remaining 43 children, immunization was partial in 37 (13%, 95% CI: 9-17%) and absent in 6 (2%, 95% CI: 1-5%). Immunization status varied across age groups. In children aged 9-11 months the proportion with age-inappropriate (delayed) immunization was higher than in other age groups, suggesting an association with war-related hostilities in the community. CONCLUSION: Only about half of children under three years received full age-appropriate immunization. In children born during a period of increased hostilities, immunization was mostly inappropriate for age, but recommended immunizations were not completely abandoned. Missing or delayed immunization represents an additional threat to the health of children living in conflict areas.
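The 95% confidence intervals quoted above (e.g. 243 of 286 children up-to-date, 95% CI 80-89%) can be reproduced with a standard interval for a binomial proportion. The sketch below uses the Wilson score interval; it illustrates the calculation and is not necessarily the authors' exact method.

```python
# Minimal sketch: 95% Wilson score interval for an immunization coverage proportion.
from math import sqrt

def wilson_ci(successes, n, z=1.96):
    """Wilson score 95% confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    margin = (z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))) / denom
    return centre - margin, centre + margin

# 243 of 286 children with up-to-date immunization (reported as 85%, 95% CI 80-89%)
low, high = wilson_ci(243, 286)
print(f"coverage {243/286:.0%}, 95% CI {low:.0%}-{high:.0%}")
```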

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: Ethyl glucuronide (EtG) and ethyl sulfate (EtS) are non-oxidative minor metabolites of ethanol. They are detectable in various body fluids shortly after initial consumption of ethanol and have a longer detection time frame than the parent compound. They are regarded highly sensitive and specific markers of recent alcohol uptake. This study evaluates the determination of EtG and EtS from dried blood spots (DBS), a simple and cost-effective sampling method that would shorten the time gap between offense and blood sampling and lead to a better reflectance of the actual impairment. METHODS: For method validation, EtG and EtS standard and quality control samples were prepared in fresh human heparinized blood and spotted on DBS cards, then extracted and measured by an LC-ESI-MS/MS method. Additionally, 76 heparinized blood samples from traffic offense cases were analyzed for EtG and EtS as whole blood and as DBS specimens. The results from these measurements were then compared by calculating the respective mean values, by a matched-paired t test, by a Wilcoxon test, and by Bland-Altman and Mountain plots. RESULTS AND DISCUSSION: Calibrations for EtG and EtS in DBS were linear over the studied calibration range. The precision and accuracy of the method met the requirements of the validation guidelines that were employed in the study. The stability of the biomarkers stored as DBS was demonstrated under different storage conditions. The t test showed no significant difference between whole blood and DBS in the determination of EtG and EtS. In addition, the Bland-Altman analysis and Mountain plot confirmed that the concentration differences that were measured in DBS specimens were not relevant.
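The agreement analysis between whole-blood and DBS concentrations can be illustrated with a basic Bland-Altman calculation: the mean of the paired differences (bias) and the limits of agreement at ±1.96 standard deviations. The sketch below uses simulated paired EtG concentrations and schematic values; it shows the form of the comparison, not the study data.

```python
# Minimal sketch: Bland-Altman agreement statistics for paired whole-blood vs. DBS values.
# The paired concentrations are simulated; the study used 76 real traffic-offense samples.
import numpy as np

rng = np.random.default_rng(0)
whole_blood = rng.uniform(0.1, 5.0, size=76)            # hypothetical EtG, mg/L
dbs = whole_blood + rng.normal(0.0, 0.05, size=76)      # DBS with small random error

diff = dbs - whole_blood
bias = diff.mean()                                      # mean difference (bias)
spread = 1.96 * diff.std(ddof=1)                        # 1.96 SD of the differences
loa_low, loa_high = bias - spread, bias + spread        # limits of agreement

print(f"bias {bias:+.3f} mg/L, limits of agreement [{loa_low:+.3f}, {loa_high:+.3f}]")
```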

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND Assessment of the proportion of patients with well-controlled cardiovascular risk factors underestimates the proportion of patients receiving high-quality care. Evaluating whether physicians respond appropriately to poor risk factor control gives a different picture of quality of care. We assessed physician response to poor control of cardiovascular risk factors, as well as markers of potential overtreatment, in Switzerland, a country with universal healthcare coverage but without systematic quality monitoring, annual report cards on quality of care, or financial incentives to improve quality. METHODS We performed a retrospective cohort study of 1002 randomly selected patients aged 50-80 years from four university primary care settings in Switzerland. For hypertension, dyslipidemia, and diabetes mellitus, we first measured the proportions in control, then assessed therapy modifications among those in poor control. "Appropriate clinical action" was defined as a therapy modification, or a return to control without therapy modification, within 12 months among patients with baseline poor control. Potential overtreatment of these conditions was defined as intensive treatment among low-risk patients with optimal target values. RESULTS 20% of patients with hypertension, 41% with dyslipidemia, and 36% with diabetes mellitus were in control at baseline. When appropriate clinical action in response to poor control was integrated into the measurement of quality of care, 52 to 55% had appropriate quality of care. Over 12 months, therapy was modified for 61% of patients with baseline poor control for hypertension, 33% for dyslipidemia, and 85% for diabetes mellitus. Increases in the number of drug classes (28-51%) and in drug doses (10-61%) were the most common therapy modifications. Patients with target organ damage and higher baseline values were more likely to receive appropriate clinical action. We found low rates of potential overtreatment: 2% for hypertension, 3% for diabetes mellitus, and 3-6% for dyslipidemia. CONCLUSIONS In primary care, evaluating whether physicians respond appropriately to poor risk factor control, in addition to assessing the proportions in control, provides a broader view of quality of care than relying solely on measures of control. Such measures could be more clinically relevant and acceptable to physicians than simply reporting levels of control.
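The quality measure described above combines control status with physician response: a patient in poor control at baseline still counts as appropriately managed if therapy was modified, or if the risk factor returned to control without modification, within 12 months. The sketch below encodes that definition for a small, invented patient table; the field names and records are assumptions for illustration only.

```python
# Minimal sketch: "appropriate clinical action" as defined in the abstract.
# A patient counts as appropriately managed if in control at baseline, or, when in
# poor control, if therapy was modified or control was regained within 12 months.
# The patient records below are invented for illustration.
from dataclasses import dataclass

@dataclass
class Patient:
    in_control_baseline: bool
    therapy_modified_12m: bool
    returned_to_control_12m: bool

def appropriate_care(p: Patient) -> bool:
    if p.in_control_baseline:
        return True
    return p.therapy_modified_12m or p.returned_to_control_12m

patients = [
    Patient(True, False, False),    # controlled at baseline
    Patient(False, True, False),    # poor control, therapy intensified
    Patient(False, False, True),    # poor control, regained control anyway
    Patient(False, False, False),   # poor control, no action
]
share = sum(appropriate_care(p) for p in patients) / len(patients)
print(f"appropriate quality of care: {share:.0%}")
```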

Relevance:

10.00%

Publisher:

Abstract:

We present an application- and sample-independent method for the automatic discrimination of noise and signal in optical coherence tomography (OCT) B-scans. The proposed algorithm models the observed noise probabilistically and allows for a dynamic determination of image noise parameters and the choice of appropriate image rendering parameters. This overcomes observer variability and the need for a priori information about the content of sample images, both of which are challenging to estimate systematically with current systems. As such, our approach has the advantage of automatically determining crucial parameters for evaluating rendered image quality in a systematic and task-independent way. We tested our algorithm on data from four different biological and non-biological samples (index finger, lemon slices, sticky tape, and detector cards) acquired with three different experimental spectral-domain OCT measurement systems, including a swept-source system. The results are compared to parameters determined manually by four experienced OCT users. Overall, our algorithm works reliably regardless of which system and sample are used, and it estimates noise parameters in all cases within the confidence interval of those found by the observers.
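The abstract does not spell out the underlying noise model, so the sketch below only illustrates the general idea of estimating noise parameters directly from a B-scan and deriving a display threshold from them. It assumes that the signal-free background of the magnitude image is approximately Rayleigh distributed, which is a common but here unverified assumption, and it runs on simulated data rather than real OCT measurements.

```python
# Minimal sketch: estimate noise parameters from an OCT-like B-scan and derive a
# display threshold. Assumes the signal-free background follows a Rayleigh
# distribution (an assumption; the paper does not state its exact noise model).
import numpy as np

rng = np.random.default_rng(1)

# Simulated magnitude B-scan: Rayleigh background plus a bright horizontal "layer".
bscan = rng.rayleigh(scale=2.0, size=(512, 1024))
bscan[200:220, :] += 25.0

# Estimate the Rayleigh scale from rows assumed to contain no sample signal.
background = bscan[:64, :]                            # top rows are pure noise here
sigma_hat = np.sqrt(np.mean(background**2) / 2.0)     # Rayleigh ML estimate

# Threshold at the 99.9th percentile of the fitted noise distribution.
threshold = sigma_hat * np.sqrt(-2.0 * np.log(1e-3))
signal_mask = bscan > threshold

print(f"estimated noise scale: {sigma_hat:.2f}, display threshold: {threshold:.2f}")
print(f"fraction of pixels classified as signal: {signal_mask.mean():.2%}")
```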

Relevance:

10.00%

Publisher:

Abstract:

OBJECTIVES To improve malnutrition awareness and management in our department of general internal medicine; to assess patients' nutritional risk; and to evaluate whether an online educational program leads to an increase in basic knowledge and more frequent nutritional therapies. METHODS A prospective pre-post intervention study was conducted at a university department of general internal medicine. Nutritional screening using the Nutritional Risk Score 2002 (NRS 2002) was performed, and prescriptions of nutritional therapies were assessed. The intervention included an online learning program and a pocket card for all residents, who had to complete a multiple-choice question (MCQ) test on basic nutritional knowledge before and after the intervention. RESULTS A total of 342 patients were included in the preintervention phase and 300 in the postintervention phase. In the preintervention phase, 54.1% were at nutritional risk (NRS 2002 ≥3), compared with 61.7% in the postintervention phase. There was no increase in the prescription of nutritional therapies (18.7% versus 17.0%). Forty-nine and 41 residents (response rates 58% and 48%) completed the MCQ test before and after the intervention, respectively. The mean percentage of correct answers was 55.6% and 59.43%, respectively (not statistically significant). Fifty of 84 residents completed the online program. The residents who participated in the whole program scored higher on the second MCQ test (63% versus 55% correct answers, P = 0.031). CONCLUSIONS Despite a high proportion of malnourished patients, nutritional intervention, as assessed by nutritional prescriptions, is insufficient. The simple Internet-based educational program and the use of NRS 2002 pocket cards did not improve either malnutrition awareness or nutritional treatment. More sophisticated educational systems to fight malnutrition are necessary.
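The screening figures above follow from the NRS 2002 cut-off used in the study: a total score of 3 or more flags a patient as being at nutritional risk. The short sketch below applies that cut-off to a list of hypothetical screening scores to show how an at-risk proportion is obtained; the scores themselves are invented.

```python
# Minimal sketch: flag patients at nutritional risk with the NRS 2002 cut-off (score >= 3).
# The screening scores below are invented; the study screened 342 and 300 patients.
from typing import List

def at_risk_proportion(nrs_scores: List[int], cutoff: int = 3) -> float:
    """Proportion of patients whose NRS 2002 total score meets the risk cut-off."""
    flagged = [score for score in nrs_scores if score >= cutoff]
    return len(flagged) / len(nrs_scores)

scores = [1, 2, 4, 3, 5, 2, 0, 3, 4, 1]        # hypothetical NRS 2002 totals
print(f"at nutritional risk: {at_risk_proportion(scores):.0%}")
```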

Relevance:

10.00%

Publisher:

Abstract:

Biosecurity is crucial for safeguarding livestock from infectious diseases. Despite the plethora of biosecurity recommendations, published scientific evidence on the effectiveness of individual biosecurity measures is limited. The objective of this study was to assess the perception of Swiss experts of the effectiveness and importance of individual on-farm biosecurity measures for cattle and swine farms (31 and 30 measures, respectively). Using a modified Delphi method, 16 Swiss livestock disease specialists (8 for each species) were interviewed. The experts were asked to rank biosecurity measures written on cards by allocating a score from 0 (lowest) to 5 (highest). Experts ranked the measures by their importance with respect to Swiss legislation, their feasibility, and the effort required for implementation as well as the benefit of each measure. The experts also ranked biosecurity measures by their effectiveness in preventing an infectious agent from entering and spreading on a farm, based solely on the transmission characteristics of specific pathogens. The pathogens considered by the cattle experts were those causing Bluetongue (BT), Bovine Viral Diarrhea (BVD), Foot and Mouth Disease (FMD), and Infectious Bovine Rhinotracheitis (IBR). The swine experts expressed their opinion on the pathogens causing African Swine Fever (ASF), Enzootic Pneumonia (EP), and Porcine Reproductive and Respiratory Syndrome (PRRS), as well as FMD. For cattle farms, biosecurity measures that improve farmers' disease awareness were ranked as both most important and most effective. For swine farms, the most important and effective measures identified were those related to animal movements. Among all single measures evaluated, education of farmers was perceived by the experts to be the most important and effective for protecting both Swiss cattle and swine farms from disease. The findings of this study provide an important basis for recommendations to farmers and policy makers.
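The ranking step of the modified Delphi process described above (eight experts per species scoring each measure from 0 to 5) can be summarized by aggregating the expert scores per measure, for instance with the median and the score range. The sketch below does this for a few invented measures and scores; it illustrates the aggregation only, not the authors' exact analysis.

```python
# Minimal sketch: aggregate Delphi expert scores (0-5) per biosecurity measure.
# Measure names and scores are invented for illustration (8 experts per species).
from statistics import median

expert_scores = {
    "farmer education":                 [5, 5, 4, 5, 4, 5, 5, 4],
    "quarantine of purchased animals":  [4, 5, 4, 4, 3, 5, 4, 4],
    "visitor boot disinfection":        [3, 2, 4, 3, 3, 2, 3, 3],
}

# Rank measures from most to least highly scored, using the median expert score.
ranked = sorted(expert_scores.items(),
                key=lambda item: median(item[1]),
                reverse=True)

for measure, scores in ranked:
    print(f"{measure:35s} median {median(scores):.1f}  range {min(scores)}-{max(scores)}")
```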