921 results for Library personnel management
Abstract:
BACKGROUND: We have carried out an extensive qualitative research program focused on the barriers and facilitators to successful adoption and use of various features of advanced, state-of-the-art electronic health records (EHRs) within large academic teaching facilities with long-standing EHR research and development programs. We have recently begun investigating smaller community hospitals and outpatient clinics that rely on commercially available EHRs. We sought to assess whether the current generation of commercially available EHRs is capable of providing the clinical knowledge management features, functions, tools, and techniques required to deliver and maintain the clinical decision support (CDS) interventions needed to support the recently defined "meaningful use" criteria. METHODS: We developed and fielded a 17-question survey to representatives from nine commercially available EHR vendors and four leading internally developed EHRs. The first part of the survey asked basic questions about the vendor's EHR. The second part asked specifically about the CDS-related system tools and capabilities that each vendor provides. The final section asked about clinical content. RESULTS: All of the vendors and institutions have multiple modules capable of providing clinical decision support interventions to clinicians. The majority of the systems were capable of performing almost all of the key knowledge management functions we identified. CONCLUSION: If these well-designed commercially available systems are coupled with the other key socio-technical concepts required for safe and effective EHR implementation and use, and organizations have access to implementable clinical knowledge, we expect that the transformation of the healthcare enterprise that so many have predicted is achievable using commercially available, state-of-the-art EHRs.
Abstract:
This study focused on the instruments currently used by fire department personnel to identify and classify juvenile firesetters. These instruments, as published by the Federal Emergency Management Agency (FEMA), have never been empirically validated as to their ability to discriminate between first-time and multiple firesetters or to predict the degree of risk for future firesetting by juveniles who come to the attention of authorities for firesetting behaviors. The study was descriptive in nature and was not designed to test the validity of these instruments; rather, it was designed to test the ability of the instruments to discriminate between first-time and multiple firesetters and to categorize known firesetters, based on the motive for firesetting, as to their degree of risk for future firesetting. The results suggest that the FEMA instruments are of little use in discriminating between first-time and multiple firesetters. The FEMA instruments were also unable to categorize juvenile firesetters as to their potential risk for future firesetting. A subset of variables from the FEMA instruments was identified that may be useful in discriminating between youths who are troubled firesetters and those who are not.
Abstract:
A patient classification system was developed that integrates a patient acuity instrument with a computerized nursing distribution method based on a linear programming model. The system was designed for real-time measurement of patient acuity (workload) and allocation of nursing personnel to optimize the utilization of resources. The acuity instrument was a prototype tool with eight categories of patients defined by patient severity and nursing intensity parameters. From this tool, the demand for nursing care was defined in patient points, with one point equal to one hour of RN time. Validity and reliability of the instrument were determined as follows: (1) content validity by a panel of expert nurses; (2) predictive validity through a paired t-test analysis of preshift and postshift categorization of patients; (3) initial reliability by a one-month pilot of the instrument in a practice setting; and (4) interrater reliability by the Kappa statistic. The nursing distribution system was a linear programming model using a branch-and-bound technique to obtain integer solutions. The objective function was to minimize the total number of nursing personnel used by optimally assigning the staff to meet the acuity needs of the units. A penalty weight was used as a coefficient of the objective function variables to define priorities for allocation of staff. The demand constraints were requirements to meet the total acuity points needed for each unit and to have a minimum number of RNs on each unit. Supply constraints were: (1) the total availability of each type of staff and the value of that staff member (value was determined relative to that type of staff's ability to perform the job functions of an RN; e.g., the value for eight hours of an RN = 8 points, of an LVN = 6 points); and (2) the number of personnel available for floating between units. The capability of the model to assign staff quantitatively and qualitatively equal to the manual method was established by a thirty-day comparison.
Sensitivity testing demonstrated appropriate adjustment of the optimal solution to changes in the penalty coefficients of the objective function and in the acuity totals of the demand constraints. Further investigation of the model documented correct adjustment of assignments in response to staff value changes, and cost minimization through the addition of a dollar coefficient to the objective function.
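The allocation model described above can be sketched as a small integer program. The following is a minimal illustration with hypothetical figures (two units, acuity totals, and supply numbers invented for the example; only the RN = 8 points / LVN = 6 points valuation comes from the abstract). The original system used branch-and-bound linear programming; at this toy scale, plain enumeration of integer staffing plans suffices:

```python
from itertools import product

# Hypothetical figures (not from the study): two units, RN = 8 acuity
# points per 8-hour shift, LVN = 6 points (value relative to an RN).
RN_POINTS, LVN_POINTS = 8, 6
ACUITY = [40, 28]        # demand in patient points for units 1 and 2
MIN_RN = 2               # minimum RNs required on each unit
RN_SUPPLY, LVN_SUPPLY = 8, 6

def optimal_assignment():
    """Enumerate integer staffing plans and keep the smallest headcount."""
    best_total, best_plan = None, None
    for rn in product(range(RN_SUPPLY + 1), repeat=2):
        for lvn in product(range(LVN_SUPPLY + 1), repeat=2):
            if sum(rn) > RN_SUPPLY or sum(lvn) > LVN_SUPPLY:
                continue                      # supply constraints
            if any(r < MIN_RN for r in rn):
                continue                      # minimum-RN constraint
            if any(RN_POINTS * r + LVN_POINTS * l < need
                   for r, l, need in zip(rn, lvn, ACUITY)):
                continue                      # acuity demand constraints
            total = sum(rn) + sum(lvn)
            if best_total is None or total < best_total:
                best_total, best_plan = total, (rn, lvn)
    return best_total, best_plan

best_total, best_plan = optimal_assignment()
```

Priority weights on staff types, float pools, and dollar coefficients can be added to the objective in the same way; the point here is only the shape of the demand and supply constraints.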
Abstract:
This study of ambulance workers in the City of Houston emergency medical services examined the factors related to shiftwork tolerance and intolerance. The EMS personnel work a 24-hour shift with rotating days of the week. Workers are assigned to the A, B, C, or D shift, each of which rotates 24 hours on, 24 hours off, 24 hours on, and 4 days off. One hundred seventy-six male EMTs, paramedics, and chauffeurs from stations of varying levels of activity were surveyed. The sample group ranged in age from 20 to 45. The average tenure on the job was 8.2 years. Over 68% of the workers held a second job, the majority working over 20 hours a week at the second position. The survey instrument was a 20-page questionnaire modeled after the Folkard Standardized Shiftwork Index. In addition to demographic data, the survey tool provided measurements of general job satisfaction, sleep quality, general health complaints, morningness/eveningness, cognitive and somatic anxiety, depression, and circadian types. The survey questionnaire included an EMS-specific scale of stress. A conceptual model of shiftwork tolerance was presented to identify the key factors examined in the study. An extensive list of 265 variables was reduced to 36 key variables relating to: (1) shift schedule and demographic/lifestyle factors, (2) individual differences related to traits and characteristics, and (3) tolerance/intolerance effects. Using the general job satisfaction scale as the key measurement of shift tolerance/intolerance, it was shown that a significant relationship existed between this dependent variable and stress, number of years working a 24-hour shift, sleep quality, languidness/vigorousness, the usual amount of sleep received during the shift, general health complaints, and flexibility/rigidity (R² = .5073). The sample consisted of a majority of morning types or extreme morning types, few evening types, and no extreme evening types, duplicating the findings of Motohashi's previous study of ambulance workers. The level of activity by station had no significant effect on any of the dependent variables examined. However, the shift worked had a relationship with sleep quality, despite the fact that all shifts work the same hours and participate in the same rotation schedule.
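An R² of the kind reported here is one minus the ratio of the residual to the total sum of squares from the fitted regression. A short sketch with made-up numbers (a single predictor standing in for the survey factors, not the study's 36-variable model):

```python
import numpy as np

# Made-up data (not the study's): one predictor and a response standing
# in for the job satisfaction score.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.0, 4.0, 6.0, 9.0])

# Ordinary least squares: design matrix with slope and intercept columns.
X = np.column_stack([x, np.ones_like(x)])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ coef

ss_res = float(np.sum((y - y_hat) ** 2))     # residual sum of squares
ss_tot = float(np.sum((y - y.mean()) ** 2))  # total sum of squares
r_squared = 1 - ss_res / ss_tot              # variance explained
```

With more predictors the design matrix simply gains columns; the R² computation is unchanged.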
Abstract:
The evolution of pharmaceutical care is identified through a complete review of the literature published in the American Journal of Health-System Pharmacy, the sole comprehensive publication of institutional pharmacy practice. The evolution is categorized according to characteristics of structure (organizational structure, the role of the pharmacist), process (drug delivery systems, formulary management, acquiring drug products, methods to impact drug therapy decisions), and outcomes (cost of drug delivery, cost of drug acquisition and use, improved safety, improved health outcomes) recorded from the 1950s through the 1990s. While significant progress has been made in implementing basic drug distribution systems, the level of pharmacy involvement with direct patient care remains limited. A new practice framework suggests enhanced direct patient care involvement through increases in the efficiency and effectiveness of traditional pharmacy services. Recommendations advance internal and external organizational structure relationships that position pharmacists to fully use their unique skills and knowledge to impact drug therapy decisions and outcomes. Specific strategies facilitate expansion of the breadth and scope of each process component in order to expand the depth of integration of pharmacy and pharmaceutical care within the broad healthcare environment. Economic evaluation methods formally evaluate the impact of both operational and clinical interventions. Outcome measurements include specific recommendations and methods to increase the efficiency of drug acquisition, emphasizing pharmacists' roles in influencing physician prescribing decisions.
Effectiveness measures include those that improve the safety of drug distribution systems, decrease the potential for adverse drug therapy events, and demonstrate that pharmaceutical care can significantly contribute to improvement in overall health status. The implementation of the new framework is modeled on a case study at the M.D. Anderson Cancer Center, where the implementation of several new drug distribution methods facilitated the redeployment of personnel from distributive functions to direct patient care activities, with significant personnel and drug cost reductions. A cost-benefit analysis illustrates that the framework's process enhancements produced a benefit-to-cost ratio of 7.9. In addition, measures of effectiveness demonstrated significant levels of safety and enhanced drug therapy outcomes.
Abstract:
Patient self-management (PSM) of oral anticoagulation is under discussion because evidence from real-life settings is missing. Using data from a nationwide, prospective cohort study in Switzerland, we assessed the overall long-term efficacy and safety of PSM and examined subgroups. Data from 1140 patients (5818.9 patient-years) were analysed, and no patients were lost to follow-up. Median follow-up was 4.3 years (range 0.2-12.8 years). Median age at the time of training was 54.2 years (range 18.2-85.2), and 34.6% of patients were women. All-cause mortality was 1.4 per 100 patient-years (95% CI 1.1-1.7), with higher rates in patients with atrial fibrillation (2.5; 1.6-3.7; p<0.001), patients >50 years of age (2.0; 1.6-2.6; p<0.001), and men (1.6; 1.2-2.1; p = 0.036). The rate of thromboembolic events was 0.4 (0.2-0.6) per 100 patient-years and was independent of indication, sex, and age. Major bleeding was observed at a rate of 1.1 (0.9-1.5) per 100 patient-years. Efficacy was comparable to standard care and new oral anticoagulants in a network meta-analysis. PSM by properly trained patients is effective and safe in a long-term real-life setting and robust across clinical subgroups. Adoption in various clinical settings, including rural areas and settings with limited access to medical care, is warranted.
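Rates expressed per 100 patient-years, like those above, follow from an event count divided by total follow-up time; a common approximation puts the confidence interval on the log scale. A sketch with illustrative numbers (the event count of 80 is hypothetical, chosen only to land near the reported mortality rate; it is not taken from the study):

```python
import math

def rate_per_100py(events, patient_years, z=1.96):
    """Event rate per 100 patient-years with a log-scale Poisson 95% CI.

    SE of log(rate) is approximately 1/sqrt(events) for a Poisson count.
    """
    rate = events / patient_years * 100
    half_width = z / math.sqrt(events)
    return rate, rate * math.exp(-half_width), rate * math.exp(half_width)

# Hypothetical: 80 deaths over the cohort's 5818.9 patient-years.
rate, lo, hi = rate_per_100py(80, 5818.9)
```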
Abstract:
The endemic New Zealand longfin eel Anguilla dieffenbachii (hereafter, longfin eel) is overfished, and in southern South Island, New Zealand, rivers have recently become predominated by males. This study examined length and age at sexual differentiation in male eels in the Aparima River catchment (area 1,375 km²; mean flow 20 m³·s⁻¹) and the sex ratio and distribution of eels throughout the catchment. Longfin eels differentiated into males mostly at lengths from 300 to 460 mm and ages from 10 to 25+ years. Females were rare: of 738 eels examined for sexual differentiation, 466 were males and 5 were females, and a few others, not examined, were large enough to be female. These counts suggest a male:female ratio among differentiated longfin eels of 68:1. Of 31 differentiated shortfin eels A. australis, less common in the Aparima River, 26 were females. Male longfin eels were distributed throughout the main stem and tributaries; undifferentiated eels were more prevalent in lower and middle reaches and in the main stem than in upper reaches and tributaries. In other studies, male longfin eels predominated in commercial catches in the Aparima and four other southernmost rivers, by 2.4:1 to 13.6:1 males to females. The Aparima River had the most skewed sex ratio. Longfin eel catches from the Aparima River will become more male-predominated because few sublegal-size females were present. The length-frequency distributions of eels in the present samples and in the commercial catches were truncated just above the minimum legal size (about 460 mm), showing that few females escape the fishery. Historically, females predominated in these rivers. The recent change in sex ratio is attributable partly to selective harvest of females and partly to fishing-induced changes in the structure of the population, such that differentiation into males has been favored. Longevity, delayed sexual maturity, semelparity, and endemism with a restricted range make the longfin eel particularly vulnerable to overfishing.
Abstract:
Postpartum hemorrhage (PPH) is one of the main causes of maternal death, even in industrialized countries. It represents an emergency situation that necessitates a rapid decision and, in particular, an exact diagnosis and root cause analysis in order to initiate the correct therapeutic measures in interdisciplinary cooperation. In addition to established guidelines, the benefits of standardized therapy algorithms have been demonstrated. A therapy algorithm for the obstetric emergency of postpartum hemorrhage has not previously been available in the German language. The establishment of an international (Germany, Austria, and Switzerland; D-A-CH) "treatment algorithm for postpartum hemorrhage" was an interdisciplinary project based on the guidelines of the corresponding specialist societies (anesthesia and intensive care medicine, and obstetrics) in the three countries, as well as on comparable international algorithms for the therapy of PPH. Obstetrics and anesthesiology personnel must possess sufficient expertise for emergency situations despite low case numbers. The rarity of occurrence for individual patients and the life-threatening nature of the situation necessitate a structured approach according to predetermined treatment algorithms, which the established algorithm provides. Furthermore, the algorithm presents an opportunity to train for emergency situations in an interdisciplinary team.
Abstract:
Forest management not only affects biodiversity but may also alter ecosystem processes mediated by organisms, such as herbivory, the removal of plant biomass by plant-eating insects and other arthropod groups. Aiming to reveal general relationships between forest management and herbivory, we investigated aboveground arthropod herbivory in 105 plots dominated by European beech in three different regions in Germany, in the sun-exposed canopy of mature beech trees and on beech saplings in the understorey. We separately assessed damage by different guilds of herbivores, i.e. chewing, sucking and scraping herbivores, gall-forming insects and mites, and leaf-mining insects. We asked whether herbivory differs among forest management regimes (unmanaged, uneven-aged managed, even-aged managed) and among age-classes within even-aged forests. We further tested for consistency of relationships between regions, strata and herbivore guilds. On average, almost 80% of beech leaves showed herbivory damage, and about 6% of leaf area was consumed. Chewing damage was most common, whereas leaf sucking and scraping damage were very rare. Damage was generally greater in the canopy than in the understorey, in particular for chewing and scraping damage and the occurrence of mines. There was little difference in herbivory among differently managed forests, and the effects of management on damage differed among regions, strata and damage types. Covariates such as wood volume, tree density and plant diversity weakly influenced herbivory, and effects differed between herbivory types. We conclude that, despite the relatively low number of species attacking beech, arthropod herbivory on beech is generally high. We further conclude that responses of herbivory to forest management are multifaceted, and that environmental factors such as forest structure variables, affecting in particular microclimatic conditions, are more likely to explain the variability in herbivory among beech forest plots.
Abstract:
Intensive land use is a driving force of biodiversity decline in many ecosystems. In semi-natural grasslands, land-use activities such as mowing, grazing and fertilization affect the diversity of plants and arthropods, but the combined effects of different drivers and the chain of effects are largely unknown. In this study we used structural equation modelling to analyse how the arthropod communities in managed grasslands respond to land use and whether these responses are mediated through changes in resource diversity or resource quantity (biomass). Plants were considered resources for herbivores, which were themselves considered resources for predators. Plant and arthropod (herbivore and predator) communities were sampled on 141 meadows, pastures and mown pastures within three regions in Germany in 2008 and 2009. Increasing land-use intensity generally increased plant biomass and decreased plant diversity, mainly through increasing fertilization. Herbivore diversity decreased together with plant diversity but showed no response to changes in plant biomass. Hence, land-use effects on herbivore diversity were mediated through resource diversity rather than quantity. Land-use effects on predator diversity were mediated by both herbivore diversity (resource diversity) and herbivore quantity (herbivore biomass), but indirect effects through resource quantity were stronger. Our findings highlight the importance of assessing both direct and indirect effects of land-use intensity and mode on different trophic levels. In addition to the overall effects, there were subtle differences between the regions, pointing to the importance of regional land-use specificities. Our study underlines the commonly observed strong effect of grassland land use on biodiversity. It also highlights that mechanistic approaches help us to understand how different land-use modes affect biodiversity.
Abstract:
The perioperative management of patients with mediastinal masses is a special clinical challenge in our field. Even though regional anaesthesia is normally the first choice, in some cases it is not feasible due to the method of operation. In these cases general anaesthesia is the second option, but it can lead to respiratory and haemodynamic decompensation due to tumor-associated compression syndrome (mediastinal mass syndrome). Appropriate treatment begins with preoperative risk classification on the basis of clinical and radiological findings. In addition to the anamnesis, chest radiograph, and CT, dynamic methods (e.g. pneumotachography and echocardiography) should be applied to identify possible intraoperative compression syndromes. General anaesthesia should be induced by awake fiberoptic intubation, with the tube introduced via the nasal route while the patient's spontaneous breathing is maintained. Anaesthesia is then continued with short-acting agents administered by inhalation or intravenously. If the operation permits, muscle relaxants should not be applied. If the anaesthesia risk is classified as uncertain or unsafe, then, depending on the location of the tumor compression (tracheobronchial tree, pulmonary artery, superior vena cava), alternative techniques for securing the airway (different tubes, rigid bronchoscope) and cardiopulmonary bypass with extracorporeal oxygenation are prepared. For patients with severe clinical symptoms and an extensive mediastinal mass, preoperative cannulation of the femoral vessels is also recommended. In addition to fulfilling technical and personnel requirements, interdisciplinary cooperation among the participating specialties is the most important prerequisite for optimal treatment of these patients.
Abstract:
Antimicrobial drugs may be used to treat diarrheal illness in companion animals. It is important to monitor antimicrobial use to better understand trends and patterns in antimicrobial resistance. There is no monitoring of antimicrobial use in companion animals in Canada. To explore how the use of electronic medical records could contribute to the ongoing, systematic collection of antimicrobial use data in companion animals, anonymized electronic medical records were extracted from 12 participating companion animal practices and warehoused at the University of Calgary. We used the pre-diagnostic clinical features of diarrhea as the case definition in this study. Using text-mining technologies, cases of diarrhea were described by each of the following variables: diagnostic laboratory tests performed, the etiological diagnosis, and antimicrobial therapies. The ability of the text miner to accurately describe the cases for each of the variables was evaluated. It could not reliably classify cases in terms of diagnostic tests or etiological diagnosis; a manual review of a random sample of 500 diarrhea cases determined that 88/500 (17.6%) of the target cases underwent diagnostic testing, of which 36/88 (40.9%) had an etiological diagnosis. Text mining, compared to a human reviewer, could accurately identify cases that had been treated with antimicrobials, with high sensitivity (92%; 95% confidence interval, 88.1%-95.4%) and specificity (85%; 95% confidence interval, 80.2%-89.1%). Overall, 7,400/15,928 (46.5%) of pets presenting with diarrhea were treated with antimicrobials. Some temporal trends and patterns of antimicrobial use are described. The results from this study suggest that informatics and electronic medical records could be useful for monitoring trends in antimicrobial use.
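Validation figures like those quoted for the text miner are standard confusion-matrix quantities. A minimal sketch (the cell counts below are hypothetical, chosen only to reproduce 92% sensitivity and 85% specificity; they are not the study's actual tabulation):

```python
import math

def diagnostic_accuracy(tp, fn, tn, fp, z=1.96):
    """Sensitivity and specificity with normal-approximation 95% CIs."""
    def prop_ci(successes, n):
        p = successes / n
        se = math.sqrt(p * (1 - p) / n)   # binomial standard error
        return p, p - z * se, p + z * se
    sensitivity = prop_ci(tp, tp + fn)    # true positives / all positives
    specificity = prop_ci(tn, tn + fp)    # true negatives / all negatives
    return sensitivity, specificity

# Hypothetical counts: 500 truly treated and 500 truly untreated cases,
# classified by the text miner against the human reviewer's gold standard.
(sens, s_lo, s_hi), (spec, c_lo, c_hi) = diagnostic_accuracy(
    tp=460, fn=40, tn=425, fp=75)
```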
Abstract:
Most commercial project management software packages include planning methods to devise schedules for resource-constrained projects. As the planning methods implemented are proprietary information of the software vendors, the question arises how the software packages differ in quality with respect to their resource-allocation capabilities. We experimentally evaluate the resource-allocation capabilities of eight recent software packages using 1,560 instances with 30, 60, and 120 activities from the well-known PSPLIB library. In some of the analyzed packages, the user may influence the resource allocation by means of multi-level priority rules, whereas in other packages only a few options can be chosen. We study the impact of various complexity parameters and priority rules on the project duration obtained by the software packages. The results indicate that the resource-allocation capabilities of these packages differ significantly. In general, the relative gap between the packages grows with increasing resource scarcity and with an increasing number of activities. Moreover, the selection of the priority rule has a considerable impact on the project duration. Surprisingly, when a priority rule is selected in the packages where this is possible, both the mean and the variance of the project duration are in general worse than for the packages which do not offer the selection of a priority rule.
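Priority-rule methods of the kind such packages typically implement combine an activity ordering with a schedule-generation scheme. A minimal serial schedule-generation scheme for a single renewable resource is sketched below, on a hypothetical four-activity project (not a PSPLIB instance); each activity in the priority order is placed at its earliest precedence- and resource-feasible start:

```python
from collections import defaultdict

# Hypothetical project: activity -> (duration, resource demand, predecessors)
ACTS = {
    "A": (3, 2, []),
    "B": (2, 3, []),
    "C": (2, 2, ["A"]),
    "D": (1, 2, ["B", "C"]),
}
CAPACITY = 4  # units of the single renewable resource available per period

def serial_sgs(order):
    """Serial schedule generation: place each activity, in priority order,
    at the earliest start satisfying precedence and resource constraints.
    `order` must be precedence-feasible (predecessors listed first)."""
    usage = defaultdict(int)   # resource units in use per time period
    finish = {}
    for act in order:
        dur, req, preds = ACTS[act]
        t = max((finish[p] for p in preds), default=0)
        while any(usage[u] + req > CAPACITY for u in range(t, t + dur)):
            t += 1             # shift right until the resource fits
        for u in range(t, t + dur):
            usage[u] += req
        finish[act] = t + dur
    return finish

finish = serial_sgs(["A", "B", "C", "D"])   # one possible priority order
makespan = max(finish.values())
```

Different priority rules simply produce different `order` lists, and hence different makespans, which is the effect the evaluation above measures across packages.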
Abstract:
OBJECTIVES Rates of TB/HIV coinfection and multidrug-resistant (MDR)-TB are increasing in Eastern Europe (EE). We aimed to study clinical characteristics, factors associated with MDR-TB, and the predicted activity of empiric anti-TB treatment at the time of TB diagnosis among TB/HIV coinfected patients in EE, Western Europe (WE), Southern Europe (SE), and Latin America (LA). DESIGN AND METHODS Between January 1, 2011, and December 31, 2013, 1413 TB/HIV patients (62 clinics in 19 countries in EE, WE, SE, and LA) were enrolled. RESULTS Significant differences were observed between EE (N = 844), WE (N = 152), SE (N = 164), and LA (N = 253) in the proportion of patients with a definite TB diagnosis (47%, 71%, 72%, and 40%; p<0.0001), MDR-TB (40%, 5%, 3%, and 15%; p<0.0001), and use of combination antiretroviral therapy (cART) (17%, 40%, 44%, and 35%; p<0.0001). Injecting drug use (adjusted OR (aOR) = 2.03, 95% CI 1.00-4.09), prior anti-TB treatment (aOR = 3.42, 1.88-6.22), and living in EE (aOR = 7.19, 3.28-15.78) were associated with MDR-TB. Among 585 patients with drug susceptibility test (DST) results, the empiric (i.e. without knowledge of the DST results) anti-TB treatment included ≥3 active drugs in 66% of participants in EE, compared with 90-96% in the other regions (p<0.0001). CONCLUSIONS In EE, TB/HIV patients were less likely to receive a definite TB diagnosis, more likely to harbour MDR-TB, and commonly received empiric anti-TB treatment with reduced activity. Improved management of TB/HIV patients in EE requires better access to TB diagnostics, including DSTs, empiric anti-TB therapy directed at both susceptible and MDR-TB, and more widespread use of cART.
Abstract:
OBJECTIVE There is controversy regarding the significance of radiological consolidation in the context of COPD exacerbation (eCOPD). While some studies of eCOPD exclude these cases, consolidation is a common feature of eCOPD admissions in real practice. This study aims to address the question of whether consolidation in eCOPD is a distinct clinical phenotype with implications for management decisions and outcomes. PATIENTS AND METHODS The European COPD Audit was carried out in 384 hospitals from 13 European countries between 2010 and 2011 to analyze guideline adherence in eCOPD. In this analysis, admissions were split according to the presence or absence of consolidation on the admission chest radiograph. Groups were compared in terms of clinical and epidemiological features, existing treatment, clinical care utilized, and mortality. RESULTS 14,111 cases were included, comprising 2,714 (19.2%) with consolidation and 11,397 (80.8%) without. The risk of radiographic consolidation increased with age, female gender, cardiovascular diseases, two or more admissions in the previous year, and sputum color change; previous treatment with inhaled steroids was not associated with consolidation. Patients with radiographic consolidation were significantly more likely to receive antibiotics, oxygen, and non-invasive ventilation during the admission, and had lower survival from admission to 90-day follow-up. CONCLUSIONS Patients admitted for COPD exacerbation who have radiological consolidation have a more severe illness course, are treated more intensively by clinicians, and have a poorer prognosis. We recommend that these patients be considered a distinct subset in COPD exacerbation.