123 results for ASSESSING COMPETENCE
Abstract:
The potential ecological impact of ongoing climate change has been much discussed. High mountain ecosystems were identified early on as potentially very sensitive areas. Scenarios of upward species movement and vegetation shift are commonly discussed in the literature. Mountains being characteristically conical in shape, impact scenarios usually assume that a smaller surface area will be available as species move up. However, as the frequency distribution of additional physiographic factors (e.g., slope angle) changes with increasing elevation (e.g., with few gentle slopes available at higher elevation), species migrating upslope may encounter increasingly unsuitable conditions. As a result, many species could suffer a severe reduction of their habitat area, which could in turn affect patterns of biodiversity. In this paper, results from static plant distribution modeling are used to derive climate change impact scenarios in a high mountain environment. Models are fitted to species presence/absence data. The environmental predictors used are annual mean air temperature, slope, indices of topographic position, geology, rock cover, modeled permafrost and several indices of solar radiation and snow cover duration. Potential habitat distribution maps were drawn for 62 higher plant species, from which three separate climate change impact scenarios were derived. These scenarios show a great range of response, depending on the species and the degree of warming. Alpine species would be at greatest risk of local extinction, whereas species with a large elevation range would run the lowest risk. Limitations of the models and scenarios are further discussed.
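Below is a minimal sketch of the modeling step described above: a logistic regression fitted to species presence/absence, then re-projected under a uniform warming scenario. The predictors echo those named in the abstract, but the data, coefficients and +2 °C scenario are purely illustrative assumptions, not the study's actual model.

```python
# Sketch: fit a static species distribution model (logistic regression on
# presence/absence) and re-project habitat under a uniform warming scenario.
# Synthetic data stands in for real plot observations.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
temp = rng.uniform(-2.0, 10.0, n)       # annual mean air temperature (deg C)
slope = rng.uniform(0.0, 45.0, n)       # slope angle (degrees)
radiation = rng.uniform(0.5, 1.5, n)    # relative solar radiation index

# Hypothetical alpine species: prefers cold sites on gentle slopes.
logit = 2.0 - 0.8 * temp - 0.05 * slope + 0.5 * radiation
presence = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X = np.column_stack([temp, slope, radiation])
model = LogisticRegression().fit(X, presence)

# Impact scenario: +2 deg C everywhere, other predictors unchanged.
X_warm = X.copy()
X_warm[:, 0] += 2.0
p_now = model.predict_proba(X)[:, 1]
p_warm = model.predict_proba(X_warm)[:, 1]

# Potential habitat = cells where predicted suitability exceeds a threshold.
threshold = 0.5
print("habitat cells now:   ", int((p_now > threshold).sum()))
print("habitat cells +2degC:", int((p_warm > threshold).sum()))
```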
Abstract:
Purpose: To compare the performance of the Glaucoma Quality of Life-15 (GQL-15) questionnaire, intraocular pressure measurement (IOP; Goldmann tonometry) and a measure of visual field loss, the Moorfields Motion Displacement Test (MDT), in detecting glaucomatous eyes in a self-referred population. Methods: The GQL-15 has been suggested to correlate with visual disability and psychophysical measures of visual function in glaucoma patients. The Moorfields MDT is a multi-location perimetry test with 32 white line stimuli presented on a grey background on a standard laptop computer. Each stimulus is displaced between computer frames to give the illusion of "apparent motion". Participants (N=312, 90% older than 45 years; 20.5% with a family history of glaucoma) self-referred to an advertised World Glaucoma Day event (March 2009) at the Jules Gonin Eye Hospital, Lausanne, Switzerland. Participants underwent a clinical exam (IOP, slit lamp, angle and disc examination by a general ophthalmologist), 90% completed the GQL-15 questionnaire and over 50% completed the MDT test in both eyes. Those classified as abnormal on one or more of the following (IOP >21 mmHg / GQL-15 score >20 / MDT score >2 / clinical exam) underwent a follow-up clinical examination by a glaucoma specialist, including imaging and threshold perimetry. After the second examination, subjects were classified as "healthy" (H), "glaucoma suspect" (GS) (ocular hypertension and/or suspicious disc, angle closure with SD) or "glaucomatous" (G). Results: One hundred and ten subjects completed all 4 initial examinations; of these, 69 were referred for the 2nd examination and were classified as 8 G, 24 GS, and 37 H. The MDT detected 7/8 G and 7/24 GS, with a false referral rate of 3.8%. IOP detected 2/8 G and 8/24 GS, with a false referral rate of 8.9%. The GQL-15 detected 4/8 G and 16/24 GS, with a false referral rate of 42%. Conclusions: In this sample of participants attending a self-referral glaucoma detection event, the MDT performed significantly better than the GQL-15 and IOP in discriminating glaucomatous patients from healthy subjects. Further studies are required to assess the potential of the MDT as a glaucoma screening tool.
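As a worked check of the reported fractions, the detection rates among the confirmed glaucomatous (G) and glaucoma-suspect (GS) cases can be recomputed directly from the counts above (the denominators behind the false referral rates are not given in the abstract, so those are left as reported):

```python
# Worked check: detection rates among the 8 G and 24 GS cases, as reported.
detected = {"MDT": (7, 7), "IOP": (2, 8), "GQL-15": (4, 16)}
for test, (g, gs) in detected.items():
    print(f"{test}: {g}/8 G = {g / 8:.0%}, {gs}/24 GS = {gs / 24:.0%}")
```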
Abstract:
Because of the increase in workplace automation and the diversification of industrial processes, workplaces have become more and more complex. The classical approaches used to address workplace hazard concerns, such as checklists or sequence models, are therefore of limited use in such complex systems. Moreover, because of the multifaceted nature of workplaces, the use of single-oriented methods, such as AEA (man-oriented), FMEA (system-oriented), or HAZOP (process-oriented), is not satisfactory. The use of a dynamic modeling approach allowing multiple-oriented analyses may constitute an alternative to overcome this limitation. The qualitative modeling aspects of the MORM (man-machine occupational risk modeling) model are discussed in this article. The model, realized with an object-oriented Petri net tool (CO-OPN), has been developed to simulate and analyze industrial processes from an occupational health and safety (OH&S) perspective. The industrial process is modeled as a set of interconnected subnets (state spaces), which describe its constitutive machines. Process-related factors are introduced explicitly through machine interconnections and flow properties. Man-machine interactions are modeled as triggering events for the state spaces of the machines, and the CREAM cognitive behavior model is used to establish the relevant triggering events. In the CO-OPN formalism, the model is expressed as a set of interconnected CO-OPN objects defined over data types expressing the measure attached to the flow of entities transiting through the machines. Constraints on the measures assigned to these entities determine the state changes in each machine. Interconnecting machines implies the composition of such flows and consequently the interconnection of the measure constraints. This is reflected in the construction of constraint enrichment hierarchies, which can be used for simulation and analysis optimization in a clear mathematical framework. The use of Petri nets to perform multiple-oriented analysis opens perspectives in the field of industrial risk management. It may significantly reduce the duration of the assessment process. Most of all, it opens perspectives in the fields of risk comparison and integrated risk management. Moreover, because of the generic nature of the model and tool used, the same concepts and patterns may be used to model a wide range of systems and application fields.
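CO-OPN is a specialised formalism, but the core idea above, machines as interconnected state spaces whose changes are driven by triggering events, can be illustrated with an ordinary place/transition Petri net. The two-machine line below is a hypothetical example, not the MORM model itself:

```python
# Sketch: a minimal place/transition Petri net with two interconnected
# "machines", illustrating the modelling idea (not the CO-OPN/MORM model).
class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)   # place -> token count
        self.transitions = {}          # name -> (inputs, outputs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking[p] >= n for p, n in inputs.items())

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} not enabled")
        inputs, outputs = self.transitions[name]
        for p, n in inputs.items():
            self.marking[p] -= n
        for p, n in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + n

# Machine A consumes raw parts; its buffer feeds machine B (interconnection).
net = PetriNet({"raw": 3, "bufferAB": 0, "done": 0, "A_idle": 1, "B_idle": 1})
net.add_transition("A_process", {"raw": 1, "A_idle": 1},
                   {"bufferAB": 1, "A_idle": 1})
net.add_transition("B_process", {"bufferAB": 1, "B_idle": 1},
                   {"done": 1, "B_idle": 1})

# A man-machine "triggering event" starts machine A; state changes propagate
# downstream through the interconnection.
net.fire("A_process")
net.fire("B_process")
print(net.marking)   # {'raw': 2, 'bufferAB': 0, 'done': 1, ...}
```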
Abstract:
This case-control study assessed whether the trabecular bone score (TBS), determined from gray-level analysis of DXA images, might be of any diagnostic value, either alone or combined with bone mineral density (BMD), in the assessment of vertebral fracture risk among postmenopausal women with osteopenia. Of 243 postmenopausal Caucasian women, 50-80 years old, with BMD T-scores between -1.0 and -2.5, we identified 81 with osteoporosis-related vertebral fractures and compared them with 162 age-matched controls without fractures. Primary outcomes were BMD and TBS. Each incremental decrease in BMD was associated with an odds ratio (OR) of 1.54 (95% CI = 1.17-2.03), with an area under the ROC curve (AUC) of 0.614 (0.550-0.676). For TBS, the corresponding values were 2.53 (1.82-3.53) and 0.721 (0.660-0.777). The difference in the AUC for TBS vs. BMD was statistically significant (p = 0.020). The OR for (TBS + BMD) was 2.54 (1.86-3.47) and the AUC 0.732 (0.672-0.787). In conclusion, the TBS warrants further investigation of its clinical usefulness in determining fracture risk in postmenopausal osteopenic women.
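A sketch of how such case-control statistics are typically obtained: a logistic regression yields the odds ratio per unit decrease in the predictor, and its predicted probabilities yield the AUC. The data below are synthetic, and treating the increment as one standard deviation is an assumption (the abstract does not define "incremental"):

```python
# Sketch: odds ratio per one-SD decrease and ROC AUC for a single predictor
# in a case-control setting; synthetic data, illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 243
fracture = rng.random(n) < 1 / 3                  # roughly 81 cases, 162 controls
tbs = rng.normal(1.25, 0.1, n) - 0.08 * fracture  # lower TBS in fracture cases

# Standardise and flip the sign so the OR refers to a one-SD *decrease*.
x = -(tbs - tbs.mean()) / tbs.std()
model = LogisticRegression().fit(x.reshape(-1, 1), fracture)
odds_ratio = float(np.exp(model.coef_[0, 0]))

auc = roc_auc_score(fracture, model.predict_proba(x.reshape(-1, 1))[:, 1])
print(f"OR per SD decrease: {odds_ratio:.2f}, AUC: {auc:.3f}")
```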
Abstract:
About 15 years ago, the Swiss Society of Pathology developed and implemented a board examination in anatomical pathology. We describe herein the contents covered by this 2-day exam (autopsy pathology, cytology, histopathology, molecular pathology, and basic knowledge about mechanisms of disease) and its exact modalities, sketch a brief history of the exam, and finish with a concise discussion of its possible objectives and putative benefits, weighed against the hardship it imposes on candidates.
Abstract:
QUESTIONS UNDER STUDY: The starting point of the interdisciplinary project "Assessing the impact of diagnosis related groups (DRGs) on patient care and professional practice" (IDoC) was the lack of a systematic ethical assessment of the introduction of cost containment measures in healthcare. Our aim was to contribute to the methodological and empirical basis of such an assessment. METHODS: Five sub-groups conducted separate but related research within the fields of biomedical ethics, law, nursing sciences and health services, applying a number of complementary methodological approaches. The individual research projects were framed within an overall ethical matrix. Workshops and bilateral meetings were held to identify and elaborate joint research themes. RESULTS: Four common, ethically relevant themes emerged from the results of the studies across sub-groups: (1) the quality and safety of patient care, (2) the state of professional practice of physicians and nurses, (3) changes in incentive structures, and (4) vulnerable groups and access to healthcare services. Furthermore, much-needed data for future comparative research have been collected, and some early insights into the potential impact of DRGs are outlined. CONCLUSIONS: Based on the joint results, we developed preliminary recommendations relating to conceptual analysis, methodological refinement, monitoring and implementation.
Abstract:
Purpose: In vitro studies in porcine eyes have demonstrated a good correlation between induced intraocular pressure variations and corneal curvature changes, using a contact lens with an embedded microfabricated strain gauge. Continuous 24-hour intraocular pressure (IOP) monitoring to detect large diurnal fluctuations is currently an unmet clinical need. The aim of this study was to evaluate the precision of signal transmission and the biocompatibility of 24-hour contact lens sensor wear (SENSIMED Triggerfish®) in humans. Methods: After a full eye examination in 10 healthy volunteers, an 8.7 mm radius contact lens sensor and an orbital bandage containing a loop antenna were applied and connected to a portable recorder. Best corrected visual acuity and the position, lubrication status and mobility of the sensor were assessed after 5 and 30 minutes and 4, 7 and 24 hours. Subjective comfort was scored and activities were documented in a logbook. After sensor removal, the full eye examination was repeated and the recorded signal was studied. Results: The comfort score was high and did not fluctuate significantly, except at the 7-hour visit. The mobility of the contact lens was minimal and its lubrication remained good. Best corrected visual acuity was significantly reduced during sensor wear and immediately after its removal. Three participants developed mild corneal staining. In all but one participant we obtained an IOP recording with a visible ocular pulse amplitude. Conclusions: This 24-hour trial confirmed the functionality and biocompatibility of the SENSIMED Triggerfish® wireless contact lens sensor for IOP-fluctuation monitoring in volunteers. Further studies with a range of contact lens sensor radii are indicated.
Abstract:
Objectives: The purpose of this study was to assess short- and long-term changes in knowledge, attitudes, and skills among medical residents following a short course on cultural competency, and to explore their perspectives on the experience. Methods: Eighteen medical residents went through a short training programme comprising two seminars, lasting 30 and 60 minutes respectively, over two days. Three months later, we conducted three focus groups with 17 residents to explore their thoughts, perspectives and feedback about the course. To measure changes over time, we carried out a quantitative sequential survey before the seminars, three days after, and three months later using the Multicultural Assessment Questionnaire (MAQ). Results: Residents expressed a wide variety of perspectives on the main themes related to the content of the training - culture, trialogue, stereotypes, status, epidemiology, history and geopolitics - and to its organization - relevance, volume, timing, target audience, training tools, and working material. Using the MAQ, we observed a higher global performance score (n=16) at three days (median=38) compared with results before the training (median=33), a median difference of 5.5 points (z=2.4, p=0.015). This difference was still present at three months (∆=4.5, z=2.4, p=0.018), mainly due to knowledge acquisition (∆=3) rather than attitudes (∆=0) or skills (∆=1). Conclusions: Cross-cultural competence training not only brings awareness of multicultural issues but also helps participants understand their own cultures, their perception of others and their preconceived ideas. Physicians' education should, however, also focus on improving the implementation of acquired knowledge in cross-cultural competence.
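The reported paired median differences with z and p values are consistent with a Wilcoxon signed-rank test on pre/post scores; a minimal sketch with hypothetical MAQ data follows:

```python
# Sketch: paired pre/post comparison of MAQ global scores via a Wilcoxon
# signed-rank test (the abstract's z/p values suggest such a paired test;
# the scores below are hypothetical).
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(2)
pre = rng.integers(25, 40, size=16)        # MAQ global score before training
post = pre + rng.integers(0, 10, size=16)  # score three days after

stat, p = wilcoxon(pre, post)
print(f"median before: {np.median(pre)}, after: {np.median(post)}")
print(f"Wilcoxon W={stat}, p={p:.3f}")
```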
Abstract:
The general public seems to be convinced that juvenile delinquency has increased massively over the last decades. This assumption is much less popular among academics, however, and in some media doubts about the reality of this trend are often expressed. In the present paper, trends are followed using conviction statistics over 50 years, police and victimization data since the 1980s, and self-report data collected since 1992. All sources consistently point to a massive increase in offending among juveniles, particularly for violent offences during the 1990s. Given that trends were similar in most European countries, explanations should be sought at the European rather than the national level. The available evidence points to possible effects of increased opportunities for property offences since 1950 and, although causality remains hard to prove, of increased exposure to extreme media violence since 1985.
Abstract:
BACKGROUND: It is well established that high adherence to highly active antiretroviral treatment (HAART) among HIV-infected patients is a major determinant of virological and immunologic success. Furthermore, psychosocial research has identified a wide range of adherence factors, including patients' subjective beliefs about the effectiveness of HAART. Current statistical approaches, mainly based on the separate identification either of factors associated with treatment effectiveness or of those associated with adherence, fail to properly explore the true relationship between adherence and treatment effectiveness. Adherence behavior may be influenced not only by perceived benefits - which are usually the focus of related studies - but also by objective treatment benefits reflected in biological outcomes. METHODS: Our objective was to assess the bidirectional relationship between adherence and response to treatment among patients enrolled in the ANRS CO8 APROCO-COPILOTE study. We compared a conventional statistical approach, based on separate estimation of an adherence equation and an effectiveness equation, with an econometric approach using a 2-equation simultaneous system based on the same 2 equations. RESULTS: Our results highlight a reciprocal relationship between adherence and treatment effectiveness. After controlling for endogeneity, adherence was positively associated with treatment effectiveness. Furthermore, CD4 count gain after baseline was found to have a significant positive effect on adherence at each observation period. This immunologic parameter was not significant when the adherence equation was estimated separately. In the 2-equation model, the covariances between the disturbances of the two equations were significant, confirming the statistical appropriateness of studying adherence and treatment effectiveness jointly. CONCLUSIONS: Our results suggest that positive biological results arising from high adherence levels in turn reinforce continued adherence. They strengthen the argument that patients who do not experience rapid improvement in their immunologic and clinical status after HAART initiation should be prioritized when developing adherence support interventions, and they invalidate the hypothesis that HAART leads to "false reassurance" among HIV-infected patients.
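The abstract does not spell out the estimator, but a standard way to fit one equation of such a simultaneous system while controlling for endogeneity is two-stage least squares (2SLS), with an instrument that shifts each endogenous variable. The sketch below uses synthetic data and invented instruments purely to illustrate the technique, not the study's actual specification:

```python
# Sketch: two-stage least squares (2SLS) for one equation of a simultaneous
# system in which adherence and treatment effectiveness are mutually
# endogenous. Synthetic data; instruments and coefficients are illustrative.
import numpy as np

rng = np.random.default_rng(3)
n = 1000
z_adh = rng.normal(size=n)   # instrument shifting adherence only
z_eff = rng.normal(size=n)   # instrument shifting effectiveness only
u, v = rng.normal(size=n), rng.normal(size=n)

# True simultaneous system, solved to its reduced form by substitution:
#   adherence     = 0.5*effectiveness + 1.0*z_adh + u
#   effectiveness = 0.4*adherence     + 1.0*z_eff + v
det = 1 - 0.5 * 0.4
adherence = (1.0 * z_adh + 0.5 * z_eff + u + 0.5 * v) / det
effectiveness = (0.4 * z_adh + 1.0 * z_eff + v + 0.4 * u) / det

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Stage 1: project the endogenous regressor on all instruments.
Z = np.column_stack([np.ones(n), z_adh, z_eff])
eff_hat = Z @ ols(Z, effectiveness)

# Stage 2: adherence equation with the fitted (now exogenous) regressor.
X2 = np.column_stack([np.ones(n), eff_hat, z_adh])
beta = ols(X2, adherence)
print(f"2SLS effect of effectiveness on adherence: {beta[1]:.2f} (true 0.5)")
```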
Abstract:
Background: Cardiovascular diseases (CVD), their well-established risk factors (CVRF) and mental disorders are common and co-occur more frequently than would be expected by chance. However, the pathogenic mechanisms and course determinants of both CVD and mental disorders have only been partially identified. Methods/Design: Comprehensive follow-up of CVRF and CVD, with a psychiatric exam, in all subjects who participated in the baseline cross-sectional CoLaus study (2003-2006) (n=6,738), which also included a comprehensive genetic assessment. The somatic investigation will include a shortened questionnaire on CVRF, CV events and new CVD since baseline, and measurements of the same clinical and biological variables as at baseline. In addition, pro-inflammatory markers, persistent pain and sleep patterns and disorders will be assessed. In the case of a new CV event, detailed information will be abstracted from medical records. Similarly, data on causes of death will be collected from the Swiss National Death Registry. The comprehensive psychiatric investigation of the CoLaus/PsyCoLaus study will use contemporary epidemiological methods, including semi-structured diagnostic interviews, experienced clinical interviewers, standardized diagnostic criteria covering both threshold disorders according to DSM-IV and sub-threshold syndromes, and supplementary information on risk and protective factors for disorders. In addition, screening for objective cognitive impairment will be performed in participants older than 65 years. Discussion: The combined CoLaus/PsyCoLaus sample provides a unique opportunity to obtain prospective data on the interplay between CVRF/CVD and mental disorders, overcoming limitations of previous research by bringing together a comprehensive investigation of both CVRF and mental disorders, a large number of biological variables, and a genome-wide genetic assessment in participants recruited from the general population.