110 results for Information Anxiety


Relevance: 20.00%

Abstract:

RATIONALE AND OBJECTIVE: The Information Assessment Method (IAM) allows health professionals to systematically document the relevance, cognitive impact, use and health outcomes of information objects delivered by or retrieved from electronic knowledge resources. The companion review paper (Part 1) critically examined the literature and proposed a 'Push-Pull-Acquisition-Cognition-Application' evaluation framework, which IAM operationalizes. The purpose of the present paper (Part 2) is to examine the content validity of the IAM cognitive checklist when linked to email alerts. METHODS: A qualitative component of a mixed-methods study was conducted with 46 doctors reading and rating research-based synopses sent by email. The unit of analysis was a doctor's explanation of the rating of one item for one synopsis. Interviews with participants provided 253 units, which were analysed to assess concordance with item definitions. RESULTS AND CONCLUSION: The content relevance of seven items was supported. Three items required revision, and the interviews suggested one new item. This study yielded the 2008 version of IAM.

Relevance: 20.00%

Abstract:

SYNTHESIS REPORT Introduction: In primary care, physical complaints are frequently associated with depressive, anxiety and somatoform disorders and can mask them. It is widely reported that these mental disorders tend to be underdiagnosed. Moreover, few primary care studies have examined the possible association between psychosocial stressors and depressive, anxiety and somatoform disorders. Objectives: To determine the prevalence of depressive, anxiety and somatoform disorders among patients presenting with a physical complaint in primary care, and to explore the possible association between these mental disorders and psychosocial stressors. Methods: We conducted a cross-sectional, multicentre study in twenty-one medical practices in Suisse Romande and at the Policlinique Médicale Universitaire de Lausanne. Subjects were selected at random among patients who had spontaneously reported at least one physical complaint and who consulted during a half-day of consultation included in the study. Included patients completed the self-administered Patient Health Questionnaire (PHQ) between November 2004 and July 2005. We used the validated French version of the PHQ, which allows the diagnosis of the main mental disorders according to DSM-IV criteria and the analysis of exposure to psychosocial stressors. Results: Nine hundred and seventeen patients presenting with at least one physical complaint were included. The rates of depressive, anxiety and somatoform disorders were 20.0% (95% confidence interval [CI] = 17.4%-22.7%), 15.5% (95% CI = 13.2%-18.0%) and 15.1% (95% CI = 12.8%-17.5%), respectively. Psychosocial stressors were significantly associated with mental disorders.
Patients with an accumulation of psychosocial stressors were more often depressed or anxious or showed somatoform disorders, with an increase by a factor of 2.2 (95% CI = 2.0-2.5) for each additional stressor. Conclusions: Although the relationship between psychosocial stressors and depressive disorder is well established, this study shows that a link exists between these stressors and depressive, anxiety and somatoform disorders. Investigating these mental disorders in patients presenting with a physical symptom in primary care is relevant. Further work is needed to investigate the potential benefit of integrated management of psychosocial stressors in reducing physical complaints and mental disorders among patients followed by primary care physicians.
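As a quick check of the interval arithmetic above, the reported 95% confidence intervals can be approximately reproduced from the prevalence (20.0%) and sample size (n = 917) given in the abstract. The sketch below uses a normal-approximation (Wald) interval, which is an assumption on our part; the paper may have used an exact method, which would explain the slight difference in the upper bound.

```python
import math

def wald_ci(p, n, z=1.96):
    """Normal-approximation 95% CI for a proportion p observed in n subjects."""
    se = math.sqrt(p * (1 - p) / n)  # standard error of the proportion
    return p - z * se, p + z * se

# Prevalence of depressive disorders: 20.0% of 917 patients
low, high = wald_ci(0.200, 917)
print(f"{low:.1%} - {high:.1%}")  # ~17.4% - 22.6%, close to the reported 17.4%-22.7%
```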


Relevance: 20.00%

Abstract:

The impact of Information Technology (IT) is today clear to company stakeholders. However, measuring the value generated by IT is a real challenge. Chief Information Officers (CIOs) explain the absence of solid IT business plans and of clear mid- and long-term visions by a lack of time and resources, but also by a lack of involvement of senior business management (e.g. the CEO and CFO). Unable to measure the economic value of IT, the CIO faces permanent cost pressure and cost reductions to justify IT spending and investments.
On the other hand, being able to measure the value of IT would help the CIO and senior business management assess the maturity and the contribution of the information system, and would therefore facilitate decision making. The objective of this thesis is to assess the alignment of IT with business strategy, the quality of performance measurement of the information system and, last but not least, the positioning of the IT organisation within the company. These three key elements of IT governance were assessed through two successive waves of surveys (2000/2001 for CIOs, 2002/2003 for CIOs and CEOs) conducted in French-speaking Europe (Suisse Romande, France, Belgium and Luxembourg).

Relevance: 20.00%

Abstract:

Background Multiple logistic regression is precluded from many practical applications in ecology that aim to predict the geographic distributions of species because it requires absence data, which are rarely available or are unreliable. In order to use multiple logistic regression, many studies have simulated "pseudo-absences" through a number of strategies, but it is unknown how the choice of strategy influences models and their geographic predictions of species. In this paper we evaluate the effect of several prevailing pseudo-absence strategies on the predictions of the geographic distribution of a virtual species whose "true" distribution and relationship to three environmental predictors were predefined. We evaluated the effect of using a) real absences, b) pseudo-absences selected randomly from the background, and c) two-step approaches: pseudo-absences selected from low-suitability areas predicted by either Ecological Niche Factor Analysis (ENFA) or BIOCLIM. We compared how the choice of pseudo-absence strategy affected model fit, predictive power, and information-theoretic model selection results. Results Models built with true absences had the best predictive power and the best discriminatory power, and the "true" model (the one that contained the correct predictors) was supported by the data according to AIC, as expected. Models based on random pseudo-absences had among the lowest fit, but yielded the second highest AUC value (0.97), and the "true" model was also supported by the data. Models based on two-step approaches had intermediate fit, the lowest predictive power, and the "true" model was not supported by the data. Conclusion If ecologists wish to build parsimonious GLM models that will allow them to make robust predictions, a reasonable approach is to use a large number of randomly selected pseudo-absences and perform model selection based on an information-theoretic approach. However, the resulting models can be expected to have limited fit.
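The random pseudo-absence strategy evaluated above can be sketched in a few lines. The following is a minimal illustration under simplifying assumptions, not the authors' code: it simulates a virtual species along a single environmental gradient (the study used three predictors), draws pseudo-absences uniformly at random from the background, and fits a logistic regression by gradient ascent; all parameter values and sample sizes here are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Virtual species: presence probability rises with an environmental gradient x.
background = rng.uniform(-3, 3, size=5000)            # candidate background sites
p_true = 1 / (1 + np.exp(-(2.0 * background - 1.0)))  # predefined "true" response
presences = background[rng.random(5000) < p_true]     # observed presences

# Strategy (b): pseudo-absences drawn at random from the background.
pseudo_abs = rng.choice(background, size=len(presences), replace=False)

x = np.concatenate([presences, pseudo_abs])
y = np.concatenate([np.ones(len(presences)), np.zeros(len(pseudo_abs))])

# Logistic regression (intercept + slope) fitted by simple gradient ascent.
X = np.column_stack([np.ones_like(x), x])
beta = np.zeros(2)
for _ in range(2000):
    p = 1 / (1 + np.exp(-X @ beta))
    beta += 0.01 * X.T @ (y - p) / len(y)

slope = beta[1]  # positive: the gradient's effect is recovered, though attenuated
```

Because random pseudo-absences inevitably include sites where the species is actually present, the fitted slope is biased toward zero relative to the true coefficient, which is consistent with the "limited fit" caveat in the conclusion.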

Relevance: 20.00%

Abstract:

Since 2008, the intelligence units of six cantons in the western part of Switzerland have shared a common database for the analysis of high-volume crime. On a daily basis, events reported to the police are analysed, filtered and classified to detect crime repetitions and interpret the crime environment. Several forensic outcomes are integrated into the system, such as matches of traces to persons and links between scenes detected by the comparison of forensic case data. Systematic procedures have been established to integrate links inferred mainly from DNA profiles, shoemark patterns and images. A statistical overview of a retrospective dataset of series from 2009 to 2011 indicates, for instance, the number of repetitions detected or confirmed and augmented by forensic case data. The time needed to obtain forensic intelligence, depending on the type of marks treated, is seen as a critical issue. Furthermore, the underlying process of integrating forensic intelligence into the crime intelligence database raised several difficulties regarding the acquisition of data and the models used in the forensic databases. The solutions found and the operational procedures adopted are described and discussed. This process forms the basis for further research aimed at developing forensic intelligence models.

Relevance: 20.00%

Abstract:

Time-lapse geophysical data acquired during transient hydrological experiments are being increasingly employed to estimate subsurface hydraulic properties at the field scale. In particular, crosshole ground-penetrating radar (GPR) data, collected while water infiltrates into the subsurface either by natural or artificial means, have been demonstrated in a number of studies to contain valuable information concerning the hydraulic properties of the unsaturated zone. Previous work in this domain has considered a variety of infiltration conditions and different amounts of time-lapse GPR data in the estimation procedure. However, the particular benefits and drawbacks of these different strategies as well as the impact of a variety of key and common assumptions remain unclear. Using a Bayesian Markov-chain-Monte-Carlo stochastic inversion methodology, we examine in this paper the information content of time-lapse zero-offset-profile (ZOP) GPR traveltime data, collected under three different infiltration conditions, for the estimation of van Genuchten-Mualem (VGM) parameters in a layered subsurface medium. Specifically, we systematically analyze synthetic and field GPR data acquired under natural loading and two rates of forced infiltration, and we consider the value of incorporating different amounts of time-lapse measurements into the estimation procedure. Our results confirm that, for all infiltration scenarios considered, the ZOP GPR traveltime data contain important information about subsurface hydraulic properties as a function of depth, with forced infiltration offering the greatest potential for VGM parameter refinement because of the higher stressing of the hydrological system. Considering greater amounts of time-lapse data in the inversion procedure is also found to help refine VGM parameter estimates. 
Quite importantly, however, inconsistencies observed in the field results point to the strong possibility that posterior uncertainties are being influenced by model structural errors, which in turn underlines the fundamental importance of a systematic analysis of such errors in future related studies.
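The Bayesian MCMC machinery referred to above can be illustrated with a toy Metropolis sampler. This is only a schematic stand-in for the actual stochastic inversion: instead of a GPR forward model parameterized by van Genuchten-Mualem properties, it inverts a single "slowness" parameter from noisy synthetic traveltimes, and the forward model, prior bounds, noise level and proposal scale are all invented for the illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy forward model: traveltime grows linearly with depth (slowness s).
depths = np.linspace(1.0, 10.0, 20)
s_true, sigma = 5.0, 0.5
t_obs = s_true * depths + rng.normal(0.0, sigma, depths.size)  # noisy synthetic data

def log_post(s):
    """Log posterior: Gaussian likelihood, flat prior on (0, 20)."""
    if not 0.0 < s < 20.0:
        return -np.inf
    resid = t_obs - s * depths
    return -0.5 * np.sum(resid**2) / sigma**2

# Random-walk Metropolis sampler.
chain, s = [], 10.0  # deliberately start far from the truth
for _ in range(5000):
    prop = s + rng.normal(0.0, 0.2)  # Gaussian proposal around current state
    if np.log(rng.random()) < log_post(prop) - log_post(s):
        s = prop                      # accept the move
    chain.append(s)

posterior = np.array(chain[1000:])    # discard burn-in
```

After burn-in, the chain samples a posterior concentrated near the true slowness; the posterior spread is what the actual study uses to quantify parameter uncertainty (and what model structural errors can distort).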