999 results for Project valuation
Abstract:
This article presents the case formulation model for the psychological assessment of children and adolescents. Case formulation developed in response to psychiatric diagnosis, which can be perceived as a reductionist approach to psychological functioning. In Europe, many clinical psychologists have adopted this model, initially developed by psychotherapists of cognitive-behavioural orientation, and have adapted it to their respective practices. The model, now well established in everyday clinical practice, consists in bringing together different theoretical perspectives, integrating the findings of scientific research, in order to foster the understanding of a clinical case. It belongs both to an evidence-based approach and to a reflective clinical practice. The aim of case formulation is to generate hypotheses and to propose appropriate treatments. The family is nonetheless included in the discussion and decision-making process, which strengthens the therapeutic alliance. This article presents the three theoretical orientations most often used in case formulation: the cognitive-behavioural, psychodynamic and systemic approaches. A single clinical vignette is used to show the relevance of the model and the complementarity of the different clinical approaches.
Abstract:
Background: Preventable mortality is a good indicator of possible problems to be investigated in the primary prevention chain, making it also a useful tool for evaluating health policies, particularly public health policies. This study describes inequalities in preventable avoidable mortality in relation to socioeconomic status in small urban areas of thirty-three Spanish cities, and analyses their evolution over the periods 1996–2001 and 2002–2007. Methods: The units of analysis were census tracts, and all deaths occurring in the population residing in these cities from 1996 to 2007 were taken into account. The causes included in the study were lung cancer, cirrhosis, AIDS/HIV, motor vehicle traffic injuries, suicide and homicide. The census tracts were classified into three groups according to their socioeconomic level. Poisson regression models were used to analyse inequalities in mortality risk between the highest and lowest socioeconomic levels and across periods, for each city and stratified by sex. Results: Preventable avoidable mortality made a significant contribution to overall mortality (around 7.5%, higher among men) and decreased over time in men (12.7% in 1996–2001 and 10.9% in 2002–2007), though not as clearly among women (3.3% in 1996–2001 and 2.9% in 2002–2007). Among men, the risk of death was higher in areas of greater deprivation, and these excesses did not change over time. The pattern among women was different: in many cities, differences in mortality risk by socioeconomic level could not be established. Conclusions: Preventable mortality decreased between the 1996–2001 and 2002–2007 periods, more markedly in men than in women. There were socioeconomic inequalities in mortality in most of the cities analysed, with a higher risk of death associated with higher levels of deprivation, and these inequalities persisted over the two periods analysed. This study makes it possible to identify areas where excess preventable mortality was associated with more deprived zones. It is in these deprived zones that actions to reduce and monitor health inequalities should be put in place. Primary healthcare may play an important role in this process. Keywords: Preventable avoidable mortality, Causes of death, Inequalities in health, Small area analysis
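To make the modelling step concrete, here is a minimal sketch (Python with statsmodels) of the kind of Poisson regression the Methods describe: tract-level death counts with a population offset, a deprivation tertile and a period term, fitted separately by city and sex. The column names and counts below are invented for illustration and are not data from the study.

    # Minimal sketch: relative risks of preventable mortality by deprivation
    # tertile from a Poisson model with a population offset. Toy data only.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    df = pd.DataFrame({
        "city":        ["CityA"] * 12,
        "sex":         ["men"] * 6 + ["women"] * 6,
        "ses_tertile": [1, 2, 3, 1, 2, 3] * 2,
        "period":      (["1996-2001"] * 3 + ["2002-2007"] * 3) * 2,
        "deaths":      [120, 150, 190, 100, 130, 170, 40, 45, 55, 35, 40, 50],
        "population":  [200_000] * 12,
    })

    # One model per city and sex, as in the study design.
    for (city, sex), sub in df.groupby(["city", "sex"]):
        model = smf.glm(
            "deaths ~ C(ses_tertile, Treatment(reference=1)) + C(period)",
            data=sub,
            family=sm.families.Poisson(),
            offset=np.log(sub["population"]),   # person-time denominator
        ).fit()
        rr = np.exp(model.params)               # rate ratios vs. least deprived tertile
        ci = np.exp(model.conf_int())           # 95% CIs on the rate-ratio scale
        print(city, sex)
        print(pd.DataFrame({"RR": rr, "2.5%": ci[0], "97.5%": ci[1]}).round(2))

With real data, the exponentiated tertile coefficients give the excess risk of the most deprived areas relative to the least deprived, which is the contrast reported in the Results.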
Abstract:
In recent years, some epidemiologic studies have attributed adverse health effects of air pollutants not only to particles and sulfur dioxide but also to photochemical air pollutants (nitrogen dioxide and ozone). The effects are usually small, leading to some inconsistencies in the results of the studies. Furthermore, the different methodological approaches used across studies have made it difficult to derive generic conclusions. We provide here a quantitative summary of the short-term effects of photochemical air pollutants on mortality in seven Spanish cities involved in the EMECAM project, using generalized additive models in single-pollutant and multipollutant analyses. Nitrogen dioxide and ozone data were provided by seven EMECAM cities (Barcelona, Gijón, Huelva, Madrid, Oviedo, Seville, and Valencia). Mortality indicators included daily total mortality from all causes excluding external causes, daily cardiovascular mortality, and daily respiratory mortality. Individual estimates, obtained from city-specific generalized additive Poisson autoregressive models, were combined by means of fixed effects models and, if significant heterogeneity among local estimates was found, also by random effects models. Significant positive associations were found between daily mortality (all causes and cardiovascular) and NO₂ once the other air pollutants were taken into account. A 10 µg/m³ increase in the 24-hr average 1-day NO₂ level was associated with a 0.43% increase in the daily number of deaths from all causes excluding external causes [95% confidence interval (CI), -0.003% to 0.86%]. Where relationships were significant, relative risks for cause-specific mortality were nearly twice those for total mortality for all the photochemical pollutants. Ozone was independently related only to daily cardiovascular mortality. No independent statistically significant relationship between photochemical air pollutants and respiratory mortality was found. The results of this study suggest that, given the present levels of photochemical pollutants, people living in Spanish cities are exposed to health risks derived from air pollution.
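As a rough illustration of how such effect estimates are expressed and combined (not the EMECAM computation itself), the sketch below converts a log-linear Poisson coefficient into the percentage change in daily deaths per 10 µg/m³ of NO₂, and pools hypothetical city-specific coefficients with an inverse-variance fixed-effects model; all numbers are invented.

    import numpy as np

    def pct_change_per_10(beta):
        """Percent change in expected daily deaths per 10 ug/m3 (log-link Poisson)."""
        return (np.exp(10.0 * beta) - 1.0) * 100.0

    def fixed_effects_pool(betas, ses):
        """Inverse-variance weighted pooled coefficient and its standard error."""
        w = 1.0 / np.asarray(ses) ** 2
        pooled = np.sum(w * np.asarray(betas)) / np.sum(w)
        return pooled, np.sqrt(1.0 / np.sum(w))

    betas = [0.00042, 0.00051, 0.00038]   # illustrative city-level coefficients
    ses   = [0.00030, 0.00040, 0.00035]   # and their standard errors
    b, se = fixed_effects_pool(betas, ses)
    low, high = b - 1.96 * se, b + 1.96 * se
    print(f"pooled effect: {pct_change_per_10(b):.2f}% per 10 ug/m3 of NO2")
    print(f"95% CI: {pct_change_per_10(low):.2f}% to {pct_change_per_10(high):.2f}%")

A random-effects pooling, used when heterogeneity between cities is significant, would additionally estimate a between-city variance and add it to each weight's denominator.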
Abstract:
The inaugural Barcelona Conference of November 1995 marked the beginning of a long process of rapprochement and solidarity among 27 partners (35 countries since 1 May 2004, and 37 in the medium term). This initiative is meant to be permanent and to evolve in institutional terms. Owing to its strategic dimension, the Barcelona Process (hereinafter the Process) is the most important and most concrete instrument for dialogue and cooperation between the European Union (EU), its Member States and the Mediterranean partners. To be effective, and not merely rhetorical or virtual, the Euro-Mediterranean Partnership (hereinafter the Partnership) must be built on universal values capable of guaranteeing a minimum of coherence and credibility to a project that is extremely complex, fragile and, by its very nature, constantly threatened with paralysis. Indeed, it is not always easy for centripetal actions to prevail over the centrifugal temptations and tendencies that characterise the region. The exceptional changes and events of recent years, both internationally and within the Union, have made it necessary to deepen and institutionally strengthen Euro-Mediterranean relations. The Process urgently needs to be consolidated if it is to be understood and accepted by a public opinion that is increasingly sceptical and unsettled by international current events. The recent creation of the Euro-Mediterranean Parliamentary Assembly (APEM), which will have three standing committees, and the forthcoming establishment in Alexandria of the Euromed Foundation for the dialogue between cultures and civilisations, are logical and encouraging responses to this more or less widespread state of mind.
Abstract:
Interorganizational cooperation, through joint efforts with various actors, allows high-tech companies to complement their resources, especially in R&D projects. Collaborative projects have been identified in many studies as an important strategy for producing complex products and services in uncertain and competitive environments. This research therefore aims to deepen the understanding of how the development dynamics of a collaborative R&D project unfold in a high-technology industry. To achieve this objective, the R&D project for the first microcontroller in the Brazilian semiconductor industry was chosen as the object of analysis. This empirical choice is justified by the uniqueness of the case, as well as by a diversity of actors and a level of resource complementarity that were significant to the project's success. Motivated by the questions of who the actors were and which forms of interorganizational coordination were used in this project, interviews were carried out, a questionnaire was administered, and documents related to the project were analysed. The results show a network of nine actors and their roles in the interorganizational collaboration process, as well as the forms of social and temporal overlapping used to coordinate the collective efforts. Focusing on the mechanisms of temporal and social integration highlighted throughout the study, the paper proposes including R&D projects in the typology of interorganizational projects proposed by Jones and Lichtenstein (2008).
Abstract:
Computed tomography (CT) is an imaging technique whose use has grown steadily since its introduction in the early 1970s. In medicine it has become so indispensable that the modality could end up a victim of its own success if its contribution to population exposure is not given particular attention. The increase in the number of CT examinations has of course improved patient management and made some procedures less invasive. However, to ensure that the risk-benefit trade-off always remains in the patient's favour, doses that do not contribute to the diagnosis must be avoided.
While this is important in adults, it must be a priority when examinations are performed in children, in particular when following pathologies that require several CT examinations over the patient's lifetime. Children and young adults are more radiosensitive and, since their life expectancy exceeds that of adults, they are at greater risk of developing a radiation-induced cancer, whose latency period can exceed twenty years. Assuming that every radiological examination is justified, acquisition protocols must therefore be optimised to ensure that the patient is not irradiated unnecessarily. CT technology is advancing rapidly, and since 2009 new image reconstruction techniques, known as iterative reconstructions, have been introduced to reduce dose and improve image quality.
The objective of the present work was to determine the potential of statistical iterative reconstructions to minimise the doses delivered during CT examinations of children and young adults while preserving an image quality that allows diagnosis, in order to propose optimised protocols.
Optimising a CT protocol requires the ability to assess the delivered dose and the image quality needed for diagnosis. While dose is estimated using CT indices (CTDIvol and DLP), a particularity of this work is the use of two radically different approaches to assess image quality. The first, "physical" approach is based on physical metrics (SD, MTF, NPS, etc.) measured under well-defined conditions, most often on phantoms. Although this approach is limited because it does not incorporate the radiologists' perception, it allows certain image properties to be characterised quickly and simply. The second, "clinical" approach is based on the evaluation of anatomical structures (diagnostic criteria) visible on patient images. Radiologists involved in the evaluation step rate the diagnostic quality of these structures on a simple scoring scale. This approach is demanding to set up, but it has the advantage of being close to the radiologist's work and can be considered the reference method.
Among the main results of this work, the statistical iterative algorithms studied in clinical practice (ASIR, VEO) were shown to have a large potential for reducing CT dose (up to -90%). However, by the way they operate, they alter the appearance of the image, producing a change in texture that could affect diagnostic quality. By comparing the results of the "clinical" and "physical" approaches, it was shown that this change in texture corresponds to a modification of the noise frequency spectrum, whose analysis makes it possible to anticipate or avoid a loss of diagnostic information. This work also shows that these new reconstruction techniques cannot be introduced into clinical practice simply on the basis of protocols designed for conventional reconstructions. The conclusions of this work and the tools developed can also guide future studies in the field of image quality, for example texture analysis or model observers for CT.
-
Computed tomography (CT) is an imaging technique in which interest has been growing since it first began to be used in the early 1970s. In the clinical environment, this imaging system has emerged as the gold standard modality because of its high sensitivity in producing accurate diagnostic images. However, even if a direct benefit to patient healthcare is attributed to CT, the dramatic increase in the number of CT examinations performed has raised concerns about the potential negative effects of ionizing radiation on the population. To ensure a benefit-risk balance that works in favor of the patient, it is important to balance image quality and dose in order to avoid unnecessary patient exposure.
If this balance is important for adults, it should be an absolute priority for children undergoing CT examinations, especially for patients suffering from diseases requiring several follow-up examinations over the patient's lifetime. Indeed, children and young adults are more sensitive to ionizing radiation and have a longer remaining life span than adults. For this population, the risk of developing cancer, whose latency period exceeds 20 years, is significantly higher than for adults. Assuming that each patient examination is justified, it then becomes a priority to optimize CT acquisition protocols in order to minimize the dose delivered to the patient. Over the past few years, CT technology has advanced at a rapid pace. Since 2009, new iterative image reconstruction techniques, called statistical iterative reconstructions, have been introduced in order to decrease patient exposure and improve image quality.
The goal of the present work was to determine the potential of statistical iterative reconstructions to reduce dose as much as possible without compromising the image quality and diagnostic value of examinations of children and young adults.
The optimization step requires evaluating the delivered dose and the image quality useful for diagnosis. While dose is estimated using CT indices (CTDIvol and DLP), the particularity of this research was to use two radically different approaches to evaluate image quality. The first approach, called the "physical approach", computed physical metrics (SD, MTF, NPS, etc.) measured on phantoms under well-known conditions. Although this technique has some limitations because it does not take the radiologist's perspective into account, it enables the physical characterization of image properties in a simple and timely way. The second approach, called the "clinical approach", was based on the evaluation of anatomical structures (diagnostic criteria) present on patient images. Radiologists involved in the assessment step were asked to score the image quality of these structures for diagnostic purposes using a simple rating scale. This approach is relatively complicated to implement and also time-consuming. Nevertheless, it has the advantage of being very close to the practice of radiologists and is considered the reference method.
Primarily, this work revealed that the statistical iterative reconstructions studied in the clinic (ASIR and VEO) have a strong potential to reduce CT dose (up to -90%). However, by their mechanisms, they lead to a modification of the image appearance, with a change in image texture which may then affect the quality of the diagnosis. By comparing the results of the "clinical" and "physical" approaches, it was shown that a change in texture is related to a modification of the noise spectrum bandwidth, and the NPS analysis makes it possible to anticipate or avoid a decrease in image quality. This project demonstrated that integrating these new statistical iterative reconstruction techniques can be complex and cannot be done on the basis of protocols using conventional reconstructions. The conclusions of this work and the image quality tools developed will be able to guide future studies in the field of image quality, such as texture analysis or model observers dedicated to CT.
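As a hedged illustration of the "physical" approach mentioned above, the sketch below estimates a 2D noise power spectrum (NPS) from noise-only regions of interest, the quantity whose bandwidth change is used to characterise the texture shift introduced by iterative reconstruction. The synthetic white-noise ROIs and the 0.5 mm pixel size are placeholders, not measurements from this work.

    import numpy as np

    def noise_power_spectrum(rois, pixel_size_mm):
        """2D NPS from a stack of noise-only ROIs (n_rois, ny, nx), in HU^2 mm^2."""
        rois = np.asarray(rois, dtype=float)
        n_rois, ny, nx = rois.shape
        detrended = rois - rois.mean(axis=(1, 2), keepdims=True)  # remove each ROI mean
        spectra = np.abs(np.fft.fft2(detrended)) ** 2             # |DFT|^2 per ROI
        return spectra.mean(axis=0) * pixel_size_mm ** 2 / (nx * ny)

    # Synthetic example: white noise with sigma = 10 HU in 32 ROIs of 64 x 64 pixels.
    rng = np.random.default_rng(0)
    rois = rng.normal(0.0, 10.0, size=(32, 64, 64))
    nps = noise_power_spectrum(rois, pixel_size_mm=0.5)
    # Integrating the NPS over frequency recovers the noise variance (about 100 HU^2).
    print(nps.sum() / (0.5 ** 2 * 64 * 64))

For reconstructed CT images the spectrum is not flat; comparing its shape between conventional and iterative reconstructions is one way to quantify the texture change discussed above.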
Abstract:
Geography as a school subject is conceived specifically for and by schools. The contents of the school subject today do not reflect the concerns and the evolution of the academic discipline as such. Nevertheless, official curricula set school objectives that address issues affecting the world and people's lives. These issues are consistent with those addressed by geography as a social science, that is to say the study of how people and their environment interact and how societies are interconnected through space. In everyday practice, however, geography as a school subject is most often reduced to accumulating knowledge outside any given context. This knowledge may even be partially inaccurate or outdated, and the related activities involve little cognitive demand. These practices do not contribute to learners' understanding of the world because they do not allow them to build a geographical competence, which they will need as future citizens in order to make responsible choices when confronted with questions about how the locations of human and physical features influence one another and interact across space. The central part of the text relies on the ideas and processes discussed in the publications that constitute the published file; it is divided into two parts. The first part (chapter 4) presents a didactic approach that gives meaningful insights into geography as a school subject and gives a brief account of the theoretical background that supports it. This socio-constructivist approach relies on the following main features: a priming stage (élément déclencheur), which presents geographical knowledge as an issue to be explored, discussed or solved and which is handed over to the learners; the planning of the teaching-learning sequence in small units launched by the main issue of the priming stage; the interconnection of geographical knowledge with integrative concepts; and a synthesis or reporting stage where the final concepts and knowledge are brought together in order to be learned. Such an approach allows learners to re-invest the knowledge they have built themselves. This knowledge is organised by geographical integrative concepts, which are genuine operative thinking tools associated with the key issues of geographical thinking. The second part of the text (chapter 5) presents the didactic principles that governed the design of the new initial training course for future upper secondary school teachers at the HEP Vaud. The ambition of this course is to prepare future teachers to plan and deliver geography teaching that provides pupils with the tools to better understand how people and their environment interact and how societies are interconnected through space. One of the tools offered to teachers is the conceptual framework, whose most salient interest is that it remains relevant at every stage of the preparation and planning of teaching, including the epistemological reflection that should always be present. The synthesis of the text starts with a short account of the first evaluation of the new course and concludes with reflections on the future concerns and issues that the didactics and methodology of geography will be confronted with.
Abstract:
To estimate the cost of damage to health, several methods are generally used jointly, without questioning the overall coherence of the valuation model. Adding estimates obtained with the human capital method to the results of a contingent valuation is one example, yet this procedure has no theoretical foundation. In light of this, the article proposes a model for valuing the cost of morbidity and mortality based on welfare economics. Fatal injuries caused by road accidents serve to illustrate these proposals, and the illustration is made concrete for the Swiss context with a numerical example.
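For readers unfamiliar with the human capital method referred to above, the following sketch shows its core calculation, the discounted present value of production lost through a premature death. The earnings, growth and discount figures are purely illustrative and are not the Swiss values used in the article.

    def human_capital_cost(annual_earnings, years_lost, discount_rate, growth=0.0):
        """Present value of lost production over the remaining working years."""
        return sum(
            annual_earnings * (1 + growth) ** t / (1 + discount_rate) ** t
            for t in range(1, years_lost + 1)
        )

    # Illustrative only: 30 working years lost, 1% real wage growth, 3% discount rate.
    print(round(human_capital_cost(80_000, years_lost=30, discount_rate=0.03, growth=0.01)))

A contingent valuation, by contrast, elicits willingness to pay directly; the article's point is that simply adding the two figures lacks a welfare-economic foundation.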
Abstract:
Genetic Epidemiology of Metabolic Syndrome is a multinational, family-based study to explore the genetic basis of the metabolic syndrome. Atherogenic dyslipidemia, defined as low plasma high-density lipoprotein cholesterol with elevated triglycerides (<25th and >75th percentile, respectively, for age, gender, and country), identified affected subjects for the metabolic syndrome. This report examines the frequency at which atherogenic dyslipidemia predicts the metabolic syndrome as defined by the National Cholesterol Education Program Adult Treatment Panel III (ATP-III). One thousand four hundred thirty-six patients (854 men/582 women) affected by our criteria were compared with 1,672 unaffected persons (737 men/935 women). Affected patients had more hypertension, obesity, and hyperglycemia, and they met a higher number of ATP-III criteria (mean ± SD: 3.2 ± 1.1 vs 1.3 ± 1.1, p <0.001). Overall, 76% of affected persons also qualified for the ATP-III definition (Cohen's kappa 0.61, 95% confidence interval 0.59 to 0.64), similar to a separate group of 464 sporadic, unrelated cases (75%). Concordance increased from 41% to 82% and 88% for ages ≤35, 36 to 55, and ≥55 years, respectively. Affected status was also independently associated with waist circumference (p <0.001) and fasting glucose (p <0.001) but not systolic blood pressure (p = 0.43). Thus, the lipid-based criteria used to define affection status in this study substantially parallel the ATP-III definition of the metabolic syndrome in subjects aged >35 years. In subjects aged <35 years, atherogenic dyslipidemia frequently occurs in the absence of other metabolic syndrome risk factors.
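The agreement statistic quoted above (Cohen's kappa 0.61) compares two binary case definitions applied to the same subjects, atherogenic dyslipidemia versus ATP-III status. The sketch below shows the computation on toy labels; the data are invented and unrelated to the study.

    import numpy as np

    def cohens_kappa(a, b):
        """Cohen's kappa for two binary (0/1) classifications of the same subjects."""
        a, b = np.asarray(a), np.asarray(b)
        p_obs = np.mean(a == b)                                        # observed agreement
        p_exp = a.mean() * b.mean() + (1 - a.mean()) * (1 - b.mean())  # chance agreement
        return (p_obs - p_exp) / (1 - p_exp)

    dyslipidemia = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0]    # affected by the lipid criteria
    atp_iii      = [1, 1, 0, 1, 1, 0, 1, 0, 0, 0]    # meets the ATP-III definition
    print(round(cohens_kappa(dyslipidemia, atp_iii), 2))   # prints 0.6 on these toy labels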