872 results for effective knowledge integration


Relevance:

30.00%

Publisher:

Abstract:

New healthcare technologies and the knowledge surrounding them develop at a very fast pace. Implementing high-quality, evidence-based knowledge is therefore essential to ensure an effective healthcare system and patient safety. However, even though only a small fraction of the approximately 2,500 scientific publications indexed daily in Medline is actually useful to clinical practice, the amount of new information is far too large for busy healthcare professionals to stay aware of potentially important evidence-based findings.

Relevance:

30.00%

Publisher:

Abstract:

Background: To enhance our understanding of complex biological systems such as diseases, we need to put all of the available data into context and use it to detect the relations, patterns and rules that allow predictive hypotheses to be defined. Life science has become a data-rich science, with information about the behaviour of millions of entities such as genes, chemical compounds, diseases, cell types and organs, organised in many different databases and/or spread throughout the literature. Existing knowledge, such as genotype-phenotype relations or signal transduction pathways, must be semantically integrated and dynamically organised into structured networks that are connected with clinical and experimental data. Different approaches to this challenge exist, but so far none has proven entirely satisfactory. Results: To address this challenge we previously developed a generic knowledge management framework, BioXM™, which allows the dynamic, graphical generation of domain-specific knowledge representation models based on specific objects and their relations, supporting annotations and ontologies. Here we demonstrate the utility of BioXM for knowledge management in systems biology as part of the EU FP6 BioBridge project on translational approaches to chronic diseases. From clinical and experimental data, text-mining results and public databases we generate a chronic obstructive pulmonary disease (COPD) knowledge base and demonstrate its use by mining specific molecular networks together with integrated clinical and experimental data. Conclusions: We generate the first semantically integrated, COPD-specific public knowledge base and find that, for the integration of clinical and experimental data with pre-existing knowledge, the configuration-based setup enabled by BioXM reduced the implementation time and effort for the knowledge base compared with similar systems implemented as classical software development projects. The knowledge base enables the retrieval of sub-networks including protein-protein interaction, pathway, gene-disease and gene-compound data, which are used for subsequent data analysis, modelling and simulation. Pre-structured queries and reports enhance usability; establishing their use in everyday clinical settings requires further simplification with a browser-based interface, which is currently under development.
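To make the idea of mining typed sub-networks concrete, the following minimal sketch (not BioXM itself; the entities, relation names and the subnetwork helper are hypothetical) builds a small semantically typed graph and extracts the neighbourhood of a disease node via selected relation types, analogous to the gene-disease and gene-compound queries described above:

import networkx as nx

kb = nx.MultiDiGraph()

# Hypothetical entries; a real COPD knowledge base would be populated from
# clinical data, text-mining results and public databases.
kb.add_node("COPD", kind="disease")
kb.add_node("SERPINA1", kind="gene")
kb.add_node("MMP9", kind="gene")
kb.add_node("theophylline", kind="compound")
kb.add_edge("SERPINA1", "COPD", relation="associated_with")
kb.add_edge("MMP9", "COPD", relation="associated_with")
kb.add_edge("theophylline", "MMP9", relation="modulates")

def subnetwork(graph, seed, relations, depth=2):
    """Return the sub-network reachable from `seed` via the given relation types."""
    keep, frontier = {seed}, {seed}
    for _ in range(depth):
        nxt = set()
        for u, v, data in graph.edges(data=True):
            if data["relation"] in relations and (u in frontier or v in frontier):
                nxt.update((u, v))
        frontier = nxt - keep
        keep |= nxt
    return graph.subgraph(keep)

copd_net = subnetwork(kb, "COPD", {"associated_with", "modulates"})
print(sorted(copd_net.nodes()))  # ['COPD', 'MMP9', 'SERPINA1', 'theophylline']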

Relevance:

30.00%

Publisher:

Abstract:

The increasing volume of data describing human disease processes and the growing complexity of understanding, managing, and sharing such data present a huge challenge for clinicians and medical researchers. This paper presents the @neurIST system, which provides an infrastructure for biomedical research while aiding clinical care, by bringing together heterogeneous data and complex processing and computing services. Although @neurIST targets the investigation and treatment of cerebral aneurysms, the system's architecture is generic enough that it could be adapted to the treatment of other diseases. Innovations in @neurIST include confining the patient data pertaining to aneurysms inside a single environment that offers clinicians the tools to analyze and interpret patient data and make use of knowledge-based guidance in planning their treatment. Medical researchers gain access to a critical mass of aneurysm-related data due to the system's ability to federate distributed information sources. A semantically mediated grid infrastructure ensures that both clinicians and researchers are able to seamlessly access and work on data that is distributed across multiple sites in a secure way, in addition to providing computing resources on demand for performing computationally intensive simulations for treatment planning and research.
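A toy sketch of the federation idea (purely illustrative; the field names, mappings and records below are invented and not part of @neurIST) shows how records from heterogeneous sources can be rewritten into a shared schema before being queried together:

SHARED_SCHEMA = ("patient_id", "aneurysm_location", "max_diameter_mm")

# Hypothetical local records with heterogeneous field names.
site_a = [{"pid": "A-001", "location": "MCA", "diam_mm": 6.2}]
site_b = [{"subject": "B-017", "site": "ACom", "size": 4.8}]

MAPPINGS = {
    "site_a": {"pid": "patient_id", "location": "aneurysm_location", "diam_mm": "max_diameter_mm"},
    "site_b": {"subject": "patient_id", "site": "aneurysm_location", "size": "max_diameter_mm"},
}

def mediate(records, mapping):
    """Rewrite local field names into the shared schema."""
    return [{mapping[k]: v for k, v in rec.items() if k in mapping} for rec in records]

federated = mediate(site_a, MAPPINGS["site_a"]) + mediate(site_b, MAPPINGS["site_b"])
large = [r for r in federated if r["max_diameter_mm"] >= 5.0]
print(large)  # [{'patient_id': 'A-001', 'aneurysm_location': 'MCA', 'max_diameter_mm': 6.2}]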

Relevance:

30.00%

Publisher:

Abstract:

The number of Hispanic workers in the U.S. construction industry has been steadily increasing, and language and cultural barriers have sometimes arisen on the jobsite. Due in part to these barriers, the number of fatalities among Hispanics at construction sites in 2001 jumped 24%, while construction fatalities overall dropped 3%. This study, which constitutes Phase III of the Hispanic Workforce Research Project, addresses these language and cultural barriers by investigating the most effective way to deliver the training material developed in Phases I and II to Hispanic workers, American supervisors, and Department of Transportation (DOT) inspectors. The research methodology consisted of assessing the needs and interests of potential and current course participants and exploring innovative ways to deliver the training. The training courses were then adapted and delivered to fit the specific needs of each audience. During Phase III of this project, the research team delivered the courses described in the Phase I and II reports to eight highway construction companies and two DOT groups. The courses developed in Phases I and II consist of four construction-focused language training courses that can be part of an effective training program to facilitate integration between U.S. and Hispanic workers, increase productivity and motivation at the jobsite, and decrease the existing high mortality rate for Hispanic workers. In addition, the research team developed a course for the construction season, the Toolbox Integration Course for Hispanic workers and American supervisors (TICHA), which consists of nine 45-minute modules delivered to one construction company over 11 weeks in the summer of 2005.

Relevance:

30.00%

Publisher:

Abstract:

Background: Shared decision making (SDM) is a process by which a healthcare choice is made jointly by the healthcare professional and the patient. SDM is the essential element of patient-centered care, a core concept of primary care. However, SDM is seldom translated into primary care practice. Continuing professional development (CPD) is the principal means by which healthcare professionals continue to gain, improve, and broaden the knowledge and skills required for patient-centered care. Our international collaboration seeks to improve the knowledge base of CPD that targets translating SDM into the clinical practice of primary care in diverse healthcare systems. Methods: Funded by the Canadian Institutes of Health Research (CIHR), our project will form an international, interdisciplinary research team composed of health services researchers, physicians, nurses, psychologists, dietitians, CPD decision makers, and others, who will study how CPD leads to SDM being practiced in primary care. We will perform an environmental scan to create an inventory of CPD programs and related activities for translating SDM into clinical practice. These programs will be critically assessed and compared according to their strengths and limitations. We will use the empirical data resulting from the environmental scan and the critical appraisal to identify knowledge gaps and to generate a research agenda during a two-day workshop to be held in Quebec City. We will then ask CPD stakeholders to validate these knowledge gaps and the research agenda. Discussion: This project will analyse existing CPD programs and related activities for translating SDM into the practice of primary care. Because this international collaboration will develop and identify various factors influencing SDM, the project could shed new light on how SDM is implemented in primary care.

Relevance:

30.00%

Publisher:

Abstract:

The authors describe an invasive Aspergillus fumigatus deep-burn wound infection in a severely burned patient that was successfully treated with a combination of topical terbinafine and systemic voriconazole antifungal therapy. To our knowledge, this is the first case report describing the effective control of an invasive deep-burn wound infection using this combination.

Relevance:

30.00%

Publisher:

Abstract:

This thesis is situated within the debates on conceptions of the curriculum, as a central issue in education and training processes, and on the role of the University over time, particularly in the current contexts of globalization, giving special emphasis to the conceptions, praxis and trends that characterize the experience of curriculum development at the Universidade de Cabo Verde (Uni-CV) since its creation in November 2006, following a trajectory of almost three decades of Cape Verdean public higher education. In framing the research problem theoretically, the thesis maps the relevant literature in the scientific field of curriculum studies, in an approach that displays the diversity of conceptualizations of curriculum and curriculum development, the main characteristic features of the curriculum theories that have succeeded one another or that compete for hegemony in the education sector, and the educational and curricular policies conceived and implemented on a global scale, paying particular attention to the instituting and instituted dimensions of the curricular process. Although strongly conditioned by conceptions and policies of educational globalization, the trend towards educational and curricular uniformity is not inevitable; on the contrary, the thesis shows that the process of curriculum development leaves room for appropriation and innovation at the level of educational institutions, given the diversity of contexts, expectations and perspectives inherent in the dynamics of curriculum enactment. Still on the theoretical level, the analysis of the evolution of the concept or idea of the University, from its genesis to the present day, highlights the specific nature of this institution within higher education, showing how, in different contexts, it has sought to affirm the centrality of knowledge and of the curriculum in fulfilling its mission, despite various factors and constraints, among which stands out the type of relationship prevailing between the University, the State and the market. It is within this relationship that one must understand the complexity of the institutional crisis running through academia, in its threefold manifestations (a crisis of legitimacy, of hegemony and of identity), with repercussions on tendencies to constrain the autonomy, mission and functions of academia, as well as the very nature of university knowledge. In the search for ways out of this crisis, which is global and, as such, is reflected in the universities of the African continent, of which Cape Verde is part, the University is challenged to affirm its institutional specificity as a promoter of high culture and of the capacity for long-term thinking, thereby reconciling its essential or symbolic functions with those related to meeting the immediate or short-term needs of the economy and the market.
Building on these theoretical contributions, the empirical studies follow a case study methodology in which documentary analysis and qualitative and quantitative research techniques made it possible to consolidate the evidence on: (i) the background to the creation of Uni-CV, through the mapping of the academic and curricular trajectory of the various public higher education institutions that preceded the public university and bequeathed to it their scientific, technological and logistical assets, with their inherent potential and limitations; (ii) the process of institutionalization of Uni-CV, with reference to the structuring choices concerning the organization and management of the University as well as its educational and curricular policy; (iii) the multifaceted experience of curriculum development in the new institution during its first five years of operation (2006-2011), correlating choices and praxis and highlighting trends in its evolution. The interpretative analysis of the empirical studies, through triangulation of archival and perspective data, shows that Uni-CV, notwithstanding persistent weaknesses in its institutional development, has fulfilled its mission satisfactorily. This is due both to the adequacy of the choices, norms and directives shaping the instituting dimension of the curricular process and to the effort to carry out the curricular prescriptions. Nevertheless, clear challenges remain on the way to the desired academic excellence advocated by the Statutes, which requires, in particular, raising the level of qualification of the teaching staff, implementing or making fully operational some of the University's governing bodies, and affirming scientific research as an indispensable function for the proper performance of the teaching and outreach functions. Among the conclusions, it is argued that, in the process of integrating Cape Verde into international networks of research and of scientific and technological excellence, as the Statutes of Uni-CV themselves advocate, attention must be paid to the specificity of this small mid-Atlantic country and its structural fragilities. Some critical distance is therefore required with respect to the incorporation of certain educational and curricular policy options emanating from international bodies, regardless of their innovative character or even their possible scientific and technical soundness, proven in other contexts.

Relevance:

30.00%

Publisher:

Abstract:

Accurate characterization of the spatial distribution of hydrological properties in heterogeneous aquifers at a range of scales is a key prerequisite for reliable modeling of subsurface contaminant transport, and is essential for designing effective and cost-efficient groundwater management and remediation strategies. To this end, high-resolution geophysical methods have shown significant potential to bridge a critical gap in subsurface resolution and coverage between traditional hydrological measurement techniques such as borehole log/core analyses and tracer or pumping tests. An important and still largely unresolved issue, however, is how to best quantitatively integrate geophysical data into a characterization study in order to estimate the spatial distribution of one or more pertinent hydrological parameters, thus improving hydrological predictions. Recognizing the importance of this issue, the aim of the research presented in this thesis was to first develop a strategy for the assimilation of several types of hydrogeophysical data having varying degrees of resolution, subsurface coverage, and sensitivity to the hydrologic parameter of interest. In this regard, a novel simulated annealing (SA)-based conditional simulation approach was developed and then tested in its ability to generate realizations of porosity given crosshole ground-penetrating radar (GPR) and neutron porosity log data. This was done successfully for both synthetic and field data sets. A subsequent issue that needed to be addressed involved assessing the potential benefits and implications of the resulting porosity realizations in terms of groundwater flow and contaminant transport. This was investigated synthetically assuming first that the relationship between porosity and hydraulic conductivity was well-defined. Then, the relationship was itself investigated in the context of a calibration procedure using hypothetical tracer test data. Essentially, the relationship best predicting the observed tracer test measurements was determined given the geophysically derived porosity structure. Both of these investigations showed that the SA-based approach, in general, allows much more reliable hydrological predictions than other more elementary techniques considered. Further, the developed calibration procedure was seen to be very effective, even at the scale of tomographic resolution, for predictions of transport. This also held true at locations within the aquifer where only geophysical data were available. This is significant because the acquisition of hydrological tracer test measurements is clearly more complicated and expensive than the acquisition of geophysical measurements. Although the above methodologies were tested using porosity logs and GPR data, the findings are expected to remain valid for a large number of pertinent combinations of geophysical and borehole log data of comparable resolution and sensitivity to the hydrological target parameter. Moreover, the obtained results allow us to have confidence for future developments in integration methodologies for geophysical and hydrological data to improve the 3-D estimation of hydrological properties.
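The following minimal 1-D sketch (assumptions mine, not the thesis code: synthetic hard data, a lag-1 correlation as the target statistic, and a fixed cooling schedule) illustrates how a simulated-annealing conditional simulation honours point measurements while iteratively swapping the remaining values until a target spatial statistic is matched:

import math
import random

import numpy as np

rng = np.random.default_rng(0)
n = 200
hard_data = {10: 0.32, 120: 0.18}   # cell index -> measured porosity (hypothetical)
target_lag1 = 0.8                   # target lag-1 autocorrelation (in practice informed by GPR)

# Initial realization: draw from the assumed marginal, then impose the hard data.
sim = rng.normal(0.25, 0.05, n)
for idx, value in hard_data.items():
    sim[idx] = value
free = [i for i in range(n) if i not in hard_data]

def lag1(x):
    return np.corrcoef(x[:-1], x[1:])[0, 1]

def objective(x):
    return (lag1(x) - target_lag1) ** 2

temp, cooling = 1e-3, 0.999
current = objective(sim)
for _ in range(20000):
    i, j = random.sample(free, 2)        # swapping preserves the histogram and the hard data
    sim[i], sim[j] = sim[j], sim[i]
    candidate = objective(sim)
    if candidate < current or random.random() < math.exp((current - candidate) / temp):
        current = candidate
    else:
        sim[i], sim[j] = sim[j], sim[i]  # undo the swap
    temp *= cooling

print(f"lag-1 correlation: {lag1(sim):.2f} (target {target_lag1})")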

Relevance:

30.00%

Publisher:

Abstract:

Research on judgment and decision making presents a confusing picture of human abilities. For example, much research has emphasized the dysfunctional aspects of judgmental heuristics, and yet other findings suggest that these can be highly effective. A further line of research has modeled judgment as resulting from 'as if' linear models. This paper illuminates the distinctions between these approaches by providing a common analytical framework based on the central theoretical premise that understanding human performance requires specifying how characteristics of the decision rules people use interact with the demands of the tasks they face. Our work synthesizes the analytical tools of lens model research with novel methodology developed to specify the effectiveness of heuristics in different environments, and allows direct comparisons between the different approaches. We illustrate with both theoretical analyses and simulations. We further link our results to the empirical literature by a meta-analysis of lens model studies and estimate both human and heuristic performance in the same tasks. Our results highlight the trade-off between linear models and heuristics. Whereas the former are cognitively demanding, the latter are simple to use. However, they require knowledge, and thus maps, of when and which heuristic to employ.
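As a concrete, hedged illustration of the kind of comparison such a framework supports (the environment, cue weights and the take-the-best variant below are my own assumptions, not the paper's setup), one can simulate paired-comparison choices and score a weighted linear rule against a lexicographic heuristic:

import numpy as np

rng = np.random.default_rng(1)
n_cues, n_pairs = 4, 5000
weights = np.array([0.6, 0.25, 0.1, 0.05])   # the environment's cue weights are set here

# Binary cue profiles for the two options in each pair; criterion = weighted cue sum + noise.
a = rng.integers(0, 2, size=(n_pairs, n_cues))
b = rng.integers(0, 2, size=(n_pairs, n_cues))
criterion_a = a @ weights + rng.normal(0, 0.05, n_pairs)
criterion_b = b @ weights + rng.normal(0, 0.05, n_pairs)
truth = criterion_a > criterion_b

def linear_choice(x, y, w):
    return x @ w > y @ w

def take_the_best(x, y, cue_order):
    """Choose by the first discriminating cue in validity order; guess if none discriminates."""
    for c in cue_order:
        if x[c] != y[c]:
            return x[c] > y[c]
    return rng.random() < 0.5

order = np.argsort(-weights)                 # assume the decision maker knows the cue ranking
lin = np.array([linear_choice(a[i], b[i], weights) for i in range(n_pairs)])
ttb = np.array([take_the_best(a[i], b[i], order) for i in range(n_pairs)])
print(f"linear model accuracy: {np.mean(lin == truth):.2f}")
print(f"take-the-best accuracy: {np.mean(ttb == truth):.2f}")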

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: A primary goal of clinical pharmacology is to understand the factors that determine the dose-effect relationship and to use this knowledge to individualize drug dose. METHODS: A principle-based criterion is proposed for deciding among alternative individualization methods. RESULTS: Safe and effective variability defines the maximum acceptable population variability in drug concentration around the population average. CONCLUSIONS: A decision on whether patient covariates alone are sufficient, or whether therapeutic drug monitoring in combination with target concentration intervention is needed, can be made by comparing the remaining population variability after a particular dosing method with the safe and effective variability.
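A minimal sketch of this decision rule (all numbers are hypothetical and chosen only for illustration) compares the coefficient of variation remaining after each candidate individualization method with an assumed safe and effective variability:

SEV_CV = 0.30   # assumed maximum acceptable coefficient of variation (safe and effective variability)

# Hypothetical remaining between-patient variability (as a CV of steady-state
# concentration) after each individualization method.
remaining_cv = {
    "population average dose": 0.55,
    "dose from patient covariates (weight, renal function)": 0.35,
    "covariates + target concentration intervention (TDM)": 0.20,
}

for method, cv in remaining_cv.items():
    verdict = "sufficient" if cv <= SEV_CV else "insufficient, add TDM/TCI"
    print(f"{method}: remaining CV {cv:.0%} vs SEV {SEV_CV:.0%} -> {verdict}")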

Relevance:

30.00%

Publisher:

Abstract:

Executive Summary: The unifying theme of this thesis is the pursuit of a satisfactory way to quantify the risk-reward trade-off in financial economics: first in the context of a general asset pricing model, then across models, and finally across country borders. The guiding principle in that pursuit was to seek innovative solutions by combining ideas from different fields in economics and broad scientific research. For example, in the first part of this thesis we sought a fruitful application of strong existence results in utility theory to topics in asset pricing. In the second part we apply an idea from the field of fuzzy set theory to the optimal portfolio selection problem, while the third part is, to the best of our knowledge, the first empirical application of some general results in asset pricing in incomplete markets to the important topic of measuring financial integration. While the first two parts of this thesis effectively combine well-known ways to quantify the risk-reward trade-off, the third one can be viewed as an empirical verification of the usefulness of the so-called "good deal bounds" theory in designing risk-sensitive pricing bounds. Chapter 1 develops a discrete-time asset pricing model, based on a novel ordinally equivalent representation of recursive utility. To the best of our knowledge, we are the first to use a member of a novel class of recursive utility generators to construct a representative agent model to address some long-lasting issues in asset pricing. Applying strong representation results allows us to show that the model features countercyclical risk premia, for both consumption and financial risk, together with a low and procyclical risk-free rate. As the recursive utility used nests the well-known time-state separable utility as a special case, all results nest the corresponding ones from the standard model and thus shed light on its well-known shortcomings. The empirical investigation to support these theoretical results, however, showed that as long as one resorts to econometric methods based on approximating conditional moments with unconditional ones, it is not possible to distinguish the model we propose from the standard one. Chapter 2 is joint work with Sergei Sontchik. There we provide theoretical and empirical motivation for the aggregation of performance measures. The main idea is that, just as it makes sense to apply several performance measures ex post, it also makes sense to base optimal portfolio selection on ex-ante maximization of as many performance measures as desired. We thus offer a concrete algorithm for optimal portfolio selection via ex-ante optimization, over different horizons, of several risk-return trade-offs simultaneously. An empirical application of that algorithm, using seven popular performance measures, suggests that realized returns feature better distributional characteristics than those of realized returns from portfolio strategies that are optimal with respect to single performance measures. When comparing the distributions of realized returns we used two partial risk-reward orderings: first- and second-order stochastic dominance.
We first used the Kolmogorov-Smirnov test to determine whether the two distributions are indeed different, which, combined with a visual inspection, allowed us to demonstrate that the way we propose to aggregate performance measures leads to realized portfolio returns that first-order stochastically dominate those resulting from optimization with respect to only a single measure, for example the Treynor ratio or Jensen's alpha. We checked for second-order stochastic dominance via pointwise comparison of the so-called absolute Lorenz curve, i.e. the sequence of expected shortfalls over a range of quantiles. Since the plot of the absolute Lorenz curve for the aggregated performance measures lay above the one corresponding to each individual measure, we were tempted to conclude that the algorithm we propose leads to a portfolio return distribution that second-order stochastically dominates those of virtually all the individual performance measures considered. Chapter 3 proposes a measure of financial integration, based on recent advances in asset pricing in incomplete markets. Given a base market (a set of traded assets) and an index of another market, we propose to measure financial integration through time by the size of the spread between the pricing bounds of the market index, relative to the base market. The bigger the spread around country index A, viewed from market B, the less integrated markets A and B are. We investigate the presence of structural breaks in the size of the spread for EMU member country indices before and after the introduction of the Euro. We find evidence that both the level and the volatility of our financial integration measure increased after the introduction of the Euro. This counterintuitive result suggests the presence of an inherent weakness in the attempt to measure financial integration independently of economic fundamentals. Nevertheless, the results about the bounds on the risk-free rate appear plausible from the viewpoint of existing economic theory about the impact of integration on interest rates.
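As a hedged illustration of the two dominance checks mentioned above (the return series below are synthetic, not the thesis data), first-order dominance can be tested by pointwise comparison of empirical CDFs, and second-order dominance via the absolute Lorenz curve, i.e. the running mean of sorted returns (equivalently, expected shortfall across quantiles):

import numpy as np

rng = np.random.default_rng(3)
aggregated = rng.normal(0.010, 0.04, 2500)   # returns from the aggregated-measure portfolio (synthetic)
single = rng.normal(0.006, 0.04, 2500)       # returns from a single-measure portfolio (synthetic)

def first_order_dominates(x, y, grid_size=200):
    """x FSD y if F_x(t) <= F_y(t) for all t (empirical CDFs compared on a common grid)."""
    grid = np.linspace(min(x.min(), y.min()), max(x.max(), y.max()), grid_size)
    fx = np.searchsorted(np.sort(x), grid, side="right") / x.size
    fy = np.searchsorted(np.sort(y), grid, side="right") / y.size
    return np.all(fx <= fy)

def second_order_dominates(x, y):
    """x SSD y if the absolute Lorenz curve of x lies (weakly) above that of y (equal sample sizes)."""
    lx = np.cumsum(np.sort(x)) / x.size
    ly = np.cumsum(np.sort(y)) / y.size
    return np.all(lx >= ly)

print("first-order dominance :", first_order_dominates(aggregated, single))
print("second-order dominance:", second_order_dominates(aggregated, single))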

Relevance:

30.00%

Publisher:

Abstract:

Modeling concentration-response functions has become extremely popular in ecotoxicology during the last decade. Indeed, modeling allows the full response pattern of a given substance to be determined. However, reliable modeling is demanding in terms of data, which conflicts with the current trend in ecotoxicology of reducing, for cost and ethical reasons, the amount of data produced during an experiment. It is therefore crucial to determine the experimental design in a cost-effective manner. In this paper, we propose to use the theory of locally D-optimal designs to determine the set of concentrations to be tested so that the parameters of the concentration-response function can be estimated with high precision. We illustrate this approach by determining the locally D-optimal designs for estimating the toxicity of the herbicide dinoseb on daphnids and algae. The results show that the number of concentrations to be tested is often equal to the number of parameters and is often related to their meaning, i.e. the concentrations are located close to the parameter values. Furthermore, the results show that the locally D-optimal design often has the minimal number of support points and is not very sensitive to small changes in the nominal values of the parameters. In order to reduce the experimental cost and the use of test organisms, especially in the case of long-term studies, reliable nominal values may therefore be fixed based on prior knowledge and literature research instead of on preliminary experiments.
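A small sketch of the underlying computation (the two-parameter log-logistic model, nominal values and candidate grid are illustrative, not the dinoseb study's data; the search is restricted to equally weighted two-point designs, consistent with the finding that the number of support points often equals the number of parameters) selects the concentration pair maximizing the determinant of the information matrix at the nominal parameter values:

from itertools import combinations

import numpy as np

def response(conc, theta):
    ec50, slope = theta
    return 1.0 / (1.0 + (conc / ec50) ** slope)

def sensitivities(conc, theta, h=1e-6):
    """Numerical gradient of the response with respect to the parameters."""
    grad = np.zeros(len(theta))
    for k in range(len(theta)):
        up, down = np.array(theta, float), np.array(theta, float)
        up[k] += h
        down[k] -= h
        grad[k] = (response(conc, up) - response(conc, down)) / (2 * h)
    return grad

theta0 = (1.0, 2.0)                          # nominal EC50 (mg/L) and slope (hypothetical)
candidates = np.geomspace(0.01, 100.0, 60)   # candidate test concentrations

best_det, best_design = -np.inf, None
for design in combinations(candidates, 2):   # 2 equally weighted support points for 2 parameters
    jac = np.array([sensitivities(c, theta0) for c in design])
    # Proportional to the Fisher information under homoscedastic normal errors.
    det = np.linalg.det(jac.T @ jac / len(design))
    if det > best_det:
        best_det, best_design = det, design

print("locally D-optimal concentrations:", np.round(best_design, 3))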

Relevance:

30.00%

Publisher:

Abstract:

Hippocampal adult neurogenesis results in the continuous formation of new neurons in the adult hippocampus, which participate in learning and memory. Manipulations increasing adult neurogenesis have a huge clinical potential in pathologies involving memory loss. Intriguingly, most of the newborn neurons die during their maturation. Thus, increasing newborn neuron survival during maturation may be a powerful way to increase overall adult neurogenesis. The factors governing this neuronal death are still poorly known. In my PhD project, we hypothesized that synaptogenesis and synaptic activity play a role in the survival of newborn hippocampal neurons. We studied three factors potentially involved in the regulation of the synaptic integration of adult-born neurons. First, we used propofol anesthesia to provoke a global increase in GABAergic activity of the network, and we evaluated the outcome for newborn neuron synaptic integration, morphological development, and survival. Propofol anesthesia impaired the dendritic maturation and survival of adult-born neurons in an age-dependent manner. Next, we examined the development of astrocytic ensheathment of the synapses formed by newborn neurons, as we hypothesized that astrocytes are involved in their synaptic integration. Astrocytic processes ensheathed the synapses of newborn neurons very early in their development, and these processes modulated synaptic transmission onto these cells. Finally, we studied the cell-autonomous effects of the overexpression of synaptic adhesion molecules on the development, synaptic integration and survival of newborn neurons, and we found that manipulating a single adhesion molecule was sufficient to modify synaptogenesis and/or synapse function, and to modify newborn neuron survival. Together, these results suggest that the activity of the neuronal network, the modulation of glutamate transport by astrocytes, and the synapse formation and activity of the neuron itself may regulate the survival of newborn neurons. Thus, the survival of newborn neurons may depend on their ability to communicate with the network. This knowledge is crucial for finding ways to increase neurogenesis in patients. More generally, understanding how the neurogenic niche works and which factors are important for the generation, maturation and survival of neurons is fundamental to being able, perhaps one day, to replace neurons in any region of the brain.

Relevance:

30.00%

Publisher:

Abstract:

This research focuses on the glacial witnesses of the Chablais area along four dimensions: geoheritage, objective knowledge, geosite inventory, and promotion. It is organized on the model of a heritage-making process in which it both participates and which it questions. In 2009, the 123 Chablais project started for a period of four years. It covered the entire Chablais territory, spread over two countries (France and Switzerland) and three administrative entities (the département of Haute-Savoie and the cantons of Vaud and Valais). This project, developed in the framework of the Interreg IV France-Switzerland programme, aimed to boost local economic development by drawing on regional heritage. Geoheritage, identified as one of these resources, was therefore the subject of several actions, including this research. In parallel, the French Chablais was preparing its application to join the European Geopark Network (EGN). Its admission, effective in 2012, made this area the fifth French Geopark of the network. The Chablais Geopark bases its geological identity on water and ice, two themes closely linked to the glacial witnesses. In this context of interest in the regional geoheritage, and in particular in the glacial heritage, several missions were assigned to this research: to improve objective knowledge of the glacial witnesses, to inventory glacial geosites, and to promote the glacial heritage.
The first objective was to acquire a synthetic view of the glacial witnesses. It required a bibliographic synthesis and its spatialization in order to identify gaps in knowledge and how this work could help to fill them. On this basis, several methods were implemented: geomorphological mapping, reconstruction of glacial equilibrium-line altitudes, and dating of erratic boulders using in situ produced cosmogenic isotopes. Geomorphological maps were produced in particular for glacial cirques and valleys. Cosmogenic dating was concentrated on two stages of the Rhone glacier: the Last Local Glacial Maximum (LLGM) and the Monthey stage. At the end of this step, the specificities of the regional glacial heritage emerged as (1) a wide variety of forms and close links with various other geomorphological processes, and (2) the attribution of the glacial witnesses to ten major stages of the deglaciation of the Lake Geneva (Léman) basin.
The second objective centred on the inventory of glacial geosites. We focused on the selection of geoheritage by developing an approach based on the two axes (time and space) identified in the previous step, and thus produced a thematic inventory of 32 geosites. The structure of the inventory was also explored so as to integrate criteria on the use of these geosites. This approach, supported by a reflection on the values attributed to geoheritage and on how to assess these values, allowed us to highlight the anthropo- and scientifico-centred point of view that clearly prevails in European geoheritage research. The analysis of the inventory results revealed some characteristics of the Chablais glacial heritage: inconspicuous, diverse, and with two features that can be exploited for scientific mediation, namely its status as the "cradle of the glacial theory" and its close links with activities of daily life, as raw material, support for leisure, or risk factor. This research led to the development of a travelling exhibition on the glacial heritage of the Chablais. This geotourism product was designed to raise local awareness of the impact of glaciers on the territory. It presents a series of seven maps of glacial stages, framed by the two themes mentioned above: the history of glacial knowledge on the one hand, and glacial witnesses and society on the other.

Relevance:

30.00%

Publisher:

Abstract:

This article presents the results of a study intended to identify possible variables that affect the processes and strategies of occupational integration of people coming from other countries, to establish the training needs of this group, and to extend knowledge of the phenomenon for the sake of psycho-educational and social intervention. Using a qualitative research methodology, it was concluded that differences in integration strategies and their outcomes are determined more by the level of qualification attained in the country of origin than by cultural differences themselves.