951 results for Evaluating a Cuisine: Six Criteria
Abstract:
The six dimensions of the concept of the "social food space" (espace social alimentaire) have been structured to facilitate analysis of the properly social dimension of the act of eating. Analysed through this concept, a regional cuisine reveals the lived food experience of the people who consume it, since the promotion of a regional cuisine is based on emblematic products that are not necessarily eaten every day. In the Provençal village of Sault, a village that has remained largely off the paths of mass tourism, I examine the eating habits of the population. Do these habits correspond to what the culinary and dietary promotion of Provençal cuisine has accustomed us to? By addressing the properly social dimension of the eating practices of the inhabitants of Sault, I show the composite character of their food-consumption habits, shaped by a Provençal identity, a micro-regional discourse and French habits.
Abstract:
In this thesis, we describe the results of a research project aimed at measuring and evaluating the quality of obstetric care in referral hospitals in Mali and Senegal. In these countries, hospital-based maternal mortality is high and is partly linked to inadequate medical practice. This research was carried out within the QUARITE study, a cluster-randomised trial evaluating the effectiveness of the GESTA International programme for reducing hospital-based maternal mortality. GESTA was implemented between 2008 and 2010 and consisted of training health professionals and reviewing cases of maternal death. In parallel with QUARITE, programmes for the prevention of mother-to-child transmission of HIV (PMTCT) were scaled up across both countries. Since these programmes may also improve the quality of obstetric care, we evaluated the effects of both programmes (GESTA and PMTCT) on quality of care. First, through a literature review, we assessed the capacity of criterion-based clinical audit to measure the quality of obstetric care. Such an audit verifies whether the care provided met the clinical criteria defining best practice according to scientific evidence and expert opinion. We showed that this tool is widely used in low- and middle-income countries despite limited evidence of its validity (article 1). Second, we developed a criterion-based clinical audit suited to the West African context and approved by national and international expert obstetricians. Using obstetric records, the medical acts performed during labour and delivery were assessed with this instrument, and quality of care was estimated as the percentage of criteria met.
Applied in different settings and by different auditors, our instrument proved reliable and valid (article 3). Nevertheless, the audit experience led us to question the poor completion of medical records and its consequences for quality of care (article 2). Third, the tool was applied on a large scale to evaluate the effects of the GESTA intervention (article 4). We reviewed more than 800 obstetric records in 32 referral hospitals (16 receiving the intervention and 16 not). Through this clinical audit, we showed that the GESTA programme contributes to improving quality of care, specifically the clinical examination on admission and post-delivery follow-up. Finally, we used the instrument to evaluate the effects of PMTCT programmes on the quality of obstetric care (article 5). Our work documented that only certain components of the PMTCT programme improve quality of care, such as the training of professionals and complementary nutrition services. In conclusion, this research identified several avenues of intervention for improving the quality of obstetric care in West Africa.
Abstract:
Thesis completed under joint France-Québec supervision (cotutelle).
Abstract:
For several years, science teaching has played a secondary role, behind French and mathematics, for many elementary school teachers (Lenoir, 2000). Struggling to connect the content to be taught with everyday life, these often inadequately trained teachers appear to be looking for new, effective didactic tools. In this regard, integrating social reference practices (Martinand, 1986) into learning situations can constitute an innovative practice that supports the didactic transposition of disciplinary knowledge. Building on Quebecers' growing interest in cooking, this research seeks to determine the impact on elementary science teaching of a professional development programme linking science to culinary practices. For this descriptive case study, six elementary school teachers took part in two training sessions during which they tried out a learning and evaluation situation (SAÉ) integrating cooking activities. After a trial period with this SAÉ in the classroom, the participants were interviewed to establish what they had learned during the training and to draw up a list of the advantages of, and obstacles to, using cooking activities to teach science. In addition, their written suggestions for improving the SAÉ highlighted the predominance of pedagogical, organisational and socio-affective concerns among the participants. This research has implications for didactics specialists, as it provides additional data that can help improve the quality of the teaching materials available to practitioners. Moreover, the SAÉ developed for this study constitutes an innovative didactic tool that elementary school teachers can use.
Abstract:
Enterprise Modeling (EM) is currently used either as a technique to represent and understand the structure and behavior of the enterprise, or as a technique to analyze business processes, and in many cases as a supporting technique for business process reengineering. However, EM architectures and methods for Enterprise Engineering can also be used to support new management techniques such as SIX SIGMA, because these techniques need a clear, transparent and integrated definition and description of the enterprise's business activities in order to build up, optimize and operate a successful enterprise. The main goal of SIX SIGMA is to optimize the performance of processes. A still open question is: "What are the adequate quality criteria and methods to ensure such performance? What must we do to achieve Quality governance?" This paper describes a method that combines an Enterprise Engineering method with the SIX SIGMA strategy to reach Quality Governance.
Abstract:
Introduction: Edematofibrosclerotic panniculopathy (PEFE) is one of the most frequent reasons for aesthetic consultation. It occurs in 85% of women, yet there is no reproducible and reliable method for grading its severity. The aim of this study was to evaluate the reliability and reproducibility of a photographic scale for classifying the severity grades of gluteal PEFE in a group of Colombian women. Materials and Methods: 182 standardised photographs of the buttocks were taken at rest and during muscle contraction. The criteria for rating the photographs were established by expert consensus. Two sessions were held with six raters blinded to clinical data, using the following scale: 0 = none, 1 = mild, 2 = moderate, 3 = severe. The photographs on which all six raters agreed were selected for the album. Results: The raters agreed, at rest and in contraction, on 23 (25.27%) of the 182 photographs. They agreed on 5/91 (5.49%) of the photographs at rest and on 18/91 (19.78%) of those in contraction. There was no agreement for grades 0 and 3 in contraction. Intra-rater agreement, both at rest and in contraction, was 0.443 (p < 0.0001); inter-rater agreement at rest and in contraction was 0.398 (p < 0.0001). Discussion: In constructing the photographic scale, agreement was not reached for all severity grades; for the grades on which there was intra- and inter-rater agreement, it was acceptable. The study needs to be continued until complete data are obtained.
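The abstract does not name the agreement statistic behind its 0.443 and 0.398 values; a common choice for categorical severity ratings is Cohen's kappa. A minimal sketch, with illustrative ratings that are not the study's data:

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    n = len(rater1)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    expected = sum(c1[g] * c2[g] for g in c1.keys() | c2.keys()) / (n * n)
    return (observed - expected) / (1 - expected)

# Illustrative ratings on the 0-3 severity scale (NOT the study's data)
rater_a = [0, 1, 1, 2, 2, 3, 1, 2, 0, 3]
rater_b = [0, 1, 2, 2, 2, 3, 1, 1, 0, 3]
print(round(cohens_kappa(rater_a, rater_b), 2))  # → 0.73
```

Values in the 0.4 range, like those reported, are conventionally read as moderate agreement.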
Abstract:
With the growing popularity of IT solutions as a key factor in increasing competitiveness and creating value for companies, the need to invest in IT projects has risen considerably. Limited resources as an obstacle to investment have forced companies to look for methodologies for selecting and prioritising projects, ensuring that the decisions taken are those aligned with corporate strategies, so as to secure value creation and maximise benefits. This thesis provides the foundations for implementing IT Project Portfolio Management (IT PPM) as an effective methodology for managing IT-based projects and as a tool for giving executives clear decision-making criteria. The document explains how to implement IT PPM in seven steps, analysing the processes and functions required for its successful execution, and presents different methods and criteria for project selection and prioritisation. After the theoretical part describing IT PPM, the thesis provides a case-study analysis of a pharmaceutical company. The company already has a project management department, but the need to implement IT PPM was identified because of its broad coverage of end-to-end processes in IT projects and its way of ensuring benefit maximisation. Combining the theoretical research with the case-study analysis, the thesis concludes with a practical definition of an approximate IT PPM model as a recommendation for implementation in the Project Management Department.
Abstract:
Six large-bodied, ≥ 120 g, woodpecker species are listed as near-threatened to critically endangered by the International Union for Conservation of Nature (IUCN). The small population paradigm assumes that these populations are likely to become extinct without an increase in numbers, but the combined influences of initial population size and demographic rates, i.e., annual adult survival and fecundity, may drive population persistence for these species. We applied a stochastic, stage-based single-population model to available demographic rates for Dryocopus and Campephilus woodpeckers. In particular, we determined the change in predicted extinction rate, i.e., the proportion of simulated populations that went extinct within 100 yr, in response to concomitant changes in six input parameters. To our knowledge, this is the first study to evaluate the combined importance of initial population size and demographic rates for the persistence of large-bodied woodpeckers. Under a worst-case scenario, the median time to extinction was 7 yr (range: 1–32). Across the combinations of other input values, increasing initial population size by one female induced, on average, a 0.4%–3.2% (range: 0%–28%) reduction in extinction rate. Increasing initial population size from 5 to 30 resulted in extinction rates < 0.05 only under limited conditions: (1) all input values were intermediate, or (2) an Allee effect was present and annual adult survival was ≥ 0.8. Based on our model, these species can persist as rare (as few as five females), and thus difficult-to-detect, populations provided they maintain ≥ 1.1 recruited females annually per adult female and an annual adult survival rate ≥ 0.8.
Although a demographic-based population viability analysis (PVA) is useful for predicting how extinction rate changes across scenarios for life-history attributes, the next step in modelling these populations should incorporate more easily acquired data on changes in patch occupancy, to make predictions about patch colonization and extinction rates.
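The study's model code is not given in the abstract; as a rough, hypothetical sketch of the kind of stochastic single-population simulation it describes (binomial adult survival, Poisson recruitment, a simple ceiling in place of the study's density dependence, and no Allee effect):

```python
import math
import random

def poisson(rng, lam):
    """Poisson draw via Knuth's algorithm (avoids non-stdlib dependencies)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while p > threshold:
        k += 1
        p *= rng.random()
    return k - 1

def extinction_rate(n0, survival, fecundity, cap=60, years=100, runs=100, seed=1):
    """Fraction of simulated female-only populations extinct within `years`.

    Each year, adults survive with probability `survival` (binomial), and each
    surviving female recruits Poisson(`fecundity`) new females; a simple
    ceiling `cap` stands in for density dependence. No Allee effect is modelled.
    """
    rng = random.Random(seed)
    extinct = 0
    for _ in range(runs):
        n = n0
        for _ in range(years):
            survivors = sum(1 for _ in range(n) if rng.random() < survival)
            n = min(cap, survivors + sum(poisson(rng, fecundity) for _ in range(survivors)))
            if n == 0:
                extinct += 1
                break
    return extinct / runs

# The abstract's persistence threshold: 5 females, survival 0.8, fecundity 1.1
print(extinction_rate(5, 0.8, 1.1))
```

Sweeping `n0`, `survival` and `fecundity` over a grid would reproduce the kind of sensitivity analysis the study reports, though the numbers here should not be expected to match its results.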
Abstract:
An aggregated farm-level index, the Agri-environmental Footprint Index (AFI), based on multiple criteria methods and representing a harmonised approach to evaluation of EU agri-environmental schemes is described. The index uses a common framework for the design and evaluation of policy that can be customised to locally relevant agri-environmental issues and circumstances. Evaluation can be strictly policy-focused, or broader and more holistic in that context-relevant assessment criteria that are not necessarily considered in the evaluated policy can nevertheless be incorporated. The Index structure is flexible, and can respond to diverse local needs. The process of Index construction is interactive, engaging farmers and other relevant stakeholders in a transparent decision-making process that can ensure acceptance of the outcome, help to forge an improved understanding of local agri-environmental priorities and potentially increase awareness of the critical role of farmers in environmental management. The structure of the AFI facilitates post-evaluation analysis of relative performance in different dimensions of the agri-environment, permitting identification of current strengths and weaknesses, and enabling future improvement in policy design. Quantification of the environmental impact of agriculture beyond the stated aims of policy using an 'unweighted' form of the AFI has potential as the basis of an ongoing system of environmental audit within a specified agricultural context. (C) 2009 Elsevier Ltd. All rights reserved.
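The AFI's exact aggregation scheme is not given in the abstract; a minimal weighted-sum sketch, with hypothetical criterion names, scores and weights, illustrates the difference between a stakeholder-weighted form and the 'unweighted' form mentioned above:

```python
def afi_score(scores, weights=None):
    """Aggregate per-criterion scores in [0, 1] into a single farm-level index.

    With weights=None every criterion counts equally, mirroring the
    'unweighted' form of the index described in the abstract.
    """
    if weights is None:
        weights = {c: 1.0 for c in scores}
    total = sum(weights.values())
    return sum(scores[c] * weights[c] for c in scores) / total

# Hypothetical criteria and scores for one farm (not taken from the AFI itself)
farm = {"soil": 0.5, "water": 1.0, "biodiversity": 0.7}
print(afi_score(farm))                                              # unweighted form
print(afi_score(farm, {"soil": 2.0, "water": 1.0, "biodiversity": 1.0}))  # locally weighted
```

Keeping the per-criterion scores alongside the aggregate is what enables the post-evaluation analysis of strengths and weaknesses the abstract describes.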
Abstract:
Observations of boundary-layer cloud have been made using radar and lidar at Chilbolton, Hampshire, UK. These have been compared with output from 7 different global and regional models. Fifty-five cloudy days have been composited to reveal the mean diurnal variation of cloud top and base heights, cloud thickness and liquid water path of the clouds. To enable like-for-like comparison between model and observations, the observations have been averaged on to the grid of each model. The composites show a distinct diurnal cycle in observed cloud; the cloud height exhibits a sinusoidal variation throughout the day with a maximum at around 1600 and a minimum at around 0700 UTC. This diurnal cycle is captured by six of the seven models analysed, although the models generally under-predict both cloud top and cloud base heights throughout the day. The two worst performing models in terms of cloud boundaries also have biases of around a factor of two in liquid water path; these were the only two models that did not include an explicit formulation for cloud-top entrainment.
Abstract:
Many different performance measures have been developed to evaluate field predictions in meteorology. However, a researcher or practitioner encountering a new or unfamiliar measure may have difficulty interpreting its results, which may lead them to avoid new measures and rely on those that are familiar. In the context of evaluating forecasts of extreme events for hydrological applications, this article aims to promote the use of a range of performance measures. Several types of performance measure are introduced in order to demonstrate a six-step approach to tackling a new measure. Using the example of the European Centre for Medium-Range Weather Forecasts (ECMWF) ensemble precipitation predictions for the Danube floods of July and August 2002, we show how to apply new performance measures with this approach and how to choose between different performance measures based on their suitability for the task at hand. Copyright © 2008 Royal Meteorological Society
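The article's measures are not defined in the abstract; one of the simplest for probabilistic forecasts such as ECMWF ensemble precipitation products is the Brier score, sketched here with invented forecast values:

```python
def brier_score(forecast_probs, outcomes):
    """Mean squared difference between forecast probabilities and 0/1 outcomes
    (0 = perfect; always forecasting 0.5 scores 0.25)."""
    pairs = list(zip(forecast_probs, outcomes))
    return sum((p - o) ** 2 for p, o in pairs) / len(pairs)

# Hypothetical daily probabilities of exceeding a rainfall threshold
probs = [0.9, 0.7, 0.2, 0.1]
observed = [1, 1, 0, 1]  # 1 = threshold exceeded
print(brier_score(probs, observed))
```

The six-step approach the article advocates would then ask, among other things, what range this score can take, what a climatological baseline scores, and whether the measure rewards the behaviour that matters for the flood-forecasting task.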
Abstract:
This paper presents a summary of the work done within the European Union's Seventh Framework Programme project ECLIPSE (Evaluating the Climate and Air Quality Impacts of Short-Lived Pollutants). ECLIPSE had a unique systematic concept for designing a realistic and effective mitigation scenario for short-lived climate pollutants (SLCPs; methane, aerosols and ozone, and their precursor species) and quantifying its climate and air quality impacts, and this paper presents the results in the context of this overarching strategy. The first step in ECLIPSE was to create a new emission inventory based on current legislation (CLE) for the recent past and until 2050. Substantial progress compared to previous work was made by including previously unaccounted types of sources such as flaring of gas associated with oil production, and wick lamps. These emission data were used for present-day reference simulations with four advanced Earth system models (ESMs) and six chemistry transport models (CTMs). The model simulations were compared with a variety of ground-based and satellite observational data sets from Asia, Europe and the Arctic. It was found that the models still underestimate the measured seasonality of aerosols in the Arctic but to a lesser extent than in previous studies. Problems likely related to the emissions were identified for northern Russia and India, in particular. To estimate the climate impacts of SLCPs, ECLIPSE followed two paths of research: the first path calculated radiative forcing (RF) values for a large matrix of SLCP species emissions, for different seasons and regions independently. Based on these RF calculations, the Global Temperature change Potential metric for a time horizon of 20 years (GTP20) was calculated for each SLCP emission type. This climate metric was then used in an integrated assessment model to identify all emission mitigation measures with a beneficial air quality and short-term (20-year) climate impact. 
These measures together defined a SLCP mitigation (MIT) scenario. Compared to CLE, the MIT scenario would reduce global methane (CH4) and black carbon (BC) emissions by about 50 and 80 %, respectively. For CH4, measures on shale gas production, waste management and coal mines were most important. For non-CH4 SLCPs, elimination of high-emitting vehicles and wick lamps, as well as reducing emissions from gas flaring, coal and biomass stoves, agricultural waste, solvents and diesel engines were most important. These measures lead to large reductions in calculated surface concentrations of ozone and particulate matter. We estimate that in the EU, the loss of statistical life expectancy due to air pollution was 7.5 months in 2010, which will be reduced to 5.2 months by 2030 in the CLE scenario. The MIT scenario would reduce this value by another 0.9 to 4.3 months. Substantially larger reductions due to the mitigation are found for China (1.8 months) and India (11–12 months). The climate metrics cannot fully quantify the climate response. Therefore, a second research path was taken. Transient climate ensemble simulations with the four ESMs were run for the CLE and MIT scenarios, to determine the climate impacts of the mitigation. In these simulations, the CLE scenario resulted in a surface temperature increase of 0.70 ± 0.14 K between the years 2006 and 2050. For the decade 2041–2050, the warming was reduced by 0.22 ± 0.07 K in the MIT scenario, and this result was in almost exact agreement with the response calculated based on the emission metrics (reduced warming of 0.22 ± 0.09 K). The metrics calculations suggest that non-CH4 SLCPs contribute ~ 22 % to this response and CH4 78 %. This could not be fully confirmed by the transient simulations, which attributed about 90 % of the temperature response to CH4 reductions. 
Attribution of the observed temperature response to non-CH4 SLCP emission reductions, and to BC specifically, is hampered in the transient simulations by the small forcing and the co-emitted species of the chosen emission basket. Nevertheless, an important conclusion is that our mitigation basket as a whole would lead to clear benefits for both air quality and climate. The climate response from BC reductions in our study is smaller than reported previously, possibly because our study is one of the first to use fully coupled climate models, in which unforced variability and sea ice responses cause relatively strong temperature fluctuations that may counteract (and thus mask) the impacts of small emission reductions. The temperature responses to the mitigation were generally stronger over the continents than over the oceans, and largest over the Arctic, with a warming reduction of 0.44 (0.39–0.49) K. Our calculations suggest particularly beneficial climate responses in southern Europe, where surface warming was reduced by about 0.3 K and precipitation rates were increased by about 15 (6–21) mm yr−1 (more than 4 % of total precipitation) from spring to autumn. Thus, the mitigation could help to alleviate expected future drought and water shortages in the Mediterranean area. We also report other important results of the ECLIPSE project.
Abstract:
While several privacy protection techniques are presented in the literature, they are not complemented with an established objective evaluation method for their assessment and comparison. This paper proposes an annotation-free evaluation method that assesses the two key aspects of privacy protection: privacy and utility. Unlike some existing methods, the proposed method does not rely on the use of subjective judgements and does not assume a specific target type in the image data. The privacy aspect is quantified as an appearance similarity and the utility aspect is measured as a structural similarity between the original raw image data and the privacy-protected image data. We performed an extensive experimentation using six challenging datasets (including two new ones) to demonstrate the effectiveness of the evaluation method by providing a performance comparison of four state-of-the-art privacy protection techniques.
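The paper's exact similarity measures are not specified in the abstract; a single-window SSIM-style score, using the standard SSIM constants and treating an image as a flat list of grayscale pixels (a simplification of the windowed SSIM typically used for structural similarity), might look like:

```python
def ssim_global(x, y, data_range=255.0):
    """Single-window SSIM between two equal-length grayscale pixel sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    c1, c2 = (0.01 * data_range) ** 2, (0.03 * data_range) ** 2  # standard SSIM constants
    return ((2 * mx * my + c1) * (2 * cov + c2)) / ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))

original = [10, 50, 200, 30, 120, 90]   # toy 'raw' pixels
protected = [40, 60, 110, 80, 95, 100]  # toy heavily blurred version
print(ssim_global(original, original))  # identical images score 1.0
print(ssim_global(original, protected))
```

A utility score computed this way falls as the protection distorts image structure, which is the trade-off against the appearance-similarity privacy score that the evaluation method quantifies.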
Abstract:
The aim of this paper is to evaluate the performance of two divergent methods for delineating commuting regions, also called labour market areas, in a situation where the base spatial units differ greatly in size as a result of an irregular population distribution. Commuting patterns in Sweden have been analyzed with geographical information system technology by delineating commuting regions using two regionalization methods. One, a rule-based method, uses one-way commuting flows to delineate local labour market areas in a top-down procedure based on the selection of predefined employment centres. The other, the interaction-based Intramax analysis, uses two-way flows in a bottom-up procedure based on numerical taxonomy principles. A comparison of these methods exposes a number of strengths and weaknesses. For both methods, the same data source has been used. The performance of both methods has been evaluated for the country as a whole using resident employed population, self-containment levels and job ratios as criteria. A more detailed evaluation has been carried out for the Goteborg metropolitan area by comparing regional patterns with the commuting fields of a number of urban centres in the area. It is concluded that both methods could benefit from the inclusion of additional control measures to identify improper allocations of municipalities.
Abstract:
In the past, the focus of drainage design was on sizing pipes and storage to provide sufficient network capacity. This traditional approach, together with computer software and technical guidance, was successful for many years. However, due to rapid population growth and urbanisation, the requirements of a "good" drainage design have changed significantly. In addition to water management, other aspects such as environmental impacts, amenity value and carbon footprint have to be considered during the design process. Going forward, the key sustainability issues need to be addressed carefully and practically. The key challenge in moving from simple objectives (e.g. capacity and cost) to complex objectives (e.g. capacity, flood risk, environment, amenity, etc.) is striking a balance between the various objectives and justifying potential benefits and compromises. To assist decision makers, we developed a new decision support system for drainage design. The system consists of two main components: a multi-criteria evaluation framework for drainage systems and a multi-objective optimisation tool. The evaluation framework is used to quantify the performance, life-cycle costs and benefits of different drainage systems. The optimisation tool searches for feasible combinations of design parameters, such as the sizes, order and type of drainage components, that maximise multiple benefits. In this paper, we discuss real-world applications of the decision support system. A number of case studies have been developed based on recent drainage projects in China. We use the case studies to illustrate how the evaluation framework highlights and compares the pros and cons of various design options, and discuss how the design parameters can be optimised based on the preferences of decision makers. The work described here is the output of an EngD project funded by EPSRC and XP Solutions.
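The optimisation tool searches for design combinations that maximise multiple benefits at once; at the heart of any such multi-objective search is a non-domination test. A minimal sketch, with hypothetical design names and objective scores (both objectives to be maximised):

```python
def dominates(a, b):
    """True if objective vector a is >= b everywhere and > b somewhere (maximising)."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(designs):
    """Keep the designs not dominated by any other; designs are (name, objectives)."""
    return [d for d in designs
            if not any(dominates(other[1], d[1]) for other in designs if other is not d)]

# Hypothetical drainage options scored on (flood-risk reduction, amenity value)
options = [("pipes+tank", (0.9, 0.2)), ("green roof", (0.5, 0.8)), ("do little", (0.4, 0.1))]
print(pareto_front(options))  # "do little" is dominated and drops out
```

Presenting the surviving non-dominated options, rather than a single "best" design, is what lets decision makers apply their own preferences across the competing objectives.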