908 results for Willingness to pay for risk reduction
Abstract:
This thesis estimates the willingness to pay of visitors to the Peace & Love festival in 2011. Using survey data based on revealed and stated preferences, a regression analysis is presented with various independent variables characterizing a festival visitor. Total budget is the dependent variable in the regression analysis and is interpreted in the thesis as equivalent to the visitors' willingness to pay. The analysis shows that men on average spend 301 kronor more than women, that tourists on average spend 1,124 kronor more than non-tourists, and that the average visitor has a willingness to pay of 4,183 kronor. An estimated consumer surplus was also valued, amounting to 743 kronor per person and roughly 37 million kronor in total for the 50,000 festival visitors. The thesis does not account for the economic effects the festival has on the city of Borlänge.
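The aggregate surplus figure in the abstract follows directly from scaling the per-person surplus to the stated number of visitors; a minimal sketch of that arithmetic (all numbers taken from the abstract):

```python
# Per-person consumer surplus and visitor count as reported in the abstract
per_person_surplus_sek = 743
visitors = 50_000

total_surplus_sek = per_person_surplus_sek * visitors
print(total_surplus_sek)  # 37150000, i.e. roughly 37 million kronor
```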
Abstract:
In this dissertation I study the development of urban areas. At the aggregate level I investigate how they may be affected by climate change policies and by being designated the seat of governmental power. At the household level I study with coauthors how microfinance could improve the health of urban residents. In Chapter 1, I investigate how local employment may be affected by electricity price increases, which is a likely consequence of climate change policies. I outline how previous studies that find large, negative effects may be biased. To overcome these biases I develop a novel estimation strategy that blends border-pair regressions with the synthetic control methodology. I show the conditions for consistent estimation. Using this estimator, I find no effect of contemporaneous price changes on employment. Consistent with the longer time-frame for manufacturing decisions, I do find evidence for negative effects from perceived permanent price shocks. These estimates are much smaller than previous research has found. National capital cities are often substantially larger than other cities in their countries. In Chapter 2, I investigate whether there is a causal effect from being a capital by studying the 1960 relocation of the Brazilian capital from Rio de Janeiro to Brasília. Using a synthetic controls strategy I find that losing the capital had no significant effects on Rio de Janeiro in terms of population, employment, or gross domestic product (GDP). I find that Brasília experienced large and significant increases in population, employment, and GDP. I find evidence of large spillovers from the public to the private sector. Chapter 3 investigates how microfinance could increase the uptake of costly health goods. We study the effect of time payments (micro-loans or micro-savings) on willingness-to-pay (WTP) for a water filter among households in the slums of Dhaka, Bangladesh. 
We find that time payments significantly increase WTP: compared to a lump-sum up-front purchase, median WTP increases 83% with a six-month loan and 115% with a 12-month loan. We find that households are quite patient with respect to consumption of health inputs. We find evidence for the presence of credit and savings constraints.
Abstract:
Cranial cruciate ligament (CCL) deficiency is the leading cause of lameness affecting the stifle joints of large-breed dogs, especially Labrador Retrievers. Although CCL disease has been studied extensively, its exact pathogenesis and the primary cause leading to CCL rupture remain controversial. However, weakening secondary to repetitive microtrauma is currently believed to cause the majority of CCL instabilities diagnosed in dogs. Gait analysis techniques have become the most productive tools to investigate normal and pathological gait in human and veterinary subjects. The inverse dynamics analysis approach models the limb as a series of connected linkages and integrates morphometric data to yield information about the net joint moment, patterns of muscle power, and joint reaction forces. The results of these studies have greatly advanced our understanding of the pathogenesis of joint diseases in humans. A muscular imbalance between the hamstring and quadriceps muscles has been suggested as a cause of anterior cruciate ligament rupture in female athletes. Based on these findings, neuromuscular training programs leading to a relative risk reduction of up to 80% have been designed. In spite of the cost and morbidity associated with CCL disease and its management, very few studies have focused on inverse dynamics gait analysis of this condition in dogs. The general goals of this research were (1) to further define gait mechanics in Labrador Retrievers with and without CCL deficiency, (2) to identify individual dogs that are susceptible to CCL disease, and (3) to characterize their gait. The mass, location of the center of mass (COM), and mass moment of inertia of hind limb segments were calculated using a noninvasive method based on computed tomography of normal and CCL-deficient Labrador Retrievers.
Regression models were developed to determine predictive equations to estimate body segment parameters on the basis of simple morphometric measurements, providing a basis for nonterminal studies of inverse dynamics of the hind limbs in Labrador Retrievers. Kinematic, ground reaction force (GRF), and morphometric data were combined in an inverse dynamics approach to compute hock, stifle, and hip net moments, powers, and joint reaction forces (JRF) during trotting in normal, CCL-deficient, or sound contralateral limbs. Reductions in joint moment, power, and loads observed in CCL-deficient limbs were interpreted as modifications adopted to reduce or avoid painful mobilization of the injured stifle joint. Lameness resulting from CCL disease affected predominantly reaction forces during the braking phase and extension during push-off. Kinetics also identified a greater joint moment and power of the contralateral limbs compared with normal, particularly of the stifle extensor muscle group, which may correlate with the lameness observed, but also with the predisposition of contralateral limbs to CCL deficiency in dogs. For the first time, surface EMG patterns of major hind limb muscles during trotting gait of healthy Labrador Retrievers were characterized and compared with kinetic and kinematic data of the stifle joint. The use of surface EMG highlighted the co-contraction patterns of the muscles around the stifle joint, which were documented during transition periods between flexion and extension of the joint, but also during the flexion observed in the weight-bearing phase. Identification of possible differences in EMG activation characteristics between healthy patients and dogs with, or predisposed to, orthopedic and neurological disease may help in understanding the neuromuscular abnormalities and gait mechanics of such disorders in the future.
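The inverse dynamics step described above reduces, for each segment, to the Newton-Euler equations: the net joint force and moment are whatever is needed to balance gravity, the GRF, and the segment's measured accelerations. A single-segment 2D sketch follows; all masses, geometry, and forces are made-up illustrative numbers, not data from the study:

```python
import numpy as np

def cross2(a, b):
    # z-component of the 2D cross product
    return a[0] * b[1] - a[1] * b[0]

def net_joint_load(m, I, r_com, a_com, alpha, r_joint, r_cop, f_grf,
                   g=np.array([0.0, -9.81])):
    """Net joint reaction force and moment for one distal segment (2D).

    m, I     -- segment mass and moment of inertia about the COM
    r_com    -- COM position; a_com, alpha -- linear and angular acceleration
    r_joint  -- proximal joint centre; r_cop -- centre of pressure
    f_grf    -- ground reaction force acting at the COP
    """
    f_joint = m * a_com - m * g - f_grf                # Newton: sum of forces = m*a
    m_joint = (I * alpha
               - cross2(r_cop - r_com, f_grf)          # moment of GRF about COM
               - cross2(r_joint - r_com, f_joint))     # moment of joint force about COM
    return f_joint, m_joint

# quasi-static mid-stance example (hypothetical values, SI units)
f, mom = net_joint_load(m=0.5, I=0.001,
                        r_com=np.array([0.05, 0.05]),
                        a_com=np.array([0.0, 0.0]), alpha=0.0,
                        r_joint=np.array([0.0, 0.1]),
                        r_cop=np.array([0.1, 0.0]),
                        f_grf=np.array([0.0, 100.0]))
```

Chaining this calculation from the distal segment upward (hock, then stifle, then hip) is what yields the joint-by-joint moments and powers reported in the abstract.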
Conformation parameters, obtained from femoral and tibial radiographs, hind limb CT images, and dual-energy X-ray absorptiometry, of hind limbs predisposed to CCL deficiency were compared with the conformation parameters of hind limbs at low risk. Using a receiver operating characteristic curve analysis, a combination of tibial plateau angle and femoral anteversion angle measured on radiographs was determined optimal for discriminating predisposed from non-predisposed limbs for CCL disease in Labrador Retrievers. In the future, the tibial plateau angle (TPA) and femoral anteversion angle (FAA) may be used to screen dogs suspected of being susceptible to CCL disease. Lastly, kinematics and kinetics across the hock, stifle, and hip joints in Labrador Retrievers presumed to be at low risk based on their radiographic TPA and FAA were compared to gait data from dogs presumed to be predisposed to CCL disease for overground and treadmill trotting gait. For overground trials, extensor moment at the hock and energy generated around the hock and stifle joints were increased in predisposed limbs compared to non-predisposed limbs. For treadmill trials, dogs classified as predisposed to CCL disease held their stifle at a greater degree of flexion, extended their hock less, and generated more energy around the stifle joints while trotting on a treadmill compared with dogs at low risk. This characterization of the gait mechanics of Labrador Retrievers at low risk of, or predisposed to, CCL disease may help in developing and monitoring preventive exercise programs to decrease gastrocnemius dominance and strengthen the hamstring muscle group.
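The discrimination step described above can be sketched as a ROC analysis on a score combining the two angles. The dogs, angles, and equal-weight score below are invented for illustration (the study would have estimated weights from its own data); the AUC is computed with the rank (Mann-Whitney) identity:

```python
def roc_auc(scores, labels):
    """AUC as the probability that a randomly chosen positive case
    scores higher than a randomly chosen negative case (ties count 0.5)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# hypothetical dogs: (TPA degrees, FAA degrees, predisposed?)
dogs = [(24, 27, 0), (23, 25, 0), (26, 30, 0),
        (30, 34, 1), (29, 36, 1), (31, 33, 1)]

# combined score: an assumed equal-weight sum of the two angles
scores = [tpa + faa for tpa, faa, _ in dogs]
labels = [y for _, _, y in dogs]
auc = roc_auc(scores, labels)
print(auc)  # 1.0 for these perfectly separated toy data
```

Sweeping a threshold over such a score, and choosing the point that best trades sensitivity against specificity, is how a screening cutoff for TPA/FAA could be selected.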
TAKING THE PERSPECTIVE OF A SELLER AND A BUYER: IMPLICATIONS FOR PRICE ELICITATION AND PRICE FRAMING
Abstract:
This dissertation consists of two essays which investigate how assuming the role of a seller or a buyer affects valuations in a price elicitation task (essay I) and how different presentations of an equivalent price affect evaluations when a consumer plays the dual roles of a buyer and a seller in transactions involving trade-ins (essay II). Sellers’ willingness to accept (WTA) to give up a good is typically higher than buyers' willingness to pay (WTP) to obtain the good. Essay I proposes that valuation processes of sellers and buyers are guided by a motivational orientation of “getting the best.” For a seller (buyer) indicating WTA (WTP), getting the best implies receiving as much as possible to give up a specific good (giving up as little as possible to get the specific good). Results of six studies suggest that the WTA-WTP elicitation task activates different directional goals, leading to the WTA-WTP disparity. The different directional goals lead sellers and buyers to focus on different aspects and bias their cognitive reasoning and interpretation of information. By connecting the valuation process to the general motivation of getting the best, this research provides a unifying framework to explain the disparate interpretations of the WTA-WTP disparity. Many new purchases and replacement decisions involve consumers’ trading in their old products. In such transactions, the overall exchange may be priced either as separate transactions (partitioned) with price tags for the payment and the receipt or as a single net price (consolidated) which takes into account the value of the trade-in. Essay II examines whether consumers prefer a partitioned price versus a consolidated price presentation. The findings suggest that when consumers are trading in a product which has a low value relative to the price of a new product, they prefer a consolidated price. In contrast, when trading in a product which has high value, they prefer a partitioned price. 
The results suggest that consumers use the price of the new product as an anchor to evaluate the trade-in value, and the perception of the trade-in value influences the overall evaluation especially when the transaction is partitioned.
Abstract:
Integrated production (IP) is part of a Brazilian government program to promote sustainable agricultural production. IP ensures minimum food quality standards for the domestic market and for export. Furthermore, IP is considered a good option for reducing the negative environmental impacts of intensive cropping in tropical savannas, including common beans, a Brazilian staple food. Despite these advantages, and the government's efforts to promote IP, few growers are adopting it, perhaps because perceptions of IP's usefulness and/or its ease of use remain unclear. Moreover, the production sector is driven by market signals, and there is little information on consumer preferences toward IP-certified products in Brazil. In this study, we sought to identify critical factors that can influence IP adoption in bean production, and to examine consumers' perceptions of, and intention to purchase, IP-certified beans (a hypothetical product). This report comprises four chapters: (1) an introduction illustrating the context in which the research was based; (2) results of a study of IP adoption based on the Technology Acceptance Model (TAM); (3) results of a choice experiment applied to identify consumer preferences and willingness to pay (WTP) for an IP label; and (4) results of the Theory of Planned Behaviour (TPB) applied to identify consumers' perceptions of IP-certified beans. This research contributes rich information for the bean supply chain, providing insights for growers, retailers, and other agents, including policy makers. The bean production sector appears positively inclined to adopt IP, but further studies should test other adoption indicators within the TAM framework. Surveyed consumers are willing to pay a premium for IP-labelled beans and showed a positive attitude toward purchasing them, which is important information to motivate the production sector to offer certified beans to the market.
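In choice experiments of the kind mentioned above, the marginal WTP for a label is commonly recovered as the ratio of the label's utility coefficient to the negative of the price coefficient from a conditional logit model. A minimal sketch with invented coefficients (not the study's estimates):

```python
# Hypothetical conditional logit coefficients from a choice experiment
beta_ip_label = 0.8   # utility contribution of the IP certification label
beta_price = -0.4     # marginal utility of price (negative: dearer is worse)

# Marginal willingness to pay for the label, in the survey's currency unit:
# the price increase that exactly offsets the label's utility gain.
wtp_ip_label = -beta_ip_label / beta_price
print(wtp_ip_label)  # 2.0 currency units premium per package
```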
Abstract:
Heavily used and highly valuable, the Florida Reef is one of the world's most threatened ecosystems. Stakeholders from a densely urbanized coastal region in proximity to the reef system recognize its degradation, but their comprehension of climate change and commitment to pay for sustainable management and research funding have been opaque. With an emphasis on recreational anglers, residential stakeholders were surveyed online about their marine activities, perceptions of resources and threats, and willingness to pay (WTP) for dedicated coral reef research funding in Florida. The majority of stakeholders are wealthy, well educated, and politically independent. Supermajorities favored the two scenarios of taxation for a Florida Coral Reef Research Fund, and the scenario with matching federal funds earned higher support. In regression analyses, several factors emerged as significant contributors to stakeholders' preferences; the four recurring factors in extended models were prioritizing the environment over the economy, donating to environmental causes, concern about coral reefs, and concern about climate change, with the latter indicating a recent shift of opinion. Status in terms of income and education was found insignificant, and surprisingly, income was negatively correlated with WTP. Perceptions through lenses of environmental and emotional attachment appear to overwhelm conventional status-based factors. Applied statewide, the first scenario's extrapolated WTP (based on a sales tax rate of 2.9%) would generate $675 million annually, and the extrapolated WTP under the second scenario, with matching federal funds (based on a sales tax rate of 3.0%), would generate $1.4 billion. Keywords: willingness to pay, coral reef research, taxation, climate change, stakeholder, perceptions, Florida Reef, recreational fishing, anglers
Abstract:
The value of a seasonal forecasting system based on phases of the Southern Oscillation was estimated for a representative dryland wheat grower in the vicinity of Goondiwindi. In particular, the effects of risk attitude and planting conditions on this estimate were examined. A recursive stochastic programming approach was used to identify the grower's utility-maximising action set in the event of each of the climate patterns over the period 1894-1991 recurring in the coming season. The approach was repeated with and without use of the forecasts. The choices examined were, at planting, nitrogen application rate and cultivar and, later in the season, whether to proceed with or abandon each wheat activity. The value of the forecasting system was estimated as the maximum amount the grower could afford to pay for its use without expected utility falling below that of non-use.
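The valuation rule stated above (the largest fee that leaves expected utility no lower than without the forecast) can be sketched with an exponential (CARA) utility and bisection over the fee. The per-season profits and risk-aversion coefficient are illustrative assumptions, not the study's data:

```python
import math

def expected_utility(profits, fee=0.0, risk_aversion=0.001):
    # CARA utility, averaged over equally likely historical seasons
    return sum(-math.exp(-risk_aversion * (p - fee)) for p in profits) / len(profits)

# hypothetical per-season profits ($/ha): with the forecast the grower
# tailors cultivar and nitrogen rate to the forecast phase each year
profit_with_forecast = [900, 1300, 800, 1500]
profit_without = [700, 1100, 600, 1300]

# bisect for the fee at which the two expected utilities are equal
lo, hi = 0.0, 1000.0
for _ in range(60):
    mid = (lo + hi) / 2
    if expected_utility(profit_with_forecast, fee=mid) >= expected_utility(profit_without):
        lo = mid   # grower can still afford at least this fee
    else:
        hi = mid
forecast_value = lo
```

For these toy numbers, where the forecast adds exactly $200/ha in every season, the break-even fee converges to 200; with correlated or uneven gains the fee would also reflect the grower's risk attitude.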
Abstract:
In the current climate context, Mediterranean regions are experiencing an intensification of extreme hydrometeorological events. In Morocco, flood risk has become problematic, as communities are vulnerable to extreme events; rapid and poorly controlled economic and urban development is increasing exposure to such phenomena. The Swiss Agency for Development and Cooperation (SDC) is actively involved in natural risk reduction in Morocco. Hazard mapping, and its integration into land-use planning, is an effective method for reducing spatial vulnerability. The SDC therefore mandated this project to adapt the Swiss hazard-mapping method to a Moroccan case study (the city of Beni Mellal, Tadla-Azilal region, Morocco). The Swiss method was adapted to the specific constraints of the terrain (semi-arid environment, piedmont morphology) and to the knowledge-transfer context (socio-economic characteristics and practices). A map of flood phenomena was produced, containing the morphological evidence and the anthropogenic elements relevant to the development and aggravation of floods. Rainfall-runoff modelling for reference events, and routing of the resulting flood hydrographs, allowed a quantitative estimate of the flood hazard. Data obtained in the field (discharge estimates, extent of known floods) were used to verify the model results. Intensity and probability maps were obtained. Finally, an indicative flood danger map was produced on the basis of the Swiss hazard matrix, which crosses the intensity of an event with its probability of occurrence to obtain danger degrees assignable to the studied territory.
With a view to implementing the danger maps in land-use planning documents, we examine the current institutional risk management in Beni Mellal, studying the degree of integration of management and the way knowledge about risks influences the management process. The analysis shows that management is marked by a hierarchical logic and by the priority given to protective measures over passive land-use planning measures. Knowledge about risk remains sectoral and often disconnected. Innovation in risk management arises from horizontal collaboration between actors or with external knowledge sources (for example, universities). Methodological and institutional recommendations resulting from this study were addressed to managers with a view to implementing the danger maps. More than risk reduction tools, danger maps help transmit knowledge to the public and thus contribute to establishing a risk culture. - Severe rainfall events are thought to be occurring more frequently in semi-arid areas. In Morocco, flood hazard has become an important topic, notably as rapid economic development and high urbanization rates have increased the exposure of people and assets in hazard-prone areas. The Swiss Agency for Development and Cooperation (SDC) is active in natural hazard mitigation in Morocco. As hazard mapping for urban planning is thought to be a sound tool for vulnerability reduction, the SDC has financed a project aimed at adapting the Swiss approach for hazard assessment and mapping to the case of Morocco. In a knowledge-transfer context, the Swiss method was adapted to the semi-arid environment, the specific piedmont morphology, and the socio-economic constraints particular to the study site.
Following the Swiss guidelines, a hydro-geomorphological map was established, containing all geomorphic elements related to known past floods. Next, rainfall/runoff modeling for reference events and hydraulic routing of the obtained hydrographs were carried out in order to assess the hazard quantitatively. Field-collected discharge estimations and the extent of known floods were used to verify the model results. Flood hazard intensity and probability maps were obtained. Finally, an indicative danger map, as defined within the Swiss hazard assessment terminology, was calculated using the Swiss hazard matrix, which convolves flood intensity with its recurrence probability in order to assign flood danger degrees to the concerned territory. Danger maps become effective as risk mitigation tools when implemented in urban planning. We focus on how local authorities are involved in the risk management process and how knowledge about risk impacts management. An institutional vulnerability "map" was established based on individual interviews held with the main institutional actors in flood management. Results show that flood hazard management is defined by uneven actions and relationships, is based on top-down decision-making patterns, and maintains its focus on active mitigation measures. The institutional actors embody sectorial, often disconnected risk knowledge pools, whose relationships are dictated by the institutional hierarchy. Results show that innovation in the risk management process emerges when actors collaborate despite the established hierarchy or when they open up to outside knowledge pools (e.g., academia). Several methodological and institutional recommendations were addressed to risk management stakeholders in view of the potential implementation of the maps in planning.
Hazard assessment and mapping are essential to an integrated risk management approach: more than mitigation tools, danger maps allow communication about hazards and help establish a risk culture.
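The hazard matrix described above assigns a danger degree by crossing an event's intensity class with its probability class. The table below is a simplified, illustrative stand-in with made-up class labels and degrees, only mimicking the structure of the official Swiss matrix:

```python
# Illustrative hazard matrix: (intensity, probability class) -> danger degree.
# "high" probability means a frequent event, "low" a rare one. The real Swiss
# guidance uses calibrated intensity/return-period thresholds and color codes.
DANGER_MATRIX = {
    ("high",   "high"):   "high",
    ("high",   "medium"): "high",
    ("high",   "low"):    "medium",
    ("medium", "high"):   "medium",
    ("medium", "medium"): "medium",
    ("medium", "low"):    "low",
    ("low",    "high"):   "low",
    ("low",    "medium"): "low",
    ("low",    "low"):    "residual",
}

def danger_degree(intensity, probability):
    """Look up the danger degree for one mapped zone."""
    return DANGER_MATRIX[(intensity, probability)]

print(danger_degree("high", "low"))  # medium: intense but rare flooding
```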
Abstract:
In recent years, many studies have highlighted the toxic effects of organic micropollutants on the species of our lakes and rivers. Most of these studies, however, focused on the toxicity of individual substances, whereas organisms are exposed every day to thousands of substances in mixture, and the effects of these cocktails are not negligible. This doctoral thesis therefore addressed models for predicting the environmental risk of such cocktails to the aquatic environment. The main objective was to assess the ecological risk of the mixtures of chemical substances measured in Lake Geneva, but also to take a critical look at the methodologies used and to propose adaptations for better risk estimation. In the first part of this work, the risk of mixtures of pesticides and pharmaceuticals for the Rhône and for Lake Geneva was established using approaches envisioned notably in European legislation. These are screening approaches, i.e. they allow a general assessment of the risk of mixtures. Such an approach makes it possible to highlight the most problematic substances, i.e. those contributing most to the toxicity of the mixture; in our case, essentially four pesticides. The study also shows that all substances, even in trace amounts, contribute to the effect of the mixture. This finding has implications for environmental management: it implies that all sources of pollutants must be reduced, not only the most problematic ones. The proposed approach, however, also has an important conceptual bias, which makes its use questionable beyond screening and would require adaptation of the safety factors employed.
In a second part, the study focused on the use of mixture models in environmental risk calculation. Mixture models were developed and validated species by species, not for an assessment of the ecosystem as a whole. Their use should therefore proceed through a calculation per species, which is rarely done owing to the lack of available ecotoxicological data. The aim was therefore to compare, using randomly generated values, the risk calculated by the rigorous species-by-species method with the risk calculated in the classical way, where the models are applied to the whole community without accounting for inter-species variation. The results are similar in the majority of cases, which validates the traditional approach; however, this work identified certain cases where the classical application can lead to an under- or overestimation of the risk. Finally, the last part of this thesis examined the influence that cocktails of micropollutants may have had on communities in situ. A two-step approach was adopted. First, the toxicity of fourteen herbicides detected in Lake Geneva was determined. Over the study period, 2004 to 2009, this herbicide toxicity decreased, from 4% of species affected to less than 1%. The question was then whether this decrease in toxicity had an impact on the development of certain species within the algal community. Statistical analysis made it possible to isolate other factors that could influence the flora, such as water temperature or the presence of phosphates, and thus to identify which species turned out to have been influenced, positively or negatively, by the decrease of toxicity in the lake over time.
Interestingly, some of these species had already shown similar behaviour in mesocosm studies. In conclusion, this work shows that robust models exist for predicting the risk of micropollutant mixtures to aquatic species, and that they can be used to explain the role of these substances in ecosystem functioning. These models nevertheless have limits and underlying assumptions that are important to consider when applying them. - For several years, the risks that organic micropollutants pose to the aquatic environment have been of great concern to scientists and to our society. Many studies have highlighted the toxic effects these chemical substances can have on the species of our lakes and rivers when exposed to acute or chronic concentrations. Most of these studies, however, focused on the toxicity of individual substances, i.e. considered separately, as do the current European regulatory procedures for the environmental risk assessment of a substance. Yet organisms are exposed every day to thousands of substances in mixture, and the effects of these "cocktails" are not negligible. The ecological risk assessment of such mixtures must therefore be addressed in the most appropriate and reliable way possible. In the first part of this thesis, we examined the methods currently envisioned for integration into European legislation for assessing the risk of mixtures to the aquatic environment.
These methods are based on the concentration addition model, using either the concentrations of substances estimated to have no effect in the environment (PNEC) or the effect concentrations (EC50) for certain species of a trophic level, with safety factors taken into account. We applied these methods to two specific cases, Lake Geneva and the Rhône in Switzerland, and discussed the results of these applications. These first assessment steps showed that the mixture risk for these case studies quickly exceeds a critical threshold, generally because of two or three main substances. The proposed procedures therefore make it possible to identify the most problematic substances, for which management measures, such as reducing their input to the aquatic environment, should be envisioned. However, we also found that the risk level associated with these mixtures of substances is not negligible even without accounting for the main substances: the accumulation of substances, even in trace amounts, reaches a critical threshold, which is more difficult to handle in terms of risk management. Moreover, we highlighted a lack of reliability in these procedures, which can lead to contradictory risk results; this is linked to the incompatibility of the safety factors used in the different methods. In the second part of the thesis, we studied the reliability of more advanced methods for predicting the effect of mixtures on the communities of the aquatic system. These methods rely on the concentration addition (CA) or response addition (RA) models applied to species sensitivity distribution (SSD) curves.
Mixture models were developed and validated for application species by species, not on several species aggregated simultaneously in SSD curves. We therefore proposed a more rigorous procedure for assessing the risk of a mixture: apply the CA or RA models first to each species separately and, in a second step, combine the results to build an SSD curve for the mixture. Unfortunately, this method is not applicable in most cases, as it requires more data than is generally available. We therefore compared, using randomly generated values, the risk calculated by this more rigorous method with the traditional calculation, in order to characterize the robustness of the approach of applying mixture models directly to SSD curves. Our results showed that applying CA directly to SSDs can lead to an underestimation of the mixture concentration affecting 5% or 50% of species, in particular when the substances have a large standard deviation in their species sensitivity distribution. Applying the RA model can lead to over- or underestimation, depending mainly on the slope of the dose-response curves of the individual species composing the SSDs. Underestimation with RA becomes potentially important when the ratio between the EC50 and the EC10 of the species' dose-response curve is smaller than 100. For most substances in real cases, however, the ecotoxicity data are such that the mixture risk calculated by applying the models directly to SSDs remains consistent and would tend to slightly overestimate the risk. These results thus validate the traditionally used approach.
This source of error should nevertheless be kept in mind when assessing the risk of a mixture with the traditional method, in particular when the SSDs have a data distribution outside the limits determined in this study. Finally, in the last part of this thesis, we confronted mixture effect predictions with biological changes observed in the environment. We used data from the long-term monitoring of a large European lake, Lake Geneva, which offered the possibility of evaluating to what extent the predicted toxicity of herbicide mixtures explained changes in the composition of the phytoplankton community, alongside classical limnological parameters such as nutrients. To this end, we determined the mixture toxicity, over several years, of 14 herbicides regularly detected in the lake, using the CA and RA models with species sensitivity distribution curves. A decreasing temporal gradient of toxicity was observed from 2004 to 2009. Redundancy and partial redundancy analyses showed that this gradient explains a significant part of the variation in phytoplankton community composition, even after removing the effect of all other covariates. Moreover, some species identified as having been influenced, positively or negatively, by the decrease of toxicity in the lake over time showed similar behaviour in mesocosm studies. It can be concluded that herbicide mixture toxicity is one of the key parameters explaining phytoplankton changes in Lake Geneva. In conclusion, various methods exist for predicting the risk of micropollutant mixtures to aquatic species, and this risk can play a role in ecosystem functioning.
However, these models naturally have limits and underlying assumptions that must be considered when applying them, before using their results for environmental risk management. - For several years now, scientists as well as society have been concerned about the aquatic risk that organic micropollutants may pose. Indeed, several studies have shown the toxic effects these substances can induce in organisms living in our lakes and rivers, especially when they are exposed to acute or chronic concentrations. However, most studies have focused on the toxicity of single compounds, i.e. considered individually. The same holds for the current European regulations governing the environmental risk assessment of these substances. Yet aquatic organisms are typically exposed every day, and simultaneously, to thousands of organic compounds. The toxic effects resulting from these "cocktails" cannot be neglected. The ecological risk assessment of mixtures of such compounds must therefore be addressed by scientists in the most reliable and appropriate way. In the first part of this thesis, the procedures currently envisioned for aquatic mixture risk assessment in European legislation are described. These methodologies are based on the mixture model of concentration addition and the use of predicted no-effect concentrations (PNEC) or effect concentrations (EC50) with assessment factors. These principal approaches were applied to two specific case studies, Lake Geneva and the River Rhône in Switzerland, including a discussion of the outcomes of such applications. These first-level assessments showed that the mixture risks for the studied cases rapidly exceeded the critical value, generally because of two or three main substances.
The proposed procedures therefore allow the identification of the most problematic substances, for which management measures, such as reducing their entry into the aquatic environment, should be envisioned. However, it was also shown that the risk levels associated with mixtures of compounds are not negligible even without considering these main substances: it is the sum of the substances that is problematic, which is more challenging in terms of risk management. Moreover, a lack of reliability in the procedures was highlighted, which can lead to contradictory risk results; this is linked to the inconsistent assessment factors applied in the different methods. In the second part of the thesis, the reliability of more advanced procedures to predict the mixture effect on communities in aquatic systems was investigated. These established methodologies combine the model of concentration addition (CA) or response addition (RA) with species sensitivity distribution (SSD) curves. Indeed, the mixture effect predictions were shown to be consistent only when the mixture models are applied to a single species, not to several species simultaneously aggregated into SSDs. Hence, a more stringent procedure for mixture risk assessment is proposed: first apply the CA or RA models to each species separately and, in a second step, combine the results to build an SSD for the mixture. Unfortunately, this methodology is not applicable in most cases, because it requires large data sets that are usually unavailable. Therefore, the differences between the two methodologies were studied with artificially created datasets to characterize the robustness of the traditional approach of applying the models directly to species sensitivity distributions.
The results showed that applying CA directly to SSDs may underestimate the mixture concentration affecting 5% or 50% of species, especially when the substances present a large standard deviation in their species sensitivity distribution. The application of RA can lead to over- or underestimates, depending mainly on the slope of the dose-response curves of the individual species. The potential underestimation with RA becomes important when the ratio between the EC50 and the EC10 of the dose-response curves of the species composing the SSD is smaller than 100. However, for common real cases of ecotoxicity data, the mixture risk calculated by applying the mixture models directly to SSDs remains consistent and would rather slightly overestimate the risk. These results can be used as a theoretical validation of the currently applied methodology. Nevertheless, when assessing the risk of mixtures with this classical methodology, one has to keep this source of error in mind, especially when the SSDs present a data distribution outside the range determined in this study. Finally, in the last part of this thesis, we confronted the mixture effect predictions with biological changes observed in the environment. In this study, long-term monitoring of a great European lake, Lake Geneva, provided the opportunity to assess to what extent the predicted toxicity of herbicide mixtures explains the changes in the composition of the phytoplankton community, next to other classical limnology parameters such as nutrients. To reach this goal, the mixture toxicity of 14 herbicides regularly detected in the lake was calculated over several years, using the concentration addition and response addition models. A decreasing temporal gradient of toxicity was observed from 2004 to 2009.
Redundancy analysis and partial redundancy analysis showed that this gradient explains a significant portion of the variation in phytoplankton community composition, even after removing the effect of all other co-variables. Moreover, some species revealed to be influenced, positively or negatively, by the decrease of toxicity in the lake over time showed similar behavior in mesocosm studies. It could be concluded that herbicide mixture toxicity is one of the key parameters explaining phytoplankton changes in Lake Geneva. To conclude, different methods exist to predict the risk of mixtures in ecosystems, but their reliability varies depending on the underlying hypotheses. One should therefore carefully consider these hypotheses, as well as the limits of the approaches, before using the results for environmental risk management.
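The CA and RA combination rules discussed in this abstract can be sketched numerically. A minimal sketch, assuming log-logistic dose-response curves; the function names, parameter values, and the shared-slope simplification for CA are illustrative, not taken from the thesis:

```python
import numpy as np

def effect(conc, ec50, slope):
    """Log-logistic dose-response: fraction affected at concentration `conc`."""
    return 1.0 / (1.0 + (ec50 / conc) ** slope)

def mixture_effect_ca(concs, ec50s, slope):
    """Concentration addition (CA): sum the toxic units c_i/EC50_i and
    evaluate a shared dose-response curve at that equivalent dose.
    Assumes all components share one slope, a common simplification."""
    toxic_units = np.sum(np.asarray(concs, float) / np.asarray(ec50s, float))
    return 1.0 / (1.0 + (1.0 / toxic_units) ** slope)

def mixture_effect_ra(concs, ec50s, slopes):
    """Response addition (RA, independent action): combine the
    single-substance effects as independent probabilities."""
    single = [effect(c, e, s) for c, e, s in zip(concs, ec50s, slopes)]
    return 1.0 - float(np.prod([1.0 - p for p in single]))

# Two substances, each present at half its EC50: CA predicts exactly 50%
# effect (one toxic unit in total), while RA predicts a value that
# depends on the slopes of the individual curves.
print(mixture_effect_ca([5.0, 5.0], [10.0, 10.0], slope=2.0))
print(mixture_effect_ra([5.0, 5.0], [10.0, 10.0], [2.0, 2.0]))
```

The divergence between the two predictions as the slopes vary is what makes the EC50/EC10 ratio mentioned above a useful diagnostic for when RA applied to SSDs becomes unreliable.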
Resumo:
Aim: To determine the effects of imperfect adherence (i.e. occasionally missing prescribed doses), and the influence of the rate of loss of antihypertensive effect during treatment interruption, on the predicted clinical effectiveness of antihypertensive drugs in reducing mean systolic blood pressure (SBP) and cardiovascular disease (CVD) risk. Method: The effects of imperfect adherence to antihypertensive treatment regimens were estimated using published patterns of missed doses, taking into account the rate of loss of antihypertensive effect when doses are missed (loss of BP reduction in mmHg/day; the off-rate), which varies between drugs. Outcome measures were the predicted mean SBP reduction and CVD risk, determined from the Framingham Risk Equation for CVD. Results: In patients taking 75% of prescribed doses (typical of clinical practice), only long-acting drugs with an off-rate of ~1 mmHg/day were predicted to maintain almost the full mean SBP-lowering effect throughout the modelled period. In such patients, using shorter-acting drugs (e.g. an off-rate of ~5-6 mmHg/day) was predicted to lead to a clinically relevant loss of mean SBP reduction of > 2 mmHg. This change also influenced the predicted CVD risk reduction; in patients with a baseline 10-year CVD risk of 27.0% who were taking 75% of prescribed doses, a difference in off-rate from 1 to 5 mmHg/day led to a predicted 0.5% absolute increase in 10-year CVD risk. Conclusions: In patients who occasionally miss doses of antihypertensives, modest differences in the rate of loss of antihypertensive effect following treatment interruption may have a clinically relevant impact on SBP reduction and CVD risk. While clinicians must make every effort to counsel and encourage their patients to adhere to their prescribed medication, it may also be prudent to prescribe drugs with a low off-rate to mitigate the potential consequences of missed doses.
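The interaction between adherence and off-rate can be illustrated with a toy simulation. A minimal sketch only: the full-restoration-on-dosing rule, the parameter values, and the function name are simplifying assumptions, not the paper's actual model:

```python
import random

def mean_sbp_reduction(full_effect, off_rate, adherence, days=365, seed=1):
    """Average daily SBP reduction (mmHg) when each dose is taken with
    probability `adherence`. A missed day lets the effect decay by
    `off_rate` mmHg; a taken dose restores the full effect (simplified)."""
    rng = random.Random(seed)
    effect = full_effect
    total = 0.0
    for _ in range(days):
        if rng.random() < adherence:
            effect = full_effect                  # dose taken: full effect
        else:
            effect = max(0.0, effect - off_rate)  # dose missed: effect decays
        total += effect
    return total / days

# At 75% adherence, a slow off-rate (1 mmHg/day) preserves most of a
# 10 mmHg reduction, while a fast off-rate (5 mmHg/day) loses noticeably more.
print(mean_sbp_reduction(10.0, off_rate=1.0, adherence=0.75))
print(mean_sbp_reduction(10.0, off_rate=5.0, adherence=0.75))
```

Even this crude model reproduces the qualitative finding: for the same missed-dose pattern, the mean achieved SBP reduction falls monotonically with the off-rate.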
Resumo:
The purpose of this thesis is to test a model linking community disadvantage and urbanicity factors to parenting variables (i.e., monitoring, warmth, and knowledge) and to youth risk behaviour (i.e., substance use and delinquency), measured both concurrently and one year after the assessment of the parenting variables. The model builds on the work of Fletcher, Steinberg, and Williams-Wheeler (2004) but a) includes a more comprehensive measure of SES than that conceptualized by Fletcher et al.; b) considers whether community disadvantage is indirectly as well as directly linked to youth risk behaviour, by way of its association with parenting variables; and c) considers whether the level of community urbanicity plays a direct role in predicting both parenting variables and risk behaviours, or whether its influence on risk behaviours is primarily indirect through parenting variables. Both community disadvantage and urbanicity had virtually no relation to the parenting and risk behaviour variables. Results for the relations between parenting variables and risk behaviour were similar to those of Fletcher et al. Although urban youth are typically perceived as being more at risk for substance use and delinquency, no evidence was found for a distinction between urban and rural youth within this sample. Targeting risk behaviour prevention/reduction programs toward only urban youth, therefore, is not supported by these findings.
Resumo:
This paper studies the impact of banks' liability for environmental damages caused by their borrowers. Laws or court decisions that declare banks liable for environmental damages have two objectives: (1) finding someone to pay for the damages and (2) exerting pressure on a firm's stakeholders to induce them to invest in environmental risk prevention. We study the effect such legal decisions can have on financing relationships, and especially on the incentives to reduce environmental risk, in an environment where banks cannot commit to refinance the firm in all circumstances. Following an environmental accident, liable banks more readily agree to refinance the firm. We then show that bank liability effectively makes refinancing more attractive to banks, thereby improving the firm's risk-sharing possibilities. Consequently, the firm's incentives to invest in environmental risk reduction are weakened compared to the (bank) no-liability case. We also show that, when banks are liable, the firm invests at the full-commitment optimal level of risk-reduction investment. If there are externalities such that some damages cannot be accounted for, the socially efficient level of investment is greater than the privately optimal one. In that case, making banks non-liable can be socially desirable.
Resumo:
Hope is an important construct in marketing, since it is an antecedent of important marketing variables such as trust, expectation and satisfaction (MacInnis & de Mello, 2005; Almeida, Mazzon & Botelho, 2007). Specifically, the literature suggests that hope can exert an important influence on risk perception (Almeida, 2010; Almeida et al., 2007; Fleming, 2008; MacInnis & de Mello, 2005) and on the propensity to indebtedness (Fleming, 2008). Thus, this thesis investigates the relations among hope, risk perception related to purchasing and consumption, and propensity to indebtedness, by reviewing the existing literature and conducting two empirical studies. The first is a laboratory experiment, which assessed hope and the perceived risk of taking a mortgage loan. The second is a survey investigating university students' propensity to get indebted to pay for their tuition, analyzed through Structural Equation Modeling (SEM). These studies found that hope seems to play an important role in the propensity to indebtedness: higher levels of hope predicted an increase in the propensity to accept the mortgage loan, independent of the actual risks, and an increase in the propensity of college students to get indebted to pay for their studies. In addition, the first study suggests that hope may lead to a decrease in risk perception, which, however, was not confirmed by the second study. Finally, this research offers some methodological contributions: it is the first study to use an experimental method to study hope in Brazil and, worldwide, the first to investigate the relations among hope, risk perception and propensity to indebtedness, which proved to be important influences on consumer behavior.
Resumo:
Introduction: The aim was to confirm that the PSF (probability of stone formation) changed appropriately following medical therapy in recurrent stone formers. Materials and Methods: Data were collected on 26 Brazilian stone formers. A baseline 24-hour urine collection was performed prior to treatment, and details of the medical treatment initiated for stone disease were recorded. A PSF calculation was performed on the 24-hour urine sample using the 7 urinary parameters required: voided volume, oxalate, calcium, urate, pH, citrate and magnesium. A repeat 24-hour urine sample was collected for PSF calculation after treatment, and the PSF scores before and during treatment were compared. Results: At baseline, 20 of the 26 patients (77%) had a high PSF score (> 0.5). Of the 26 patients, 17 (65%) showed an overall reduction in their PSF profiles on a medical treatment regimen. Eleven patients (42%) changed from high risk (PSF > 0.5) to low risk (PSF < 0.5), and 6 patients reduced their risk score but did not change risk category. Six patients (23%) remained in the high-risk category (> 0.5) at both assessments. Conclusions: The PSF score fell following medical treatment in the majority of patients in this cohort.
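The before/after categorisation reported in these results amounts to a simple tally over paired PSF scores. A minimal sketch, assuming the 0.5 threshold stated in the abstract; the function names and the example scores are hypothetical:

```python
def risk_category(psf, threshold=0.5):
    """Classify a PSF score as 'high' (> threshold) or 'low'."""
    return "high" if psf > threshold else "low"

def transition_counts(paired_scores):
    """Count patients by (baseline, on-treatment) PSF risk category."""
    counts = {}
    for before, after in paired_scores:
        key = (risk_category(before), risk_category(after))
        counts[key] = counts.get(key, 0) + 1
    return counts

# Three hypothetical patients: one moves from high to low risk,
# one stays high risk, one stays low risk.
print(transition_counts([(0.8, 0.3), (0.9, 0.7), (0.4, 0.2)]))
```

Applied to the real cohort, the ("high", "low") entry would correspond to the 11 patients who changed risk category, and ("high", "high") to the 6 who remained at high risk.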
Resumo:
Canada Geese overflying the runways at London's Heathrow Airport have been struck by aircraft on eleven occasions during the last ten years: four strikes occurred during the pre-breeding season and seven during the post-moult period. A monitoring study was initiated in 1999 to evaluate the movements of geese around the airport and to determine appropriate mitigation strategies to reduce the risk of birdstrike. Moult sites within 13 km of the airport were identified, and 4,900 moulting geese were caught and fitted with colour rings and radio-transmitters between 1999 and 2004. 2,500 visits were made to over 300 sites, resulting in over 10,000 sightings of known individuals. Birds that crossed the airport approaches while moving between roost sites and feeding areas in newly harvested cereal crops were identified. Throughout the monitoring period efforts were made to control the risk, but by 2003 it was estimated that 10,000 bird transits of the approaches, involving almost 700 individuals, occurred during a 50-day period. The knowledge of the movements of ringed and tagged birds was used to inform a revised habitat management, daily roost dispersal and on-airfield bird deterrence programme in 2004. By adopting a flexible approach to management, an estimated 70% reduction in bird transits was achieved. This paper discusses the techniques used to achieve this reduction.