869 results for Pollutant Build-up
Abstract:
Toll plazas are particularly susceptible to build-ups of vehicle-emitted pollutants because vehicles pass through in low gear. To examine this, three-dimensional computational fluid dynamics simulations of pollutant dispersion, based on the standard k–ε turbulence model, are used. The effects of wind speed, wind direction and topography on pollutant dispersion are discussed. The Wuzhuang toll plaza on the Hefei–Nanjing expressway is considered, and the effect of the retaining walls along both sides of the plaza on pollutant dispersion is analysed. Pollutant concentrations near the tollbooths grow as the angle between the wind direction and the traffic direction increases, implying that the retaining walls impede dispersion. The slope of the walls has little influence on the variations in pollutant concentration.
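For reference, the standard k–ε closure named above solves two additional transport equations, one for the turbulent kinetic energy k and one for its dissipation rate ε; in their usual textbook form (not reproduced from the paper itself, which may use variant constants):

```latex
% Standard k-epsilon model: transport of turbulent kinetic energy k
% and its dissipation rate epsilon (textbook form, usual constants).
\frac{\partial(\rho k)}{\partial t}
  + \frac{\partial(\rho k u_i)}{\partial x_i}
  = \frac{\partial}{\partial x_j}\!\left[\left(\mu + \frac{\mu_t}{\sigma_k}\right)
      \frac{\partial k}{\partial x_j}\right] + P_k - \rho\varepsilon
\qquad
\frac{\partial(\rho\varepsilon)}{\partial t}
  + \frac{\partial(\rho\varepsilon u_i)}{\partial x_i}
  = \frac{\partial}{\partial x_j}\!\left[\left(\mu + \frac{\mu_t}{\sigma_\varepsilon}\right)
      \frac{\partial\varepsilon}{\partial x_j}\right]
  + C_{1\varepsilon}\frac{\varepsilon}{k}P_k
  - C_{2\varepsilon}\rho\frac{\varepsilon^2}{k}
```

Here P_k is the production of turbulent kinetic energy, the eddy viscosity is μ_t = ρ C_μ k²/ε, and the commonly used constants are C_μ = 0.09, C_1ε = 1.44, C_2ε = 1.92, σ_k = 1.0 and σ_ε = 1.3.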
Abstract:
This research shows that gross pollutant traps (GPTs) continue to play an important role in preventing visible street waste—gross pollutants—from contaminating the environment. The demand for these GPTs calls for stringent quality control and this research provides a foundation to rigorously examine the devices. A novel and comprehensive testing approach to examine a dry sump GPT was developed. The GPT is designed with internal screens to capture gross pollutants—organic matter and anthropogenic litter. This device has not been previously investigated. Apart from the review of GPTs and gross pollutant data, the testing approach includes four additional aspects to this research, which are: field work and an historical overview of street waste/stormwater pollution, calibration of equipment, hydrodynamic studies and gross pollutant capture/retention investigations. This work is the first comprehensive investigation of its kind and provides valuable practical information for the current research and any future work pertaining to the operations of GPTs and management of street waste in the urban environment. Gross pollutant traps—including patented and registered designs developed by industry—have specific internal configurations and hydrodynamic separation characteristics which demand individual testing and performance assessments. Stormwater devices are usually evaluated by environmental protection agencies (EPAs), professional bodies and water research centres. In the USA, the American Society of Civil Engineers (ASCE) and the Environmental Water Resource Institute (EWRI) are examples of professional and research organisations actively involved in these evaluation/verification programs. These programs largely rely on field evaluations alone that are limited in scope, mainly for cost and logistical reasons. In Australia, evaluation/verification programs of new devices in the stormwater industry are not well established. 
The current limitations in the evaluation methodologies of GPTs have been addressed in this research by establishing a new testing approach. This approach uses a combination of physical and theoretical models to examine in detail the hydrodynamic and capture/retention characteristics of the GPT. The physical model consisted of a 50% scale model GPT rig with screen blockages varying from 0 to 100%. This rig was placed in a 20 m flume, and various inlet and outflow operating conditions were modelled on observations made during the field monitoring of GPTs. Due to infrequent cleaning, the retaining screens inside the GPTs were often observed to be blocked with organic matter. Blocked screens can radically change the hydrodynamic and gross pollutant capture/retention characteristics of a GPT, as shown by this research. This research involved the use of equipment, such as acoustic Doppler velocimeters (ADVs) and dye concentration (Komori) probes, which were deployed for the first time in a dry sump GPT. Hence, it was necessary to rigorously evaluate the capability and performance of these devices, particularly in the case of the custom-made Komori probes, about which little was known. The evaluation revealed that the Komori probes have a frequency response of up to 100 Hz, dependent upon fluid velocities, which was adequate to measure the relevant fluctuations of dye introduced into the GPT flow domain. The outcome of this evaluation was the establishment of methodologies for the hydrodynamic measurements and gross pollutant capture/retention experiments. The hydrodynamic measurements consisted of point-based acoustic Doppler velocimeter (ADV) measurements, flow field particle image velocimetry (PIV) capture, head loss experiments and computational fluid dynamics (CFD) simulation. The gross pollutant capture/retention experiments included the use of anthropogenic litter components, tracer dye and custom modified artificial gross pollutants.
Anthropogenic litter was limited to tin cans, bottle caps and plastic bags, while the artificial pollutants consisted of 40 mm spheres with four different buoyancies. The hydrodynamic results led to the definition of global and local flow features. The gross pollutant capture/retention results showed that when the internal retaining screens are fully blocked, the capture/retention performance of the GPT rapidly deteriorates. The overall results showed that the GPT will operate efficiently until at least 70% of the screens are blocked, particularly at high flow rates. This important finding indicates that cleaning operations could be planned more effectively around the point at which GPT capture/retention performance deteriorates. At lower flow rates, the capture/retention performance trends were reversed. There is little difference in the poor capture/retention performance between a fully blocked GPT and a partially filled or empty GPT with 100% screen blockages. The results also revealed that the GPT is designed with an efficient high-flow bypass system to avoid upstream blockages. The capture/retention performance of the GPT at medium to high inlet flow rates is close to maximum efficiency (100%). With regard to the design appraisal of the GPT, a raised inlet offers better capture/retention performance, particularly at lower flow rates. Further design appraisals of the GPT are recommended.
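Capture/retention efficiency figures like those above are typically computed as the number of retained items divided by the number introduced; a minimal Python sketch of how such figures could be tabulated across screen-blockage levels (the counts are invented for illustration, not taken from the thesis):

```python
def capture_efficiency(introduced: int, retained: int) -> float:
    """Capture/retention efficiency: retained items as a percentage of introduced items."""
    if introduced <= 0:
        raise ValueError("need at least one introduced pollutant item")
    return 100.0 * retained / introduced

# Illustrative item counts per screen-blockage level (invented, not thesis data):
# blockage % -> (items introduced, items retained)
trials = {0: (200, 196), 70: (200, 190), 100: (200, 58)}
for blockage, (introduced, retained) in trials.items():
    print(f"{blockage:>3}% blocked: "
          f"{capture_efficiency(introduced, retained):.1f}% retained")
```

A tabulation like this makes the reported cliff in performance between 70% and 100% blockage easy to see at a glance.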
Abstract:
This manuscript took a 'top down' approach to understanding survival of inhabitant cells in the ecosystem bone, working from higher to lower length and time scales through the hierarchical ecosystem of bone. Our working hypothesis is that nature “engineered” the skeleton using a 'bottom up' approach, where mechanical properties of cells emerge from their adaptation to their local mechanical milieu. Cell aggregation and formation of higher order anisotropic structure results in emergent architectures through cell differentiation and extracellular matrix secretion. These emergent properties, including mechanical properties and architecture, result in mechanical adaptation at length scales and longer time scales which are most relevant for the survival of the vertebrate organism [Knothe Tate and von Recum 2009]. We are currently using insights from this approach to harness nature’s regeneration potential and to engineer novel mechanoactive materials [Knothe Tate et al. 2007, Knothe Tate et al. 2009]. In addition to potential applications of these exciting insights, these studies may provide important clues to the evolution and development of vertebrate animals. For instance, one might ask why mesenchymal stem cells condense at all. There is a putative advantage to self-assembly and cooperation, but this advantage is somewhat outweighed by the need for infrastructural complexity (e.g., circulatory systems comprised of specific differentiated cell types which in turn form conduits and pumps to overcome limitations of mass transport via diffusion; diffusion is untenable for multicellular organisms larger than 250 microns in diameter). A better question might be: why do cells build skeletal tissue? Once cooperating cells in tissues begin to deplete local sources of food in their aquatic environment, those that have evolved a means to locomote likely have an evolutionary advantage. 
Once the environment becomes less aquatic and more terrestrial, self-assembled organisms with the ability to move on land might have been conferred evolutionary advantages as well. So did the cytoskeleton evolve across several length scales, enabling the emergence of skeletal architecture for vertebrate animals? Did the evolutionary advantage of motility over noncompliant terrestrial substrates (walking on land) favor adaptations including the emergence of intracellular architecture (changes in the cytoskeleton and upregulation of structural protein manufacture), intercellular condensation, mineralization of tissues, and the emergence of higher order architectures? How far does evolutionary Darwinism extend, and how can we exploit this knowledge to engineer smart materials and architectures on Earth and in new, exploratory environments? [Knothe Tate et al. 2008]. We are limited only by our ability to imagine. Ultimately, we aim to understand nature, mimic nature, guide nature and/or exploit nature’s engineering paradigms without engineering ourselves out of existence.
Abstract:
How does the image of the future operate upon history, and upon national and individual identities? To what extent are possible futures colonized by the image? What are the un-said futurecratic discourses that underlie the image of the future? Such questions inspired the examination of Japan’s futures images in this thesis. The theoretical point of departure for this examination is Polak’s (1973) seminal research into the theory of the ‘image of the future’ and seven contemporary Japanese texts which offer various alternative images for Japan’s futures, selected as representative of a ‘national conversation’ about the futures of that nation. These seven images of the future are: 1. Report of the Prime Minister’s Commission on Japan’s Goals in the 21st Century—The Frontier Within: Individual Empowerment and Better Governance in the New Millennium, compiled by a committee headed by Japan’s preeminent Jungian psychologist Kawai Hayao (1928-2007); 2. Slow Is Beautiful—a publication by Tsuji Shinichi, in which he re-images Japan as a culture represented by the metaphor of the sloth, concerned with slow and quality-oriented livingry as a preferred image of the future to Japan’s current post-bubble cult of speed and economic efficiency; 3. MuRatopia is an image of the future in the form of a microcosmic prototype community and on-going project based on the historically significant island of Awaji, and established by Japanese economist and futures thinker Yamaguchi Kaoru; 4. F.U.C.K, I Love Japan, by author Tanja Yujiro provides this seven text image of the future line-up with a youth oriented sub-culture perspective on that nation’s futures; 5. IMAGINATION / CREATION—a compilation of round table discussions about Japan’s futures seen from the point of view of Japan’s creative vanguard; 6. Visionary People in a Visionless Country: 21 Earth Connecting Human Stories is a collection of twenty one essays compiled by Denmark born Tokyo resident Peter David Pedersen; and, 7. 
EXODUS to the Land of Hope, authored by Murakami Ryu, one of Japan’s most prolific and influential writers, a novel suggesting a future scenario that portrays a massive exodus of Japan’s youth, who, literate with state-of-the-art information and communication technologies (ICTs), move en masse to Japan’s northern island of Hokkaido to launch a cyber-revolution from the peripheries. The thesis employs a Futures Triangle Analysis (FTA) as the macro organizing framework and as such examines both pushes of the present and weights from the past before moving to focus on the pulls to the future represented by the seven texts mentioned above. Inayatullah’s (1999) Causal Layered Analysis (CLA) is the analytical framework used in examining the texts. Poststructuralist concepts derived primarily from the work of Michel Foucault are a particular (but not exclusive) reference point for the analytical approach it encompasses. The research questions which reflect the triangulated analytic matrix are: 1. What are the pushes—in terms of current trends—that are affecting Japan’s futures? 2. What are the historical and cultural weights that influence Japan’s futures? 3. What are the emerging transformative Japanese images of the future discourses, as embodied in actual texts, and what potential do they offer for transformative change in Japan? Research questions one and two are discussed in Chapter five, and research question three is discussed in Chapter six. The first two research questions should be considered preliminary. The weights outlined in Chapter five indicate that the forces working against change in Japan are formidable, structurally deep-rooted, widespread, and under-recognized as change-averse. Findings and analyses of the push dimension reveal strong forces towards a potentially very different type of Japan. 
However, it is the seven contemporary Japanese images of the future, from which there is hope for transformative potential, that form the analytical heart of the thesis. In analyzing these texts the thesis establishes the richness of Japan’s images of the future and, as such, demonstrates the robustness of Japan’s stance vis-à-vis the problem of a perceived map-less and model-less future for Japan. Frontier is a useful image of the future, whose hybrid textuality, consisting of government, business, academia, and creative minority perspectives, demonstrates the earnestness of Japan’s leaders in favour of the creation of innovative futures for that nation. Slow is powerful in its aim to reconceptualize Japan’s philosophies of temporality and build a new kind of nation founded on the principles of a human-oriented and expanded vision of economy based around the core metaphor of slowness culture. However, its viability in Japan, with its post-Meiji historical pushes towards an increasingly speed-obsessed social construction of reality, could render it impotent. MuRatopia is compelling in its creative hybridity, indicative of an advanced IT society, set in a modern-day utopian space based upon principles of a highly communicative social paradigm and sustainability. IMAGINATION / CREATION is less the plan than the platform for a new discussion on Japan’s transformation from an econo-centric social framework to a new Creative Age. It accords with emerging discourses from the Creative Industries, which would re-conceive of Japan as a leading maker of meaning, rather than as the so-called guzu, a term used in the book meaning ‘laggard’. Love Japan remains the most idiosyncratic of all the images of the future discussed. Its communication style, which appeals to Japan’s youth cohort, establishes it as a potentially formidable change agent in a competitive market of futures images. 
Visionary People is a compelling image for its revolutionary and subversive stance against Japan’s vision-less political leadership, showing that it is the people, not the futures-making elite or aristocracy, who must take the lead and create a new vanguard for the nation. Finally, Murakami’s Exodus cannot be ruled out as a compelling image of the future. Sharing the appeal of Tanja’s Love Japan to an increasingly disenfranchised youth, Exodus portrays a near-term future that is achievable in the here and now, by Japan’s teenagers, using information and communications technologies (ICTs) to subvert leadership and create utopianist communities based on alternative social principles. The principal theoretical contribution of this investigation lies in developing the Japanese image of the future. In this respect, the literature reviews represent a significant compilation, specifically about Japanese futures thinking, the Japanese image of the future, and the Japanese utopia. Though not exhaustive, this compilation will hopefully serve as a useful starting point for future research, not only on the Japanese image of the future, but for image of the future research generally. Many of the sources are in Japanese, and their English summaries add to the value of this compilation. Secondly, the seven images of the future analysed in Chapter six represent the first time that Japanese image of the future texts have been systematically organized and analysed. Their translation from Japanese to English can be claimed as a significant secondary contribution. What is more, they have been analysed according to current futures methodologies that reveal a layeredness, depth, and overall richness in Japanese futures images. 
Revealing this image-richness has been one of the most significant findings of this investigation, suggesting that this still under-explored field offers fertile ground for further research, with implications that go beyond domestic Japanese concerns and may provide valuable material for futures thinkers and researchers, Japanologists, social planners, and policy makers.
Abstract:
The design-build (DB) system is regarded as an effective means of delivering sustainable buildings. Specifying clear sustainability requirements to potential contractors is of great importance to project success. This research investigates the current state of practice in defining sustainability requirements within the public sectors of the U.S. construction market, using a robust content analysis of 49 DB requests for proposals (RFPs). The results reveal that owners predominantly communicate their desired level of sustainability through the LEED certification system. The sustainability requirement has become an important dimension in the best-value evaluation of DB contractors, with specific importance weightings of up to 25%. Additionally, owners of larger projects, and those who provide less design information in their RFPs, generally allocate significantly higher importance weightings to sustainability requirements. The primary knowledge contribution of this study to the construction industry is that it reveals current trends in DB procurement for green projects. The findings also provide owners, architects, engineers, and constructors with an effective means of communicating sustainability objectives in solicitation documents.
Abstract:
Arson homicides are rare, representing only two percent of all homicides in Australia each year. In this study, data was collected from the AIC’s National Homicide Monitoring Program (NHMP) to build on previous research undertaken into arson-associated homicides (Davies & Mouzos 2007) and to provide more detailed analysis of cases and offenders. Over the period 1989 to 2010, there were 123 incidents of arson-associated homicide, involving 170 unique victims and 131 offenders. The majority of incidents (63%) occurred in the victim’s home and more than half (57%) of all victims were male. It was found that there has been a 44 percent increase in the number of incidents in the past decade. It is evident that a considerable proportion of the identified arson homicides involved a high degree of premeditation and planning. These homicides were commonly committed by an offender who was well known to the victim, with over half of the victims (56%) specifically targeted by the offender. This paper therefore provides a valuable insight into the nature of arson homicides and signposts areas for further investigation.
Abstract:
Introduction Patients who survive sepsis syndromes have a poor quality of life and a high rate of recurring illness or mortality. Follow-up clinics have been instituted for patients after general intensive care, but evidence is sparse, and there has been no clinic specifically for survivors of sepsis. The aim of this trial is to investigate whether targeted screening and appropriate intervention for these patients can result in an improved quality of life (Short Form 36 health survey (SF36 V.2)), decreased mortality in the first 12 months, decreased readmission to hospital and/or decreased use of health resources. Methods and analysis 204 post-sepsis patients will be randomised to one of two groups. The intervention group will attend an outpatient clinic every two months for 6 months and receive screening and targeted intervention. The usual care group will remain under the care of their physician. To analyse the results, a baseline comparison will be carried out between the groups. Generalised estimating equations will compare the SF36 domain scores between groups and across time points. Mortality will be compared between groups using a Cox proportional hazards (time until death) analysis. Time to first readmission will be compared between groups by a survival analysis. Healthcare costs will be compared between groups using a generalised linear model. Economic (health resource) evaluation will be a within-trial incremental cost utility analysis with a societal perspective. Ethics and dissemination Ethical approval has been granted by the Royal Brisbane and Women’s Hospital Human Research Ethics Committee (HREC; HREC/13/QRBW/17), The University of Queensland HREC (2013000543), Griffith University (RHS/08/14/HREC) and the Australian Government Department of Health (26/2013). The results of this study will be submitted to peer-reviewed intensive care journals and presented at national and international intensive care and/or rehabilitation conferences.
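The time-to-event comparisons named in the protocol (Cox regression for mortality, survival analysis for time to first readmission) rest on survival-curve estimation. As a minimal illustration of the underlying idea, here is a pure-Python Kaplan–Meier sketch on invented follow-up data (not trial results):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.

    times  -- follow-up time for each patient (e.g. months)
    events -- 1 if the event (e.g. death) occurred at that time, 0 if censored
    Returns a list of (time, survival probability) steps at each event time.
    """
    at_risk = len(times)
    surv = 1.0
    steps = []
    # Walk through the distinct observation times in order.
    for t in sorted(set(times)):
        deaths = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        if deaths:
            # Multiply by the conditional probability of surviving past t.
            surv *= 1 - deaths / at_risk
            steps.append((t, surv))
        # Everyone observed at time t (event or censored) leaves the risk set.
        at_risk -= sum(1 for ti in times if ti == t)
    return steps

# Invented follow-up times in months; 1 = died, 0 = censored (alive at last contact).
steps = kaplan_meier([2, 3, 3, 5, 8, 12], [1, 1, 0, 1, 0, 0])
```

In practice an analysis like this trial's would use a dedicated statistics package rather than hand-rolled code, but the estimator above is the quantity those packages plot and test.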
Abstract:
The present work involves a computational study of soot (chosen as a scalar which is a primary pollutant source) formation and transport in a laminar acetylene diffusion flame perturbed by a convecting line vortex. The topology of soot contours resulting from flame-vortex interactions has been investigated. More soot was produced when the vortex was introduced from the air side than from the fuel side, and the soot topography was spatially more diffuse in the case of the air-side vortex. The computational model was found to be in good agreement with the experimental work previously reported in the literature. The computational simulation enabled a study of the various parameters, such as temperature, equivalence ratio and temperature gradient, affecting soot production and transport. Temperatures were found to be higher in the case of the air-side vortex than the fuel-side one. In the case of the fuel-side vortex, an abundance of fuel in the vortex core resulted in a fuel-rich combustion zone in the core and a more discrete soot topography; overall soot production was also observed to be low. For the air-side vortex, however, air abundance in the core resulted in higher temperatures and greater soot production. Probability density functions (PDFs) have been introduced to investigate the spatiotemporal variation of soot yield and transport and their dependence on temperature and acetylene concentration from a statistical viewpoint. In addition, the effect of flame curvature on soot production is also studied: the regions convex to the fuel stream side exhibited a thicker soot layer. All numerical simulations were carried out in Fluent 6.3.26. (C) 2013 Elsevier Ltd. All rights reserved.
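The probability density functions mentioned above are, in the simplest discrete view, normalised histograms of sampled values; a minimal Python sketch of that estimation step (the sample values are invented, not the paper's soot data):

```python
def histogram_pdf(samples, n_bins=10):
    """Estimate a PDF from samples as a normalised histogram (total area 1).

    Returns a list of (bin left edge, density) pairs.
    """
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / n_bins
    if width == 0:
        raise ValueError("samples must span a non-zero range")
    counts = [0] * n_bins
    for x in samples:
        i = min(int((x - lo) / width), n_bins - 1)  # clamp the maximum value
        counts[i] += 1
    total = len(samples)
    # Density = count / (total * bin width), so sum(density * width) == 1.
    return [(lo + i * width, c / (total * width)) for i, c in enumerate(counts)]

# Invented scalar samples (e.g. soot volume fraction readings at probe points).
pdf = histogram_pdf([0.1, 0.2, 0.2, 0.4, 0.5, 0.9], n_bins=4)
```

Dividing by both the sample count and the bin width is what turns a frequency histogram into a density, which is the form in which such PDFs are usually compared across conditions.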
Abstract:
Traditionally, in cognitive science the emphasis is on studying cognition from a computational point of view. Studies in biologically inspired robotics and embodied intelligence, however, provide strong evidence that cognition cannot be analyzed and understood by looking at computational processes alone, but that physical system-environment interaction needs to be taken into account. In this opinion article, we review recent progress in cognitive developmental science and robotics, and expand the notion of embodiment to include soft materials and body morphology in the big picture. We argue that we need to build our understanding of cognition from the bottom up; that is, all the way from how our body is physically constructed.
Abstract:
This paper presents an approach to developing an intelligent digital mock-up (DMU) through the integration of design and manufacturing disciplines, to enable a better understanding of assembly-related issues during design evolution. The intelligent DMU will contain tolerance information related to manufacturing capabilities so it can be used as a source for assembly simulations of realistic models, supporting manufacturing decision making within the design domain related to tolerance build-ups. A literature review of the contributing research areas is presented, from which the need for an intelligent DMU is identified. The proposed methodology, including the applications of cellular modelling and the potential features of the intelligent DMU, is presented and explained. Finally, a conclusion examines the work to date and the future work needed to achieve an intelligent DMU.
Abstract:
In Portugal, it was estimated that around 1.95 Mton/year of wood is used in residential wood burning for heating and cooking. Additionally, in recent decades, burnt forest area has also been increasing. These combustions result in high levels of toxic air pollutants and a large perturbation of atmospheric chemistry, interfere with climate and have adverse effects on health. Accurate quantification of the amounts of trace gases and particulate matter emitted from residential wood burning, agriculture and garden waste burning and forest fires on a regional and global basis is essential for various purposes, including the investigation of several atmospheric processes, the reporting of greenhouse gas emissions, and the quantification of the air pollution sources that affect human health at regional scales. In Southern Europe, detailed emission factor data for biomass burning are largely unavailable. Emission inventories, source apportionment studies, and photochemical and climate change models use default values obtained for biofuels from the USA and Northern Europe. It is therefore desirable to use more specific, locally available data. The objective of this study is to characterise and quantify the contribution of biomass combustion sources to atmospheric trace gases and aerosol concentrations in a way more representative of the national reality. Laboratory (residential wood combustion) and field (agriculture/garden waste burning and experimental wildland fires) sampling experiments were carried out. In the laboratory, after the selection of the most representative wood species and combustion equipment in Portugal, a sampling programme to determine gaseous and particulate matter emission rates was set up, including organic and inorganic aerosol composition. In the field, the smoke plumes from agriculture/garden waste and experimental wildland fires were sampled. 
The results of this study show that the combustion equipment and biofuel type used play an important role in emission levels and composition. Significant differences between traditional and modern combustion equipment were also observed. These differences are due to the higher combustion efficiency of modern equipment, reflected in the smaller amounts of particulate matter, organic carbon and carbon monoxide released. With regard to experimental wildland fires in shrub-dominated areas, it was observed that the largest organic fraction in the samples studied was mainly composed of vegetation pyrolysis products. The major organic components in the smoke samples were pyrolysates of vegetation cuticles, mainly comprising steradienes and sterol derivatives, carbohydrates from the breakdown of cellulose, aliphatic lipids from vegetation waxes and methoxyphenols from the thermal degradation of lignin. Despite being a banned practice in Portugal, agriculture/garden waste burning is actually quite common. To assess particulate matter composition, the smoke from three different agriculture/garden residues was sampled into three size fractions (PM2.5, PM2.5-10 and PM>10). Although the distribution patterns of organic compounds in particulate matter varied among residues, the amounts of phenolics (polyphenol and guaiacyl derivatives) and organic acids were always predominant over other organic compounds in the organosoluble fraction of the smoke. Among biomarkers, levoglucosan, β-sitosterol and phytol were detected in appreciable amounts in the smoke of all agriculture/garden residues. In addition, inositol may be considered a possible tracer for the smoke from potato haulm burning. It was shown that the prevailing ambient conditions (such as high humidity in the atmosphere) likely contributed to atmospheric processes (e.g. coagulation and hygroscopic growth) which influenced the particle size characteristics of the smoke tracers, shifting their distribution to larger diameters. An assessment of household biomass consumption was also made through a national-scale survey. The information obtained from the survey, combined with the databases on emission factors from the laboratory and field tests, allowed us to estimate the pollutant amounts emitted in each Portuguese district. In addition to a likely contribution to the improvement of emission inventories, the emission factors obtained for tracer compounds in this study can be applied in receptor models to assess the contribution of biomass burning to the levels of atmospheric aerosols and their constituents obtained in monitoring campaigns in Mediterranean Europe.
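The district-level estimates described above follow the standard inventory relation, emission = fuel consumed × emission factor; a minimal Python sketch of that calculation with invented numbers (not this study's measured factors or survey figures):

```python
def district_emissions(consumption_ton, emission_factors_g_per_kg):
    """Annual pollutant emissions (kg) from biomass burned in one district.

    consumption_ton           -- biomass burned in the district, tonnes/year
    emission_factors_g_per_kg -- pollutant -> emission factor, g per kg of fuel
    """
    kg_fuel = consumption_ton * 1000.0        # tonnes -> kg of fuel
    return {pollutant: ef * kg_fuel / 1000.0  # g -> kg of pollutant
            for pollutant, ef in emission_factors_g_per_kg.items()}

# Invented example values, purely for illustration.
ef = {"PM2.5": 10.0, "CO": 70.0, "OC": 4.0}  # g emitted per kg of wood burned
em = district_emissions(5000.0, ef)          # 5000 t of wood burned in a district
```

Summing such per-district results over all districts, with locally measured factors in place of defaults, is what the national survey described above makes possible.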
Abstract:
This project focuses on the bullying found in 21st-century elementary classrooms, more specifically in grades 4-8. These grades were found to have high levels of bullying because of major shifts in a student’s life that may place a student of this age at risk for problems with their peer relationships (Totura et al., 2009). Supporting the findings of the literature review, this handbook was created for Ontario grade 4-8 classroom teachers. The resource educates teachers on current knowledge of classroom bullying, and provides them with information and resources to share with their students so that they can create a culture of upstanders. Upstanders are students who stand up for the victims of bullying, and have the self-esteem and strategies to stand up to classroom bullies. These upstanders, with the support of their classroom teachers and their peers, will be a force strong enough to build the government-mandated Safe School environment.
Abstract:
Rapid growth in the production of new homes in the UK is putting build quality under pressure as evidenced by an increase in the number of defects. Housing associations (HAs) contribute approximately 20% of the UK’s new housing supply. HAs are currently experiencing central government funding cuts and rental revenue reductions. As part of HAs’ quest to ramp up supply despite tight budget conditions, they are reviewing how they learn from defects. Learning from defects is argued as a means of reducing the persistent defect problem within the UK housebuilding industry, yet how HAs learn from defects is under-researched. The aim of this research is to better understand how HAs, in practice, learn from past defects to reduce the prevalence of defects in future new homes. The theoretical lens for this research is organizational learning. The results drawn from 12 HA case studies indicate that effective organizational learning has the potential to reduce defects within the housing sector. The results further identify that HAs are restricting their learning to focus primarily on reducing defects through product and system adaptations. Focusing on product and system adaptations alone suppresses HAs’ abilities to reduce defects in the future.
Abstract:
Global brands have received much attention in the marketing field (Kotler, 1997; Holt, Quelch, and Taylor, 2004; Özsomer and Altaras, 2008), while local brands have been underestimated (Ger, 1999; Schuiling and Kapferer, 2004). The adaptation versus standardization debate, however, has been widely discussed. It centres on how an international company should build its strategy: by standardizing its marketing strategy, or by adapting it to better fit local culture and needs (Levitt, 1983; Subhash, 1989; Herbig, 1998; Holt, 2004; Melewar and Vemmervik, 2004; Heerden and Barter, 2008). However, this issue has not been discussed in the specific context of alternative consumption offered by particular local competitors. Nowadays, an increase in the supply of alternative products can be observed. Socially responsible consumption is growing (Sen and Bhattacharya, 2001; Holt, 2002; Loureiro, 2002; François-Lecompte and Valette-Florence, 2006). The cola soft-drink market is of particular interest. Alternative colas are cola soft drinks that have emerged over the last decade in specific regions or areas of the world. These colas clearly position themselves as an alternative to the global soft drink Coca-Cola. The alternative is based not on price but on special product characteristics that constitute a specific value proposition, different from Coca-Cola's. In France, over the past decade, the number of regional colas has grown to more than fifteen today. The soft drink Breizh Cola was launched in 2002 and today holds a market share of almost 10% in the Brittany region. In 2009, Coca-Cola Entreprise launched a specific marketing campaign in Brittany based on regional visuals and regional partnerships. 
This case of adaptation in a context of specific local competition is explored in this dissertation, which focuses on the reasons for the preference for Breizh Cola on the one hand, and on the actions undertaken by Coca-Cola in Brittany on the other. This study shows that Coca-Cola is following in the footsteps of Breizh Cola in order to better meet local expectations.