874 results for Research-Based Instructional Strategies (RBIS)
Abstract:
This paper presents a conceptual framework for the examination of land redevelopment based on a complex systems/networks approach. As Alvin Toffler insightfully noted, modern scientific enquiry has become exceptionally good at splitting problems into pieces but has forgotten how to put the pieces back together. Twenty-five years after his remarks, governments and corporations faced with the requirements of sustainability are struggling to promote an ‘integrated’ or ‘holistic’ approach to tackling problems. Despite the talk, both practice and research provide few platforms that allow for ‘joined-up’ thinking and action. With socio-economic phenomena such as land redevelopment, promising prospects open up when we assume that their constituents can make up complex systems whose emergent properties are more than the sum of the parts and whose behaviour is inherently difficult to predict. A review of previous research shows that it has mainly focused on idealised, ‘mechanical’ views of property development processes that fail to recognise in full the relationships between actors, the structures created and their emergent qualities. When reality failed to live up to the expectations of these theoretical constructs, somebody had to be blamed for it: planners, developers, politicians. However, from a ‘synthetic’ point of view, the agents and networks involved in property development can be seen as constituents of structures that perform complex processes. These structures interact, forming new, more complex structures and networks. Redevelopment can then be conceptualised as a process of transformation: a complex system, a ‘dissipative’ structure involving developers, planners, landowners, state agencies etc., unlocks the potential of previously used sites, transforms space towards a higher order of complexity and ‘consumes’ but also ‘creates’ different forms of capital in the process.
Analysis of network relations points toward the ‘dualism’ of structure and agency in these processes of system transformation and change. Insights from actor-network theory can be conjoined with notions of complexity and chaos to build an understanding of the ways in which actors actively seek to shape these structures and systems, whilst at the same time being recursively shaped by them in their strategies and actions. This approach transcends the blame game and allows inter-disciplinary inputs to be placed within a broader explanatory framework that does away with many past dichotomies. Better understanding of the interactions between actors and the emergent qualities of the networks they form can improve our comprehension of the complex socio-spatial phenomena that redevelopment comprises. The insights that this framework provides when applied to UK institutional investment in redevelopment are considered significant.
Abstract:
The present study sets out to examine the strategies used by Chinese learners in a predominantly naturalistic environment and how such learner strategy use relates to their proficiency in the second language. Data were collected from four Chinese research students in the UK using semi-structured interviews. Their proficiency in English was assessed with an oral interview and a listening test. The main findings from this study are that the learners used a wide range of strategies overall, including metacognitive, cognitive, social/affective and compensation strategies. The majority of the commonly reported strategies were metacognitive strategies, suggesting that the learners were self-directed and attempting to manage their own learning in an informal context. They also showed idiosyncrasies in their use of learner strategies. Attempts to explain the learners’ strategy use in relation to their levels of proficiency in English and contextual factors, as well as several other factors, are offered. Implications for target-country institutions in terms of the provision of support to Chinese students are discussed.
Abstract:
In this paper, we propose a scenario framework that could provide a scenario “thread” through the different climate research communities (climate change – vulnerability, impact, and adaptation (VIA) and mitigation) in order to provide assessments of mitigation and adaptation strategies and other VIA challenges. The scenario framework is organised around a matrix with two main axes: radiative forcing levels and socio-economic conditions. The radiative forcing levels (and the associated climate signal) are described by the new Representative Concentration Pathways. The second axis, socio-economic developments, comprises elements that affect the capacity for mitigation and adaptation, as well as the exposure to climate impacts. The proposed scenarios derived from this framework are limited in number, allow for comparison across various mitigation and adaptation levels, address a range of vulnerability characteristics, provide information across climate forcing and vulnerability states and span a full-century time scale. Assessments based on the proposed scenario framework would strengthen cooperation between integrated-assessment modelers, climate modelers and vulnerability, impact and adaptation researchers, and most importantly, facilitate the development of more consistent and comparable research within and across communities.
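The two-axis matrix described above can be sketched as a simple cross-product of forcing levels and socio-economic conditions. This is a minimal illustration: the RCP labels are the standard pathway names, but the socio-economic labels below are assumed placeholders, not categories taken from the paper.

```python
# Sketch of the scenario matrix: one axis for radiative forcing levels
# (the Representative Concentration Pathways), one for socio-economic
# conditions. Socio-economic labels are illustrative assumptions.
forcing_levels = ["RCP2.6", "RCP4.5", "RCP6.0", "RCP8.5"]
socio_economic = ["low challenges", "intermediate", "high challenges"]

# Each cell of the matrix pairs a climate signal with a development path,
# giving a limited set of scenarios that still spans both axes.
matrix = [(rcp, soc) for rcp in forcing_levels for soc in socio_economic]

for rcp, soc in matrix:
    print(f"{rcp} x {soc}")
```

In practice only a subset of cells would be elaborated into full scenarios, since some pairings (e.g. the highest forcing with the most mitigation-friendly conditions) may be implausible.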
Abstract:
Although current research indicates that increasing the number of options has negative effects on the cognitive ability of consumers, little attention has been paid to the consequences for producers and their strategic behavior. This article tests whether a large portfolio of products is beneficial to producers by observing UK consumer response to price promotions. The article shows that discounts mainly induce segment switching (74% of the total impact), with a limited effect on stockpiling (26%) and no impact on purchase incidence. Consequently, consumers prefer to “follow the discount” rather than purchase multiple units of the same wine. This result seems to explain the current structure of the market, and suggests that discounts may conflict with segment loyalty, a situation that disfavors producers, particularly in very populated segments. The results also cast doubt on the economic sustainability of competition based on intense product differentiation in the wine sector.
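The reported decomposition of the promotional impact can be expressed as simple shares that must exhaust the total effect. The sketch below uses the abstract's percentages; the absolute lift of 1,000 units is a hypothetical figure for illustration only.

```python
# Decomposition of a price promotion's total sales impact, using the
# shares reported in the abstract. The 1000-unit total lift is an
# assumed illustrative number, not a figure from the study.
total_lift = 1000
shares = {
    "segment_switching": 0.74,   # consumers "follow the discount"
    "stockpiling": 0.26,         # extra units of the same wine
    "purchase_incidence": 0.00,  # no effect on whether people buy at all
}
assert abs(sum(shares.values()) - 1.0) < 1e-9  # shares are exhaustive

breakdown = {k: total_lift * v for k, v in shares.items()}
print(breakdown)  # segment switching dominates the promotional response
```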
Abstract:
The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution are not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change are mainly through changes in the statistics of regional weather variations, the scientific and computational requirements to predict its behavior reliably are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenges of developing the capabilities to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities with computer capability at each facility of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have a sufficient scientific workforce to develop and maintain the software and data analysis infrastructure.
Such facilities will make it possible to determine what horizontal and vertical resolution in atmospheric and ocean models is necessary for more confident predictions at the regional and local level. Current constraints on computing power have severely limited such an investigation, which is now badly needed. These facilities will also provide the world's scientists with computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure, including hardware, software, and data analysis support, and the scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it. It will ultimately enable the climate community to provide society with climate predictions based on our best knowledge of science and the most advanced technology.
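The recommended compute trajectory (roughly 20 petaflops now, 200 petaflops within five years, 1 exaflop by the end of the next decade) implies a sustained growth rate that can be estimated directly. The sketch below assumes "end of the next decade" means roughly ten years out, which is a reading of the abstract rather than a stated figure.

```python
# Implied annual growth rates for the recommended facility capability.
# Timeline assumption: the exaflop target is reached ~10 years out.
near_term_pf = 20.0    # petaflops, near term
five_year_pf = 200.0   # petaflops, within five years
exaflop_pf = 1000.0    # 1 exaflop = 1000 petaflops, ~10 years out

phase1 = (five_year_pf / near_term_pf) ** (1 / 5) - 1   # years 0-5
phase2 = (exaflop_pf / five_year_pf) ** (1 / 5) - 1     # years 5-10
print(f"years 0-5:  {phase1:.1%} per year")
print(f"years 5-10: {phase2:.1%} per year")
```

A tenfold jump in five years corresponds to roughly 58% annual growth, and the subsequent fivefold jump to roughly 38% per year, both well above the historical pace of single-facility upgrades.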
Abstract:
In most Western countries, saturated fatty acid (SFA) intake exceeds recommended levels, which is considered a risk factor for cardiovascular disease (CVD). As milk and dairy products are major contributors to SFA intake in many countries, recent research has focused on sustainable methods of producing milk with a lower saturated fat concentration by altering dairy cow diets. Human intervention studies have shown that CVD risk can be reduced by consuming dairy products with reduced SFA and increased cis-monounsaturated fatty acid (MUFA) concentrations. This milk fatty acid profile can be achieved by supplementing dairy cow diets with cis-MUFA-rich unsaturated oils. However, rumen exposure of unsaturated oils also leads to enhanced milk trans fatty acid (TFA) concentrations. Because of concerns about the effects of TFA consumption on CVD, feeding strategies that increase MUFA concentrations in milk without concomitant increases in TFA concentration are preferred by milk processors. In an attempt to limit TFA production and increase the replacement of SFA by cis-MUFA, a preparation of rumen-protected unsaturated oils was developed using saponification with calcium salts. Four multiparous Holstein-Friesian cows in mid-late lactation were used in a 4 × 4 Latin square design with 21-d periods to investigate the effect of incremental dietary inclusion of a calcium salt of cis-MUFA product (Ca-MUFA; 20, 40, and 60 g/kg of dry matter of a maize silage-based diet), on milk production, composition, and fatty acid concentration. Increasing Ca-MUFA inclusion reduced dry matter intake linearly, but no change was observed in estimated ME intake. No change in milk yield was noted, but milk fat and protein concentrations were linearly reduced. Supplementation with Ca-MUFA resulted in a linear reduction in total SFA (from 71 to 52 g/100 g of fatty acids for control and 60 g/kg of dry matter diets, respectively). 
In addition, concentrations of both cis- and trans-MUFA were increased with Ca-MUFA inclusion, and increases in other biohydrogenation intermediates in milk fat were also observed. The Ca-MUFA supplement was very effective at reducing milk SFA concentration and increasing cis-MUFA concentrations without incurring any negative effects on milk and milk component yields. However, reduced milk fat and protein concentrations, together with increases in milk TFA concentrations, suggest partial dissociation of the calcium salts in the rumen.
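The 4 × 4 Latin square design described above assigns each of the four diets to each cow exactly once, with each diet also appearing once in every 21-day period. The construction below uses a generic cyclic shift to illustrate the layout; the actual randomisation used by the authors is not stated in the abstract, and treating the control as 0 g/kg inclusion is an assumption.

```python
# Sketch of a 4 x 4 Latin square: rows = cows, columns = 21-day periods,
# entries = diets (Ca-MUFA inclusion levels). The cyclic construction is
# a generic Latin square, not necessarily the authors' randomisation.
diets = ["control", "20 g/kg", "40 g/kg", "60 g/kg"]
n = len(diets)

# Cyclic shift guarantees the Latin property: no diet repeats in any
# row (cow) or column (period).
square = [[diets[(cow + period) % n] for period in range(n)]
          for cow in range(n)]

for period in range(n):
    column = {square[cow][period] for cow in range(n)}
    assert column == set(diets)  # every diet present in every period
```

This balance is what lets the design separate diet effects from cow and period effects with only four animals.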
Abstract:
The paper provides a descriptive analysis of the carbon management activities of the cement industry in Europe based on a study involving the four largest producers of cement in the world. Based on this analysis, the paper explores the relationship between managerial perception and strategy, with particular focus on the impact of government regulation and competitive dynamics. The research is based on extensive documentary analysis and in-depth interviews with senior managers from the four companies who have been responsible for and/or involved in the development of climate change strategies. We find that whilst the cement industry has embraced climate change and the need for action, there remains much scope for action in their carbon management activities, with current efforts concentrated on hedging practices and win-win efficiency programs. Managers perceive an inadequate and unfavourable regulatory structure as the key barrier to further action on emission reduction within the industry. EU cement companies are also shifting their CO2 emissions to less developed countries of the South.
Abstract:
It is predicted that non-communicable diseases will account for over 73% of global mortality in 2020. Given that the majority of these deaths occur in developed countries such as the UK, and that up to 80% of chronic disease could be prevented through improvements in diet and lifestyle, it is imperative that dietary guidelines and disease prevention strategies are reviewed in order to improve their efficacy. Since the completion of the Human Genome Project, our understanding of complex interactions between environmental factors such as diet and genes has progressed considerably, as has the potential to individualise diets using dietary, phenotypic and genotypic data. Thus, there is an ambition for dietary interventions to move away from population-based guidance towards 'personalised nutrition'. The present paper reviews current evidence for the public acceptance of genetic testing and personalised nutrition in disease prevention. Health and clear consumer benefits have been identified as key motivators in the uptake of genetic testing, with individuals who report personal experience of disease, such as those with specific symptoms, being more willing to undergo genetic testing for the purpose of personalised nutrition. This greater perceived susceptibility to disease may also improve motivation to change behaviour, the lack of which is a key barrier to the success of any nutrition intervention. Several consumer concerns have been identified in the literature that should be addressed before the introduction of a nutrigenomic-based personalised nutrition service. Future research should focus on the efficacy and implementation of nutrigenomic-based personalised nutrition.