5 results for nurture

in CentAUR: Central Archive University of Reading - UK


Relevance:

10.00%

Publisher:

Abstract:

More data will be produced in the next five years than in the entire history of humankind, a digital deluge that marks the beginning of the Century of Information. Through a year-long consultation with UK researchers, a coherent strategy has been developed which will nurture Century-of-Information Research (CIR); it crystallises the ideas developed by the e-Science Directors' Forum Strategy Working Group. This paper is an abridged version of their latest report, which can be found at http://wikis.nesc.ac.uk/escienvoy/Century_of_Information_Research_Strategy and which also records the consultation process and the affiliations of the authors. This document is derived from a paper presented at the Oxford e-Research Conference 2008 and takes into account suggestions made in the ensuing panel discussion. The goals of the CIR Strategy are to facilitate the growth of UK research and innovation that is data and computationally intensive, and to develop a new culture of 'digital-systems judgement' that will equip research communities, businesses, government and society as a whole with the skills essential to compete and prosper in the Century of Information. The CIR Strategy identifies a national requirement for a balanced programme of coordination, research, infrastructure, translational investment and education to empower UK researchers, industry, government and society. The Strategy is designed to deliver an environment which meets the needs of UK researchers so that they can respond with agility to challenges, can create knowledge and skills, and can lead new kinds of research. It is a call to action for those engaged in research, those providing data and computational facilities, those governing research and those shaping education policies. The ultimate aim is to help researchers strengthen the international competitiveness of the UK research base and increase its contribution to the economy. The objectives of the Strategy are to better enable UK researchers across all disciplines to contribute world-leading fundamental research; to accelerate the translation of research into practice; and to develop improved capabilities, facilities and context for research and innovation. It envisages a culture that is better able to grasp the opportunities provided by the growing wealth of digital information. Computing has, of course, already become a fundamental tool in all research disciplines. The UK e-Science programme (2001-06), since emulated internationally, pioneered the invention and use of new research methods, and a new wave of innovations in digital-information technologies which have enabled them. The Strategy argues that the UK must now harness and leverage its own, plus the now global, investment in digital-information technology in order to spread the benefits as widely as possible in research, education, industry and government. Implementing the Strategy would deliver the computational infrastructure and its benefits as envisaged in the Science & Innovation Investment Framework 2004-2014 (July 2004), and in the reports developing those proposals.
To achieve this, the Strategy proposes the following actions: support the continuous innovation of digital-information research methods; provide easily used, pervasive and sustained e-Infrastructure for all research; enlarge the productive research community which exploits the new methods efficiently; generate capacity, propagate knowledge and develop skills via new curricula; and develop coordination mechanisms to improve the opportunities for interdisciplinary research and to make digital-infrastructure provision more cost-effective. To gain the best value for money, strategic coordination is required across a broad spectrum of stakeholders. A coherent strategy is essential in order to establish and sustain the UK as an international leader of well-curated national data assets and computational infrastructure, expertly used to shape policy, support decisions, empower researchers and roll out the results to the wider benefit of society. The value of data as a foundation for wellbeing and a sustainable society must be appreciated; national resources must be more wisely directed to the collection, curation, discovery, analysis and exploitation of these data, and to widening access to them. Every researcher must be able to draw on skills, tools and computational resources to develop insights, test hypotheses and translate inventions into productive use, or to extract knowledge in support of governmental decision making. This foundation, plus the skills developed, will launch significant advances in research, in business, in professional practice and in government, with many consequent benefits for UK citizens. The Strategy presented here addresses these complex and interlocking requirements.

Relevance:

10.00%

Publisher:

Abstract:

The study explores what happens to teachers' practice and professional identity when they adopt a collaborative action research approach to teaching and involve external creative partners and a university mentor. The teachers aim to nurture and develop the creative potential of their learners by empowering them to make decisions for themselves about their own progress and learning directions. The teachers worked creatively and collaboratively, designing creative teaching and learning methods in support of pupils with language and communication difficulties. The respondents are from an English special school, a primary school and a girls' secondary school. A mixed-methods methodology is adopted. Gains in teacher confidence and capability were identified, in addition to shifts in values that directly affected their self-concept of what it is to be an effective teacher promoting effective learning. The development of their professional identities within a team ethos enabled them to make decisions about learning based on the educational potential of learners, decisions which they showed resulted in elevated standards achieved by this group of learners. They were able to justify their actions on the basis of established educational principles. Tensions, however, were revealed between what they perceived as the normal professionalism required of them by external agencies and the enhanced professionalism they experienced working through the project, where they were able to integrate theory and practice.

Relevance:

10.00%

Publisher:

Abstract:

The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution are not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change are felt mainly through changes in the statistics of regional weather variations, the scientific and computational requirements for reliable prediction are enormous. The nations of the world should therefore create a small number of multinational high-performance computing facilities dedicated to the grand challenges of developing the capabilities to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities with computing capability at each facility of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have a sufficient scientific workforce to develop and maintain the software and data analysis infrastructure. Such facilities will make it possible to investigate what horizontal and vertical resolution in atmospheric and ocean models is necessary for more confident predictions at the regional and local level; current limits on computing power have severely constrained such an investigation, which is now badly needed. These facilities will also provide the world's scientists with computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure, including hardware, software, and data analysis support, and the scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it. It will ultimately enable the climate community to provide society with climate predictions based on our best knowledge of science and the most advanced technology.
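
As a rough check on the scaling these compute targets imply (our reading: "near term" as year 0 and the exaflop milestone at roughly year 10, with 1 exaflop taken as 1,000 petaflops; the abstract does not fix the step lengths exactly), the implied compound annual growth in capability can be worked out in a few lines of Python:

# Back-of-the-envelope check of the compute scaling implied above:
# ~20 PF in the near term, ~200 PF within five years, ~1 EF (1000 PF)
# by the end of the next decade. The exact step lengths are our assumption.
targets_pf = [(0, 20), (5, 200), (10, 1000)]   # (years from now, petaflops)

for (y0, c0), (y1, c1) in zip(targets_pf, targets_pf[1:]):
    growth = (c1 / c0) ** (1 / (y1 - y0)) - 1
    print(f"years {y0}-{y1}: {c0} -> {c1} PF, "
          f"~{growth:.0%} compound annual growth")
# years 0-5: 20 -> 200 PF, ~58% compound annual growth
# years 5-10: 200 -> 1000 PF, ~38% compound annual growth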

Relevance:

10.00%

Publisher:

Abstract:

Animal models are invaluable tools that allow us to investigate the microbiome-host dialogue. However, experimental design introduces biases into the data that we collect, potentially also leading to biased conclusions. With obesity at pandemic levels, animal models of this disease have been developed; we investigated the role of experimental design in one such rodent model. We used 454 pyrosequencing to profile the faecal bacteria of obese (n = 6) and lean (homozygous n = 6; heterozygous n = 6) Zucker rats, maintained in mixed-genotype cages, over a 10-week period, to further understand the relationships between the composition of the intestinal bacteria and age, obesity progression, genetic background and cage environment. Phylogenetic and taxon-based univariate and multivariate analyses (non-metric multidimensional scaling, principal component analysis) showed that age was the most significant source of variation in the composition of the faecal microbiota. Second to this, cage environment was found to clearly impact the composition of the faecal microbiota, with samples from animals within the same cage showing high community-structure concordance, but large differences seen between cages. Importantly, the genetically induced obese phenotype was not found to impact the faecal bacterial profiles. These findings demonstrate that age and the local cage environment were driving the composition of the faecal bacteria and were more deterministically important than the host genotype. These findings have major implications for understanding the significance of functional metagenomic data in experimental studies, and raise the question: what is being measured in animal experiments in which different strains are housed separately, nature or nurture?
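
For readers unfamiliar with the ordination methods named above, here is a minimal sketch of that style of analysis in Python on a synthetic taxon-abundance table; the sample sizes, taxon count, simulated cage effect and parameter choices are illustrative assumptions, not the study's data or pipeline:

# Minimal sketch of the style of analysis described above: ordination of a
# taxon-abundance table and a within- vs between-cage distance comparison.
# All data here are synthetic; sizes and parameters are illustrative only.
import numpy as np
from scipy.spatial.distance import pdist, squareform
from sklearn.decomposition import PCA
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
n_samples, n_taxa = 18, 50                    # 18 rats, hypothetical 50 taxa
cage = np.repeat(np.arange(6), 3)             # 6 mixed-genotype cages of 3
counts = rng.poisson(lam=5, size=(n_samples, n_taxa)).astype(float)
counts += 10 * (cage[:, None] == np.arange(n_taxa)[None, :] % 6)  # cage effect
rel = counts / counts.sum(axis=1, keepdims=True)  # relative abundances

# Principal component analysis on relative abundances.
pca_scores = PCA(n_components=2).fit_transform(rel)

# Non-metric multidimensional scaling on Bray-Curtis dissimilarities.
d = squareform(pdist(rel, metric="braycurtis"))
nmds_scores = MDS(n_components=2, metric=False, dissimilarity="precomputed",
                  random_state=0).fit_transform(d)

# Compare community distances within vs between cages.
same_cage = cage[:, None] == cage[None, :]
off_diag = ~np.eye(n_samples, dtype=bool)
print("mean Bray-Curtis within cages: ", d[same_cage & off_diag].mean())
print("mean Bray-Curtis between cages:", d[~same_cage].mean())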

Relevance:

10.00%

Publisher:

Abstract:

Culex pipiens is the most cosmopolitan mosquito of the Pipiens Assemblage. By studying the nature of the interactions between this species and microorganisms common to its breeding environment, we can uncover important pitfalls encountered during development. We tested the survival rate of larval stages, pupae and adults of a Cx. pipiens colony exposed to a variety of microorganisms under laboratory conditions, and assessed transmission to offspring (F1) by those organisms that secured development up to adulthood. Three complementary experiments were designed to: 1) explore the nutritional value of yeasts and other microorganisms during Cx. pipiens development; 2) elucidate the transstadial transmission of yeast to the host offspring; and 3) examine the relevance of all these microorganisms in female choice of oviposition substratum. The yeast Saccharomyces cerevisiae proved to be the most nutritious diet but, despite showing the highest survival rates, vertical transmission to F1 was never confirmed. In addition, during the oviposition trials, none of the gravid females was attracted to the yeast substratum. Notably, the two native bacterial strains, Klebsiella sp. and Aeromonas sp., were the preferred oviposition media, the same two bacteria that managed to feed neonates until they molted into 2nd-instar larvae. Our results suggest not only that Klebsiella sp. or Aeromonas sp. serve as attractants for oviposition habitat selection, but also that they nurture the most fragile instar, L1, ensuring molting into a more resilient stage, L2, while yeast proves to be the most supportive diet for completing development. These experiments unearthed survival traits that might be considered in the future development of Cx. pipiens control strategies. These studies can be extended to other members of the Pipiens Assemblage.
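
As an illustration of how survival outcomes of this kind are commonly compared between rearing media, the sketch below applies Fisher's exact test to invented counts; the numbers, group sizes and media labels are hypothetical and are not the study's results:

# Hypothetical comparison of L1 -> L2 survival under two rearing media
# using Fisher's exact test. All counts are invented for illustration.
from scipy.stats import fisher_exact

#                  reached L2, died as L1   (out of 60 neonates per medium)
yeast_medium    = [52, 8]
bacteria_medium = [41, 19]

odds_ratio, p_value = fisher_exact([yeast_medium, bacteria_medium])
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")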