49 results for Have, Paul ten: Understanding qualitative research and ethnomethodology
in CentAUR: Central Archive University of Reading - UK
Abstract:
The International Conference (series) on Disability, Virtual Reality and Associated Technologies (ICDVRAT) this year held its sixth biennial conference, celebrating ten years of research and development in this field. A total of 220 papers have been presented at the first six conferences, addressing the potential, development, exploration and examination of how these technologies can be applied in disability research and practice. The research community is broad and multi-disciplinary, comprising a variety of scientific and medical researchers, rehabilitation therapists, educators and practitioners. Likewise, the technologies, their applications and target user populations are also broad, ranging from sensors positioned on real-world objects to fully immersive interactive simulated environments. A common factor is the desire to identify what the technologies have to offer and how they can provide added value to existing methods of assessment, rehabilitation and support for individuals with disabilities. This paper presents a brief review of the first decade of research and development in the ICDVRAT community, defining the technologies, applications and target user populations served.
Abstract:
A wide variety of exposure models are currently employed for health risk assessments. Individual models have been developed to meet the chemical exposure assessment needs of Government, industry and academia. These existing exposure models can be broadly categorised according to the following types of exposure source: environmental, dietary, consumer product, occupational, and aggregate and cumulative. Aggregate exposure models consider multiple exposure pathways, while cumulative models consider multiple chemicals. In this paper each of these basic types of exposure model is briefly described, along with any inherent strengths or weaknesses, with the UK as a case study. Examples are given of specific exposure models that are currently used, or that have the potential for future use, and key differences in the modelling approaches adopted are discussed. The use of exposure models is currently fragmentary in nature. Specific organisations with exposure assessment responsibilities tend to use a limited range of models. The modelling techniques adopted in current exposure models have evolved along distinct lines for the various types of source. In fact, different organisations may be using different models for very similar exposure assessment situations. This lack of consistency between exposure modelling practices can make understanding the exposure assessment process more complex, can lead to inconsistency between organisations in how critical modelling issues (e.g. variability and uncertainty) are addressed, and has the potential to communicate mixed messages to the general public. Further work should be conducted to integrate the various approaches and models, where possible and where regulatory remits allow, to achieve a coherent and consistent exposure modelling process. We recommend the development of an overall framework for exposure and risk assessment with common approaches and methodology, a screening tool for exposure assessment, collection of better input data, probabilistic modelling, validation of model inputs and outputs, and a closer working relationship between scientists, policy makers and staff from different Government departments. A much increased effort is required in the UK to address these issues. The result will be a more robust, transparent, valid and more comparable exposure and risk assessment process.
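The distinction drawn above between aggregate and cumulative models lends itself to a minimal sketch. The Python fragment below is purely illustrative and uses hypothetical intake values and reference doses (nothing from the paper): aggregate exposure sums one chemical's intake across pathways, and a hazard index, one common form of cumulative screen, combines several chemicals by scaling each aggregate intake by its reference dose.

    # Illustrative sketch with hypothetical numbers (not from the paper).
    # Aggregate exposure: one chemical, summed over pathways (mg/kg bw/day).
    pathways = {"environmental": 0.001, "dietary": 0.012, "consumer_product": 0.004}

    def aggregate_exposure(intakes_by_pathway):
        """Total intake of a single chemical across all exposure pathways."""
        return sum(intakes_by_pathway.values())

    # Cumulative screen: several chemicals, each compared with its reference
    # dose; the hazard index is the sum of the hazard quotients.
    reference_doses = {"chem_A": 0.05, "chem_B": 0.01}      # hypothetical RfDs
    aggregate_intakes = {"chem_A": 0.017, "chem_B": 0.003}  # hypothetical intakes

    def hazard_index(intakes, rfds):
        """Sum of hazard quotients (intake / reference dose) over chemicals."""
        return sum(intakes[c] / rfds[c] for c in intakes)

    print(aggregate_exposure(pathways))                      # 0.017
    print(hazard_index(aggregate_intakes, reference_doses))  # 0.34 + 0.3 = 0.64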
Abstract:
More data will be produced in the next five years than in the entire history of humankind, a digital deluge that marks the beginning of the Century of Information. Through a year-long consultation with UK researchers, a coherent strategy has been developed, which will nurture Century-of-Information Research (CIR); it crystallises the ideas developed by the e-Science Directors' Forum Strategy Working Group. This paper is an abridged version of their latest report, which can be found at http://wikis.nesc.ac.uk/escienvoy/Century_of_Information_Research_Strategy and which also records the consultation process and the affiliations of the authors. This document is derived from a paper presented at the Oxford e-Research Conference 2008 and takes into account suggestions made in the ensuing panel discussion. The goals of the CIR Strategy are to facilitate the growth of UK research and innovation that is data and computationally intensive and to develop a new culture of 'digital-systems judgement' that will equip research communities, businesses, government and society as a whole, with the skills essential to compete and prosper in the Century of Information. The CIR Strategy identifies a national requirement for a balanced programme of coordination, research, infrastructure, translational investment and education to empower UK researchers, industry, government and society. The Strategy is designed to deliver an environment which meets the needs of UK researchers so that they can respond agilely to challenges, can create knowledge and skills, and can lead new kinds of research. It is a call to action for those engaged in research, those providing data and computational facilities, those governing research and those shaping education policies. The ultimate aim is to help researchers strengthen the international competitiveness of the UK research base and increase its contribution to the economy. The objectives of the Strategy are to better enable UK researchers across all disciplines to contribute world-leading fundamental research; to accelerate the translation of research into practice; and to develop improved capabilities, facilities and context for research and innovation. It envisages a culture that is better able to grasp the opportunities provided by the growing wealth of digital information. Computing has, of course, already become a fundamental tool in all research disciplines. The UK e-Science programme (2001-06)—since emulated internationally—pioneered the invention and use of new research methods, and a new wave of innovations in digital-information technologies which have enabled them. The Strategy argues that the UK must now harness and leverage its own, plus the now global, investment in digital-information technology in order to spread the benefits as widely as possible in research, education, industry and government. Implementing the Strategy would deliver the computational infrastructure and its benefits as envisaged in the Science & Innovation Investment Framework 2004-2014 (July 2004), and in the reports developing those proposals.
To achieve this, the Strategy proposes the following actions: support the continuous innovation of digital-information research methods; provide easily used, pervasive and sustained e-Infrastructure for all research; enlarge the productive research community which exploits the new methods efficiently; generate capacity, propagate knowledge and develop skills via new curricula; and develop coordination mechanisms to improve the opportunities for interdisciplinary research and to make digital-infrastructure provision more cost effective. To gain the best value for money, strategic coordination is required across a broad spectrum of stakeholders. A coherent strategy is essential in order to establish and sustain the UK as an international leader of well-curated national data assets and computational infrastructure, which is expertly used to shape policy, support decisions, empower researchers and roll out the results to the wider benefit of society. The value of data as a foundation for wellbeing and a sustainable society must be appreciated; national resources must be more wisely directed to the collection, curation, discovery, analysis and exploitation of these data, and to widening access to them. Every researcher must be able to draw on skills, tools and computational resources to develop insights, test hypotheses and translate inventions into productive use, or to extract knowledge in support of governmental decision making. This foundation, plus the skills developed, will launch significant advances in research, in business, in professional practice and in government, with many consequent benefits for UK citizens. The Strategy presented here addresses these complex and interlocking requirements.
Abstract:
The role of the academic in the built environment seems generally to be neither well understood nor well articulated. While this problem is not unique to our field, there are plenty of examples in a wide range of academic disciplines where the academic role has been fully articulated. But built environment academics have tended not to look beyond their own literature and their own vocational context in trying to give meaning to their academic work. The purpose of this keynote presentation is to explore the context of academic work generally, and the connections between education, research and practice in the built environment specifically. By drawing on ideas from the sociology of the professions, the role of universities, and the fundamentals of social science research, a case is made that helps to explain the kind of problems that routinely obstruct academic progress in our field. This discussion reveals that while there are likely to be great weaknesses in much of what is published and taught in the built environment, it is not too great a stretch to provide a more robust understanding and a good basis for developing our field in a way that would enable us collectively to make a major contribution to theory-building and theory-testing, and to make a good stab at tackling some of the problems facing society at large. There is no reason to disregard the fundamental academic disciplines that underpin our knowledge of the built environment. If we contextualise our work in these more fundamental disciplines, there is every reason to think that we can have a much greater impact than we have experienced to date.
Abstract:
Over the past 10-15 years, several governments have implemented an array of technology, support-related, sustainable livelihoods (SL) and poverty-reduction projects for artisanal and small-scale mining (ASM). In the majority of cases, however, these interventions have failed to facilitate improvements in the industry's productivity and raise the living standards of the sector's subsistence operators. This article argues that a poor understanding of the demographics of target populations has precipitated these outcomes. In order to strengthen policy and assistance in the sector, governments must determine, with greater precision, the number of people operating in ASM regions, their origins and ethnic backgrounds, ages, and educational levels. This can be achieved by carrying out basic and localized census work before promoting ambitious sector-specific projects aimed at improving working conditions in the industry.
Abstract:
First defined in the mid-1990s, prebiotics, which alter the composition and activity of gastrointestinal (GI) microbiota to improve health and well-being, have generated scientific and consumer interest and regulatory debate. The Life Sciences Research Organization, Inc. (LSRO) held a workshop, Prebiotics and the Health Benefits of Fiber: Future Research and Goals, in February 2011 to assess the current state of the science and the international regulatory environment for prebiotics, identify research gaps, and create a strategy for future research. A developing body of evidence supports a role for prebiotics in reducing the risk and severity of GI infection and inflammation, including diarrhea, inflammatory bowel disease, and ulcerative colitis as well as bowel function disorders, including irritable bowel syndrome. Prebiotics also increase the bioavailability and uptake of minerals and data suggest that they reduce the risk of obesity by promoting satiety and weight loss. Additional research is needed to define the relationship between the consumption of different prebiotics and improvement of human health. New information derived from the characterization of the composition and function of different prebiotics as well as the interactions among and between gut microbiota and the human host would improve our understanding of the effects of prebiotics on health and disease and could assist in surmounting regulatory issues related to prebiotic use.
Abstract:
The impact of the Reformation was felt strongly in the nature and character of the priesthood, and in the function and reputation of the priest. A shift in the understanding of the priesthood was one of the most tangible manifestations of doctrinal change, evident in the physical arrangement of the church, in the language of the liturgy, and in the relaxation of the discipline of celibacy, which had for centuries bound priests in the Latin tradition to a life of perpetual continence. Clerical celibacy, and accusations of clerical incontinence, featured prominently in evangelical criticisms of the Catholic church and priesthood, which made a good deal of polemical capital out of the perceived relationship between the priest and the efficacy of his sacred function. Citing St Paul, Protestant polemicists presented clerical marriage as the only appropriate remedy for priestly immorality. But did the advent of a married priesthood create more problems than it solved? The polemical certainties that informed evangelical writing on sacerdotal celibacy did not guarantee the immediate acceptance of a married priesthood, and the vocabulary that had been used to denounce clergy who failed in their obligation to celibacy was all too readily turned against the married clergy. The anti-clerical lexicon, and its usage, remained remarkably static despite the substantial doctrinal and practical challenges posed to the traditional model of priesthood by the Protestant Reformation.
Abstract:
The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution is not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change are mainly through changes in the statistics of regional weather variations, the scientific and computational requirements to predict its behavior reliably are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenges of developing the capabilities to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities with computer capability at each facility of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have a sufficient scientific workforce to develop and maintain the software and data analysis infrastructure. Such facilities will make it possible to determine what horizontal and vertical resolution in atmospheric and ocean models is necessary for more confident predictions at the regional and local level. Current limits on computing power have severely constrained such an investigation, which is now badly needed. These facilities will also provide the world's scientists with the computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure including hardware, software, and data analysis support, and the scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it. It will ultimately enable the climate community to provide society with climate predictions that are based on our best knowledge of science and the most advanced technology.
Abstract:
Advances in the science and observation of climate change are providing a clearer understanding of the inherent variability of Earth’s climate system and its likely response to human and natural influences. The implications of climate change for the environment and society will depend not only on the response of the Earth system to changes in radiative forcings, but also on how humankind responds through changes in technology, economies, lifestyle and policy. Extensive uncertainties exist in future forcings of and responses to climate change, necessitating the use of scenarios of the future to explore the potential consequences of different response options. To date, such scenarios have not adequately examined crucial possibilities, such as climate change mitigation and adaptation, and have relied on research processes that slowed the exchange of information among physical, biological and social scientists. Here we describe a new process for creating plausible scenarios to investigate some of the most challenging and important questions about climate change confronting the global community.
Abstract:
It has been suggested that Assessment for Learning (AfL) plays a significant role in enhancing teaching and learning in mainstream educational contexts. However, little empirical evidence is available to support these claims. As AfL has been shown to be enacted predominantly through interactions in primary classes, there is a need to understand whether it is appropriate for teaching English to Young Learners (TEYL), whether it can be used efficiently in that setting, and how it can facilitate learning in such a context. This emerging research focus gains currency especially in the light of SLA research, which suggests that interactions play an important role in foreign language learning. This mixed-method, descriptive and exploratory study aims to investigate how teachers of learners aged 7-11 understand AfL; how they implement it; and the impact that such implementation could have on interactions which occur during lessons. The data were collected through lesson observations, scrutiny of school documents, semi-structured interviews and a focus group interview with teachers. The findings indicate that fitness for purpose guides the implementation of AfL in TEYL classrooms. Significantly, the study has revealed differences in the implementation of AfL between classes of 7-9 and 10-11 year-olds within each of the three purposes (setting objectives and expectations; monitoring performance; and checking achievement) identified through the data. Another important finding of this study is the empirical evidence suggesting that the use of AfL could help create conditions conducive to learning in TEYL classes during collaborative and expert/novice interactions. The findings suggest that teachers’ understanding of AfL is largely aligned with the theoretical frameworks already available (Black & Wiliam, 2009; Swaffield, 2011). However, they also demonstrate that there are TEYL-specific characteristics. This research has important pedagogical implications and indicates a number of areas for further research.
Abstract:
The accurate prediction of storms is vital to the oil and gas sector for the management of its operations. An overview of research exploring the prediction of storms by ensemble prediction systems is presented, and its application to the oil and gas sector is discussed. The analysis method used requires larger amounts of data storage and computer processing time than more conventional analysis methods. To overcome these difficulties, eScience techniques have been utilised. These techniques potentially have applications in the oil and gas sector, helping to incorporate environmental data into its information systems.
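The ensemble approach surveyed above can be illustrated with a minimal sketch. The Python fragment below uses hypothetical wind speeds (not data from the study) and derives a storm probability as the fraction of ensemble members exceeding an operational threshold, the kind of quantity an operator might feed into a planning system.

    # Minimal illustration with hypothetical values (not from the study).
    # One forecast maximum wind speed (m/s) per ensemble member.
    members_max_wind = [18.2, 24.7, 31.5, 22.0, 27.9, 33.1, 19.4, 26.3]

    def exceedance_probability(members, threshold):
        """Fraction of ensemble members forecasting winds above the threshold."""
        return sum(1 for w in members if w > threshold) / len(members)

    # Probability that winds exceed a 25 m/s operational limit: 4/8 = 0.5
    print(exceedance_probability(members_max_wind, 25.0))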
Abstract:
This article is a commentary on several research studies conducted on the prospects for aerobic rice production systems that aim at reducing the demand for irrigation water, which in certain major rice-producing areas of the world is becoming increasingly scarce. The research studies considered, as reported in published articles mainly under the aegis of the International Rice Research Institute (IRRI), have a narrow scope in that they test only 3 or 4 rice varieties under different soil moisture treatments obtained with controlled irrigation, but with other agronomic factors of production held constant. Consequently, these studies do not permit an assessment of the interactions among agronomic factors that will be of critical significance to the performance of any production system. Varying the production factor of "water" will also seriously affect the levels of the other factors required to optimise the performance of a production system. The major weakness in the studies analysed in this article originates from not taking account of the interactions between experimental and non-experimental factors involved in the comparisons between different production systems. This applies to the experimental field design used for the research studies as well as to the subsequent statistical analyses of the results. The existence of such interactions is a serious complicating element that makes meaningful comparisons between different crop production systems difficult. Consequently, the data and conclusions drawn from such research readily become biased towards proposing standardised solutions for possible introduction to farmers through a linear technology transfer process. Yet the variability and diversity encountered in the real-world farming environment demand more flexible solutions and approaches in the dissemination of knowledge-intensive production practices through "experiential learning" types of processes, such as those employed by farmer field schools. This article illustrates, based on expertise with the 'system of rice intensification' (SRI), that several cost-effective and environment-friendly agronomic solutions to reduce the demand for irrigation water, other than the asserted need for the introduction of new cultivars, are feasible. Further, these agronomic solutions can offer immediate benefits of reduced water requirements and increased net returns that would be readily accessible to a wide range of rice producers, particularly resource-poor smallholders.
Abstract:
Attempts to reduce the energy consumed in UK homes have met with limited success. One reason for this is a lack of understanding of how people interact with domestic technology – heating systems, lights, electrical equipment and so forth. Attaining such an understanding is hampered by a chronic shortage of detailed energy use data matched to descriptions of the house, the occupants, the internal conditions and the installed services and appliances. Without such information it is impossible to produce transparent and valid models for understanding and predicting energy use. The Carbon Reduction in Buildings (CaRB) consortium of five UK universities plans to develop socio-technical models of energy use, underpinned by a flow of data from a longitudinal monitoring campaign involving several hundred UK homes. This paper outlines the models proposed, the preliminary monitoring work and the structure of the proposed longitudinal study.
Abstract:
This paper reports on a study investigating teachers’ views and beliefs about the relationship between second language (L2) research and practice. Although a gap has been frequently reported between the two, there is little empirical data to show what teachers’ views on this relationship are or how these views and beliefs influence their use of research. A total of 60 TESOL teachers in England responded to a questionnaire which sought both qualitative and quantitative data. Results of the data analysis suggest that although their views on research and its usefulness are positive, teachers are mainly sceptical about the practicality and relevance of L2 research. More importantly, they expect research to originate from, rather than end in, classrooms, and maintain that the prime responsibility for bringing research and practice together is to be shared by teacher training programmes and the educational policies of the institutions they work in. Our analysis of the data further implies that there are differences between teachers’ epistemological assumptions and the more established notions of research.
Abstract:
The term 'big data' has recently emerged to describe a range of technological and commercial trends enabling the storage and analysis of huge amounts of customer data, such as that generated by social networks and mobile devices. Much of the commercial promise of big data is in the ability to generate valuable insights from collecting new types and volumes of data in ways that were not previously economically viable. At the same time a number of questions have been raised about the implications for individual privacy. This paper explores key perspectives underlying the emergence of big data, and considers both the opportunities and ethical challenges raised for market research.