22 results for "Research and development tax credit"

in CentAUR: Central Archive University of Reading - UK


Relevance: 100.00%

Abstract:

The International Conference series on Disability, Virtual Reality and Associated Technologies (ICDVRAT) this year held its sixth biennial conference, celebrating ten years of research and development in this field. A total of 220 papers have been presented at the first six conferences, addressing the potential, development, exploration and examination of how these technologies can be applied in disability research and practice. The research community is broad and multidisciplinary, comprising a variety of scientific and medical researchers, rehabilitation therapists, educators and practitioners. Likewise, the technologies, their applications and target user populations are also broad, ranging from sensors positioned on real-world objects to fully immersive interactive simulated environments. A common factor is the desire to identify what the technologies have to offer and how they can add value to existing methods of assessment, rehabilitation and support for individuals with disabilities. This paper presents a brief review of the first decade of research and development in the ICDVRAT community, defining the technologies, applications and target user populations served.

Relevance: 100.00%

Abstract:

Academic writing has a tendency to be turgid and impenetrable. This is not only anathema to communication between academics, but also a major barrier to advancing construction industry development. Clarity in our communication is a prerequisite to effective collaboration with industry. An exploration of what it means to be an academic in a university is presented in order to provide a context for a discussion of how academics might collaborate with industry to advance development. There are conflicting agendas that pull the academic in different directions: peer-group recognition, institutional success and industry development. None of these can be achieved without the others, which results in the need for a careful balancing act. While academics search for better understandings and provisional explanations within the context of conceptual models, industry seeks the practical application of new ideas, whether those ideas come from research or experience. Universities have a key role to play in industry development and in economic development.

Relevance: 100.00%

Abstract:

A wide variety of exposure models are currently employed for health risk assessments. Individual models have been developed to meet the chemical exposure assessment needs of Government, industry and academia. These existing exposure models can be broadly categorised according to the following types of exposure source: environmental, dietary, consumer product, occupational, and aggregate and cumulative. Aggregate exposure models consider multiple exposure pathways, while cumulative models consider multiple chemicals. In this paper each of these basic types of exposure model is briefly described, along with its inherent strengths and weaknesses, with the UK as a case study. Examples are given of specific exposure models that are currently used, or that have the potential for future use, and key differences in the modelling approaches adopted are discussed. The use of exposure models is currently fragmentary in nature. Specific organisations with exposure assessment responsibilities tend to use a limited range of models, and the modelling techniques adopted in current exposure models have evolved along distinct lines for the various types of source. In fact, different organisations may be using different models for very similar exposure assessment situations. This lack of consistency between exposure modelling practices can make understanding the exposure assessment process more complex, can lead to inconsistency between organisations in how critical modelling issues are addressed (e.g. variability and uncertainty), and has the potential to communicate mixed messages to the general public. Further work should be conducted to integrate the various approaches and models, where possible and where regulatory remits allow, to achieve a coherent and consistent exposure modelling process. We recommend the development of an overall framework for exposure and risk assessment with common approaches and methodology, a screening tool for exposure assessment, collection of better input data, probabilistic modelling, validation of model input and output, and a closer working relationship between scientists, policy makers and staff from different Government departments. A much increased effort is required in the UK to address these issues. The result will be a more robust, transparent, valid and more comparable exposure and risk assessment process.
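
As an illustration of the aggregate, probabilistic modelling approach recommended above, the sketch below sums two hypothetical exposure pathways using a simplified average-daily-dose formulation with Monte Carlo sampled inputs. The pathways, distributions and parameter values are assumptions for illustration only and are not drawn from any of the UK models discussed.

```python
import numpy as np

rng = np.random.default_rng(seed=1)
n = 10_000  # Monte Carlo iterations

def average_daily_dose(conc, intake_rate, body_weight):
    """Simplified average daily dose (mg/kg/day) = concentration x intake rate / body weight."""
    return conc * intake_rate / body_weight

# Illustrative (assumed) input distributions for two pathways.
body_weight = rng.normal(70.0, 10.0, n).clip(min=40.0)          # kg
dietary_dose = average_daily_dose(
    conc=rng.lognormal(mean=np.log(0.05), sigma=0.5, size=n),   # mg/kg food
    intake_rate=rng.normal(1.5, 0.3, n).clip(min=0.1),          # kg food/day
    body_weight=body_weight,
)
water_dose = average_daily_dose(
    conc=rng.lognormal(mean=np.log(0.002), sigma=0.4, size=n),  # mg/L water
    intake_rate=rng.normal(2.0, 0.5, n).clip(min=0.2),          # L water/day
    body_weight=body_weight,
)

# Aggregate exposure: sum over pathways for each simulated individual.
aggregate = dietary_dose + water_dose
print(f"median: {np.median(aggregate):.4g} mg/kg/day, "
      f"95th percentile: {np.percentile(aggregate, 95):.4g} mg/kg/day")
```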

Relevance: 100.00%

Abstract:

Purpose – Expectations of future market conditions are acknowledged to be crucial for the development decision and hence for shaping the built environment. The purpose of this paper is to study the central London office market from 1987 to 2009 and test for evidence of rational, adaptive and naive expectations.
Design/methodology/approach – Two parallel approaches are applied to test for rational or adaptive/naive expectations: a vector auto-regressive (VAR) approach with Granger causality tests, and recursive OLS regression with one-step forecasts.
Findings – Applying the VAR models and the recursive OLS regression with one-step forecasts, the authors find no evidence of adaptive or naive expectations on the part of developers. Although the magnitude of the errors and the length of the time lags between market signal and construction starts vary over time and across development cycles, the results confirm that developer decisions are explained, to a large extent, by contemporaneous and historic conditions in both the City and the West End; this is more likely to stem from the lengthy design, financing and planning permission processes than from adaptive or naive expectations.
Research limitations/implications – More generally, the results of this study suggest that real estate cycles are largely generated endogenously rather than being the result of large demand shocks and/or irrational behaviour.
Practical implications – Developers may be able to generate excess profits by exploiting market inefficiencies, but this may be hindered in practice by the long periods necessary for planning and construction of the asset.
Originality/value – This paper focuses the scholarly debate on real estate cycles on the role of expectations. It is also one of very few spatially disaggregated studies of the subject.
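
For readers unfamiliar with the two techniques named in the methodology, the sketch below shows, on hypothetical data, how a VAR with a Granger causality test and a recursive OLS regression with one-step-ahead forecasts can be set up using statsmodels. The series, lag lengths and estimation window are illustrative assumptions; this is not the authors' data or code.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
n = 92  # e.g. quarterly observations, 1987-2009
rents = np.cumsum(rng.normal(0, 1, n))                   # hypothetical rent index
starts = 0.5 * np.roll(rents, 2) + rng.normal(0, 1, n)   # starts reacting with a lag
# Drop the first few rows, which are spurious because np.roll wraps values around.
df = pd.DataFrame({"rents": rents, "starts": starts}).iloc[4:]

# 1) VAR with a Granger causality test: do lagged rents help explain construction starts?
var_res = VAR(df).fit(maxlags=4, ic="aic")
print(var_res.test_causality("starts", ["rents"], kind="f").summary())

# 2) Recursive OLS with one-step-ahead forecasts over an expanding window.
y = df["starts"].to_numpy()
X = sm.add_constant(df["rents"].shift(1).to_numpy())   # lagged rents as the regressor
errors = []
for t in range(20, len(y) - 1):            # start after an initial estimation window
    fit = sm.OLS(y[1:t], X[1:t]).fit()     # skip the first row (NaN from the shift)
    forecast = fit.predict(X[t:t + 1])[0]  # one-step-ahead forecast
    errors.append(y[t] - forecast)
print("mean one-step forecast error:", np.mean(errors))
```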

Relevance: 100.00%

Abstract:

The countries in West Africa (WA) are pushing for socio-economic development, and the construction sector has an important part to play in helping to realise these aspirations. This necessitates an increased emphasis on research in the built environment as a key contributor to developing capacity, knowledge and technologies for the sector. The West Africa Built Environment Research (WABER) conference was initiated in 2008. Its objectives were to: help young built environment researchers in WA to develop their research work and skills through constructive face-to-face interaction with their peers and experienced international academics; provide a platform for interaction among more senior academics and an outlet for disseminating their research work; and serve as a vehicle for developing the built environment field in Africa. Three conferences have so far been organised (2009-2011), bringing together around 300 academics, researchers and practitioners from the WA region. This paper draws on a content analysis of the 189 papers in the proceedings of the three conferences: 2009 (25 papers), 2010 (57) and 2011 (107). These papers provide a window into current research priorities and trends and thus offer an opportunity to understand the kinds of research work undertaken by built environment researchers in West Africa. The aim is to illuminate the main research themes and methods currently pursued, and their limitations. The findings lay bare some of the many challenges faced by academics in WA and provide suggestions for alternative directions for future research and development work, with indications of a potential research agenda.

Relevance: 100.00%

Abstract:

Tourism is the world's largest employer, accounting for 10% of jobs worldwide (WTO, 1999). There are over 30,000 protected areas around the world, covering about 10% of the land surface (IUCN, 2002). Protected area management is moving towards a more integrated form of management, which recognises the social and economic needs of the world's finest areas and seeks to provide long-term income streams and support social cohesion through active but sustainable use of resources. Ecotourism - 'responsible travel to natural areas that conserves the environment and improves the well-being of local people' (The Ecotourism Society, 1991) - is often cited as a panacea for incorporating the principles of sustainable development in protected area management. However, few examples exist worldwide to substantiate this claim. In reality, ecotourism struggles to provide social and economic empowerment locally and fails to secure proper protection of the local and global environment. Current analysis of ecotourism provides a useful checklist of interconnected principles for more successful initiatives, but no overall framework of analysis or theory. This paper argues that applying common property theory to ecotourism can help to establish a more rigorous, multi-layered analysis that identifies the institutional demands of community-based ecotourism (CBE). The paper draws on the existing literature on ecotourism and several new case studies from developed and developing countries around the world. It focuses on the governance of CBE initiatives, particularly the interaction between local stakeholders and government, and the role that third-party non-governmental organisations can play in brokering appropriate institutional arrangements. The paper concludes by offering future research directions.

Relevance: 100.00%

Abstract:

In this paper, we propose a scenario framework that could provide a scenario “thread” through the different climate research communities (climate change – vulnerability, impact, and adaptation (VIA) and mitigation) in order to provide assessment of mitigation and adaptation strategies and other VIA challenges. The scenario framework is organised around a matrix with two main axes: radiative forcing levels and socio-economic conditions. The radiative forcing levels (and the associated climate signal) are described by the new Representative Concentration Pathways. The second axis, socio-economic developments, comprises elements that affect the capacity for mitigation and adaptation, as well as the exposure to climate impacts. The proposed scenarios derived from this framework are limited in number, allow for comparison across various mitigation and adaptation levels, address a range of vulnerability characteristics, provide information across climate forcing and vulnerability states and span a full century time scale. Assessments based on the proposed scenario framework would strengthen cooperation between integrated-assessment modelers, climate modelers and vulnerability, impact and adaptation researchers, and most importantly, facilitate the development of more consistent and comparable research within and across communities.
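
As a concrete illustration of the matrix structure described above, the sketch below enumerates scenario cells by crossing the RCP forcing levels with a set of socio-economic development pathways. The socio-economic labels are generic placeholders for illustration, not the specific pathway definitions used in the framework.

```python
# Illustrative sketch of the two-axis scenario matrix: each cell pairs a radiative
# forcing level (an RCP) with a socio-economic development pathway. The socio-economic
# labels below are placeholders, not the framework's own set.
from itertools import product
from dataclasses import dataclass

@dataclass(frozen=True)
class Scenario:
    rcp: str                  # Representative Concentration Pathway
    forcing_wm2: float        # approximate year-2100 radiative forcing (W/m^2)
    socio_economic: str       # qualitative development pathway (placeholder label)

RCPS = {"RCP2.6": 2.6, "RCP4.5": 4.5, "RCP6.0": 6.0, "RCP8.5": 8.5}
SOCIO_ECONOMIC = [
    "low challenges to mitigation and adaptation",
    "intermediate challenges",
    "high challenges to mitigation and adaptation",
]

# The full matrix: every forcing level crossed with every socio-economic pathway.
matrix = [
    Scenario(rcp, forcing, pathway)
    for (rcp, forcing), pathway in product(RCPS.items(), SOCIO_ECONOMIC)
]
for s in matrix:
    print(f"{s.rcp} ({s.forcing_wm2} W/m^2) x {s.socio_economic}")
```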

Relevance: 100.00%

Abstract:

This paper reviews recent research and other literature concerning the planning and development of redundant defence estate. It concentrates on UK sources but includes reference to material from Europe and North America where it is relevant for comparative purposes. It introduces the topic by providing a brief review of the recent restructuring of the UK defence estate and then proceeds to examine the various planning policy issues generated by this process; the policy frameworks used to guide it; and comparable approaches to surplus land disposal and the appraisal of impacts; the main body of the review ends with an analysis of the economic, social and environmental impacts of military base closure and redevelopment. It concludes that there is a significant body of work focusing on the reuse and redevelopment of redundant defence estate in the UK and abroad, but that much of this work is based on limited research or on personal experience. One particular weakness of the current literature is that it does not fully reflect the institutional difficulties posed by the disposal process and the day-to-day pressures that MOD personnel have to deal with. In doing this, it also under-emphasises the embedded cultures of the individuals and professional groups who are required to operationalise the policies, procedures and practices for planning and redeveloping redundant defence estate.

Relevance: 100.00%

Abstract:

This paper summarises an initial report carried out by the Housing Business Research Group of the University of Reading into Design and Build procurement, and a number of research projects undertaken by the National Federation of Housing Associations (NFHA) into their members' development programmes. The paper collates existing statistics from these sources and examines the way in which Design and Build procurement can be adapted for the provision of social housing. The paper comments on these changes and questions how far the adopted strategies avert risk in relation to long-term housing business management issues arising from the quality of the product produced by the new system.

Relevance: 100.00%

Abstract:

In situ high-resolution aircraft measurements of cloud microphysical properties were made in coordination with ground-based radar and lidar remote sensing observations of a line of small cumulus clouds, as part of the Aerosol Properties, PRocesses And InfluenceS on the Earth's climate (APPRAISE) project. A narrow but extensive line (~100 km long) of shallow convective clouds over the southern UK was studied. Cloud top temperatures were observed to be higher than −8 °C, but the clouds were seen to consist of supercooled droplets and varying concentrations of ice particles. No ice particles were observed to be falling into the cloud tops from above. Current parameterisations of ice nuclei (IN) numbers predict that too few particles will be active as ice nuclei to account for the ice particle concentrations at the observed near-cloud-top temperatures (−7.5 °C). The role of mineral dust particles, at concentrations consistent with those observed near the surface, acting as high-temperature IN is considered important in this case. It was found that very high concentrations of ice particles (up to 100 L⁻¹) could be produced by secondary ice particle production, provided the observed small amount of primary ice (about 0.01 L⁻¹) was present to initiate it. This emphasises the need to understand primary ice formation in slightly supercooled clouds. It is shown using simple calculations that the Hallett-Mossop process (HM) is the likely source of the secondary ice. Model simulations of the case study were performed with the Aerosol Cloud and Precipitation Interactions Model (ACPIM). These parcel-model investigations confirmed the HM process to be a very important mechanism for producing the observed high ice concentrations. A key step in generating the high concentrations was the collision and coalescence of rain drops, which, once formed, fell rapidly through the cloud, collecting ice particles that caused them to freeze and instantly form large riming particles. The broadening of the droplet size distribution by collision-coalescence was therefore a vital step in this process, as it was required to generate the large number of ice crystals observed in the time available. Simulations were also performed with the WRF (Weather Research and Forecasting) model. The results showed that while HM does act to increase the mass and number concentration of ice particles in these simulations, it was not found to be critical for the formation of precipitation. However, the WRF simulations produced a cloud top that was too cold, and this, combined with the assumption that ice nuclei removed by ice crystal formation are continually replenished, resulted in too many ice crystals forming by primary nucleation compared to the observations and parcel modelling.
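
The "simple calculations" referred to above can be illustrated with a back-of-the-envelope estimate of HM splinter production, using the standard laboratory yield of roughly 350 splinters per milligram of rime accreted near −5 °C. The riming rate and duration below are assumed values for illustration, and the linear estimate ignores the feedback whereby the splinters themselves grow and rime, so this is a sketch of the reasoning rather than the paper's actual calculation.

```python
# Back-of-the-envelope sketch of Hallett-Mossop (HM) secondary ice production.
# The HM yield (~350 splinters per mg of rime accreted near -5 degC) is the standard
# laboratory value (Hallett & Mossop, 1974); the riming rate and duration are assumed.

SPLINTERS_PER_MG_RIME = 350.0   # HM splinter yield near -5 degC

def hm_ice_production(rime_rate_mg_per_litre_per_s: float, duration_s: float,
                      primary_ice_per_litre: float) -> float:
    """Total ice concentration (per litre) after simple linear HM multiplication,
    given a riming rate per unit volume of cloud and an elapsed time."""
    secondary = SPLINTERS_PER_MG_RIME * rime_rate_mg_per_litre_per_s * duration_s
    return primary_ice_per_litre + secondary

# Assumed illustrative values: a modest riming rate acting for ~20 minutes on a cloud
# seeded with the observed ~0.01 per litre of primary ice.
total = hm_ice_production(rime_rate_mg_per_litre_per_s=2.5e-4,
                          duration_s=20 * 60,
                          primary_ice_per_litre=0.01)
print(f"ice concentration after HM multiplication: ~{total:.0f} per litre")
```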

Relevance: 100.00%

Abstract:

The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution are not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change arise mainly through changes in the statistics of regional weather variations, the scientific and computational requirements to predict its behavior reliably are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenges of developing the capabilities to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities with computing capability at each facility of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have sufficient scientific workforce to develop and maintain the software and data analysis infrastructure. Such facilities will make it possible to investigate what horizontal and vertical resolution in atmospheric and ocean models is necessary for more confident predictions at regional and local levels; current limits on computing power place severe constraints on such an investigation, which is now badly needed. These facilities will also provide the world's scientists with computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure, including hardware, software, and data analysis support, and the scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it. It will ultimately enable the climate community to provide society with climate predictions that are based on our best knowledge of science and the most advanced technology.
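
To give a sense of the arithmetic behind the petaflop and exaflop figures quoted above, the sketch below applies the common rule of thumb that refining the horizontal grid spacing by a factor n multiplies model cost by roughly n cubed (two horizontal dimensions plus a shorter time step). The baseline resolution, baseline sustained performance and scaling exponent are assumptions for illustration, not figures from the paper.

```python
# Hedged back-of-the-envelope sketch of why km-scale global models drive large compute
# requirements. Baseline grid spacing, baseline sustained performance and the scaling
# exponent are illustrative assumptions.

def scaling_factor(dx_from_km: float, dx_to_km: float, exponent: float = 3.0) -> float:
    """Approximate cost multiplier when refining horizontal grid spacing."""
    return (dx_from_km / dx_to_km) ** exponent

baseline_dx_km = 25.0        # assumed: a typical global climate model grid spacing
baseline_petaflops = 0.02    # assumed: sustained compute for such a model, in petaflops

for target_dx in (10.0, 5.0, 1.0):
    factor = scaling_factor(baseline_dx_km, target_dx)
    print(f"{baseline_dx_km:.0f} km -> {target_dx:.0f} km: "
          f"~{factor:,.0f}x cost, ~{baseline_petaflops * factor:,.1f} petaflops sustained")
```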