109 results for Technological developments
in CentAUR: Central Archive, University of Reading - UK
Abstract:
Recent developments in the fields of veterinary epidemiology and economics are critically reviewed and assessed. The impacts of recent technological developments in diagnosis, genetic characterisation, data processing and statistical analysis are evaluated. It is concluded that the acquisition and availability of data remains the principal constraint to the application of available techniques in veterinary epidemiology and economics, especially at population level. As more commercial producers use computerised management systems, the availability of data for analysis within herds is improving. However, consistency of recording and diagnosis remains problematic. Recent trends to the development of national livestock databases intended to provide reassurance to consumers of the safety and traceability of livestock products are potentially valuable sources of data that could lead to much more effective application of veterinary epidemiology and economics. These opportunities will be greatly enhanced if data from different sources, such as movement recording, official animal health programmes, quality assurance schemes, production recording and breed societies can be integrated. However, in order to realise such integrated databases, it will be necessary to provide absolute control of user access to guarantee data security and confidentiality. The potential applications of integrated livestock databases in analysis, modelling, decision-support, and providing management information for veterinary services and livestock producers are discussed. (c) 2004 Elsevier B.V. All rights reserved.
Abstract:
This article reviews current technological developments, particularly Peer-to-Peer technologies and Distributed Data Systems, and their value to community memory projects, particularly those concerned with the preservation of the cultural, literary and administrative data of cultures which have suffered genocide or are at risk of genocide. It draws attention to the comparatively good representation online of genocide denial groups and changes in the technological strategies of Holocaust denial and other far-right groups. It draws on the author's work in providing IT support for a UK-based Non-Governmental Organization providing support for survivors of genocide in Rwanda.
Abstract:
This paper reports the proceedings of a conference held at Reading University in 1993 which addressed the issues of new technological developments at the regional and sub-regional levels in Britain and France. These new technological clusters - the 'Technopoles' - are investigated in a series of papers, in both English and French, which examine their spatial, sectoral and economic aspects to determine what lessons can be learned from their development and what their future economic significance is likely to be. Two recurring themes are of particular significance in the papers - the link between R&D and regional development, and the different forms which innovation assumes within the various technopoles under scrutiny.
Abstract:
Requirements for research, practices and policies affecting soil management in relation to global food security are reviewed. Managing soil organic carbon (C) is central because soil organic matter influences numerous soil properties relevant to ecosystem functioning and crop growth. Even small changes in total C content can have disproportionately large impacts on key soil physical properties. Practices to encourage maintenance of soil C are important for ensuring sustainability of all soil functions. Soil is a major store of C within the biosphere – increases or decreases in this large stock can either mitigate or worsen climate change. Deforestation, conversion of grasslands to arable cropping and drainage of wetlands all cause emission of C; policies and international action to minimise these changes are urgently required. Sequestration of C in soil can contribute to climate change mitigation but the real impact of different options is often misunderstood. Some changes in management that are beneficial for soil C increase emissions of nitrous oxide (a powerful greenhouse gas), thus cancelling the benefit. Research on soil physical processes and their interactions with roots can lead to improved and novel practices to improve crop access to water and nutrients. Increased understanding of root function has implications for selection and breeding of crops to maximise capture of water and nutrients. Roots are also a means of delivering natural plant-produced chemicals into soil with potentially beneficial impacts. These include biocontrol of soil-borne pests and diseases and inhibition of the nitrification process in soil (conversion of ammonium to nitrate), with possible benefits for improved nitrogen use efficiency and decreased nitrous oxide emission. The application of molecular methods to studies of soil organisms, and their interactions with roots, is providing new understanding of soil ecology and the basis for novel practical applications.
Policy makers and those concerned with development of management approaches need to keep a watching brief on emerging possibilities from this fast-moving area of science. Nutrient management is a key challenge for global food production: there is an urgent need to increase nutrient availability to crops grown by smallholder farmers in developing countries. Many changes in practices including inter-cropping, inclusion of nitrogen-fixing crops, agroforestry and improved recycling have been clearly demonstrated to be beneficial: facilitating policies and practical strategies are needed to make these widely available, taking account of local economic and social conditions. In the longer term fertilizers will be essential for food security: policies and actions are needed to make these available and affordable to small farmers. In developed regions, and those developing rapidly such as China, strategies and policies to manage more precisely the necessarily large flows of nutrients in ways that minimise environmental damage are essential. A specific issue is to minimise emissions of nitrous oxide whilst ensuring sufficient nitrogen is available for adequate food production. Application of known strategies (through either regulation or education), technological developments, and continued research to improve understanding of basic processes will all play a part. Decreasing soil erosion is essential, both to maintain the soil resource and to minimise downstream damage such as sedimentation of rivers with adverse impacts on fisheries. Practical strategies are well known but often have financial implications for farmers. Examples of systems for paying one group of land users for ecosystem services affecting others exist in several parts of the world and serve as a model.
Abstract:
Human ICT implants, such as RFID implants, cochlear implants, cardiac pacemakers, Deep Brain Stimulation, bionic limbs connected to the nervous system, and networked cognitive prostheses, are becoming increasingly complex. With ever-growing data processing functionalities in these implants, privacy and security become vital concerns. Electronic attacks on human ICT implants can cause significant harm, both to implant subjects and to their environment. This paper explores the vulnerabilities which human implants pose to crime victimisation in light of recent technological developments, and analyses how the law can deal with emerging challenges of what may well become the next generation of cybercrime: attacks targeted at technology implanted in the human body. After a state-of-the-art description of relevant types of human implants and a discussion of how these implants challenge existing perceptions of the human body, we describe how various modes of attacks, such as sniffing, hacking, data interference, and denial of service, can be committed against implants. Subsequently, we analyse how these attacks can be assessed under current substantive and procedural criminal law, drawing on examples from UK and Dutch law. The possibilities and limitations of cybercrime provisions (e.g., unlawful access, system interference) and bodily integrity provisions (e.g., battery, assault, causing bodily harm) to deal with human-implant attacks are analysed. Based on this assessment, the paper concludes that attacks on human implants are not only a new generation in the evolution of cybercrime, but also raise fundamental questions on how criminal law conceives of attacks. Traditional distinctions between physical and non-physical modes of attack, between human bodies and things, and between the exterior and interior of the body need to be re-interpreted in light of developments in human implants.
As the human body and technology become increasingly intertwined, cybercrime legislation and body-integrity crime legislation will also become intertwined, posing a new puzzle that legislators and practitioners will sooner or later have to solve.
Abstract:
With advances in technology, terahertz imaging and spectroscopy are beginning to move out of the laboratory and find applications in areas as diverse as security screening, medicine, art conservation and field archaeology. Nevertheless, there is still a need to improve upon the performance of existing terahertz systems to achieve greater compactness and robustness, enhanced spatial resolution, more rapid data acquisition times and operation at greater standoff distances. This chapter will review recent technological developments in this direction that make use of nanostructures in the generation, detection and manipulation of terahertz radiation. The chapter will also explain how terahertz spectroscopy can be used as a tool to characterize the ultrafast carrier dynamics of nanomaterials.
Abstract:
Oxford University Press’s response to technological change in printing and publishing processes in this period can be considered in three phases: an initial period when the computerization of typesetting was seen as offering both cost savings and the ability to produce new editions of existing works more quickly; an intermediate phase when the emergence of standards in desktop computing allowed experiments with the sale of software as well as packaged electronic publications; and a third phase when the availability of the world wide web as a means of distribution allowed OUP to return to publishing in its traditional areas of strength albeit in new formats. Each of these phases demonstrates a tension between a desire to develop centralized systems and expertise, and a recognition that dynamic publishing depends on distributed decision-making and innovation. Alongside these developments in production and distribution lay developments in computer support for managerial and collaborative publishing processes, often involving the same personnel and sometimes the same equipment.
Abstract:
This update on radiocarbon calibration results from the 19th International Radiocarbon Conference at Oxford in April 2006, and is essential reading for all archaeologists. The way radiocarbon dates and absolute dates relate to each other differs in three periods: back to 12,400 cal BP, radiocarbon dates can be calibrated with tree rings, and the calibration curve in this form should soon extend back to 18,000 cal BP. Between 12,400 and 26,000 cal BP the calibration curves are based on marine records, and thus are only a best estimate of atmospheric concentrations. Beyond 26,000 cal BP, dates have to be based on comparison (rather than calibration) with a variety of records. Radical variations are thus possible in this period, a highly significant caveat for the dating of middle and lower Paleolithic art, artefacts and animal and human remains.
Abstract:
Consider the statement "this project should cost X and has risk of Y". Such statements are used daily in industry as the basis for making decisions. The work reported here is part of a study aimed at providing a rational and pragmatic basis for such statements. Of particular interest are predictions made in the requirements and early phases of projects. A preliminary model has been constructed using Bayesian Belief Networks and, in support of this, a programme to collect and study data during the execution of various software development projects commenced in May 2002. The data collection programme is undertaken under the constraints of a commercial industrial regime of multiple concurrent small to medium scale software development projects. Guided by pragmatism, the work is predicated on the use of data that can be collected readily by project managers, including expert judgements, effort, elapsed times and metrics collected within each project.
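The kind of belief updating a Bayesian Belief Network performs can be illustrated with a toy two-node network. Everything below is a hypothetical, minimal sketch of the technique in general, not the authors' actual model, variables or probabilities:

```python
# Toy two-node Bayesian Belief Network: requirements quality -> cost overrun.
# All states and probabilities are invented for illustration.

# Prior over the root node, P(Requirements quality)
p_req = {"good": 0.7, "poor": 0.3}

# Conditional probability table, P(Cost overrun | Requirements quality)
p_overrun_given_req = {
    "good": {"yes": 0.2, "no": 0.8},
    "poor": {"yes": 0.6, "no": 0.4},
}

def p_overrun(overrun: str) -> float:
    """Marginalise out requirements quality: P(O) = sum_r P(O|r) * P(r)."""
    return sum(p_overrun_given_req[r][overrun] * p_req[r] for r in p_req)

def p_req_given_overrun(req: str, overrun: str) -> float:
    """Bayes' rule: P(R|O) = P(O|R) * P(R) / P(O)."""
    return p_overrun_given_req[req][overrun] * p_req[req] / p_overrun(overrun)

overrun_risk = p_overrun("yes")                      # marginal risk of an overrun
updated_belief = p_req_given_overrun("poor", "yes")  # posterior belief after observing an overrun
```

Observing an overrun raises the belief that requirements were poor above its prior, which is the mechanism such a model uses to refine cost/risk statements as project data arrives.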
Abstract:
During the past 15 years, a number of initiatives have been undertaken at national level to develop ocean forecasting systems operating at regional and/or global scales. The co-ordination between these efforts has been organized internationally through the Global Ocean Data Assimilation Experiment (GODAE). The French MERCATOR project is one of the leading participants in GODAE. The MERCATOR systems routinely assimilate a variety of observations such as multi-satellite altimeter data, sea-surface temperature and in situ temperature and salinity profiles, focusing on high-resolution scales of the ocean dynamics. The assimilation strategy in MERCATOR is based on a hierarchy of methods of increasing sophistication including optimal interpolation, Kalman filtering and variational methods, which are progressively deployed through the Système d'Assimilation MERCATOR (SAM) series. SAM-1 is based on a reduced-order optimal interpolation which can be operated using 'altimetry-only' or 'multi-data' set-ups; it relies on the concept of separability, assuming that the correlations can be separated into a product of horizontal and vertical contributions. The second release, SAM-2, is being developed to include new features from the singular evolutive extended Kalman (SEEK) filter, such as three-dimensional, multivariate error modes and adaptivity schemes. The third one, SAM-3, considers variational methods such as the incremental four-dimensional variational algorithm. Most operational forecasting systems evaluated during GODAE are based on least-squares statistical estimation assuming Gaussian errors. In the framework of the EU MERSEA (Marine EnviRonment and Security for the European Area) project, research is being conducted to prepare the next-generation operational ocean monitoring and forecasting systems. The research effort will explore nonlinear assimilation formulations to overcome limitations of the current systems.
This paper provides an overview of the developments conducted in MERSEA with the SEEK filter, the Ensemble Kalman filter and the sequential importance re-sampling filter.
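The least-squares estimation step shared by optimal interpolation and Kalman filtering can be sketched for a single scalar state. The numbers below are invented for illustration and bear no relation to the operational SAM systems, which apply the same idea to enormous multivariate state vectors:

```python
# Scalar analysis step of the Kalman filter / optimal interpolation,
# assuming Gaussian background and observation errors.

def kalman_update(xb: float, pb: float, y: float, r: float):
    """Combine a background estimate xb (error variance pb) with an
    observation y (error variance r); return the analysis and its
    reduced error variance."""
    k = pb / (pb + r)        # gain: relative weight given to the observation
    xa = xb + k * (y - xb)   # analysis = background + gain * innovation
    pa = (1.0 - k) * pb      # analysis error variance (always <= pb)
    return xa, pa

# Hypothetical example: a background SST of 15.0 degC (variance 1.0)
# corrected by an observation of 16.0 degC (variance 0.5).
xa, pa = kalman_update(15.0, 1.0, 16.0, 0.5)
```

The analysis lands between background and observation, weighted by their error variances, and its variance is smaller than either input's — the property that makes sequential assimilation of new observations worthwhile.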
Abstract:
Snow properties have been retrieved from satellite data for many decades. While snow extent is generally felt to be obtained reliably from visible-band data, there is less confidence in the measurements of snow mass or water equivalent derived from passive microwave instruments. This paper briefly reviews historical passive microwave instruments and products, and compares the large-scale patterns from these sources to those of general circulation models and leading reanalysis products. Differences are seen to be large between the datasets, particularly over Siberia. A better understanding of the errors in both the model-based and measurement-based datasets is required to exploit both fully. Techniques to apply to the satellite measurements for improved large-scale snow data are suggested.
Abstract:
Vibrational spectroscopy at high excitation is an important research frontier for two reasons. Firstly, the near infrared is proving to be an important area for the analytical applications of spectroscopy, and we would therefore like to understand how the spectra we observe relate to the molecular structure of the absorbing species. Secondly, there is a fundamental interest in understanding molecular dynamics and energy flow within a polyatomic molecule at high excitation, because this is the boundary between spectroscopy and chemistry through which we try to understand the details of a chemical reaction. In this presentation I shall survey recent progress in this field.
Abstract:
The measurement of the impact of technical change has received significant attention within the economics literature. One popular method of quantifying the impact of technical change is the use of growth accounting index numbers. However, in a recent article Nelson and Pack (1999) criticise the use of such index numbers in situations where technical change is likely to be biased in favour of one or other inputs. In particular they criticise the common approach of applying observed cost shares, as proxies for partial output elasticities, to weight the change in quantities, which they claim is only valid under Hicks neutrality. Recent advances in the measurement of product and factor biases of technical change developed by Balcombe et al. (2000) provide a relatively straightforward means of correcting product and factor shares in the face of biased technical progress. This paper demonstrates the correction of both revenue and cost shares used in the construction of a TFP index for UK agriculture over the period 1953 to 2000, using both revenue and cost function share equations appended with stochastic latent variables to capture the bias effect. Technical progress is shown to be biased between both individual input and output groups. Output and input quantity aggregates are then constructed using both observed and corrected share weights and the resulting TFPs are compared. There does appear to be some significant bias in TFP if the effect of biased technical progress is not taken into account when constructing the weights.
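A growth-accounting index of the kind discussed can be sketched as a Törnqvist index in a few lines. The quantities and shares below are hypothetical; the paper's correction would amount to substituting bias-adjusted shares for the observed ones used here:

```python
# Minimal sketch of a Tornqvist TFP index: log TFP growth is the share-weighted
# log growth of outputs minus that of inputs. All data are invented.
import math

def tornqvist_growth(q0, q1, s0, s1):
    """Log growth of a Tornqvist quantity aggregate between periods 0 and 1:
    sum over items of the average share times the log quantity ratio."""
    return sum(0.5 * (a + b) * math.log(y / x)
               for x, y, a, b in zip(q0, q1, s0, s1))

# Two outputs (revenue shares) and two inputs (cost shares), period 0 -> 1
out_growth = tornqvist_growth([100.0, 50.0], [104.0, 51.0], [0.6, 0.4], [0.6, 0.4])
in_growth = tornqvist_growth([80.0, 40.0], [81.0, 40.0], [0.5, 0.5], [0.5, 0.5])

tfp_growth = out_growth - in_growth  # log TFP growth
tfp_index = math.exp(tfp_growth)     # TFP index relative to period 0
```

If technical change is biased, the observed shares misstate the true output elasticities, so the weights (and hence `tfp_index`) are distorted — which is precisely the effect the corrected shares are meant to remove.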