27 results for BEST AVAILABLE TECHNOLOGY
in CentAUR: Central Archive at the University of Reading, UK
Abstract:
At present, there is much anxiety regarding the security of energy supplies; for example, the UK and other European states are set to become increasingly dependent upon imports of natural gas from states with which political relations are often strained. These uncertainties are felt acutely by the electricity generating sector, which is facing major challenges regarding the choice of fuel mix in the years ahead. Nuclear energy may provide an alternative; however, in the UK, progress in replacing the first-generation reactors is exceedingly slow. A number of operators are looking to coal as a means of plugging the energy gap. However, in the light of ever more stringent legal controls on emissions, this step cannot be taken without the adoption of sophisticated pollution abatement technology. This article examines the role which legal concepts such as Best Available Techniques (BAT) must play in bringing about these changes.
Abstract:
The cephalochordate amphioxus is the best available proxy for the last common invertebrate ancestor of the vertebrates. During the last decade, the developmental genetics of amphioxus has been extensively examined for insights into the evolutionary origin and early evolution of the vertebrates. Comparisons between expression domains of homologous genes in amphioxus and vertebrates have strengthened proposed homologies between specific body parts. Molecular genetic studies have also highlighted parallels in the developmental mechanisms of amphioxus and vertebrates. In both groups, a similar nested pattern of Hox gene expression is involved in rostrocaudal patterning of the neural tube, and homologous genes also appear to be involved in dorsoventral neural patterning. Studies of amphioxus molecular biology have also hinted that the protochordate ancestor of the vertebrates included cell populations that modified their developmental genetic pathways during early vertebrate evolution to yield definitive neural crest and neurogenic placodes. We also discuss how the application of expressed sequence tag and gene-mapping approaches to amphioxus has combined with developmental studies to advance our understanding of chordate genome evolution. We conclude by considering the potential offered by the sequencing of the amphioxus genome, which was completed in late 2004.
Abstract:
The interface between humans and technology is a rapidly changing field. In particular, as technological methods have improved dramatically, forms of interaction have become possible that could only be speculated about even a decade earlier. This interaction can, though, take on a wide range of forms: standard buttons and dials with televisual feedback are perhaps the most common example, but virtual reality systems, wearable computers and, above all, implant technology are now throwing up a completely new concept, namely a symbiosis of human and machine. No longer is it sensible simply to consider how a human interacts with a machine, but rather how the human-machine symbiotic combination interacts with the outside world. In this paper we take a look at some of the recent approaches, putting implant technology in context. We also consider some specific practical examples which may well alter the way we look at this symbiosis in the future. The main area of interest as far as symbiotic studies are concerned is clearly the use of implant technology, particularly where a connection is made between technology and the human brain and/or nervous system. Pilot tests and experimentation have often been carried out a priori, before human subjects are themselves involved, to investigate the eventual possibilities; some of the more pertinent animal studies are discussed briefly here. The paper, however, concentrates on human experimentation, in particular that carried out by the authors themselves, firstly to indicate what possibilities exist now with available technology, but more importantly to show what might be possible with such technology in the future and how this may well have extensive social effects. The driving force behind the integration of technology with humans on a neural level has historically been to restore lost functionality in individuals who have suffered neurological trauma such as spinal cord damage, or who suffer from a debilitating disease such as amyotrophic lateral sclerosis. Very few would argue against the development of implants to enable such people to control their environment, or some aspect of their own body functions. Indeed, in the short term this technology has applications for the amelioration of symptoms in the physically impaired, such as alternative senses being bestowed on a blind or deaf individual. However, the issue becomes distinctly more complex when it is proposed that such technology be used on those with no medical need, who instead wish to enhance and augment their own bodies, particularly in terms of their mental attributes. These issues are discussed here in the light of practical experimental test results and their ethical consequences.
Abstract:
Vertebral compression fractures are a common clinical problem, and their incidence will increase as the population ages. Traditionally, management has been conservative; however, there has been a growing trend towards vertebroplasty as an alternative therapy in patients with persisting severe pain. NICE produced guidance in 2003 recommending the procedure after 4 weeks of conservative management. Recent high-quality studies have been contradictory, and there is currently a debate surrounding the role of the procedure, with no agreement in the literature. We examine the evidence in both osteoporotic and malignant vertebral compression fractures; we also describe the benefits and side effects, alternative treatment options and the cost of the procedure. Finally, we recommend when vertebroplasty is most appropriately used, based on the best available evidence.
Abstract:
Summer rainfall over China has experienced substantial variability on longer time scales during the last century, and the question remains whether this is due to natural, internal variability or is part of the emerging signal of anthropogenic climate change. Using the best available observations over China, the decadal variability and recent trends in summer rainfall are investigated with the emphasis on changes in the seasonal evolution and on the temporal characteristics of daily rainfall. The possible relationships with global warming are reassessed. Substantial decadal variability in summer rainfall has been confirmed during the period 1958–2008; this is not unique to this period but is also seen in the earlier decades of the twentieth century. Two dominant patterns of decadal variability have been identified that contribute substantially to the recent trend of southern flooding and northern drought. Natural decadal variability appears to dominate in general, but for rainfall intensity and the frequency of rainfall days, particularly light-rain days, the dominant empirical orthogonal functions (EOFs) have a rather different character, being of one sign over most of China and having principal components (PCs) that appear more trend-like. The increasing intensity of rainfall throughout China and the decrease in light-rain days, particularly in the north, could at least partially be of anthropogenic origin, both global and regional, linked to increased greenhouse gases and increased aerosols.
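Since the analysis turns on EOFs and their PCs, a minimal sketch may help; it uses synthetic data in place of the paper's rainfall observations, and the grid size, record length and variable names are illustrative assumptions only:

```python
# A minimal sketch of EOF analysis via SVD (synthetic stand-in data).
# EOFs are eigenvectors of the spatial covariance matrix; PCs are the
# projections of each year's anomaly field onto those patterns.
import numpy as np

rng = np.random.default_rng(0)
years, gridpoints = 51, 200            # e.g. 1958-2008 on a coarse grid (assumed)
rain = rng.standard_normal((years, gridpoints))

anom = rain - rain.mean(axis=0)        # remove the climatology at each point
u, s, vt = np.linalg.svd(anom, full_matrices=False)
eofs = vt                              # rows: spatial patterns (EOFs)
pcs = u * s                            # columns: time series of each pattern (PCs)
explained = s**2 / np.sum(s**2)        # fraction of variance per EOF
print(f"EOF1 explains {explained[0]:.1%} of the variance")
```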
Abstract:
In response to evidence of insect pollinator declines, organisations in many sectors, including the food and farming industry, are investing in pollinator conservation. They are keen to ensure that their efforts use the best available science. We convened a group of 32 ‘conservation practitioners’ with an active interest in pollinators and 16 insect pollinator scientists. The conservation practitioners include representatives from UK industry (including retail), environmental non-government organisations and nature conservation agencies. We collaboratively developed a long list of 246 knowledge needs relating to conservation of wild insect pollinators in the UK. We refined and selected the most important knowledge needs, through a three-stage process of voting and scoring, including discussions of each need at a workshop. We present the top 35 knowledge needs as scored by conservation practitioners or scientists. We find general agreement in priorities identified by these two groups. The priority knowledge needs will structure ongoing work to make science accessible to practitioners, and help to guide future science policy and funding. Understanding the economic benefits of crop pollination, basic pollinator ecology and impacts of pesticides on wild pollinators emerge strongly as priorities, as well as a need to monitor floral resources in the landscape.
Abstract:
This paper aims to develop a mathematical model based on semi-group theory which makes it possible to improve quality of service (QoS), including reducing the carbon footprint, in the pervasive environment of a Mobile Virtual Network Operator (MVNO). The paper generalises an interrelationship Machine-to-Machine (M2M) mathematical model based on semi-group theory, and demonstrates that, using available technology and a solid mathematical model, it is possible to streamline the relationships between building agents and to control pervasive spaces so as to reduce the carbon footprint through the reduction of greenhouse gas (GHG) emissions.
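The abstract does not give the model itself; purely to illustrate the underlying semi-group idea, the sketch below treats M2M agent actions as state transformations whose composition is associative (the defining semi-group property). All action names and numbers are hypothetical:

```python
# Illustrative sketch: M2M actions as a semigroup under function composition.
from functools import reduce

def compose(f, g):
    """Semigroup operation: sequential composition of two actions."""
    return lambda state: g(f(state))

# Hypothetical building-agent actions acting on a power-draw state (kW).
dim_lights   = lambda kw: kw - 0.4
idle_hvac    = lambda kw: kw * 0.7
sleep_radios = lambda kw: kw - 0.1

actions = [dim_lights, idle_hvac, sleep_radios]
schedule = reduce(compose, actions)     # one combined action
print(schedule(10.0))                   # power after applying all actions in order

# Associativity check on a sample state: (a∘b)∘c == a∘(b∘c).
a, b, c = actions
assert compose(compose(a, b), c)(10.0) == compose(a, compose(b, c))(10.0)
```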
Abstract:
Numerical climate models constitute the best available tools to tackle the problem of climate prediction. Two assumptions lie at the heart of their suitability: (1) a climate attractor exists, and (2) the numerical climate model's attractor lies on the actual climate attractor, or at least on the projection of the climate attractor on the model's phase space. In this contribution, the Lorenz '63 system is used both as a prototype system and as an imperfect model to investigate the implications of the second assumption. By comparing results drawn from the Lorenz '63 system and from numerical weather and climate models, the implications of using imperfect models for the prediction of weather and climate are discussed. It is shown that the imperfect model's orbit and the system's orbit are essentially different, purely due to model error and not to sensitivity to initial conditions. Furthermore, if a model is a perfect model, then the attractor, reconstructed by sampling a collection of initialised model orbits (forecast orbits), will be invariant to forecast lead time. This conclusion provides an alternative method for the assessment of climate models.
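The Lorenz '63 equations themselves are standard (σ=10, r=28, b=8/3). As a minimal sketch of the "imperfect model" idea, the snippet below integrates the system alongside a version with a perturbed r from the same initial condition, so any separation is due to model error alone; the perturbed value and integration settings are assumptions, not the paper's setup:

```python
# Lorenz '63 "system" vs a deliberately imperfect "model" (perturbed r).
import numpy as np
from scipy.integrate import solve_ivp

def lorenz63(t, xyz, sigma=10.0, r=28.0, b=8.0 / 3.0):
    x, y, z = xyz
    return [sigma * (y - x), r * x - y - x * z, x * y - b * z]

x0 = [1.0, 1.0, 1.0]
t_span, t_eval = (0.0, 20.0), np.linspace(0.0, 20.0, 2000)
truth = solve_ivp(lorenz63, t_span, x0, t_eval=t_eval)
model = solve_ivp(lorenz63, t_span, x0, t_eval=t_eval,
                  args=(10.0, 27.5, 8.0 / 3.0))   # assumed model error in r

err = np.linalg.norm(truth.y - model.y, axis=0)   # orbit separation over time
print(f"separation after 20 time units: {err[-1]:.2f}")
```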
Abstract:
Observations by the EISCAT experiments “POLAR” and Common Programme CP-3 reveal non-Maxwellian ion velocity distributions in the auroral F-region ionosphere. Analysis of data from three periods is presented. During the first period, convection velocities are large (≈2 km s⁻¹) and constant over part of a CP-3 latitude scan; the second period is one of POLAR data containing a short-lived (<1 min) burst of rapid (>1.5 km s⁻¹) flow. We concentrate on these two periods as they allow the study of a great many features of the ion-neutral interactions which drive the plasma into a non-thermal state, and they provide the best available experimental test for models of the 3-dimensional ion velocity distribution function. The third period is included to illustrate the fact that non-thermal plasma frequently exists in the auroral ionosphere: the data, also from the POLAR experiment, cover a three-hour period of typical auroral-zone flow, and analysis reveals that the ion distribution varies from Maxwellian to the threshold of a toroidal form.
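For readers unfamiliar with the "toroidal" form, the snippet below evaluates one common idealisation: a Maxwellian displaced to a ring speed in the plane perpendicular to the magnetic field, so the distribution peaks off-axis in perpendicular velocity. This is an illustrative form only, not the model tested in the paper, and every parameter value is an assumption:

```python
# Illustrative toroidal ion velocity distribution (unnormalised).
import numpy as np

def toroidal(v_par, v_perp, v_ring=1.5e3, w_par=1.0e3, w_perp=1.2e3):
    """Maxwellian along B, ring of speed v_ring across B; speeds in m/s."""
    return (np.exp(-(v_par / w_par) ** 2)
            * np.exp(-((v_perp - v_ring) / w_perp) ** 2))

for vp in np.linspace(0.0, 4.0e3, 5):
    # Peaks near v_perp = v_ring rather than at v_perp = 0 (the Maxwellian case).
    print(f"v_perp={vp:6.0f} m/s  f={toroidal(0.0, vp):.3f}")
```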
Abstract:
Algorithms for computer-aided diagnosis of dementia based on structural MRI have demonstrated high performance in the literature, but are difficult to compare as different data sets and methodology were used for evaluation. In addition, it is unclear how the algorithms would perform on previously unseen data, and thus, how they would perform in clinical practice when there is no real opportunity to adapt the algorithm to the data at hand. To address these comparability, generalizability and clinical applicability issues, we organized a grand challenge that aimed to objectively compare algorithms based on a clinically representative multi-center data set. Using clinical practice as the starting point, the goal was to reproduce the clinical diagnosis. Therefore, we evaluated algorithms for multi-class classification of three diagnostic groups: patients with probable Alzheimer's disease, patients with mild cognitive impairment and healthy controls. The diagnosis based on clinical criteria was used as reference standard, as it was the best available reference despite its known limitations. For evaluation, a previously unseen test set was used consisting of 354 T1-weighted MRI scans with the diagnoses blinded. Fifteen research teams participated with a total of 29 algorithms. The algorithms were trained on a small training set (n = 30) and optionally on data from other sources (e.g., the Alzheimer's Disease Neuroimaging Initiative, the Australian Imaging Biomarkers and Lifestyle flagship study of aging). The best performing algorithm yielded an accuracy of 63.0% and an area under the receiver-operating-characteristic curve (AUC) of 78.8%. In general, the best performances were achieved using feature extraction based on voxel-based morphometry or a combination of features that included volume, cortical thickness, shape and intensity. The challenge is open for new submissions via the web-based framework: http://caddementia.grand-challenge.org.
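The challenge's headline metrics, three-class accuracy and a multi-class AUC, can be reproduced in outline with scikit-learn; the sketch below uses random stand-in labels and class probabilities (the real scoring ran on the blinded 354-scan test set), so all data and the 0/1/2 label coding are placeholders:

```python
# Sketch of multi-class evaluation: accuracy and one-vs-rest AUC.
import numpy as np
from sklearn.metrics import accuracy_score, roc_auc_score

rng = np.random.default_rng(1)
n = 354                                     # size of the blinded test set
y_true = rng.integers(0, 3, size=n)         # 0=AD, 1=MCI, 2=control (placeholder)
proba = rng.dirichlet(np.ones(3), size=n)   # stand-in class probabilities
y_pred = proba.argmax(axis=1)

acc = accuracy_score(y_true, y_pred)
auc = roc_auc_score(y_true, proba, multi_class="ovr")  # one-vs-rest averaging
print(f"accuracy={acc:.3f}  AUC={auc:.3f}")
```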
Abstract:
This paper investigates the feasibility of using approximate Bayesian computation (ABC) to calibrate and evaluate complex individual-based models (IBMs). As ABC evolves, various versions are emerging, but here we only explore the most accessible version, rejection-ABC. Rejection-ABC involves running models a large number of times, with parameters drawn randomly from their prior distributions, and then retaining the simulations closest to the observations. Although well-established in some fields, whether ABC will work with ecological IBMs is still uncertain. Rejection-ABC was applied to an existing 14-parameter earthworm energy budget IBM for which the available data consist of body mass growth and cocoon production in four experiments. ABC was able to narrow the posterior distributions of seven parameters, estimating credible intervals for each. ABC’s accepted values produced slightly better fits than literature values do. The accuracy of the analysis was assessed using cross-validation and coverage, currently the best available tests. Of the seven unnarrowed parameters, ABC revealed that three were correlated with other parameters, while the remaining four were found to be not estimable given the data available. It is often desirable to compare models to see whether all component modules are necessary. Here we used ABC model selection to compare the full model with a simplified version which removed the earthworm’s movement and much of the energy budget. We are able to show that inclusion of the energy budget is necessary for a good fit to the data. We show how our methodology can inform future modelling cycles, and briefly discuss how more advanced versions of ABC may be applicable to IBMs. We conclude that ABC has the potential to represent uncertainty in model structure, parameters and predictions, and to embed the often complex process of optimizing an IBM’s structure and parameters within an established statistical framework, thereby making the process more transparent and objective.
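Rejection-ABC as described here, drawing parameters from the prior, simulating, and retaining the runs closest to the observations, is simple enough to sketch directly. The toy one-parameter simulator below stands in for the 14-parameter earthworm IBM, and the prior, distance measure and acceptance fraction are illustrative assumptions:

```python
# Minimal rejection-ABC sketch with a toy simulator.
import numpy as np

rng = np.random.default_rng(42)
observed = 5.0                                # toy "observation" (e.g. final mass)

def simulate(theta):
    """Toy stochastic simulator standing in for the IBM."""
    return theta + rng.normal(0.0, 0.5)

n_draws, keep_frac = 10_000, 0.01
theta = rng.uniform(0.0, 10.0, size=n_draws)  # draws from an assumed flat prior
dist = np.abs(np.array([simulate(t) for t in theta]) - observed)

# Keep the simulations closest to the observation; these approximate the posterior.
accepted = theta[np.argsort(dist)[: int(n_draws * keep_frac)]]
lo, hi = np.percentile(accepted, [2.5, 97.5])
print(f"posterior mean {accepted.mean():.2f}, 95% credible interval ({lo:.2f}, {hi:.2f})")
```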
Abstract:
The technology for site-specific applications of nitrogen (N) fertilizer has exposed a gap in our knowledge about the spatial variation of soil mineral N, and that which will become available during the growing season within arable fields. Spring mineral N and potentially available N were measured in an arable field together with gravimetric water content, loss on ignition, crop yield, percentages of sand, silt, and clay, and elevation to describe their spatial variation geostatistically. The areas with a larger clay content had larger values of mineral N, potentially available N, loss on ignition and gravimetric water content, and the converse was true for the areas with more sandy soil. The results suggest that the spatial relations between mineral N and loss on ignition, gravimetric water content, soil texture, elevation and crop yield, and between potentially available N and loss on ignition and silt content could be used to indicate their spatial patterns. Variable-rate nitrogen fertilizer application would be feasible in this field because of the spatial structure and the magnitude of variation of mineral N and potentially available N.
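A geostatistical description of this kind typically starts from the empirical semivariogram, which summarises how dissimilarity between measurements grows with their separation. The sketch below computes one for synthetic transect data standing in for the field measurements; the coordinates, values and lag classes are all assumed for illustration:

```python
# Empirical semivariogram sketch on synthetic soil mineral-N data.
import numpy as np

rng = np.random.default_rng(7)
x = np.sort(rng.uniform(0, 500, 120))                 # sample positions (m), assumed
mineral_n = 20 + 5 * np.sin(x / 80) + rng.normal(0, 1.5, x.size)  # synthetic values

d = np.abs(x[:, None] - x[None, :])                   # pairwise separations
sqdiff = (mineral_n[:, None] - mineral_n[None, :]) ** 2

for lo, hi in zip(np.arange(0, 225, 25), np.arange(25, 250, 25)):
    mask = (d > lo) & (d <= hi)                       # pairs in this lag class
    print(f"lag {lo:3d}-{hi:3d} m: semivariance {0.5 * sqdiff[mask].mean():.2f}")
```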
Abstract:
Mainframes and corporate and central servers are becoming information servers. The requirement for more powerful information servers is the best opportunity to exploit the potential of parallelism. ICL recognized the opportunity of the 'knowledge spectrum', namely to convert raw data into information and then into high-grade knowledge. Its response to this, and to the underlying search problems, was to introduce the CAFS retrieval engine. The CAFS product demonstrates that it is possible to move functionality within an established architecture, introduce a different technology mix and exploit parallelism to achieve radically new levels of performance. CAFS also demonstrates the benefit of achieving this transparently, behind existing interfaces. ICL is now working with Bull and Siemens to develop the information servers of the future by exploiting new technologies as they become available. The objective of the joint Esprit II European Declarative System project is to develop a smoothly scalable, highly parallel computer system, EDS. EDS will in the main be an SQL server and an information server. It will support the many data-intensive applications which the companies foresee; it will also support application-intensive and logic-intensive systems.