63 results for attempt to obtain disclosure
in CentAUR: Central Archive University of Reading - UK
Abstract:
In this paper an attempt is described to increase the range of human sensory capabilities by means of implant technology. The key aim is to create an additional sense by feeding signals directly to the human brain, via the nervous system, rather than via a presently operable human sense. Neural implant technology was used to directly interface a human nervous system with a computer in a one-off trial. The output from active ultrasonic sensors was then employed to directly stimulate the human nervous system. An experimental laboratory setup was used as a test bed to assess the usefulness of this sensory addition.
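A minimal sketch of the kind of mapping such a sensory addition implies: ultrasonic range readings converted into stimulation pulse rates, with closer obstacles producing faster pulse trains. The function name, parameter values and the inverse-distance encoding are assumptions for illustration, not the authors' actual stimulation protocol.

```python
# Hypothetical sketch: map ultrasonic range readings to stimulation pulse
# rates, so that nearer obstacles produce faster pulse trains. The mapping
# and all constants are illustrative assumptions.

def range_to_pulse_rate(distance_m: float,
                        max_rate_hz: float = 100.0,
                        max_range_m: float = 3.0) -> float:
    """Return a stimulation pulse rate that rises as distance shrinks."""
    d = min(max(distance_m, 0.0), max_range_m)   # clamp to sensor range
    return max_rate_hz * (1.0 - d / max_range_m)

if __name__ == "__main__":
    for d in (0.2, 1.0, 2.5, 3.0):
        print(f"{d:.1f} m -> {range_to_pulse_rate(d):.0f} Hz")
```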
Abstract:
The functional advantages of probiotics, combined with the interesting composition of oat, were considered as an alternative to dairy products. In this study, fermentation of oat milk with Lactobacillus reuteri and Streptococcus thermophilus was analysed to develop a new probiotic product. A central composite design with response surface methodology was used to analyse the effect of different factors (glucose, fructose, inulin and starters) on the probiotic population in the product. The optimised formulation was characterised throughout storage at 4 °C in terms of pH, acidity, β-glucan and oligosaccharide contents, colour and rheological behaviour. All formulations studied were adequate to produce fermented foods, and the minimum dose of each factor was considered optimum. The selected formulation allowed starter survival above 10⁷ cfu/ml, the threshold to be considered a functional food, which was maintained over the 28 days monitored. β-glucans remained in the final product with a positive effect on viscosity. Therefore, a new probiotic non-dairy milk was successfully developed in which high probiotic survival was assured throughout the typical yoghurt-like shelf life.
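To make the design step concrete, here is an illustrative sketch of fitting a quadratic response-surface model over a central composite design, reduced to two coded factors for brevity (the study used four). The design points and simulated responses are synthetic placeholders, not the study's data.

```python
# Illustrative sketch: central composite design (CCD) points plus a
# quadratic response-surface fit by least squares. Synthetic data only.
import itertools
import numpy as np

# Two coded factors (e.g. glucose, inulin) for brevity.
factorial = np.array(list(itertools.product([-1, 1], repeat=2)), float)
axial = np.array([[-1.41, 0], [1.41, 0], [0, -1.41], [0, 1.41]])
center = np.zeros((3, 2))
X = np.vstack([factorial, axial, center])      # CCD design points

rng = np.random.default_rng(0)
# synthetic response (log10 cfu/ml): known quadratic surface + noise
y = (8.0 + 0.3 * X[:, 0] - 0.2 * X[:, 0] ** 2
     - 0.1 * X[:, 1] ** 2 + rng.normal(0, 0.05, len(X)))

# quadratic RSM model terms: 1, x1, x2, x1*x2, x1^2, x2^2
A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0] * X[:, 1], X[:, 0] ** 2, X[:, 1] ** 2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("fitted RSM coefficients:", np.round(coef, 3))
```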
Abstract:
The management of information in engineering organisations faces a particular challenge in the ever-increasing volume of information. It has been recognised that an effective methodology is required to evaluate information in order to avoid information overload and to retain the right information for reuse. Using, as a starting point, a number of current tools and techniques which attempt to obtain ‘the value’ of information, it is proposed that an assessment or filter mechanism for information needs to be developed. This paper addresses this issue firstly by briefly reviewing the information overload problem, the definition of value, and related research on the value of information in various areas. Then a characteristic-based framework of information evaluation is introduced, using the key characteristics identified from related work as an example. A Bayesian network method is introduced into the framework to build the linkage between the characteristics and information value, in order to quantitatively calculate the quality and value of information. The training and verification process for the model is then described using 60 real engineering documents as a sample. The model gives reasonably accurate results, and the differences between the model calculations and the training judgements are summarised and their potential causes discussed. Finally, several further issues are raised, including challenges to the framework and the implementation of this evaluation method.
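A minimal sketch of the Bayesian-network idea: information value modelled as conditionally dependent on a few binary quality characteristics, with the probability of high value computed by enumeration. The two characteristics and all probabilities below are illustrative assumptions, not the paper's trained model.

```python
# Toy Bayesian network: value depends on two binary characteristics.
# Characteristics and probabilities are illustrative assumptions.

P_accurate = {True: 0.7, False: 0.3}   # prior on 'accurate'
P_relevant = {True: 0.6, False: 0.4}   # prior on 'relevant'

# CPT: P(value = high | accurate, relevant) -- assumed numbers
P_high_value = {(True, True): 0.9, (True, False): 0.5,
                (False, True): 0.4, (False, False): 0.1}

def p_value_high(evidence=None):
    """P(value=high | evidence), enumerating unobserved characteristics."""
    evidence = evidence or {}
    total = 0.0
    for acc in (True, False):
        for rel in (True, False):
            if "accurate" in evidence and evidence["accurate"] != acc:
                continue
            if "relevant" in evidence and evidence["relevant"] != rel:
                continue
            w = ((1.0 if "accurate" in evidence else P_accurate[acc])
                 * (1.0 if "relevant" in evidence else P_relevant[rel]))
            total += w * P_high_value[(acc, rel)]
    return total

print(p_value_high())                     # marginal: 0.602
print(p_value_high({"accurate": True}))   # with evidence: 0.74
```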
Abstract:
A new method for assessing forecast skill and predictability that involves the identification and tracking of extratropical cyclones has been developed and implemented to obtain detailed information about the prediction of cyclones that cannot be obtained from more conventional analysis methodologies. The cyclones were identified and tracked along the forecast trajectories, and statistics were generated to determine the rate at which the position and intensity of the forecasted storms diverge from the analyzed tracks as a function of forecast lead time. The results show a higher level of skill in predicting the position of extratropical cyclones than the intensity. They also show that there is potential to improve the skill in predicting the position by 1-1.5 days and the intensity by 2-3 days, via improvements to the forecast model. Further analysis shows that forecasted storms move at a slower speed than analyzed storms on average and that there is a larger error in the predicted amplitudes of intense storms than of weaker storms. The results also show that some storms can be predicted up to 3 days before they are identified as an 850-hPa vorticity center in the analyses. In general, the results show a higher level of skill in the Northern Hemisphere (NH) than in the Southern Hemisphere (SH); however, the rapid growth of NH winter storms is not very well predicted. The impact that observations of different types have on the prediction of extratropical cyclones has also been explored, using forecasts integrated from analyses that were constructed from reduced observing systems. Terrestrial, satellite, and surface-based systems were investigated, and the results showed that the predictive skill of the terrestrial system was superior to that of the satellite system in the NH. Further analysis showed that the satellite system was not very good at predicting the growth of the storms. In the SH the terrestrial system has significantly less skill than the satellite system, highlighting the dominance of satellite observations in this hemisphere. The surface system has very poor predictive skill in both hemispheres.
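A sketch of the core verification statistic: mean great-circle separation between matched forecast and analysed cyclone positions as a function of lead time. The toy tracks below stand in for real matched tracks, and the matching of forecast to analysed storms is assumed already done.

```python
# Sketch: mean position error of tracked cyclones vs forecast lead time.
import numpy as np

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between points given in degrees."""
    r = 6371.0
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = (np.sin((lat2 - lat1) / 2) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
    return 2 * r * np.arcsin(np.sqrt(a))

# toy matched tracks: rows = storms, columns = lead times (days 0..4)
an_lat = np.array([[50.0, 51.0, 52.0, 53.0, 54.0]])
an_lon = np.array([[-40.0, -35.0, -30.0, -25.0, -20.0]])
fc_lat = np.array([[50.0, 51.2, 52.5, 53.9, 55.5]])
fc_lon = np.array([[-40.0, -35.5, -31.0, -27.0, -23.0]])

pos_err = haversine_km(fc_lat, fc_lon, an_lat, an_lon).mean(axis=0)
for day, err in enumerate(pos_err):
    print(f"lead {day} d: mean position error {err:7.1f} km")
```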
Abstract:
Word sense disambiguation is the task of determining which sense of a word is intended from its context. Previous methods have found the lack of training data and the restrictiveness of dictionaries' choices of senses to be major stumbling blocks. A robust novel algorithm is presented that uses multiple dictionaries, the Internet, clustering and triangulation to attempt to discern the most useful senses of a given word and learn how they can be disambiguated. The algorithm is explained, and some promising sample results are given.
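One ingredient of such an approach can be sketched as clustering bag-of-words context vectors of a target word, so that each cluster approximates a candidate sense. The sentences and cluster count below are illustrative; the algorithm described above additionally triangulates across multiple dictionaries and web data.

```python
# Sketch: cluster occurrences of an ambiguous word ("bank") by their
# contexts; each cluster approximates one sense. Illustrative data only.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import CountVectorizer

contexts = [
    "deposited money at the bank before the loan interest rose",
    "the bank approved the mortgage and the account fees",
    "fished from the grassy bank of the river all morning",
    "the river burst its bank after heavy rain upstream",
]
X = CountVectorizer(stop_words="english").fit_transform(contexts)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)  # occurrences grouped into two candidate senses
```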
Abstract:
High resolution descriptions of plant distribution have utility for many ecological applications but are especially useful for predictive modelling of gene flow from transgenic crops. Difficulty lies in the extrapolation errors that occur when limited ground survey data are scaled up to the landscape or national level. This problem is epitomized by the wide confidence limits generated in a previous attempt to describe the national abundance of riverside Brassica rapa (a wild relative of cultivated rapeseed) across the United Kingdom. Here, we assess the value of airborne remote sensing to locate B. rapa over large areas and so reduce the need for extrapolation. We describe results from flights over the river Nene in England acquired using Airborne Thematic Mapper (ATM) and Compact Airborne Spectrographic Imager (CASI) imagery, together with ground truth data. It proved possible to detect 97% of flowering B. rapa on the basis of spectral profiles. This included all stands of plants that occupied >2 m square (>5 plants), which were detected using single-pixel classification. It also included very small populations (<5 flowering plants, 1-2 m square) that generated mixed pixels, which were detected using spectral unmixing. The high detection accuracy for flowering B. rapa was coupled with a rather large false positive rate (43%). The latter could be reduced by using the image detections to target fieldwork to confirm species identity, or by acquiring additional remote sensing data such as laser altimetry or multitemporal imagery.
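A minimal sketch of linear spectral unmixing, the sub-pixel technique mentioned above: solve for non-negative endmember fractions that best reproduce a mixed pixel's spectrum. The endmember spectra and the pixel are made-up numbers, not ATM or CASI measurements.

```python
# Sketch: linear spectral unmixing of one mixed pixel via non-negative
# least squares. Spectra are invented four-band reflectances.
import numpy as np
from scipy.optimize import nnls

# columns = endmember spectra (flowering B. rapa, bare soil, water)
E = np.array([[0.45, 0.30, 0.05],
              [0.60, 0.35, 0.04],
              [0.55, 0.40, 0.03],
              [0.30, 0.25, 0.02]])
# a pixel built as a known mixture, to check the recovery
pixel = 0.25 * E[:, 0] + 0.70 * E[:, 1] + 0.05 * E[:, 2]

fractions, residual = nnls(E, pixel)
fractions /= fractions.sum()      # normalise to sum-to-one abundances
print(np.round(fractions, 2))     # ~[0.25, 0.70, 0.05]
```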
Abstract:
Under global warming, the predicted intensification of the global freshwater cycle will modify the net freshwater flux at the ocean surface. Since the freshwater flux maintains ocean salinity structures, changes to the density-driven ocean circulation are likely. A modified ocean circulation could further alter the climate, potentially allowing rapid changes, as seen in the past. The relevant feedback mechanisms and timescales are poorly understood in detail, however, especially at low latitudes where the effects of salinity are relatively subtle. In an attempt to resolve some of these outstanding issues, we present an investigation of the climate response of the low-latitude Pacific region to changes in freshwater forcing. Initiated from the present-day thermohaline structure, a control run of a coupled ocean-atmosphere general circulation model is compared with a perturbation run in which the net freshwater flux is prescribed to be zero over the ocean. Such an extreme experiment helps to elucidate the general adjustment mechanisms and their timescales. The atmospheric greenhouse gas concentrations are held constant, and we restrict our attention to the adjustment of the upper 1,000 m of the Pacific Ocean between 40°N and 40°S, over 100 years. In the perturbation run, changes to the surface buoyancy, near-surface vertical mixing and mixed-layer depth are established within 1 year. Subsequently, relative to the control run, the surface of the low-latitude Pacific Ocean in the perturbation run warms by an average of 0.6°C, and the interior cools by up to 1.1°C, after a few decades. This vertical re-arrangement of the ocean heat content is shown to be achieved by a gradual shutdown of the heat flux due to isopycnal (i.e. along surfaces of constant density) mixing, the vertical component of which is downwards at low latitudes. This heat transfer depends crucially upon the existence of density-compensating temperature and salinity gradients on isopycnal surfaces. The timescale of the thermal changes in the perturbation run is therefore set by the timescale for the decay of isopycnal salinity gradients in response to the eliminated freshwater forcing, which we demonstrate to be around 10-20 years. Such isopycnal heat flux changes may play a role in the response of the low-latitude climate to a future accelerated freshwater cycle. Specifically, the mechanism appears to represent a weak negative sea surface temperature feedback, which we speculate might partially shield from view the anthropogenically-forced global warming signal at low latitudes. Furthermore, since the surface freshwater flux is shown to play a role in determining the ocean's thermal structure, it follows that evaporation and/or precipitation biases in general circulation models are likely to cause sea surface temperature biases.
Abstract:
A generic Nutrient Export Risk Matrix (NERM) approach is presented. This provides advice to farmers and policy makers on good practice for reducing nutrient loss and is intended to persuade them to implement such measures. Combined with a range of nutrient transport modelling tools and field experiments, NERMs can play an important role in reducing nutrient export from agricultural land. The Phosphorus Export Risk Matrix (PERM) is presented as an example NERM. The PERM integrates hydrological understanding of runoff with a number of agronomic and policy factors into a clear problem-solving framework. This allows farmers and policy makers to visualise strategies for reducing phosphorus loss through proactive land management. The risk of pollution is assessed by a series of informed questions relating to farming intensity and practice. This information is combined with the concept of runoff management to point towards simple, practical remedial strategies which do not compromise farmers' ability to obtain sound economic returns from their crop and livestock.
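A toy sketch of how such a question-based risk matrix might be encoded: answers about farming intensity and runoff are scored and combined into a risk class with suggested remedial actions. The questions, weights and thresholds are invented placeholders, not the published PERM.

```python
# Hypothetical encoding of a nutrient-export risk matrix. All questions,
# weights and thresholds are invented for illustration.
QUESTIONS = {
    "high_stocking_density": 2,
    "slurry_spread_near_watercourse": 3,
    "compacted_or_tramlined_soil": 2,
    "buffer_strips_present": -2,   # mitigation lowers the score
}

def perm_risk(answers: dict) -> str:
    """Combine yes/no answers into a coarse phosphorus-export risk class."""
    score = sum(w for q, w in QUESTIONS.items() if answers.get(q))
    if score >= 4:
        return "high risk: review manure timing and runoff pathways"
    if score >= 2:
        return "moderate risk: consider buffer strips"
    return "low risk"

print(perm_risk({"high_stocking_density": True,
                 "slurry_spread_near_watercourse": True}))
```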
Abstract:
To provide reliable estimates for mapping soil properties for precision agriculture requires intensive sampling and costly laboratory analyses. If the spatial structure of ancillary data, such as yield, digital information from aerial photographs, and soil electrical conductivity (EC) measurements, relates to that of soil properties, they could be used to guide the sampling intensity for soil surveys. Variograms of permanent soil properties at two study sites on different parent materials were compared with each other and with those for ancillary data. The ranges of spatial dependence identified by the variograms of both sets of properties are of similar orders of magnitude for each study site. Maps of the ancillary data appear to show similar patterns of variation, and these seem to relate to those of the permanent properties of the soil. Correlation analysis has confirmed these relations. Maps of kriged estimates from sub-sampled data and the original variograms showed that the main patterns of variation were preserved when a sampling interval of less than half the average variogram range of the ancillary data was used. Digital data from aerial photographs for different years and EC appear to show a more consistent relation with the soil properties than does yield. Aerial photographs, in particular those of bare soil, seem to be the most useful ancillary data, and they are often cheaper to obtain than yield and EC data.
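A minimal sketch of the study's basic tool, the empirical semivariogram: half the mean squared difference between observations separated by lag h, here in one dimension on evenly spaced synthetic data.

```python
# Sketch: empirical semivariogram gamma(h) on a 1-D transect of
# synthetic, spatially correlated data.
import numpy as np

def semivariogram(z: np.ndarray, max_lag: int):
    """Return lags 1..max_lag and gamma(h) = 0.5 * mean((z(x+h)-z(x))^2)."""
    lags = np.arange(1, max_lag + 1)
    gamma = np.array([0.5 * np.mean((z[h:] - z[:-h]) ** 2) for h in lags])
    return lags, gamma

rng = np.random.default_rng(1)
# random walk + noise gives a spatially correlated synthetic property
z = np.cumsum(rng.normal(size=200)) * 0.1 + rng.normal(size=200)
lags, gamma = semivariogram(z, 20)
for h, g in zip(lags[:5], gamma[:5]):
    print(f"lag {h:2d}: gamma = {g:.3f}")
```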
Abstract:
Our recent paper [McMurtry, G.M., Tappin, D.R., Sedwick, P.N., Wilkinson, I., Fietzke, J. and Sellwood, B., 2007a. Elevated marine deposits in Bermuda record a late Quaternary megatsunami. Sedimentary Geol. 200, 155-165.] critically re-examined elevated marine deposits in Bermuda, and concluded that their geological setting, sedimentary relations, micropetrography and microfaunal assemblages were inconsistent with sustained intertidal deposition. Instead, we hypothesized that these deposits were the result of a large tsunami that impacted the Bermuda island platform during the mid-Pleistocene. Hearty and Olson [Hearty, P.J. and Olson, S.L., in press. Mega-highstand or megatsunami? Discussion of McMurtry et al. "Elevated marine deposits in Bermuda record a late Quaternary megatsunami". Sedimentary Geol. 200, 155-165.] in their response attempt to refute our conclusions and claim the deposits to be the result of a +21 m eustatic sea level highstand during marine isotope stage (MIS) 11. In our reply we answer the issues raised by Hearty and Olson and conclude that the Bermuda deposits do not provide unequivocal evidence of a prolonged +21 m eustatic sea level highstand. Rather, the sediments are more likely the result of a past megatsunami in the North Atlantic basin.
Abstract:
In this paper we propose that physically based equations should be combined with remote sensing techniques to enable a more theoretically rigorous estimation of area-average soil heat flux, G. A standard physical equation (i.e. the analytical or exact method) for the estimation of G, in combination with a simple, but theoretically derived, equation for soil thermal inertia (Γ), provides the basis for a more transparent and readily interpretable method for the estimation of G, without the requirement for in situ instrumentation. Moreover, such an approach ensures a more universally applicable method than those derived from purely empirical studies (employing vegetation indices and albedo, for example). Hence, a new equation for the estimation of Γ (for homogeneous soils) is discussed in this paper which only requires knowledge of soil type, readily obtainable from extant soil databases and surveys, in combination with a coarse estimate of moisture status. This approach can be used to obtain area-averaged estimates of Γ (and thus G, as explained in Paper II), which is important for large-scale energy balance studies that employ aircraft or satellite data. Furthermore, this method also relaxes the instrumental demand for studies at the plot and field scale (no requirement for in situ soil temperature sensors, soil heat flux plates and/or thermal conductivity sensors). In addition, this equation can be incorporated into soil-vegetation-atmosphere-transfer models that use the force-restore method to update surface temperatures (such as the well-known ISBA model), to replace the thermal inertia coefficient.
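A sketch of the analytical (exact) method named above, under its standard harmonic formulation: soil heat flux follows from the thermal inertia Γ = √(λC) and the Fourier harmonics of surface temperature, each advanced in phase by π/4. A single diurnal harmonic and illustrative soil constants are assumed, not the paper's new Γ equation.

```python
# Sketch: harmonic (analytical) soil heat flux from thermal inertia.
# G(t) = Gamma * A1 * sqrt(omega) * sin(omega*t + phi1 + pi/4)
# for a single diurnal harmonic; constants are illustrative.
import numpy as np

lam = 1.0          # thermal conductivity, W m^-1 K^-1 (assumed)
C = 2.0e6          # volumetric heat capacity, J m^-3 K^-1 (assumed)
gamma = np.sqrt(lam * C)          # thermal inertia, J m^-2 K^-1 s^-0.5

omega = 2 * np.pi / 86400.0       # diurnal angular frequency, rad/s
A1, phi1 = 10.0, 0.0              # amplitude (K) and phase of harmonic 1

t = np.linspace(0, 86400, 9)
G = gamma * A1 * np.sqrt(omega) * np.sin(omega * t + phi1 + np.pi / 4)
for ti, gi in zip(t, G):
    print(f"t = {ti / 3600:4.1f} h: G = {gi:7.1f} W/m^2")
```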
Abstract:
Acid mine drainage (AMD) is a widespread environmental problem associated with both working and abandoned mining operations. As part of an overall strategy to determine a long-term treatment option for AMD, a pilot passive treatment plant was constructed in 1994 at Wheal Jane Mine in Cornwall, UK. The plant consists of three separate systems, each containing aerobic reed beds, an anaerobic cell and rock filters, and represents the largest European experimental facility of its kind. The systems differ only in the type of pre-treatment used to increase the pH of the influent minewater (pH < 4): lime-dosed (LD), anoxic limestone drain (ALD) and lime-free (LF), which receives no form of pre-treatment. The Wheal Jane pilot plant offered a unique facility, and a major research project was established to evaluate the pilot plant and to study in detail the biological mechanisms and the geochemical and physical processes that control passive treatment systems. The project has led to data, knowledge, models and design criteria for the future design, planning and sustainable management of passive treatment systems. A multidisciplinary team of scientists and managers from UK universities, the Environment Agency and the mining industry was put together to obtain the maximum advantage from the excellent facilities at Wheal Jane.
Abstract:
Despite the success of studies attempting to integrate remotely sensed data and flood modelling, and the need to provide near-real-time data routinely on a global scale as well as to set up online data archives, there is to date a lack of spatially and temporally distributed hydraulic parameters to support ongoing efforts in modelling. Therefore, the objective of this project is to provide a global evaluation and benchmark data set of floodplain water stages with uncertainties, and their assimilation in a large-scale flood model, using space-borne radar imagery. An algorithm is developed for the automated retrieval of water stages with uncertainties from a sequence of radar imagery, and the data are assimilated in a flood model using the Tewkesbury 2007 flood event as a feasibility study. The retrieval method that we employ is based on possibility theory, an extension of fuzzy set theory that encompasses probability theory. We first attempt to identify the main sources of uncertainty in the retrieval of water stages from radar imagery, for which we define physically meaningful ranges of parameter values. Possibilities of values are then computed for each parameter using a triangular ‘membership’ function. This procedure allows the computation of possible values of water stages at maximum flood extent at many different locations along a river. At a later stage in the project these data are then used in the assimilation, calibration or validation of a flood model. The application is subsequently extended to a global scale using wide-swath radar imagery and a simple global flood forecasting model, thereby providing improved river discharge estimates to update the latter.
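A minimal sketch of the triangular ‘membership’ step described above: each uncertain retrieval parameter is assigned a possibility in [0, 1] from a triangular function over its physically meaningful range, and possibilities are combined conservatively with a minimum. The parameter names and ranges are illustrative, not the project's calibrated values.

```python
# Sketch: triangular possibility ('membership') functions over assumed
# parameter ranges, combined with min as a fuzzy conjunction.

def triangular_possibility(x: float, lo: float, mode: float, hi: float) -> float:
    """Possibility of x under a triangular function peaking at mode."""
    if x <= lo or x >= hi:
        return 0.0
    if x <= mode:
        return (x - lo) / (mode - lo)
    return (hi - x) / (hi - mode)

# possibility of a candidate water stage given two parameter values,
# using illustrative (hypothetical) uncertainty ranges
p_dem = triangular_possibility(0.4, 0.0, 0.5, 1.0)      # DEM error (m)
p_edge = triangular_possibility(12.0, 5.0, 10.0, 20.0)  # flood-edge offset (m)
print(min(p_dem, p_edge))
```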
Abstract:
The community pharmacy service medicines use review (MUR) was introduced in 2005 ‘to improve patient knowledge, concordance and use of medicines’ through a private patient–pharmacist consultation. The MUR presents a fundamental change in community pharmacy service provision. While traditionally pharmacists are dispensers of medicines and providers of medicines advice, and patients are recipients, the MUR involves pharmacists providing consultation-type activities and patients as active participants. The MUR facilitates a two-way discussion about medicines use. Traditional patient–pharmacist behaviours transform into a new set of behaviours involving the booking of appointments, consultation processes and form completion, and the physical environment of the patient–pharmacist interaction moves from the traditional setting of the dispensary and medicines counter to a private consultation room. Thus, the new service challenges the traditional identities and behaviours of the patient and the pharmacist as well as the environment in which the interaction takes place. In 2008, the UK government concluded there is at present too much emphasis on the quantity of MURs rather than on their quality.[1] A number of plans to remedy the perceived imbalance included a suggestion to reward ‘health outcomes’ achieved, with calls for a more focussed and scientific approach to the evaluation of pharmacy services using outcomes research. Specifically, the UK government set out the principal research areas for the evaluation of pharmacy services to include ‘patient and public perceptions and satisfaction’ as well as ‘impact on care and outcomes’. A limited number of ‘patient satisfaction with pharmacy services’ type questionnaires are available, of varying quality, measuring dimensions relating to pharmacists’ technical competence, behavioural impressions and general satisfaction. For example, an often-cited paper by Larson[2] uses two factors to measure satisfaction, namely ‘friendly explanation’ and ‘managing therapy’; the factors are highly interrelated and the questions somewhat awkwardly phrased, but more importantly, we believe the questionnaire excludes some specific domains unique to the MUR. By conducting patient interviews with recent MUR recipients, we have been working to identify relevant concepts and develop a conceptual framework to inform item development for a Patient Reported Outcome Measure questionnaire bespoke to the MUR. We note with interest the recent launch of a multidisciplinary audit template by the Royal Pharmaceutical Society of Great Britain (RPSGB) in an attempt to review the effectiveness of MURs and improve their quality.[3] This template includes an MUR ‘patient survey’. We will discuss this ‘patient survey’ in light of our work and existing patient satisfaction with pharmacy questionnaires, outlining a new conceptual framework as a basis for measuring patient satisfaction with the MUR. Ethical approval for the study was obtained from the NHS Surrey Research Ethics Committee on 2 June 2008.
References
1. Department of Health (2008). Pharmacy in England: Building on Strengths – Delivering the Future. London: HMSO. www.official-documents.gov.uk/document/cm73/7341/7341.pdf (accessed 29 September 2009).
2. Larson LN et al. Patient satisfaction with pharmaceutical care: update of a validated instrument. J Am Pharm Assoc 2002; 42: 44–50.
3. Royal Pharmaceutical Society of Great Britain (2009). Pharmacy Medicines Use Review – Patient Audit. London: RPSGB. http://qi4pd.org.uk/index.php/Medicines-Use-Review-Patient-Audit.html (accessed 29 September 2009).