164 results for deferred-acceptance algorithm
Abstract:
This paper analyses a pervasive computing system for tracking people in a mining environment based on RFID (radio frequency identification) technology. First, we explain the fundamentals of RFID and the LANDMARC (location identification based on dynamic active RFID calibration) algorithm; we then present a proposed algorithm that combines LANDMARC with trilateration to obtain the coordinates of people inside the mine; next, we describe a pervasive computing system that can be implemented in mining; and finally we present the results and conclusions.
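The core LANDMARC step is a k-nearest-neighbour weighting in signal-strength space: reference tags whose RSSI readings most resemble the tracked tag's contribute their known coordinates, weighted by inverse squared signal distance. The sketch below is our own illustration of that textbook step, not the paper's implementation; all names and values are ours.

```python
import numpy as np

def landmarc_estimate(tag_rssi, ref_rssi, ref_coords, k=4):
    """Estimate a tag's position from RSSI similarity to reference tags.

    tag_rssi   : (n_readers,)  RSSI of the tracked tag at each reader
    ref_rssi   : (n_refs, n_readers) RSSI of each reference tag
    ref_coords : (n_refs, 2)   known (x, y) of each reference tag
    """
    # Euclidean distance in signal space between tracked tag and references
    e = np.linalg.norm(ref_rssi - tag_rssi, axis=1)
    nearest = np.argsort(e)[:k]            # k most similar reference tags
    w = 1.0 / (e[nearest] ** 2 + 1e-9)     # closer in RSSI -> larger weight
    w /= w.sum()
    return w @ ref_coords[nearest]         # weighted average of known positions
```

The trilateration stage described in the abstract would then refine this estimate from distance measurements; only the nearest-neighbour weighting is shown here.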
Abstract:
For Northern Hemisphere extra-tropical cyclone activity, the dependency of a potential anthropogenic climate change signal on the identification method applied is analysed. This study investigates the impact of the algorithm used on the change signal, not the robustness of the climate change signal itself. Using one single transient AOGCM simulation as standard input for eleven state-of-the-art identification methods, the patterns of model-simulated present-day climatologies are found to be close to those computed from re-analysis, independent of the method applied. Although differences exist in the total number of cyclones identified, the climate change signals (IPCC SRES A1B) in the model run considered are largely similar between methods for all cyclones. Taking into account all tracks, decreasing numbers are found in the Mediterranean, the Arctic (in the Barents and Greenland Seas), the mid-latitude Pacific and North America. The patterns of change are even more similar if only the most severe systems are considered: the methods reveal a coherent, statistically significant increase in frequency over the eastern North Atlantic and North Pacific. We find that the differences between the methods considered are largely due to the different role of weaker systems in the specific methods.
Abstract:
Northern Hemisphere cyclone activity is assessed by applying an algorithm for the detection and tracking of synoptic-scale cyclones to mean sea level pressure data. The method, originally developed for the Southern Hemisphere, is adapted for application to the Northern Hemisphere winter season. NCEP-Reanalysis data from 1958/59 to 1997/98 are used as input. The sensitivities of the results to particular parameters of the algorithm are discussed both for case studies and from a climatological point of view. Results show that the choice of settings is of major relevance, especially for the tracking of smaller-scale and fast-moving systems. With an appropriate setting, the algorithm is capable of automatically tracking different types of cyclones at the same time: both fast-moving and developing systems over the large ocean basins and smaller-scale cyclones over the Mediterranean basin can be assessed. The climatology of cyclone variables, e.g., cyclone track density, cyclone counts, intensification rates, propagation speeds and areas of cyclogenesis and cyclolysis, gives detailed information on typical cyclone life cycles for different regions. Lowering the spatial and temporal resolution of the input data from the full resolution T62/06h to T42/12h decreases the cyclone track density and cyclone counts. Reducing the temporal resolution alone contributes to a decline in the number of fast-moving systems, which is relevant for the cyclone track density. Lowering the spatial resolution alone mainly reduces the number of weak cyclones.
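As a toy illustration of the detection step only (not the paper's algorithm, whose settings and criteria are considerably more elaborate), cyclone centres are commonly located as local minima in the mean sea level pressure field:

```python
import numpy as np

def find_pressure_minima(mslp, threshold=1000.0):
    """Return (i, j) grid indices of strict local minima below `threshold` hPa.

    mslp : 2-D array of mean sea level pressure on a lat/lon grid.
    The threshold value is illustrative, not taken from the paper.
    """
    centres = []
    for i in range(1, mslp.shape[0] - 1):
        for j in range(1, mslp.shape[1] - 1):
            window = mslp[i - 1:i + 2, j - 1:j + 2]
            # strict minimum: all 8 neighbours have higher pressure
            if mslp[i, j] < threshold and (window > mslp[i, j]).sum() == 8:
                centres.append((i, j))
    return centres
```

A tracking stage would then link the minima found at successive time steps into cyclone tracks; that step is omitted here.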
Abstract:
It is predicted that non-communicable diseases will account for over 73 % of global mortality in 2020. Given that the majority of these deaths occur in developed countries such as the UK, and that up to 80 % of chronic disease could be prevented through improvements in diet and lifestyle, it is imperative that dietary guidelines and disease prevention strategies are reviewed in order to improve their efficacy. Since the completion of the human genome project our understanding of complex interactions between environmental factors such as diet and genes has progressed considerably, as has the potential to individualise diets using dietary, phenotypic and genotypic data. Thus, there is an ambition for dietary interventions to move away from population-based guidance towards 'personalised nutrition'. The present paper reviews current evidence for the public acceptance of genetic testing and personalised nutrition in disease prevention. Health and clear consumer benefits have been identified as key motivators in the uptake of genetic testing, with individuals reporting personal experience of disease, such as those with specific symptoms, being more willing to undergo genetic testing for the purpose of personalised nutrition. This greater perceived susceptibility to disease may also improve motivation to change behaviour which is a key barrier in the success of any nutrition intervention. Several consumer concerns have been identified in the literature which should be addressed before the introduction of a nutrigenomic-based personalised nutrition service. Future research should focus on the efficacy and implementation of nutrigenomic-based personalised nutrition.
Abstract:
For an increasing number of applications, mesoscale modelling systems now aim to better represent urban areas. The complexity of processes resolved by urban parametrization schemes varies with the application. The concept of fitness-for-purpose is therefore critical for both the choice of parametrizations and the way in which the scheme should be evaluated. A systematic and objective model response analysis procedure (Multiobjective Shuffled Complex Evolution Metropolis (MOSCEM) algorithm) is used to assess the fitness of the single-layer urban canopy parametrization implemented in the Weather Research and Forecasting (WRF) model. The scheme is evaluated regarding its ability to simulate observed surface energy fluxes and the sensitivity to input parameters. Recent amendments are described, focussing on features which improve its applicability to numerical weather prediction, such as a reduced and physically more meaningful list of input parameters. The study shows a high sensitivity of the scheme to parameters characterizing roof properties in contrast to a low response to road-related ones. Problems in partitioning of energy between turbulent sensible and latent heat fluxes are also emphasized. Some initial guidelines to prioritize efforts to obtain urban land-cover class characteristics in WRF are provided. Copyright © 2010 Royal Meteorological Society and Crown Copyright.
Abstract:
In this paper a modified algorithm is suggested for developing polynomial neural network (PNN) models. Optimal partial description (PD) modeling is introduced at each layer of the PNN expansion, a task accomplished using the orthogonal least squares (OLS) method. Based on the initial PD models determined by the polynomial order and the number of PD inputs, OLS selects the most significant regressor terms, reducing the output error variance. The method produces PNN models exhibiting a high level of accuracy and superior generalization capabilities. Additionally, parsimonious models are obtained, comprising a considerably smaller number of parameters than those generated by the conventional PNN algorithm. Three benchmark examples are elaborated, including modeling of the gas furnace process as well as the iris and wine classification problems. Extensive simulation results and comparisons with other methods in the literature demonstrate the effectiveness of the suggested modeling approach.
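The OLS forward-selection step described above can be sketched as follows. This is a generic illustration of the classical error-reduction-ratio recursion (function name and tolerance are ours), not the paper's code:

```python
import numpy as np

def ols_select(P, y, n_terms):
    """Greedy OLS forward selection.

    Repeatedly pick the candidate regressor (column of P) with the largest
    error-reduction ratio (q.y)^2 / ((q.q)(y.y)), then orthogonalise the
    remaining candidates against it (a Gram-Schmidt step)."""
    Q = P.astype(float).copy()          # working copy, orthogonalised in place
    yy = float(y @ y)
    selected = []
    for _ in range(n_terms):
        err = np.array([
            (q @ y) ** 2 / ((q @ q) * yy)
            if j not in selected and q @ q > 1e-12 else -1.0
            for j, q in enumerate(Q.T)
        ])
        best = int(np.argmax(err))
        selected.append(best)
        qb = Q[:, best] / np.linalg.norm(Q[:, best])
        Q -= np.outer(qb, qb @ Q)       # remove chosen direction from all columns
    return selected
```

In a PNN context, the columns of `P` would be the candidate polynomial PD terms at a given layer.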
Abstract:
A new sparse kernel density estimator is introduced. Our main contribution is a recursive algorithm that selects significant kernels one at a time using the minimum integrated square error (MISE) criterion. The proposed approach is simple to implement and the associated computational cost is very low. Numerical examples demonstrate that the proposed approach is effective in constructing sparse kernel density estimators with accuracy competitive with existing kernel density estimators.
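A minimal sketch of the one-kernel-at-a-time idea, assuming an equal-weight Gaussian mixture and a grid-based squared-error criterion as a stand-in for the paper's exact MISE recursion (all names and parameters below are ours):

```python
import numpy as np

def parzen_density(x, centres, h):
    """Gaussian kernel density estimate at points x from the given centres (1-D)."""
    diffs = (x[:, None] - centres[None, :]) / h
    return np.exp(-0.5 * diffs ** 2).sum(axis=1) / (len(centres) * h * np.sqrt(2 * np.pi))

def greedy_sparse_kde(samples, h, n_kernels, grid):
    """Pick kernel centres one at a time so the equal-weight sparse mixture
    best matches the full Parzen estimate on `grid` (squared-error criterion)."""
    target = parzen_density(grid, samples, h)
    chosen = []
    for _ in range(n_kernels):
        best, best_err = None, np.inf
        for s in samples:
            if s in chosen:
                continue
            est = parzen_density(grid, np.array(chosen + [s]), h)
            err = np.mean((est - target) ** 2)
            if err < best_err:
                best, best_err = s, err
        chosen.append(best)
    return chosen
```

The resulting estimator uses only `n_kernels` centres instead of one kernel per sample, which is the practical payoff of sparsity.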
Abstract:
We have optimised the atmospheric radiation algorithm of the FAMOUS climate model on several hardware platforms. The optimisation involved translating the Fortran code to C and restructuring the algorithm around the computation of a single air column. Instead of the existing MPI-based domain decomposition, we used a task queue and a thread pool to schedule the computation of individual columns on the available processors. Finally, four air columns are packed together in a single data structure and computed simultaneously using Single Instruction Multiple Data operations. The modified algorithm runs more than 50 times faster on the CELL’s Synergistic Processing Elements than on its main PowerPC processing element. On Intel-compatible processors, the new radiation code runs 4 times faster. On the tested graphics processor, using OpenCL, we find a speed-up of more than 2.5 times as compared to the original code on the main CPU. Because the radiation code takes more than 60% of the total CPU time, FAMOUS executes more than twice as fast. Our version of the algorithm returns bit-wise identical results, which demonstrates the robustness of our approach. We estimate that this project required around two and a half man-years of work.
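The scheduling idea (independent air columns dispatched from a shared task queue to a pool of workers) can be sketched as follows. This is a toy Python illustration, not the FAMOUS code, and the per-column computation is an invented stand-in:

```python
from concurrent.futures import ThreadPoolExecutor
import math

def radiate_column(column):
    """Stand-in for the per-column radiation computation (illustrative only)."""
    return sum(math.exp(-p / 1000.0) for p in column)

def radiate_grid(columns, n_workers=4):
    """Compute all columns with a pool of workers.

    Because each air column is independent, the pool can pull columns
    from its internal queue in any order, which is the scheduling
    flexibility the abstract describes."""
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        return list(pool.map(radiate_column, columns))
```

The SIMD step in the abstract (packing four columns into one data structure) is a further optimisation below this level and is not shown.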
Abstract:
Currently, UK fruit and vegetable intakes are below recommendations. Bread is a staple food consumed by ~95% of adults in western countries, and it provides an ideal matrix by which functionality can be delivered to the consumer in an accepted food. Therefore, enriching bread with vegetables may be an effective strategy to increase vegetable consumption. This study evaluated consumer acceptance, purchase intent and intention of product replacement for bread enriched with red beetroot, carrot with coriander, red pepper with tomato or white beetroot (80g vegetable per serving of 200g) compared to white control bread (0g vegetable). Consumers (n=120) rated their liking of the breads overall, as well as their liking of appearance, flavour and texture, using nine-point hedonic scales. Product replacement and purchase intent of the breads were rated using five-point scales. The effect of providing consumers with health information about the breads was also evaluated. There were significant differences between the breads in overall liking (P<0.0001), as well as in liking of appearance (P<0.0001), flavour (P=0.0002) and texture (P=0.04). However, these differences were driven by the red beetroot bread, which was significantly (P<0.05) less liked than the control bread; there were no significant differences in overall liking between any of the other vegetable-enriched breads and the control bread. The provision of health information about the breads did not increase consumer liking of the vegetable-enriched breads. In conclusion, this study demonstrated that vegetable-enriched bread appears to be an acceptable strategy to increase vegetable intake; however, liking depended on vegetable type.
Abstract:
Reinforcing the Low Voltage (LV) distribution network will become essential to ensure it remains within its operating constraints as demand on the network increases. The deployment of energy storage in the distribution network provides an alternative to conventional reinforcement. This paper presents a control methodology for energy storage to reduce peak demand in a distribution network, based on day-ahead demand forecasts and historical demand data. The control methodology pre-processes the forecast data prior to a planning phase to build in resilience to the inevitable errors between the forecast and actual demand. The algorithm uses no real-time adjustment, so it has an economic advantage over traditional storage control algorithms. Results show that peak demand on a single phase of a feeder can be reduced even when there are differences between the forecast and the actual demand. In particular, results demonstrate that, when the algorithm is applied to a large number of single-phase demand aggregations, it is possible to identify which of these aggregations are the most suitable candidates for the control methodology.
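The planning idea, discharging a finite store against the largest forecast demands with no real-time adjustment, can be sketched as follows. This is a minimal illustration with made-up parameters, not the paper's methodology (which also pre-processes the forecast for resilience to forecast error):

```python
def plan_discharge(forecast, capacity_kwh, power_kw, interval_h=0.5):
    """Return a discharge schedule (kW per interval) targeting forecast peaks.

    forecast     : list of forecast demand values, one per interval
    capacity_kwh : usable energy in the store
    power_kw     : maximum discharge power
    interval_h   : interval length in hours (half-hourly by default)
    """
    schedule = [0.0] * len(forecast)
    energy_left = capacity_kwh
    # address the largest forecast demands first
    for i in sorted(range(len(forecast)), key=lambda i: -forecast[i]):
        if energy_left <= 0:
            break
        p = min(power_kw, energy_left / interval_h)
        schedule[i] = p
        energy_left -= p * interval_h
    return schedule
```

Because the schedule is fixed the day before, a forecast error simply shifts where the discharge lands relative to the true peak; the pre-processing step in the paper is what limits that exposure.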
Abstract:
This paper reviews theories and models of users’ acceptance and use in relation to “persuasive technology”, to justify the need to add consideration of ‘perceived persuasiveness’. We conclude by identifying variables associated with perceived persuasiveness, and highlight important future research directions in this domain.
Abstract:
With an aging global population, the number of people living with a chronic illness is expected to increase significantly by 2050. If left unmanaged, chronic illness leads to serious health complications, resulting in poor patient quality of life and a costly time bomb for care providers. If effectively managed, patients with chronic conditions tend to live richer and healthier lives, resulting in a less costly total care solution. This chapter considers literature from the areas of technology acceptance and care self-management, which aims to alleviate symptoms and/or the reasons for non-acceptance of care, and thus minimise the risk of long-term complications, which in turn reduces the chance of spiralling health expenditure. By bringing together these areas, the chapter highlights where self-management is failing, so that changes can be made to care in advance of health deterioration.
Abstract:
Persuasive technologies have been extensively applied in the context of e-commerce for the purposes of marketing, enhancing system credibility, and motivating users to adopt systems. However, the impact of persuasion on consumers' online purchasing behaviour has not been investigated previously. This study reviews theories of technology acceptance and identifies their limitation in not considering the effect of persuasive technologies when determining user acceptance of online technology. The study proposes a theoretical model that considers the effect of persuasive technologies on consumer acceptance of e-commerce websites, with consideration of other related variables, i.e. trust and technological attributes. Moreover, the paper proposes a model based on UTAUT2 that contains relevant contributing factors, including the concept of perceived persuasiveness.
Abstract:
Persuasive technologies, used within the domain of interactive technology, are applied broadly in social contexts to encourage customers towards positive behaviour change. In the context of e-commerce, persuasive technologies have already been extensively applied in the area of marketing to enhance system credibility; however, the issue of 'persuasiveness', and its role in positive user acceptance of technology, has not been investigated in the technology acceptance literature. This paper reviews theories and models of users' acceptance and use in relation to persuasive technology, and identifies their limitations when considering the impact of persuasive technology on users' acceptance of technology, thus justifying the need to add consideration of 'perceived persuasiveness'. We conclude by identifying variables associated with perceived persuasiveness, and suggest key directions for future research.