Abstract:
Chatterbox Challenge is an annual web-based contest for artificial conversational entities (ACEs). The 2010 instantiation was the tenth consecutive contest, held between March and June in the 60th year following the publication of Alan Turing's influential disquisition 'Computing Machinery and Intelligence'. Loosely based on Turing's viva voce interrogator-hidden witness imitation game, a thought experiment to ascertain a machine's capacity to respond satisfactorily to unrestricted questions, the contest provides a platform for technology comparison and evaluation. This paper provides an insight into the emotion content of the entries since the 2005 Chatterbox Challenge. The authors find that synthetic textual systems, none of which are backed by academic or industry funding, have, on the whole and more than half a century since Weizenbaum's natural language understanding experiment, progressed little beyond Eliza in terms of expressing emotion in dialogue. This may be a failure on the part of the academic AI community, which has largely ignored the Turing test as an engineering challenge.
Abstract:
Deep Brain Stimulation (DBS) has been successfully used throughout the world for the treatment of Parkinson's disease symptoms. To control abnormal spontaneous electrical activity in target brain areas, DBS utilizes a continuous stimulation signal. This continuous power draw means that the implanted battery power source needs to be replaced every 18–24 months. To prolong the life span of the battery, a technique to accurately recognize and predict the onset of Parkinson's disease tremors in human subjects, and thus implement an on-demand stimulator, is discussed here. The approach is to use a radial basis function neural network (RBFNN) based on particle swarm optimization (PSO) and principal component analysis (PCA), with local field potential (LFP) data recorded via the stimulation electrodes, to predict activity related to tremor onset. To test this approach, LFPs from the subthalamic nucleus (STN), obtained through deep brain electrodes implanted in a patient with Parkinson's disease, are used to train the network. To validate the network's performance, electromyographic (EMG) signals from the patient's forearm are recorded in parallel with the LFPs to accurately determine occurrences of tremor, and these are compared with the output of the network. It has been found that detection accuracies of up to 89% are possible. Performance comparisons have also been made between a conventional RBFNN and an RBFNN based on PSO, which show a marginal decrease in performance but a notable reduction in computational overhead.
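The abstract does not give implementation details, but the pipeline it names (PCA feature reduction followed by an RBF network classifier) is easy to sketch. The following is a minimal illustration only: all names, dimensions, and the least-squares output training are assumptions, and the paper's PSO step, which would tune the RBF centres and widths, is replaced here by a simple random choice of centres for brevity.

```python
# Minimal sketch of a PCA + RBF-network classification pipeline of the kind
# the abstract describes. Everything here is illustrative; the PSO tuning of
# centres/widths used in the paper is deliberately simplified away.
import numpy as np

def pca_reduce(X, n_components):
    """Project rows of X onto the top principal components."""
    Xc = X - X.mean(axis=0)
    # SVD of the centred data gives the principal directions.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

class RBFNet:
    def __init__(self, n_centres=20, width=1.0, seed=0):
        self.n_centres, self.width = n_centres, width
        self.rng = np.random.default_rng(seed)

    def _design(self, X):
        # Gaussian basis functions centred on stored prototypes.
        d2 = ((X[:, None, :] - self.centres[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * self.width ** 2))

    def fit(self, X, y):
        # In the paper the centres/widths would be optimised by PSO;
        # here we simply sample training points as centres (assumption).
        idx = self.rng.choice(len(X), self.n_centres, replace=False)
        self.centres = X[idx]
        Phi = self._design(X)
        self.w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
        return self

    def predict(self, X):
        return (self._design(X) @ self.w > 0.5).astype(int)

# Toy usage: rows stand in for LFP feature vectors, labels for tremor (1)
# versus rest (0); real data would come from the recorded LFP/EMG signals.
X = np.random.default_rng(1).normal(size=(200, 16))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
Z = pca_reduce(X, n_components=4)
model = RBFNet(n_centres=25, width=2.0).fit(Z, y.astype(float))
print("training accuracy:", (model.predict(Z) == y).mean())
```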
Abstract:
This paper presents a queue-based agent architecture for multimodal interfaces. Using a novel approach to intelligently organise both agents and input data, this system has the potential to outperform current state-of-the-art multimodal systems, while at the same time allowing greater levels of interaction and flexibility. This assertion is supported by simulation test results showing that significant improvements can be obtained over normal sequential agent scheduling architectures. For real usage, this translates into faster, more comprehensive systems, without the limited application domain that restricts current implementations.
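The abstract leaves the architecture unspecified, so purely as an illustration of the general idea (prioritised queues deciding which agent handles which input next, rather than fixed sequential scheduling), here is a minimal sketch; every name and the priority rule are hypothetical.

```python
# Illustrative sketch of queue-based agent scheduling for multimodal input.
# This is not the paper's architecture; all names and the priority rule are
# assumptions made for illustration only.
import heapq, itertools
from dataclasses import dataclass, field

@dataclass(order=True)
class Event:
    priority: int
    seq: int                      # tie-breaker preserving arrival order
    modality: str = field(compare=False)
    payload: str = field(compare=False)

class Dispatcher:
    def __init__(self):
        self.queue, self.counter = [], itertools.count()
        self.agents = {}          # modality -> handler

    def register(self, modality, handler):
        self.agents[modality] = handler

    def submit(self, modality, payload, priority):
        heapq.heappush(self.queue,
                       Event(priority, next(self.counter), modality, payload))

    def run(self):
        # Unlike sequential scheduling, events are served by priority, so
        # urgent input is never stuck behind a slow earlier event.
        while self.queue:
            ev = heapq.heappop(self.queue)
            self.agents[ev.modality](ev.payload)

d = Dispatcher()
d.register("speech", lambda p: print("speech agent:", p))
d.register("gesture", lambda p: print("gesture agent:", p))
d.submit("speech", "open the map", priority=2)
d.submit("gesture", "point at region", priority=1)  # served first
d.run()
```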
Abstract:
This article describes a number of velocity-based moving mesh numerical methods for multidimensional nonlinear time-dependent partial differential equations (PDEs). It consists of a short historical review followed by a detailed description of a recently developed multidimensional moving mesh finite element method based on conservation. Finite element algorithms are derived for both mass-conserving and non-mass-conserving problems, and results are shown for a number of multidimensional nonlinear test problems, including the second-order porous medium equation and the fourth-order thin film equation, as well as a two-phase problem. Further applications and extensions are referenced.
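For reference, the two named test problems take the following standard forms; the exponents m and n are the usual nonlinearity parameters, and the specific values used in the paper are not stated in this abstract.

```latex
% Standard forms of the two canonical test problems named in the abstract.
\begin{align}
  \frac{\partial u}{\partial t} &= \nabla \cdot \left( u^{m} \nabla u \right)
    && \text{(second-order porous medium equation)} \\
  \frac{\partial u}{\partial t} &= -\nabla \cdot \left( u^{n} \nabla \nabla^{2} u \right)
    && \text{(fourth-order thin film equation)}
\end{align}
```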
Abstract:
The authors compare the performance of two types of controllers: one based on the multi-layered network and the other based on the single-layered CMAC network (cerebellar model articulation controller). The neurons (information processing units) in the multi-layered network use Gaussian activation functions. The control scheme considered is a predictive control algorithm, along the lines used by Willis et al. (1991) and Kambhampati and Warwick (1991). The process selected as a test bed is a continuous stirred tank reactor. The reaction taking place is an irreversible exothermic reaction in a constant-volume reactor cooled by a single coolant stream. This reactor is a simplified version of the first tank in the two-tank system given by Henson and Seborg (1989).
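The benchmark process is well known, so a sketch of its dynamics may help. The model form below is the standard exothermic-CSTR formulation associated with Henson and Seborg; the parameter values are the commonly quoted benchmark values and are assumptions here, not necessarily those used in the paper.

```python
# Sketch of the test-bed process: an irreversible exothermic reaction A -> B
# in a constant-volume CSTR cooled by a single coolant stream. Parameter
# values are commonly quoted benchmark values (assumptions, not the paper's).
import numpy as np

q, V = 100.0, 100.0          # feed flow rate [L/min], reactor volume [L]
C_Af, T_f = 1.0, 350.0       # feed concentration [mol/L], feed temperature [K]
rho, Cp = 1000.0, 0.239      # density [g/L], heat capacity [J/(g K)]
dH, EoverR = -5.0e4, 8750.0  # heat of reaction [J/mol], activation E/R [K]
k0, UA = 7.2e10, 5.0e4       # pre-exponential [1/min], heat transfer [J/(min K)]

def cstr_rhs(x, Tc):
    """Time derivatives of concentration C_A and temperature T."""
    C_A, T = x
    k = k0 * np.exp(-EoverR / T)           # Arrhenius rate constant
    dC_A = q / V * (C_Af - C_A) - k * C_A
    dT = (q / V * (T_f - T)
          - dH / (rho * Cp) * k * C_A      # exothermic heat release
          + UA / (V * rho * Cp) * (Tc - T))
    return np.array([dC_A, dT])

# Simple Euler integration of the open-loop plant at a fixed coolant
# temperature Tc; a predictive controller would choose Tc at each step.
x, dt = np.array([0.8, 330.0]), 0.05
for _ in range(200):
    x = x + dt * cstr_rhs(x, Tc=300.0)
print("C_A = %.3f mol/L, T = %.1f K" % (x[0], x[1]))
```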
Abstract:
Background: Medication errors in general practice are an important source of potentially preventable morbidity and mortality. Building on previous descriptive, qualitative and pilot work, we sought to investigate the effectiveness, cost-effectiveness and likely generalisability of a complex pharmacist-led IT-based intervention aiming to improve prescribing safety in general practice.
Objectives: We sought to:
• Test the hypothesis that a pharmacist-led IT-based complex intervention using educational outreach and practical support is more effective than simple feedback in reducing the proportion of patients at risk from errors in prescribing and medicines management in general practice.
• Conduct an economic evaluation of the cost per error avoided, from the perspective of the National Health Service (NHS).
• Analyse data recorded by pharmacists, summarising the proportions of patients judged to be at clinical risk, the actions recommended by pharmacists, and the actions completed in the practices.
• Explore the views and experiences of healthcare professionals and NHS managers concerning the intervention; investigate potential explanations for the observed effects; and inform decisions on the future roll-out of the pharmacist-led intervention.
• Examine secular trends in the outcome measures of interest, allowing for informal comparison between trial practices and practices contributing to the QRESEARCH database that did not participate in the trial.
Methods: Two-arm cluster randomised controlled trial of 72 English general practices with embedded economic analysis and longitudinal descriptive and qualitative analysis. Informal comparison of the trial findings with a national descriptive study investigating secular trends, undertaken using data from practices contributing to the QRESEARCH database. The main outcomes of interest were prescribing errors and medication monitoring errors at six and 12 months following the intervention.
Results: Participants in the pharmacist intervention arm practices were significantly less likely to have been prescribed a non-selective NSAID without a proton pump inhibitor (PPI) if they had a history of peptic ulcer (OR 0.58, 95% CI 0.38 to 0.89), to have been prescribed a beta-blocker if they had asthma (OR 0.73, 95% CI 0.58 to 0.91) or (in those aged 75 years and older) to have been prescribed an ACE inhibitor or diuretic without a measurement of urea and electrolytes in the previous 15 months (OR 0.51, 95% CI 0.34 to 0.78). The economic analysis suggests that the PINCER pharmacist intervention has a 95% probability of being cost-effective if the decision-maker's ceiling willingness to pay reaches £75 (6 months) or £85 (12 months) per error avoided. The intervention addressed an issue that was important to professionals and their teams and was delivered in a way that was acceptable to practices, with minimum disruption of normal work processes. Comparison of the trial findings with changes seen in QRESEARCH practices indicated that any reductions achieved in the simple feedback arm were likely, in the main, to have been related to secular trends rather than to the intervention.
Conclusions: Compared with simple feedback, the pharmacist-led intervention resulted in reductions in the proportions of patients at risk of prescribing and monitoring errors for the primary outcome measures and the composite secondary outcome measures at six months and (with the exception of the NSAID/peptic ulcer outcome measure) at 12 months post-intervention. The intervention is acceptable to pharmacists and practices, and is likely to be seen as cost-effective by decision makers.
Abstract:
In this paper, the statistical properties of tropical ice clouds (ice water content, visible extinction, effective radius, and total number concentration) derived from 3 yr of ground-based radar–lidar retrievals from the U.S. Department of Energy Atmospheric Radiation Measurement Climate Research Facility in Darwin, Australia, are compared with the same properties derived using the official CloudSat microphysical retrieval methods and from a simpler statistical method using radar reflectivity and air temperature. It is shown that the two official CloudSat microphysical products (2B-CWC-RO and 2B-CWC-RVOD) are statistically virtually identical. The comparison with the ground-based radar–lidar retrievals shows that all satellite methods produce ice water contents and extinctions in a much narrower range than the ground-based method and overestimate the mean vertical profiles of microphysical parameters below 10-km height by more than a factor of 2. Better agreement is obtained above 10-km height. Ways to improve these estimates are suggested in this study. Effective radius retrievals from the standard CloudSat algorithms are characterized by a large positive bias of 8–12 μm. A sensitivity test shows that in response to such a bias the cloud longwave forcing is increased from 44.6 to 46.9 W m⁻² (implying an error of about 5%), whereas the negative cloud shortwave forcing is increased from −81.6 to −82.8 W m⁻². Further analysis reveals that these modest effects (although not insignificant) can be much larger for optically thick clouds. The statistical method using CloudSat reflectivities and air temperature was found to produce inaccurate mean vertical profiles and probability distribution functions of effective radius. This study also shows that the retrieval of the total number concentration needs to be improved in the official CloudSat microphysical methods prior to a quantitative use for the characterization of tropical ice clouds. Finally, the statistical relationship used to produce ice water content from extinction and air temperature obtained by the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) satellite is evaluated for tropical ice clouds. It is suggested that the CALIPSO ice water content retrieval is robust for tropical ice clouds, but that the temperature dependence of the statistical relationship used should be slightly refined to better reproduce the radar–lidar retrievals.
Abstract:
Motivation: The ability of a simple method (MODCHECK) to determine the sequence–structure compatibility of a set of structural models generated by fold recognition is tested in a thorough benchmark analysis. Four Model Quality Assessment Programs (MQAPs) were tested on 188 targets from the latest LiveBench-9 automated structure evaluation experiment. We systematically test and evaluate whether the MQAP methods can successfully detect native-like models. Results: We show that, compared with the other three methods tested, MODCHECK is the most reliable method for consistently selecting the best top model and for ranking the models. In addition, we show that the choice of model similarity score used to assess a model's similarity to the experimental structure can influence the overall performance of these tools. Although these MQAP methods fail to improve the model selection performance for methods that already incorporate protein three-dimensional (3D) structural information, an improvement is observed for methods that are purely sequence-based, including the best profile–profile methods. This suggests that even the best sequence-based fold recognition methods can still be improved by taking into account 3D structural information.
Abstract:
Blood clotting response (BCR) resistance tests are available for a number of anticoagulant rodenticides. However, during the development of these tests many of the test parameters have been changed, making meaningful comparisons between results difficult. It was recognised that a standard methodology was urgently required for future BCR resistance tests and, accordingly, this document presents a reappraisal of published tests and proposes a standard protocol for future use (see Appendix). The protocol can be used to provide information on the incidence and degree of resistance in a particular rodent population; to provide a simple comparison of resistance factors between active ingredients, thus giving clear information about cross-resistance for any given strain; and to provide comparisons of susceptibility or resistance between different populations. The methodology has a sound statistical basis, being based on the ED50 response, and requires far fewer animals than the resistance tests in current use. Most importantly, the tests can be used to give a clear indication of the likely practical impact of the resistance on field efficacy. The present study was commissioned and funded by the Rodenticide Resistance Action Committee (RRAC) of CropLife International.
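The standard protocol itself is in the paper's appendix; purely to illustrate what an ED50-based test involves, here is a minimal sketch of estimating an ED50 from dose-response data by fitting a logistic curve on log dose. The doses, responses, and model choice are hypothetical.

```python
# Illustration only: estimating an ED50 from blood clotting response data by
# fitting a two-parameter logistic curve on log dose. All data are made up;
# the actual standardised protocol is the one given in the paper's appendix.
import numpy as np
from scipy.optimize import curve_fit

def logistic(log_dose, log_ed50, slope):
    """Probability of a positive response at a given log dose."""
    return 1.0 / (1.0 + np.exp(-slope * (log_dose - log_ed50)))

# Hypothetical data: dose (mg/kg) and fraction of animals responding.
dose = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
resp = np.array([0.05, 0.20, 0.55, 0.85, 0.95])

(log_ed50, slope), _ = curve_fit(logistic, np.log(dose), resp, p0=(0.7, 1.0))
# A resistance factor would then be the ratio of the ED50 of a resistant
# strain to that of a susceptible strain under the same protocol.
print("ED50 ≈ %.2f mg/kg (slope %.2f)" % (np.exp(log_ed50), slope))
```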
Abstract:
Background: A whole-genome genotyping array has previously been developed for Malus using SNP data from 28 Malus genotypes. This array offers the prospect of high-throughput genotyping and linkage map development for any given Malus progeny. To test the applicability of the array for mapping in diverse Malus genotypes, we applied the array to the construction of a SNP-based linkage map of an apple rootstock progeny.
Results: Of the 7,867 Malus SNP markers on the array, 1,823 (23.2%) were heterozygous in one of the two parents of the progeny and 1,007 (12.8%) were heterozygous in both parental genotypes, whilst just 2.8% of the 921 Pyrus SNPs were heterozygous. A linkage map spanning 1,282.2 cM was produced, comprising 2,272 SNP markers, 306 SSR markers and the S-locus. The length of the M432 linkage map was increased by 52.7 cM with the addition of the SNP markers, whilst marker density increased from 3.8 cM/marker to 0.5 cM/marker. Just three regions in excess of 10 cM remain where no markers were mapped. We compared the positions of the mapped SNP markers on the M432 map with their predicted positions on the ‘Golden Delicious’ genome sequence. A total of 311 markers (13.7% of all mapped markers) mapped to positions that conflicted with their predicted positions on the ‘Golden Delicious’ pseudo-chromosomes, indicating the presence of paralogous genomic regions or misassignments of genome sequence contigs during the assembly and anchoring of the genome sequence.
Conclusions: We incorporated data for the 2,272 SNP markers onto the map of the M432 progeny and have presented the most complete and saturated map of the full 17 linkage groups of M. pumila to date. The data were generated rapidly in a high-throughput semi-automated pipeline, permitting significant savings in time and cost over linkage map construction using microsatellites. The application of the array will permit linkage maps to be developed for QTL analyses in a cost-effective manner, and the identification of SNPs that have been assigned erroneous positions on the ‘Golden Delicious’ reference sequence will assist in the continued improvement of the genome sequence assembly for that variety.
Abstract:
This dissertation deals with aspects of sequential data assimilation (in particular, ensemble Kalman filtering) and numerical weather forecasting. In the first part, the recently formulated Ensemble Kalman-Bucy filter (EnKBF) is revisited. It is shown that the previously used numerical integration scheme fails when the magnitude of the background error covariance grows beyond that of the observational error covariance in the forecast window. We therefore present a suitable integration scheme that handles the stiffening of the differential equations involved without incurring further computational expense. Moreover, a transform-based alternative to the EnKBF is developed: under this scheme, the operations are performed in the ensemble space instead of in the state space. The advantages of this formulation are explained. For the first time, the EnKBF is implemented in an atmospheric model. The second part of this work deals with ensemble clustering, a phenomenon that arises when performing data assimilation using deterministic ensemble square root filters (EnSRFs) in highly nonlinear forecast models: an M-member ensemble detaches into an outlier and a cluster of M−1 members. Previous works may suggest that this issue represents a failure of EnSRFs; this work dispels that notion. It is shown that ensemble clustering can also be reverted by nonlinear processes, in particular the alternation between nonlinear expansion and compression of the ensemble in different regions of the attractor. Some EnSRFs that use random rotations have been developed to overcome this issue; these formulations are analysed, and their advantages and disadvantages with respect to common EnSRFs are discussed. The third and last part contains the implementation of the Robert-Asselin-Williams (RAW) filter in an atmospheric model. The RAW filter is an improvement to the widely used Robert-Asselin filter that successfully suppresses spurious computational waves while avoiding any distortion in the mean value of the function. Using statistical significance tests at both the local and field level, it is shown that the climatology of the SPEEDY model is not modified by the changed time-stepping scheme; hence, no retuning of the parameterizations is required. It is found that the accuracy of the medium-term forecasts is increased by using the RAW filter.
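The RAW filter itself is compact enough to sketch. The following shows one way to apply it within leapfrog time stepping, in the form published by Williams (alpha = 1 recovers the classical Robert-Asselin filter); the test problem and all numerical values are illustrative assumptions, and the SPEEDY-specific details are omitted.

```python
# Minimal sketch of leapfrog time stepping with the RAW filter. At each step
# the filter displacement d is split between the current and the new time
# level; with alpha = 1 this reduces to the classical Robert-Asselin filter.
def leapfrog_raw(f, x0, dt, n_steps, nu=0.2, alpha=0.53):
    x_prev = x0
    x_curr = x0 + dt * f(x0)                     # first step: forward Euler
    for _ in range(n_steps):
        x_next = x_prev + 2.0 * dt * f(x_curr)   # leapfrog step
        # RAW filter: damp the computational mode while (for alpha near 0.5)
        # leaving the three-level mean nearly untouched.
        d = 0.5 * nu * (x_prev - 2.0 * x_curr + x_next)
        x_prev = x_curr + alpha * d
        x_curr = x_next + (alpha - 1.0) * d
    return x_curr

# Oscillation test problem dx/dt = i*omega*x (illustrative choice): the
# physical mode should keep amplitude close to 1 after many steps.
omega = 1.0
f = lambda x: 1j * omega * x
x_end = leapfrog_raw(f, 1.0 + 0j, dt=0.1, n_steps=100)
print("amplitude after 100 steps:", abs(x_end))
```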
Abstract:
Epidemiological studies indicate that diets rich in fruits and vegetables (F&V) are protective against cardiovascular diseases (CVD). Pureed F&V products retain many beneficial components, including flavonoids, carotenoids, vitamin C and dietary fibre. This study aimed to establish the physiological effects of acute ingestion of a F&V puree-based drink (FVPD) on vasodilation, antioxidant status, phytochemical bioavailability and other CVD risk factors. Twenty-four subjects, aged 30-70 years, completed the randomised, single-blind, controlled, crossover test meal study. Subjects consumed 400 ml of FVPD, or a fruit-flavoured sugar-matched control, after following a low-flavonoid diet for 5 days. Blood and urine samples were collected throughout the study day, and vascular reactivity was assessed at 90-minute intervals using laser Doppler iontophoresis (LDI). FVPD significantly increased plasma vitamin C (P=0.002) and total nitrate/nitrite (P=0.001) concentrations. There was a near-significant time-by-treatment effect on ex vivo LDL oxidation (P=0.068), with a longer lag phase after consuming FVPD. During the 6 hours after juice consumption the antioxidant capacity of plasma increased significantly (P=0.003), and there was a simultaneous increase in plasma and urinary phenolic metabolites (P<0.05). There were significantly lower glucose and insulin peaks after ingestion of FVPD compared with control (P=0.019 and P=0.003), and a trend towards increased endothelium-dependent vasodilation following FVPD consumption (P=0.061). Overall, FVPD consumption significantly increased plasma vitamin C and total nitrate/nitrite concentrations, with a trend towards increased endothelium-dependent vasodilation. Pureed F&V products are useful vehicles for increasing micronutrient status, plasma antioxidant capacity and in vivo NO generation, which may contribute to CVD risk reduction.
Abstract:
Water-table reconstructions from Holocene peatlands are increasingly being used as indicators of terrestrial palaeoclimate in many regions of the world. However, the links between peatland water tables, climate, and long-term peatland development are poorly understood. Here we use a combination of high-resolution proxy climate data and a model of long-term peatland development to examine the relationship between rapid hydrological fluctuations in peatlands and climatic forcing. We show that changes in water-table depth can occur independently of climate forcing. Ecohydrological feedbacks inherent in peatland development can lead to a degree of homeostasis that partially disconnects peatland water-table behaviour from external climatic influences. We conclude by suggesting that further work needs to be done before peat-based climate reconstructions can be used to test climate models.
Abstract:
This paper describes the implementation of a semantic web search engine over conversation-style transcripts. Our choice of data is Hansard, a publicly available conversation-style transcript of parliamentary debates. The current search engine implementation on Hansard is limited to running search queries based on keywords or phrases, and hence lacks the ability to make semantic inferences from user queries. By making use of knowledge such as the relationships between members of parliament, constituencies, terms of office and topics of debates, the search results can be improved in terms of both relevance and coverage. Our contribution is not algorithmic; instead, we describe how we exploit a collection of external data sources, ontologies, semantic web vocabularies and named entity extraction in the analysis of the underlying semantics of user queries, as well as in the semantic enrichment of the search index, thereby improving the quality of the results.
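To make the general idea concrete (this is not the paper's system), the sketch below enriches a keyword query with entity relationships so that a search for a constituency also matches speeches by its MP; the tiny in-memory knowledge base is a hypothetical stand-in for the external ontologies and vocabularies the paper draws on.

```python
# Illustration of semantic query enrichment over Hansard-like transcripts.
# The knowledge base is a hypothetical stand-in for external ontologies.
knowledge = {
    "Finchley": {"type": "constituency", "mp": "Margaret Thatcher"},
    "Margaret Thatcher": {"type": "mp", "constituency": "Finchley"},
}

def enrich_query(terms):
    """Expand query terms with semantically related entity names."""
    expanded = set(terms)
    for term in terms:
        info = knowledge.get(term, {})
        expanded.update(v for k, v in info.items() if k != "type")
    return expanded

documents = [
    "Margaret Thatcher spoke on housing policy.",
    "A debate on transport in northern constituencies.",
]

query = enrich_query(["Finchley"])
hits = [d for d in documents if any(t in d for t in query)]
print(query)   # {'Finchley', 'Margaret Thatcher'}
print(hits)    # the first document matches via the MP relation
```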
Abstract:
Crystallization must occur in honey in order to produce set or creamed honey; however, the process must occur in a controlled manner in order to obtain an acceptable product. As a consequence, reliable methods are needed to measure the crystal content of honey (φ, expressed as kg crystal per kg honey), which can also be implemented with relative ease in industrial production facilities. Unfortunately, suitable methods do not currently exist. This article reports on the development of 2 independent offline methods to measure the crystal content in honey, based on differential scanning calorimetry and high-performance liquid chromatography. The 2 methods gave highly consistent results on the basis of a paired t-test involving 143 experimental points (P > 0.05, r² = 0.99). The crystal content also correlated with the relative viscosity, defined as the ratio of the viscosity of crystal-containing honey to that of the same honey when all crystals are dissolved, giving the following correlation: μ_r = 1 + 1398.8 φ^2.318. This correlation can be used to estimate the crystal content of honey in industrial production facilities. The crystal growth rate at a temperature of 14 °C (the normal crystallization temperature used in practice) was linear, and the growth rate also increased with the total glucose content of the honey.
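The reported correlation can be inverted to estimate crystal content from a viscosity measurement. A small sketch, assuming the correlation holds over the measured range:

```python
# Inverting the reported correlation mu_r = 1 + 1398.8 * phi**2.318 to
# estimate the crystal content phi (kg crystal per kg honey) from the
# relative viscosity, assuming the correlation holds over the range measured.
def crystal_content(mu_r):
    if mu_r < 1.0:
        raise ValueError("relative viscosity cannot be below 1")
    return ((mu_r - 1.0) / 1398.8) ** (1.0 / 2.318)

# Example: a honey measuring 20x the viscosity of its fully dissolved state.
print("phi ≈ %.3f kg/kg" % crystal_content(20.0))
```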