928 results for Data-driven energy efficiency


Relevance: 100.00%

Abstract:

Background: Alterations in energy expenditure during activity after head injury have not been investigated, due primarily to the difficulty of measurement. Objective: The aim of this study was to compare the energy expenditure during activity and the body composition of children following acquired brain injury (ABI) with data from a group of normal controls. Design: Energy expenditure was measured using the Cosmed K4b2 in a group of 15 children with ABI and a group of 67 normal children during rest, walking, and running. The mean number of steps taken per 3 min run was also recorded, and body composition was measured. Results: The energy expended during walking was not significantly different between the groups. A significant difference was found between the two groups in the energy expended during running and in the number of steps taken: children with ABI took significantly fewer steps than the normal controls during a 3 min run. Conclusions: Children with ABI expend more energy per activity than healthy controls when velocity or distance is controlled for. However, when free to choose their own comfortable pace, they expend less energy walking and running than normal controls. © 2003 Elsevier Ltd. All rights reserved.

Relevance: 100.00%

Abstract:

In the present study a two-dimensional model is first developed to show the behaviour of a dense non-aqueous phase liquid (DNAPL) within a rough fracture. To represent the roughness, the fracture is assigned variable apertures along its plane. It is found that the DNAPL follows preferential pathways. In the next part of the study the model is extended to non-isothermal DNAPL flow and to the DNAPL-water interphase mass transfer phenomenon. These two models are then coupled with joint deformation due to normal stresses. The primary focus of these models is to elucidate the influence of joint alteration, caused by external stress and fluid pressures, on flow-driven energy transport and interphase mass transfer. For this, the critical value for joint alteration is assumed to be associated with the external stress and the average of the water and DNAPL pressures in the multiphase system, and the temporal and spatial evolution of joint alteration is determined to assess its further influence on energy transport and miscible phase transfer. The developed model is used to show the influence of deformation on DNAPL flow. This preliminary study further demonstrates the influence of joint deformation on heat transport and phase miscibility via the multiphase flow velocities. The temperature profile changes and shows higher diffusivity due to deformation; although the interphase miscibility value decreases, the lateral dispersion increases to a considerably higher extent.
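
As a rough sketch of the variable-aperture idea (all parameter values and the cubic-law simplification below are illustrative assumptions, not the paper's coupled model), one can generate a spatially correlated aperture field and convert it to local transmissivity; low-aperture cells then act as barriers, so invading DNAPL concentrates along high-transmissivity channels:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)

# Log-normally distributed, spatially correlated aperture field b(x, z) [m]
nx, nz = 100, 50
noise = gaussian_filter(rng.standard_normal((nz, nx)), sigma=3)  # correlation length
b_mean, b_sigma = 2e-4, 0.5                                      # assumed values
aperture = b_mean * np.exp(b_sigma * noise)

# Cubic law: local fracture transmissivity T = b^3 / (12 mu)
mu_water = 1e-3  # Pa.s
T = aperture**3 / (12.0 * mu_water)

# Cells in the top decile of transmissivity mark likely preferential pathways
pathways = T > np.quantile(T, 0.9)
print(f"preferential-pathway fraction: {pathways.mean():.2f}")
```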

Relevance: 100.00%

Abstract:

In this paper we develop compilation techniques for realizing applications described in a High Level Language (HLL) on a runtime reconfigurable architecture. The compiler determines Hyper Operations (HyperOps), which are subgraphs of an application's data flow graph comprising elementary operations with strong producer-consumer relationships. These HyperOps are hosted on computation structures that are provisioned on demand at runtime. We also report compiler optimizations that collectively reduce the overheads of data-driven computations in runtime reconfigurable architectures. On average, HyperOps offer a 44% reduction in total execution time and an 18% reduction in management overheads compared to using basic blocks as coarse-grained operations. We show that HyperOps formed by our compiler are suitable for supporting data flow software pipelining.
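
To illustrate the grouping idea only (this is not the authors' compiler; networkx and the single-producer/single-consumer merge heuristic are assumptions), a toy pass can fuse dataflow-graph nodes joined by an exclusive producer-consumer edge into candidate HyperOps:

```python
import networkx as nx

# Toy dataflow graph: nodes are elementary ops, edges are value flows
g = nx.DiGraph([("a", "b"), ("b", "c"), ("a", "d"), ("d", "c"), ("c", "e")])

# Greedy heuristic: fuse an edge u->v into one HyperOp when v is u's only
# consumer and u is v's only producer (a strong producer-consumer chain).
hyperop_of = {n: {n} for n in g.nodes}
for u, v in list(g.edges):
    if g.out_degree(u) == 1 and g.in_degree(v) == 1:
        merged = hyperop_of[u] | hyperop_of[v]
        for n in merged:
            hyperop_of[n] = merged

hyperops = {frozenset(s) for s in hyperop_of.values()}
print(hyperops)  # {c, e} fuse; a, b, d stay separate (fan-out/fan-in)
```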

Relevance: 100.00%

Abstract:

Understanding plant demography and plant response to herbivory is critical to the selection of effective weed biological control agents. We adopt the metaphor of 'filters' to suggest how agent prioritisation may be improved to narrow our choices down to those agents most likely to achieve the desired weed management outcome. Models can capture our level of knowledge (or ignorance) about the study system, and we illustrate how one modelling approach (matrix models) may be useful in identifying the weak link in a plant life cycle, using a hypothetical example and an actual weed (Parkinsonia aculeata). Once the vulnerable stage has been identified, we propose that studying plant response to herbivory (simulated and/or actual) can help identify the guilds of herbivores to which a plant is most likely to succumb. Taking only potentially effective agents through the filter of host specificity may improve the chances of releasing safe and effective agents. The methods we outline may not always lead definitively to the successful agent(s), but such an empirical, data-driven approach makes the basis for agent selection explicit and provides testable hypotheses once agents are released.
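
To make the matrix-model filter concrete, here is a minimal sketch (the stage names and transition rates are invented, not the Parkinsonia aculeata parameterization): compute the population growth rate λ from a stage-structured projection matrix, then the elasticity of λ to each transition; the largest elasticities flag the life-cycle stage where herbivory would hurt most.

```python
import numpy as np

# Hypothetical stage-structured projection matrix (seed, juvenile, adult):
# A[i, j] = per-capita contribution of stage j to stage i per time step.
A = np.array([
    [0.10, 0.00, 50.0],   # seed-bank survival and adult fecundity
    [0.05, 0.30, 0.00],   # germination and juvenile stasis
    [0.00, 0.10, 0.90],   # maturation and adult survival
])

eigvals, right = np.linalg.eig(A)
k = np.argmax(eigvals.real)
lam = eigvals.real[k]                 # asymptotic population growth rate
w = np.abs(right[:, k].real)          # stable stage distribution

eigvals_T, left = np.linalg.eig(A.T)
kT = np.argmax(eigvals_T.real)
v = np.abs(left[:, kT].real)          # reproductive values

# Sensitivity s_ij = v_i w_j / <v, w>; elasticity e_ij = (a_ij / lambda) s_ij.
# Elasticities sum to one, so they are directly comparable across transitions.
S = np.outer(v, w) / (v @ w)
E = (A / lam) * S
print(f"lambda = {lam:.3f}")
print("elasticities (rows: to-stage, cols: from-stage):")
print(np.round(E, 3))
```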

Relevance: 100.00%

Abstract:

Physical inactivity, low cardiorespiratory fitness, and abdominal obesity are direct and mediating risk factors for cardiovascular disease (CVD). The results of recent studies suggest that individuals with higher levels of physical activity or cardiorespiratory fitness have lower CVD and all-cause mortality than those with lower activity or fitness levels, regardless of their level of obesity. The interrelationships of physical activity, fitness, and abdominal obesity with cardiovascular risk factors have not been studied in detail. The aim of this study was to investigate the associations of different types of leisure time physical activity and aerobic fitness with cardiovascular risk factors in a large population of Finnish adults. In addition, a novel aerobic fitness test was implemented, and the distribution of aerobic fitness was explored in men and women across age groups. The interrelationships of physical activity, aerobic fitness, and abdominal obesity were examined in relation to cardiovascular risk factors. This study was part of the National FINRISK Study 2002, which monitors cardiovascular risk factors in the Finnish adult population. The sample comprised 13 437 men and women aged 25 to 74 years and was drawn from the Population Register as a stratified random sample according to 10-year age group, gender, and area. A separate physical activity study included 9 179 subjects, of whom 5 980 (65%) participated. At the study site, weight, height, waist and hip circumferences, and blood pressure were measured, a blood sample was drawn, and an aerobic fitness test was performed. The fitness test estimated maximal oxygen uptake (VO2max) and was based on a non-exercise method using a heart rate monitor at rest. The waist-to-hip ratio (WHR), calculated by dividing waist circumference by hip circumference, was used as the measure of abdominal obesity. Participants filled in a questionnaire on health behavior, history of diseases, and current health status, and a detailed 12-month leisure time physical activity recall. Based on the recall data, relative energy expenditure was calculated using metabolic equivalents, and physical activity was divided into conditioning, non-conditioning, and commuting physical activity. Participants aged 45 to 74 years were later invited to take part in a 2-hour oral glucose tolerance test with fasting insulin and glucose measurements, on the basis of which undiagnosed impaired glucose tolerance and type 2 diabetes were defined. The estimated aerobic fitness was lower among women and decreased with age. A higher estimated aerobic fitness and a lower WHR were independently associated with lower systolic and diastolic blood pressure, lower total cholesterol and triglyceride levels, and higher high-density lipoprotein (HDL) cholesterol and HDL to total cholesterol ratio. The associations of the estimated aerobic fitness with diastolic blood pressure, triglycerides, and HDL to total cholesterol ratio were stronger in men with a higher WHR. High levels of conditioning and non-conditioning physical activity were associated with lower high-sensitivity C-reactive protein (CRP) levels. High levels of conditioning and overall physical activity were associated with lower insulin and glucose levels. The associations were stronger among women than men. Better self-rated physical fitness was associated with a higher estimated aerobic fitness, lower CRP levels, and lower insulin and glucose levels in men and women.
In each WHR third, the risk of impaired glucose tolerance and type 2 diabetes was higher among physically inactive individuals who did not undertake at least 30 minutes of moderate-intensity physical activity on five days per week. These cross-sectional data show that higher levels of estimated aerobic fitness and regular leisure time physical activity are associated with a favorable cardiovascular risk factor profile, and that these associations are present at all levels of abdominal obesity. Most of the associations followed a dose-response pattern, suggesting that even low levels of physical activity or fitness are beneficial to health and that larger improvements in risk factor levels may be gained from higher activity and fitness levels. The present findings support the recommendation to engage regularly in leisure time physical activity, to pursue a high level of aerobic fitness, and to prevent abdominal obesity.
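
The MET-based energy expenditure calculation reduces to a weighted sum (a minimal sketch; the activity list and MET values below are illustrative compendium-style figures, not the FINRISK coding):

```python
# Leisure time physical activity recall -> weekly energy expenditure in MET-hours.
recall = [
    # (activity, MET value, hours per week) -- illustrative figures only
    ("walking", 3.5, 3.0),
    ("jogging", 7.0, 1.5),
    ("cycling", 6.0, 2.0),
]

met_hours = sum(met * hours for _, met, hours in recall)
print(f"weekly leisure time activity: {met_hours:.1f} MET-h")

# Waist-to-hip ratio as the measure of abdominal obesity
waist_cm, hip_cm = 88.0, 102.0
print(f"WHR = {waist_cm / hip_cm:.2f}")
```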

Relevance: 100.00%

Abstract:

Data-driven approaches such as Gaussian Process (GP) regression have been used extensively in recent robotics literature to achieve estimation by learning from experience. To ensure satisfactory performance, in most cases, multiple learning inputs are required. Intuitively, adding new inputs can often contribute to better estimation accuracy; however, it may come at the cost of a new sensor, a larger training dataset, and/or more complex learning, sometimes for limited benefit. Therefore, it is crucial to have a systematic procedure for determining the actual impact each input has on estimation performance. To address this issue, in this paper we propose to analyse the impact of each input on the estimate using a variance-based sensitivity analysis method. We propose an approach built on Analysis of Variance (ANOVA) decomposition, which can characterise how the prediction changes as one or more of the inputs change, and can quantify the prediction uncertainty attributed to each of the inputs in a framework that accommodates dependent inputs. We apply the proposed approach to a terrain-traversability estimation method we proposed in prior work, which is based on multi-task GP regression, and we validate the implementation experimentally using a rover on a Mars-analogue terrain.
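
For intuition, a minimal variance-based sensitivity sketch is given below under simplifying assumptions the paper does not make: independent inputs, a plain single-task GP from scikit-learn, and the Saltelli pick-freeze estimator of first-order Sobol indices rather than the authors' dependent-input ANOVA decomposition.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(1)

# Toy training data: the target depends strongly on x0, weakly on x1
X = rng.uniform(-1, 1, size=(200, 2))
y = np.sin(3 * X[:, 0]) + 0.1 * X[:, 1] + 0.05 * rng.standard_normal(200)
gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)

# First-order Sobol indices via the pick-freeze estimator (independent inputs):
# S_i measures the share of prediction variance explained by input i alone.
n = 4096
A, B = rng.uniform(-1, 1, (n, 2)), rng.uniform(-1, 1, (n, 2))
fA, fB = gp.predict(A), gp.predict(B)
var = fA.var()
for i in range(2):
    Bi = B.copy()
    Bi[:, i] = A[:, i]                  # freeze input i at A's values
    S_i = np.mean(fA * (gp.predict(Bi) - fB)) / var
    print(f"S_{i} ~ {S_i:.2f}")         # expect S_0 >> S_1 for this toy model
```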

Relevance: 100.00%

Abstract:

Efficient and reliable diagnostic tools for the routine indexing and certification of clean propagating material are essential for the management of pospiviroid diseases in horticultural crops. This study describes the development of a truly multiplexed diagnostic method for the detection and identification of all nine currently recognized pospiviroid species in one assay using Luminex bead-based suspension array technology. In addition, a new data-driven statistical method is presented for establishing thresholds for positivity for individual assays within multiplexed arrays. When applied to the multiplexed array data generated in this study, the new method showed better control of false positive and false negative results than two other commonly used approaches for setting thresholds. The 11-plex Luminex MagPlex-TAG pospiviroid array described here has a unique hierarchical assay design, incorporating a near-universal assay in addition to nine species-specific assays, and a co-amplified plant internal control assay for quality assurance purposes. All assays of the multiplexed array were shown to be 100% specific, sensitive, and reproducible. The multiplexed array is robust, easy to use, displays unambiguous results, and has strong potential for use in routine pospiviroid indexing to improve disease management strategies.
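
The paper's threshold statistics are not reproduced here, but the flavor of a data-driven positivity threshold can be sketched as follows (assumed approach: a robust per-assay cutoff from negative-control signal intensities, one common choice rather than the published method):

```python
import numpy as np

def positivity_threshold(negative_controls, k=3.0):
    """Per-assay cutoff: median + k * MAD of negative-control intensities.

    A robust alternative to mean + 3*SD; k is a tunable stringency factor.
    """
    neg = np.asarray(negative_controls, dtype=float)
    mad = 1.4826 * np.median(np.abs(neg - np.median(neg)))  # ~sigma if normal
    return np.median(neg) + k * mad

# Illustrative median fluorescence intensities (MFI) from one assay's
# negative controls (invented numbers)
neg_mfi = [52, 61, 48, 55, 70, 58, 49, 63]
cutoff = positivity_threshold(neg_mfi)
print(f"call positive when MFI > {cutoff:.0f}")
for mfi in (57, 410):
    print(mfi, "positive" if mfi > cutoff else "negative")
```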

Relevance: 100.00%

Abstract:

What can the statistical structure of natural images teach us about the human brain? Even though the visual cortex is one of the most studied parts of the brain, surprisingly little is known about how exactly images are processed to leave us with a coherent percept of the world around us, so that we can recognize a friend or drive on a crowded street without any effort. By constructing probabilistic models of natural images, the goal of this thesis is to understand the structure of the stimulus that is the raison d'être for the visual system. Following the hypothesis that the optimal processing has to be matched to the structure of that stimulus, we attempt to derive computational principles, features that the visual system should compute, and properties that cells in the visual system should have. Starting from machine learning techniques such as principal component analysis and independent component analysis, we construct a variety of statistical models to discover structure in natural images that can be linked to receptive field properties of neurons in primary visual cortex, such as simple and complex cells. We show that by representing images with phase-invariant, complex cell-like units, a better statistical description of the visual environment is obtained than with linear simple cell units, and that complex cell pooling can be learned by estimating both layers of a two-layer model of natural images. We investigate how a simplified model of the processing in the retina, where adaptation and contrast normalization take place, is connected to the natural stimulus statistics. Analyzing the effect that retinal gain control has on later cortical processing, we propose a novel method to perform gain control in a data-driven way. Finally we show how models like those presented here can be extended to capture whole visual scenes rather than just small image patches. By using a Markov random field approach we can model images of arbitrary size, while still being able to estimate the model parameters from the data.
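
As a toy version of the first modeling step (standard FastICA on whitened image patches, not the thesis's two-layer or Markov random field models; the sparse synthetic "image" below stands in for a natural-image dataset):

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)

# Stand-in image: sparse smoothed impulses give the super-Gaussian statistics
# ICA needs; real experiments use grayscale photographs of natural scenes.
spikes = (rng.random((256, 256)) < 0.02) * rng.standard_normal((256, 256))
image = gaussian_filter(spikes, sigma=1.5)

# Sample 5000 random 8x8 patches and remove each patch's mean (DC component)
n, p = 5000, 8
rows = rng.integers(0, 256 - p, n)
cols = rng.integers(0, 256 - p, n)
patches = np.stack([image[r:r + p, c:c + p].ravel() for r, c in zip(rows, cols)])
patches -= patches.mean(axis=1, keepdims=True)

# FastICA whitens internally and estimates independent components; on real
# natural images the rows of ica.components_ resemble localized, oriented
# Gabor-like filters (simple-cell receptive fields).
ica = FastICA(n_components=32, random_state=0, max_iter=500)
ica.fit(patches)
filters = ica.components_.reshape(32, p, p)
print(filters.shape)
```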

Relevance: 100.00%

Abstract:

Kafka On The Shore consists of three simple concrete letterforms floating on a gallery wall. Reminiscent of minimalist sculpture, the mathematical expression of the letterforms states that 'r' is greater than 'g'. Despite this material simplicity, the solemn presentation of the formula suggests a sense of foreboding, a quiet menace. The work was created as a response to the economic theories of Thomas Piketty presented in his book Capital in the Twenty-First Century. The primary finding of Piketty's data-driven research is the formula presented by the work: historically, wealth and inequality both flourish when the rate of return on capital (r) is greater than the rate of economic growth (g). With this simple mathematical summary the book acts as a sobering indictment of the present state of economic inequality.

Relevance: 100.00%

Abstract:

Inadvertent failure of power transformers has serious consequences for power system reliability, economics, and revenue accrual. Insulation is the weakest link in a power transformer, prompting periodic inspection of its condition over time. Close monitoring of the electrical, chemical, and other insulation properties that are sensitive to time-dependent degradation is mandatory to judge the status of the equipment. The focus here is data-driven Diagnostic Testing and Condition Monitoring (DTCM) specific to power transformers. The authors develop a Monte Carlo approach for augmenting the rather scanty experimental data normally acquired from power transformer prototypes. Also described is a validation procedure for estimating the accuracy of the model so developed.
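
A generic sketch of this kind of Monte Carlo augmentation (the Weibull model, the sample values, and the validation split are assumptions, not the authors' procedure): fit a parametric distribution to the few measured degradation values, draw synthetic samples from it, and check the fit against held-out data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Scanty prototype measurements, e.g. insulation breakdown voltages in kV
# (illustrative values, not from the paper)
measured = np.array([41.2, 38.7, 44.1, 39.9, 42.6, 37.5, 43.3, 40.8])

# Hold out two points for validation, fit a 2-parameter Weibull to the rest
train, test = measured[:6], measured[6:]
shape, loc, scale = stats.weibull_min.fit(train, floc=0.0)

# Monte Carlo augmentation: draw synthetic observations from the fitted model
synthetic = stats.weibull_min.rvs(shape, loc=loc, scale=scale,
                                  size=1000, random_state=rng)
print(f"fitted Weibull: shape={shape:.1f}, scale={scale:.1f} kV")
print(f"synthetic mean {synthetic.mean():.1f} kV vs measured {measured.mean():.1f} kV")

# Crude validation: Kolmogorov-Smirnov test of held-out data against the fit
ks = stats.kstest(test, stats.weibull_min(shape, loc=loc, scale=scale).cdf)
print(f"KS p-value on held-out data: {ks.pvalue:.2f}")
```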

Relevance: 100.00%

Abstract:

The purpose of this research is to examine whether short-term communication training can have an impact on the communication capacity of working communities, and what the prerequisites for the creation of such capacity are. The subjects of this research were short-term communication trainings aimed at the managerial and expert levels of enterprises and communities. The research endeavors to find out how communication trainings with an impact should be devised and implemented, and what this requires from the client and the provider of the training service. The research data consists mostly of quantitative feedback collected at the end of each training day, as well as delayed interviews. The evaluations were based on a stakeholder approach; those concerned were participants in the trainings, the clients having commissioned the trainings, and the communication trainers. The principal method of the qualitative analysis is data-driven content analysis. Two research instruments were constructed for the analysis and for the presentation of the results: an evaluation circle for the purposes of holistic evaluation, and a development matrix for structuring an effective training. The core concept of the matrix is the carrier wave effect, which is needed to carry the abstractions from the training into concrete functions in everyday life. The relevance of the results was tested in a pilot organization. The immediate assessments and the delayed evaluations gave very different pictures of the trainings. The immediate feedback was of nearly commendable level, but the effects carried forward into the everyday situations of the working community were small, and the learning was rarely applied in practice. A training session that receives good feedback does not automatically result in the development of individual competence, let alone that of the community. The results show that even short-term communication training can promote communication competence that eventually changes the working culture on an organizational level, provided that the training is designed as a process and that the connections to the participants' work are ensured. It is essential that all eight elements of the carrier wave effect are taken into account. The entire purchaser-provider process must function, without omitting the contribution of the participants themselves. The research illustrates the so-called bow-tie model of effective communication training based on the carrier wave effect. Testing the results in pilot trainings showed that a rather small change in the training approach may have a significant effect on the outcome of the training, as well as on the effects carried on into the working community. The evaluation circle proved to be a useful tool that can be used in planning, executing, and evaluating training in practice. The development matrix works as a tool for those producing the training service, those using the service, and those deciding on the purchase of the service, in planning and evaluating training that sustainably improves communication capacity. Thus the evaluation circle also works to support and ensure the long-term effects of short-term trainings. In addition to communication trainings, the tools developed for this research are usable for many needs where an organization seeks to improve its operations and profitability through training.

Relevance: 100.00%

Abstract:

Deterministic models have been widely used to predict water quality in distribution systems, but their calibration requires extensive and accurate data sets for numerous parameters. In this study, alternative data-driven modeling approaches based on artificial neural networks (ANNs) were used to predict temporal variations of two important characteristics of water quality: chlorine residual and biomass concentrations. The authors considered three types of ANN algorithms. Of these, the Levenberg-Marquardt algorithm provided the best results in predicting residual chlorine and biomass with error-free and "noisy" data. The ANN models developed here can generate water quality scenarios of piped systems in real time to help utilities determine weak points of low chlorine residual and high biomass concentration and select optimum remedial strategies.
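
A rough stand-in for such a data-driven model (scikit-learn's MLPRegressor trains with L-BFGS or Adam rather than Levenberg-Marquardt, and the lagged-input design and synthetic decay data are assumptions):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic hourly chlorine residual: first-order decay plus measurement noise
t = np.arange(500)
chlorine = 0.8 * np.exp(-0.01 * t) + 0.3 + 0.02 * rng.standard_normal(t.size)

# Predict the next value from the three previous hours (lagged inputs)
lags = 3
X = np.column_stack([chlorine[i:len(chlorine) - lags + i] for i in range(lags)])
y = chlorine[lags:]
X_train, X_test, y_train, y_test = X[:400], X[400:], y[:400], y[400:]

mlp = MLPRegressor(hidden_layer_sizes=(10,), solver="lbfgs",
                   max_iter=2000, random_state=0).fit(X_train, y_train)
print(f"test R^2 = {mlp.score(X_test, y_test):.3f}")
```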

Relevance: 100.00%

Abstract:

This paper presents a new algorithm for extracting Free-Form Surface Features (FFSFs) from a surface model. The extraction algorithm is based on a taxonomy of FFSFs modified from that proposed in the literature. A new classification scheme is proposed for FFSFs to enable their representation and extraction. The paper proposes the separating curve as the signature of an FFSF in a surface model. FFSFs are classified based on the characteristics of the separating curve (number and type) and the influence region (the region enclosed by the separating curve). A method to extract these entities is presented. The algorithm has been implemented and tested on various free-form surface features on different types of free-form (base) surfaces, and is found to correctly identify and represent the features irrespective of the type of underlying surface. The representation and the extraction algorithm are both based on topology and geometry. The algorithm is data-driven and does not use any pre-defined templates. The definition presented for a feature is unambiguous and application-independent. The proposed classification of FFSFs can be used to develop an ontology to determine semantic equivalences for features to be exchanged, mapped, and used across PLM applications. (C) 2011 Elsevier Ltd. All rights reserved.
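
The classification scheme can be pictured with a small, hypothetical data structure (the names and the toy classification rule below are illustrative, not the paper's representation): a feature is characterized by its separating curves, each with a type, plus the influence region they enclose.

```python
from dataclasses import dataclass, field
from enum import Enum

class CurveType(Enum):
    """Nature of a separating curve on the base surface (illustrative)."""
    CLOSED = "closed"      # fully encloses the influence region
    OPEN = "open"          # terminates on the surface boundary

@dataclass
class SeparatingCurve:
    curve_type: CurveType
    points: list = field(default_factory=list)  # sampled curve geometry

@dataclass
class FreeFormSurfaceFeature:
    separating_curves: list          # the signature of the feature
    influence_region_faces: set      # faces enclosed by the curves

    def classify(self) -> str:
        """Toy rule using the number and type of separating curves."""
        n = len(self.separating_curves)
        all_closed = all(c.curve_type is CurveType.CLOSED
                         for c in self.separating_curves)
        if n == 1 and all_closed:
            return "interior feature (e.g. bump or dent)"
        if n == 1:
            return "boundary feature"
        return f"compound feature with {n} separating curves"

f = FreeFormSurfaceFeature([SeparatingCurve(CurveType.CLOSED)], {3, 4, 7})
print(f.classify())
```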

Relevance: 100.00%

Abstract:

The recent focus of flood frequency analysis (FFA) studies has been on developing methods to model joint distributions of variables such as peak flow, volume, and duration that characterize a flood event, as comprehensive knowledge of a flood event is often necessary in hydrological applications. A diffusion-process-based adaptive kernel (D-kernel) is suggested in this paper for this purpose. It is data-driven and flexible and, unlike most kernel density estimators, always yields a bona fide probability density function. It overcomes shortcomings associated with the use of conventional kernel density estimators in FFA, such as the boundary leakage problem and the normal reference rule. The potential of the D-kernel is demonstrated by application to synthetic samples of various sizes drawn from known unimodal and bimodal populations, and to five typical peak flow records from different parts of the world. It is shown to be effective when compared to the conventional Gaussian kernel and the best of seven commonly used copulas (Gumbel-Hougaard, Frank, Clayton, Joe, Normal, Plackett, and Student's t) in estimating the joint distribution of peak flow characteristics and extrapolating beyond historical maxima. Selection of the optimum number of bins is found to be critical in modeling with the D-kernel.
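
The diffusion idea at the core of this family of estimators can be sketched in one dimension, following the well-known discrete-cosine-transform formulation of diffusion KDE (a generic version with a fixed, hand-picked bandwidth and bin count, not the paper's adaptive bivariate D-kernel):

```python
import numpy as np
from scipy.fft import dct, idct

rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(2, 0.5, 400), rng.normal(5, 1.0, 600)])

# Bin the data onto a regular grid over a slightly padded range
n_bins = 256
lo, hi = data.min() - 1, data.max() + 1
hist, edges = np.histogram(data, bins=n_bins, range=(lo, hi), density=True)

# Diffusion KDE: evolve the histogram under the heat equation for time t.
# In DCT space the solution is exact: coefficient k decays by exp(-(pi k)^2 t / 2),
# and the k = 0 (mass) coefficient is untouched, so the density stays normalized.
t = 5e-4                      # squared bandwidth on the unit interval (hand-picked)
k = np.arange(n_bins)
a = dct(hist, norm="ortho")
density = idct(a * np.exp(-0.5 * (np.pi * k) ** 2 * t), norm="ortho")
density = np.clip(density, 0, None)          # clip tiny numerical negatives

bin_width = edges[1] - edges[0]
print(f"integral ~ {density.sum() * bin_width:.3f}")  # should be close to 1
```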

Relevance: 100.00%

Abstract:

Real-world biological systems such as the human brain are inherently nonlinear and difficult to model. However, most previous studies have employed either linear models or parametric nonlinear models for investigating brain function. In this paper, a novel application of a recurrence-based nonlinear measure of phase synchronization, the correlation between probabilities of recurrence (CPR), is proposed for studying connectivity in the brain. Being non-parametric, this method makes very few assumptions, making it suitable for investigating brain function in a data-driven way. CPR's utility is demonstrated on multichannel electroencephalographic (EEG) signals. Brain connectivity obtained from the thresholded CPR matrix of multichannel EEG signals showed clear differences in the number and pattern of connections between (a) epileptic seizure and pre-seizure states and (b) eyes-open and eyes-closed states. The corresponding brain headmaps provide meaningful insights about synchronization in the brain in those states. K-means clustering of connectivity parameters from CPR and from linear correlation, obtained for global epileptic seizure and pre-seizure, showed significantly larger cluster centroid distances for CPR than for linear correlation, demonstrating the superior ability of CPR to discriminate seizure from pre-seizure. The headmap in the case of focal epilepsy clearly enables identification of the focus of the epilepsy, which is of diagnostic value. (C) 2013 Elsevier Ltd. All rights reserved.
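
CPR itself is straightforward to compute (a minimal sketch following the standard recurrence-based definition; the 10% recurrence-rate threshold is a common choice, and phase-space embedding is omitted for brevity, so this is not necessarily the paper's exact pipeline):

```python
import numpy as np

def recurrence_probability(x, eps, max_lag):
    """Generalized autocorrelation p(tau): fraction of points that recur
    within eps of themselves after a lag of tau samples."""
    p = np.empty(max_lag)
    for tau in range(1, max_lag + 1):
        p[tau - 1] = np.mean(np.abs(x[tau:] - x[:-tau]) < eps)
    return p

def cpr(x, y, max_lag=200, rate=0.10):
    """Correlation between probabilities of recurrence of two signals.

    eps is chosen per signal so that roughly `rate` of all pairs recur.
    """
    ps = []
    for s in (x, y):
        eps = np.quantile(np.abs(s[:, None] - s[None, :]), rate)
        ps.append(recurrence_probability(s, eps, max_lag))
    p1, p2 = [(p - p.mean()) / p.std() for p in ps]
    return np.mean(p1 * p2)

# Two phase-locked noisy oscillators vs. an unrelated one
t = np.linspace(0, 20 * np.pi, 2000)
rng = np.random.default_rng(0)
a = np.sin(t) + 0.2 * rng.standard_normal(t.size)
b = np.sin(t + 0.8) + 0.2 * rng.standard_normal(t.size)   # phase-locked to a
c = np.sin(1.7 * t) + 0.2 * rng.standard_normal(t.size)   # different frequency
print(f"CPR(a, b) = {cpr(a, b):.2f}   CPR(a, c) = {cpr(a, c):.2f}")
```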