932 results for computation- and data-intensive applications


Relevance:

100.00%

Publisher:

Abstract:

For most data stream applications, the volume of data is too large to be stored on permanent devices or to be scanned thoroughly more than once. It is hence recognized that approximate answers are usually sufficient: a good approximation obtained in a timely manner is often better than an exact answer delivered beyond the window of opportunity. Unfortunately, this is not the case for mining frequent patterns over data streams, where algorithms capable of processing streams online do not conform strictly to a precise error guarantee. Since the quality of approximate answers is as important as their timely delivery, it is necessary to design algorithms that meet both criteria at the same time. In this paper, we propose an algorithm that processes streaming data online while guaranteeing that the support error of frequent patterns stays strictly within a user-specified threshold. Our theoretical and experimental studies show that the algorithm is an effective and reliable method for finding frequent sets in data stream environments when both constraints must be satisfied.
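The abstract does not spell out the algorithm itself; as an illustration of the kind of strict support-error guarantee described, here is a sketch of Manku and Motwani's classic Lossy Counting algorithm, which bounds the undercount of every reported pattern by a user-specified ε fraction of the stream length (the toy stream and threshold below are invented):

```python
from math import ceil

def lossy_counting(stream, epsilon):
    """Approximate frequency counts over a data stream.

    Guarantee: every item whose true frequency exceeds epsilon * N
    is retained, and each retained count undercounts the true
    frequency by at most epsilon * N, where N is the stream length.
    """
    width = ceil(1 / epsilon)           # bucket width
    counts = {}                         # item -> (count, max undercount)
    for n, item in enumerate(stream, start=1):
        bucket = ceil(n / width)
        if item in counts:
            c, d = counts[item]
            counts[item] = (c + 1, d)
        else:
            # an item first seen now may have been missed in
            # earlier buckets, hence undercount bound bucket - 1
            counts[item] = (1, bucket - 1)
        if n % width == 0:              # end of bucket: prune rare items
            for key in [k for k, (c, d) in counts.items() if c + d <= bucket]:
                del counts[key]
    return counts

# Items whose count + undercount clears a support threshold are frequent.
stream = list("ababcabadabae")
result = lossy_counting(stream, epsilon=0.2)
```

With ε = 0.2 the bucket width is 5, so rare items like `'c'` are pruned at bucket boundaries while frequent items such as `'a'` survive with exact or near-exact counts.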


Computers of a non-dedicated cluster are often idle (users attend meetings or take lunch or coffee breaks) or lightly loaded (users carry out simple computations to support problem-solving activities). These underutilised computers can be employed to execute parallel applications, and thus be shared by parallel and sequential applications, which could improve their execution performance. However, there is a lack of experimental studies showing application performance and system utilization when parallel and sequential applications execute concurrently, or when multiple parallel applications execute concurrently, on a non-dedicated cluster. Here we present the results of an experimental study into load-balancing-based scheduling of mixtures of NAS Parallel Benchmarks and BYTE sequential applications on a very low cost non-dedicated cluster. The study showed that the proposed sharing provided a performance boost, compared with executing the parallel load in isolation on a reduced number of computers, as well as better cluster utilization. The results of this research were used not only to validate other researchers' simulation-generated results but also to support our research mission of widening the use of non-dedicated clusters. These promising results could prompt further studies to convince universities, business and industry, which require large amounts of computing resources, to run parallel applications on the non-dedicated clusters they already own.


OBJECTIVE--The purpose of this study was to assess the effectiveness of a low-resource-intensive lifestyle modification program incorporating resistance training, and to compare the effects of gymnasium-based and home-based resistance training programs on diabetes diagnostic status and risk.

RESEARCH DESIGN AND METHODS--A quasi-experimental two-group study was undertaken with 122 participants with diabetes risk factors; 36.9% had impaired glucose tolerance (IGT) or impaired fasting glucose (IFG) at baseline. The intervention included a 6-week group self-management education program, a gymnasium-based or home-based 12-week resistance training program, and a 34-week maintenance program. Fasting plasma glucose (FPG) and 2-h plasma glucose, blood lipids, blood pressure, body composition, physical activity, and diet were assessed at baseline and week 52.

RESULTS--Mean 2-h plasma glucose and FPG fell by 0.34 mmol/l (95% CI -0.60 to -0.08) and 0.15 mmol/l (-0.23 to -0.07), respectively. The proportion of participants with IFG or IGT decreased from 36.9 to 23.0% (P = 0.006). Mean weight loss was 4.07 kg (-4.99 to -3.15). The only significant difference between resistance training groups was a greater reduction in systolic blood pressure in the gymnasium-based group (P = 0.008).

CONCLUSIONS--This intervention significantly improved diabetes diagnostic status and reduced diabetes risk to a degree comparable to that of other low-resource-intensive lifestyle modification programs and more intensive interventions applied to individuals with IGT. The effects of home-based and gymnasium-based resistance training did not differ significantly.


Objective: The staging model suggests that early stages of bipolar disorder respond better to treatments and have a more favourable prognosis. This study aims to provide empirical support for the model, and the allied construct of early intervention.

Methods: Pooled data from mania, depression, and maintenance studies of olanzapine were analyzed. Individuals were categorized as having had 0, 1–5, 6–10, or >10 prior episodes of illness, and data were analyzed across these groups.

Results: Response rates in the mania and maintenance studies ranged from 52–69% and 10–50%, respectively, for individuals with 1–5 previous episodes, and from 29–59% and 11–40% for individuals with >5 previous episodes. These rates were significantly higher for the 1–5 group on most measures of response, with up to a twofold increase in the chance of responding for those with fewer previous episodes. For the depression studies, response rates were significantly higher for the 1–5 group on two measures only. In the maintenance studies, the chance of relapse into either mania or depression was reduced by 40–60% for those who had experienced 1–5 or 6–10 episodes, respectively, compared with the >10-episode group. This trend was statistically significant only for relapse into mania in the 1–5 episode group (p = 0.005).

Conclusion: Individuals at the earliest stages of illness consistently had a more favourable response to treatment. This is consistent with the staging model and with the allied construct of early intervention.


This Account covers research dating from the early 1960s in the field of low-melting molten salts and hydrates, which has recently become popular under the rubric of “ionic liquids”. It covers understanding gained in the principal author’s laboratories (initially in Australia, but mostly in the U.S.A.) from spectroscopic, dynamic, and thermodynamic studies and includes recent applications of this understanding in the fields of energy conversion and biopreservation. Both protic and aprotic varieties of ionic liquids are included, but recent studies have focused on the protic class because of the special applications made possible by the highly variable proton activities available in these liquids.


This paper addresses a major challenge in data mining applications where full information about the underlying processes, such as sensor networks or large online databases, cannot practically be obtained due to physical limitations such as low bandwidth, memory, storage, or computing power. Motivated by the recent theory of direct information sampling called compressed sensing (CS), we propose a framework for detecting anomalies in such large-scale data mining applications. Exploiting the fact that the intrinsic dimension of the data in these applications is typically small relative to the raw dimension, and that compressed sensing captures most of the information with few measurements, we show that spectral methods used for volume anomaly detection can be applied directly to the CS data with performance guarantees. Our theoretical contributions are supported by extensive experimental results on large datasets, which show satisfactory performance.
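The abstract's pipeline can be illustrated with a toy sketch (not the authors' implementation; the dimensions, the Gaussian measurement matrix, and the rank-3 "normal" subspace are all assumptions): data with small intrinsic dimension is compressed by random projection, and a PCA-style spectral method flags points with large residual energy outside the dominant subspace.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 500 samples lying in a 3-dimensional subspace of a
# 100-dimensional ambient space, plus a few injected anomalies.
n, raw_dim, intrinsic = 500, 100, 3
basis = rng.standard_normal((raw_dim, intrinsic))
X = rng.standard_normal((n, intrinsic)) @ basis.T
X[:5] += 10 * rng.standard_normal((5, raw_dim))   # anomalies at rows 0-4

# Compressed sensing step: a random Gaussian matrix takes far fewer
# measurements than the raw dimension (m << raw_dim).
m = 15
Phi = rng.standard_normal((m, raw_dim)) / np.sqrt(m)
Y = X @ Phi.T                                     # m measurements per sample

# Spectral (PCA-style) anomaly detection on the compressed data:
# project onto the dominant subspace, score by residual energy.
Yc = Y - Y.mean(axis=0)
U, s, Vt = np.linalg.svd(Yc, full_matrices=False)
normal_subspace = Vt[:intrinsic]                  # rank assumed known here
residual = Yc - (Yc @ normal_subspace.T) @ normal_subspace
scores = np.linalg.norm(residual, axis=1)

flagged = set(np.argsort(scores)[-5:])            # top-5 anomaly scores
```

Because the normal samples lie near a low-dimensional subspace, their compressed images do too, so the anomalies dominate the residual scores even though only 15 of the 100 raw coordinates were measured.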


Structure-property relationships of thermosets are important in the manufacture and application of materials. The properties required of a material in a given application are related to the material's structure, and vice versa. The way in which the material is processed also determines the structure and resulting properties. Many books have been written about the chemistry of thermosets, but with only brief consideration of structure-property relationships. This book focuses on how the structure and properties of a range of thermosets affect the final material and its applications. It is composed of two parts: Part I, Structure and properties of thermosets, and Part II, Applications of thermosets. Part I starts with a comprehensive overview of thermosets covering structure, properties and processing for advanced applications, followed by four chapters addressing mechanical properties, thermal properties, rheology, and nanostructures and toughening. The applications presented in Part II range from the use of thermosets in the building and construction industry to aerospace applications, electrical applications, thermoset adhesives and insulation materials in appliances and other applications.

We hope that this book will be not only a useful textbook for advanced undergraduate and postgraduate students, but also a concise reference for researchers in academia and engineers in related industries. I would like to express my sincere gratitude to the staff of Woodhead Publishing Limited, especially Kathryn Picking, who invited me to edit this book and helped develop the initial content, and also Adam Hooper, Helen Bradley, Emily Cole, Francis Dodds and Rachel Cox for their assistance in many ways during the preparation of the manuscript. Finally, I wish to express my appreciation and respect to all the contributors for their commitment, patience and pleasant cooperation.


There is currently no universally recommended and accepted method of data processing within the science of indirect calorimetry, for either mixing chamber or breath-by-breath systems of expired gas analysis. Exercise physiologists were first surveyed to determine the methods used to process oxygen consumption (V̇O2) data and current attitudes to data processing within the science of indirect calorimetry. Breath-by-breath datasets obtained from indirect calorimetry during incremental exercise were then used to demonstrate the consequences of commonly used time-based, breath-based and digital-filter post-acquisition data processing strategies. Variability in the breath-by-breath data was assessed using multiple regression on the independent variables ventilation (VE) and the expired gas fractions for oxygen and carbon dioxide (FEO2 and FECO2, respectively). Based on the explained variance of the breath-by-breath V̇O2 data, processing methods to remove variability were proposed for time-averaged, breath-averaged and digital-filter applications. Among exercise physiologists, the strategy used to remove the variability in sequential V̇O2 measurements varied widely, and consisted of time averages (30 sec [38%], 60 sec [18%], 20 sec [11%], 15 sec [8%]), a moving average of five to 11 breaths (10%), and the middle five of seven breaths (7%). Most respondents indicated that they used multiple criteria to establish maximum V̇O2 (V̇O2max), including attainment of age-predicted maximum heart rate (HRmax) [53%], a respiratory exchange ratio (RER) >1.10 (49%) or >1.15 (27%), and a rating of perceived exertion (RPE) of >17, 18 or 19 (20%). The reasons stated for these strategies included the respondents' own beliefs (32%), what they were taught (26%), what they read in research articles (22%), tradition (13%) and the influence of colleagues (7%).
The combination of VE, FEO2 and FECO2 removed 96–98% of the breath-by-breath V̇O2 variability in incremental and steady-state exercise V̇O2 datasets, respectively. Reducing the residual error in V̇O2 datasets to 10% of the raw variability is achieved by applying a 30-second time average, a 15-breath running average, or a 0.04 Hz low cut-off digital filter. We therefore recommend that, once these data processing strategies are applied, the peak or maximal value be taken as the highest processed datapoint. Exercise physiologists need to agree on, and continually refine through empirical research, a consistent process for analysing data from indirect calorimetry.
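The time-averaging and breath-averaging strategies can be sketched on invented breath-by-breath data (only the 30-second and 15-breath parameters come from the text; the noise model and values are made up):

```python
import random

random.seed(1)

# Synthetic breath-by-breath VO2 data (L/min): a ramp to a plateau
# around 4.0 L/min with breath-to-breath noise (invented values).
times, vo2 = [], []
t = 0.0
for i in range(300):
    t += random.uniform(1.5, 3.0)            # breath interval, seconds
    times.append(t)
    base = min(4.0, 1.0 + 0.02 * i)
    vo2.append(base + random.gauss(0.0, 0.25))

def time_average(times, values, window=30.0):
    """Non-overlapping time averages (e.g. the recommended 30-s bins)."""
    out, bucket, edge = [], [], window
    for t, v in zip(times, values):
        if t > edge:
            out.append(sum(bucket) / len(bucket))
            bucket, edge = [], edge + window
        bucket.append(v)
    if bucket:
        out.append(sum(bucket) / len(bucket))
    return out

def breath_average(values, breaths=15):
    """Running average over a fixed number of breaths."""
    return [sum(values[i - breaths + 1:i + 1]) / breaths
            for i in range(breaths - 1, len(values))]

# Per the recommendation, the peak becomes the highest *processed* datapoint.
vo2_peak_30s = max(time_average(times, vo2))
vo2_peak_15b = max(breath_average(vo2))
```

Note how the processed peak is lower than the raw maximum breath: averaging removes the breath-to-breath variability that would otherwise inflate a peak taken from raw data.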


Monotonicity-preserving interpolation and approximation have received substantial attention in the last thirty years because of their numerous applications in computer-aided design, statistics, and machine learning [9, 10, 19]. Constrained splines are particularly popular because of their flexibility in modeling different geometrical shapes, sound theoretical properties, and the availability of numerically stable algorithms [9, 10, 26]. In this work we examine the parallelization and GPU adaptation of several algorithms for monotone spline interpolation and data smoothing, which arose in the context of estimating probability distributions.
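As a sketch of the sequential building block such GPU work starts from (the paper's own GPU kernels are not shown here), the Fritsch–Carlson slope limiter, also used by SciPy's PchipInterpolator, makes cubic Hermite interpolation preserve the monotonicity of the data:

```python
def pchip_slopes(x, y):
    """Fritsch-Carlson tangents: keeping each slope within 3x the
    adjacent secant slopes guarantees a monotone cubic Hermite spline
    on monotone data."""
    n = len(x)
    h = [x[i + 1] - x[i] for i in range(n - 1)]
    delta = [(y[i + 1] - y[i]) / h[i] for i in range(n - 1)]
    m = [0.0] * n
    m[0], m[-1] = delta[0], delta[-1]
    for i in range(1, n - 1):
        if delta[i - 1] * delta[i] > 0:
            # weighted harmonic mean of the adjacent secant slopes
            w1, w2 = 2 * h[i] + h[i - 1], h[i] + 2 * h[i - 1]
            m[i] = (w1 + w2) / (w1 / delta[i - 1] + w2 / delta[i])
        # else leave 0: flat tangent at a local extremum or plateau
    return m

def pchip_eval(x, y, m, t):
    """Evaluate the cubic Hermite spline with tangents m at point t."""
    i = len(x) - 2 if t >= x[-1] else max(j for j in range(len(x) - 1)
                                          if x[j] <= t)
    h = x[i + 1] - x[i]
    s = (t - x[i]) / h
    h00 = (1 + 2 * s) * (1 - s) ** 2      # Hermite basis functions
    h10 = s * (1 - s) ** 2
    h01 = s * s * (3 - 2 * s)
    h11 = s * s * (s - 1)
    return h00 * y[i] + h10 * h * m[i] + h01 * y[i + 1] + h11 * h * m[i + 1]

xs = [0.0, 1.0, 2.0, 3.0]
ys = [0.0, 0.1, 0.9, 1.0]      # monotone data with a steep middle section
slopes = pchip_slopes(xs, ys)
samples = [pchip_eval(xs, ys, slopes, 0.05 * k) for k in range(61)]
# The interpolant never overshoots: it stays monotone between the knots.
assert all(a <= b + 1e-12 for a, b in zip(samples, samples[1:]))
```

The per-interval independence of both the slope computation and the evaluation is what makes this family of algorithms attractive for the GPU parallelization the abstract describes.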


As part of a longitudinal study, infant/toddler pretend play development and maternal play modelling were investigated in a dyadic context. A total of 21 children were videotaped in monthly play sessions with their mothers, from age 8 to 17 months. Child and mother pretend play frequencies and levels were measured using Brown’s Pretend Play Observation Scale. Child IQ assessments at 5 years (Stanford–Binet IV) indicated average to high ability levels (M = 122.62). Descriptive analyses showed that children’s levels of pretend development were markedly in advance of age-typical expectations. Since a previous analysis showed no specific associations between play levels and IQ, intensive maternal scaffolding, data analysis approaches, and the use of abstract play materials are proposed as possible contributory factors to the children’s advanced pretend play development.


The emergence of new media, including branded websites, social media and mobile applications, has created additional touch points for unhealthy food and beverage companies to target children and adolescents. The aim of this study was to perform an audit of new media for three top-selling food and beverage brands in Australia. The top-selling brand in three of the most advertised food and beverage categories was identified. Facebook pages, websites and mobile phone applications from these three brands were assessed using a combination of descriptive analyses and structured data collection during June and July 2013. Information on target audience, main focus of the activity, marketing strategies employed and connectivity was collected. Promotional activities were assessed against industry self-regulatory codes. McDonald's, Coca-Cola and Cadbury Dairy Milk were audited, with 21 promotional activities identified. These promotional activities appeared to use a number of marketing strategies, with frequent use of indirect product association, engagement techniques and branding. We identified strategic targeting of both children and adolescents. We found that while all promotional activities technically met self-regulatory codes (usually due to media-specific age restrictions), a number appeared to employ unhealthy food or beverage marketing directed at children. Brands are using engaging content via new media aimed at children and adolescents to promote unhealthy food and beverages. Given the limitations of self-regulatory codes in the context of new media, strategies need to be developed to reduce the exposure of children and adolescents to marketing of unhealthy food and beverage products via these avenues.


BACKGROUND: Physical activity is a modifiable behavior related to many preventable non-communicable diseases. There is an age-related decline in physical activity levels in young people, which tracks into adulthood. Common interactive technologies such as smartphones, particularly those employing immersive features, may enhance the appeal and delivery of interventions to increase levels of physical activity in young people. The primary aim of the Apps for IMproving FITness (AIMFIT) trial is to evaluate the effectiveness of two popular "off-the-shelf" smartphone apps for improving cardiorespiratory fitness in young people.

METHODS/DESIGN: A three-arm, parallel, randomized controlled trial will be conducted in Auckland, New Zealand. Fifty-one eligible young people aged 14-17 years will be randomized to one of three conditions: 1) use of an immersive smartphone app, 2) use of a non-immersive app, or 3) usual behavior (control). Both smartphone apps consist of an eight-week training program designed to improve fitness and the ability to run 5 km; however, the immersive app features a game-themed design and adds a narrative. Data are collected at baseline and 8 weeks. The primary outcome is cardiorespiratory fitness, assessed as time to complete the one-mile run/walk test at 8 weeks. Secondary outcomes are physical activity levels, self-efficacy, enjoyment, psychological need satisfaction, and acceptability and usability of the apps. Analysis using intention-to-treat principles will be performed using regression models.

DISCUSSION: Despite the proliferation of commercially available smartphone applications, there is a dearth of empirical evidence to support their effectiveness on the targeted health behavior. This pragmatic study will determine the effectiveness of two popular "off-the-shelf" apps as stand-alone instruments for improving fitness and physical activity among young people. Adherence to app use will not be closely controlled; however, random allocation of participants, a heterogeneous group, and data analysis using intention-to-treat principles provide internal and external validity to the study. The primary outcome will be objectively assessed with a valid and reliable field-based test, and the secondary outcome of physical activity via accelerometry. If effective, such applications could be used alongside existing interventions to promote fitness and physical activity in this population.

TRIAL REGISTRATION: Australian New Zealand Clinical Trials Registry: ACTRN12613001030763. Registered 16 September 2013.


The need to estimate a particular quantile of a distribution is an important problem which frequently arises in many computer vision and signal processing applications. For example, our work was motivated by the requirements of many semi-automatic surveillance analytics systems which detect abnormalities in closed-circuit television (CCTV) footage using statistical models of low-level motion features. In this paper we specifically address the problem of estimating the running quantile of a data stream with non-stationary stochasticity when the memory for storing observations is limited. We make several major contributions: (i) we derive an important theoretical result which shows that the change in the quantile of a stream is constrained regardless of the stochastic properties of the data, (ii) we describe a set of high-level design goals for an effective estimation algorithm that emerge as a consequence of our theoretical findings, (iii) we introduce a novel algorithm which implements the aforementioned design goals by retaining a sample of data values in a manner adaptive to changes in the distribution of the data and progressively narrowing down its focus in periods of quasi-stationary stochasticity, and (iv) we present a comprehensive evaluation of the proposed algorithm against the existing methods in the literature on both synthetic data sets and three large 'real-world' streams acquired in the course of operation of an existing commercial surveillance system. Our findings convincingly demonstrate that the proposed method is highly successful and vastly outperforms the existing alternatives, especially when the target quantile is high valued and the available buffer capacity is severely limited.
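The paper's adaptive algorithm is not reproduced in the abstract; for context, a standard bounded-memory baseline it competes against can be sketched as a reservoir-sampled quantile estimate (buffer size, quantile and stream below are invented):

```python
import random

class ReservoirQuantile:
    """Bounded-memory running quantile estimate via reservoir sampling.

    A uniform random sample of the stream is kept in a fixed-size
    buffer; the requested quantile of the sample estimates the
    quantile of the stream seen so far. This is a simple baseline:
    unlike the adaptive buffer-narrowing described in the paper, it
    does not concentrate its samples around the target quantile.
    """
    def __init__(self, capacity, q, seed=0):
        self.capacity, self.q = capacity, q
        self.buffer, self.seen = [], 0
        self.rng = random.Random(seed)

    def update(self, value):
        # Algorithm R: each stream item survives with prob capacity/seen
        self.seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append(value)
        else:
            j = self.rng.randrange(self.seen)
            if j < self.capacity:
                self.buffer[j] = value

    def estimate(self):
        ordered = sorted(self.buffer)
        k = min(len(ordered) - 1, int(self.q * len(ordered)))
        return ordered[k]

rng = random.Random(42)
est = ReservoirQuantile(capacity=100, q=0.9)
for _ in range(10000):
    est.update(rng.random())   # uniform stream: true 0.9-quantile is 0.9
approx = est.estimate()
```

With only 100 retained values out of 10,000, the estimate lands near the true quantile for this stationary stream; the accuracy degrades for very high quantiles, which is the regime the paper targets.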


The need to estimate a particular quantile of a distribution is an important problem that frequently arises in many computer vision and signal processing applications. For example, our work was motivated by the requirements of many semiautomatic surveillance analytics systems that detect abnormalities in closed-circuit television footage using statistical models of low-level motion features. In this paper, we specifically address the problem of estimating the running quantile of a data stream when the memory for storing observations is limited. We make several major contributions: 1) we highlight the limitations of approaches previously described in the literature that make them unsuitable for nonstationary streams; 2) we describe a novel principle for the utilization of the available storage space; 3) we introduce two novel algorithms that exploit the proposed principle in different ways; and 4) we present a comprehensive evaluation and analysis of the proposed algorithms and the existing methods in the literature on both synthetic data sets and three large real-world streams acquired in the course of operation of an existing commercial surveillance system. Our findings convincingly demonstrate that both of the proposed methods are highly successful and vastly outperform the existing alternatives. We show that the better of the two algorithms (the data-aligned histogram) exhibits far superior performance in comparison with the previously described methods, achieving estimation errors more than 10 times lower on real-world data, even when its available working memory is an order of magnitude smaller.
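For context, a minimal fixed-memory histogram quantile estimator of the kind the paper improves upon can be sketched as follows (the naive fixed-range binning here is the assumption; the paper's contribution is precisely to align bin boundaries to the data instead):

```python
class FixedHistogramQuantile:
    """Quantile estimation from a fixed-bin histogram.

    Memory is constant (one counter per bin) regardless of stream
    length. Accuracy is limited by the bin width, which is exactly
    the weakness a data-aligned histogram addresses by placing bin
    boundaries according to the observed data.
    """
    def __init__(self, lo, hi, bins=64):
        self.lo, self.hi, self.bins = lo, hi, bins
        self.counts = [0] * bins
        self.total = 0

    def update(self, value):
        # clamp the value to the assumed range, then bucket it
        clamped = min(max(value, self.lo), self.hi)
        idx = int((clamped - self.lo) / (self.hi - self.lo) * self.bins)
        self.counts[min(idx, self.bins - 1)] += 1
        self.total += 1

    def estimate(self, q):
        # walk the cumulative counts to the target rank,
        # report the midpoint of the bin that contains it
        target = q * self.total
        running = 0
        for i, c in enumerate(self.counts):
            running += c
            if running >= target:
                width = (self.hi - self.lo) / self.bins
                return self.lo + (i + 0.5) * width
        return self.hi

hist = FixedHistogramQuantile(lo=0.0, hi=1.0, bins=64)
for k in range(1, 10001):
    hist.update((k % 1000) / 1000.0)   # deterministic stream on [0, 1)
approx = hist.estimate(0.95)
```

The estimate can never be more accurate than half a bin width, and the fixed `lo`/`hi` range must be known in advance, which is untenable for nonstationary streams; both limitations motivate adapting the bins to the data.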