88 results for computation- and data-intensive applications


Relevance: 100.00%

Abstract:

In this paper, a hybrid neural classifier combining the auto-encoder neural network and the Learning Vector Quantization (LVQ) model is described. The auto-encoder network is used for dimensionality reduction, projecting high-dimensional data into a 2D space. The LVQ model is used for data visualization by forming and adapting the granularity of a data map. The mapped data are employed to predict the target classes of new data samples. To improve classification accuracy, the hybrid classifier adopts a majority voting scheme. To demonstrate the applicability of the hybrid classifier, a series of experiments using simulated and real fault data from induction motors is conducted. The results show that the hybrid classifier outperforms the Multi-Layer Perceptron neural network and produces very good classification accuracy for various fault conditions of induction motors.
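A minimal numpy sketch (not the authors' code) of the pipeline this abstract describes, under illustrative assumptions: a linear single-layer auto-encoder projecting to 2D, LVQ1 with one prototype per class, and majority voting over ensemble members trained with different random seeds. Integer class labels and pre-scaled features are assumed.

```python
import numpy as np

def train_autoencoder(X, dim=2, lr=1e-3, epochs=500, seed=0):
    """Train a linear auto-encoder X -> dim -> X by gradient descent."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W1 = rng.normal(scale=0.1, size=(d, dim))
    W2 = rng.normal(scale=0.1, size=(dim, d))
    for _ in range(epochs):
        Z = X @ W1                       # encode to 2D
        E = Z @ W2 - X                   # reconstruction error
        W2 -= lr * Z.T @ E / len(X)
        W1 -= lr * X.T @ (E @ W2.T) / len(X)
    return W1                            # encoder weights

def train_lvq(Z, y, lr=0.05, epochs=30, seed=0):
    """LVQ1: pull the nearest prototype toward same-class samples, push otherwise."""
    rng = np.random.default_rng(seed)
    classes = np.unique(y)
    protos = np.array([Z[y == c].mean(axis=0) for c in classes])
    for _ in range(epochs):
        for i in rng.permutation(len(Z)):
            j = np.argmin(((protos - Z[i]) ** 2).sum(axis=1))
            sign = 1.0 if classes[j] == y[i] else -1.0
            protos[j] += sign * lr * (Z[i] - protos[j])
    return protos, classes

def predict(X, W1, protos, labels):
    """Classify by the nearest LVQ prototype in the 2D code space."""
    Z = X @ W1
    idx = ((protos[None] - Z[:, None]) ** 2).sum(axis=2).argmin(axis=1)
    return labels[idx]

def ensemble_predict(X_train, y_train, X_test, n_members=5):
    """Majority vote over members trained with different random seeds
    (y must contain small non-negative integer labels for bincount)."""
    votes = []
    for s in range(n_members):
        W1 = train_autoencoder(X_train, seed=s)
        protos, labels = train_lvq(X_train @ W1, y_train, seed=s)
        votes.append(predict(X_test, W1, protos, labels))
    votes = np.stack(votes)
    return np.array([np.bincount(col).argmax() for col in votes.T])
```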

Relevance: 100.00%

Abstract:

The cost and time of deploying HPC applications on clouds is a problem. Instead of conducting their research, discipline specialists are forced to carry out application deployment, publication, and access activities. In response, a new approach to HPC application deployment and access in clouds is proposed. The major innovations are a new approach to deploying and executing HPC applications on IaaS and PaaS clouds, and the exposure of HPC applications as services. Through three case studies, this paper demonstrates the feasibility and effectiveness of the proposed approach, which could lead to a SaaS library of discipline-oriented services invocable through user-friendly, discipline-specific interfaces. The new approach will reduce the time and money needed to deploy and expose discipline-specific HPC applications.
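A minimal sketch of the "HPC application as a service" idea: a thin REST wrapper that submits a pre-deployed command-line application as a job and reports its status. The executable name, endpoints, and JSON fields are hypothetical placeholders, not taken from the paper; a real deployment would hand the job to the cluster scheduler (e.g. sbatch) rather than launch it directly.

```python
import subprocess
import uuid

from flask import Flask, jsonify, request

app = Flask(__name__)
jobs = {}  # job_id -> subprocess.Popen

@app.post("/jobs")
def submit_job():
    """Accept input parameters and launch the wrapped HPC application."""
    params = request.get_json(force=True)
    job_id = str(uuid.uuid4())
    jobs[job_id] = subprocess.Popen(
        ["./hpc_solver", "--input", params["input_file"]])  # hypothetical binary
    return jsonify({"job_id": job_id}), 202

@app.get("/jobs/<job_id>")
def job_status(job_id):
    """Report whether a previously submitted job is still running."""
    proc = jobs.get(job_id)
    if proc is None:
        return jsonify({"error": "unknown job"}), 404
    state = "running" if proc.poll() is None else "finished"
    return jsonify({"job_id": job_id, "state": state})

if __name__ == "__main__":
    app.run()
```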

Relevance: 100.00%

Abstract:

Since its establishment, the Android applications market has been plagued by a proliferation of malicious applications. Recent studies show that rogue developers are injecting malware into legitimate market applications, which are then uploaded to open-source sites for consumer download. Often, applications are infected several times. In this paper, we investigate the behavior of malicious Android applications and present a simple and effective way to safely execute and analyze them. As part of this analysis, we use the Android application sandbox DroidBox to generate behavioral graphs for each sample, and these graphs provide the basis for developing patterns that aid identification. As a result, we are able to determine whether family names have been correctly assigned by current anti-virus vendors. Our results indicate that traditional anti-virus mechanisms are not able to correctly identify malicious Android applications.
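A minimal sketch of building a behavioral graph from a sandbox log and comparing samples by shared behavior. The log format below (a JSON list of records with an "operation" field) is a simplified stand-in for real DroidBox output, and the edge-overlap similarity measure is an illustrative assumption rather than the paper's method.

```python
import json

import networkx as nx

def behaviour_graph(log_path):
    """Chain consecutive sandbox operations into a directed behaviour graph."""
    with open(log_path) as f:
        events = json.load(f)
    g = nx.DiGraph()
    for prev, cur in zip(events, events[1:]):
        g.add_edge(prev["operation"], cur["operation"])
    return g

def edge_jaccard(g1, g2):
    """Jaccard similarity of edge sets; 1.0 means identical behaviour chains."""
    e1, e2 = set(g1.edges), set(g2.edges)
    return len(e1 & e2) / len(e1 | e2) if e1 | e2 else 0.0

# Samples whose graphs are highly similar would be expected to share a
# family label; disagreement with anti-virus names flags misassignment.
```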

Relevance: 100.00%

Abstract:

Thousands of students are preparing for chemistry examinations in June. An unresolved debate is whether they should be permitted to use graphics and programmable calculators in those examinations. Some educators have not only advocated the use of graphics calculators, but have also pointed to the Danish system in which students are permitted to use computers in senior school examinations.

In some Australian jurisdictions, graphics calculators are permitted in year 12 mathematics examinations, but not in chemistry examinations. The reasoning is that information or methods of solving numerical chemical problems can be stored in the memory of graphics calculators, giving some students an unfair advantage. This means that chemistry students either have to learn how to use (and buy!) two types of calculators or, if they only have one calculator, are disadvantaged in using non-programmable calculators in mathematics examinations.

The use of technology (or the lack thereof) can limit how and what students learn. “The mechanics of computation and human thought” is an allusion to Asimov’s short story, “A Feeling of Power”, in which overuse of technology has caused people to forget how to do simple arithmetic. In our current assessment system, the insistence that students must be able to do simple chemical calculations has led to underuse of available technology. The misperception is that the ability to do calculations is linked to understanding of concepts.

Graphics calculators, programmable calculators and computers are tools. Instead of banning or limiting technology, we should take the opportunity to rethink what is being assessed and how it is assessed. It is the proper use of technology, combining the mechanics of computation with human thought to deepen understanding and to ask probing questions, that truly leads to a feeling of power.

Relevance: 100.00%

Abstract:

Lovastatin is a potent hypocholesterolemic drug used for lowering blood cholesterol. It acts by competitively inhibiting 3-hydroxy-3-methylglutaryl coenzyme A reductase, the enzyme involved in the biosynthesis of cholesterol. It is produced as a secondary metabolite by a variety of filamentous fungi, including Penicillium species, Monascus ruber and Aspergillus terreus. Biotechnological production of lovastatin is less costly than chemical synthesis. In recent years, lovastatin has also been reported as a potential therapeutic agent for the treatment of various types of tumors, and it has been shown to play a significant role in the regulation of the inflammatory and immune response, coagulation, bone turnover, neovascularization, vascular tone, and arterial pressure. This review focuses on the structure, biosynthesis, biotechnological production and biomedical applications of lovastatin.

Relevance: 100.00%

Abstract:

Research in conditioning (all the processes of preparation for competition) has used group research designs, where multiple athletes are observed at one or more points in time. However, empirical reports of large inter-individual differences in response to conditioning regimens suggest that applied conditioning research would greatly benefit from single-subject research designs. Single-subject research designs allow us to find out the extent to which a specific conditioning regimen works for a specific athlete, as opposed to the average athlete, who is the focal point of group research designs. The aim of this review is to outline the strategies and procedures of single-subject research as they pertain to the assessment of conditioning for individual athletes. The four main experimental designs in single-subject research are the AB design, reversal (withdrawal) designs and their extensions, multiple-baseline designs, and alternating-treatment designs. Visual and statistical analyses commonly used to analyse single-subject data are discussed, along with their advantages and limitations. Modelling of multivariate single-subject data using techniques such as dynamic factor analysis and structural equation modelling may identify individualised models of conditioning, leading to better prediction of performance. Despite problems associated with data analysis in single-subject research (e.g. serial dependency), sports scientists should use single-subject research designs in applied conditioning research to understand how well an intervention (e.g. a training method) works and to predict performance for a particular athlete.
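A minimal sketch of how an AB single-subject design might be summarised: the shift in phase means and the percentage of non-overlapping data (PND), i.e. the share of intervention points that exceed the best baseline point. The performance values are invented for illustration.

```python
import numpy as np

baseline = np.array([52.1, 53.0, 51.8, 52.5, 53.2])      # A phase (e.g. a speed measure)
intervention = np.array([53.5, 54.1, 53.9, 54.6, 55.0])  # B phase, after the new regimen

mean_shift = intervention.mean() - baseline.mean()
pnd = 100.0 * np.mean(intervention > baseline.max())      # percentage of non-overlapping data

print(f"mean shift: {mean_shift:.2f}")
print(f"PND: {pnd:.0f}% of intervention points exceed the baseline maximum")
```

Note that serial dependency in such series limits conventional significance tests, which is one reason visual analysis and indices like PND are common in this literature.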

Relevance: 100.00%

Abstract:

The Android platform uses a permission system model to allow users and developers to regulate access to private information and system resources required by applications. Permissions have proven useful for inferring the behaviors and characteristics of an application. In this paper, a novel method for extracting contrasting permission patterns for clean and malicious applications is proposed. Contrary to existing work, both required and used permissions were considered when discovering the patterns. We evaluated our methodology on a clean and a malware dataset, each comprising 1227 applications. Our empirical results suggest that our permission patterns can capture key differences between clean and malicious applications, which can assist in characterizing these two types of applications.
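A minimal sketch of one way to mine contrasting permission patterns: score each permission pair by the difference between its support in the malware set and in the clean set, keeping pairs whose supports differ strongly. The datasets, the pair-only search, and the threshold are illustrative assumptions, not the paper's exact algorithm.

```python
from itertools import combinations

def support(pattern, apps):
    """Fraction of apps whose permission set contains every item in pattern."""
    return sum(pattern <= perms for perms in apps) / len(apps)

def contrasting_patterns(clean, malware, min_diff=0.3):
    """Rank permission pairs by |support(malware) - support(clean)|."""
    perms = set().union(*clean, *malware)
    patterns = []
    for pair in combinations(sorted(perms), 2):
        pat = set(pair)
        diff = support(pat, malware) - support(pat, clean)
        if abs(diff) >= min_diff:
            patterns.append((pair, diff))
    return sorted(patterns, key=lambda p: -abs(p[1]))

# Toy permission sets; positive scores indicate malware-leaning patterns.
clean = [{"INTERNET"}, {"INTERNET", "CAMERA"}]
malware = [{"INTERNET", "SEND_SMS"}, {"INTERNET", "SEND_SMS", "READ_CONTACTS"}]
print(contrasting_patterns(clean, malware))
```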

Relevance: 100.00%

Abstract:

PURPOSE

To introduce techniques for deriving a map that relates visual field locations to optic nerve head (ONH) sectors, and to apply these techniques to relate Medmont perimetric data to data from the Heidelberg Retina Tomograph.

METHODS

Spearman correlation coefficients were calculated relating each visual field location (Medmont M700) to rim area and volume measures for 10° ONH sectors (HRT III software) for 57 participants: 34 with glaucoma, 18 with suspected glaucoma, and 5 with ocular hypertension. Correlations were constrained to be anatomically plausible by a computational model of the axon growth of retinal ganglion cells (Algorithm GROW). GROW generated a map relating field locations to sectors of the ONH. For each location, the sector with the maximum statistically significant (P < 0.05) correlation coefficient within 40° of the angle predicted by GROW was computed. Before correlation, both functional and structural data were normalized by either normative data or the fellow eye of each participant.

RESULTS

The model of axon growth produced a 24-2 map that is qualitatively similar to existing maps derived from empirical data. When GROW was used in conjunction with normative data, 31% of field locations exhibited a statistically significant relationship. This proportion increased to 67% (z-test, z = 4.84; P < 0.001) when both field and rim area data were normalized with the fellow eye.

CONCLUSIONS

A computational model of axon growth and normalization of data by the fellow eye can assist in constructing an anatomically plausible map connecting visual field data and sectoral ONH data.
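A minimal sketch of the mapping rule described in METHODS: for one field location, only ONH sectors within 40° of the angle predicted by the growth model are considered, and the sector with the largest statistically significant Spearman correlation is kept. The data shapes and function names are assumptions for illustration.

```python
import numpy as np
from scipy.stats import spearmanr

def best_sector(field_vals, sector_rim, sector_angles, predicted_angle,
                window=40.0, alpha=0.05):
    """field_vals: (n_subjects,) sensitivities at one field location;
    sector_rim: (n_subjects, n_sectors) rim measures per 10-degree sector."""
    best = None
    for s, angle in enumerate(sector_angles):
        # circular angular distance between this sector and the GROW prediction
        delta = abs((angle - predicted_angle + 180) % 360 - 180)
        if delta > window:
            continue  # anatomically implausible for this location
        rho, p = spearmanr(field_vals, sector_rim[:, s])
        if p < alpha and (best is None or rho > best[1]):
            best = (s, rho)
    return best  # (sector index, rho), or None if no significant sector
```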

Relevance: 100.00%

Abstract:

PURPOSE: To examine the acceptability of the methods used to evaluate Coping-Together, one of the first self-directed coping skills interventions for couples facing cancer, and to collect preliminary efficacy data. METHODS: Forty-two couples, randomized to a minimal ethical care (MEC) condition or to Coping-Together, completed a survey at baseline and 2 months later, a cost diary, and a process evaluation phone interview. RESULTS: One hundred seventy patients were referred to the study; however, 57 couples did not meet all eligibility criteria, and 51 refused study participation. On average, two to three couples were randomized per month, and it took 26 days to enrol a couple in the study. Two couples withdrew from MEC, none from Coping-Together. Only 44% of the cost diaries were completed, and 55% of patients and 60% of partners found the surveys too long, despite the follow-up survey being five pages shorter than the baseline one. Trends in favor of Coping-Together were noted for both patients and their partners. CONCLUSIONS: This study identified the challenges of conducting dyadic research, and a number of suggestions were put forward for future studies, including questioning whether distress screening was necessary and what kind of control group might be more appropriate.

Relevance: 100.00%

Abstract:

For multiple heterogeneous multicore server processors across clouds and data centers, the aggregated performance of the cloud of clouds can be optimized by load distribution and balancing. Energy efficiency is one of the most important issues for large-scale server systems in current and future data centers. Multicore processor technology provides new levels of performance and energy efficiency. This paper aims to develop power- and performance-constrained load distribution methods for cloud computing in current and future large-scale data centers. In particular, we address the problem of optimal power allocation and load distribution for multiple heterogeneous multicore server processors across clouds and data centers. Our strategy is to formulate optimal power allocation and load distribution for multiple servers in a cloud of clouds as optimization problems, i.e., power-constrained performance optimization and performance-constrained power optimization. Our research problems in large-scale data centers are well-defined multivariable optimization problems, which explore the power-performance tradeoff by fixing one factor and minimizing the other, from the perspective of optimal load distribution. Such power and performance optimization is important for a cloud computing provider seeking to efficiently utilize all available resources. We model a multicore server processor as a queueing system with multiple servers. Our optimization problems are solved for two different models of core speed: one assumes that a core runs at zero speed when it is idle, and the other assumes that a core runs at a constant speed. Our results provide new theoretical insights into power management and performance optimization in data centers.
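A minimal numeric sketch in the spirit of the power-constrained performance optimization described above: choose server speeds so that mean response time is minimized subject to a power budget, with power modelled as speed raised to a constant exponent and each server treated as an M/M/1 queue. The arrival rates, exponent, and budget are invented; the paper's multicore queueing model is considerably richer.

```python
import numpy as np
from scipy.optimize import minimize

lam = np.array([0.8, 1.5])   # arrival rate at each server (tasks/s)
alpha, budget = 3.0, 12.0    # power ~ speed**alpha, total power budget

def mean_response(h):
    """Traffic-weighted M/M/1 mean response time, 1/(mu - lambda).
    Optimizing over headroom h_i = s_i - lam_i keeps queues stable."""
    return np.sum((lam / lam.sum()) / h)

res = minimize(
    mean_response,
    x0=[0.5, 0.5],                     # a feasible starting headroom
    bounds=[(1e-3, None)] * 2,         # strictly positive headroom
    constraints=[{"type": "ineq",      # total power within budget
                  "fun": lambda h: budget - np.sum((lam + h) ** alpha)}],
)
speeds = lam + res.x
print("optimal speeds:", speeds, "mean response time:", res.fun)
```

The dual problem (performance-constrained power optimization) follows by swapping the objective and the constraint.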

Relevance: 100.00%

Abstract:

BACKGROUND: There has been a recent proliferation in the development of smartphone applications (apps) aimed at modifying various health behaviours. While interventions that incorporate behaviour change techniques (BCTs) have been associated with greater effectiveness, it is not clear to what extent smartphone apps incorporate such techniques. The purpose of this study was to investigate the presence of BCTs in physical activity and dietary apps and determine how reliably the taxonomy checklist can be used to identify BCTs in smartphone apps.

METHODS: The top-20 paid and top-20 free physical activity and/or dietary behaviour apps from the New Zealand Apple App Store Health & Fitness category were downloaded to an iPhone. Four independent raters user-tested and coded each app for the presence/absence of BCTs using the taxonomy of behaviour change techniques (26 BCTs in total). The number of BCTs included in the 40 apps was calculated. Krippendorff's alpha was used to evaluate interrater reliability for each of the 26 BCTs.

RESULTS: Apps included an average of 8.1 techniques (range 2-18), with the number slightly higher for paid apps (M = 9.7, range 2-18) than for free apps (M = 6.6, range 3-14). The most frequently included BCTs were "provide instruction" (83% of the apps), "set graded tasks" (70%), and "prompt self-monitoring" (60%). Techniques such as "teach to use prompts/cues", "agree on behavioural contract", "relapse prevention" and "time management" were not present in the apps reviewed. Interrater reliability coefficients ranged from 0.1 to 0.9 (mean = 0.6, SD = 0.2).

CONCLUSIONS: Presence of BCTs varied by app type and price; however, BCTs associated with increased intervention effectiveness were in general more common in paid apps. The taxonomy checklist can be used by independent raters to reliably identify BCTs in physical activity and dietary behaviour smartphone apps.
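A minimal sketch of Krippendorff's alpha for nominal data with no missing values, the kind of calculation one might run per BCT on a 4-rater binary coding matrix like the study's. The ratings below are invented toy data.

```python
import numpy as np

def krippendorff_alpha_nominal(ratings):
    """ratings: (n_units, n_raters) integer array, no missing values."""
    n_units, n_raters = ratings.shape
    values = np.unique(ratings)
    # coincidence matrix o[c, k]: co-occurrences of values within units
    o = np.zeros((len(values), len(values)))
    for unit in ratings:
        counts = np.array([(unit == v).sum() for v in values])
        o += (np.outer(counts, counts) - np.diag(counts)) / (n_raters - 1)
    n_c = o.sum(axis=1)              # marginal frequency of each value
    n = n_c.sum()
    d_o = o.sum() - np.trace(o)      # observed disagreement
    d_e = (np.outer(n_c, n_c).sum() - (n_c ** 2).sum()) / (n - 1)
    return 1.0 - d_o / d_e

# 6 apps coded 1/0 for one BCT by 4 raters (toy data)
ratings = np.array([[1, 1, 1, 1],
                    [0, 0, 0, 0],
                    [1, 1, 0, 1],
                    [0, 0, 0, 1],
                    [1, 1, 1, 1],
                    [0, 1, 0, 0]])
print(f"alpha = {krippendorff_alpha_nominal(ratings):.2f}")
```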

Relevance: 100.00%

Abstract:

Background: Remote telemonitoring holds great potential to augment management of patients with coronary heart disease (CHD) and atrial fibrillation (AF) by enabling regular physiological monitoring during physical activity. Remote physiological monitoring may improve home and community exercise-based cardiac rehabilitation (exCR) programs and could improve assessment of the impact and management of pharmacological interventions for heart rate control in individuals with AF.

Objective: Our aim was to evaluate the measurement validity and data transmission reliability of a remote telemonitoring system comprising a wireless multi-parameter physiological sensor, custom mobile app, and middleware platform, among individuals in sinus rhythm and AF.

Methods: Participants in sinus rhythm and with AF undertook simulated daily activities and low-, moderate-, and/or high-intensity exercise. Heart rate and respiratory rate from the remote monitoring system were compared to reference measures (12-lead ECG and indirect calorimeter). Wireless data transmission loss was calculated between the sensor, mobile app, and remote Internet server.

Results: Median heart rate (-0.30 to 1.10 b∙min⁻¹) and respiratory rate (-1.25 to 0.39 br∙min⁻¹) measurement biases were small, yet statistically significant (all P≤.003) due to the large number of observations. Measurement reliability was generally excellent (rho=.87-.97, all P<.001; intraclass correlation coefficient [ICC]=.94-.98, all P<.001; coefficient of variation [CV]=2.24-7.94%), although respiratory rate measurement reliability was poor among AF participants (rho=.43, P<.001; ICC=.55, P<.001; CV=16.61%). Data loss was minimal (<5%) when all system components were active; however, instability of the network hosting the remote data capture server resulted in data loss at the remote Internet server during some trials.

Conclusions: System validity was sufficient for remote monitoring of heart and respiratory rates across a range of exercise intensities. Remote exercise monitoring has potential to augment current exCR and heart rate control management approaches by enabling the provision of individually tailored care to individuals outside traditional clinical environments.
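A minimal sketch of the kind of device-versus-reference agreement statistics reported above: mean bias, 95% limits of agreement, Spearman's rho, and one common coefficient-of-variation formulation. The paired readings are invented, and the study's ICC analysis is omitted for brevity.

```python
import numpy as np
from scipy.stats import spearmanr

device = np.array([61, 74, 89, 103, 118, 131, 146, 158])     # b/min, wireless sensor
reference = np.array([60, 75, 88, 105, 117, 133, 145, 160])  # b/min, 12-lead ECG

diff = device - reference
bias = diff.mean()                                 # mean measurement bias
sd = diff.std(ddof=1)
loa = (bias - 1.96 * sd, bias + 1.96 * sd)         # 95% limits of agreement
rho, p = spearmanr(device, reference)
cv = 100 * sd / reference.mean()                   # CV of the differences, in %

print(f"bias {bias:.2f} b/min, 95% LoA {loa[0]:.2f} to {loa[1]:.2f}")
print(f"rho {rho:.2f} (P={p:.3f}), CV {cv:.2f}%")
```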