88 results for computation- and data-intensive applications

in Deakin Research Online - Australia


Relevance: 100.00%

Abstract:

The massive computation power and storage capacity of cloud computing systems allow scientists to deploy computation- and data-intensive applications without infrastructure investment, with large application data sets stored in the cloud. Based on the pay-as-you-go model, storage strategies and benchmarking approaches have been developed for cost-effectively storing large volumes of generated application data sets in the cloud. However, they are either insufficiently cost-effective for storage or impractical to use at runtime. In this paper, working toward the minimum cost benchmark, we propose a novel, highly cost-effective and practical storage strategy that can automatically decide at runtime whether a generated data set should be stored in the cloud. The main focus of this strategy is local optimisation of the tradeoff between computation and storage, while also taking users' (optional) storage preferences into consideration. Both theoretical analysis and simulations conducted on general (random) data sets as well as specific real-world applications with Amazon's cost model show that the cost-effectiveness of our strategy is close to or even the same as the minimum cost benchmark, and its efficiency is high enough for practical runtime use in the cloud.
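A minimal sketch of the kind of decision rule such a strategy evaluates at runtime (the prices and the simple monthly cost model below are illustrative assumptions, not the paper's actual algorithm):

# Decide at runtime whether to keep a generated data set or delete it and
# regenerate it on demand. The rates are hypothetical stand-ins for a
# provider cost model such as Amazon's.
def should_store(size_gb, regen_cpu_hours, accesses_per_month,
                 storage_cost_gb_month=0.10, cpu_cost_hour=0.10):
    """Store only if keeping the data set is cheaper than regenerating it."""
    monthly_storage_cost = size_gb * storage_cost_gb_month
    monthly_regen_cost = accesses_per_month * regen_cpu_hours * cpu_cost_hour
    return monthly_storage_cost <= monthly_regen_cost

# A 50 GB data set needing 8 CPU-hours to regenerate, accessed twice a month:
# storing costs $5.00/month versus $1.60/month for regeneration.
print(should_store(50, 8, 2))  # False: cheaper to delete and regenerate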

Relevance: 100.00%

Abstract:

Scientific workflow offers a framework for cooperation between remote and shared resources in a grid computing environment (GCE) for scientific discovery. One major function of scientific workflow is to schedule a collection of computational subtasks in well-defined orders for efficient output by estimating task durations at runtime. In this paper, we propose a novel time computation model based on algorithm complexity (termed the TCMAC model) for high-level data-intensive scientific workflow design. The proposed model schedules subtasks based on their durations and the complexities of the participating algorithms. Characterised by its use of task duration computation functions for time efficiency, the TCMAC model has three features for full-aspect scientific workflows covering both dataflow and control flow: (1) it provides flexible and reusable task duration functions in a GCE; (2) it facilitates better parallelism in iteration structures by providing more precise task durations; and (3) it accommodates dynamic task durations for rescheduling in selective structures of control flow. We also present theories and examples in scientific workflows to show the efficiency of the TCMAC model, especially for control flow. Copyright © 2009 John Wiley & Sons, Ltd.
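As a rough illustration of a complexity-based task duration function in the spirit of TCMAC (the O(n log n) complexity term and the single-sample calibration are assumptions for this example, not the model's actual functions):

# Calibrate a constant from one profiled run, then predict durations for
# other input sizes via duration(n) = c * complexity(n).
import math

def make_duration_fn(complexity, sample_n, sample_seconds):
    """Return a duration estimator fitted to one measured sample."""
    c = sample_seconds / complexity(sample_n)
    return lambda n: c * complexity(n)

# A sort-like subtask profiled at n = 1,000,000 taking 2.5 s.
nlogn = lambda n: n * math.log2(n)
duration = make_duration_fn(nlogn, 1_000_000, 2.5)
print(f"{duration(4_000_000):.1f} s")  # predicted runtime at 4M elements (~11 s)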

Relevance: 100.00%

Abstract:

The growing popularity of smartphone devices has led to the development of increasing numbers of applications, which have in turn become targets for malicious authors. Analysing applications in order to identify malicious ones is a major current concern in information security; an additional problem with smartphone applications is that their many advertising libraries can lead to the loss of personal information. In this paper, we review the current methods of detecting malware on smartphone devices and discuss the problems caused by malware as well as by advertising.

Relevance: 100.00%

Abstract:

Parallel execution is an efficient means of processing vast amounts of data quickly. Creating parallel applications has never been easy, however, and requires much knowledge of the task and of the execution environment used to run the parallel processes. Creating parallel applications can be made easier by using a compiler that automatically parallelises a supplied application, and executing the parallel application is simplified when a well-designed execution environment transparently provides powerful operations to the programmer. The aim of this research is to combine a parallelising compiler and an execution environment into a fully automated parallelisation and execution tool. The advantage of such a fully automated tool is that the user does not need to provide any additional input to gain the benefits of parallel execution. This report presents the tool and shows how it transparently supports the programmer in creating parallel applications and executing them.
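The sketch below illustrates the kind of transformation such a tool automates; it is a hand-written Python analogue (the report's actual tool and target language are not shown here), turning an independent-iteration loop into a parallel map across worker processes:

# A serial loop and its mechanically parallelised equivalent.
from multiprocessing import Pool

def work(item):
    return item * item  # stand-in for an expensive, independent computation

def serial(data):
    return [work(x) for x in data]          # original loop

def parallelised(data):
    with Pool() as pool:                    # tool-generated equivalent
        return pool.map(work, data)

if __name__ == "__main__":
    data = list(range(10_000))
    assert serial(data) == parallelised(data)  # same result, parallel execution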

Relevance: 100.00%

Abstract:

Objective: To highlight the importance of sampling and data collection processes in qualitative interview studies, and to discuss the contribution of these processes to determining the strength of the evidence generated and thereby to decisions for public health practice and policy.

Approach:
This discussion is informed by a hierarchy-of-evidence-for-practice model. The paper provides succinct guidelines for key sampling and data collection considerations in qualitative research involving interview studies. The importance of allowing time for immersion in a given community to become familiar with the context and population is discussed, as well as the practical constraints that sometimes operate against this stage. The role of theory in guiding sample selection is discussed both in terms of identifying likely sources of rich data and in understanding the issues emerging from the data. It is noted that sampling further assists in confirming the developing evidence and also illuminates data that do not seem to fit. The importance of reporting sampling and data collection processes is highlighted, to enable others to assess both the strength of the evidence and the broader applications of the findings.

Conclusion:
Sampling and data collection processes are critical to determining the quality of a study and the generalisability of the findings. We argue that these processes should operate within the parameters of the research goal, be guided by emerging theoretical considerations, cover a range of relevant participant perspectives, and be clearly outlined in research reports with an explanation of any research limitations.

Relevance: 100.00%

Abstract:

Peer-to-peer content distribution networks (PCDNs) have recently become a hot topic and hold huge potential for massive data-intensive applications on the Internet. One of the challenges in PCDNs is routing for data sources and data delivery. In this paper, we studied a type of network model formed by dynamic autonomy areas, structured source servers and proxy servers. Based on this network model, we proposed a number of algorithms to address the routing and data delivery issues. To cope with the highly dynamic nature of the autonomy areas, we established dynamic tree-structure proliferation system routing, proxy routing and resource searching algorithms. The simulation results showed that the performance of the proposed network model and algorithms is stable.
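A minimal sketch of resource searching over a dynamic tree of autonomy areas (the node structure and the depth-first proliferation below are assumptions for illustration, not the paper's algorithms):

# Each autonomy area indexes its local resources; a search checks the local
# index and then proliferates down the tree of child areas.
from dataclasses import dataclass, field

@dataclass
class AreaNode:
    name: str
    resources: set = field(default_factory=set)
    children: list = field(default_factory=list)

    def search(self, resource):
        """Return the first area holding the resource, or None."""
        if resource in self.resources:
            return self
        for child in self.children:
            found = child.search(resource)
            if found:
                return found
        return None

root = AreaNode("root", children=[
    AreaNode("area-1", {"video.mp4"}),
    AreaNode("area-2", {"dataset.bin"}),
])
hit = root.search("dataset.bin")
print(hit.name if hit else "not found")  # -> area-2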

Relevance: 100.00%

Abstract:

Focuses on two areas within the field of general relativity. Firstly, the history and implications of the long-standing conjecture that general relativistic, shear-free perfect fluids which obey a barotropic equation of state p = p(w) such that w + p ≠ 0 are either non-expanding or non-rotating. Secondly, the application of the computer algebra system Maple to the area of tetrad formalisms in general relativity.
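For reference, the conjecture can be stated compactly in standard notation (σ shear, θ expansion, ω vorticity; this is the widely cited formulation, assumed to match the thesis):

\[
  \sigma = 0,\quad p = p(w),\quad w + p \neq 0
  \;\Longrightarrow\; \theta\,\omega = 0,
\]

i.e. the fluid is either non-expanding (θ = 0) or non-rotating (ω = 0).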

Relevance: 100.00%

Abstract:

The agent paradigm has been successfully used in a large number of research areas. MAPFS, a parallel file system, constitutes one successful application of agents to the I/O field, providing a multiagent I/O architecture. The use of a multiagent system implies coordination and cooperation among its agents. MAPFS is oriented to clusters of workstations, where agents provide features such as caching and prefetching. The adaptation of MAPFS to a grid environment is named MAPFS-Grid, where agents can help to increase the performance of data-intensive applications running on top of the grid.

This paper describes the conceptual agent framework and the communication model used in MAPFS-Grid, which provides the management of data resources in a grid environment. The evaluation of our proposal shows the advantages of using agents in a data grid.
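A minimal sketch of the sort of agent cooperation involved (the message format and the sequential-prefetch policy are assumptions for illustration; MAPFS-Grid's actual framework is not shown in the abstract):

# A prefetch agent services read requests from a message queue, pulling the
# requested block plus the next sequential block into a shared cache.
import queue
import threading

messages = queue.Queue()
cache = {}

def prefetch_agent():
    while True:
        kind, block = messages.get()
        if kind == "stop":
            break
        for b in (block, f"next-of-{block}"):       # sequential prefetch
            cache.setdefault(b, f"<data:{b}>".encode())

worker = threading.Thread(target=prefetch_agent)
worker.start()
messages.put(("read", "block-17"))
messages.put(("stop", ""))
worker.join()
print(sorted(cache))  # ['block-17', 'next-of-block-17']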

Relevance: 100.00%

Abstract:

Cloud computing is the most recent realisation of computing as a utility. Recently, fields with substantial computational requirements, e.g. biology, have been turning to clouds for cheap, on-demand provisioning of resources. Of interest to this paper is the execution of compute-intensive applications on hybrid clouds: when application requirements exceed private cloud capacity, clients would otherwise have to scale down their applications. The outcome of this research is Web technology realising a new form of cloud called the HPC Hybrid Deakin (H2D) Cloud: an experimental hybrid cloud capable of utilising both local and remote computational services for single large embarrassingly parallel applications.
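A minimal sketch of the scheduling idea (the fill-local-first rule is an assumption for illustration, not H2D's implementation):

# Split an embarrassingly parallel task list across local capacity first,
# bursting the overflow to a remote cloud service.
def split_tasks(tasks, local_slots):
    """Fill the private cloud first; overflow goes to the remote cloud."""
    return tasks[:local_slots], tasks[local_slots:]

tasks = [f"task-{i}" for i in range(10)]
local, remote = split_tasks(tasks, local_slots=6)
print(len(local), "run locally;", len(remote), "burst to the remote cloud")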

Relevance: 100.00%

Abstract:

Android applications use a permission system to regulate access to system resources and users' privacy-relevant information. Existing works have demonstrated several techniques for studying the required permissions declared by developers, but little attention has been paid to the permissions actually used. Moreover, no specific permission combination has been identified as effective for malware detection. To fill these gaps, we propose a novel pattern mining algorithm that identifies a set of contrast permission patterns aimed at capturing the difference between clean and malicious applications. We collected a benchmark malware dataset and a dataset of 1227 clean applications to evaluate the performance of the proposed algorithm, and valuable findings are obtained by analysing the returned contrast permission patterns. © 2013 Elsevier B.V. All rights reserved.
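A toy sketch of the contrast idea (the support-gap scoring below is an illustrative stand-in, not the paper's mining algorithm):

# Score permission combinations by the gap between their support in the
# malicious and clean application sets; large gaps mark contrast patterns.
from itertools import combinations

def support(pattern, apps):
    return sum(pattern <= a for a in apps) / len(apps)

def contrast_patterns(malware, clean, size=2, min_gap=0.5):
    perms = set().union(*malware)
    for combo in combinations(sorted(perms), size):
        p = frozenset(combo)
        gap = support(p, malware) - support(p, clean)
        if gap >= min_gap:
            yield p, gap

malware = [{"SEND_SMS", "READ_CONTACTS"},
           {"SEND_SMS", "READ_CONTACTS", "INTERNET"}]
clean = [{"INTERNET"}, {"INTERNET", "ACCESS_NETWORK_STATE"}]
for p, gap in contrast_patterns(malware, clean):
    print(sorted(p), f"gap={gap:.2f}")  # e.g. ['READ_CONTACTS', 'SEND_SMS'] gap=1.00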

Relevance: 100.00%

Abstract:

Recently, fields with substantial computing requirements have turned to cloud computing for economical, scalable, and on-demand provisioning of required execution environments. However, current cloud offerings focus on providing individual servers while tasks such as application distribution and data preparation are left to cloud users. This article presents a new form of cloud called the HPC Hybrid Deakin (H2D) cloud: an experimental hybrid cloud capable of utilising both local and remote computational services for large embarrassingly parallel applications. As well as supporting execution, H2D also provides a new service, called DataVault, that provides transparent data management services so that all cloud-hosted clusters have the required datasets before commencing execution.
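A minimal sketch of what transparent data staging involves (the file-copy interface is an assumption for illustration; DataVault's actual API is not shown in the abstract):

# Stage required datasets onto a cluster's local store before releasing the
# job for execution, copying only what is missing.
import shutil
from pathlib import Path

def stage_datasets(required, vault, cluster_dir):
    """Copy any missing datasets from the vault to the cluster."""
    cluster_dir.mkdir(parents=True, exist_ok=True)
    for name in required:
        target = cluster_dir / name
        if not target.exists():
            shutil.copy(vault / name, target)

def run_job(required, vault, cluster_dir):
    stage_datasets(required, vault, cluster_dir)  # data ready before execution
    print("all datasets staged; job released")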

Relevance: 100.00%

Abstract:

The majority of deaths of children and infants occur in paediatric and neonatal intensive care settings. For nurses, managing an infant or child's deterioration and death can be very challenging. Nurses play a vital role in how the death occurs and in how families are supported leading up to and after the infant/child's death. This paper describes nurses' endeavours to create normality amidst the sadness and grief of the death of a child in paediatric and neonatal ICU. Focus groups and individual interviews with registered nurses from NICU and PICU settings gathered data on how neonatal and paediatric intensive care nurses care for families when a child dies, and how they perceive their ability and preparedness to provide family care. Four themes emerged from thematic analysis: (1) respecting the child as a person; (2) creating opportunities for family involvement/connection; (3) collecting mementos; and (4) planning for death. Many of the activities described in this study empowered parents to participate in the care of their child as death approached. Further work is required to ensure these principles are translated into practice.

Relevance: 100.00%

Abstract:

The formal, functional, and material attributes of design are routinely investigated through the construction of physical models and scaled prototypes. With the increasing adoption of computational workflows, the digital-to-physical translation process is central to the construction of scaled prototypes. However, the choice of methods, tools and materials for computational prototyping is a developing area, so a systematic body of knowledge on the benefits and costs of multiple methods of computational prototyping for the construction of physical prototypes needs to be established. This paper addresses the prototyping process by comparing three computational methods of fabrication in the modelling, analysis and construction of a Gaussian Vault. It reports on the process of digital-to-physical construction using additive manufacturing, surface fabrication and structural component models. The Gaussian Vault offers a unique set of geometric, structural and physical characteristics for testing all three methods of prototyping: the size, shape and proportion of vault prototypes are rapidly generated and tested. The design geometry, material properties and physical construction of the Gaussian Vault are realised using common practice workflows comprising parametric modelling and analysis of geometry, model rationalisation with material characteristics, and finally digital fabrication methods. Comparison of the results identifies the characteristics, benefits and limitations of the three approaches. Finally, the paper discusses the digital-to-physical translation processes and summarises the characteristics, benefits and issues encountered in each.
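As a small illustration of the parametric starting point (the height-field parametrisation below is an assumed stand-in; the paper's actual workflow uses practice modelling and analysis tools):

# Generate a Gaussian vault surface mesh, the kind of parametric model that
# feeds rationalisation and digital fabrication downstream.
import numpy as np

def gaussian_vault(width=10.0, depth=10.0, height=4.0, sigma=2.5, n=51):
    """Height field z(x, y) = height * exp(-(x^2 + y^2) / (2 * sigma^2))."""
    x = np.linspace(-width / 2, width / 2, n)
    y = np.linspace(-depth / 2, depth / 2, n)
    X, Y = np.meshgrid(x, y)
    Z = height * np.exp(-(X**2 + Y**2) / (2 * sigma**2))
    return X, Y, Z

X, Y, Z = gaussian_vault()
print(f"apex height: {Z.max():.2f} m")  # 4.00 m at the crown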

Relevance: 100.00%

Abstract:

As part of a larger Simplification Project for program quality assurance processes conducted at RMIT University, this paper chronicles the refinement of one aspect: program assessment and reporting. This involved the realignment of criteria used in program quality assurance with those developed in higher-level strategic and business planning processes. In addition, the project attempted to address the lack of alignment between annual program processes and subsequent decisions made about the future of programs, particularly in profile planning processes.
A revised Program Annual Report process was developed that aimed to achieve simplicity and alignment while re-engaging program leaders and heads of schools with the quality agenda. A concerted effort was made to develop a process that improved on the previously poor vertical communication inherent in program quality assurance. This paper explores the ways in which this was achieved by a) linking people to data through the use of agreed and contextualised performance indicators, and b) linking people to process through more meaningful input into planning and opportunities for dialogue.