224 results for Cloud computing

in Deakin Research Online - Australia


Relevance: 60.00%

Abstract:

As an interesting application of cloud computing, content-based image retrieval (CBIR) has attracted a lot of attention, but the focus of previous research work was mainly on improving retrieval performance rather than addressing security issues such as copyrights and user privacy. With the increase of security attacks in computer networks, these security issues have become critical for CBIR systems. In this paper, we propose a novel two-party watermarking protocol that can resolve the issues regarding user rights and privacy. Unlike previously published protocols, our protocol does not require the existence of a trusted party. It exhibits three useful features: security against partial watermark removal, security in watermark verification, and non-repudiation. In addition, we report an empirical study of CBIR with the security mechanism. The experimental results show that the proposed protocol is practicable and that retrieval performance is not affected by watermarking query images.
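The claim that watermarking query images leaves retrieval performance untouched can be illustrated with a toy sketch (this is not the paper's protocol; the LSB scheme and histogram feature are illustrative assumptions): embedding watermark bits in pixel least-significant bits leaves a coarse retrieval feature unchanged.

```python
# Toy illustration (NOT the paper's two-party protocol): embedding a
# watermark in pixel LSBs barely perturbs a coarse histogram feature,
# which is why a watermarked query image can still retrieve well.

def embed_lsb(pixels, bits):
    """Overwrite the least-significant bit of each pixel with a watermark bit."""
    out = list(pixels)
    for i, b in enumerate(bits):
        out[i] = (out[i] & ~1) | b
    return out

def extract_lsb(pixels, n):
    """Read back the first n watermark bits."""
    return [p & 1 for p in pixels[:n]]

def histogram_feature(pixels, bins=4):
    """Coarse intensity histogram, standing in for a CBIR feature vector."""
    hist = [0] * bins
    for p in pixels:
        hist[min(p * bins // 256, bins - 1)] += 1
    return hist

image = [10, 77, 130, 200, 45, 90, 250, 33]
mark = [1, 0, 1, 1, 0, 1, 0, 0]
marked = embed_lsb(image, mark)

assert extract_lsb(marked, len(mark)) == mark                 # watermark survives
assert histogram_feature(marked) == histogram_feature(image)  # feature unchanged
```

A real protocol would of course use a robust, keyed watermark rather than plain LSBs; the point here is only that the embedded mark and the retrieval feature can coexist.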

Relevance: 30.00%

Abstract:

The aim of this paper is to show a general design of autonomic elements and an initial implementation of a cluster operating system that moves parallel processing on clusters into the computing mainstream using the autonomic computing vision. The significance of this solution is as follows. Autonomic Computing was identified by IBM as one of computing's Grand Challenges. The human body was used to illustrate an Autonomic Computing system that possesses the properties of self-knowledge, self-configuration, self-optimization, self-healing, self-protection, knowledge of its environment, and user friendliness. One of the areas that could benefit from the comprehensive approach created by the autonomic computing vision is parallel processing on non-dedicated clusters. Many researchers and research groups have responded positively to the challenge by initiating research around one or two of the characteristics identified by IBM as the requirements for autonomic computing. We demonstrate here that it is possible to satisfy all Autonomic Computing characteristics.

Relevance: 30.00%

Abstract:

Fifty years ago there were no stored-program electronic computers in the world. Even thirty years ago a computer was something that few organisations could afford, and few people could use. Suddenly, in the 1960s and 70s, everything changed and computers began to become accessible. Today the need for education in Business Computing is generally acknowledged, with each of Victoria's seven universities offering courses of this type. What happened to promote the extremely rapid adoption of such courses is the subject of this thesis. I will argue that although Computer Science began in Australia's universities of the 1950s, courses in Business Computing commenced in the 1960s due to the requirement of the Commonwealth Government for computing professionals to fulfil its growing administrative needs. The Commonwealth-developed Programmer-in-Training courses were later devolved to the new Colleges of Advanced Education. The movement of several key figures from the Commonwealth Public Service to take up positions in Victorian CAEs was significant, and the courses they subsequently developed became the model for many future courses in Business Computing. The reluctance of the universities to become involved in what they saw as little more than vocational training opened the way for the CAEs to develop this curriculum area.

Relevance: 30.00%

Abstract:

Purpose – The purpose of this paper is to provide an overview of advances in pervasive computing.
Design/methodology/approach – The paper provides a critical analysis of the literature.
Findings – Tools expected to support these advances are: a resource location framework, a data management (e.g. replica control) framework, communication paradigms, and smart interaction mechanisms. Infrastructures are also needed to support pervasive computing applications; an information appliance should be easy for anyone to use, and interaction with the device should be intuitive.
Originality/value – The paper shows how everyday devices with embedded processing and connectivity could interconnect as a pervasive network of intelligent devices that cooperatively and autonomously collect, process and transport information, in order to adapt to the associated context and activity.

Relevance: 30.00%

Abstract:

The lack of women in the computing industry in Australia, and in many western countries, is a problem that has been recognised by academics, the industry and governments. Over the last 20 years there have been many attempts to redress this gender imbalance via intervention programmes aimed at increasing female participation in computing education and, ultimately, the computing profession. However, statistics show no improvement in the rate of participation of females in this industry, and anecdotal evidence suggests that these intervention programmes have not been as successful or as effective as anticipated. Evaluation could determine the effectiveness of such programmes, yet only limited evaluations of intervention programmes established to encourage females in computing appear in the literature. This research sought to investigate how these types of intervention programmes should be evaluated.

Relevance: 30.00%

Abstract:

Data perturbation is a popular method to achieve privacy-preserving data mining. However, distorted databases bring enormous overheads to mining algorithms compared to the original databases. In this paper, we present the GrC-FIM algorithm to address the efficiency problem in mining frequent itemsets from distorted databases. Two measures are introduced to overcome the weaknesses in existing work. Firstly, the concept of the independent granule is introduced, and granule inference is used to distinguish between non-independent itemsets and independent itemsets. We further prove that the support counts of non-independent itemsets can be derived directly from their subitemsets, so that the error-prone reconstruction process can be avoided; this improves the efficiency of the algorithm and brings more accurate results. Secondly, through the granular-bitmap representation, the support counts can be calculated efficiently. The empirical results on representative synthetic and real-world databases indicate that the proposed GrC-FIM algorithm outperforms the popular EMASK algorithm in both efficiency and support-count reconstruction accuracy.
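The bitmap idea behind the second measure can be sketched in a few lines (the names and data are illustrative, not taken from the paper): each item maps to a bitmap over transactions, and the support count of an itemset is the popcount of the bitwise AND of its items' bitmaps, so no database rescan is needed.

```python
# Sketch of bitmap-based support counting (illustrative, not GrC-FIM
# itself): item -> integer bitmap where bit t is set iff transaction t
# contains the item; itemset support = popcount of the AND of bitmaps.

def build_bitmaps(transactions):
    """Build one integer bitmap per item over the transaction list."""
    bitmaps = {}
    for t, items in enumerate(transactions):
        for item in items:
            bitmaps[item] = bitmaps.get(item, 0) | (1 << t)
    return bitmaps

def support(itemset, bitmaps, n_transactions):
    """Support count of an itemset via bitwise AND of its item bitmaps."""
    acc = (1 << n_transactions) - 1  # start with all transactions
    for item in itemset:
        acc &= bitmaps.get(item, 0)
    return bin(acc).count("1")

db = [{"a", "b"}, {"a", "c"}, {"a", "b", "c"}, {"b"}]
bm = build_bitmaps(db)
assert support({"a"}, bm, len(db)) == 3       # "a" occurs in 3 transactions
assert support({"a", "b"}, bm, len(db)) == 2  # {"a","b"} occurs in 2
```

GrC-FIM additionally works on a distorted database and uses granule inference to decide which support counts must be reconstructed; the sketch shows only the bitmap counting step.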

Relevance: 30.00%

Abstract:

We assert that companies can make more money, and research institutions can improve their performance, if inexpensive clusters and enterprise grids are exploited. In this paper we validate this claim by studying how programming environments, tools and middleware could be used for the execution of parallel and sequential applications, of multiple parallel applications executing simultaneously on a non-dedicated cluster, and of parallel applications on an enterprise grid, and by showing that execution performance was improved. For this purpose, the execution environment and the parallel and sequential benchmark applications selected for, and used in, the experiments are characterised.

Relevance: 30.00%

Abstract:

QoS plays a key role in evaluating a service or a service composition plan across clouds and data centers. Currently, the energy cost of a service's execution is not covered by the QoS framework, and a service's price is often fixed during its execution. However, energy consumption contributes greatly to determining the price of a cloud service. As a result, it is not reasonable to calculate the price of a cloud service with a fixed energy consumption value when part of a service's energy consumption could be saved during its execution. Taking advantage of a dynamic energy-aware optimisation technique, a QoS-enhanced method for service computing based on virtual machine (VM) scheduling is proposed in this paper. Technically, two typical QoS metrics, i.e., the price and the execution time, are taken into consideration in our method. Moreover, our method consists of two dynamic optimisation phases. The first phase aims at dynamically benefiting a user with a discounted price by transparently migrating his or her task execution from a VM located at a server with high energy consumption to one with low energy consumption. The second phase aims at shortening a task's execution time by transparently migrating a task execution from a VM to another one located at a server with higher performance. Experimental evaluation on large-scale service computing across clouds demonstrates the validity of our method.
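The two optimisation phases can be sketched as target-selection rules (the server data, fields and pricing idea here are made-up assumptions, not the paper's model): phase one migrates a task to the server with the lowest energy cost per unit of work, which allows a discounted price; phase two migrates to the fastest server, which shortens execution time.

```python
# Illustrative sketch of the two-phase migration targets (hypothetical
# server data; NOT the paper's cost model).

servers = [
    {"name": "s1", "watts": 400, "speed": 1.0},  # power draw and relative speed
    {"name": "s2", "watts": 250, "speed": 1.2},
    {"name": "s3", "watts": 300, "speed": 2.0},
]

def phase1_target(servers):
    """Energy-aware phase: migrate to the server with the lowest
    energy per unit of work (watts / speed), enabling a price discount."""
    return min(servers, key=lambda s: s["watts"] / s["speed"])

def phase2_target(servers):
    """Performance phase: migrate to the fastest server to shorten
    the task's execution time."""
    return max(servers, key=lambda s: s["speed"])

assert phase1_target(servers)["name"] == "s3"  # 300 W / 2.0 = 150 W per unit
assert phase2_target(servers)["name"] == "s3"  # highest speed
```

In this toy instance the same server wins both phases; in general the two criteria can disagree, which is why the method applies them as separate phases.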

Relevance: 20.00%

Abstract:

The single factor limiting the harnessing of the enormous computing power of clusters for parallel computing is the lack of appropriate software. Present cluster operating systems are not built to support parallel computing: they do not provide services to manage parallelism. The cluster operating environments used to assist the execution of parallel applications do not support both the Message Passing (MP) and Distributed Shared Memory (DSM) paradigms; these are only offered as separate components implemented at the user level as libraries and independent servers. Because of these poor operating systems, users must deal with the individual computers of a cluster rather than seeing the cluster as a single powerful computer: a Single System Image of the cluster is not offered to users. There is a need for an operating system for clusters. We claim and demonstrate that it is possible to develop a cluster operating system that is able to efficiently manage parallelism, support Message Passing and DSM, and offer a Single System Image. To substantiate this claim, the first version of a cluster operating system, called GENESIS, that manages parallelism and offers a Single System Image has been developed.

Relevance: 20.00%

Abstract:

Low female participation rates in computing are a current concern of the education sector. To address this problem an intervention was developed: computing skills were introduced to girls in their English classes using three different teaching styles: peer tutoring, cross-age tutoring and teacher instruction (control). The sample comprised 136 girls from Years 8 and 10 at a single-sex government school. A pre-test post-test quantitative design was used. To describe the students' perspectives, qualitative data were collected from six focus groups conducted with 8–10 students, one from each of the six classes. It was predicted that cross-age tutoring would yield more positive effects than peer tutoring which, in turn, would yield more positive effects than traditional teacher instruction, as assessed by achievement on class tasks and attitudes towards computing. The hypothesis was not supported by the quantitative analysis; however, in the qualitative data cross-age tutoring was appraised more favourably than peer tutoring or teacher instruction. The latter was the least preferred condition due to: (1) inefficiency; (2) difficulty understanding teachers' explanations; and (3) lack of teacher knowledge. Problems with the implementation of the intervention identified in the focus groups were teacher differences, system failures, missed classes, lack of communication, and the selection of computing activities. Practical suggestions were provided relevant to the introduction of cross-age tutoring and the use of computers within secondary-level English classes.

Relevance: 20.00%

Abstract:

Research into Intelligent Agent (IA) technology, and how it can assist computer systems in the autonomous completion of common office and home computing tasks, is extremely widespread. The use of IAs is becoming more feasible as the functionality moves into line with what users require for their everyday computing needs. However, this does not mean that IA technology cannot be exploited or developed for use in a malicious manner, such as within an Information Warfare (IW) scenario, where systems may be attacked autonomously by agent system implementations. This paper will discuss the current state of malicious use of IAs, focusing on attack techniques, the difficulties brought about by such attacks, and security methods, both proactive and reactive, that could be instated within compromised or sensitive systems.

Relevance: 20.00%

Abstract:

This paper provides an analysis of student experiences of an approach to teaching theory that integrates the teaching of theory and data analysis. The argument that supports this approach is that theory is most effectively taught by using empirical data in order to generate and test propositions and hypotheses, thereby emphasising the dialectic relationship between theory and data through experiential learning. Bachelor of Commerce students in two second-year substantive organisational theory subjects were introduced to this method of learning at a large, multi-campus Australian university. In this paper, we present a model that posits a relationship between students' perceptions of their learning, the enjoyment of the experience and expected future outcomes. The results of our evaluation reveal that a majority of students:

• enjoyed this way of learning;
• believed that the exercise assisted their learning of substantive theory, computing applications and the nature of survey data; and
• felt that what they had learned could be applied elsewhere.

We argue that this approach presents the potential to improve the way theory is taught by integrating theory, theory testing and theory development; moving away from teaching theory and analysis in discrete subjects; and, introducing iterative experiences in substantive subjects.

Relevance: 20.00%

Abstract:

The search for effective ways of dealing with obesity has centred on biological research and clinical management. However, obesity needs to be conceptualized more broadly if the modern pandemic is to be arrested. The epidemiological triad (hosts, agent/vectors and environments) has served us well in dealing with epidemics in the past, and may be worth re-evaluating to this end. Education, behaviour change and clinical practices deal predominantly with the host, although multidisciplinary practices such as shared care might also be expected to impact on other corners of the triad. Technology deals best with the agent of obesity (energy imbalance) and its vectors (excessive energy intake and/or inadequate energy expenditure), and policy and social change are needed to cope with the environment. The value of a broad model like this, rather than specific isolated approaches, is that the key players such as legislators, health professionals, governments and industry can see their roles in attenuating and eventually reversing the epidemic. It also highlights the need to intervene at all levels in obesity control and reduces the relevance of arguments about nature vs. nurture.



Relevance: 20.00%

Abstract:

The ability to tolerate failures while effectively exploiting grid computing resources in a scalable and transparent manner must be an integral part of a grid computing infrastructure. Hence, a fault-detection service is a necessary prerequisite to fault tolerance and fault recovery in grid computing. To this end, we present a scalable fault-detection service architecture. The proposed fault-detection system provides services that monitor user applications, grid middleware and the dynamically changing state of a collection of distributed resources. It reports summaries of this information to the appropriate agents on demand, or instantaneously in the event of failures.
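A common building block for such a service is heartbeat-based failure detection, which can be sketched as follows (the class, timeout and resource names are illustrative assumptions, not the paper's design): monitored resources periodically report heartbeats, and any resource whose last heartbeat is older than a timeout is reported as failed, either on demand or when queried after an event.

```python
# Minimal heartbeat-style fault detector (illustrative sketch, not the
# proposed architecture): resources report heartbeats; a resource whose
# last heartbeat is older than the timeout is considered failed.

class FaultDetector:
    def __init__(self, timeout):
        self.timeout = timeout   # max allowed silence between heartbeats
        self.last_seen = {}      # resource -> time of last heartbeat

    def heartbeat(self, resource, now):
        """Record that a resource reported itself alive at time `now`."""
        self.last_seen[resource] = now

    def failed(self, now):
        """Return the resources whose heartbeat is overdue at time `now`."""
        return sorted(r for r, t in self.last_seen.items()
                      if now - t > self.timeout)

fd = FaultDetector(timeout=5)
fd.heartbeat("node1", now=0)
fd.heartbeat("node2", now=3)
assert fd.failed(now=6) == ["node1"]             # node1 overdue, node2 fresh
assert fd.failed(now=10) == ["node1", "node2"]   # both now overdue
```

A production detector would also distinguish slow resources from crashed ones and push failure notifications to subscribed agents rather than relying only on polling.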