574 results for Mobile GPGPU computing platforms

in Queensland University of Technology - ePrints Archive


Relevance: 100.00%

Abstract:

The ability of cloud computing to provide almost unlimited storage, backup and recovery, and quick deployment has contributed to its widespread attention and adoption, and it has become an attractive choice for mobile users as well. Because mobile devices are constrained by scarce battery power and an inability to handle computation-intensive tasks, selected computation needs to be outsourced to resourceful cloud servers. However, computation offloading for mobile cloud computing must address many challenges, such as communication cost, connectivity maintenance and incurred latency. This paper presents a taxonomy of computation offloading approaches that aim to address these challenges. The taxonomy provides guidelines for identifying research scope in computation offloading for mobile cloud computing. We also outline directions and anticipated trends for future research.
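The trade-off the surveyed approaches revolve around can be illustrated with a minimal latency-based decision rule (a sketch only; the function and parameters below are hypothetical, and real offloading frameworks use far richer cost models):

```python
# Illustrative offloading decision: offload only when remote execution plus
# data transfer beats local execution. All figures are hypothetical.

def should_offload(cycles: float, local_speed: float, cloud_speed: float,
                   data_bytes: float, bandwidth: float, rtt: float) -> bool:
    """Return True if offloading the task is expected to be faster.

    cycles      -- CPU cycles the task requires
    local_speed -- device CPU speed (cycles/s)
    cloud_speed -- server CPU speed (cycles/s)
    data_bytes  -- input/output data to transfer
    bandwidth   -- link throughput (bytes/s)
    rtt         -- round-trip latency of the link (s)
    """
    t_local = cycles / local_speed
    t_remote = cycles / cloud_speed + data_bytes / bandwidth + rtt
    return t_remote < t_local

# A heavy task over a fast link favours offloading:
print(should_offload(cycles=5e9, local_speed=1e9, cloud_speed=1e10,
                     data_bytes=1e6, bandwidth=1e7, rtt=0.05))  # True
```

An energy-based rule has the same shape, with CPU and radio power draw replacing execution speeds; communication cost, connectivity and latency enter as extra terms, which is why the taxonomy organises approaches around exactly those challenges.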

Relevance: 100.00%

Abstract:

From location-aware computing to mining the social web, representations of context have promised to make better software applications. The opportunities and challenges of context-aware computing from representational, situated and interactional perspectives have been well documented, but arguments from the perspective of design are somewhat disparate. This paper draws on both theoretical perspectives and a design framing, using the problem of designing a social mobile agile ridesharing system, in order to reflect upon and call for broader design approaches in context-aware computing and human-computer interaction research in general.

Relevance: 100.00%

Abstract:

Stochastic volatility models are of fundamental importance to the pricing of derivatives. One of the most commonly used models of stochastic volatility is the Heston model, in which the price and volatility of an asset evolve as a pair of coupled stochastic differential equations. The computation of asset prices and volatilities involves the simulation of many sample trajectories with conditioning. The problem is treated using the method of particle filtering. While the simulation of a shower of particles is computationally expensive, each particle behaves independently, making such simulations ideal for massively parallel heterogeneous computing platforms. In this paper, we present our portable OpenCL implementation of the Heston model and discuss its performance and efficiency characteristics on a range of architectures including Intel CPUs, NVIDIA GPUs, and Intel Many Integrated Core (MIC) accelerators.
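For reference, a minimal Euler-Maruyama discretisation of the Heston dynamics (dS = mu*S dt + sqrt(v)*S dW_S, dv = kappa*(theta - v) dt + xi*sqrt(v) dW_v, with correlated Brownian increments) looks as follows. This is a NumPy sketch of the underlying scheme, not the paper's OpenCL kernel, and the parameter values are illustrative:

```python
import numpy as np

def heston_paths(s0, v0, mu, kappa, theta, xi, rho, T, steps, n_paths, seed=0):
    """Euler-Maruyama simulation of Heston paths with full truncation of v."""
    rng = np.random.default_rng(seed)
    dt = T / steps
    s = np.full(n_paths, s0)
    v = np.full(n_paths, v0)
    for _ in range(steps):
        z1 = rng.standard_normal(n_paths)
        z2 = rho * z1 + np.sqrt(1.0 - rho**2) * rng.standard_normal(n_paths)
        vp = np.maximum(v, 0.0)                      # full truncation of variance
        s *= np.exp((mu - 0.5 * vp) * dt + np.sqrt(vp * dt) * z1)
        v += kappa * (theta - vp) * dt + xi * np.sqrt(vp * dt) * z2
    return s, v

# 100,000 independent paths: the loop body is identical for every path,
# which is what makes the workload map well onto GPUs and MIC accelerators.
prices, variances = heston_paths(100.0, 0.04, 0.05, 2.0, 0.04, 0.3, -0.7,
                                 T=1.0, steps=252, n_paths=100_000)
```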

Relevance: 100.00%

Abstract:

Modern mobile computing devices are versatile but bring the burden of constantly adjusting settings to the current conditions of the environment. Until now this task has had to be performed by the human user, yet the variety of sensors typically deployed in such a handset provides enough data for autonomous self-configuration by a learning, adaptive system. However, this data is not always fully available at a given point in time and can contain erroneous values. Handling potentially incomplete sensor data to detect context changes without a semantic layer is the scientific challenge we address with our approach. We present a novel machine learning technique, the Missing-Values-SOM, which solves this problem by predicting setting adjustments based on context information. Our method is centred around a self-organizing map, which we extend to provide a means of handling missing values. We demonstrate the performance of our approach on mobile context snapshots, as well as on classical machine learning datasets.
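The core ingredient such an extension needs is a distance measure that simply skips missing dimensions when searching for the best matching unit. The sketch below shows that idea in NumPy; it illustrates the general masked-distance technique, not the authors' full Missing-Values-SOM, which additionally learns to predict setting adjustments:

```python
import numpy as np

def best_matching_unit(codebook, x):
    """Index of the SOM node closest to x, comparing observed dimensions only.

    codebook -- (n_nodes, dim) weight vectors
    x        -- (dim,) input where np.nan marks a missing sensor value
    """
    observed = ~np.isnan(x)
    d = np.sum((codebook[:, observed] - x[observed]) ** 2, axis=1)
    return int(np.argmin(d))

def impute(codebook, x):
    """Fill the missing components of x from its best matching unit."""
    return np.where(np.isnan(x), codebook[best_matching_unit(codebook, x)], x)

codebook = np.random.default_rng(0).random((16, 4))
snapshot = np.array([0.2, np.nan, 0.9, np.nan])   # partial context reading
print(impute(codebook, snapshot))
```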

Relevance: 100.00%

Abstract:

The generation of a correlation matrix from a large set of long gene sequences is a common requirement in many bioinformatics problems, such as phylogenetic analysis. The generation is not only computationally intensive but also requires significant memory resources because, typically, only a few gene sequences can be stored in primary memory at a time. The standard practice in such computation is to use frequent input/output (I/O) operations; therefore, minimising the number of these operations yields much faster run-times. This paper develops an approach for the faster and scalable computing of large-size correlation matrices through the full use of available memory and a reduced number of I/O operations. The approach is scalable in the sense that the same algorithms can be executed on computing platforms with different amounts of memory and can be applied to problems with different correlation matrix sizes. The significant performance improvement of the approach over existing approaches is demonstrated through benchmark examples.
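The general pattern behind such approaches is to process sequences in tiles sized to the available memory, so that each pair of tiles is read once rather than re-reading sequences for every pair. A rough sketch follows; load_tile and correlate are hypothetical placeholders, and the paper's actual algorithms may differ:

```python
# Blocked pairwise computation: each tile is loaded O(n_seqs / tile_size)
# times instead of once per pair. load_tile/correlate are placeholders.

def blocked_correlation(n_seqs, tile_size, load_tile, correlate):
    corr = {}
    for i0 in range(0, n_seqs, tile_size):
        tile_i = load_tile(i0, tile_size)             # one read per outer tile
        for j0 in range(i0, n_seqs, tile_size):
            tile_j = tile_i if j0 == i0 else load_tile(j0, tile_size)
            for i, si in enumerate(tile_i, start=i0):
                for j, sj in enumerate(tile_j, start=j0):
                    if i < j:                          # upper triangle only
                        corr[(i, j)] = correlate(si, sj)
    return corr
```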

Relevance: 100.00%

Abstract:

GO423 was initiated in 2012 as part of a community effort to ensure the vitality of the Queensland games sector. In common with other industrialised nations, the game industry in Australia is a reasonably significant contributor to Gross National Product (GNP). Games are played in 92% of Australian homes, and the average adult player has been playing them for at least twelve years, with 26% playing for more than thirty years (Brand, 2011). Like the games and interactive entertainment industries in other countries, the Australian industry has its roots in the small-team model of the 1980s. So, for example, Beam Software, established in Melbourne in 1980, was started by two people, and Krome Studios was started in 1999 by three. Both these companies grew to employ over 100 people in their heyday (considered large by Antipodean standards), not by producing their own intellectual property (IP) but through content generation for offshore parent companies. Thus our bigger companies grew on a model of service provision and tended not to generate their own IP (Darchen, 2012). There are some notable exceptions where IP originated locally and was acquired by international companies, but in the case of some of the works of which we are most proud, the Australian company took on the role of "Night Elf" - a convenience due to the affordances of the time zone, which allowed our companies to work while the parent companies slept. In the post-GFC climate, the strong Australian dollar and the vulnerability of such service provision mean that job security is virtually non-existent, with employees invariably on short-term contracts. These issues are exacerbated by the decline of middle-ground games (those which fall between triple-A titles and the smaller games often produced for a casual audience). The response to this state of affairs has been a change in the Australian games industry: a new recognition of its identity as a wider cultural sector and the rise (or return) of an increasing number of small independent game development companies. 'Indies' consist of small teams, often making games for mobile and casual platforms, that depend on producing at least one if not two games a year and often explore more radical definitions of games as designed cultural objects. The need for innovation and creativity in the Australian context is seen as a vital aspect of the current changing scene, in which the emphasis on the large-studio production model gives way to an emerging cultural-sector model where small independent teams work to shorter design and production schedules driven by digital distribution. In terms of Quality of Life (QoL), this new digital distribution brings with it the danger of 'digital isolation': a studio can work from home and deliver from home. Community events thus become increasingly important. The GO423 Symposium is a response to these perceived needs, and the event is based on the understanding that our new small creative teams depend on the local community of practice in no small way. GO423 thus offers local industry participants the opportunity to talk to each other about their work, to talk to potential new members about their work, and to show off their work in a small, intimate setting, encouraging both feedback and support.

Relevance: 100.00%

Abstract:

The generation of a correlation matrix for a set of genomic sequences is a common requirement in many bioinformatics problems, such as phylogenetic analysis. Each sequence may be millions of bases long, and there may be thousands of such sequences to compare, so not all of them can fit into main memory at the same time. Each sequence needs to be compared with every other sequence, so some sequences will generally need to be paged in and out more than once; to minimise execution time, this I/O must be minimised. This paper develops an approach for faster and scalable computing of large-size correlation matrices through the maximal exploitation of available memory and a reduced number of I/O operations. The approach is scalable in the sense that the same algorithms can be executed on computing platforms with different amounts of memory and can be applied to different bioinformatics problems with different correlation matrix sizes. The significant performance improvement of the approach over previous work is demonstrated through benchmark examples.
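A back-of-envelope count shows why exploiting memory this way pays off (an illustration under simplifying assumptions, not the paper's analysis). Suppose memory holds M sequences at once, split into two resident tiles of M/2 sequences, out of N sequences in total. Then the number of sequence loads is roughly

```latex
\underbrace{\binom{N}{2} \approx \frac{N^2}{2}}_{\text{naive: reload one operand per pair}}
\quad\text{vs.}\quad
\underbrace{\frac{2N^2}{M^2}\cdot\frac{M}{2} = \frac{N^2}{M}}_{\text{tiled: load each tile pair once}}
```

an M/2-fold reduction; with N = 10^4 sequences and room for M = 10^2 of them, that is roughly 10^6 loads instead of 5 x 10^7.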

Relevance: 100.00%

Abstract:

An intrinsic challenge in evaluating proposed techniques for detecting Distributed Denial-of-Service (DDoS) attacks and distinguishing them from Flash Events (FEs) is the extreme scarcity of publicly available real-world traffic traces. Those available are either heavily anonymised or too old to accurately reflect current trends in DDoS attacks and FEs. This paper proposes a traffic generation and testbed framework for synthetically generating different types of realistic DDoS attacks, FEs and other benign traffic traces, and for monitoring their effects on the target. Using only modest hardware resources, the proposed framework, built around a customised software traffic generator, 'Botloader', is capable of generating a configurable mix of two-way traffic for emulating large-scale DDoS attacks, FEs or benign traffic traces that are experimentally reproducible. Botloader uses IP aliasing, a well-known technique available on most computing platforms, to create thousands of interactive UDP/TCP endpoints on a single computer, each bound to a unique IP address, to emulate large numbers of simultaneous attackers or benign clients.
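The IP-aliasing technique itself is straightforward to reproduce. The sketch below shows the general idea on Linux; it is a minimal illustration, not Botloader's implementation: the interface name, address range and target are hypothetical, and the alias commands require root privileges:

```python
import socket
import subprocess

def make_udp_clients(base="10.0.0.", start=10, count=100, dev="eth0"):
    """Alias many addresses onto one NIC and bind one UDP socket to each,
    so every emulated client sends from a distinct source IP."""
    sockets = []
    for i in range(start, start + count):
        addr = base + str(i)
        # Add the alias (the command fails harmlessly if it already exists).
        subprocess.run(["ip", "addr", "add", addr + "/24", "dev", dev],
                       check=False)
        s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        s.bind((addr, 0))                 # distinct source IP per endpoint
        sockets.append(s)
    return sockets

for s in make_udp_clients():
    s.sendto(b"probe", ("10.0.0.1", 9999))   # hypothetical target
```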

Relevance: 100.00%

Abstract:

Distributed collaborative computing services have taken over from centralised computing platforms, allowing the development of distributed collaborative user applications that enable people and computers to work together more productively. The Multi-Agent System (MAS) has emerged as a distributed collaborative environment which allows a number of agents to cooperate and interact with each other in a complex environment. We want to place our agents in problems whose solutions require the collation and fusion of information, knowledge or data from distributed and autonomous information sources. In this paper we present the design and implementation of an agent-based conference planner application that uses the collaborative effort of agents which function continuously and autonomously in a particular environment. The application also enables the collaborative use of services deployed across wide geographic areas and different technologies, i.e. software agents, grid computing and web services. The premise of the application is that it allows autonomous agents, interacting with web and grid services, to plan a conference as a proxy for their owners (humans).

Relevance: 40.00%

Abstract:

This is the final report of research project 2002-057-C: Enabling Team Collaboration with Pervasive and Mobile Computing. The research project was carried out by the Australian Cooperative Research Centre for Construction Innovation and has two streams that consider the use of pervasive computing technologies in two different contexts. The first context is the on-site deployment of mobile computing devices, whereas the second is the use and development of intelligent rooms, based on sensed environments and new human-computer interfaces (HCI), for collaboration in the design office. The two streams present a model of team collaboration that relies on continuous communication with people and access to information to reduce information leakage. This report consists of five sections: (1) Introduction; (2) Research Project Background; (3) Project Implementation; (4) Case Studies and Outcomes; and (5) Conclusion and Recommendation. Section 1 presents a brief description of the research project, including its general objectives and structure. Section 2 introduces the background of the research and detailed information regarding project participants, objectives and significance, as well as the research methodology. All research activities, such as the literature review and case studies, are summarised in Section 3, Project Implementation. Section 4 then analyses the case studies and presents their outcomes. The conclusion and recommendations of the research project are summarised in Section 5. Other supporting information, such as the research project schedule, is provided in the appendices. The purpose of this final report is to provide industry partners with detailed information on the project activities and methodology, such as the implementation of pervasive computing technologies in real contexts. The report summarises the outcomes of the case studies and provides recommendations to industry partners on using new technologies to support better project collaboration.

Relevance: 40.00%

Abstract:

Construction is an information-intensive industry in which the accuracy and timeliness of information are paramount. The main communication issue in construction is to provide a method to exchange data between the site operation, the site office and the head office. The information needs under consideration are time-critical and assist in maintaining or improving efficiency at the jobsite; without appropriate computing support, problem solving becomes more difficult. Many researchers have focused on the use of mobile computing devices in the construction industry, believing that mobile computers have the potential to solve some of the construction problems that reduce overall productivity. To date, however, very little observation has been conducted of the deployment of mobile computers for construction workers on-site. Providing field workers with accurate, reliable and timely information at the location where it is needed supports effectiveness and efficiency at the job site. Bringing a new technology into the construction industry requires not only a better understanding of the application, but also proper preparation for the allocation of resources such as people and investment. With this in mind, an accurate analysis is needed to provide a clear idea of the overall costs and benefits of the new technology. A cost-benefit analysis is a method of evaluating the relative merits of a proposed investment project in order to achieve an efficient allocation of resources. It is a way of identifying, portraying and assessing the factors which need to be considered in making rational economic choices. In principle, a cost-benefit analysis is a rigorous, quantitative and data-intensive procedure, which requires identification of all potential effects, categorisation of these effects as costs and benefits, quantitative estimation of the extent of each cost and benefit associated with an action, translation of these into a common metric such as dollars, discounting of future costs and benefits into the terms of a given year, and summation of all costs and benefits to see which is greater. Even though many cost-benefit analysis methodologies are available for general assessment, there is no specific methodology for analysing the costs and benefits of applying mobile computing devices on the construction site. Hence, the methodology proposed in this document is predominantly adapted from Baker et al. (2000), Department of Finance (1995), and Office of Investment Management (2005). The methodology is divided into four main stages and detailed in ten steps; it was developed for the CRC CI 2002-057-C Project: Enabling Team Collaboration with Pervasive and Mobile Computing, and is described in detail in Section 3.
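The discounting and summation steps reduce to a net-present-value calculation; a minimal worked example follows (the cash-flow figures and discount rate are purely hypothetical):

```python
# Net present value: discount yearly (benefit - cost) flows back to year 0.
# All figures below are hypothetical.

def npv(benefits, costs, rate):
    return sum((b - c) / (1.0 + rate) ** t
               for t, (b, c) in enumerate(zip(benefits, costs)))

# Year 0: device purchase; years 1-3: productivity gains vs. running costs.
benefits = [0, 60_000, 60_000, 60_000]
costs    = [100_000, 10_000, 10_000, 10_000]
print(f"NPV at 7%: ${npv(benefits, costs, 0.07):,.0f}")  # ~ $31,215 > 0
```

A positive NPV indicates that, under the chosen discount rate, the quantified benefits of the deployment outweigh its costs.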

Relevance: 40.00%

Abstract:

The application of Information and Communication Technology (ICT) in the construction industry has been widely recognised by practitioners and researchers over the last several years. During the 1990s the international construction industry started using information and communication technology with increasing confidence. The use of e-mail became commonplace, and websites were established for marketing purposes. Intranets and extranets were also established to facilitate communication within companies and across their branches. One of the important applications of ICT in the construction industry has been the use of mobile computing devices to achieve better communication and data transmission between construction sites and offices.

Relevance: 40.00%

Abstract:

The rapid growth in the number of online services leads to an increasing number of digital identities that each user needs to manage. As a result, many people feel overloaded with credentials, which in turn negatively impacts their ability to manage them securely. Passwords are perhaps the most common type of credential used today. To avoid the tedious task of remembering difficult passwords, users often behave less securely by choosing weak, low-entropy passwords. Weak passwords and bad password habits represent security threats to online services. Some solutions have been developed to eliminate the need for users to create and manage passwords. A typical solution gives the user a hardware token that generates one-time passwords (OTPs), i.e. passwords for single-session or single-transaction use. Unfortunately, most of these solutions do not satisfy scalability and/or usability requirements, or they are simply insecure. In this paper, we propose a scalable OTP solution using mobile phones, based on trusted computing technology, that combines enhanced usability with strong security.
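For context, the most widely deployed OTP constructions are HOTP (RFC 4226) and its time-based variant TOTP (RFC 6238); the sketch below shows how such codes are derived from a shared secret. These are the standard algorithms, shown for illustration only; the paper's trusted-computing-based scheme is a different design:

```python
import hashlib
import hmac
import struct
import time

def hotp(key: bytes, counter: int, digits: int = 6) -> str:
    """HOTP (RFC 4226): HMAC-SHA1 over a counter, dynamically truncated."""
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

def totp(key: bytes, period: int = 30, digits: int = 6) -> str:
    """TOTP (RFC 6238): HOTP keyed on the current 30-second time step."""
    return hotp(key, int(time.time()) // period, digits)

shared_secret = b"12345678901234567890"   # provisioned out of band
print(totp(shared_secret))                # fresh 6-digit code every 30 s
```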

Relevance: 40.00%

Abstract:

Web applications such as blogs, wikis, video- and photo-sharing sites, and social networking systems have been termed 'Web 2.0' to highlight an arguably more open, collaborative, personalisable, and therefore more participatory internet experience than had previously been possible. Giving rise to a culture of participation, an increasing number of these social applications are now available on mobile phones, where they take advantage of device-specific features such as sensors and location and context awareness. This international volume of book chapters contributes towards exploring and better understanding the opportunities and challenges provided by the tools, interfaces, methods and practices of social and mobile technology that enable participation and engagement. It brings together an international group of academics and practitioners from a diverse range of disciplines, such as computing and engineering, the social sciences, digital media and human-computer interaction, to critically examine a range of applications of social and mobile technology (social networking, mobile interaction, wikis, Twitter, blogging, virtual worlds, shared displays and urban screens) and their role in fostering community activism, civic engagement and cultural citizenship.