995 results for Pervasive computing


Relevance:

20.00%

Publisher:

Abstract:

This paper investigates architectural design potentials of a responsive material system combined with physical computing. Contemporary architects and designers seek to integrate physical computing into responsive architectural designs; however, they have largely borrowed mechanical devices and components from engineering. There is an opportunity to investigate an unexplored design approach that exploits the responsive capacity of material properties as an alternative to the current focus on mechanical components and discrete sensing devices. This opportunity creates a different design paradigm for responsive architecture, one that integrates physical computing and responsive materials into a single material system. Instead of adopting highly intricate and expensive materials, this approach is explored through accessible, off-the-shelf materials that form a responsive material system called Lumina. Lumina is implemented as an architectural installation called Cloud, which serves as a morphing architectural skin. Cloud is a proof of concept that embodies a responsive material system with physical computing to create a reciprocal and luminous architectural intervention for a selected dark corridor. It represents a different design paradigm for responsive architecture through alternative exploitation of contemporary materials and parametric design tools. © 2014, The Association for Computer-Aided Architectural Design Research in Asia (CAADRIA), Hong Kong.

Relevance:

20.00%

Publisher:

Abstract:

The primary purpose of this book is to capture the state of the art in Cloud Computing technologies and applications. The book also aims to identify potential research directions and technologies that will facilitate the creation of a global marketplace of cloud computing services supporting scientific, industrial, business, and consumer applications. We expect the book to serve as a reference for a wide audience of systems architects, practitioners, developers, new researchers, and graduate-level students. This area of research is relatively recent and has no existing reference book that addresses it, so this book will be a timely contribution to a field that is gaining considerable research interest and momentum, and that is expected to be of increasing interest to commercial developers. The book is targeted at professional computer science developers and graduate students, especially at the Masters level. As Cloud Computing is recognized as one of the top five emerging technologies that will have a major impact on the quality of science and society over the next 20 years, knowledge of it will help position our readers at the forefront of the field. © 2011 John Wiley & Sons, Inc.

Relevance:

20.00%

Publisher:

Abstract:

Cloud computing is becoming popular as the next computing platform infrastructure. However, with data and business applications outsourced to a third party, protecting cloud data centers from numerous attacks has become a critical concern. In this paper, we propose a clusterized cloud firewall framework that supports performance and cost evaluation. To provide a quantitative performance analysis of the cloud firewall, a novel M/Geo/1 analytical model is established. The model allows cloud defenders to extract key system measures, such as request response time, and to determine how many resources are needed to guarantee quality of service (QoS). Moreover, we give insight into the financial cost of the proposed cloud firewall. Finally, our analytical results are verified by simulation experiments.
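The resource-sizing question ("how many firewall nodes to meet a response-time QoS target?") can be illustrated with a simpler M/M/1 approximation. This is only a sketch: the paper's actual model is M/Geo/1, and the rates below are made-up numbers.

```python
def mm1_response_time(lam, mu):
    """Mean response time of a stable M/M/1 queue with arrival
    rate lam and service rate mu: T = 1 / (mu - lam)."""
    assert lam < mu, "queue must be stable (lam < mu)"
    return 1.0 / (mu - lam)

def min_servers(lam, mu, target):
    """Smallest number of parallel firewall nodes, with arrivals
    split evenly across them, whose per-node M/M/1 mean response
    time meets the QoS target."""
    n = 1
    while lam / n >= mu or mm1_response_time(lam / n, mu) > target:
        n += 1
    return n

# e.g. 90 req/s total, each node serves 100 req/s, target 50 ms
print(min_servers(90.0, 100.0, 0.05))  # -> 2
```

One node alone is stable here (utilization 0.9) but too slow (T = 0.1 s); splitting traffic over two nodes brings each node's response time under the target.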

Relevance:

20.00%

Publisher:

Abstract:

Cloud-based service computing has started to change the way research in science, particularly in biology, medicine, and engineering, is carried out. Researchers in mammalian genomics have taken advantage of cloud computing technology to cost-effectively process large amounts of data and speed up discovery. Mammalian genomics is limited by the cost and complexity of analysis, which requires large amounts of computational resources to analyse huge volumes of data, and biology specialists to interpret the results. On the other hand, applying this technology requires computing knowledge, in particular programming and operations management skills, to develop high performance computing (HPC) applications and deploy them on HPC clouds. We carried out a survey of cloud-based service computing solutions, as the most recent and promising instantiations of distributed computing systems, in the context of their use in mammalian genomic analysis research. We describe our most recent research and development effort, which focuses on building Software as a Service (SaaS) clouds to simplify the use of HPC clouds for carrying out mammalian genomic analysis.

Relevance:

20.00%

Publisher:

Abstract:

Linear subspace representations of appearance variation are pervasive in computer vision. This paper addresses the problem of robustly matching such subspaces (computing the similarity between them) when they describe the scope of variation within sets of images of different (possibly greatly so) scales. A naïve solution of projecting the low-scale subspace into the high-scale image space is described first and subsequently shown to be inadequate, especially at large scale discrepancies. A successful approach is proposed instead. It consists of (i) an interpolated projection of the low-scale subspace into the high-scale space, followed by (ii) a rotation of this initial estimate within the bounds of the imposed "downsampling constraint". The optimal rotation is found in closed form as the one that best aligns the high-scale reconstruction of the low-scale subspace with the reference it is compared to. The method is evaluated on the problem of matching sets of (i) face appearances under varying illumination and (ii) object appearances under varying viewpoint, using two large data sets. Compared to naïve matching, the proposed algorithm is shown to greatly increase the separation between within-class and between-class similarities, as well as to produce far more meaningful modes of common appearance on which the match score is based. © 2014 Elsevier Ltd.
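To make "similarity between subspaces" concrete, here is the degenerate one-dimensional case, where the similarity reduces to the cosine of the single principal angle between two lines through the origin. This is only an illustration; the paper's method handles multi-dimensional subspaces and, crucially, subspaces living in image spaces of different scales.

```python
import math

def unit(v):
    """Normalize a vector (given as a list of floats) to unit length."""
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def subspace_similarity(u, v):
    """|cos| of the principal angle between the 1-D subspaces spanned
    by u and v. The absolute value makes the measure sign-invariant:
    span(u) and span(-u) are the same subspace."""
    uu, vv = unit(u), unit(v)
    return abs(sum(a * b for a, b in zip(uu, vv)))

print(subspace_similarity([2.0, 0.0], [-3.0, 0.0]))  # same line -> 1.0
print(subspace_similarity([1.0, 0.0], [0.0, 1.0]))   # orthogonal -> 0.0
```

For k-dimensional subspaces the analogous quantity is the set of k principal angles, typically computed from the singular values of the product of the two orthonormal basis matrices.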

Relevance:

20.00%

Publisher:

Abstract:

There is growing awareness of the importance of including computing education in the curriculum of secondary schools in countries such as the United States of America, the United Kingdom, New Zealand, and South Korea. Consequently, we have seen serious efforts to introduce computing education into the core curriculum and/or to improve it. Recent reports (such as Wilson et al. 2010; Hubwieser et al. 2011) reveal that computing education faces problems regarding its lack of exposure as well as a lack of motivators for students to follow this line of study. Although students use computers for many tasks both at home and at school, many of them never quite understand what computer science is and how it relates to algorithmic thinking and problem solving. This panel will bring together leaders in computing education from Australia, Germany, Greece, Israel, and Norway to describe the state of computing education in each of their countries. Issues raised will include how high school computer education is conducted in each country, how teachers are trained and accredited, the challenges being faced today, and how these challenges are being addressed. Panellists will suggest lessons other countries may find of value in their way of doing things. An important issue is how to recruit female students into computer education at the high school level and how to encourage them to continue in the discipline at university. The problem is exacerbated because computer education is still not included as a compulsory subject in the regular curriculum of high schools in all of these countries.

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a case study of a new construction of classifiers, called automatically generated multi-level meta classifiers (AGMLMC). The construction combines diverse meta classifiers in a new way to create a unified system, and can be generated automatically to produce classifiers with many levels: different meta classifiers are incorporated as low-level integral parts of another meta classifier at the top level. The construction is intended for distributed computing and networking, since AGMLMC classifiers are unified classifiers with many parts that can operate in parallel, which makes them easy to adopt in distributed applications. The paper introduces the new construction and undertakes an experimental study of its performance in the special case of detecting and filtering phishing emails, a potentially important application area for such large and distributed classification systems. Our experiments investigate the effectiveness of combining diverse meta classifiers into one AGMLMC classifier for this case study. The results show that the new multi-level classifiers achieved better performance than the base classifiers and simple meta classifiers, demonstrating that the new technique can increase performance when diverse meta classifiers are included in the system.

Relevance:

20.00%

Publisher:

Abstract:

Smartphone applications are becoming more popular and pervasive in our daily life, and are also attractive to malware writers due to devices' limited computing resources and vulnerabilities. At the same time, we possess limited understanding of our opponents in cyberspace. In this paper, we investigate the propagation model of SMS/MMS-based worms by integrating a semi-Markov process with a social relationship graph. In our modeling, we use the semi-Markov process to characterize state transitions among mobile nodes, and employ social network theory, a missing element in many previous works, to enhance the proposed mobile malware propagation model. To evaluate the proposed models, we developed dedicated software and collected large-scale real-world data. Extensive experiments indicate that the proposed models and algorithms are effective and practical. © 2014 Elsevier Ltd. All rights reserved.
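A much-simplified discrete-time SI (susceptible-infected) simulation over a contact graph conveys the flavour of graph-based worm propagation modeling. The paper's actual model is a semi-Markov process over real social data; the graph, infection probability, and step count below are illustrative assumptions.

```python
import random

def simulate_si(graph, seed, p_infect, steps, rng):
    """Discrete-time SI worm spread over a social contact graph.
    graph: dict node -> list of neighbours (e.g. SMS/MMS contacts).
    Each step, every infected node infects each susceptible
    neighbour independently with probability p_infect."""
    infected = {seed}
    for _ in range(steps):
        new = set()
        for u in infected:
            for v in graph[u]:
                if v not in infected and rng.random() < p_infect:
                    new.add(v)
        infected |= new
    return infected

contacts = {0: [1, 2], 1: [0, 3], 2: [0], 3: [1]}
# with p_infect = 1.0 the spread is deterministic: one hop per step
print(simulate_si(contacts, 0, 1.0, 1, random.Random(0)))  # -> {0, 1, 2}
print(simulate_si(contacts, 0, 1.0, 2, random.Random(0)))  # -> {0, 1, 2, 3}
```

A semi-Markov refinement would replace the fixed time step with state-dependent holding-time distributions per node (e.g. infected, dormant, immunized).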

Relevance:

20.00%

Publisher:

Abstract:

Platform remote attestation (RA) is one of the main features of the trusted computing platform proposed by the Trusted Computing Group (TCG). The privacy certificate authority (CA) solution for RA requires users to pay for multiple certificates, while the direct anonymous attestation (DAA) solution is inefficient. TCG RA also suffers from limited platform configuration privacy. This paper proposes an RA scheme based on improved combined public key cryptography (ICPK), abbreviated RA-ICPK. RA-ICPK is a certificate-less scheme that uses neither a public key infrastructure CA signature nor a DAA signature; it combines a commitment scheme, zero-knowledge proof, and a ring signature (RS) to achieve unforgeability and privacy. RA-ICPK is mainly based on elliptic curve cryptography without bilinear pairing computation, and carries out the zero-knowledge proof only once. RA-ICPK need not depend on trusted third parties to check trusted platform module identities and revocations of integrity values. © 2014 Springer Science+Business Media New York
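The zero-knowledge building block can be illustrated with a toy Schnorr identification protocol: the prover shows knowledge of a secret x with y = g^x mod p without revealing x. This is illustrative only; RA-ICPK itself works over elliptic curves and combines this primitive with commitments and a ring signature, and the parameter sizes below are far too small for real security.

```python
import random

# Toy group: p = 2q + 1 with p, q prime; g = 4 generates the
# subgroup of prime order q.  (Demo sizes only -- NOT secure.)
p, q, g = 2039, 1019, 4
rng = random.Random(1)

x = rng.randrange(1, q)          # prover's secret key
y = pow(g, x, p)                 # prover's public key

# --- one round of Schnorr identification ---
r = rng.randrange(1, q)          # prover: random nonce
t = pow(g, r, p)                 # prover: commitment, sent first
c = rng.randrange(1, q)          # verifier: random challenge
s = (r + c * x) % q              # prover: response

# verifier accepts iff g^s == t * y^c (mod p); the transcript
# (t, c, s) reveals nothing about x beyond its possession
assert pow(g, s, p) == (t * pow(y, c, p)) % p
print("proof verified")
```

The check holds because g^s = g^(r + c·x) = g^r · (g^x)^c = t · y^c in a group of order q. Making the challenge a hash of the commitment (Fiat-Shamir) turns this interactive protocol into a non-interactive proof.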

Relevance:

20.00%

Publisher:

Abstract:

Transparent computing is an emerging computing paradigm in which users can enjoy any kind of service over networks on demand, with any device, without caring about the underlying deployment details. In transparent computing, all software resources (even the OS) are stored on remote servers, from which clients request the resources for local execution in a block-streaming way. This paradigm has many benefits, including a cross-platform experience, user orientation, and platform independence. However, due to its fundamental features, e.g., the separation of computation and storage between clients and servers, and block-streaming-based scheduling and execution, transparent computing faces many new security challenges that may become its biggest obstacle. In this paper, we propose a Transparent Computing Security Architecture (TCSA), which builds user-controlled security for transparent computing by allowing users to configure the desired security environments on demand. We envision that TCSA, which allows users to take the initiative in protecting their own data, is a promising solution for data security in transparent computing. © 2014 IEEE.
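The block-streaming idea can be sketched as a toy client that fetches fixed-size blocks of a remote software image on demand and caches them locally. The class name, the 4 KiB block size, and the in-memory "server" are illustrative assumptions, not details from the paper.

```python
class BlockClient:
    """Toy block-streaming client: the full software image lives on
    the server; the client fetches 4 KiB blocks only when a read
    touches them, and keeps fetched blocks in a local cache."""

    BLOCK = 4096

    def __init__(self, server_image):
        self.server = server_image   # stand-in for the remote store
        self.cache = {}              # block index -> block bytes

    def read(self, offset, size):
        out = bytearray()
        for off in range(offset, offset + size):
            b = off // self.BLOCK
            if b not in self.cache:  # cache miss: fetch whole block
                start = b * self.BLOCK
                self.cache[b] = self.server[start:start + self.BLOCK]
            out.append(self.cache[b][off - b * self.BLOCK])
        return bytes(out)

image = bytes(range(256)) * 64       # 16 KiB fake OS image
client = BlockClient(image)
data = client.read(4090, 12)         # read spanning blocks 0 and 1
assert data == image[4090:4102]
print(sorted(client.cache))          # -> [0, 1]: only two blocks fetched
```

In this lazy-fetch design, security mechanisms such as per-block integrity checks or encryption would naturally sit at the block boundary, which is where TCSA's separation of computation (client) and storage (server) creates the new attack surface.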

Relevance:

20.00%

Publisher:

Abstract:

Cloud services to smart things face latency and intermittent connectivity issues. Fog devices are positioned between the cloud and smart devices: their high-speed Internet connection to the cloud and their physical proximity to users enable real-time applications, location-based services, and mobility support. Cisco promoted the fog computing concept in the areas of smart grids, connected vehicles, and wireless sensor and actuator networks. This survey article expands the concept to decentralized smart building control, recognizes cloudlets as a special case of fog computing, and relates it to software defined networking (SDN) scenarios. Our literature review identifies only a handful of articles. Cooperative data scheduling and adaptive traffic light problems in SDN-based vehicular networks, and demand response management in macro-station and micro-grid based smart grids, are discussed. Security, privacy, and trust issues, control information overhead, and network control policies do not seem to have been studied so far within the fog computing concept.