Abstract:
This book underlines the growing importance of knowledge for the competitiveness of cities and their regions. It examines the role of knowledge - in its economic, socio-cultural, spatial and institutional forms - for urban and regional development, identifies the preconditions for innovative use of urban and regional knowledge assets and resources, and develops new methods to evaluate the performance and potential of knowledge-based urban and regional development. In doing so, the book provides an in-depth and comprehensive understanding of both the theoretical and the practical aspects of knowledge-based development and its implications and prospects for cities and regions.
Abstract:
Multimedia communication capabilities are rapidly expanding, and visual information is easily shared electronically, yet funding bodies still rely on paper grant proposal submissions. Incorporating modern technologies will streamline the granting process by increasing the fidelity of grant communication, improving the efficiency of review, and reducing the cost of the process.
Abstract:
The present article gives an overview of the reversible addition fragmentation chain transfer (RAFT) process. RAFT is one of the most versatile living radical polymerization systems and yields polymers of predictable chain length and narrow molecular weight distribution. RAFT relies on the rapid exchange of thiocarbonylthio groups between growing polymeric chains. The key strengths of the RAFT process for polymer design are its high tolerance of monomer functionality and reaction conditions, the wide range of well-controlled polymeric architectures achievable, and its (in principle) non-rate-retarding nature. This article introduces the mechanism of polymerization, the range of polymer molecular weights achievable, the range of monomers whose polymerization is controlled by RAFT, the various polymeric architectures that can be obtained, the types of end-group functionality available to RAFT-made polymers, and the RAFT polymerization process itself.
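To make the "predictable chain length" claim concrete, the theoretical number-average molar mass of a RAFT-made polymer is commonly estimated from the monomer-to-RAFT-agent ratio and the conversion (a standard textbook relation, not a formula quoted from this article):

```latex
% Theoretical number-average molar mass, neglecting initiator-derived chains:
%   [M]_0, [CTA]_0 : initial monomer and RAFT-agent concentrations
%   x              : fractional monomer conversion
M_n^{\text{theory}} \approx \frac{[\mathrm{M}]_0}{[\mathrm{CTA}]_0}\, x \, M_{\text{monomer}} + M_{\text{CTA}}
```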
Abstract:
In this chapter, we discuss four related areas of cryptology, namely, authentication, hashing, message authentication codes (MACs), and digital signatures. These topics represent active and growing research topics in cryptology. Space limitations allow us to concentrate only on the essential aspects of each topic. The bibliography is intended to supplement our survey. We have selected those items which provide an overview of the current state of knowledge in the above areas. Authentication deals with the problem of providing assurance to a receiver that a communicated message originates from a particular transmitter, and that the received message has the same content as the transmitted message. A typical authentication scenario occurs in computer networks, where the identity of two communicating entities is established by means of authentication. Hashing is concerned with the problem of providing a relatively short digest, or fingerprint, of a much longer message or electronic document. A hashing function must satisfy (at least) the critical requirement that the fingerprints of two distinct messages are distinct. Hashing functions have numerous applications in cryptology. They are often used as primitives to construct other cryptographic functions. MACs are symmetric key primitives that provide message integrity against active spoofing by appending a cryptographic checksum to a message that is verifiable only by the intended recipient of the message. Message authentication is one of the most important ways of ensuring the integrity of information that is transferred by electronic means. Digital signatures provide electronic equivalents of handwritten signatures. They preserve the essential features of handwritten signatures and can be used to sign electronic documents. Digital signatures can potentially be used in legal contexts.
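As a minimal illustration of the MAC idea sketched above (a cryptographic checksum appended to a message and verified under a shared symmetric key), here is a short Python sketch using the standard-library hmac module; it illustrates the concept and is not code from the chapter:

```python
import hashlib
import hmac

# Shared symmetric key, known only to the sender and the intended recipient.
KEY = b"shared-secret-key"

def mac(message: bytes) -> bytes:
    """Cryptographic checksum: HMAC-SHA256 of the message under KEY."""
    return hmac.new(KEY, message, hashlib.sha256).digest()

def verify(message: bytes, tag: bytes) -> bool:
    """The recipient recomputes the MAC and compares in constant time."""
    return hmac.compare_digest(mac(message), tag)

msg = b"transfer 100 to account 42"
tag = mac(msg)                                          # sender appends tag to msg
assert verify(msg, tag)                                 # genuine message accepted
assert not verify(b"transfer 999 to account 42", tag)   # spoofed message rejected
```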
Abstract:
In this chapter we continue the exposition of crypto topics that was begun in the previous chapter. This chapter covers secret sharing, threshold cryptography, signature schemes, and finally quantum key distribution and quantum cryptography. As in the previous chapter, we have focused only on the essentials of each topic. We have selected in the bibliography a list of representative items, which can be consulted for further details. First we give a synopsis of the topics that are discussed in this chapter. Secret sharing is concerned with the problem of how to distribute a secret among a group of participating individuals, or entities, so that only predesignated collections of individuals are able to recreate the secret by collectively combining the parts of the secret that were allocated to them. There are numerous applications of secret-sharing schemes in practice. One example of secret sharing occurs in banking. For instance, the combination to a vault may be distributed in such a way that only specified collections of employees can open the vault by pooling their portions of the combination. In this way the authority to initiate an action, e.g., the opening of a bank vault, is divided for the purposes of providing security and for added functionality, such as auditing, if required. Threshold cryptography is a relatively recently studied area of cryptography. It deals with situations where the authority to initiate or perform cryptographic operations is distributed among a group of individuals. Many of the standard operations of single-user cryptography have counterparts in threshold cryptography. Signature schemes deal with the problem of generating and verifying (electronic) signatures for documents. A subclass of signature schemes is concerned with the shared generation and the shared verification of signatures, where a collaborating group of individuals are required to perform these actions. A new paradigm of security has recently been introduced into cryptography with the emergence of the ideas of quantum key distribution and quantum cryptography. While classical cryptography employs various mathematical techniques to restrict eavesdroppers from learning the contents of encrypted messages, in quantum cryptography the information is protected by the laws of physics.
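To make the bank-vault example concrete, here is a toy Python sketch of Shamir's threshold scheme, the classic secret-sharing construction (an illustration only; the chapter surveys such schemes rather than prescribing this code):

```python
import random

# Toy Shamir (t-of-n) secret sharing over a prime field.
P = 2**127 - 1  # a Mersenne prime; the secret must be smaller than P

def split(secret: int, t: int, n: int):
    """Create n shares of the secret; any t of them suffice to reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    poly = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * -xj % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P  # den^-1 mod P
    return secret

combination = 1937465                  # the vault combination, as an integer
shares = split(combination, t=3, n=5)  # five employees, any three can open
assert reconstruct(shares[:3]) == combination
```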
Abstract:
This special issue of Networking Science focuses on the Next Generation Network (NGN), which enables the deployment of access-independent services over converged fixed and mobile networks. The NGN is a packet-based network that uses the Internet Protocol (IP) to transport the various types of traffic (voice, video, data and signalling). The NGN facilitates easy adoption of distributed computing applications by providing high-speed connectivity in a converged networked environment. It also makes end-user devices and applications highly intelligent and efficient by empowering them with programmability and remote configuration options. However, there are a number of important challenges in provisioning next generation network technologies in a converged communication environment. Preliminary challenges include those relating to QoS, switching and routing, management and control, and security, which must be addressed as a matter of urgency. The consideration of architectural issues in the design and provision of secure services for the NGN deserves special attention and is hence the main theme of this special issue.
Abstract:
Recently, botnets, networks of compromised computers, have been recognized as the biggest threat to the Internet. The bots in a botnet communicate with the botnet owner via a communication channel called a Command and Control (C&C) channel. There are three main types of C&C channel: Internet Relay Chat (IRC), Peer-to-Peer (P2P) and web-based protocols. By exploiting the flexibility of Web 2.0 technology, web-based botnets have reached a new level of sophistication. In August 2009, such a botnet was found on Twitter, one of the most popular Web 2.0 services. In this paper, we describe a new type of botnet that uses a Web 2.0 service as a C&C channel and as temporary storage for its stolen information. We then propose a novel approach to thwart this type of attack. Our method combines a unique identifier of the computer, an encryption algorithm with session keys, and CAPTCHA verification.
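The countermeasure named above combines three ingredients; the following Python fragment loosely sketches the first two, a per-machine identifier and per-session encryption keys. The key-derivation choices and names here are hypothetical and do not reproduce the paper's scheme; the CAPTCHA step is omitted.

```python
import base64
import hashlib
import os
import uuid

from cryptography.fernet import Fernet  # third-party: pip install cryptography

def machine_id() -> bytes:
    """A stable per-computer identifier (here the MAC address via uuid.getnode)."""
    return uuid.getnode().to_bytes(6, "big")

def session_key() -> bytes:
    """A fresh random key per session, bound to this machine's identifier."""
    raw = hashlib.sha256(machine_id() + os.urandom(32)).digest()
    return base64.urlsafe_b64encode(raw)  # Fernet expects a base64 32-byte key

f = Fernet(session_key())
token = f.encrypt(b"message bound to this machine and session")
assert f.decrypt(token) == b"message bound to this machine and session"
```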
Abstract:
DeepBlue is much more than just an orchestra. Its innovative approach to audience engagement led it to develop ESP, its Electronic Show Programme web app, which allows for real-time (synchronous) and delayed (asynchronous) audience interaction, customer feedback and research. The show itself is driven invisibly by a music technology operating system (currently QUT's Yodel) that allows the ensemble to adapt to a wide range of performance venues and varied types of presentation. DeepBlue's community engagement program has enabled over 5,500 young musicians and community choristers to participate in professional productions; it is also a cornerstone of DeepBlue's successful business model. The ESP mobile web app can be viewed at m.deepblue.net.au; if only the landing page is active, there is no show taking place or imminent. The ESP prototype has already been in use for 18 months. Imagine knowing what your audience really thinks, in real time, so you can track their feelings and thoughts through the show. The tool has been developed and used by the performing group DeepBlue since late 2012 in Australia and Asia (it has even been translated into Vietnamese), and it has largely superseded DeepBlue's SMS real-time communication during shows. It enables an event presenter or performance group to take the pulse of an audience through a series of targeted questions that can be anonymous or attributed, helping to build better, longer-lasting and more meaningful relationships with groups and individuals in the community. It can be used on a tablet, mobile phone or future platforms. Three organisations are trialling it so far.
Abstract:
Autonomous navigation and picture compilation tasks require robust feature descriptions or models. Given the non-Gaussian nature of sensor observations, it will be shown that Gaussian mixture models provide a general probabilistic representation that allows analytical solutions to the update and prediction operations of the general Bayesian filtering problem. Each operation in the Bayesian filter for Gaussian mixture models multiplicatively increases the number of parameters in the representation, creating the need for a re-parameterisation step. A computationally efficient re-parameterisation step is demonstrated, resulting in a compact and accurate estimate of the true distribution.
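The re-parameterisation step referred to above is, in essence, mixture reduction. Here is a minimal numpy sketch of its standard building block, the moment-preserving merge of two weighted Gaussian components (a common reduction primitive, not necessarily the paper's exact algorithm):

```python
import numpy as np

def merge_components(w1, m1, P1, w2, m2, P2):
    """Merge two weighted Gaussian components (weight, mean, covariance) into
    one component that preserves the pair's overall mean and covariance."""
    w = w1 + w2
    a1, a2 = w1 / w, w2 / w
    m = a1 * m1 + a2 * m2
    d1, d2 = m1 - m, m2 - m
    P = a1 * (P1 + np.outer(d1, d1)) + a2 * (P2 + np.outer(d2, d2))
    return w, m, P

# Collapse two nearby 2-D components into a single compact component.
w, m, P = merge_components(
    0.6, np.array([0.0, 0.0]), np.eye(2),
    0.4, np.array([1.0, 0.5]), 0.5 * np.eye(2),
)
```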
Abstract:
LiteSteel beam (LSB) is a new cold-formed steel hollow flange channel section produced using a patented manufacturing process involving simultaneous cold-forming and dual electric resistance welding. LSBs are commonly used as floor joists and bearers with web openings in residential, industrial and commercial buildings. Due to the unique geometry of LSBs, as well as their unique residual stress characteristics and initial geometric imperfections resulting from the manufacturing process, much of the existing research on common cold-formed steel sections is not directly applicable to LSBs. Many research studies have been carried out to evaluate the behaviour and design of LSBs subject to pure bending actions, predominant shear and combined actions. However, to date, no investigation has been conducted into the web crippling behaviour and strength of LSB sections. Hence detailed experimental studies were conducted to investigate the web crippling behaviour and strength of LSBs under EOF (End One Flange) and IOF (Interior One Flange) load cases. A total of 26 web crippling tests were conducted and the results were compared with the current AS/NZS 4600 design rules. This comparison showed that the AS/NZS 4600 (SA, 2005) design rules are very conservative for LSB sections under EOF and IOF load cases. Suitable design equations have been proposed to determine the web crippling capacity of LSBs based on the experimental results. This paper presents the details of this experimental study on the web crippling behaviour and strength of LiteSteel beams under EOF and IOF load cases.
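For context, both AS/NZS 4600 and AISI S100 express the web crippling design capacity in a unified form along the following lines (standard code notation given as background, not material from the paper; the coefficients are tabulated by section type and load case):

```latex
% Unified web crippling capacity (AS/NZS 4600 / AISI S100 form):
%   t = web thickness, f_y = yield stress, \theta = web inclination angle,
%   r_i = inside bend radius, l_b = bearing length, d_1 = flat web depth,
%   C, C_r, C_l, C_w = tabulated coefficients
R_b = C\, t^2 f_y \sin\theta
      \left(1 - C_r\sqrt{r_i/t}\right)
      \left(1 + C_l\sqrt{l_b/t}\right)
      \left(1 - C_w\sqrt{d_1/t}\right)
```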
Abstract:
This paper presents the details of experimental studies on the effect of real support conditions on the shear strength of LiteSteel beams (LSB). The LSB has a unique shape of a channel beam with two rectangular hollow flanges, made using a unique manufacturing process. In some applications in the building industry, LSBs are used with only one web side plate (WSP) at their supports, or without full-height WSPs. Past research studies showed that these real support connections do not provide simply supported conditions. Many studies have been carried out to evaluate the behaviour and design of LSBs with simply supported conditions subject to pure bending and predominant shear actions. To date, however, no investigation has been conducted into the effect of real support conditions on the shear strength of LSBs. Hence detailed experimental studies were undertaken to investigate the shear behaviour and strength of LSBs with real support conditions. A total of 28 experimental tests were conducted as part of the studies. Simply supported test specimens of LSBs with aspect ratios of 1.0 and 1.5 were loaded at mid-span until failure. It was found that the effect of using one WSP on the shear behaviour of LSB is significant: there is a reduction of about 25% in shear capacity due to the lateral movement of the bottom flange at the supports. The shear capacity of LSB was also found to decrease when full-height WSPs were not used. Suitable support connections were developed to improve the shear capacity of LSBs based on the test results.
Abstract:
This paper presents the details of an experimental study of a cold-formed steel hollow flange channel beam known as the LiteSteel Beam (LSB) subject to web crippling actions (ETF and ITF). Due to the geometry of the LSB, as well as its unique residual stress characteristics and initial geometric imperfections resulting from the manufacturing process, much of the existing research on common cold-formed steel sections is not directly applicable to the LSB. Experimental and numerical studies have been carried out to evaluate the behaviour and design of LSBs subject to pure bending actions, predominant shear actions and combined actions. To date, however, no investigation has been conducted into the web crippling behaviour and strength of LSB sections under ETF and ITF load conditions. Hence experimental studies were conducted to assess the web crippling behaviour and strength of LSBs. Twenty-eight web crippling tests were conducted and the results were compared with the current AS/NZS 4600 [1] and AISI S100 [2] design equations. Comparison with the ultimate web crippling capacities from the tests showed that the AS/NZS 4600 [1] and AISI S100 [2] design equations are unconservative for LSB sections under ETF and ITF load cases. Hence new equations were proposed to determine the web crippling capacities of LSBs. Suitable design rules were also developed in the DSM (Direct Strength Method) format.
Abstract:
A business process is often modeled using some kind of directed flow graph, which we call a workflow graph. The Refined Process Structure Tree (RPST) is a technique for workflow graph parsing, i.e., for discovering the structure of a workflow graph, which has various applications. In this paper, we provide two improvements to the RPST. First, we propose an alternative way to compute the RPST that is simpler than the one developed originally. In particular, the computation reduces to constructing the tree of the triconnected components of a workflow graph in the special case when every node has at most one incoming or at most one outgoing edge. Such graphs occur frequently in applications. Second, we extend the applicability of the RPST. Originally, the RPST was applicable only to graphs with a single source and a single sink such that the completed version of the graph is biconnected. We lift both restrictions. The RPST thus becomes applicable to arbitrary directed graphs in which every node is on a path from some source to some sink, including graphs with multiple sources and/or sinks and disconnected graphs.
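The structural special case mentioned above is easy to test. Here is a small hypothetical Python helper that checks whether every node of a workflow graph has at most one incoming or at most one outgoing edge (an illustration of the precondition only, not the RPST computation itself):

```python
def at_most_one_in_or_out(edges):
    """True if every node has at most one incoming or at most one outgoing edge."""
    indeg, outdeg = {}, {}
    for u, v in edges:
        outdeg[u] = outdeg.get(u, 0) + 1
        indeg[v] = indeg.get(v, 0) + 1
    nodes = set(indeg) | set(outdeg)
    return all(indeg.get(n, 0) <= 1 or outdeg.get(n, 0) <= 1 for n in nodes)

# A split/join workflow graph: "split" has two outgoing edges but one incoming
# edge, and "join" has two incoming edges but one outgoing edge.
edges = [("start", "split"), ("split", "a"), ("split", "b"),
         ("a", "join"), ("b", "join"), ("join", "end")]
print(at_most_one_in_or_out(edges))  # True
```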
Abstract:
Analysis of behavioural consistency is an important aspect of software engineering. In process and service management, consistency verification of behavioural models has manifold applications. For instance, a business process model used as a system specification and a corresponding workflow model used as an implementation have to be consistent. Another example is the analysis of the degree to which a log of executed business operations is consistent with the corresponding normative process model. Typically, existing notions of behaviour equivalence, such as bisimulation and trace equivalence, are applied as consistency notions. However, these notions require exponential computation and yield only a Boolean result. In many cases, a quantification of behavioural deviation is needed, along with concepts to isolate the source of deviation. In this article, we propose causal behavioural profiles as the basis for a consistency notion. These profiles capture essential behavioural information, such as order, exclusiveness, and causality between pairs of activities of a process model. Consistency based on these profiles is weaker than trace equivalence, but can be computed efficiently for a broad class of models. We introduce techniques for the computation of causal behavioural profiles using structural decomposition techniques for sound free-choice workflow systems, provided that unstructured net fragments are acyclic or can be traced back to S- or T-nets. We also elaborate on the findings of applying our technique to three industry model collections.
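To make the profile relations concrete, the following toy Python function derives order and exclusiveness relations for activity pairs from an enumerated set of traces (the article computes such profiles structurally on the model, and adds causality; this brute-force version is an illustration only):

```python
from itertools import combinations

def behavioural_profile(traces):
    """Classify each activity pair as strict order, interleaving, or exclusive,
    based on whether one activity can occur before the other in some trace."""
    follows = {(a, b) for t in traces
               for i, a in enumerate(t) for b in t[i + 1:]}
    acts = sorted({a for t in traces for a in t})
    profile = {}
    for a, b in combinations(acts, 2):
        ab, ba = (a, b) in follows, (b, a) in follows
        if ab and ba:
            profile[(a, b)] = "interleaving"
        elif ab:
            profile[(a, b)] = "strict order"
        elif ba:
            profile[(a, b)] = "reverse strict order"
        else:
            profile[(a, b)] = "exclusive"
    return profile

# Two traces in which "b" is optional and "d" never co-occurs with the rest:
print(behavioural_profile([("a", "b", "c"), ("a", "c"), ("d",)]))
# {('a','b'): 'strict order', ('a','c'): 'strict order', ('a','d'): 'exclusive',
#  ('b','c'): 'strict order', ('b','d'): 'exclusive', ('c','d'): 'exclusive'}
```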
Abstract:
We identify relation completion (RC) as a recurring problem that is central to the success of novel big data applications such as Entity Reconstruction and Data Enrichment. Given a semantic relation, RC attempts to link entity pairs between two entity lists under the relation. To accomplish the RC goals, we propose to formulate search queries for each query entity α based on some auxiliary information, so as to detect its target entity β from the set of retrieved documents. For instance, a pattern-based method (PaRE) uses extracted patterns as the auxiliary information when formulating search queries. However, high-quality patterns may decrease the probability of finding suitable target entities. As an alternative, we propose the CoRE method, which uses context terms learned from the surroundings of a relation's expression as the auxiliary information when formulating queries. Experimental results based on several real-world web data collections demonstrate that CoRE achieves much higher accuracy than PaRE for the purpose of RC.
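As a sketch of the difference between the two query-formulation strategies, consider the following hypothetical Python helpers (names, patterns and ranking are invented for illustration; neither system is reproduced here):

```python
def pare_style_query(entity: str, pattern: str) -> str:
    """Pattern-based query: instantiate an extracted pattern with the entity."""
    return '"' + pattern.replace("<X>", entity) + '"'

def core_style_query(entity: str, context_terms, k: int = 3) -> str:
    """Context-based query: pair the entity with the top-k context terms
    learned around expressions of the relation."""
    return " ".join(['"' + entity + '"'] + list(context_terms)[:k])

# Relation: company -> headquarters city
print(pare_style_query("Acme Corp", "<X> is headquartered in"))
# "Acme Corp is headquartered in"          (exact phrase: precise but brittle)
print(core_style_query("Acme Corp", ["headquarters", "offices", "based"]))
# "Acme Corp" headquarters offices based   (looser: more documents match)
```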