910 results for key scheduling algorithm
Abstract:
We introduce a formal model for certificateless authenticated key exchange (CL-AKE) protocols. Contrary to what might be expected, we show that the natural combination of an ID-based AKE protocol with a public-key-based AKE protocol cannot provide strong security. We provide the first one-round CL-AKE scheme proven secure in the random oracle model. We introduce two variants of the Diffie-Hellman trapdoor test introduced by \cite{DBLP:conf/eurocrypt/CashKS08}. The proposed key agreement scheme is secure as long as each party has at least one uncompromised secret. Thus, our scheme is secure even if the key generation centre learns the ephemeral secrets of both parties.
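The "secure as long as one secret per party is uncompromised" property can be illustrated with a simplified Diffie-Hellman sketch, in which each side hashes all four cross-combinations of its static and ephemeral exponents with the peer's two public values. This is purely illustrative (toy group, hypothetical names), not the paper's actual protocol.

```python
import hashlib
import secrets

# Toy parameters -- NOT secure; a real deployment would use a
# standardised group (e.g. RFC 3526) or elliptic curves.
P = 2**127 - 1   # a Mersenne prime, used here only for illustration
G = 3

def keypair():
    """Return a private exponent and the matching public value g^sk mod p."""
    sk = secrets.randbelow(P - 2) + 1
    return sk, pow(G, sk, P)

def session_key(my_static, my_eph, peer_static_pub, peer_eph_pub):
    # Hash all four {static, ephemeral} x {static, ephemeral} DH values:
    # an adversary must compromise matching exponents on both sides of
    # every combination to recover the session key.
    parts = [
        pow(peer_static_pub, my_static, P),
        pow(peer_eph_pub, my_static, P),
        pow(peer_static_pub, my_eph, P),
        pow(peer_eph_pub, my_eph, P),
    ]
    h = hashlib.sha256()
    for v in sorted(parts):  # sort so both sides hash in the same order
        h.update(v.to_bytes(16, "big"))
    return h.digest()
```

Both parties arrive at the same four group elements (in a different order, hence the sort), so their derived keys agree.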
Abstract:
Manufacture, construction and use of buildings and building materials make a significant environmental impact internally (inside the building), locally (in the neighbourhood) and globally. Life cycle assessment (LCA) methodology is applied to evaluate the environmental impact of buildings and/or building materials. One of the major applications of LCA is to identify key issues of a product system from cradle to grave. Key issues identified in an LCA point one in the right direction in assessing the environmental aspects of a product system, and also help to identify areas for improving the environmental performance of a product. The purpose of this paper is to suggest two methods for identifying key issues using an integrated tool (LCADesign), which has been developed to provide a method of determining the best alternative for reducing environmental impacts from a building or building materials, and to compare both methods in a case study. This paper assists designers and marketers of buildings and building materials in their decision making by providing information on activities or alternatives identified as key issues for environmental impacts.
Abstract:
Process Control Systems (PCSs) or Supervisory Control and Data Acquisition (SCADA) systems have recently been added to the already wide collection of wireless sensor network applications. The PCS/SCADA environment is somewhat more amenable to the use of heavy cryptographic mechanisms, such as public key cryptography, than other sensor application environments. The sensor nodes in the environment, however, are still open to devastating attacks such as node capture, which makes designing a secure key management scheme challenging. In this paper, a key management scheme is proposed to defeat node capture attacks by offering both forward and backward secrecy. Our scheme overcomes the pitfalls from which Nilsson et al.'s scheme suffers, and is no more expensive than their scheme.
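As a hedged illustration (not Nilsson et al.'s construction, nor necessarily this paper's), a common building block for forward secrecy in sensor key management is a one-way key chain: each epoch key is a hash of the previous one, so a node captured in epoch i cannot reveal keys from earlier epochs. Backward secrecy needs an extra ingredient, typically fresh randomness delivered by the gateway.

```python
import hashlib

def next_key(key: bytes) -> bytes:
    # One-way update: deriving k_{i+1} from k_i is easy, inverting is not,
    # so keys from earlier epochs stay safe after a node capture.
    return hashlib.sha256(b"wsn-key-update" + key).digest()

# Walk a key chain forward from an (illustrative) initial network key.
k0 = bytes(32)
k1 = next_key(k0)
k2 = next_key(k1)
```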
Abstract:
We present a new penalty-based genetic algorithm for the multi-source and multi-sink minimum vertex cut problem, and illustrate the algorithm’s usefulness with two real-world applications. It is proved in this paper that the genetic algorithm always produces a feasible solution by exploiting some domain-specific knowledge. The genetic algorithm has been implemented on the example applications and evaluated to show how well it scales as the problem size increases.
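The penalty idea can be sketched minimally as follows (the paper's concrete encoding and operators differ; names here are illustrative): a candidate is a set of removed vertices, and infeasible candidates, where some source still reaches a sink, are kept in the population but charged a large penalty rather than being discarded.

```python
from collections import deque

def reachable(adj, removed, sources, sinks):
    """BFS over the graph with the `removed` vertices deleted."""
    seen = set(s for s in sources if s not in removed)
    q = deque(seen)
    while q:
        u = q.popleft()
        if u in sinks:
            return True
        for v in adj.get(u, ()):
            if v not in removed and v not in seen:
                seen.add(v)
                q.append(v)
    return False

def fitness(adj, candidate, sources, sinks, penalty=1000):
    # Penalise infeasible candidates instead of discarding them, so the
    # GA can search near the feasibility boundary (to be minimised).
    cost = len(candidate)
    if reachable(adj, candidate, sources, sinks):
        cost += penalty
    return cost
```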
Abstract:
This study sought to improve understanding of the persuasive process of emotion-based appeals, not only in relation to negative, fear-based appeals but also for appeals based upon positive emotions. In particular, the study investigated whether response efficacy, as a cognitive construct, mediated outcome measures of message effectiveness in terms of both acceptance and rejection of negative and positive emotion-based messages. Licensed drivers (N = 406) participated via the completion of an online survey. Within the survey, participants received either a negative (fear-based) appeal or one of two possible positive appeals (pride- or humor-based). Overall, the study's findings confirmed the importance of the emotional and cognitive components of persuasive health messages and identified response efficacy as a key cognitive construct influencing the effectiveness of not only fear-based messages but also positive emotion-based messages. Interestingly, however, the results suggested that response efficacy's influence on message effectiveness may differ for positive and negative emotion-based appeals, such that significant indirect (and mediational) effects were found with both acceptance and rejection of the positive appeals, yet only with rejection of the fear-based appeal. As such, the study's findings provide an important extension to the extant literature and may inform future advertising message design.
Abstract:
In public venues, crowd size is a key indicator of crowd safety and stability. Crowding levels can be detected using holistic image features; however, this requires a large amount of training data to capture the wide variations in crowd distribution. If a crowd counting algorithm is to be deployed across a large number of cameras, such a large and burdensome training requirement is far from ideal. In this paper we propose an approach that uses local features to count the number of people in each foreground blob segment, so that the total crowd estimate is the sum of the group sizes. This results in an approach that is scalable to crowd volumes not seen in the training data and can be trained on a very small data set. As a local approach is used, the proposed algorithm can easily be used to estimate crowd density throughout different regions of the scene and be used in a multi-camera environment. A localised approach to ground truth annotation, which reduces the required training data, is also presented, as a localised approach to crowd counting has different training requirements to a holistic one. Testing on a large pedestrian database compares the proposed technique to existing holistic techniques and demonstrates improved accuracy, and superior performance when test conditions are unseen in the training set or a minimal training set is used.
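The sum-of-group-sizes idea can be caricatured with a single scalar feature (blob area) and a linear model fitted by least squares; the paper uses richer local features, so treat this purely as an illustrative sketch under those simplifying assumptions.

```python
def fit_linear(features, counts):
    """Least-squares fit of count ~ w * feature + b for one scalar feature."""
    n = len(features)
    mx = sum(features) / n
    my = sum(counts) / n
    num = sum((x - mx) * (y - my) for x, y in zip(features, counts))
    den = sum((x - mx) ** 2 for x in features)
    w = num / den
    b = my - w * mx
    return w, b

def estimate_crowd(blob_areas, w, b):
    # Total crowd = sum of per-blob (local) estimates, so the model only
    # ever needs per-group annotations, not whole-scene counts.
    return sum(max(0.0, w * a + b) for a in blob_areas)
```

Because each blob is scored independently, the same fitted model transfers to scenes and total crowd volumes never seen during training.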
Abstract:
This paper aimed to explore the perceived importance and the actual use of performance indicators in manufacturing and non-manufacturing industries. The sample was 86 small and medium-sized organizations in Thailand. The perceived importance and the actual use of financial and non-financial indicators were found to be significantly related in both manufacturing and non-manufacturing industries. KPIs 3, 9 and 12 (i.e. sales and sales growth; quality of products and/or services; and process time) were perceived as the most important among manufacturing managers (85.3%, 79.4% and 76.5% respectively), while KPIs 6, 9 and 12 (i.e. customer satisfaction; quality of products and/or services; and process time) were perceived as the most important among non-manufacturing managers (84.8%, 93.5% and 84.8% respectively). Interestingly, the most used KPIs in manufacturing were sales and sales growth (64.7%), profit margins (61.8%) and customer satisfaction (84.8%), while non-manufacturing used quality of products/services (60.9%), sales and sales growth (54.3%) and employee development (54.3%). Limitations and implications are also discussed.
Abstract:
In Web service based systems, new value-added Web services can be constructed by integrating existing Web services. A Web service may have many implementations which are functionally identical but have different Quality of Service (QoS) attributes, such as response time, price, reputation, reliability and availability. Thus, a significant research problem in Web service composition is how to select an implementation for each of the component Web services so that the overall QoS of the composite Web service is optimal. This is the so-called QoS-aware Web service composition problem. In some composite Web services there are dependencies and conflicts between the Web service implementations; however, existing approaches cannot handle these constraints. This paper tackles the QoS-aware Web service composition problem with inter-service dependencies and conflicts using a penalty-based genetic algorithm (GA). Experimental results demonstrate the effectiveness and the scalability of the penalty-based GA.
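A minimal sketch of how a penalty-based objective might score one candidate composition (attribute names and the single response-time objective are illustrative assumptions, not the paper's full QoS model): conflicting implementation choices are penalised rather than filtered out, which is what lets a GA explore constrained search spaces.

```python
def composite_qos(selection, qos, conflicts, penalty=1e6):
    """Score a candidate composition: minimise total response time.

    `selection` maps each abstract service to its chosen implementation,
    `qos` maps an implementation to its response time, and `conflicts`
    is a set of frozensets of mutually incompatible implementations.
    """
    total = sum(qos[impl] for impl in selection.values())
    chosen = set(selection.values())
    for pair in conflicts:
        # Charge a large penalty when an incompatible pair is co-selected,
        # instead of discarding the candidate outright.
        if pair <= chosen:
            total += penalty
    return total
```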
Abstract:
Effective information and knowledge management (IKM) is critical to corporate success; yet its actual establishment and management are not yet fully understood. We identify ten organizational elements that need to be addressed to ensure the effective implementation and maintenance of information and knowledge management within organizations. We define these elements and provide key characterizations. We then discuss a case study that describes the implementation of an information system (designed to support IKM) in a medical supplies organization. We apply the framework of organizational elements in our analysis to uncover the enablers and barriers in this system implementation project. Our analysis suggests that taking the ten organizational elements into consideration when implementing information systems will assist practitioners in managing information and knowledge processes more effectively and efficiently. We discuss implications for future research.
Abstract:
We consider one-round key exchange protocols secure in the standard model. The security analysis uses the powerful security model of Canetti and Krawczyk and a natural extension of it to the ID-based setting. It is shown how KEMs can be used in a generic way to obtain two different protocol designs with progressively stronger security guarantees. A detailed analysis of the performance of the protocols is included; surprisingly, when instantiated with specific KEM constructions, the resulting protocols are competitive with the best previous schemes that have proofs only in the random oracle model.
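The generic "each party encapsulates to the other's public key in a single round" pattern can be sketched with a toy ElGamal-style KEM; the group parameters are deliberately toy-sized and the construction is illustrative, not one of the paper's actual designs.

```python
import hashlib
import secrets

P = 2**127 - 1  # toy Mersenne prime; not a secure parameter choice
G = 3

def kem_keypair():
    sk = secrets.randbelow(P - 2) + 1
    return sk, pow(G, sk, P)

def encapsulate(pk):
    """Return a ciphertext and the shared secret it encapsulates."""
    r = secrets.randbelow(P - 2) + 1
    ct = pow(G, r, P)
    ss = hashlib.sha256(pow(pk, r, P).to_bytes(16, "big")).digest()
    return ct, ss

def decapsulate(sk, ct):
    return hashlib.sha256(pow(ct, sk, P).to_bytes(16, "big")).digest()

def session_key(ss_sent, ss_received):
    # Hash the two shared secrets in a canonical order so that both
    # parties derive the same session key.
    a, b = sorted([ss_sent, ss_received])
    return hashlib.sha256(a + b).digest()
```

In one round, each side sends one ciphertext; the session key combines both decapsulated secrets.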
Abstract:
Surveillance networks are typically monitored by a few people, viewing several monitors displaying the camera feeds. It is then very difficult for a human operator to effectively detect events as they happen. Recently, computer vision research has begun to address ways to automatically process some of this data, to assist human operators. Object tracking, event recognition, crowd analysis and human identification at a distance are being pursued as a means to aid human operators and improve the security of areas such as transport hubs. The task of object tracking is key to the effective use of more advanced technologies. To recognize an event, people and objects must be tracked. Tracking also enhances the performance of tasks such as crowd analysis or human identification. Before an object can be tracked, it must be detected. Motion segmentation techniques, widely employed in tracking systems, produce a binary image in which objects can be located. However, these techniques are prone to errors caused by shadows and lighting changes. Detection routines often fail, either due to erroneous motion caused by noise and lighting effects, or due to the detection routines being unable to split occluded regions into their component objects. Particle filters can be used as a self-contained tracking system, making it unnecessary for detection to be carried out separately, except for an initial (often manual) detection to initialise the filter. Particle filters use one or more extracted features to evaluate the likelihood of an object existing at a given point in each frame. Such systems, however, do not easily allow for multiple objects to be tracked robustly, and do not explicitly maintain the identity of tracked objects. This dissertation investigates improvements to the performance of object tracking algorithms through improved motion segmentation and the use of a particle filter.
A novel hybrid motion segmentation / optical flow algorithm, capable of simultaneously extracting multiple layers of foreground and optical flow in surveillance video frames, is proposed. The algorithm is shown to perform well in the presence of adverse lighting conditions, and the optical flow is capable of extracting a moving object. The proposed algorithm is integrated within a tracking system and evaluated using the ETISEO (Evaluation du Traitement et de l'Interpretation de Sequences vidEO - Evaluation for video understanding) database, and significant improvement in detection and tracking performance is demonstrated when compared to a baseline system. A Scalable Condensation Filter (SCF), a particle filter designed to work within an existing tracking system, is also developed. The creation and deletion of modes and the maintenance of identity are handled by the underlying tracking system, and the tracking system is able to benefit from the improved performance in uncertain conditions, arising from occlusion and noise, provided by a particle filter. The system is evaluated using the ETISEO database. The dissertation then investigates fusion schemes for multi-spectral tracking systems. Four fusion schemes for combining a thermal and a visual colour modality are evaluated using the OTCBVS (Object Tracking and Classification in and Beyond the Visible Spectrum) database. It is shown that a middle fusion scheme yields the best results and demonstrates a significant improvement in performance when compared to a system using either mode individually. Findings from the thesis contribute to improving the performance of semi-automated video processing and therefore improve security in areas under surveillance.
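The particle-filter machinery underlying trackers such as the SCF can be sketched as a bootstrap predict/update/resample cycle. This is a 1-D toy with a Gaussian likelihood on position; a real visual tracker scores particles with image features, so treat the model below as an illustrative assumption.

```python
import math
import random

def gauss_pdf(x, std):
    return math.exp(-0.5 * (x / std) ** 2) / (std * math.sqrt(2.0 * math.pi))

def particle_filter_step(particles, weights, measurement,
                         motion_std=1.0, meas_std=2.0):
    """One predict / update / resample cycle of a bootstrap particle filter."""
    # Predict: diffuse each particle under a random-walk motion model.
    particles = [p + random.gauss(0.0, motion_std) for p in particles]
    # Update: reweight each particle by the measurement likelihood.
    weights = [w * gauss_pdf(measurement - p, meas_std)
               for p, w in zip(particles, weights)]
    total = sum(weights) or 1e-300
    weights = [w / total for w in weights]
    # Resample: draw an equally weighted set, duplicating likely particles.
    particles = random.choices(particles, weights=weights, k=len(particles))
    weights = [1.0 / len(particles)] * len(particles)
    return particles, weights
```

Repeating the step pulls the particle cloud toward the measurement while the motion noise keeps enough diversity to cope with occlusion and noise.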
Abstract:
Purpose – The introduction of Building Information Model (BIM) tools over the last 20 years is resulting in radical changes in the Architectural, Engineering and Construction industry. One of these changes concerns the use of Virtual Prototyping, an advanced technology integrating BIM with realistic graphical simulations. Construction Virtual Prototyping (CVP) has now been developed and implemented on ten real construction projects in Hong Kong over the past three years. This paper reports on a survey aimed at establishing the effects of adopting this new technology and obtaining recommendations for future development. Design/methodology/approach – A questionnaire survey was conducted in 2007 of 28 key participants involved in four major Hong Kong construction projects, these projects being chosen because the CVP approach was used in more than one stage in each project. In addition, several interviews were conducted with the project manager, planning manager and project engineer of an individual project. Findings – All the respondents and interviewees gave a positive response to the CVP approach, with the most useful software functions considered to be those relating to visualisation and communication. The CVP approach was thought to improve the collaboration efficiency of the main contractor and sub-contractors by approximately 30 percent, with a concomitant 30 to 50 percent reduction in meeting time. The most important benefits of CVP in the construction planning stage are the improved accuracy of process planning and shorter planning times, while improved fieldwork instruction and reduced rework occur in the construction implementation stage. Although project teams are hesitant to attribute any specific time savings directly to the use of CVP, it was acknowledged that the workload of project planners is decreased. Suggestions for further development of the approach include the incorporation of automatic scheduling and advanced assembly study.
Originality/value – Whilst the research, development and implementation of CVP is relatively new in the construction industry, it is clear from the applications and feedback to date that the approach provides considerable added value to the organisation and management of construction projects.