196 results for yoking proof


Relevance:

10.00%

Publisher:

Abstract:

We introduce a formal model for certificateless authenticated key exchange (CL-AKE) protocols. Contrary to what might be expected, we show that the natural combination of an ID-based AKE protocol with a public-key-based AKE protocol cannot provide strong security. We provide the first one-round CL-AKE scheme proven secure in the random oracle model. We introduce two variants of the Diffie-Hellman trapdoor test introduced by \cite{DBLP:conf/eurocrypt/CashKS08}. The proposed key agreement scheme is secure as long as each party has at least one uncompromised secret. Thus, our scheme is secure even if the key generation centre learns the ephemeral secrets of both parties.
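
The Diffie-Hellman trapdoor test of \cite{DBLP:conf/eurocrypt/CashKS08} that the scheme builds on can be illustrated concretely. The following is a minimal Python sketch over a toy prime-order group; the parameters are illustrative only and far too small for real use:

```python
# Minimal sketch of the Cash-Kiltz-Shoup trapdoor test (EUROCRYPT 2008).
# Toy group parameters only -- a real protocol would use a standardised
# large prime-order or elliptic-curve group.
import secrets

p, q, g = 2039, 1019, 4          # subgroup of prime order q in Z_p*

x1 = secrets.randbelow(q - 1) + 1
X1 = pow(g, x1, p)               # first public value, secret x1 known

# Trapdoor: pick r, s and derive X2 so that its discrete log is unknown,
# yet DH answers with respect to X2 can still be checked.
r = secrets.randbelow(q - 1) + 1
s = secrets.randbelow(q - 1) + 1
X2 = (pow(g, s, p) * pow(X1, q - r, p)) % p   # X2 = g^s * X1^(-r)

def trapdoor_check(Y, Z1, Z2):
    """Accepts iff Z1 = Y^x1 and Z2 = Y^x2 (w.h.p.), without knowing x2."""
    return (pow(Z1, r, p) * Z2) % p == pow(Y, s, p)

# Honest party: Y = g^y, Z1 = X1^y, Z2 = X2^y passes the test.
y = secrets.randbelow(q - 1) + 1
Y = pow(g, y, p)
assert trapdoor_check(Y, pow(X1, y, p), pow(X2, y, p))
# A tampered second DH value is rejected.
assert not trapdoor_check(Y, pow(X1, y, p), (pow(X2, y, p) * g) % p)
```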

Relevance:

10.00%

Publisher:

Abstract:

The challenges of maintaining a building such as the Sydney Opera House are immense and depend upon a vast array of information. The value of information is enhanced by its currency, its accessibility and the ability to correlate data sets (integration of information sources). A building information model correlated with the various information sources related to the facility serves as the definition of a digital facility model. Such a digital facility model would give transparent, integrated access to an array of datasets and would thereby support Facility Management processes. In order to construct such a digital facility model, two state-of-the-art information and communication technologies are considered: an internationally standardized building information model called the Industry Foundation Classes (IFC), and a set of advanced communication and integration technologies often referred to as the Semantic Web, such as the Resource Description Framework (RDF) and the Web Ontology Language (OWL). This paper reports on technical aspects of developing a digital facility model for the Sydney Opera House. The proposed digital facility model enables IFC data to participate in an ontology-driven, service-oriented software environment. A proof-of-concept prototype has been developed demonstrating the use of IFC information in combination with Sydney Opera House’s specific data sources via semantic web ontologies.
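
As an illustration of the kind of integration the prototype performs, the sketch below uses the rdflib library to link a hypothetical IFC entity to a facility-management record and query both together. The namespaces, class names and data are invented for the example, not taken from the paper or the IFC ontology:

```python
# Illustrative sketch: expressing an IFC building element and a linked
# maintenance record as RDF, so facility-management data can be queried
# alongside the building model. All URIs and names are placeholders.
from rdflib import Graph, Namespace, Literal, RDF

IFC = Namespace("http://example.org/ifc#")   # hypothetical IFC namespace
FM = Namespace("http://example.org/fm#")     # hypothetical FM namespace

g = Graph()
g.bind("ifc", IFC)
g.bind("fm", FM)

# An IFC space and a maintenance job attached to it.
g.add((IFC.ConcertHall, RDF.type, IFC.IfcSpace))
g.add((FM.Job42, RDF.type, FM.MaintenanceJob))
g.add((FM.Job42, FM.locatedIn, IFC.ConcertHall))
g.add((FM.Job42, FM.status, Literal("open")))

# SPARQL over the merged model + FM data: open jobs per IFC space.
q = """
SELECT ?space ?job WHERE {
  ?job a fm:MaintenanceJob ; fm:locatedIn ?space ; fm:status "open" .
  ?space a ifc:IfcSpace .
}"""
for row in g.query(q, initNs={"ifc": IFC, "fm": FM}):
    print(row.space, row.job)
```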

Relevance:

10.00%

Publisher:

Abstract:

Purpose: In this research we examined, by means of case studies, the mechanisms by which relationships can be managed and by which communication and cooperation can be enhanced in sustainable supply chains. The research was predicated on the contention that the development of a sustainable supply chain depends, in part, on the transfer of knowledge and capabilities from the larger players in the supply chain.

Design/Methodology/Approach: The research adopted a triangulated approach in which quantitative data were collected by questionnaire, interviews were conducted to explore and enrich the quantitative data, and case studies were undertaken to illustrate and validate the findings. Handy's (1985) view of organisational culture, Allen and Meyer's (1990) concepts of organisational commitment and Van de Ven and Ferry's (1980) measures of organisational structuring were combined into a model to test and explain how collaborative mechanisms can affect supply chain sustainability.

Findings: The degree of match or mismatch between organisational culture and structure has an impact on staff commitment levels. A sustainable supply chain depends on convergence, that is, the match between organisational structuring, organisational culture and organisational commitment.

Research limitations/implications: The study is a proof of concept, and three case studies have been used to illustrate the nature of the model developed. Further testing and refinement of the model in practice should be the next step in this research.

Practical implications: The concept of relationship management needs to filter down to all levels in the supply chain if participants are to retain commitment and buy-in to the relationship. A sustainable supply chain requires proactive relationship management and the development of an appropriate organisational culture and trust. By legitimising individuals' expectations of the type of culture appropriate to their company, and by empowering employees to address mismatches that may occur, a situation can be created whereby the collaborating organisations develop their competences symbiotically and so facilitate a sustainable supply chain.

Originality/value: The culture/commitment/structure model, developed from three separate strands of management thought, has proved to be a powerful tool for analysing collaboration in supply chains and explaining how and why some supply chains are sustainable and others are not.

Relevance:

10.00%

Publisher:

Abstract:

Real-Time Kinematic (RTK) positioning is a technique used to provide precise positioning services at the centimetre accuracy level in the context of Global Navigation Satellite Systems (GNSS). While a Network-based RTK (NRTK) system involves multiple continuously operating reference stations (CORS), the simplest form of an NRTK system is single-base RTK. In Australia there are several NRTK services operating in different states, and over 1000 single-base RTK systems supporting precise positioning applications for surveying, mining, agriculture and civil construction in regional areas. Additionally, future-generation GNSS constellations with multiple frequencies, including modernised GPS, Galileo, GLONASS and Compass, have either been developed or will become fully operational in the next decade. A trend in the future development of RTK systems is to make use of the various isolated network and single-base RTK systems, together with multiple GNSS constellations, for extended service coverage and improved performance. Several computational challenges have been identified for future NRTK services:

• multiple GNSS constellations and multiple frequencies;
• large-scale, wide-area NRTK services with a network of networks;
• complex computation algorithms and processes;
• a greater part of the positioning process shifting from the user end to the network centre, with the ability to cope with hundreds of simultaneous user requests (reverse RTK).

These challenges translate into two major requirements for NRTK data processing: expandable computing power and scalable data sharing/transfer capability. This research explores new approaches to addressing these future NRTK challenges and requirements using Grid Computing, in particular for large data-processing burdens and complex computation algorithms. A Grid Computing based NRTK framework is proposed, consisting of three layers: 1) a client layer in the form of a Grid portal; 2) a service layer; and 3) an execution layer. A user's request is passed through these layers and scheduled to different Grid nodes in the network infrastructure. A proof-of-concept demonstration of the proposed framework was performed in a five-node Grid environment at QUT and on Grid Australia. The open-source Networked Transport of RTCM via Internet Protocol (Ntrip) software was adopted to download real-time RTCM data from multiple reference stations over the Internet, followed by job scheduling and simplified RTK computation. The system performance has been analysed, and the results preliminarily demonstrate the concepts and functionality of the new Grid Computing based NRTK framework, although some aspects of system performance remain to be improved in future work.
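
As a concrete illustration of the data-acquisition step, the sketch below shows a minimal NTRIP v1 client in Python that requests an RTCM stream from a caster. The host, mountpoint and credentials are placeholders, and process_rtcm is a hypothetical stand-in for the framework's job scheduling and RTK computation:

```python
# Minimal NTRIP v1 client sketch for pulling a real-time RTCM stream
# from a caster, the role played by the open-source Ntrip tools above.
# Host, port, mountpoint and credentials are placeholders.
import base64
import socket

HOST, PORT, MOUNT = "ntrip.example.org", 2101, "MOUNTPT"
auth = base64.b64encode(b"user:password").decode()

req = (f"GET /{MOUNT} HTTP/1.0\r\n"
       f"User-Agent: NTRIP demo-client/1.0\r\n"
       f"Authorization: Basic {auth}\r\n\r\n")

def process_rtcm(chunk: bytes) -> None:
    # Placeholder: a real deployment would parse RTCM frames and hand
    # them to the Grid job scheduler for RTK computation.
    print(f"received {len(chunk)} bytes of RTCM")

with socket.create_connection((HOST, PORT), timeout=10) as sock:
    sock.sendall(req.encode())
    header = sock.recv(1024)
    if not header.startswith(b"ICY 200 OK"):
        raise RuntimeError(f"caster refused stream: {header!r}")
    while True:                     # raw RTCM frames follow the header
        chunk = sock.recv(4096)
        if not chunk:
            break
        process_rtcm(chunk)
```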

Relevance:

10.00%

Publisher:

Abstract:

Government figures put the current indigenous unemployment rate at around 23%, three times the rate for other Australians. This thesis aims to assess whether Australian indirect discrimination legislation can provide a remedy for one of the causes of indigenous unemployment: the systemic discrimination that can result from the mere operation of established procedures of recruitment and hiring. The impact of those practices on indigenous people is examined in the context of an analysis of anti-discrimination legislation and cases from all Australian jurisdictions, from the passing of the Racial Discrimination Act by the Commonwealth in 1975 to the present. The thesis finds a number of reasons why the legislation fails to provide equality of opportunity for indigenous people seeking to enter the workforce. In nearly all jurisdictions it is obscurely drafted, used mainly by educated middle-class white women, and provides remedies that tend to be compensatory damages rather than changes to recruitment policy. White dominance of the legal process has produced legislative and judicial definitions of "race" and "Aboriginality" which focus on biology rather than cultural difference. In the commissions and tribunals, complaints of racial discrimination are often rejected as "vexatious" or "frivolous", for not reaching the required standard of proof, or for not showing a causal connection between race and the conduct complained of. In all jurisdictions the cornerstone of liability is whether a particular employment term, condition or practice is reasonable. The thesis evaluates the approaches taken by appellate courts, including the High Court, and concludes that there is a trend towards an interpretation of reasonableness which favours employer arguments such as economic rationalism, the maintenance of good industrial relations, managerial prerogative to hire and fire, and the protection of majority rights. The thesis recommends that separate, clearly drafted legislation be passed to address indigenous disadvantage, and that indigenous people be involved in all stages of the process.

Relevance:

10.00%

Publisher:

Abstract:

In this study, the authors propose a novel video stabilisation algorithm for mobile platforms with moving objects in the scene. The quality of video obtained from mobile platforms, such as unmanned airborne vehicles, suffers from jitter caused by several factors. In order to remove this undesired jitter, accurate estimation of global motion is essential. However, it is difficult to estimate global motion accurately from mobile platforms due to increased estimation errors and noise. Additionally, large moving objects in the video scene contribute to the estimation errors. Currently, only very few motion estimation algorithms have been developed for video collected from mobile platforms, and this paper shows that these algorithms fail when there are large moving objects in the scene. In this study, a theoretical proof is provided that the use of delta optical flow can improve the robustness of video stabilisation in the presence of large moving objects. The authors also propose using sorted arrays of local motions, together with the selection of feature points, to separate outliers from inliers. The proposed algorithm is tested on six video sequences, collected from one fixed platform, four mobile platforms and one synthetic video, of which three contain large moving objects. Experiments show that the proposed algorithm performs well on all of these video sequences.
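
The role of sorted arrays of local motions in separating outliers from inliers can be illustrated with a simple trimmed estimate of global translation. This sketch is schematic, under invented data, and not the authors' exact algorithm:

```python
# Illustrative sketch: estimating a robust global translation from noisy
# per-feature motion vectors by sorting them and discarding the tails,
# so a large moving object (outlier motions) does not bias the estimate
# used for stabilisation.
import numpy as np

def robust_global_motion(flows: np.ndarray, trim: float = 0.2) -> np.ndarray:
    """flows: (N, 2) array of per-feature (dx, dy) local motions."""
    est = []
    for axis in range(2):
        v = np.sort(flows[:, axis])      # sorted array of local motions
        k = int(len(v) * trim)
        core = v[k: len(v) - k]          # inliers after trimming both tails
        est.append(core.mean())
    return np.asarray(est)

# Background moves by (2, -1); a large object adds conflicting vectors.
rng = np.random.default_rng(0)
background = rng.normal([2.0, -1.0], 0.3, size=(80, 2))
moving_obj = rng.normal([9.0, 4.0], 0.3, size=(20, 2))
flows = np.vstack([background, moving_obj])
print(robust_global_motion(flows))       # close to [2, -1]
```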

Relevance:

10.00%

Publisher:

Abstract:

Privacy enhancing protocols (PEPs) are a family of protocols that allow the secure exchange and management of sensitive user information. They are important in preserving users’ privacy in today’s open environment. Proof of the correctness of PEPs is necessary before they can be deployed. However, the traditional provable security approach, though well established for verifying cryptographic primitives, is not applicable to PEPs. We apply the formal method of Coloured Petri Nets (CPNs) to construct an executable specification of a representative PEP, namely the Private Information Escrow Bound to Multiple Conditions Protocol (PIEMCP). The formal semantics of the CPN specification allow us to reason about various security properties of PIEMCP using state space analysis techniques. This investigation provides preliminary insights for the modeling and verification of PEPs in general, demonstrating the benefit of applying a CPN-based formal approach to proving protocol correctness.
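
State space analysis of a CPN model is normally performed in a tool such as CPN Tools. Purely as a schematic illustration of the underlying idea, the Python sketch below enumerates the reachable states of a toy escrow model and checks a safety property in each state; the model and property are invented for illustration, not taken from PIEMCP:

```python
# Schematic illustration only: explicit-state reachability, the idea
# underlying CPN state space analysis. Toy state: a tuple
# (escrowed, conditions_met, released).
from collections import deque

def reachable_states(initial, transitions):
    """BFS over all states reachable via enabled transitions."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        state = frontier.popleft()
        for fire in transitions:
            for nxt in fire(state):
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
    return seen

def release(s):
    escrowed, met, released = s
    return [(escrowed, met, True)] if escrowed and met else []

def satisfy(s):
    escrowed, met, released = s
    return [(escrowed, True, released)] if not met else []

states = reachable_states((True, False, False), [release, satisfy])
# Safety property: escrowed data is never released before the
# conditions are met, in any reachable state.
assert all(met for (_, met, rel) in states if rel)
print(f"{len(states)} reachable states, property verified")
```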

Relevance:

10.00%

Publisher:

Abstract:

The researcher’s professional role as an Education Officer was the impetus for this study. Designing and implementing professional development activities is a significant component of the researcher’s position description, and as a result of reflection and feedback from participants and colleagues, the creation of a more effective model of professional development became the focus of this study. Few studies have examined all three links between the purposes of professional development, that is, increasing teacher knowledge, improving teacher practice, and improving student outcomes. This study is significant in that it investigates the nature of the growth of teachers who participated in a model of professional development based upon the principles of Lesson Study. The research provides qualitative and empirical data to establish links between teacher knowledge, teacher practice, and student learning outcomes. Teacher knowledge in this study refers to mathematics content knowledge as well as pedagogical content knowledge. The outcomes for students include achievement, attitudinal, and behavioural outcomes. As the study was conducted at one school site, existence-proof research was the focus of the methodology and data collection. Running over the 2007 school year, with five teacher-participants and approximately 160 students from Year Levels 6 to 9, the Lesson Study-principled model of professional development provided the teacher-participants with on-site, ongoing, and reflective learning based on their classroom environment. The focus area for the professional development was strategies for engaging with and solving worded mathematics problems. A design experiment was used to develop the professional development as an intervention in prevailing teacher practice, for which data were collected before and after the period of intervention. A model of teacher change was developed as an underpinning framework for the study, and was useful in making decisions about data collection and analyses. Data sources consisted of questionnaires, pre-tests and post-tests, interviews, and researcher observations and field notes. The data clearly showed that content knowledge and pedagogical content knowledge increased among the teacher-participants, that teacher practice changed in a positive manner, and that a majority of students demonstrated improved learning outcomes. The positive changes to teacher practice are described in this study as the demonstrated use of mixed pedagogical practices, rather than a polarisation towards either traditional or contemporary pedagogical practices. The improvement in student learning outcomes was most evident in achievement, as indicated by the comparison of pre-test and post-test scores. The effectiveness of the Lesson Study-principled model of professional development used in this study was evaluated using Guskey’s (2005) Five Levels of Professional Development Evaluation.

Relevance:

10.00%

Publisher:

Abstract:

A wide range of screening strategies has been employed to isolate antibodies and other proteins with specific attributes, including binding affinity, specificity, stability and improved expression. However, there remains no high-throughput system to screen for target-binding proteins in a mammalian, intracellular environment. Such a system would allow binding reagents to be isolated against intracellular clinical targets such as cell signalling proteins associated with tumour formation (p53, ras, cyclin E), proteins associated with neurodegenerative disorders (huntingtin, beta-amyloid precursor protein), and various proteins crucial to viral replication (e.g. HIV-1 proteins such as Tat, Rev and Vif-1), which are difficult to screen by phage, ribosome or cell-surface display. This study used the β-lactamase protein complementation assay (PCA) as the display and selection component of a system for screening a protein library in the cytoplasm of HEK 293T cells. The colicin E7 (ColE7) and Immunity protein 7 (Imm7) Escherichia coli proteins were used as model interaction partners for developing the system. These proteins drove effective β-lactamase complementation, resulting in a signal-to-noise ratio (9:1 to 13:1) comparable to that of other β-lactamase PCAs described in the literature. The model Imm7-ColE7 interaction was then used to validate protocols for library screening. Single positive cells harbouring the Imm7 and ColE7 binding partners were identified and isolated using flow cytometric cell sorting in combination with the fluorescent β-lactamase substrate CCF2/AM. A single-cell PCR was then used to amplify the Imm7 coding sequence directly from each sorted cell. With the screening system validated, it was used to screen a protein library based on the Imm7 scaffold against a proof-of-principle target. The wild-type Imm7 sequence, as well as mutants with wild-type residues in the ColE7-binding loop, were enriched from the library after a single round of selection, which is consistent with other eukaryotic screening systems such as yeast and mammalian cell-surface display. In summary, this thesis describes a new technology for screening protein libraries in a mammalian, intracellular environment. This system has the potential to complement existing screening technologies by allowing access to intracellular proteins and expanding the range of targets available to the pharmaceutical industry.

Relevance:

10.00%

Publisher:

Abstract:

Computer forensics is the process of gathering and analysing evidence from computer systems to aid in the investigation of a crime. Typically, such investigations are undertaken by human forensic examiners using purpose-built software to discover evidence from a computer disk. This process is a manual one, and the time it takes a forensic examiner to conduct such an investigation is proportional to the storage capacity of the computer's disk drives. The heterogeneity and complexity of the various data formats stored on modern computer systems compound the problems posed by the sheer volume of data. The decision to undertake a computer forensic examination of a computer system is a decision to commit significant quantities of a human examiner's time. Where there is no prior knowledge of the information contained on a computer system, this commitment of time and energy occurs with little idea of the potential benefit to the investigation. The key contribution of this research is the design and development of an automated process to describe a computer system and its activity for the purposes of a computer forensic investigation; the term proposed for this process is computer profiling. A model of a computer system and its activity has been developed over the course of this research. Using this model, a computer system under investigation can be automatically described in terms useful to a forensic investigator. The computer profiling process is resilient to attempts to disguise malicious computer activity. This resilience is achieved by detecting inconsistencies in the information used to infer the apparent activity of the computer. The practicality of the computer profiling process has been demonstrated by a proof-of-concept software implementation. The model and the prototype implementation were tested with data from real computer systems. The resilience of the process to attempts to disguise malicious activity has also been demonstrated in practical experiments conducted with the same prototype software implementation.
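
As a schematic illustration of inconsistency-based resilience (not the thesis's actual model), the sketch below cross-checks two independent information sources about a file's history and flags records whose implied timeline is impossible:

```python
# Schematic only: profiling flags a computer's apparent activity as
# suspect when independent information sources disagree, e.g. a file's
# modification time predating its creation time, or a log entry
# referencing a file before metadata says it existed.
from dataclasses import dataclass

@dataclass
class FileRecord:
    path: str
    created: float       # epoch seconds from filesystem metadata
    modified: float
    first_logged: float  # earliest reference in application logs

def inconsistencies(records):
    findings = []
    for r in records:
        if r.modified < r.created:
            findings.append((r.path, "modified before created"))
        if r.first_logged < r.created:
            findings.append((r.path, "logged before created"))
    return findings

records = [
    FileRecord("report.doc", created=1000.0, modified=900.0, first_logged=1200.0),
    FileRecord("notes.txt", created=500.0, modified=600.0, first_logged=550.0),
]
print(inconsistencies(records))   # flags report.doc as inconsistent
```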

Relevance:

10.00%

Publisher:

Abstract:

Monitoring Internet traffic is critical to acquiring a good understanding of threats to computer and network security and to designing efficient computer security systems. Researchers and network administrators have applied several approaches to monitoring traffic for malicious content, including monitoring network components, aggregating IDS alerts, and monitoring unused IP address spaces. Another method for monitoring and analyzing malicious traffic, which has been widely tried and accepted, is the use of honeypots. Honeypots are very valuable security resources for gathering artefacts associated with a variety of Internet attack activities. As honeypots run no production services, any contact with them is by definition potentially malicious or suspicious. This unique characteristic reduces the amount of collected traffic and makes honeypots a more valuable source of information than other existing techniques. Currently, there is insufficient research in the field of honeypot data analysis. To date, most work on honeypots has been devoted to designing new honeypots or optimizing current ones. Approaches for analyzing data collected from honeypots, especially low-interaction honeypots, are presently immature: analysis techniques are manual and focus mainly on identifying existing attacks. This research addresses the need for more advanced techniques for analyzing Internet traffic data collected from low-interaction honeypots. We believe that characterizing honeypot traffic will improve the security of networks and, if the honeypot data is handled in time, give early warning of new vulnerabilities or outbreaks of new automated malicious code, such as worms. The outcomes of this research include:

• identification of the repeated use of attack tools and attack processes, by grouping activities that exhibit similar packet inter-arrival time distributions using the cliquing algorithm;
• application of principal component analysis to detect the structure of attackers’ activities present in low-interaction honeypots and to visualize attackers’ behaviors;
• detection of new attacks in low-interaction honeypot traffic through the use of the principal components’ residual space and the squared prediction error (SPE) statistic;
• real-time detection of new attacks using recursive principal component analysis;
• a proof-of-concept implementation for honeypot traffic analysis and real-time monitoring.
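
The residual-space idea in the third outcome can be sketched briefly. The following Python example, using synthetic data rather than the thesis's honeypot features, fits a principal component subspace to "normal" traffic feature vectors and flags new observations whose squared prediction error exceeds an empirical threshold:

```python
# Minimal sketch of residual-space anomaly detection: fit PCA on normal
# traffic features, then flag observations whose squared prediction
# error (SPE) -- the squared norm of the residual outside the retained
# components -- is large. Data here is synthetic and illustrative.
import numpy as np

rng = np.random.default_rng(1)
normal = rng.normal(size=(500, 6)) @ rng.normal(size=(6, 6))  # correlated features

mu = normal.mean(axis=0)
X = normal - mu
_, _, Vt = np.linalg.svd(X, full_matrices=False)
P = Vt[:3].T                              # retain 3 principal components

def spe(x):
    r = (x - mu) - P @ (P.T @ (x - mu))   # residual outside the PC subspace
    return float(r @ r)

threshold = np.quantile([spe(x) for x in normal], 0.99)

new_attack = mu + rng.normal(size=6) * 10  # pattern unlike training traffic
print(spe(new_attack) > threshold)         # likely True -> raise an alert
```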

Relevance:

10.00%

Publisher:

Abstract:

The purpose of this paper is to summarize the outcomes of a detailed research study, carried out as part of a doctoral programme, which examined the relationships between, and impacts of, organisational culture on construction performance in a Hong Kong context. The research used a mixed-methodology approach consisting of an organisational culture survey using an adapted, validated and reliable measurement instrument (the Denison Organisational Culture Survey) and mini-case studies in four Hong Kong construction companies, and correlated the derived culture scores against performance scores measured by the Hong Kong Housing Department Performance Assessment Scoring System (PASS). The significance of the research lies in advancing knowledge of the importance of organisational culture strength as a performance driver in the construction industry, and in providing further proof of the culture-performance link using a set of performance measures that were not financially based. The findings contribute to theory by further validating the work of Denison (1990) and others: not only was a link between organisational culture and performance demonstrated, but the research also identifies particular cultural factors in organisations that appear to be significantly responsible for achieving successful outcomes, and reveals opportunities for further research into the organisational culture of construction companies.

Keywords: organisational culture, construction performance, business success.

Relevance:

10.00%

Publisher:

Abstract:

Botnets are large networks of compromised machines under the control of a bot master. These botnets constantly evolve their defences to allow the continuation of their malicious activities. The constant development of new botnet mitigation strategies and their subsequent defensive countermeasures has led to a technological arms race, one which the bot masters have significant incentives to win. This dissertation analyzes the current and future states of the botnet arms race by introducing a taxonomy of botnet defences and a simulation framework for evaluating botnet techniques. The taxonomy covers current botnet techniques and highlights possible future techniques for further analysis under the simulation framework. This framework allows evaluation of the effect that techniques such as reputation systems and proof-of-work schemes have on the resources required to disable a peer-to-peer botnet. Given the increase in the resources required, our results suggest that the prospects of eliminating the botnet threat are limited.
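
A hashcash-style proof-of-work scheme, the kind evaluated under the framework, can be sketched in a few lines of Python; the message format and difficulty below are illustrative only:

```python
# Hashcash-style proof of work: a peer must find a nonce whose hash
# falls below a target before its messages are accepted, raising the
# cost of flooding or infiltrating a peer-to-peer botnet's channel.
import hashlib
from itertools import count

def solve(message: bytes, difficulty_bits: int) -> int:
    target = 1 << (256 - difficulty_bits)
    for nonce in count():
        digest = hashlib.sha256(message + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce

def verify(message: bytes, nonce: int, difficulty_bits: int) -> bool:
    digest = hashlib.sha256(message + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))

nonce = solve(b"peer-hello", 16)       # ~65k hash attempts on average
assert verify(b"peer-hello", nonce, 16)
```

Raising difficulty_bits doubles the expected solving cost per bit while verification stays a single hash, which is the asymmetry such schemes exploit against high-volume botnet traffic.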

Relevance:

10.00%

Publisher:

Abstract:

Berridge's model (e.g. Berridge KC. Food reward: brain substrates of wanting and liking. Neurosci Biobehav Rev 1996;20:1–25; Berridge KC, Robinson TE. Parsing reward. Trends Neurosci 2003;26:507–513; Berridge KC. Motivation concepts in behavioral neuroscience. Physiol Behav 2004;81:179–209) outlines the brain substrates thought to mediate food reward, with distinct ‘liking’ (hedonic/affective) and ‘wanting’ (incentive salience/motivation) components. Understanding these dual aspects of food reward could throw light on food choice, appetite control and overconsumption. The present study reports the development of a procedure to measure these processes in humans. A computer-based paradigm was used to assess ‘liking’ (through pleasantness ratings) and ‘wanting’ (through a forced-choice photographic procedure) for foods that varied in fat content (high or low) and taste (savoury or sweet). Sixty participants completed the program when hungry and again after an ad libitum meal. The findings indicate a state-dependent (hungry versus satiated), partial dissociation between ‘liking’ and ‘wanting’ for generic food categories. In the hungry state, participants ‘wanted’ high-fat savoury > low-fat savoury with no corresponding difference in ‘liking’, and ‘liked’ high-fat sweet > low-fat sweet but did not differ in ‘wanting’ for these foods. In the satiated state, participants ‘liked’, but did not ‘want’, high-fat savoury > low-fat savoury, and ‘wanted’ but did not ‘like’ low-fat sweet > high-fat sweet. More differences in ‘liking’ and ‘wanting’ were observed when hungry than when satiated. This procedure provides a first proof of concept that ‘liking’ and ‘wanting’ can be dissociated in humans, and it can be further developed for foods varying along other dimensions. Other experimental procedures may also be devised to separate ‘liking’ and ‘wanting’.

Relevance:

10.00%

Publisher:

Abstract:

Offering service bundles to the market is a promising option for service providers to strengthen their competitive advantage, cope with dynamic market conditions and deal with heterogeneous consumer demand. Although the expected positive effects of bundling strategies and pricing considerations for bundles are well covered by the available literature, limited guidance can be found regarding the identification of potential bundle candidates and the actual process of bundling. The proposed research aims to fill this gap by offering a service bundling method complemented by a proof-of-concept prototype, which extends the existing knowledge base in the multidisciplinary research area of Information Systems and Service Science and provides organisations with a structured approach to bundling services.