355 results for Renormalization schemes


Relevance:

10.00%

Publisher:

Abstract:

Delegation is a powerful mechanism for providing flexible and dynamic access control decisions. It is particularly useful in federated environments, where multiple systems, each with its own security autonomy, are connected under one common federation. Although many delegation schemes have been studied, current models do not seriously address the delegation commitments of the involved parties. To address this issue, this paper introduces a new mechanism that helps the parties involved in the delegation process to express commitment constraints, perform the commitments and track the committed actions. The mechanism covers two aspects: pre-delegation commitment and post-delegation commitment. In the pre-delegation phase, the involved parties express the delegation constraints and address those constraints. In the post-delegation phase, they inform the delegator and service providers how the commitments were conducted. The mechanism utilises a modified SAML assertion structure to support the proposed delegation and constraint approach.
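As a rough illustration of the idea, a SAML-style assertion could carry pre- and post-delegation commitment elements. The element and attribute names below are hypothetical and do not reproduce the paper's actual schema:

```python
import xml.etree.ElementTree as ET

def build_delegation_assertion(delegator, delegatee, pre_constraints, report_to):
    """Sketch of a SAML-style assertion extended with delegation-commitment
    elements (names here are illustrative, not the paper's modified schema)."""
    assertion = ET.Element("Assertion", Issuer=delegator)
    subject = ET.SubElement(assertion, "Subject")
    subject.text = delegatee
    # pre-delegation: constraints the delegatee must commit to before delegation
    pre = ET.SubElement(assertion, "PreDelegationCommitment")
    for name, value in pre_constraints.items():
        ET.SubElement(pre, "Constraint", Name=name, Value=value)
    # post-delegation: where the delegatee reports how commitments were conducted
    ET.SubElement(assertion, "PostDelegationCommitment", ReportTo=report_to)
    return ET.tostring(assertion, encoding="unicode")

xml_str = build_delegation_assertion(
    "https://idp.example.org", "alice",
    {"MaxDepth": "1", "Expiry": "2024-01-01T00:00:00Z"},
    "https://sp.example.org/commitment-log")
```

A relying party would then validate these elements alongside the usual SAML conditions before accepting the delegated credential.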

Relevance:

10.00%

Publisher:

Abstract:

We aim to demonstrate unaided visual 3D pose estimation and map reconstruction using both monocular and stereo vision techniques. To date, our work has focused on collecting data from Unmanned Aerial Vehicles, which generates a number of significant issues specific to the application. Such issues include scene reconstruction degeneracy from planar data, poor structure initialisation for monocular schemes and difficult 3D reconstruction due to high feature covariance. Most modern Visual Odometry (VO) and related SLAM systems make use of a number of sensors to inform pose and map generation, including laser range-finders, radar, inertial units and vision [1]. By fusing sensor inputs, the advantages and deficiencies of each sensor type can be handled in an efficient manner. However, many of these sensors are costly and each adds to the complexity of such robotic systems. With continual advances in the capabilities, small size, passivity and low cost of visual sensors, along with the dense, information-rich data that they provide, our research focuses on the use of unaided vision to generate pose estimates and maps from robotic platforms. We propose that highly accurate (±5 cm) dense 3D reconstructions of large-scale environments can be obtained, in addition to the localisation of the platform described in other work [2]. Using images taken from cameras, our algorithm simultaneously generates an initial visual odometry estimate and scene reconstruction from visible features, then passes this estimate to a bundle-adjustment routine to optimise the solution. From this optimised scene structure and the original images, we aim to create a detailed, textured reconstruction of the scene. By applying such techniques to a unique airborne scenario, we hope to expose new robotic applications of SLAM techniques. The ability to obtain highly accurate 3D measurements of an environment at low cost is critical in a number of agricultural and urban monitoring situations.
We focus on cameras because such sensors are small, cheap and lightweight and can therefore be deployed in smaller aerial vehicles. This, coupled with the ability of small aerial vehicles to fly near to the ground in a controlled fashion, will assist in increasing the effective resolution of the reconstructed maps.
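The optimisation step mentioned above minimises reprojection error. A minimal sketch of the residual that a bundle-adjustment routine drives towards zero, assuming a simple pinhole camera model (the abstract does not specify the actual camera model or solver):

```python
import numpy as np

def reprojection_residuals(K, R, t, points_3d, observations):
    """Residuals minimised by bundle adjustment: the difference between
    observed image features and reprojected 3D scene points.
    K: 3x3 intrinsics, R/t: camera pose, points_3d: Nx3, observations: Nx2."""
    cam = R @ points_3d.T + t[:, None]     # world frame -> camera frame
    proj = K @ cam                         # pinhole projection
    uv = (proj[:2] / proj[2]).T            # perspective divide -> pixel coords
    return (uv - observations).ravel()

# toy setup: identity pose, two points, observations generated by the same model
K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.zeros(3)
pts = np.array([[0.0, 0.0, 2.0], [0.1, -0.1, 3.0]])
obs = (K @ pts.T / pts.T[2]).T[:, :2]
res = reprojection_residuals(K, R, t, pts, obs)   # zero at the optimum
```

In a full pipeline, a nonlinear least-squares solver would jointly adjust all camera poses and 3D points to minimise the stacked residuals over every feature track.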

Relevance:

10.00%

Publisher:

Abstract:

With the growing importance of sustainability assessment in the construction industry, many green building rating schemes have been adopted in the Australian building sector. The infrastructure sector, however, has been markedly slower to adopt similar schemes, and this prolonged delay in practice poses a challenge in mapping project objectives to sustainability outcomes. In responding to the challenge of sustainable development in infrastructure, it is critical to create a set of decision indicators for sustainability in infrastructure, to be used in conjunction with the emerging infrastructure sustainability assessment framework of the Australian Green Infrastructure Council. Various literature sources confirm the lack of correlation between sustainability and infrastructure. This theoretical missing link signals the need to validate the interrelationships and interdependencies among sustainability, decision making and infrastructure; this validation is vital for the development of decision indicators for sustainability in infrastructure. Given serious socio-environmental vulnerability, the traditional economic emphasis in infrastructure development needs to shift towards decisions for sustainability that enhance positive social and environmental outcomes. Moreover, the research findings suggest that sustainability acts as a powerful socio-political and socio-environmental driver in deciding infrastructure needs and development. The newly developed sustainability decision indicators create the impetus for change towards sustainability in infrastructure by integrating societal and environmental concerns into a holistic financial consideration. This development seeks to transform principles into actions for infrastructure sustainability.
Lastly, the thesis concludes with knowledge contributions in five significant areas and future research opportunities. The consolidated research outcomes suggest that the development of decision indicators has demonstrated sustainability as a pivotal driver for decision making in infrastructure.

Relevance:

10.00%

Publisher:

Abstract:

Graduated licensing schemes have been found to reduce the crash risk of young novice drivers, but there is less evidence of their success with novice motorcycle riders. This study examined the riding experience of a sample of Australian learner riders to establish the extent and variety of their riding practice during the learner stage. Riders completed an anonymous questionnaire at a compulsory rider-training course for the licensing test. The majority of participants were male (81%) with an average age of 33 years. They worked full time (81%), held an unrestricted driver's license (81%), and owned the motorcycle that they rode (79%). These riders had held their learner's license for an average of 6 months and rode, on average, 6.4 h/week. By the time they attempted the rider-licensing test, they had ridden a total of 101 h. Their total hours of on-road practice were comparable to those of learner drivers at the same stage of licensing, but they had less experience in adverse or challenging road conditions. A substantial proportion had little or no experience of riding in the rain (57%), at night (36%), in heavy traffic (22%), on winding rural roads (52%), or on high-speed roads (51%). These findings highlight the differences in the learning processes of unsupervised novice motorcycle riders and supervised novice drivers. Further research is necessary to clarify whether specifying the conditions under which riders should practice during the graduated licensing process would reduce or increase their crash risk.

Relevance:

10.00%

Publisher:

Abstract:

Methicillin-resistant Staphylococcus aureus (MRSA) is a pathogen that continues to be of major concern in hospitals. We develop models and computational schemes, based on observed weekly incidence data, to estimate MRSA transmission parameters. We extend the deterministic model of McBryde, Pettitt, and McElwain (2007, Journal of Theoretical Biology 245, 470–481), which involves an underlying population of MRSA-colonized patients and health-care workers and describes, among other processes, transmission between uncolonized patients and colonized health-care workers and vice versa. We develop new bivariate and trivariate Markov models that include incidence, so that estimated transmission rates can be based directly on new colonizations rather than indirectly on prevalence. Imperfect sensitivity of pathogen detection is modeled using a hidden Markov process. The advantages of our approach include (i) a discrete-valued assumption for the number of colonized health-care workers, (ii) the incorporation of two transmission parameters into the likelihood, (iii) a likelihood that depends on the number of new cases, improving the precision of inference, (iv) no requirement for individual patient records, and (v) the incorporation of possibly imperfect detection of colonization. We compare our approach with that used by McBryde et al. (2007), which is based on an approximation that eliminates the health-care workers from the model and uses Markov chain Monte Carlo with individual patient data. We apply these models to MRSA colonization data collected in a small intensive care unit at the Princess Alexandra Hospital, Brisbane, Australia.
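To illustrate the hidden Markov treatment of imperfect detection, the sketch below evaluates a forward-algorithm likelihood in which a hidden weekly count is observed through binomial "thinning" with a detection sensitivity. The two-state chain, transition probabilities and data are toy values for illustration, not the paper's fitted model:

```python
from math import comb

def binom_pmf(k, n, p):
    """Binomial pmf; zero outside the support."""
    return comb(n, k) * p**k * (1 - p)**(n - k) if 0 <= k <= n else 0.0

def forward_likelihood(trans, sensitivity, init, observed):
    """HMM forward algorithm: hidden state s is the true weekly count of new
    colonizations; the observed count is Binomial(s, sensitivity)."""
    n_states = len(init)
    alpha = [init[s] * binom_pmf(observed[0], s, sensitivity)
             for s in range(n_states)]
    for y in observed[1:]:
        alpha = [sum(alpha[r] * trans[r][s] for r in range(n_states))
                 * binom_pmf(y, s, sensitivity) for s in range(n_states)]
    return sum(alpha)   # marginal likelihood of the observed series

trans = [[0.7, 0.3], [0.4, 0.6]]       # toy 2-state transition matrix
like = forward_likelihood(trans, 0.8, [0.5, 0.5], [0, 1, 1])
```

In an inference setting the transmission parameters would enter through the transition probabilities, and the likelihood above would be maximised or embedded in a Bayesian sampler.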

Relevance:

10.00%

Publisher:

Abstract:

The Subtropical Design Handbook for Planners is primarily intended to provide advice in developing planning schemes to achieve the South East Queensland Regional Plan’s vision. This calls for ‘development which is sustainable and well-designed, and where the subtropical character of the region is recognised and reinforced’.

Relevance:

10.00%

Publisher:

Abstract:

The flying capacitor multicell inverter (FCMI) possesses a natural balancing property. With the phase-shifted (PS) carrier-based scheme, natural balancing can be achieved in a straightforward manner. However, to achieve natural balancing with the harmonically optimal phase-disposition (PD) carrier-based scheme, conventional approaches require (n-1) x (n-1) trapezoidal carrier signals for an n-level inverter, which is (n-1) x (n-2) more than the standard PD scheme uses. This paper proposes two improved natural balancing strategies for the FCMI under the PD scheme, which use the same (n-1) carrier signals as the standard PD scheme. In the first strategy, the band in which the modulation signal is located, the corresponding period number of the carrier, and the rising or falling half-cycle of the carrier waveform are detected on-line to generate the switching signals according to certain rules. In the second strategy, the output voltage level is selected first and the switching signals are then generated according to a rule based on a preferential cell-selection algorithm. These methods are easy to use and simple to implement compared with other available methods. Simulation and experimental results are presented for a five-level inverter to verify the proposed schemes.
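A minimal sketch of PD modulation with the (n-1) stacked carriers that the proposed strategies retain: the instantaneous output level is simply the number of carriers that the modulation signal exceeds. The carrier frequency ratio and modulation index below are illustrative, and the cell-selection rules that actually achieve natural balancing are not reproduced here:

```python
import math

def pd_output_level(m, theta, n_levels=5, carrier_ratio=21):
    """Phase-disposition PWM sketch: (n-1) vertically stacked triangular
    carriers; the output level is the count of carriers below the reference."""
    n_car = n_levels - 1                      # 4 carriers for a 5-level inverter
    ref = m * math.sin(theta)                 # sinusoidal reference in [-1, 1]
    # common triangular carrier shape, later scaled to one band height
    phase = (carrier_ratio * theta / (2 * math.pi)) % 1.0
    tri = 4 * abs(phase - 0.5) - 1            # triangle wave in [-1, 1]
    level = 0
    for band in range(n_car):
        lo = -1 + 2 * band / n_car            # band k occupies [lo, lo + 2/n_car]
        carrier = lo + (tri + 1) / n_car
        if ref > carrier:
            level += 1
    return level                              # integer in 0 .. n_levels-1

levels = [pd_output_level(0.9, 2 * math.pi * k / 200) for k in range(200)]
```

The proposed strategies would then map each level to a particular cell combination so that the flying-capacitor voltages converge naturally.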

Relevance:

10.00%

Publisher:

Abstract:

Gradual authentication is a principle proposed by Meadows as a way to tackle denial-of-service attacks on network protocols by gradually increasing the confidence in clients before the server commits resources. In this paper, we propose an efficient method that allows a defending server to authenticate its clients gradually with the help of some fast-to-verify measures. Our method integrates hash-based client puzzles with a special class of digital signatures supporting fast verification. Our hash-based client puzzle provides finer granularity of difficulty and is proven secure in the puzzle difficulty model of Chen et al. (2009). We integrate this with the fast-verification digital signature scheme proposed by Bernstein (2000, 2008). These signature schemes can be up to 20 times faster for client authentication than RSA-based schemes. Our experimental results show that, in the Secure Sockets Layer (SSL) protocol, fast-verification digital signatures can provide a 7% increase in connections per second compared with RSA signatures, and that our integration of client puzzles with client authentication imposes no performance penalty on the server, since puzzle verification is part of signature verification.
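The hash-based puzzle idea can be sketched as follows: the server issues a fresh nonce and a difficulty, the client searches for a counter whose hash falls below a target, and the server verifies with a single hash. This is a generic construction for illustration only; the paper's puzzle differs in how it achieves finer-grained difficulty and its proof in the Chen et al. model:

```python
import hashlib
import itertools
import os

def make_puzzle(difficulty_bits):
    """Server side: a fresh random nonce plus the required difficulty."""
    return os.urandom(16), difficulty_bits

def solve(nonce, difficulty_bits):
    """Client side: brute-force a counter until the hash meets the target
    (expected work ~ 2**difficulty_bits hash evaluations)."""
    target = 1 << (256 - difficulty_bits)
    for counter in itertools.count():
        digest = hashlib.sha256(nonce + counter.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return counter

def verify(nonce, difficulty_bits, counter):
    """Server side: a single hash, far cheaper than solving."""
    digest = hashlib.sha256(nonce + counter.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))

nonce, bits = make_puzzle(8)     # ~2^8 expected attempts; cheap demo difficulty
solution = solve(nonce, bits)
ok = verify(nonce, bits, solution)
```

The asymmetry between `solve` and `verify` is what lets the server raise client cost under attack while keeping its own per-connection work nearly constant.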

Relevance:

10.00%

Publisher:

Abstract:

Technology-mediated collaboration processes have been extensively studied for over a decade. Most applications with collaboration concepts reported in the literature focus on enhancing the efficiency and effectiveness of decision-making processes in objective and well-structured workflows. Relatively few studies, however, have investigated the application of collaboration schemes to problems of a subjective and unstructured nature. In this paper, we explore a new intelligent collaboration scheme for fashion design, which by nature relies heavily on human judgment and creativity. Techniques such as multicriteria decision making, fuzzy logic, and artificial neural network (ANN) models are employed, and industrial data sets are used for the analysis. Our experimental results suggest that the proposed scheme significantly improves on the traditional method in terms of time-cost effectiveness, and a company interview with design professionals has confirmed its effectiveness and significance.

Relevance:

10.00%

Publisher:

Abstract:

Road congestion is one of the most significant problems facing any modern metropolitan area. For several decades now, congestion in metropolitan areas around the globe has been worsening for two main reasons. First, demand for road space has grown significantly with growth in populations, economic activity and incomes (Hensher & Puckett, 2007). This factor, combined with a significant lack of investment in new road and public transport infrastructure, has seen the road network capacities of cities exceeded by traffic volumes, resulting in increased traffic congestion. This relentless increase in road traffic congestion has dramatically increased costs both for road users and, ultimately, for the metropolitan areas concerned (Bureau of Transport and Regional Economics, 2007). In response, several major cities around the world, including London, Stockholm and Singapore, have implemented congestion-charging schemes to combat the effects of road congestion. A congestion-charging scheme provides a mechanism for regulating traffic flows into the congested areas of a city while simultaneously generating public revenue that can be used to improve both the public transport and road networks of the region. The aim of this paper is to assess the concept of congestion charging while reflecting on the experiences of various cities that have already implemented such systems. The findings from this paper have been used to inform the design of a congestion-charging scheme for the city of Brisbane, Australia, in a supplementary study (Whitehead, Bunker, & Chung, 2011). The first section of this paper examines the background to road congestion, the theory behind different congestion-charging schemes, and the various technologies involved.
The second section details the experiences of the city of Stockholm, Sweden, in implementing a congestion-charging scheme. This research has been crucial in forming a list of recommendations and lessons learnt for the design of a congestion-charging scheme in Australia. These recommendations directly inform the proposed design of the Brisbane Cordon Scheme detailed in Whitehead et al. (2011).

Relevance:

10.00%

Publisher:

Abstract:

Digital forensic examiners often need to identify the type of a file or file fragment based only on the content of the file. Content-based file type identification schemes typically use a byte frequency distribution with statistical machine learning to classify file types. Most algorithms analyze the entire file content to obtain the byte frequency distribution, a technique that is inefficient and time consuming. This paper proposes two techniques for reducing the classification time. The first technique selects a subset of features based on the frequency of occurrence. The second speeds classification by sampling several blocks from the file. Experimental results demonstrate that up to a fifteen-fold reduction in file size analysis time can be achieved with limited impact on accuracy.
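The block-sampling idea can be sketched in a few lines: build the 256-bin byte-frequency feature vector from a handful of fixed-size blocks rather than the whole file. The block size, block count and evenly spaced placement below are illustrative choices, not the paper's parameters:

```python
import collections

def sampled_byte_histogram(data, block_size=512, n_blocks=4):
    """Normalised byte-frequency distribution from a few sampled blocks.
    Small files are read in full; larger files contribute n_blocks blocks
    at evenly spaced offsets, so work is bounded regardless of file size."""
    if len(data) <= block_size * n_blocks:
        sample = data
    else:
        step = len(data) // n_blocks
        sample = b"".join(data[i * step:i * step + block_size]
                          for i in range(n_blocks))
    counts = collections.Counter(sample)
    total = len(sample) or 1
    return [counts.get(b, 0) / total for b in range(256)]  # 256-dim feature

hist = sampled_byte_histogram(bytes(range(256)) * 64)
```

The resulting fixed-length vector can then feed any statistical classifier; feature selection would further keep only the most frequently informative bins.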

Relevance:

10.00%

Publisher:

Abstract:

The overarching objective of the research was to identify the existence and nature of international legal principles governing sustainable forest use and management. The research intended to uncover a set of forest legal considerations that are relevant across the globe. The purpose behind this is to create a theoretical base of international forest law literature which can be drawn upon to inform future international forestry research. The research will be of relevance to those examining a particular forest issue or focusing on forests in a particular region. The thesis explains the underlying legal issues in forest regulation and the dominant international regulatory approaches, and makes suggestions as to how international and national forest policy could be improved.

Relevance:

10.00%

Publisher:

Abstract:

Different international plant protection organisations advocate different schemes for conducting pest risk assessments. Most of these schemes use a structured questionnaire in which experts are asked to score several items using an ordinal scale. The scores are then combined using a range of procedures, such as simple arithmetic means, weighted averages, multiplication of scores, and cumulative sums. The most useful schemes will correctly identify harmful pests and correctly clear those that are not. As the quality of a pest risk assessment can depend on the characteristics of the scoring system used by the risk assessors (i.e., on the number of points of the scale and on the method used for combining the component scores), it is important to assess and compare the performance of different scoring systems. In this article, we propose a new method for assessing scoring systems. Its principle is to simulate virtual data using a stochastic model and then to estimate sensitivity and specificity values for different scoring systems from these data. The interest of our approach is illustrated in a case study comparing several scoring systems. Data for this analysis were generated using a probabilistic model describing the pest introduction process. The generated data were then used to simulate the outcome of scoring systems and to assess the accuracy of decisions about positive and negative introductions. The results showed that ordinal scales with at most 5 or 6 points were sufficient and that multiplication-based scoring systems performed better than their sum-based counterparts. The proposed method could be used in the future to assess a great diversity of scoring systems.
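The simulation idea can be sketched as follows: draw pests with a known harmful/harmless status, generate ordinal item scores whose distribution shifts with that status, combine them by summation or by multiplication, and count correct decisions at a fixed threshold. The score distributions and thresholds below are illustrative stand-ins, not the article's pest introduction model:

```python
import random
from math import prod

def sens_spec(combine, threshold, n=2000, seed=7):
    """Monte-Carlo estimate of sensitivity/specificity for a scoring system.
    Harmful pests tend to score higher on each of four 5-point ordinal items."""
    rng = random.Random(seed)
    tp = fn = tn = fp = 0
    for _ in range(n):
        harmful = rng.random() < 0.5
        weights = [1, 1, 2, 3, 4] if harmful else [4, 3, 2, 1, 1]
        scores = rng.choices(range(1, 6), weights=weights, k=4)
        if combine(scores) >= threshold:     # flagged as harmful
            tp += harmful; fp += not harmful
        else:
            fn += harmful; tn += not harmful
    return tp / (tp + fn), tn / (tn + fp)

sens_sum, spec_sum = sens_spec(sum, threshold=13)   # mid-range sum of 4 items
sens_mul, spec_mul = sens_spec(prod, threshold=81)  # mid-range product (3^4)
```

Sweeping the thresholds would trace out an ROC-style comparison of the two combination rules, which is the spirit of the article's evaluation.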

Relevance:

10.00%

Publisher:

Abstract:

Recent research on bicycle helmets and concerns about how public bicycle hire schemes will function in the context of compulsory helmet wearing laws have drawn media attention. This monograph presents the results of research commissioned by the Queensland Department of Transport and Main Roads to review the national and international literature regarding the health outcomes of cycling and bicycle helmets and examine crash and hospital data. It also includes critical examinations of the methodology used by Voukelatos and Rissel (2010), and estimates the likely effects of possible segmented approaches to bicycle helmet wearing legislation. The research concludes that current bicycle helmet wearing rates are halving the number of head injuries experienced by Queensland cyclists. Helmet wearing legislation discouraged people from cycling when it was first introduced but there is little evidence that it continues to do so. Cycling has significant health benefits and should be encouraged in ways that reduce the risk of the most serious injuries. Infrastructure and speed management approaches to improving the safety of cycling should be undertaken as part of a Safe System approach, but protection of the individual by simple and cost-effective methods such as bicycle helmets should also be part of an overall package of measures.

Relevance:

10.00%

Publisher:

Abstract:

A Wireless Sensor Network (WSN) is a set of sensors that are integrated with a physical environment. These sensors are small in size and capable of sensing physical phenomena and processing them. Owing to their short radio range, they communicate in a multihop manner to form an ad hoc network capable of reporting network activities to a data collection sink. Recent advances in WSNs have led to several new promising applications, including habitat monitoring, military target tracking, natural disaster relief, and health monitoring. A current sensor node, such as the MICA2, uses a 16-bit, 8 MHz Texas Instruments MSP430 micro-controller with only 10 KB RAM, 128 KB program space, and 512 KB external flash memory to store measurement data, and is powered by two AA batteries. Due to these unique specifications and a lack of tamper-resistant hardware, devising security protocols for WSNs is complex. Previous studies show that data transmission consumes much more energy than computation. Data aggregation can greatly reduce this consumption by eliminating redundant data. However, aggregators are under threat from various types of attacks. Among them, node compromise is usually considered one of the most challenging for the security of WSNs. In a node compromise attack, an adversary physically tampers with a node in order to extract its cryptographic secrets. This attack can be very harmful, depending on the security architecture of the network. For example, when an aggregator node is compromised, it is easy for the adversary to change the aggregation result and inject false data into the WSN. The contributions of this thesis to the area of secure data aggregation are manifold. We first define security for data aggregation in WSNs. In contrast with existing secure data aggregation definitions, the proposed definition covers the unique characteristics of WSNs.
Secondly, we analyze the relationship between the security services and adversarial models considered in existing secure data aggregation work in order to provide a general framework of required security services. Thirdly, we analyze existing cryptographic-based and reputation-based secure data aggregation schemes; this analysis covers the security services provided by these schemes and their robustness against attacks. Fourthly, we propose a robust reputation-based secure data aggregation scheme for WSNs. This scheme minimizes the use of heavy cryptographic mechanisms; its security advantages are realized by integrating aggregation functionalities with (i) a reputation system, (ii) estimation theory, and (iii) a change detection mechanism. We show that this addition helps defend against most of the security attacks discussed in this thesis, including the On-Off attack. Finally, we propose a secure key management scheme for distributing essential pairwise and group keys among the sensor nodes. The design idea of the proposed scheme is to combine Lamport's reverse hash chain with the usual hash chain to provide both past and future key secrecy. The proposal avoids delivering the whole value of a new group key during a group key update; instead, only half of the value is transmitted from the network manager to the sensor nodes. In this way, the compromise of a pairwise key alone does not lead to the compromise of the group key. The new pairwise key in our scheme is determined by a Diffie-Hellman-based key agreement.
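The reverse hash chain underlying the key management scheme can be sketched simply: the network manager computes a hash chain forward but discloses elements in reverse order, so an attacker who learns the current key cannot derive future keys, while nodes authenticate each newly released key by hashing it back to the previous one. The combination with a forward chain and the half-value group key update are not reproduced in this sketch:

```python
import hashlib

def sha(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def reverse_chain(seed: bytes, length: int):
    """Compute seed, H(seed), ..., H^(length-1)(seed) and return the list in
    release order: the last chain element is disclosed first (Lamport style)."""
    chain = [seed]
    for _ in range(length - 1):
        chain.append(sha(chain[-1]))
    return chain[::-1]

released = reverse_chain(b"manager-secret", 5)    # released[0] = H^4(seed)
# each newly released key authenticates against the previously released one:
ok = all(sha(released[i + 1]) == released[i] for i in range(len(released) - 1))
```

Because `H` is one-way, holding `released[i]` reveals nothing about `released[i + 1]`, which is what gives the scheme its future key secrecy.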