919 results for bargaining requirement


Relevance:

10.00%

Publisher:

Abstract:

Aim The aim of this study was to establish intensive care unit (ICU) nurses’ knowledge of delirium within an acute tertiary hospital in South East Asia. Background Delirium is a common, life-threatening and often preventable cause of morbidity and mortality among older patients. Undetected and untreated delirium is a catalyst for increased mortality, morbidity and functional decline, and results in an increased requirement for nursing care, greater healthcare expense and longer hospital length of stay. However, despite the availability of effective assessment tools to identify delirium in the acute setting, ICU nurses remain unable to accurately identify delirium in the critically ill patient, especially hypoactive delirium. Method A purposive sample of 53 staff nurses from a 13-bed medical intensive care unit within an acute tertiary teaching hospital in South East Asia was asked to participate. A 40-item, 5-point Likert-scale questionnaire was employed to determine the participants’ knowledge of the signs and symptoms, risk factors and negative outcomes of delirium. Results The overall mean score for positively answered questions was 27 (67.3%) out of a possible 40. Mean scores for knowledge of signs and symptoms, risk factors and negative outcomes were 9.52 (63.5%, n = 15), 11.43 (63.5%, n = 17) and 6.0 (75%, n = 8), respectively. Conclusion Whilst the results of this study are similar to those from western settings, the ICU nurses in this study demonstrated limited knowledge of the signs and symptoms, risk factors and negative outcomes of delirium in the critically ill patient. The implications for practice are important given the outcomes of untreated delirium.

Relevance:

10.00%

Publisher:

Abstract:

There are many challenges in developing research projects in research-naïve clinical settings, especially in palliative care, where resistance to participating in research has been identified. These challenges to the implementation of research are common in nursing practice and are associated with attitudes towards research participation and some lack of understanding of research as a process for improving clinical practice. This is despite the professional requirement for nurses to conduct research into issues that influence palliative care practice. The purpose of this paper is to describe the process of implementing a clinical research project in collaboration with the clinicians of a community palliative care team and to reflect on the strategies implemented to overcome the challenges involved. The challenges presented here demonstrate the importance of proactively implementing engagement strategies from the inception of a research project in a clinical setting.

Relevance:

10.00%

Publisher:

Abstract:

There has been significant research in the field of database watermarking recently. However, insufficient attention has been given to the requirement of providing reversibility (the ability to revert from the watermarked relation back to the original relation) and blindness (not needing the original relation for detection) at the same time. Schemes lacking these properties have several disadvantages compared with reversible and blind watermarking (which requires only the watermarked relation and a secret key, from which the watermark is detected and the original relation restored): the inability to identify the rightful owner after a successful secondary watermarking attack, the inability to revert the relation to the original data set (required in high-precision industries), and the need to store the unmarked relation in secure secondary storage. To overcome these problems, we propose a watermarking scheme that is both reversible and blind. We utilize difference expansion on integers to achieve reversibility. The major advantages of our scheme are reversibility to a high-quality original data set, rightful owner identification, resistance against secondary watermarking attacks, and no need to store the original database in secure secondary storage. We have implemented our scheme, and the results show that the success rate is limited to 11% even when 48% of the tuples are modified.
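
As background for the reversibility claim, here is a minimal sketch of difference expansion on a single pair of integers. It is a generic Tian-style transform offered as an illustration, not the authors' implementation, and it omits the key-based tuple selection and overflow handling a full scheme needs.

```python
# Minimal sketch of difference expansion on one pair of integer
# attribute values. Embedding a bit is exactly invertible, which is
# what gives the scheme its reversibility; key-based tuple selection
# and overflow handling are omitted here.

def embed_bit(x: int, y: int, bit: int) -> tuple[int, int]:
    """Hide one watermark bit in the expanded difference of (x, y)."""
    avg = (x + y) // 2          # integer average, preserved by embedding
    diff = x - y                # difference to be expanded
    diff2 = 2 * diff + bit      # expand and append the watermark bit
    return avg + (diff2 + 1) // 2, avg - diff2 // 2

def extract_bit(x2: int, y2: int) -> tuple[int, int, int]:
    """Recover the bit and restore the original pair -- blind detection."""
    avg = (x2 + y2) // 2
    diff2 = x2 - y2
    bit = diff2 & 1             # the embedded bit is the low-order bit
    diff = diff2 >> 1           # undo the expansion (arithmetic shift floors)
    return avg + (diff + 1) // 2, avg - diff // 2, bit

x2, y2 = embed_bit(7, 4, 1)
assert extract_bit(x2, y2) == (7, 4, 1)   # original values fully restored
```

Detection needs only the watermarked pair, which is what makes the approach blind as well as reversible.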

Relevance:

10.00%

Publisher:

Abstract:

Sudden, violent and otherwise unexplained deaths are investigated in most western jurisdictions through a coronial or medico-legal process. A crucial element of such an investigation is the legislative requirement to remove the body for autopsy and other medical interventions, processes which can disrupt traditional religious and cultural grieving practices. While recent legislative changes in an increasing number of jurisdictions allow families to raise objections on religious and cultural grounds, such concerns can be overruled, often exacerbating the trauma and grief of families. Based on funded research interviewing a range of coronial staff in one Australian jurisdiction, this paper explores the disjuncture between medico-legal discourses, which position the body as corpse, and the rise of more ‘therapeutic’ discourses which recognise the family’s wish to reposition the body as beloved and lamented.

Relevance:

10.00%

Publisher:

Abstract:

Geoscientists are confronted with the challenge of assessing nonlinear phenomena that result from multiphysics coupling across multiple scales, from the quantum level to the scale of the Earth and from femtoseconds to the 4.5 Ga history of our planet. In this review we neglect electromagnetic modelling of processes in the Earth’s core and focus on four types of coupling that underpin fundamental instabilities in the Earth: thermal (T), hydraulic (H), mechanical (M) and chemical (C) processes, which are driven and controlled by the transfer of heat to the Earth’s surface. Instabilities appear as faults, folds, compaction bands, shear/fault zones, plate boundaries and convective patterns. Convective patterns emerge from buoyancy overcoming viscous drag at a critical Rayleigh number. All other processes emerge from non-conservative thermodynamic forces with a critical dissipative source term, which can be characterised by the modified Gruntfest number Gr. These dissipative processes reach a quasi-steady state when, at maximum dissipation, THMC diffusion (Fourier, Darcy, Biot, Fick) balances the source term. The emerging steady-state dissipative patterns are defined by the respective diffusion length scales, which provide a fundamental thermodynamic yardstick for measuring instabilities in the Earth. The implementation of a fully coupled THMC multiscale theoretical framework into an applied workflow is still in its early stages. This is largely owing to the four fundamentally different THMC diffusion length scales, spanning micrometres to tens of kilometres, compounded by the additional need to incorporate microstructural information into enriched continuum formulations for THMC feedback simulations. Another challenge is the factor of time: the geomaterial is often very far from initial yield and flows on a time scale that cannot be accessed in the laboratory. This requires adopting a thermodynamic framework in conjunction with flow theories of plasticity, which, unlike consistency plasticity, allows the description of both solid-mechanical and fluid-dynamic instabilities. In the applications we show the similarity of THMC feedback patterns across scales, such as brittle and ductile folds and faults. A particularly interesting case is discussed in detail, in which ductile compaction bands emerge from the fluid-dynamic solution; these are akin to, and can be confused with, their brittle siblings. The main difference is that they require the factor of time and a much lower driving force to emerge. Such low-stress solutions cannot be obtained on short laboratory time scales and are therefore much more likely to appear in nature than in the laboratory. We finish with a multiscale description of a seminal structure in the Swiss Alps, the Glarus thrust, which puzzled geologists for more than 100 years. Along the Glarus thrust, a km-scale package of rocks (nappe) has been pushed 40 km over its footwall as a solid rock body. The thrust itself is a metre-wide ductile shear zone, while the centre of the thrust contains a mm- to cm-wide central slip zone experiencing periodic extreme deformation akin to a stick-slip event. The metre-wide creeping zone is consistent with the THM feedback length scale of solid mechanics, while the ultralocalised central slip zone is most likely a fluid-dynamic instability.
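
For reference, the two yardsticks invoked above can be written compactly. This is a minimal formula sketch: the symbols follow the usual conventions (g gravity, α thermal expansivity, ΔT temperature contrast, L layer thickness, ν kinematic viscosity, κ thermal diffusivity, τ a process time scale), and the critical value quoted is the classical Rayleigh–Bénard result for rigid boundaries, an assumption here rather than a number from the review.

```latex
% Onset of convection: buoyancy overcoming viscous drag and thermal diffusion
\mathrm{Ra} = \frac{g \,\alpha \,\Delta T \, L^{3}}{\nu \,\kappa},
\qquad \text{instability for } \mathrm{Ra} > \mathrm{Ra}_{c} \approx 1708.

% Diffusive length scale that sets the size of a steady dissipative pattern
\ell_{\mathrm{diff}} = \sqrt{\kappa \, \tau}.
```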

Relevance:

10.00%

Publisher:

Abstract:

Disjoint top-view networked cameras are among the most commonly deployed camera networks in many applications. One of the open questions in the study of these cameras is the computation of their extrinsic parameters (positions and orientations), known as extrinsic calibration or camera localization. Current approaches either rely on strict assumptions about the object motion to obtain accurate results, or fail to provide highly accurate results without requiring such motion. To address these shortcomings, we present a location-constrained maximum a posteriori (LMAP) approach that exploits known locations in the surveillance area, some of which are passed by the object opportunistically. The LMAP approach formulates the problem as a joint inference of the extrinsic parameters and the object trajectory based on the cameras’ observations and the known locations. In addition, a new task-oriented evaluation metric, named MABR (the Maximum value of All image points’ Back-projected localization errors’ L2 norms Relative to the area of the field of view), is presented to assess the quality of the calibration results in an indoor object-tracking context. Finally, the results demonstrate the superior performance of the proposed method over a state-of-the-art algorithm in terms of both the presented MABR and a classical evaluation metric, in simulations and real experiments.
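
The joint-inference idea can be sketched compactly in a simplified setting. Everything in the sketch below is an illustrative assumption: top-view cameras are reduced to a 2-D ground-plane pose (x, y, heading), the observations are synthetic, and a plain nonlinear least-squares solve stands in for the paper's MAP formulation, whose priors and noise models are not reproduced here.

```python
# Sketch: jointly estimate camera poses and the object trajectory,
# anchored by a few known world locations (the "location constraint").
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

# Ground-truth camera poses (x, y, heading) and object trajectory.
true_cams = np.array([[0.0, 0.0, 0.0],
                      [8.0, 1.0, 1.2]])
traj = np.column_stack([np.linspace(1.0, 7.0, 12),
                        np.sin(np.linspace(0.0, 3.0, 12))])
known_idx = [0, 11]                      # trajectory points at known locations
known_pts = traj[known_idx]

def to_cam(pose, pts):
    """World ground-plane points -> camera-local coordinates."""
    c, s = np.cos(pose[2]), np.sin(pose[2])
    R = np.array([[c, s], [-s, c]])      # rotation by -heading
    return (pts - pose[:2]) @ R.T

# Noisy observations of the trajectory from each camera.
obs = [to_cam(p, traj) + 0.01 * rng.standard_normal(traj.shape)
       for p in true_cams]

def residuals(z):
    cams = z[:6].reshape(2, 3)
    pts = z[6:].reshape(-1, 2)
    r = [(to_cam(cams[k], pts) - obs[k]).ravel() for k in range(2)]
    r.append(10.0 * (pts[known_idx] - known_pts).ravel())  # location prior
    return np.concatenate(r)

z0 = np.concatenate([np.zeros(6), obs[0].ravel()])  # crude initial guess
sol = least_squares(residuals, z0)
print(sol.x[:6].reshape(2, 3))           # recovered camera poses
```

Without the anchored locations the problem has a global rigid-transform ambiguity; the known points are what pin the solution to the world frame.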

Relevance:

10.00%

Publisher:

Abstract:

Enterprises, both public and private, have rapidly begun to exploit the benefits of enterprise resource planning (ERP) combined with business analytics and “open data sets”, which are often outside the control of the enterprise, to gain further efficiencies, build new service operations and increase business activity. In many cases, these business activities are based on relevant software systems hosted in a “cloud computing” environment. “Garbage in, garbage out” (GIGO) is a term long used to describe the problem of unqualified dependence on information systems, dating from the 1960s. However, a more pertinent variation arose sometime later, namely “garbage in, gospel out”, signifying that with large-scale information systems such as ERP, and with the use of open data sets in a cloud environment, verifying the authenticity of the data sets used may be almost impossible, resulting in dependence upon questionable results. Illicit data-set “impersonation” becomes a reality. At the same time, the ability to audit such results may be an important requirement, particularly in the public sector. This paper discusses the need for enhanced identity, reliability, authenticity and audit services, including naming and addressing services, in this emerging environment, and analyses some current technologies that may be appropriate. However, severe limitations in addressing these requirements have been identified, and the paper proposes further research in the area.
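
To make the authenticity requirement concrete, the sketch below shows one standard building block: a keyed digest published alongside an open data set so a consumer can detect tampering or impersonation. The file name and key handling are illustrative assumptions, not a mechanism proposed in this paper.

```python
# Minimal sketch: tamper/impersonation check for a downloaded data set.
# The publisher computes an HMAC over the file with a key distributed
# through some trusted channel; the consumer recomputes and compares.
import hashlib
import hmac

def dataset_tag(path: str, key: bytes) -> str:
    """Keyed digest of a data-set file, streamed to handle large files."""
    mac = hmac.new(key, digestmod=hashlib.sha256)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            mac.update(chunk)
    return mac.hexdigest()

def verify(path: str, key: bytes, published_tag: str) -> bool:
    """Constant-time comparison guards against timing side channels."""
    return hmac.compare_digest(dataset_tag(path, key), published_tag)
```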

Relevance:

10.00%

Publisher:

Abstract:

Enterprise resource planning (ERP) systems are rapidly being combined with “big data” analytics processes and publicly available “open data sets”, which are usually outside the arena of the enterprise, to expand activity through better service to current clients and by identifying new opportunities. Moreover, these activities are now largely based on relevant software systems hosted in a “cloud computing” environment. The over 50-year-old phrase reflecting mistrust in computer systems, “garbage in, garbage out” (GIGO), describes the problem of unqualified and unquestioning dependence on information systems. A more relevant interpretation arose sometime later, namely “garbage in, gospel out”, signifying that with large-scale information systems based on ERP, open data sets and “big data” analytics, particularly in a cloud environment, verifying the authenticity and integrity of the data sets used may be almost impossible. In turn, this may easily result in decision-making based upon questionable and unverifiable results. Illicit “impersonation” of, and modification to, legitimate data sets may become a reality, while at the same time the ability to audit any derived results may be an important requirement, particularly in the public sector. This paper discusses the pressing need for enhanced identity, reliability, authenticity and audit services, including naming and addressing services, in this emerging environment, and examines some current and potentially appropriate technologies. However, severe limitations in addressing the identified problems are found, and the paper proposes further necessary research for the area. (Note: This paper is based on an earlier unpublished paper/presentation “Identity, Addressing, Authenticity and Audit Requirements for Trust in ERP, Analytics and Big/Open Data in a ‘Cloud’ Computing Environment: A Review and Proposal” presented to the Department of Accounting and IT, College of Management, National Chung Chen University, 20 November 2013.)
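
As one concrete illustration of the audit requirement raised here, the sketch below chains each audit record to the digest of its predecessor, so that retrospectively editing any entry invalidates every digest that follows. This is a generic tamper-evidence pattern offered as an illustration, not a design from the paper.

```python
# Minimal sketch of a hash-chained audit log: each entry commits to the
# previous entry's digest, so editing or deleting a past record breaks
# every digest after it.
import hashlib
import json

def append_entry(log: list, record: dict) -> None:
    prev = log[-1]["digest"] if log else "0" * 64
    body = json.dumps(record, sort_keys=True)
    digest = hashlib.sha256((prev + body).encode()).hexdigest()
    log.append({"record": record, "digest": digest})

def verify_log(log: list) -> bool:
    prev = "0" * 64
    for entry in log:
        body = json.dumps(entry["record"], sort_keys=True)
        if hashlib.sha256((prev + body).encode()).hexdigest() != entry["digest"]:
            return False
        prev = entry["digest"]
    return True

log = []
append_entry(log, {"actor": "analyst-1", "action": "ran report Q3"})
append_entry(log, {"actor": "analyst-2", "action": "exported results"})
assert verify_log(log)
log[0]["record"]["action"] = "something else"   # tampering...
assert not verify_log(log)                      # ...is detected downstream
```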

Relevance:

10.00%

Publisher:

Abstract:

In this chapter, we discuss four related areas of cryptology: authentication, hashing, message authentication codes (MACs) and digital signatures. These are active and growing research topics in cryptology. Space limitations allow us to concentrate only on the essential aspects of each topic; the bibliography is intended to supplement our survey, and we have selected items which provide an overview of the current state of knowledge in these areas. Authentication deals with the problem of providing assurance to a receiver that a communicated message originates from a particular transmitter and that the received message has the same content as the transmitted message. A typical authentication scenario occurs in computer networks, where the identity of two communicating entities is established by means of authentication. Hashing is concerned with the problem of providing a relatively short digest, or fingerprint, of a much longer message or electronic document. A hashing function must satisfy (at least) the critical requirement that it be computationally infeasible to find two distinct messages with the same fingerprint. Hashing functions have numerous applications in cryptology; they are often used as primitives in the construction of other cryptographic functions. MACs are symmetric-key primitives that provide message integrity against active spoofing by appending a cryptographic checksum to a message that is verifiable only by the intended recipient. Message authentication is one of the most important ways of ensuring the integrity of information transferred by electronic means. Digital signatures provide electronic equivalents of handwritten signatures: they preserve the essential features of handwritten signatures, can be used to sign electronic documents, and can potentially be used in legal contexts.
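
As a minimal illustration of the last primitive surveyed above, the sketch below signs and verifies a document with Ed25519. It assumes the third-party Python cryptography package (pip install cryptography) and stands in for no particular scheme discussed in the chapter; key management is omitted.

```python
# Minimal sketch of a digital signature with Ed25519: only the private
# key holder can sign, anyone holding the public key can verify.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

private_key = ed25519.Ed25519PrivateKey.generate()
public_key = private_key.public_key()

document = b"I agree to the terms of the contract."
signature = private_key.sign(document)          # signing requires the key

try:
    public_key.verify(signature, document)      # verification is public
    print("signature valid")
except InvalidSignature:
    print("signature invalid")
```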

Relevance:

10.00%

Publisher:

Abstract:

Population increase and economic development lead to the construction as well as the demolition of infrastructure such as buildings, bridges and roads, resulting in used concrete as a primary waste product. Recycling waste concrete to obtain recycled concrete aggregates (RCA) for base and/or sub-base materials in road construction is a foremost application to be promoted for its economic and sustainability benefits. Because mortar, bricks, glass and reclaimed asphalt pavement (RAP) are present as constituents in RCA, it exhibits inconsistent properties and performance. In this study, six different types of RCA samples were subjected to classification tests, including particle size distribution, plasticity, compaction, unconfined compressive strength (UCS) and California bearing ratio (CBR) tests. The results were compared with those of the standard road materials used in Queensland, Australia. It was found that the ‘RM1-100/RM3-0’ and ‘RM1-80/RM3-20’ samples are at the margin of the minimum required specifications for base materials used in high-volume unbound granular roads, while the others fall below that minimum requirement.

Relevance:

10.00%

Publisher:

Abstract:

Secure multi-party computation (MPC) protocols enable a set of n mutually distrusting participants P_1, ..., P_n, each with a private input x_i, to compute a function Y = F(x_1, ..., x_n) such that at the end of the protocol all participants learn the correct value of Y while the secrecy of the private inputs is maintained. Classical results in unconditionally secure MPC indicate that, in the presence of an active adversary, every function can be computed if and only if the number of corrupted participants, t_a, is smaller than n/3. Relaxing the requirement of perfect secrecy and utilizing broadcast channels, this bound can be improved to t_a < n/2. All existing MPC protocols assume that uncorrupted participants are truly honest, i.e., not even curious to learn the other participants’ secret inputs. Based on this assumption, some MPC protocols are designed in such a way that, after the elimination of all misbehaving participants, the remaining ones learn all information in the system; this is not consistent with maintaining the privacy of participant inputs. Furthermore, an improvement of the classical results given by Fitzi, Hirt and Maurer indicates that, in addition to t_a actively corrupted participants, the adversary may simultaneously corrupt some participants passively, which contradicts the assumption that participants not corrupted by an active adversary are truly honest. This paper examines the privacy of MPC protocols and introduces the notion of an omnipresent adversary, which cannot be eliminated from the protocol. The omnipresent adversary can be passive, active or mixed. We assume that up to a minority of the participants who are not corrupted by an active adversary can be corrupted passively, with the restriction that at any time the number of corrupted participants does not exceed a predetermined threshold. We also show that the existence of a t-resilient protocol for a group of n participants implies the existence of a t′-private protocol for a group of n′ participants; that is, the elimination of misbehaving participants from a t-resilient protocol leads to the decomposition of the protocol. Our adversary model stipulates that an MPC protocol never operates with a set of truly honest participants, which is a more realistic scenario; therefore, the privacy of all participants who properly follow the protocol is maintained. We present a novel disqualification protocol to avoid a loss of privacy for participants who properly follow the protocol.
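
To make the privacy notion concrete, here is a minimal sketch of additive secret sharing over a prime field, a standard building block for protocols of this kind. It illustrates only the passive-privacy flavour of MPC, not the paper's disqualification protocol or its adversary model.

```python
# Minimal sketch of additive secret sharing mod a prime: each of n
# participants splits its input into n random-looking shares; any n-1
# shares reveal nothing, yet the sum of all inputs can be reconstructed
# from the per-participant share sums alone.
import secrets

P = 2**61 - 1   # prime modulus; all arithmetic is in the field Z_P

def share(x: int, n: int) -> list[int]:
    """Split x into n additive shares summing to x mod P."""
    parts = [secrets.randbelow(P) for _ in range(n - 1)]
    parts.append((x - sum(parts)) % P)
    return parts

inputs = [12, 7, 30]                      # private inputs of P1, P2, P3
all_shares = [share(x, 3) for x in inputs]

# Participant j locally adds the j-th share of every input ...
partial = [sum(s[j] for s in all_shares) % P for j in range(3)]
# ... and publishing only these partial sums reveals just the total.
assert sum(partial) % P == sum(inputs) % P   # == 49
```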

Relevance:

10.00%

Publisher:

Abstract:

Current military conflicts are characterized by the use of the improvised explosive device. Improvements in personal protection, medical care and evacuation logistics have resulted in increasing numbers of casualties surviving with complex musculoskeletal injuries, often leading to lifelong disability. Thus, there exists an urgent requirement to investigate the mechanism of extremity injury caused by these devices in order to develop mitigation strategies. In addition, the wounds of war are no longer restricted to the battlefield; similar injuries can be witnessed in civilian centers following a terrorist attack. Key to understanding such mechanisms of injury is the ability to reproduce the complexities of an explosive event in a controlled, laboratory-based environment. In this article, a traumatic injury simulator, designed to recreate in the laboratory the impulse that is transferred to the lower extremity in an anti-vehicle explosion, is presented and characterized experimentally and numerically. Tests with instrumented cadaveric limbs were then conducted to assess the simulator’s ability to interact with the human body in two mounting conditions, simulating typical seated and standing vehicle passengers. This experimental device will now allow us to (a) gain a comprehensive understanding of the load-transfer mechanisms through the lower limb, (b) characterize the dissipating capacity of mitigation technologies, and (c) assess the bio-fidelity of surrogates.

Relevance:

10.00%

Publisher:

Abstract:

The conflicts in Iraq and Afghanistan have been epitomized by the insurgents’ use of the improvised explosive device against vehicle-borne security forces. These weapons, capable of causing multiple severely injured casualties in a single incident, pose the most prevalent single threat to Coalition troops operating in the region. Improvements in personal protection and medical care have resulted in increasing numbers of casualties surviving with complex lower limb injuries, often leading to long-term disability. Thus, there exists an urgent requirement to investigate and mitigate the mechanism of extremity injury caused by these devices. This will necessitate an ontological approach, linking molecular, cellular and tissue interactions to physiological dysfunction, and can only be achieved through collaboration between clinicians, natural scientists and engineers, combining physical and numerical modelling tools with clinical data from the battlefield. In this article, we compile existing knowledge on the effects of explosions on skeletal injury, review and critique relevant experimental and computational research related to lower limb injury and damage, and propose the research foci required to drive the development of future mitigation technologies.

Relevance:

10.00%

Publisher:

Abstract:

Objective While many jurisdictions internationally now require learner drivers to complete a specified number of hours of supervised driving practice before being able to drive unaccompanied, very few require learner drivers to record this practice in a log book and present it to the licensing authority. Learner drivers in most Australian jurisdictions must complete a log book recording their practice, thereby confirming to the licensing authority that they have met the mandated hours of practice requirement. These log books facilitate the management and enforcement of minimum supervised driving-hours requirements. Method Parents of learner drivers in two Australian states, Queensland and New South Wales, completed an online survey assessing a range of factors, including their perceptions of the accuracy of their child’s learner log book and the effectiveness of the log book system. Results The study indicates that the large majority of parents believe that their child’s learner log book is accurate. However, they generally report that the log book system is only moderately effective as a means of measuring the number of hours of supervised practice a learner driver has completed. Conclusions These results suggest a paradox: many parents may believe that others are not as diligent in their use of log books as they are, or that the system is too open to misuse. Given that many parents report that their child’s log book is accurate, this study has important implications for the development and ongoing monitoring of hours of practice requirements in graduated driver licensing systems.