364 results for bargaining requirement
Abstract:
Enterprise resource planning (ERP) systems are rapidly being combined with "big data" analytics processes and publicly available "open data sets", which usually lie outside the arena of the enterprise, to expand activity through better service to current clients as well as by identifying new opportunities. Moreover, these activities are now largely based around relevant software systems hosted in a "cloud computing" environment. However, the more than 50-year-old phrase expressing mistrust in computer systems, "garbage in, garbage out" or "GIGO", is still used to describe the problems of unqualified and unquestioning dependency on information systems. A more relevant interpretation of GIGO arose somewhat later, namely "garbage in, gospel out", signifying that with large-scale information systems based around ERP and open data sets as well as "big data" analytics, particularly in a cloud environment, verifying the authenticity and integrity of the data sets used may be almost impossible. In turn, this may easily result in decision making based upon questionable results that cannot be verified. Illicit "impersonation" of, and modifications to, legitimate data sets may become a reality, while at the same time the ability to audit any derived results of analysis may be an important requirement, particularly in the public sector. This paper discusses the pressing need to enhance identity, reliability, authenticity and audit services, including naming and addressing services, in this emerging environment. Some appropriate technologies currently on offer are also examined. However, severe limitations in addressing the identified problems are found, and the paper proposes necessary further research for the area. (Note: This paper is based on an earlier unpublished paper/presentation, "Identity, Addressing, Authenticity and Audit Requirements for Trust in ERP, Analytics and Big/Open Data in a 'Cloud' Computing Environment: A Review and Proposal", presented to the Department of Accounting and IT, College of Management, National Chung Cheng University, 20 November 2013.)
Abstract:
In this chapter, we discuss four related areas of cryptology, namely authentication, hashing, message authentication codes (MACs), and digital signatures. These are active and growing research areas in cryptology. Space limitations allow us to concentrate only on the essential aspects of each topic. The bibliography is intended to supplement our survey; we have selected items that provide an overview of the current state of knowledge in these areas. Authentication deals with the problem of providing assurance to a receiver that a communicated message originates from a particular transmitter and that the received message has the same content as the transmitted message. A typical authentication scenario occurs in computer networks, where the identity of two communicating entities is established by means of authentication. Hashing is concerned with the problem of providing a relatively short digest, or fingerprint, of a much longer message or electronic document. A hashing function must satisfy (at least) the critical requirement that it be computationally infeasible to find two distinct messages with the same fingerprint. Hashing functions have numerous applications in cryptology and are often used as primitives to construct other cryptographic functions. MACs are symmetric-key primitives that provide message integrity against active spoofing by appending a cryptographic checksum to a message that is verifiable only by the intended recipient of the message. Message authentication is one of the most important ways of ensuring the integrity of information transferred by electronic means. Digital signatures provide electronic equivalents of handwritten signatures; they preserve the essential features of handwritten signatures, can be used to sign electronic documents, and can potentially be used in legal contexts.
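As a concrete illustration of the MAC concept described above, here is a minimal sketch using Python's standard hmac and hashlib modules; the key, message, and helper function are illustrative only and are not drawn from the chapter.

```python
# Minimal MAC sketch (illustrative; not from the chapter). A shared symmetric
# key lets the sender append a checksum that only the intended recipient can
# verify, detecting active spoofing or modification of the message.
import hashlib
import hmac

key = b"shared-secret-key"                     # known only to sender and receiver
message = b"transfer 100 to account 42"

# Sender computes the tag and appends it to the message.
tag = hmac.new(key, message, hashlib.sha256).digest()

def verify(key: bytes, message: bytes, tag: bytes) -> bool:
    """Receiver recomputes the tag and compares in constant time."""
    expected = hmac.new(key, message, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

assert verify(key, message, tag)               # authentic message accepted
assert not verify(key, message + b"!", tag)    # tampered message rejected
```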
Abstract:
Population increase and economic development lead to the construction as well as the demolition of infrastructure such as buildings, bridges and roads, resulting in used concrete as a primary waste product. Recycling waste concrete to obtain recycled concrete aggregates (RCA) for base and/or sub-base materials in road construction is a foremost application to be promoted for its economic and sustainability benefits. Because mortar, bricks, glass and reclaimed asphalt pavement (RAP) are present as constituents in RCA, it exhibits inconsistent properties and performance. In this study, six different types of RCA samples were subjected to classification tests including particle size distribution, plasticity and compaction tests, as well as unconfined compressive strength (UCS) and California bearing ratio (CBR) tests. Results were compared with those of the standard road materials used in Queensland, Australia. It was found that samples of material types 'RM1-100/RM3-0' and 'RM1-80/RM3-20' are within the margin of the minimum specifications required for base materials in high-volume unbound granular roads, while the other samples fall below the minimum requirement.
Abstract:
Secure multi-party computation (MPC) protocols enable a set of n mutually distrusting participants P_1, ..., P_n, each with their own private input x_i, to compute a function Y = F(x_1, ..., x_n), such that at the end of the protocol all participants learn the correct value of Y while the secrecy of the private inputs is maintained. Classical results in unconditionally secure MPC indicate that, in the presence of an active adversary, every function can be computed if and only if the number of corrupted participants, t_a, is smaller than n/3. Relaxing the requirement of perfect secrecy and utilizing broadcast channels, one can improve this bound to t_a < n/2. All existing MPC protocols assume that uncorrupted participants are truly honest, i.e., they are not even curious about learning other participants' secret inputs. Based on this assumption, some MPC protocols are designed in such a way that, after elimination of all misbehaving participants, the remaining ones learn all information in the system. This is not consistent with maintaining the privacy of the participants' inputs. Furthermore, an improvement of the classical results given by Fitzi, Hirt, and Maurer indicates that, in addition to t_a actively corrupted participants, the adversary may simultaneously corrupt some participants passively, which contradicts the assumption that participants not corrupted by an active adversary are truly honest. This paper examines the privacy of MPC protocols and introduces the notion of an omnipresent adversary, which cannot be eliminated from the protocol. The omnipresent adversary can be passive, active, or mixed. We assume that up to a minority of participants who are not corrupted by an active adversary can be corrupted passively, with the restriction that at any time the number of corrupted participants does not exceed a predetermined threshold. We also show that the existence of a t-resilient protocol for a group of n participants implies the existence of a t'-private protocol for a group of n' participants; that is, the elimination of misbehaving participants from a t-resilient protocol leads to the decomposition of the protocol. Our adversary model stipulates that an MPC protocol never operates with a set of truly honest participants, which is a more realistic scenario. Therefore, the privacy of all participants who properly follow the protocol is maintained. We present a novel disqualification protocol to avoid a loss of privacy for participants who properly follow the protocol.
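As a hedged illustration of the secrecy requirement discussed above, the following sketch shows additive secret sharing over a prime field, a basic primitive behind many unconditionally secure MPC protocols; the modulus, inputs, and function names are assumptions for illustration and are not taken from the paper.

```python
# Additive secret sharing sketch (illustrative assumptions throughout).
# Each participant P_i splits its input x_i into n shares; any n-1 shares
# reveal nothing, yet the sum of all inputs can be reconstructed from the
# locally added shares, so no individual input is exposed.
import secrets

P = 2**61 - 1  # prime modulus, chosen for illustration

def share(secret: int, n: int) -> list[int]:
    shares = [secrets.randbelow(P) for _ in range(n - 1)]
    shares.append((secret - sum(shares)) % P)  # shares sum to the secret mod P
    return shares

def reconstruct(shares: list[int]) -> int:
    return sum(shares) % P

x1_shares = share(10, 3)                      # participant 1's private input
x2_shares = share(32, 3)                      # participant 2's private input
sum_shares = [(a + b) % P for a, b in zip(x1_shares, x2_shares)]
assert reconstruct(sum_shares) == 42          # only the agreed output is learned
```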
Abstract:
Current military conflicts are characterized by the use of the improvised explosive device. Improvements in personal protection, medical care, and evacuation logistics have resulted in increasing numbers of casualties surviving with complex musculoskeletal injuries, often leading to lifelong disability. Thus, there exists an urgent requirement to investigate the mechanism of extremity injury caused by these devices in order to develop mitigation strategies. In addition, the wounds of war are no longer restricted to the battlefield; similar injuries can be witnessed in civilian centers following a terrorist attack. Key to understanding such mechanisms of injury is the ability to deconstruct the complexities of an explosive event into a controlled, laboratory-based environment. In this article, a traumatic injury simulator, designed to recreate in the laboratory the impulse that is transferred to the lower extremity from an anti-vehicle explosion, is presented and characterized experimentally and numerically. Tests with instrumented cadaveric limbs were then conducted to assess the simulator's ability to interact with the human in two mounting conditions, simulating typical seated and standing vehicle passengers. This experimental device will now allow us to (a) gain a comprehensive understanding of the load-transfer mechanisms through the lower limb, (b) characterize the dissipating capacity of mitigation technologies, and (c) assess the bio-fidelity of surrogates.
Abstract:
The conflicts in Iraq and Afghanistan have been epitomized by the insurgents' use of the improvised explosive device against vehicle-borne security forces. These weapons, capable of causing multiple severely injured casualties in a single incident, pose the most prevalent single threat to Coalition troops operating in the region. Improvements in personal protection and medical care have resulted in increasing numbers of casualties surviving with complex lower limb injuries, often leading to long-term disability. Thus, there exists an urgent requirement to investigate the mechanism of extremity injury caused by these devices and to mitigate against it. This will necessitate an ontological approach, linking molecular, cellular and tissue interaction to physiological dysfunction, which can only be achieved via collaboration between clinicians, natural scientists and engineers, combining physical and numerical modelling tools with clinical data from the battlefield. In this article, we compile existing knowledge on the effects of explosions on skeletal injury, review and critique relevant experimental and computational research related to lower limb injury and damage, and propose the research foci required to drive the development of future mitigation technologies.
Abstract:
Objective: While many jurisdictions internationally now require learner drivers to complete a specified number of hours of supervised driving practice before being able to drive unaccompanied, very few require learner drivers to complete a log book to record this practice and then present it to the licensing authority. Learner drivers in most Australian jurisdictions must complete a log book that records their practice, thereby confirming to the licensing authority that they have met the mandated hours-of-practice requirement. These log books facilitate the management and enforcement of minimum supervised hours of driving requirements. Method: Parents of learner drivers in two Australian states, Queensland and New South Wales, completed an online survey assessing a range of factors, including their perceptions of the accuracy of their child's learner log book and the effectiveness of the log book system. Results: The study indicates that the large majority of parents believe that their child's learner log book is accurate. However, they generally report that the log book system is only moderately effective as a system for measuring the number of hours of supervised practice a learner driver has completed. Conclusions: These results suggest a paradox: many parents may believe that others are not as diligent in their use of log books as they are, or that the system is too open to misuse. Given that many parents report that their child's log book is accurate, this study has important implications for the development and ongoing monitoring of hours-of-practice requirements in graduated driver licensing systems.
Abstract:
Database watermarking has received significant research attention over the current decade. However, almost all watermarking models have been either irreversible (the original relation cannot be restored from the watermarked relation) and/or non-blind (requiring the original relation to detect the watermark in the watermarked relation). Such models have several disadvantages compared with reversible and blind watermarking (which requires only the watermarked relation and a secret key, from which the watermark is detected and the original relation is restored), including the inability to identify the rightful owner in the case of successful secondary watermarking, the inability to revert the relation to the original data set (required in high-precision industries), and the requirement to store the unmarked relation in secure secondary storage. To overcome these problems, we propose a watermarking scheme that is both reversible and blind. We utilize difference expansion on integers to achieve reversibility. The major advantages of our scheme are reversibility to a high-quality original data set, rightful owner identification, resistance against secondary watermarking attacks, and no need to store the original database in secure secondary storage.
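To make the reversibility mechanism concrete, here is a minimal sketch of embedding a single watermark bit into a pair of integer attribute values by difference expansion; the function names and example values are hypothetical and do not reproduce the proposed scheme's full embedding procedure.

```python
# Difference-expansion sketch (hypothetical values; not the full scheme).
# One bit is hidden in the expanded difference of an integer pair; detection
# needs only the watermarked values (blind) and restores the originals
# exactly (reversible).

def embed(x: int, y: int, bit: int) -> tuple[int, int]:
    l = (x + y) // 2            # integer average, preserved by the embedding
    h = 2 * (x - y) + bit       # expand the difference and append the bit
    return l + (h + 1) // 2, l - h // 2

def extract(xw: int, yw: int) -> tuple[int, int, int]:
    l = (xw + yw) // 2
    h = xw - yw
    bit, d = h % 2, h // 2      # recover the bit and the original difference
    return l + (d + 1) // 2, l - d // 2, bit

x, y = 130, 127
xw, yw = embed(x, y, 1)
assert extract(xw, yw) == (x, y, 1)   # blind detection, exact restoration
```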
Abstract:
Bandwidths and offsets are important components of vehicle traffic control strategies, and this article proposes new methods for quantifying and selecting them. Bandwidth is the amount of green time available for vehicles to travel through adjacent intersections without being required to stop at the second traffic light. The offset is the difference between the start times of "green" periods at two adjacent intersections along a given route. The core ideas in this article were developed during the 2013 Maths and Industry Study Group in Brisbane, Australia. Analytical expressions for computing bandwidth as a function of offset are developed, and an optimisation model for selecting offsets across an arterial is proposed. The focus is on arterial roads, as bandwidth and offset have a greater impact on these types of road than on a full traffic network. A generic optimisation-simulation approach is also proposed to refine an initial starting solution according to a specified metric. A metric that reflects the number of stops, and the distance between stops, is proposed to explicitly reduce the dissatisfaction of road users and to implicitly reduce fuel consumption and emissions. Conceptually, the optimisation-simulation approach is superior, as it handles real-life complexities and performs global optimisation. The models and equations in this article can be used in road planning and traffic control.
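The paper's analytical expressions are not reproduced here, but the underlying idea can be sketched as follows: the upstream green window, shifted by the travel time to the next intersection, is intersected with the downstream green window, whose start is set by the offset. All parameter names and the single-cycle simplification are assumptions for illustration.

```python
# Bandwidth-versus-offset sketch (single cycle, no wrap-around; illustrative).
def bandwidth(green_up: float, green_down: float,
              travel_time: float, offset: float) -> float:
    """Seconds of overlap between the arriving platoon's window and the
    downstream green period."""
    up_start, up_end = travel_time, travel_time + green_up    # arrival window
    down_start, down_end = offset, offset + green_down        # downstream green
    return max(0.0, min(up_end, down_end) - max(up_start, down_start))

# 40 s of green at both intersections and 25 s travel time: an offset of 25 s
# aligns the windows (full 40 s of bandwidth); an offset of 50 s leaves 15 s.
print(bandwidth(40, 40, 25, offset=25))   # 40.0
print(bandwidth(40, 40, 25, offset=50))   # 15.0
```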
Abstract:
The design-build (DB) system is regarded as an effective means of delivering sustainable buildings, and specifying clear sustainability requirements to potential contractors is of great importance to project success. This research investigates the current state of practice in defining sustainability requirements within the public sectors of the U.S. construction market, using a robust content analysis of 49 DB requests for proposals (RFPs). The results reveal that owners predominantly communicate their desired level of sustainability through the LEED certification system. The sustainability requirement has become an important dimension of the best-value evaluation of DB contractors, with importance weightings of up to 25%. Additionally, owners of larger projects, and owners who provide less design information in their RFPs, generally allocate significantly higher importance weightings to sustainability requirements. The primary knowledge contribution of this study to the construction industry is that it reveals the current trend in DB procurement for green projects. The findings also provide owners, architects, engineers, and constructors with an effective means of communicating sustainability objectives in solicitation documents.
Abstract:
Detection and characterisation of structural modifications of a hindered amine light stabiliser (HALS) directly from a polyester-based coil coating have been achieved by desorption electrospray ionisation mass spectrometry (DESI-MS) for the first time. In situ detection is made possible by exposing the coating to an acetone vapour atmosphere prior to analysis; this gentle, non-destructive treatment allows diffusion of analyte to the surface without promoting lateral migration. Using this approach, a major structural modification of the HALS TINUVIN®123 (bis(1-octyloxy-2,2,6,6-tetramethyl-4-piperidyl) sebacate) was discovered, in which one N-ether piperidine moiety (N-OC8H17) is converted to a secondary piperidine (N–H). Using two-dimensional DESI-MS imaging, the modification was observed to arise at high curing temperatures (ca. 260 °C) and under simulated physiological conditions (80 °C, full solar spectrum). It is proposed that the secondary piperidine derivative results from a highly reactive aminyl radical intermediate produced by N–O homolytic bond cleavage. The nature of the bond cleavage is also suggested by ESR spin-trapping experiments employing α-phenyl-N-tert-butyl nitrone (PBN) in toluene at 80 °C. The presence of a secondary piperidine derivative in situ, and the implication that N–OR cleavage competes with NO–R bond cleavage, suggest an alternative pathway for generation of the nitroxyl radical: an essential requirement in antioxidant activity that has not previously been described for the N-ether sub-class of HALS.
Abstract:
Protein N-terminal acetylation (Nt-acetylation) is an important mediator of protein function, stability, sorting, and localization. Although the responsible enzymes are thought to be fairly well characterized, the lack of identified in vivo substrates, the occurrence of Nt-acetylation substrates displaying as yet uncharacterized N-terminal acetyltransferase (NAT) specificities, and emerging evidence of posttranslational Nt-acetylation necessitate the use of genetic models and quantitative proteomics. NatB, which targets Met-Glu-, Met-Asp-, and Met-Asn-starting protein N termini, is presumed to Nt-acetylate 15% of all yeast and 18% of all human proteins. Here we report on the evolutionary traits of NatB from yeast to human and demonstrate that ectopically expressed hNatB in a yNatB-Δ yeast strain partially complements the natB-Δ phenotypes and partially restores the yNatB Nt-acetylome. Overall, combining quantitative N-terminomics with yeast studies and knockdown of hNatB in human cell lines led to the unambiguous identification of 180 human and 110 yeast NatB substrates. Interestingly, these substrates included Met-Gln-starting N termini, which are thus now classified as in vivo NatB substrates. We also demonstrate the requirement of hNatB activity for maintaining the structure and function of actomyosin fibers and for proper cellular migration. In addition, expression of tropomyosin-1 restored the altered focal adhesions and cellular migration defects observed in hNatB-depleted HeLa cells, indicative of the conserved link between NatB, tropomyosin, and actin cable function from yeast to human.
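As a small illustration of the substrate specificity described above (Met-Glu-, Met-Asp-, Met-Asn- and, per this study, Met-Gln-starting N termini), the following sketch flags putative NatB substrates from a protein sequence; the function name and example sequences are hypothetical.

```python
# NatB specificity sketch (hypothetical helper; example sequences invented).
NATB_SECOND_RESIDUES = {"E", "D", "N", "Q"}   # Glu, Asp, Asn, Gln

def is_putative_natb_substrate(sequence: str) -> bool:
    """True if the N terminus matches the Met-X pattern recognised by NatB."""
    return (len(sequence) >= 2
            and sequence[0] == "M"
            and sequence[1] in NATB_SECOND_RESIDUES)

print(is_putative_natb_substrate("MDELSSK"))   # True  (Met-Asp-)
print(is_putative_natb_substrate("MSKGEEL"))   # False (Met-Ser-)
```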
Abstract:
Copyright, it is commonly said, matters in society because it encourages the production of socially beneficial, culturally significant expressive content. Our focus on copyright's recent history, however, blinds us to the social information practices that have always existed. In this Article, we examine these social information practices and query copyright's role within them. We posit a functional model of what is necessary for creative content to move from creator to user: the functions of creation, selection, production, dissemination, promotion, sale, and use of expressive content. We demonstrate how centralized commercial control of information content has been the driving force behind copyright's expansion. All of the functions that copyright industries once controlled, however, are undergoing revolutionary decentralization and disintermediation. Different aspects of information technology, notably the digitization of information, widespread computer ownership, the rise of the Internet, and the development of social software, threaten the viability and desirability of centralized control over every one of the content functions. These functions are increasingly being performed by individuals and disaggregated groups. This raises an issue for copyright as the main regulatory force in information practices: copyright assumes a requirement of central control that no longer applies to the development of expressive content. We examine the normative implications of this shift for our information policy in this new post-copyright era. Most notably, we conclude that copyright law needs to be adjusted to recognize the opportunity and desirability of decentralized content, and the expanded marketplace of ideas it promises.
Abstract:
Objectives: To (i) identify predictors of admission and (ii) describe outcomes for patients who arrived via ambulance at three Australian public Emergency Departments (EDs), before and after the opening of 41 additional ED beds within the area. Methods: A retrospective, comparative cohort study using deterministically linked health data collected between 3 September 2006 and 2 September 2008. Data included ambulance offload delay, time to see doctor, ED length of stay (ED LOS), admission requirement, access block, hospital length of stay and in-hospital mortality. Logistic regression analysis was undertaken to identify predictors of hospital admission. Results: One third of all 286,037 ED presentations were via ambulance (n = 79,196) and 40.3% required admission. After the increase in emergency capacity, the only outcome measure to improve was in-hospital mortality; ambulance offload delay, time to see doctor, ED LOS, admission requirement, access block and hospital length of stay did not improve. Strong predictors of admission both before and after the increase in capacity included age over 65 years, Australasian Triage Scale (ATS) category 1-3, diagnoses of circulatory or respiratory conditions, and ED LOS > 4 hours. With additional capacity, the odds ratios for these predictors increased for age > 65 and ED LOS > 4 hours, and decreased for triage category and ED diagnoses. Conclusions: Expanding ED capacity from 81 to 122 beds within a health service area impacted favourably on mortality outcomes but not on time-related service outcomes such as ambulance offload time, time to see doctor and ED LOS. To improve all service outcomes when altering (increasing/decreasing) ED bed numbers, the whole healthcare system needs to be considered.
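As a hedged sketch of the kind of analysis reported above, the following fits a logistic regression to synthetic data and reports odds ratios for three illustrative predictors (age over 65, triage category 1-3, ED LOS > 4 hours); the data, coefficients, and variable names are invented and do not reproduce the study's linked dataset or results.

```python
# Logistic regression sketch on synthetic data (illustrative only).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.integers(0, 2, n),    # age over 65 (0/1)
    rng.integers(0, 2, n),    # triage category 1-3 (0/1)
    rng.integers(0, 2, n),    # ED LOS > 4 hours (0/1)
])
# Synthetic admission outcome loosely driven by the three predictors.
linpred = -1.0 + 0.9 * X[:, 0] + 0.7 * X[:, 1] + 1.2 * X[:, 2]
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-linpred))).astype(float)

result = sm.Logit(y, sm.add_constant(X)).fit(disp=False)
odds_ratios = np.exp(result.params[1:])       # one odds ratio per predictor
print(dict(zip(["age_over_65", "triage_1_3", "ed_los_over_4h"],
               odds_ratios.round(2))))
```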
Abstract:
In 1989 the first National Women's Health Policy was launched in Australia. Now, 20 years later, the Federal Government has announced plans for the development of a new National Women's Health Policy to address the health needs of Australian women. The Policy will be based on five principles: gender equity; health equity between women; a focus on prevention; an evidence base for interventions; and a life course approach. This editorial examines the role of law in the development of the new National Women's Health Policy. It considers the relevance of regulatory frameworks for health research in supporting an evidence base for health interventions, and analyses the requirement in the National Health and Medical Research Council's National Statement on Ethical Conduct in Human Research for the "fair inclusion" of research participants. The editorial argues for a holistic approach to women's health that encompasses regulatory frameworks for research, the identification of funding priorities for research, and a dedicated government department or agency to promote women's health.