Abstract:
HtrA (High Temperature Requirement A) is a critical stress response protease and chaperone for many bacteria. HtrA is a multitasking protein which can degrade unfolded proteins, conduct specific proteolysis of some substrates for correct assembly, interact with substrates to ensure correct folding, assembly or localisation, and chaperone unfolded proteins. These functions are critical for the virulence of a number of bacterial pathogens, in some cases not simply due to the broad activities of HtrA in protection against the protein stress conditions which occur during virulence, but also due to the role of HtrA in either specific proteolysis or assembly of key protein substrates which function directly in virulence. Remarkably, these activities are all conducted without any requirement for ATP. The biochemical mechanism of HtrA relies both on the chymotryptic serine protease active site and on the presence of two PDZ (protein binding) domains. The mechanism is a unique combination of activation by substrate motifs, which alter the conformation of the active site, and assembly into a multimeric complex which has enhanced degradation activity and may also act as a protective cage for proteins which are not degraded. The role of this protease in the pathogenesis of a number of bacteria and the details of its distinctive biochemical activation and assembly mechanisms are discussed in this chapter.
Abstract:
Understanding the physical characteristics of the indoor environment that affect human health and wellbeing is the key requirement underpinning the beneficial design of a healthcare facility (HCF). We reviewed and summarised physical factors of the indoor environment reported to affect human health and wellbeing in HCFs. Research materials included articles identified in a Pubmed search, guidelines, books, reports and monographs, as well as the bibliographies of review articles in the area studied. Of these, 209 publications were selected for this review. According to the literature, there is evidence that the following physical factors of the indoor environment affect the health and wellbeing of human beings in an HCF: safety, ventilation and HVAC systems, thermal environment, acoustic environment, interior layout and room type, windows (including daylight and views), nature and gardens, lighting, colour, floor covering, furniture and its placement, ergonomics, wayfinding, artworks and music. Some of these, in themselves, directly promote or hinder health and wellbeing, but the physical factors may also have numerous indirect impacts by influencing the behaviour, actions, and interactions of patients, their families and the staff members. The findings of this research enable a good understanding of the effects of the different physical factors of the indoor environment on health and wellbeing, and provide a practical resource both for those responsible for designing and operating the facilities and for researchers investigating these factors. However, more studies are needed in order to inform the design of optimally beneficial indoor environments in HCFs for all user groups.
Abstract:
BACKGROUND: A long length of stay (LOS) in the emergency department (ED) associated with overcrowding has been found to adversely affect the quality of ED care. The objective of this study is to determine whether patients who speak a language other than English at home have a longer LOS in EDs compared to those who speak only English at home. METHODS: A secondary data analysis of a Queensland state-wide hospital EDs dataset (Emergency Department Information System) was conducted for the period, 1 January 2008 to 31 December 2010. RESULTS: The interpreter requirement was the highest among Vietnamese speakers (23.1%) followed by Chinese (19.8%) and Arabic speakers (18.7%). There were significant differences in the distributions of the departure statuses among the language groups (Chi-squared=3236.88, P<0.001). Compared with English speakers, the Beta coefficient for the LOS in the EDs measured in minutes was among Vietnamese, 26.3 (95%CI: 22.1–30.5); Arabic, 10.3 (95%CI: 7.3–13.2); Spanish, 9.4 (95%CI: 7.1–11.7); Chinese, 8.6 (95%CI: 2.6–14.6); Hindi, 4.0 (95%CI: 2.2–5.7); Italian, 3.5 (95%CI: 1.6–5.4); and German, 2.7 (95%CI: 1.0–4.4). The final regression model explained 17% of the variability in LOS. CONCLUSION: There is a close relationship between the language spoken at home and the LOS at EDs, indicating that language could be an important predictor of prolonged LOS in EDs and improving language services might reduce LOS and ease overcrowding in EDs in Queensland's public hospitals.
Abstract:
The recent decision of the Court of Appeal in AGL Sales (Qld) Pty Ltd v Dawson Sales Pty Ltd [2009] QCA 262 provides clear direction on the Court’s expectations of a party seeking leave to appeal a costs order. This decision is likely to impact upon common practice in relation to appeals against costs orders. It sends a clear message to trial judges that they should not give leave as of course when giving a judgment in relation to costs, and that parties seeking leave under s 253 of the Supreme Court Act 1995 (Qld) should make a separate application. The application should be supported by material presenting an arguable case that the trial judge made an error in the exercise of the discretion of the kind described in House v The King (1936) 55 CLR 499. A different, and interesting, aspect of this appeal is that it was the first wholly electronic civil appeal. The court-provided technology had been adopted at trial, and the Court of Appeal dispensed with any requirement for hard copy appeal record books.
Abstract:
In Deppro Pty Ltd v Hannah [2008] QSC 193 one of the matters considered by the court related to the requirement in r 243 of the Uniform Civil Procedure Rules 1999 (Qld) that a notice of non-party disclosure must “state the allegation in issue in the pleadings about which the document sought is directly relevant.” The approach adopted by the issuing party in this case of asserting that documents sought by a notice of non-party disclosure are relevant to allegations in numbered paragraphs in pleadings, and serving copies of the pleadings with the notice, is not uncommon in practice. This decision makes it clear that this practice is fraught with danger. In circumstances where it is not apparent that the non-party has been fully apprised of the relevant issues, the decision suggests an applicant for non-party disclosure who has not complied with the requirements of r 243 might be required to issue a fresh, fully compliant notice, and to suffer associated costs consequences.
Abstract:
Modelling how a word is activated in human memory is an important requirement for determining the probability of recall of a word in an extra-list cueing experiment. Previous research assumed a quantum-like model in which the semantic network was modelled as entangled qubits; however, the level of activation was clearly being over-estimated. This paper explores three variations of this model, each of which is distinguished by a scaling factor designed to compensate for the overestimation.
Abstract:
The generation of a correlation matrix from a large set of long gene sequences is a common requirement in many bioinformatics problems such as phylogenetic analysis. The generation is not only computationally intensive but also requires significant memory resources, since typically only a few gene sequences can be stored simultaneously in primary memory. The standard practice in such computation is to use frequent input/output (I/O) operations, so minimizing the number of these operations yields much faster run-times. This paper develops an approach for the faster and scalable computing of large-size correlation matrices through the full use of available memory and a reduced number of I/O operations. The approach is scalable in the sense that the same algorithms can be executed on different computing platforms with different amounts of memory and can be applied to different problems with different correlation matrix sizes. The significant performance improvement of the approach over existing approaches is demonstrated through benchmark examples.
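The block-wise idea behind such memory-aware correlation computation can be sketched as follows. This is a generic illustration under stated assumptions, not the paper's algorithm: it assumes the sequences have already been encoded as rows of a numeric matrix, and processes the matrix in row blocks small enough that only two blocks need be resident at a time (on a real dataset each block would be read from disk, which is where the I/O savings arise).

```python
import numpy as np

def blockwise_correlation(data, block_size):
    """Correlation matrix of the rows of `data`, computed block-pair
    by block-pair so only two row blocks are held in memory at once."""
    n = data.shape[0]
    corr = np.empty((n, n))
    for i in range(0, n, block_size):
        bi = data[i:i + block_size]          # in practice: read from disk
        for j in range(i, n, block_size):
            bj = data[j:j + block_size]
            # Centre and normalise each row, then Pearson correlation
            # reduces to a plain dot product between rows.
            ci = bi - bi.mean(axis=1, keepdims=True)
            cj = bj - bj.mean(axis=1, keepdims=True)
            ci /= np.linalg.norm(ci, axis=1, keepdims=True)
            cj /= np.linalg.norm(cj, axis=1, keepdims=True)
            block = ci @ cj.T
            # Fill both the (i, j) block and its symmetric mirror.
            corr[i:i + block_size, j:j + block_size] = block
            corr[j:j + block_size, i:i + block_size] = block.T
    return corr
```

Because the matrix is symmetric, only the upper-triangular block pairs are visited, roughly halving the work and the number of block reads.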
Abstract:
Ascorbic acid or vitamin C is involved in a number of biochemical pathways that are important to exercise metabolism and the health of exercising individuals. This review reports the results of studies investigating the requirement for vitamin C with exercise on the basis of dietary vitamin C intakes, the response to supplementation and alterations in plasma, serum, and leukocyte ascorbic acid concentration following both acute exercise and regular training. The possible physiological significance of changes in ascorbic acid with exercise is also addressed. Exercise generally causes a transient increase in circulating ascorbic acid in the hours following exercise, but a decline below pre-exercise levels occurs in the days after prolonged exercise. These changes could be associated with increased exercise-induced oxidative stress. On the basis of alterations in the concentration of ascorbic acid within the blood, it remains unclear if regular exercise increases the metabolism of vitamin C. However, the similar dietary intakes and responses to supplementation between athletes and nonathletes suggest that regular exercise does not increase the requirement for vitamin C in athletes. Two novel hypotheses are put forward to explain recent findings of attenuated levels of cortisol postexercise following supplementation with high doses of vitamin C.
Abstract:
This article examines the law in Australia and New Zealand that governs the withholding and withdrawal of ‘futile’ life-sustaining treatment. Although doctors have both civil and criminal law duties to treat patients, those general duties do not require the provision of treatment that is deemed to be futile. This is either because futile treatment is not in a patient’s best interests or because stopping such treatment does not breach the criminal law. This means, in the absence of a duty to treat, doctors may unilaterally withdraw or withhold treatment that is futile; consent is not required. The article then examines whether this general position has been altered by statute. It considers a range of suggested possible legislation but concludes it is likely that only Queensland’s adult guardianship legislation imposes a requirement to obtain consent to withhold or withdraw such treatment.
Abstract:
Ambiguity resolution plays a crucial role in real-time kinematic GNSS positioning, which gives centimetre-precision positioning results if all the ambiguities in each epoch are correctly fixed to integers. However, incorrectly fixed ambiguities can result in large positioning offsets of up to several metres without notice. Hence, ambiguity validation is essential to control the quality of ambiguity resolution. Currently, the most popular ambiguity validation method is the ratio test, whose criterion is often empirically determined. An empirically determined criterion can be dangerous, because a fixed criterion cannot fit all scenarios and does not directly control the ambiguity resolution risk. In practice, depending on the underlying model strength, the ratio test criterion can be too conservative for some models and too risky for others. A more rational test method is to determine the criterion according to the underlying model and the user requirement. Undetected incorrect integers lead to a hazardous result, which should be strictly controlled; in ambiguity resolution, the rate of such missed detections is known as the failure rate. In this paper, a fixed failure rate ratio test method is presented and applied in the analysis of GPS and Compass positioning scenarios. The fixed failure rate approach is derived from integer aperture estimation theory, which is theoretically rigorous. In this approach, the criteria table for the ratio test is computed from extensive data simulations, and real-time users can determine the ratio test criterion by looking up the criteria table. This method has been applied to medium-distance GPS ambiguity resolution, but multi-constellation and high-dimensional scenarios have not been discussed so far. In this paper, a general ambiguity validation model is derived based on hypothesis test theory, the fixed failure rate approach is introduced, and in particular the relationship between the ratio test threshold and the failure rate is examined.
Finally, the factors that influence the ratio test threshold under the fixed failure rate approach are discussed on the basis of extensive data simulation. The results show that the fixed failure rate approach is a more reasonable ambiguity validation method, provided a proper stochastic model is used.
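As a rough illustration of the two steps described above, the ratio test compares the best and second-best integer candidates, and the fixed failure rate variant replaces a hard-coded threshold with a lookup keyed on the model (here, ambiguity dimension) and the user's target failure rate. The threshold numbers below are invented placeholders for illustration, not the paper's computed criteria table:

```python
def ratio_test(best_sq_norm, second_sq_norm, threshold):
    """Accept the fixed integer solution only if the second-best
    candidate's squared residual norm is sufficiently larger than
    the best candidate's (i.e. the best fix is clearly better)."""
    return second_sq_norm / best_sq_norm >= threshold

# Hypothetical criteria table: (ambiguity dimension, target failure
# rate) -> ratio test threshold.  A real table would be precomputed
# from extensive simulation of the underlying model.
CRITERIA_TABLE = {
    (4, 0.01): 2.5,
    (4, 0.001): 3.8,
    (8, 0.01): 1.9,
}

def accept_fix(best_sq_norm, second_sq_norm, dim, failure_rate):
    """Fixed failure rate ratio test: look up the threshold for this
    model strength and risk requirement, then apply the ratio test."""
    threshold = CRITERIA_TABLE[(dim, failure_rate)]
    return ratio_test(best_sq_norm, second_sq_norm, threshold)
```

The point of the table lookup is that the same measured ratio can be safe for a strong model yet risky for a weak one, so the acceptance threshold must adapt to the model rather than stay fixed.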
Abstract:
The design-build (DB) delivery system is an effective means of delivering a green construction project, and selecting an appropriate contractor is critical to project success. Moreover, the delivery of green buildings requires specific design, construction and operation and maintenance considerations not generally encountered in the procurement of conventional buildings. Specifying clear sustainability requirements to potential contractors is particularly important in achieving sustainable project goals. However, many client/owners either do not explicitly specify sustainability requirements or do so in a prescriptive manner during the project procurement process. This paper investigates the current state-of-the-art procurement process used in specifying the sustainability requirements of the public sector in the USA construction market by means of a robust content analysis of 40 design-build requests for proposals (RFPs). The results of the content analysis indicate that the sustainability requirement is one of the most important dimensions in the best-value evaluation of DB contractors. Client/owners predominantly specify the LEED certification levels (e.g. LEED Certified, Silver, Gold, and Platinum) for a particular facility, and include the sustainability requirements as selection criteria (with specific importance weightings) for contractor evaluation. Additionally, larger size projects tend to allocate higher importance weightings to sustainability requirements. This study provides public DB client/owners with a number of practical implications for selecting appropriate design-builders for sustainable DB projects.
Abstract:
Global Navigation Satellite Systems (GNSS)-based observation systems can provide high precision positioning and navigation solutions in real time, in the order of subcentimetre if we make use of carrier phase measurements in the differential mode and deal with all the bias and noise terms well. However, these carrier phase measurements are ambiguous due to unknown, integer numbers of cycles. One key challenge in the differential carrier phase mode is to fix the integer ambiguities correctly. On the other hand, in safety-of-life or liability-critical applications, such as vehicle safety positioning and aviation, not only is high accuracy required, but the reliability requirement is also important. This PhD research studies how to achieve high reliability for ambiguity resolution (AR) in a multi-GNSS environment. GNSS ambiguity estimation and validation problems are the focus of the research effort. In particular, we study the case of multiple constellations, covering the initial to full operation of the foreseeable Galileo, GLONASS, Compass and QZSS navigation systems from the next few years to the end of the decade. Since real observation data is only available from the GPS and GLONASS systems, the simulation method named Virtual Galileo Constellation (VGC) is applied to generate observational data from another constellation in the data analysis. In addition, both full ambiguity resolution (FAR) and partial ambiguity resolution (PAR) algorithms are used in processing single and dual constellation data. Firstly, a brief overview of related work on AR methods and reliability theory is given. Next, a modified inverse integer Cholesky decorrelation method and its performance on AR are presented. Subsequently, a new measure of decorrelation performance called orthogonality defect is introduced and compared with other measures.
Furthermore, a new AR scheme considering the ambiguity validation requirement in the control of the search space size is proposed to improve the search efficiency. With respect to the reliability of AR, we also discuss the computation of the ambiguity success rate (ASR) and confirm that the success rate computed with the integer bootstrapping method is quite a sharp approximation to the actual integer least-squares (ILS) success rate. The advantages of multi-GNSS constellations are examined in terms of the PAR technique involving the predefined ASR. Finally, a novel satellite selection algorithm for reliable ambiguity resolution called SARA is developed. In summary, the study demonstrates that when the ASR is close to one, the reliability of AR can be guaranteed and the ambiguity validation is effective. The work then focuses on new strategies to improve the ASR, including a partial ambiguity resolution procedure with a predefined success rate and a novel satellite selection strategy with a high success rate. The proposed strategies bring significant benefits of multi-GNSS signals to real-time high precision and high reliability positioning services.
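The integer bootstrapping success rate mentioned above has a well-known closed form, P_s = ∏_i (2Φ(1/(2σ_i)) − 1), where the σ_i are the conditional standard deviations of the (decorrelated) ambiguities and Φ is the standard normal CDF. A minimal sketch, assuming those conditional standard deviations have already been obtained (e.g. from the triangular decomposition of the ambiguity covariance matrix):

```python
import math

def bootstrap_success_rate(cond_std_devs):
    """Integer bootstrapping success rate:
        P_s = prod_i ( 2 * Phi(1 / (2 * sigma_i)) - 1 )
    where sigma_i are the conditional standard deviations of the
    sequentially rounded ambiguities and Phi is the N(0,1) CDF."""
    def phi(x):
        # Standard normal CDF via the error function.
        return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    p = 1.0
    for sigma in cond_std_devs:
        p *= 2.0 * phi(1.0 / (2.0 * sigma)) - 1.0
    return p
```

The formula makes the qualitative behaviour easy to see: small conditional standard deviations (a strong, well-decorrelated model) drive each factor, and hence the ASR, towards one, while every additional ambiguity multiplies in another factor below one.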
Abstract:
In Balnaves v Smith [2012] QSC 408 Byrne SJA concluded that an offer to settle could be an “offer to settle” under Chapter 9 Part 5 of the Uniform Civil Procedure Rules 1999 (Qld) (UCPR) despite the inclusion of non-monetary terms. His Honour took a different approach to that taken by Moynihan SJA in Taske v Occupational & Medical Innovations Ltd [2007] QSC 147.
Abstract:
Property is an important factor of production that all businesses require in order to function. Nourse (1990) observed that “some businesses are real estate, all businesses use real estate”. In recent years, the management of property assets has become the focus of many organisations, including non-real estate businesses. Good asset management is concerned with the effective utilisation of a property owner’s assets. It is the management process of ensuring that the portfolio of properties held meets the overall requirements of the users. In short, it is the process of identifying the user’s requirement and rationalising property holdings to best match that requirement, followed by monitoring and ongoing review of practice. In Malaysia, federal agencies and local authorities are among the largest property asset owners. Recently the federal government released a Total Asset Management Manual (TAMM), which is at the preliminary stage of implementation. This thesis studies the international practices of asset management of public sector assets and assesses the effectiveness of TAMM. The research focuses on current international practices for the effective management of public sector property assets. The current application in Malaysia is highlighted to determine the level of awareness and understanding of these practices in relation to the recently released TAMM. This is exploratory research, relying on a combination of qualitative and quantitative approaches: the qualitative approach focuses on international practices and their application to the management of public sector property assets, while in the quantitative approach a questionnaire survey will be conducted among Malaysian public property asset managers and users to gauge collective opinion on the current practices of TAMM and its implementation.
Abstract:
Robust hashing is an emerging field that can be used to hash certain data types in applications unsuitable for traditional cryptographic hashing methods. Traditional hashing functions have been used extensively for data/message integrity, data/message authentication, efficient file identification and password verification. These applications are possible because the hashing process is compressive, allowing for efficient comparisons in the hash domain, and non-invertible, meaning hashes can be used without revealing the original data. These techniques were developed with deterministic (non-changing) inputs such as files and passwords. For such data types a 1-bit or one-character change can be significant; as a result, the hashing process is sensitive to any change in the input. Unfortunately, there are certain applications where input data are not perfectly deterministic and minor changes cannot be avoided. Digital images and biometric features are two types of data where such changes exist but do not alter the meaning or appearance of the input. For such data types cryptographic hash functions cannot be usefully applied. In light of this, robust hashing has been developed as an alternative to cryptographic hashing and is designed to be robust to minor changes in the input. Although similar in name, robust hashing is fundamentally different from cryptographic hashing. Current robust hashing techniques are not based on cryptographic methods, but instead on pattern recognition techniques. Modern robust hashing algorithms consist of feature extraction followed by a randomization stage that introduces non-invertibility and compression, followed by quantization and binary encoding to produce a binary hash output. In order to preserve the robustness of the extracted features, most randomization methods are linear, and this is detrimental to the security aspects required of hash functions.
Furthermore, the quantization and encoding stages used to binarize real-valued features require the learning of appropriate quantization thresholds. How these thresholds are learnt has an important effect on hashing accuracy, and the mere presence of such thresholds is a source of information leakage that can reduce hashing security. This dissertation outlines a systematic investigation of the quantization and encoding stages of robust hash functions. While existing literature has focused on the importance of the quantization scheme, this research is the first to emphasise the importance of the quantizer training on both hashing accuracy and hashing security. The quantizer training process is presented in a statistical framework which allows a theoretical analysis of the effects of quantizer training on hashing performance. This is experimentally verified using a number of baseline robust image hashing algorithms over a large database of real world images. This dissertation also proposes a new randomization method for robust image hashing based on Higher Order Spectra (HOS) and Radon projections. The method is non-linear, and this is an essential requirement for non-invertibility. The method is also designed to produce features more suited for quantization and encoding. The system can operate without the need for quantizer training, is more easily encoded and displays improved hashing performance when compared to existing robust image hashing algorithms. The dissertation also shows how the HOS method can be adapted to work with biometric features obtained from 2D and 3D face images.
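The generic pipeline described above (a key-dependent randomization stage followed by quantization to a binary code) can be sketched as follows. This is a toy illustration of the general scheme, not the HOS/Radon method proposed in the dissertation: it uses a linear random projection seeded by a secret key, and quantizes against a fixed zero threshold so no quantizer training is needed for this simple symmetric case.

```python
import numpy as np

def robust_hash(features, n_bits, key=0):
    """Toy robust hash: key-seeded random projection (randomization
    and compression stage), then sign quantization to a binary code.
    Small perturbations of `features` flip few output bits."""
    rng = np.random.default_rng(key)
    proj = rng.standard_normal((n_bits, features.size))
    projected = proj @ features
    # Quantize each projected value against a threshold; zero is the
    # natural choice here because the projections are symmetric
    # around zero, sidestepping a separate threshold-learning step.
    return (projected > 0).astype(np.uint8)
```

Robustness comes from the projection averaging over the whole feature vector: a small perturbation only flips the bits whose projected values already lie near the quantization threshold, while an unrelated input produces a code that differs in roughly half the bits.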