919 results for bargaining requirement


Relevance: 10.00%

Publisher:

Abstract:

Database watermarking has received significant research attention in the current decade. Almost all watermarking models to date, however, have been either irreversible (the original relation cannot be restored from the watermarked relation) and/or non-blind (requiring the original relation to detect the watermark in the watermarked relation). Such models have several disadvantages compared with reversible, blind watermarking (which requires only the watermarked relation and a secret key, from which the watermark is detected and the original relation is restored): the rightful owner cannot be identified after a successful secondary watermarking attack, the relation cannot be reverted to the original data set (a requirement in high-precision industries), and the unmarked relation must be stored in secure secondary storage. To overcome these problems, we propose a watermarking scheme that is both reversible and blind. We utilize difference expansion on integers to achieve reversibility. The major advantages of our scheme are reversibility to a high-quality original data set, rightful owner identification, resistance against secondary watermarking attacks, and no need to store the original database in secure secondary storage.
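The reversibility mechanism named in the abstract, difference expansion on integers, can be sketched in a few lines (a generic Tian-style pair transform for illustration; the function names and the exact embedding used by the proposed scheme are assumptions, not taken from the paper):

```python
def embed(x, y, bit):
    """Hide one watermark bit in an integer pair via difference expansion."""
    l = (x + y) // 2        # integer average (preserved by the embedding)
    h = x - y               # difference
    h2 = 2 * h + bit        # expanded difference carries the bit
    return l + (h2 + 1) // 2, l - h2 // 2

def extract(x2, y2):
    """Recover the original pair and the bit (blind: no originals needed)."""
    h2 = x2 - y2
    bit = h2 & 1
    h = h2 >> 1             # floor shift, correct for negative differences too
    l = (x2 + y2) // 2      # the average is unchanged, so it can be recomputed
    return l + (h + 1) // 2, l - h // 2, bit
```

Because extraction needs only the watermarked pair the sketch is blind, and because the original pair is restored exactly it is reversible, the two properties the scheme combines. A real scheme must additionally handle overflow and select expandable value pairs under a secret key.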

Relevance: 10.00%

Publisher:

Abstract:

Bandwidths and offsets are important components in vehicle traffic control strategies. This article proposes new methods for quantifying and selecting them. Bandwidth is the amount of green time available for vehicles to travel through adjacent intersections without having to stop at the second traffic light. The offset is the difference between the starting times of "green" periods at two adjacent intersections along a given route. The core ideas in this article were developed during the 2013 Maths and Industry Study Group in Brisbane, Australia. Analytical expressions for computing bandwidth, as a function of offset, are developed. An optimisation model, for selecting offsets across an arterial, is proposed. The focus is on arterial roads, where bandwidth and offset have a greater impact than in a full traffic network. A generic optimisation-simulation approach is also proposed to refine an initial starting solution according to a specified metric. A metric that reflects the number of stops, and the distance between stops, is proposed to explicitly reduce the dissatisfaction of road users and to implicitly reduce fuel consumption and emissions. Conceptually, the optimisation-simulation approach is superior because it handles real-life complexities and is a global optimisation approach. The models and equations in this article can be used in road planning and traffic control.
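As a toy companion to the definitions above (a brute-force count on a one-second grid rather than the article's analytical expressions; all signal timings are hypothetical), bandwidth can be measured as the number of departure seconds that meet a green signal at both intersections:

```python
def bandwidth(cycle, green1, green2, offset, travel):
    """Seconds per cycle in which a vehicle clears both lights without stopping.

    green1/green2 are (start, duration) of the green window within the cycle;
    offset shifts the second signal's green start; travel is the link travel time.
    """
    def is_green(t, start, duration):
        return (t - start) % cycle < duration

    s1, g1 = green1
    s2, g2 = green2
    return sum(
        1 for depart in range(cycle)
        if is_green(depart, s1, g1)
        and is_green(depart + travel, s2 + offset, g2)
    )
```

With a 60 s cycle, 30 s greens, and a 10 s travel time, an offset of 10 s preserves the full 30 s band, while an offset of 30 s with zero travel time destroys it entirely; this is the bandwidth-versus-offset trade-off that the proposed optimisation model selects offsets to exploit.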

Relevance: 10.00%

Publisher:

Abstract:

The design-build (DB) system is regarded as an effective means of delivering sustainable buildings. Specifying clear sustainability requirements to potential contractors is of great importance to project success. This research investigates the current state of practice for the definition of sustainability requirements within the public sectors of the U.S. construction market, using a robust content analysis of 49 DB requests for proposals (RFPs). The results reveal that owners predominantly communicate their desired level of sustainability through the LEED certification system. The sustainability requirement has become an important dimension in the best-value evaluation of DB contractors, with specific importance weightings of up to 25%. Additionally, owners of larger projects, and those who provide less design information in their RFPs, generally allocate significantly higher importance weightings to sustainability requirements. The primary knowledge contribution of this study to the construction industry is that it reveals current trends in DB procurement for green projects. The findings also provide owners, architects, engineers, and constructors with an effective means of communicating sustainability objectives in solicitation documents.

Relevance: 10.00%

Publisher:

Abstract:

Detection and characterisation of structural modifications of a hindered amine light stabiliser (HALS) directly from a polyester-based coil coating have been achieved by desorption electrospray ionisation mass spectrometry (DESI-MS) for the first time. In situ detection is made possible by exposing the coating to an acetone vapour atmosphere prior to analysis. This is a gentle and non-destructive treatment that allows diffusion of analyte to the surface without promoting lateral migration. Using this approach, a major structural modification of the HALS TINUVIN®123 (bis(1-octyloxy-2,2,6,6-tetramethyl-4-piperidyl) sebacate) was discovered, in which one N-ether piperidine moiety (N-OC8H17) is converted to a secondary piperidine (N–H). With the use of two-dimensional DESI-MS imaging, the modification was observed to arise during high-temperature curing (ca. 260 °C) and under simulated physiological conditions (80 °C, full solar spectrum). It is proposed that the secondary piperidine derivative results from a highly reactive aminyl radical intermediate produced by N–O homolytic bond cleavage. The nature of the bond cleavage is further supported by ESR spin-trapping experiments employing α-phenyl-N-tert-butyl nitrone (PBN) in toluene at 80 °C. The presence of a secondary piperidine derivative in situ, and the implication of N–OR competing with NO–R bond cleavage, suggest an alternative pathway for generation of the nitroxyl radical, an essential requirement in anti-oxidant activity that has not previously been described for the N-ether sub-class of HALS.

Relevance: 10.00%

Publisher:

Abstract:

Protein N-terminal acetylation (Nt-acetylation) is an important mediator of protein function, stability, sorting, and localization. Although the responsible enzymes are thought to be fairly well characterized, the lack of identified in vivo substrates, the occurrence of Nt-acetylation substrates displaying as yet uncharacterized N-terminal acetyltransferase (NAT) specificities, and emerging evidence of posttranslational Nt-acetylation necessitate the use of genetic models and quantitative proteomics. NatB, which targets Met-Glu-, Met-Asp-, and Met-Asn-starting protein N termini, is presumed to Nt-acetylate 15% of all yeast and 18% of all human proteins. Here we report on the evolutionary traits of NatB from yeast to human and demonstrate that ectopically expressed hNatB in a yNatB-Δ yeast strain partially complements the natB-Δ phenotypes and partially restores the yNatB Nt-acetylome. Overall, combining quantitative N-terminomics with yeast studies and knockdown of hNatB in human cell lines led to the unambiguous identification of 180 human and 110 yeast NatB substrates. Interestingly, these substrates included Met-Gln N-termini, which are thus now classified as in vivo NatB substrates. We also demonstrate the requirement of hNatB activity for maintaining the structure and function of actomyosin fibers and for proper cellular migration. In addition, expression of tropomyosin-1 rescued the altered focal adhesions and cellular migration defects observed in hNatB-depleted HeLa cells, indicative of the conserved link between NatB, tropomyosin, and actin cable function from yeast to human.
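The NatB specificity described above reduces to a test on the first two residues of a protein sequence; a minimal sketch (the function name and example sequences are illustrative; Met-Gln is included because this study classifies it as an in vivo NatB substrate):

```python
# NatB target N-termini: Met-Glu (ME), Met-Asp (MD), Met-Asn (MN),
# plus Met-Gln (MQ), newly classified as an in vivo substrate in this study.
NATB_MOTIFS = {"ME", "MD", "MN", "MQ"}

def is_potential_natb_substrate(sequence):
    """True if the protein's first two residues match a NatB motif."""
    return sequence[:2].upper() in NATB_MOTIFS
```

Applying such a prefix test across a proteome is how figures like "15% of all yeast and 18% of all human proteins" are estimated, before experimental confirmation of actual Nt-acetylation.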

Relevance: 10.00%

Publisher:

Abstract:

Copyright, it is commonly said, matters in society because it encourages the production of socially beneficial, culturally significant expressive content. Our focus on copyright's recent history, however, blinds us to the social information practices that have always existed. In this Article, we examine these social information practices, and query copyright's role within them. We posit a functional model of what is necessary for creative content to move from creator to user. These are the functions dealing with the creation, selection, production, dissemination, promotion, sale, and use of expressive content. We demonstrate how centralized commercial control of information content has been the driving force behind copyright's expansion. All of the functions that copyright industries once controlled, however, are undergoing revolutionary decentralization and disintermediation. Different aspects of information technology, notably the digitization of information, widespread computer ownership, the rise of the Internet, and the development of social software, threaten the viability and desirability of centralized control over every one of the content functions. These functions are increasingly being performed by individuals and disaggregated groups. This raises an issue for copyright as the main regulatory force in information practices: copyright assumes a central control requirement that no longer applies for the development of expressive content. We examine the normative implications of this shift for our information policy in this new post-copyright era. Most notably, we conclude that copyright law needs to be adjusted in order to recognize the opportunity and desirability of decentralized content, and the expanded marketplace of ideas it promises.

Relevance: 10.00%

Publisher:

Abstract:

Objectives: To i) identify predictors of admission, and ii) describe outcomes for patients who arrived via ambulance at three Australian public Emergency Departments (EDs), before and after the opening of 41 additional ED beds within the area. Methods: A retrospective, comparative cohort study using deterministically linked health data collected between 3 September 2006 and 2 September 2008. Data included ambulance offload delay, time to see doctor, ED length of stay (ED LOS), admission requirement, access block, hospital length of stay, and in-hospital mortality. Logistic regression analysis was undertaken to identify predictors of hospital admission. Results: One third of all 286,037 ED presentations were via ambulance (n = 79,196) and 40.3% required admission. After the increase in emergency capacity, the only outcome measure to improve was in-hospital mortality; ambulance offload delay, time to see doctor, ED LOS, admission requirement, access block, and hospital length of stay did not improve. Strong predictors of admission both before and after the increase in capacity included age over 65 years, Australasian Triage Scale (ATS) category 1-3, diagnoses of circulatory or respiratory conditions, and ED LOS > 4 hours. With additional capacity, the odds ratios for these predictors increased for age > 65 and ED LOS > 4 hours, and decreased for triage category and ED diagnoses. Conclusions: Expanding ED capacity from 81 to 122 beds within a health service area impacted favourably on mortality outcomes but not on time-related service outcomes such as ambulance offload time, time to see doctor, and ED LOS. To improve all service outcomes when altering (increasing/decreasing) ED bed numbers, the whole healthcare system needs to be considered.
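For a single binary predictor, the odds ratio reported by a logistic regression equals the cross-product ratio of the 2x2 table, so the kind of predictor analysis described above can be sketched as follows (the counts below are entirely hypothetical, not data from the study):

```python
import math

def odds_ratio_2x2(exposed_events, exposed_total,
                   unexposed_events, unexposed_total):
    """Odds ratio and 95% Wald confidence interval from a 2x2 table."""
    a = exposed_events                      # e.g. admitted, age > 65
    b = exposed_total - exposed_events      # not admitted, age > 65
    c = unexposed_events                    # admitted, age <= 65
    d = unexposed_total - unexposed_events  # not admitted, age <= 65
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi
```

A multivariable logistic model would adjust each predictor's odds ratio for the others, which is how the study can report how the odds ratios shifted between the pre- and post-expansion periods.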

Relevance: 10.00%

Publisher:

Abstract:

In 1989 the first National Women's Health Policy was launched in Australia. Now, 20 years later, the Federal Government has announced plans for the development of a new National Women's Health Policy to address the health needs of Australian women. The Policy will be based on five principles: gender equity; health equity between women; a focus on prevention; an evidence base for interventions; and a life course approach. This editorial examines the role for law in the development of a new National Women's Health Policy. It considers the relevance of regulatory frameworks for health research in supporting an evidence base for health interventions and analyses the requirement in the National Health and Medical Research Council's National Statement on Ethical Conduct in Human Research for "fair inclusion" of research participants. The editorial argues for a holistic approach to women's health that includes regulatory frameworks for research, identification of funding priorities for research, and the need for a dedicated government department or agency to promote women's health.

Relevance: 10.00%

Publisher:

Abstract:

This chapter presents the current challenges facing legislators, regulators, researchers, and ethics committees in determining how and when to include women appropriately in research, and in ensuring that sex analysis of research results is routinely performed. It offers five issues that require attention to address these challenges: that national regulatory statements could provide researchers with definitions of the terms ‘sex’, ‘gender’, and ‘gender equity’ in research; that sex and gender analysis should be built into health research protocols; that the lack of internationally comparable data on the rates of inclusion of men and women presents a major hurdle for analysing the efficacy of different regulatory strategies; that the accessibility of data would be improved by a requirement for publications of health research results to include descriptions of the sex analysis performed on research data; and that institutional review boards, research ethics committees, and researchers themselves require better education about the scientific and ethical importance of including women in clinical research.

Relevance: 10.00%

Publisher:

Abstract:

One of the core values to be applied by a body reviewing the ethics of human research is justice. The inclusion of justice as a requirement in the ethical review of human research is relatively recent, and its utility had been largely unexamined until debates arose about the conduct of international biomedical research in the late 1990s. The subsequent amendment of authoritative documents, in ways that appeared to shift the meaning of conceptions of justice, generated a good deal of controversy. Another difficulty has been that both the theory and the substance of justice applied by researchers or reviewers can frequently be seen to be subjective: both the concept of justice (whether distributive or commutative) and what counts as a just distribution or exchange are given different weight and meanings by different people. In this paper, the origins of, and more recent debates about, the requirement to consider justice as a criterion in the ethical review of human research are traced, relevant conceptions of justice are distinguished, and the manner in which they can be applied meaningfully in the ethical review of all human research is identified. The way these concepts are articulated in, and the intent and function of, specific paragraphs of the National Statement on Ethical Conduct in Human Research (NHMRC, ARC, UA, 2007) (National Statement) is explained. The National Statement identifies a number of issues that should be considered when a human research ethics committee reviews the justice aspects of an application. It also provides guidance to researchers on how to show that there is a fair distribution of burdens and benefits in the participant experience and the research outcomes, and practical guidance on how to think through issues of justice so that they can demonstrate that the design of their research projects meets this ethical requirement.

Relevance: 10.00%

Publisher:

Abstract:

BACKGROUND Asthma severity and control can be measured both subjectively and objectively. Sputum analysis for evaluation of the percentage of sputum eosinophils directly measures airway inflammation and is one method of objectively monitoring asthma. Interventions for asthma therapies have traditionally been based on symptoms and spirometry. OBJECTIVES To evaluate the efficacy of tailoring asthma interventions based on sputum analysis, in comparison with clinical symptoms (with or without spirometry/peak flow), for asthma-related outcomes in children and adults. SEARCH STRATEGY We searched the Cochrane Airways Group Specialised Register of Trials, the Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE, EMBASE, and reference lists of articles. The last search was on 31 October 2006. SELECTION CRITERIA All randomised controlled comparisons of adjustment of asthma therapy based on sputum eosinophils compared with traditional methods (primarily clinical symptoms and spirometry/peak flow). DATA COLLECTION AND ANALYSIS Results of searches were reviewed against pre-determined criteria for inclusion. Three sets of reviewers selected relevant studies. Two review authors independently assessed trial quality and extracted data. Authors were contacted for further information, but none was received. Data were analysed as "treatment received" and sensitivity analyses were performed. MAIN RESULTS Three adult studies were included; these studies were clinically and methodologically heterogeneous (use of medications, cut-off for percentage of sputum eosinophils, and definition of asthma exacerbation). There were no eligible paediatric studies. Of 246 participants randomised, 221 completed the trials.
In the meta-analysis, a significant reduction in the number of participants who had one or more asthma exacerbations occurred when treatment was based on sputum eosinophils in comparison with clinical symptoms; the pooled odds ratio (OR) was 0.49 (95% CI 0.28 to 0.87) and the number needed to treat to benefit (NNTB) was 6 (95% CI 4 to 32). There were also differences between groups in the rate of exacerbation (any exacerbation per year) and in the severity of exacerbations, defined by the requirement for oral corticosteroids, but the reduction in hospitalisations was not statistically significant. Data for clinical symptoms, quality of life, and spirometry were not significantly different between groups. The mean dose of inhaled corticosteroids per day was similar in both groups, and no adverse events were reported. However, sputum induction was not always possible. AUTHORS' CONCLUSIONS Tailoring asthma interventions based on sputum eosinophils is beneficial in reducing the frequency of asthma exacerbations in adults with asthma. This review supports the use of sputum eosinophils to tailor asthma therapy for adults with frequent exacerbations and severe asthma. Further studies are needed to strengthen these results, and no conclusion can be drawn for children with asthma.
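The NNTB quoted above can be reproduced from the pooled OR using the standard OR-to-NNT conversion, provided a control-group event rate is assumed (the 50% rate below is an assumption for illustration, not a figure reported by the review):

```python
def nntb_from_or(odds_ratio, control_event_rate):
    """Number needed to treat to benefit, derived from an odds ratio."""
    cer = control_event_rate
    # convert the OR back to an experimental-group event rate
    eer = odds_ratio * cer / (1 - cer + odds_ratio * cer)
    return 1 / abs(cer - eer)  # reciprocal of the absolute risk reduction
```

With OR = 0.49 and an assumed control event rate of 0.5, this gives roughly 5.8, i.e. an NNTB of 6 under that assumption; the NNTB always depends on the baseline risk chosen, which is why it is reported alongside the OR rather than instead of it.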

Relevance: 10.00%

Publisher:

Abstract:

This presentation will explore how BPM research can seamlessly combine the academic requirement of rigor with the aim to impact the practice of Business Process Management. After a brief introduction into the research agendas as they are perceived by different BPM communities, two research projects will be discussed that illustrate how empirically-informed quantitative and qualitative research, combined with design science, can lead to outcomes that BPM practitioners are willing to adopt. The first project studies the practice of process modeling using Information Systems theory, and demonstrates how a better understanding of this practice can inform the design of modeling notations and methods. The second project studies the adoption of process management within organizations, and leads to models of how organizations can incrementally transition to greater levels of BPM maturity. The presentation will conclude with recommendations for how the BPM research and practitioner communities can increasingly benefit from each other.

Relevance: 10.00%

Publisher:

Abstract:

Introduction This study aimed to examine the geometric and dosimetric results when radiotherapy treatment plans are designed for prostate cancer patients with hip prostheses. Methods Ten EBRT treatment plans for localised prostate cancer, in the presence of hip prostheses, were analysed and compared with a reference set of 196 treatment plans for localised prostate cancer in patients without prostheses. Crowe et al.'s TADA code [1] was used to extract treatment plan parameters and evaluate doses to target volumes and critical structures against recommended goals [2] and constraints [3, 4]. Results The need to avoid transmitting the radiation beam through the hip prostheses limited the range of gantry angles available for use in both the rotational (VMAT) and the non-rotational (3DCRT and IMRT) radiotherapy treatments. This geometric limitation (exemplified in the VMAT data shown in Fig. 1) reduced the overall quality of the treatment plans for patients with prostheses compared to the reference plans. All plans with prostheses failed the PTV dose homogeneity requirement [2], whereas only 4% of the plans without prostheses failed this test. Several treatment plans for patients with hip prostheses also failed the QUANTEC requirements that less than 50% of the rectum receive 50 Gy and less than 35% of the rectum receive 60 Gy, to keep the grade 3 toxicity rate below 10% [3], or the Hansen and Roach requirement that less than 25% of the bladder receive 75 Gy [4]. Discussion and conclusions The results of this study exemplify the difficulty of designing prostate radiotherapy treatment plans, in which beams must deliver adequate doses to targeted tissues while avoiding nearby organs at risk, when the presence of hip prostheses limits the available treatment geometries. This work provides qualitative evidence of the compromised dose distributions that can result in such cases.
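The pass/fail tests quoted above (e.g. the QUANTEC rectum constraints) are checks on the cumulative dose-volume histogram; a minimal sketch, assuming the organ dose is available as a flat per-voxel sample in Gy (the dose values in the usage are hypothetical, not study data):

```python
def volume_fraction_at_least(doses, threshold_gy):
    """Fraction of sampled voxels receiving at least threshold_gy."""
    return sum(1 for d in doses if d >= threshold_gy) / len(doses)

def check_rectum_constraints(rectum_doses):
    """QUANTEC-style rectum checks cited in the study: V50 < 50%, V60 < 35%."""
    return {
        "V50 < 50%": volume_fraction_at_least(rectum_doses, 50.0) < 0.50,
        "V60 < 35%": volume_fraction_at_least(rectum_doses, 60.0) < 0.35,
    }
```

Tools like the TADA code mentioned above automate exactly this kind of evaluation across many plans, structures, and constraint sets.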

Relevance: 10.00%

Publisher:

Abstract:

Purpose Accelerometers are recognized as a valid and objective tool to assess free-living physical activity. Despite the widespread use of accelerometers, there is no standardized way to process and summarize their data, which limits our ability to compare results across studies. This paper a) reviews decision rules researchers have used in the past, b) compares the impact of using different decision rules on a common data set, and c) identifies issues to consider for accelerometer data reduction. Methods The methods sections of studies published in 2003 and 2004 were reviewed to determine what decision rules previous researchers have used to identify the wearing period, the minimal wear requirement for a valid day, spurious data, the number of days used to calculate the outcome variables, and bouts of moderate-to-vigorous physical activity (MVPA). For this study, four data reduction algorithms that employ different decision rules were used to analyze the same data set. Results The review showed that, among studies that reported their decision rules, much variability was observed. Overall, the analyses suggested that using different algorithms impacted several important outcome variables. The most stringent algorithm yielded significantly lower wearing time, the lowest activity counts per minute and counts per day, and fewer minutes of MVPA per day. An exploratory sensitivity analysis revealed that the most stringent inclusion criterion had an impact on sample size and wearing time, which in turn affected many outcome variables. Conclusions These findings suggest that the decision rules employed to process accelerometer data have a significant impact on important outcome variables. Until guidelines are developed, it will remain difficult to compare findings across studies.
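Two of the decision rules surveyed above, non-wear detection and the minimal-wear requirement for a valid day, can be sketched as follows (the 60-minute zero-count run and the 600-minute threshold are common example choices, not the paper's prescriptions):

```python
def wear_minutes(counts, nonwear_run=60):
    """Flag each minute True (worn) unless it lies in a run of
    nonwear_run or more consecutive zero activity counts."""
    worn = [True] * len(counts)
    run_start = None
    for i, c in enumerate(list(counts) + [None]):  # sentinel closes the last run
        if c == 0:
            if run_start is None:
                run_start = i
        else:
            if run_start is not None and i - run_start >= nonwear_run:
                worn[run_start:i] = [False] * (i - run_start)
            run_start = None
    return worn

def is_valid_day(worn, min_wear=600):
    """A common rule: a day counts only if worn for at least min_wear minutes."""
    return sum(worn) >= min_wear
```

Varying nonwear_run and min_wear is exactly the kind of algorithmic choice the paper shows can shift wearing time, counts per day, and MVPA minutes.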

Relevance: 10.00%

Publisher:

Abstract:

Accurate and detailed measurement of an individual's physical activity is a key requirement for helping researchers understand the relationship between physical activity and health. Accelerometers have become the method of choice for measuring physical activity due to their small size, low cost, convenience, and ability to provide objective information about physical activity. However, interpreting accelerometer data once they have been collected can be challenging. In this work, we applied machine learning algorithms to the task of physical activity recognition from triaxial accelerometer data. We employed a simple but effective approach of dividing the accelerometer data into short non-overlapping windows, converting each window into a feature vector, and treating each feature vector as an i.i.d. training instance for a supervised learning algorithm. In addition, we improved on this simple approach with a multi-scale ensemble method that did not need to commit to a single window size and was able to leverage the fact that physical activities produce time series with repetitive patterns and that discriminative features for physical activity occur at different temporal scales.
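The windowing-plus-features pipeline described above can be sketched as follows (the window size and the per-axis mean/standard-deviation features are illustrative choices; the paper's actual feature set is not specified here):

```python
import math

def windows(samples, size):
    """Split a stream of (x, y, z) samples into non-overlapping windows."""
    return [samples[i:i + size] for i in range(0, len(samples) - size + 1, size)]

def feature_vector(window):
    """Per-axis mean and standard deviation -> one training instance."""
    vec = []
    for axis in range(3):
        vals = [s[axis] for s in window]
        mean = sum(vals) / len(vals)
        var = sum((v - mean) ** 2 for v in vals) / len(vals)
        vec.extend([mean, math.sqrt(var)])
    return vec
```

Each feature vector would then be labelled with the activity performed during its window and passed to any supervised classifier; the multi-scale ensemble described above simply repeats this at several window sizes and combines the resulting predictions.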