Abstract:
Objective To compare the diagnostic accuracy of the interRAI Acute Care (AC) Cognitive Performance Scale (CPS2) and the Mini-Mental State Examination (MMSE) against independent clinical diagnosis for detecting dementia in older hospitalized patients. Design, Setting, and Participants The study was part of a prospective observational cohort study of patients aged ≥70 years admitted to four acute hospitals in Queensland, Australia, between 2008 and 2010. Recruitment was consecutive, and patients expected to remain in hospital for ≥48 hours were eligible to participate. Data for 462 patients were available for this study. Measurements Trained research nurses completed comprehensive geriatric assessments and administered the interRAI AC and MMSE to patients. Two physicians independently reviewed patients’ medical records and assessments to establish the diagnosis of dementia. Indicators of diagnostic accuracy included sensitivity, specificity, predictive values, likelihood ratios and areas under the receiver operating characteristic curve (AUC). Results 85 patients (18.4%) were considered to have dementia according to independent clinical diagnosis. The sensitivity of the CPS2 [0.68 (95% CI: 0.58–0.77)] was not statistically different from that of the MMSE [0.75 (0.64–0.83)] in predicting physician-diagnosed dementia. The AUCs for the two instruments were also not statistically different: CPS2 AUC = 0.83 (95% CI: 0.78–0.89) and MMSE AUC = 0.87 (95% CI: 0.83–0.91), while the CPS2 demonstrated higher specificity [0.92 (95% CI: 0.89–0.95)] than the MMSE [0.82 (0.77–0.85)]. Agreement between the CPS2 and clinical diagnosis was substantial (87.4%; κ = 0.61). Conclusion The CPS2 appears to be a reliable screening tool for assessing cognitive impairment in acutely unwell older hospitalized patients. These findings add to the growing body of evidence supporting the utility of the interRAI AC, within which the CPS2 is embedded. The interRAI AC offers the advantage of being able to accurately screen for both dementia and delirium without the need for additional assessments, thus increasing assessment efficiency.
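As a rough illustration of how the indicators named above are derived, the sketch below computes sensitivity, specificity, predictive values, likelihood ratios, percentage agreement and Cohen's kappa from a 2x2 table of screening result against clinical diagnosis. The counts are hypothetical, chosen only to be consistent with the reported 462 patients, 85 dementia cases, sensitivity of 0.68 and specificity of 0.92; they are not the study's data.

```python
# Hypothetical 2x2 counts, consistent with the reported sensitivity (0.68),
# specificity (0.92), 462 patients and 85 dementia cases; not the study data.
def diagnostic_indicators(tp, fp, fn, tn):
    """Accuracy indicators named in the abstract, from a 2x2 table."""
    sens = tp / (tp + fn)                  # sensitivity
    spec = tn / (tn + fp)                  # specificity
    ppv = tp / (tp + fp)                   # positive predictive value
    npv = tn / (tn + fn)                   # negative predictive value
    lr_pos = sens / (1 - spec)             # positive likelihood ratio
    lr_neg = (1 - sens) / spec             # negative likelihood ratio
    n = tp + fp + fn + tn
    p_obs = (tp + tn) / n                  # raw percentage agreement
    p_exp = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    kappa = (p_obs - p_exp) / (1 - p_exp)  # Cohen's kappa
    return dict(sensitivity=sens, specificity=spec, ppv=ppv, npv=npv,
                lr_pos=lr_pos, lr_neg=lr_neg, agreement=p_obs, kappa=kappa)

print(diagnostic_indicators(tp=58, fp=30, fn=27, tn=347))
```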
Abstract:
Determining what consequences are likely to serve as effective punishment for any given behaviour is a complex task. This chapter focuses specifically on illegal road user behaviours and the mechanisms used to punish and deter them. Traffic law enforcement has traditionally used the threat and/or receipt of legal sanctions and penalties to deter illegal and risky behaviours. This process represents the use of positive punishment, one of the key behaviour modification mechanisms. Behaviour modification principles describe four types of reinforcers: positive and negative punishments and positive and negative reinforcements. The terms ‘positive’ and ‘negative’ are not used in an evaluative sense here. Rather, they represent the presence (positive) or absence (negative) of stimuli to promote behaviour change. Punishments aim to inhibit behaviour and reinforcements aim to encourage it. This chapter describes a variety of punishments and reinforcements that have been and could be used to modify illegal road user behaviours. In doing so, it draws on several theoretical perspectives that have defined behavioural reinforcement and punishment in different ways. Historically, the main theoretical approach used to deter risky road use has been classical deterrence theory, which has focussed on the perceived certainty, severity and swiftness of penalties. Stafford and Warr (1993) extended the traditional deterrence principles to include the positive reinforcement concept of punishment avoidance. Evidence of the association between punishment avoidance experiences and behaviour has been established for a number of risky road user behaviours, including drink driving, unlicensed driving, and speeding. We chose a novel way of assessing punishment avoidance by specifying two sub-constructs (detection evasion and punishment evasion). Another theorist, Akers (1977), described the idea of competing reinforcers, termed differential reinforcement, within social learning theory. Differential reinforcement describes a balance of reinforcements and punishments as influential on behaviour. This chapter describes a comprehensive way of conceptualising a broad range of reinforcement and punishment concepts, consistent with Akers’ differential reinforcement concept, within a behaviour modification framework that incorporates deterrence principles. The efficacy of three theoretical perspectives in explaining self-reported speeding among a sample of 833 Australian car drivers was examined. Results demonstrated that a broad range of variables predicted speeding, including personal experiences of evading detection and punishment for speeding, intrinsic sensations, practical benefits expected from speeding, and an absence of punishing effects from being caught. Not surprisingly, being younger was also significantly related to more frequent speeding, although in a regression analysis gender did not retain a significant influence once all punishment and reinforcement variables were entered. The implications for speed management, as well as road user behaviour modification more generally, are discussed in light of these findings. Overall, the findings reported in this chapter suggest that a more comprehensive approach is required to manage the behaviour of road users, one which does not rely solely on traditional legal penalties and sanctions.
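In outline, the regression of self-reported speeding on punishment and reinforcement variables might look like the sketch below. All variable names and data here are hypothetical placeholders; only the sample size of 833 comes from the text.

```python
# Illustrative sketch only: regressing self-reported speeding on punishment
# and reinforcement variables, mirroring the kind of analysis the chapter
# reports. Variable names and data are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 833  # sample size reported in the chapter
df = pd.DataFrame({
    "speeding": rng.normal(size=n),           # self-reported speeding frequency
    "detection_evasion": rng.normal(size=n),  # punishment-avoidance sub-construct
    "punishment_evasion": rng.normal(size=n), # punishment-avoidance sub-construct
    "intrinsic_sensation": rng.normal(size=n),
    "practical_benefit": rng.normal(size=n),
    "age": rng.integers(17, 80, size=n),
    "male": rng.integers(0, 2, size=n),
})

model = smf.ols(
    "speeding ~ detection_evasion + punishment_evasion + "
    "intrinsic_sensation + practical_benefit + age + male",
    data=df,
).fit()
print(model.summary())
```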
Abstract:
This book analyses the structure, form and language of a selected number of international and national legal instruments and reviews how an illustrative range of international and national judicial institutions have responded to the issues before them and the processes of legal reasoning engaged by them in reaching their decisions. This involves a very detailed discussion of these primary sources of international and national environmental law with a view to determining their jurisprudential architecture and the processes of reasoning expected of those responsible for implementing these architectural arrangements. This book is concerned not with the effectiveness or the quality of an environmental legal system but only with its jurisprudential characteristics and their associated processes of legal reasoning.
Abstract:
We exploit a voting reform in France to estimate the causal effect of exit poll information on turnout and bandwagon voting. Before the change in legislation, individuals in some French overseas territories voted after the election result had already been made public via exit poll information from mainland France. We estimate that knowing the exit poll information decreases voter turnout by about 12 percentage points. Ours is the first study with a clean empirical design outside the laboratory to demonstrate the effect of such knowledge on voter turnout. Furthermore, we find that exit poll information significantly increases bandwagon voting; that is, voters who choose to turn out are more likely to vote for the expected winner.
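The abstract does not spell out the estimator, but a natural way to exploit such a reform is a difference-in-differences regression of turnout on exposure to exit poll information. The sketch below simulates that design; the territory indicator, reform indicator and the 12-point effect size are placeholders echoing the abstract, not the authors' actual specification or data.

```python
# Simulated difference-in-differences sketch; the design and effect size echo
# the abstract, but the specification is an assumption, not the authors' model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 20000
df = pd.DataFrame({
    "overseas": rng.integers(0, 2, size=n),     # territory that votes last
    "post_reform": rng.integers(0, 2, size=n),  # election after the reform
})
# Before the reform, overseas voters cast ballots with mainland exit poll
# results already public; the reform removed that exposure.
df["informed"] = df["overseas"] * (1 - df["post_reform"])
df["turnout"] = (rng.random(n) < 0.70 - 0.12 * df["informed"]).astype(int)

did = smf.ols("turnout ~ overseas + post_reform + overseas:post_reform",
              data=df).fit()
# The overseas main effect (~ -0.12) and the interaction (~ +0.12) together
# recover the simulated 12-point turnout drop from exit poll exposure.
print(did.params)
```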
Abstract:
Adult soft tissue sarcomas are relatively rare tumours which are curable with radical surgery. Approximately 50% of patients will develop inoperable disease or metastases for which chemotherapy may be inappropriate. Only two cytotoxic agents - doxorubicin and ifosfamide - have activity in > 20% of patients. For both these agents there is evidence of a dose-response relationship. There is currently no good evidence that combination chemotherapy confers a clinical benefit compared with single agents. Outside a clinical trial, standard first-line therapy should be single-agent doxorubicin at a dose intensity ≥ 70 mg/m² every 3 weeks. Approximately 25% of patients may be expected to respond to this regimen. There is a suggestion that responses to ifosfamide may occur in patients who progress on doxorubicin. The role of chemotherapy in the adjuvant setting remains uncertain. Several trials have suggested a modest relapse-free and overall survival benefit for the use of post-operative chemotherapy, and a recent overview of 14 randomised trials confirms a small though significant benefit. These benefits have to be weighed against the toxicity of chemotherapy. The importance of treating all patients with soft tissue sarcomas in clinical trials is stressed. There is an urgent need to define new active agents to treat this disease.
Abstract:
Purpose: In non-small-cell lung cancer (NSCLC), the epidermal growth factor receptor (EGFR) and cyclooxygenase-2 (COX-2) play major roles in tumorigenesis. This phase I/II study evaluated combined therapy with the EGFR tyrosine kinase inhibitor (TKI) gefitinib and the COX-2 inhibitor rofecoxib in platinum-pretreated, relapsed, metastatic NSCLC (n = 45). Patients and Methods: Gefitinib 250 mg/d was combined with rofecoxib (dose escalated from 12.5 to 25 to 50 mg/d through three cohorts, each n = 6). Because the rofecoxib maximum-tolerated dose was not reached, the 50 mg/d cohort was expanded for efficacy evaluation (n = 33). Results: Among the 42 assessable patients, there was one complete response (CR) and two partial responses (PRs) and 12 patients with stable disease (SD); disease control rate was 35.7% (95% CI, 21.6% to 52.0%). Median time to tumor progression was 55 days (95% CI, 47 to 70 days), and median survival was 144 days (95% CI, 103 to 190 days). In a pilot study, matrix-assisted laser desorption/ionization (MALDI) proteomics analysis of baseline serum samples could distinguish patients with an objective response from those with SD or progressive disease (PD), and those with disease control (CR, PR, and SD) from those with PD. The regimen was generally well tolerated, with predictable toxicities including skin rash and diarrhea. Conclusion: Gefitinib combined with rofecoxib provided disease control equivalent to that expected with single-agent gefitinib and was generally well tolerated. Baseline serum proteomics may help identify those patients most likely to benefit from EGFR TKIs. © 2007 by American Society of Clinical Oncology.
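For readers wondering how the disease control rate and its interval are obtained: 1 CR + 2 PRs + 12 SD gives 15 of 42 assessable patients, i.e. 35.7%, and an exact (Clopper-Pearson) binomial interval reproduces the reported 21.6% to 52.0% almost exactly. The sketch below shows this calculation; that the paper used the Clopper-Pearson method is an inference from the matching numbers, not something the abstract states.

```python
# Reproducing the reported disease control rate and its 95% CI (15/42).
# Clopper-Pearson exact binomial interval; attributing this method to the
# paper is an assumption inferred from the matching numbers.
from scipy.stats import beta

k, n = 15, 42  # CR + PR + SD among assessable patients
rate = k / n
lower = beta.ppf(0.025, k, n - k + 1)
upper = beta.ppf(0.975, k + 1, n - k)
print(f"disease control rate: {rate:.1%}, 95% CI: {lower:.1%} to {upper:.1%}")
# -> roughly 35.7%, 21.6% to 52.0%, matching the abstract
```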
Abstract:
The utility of a novel technique for determining the ignition delay in a compression ignition engine has been shown. This method utilises statistical modelling in the Bayesian paradigm to accurately resolve the start of combustion from a band-pass filtered in-cylinder pressure signal. The method was applied to neat diesel and six biofuels, including four fractionations of palm oil of varying carbon chain length and degree of unsaturation, and the relationships between ignition delay, cetane number and oxygen content were explored. It is noted that the expected negative relationship between ignition delay and cetane number held, as did the positive relationship between ignition delay and oxygen content. The degree of unsaturation was also identified as a potential factor influencing the ignition delay.
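The paper's exact Bayesian model is not given in the abstract; the sketch below shows one minimal version of the idea, treating the start of combustion as a change point in the variance of a band-pass filtered pressure signal and computing a posterior over its location. The zero-mean Gaussian segments, flat prior and simulated trace are all assumptions made for illustration.

```python
# Minimal sketch (assumptions mine, not the paper's exact model): locate the
# start of combustion as a Bayesian change point in the variance of a
# band-pass filtered in-cylinder pressure signal.
import numpy as np

def changepoint_posterior(x):
    """Posterior over change-point locations for a variance change, assuming
    zero-mean Gaussian segments and a flat prior on the location."""
    n = len(x)
    logpost = np.full(n, -np.inf)
    for k in range(2, n - 2):
        v1 = np.mean(x[:k] ** 2)   # MLE variance before the candidate point
        v2 = np.mean(x[k:] ** 2)   # MLE variance after it
        # profile log-likelihood of the two-segment Gaussian model
        logpost[k] = -0.5 * (k * np.log(v1) + (n - k) * np.log(v2))
    p = np.exp(logpost - logpost.max())
    return p / p.sum()

# Simulated trace: quiescent noise, then combustion-driven oscillation
rng = np.random.default_rng(2)
signal = np.concatenate([rng.normal(0, 0.05, 400), rng.normal(0, 0.5, 200)])
posterior = changepoint_posterior(signal)
print("most probable start of combustion at sample", posterior.argmax())
```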
Abstract:
The properties of ellipsoidal nanowires are yet to be examined. They have likely applications in sensing, solar cells, microelectronics and cloaking devices. Little is known of the qualities that elliptical nanowires exhibit as the aspect ratio is varied with different dielectric materials, or of how varying these attributes affects plasmon coupling and propagation. It is known that a plasmon can travel further if it is supported by a thicker circular nanowire, while thinner nanowires are expected to increase quantum dot (QD) coupling. Ellipsoidal nanowires may be a good compromise due to their ability to have both thin and thick dimensions. Furthermore, it has been shown that the plasmon resonances along the main axes of an ellipsoidal particle are governed by the relative aspect ratio of the ellipsoid, which may lead to further control of the plasmon. Research was conducted in COMSOL Multiphysics by identifying the fundamental plasmon mode supported by an ellipsoidal nanowire and then studying this mode for various geometrical parameters, materials and illumination wavelengths. It was found that ellipsoidal nanowires exhibit a minimum in the wavenumber and a maximum in the propagation distance at roughly the same dimensions, highlighting that there is an aspect ratio for which there is poor coupling but low loss. Here we investigate these and related attributes.
Abstract:
This thesis addresses process simulation and validation in Business Process Management (BPM). It proposes that a hybrid Multi Agent System (MAS) / 3D Virtual World approach is a valid method for better simulating the behaviour of human resources in business processes, supporting a wide range of rich visualization applications that can facilitate communication between business analysts and stakeholders. It is expected that the findings of this thesis may be fruitfully extended from BPM to other application domains, such as social simulation in video games and computer-based training animations.
Abstract:
This thesis examined factors influencing health professionals' responses to victims of domestic violence in Vietnam. As this is the first time that this type of research has been conducted in Vietnam, it was expected that the results would contribute significantly to local knowledge and add to global perspectives. Since this is the first questionnaire on the topic to be developed in Vietnam, the psychometric properties of the questionnaire were first established, and on that basis the questionnaire is recommended for use in further studies. By explaining the factors that affect health workers' intentions to respond, this project provides key data for authorities to design intervention strategies to improve the responses of health professionals to victims of domestic violence in Vietnam.
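The abstract does not name the psychometric statistics used. A standard first step when establishing the psychometric properties of a new questionnaire is an internal consistency check such as Cronbach's alpha, sketched below on simulated placeholder data.

```python
# Hedged sketch: the thesis's actual psychometric analysis is not specified;
# Cronbach's alpha for internal consistency is a standard first check when
# validating a new questionnaire. Scores below are simulated placeholders.
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()  # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(4)
latent = rng.normal(size=(200, 1))                   # shared attitude factor
scores = latent + rng.normal(0, 0.8, size=(200, 8))  # 8 correlated items
print("Cronbach's alpha:", round(cronbach_alpha(scores), 2))
```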
Abstract:
Chemical vapor deposition (CVD) is widely utilized to synthesize graphene with controlled properties for many applications, especially when continuous films over large areas are required. Although hydrocarbons such as methane are quite efficient precursors for CVD at high temperature (∼1000 °C), finding less explosive and safer carbon sources is considered beneficial for the transition to large-scale production. In this work, we investigated the CVD growth of graphene using ethanol, which is a harmless and readily processable carbon feedstock that is expected to provide favorable kinetics. We tested a wide range of synthesis conditions (i.e., temperature, time, gas ratios), and on the basis of systematic analysis by Raman spectroscopy, we identified the optimal parameters for producing highly crystalline graphene with different numbers of layers. Our results demonstrate the importance of high temperature (1070 °C) for ethanol CVD and emphasize the significant effects that hydrogen and water vapor, coming from the thermal decomposition of ethanol, have on the crystal quality of the synthesized graphene.
Abstract:
iTRAQ (isobaric tags for relative or absolute quantitation) is a mass spectrometry technology that allows quantitative comparison of protein abundance by measuring peak intensities of reporter ions released from iTRAQ-tagged peptides by fragmentation during MS/MS. However, current data analysis techniques for iTRAQ struggle to report reliable relative protein abundance estimates and suffer from problems of precision and accuracy. The precision of the data is affected by variance heterogeneity: low-signal data have higher relative variability, yet low-abundance peptides dominate data sets. Accuracy is compromised as ratios are compressed toward 1, leading to underestimation of the true ratio. This study investigated both issues and proposed a methodology that combines the peptide measurements to give a robust protein estimate even when the data for the protein are sparse or at low intensity. Our data indicated that ratio compression arises from contamination during precursor ion selection, which occurs at a consistent proportion within an experiment and thus results in a linear relationship between expected and observed ratios. We proposed that a correction factor can be calculated from spiked proteins at known ratios. We then demonstrated that variance heterogeneity is present in iTRAQ data sets irrespective of the analytical packages, LC-MS/MS instrumentation, and iTRAQ labeling kit (4-plex or 8-plex) used. We proposed using an additive-multiplicative error model for peak intensities in MS/MS quantitation and demonstrated that a variance-stabilizing normalization is able to address the error structure and stabilize the variance across the entire intensity range. The resulting uniform variance structure simplifies the downstream analysis. Heterogeneity of variance consistent with an additive-multiplicative model has been reported in other MS-based quantitation, including fields outside of proteomics; consequently, the variance-stabilizing normalization methodology has the potential to increase the capabilities of MS in quantitation across diverse areas of biology and chemistry.
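To make the two proposals concrete, the sketch below illustrates (1) a generalized-log (arcsinh-type) transform of the kind used in variance-stabilizing normalization for additive-multiplicative error, and (2) estimating a ratio compression correction factor as the slope of observed against expected log-ratios for spiked proteins. The parameter values and the simulated 0.7 compression factor are illustrative, not the study's.

```python
# Illustrative only: glog transform for additive-multiplicative error, and a
# compression correction factor fitted from spiked proteins at known ratios.
import numpy as np

def glog(x, a=0.0, b=1.0):
    """Generalized log (arcsinh-type): ~log(2x) for large x, linear near 0."""
    return np.log((x - a) + np.sqrt((x - a) ** 2 + b ** 2))

print(glog(np.array([0.0, 10.0, 1000.0])))  # variance-stabilized intensities

# Ratio compression: observed log-ratios vary linearly with expected ones;
# the fitted slope (simulated here as 0.7) is the correction factor.
rng = np.random.default_rng(3)
expected = np.log2(np.array([0.25, 0.5, 1.0, 2.0, 4.0]))  # spiked-in ratios
observed = 0.7 * expected + rng.normal(0, 0.05, size=5)
slope, intercept = np.polyfit(expected, observed, 1)
corrected = (observed - intercept) / slope
print("compression factor:", round(slope, 2))
print("corrected log-ratios:", np.round(corrected, 2))
```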
Abstract:
The interaction between climate change and buildings is cyclical and dynamic in nature. On one hand, buildings have contributed significantly to the process of human-induced climate change. On the other hand, climate change is also expected to affect many aspects of buildings, including building design, construction, and operation. In this entry, these two aspects of knowledge are reviewed. The potential strategies of building design and operation to reduce the greenhouse gas emissions from buildings and to prepare buildings to withstand a range of possible climate change scenarios are also discussed.
Abstract:
Located in the Gulf of Mexico in nearly 8,000 ft of water, the Perdido project is the deepest spar application to date in the world and Shell's first fully integrated application of its in-house digital oilfield technology, called "Smart Field," in the Western hemisphere. Developed by Shell on behalf of partners BP and Chevron, the spar and the subsea equipment connected to it will eventually capture about an order of magnitude more data than is collected from any other Shell-designed and -managed development operating in the Gulf of Mexico. This article describes Shell's digital oilfield design philosophy, briefly explains the five design elements that underpin "smartness" in Shell's North and South American operations, and sheds light on the process by which a highly customized digital oilfield development and management plan was put together for Perdido. Although Perdido is the first instance in North and South America in which these design elements and processes were applied in an integrated way, all of Shell's future new developments in the Western hemisphere are expected to follow the same overarching design principles. Accordingly, this article uses Perdido as a real-world example to outline the high-level details of Shell's digital oilfield design philosophy and processes.
Abstract:
Prior to graduation, engineering students are expected to provide evidence of relevant experience in the workplace. This experience is expected to provide opportunities for exposure to the profession and to help students develop confidence, skills and capabilities as emerging professionals. This investigation considers the expectations and challenges in implementing work integrated learning (WIL) programs in different contexts. While this will inform the next iteration of engineering course development at QUT, the issues and interventions described provide useful insights into the options available and into engineering curriculum design more broadly. This comparative analysis across three phases highlights expectations and challenges, including stakeholder responsibilities, expectations, and assessment. The study draws on the findings of a 2005 investigation into the purpose and provision of WIL and the findings of a 2012 Faculty review of the current WIL model. The enhancement of WIL through a series of developmental phases highlights the strengths and weaknesses of various models. It is anticipated that this investigation will inform course development decisions on a whole-of-course approach to WIL that improves student engagement and the learning experience. The importance of WIL is not disputed. However, with industry expectations, increasing student numbers and cohort diversity, the ways in which students and industry currently engage in WIL are not sustainable, and more creative, flexible and engaging approaches are needed.