115 results for non-standard neutrino interactions
Abstract:
Discharge summaries and other free-text reports in healthcare transfer information between working shifts and geographic locations. Patients are likely to have difficulty understanding their content because of medical jargon, non-standard abbreviations, and ward-specific idioms. This paper reports on an evaluation lab that aims to support the continuum of care by developing methods and resources that make clinical reports in English easier for patients to understand and that help them find information related to their condition.
Abstract:
Achieving business and IT integration is a strategic goal for many organisations – it has almost become the ‘Holy Grail’ of organisational success. In this environment, Enterprise Resource Planning (ERP) packages have become the de facto option for addressing this issue. Integration has come to mean adopting ERP through configuration and without customization, but this all-or-nothing approach has proved difficult for many organisations. In Part 1 of a two-part update, we provide evidence from the field suggesting that, whilst costly, customization can, if managed appropriately, have value in aiding organisational integration efforts. In Part 2, we discuss in more detail the benefits and pitfalls involved in enacting a non-standard-based integration strategy.
Abstract:
In Part 1 of this update, we put forward the argument that integration in ERP-based environments can be achieved in ways other than adopting a configuration-only approach to the software. We drew on evidence from two large ERP implementations to show how, despite the cost implications, some customization, if carefully managed, could prove helpful. In this, the final part of the update, we discuss the benefits and potential pitfalls involved in enacting a non-standard-based integration strategy. This requires attention to a) broadening the definition of integration; b) bringing legacy practices forward; and c) developing a customization-based integration strategy.
Abstract:
Aim: The aim was to explore the relationship between nursing casualization and the culture of communication for nurses in a healthcare facility. Background: Casualization, or non-standard work, is the use of temporary, contract, part-time and casual labour. An increase in casual labour has been part of a global shift in work organization aimed at creating a more flexible and cheaper workforce. It has been argued that flexibility of labour has enabled nurses to manage both non-work-related needs and an increasingly complex work environment. Yet no research has explored casualization and how it impacts on the communication culture for nurses in a healthcare facility. Design: Critical ethnography. Methods: Methods included observation, field notes, formal interviews and focus groups. Data collection was undertaken over the two years 2008–2009. Results: The concepts of knowing and belonging were perceived as important to nursing teamwork, and yet the traditional time/task work model, designed for a full-time workforce, marginalized non-standard workers. The combination of medical dominance and traditional stereotyping of the nurse and of nursing work as full-time shaped the behaviours of nurses and situated casual workers on the periphery. The overall finding was that entrenched systemic structures and processes shaped the physical and cultural dimensions of a contemporary work environment and contributed to an ineffective communication culture. Conclusion: Flexible work is an important feature of contemporary nursing. Traditional work models and nurse attitudes and practices have not progressed and are discordant with a contemporary approach to nursing labour management.
Abstract:
The type of contract model may have a significant influence on achieving project objectives, including environmental and climate change goals. This research investigates non-standard contract models and their impact on greenhouse gas (GHG) emissions in transport infrastructure construction in Australia. The research is based on the analysis of two case studies: an Early Contractor Involvement (ECI) contract and a Design and Construct (D&C) contract with GHG reduction requirements embedded in contractor selection. The main findings support the use of ECIs for better integrating decisions made during the planning phase with construction activities, improving environmental outcomes while achieving financial and time savings. Keywords: greenhouse gas reduction; road construction; contracting; ECI; D&C
Abstract:
This paper focuses on the fundamental right to be heard, that is, the right to have one’s voice heard and listened to – to impose reception (Bourdieu, 1977). It focuses on the ways that non-mainstream English is heard and received in Australia, where, despite public policy initiatives around equal opportunity, language continues to socially disadvantage people (Burridge & Mulder, 1998). English is the language of the mainstream and most people are monolingually English (Ozolins, 1993). English has no official status, yet it remains dominant and its centrality is rarely challenged (Smolicz, 1995). This paper takes the position that the lack of language engagement in mainstream Australia leads to linguistic desensitisation. Writing in the US context, where English is also the unofficial norm, Lippi-Green (1997) maintains that discrimination based on speech features or accent is commonly accepted and widely perceived as appropriate. In Australia, non-standard forms of English are often disparaged or devalued because they do not conform to the ‘standard’ (Burridge & Mulder, 1998). This paper argues that talk cannot be taken for granted: ‘spoken voices’ are critical tools for representing the self and for negotiating and manifesting legitimacy within social groups (Miller, 2003). In multicultural, multilingual countries like Australia, the spoken voice, its message and how it is heard are critical for people seeking settlement, inclusion and access to facilities and services. Too often these rights are denied because of the way a person sounds. This paper reports on a study conducted with a group that has been particularly vulnerable to ongoing ‘panics’ about language – international students. International education is the third largest revenue source for Australia (AEI, 2010) but has been beset by concerns from academics (Auditor-General, 2002) and the media about student language levels and falling work standards (e.g. Livingstone, 2004). Much of the focus has been on high-stakes writing, but with the ascendancy of project work in university assessment and the increasing emphasis on oracy, there is a call to recognise the salience of talk, especially among students using English as a second language (ESL) (Kettle & May, 2012). The study investigated the experiences of six international students in a Master of Education course at a large metropolitan university. It utilised data from student interviews, classroom observations, course materials, university policy documents and media reports to examine the ways that speaking and being heard impacted on the students’ learning and legitimacy in the course. The analysis drew on Fairclough’s (2003) dialectical-relational model of Critical Discourse Analysis (CDA) to analyse the linguistic, discursive and social relations between the data texts and their conditions of production and interpretation, including the wider socio-political discourses on English, language difference and second language use. The interests of the study were whether and how discourses of marginalisation and discrimination manifested, whether and how students recognised and responded to them pragmatically, and how these discourses sat alongside and/or contradicted the official rhetoric about diversity and inclusion. The underpinning rationale was that international students’ experiences can provide insights into the hidden politics and practices of being heard and afforded speaking rights as a second language speaker in Australia.
Abstract:
INTRODUCTION: Performance status (PS) 2 patients with non-small cell lung cancer (NSCLC) experience more toxicity, lower response rates, and shorter survival times than healthier patients treated with standard chemotherapy. Paclitaxel poliglumex (PPX), a macromolecule drug conjugate of paclitaxel and polyglutamic acid, reduces systemic exposure to peak concentrations of free paclitaxel and may lead to increased concentrations in tumors due to enhanced vascular permeability. METHODS: Chemotherapy-naive PS 2 patients with advanced NSCLC were randomized to receive carboplatin (area under the curve = 6) and either PPX (210 mg/m²/10 min without routine steroid premedication) or paclitaxel (225 mg/m²/3 h with standard premedication) every 3 weeks. The primary end point was overall survival. RESULTS: A total of 400 patients were enrolled. Alopecia, arthralgias/myalgias, and cardiac events were significantly less frequent with PPX/carboplatin, whereas grade ≥3 neutropenia and grade 3 neuropathy showed a trend toward worsening. There was no significant difference in the incidence of hypersensitivity reactions despite the absence of routine premedication in the PPX arm. Overall survival was similar between treatment arms (hazard ratio, 0.97; log-rank p = 0.769). Median survival and 1-year survival rates were 7.9 months and 31% for PPX versus 8.0 months and 31% for paclitaxel. Disease control rates were 64% and 69% for PPX and paclitaxel, respectively. Time to progression was similar: 3.9 months for PPX/carboplatin versus 4.6 months for paclitaxel/carboplatin (p = 0.210). CONCLUSION: PPX/carboplatin failed to provide superior survival compared with paclitaxel/carboplatin in the first-line treatment of PS 2 patients with NSCLC, but the results with respect to progression-free survival and overall survival were comparable and the PPX regimen was more convenient. © 2008 International Association for the Study of Lung Cancer.
Abstract:
Background: This open-label, randomised phase III study was designed to further investigate the clinical activity and safety of SRL172 (killed Mycobacterium vaccae suspension) with chemotherapy in the treatment of non-small-cell lung cancer (NSCLC). Patients and methods: Patients were randomised to receive platinum-based chemotherapy, consisting of up to six cycles of MVP (mitomycin, vinblastine and cisplatin or carboplatin), with (210 patients) or without (209 patients) monthly SRL172. Results: There was no statistical difference between the two groups in overall survival (primary efficacy end point) over the course of the study (median overall survival of 223 days versus 225 days; P = 0.65). However, a higher proportion of patients were alive at the end of the 15-week treatment phase in the chemotherapy plus SRL172 group (90%) than in the chemotherapy alone group (83%) (P = 0.061). At the end of the treatment phase, the response rate was 37% in the combined group and 33% in the chemotherapy alone group. Patients in the chemotherapy alone group had greater deterioration in their Global Health Status score (-14.3) than patients in the chemotherapy plus SRL172 group (-6.6) (P = 0.02). Conclusion: In this non-placebo-controlled trial, SRL172 added to standard cancer chemotherapy significantly improved patient quality of life without affecting overall survival times. © 2004 European Society for Medical Oncology.
Abstract:
We present the treatment rationale and study design of the MetLung phase III study. This study will investigate onartuzumab (MetMAb) in combination with erlotinib, compared with erlotinib alone, as second- or third-line treatment in patients with advanced non-small-cell lung cancer (NSCLC) who are Met-positive by immunohistochemistry. Approximately 490 patients (245 per treatment arm) will receive erlotinib (150 mg orally, daily) plus onartuzumab or placebo (15 mg/kg intravenously every 3 weeks) until disease progression, unacceptable toxicity, patient or physician decision to discontinue, or death. The efficacy objectives of this study are to compare overall survival (OS) (primary endpoint), progression-free survival, and response rates between the two treatment arms. In addition, safety, quality of life, pharmacokinetics, and translational research will be investigated across treatment arms. If the primary objective (OS) is achieved, this study will provide robust results toward an alternative treatment option for patients with Met-positive NSCLC in the second- or third-line setting. © 2012 Elsevier Inc. All Rights Reserved.
Abstract:
Tumor hypoxia has been recognized to confer resistance to anticancer therapy since the early 20th century. More recently, its fundamental role in tumorigenesis has been established. Hypoxia-inducible factor (HIF)-1 has been identified as an important transcription factor that mediates the cellular response to hypoxia, promoting both cellular survival and apoptosis under different conditions. Increased tumor cell expression of this transcription factor promotes tumor growth in vivo and is associated with a worse prognosis in patients with non-small-cell lung cancer (NSCLC) undergoing tumor resection. The epidermal growth factor receptor (EGFR) promotes tumor cell proliferation and angiogenesis and inhibits apoptosis. Epidermal growth factor receptor expression increases in a stepwise manner during tumorigenesis and is overexpressed in >50% of NSCLC tumors. This review discusses the reciprocal relationship between tumor cell hypoxia and EGFR. Recent studies suggest that hypoxia induces expression of EGFR and its ligands. In return, EGFR might enhance the cellular response to hypoxia by increasing expression of HIF-1α, and so act as a survival factor for hypoxic cancer cells. Immunohistochemical studies on a series of resected NSCLC tumors add weight to this contention by demonstrating a close association between expression of EGFR, HIF-1α, and one of HIF-1's target proteins, carbonic anhydrase IX. In this article we discuss emerging treatment strategies for NSCLC that target HIF-1, HIF-1 transcriptional targets, and EGFR.
Abstract:
An encryption scheme is non-malleable if giving an encryption of a message to an adversary does not increase its chances of producing an encryption of a related message (under a given public key). Fischlin introduced a stronger notion, known as complete non-malleability, which requires attackers to have negligible advantage even if they are allowed to transform the public key under which the related message is encrypted. Ventre and Visconti later proposed a comparison-based definition of this security notion, which is more in line with the well-studied definitions proposed by Bellare et al. They also provided additional feasibility results by proposing two constructions of completely non-malleable schemes, one in the common reference string model using non-interactive zero-knowledge proofs, and another using interactive encryption schemes. Therefore, the only previously known completely non-malleable (and non-interactive) scheme in the standard model is quite inefficient, as it relies on the generic NIZK approach. They left the existence of efficient schemes in the common reference string model as an open problem. Recently, two efficient public-key encryption schemes have been proposed, by Libert and Yung and by Barbosa and Farshim, both of which are based on pairing-based identity-based encryption. At ACISP 2011, Sepahi et al. proposed a method to achieve completely non-malleable encryption in the public-key setting using lattices, but no security proof was given for the proposed scheme. In this paper we review that scheme and provide its security proof in the standard model. Our study shows that Sepahi’s scheme will remain secure even in a post-quantum world, since there are currently no known quantum algorithms for solving lattice problems that perform significantly better than the best known classical (i.e., non-quantum) algorithms.
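As a concrete illustration of the ordinary malleability that these definitions rule out, the following minimal Python sketch (not taken from any of the reviewed constructions) shows how textbook ElGamal, being multiplicatively homomorphic, lets an adversary turn an encryption of m into an encryption of the related message 2m without knowing m or the secret key. The group parameters are toy values chosen purely for illustration.

import random

# Toy ElGamal parameters (illustrative only, far too small to be secure)
p = 467                              # small prime modulus
g = 2                                # group generator
x = random.randrange(2, p - 2)       # secret key
h = pow(g, x, p)                     # public key component h = g^x mod p

def encrypt(m):
    """Textbook ElGamal encryption of m under public key (p, g, h)."""
    r = random.randrange(2, p - 2)
    return pow(g, r, p), (m * pow(h, r, p)) % p

def decrypt(c1, c2):
    """Decrypt by dividing out the shared secret c1^x = h^r."""
    s = pow(c1, x, p)
    return (c2 * pow(s, p - 2, p)) % p   # s^(p-2) is s^-1 mod p, since p is prime

m = 42
c1, c2 = encrypt(m)

# Malleability: without knowing m or x, multiply the second ciphertext
# component by 2 to obtain a valid encryption of the related message 2m.
mauled = (c1, (2 * c2) % p)

print(decrypt(c1, c2))   # 42
print(decrypt(*mauled))  # 84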
Abstract:
This paper presents ongoing work toward constructing an efficient completely non-malleable public-key encryption scheme based on lattices in the standard (common reference string) model. An encryption scheme is completely non-malleable if it requires attackers to have negligible advantage even if they are allowed to transform the public key under which the related message is encrypted. Ventre and Visconti proposed two inefficient constructions of completely non-malleable schemes, one in the common reference string model using non-interactive zero-knowledge proofs, and another using interactive encryption schemes. Recently, two efficient public-key encryption schemes have been proposed, both of which are based on pairing-based identity-based encryption.
Abstract:
The exchange of physical forces in both cell-cell and cell-matrix interactions plays a significant role in a variety of physiological and pathological processes, such as cell migration, cancer metastasis, inflammation and wound healing. Therefore, great interest exists in accurately quantifying the forces that cells exert on their substrate during migration. Traction Force Microscopy (TFM) is the most widely used method for measuring cell traction forces. Several mathematical techniques have been developed to estimate forces from TFM experiments. However, certain simplifications are commonly assumed, such as linear elasticity of the materials and/or free geometries, which in some cases may lead to inaccurate results. Here, cellular forces are numerically estimated by solving a minimization problem that combines multiple non-linear FEM solutions. Our simulations, free from constraints on the geometrical and mechanical conditions, show that forces are predicted with higher accuracy than with the standard approaches.
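To make the contrast with the non-linear FEM formulation concrete, the short Python/NumPy sketch below illustrates the kind of "standard approach" the abstract refers to: recovering tractions t from measured substrate displacements u under an assumed linear elastic forward model u = G t, solved as a Tikhonov-regularised least-squares problem. The forward operator G, the noise level and the regularisation weight are illustrative assumptions, not quantities taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

n_disp, n_trac = 200, 50
G = rng.normal(size=(n_disp, n_trac))            # assumed linear elastic forward operator
t_true = rng.normal(size=n_trac)                 # synthetic "true" tractions
u = G @ t_true + 0.05 * rng.normal(size=n_disp)  # noisy displacement measurements

lam = 1e-2   # regularisation weight (would be tuned in practice)

# Solve  min_t ||G t - u||^2 + lam ||t||^2  via the normal equations.
t_hat = np.linalg.solve(G.T @ G + lam * np.eye(n_trac), G.T @ u)

print("relative error:", np.linalg.norm(t_hat - t_true) / np.linalg.norm(t_true))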
Abstract:
Non-Alcoholic Fatty Liver Disease (NAFLD) is a condition that is frequently seen but seldom investigated. Until recently, NAFLD was considered benign, self-limiting and unworthy of further investigation, an opinion based on retrospective studies with relatively small numbers and scant follow-up of histology data (1). The prevalence in adults in the USA is 30%, and NAFLD is recognized as a common and increasing form of liver disease in the paediatric population (1). Australian data from New South Wales suggest a prevalence of NAFLD in “healthy” 15-year-olds of 10% (2). Non-alcoholic fatty liver disease is a condition in which fat progressively invades the liver parenchyma. The degree of infiltration ranges from simple steatosis (fat only), to steatohepatitis (fat and inflammation), to steatohepatitis plus fibrosis (fat, inflammation and fibrosis), to cirrhosis (replacement of liver texture by scarred, fibrotic and non-functioning tissue). Non-alcoholic fatty liver is diagnosed by exclusion rather than inclusion. None of the currently available diagnostic techniques, namely liver biopsy, liver function tests (LFTs) and imaging (ultrasound, computerised tomography (CT) or magnetic resonance imaging (MRI)), is specific for non-alcoholic fatty liver. An association exists between NAFLD, Non-Alcoholic Steatohepatitis (NASH) and irreversible liver damage, cirrhosis and hepatoma. However, a more pervasive aspect of NAFLD is its association with the Metabolic Syndrome. This syndrome is characterised by increased insulin resistance (IR), and NAFLD is thought to be its hepatic representation. Those with NAFLD have an increased risk of death (3), and NAFLD is an independent predictor of atherosclerosis and cardiovascular disease (1). Liver biopsy is considered the gold standard for the diagnosis, grading and staging of non-alcoholic fatty liver disease (4). Fatty liver is diagnosed when there is macrovesicular steatosis with displacement of the nucleus to the edge of the cell and at least 5% of the hepatocytes are seen to contain fat (4). Steatosis represents fat accumulation in liver tissue without inflammation. However, it is only called non-alcoholic fatty liver disease when alcohol (>20–30 g per day) has been excluded from the diet (5); non-alcoholic and alcoholic fatty liver are identical on histology (4). LFTs are indicative, not diagnostic: they indicate that a condition may be present but cannot identify which condition it is. A patient who presents with raised fasting blood glucose, low HDL (high-density lipoprotein) and elevated fasting triacylglycerols is likely to have NAFLD (6). Of the imaging techniques, MRI is the least variable and the most reproducible. With CT scanning, liver fat content can be semi-quantitatively estimated: with increasing hepatic steatosis, liver attenuation values decrease by 1.6 Hounsfield units for every milligram of triglyceride deposited per gram of liver tissue (7) (a short illustrative calculation follows the reference list below). Ultrasound permits early detection of fatty liver, often in the preclinical stages before symptoms are present and serum alterations occur. Earlier, accurate reporting of this condition will allow appropriate intervention, resulting in better patient health outcomes.
References
1. Chalasani N. Does fat alone cause significant liver disease: it remains unclear whether simple steatosis is truly benign. American Gastroenterological Association Perspectives, February/March 2008. www.gastro.org/wmspage.cfm?parm1=5097 Viewed 20th October 2008.
2. Booth M, George J, Denney-Wilson E. The population prevalence of adverse concentrations with adiposity of liver tests among Australian adolescents. Journal of Paediatrics and Child Health. 2008 November.
3. Catalano D, Trovato GM, Martines GF, Randazzo M, Tonzuso A. Bright liver, body composition and insulin resistance changes with nutritional intervention: a follow-up study. Liver Int. 2008 February;1280-9.
4. Choudhury J, Sanyal A. Clinical aspects of Fatty Liver Disease. Semin Liver Dis. 2004;24(4):349-62.
5. Dionysus Study Group. Drinking factors as cofactors of risk for alcohol induced liver change. Gut. 1997;41:845-50.
6. Preiss D, Sattar N. Non-alcoholic fatty liver disease: an overview of prevalence, diagnosis, pathogenesis and treatment considerations. Clin Sci. 2008;115:141-50.
7. American Gastroenterological Association. Technical review on nonalcoholic fatty liver disease. Gastroenterology. 2002;123:1705-25.
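As a worked illustration of the CT attenuation relationship cited in reference 7 above, the short Python sketch below converts a drop in liver attenuation into a rough, semi-quantitative triglyceride estimate; the baseline value of 60 HU is an assumed, illustrative figure rather than one taken from the abstract.

HU_PER_MG_TG_PER_G = 1.6   # HU decrease per mg triglyceride per g liver tissue (ref. 7)

def estimated_triglyceride_mg_per_g(baseline_hu, measured_hu):
    """Rough semi-quantitative estimate of hepatic triglyceride content from CT attenuation."""
    return max(baseline_hu - measured_hu, 0.0) / HU_PER_MG_TG_PER_G

# Example: a liver assumed to measure 60 HU when healthy that now measures 40 HU
print(estimated_triglyceride_mg_per_g(60, 40))   # 12.5 mg triglyceride per g liver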