Abstract:
Objectives: To evaluate the clinical value of pre-operative serum CA-125 in predicting the presence of extra-uterine disease in patients with apparent early-stage endometrial cancer. Methods: Between October 6, 2005 and June 17, 2010, 760 patients were enrolled in an international, multicentre, prospective randomized trial (LACE) comparing laparotomy with laparoscopy in the management of endometrial cancer apparently confined to the uterus. This study is based on data from 657 patients with endometrial adenocarcinoma who had a pre-operative serum CA-125 value, and was undertaken to correlate pre-operative serum CA-125 with final stage. Results: A pre-operative CA-125 cutpoint of 30 U/ml was associated with the smallest misclassification error (14.5%) using a multiple cross-validation method. Median pre-operative serum CA-125 was 14 U/ml and, using a cutpoint of 30 U/ml, 14.9% of patients had elevated CA-125 levels. Of the 98 patients with an elevated CA-125 level, 36 (36.7%) had evidence of extra-uterine disease. Of the 116 patients (17.7%) with evidence of extra-uterine disease, 31.0% had an elevated CA-125 level. In univariate and multivariate logistic regression analyses, only pre-operative CA-125 level was found to be associated with extra-uterine spread of disease. Utilising a cutpoint of 30 U/ml achieved a sensitivity, specificity, positive predictive value and negative predictive value of 31.0%, 88.5%, 36.7% and 85.7%, respectively. Overall, 326/657 (49.6%) patients had full surgical staging involving lymph node dissection. When the analysis was limited to patients who had undergone full surgical staging, the outcomes remained essentially unchanged. Conclusions: An elevated CA-125 above 30 U/ml in patients with apparent early-stage disease is associated with a sensitivity of 31.0% and a specificity of 88.5% in detecting extra-uterine disease. Pre-operative identification of this risk factor may assist in triaging patients to tertiary centres and comprehensive surgical staging.
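For readers wanting to trace the reported figures, the four metrics follow directly from the counts in the abstract; a minimal sketch (variable names are ours):

```python
# Reconstructing the diagnostic metrics from the counts in the abstract:
# 657 patients, 116 with extra-uterine disease, 98 with CA-125 >= 30 U/ml,
# 36 of whom had extra-uterine disease.
total = 657
diseased = 116             # extra-uterine disease on final staging
elevated = 98              # CA-125 >= 30 U/ml
tp = 36                    # elevated CA-125 AND extra-uterine disease

fp = elevated - tp         # 62: elevated but disease-free
fn = diseased - tp         # 80: disease but normal CA-125
tn = total - tp - fp - fn  # 479: normal CA-125, disease-free

sensitivity = tp / (tp + fn)   # 36/116 = 31.0%
specificity = tn / (tn + fp)   # 479/541 = 88.5%
ppv = tp / (tp + fp)           # 36/98  = 36.7%
npv = tn / (tn + fn)           # 479/559 = 85.7%

print(f"sensitivity={sensitivity:.1%} specificity={specificity:.1%} "
      f"ppv={ppv:.1%} npv={npv:.1%}")
```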
Resumo:
Cloud computing allows vast computational resources to be leveraged quickly and easily in bursts as and when required. Here we describe a technique that allows Monte Carlo radiotherapy dose calculations to be performed using GEANT4 and executed in the cloud, with relative simulation cost and completion time evaluated as a function of machine count. As expected, simulation completion time decreases as 1/n for n parallel machines, and relative simulation cost is found to be optimal where n is a factor of the total simulation time in hours. Using the technique, we demonstrate, as a proof of principle, the potential usefulness of cloud computing as a solution for rapid Monte Carlo simulation for radiotherapy dose calculation without the need for dedicated local computer hardware. Funding source: Cancer Australia (Department of Health and Ageing) Research Grant 614217.
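The cost observation is consistent with per-hour billing: if a total serial workload of T machine-hours is split evenly across n machines, each machine is billed for ceil(T/n) whole hours, so the total cost n·ceil(T/n) is minimised exactly when n divides T. A small sketch of this model (the whole-hour billing assumption is ours, inferred from the abstract):

```python
import math

def completion_time(total_hours: float, n: int) -> float:
    """Wall-clock time when the workload is split evenly across n machines."""
    return total_hours / n  # the 1/n scaling noted in the abstract

def relative_cost(total_hours: int, n: int) -> int:
    """Machine-hours billed under whole-hour billing: n machines,
    each running (and billed for) ceil(total_hours/n) hours."""
    return n * math.ceil(total_hours / n)

# For a 24 machine-hour simulation, cost is minimal (24 billed hours)
# whenever n is a factor of 24, and higher otherwise (e.g. 25 for n=5).
for n in (1, 4, 5, 6, 7, 8, 24):
    print(n, completion_time(24, n), relative_cost(24, n))
```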
Abstract:
Circulating tumour cells (CTCs) have attracted much recent interest in cancer research as a potential biomarker and as a means of studying the process of metastasis. It has long been understood that metastasis is a hallmark of malignancy, and conceptual theories of the basis of metastasis from the nineteenth century foretold the existence of a tumour "seed" capable of establishing discrete tumours in the "soil" of distant organs. This prescient "seed and soil" hypothesis accurately predicted the existence of CTCs: microscopic tumour fragments in the blood, at least some of which are capable of forming metastases. However, it is only in recent years that reliable, reproducible methods of CTC detection and analysis have been developed. To date, the majority of studies have employed the CellSearch™ system (Veridex LLC), an immunomagnetic purification method. Other promising techniques include microfluidic filters, isolation of tumour cells by size using microporous polycarbonate filters, and flow cytometry-based approaches. While many challenges still exist, the detection of CTCs in blood is becoming increasingly feasible, giving rise to some tantalizing questions about the use of CTCs as a potential biomarker. CTC enumeration has been used to guide prognosis in patients with metastatic disease and to act as a surrogate marker of disease response during therapy. Other possible uses for CTC detection include prognostication in early-stage patients, identifying patients requiring adjuvant therapy, and surveillance for the detection of relapsing disease. Another exciting possible use for CTC detection assays is the molecular and genetic characterization of CTCs to act as a "liquid biopsy" representative of the primary tumour. Indeed, it has already been demonstrated that it is possible to detect HER2, KRAS and EGFR mutation status in breast, colon and lung cancer CTCs respectively. In the course of this review, we discuss the biology of CTCs and their role in metastagenesis, the most commonly used techniques for their detection, and the evidence to date of their clinical utility, with particular reference to lung cancer.
Abstract:
With the widespread application of healthcare Information and Communication Technology (ICT), constructing a stable and sustainable data-sharing environment has attracted rapidly growing attention in both the academic research community and the healthcare industry. The Healthcare Cloud (HC) is one of the long-dreamed visions of cloud computing, matching the need for healthcare information sharing directly among various health providers over the Internet, regardless of their location and the amount of data. In this paper, we discuss important research topics related to health information sharing and integration in the HC and investigate the arising challenges and issues. We describe several potential solutions that provide more opportunities to implement an EHR cloud. We also introduce the development of an HC-related collaborative healthcare research example, illustrating the prospects of applying cloud computing in health information science research.
Abstract:
Brief self-report symptom checklists are often used to screen for postconcussional disorder (PCD) and posttraumatic stress disorder (PTSD) and are highly susceptible to symptom exaggeration. This study examined the utility of the five-item Mild Brain Injury Atypical Symptoms Scale (mBIAS) designed for use with the Neurobehavioral Symptom Inventory (NSI) and the PTSD Checklist–Civilian (PCL–C). Participants were 85 Australian undergraduate students who completed a battery of self-report measures under one of three experimental conditions: control (i.e., honest responding, n = 24), feign PCD (n = 29), and feign PTSD (n = 32). Measures were the mBIAS, NSI, PCL–C, Minnesota Multiphasic Personality Inventory–2, Restructured Form (MMPI–2–RF), and the Structured Inventory of Malingered Symptomatology (SIMS). Participants instructed to feign PTSD and PCD had significantly higher scores on the mBIAS, NSI, PCL–C, and MMPI–2–RF than did controls. Few differences were found between the feign PCD and feign PTSD groups, with the exception of scores on the NSI (feign PCD > feign PTSD) and PCL–C (feign PTSD > feign PCD). Optimal cutoff scores on the mBIAS of ≥8 and ≥6 were found to reflect “probable exaggeration” (sensitivity = .34; specificity = 1.0; positive predictive power, PPP = 1.0; negative predictive power, NPP = .74) and “possible exaggeration” (sensitivity = .72; specificity = .88; PPP = .76; NPP = .85), respectively. Findings provide preliminary support for the use of the mBIAS as a tool to detect symptom exaggeration when administering the NSI and PCL–C.
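As a simple illustration of how the two reported cutoffs would be applied in screening practice, here is a minimal sketch (the function name and band labels are ours, not from the study):

```python
def classify_mbias(score: int) -> str:
    """Band an mBIAS total score using the cutoffs reported in the study:
    >= 8 suggests 'probable exaggeration', >= 6 'possible exaggeration'.
    Illustrative only; not the authors' scoring software."""
    if score >= 8:
        return "probable exaggeration"
    if score >= 6:
        return "possible exaggeration"
    return "no indication of exaggeration"

for s in (5, 6, 7, 8, 12):
    print(s, classify_mbias(s))
```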
Abstract:
In the 20 years since its inception, the EPPM has attracted much empirical support. To date, and unsurprisingly given that it is a model of fear-based persuasion, the EPPM's explanatory utility has been examined only with fear-based messages. However, an argument is put forth herein, drawing upon existing evidence, that the EPPM may be an efficacious framework for explaining the persuasive process and outcomes of emotion-based messages more broadly when such messages address serious health topics. For the current study, four different types of emotional appeals were purposefully devised: a fear-based, an annoyance/agitation-based, a pride-based, and a humour-based message. All messages addressed the serious health issue of road safety, and in particular the risky behaviour of speeding. Participants (N = 551) were exposed to one of the four messages and subsequently provided responses within a survey. A series of 2 (threat: low, high) × 2 (efficacy: low, high) analyses of variance was conducted for each of the appeals, based on the EPPM's message outcomes of acceptance and rejection. Support was found for the EPPM, with a number of main effects of threat and efficacy emerging, reflecting that, irrespective of emotional appeal type, high levels of threat and efficacy enhanced message outcomes by maximising acceptance and minimising rejection. Theoretically, the findings provide support for the explanatory utility of the EPPM for emotion-based health messages more broadly. In an applied sense, the findings highlight the value of adopting the EPPM as a framework when devising and evaluating emotion-based health messages for serious health topics.
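For readers unfamiliar with the design, a 2 × 2 between-subjects ANOVA of this shape can be sketched as follows (synthetic data, column names and effect sizes are ours; the study's actual variables and data are not reproduced here):

```python
import numpy as np
import pandas as pd
from statsmodels.formula.api import ols
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(0)
n = 100  # synthetic respondents per cell, for illustration only

# Simulate main effects of threat and efficacy on message acceptance.
rows = []
for threat in ("low", "high"):
    for efficacy in ("low", "high"):
        base = 3.0 + 0.5 * (threat == "high") + 0.5 * (efficacy == "high")
        for score in rng.normal(base, 1.0, n):
            rows.append({"threat": threat, "efficacy": efficacy,
                         "acceptance": score})
df = pd.DataFrame(rows)

# Two-way ANOVA with interaction: acceptance ~ threat * efficacy.
model = ols("acceptance ~ C(threat) * C(efficacy)", data=df).fit()
print(anova_lm(model, typ=2))
```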
Abstract:
Despite the compelling case for moving towards cloud computing, the upstream oil & gas industry faces several technical challenges—most notably, a pronounced emphasis on data security, a reliance on extremely large data sets, and significant legacy investments in information technology infrastructure—that make a full migration to the public cloud difficult at present. Private and hybrid cloud solutions have consequently emerged within the industry to yield as much benefit from cloud-based technologies as possible while working within these constraints. This paper argues, however, that the move to private and hybrid clouds will very likely prove only to be a temporary stepping stone in the industry's technological evolution. By presenting evidence from other market sectors that have faced similar challenges in their journey to the cloud, we propose that enabling technologies and conditions will probably fall into place in a way that makes the public cloud a far more attractive option for the upstream oil & gas industry in the years ahead. The paper concludes with a discussion about the implications of this projected shift towards the public cloud, and calls for more of the industry's services to be offered through cloud-based “apps.”
Abstract:
In the modern connected world, pervasive computing has become a reality. Thanks to the ubiquity of mobile computing devices and emerging cloud-based services, users stay permanently connected to their data. This introduces a slew of new security challenges, including the problems of multi-device key management and single-sign-on architectures. One solution to these problems is the use of secure side-channels for authentication, including the visual channel as a proof of vicinity. However, existing approaches often assume confidentiality of the visual channel, or provide only insufficient means of mitigating a man-in-the-middle attack. In this work, we introduce QR-Auth, a two-step, 2D-barcode-based authentication scheme for mobile devices which aims specifically at key management and key sharing across devices in a pervasive environment. It requires minimal user interaction and therefore provides better usability than most existing schemes, without compromising security. We show how our approach fits into existing authorization delegation and one-time-password generation schemes, and that it is resilient to man-in-the-middle attacks.
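The abstract does not spell out the protocol, but the general pattern of authenticating over a visual channel can be sketched roughly: one device displays a short-lived challenge as a QR payload, and a device holding the key answers with a keyed response. A minimal, illustrative sketch only (all names, the HMAC construction and the freshness check are our assumptions, not QR-Auth's actual two-step scheme):

```python
import hmac, hashlib, secrets, time

SHARED_KEY = secrets.token_bytes(32)  # key held by the user's enrolled device

def make_qr_payload() -> tuple[str, float]:
    """Verifier: create a short-lived random challenge to display as a
    QR code (represented here simply as a hex string)."""
    return secrets.token_hex(16), time.time()

def answer_challenge(challenge: str, key: bytes) -> str:
    """Prover: demonstrate possession of the key by returning an HMAC
    of the scanned challenge over a second channel."""
    return hmac.new(key, challenge.encode(), hashlib.sha256).hexdigest()

def verify(challenge: str, issued_at: float, response: str, key: bytes,
           ttl: float = 30.0) -> bool:
    """Verifier: accept only fresh, correctly keyed responses, limiting
    the window in which a man-in-the-middle could relay the code."""
    expected = hmac.new(key, challenge.encode(), hashlib.sha256).hexdigest()
    return time.time() - issued_at < ttl and hmac.compare_digest(expected, response)

challenge, issued = make_qr_payload()
print(verify(challenge, issued, answer_challenge(challenge, SHARED_KEY), SHARED_KEY))
```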
Abstract:
Background: Bicycle commuting in an urban environment of high air pollution is a known potential health risk, especially for susceptible individuals. While risk-management strategies aimed at reducing exposure to motorised traffic emissions have been suggested, few studies have assessed the utility of such strategies in real-world circumstances. Objectives: The potential to reduce exposure to ultrafine particles (UFP; < 0.1 µm) during bicycle commuting by lowering interaction with motorised traffic was investigated with real-time air pollution and acute inflammatory measurements in healthy individuals using their typical bicycle commute route, and an alternative to it. Methods: Thirty-five healthy adults (mean ± SD: age = 39 ± 11 yr; 29% female) each completed two return trips of their typical route (HIGH) and of a pre-determined altered route with lower interaction with motorised traffic (LOW; determined by the proportion of on-road cycle paths). Particle number concentration (PNC) and particle diameter (PD) were monitored in real time in-commute. Acute inflammatory indices of respiratory symptom incidence, lung function and spontaneous sputum (for inflammatory cell analyses) were collected immediately pre-commute, and one and three hours post-commute. Results: LOW resulted in a significant reduction in mean PNC (1.91 × 10⁴ ± 0.93 × 10⁴ vs. 2.95 × 10⁴ ± 1.50 × 10⁴ particles/cm³; p ≤ 0.001). Apart from the incidence of in-commute offensive odour detection (42 vs. 56%; p = 0.019), dust and soot observation (33 vs. 47%; p = 0.038) and nasopharyngeal irritation (31 vs. 41%; p = 0.007), acute inflammatory indices were not significantly associated with in-commute PNC, nor were they reduced with LOW compared to HIGH. Conclusions: Exposure to PNC, and the incidence of offensive odour and nasopharyngeal irritation, can be significantly reduced by adopting a strategy of lowering interaction with motorised traffic whilst bicycle commuting, which may bring important benefits for both healthy and susceptible individuals.
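Since each participant rode both routes, the headline PNC comparison is a paired design; a comparison of that shape can be sketched as follows (synthetic data only, loosely matching the reported means; the study's raw measurements are not reproduced):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic paired particle-number concentrations (particles/cm^3) for
# 35 riders: HIGH route ~2.95e4 mean, LOW route ~1.91e4 mean.
high = rng.normal(2.95e4, 1.50e4, 35).clip(min=1e3)
low = rng.normal(1.91e4, 0.93e4, 35).clip(min=1e3)

# Paired t-test: does the low-traffic route reduce mean exposure?
t, p = stats.ttest_rel(high, low)
print(f"mean HIGH={high.mean():.3g}, mean LOW={low.mean():.3g}, "
      f"t={t:.2f}, p={p:.4f}")
```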
Abstract:
This book develops tools and techniques that will help urban residents gain access to urban computing. Metaphorically speaking, it is taking computing to the street by giving the general public – rather than just researchers and professionals – the power to leverage available city infrastructure and create solutions tailored to their individual needs. It brings together five chapters that are based on presentations given at the Street Computing Workshop held on 24 November 2009 in Melbourne in conjunction with the Australian Computer-Human Interaction Conference (OZCHI 2009). This book focuses on applying urban informatics, urban and community sensing and open application programming interfaces (APIs) to the public space through the delivery of online services, on demand and in real time. It then offers a case study of how the city of Singapore has harnessed the potential of an online infrastructure so that residents and visitors can access services electronically. This book was published as a special issue of the Journal of Urban Technology, 19(2), 2012.
Abstract:
Using Monte Carlo simulation for radiotherapy dose calculation can provide more accurate results than the analytical methods usually found in modern treatment planning systems, especially in regions with a high degree of inhomogeneity. These more accurate results, however, often require orders of magnitude more calculation time to attain high precision, reducing their utility within the clinical environment. This work aims to improve the utility of Monte Carlo simulation within the clinical environment by developing techniques which enable faster Monte Carlo simulation of radiotherapy geometries. This is achieved principally through the use of new high-performance computing environments and through simpler alternative, yet equivalent, representations of complex geometries.

Firstly, the use of cloud computing technology and its application to radiotherapy dose calculation is demonstrated. As with other supercomputer-like environments, the time to complete a simulation decreases as 1/n for n cloud-based computers performing the calculation in parallel. Unlike traditional supercomputer infrastructure, however, there is no initial outlay of cost, only modest ongoing usage fees; the simulations described in the following are performed using this cloud computing technology.

The definition of geometry within the chosen Monte Carlo simulation environment - Geometry & Tracking 4 (GEANT4) in this case - is also addressed in this work. At the simulation implementation level, a new computer-aided design interface is presented for use with GEANT4, enabling direct coupling between manufactured parts and their equivalents in the simulation environment, which is of particular importance when defining linear accelerator treatment head geometry. Further, a new technique for navigating tessellated or meshed geometries is described, allowing for up to three orders of magnitude performance improvement with the use of tetrahedral meshes in place of complex triangular surface meshes. The technique has application in the definition of both mechanical parts in a geometry and patient geometry.

Static patient CT datasets like those found in typical radiotherapy treatment plans are often very large and impose a significant performance penalty on a Monte Carlo simulation. By extracting the regions of interest in a radiotherapy treatment plan and representing them in a mesh-based form similar to that used in computer-aided design, the above-mentioned optimisation techniques can be used to reduce the time required to navigate the patient geometry in the simulation environment. Results presented in this work show that these equivalent yet much simplified patient geometry representations enable significant performance improvements over simulations that consider raw CT datasets alone. Furthermore, this mesh-based representation allows for direct manipulation of the geometry, enabling, for example, motion augmentation for time-dependent dose calculation.

Finally, an experimental dosimetry technique is described which allows the validation of time-dependent Monte Carlo simulations like those made possible by the aforementioned patient geometry definition. A bespoke organic plastic scintillator dose rate meter is embedded in a gel dosimeter, enabling simultaneous 3D dose distribution and dose rate measurement.
This work demonstrates the effectiveness of applying alternative and equivalent geometry definitions to complex geometries for the purposes of Monte Carlo simulation performance improvement. Additionally, these alternative geometry definitions allow for manipulations to be performed on otherwise static and rigid geometry.
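The tetrahedral-navigation idea rests on the fact that locating a point inside a tetrahedron takes only four signed-volume (orientation) tests, one per face, which is far cheaper than intersecting tracks with a large triangular surface mesh. A minimal sketch of that core test (our own illustration in Python; the thesis itself works within GEANT4/C++):

```python
import numpy as np

def same_side(v0, v1, v2, v3, p) -> bool:
    """True if p lies on the same side of plane (v0, v1, v2) as v3."""
    normal = np.cross(v1 - v0, v2 - v0)
    return np.dot(normal, v3 - v0) * np.dot(normal, p - v0) >= 0

def point_in_tetrahedron(verts: np.ndarray, p: np.ndarray) -> bool:
    """Locate a point with four orientation tests, one per face."""
    v0, v1, v2, v3 = verts
    return (same_side(v0, v1, v2, v3, p) and
            same_side(v1, v2, v3, v0, p) and
            same_side(v2, v3, v0, v1, p) and
            same_side(v3, v0, v1, v2, p))

tet = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
print(point_in_tetrahedron(tet, np.array([0.1, 0.1, 0.1])))  # True
print(point_in_tetrahedron(tet, np.array([1.0, 1.0, 1.0])))  # False
```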
Abstract:
Every day we are confronted with social interactions with other people. Our social life is characterised by norms that manifest as attitudinal and behavioural uniformities among people. With greater awareness of our social context, we can interact more efficiently. Any theory or model of human interaction that fails to include social concepts arguably lacks a critical element. This paper identifies, from an interdisciplinary perspective, the social concepts that need to be supported by future context-aware systems. It discusses the limitations of existing context-aware systems in supporting social psychology theories related to the identification of, and membership in, social groups. We argue that social norms are among the core modelling concepts that future context-aware systems need to capture in order to support and enhance social interactions. A detailed summary of social psychology theory relevant to social computing is given, followed by a formal definition of the process by which each such norm is advertised and acquired. The social concepts identified in this paper could be used to simulate agent interactions imbued with social norms, or to use ICT to facilitate, assist, enhance or understand social interactions. They could also be used in virtual community modelling, where the social awareness of a community, as well as the process of joining and exiting a community, are important.
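To make the advertise-and-acquire process concrete, one way to model it is for a group to publish its norms and for an agent to adopt them on joining and drop them on leaving. A minimal, illustrative sketch (all class and method names are ours, not the paper's formalism):

```python
from dataclasses import dataclass, field

@dataclass
class Group:
    """A social group that advertises its behavioural norms to members."""
    name: str
    norms: set[str] = field(default_factory=set)

    def advertise(self) -> set[str]:
        return set(self.norms)

@dataclass
class Agent:
    """A context-aware agent that acquires norms from groups it joins."""
    name: str
    acquired: set[str] = field(default_factory=set)

    def join(self, group: Group) -> None:
        self.acquired |= group.advertise()   # acquire advertised norms

    def leave(self, group: Group) -> None:
        self.acquired -= group.norms         # drop norms tied to the group

cyclists = Group("cyclists", {"signal turns", "ride single file"})
alice = Agent("alice")
alice.join(cyclists)
print(alice.acquired)
```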
Abstract:
The most powerful known primitive in public-key cryptography is undoubtedly the elliptic curve pairing. Upon their introduction just over ten years ago, the computation of pairings was far too slow for them to be considered a practical option. This resulted in a vast amount of research from mathematicians and computer scientists around the globe aiming to improve this computation speed. From the use of modern results in algebraic and arithmetic geometry to the application of foundational number theory dating back to the days of Gauss and Euler, cryptographic pairings have since experienced a great deal of improvement. As a result, what was an extremely expensive computation taking several minutes is now a high-speed operation taking less than a millisecond. This thesis presents a range of optimisations to the state of the art in cryptographic pairing computation. Both by extending prior techniques and by introducing several novel ideas of our own, our work has contributed to record-breaking pairing implementations.