743 results for utility computing
Abstract:
The objective of this PhD research program is to investigate numerical methods for simulating variably-saturated flow and sea water intrusion in coastal aquifers in a high-performance computing environment. The work is divided into three overlapping tasks: to develop an accurate and stable finite volume discretisation and numerical solution strategy for the variably-saturated flow and salt transport equations; to implement the chosen approach in a high-performance computing environment that may have multiple GPUs or CPU cores; and to verify and test the implementation. The geological description of aquifers is often complex, with porous materials possessing highly variable properties that are best described using unstructured meshes. The finite volume method is a popular method for solving the conservation laws that describe sea water intrusion, and it is well suited to unstructured meshes. In this work we apply a control volume-finite element (CV-FE) method to an extension of a recently proposed formulation (Kees and Miller, 2002) for variably saturated groundwater flow. The CV-FE method evaluates fluxes at points where material properties and gradients in pressure and concentration are consistently defined, making it both suitable for heterogeneous media and mass conservative. Using the method of lines, the CV-FE discretisation gives a set of differential algebraic equations (DAEs) amenable to solution using higher-order implicit solvers. Heterogeneous computer systems, which combine computational hardware such as CPUs and GPUs, are attractive for scientific computing due to the potential advantages offered by GPUs for accelerating data-parallel operations. We present a C++ library that implements data-parallel methods on both CPUs and GPUs. The finite volume discretisation is expressed in terms of these data-parallel operations, which gives an efficient implementation of the nonlinear residual function. This makes the implicit solution of the DAE system possible on the GPU, because the inexact Newton-Krylov method used by the implicit time stepping scheme can approximate the action of the Jacobian matrix on a vector using residual evaluations. We also propose preconditioning strategies that are amenable to GPU implementation, so that all computationally intensive aspects of the implicit time stepping scheme are implemented on the GPU. Results are presented that demonstrate the efficiency and accuracy of the proposed numerical methods and formulation. The formulation offers excellent conservation of mass, and higher-order temporal integration increases both the numerical efficiency and the accuracy of the solutions. Flux limiting produces accurate, oscillation-free solutions on coarse meshes, whereas much finer meshes are required to obtain solutions of equivalent accuracy using upstream weighting. The computational efficiency of the software is investigated using CPUs and GPUs on a high-performance workstation. The GPU version offers considerable speedup over the CPU version, with a single GPU giving a speedup factor of three over the eight-core CPU implementation.
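The matrix-free Jacobian-vector products that make a GPU-resident inexact Newton-Krylov solve possible can be pictured with a short sketch. This is not the thesis's C++ library; it is a minimal Python illustration, assuming only that a residual function F(u) is available, of how the action of the Jacobian on a vector can be approximated from residual evaluations alone:

    import numpy as np

    def jacobian_vector_product(residual, u, v, eps=None):
        """Approximate J(u) @ v with a forward finite difference of the residual,
        so the Jacobian matrix itself is never assembled or stored."""
        if eps is None:
            # A common heuristic: scale the perturbation to the problem size.
            eps = np.sqrt(np.finfo(float).eps) * (1.0 + np.linalg.norm(u)) / max(np.linalg.norm(v), 1e-30)
        return (residual(u + eps * v) - residual(u)) / eps

    # Toy residual F(u) = u**3 - 1, whose Jacobian is diag(3 * u**2).
    F = lambda u: u**3 - 1.0
    u = np.array([1.0, 2.0, 0.5])
    v = np.array([1.0, -1.0, 2.0])
    print(jacobian_vector_product(F, u, v))  # approximately [3., -12., 1.5]

In a Krylov method such as GMRES, this product is the only way the Jacobian is ever touched, so a data-parallel residual kernel running on the GPU is enough to drive each inexact Newton step.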
Abstract:
Bicycle commuting has the potential to contribute effectively to addressing some of modern society’s biggest issues, including cardiovascular disease, anthropogenic climate change and urban traffic congestion. However, individuals shifting from a passive to an active commute mode may be increasing their potential for air pollution exposure and the associated health risk. This project, consisting of three studies, was designed to investigate the health effects of air pollution exposure in bicycle commuters in a major Australian city (Brisbane). The aims of the three studies were to: 1) examine the relationships between in-commute air pollution exposure perception, symptoms and risk management; 2) assess the efficacy of commute re-routing as a risk management strategy by determining the exposure potential profile of ultrafine particles along commute route alternatives of low and high proximity to motorised traffic; and 3) investigate the potential for reducing exposure to ultrafine particles (UFP; < 0.1 µm) during bicycle commuting by lowering proximity to motorised traffic, and evaluate the feasibility of commute re-routing as a risk management strategy, by monitoring UFP exposure and the consequent physiological response (real-time air pollution and acute inflammatory measurements) in healthy individuals using their typical bicycle commute route and an alternative to it, under real-world circumstances. The methods of the three studies included: 1) a questionnaire-based investigation with regular bicycle commuters in Brisbane, Australia. Participants (n = 153; age = 41 ± 11 yr; 28% female) reported the characteristics of their typical bicycle commute, along with exposure perception and acute respiratory symptoms, and their amenability to using a respirator or re-routing their commute as risk management strategies; 2) inhaled particle counts measured along popular, pre-identified bicycle commute route alternatives of low (LOW) and high (HIGH) proximity to motorised traffic, to the same inner-city destination, at peak commute traffic times. During the commutes, real-time particle number concentration (PNC; mostly in the UFP range) and particle diameter (PD), heart and respiratory rate, geographical location, and meteorological variables were measured. To determine inhaled particle counts, ventilation rate was calculated from heart-rate-ventilation associations produced from periodic exercise testing; 3) thirty-five healthy adults (mean ± SD: age = 39 ± 11 yr; 29% female) completed two return trips of their typical route (HIGH) and a pre-determined alternative route of lower proximity to motorised traffic (LOW; determined by the proportion of on-road cycle paths). Particle number concentration (PNC) and diameter (PD) were monitored in real time during the commute. Acute inflammatory indices of respiratory symptom incidence, lung function and spontaneous sputum (for inflammatory cell analyses) were collected immediately pre-commute, and one and three hours post-commute. The main results of the three studies are that: 1) healthy individuals reported a higher incidence of specific acute respiratory symptoms in- and post- (compared to pre-) commute (p < 0.05). The incidence of specific acute respiratory symptoms was significantly higher for participants with a history of respiratory disorders compared to healthy participants (p < 0.05). 
The incidence of in-commute offensive odour detection, and the perception of in-commute air pollution exposure, were significantly lower for participants with a smoking history compared to healthy participants (p < 0.05). Females reported a significantly higher incidence of in-commute air pollution exposure perception and other specific acute respiratory symptoms, and were more amenable to commute re-routing, compared to males (p < 0.05). Healthy individuals therefore indicated a higher incidence of acute respiratory symptoms in- and post- (compared to pre-) bicycle commuting, with female gender and respiratory disorder history associated with comparatively higher susceptibility; 2) total mean PNC of LOW (compared to HIGH) was reduced (1.56 × 10⁴ ± 0.38 × 10⁴ versus 3.06 × 10⁴ ± 0.53 × 10⁴ ppcc; p = 0.012). Total estimated ventilation rate did not vary significantly between LOW and HIGH (43 ± 5 versus 46 ± 9 L·min⁻¹; p = 0.136); however, owing to the difference in total mean PNC, accumulated inhaled particle counts were 48% lower in LOW compared to HIGH (7.6 × 10⁸ ± 1.5 × 10⁸ versus 14.6 × 10⁸ ± 1.8 × 10⁸; p = 0.003); 3) LOW resulted in a significant reduction in mean PNC (1.91 × 10⁴ ± 0.93 × 10⁴ vs. 2.95 × 10⁴ ± 1.50 × 10⁴ ppcc; p ≤ 0.001). Commute distance and duration were not significantly different between LOW and HIGH (12.8 ± 7.1 vs. 12.0 ± 6.9 km and 44 ± 17 vs. 42 ± 17 min, respectively). Apart from the incidence of in-commute offensive odour detection (42 vs. 56%; p = 0.019), dust and soot observation (33 vs. 47%; p = 0.038) and nasopharyngeal irritation (31 vs. 41%; p = 0.007), acute inflammatory indices were not significantly associated with in-commute PNC, nor were these indices reduced with LOW compared to HIGH. The main conclusions of the three studies are that: 1) the perception of air pollution exposure levels, and the amenability to adopt exposure risk management strategies where applicable, will aid the general population in shifting from passive, motorised transport modes to bicycle commuting; 2) for bicycle commuting at peak morning commute times, inhaled particle counts, and therefore cardiopulmonary health risk, may be substantially reduced by decreasing exposure to motorised traffic, which should be considered by both bicycle commuters and urban planners; 3) exposure to PNC, and the incidence of offensive odour and nasopharyngeal irritation, can be significantly reduced by lowering proximity to motorised traffic whilst bicycle commuting, without significantly increasing commute distance or duration, which may bring important benefits for both healthy and susceptible individuals. In summary, the findings from this project suggest that bicycle commuters can significantly lower their exposure to ultrafine particle emissions by varying their commute route to reduce proximity to motorised traffic and associated combustion emissions, without necessarily affecting their commute time. While the health endpoints assessed in healthy individuals were not indicative of acute health detriment, individuals with pre-disposing physiological susceptibility may benefit considerably from this risk management strategy, which is a necessary research focus given the growing promotion of, and participation in, bicycle commuting.
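Study 2's accumulated inhaled particle counts combine a measured concentration with an estimated ventilation rate over the commute. The following is a generic sketch of that kind of dose estimate in Python; the simple concentration × ventilation × time model and the input values are illustrative assumptions, not the study's own dosimetry or data:

    def inhaled_particles(pnc_per_cm3, ventilation_l_per_min, duration_min):
        """Generic inhaled-dose estimate: concentration x minute ventilation x time.
        Ventilation is converted from L/min to cm^3/min (1 L = 1000 cm^3)."""
        return pnc_per_cm3 * ventilation_l_per_min * 1000.0 * duration_min

    # Illustrative values only: 1.5e4 particles/cm^3, 40 L/min, 30 min commute.
    print(f"{inhaled_particles(1.5e4, 40, 30):.2e} particles inhaled")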
Abstract:
[Letter to the Editor] I read with great interest the article recently published in the Journal of PeriAnesthesia Nursing that examined the utility of using dexmedetomidine (DEX) as an adjunct to midazolam and fentanyl for procedural sedation and analgesia during radiofrequency catheter ablation (RFCA) of atrial fibrillation (AF).1 With the view toward advancing knowledge about more effective medications for sedation in this challenging context, I offer the following insights for readers to consider regarding this study...
Unpacking user relations in an emerging ubiquitous computing environment: introducing the bystander
Abstract:
The move towards technological ubiquity is allowing a more idiosyncratic and dynamic working environment to emerge, which may result in the restructuring of information communication technologies and changes in their use through different user groups' actions. Taking a ‘practice’ lens to human agency, we explore the evolving roles of, and relationships between, these user groups and their appropriation of emergent technologies by drawing upon Lamb and Kling's social actor framework. To illustrate our argument, we draw upon a study of a UK Fire Brigade that has introduced a variety of technologies in an attempt to move towards embracing mobile and ubiquitous computing. Our analysis of the enactment of such technologies reveals that Bystanders, a group yet to be taken as the central unit of analysis in information systems research, or considered in practice, are emerging as important actors. The research implications of our work relate to the need to further consider Bystanders in deployments other than those that are mobile and ubiquitous. For practice, we suggest that Bystanders require consideration in the systems development life cycle, particularly in terms of design and education in processes of use.
Abstract:
The main theme of this thesis is to allow the users of cloud services to outsource their data without the need to trust the cloud provider. The method is based on combining existing proof-of-storage schemes with distance-bounding protocols. Specifically, cloud customers will be able to verify the confidentiality, integrity, availability, fairness (or mutual non-repudiation), data freshness, geographic assurance and replication of their stored data directly, without having to rely on the word of the cloud provider.
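The geographic assurance in this combination rests on distance bounding: the verifier times challenge-response rounds and uses the speed of light to bound how far away the responding server can be. The sketch below is a minimal Python illustration of that idea; the respond callback, round count and distance threshold are assumptions made for the example, not the thesis's protocol:

    import hashlib
    import os
    import time

    def distance_bound_check(respond, n_rounds=20, max_km=100.0):
        """Minimal distance-bounding idea: time challenge-response rounds and use
        the speed of light to bound how far away the responder can be.
        `respond` stands in for the remote prover's answer to a fresh challenge."""
        c_km_per_s = 299_792.458
        worst_rtt = 0.0
        for _ in range(n_rounds):
            challenge = os.urandom(16)
            start = time.perf_counter()
            answer = respond(challenge)
            rtt = time.perf_counter() - start
            worst_rtt = max(worst_rtt, rtt)  # require every round to be fast
            if answer != hashlib.sha256(challenge).digest():
                return False  # wrong answer: the prover lacks the expected data/key
        # A responder cannot answer before the challenge reaches it, so
        # distance <= speed_of_light * RTT / 2 holds in every round.
        return (c_km_per_s * worst_rtt / 2.0) <= max_km

    # A local "prover" that answers honestly and instantly passes the check.
    print(distance_bound_check(lambda ch: hashlib.sha256(ch).digest()))

A real scheme would interleave such timed rounds with the proof-of-storage challenges, so that the fast responses also demonstrate possession of the outsourced data rather than just proximity.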
Abstract:
Whilst alcohol is a common feature of many social gatherings, there are numerous immediate and long-term health and social harms associated with its abuse. Alcohol consumption is the world’s third largest risk factor for disease and disability, with almost 4% of all deaths worldwide attributed to alcohol. Not surprisingly, alcohol use and binge drinking by young people are of particular concern, with Australian data reporting that 39% of young people (18-19 yrs) admitted drinking at least weekly and 32% drank to levels that put them at risk of alcohol-related harm. The growing market penetration and connectivity of smartphones may be an opportunity for innovation in promoting health-related self-management of substance use. However, little is known about how best to harness and optimise this technology for health-related intervention and behaviour change. This paper explores the utility and interface of smartphone technology as a health intervention tool to monitor and moderate alcohol use. A review of the psychological health applications of this technology will be presented, along with the findings of a series of focus groups, surveys and behavioural field trials of several drink-monitoring applications. Qualitative and quantitative data will be presented on the perceptions, preferences and utility of the design, usability and functionality of smartphone apps to monitor and moderate alcohol use. How these findings have shaped the development and evolution of the OnTrack app will be specifically discussed, along with future directions and applications of this technology in health intervention, prevention and promotion.
Abstract:
Currently there are ~3000 known species of Sarcophagidae (Diptera), which are classified into 173 genera in three subfamilies. Almost 25% of sarcophagids belong to the genus Sarcophaga (sensu lato); however, little is known about the validity of, and relationships between, the ~150 (or more) subgenera of Sarcophaga s.l. In this preliminary study, we evaluated the usefulness of three sources of data for resolving relationships between 35 species from 14 Sarcophaga s.l. subgenera: the mitochondrial COI barcode region, ~800 bp of the nuclear gene CAD, and 110 morphological characters. Bayesian, maximum likelihood (ML) and maximum parsimony (MP) analyses were performed on the combined dataset. Much of the tree was only supported by the Bayesian and ML analyses, with the MP tree poorly resolved. The genus Sarcophaga s.l. was resolved as monophyletic in both the Bayesian and ML analyses, and strong support was obtained at the species level. Notably, the only subgenus consistently resolved as monophyletic was Liopygia. The monophyly of, and relationships between, the remaining Sarcophaga s.l. subgenera sampled remain questionable. We suggest that future phylogenetic studies on the genus Sarcophaga s.l. use combined datasets for analyses. We also advocate the use of additional data and a range of inference strategies to assist with resolving relationships within Sarcophaga s.l.
Abstract:
Background—Palpation is an important clinical test for jumper's knee. Objectives—To (a) test the reproducibility of palpation tenderness, (b) evaluate the sensitivity and specificity of palpation in subjects with clinical symptoms of jumper's knee, and (c) determine whether tenderness to palpation may serve as a useful screening test for patellar tendinopathy. The yardstick for diagnosis of patellar tendinopathy was ultrasonographic abnormality. Methods—In 326 junior symptomatic and asymptomatic athletes' tendons, palpation was performed by a single examiner before ultrasonographic examination by a certified ultrasound radiologist. In 58 tendons, palpation was performed twice to test reliability. Tenderness to palpation was scored on a scale from 0 to 3 where 0 represented no pain, and 1, 2, and 3 represented mild, moderate, and severe tenderness respectively. Results—Patellar tendon palpation was a reliable examination for a single examiner (Pearson r = 0.82). In symptomatic tendons, the positive predictive value of palpation was 68%. As a screening examination in asymptomatic subjects, the positive predictive value of tendon palpation was 36–38%. Moderate and severe palpation tenderness were better predictors of ultrasonographic tendon pathology than absent or mild tenderness (p<0.001). Tender and symptomatic tendons were more likely to have ultrasound abnormality than tenderness alone (p<0.01). Conclusions—In this age group, palpation is a reliable test but it is not cost effective in detecting patellar tendinopathy in a preparticipation examination. In symptomatic tendons, palpation is a moderately sensitive but not specific test. Mild tenderness in the patellar tendons in asymptomatic jumping athletes should be considered normal.
Abstract:
Cloud computing is an emerging computing paradigm in which IT resources are provided over the Internet as a service to users. One such service offered through the Cloud is Software as a Service, or SaaS. SaaS can be delivered in a composite form, consisting of a set of application and data components that work together to deliver higher-level functional software. SaaS is receiving substantial attention today from both software providers and users, and analyst firms predict a positive future market for it. This raises new challenges for providers managing SaaS, especially in large-scale data centres such as the Cloud. One of these challenges is managing Cloud resources for SaaS in a way that maintains SaaS performance while optimising resource use. Extensive research on the resource optimisation of Cloud services has not yet addressed the challenges of managing resources for composite SaaS. This research addresses this gap by focusing on three new problems of composite SaaS: placement, clustering and scalability. The overall aim is to develop efficient and scalable mechanisms that facilitate the delivery of high-performance composite SaaS for users while optimising the resources used. All three problems are characterised as highly constrained, large-scale and complex combinatorial optimisation problems. Therefore, evolutionary algorithms are adopted as the main technique in solving them. The first research problem concerns how a composite SaaS is placed onto Cloud servers to optimise its performance while satisfying the SaaS resource and response time constraints. Existing research on this problem often ignores the dependencies between components and considers placement of a homogeneous type of component only. A precise formulation of the composite SaaS placement problem is presented. A classical genetic algorithm and two versions of cooperative co-evolutionary algorithms are designed to manage the placement of heterogeneous types of SaaS components together with their dependencies, requirements and constraints. Experimental results demonstrate the efficiency and scalability of these new algorithms. In the second problem, SaaS components are assumed to be already running on Cloud virtual machines (VMs). However, due to the dynamic nature of the Cloud environment, the current placement may need to be modified. Existing techniques focus mostly on the infrastructure level rather than the application level. This research addresses the problem at the application level by clustering suitable components onto VMs to optimise the resources used and to maintain the SaaS performance. Two versions of grouping genetic algorithms (GGAs) are designed to cater for the structural grouping of a composite SaaS. The first GGA uses a repair-based method, while the second uses a penalty-based method to handle the problem constraints. The experimental results confirm that the GGAs always produce a better reconfiguration placement plan compared with a common heuristic for clustering problems. The third research problem deals with the replication or deletion of SaaS instances to cope with the SaaS workload. Determining a scaling plan that minimises the resources used while maintaining SaaS performance is a critical task. Additionally, the problem involves constraints and interdependencies between components, making solutions even more difficult to find. 
A hybrid genetic algorithm (HGA) was developed to solve this problem by exploring the problem's search space through its genetic operators and fitness function to determine the SaaS scaling plan. The HGA also uses the problem's domain knowledge to ensure that the solutions meet the problem's constraints and achieve its objectives. The experimental results demonstrated that the HGA consistently outperforms a heuristic algorithm by achieving a low-cost scaling and placement plan. This research has identified three significant new problems for composite SaaS in the Cloud. Various types of evolutionary algorithms have been developed to address these problems, contributing to the field of evolutionary computation. The algorithms provide solutions for efficient resource management of composite SaaS in the Cloud, resulting in a low total cost of ownership for users while guaranteeing SaaS performance.
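To give a concrete flavour of the penalty-based constraint handling mentioned above, the following is a small, self-contained Python sketch of a placement-style genetic algorithm. It is a toy illustration only, not the thesis's algorithms: the cost model (number of servers used plus penalised capacity and latency violations), the instance data and all parameter values are assumptions made for the example:

    import random

    def penalty_ga(demands, capacities, latency, max_latency,
                   pop_size=60, generations=200, penalty=1e3):
        """Toy penalty-based GA: assign each SaaS component (with a resource
        demand) to a server so that few servers are used, while capacity and a
        per-component latency bound are respected. Violations are not repaired;
        they are penalised in the fitness (the penalty-based style of handling
        constraints)."""
        n, m = len(demands), len(capacities)

        def cost(assign):
            load = [0.0] * m
            violation = 0.0
            for comp, srv in enumerate(assign):
                load[srv] += demands[comp]
                if latency[srv] > max_latency:
                    violation += latency[srv] - max_latency
            violation += sum(max(0.0, load[s] - capacities[s]) for s in range(m))
            return len(set(assign)) + penalty * violation  # lower is better

        pop = [[random.randrange(m) for _ in range(n)] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=cost)
            survivors = pop[: pop_size // 2]
            children = []
            while len(survivors) + len(children) < pop_size:
                a, b = random.sample(survivors, 2)
                cut = random.randrange(1, n)          # one-point crossover
                child = a[:cut] + b[cut:]
                if random.random() < 0.3:             # mutation: move one component
                    child[random.randrange(n)] = random.randrange(m)
                children.append(child)
            pop = survivors + children
        best = min(pop, key=cost)
        return best, cost(best)

    # Tiny hypothetical instance: 5 components, 3 candidate servers.
    demands    = [2, 3, 1, 4, 2]
    capacities = [6, 6, 6]
    latency    = [5, 12, 7]   # assumed per-server response-time figures
    best, best_cost = penalty_ga(demands, capacities, latency, max_latency=10)
    print(best, best_cost)

A repair-based variant would instead move components off overloaded servers after crossover and mutation, so that every individual in the population stays feasible.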
Abstract:
A Distributed Wireless Smart Camera (DWSC) network is a special type of Wireless Sensor Network (WSN) that processes captured images in a distributed manner. While image processing on DWSCs has great potential for growth, with practical applications in domains such as security surveillance and health care, it suffers from severe constraints. In addition to the limitations of conventional WSNs, image processing on DWSCs requires more computational power, bandwidth and energy, which presents significant challenges for large-scale deployments. This dissertation has developed a number of algorithms that are highly scalable, portable, energy efficient and performance efficient, with consideration of the practical constraints imposed by the hardware and the nature of WSNs. More specifically, these algorithms tackle the problems of multi-object tracking and localisation in distributed wireless smart camera networks and of optimal camera configuration determination. Addressing the first problem of multi-object tracking and localisation requires solving a large array of sub-problems. The sub-problems discussed in this dissertation are calibration of internal parameters, multi-camera calibration for localisation and object handover for tracking. These topics have been covered extensively in the computer vision literature; however, new algorithms must be invented to accommodate the various constraints introduced and required by the DWSC platform. A technique has been developed for the automatic calibration of low-cost cameras which are assumed to be restricted in their freedom of movement to either pan or tilt movements. Camera internal parameters, including focal length, principal point, lens distortion parameter and the angle and axis of rotation, can be recovered from a minimum set of two images from the camera, provided that the axis of rotation between the two images goes through the camera's optical centre and is parallel to either the vertical (panning) or horizontal (tilting) axis of the image. For object localisation, a novel approach has been developed for the calibration of a network of non-overlapping DWSCs in terms of their ground plane homographies, which can then be used for localising objects. In the proposed approach, a robot travels through the camera network while updating its position in a global coordinate frame, which it broadcasts to the cameras. The cameras use this, along with the image plane location of the robot, to compute a mapping from their image planes to the global coordinate frame. This is combined with an occupancy map generated by the robot during the mapping process to localise objects moving within the network. In addition, to deal with the problem of object handover between DWSCs with non-overlapping fields of view, a highly scalable, distributed protocol has been designed. Cameras that follow the proposed protocol transmit object descriptions to a selected set of neighbours that are determined using a predictive forwarding strategy. The received descriptions are then matched at the subsequent camera on the object's path, using a probability maximisation process with locally generated descriptions. The second problem, camera placement, emerges naturally when these pervasive devices are put into real use. The locations, orientations, lens types, etc. of the cameras must be chosen such that the utility of the network is maximised (e.g. maximum coverage) while user requirements are met. 
To deal with this, a statistical formulation of the problem of determining optimal camera configurations has been introduced and a Trans-Dimensional Simulated Annealing (TDSA) algorithm has been proposed to effectively solve the problem.
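The robot-assisted ground-plane calibration described above amounts to fitting, for each camera, a homography between image-plane points and the robot's broadcast global coordinates. A minimal Python sketch of that fitting step, using a direct linear transform with hypothetical correspondence values (not the dissertation's implementation), is:

    import numpy as np

    def fit_homography(image_pts, world_pts):
        """Direct linear transform: fit the 3x3 homography H that maps image-plane
        points (pixels) to ground-plane points (global coordinates) from four or
        more correspondences, e.g. observed robot positions paired with the
        global coordinates the robot broadcasts."""
        A = []
        for (x, y), (X, Y) in zip(image_pts, world_pts):
            A.append([-x, -y, -1, 0, 0, 0, X * x, X * y, X])
            A.append([0, 0, 0, -x, -y, -1, Y * x, Y * y, Y])
        _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
        H = Vt[-1].reshape(3, 3)
        return H / H[2, 2]

    def to_world(H, x, y):
        """Map a single pixel location to ground-plane coordinates."""
        p = H @ np.array([x, y, 1.0])
        return p[:2] / p[2]

    # Hypothetical correspondences: four robot sightings (pixels) and the
    # global positions (metres) the robot broadcast at those moments.
    img = [(100, 400), (500, 420), (480, 120), (120, 140)]
    wld = [(0.0, 0.0), (4.0, 0.0), (4.0, 6.0), (0.0, 6.0)]
    H = fit_homography(img, wld)
    print(to_world(H, 300, 270))   # a point roughly in the middle of the area

In practice the robot would be sighted many times per camera, and a robust fit over those correspondences (for example with RANSAC) would be preferred to an exact four-point solution.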
Abstract:
The purpose of this paper is to provide an evolutionary perspective of cloud computing (CC) by integrating two previously disparate literatures: CC and information technology outsourcing (ITO). We review the literature and develop a framework that highlights the demand for the CC service, benefits, risks, as well as risk mitigation strategies that are likely to influence the success of the service. CC success in organisations and as a technology overall is a function of (i) the outsourcing decision and supplier selection, (ii) contractual and relational governance, and (iii) industry standards and legal framework. Whereas CC clients have little control over standards and/or the legal framework, they are able to influence other factors to maximise the benefits while limiting the risks. This paper provides guidelines for (potential) cloud computing users with respect to the outsourcing decision, vendor selection, service-level agreements, and other issues that need to be addressed when opting for CC services. We contribute to the literature by providing an evolutionary and holistic view of CC that draws on the extensive literature and theory of ITO. We conclude the paper with a number of research paths that future researchers can follow to advance the knowledge in this field.
Abstract:
Issue addressed: Although increases in cycling in Brisbane are encouraging, bicycle mode share to work in the state of Queensland remains low. The aim of this qualitative study was to draw upon the lived experiences of Queensland cyclists to understand the main motivators for utility cycling (cycling as a means to get to and from places) and compare motivators between utility cyclists (those who cycle for utility as well as for recreation) and non-utility cyclists (those who cycle only for recreation). Methods: For an online survey, members of a bicycle group (831 utility cyclists and 931 non-utility cyclists, aged 18-90 years) were asked to describe, unprompted, what would motivate them to engage in utility cycling (more often). Responses were coded into themes within four levels of an ecological model. Results: Within an ecological model, built environment influences on motivation were grouped according to whether they related to appeal (safety), convenience (accessibility) or attractiveness (more amenities) and included adequate infrastructure for short trips, bikeway connectivity, end-of-trip facilities at public locations and easy and safe bicycle access to destinations outside of cities. A key social-cultural influence related to improved interactions among different road users. Conclusions: The built and social-cultural environments need to be more supportive of utility cycling before even current utility and non-utility cyclists will be motivated to engage (more often) in utility cycling. So what?: Additional government strategies and more and better infrastructure that support utility cycling beyond commuter cycling may encourage a utility cycling culture.
Abstract:
This special issue of the Journal of Urban Technology brings together five articles that are based on presentations given at the Street Computing Workshop held on 24 November 2009 in Melbourne in conjunction with the Australian Computer-Human Interaction conference (OZCHI 2009). Our own article introduces the Street Computing vision and explores the potential, challenges, and foundations of this research trajectory. In order to do so, we first look at the currently available sources of information and discuss their link to existing research efforts. Section 2 then introduces the notion of Street Computing and our research approach in more detail. Section 3 looks beyond the core concept itself and summarizes related work in this field of interest. We conclude by introducing the papers that have been contributed to this special issue.
Abstract:
Acoustic sensors are increasingly used to monitor biodiversity. They can remain deployed in the environment for extended periods to passively and objectively record the sounds of the environment. The collected acoustic data must be analyzed to identify the presence of the sounds made by fauna in order to understand biodiversity. Citizen scientists play an important role in analyzing this data by annotating calls and identifying species. This paper presents our research into bioacoustic annotation techniques. It describes our work in defining a process for managing, creating, and using tags that are applied to our annotations. This paper includes a detailed description of our methodology for correcting and then linking our folksonomic tags to taxonomic data sources. Providing tools and processes for maintaining species naming consistency is critical to the success of a project designed to generate scientific data. We demonstrate that cleaning the folksonomic data and providing links to external taxonomic authorities enhances the scientific utility of the tagging efforts of citizen scientists.
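The tag-cleaning and linking workflow described here can be pictured with a short sketch: normalise the free-text tags that citizen scientists supply, then match them against an authoritative species list. The Python below is a minimal illustration under assumed data; the authority entries, identifiers and matching threshold are invented for the example and are not the project's actual pipeline:

    import difflib
    import re

    # Hypothetical taxonomic authority: accepted name -> stable identifier.
    AUTHORITY = {
        "litoria fallax": "urn:lsid:example.org:taxon:12345",
        "corvus orru": "urn:lsid:example.org:taxon:67890",
    }

    def normalise(tag):
        """Basic folksonomy cleaning: lower-case, drop stray punctuation and
        collapse repeated whitespace."""
        cleaned = re.sub(r"[^a-z ]", "", tag.lower())
        return re.sub(r"\s+", " ", cleaned).strip()

    def link_tag(tag, cutoff=0.8):
        """Link a citizen-science tag to the authority list, tolerating small
        misspellings via fuzzy string matching."""
        cleaned = normalise(tag)
        match = difflib.get_close_matches(cleaned, AUTHORITY, n=1, cutoff=cutoff)
        return (match[0], AUTHORITY[match[0]]) if match else None

    print(link_tag("Litoria  fallax!"))   # exact match after cleaning
    print(link_tag("litoria falax"))      # misspelled, still linked
    print(link_tag("unknown frog"))       # no confident match -> None

Fuzzy matching catches common misspellings, while anything below the cutoff can be routed to a human curator rather than being linked automatically.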
Abstract:
The topic of “the cloud” has attracted significant attention throughout the past few years (Cherry 2009; Sterling and Stark 2009) and, as a result, academics and trade journals have created several competing definitions of “cloud computing” (e.g., Motahari-Nezhad et al. 2009). Underpinning this article is the definition put forward by the US National Institute of Standards and Technology, which describes cloud computing as “a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction” (Garfinkel 2011, p. 3). Despite the lack of consensus about definitions, however, there is broad agreement on the growing demand for cloud computing. Some estimates suggest that spending on cloud-related technologies and services in the next few years may climb as high as USD 42 billion/year (Buyya et al. 2009).