947 results for Emergency Communication Costs.
Abstract:
Distributed parallel execution systems speed up applications by splitting tasks into processes whose execution is assigned to different receiving nodes in a high-bandwidth network. On the distributing side, a fundamental problem is grouping and scheduling such tasks so that each one involves sufficient computational cost compared to the task creation and communication costs and other such practical overheads. On the receiving side, an important issue is to have some assurance of the correctness and characteristics of the code received, and also of the kind of load the particular task is going to pose, which can be specified by means of certificates. In this paper we present, in a tutorial way, a number of general solutions to these problems, and illustrate them through their implementation in the Ciao multi-paradigm language and program development environment. This system includes facilities for parallel and distributed execution, an assertion language for specifying complex program properties (including safety and resource-related properties), and compile-time and run-time tools for performing automated parallelization and resource control, as well as certification of programs with resource consumption assurances and efficient checking of such certificates.
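The granularity-control idea above can be sketched in a few lines. This is an illustrative model only, not Ciao's actual machinery: the cost function, threshold, and all names below are our assumptions. The point is simply that a recursive call is worth spawning as a parallel task only when its statically estimated cost exceeds the task-creation and communication overhead.

```python
# Minimal sketch of granularity control (all names and numbers are
# illustrative, not Ciao's actual API): spawn a parallel task only
# when the estimated cost of the call exceeds the combined
# task-creation and communication overhead.

OVERHEAD = 50  # assumed cost of creating + shipping one task, in cost units

def cost(n):
    # assumed cost model: naive fib(n) makes on the order of phi^n calls
    return 1.618 ** n

def schedule(n, spawned, inlined):
    """For each recursive call of a toy fib(n), decide whether to
    spawn it as a remote task or inline it sequentially."""
    if n < 2:
        return
    if cost(n) > OVERHEAD:
        spawned.append(n)   # large enough: pays for the overhead
    else:
        inlined.append(n)   # too fine-grained: keep it local
    schedule(n - 1, spawned, inlined)
    schedule(n - 2, spawned, inlined)

spawned, inlined = [], []
schedule(10, spawned, inlined)
# calls whose estimated cost is above the threshold are parallelised,
# the rest are executed sequentially on the local node
```

In a real system the decision would be made at compile time where possible, with run-time tests only for costs that depend on input sizes.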
Abstract:
"November, 1989."
Abstract:
"December 8, 2005."
Abstract:
Advances in low-power microprocessors, wireless networks and embedded systems have raised the need to utilize the significant resources of mobile devices. These devices, for example smart phones, tablets, laptops, wearables, and sensors, are gaining enormous processing power, storage capacity and wireless bandwidth. In addition, advances in wireless mobile technology have created a new communication paradigm via which a wireless network can be created without any a priori infrastructure, called a mobile ad hoc network (MANET). While progress is being made towards improving the efficiency of mobile devices and the reliability of wireless mobile networks, mobile technology continues to face the challenges of unpredictable disconnections, dynamic mobility and the heterogeneity of routing protocols. Hence, traditional wired and wireless routing protocols are not suitable for MANETs due to their unique dynamic ad hoc nature. For this reason, the research community has developed, and continues to develop, routing protocols for MANETs to cope with these challenges. However, no single generic ad hoc routing protocol is available so far that can address all the basic challenges of MANETs mentioned above. This diverse and ever-growing range of routing protocols has created barriers that prevent mobile nodes of different MANET taxonomies from intercommunicating, wasting a huge amount of valuable resources. To provide interaction between heterogeneous MANETs, the routing protocols require conversion of packets, meta-models and their behavioural capabilities. Here, the fundamental challenge is to understand the packet-level message format, meta-model and behaviour of different routing protocols, which differ significantly across MANET taxonomies. To overcome these issues, this thesis proposes an Interoperable Framework for heterogeneous MANETs called IF-MANET.
The framework hides the complexities of heterogeneous routing protocols and provides a homogeneous layer for seamless communication between these routing protocols. The framework creates a unique Ontology for MANET routing protocols and a Message Translator to semantically compare packets and generate the missing fields using the rules defined in the Ontology. Hence, translation between existing as well as newly arriving routing protocols is achieved dynamically and on the fly. To discover a route for the delivery of packets across heterogeneous MANET taxonomies, IF-MANET creates a special Gateway node to provide cluster-based inter-domain routing. The IF-MANET framework can be used to develop different middleware applications. For example, mobile grid computing could potentially utilise huge amounts of aggregated data collected from heterogeneous mobile devices, and disaster and crisis management applications could provide on-the-fly, infrastructure-less emergency communication across organisations by utilising different MANET taxonomies.
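The ontology-driven translation described above can be illustrated with a toy example. Everything here is invented for illustration: the field names, the "ontology" table, and the fill-in rules are not IF-MANET's actual schema (and the AODV/OLSR field names are simplified). The sketch shows the core idea: map each protocol's packet fields to canonical ontology concepts, and generate any field the source protocol lacks from a rule.

```python
# Illustrative sketch only: field names and rules are invented to show
# ontology-driven packet translation, not IF-MANET's actual schema.

# Shared "ontology": canonical concept -> per-protocol field name
# (None means the protocol's packet has no slot for that concept).
ONTOLOGY = {
    "source":      {"AODV": "orig_addr", "OLSR": "originator"},
    "destination": {"AODV": "dest_addr", "OLSR": "dest"},
    "hop_count":   {"AODV": "hop_count", "OLSR": None},
}

# Rules for synthesising fields missing from the source packet.
RULES = {
    "hop_count": lambda pkt: 0,  # assumed default when the source has no hops
}

def translate(pkt, src_proto, dst_proto):
    """Rewrite a packet from src_proto's field names into dst_proto's,
    filling missing fields via the ontology rules."""
    out = {}
    for concept, names in ONTOLOGY.items():
        src_name, dst_name = names[src_proto], names[dst_proto]
        if dst_name is None:
            continue  # target protocol cannot carry this concept
        if src_name is not None and src_name in pkt:
            out[dst_name] = pkt[src_name]   # direct semantic match
        else:
            out[dst_name] = RULES[concept](pkt)  # generate the missing field
    return out

olsr_pkt = {"originator": "10.0.0.1", "dest": "10.0.0.9"}
aodv_pkt = translate(olsr_pkt, "OLSR", "AODV")
```

A new protocol would be supported by adding one column to the ontology table, which is what makes on-the-fly translation of newly arriving protocols plausible.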
Abstract:
Heterogeneous computing systems have become common in modern processor architectures. These systems, such as those released by AMD, Intel, and Nvidia, include both CPU and GPU cores on a single die, with reduced communication overhead compared to their discrete predecessors. Currently, discrete CPU/GPU systems are limited, requiring large, regular, highly parallel workloads to overcome the communication costs of the system. Without the traditional communication delay assumed between GPUs and CPUs, we believe non-traditional workloads could be targeted for GPU execution. Specifically, this thesis focuses on the execution model of nested parallel workloads on heterogeneous systems. We have designed a simulation flow which utilizes widely used CPU and GPU simulators to model heterogeneous computing architectures. We then applied this simulator to non-traditional GPU workloads using different execution models. We have also proposed a new execution model for nested parallelism, allowing users to exploit these heterogeneous systems to reduce execution time.
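The economic argument in this abstract can be made concrete with a toy offload model. All numbers below are invented for illustration and are not the thesis's measured parameters: the point is only that removing the fixed discrete-GPU transfer cost moves the break-even point, so much smaller (and thus more irregular, nested) kernels become worth offloading.

```python
# Toy offload decision model (all numbers invented for illustration):
# offloading pays only when GPU time plus transfer cost beats CPU time.

def offload_pays(work_items, per_item_speedup=4.0,
                 cpu_time_per_item=1.0, transfer_cost=0.0):
    """Return True if running on the GPU beats staying on the CPU."""
    cpu = work_items * cpu_time_per_item
    gpu = transfer_cost + work_items * cpu_time_per_item / per_item_speedup
    return gpu < cpu

# Discrete GPU: the fixed transfer cost makes a small workload stay on the CPU
assert not offload_pays(100, transfer_cost=500.0)
# Single die: with no transfer cost, the same small workload benefits from the GPU
assert offload_pays(100, transfer_cost=0.0)
```

An execution model for nested parallelism can exploit exactly this shifted break-even point, offloading inner parallel regions that a discrete system would have to run on the CPU.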
Abstract:
Compressed covariance sensing using quadratic samplers is gaining increasing interest in the recent literature. The covariance matrix often plays the role of a sufficient statistic in many signal and information processing tasks. However, owing to the large dimension of the data, it may become necessary to obtain a compressed sketch of the high-dimensional covariance matrix to reduce the associated storage and communication costs. Nested sampling has been proposed in the past as an efficient sub-Nyquist sampling strategy that enables perfect reconstruction of the autocorrelation sequence of Wide-Sense Stationary (WSS) signals, as though it were sampled at the Nyquist rate. The key idea behind nested sampling is to exploit properties of the difference set that naturally arises in the quadratic measurement model associated with covariance compression. In this thesis, we focus on developing novel versions of nested sampling for low-rank Toeplitz covariance estimation and phase retrieval, where the latter problem finds many applications in high-resolution optical imaging, X-ray crystallography and molecular imaging. The problem of low-rank compressive Toeplitz covariance estimation is first shown to be fundamentally related to that of line spectrum recovery. In the absence of noise, this connection can be exploited to develop a particular kind of sampler, called the Generalized Nested Sampler (GNS), that can achieve optimal compression rates. In the presence of bounded noise, we develop a regularization-free algorithm that provably leads to stable recovery of the high-dimensional Toeplitz matrix from its order-wise minimal sketch acquired using a GNS. Contrary to existing TV-norm and nuclear-norm based reconstruction algorithms, our technique does not use any tuning parameters, which can be of great practical value.
The nested sampling idea also finds a surprising use in the problem of phase retrieval, which has been of great interest in recent times for its convex formulation via PhaseLift. By using another modified version of nested sampling, namely the Partial Nested Fourier Sampler (PNFS), we show that, with probability one, it is possible to achieve a certain conjectured lower bound on the necessary measurement size. Moreover, for sparse data, an l1-minimization-based algorithm is proposed that can lead to stable phase retrieval using an order-wise minimal number of measurements.
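The difference-set property that nested sampling exploits is easy to check numerically. The sketch below uses the standard two-level nested array (dense level 1..N1, sparse level at multiples of N1+1); the parameter names are ours. With only N1+N2 physical sample positions, the set of pairwise differences is hole-free over N2*(N1+1) consecutive lags, which is why the autocorrelation can be reconstructed as if sampled at the Nyquist rate.

```python
# Standard two-level nested array: a dense inner level and a sparse
# outer level whose pairwise differences cover every lag with no holes.

def nested_positions(n1, n2):
    inner = list(range(1, n1 + 1))                    # dense level: 1..N1
    outer = [(n1 + 1) * m for m in range(1, n2 + 1)]  # sparse level
    return inner + outer

def difference_set(pos):
    """All pairwise differences, i.e. the lags reachable by the
    quadratic (correlation) measurement model."""
    return {a - b for a in pos for b in pos}

pos = nested_positions(3, 3)   # only 6 physical sample positions
lags = difference_set(pos)
# every lag 0 .. N2*(N1+1)-1 = 11 is present, despite sub-Nyquist sampling
```

This hole-free co-array is the property the GNS and PNFS variants in the thesis generalize.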
Abstract:
The direct and indirect health effects of increasingly warmer temperatures are likely to further burden the already overcrowded hospital emergency departments (EDs). Using current trends and estimates in conjunction with future population growth and climate change scenarios, we show that the increased number of hot days in the future can have a considerable impact on EDs, adding to their workload and costs. The excess number of visits in 2030 is projected to range between 98–336 and 42–127 for younger and older groups, respectively. The excess costs in 2012–13 prices are estimated to range between AU$51,000–184,000 (0–64) and AU$27,000–84,000 (65+). By 2060, these estimates will increase to 229–2300 and 145–1188 at a cost of between AU$120,000–1,200,000 and AU$96,000–786,000 for the respective age groups. Improvements in climate change mitigation and adaptation measures are likely to generate synergistic health co-benefits and reduce the impact on frontline health services.
Abstract:
- Objective To compare health service cost and length of stay between a traditional and an accelerated diagnostic approach to assessing acute coronary syndromes (ACS) among patients who presented to the emergency department (ED) of a large tertiary hospital in Australia. - Design, setting and participants This historically controlled study analysed data collected from two independent patient cohorts presenting to the ED with potential ACS. The first cohort of 938 patients was recruited in 2008–2010, and these patients were assessed using the traditional diagnostic approach detailed in the national guideline. The second cohort of 921 patients was recruited in 2011–2013 and was assessed with the accelerated diagnostic approach named the Brisbane protocol. The Brisbane protocol applied early serial troponin testing at 0 and 2 h after presentation to the ED, compared with 0 and 6 h testing in the traditional assessment process. The Brisbane protocol also defined a low-risk group of patients in whom no objective testing was performed. A decision tree model was used to compare the expected cost and length of stay in hospital between the two approaches. Probabilistic sensitivity analysis was used to account for model uncertainty. - Results Compared with the traditional diagnostic approach, the Brisbane protocol was associated with a reduced expected cost of $1229 (95% CI −$1266 to $5122) and a reduced expected length of stay of 26 h (95% CI −14 to 136 h). The Brisbane protocol allowed physicians to discharge a higher proportion of low-risk and intermediate-risk patients from the ED within 4 h (72% vs 51%). Results from sensitivity analysis suggested the Brisbane protocol had a high chance of being cost-saving and time-saving. - Conclusions This study provides some evidence of cost savings from a decision to adopt the Brisbane protocol. Benefits would arise for the hospital and for patients and their families.
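A decision tree model of the kind used in this study reduces, at its core, to a probability-weighted sum over branches. The sketch below uses invented probabilities and costs (only the 72% vs 51% discharge proportions echo the abstract; the dollar figures are not the study's parameters) to show how an expected-cost comparison between two pathways is computed.

```python
# Toy decision-tree comparison. Probabilities and branch costs are
# invented for illustration; they are not the study's model parameters.

def expected(tree):
    """Expected value of a decision tree given as (probability, cost) leaves."""
    return sum(p * v for p, v in tree)

traditional = [(0.51, 900.0),   # discharged within 4 h
               (0.49, 3200.0)]  # longer work-up / admission
brisbane    = [(0.72, 800.0),   # higher early-discharge proportion
               (0.28, 3000.0)]

saving = expected(traditional) - expected(brisbane)
# a positive saving favours the accelerated (Brisbane) pathway
```

Probabilistic sensitivity analysis then repeats this calculation many times with the probabilities and costs drawn from distributions, yielding the chance that the saving is positive.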
Abstract:
The development of an adaptive filter system capable of significantly reducing the effect of siren noise within the cab of an emergency vehicle is described. The system is capable of removing the siren noise picked up by the radio microphone inside the vehicle without degrading the wanted voice signal, thus allowing the siren to be used at all times.
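The abstract does not specify the filter design, but the standard technique for this problem is LMS adaptive noise cancellation: a filter driven by a reference tap of the siren adapts until its output matches the siren component in the microphone signal, and the residual is the voice. The sketch below is that generic technique, not the paper's specific implementation; tap count and step size are arbitrary illustrative choices.

```python
# Generic LMS adaptive noise canceller (a sketch of the standard
# technique, not the paper's specific design). The reference input is
# the siren signal; the residual e = primary - filter(reference)
# approximates the voice.

import math

def lms_cancel(primary, reference, taps=8, mu=0.01):
    """Return the residual (voice estimate) after cancelling the
    reference-correlated component from the primary input."""
    w = [0.0] * taps
    out = []
    for n in range(len(primary)):
        # current window of the reference signal (zero-padded at the start)
        x = [reference[n - k] if n - k >= 0 else 0.0 for k in range(taps)]
        y = sum(wi * xi for wi, xi in zip(w, x))     # adaptive filter output
        e = primary[n] - y                           # residual / voice estimate
        w = [wi + 2 * mu * e * xi for wi, xi in zip(w, x)]  # LMS weight update
        out.append(e)
    return out

# Siren-only check: with no voice present, the residual should decay
# towards zero as the filter converges on the siren.
siren = [math.sin(0.3 * n) for n in range(2000)]
residual = lms_cancel(primary=siren, reference=siren)
```

In the vehicle, the primary input is the cab microphone (voice plus siren) and the reference is taken directly from the siren drive signal, so only the siren component is correlated with the reference and gets cancelled.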
Abstract:
Diagnostic and therapeutic approaches to trauma patients are, depending on experience, equipment and different therapeutic doctrines, subject to wide variations. The ability to compare trauma centres using a standardised trauma register helps to reveal unresolved systemic issues and simplifies the quality management in an Emergency Department (ED).
Abstract:
Emergency Departments (EDs) and Emergency Rooms (ERs) are designed to manage trauma, respond to disasters, and serve as the initial point of care for those with serious illnesses. However, because of many factors, the ED has become the doorway to the hospital and a “catch-all net” for patients, including those with non-urgent needs. This increase in the ED population has led to an increase in wait times for patients. It has been well documented that there has been a constant and consistent rise in the number of patients that frequent the ED (National Center for Health Statistics, 2002); the wait time for patients in the ED has increased (Pitts, Niska, Xu, & Burt, 2008); and the cost of treatment in the ER has risen (Everett Clinic, 2008). Because the ED was designed to treat patients who need quick diagnoses and may be in potentially life-threatening circumstances, management of time can be the ultimate enemy. If a system were implemented to decrease wait times in the ED, decrease the use of ED resources, and decrease costs endured by patients seeking care, better outcomes for patients and greater patient satisfaction could be achieved. The goal of this research was to explore potential changes and/or alternatives to relieve the burden endured by the ED. In order to explore these options, data were collected by conducting one-on-one interviews with seven physicians closely tied to a Level 1 ED (Emergency Room physicians, Trauma Surgeons and Primary Care physicians). A qualitative analysis was performed on the responses to these interviews. The interviews consisted of standardized, open-ended questions that probed what makes an effective ED, possible solutions to improving patient care in the ED, potential remedies for the mounting problems that plague the ED, and the feasibility of bringing Primary Care Physicians to the ED to decrease the wait times experienced by patients.
From the responses, it is clear that more research is needed in this area, that several issues need to be addressed, and that a variety of solutions could be implemented. The most viable option appears to be making the ED its own entity (similar to a clinic or hospital) that includes urgent care clinics as part of the system, in which triage and better staffing would be the most integral parts of its success.