115 results for room set up


Relevance: 80.00%

Abstract:

The climate in the Arctic is changing faster than anywhere else on earth. Poorly understood feedback processes relating to Arctic clouds and aerosol–cloud interactions contribute to a poor understanding of the present changes in the Arctic climate system, and also to a large spread in projections of future climate in the Arctic. The problem is exacerbated by the paucity of research-quality observations in the central Arctic. Improved formulations in climate models require such observations, which can only come from measurements in situ in this difficult-to-reach region with logistically demanding environmental conditions. The Arctic Summer Cloud Ocean Study (ASCOS) was the most extensive central Arctic Ocean expedition with an atmospheric focus during the International Polar Year (IPY) 2007–2008. ASCOS focused on the study of the formation and life cycle of low-level Arctic clouds. ASCOS departed from Longyearbyen on Svalbard on 2 August and returned on 9 September 2008. In transit into and out of the pack ice, four short research stations were occupied in the Fram Strait: two in open water and two in the marginal ice zone. After traversing the pack ice northward, an ice camp was set up on 12 August at 87°21' N, 01°29' W and remained in operation through 1 September, drifting with the ice. During this time, extensive measurements were taken of atmospheric gas and particle chemistry and physics, mesoscale and boundary-layer meteorology, marine biology and chemistry, and upper-ocean physics. ASCOS provides a unique interdisciplinary data set for the development and testing of new hypotheses on cloud processes, their interactions with the sea ice and ocean, and the associated physical, chemical, and biological processes and interactions.
For example, the first-ever quantitative observation of bubbles in Arctic leads, combined with the unique discovery of marine organic material (polymer gels originating in the ocean) inside cloud droplets, suggests the possibility of primary, marine, organically derived cloud condensation nuclei in Arctic stratocumulus clouds. Direct observations of surface fluxes of aerosols could not, however, explain the observed variability in aerosol concentrations, and the balance between local and remote aerosol sources remains an open question. A lack of cloud condensation nuclei (CCN) was at times a controlling factor in low-level cloud formation, and hence in the impact of clouds on the surface energy budget. ASCOS provided detailed measurements of the surface energy balance from the late-summer melt into the initial autumn freeze-up, and documented the effects of clouds and storms on the surface energy balance during this transition. In addition to such process-level studies, the unique, independent ASCOS data set can be, and is being, used for validation of satellite retrievals, operational models, and reanalysis data sets.

Relevance: 80.00%

Abstract:

Background: The Australian National Hand Hygiene Initiative (NHHI) is a major patient safety programme co-ordinated by Hand Hygiene Australia (HHA) and funded by the Australian Commission for Safety and Quality in Health Care. The annual costs of running this programme need to be understood to know the cost-effectiveness of a decision to sustain it as part of health services.

Aim: To estimate the annual health services cost of running the NHHI; the set-up costs are excluded.

Methods: A health services perspective was adopted for the costing, and data were collected from the 50 largest public hospitals in Australia that implemented the initiative, covering all states and territories. The costs of HHA, the costs to the state-level infection-prevention groups, the costs incurred by each acute hospital, and the costs of additional alcohol-based hand rub are all included.

Findings: The programme cost AU$5.56 million each year (US$5.76 million, £3.63 million). Most of the cost was incurred at the hospital level (65%) and arose from the extra time taken to audit hand hygiene compliance and to deliver education and training. On average, each infection control practitioner spent 5 h per week on the NHHI, and the running cost per annum to their hospital was approximately AU$120,000 in 2012 (US$124,000, £78,000).

Conclusion: Good estimates of the total costs of this programme are fundamental to understanding the cost-effectiveness of implementing the NHHI. This paper reports transparent costing methods, and the results include their uncertainty.
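The costing described above is an aggregation across four components. A minimal sketch of that arithmetic, with every figure below an illustrative placeholder rather than data from the study:

```python
# Illustrative sketch of the NHHI annual-cost aggregation described above.
# All numbers used in the example are placeholder assumptions.

def annual_programme_cost(central_coordination, state_groups,
                          hospital_costs, extra_handrub):
    """Sum the four components taken from a health-services perspective:
    national coordination (HHA), state infection-prevention groups,
    per-hospital running costs, and additional alcohol-based hand rub."""
    return central_coordination + state_groups + sum(hospital_costs) + extra_handrub

def hospital_running_cost(icp_hours_per_week, hourly_rate, other_costs):
    """Hypothetical per-hospital cost: infection control practitioner
    time spent on auditing, education and training, plus other costs."""
    return icp_hours_per_week * 52 * hourly_rate + other_costs

# Example (placeholder values, AU$ millions):
total = annual_programme_cost(1.0, 0.5, [2.0, 1.5], 0.5)
```

The point of the sketch is only that the headline AU$5.56 million figure is a sum of separately costed perspectives, which is what makes the method transparent.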

Relevance: 80.00%

Abstract:

Unified communications as a service (UCaaS) can be regarded as a cost-effective model for on-demand delivery of unified communications services in the cloud. However, addressing security concerns has been seen as the biggest challenge to the adoption of IT services in the cloud. This study set up a cloud system via the VMware suite to emulate hosting unified communications (UC) services, the integration of two or more real-time communication systems, in the cloud in a laboratory environment. An Internet Protocol Security (IPSec) gateway was also set up to provide network-level security for UCaaS against possible security exposures. The study was aimed at analysing an implementation of UCaaS over IPSec and evaluating the latency of encrypted UC traffic while protecting that traffic. Our test results show no measurable added latency when IPSec is implemented with the G.711 audio codec. However, with the G.722 audio codec, the IPSec implementation affects the overall performance of the UC server. These results offer technical guidance to those implementing UC security controls on premises as well as in the cloud.
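Part of what makes IPSec's cost visible in VoIP traffic is per-packet overhead: voice packets are small and frequent, so fixed header bytes are a large fraction of each packet. A rough per-stream bandwidth estimate, with the ESP tunnel allowance treated as an assumed constant (real ESP overhead varies with cipher, padding, and tunnel vs transport mode):

```python
def voip_stream_bandwidth_bps(codec_bps, ptime_s, overhead_bytes):
    """Bandwidth of one RTP voice stream including per-packet overhead.

    codec_bps      -- codec bit rate (G.711 and G.722 are both nominally 64 kbit/s)
    ptime_s        -- packetisation interval (commonly 0.02 s)
    overhead_bytes -- per-packet header bytes (RTP+UDP+IP, plus an assumed
                      ESP/tunnel allowance when IPSec is enabled)
    """
    payload_bytes = codec_bps * ptime_s / 8     # audio bytes per packet
    packets_per_second = 1 / ptime_s
    return (payload_bytes + overhead_bytes) * 8 * packets_per_second

# G.711, 20 ms packets, RTP(12) + UDP(8) + IPv4(20) = 40 header bytes:
plain = voip_stream_bandwidth_bps(64_000, 0.02, 40)        # 80000.0 bit/s
# Same stream with an assumed ~57-byte ESP tunnel-mode allowance added:
ipsec = voip_stream_bandwidth_bps(64_000, 0.02, 40 + 57)
```

This kind of back-of-envelope figure explains why encryption overhead shows up as bandwidth and processing load per packet rather than as large audio delay, consistent with the small latency impact reported for G.711.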

Relevance: 80.00%

Abstract:

The Australian Naturalistic Driving Study (ANDS), a ground-breaking study of Australian driver behaviour and performance, was officially launched on April 21st, 2015 at UNSW. The ANDS project will provide a realistic perspective on the causes of vehicle crashes and near-miss crash events, along with the roles that speeding, distraction and other factors play in such events. A total of 360 volunteer drivers across NSW and Victoria, 180 in each state, will be monitored by a Data Acquisition System (DAS) that continuously records their driving behaviour for four months using a suite of cameras and sensors. Participants’ driving behaviour (e.g. gaze), the behaviour of their vehicle (e.g. speed, lane position) and the behaviour of other road users with whom they interact in normal and safety-critical situations will be recorded. Planning of the ANDS commenced over two years ago, in June 2013, when the Multi-Institutional Agreement for a grant supporting the equipment purchase and assembly phase was signed by the parties involved in this large-scale $4 million study (5 university accident research centres, 3 government regulators, 2 third-party insurers and 2 industry partners). The programme's second development phase commenced a year later, in June 2014, after a second grant was awarded. This paper presents an insider's view of the two-year process leading up to the launch, and outlines issues that arose in the set-up phase of the study and how they were addressed. This information will be useful to other organisations considering setting up an NDS.

Relevance: 80.00%

Abstract:

In order to assess the structural reliability of bridges, an accurate and cost-effective Non-Destructive Evaluation (NDE) technology is required to ensure their safe and reliable operation. According to the Bureau of Transport and Communication Economics (1997), over 60% of the Australian National Highway System consists of prestressed concrete (PSC) bridges. Most of the in-service bridges are more than 30 years old and may experience heavier traffic loads than originally intended. The use of ultrasonic waves for NDE and Structural Health Monitoring (SHM) is continuously increasing in civil, aerospace, electrical and mechanical applications. Ultrasonic Lamb waves are becoming more popular for NDE because they can propagate long distances and reach hidden regions with low energy loss. The purpose of this study is to numerically quantify the prestress force (PSF) of a PSC beam using the fundamental theory of acoustoelasticity. A three-dimensional finite element modelling approach is set up to perform parametric studies in order to better understand how Lamb wave propagation in a PSC beam is affected by changes in the PSF level. Results from acoustoelastic measurements on a prestressed beam are presented, showing the feasibility of Lamb waves for PSF evaluation in PSC bridges.
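The acoustoelastic effect the study relies on is, to first order, a linear shift of wave velocity with stress. A minimal sketch of that relation and its inversion to estimate prestress; the coefficient K and all numbers below are hypothetical illustrations, not values from the paper:

```python
def velocity_under_stress(v0, k_acoustoelastic, stress):
    """First-order acoustoelastic relation: v = v0 * (1 + K * sigma),
    where K is a material calibration constant."""
    return v0 * (1.0 + k_acoustoelastic * stress)

def estimate_prestress(v0, v_measured, k_acoustoelastic):
    """Invert the relation above to recover sigma from a measured
    Lamb-wave phase velocity."""
    return (v_measured / v0 - 1.0) / k_acoustoelastic

# Round trip with hypothetical numbers: 3200 m/s unstressed velocity,
# K = 1e-5 per MPa, 100 MPa of applied prestress.
v = velocity_under_stress(3200.0, 1e-5, 100.0)
sigma = estimate_prestress(3200.0, v, 1e-5)   # recovers ~100 MPa
```

The practical difficulty, which motivates the paper's parametric FE study, is that the velocity shift is tiny, so K must be calibrated per mode and geometry before such an inversion is meaningful.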

Relevance: 80.00%

Abstract:

Background: To reduce nursing shortages, accelerated nursing programs are available for domestic and international students. However, the withdrawal and failure rates from these programs may differ from those of traditional programs. The main aim of our study was to improve the retention and experience of accelerated nursing students.

Methods: The academic background, age, withdrawal rates and failure rates of the accelerated and traditional students were determined. Data from 2009 and 2010 were collected prior to intervention. In an attempt to reduce the withdrawal of accelerated students, we set up an intervention, which was available to all students. The assessment of the intervention was a pre-post-test design with non-equivalent groups (the traditional and the accelerated students). The elements of the intervention were a) a formative website activity on some basic concepts in anatomy, physiology and pharmacology, b) a workshop addressing study skills and online resources, and c) resource lectures in anatomy/physiology and microbiology. The formative website and workshop were evaluated using questionnaires.

Results: The accelerated nursing students were five years older than the traditional students (p < 0.0001). The withdrawal rates from a pharmacology course were higher for accelerated nursing students than for traditional students who had undertaken first-year courses in anatomy and physiology (p = 0.04 in 2010). The withdrawing students were predominantly the domestic students with non-university qualifications or equivalent experience. The failure rates were also higher for this group, compared to the traditional students (p = 0.05 in 2009 and 0.03 in 2010). In contrast, the withdrawal rates for the international and domestic graduate accelerated students were very low. After the intervention, the withdrawal and failure rates in pharmacology for domestic accelerated students with non-university qualifications were not significantly different from those of traditional students.

Conclusions: The accelerated international and domestic graduate nursing students have low withdrawal rates and high success rates in a pharmacology course. However, domestic students with non-university qualifications have higher withdrawal and failure rates than other nursing students and may be underprepared for university study in pharmacology in nursing programs. The introduction of the intervention was associated with reduced withdrawal and failure rates for these students in the pharmacology course.
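Group comparisons of the kind reported above (withdrawal counts in small 2x2 tables) are commonly tested with Fisher's exact test. A stdlib-only sketch of the one-sided test, applied to hypothetical counts, not the study's data:

```python
from math import comb

def fisher_exact_greater(a, b, c, d):
    """One-sided Fisher exact test for the 2x2 table [[a, b], [c, d]]:
    the probability, under the hypergeometric null, of a table at least
    as extreme as observed in the direction of larger a."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    denom = comb(n, row1)
    p = 0.0
    for k in range(a, min(row1, col1) + 1):
        p += comb(col1, k) * comb(n - col1, row1 - k) / denom
    return p

# Hypothetical counts (withdrew / completed per group): 8 of 30
# accelerated students vs 5 of 90 traditional students.
p = fisher_exact_greater(8, 22, 5, 85)
```

An exact test is preferable to a chi-square approximation here because withdrawal counts per cohort are small.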

Relevance: 80.00%

Abstract:

Many existing companies have set up corporate websites in response to competitive pressures and/or the perceived advantages of having a presence in marketspace. However, the effect of this form of communication and/or way of doing business on the corporate brand has yet to be examined in detail. In this article we argue that the translation of corporate brand values from marketplace to marketspace is often problematic, leading to inconsistencies in the way that the brand values are interpreted. Some of the issues discussed are: 1) the effect of changed organizational boundaries on the corporate brand, 2) the need to examine whether it is strategically feasible to translate the corporate brand values from marketplace to marketspace, 3) the inherent difficulty in communicating the emotional aspects of the corporate brand in marketspace, and 4) the need to manage the online brand, in terms of its consistency with the offline brand. The conclusion reached is that a necessary part of the process of embracing marketspace as part of a corporate brand strategy is a plan to manage the consistency and continuity of the corporate brand when applied to the Internet. In cases where this is not achievable, a separate corporate brand or a brand extension is a preferable alternative.

Relevance: 80.00%

Abstract:

The research reported in this paper documents the use of Web2.0 applications with six Western Australian schools that are considered to be regional and/or remote. With a population of two million people within an area of 2,525,500 square kilometres, Western Australia has a number of towns that are classified as regional and remote. Each of the three education systems has set up a telecommunications network to improve learning opportunities for students and administrative services for staff through a virtual private network (VPN) with access from anywhere at any time, ultimately reducing the feeling of professional and social dislocation experienced by many teachers and students in isolated communities. Web2.0 applications, including video conferencing, offer enormous opportunities to close the digital divide within the broad directives of the Networking the Nation plan, which aims to connect all Australians regardless of where they are, closing the digital divide between city and regional living. Email and Internet facilities have greatly improved in rural, regional and remote areas, supporting everyday school use of the Internet. This study highlights the possibilities and issues for advanced telecommunications usage of Web2.0 applications, discussing the research undertaken with these schools.

Relevance: 80.00%

Abstract:

A computational model for isothermal axisymmetric turbulent flow in a quarl burner is set up using the CFD package FLUENT, and numerical solutions obtained from the model are compared with available experimental data. A standard k-ε model and two versions of the RNG k-ε model are used to model the turbulence. One of the aims of the computational study is to investigate whether the RNG-based k-ε turbulence models are capable of yielding improved flow predictions compared with the standard k-ε turbulence model. A difficulty is that the flow considered here features a confined vortex breakdown, which can be highly sensitive to flow behaviour both upstream and downstream of the breakdown zone. Nevertheless, the relatively simple confining geometry allows us to undertake a systematic study so that both grid-independent and domain-independent results can be reported. The systematic study includes a detailed investigation of the effects of upstream and downstream conditions on the predictions, in addition to grid refinement and other tests to ensure that numerical error is not significant. Another important aim is to determine to what extent the turbulence model predictions can provide us with new insights into the physics of confined vortex breakdown flows. To this end, the computations are discussed in detail with reference to known vortex breakdown phenomena and existing theories. A major conclusion is that one of the RNG k-ε models investigated here is able to correctly capture the complex forward flow region inside the recirculating breakdown zone. This apparently pathological result is in stark contrast to the findings of previous studies, most of which have concluded that either algebraic or differential Reynolds stress modelling is needed to correctly predict the observed flow features.
Arguments are given as to why an isotropic eddy-viscosity turbulence model may well be able to capture the complex flow structure within the recirculating zone for this flow setup. With regard to the flow physics, a major finding is that the results obtained here are more consistent with the view that confined vortex breakdown is a type of axisymmetric boundary layer separation, rather than a manifestation of a subcritical flow state.
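Both closures compared above are eddy-viscosity models sharing the same constitutive relation; they differ mainly in the value of C_mu and in the extra strain-dependent term of the RNG ε-equation. A sketch of the shared relation, using the widely published model constants (treated here as assumptions, not values stated in the paper):

```python
def eddy_viscosity(rho, k, epsilon, c_mu):
    """Eddy-viscosity relation shared by the standard and RNG k-epsilon
    models: mu_t = rho * C_mu * k**2 / epsilon."""
    return rho * c_mu * k ** 2 / epsilon

# Commonly published model constants (assumed for illustration):
C_MU_STANDARD = 0.09    # standard k-epsilon
C_MU_RNG = 0.0845       # RNG k-epsilon

# Same flow state, two closures: the RNG constant gives slightly less
# turbulent diffusion, one reason its predictions can differ in
# strongly strained regions such as a breakdown zone.
mu_t_std = eddy_viscosity(1.225, 0.5, 10.0, C_MU_STANDARD)
mu_t_rng = eddy_viscosity(1.225, 0.5, 10.0, C_MU_RNG)
```

The smaller effective viscosity is only part of the story; the RNG modification to the ε-equation in regions of rapid strain is what chiefly distinguishes its behaviour in swirling flows.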

Relevance: 80.00%

Abstract:

Deep packet inspection is a technology which enables the examination of the content of information packets being sent over the Internet. The Internet was originally set up using “end-to-end connectivity” as part of its design, allowing nodes of the network to send packets to all other nodes of the network, without requiring intermediate network elements to maintain status information about the transmission. In this way, the Internet was created as a “dumb” network, with “intelligent” devices (such as personal computers) at the end or “last mile” of the network. The dumb network does not interfere with an application's operation, nor is it sensitive to the needs of an application, and as such it treats all information sent over it as (more or less) equal. Yet deep packet inspection allows the examination of packets at places on the network which are not endpoints. In practice, this permits entities such as Internet service providers (ISPs) or governments to observe the content of the information being sent, and perhaps even manipulate it. Indeed, the existence and implementation of deep packet inspection may profoundly challenge the egalitarian and open character of the Internet. This paper will first elaborate on what deep packet inspection is and how it works from a technological perspective, before going on to examine how it is being used in practice by governments and corporations. Legal problems have already been created by the use of deep packet inspection, involving fundamental rights (especially of Internet users) such as freedom of expression and privacy, as well as more economic concerns such as competition and copyright. These issues will be considered, and an assessment will be made of the conformity of the use of deep packet inspection with the law.
The focus will be on the use of deep packet inspection in European and North American jurisdictions, where it has already provoked debate, particularly in the context of discussions on net neutrality. This paper will also incorporate a more fundamental assessment of the values that the Internet should respect and exhibit (such as openness, equality and neutrality), before concluding with the formulation of a legal and regulatory response to the use of this technology, in accordance with these values.
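The shallow-versus-deep distinction above can be made concrete: forwarding a packet requires reading only the IP header, whereas DPI also parses the transport header and scans the payload. A minimal, illustrative IPv4/TCP inspector (real DPI engines work on reassembled flows, not single packets, so this omits reassembly and flow state):

```python
def inspect_packet(packet: bytes, signature: bytes) -> bool:
    """Return True if the TCP payload of an IPv4 packet contains
    `signature`. A forwarding router would stop after reading the IP
    header; the payload scan at the end is the 'deep' part."""
    ihl = (packet[0] & 0x0F) * 4        # IPv4 header length in bytes
    if packet[9] != 6:                  # IP protocol field: 6 = TCP
        return False
    tcp = packet[ihl:]
    data_offset = (tcp[12] >> 4) * 4    # TCP header length in bytes
    payload = tcp[data_offset:]
    return signature in payload

# A hand-built packet: minimal 20-byte IPv4 header, 20-byte TCP header,
# then an application payload the inspector can "see".
ip_header = bytes([0x45]) + b"\x00" * 8 + bytes([6]) + b"\x00" * 10
tcp_header = b"\x00" * 12 + bytes([0x50]) + b"\x00" * 7
packet = ip_header + tcp_header + b"GET /secret HTTP/1.1"
```

Even this toy shows why DPI is legally contentious: the same few lines that match a malware signature can equally match a keyword in private correspondence.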