921 results for Bookkeeping machines.
Abstract:
This item provides supplementary materials for the paper mentioned in the title, specifically a range of organisms used in the study. The full abstract for the main paper is as follows: Next Generation Sequencing (NGS) technologies have revolutionised molecular biology, allowing clinical sequencing to become a matter of routine. NGS data sets consist of short sequence reads obtained from the machine, given context and meaning through downstream assembly and annotation. For these techniques to operate successfully, the collected reads must be consistent with the assumed species or species group, and not corrupted in some way. The common bacterium Staphylococcus aureus may cause severe and life-threatening infections in humans, with some strains exhibiting antibiotic resistance. In this paper, we apply an SVM classifier to the important problem of distinguishing S. aureus sequencing projects from alternative pathogens, including closely related Staphylococci. Using a sequence k-mer representation, we achieve precision and recall above 95%, implicating features with important functional associations.
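The k-mer representation at the heart of this abstract can be sketched in a few lines of plain Python. The function name, the normalisation, and the choice of k below are illustrative assumptions, not details taken from the paper:

```python
from collections import Counter
from itertools import product

def kmer_vector(read: str, k: int = 3) -> list:
    """Map a DNA read to a normalised k-mer frequency vector.

    Every read becomes a fixed-length (4**k) vector, so a standard
    classifier such as an SVM can be trained on a set of reads.
    """
    alphabet = "ACGT"
    kmers = ["".join(p) for p in product(alphabet, repeat=k)]
    counts = Counter(read[i:i + k] for i in range(len(read) - k + 1))
    total = sum(counts[m] for m in kmers) or 1  # avoid division by zero
    return [counts[m] / total for m in kmers]

vec = kmer_vector("ACGTACGTAC", k=2)  # 16-dimensional feature vector
```

In practice the resulting vectors would be fed to an SVM implementation (e.g. scikit-learn's `svm.SVC`); only the feature extraction is shown here.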
Abstract:
Artists: Donna Hewitt, Julian Knowles, Wade Marynowsky, Tim Bruniges, Avril Huddy. Macrophonics presents new Australian work emerging from the leading edge of performance interface research. The program addresses the emerging dialogue between traditional media and emerging digital media, as well as the dialogue across a broad range of musical traditions. Recent technological developments have brought us to a point, artistically, where the relationships between media and genres are being completely re-evaluated. This program presents a cross-section of responses to this condition. Each of the works in the program foregrounds an approach to performance that integrates sensors and novel performance control devices and/or examines how machines can be made musical in performance. Containing works for voice, electronics, video, movement and sensor-based gestural controllers, it critically surveys the interface between humans and machines in performance. From sensor-based microphones and guitars, performance a/v, to post-rock dronescapes and experimental electronica, Macrophonics provides a broad and engaging survey of new performance approaches in mediatised environments.
Abstract:
A theoretical framework for a construction management decision evaluation system for project selection is presented, based on a literature review. The theory is developed by examining the major factors concerning the project selection decision from a deterministic viewpoint, where the decision-maker is assumed to possess 'perfect knowledge' of all the aspects involved. Four fundamental project characteristics are identified, together with three meaningful outcome variables. The relationships within and between these variables are considered, together with some possible solution techniques. The theory is then extended to the time-related, dynamic aspects of the problem, leading to the implications of imperfect knowledge and a nondeterministic model. A solution technique is proposed in which Gottinger's sequential machines are utilised to model the decision process.
Abstract:
X-ray microtomography (micro-CT) with micron resolution enables new ways of characterizing microstructures and opens pathways for forward calculations of multiscale rock properties. A quantitative characterization of the microstructure is the first step in this challenge. We developed a new approach to extract scale-dependent characteristics of porosity, percolation, and anisotropic permeability from 3-D microstructural models of rocks. The Hoshen-Kopelman algorithm of percolation theory is employed for a standard percolation analysis. The anisotropy of permeability is calculated by means of the star volume distribution approach. The local porosity distribution and local percolation probability are obtained by using the local porosity theory. Additionally, the local anisotropy distribution is defined and analyzed through two empirical probability density functions, the isotropy index and the elongation index. For such a high-resolution data set, the typical data sizes of the CT images are on the order of gigabytes to tens of gigabytes; thus an extremely large number of calculations is required. To resolve this large memory problem, parallelization with OpenMP was used to optimally harness the shared-memory infrastructure on cache-coherent Non-Uniform Memory Access architecture machines such as the iVEC SGI Altix 3700Bx2 supercomputer. We see adequate visualization of the results as an important element in this first pioneering study.
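The cluster-labeling step that the Hoshen-Kopelman algorithm performs can be illustrated on a tiny 2-D binary grid. This is a minimal union-find sketch of the classic algorithm (4-connectivity, single raster scan), not the authors' parallel 3-D implementation:

```python
def hoshen_kopelman(grid):
    """Label connected pore clusters in a binary grid (1 = pore).

    Union-find formulation of the Hoshen-Kopelman algorithm:
    one raster scan assigns provisional labels, merging the labels
    of touching neighbours; a second pass flattens the labels.
    """
    rows, cols = len(grid), len(grid[0])
    labels = [[0] * cols for _ in range(rows)]
    parent = [0]  # parent[i] = representative of label i

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    next_label = 0
    for r in range(rows):
        for c in range(cols):
            if not grid[r][c]:
                continue
            up = labels[r - 1][c] if r > 0 else 0
            left = labels[r][c - 1] if c > 0 else 0
            if not up and not left:          # new cluster
                next_label += 1
                parent.append(next_label)
                labels[r][c] = next_label
            elif up and left and find(up) != find(left):
                parent[find(up)] = find(left)  # merge two clusters
                labels[r][c] = find(left)
            else:
                labels[r][c] = find(up or left)
    for r in range(rows):                    # flatten to representatives
        for c in range(cols):
            if labels[r][c]:
                labels[r][c] = find(labels[r][c])
    return labels
```

The 3-D extension adds one more neighbour direction per voxel; the percolation analysis then checks whether any cluster label spans opposite faces of the volume.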
Abstract:
Objective: To describe an effective and inexpensive CPAP-mask apparatus for use in the emergency department. Conclusion: CPAP is an effective tool in the treatment of acute pulmonary oedema in the emergency department. The mask apparatus described is an inexpensive alternative to the commercially produced machines.
Abstract:
It is common for organizations to maintain multiple variants of a given business process, such as multiple sales processes for different products or multiple bookkeeping processes for different countries. Conventional business process modeling languages do not explicitly support the representation of such families of process variants. This gap triggered significant research efforts over the past decade, leading to an array of approaches to business process variability modeling. This survey examines existing approaches in this field based on a common set of criteria and illustrates their key concepts using a running example. The analysis shows that existing approaches are characterized by the fact that they extend a conventional process modeling language with constructs that make it able to capture customizable process models. A customizable process model represents a family of process variants in such a way that each variant can be derived by adding or deleting fragments according to configuration parameters or according to a domain model. The survey reveals an abundance of customizable process modeling languages embodying a diverse set of constructs. In contrast, there is comparatively little tool support for analyzing and constructing customizable process models, as well as a scarcity of empirical evaluations of languages in the field.
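The derivation idea in this abstract — a variant obtained by deleting fragments according to configuration parameters — can be sketched with a toy customizable model. The fragment names, the `country` parameter, and the bookkeeping example below are invented for illustration:

```python
# A toy customizable process model: an ordered list of fragments,
# each guarded by a configuration condition. A concrete variant is
# derived by deleting every fragment whose guard is not satisfied.
MASTER_PROCESS = [
    ("record_invoice", lambda cfg: True),
    ("compute_vat",    lambda cfg: cfg["country"] in {"DE", "FR"}),
    ("compute_gst",    lambda cfg: cfg["country"] == "AU"),
    ("archive",        lambda cfg: True),
]

def derive_variant(cfg):
    """Derive one process variant from the customizable model."""
    return [name for name, enabled in MASTER_PROCESS if enabled(cfg)]

german_variant = derive_variant({"country": "DE"})
# keeps record_invoice and archive, deletes compute_gst
```

Real customizable-process languages attach such guards to richer model elements (gateways, subprocesses) and validate that every derivable variant remains structurally sound; this sketch shows only the configuration-driven deletion step.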
Abstract:
This thesis investigates and develops techniques for accurately detecting Internet-based Distributed Denial-of-Service (DDoS) attacks, in which an adversary harnesses the power of thousands of compromised machines to disrupt the normal operations of a Web-service provider, resulting in significant down-time and financial losses. It also develops methods to differentiate these attacks from similar-looking benign surges in web traffic known as Flash Events (FEs). Finally, the thesis addresses an intrinsic challenge in DDoS research, namely the extreme scarcity of public-domain datasets (due to legal and privacy issues), by developing techniques to realistically emulate DDoS attack and FE traffic.
Abstract:
Increases in the functionality, power and intelligence of modern engineered systems have led to complex systems with a large number of interconnected dynamic subsystems. In such machines, faults in one subsystem can cascade and affect the behavior of numerous other subsystems. This complicates traditional fault monitoring procedures because of the need to train models of every fault the monitoring system must detect and recognize. Unavoidable design defects, quality variations and different usage patterns make it infeasible to foresee all possible faults, resulting in limited diagnostic coverage that can only deal with previously anticipated and modeled failures. This leads to missed detections and costly blind swapping of acceptable components because of one's inability to accurately isolate the source of previously unseen anomalies. To circumvent these difficulties, a new paradigm for diagnostic systems is proposed and discussed in this paper. Its feasibility is demonstrated through application examples in automotive engine diagnostics.
Abstract:
Background: Standard operating procedures state that police officers should not drive while interacting with their mobile data terminal (MDT), which provides in-vehicle information essential to police work. Such interactions do, however, occur in practice and represent a potential source of driver distraction. The MDT comprises visual output with manual input via touch screen and keyboard. This study investigated the potential for alternative input and output methods to mitigate driver distraction, with specific focus on eye movements. Method: Nineteen experienced drivers of police vehicles (one female) from the NSW Police Force completed four simulated urban drives. Three drives included a concurrent secondary task: an imitation licence-plate search using an emulated MDT. Three different interface methods were examined: Visual-Manual, Visual-Voice, and Audio-Voice ("Visual" and "Audio" = output modality; "Manual" and "Voice" = input modality). During each drive, eye movements were recorded using FaceLAB™ (Seeing Machines Ltd, Canberra, ACT). Gaze direction and glances on the MDT were assessed. Results: The Visual-Voice and Visual-Manual interfaces resulted in a significantly greater number of glances towards the MDT than Audio-Voice or Baseline, and significantly more glances to the display than Audio-Voice or Baseline. For longer-duration glances (>2 s and 1-2 s), the Visual-Manual interface resulted in significantly more fixations than Baseline or Audio-Voice. Short-duration glances (<1 s) were significantly more frequent for both Visual-Voice and Visual-Manual compared with Baseline and Audio-Voice. There were no significant differences between Baseline and Audio-Voice. Conclusion: An Audio-Voice interface has the greatest potential to decrease visual distraction for police drivers. However, it is acknowledged that audio output may have limitations for information presentation compared with visual output.
The Visual-Voice interface offers an environment where the capacity to present information is sustained, whilst distraction to the driver is reduced (compared to Visual-Manual) by enabling adaptation of fixation behaviour.
Abstract:
Cloud computing is an emerging computing paradigm in which IT resources are provided over the Internet as a service to users. One such service offered through the Cloud is Software as a Service, or SaaS. SaaS can be delivered in a composite form, consisting of a set of application and data components that work together to deliver higher-level functional software. SaaS is receiving substantial attention today from both software providers and users, and analyst firms predict a positive future market for it. This raises new challenges for providers managing SaaS, especially in large-scale data centres such as the Cloud. One of these challenges is managing Cloud resources for SaaS in a way that maintains SaaS performance while optimising resource use. Extensive research on the resource optimisation of Cloud services has not yet addressed the challenges of managing resources for composite SaaS. This research addresses this gap by focusing on three new problems of composite SaaS: placement, clustering and scalability. The overall aim is to develop efficient and scalable mechanisms that facilitate the delivery of high-performance composite SaaS for users while optimising the resources used. All three problems are highly constrained, large-scale and complex combinatorial optimisation problems; therefore, evolutionary algorithms are adopted as the main technique for solving them. The first research problem concerns how a composite SaaS is placed onto Cloud servers to optimise its performance while satisfying the SaaS resource and response time constraints. Existing research on this problem often ignores the dependencies between components and considers placement of a homogeneous type of component only. A precise formulation of the composite SaaS placement problem is presented.
A classical genetic algorithm and two versions of cooperative co-evolutionary algorithms are designed to manage the placement of heterogeneous types of SaaS components together with their dependencies, requirements and constraints. Experimental results demonstrate the efficiency and scalability of these new algorithms. In the second problem, SaaS components are assumed to be already running on Cloud virtual machines (VMs); however, due to the dynamic environment of a Cloud, the current placement may need to be modified. Existing techniques have focused mostly on the infrastructure level rather than the application level. This research addresses the problem at the application level by clustering suitable components onto VMs to optimise the resources used and to maintain SaaS performance. Two versions of grouping genetic algorithms (GGAs) are designed to cater for the structural grouping of a composite SaaS: the first GGA uses a repair-based method, while the second uses a penalty-based method to handle the problem constraints. The experimental results confirm that the GGAs always produce a better reconfiguration placement plan than a common heuristic for clustering problems. The third research problem deals with the replication or deletion of SaaS instances to cope with the SaaS workload. Determining a scaling plan that minimises the resources used while maintaining SaaS performance is a critical task. Additionally, the problem involves constraints and interdependencies between components, making solutions even more difficult to find. A hybrid genetic algorithm (HGA) was developed to solve this problem, exploring the search space through its genetic operators and fitness function to determine the SaaS scaling plan. The HGA also uses the problem's domain knowledge to ensure that solutions meet the problem's constraints and achieve its objectives.
The experimental results demonstrate that the HGA consistently outperforms a heuristic algorithm by achieving a low-cost scaling and placement plan. This research has identified three significant new problems for composite SaaS in the Cloud, and various types of evolutionary algorithms have been developed to address them, contributing to the field of evolutionary computation. The algorithms provide solutions for efficient resource management of composite SaaS in the Cloud, resulting in a low total cost of ownership for users while guaranteeing SaaS performance.
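The repair-based constraint handling mentioned for the first GGA can be illustrated with a toy version: components are assigned to VMs, and any component on an over-loaded VM is moved to a VM with spare capacity. The demands and capacities below are invented, and the thesis's actual chromosome encoding and repair heuristic will differ:

```python
def repair(assignment, demand, capacity):
    """Repair an infeasible grouping chromosome.

    assignment[i] is the VM index hosting component i. Components
    on over-loaded VMs are moved to the first VM that can absorb
    them, restoring feasibility before fitness evaluation.
    """
    n_vms = len(capacity)
    load = [0.0] * n_vms
    for comp, vm in enumerate(assignment):
        load[vm] += demand[comp]
    for comp, vm in enumerate(assignment):
        if load[vm] <= capacity[vm]:
            continue  # this VM is already feasible
        for target in range(n_vms):
            if target != vm and load[target] + demand[comp] <= capacity[target]:
                load[vm] -= demand[comp]
                load[target] += demand[comp]
                assignment[comp] = target
                break
    return assignment

# Three equal components on one VM of capacity 4: one must move.
plan = repair([0, 0, 0], [2, 2, 2], [4, 4])
```

A penalty-based alternative would instead keep the infeasible chromosome and subtract a constraint-violation term from its fitness, which is the trade-off the abstract's two GGA variants explore.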
Abstract:
Electrostatic spinning or electrospinning is a fiber spinning technique driven by a high-voltage electric field that produces fibers with diameters in a submicrometer to nanometer range.1 Nanofibers are typical one-dimensional colloidal objects with an increased tensile strength, whose length can achieve a few kilometers and the specific surface area can be 100 m² g⁻¹ or higher.2 Nano- and microfibers from biocompatible polymers and biopolymers have received much attention in medical applications3 including biomedical structural elements (scaffolding used in tissue engineering,2,4–6 wound dressing,7 artificial organs and vascular grafts8), drug and vaccine delivery,9–11 protective shields in speciality fabrics, multifunctional membranes, etc. Other applications concern superhydrophobic coatings,12 encapsulation of solid materials,13 filter media for submicron particles in the separation industry, composite reinforcement and structures for nano-electronic machines.
Abstract:
Techniques to improve the automated analysis of natural and spontaneous facial expressions have been developed. The outcomes of the research have applications in several fields, including national security (e.g. expression-invariant face recognition), education (e.g. affect-aware interfaces), and mental and physical health (e.g. depression and pain recognition).
Abstract:
An ironless motor for use as a direct wheel drive is presented. The motor is intended for use in a lightweight (600 kg), low-drag, series hybrid commuter vehicle under development at The University of Queensland. The vehicle will utilise these ironless motors in each of its rear wheels, with each motor producing a peak torque output of 500 Nm and a maximum rotational speed of 1500 rpm. The axial flux motor consists of twin ironless litz-wire stators with a central magnetic ring and simplified Halbach magnet arrays on either side. A small amount of iron is used to support the outer Halbach arrays and to improve the peak magnetic flux density. Ducted air cooling is used to remove heat from the motor and will allow for a continuous torque rating of 250 Nm. Ironless machines have previously been shown to be effective in high-speed, high-frequency applications (>1000 Hz). They are generally regarded as non-optimal for low-speed applications, as iron cores allow for better magnet utilisation and do not significantly increase the weight of a machine. However, ironless machines can also be effective in applications where the average torque requirement is much lower than the peak torque requirement, such as in some vehicle drive applications. The low spinning losses of ironless machines are shown to result in very high energy throughput efficiency across a wide range of vehicle driving cycles.
Abstract:
Objective: The present paper reports on a quality improvement activity examining implementation of A Better Choice Healthy Food and Drink Supply Strategy for Queensland Health Facilities (A Better Choice). A Better Choice is a policy to increase the supply and promotion of healthy foods and drinks and decrease the supply and promotion of energy-dense, nutrient-poor choices in all food supply areas, including food outlets, staff dining rooms, vending machines, tea trolleys, coffee carts, leased premises, catering, fundraising, promotion and advertising. Design: An online survey targeted 278 facility managers to collect self-reported quantitative and qualitative data. Telephone interviews were sought concurrently with the twenty-five A Better Choice district contact officers to gather qualitative information. Setting: Public sector-owned and -operated health facilities in Queensland, Australia. Subjects: One hundred and thirty-four facility managers and twenty-four district contact officers participated, with response rates of 48.2% and 96.0%, respectively. Results: Of facility managers, 78.4% reported implementation of more than half of the A Better Choice requirements, including 24.6% who reported full strategy implementation. Reported implementation was highest in food outlets, staff dining rooms, tea trolleys, coffee carts, internal catering and drink vending machines. Implementation was more problematic in snack vending machines, external catering, leased premises and fundraising. Conclusions: Despite methodological challenges, the study suggests that policy approaches to improving the food and drink supply can be implemented successfully in public-sector health facilities, although results can be limited in some areas. A Better Choice may provide a model for improving food supply in other health and workplace settings.
Abstract:
Spatial organisation of proteins according to their function plays an important role in the specificity of their molecular interactions. Emerging proteomics methods seek to assign proteins to sub-cellular locations by partial separation of organelles and computational analysis of protein abundance distributions among partially separated fractions. Such methods permit simultaneous analysis of unpurified organelles and promise proteome-wide localisation in scenarios wherein perturbation may prompt dynamic re-distribution. Resolving organelles that display similar behavior during a protocol designed to provide partial enrichment represents a possible shortcoming. We employ the Localisation of Organelle Proteins by Isotope Tagging (LOPIT) organelle proteomics platform to demonstrate that combining information from distinct separations of the same material can improve organelle resolution and assignment of proteins to sub-cellular locations. Two previously published experiments, whose distinct gradients are alone unable to fully resolve six known protein-organelle groupings, are subjected to a rigorous analysis to assess protein-organelle association via a contemporary pattern recognition algorithm. Upon straightforward combination of single-gradient data, we observe significant improvement in protein-organelle association via both a non-linear support vector machine algorithm and partial least-squares discriminant analysis. The outcome yields suggestions for further improvements to present organelle proteomics platforms, and a robust analytical methodology via which to associate proteins with sub-cellular organelles.
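The data-combination idea in this abstract — concatenating a protein's abundance profiles from two separations before classifying it — can be sketched as follows. A nearest-centroid rule stands in for the paper's SVM and PLS-DA classifiers, and the organelle names and profile values are invented:

```python
def combine_profiles(profile_a, profile_b):
    """Concatenate a protein's abundance profiles from two gradients."""
    return profile_a + profile_b

def nearest_centroid(profile, centroids):
    """Assign a protein to the organelle with the closest centroid."""
    def dist(p, q):
        return sum((x - y) ** 2 for x, y in zip(p, q))
    return min(centroids, key=lambda org: dist(profile, centroids[org]))

# Invented example: two organelles whose profiles overlap on
# gradient A but separate once gradient B data is appended.
centroids = {
    "ER":    combine_profiles([0.5, 0.5], [0.9, 0.1]),
    "Golgi": combine_profiles([0.5, 0.5], [0.1, 0.9]),
}
protein = combine_profiles([0.5, 0.5], [0.8, 0.2])
assignment = nearest_centroid(protein, centroids)
```

On gradient A alone the two centroids are identical, so no classifier could separate them; the appended gradient B dimensions carry the discriminating signal, which is the resolution gain the combined-gradient analysis exploits.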