108 results for "stand alone operation"
in QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast
Abstract:
The use of barcode technology to capture data on pharmacists' clinical interventions is described.
Abstract:
System efficiency and cost effectiveness are of critical importance for photovoltaic (PV) systems. This paper addresses the two issues by developing a novel three-port DC-DC converter for stand-alone PV systems, based on an improved Flyback-Forward topology. It provides a compact single-unit solution combining optimized maximum power point tracking (MPPT), a high step-up ratio, galvanic isolation and multiple operating modes for domestic and aerospace applications. A theoretical analysis of the operating modes is conducted, followed by simulation and experimental work. The paper focuses on a comprehensive modulation strategy utilizing both PWM and phase-shifted control that satisfies the requirement of PV power systems to achieve MPPT and output voltage regulation. A 250 W converter was designed and prototyped to provide experimental verification in terms of system integration and high conversion efficiency.
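The MPPT requirement the abstract mentions can be illustrated with a minimal perturb-and-observe (P&O) sketch. This is a generic textbook control loop, not the paper's combined PWM/phase-shift strategy, and the PV power curve below is hypothetical.

```python
# Minimal perturb-and-observe (P&O) MPPT sketch. Illustrative only:
# the paper's actual modulation strategy is not reproduced here.

def po_mppt_step(v, p, v_prev, p_prev, step=0.1):
    """Return the next voltage reference given the current and previous
    operating points (v = panel voltage, p = panel power)."""
    if p >= p_prev:
        # Power rose: keep perturbing in the same direction.
        direction = 1.0 if v >= v_prev else -1.0
    else:
        # Power fell: reverse direction.
        direction = -1.0 if v >= v_prev else 1.0
    return v + direction * step

# Toy PV curve with its maximum-power point at v = 30 (made-up panel):
def pv_power(v):
    return max(0.0, 100.0 - (v - 30.0) ** 2)

v_prev, v = 20.0, 20.1
p_prev = pv_power(v_prev)
for _ in range(200):
    p = pv_power(v)
    v_next = po_mppt_step(v, p, v_prev, p_prev)
    v_prev, p_prev, v = v, p, v_next

print(round(v, 1))  # oscillates near the maximum-power voltage (30 V)
```

In steady state the loop dithers around the maximum-power point by one step; real controllers shrink the step or add filtering to reduce this ripple.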
Abstract:
Introduction Asthma is now one of the most common long-term conditions in the UK. It is therefore important to develop a comprehensive appreciation of the healthcare and societal costs in order to inform decisions on care provision and planning. We plan to build on our earlier estimates of national prevalence and costs from asthma by filling the data gaps previously identified in relation to healthcare and broadening the field of enquiry to include societal costs. This work will provide the first UK-wide estimates of the costs of asthma. In the context of asthma for the UK and its member countries (i.e., England, Northern Ireland, Scotland and Wales), we seek to: (1) produce a detailed overview of estimates of incidence, prevalence and healthcare utilisation; (2) estimate health and societal costs; (3) identify any remaining information gaps and explore the feasibility of filling these; and (4) provide insights into future research that has the potential to inform changes in policy leading to the provision of more cost-effective care.
Methods and analysis Secondary analyses of data from national health surveys, primary care, prescribing, emergency care, hospital, mortality and administrative data sources will be undertaken to estimate prevalence, healthcare utilisation and outcomes from asthma. Data linkages and economic modelling will be undertaken in an attempt to populate data gaps and estimate costs. Separate prevalence and cost estimates will be calculated for each of the UK-member countries and these will then be aggregated to generate UK-wide estimates.
Ethics and dissemination Approvals have been obtained from the NHS Scotland Information Services Division's Privacy Advisory Committee, the Secure Anonymised Information Linkage Collaboration Review System, the NHS South-East Scotland Research Ethics Service and The University of Edinburgh's Centre for Population Health Sciences Research Ethics Committee. We will produce a report for Asthma-UK, submit papers to peer-reviewed journals and construct an interactive map.
Abstract:
A bit-level systolic array for computing matrix-vector products is described. The operation is carried out on bit-parallel input data words and the basic circuit takes the form of a 1-bit slice. Several bit-slice components must be connected together to form the final result, and the authors outline two different ways in which this can be done. The basic array also has considerable potential as a stand-alone device, and its use in computing the Walsh-Hadamard transform and discrete Fourier transform operations is briefly discussed.
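The Walsh-Hadamard transform named above as a stand-alone application can be sketched in software with the standard butterfly recursion; this is only a functional reference, not the bit-level systolic hardware the paper describes.

```python
# Fast Walsh-Hadamard transform (natural/Hadamard ordering) via the
# standard butterfly; a software reference for the transform the
# systolic array computes in hardware.

def fwht(x):
    """Return the Walsh-Hadamard transform of x; len(x) must be a power of 2."""
    x = list(x)
    n = len(x)
    h = 1
    while h < n:
        for i in range(0, n, h * 2):
            for j in range(i, i + h):
                a, b = x[j], x[j + h]
                x[j], x[j + h] = a + b, a - b  # 2-point butterfly
        h *= 2
    return x

print(fwht([1, 0, 1, 0]))  # [2, 2, 0, 0]
```

Each stage does n/2 add-subtract butterflies, giving O(n log n) operations overall, which is exactly the regular structure that maps well onto systolic arrays.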
Abstract:
Much research over the past two decades has focussed on understanding the complex interactions of nitric oxide (NO•) in both physiological and pathological processes. As with many other aspects of NO• biology, its precise role in tumour pathophysiology has been the cause of intense debate and we now know that it participates in numerous signalling pathways that are crucial to the malignant character of cancer. The available experimental evidence highlights contrasting pro- and anti-tumour effects of NO• expression, which appear to be reconciled by consideration of the concentrations involved. This review addresses the complexities of the role of NO• in cancer, whilst evaluating various experimental approaches to NO•-based cancer therapies, including both inhibition of nitric oxide synthases, and overexpression of NO• using donor drugs or nitric oxide synthase gene transfer. The evidence provided strongly supports a role for manipulation of tumour NO• either as a stand-alone therapy or in combination with conventional treatments to achieve a significant therapeutic gain.
Abstract:
An H-file is used to convey information from the inner region to the outer region in R-matrix computations. HBrowse is a workstation tool for displaying a graphical abstraction of a local or remote R-matrix H-file. While it is published as a stand-alone tool for post-processing the output from R-matrix inner-region computations, it also forms part of the Graphical R-matrix Atomic Collision Environment (GRACE). HBrowse is written in C and OSF/Motif for the UNIX operating system. (C) 2000 Elsevier Science B.V. All rights reserved.
Abstract:
Rather than treating conservative Protestantism as a homogeneous phenomenon, recent literature has underlined the importance of disaggregating this group to illuminate important attitudinal and behavioral differences between conservative Protestants. However, the methods used to empirically operationalize conservative Protestantism have not always been able to capture variations within the groupings. Based on analysis of the 2004 Northern Ireland Life and Times Survey, we argue that religious self-identification is a more useful way of analyzing conservative Protestant subgroups than denomination or religious belief. We show that many of these identifications are overlapping, rather than stand-alone, religious group identifications. Moreover, the identification category of born-again has seldom been included in surveys. We find a born-again identification to be a better predictor than the more frequently asked fundamentalist and evangelical categories of the religious and social beliefs that are seen as indicative of conservative Protestantism.
Abstract:
Stand-alone virtual environments (VEs) using haptic devices have proved useful for assembly/disassembly simulation of mechanical components. Nowadays, collaborative haptic virtual environments (CHVEs) are also emerging. A new peer-to-peer collaborative haptic assembly simulator (CHAS) has been developed whereby two users can simultaneously carry out assembly tasks using haptic devices. Two major challenges have been addressed: virtual scene synchronization (consistency) and the provision of a reliable and effective haptic feedback. A consistency-maintenance scheme has been designed to solve the challenge of achieving consistency. Results show that consistency is guaranteed. Furthermore, a force-smoothing algorithm has been developed which is shown to improve the quality of force feedback under adverse network conditions. A range of laboratory experiments and several real trials between Labein (Spain) and Queen’s University Belfast (Northern Ireland) have verified that CHAS can provide an adequate haptic interaction when both users perform remote assemblies (assembly of one user’s object with an object grasped by the other user). Moreover, when collisions between grasped objects occur (dependent collisions), the haptic feedback usually provides satisfactory haptic perception. Based on a qualitative study, it is shown that the haptic feedback obtained during remote assemblies with dependent collisions can continue to improve the sense of co-presence between users with regard to only visual feedback.
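The abstract reports a force-smoothing algorithm for haptic feedback under adverse network conditions but does not specify it. A first-order exponential filter is one common way to damp force spikes caused by network jitter; the sketch below is illustrative only, not CHAS's actual method, and the sample values are made up.

```python
# Hedged sketch: exponential smoothing of a haptic force stream, a simple
# stand-in for the (unspecified) force-smoothing algorithm in the abstract.

def smooth_forces(samples, alpha=0.3):
    """Exponentially smooth a stream of force samples.
    alpha in (0, 1]: smaller values give heavier smoothing."""
    out = []
    prev = None
    for f in samples:
        # First sample passes through; later samples are blended with history.
        prev = f if prev is None else alpha * f + (1 - alpha) * prev
        out.append(prev)
    return out

# A delayed packet causing a transient force spike gets damped:
raw = [1.0, 1.0, 8.0, 1.0, 1.0]
print([round(f, 2) for f in smooth_forces(raw)])
```

The trade-off is latency: heavier smoothing (smaller alpha) hides spikes but makes contact forces feel sluggish, which is why such filters are tuned per device and network condition.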
Abstract:
Multicore computational accelerators such as GPUs are now commodity components for high-performance computing at scale. While such accelerators have been studied in some detail as stand-alone computational engines, their integration in large-scale distributed systems raises new challenges and trade-offs. In this paper, we present an exploration of resource management alternatives for building asymmetric accelerator-based distributed systems. We present these alternatives in the context of a capabilities-aware framework for data-intensive computing, which uses an implementation of the MapReduce programming model enhanced for accelerator-based clusters compared with the state of the art. The framework can transparently utilize heterogeneous accelerators to derive high performance with low programming effort. Our work is the first to compare heterogeneous types of accelerators, GPUs and Cell processors, in the same environment, and the first to explore the trade-offs between compute-efficient and control-efficient accelerators on data-intensive systems. Our investigation shows that our framework scales well with the number of compute nodes. Furthermore, it runs simultaneously on two different types of accelerators, successfully adapts to the resource capabilities, and performs 26.9% better on average than a static execution approach.
Abstract:
A methodology which allows a non-specialist to rapidly design silicon wavelet transform cores has been developed. This methodology is based on a generic architecture utilizing time-interleaved coefficients for the wavelet transform filters. The architecture is scalable and has been parameterized in terms of wavelet family, wavelet type, data word length and coefficient word length. The control circuit is designed in such a way that the cores can also be cascaded without any interface glue logic for any desired level of decomposition. This parameterization allows the use of any orthonormal wavelet family, thereby extending the design space for improved transformation from algorithm to silicon. Case studies of stand-alone and cascaded silicon cores, for single-stage and multi-stage analysis respectively, are reported. The typical design time to produce silicon layout of a wavelet-based system has been reduced by an order of magnitude. The cores are comparable in area and performance to hand-crafted designs. The designs have been captured in VHDL so they are portable across a range of foundries and are also applicable to FPGA and PLD implementations.
Abstract:
A rapid design methodology for orthonormal wavelet transform cores has been developed. This methodology is based on a generic, scalable architecture utilising time-interleaved coefficients for the wavelet transform filters. The architecture has been captured in VHDL and parameterised in terms of wavelet family, wavelet type, data word length and coefficient word length. The control circuit is embedded within the cores and allows them to be cascaded without any interface glue logic for any desired level of decomposition. Case studies of stand-alone and cascaded silicon cores, for single-stage and multi-stage wavelet analysis respectively, are reported. The design time to produce silicon layout of a wavelet-based system has been reduced to typically less than a day. The cores are comparable in area and performance to handcrafted designs. The designs are portable across a range of foundries and are also applicable to FPGA and PLD implementations.
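A single stage of the analysis the cores perform can be shown in software with the Haar family, the simplest orthonormal wavelet; this is a functional reference only, not the VHDL time-interleaved architecture itself.

```python
# One level of orthonormal wavelet analysis, Haar family: a software
# reference for the filter stage the silicon cores implement.
import math

def haar_analysis(x):
    """Single-level Haar DWT: returns (approximation, detail) coefficients.
    len(x) must be even."""
    s = 1.0 / math.sqrt(2.0)  # orthonormal scaling
    approx = [(x[i] + x[i + 1]) * s for i in range(0, len(x), 2)]
    detail = [(x[i] - x[i + 1]) * s for i in range(0, len(x), 2)]
    return approx, detail

a, d = haar_analysis([4.0, 4.0, 2.0, 0.0])
print(a, d)  # approx ~ [5.657, 1.414], detail ~ [0.0, 1.414]
```

Cascading, which the cores support without glue logic, corresponds to feeding `approx` back into `haar_analysis` for each further level of decomposition.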
Abstract:
This paper presents a novel hand-held instrument capable of real-time in situ detection and identification of heavy metals. The proposed system provides the facilities found in a traditional lab-based instrument in a hand-held design. In contrast to existing commercial systems, it can stand alone without the need for an associated computer. The electrochemical instrument uses anodic stripping voltammetry, a precise and sensitive analytical method with excellent limits of detection. The sensors comprise disposable screen-printed (solid working) electrodes rather than the more common hanging mercury drop electrodes. The system is reliable, easy to use, safe, avoids expensive and time-consuming procedures and may be used in a variety of situations to help in the fields of environmental assessment and control.
Abstract:
IMPORTANCE Systematic reviews and meta-analyses of individual participant data (IPD) aim to collect, check, and reanalyze individual-level data from all studies addressing a particular research question and are therefore considered a gold standard approach to evidence synthesis. They are likely to be used with increasing frequency as current initiatives to share clinical trial data gain momentum and may be particularly important in reviewing controversial therapeutic areas.
OBJECTIVE To develop PRISMA-IPD as a stand-alone extension to the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) Statement, tailored to the specific requirements of reporting systematic reviews and meta-analyses of IPD. Although developed primarily for reviews of randomized trials, many items will apply in other contexts, including reviews of diagnosis and prognosis.
DESIGN Development of PRISMA-IPD followed the EQUATOR Network framework guidance and used the existing standard PRISMA Statement as a starting point to draft additional relevant material. A web-based survey informed discussion at an international workshop that included researchers, clinicians, methodologists experienced in conducting systematic reviews and meta-analyses of IPD, and journal editors. The statement was drafted and iterative refinements were made by the project, advisory, and development groups. The PRISMA-IPD Development Group reached agreement on the PRISMA-IPD checklist and flow diagram by consensus.
FINDINGS Compared with standard PRISMA, the PRISMA-IPD checklist includes 3 new items that address (1) methods of checking the integrity of the IPD (such as pattern of randomization, data consistency, baseline imbalance, and missing data), (2) reporting any important issues that emerge, and (3) exploring variation (such as whether certain types of individual benefit more from the intervention than others). A further additional item was created by reorganization of standard PRISMA items relating to interpreting results. Wording was modified in 23 items to reflect the IPD approach.
CONCLUSIONS AND RELEVANCE PRISMA-IPD provides guidelines for reporting systematic reviews and meta-analyses of IPD.
Abstract:
Like any new technology, tidal power converters are being assessed for potential environmental impacts. As with wind power, where noise emissions have led to some regulations and limitations on consented installation sites, noise emissions of these new tidal devices attract considerable attention, especially due to the possible interaction with marine fauna. However, the effect of turbine noise cannot be assessed as a stand-alone issue, but must be investigated in the context of the natural background noise in high-flow environments. Noise measurements are also believed to be a useful tool for monitoring the operating conditions and health of equipment. While underwater noise measurements are not trivial to perform, this non-intrusive monitoring method could prove to be very cost effective. This paper presents sound measurements performed on the SCHOTTEL Instream Turbine as part of the MaRINET testing campaign at the QUB tidal test site in Portaferry during the summer of 2014. It compares the turbine noise emissions with the normal background noise at the test site and presents possible applications as a monitoring system.
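The core comparison described above (turbine noise against background noise) reduces to comparing sound pressure levels computed from hydrophone pressure time series. The sketch below uses the standard underwater reference pressure of 1 µPa; the sample values are made up and do not come from the measurement campaign.

```python
# Sound pressure level comparison sketch. Underwater acoustics uses a
# 1 micropascal reference pressure; the sample data below are hypothetical.
import math

def spl_db(pressure_samples, p_ref=1e-6):
    """RMS sound pressure level in dB re 1 uPa (pressures in Pa)."""
    rms = math.sqrt(sum(p * p for p in pressure_samples) / len(pressure_samples))
    return 20.0 * math.log10(rms / p_ref)

background = [0.05, -0.04, 0.06, -0.05]  # hypothetical ambient pressures (Pa)
turbine = [0.5, -0.4, 0.6, -0.5]         # hypothetical turbine-on pressures (Pa)
print(round(spl_db(turbine) - spl_db(background), 1))  # 20.0 dB difference
```

In practice such comparisons are done per frequency band and referenced to the strong, flow-dependent background noise of a tidal site, which is why the paper stresses context over stand-alone turbine levels.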