11 results for time units

at QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast


Relevance:

60.00%

Publisher:

Abstract:

The macrosystem refers to the overarching patterns that influence behavior at each level of the social ecology (Bronfenbrenner, 1977), making it a necessary component for assessing human development in contexts of political violence. This article proposes a method for systematically measuring the macrosystem in Northern Ireland that allows for a subnational analysis, multiple time units, and indicators of both low-level violence and positive relations. Articles were randomly chosen for each weekday in 2006-2011 from two prominent Northern Irish newspapers and coded according to their reflection of positive relations and political tensions between Catholics and Protestants. The newspaper data were then compared to existing macro-level measurements in Northern Ireland. We found that the newspaper data provided a more nuanced understanding of fluctuations in intergroup relations than the corresponding measures. This has practical implications for peacebuilding and advances our methods for assessing the impact of macro-level processes on individual development.

Relevance:

30.00%

Publisher:

Abstract:

A method for simulation of acoustical bores, useful in the context of sound synthesis by physical modeling of woodwind instruments, is presented. As with previously developed methods, such as digital waveguide modeling (DWM) [Smith, Comput. Music J. 16, pp 74-91 (1992)] and the multi convolution algorithm (MCA) [Martinez et al., J. Acoust. Soc. Am. 84, pp 1620-1627 (1988)], the approach is based on a one-dimensional model of wave propagation in the bore. Both the DWM method and the MCA explicitly compute the transmission and reflection of wave variables that represent actual traveling pressure waves. The method presented in this report, the wave digital modeling (WDM) method, avoids the typical limitations associated with these methods by using a more general definition of the wave variables. An efficient and spatially modular discrete-time model is constructed from the digital representations of elemental bore units such as cylindrical sections, conical sections, and toneholes. Frequency-dependent phenomena, such as boundary losses, are approximated with digital filters. The stability of a simulation of a complete acoustic bore is investigated empirically. Results of the simulation of a full clarinet show that a very good concordance with classic transmission-line theory is obtained.
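The travelling-wave idea underlying these bore models can be sketched with a toy bidirectional delay line. This is a minimal illustration of the DWM-style approach, not the WDM method of the paper; the section length, reflection coefficients, and function name are illustrative assumptions:

```python
def simulate_bore(length_samples, steps, r_closed=1.0, r_open=-0.95):
    """One cylindrical section as a bidirectional delay line; p_plus travels
    toward the open end, p_minus back toward the closed (mouthpiece) end."""
    p_plus = [0.0] * length_samples
    p_minus = [0.0] * length_samples
    p_plus[0] = 1.0                    # inject a unit pressure impulse
    mouth = []
    for _ in range(steps):
        out_end = p_plus[-1]           # wave arriving at the open end
        in_end = p_minus[0]            # wave arriving back at the mouthpiece
        p_plus = [r_closed * in_end] + p_plus[:-1]    # shift right + reflect
        p_minus = p_minus[1:] + [r_open * out_end]    # shift left + reflect
        mouth.append(p_plus[0] + p_minus[0])
    return mouth

p = simulate_bore(length_samples=10, steps=40)
```

The impulse reappears at the mouthpiece, inverted and attenuated, after one round trip through the delay line; frequency-dependent losses would replace the scalar `r_open` with a digital filter, as the abstract notes.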

Relevance:

30.00%

Publisher:

Abstract:

This paper proposes a new hierarchical learning structure, namely the holistic triple learning (HTL), for extending the binary support vector machine (SVM) to multi-classification problems. For an N-class problem, an HTL constructs a decision tree up to a certain depth. A leaf node of the decision tree is allowed to be placed with a holistic triple learning unit whose generalisation abilities are assessed and approved. Meanwhile, the remaining nodes in the decision tree each accommodate a standard binary SVM classifier. The holistic triple classifier is a regression model trained on three classes, whose training algorithm originates from a recently proposed implementation technique, namely the least-squares support vector machine (LS-SVM). A major novelty of the holistic triple classifier is the reduced number of support vectors in the solution. For the resultant HTL-SVM, an upper bound of the generalisation error can be obtained. The time complexity of training the HTL-SVM is analysed, and is shown to be comparable to that of training the one-versus-one (1-vs-1) SVM, particularly on small-scale datasets. Empirical studies show that the proposed HTL-SVM achieves competitive classification accuracy with a reduced number of support vectors compared to the popular 1-vs-1 alternative.
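The LS-SVM training step the abstract refers to reduces to solving one linear system in the bias and the dual variables. Below is a generic binary LS-SVM sketch with a linear kernel; the dataset, the value of gamma, and the helper names are illustrative assumptions, and this is the standard LS-SVM rather than the HTL triple construction itself:

```python
def solve(A, b):
    """Plain Gauss-Jordan elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col and M[r][col]:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * c for a, c in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

def lssvm_train(X, y, gamma=10.0):
    """Solve [[0, y^T], [y, Omega + I/gamma]] [b; alpha] = [0; 1...1]."""
    n = len(X)
    k = lambda a, c: sum(u * v for u, v in zip(a, c))        # linear kernel
    A = [[0.0] * (n + 1) for _ in range(n + 1)]
    rhs = [0.0] + [1.0] * n
    for i in range(n):
        A[0][i + 1] = A[i + 1][0] = float(y[i])
        for j in range(n):
            A[i + 1][j + 1] = y[i] * y[j] * k(X[i], X[j])
        A[i + 1][i + 1] += 1.0 / gamma                       # ridge term
    sol = solve(A, rhs)
    return sol[0], sol[1:]                                   # b, alpha

def lssvm_predict(X, y, b, alpha, x):
    k = lambda a, c: sum(u * v for u, v in zip(a, c))
    s = b + sum(alpha[i] * y[i] * k(X[i], x) for i in range(len(X)))
    return 1 if s >= 0 else -1

# Two toy clusters, labels -1 and +1
X = [[0.0, 0.0], [0.0, 1.0], [3.0, 3.0], [3.0, 4.0]]
y = [-1, -1, 1, 1]
b, alpha = lssvm_train(X, y)
```

Note that plain LS-SVM keeps every training point as a support vector (all alpha are typically nonzero), which is exactly the sparsity problem the holistic triple classifier is said to mitigate.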

Relevance:

30.00%

Publisher:

Abstract:

The majority of previous research on social capital and health is limited to social capital in residential neighborhoods and communities. Using data from the Finnish 10-Town study we examined social capital at work as a predictor of health in a cohort of 9524 initially healthy local government employees in 1522 work units, who did not change their work unit between 2000 and 2004 and responded to surveys measuring social capital at work and health at both time-points. We used a validated tool to measure social capital with perceptions at the individual level and with co-workers' responses at the work unit level. According to multilevel modeling, a contextual effect of work unit social capital on self-rated health was not accounted for by the individual's socio-demographic characteristics or lifestyle. The odds for health impairment were 1.27 times higher for employees who constantly worked in units with low social capital than for those with constantly high work unit social capital. Corresponding odds ratios for low and declining individual-level social capital varied between 1.56 and 1.78. Increasing levels of individual social capital were associated with sustained good health. In conclusion, this longitudinal multilevel study provides support for the hypothesis that exposure to low social capital at work may be detrimental to the health of employees.

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we propose a multi-camera application capable of processing high-resolution images and extracting features based on color patterns on graphics processing units (GPUs). The goal is to work in real time under the uncontrolled environment of a sport event such as a football match. Since football players present diverse and complex color patterns, a Gaussian Mixture Model (GMM) is applied as the segmentation paradigm, in order to analyze live sport images and video. Optimization techniques have also been applied to the C++ implementation using profiling tools focused on high performance. Time-consuming tasks were implemented on NVIDIA's CUDA platform, and later restructured and enhanced, speeding up the whole process significantly. The resulting code is around 4-11 times faster on a low-cost GPU than a highly optimized C++ version on a central processing unit (CPU) over the same data. Real-time performance has been achieved, processing up to 64 frames per second. An important conclusion derived from our study is the scalability of the application with the number of cores on the GPU.
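GMM-based colour segmentation of the kind described can be sketched per pixel as an assignment to the most responsible mixture component. The two components and all their parameters below are hand-picked illustrative values, not models learned from match footage:

```python
import math

def gaussian_pdf(x, mean, var):
    """Diagonal-covariance Gaussian density for an RGB triple."""
    p = 1.0
    for xi, mi, vi in zip(x, mean, var):
        p *= math.exp(-((xi - mi) ** 2) / (2 * vi)) / math.sqrt(2 * math.pi * vi)
    return p

def segment(pixel, components):
    """Return the index of the most responsible component (weight * likelihood)."""
    scores = [w * gaussian_pdf(pixel, m, v) for (w, m, v) in components]
    return max(range(len(scores)), key=lambda i: scores[i])

# Hypothetical two-component mixture: red kit vs. green pitch.
components = [
    (0.4, (200.0, 30.0, 30.0), (400.0, 400.0, 400.0)),  # component 0: red kit
    (0.6, (40.0, 140.0, 40.0), (400.0, 400.0, 400.0)),  # component 1: grass
]
```

On a GPU this per-pixel decision is embarrassingly parallel, which is why the CUDA port described in the abstract scales with the number of cores.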

Relevance:

30.00%

Publisher:

Abstract:

This paper investigates sub-integer implementations of the adaptive Gaussian mixture model (GMM) for background/foreground segmentation to allow the deployment of the method on low cost/low power processors that lack a Floating Point Unit (FPU). We propose two novel integer computer arithmetic techniques to update Gaussian parameters. Specifically, the mean value and the variance of each Gaussian are updated by a redefined and generalised "round" operation that emulates the original updating rules for a large set of learning rates. Weights are represented by counters that are updated following stochastic rules to allow a wider range of learning rates, and the weight trend is approximated by a line or a staircase. We demonstrate that the memory footprint and computational cost of GMM are significantly reduced, without significantly affecting the performance of background/foreground segmentation.
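One way the floating-point update mu ← mu + alpha·(x − mu) can be emulated with integer-only arithmetic and a generalised round is sketched below. The specific rounding rule (round half away from zero, with a minimum step of one unit toward the sample) is an illustrative assumption, not the paper's exact operator:

```python
def int_round_update(mu, x, alpha_num, alpha_den):
    """Move integer mean mu toward sample x with learning rate num/den,
    using an integer round-half-away-from-zero of alpha * (x - mu)."""
    diff = x - mu
    mag = (2 * alpha_num * abs(diff) + alpha_den) // (2 * alpha_den)
    step = mag if diff >= 0 else -mag
    if step == 0 and diff != 0:
        step = 1 if diff > 0 else -1   # generalised part: never stall
    return mu + step

mu = 100
for sample in [120, 120, 120, 120]:
    mu = int_round_update(mu, sample, 1, 16)   # alpha = 1/16
```

Everything stays in integer registers, so the update runs on FPU-less cores; the minimum-step rule keeps the estimate adapting even when alpha·diff rounds to zero.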

Relevance:

30.00%

Publisher:

Abstract:

Hardware designers and engineers typically need to explore a multi-parametric design space in order to find the best configuration for their designs, using simulations that can take weeks to months to complete. For example, designers of special purpose chips need to explore parameters such as the optimal bitwidth and data representation. This is the case for the development of complex algorithms such as Low-Density Parity-Check (LDPC) decoders used in modern communication systems. Currently, high-performance computing offers a wide set of acceleration options that range from multicore CPUs to graphics processing units (GPUs) and FPGAs. Depending on the simulation requirements, the ideal architecture to use can vary. In this paper we propose a new design flow based on OpenCL, a unified multiplatform programming model, which accelerates LDPC decoding simulations, thereby significantly reducing architectural exploration and design time. OpenCL-based parallel kernels are used without modifications or code tuning on multicore CPUs, GPUs and FPGAs. We use SOpenCL (Silicon to OpenCL), a tool that automatically converts OpenCL kernels to RTL, for mapping the simulations onto FPGAs. To the best of our knowledge, this is the first time that a single, unmodified OpenCL code is used to target those three different platforms. We show that, depending on the design parameters to be explored in the simulation and on the dimension and phase of the design, either the GPU or the FPGA may be the more convenient choice, providing different acceleration factors. For example, although simulations can typically execute more than 3x faster on FPGAs than on GPUs, the overhead of circuit synthesis often outweighs the benefits of FPGA-accelerated execution.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a framework for a telecommunications interface which allows data from sensors embedded in Smart Grid applications to be reliably archived in an appropriate time-series database. The challenge in doing so is twofold: first, the various formats in which sensor data are represented; second, the problems of telecoms reliability. A prototype of the authors' framework is detailed which showcases the main features of the framework in a case study featuring Phasor Measurement Units (PMUs) as the application. Useful analysis of PMU data is achieved whenever data from multiple locations can be compared on a common time axis. The prototype developed highlights its reliability, extensibility and adoptability; features which are largely deferred from industry standards for data representation to proprietary database solutions. The open source framework presented provides link reliability for any type of Smart Grid sensor and is interoperable with both existing proprietary database systems and open database systems. The features of the authors' framework allow researchers and developers to focus on the core of their real-time or historical analysis applications, rather than having to spend time interfacing with complex protocols.

Relevance:

30.00%

Publisher:

Abstract:

WHIRLBOB, also known as STRIBOBr2, is an AEAD (Authenticated Encryption with Associated Data) algorithm derived from STRIBOBr1 and the Whirlpool hash algorithm. WHIRLBOB/STRIBOBr2 is a second round candidate in the CAESAR competition. As with STRIBOBr1, the reduced-size Sponge design has a strong provable security link with a standardized hash algorithm. The new design utilizes only the LPS or ρ component of Whirlpool in flexibly domain-separated BLNK Sponge mode. The number of rounds is increased from 10 to 12 as a countermeasure against Rebound Distinguishing attacks. The 8×8-bit S-Box used by Whirlpool and WHIRLBOB is constructed from 4×4-bit "MiniBoxes". We report on fast constant-time Intel SSSE3 and ARM NEON SIMD WHIRLBOB implementations that keep full miniboxes in registers and access them via SIMD shuffles. This is an efficient countermeasure against AES-style cache timing side-channel attacks. Another main advantage of WHIRLBOB over STRIBOBr1 (and most other AEADs) is its greatly reduced implementation footprint on lightweight platforms. On many lower-end microcontrollers the total software footprint of π+BLNK = WHIRLBOB AEAD is less than half a kilobyte. We also report an FPGA implementation that requires 4,946 logic units for a single round of WHIRLBOB, which compares favorably to 7,972 required for Keccak / Keyak on the same target platform. The relatively small S-Box gate count also enables efficient 64-bit bitsliced straight-line implementations. We finally present some discussion and analysis on the relationships between WHIRLBOB, Whirlpool, the Russian GOST Streebog hash, and the recent draft Russian Encryption Standard Kuznyechik.
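The minibox idea, building one 8-bit substitution table out of small 4-bit tables, can be sketched generically. The two 4-bit permutations and the mixing layer below are arbitrary illustrative choices, not the real Whirlpool miniboxes or its exact network; the point is that the composite map stays a permutation of all 256 byte values while only 4-bit tables need to be stored (or held in SIMD registers):

```python
E = [0x1, 0xB, 0x9, 0xC, 0xD, 0x6, 0xF, 0x3,
     0xE, 0x8, 0x7, 0x4, 0xA, 0x2, 0x5, 0x0]   # hypothetical 4-bit permutation
R = [0x7, 0xC, 0xB, 0xD, 0xE, 0x4, 0x9, 0xF,
     0x6, 0x3, 0x8, 0xA, 0x2, 0x5, 0x1, 0x0]   # hypothetical 4-bit permutation

def sbox8(x):
    """Combine 4-bit minibox tables into one 8-bit substitution."""
    hi, lo = x >> 4, x & 0xF
    hi, lo = E[hi], E[lo]
    t = R[hi ^ lo]            # mixing layer couples the two nibbles
    return (E[hi ^ t] << 4) | E[lo ^ t]

SBOX = [sbox8(x) for x in range(256)]
```

In the constant-time SIMD implementations the abstract describes, each 16-entry table fits in one vector register and is queried with a byte shuffle instead of a memory load, which removes the cache-timing side channel.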

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Care of critically ill patients in intensive care units (ICUs) often requires potentially invasive or uncomfortable procedures, such as mechanical ventilation (MV). Sedation can alleviate pain and discomfort, provide protection from stressful or harmful events, prevent anxiety and promote sleep. Various sedative agents are available for use in ICUs. In the UK, the most commonly used sedatives are propofol (Diprivan(®), AstraZeneca), benzodiazepines [e.g. midazolam (Hypnovel(®), Roche) and lorazepam (Ativan(®), Pfizer)] and alpha-2 adrenergic receptor agonists [e.g. dexmedetomidine (Dexdor(®), Orion Corporation) and clonidine (Catapres(®), Boehringer Ingelheim)]. Sedative agents vary in onset/duration of effects and in their side effects. The pattern of sedation of alpha-2 agonists is quite different from that of other sedatives in that patients can be aroused readily and their cognitive performance on psychometric tests is usually preserved. Moreover, respiratory depression is less frequent after alpha-2 agonists than after other sedative agents.

OBJECTIVES: To conduct a systematic review to evaluate the comparative effects of alpha-2 agonists (dexmedetomidine and clonidine) and propofol or benzodiazepines (midazolam and lorazepam) in mechanically ventilated adults admitted to ICUs.

DATA SOURCES: We searched major electronic databases (e.g. MEDLINE without revisions, MEDLINE In-Process & Other Non-Indexed Citations, EMBASE and Cochrane Central Register of Controlled Trials) from 1999 to 2014.

METHODS: Evidence was considered from randomised controlled trials (RCTs) comparing dexmedetomidine with clonidine or dexmedetomidine or clonidine with propofol or benzodiazepines such as midazolam, lorazepam and diazepam (Diazemuls(®), Actavis UK Limited). Primary outcomes included mortality, duration of MV, length of ICU stay and adverse events. One reviewer extracted data and assessed the risk of bias of included trials. A second reviewer cross-checked all the data extracted. Random-effects meta-analyses were used for data synthesis.

RESULTS: Eighteen RCTs (2489 adult patients) were included. One trial at unclear risk of bias compared dexmedetomidine with clonidine and found that target sedation was achieved in a higher number of patients treated with dexmedetomidine with lesser need for additional sedation. The remaining 17 trials compared dexmedetomidine with propofol or benzodiazepines (midazolam or lorazepam). Trials varied considerably with regard to clinical population, type of comparators, dose of sedative agents, outcome measures and length of follow-up. Overall, risk of bias was generally high or unclear. In particular, few trials blinded outcome assessors. Compared with propofol or benzodiazepines (midazolam or lorazepam), dexmedetomidine had no significant effects on mortality [risk ratio (RR) 1.03, 95% confidence interval (CI) 0.85 to 1.24, I² = 0%; p = 0.78]. Length of ICU stay (mean difference -1.26 days, 95% CI -1.96 to -0.55 days, I² = 31%; p = 0.0004) and time to extubation (mean difference -1.85 days, 95% CI -2.61 to -1.09 days, I² = 0%; p < 0.00001) were significantly shorter among patients who received dexmedetomidine. No difference in time to target sedation range was observed between sedative interventions (I² = 0%; p = 0.14). Dexmedetomidine was associated with a higher risk of bradycardia (RR 1.88, 95% CI 1.28 to 2.77, I² = 46%; p = 0.001).
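The random-effects pooling used in meta-analyses of this kind can be sketched with the DerSimonian-Laird estimator on log risk ratios. The three studies' effect sizes and variances below are hypothetical numbers, not data from the review:

```python
import math

def dersimonian_laird(log_rr, var):
    """Pool effect sizes with the DL between-study variance tau^2."""
    w = [1.0 / v for v in var]                       # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, log_rr)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, log_rr))
    df = len(log_rr) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                    # between-study variance
    w_re = [1.0 / (v + tau2) for v in var]           # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, log_rr)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, se

# Hypothetical log risk ratios and within-study variances for three trials
pooled, se = dersimonian_laird([0.10, -0.05, 0.02], [0.04, 0.09, 0.02])
rr = math.exp(pooled)
ci = (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))
```

The heterogeneity statistic I² reported above is derived from the same Q as I² = max(0, (Q − df)/Q); when Q ≤ df, tau² and I² are both zero and the random-effects pool coincides with the fixed-effect one.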

LIMITATIONS: Trials varied considerably with regard to participants, type of comparators, dose of sedative agents, outcome measures and length of follow-up. Overall, risk of bias was generally high or unclear. In particular, few trials blinded assessors.

CONCLUSIONS: Evidence on the use of clonidine in ICUs is very limited. Dexmedetomidine may be effective in reducing ICU length of stay and time to extubation in critically ill ICU patients. Risk of bradycardia but not of overall mortality is higher among patients treated with dexmedetomidine. Well-designed RCTs are needed to assess the use of clonidine in ICUs and identify subgroups of patients that are more likely to benefit from the use of dexmedetomidine.

STUDY REGISTRATION: This study is registered as PROSPERO CRD42014014101.

FUNDING: The National Institute for Health Research Health Technology Assessment programme. The Health Services Research Unit is core funded by the Chief Scientist Office of the Scottish Government Health and Social Care Directorates.