908 results for random number generation


Relevance:

90.00%

Publisher:

Abstract:

In the field of embedded systems design, coprocessors play an important role as a component to increase performance. Many embedded systems are built around a small General Purpose Processor (GPP). If the GPP cannot meet the performance requirements for a certain operation, a coprocessor can be included in the design. The GPP can then offload the computationally intensive operation to the coprocessor, thus increasing the performance of the overall system. A common application of coprocessors is the acceleration of cryptographic algorithms. The work presented in this thesis discusses coprocessor architectures for various cryptographic algorithms that are found in many cryptographic protocols. Their performance is then analysed on a Field Programmable Gate Array (FPGA) platform. Firstly, the acceleration of Elliptic Curve Cryptography (ECC) algorithms is investigated through instruction set extension of a GPP. The performance of these algorithms in a full hardware implementation is then investigated, and an architecture for the acceleration of the ECC-based digital signature algorithm is developed. Hash functions are also an important component of a cryptographic system. FPGA implementations of recent hash function designs from the SHA-3 competition are discussed, and a fair comparison methodology for hash functions is presented. Many cryptographic protocols involve the generation of random data, for keys or nonces. This requires a True Random Number Generator (TRNG) to be present in the system. Various TRNG designs are discussed and a secure implementation, including post-processing and failure detection, is introduced. Finally, a coprocessor for the acceleration of operations at the protocol level is discussed; a novel aspect of the design is the secure method in which private-key data is handled.
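The post-processing step mentioned for the TRNG can be illustrated with the classic von Neumann corrector (a minimal sketch; the thesis does not state that this particular scheme is the one implemented):

```python
def von_neumann_correct(bits):
    """Von Neumann corrector: examine non-overlapping bit pairs and
    emit 0 for (0,1), 1 for (1,0); discard (0,0) and (1,1).
    This removes bias from independent but biased raw TRNG bits,
    at the cost of a variable output rate."""
    out = []
    for a, b in zip(bits[0::2], bits[1::2]):
        if a != b:
            out.append(a)  # (0,1) -> 0, (1,0) -> 1
    return out

print(von_neumann_correct([0, 1, 1, 0, 0, 0, 1, 1]))  # -> [0, 1]
```

In hardware such a corrector is a small comparator plus a valid flag, which is one reason it appears in many TRNG designs alongside health/failure tests.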

Relevance:

90.00%

Publisher:

Abstract:

Background: Skeletal muscle wasting and weakness are significant complications of critical illness, associated with the degree of illness severity and periods of reduced mobility during mechanical ventilation. They contribute to the profound physical and functional deficits observed in survivors. These impairments may persist for many years following discharge from the intensive care unit (ICU) and may markedly influence health-related quality of life. Rehabilitation is a key strategy in the recovery of patients following critical illness. Exercise-based interventions are aimed at targeting this muscle wasting and weakness. Physical rehabilitation delivered during ICU admission has been systematically evaluated and shown to be beneficial. However, its effectiveness when initiated after ICU discharge has yet to be established. Objectives: To assess the effectiveness of exercise rehabilitation programmes, initiated after ICU discharge, on functional exercise capacity and health-related quality of life in adult ICU survivors who have been mechanically ventilated for more than 24 hours. Search methods: We searched the following databases to 15 May 2014: the Cochrane Central Register of Controlled Trials (CENTRAL) (The Cochrane Library), OvidSP MEDLINE, OvidSP EMBASE, and CINAHL via EBSCOhost. We used a specific search strategy for each database, including synonyms for ICU and critical illness, exercise training and rehabilitation. We searched the reference lists of included studies and contacted primary authors to obtain further information regarding potentially eligible studies. We also searched major clinical trials registries (Clinical Trials and Current Controlled Trials) and the personal libraries of the review authors. We applied no language or publication restriction. We reran the search in February 2015. We will deal with any studies of interest when we update the review.
Selection criteria: We included randomized controlled trials (RCTs), quasi-RCTs, and controlled clinical trials (CCTs) that compared an exercise intervention initiated after ICU discharge to any other intervention, a control, or a 'usual care' programme in adult (≥18 years) survivors of critical illness. Data collection and analysis: We used the standard methodological procedures expected by The Cochrane Collaboration. Main results: We included six trials (483 adult ICU participants). Exercise-based interventions were delivered on the ward in two studies; both on the ward and in the community in one study; and in the community in three studies. The duration of the intervention varied according to the length of stay in hospital following ICU discharge (up to a fixed duration of 12 weeks). Risk of bias was variable for all domains across all trials. High risk of bias was evident in all studies for performance bias, although blinding of participants and personnel in therapeutic rehabilitation trials can be pragmatically challenging. Risk of bias was low in at least 50% of trials for all other domains, although one study was at high risk of bias for random sequence generation (selection bias), incomplete outcome data (attrition bias) and other sources. Risk of bias was unclear for the remaining studies across the domains. All six studies measured effect on the primary outcome of functional exercise capacity, although there was wide variability in the nature of the interventions, outcome measures and associated metrics, and data reporting. Overall quality of the evidence was very low. Only two studies, using the same outcome measure for functional exercise capacity, had the potential for pooling of data and assessment of heterogeneity. On statistical advice, it was considered inappropriate to perform this analysis, and study findings were therefore described qualitatively. Individually, three studies reported positive results in favour of the intervention. A small benefit (versus control) was evident in anaerobic threshold in one study (mean difference, MD (95% confidence interval, CI), 1.8 ml O2/kg/min (0.4 to 3.2), P value = 0.02), although this effect was short-term; in a second study, both incremental (MD 4.7 (95% CI 1.69 to 7.75) Watts, P value = 0.003) and endurance (MD 4.12 (95% CI 0.68 to 7.56) minutes, P value = 0.021) exercise testing demonstrated improvement. Finally, self-reported physical function increased significantly following a rehabilitation manual (P value = 0.006). The remaining studies found no effect of the intervention. Similar variability was evident with regard to findings for the primary outcome of health-related quality of life. Only two studies evaluated this outcome. Following statistical advice, these data were again considered inappropriate for pooling to determine overall effect and assess heterogeneity; qualitative description of findings was therefore undertaken. Individually, neither study reported differences between intervention and control groups in health-related quality of life as a result of the intervention. Overall quality of the evidence was very low. Mortality was reported by all studies, ranging from 0% to 18.8%. Only one non-mortality adverse event was reported across all patients in all studies (a minor musculoskeletal injury). Withdrawals, reported in four studies, ranged from 0% to 26.5% in control groups and 8.2% to 27.6% in intervention groups. Loss to follow-up, reported in all studies, ranged from 0% to 14% in control groups and 0% to 12.5% in intervention groups. Authors' conclusions: We are unable, at this time, to determine an overall effect on functional exercise capacity, or health-related quality of life, of an exercise-based intervention initiated after ICU discharge in survivors of critical illness. Meta-analysis of findings was not appropriate due to insufficient study numbers and data. Individual study findings were inconsistent.
Some studies reported a beneficial effect of the intervention on functional exercise capacity, and others did not. No effect was reported on health-related quality of life. Methodological rigour was lacking across a number of domains, influencing the quality of the evidence. There was also wide variability in the characteristics of interventions, outcome measures and associated metrics, and data reporting. If further trials are identified, we may be able to determine the effect of exercise-based interventions following ICU discharge on functional exercise capacity and health-related quality of life in survivors of critical illness.

Relevance:

90.00%

Publisher:

Abstract:

clRNG and clProbDist are two application programming interfaces (APIs) that we developed for the generation of uniform and non-uniform random numbers on parallel computing devices using the OpenCL environment. The first interface makes it possible to create, on a host computer, stream objects that act as virtual parallel generators and can be used both on the host and on parallel devices (graphics processing units, multicore CPUs, etc.) to generate sequences of random numbers. The second interface also makes it possible to generate, on these devices, random variates from various continuous and discrete probability distributions. In this thesis, we review basic notions of random number generators, describe heterogeneous systems, and cover techniques for parallel random number generation. We also present the different models that make up the architecture of the OpenCL environment and detail the structure of the APIs we developed. For clRNG, we distinguish the functions that create streams, the functions that generate uniform random variates, and those that manipulate stream states. clProbDist contains functions for generating non-uniform random variates by inversion, as well as functions that return various statistics of the implemented distributions. We evaluate these programming interfaces with two simulations: a simplified inventory model and a financial option example. Finally, we provide experimental results on the performance of the implemented generators.
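The stream abstraction described above can be sketched in Python (an illustration of the concept only; the class, method names and seeding scheme below are assumptions, not the actual clRNG C API):

```python
import math
import random

class Stream:
    """Illustrative host-side stream object: an independent virtual
    generator identified by (seed, stream_id), mimicking the stream
    objects clRNG creates on the host and hands to parallel devices."""
    def __init__(self, seed, stream_id):
        # Distinct, reproducible state per stream (simple seed mixing).
        self._rng = random.Random(seed * 1_000_003 + stream_id)

    def next_uniform(self):
        """Uniform variate in [0, 1)."""
        return self._rng.random()

    def next_exponential(self, rate):
        """Non-uniform variate by inversion (the technique clProbDist
        uses): X = -ln(1 - U) / rate."""
        return -math.log(1.0 - self._rng.random()) / rate

def create_streams(seed, n):
    """Create n independent streams to distribute to parallel work-items."""
    return [Stream(seed, i) for i in range(n)]

streams = create_streams(12345, 4)
draws = [s.next_uniform() for s in streams]
```

Each work-item would receive one stream and advance it privately, which is what makes the virtual generators usable on both host and device.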

Relevance:

90.00%

Publisher:

Abstract:

The problem of calculating the probability of error in a DS/SSMA system has been extensively studied for more than two decades. When random sequences are employed, some conditioning must be done before the application of the central limit theorem is attempted, leading to a Gaussian distribution. The authors seek to characterise the multiple access interference as a random walk with a random number of steps, for random and deterministic sequences. Using results from random-walk theory, they model the interference as a K-distributed random variable and use it to calculate the probability of error in the form of a series, for a DS/SSMA system with a coherent correlation receiver and BPSK modulation under Gaussian noise. The asymptotic properties of the proposed distribution agree with other analyses. This is, to the best of the authors' knowledge, the first attempt to propose a non-Gaussian distribution for the interference. The modelling can be extended to consider multipath fading and general modulation.
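The random-walk picture of the interference can be simulated directly (an illustrative sketch assuming a Poisson-distributed step count; the paper's exact construction and its K-distribution fit are not reproduced here):

```python
import math
import random

def poisson(rng, lam):
    """Knuth's multiplication method for a Poisson(lam) variate."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        k += 1
        p *= rng.random()
        if p <= limit:
            return k - 1

def interference_sample(rng, mean_steps):
    """One multiple-access-interference sample: a random walk taking a
    random number of +/-1 steps (the number of steps is itself random)."""
    n = poisson(rng, mean_steps)
    return sum(rng.choice((-1, 1)) for _ in range(n))

rng = random.Random(7)
samples = [interference_sample(rng, 20.0) for _ in range(2000)]
mean = sum(samples) / len(samples)  # close to 0 by symmetry
```

Randomising the step count is what produces heavier-than-Gaussian tails in the resulting interference distribution, which is the qualitative effect the K-distribution captures.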

Relevance:

90.00%

Publisher:

Abstract:

This paper deals with the design of a network-on-chip reconfigurable pseudorandom number generation unit that can map and execute meta-heuristic algorithms in hardware. The unit can be configured to implement one of the following five linear generator algorithms: multiplicative congruential, mixed congruential, standard multiple recursive, mixed multiple recursive, and multiply-with-carry. The generation unit can be used as a pseudorandom, message-passing-based server, able to produce pseudorandom numbers on demand and send them to the network-on-chip blocks that originate the service request. The generator architecture has been mapped to a field programmable gate array, and the results showed that millions of numbers in 32-, 64-, 96-, or 128-bit formats can be produced in tens of milliseconds. (C) 2011 Elsevier B.V. All rights reserved.
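Two of the five configurable algorithms can be sketched in software (the hardware unit's actual parameters are not given in the abstract; the constants below are well-known textbook choices, used here as assumptions):

```python
def mixed_lcg(seed, a=1664525, c=1013904223, m=2**32):
    """Mixed linear congruential generator: x <- (a*x + c) mod m.
    Constants are the Numerical Recipes choice, not the paper's."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x

def multiply_with_carry(seed, carry=123, a=698769069, b=2**32):
    """Multiply-with-carry: t = a*x + c; new x = t mod b, new c = t div b.
    The multiplier is Marsaglia's suggestion for base 2**32 (assumed)."""
    x, c = seed, carry
    while True:
        t = a * x + c
        x, c = t % b, t // b
        yield x

g = mixed_lcg(1)
first = next(g)  # (1664525*1 + 1013904223) % 2**32 == 1015568748
```

Both recurrences need only a multiplier, an adder and a modular reduction, which is why a single reconfigurable datapath can serve all five generator families.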

Relevance:

90.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

90.00%

Publisher:

Abstract:

OBJECTIVE To determine whether adequacy of randomisation and allocation concealment is associated with changes in effect sizes (ES) when comparing physical therapy (PT) trials with and without these methodological characteristics. DESIGN Meta-epidemiological study. PARTICIPANTS A random sample of randomised controlled trials (RCTs) included in meta-analyses in the PT discipline was identified. INTERVENTION Data extraction, including assessments of random sequence generation and allocation concealment, was conducted independently by two reviewers. To determine the association between sequence generation and allocation concealment and ES, a two-level analysis was conducted using a meta-meta-analytic approach. PRIMARY AND SECONDARY OUTCOME MEASURES Association between random sequence generation and allocation concealment and ES in PT trials. RESULTS 393 trials included in 43 meta-analyses, analysing 44 622 patients, contributed to this study. Adequate random sequence generation and appropriate allocation concealment were accomplished in only 39.7% and 11.5% of PT trials, respectively. Although trials with inappropriate allocation concealment tended to overestimate treatment effects when compared with trials with adequate concealment of allocation, the difference was not statistically significant (ES=0.12; 95% CI -0.06 to 0.30). When pooling our results with those of Nuesch et al, we obtained a pooled statistically significant value (ES=0.14; 95% CI 0.02 to 0.26). There was no difference in ES between trials with appropriate and inappropriate random sequence generation (ES=0.02; 95% CI -0.12 to 0.15). CONCLUSIONS Our results suggest that, when evaluating risk of bias of primary RCTs in the PT area, systematic reviewers and clinicians implementing research into practice should pay attention to these biases, since they could exaggerate treatment effects. Systematic reviewers should perform sensitivity analyses including trials with low risk of bias in these domains as the primary analysis and/or in combination with less restrictive analyses. Authors and editors should make sure that allocation concealment and random sequence generation are properly reported in trial reports.

Relevance:

90.00%

Publisher:

Abstract:

PURPOSE The objective of this study was to assess the risk of bias of randomized controlled trials (RCTs) published in prosthodontic and implant dentistry journals. MATERIALS AND METHODS The last 30 issues of 9 journals in the field of prosthodontic and implant dentistry (Clinical Implant Dentistry and Related Research, Clinical Oral Implants Research, Implant Dentistry, International Journal of Oral & Maxillofacial Implants, International Journal of Periodontics and Restorative Dentistry, International Journal of Prosthodontics, Journal of Dentistry, Journal of Oral Rehabilitation, and Journal of Prosthetic Dentistry) were hand-searched for RCTs. Risk of bias was assessed using the Cochrane Collaboration's risk of bias tool and analyzed descriptively. RESULTS From the 3,667 articles screened, a total of 147 RCTs were identified and included. The number of published RCTs increased with time. The overall distribution of a high risk of bias assessment varied across the domains of the Cochrane risk of bias tool: 8% for random sequence generation, 18% for allocation concealment, 41% for masking, 47% for blinding of outcome assessment, 7% for incomplete outcome data, 12% for selective reporting, and 41% for other biases. CONCLUSION The distribution of high risk of bias for RCTs published in the selected prosthodontic and implant dentistry journals varied among journals and ranged from 8% to 47%, which can be considered substantial.

Relevance:

90.00%

Publisher:

Abstract:

Radiotherapy has been a method of choice in cancer treatment for a number of years. Mathematical modeling is an important tool in studying the survival behavior of any cell as well as its radiosensitivity. One particular cell under investigation is the normal T-cell, the radiosensitivity of which may be indicative of the patient's tolerance to radiation doses. The model derived is a compound branching process with a random initial population of T-cells that is assumed to have a compound distribution. T-cells in any generation are assumed to double or die at random lengths of time. This population is assumed to undergo a random number of generations within a period of time. The model is then used to obtain an estimate for the survival probability of T-cells for the data under investigation. This estimate is derived iteratively by applying the likelihood principle. Further assessment of the validity of the model is performed by simulating a number of subjects under this model. This study shows that there is a great deal of variation in T-cell survival from one individual to another. These variations can be observed under normal conditions as well as under radiotherapy. The findings are in agreement with a recent study and show that genetic diversity plays a role in determining the survival of T-cells.
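The branching mechanism described — cells that double or die over a number of generations — can be sketched as follows (a simplified illustration; the paper's compound initial distribution, random generation times and random generation count are not modelled):

```python
import random

def branching_population(rng, n_init, p_double, n_generations):
    """Simple branching process: each cell independently doubles with
    probability p_double or dies; repeated for n_generations."""
    n = n_init
    for _ in range(n_generations):
        n = sum(2 for _ in range(n) if rng.random() < p_double)
    return n

def survival_fraction(seed, n_subjects, n_init, p_double, n_generations):
    """Fraction of simulated subjects whose T-cell line has not gone
    extinct after n_generations (a crude survival-probability estimate)."""
    rng = random.Random(seed)
    alive = sum(
        branching_population(rng, n_init, p_double, n_generations) > 0
        for _ in range(n_subjects)
    )
    return alive / n_subjects
```

Running `survival_fraction` with different doubling probabilities per subject mimics the between-individual variation in T-cell survival that the study reports.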

Relevance:

90.00%

Publisher:

Abstract:

In this work we propose an NLSE-based model of the power and spectral properties of the random distributed feedback (DFB) fiber laser. The model is based on a coupled set of nonlinear Schrödinger equations for the pump and Stokes waves, with distributed feedback due to Rayleigh scattering. The model treats the random backscattering via its average strength, i.e. we assume that the feedback is incoherent. In addition, this allows us to speed up simulations substantially (by up to several orders of magnitude). We found that the model of incoherent feedback predicts a smooth and narrow (compared with the gain spectral profile) generation spectrum in the random DFB fiber laser. The model allows one to optimize the width of the random laser generation spectrum by varying the dispersion and nonlinearity values: we found that high dispersion and low nonlinearity result in a narrower spectrum, which could be interpreted as an indication that four-wave mixing between different spectral components in the quasi-mode-less spectrum of the random laser under study plays an important role in spectrum formation. Note that the physical mechanism of random DFB fiber laser spectrum formation and broadening has not yet been identified. We investigate the temporal and statistical properties of the random DFB fiber laser dynamics. Interestingly, we found that the intensity statistics are not Gaussian. The intensity autocorrelation function also reveals that correlations do exist. The possibility of optimizing the system parameters to enhance the observed intrinsic spectral correlations, to potentially achieve pulsed (mode-locked) operation of the mode-less random distributed feedback fiber laser, is discussed.

Relevance:

80.00%

Publisher:

Abstract:

Popular wireless network standards, such as IEEE 802.11/15/16, are increasingly adopted in real-time control systems. However, they are not designed for real-time applications. Therefore, the performance of such wireless networks needs to be carefully evaluated before the systems are implemented and deployed. While efforts have been made to model general wireless networks with completely random traffic generation, there is a lack of theoretical investigations into the modelling of wireless networks with periodic real-time traffic. Considering the widely used IEEE 802.11 standard, with the focus on its distributed coordination function (DCF), for soft-real-time control applications, this paper develops an analytical Markov model to quantitatively evaluate the network quality-of-service (QoS) performance in periodic real-time traffic environments. Performance indices to be evaluated include throughput capacity, transmission delay and packet loss ratio, which are crucial for real-time QoS guarantee in real-time control applications. They are derived under the critical real-time traffic condition, which is formally defined in this paper to characterize the marginal satisfaction of real-time performance constraints.
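The three QoS indices named above can be illustrated with a deliberately simple simulation (not the paper's analytical Markov model; the exponential access delay and all parameter names here are assumptions):

```python
import random

def qos_indices(seed, n_packets, period_s, deadline_s, mean_delay_s):
    """Toy periodic-traffic model: one packet is generated per period;
    each packet experiences a random channel-access delay and is
    counted as lost if it misses its deadline.  Returns (throughput
    in packets/s, mean delay of delivered packets, packet loss ratio)."""
    rng = random.Random(seed)
    delivered, total_delay, lost = 0, 0.0, 0
    for _ in range(n_packets):
        delay = rng.expovariate(1.0 / mean_delay_s)
        if delay > deadline_s:
            lost += 1
        else:
            delivered += 1
            total_delay += delay
    throughput = delivered / (n_packets * period_s)
    mean_delay = total_delay / delivered if delivered else 0.0
    return throughput, mean_delay, lost / n_packets
```

Sweeping `mean_delay_s` toward `deadline_s` in such a toy shows the marginal regime the paper formalizes as the critical real-time traffic condition.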

Relevance:

80.00%

Publisher:

Abstract:

Choi et al. recently proposed an efficient RFID authentication protocol for a ubiquitous computing environment, OHLCAP (One-Way Hash based Low-Cost Authentication Protocol). However, this paper reveals that the protocol has several security weaknesses: 1) traceability based on the leakage of counter information, 2) vulnerability to an impersonation attack by maliciously updating a random number, and 3) traceability based on a physically attacked tag. Finally, a security-enhanced group-based authentication protocol is presented.

Relevance:

80.00%

Publisher:

Abstract:

Skin temperature is an important physiological measure that can reflect the presence of illness and injury as well as provide insight into the localised interactions between the body and the environment. The aim of this systematic review was to analyse the agreement between conductive and infrared means of assessing skin temperature, which are commonly employed in clinical, occupational, sports medicine, public health and research settings. Full-text eligibility was determined independently by two reviewers. Studies meeting the following criteria were included in the review: 1) the literature was written in English, 2) participants were human (in vivo), 3) skin surface temperature was assessed at the same site, 4) at least two commercially available devices were employed—one conductive and one infrared—and 5) skin temperature data were reported in the study. A computerised search of four electronic databases, using a combination of 21 keywords, and citation tracking was performed in January 2015. A total of 8,602 records were returned. Methodological quality was assessed by two authors independently, using the Cochrane risk of bias tool. A total of 16 articles (n = 245) met the inclusion criteria. Devices were classified as being in agreement if they met the clinically meaningful recommendations of mean differences within ±0.5 °C and limits of agreement of ±1.0 °C. Twelve of the included studies found mean differences greater than ±0.5 °C between conductive and infrared devices. In the presence of an external stimulus (e.g. exercise and/or heat), five studies found exacerbated measurement differences between conductive and infrared devices. This is the first review that has attempted to investigate the presence of any systematic bias between infrared and conductive measures by collectively evaluating the current evidence base. 
There was also a consistently high risk of bias across the studies, in terms of sample size, random sequence generation, allocation concealment, blinding and incomplete outcome data. This systematic review questions the suitability of using infrared cameras in stable, resting, laboratory conditions. Furthermore, both infrared cameras and thermometers demonstrate poor agreement with conductive devices in the presence of sweat and environmental heat. These findings have implications for the clinical, occupational, public health, sports science and research fields.

Relevance:

80.00%

Publisher:

Abstract:

The literature contains many examples of digital procedures for the analytical treatment of electroencephalograms, but there is as yet no standard by which those techniques may be judged or compared. This paper proposes one method of generating an EEG, based on a computer program for Zetterberg's simulation. It is assumed that the statistical properties of an EEG may be represented by stationary processes having rational transfer functions, realised by a system of software filters and random number generators. The model represents neither the neurological mechanism responsible for generating the EEG, nor any particular type of EEG record; transient phenomena such as spikes, sharp waves and alpha bursts are also excluded. The basis of the program is a valid 'partial' statistical description of the EEG; that description is then used to produce a digital representation of a signal which, if plotted sequentially, might or might not by chance resemble an EEG; that is unimportant. What is important is that the statistical properties of the series remain those of a real EEG; it is in this sense that the output is a simulation of the EEG. There is considerable flexibility in the form of the output, i.e. its alpha, beta and delta content, which may be selected by the user, the same selected parameters always producing the same statistical output. The filtered outputs from the random number sequences may be scaled to provide realistic power distributions in the accepted EEG frequency bands and then summed to create a digital output signal, the 'stationary EEG'. It is suggested that the simulator might act as a test input to digital analytical techniques for the EEG, enabling at least a substantial part of those techniques to be compared and assessed in an objective manner. The equations necessary to implement the model are given.
The program has been run on a DEC1090 computer but is suitable for any microcomputer having more than 32 kBytes of memory; the execution time required to generate a 25 s simulated EEG is in the region of 15 s.
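The filter-plus-random-number construction can be sketched as follows (a minimal illustration; the resonator design, band centres and gains below are assumptions, not Zetterberg's actual parameters):

```python
import math
import random

def resonator(f0_hz, bw_hz, fs_hz):
    """Feedback coefficients of a 2nd-order IIR resonator centred on
    f0_hz with approximate bandwidth bw_hz at sample rate fs_hz."""
    r = math.exp(-math.pi * bw_hz / fs_hz)
    theta = 2.0 * math.pi * f0_hz / fs_hz
    return 2.0 * r * math.cos(theta), -r * r

def band_limited_noise(rng, n, f0_hz, bw_hz, fs_hz, gain):
    """Gaussian random numbers filtered into one EEG frequency band."""
    a1, a2 = resonator(f0_hz, bw_hz, fs_hz)
    y1 = y2 = 0.0
    out = []
    for _ in range(n):
        y = gain * rng.gauss(0.0, 1.0) + a1 * y1 + a2 * y2
        y2, y1 = y1, y
        out.append(y)
    return out

def stationary_eeg(seed, n, fs_hz=128.0):
    """Sum of delta, alpha and beta band noise; the same seed and
    parameters always reproduce the same statistical output."""
    rng = random.Random(seed)
    bands = [(2.0, 1.0, 0.8),   # delta: centre, bandwidth, gain
             (10.0, 2.0, 1.0),  # alpha
             (20.0, 4.0, 0.3)]  # beta
    signal = [0.0] * n
    for f0, bw, g in bands:
        for i, v in enumerate(band_limited_noise(rng, n, f0, bw, fs_hz, g)):
            signal[i] += v
    return signal
```

Adjusting the per-band gains corresponds to the user-selectable alpha, beta and delta content described in the paper.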

Relevance:

80.00%

Publisher:

Abstract:

We describe a compiler for the Flat Concurrent Prolog language on a message-passing multiprocessor architecture. This compiler permits symbolic and declarative programming in the syntax of Guarded Horn Rules. The implementation has been verified and tested on the 64-node PARAM parallel computer developed by C-DAC (Centre for the Development of Advanced Computing, India). Flat Concurrent Prolog (FCP) is a logic programming language designed for concurrent programming and parallel execution. It is a process-oriented language, which embodies dataflow synchronization and guarded commands as its basic control mechanisms. An identical algorithm is executed on every processor in the network. We assume regular network topologies such as mesh, ring, etc. Each node has a local memory. The algorithm comprises two important parts: reduction and communication. The most difficult task is to integrate the solutions of problems that arise in the implementation in a coherent and efficient manner. We have tested the efficacy of the compiler on various benchmark problems of the ICOT project that have been reported in the recent book by Evan Tick. These problems include Quicksort, 8-queens, and Prime Number Generation. The results of the preliminary tests are favourable. We are currently examining issues such as indexing and load balancing to further optimize our compiler.