20 results for Pseudorandom number generation

in CentAUR: Central Archive at the University of Reading - UK


Relevance:

100.00%

Abstract:

A parallel hardware random number generator for use with a VLSI genetic algorithm processing device is proposed. The design uses a systolic array of mixed congruential random number generators. The generators are constantly reseeded with the outputs of the preceding generators to avoid significant biasing of the randomness of the array, which would result in longer times for the algorithm to converge to a solution.

1 Introduction

In recent years there has been a growing interest in developing hardware genetic algorithm devices [1, 2, 3]. A genetic algorithm (GA) is a stochastic search and optimization technique which attempts to capture the power of natural selection by evolving a population of candidate solutions by a process of selection and reproduction [4]. In keeping with the evolutionary analogy, the solutions are called chromosomes, with each chromosome containing a number of genes. Chromosomes are commonly simple binary strings, the bits being the genes.
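The reseeding scheme lends itself to a compact software model. Below is a minimal sketch, in Python, of a systolic array of mixed (non-zero increment) linear congruential generators in which each generator is reseeded from its predecessor's output after every step; the LCG constants and the XOR coupling are illustrative assumptions, not the parameters of the hardware design described above.

```python
# Sketch: systolic array of mixed congruential generators with
# neighbour reseeding. Constants a, c, m are illustrative (a common
# 32-bit LCG), not those of the proposed VLSI design.

class MixedCongruentialGenerator:
    def __init__(self, seed, a=1664525, c=1013904223, m=2**32):
        self.state, self.a, self.c, self.m = seed, a, c, m

    def next(self):
        # Mixed (c != 0) linear congruential step.
        self.state = (self.a * self.state + self.c) % self.m
        return self.state

def step_array(gens):
    """Advance each generator once, then reseed every generator with
    the output of the one preceding it, to limit bias across the array."""
    outputs = [g.next() for g in gens]
    for i in range(1, len(gens)):
        gens[i].state ^= outputs[i - 1]  # illustrative coupling
    return outputs

gens = [MixedCongruentialGenerator(seed=17 * (i + 1)) for i in range(8)]
for _ in range(4):
    print(step_array(gens))
```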

Relevance:

100.00%

Abstract:

Random number generation (RNG) is a functionally complex process that is highly controlled and therefore dependent on Baddeley's central executive. This study addresses this issue by investigating whether key predictions from this framework are compatible with empirical data. In Experiment 1, the effect of increasing task demands by increasing the rate of the paced generation was comprehensively examined. As expected, faster rates affected performance negatively because central resources were increasingly depleted. Next, the effects of participants' exposure were manipulated in Experiment 2 by providing increasing amounts of practice on the task. There was no improvement over 10 practice trials, suggesting that the high level of strategic control required by the task was constant and not amenable to any automatization gain with repeated exposure. Together, the results demonstrate that RNG performance is a highly controlled and demanding process sensitive to additional demands on central resources (Experiment 1) and is unaffected by repeated performance or practice (Experiment 2). These features render the easily administered RNG task an ideal and robust index of executive function that is highly suitable for repeated clinical use.

Relevance:

80.00%

Abstract:

Two experiments examined the claim for distinct implicit and explicit learning modes in the artificial grammar-learning task (Reber, 1967, 1989). Subjects initially attempted to memorize strings of letters generated by a finite-state grammar and then classified new grammatical and nongrammatical strings. Experiment 1 showed that subjects' assessment of isolated parts of strings was sufficient to account for their classification performance but that the rules elicited in free report were not sufficient. Experiment 2 showed that performing a concurrent random number generation task under different priorities interfered with free report and classification performance equally. Furthermore, giving different groups of subjects incidental or intentional learning instructions did not affect classification or free report.

Relevance:

30.00%

Abstract:

We report on the results of a laboratory investigation using a rotating two-layer annulus experiment, which exhibits both large-scale vortical modes and short-scale divergent modes. A sophisticated visualization method allows us to observe the flow at very high spatial and temporal resolution. The balanced long-wavelength modes appear only when the Froude number is supercritical (i.e. $F > F_{\mathrm{critical}} \equiv \pi^2/2$), and are therefore consistent with generation by a baroclinic instability. The unbalanced short-wavelength modes appear locally in every single baroclinically unstable flow, providing perhaps the first direct experimental evidence that all evolving vortical flows will tend to emit freely propagating inertia–gravity waves. The short-wavelength modes also appear in certain baroclinically stable flows. We infer the generation mechanisms of the short-scale waves, both for the baroclinically unstable case in which they co-exist with a large-scale wave, and for the baroclinically stable case in which they exist alone. The two possible mechanisms considered are spontaneous adjustment of the large-scale flow, and Kelvin–Helmholtz shear instability. Short modes in the baroclinically stable regime are generated only when the Richardson number is subcritical (i.e. $Ri < Ri_{\mathrm{critical}} \equiv 1$), and are therefore consistent with generation by a Kelvin–Helmholtz instability. We calculate five indicators of short-wave generation in the baroclinically unstable regime, using data from a quasi-geostrophic numerical model of the annulus. There is excellent agreement between the spatial locations of short-wave emission observed in the laboratory, and regions in which the model Lighthill/Ford inertia–gravity wave source term is large. We infer that the short waves in the baroclinically unstable fluid are freely propagating inertia–gravity waves generated by spontaneous adjustment of the large-scale flow.
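For readers who want the two thresholds in computable form, here is a minimal sketch restating the criteria quoted above; the sample values of $F$ and $Ri$ are made up for illustration.

```python
import math

# The two stability criteria from the abstract: baroclinic instability
# requires a supercritical Froude number, F > pi^2 / 2, while
# Kelvin-Helmholtz instability requires a subcritical Richardson
# number, Ri < 1.
F_CRITICAL = math.pi ** 2 / 2   # ~4.93
RI_CRITICAL = 1.0

def baroclinically_unstable(froude: float) -> bool:
    return froude > F_CRITICAL

def kelvin_helmholtz_unstable(richardson: float) -> bool:
    return richardson < RI_CRITICAL

print(baroclinically_unstable(6.0))    # True: long balanced modes expected
print(kelvin_helmholtz_unstable(0.4))  # True: short modes expected
```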

Relevance:

30.00%

Abstract:

Although climate models have been improving in accuracy and efficiency over the past few decades, it now seems that these incremental improvements may be slowing. As tera/petascale computing becomes massively parallel, our legacy codes are less suitable, and even with the increased resolution that we are now beginning to use, these models cannot represent the multiscale nature of the climate system. This paper argues that it may be time to reconsider the use of adaptive mesh refinement for weather and climate forecasting in order to achieve good scaling and representation of the wide range of spatial scales in the atmosphere and ocean. Furthermore, the challenge of introducing living organisms and human responses into climate system models is only just beginning to be tackled. We do not yet have a clear framework in which to approach the problem, but it is likely to cover such a huge number of different scales and processes that radically different methods may have to be considered. The challenges of multiscale modelling and petascale computing provide an opportunity to consider a fresh approach to numerical modelling of the climate (or Earth) system, which takes advantage of the computational fluid dynamics developments in other fields and brings new perspectives on how to incorporate Earth system processes. This paper reviews some of the current issues in climate (and, by implication, Earth) system modelling, and asks the question whether a new generation of models is needed to tackle these problems.

Relevance:

30.00%

Abstract:

This article describes an empirical, user-centred approach to explanation design. It reports three studies that investigate what patients want to know when they have been prescribed medication. The question is asked in the context of the development of a drug prescription system called OPADE. The system is aimed primarily at improving the prescribing behaviour of physicians, but will also produce written explanations for indirect users such as patients. In the first study, a large number of people were presented with a scenario about a visit to the doctor, and were asked to list the questions that they would like to ask the doctor about the prescription. On the basis of the results of the study, a categorization of question types was developed in terms of how frequently particular questions were asked. In the second and third studies a number of different explanations were generated in accordance with this categorization, and a new sample of people were presented with another scenario and were asked to rate the explanations on a number of dimensions. The results showed significant differences between the different explanations. People preferred explanations that included items corresponding to frequently asked questions in study 1. For an explanation to be considered useful, it had to include information about side effects, what the medication does, and any lifestyle changes involved. The implications of the results of the three studies are discussed in terms of the development of OPADE's explanation facility.

Relevance:

30.00%

Abstract:

The lack of myostatin promotes growth of skeletal muscle, and blockade of its activity has been proposed as a treatment for various muscle-wasting disorders. Here, we have examined two independent mouse lines that harbor mutations in the myostatin gene, constitutive null (Mstn(-/-)) and compact (Berlin High Line, BEH(c/c)). We report that, despite a larger muscle mass relative to age-matched wild types, there was no increase in maximum tetanic force generation, but that when expressed as a function of muscle size (specific force), muscles of myostatin-deficient mice were weaker than wild-type muscles. In addition, Mstn(-/-) muscle contracted and relaxed faster during a single twitch and had a marked increase in the number of type IIb fibers relative to wild-type controls. This change was also accompanied by a significant increase in type IIb fibers containing tubular aggregates. Moreover, the ratio of mitochondrial DNA to nuclear DNA and mitochondria number were decreased in myostatin-deficient muscle, suggesting a mitochondrial depletion. Overall, our results suggest that lack of myostatin compromises force production in association with loss of oxidative characteristics of skeletal muscle.

Relevance:

30.00%

Abstract:

Expression of biologically active molecules as fusion proteins with antibody Fc can substantially extend the plasma half-life of the active agent but may also influence function. We have previously generated a number of fusion proteins comprising a complement regulator coupled to Fc and shown that the hybrid molecule has a long plasma half-life and retains biological activity. However, several of the fusion proteins generated had substantially reduced biological activity when compared with the native regulator or regulator released from the Fc following papain cleavage. We have taken advantage of this finding to engineer a prodrug with low complement regulatory activity that is cleaved at sites of inflammation to release active regulator. Two model prodrugs, comprising, respectively, the four short consensus repeats of human decay accelerating factor (CD55) linked to IgG4 Fc and the three NH2-terminal short consensus repeats of human decay accelerating factor linked to IgG2 Fc have been developed. In each, specific cleavage sites for matrix metalloproteinases and/or aggrecanases have been incorporated between the complement regulator and the Fc. These prodrugs have markedly decreased complement inhibitory activity when compared with the parent regulator in vitro. Exposure of the prodrugs to the relevant enzymes, either purified, or in supernatants of cytokine-stimulated chondrocytes or in synovial fluid, efficiently cleaved the prodrug, releasing active regulator. Such agents, having negligible systemic effects but active at sites of inflammation, represent a paradigm for the next generation of anti-C therapeutics.

Relevance:

30.00%

Abstract:

A prerequisite for the enrichment of antibodies screened from phage display libraries is their stable expression on a phage during multiple selection rounds. Thus, if stringent panning procedures are employed, selection is simultaneously driven by antigen affinity, stability and solubility. To take advantage of robust pre-selected scaffolds of such molecules, we grafted single-chain Fv (scFv) antibodies, previously isolated from a human phage display library after multiple rounds of in vitro panning on tumor cells, with the specificity of the clinically established murine monoclonal anti-CD22 antibody RFB4. We show that a panel of grafted scFvs retained the specificity of the murine monoclonal antibody, bound to the target antigen with high affinity (6.4-9.6 nM), and exhibited exceptional biophysical stability with retention of 89-93% of the initial binding activity after 6 days of incubation in human serum at 37°C. Selection of stable human scaffolds with high sequence identity to both the human germline and the rodent frameworks required only a small number of murine residues to be retained within the human frameworks in order to maintain the structural integrity of the antigen binding site. We expect this approach may be applicable for the rapid generation of highly stable humanized antibodies with low immunogenic potential.

Relevance:

30.00%

Abstract:

Objective: To assess the effectiveness of absolute risk, relative risk, and number needed to harm formats for medicine side effects, with and without the provision of baseline risk information. Methods: A two factor, risk increase format (relative, absolute and NNH) x baseline (present/absent) between participants design was used. A sample of 268 women was given a scenario about increase in side effect risk with third generation oral contraceptives, and were required to answer written questions to assess their understanding, satisfaction, and likelihood of continuing to take the drug. Results: Provision of baseline information significantly improved risk estimates and increased satisfaction, although the estimates were still considerably higher than the actual risk. No differences between presentation formats were observed when baseline information was presented. Without baseline information, absolute risk led to the most accurate performance. Conclusion: The findings support the importance of informing people about baseline level of risk when describing risk increases. In contrast, they offer no support for using number needed to harm. Practice implications: Health professionals should provide baseline risk information when presenting information about risk increases or decreases. More research is needed before numbers needed to harm (or treat) should be given to members of the general population.
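Since the three formats are simple arithmetic transformations of the same two numbers, a worked example makes the contrast concrete. The figures below are illustrative only, not the actual thrombosis risks for third-generation oral contraceptives.

```python
# Illustrative comparison of the three risk-increase formats studied
# above. Baseline and new risks are made-up numbers.

baseline_risk = 15 / 100_000   # risk without the newer drug
new_risk      = 30 / 100_000   # risk with the newer drug

absolute_increase = new_risk - baseline_risk   # extra cases per person
relative_risk     = new_risk / baseline_risk   # "doubles the risk"
number_needed_to_harm = 1 / absolute_increase  # people exposed per extra case

print(f"Absolute risk increase: {absolute_increase:.5%}")
print(f"Relative risk:          {relative_risk:.1f}x")
print(f"Number needed to harm:  {number_needed_to_harm:,.0f}")
```

The same change reads as "doubles the risk" (relative), "15 extra cases per 100,000" (absolute), or "one extra case per roughly 6,667 users" (NNH), which is why the formats can leave such different impressions without baseline information.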

Relevance:

30.00%

Abstract:

An important goal in computational neuroanatomy is the complete and accurate simulation of neuronal morphology. We are developing computational tools to model three-dimensional dendritic structures based on sets of stochastic rules. This paper reports an extensive, quantitative anatomical characterization of simulated motoneurons and Purkinje cells. We used several local and global algorithms implemented in the L-Neuron and ArborVitae programs to generate sets of virtual neurons. Parameter statistics for all algorithms were measured from experimental data, thus providing a compact and consistent description of these morphological classes. We compared the emergent anatomical features of each group of virtual neurons with those of the experimental database in order to gain insights on the plausibility of the model assumptions, potential improvements to the algorithms, and non-trivial relations among morphological parameters. Algorithms mainly based on local constraints (e.g., branch diameter) were successful in reproducing many morphological properties of both motoneurons and Purkinje cells (e.g., total length, asymmetry, number of bifurcations). The addition of global constraints (e.g., trophic factors) improved the angle-dependent emergent characteristics (average Euclidean distance from the soma to the dendritic terminations, dendritic spread). Virtual neurons systematically displayed greater anatomical variability than real cells, suggesting the need for additional constraints in the models. For several emergent anatomical properties, a specific algorithm reproduced the experimental statistics better than the others did. However, relative performances were often reversed for different anatomical properties and/or morphological classes. Thus, combining the strengths of alternative generative models could lead to comprehensive algorithms for the complete and accurate simulation of dendritic morphology.
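As a flavour of what "sets of stochastic rules" with local constraints means in practice, here is a minimal sketch of diameter-dependent stochastic branching; the bifurcation rule, taper range and parameter values are illustrative assumptions, not the actual L-Neuron or ArborVitae algorithms.

```python
import random

def grow(diameter: float, min_diameter: float = 0.2) -> int:
    """Recursively grow one dendritic subtree and return its number of
    terminal tips. Local rule: thicker branches bifurcate more often,
    and daughter branches taper."""
    if diameter < min_diameter:
        return 1  # branch terminates
    if random.random() < min(0.9, diameter):  # diameter-dependent split
        taper = random.uniform(0.6, 0.8)      # illustrative taper range
        return (grow(diameter * taper, min_diameter)
                + grow(diameter * taper, min_diameter))
    return 1

random.seed(1)
# An emergent statistic (number of terminal tips) over a set of virtual
# dendrites, of the kind compared against an experimental database.
print([grow(1.0) for _ in range(5)])
```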

Relevance:

30.00%

Abstract:

Various studies investigating the future impacts of integrating high levels of renewable energy make use of historical meteorological (met) station data to produce estimates of future generation. Hourly means of 10 m horizontal wind are extrapolated to a standard turbine hub height using the wind profile power law or log law and used to simulate the hypothetical power output of a turbine at that location; repeating this procedure using many viable locations can produce a picture of future electricity generation. However, the estimate of hub height wind speed is dependent on the choice of the wind shear exponent a or the roughness length z0, and requires a number of simplifying assumptions. This paper investigates the sensitivity of estimated generation output to these choices, using a case study of a met station in West Freugh, Scotland. The results show that the choice of wind shear exponent is a particularly sensitive parameter which can lead to significant variation of estimated hub height wind speed and hence estimated future generation potential of a region.
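The two extrapolation laws mentioned are easy to state explicitly: the power law gives u(z) = u_ref (z/z_ref)^a and the log law gives u(z) = u_ref ln(z/z0)/ln(z_ref/z0). A minimal sketch of the sensitivity check follows; the measured speed, hub height, and the a and z0 values are illustrative, not the West Freugh data.

```python
import math

def power_law(u_ref, z_hub, z_ref=10.0, alpha=1/7):
    """Power-law extrapolation from z_ref (10 m) to hub height."""
    return u_ref * (z_hub / z_ref) ** alpha

def log_law(u_ref, z_hub, z_ref=10.0, z0=0.03):
    """Log-law extrapolation using roughness length z0 (metres)."""
    return u_ref * math.log(z_hub / z0) / math.log(z_ref / z0)

u10, hub = 6.0, 80.0  # illustrative 10 m wind speed and hub height
for alpha in (0.10, 1/7, 0.20):
    print(f"alpha={alpha:.3f}: u_hub = {power_law(u10, hub, alpha=alpha):.2f} m/s")
print(f"log law, z0=0.03 m: u_hub = {log_law(u10, hub):.2f} m/s")
```

Because turbine power varies roughly with the cube of wind speed, the spread in these hub-height estimates is amplified about threefold, in percentage terms, in the estimated generation — the sensitivity the paper quantifies.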

Relevance:

30.00%

Abstract:

Traditional vaccines, such as inactivated or live attenuated vaccines, are gradually giving way to more biochemically defined vaccines that are most often based on a recombinant antigen known to possess neutralizing epitopes. Such vaccines can offer improvements in speed, safety and manufacturing process, but an inevitable consequence of their high degree of purification is that immunogenicity is reduced through the lack of the innate triggering molecules present in more complex preparations. Targeting recombinant vaccines to antigen presenting cells (APCs) such as dendritic cells, however, can improve immunogenicity by ensuring that antigen processing is as efficient as possible. Immune complexes, one of a number of routes of APC targeting, are mimicked by a recombinant approach: crystallizable fragment (Fc) fusion proteins, in which the target immunogen is linked directly to an antibody effector domain capable of interaction with receptors, FcR, on the APC cell surface. A number of virus Fc fusion proteins have been expressed in insect cells using the baculovirus expression system and shown to be efficiently produced and purified. Their use for immunization next to non-Fc tagged equivalents shows that they are powerfully immunogenic in the absence of added adjuvant and that immune stimulation is the result of the Fc-FcR interaction.

Relevance:

30.00%

Abstract:

The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution is not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change are mainly through changes in the statistics of regional weather variations, the scientific and computational requirements to predict its behavior reliably are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenges of developing the capabilities to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities with computer capability at each facility of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have sufficient scientific workforce to develop and maintain the software and data analysis infrastructure. Such facilities will enable investigation of what resolution, both horizontal and vertical, in atmospheric and ocean models is necessary for more confident predictions at the regional and local level. Current limitations in computing power have placed severe limitations on such an investigation, which is now badly needed. These facilities will also provide the world's scientists with the computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure including hardware, software, and data analysis support, and scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it. It will ultimately enable the climate community to provide society with climate predictions, which are based on our best knowledge of science and the most advanced technology.

Relevance:

30.00%

Abstract:

Design summer years representing near-extreme hot summers have been used in the United Kingdom for the evaluation of thermal comfort and overheating risk. The years have been selected from measured weather data, assumed to be representative of a stationary climate. Recent developments have made available ‘morphed’ equivalents of these years by shifting and stretching the measured variables using change factors produced by the UKCIP02 climate projections. The release of the latest, probabilistic, climate projections of UKCP09, together with the availability of a weather generator that can produce plausible daily or hourly sequences of weather variables, has opened up the opportunity for generating new design summer years which can be used in risk-based decision-making. There are many possible methods for the production of design summer years from UKCP09 output: in this article, the original concept of the design summer year is largely retained, but a number of alternative methodologies for generating the years are explored. An alternative, more robust measure of warmth (weighted cooling degree hours) is also employed. It is demonstrated that the UKCP09 weather generator is capable of producing years for the baseline period which are comparable with those in current use. Four methodologies for the generation of future years are described, and their output related to the future (deterministic) years that are currently available. It is concluded that, in general, years produced from the UKCP09 projections are warmer than those generated previously. Practical applications: The methodologies described in this article will enable designers who have access to the output of the UKCP09 weather generator (WG) to generate Design Summer Year hourly files tailored to their needs. The files produced will differ according to the methodology selected, in addition to location, emissions scenario and timeslice.
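One way to make the warmth measure and the year-ranking step concrete is sketched below. The base temperature and the squared-exceedance weighting are illustrative assumptions; the article's actual "weighted cooling degree hours" definition may differ, as may its ranking rule.

```python
# Sketch: rank candidate years by a weighted cooling-degree-hours
# measure and pick a near-extreme year (here the third warmest, a
# common design-summer-year convention). Base temperature (22 C) and
# the squared weighting are assumptions, not the article's definition.

def weighted_cdh(hourly_temps, base=22.0):
    return sum((t - base) ** 2 for t in hourly_temps if t > base)

def design_summer_year(years, rank=3):
    """years: mapping of year -> hourly temperature series (e.g. from
    the UKCP09 weather generator). Returns the rank-th warmest year."""
    ordered = sorted(years, key=lambda y: weighted_cdh(years[y]),
                     reverse=True)
    return ordered[rank - 1]

toy = {y: [20 + (y % 7) * 0.5] * 8760 for y in range(2030, 2050)}  # toy data
print(design_summer_year(toy))
```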