50 results for General-purpose computing on graphics processing units (GPGPU)

in CentAUR: Central Archive University of Reading - UK


Relevance: 100.00%

Abstract:

Studies of construction labour productivity have revealed that limited predictability and multi-agent social complexity make long-range planning of construction projects extremely inaccurate. Fire-fighting, a cultural feature of construction project management, the social and structural diversity of the permanent organizations involved, and structural temporality all contribute towards relational failures and frequent changes. The main purpose of this paper is therefore to demonstrate that appropriate construction planning may have a profound synergistic effect on the structural integration of a project organization. Using a general systems theory perspective, a further specific objective is to investigate and evaluate the organizational effects of changes in planning and the potential for achieving continuous project-organizational synergy. The newly developed methodology recognises that planning should also represent a continuous, improvement-leading driving force throughout a project. The synergistic effect of the process planning membership duality fostered project-wide integration, eliminated internal boundaries, and created a pool of constantly upgrading knowledge. It maintained a creative environment that resulted in a number of process-related improvements from all parts of the organization. As a result, labour productivity increased by more than 30%, profits rose from an average of 12% to more than 18%, and project durations were reduced by several days.

Relevance: 100.00%

Abstract:

Background: The computational grammatical complexity (CGC) hypothesis claims that children with G(rammatical)-specific language impairment (SLI) have a domain-specific deficit in the computational system affecting syntactic dependencies involving 'movement'. One type of such syntactic dependency is the filler-gap dependency. In contrast, the Generalized Slowing Hypothesis claims that SLI children have a domain-general deficit affecting processing speed and capacity. Aims: To test these contrasting accounts of SLI, we investigate the processing of syntactic (filler-gap) dependencies in wh-questions. Methods & Procedures: Fourteen G-SLI children aged 10;2-17;2, 14 age-matched controls and 17 vocabulary-matched controls were studied using the cross-modal picture-priming paradigm. Outcomes & Results: G-SLI children's processing speed was significantly slower than that of the age controls, but not of the younger vocabulary controls. The G-SLI children and vocabulary controls did not differ in memory span. However, the typically developing and G-SLI children showed qualitatively different processing patterns. The age and vocabulary controls showed priming at the gap, indicating that they process wh-questions through syntactic filler-gap dependencies. In contrast, G-SLI children showed priming only at the verb. Conclusions: The findings indicate that G-SLI children fail to establish a syntactic filler-gap dependency reliably and instead interpret wh-questions via lexical thematic information. These data challenge the Generalized Slowing Hypothesis account, but support the CGC hypothesis, according to which G-SLI children have a particular deficit in the computational system affecting syntactic dependencies involving 'movement'. As effective remediation often depends on aetiological insight, the discovery of the nature of the syntactic deficit, alongside the possible compensatory use of semantics to facilitate sentence processing, can be used to direct therapy. However, which therapeutic strategy should be used, and whether similar strengths and weaknesses within the language system are found in other SLI subgroups, are empirical issues that warrant further research.

Relevance: 100.00%

Abstract:

The definitions of the base units of the International System of Units (SI) have been revised many times since the idea of such an international system was first conceived at the time of the French Revolution. The objective today is to define all our units in terms of 'invariants of nature', i.e. by referencing our units to the fundamental constants of physics, or to the properties of atoms, rather than to the characteristics of our planet or of artefacts. This situation is reviewed, particularly with regard to finding a new definition of the kilogram to replace its present definition in terms of a prototype material artefact.
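The abstract does not commit to a particular candidate definition; purely as an illustration, the route eventually adopted for the SI in 2019 fixes the numerical value of the Planck constant h, so that the kilogram follows from the already-defined second and metre:

\[
  h = 6.626\,070\,15 \times 10^{-34}\ \mathrm{kg\,m^{2}\,s^{-1}}
  \quad\Longrightarrow\quad
  1\ \mathrm{kg} = \frac{h}{6.626\,070\,15 \times 10^{-34}\ \mathrm{m^{2}\,s^{-1}}},
\]

with the second and metre themselves fixed through the caesium hyperfine frequency \(\Delta\nu_{\mathrm{Cs}}\) and the speed of light \(c\), so that no material artefact is involved.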

Relevance: 100.00%

Abstract:

Two simple and frequently used capture–recapture estimates of population size are compared: Chao's lower-bound estimate and Zelterman's estimate allowing for contaminated distributions. In the Poisson case it is shown that if there are only counts of ones and twos, Zelterman's estimator is always bounded above by Chao's estimator. If counts larger than two exist, Zelterman's estimator exceeds Chao's only if the ratio of the frequency of counts of two to the frequency of counts of one is small enough. A similar analysis is provided for the binomial case. For a two-component mixture of Poisson distributions the asymptotic bias of both estimators is derived, and it is shown that the Zelterman estimator can suffer from a large overestimation bias. A modified Zelterman estimator is suggested, and the bias-corrected version of Chao's estimator is also considered. All four estimators are compared in a simulation study.
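The abstract states the comparison only verbally; as a minimal sketch (not taken from the paper), the two estimators in their usual textbook form can be computed from the frequency f1 of units observed exactly once, the frequency f2 of units observed exactly twice, and the number n of distinct units observed. The counts below are purely illustrative.

# Minimal sketch of the two estimators discussed above, in their standard form;
# the paper's exact variants (e.g. the bias-corrected Chao estimator) may differ.
import math

def chao_lower_bound(f1, f2, n):
    """Chao's lower-bound estimate: N = n + f1^2 / (2 * f2)."""
    return n + f1 ** 2 / (2.0 * f2)

def zelterman(f1, f2, n):
    """Zelterman's estimate: lambda = 2*f2/f1, N = n / (1 - exp(-lambda))."""
    lam = 2.0 * f2 / f1
    return n / (1.0 - math.exp(-lam))

# Illustrative counts: 60 units seen once, 25 seen twice, so n = 85 distinct units.
f1, f2, n = 60, 25, 85
print(chao_lower_bound(f1, f2, n))  # 157.0
print(zelterman(f1, f2, n))         # about 150.3; bounded above by Chao here, as the
                                    # abstract predicts when only ones and twos occur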

Relevance: 100.00%

Abstract:

The temperature-time profiles of 22 Australian industrial ultra-high-temperature (UHT) plants and 3 pilot plants, using both indirect and direct heating, were surveyed. From these data, the operating parameters of each plant, the chemical index C*, the bacteriological index B* and the predicted changes in the levels of beta-lactoglobulin, alpha-lactalbumin, lactulose, furosine and browning were determined using a simulation program based on published formulae and reaction kinetics data. There was a wide spread of heating conditions in use, some of which resulted in a large margin of bacteriological safety and high chemical indices. However, no conditions were severe enough to cause browning during processing. The data showed a clear distinction between the indirect and direct heating plants. They also indicated that the degree of denaturation of alpha-lactalbumin varied over a wide range and may be a useful discriminatory index of heat treatment. Application of the program to the pilot plants illustrated its value in determining the processing conditions needed in these plants to simulate the conditions in industrial UHT plants.
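The simulation program itself is not described in the abstract; the sketch below shows one common way such indices are obtained, using the widely quoted Kessler-style reference conditions (B* = 1 for 10.1 s at 135 deg C with z = 10.5 deg C; C* = 1 for 30.5 s at 135 deg C with z = 31.4 deg C) and simple trapezoidal integration over a sampled temperature-time profile. The constants, and the additional kinetics used in the paper for beta-lactoglobulin, lactulose, furosine and browning, may differ.

# Sketch only: B* and C* computed from a sampled temperature-time profile with
# commonly quoted reference values; the paper's own program and kinetic data
# may use different constants and cover more reactions.

def uht_indices(times_s, temps_c):
    """Return (B*, C*) for a profile given as times in seconds and temperatures in deg C."""
    def lethality(temp_c, z):
        return 10.0 ** ((temp_c - 135.0) / z)

    b_integral = c_integral = 0.0
    for i in range(len(times_s) - 1):
        dt = times_s[i + 1] - times_s[i]
        t0, t1 = temps_c[i], temps_c[i + 1]
        b_integral += 0.5 * (lethality(t0, 10.5) + lethality(t1, 10.5)) * dt
        c_integral += 0.5 * (lethality(t0, 31.4) + lethality(t1, 31.4)) * dt
    return b_integral / 10.1, c_integral / 30.5  # B* = 1 at 10.1 s, C* = 1 at 30.5 s (135 deg C)

# Hypothetical indirect-heating profile: heat-up, 4 s hold at 140 deg C, cool-down.
times = [0.0, 20.0, 24.0, 44.0]
temps = [80.0, 140.0, 140.0, 80.0]
b_star, c_star = uht_indices(times, temps)
print(f"B* = {b_star:.1f}, C* = {c_star:.2f}")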

Relevance: 100.00%

Abstract:

The impact of novel labels on visual processing was investigated across two experiments with infants aged between 9 and 21 months. Infants viewed pairs of images across a series of preferential looking trials. On each trial, one image was novel, and the other image had previously been viewed by the infant. Some infants viewed images in silence; other infants viewed images accompanied by novel labels. The pattern of fixations both across and within trials revealed that infants in the labelling condition took longer to develop a novelty preference than infants in the silent condition. Our findings contrast with prior research by Robinson and Sloutsky (e.g., Robinson & Sloutsky, 2007a; Sloutsky & Robinson, 2008), who found that novel labels did not disrupt visual processing for infants aged over a year. Provided that overall task demands are sufficiently high, it appears that labels can disrupt visual processing for infants during the developmental period in which a lexicon is being established. The results suggest that when infants are processing labels and objects, attentional resources are shared across modalities.

Relevance: 100.00%

Abstract:

Models of normal word production are well specified about the effects of the frequency of linguistic stimuli on lexical access, but are less clear regarding the same effects on later stages of word production, particularly word articulation. In aphasia, this lack of specificity about downstream frequency effects is even more noticeable because there is a relatively limited amount of data on the time course of frequency effects for this population. This study begins to fill this gap by comparing the effects of variation in word frequency (lexical, whole word) and bigram frequency (sub-lexical, within word) on word production abilities in ten normal speakers and eight individuals with mild to moderate aphasia. In an immediate repetition paradigm, participants repeated single monosyllabic words in which word frequency (high or low) was crossed with bigram frequency (high or low). Indices for mapping the time course of these effects included reaction time (RT) for linguistic processing and motor preparation, and word duration (WD) for speech motor performance (word articulation time). The results indicated that individuals with aphasia had significantly longer RTs and WDs than normal speakers. RT showed a significant main effect only for word frequency (i.e., high-frequency words had shorter RTs). WD showed significant main effects of word and bigram frequency; however, contrary to our expectations, high-frequency items had longer WDs. Further investigation of WD revealed that, independent of the influence of word and bigram frequency, vowel type (tense or lax) had the expected effect on WD. Moreover, individuals with aphasia differed from control speakers in their ability to implement tense vowel duration, even though they could produce an appropriate distinction between tense and lax vowels. The results highlight the importance of using temporal measures to identify subtle deficits in linguistic and speech motor processing in aphasia, the crucial role of the phonetic characteristics of the stimulus set in studying speech production, and the need for language production models to account more explicitly for word articulation.

Relevance: 100.00%

Abstract:

Simulating spiking neural networks is of great interest to scientists wanting to model the functioning of the brain. However, large-scale models are expensive to simulate due to the number and interconnectedness of neurons in the brain. Furthermore, where such simulations are used in an embodied setting, the simulation must be real-time in order to be useful. In this paper we present NeMo, a platform for such simulations which achieves high performance through the use of highly parallel commodity hardware in the form of graphics processing units (GPUs). NeMo makes use of the Izhikevich neuron model which provides a range of realistic spiking dynamics while being computationally efficient. Our GPU kernel can deliver up to 400 million spikes per second. This corresponds to a real-time simulation of around 40 000 neurons under biologically plausible conditions with 1000 synapses per neuron and a mean firing rate of 10 Hz.
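NeMo's CUDA kernel is not reproduced in the abstract; as a rough sketch of the underlying neuron model, the update below implements the Izhikevich (2003) equations (v' = 0.04v^2 + 5v + 140 - u + I, u' = a(bv - u), with reset v <- c, u <- u + d when v reaches 30 mV) in NumPy rather than on a GPU. The parameter values, input current and sub-stepping are illustrative and are not taken from NeMo.

# Sketch of the Izhikevich point-neuron update that NeMo is described as using,
# written in NumPy rather than CUDA. Parameters are the standard "regular
# spiking" values; the noisy input current is illustrative only.
import numpy as np

def izhikevich_step(v, u, i_syn, a=0.02, b=0.2, c=-65.0, d=8.0, dt=1.0):
    """Advance membrane potential v (mV) and recovery variable u by dt ms in place."""
    for _ in range(2):                       # two half-steps for v, as in Izhikevich's code
        v += 0.5 * dt * (0.04 * v * v + 5.0 * v + 140.0 - u + i_syn)
    u += dt * a * (b * v - u)
    fired = v >= 30.0
    v[fired] = c                             # reset after a spike
    u[fired] += d
    return fired

# Drive 1000 neurons with noisy input for one simulated second and report the mean rate.
rng = np.random.default_rng(0)
v = np.full(1000, -65.0)
u = 0.2 * v
spikes = 0
for _ in range(1000):                        # 1000 steps of 1 ms
    spikes += izhikevich_step(v, u, rng.normal(5.0, 5.0, size=1000)).sum()
print(spikes / 1000.0, "spikes per neuron per second (mean firing rate, Hz)")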