983 results for Task-level parallelism
Abstract:
In this paper we summarise key elements of retail change in Britain over a twenty-year period. The time period is that covered by a funded study into long-term change in grocery shopping habits in Portsmouth, England. The major empirical findings—to which we briefly allude—are reported elsewhere: the present task is to assess the wider context underlying that change. For example, it has frequently been stated that retailing in the UK is not as competitive as in other leading economies. As a result, the issue of consumer choice has become increasingly important politically. Concerns over concentration in the industry, new format development and market definition have been expressed by local planners, competition regulators and consumer groups. Macro-level changes over time have also created market inequality in consumer opportunities at a local level—hence our decision to attempt a local-level study. Situational factors affecting consumer experiences over time at the local level involve the changing store choice sets available to particular consumers. Using actual consumer experiences thus becomes a yardstick for assessing the practical effectiveness of policy making. The paper demonstrates that choice at the local level is driven by store use and that different levels of provision reflect real choice at the local level. Macro-level policy and ‘one size fits all’ approaches to regulation, it is argued, do not reflect the changing reality of grocery shopping. Accordingly, arguments for a more local and regional approach to regulation are made.
The effective use of implicit parallelism through the use of an object-oriented programming language
Abstract:
This thesis explores translating well-written sequential programs in a subset of the Eiffel programming language - without syntactic or semantic extensions - into parallelised programs for execution on a distributed architecture. The main focus is on constructing two object-oriented models: a theoretical self-contained model of concurrency which enables a simplified second model for implementing the compiling process. There is a further presentation of principles that, if followed, maximise the potential levels of parallelism. Model of Concurrency. The concurrency model is designed to be a straightforward target for mapping sequential programs onto, thus making them parallel. It aids the compilation process by providing a high level of abstraction, including a useful model of parallel behaviour which enables easy incorporation of message interchange, locking, and synchronization of objects. Further, the model is sufficiently complete that a compiler can be, and has been, practically built. Model of Compilation. The compilation model's structure is based upon an object-oriented view of grammar descriptions and capitalises on both a recursive-descent style of processing and abstract syntax trees to perform the parsing. A composite-object view with an attribute grammar style of processing is used to extract sufficient semantic information for the parallelisation (i.e. code-generation) phase. Programming Principles. The set of principles presented are based upon information hiding, sharing and containment of objects and the dividing up of methods on the basis of a command/query division. When followed, the level of potential parallelism within the presented concurrency model is maximised. Further, these principles naturally arise from good programming practice. Summary. In summary this thesis shows that it is possible to compile well-written programs, written in a subset of Eiffel, into parallel programs without any syntactic additions or semantic alterations to Eiffel: i.e. 
no parallel primitives are added, and the parallel program is modelled to execute with equivalent semantics to the sequential version. If the programming principles are followed, a parallelised program achieves the maximum level of potential parallelisation within the concurrency model.
Abstract:
Task classification is introduced as a method for the evaluation of monitoring behaviour in different task situations. On the basis of an analysis of different monitoring tasks, a task classification system comprising four task 'dimensions' is proposed. The perceptual speed and flexibility of closure categories, which are identified with signal discrimination type, comprise the principal dimension in this taxonomy, the others being sense modality, the time course of events, and source complexity. It is also proposed that decision theory provides the most complete method for the analysis of performance in monitoring tasks. Several different aspects of decision theory in relation to monitoring behaviour are described. A method is also outlined whereby both accuracy and latency measures of performance may be analysed within the same decision theory framework. Eight experiments and an organizational study are reported. The results show that a distinction can be made between the perceptual efficiency (sensitivity) of a monitor and his criterial level of response, and that in most monitoring situations, there is no decrement in efficiency over the work period, but an increase in the strictness of the response criterion. The range of tasks exhibiting either or both of these performance trends can be specified within the task classification system. In particular, it is shown that a sensitivity decrement is only obtained for 'speed' tasks with a high stimulation rate. A distinctive feature of 'speed' tasks is that target detection requires the discrimination of a change in a stimulus relative to preceding stimuli, whereas in 'closure' tasks, the information required for the discrimination of targets is presented at the same point in time. In the final study, the specification of tasks yielding sensitivity decrements is shown to be consistent with a task classification analysis of the monitoring literature.
It is also demonstrated that the signal type dimension has a major influence on the consistency of individual differences in performance in different tasks. The results provide an empirical validation for the 'speed' and 'closure' categories, and suggest that individual differences are not completely task specific but are dependent on the demands common to different tasks. Task classification is therefore shown to enable improved generalizations to be made of the factors affecting 1) performance trends over time, and 2) the consistency of performance in different tasks. A decision theory analysis of response latencies is shown to support the view that criterion shifts are obtained in some tasks, while sensitivity shifts are obtained in others. The results of a psychophysiological study also suggest that evoked potential latency measures may provide temporal correlates of criterion shifts in monitoring tasks. Among other results, the finding that the latencies of negative responses do not increase over time is taken to invalidate arousal-based theories of performance trends over a work period. An interpretation in terms of expectancy, however, provides a more reliable explanation of criterion shifts. Although the mechanisms underlying the sensitivity decrement are not completely clear, the results rule out 'unitary' theories such as observing response and coupling theory. It is suggested that an interpretation in terms of the memory data limitations on information processing provides the most parsimonious explanation of all the results in the literature relating to sensitivity decrement. Task classification therefore enables the refinement and selection of theories of monitoring behaviour in terms of their reliability in generalizing predictions to a wide range of tasks. It is thus concluded that task classification and decision theory provide a reliable basis for the assessment and analysis of monitoring behaviour in different task situations.
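The abstract's central distinction between perceptual sensitivity and response criterion can be made concrete with the standard equal-variance signal detection computation. The hit and false-alarm rates below are hypothetical values chosen only to illustrate the pattern the thesis reports: a stricter criterion over the work period with no sensitivity decrement.

```python
from statistics import NormalDist

def dprime_and_criterion(hit_rate, fa_rate):
    """Estimate sensitivity (d') and response criterion (c) from hit and
    false-alarm rates under the equal-variance Gaussian model."""
    z = NormalDist().inv_cdf  # inverse standard normal CDF
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))
    return d_prime, criterion

# Early vs late in the watch: hits and false alarms both fall,
# so d' stays roughly constant while the criterion c rises.
early = dprime_and_criterion(0.80, 0.20)
late = dprime_and_criterion(0.63, 0.09)
```

With these illustrative rates, sensitivity barely changes while the criterion shifts upward, which is exactly the signature the thesis attributes to most monitoring tasks.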
Abstract:
Adaptability for distributed object-oriented enterprise frameworks is a critical mission for system evolution. Today, building adaptive services is a complex task due to lack of adequate framework support in the distributed computing environment. In this thesis, we propose a Meta Level Component-Based Framework (MELC) which uses distributed computing design patterns as components to develop an adaptable pattern-oriented framework for distributed computing applications. We describe our novel approach of combining a meta architecture with a pattern-oriented framework, resulting in an adaptable framework which provides a mechanism to facilitate system evolution. The critical nature of distributed technologies requires frameworks to be adaptable. Our framework employs a meta architecture. It supports dynamic adaptation of feasible design decisions in the framework design space by specifying and coordinating meta-objects that represent various aspects within the distributed environment. The meta architecture in the MELC framework can provide the adaptability for system evolution. This approach resolves the problem of dynamic adaptation in the framework, which is encountered in most distributed applications. The concept of using a meta architecture to produce an adaptable pattern-oriented framework for distributed computing applications is new and has not previously been explored in research. As the framework is adaptable, the proposed architecture of the pattern-oriented framework has the ability to dynamically adapt new design patterns to address technical system issues in the domain of distributed computing, and these patterns can be woven together to shape the framework in the future. We show how MELC can be used effectively to enable dynamic component integration and to separate system functionality from business functionality. We demonstrate how MELC provides an adaptable and dynamic run time environment using our system configuration and management utility. 
We also highlight how MELC provides significant adaptability in system evolution through a prototype E-Bookshop application that assembles its business functions with distributed computing components at the meta level in the MELC architecture. Our performance tests show that MELC does not entail prohibitive performance tradeoffs. The work to develop the MELC framework for distributed computing applications has emerged as a promising way to meet current and future challenges in the distributed environment.
Abstract:
This thesis analyses the impact of workplace stressors and mood on innovation activities. Based on three competitive frameworks offered by cognitive spreading activation theory, mood repair perspective, and mood-as-information theory, different sets of predictions are developed. These hypotheses are tested in a field study involving 41 R&D teams and 123 individual R&D workers, and in an experimental study involving 54 teams of students. Results of the field study suggest that stressors and mood interact to predict innovation activities in such a way that with increasing stressors a high positive (or negative) mood is more detrimental to innovation activities than a low positive (or negative) mood, lending support to the mood repair perspective. These effects are found for both individuals and teams. In the experimental study this effect is replicated and potential boundary conditions and mediators are tested. In addition, this thesis includes the development of an instrument to assess creativity and implementation activities within the realm of task-related innovative performance.
Abstract:
We introduce a type of 2-tier convolutional neural network model for learning distributed paragraph representations for a special task (e.g. paragraph or short document level sentiment analysis and text topic categorization). We decompose the paragraph semantics into three cascaded constituents: word representation, sentence composition and document composition. Specifically, we learn distributed word representations by a continuous bag-of-words model from a large unstructured text corpus. Then, using these word representations as pre-trained vectors, distributed task-specific sentence representations are learned from a sentence-level corpus with task-specific labels by the first tier of our model. Using these sentence representations as distributed paragraph representation vectors, distributed paragraph representations are learned from a paragraph-level corpus by the second tier of our model. The model is evaluated on the DBpedia ontology classification dataset and the Amazon review dataset. Empirical results show the effectiveness of our proposed learning model for generating distributed paragraph representations.
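The word → sentence → paragraph cascade described above can be sketched in a few lines. This is only a structural illustration: the paper's model uses learned convolutional tiers, whereas the sketch below substitutes simple vector averaging for each composition step, and the tiny vocabulary and its 2-dimensional vectors are made up.

```python
def average(vectors):
    """Element-wise mean of a list of equal-length vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

# Hypothetical pre-trained word vectors (the paper learns these with CBOW).
word_vecs = {
    "good": [0.9, 0.1], "movie": [0.2, 0.8],
    "bad": [-0.9, 0.2], "plot": [0.1, 0.7],
}

def sentence_vec(sentence):
    # Tier-1 stand-in: compose word vectors into a sentence vector.
    return average([word_vecs[w] for w in sentence.split()])

def paragraph_vec(sentences):
    # Tier-2 stand-in: compose sentence vectors into a paragraph vector.
    return average([sentence_vec(s) for s in sentences])

p = paragraph_vec(["good movie", "bad plot"])  # a 2-dimensional vector
```

The point is the cascade itself: each tier consumes the fixed-length outputs of the tier below, so the paragraph representation has the same dimensionality as the word vectors.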
Abstract:
Prior research linking demographic (e.g., age, ethnicity/race, gender, and tenure) and underlying psychological (e.g., personality, attitudes, and values) dissimilarity variables to individual group member's work-related outcomes produced mixed and contradictory results. To account for these findings, this study develops a contingency framework and tests it using meta-analytic and structural equation modelling techniques. In line with this framework, results showed different effects of surface-level (i.e., demographic) dissimilarity and deep-level (i.e., underlying psychological) dissimilarity on social integration, and ultimately on individual effectiveness related outcomes (i.e., turnover, task, and contextual performance). Specifically, surface-level dissimilarity had a negative effect on social integration under low but not under high team interdependence. In turn, social integration fully mediated the negative relationship between surface-level dissimilarity and individual effectiveness related outcomes under low interdependence. In contrast, deep-level dissimilarity had a negative effect on social integration, which was stronger under high and weaker under low team interdependence. Contrary to our predictions, social integration did not mediate the negative relationship between deep-level dissimilarity and individual effectiveness related outcomes but suppressed positive direct effects of deep-level dissimilarity on individual effectiveness related outcomes. Possible explanations for these counterintuitive findings are discussed. © 2011 The British Psychological Society.
Abstract:
Gender differences have been well established in verbal and spatial abilities but few studies have examined if these differences also extend into the domain of working memory in terms of behavioural differences and brain activation. The conclusions that can be drawn from these studies are not clear cut but suggest that even though gender differences might not be apparent from behavioural measures, the underlying neural substrate associated with working memory might be different in men and women. Previous research suggests activation in a network of frontal and parietal regions during working memory tasks. This study aimed to investigate gender differences in patterns of brain activation during a verbal version of the N-back working memory task, which incorporates the effects of increased demands on working memory. A total of 50 healthy subjects aged 18 to 58 years, equally split by gender and matched for age, level of education and ethnicity, were recruited. All subjects underwent functional magnetic resonance imaging. We found that men and women performed equally well in terms of accuracy and response times, while using similar brain regions to the same degree. Our observations indicate that verbal working memory is not affected by gender at the behavioural or neural level, and support the findings of a recent meta-analysis by Hyde ([2005]: Sex Roles 53:717-725) that gender differences are generally smaller than intra-gender differences in many cognitive domains. © 2009 Wiley-Liss, Inc.
Abstract:
The paper examines the policy responses in the UK West Midlands to the successive crises at the car maker MG-Rover. Whilst the firm’s eventual collapse in 2005 was a substantial shock to the West Midlands economy, the impact was much less than was anticipated when the firm was first threatened with closure in 2000 at the time of its break-up and sale by the German car firm BMW. Although the firm struggled as an independent producer, the five years of continued production until 2005 and the work of the initial Rover Task Force (RTF1) enabled many suppliers to adjust and diversify away from their hitherto dependence on MG-Rover, resulting in as many as 10,000–12,000 jobs being ‘saved’. This first intervention was later followed by a programme to help ex-workers to find new jobs or re-train and assist supply firms to continue trading in the short term. Examination of the effectiveness of these emergency initiatives enables a wider discussion about the nature of industrial policy in the region and the work of the local regional development agency’s cluster-based approach to economic development and business support. Whilst the actions taken were successful in a number of aspects, there were a number of significant ‘failures’ at both national and local level. The MG-Rover case also illustrates a number of critical issues pertaining to regionally based cluster policies and the organization of cluster management groups where the ‘cluster’ in question not only crosses both administrative and ‘sector’ boundaries but is also subject to the imperatives of the global car market.
Abstract:
The ability to hear a target signal over background noise is an important aspect of efficient hearing in everyday situations. This mechanism depends on binaural hearing whenever there are differences in the inter-aural timing of inputs from the noise and the signal. Impairments in binaural hearing may underlie some auditory processing disorders, for example temporal-lobe epilepsies. The binaural masking level difference (BMLD) measures the advantage in detecting a tone whose inter-aural phase differs from that of the masking noise. BMLDs are typically estimated psychophysically, but this is challenging in children or those with cognitive impairments. The aim of this doctorate is to design a passive measure of BMLD using magnetoencephalography (MEG) and test this in adults, children and patients with different types of epilepsy. The stimulus consists of Gaussian background noise with 500-Hz tones presented binaurally either in-phase or 180° out-of-phase between the ears. Source modelling provides the N1m amplitude for the in-phase and out-of-phase tones, representing the extent of signal perception over background noise. The passive BMLD stimulus is successfully used as a measure of binaural hearing capabilities in participants who would otherwise be unable to undertake a psychophysical task.
Abstract:
The major contribution of the study is the identification of a positive link between perceived effective managerial coaching (PEMC) and team task performance, and also the examination of PEMC adopting a multilevel research design and incorporating dual-source data. Specifically, drawing on social psychology, the thesis aims at developing and testing a comprehensive conceptual framework of the antecedents and consequences of PEMC for knowledge workers. The model takes into consideration intrapersonal, interpersonal and team-level characteristics, which relate to PEMC and subsequently associate with important work outcomes. In this regard, the thesis identifies PEMC as a practice of dual nature in that it may be experienced not only as a one-on-one workplace developmental interaction, but also as a managerial practice that is experienced by each member of a team for co-ordination purposes. Adopting a cross-sectional survey research design, the hypotheses are tested in three organisations in Greece and the UK. In particular, hierarchical linear modelling of 191 employees nested in 60 teams shows that employees’ learning goal orientation (LGO) and high-quality exchanges between an employee and a manager (LMX) are positively related to effective MC, while a manager’s LGO moderates the relationship between employees’ LGO and PEMC. In turn, PEMC, as a one-on-one practice, is related to cognitive outcomes, such as information sharing, while as a shared team practice it is also related to behavioural outcomes, including individual and team performance. Overall, the study contributes to a growing body of coaching and management literature that acknowledges PEMC as a core managerial practice.
Abstract:
Adaptability for distributed object-oriented enterprise frameworks in multimedia technology is a critical mission for system evolution. Today, building adaptive services is a complex task due to lack of adequate framework support in distributed computing systems. In this paper, we propose a Metalevel Component-Based Framework which uses distributed computing design patterns as components to develop an adaptable pattern-oriented framework for distributed computing applications. We describe our approach of combining a meta-architecture with a pattern-oriented framework, resulting in an adaptable framework which provides a mechanism to facilitate system evolution. This approach resolves the problem of dynamic adaptation in the framework, which is encountered in most distributed multimedia applications. The proposed architecture of the pattern-oriented framework has the ability to dynamically adapt new design patterns to address issues in the domain of distributed computing, and these patterns can be woven together to shape the framework in the future. © 2011 Springer Science+Business Media B.V.
Abstract:
Feature selection is important in the medical field for many reasons. However, selecting important variables is a difficult task in the presence of censoring, which is a unique feature of survival data analysis. This paper proposed an approach to deal with the censoring problem in endovascular aortic repair survival data through Bayesian networks. It was merged and embedded with a hybrid feature selection process that combines Cox's univariate analysis with machine learning approaches such as ensemble artificial neural networks to select the most relevant predictive variables. The proposed algorithm was compared with common survival variable selection approaches such as the least absolute shrinkage and selection operator (LASSO) and Akaike information criterion (AIC) methods. The results showed that it was capable of dealing with high censoring in the datasets. Moreover, ensemble classifiers increased the area under the ROC curves of the two datasets, collected separately from two centers located in the United Kingdom. Furthermore, ensembles constructed with center 1 enhanced the concordance index of center 2 prediction compared to the model built with a single network. Although the size of the final reduced model using the neural networks and its ensembles is greater than that of other methods, the model outperformed the others in both concordance index and sensitivity for center 2 prediction. This indicates the reduced model is more powerful for cross-center prediction.
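The concordance index used above to compare centers handles censoring by restricting attention to comparable pairs. A minimal sketch of Harrell's C, with made-up toy data (not the paper's), shows the mechanism: a pair counts only when the earlier observation is an actual event, so censored records contribute to comparisons but never anchor one.

```python
def concordance_index(times, events, risks):
    """Harrell's C: fraction of comparable pairs in which the subject
    with the higher predicted risk experiences the event earlier.
    A pair (i, j) is comparable only if i has an observed event
    (events[i] == 1) strictly before j's time."""
    concordant = comparable = 0.0
    n = len(times)
    for i in range(n):
        for j in range(n):
            if events[i] and times[i] < times[j]:
                comparable += 1
                if risks[i] > risks[j]:
                    concordant += 1      # higher risk, earlier event
                elif risks[i] == risks[j]:
                    concordant += 0.5    # tied predictions count half
    return concordant / comparable

# Toy data: event times, event indicator (0 = censored), model risk scores.
c = concordance_index([2, 4, 6, 8], [1, 1, 0, 1], [0.9, 0.6, 0.7, 0.2])
```

Here the censored subject (time 6) never anchors a comparison, and one mis-ordered pair out of five comparable pairs yields C = 0.8.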
Abstract:
In recent years there have been a number of high-profile plant closures in the UK. In several cases, the policy response has included setting up a task force to deal with the impacts of the closure. It can be hypothesised that a task force involving multi-level working across territorial boundaries and tiers of government is crucial to devising a policy response tailored to people's needs and to ensuring success in dealing with the immediate impacts of a closure. This suggests that leadership and vision, partnership working and community engagement, and delivery of high-quality services are important. This paper looks at the case of the MG Rover closure in 2005, to examine the extent to which the policy response to the closure at the national, regional and local levels dealt effectively with the immediate impacts of the closure, and the lessons that can be learned from the experience. Such lessons are of particular relevance given the closure of the LDV van plant in Birmingham in 2009 and more broadly – such as in the case of the downsizing of the Opel operation in Europe following its takeover by Magna.
Abstract:
Today, modern System-on-a-Chip (SoC) systems have grown rapidly due to the increased processing power, while maintaining the size of the hardware circuit. The number of transistors on a chip continues to increase, but current SoC designs may not be able to exploit the potential performance, especially with energy consumption and chip area becoming two major concerns. Traditional SoC designs usually separate software and hardware. Thus, the process of improving the system performance is a complicated task for both software and hardware designers. The aim of this research is to develop a hardware acceleration workflow for software applications, so that system performance can be improved under constraints on energy consumption and on-chip resource costs. The characteristics of software applications can be identified by using profiling tools. Hardware acceleration can deliver significant performance improvements for highly mathematical calculations or repeated functions. The performance of SoC systems can then be improved if the hardware acceleration method is used to accelerate the element that incurs performance overheads. The concepts mentioned in this study can be easily applied to a variety of sophisticated software applications. The contributions of SoC-based hardware acceleration in the hardware-software co-design platform include the following: (1) Software profiling methods are applied to the H.264 Coder-Decoder (CODEC) core. The hotspot function of the target application is identified by using critical attributes such as cycles per loop, loop rounds, etc. (2) A hardware acceleration method based on Field-Programmable Gate Array (FPGA) is used to resolve system bottlenecks and improve system performance. The identified hotspot function is then converted to a hardware accelerator and mapped onto the hardware platform. Two types of hardware acceleration methods, central bus design and co-processor design, are implemented for comparison in the proposed architecture. 
(3) System specifications, such as performance, energy consumption, and resource costs, are measured and analyzed. The trade-off of these three factors is compared and balanced. Different hardware accelerators are implemented and evaluated based on system requirements. (4) The system verification platform is designed based on the Integrated Circuit (IC) workflow. Hardware optimization techniques are used for higher performance and lower resource costs. Experimental results show that the proposed hardware acceleration workflow for software applications is an efficient technique. With the Bus-IP design, the system achieves a 2.8X performance improvement and a 31.84% energy saving; the co-processor design achieves a 7.9X performance improvement and a 75.85% energy saving.
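The payoff of accelerating only a profiled hotspot, as in the workflow above, is bounded by Amdahl's law. The sketch below uses hypothetical numbers (a hotspot taking 90% of runtime, accelerated 20x in hardware), not the thesis's measured 2.8X and 7.9X figures, to show why the hotspot's share of total runtime dominates the achievable speedup.

```python
def amdahl_speedup(hotspot_fraction, accel_factor):
    """Overall speedup when only the hotspot, which accounts for
    hotspot_fraction of total runtime, runs accel_factor times faster
    on the hardware accelerator; the rest of the code is unchanged."""
    return 1.0 / ((1.0 - hotspot_fraction) + hotspot_fraction / accel_factor)

# Hypothetical profile: hotspot is 90% of runtime, FPGA runs it 20x faster.
overall = amdahl_speedup(0.9, 20)
```

Even an arbitrarily fast accelerator cannot beat 1 / (1 - hotspot_fraction) overall, which is why profiling to find the true hotspot is the first step of the workflow.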