19 results for task analysis

at QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast


Relevance:

60.00%

Publisher:

Abstract:

Advance knowledge of serviceability times is a critical factor in quantifying the after-sales service costs of a vehicle. Predetermined motion time systems are frequently used to set labor rates in industry by quantifying the amount of time required to perform specific tasks. The first such system is known as Methods-Time Measurement (MTM). Several variants of MTM have been developed, differing from one another in their level of detail. Among them, MTM-UAS is suitable for processes that average around 1-3 min. However, experimental tests carried out by the authors at Elasis (the Research Center of the FIAT Group) demonstrate that MTM-UAS is not the optimal approach for measuring serviceability times, because it does not take ergonomic factors into account. In the present paper the authors propose to correct the MTM-UAS method by including the study of human postures and efforts in the task analysis. The proposed approach makes it possible to estimate, with an "acceptable" error, the time needed to perform maintenance tasks from the first phases of product design, by working on digital mock-ups and human models in a virtual environment. As a by-product of that analysis, a list of maintenance times can be obtained so that after-sales service costs can be set in advance. © 2012 Springer-Verlag.
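
A minimal sketch of the kind of correction described, with entirely invented multiplier values (the paper's actual factor tables are not reproduced here): a base MTM-UAS time is scaled by posture and effort factors obtained from an ergonomic assessment of the maintenance task.

```python
# Hypothetical illustration only: the factor values below are invented,
# not taken from the paper or from MTM-UAS tables.
POSTURE_FACTOR = {'standing': 1.00, 'bent': 1.15, 'kneeling': 1.25, 'lying': 1.40}
EFFORT_FACTOR = {'light': 1.00, 'moderate': 1.10, 'heavy': 1.30}

def corrected_task_time(base_tmu, posture, effort):
    """Scale a base MTM-UAS time (in TMU; 1 TMU = 0.036 s) by ergonomic factors."""
    return base_tmu * POSTURE_FACTOR[posture] * EFFORT_FACTOR[effort]

# e.g. a 500-TMU removal step performed kneeling with moderate effort
seconds = corrected_task_time(500, 'kneeling', 'moderate') * 0.036
print(f"{seconds:.1f} s")  # 24.8 s
```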

Relevance:

40.00%

Publisher:

Abstract:

We present BDDT, a task-parallel runtime system that dynamically discovers and resolves dependencies among parallel tasks. BDDT allows the programmer to specify detailed task footprints on any memory address range, multidimensional array tile or dynamic region. BDDT uses a block-based dependence analysis with arbitrary granularity. The analysis is applicable to existing C programs without having to restructure object or array allocation, and provides flexibility in array layouts and tile dimensions.
We evaluate BDDT using a representative set of benchmarks and compare it to SMPSs (the equivalent runtime system in StarSs) and OpenMP. BDDT performs comparably to or better than SMPSs and can cope with task granularity as much as one order of magnitude finer than SMPSs. Compared to OpenMP, BDDT performs up to 3.9× better for benchmarks that benefit from dynamic dependence analysis. BDDT provides additional data annotations to bypass dependence analysis; using these annotations, BDDT also outperforms OpenMP in benchmarks where dependence analysis does not discover additional parallelism, thanks to a more efficient implementation of the runtime system.
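
The block-based idea can be illustrated with a short Python sketch; this is our own simplification, not BDDT's implementation, and the block size, footprint format and mode names are assumptions.

```python
BLOCK = 64  # block granularity in bytes (hypothetical)

def blocks(start, size):
    """Yield the block indices covered by an address range."""
    return range(start // BLOCK, (start + size - 1) // BLOCK + 1)

class DependenceTracker:
    def __init__(self):
        self.last_writer = {}   # block index -> id of last writing task
        self.readers = {}       # block index -> set of reading task ids

    def submit(self, task, footprint):
        """footprint: list of (start, size, mode), mode in {'in', 'out', 'inout'}."""
        deps = set()
        for start, size, mode in footprint:
            for b in blocks(start, size):
                w = self.last_writer.get(b)
                if w is not None:
                    deps.add(w)                          # RAW / WAW dependence
                if mode in ('out', 'inout'):
                    deps |= self.readers.get(b, set())   # WAR dependence
                    self.last_writer[b] = task
                    self.readers[b] = set()
                if mode in ('in', 'inout'):
                    self.readers.setdefault(b, set()).add(task)
        deps.discard(task)
        return deps  # tasks that must complete before `task` may run

t = DependenceTracker()
print(t.submit('T1', [(0, 128, 'out')]))   # set(): no predecessors
print(t.submit('T2', [(64, 64, 'in')]))    # {'T1'}: reads a block T1 wrote
```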

Relevance:

40.00%

Publisher:

Abstract:

Processor architectures have taken a turn towards many-core processors, which integrate multiple processing cores on a single chip to increase overall performance, and there are no signs that this trend will stop in the near future. Many-core processors are harder to program than multi-core and single-core processors because of the need to write parallel or concurrent programs with high degrees of parallelism. Moreover, many-cores have to operate in a mode of strong scaling because of memory bandwidth constraints. In strong scaling, increasingly fine-grained parallelism must be extracted in order to keep all processing cores busy.

Task dataflow programming models have a high potential to simplify parallel programming because they relieve the programmer of precisely identifying all inter-task dependences when writing programs. Instead, the task dataflow runtime system detects and enforces inter-task dependences during execution based on a description of the memory each task accesses. The runtime constructs a task dataflow graph that captures all tasks and their dependences. Tasks are scheduled to execute in parallel, taking into account the dependences specified in the task graph.

Several papers report significant overheads for task dataflow systems, which severely limit the scalability and usability of such systems. In this paper we study efficient schemes to manage task graphs and analyze their scalability. We assume a programming model that supports input, output and in/out annotations on task arguments, as well as commutative in/out and reductions. We analyze the structure of task graphs and identify versions and generations as key concepts for their efficient management. We then present three schemes to manage task graphs, building on graph representations, hypergraphs and lists. We also consider a fourth, edge-less scheme that synchronizes tasks using integers. Analysis using micro-benchmarks shows that the graph representation is not always scalable and that the edge-less scheme introduces the least overhead in nearly all situations.
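
The edge-less idea can be pictured with a small Python sketch built on per-object integer counters rather than stored edges; this reflects our own simplification (tasks are ordered against prior writers only, and reader-before-writer ordering is omitted), not the paper's actual scheme.

```python
class ObjectState:
    """Per-object counters replacing explicit dependence edges."""
    def __init__(self):
        self.issued = 0      # write tickets handed out so far
        self.completed = 0   # writers that have finished

def submit(task_args, objects):
    """task_args: list of (object name, mode); returns (waits, releases)."""
    waits, releases = [], []
    for obj, mode in task_args:
        st = objects[obj]
        waits.append((obj, st.issued))  # wait for all previously issued writes
        if mode in ('out', 'inout'):
            st.issued += 1              # take a write ticket
            releases.append(obj)
    return waits, releases

def ready(waits, objects):
    # runnable once each object's completed count reaches the recorded ticket
    return all(objects[o].completed >= t for o, t in waits)

def complete(releases, objects):
    for o in releases:
        objects[o].completed += 1

objects = {'a': ObjectState()}
w1, r1 = submit([('a', 'out')], objects)   # writer: waits for 0 prior writes
w2, r2 = submit([('a', 'in')], objects)    # reader: waits for 1 completed write
print(ready(w2, objects))                  # False until the writer completes
complete(r1, objects)
print(ready(w2, objects))                  # True
```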

Relevance:

40.00%

Publisher:

Abstract:

Safety on public transport is a major concern for the relevant authorities. We address this issue by proposing an automated surveillance platform which combines data from video, infrared and pressure sensors. Data homogenisation and integration is achieved by a distributed architecture based on communication middleware that resolves interconnection issues, thereby enabling data modelling. A common-sense knowledge base models and encodes knowledge about public-transport platforms and the actions and activities of passengers. Trajectory data from passengers are modelled as a time series of human activities. Common-sense knowledge and rules are then applied to detect inconsistencies or errors in the data interpretation. Lastly, the rationality that characterises human behaviour is also captured through a bottom-up Hierarchical Task Network planner that, along with common-sense reasoning, corrects misinterpretations to explain passenger behaviour. The system is validated using a simulated bus saloon scenario as a case study. Eighteen video sequences were recorded with up to six passengers. Four metrics were used to evaluate performance. The system, with an accuracy greater than 90% for each of the four metrics, was found to outperform a rule-based system and a system based on planning alone.
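
As a toy illustration of the common-sense filtering step (the activity labels and transition rules below are hypothetical, not the paper's knowledge base), a Python transition table over recognised activities can flag interpretations that skip a physically necessary intermediate action:

```python
# Hypothetical activity labels and allowed transitions, for illustration only.
VALID_NEXT = {
    'enter': {'walk'},
    'walk': {'sit', 'stand', 'exit', 'walk'},
    'sit': {'stand', 'sit'},
    'stand': {'walk', 'sit', 'stand'},
}

def flag_inconsistencies(activities):
    """activities: labels ordered by time; returns indices of suspect steps."""
    return [i for i in range(1, len(activities))
            if activities[i] not in VALID_NEXT.get(activities[i - 1], set())]

# 'sit' -> 'walk' skips the necessary 'stand', so index 3 is flagged
print(flag_inconsistencies(['enter', 'walk', 'sit', 'walk']))  # [3]
```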

Relevance:

30.00%

Publisher:

Abstract:

Query processing over the Internet involving autonomous data sources is a major task in data integration. It requires estimates of the costs of candidate queries in order to select the one with the minimum cost. In this context, the cost of a query is affected by three factors: network congestion, server contention state, and the complexity of the query. In this paper, we study the effects of both network congestion and server contention state on the cost of a query; we refer to these two factors together as system contention states. We present a new approach to determining the system contention states by clustering the costs of a sample query. For each system contention state, we construct two cost formulas, for unary and join queries respectively, using multiple regression. When a new query is submitted, its system contention state is estimated first, using either the time slides method or the statistical method. The cost of the query is then calculated using the corresponding cost formulas, and the estimate is further adjusted to improve its accuracy. Our experiments show that our methods can produce quite accurate cost estimates for queries submitted to remote data sources over the Internet.
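
A compact Python sketch of the two-step scheme, with placeholder data standing in for real measurements (the features, cluster count and sample values are all assumptions): costs of a repeated probe query are clustered to identify contention states, then one regression cost model is fitted per state and selected at query time.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

# hypothetical costs of the same probe query observed at different times;
# clustering separates low- and high-contention system states
probe_costs = np.array([[0.8], [0.9], [2.1], [2.3], [0.85], [2.2]])
states = KMeans(n_clusters=2, n_init=10).fit(probe_costs)

rng = np.random.default_rng(0)
models = {}
for s in range(2):
    X_s = rng.random((20, 2))                      # placeholder query features
    y_s = X_s @ np.array([1.0, 2.0]) + 1.5 * s     # placeholder observed costs
    models[s] = LinearRegression().fit(X_s, y_s)   # per-state cost formula

# at query time: classify the current contention state, then estimate cost
state = int(states.predict([[2.0]])[0])
print(models[state].predict([[0.3, 0.7]]))
```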

Relevance:

30.00%

Publisher:

Abstract:

Over the past ten years, a variety of microRNA target prediction methods has been developed, and many of the methods are constantly improved and adapted to recent insights into miRNA-mRNA interactions. In a typical scenario, different methods return different rankings of putative targets, even if the ranking is reduced to selected mRNAs that are related to a specific disease or cell type. For experimental validation it is then difficult to decide in which order to process the predicted miRNA-mRNA bindings, since each validation is a laborious task and therefore only a limited number of mRNAs can be analysed. We propose a new ranking scheme that combines ranked predictions from several methods and, unlike standard thresholding methods, utilises the concept of Pareto fronts as defined in multi-objective optimisation. In the present study, we attempt a proof of concept by applying the new ranking scheme to hsa-miR-21, hsa-miR-125b, and hsa-miR-373 and prediction scores supplied by PITA and RNAhybrid. The scores are interpreted as a two-objective optimisation problem, and the elements of the Pareto front are ranked by the STarMir score, with a subsequent re-calculation of the Pareto front after removal of the top-ranked mRNA from the basic set of prediction scores. The method is evaluated on validated targets of the three miRNAs, and the ranking is compared to scores from DIANA-microT and TargetScan. We observed that the new ranking method performs well and consistently, and the first validated targets are elements of Pareto fronts at a relatively early stage of the recurrent procedure, which encourages further research towards a higher-dimensional analysis of Pareto fronts. (C) 2010 Elsevier Ltd. All rights reserved.
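
A short Python sketch of the recurrent Pareto-front ranking, under the assumption that lower scores are better and with a generic tiebreak dictionary standing in for the STarMir score: compute the front, promote its best element by tiebreak, remove it, and repeat.

```python
def dominates(q, p):
    """q dominates p if it is no worse in every objective and better in one."""
    return all(a <= b for a, b in zip(q, p)) and any(a < b for a, b in zip(q, p))

def pareto_front(points):
    return [i for i, p in enumerate(points)
            if not any(dominates(q, p) for j, q in enumerate(points) if j != i)]

def recurrent_ranking(scores, tiebreak):
    """scores: {mrna: (score1, score2)}; tiebreak: {mrna: value}, lower = better."""
    remaining, ranking = dict(scores), []
    while remaining:
        names = list(remaining)
        front = [names[i] for i in pareto_front([remaining[n] for n in names])]
        best = min(front, key=lambda n: tiebreak[n])  # e.g. a STarMir-like score
        ranking.append(best)
        del remaining[best]                           # front is recomputed next pass
    return ranking

# hypothetical two-method scores for four candidate targets
scores = {'m1': (0.1, 0.9), 'm2': (0.2, 0.2), 'm3': (0.9, 0.1), 'm4': (0.5, 0.5)}
print(recurrent_ranking(scores, {'m1': 3, 'm2': 1, 'm3': 2, 'm4': 4}))
# ['m2', 'm3', 'm1', 'm4']
```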

Relevance:

30.00%

Publisher:

Abstract:

Purpose – The purpose of this paper is to identify, clarify and tabulate the various managerial issues encountered, to aid in the management of the complex health and safety concerns which occur within a confined construction site environment.

Design/methodology/approach – This is achieved by conducting extensive qualitative and quantitative research in the form of case studies, interviews and a questionnaire survey.

Findings – The leading managerial issues in the management of health and safety on a confined construction site are found to be: “Difficulty to move materials around site safely”; “Lack of adequate room for the effective handling of materials”; “Difficulty in ensuring site is tidy and all plant and materials are stored safely”; “Close proximity of individuals to operation of large plant and machinery”; and joint fifth “Difficulty in ensuring proper arrangement and collection of waste materials on-site” along with “Difficulty in controlling hazardous materials and equipment on site”.

Practical implications – The practical implication of these results is that, with the sustained development of urban centres on a global scale, coupled with the increasing complexity of architectural designs, the majority of on-site project management professionals are faced with the onerous task of completing often intricate designs within a limited spatial environment, under strict health and safety parameters.

Originality/value – The value of the findings is that, provided on-site management professionals successfully identify the various managerial issues highlighted, the successful management of health and safety on a confined construction site is attainable.

Relevance:

30.00%

Publisher:

Abstract:

The inherent difficulty of thread-based shared-memory programming has recently motivated research in high-level, task-parallel programming models. Recent advances in task-parallel models add implicit synchronization, where the system automatically detects and satisfies data dependencies among spawned tasks. However, dynamic dependence analysis incurs significant runtime overheads, because the runtime must track task resources and use this information to schedule tasks while avoiding conflicts and races.
We present SCOOP, a compiler that effectively integrates static and dynamic analysis in code generation. SCOOP combines context-sensitive points-to, control-flow, escape, and effect analyses to remove redundant dependence checks at runtime. Our static analysis can work in combination with existing dynamic analyses and task-parallel runtimes that use annotations to specify tasks and their memory footprints. We use our static dependence analysis to detect non-conflicting tasks and an existing dynamic analysis to handle the remaining dependencies. We evaluate the resulting hybrid dependence analysis on a set of task-parallel programs.
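
The hybrid can be pictured with a toy Python runtime, under our own simplifications (per-object rather than per-range footprints, and a boolean standing in for the compiler's verdict); this is not SCOOP's code generation. Tasks the static analysis proves conflict-free are spawned without ever touching the dynamic dependence tracker.

```python
class Runtime:
    """Toy runtime: dynamic tracking only for tasks the compiler could not prove safe."""
    def __init__(self):
        self.last_writer = {}   # object -> id of the last task that wrote it

    def spawn(self, task, footprint, statically_independent):
        if statically_independent:
            return set()        # compiler proved no conflicts: skip all checks
        deps = set()
        for obj, mode in footprint:
            if obj in self.last_writer:
                deps.add(self.last_writer[obj])  # order after the last writer
            if mode in ('out', 'inout'):
                self.last_writer[obj] = task
        return deps             # predecessors the scheduler must wait for

rt = Runtime()
print(rt.spawn('T1', [('A', 'out')], statically_independent=False))  # set()
print(rt.spawn('T2', [('A', 'in')], statically_independent=False))   # {'T1'}
print(rt.spawn('T3', [('B', 'out')], statically_independent=True))   # set(), untracked
```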

Relevance:

30.00%

Publisher:

Abstract:

The REsearch on a CRuiser Enabled Air Transport Environment (RECREATE) project considers the introduction and airworthiness of cruiser-feeder operations for civil aircraft. Cruiser-feeder operations are investigated as a promising pioneering idea for the air transport of the future. The soundness of the concept can be understood by taking air-to-air refueling operations as an example. For this example, a comprehensive estimate of the benefits can be made, which shows a fuel burn and CO2 emission reduction potential of 31% for a typical 6000 nautical mile flight with a payload of 250 passengers. This reduction potential is large by any standard. The top-level objective of the RECREATE project is to demonstrate at a preliminary design level that cruiser-feeder operations (as a concept to reduce fuel burn and CO2 emission levels) can be shown to comply with the airworthiness requirements for civil aircraft. The underlying Scientific and Technological (S&T) objectives are to determine and study airworthy operational concepts for cruiser-feeder operations, and to derive and quantify their benefits in terms of CO2 emission reduction as well as other benefits.

Work Package (WP) 3 has the objective of substantiating the assumed benefits of the cruiser/feeder operations through refined analysis and simulation. In this report, an initial benefits evaluation of the initial RECREATE cruiser/feeder concepts is presented. The benefits analysis is conducted in delta mode, i.e. a comparison is made with a baseline system. Since comparing different aircraft and air transport systems is never a trivial task, appropriate measures and metrics are defined and selected first. Non-dimensional parameters are defined and values for the baseline system derived.

The impact of cruiser/feeder operations such as air-to-air refueling is studied with respect to fuel burn (and hence carbon dioxide emissions), noise and congestion. For this purpose, traffic simulations have been conducted.
Cruiser/feeder operations will also have an impact on dispatch reliability. An initial assessment of this effect has been made and is reported.

Finally, a considerable effort has been made to create the infrastructure for economic delta analysis of the cruiser/feeder concept of operation. First results of the cost analysis have been obtained.

Relevance:

30.00%

Publisher:

Abstract:

Integrating analysis and design models is a complex task due to differences between the models and the architectures of the toolsets used to create them. This complexity is increased with the use of many different tools for specific tasks during an analysis process. In this work various design and analysis models are linked throughout the design lifecycle, allowing them to be moved between packages in a way not currently available. Three technologies named Cellular Modeling, Virtual Topology and Equivalencing are combined to demonstrate how different finite element meshes generated on abstract analysis geometries can be linked to their original geometry. Establishing the equivalence relationships between models enables analysts to utilize multiple packages for specialist tasks without worrying about compatibility issues or rework.

Relevance:

30.00%

Publisher:

Abstract:

Integrating analysis and design models is a complex task due to differences between the models and the architectures of the toolsets used to create them. This complexity is increased with the use of many different tools for specific tasks during an analysis process. In this work various design and analysis models are linked throughout the design lifecycle, allowing them to be moved between packages in a way not currently available. Three technologies named Cellular Modeling, Virtual Topology and Equivalencing are combined to demonstrate how different finite element meshes generated on abstract analysis geometries can be linked to their original geometry. Cellular models allow interfaces between adjacent cells to be extracted and exploited to transfer analysis attributes such as mesh associativity or boundary conditions between equivalent model representations. Virtual Topology descriptions used for geometry clean-up operations are explicitly stored so they can be reused by downstream applications. Establishing the equivalence relationships between models enables analysts to utilize multiple packages for specialist tasks without worrying about compatibility issues or substantial rework.
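
A minimal Python sketch of the equivalencing idea, with hypothetical entity identifiers (none of these names come from the paper): record which original CAD faces each abstract analysis face stands for, so that mesh associativity can be traced back across model representations.

```python
# Hypothetical IDs throughout: 'A-face-*' are abstract analysis faces,
# 'cad-face-*' are original geometry faces, 'elem-*' are mesh elements.
equivalence = {
    # abstract analysis face -> the original CAD faces it merges
    'A-face-1': {'cad-face-3', 'cad-face-4'},   # virtual face from clean-up
    'A-face-2': {'cad-face-7'},
}

mesh_associativity = {'elem-10': 'A-face-1', 'elem-11': 'A-face-2'}

def elements_on_cad_face(cad_face):
    """Find mesh elements that ultimately rest on a given original face."""
    return [e for e, a in mesh_associativity.items()
            if cad_face in equivalence[a]]

print(elements_on_cad_face('cad-face-3'))  # ['elem-10']
```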

Relevance:

30.00%

Publisher:

Abstract:

Objective: To systematically review the evidence examining effects of walking interventions on pain and self-reported function in individuals with chronic musculoskeletal pain.
Data Sources: Six electronic databases (Medline, CINAHL, PsycINFO, PEDro, SPORTDiscus and the Cochrane Central Register of Controlled Trials) were searched from January 1980 to March 2014.
Study Selection: Randomized and quasi-randomized controlled trials in adults with chronic low back pain, osteoarthritis or fibromyalgia comparing walking interventions to a non-exercise or non-walking exercise control group.
Data Extraction: Data were independently extracted using a standardized form. Methodological quality was assessed using the United States Preventive Services Task Force (USPSTF) system.
Data Synthesis: Twenty-six studies (2384 participants) were included, and suitable data from 17 were pooled for meta-analysis, with a random-effects model used to calculate between-group mean differences and 95% confidence intervals. Data were analyzed according to length of follow-up (short-term: ≤8 weeks post randomization; medium-term: >2 months to 12 months; long-term: >12 months). Interventions were associated with small to moderate improvements in pain at short-term (mean difference (MD) -5.31, 95% confidence interval (95% CI) -8.06 to -2.56) and medium-term follow-up (MD -7.92, 95% CI -12.37 to -3.48). Improvements in function were observed at short-term (MD -6.47, 95% CI -12.00 to -0.95), medium-term (MD -9.31, 95% CI -14.00 to -4.61) and long-term follow-up (MD -5.22, 95% CI -7.21 to -3.23).
Conclusions: Evidence of fair methodological quality suggests that walking is associated with significant improvements in outcome compared with control interventions, but longer-term effectiveness is uncertain. Using the USPSTF system, walking can be recommended as an effective form of exercise or activity for individuals with chronic musculoskeletal pain, but it should be supplemented with strategies aimed at maintaining participation. Further work is also required to examine effects on important health-related outcomes in this population in robustly designed studies.
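
For readers unfamiliar with the pooling step, here is a compact Python sketch of DerSimonian-Laird random-effects pooling, the standard method behind the random-effects model mentioned above; the study-level numbers are invented, not taken from the review.

```python
import math

def dersimonian_laird(md, se):
    """md: per-study mean differences; se: their standard errors."""
    w = [1 / s**2 for s in se]                          # fixed-effect weights
    fe = sum(wi * m for wi, m in zip(w, md)) / sum(w)
    q = sum(wi * (m - fe)**2 for wi, m in zip(w, md))   # heterogeneity Q
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(md) - 1)) / c)            # between-study variance
    w_re = [1 / (s**2 + tau2) for s in se]              # random-effects weights
    pooled = sum(wi * m for wi, m in zip(w_re, md)) / sum(w_re)
    se_pooled = math.sqrt(1 / sum(w_re))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

# hypothetical study-level mean differences in pain score, with standard errors
print(dersimonian_laird([-5.0, -8.0, -3.5], [1.8, 2.2, 1.5]))
```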

Relevance:

30.00%

Publisher:

Abstract:

We present the results of exploratory experiments using lexical valence extracted from brain activity recorded with electroencephalography (EEG) for sentiment analysis. We selected 78 English words (36 for training and 42 for testing), presented as stimuli to three native English speakers. EEG signals were recorded from the subjects while they performed a mental imaging task for each word stimulus. Wavelet decomposition was employed to extract EEG features from the time-frequency domain. The extracted features were used as inputs to a sparse multinomial logistic regression (SMLR) classifier for valence classification, after univariate ANOVA feature selection. After mapping EEG signals to sentiment valences, we exploited the lexical polarity extracted from the brain data to predict the valence of 12 sentences taken from the SemEval-2007 shared task, and compared it against existing lexical resources.
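
A minimal Python sketch of the classification stage, under assumed array shapes and with scikit-learn's L1-penalised logistic regression standing in for SMLR (the feature counts, labels and random data are placeholders):

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.standard_normal((36, 200))   # placeholder wavelet features per word
y = rng.integers(0, 3, size=36)      # placeholder valence labels

clf = make_pipeline(
    SelectKBest(f_classif, k=50),    # univariate ANOVA feature selection
    LogisticRegression(penalty='l1', solver='saga',  # sparse multinomial stand-in
                       max_iter=5000),
)
clf.fit(X, y)
print(clf.predict(X[:5]))
```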

Relevance:

30.00%

Publisher:

Abstract:

This paper is prompted by the widespread acceptance that the rates of inter-county and inter-state migration have been falling in the USA, and sets itself the task of examining whether migration intensities have also been declining in the UK. It uses annual inter-area migration matrices available for England and Wales since the 1970s by broad age group. The main methodological challenge, arising from changes in the geography of the health areas for which the inter-area flows are given, is addressed by adopting the lowest common denominator of 80 areas. Care is also taken to allow for the effect of economic cycles in producing short-term fluctuations in migration rates and to isolate the effect of a sharp rise in rates for 16-24 year olds in the 1990s, which is presumed to be related to the expansion of higher education. The findings suggest that, unlike in the USA, there has not been a substantial decline in the intensity of internal migration between the first two decades of the study period and the second two. If there has been any major decline in the intensity of address changing in England and Wales, it can only be for the within-area moves that this time series does not cover. This latter possibility is examined in a companion paper using a very different data set (Champion and Shuttleworth, 2016).