889 results for Task partitioning
Abstract:
The point of departure for this dissertation was the practical safety problem of unanticipated, unfamiliar events and unexpected changes in the environment: the demanding situations that operators must handle in complex socio-technical systems. The aim of the thesis was to increase understanding of demanding situations and of the resources for coping with them by presenting a new construct, a conceptual model called Expert Identity (ExId), as a way to open up new solutions to the problem of demanding situations, and by testing the model in empirical studies on operator work. The premises of the Core-Task Analysis (CTA) framework were adopted as a starting point: core-task oriented working practices promote system efficiency (including safety, productivity and well-being targets) and should be supported. The negative effects of stress were summarised, and possible countermeasures related to the operators' personal resources, such as experience, expertise, sense of control, and conceptions of work and self, were considered. ExId was proposed as a way to bring emotional-energetic depth into work analysis, to supplement CTA-based practical methods for discovering development challenges, and to contribute to the development of complex socio-technical systems. The potential of ExId to promote understanding of operator work was demonstrated in the context of six empirical studies on operator work, each of which had its own practical objectives within its quite broad focus. The concluding research questions were: 1) Are the assumptions made in ExId on the basis of the different theories and previous studies supported by the empirical findings? 2) Does the ExId construct promote understanding of operator work in empirical studies? 3) What are the strengths and weaknesses of the ExId construct? The layers and the assumptions of the development of expert identity gained empirical support. The new conceptual model worked as part of the analysis of different kinds of data, as part of different methods used for different purposes, and in different work contexts. The results showed that the operators had problems in taking care of the core task, resulting from the discrepancy between demands and resources (either personal or external). Changes in the work, difficulties in reaching the real content of work within the organisation, and the limits of the practical means of support had complicated the problem and limited the possibilities for development actions within the case organisations. Personal resources seemed to be sensitive to the changes; adaptation was taking place, but not deeply or quickly enough. Furthermore, the results showed several characteristics of the studied contexts that complicated the operators' possibilities to grow into or with the demands and to develop practices, expertise and an expert identity matching the core task: discontinuity of the work demands; discrepancies between the conceptions of work held in other parts of the organisation, the visions, and the reality faced by the operators; and an emphasis on individual efforts and situational solutions. The potential of ExId to open up new paths to solving the problem of demanding situations, and its ability to enable studies on practices in the field, was considered in the discussion. The results were interpreted as promising enough to encourage further studies on ExId.
This dissertation aims especially to contribute to supporting workers in recognising changing demands and their possibilities for growing with them, when the goal is to support human performance in complex socio-technical systems, both in designing the systems and in solving existing problems.
Abstract:
The freshwater sawfish (Pristis microdon) is a critically endangered elasmobranch. Ontogenetic changes in the habitat use of juvenile P. microdon were studied using acoustic tracking in the Fitzroy River, Western Australia. Habitat partitioning was significant between 0+ (2007 year class) and larger 1+ (2006 year class) P. microdon. Smaller 0+ fish generally occupied shallower water (<0.6 m) compared with 1+ individuals, which mainly occurred in depths >0.6 m. Significant differences in hourly depth use were also revealed. The depth that 1+ P. microdon occupied was significantly influenced by lunar phase with these animals utilising a shallower and narrower depth range during the full moon compared with the new moon. This was not observed in 0+ individuals. Habitat partitioning was likely to be related to predator avoidance, foraging behaviours, and temperature and/or light regimes. The occurrence of 1+ P. microdon in deeper water may also result from a need for greater depths in which to manoeuvre. The present study demonstrates the utility of acoustic telemetry in monitoring P. microdon in a riverine environment. These results demonstrate the need to consider the habitat requirements of different P. microdon cohorts in the strategic planning of natural resources and will aid in the development of management strategies for this species.
Abstract:
This publication, which is the final report to the Torres Strait Cooperative Research Centre, provides an overview of all the research conducted as part of Torres Strait CRC Task 1.5 - Towards Ecologically Sustainable Management of the Torres Strait Prawn Fishery. The objectives of the task were: (1) to develop cost-effective protocols to monitor and quantify the bycatch and environmental impacts of commercial prawn trawling; (2) to monitor the status of target species using both fishery-dependent and fishery-independent data; and (3) to develop biological reference points for target species and undertake management strategy evaluation, in particular a risk assessment of fishing at various levels of fishing mortality. This report focuses on the second component of objective 1 and details a comparative analysis of bycatch samples collected from areas of the Torres Strait that were both closed and open to prawn trawl fishing. The report also reviews the research conducted in relation to objectives 2 and 3, which are detailed in a separate report, Stock Assessment of the Torres Strait Tiger Prawn Fishery (Penaeus esculentus).
Abstract:
The StreamIt programming model has been proposed to exploit parallelism in streaming applications on general purpose multicore architectures. StreamIt graphs describe task, data and pipeline parallelism which can be exploited on accelerators such as Graphics Processing Units (GPUs) or the CellBE which support abundant parallelism in hardware. In this paper, we describe a novel method to orchestrate the execution of a StreamIt program on a multicore platform equipped with an accelerator. The proposed approach identifies, using profiling, the relative benefits of executing a task on the superscalar CPU cores and the accelerator. We formulate the problem of partitioning the work between the CPU cores and the GPU, taking into account the latencies for data transfers and the required buffer layout transformations associated with the partitioning, as an integrated Integer Linear Program (ILP) which can then be solved by an ILP solver. We also propose an efficient heuristic algorithm for the work partitioning between the CPU and the GPU, which provides solutions within 9.05% of the optimal solution on average across the benchmark suite. The partitioned tasks are then software pipelined to execute on the multiple CPU cores and the Streaming Multiprocessors (SMs) of the GPU. The software pipelining algorithm orchestrates the execution between the CPU cores and the GPU by emitting the code for the CPU and the GPU, and the code for the required data transfers. Our experiments on a platform with 8 CPU cores and a GeForce 8800 GTS 512 GPU show a geometric mean speedup of 6.94X, with a maximum of 51.96X, over single-threaded CPU execution across the StreamIt benchmarks. This is an 18.9% improvement over a partitioning strategy that maps onto the CPU only the filters that cannot be executed on the GPU - the filters with state that is persistent across firings.
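To make the formulation concrete, the following is a minimal sketch of such a CPU/GPU partitioning ILP in Python with PuLP. All filter names, profiled costs and the exact objective are illustrative assumptions, not the paper's formulation; transfer cost is charged conservatively to both devices.

```python
# Sketch: partition stream filters between CPU and GPU by ILP (PuLP).
from pulp import LpProblem, LpMinimize, LpVariable, LpBinary, lpSum

# Hypothetical profiled execution times per filter on each device,
# and transfer costs for edges that end up crossing the partition.
cpu_time = {"src": 4, "fir": 9, "sink": 3}
gpu_time = {"src": 6, "fir": 1, "sink": 5}
edges = {("src", "fir"): 2, ("fir", "sink"): 2}

prob = LpProblem("cpu_gpu_partition", LpMinimize)
on_gpu = {f: LpVariable(f"gpu_{f}", cat=LpBinary) for f in cpu_time}
cut = {e: LpVariable(f"cut_{e[0]}_{e[1]}", cat=LpBinary) for e in edges}
T = LpVariable("makespan", lowBound=0)

# An edge is cut when its endpoints are mapped to different devices.
for (u, v), x in cut.items():
    prob += x >= on_gpu[u] - on_gpu[v]
    prob += x >= on_gpu[v] - on_gpu[u]

# Each device's load, plus transfer cost (conservatively charged to
# both sides), bounds the makespan T that we minimise.
transfer = lpSum(x * edges[e] for e, x in cut.items())
prob += T >= lpSum((1 - on_gpu[f]) * cpu_time[f] for f in cpu_time) + transfer
prob += T >= lpSum(on_gpu[f] * gpu_time[f] for f in gpu_time) + transfer
prob += T  # objective: minimise the makespan
prob.solve()
print({f: int(on_gpu[f].value()) for f in on_gpu})
```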
Abstract:
Common diseases such as endometriosis (ED), Alzheimer's disease (AD) and multiple sclerosis (MS) account for a significant proportion of the health care burden in many countries. Genome-wide association studies (GWASs) for these diseases have identified a number of individual genetic variants contributing to disease risk. However, the effect size for most variants is small, and collectively the known variants explain only a small proportion of the estimated heritability. We used a linear mixed model to fit all single nucleotide polymorphisms (SNPs) simultaneously, and estimated genetic variances on the liability scale using SNPs from GWASs in unrelated individuals for these three diseases. For each of the three diseases, case and control samples were not all genotyped in the same laboratory. We demonstrate that a careful analysis can obtain robust estimates, but also that insufficient quality control (QC) of SNPs can lead to spurious results and that too stringent QC is likely to remove real genetic signals. Our estimates show that common SNPs on commercially available genotyping chips capture significant variation contributing to liability for all three diseases. The estimated proportion of total variation tagged by all SNPs was 0.26 (SE 0.04) for ED, 0.24 (SE 0.03) for AD and 0.30 (SE 0.03) for MS. Further, we partitioned the genetic variance explained into five categories of minor allele frequency (MAF), and by chromosome and gene annotation. We provide strong evidence that a substantial proportion of variation in liability is explained by common SNPs, and thereby give insights into the genetic architecture of these diseases.
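For context, this is the GREML-style mixed model widely used for such estimates; a sketch under standard assumptions (the paper's exact specification may differ):

```latex
\mathbf{y} = \boldsymbol{\mu} + \mathbf{g} + \boldsymbol{\varepsilon}, \qquad
\mathbf{g} \sim N(\mathbf{0}, \mathbf{A}\sigma_g^2), \qquad
\boldsymbol{\varepsilon} \sim N(\mathbf{0}, \mathbf{I}\sigma_e^2), \qquad
h^2_{\mathrm{SNP}} = \frac{\sigma_g^2}{\sigma_g^2 + \sigma_e^2},
```

where A is the genomic relationship matrix estimated from the SNPs. For ascertained case-control data, the observed-scale estimate is commonly transformed to the liability scale as

```latex
h^2_l = h^2_o \,\frac{K(1-K)}{z^2} \cdot \frac{K(1-K)}{P(1-P)},
```

with K the population prevalence, P the proportion of cases in the sample, and z the standard normal density at the liability threshold.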
Abstract:
A high level of parental involvement is widely considered to be essential for optimal child and adolescent development and wellbeing, including academic success. However, recent consideration has been given to the idea that extremely high levels of parental involvement (often called ‘overparenting’ or ‘helicopter parenting’) might not be beneficial. This study used a newly created overparenting measure, the Locke Parenting Scale (LPS), to investigate the association between overparenting and children’s homework. Eight hundred and sixty-six parents completed online questionnaires about their parenting beliefs and intentions, and their attitudes associated with their child’s homework. Parents with higher LPS scores tended to take more personal responsibility for the completion of their child’s homework than did other parents, and ascribed greater responsibility for homework completion to their child’s teacher. However, the increased responsibility ascribed to parents and teachers was not accompanied by a commensurate reduction in what they perceived to be the child’s responsibility. Future research should examine whether extreme parental attitudes and reported behaviours translate to validated changes in actual homework support.
Abstract:
This research studied distributed computing of all-to-all comparison problems with big data sets. The thesis formalised the problem and developed a high-performance, scalable computing framework with a programming model, data distribution strategies and task scheduling policies to solve the problem. The study considered storage usage, data locality and load balancing to improve performance. The research outcomes can be applied in bioinformatics, biometrics, data mining and other domains in which all-to-all comparisons are a typical computing pattern.
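For readers unfamiliar with the pattern, an all-to-all comparison generates one task per unordered pair of inputs, i.e. M(M-1)/2 tasks for M items. A toy sketch (hypothetical data and comparison function):

```python
# All-to-all comparison pattern: compare every unordered pair once.
from itertools import combinations

def similarity(a, b):
    # Placeholder comparison, e.g. an alignment score in bioinformatics.
    return sum(x == y for x, y in zip(a, b))

items = {"s1": "ACGT", "s2": "ACGA", "s3": "TCGT"}
results = {(i, j): similarity(items[i], items[j])
           for i, j in combinations(sorted(items), 2)}
print(results)  # 3 items -> 3 pairwise comparison tasks
```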
Abstract:
Background: Hospitalised older adults often experience a decline in physical functioning and mobility in the lead-up to (or during) an acute hospital admission. During acute illness and hospitalisation, older adults may also experience a decline or fluctuation in their cognitive functioning. Previous studies have demonstrated that patients with or without reduced cognitive functioning on admission to subacute inpatient rehabilitation have considerable potential to improve their physical functioning and quality of life.
Abstract:
Cyperus iria is a weed of rice with widespread occurrence throughout the world. Because of concerns about excessive and injudicious use of herbicides, cultural weed management approaches that are safe and economical are needed. Developing such approaches will require a better understanding of weed biology and ecology, as well as of weed response to increases in crop density and nutrition. Knowledge of the effects of nitrogen (N) fertilizer on crop-weed competitive interactions could also help in the development of integrated weed management strategies. The present study was conducted in a screenhouse to determine the effects of rice planting density (0, 5, 10, and 20 plants pot−1) and N rate (0, 50, 100, and 150 kg ha−1) on the growth of C. iria. Tiller number per plant decreased by 73–88%, leaf number by 85–94%, leaf area by 85–98%, leaf biomass by 92–99%, and inflorescence biomass by 96–99% when weed plants were grown at 20 rice plants pot−1 (i.e., 400 plants m−2) compared with weed plants grown alone. All of these parameters increased when N rates were increased. On average, weed biomass increased by 118–389% and rice biomass by 121–275% with application of 50–150 kg N ha−1, compared to control. Addition of N favored weed biomass production relative to rice biomass. Increased N rates reduced the root-to-shoot weight ratio of C. iria. Rice interference reduced weed growth and biomass and completely suppressed C. iria when no N was applied at high planting densities (i.e., 20 plants pot−1). The weed showed phenotypic plasticity in response to N application, and the addition of N increased the competitive ability of the weed over rice at densities of 5 and 10 rice plants pot−1 compared with 20 plants pot−1. The results of the present study suggest that high rice density (i.e., 400 plants m−2) can help suppress C. iria growth even at high N rates (150 kg ha−1).
Abstract:
Reuse of existing carefully designed and tested software improves the quality of new software systems and reduces their development costs. Object-oriented frameworks provide an established means for software reuse on the levels of both architectural design and concrete implementation. Unfortunately, due to frameworks' complexity, which typically results from their flexibility and overall abstract nature, there are severe problems in using frameworks. Patterns are generally accepted as a convenient way of documenting frameworks and their reuse interfaces. In this thesis it is argued, however, that mere static documentation is not enough to solve the problems related to framework usage. Instead, proper interactive assistance tools are needed in order to enable systematic framework-based software production. This thesis shows how patterns that document a framework's reuse interface can be represented as dependency graphs, and how dynamic lists of programming tasks can be generated from those graphs to assist the process of using a framework to build an application. This approach to framework specialization combines the ideas of framework cookbooks and task-oriented user interfaces. Tasks provide assistance in (1) creating new code that complies with the framework reuse interface specification, (2) assuring the consistency between existing code and the specification, and (3) adjusting existing code to meet the terms of the specification. Besides illustrating how task-orientation can be applied in the context of using frameworks, this thesis describes a systematic methodology for modeling any framework reuse interface in terms of software patterns based on dependency graphs. The methodology shows how framework-specific reuse interface specifications can be derived from a library of existing reusable pattern hierarchies. Since the methodology focuses on reusing patterns, it also alleviates the recognized problem of framework reuse interface specifications becoming complicated and unmanageable for frameworks of realistic size. The ideas and methods proposed in this thesis have been tested by implementing a framework specialization tool called JavaFrames. JavaFrames uses role-based patterns that specify the reuse interface of a framework to guide framework specialization in a task-oriented manner. This thesis reports the results of case studies in which JavaFrames and the hierarchical framework reuse interface modeling methodology were applied to the Struts web application framework and the JHotDraw drawing editor framework.
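As an illustration of generating dynamic task lists from a dependency graph (a hypothetical Python sketch, not JavaFrames' actual role-based machinery; the task names are invented in the spirit of a drawing-editor framework):

```python
# Sketch: offer only the programming tasks whose prerequisites are done.
from graphlib import TopologicalSorter  # Python 3.9+

# Hypothetical reuse-interface pattern as a dependency graph: each task
# maps to the set of tasks that must be completed before it.
deps = {
    "subclass AbstractFigure": set(),
    "implement draw()": {"subclass AbstractFigure"},
    "register figure with editor": {"subclass AbstractFigure"},
    "override containsPoint()": {"implement draw()"},
}

done = {"subclass AbstractFigure"}  # tasks the developer has completed

# Walk tasks in dependency order and surface those currently available.
for task in TopologicalSorter(deps).static_order():
    if task not in done and deps[task] <= done:
        print("next task:", task)
```

As the developer completes tasks (and the tool checks the code against the specification), the list is regenerated, which is what makes it dynamic.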
Abstract:
Intensively managed pastures in subtropical Australia under dairy production are nitrogen (N) loaded agro-ecosystems, with an increased pool of N available for denitrification. The magnitude of denitrification losses and of N2:N2O partitioning in these agro-ecosystems is largely unknown, representing a major uncertainty when estimating total N loss and replacement. This study investigated the influence of different soil moisture contents on N2 and N2O emissions from a subtropical dairy pasture in Queensland, Australia. Intact soil cores were incubated over 15 days at 80% and 100% water-filled pore space (WFPS), after the application of 15N labelled nitrate equivalent to 50 kg N ha−1. This setup enabled the direct quantification of N2 and N2O emissions following fertilisation using the 15N gas flux method. The main product of denitrification in both treatments was N2: N2 emissions exceeded N2O emissions by a factor of 8 ± 1 at 80% WFPS and a factor of 17 ± 2 at 100% WFPS. The total amount of N lost as N2 over the incubation period was 21.27 ± 2.10 kg N2-N ha−1 at 80% WFPS and 25.26 ± 2.79 kg N2-N ha−1 at 100% WFPS. N2 emissions remained high at 100% WFPS, while the related N2O emissions decreased. At 80% WFPS, N2 emissions increased constantly over time while N2O fluxes declined. Consequently, N2/(N2 + N2O) product ratios increased over the incubation period in both treatments. The product ratios responded significantly to soil moisture, confirming WFPS as a key driver of denitrification. The substantial amount of fertiliser lost as N2 reveals the agronomic significance of denitrification as a major pathway of N loss for subtropical pastures at high WFPS, and may explain the low fertiliser N use efficiency observed for these agro-ecosystems.
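As a worked restatement using the abstract's own numbers, the product ratio follows directly from the measured N2:N2O factor k:

```latex
r = \frac{N_2}{N_2 + N_2O} = \frac{k}{k+1}, \qquad k = \frac{N_2}{N_2O},
```

so k = 8 at 80% WFPS gives r ≈ 0.89, and k = 17 at 100% WFPS gives r ≈ 0.94.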
Abstract:
Health challenges present arguably the most significant barrier to sustainable global development. The introduction of ICT in healthcare, especially the application of mobile communications, has created the potential to transform healthcare delivery by making it more accessible, affordable and effective across the developing world. However, research into the assessment of mHealth from the perspective of developing countries, particularly with community health workers (CHWs) as primary users, remains limited. The aim of this study is to analyze the contribution of mHealth to enhancing the performance of health workers and its alignment with existing workflows, in order to guide its utilization. The proposed research takes this into account and aims to examine the task-technology alignment of mHealth for CHWs, drawing upon task-technology fit as the theoretical foundation.
Abstract:
Solving large-scale all-to-all comparison problems using distributed computing is increasingly significant for various applications. Previous efforts to implement distributed all-to-all comparison frameworks have treated the two phases of data distribution and comparison task scheduling separately. This leads to high storage demands as well as poor data locality for the comparison tasks, thus creating a need to redistribute the data at runtime. Furthermore, most previous methods have been developed for homogeneous computing environments, so their overall performance is degraded even further when they are used in heterogeneous distributed systems. To tackle these challenges, this paper presents a data-aware task scheduling approach for solving all-to-all comparison problems in heterogeneous distributed systems. The approach formulates the requirements for data distribution and comparison task scheduling simultaneously as a constrained optimization problem. Then, metaheuristic data pre-scheduling and dynamic task scheduling strategies are developed, along with an algorithmic implementation, to solve the problem. The approach provides perfect data locality for all comparison tasks, avoiding rearrangement of data at runtime. It achieves load balancing among heterogeneous computing nodes, thus reducing the overall computation time. It also reduces data storage requirements across the network. The effectiveness of the approach is demonstrated through experimental studies.
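One simple way to realise the combined placement-and-scheduling idea is sketched below (a greedy list-scheduling toy with hypothetical node speeds and uniform task costs; the paper's metaheuristic pre-scheduling is more sophisticated). Each comparison task is assigned to the node that would finish it earliest, and both of its input items are recorded as stored on that node, so every task runs with full data locality:

```python
# Sketch: greedy data-aware assignment of pairwise comparison tasks.
import heapq
from itertools import combinations

nodes = {"n1": 2.0, "n2": 1.0, "n3": 1.0}  # hypothetical relative speeds
items = ["g1", "g2", "g3", "g4"]
task_cost = 1.0                             # assume uniform comparison cost

heap = [(0.0, n) for n in nodes]            # (current finish time, node)
heapq.heapify(heap)
placement = {}
storage = {n: set() for n in nodes}

for pair in combinations(items, 2):
    t, n = heapq.heappop(heap)              # earliest-available node first
    placement[pair] = n
    storage[n].update(pair)                 # both inputs kept locally
    heapq.heappush(heap, (t + task_cost / nodes[n], n))

print(placement)
print({n: sorted(s) for n, s in storage.items()})
```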
Abstract:
This paper presents an overview of the 6th ALTA shared task that ran in 2015. The task was to identify in English texts all the potential cognates from the perspective of the French language. In other words, identify all the words in the English text that would acceptably translate into a similar word in French. We present the motivations for the task, the description of the data and the results of the 4 participating teams. We discuss the results against a baseline and prior work.
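A naive baseline for this kind of task (my assumption, not the shared task's actual baseline) is to flag an English word as a potential cognate when it is sufficiently similar to its French translation:

```python
# Sketch: string-similarity cognate heuristic with a hypothetical lexicon.
from difflib import SequenceMatcher

translations = {"table": "table", "government": "gouvernement", "dog": "chien"}

def is_cognate(en, fr, threshold=0.7):
    return SequenceMatcher(None, en.lower(), fr.lower()).ratio() >= threshold

for en, fr in translations.items():
    print(en, "->", fr, is_cognate(en, fr))  # table and government pass, dog/chien does not
```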
Abstract:
In a reverse Stroop task, observers respond to the meaning of a color word irrespective of the color in which the word is printed; for example, the word red may be printed in the congruent color (red), an incongruent color (e.g., blue), or a neutral color (e.g., white). Although reading of color words in this task is often thought to be neither facilitated by congruent print colors nor interfered with by incongruent print colors, such interference has been detected using a response method that does not introduce any bias in favor of processing word meanings or processing print colors. On the other hand, evidence for the presence of facilitation in this task has been scarce, even though such facilitation is theoretically possible. By modifying the task such that participants respond to a stimulus color word by pointing to a corresponding response word on a computer screen with a mouse, the present study investigated the possibility that not only interference but also facilitation would take place in a reverse Stroop task. Importantly, participants' responses were dynamically tracked by recording the entire trajectories of the mouse. Arguably, this method provides richer information about participants' performance than traditional measures such as reaction time and accuracy, allowing for a more detailed (and thus potentially more sensitive) investigation of facilitation and interference in the reverse Stroop task. These trajectories showed that the mouse's approach toward correct response words was significantly delayed by incongruent print colors but not affected by congruent print colors, demonstrating that only interference, not facilitation, was present in the current task. Implications of these findings are discussed within a theoretical framework in which the strength of association between a task and its response method plays a critical role in determining how word meanings and print colors interact in reverse Stroop tasks.
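As an example of what trajectory data add over reaction time and accuracy, a common mouse-tracking index is the maximum deviation of the cursor path from the straight start-to-target line; a minimal sketch with made-up coordinates (the study's own analyses may differ):

```python
# Sketch: maximum deviation (MD) of a mouse trajectory from the ideal path.
import numpy as np

# Hypothetical (x, y) samples from trial start to the chosen response word.
traj = np.array([[0, 0], [5, 30], [25, 70], [60, 90], [100, 100]], float)

start, end = traj[0], traj[-1]
d = (end - start) / np.linalg.norm(end - start)  # unit ideal-path direction
rel = traj - start
perp = rel - np.outer(rel @ d, d)                # components off the line
md = np.linalg.norm(perp, axis=1).max()          # largest perpendicular distance
print(f"maximum deviation: {md:.2f}")
```

Larger deviations toward the competing alternative indicate stronger attraction from that response, which is the kind of graded interference the recorded trajectories can reveal.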