895 results for Hierarchical task analysis
Abstract:
Hard real-time multiprocessor scheduling has seen, in recent years, a flourishing of semi-partitioned scheduling algorithms. This category of scheduling schemes combines elements of partitioned and global scheduling in order to achieve efficient utilization of the system's processing resources together with strong schedulability guarantees and low dispatching overheads. The sub-class of slot-based "task-splitting" scheduling algorithms, in particular, offers very good trade-offs between schedulability guarantees (in the form of high utilization bounds) and the number of preemptions/migrations involved. So far, however, no unified scheduling theory existed for such algorithms; each was formulated with its own accompanying analysis. This article changes this fragmented landscape by formulating a more unified schedulability theory covering the two state-of-the-art slot-based semi-partitioned algorithms, S-EKG and NPS-F (both fixed job-priority based). The new theory is based on exact schedulability tests, thereby also overcoming many sources of pessimism in the existing analyses. In turn, since schedulability testing guides task assignment under the schemes in question, we also formulate an improved task-assignment procedure. As the other main contribution of this article, and in response to the fact that many unrealistic assumptions in the original theory tend to undermine the theoretical potential of such scheduling schemes, we identify and model into the new analysis all overheads incurred by the algorithms under consideration. The outcome is a new overhead-aware schedulability analysis that permits increased efficiency and reliability. The merits of this new theory are evaluated by an extensive set of experiments.
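The interplay the abstract describes, with a schedulability test guiding task assignment, can be illustrated by a deliberately simplified sketch: first-fit bin packing that admits a task onto a processor only while a purely utilization-based test holds. The function name and the unit utilization bound are illustrative assumptions, not the exact S-EKG/NPS-F tests.

```python
def first_fit_assign(utilizations, n_procs, bound=1.0):
    """First-fit task assignment guided by a (simplified) schedulability
    test: a task is admitted onto a processor only if the processor's
    total utilization stays within `bound`."""
    loads = [0.0] * n_procs
    assignment = []
    for u in utilizations:
        for p in range(n_procs):
            if loads[p] + u <= bound:   # stand-in schedulability test
                loads[p] += u
                assignment.append(p)
                break
        else:
            assignment.append(None)     # would require splitting/migration
    return assignment, loads

assignment, loads = first_fit_assign([0.6, 0.5, 0.4, 0.3], n_procs=2)
```

A task that fits on no processor is the case where semi-partitioned schemes would split it across time slots rather than reject it.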
Abstract:
Evidence of a sport-specific hierarchy of protective factors against doping would be a powerful aid in adapting information and prevention campaigns to the characteristics of specific athlete groups, especially those athletes most vulnerable to doping. The contents of phone calls to 'ecoute dopage', a free and anonymous national anti-doping helpline, were analysed (192 bodybuilders, 124 cyclists and 44 footballers). The protective factors that emerged from the analysis fell into two groups. The first comprised 'Health concerns', 'Respect for the law' and 'Doping controls from the environment'; the second comprised 'Doubts about the effectiveness of illicit products', 'Thinking skills' and 'Doubts about doctors'. The ranking of the factors for the cyclists differed from that of the other athletes: 1) respect for the law, 2) doping controls from the environment, 3) health concerns, 4) doubts about doctors, and 5) doubts about the effectiveness of illicit products. The results are discussed in terms of the ranking in each athlete group and its consequences for the athletes' experience of and relationship to doping. Specific prevention campaigns are proposed to limit doping behaviour in general and in each sport.
Abstract:
Analysis of variance is commonly used in morphometry to ascertain differences in parameters between several populations. Failure to detect significant differences between populations (a type II error) may be due to suboptimal sampling and can lead to erroneous conclusions; the concept of statistical power allows one to avoid such failures by means of adequate sampling. Several examples from the morphometry of the nervous system show the use of the power of a hierarchical analysis-of-variance test for choosing appropriate sample and subsample sizes. In the first case, neuronal densities in the human visual cortex, we find the number of observations to have little effect. For dendritic spine densities in the visual cortex of mice and humans, the effect is somewhat larger. A substantial effect appears in our last example, dendritic segmental lengths in the monkey lateral geniculate nucleus. It is in the nature of the hierarchical model that sample size is always more important than subsample size. The relative weight to be attributed to subsample size thus depends on the magnitude of the between-observations variance relative to the between-individuals variance.
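For a balanced two-level design of the kind discussed above (individuals, with repeated observations per individual), the between-individuals and between-observations variance components follow from the ANOVA mean squares. A minimal sketch; the function name is illustrative:

```python
import statistics

def nested_variance_components(data):
    """data: list of equal-length lists; rows = individuals, columns =
    repeated observations on that individual. Returns the (between-
    individuals, between-observations) variance-component estimates from
    a balanced one-way hierarchical ANOVA."""
    k, n = len(data), len(data[0])
    row_means = [statistics.fmean(row) for row in data]
    ms_between = n * statistics.variance(row_means)
    ss_within = sum((x - m) ** 2
                    for row, m in zip(data, row_means) for x in row)
    ms_within = ss_within / (k * (n - 1))
    # method-of-moments estimate; clipped at zero if MS_between < MS_within
    sigma2_between = max((ms_between - ms_within) / n, 0.0)
    return sigma2_between, ms_within

between, within = nested_variance_components([[1, 2, 3], [4, 5, 6]])
```

Power calculations for choosing k (individuals) and n (observations each) then trade these two components off against sampling cost.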
Abstract:
Globalisation and technological advances have made it possible to offshore specific productive tasks (those that do not require physical proximity to the actual location of the work unit) to foreign countries where they can usually be performed at lower cost. We analyse the effect of task trade (i.e. task offshorability) on Spanish regional and national employment levels, correlating a newly built task-delocalisation index with key variables such as the region's wealth, workers' age and level of education, the importance of the service sector, and the technological level of the economic activities undertaken in each geographical area. We conclude that approximately 25 per cent of Spanish occupations are potentially affected by task trade/offshoring and that this is likely to benefit the Spanish economy (and the performance of specific regions, categories of workers and sectors), since Spain is a potential recipient of tasks offshored from abroad. We also find that Spain's trade in tasks correlates strongly with the above variables, with significant regional differences.
Abstract:
Throughout Nietzsche's writings we find discussions of the proper relationship of the scholar/scientist to the philosopher, with the scholar often being presented in a derogatory light. In this thesis, I examine Nietzsche's portrait of the scholar through the lens of his physiological or clinical perspective as articulated by Dr. Daniel R. Ahern in his monograph Nietzsche as Cultural Physician. My aim in doing so is to grasp the affirmative, creative aspect of this seemingly destructive polemic against scholars. I begin with a detailed discussion of Nietzsche's portrait of the scholar in Beyond Good and Evil. This includes an explication of Ahern's position, followed by an application of the diagnostic perspective to Nietzsche's discussion of the objective type, the skeptic, and the critic. I then look at how the characteristics of all three types are present in the Nietzschean 'free spirit'. I also discuss the physiological basis of esotericism in Nietzsche's work, as well as Nietzsche's revaluation of the scholarly virtue known as Redlichkeit (or 'honesty'). I conclude with comments on the free spirit's relationship to the future.
Abstract:
Behavioral researchers commonly use single-subject designs to evaluate the effects of a given treatment. Several different methods of data analysis are used, each with its own methodological strengths and limitations. Visual inspection is commonly used to analyse data by assessing the variability, level, and trend both within and between conditions (Cooper, Heron, & Heward, 2007). In an attempt to quantify treatment outcomes, researchers developed two methods for analysing data: Percentage of Non-overlapping Data points (PND) and Percentage of data points Exceeding the Median (PEM). The purpose of the present study is to compare and contrast the use of Hierarchical Linear Modelling (HLM), PND and PEM in single-subject research. The present study used 39 behaviours across 17 participants to compare treatment outcomes of a group cognitive behavioural therapy program, using PND, PEM, and HLM, on three response classes of obsessive-compulsive behaviour in children with Autism Spectrum Disorder. Findings suggest that PEM and HLM complement each other and both add invaluable information to the overall treatment results. Future research should consider using both PEM and HLM when analysing single-subject designs, especially for grouped data with variability.
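The two overlap measures named above are straightforward to compute from phase data. A minimal sketch, assuming the target behaviour is meant to decrease under treatment (for behaviours meant to increase, the comparisons flip):

```python
import statistics

def pnd(baseline, treatment):
    """Percentage of Non-overlapping Data: share of treatment-phase points
    that fall below the lowest baseline point."""
    floor = min(baseline)
    return 100.0 * sum(t < floor for t in treatment) / len(treatment)

def pem(baseline, treatment):
    """Percentage of data points Exceeding the Median: share of treatment-
    phase points below the baseline median."""
    med = statistics.median(baseline)
    return 100.0 * sum(t < med for t in treatment) / len(treatment)

baseline = [8, 9, 7, 8]     # made-up frequencies per session
treatment = [6, 5, 7, 4, 3]
```

On these made-up data PND is 80.0 (the treatment point at 7 overlaps the baseline range) while PEM is 100.0, which illustrates why the two indices can disagree when baseline data are variable.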
Abstract:
Abstract taken from the publication. Abstract also available in English.
Abstract:
The variogram is essential for local estimation and mapping of any variable by kriging. The variogram itself must usually be estimated from sample data. The sampling density is a compromise between precision and cost, but it must be sufficiently dense to encompass the principal spatial sources of variance. A nested, multi-stage, sampling with separating distances increasing in geometric progression from stage to stage will do that. The data may then be analyzed by a hierarchical analysis of variance to estimate the components of variance for every stage, and hence lag. By accumulating the components starting from the shortest lag one obtains a rough variogram for modest effort. For balanced designs the analysis of variance is optimal; for unbalanced ones, however, these estimators are not necessarily the best, and the analysis by residual maximum likelihood (REML) will usually be preferable. The paper summarizes the underlying theory and illustrates its application with data from three surveys, one in which the design had four stages and was balanced and two implemented with unbalanced designs to economize when there were more stages. A Fortran program is available for the analysis of variance, and code for the REML analysis is listed in the paper. (c) 2005 Elsevier Ltd. All rights reserved.
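The accumulation step described above, summing the stage variance components from the shortest lag upward to obtain a first-pass variogram, is a cumulative sum. A minimal sketch; the stage distances in the comment are illustrative:

```python
from itertools import accumulate

def rough_variogram(components):
    """components: variance components estimated per sampling stage,
    ordered from the shortest separating distance upward. Cumulative
    sums give semivariance estimates at the corresponding lags."""
    return list(accumulate(components))

# e.g. components for lags of 1 m, 10 m, 100 m, 1000 m
# (separating distances in geometric progression, as in the text)
gamma = rough_variogram([0.2, 0.5, 0.8, 0.1])
```

The resulting points can then seed a variogram model for kriging, with REML replacing the ANOVA estimates when the design is unbalanced.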
Abstract:
An unbalanced nested sampling design was used to investigate the spatial scale of soil and herbicide interactions at the field scale. A hierarchical analysis of variance based on residual maximum likelihood (REML) was used to analyse the data and provide a first estimate of the variogram. Soil samples were taken at 108 locations at a range of separating distances in a 9 ha field to explore small- and medium-scale spatial variation. Soil organic matter content, pH, particle size distribution, microbial biomass, and the degradation and sorption of the herbicide isoproturon were determined for each soil sample. A large proportion of the spatial variation in isoproturon degradation and sorption occurred at sampling intervals of less than 60 m; however, the sampling design did not resolve the variation present at scales greater than this. A sampling interval of 20-25 m should ensure that the main spatial structures are identified for isoproturon degradation rate and sorption without too great a loss of information in this field.
Abstract:
We present, pedagogically, the Bayesian approach to composed-error models under alternative hierarchical characterizations; demonstrate, briefly, the Bayesian approach to model comparison using recent advances in Markov chain Monte Carlo (MCMC) methods; and illustrate, empirically, the value of these techniques for natural resource economics and coastal fisheries management in particular. The Bayesian approach to fisheries efficiency analysis is interesting for at least three reasons. First, it is a robust and highly flexible alternative to the commonly applied frequentist procedures that dominate the literature. Second, the Bayesian approach is extremely simple to implement, requiring only a modest addition to most natural-resource economists' tool-kits. Third, despite its attractions, applications of Bayesian methodology in coastal fisheries management are few.
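As an illustration of the MCMC machinery mentioned above (a generic sketch, not the paper's composed-error model), a random-walk Metropolis sampler applied to a toy normal-mean posterior:

```python
import math
import random

def metropolis(logpost, start, n_iter=5000, step=0.5, seed=1):
    """Random-walk Metropolis: propose x' = x + N(0, step), accept with
    probability min(1, post(x')/post(x)); returns the chain of samples."""
    rng = random.Random(seed)
    x, lp = start, logpost(start)
    samples = []
    for _ in range(n_iter):
        prop = x + rng.gauss(0.0, step)
        lp_prop = logpost(prop)
        if lp_prop - lp > math.log(rng.random()):  # accept/reject in log space
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# toy posterior: y_i ~ N(mu, 1), diffuse N(0, 10^2) prior on mu (made-up data)
y = [1.2, 0.8, 1.0, 1.1, 0.9]
logpost = lambda mu: -0.5 * sum((v - mu) ** 2 for v in y) - 0.5 * mu ** 2 / 100
draws = metropolis(logpost, start=0.0)
```

After a short burn-in the chain mean approximates the analytical posterior mean (about 0.998 here); hierarchical composed-error models replace this toy posterior with one over inefficiency and noise components.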
Abstract:
This study analysed the relationship between slow- and fast-alpha asymmetry within the frontal cortex and the planning, execution and voluntary control of saccadic eye movements (SEM). Quantitative electroencephalography (qEEG) was recorded with a 20-channel EEG system in 12 healthy participants performing a fixed (i.e., memory-driven) and a random (i.e., stimulus-driven) SEM condition. We found main effects of SEM condition on slow- and fast-alpha asymmetry at electrodes F3-F4, which are located over the premotor cortex, specifically a negative asymmetry between conditions. At electrodes F7-F8, which are located over the prefrontal cortex, we found a main effect of condition on slow-alpha asymmetry, in particular a positive asymmetry between conditions. In conclusion, the present approach supports the association of the slow- and fast-alpha bands with the planning and preparation of SEM, and the specific role of these sub-bands both for the attention network and for the coordination and integration of sensory information with an (oculo)motor response. (C) 2011 Elsevier B.V. All rights reserved.
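Frontal alpha asymmetry is conventionally computed as the difference of log-transformed band powers between homologous right and left electrodes (e.g., F4 vs F3, F8 vs F7); the study's exact pipeline may differ, so this is only a minimal sketch of the convention:

```python
import math

def alpha_asymmetry(power_right, power_left):
    """ln(right) - ln(left) alpha band power for a homologous electrode
    pair; positive values indicate relatively greater right-side alpha."""
    return math.log(power_right) - math.log(power_left)

# made-up band powers for an F3-F4 pair (arbitrary units)
asym = alpha_asymmetry(4.0, 2.0)  # ln(2) ≈ 0.693
```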
Abstract:
Objective: To assess the risk factors for delayed diagnosis of uterine cervical lesions. Materials and Methods: This is a case-control study that recruited 178 women at 2 Brazilian hospitals. The cases (n = 74) were women with a late diagnosis of a lesion in the uterine cervix (invasive carcinoma at any stage). The controls (n = 104) were women with cervical lesions diagnosed early (low- or high-grade intraepithelial lesions). The analysis was performed with a logistic regression model using a hierarchical approach. The socioeconomic and demographic variables were included at level I (distal). Level II (intermediate) included personal and family antecedents and knowledge about the Papanicolaou test and human papillomavirus. Level III (proximal) encompassed the variables relating to individuals' care for their own health, gynecologic symptoms, and access to the health care system. Results: The risk factors for late diagnosis of uterine cervical lesions were age older than 40 years (odds ratio [OR] = 10.4; 95% confidence interval [CI], 2.3-48.4), not knowing the difference between the Papanicolaou test and gynecological pelvic examinations (OR = 2.5; 95% CI, 1.3-4.9), not thinking that the Papanicolaou test was important (OR = 4.2; 95% CI, 1.3-13.4), and abnormal vaginal bleeding (OR = 15.0; 95% CI, 6.5-35.0). Previous treatment for a sexually transmissible disease was a protective factor (OR = 0.3; 95% CI, 0.1-0.8) against delayed diagnosis. Conclusions: Deficiencies in cervical cancer prevention programs in developing countries are not simply a matter of better provision and coverage of Papanicolaou tests. Misconceptions about the Papanicolaou test are a serious educational problem, as demonstrated by the present study.
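Odds ratios with Wald-type confidence intervals of the kind reported above can be computed from a 2x2 table (Woolf's method). This is a generic sketch with made-up counts, not the study's adjusted hierarchical-model estimates:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and 95% CI from a 2x2 table: a = exposed cases, b = exposed
    controls, c = unexposed cases, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of ln(OR), Woolf's method
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(20, 10, 10, 20)  # hypothetical counts
```

In the hierarchical approach described above, each level's variables would instead enter a logistic regression sequentially, with the ORs taken from the fitted coefficients rather than raw counts.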
Abstract:
Hierarchical multi-label classification is a complex classification task in which the classes are hierarchically structured and each example may simultaneously belong to more than one class at each level of the hierarchy. In this paper, we extend our previous work, in which we investigated a new local-based classification method that incrementally trains a multi-layer perceptron for each level of the classification hierarchy. Predictions made by a neural network at a given level are used as inputs to the neural network responsible for the predictions at the next level. We compare the proposed method with one state-of-the-art decision-tree induction method and two other decision-tree induction methods on several hierarchical multi-label classification datasets. A thorough experimental analysis shows that our method obtains results competitive with a robust global method with respect to both precision and recall.
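The per-level scheme described above, where each level's predictions are appended to the feature vector fed to the next level's model, can be sketched generically. Here `make_model` is a hypothetical factory returning any classifier with fit/predict methods (the paper uses multi-layer perceptrons); the demo stand-in model is made up for illustration:

```python
class LevelwiseHMC:
    """Local hierarchical multi-label classification sketch: one model per
    hierarchy level; each level's predicted label vector is concatenated
    onto the input features of the next level's model."""

    def __init__(self, make_model, n_levels):
        self.models = [make_model() for _ in range(n_levels)]

    def fit(self, X, Y_per_level):
        feats = [list(row) for row in X]
        for model, Y in zip(self.models, Y_per_level):
            model.fit(feats, Y)
            preds = model.predict(feats)
            feats = [row + list(p) for row, p in zip(feats, preds)]
        return self

    def predict(self, X):
        feats, outputs = [list(row) for row in X], []
        for model in self.models:
            preds = model.predict(feats)
            outputs.append(preds)
            feats = [row + list(p) for row, p in zip(feats, preds)]
        return outputs

# demo with a trivial stand-in classifier (always predicts 1.0 per label)
class ConstantModel:
    def fit(self, X, Y):
        self.k = len(Y[0])
    def predict(self, X):
        return [[1.0] * self.k for _ in X]

hmc = LevelwiseHMC(ConstantModel, n_levels=2)
X = [[0.0] * 4 for _ in range(3)]          # 3 examples, 4 features
Y_levels = [[[1.0, 0.0]] * 3,              # 2 classes at level 1
            [[0.0] * 5] * 3]               # 5 classes at level 2
out = hmc.fit(X, Y_levels).predict(X)
```

The design choice here is the local (per-level) decomposition: each model sees a progressively wider feature vector, which is how errors and evidence propagate down the hierarchy.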