949 results for workload leave
Abstract:
Engineers and asset managers must often decide how best to allocate limited resources amongst different interrelated activities, including repair, renewal, inspection, and procurement of new assets. The presence of project interdependencies and the lack of sufficient information on the true value of an activity often produce complex problems and leave the decision maker guessing about the quality and robustness of their decision. In this paper, a decision support framework for uncertain interrelated activities is presented. The framework employs a methodology for multi-criteria ranking in the presence of uncertainty, detailing the effect that uncertain valuations may have on the priority of a particular activity. It uses semi-quantitative risk measures that can be tailored to an organisation and enable a transparent and simple-to-use uncertainty specification by the decision maker. The framework is then demonstrated on a real-world project set from a major Australian utility provider.
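The abstract does not give the framework's algorithm, so the sketch below is only a hedged illustration of multi-criteria ranking under uncertainty: each hypothetical activity is scored on interval-valued (low/high) criteria, and its best- and worst-case ranks are reported. All activity names, criteria, weights, and values are invented for illustration.

```python
# Illustrative multi-criteria ranking with interval-valued scores.
# Each criterion score is a (low, high) interval expressing uncertainty.
# Activities, criteria, and weights below are invented, not from the paper.

ACTIVITIES = {
    "repair_pump":    {"risk_reduction": (0.7, 0.9), "cost_benefit": (0.6, 0.8)},
    "renew_pipeline": {"risk_reduction": (0.3, 0.4), "cost_benefit": (0.4, 0.5)},
    "inspect_tank":   {"risk_reduction": (0.2, 0.8), "cost_benefit": (0.3, 0.9)},
}
WEIGHTS = {"risk_reduction": 0.6, "cost_benefit": 0.4}

def score_bounds(criteria):
    """Weighted-sum score interval for one activity."""
    lo = sum(WEIGHTS[c] * v[0] for c, v in criteria.items())
    hi = sum(WEIGHTS[c] * v[1] for c, v in criteria.items())
    return lo, hi

def rank_bounds(activities):
    """Best- and worst-case rank of each activity (1 = highest priority).
    Best case: count only rivals certainly above me (their low > my high);
    worst case: count every rival possibly above me (their high > my low)."""
    bounds = {a: score_bounds(c) for a, c in activities.items()}
    ranks = {}
    for a, (lo, hi) in bounds.items():
        best = 1 + sum(1 for b, (blo, bhi) in bounds.items() if b != a and blo > hi)
        worst = 1 + sum(1 for b, (blo, bhi) in bounds.items() if b != a and bhi > lo)
        ranks[a] = (best, worst)
    return ranks

if __name__ == "__main__":
    for act, (best, worst) in rank_bounds(ACTIVITIES).items():
        print(f"{act}: rank between {best} and {worst}")
```

A wide best/worst-case spread (as for `inspect_tank` here) is exactly the signal the abstract describes: the uncertain valuation leaves the activity's priority unresolved, so the decision maker knows where better information would pay off.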
Abstract:
Lateralization of temporal lobe epilepsy (TLE) is critical for a successful outcome of surgery to relieve seizures. TLE affects brain regions beyond the temporal lobes and has been associated with aberrant brain networks, based on evidence from functional magnetic resonance imaging. We present here a machine-learning-based method for determining the laterality of TLE, using features extracted from resting-state functional connectivity of the brain. A comprehensive feature space was constructed to include network properties within local brain regions, between brain regions, and across the whole network. Feature selection was performed based on random forest, and a support vector machine was employed to train a linear model to predict the laterality of TLE on unseen patients. A leave-one-patient-out cross-validation was carried out on 12 patients and a prediction accuracy of 83% was achieved. The importance of selected features was analyzed to demonstrate the contribution of resting-state connectivity attributes at voxel, region, and network levels to TLE lateralization.
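The leave-one-patient-out protocol above can be sketched in a few lines. This is not the paper's pipeline: a nearest-centroid classifier stands in for the random-forest feature selection plus linear SVM, and the 12 "patients" are synthetic feature vectors, so only the cross-validation structure is taken from the abstract.

```python
import random

# Leave-one-patient-out cross-validation: hold out each patient once,
# train on the remaining patients, and score the held-out prediction.
# A nearest-centroid classifier stands in for the paper's RF + SVM pipeline.

def centroid(vectors):
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def predict(x, train_x, train_y):
    # One centroid per class; predict the nearest one.
    cents = {lab: centroid([v for v, l in zip(train_x, train_y) if l == lab])
             for lab in set(train_y)}
    return min(cents, key=lambda lab: dist2(x, cents[lab]))

def loo_accuracy(X, y):
    hits = 0
    for i in range(len(X)):
        tr_x = X[:i] + X[i + 1:]
        tr_y = y[:i] + y[i + 1:]
        hits += predict(X[i], tr_x, tr_y) == y[i]
    return hits / len(X)

if __name__ == "__main__":
    rng = random.Random(0)
    # 12 synthetic "patients": left-TLE features centred near 0, right near 1.
    X = [[rng.gauss(lab, 0.3) for _ in range(5)] for lab in (0, 1) for _ in range(6)]
    y = ["left"] * 6 + ["right"] * 6
    print(f"leave-one-patient-out accuracy: {loo_accuracy(X, y):.2f}")
```

With only 12 patients, holding out one patient at a time is the natural way to use every sample for both training and testing, which is presumably why the paper adopts it.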
Abstract:
From 2008-09 to 2012-13, the most prevalent workers' compensation claim in the Queensland Ambulance Service (QAS) was for musculoskeletal injuries, at >80%. This is consistent with literature showing that Musculoskeletal Disorders (MSD) are among the leading workplace injuries across many professions. In an attempt to reduce the injury rate and related claims, the QAS created a selection criterion for its workers based on the Health Related Fitness Test. This method was intended to select workers based upon their fitness level, instead of selecting for their ability to perform the tasks or modifying the tasks to better suit the workers. With injury rates remaining high, further research produced the Patient Handling Equipment Project Report, which provided the background for the Manual Handling Program Book. The Manual Handling Program Book, however, falls short of accurately addressing musculoskeletal hazards: actions which cause or avoid injury, correct posture and motion for patient movement, muscular biomechanics, static and dynamic workload including activities causing strain, and equipment use in relation to musculoskeletal hazards. This exploratory research aims to better understand the ambulance service's perception of Manual Materials Handling (MMH), how it relates to musculoskeletal injuries, and how the service has attempted to reduce their prevalence. Based on a literature review and a critical analysis of the QAS Health Related Fitness Test, the QAS Patient Handling Equipment Project Report and the QAS Manual Handling Program Book, an understanding of their shortfalls in the prevention of musculoskeletal injuries was gained. This entails understanding the work tasks, workloads, strains and workflow of paramedics. This research creates a starting point for further research into musculoskeletal injuries in paramedics. This study specifically looks at hazards related to musculoskeletal disorders.
It identifies work system deficiencies that contribute to the prevalence of musculoskeletal injuries, and possible interventions to avoid them in paramedics.
Abstract:
Take-it-or-leave-it offers are probably as old as mankind. Our objective here is, first, to provide a (probably subjectively colored) recollection of the initial ultimatum game experiment, its motivation and the immediate responses; second, to discuss extensions of the standard ultimatum bargaining game in a unified framework; and, third, to offer a survey of the experimental ultimatum bargaining literature, covering papers published since the turn of the century. The paper argues that the ultimatum game is a versatile tool for research on bargaining and social preferences. Finally, we provide examples of open research questions and directions for future studies.
Abstract:
This study compared the fat oxidation rate from a graded exercise test (GXT) with that from a moderate-intensity interval training session (MIIT) in obese men. Twelve sedentary obese males (age 29 ± 4.1 years; BMI 29.1 ± 2.4 kg·m-2; fat mass 31.7 ± 4.4% body mass) completed two exercise sessions: a GXT to determine maximal fat oxidation (MFO) and maximal aerobic power (VO2max), and an interval cycling session during which respiratory gases were measured. The 30-min MIIT involved 5-min repetitions of workloads 20% below and 20% above the MFO intensity. VO2max was 31.8 ± 5.5 ml·kg-1·min-1 and all participants achieved ≥3 of the designated VO2max test criteria. The MFO identified during the GXT was not significantly different from the average fat oxidation rate in the MIIT session. During the MIIT session, the fat oxidation rate increased with time; the highest rate (0.18 ± 0.11 g·min-1), in minute 25, was significantly higher than the rates at minutes 5 and 15 (p ≤ 0.01 and p ≤ 0.05, respectively). In this cohort with low aerobic fitness, fat oxidation during the MIIT session was comparable with the MFO determined during a GXT. Future research may consider whether the varying workload in moderate-intensity interval training helps adherence to exercise without compromising fat oxidation.
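The abstract does not state how fat oxidation was derived from the respiratory gases. A common convention, assumed here rather than taken from the paper, is Frayn's (1983) stoichiometric equations, which estimate substrate oxidation from VO2 and VCO2 while neglecting protein oxidation. The sketch below applies them to invented gas-exchange values and also derives the two MIIT workloads 20% below and above a hypothetical MFO intensity.

```python
# Substrate oxidation from respiratory gases via Frayn's (1983) equations
# (protein oxidation neglected). The abstract does not name the method;
# Frayn's equations are assumed here as a common convention.

def frayn_oxidation(vo2_l_min, vco2_l_min):
    """Return (fat g/min, carbohydrate g/min) from gas exchange in L/min."""
    fat = 1.67 * vo2_l_min - 1.67 * vco2_l_min
    cho = 4.55 * vco2_l_min - 3.21 * vo2_l_min
    return fat, cho

def miit_workloads(mfo_watts):
    """The 30-min MIIT alternated workloads 20% below and above MFO intensity."""
    return 0.8 * mfo_watts, 1.2 * mfo_watts

if __name__ == "__main__":
    # Illustrative (invented) moderate-exercise gas-exchange values.
    fat, cho = frayn_oxidation(vo2_l_min=1.5, vco2_l_min=1.4)
    lo, hi = miit_workloads(mfo_watts=100)
    print(f"fat {fat:.3f} g/min, CHO {cho:.3f} g/min; workloads {lo:.0f}/{hi:.0f} W")
```

Note how the fat estimate depends only on the VO2-VCO2 gap: as exercise intensity rises and the respiratory exchange ratio approaches 1, the estimated fat oxidation falls toward zero, which is why an MFO intensity exists at all.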
Abstract:
Interest in collaborative Unmanned Aerial Vehicles (UAVs) in a Multi-Agent System is growing, to complement the strengths and weaknesses of the human-machine relationship. To achieve effective management of multiple heterogeneous UAVs, the status model of the agents must be communicated to each other. This paper presents the effects on operator Cognitive Workload (CW), Situation Awareness (SA), trust and performance of increasing autonomy capability transparency through text-based communication from the UAVs to the human agents. The results revealed a reduction in CW and increases in SA, in the Competence, Predictability and Reliability dimensions of trust, and in operator performance.
Abstract:
Purpose: It is relatively common for many mine workers in Australia to drive an average of 250 kilometers to and from work following long shifts and shift blocks. Despite the long distances travelled following long shifts of 12 to 14 hours, there is evidence to suggest that these workers do not take a break following their shift before driving home. This naturally raises issues of fatigue and sleepiness when driving. There is limited research on the commuting behaviours of mine workers, and little is known about the factors that influence these workers to leave site immediately following their shift. Using the theory of planned behaviour, this paper examines individual control beliefs that encourage or prevent workers from leaving the site immediately following their shift block. Method: Data were collected using a cross-sectional survey. The survey instrument was developed following a series of in-depth interviews with workers from a Queensland coal mine (n=37). The quantitative written survey sample (n=461) was drawn from the same coal mine and consisted of workers from all levels of the organisation. Results: The results examine workers' intentions to leave the work site and drive home immediately following a shift block. The results show differences in control beliefs between workers finishing night shifts and those finishing day shifts. Implications: An understanding of these control beliefs may inform more targeted intervention strategies to encourage a safer approach to driving home following shift blocks.
Abstract:
Avian species richness surveys, which measure the total number of unique avian species, can be conducted via remote acoustic sensors. An immense quantity of data can be collected which, although rich in useful information, places a great workload on the scientists who manually inspect the audio. To deal with this big-data problem, we calculated acoustic indices from audio data at a one-minute resolution and used them to classify one-minute recordings into five classes. By filtering out the non-avian minutes, we can reduce the amount of data by about 50% and improve the efficiency of determining avian species richness. The experimental results show that, given 60 one-minute samples, our approach directs ecologists to about 10% more avian species.
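The filtering workflow above can be sketched as follows. The five class names and the single-index threshold rule are hypothetical stand-ins for the paper's acoustic-index classifier; only the overall shape (classify each minute, discard non-avian minutes, report the data reduction) comes from the abstract.

```python
# Filter one-minute recordings by predicted class to cut manual inspection load.
# The class names and threshold rule are invented placeholders for the paper's
# acoustic-index classifier.

AVIAN_CLASSES = {"birds"}  # minutes worth inspecting by hand

def classify_minute(acoustic_index):
    """Toy five-class rule on a single acoustic-index value (hypothetical)."""
    if acoustic_index > 0.6:
        return "birds"
    if acoustic_index > 0.4:
        return "insects"
    if acoustic_index > 0.2:
        return "wind"
    if acoustic_index > 0.1:
        return "rain"
    return "quiet"

def filter_recordings(indices):
    """Return the minute positions to inspect and the data-reduction fraction."""
    labels = [classify_minute(v) for v in indices]
    keep = [i for i, lab in enumerate(labels) if lab in AVIAN_CLASSES]
    reduction = 1 - len(keep) / len(indices)
    return keep, reduction

if __name__ == "__main__":
    minute_indices = [0.1, 0.7, 0.65, 0.3, 0.5, 0.9, 0.15, 0.45]
    keep, reduction = filter_recordings(minute_indices)
    print(f"inspect minutes {keep}; data reduced by {reduction:.0%}")
```

The payoff claimed in the abstract is precisely this trade: roughly half the minutes are dropped before a human ever listens, so the same inspection budget covers more avian-rich audio.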
Abstract:
Inventory Management (IM) plays a decisive role in the enhancement of efficiency and competitiveness of manufacturing enterprises. Therefore, major manufacturing enterprises are following IM practices as a strategy to improve efficiency and achieve competitiveness. However, the spread of IM culture among Small and Medium Enterprises (SMEs) is limited due to lack of initiation, expertise and financial limitations in developed countries, let alone developing countries. With this backdrop, this paper makes an attempt to ascertain the role and importance of IM practices and the performance of SMEs in the machine tools industry of Bangalore, India. The relationship between inventory management practices and inventory cost is probed based on primary data gathered from 91 SMEs. The paper brings out that formal IM practices have a positive impact on the inventory performance of SMEs.
Abstract:
The structural stabilizing property of 2,2,2-trifluoroethanol (TFE) in peptides has been widely demonstrated. More recently, TFE has been shown to enhance secondary structure content in globular proteins and to influence quaternary interactions in protein multimers. The molecular mechanisms by which TFE exerts its influence on peptide and protein structures remain poorly understood. The present analysis integrates the known physical properties of TFE with a variety of experimental observations on the interaction of TFE with peptides and proteins and on the properties of fluorocarbons. Two features of TFE, namely the hydrophobicity of the trifluoromethyl group and the hydrogen-bonding character (strong donor and poor acceptor), emerge as the most important factors for rationalising the observed effects of TFE. A model is proposed for TFE interaction with peptides which involves an initial replacement of the hydration shell by fluoroalcohol molecules, a process driven by apolar interactions and the favourable entropy of dehydration. Subsequent bifurcated hydrogen-bond formation with peptide carbonyl groups, which leaves intramolecular interactions unaffected, promotes secondary structure formation.
Abstract:
Background: A genetic network can be represented as a directed graph in which a node corresponds to a gene and a directed edge specifies the direction of influence of one gene on another. The reconstruction of such networks from transcript profiling data remains an important yet challenging endeavor. A transcript profile specifies the abundances of many genes in a biological sample of interest. Prevailing strategies for learning the structure of a genetic network from high-dimensional transcript profiling data assume sparsity and linearity. Many methods consider relatively small directed graphs, inferring graphs with up to a few hundred nodes. This work examines large undirected graph representations of genetic networks (graphs with many thousands of nodes, in which an undirected edge between two nodes does not indicate the direction of influence) and the problem of estimating the structure of such a sparse linear genetic network (SLGN) from transcript profiling data. Results: The structure learning task is cast as a sparse linear regression problem, which is then posed as a LASSO (l1-constrained fitting) problem and finally solved by formulating a Linear Program (LP). A bound on the Generalization Error of this approach is given in terms of the Leave-One-Out Error. The accuracy and utility of LP-SLGNs is assessed quantitatively and qualitatively using simulated and real data. The Dialogue for Reverse Engineering Assessments and Methods (DREAM) initiative provides gold-standard data sets and evaluation metrics that enable and facilitate the comparison of algorithms for deducing the structure of networks. The structures of LP-SLGNs estimated from the INSILICO1, INSILICO2 and INSILICO3 simulated DREAM2 data sets are comparable to those proposed by the first- and/or second-ranked teams in the DREAM2 competition. The structures of LP-SLGNs estimated from two published Saccharomyces cerevisiae cell cycle transcript profiling data sets capture known regulatory associations.
In each S. cerevisiae LP-SLGN, the number of nodes with a particular degree follows an approximate power law, suggesting that its degree distribution is similar to that observed in real-world networks. Inspection of these LP-SLGNs suggests biological hypotheses amenable to experimental verification. Conclusion: A statistically robust and computationally efficient LP-based method for estimating the topology of a large sparse undirected graph from high-dimensional data yields representations of genetic networks that are biologically plausible and useful abstractions of the structures of real genetic networks. Analysis of the statistical and topological properties of learned LP-SLGNs may have practical value; for example, genes with high random-walk betweenness, a measure of the centrality of a node in a graph, are good candidates for intervention studies and hence for integrated computational-experimental investigations designed to infer more realistic and sophisticated probabilistic directed graphical model representations of genetic networks. The LP-based solutions of the sparse linear regression problem described here may provide a method for learning the structure of transcription factor networks from transcript profiling and transcription factor binding motif data.
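The core subproblem above, sparse (l1-penalised) linear regression of one gene's transcript level on all others, can be sketched without the paper's LP machinery. The sketch below uses coordinate descent with soft-thresholding, a different but standard lasso solver, chosen only to keep the example dependency-free; the toy "transcript profiles" are invented.

```python
# l1-penalised sparse linear regression by coordinate descent (soft-thresholding).
# The paper solves the lasso via a Linear Program; coordinate descent is a
# swapped-in solver used here to keep the sketch dependency-free. Regressing
# one gene's level on all others yields its sparse neighbourhood in the graph.

def soft_threshold(rho, lam):
    if rho > lam:
        return rho - lam
    if rho < -lam:
        return rho + lam
    return 0.0

def lasso_cd(X, y, lam, n_iter=200):
    """X: list of sample rows, y: target gene levels. Returns sparse coefficients."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # Correlation of feature j with the partial residual excluding j.
            rho = sum(X[i][j] * (y[i] - sum(X[i][k] * beta[k]
                      for k in range(p) if k != j)) for i in range(n))
            z = sum(X[i][j] ** 2 for i in range(n))
            beta[j] = soft_threshold(rho, lam) / z
    return beta

if __name__ == "__main__":
    # Toy "transcript profiles": the target depends only on the first of 3 genes.
    X = [[1, 0.2, -0.1], [2, -0.3, 0.4], [3, 0.1, 0.2], [4, 0.0, -0.2]]
    y = [2.1, 3.9, 6.0, 8.1]  # roughly 2 * gene 1
    beta = lasso_cd(X, y, lam=0.5)
    print("coefficients:", [round(b, 2) for b in beta])
```

The l1 penalty zeroes out the two irrelevant coefficients here, which is the sparsity assumption the abstract relies on: with thousands of genes, each gene is expected to have only a handful of non-zero neighbours.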
Abstract:
Diploma students transitioning into the NS40 BNursing (BN) course at QUT withdraw from the bioscience and pharmacology units, and leave the university, at higher rates than traditional students. The diploma students, entering in second year, have missed out on two units of bioscience taught to the traditional students in their first year, and miss out on a third unit of bioscience taught to the traditional students in their second year. Instead, the diploma students receive only one specialized unit in bioscience, i.e. a bridging unit. As a consequence, the diploma students may not have the depth of bioscience knowledge to be able to successfully study the bridging unit (LSB111) or the pharmacology unit (LSB384). Our plan was to write an eBook which refreshed and reinforced diploma students' knowledge of bioscience, aiming to prepare them with the concepts and terminology and to build a level of confidence to support their transition to the BN. We had previously developed an intervention associated with reduced attrition of diploma nursing students, and this was our starting point. The study-skills part of the initial intervention was addressed in the eBook by links to the specialist services and resources available from our liaison librarian and academic skills adviser. The introductory bioscience/pharmacology information provided by the previous intervention involved material from standard textbooks. However, we considered this material too difficult for diploma students. Thus, we created simplified diagrams to go with the text as part of our eBook. The outcome is an eBook, created and made available to the diploma students via the Community Website "Surviving Bioscience and Pharmacology". Using simplified diagrams to illustrate the concise text and definitions to explain the concepts, the focus has been on encouraging self-awareness and help-seeking strategies, and on building students who take responsibility for their learning.
All the nursing students in the second semester LSB384 Pharmacology Unit have been surveyed face-to-face to get feedback on their engagement with the eBook resource. The data has not been analysed to date. An important consideration is that the website be evaluated by the diploma students as they come into bioscience in first semester (LSB111), the student population for whom the eBook is primarily intended. To get a good response rate we need to do a face-to-face survey. However, we have not been able to do this, as the co-ordinator of the unit has changed since we started the project, and the present co-ordinator will not allow us access to these students.
Abstract:
The purpose of this study is to analyze and develop various forms of abduction as a means of conceptualizing processes of discovery. Abduction was originally presented by Charles S. Peirce (1839-1914) as a "weak", third main mode of inference -- besides deduction and induction -- one which, he proposed, is closely related to many kinds of cognitive processes, such as instincts, perception, practices and mediated activity in general. Both abduction and discovery are controversial issues in philosophy of science. It is often claimed that discovery cannot be a proper subject area for conceptual analysis and, accordingly, abduction cannot serve as a "logic of discovery". I argue, however, that abduction gives essential means for understanding processes of discovery although it cannot give rise to a manual or algorithm for making discoveries. In the first part of the study, I briefly present how the main trend in philosophy of science has, for a long time, been critical towards a systematic account of discovery. Various models have, however, been suggested. I outline a short history of abduction; first Peirce's evolving forms of his theory, and then later developments. Although abduction has not been a major area of research until quite recently, I review some critiques of it and look at the ways it has been analyzed, developed and used in various fields of research. Peirce's own writings and later developments, I argue, leave room for various subsequent interpretations of abduction. The second part of the study consists of six research articles. First I treat "classical" arguments against abduction as a logic of discovery. I show that by developing strategic aspects of abductive inference these arguments can be countered. Nowadays the term 'abduction' is often used as a synonym for the Inference to the Best Explanation (IBE) model. 
I argue, however, that it is useful to distinguish between IBE ("Harmanian abduction") and "Hansonian abduction"; the latter concentrating on analyzing processes of discovery. The distinctions between loveliness and likeliness, and between potential and actual explanations are more fruitful within Hansonian abduction. I clarify the nature of abduction by using Peirce's distinction between three areas of "semeiotic": grammar, critic, and methodeutic. Grammar (emphasizing "Firstnesses" and iconicity) and methodeutic (i.e., a processual approach) especially, give new means for understanding abduction. Peirce himself held a controversial view that new abductive ideas are products of an instinct and an inference at the same time. I maintain that it is beneficial to make a clear distinction between abductive inference and abductive instinct, on the basis of which both can be developed further. Besides these, I analyze abduction as a part of distributed cognition which emphasizes a long-term interaction with the material, social and cultural environment as a source for abductive ideas. This approach suggests a "trialogical" model in which inquirers are fundamentally connected both to other inquirers and to the objects of inquiry. As for the classical Meno paradox about discovery, I show that abduction provides more than one answer. As my main example of abductive methodology, I analyze the process of Ignaz Semmelweis' research on childbed fever. A central basis for abduction is the claim that discovery is not a sequence of events governed only by processes of chance. Abduction treats those processes which both constrain and instigate the search for new ideas; starting from the use of clues as a starting point for discovery, but continuing in considerations like elegance and 'loveliness'. The study then continues a Peircean-Hansonian research programme by developing abduction as a way of analyzing processes of discovery.
Abstract:
The dissertation describes the conscription of Finnish soldiers into the Swedish army during the Thirty Years' War. The work concentrates on so-called substitute soldiers, who were hired for conscription by wealthier peasants, who thus avoided the draft. The substitutes were the largest group recruited by the Swedish army in Sweden, making up approximately 25-80% of the total number of soldiers. They received a significant sum of money from the peasants: about 50-250 Swedish copper dalers, corresponding to the price of a small peasant house. The practice of using substitutes was managed by the local village council. The recruits were normally drawn from the landless population; however, when there was an urgent need of men, even the yeomen had to leave their homes for the distant garrisons across the Baltic. Conscription and its devastating effect on agricultural production also reduced the flow of state revenues. One of the tasks of the dissertation is to examine the correlation between the custom of using substitutes and the abandonment of farmsteads (in the first place, the inability to pay taxes). In areas where no substitutes were available, the peasants had to join the army themselves, which normally led to abandonment and financial ruin, because agricultural production was based on physical labour. This led to the rise of large farms at the cost of smaller ones. Hence, the system of substitutes was a factor that transformed the mode of settlement.
Abstract:
Distributions of lesser mealworm, Alphitobius diaperinus (Panzer) (Coleoptera: Tenebrionidae), in litter of a compacted-earth-floor broiler house in southeastern Queensland, Australia, were studied over two flocks. Larvae were the predominant stage recorded. Significantly low densities occurred in open locations and under drinker cups where chickens had complete access, whereas high densities were found under feed pans and along house edges where chicken access was restricted. For each flock, lesser mealworm numbers increased at all locations over the first 14 d, especially under feed pans and along house edges, peaking at 26 d and then declining over the final 28 d. A life-stage profile per flock was devised that consisted of the following: beetles emerge from the earth floor at the beginning of each flock, and females lay eggs, producing larvae that peak in numbers at 3 wk; after a further 3 to 4 wk, larvae leave the litter to pupate in the earth floor, and beetles then emerge by the end of the flock time. Removing old litter from the brooder section at the end of a flock did not greatly reduce mealworm numbers over the subsequent flock, but it seemed to prevent numbers from increasing, while an increase in numbers in the grow-out section was recorded after reusing litter. Areas under feed pans and along house edges accounted for 5% of the total house area, but approximately half the estimated total number of lesser mealworms in the broiler house occurred in these locations. The results of this study will be used to determine optimal deployment of site-specific treatments for lesser mealworm control.