436 results for MULTIPLE MATERNAL ORIGINS
Abstract:
The paper addresses a topic that featured prominently at the World Economic Forum in Davos (January 2014), namely 'Empathetic leadership in organisations'. Research is presented supporting the work of the Cambridge researcher Simon Baron-Cohen and his theory of Systemizers and Empathizers. An argument is made for the importance of world decision makers taking an empathetic approach to international and national decision making. The article contends that their work will directly affect the health and future of life on our planet.
Abstract:
This thesis reports on a multiple case study of the actions of three Queensland secondary schools in the context of Year 9 NAPLAN numeracy testing, focusing on their administrative practices, curriculum, pedagogy and assessment. It was established that schools have found it both challenging and costly to operate in an environment of educational reform generally, and NAPLAN testing in particular. The lack of a common understanding of numeracy and the substantial demands of implementing the Australian Curriculum have impacted on schools' ability to prepare students appropriately for NAPLAN numeracy tests. It was concluded that there is scope for schools to improve their approaches to NAPLAN numeracy testing in a way that maximises learning as well as test outcomes.
Abstract:
This paper explores the concept that individual dancers leave traces in a choreographer's body of work and, similarly, that dancers carry forward residue of embodied choreographies into other working processes. This presentation will be grounded in a study of the multiple iterations of a programme of solo works commissioned in 2008 from choreographers John Jasperse, Jodi Melnick, Liz Roche and Rosemary Butcher and danced by the author. This includes an exploration of the development by John Jasperse of themes from his solo into the pieces PURE (2008) and Truth, Revised Histories, Wishful Thinking and Flat Out Lies (2009); an adaptation of the solo Business of the Bloom by Jodi Melnick in 2008; and a further adaptation of Business of the Bloom by this author in 2012. It will map some of the developments that occurred over five years of further performances of the solo Shared Material on Dying by Liz Roche, and the working process of the (uncompleted) solo Episodes of Flight by Rosemary Butcher. The purpose is to reflect on authorship in dance, an art form in which lineages of influence can often be clearly observed. Normally, once a choreographic work is created and performed, it is archived through video recording, notation and/or reviews. The dancer is no longer called upon to represent the dance piece within the archive, and thus her/his lived presence and experiential perspective disappears. The author will draw on the different traces still inhabiting her body as pathways towards understanding how choreographic movement circulates beyond the moment of performance. This will include an interrogation of the ownership of choreographic movement: once it becomes integrated into the body of the dancer, who owns the dance? Furthermore, certain dancers, through their individual physical characteristics and moving identities, can deeply influence the formation of choreographic signatures, a proposition that challenges the sole authorship role of the choreographer in dance production. This paper will be delivered in a presentation format that will bleed into movement demonstrations alongside video footage of the works and auto-ethnographic accounts of dancing experience. A further source of knowledge will be drawn from extracts of interviews with other dancers, including Sara Rudner, Rebecca Hilton and Catherine Bennett.
Abstract:
This paper proposes a recommendation system that supports process participants in taking risk-informed decisions, with the goal of reducing risks that may arise during process execution. Risk reduction involves decreasing both the likelihood that a process fault will occur and its severity. Given a business process exposed to risks, e.g. a financial process exposed to a risk of reputation loss, we enact this process and, whenever a process participant needs to provide input to the process, e.g. by selecting the next task to execute or by filling out a form, we suggest the action that minimizes the predicted process risk. Risks are predicted by traversing decision trees generated from the logs of past process executions, which consider process data, involved resources, task durations and other information elements such as task frequencies. When applied in the context of multiple process instances running concurrently, a second technique is employed that uses integer linear programming to compute the optimal assignment of resources to the tasks to be performed, in order to deal with the interplay between risks across different instances. The recommendation system has been implemented as a set of components on top of the YAWL BPM system and its effectiveness has been evaluated using a real-life scenario, in collaboration with risk analysts of a large insurance company. The results, based on a simulation of the real-life scenario and its comparison with the event data provided by the company, show that concurrently executed process instances complete with significantly fewer faults and with lower fault severities when the recommendations provided by our recommendation system are taken into account.
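A minimal sketch of the instance-level assignment step described above: given predicted risk scores for each resource-task pair (which, in the described system, would come from decision trees trained on past execution logs), resources are assigned to pending tasks so that total predicted risk is minimized. The data, names and the use of scipy's assignment solver are illustrative assumptions, not the YAWL-based implementation.

```python
# Illustrative sketch: assign resources to pending tasks across concurrently
# running instances so that the total predicted risk is minimized.
# The risk matrix is hard-coded here; in the described system it would be
# derived from decision trees learned from logs of past process executions.
import numpy as np
from scipy.optimize import linear_sum_assignment

# risk[i][j] = predicted process risk if resource i performs task j
risk = np.array([
    [0.20, 0.65, 0.40],   # resource A
    [0.55, 0.10, 0.70],   # resource B
    [0.30, 0.45, 0.15],   # resource C
])
resources = ["A", "B", "C"]
tasks = ["approve_claim", "assess_damage", "notify_customer"]

# Solve the one-to-one assignment (a special case of the ILP formulation).
rows, cols = linear_sum_assignment(risk)
for r, c in zip(rows, cols):
    print(f"resource {resources[r]} -> {tasks[c]} (predicted risk {risk[r, c]:.2f})")
print("total predicted risk:", risk[rows, cols].sum())
```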
Abstract:
Welcome to the Evaluation of Course Matrix. This matrix is designed for highly qualified discipline experts to evaluate their course, major or unit in a systematic manner. The primary purpose of the Evaluation of Course Matrix is to provide a tool with which a group of academic staff at a university can collaboratively review the assessment within a course, major or unit annually. The annual review will leave you ready for an external curriculum review at any point in time. This tool is designed for use in a workshop format with one, two or more academic staff, and will lead to an action plan for implementation. I hope you find this tool useful in your assessment review.
Abstract:
Train-pedestrian collisions are the most likely to result in severe injuries and fatalities when compared to other types of rail crossing accidents. However, there is currently scant research examining the origins of pedestrians' rule breaking at level crossings. As a result, this study examined the origins of pedestrians' rule breaking behaviour at crossings, with particular emphasis on the factors associated with making errors versus committing deliberate violations. A total of 636 individuals volunteered to participate in the study and completed either an online or paper version of the questionnaire. Quantitative analysis of the data revealed that knowledge regarding crossing rules was high, although up to 18% of level crossing users were either unsure or did not know (in some circumstances) when it was legal to cross at a level crossing. Furthermore, 156 participants (24.52%) reported having intentionally violated the rules at level crossings and 3.46% (n = 22) of the sample had previously made a mistake at a crossing. In regard to rule violators, males (particularly minors) were more likely to report breaking rules, and the most frequent occurrence was after the train had passed rather than before it arrived. Regression analysis revealed that males who frequently use pedestrian crossings and report higher sensation seeking traits are most likely to break the rules. This research provides evidence that pedestrians are more likely to deliberately violate rules (rather than make errors) at crossings, and it illuminates high-risk groups. This paper will further outline the study findings in regard to the development of countermeasures, as well as provide direction for future research efforts in this area.
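As an illustration of the kind of regression analysis reported above, the sketch below fits a logistic model of self-reported violation against gender, crossing frequency and sensation seeking. The data are synthetic and the variable names are assumptions for demonstration only; they do not reproduce the study's dataset or its exact model.

```python
# Illustrative only: logistic regression of deliberate rule violation on
# gender, crossing frequency and a sensation-seeking score (synthetic data).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 636
male = rng.integers(0, 2, n)
crossings_per_week = rng.poisson(5, n)
sensation_seeking = rng.normal(0.0, 1.0, n)

# Synthetic outcome loosely following the reported pattern: males, frequent
# users and high sensation seekers are more likely to violate.
linpred = -2.0 + 0.8 * male + 0.1 * crossings_per_week + 0.6 * sensation_seeking
violated = rng.binomial(1, 1.0 / (1.0 + np.exp(-linpred)))

X = sm.add_constant(np.column_stack([male, crossings_per_week, sensation_seeking]))
result = sm.Logit(violated, X).fit(disp=0)
print(result.summary(xname=["const", "male", "crossings_per_week", "sensation_seeking"]))
```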
Abstract:
Aim: To examine whether pre-pregnancy weight status was associated with maternal feeding beliefs and practices in the early post-partum period. Methods: Secondary analysis of longitudinal data from Australian mothers. Participants (N = 486) were divided into two weight status groups based on self-reported pre-pregnancy weight and measured height: healthy weight (BMI < 25 kg/m2; n = 321) and overweight (BMI ≥ 25 kg/m2; n = 165). Feeding beliefs and practices were self-reported via an established questionnaire that assessed concerns about infant overeating and undereating, awareness of infant cues, feeding to a schedule, and using food to calm. Results: Infants of overweight mothers were more likely to have been given solid foods in the previous 24 hours (29% vs 20%) and fewer were fully breastfed (50% vs 64%). Multivariable regression analyses (adjusted for maternal education, parity, average infant weekly weight gain, feeding mode and introduction of solids) revealed that pre-pregnancy weight status was not associated with using food to calm, concern about undereating, awareness of infant cues or feeding to a schedule. However, feeding mode was associated with feeding beliefs and practices. Conclusions: Although no evidence for a relationship between maternal weight status and early maternal feeding beliefs and practices was observed, differences in feeding mode and in the early introduction of solids were observed. A relationship between feeding practices and maternal weight status may emerge when the children are older, solid feeding is established and they become more independent in feeding.
Abstract:
The positive relationship between household income and child health is well documented in the child health literature, but the precise mechanisms via which income generates better health, and whether the income gradient increases with child age, are not well understood. This paper presents new Australian evidence on the child health–income gradient. We use data from the Longitudinal Study of Australian Children (LSAC), which involved two waves of data collection for children born between March 2003 and February 2004 (B-Cohort: 0–3 years), and between March 1999 and February 2000 (K-Cohort: 4–7 years). This data set allows us to test the robustness of some of the findings of the influential studies of Case et al. [Case, A., Lubotsky, D., Paxson, C., 2002. Economic status and health in childhood: the origins of the gradient. The American Economic Review 92 (5) 1308–1344] and Currie and Stabile [Currie, J., Stabile, M., 2003. Socioeconomic status and child health: why is the relationship stronger for older children. The American Economic Review 93 (5) 1813–1823], and a recent study by Currie et al. [Currie, A., Shields, M.A., Price, S.W., 2007. The child health/family income gradient: evidence from England. Journal of Health Economics 26 (2) 213–232]. The richness of the LSAC data set also allows us to conduct further exploration of the determinants of child health. Our results reveal an increasing income gradient by child age using similar covariates to Case et al. [Case, A., Lubotsky, D., Paxson, C., 2002. Economic status and health in childhood: the origins of the gradient. The American Economic Review 92 (5) 1308–1344]. However, the income gradient disappears if we include a rich set of controls. Our results indicate that parental health and, in particular, the mother's health plays a significant role, reducing the income coefficient to zero, suggesting an underlying mechanism that can explain the observed relationship between child health and family income. Overall, our results for Australian children are similar to those produced by Propper et al. [Propper, C., Rigg, J., Burgess, S., 2007. Child health: evidence on the roles of family income and maternal mental health from a UK birth cohort. Health Economics 16 (11) 1245–1269] on their British child cohort.
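The gradient discussed above is typically estimated by regressing a child health measure on (log) household income and child age, and then checking how the income coefficient behaves as richer controls are added. The sketch below is a hedged illustration of that logic with synthetic data; it is not LSAC data and the variable names are assumptions.

```python
# Illustrative sketch: a child health-income 'gradient' that shrinks towards
# zero once maternal health is controlled for (synthetic data, not LSAC).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 5000
log_income = rng.normal(11.0, 0.5, n)
child_age = rng.integers(0, 8, n).astype(float)
# Maternal health is correlated with income and drives child health directly.
mother_health = 0.8 * (log_income - 11.0) + rng.normal(0.0, 1.0, n)
child_health = 0.6 * mother_health + 0.05 * child_age + rng.normal(0.0, 1.0, n)
df = pd.DataFrame({"log_income": log_income, "child_age": child_age,
                   "mother_health": mother_health, "child_health": child_health})

# Sparse specification: a positive income coefficient appears because maternal
# health is omitted.
sparse = smf.ols("child_health ~ log_income + child_age", data=df).fit()
# Rich specification: controlling for maternal health pulls the income
# coefficient towards zero, echoing the pattern reported above.
rich = smf.ols("child_health ~ log_income + child_age + mother_health", data=df).fit()
print("income coefficient (sparse):", round(sparse.params["log_income"], 3))
print("income coefficient (rich):  ", round(rich.params["log_income"], 3))
```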
Abstract:
Alignment-free methods, in which shared properties of sub-sequences (e.g. identity or match length) are extracted and used to compute a distance matrix, have recently been explored for phylogenetic inference. However, the scalability and robustness of these methods to key evolutionary processes remain to be investigated. Here, using simulated sequence sets of various sizes in both nucleotides and amino acids, we systematically assess the accuracy of phylogenetic inference using an alignment-free approach, based on D2 statistics, under different evolutionary scenarios. We find that compared to a multiple sequence alignment approach, D2 methods are more robust against among-site rate heterogeneity, compositional biases, genetic rearrangements and insertions/deletions, but are more sensitive to recent sequence divergence and sequence truncation. Across diverse empirical datasets, the alignment-free methods perform well for sequences sharing low divergence, at greater computation speed. Our findings provide strong evidence for the scalability and the potential use of alignment-free methods in large-scale phylogenomics.
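A minimal sketch of the word-count statistic behind such alignment-free comparisons: the raw D2 score between two sequences is the inner product of their k-mer count vectors, which can be normalised and converted into a pairwise distance for tree building. The implementation below is illustrative only and does not reproduce the specific D2 variants or pipeline evaluated in the study.

```python
# Illustrative alignment-free comparison using the raw D2 statistic:
# D2(a, b) = sum over all k-mers w of count_a(w) * count_b(w).
from collections import Counter
from itertools import combinations
import math

def kmer_counts(seq, k=5):
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def d2(a, b):
    # Inner product over the k-mers present in both sequences.
    return sum(a[w] * b[w] for w in a.keys() & b.keys())

def d2_distance(a, b):
    # Simple cosine-style normalisation turning similarity into a distance.
    denom = math.sqrt(d2(a, a) * d2(b, b))
    return 1.0 - d2(a, b) / denom if denom else 1.0

seqs = {
    "s1": "ATGCGTACGTTAGCATGCGTACGTTAGC",
    "s2": "ATGCGTACGTTAGCATGCGAACGTTAGC",
    "s3": "TTTTGGGGCCCCAAAATTTTGGGGCCCC",
}
counts = {name: kmer_counts(s) for name, s in seqs.items()}
for x, y in combinations(seqs, 2):
    print(x, y, round(d2_distance(counts[x], counts[y]), 3))
```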
Abstract:
The autonomous capabilities of collaborative unmanned aircraft systems are growing rapidly. Without appropriate transparency, the effectiveness of the future multiple Unmanned Aerial Vehicle (UAV) management paradigm will be significantly limited by the human agent's cognitive abilities, with the operator's Cognitive Workload (CW) and Situation Awareness (SA) becoming disproportionate. This poses a challenge in evaluating the impact of robot autonomous capability feedback, which allows the human agent, in a supervisory role, greater transparency into the robot's autonomous status. This paper presents the motivation, aim, related work, experiment theory, methodology, results and discussion, and the future work succeeding this preliminary study. The results illustrate that, with greater transparency of a UAV's autonomous capability, an overall improvement in the subjects' cognitive performance was evident: at a confidence level of 95%, the test subjects' mean CW showed a statistically significant reduction, while their mean SA showed a significant increase.
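A hedged sketch of the kind of comparison implied by the 95% confidence claim above: mean CW and SA scores under low versus high transparency of the UAV's autonomous capability are compared with t-tests. The scores are synthetic, and since the abstract does not state which test was used, the independent-samples design here is an assumption.

```python
# Illustrative only: synthetic CW and SA scores under low vs high transparency
# of the UAV's autonomous capability, compared with two-sample t-tests.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
cw_low_transparency = rng.normal(65, 10, 20)   # higher workload expected
cw_high_transparency = rng.normal(55, 10, 20)  # lower workload expected
sa_low_transparency = rng.normal(60, 8, 20)
sa_high_transparency = rng.normal(68, 8, 20)

cw_test = stats.ttest_ind(cw_low_transparency, cw_high_transparency)
sa_test = stats.ttest_ind(sa_low_transparency, sa_high_transparency)
print(f"CW: t = {cw_test.statistic:.2f}, p = {cw_test.pvalue:.4f}")
print(f"SA: t = {sa_test.statistic:.2f}, p = {sa_test.pvalue:.4f}")
# A p-value below 0.05 corresponds to significance at the 95% confidence level.
```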
Abstract:
The quality of environmental decisions should be gauged according to managers' objectives. Management objectives generally seek to maximize quantifiable measures of system benefit, for instance population growth rate. Reaching these goals often requires a certain degree of learning about the system. Learning can occur by using management actions in combination with a monitoring system. Furthermore, actions can be chosen strategically to obtain specific kinds of information. Formal decision making tools can choose actions to favor such learning in two ways: implicitly, via the optimization algorithm that is used when there is a management objective (for instance, when using adaptive management), or explicitly, by quantifying knowledge and using it as the fundamental project objective, an approach new to conservation. This paper outlines three conservation project objectives - a pure management objective, a pure learning objective, and an objective that is a weighted mixture of the two. We use eight optimization algorithms to choose actions that meet these project objectives and illustrate them in a simulated conservation project. The algorithms provide a taxonomy of decision making tools in conservation management when there is uncertainty surrounding competing models of system function. The algorithms build upon each other such that their differences are highlighted and practitioners may see where their decision making tools can be improved.
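One way to make the weighted mixture of management and learning objectives concrete is sketched below: each candidate action is scored by a weighted sum of its expected management benefit (here, population growth rate) and its expected information gain about competing system models, and the highest-scoring action is chosen. The numbers, the entropy-based learning measure and the action names are illustrative assumptions, not the eight algorithms compared in the paper.

```python
# Illustrative sketch: choose a conservation action by trading off expected
# management benefit against expected learning about competing system models.
import math

def entropy(weights):
    return -sum(w * math.log(w) for w in weights if w > 0)

# Current beliefs over two competing models of system function.
model_weights = [0.6, 0.4]

# For each action: expected population growth rate under each model, and the
# model weights we would expect to hold after monitoring that action's outcome.
actions = {
    "burn":    {"growth": [1.10, 0.95], "posterior": [0.85, 0.15]},
    "graze":   {"growth": [1.02, 1.04], "posterior": [0.62, 0.38]},
    "nothing": {"growth": [1.00, 1.00], "posterior": [0.60, 0.40]},
}

learning_weight = 0.5  # 0 = pure management objective, 1 = pure learning objective

def score(spec):
    expected_growth = sum(w * g for w, g in zip(model_weights, spec["growth"]))
    info_gain = entropy(model_weights) - entropy(spec["posterior"])
    return (1 - learning_weight) * expected_growth + learning_weight * info_gain

for name, spec in actions.items():
    print(f"{name:8s} score = {score(spec):.3f}")
print("chosen action:", max(actions, key=lambda a: score(actions[a])))
```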
Abstract:
Player experiences and expectations are connected. The presumptions players have about how they control their gameplay interactions may shape the way they play and perceive videogames. A successfully engaging player experience might rest on the way controllers meet players' expectations. We studied player interaction with novel controllers on the Sony PlayStation Wonderbook, an augmented reality (AR) gaming system. Our goal was to understand player expectations regarding game controllers in AR game design. Based on this preliminary study, we propose several interaction guidelines for hybrid input from both augmented reality and physical game controllers.
Abstract:
This thesis in software engineering presents a novel automated framework for identifying similar operations used by multiple algorithms that solve related computing problems. It provides a new, effective solution for multi-application algorithm analysis, employing fundamentally lightweight static analysis techniques compared with state-of-the-art approaches. Significant performance improvements are achieved across the target algorithms by enhancing the efficiency of the identified similar operations, targeting discrete application domains.
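The abstract does not detail the framework's internals, but the general idea of identifying similar operations with lightweight static analysis can be illustrated with a toy sketch that compares functions by the multiset of AST node types in their bodies. This is an assumption-laden illustration, not the thesis's framework.

```python
# Toy illustration of lightweight static similarity between operations:
# compare functions by the multiset of AST node types they contain.
import ast
from collections import Counter

source = """
def sum_of_squares(xs):
    total = 0
    for x in xs:
        total += x * x
    return total

def dot(xs, ys):
    total = 0
    for x, y in zip(xs, ys):
        total += x * y
    return total

def join_words(words):
    return " ".join(w.strip() for w in words)
"""

def node_profile(fn):
    return Counter(type(node).__name__ for node in ast.walk(fn))

def similarity(p, q):
    shared = sum((p & q).values())           # overlap of node-type counts
    return 2 * shared / (sum(p.values()) + sum(q.values()))  # Dice coefficient

functions = [n for n in ast.parse(source).body if isinstance(n, ast.FunctionDef)]
profiles = {f.name: node_profile(f) for f in functions}
names = list(profiles)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        a, b = names[i], names[j]
        print(f"{a} vs {b}: similarity = {similarity(profiles[a], profiles[b]):.2f}")
```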