811 results for mathematical problem-solving
Abstract:
In this paper we describe a study of learning outcomes at a research-intensive Australian university. Three graduate outcome variables (discipline knowledge and skills, communication and problem solving, and ethical and social sensitivity) are analysed separately using OLS regression and comparisons are made of the patterns of unique contributions from four independent variables (the CEQ Good Teaching and Learning Communities Scales, and two new, independent, scales for measuring Teaching and Program Quality). Further comparisons of these patterns are made across the Schools of the university. Results support the view that teaching and program quality are not the only important determinants of students' learning outcomes. It is concluded that, whilst it continues to be appropriate for universities to be concerned with the quality of their teaching and programs, the interactive, social and collaborative aspects of students' learning experiences, captured in the notion of the Learning Community, are also very important determinants of graduate outcomes, and so should be included in the focus of attempts at enhancing the quality of student learning.
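The regression design described above can be sketched as follows. This is a minimal illustration of fitting one graduate outcome on four predictor scales via OLS and reading the coefficients as unique contributions; the variable names and synthetic data are assumptions for illustration, not the study's dataset.

```python
# Minimal OLS sketch of the design above: one outcome regressed on four
# scale scores, coefficients read as each scale's unique contribution.
# Synthetic data only -- not the study's actual data.
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Four hypothetical predictor scales (e.g. Good Teaching, Learning
# Community, Teaching Quality, Program Quality).
X = rng.normal(size=(n, 4))

# One outcome (e.g. "communication and problem solving") generated with
# known weights so the fit can be checked against them.
true_beta = np.array([0.5, 0.8, 0.2, 0.1])
y = X @ true_beta + rng.normal(scale=0.3, size=n)

# OLS with an intercept: minimise ||y - X1 @ b||^2.
X1 = np.column_stack([np.ones(n), X])
beta_hat, *_ = np.linalg.lstsq(X1, y, rcond=None)

print("intercept:", round(beta_hat[0], 3))
print("unique contributions:", np.round(beta_hat[1:], 3))
```

Repeating such a fit per outcome variable and per School, then comparing the coefficient patterns, mirrors the comparisons the abstract describes.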
Abstract:
To participate effectively in the post-industrial information societies and knowledge/service economies of the 21st century, individuals must be better informed; have greater thinking and problem-solving abilities; be self-motivated; have a capacity for cooperative interaction; possess varied and specialised skills; and be more resourceful and adaptable than ever before. This paper reports on one outcome from a national project funded by the Ministerial Council on Education, Employment, Training and Youth Affairs, which investigated what practices, processes, strategies and structures best promote lifelong learning and the development of lifelong learners in the middle years of schooling. The investigation linked lifelong learning with middle schooling because there were indications that middle schooling reform practices also lead to the development of lifelong learning attributes, which is regarded as a desirable outcome of schooling in Australia. While this larger project provides depth around these questions, this paper specifically reports on the development of a three-phase model that can guide the sequence in which schools undertaking middle schooling reform attend to particular core component changes. The model is developed from the extensive analysis of 25 innovative schools around the nation, and provides a unique insight into the desirable sequences and time spent achieving reforms, along with typical pitfalls that lead to a regression in the reform process. Importantly, the model confirms that schooling reform takes much more time than planners typically expect or allocate, and that there are predictable and identifiable inhibitors to achieving it.
Abstract:
Allowing plant pathology students to tackle fictitious or real crop problems during the course of their formal training not only teaches them the diagnostic process, but also provides for a better understanding of disease etiology. Such a problem-solving approach can also engage, motivate, and enthuse students about plant pathology in general. This paper presents examples of three problem-based approaches to diagnostic training utilizing freely available software. The first provides an adventure-game simulation where students are asked to provide a diagnosis and recommendation after exploring a hypothetical scenario or case. Guidance is given on how to create these scenarios. The second approach involves students creating their own scenarios. The third uses a diagnostic template combined with reporting software to both guide and capture students' results and reflections during a real diagnostic assignment.
Abstract:
This trial of a cognitive-behavioural therapy (CBT) based amphetamine abstinence program (n = 507) focused on refusal self-efficacy, improved coping, improved problem solving and planning for relapse prevention. Measures included the Severity of Dependence Scale (SDS), the General Health Questionnaire-28 (GHQ-28) and Amphetamine Refusal Self-Efficacy. Psychiatric case identification (caseness) across the four GHQ-28 sub-scales was compared with Australian normative data. Almost 90% were amphetamine-dependent (SDS 8.15 +/- 3.17). Pretreatment, all GHQ-28 sub-scale measures were below reported Australian population values. Caseness was substantially higher than Australian normative values: Somatic Symptoms (52.3%), Anxiety (68%), Social Dysfunction (46.5%) and Depression (33.7%). One hundred and sixty-eight subjects (33%) completed and reported program abstinence. Program completers reported improvement across all GHQ-28 sub-scales: Somatic Symptoms (p < 0.001), Anxiety (p < 0.001), Social Dysfunction (p < 0.001) and Depression (p < 0.001). They also reported improvement in amphetamine refusal self-efficacy (p < 0.001). Improvement remained significant following intention-to-treat analyses, imputing baseline data for subjects who withdrew from the program. The GHQ-28 sub-scales, the Amphetamine Refusal Self-Efficacy Questionnaire and the SDS successfully predicted treatment compliance through a discriminant analysis function (p
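The intention-to-treat step mentioned above, imputing baseline scores for participants who withdrew, can be sketched as below. The records and score values are hypothetical, not the trial's data; the point is only the baseline-carried-forward mechanic, under which dropouts count as showing no improvement.

```python
# Sketch of baseline-carried-forward intention-to-treat imputation:
# subjects who withdrew have their baseline score used as the
# post-treatment value. Data and layout are illustrative only.

subjects = [
    # (baseline_score, post_score or None if the subject withdrew)
    (12.0, 6.0),
    (15.0, None),   # withdrew: baseline carried forward
    (9.0, 4.0),
    (14.0, None),   # withdrew: baseline carried forward
]

def itt_post_scores(records):
    """Return post-treatment scores with baseline carried forward."""
    return [post if post is not None else base for base, post in records]

completers = [post for _, post in subjects if post is not None]
completer_mean = sum(completers) / len(completers)

itt = itt_post_scores(subjects)
itt_mean = sum(itt) / len(itt)

print("completers-only mean:", completer_mean)   # 5.0
print("intention-to-treat mean:", itt_mean)      # 9.75
```

The gap between the two means shows why the abstract's "improvement remained significant" claim under intention-to-treat is a stronger result than the completers-only comparison.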
Abstract:
Semantic data models provide a map of the components of an information system. The characteristics of these models affect their usefulness for various tasks (e.g., information retrieval). The quality of information retrieval has obvious important consequences, both economic and otherwise. Traditionally, database designers have produced parsimonious logical data models. In spite of their increased size, ontologically clearer conceptual models have been shown to facilitate better performance for both problem solving and information retrieval tasks in experimental settings. The experiments producing evidence of enhanced performance for ontologically clearer models have, however, used application domains of modest size. Data models in organizational settings are likely to be substantially larger than those used in these experiments. This research used an experiment to investigate whether the benefits of improved information retrieval performance associated with ontologically clearer models are robust as the size of the application domain increases. The experiment used an application domain approximately twice the size of those tested in prior experiments. The results indicate that, relative to the users of the parsimonious implementation, end users of the ontologically clearer implementation made significantly more semantic errors, took significantly more time to compose their queries, and were significantly less confident in the accuracy of their queries.
Abstract:
Stochastic simulation is a recognised tool for quantifying the spatial distribution of geological uncertainty and risk in earth science and engineering. Metals mining is an area where simulation technologies are extensively used; however, applications in the coal mining industry have been limited. This is largely due to the lack of a systematic demonstration of the problem-solving capabilities these techniques offer in coal mining. This paper presents two broad and technically distinct areas of application in coal mining. The first deals with the use of simulation in the quantification of uncertainty in coal seam attributes and risk assessment to assist coal resource classification, and drillhole spacing optimisation to meet pre-specified risk levels at a required confidence. The second application presents the use of stochastic simulation in the quantification of fault risk, an area of particular interest to underground coal mining, and documents the performance of the approach. The examples presented demonstrate the advantages and positive contribution stochastic simulation approaches bring to the coal mining industry.
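The core idea behind using simulation for resource classification can be sketched as follows: generate many equally probable realizations of a seam attribute, then summarise per-block risk as the probability of meeting a cutoff. This toy uses independent Gaussian draws in place of a proper geostatistical simulator, and all values, cutoffs and confidence levels are illustrative assumptions.

```python
# Toy sketch of simulation-based risk assessment: many realizations of
# seam thickness per block, summarised as the probability of exceeding
# a mining cutoff. Independent Gaussian draws stand in for a real
# geostatistical simulator; all numbers are invented for illustration.
import numpy as np

rng = np.random.default_rng(42)
n_blocks, n_realizations = 6, 500

# Hypothetical estimated seam thickness (m) and local uncertainty.
mean_thickness = np.array([2.1, 1.8, 2.5, 1.2, 2.0, 1.5])
std_thickness = np.array([0.2, 0.4, 0.1, 0.3, 0.5, 0.2])

# Simulate realizations: shape (n_realizations, n_blocks).
realizations = rng.normal(mean_thickness, std_thickness,
                          size=(n_realizations, n_blocks))

# Risk summary: probability each block exceeds a 1.5 m cutoff.
cutoff = 1.5
p_exceed = (realizations > cutoff).mean(axis=0)

# Classify only blocks that meet the cutoff at 90% confidence.
classified = p_exceed >= 0.9
print("P(thickness > cutoff):", np.round(p_exceed, 2))
print("blocks classified at 90% confidence:", classified)
```

Tightening the required confidence, or adding simulated drillholes to reduce the per-block standard deviation, is the mechanism behind the drillhole spacing optimisation the abstract mentions.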
Abstract:
This paper reports on a current research project in which virtual reality simulators are being investigated as a means of simulating hazardous rail work conditions in order to allow train drivers to practice decision-making under stress. When working under high-stress conditions, train drivers need to move beyond procedural responses to a response activated through their own problem-solving and decision-making skills. This study focuses on the use of stress inoculation training, which aims to build drivers' confidence in the use of new decision-making skills by repeatedly requiring them to respond to hazardous driving conditions. In particular, the study makes use of a train cab driving simulator to reproduce potentially stress-inducing real-world scenarios. Initial pilot research has been undertaken in which drivers have experienced the training simulation and subsequently completed surveys on the level of immersion experienced. Concurrently, drivers have also participated in a velocity perception experiment designed to objectively measure the fidelity of the virtual training environment. Baseline data, against which decision-making skills post training will be measured, is being gathered via cognitive task analysis designed to identify primary decision requirements for specific rail events. While considerable effort has been invested in improving virtual reality technology, little is known about how best to use this technology for training personnel to respond to workplace conditions in the rail industry. To enable the best use of simulators for training in the rail context, the project aims to identify those factors within virtual reality that support required learning outcomes and to use this information to design training simulations that reliably and safely train staff in required workplace accident response skills.
QUALITY OF LIFE AND COPING STRATEGIES OF WOMEN WITH AND WITHOUT LYMPHEDEMA AFTER BREAST CANCER
Abstract:
Lymphedema of the upper limb is a complication inherent to breast cancer treatment. Characterized by increased limb volume, it leads to physical and functional limitations and has a negative psychological and social impact. The aim of this study was to investigate quality of life and its domains, coping strategies in the face of breast cancer, and the correlation between these variables. The study was conducted at a women's health centre over four months. The assessment instruments were: a general and breast-cancer-specific characterization questionnaire; upper-limb perimetry; the quality-of-life questionnaires of the European Organisation for Research and Treatment of Cancer, EORTC QLQ-C30 and BR-23; and the Coping Strategies Inventory. Eighty-two women were interviewed, with a mean age of 57.4 years (SD 12.3), all of whom had undergone unilateral breast surgery with axillary dissection and had no metastases. Lymphedema was present in 39.03% (32) and appears to have little effect on quality of life after breast cancer, with social functioning being the most impaired domain. Symptoms related to chemotherapy and the breast bothered women in both groups, but arm-related symptoms were statistically greater in women with lymphedema. The strategies most used by the interviewees to cope with cancer were reappraisal, problem solving, escape, social support and self-control; only self-control was statistically greater in women with lymphedema. Problem-solving strategies, self-control and low social support may have contributed to the onset of lymphedema. It is concluded that the use of active and positive strategies to cope with breast cancer appears to result in good psychosocial adaptation.
Abstract:
Transnational Environmental Policy analyses a surprising success story in the field of international environmental policy making: the threat to the ozone layer posed by industrial chemicals, and how it has been averted. The book also raises the more general question about the problem-solving capacities of industrialised countries and the world society as a whole. Reiner Grundmann investigates the regulations which have been put in place at an international level, and how the process evolved over twenty years in the US and Germany.
Abstract:
This chapter demonstrates diversity in the activity of authorship and the corresponding diversity of forensic authorship analysis questions and techniques. Authorship is discussed in terms of Love’s (2002) multifunctional description of precursory, executive, declarative and revisionary authorship activities and the implications of this distinction for forensic problem solving. Four different authorship questions are considered. These are ‘How was the text produced?’, ‘How many people wrote the text?’, ‘What kind of person wrote the text?’ and ‘What is the relationship of a queried text with comparison texts?’ Different approaches to forensic authorship analysis are discussed in terms of their appropriateness to answering different authorship questions. The conclusion drawn is that no one technique will ever be appropriate to all problems.
Abstract:
The initial aim of this research was to investigate the application of expert systems, or knowledge base systems technology, to the automated synthesis of Hazard and Operability (HAZOP) Studies. Due to the generic nature of fault analysis problems and the way in which knowledge base systems work, this goal evolved into a consideration of automated support for fault analysis in general, covering HAZOP, Fault Tree Analysis, FMEA and fault diagnosis in the process industries. This thesis describes a proposed architecture for such an expert system. The purpose of the system is to produce a descriptive model of faults and fault propagation from a description of the physical structure of the plant. From these descriptive models, the desired fault analysis may be produced. The way in which this is done reflects the complexity of the problem which, in principle, encompasses the whole of the discipline of process engineering. An attempt is made to incorporate the perceived method that an expert uses to solve the problem; keywords, heuristics and guidelines from techniques such as HAZOP and fault tree synthesis are used. In a truly expert system, performance depends strongly on the quality of the incorporated knowledge. This expert knowledge takes the form of heuristics, or rules of thumb, used in problem solving. This research has shown that, for the application of fault analysis heuristics, it is necessary to have a representation of the details of fault propagation within a process. This helps to ensure the robustness of the system: a gradual rather than abrupt degradation at the boundaries of the domain knowledge.
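The fault-propagation idea described above can be sketched as a directed graph of plant units over which local fault effects are propagated downstream, yielding candidate causes for an observed deviation, the core of HAZOP-style reasoning. The plant topology and fault rules below are invented for illustration and are not taken from the thesis.

```python
# Sketch of rule-based fault propagation over a plant model: a fault's
# local effect is matched against an observed deviation, and a unit is
# a candidate cause only if it can propagate to where the deviation
# was seen. Topology and rules are hypothetical.
from collections import deque

# Hypothetical plant structure: unit -> downstream units.
plant = {
    "feed_pump": ["heat_exchanger"],
    "heat_exchanger": ["reactor"],
    "reactor": ["separator"],
    "separator": [],
}

# Heuristic rules: fault at a unit -> deviation it causes locally.
fault_effects = {
    "feed_pump": "no_flow",
    "heat_exchanger": "low_temperature",
    "reactor": "high_pressure",
}

def reachable_from(unit, topology):
    """All units downstream of `unit`, via breadth-first search."""
    seen, queue = set(), deque([unit])
    while queue:
        for nxt in topology[queue.popleft()]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

def candidate_causes(observed_unit, deviation):
    """Faults whose local effect matches the deviation and whose unit
    can propagate to the unit where the deviation was observed."""
    return sorted(
        u for u, effect in fault_effects.items()
        if effect == deviation
        and (u == observed_unit or observed_unit in reachable_from(u, plant))
    )

print(candidate_causes("separator", "no_flow"))  # ['feed_pump']
```

Representing propagation explicitly, rather than encoding only local rules, is what gives the graceful degradation at the edge of the domain knowledge that the abstract emphasises: an unmatched deviation simply returns no candidates instead of a wrong answer.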
Abstract:
In the quest to secure the much vaunted benefits of North Sea oil, highly non-incremental technologies have been adopted. Nowhere is this more the case than with the early fields of the central and northern North Sea. By focusing on the inflexible nature of North Sea hardware in such fields, this thesis examines the problems that this sort of technology might pose for policy making. More particularly, the following issues are raised. First, the implications of non-incremental technical change for the successful conduct of oil policy are considered. Here, the focus is on the micro-economic performance of the first generation of North Sea oil fields and the manner in which this relates to government policy. Secondly, the question is posed as to whether there were more flexible, perhaps more incremental, policy alternatives open to the decision makers. Conclusions drawn relate to the degree to which non-incremental shifts in policy permit decision makers to achieve their objectives at relatively low cost. To discover cases where non-incremental policy making has led to success in this way would be to falsify the thesis that decision makers are best served by employing incremental politics as an approach to complex problem solving.
Abstract:
This thesis offers a methodology to study and design effective communication mechanisms in human activities. The methodology is focused on the management of complexity. It is argued that complexity is not something objective that can be worked out analytically, but something subjective that depends on the viewpoint. It is also argued that while certain social contexts may inhibit, others may enhance, the viewpoint's capabilities to deal with complexity. Certain organisation structures are more likely than others to allow individuals to release their potential; hence the relevance of studying and designing effective organisations. The first part of the thesis offers a `cybernetic methodology' for problem solving in human activities; the second offers a `method' to study and design organisations. The cybernetic methodology discussed in this work is rooted in second order cybernetics, or the cybernetics of observing systems (Von Foerster 1979, Maturana and Varela 1980). Its main tenet is that the known properties of the real world reside in the individual and not in the world itself. This view, which places emphasis on an inherently one-sided and unilateral appreciation of reality, triggers the need for dialogue and conversations to construct it. The `method' to study and design organisations is based on Beer's Viable System Model (Beer 1979, 1981, 1985). This model permits us to assess how successful an organisation is in coping with its environmental complexity and, moreover, permits us to establish how to make its responses to this complexity more effective. These features of the model are of great significance in a world where complexity is perceived to be growing at an unthinkable pace. But `seeing' these features of the model assumes an effective appreciation of organisational complexity; hence the need for the methodological discussions offered in the first part of the thesis.
Abstract:
Despite the growth of spoken academic corpora in recent years, relatively little is known about the language of seminar discussions in higher education. This thesis compares seminar discussions across three disciplinary areas. The aim of this thesis is to uncover the functions and patterns of talk used in different disciplinary discussions and to highlight language on a macro and micro level that would be useful for materials design and teaching purposes. A framework for identifying and analysing genres in spoken language based on Hallidayan Systemic Functional Linguistics (SFL) is used. Stretches of talk sharing a similar purpose and predictable functional staging, termed Discussion Macro Genres (DMGs), are identified. Language is compared across DMGs and across disciplines through the use of corpus techniques in conjunction with SFL genre theory. Data for the study comprise just over 180,000 tokens and are drawn from the British Academic Spoken English corpus (BASE), recorded at two universities in the UK. The discipline areas investigated are Arts and Humanities, Social Sciences and Physical Sciences. Findings from this study make theoretical, empirical and methodological contributions to the field of spoken EAP. The empirical findings are, firstly, that the majority of the seminar discussion can be assigned to one of the three main DMGs in the corpus: Responding, Debating and Problem Solving. Secondly, each discipline area is characterised according to two DMGs. Thirdly, the majority of the discussion is non-oppositional in nature, suggesting that 'debate' is not the only form of discussion that students need to be prepared for. Finally, while some characteristics of the discussion are tied to the DMG and common across disciplines, others are discipline specific.
On a theoretical level, this study shows that an SFL genre model for investigating spoken discourse can be successfully extended to investigate longer stretches of discourse than have previously been identified. The methodological contribution is to demonstrate how corpus techniques can be combined with SFL genre theory to investigate extended stretches of spoken discussion. The thesis will be of value to those working in the field of teaching spoken EAP/ESAP as well as to materials developers.