815 results for Problem solving task
Abstract:
Objective: To investigate the psychosocial impact of young caregiving by empirically validating prominent qualitative themes. This was achieved through developing an inventory, the Young Caregiver of Parents Inventory (YCOPI), designed to assess these themes, and by comparing young caregivers and noncaregivers. Method: Two hundred forty-five participants aged between 10 and 25 years completed questionnaires: 100 young caregivers and 145 noncaregivers. In addition to the YCOPI, the following variables were measured: demographics, caregiving context, social support, appraisal, coping strategies, and adjustment (health, life satisfaction, distress, positive affect). Results: Eight reliable factors emerged from the YCOPI that described the diverse impacts of caregiving and reflected the key themes reported in prior research. The factors were related to most caregiving context variables and theoretically relevant stress and coping variables. Compared with noncaregivers, young caregivers reported higher levels of young caregiving impact, less reliance on problem-solving coping, higher somatization, and lower life satisfaction. Conclusions: Findings delineate key impacts of young caregiving and highlight the importance of ensuring that measures used in research on young caregivers are sensitive to issues pertinent to this population.
Abstract:
In this paper we describe a study of learning outcomes at a research-intensive Australian university. Three graduate outcome variables (discipline knowledge and skills, communication and problem solving, and ethical and social sensitivity) are analysed separately using OLS regression and comparisons are made of the patterns of unique contributions from four independent variables (the CEQ Good Teaching and Learning Communities Scales, and two new, independent, scales for measuring Teaching and Program Quality). Further comparisons of these patterns are made across the Schools of the university. Results support the view that teaching and program quality are not the only important determinants of students' learning outcomes. It is concluded that, whilst it continues to be appropriate for universities to be concerned with the quality of their teaching and programs, the interactive, social and collaborative aspects of students' learning experiences, captured in the notion of the Learning Community, are also very important determinants of graduate outcomes, and so should be included in the focus of attempts at enhancing the quality of student learning.
Abstract:
To participate effectively in the post-industrial information societies and knowledge/service economies of the 21st century, individuals must be better informed, have greater thinking and problem-solving abilities, be self-motivated, have a capacity for cooperative interaction, possess varied and specialised skills, and be more resourceful and adaptable than ever before. This paper reports on one outcome from a national project funded by the Ministerial Council on Education, Employment, Training and Youth Affairs, which investigated what practices, processes, strategies and structures best promote lifelong learning and the development of lifelong learners in the middle years of schooling. The investigation linked lifelong learning with middle schooling because there were indications that middle schooling reform practices also lead to the development of lifelong learning attributes, which is regarded as a desirable outcome of schooling in Australia. While this larger project provides depth around these questions, this paper specifically reports on the development of a three-phase model that can guide the sequence in which schools undertaking middle schooling reform attend to particular core component changes. The model is developed from an extensive analysis of 25 innovative schools around the nation, and provides a unique insight into the desirable sequences and time spent achieving reforms, along with typical pitfalls that lead to a regression in the reform process. Importantly, the model confirms that schooling reform takes much more time than planners typically expect or allocate, and that there are predictable and identifiable inhibitors to achieving it.
Abstract:
Allowing plant pathology students to tackle fictitious or real crop problems during the course of their formal training not only teaches them the diagnostic process, but also provides for a better understanding of disease etiology. Such a problem-solving approach can also engage, motivate, and enthuse students about plant pathology in general. This paper presents examples of three problem-based approaches to diagnostic training utilizing freely available software. The first provides an adventure-game simulation where students are asked to provide a diagnosis and recommendation after exploring a hypothetical scenario or case. Guidance is given on how to create these scenarios. The second approach involves students creating their own scenarios. The third uses a diagnostic template combined with reporting software to both guide and capture students' results and reflections during a real diagnostic assignment.
Abstract:
This trial of a cognitive-behavioural therapy (CBT)-based amphetamine abstinence program (n = 507) focused on refusal self-efficacy, improved coping, improved problem solving and planning for relapse prevention. Measures included the Severity of Dependence Scale (SDS), the General Health Questionnaire-28 (GHQ-28) and Amphetamine Refusal Self-Efficacy. Psychiatric case identification (caseness) across the four GHQ-28 sub-scales was compared with Australian normative data. Almost 90% were amphetamine-dependent (SDS 8.15 +/- 3.17). Pretreatment, all GHQ-28 sub-scale measures were below reported Australian population values. Caseness was substantially higher than Australian normative values: Somatic Symptoms (52.3%), Anxiety (68%), Social Dysfunction (46.5%) and Depression (33.7%). One hundred and sixty-eight subjects (33%) completed and reported program abstinence. Program completers reported improvement across all GHQ-28 sub-scales: Somatic Symptoms (p < 0.001), Anxiety (p < 0.001), Social Dysfunction (p < 0.001) and Depression (p < 0.001). They also reported improvement in amphetamine refusal self-efficacy (p < 0.001). Improvement remained significant following intention-to-treat analyses, imputing baseline data for subjects who withdrew from the program. The GHQ-28 sub-scales, Amphetamine Refusal Self-Efficacy Questionnaire and the SDS successfully predicted treatment compliance through a discriminant analysis function (p
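The intention-to-treat step described above (imputing baseline data for subjects who withdrew) can be sketched in a few lines. This is an illustrative reconstruction, not the study's analysis code; the scores, names, and group sizes are invented for the example.

```python
# Hypothetical sketch of an intention-to-treat (ITT) analysis in which
# subjects who withdrew have their baseline score carried forward, as the
# abstract describes. All data here are toy values, not from the study.

def itt_scores(baseline, post, completed):
    """Return post-treatment scores, imputing baseline for non-completers."""
    return [post_i if done else base_i
            for base_i, post_i, done in zip(baseline, post, completed)]

def mean(xs):
    return sum(xs) / len(xs)

# Toy GHQ-28-style sub-scale scores (higher = more distress).
baseline  = [14, 18, 12, 20, 16]
post      = [6, 7, None, 9, None]      # None = withdrew, no follow-up score
completed = [True, True, False, True, False]

imputed = itt_scores(baseline, post, completed)
print(imputed)                          # [6, 7, 12, 9, 16]
print(mean(baseline) - mean(imputed))   # mean improvement under ITT
```

Carrying baseline forward is deliberately conservative: withdrawn subjects contribute zero improvement, so any remaining effect is harder to attribute to attrition.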
Abstract:
Semantic data models provide a map of the components of an information system. The characteristics of these models affect their usefulness for various tasks (e.g., information retrieval). The quality of information retrieval has obvious important consequences, both economic and otherwise. Traditionally, database designers have produced parsimonious logical data models. In spite of their increased size, ontologically clearer conceptual models have been shown to facilitate better performance for both problem solving and information retrieval tasks in experimental settings. The experiments producing evidence of enhanced performance for ontologically clearer models have, however, used application domains of modest size. Data models in organizational settings are likely to be substantially larger than those used in these experiments. This research used an experiment to investigate whether the benefits of improved information retrieval performance associated with ontologically clearer models are robust as the size of the application domain increases. The experiment used an application domain of approximately twice the size of those tested in prior experiments. The results indicate that, relative to the users of the parsimonious implementation, end users of the ontologically clearer implementation made significantly more semantic errors, took significantly more time to compose their queries, and were significantly less confident in the accuracy of their queries.
Abstract:
Stochastic simulation is a recognised tool for quantifying the spatial distribution of geological uncertainty and risk in earth science and engineering. Metals mining is an area where simulation technologies are extensively used; however, applications in the coal mining industry have been limited. This is largely due to the lack of a systematic demonstration illustrating the capabilities these techniques have in problem solving in coal mining. This paper presents two broad and technically distinct areas of application in coal mining. The first deals with the use of simulation in the quantification of uncertainty in coal seam attributes and risk assessment to assist coal resource classification, and drillhole spacing optimisation to meet pre-specified risk levels at a required confidence. The second application presents the use of stochastic simulation in the quantification of fault risk, an area of particular interest to underground coal mining, and documents the performance of the approach. The examples presented demonstrate the advantages and positive contribution stochastic simulation approaches bring to the coal mining industry.
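The kind of risk quantification the abstract describes can be illustrated with a minimal Monte Carlo sketch: draw many equally probable realizations of a seam attribute and estimate the probability it violates a cut-off. The distribution, cut-off value, and function name below are illustrative assumptions, not the paper's method (which works with spatially correlated realizations, not independent draws).

```python
# Minimal Monte Carlo sketch of attribute-risk estimation (illustrative only).
# Real geostatistical simulation honours spatial correlation between drillholes;
# here each realization is an independent Gaussian draw, for brevity.
import random

def simulate_risk(mean_thickness, sd, cutoff, n_realizations=10_000, seed=42):
    """Estimate P(seam thickness < cutoff) from simulated realizations."""
    rng = random.Random(seed)
    below = sum(1 for _ in range(n_realizations)
                if rng.gauss(mean_thickness, sd) < cutoff)
    return below / n_realizations

# E.g. seam modelled as 2.5 m +/- 0.4 m; mining requires at least 1.8 m.
risk = simulate_risk(2.5, 0.4, 1.8)
print(f"Estimated probability of sub-economic thickness: {risk:.3f}")
```

Drillhole spacing optimisation then amounts to tightening the spacing (reducing `sd`) until the estimated risk falls below the pre-specified level at the required confidence.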
Abstract:
An inherent incomputability in the specification of a functional language extension that combines assertions with dynamic type checking is isolated in an explicit derivation from mathematical specifications. The combination of types and assertions (into "dynamic assertion-types" - DATs) is a significant issue since, because the two are congruent means for program correctness, benefit arises from their better integration in contrast to the harm resulting from their unnecessary separation. However, projecting the "set membership" view of assertion-checking into dynamic types results in some incomputable combinations. Refinement of the specification of DAT checking into an implementation by rigorous application of mathematical identities becomes feasible through the addition of a "best-approximate" pseudo-equality that isolates the incomputable component of the specification. This formal treatment leads to an improved, more maintainable outcome with further development potential.
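The "set membership" view of dynamic assertion-types, and the need for a best-approximate answer where the check is incomputable, can be sketched informally. This is a loose Python illustration of the idea, not the paper's formal derivation; all names are invented, and a three-valued result (`True`/`False`/`None`) stands in for the best-approximate pseudo-equality.

```python
# Illustrative sketch: a "dynamic assertion-type" as a base type paired with a
# runtime predicate, checked as set membership. When the predicate itself
# cannot be computed (it raises), the checker returns None rather than
# diverging - a crude stand-in for the paper's "best-approximate" component.

def dat(base_type, predicate):
    """Build a membership check for {x : base_type | predicate(x)}."""
    def check(value):
        if not isinstance(value, base_type):
            return False          # wrong base type: definitely not a member
        try:
            return bool(predicate(value))
        except Exception:
            return None           # assertion incomputable here: approximate
    return check

positive_int = dat(int, lambda n: n > 0)
print(positive_int(3))      # True  - member of the assertion-type
print(positive_int(-1))     # False - right type, assertion fails
print(positive_int("3"))    # False - wrong base type
```

The point of the sketch is only that types and assertions check the same way once both are viewed as membership tests, with the incomputable cases isolated in one place.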
QUALITY OF LIFE AND COPING STRATEGIES OF WOMEN WITH AND WITHOUT LYMPHEDEMA AFTER BREAST CANCER
Abstract:
Lymphedema of the upper limb is a complication inherent to breast cancer treatment. Characterised by increased limb volume, it leads to physical and functional limitations and has a negative psychological and social impact. The objective of this study was to investigate quality of life and its domains, coping strategies in the face of breast cancer, and the correlation between these variables. The study was carried out at a women's health centre over four months. The assessment instruments were: a general and breast-cancer-specific characterisation questionnaire; perimetry of the upper limbs; the quality of life questionnaires of the European Organisation for Research and Treatment of Cancer, EORTC QLQ-30 and BR-23; and the Coping Strategies Inventory. Eighty-two women were interviewed, mean age 57.4 years (SD 12.3), all of whom had undergone unilateral breast surgery and axillary dissection, without metastasis. Lymphedema was present in 39.03% (32) and appears to interfere little with women's quality of life after breast cancer, with social functioning the most impaired domain. Symptoms related to chemotherapy and the breast bothered women in both groups, but arm-related symptoms were statistically higher in women with lymphedema. The strategies most used by the interviewees to cope with cancer were reappraisal, problem solving, escape, social support and self-control; only self-control was statistically higher in women with lymphedema. Problem-solving strategies, self-control and low social support may have contributed to the onset of lymphedema. It is concluded that the use of active and positive strategies to cope with breast cancer appears to result in good psychosocial adaptation.
Abstract:
Transnational Environmental Policy analyses a surprising success story in the field of international environmental policy making: the threat to the ozone layer posed by industrial chemicals, and how it has been averted. The book also raises the more general question about the problem-solving capacities of industrialised countries and the world society as a whole. Reiner Grundmann investigates the regulations which have been put in place at an international level, and how the process evolved over twenty years in the US and Germany.
Abstract:
This chapter demonstrates diversity in the activity of authorship and the corresponding diversity of forensic authorship analysis questions and techniques. Authorship is discussed in terms of Love’s (2002) multifunctional description of precursory, executive, declarative and revisionary authorship activities and the implications of this distinction for forensic problem solving. Four different authorship questions are considered. These are ‘How was the text produced?’, ‘How many people wrote the text?’, ‘What kind of person wrote the text?’ and ‘What is the relationship of a queried text with comparison texts?’ Different approaches to forensic authorship analysis are discussed in terms of their appropriateness to answering different authorship questions. The conclusion drawn is that no one technique will ever be appropriate to all problems.
Abstract:
The initial aim of this research was to investigate the application of expert systems, or knowledge-based systems technology, to the automated synthesis of Hazard and Operability Studies. Due to the generic nature of fault analysis problems and the way in which knowledge-based systems work, this goal evolved into a consideration of automated support for fault analysis in general, covering HAZOP, Fault Tree Analysis, FMEA and fault diagnosis in the process industries. This thesis describes a proposed architecture for such an expert system. The purpose of the system is to produce a descriptive model of faults and fault propagation from a description of the physical structure of the plant. From these descriptive models, the desired fault analysis may be produced. The way in which this is done reflects the complexity of the problem, which, in principle, encompasses the whole discipline of process engineering. An attempt is made to incorporate the perceived method that an expert uses to solve the problem; keywords, heuristics and guidelines from techniques such as HAZOP and fault tree synthesis are used. In a truly expert system, performance depends strongly on the quality of the knowledge incorporated. This expert knowledge takes the form of heuristics, or rules of thumb, used in problem solving. This research has shown that, for the application of fault analysis heuristics, it is necessary to have a representation of the details of fault propagation within a process. This helps to ensure the robustness of the system: a gradual rather than abrupt degradation at the boundaries of the domain knowledge.
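A descriptive model of fault propagation of the kind the abstract mentions can be caricatured as a transitive closure over cause-effect rules: deviations propagate along plant connections until no new deviations appear. The rules and plant items below are invented for illustration; the thesis's actual representation is far richer.

```python
# Illustrative rule-based fault propagation (not the thesis's architecture).
# Each rule maps a (unit, deviation) cause to its downstream effects.
RULES = {
    ("pump", "no flow"): [("pipe", "no flow")],
    ("pipe", "no flow"): [("reactor", "low level")],
    ("reactor", "low level"): [("reactor", "high temperature")],
}

def propagate(initial_faults):
    """Return the closure of deviations reachable from the initial faults."""
    known = set(initial_faults)
    frontier = list(initial_faults)
    while frontier:
        fault = frontier.pop()
        for effect in RULES.get(fault, []):
            if effect not in known:
                known.add(effect)
                frontier.append(effect)
    return known

faults = propagate([("pump", "no flow")])
print(sorted(faults))
```

A HAZOP-style analysis then reads off each reachable (unit, deviation) pair as a consequence of the initiating fault; robustness comes from the system answering "no further effects known" at the edge of its rule base rather than failing outright.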
Abstract:
In the quest to secure the much vaunted benefits of North Sea oil, highly non-incremental technologies have been adopted. Nowhere is this more the case than with the early fields of the central and northern North Sea. By focusing on the inflexible nature of North Sea hardware in such fields, this thesis examines the problems that this sort of technology might pose for policy making. More particularly, the following issues are raised. First, the implications of non-incremental technical change for the successful conduct of oil policy are raised. Here, the focus is on the micro-economic performance of the first generation of North Sea oil fields and the manner in which this relates to government policy. Secondly, the question is posed as to whether there were more flexible, perhaps more incremental, policy alternatives open to the decision makers. Conclusions drawn relate to the degree to which non-incremental shifts in policy permit decision makers to achieve their objectives at relatively low cost. To discover cases where non-incremental policy making has led to success in this way would be to falsify the thesis that decision makers are best served by employing incremental politics as an approach to complex problem solving.
Abstract:
This thesis offers a methodology for studying and designing effective communication mechanisms in human activities. The methodology is focused on the management of complexity. It is argued that complexity is not something objective that can be worked out analytically, but something subjective that depends on the viewpoint. It is also argued that while certain social contexts may inhibit the viewpoint's capabilities to deal with complexity, others may enhance them. Certain organisation structures are more likely than others to allow individuals to release their potential; hence the relevance of studying and designing effective organisations. The first part of the thesis offers a 'cybernetic methodology' for problem solving in human activities; the second offers a 'method' to study and design organisations. The cybernetic methodology discussed in this work is rooted in second-order cybernetics, or the cybernetics of observing systems (Von Foerster 1979, Maturana and Varela 1980). Its main tenet is that the known properties of the real world reside in the individual and not in the world itself. This view, which places emphasis on an appreciation of reality that is by nature one-sided and unilateral, triggers the need for dialogue and conversation to construct it. The 'method' to study and design organisations is based on Beer's Viable System Model (Beer 1979, 1981, 1985). This model permits us to assess how successful an organisation is in coping with its environmental complexity and, moreover, to establish how to make its responses to this complexity more effective. These features of the model are of great significance in a world where complexity is perceived to be growing at an unthinkable pace. But 'seeing' these features of the model assumes an effective appreciation of organisational complexity; hence the need for the methodological discussions offered in the first part of the thesis.