37 results for Problem solving method
Abstract:
The concept of “working” memory is traceable back to nineteenth century theorists (Baldwin, 1894; James, 1890) but the term itself was not used until the mid-twentieth century (Miller, Galanter & Pribram, 1960). A variety of different explanatory constructs have since evolved which all make use of the working memory label (Miyake & Shah, 1999). This history is briefly reviewed and alternative formulations of working memory (as language-processor, executive attention, and global workspace) are considered as potential mechanisms for cognitive change within and between individuals and between species. A means, derived from the literature on human problem-solving (Newell & Simon, 1972), of tracing memory and computational demands across a single task is described and applied to two specific examples of tool-use by chimpanzees and early hominids. The examples show how specific proposals for necessary and/or sufficient computational and memory requirements can be more rigorously assessed on a task-by-task basis. General difficulties in connecting cognitive theories (arising from the observed capabilities of individuals deprived of material support) with archaeological data (primarily remnants of material culture) are discussed.
Abstract:
Productivity growth is conventionally measured by indices representing discrete approximations of the Divisia TFP index under the assumption that technological change is Hicks-neutral. When this assumption is violated, these indices are no longer meaningful because they conflate the effects of factor accumulation and technological change. We propose a way of adjusting the conventional TFP index that solves this problem. The method adopts a latent variable approach to the measurement of technical change biases that provides a simple means of correcting product and factor shares in the standard Tornqvist-Theil TFP index. An application to UK agriculture over the period 1953-2000 demonstrates that technical progress is strongly biased. The implications of that bias for productivity measurement are shown to be very large, with the conventional TFP index severely underestimating productivity growth. The result is explained primarily by the fact that technological change has favoured the rapidly accumulating factors over labour, the factor leaving the sector.
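For concreteness, this is the standard (unadjusted) Tornqvist-Theil TFP growth formula that the paper's latent-variable correction starts from: the log index change is a share-weighted sum of output growth rates minus a share-weighted sum of input growth rates. The function name and the one-output, one-input example are illustrative, not from the paper.

```python
import math

def tornqvist_tfp_growth(outputs_prev, outputs_curr, rev_shares_prev, rev_shares_curr,
                         inputs_prev, inputs_curr, cost_shares_prev, cost_shares_curr):
    """Log change in the Tornqvist-Theil TFP index between two periods.

    Output growth is weighted by averaged revenue shares, input growth by
    averaged cost shares, following the standard discrete Divisia approximation.
    """
    out_growth = sum(0.5 * (r0 + r1) * math.log(y1 / y0)
                     for y0, y1, r0, r1 in zip(outputs_prev, outputs_curr,
                                               rev_shares_prev, rev_shares_curr))
    in_growth = sum(0.5 * (s0 + s1) * math.log(x1 / x0)
                    for x0, x1, s0, s1 in zip(inputs_prev, inputs_curr,
                                              cost_shares_prev, cost_shares_curr))
    return out_growth - in_growth

# one output growing 2%, one input growing 1% -> TFP growth of about 1%
growth = tornqvist_tfp_growth([100.0], [102.0], [1.0], [1.0],
                              [50.0], [50.5], [1.0], [1.0])
```

The paper's adjustment replaces the observed shares with bias-corrected ones; the index formula itself is unchanged.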
Abstract:
The disuse hypothesis of cognitive aging attributes decrements in fluid intelligence in older adults to reduced cognitively stimulating activity. This study experimentally tested the hypothesis that a period of increased mentally stimulating activity would therefore enhance older adults' fluid intelligence performance. Participants (N = 44, mean age 67.82) were administered pre- and post-test measures, including the fluid intelligence measure, Cattell's Culture Fair (CCF) test. Experimental participants engaged in diverse, novel, mentally stimulating activities for 10-12 weeks and were compared to a control condition. Results supported the hypothesis; the experimental group showed greater pre- to post-CCF gain than did controls (effect size d = 0.56), with a similar gain on a spatial-perceptual task (WAIS-R Blocks). Even brief periods of increased cognitive stimulation can improve older adults' problem solving and flexible thinking.
Abstract:
Williams syndrome (WS) is characterized by apparent relative strengths in language, facial processing and social cognition but by profound impairment in spatial cognition, planning and problem solving. Following recent research which suggests that individuals with WS may be less linguistically able than was once thought, in this paper we begin to investigate why and how they may give the impression of linguistic proficiency despite poor standardized test results. This case study of Brendan, a 12-year-old boy with WS, who presents with a considerable lack of linguistic ability, suggests that impressions of linguistic competence may to some extent be the result of conversational strategies which enable him to compensate for various cognitive and linguistic deficits with a considerable degree of success. These conversational strengths are not predicted by his standardized language test results, and provide compelling support for the use of approaches such as Conversation Analysis in the assessment of individuals with communication impairments.
Abstract:
An enterprise is viewed as a complex system which can be engineered to accomplish organisational objectives. Systems analysis and modelling enable the planning and development of the enterprise and its IT systems. Many IT systems design methods focus on the functional and non-functional requirements of the IT systems; most handle one well but leave out the other. Analysing and modelling both business and IT systems may therefore have to call on techniques from different suites of methods, which may rest on different philosophical and methodological underpinnings, so coherence and consistency between the analyses are hard to ensure. This paper introduces the Problem Articulation Method (PAM), which facilitates the design of the enterprise system infrastructure on which an IT system is built. Outcomes of this analysis represent requirements which can be further used for planning and designing a technical system. As a case study, a finance system, Agresso, for e-procurement is used to illustrate the applicability of PAM in modelling complex systems.
Abstract:
Procurement is one of the major business operations in the public service sector. The advance of information and communication technology (ICT) pushes this business operation to increase its efficiency and foster collaboration between the organisation and its suppliers. This leads to a shift from traditional procurement transactions to an e-procurement paradigm. Such a change affects business processes, information management and decision making. E-procurement involves various stakeholders who engage in activities based on different social and cultural practices; the design of an e-procurement system may therefore require the analysis of complex situations. This paper describes an approach that uses the problem articulation method to support such analysis, applied to a case study from the UAE.
Abstract:
Requirements analysis focuses on stakeholders' concerns and their influence on e-government systems. These concerns are often complex and conflicting, which raises a number of questions for requirements analysis: how are the requirements relevant to stakeholders? What are their needs? How can conflicts among the different stakeholders be resolved? And how can coherent requirements be methodically produced? This paper describes the problem articulation method in organizational semiotics, which can be used to conduct such complex requirements analysis. The outcomes of the analysis enable e-government systems development and management to meet users' needs. A case study of the Yantai Citizen Card is chosen to illustrate the process of analysing stakeholders across the lifecycle of requirements analysis.
Abstract:
A finite-difference scheme based on flux difference splitting is presented for the solution of the two-dimensional shallow-water equations of ideal fluid flow. A linearised problem, analogous to that of Riemann for gasdynamics, is defined and a scheme, based on numerical characteristic decomposition, is presented for obtaining approximate solutions to the linearised problem. The method of upwind differencing is used for the resulting scalar problems, together with a flux limiter for obtaining a second-order scheme which avoids non-physical, spurious oscillations. An extension to the two-dimensional equations with source terms, is included. The scheme is applied to a dam-break problem with cylindrical symmetry.
Abstract:
A finite difference scheme based on flux difference splitting is presented for the solution of the one-dimensional shallow water equations in open channels. A linearised problem, analogous to that of Riemann for gas dynamics, is defined and a scheme, based on numerical characteristic decomposition, is presented for obtaining approximate solutions to the linearised problem. The method of upwind differencing is used for the resulting scalar problems, together with a flux limiter for obtaining a second-order scheme which avoids non-physical, spurious oscillations. The scheme is applied to a problem of flow in a river whose geometry induces a region of supercritical flow.
Abstract:
A finite difference scheme based on flux difference splitting is presented for the solution of the one-dimensional shallow-water equations in open channels, together with an extension to two-dimensional flows. A linearized problem, analogous to that of Riemann for gas dynamics, is defined and a scheme, based on numerical characteristic decomposition, is presented for obtaining approximate solutions to the linearized problem. The method of upwind differencing is used for the resulting scalar problems, together with a flux limiter for obtaining a second-order scheme which avoids non-physical, spurious oscillations. The scheme is applied to a one-dimensional dam-break problem, and to a problem of flow in a river whose geometry induces a region of supercritical flow. The scheme is also applied to a two-dimensional dam-break problem. The numerical results are compared with the exact solution, or other numerical results, where available.
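The three abstracts above share the same numerical ingredients: upwind differencing plus a flux limiter, giving second-order accuracy without spurious oscillations near discontinuities. The mechanism can be sketched on the simplest hyperbolic model, linear advection of a step profile; this is an illustration of the limiter idea under assumed parameters, not the papers' shallow-water Riemann scheme.

```python
import numpy as np

def minmod(a, b):
    """Minmod limiter: the smaller-magnitude slope, or zero at extrema."""
    return np.where(a * b > 0.0, np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

def advect_step(u, c):
    """One step of a flux-limited upwind scheme for u_t + a*u_x = 0 (a > 0),
    with Courant number c = a*dt/dx and periodic boundaries."""
    du = u - np.roll(u, 1)               # backward differences u_i - u_{i-1}
    slope = minmod(du, np.roll(du, -1))  # limited slope in each cell
    # upwind flux plus a limited antidiffusive correction:
    # second order where the solution is smooth, first order at the front
    flux = u + 0.5 * (1.0 - c) * slope
    return u - c * (flux - np.roll(flux, 1))

u = np.where(np.arange(200) < 100, 1.0, 0.0)  # step (dam-break-like) profile
for _ in range(50):
    u = advect_step(u, 0.5)
```

Because the limited scheme is total-variation diminishing, the advected step stays within its initial bounds: no overshoots or undershoots appear at the front, which is exactly the "avoids non-physical, spurious oscillations" property the abstracts claim for the shallow-water case.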
Abstract:
In industrial practice, constrained steady state optimisation and predictive control are separate, albeit closely related functions within the control hierarchy. This paper presents a method which integrates predictive control with on-line optimisation with economic objectives. A receding horizon optimal control problem is formulated using linear state space models. This optimal control problem is very similar to the one presented in many predictive control formulations, but the main difference is that it includes in its formulation a general steady state objective depending on the magnitudes of manipulated and measured output variables. This steady state objective may include the standard quadratic regulatory objective, together with economic objectives which are often linear. Assuming that the system settles to a steady state operating point under receding horizon control, conditions are given for the satisfaction of the necessary optimality conditions of the steady-state optimisation problem. The method is based on adaptive linear state space models, which are obtained by using on-line identification techniques. The use of model adaptation is justified from a theoretical standpoint and its beneficial effects are shown in simulations. The method is tested with simulations of an industrial distillation column and a system of chemical reactors.
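The receding-horizon structure described above can be caricatured in a few lines: at each step an optimisation over a prediction horizon (quadratic regulation plus a linear economic term on the input) is solved, only the first move is applied, and the problem is re-solved from the next measured state. Everything here, including the scalar plant, the weights, and the crude grid-search "solver", is an illustrative stand-in for the paper's state-space formulation, not its actual algorithm.

```python
import numpy as np

# Assumed scalar plant x+ = a*x + b*u, prediction horizon N
a, b, N = 0.9, 0.5, 10
q, r = 1.0, 0.1   # regulatory (quadratic) weights
p = 0.2           # linear "economic" price on the manipulated variable

def horizon_cost(x0, u_seq):
    """Quadratic regulation cost plus a linear economic term over the horizon."""
    x, cost = x0, 0.0
    for u in u_seq:
        cost += q * x**2 + r * u**2 + p * u
        x = a * x + b * u
    return cost + q * x**2  # terminal penalty

def receding_horizon_input(x0, grid=np.linspace(-2.0, 2.0, 81)):
    """Crude solver: best constant input over the horizon (a stand-in for the
    QP); only this first move is applied before re-solving."""
    return min(grid, key=lambda u: horizon_cost(x0, [u] * N))

x = 5.0
for _ in range(40):                       # closed loop: re-solve every step
    x = a * x + b * receding_horizon_input(x)
```

With the economic term included in the horizon objective, the closed loop settles to a steady state that trades off the regulatory and economic costs, which is the integration the paper formalises and for which it gives optimality conditions.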
Abstract:
In his book Democratic Authority, David Estlund puts forward a case for democracy, which he labels epistemic proceduralism, that relies on democracy's ability to produce good – that is, substantively just – results. Alongside this case for democracy Estlund attacks what he labels ‘utopophobia’, an aversion to idealistic political theory. In this article I make two points. The first is a general point about what the correct level of ‘idealisation’ is in political theory. Various debates are emerging on this question and, to the extent that they are focused on ‘political theory’ as a whole, I argue, they are flawed. This is because there are different kinds of political concept, and they require different kinds of ideal. My second point is about democracy in particular. If we understand democracy as Estlund does, then we should see it as a problem-solving concept – the problem being that we need coercive institutions and rules, but we do not know what justice requires. As democracy is a response to a problem, we should not allow our theories of it, even at the ideal level, to be too idealised – they must be embedded in the nature of the problem they are to solve, and the beings that have it.
Abstract:
The accurate prediction of the biochemical function of a protein is becoming increasingly important, given the unprecedented growth of both structural and sequence databanks. Consequently, computational methods are required to analyse such data in an automated manner to ensure genomes are annotated accurately. Protein structure prediction methods, for example, are capable of generating approximate structural models on a genome-wide scale. However, the detection of functionally important regions in such crude models, as well as structural genomics targets, remains an extremely important problem. The method described in the current study, MetSite, represents a fully automatic approach for the detection of metal-binding residue clusters applicable to protein models of moderate quality. The method involves using sequence profile information in combination with approximate structural data. Several neural network classifiers are shown to be able to distinguish metal sites from non-sites with a mean accuracy of 94.5%. The method was demonstrated to identify metal-binding sites correctly in LiveBench targets where no obvious metal-binding sequence motifs were detectable using InterPro. Accurate detection of metal sites was shown to be feasible for low-resolution predicted structures generated using mGenTHREADER where no side-chain information was available. High-scoring predictions were observed for a recently solved hypothetical protein from Haemophilus influenzae, indicating a putative metal-binding site.
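As a toy illustration of the classification step only (not MetSite itself, whose networks, features, and training data are not reproduced here), a single-layer logistic classifier can separate "site" from "non-site" residues given informative features; the two synthetic features below merely stand in for the sequence-profile and structural inputs the method combines.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for per-residue features (e.g. profile conservation,
# local residue geometry): sites and non-sites drawn from separated Gaussians.
X_site = rng.normal(loc=1.0, scale=0.5, size=(200, 2))
X_non = rng.normal(loc=-1.0, scale=0.5, size=(200, 2))
X = np.vstack([X_site, X_non])
y = np.array([1] * 200 + [0] * 200)

# Single-layer network (logistic regression) trained by gradient descent.
w, b = np.zeros(2), 0.0
for _ in range(500):
    prob = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid output
    w -= 0.5 * (X.T @ (prob - y)) / len(y)      # cross-entropy gradient step
    b -= 0.5 * float(np.mean(prob - y))

accuracy = float(np.mean(((X @ w + b) > 0) == (y == 1)))
```

On real data the separation is far less clean, which is why the paper trains several neural network classifiers on profile plus approximate structural information rather than a single linear unit.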
Abstract:
Active learning plays a strong role in mathematics and statistics, and formative problems are vital for developing key problem-solving skills. To keep students engaged and help them master the fundamentals before challenging themselves further, we have developed a system for delivering problems tailored to a student's current level of understanding. Specifically, by adapting simple methodology from clinical trials, a framework for delivering existing problems and other illustrative material has been developed, making use of macros in Excel. The problems are assigned a level of difficulty (a 'dose'), and problems are presented to the student in an order depending on their ability, i.e. based on their performance so far on other problems. We demonstrate and discuss the application of the approach with formative examples developed for a first year course on plane coordinate geometry, and also for problems centred on the topic of chi-square tests.
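The dose-escalation analogy can be sketched as a simple up-and-down rule: a correct answer raises the difficulty "dose" one level, an error lowers it, so the sequence hovers around the student's current ability. This is an assumed reading of the mechanism, not the authors' Excel implementation.

```python
def next_level(level, correct, n_levels=5):
    """Up-and-down rule borrowed from dose-escalation trials:
    one step harder after a correct answer, one step easier after an error,
    clipped to the available difficulty levels 1..n_levels."""
    if correct:
        return min(level + 1, n_levels)
    return max(level - 1, 1)

# Simulate a student who reliably solves problems up to level 3:
level, history = 1, []
for answer in [True, True, True, False, True, False]:
    history.append(level)             # difficulty presented this round
    level = next_level(level, answer)
# history == [1, 2, 3, 4, 3, 4]: the sequence oscillates around level 3-4
```

The oscillation around the student's threshold is the desired behaviour: problems stay challenging but rarely jump far beyond what the student can do.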
Abstract:
The construction field is dynamic and dominated by complex, ill-defined problems for which myriad possible solutions exist. Teaching students to solve construction-related problems requires an understanding of the nature of these complex problems as well as the implementation of effective instructional strategies to address them. Traditional approaches to teaching construction planning and management have long been criticized for presenting students primarily with well-defined problems - an approach inconsistent with the challenges encountered in the industry. However, growing evidence suggests that employing innovative teaching approaches, such as interactive simulation games, offers more active, hands-on and problem-based learning opportunities for students to synthesize and test acquired knowledge in settings more closely aligned with real-life construction scenarios. Simulation games have demonstrated educational value in increasing student problem-solving skills and motivation through critical attributes such as interaction and feedback-supported active learning. Nevertheless, broad acceptance of simulation games in construction engineering education remains limited. While recognizing their benefits, research focused on the role of simulation games in educational settings lacks a unified approach to developing, implementing and evaluating these games. To address this gap, this paper provides an overview of the challenges associated with evaluating the effectiveness of simulation games in construction education that still impede their wide adoption. An overview of the current status, as well as the results from the recently implemented Virtual Construction Simulator (VCS) game at Penn State, provides lessons learned and is intended to guide future efforts in developing interactive simulation games that reach their full potential.