951 results for puzzle difficulty
Abstract:
Classification is a basic cognitive process, and categorization is an important way for human beings to make sense of the world. Because categories are organized hierarchically, they allow information to be processed efficiently. For these reasons, the development of classification ability has long been a focus of developmental psychology. Using spontaneous and trained classification of both familiar stimulus materials and artificial concepts, this research explored the classification criteria of 4- to 6-year-old children, and used the artificial concept system formed in these classification experiments to analyze how well these young children had mastered class hierarchy. The main results and conclusions are: 1) Classification ability increases quickly among kindergarteners from age 4 to 6: the 4-year-olds seemed unable to classify objects by categorical criteria, whereas the 6-year-olds showed this ability under many experimental conditions. Even the 6-year-olds, however, classified mainly on the basis of functional relations among objects rather than conceptual relations, and their criteria were inconsistent, being easily affected by experimental conditions. 2) Age 5 is a particularly sensitive period in the development of classification ability: the classification ability of 5-year-olds was easily enhanced by training, suggesting that the zone of proximal development for category-based classification lies in this age period. 3) Knowledge is an important factor affecting young children's classification ability, and their classification activity is also constrained by cognitive processing ability: children exhibited different classification abilities depending on their understanding of the stimulus materials, and kindergarteners of different ages differed significantly in classification ability, reflecting differences in cognitive processing, even when they had the same knowledge about the materials. 4) The different properties of class hierarchy differ in difficulty for young children. The 5- to 6-year-olds showed that they could master the transitivity of the class hierarchy: under every learning condition they answered most transitivity questions correctly and could infer a property of a sub-class from that of its super-class. They had also mastered the branching property of class hierarchy at a relatively high level, although their answers were easily swayed by hints in the questions. The asymmetry of class hierarchy, however, appeared difficult for them to learn: because they could not understand the class-inclusion relation, they frequently drew wrong conclusions about the super-class from the sub-class in their classifications.
Abstract:
Transfer of learning is one of the major concepts in educational psychology. As cognitive psychology has developed, many researchers have found that transfer plays an important part in problem solving and that awareness of the similarity of related problems is important to transfer, so interest in transfer research has grown. The literature, however, shows that researchers do not agree about how awareness of related problems influences problem-solving transfer. This dissertation rests on a body of preparatory work, including a review of the literature on transfer in problem solving, a comparison of recent research results, and experimental studies. The author takes middle school students as subjects and geometry problems as materials, and adopts a factorial design. The influence of awareness of related problems on problem-solving transfer is examined along three dimensions: the difficulty of the transfer problems, the level of awareness of related problems, and the characteristics of the subjects themselves. Five conclusions were drawn from the experimental research: (1) In geometry problem solving, the level of awareness of related problems is one of the major factors influencing the effect of problem-solving transfer. (2) Transfer problems that are either too difficult or too easy weaken the influence of awareness of related problems, and problem difficulty interacts with the level of awareness in affecting transfer. (3) In geometry problem-solving transfer, the level of awareness of related problems interacts with student achievement: the influence of the level of awareness is greater for higher-achieving students than for lower-achieving ones. (4) Geometry achievement correlates positively with the reasoning ability of middle school students: students with higher reasoning ability have higher geometry achievement, and when the level of awareness is raised, the transfer achievement of both groups improves significantly. (5) Geometry achievement also correlates positively with cognitive style: students with a field-independent cognitive style have higher geometry achievement, and when the level of awareness is raised, the transfer achievement of both groups improves significantly. The dissertation closes with two proposals for geometry teaching based on these findings.
Abstract:
In language comprehension, human beings build and update representations of the spatial distances and spatial relations between a protagonist and the things around him or her. The representation of spatial relations in egocentric spatial situation models is important in spatial cognition, narrative comprehension and psycholinguistics. Using the imagery searching paradigm, Franklin and Tversky (1990) studied the representation of spatial relations in egocentric spatial situation models and found a standard reaction-time pattern for searching objects in different directions around the observer (front
Abstract:
As a key issue in research on spatial cognitive development, the coding of object location plays an important role in children's cognitive development. The development of location coding is a precondition for children's adaptation to their environments, and developing the corresponding ability can enhance children's adaptability and overall competence. In this paper, using an improved object-searching paradigm, 7-, 9- and 11-year-old urban primary school students took part in two studies comprising a total of four experiments. The children were tested on their ability to encode a target location in terms of the distance between two landmarks, three points on a line, the intersection of two lines, or corresponding points on two parallel lines. The experiments were designed to explore primary school children's cognitive development in coding spatial object location and the factors that constrain it. The following conclusions were drawn: 1) The ability of 7-year-olds to represent target location in terms of relationships among points and lines is at an inceptive stage and appears unstable. The same ability in 9-year-olds is developing rapidly; their performance depends on task difficulty, stable on easy tasks but unstable on difficult ones. The ability of 11-year-olds is well developed, and their performance is independent of task difficulty. 2) The correlation between Raven Standard Progressive Matrices scores and performance in representing target location in terms of relationships among points is significant: children who perform well on the Raven test also perform well in target location coding, and this holds for all age groups. For the tasks based on relationships among lines, the correlation between Raven scores and performance is significant only for the 7-year-olds, not for the 9- and 11-year-olds. The correlation between overall performance and Raven scores is significant for all age groups. 3) Task variables affect children's performance on these tasks, and the effects differ with task difficulty and with age. 4) Children who failed when no encoding cues were given were able to improve their performance when cues were provided; primary school children's performance on these tasks can therefore be improved by providing encoding cues. 5) Two kinds of efficient strategies were used to solve the problem: a trial-comparison strategy and an anticipation-directed strategy.
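One of the coding conditions above, locating a target at the intersection of two lines through landmark pairs, can be sketched geometrically. The function and the coordinates below are hypothetical illustrations, not taken from the study:

```python
def line_intersection(p1, p2, p3, p4):
    """Point where the line through p1-p2 crosses the line through p3-p4
    (None if the lines are parallel). Illustrates the 'intersection of two
    lines' coding condition: the target location is fully recoverable from
    the four landmark points alone."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p1, p2, p3, p4
    d = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if d == 0:
        return None  # parallel lines: no unique intersection
    a = x1 * y2 - y1 * x2
    b = x3 * y4 - y3 * x4
    return ((a * (x3 - x4) - (x1 - x2) * b) / d,
            (a * (y3 - y4) - (y1 - y2) * b) / d)

# Two crossing diagonals of a square meet at its centre
print(line_intersection((0, 0), (2, 2), (0, 2), (2, 0)))  # (1.0, 1.0)
```

A child who has mastered this coding can, in effect, reconstruct the hidden target from the landmarks; the other conditions (midpoint between two landmarks, collinear points) are simpler special cases.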
Abstract:
A newly developed experimental model, simulation of a real mission, was used to explore the laws of time perception and user endurance for feedback delay in network-supported cooperative work. Several non-technological factors influencing time perception and user endurance (mission type, difficulty level, feedback method, partner type, gender and Type A behavior pattern) were also examined. The results showed that: (1) When waiting without feedback, mission type and difficulty level had significant main effects on judgments of waiting duration. People will wait longer for a partner's feedback if they perceive the partner's task to be difficult, and the longest waiting duration (LWD) for the computation mission was longer than for the proof-searching mission. (2) When waiting with feedback, the experimental data supported Vierordt's law: short durations were underestimated, long durations were overestimated, and only durations in a proper range (2-6 seconds) were estimated correctly. This proper range varied with the difficulty level of the mission, and the longer the waiting duration, the larger the estimation error. Partner type had no significant effect on this pattern of time perception. (3) When waiting with feedback, non-technological factors significantly affected users' endurance. When subjects were told their partner was human, mission type and difficulty level significantly affected endurance; when subjects were told their partner was a computer, Type A behavior pattern and difficulty level significantly affected endurance. A two-way interaction between Type A behavior pattern and gender was also detected.
Abstract:
The computer science technique of computational complexity analysis can provide powerful insights into the algorithm-neutral analysis of information processing tasks. Here we show that a simple, theory-neutral linguistic model of syntactic agreement and ambiguity demonstrates that natural language parsing may be computationally intractable. Significantly, we show that it may be syntactic features rather than rules that cause this difficulty. Informally, human languages and the computationally intractable Satisfiability (SAT) problem share two costly computational mechanisms: both enforce agreement among symbols across unbounded distances (subject-verb agreement) and both allow ambiguity (is a word a noun or a verb?).
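The parallel drawn in this abstract can be made concrete with a toy sketch: each word is ambiguous between feature readings (like a Boolean variable), and agreement must hold across the whole sentence (like satisfying SAT clauses), so a naive parser searches an exponential space of readings. The lexicon and the single agreement constraint below are hypothetical simplifications, not the paper's model:

```python
from itertools import product

# Toy lexicon: each word maps to its possible (category, number) readings.
lexicon = {
    "sheep": {("N", "sg"), ("N", "pl")},   # ambiguous in number
    "run":   {("V", "pl")},
    "runs":  {("V", "sg")},
}

def parses(sentence):
    """Enumerate readings in which every word carries the same number
    feature (a crude stand-in for subject-verb agreement). Brute force
    over all combinations: exponential in sentence length, mirroring
    the worst-case cost of Satisfiability."""
    options = [lexicon[w] for w in sentence]
    good = []
    for reading in product(*options):
        numbers = {num for cat, num in reading}
        if len(numbers) == 1:  # agreement holds across the sentence
            good.append(reading)
    return good

print(parses(["sheep", "run"]))  # only the plural reading survives
```

The ambiguity of "sheep" plays the role of an unassigned variable, and agreement with "run" forces its value, which is exactly the constraint-propagation flavor of SAT.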
Abstract:
The task in text retrieval is to find the subset of a collection of documents relevant to a user's information request, usually expressed as a set of words. Classically, documents and queries are represented as vectors of word counts. In its simplest form, relevance is defined as the dot product between a document vector and a query vector, a measure of the number of common terms. A central difficulty in text retrieval is that the presence or absence of a word is not sufficient to determine relevance to a query. Linear dimensionality reduction has been proposed as a technique for extracting underlying structure from the document collection. In some domains (such as vision) dimensionality reduction reduces computational complexity; in text retrieval it is more often used to improve retrieval performance. We propose an alternative and novel technique that produces sparse representations constructed from sets of highly related words. Documents and queries are represented by their distance to these sets, and relevance is measured by the number of common clusters. This technique significantly improves retrieval performance, is efficient to compute, and shares properties with the optimal linear projection operator and the independent components of documents.
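The two relevance measures contrasted above can be sketched side by side. This is a simplified illustration (binary cluster membership rather than the paper's distance-to-set representation), and the word clusters are hypothetical:

```python
from collections import Counter

def dot_product_relevance(doc_words, query_words):
    """Classical relevance: dot product of word-count vectors,
    i.e. a weighted count of literal terms shared by document and query."""
    d, q = Counter(doc_words), Counter(query_words)
    return sum(d[w] * q[w] for w in q)

def cluster_relevance(doc_words, query_words, clusters):
    """Cluster-based relevance: represent each text by the set of word
    clusters it touches; relevance = number of clusters in common."""
    def active(words):
        return {i for i, c in enumerate(clusters) if c & set(words)}
    return len(active(doc_words) & active(query_words))

# Hypothetical clusters of highly related words
clusters = [{"car", "auto", "vehicle"}, {"fast", "quick", "speedy"}]
doc = ["the", "auto", "was", "quick"]
query = ["fast", "car"]
print(dot_product_relevance(doc, query))          # 0: no literal terms shared
print(cluster_relevance(doc, query, clusters))    # 2: both clusters shared
```

The example shows the central difficulty the abstract names: the dot product misses a clearly relevant document because no exact word matches, while the cluster representation recovers the match through related words.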
Abstract:
This paper investigates how people return to information in a dynamic information environment. For example, a person might want to return to Web content via a link encountered earlier on a Web page, only to learn that the link has since been removed. Changes can benefit users by providing new information, but they hinder returning to previously viewed information. The observational study presented here analyzed instances, collected via a Web search, where people expressed difficulty re-finding information because of changes to the information or its environment. A number of interesting observations arose from this analysis, including that the path originally taken to get to the information target appeared important in its re-retrieval, whereas, surprisingly, the temporal aspects of when the information was seen before were not. While people expressed frustration when problems arose, an explanation of why the change had occurred was often sufficient to allay that frustration, even in the absence of a solution. The implications of these observations for systems that support re-finding in dynamic environments are discussed.
Abstract:
We describe a program called SketchIT capable of producing multiple families of designs from a single sketch. The program is given a rough sketch (drawn using line segments for part faces and icons for springs and kinematic joints) and a description of the desired behavior. The sketch is "rough" in the sense that taken literally, it may not work. From this single, perhaps flawed sketch and the behavior description, the program produces an entire family of working designs. The program also produces design variants, each of which is itself a family of designs. SketchIT represents each family of designs with a "behavior ensuring parametric model" (BEP-Model), a parametric model augmented with a set of constraints that ensure the geometry provides the desired behavior. The construction of the BEP-Model from the sketch and behavior description is the primary task and source of difficulty in this undertaking. SketchIT begins by abstracting the sketch to produce a qualitative configuration space (qc-space) which it then uses as its primary representation of behavior. SketchIT modifies this initial qc-space until qualitative simulation verifies that it produces the desired behavior. SketchIT's task is then to find geometries that implement this qc-space. It does this using a library of qc-space fragments. Each fragment is a piece of parametric geometry with a set of constraints that ensure the geometry implements a specific kind of boundary (qcs-curve) in qc-space. SketchIT assembles the fragments to produce the BEP-Model. SketchIT produces design variants by mapping the qc-space to multiple implementations, and by transforming rotating parts to translating parts and vice versa.
Abstract:
Methods for fusing two computer vision methods are discussed, and several example algorithms are presented to illustrate the variational method of fusing algorithms. The example algorithms seek to determine planet topography given two images taken from two different locations under two different lighting conditions. Each algorithm employs a single cost function that combines the computer vision methods of shape-from-shading and stereo in a different way. The algorithms are closely coupled and take into account all the constraints of the photo-topography problem. They are run on four synthetic test image sets of varying difficulty.
Abstract:
The aim of this thesis was to explore the design of interactive computer learning environments. The particular learning domain selected was Newtonian dynamics. Newtonian dynamics was chosen because it is an important area of physics with which many students have difficulty and because controlling Newtonian motion takes advantage of the computer's graphics and interactive capabilities. The learning environment involved games which simulated the motion of a spaceship on a display screen. The purpose of the games was to focus the students' attention on various aspects of the implications of Newton's laws.
Abstract:
As part of a larger research project in musical structure, a program has been written which "reads" scores encoded in an input language isomorphic to music notation. The program is believed to be the first of its kind. From a small number of parsing rules the program derives complex configurations, each of which is associated with a set of reference points in a numerical representation of a time-continuum. The logical structure of the program is such that all and only the defined classes of events are represented in the output. Because the basis of the program is syntactic (in the sense that parsing operations are performed on formal structures in the input string), many extensions and refinements can be made without excessive difficulty. The program can be applied to any music which can be represented in the input language. At present, however, it constitutes the first stage in the development of a set of analytic tools for the study of so-called atonal music, the revolutionary and little understood music which has exerted a decisive influence upon contemporary practice of the art. The program and the approach to automatic data-structuring may be of interest to linguists and scholars in other fields concerned with basic studies of complex structures produced by human beings.
Abstract:
SIN and SOLDIER are heuristic programs written in LISP that solve symbolic integration problems. SIN (Symbolic INtegrator) solves indefinite integration problems at a level of difficulty approaching that of the larger integral tables. SIN contains several more methods than the previous symbolic integration program SAINT, and solves most of the problems attempted by SAINT in less than one second. SOLDIER (SOLution of Ordinary Differential Equations Routine) solves first-order, first-degree ordinary differential equations at the level of a good college sophomore, averaging about five seconds per problem attempted. The differences in philosophy and operation between SAINT and SIN are described, and suggestions for extending the work are made.
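Part of SIN's approach was to match integrands against known forms, much as one consults an integral table. A minimal, hypothetical table-driven integrator in the same spirit (expressions as strings, only a few patterns; SIN itself had many further methods for when no table entry applies):

```python
import re

# Table of (integrand pattern, rule producing the antiderivative).
# These three entries are illustrative, not SIN's actual method table.
TABLE = [
    (re.compile(r"^x\^(\d+)$"),
     lambda m: f"x^{int(m.group(1)) + 1}/{int(m.group(1)) + 1}"),  # power rule
    (re.compile(r"^sin\(x\)$"), lambda m: "-cos(x)"),
    (re.compile(r"^exp\(x\)$"), lambda m: "exp(x)"),
]

def integrate(expr):
    """Return an antiderivative of expr if some table entry matches."""
    for pattern, rule in TABLE:
        m = pattern.match(expr)
        if m:
            return rule(m)
    return None  # no table entry applies; SIN would try further strategies

print(integrate("x^3"))     # x^4/4
print(integrate("sin(x)"))  # -cos(x)
```

The real program's power came from layering heuristic transformations (substitutions, method selection) on top of this kind of direct lookup, which is what let it outpace SAINT.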
Abstract:
The work reported here lies in the area of overlap between artificial intelligence and software engineering. As research in artificial intelligence, it is a step towards a model of problem solving in the domain of programming. In particular, this work focuses on the routine aspects of programming, which involve the application of previous experience with similar programs; I call this programming by inspection. Programming is viewed here as a kind of engineering activity. Analysis and synthesis by inspection are a prominent part of expert problem solving in many other engineering disciplines, such as electrical and mechanical engineering, and the notion of inspection methods in programming developed in this work is motivated by similar notions in those areas. This work is also motivated by current practical concerns in software engineering. The inadequacy of current programming technology is universally recognized, and part of the solution to this problem will be to increase the level of automation in programming. I believe that the next major step in the evolution of more automated programming will be interactive systems that provide a mixture of partially automated program analysis, synthesis and verification. One such system being developed at MIT, called the programmer's apprentice, is the immediate intended application of this work. This report concentrates on the knowledge base of the programmer's apprentice, which takes the form of a taxonomy of commonly used algorithms and data structures. To the extent that a programmer is able to construct and manipulate programs in terms of the forms in such a taxonomy, he may relieve himself of many details and generally raise the conceptual level of his interaction with the system, compared with present-day programming environments. Also, since it is practical to expend a great deal of effort pre-analyzing the entries in a library, the difficulty of verifying the correctness of programs constructed this way is correspondingly reduced. The feasibility of this approach is demonstrated by the design of an initial library of common techniques for manipulating symbolic data. This document also reports on the further development of a formalism called the plan calculus for specifying computations in a programming-language-independent manner. This formalism combines data and control abstraction in a uniform framework that has facilities for representing multiple points of view and side effects.
Abstract:
An understanding of research is important to enable nurses to provide evidence-based care. However, undergraduate nursing students often find research a challenging subject. The purpose of this paper is to present an evaluation of the introduction of podcasts in an undergraduate research module, intended to enhance research-teaching linkages between theoretical content and research in practice and to improve the level of student support offered in a blended learning environment. Two cohorts of students (n=228 and n=233) were given access to a series of 5 “guest speaker” podcasts made up of presentations and interviews with research experts within Edinburgh Napier. These staff would not normally have contact with students on this module, but through the podcasts were able to share their research expertise and methods with our learners. The main positive results suggest increased understanding by students due to the multi-modal delivery approach, a more personal student/tutor relationship leading to greater engagement, and the effective use of the materials for revision and consolidation. Negative effects centred on problems with the technology, most often difficulty in downloading and accessing the material. This paper contributes to the emerging knowledge base of podcasting in nurse education by demonstrating how podcasts can be used to enhance research-teaching linkages, and raises the question of why students do not exploit the opportunities for mobile learning.