967 results for "explicit categorization"


Relevance:

20.00%

Publisher:

Abstract:

The shift towards a knowledge-based economy has inevitably prompted the evolution of patent exploitation. Nowadays, a patent is more than just a preventive tool with which a company blocks its competitors from developing rival technologies; it lies at the very heart of the company's strategy for value creation and is therefore strategically exploited for economic profit and competitive advantage. Along with the evolution of patent exploitation, the demand for reliable and systematic patent valuation has reached an unprecedented level. However, most of the quantitative approaches currently used to assess patents, though they arguably fall into four categories, are based on conventional discounted cash-flow analysis, whose usability and reliability in the context of patent valuation are greatly limited by five practical issues: market illiquidity, poor data availability, discriminatory cash-flow estimations, an incapability to account for changing risk, and an inability to capture managerial flexibility. This dissertation attempts to overcome these barriers by rationalizing the use of two techniques, namely fuzzy set theory (aimed at the first three issues) and real option analysis (aimed at the last two). It commences with an investigation into the nature of the uncertainties inherent in patent cash-flow estimation and claims that two levels of uncertainty must be properly accounted for. Further investigation reveals that both levels fall under the heading of subjective uncertainty, which differs from objective uncertainty originating in inherent randomness: uncertainties labelled subjective are closely related to the behavioural aspects of decision making and are usually witnessed whenever human judgement, evaluation or reasoning is crucial to the system under consideration and complete knowledge of its variables is lacking.
Having clarified their nature, the application of fuzzy set theory to modelling patent-related uncertain quantities is readily justified. The application of real option analysis to patent valuation is prompted by the fact that both the patent application process and the subsequent patent exploitation (or commercialization) are subject to a wide range of decisions at multiple successive stages. In other words, both patent applicants and patentees face a large variety of courses of action as to how their patent applications and granted patents can be managed. Since they have the right to run their projects actively, this flexibility has value and must be properly accounted for. Accordingly, the dissertation explicitly identifies the types of managerial flexibility inherent in patent-related decision making and in patent valuation, and discusses how they can be interpreted in terms of real options. Additionally, the use of the proposed techniques in practical applications is demonstrated by three models based on fuzzy real option analysis. In particular, the pay-off method and the extended fuzzy Black-Scholes model are employed, respectively, to investigate the profitability of a patent application project for a new process for the preparation of a gypsum-fibre composite and to justify the subsequent patent commercialization decision; a fuzzy binomial model is designed to reveal the economic potential of a patent licensing opportunity.
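The pay-off method mentioned above lends itself to a compact numerical sketch: the NPV of the project is modelled as a triangular fuzzy number built from three scenario values, and the real option value is the share of the distribution lying above zero times the mean of that positive part. The scenario figures below are invented for illustration and are not taken from the dissertation.

```python
import numpy as np

def triangular(x, low, peak, high):
    """Membership function of a triangular fuzzy NPV estimate."""
    return np.where(
        (x > low) & (x <= peak), (x - low) / (peak - low),
        np.where((x > peak) & (x < high), (high - x) / (high - peak), 0.0),
    )

def payoff_method_rov(low, peak, high, n=200_001):
    """Real option value in the style of the fuzzy pay-off method:
    (area of the positive part / total area) * mean of the positive part,
    both computed by a simple Riemann sum."""
    x = np.linspace(low, high, n)
    dx = x[1] - x[0]
    mu = triangular(x, low, peak, high)
    total_area = mu.sum() * dx
    pos = mu * (x > 0)               # keep only the profitable outcomes
    pos_area = pos.sum() * dx
    if pos_area == 0.0:
        return 0.0                   # no positive scenario: the option is worthless
    mean_pos = (x * pos).sum() * dx / pos_area
    return (pos_area / total_area) * mean_pos

# Illustrative NPV scenarios (pessimistic, base, optimistic), e.g. in M EUR
rov = payoff_method_rov(-2.0, 3.0, 10.0)
```

Because only the positive tail contributes, a project whose pessimistic scenario is negative still carries a positive option value, which is precisely the managerial-flexibility intuition the dissertation exploits.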

Relevance:

20.00%

Publisher:

Abstract:

Studies have shown a time-of-day-of-training effect on long-term explicit memory, with a greater effect in the afternoon than in the morning. However, these studies did not control for chronotype. The purpose of this study was therefore to assess whether the time-of-day effect on explicit memory persists when this variable is controlled, and to identify the occurrence of a possible synchronic effect. A total of 68 undergraduates were classified as morning, intermediate, or afternoon types. The subjects listened to a list of 10 words during the training phase and immediately performed a recognition task, a procedure they repeated twice. One week later, they underwent an unannounced recognition test. The target list and the distractor words were the same in all series. The subjects were allocated to two groups according to acquisition time: a morning group (N = 32) and an afternoon group (N = 36). One week later, subjects from each of these groups were tested in the morning (N = 35) or in the afternoon (N = 33). The groups had similar chronotypes. Long-term explicit memory performance was not affected by test time-of-day or by chronotype. However, there was a training time-of-day effect [F(1,56) = 53.667; P = 0.009], with better performance for those who trained in the afternoon. Our data indicate that the advantage of afternoon training for long-term memory performance does not depend on chronotype, and that this performance is not affected by the synchronic effect.

Relevance:

20.00%

Publisher:

Abstract:

This experimental study examined the effects of cooperative learning and explicit/implicit instruction on student achievement and on attitudes toward working in cooperative groups. Specifically, fourth- and fifth-grade students (n = 48) were randomly assigned to two conditions: cooperative learning with explicit instruction and cooperative learning with implicit instruction. All participants were given initial training, either explicit or implicit, in cooperative learning procedures via 10 one-hour sessions. Following the instruction period, all students completed a group project related to a famous-artists unit. It was hypothesized that the explicit instruction training would enhance students' scores on the famous-artists test and on the group projects, as well as improve students' attitudes toward cooperative learning. Although the explicit training group did not achieve significantly higher scores on the famous-artists test, significant differences were found between the explicit and implicit groups on the group projects. The explicit group also exhibited more favourable attitudes toward cooperative learning. The findings demonstrate that combining cooperative learning with explicit instruction is an effective classroom strategy and a useful practice for presenting and learning new information, as well as for working in groups successfully.

Relevance:

20.00%

Publisher:

Abstract:

One group of 12 non-learning-disabled students and two groups of 12 learning-disabled students between the ages of 10 and 12 were measured on implicit and explicit knowledge acquisition. Students in each group implicitly acquired knowledge about 1 of 2 vocabulary rules. The vocabulary rules governed the pronunciation of 2 types of pseudowords. After completing the implicit acquisition phase, all groups were administered a test of implicit knowledge. The non-learning-disabled group and 1 learning-disabled group were then asked to verbalize the knowledge acquired during the initial phase; this served as a test of explicit knowledge. All 3 groups were then given a posttest of implicit knowledge, which measured the effectiveness of the verbalization technique. Results indicate that implicit knowledge capabilities were intact for both the learning-disabled and non-learning-disabled groups. However, there were significant differences between groups in explicit knowledge capabilities. This led to the conclusion that implicit functions show few individual differences, whereas explicit functions are affected by ability differences. Furthermore, the verbalization technique significantly increased posttest scores for learning-disabled students, suggesting that metacognitive techniques are a beneficial learning tool for learning-disabled students.

Relevance:

20.00%

Publisher:

Abstract:

This research addressed the role of explicit algorithms and episodic contexts in the acquisition of computational procedures for regrouping in subtraction. Three groups of students having difficulty learning to subtract with regrouping were taught procedures for doing so through an explicit algorithm, an episodic context, or an examples approach. It was hypothesized that an explicit algorithm represented in a flow-chart format would facilitate the acquisition and retention of specific procedural steps relative to the other two conditions. On the other hand, the use of paragraph stories to create episodic context was expected to facilitate the retrieval of algorithms, particularly in a mixed presentation format. The subjects were tested on similar, near, and far transfer questions over a four-day period; near and far transfer algorithms were introduced on Day Two. The results suggested that both the explicit algorithm and the episodic context facilitate performance on questions requiring subtraction with regrouping. However, the differential effects of the two approaches on near and far transfer questions were harder to identify. Explicit algorithms may facilitate the acquisition of specific procedural steps while at the same time inhibiting the application of those steps to transfer questions. Similarly, the value of episodic context in cuing the retrieval of an algorithm may be limited by the subject's ability to identify and classify a new question as an exemplar of a particular episodically defined problem type or category. The implications of these findings for the procedures employed in teaching mathematics to students with learning problems are discussed in detail.

Relevance:

20.00%

Publisher:

Abstract:

The intent of this study was to investigate the effectiveness of teaching thirty-five Grade One children a variety of effective spelling strategies, in comparison with traditional spelling instruction. Strategy instruction included training in phonology, imagery and analogy. In addition, the type of instruction provided (implicit versus explicit) was also examined. Children were seen in small groups of four or five for four 25-minute sessions. All children were tested immediately before and after the training sessions, as well as at a 14-day follow-up. Pretest and posttest measures included a dictated spelling test (based on words used in training), a developmental spelling test and a sample of each child's writing. In addition, children completed a metacognitive spelling test as a measure of their strategy awareness. Performance scores on the pretest and posttest measures were compared, using the Dunn-Bonferroni and Dunnett procedures, to determine whether any differences existed between the three spelling instruction groups. Findings revealed that explicit strategy instruction was the most effective spelling program for improving Grade One children's invented spellings. Children who received this instruction spelled targeted words more accurately, even after the 14-day follow-up, and recalled more effective spelling strategies than children who received either implicit strategy instruction or traditional spelling instruction.

Relevance:

20.00%

Publisher:

Abstract:

Thesis (M.Ed.) -- Brock University, 1995.

Relevance:

20.00%

Publisher:

Abstract:

Please consult the paper edition of this thesis to read. It is available on the 5th Floor of the Library at Call Number: Z 9999.5 E38 L64 2008

Relevance:

20.00%

Publisher:

Abstract:

La tribune de l'éditeur / Editor's Soapbox

Relevance:

20.00%

Publisher:

Abstract:

This paper introduces and examines the logicist construction of Peano Arithmetic that can be carried out within Leśniewski's logical calculus of names, called Ontology. Against the neo-Fregeans, it is argued that a logicist program cannot be based on implicit definitions of the mathematical concepts. Using only explicit definitions, the construction presented here constitutes a genuine reduction of arithmetic to Leśniewski's logic supplemented with an axiom of infinity. I argue, however, that such a program is not reductionist, for it only provides what I call a picture of arithmetic, that is to say, a specific interpretation of arithmetic in which purely logical entities play the role of natural numbers. The reduction does not show that arithmetic is simply a part of logic. The process is not of ontological significance, for numbers are not shown to be logical entities. This neo-logicist program nevertheless shows that there is a purely analytical route to knowledge of the arithmetical laws.
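The implicit/explicit contrast at issue can be illustrated with the standard Fregean example, stated here in second-order notation for readability rather than in Leśniewski's own symbolism. Hume's Principle fixes the number operator $\#$ only implicitly, by an equivalence, whereas Frege-style explicit definitions introduce each arithmetical notion outright:

```latex
% Implicit definition: Hume's Principle characterizes # only up to equinumerosity
\#F = \#G \;\longleftrightarrow\; F \approx G
\quad\text{where } F \approx G \text{ says there is a bijection between the } F\text{s and the } G\text{s.}

% Explicit definitions: zero and the successor relation are defined outright
0 := \#[x : x \neq x]
\qquad
\mathrm{Succ}(m,n) \;:\longleftrightarrow\;
\exists F\,\exists x\,\bigl(Fx \wedge \#F = n \wedge \#[y : Fy \wedge y \neq x] = m\bigr)
```

On the explicit route, every occurrence of a number term can in principle be eliminated in favour of its definiens, which is exactly what allows the paper to speak of a genuine (if merely pictorial) reduction.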

Relevance:

20.00%

Publisher:

Abstract:

We introduce a new mode of operation for CD-systems of restarting automata by providing explicit enable and disable conditions in the form of regular constraints. We show that, for each CD-system M of restarting automata and each mode m of operation considered by Messerschmidt and Otto, there exists a CD-system M' of restarting automata of the same type as M that, working in the new mode ed, accepts the language that M accepts in mode m. Further, we prove that in mode ed, a locally deterministic CD-system of restarting automata of type RR(W)(W) can be simulated by a locally deterministic CD-system of restarting automata of the more restricted type R(W)(W). This is the first time that a non-monotone type of R-automaton without auxiliary symbols is shown to be as expressive as the corresponding type of RR-automaton.
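The flavour of mode ed can be conveyed by a deliberately simplified simulation: each component is reduced to a single delete rule, and regular enable/disable constraints (here, Python regular expressions) decide which component may start a cycle on the current word. This toy is an invented illustration of the enable/disable idea only; it does not reproduce the R(W)(W) machinery or the constructions of the paper.

```python
import re
from dataclasses import dataclass

@dataclass
class Component:
    """One component of a toy CD-system, reduced to a single delete rule.
    `enable`/`disable` play the role of the regular constraints of mode ed:
    the component may start a cycle only if `enable` matches the current
    word and `disable` does not."""
    name: str
    factor: str      # substring this component deletes once per cycle
    enable: str      # regular enable condition
    disable: str     # regular disable condition

def run_ed(components, word, max_cycles=100):
    """Simulate the ed mode of operation: in each cycle, hand the word to
    some enabled component; accept when the word has been emptied."""
    for _ in range(max_cycles):
        if word == "":
            return True
        ready = [c for c in components
                 if re.fullmatch(c.enable, word)
                 and not re.fullmatch(c.disable, word)
                 and c.factor in word]
        if not ready:
            return False          # no component enabled: reject
        word = word.replace(ready[0].factor, "", 1)
    return False

# Toy system accepting { a^n b^n : n >= 0 }: a single component deletes "ab"
# while its enable condition keeps the word in the shape a*b*.
# The disable pattern (?!) never matches, i.e. the component is never blocked.
sys_ab = [Component("del_ab", "ab", enable=r"a*b*", disable=r"(?!)")]
```

Here the enable constraint does the real work: without the `a*b*` guard, the same delete rule would also accept unbalanced words such as `abab`.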

Relevance:

20.00%

Publisher:

Abstract:

This thesis addresses the problem of categorizing natural objects. As a criterion for categorization, we propose that the purpose of a categorization is to support the inference of unobserved properties of objects from the observed properties. Because no such set of categories can be constructed in an arbitrary world, we present the Principle of Natural Modes as a claim about the structure of the world. We first define an evaluation function that measures how well a set of categories supports the inference goals of the observer. Entropy measures for property uncertainty and category uncertainty are combined through a free parameter that reflects the goals of the observer. Natural categorizations are shown to be those that are stable with respect to this free parameter. The evaluation function is tested in the domain of leaves and is found to be sensitive to the structure of the natural categories corresponding to the different species. We next develop a categorization paradigm that uses the categorization evaluation function to recover natural categories. A statistical hypothesis-generation algorithm is presented and shown to be an effective categorization procedure. Examples drawn from several natural domains are presented, including data known to be a difficult test case for numerical categorization techniques. We then extend the paradigm so that multiple levels of natural categories are recovered; by recursively invoking the categorization procedure, both the genera and the species are recovered in a population of anaerobic bacteria. Finally, a method is presented for evaluating the utility of features in recovering natural categories. This method also provides a mechanism for determining which features are constrained by the different processes present in a multiple-modal world.
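The evaluation function described above can be sketched numerically. The form below (a convex combination of within-category property entropy and category entropy, weighted by a free parameter beta) is one plausible reading of the combination the abstract describes, not the thesis's exact functional; the toy objects are invented.

```python
import math
from collections import Counter

def entropy(counts):
    """Shannon entropy (bits) of a discrete distribution given as counts."""
    total = sum(counts)
    return -sum(c / total * math.log2(c / total) for c in counts if c)

def evaluate(partition, objects, beta):
    """Score a candidate categorization as a beta-weighted mix of
    property uncertainty (averaged within categories) and category
    uncertainty. Lower scores support inference better."""
    n = len(objects)
    n_attrs = len(objects[0])
    h_cat = entropy([len(cat) for cat in partition])   # category uncertainty
    h_prop = 0.0                                       # property uncertainty
    for cat in partition:
        weight = len(cat) / n
        for j in range(n_attrs):
            values = Counter(objects[i][j] for i in cat)
            h_prop += weight * entropy(list(values.values())) / n_attrs
    return beta * h_prop + (1 - beta) * h_cat

# Six toy objects with two binary properties, falling into two clean modes.
objects = [(0, 0), (0, 0), (0, 0), (1, 1), (1, 1), (1, 1)]
split = evaluate([[0, 1, 2], [3, 4, 5]], objects, beta=0.8)
lumped = evaluate([[0, 1, 2, 3, 4, 5]], objects, beta=0.8)
```

With beta weighted toward property uncertainty, the two-mode partition scores better than lumping everything together; sweeping beta and looking for partitions whose ranking does not change is the stability idea the abstract appeals to.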

Relevance:

20.00%

Publisher:

Abstract:

To recognize a previously seen object, the visual system must overcome the variability in the object's appearance caused by factors such as illumination and pose. Developments in computer vision suggest that it may be possible to counter the influence of these factors, by learning to interpolate between stored views of the target object, taken under representative combinations of viewing conditions. Daily life situations, however, typically require categorization, rather than recognition, of objects. Due to the open-ended character both of natural kinds and of artificial categories, categorization cannot rely on interpolation between stored examples. Nonetheless, knowledge of several representative members, or prototypes, of each of the categories of interest can still provide the necessary computational substrate for the categorization of new instances. The resulting representational scheme based on similarities to prototypes appears to be computationally viable, and is readily mapped onto the mechanisms of biological vision revealed by recent psychophysical and physiological studies.
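The prototype scheme sketched above amounts to assigning a new instance to the category whose stored representative it most resembles. The snippet below is a minimal illustration: the Gaussian-of-distance similarity and the two-dimensional feature vectors are invented stand-ins for whatever similarity measure and representation a real system would use.

```python
import numpy as np

def categorize(prototypes, x):
    """Assign instance x to the category of its most similar prototype.
    Similarity is a Gaussian of Euclidean distance, one simple choice
    among many the representational scheme permits."""
    best_label, best_sim = None, -1.0
    for label, members in prototypes.items():
        for p in members:
            d = np.linalg.norm(np.asarray(x, float) - np.asarray(p, float))
            sim = float(np.exp(-d ** 2))
            if sim > best_sim:
                best_label, best_sim = label, sim
    return best_label

# Two open-ended categories, each represented by a few stored prototypes
# rather than by a definition; new instances are interpolated between them.
prototypes = {
    "cup":  [(0.9, 0.1), (0.8, 0.2)],
    "bowl": [(0.2, 0.9), (0.1, 0.8)],
}
label = categorize(prototypes, (0.85, 0.15))
```

Because categorization depends only on similarity to stored members, new category members can be added without redefining the category, which is what makes the scheme workable for open-ended natural kinds.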