906 results for epistemic marking


Relevance: 20.00%

Abstract:

Previous research suggests that changing consumer and producer knowledge structures play a role in market evolution and that the sociocognitive processes of product markets are revealed in the sensemaking stories of market actors that are rebroadcast in commercial publications. In this article, the authors lend further support to the story-based nature of market sensemaking and the use of the sociocognitive approach in explaining the evolution of high-technology markets. They examine the content (i.e., subject matter or topic) and volume (i.e., the number) of market stories and the extent to which both evolve as a technology emerges. Data were obtained from a content analysis of 10,412 article abstracts, published in key trade journals, pertaining to Local Area Network (LAN) technologies and spanning the period 1981 to 2000. Hypotheses concerning the evolving nature (content and volume) of market stories during technology evolution are tested. The analysis identified four categories of market stories: technical, product availability, product adoption, and product discontinuation. The findings show that an emerging technology initially passes through a 'technical-intensive' phase, in which technology-related stories dominate; then through a 'supply-push' phase, in which stories presenting products embracing the technology tend to exceed technical stories and the number of product adoption stories rises; and finally reaches a 'product-focus' phase, with stories predominantly focusing on product availability. Overall story volume declines as a technology matures and the need for sensemaking diminishes. When stories about product discontinuation surface, they signal the decline of the current technology. New technologies that fail to maintain the 'product-focus' stage also reflect limited market acceptance. The article also discusses the theoretical and managerial implications of the study's findings. © 2002 Elsevier Science Inc. All rights reserved.
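The coding step described above can be pictured with a short sketch: classify each abstract into one of the four story categories and tally stories per year. The keyword cues below are purely illustrative assumptions, not the authors' actual coding scheme.

```python
# Illustrative sketch (not the authors' coding scheme): tally market-story
# categories per year from a list of (year, abstract_text) records.
from collections import Counter, defaultdict

# Hypothetical keyword cues for the four story categories identified in the study.
CATEGORY_CUES = {
    "technical": ["protocol", "standard", "specification"],
    "product_availability": ["launch", "release", "ships"],
    "product_adoption": ["deploys", "installs", "adopts"],
    "product_discontinuation": ["discontinue", "phase out", "end of life"],
}

def classify(abstract: str) -> str | None:
    text = abstract.lower()
    for category, cues in CATEGORY_CUES.items():
        if any(cue in text for cue in cues):
            return category
    return None  # story does not match any category

def story_volume_by_year(records: list[tuple[int, str]]) -> dict[int, Counter]:
    volumes: dict[int, Counter] = defaultdict(Counter)
    for year, abstract in records:
        category = classify(abstract)
        if category:
            volumes[year][category] += 1
    return volumes
```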

Relevance: 20.00%

Abstract:

Recent surveys reveal that many university students in the U.K. are not satisfied with the timeliness and usefulness of the feedback given by their tutors. Ensuring timely marking can come at the cost of feedback quality. Although suitable use of Information and Communication Technology should alleviate this problem, existing Virtual Learning Environments provide inadequate support for creating detailed marking schemes and little support for giving detailed feedback. This paper describes a new web-based tool called e-CAF for facilitating coursework assessment and feedback management directed by marking schemes. Using e-CAF, tutors can create or reuse detailed marking schemes efficiently without sacrificing accuracy or thoroughness in marking. The flexibility of marking scheme design also makes it possible for tutors to modify a marking scheme during the marking process without having to reassess students' submissions. The resulting marking process becomes more transparent to students.
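As a rough illustration of the kind of reusable marking scheme a tool such as e-CAF manages, the sketch below models criteria with mark allocations and reusable feedback comments. The class and field names are assumptions for illustration, not e-CAF's actual data model.

```python
# A minimal sketch of a reusable marking scheme; names are illustrative only.
from dataclasses import dataclass, field

@dataclass
class Criterion:
    description: str
    max_marks: float
    feedback_bank: list[str] = field(default_factory=list)  # reusable comments

@dataclass
class MarkingScheme:
    title: str
    criteria: list[Criterion]

    def total_marks(self) -> float:
        return sum(c.max_marks for c in self.criteria)

scheme = MarkingScheme(
    title="Coursework 1",
    criteria=[
        Criterion("Correctness of implementation", 40,
                  ["Handles edge cases well", "Fails on empty input"]),
        Criterion("Quality of report", 30),
        Criterion("Testing and evaluation", 30),
    ],
)
assert scheme.total_marks() == 100
```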

Relevance: 20.00%

Abstract:

'New Approach' Directives now govern the health and safety of most products, whether destined for workplace or domestic use. These Directives have been enacted into UK law through various specific pieces of legislation, principally relating to work equipment, machinery and consumer products. This research investigates whether the risk assessment approach used to ensure the safety of machinery may be applied to consumer products. Crucially, consumer products are subject to the Consumer Protection Act (CPA) 1987, which makes no direct reference to "assessing risk". This contrasts with the law governing the safety of products used in the workplace, where risk assessment underpins the approach. New Approach Directives are supported by European harmonised standards and, in the case of machinery, further supported by the risk assessment standard EN 1050. The system regulating consumer product safety is discussed, its key elements are identified and a graphical model is produced. This model incorporates such matters as conformity assessment, the system of regulation, and near-miss and accident reporting. A key finding of the research is that New Approach Directives share a common feature of specifying essential performance requirements that provide a hazard prompt-list, which can form the basis for a risk assessment (the hazard identification stage). Drawing on 272 prosecution cases, thirty of which are examined in detail, this research provides evidence that despite the high degree of regulation, unsafe consumer products still find their way onto the market. The research presents a number of risk assessment tools to help Trading Standards Officers (TSOs) prioritise their work at the initial inspection stage and when dealing with subsequent enforcement action.
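The hazard prompt-list idea can be illustrated with a minimal sketch, assuming a conventional severity-times-likelihood scoring to rank hazards for follow-up. The scales, example hazards and scoring are illustrative assumptions, not the research's actual tools.

```python
# Illustrative prioritisation sketch: score each hazard taken from a prompt-list
# by severity x likelihood so that products can be ranked at initial inspection.
from dataclasses import dataclass

@dataclass
class Hazard:
    description: str        # drawn from the Directive's essential requirements
    severity: int           # 1 (minor) .. 4 (fatal), assumed scale
    likelihood: int         # 1 (rare)  .. 4 (frequent), assumed scale

    @property
    def risk_score(self) -> int:
        return self.severity * self.likelihood

def prioritise(hazards: list[Hazard]) -> list[Hazard]:
    return sorted(hazards, key=lambda h: h.risk_score, reverse=True)

hazards = [
    Hazard("Accessible live parts", severity=4, likelihood=2),
    Hazard("Sharp edges on casing", severity=2, likelihood=3),
]
for h in prioritise(hazards):
    print(f"{h.risk_score:>2}  {h.description}")
```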

Relevance: 20.00%

Abstract:

Engineers' logbooks are an important part of the CDIO process, as a precursor to the logbooks students will be expected to keep in industry. Previously, however, students' logbooks were of insufficient quality, and students did not appear to appreciate the importance of the logbooks or how they would be assessed. In an attempt to improve students' understanding and the quality of their logbooks, a group of ~100 first-year CDIO students were asked to develop a marking matrix collaboratively with their tutors. The anticipated outcome was that students would have more ownership of, and a deeper understanding of, the logbook and what is expected of them during assessment. A revised marking matrix was developed in class, and a short questionnaire was administered on delivery of the adapted matrix to gauge the students' response to the process. Marks from the logbooks were collected twice during teaching periods one and two and compared with marks from previous years. This poster presents the methodology and outcomes of this venture.
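The poster does not specify how cohort marks were compared, so the sketch below assumes a simple Welch's t-test on hypothetical logbook marks from a previous and a current cohort.

```python
# Hedged sketch of the cohort comparison; the marks are illustrative data only.
from statistics import mean
from scipy import stats

previous_cohort = [52, 58, 61, 47, 55, 63, 49, 60]   # hypothetical logbook marks
current_cohort  = [59, 66, 62, 57, 70, 64, 61, 68]   # after the co-created matrix

t_stat, p_value = stats.ttest_ind(current_cohort, previous_cohort, equal_var=False)
print(f"mean before={mean(previous_cohort):.1f}, after={mean(current_cohort):.1f}, "
      f"t={t_stat:.2f}, p={p_value:.3f}")
```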

Relevance: 20.00%

Abstract:

The paper describes a decision-support system for forecasting the inflation level based on a multifactor dependence represented by a decision-making "tree". The interrelation of the factors affecting the inflation level (economic, financial, political and socio-demographic) is considered. Prospects for developing the decision-tree method are defined, in particular identifying the so-called "narrow" spaces and further analysing possible scenarios for forecasting the inflation level.
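A minimal sketch of such a decision-making "tree", with purely illustrative factor names, thresholds and scenario labels (none of which come from the paper), might look as follows.

```python
# Toy decision tree over the factor groups named in the abstract; all
# thresholds and branch outcomes are illustrative assumptions.
def inflation_scenario(factors: dict[str, float]) -> str:
    """Return a coarse inflation-level scenario from normalised factor scores (0..1)."""
    if factors["monetary_expansion"] > 0.7:
        if factors["political_stability"] < 0.4:
            return "high inflation"
        return "moderate inflation"
    if factors["demand_pressure"] > 0.6:
        return "moderate inflation"
    return "low inflation"

print(inflation_scenario({
    "monetary_expansion": 0.8,   # financial factor
    "political_stability": 0.3,  # political factor
    "demand_pressure": 0.5,      # economic / socio-demographic factor
}))
```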

Relevance: 20.00%

Abstract:

In this article I argue that the study of the linguistic aspects of epistemology has become unhelpfully focused on the corpus-based study of hedging and that a corpus-driven approach can help to improve upon this. Focusing on a corpus of texts from one discourse community (that of genetics) and identifying frequent tri-lexical clusters containing highly frequent lexical items identified as keywords, I undertake an inductive analysis identifying patterns of epistemic significance. Several of these patterns are shown to be hedging devices, and the whole-corpus frequencies of the most salient of these, candidate and putative, are then compared to the whole-corpus frequencies of comparable wordforms and clusters of epistemic significance. Finally, I interviewed a 'friendly geneticist' in order to check my interpretation of some of the terms used and to get an expert interpretation of the overall findings. In summary, I argue that the highly unexpected patterns of hedging found in genetics demonstrate the value of adopting a corpus-driven approach and constitute an advance in our current understanding of how to approach the relationship between language and epistemology.
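The cluster-extraction step can be sketched as follows, assuming a simplified tokeniser and a single keyword of interest; this illustrates the general technique rather than the study's actual procedure.

```python
# Sketch: extract frequent tri-lexical clusters (trigrams) containing a keyword.
import re
from collections import Counter

def trigrams_with_keyword(texts: list[str], keyword: str, top_n: int = 10):
    counts: Counter = Counter()
    for text in texts:
        tokens = re.findall(r"[a-z]+", text.lower())
        for i in range(len(tokens) - 2):
            tri = tuple(tokens[i:i + 3])
            if keyword in tri:
                counts[tri] += 1
    return counts.most_common(top_n)

corpus = ["The candidate gene was identified in the putative promoter region."]
print(trigrams_with_keyword(corpus, "putative"))
```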

Relevance: 20.00%

Abstract:

Meier (2012) gave a "mathematical logic foundation" of the purely measurable universal type space (Heifetz and Samet, 1998). The mathematical logic foundation, however, discloses an inconsistency in the type space literature: a finitary language is used for the belief hierarchies, while an infinitary language is used for the beliefs. In this paper we propose an epistemic model to fix this inconsistency. We show that in this new model the universal knowledge-belief space exists, is complete, and encompasses all belief hierarchies. Moreover, we demonstrate by examples that in this model the players can agree to disagree: Aumann's (1976) result does not hold, and Aumann and Brandenburger's (1995) conditions are not sufficient for Nash equilibrium. However, we show that if we substitute self-evidence (Osborne and Rubinstein, 1994) for common knowledge, then both Aumann's (1976) and Aumann and Brandenburger's (1995) results hold.
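For reference, the classical agreement result that the paper shows can fail in its model is, in a standard formulation (not the paper's own notation):

```latex
% Aumann's (1976) agreement theorem, standard formulation: with a common prior,
% posteriors that are common knowledge must coincide.
\[
\text{CK}_{\omega}\bigl(\,P(E \mid \Pi_1)=q_1 \;\wedge\; P(E \mid \Pi_2)=q_2\,\bigr)
\;\Longrightarrow\; q_1 = q_2,
\]
% where \Pi_1, \Pi_2 are the players' information partitions and CK_\omega
% denotes common knowledge at state \omega.
```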

Relevance: 20.00%

Abstract:

Auditor decisions regarding the causes of accounting misstatements can affect audit effectiveness and efficiency. Specifically, overconfidence in one's decision can lead to an ineffective audit, whereas underconfidence in one's decision can lead to an inefficient audit. This dissertation explored the implications of providing various types of information cues to decision-makers in an analytical procedure task and investigated the relationship between different types of evidence cues (confirming, disconfirming, redundant or non-redundant) and the reduction in calibration bias. Information was collected using a laboratory experiment with 45 accounting student participants. Research questions were analyzed using a 2 x 2 x 2 between-subjects and within-subjects analysis of covariance (ANCOVA). Results indicated that presenting subjects with information cues dissimilar to the choice they made is an effective intervention for reducing the overconfidence commonly found in decision-making. In addition, other information characteristics, specifically non-redundant information, can help reduce a decision-maker's overconfidence (calibration bias) for difficult (compared with easy) decision tasks.
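Calibration bias of the kind discussed above is commonly measured as mean stated confidence minus the proportion of correct judgements; the sketch below uses that convention with illustrative data, not the dissertation's instrument.

```python
# Calibration bias = mean confidence - hit rate; data are illustrative only.
def calibration_bias(confidences: list[float], correct: list[bool]) -> float:
    """Positive values indicate overconfidence, negative values underconfidence."""
    mean_confidence = sum(confidences) / len(confidences)
    hit_rate = sum(correct) / len(correct)
    return mean_confidence - hit_rate

# A participant who is 85% confident on average but right half the time:
print(calibration_bias([0.9, 0.8, 0.85, 0.85], [True, False, True, False]))  # ~0.35
```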

Relevance: 20.00%

Abstract:

Computer networks produce tremendous amounts of event-based data that can be collected and managed to support an increasing number of new classes of pervasive applications, such as network monitoring and crisis management. Although the problem of distributed event-based management has been addressed in non-pervasive settings such as the Internet, the domain of pervasive networks has its own characteristics that make those results inapplicable. Many of these applications are based on time-series data that take the form of time-ordered series of events. Such applications also need to handle large volumes of unexpected events, often modified on the fly and containing conflicting information, and to deal with rapidly changing contexts while producing results with low latency. Correlating events across contextual dimensions holds the key to expanding the capabilities and improving the performance of these applications. This dissertation addresses this critical challenge. It establishes an effective scheme for complex-event semantic correlation. The scheme examines epistemic uncertainty in computer networks by fusing event synchronization concepts with belief theory. Because of the distributed nature of event detection, time delays are considered: events are no longer instantaneous, but have a duration associated with them. Existing algorithms for synchronizing time are split into two classes, one of which is argued to provide a faster means of converging time and hence to be better suited for pervasive network management. Besides the temporal dimension, the scheme considers imprecision and uncertainty when an event is detected. A belief value is therefore associated with the semantics and the detection of composite events. This belief value is generated by a consensus among participating entities in a computer network. The scheme taps into the in-network processing capabilities of pervasive computer networks and can withstand missing or conflicting information gathered from multiple participating entities. Thus, this dissertation advances knowledge in the field of network management by facilitating the full utilization of the characteristics offered by pervasive, distributed and wireless technologies in contemporary and future computer networks.
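The abstract does not spell out the consensus rule, so the sketch below illustrates one standard way to fuse belief values from two network entities, Dempster's rule of combination over a small frame of discernment; the frame, mass values and node names are hypothetical.

```python
# Illustrative Dempster's rule combination of two entities' belief masses.
from itertools import product

def dempster_combine(m1: dict[frozenset, float],
                     m2: dict[frozenset, float]) -> dict[frozenset, float]:
    combined: dict[frozenset, float] = {}
    conflict = 0.0
    for (a, w1), (b, w2) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + w1 * w2
        else:
            conflict += w1 * w2                 # mass assigned to disjoint hypotheses
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

FAULT, CONGESTION = frozenset({"fault"}), frozenset({"congestion"})
EITHER = FAULT | CONGESTION                     # ignorance: either cause possible
m_node1 = {FAULT: 0.6, EITHER: 0.4}
m_node2 = {FAULT: 0.5, CONGESTION: 0.2, EITHER: 0.3}
print(dempster_combine(m_node1, m_node2))       # belief concentrates on "fault"
```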

Relevance: 20.00%

Abstract:

SmartWater is a chemical taggant used as a crime deterrent. The taggant is a colorless liquid that fluoresces yellow under ultraviolet (UV) light and contains a distinctive, identifiable and traceable elemental composition. For instance, in a breaking-and-entering scenario, the burglar is sprayed with a solution whose elemental signature is custom-made for a specific location. Residues of this taggant persist on skin and other objects and can easily be recovered for further analysis. The product has been used effectively in Europe as a crime deterrent and has recently been introduced in South Florida. In 2014, the Fort Lauderdale Police Department reported a 14% reduction in burglaries following the use of SmartWater products [1]. The International Forensic Research Institute (IFRI) at FIU validated the scientific foundation of the methods of recovery and analysis of these chemical tagging systems using LA-ICP-MS. Analytical figures of merit of the method, such as precision, accuracy, limits of detection, linearity and selectivity, are reported in this study. Moreover, blind samples were analyzed by LA-ICP-MS to compare the chemical signatures against the company's database and to evaluate error rates and the accuracy of the method. This study demonstrated that LA-ICP-MS can be used to effectively detect these traceable taggants and assist law enforcement agencies in the United States with cases involving transfer of these forensic coding systems.
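As an illustration of the final matching step, the sketch below compares a measured elemental signature against database entries using a simple relative distance; the elements, values, batch names and metric are hypothetical, not the study's validated procedure.

```python
# Toy nearest-signature match of a recovered taggant against a reference database.
import math

def signature_distance(measured: dict[str, float], reference: dict[str, float]) -> float:
    return math.sqrt(sum(
        ((measured[e] - reference[e]) / reference[e]) ** 2 for e in reference
    ))

database = {
    "batch_A": {"La": 1.00, "Ce": 0.52, "Gd": 0.11},
    "batch_B": {"La": 0.40, "Ce": 0.95, "Gd": 0.33},
}
measured = {"La": 0.98, "Ce": 0.55, "Gd": 0.10}
best = min(database, key=lambda k: signature_distance(measured, database[k]))
print("closest signature:", best)
```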

Relevance: 20.00%

Abstract:

Peer reviewed

Relevance: 20.00%

Abstract:

Peer reviewed

Relevance: 20.00%

Abstract:

Acknowledgments The research for this paper was primarily funded by an Australian Research Council (ARC) grant (DP120101092), How do we know what works? Ethics and evidence in surgical research. Katrina Hutchison’s research was also partly funded by the ARC Centre of Excellence for Electromaterials Science, where she has worked since June 2015. Discussions about the paper were facilitated by Macquarie University funding of a visit by Vikki A. Entwistle to participate in a Centre for Agency, Values and Ethics (CAVE) seminar on Capabilities Approaches to Justice. The authors would like to thank the anonymous reviewers for a number of helpful comments.
