952 results for monolithic reasoning


Relevance:

10.00%

Publisher:

Abstract:

Enterprise Resource Planning (ERP) software typically takes the form of a package that is licensed to client organisations and sold as being able to automate a wide range of organisational processes. ERP packages have become an important feature of information and communications technology (ICT) infrastructures in organisations. However, a number of highly publicised failures have also been associated with ERP packages. For example, Hershey, Aero Group and Snap-On have blamed the implementation of ERP packages for negative impacts upon earnings (Scott and Vessey 2000); Cadbury Schweppes implemented plans to fulfil 250 orders where normally it would fulfil 1000, due to the increased complexity and the need to re-train staff post-implementation (August 1999); and FoxMeyer drug company’s implementation of an ERP package has been argued to have led to bankruptcy proceedings, resulting in litigation against SAP, the software vendor in question (Bicknell 1998). Some have even rejected a single-vendor approach outright (Light et al. 2001). ERP packages appear to work for some and not for others; they contain contradictions. Indeed, if we start from the position that technologies do not provide their own explanation, then we have to consider the direction of a technological trajectory and why it moves in one way rather than another (Bijker and Law 1994). In other words, ERP appropriation cannot be predetermined as a success, despite the persuasive attempts of vendors via their websites and other marketing channels. Moreover, just because ERP exists, we cannot presume that all will appropriate it in the same fashion, if at all. There is more to the diffusion of innovations than stages of adoption and a simple demarcation between adoption and rejection. The processes that are enacted in appropriation need to be conceptualised as a site of struggle, political and imbued with power (Hislop et al. 2000; Howcroft and Light 2006).
ERP appropriation and rejection can therefore be seen as a paradoxical phenomenon. In this paper we examine these contradictions as a way to shed light on the presence and role of inconsistencies in ERP appropriation and rejection. We argue that much of the reasoning associated with ERP adoption is pro-innovation biased, and that deterministic models of the diffusion of innovations, such as Rogers (2003), do not adequately take account of contradictions in the process. Our argument is that a better theoretical understanding of these contradictions is necessary to underpin research and practice in this area. In the next section, we introduce our view of appropriation. Following this is an outline of the idea of contradiction and the strategies employed to ‘cope’ with it. Then, we introduce a number of reasons for ERP adoption and identify their inherent contradictions using these perspectives. From this discussion, we draw a framework that illustrates how the interpretive flexibility of reasons to adopt ERP packages leads to contradictions which fuel the enactment of appropriation and rejection.


Vertical line extensions, both step-up and step-down, are a common occurrence in consumer products. For example, Timex recently launched its luxury high-end Valentino line. On the other hand, many companies use downscale extensions to increase overall sales volume. For instance, a number of luxury watch brands recently introduced watch collections with lower price points, such as TAG Heuer’s affordable Aquaracer Calibre 5. Previous literature on vertical extensions has investigated how the number of products in the line (Dacin and Smith 1994), the direction of the extension, brand concept (Kim, Lavack, and Smith 2001), and perceived risk (Lei, de Ruyter, and Wetzels 2008) affect extension evaluations. Common to this literature is the use of models based on adaptation-level theory, which states that all relevant price information is integrated into a single prototype value and used in consumer judgments of price (Helson 1947; Mazumdar, Raj, and Sinha 2005). In the current research we argue that, while adaptation-level theory can be viewed as a useful simplification for understanding consumers’ evaluations, it misses important contextual influences caused by a brand’s price range. Drawing on research on range-frequency theory (Mellers and Cooke 1994; Parducci 1965), we investigate the effects of price point distance and the parent brand’s price range on evaluations of vertical extensions. Our reasoning leads to two important predictions that we test in a series of three experiments...
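Range-frequency theory has a compact formal core: the judged value of a stimulus is a weighted average of its range value (position within the minimum-maximum span of the context) and its frequency value (its rank within the context set). A minimal sketch, with hypothetical price points and an assumed equal weighting w = 0.5:

```python
# Minimal sketch of a range-frequency judgment (after Parducci). All
# price points and the weight w are hypothetical; the sketch assumes
# the evaluated price is one of the context prices.

def range_frequency_judgment(price, context_prices, w=0.5):
    prices = sorted(context_prices)
    lo, hi = prices[0], prices[-1]
    range_value = (price - lo) / (hi - lo)                     # R: position in span
    frequency_value = prices.index(price) / (len(prices) - 1)  # F: position in rank order
    return w * range_value + (1 - w) * frequency_value

# Hypothetical watch line with a downscale extension at 500: the new
# price point sits at the bottom of both the range and the rank order.
line = [500, 2000, 2500, 3000, 4000]
print(round(range_frequency_judgment(500, line), 3))   # 0.0
print(round(range_frequency_judgment(2500, line), 3))  # 0.536
```

On this account, stretching the parent brand's price range changes R for every existing price point even when the mean (prototype) price is unchanged, which is precisely the kind of contextual influence an adaptation-level model cannot capture.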


It is not uncommon for firms to explore a new venture under the belief that it will generate profits, only to find out later that although costs accumulated, profits did not materialize. To manage the high level of uncertainty involved in this process, new ventures are generally designed as vehicles of exploration (Wu, 2012) that allow for a staged investment of resources, starting with small initial investments that can be scaled up or discontinued as uncertainty is resolved over time (Folta, 1998; Li and Chi, 2013). As such, new ventures provide firms with a vehicle by which they can probe an uncertain future (Brown and Eisenhardt, 1997) without fully committing early on to an irreversible course of action (Folta, Johnson, and O’Brien, 2006). Our focus in the present paper is on the timing of strategic decisions that firms make regarding their exploration ventures. Prior research in the fields of entrepreneurship, real options reasoning, and decision speed has demonstrated a link between the timing of decisions and performance (Baum and Wally, 2003; Eisenhardt, 1989; Judge and Miller, 1991). The antecedents of decision timing, however, are less well understood and pose an interesting dilemma.


Deviant behaviour is an ongoing problem in the consumer marketplace (Daunt and Harris, 2012). To investigate this issue, a number of authors have focused on empirically assessing consumer perceptions of right and wrong behaviours, using instruments such as the Muncy-Vitell (1992) consumer ethics scale. While such studies have provided extensive empirical insights, qualitative insight into why consumers make these behaviour classifications remains underexplored. The aim of this paper is to extend that literature by exploring the reasoning behind behavioural classifications. Using interviews, seven factors were identified in consumer definitions of acceptable, questionable, and unacceptable consumer behaviours: official classification, prevalence, ease of justification, perceived fairness, consequences, risk, and values. These results also provide actionable insights for marketers: multi-level deterrence strategies must be employed to deter consumer deviance more effectively, as opposed to traditional deterrence strategies based on cost-benefit analyses.


Archaeology has been called ‘the science of the artefact’, and nothing demonstrates this point better than the current interest displayed in provenance studies of archaeological objects. In theory, every vessel carries a chemical compositional pattern or ‘fingerprint’ identical to that of the clay from which it was made, and this relationship is basic to provenance studies. The reasoning behind provenance or sourcing studies is to probe into the past and attempt to re-create prehistory by obtaining information on exchange and social interaction. This paper discusses the use of XRF spectrometry for the analysis of ancient pottery and ceramics to examine whether it is possible to infer prehistoric cultural exchanges.
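The 'fingerprint' logic can be illustrated with a deliberately simplified sketch: match a sherd's elemental concentrations to the nearest candidate clay source. The element labels and values below are invented, and real provenance work uses multivariate statistics on calibrated XRF data rather than a raw Euclidean distance:

```python
import math

# Hypothetical XRF concentrations (arbitrary units) for two candidate
# clay sources and one sherd of unknown origin.
sources = {
    "source_A": {"Fe": 4.1, "Rb": 120.0, "Sr": 300.0, "Zr": 180.0},
    "source_B": {"Fe": 6.8, "Rb": 95.0, "Sr": 210.0, "Zr": 240.0},
}
sherd = {"Fe": 4.3, "Rb": 118.0, "Sr": 290.0, "Zr": 185.0}

def distance(a, b):
    """Euclidean distance between two compositional fingerprints."""
    return math.sqrt(sum((a[k] - b[k]) ** 2 for k in a))

# Attribute the sherd to the source with the closest fingerprint.
best = min(sources, key=lambda s: distance(sherd, sources[s]))
print(best)  # source_A: the closest compositional match
```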


Engineers must have a deep and accurate conceptual understanding of their field, and concept inventories (CIs) are one method of assessing conceptual understanding and providing formative feedback. Current CI tests use multiple-choice questions (MCQs) to identify misconceptions and have undergone reliability and validity testing. However, they do not readily provide diagnostic information about students’ reasoning and therefore do not effectively point to specific actions that can be taken to improve student learning. We piloted the textual component of our diagnostic CI on electrical engineering students using items from the signals and systems CI. We then analysed the textual responses using automated lexical analysis software to test the effectiveness of this type of software, and interviewed the students about their experience using the textual component. Results from the automated text analysis revealed that students held both incorrect and correct ideas in certain conceptual areas, and provided indications of student misconceptions. User feedback also revealed that the inclusion of the textual component helps students assess and reflect on their own understanding.


The upstream oil & gas industry has been contending with massive data sets and monolithic files for many years, but “Big Data”—that is, the ability to apply more sophisticated types of analytical tools to information in a way that extracts new insights or creates new forms of value—is a relatively new concept that has the potential to significantly re-shape the industry. Despite the impressive amount of value being realized by Big Data technologies in other parts of the marketplace, much of the data collected within the oil & gas sector is discarded, ignored, or analyzed only in a cursory way. This paper examines existing data management practices in the upstream oil & gas industry, and compares them to practices and philosophies that have emerged in organizations that are leading the Big Data revolution. The comparison shows that, in companies leading the Big Data revolution, data is regarded as a valuable asset in its own right. The evidence also shows, however, that this is usually not true within the oil & gas industry, where data is frequently regarded as descriptive information about a physical asset rather than as something valuable in and of itself. The paper then discusses how upstream oil & gas companies could extract more value from data, and concludes with a series of specific technical and management-related recommendations to this end.


Through an examination of Wallace v Kam, this article considers and evaluates the law of causation in the specific context of a medical practitioner’s duty to provide information to patients concerning material risks of treatment. To supply a contextual background for the analysis which follows, Part II summarises the basic principles of causation law, while Part III provides an overview of the case and the reasoning adopted in the decisions at first instance and on appeal. With particular emphasis upon the reasoning in the courts of appeal, Part IV then examines the implications of the case in the context of other jurisprudence in this field and, in so doing, provides a framework for a structured consideration of causation issues in future non-disclosure cases under the Australian civil liability legislation. As will become clear, Wallace was fundamentally decided on the basis of policy reasoning centred upon the purpose behind the legal duty violated. Although the plurality in Rogers v Whitaker rejected the utility of expressions such as ‘the patient’s right of self-determination’ in this context, some Australian jurisprudence may be thought to frame the practitioner’s duty to warn in terms of promoting a patient’s autonomy, or right to decide whether to submit to proposed treatment. Accordingly, the impact of Wallace upon the protection of this right, and the interrelation between it and the purpose of the duty to warn, is investigated. The analysis in Part IV also evaluates the courts’ reasoning in Wallace by questioning the extent to which Wallace’s approach to liability and causal connection in non-disclosure of risk cases depends upon the nature and classification of the risk(s) in question, and can be reconciled with the way in which patients make decisions. Finally, Part V adopts a comparative approach by considering whether the same decision might be reached if Wallace were determined according to English law.


Bayesian networks (BNs) are graphical probabilistic models used for reasoning under uncertainty. These models are becoming increasingly popular in a range of fields including ecology, computational biology, medical diagnosis, and forensics. In most of these cases, the BNs are quantified using information from experts or from user opinions. An interest therefore lies in the way in which multiple opinions can be represented and used in a BN. This paper proposes the use of a measurement error model to combine opinions for the quantification of a BN. The multiple opinions are treated as realisations of measurement error, and the model uses the posterior probabilities ascribed to each node in the BN, which are computed from the prior information given by each expert. The proposed model addresses the issues associated with current methods of combining opinions, such as the absence of a coherent probability model, the failure to maintain the conditional independence structure of the BN, and the provision of only a point estimate for the consensus. The proposed model is applied to an existing Bayesian network and performs well when compared to existing methods of combining opinions.
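The measurement-error model itself is the paper's contribution, so the sketch below shows only the simpler kind of baseline it is compared against: a linear opinion pool, in which the experts' probability estimates for a node are collapsed into a single weighted point estimate. The expert values and weights are hypothetical:

```python
# Linear opinion pool: an existing point-estimate method for combining
# expert opinions when quantifying a BN node. All values are invented.

def linear_opinion_pool(estimates, weights=None):
    if weights is None:
        weights = [1.0] * len(estimates)  # equal trust in every expert
    total = sum(weights)
    return sum(w * p for w, p in zip(weights, estimates)) / total

# Three experts quantify P(state | parent) for a single node in the BN.
experts = [0.70, 0.80, 0.60]
print(round(linear_opinion_pool(experts), 3))             # 0.7
print(round(linear_opinion_pool(experts, [1, 2, 1]), 3))  # 0.725 (expert 2 up-weighted)
```

A pool like this yields only a point estimate and discards the spread between experts, which is exactly the kind of limitation the paper's measurement-error treatment is intended to address.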


The question of the authority of law has occupied and vexed the literature and philosophy of law for centuries. Law is something that characteristically implies obedience, but the precise nature of law’s authority remains contentious. The return to the writings of the Apostle Paul in contemporary philosophy, theology and jurisprudence invites renewed attention to the authority of law, and so this article considers his analysis and critique of law with a focus on his Epistle to the Romans. It argues that Paul’s conception of the authority of law is explained on the basis that the law is from God, that it externally sanctions obedience by virtue of the civil authorities, and that it convicts internally in conscience. This triad is justified by the law of love (‘‘love your neighbor as yourself’’) and will be explained in relation to the natural law tradition as well as converse ideas in positivism. Hence, considering Paul’s reasoning in relation to traditional jurisprudential themes and the law of love provides a useful alternative analysis and a basis for further investigation regarding the authority of law and the need for its obedience.


Background: A major challenge in assessing students’ conceptual understanding of STEM subjects is the capacity of assessment tools to reliably and robustly evaluate student thinking and reasoning. Multiple-choice tests are typically used to assess student learning and are designed to include distractors that can indicate a student’s incomplete understanding of a topic or concept based on which distractor the student selects. However, these tests fail to provide the critical information uncovering the how and why of students’ reasoning behind their multiple-choice selections. Open-ended or structured-response questions are one method for capturing higher-level thinking, but are often costly in terms of the time and attention needed to properly assess student responses. Purpose: The goal of this study is to evaluate methods for automatically assessing open-ended responses, e.g. students’ written explanations of their multiple-choice selections. Design/Method: We incorporated an open-response component into an online signals and systems multiple-choice test to capture written explanations of students’ selections. The effectiveness of an automated approach for identifying and assessing student conceptual understanding was evaluated by comparing the results of lexical analysis software packages (Leximancer and NVivo) to expert human analysis of student responses. In order to understand and delineate the process for effectively analysing text provided by students, the researchers evaluated the strengths and weaknesses of both the human and automated approaches. Results: Human and automated analyses revealed both correct and incorrect associations for certain conceptual areas. For some questions, responses revealed conceptions that were not anticipated or included in the distractor selections, showing how multiple-choice questions alone fail to capture a comprehensive picture of student understanding.
The comparison of textual analysis methods revealed the capability of automated lexical analysis software to assist in the identification of concepts and their relationships in large textual data sets. We also identified several challenges to using automated analysis, as well as to manual and computer-assisted analysis. Conclusions: This study highlighted the usefulness of incorporating and analysing students’ reasoning or explanations in understanding how students think about certain conceptual ideas. The ultimate value of automating the evaluation of written explanations is that it can be applied more frequently and at various stages of instruction to formatively evaluate conceptual understanding and engage students in reflective practice.
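Leximancer and NVivo are commercial packages, so as a loose illustration only, the core idea of automated concept identification can be reduced to tagging free-text responses against keyword lists and counting concept occurrences across a cohort. The concepts, keywords, and responses below are all invented:

```python
from collections import Counter

# Hypothetical concept lexicon for a signals-and-systems question.
CONCEPT_KEYWORDS = {
    "time_shift": ["shift", "delay", "delayed"],
    "scaling": ["scale", "scaled", "amplitude"],
    "convolution": ["convolve", "convolution", "flip and slide"],
}

def tag_concepts(response):
    """Return the set of concept labels whose keywords appear in the text."""
    text = response.lower()
    return {concept for concept, keywords in CONCEPT_KEYWORDS.items()
            if any(kw in text for kw in keywords)}

responses = [
    "The signal is delayed by two seconds, so it is a time shift.",
    "You flip and slide one signal across the other (convolution).",
    "The amplitude is scaled by the constant.",
]
counts = Counter(c for r in responses for c in tag_concepts(r))
print(counts)  # tallies how often each concept appears across the cohort
```

Real lexical analysis tools discover concepts and their co-occurrence relationships from the corpus itself rather than from a hand-written lexicon, but the tagging-and-counting step is the same basic operation.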


BACKGROUND Prescribing is a complex task, requiring specific knowledge and skills combined with effective, context-specific clinical reasoning. Prescribing errors can result in significant morbidity and mortality. For all professions with prescribing rights, a clear need exists to ensure students graduate with a well-defined set of prescribing skills, which will contribute to competent prescribing. AIM To describe the methods employed to teach and assess the principles of effective prescribing across five non-medical professions at Queensland University of Technology. METHOD The NPS National Prescribing Competencies Framework (PCF) was used as the prescribing standard. A curriculum mapping exercise was undertaken to determine how well the PCF was addressed across the disciplines of paramedic science, pharmacy, podiatry, nurse practitioner and optometry. Identified gaps in teaching and/or assessment were noted. RESULTS Prescribing skills and knowledge are taught and assessed using a range of methods across disciplines. A multi-modal approach is employed by all disciplines. The Pharmacy discipline uses more tutorial sessions to teach prescribing principles and relies less on case studies and clinical appraisal to assess prescribing when compared to other disciplines. Within the pharmacy discipline approximately 90% of the PCF competencies are taught and assessed. This compares favourably with the other disciplines. CONCLUSION Further work is required to establish a practical, effective approach to the assessment of prescribing competence especially between the university and clinical settings. Effective and reliable assessment of prescribing undertaken by students in diverse settings remains challenging.
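The curriculum-mapping exercise can be sketched as a simple coverage calculation. The competency labels below are invented placeholders, not the real PCF codes, and the figures are illustrative rather than the study's data:

```python
# Hypothetical curriculum map: which competencies each discipline
# teaches and assesses. Labels C1..C5 stand in for real PCF codes.
framework = {"C1", "C2", "C3", "C4", "C5"}

mapping = {
    "pharmacy": {"taught": {"C1", "C2", "C3", "C4", "C5"},
                 "assessed": {"C1", "C2", "C3", "C4"}},
    "podiatry": {"taught": {"C1", "C2", "C3"},
                 "assessed": {"C1", "C2"}},
}

def coverage(discipline):
    """Share of framework competencies that are both taught and assessed."""
    m = mapping[discipline]
    return len(m["taught"] & m["assessed"] & framework) / len(framework)

for d in mapping:
    print(f"{d}: {coverage(d):.0%}")  # e.g. pharmacy: 80%
```

Competencies that are taught but never assessed (or vice versa) fall out of the intersection, which is how a mapping exercise of this kind surfaces gaps.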


This paper explores how four English teachers position their English language learners for critical literacy within senior high school curriculum in Queensland, Australia. Such learners are often positioned, even by their teachers, within a broader “deficit discourse” that claims they are inherently lacking the requisite knowledge and skills to engage with intransigent school curricula. As such, English language learners’ identity formation is often constrained by deficit views that can ultimately see limited kinds of literacy teaching offered to them. Using Fairclough’s (2003) critical discourse analysis method, analysis of 16 interviews with the teachers was conducted as part of a larger, critical instrumental case study in two state high schools during 2010. Five competing discourses were identified: deficit as lack; deficit as need; learner “difference” as a resource; conceptual capacity for critical literacy; and linguistic, cultural and conceptual difficulty with critical literacy. While a deficit view is present, counter-hegemonic discourses also exist in their talk. The combination of discourses challenges monolithic deficit views of English language learners, and opens up generative discursive territory to position English language learners in ways other than “problematic”. This has important implications for how teachers view and teach English language learners and their capacity for critical literacy work in senior high school classrooms.


This paper presents a layered framework for integrating different Socio-Technical Systems (STS) models and perspectives into a whole-of-systems model. Holistic modelling plays a critical role in the engineering of STS due to the interplay between social and technical elements within these systems and the resulting emergent behaviour. The framework decomposes STS models into components, where each component is either a static object, a dynamic object or a behavioural object. Based on existing literature, a classification of the different elements that make up STS, whether social, technical or natural environment elements, is developed; each object can in turn be classified according to the STS elements it represents. Using the proposed framework, it is possible to systematically decompose models to the extent that points of interface can be identified, and the contextual factors required to transform a component of one model to interface with another can be obtained. Using an airport inbound passenger facilitation process as a case-study socio-technical system, three different models are analysed: a Business Process Modelling Notation (BPMN) model, a Hybrid Queue-based Bayesian Network (HQBN) model and an Agent Based Model (ABM). It is found that the framework enables the modeller to identify non-trivial interface points, such as between the spatial interactions of an ABM and the causal reasoning of a HQBN, and between the process activity representation of a BPMN model and simulated behavioural performance in a HQBN. Such a framework is a necessary enabler for integrating different modelling approaches in understanding and managing STS.
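The decomposition step described above can be sketched in code. Each model component is classified along the two axes the framework names (object kind and STS element), and shared elements across two models then suggest candidate interface points. The component names below are invented for the airport case and are not the paper's actual model contents:

```python
from dataclasses import dataclass

# Hypothetical sketch of the framework's decomposition: classify each
# model component by object kind and by the STS element it represents.

@dataclass(frozen=True)
class Component:
    name: str
    kind: str     # "static" | "dynamic" | "behavioural"
    element: str  # "social" | "technical" | "natural"

bpmn_model = [
    Component("check_in_activity", "behavioural", "technical"),
    Component("passenger", "dynamic", "social"),
]
abm_model = [
    Component("passenger_agent", "dynamic", "social"),
    Component("terminal_layout", "static", "technical"),
]

def interface_points(model_a, model_b):
    """Component pairs that represent the same STS element across models."""
    return [(a.name, b.name) for a in model_a for b in model_b
            if a.element == b.element]

print(interface_points(bpmn_model, abm_model))
```

Matching on the shared STS element is what lets the approach pair components of very different object kinds, e.g. a behavioural BPMN activity with a static ABM layout, which is the sense in which the identified interface points are non-trivial.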


Three core components in developing children’s understanding and appreciation of data — establish a context, pose and answer statistical questions, represent and interpret data — lay the foundation for the fourth component: use data to enhance existing context.