907 results for paraconsistent model theory
Abstract:
The weaknesses of 'traditional' modes of instruction in accounting education have been widely discussed. Many contend that the traditional approach limits opportunities for students to raise their competency level and to apply knowledge and skills in professional problem-solving situations. However, the recent body of literature suggests that accounting educators are indeed actively experimenting with 'non-traditional' and 'innovative' instructional approaches, with some authors clearly favouring one approach over another. But can one instructional approach alone meet the necessary conditions for different learning objectives? Taking into account the ever-changing landscape of not only business environments but also the higher education sector, the premise guiding the collaborators in this research is that it is perhaps counterproductive to promote competing, dichotomous views of 'traditional' and 'non-traditional' instructional approaches to accounting education, and that the notion of 'blended learning' might provide a useful framework to enhance the learning and teaching of accounting. This paper reports on the first cycle of a longitudinal study, which explores the possibility of using blended learning in first-year accounting at one campus of a large regional university. The critical elements of blended learning which emerged in the study are discussed and, consistent with the design-based research framework, the paper also identifies key design modifications for successive cycles of the research.
Abstract:
This paper will report on the evaluation of a new undergraduate legal workplace unit, LWB421 Learning in Professional Practice. LWB421 was developed in response to QUT's strategic planning and a growing view that work experience is essential to developing the skills that law graduates need in order to be effective legal practitioners (Stuckey, 2007). Work-integrated learning provides a context for students to develop their skills, to see the link between theory and practice, and to support students in making the transition from university to practice (Shirley, 2006). The literature in Australian legal education has given little consideration to the design of legal internship subjects (as distinct from legal clinic programs). Accordingly, the design of placement subjects needs to be carefully considered to ensure alignment of learning objectives, learning tasks and assessment. Legal placements offer students the opportunity to develop their professional skills in practice, reflect on their own learning and job performance, and take responsibility for their career development and planning. This paper will examine the literature relating to the design of placement subjects, particularly in a legal context. It will propose a collaborative model to facilitate learning and assessment in legal work placement subjects. The basis of the model is a negotiated learning contract between the student, the workplace supervisor and the academic supervisor. Finally, the paper will evaluate the model in the context of LWB421. The evaluation will be based on data from surveys of students and supervisors and from focus group sessions.
Abstract:
In this paper we respond to calls for an institution-based perspective on strategy. With its emphasis upon mimetic, coercive, and normative isomorphism, institutional theory has earned a deterministic reputation and seems an unlikely foundation on which to construct a theory of strategy. However, a second movement in institutional theory is emerging that gives greater emphasis to creativity and agency. We develop this approach by highlighting co-evolutionary processes that are shaping the varieties of capitalism (VoC) in Asia. To do so, we examine the extent to which the VoC model can be fruitfully applied in the Asian context. In the spirit of the second movement of institutional theory, we describe three processes in which firm strategy collectively and intentionally feeds back to shape institutions: (1) filling institutional voids, (2) retarding institutional innovation, and (3) deploying institutional escape. We outline the key contributions contained in the articles of this Special Issue and discuss a research agenda generated by the VoC perspective.
Abstract:
This paper proposes a novel relative entropy rate (RER) based approach for multiple HMM (MHMM) approximation of a class of discrete-time uncertain processes. Under different uncertainty assumptions, the model design problem is posed either as a min-max optimisation problem or as a stochastic minimisation problem on the RER between joint laws describing the state and output processes (rather than the more usual RER between output processes). A suitable filter is proposed, and performance results are established that bound the conditional mean estimation performance and show that estimation performance improves as the RER is reduced. These filter consistency and convergence bounds are the first results characterising multiple HMM approximation performance, and they suggest that joint RER concepts provide a useful model selection criterion. The proposed model design process and MHMM filter are demonstrated on an important image-processing dim-target detection problem.
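The model selection idea in this abstract, choosing among candidate HMMs by minimising a relative entropy rate, can be sketched numerically. Note the paper works with the RER between joint state/output laws; the toy below, as a deliberate simplification, estimates only the more usual output-process RER by Monte Carlo, and all model parameters are invented for illustration:

```python
import numpy as np

def sample_hmm(A, B, pi, T, rng):
    # Sample an observation sequence of length T from a discrete HMM
    # with transition matrix A, emission matrix B, initial law pi.
    x = rng.choice(len(pi), p=pi)
    ys = []
    for _ in range(T):
        ys.append(rng.choice(B.shape[1], p=B[x]))
        x = rng.choice(A.shape[1], p=A[x])
    return np.array(ys)

def log_likelihood(A, B, pi, ys):
    # Scaled forward algorithm: accumulates log p(y_1..y_T | model).
    alpha = pi * B[:, ys[0]]
    ll = np.log(alpha.sum()); alpha /= alpha.sum()
    for y in ys[1:]:
        alpha = (alpha @ A) * B[:, y]
        ll += np.log(alpha.sum()); alpha /= alpha.sum()
    return ll

def rer_estimate(true_hmm, cand_hmm, T=5000, seed=0):
    # Monte Carlo estimate of the output-process relative entropy rate:
    # (1/T) * (log p_true(y) - log p_cand(y)) on a long simulated path.
    rng = np.random.default_rng(seed)
    ys = sample_hmm(*true_hmm, T, rng)
    return (log_likelihood(*true_hmm, ys) - log_likelihood(*cand_hmm, ys)) / T
```

Model selection then amounts to picking the candidate with the smallest estimated RER; the bounds described in the abstract indicate that conditional mean estimation performance improves as this quantity shrinks.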
Abstract:
There is currently a strong focus worldwide on the potential of large-scale Electronic Health Record (EHR) systems to cut costs and improve patient outcomes through increased efficiency. This is accomplished by aggregating medical data from isolated Electronic Medical Record databases maintained by different healthcare providers. The privacy and reliability of Electronic Health Records are crucial concerns for healthcare service consumers. Traditional security mechanisms are designed to satisfy confidentiality, integrity, and availability requirements, but they fail to provide a measurement tool for data reliability from a data entry perspective. In this paper, we introduce a Medical Data Reliability Assessment (MDRA) service model to assess the reliability of medical data by evaluating the trustworthiness of its sources, usually the healthcare provider that created the data and the medical practitioner who diagnosed the patient and authorised entry of this data into the patient's medical record. The result is then expressed by manipulating health record metadata to alert medical practitioners relying on the information to possible reliability problems.
Abstract:
Electronic Health Record (EHR) systems are being introduced to overcome the limitations associated with paper-based and isolated Electronic Medical Record (EMR) systems. This is accomplished by aggregating medical data and consolidating them in one digital repository. Though an EHR system provides obvious functional benefits, there is a growing concern about the privacy and reliability (trustworthiness) of Electronic Health Records. Security requirements such as confidentiality, integrity, and availability can be satisfied by traditional hard security mechanisms. However, measuring data trustworthiness from the perspective of data entry is an issue that cannot be solved with traditional mechanisms, especially since degrees of trust change over time. In this paper, we introduce a Time-variant Medical Data Trustworthiness (TMDT) assessment model to evaluate the trustworthiness of medical data by evaluating the trustworthiness of its sources, namely the healthcare organisation where the data was created and the medical practitioner who diagnosed the patient and authorised entry of this data into the patient’s medical record, with respect to a certain period of time. The result can then be used by the EHR system to manipulate health record metadata to alert medical practitioners relying on the information to possible reliability problems.
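The time-variant aspect of trustworthiness described in this abstract can be illustrated with a toy scoring function. The TMDT model's actual formulas are not given here; the sketch below assumes a simple exponential recency weighting, and the half-life parameter and event format are invented for illustration:

```python
import math

def time_decayed_trust(events, now, half_life_days=180.0):
    # events: list of (timestamp_in_days, rating in [0, 1]) for one
    # source (e.g. a healthcare organisation or practitioner).
    # Older ratings contribute less via exponential decay, so the
    # trust score of a source changes over time.
    lam = math.log(2) / half_life_days
    num = den = 0.0
    for t, rating in events:
        w = math.exp(-lam * (now - t))  # recency weight in (0, 1]
        num += w * rating
        den += w
    return num / den if den else 0.5  # neutral prior with no evidence
```

An EHR system could attach such a score to record metadata, so that a record entered by a source whose recent ratings are poor is flagged to practitioners even if the source's older history was good.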
Abstract:
It has been recognised that brands play a role in industrial markets, but to date a comprehensive model of business-to-business (B2B) branding does not exist, nor has there been an empirical study of the applicability of a full brand equity model in a B2B context. This paper is the first to begin to address these issues. The paper introduces the Customer-Based Brand Equity (CBBE) model by Kevin Keller (1993; 2001; 2003) and empirically tests its applicability in the market of electronic tracking systems for waste management. While Keller claims that the CBBE pyramid can be applied in a B2B context, this research highlights the challenges of such an application and suggests that changes to the model are required. First, assessing the equity of manufacturers' brand names is more appropriate than measuring the equity of individual product brands as suggested by Keller. Secondly, the building blocks of Keller's model appear useful in an organisational context, although differences in the sub-dimensions are required. Brand feelings appear to lack relevance in the industrial market investigated, and the pinnacle of Keller's pyramid, resonance, needs serious modification. Finally, company representatives play a role in building brand equity, indicating a need for this human element to be recognised in a B2B model.
Abstract:
Purpose – The importance of branding in industrial contexts has increased, yet a comprehensive model of business-to-business (B2B) branding does not exist, nor has there been a thorough empirical study of the applicability of a full brand equity model in a B2B context. This paper aims to discuss the suitability and limitations of Keller's customer-based brand equity model and tests its applicability in a B2B market. Design/methodology/approach – The study involved the use of semi-structured interviews with senior buyers of technology for electronic tracking of waste management. Findings – Findings suggest that amongst organisational buyers there is a much greater emphasis on the selling organisation, including its corporate brand, credibility and staff, than on individual brands and their associated dimensions. Research limitations/implications – The study investigates real brands with real potential buyers, so there is a risk that the results may represent industry-specific factors that are not representative of all B2B markets. Future research that validates the importance of the Keller elements in other industrial marketing contexts would be beneficial. Practical implications – The findings are relevant for marketing practitioners, researchers and managers as a starting point for their B2B brand equity research. Originality/value – Detailed insights and key lessons from the field with regard to how B2B brand equity should be conceptualised and measured are offered. A revised brand equity model for B2B application is also presented.
Abstract:
This appendix describes the Order Fulfillment process followed by a fictitious company named Genko Oil. The process is freely inspired by the VICS (Voluntary Inter-industry Commerce Solutions) reference model and provides a demonstration of YAWL's capabilities in modelling complex control-flow, data and resourcing requirements.
Abstract:
This paper examines consumers' self-referencing as a mechanism for explaining ethnicity effects in advertising. Data were collected from a 2 (model ethnicity: Asian, white) x 2 (product stereotypicality: stereotypical, non-stereotypical) experiment. Measured independent variables included participant ethnicity and self-referencing. Results show that (1) Asians exhibit greater self-referencing of Asian models than whites do; (2) self-referencing mediates ethnicity effects on attitudes (i.e., attitude toward the model, attitude toward the ad, brand attitude, and purchase intentions); (3) high self-referencing Asians have more favourable attitudes toward the ad and purchase intentions than low self-referencing Asians; and (4) Asian models advertising atypical products generate more self-referencing and more favourable attitudes toward the model and the ad, and purchase intentions, for both Asians and whites.
Abstract:
In a university context, how should colour be taught in order to engage students? Entwistle states, 'What we learn depends on how we learn, and why we have to learn it.' Therefore, there is a need to address the accumulating evidence that highlights the effects of learning environments on the quality of student learning when considering colour education. It is necessary to embrace the contextual demands while ensuring that students' knowledge of colour and the joy of discovering its characteristics in practice are enhanced. Institutional policy is forcing educators to re-evaluate the effectiveness of the traditional studio and the intensive 'hands-on' interactive approach that is embedded in it. As curriculum development involves not only theory and project work, the classroom culture and physical environment also need to be addressed. The increase in student numbers, which affects the academic staff/student ratio and the availability of teaching support, together with the increasing variety of student ages, work commitments, learning styles and attitudes, has called for positive changes to how we teach. The Queensland University of Technology's restructure in 2005 was a great opportunity to re-evaluate and redesign the approach to teaching within the design units of the Interior Design undergraduate program, including colour. The resultant approach 'encapsulates a mode of delivery, studio structure, as well as the learning context in which students and staff interact to facilitate learning', with a potential 'to be integrated into a range of Interior Design units as it provides an adaptive educational framework rather than a prescriptive set of rules'.
Abstract:
Focusing on the notion of street kids, the paper suggests that youth be viewed in an alternative way to the subculture theory associated with the Centre for Contemporary Cultural Studies in Birmingham (CCCS). It is argued that not only is subculture theory an unsuitable mechanism for understanding homeless youth but also, and more importantly, that it is itself fundamentally problematic. It is suggested that the work of Michel Foucault necessitates a re-evaluation of the domain assumptions underlying subculture theory and offers in its place a model that relocates street kids, and youth itself, as artifacts of a network of governmental strategies.
Abstract:
Modern Engineering Asset Management (EAM) requires accurate assessment of current asset health condition and prediction of future condition. Appropriate mathematical models that are capable of estimating times to failure and the probability of failure in the future are essential in EAM. In most real-life situations, the lifetime of an engineering asset is influenced and/or indicated by different factors that are termed covariates. Hazard prediction with covariates is an elemental notion in reliability theory: it estimates the tendency of an engineering asset to fail instantaneously beyond the current time, given that it has already survived up to the current time. A number of statistical covariate-based hazard models have been developed. However, none of them explicitly incorporates both external and internal covariates into one model. This paper introduces a novel covariate-based hazard model to address this concern, named the Explicit Hazard Model (EHM). Both the semi-parametric and non-parametric forms of this model are presented in the paper. The major purpose of this paper is to illustrate the theoretical development of EHM. Due to page limitations, a case study with reliability field data is presented in the applications part of this study.
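The abstract does not give EHM's equations, but the general shape of a semi-parametric covariate-based hazard model can be sketched in the Cox proportional-hazards style, with separate coefficient vectors for internal and external covariates. The baseline hazard, coefficients and covariate values below are all invented for illustration, not taken from the paper:

```python
import numpy as np

def hazard(t, z_int, z_ext, beta_int, beta_ext):
    # Baseline hazard h0(t) (assumed linear in t here) scaled
    # log-linearly by internal covariates (e.g. measured condition
    # indicators) and external covariates (e.g. load, environment).
    h0 = 0.01 + 0.002 * t
    return h0 * np.exp(beta_int @ z_int + beta_ext @ z_ext)

def reliability(t, z_int, z_ext, beta_int, beta_ext, n=2000):
    # R(t) = exp(-integral_0^t h(u) du), via the trapezoidal rule.
    u = np.linspace(0.0, t, n)
    h = hazard(u, z_int, z_ext, beta_int, beta_ext)
    integral = np.sum((h[1:] + h[:-1]) * np.diff(u)) / 2.0
    return float(np.exp(-integral))
```

In a semi-parametric form the baseline hazard would be estimated from lifetime data rather than assumed; the point of the sketch is only that internal and external covariates enter the hazard through separate, explicit terms, which is the concern EHM addresses.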
Abstract:
Hazard and reliability prediction of an engineering asset is one of the significant fields of research in Engineering Asset Health Management (EAHM). In real-life situations, where an engineering asset operates under dynamic operational and environmental conditions, its lifetime can be influenced and/or indicated by different factors that are termed covariates. The Explicit Hazard Model (EHM) is a covariate-based hazard model and a new approach to hazard prediction which explicitly incorporates both internal and external covariates into one model. EHM is an appropriate model for the analysis of lifetime data in the presence of both internal and external covariates in the reliability field. This paper presents applications of the methodology introduced and illustrated in the theory part of this study. In this paper, the semi-parametric EHM is applied to a case study to predict the hazard and reliability of resistance elements on a Resistance Corrosion Sensor Board (RCSB).
Abstract:
There is increasing agreement that understanding complexity is important for project management because of difficulties associated with decision-making and goal attainment which appear to stem from complexity. However, the current operational definitions of complex projects, based upon size and budget, have been challenged, and questions have been raised about how complexity can be measured in a robust manner that takes account of structural, dynamic and interaction elements. Thematic analysis of data from 25 in-depth interviews of project managers involved with complex projects, together with an exploration of the literature, reveals a wide range of factors that may contribute to project complexity. We argue that these factors contributing to project complexity may be defined in terms of dimensions, or source characteristics, which are in turn subject to a range of severity factors. In addition to investigating definitions and models of complexity from the literature and in the field, this study also explores the problematic issues of 'measuring' or assessing complexity. A research agenda is proposed to further the investigation of phenomena reported in this initial study.