1000 results for McGill Model
Abstract:
Driver aggression is a road safety issue of growing concern throughout most highly motorised countries, yet to date there is no comprehensive model that deals with this issue in the road safety area. This paper sets out to examine the current state of research and theory on aggressive driving, with a view to incorporating useful developments in the area of human aggression from mainstream psychological research. As a first step, evidence regarding the prevalence and incidence of driver aggression, including the impact of the phenomenon on crash rates, is reviewed. Inconsistencies in the definition and operationalisation of driver aggression that have hampered research in the area are noted. Existing models of driver aggression are then identified, and the need to distinguish and address the role of intentionality, as well as the purpose of the perpetrating behaviours, within both these models and wider research efforts is highlighted. Drawing on recent findings from psychological research into general aggression, it is argued that progress in understanding driver aggression requires models that acknowledge not only person-related and situational factors, but also the cognitive and emotional appraisal processes involved in driver aggression. An effective model is expected to allow the explanation not only of the likelihood and severity of driver aggression behaviours, but also of the escalation of incidents within the context of the road environment.
Abstract:
The purpose of this article is to examine how a consumer's weight control beliefs (WCB), a female advertising model's body size (slim or larger sized) and product type influence consumer evaluations and consumer body perceptions. The study uses an experiment with 371 consumers. The design was a 2 (weight control belief: internal, external) × 2 (model size: larger sized, slim) × 2 (product type: weight controlling, non-weight controlling) between-participants factorial design. Results reveal two key contributions. First, larger sized models lead consumers to feel less pressure from society to be thin, to view their actual shape as slimmer relative to viewing a slim model, and to want a thinner ideal body shape. Slim models produce the opposite effects. Second, this research reveals a boundary condition for the extent to which endorser–product congruency theory can be generalized to endorsers of a larger body size. Results indicate that consumer WCB may be a useful variable to consider when marketers contemplate the use of larger models in advertising.
Abstract:
The significant challenge faced by government in demonstrating value for money in the delivery of major infrastructure revolves around estimating the costs and benefits of alternative modes of procurement. Faced with this challenge, one approach is to focus on a dominant performance outcome visible on the opening day of the asset as the means to select the procurement approach. In this case, value for money becomes a largely nominal concept, determined by whether or not the selected procurement mode delivers the selected performance outcome, notwithstanding possible under-delivery on other desirable performance outcomes and possibly excessive transaction costs. This paper proposes a mind-set change in this particular practice, towards an approach in which the analysis commences with the conditions pertaining to the project and proceeds to deploy transaction cost and production cost theory to indicate a procurement approach that can claim superior value for money relative to competing procurement modes. This approach to delivering value for money in relative terms is developed in the first-order procurement decision-making model outlined in this paper. The model could complement the Public Sector Comparator (PSC) through cross-validation, and it more readily lends itself to public dissemination. As a possible alternative to the PSC, the model could save time and money because it requires less detailed project preparation than the reference project, and it may send a stronger signal to the market that encourages more innovation and competition.
Abstract:
Shrinking product lifecycles, tough international competition, swiftly changing technologies, ever-increasing customer quality expectations and demand for high-variety options are some of the forces that drive the next generation of development processes. To meet these challenges, the design cost and development time of products have to be reduced and quality improved. Design reuse is considered one of the lean strategies for winning in this competitive environment: it can reduce product development time, product development cost and the number of defects, all of which ultimately influence product performance in cost, time and quality. However, little or no work has been carried out on quantifying the effectiveness of design reuse in product development performance in terms of design cost, development time and quality. Therefore, in this study we propose a systematic design-reuse-based product design framework and develop a design leanness index (DLI) as a measure of the effectiveness of design reuse. The DLI is a representative measure of reuse effectiveness in cost, development time and quality. Through this index, a clear relationship between the reuse measure and product development performance metrics is established. Finally, a cost-based model is developed to maximise the design leanness index for a product within a given set of constraints, achieving leanness in the design process.
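The abstract does not give the DLI formula itself; as a purely hypothetical illustration of the idea of aggregating reuse savings in cost, time and quality into a single leanness score, one could write something like the sketch below. The weighting scheme, the [0, 1] normalisation and the example figures are all invented for the demonstration.

```python
def design_leanness_index(cost_saved, time_saved, defects_avoided,
                          weights=(1 / 3, 1 / 3, 1 / 3)):
    """Hypothetical DLI: a weighted average of normalised reuse savings
    in cost, development time and quality (each expressed in [0, 1])."""
    wc, wt, wq = weights
    return wc * cost_saved + wt * time_saved + wq * defects_avoided

# Invented example: reuse cuts 40% of cost, 30% of time, avoids 20% of defects.
print(round(design_leanness_index(0.4, 0.3, 0.2), 3))
```

A cost-based optimisation such as the one the abstract describes would then maximise this index over candidate reuse decisions subject to the project's constraints.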
Abstract:
Online scheduling in the operating theatre department is a dynamic process that deals with both elective and emergency patients. Each business day begins with an elective schedule determined in advance from a master surgery schedule. Throughout the course of the day, however, disruptions to this baseline schedule occur due to variations in treatment time, emergency arrivals, equipment failure and resource unavailability. An innovative robust reactive surgery assignment model is developed for the operating theatre department. Following the completion of each surgery, the schedule is re-solved, taking any disruptions into account, in order to minimise cancellations of pre-planned patients and maximise throughput of emergency cases. The single-theatre case is solved, and future work on the computationally more complex multiple-theatre case under resource constraints is discussed.
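The reactive re-solve step can be sketched with a deliberately simple greedy rule (our own toy, not the authors' assignment model): after each completed surgery, emergencies are admitted first and the remaining electives are re-packed into the time left in the session, shortest first, so that cancellations are kept low. The case names and durations are invented.

```python
def resolve_schedule(remaining_time, electives, emergencies):
    """Toy reactive re-solve for a single theatre. Emergencies take
    priority, then as many pre-planned electives as still fit are kept
    (shortest first, to minimise cancellations). Durations in minutes."""
    schedule, used = [], 0
    for name, dur in emergencies:  # emergencies are scheduled first
        if used + dur <= remaining_time:
            schedule.append(name)
            used += dur
    for name, dur in sorted(electives, key=lambda e: e[1]):
        if used + dur <= remaining_time:  # keep electives that still fit
            schedule.append(name)
            used += dur
    cancelled = [n for n, _ in electives if n not in schedule]
    return schedule, cancelled

sched, cancelled = resolve_schedule(
    remaining_time=240,
    electives=[("E1", 90), ("E2", 120), ("E3", 60)],
    emergencies=[("U1", 100)],
)
print(sched, cancelled)  # ['U1', 'E3'] ['E1', 'E2']
```

A robust optimisation model would replace this greedy rule with an exact re-solve, but the loop structure (event, re-solve, commit) is the same.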
Abstract:
This paper introduces a model to facilitate delegation, including ad-hoc delegation, in cross security domain activities. Specifically, this paper proposes a novel delegation constraint management model to manage and track delegation constraints across security domains. An algorithm to trace the authority of delegation constraints is introduced as well as an algorithm to form a delegation constraint set and detect/prevent potential conflicts. The algorithms and the management model are built upon a set of formal definitions of delegation constraints.
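The abstract does not reproduce the algorithms themselves; as a hypothetical illustration of the conflict-detection idea, the sketch below flags any delegatee/permission pair that receives contradictory effects from constraints issued in different security domains. The tuple layout and the example constraints are invented for the demonstration.

```python
from collections import defaultdict

# A delegation constraint: (domain, delegator, delegatee, permission, effect).
constraints = [
    ("domA", "alice", "bob", "read:report", "allow"),
    ("domB", "carol", "bob", "read:report", "deny"),
    ("domA", "alice", "dave", "write:report", "allow"),
]

def find_conflicts(constraints):
    """Flag (delegatee, permission) pairs that receive both 'allow'
    and 'deny' across the combined cross-domain constraint set."""
    effects = defaultdict(set)
    for domain, delegator, delegatee, perm, effect in constraints:
        effects[(delegatee, perm)].add(effect)
    return [key for key, eff in effects.items() if len(eff) > 1]

print(find_conflicts(constraints))  # [('bob', 'read:report')]
```

Tracing the authority of each constraint, as the paper's algorithms do, would additionally record and walk the delegation chain behind each tuple.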
Abstract:
The traditional searching method for model-order selection in linear regression is a nested full-parameter-set searching procedure over the desired orders, which we call full-model order selection. A method for model selection, on the other hand, searches for the best sub-model within each order. In this paper, we propose using the model-selection searching method for model-order selection, which we call partial-model order selection. We show by simulations that the proposed searching method gives better accuracies than the traditional one, especially for low signal-to-noise ratios, over a wide range of model-order selection criteria (both information-theoretic and bootstrap-based). We also show that for some models the performance of the bootstrap-based criterion improves significantly when the proposed partial-model selection searching method is used.

Index Terms— Model order estimation, model selection, information theoretic criteria, bootstrap

1. INTRODUCTION

Several model-order selection criteria can be applied to find the optimal order. Some of the more commonly used information-theoretic procedures include Akaike's information criterion (AIC) [1], corrected Akaike (AICc) [2], minimum description length (MDL) [3], normalized maximum likelihood (NML) [4], the Hannan-Quinn criterion (HQC) [5], conditional model-order estimation (CME) [6], and the efficient detection criterion (EDC) [7]. From a practical point of view, it is difficult to decide which model-order selection criterion to use. Many of them perform reasonably well when the signal-to-noise ratio (SNR) is high. The discrepancies in their performance, however, become more evident when the SNR is low. In those situations, the performance of a given technique is determined not only by the model structure (say, a polynomial trend versus a Fourier series) but, more importantly, by the relative values of the parameters within the model.
This makes comparison between model-order selection algorithms difficult, as within the same model with a given order one can find an example for which a given method performs favourably or fails [6, 8]. Our aim is to improve the performance of model-order selection criteria in cases where the SNR is low by considering a model-selection searching procedure that takes into account not only the full-model order search but also a partial-model order search within the given model order. Understandably, the improvement in the performance of the model-order estimation comes at the expense of additional computational complexity.
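The full-model versus partial-model search distinction can be sketched in a few lines. The toy example below (our own illustration, not the authors' code) compares a nested full-model search with a best-subset search within each order, using MDL as the selection criterion; the polynomial signal, the noise level and the particular MDL form are all assumptions made for the demonstration.

```python
import itertools
import numpy as np

def mdl(y, X):
    """MDL criterion for the linear model y ~ X with k = X.shape[1] parameters."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    return n * np.log(rss / n) + k * np.log(n)

def select_order(y, t, max_order, partial=False):
    """Pick the polynomial order minimising MDL.

    full search   : nested models {1, t, ..., t^p} for each order p
    partial search: additionally tries every subset of lower-order
                    regressors while always keeping t^p (which defines
                    the order of the candidate model)
    """
    best = (np.inf, 0)
    for p in range(max_order + 1):
        cols = [t ** j for j in range(p + 1)]
        if not partial:
            best = min(best, (mdl(y, np.column_stack(cols)), p))
        else:
            for r in range(p + 1):
                for sub in itertools.combinations(range(p), r):
                    X = np.column_stack([cols[j] for j in (*sub, p)])
                    best = min(best, (mdl(y, X), p))
    return best[1]

rng = np.random.default_rng(0)
t = np.linspace(-1, 1, 200)
y = 2.0 * t ** 3 - t + rng.normal(0, 0.3, t.size)  # true order is 3
print(select_order(y, t, max_order=6))
print(select_order(y, t, max_order=6, partial=True))
```

The subset enumeration makes the partial search exponentially more expensive in the candidate order, which mirrors the computational-complexity trade-off noted above.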
Abstract:
Process modeling is an emergent area of Information Systems research that is characterized by an abundance of conceptual work and little empirical research. To fill this gap, this paper reports on the development and validation of an instrument to measure user acceptance of process modeling grammars. We advance an extended model for a multi-stage measurement instrument development procedure, which incorporates feedback from both expert and user panels. We identify two main contributions. First, we provide a validated measurement instrument for the study of user acceptance of process modeling grammars, which can be used to assist further empirical studies that investigate phenomena associated with the business process modeling domain. Second, in doing so, we describe in detail a procedural model for developing measurement instruments that ensures high levels of reliability and validity, which may assist fellow scholars in executing their empirical research.
Abstract:
This action research examines the enhancement of visual communication within the architectural design studio through physical model making. 'It is through physical model making that designers explore their conceptual ideas and develop the creation and understanding of space' (Salama & Wilkinson 2007:126). This research supplements Crowther's findings, extending the understanding of visual dialogue to include physical models. 'Architecture Design 8' is the final core design unit at QUT in the fourth year of the Bachelor of Design Architecture. At this stage it is essential that students are able to communicate their ideas in a comprehensive manner, relying on a combination of skill sets including drawing, physical model making and computer modeling. Observations within this research indicate that, through the first half of the semester, students did not integrate this combination of skill sets in the design process, focusing primarily on drawing and computer modeling. The challenge was to promote deeper learning through physical model making. This research addresses one of the primary reasons for the lack of physical model making: the limited assessment emphasis on the physical models. The unit was modified midway through the semester to better correlate the lecture theory with studio activities by incorporating a series of model making exercises conducted during studio time. The outcome of each exercise was assessed. Tutors were surveyed regarding the model making activities, and a focus group was conducted to obtain formal feedback from students. Students and tutors recognised the added value in communicating design ideas through physical forms and model making. The studio environment was invigorated by the enhanced learning outcomes of the students who participated in the model making exercises. The conclusions of this research will guide the structure of the next iteration of the fourth-year design unit.
Abstract:
Community Child Health Nursing Services provide support for new mothers; however, the focus has often been on individual consultations, complemented by a series of group sessions soon after birth. We describe a new model of community care for first-time mothers that centres on group sessions throughout the whole contact period. The model was developed by practicing child health nurses for a large health service district in south-east Queensland, which offers a comprehensive community child health service. Issues identified by clinicians working within existing services, feedback from clients and the need for more resource-efficient methods of service provision underpinned the development of the model. The pilot program was implemented in two community child health centres in Brisbane. An early individual consultation to engage the family with the service was added in response to feedback from clinicians and clients. The modified model has since been implemented service-wide as the ‘First Steps Program’. The introduction of this model has ensured that the service has been able to retain a comprehensive service for first-time parents from a universal population, while responding to the challenges of population growth and the increasing number of complex clients placing demands on resources.
Abstract:
Corneal-height data are typically measured with videokeratoscopes and modeled using a set of orthogonal Zernike polynomials. We address the estimation of the number of Zernike polynomials, which is formalized as a model-order selection problem in linear regression. Classical information-theoretic criteria tend to overestimate the order of the corneal surface model due to the weakness of their penalty functions, while bootstrap-based techniques tend to underestimate it or require extensive processing. In this paper, we propose using the efficient detection criterion (EDC), which has the same general form as information-theoretic criteria, as an alternative for estimating the optimal number of Zernike polynomials. We first show, via simulations, that the EDC outperforms a large number of information-theoretic criteria and resampling-based techniques. We then illustrate that using the EDC for real corneas results in models that are in closer agreement with clinical expectations and provides a means for distinguishing normal corneal surfaces from astigmatic and keratoconic ones.
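To illustrate why a stronger penalty curbs the overestimation described above, the sketch below (our own toy, not the paper's experiment) compares AIC's fixed penalty of 2 per parameter with one admissible EDC-style penalty rate, sqrt(n log n), on a synthetic nested regression. A 1-D polynomial basis stands in for the Zernike terms, and the coefficients and noise level are invented.

```python
import numpy as np

def fit_rss(y, X):
    """Residual sum of squares of the least-squares fit y ~ X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.sum((y - X @ beta) ** 2)

def choose_terms(y, basis, penalty):
    """Minimise n*log(rss/n) + penalty*k over nested models with k terms."""
    n = y.size
    scores = [n * np.log(fit_rss(y, basis[:, :k]) / n) + penalty * k
              for k in range(1, basis.shape[1] + 1)]
    return int(np.argmin(scores)) + 1

rng = np.random.default_rng(1)
n = 300
r = np.linspace(0, 1, n)
basis = np.column_stack([r ** j for j in range(10)])  # stand-in for Zernike terms
y = basis[:, :4] @ np.array([1.0, -0.5, 0.3, 0.8]) + rng.normal(0, 0.01, n)

aic_k = choose_terms(y, basis, penalty=2.0)                      # weak penalty
edc_k = choose_terms(y, basis, penalty=np.sqrt(n * np.log(n)))   # EDC-style rate
print(aic_k, edc_k)
```

Because both criteria share the same fit term, the heavier per-parameter penalty can only select the same or a smaller model, which is the mechanism behind the EDC's resistance to overfitting.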
Abstract:
A hierarchical structure is used to represent the content of semi-structured documents such as XML and XHTML. The traditional Vector Space Model (VSM) is not sufficient to represent both the structure and the content of such web documents. Hence, in this paper we introduce a novel method of representing XML documents in a Tensor Space Model (TSM) and then utilize it for clustering. Empirical analysis shows that the proposed method scales to a real-life dataset, and that the factorized matrices it produces help to improve the quality of clusters, thanks to the enriched document representation that captures both structure and content information.
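The core idea, representing documents along a structural mode and a content mode at once, can be sketched as a third-order documents × paths × terms tensor (a simplification of the TSM, not the authors' factorisation). The toy XML snippets are invented, and element tags stand in for full structural paths.

```python
import xml.etree.ElementTree as ET
import numpy as np

docs = [
    "<book><title>deep learning</title><author>goodfellow</author></book>",
    "<book><title>machine learning</title><author>bishop</author></book>",
    "<recipe><name>pancakes</name><step>mix flour</step></recipe>",
]

# Collect (element-tag, term) pairs: the structural and content modes.
pairs = [
    [(elem.tag, w) for elem in ET.fromstring(d).iter() if elem.text
     for w in elem.text.split()]
    for d in docs
]
paths = sorted({p for doc in pairs for p, _ in doc})
terms = sorted({t for doc in pairs for _, t in doc})

# Third-order tensor: documents x paths x terms.
tensor = np.zeros((len(docs), len(paths), len(terms)))
for i, doc in enumerate(pairs):
    for p, t in doc:
        tensor[i, paths.index(p), terms.index(t)] += 1

# Matricise per document and compare documents by cosine similarity.
flat = tensor.reshape(len(docs), -1)
norm = flat / np.linalg.norm(flat, axis=1, keepdims=True)
sim = norm @ norm.T
print(np.round(sim, 2))
```

The two book documents share both structure (title/author elements) and content (the term "learning"), so they come out similar, while the recipe shares neither; a plain bag-of-words VSM would lose the structural half of that signal.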
Abstract:
Information behavior models generally focus on one of many aspects of information behavior: information finding (conceptualized as information seeking, information foraging or information sense-making), information organizing, or information using. This ongoing study is developing an integrated model of information behavior. The research design involves a two-week daily information journal maintained by the participants themselves, combined with two interviews, one before and one after the journal-keeping period. The data will be analyzed using grounded theory to identify when the participants engage in the various behaviors that have already been observed, identified and defined in previous models, in order to generate useful sequential data and an integrated model.