954 results for Accelerating Moment Release (AMR) Model
Abstract:
Shrinking product lifecycles, tough international competition, swiftly changing technologies, ever-increasing customer quality expectations and demand for high-variety options are some of the forces driving the next generation of development processes. To meet these challenges, the design cost and development time of a product must be reduced while its quality is improved. Design reuse is considered one of the lean strategies for winning the race in this competitive environment: it can reduce product development time, product development cost and the number of defects, all of which ultimately influence product performance in cost, time and quality. However, little or no work has been carried out to quantify the effectiveness of design reuse on product development performance measures such as design cost, development time and quality. Therefore, in this study we propose a systematic design-reuse-based product design framework and develop a design leanness index (DLI) as a measure of the effectiveness of design reuse. The DLI is a representative measure of reuse effectiveness in cost, development time and quality. Through this index, a clear relationship between the reuse measure and product development performance metrics is established. Finally, a cost-based model is developed to maximise the design leanness index for a product within a given set of constraints, thereby achieving leanness in the design process.
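The abstract does not give the functional form of the DLI; the sketch below is only illustrative, assuming the index is a weighted aggregate of normalised cost, time and defect improvements attributed to reuse, maximised over the reuse fraction under a simple cost budget. The weights, savings figures and constraint are hypothetical, not the authors' model.

```python
import numpy as np

def design_leanness_index(reuse_fraction, w_cost=0.4, w_time=0.3, w_quality=0.3,
                          cost_saving=0.5, time_saving=0.4, defect_reduction=0.6):
    """Illustrative DLI: weighted sum of normalised improvements attributed to reuse.

    Each *_saving argument is the maximum fractional improvement assumed achievable
    when the design is fully reused (assumed values, not taken from the paper).
    """
    return (w_cost * cost_saving * reuse_fraction
            + w_time * time_saving * reuse_fraction
            + w_quality * defect_reduction * reuse_fraction)

def maximise_dli(adaptation_cost_per_unit_reuse=1.2, budget=1.0, step=0.01):
    """Grid-search the reuse fraction that maximises the DLI under a cost budget."""
    best = (0.0, 0.0)  # (dli, reuse_fraction)
    for r in np.arange(0.0, 1.0 + step, step):
        if adaptation_cost_per_unit_reuse * r > budget:   # constraint: reusing a design is not free
            continue
        dli = design_leanness_index(r)
        if dli > best[0]:
            best = (dli, r)
    return best

if __name__ == "__main__":
    dli, reuse = maximise_dli()
    print(f"best DLI = {dli:.3f} at reuse fraction {reuse:.2f}")
```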
Abstract:
Online scheduling in the Operating Theatre Department is a dynamic process that deals with both elective and emergency patients. Each business day begins with an elective schedule determined in advance from a master surgery schedule. Throughout the course of the day, however, disruptions to this baseline schedule occur due to variations in treatment time, emergency arrivals, equipment failure and resource unavailability. An innovative robust reactive surgery assignment model is developed for the operating theatre department. Following the completion of each surgery, the schedule is re-solved taking any disruptions into account, in order to minimise cancellations of pre-planned patients and maximise throughput of emergency cases. The single-theatre case is solved, and future work on the computationally more complex multiple-theatre case under resource constraints is discussed.
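The assignment model itself is not reproduced in the abstract; the following is a minimal sketch of the reactive re-solving idea, using a greedy single-theatre policy (emergencies first, electives in planned order, deferral when the session would overrun) as a hypothetical stand-in for the paper's optimisation model.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Case:
    name: str
    expected_duration: int   # minutes
    emergency: bool = False

@dataclass
class TheatreState:
    now: int                 # minutes elapsed since the session start
    session_end: int         # closing time of the theatre session
    waiting: List[Case] = field(default_factory=list)

def reactive_reschedule(state: TheatreState) -> List[Case]:
    """Re-solve the single-theatre schedule after each completed surgery.

    Greedy stand-in for the paper's assignment model: emergencies go first, then
    electives in their planned order; any case that no longer fits before the
    session end is flagged for deferral/cancellation.
    """
    ordered = sorted(state.waiting, key=lambda c: not c.emergency)
    schedule, t = [], state.now
    for case in ordered:
        if t + case.expected_duration <= state.session_end:
            schedule.append(case)
            t += case.expected_duration
        else:
            print(f"deferred: {case.name}")
    return schedule

# Example: an emergency arriving mid-session is slotted ahead of the remaining electives.
state = TheatreState(now=300, session_end=480,
                     waiting=[Case("elective-3", 90), Case("elective-4", 60),
                              Case("emergency-1", 45, emergency=True)])
for c in reactive_reschedule(state):
    print(c.name)
```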
Abstract:
This paper introduces a model to facilitate delegation, including ad-hoc delegation, in cross-security-domain activities. Specifically, this paper proposes a novel delegation constraint management model to manage and track delegation constraints across security domains. An algorithm to trace the authority of delegation constraints is introduced, as well as an algorithm to form a delegation constraint set and detect/prevent potential conflicts. The algorithms and the management model are built upon a set of formal definitions of delegation constraints.
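The abstract does not define the constraint syntax; the sketch below assumes each delegation constraint restricts a (delegatee, resource, action) triple with a permit/deny effect and treats two constraints that disagree on the same triple as a conflict. Both the data model and the conflict rule are illustrative assumptions, not the paper's formal definitions.

```python
from dataclasses import dataclass
from typing import Iterable, List, Tuple

@dataclass(frozen=True)
class DelegationConstraint:
    issuer: str       # domain or principal that authored the constraint
    delegatee: str
    resource: str
    action: str
    allow: bool       # True = permit, False = deny

def build_constraint_set(constraints: Iterable[DelegationConstraint]
                         ) -> Tuple[List[DelegationConstraint],
                                    List[Tuple[DelegationConstraint, DelegationConstraint]]]:
    """Form a delegation constraint set, reporting pairs that conflict.

    Two constraints conflict when they target the same (delegatee, resource, action)
    triple but one permits and the other denies it.
    """
    accepted: List[DelegationConstraint] = []
    conflicts = []
    for c in constraints:
        clash = next((a for a in accepted
                      if (a.delegatee, a.resource, a.action) ==
                         (c.delegatee, c.resource, c.action) and a.allow != c.allow), None)
        if clash:
            conflicts.append((clash, c))   # detected: the caller may resolve or reject
        else:
            accepted.append(c)
    return accepted, conflicts

# Example: a deny issued in domain B conflicts with an earlier permit issued in domain A.
cs = [DelegationConstraint("domainA", "alice", "report.pdf", "read", True),
      DelegationConstraint("domainB", "alice", "report.pdf", "read", False)]
ok, bad = build_constraint_set(cs)
print(len(ok), "accepted,", len(bad), "conflicting pair(s)")
```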
Abstract:
The traditional searching method for model-order selection in linear regression is a nested full-parameter-set search over the candidate orders, which we call full-model order selection. A model-selection method, on the other hand, searches for the best sub-model within each order. In this paper, we propose using the model-selection searching method for model-order selection, which we call partial-model order selection. We show by simulations that the proposed searching method gives better accuracy than the traditional one, especially at low signal-to-noise ratios, over a wide range of model-order selection criteria (both information-theoretic and bootstrap-based). We also show that for some models the performance of the bootstrap-based criterion improves significantly when the proposed partial-model selection searching method is used.

Index Terms: model order estimation, model selection, information theoretic criteria, bootstrap

1. INTRODUCTION. Several model-order selection criteria can be applied to find the optimal order. Some of the more commonly used information-theoretic procedures include Akaike's information criterion (AIC) [1], corrected Akaike (AICc) [2], minimum description length (MDL) [3], normalized maximum likelihood (NML) [4], the Hannan-Quinn criterion (HQC) [5], conditional model-order estimation (CME) [6], and the efficient detection criterion (EDC) [7]. From a practical point of view, it is difficult to decide which model-order selection criterion to use. Many of them perform reasonably well when the signal-to-noise ratio (SNR) is high. The discrepancies in their performance, however, become more evident when the SNR is low. In those situations, the performance of a given technique is determined not only by the model structure (say, a polynomial trend versus a Fourier series) but, more importantly, by the relative values of the parameters within the model. This makes comparison between model-order selection algorithms difficult, as within the same model at a given order one can find examples for which a given method performs well or fails [6, 8]. Our aim is to improve the performance of model-order selection criteria at low SNR by considering a model-selection searching procedure that takes into account not only the full-model order search but also a partial-model search within each given model order. Understandably, the improvement in model-order estimation performance comes at the expense of additional computational complexity.
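To make the distinction between the two searches concrete, here is a minimal sketch using AIC on a polynomial basis; the basis, the sparse data-generating model, the noise level and the use of AIC alone are illustrative assumptions rather than the experimental setup of the paper.

```python
from itertools import combinations
import numpy as np

rng = np.random.default_rng(0)
n, max_order = 64, 6
x = np.linspace(-1, 1, n)
X = np.vander(x, max_order, increasing=True)        # columns: 1, x, ..., x^(max_order-1)
y = 2.0 * X[:, 1] + 1.5 * X[:, 4] + rng.normal(scale=1.0, size=n)  # sparse true model

def aic(y, X_sub, k):
    """AIC for a least-squares fit with k free parameters (Gaussian noise assumed)."""
    beta, *_ = np.linalg.lstsq(X_sub, y, rcond=None)
    rss = np.sum((y - X_sub @ beta) ** 2)
    return n * np.log(rss / n) + 2 * k

def full_model_order(y, X):
    """Traditional search: order k means 'use all of the first k columns'."""
    scores = [aic(y, X[:, :k], k) for k in range(1, X.shape[1] + 1)]
    return int(np.argmin(scores)) + 1

def partial_model_order(y, X):
    """Proposed search: score order k by the best sub-model drawn from the first k columns."""
    scores = []
    for k in range(1, X.shape[1] + 1):
        best = min(aic(y, X[:, list(idx)], len(idx))
                   for r in range(1, k + 1)
                   for idx in combinations(range(k), r))
        scores.append(best)
    return int(np.argmin(scores)) + 1

print("full-model order:   ", full_model_order(y, X))
print("partial-model order:", partial_model_order(y, X))
```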
Abstract:
Process modeling is an emergent area of Information Systems research that is characterized by an abundance of conceptual work with little empirical research. To fill this gap, this paper reports on the development and validation of an instrument to measure user acceptance of process modeling grammars. We advance an extended model for a multi-stage measurement instrument development procedure, which incorporates feedback from both expert and user panels. We identify two main contributions: First, we provide a validated measurement instrument for the study of user acceptance of process modeling grammars, which can be used to assist in further empirical studies that investigate phenomena associated with the business process modeling domain. Second, in doing so, we describe in detail a procedural model for developing measurement instruments that ensures high levels of reliability and validity, which may assist fellow scholars in executing their empirical research.
Abstract:
This action research examines the enhancement of visual communication within the architectural design studio through physical model making. ‘It is through physical model making that designers explore their conceptual ideas and develop the creation and understanding of space’ (Salama & Wilkinson 2007:126). This research supplements Crowther’s findings, extending the understanding of visual dialogue to include physical models. ‘Architecture Design 8’ is the final core design unit at QUT in the fourth year of the Bachelor of Design Architecture. At this stage it is essential that students have the ability to communicate their ideas in a comprehensive manner, relying on a combination of skill sets including drawing, physical model making, and computer modeling. Observations within this research indicate that students did not integrate this combination of skill sets in the design process through the first half of the semester, focusing primarily on drawing and computer modeling. The challenge was to promote deeper learning through physical model making. This research addresses one of the primary reasons for the lack of physical model making: the limited assessment emphasis on physical models. The unit was modified midway through the semester to better correlate the lecture theory with studio activities by incorporating a series of model making exercises conducted during studio time. The outcome of each exercise was assessed. Tutors were surveyed regarding the model making activities and a focus group was conducted to obtain formal feedback from students. Students and tutors recognised the added value in communicating design ideas through physical forms and model making. The studio environment was invigorated by the enhanced learning outcomes of the students who participated in the model making exercises. The conclusions of this research will guide the structure of the upcoming iteration of the fourth-year design unit.
Abstract:
This article examines the moment of exchange between artist, audience and culture in Live Art. Drawing on historical and contemporary examples, including examples from the Exist in 08 Live Art Event in Brisbane, Australia, in October 2008, it argues that Live Art - be it body art, activist art, site-specific performance, or other sorts of performative intervention in the public sphere - is characterised by a common set of claims about activating audiences, asking them to reflect on cultural norms challenged in the work. Live Art presents risky actions, in a context that blurs the boundaries between art and reality, to position audients as ‘witnesses’ who are personally implicated in, and responsible for, the actions unfolding before them. This article problematises assumptions about the way the uncertainties embedded in the Live Art encounter contribute to its deconstructive agenda. It uses the ethical theory of Emmanuel Levinas, Hans-Thies Lehmann and Dwight Conquergood to examine the mechanics of reductive, culturally-recuperative readings that can limit the efficacy of the Live Art encounter. It argues that, though ‘witnessing’ in Live Art depends on a relation to the real - real people, taking real risks, in real places - if it fails to foreground the theatrical frame it is difficult for audients to develop the dual consciousness of the content, and their complicity in that content, that is the starting point for reflexivity, and response-ability, in the ethical encounter.
Abstract:
The focus of this paper is preparing research for dissemination by mainstream print, broadcast, and online media. While the rise of the blogosphere and social media is proving an effective way of reaching niche audiences, my own research reached such an audience through traditional media. The first major study of Australian horror cinema, my PhD thesis A Dark New World: Anatomy of Australian Horror Films, generated strong interest from horror movie fans, film scholars, and filmmakers. I worked closely with the Queensland University of Technology’s (QUT) public relations unit to write two separate media releases circulated on October 13, 2008 and October 14, 2009. This chapter reflects upon the process of working with the media and provides tips for reaching audiences, particularly in terms of strategically planning outcomes. It delves into the background of my study which would later influence my approach to the media, the process of drafting media releases, and key outcomes and benefits from popularising research. A key lesson from this experience is that redeveloping research for the media requires a sharp writing style, a willingness to let go of academic justification, catchy quotes, and an ability to distil complex details into easy-to-understand concepts. Although my study received strong media coverage, and I have since become a media commentator, my experiences also revealed a number of pitfalls that are likely to arise for other researchers keen on targeting media coverage.
Abstract:
In this paper, two ideal formation models of serrated chips, the symmetric formation model and the unilateral right-angle formation model, are established for the first time. Based on the ideal models and the related adiabatic shear theory of serrated chip formation, the theoretical relationship among average tooth pitch, average tooth height and chip thickness is obtained. Further, the theoretical relation between the passivation coefficient of the chip's sawtooth and the chip thickness compression ratio is deduced as well. The comparison between these theoretical prediction curves and experimental data shows good agreement, which validates the robustness of the ideal chip formation models and the correctness of the theoretical analysis. The proposed ideal models may provide a simple but effective theoretical basis for subsequent research on serrated chip morphology. Finally, the influences of the principal cutting factors on serrated chip formation are discussed on the basis of a series of finite element simulation results, to provide practical advice on controlling serrated chips in engineering applications.
Abstract:
Community Child Health Nursing Services provide support for new mothers; however, the focus has often been on individual consultations, complemented by a series of group sessions soon after birth. We describe a new model of community care for first-time mothers that centres on group sessions throughout the whole contact period. The model was developed by practicing child health nurses for a large health service district in south-east Queensland, which offers a comprehensive community child health service. Issues identified by clinicians working within existing services, feedback from clients and the need for more resource-efficient methods of service provision underpinned the development of the model. The pilot program was implemented in two community child health centres in Brisbane. An early individual consultation to engage the family with the service was added in response to feedback from clinicians and clients. The modified model has since been implemented service-wide as the ‘First Steps Program’. The introduction of this model has ensured that the service has been able to retain a comprehensive service for first-time parents from a universal population, while responding to the challenges of population growth and the increasing number of complex clients placing demands on resources.
Abstract:
Corneal-height data are typically measured with videokeratoscopes and modeled using a set of orthogonal Zernike polynomials. We address the estimation of the number of Zernike polynomials, which is formalized as a model-order selection problem in linear regression. Classical information-theoretic criteria tend to overestimate the order of the corneal surface model because of the weakness of their penalty functions, while bootstrap-based techniques tend to underestimate it or require extensive computation. In this paper, we propose the efficient detection criterion (EDC), which has the same general form as information-theoretic criteria, as an alternative for estimating the optimal number of Zernike polynomials. We first show, via simulations, that the EDC outperforms a large number of information-theoretic criteria and resampling-based techniques. We then illustrate that using the EDC for real corneas results in models that are in closer agreement with clinical expectations and provides a means for distinguishing normal corneal surfaces from astigmatic and keratoconic surfaces.
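As a rough illustration of how an EDC-style criterion differs from AIC in this setting, the sketch below fits nested models by least squares and compares penalties. A 1-D Legendre basis stands in for the Zernike expansion, and the penalty sequence c_N = sqrt(N ln ln N) is one common choice satisfying the EDC growth conditions; both are assumptions rather than the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200
x = np.linspace(-1, 1, N)
basis = np.polynomial.legendre.legvander(x, 14)         # columns P_0 ... P_14 (Zernike stand-in)
true_order = 5
height = basis[:, :true_order + 1] @ rng.normal(size=true_order + 1)
y = height + 0.05 * rng.normal(size=N)                   # noisy "surface" samples

def criterion(y, X, k, penalty_per_param):
    """Generic order-selection score: N*log(RSS/N) + k * penalty."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    return N * np.log(rss / N) + k * penalty_per_param

c_edc = np.sqrt(N * np.log(np.log(N)))   # c_N / N -> 0 and c_N / log log N -> infinity
orders = range(1, basis.shape[1] + 1)
aic_pick = min(orders, key=lambda k: criterion(y, basis[:, :k], k, 2.0))
edc_pick = min(orders, key=lambda k: criterion(y, basis[:, :k], k, c_edc))
print("AIC selects", aic_pick, "terms; EDC selects", edc_pick, "terms (true:", true_order + 1, ")")
```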
Abstract:
A hierarchical structure is used to represent the content of semi-structured documents such as XML and XHTML. The traditional Vector Space Model (VSM) is not sufficient to represent both the structure and the content of such web documents. Hence, in this paper we introduce a novel method of representing XML documents in a Tensor Space Model (TSM) and then utilize it for clustering. Empirical analysis shows that the proposed method is scalable to a real-life dataset and that the factorized matrices it produces help to improve the quality of clusters, owing to the enriched document representation that captures both structure and content information.
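The construction below is a minimal sketch of the idea, not the method of the paper: a third-order (document, structural path, term) count tensor is built from toy XML-like data, unfolded along the document mode, and factorized with a truncated SVD as a stand-in for the factorization used for clustering.

```python
import numpy as np

# Toy documents: each maps a (structural path, term) pair to an occurrence count.
docs = [
    {("article/title", "tensor"): 2, ("article/body", "cluster"): 3},
    {("article/title", "tensor"): 1, ("article/body", "model"): 2},
    {("note/title", "meeting"): 2, ("note/body", "agenda"): 1},
]
paths = sorted({p for d in docs for p, _ in d})
terms = sorted({t for d in docs for _, t in d})

# Build the (documents x paths x terms) count tensor.
T = np.zeros((len(docs), len(paths), len(terms)))
for i, d in enumerate(docs):
    for (p, t), count in d.items():
        T[i, paths.index(p), terms.index(t)] = count

# Mode-1 unfolding: one row per document, columns run over (path, term) pairs.
X = T.reshape(len(docs), -1)

# Truncated SVD gives low-rank document features that mix structure and content;
# these features could then be fed to any clustering algorithm.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
doc_features = U[:, :2] * s[:2]
print(doc_features)   # the two 'article' documents come out closer to each other than to the 'note' document
```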
Abstract:
Information behavior models generally focus on one of the many aspects of information behavior: information finding (conceptualized as information seeking, information foraging, or information sense-making), information organizing, or information using. This ongoing study is developing an integrated model of information behavior. The research design involves a two-week-long daily information journal self-maintained by the participants, combined with two interviews, one before and one after the journal-keeping period. The data from the study will be analyzed using grounded theory to identify when the participants engage in the various behaviors that have already been observed, identified, and defined in previous models, in order to generate useful sequential data and an integrated model.
Abstract:
Alexander’s Ecological Dominance and Social Competition (EDSC) model currently provides the most comprehensive overview of human traits in the development of a theory of human evolution and sociality (Alexander, 1990; Flinn, Geary & Ward, 2005; Irons, 2005). His model provides a basis for explaining the evolution of human socio-cognitive abilities. Our paper examines the extension of Alexander’s model to incorporate the human trait of information behavior, in synergy with ecological dominance and social competition, as a human socio-cognitive competence. This paper discusses the various interdisciplinary perspectives exploring how evolution has shaped information behavior and why information behavior is emerging as an important human socio-cognitive competence. This paper outlines these issues, including the extension of Spink and Currier’s (2006a, b) evolution of information behavior model towards a more integrated understanding of how information behaviors have evolved (Spink & Cole, 2006).