817 results for Design theory
Abstract:
A flexible and simple Bayesian decision-theoretic design for dose-finding trials is proposed in this paper. To reduce the computational burden, we adopt a working model with conjugate priors, which is flexible enough to fit all monotonic dose-toxicity curves and produces analytic posterior distributions. We also discuss how to use a proper utility function to reflect the interest of the trial. Patients are allocated based not only on the utility function but also on the chosen dose selection rule. The most popular dose selection rule is the one-step-look-ahead (OSLA), which selects the best-so-far dose. A more complicated rule, such as the two-step-look-ahead, is theoretically more efficient than the OSLA only when the required distributional assumptions are met, which is, however, often not the case in practice. We carried out extensive simulation studies to evaluate these two dose selection rules and found that OSLA was often more efficient than the two-step-look-ahead under the proposed Bayesian structure. Moreover, our simulation results show that the proposed Bayesian method outperforms several popular Bayesian methods and that the negative impact of prior misspecification can be managed at the design stage.
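The abstract does not spell out the working model beyond conjugacy, so the following is only a rough Python sketch of an OSLA-style selection under independent beta-binomial posteriors; the independent priors, the quadratic-loss utility and all names are illustrative assumptions, and the paper's actual working model enforces monotonicity across doses.

```python
def osla_next_dose(toxicities, n_patients, target=0.3, prior=(0.5, 0.5)):
    """Pick the dose whose posterior mean toxicity is closest to the target.

    toxicities[d] -- number of observed toxicities at dose d
    n_patients[d] -- number of patients treated at dose d
    prior         -- (a, b) of an illustrative Beta prior, shared by all doses
    """
    a0, b0 = prior
    best_dose, best_utility = None, float("-inf")
    for d, (tox, n) in enumerate(zip(toxicities, n_patients)):
        post_mean = (a0 + tox) / (a0 + b0 + n)  # Beta posterior mean
        utility = -(post_mean - target) ** 2    # quadratic-loss utility
        if utility > best_utility:
            best_dose, best_utility = d, utility
    return best_dose

# e.g. 0/3, 1/3 and 2/3 toxicities at three doses, target 0.3
print(osla_next_dose([0, 1, 2], [3, 3, 3]))  # -> 1
```

The "best-so-far" character of OSLA shows up in the greedy argmax: each new cohort is simply assigned to the dose that currently maximizes the utility.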
Abstract:
The primary goal of a phase I trial is to find the maximally tolerated dose (MTD) of a treatment. The MTD is usually defined in terms of a tolerable probability, q*, of toxicity. Our objective is to find the highest dose with toxicity risk that does not exceed q*, a criterion that is often desired in designing phase I trials. This criterion differs from that of finding the dose with toxicity risk closest to q*, which is used in methods such as the continual reassessment method. We use the theory of decision processes to find optimal sequential designs that maximize the expected number of patients within the trial allocated to the highest dose with toxicity not exceeding q*, among the doses under consideration. The proposed method is very general in the sense that criteria other than the one considered here can be optimized and that optimal dose assignment can be defined in terms of patients within or outside the trial. It includes the continual reassessment method as an important special case. A numerical study indicates that the strategy compares favourably with other phase I designs.
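The paper derives its designs by dynamic programming, which the abstract does not detail; as a crude plug-in illustration of the criterion itself ("highest dose with toxicity risk not exceeding q*"), one can compare posterior mean toxicities against q* under an assumed monotone dose-toxicity curve. The Beta prior and all names here are illustrative, not the paper's method.

```python
def highest_safe_dose(toxicities, n_patients, q_star=0.3, prior=(1.0, 1.0)):
    """Return the index of the highest dose whose posterior mean toxicity
    is <= q_star, or -1 if no dose qualifies.

    Assumes a monotone dose-toxicity curve, so scanning stops at the
    first dose judged unsafe.
    """
    a0, b0 = prior
    safe = -1
    for tox, n in zip(toxicities, n_patients):
        post_mean = (a0 + tox) / (a0 + b0 + n)
        if post_mean > q_star:
            break  # monotonicity: all higher doses assumed unsafe too
        safe += 1
    return safe

print(highest_safe_dose([0, 0, 2], [3, 3, 3], q_star=0.3))  # -> 1
```

Note the contrast with the continual reassessment method's "closest to q*" rule: here a dose slightly above q* is excluded rather than possibly chosen.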
Abstract:
It has been nearly 25 years since the problems associated with passive learning in large undergraduate classes were first established by McDermott (1991). STEM education, for example North Carolina State University's SCALE-UP project, has subsequently been influenced by some unique aspects of design studio education. While many institutions now apply SCALE-UP or similar approaches to enable lively interaction, enhance learning, increase student engagement, and teach many different content areas to classes of all sizes, nearly all of these have remained in the STEM fields (Beichner, 2008). Architectural education, although originally at the forefront of this field, has arguably been left behind. Architectural practice is undergoing significant change globally. Access to new technology and the development of specialised architectural documentation software has scaffolded new building procurement methods and allowed consultant teams to work more collaboratively, more efficiently and even across different time zones. Until recently, the spatial arrangements, pedagogical approaches and project work outcomes of the architectural design studio had changed little since its inception. With this new injection of high-end technology and personal mobile Wi-Fi-enabled devices, it is no longer possible to keep operating architectural design studios the way they have run for the past two hundred years. Employing a grounded theory methodology, this study reviews the current provision of architectural design learning terrains across a range of tertiary institutions in Australia. Suggestions are provided for how these spaces could be modified to address the changing nature of the profession, and implications for how these changes may affect the design of future SCALE-UP-type spaces outside the discipline of architecture are also explored.
Connecting the space between design and research: Explorations in participatory research supervision
Abstract:
In this article we offer a single case study using an action research method for gathering and analysing data, offering insights valuable to both design and research supervision practice. We do not attempt to generalise from this single case, but offer it as an instance that can improve our understanding of research supervision practice. We question the conventional 'dyadic' models of research supervision and outline a more collaborative model, based on the signature pedagogy of architecture: the design studio. A novel approach to the supervision of creatively oriented postgraduate students is proposed, including new approaches to design methods and participatory supervision that draw on established design studio practices. This model collapses the distance between design and research activities. Our case study, involving Research Masters student supervision in the discipline of Architecture, shows how 'connected learning' emerges from this approach. This type of learning builds strong elements of creativity and fun, which promote and enhance student engagement. The results of our action research suggest that students learn to research more easily in such an environment and that supervisory practices are enhanced when we apply the techniques and characteristics of design studio pedagogy to the more conventional research pedagogies imported from the humanities. We believe that other creative disciplines can apply similar tactics to enrich both the creative practice of research and the supervision of HDR students.
Abstract:
Reggio Emilia is an educational philosophy that encourages teachers, students and their parents to collaborate and actively engage with the environment. This study investigates how the Reggio Emilia design approach was translated architecturally for a kindergarten in an Australian context, and provides insights into the operation of this Reggio kindergarten and the impact that it is now having on the occupants. It evaluates the original architectural design intent of the Reggio Emilia early childhood learning environment against its spatial provision. The relationship that the Reggio Emilia approach facilitates between students and the environment, and the contribution that this approach makes to their learning, are also explored. Several key themes emerging from the Reggio values were identified in the literature. These were then used to inform an exploration of the kindergarten's spaces and places. Architects, teachers and a sustainability manager of the kindergarten were interviewed, with their experiences constituting the primary data of this study. Using a Grounded Theory methodology, systematic data coding and analysis were then conducted. Themes and concepts that emerged from this process include: differing interpretations of the Reggio Emilia philosophy; motivations for the neglect of traditional external structures and play equipment; the impact of education for sustainability; and the positive effects that Reggio Emilia is having on the rest of the institution's development.
Abstract:
This is a presentation of the refereed paper accepted for the conference proceedings. The presentation was given on Tuesday, 1 December 2015.
Abstract:
This thesis studies how conceptual process models - that is, graphical documentations of an organisation's business processes - can enable and constrain the actions of their users. The results from a case study and an experiment indicate that model design decisions and people's characteristics influence how these opportunities for action are perceived and acted upon in practice.
Abstract:
We present a new, generic method for multi-objective design optimization of laminated composite components using a novel multi-objective optimization algorithm developed on the basis of the Quantum-behaved Particle Swarm Optimization (QPSO) paradigm. QPSO is a variant of the popular Particle Swarm Optimization (PSO) and has been developed and implemented successfully for the multi-objective design optimization of composites. The problem is formulated with the multiple objectives of minimizing the weight and the total cost of the composite component while achieving a specified strength. The primary optimization variables are the number of layers, their stacking sequence (the orientation of the layers) and the thickness of each layer. Classical lamination theory is used to determine the stresses in the component, and the design is evaluated against three failure criteria: the failure-mechanism-based failure criterion, the maximum stress failure criterion and the Tsai-Wu failure criterion. The optimization method is validated for a number of different loading configurations: uniaxial, biaxial and bending loads. The design optimization has been carried out both for variable stacking sequences and for fixed standard stacking schemes, and a comparative study of the different design configurations evolved is presented. The performance of QPSO is also compared with that of conventional PSO.
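The abstract does not reproduce the QPSO update itself; under the commonly cited QPSO formulation, a single scalar position update (for one decision variable such as a ply thickness) can be sketched as below. The parameter names and the default β are illustrative assumptions, and variable-length stacking sequences would need an encoding on top of this.

```python
import math
import random

def qpso_position_update(x, pbest, gbest, mbest, beta=0.75, rng=random):
    """One QPSO position update for a single scalar decision variable.

    x      -- current position of the particle
    pbest  -- this particle's personal best position
    gbest  -- the swarm's global best position
    mbest  -- mean of all personal bests (the "mainstream thought" point)
    beta   -- contraction-expansion coefficient
    """
    phi = rng.random()
    p = phi * pbest + (1.0 - phi) * gbest        # stochastic local attractor
    u = 1.0 - rng.random()                        # u in (0, 1], avoids log(0)
    step = beta * abs(mbest - x) * math.log(1.0 / u)
    # move toward the attractor with a heavy-tailed random step
    return p + step if rng.random() < 0.5 else p - step
```

A useful sanity check: when the particle already sits at the swarm consensus (x == pbest == gbest == mbest), the attractor equals that point and the step length is zero, so the particle stays put.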
Abstract:
The aim of this thesis was to compare how closely the appearance produced by a computer-assisted embroidery design program corresponds to the original image. The study focused on embroidery with household machines and was conducted from a usability point of view using two automatic techniques of Brother's PE-design 6.0 embroidery design program: multicoloured fragment design and multicoloured stitch surface design. The subject is topical because of the rapid development of machine embroidery. The theoretical background covers the history of household sewing machines, embroidery sewing machines, stitch types in household sewing machines, embroidery design programs, and the six automatic techniques of the PE-design 6.0 program. The design of embroidery patterns was also covered: the original image, digitizing, punching, suitable sewing threads, and the connection between embroidery designs and the materials used for embroidery. The correspondence of the sewn appearances was examined with experimental sewing: 18 research samples of five original images were sewn with both techniques. The experiments were divided into four testing stages in the design program, and each testing stage was followed by experimental sewing with a Brother Super Galaxie 3100D embroidery machine. The experiments were reported in process files and in forms created for the techniques. The research samples were analysed on a syntactic image basis using sensory perception assessment, and the correspondence between the original images and the embroidered appearances was rated with a form divided into colour and shape assessment on a five-stage similarity scale. Based on this correspondence analysis, with both automatic techniques the best correspondence of colour and shape was achieved by changing the standard settings and using the maker's own thread chart and an edited original image.
Based on the testing, it cannot be determined whether the image-editing capabilities of the program are sufficient or whether optimal correspondence requires a separate image-editing program. When aiming at correspondence between the appearances of two images, the computer cannot by itself trace the appearance of the original image: in computer-assisted processing of an embroidery image, human perception and personal decision making remain unavoidable.
Abstract:
This paper critiques a traditional approach to music theory pedagogy. It argues that music theory courses should draw on pedagogies that reflect the diversity and pluralism inherent in 21st century music making. It presents the findings of an action research project investigating the experiences of undergraduate students undertaking an innovative contemporary art music theory course. It describes the students' struggle in coming to terms with a course that integrated composing, performing, listening and analysing, coupled with what for many was their first exposure to the diversity of contemporary art music. The paper concludes by suggesting that the approach could be adopted more widely throughout music programs.
Abstract:
There is increased interest in the use of UAVs for environmental research such as tracking bush fires, volcanic eruptions, chemical accidents or pollution sources. The aim of this paper is to describe the theory and results of a bio-inspired plume tracking algorithm. A method for generating sparse plumes in a virtual environment was also developed. Results indicated the ability of the algorithms to track plumes in 2D and 3D. The system has been tested with hardware-in-the-loop (HIL) simulations and in flight using a CO2 gas sensor mounted on a multi-rotor UAV. The UAV is controlled by the plume tracking algorithm running on the ground control station (GCS).
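The abstract does not detail the bio-inspired behaviour; if it follows the common moth-inspired surge/cast pattern, one decision step could be sketched as below. All names, the threshold and the state layout are hypothetical, not the paper's interface.

```python
def plume_step(concentration, threshold, state):
    """One decision step of a moth-inspired surge/cast plume tracker.

    While the gas sensor reads above threshold, 'surge' upwind toward the
    source; on losing the plume, 'cast' crosswind with a sweep that widens
    the longer the plume stays lost.
    """
    if concentration >= threshold:
        state["lost_steps"] = 0          # plume reacquired
        return "surge_upwind"
    state["lost_steps"] = state.get("lost_steps", 0) + 1
    # alternate the cast direction and widen the sweep on each miss
    direction = "left" if state["lost_steps"] % 2 else "right"
    state["cast_width"] = state["lost_steps"]
    return f"cast_{direction}"

state = {}
print(plume_step(5.0, 1.0, state))  # sensor above threshold -> surge_upwind
print(plume_step(0.2, 1.0, state))  # plume lost -> cast_left
```

In a HIL setup like the one described, this decision function would run on the ground control station, with the concentration fed in from the CO2 sensor telemetry.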
Abstract:
We investigate the Einstein relation for the diffusivity-mobility ratio (DMR) for n-i-p-i structures and the microstructures of nonlinear optical compounds on the basis of a newly formulated electron dispersion law. The corresponding results for III-V, ternary and quaternary materials form a special case of our generalized analysis. The respective DMRs for II-VI, IV-VI and stressed materials have also been studied. Taking CdGeAs2, Cd3As2, InAs, InSb, Hg1−xCdxTe, In1−xGaxAsyP1−y lattice-matched to InP, CdS, PbTe, PbSnTe, Pb1−xSnxSe and stressed InSb as examples, it has been found that the DMR increases with increasing electron concentration in various manners, with different numerical magnitudes that reflect the different signatures of the n-i-p-i systems and the corresponding microstructures. We suggest an experimental method of determining the DMR in this case, and the present simplified analysis is in agreement with the suggested relationship. In addition, our results find three applications in the field of quantum effect devices.
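The abstract does not reproduce the new dispersion law itself; for context, the standard degenerate-statistics form of the Einstein relation that defines the DMR, which analyses of this kind generalize through the material-specific carrier statistics, is

```latex
\frac{D}{\mu} \;=\; \frac{n}{e}\left(\frac{\partial n}{\partial E_{F}}\right)^{-1}
```

where $D$ is the diffusivity, $\mu$ the mobility, $n$ the electron concentration, $e$ the electronic charge, and $E_{F}$ the Fermi energy; the dependence of $n$ on $E_{F}$ is what the new electron dispersion law changes.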
Abstract:
Whether a statistician wants to complement a probability model for observed data with a prior distribution and carry out fully probabilistic inference, or base the inference only on the likelihood function, may be a fundamental question in theory, but in practice it may well be of less importance if the likelihood contains much more information than the prior. Maximum likelihood inference can be justified as a Gaussian approximation at the posterior mode, using flat priors. However, in situations where parametric assumptions in standard statistical models would be too rigid, more flexible model formulation, combined with fully probabilistic inference, can be achieved using hierarchical Bayesian parametrization. This work includes five articles, all of which apply probability modeling to various problems involving incomplete observation. Three of the papers apply maximum likelihood estimation and two of them hierarchical Bayesian modeling. Because maximum likelihood may be presented as a special case of Bayesian inference, but not the other way round, in the introductory part of this work we present a framework for probability-based inference using only Bayesian concepts. We also re-derive some results presented in the original articles using the toolbox developed herein, to show that they are also justifiable under this more general framework. Here the assumption of exchangeability and de Finetti's representation theorem are applied repeatedly to justify the use of standard parametric probability models with conditionally independent likelihood contributions. It is argued that this same reasoning can also be applied under sampling from a finite population. The main emphasis here is on probability-based inference under incomplete observation due to study design. This is illustrated using a generic two-phase cohort sampling design as an example.
The alternative approaches presented for analysis of such a design are full likelihood, which utilizes all observed information, and conditional likelihood, which is restricted to a completely observed set, conditioning on the rule that generated that set. Conditional likelihood inference is also applied for a joint analysis of prevalence and incidence data, a situation subject to both left censoring and left truncation. Other topics covered are model uncertainty and causal inference using posterior predictive distributions. We formulate a non-parametric monotonic regression model for one or more covariates and a Bayesian estimation procedure, and apply the model in the context of optimal sequential treatment regimes, demonstrating that inference based on posterior predictive distributions is feasible also in this case.
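The thesis formulates its monotonic regression in a Bayesian way, which the abstract does not detail; as a frequentist point of reference, the single-covariate least-squares monotone fit can be computed with the pool-adjacent-violators algorithm. This is a minimal sketch, and the function name is illustrative.

```python
def pava(y, w=None):
    """Pool Adjacent Violators: least-squares non-decreasing fit to y.

    y -- observed responses, ordered by the covariate
    w -- optional positive weights (defaults to 1 for every point)
    """
    n = len(y)
    w = [1.0] * n if w is None else list(w)
    # maintain blocks as (mean, weight, size); pool whenever a block's
    # mean drops below the previous block's mean (a monotonicity violation)
    means, weights, sizes = [], [], []
    for i in range(n):
        means.append(y[i]); weights.append(w[i]); sizes.append(1)
        while len(means) > 1 and means[-2] > means[-1]:
            m2, w2, s2 = means.pop(), weights.pop(), sizes.pop()
            m1, w1, s1 = means.pop(), weights.pop(), sizes.pop()
            wt = w1 + w2
            means.append((w1 * m1 + w2 * m2) / wt)
            weights.append(wt); sizes.append(s1 + s2)
    fit = []
    for m, s in zip(means, sizes):
        fit.extend([m] * s)
    return fit

print(pava([1.0, 3.0, 2.0, 4.0]))  # -> [1.0, 2.5, 2.5, 4.0]
```

The Bayesian version in the thesis replaces this point estimate with a posterior over monotone functions, which is what makes posterior predictive inference for treatment regimes possible.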
Abstract:
We have shown that novel synthesis methods, combined with careful evaluation of DFT phonon calculations, provide new insight into boron compounds, including a capacity to predict Tc for AlB2-type superconductors.
Abstract:
Introduction Electronic medication administration record (eMAR) systems are promoted as a potential intervention to enhance medication safety in residential aged care facilities (RACFs). The purpose of this study was to conduct an in-practice evaluation of an eMAR being piloted in one Australian RACF before its roll out, and to provide recommendations for system improvements. Methods A multidisciplinary team conducted direct observations of workflow (n=34 hours) in the RACF site and the community pharmacy. Semi-structured interviews (n=5) with RACF staff and the community pharmacist were conducted to investigate their views of the eMAR system. Data were analysed using a grounded theory approach to identify challenges associated with the design of the eMAR system. Results The current eMAR system does not offer an end-to-end solution for medication management. Many steps, including prescribing by doctors and communication with the community pharmacist, are still performed manually using paper charts and fax machines. Five major challenges associated with the design of the eMAR system were identified: limited interactivity; inadequate flexibility; problems related to information layout and semantics; the lack of relevant decision support; and system maintenance issues. We suggest recommendations to improve the design of the eMAR system and to optimize existing workflows. Discussion Immediate value can be achieved by improving system interactivity, reducing inconsistencies in data entry design and offering dedicated organisational support to minimise connectivity issues. Longer-term benefits can be achieved by adding decision support features and establishing system interoperability requirements with stakeholder groups (e.g. community pharmacies) prior to system roll out. In-practice evaluations of technologies like the eMAR system have great value in identifying design weaknesses that inhibit optimal system use.