806 results for Real-world process


Relevance: 100.00%

Abstract:

This paper synthesises the existing literature on the contemporary conception of ‘real world’ and compares it with similar notions such as ‘authentic’ and ‘work integrated learning’. While the term ‘real world’ may be partly dependent on the discipline, it does not necessarily follow that criterion-referenced ‘real world’ assessment must involve criteria and performance descriptors that are discipline specific. Two examples of summative assessment (a court report and a trial process exercise) from a final year core subject at the Queensland University of Technology, LWB432 Evidence, emphasise real world learning, are authentic and innovative, and better prepare students for the transition into the workplace than do more generic forms of assessment such as tutorial participation or oral presentations. The court report requires students to attend a criminal trial in a Queensland Court and complete a two page report on what they saw in practice compared with what they learned in the classroom. The trial process exercise is a 50 minute written closed book activity conducted in tutorials, in which students plan the questions they would ask their witness in examination-in-chief, the questions they would ask their opponent’s witness in cross-examination, and the questions they would ask in re-examination given what their opponent asked in cross-examination, and prepare written objections to their opponent’s questions. The trial process exercise simulates the real world, whereas the court report involves observing the real world, and both assessment items are important to the role of counsel. The design of the criterion-referenced assessment rubrics for the court report and trial process exercise is justified by the literature. Notably, the criteria and performance descriptors are not necessarily law specific, and this paper highlights the parts that may be easily transferred to other disciplines.

Relevance: 100.00%

Abstract:

While the need for teamwork skills consistently appears in job advertisements across all sectors, the development of these skills remains, for many university students (and some academic staff), one of the most painful and most complained-about experiences. This presentation introduces the final phase of a project that has investigated and analysed the design of teamwork assessment across all discipline areas in order to provide a university-wide protocol for this important graduate capability. The protocol concentrates best practice guidelines and resources across a range of approaches to team assessment and includes an online diagnostic tool for evaluating the quality of assessment design. Guidelines are provided for all aspects of the design process, such as the development of real-world relevance, choosing the ideal team structure, planning for intervention and conflict resolution, and selecting appropriate marking options. While still allowing academic staff to exercise creativity in assessment design, the guidelines increase the possibility of students experiencing a consistent and explicit approach to teamwork throughout their course. If implementation of the protocol is successful, the project team predicts that the resulting consistency and explicitness in approaches to teamwork will lead to more coherent skill development across units, more realistic expectations for students and staff, and better communication between all those participating in the process.

Relevance: 100.00%

Abstract:

The customary approach to the study of meal size suggests that ‘events’ occurring during a meal lead to its termination. Recent research, however, suggests that a number of decisions made before eating commences may affect meal size. The present study sought to address three key research questions around meal size: the extent to which plate cleaning occurs; the prevalence of pre-meal planning and its influence on meal size; and the effect of within-meal experiences, notably the development of satiation. To address these, a large-cohort internet-based questionnaire was developed. Results showed that plate cleaning occurred at 91% of meals and was planned from the outset in 92% of these cases. A significant relationship between plate cleaning and meal planning was observed. Pre-meal plans were resistant to modification over the course of the meal: only 18% of participants reported consumption that deviated from what they had expected. By contrast, 28% reported continuing to eat beyond satiation, and 57% stated that they could have eaten more at the end of the meal. Logistic regression confirmed pre-meal planning as the most important predictor of consumption. Together, our findings demonstrate the importance of meal planning as a key determinant of meal size and energy intake.

Relevance: 100.00%

Abstract:

An identified issue within higher education is the high rate of student attrition after the first year, especially in the STEM disciplines. To address this issue, it is essential to re-examine and redesign the first year curriculum so that it engages students and retains their interest while also scaffolding their learning experience. This session reports on an initiative based on the principles of the “inverted curriculum” within the Bachelor of Information Technology (BIT) course at the Queensland University of Technology (QUT). The initiative began in 2009 and has resulted in a reduction in first-year attrition from 18% in 2008 to 10% in 2009 and 2010, despite growth in student intake of 15% to 40% over the past two years. We present the process and methods that helped achieve this and initiate a discussion on the innovations that are possible within the concept of an inverted curriculum and how it can be implemented.

Relevance: 100.00%

Abstract:

This paper addresses the topic of real-time decision making for autonomous city vehicles, i.e., the autonomous vehicle's ability to make appropriate driving decisions in city road traffic situations. The paper explains the overall control system architecture and the decomposition of the decision making task, and focuses on how Multiple Criteria Decision Making (MCDM) is used in the process of selecting the most appropriate driving maneuver from the set of feasible ones. Experimental tests show that MCDM is suitable for this new application area.
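
A minimal sketch of how a weighted-sum MCDM step might rank feasible maneuvers; the maneuver names, criteria, and weights below are hypothetical illustrations, not values taken from the paper:

```python
# Minimal weighted-sum MCDM sketch: score each feasible maneuver against
# several criteria and pick the best one. Criteria, weights, and ratings
# are illustrative placeholders, not values from the paper.

# Each maneuver is rated on criteria normalised to [0, 1], where 1 is best.
feasible_maneuvers = {
    "keep_lane":   {"safety": 0.9, "progress": 0.6, "comfort": 0.9},
    "change_left": {"safety": 0.7, "progress": 0.8, "comfort": 0.6},
    "stop":        {"safety": 1.0, "progress": 0.1, "comfort": 0.8},
}

# Relative importance of each criterion (weights sum to 1).
weights = {"safety": 0.6, "progress": 0.3, "comfort": 0.1}

def weighted_score(ratings):
    """Aggregate criterion ratings into a single utility value."""
    return sum(weights[c] * r for c, r in ratings.items())

# Select the maneuver with the highest aggregate utility.
best = max(feasible_maneuvers, key=lambda m: weighted_score(feasible_maneuvers[m]))
print(best)  # -> "keep_lane" with these illustrative numbers
```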

Relevance: 100.00%

Abstract:

The Bouncing Back research study, which began after the Queensland flooding in January 2011, has organically expanded through a number of architectural student design projects and exhibitions that have sought to respond to catastrophic flooding events. In September 2011, 10 Queensland University of Technology architecture students travelled to Sydney to help construct a full-scale (1:1) shelter for the Emergency Shelter Exhibition at Customs House in Circular Quay. During the construction of the shelter, data were collected in situ through dynamic interviews with the students. Using a grounded theory methodology, the data were coded and then thematically analysed, revealing three influential factors that positively impacted the students’ learning in this informal context: the student experience, the process of learning through physical making and fabrication, and the development of empathy with the community. Analysis of these three factors demonstrated how this informal situated learning activity promoted vitally important learning in a real-world context, which is difficult to replicate in a physical on-campus environment.

Relevance: 100.00%

Abstract:

A central tenet in the theory of reliability modelling is the quantification of the probability of asset failure. In general, reliability depends on asset age and the maintenance policy applied. Usually, failure and maintenance times are the primary inputs to reliability models. However, for many organisations, different aspects of these data are often recorded in different databases (e.g. work order notifications, event logs, condition monitoring data, and process control data). These recorded data cannot be interpreted individually, since they typically do not have all the information necessary to ascertain failure and preventive maintenance times. This paper presents a methodology for the extraction of failure and preventive maintenance times using commonly-available, real-world data sources. A text-mining approach is employed to extract keywords indicative of the source of the maintenance event. Using these keywords, a Naïve Bayes classifier is then applied to attribute each machine stoppage to one of two classes: failure or preventive. The accuracy of the algorithm is assessed and the classified failure time data are then presented. The applicability of the methodology is demonstrated on a maintenance data set from an Australian electricity company.
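
A minimal sketch of the kind of bag-of-words Naïve Bayes classification described above, assuming scikit-learn is available; the work-order texts and labels are invented for illustration:

```python
# Sketch: classify free-text maintenance notifications as "failure" or
# "preventive" with a bag-of-words Naive Bayes model (scikit-learn assumed).
# The training texts and labels below are invented for illustration only.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

train_texts = [
    "pump tripped on high vibration, bearing seized",
    "unexpected shutdown due to motor overload",
    "scheduled lubrication and filter replacement",
    "routine inspection and calibration of sensors",
]
train_labels = ["failure", "failure", "preventive", "preventive"]

# Bag-of-words features feeding a multinomial Naive Bayes classifier.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(train_texts, train_labels)

# Attribute a new machine stoppage to one of the two classes.
print(model.predict(["bearing failure caused unplanned stoppage"]))     # likely "failure"
print(model.predict(["planned service: replaced filter and greased"]))  # likely "preventive"
```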

Relevance: 100.00%

Abstract:

This work analyzes the problem of community structure in real-world networks based on the synchronization of nonidentical coupled chaotic Rössler oscillators, each characterized by a defined natural frequency and coupled according to a predefined network topology. The interaction scheme involves a uniformly increasing coupling force, simulating a society in which the association between the agents grows over time. To enhance the stability of the correlated states that could emerge from the synchronization process, we propose a parameterless mechanism that adapts the characteristic frequencies of the coupled oscillators according to a dynamic connectivity matrix deduced from correlated data. We show that the characteristic frequency vector that results from the adaptation mechanism reveals the underlying community structure present in the network.
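
A minimal sketch, for illustration only, of nonidentical Rössler oscillators diffusively coupled over a small two-community network; the parameters and the toy network are assumptions, and the paper's parameterless frequency-adaptation mechanism is not reproduced here:

```python
# Sketch: nonidentical Rossler oscillators diffusively coupled through a
# given adjacency matrix, integrated with a simple Euler scheme (numpy only).
# Parameters, the toy network, and the natural frequencies are illustrative.
import numpy as np

rng = np.random.default_rng(0)
N = 6
# Toy two-community network: two triangles joined by a single link.
A = np.zeros((N, N))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1.0

omega = 1.0 + 0.02 * rng.standard_normal(N)  # nonidentical natural frequencies
a, b, c = 0.2, 0.2, 5.7                      # classic Rossler parameters
sigma, dt, steps = 0.05, 0.01, 20000         # coupling strength, time step, steps

x = rng.standard_normal(N); y = rng.standard_normal(N); z = np.abs(rng.standard_normal(N))
xs = np.empty((steps, N))
for t in range(steps):
    # Diffusive coupling on the x variable: sum_j A_ij (x_j - x_i)
    coupling = A @ x - A.sum(axis=1) * x
    dx = -omega * y - z + sigma * coupling
    dy = omega * x + a * y
    dz = b + z * (x - c)
    x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
    xs[t] = x

# Pairwise correlations of the x time series; oscillators within the same
# densely connected triangle tend to correlate more strongly than across.
C = np.corrcoef(xs[steps // 2:].T)
print(np.round(C, 2))
```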

Relevance: 100.00%

Abstract:

Process guidance supports users in increasing their process model understanding, their process execution effectiveness and efficiency, and their process compliance performance. This paper presents research in progress encompassing our ongoing DSR project on Process Guidance Systems and a field evaluation of the resulting artifact in cooperation with a company. Building on three theory-grounded design principles, a Process Guidance System artifact for the company’s IT service ticketing process is developed, deployed and used. Following a multi-method approach, we plan to evaluate the artifact in a longitudinal field study. In doing so, we will gather not only self-reported data but also real usage data. This article describes the development of the artifact and discusses an innovative evaluation approach.

Relevance: 100.00%

Abstract:

The parallelization of real-world compute-intensive Fortran application codes is generally not a trivial task. If the time to complete the parallelization is to be significantly reduced, then an environment is needed that will assist the programmer in the various tasks of code parallelization. In this paper the authors present a code parallelization environment in which a number of tools addressing the main tasks, such as code parallelization, debugging and optimization, are available. The ParaWise and CAPO parallelization tools are discussed, which enable the near-automatic parallelization of real-world scientific application codes for shared- and distributed-memory parallel systems. As user involvement in the parallelization process can introduce errors, a relative debugging tool (P2d2) is also available and can be used to perform nearly automatic relative debugging of a program that has been parallelized using the tools. A high quality interprocedural dependence analysis as well as user-tool interaction are also highlighted, as they are vital to the generation of efficient parallel code and to the optimization of the backtracking and speculation process used in relative debugging. Results for benchmark and real-world application codes that have been parallelized are presented and show the benefits of using the environment.
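
As a language-agnostic illustration of the property that such dependence analysis establishes, the toy loop below (in Python rather than Fortran) has no loop-carried dependences, so its iterations can safely be executed in parallel:

```python
# Sketch of the idea that dependence analysis establishes: a loop whose
# iterations do not depend on one another can be run in parallel.
# (Illustrative Python; the environment described above targets Fortran codes.)
from multiprocessing import Pool

def body(i):
    # Each iteration reads only its own input and writes only its own output,
    # so there is no loop-carried dependence.
    return i * i + 3 * i

if __name__ == "__main__":
    n = 1000
    # Serial version of the loop.
    serial = [body(i) for i in range(n)]
    # Parallel version: safe because the iterations are independent.
    with Pool() as pool:
        parallel = pool.map(body, range(n))
    assert serial == parallel
```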

Relevance: 100.00%

Abstract:

Designing is a heterogeneous, fuzzily defined, floating field of various activities and chunks of ideas and knowledge. Available theories about the foundations of designing, as presented in "the basic PARADOX" (Jonas and Meyer-Veden 2004), have evoked the impression of Babylonian confusion. We located the reasons for this "mess" in the "non-fit", that is, the problematic relation between theories and their subject field. There seems to be an interface problem in theory-building comparable to that in designing itself. "Complexity" sounds promising, but turns out to be a problematic and not really helpful concept. I will argue instead for a more precise application of systemic and evolutionary concepts, which - in my view - are able to model the underlying generative structures and processes that produce the visible phenomenon of complexity. It does not make sense to introduce a new fashionable meta-concept and to hope for a panacea before having clarified the more basic and still equally problematic older meta-concepts. This paper will take one step away from "theories of what" towards practice and doing, and try to take a closer look at existing process models, or "theories of how" to design, instead. Doing this from a systemic perspective leads to an evolutionary view of the process, which finally allows us to specify more clearly the "knowledge gaps" inherent in the design process. This aspect has to be taken into account as constitutive of any attempt at theory-building in design, which can be characterized as a "practice of not-knowing". I conclude that comprehensive "unified" theories, methods, or process models run aground on the identified knowledge gaps, which allow neither reliable models of the present nor reliable projections into the future. Consolation may be found in performing a shift from the effort of adaptation towards strategies of exaptation, which means developing stocks of alternatives for coping with unpredictable situations in the future.

Relevance: 100.00%

Abstract:

Classical computer vision methods can only weakly emulate some of the multi-level parallelism in signal processing and information sharing that takes place in different parts of the primate visual system, which enables it to accomplish many diverse functions of visual perception. One of the main functions of primate vision is to detect and recognise objects in natural scenes despite all the linear and non-linear variations of the objects and their environment. The superior performance of the primate visual system compared with what machine vision systems have been able to achieve to date motivates scientists and researchers to further explore this area in pursuit of more efficient vision systems inspired by natural models. In this paper, building blocks for a hierarchical, efficient object recognition model are proposed. Incorporating attention-based processing would lead to a system that processes visual data in a non-linear way, focusing only on the regions of interest and hence reducing the processing time needed to achieve real-time performance. Further, it is suggested that the visual cortex model for recognising objects be modified by adding non-linearities in the ventral path, consistent with earlier discoveries reported by researchers in the neurophysiology of vision.
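
A minimal sketch of attention-style processing in this spirit: a crude gradient-based saliency map selects a few regions of interest, and only those windows are passed to a placeholder recogniser rather than the whole image (illustrative only, not the model proposed in the paper):

```python
# Sketch of attention-style processing: compute a crude saliency map
# (local gradient energy) and hand only the most salient windows to a
# placeholder recogniser, instead of processing the whole image.
# Purely illustrative; not the model proposed in the paper.
import numpy as np

def saliency(img):
    """Gradient-magnitude 'saliency' of a 2-D grayscale image."""
    gy, gx = np.gradient(img.astype(float))
    return np.hypot(gx, gy)

def top_windows(sal, win=16, k=3):
    """Return the top-k win x win windows ranked by summed saliency."""
    h, w = sal.shape
    scores = []
    for r in range(0, h - win + 1, win):
        for c in range(0, w - win + 1, win):
            scores.append((sal[r:r + win, c:c + win].sum(), r, c))
    scores.sort(reverse=True)
    return [(r, c) for _, r, c in scores[:k]]

def recognise(patch):
    """Placeholder for an object recogniser applied only to a region of interest."""
    return float(patch.mean())

rng = np.random.default_rng(1)
image = rng.random((128, 128))
image[40:56, 40:56] += 2.0          # a synthetic 'object' with strong edges

rois = top_windows(saliency(image))
results = [recognise(image[r:r + 16, c:c + 16]) for r, c in rois]
print(rois, [round(v, 2) for v in results])
```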

Relevance: 100.00%

Abstract:

Logistics involves planning, managing, and organizing the flows of goods from the point of origin to the point of destination in order to meet a set of requirements. Logistics and transportation are very important and represent a significant cost for producing and shipping companies, but also for public administrations and private citizens. The optimization of resources and the improvement of the organization of operations are crucial for all branches of logistics, from operations management to transportation. As we will see in this work, optimization techniques, models, and algorithms are important methods for solving the ever newer and more complex problems arising in different segments of logistics. Many operations management and transportation problems are related to the class of optimization problems called Vehicle Routing Problems (VRPs). In this work, we consider several real-world deterministic and stochastic problems that belong to the wide class of VRPs, and we solve them by means of exact and heuristic methods. We treat three classes of real-world routing and logistics problems. First, we deal with one of the most important tactical problems that arises in the management of bike sharing systems, the Bike Sharing Rebalancing Problem (BRP). Second, we propose models and algorithms for real-world earthwork optimization problems. Third, we describe the 3D printing (3DP) process and highlight several optimization issues in 3DP. Among those, we define the problem related to tool path definition in the 3DP process, the 3D Routing Problem (3DRP), which is a generalization of the arc routing problem. We present an ILP model and several heuristic algorithms to solve the 3DRP.
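
For illustration, a minimal sketch of a simple construction heuristic of the kind used for capacitated vehicle routing (nearest neighbour with a capacity check); the coordinates, demands, and capacity are invented, and this is not one of the algorithms developed in the thesis:

```python
# Sketch: nearest-neighbour construction heuristic for a capacitated VRP.
# Start at the depot, repeatedly visit the closest unserved customer that
# still fits in the vehicle, and return to the depot when none fits.
# Assumes every single demand fits within the vehicle capacity.
import math

depot = (0.0, 0.0)
customers = {                      # id -> ((x, y), demand)
    1: ((2.0, 1.0), 4), 2: ((3.0, 4.0), 3), 3: ((-2.0, 2.0), 5),
    4: ((-3.0, -1.0), 2), 5: ((1.0, -3.0), 6),
}
capacity = 10

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

unserved = set(customers)
routes = []
while unserved:
    load, pos, route = 0, depot, []
    while True:
        feasible = [c for c in unserved if load + customers[c][1] <= capacity]
        if not feasible:
            break
        nxt = min(feasible, key=lambda c: dist(pos, customers[c][0]))
        route.append(nxt)
        load += customers[nxt][1]
        pos = customers[nxt][0]
        unserved.remove(nxt)
    routes.append(route)

print(routes)   # one list of customer ids per vehicle route
```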

Relevance: 100.00%

Abstract:

A major application of computers has been the control of physical processes, in which the computer is embedded within some large physical process and is required to control concurrent physical processes. The main difficulty with these systems is their event-driven characteristics, which complicate their modelling and analysis. Although a number of researchers in the process systems community have approached the problems of modelling and analysing such systems, there is still a lack of standardised software development formalisms for system (controller) development, particularly at the early stages of the system design cycle. This research forms part of a larger research programme concerned with the development of real-time process-control systems in which software is used to control concurrent physical processes. The general objective of the research in this thesis is to investigate the use of formal techniques in the analysis of such systems at the early stages of their development, with a particular bias towards application to high speed machinery. Specifically, the research aims to generate a standardised software development formalism for real-time process-control systems, particularly for software controller synthesis. In this research, a graphical modelling formalism called Sequential Function Chart (SFC), a variant of Grafcet, is examined. SFC, which is defined in the international standard IEC 1131 as a graphical description language, has been used widely in industry and has achieved an acceptable level of maturity and acceptance. A comparative study between SFC and Petri nets is presented in this thesis. To overcome identified inaccuracies in SFC, a formal definition of the firing rules for SFC is given. To provide a framework in which SFC models can be analysed formally, an extended time-related Petri net model for SFC is proposed and the transformation method is defined. The SFC notation lacks a systematic way of synthesising system models from real-world systems. Thus a standardised approach to the development of real-time process-control systems is required, such that the system (software) functional requirements can be identified, captured and analysed. A rule-based approach and a method called the system behaviour driven method (SBDM) are proposed as a development formalism for real-time process-control systems.
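
A minimal sketch of the Petri net firing rule that underlies this kind of analysis: a transition is enabled when every input place holds a token, and firing it moves tokens from input to output places (illustrative only; the thesis defines a richer, time-extended model for SFC):

```python
# Minimal Petri net sketch (illustrative): places hold tokens, and a
# transition fires when every input place holds at least one token,
# consuming one token per input place and producing one per output place.
# SFC steps and transitions can be analysed through a mapping of this kind.

marking = {"step_idle": 1, "step_running": 0, "step_done": 0}

transitions = {            # name -> (input places, output places)
    "start":  (["step_idle"],    ["step_running"]),
    "finish": (["step_running"], ["step_done"]),
}

def enabled(t):
    ins, _ = transitions[t]
    return all(marking[p] >= 1 for p in ins)

def fire(t):
    assert enabled(t), f"transition {t} is not enabled"
    ins, outs = transitions[t]
    for p in ins:
        marking[p] -= 1
    for p in outs:
        marking[p] += 1

fire("start")
fire("finish")
print(marking)   # {'step_idle': 0, 'step_running': 0, 'step_done': 1}
```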