989 results for Design check
Abstract:
This paper discusses the issues with sharing information between different disciplines in collaborative projects. The focus is on the information itself rather than the wider issues of collaboration. A range of projects carried out by the Cooperative Research Centre for Construction Innovation (CRC CI) in Australia is used to illustrate the issues.
Abstract:
The complex design process of an airport terminal needs to support a wide range of changes in operational facilities for both routine and unusual/emergency events. A process model describes how the activities within a process are connected and states the logical information flow between them. The traditional design process overlooks the necessity of information flow from the process model to the actual building design, which needs to be considered an integral part of building design. The current research introduces a generic method for extracting design-related information from the process model and incorporating it into the design process. Appropriate integration of the process model prior to the design process uncovers relationships between spaces and their relevant functions that could be missed in the traditional design approach. The current paper examines the available Business Process Model (BPM) and generates a modified Business Process Model (mBPM) of the check-in facilities of Brisbane International Airport. The information adopted from the mBPM is then transformed into a possible physical layout using graph theory.
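As a rough illustration of the graph-theoretic step mentioned above (not the paper's actual method), the flow links of a process model can be encoded as a weighted adjacency graph whose force-directed layout suggests which spaces should sit close together. The space names and flow weights below are hypothetical.

```python
# Illustrative sketch only: derives a rough spatial arrangement from
# process-model adjacencies using a force-directed graph layout.
# Space names and flow weights are hypothetical, not taken from the paper.
import networkx as nx

# Nodes are functional spaces; edge weights approximate how strongly the
# process model links two spaces (e.g., passenger/information flow).
flows = [
    ("entrance", "queue", 5),
    ("queue", "check-in desk", 5),
    ("check-in desk", "bag drop", 3),
    ("check-in desk", "security", 4),
    ("bag drop", "security", 2),
]

G = nx.Graph()
G.add_weighted_edges_from(flows)

# A force-directed layout pulls strongly connected spaces together,
# giving a first-pass relative arrangement that a designer would refine.
positions = nx.spring_layout(G, weight="weight", seed=1)
for space, (x, y) in positions.items():
    print(f"{space:>14s}: ({x:+.2f}, {y:+.2f})")
```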
Abstract:
Australian higher education institutions (HEIs) have entered a new phase of regulation and accreditation that includes performance-based funding relating to the participation and retention of students from social and cultural groups previously underrepresented in higher education. However, in addressing these priorities, it is critical that HEIs do not further disadvantage students from certain groups by identifying them for attention because of their social or cultural backgrounds, circumstances which are largely beyond the control of students. In response, many HEIs are focusing effort on university-wide approaches to enhancing the student experience, because such approaches will enhance the engagement, success and retention of all students and, in doing so, particularly benefit those students who come from underrepresented groups. Measuring and benchmarking the student experiences and engagement that arise from these efforts is well supported by extensive collections of student experience survey data. However, no comparable instrument exists that measures the capability of institutions to influence and/or enhance student experiences, where capability is an indication of how well an organisational process does what it is designed to do (Rosemann & de Bruin, 2005). We have proposed that the concept of a maturity model (Marshall, 2010; Paulk, 1999) may be useful as a way of assessing the capability of HEIs to provide and implement student engagement, success and retention activities, and we are currently articulating a Student Engagement, Success and Retention Maturity Model (SESR-MM) (Clarke, Nelson & Stoodley, 2012; Nelson, Clarke & Stoodley, 2012). Our research aims to address this gap by developing an SESR-MM instrument that aims (i) to enable institutions to assess the capability of their current student engagement and retention programs and strategies to influence and respond to student experiences within the institution; and (ii) to provide institutions with the opportunity to understand various practices across the sector with a view to further improving programs and practices relevant to their context. Our research extends the generational approach, which has been useful in considering the evolutionary nature of the first year experience (FYE) (Wilson, 2009). Three generations have been identified and explored: First generation approaches that focus on co-curricular strategies (e.g. orientation and peer programs); Second generation approaches that focus on curriculum (e.g. pedagogy, curriculum design, and learning and teaching practice); and Third generation approaches, also referred to as transition pedagogy, that focus on the production of an intentional, institution-wide, holistic blend of curricular and co-curricular activities (Kift, Nelson & Clarke, 2010). Our research also moves beyond assessments of students' experiences to focus on assessing institutional processes and their capability to influence student engagement. In essence, we propose to develop and use the maturity model concept to produce an instrument that will indicate the capability of HEIs to manage and improve student engagement, success and retention programs and strategies.
The issues explored in this workshop are (i) whether the maturity model concept can be usefully applied to provide a measure of institutional capability for SESR; (ii) whether the SESR-MM can be used to assess the maturity of a particular set of institutional practices; and (iii) whether a collective assessment of an institution's SESR capabilities can provide an indication of the maturity of the institution's SESR activities. The workshop will be approached in three stages. Firstly, participants will be introduced to the key characteristics of maturity models, followed by a discussion of the SESR-MM and the processes involved in its development. Secondly, participants will be provided with resources to facilitate the development of a maturity model and an assessment instrument for a range of institutional processes and related practices. In the final stage of the workshop, participants will "assess" the capability of these practices to provide a collective assessment of the maturity of these processes.
References
Australian Council for Educational Research. (n.d.). Australasian Survey of Student Engagement. Retrieved from http://www.acer.edu.au/research/ausse/background
Clarke, J., Nelson, K., & Stoodley, I. (2012, July). The Maturity Model concept as framework for assessing the capability of higher education institutions to address student engagement, success and retention: New horizon or false dawn? A Nuts & Bolts presentation at the 15th International Conference on the First Year in Higher Education, "New Horizons," Brisbane, Australia.
Department of Education, Employment and Workplace Relations. (n.d.). The University Experience Survey. Advancing quality in higher education information sheet. Retrieved from http://www.deewr.gov.au/HigherEducation/Policy/Documents/University_Experience_Survey.pdf
Kift, S., Nelson, K., & Clarke, J. (2010). Transition pedagogy - a third generation approach to FYE: A case study of policy and practice for the higher education sector. The International Journal of the First Year in Higher Education, 1(1), 1-20.
Marshall, S. (2010). A quality framework for continuous improvement of e-Learning: The e-Learning Maturity Model. Journal of Distance Education, 24(1), 143-166.
Nelson, K., Clarke, J., & Stoodley, I. (2012). An exploration of the Maturity Model concept as a vehicle for higher education institutions to assess their capability to address student engagement. A work in progress. Submitted for publication.
Paulk, M. (1999). Using the Software CMM with good judgment. ASQ Software Quality Professional, 1(3), 19-29.
Wilson, K. (2009, June-July). The impact of institutional, programmatic and personal interventions on an effective and sustainable first-year student experience. Keynote address presented at the 12th Pacific Rim First Year in Higher Education Conference, "Preparing for Tomorrow Today: The First Year as Foundation," Townsville, Australia. Retrieved from http://www.fyhe.com.au/past_papers/papers09/ppts/Keithia_Wilson_paper.pdf
Abstract:
The main objective of this paper is to describe the development of a remote sensing airborne air sampling system for Unmanned Aerial Systems (UAS), providing the capability to detect particle and gas concentrations in real time over remote locations. The design of the air sampling methodology started by defining the system architecture and then by selecting and integrating each subsystem. A multifunctional air sampling instrument, with the capability for simultaneous measurement of particle and gas concentrations, was modified and integrated with ARCAA's Flamingo UAS platform and communications protocols. As a result of the integration process, a system capable of both real-time geo-location monitoring and indexed-link sampling was obtained. Wind tunnel tests were conducted to evaluate the performance of the air sampling instrument under controlled non-stationary conditions at the typical operational velocities of the UAS platform. Once the fully operational remote air sampling system was obtained, the problem of mission design was analysed through the simulation of different scenarios. Flight tests of the complete air sampling system were then conducted to check the dynamic characteristics of the UAS with the air sampling system installed and to prove its capability to perform an air sampling mission following a specific flight path.
Abstract:
The traditional structural design procedure, especially for large-scale and complex structures, is time-consuming and inefficient. This is due primarily to the fact that the traditional design accounts for second-order effects indirectly, through design specifications applied member by member rather than through an analysis of the structure as a whole. Consequently, complicated and tedious design procedures are needed to capture the second-order effects at the member level. These effects are generally twofold: 1) flexural buckling due to the P-δ effect, addressed via the effective length; and 2) the sway effect due to the P-Δ effect, addressed via the magnification factor. In this study, a new system design concept based on second-order elastic analysis is presented, in which the second-order effects are taken into account directly in the system analysis, so that the tedious member-by-member stability check is avoided. Plastic design based on this integrated direct approach is not considered in this paper, for simplicity and clarity, as the emphasis is placed solely on the difference between second-order elastic limit-state design and the present system design approach. A practical design example, a 57 m span dome steel skylight structure, is used to demonstrate the efficiency and effectiveness of the proposed approach. For comparison, the skylight structure is also designed using the traditional approach of BS5950-2000, with emphasis on the aforementioned P-δ and P-Δ effects.
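For reference, the member-level quantities the abstract refers to are commonly approximated in design codes by amplification factors of the following textbook form; these are generic expressions given here only to clarify the terminology, not formulas taken from the paper.

```latex
% Common textbook approximations (assumed, not from the paper):
% P-delta amplification of a first-order moment M_1 under axial load P with
% elastic critical load P_cr, and a sway (P-Delta) amplification expressed
% through the elastic critical load factor lambda_cr.
\[
  M_{P\text{-}\delta} \approx \frac{M_1}{1 - P/P_{\mathrm{cr}}},
  \qquad
  k_{\mathrm{amp}} = \frac{\lambda_{\mathrm{cr}}}{\lambda_{\mathrm{cr}} - 1},
  \quad \lambda_{\mathrm{cr}} = \frac{P_{\mathrm{cr}}}{P}.
\]
```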
Abstract:
Aims/Objectives: Our study aims to test the capacity of a newly developed smartphone innovation to obtain data on the social, structural, and spatial determinants of the daily health-related behaviours of women living in urban Brisbane neighbourhoods who have survived endometrial cancer. Methods: The women used a mobile web app designed specifically for the project to record GIS/location data on every destination they visited within their local urban neighbourhoods over a two-week period. Additionally, we gathered textual data on the social context of and reasons for travel, as well as the mode of transport used to reach these destinations. The data were exported to SPSS and Google Earth for statistical and spatial analysis. We then met with the women to discuss lifestyle interventions to maximise their use of their local neighbourhoods in ways that could increase their physical activity levels and improve their overall health and well-being. These interventions will be evaluated and, if effective, translated into a large-scale national study. Results: Initial findings about patterns in the group's use of the local urban environment will be displayed, including daily distances travelled, types of locations visited, walking levels, use of public transport, use of green spaces and use of health-related resources. Any socio-demographic differences found between the women will be reported. Qualitative, quantitative, and spatial/mapping data will be displayed. Conclusion: The benefits and limitations of the mobile website designed to collect a range of data types about human-neighbourhood interactions, with implications for intervention design, will be discussed.
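A minimal sketch of how geo-tagged visit records of this kind could be written out for inspection in Google Earth; the field names and sample destinations are hypothetical, and this is not the study's actual export pipeline.

```python
# Illustrative only: writes hypothetical geo-tagged destination records to a
# KML file so they can be opened in Google Earth.
import xml.sax.saxutils as su

visits = [
    {"name": "local park", "lon": 153.02, "lat": -27.47, "mode": "walking"},
    {"name": "supermarket", "lon": 153.03, "lat": -27.46, "mode": "bus"},
]

placemarks = []
for v in visits:
    placemarks.append(
        "  <Placemark>\n"
        f"    <name>{su.escape(v['name'])}</name>\n"
        f"    <description>mode: {su.escape(v['mode'])}</description>\n"
        f"    <Point><coordinates>{v['lon']},{v['lat']},0</coordinates></Point>\n"
        "  </Placemark>"
    )

kml = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<kml xmlns="http://www.opengis.net/kml/2.2">\n<Document>\n'
    + "\n".join(placemarks)
    + "\n</Document>\n</kml>\n"
)

with open("visits.kml", "w", encoding="utf-8") as f:
    f.write(kml)
```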
Abstract:
In this letter, we characterize the extrinsic information transfer (EXIT) behavior of a factor graph based message passing algorithm for detection in large multiple-input multiple-output (MIMO) systems with tens to hundreds of antennas. The EXIT curves of a joint detection-decoding receiver are obtained for low density parity check (LDPC) codes of given degree distributions. From the obtained EXIT curves, an optimization of the LDPC code degree profiles is carried out to design irregular LDPC codes matched to the large-MIMO channel and joint message passing receiver. With low complexity joint detection-decoding, these codes are shown to perform better than off-the-shelf irregular codes in the literature by about 1 to 1.5 dB at a coded BER of 10^-5 in 16 x 16, 64 x 64 and 256 x 256 MIMO systems.
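One ingredient of the degree-profile optimization mentioned above is the design rate implied by a candidate pair of degree distributions. The short sketch below computes that standard quantity; the example profiles are purely illustrative, not the optimized profiles reported in the letter.

```python
# Small sketch of a standard constraint used when optimizing irregular LDPC
# degree profiles: the design rate implied by edge-perspective degree
# distributions lambda(x) and rho(x). Example profiles are hypothetical.

def design_rate(lam, rho):
    """lam, rho: dicts mapping degree -> fraction of edges attached to
    variable/check nodes of that degree.
    Returns R = 1 - (sum_j rho_j / j) / (sum_i lam_i / i)."""
    int_lam = sum(frac / deg for deg, frac in lam.items())
    int_rho = sum(frac / deg for deg, frac in rho.items())
    return 1.0 - int_rho / int_lam

# Hypothetical irregular profile (each set of fractions sums to 1).
lam = {2: 0.3, 3: 0.4, 8: 0.3}
rho = {6: 0.5, 7: 0.5}
print(f"design rate ~ {design_rate(lam, rho):.3f}")
```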
Abstract:
The objective of this paper is to empirically evaluate a framework for designing – GEMS of SAPPhIRE as req-sol – to check whether it supports design for variety and novelty. A set of observational studies is designed in which three teams of two designers each solve three different design problems in the following order: without any support, using the framework, and using a combination of the framework and a catalogue. Results from the studies reveal that both the variety and the novelty of the concept space increase with the use of the framework, and of the framework together with the catalogue. However, the number of concepts and the time taken by the designers decrease with the use of the framework, and of the framework together with the catalogue. Based on the results and the interview sessions with the designers, an interactive, computer-supported framework for designing is proposed as future work.
Abstract:
The design and implementation of a programmable cyclic redundancy check (CRC) computation circuit architecture, suitable for deployment in network-related systems-on-chip (SoCs), is presented. The architecture has been designed to be field reprogrammable so that it is fully flexible in terms of the polynomial deployed and the input port width. The circuit includes an embedded configuration controller that has a low reconfiguration time and hardware cost. The circuit has been synthesised and mapped to 130-nm UMC standard cell [application-specific integrated circuit (ASIC)] technology and is capable of supporting line speeds of 5 Gb/s. © 2006 IEEE.
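A bit-serial software model with a runtime-configurable polynomial and width, of the kind often used as a golden reference when verifying programmable CRC hardware; this is a generic sketch, not the paper's RTL or verification environment.

```python
# Generic software reference model (not the paper's design): bit-serial CRC
# with a configurable polynomial and width, MSB-first, no reflection.

def crc(data: bytes, width: int, poly: int, init: int = 0) -> int:
    """Bit-serial CRC over `data`. `poly` excludes the implicit top bit."""
    top = 1 << (width - 1)
    mask = (1 << width) - 1
    reg = init & mask
    for byte in data:
        for bit in range(7, -1, -1):          # feed bits MSB first
            feedback = (reg & top) >> (width - 1)
            in_bit = (byte >> bit) & 1
            reg = (reg << 1) & mask
            if feedback ^ in_bit:
                reg ^= poly
    return reg

# Quick self-check with CRC-16/CCITT-FALSE parameters (poly 0x1021,
# init 0xFFFF); the standard check value for "123456789" is 0x29b1.
print(hex(crc(b"123456789", width=16, poly=0x1021, init=0xFFFF)))
```

Swapping the `width`, `poly` and `init` arguments exercises the same flexibility the programmable hardware provides, which is what makes such a model convenient for cross-checking against known test vectors.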
Abstract:
Hardware designers and engineers typically need to explore a multi-parametric design space in order to find the best configuration for their designs, using simulations that can take weeks to months to complete. For example, designers of special-purpose chips need to explore parameters such as the optimal bitwidth and data representation. This is the case for the development of complex algorithms such as the Low-Density Parity-Check (LDPC) decoders used in modern communication systems. Currently, high-performance computing offers a wide set of acceleration options that range from multicore CPUs to graphics processing units (GPUs) and FPGAs. Depending on the simulation requirements, the ideal architecture to use can vary. In this paper we propose a new design flow based on OpenCL, a unified multiplatform programming model, which accelerates LDPC decoding simulations, thereby significantly reducing architectural exploration and design time. OpenCL-based parallel kernels are used without modifications or code tuning on multicore CPUs, GPUs and FPGAs. We use SOpenCL (Silicon to OpenCL), a tool that automatically converts OpenCL kernels to RTL, for mapping the simulations onto FPGAs. To the best of our knowledge, this is the first time that a single, unmodified OpenCL code has been used to target these three different platforms. We show that, depending on the design parameters to be explored in the simulation and on the dimension and phase of the design, the GPU or the FPGA may suit different purposes more conveniently, providing different acceleration factors. For example, although simulations can typically execute more than 3x faster on FPGAs than on GPUs, the overhead of circuit synthesis often outweighs the benefits of FPGA-accelerated execution.
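To give a flavour of the bitwidth exploration mentioned above, the toy sketch below sweeps candidate fixed-point formats for decoder LLRs and reports the resulting quantization error. It is only a stand-in for, and far simpler than, the Monte Carlo decoding simulations the paper accelerates.

```python
# Toy illustration of a bitwidth/data-representation sweep (not the paper's
# simulator): quantize stand-in LLR values to candidate fixed-point formats
# and measure the quantization error as one input to the bitwidth choice.
import numpy as np

rng = np.random.default_rng(0)
llrs = rng.normal(0.0, 4.0, size=100_000)   # synthetic stand-in for channel LLRs

def quantize(x, total_bits, frac_bits):
    """Symmetric saturating fixed-point quantizer Q(total_bits, frac_bits)."""
    step = 2.0 ** -frac_bits
    max_val = (2 ** (total_bits - 1) - 1) * step
    return np.clip(np.round(x / step) * step, -max_val, max_val)

for total_bits, frac_bits in [(4, 1), (5, 2), (6, 2), (8, 3)]:
    err = llrs - quantize(llrs, total_bits, frac_bits)
    rms = np.sqrt(np.mean(err ** 2))
    print(f"Q{total_bits}.{frac_bits}: rms quantization error = {rms:.4f}")
```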
Abstract:
In this paper the tracking system used to perform a scaled vehicle-barrier crash test is reported. The scaled crash test was performed as part of a wider project aimed at designing a new safety barrier made of natural building materials. The scaled crash test was designed and performed as a proof of concept of the new mass-based safety barriers, and the study was composed of two parts: the scaling technique and a series of scaled crash tests. The scaling method was used 1) to set the scaled test impact velocity so that the energy dissipation and the momentum transferred from the car to the barrier could be reproduced, and 2) to predict the acceleration, velocity and displacement values occurring in the full-scale impact from the results obtained in a scaled test. To achieve this goal, the vehicle and barrier displacements had to be recorded together with the vehicle accelerations and angular velocities. These quantities were measured during the tests using acceleration sensors and a tracking system. The tracking system was composed of a high-speed camera and a set of targets used to measure the vehicle's linear and angular velocities. A code was developed to extract the target velocities from the videos, and these velocities were then compared with those obtained by integrating the accelerations provided by the sensors, in order to check the reliability of the method.
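The consistency check described at the end of the abstract can be illustrated with a short script that derives velocity both ways, from tracked positions and from integrated accelerations, and compares them. The frame rate and signals below are synthetic stand-ins for the real camera and sensor records.

```python
# Sketch of the camera-vs-accelerometer consistency check, using synthetic
# data in place of the real records (frame rate and signals are assumed).
import numpy as np

fps = 1000.0                                     # assumed high-speed camera frame rate
t = np.arange(0.0, 0.2, 1.0 / fps)

# Synthetic ground truth: vehicle decelerating during the impact.
a_meas = -150.0 * np.exp(-t / 0.05)              # "measured" acceleration, m/s^2
v_true = 8.0 + np.cumsum(a_meas) / fps           # m/s
x_track = np.cumsum(v_true) / fps                # tracked target position, m

# Camera-based velocity: finite differences of tracked target positions.
v_camera = np.gradient(x_track, 1.0 / fps)

# Sensor-based velocity: numerical integration of the accelerations.
v_sensor = 8.0 + np.cumsum(a_meas) / fps

print(f"max |v_camera - v_sensor| = {np.max(np.abs(v_camera - v_sensor)):.3f} m/s")
```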
Abstract:
The design cycle for complex special-purpose computing systems is extremely costly and time-consuming. It involves a multiparametric design-space exploration for optimization, followed by design verification. Designers of special-purpose VLSI implementations often need to explore parameters, such as the optimal bitwidth and data representation, through time-consuming Monte Carlo simulations. A prominent example of this simulation-based exploration process is the design of decoders for error-correcting systems, such as the Low-Density Parity-Check (LDPC) codes adopted by modern communication standards, which involves thousands of Monte Carlo runs for each design point. Currently, high-performance computing offers a wide set of acceleration options that range from multicore CPUs to Graphics Processing Units (GPUs) and Field Programmable Gate Arrays (FPGAs). The exploitation of diverse target architectures is typically associated with developing multiple code versions, often using distinct programming paradigms. In this context, we evaluate the concept of retargeting a single OpenCL program to multiple platforms, thereby significantly reducing design time. A single OpenCL-based parallel kernel is used without modifications or code tuning on multicore CPUs, GPUs, and FPGAs. We use SOpenCL (Silicon to OpenCL), a tool that automatically converts OpenCL kernels to RTL, in order to introduce FPGAs as a potential platform for efficiently executing simulations coded in OpenCL. We use LDPC decoding simulations as a case study. Experimental results were obtained by testing a variety of regular and irregular LDPC codes, ranging from short/medium-length (e.g., 8,000-bit) codes to long (e.g., 64,800-bit) DVB-S2 codes. We observe that, depending on the design parameters to be simulated and on the dimension and phase of the design, the GPU or the FPGA may suit different purposes more conveniently, thus providing different acceleration factors over conventional multicore CPUs.