950 results for Mechanical systems
Abstract:
Purpose: In the present work we describe our (in progress) spectroscopic study of zinc and iron phosphates under external high pressure, aimed at determining whether the zinc ion changes coordination from a tetrahedral to an octahedral (six-coordinate) structure.----- Design/methodology/approach: The standard equipment is the optical high-pressure diamond anvil cell (DAC). The DAC is assembled and vibrational or electronic spectra are then collected by mounting the cell in an infrared, Raman, EXAFS or UV-visible spectrometer.----- Findings: The mechanism by which zinc and iron metaphosphate material is transformed into glassy metaphosphate enhances the mechanical properties of the tribofilm. Two decades of intensive study demonstrate that Zn(II) and Fe(III) ions participate in cross-linking the network under friction, hardening the phosphate.----- Research limitations/implications: Transition metal atoms with d orbitals have flexible coordination numbers; for example, zinc acts as a cross-linking agent that increases hardness by changing coordination from tetrahedral to octahedral. The external pressure may cause the [Zn–(O-P-)4] complex to transform into a [Zn–(O-P-)6] grouping.----- Originality/value: This paper analyses high-pressure spectroscopy as applied to the investigation of 3d transition metal ions in solids. When studying pressure effects on the structure of coordination compounds, we can expect changes in the ground electronic state (spin crossovers), changes in electronic spectra due to structural distortions (piezochromism), and changes in the ligand field causing shifts in the electronic transitions.
Abstract:
Building integrated living systems (BILS), such as green roofs and living walls, could mitigate many of the challenges presented by climate change and biodiversity protection. However, few if any such systems have been constructed, and current tools for evaluating them are limited, especially under Australian subtropical conditions. BILS are difficult to assess because living systems interact with complex, changing and site-specific social and environmental conditions. Our past research in design for eco-services has confirmed the need for better means of assessing the ecological values of BILS, as well as better models for assessing their thermal and hydrological performance. To address this problem, a research project is being developed jointly by researchers at Central Queensland University (CQ University) and the Queensland University of Technology (QUT), along with industry collaborators. A mathematical model under development at CQ University will be applied and tested to determine its potential for predicting the complex, dynamic behaviour of BILS in different contexts. However, this paper focuses on the work at QUT. The QUT School of Design is generating designs for living walls and roofs that provide a range of ecosystem goods and services, or ‘eco-services’, for a variety of micro-climates and functional contexts. The research at QUT aims to develop appropriate designs, virtual prototypes and quantitative methods for assessing the potential multiple benefits of BILS in subtropical climates. It is anticipated that the CQ University model for predicting the thermal behaviour of living systems will provide a platform for the integration of ecological criteria and indicators. QUT will also explore means to predict and measure the value of the eco-services provided by these systems, which is still largely uncharted territory.
This research is ultimately intended to facilitate the eco-retrofitting of cities to increase natural capital and urban resource security - an essential component of sustainability. The talk will present the latest range of multifunctional, eco-productive living walls, roofs and urban space frames and their eco-services.
Abstract:
An emergent form of political economy, facilitated by information and communication technologies (ICTs), is widely propagated as the apotheosis of unmitigated social, economic, and technological progress. Meanwhile, throughout the world, social degradation and economic inequality are increasing logarithmically. Valued categories of thought are, axiomatically, the basic commodities of the “knowledge economy”. Language is its means of exchange. This paper proposes a sociolinguistic method with which to critically engage the hyperbole of the “Information Age”. The method is grounded in a systemic social theory that synthesises aspects of autopoiesis and Marxist political economy. A trade policy statement is analysed to exemplify the sociolinguistically created aberrations that are today most often construed as social and political determinants.
Abstract:
In this chapter I propose a theoretical framework for understanding the role of mediation processes in the inculcation, maintenance, and change of evaluative meaning systems, or axiologies, and how such a perspective can provide a useful and complementary dimension to analysis for SFL and CDA. I argue that an understanding of mediation—the movement of meaning across time and space—is essential for the analysis of meaning. Using two related texts as examples, I show how an understanding of mediation can aid SFL and CDA practitioners in the analysis of social change.
Abstract:
Context: The School of Information Technology at QUT has recently undertaken a major restructuring of its Bachelor of Information Technology (BIT) course. The aims of this restructuring include reducing first-year attrition and providing an attractive degree course that meets both student and industry expectations. Emphasis has been placed on the first semester, in the context of retaining students, by introducing a set of four units that complement one another and provide introductory material on technology, programming and related skills, and generic skills that will aid the students throughout their undergraduate course and in their careers. This discussion relates to one of these four first-semester units, namely Building IT Systems. The aim of this unit is to create small Information Technology (IT) systems that use programming or scripting and databases, as either standalone applications or web applications. In the prior history of teaching introductory computer programming at QUT, programming has been taught as a standalone subject, and integration of computer applications with other systems such as databases and networks was not undertaken until students had been given a thorough grounding in those topics as well. Feedback has indicated that students do not believe that working with a database requires programming skills. In fact, the teaching of the building blocks of computer applications has been compartmentalized, with each taught in isolation from the others. The teaching of introductory computer programming has been an industry requirement of IT degree courses, as many jobs require at least some knowledge of the topic. Yet computer programming is not a skill that all students are equally capable of learning (Bruce et al., 2004), and this is clearly shown by the volume of publications dedicated to this topic in the literature over a broad period of time (Eckerdal & Berglund, 2005; Mayer, 1981; Winslow, 1996).
The teaching of this introductory material has followed much the same approach over the past thirty years. During the period in which introductory computer programming courses have been taught at QUT, a number of different programming languages and programming paradigms have been used, and different approaches to teaching and learning have been attempted in an effort to find the golden thread that would allow students to learn this complex topic. Unfortunately, computer programming is not a skill that can be learnt in one semester. Some basics can be learnt, but the topic can take many years to master (Norvig, 2001). Faculty data has typically shown a bimodal distribution of results for students undertaking introductory programming courses, with a high proportion of students receiving a high mark and a high proportion receiving a low or failing mark. This indicates that there are students who understand and excel with the introductory material, while there is another group who struggle to understand the concepts and practices required to translate a specification or problem statement into a computer program that achieves what is being requested. The consequence of a large group of students failing the introductory programming course has been a high level of attrition amongst first-year students. This attrition level does not provide good continuity in student numbers in later years of the degree program, and the current approach is not seen as sustainable.
Abstract:
Structural health monitoring (SHM) is the term applied to the procedure of monitoring a structure’s performance, assessing its condition and carrying out appropriate retrofitting so that it performs reliably, safely and efficiently. Bridges form an important part of a nation’s infrastructure. They deteriorate due to age and changing load patterns, and hence early detection of damage helps in prolonging their lives and preventing catastrophic failures. Monitoring of bridges has traditionally been done by means of visual inspection. With recent developments in sensor technology and the availability of advanced computing resources, newer techniques have emerged for SHM. Acoustic emission (AE) is one such technology that is attracting the attention of engineers and researchers around the world. This paper discusses the use of AE technology in health monitoring of bridge structures, with a special focus on analysis of recorded data. AE waves are stress waves generated by mechanical deformation of material and can be recorded by means of sensors attached to the surface of the structure. Analysis of the AE signals provides vital information regarding the nature of the source of emission. Signal processing of the AE waveform data can be carried out in several ways and is predominantly based on the time and frequency domains. The short-time Fourier transform and wavelet analysis have proved to be superior alternatives to traditional frequency-based analysis in extracting information from recorded waveforms. Some of the preliminary results of the application of these analysis tools in signal processing of recorded AE data will be presented in this paper.
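The short-time Fourier transform approach described above can be sketched on a synthetic AE-like burst. The sampling rate, burst frequency, decay constant and window length below are illustrative assumptions, not values from the study:

```python
import numpy as np
from scipy.signal import stft

# Synthetic AE-like event: a decaying 150 kHz tone buried in noise.
fs = 1_000_000                      # 1 MHz sampling rate (assumed)
t = np.arange(0, 0.002, 1 / fs)     # 2 ms record
burst = np.exp(-t / 0.0002) * np.sin(2 * np.pi * 150e3 * t)
signal = burst + 0.05 * np.random.default_rng(0).normal(size=t.size)

# STFT: short windows localise the burst in time while still resolving
# its dominant frequency, unlike a single whole-record Fourier transform.
f, tau, Z = stft(signal, fs=fs, nperseg=256)

# Find the loudest frame and its dominant frequency; for this synthetic
# source it should sit near the 150 kHz tone.
frame = np.argmax(np.abs(Z).max(axis=0))
peak_freq = f[np.argmax(np.abs(Z[:, frame]))]
print(f"dominant frequency of loudest frame: {peak_freq / 1e3:.0f} kHz")
```

The same time-frequency map would also show the burst's arrival time, which a plain frequency-domain analysis discards.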
Abstract:
Crash risk is the statistical probability of a crash. Its assessment can be performed through ex post statistical analysis or in real time with on-vehicle systems. These systems can be cooperative. Cooperative Vehicle-Infrastructure Systems (CVIS) are a developing research avenue in the automotive industry worldwide. This paper provides a survey of existing CVIS and of methods for assessing crash risk with them. It describes the advantages of cooperative systems over non-cooperative systems. A sample of cooperative crash risk assessment systems is analysed to extract vulnerabilities according to three criteria: market penetration, over-reliance on GPS, and broadcasting issues. The survey shows that cooperative risk assessment systems are still in their infancy and require further development to provide their full benefits to road users.
Abstract:
Heart disease is attributed as the highest cause of death in the world. Although this could be alleviated by heart transplantation, there is a chronic shortage of donor hearts and so mechanical solutions are being considered. Currently, many Ventricular Assist Devices (VADs) are being developed worldwide in an effort to increase life expectancy and quality of life for end stage heart failure patients. Current pre-clinical testing methods for VADs involve laboratory testing using Mock Circulation Loops (MCLs), and in vivo testing in animal models. The research and development of highly accurate MCLs is vital to the continuous improvement of VAD performance. The first objective of this study was to develop and validate a mathematical model of a MCL. This model could then be used in the design and construction of a variable compliance chamber to improve the performance of an existing MCL as well as form the basis for a new miniaturised MCL. An extensive review of literature was carried out on MCLs and mathematical modelling of their function. A mathematical model of a MCL was then created in the MATLAB/SIMULINK environment. This model included variable features such as resistance, fluid inertia and volumes (resulting from the pipe lengths and diameters); compliance of Windkessel chambers, atria and ventricles; density of both fluid and compressed air applied to the system; gravitational effects on vertical columns of fluid; and accurately modelled actuators controlling the ventricle contraction. This model was then validated using the physical properties and pressure and flow traces produced from a previously developed MCL. A variable compliance chamber was designed to reproduce parameters determined by the mathematical model. The function of the variability was achieved by controlling the transmural pressure across a diaphragm to alter the compliance of the system. 
An initial prototype was tested in a previously developed MCL, and a variable level of arterial compliance was successfully produced; however, the complete range of compliance values required for accurate physiological representation could not be achieved with this initial design. The mathematical model was then used to design a smaller physical mock circulation loop, with the tubing sizes adjusted to produce accurate pressure and flow traces while maintaining an appropriate frequency response characteristic. The development of the mathematical model greatly assisted the general design of an in vitro cardiovascular device test rig, while the variable compliance chamber allowed simple, real-time manipulation of MCL compliance and accurate transitions between a variety of physiological conditions. The newly developed MCL provided an accurate mechanical representation of the human circulatory system for in vitro cardiovascular device testing and education purposes. The continued improvement of VAD test rigs is essential if VAD design is to improve, and hence improve quality of life and life expectancy for heart failure patients.
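The kind of lumped-parameter modelling described above, where resistances, compliances and flows are coupled through ordinary differential equations, can be illustrated with a deliberately minimal two-element Windkessel sketch. The thesis model was built in MATLAB/SIMULINK and is far richer; the parameter values and inflow waveform here are illustrative assumptions only:

```python
# Two-element Windkessel: arterial compliance C in parallel with
# peripheral resistance R, driven by a pulsatile ventricular inflow.
import math

R = 1.0      # peripheral resistance, mmHg*s/mL (illustrative)
C = 1.5      # arterial compliance, mL/mmHg (illustrative)
HR = 60      # heart rate, beats per minute
dt = 1e-3    # integration step, s

def inflow(t):
    """Half-sine ejection during systole (first third of each beat)."""
    period = 60.0 / HR
    phase = t % period
    systole = period / 3.0
    return 400.0 * math.sin(math.pi * phase / systole) if phase < systole else 0.0

# Forward-Euler integration of C * dP/dt = Q_in(t) - P/R.
P = 80.0     # initial arterial pressure, mmHg
trace = []
for step in range(int(10 / dt)):   # simulate 10 s so transients die out
    t = step * dt
    P += dt * (inflow(t) - P / R) / C
    trace.append(P)

last_beat = trace[-1000:]          # final 1 s = one beat at 60 bpm
print(f"systolic {max(last_beat):.0f} mmHg, diastolic {min(last_beat):.0f} mmHg")
```

Making C a controllable function of diaphragm transmural pressure, rather than a constant, is the essence of the variable compliance chamber the abstract describes.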
Abstract:
We propose an efficient, low-complexity scheme for estimating and compensating clipping noise in OFDMA systems. Conventional clipping noise estimation schemes, which need all demodulated data symbols, may become infeasible in OFDMA systems, where a specific user may know only its own modulation scheme. The proposed scheme first uses the equalized output to identify a limited number of candidate clips, and then exploits the information on known subcarriers to reconstruct the clipped signal. Simulation results show that the proposed scheme can significantly improve system performance.
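The two-step idea, identifying candidate clip positions and then using known subcarriers to reconstruct the clipping noise, can be sketched as follows. The pilot pattern, clipping level and noiseless channel are simplifying assumptions for illustration, not details of the proposed scheme:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 256                        # subcarriers (illustrative)
pilots = np.arange(0, N, 4)    # subcarriers assumed known at the receiver

# QPSK data; x is the time-domain OFDM symbol before the power amplifier.
X = (rng.choice([-1, 1], N) + 1j * rng.choice([-1, 1], N)) / np.sqrt(2)
x = np.fft.ifft(X) * np.sqrt(N)

# Soft amplitude clipping at the transmitter.
A = 0.8 * np.max(np.abs(x))    # clipping level (illustrative)
xc = np.where(np.abs(x) > A, A * x / np.abs(x), x)

# Receiver (noiseless channel for clarity): frequency-domain observation.
Y = np.fft.fft(xc) / np.sqrt(N)

# Step 1: candidate clips are the samples sitting at the clipping level.
cand = np.where(np.isclose(np.abs(xc), A))[0]

# Step 2: on known subcarriers the residual Y - X equals the clipping
# noise C = F @ c, where c is nonzero only at candidate positions.
# Solve for the clip amplitudes by least squares over those positions.
F = np.fft.fft(np.eye(N))[np.ix_(pilots, cand)] / np.sqrt(N)
c_hat, *_ = np.linalg.lstsq(F, (Y - X)[pilots], rcond=None)

# Step 3: subtract the reconstructed clipping noise from all subcarriers.
c_full = np.zeros(N, complex)
c_full[cand] = c_hat
Y_comp = Y - np.fft.fft(c_full) / np.sqrt(N)

err_before = np.linalg.norm(Y - X)
err_after = np.linalg.norm(Y_comp - X)
print(err_after < err_before)
```

The point of restricting the unknowns to the candidate positions is that the least-squares problem stays small and well-determined even though only a fraction of subcarriers is known to any one user.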
Abstract:
The explosive growth of the World Wide Web and the emergence of ecommerce are the two major factors that have led to the development of recommender systems (Resnick and Varian, 1997). The main task of recommender systems is to learn from users and recommend items (e.g. information, products or books) that match the users’ personal preferences. Recommender systems have been an active research area for more than a decade. Many different techniques and systems with distinct strengths have been developed to generate better quality recommendations. One of the main factors affecting recommendation quality is the amount of information resources available to the recommender. The main feature of recommender systems is their ability to make personalised recommendations for different individuals. However, many ecommerce sites find it difficult to obtain sufficient knowledge about their users, and hence the recommendations they provide are often poor and not personalised. This information insufficiency problem is commonly referred to as the cold-start problem. Most existing research on recommender systems focuses on developing techniques to better utilise the available information resources to achieve better recommendation quality. However, while the amount of available data and information remains insufficient, these techniques can provide only limited improvements to overall recommendation quality. In this thesis, a novel and intuitive approach towards improving recommendation quality and alleviating the cold-start problem is attempted: enriching the information resources themselves. It can easily be observed that, when there is a sufficient information and knowledge base to support recommendation making, even the simplest recommender systems can outperform sophisticated ones that have limited information resources.
Two possible strategies are suggested in this thesis to achieve the proposed information enrichment for recommenders:
• The first strategy suggests that information resources can be enriched by considering other information or data facets. Specifically, a taxonomy-based recommender, the Hybrid Taxonomy Recommender (HTR), is presented in this thesis. HTR exploits the relationship between users’ taxonomic preferences and item preferences, derived from the combination of widely available product taxonomic information and existing user rating data, and then utilises this taxonomic-preference-to-item-preference relation to generate high quality recommendations.
• The second strategy suggests that information resources can be enriched simply by obtaining them from other parties. In this thesis, a distributed recommender framework, the Ecommerce-oriented Distributed Recommender System (EDRS), is proposed. EDRS allows multiple recommenders from different parties (i.e. organisations or ecommerce sites) to share recommendations and information resources with each other in order to improve their recommendation quality.
Based on the results obtained from the experiments conducted in this thesis, the proposed systems and techniques achieve significant improvements both in making quality recommendations and in alleviating the cold-start problem.
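The cold-start problem and the taxonomy-based fallback idea can be illustrated with a deliberately tiny sketch. The data, function names and scoring rule are hypothetical and far simpler than the HTR and EDRS systems described above:

```python
# Toy illustration: a user with rating history gets a peer-based
# recommendation; a brand-new (cold-start) user has no ratings, so the
# recommender falls back on a stated taxonomy (category) preference.
ratings = {                     # user -> {item: rating}  (hypothetical data)
    "alice": {"book_a": 5, "book_b": 4},
    "bob":   {"book_b": 5, "cd_a": 2},
}
taxonomy = {"book_a": "books", "book_b": "books", "cd_a": "music"}

def recommend(user, category_pref=None):
    seen = ratings.get(user, {})
    if seen:                    # enough data: rank unseen items by peers' ratings
        scores = {}
        for other, theirs in ratings.items():
            if other == user:
                continue
            for item, r in theirs.items():
                if item not in seen:
                    scores[item] = scores.get(item, 0) + r
        return max(scores, key=scores.get) if scores else None
    if category_pref:           # cold start: fall back to taxonomy preference
        pool = [i for i, cat in taxonomy.items() if cat == category_pref]
        return pool[0] if pool else None
    return None

print(recommend("alice"))                          # → "cd_a" (peer-based)
print(recommend("carol", category_pref="books"))   # → "book_a" (cold-start fallback)
```

The same structure shows why enrichment helps: every extra facet (taxonomy here, or ratings shared by another party as in EDRS) gives the recommender something to score when the primary data source is empty.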
Abstract:
While there have been improvements in Australian engineering education since the 1990s, there are still strong concerns that more progress needs to be made, particularly in the areas of developing graduate competencies and outcomes-based curricula. This paper reports on the findings from a two-day ALTC-funded forum that sought to establish a shared understanding among the three stakeholder groups (students, academics and industry) about how to achieve a design-based engineering curriculum. It reports on the findings from the first day’s activities and reveals a shared desire for design- and project-based curricula that would encourage the development of the ‘three-dimensional’ graduate: one who has technical, personal and professional, and systems-thinking/design-based competence.
Abstract:
The CDIO (Conceive-Design-Implement-Operate) Initiative has been globally recognised as an enabler for engineering education reform. With the CDIO process, the CDIO Standards and the CDIO Syllabus, many scholarly contributions have been made around cultural change, curriculum reform and learning environments. In the Australasian region, reform is gaining significant momentum within the engineering education community, the profession, and higher education institutions. This paper presents the CDIO Syllabus cast into the Australian context by mapping it to the Engineers Australia Graduate Attributes, the Washington Accord Graduate Attributes and the Queensland University of Technology Graduate Capabilities. Furthermore, in recognition that many secondary schools and technical training institutions offer introductory engineering technology subjects, this paper presents an extended self-rating framework suited for recognising developing levels of proficiency at a preparatory level. A demonstrator mapping tool has been created to demonstrate the application of this extended graduate attribute mapping framework as a precursor to an integrated curriculum information model.
Abstract:
There is a growing need for international transparency of engineering qualifications, and mechanisms to support and facilitate student mobility. In response, there are a number of global initiatives attempting to address these needs, particularly in Europe, North America and Australia. The Conceive-Design-Implement-Operate (CDIO) Initiative has a set of standards, competencies, and proficiency levels developed through a global community of practice. It is a well-structured framework in which best-practice internationalisation and student mobility can be embedded. However, the current 12 CDIO Standards do not address international qualifications or student mobility. Based on an environmental scan of global activities, the underpinning principles of best practice are identified and form the basis of the proposed 13th CDIO Standard — “Internationalization and Mobility”.
Abstract:
The issue of what an effective high-quality / high-equity education system might look like remains contested. Indeed, there is more educational commentary on those systems that do not achieve this goal (see, for example, Luke & Woods, 2009 for a detailed review of the No Child Left Behind policy initiatives put forward in the United States under the Bush Administration) than there is detailed consideration of what such a system might enact and represent. A long-held critique of sociocultural and critical perspectives in education has been their focus on deconstruction to the supposed detriment of reconstructive work. This critique is less warranted in recent times based on work in the field, especially the plethora of qualitative research focusing on case studies of ‘best practice’. However, it certainly remains the case that there is more work to be done in investigating the characteristics of a socially just system. This issue of Point and Counterpoint aims to progress such a discussion. Several of the authors call for a reconfiguration of the use of large-scale comparative assessment measures, and all suggest new ways of thinking about quality and equity for school systems. Each of the papers tackles different aspects of the problem of how to achieve high equity without compromising quality within a large education system. They each take a reconstructive focus, highlighting ways forward for education systems in Australia and beyond. While each paper investigates different aspects of the issue, the clearly stated objective of seeking to delineate and articulate characteristics of socially just education is consistent throughout the issue.
Abstract:
The aim of this paper is to show how principles of ecological psychology and dynamical systems theory can underpin a philosophy of coaching practice in a nonlinear pedagogy. Nonlinear pedagogy is based on a view of the human movement system as a nonlinear dynamical system. We demonstrate how this perspective of the human movement system can aid understanding of skill acquisition processes and underpin practice for sports coaches. We provide a description of nonlinear pedagogy followed by a consideration of some of the fundamental principles of ecological psychology and dynamical systems theory that underpin it as a coaching philosophy. We illustrate how each principle impacts on nonlinear pedagogical coaching practice, demonstrating how each principle can substantiate a framework for the coaching process.