7 results for Educational software
at Queensland University of Technology - ePrints Archive
Abstract:
The little grey cat engine (greyCat) is part of a series of projects exploring software that can enable access to the potentially empowering nature of represented space and game design. GreyCat is the result of research into the culture of the software itself, aimed at providing participatory environments that enable the telling of 'small stories': stories and experiences of the everyday, or of a cultural perspective other than that prioritised by most world-building software or game engines. GreyCat offers a simple framework which allows participants to use their own image materials (photographs for the most part) as a basis for spatial exploration of their own places.

Truna aka j.turner (2008) The little grey cat engine: telling small stories (Demo), Australasian Computer Human Interaction Conference, OZCHI 2008, December 8-12, Cairns, Australia

Research Publications:
truna aka j.turner & Browning, D. (2009) Designing spatial story telling software, in proceedings OZCHI09, Melbourne
Truna aka j.turner, Browning, D. & Champion, E. (2008) Designing for Engaged Experience, in proceedings Australasian Computer Human Interaction Conference, OZCHI 2008, December 8-12, Cairns, Australia
Truna aka j.turner & Bidwell, N. (2007) Through the looking glass: game worlds as representations and views from elsewhere, in proceedings of the 4th Australasian Conference on Interactive Entertainment, Melbourne, Australia
Truna aka j.turner, Browning, D. & Bidwell, N. (2007) Wanderer beyond game worlds, in proceedings, Hutchinson, A. (ed) PerthDAC 2007: The Seventh International Digital Arts and Culture Conference: The future of digital media culture, 15-18 September 2007, Perth, Australia, Curtin University of Technology
Truna aka j.turner (2006) To explore strange new worlds: experience design in 3-dimensional immersive environments - role and place in a world as object of interaction, in proceedings, Australasian Computer Human Interaction Conference, OZCHI 2006, November 20-24 2006, Sydney, Australia, pp. 26-29
Truna aka j.turner (2006) Digital songlines environment (Demonstration), in proceedings 2006 International Conference on Game Research and Development, Perth, Australia
Truna aka j.turner (2006) Destination Space: Experiential Spatiality and Stories, Special Session on Experiential Spatiality, in proceedings 2006 International Conference on Game Research and Development, Perth, Australia
Abstract:
Computer-aided technologies, medical imaging, and rapid prototyping have created new possibilities in biomedical engineering. The systematic variation of scaffold architecture, as well as the mineralization inside a scaffold/bone construct, can be studied using computer imaging technology, CAD/CAM, and micro-computed tomography (micro-CT). In this paper, the potential of combining these technologies has been exploited in the study of scaffolds and osteochondral repair. Porosity, surface area per unit volume, and the degree of interconnectivity were evaluated through imaging and computer-aided manipulation of the scaffold scan data. For the osteochondral model, the spatial distribution and the degree of bone regeneration were evaluated. The versatility of two software packages was also assessed: Mimics (Materialise), and CTan with 3D realistic visualization (SkyScan).
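As a minimal illustration of the kind of morphometric quantities evaluated from scan data (not the actual Mimics or CTan workflow), porosity and a crude surface-to-volume estimate can be computed from a binarised micro-CT voxel stack; the array shape, threshold, and voxel size below are assumptions for demonstration only.

# Sketch: porosity and surface-area-to-volume from a binarised voxel stack.
# Hypothetical stand-in for CTan-style measurements; parameters are assumed.
import numpy as np

def porosity(volume: np.ndarray) -> float:
    # volume: 3D boolean array, True where scaffold material is present
    return 1.0 - volume.mean()

def surface_to_volume(volume: np.ndarray, voxel_size: float) -> float:
    # Count exposed voxel faces along each axis as a crude surface estimate
    faces = 0
    for axis in range(3):
        diff = np.diff(volume.astype(np.int8), axis=axis)
        faces += np.abs(diff).sum()
    surface_area = faces * voxel_size ** 2
    material_volume = volume.sum() * voxel_size ** 3
    return surface_area / material_volume

# Example: a random 100^3 volume thresholded at 50% occupancy
rng = np.random.default_rng(0)
vol = rng.random((100, 100, 100)) > 0.5
print(f"porosity: {porosity(vol):.3f}")
print(f"S/V ratio: {surface_to_volume(vol, voxel_size=0.01):.1f} per unit length")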
Abstract:
The economic environment of today can be characterized as highly dynamic and competitive, if not in constant flux. Globalization and the Information Technology (IT) revolution are perhaps the main contributing factors to this observation. While companies have to some extent adapted to the current business environment, new pressures such as the recent increase in environmental awareness and its likely effects on regulations are underway. Hence, in the light of market and competitive pressures, companies must constantly evaluate and, if necessary, update their strategies to sustain and increase the value they create for shareholders (Hunt and Morgan, 1995; Christopher and Towill, 2002). One way to create greater value is to become more efficient in producing and delivering goods and services to customers, which can lead to a strategy known as cost leadership (Porter, 1980). Even though Porter (1996) notes that in the long run cost leadership may not be a sufficient strategy for competitive advantage, operational efficiency is certainly necessary and should therefore be on the agenda of every company.

Better workflow management, technology, and resource utilization can lead to greater internal operational efficiency, which explains why, for example, many companies have recently adopted Enterprise Resource Planning (ERP) systems: integrated software that streamlines business processes. However, as more and more companies approach internal operational excellence, the focus for finding inefficiencies and cost-saving opportunities is moving beyond the boundaries of the firm. Today many firms in the supply chain are engaging in collaborative relationships with customers, suppliers, and third parties (services) in an attempt to cut down on costs related to, for example, inventory and production, as well as to facilitate synergies. Thus, recent years have witnessed fluidity and blurring regarding organizational boundaries (Coad and Cullen, 2006).

The IT revolution of the late 1990s has played an important role in bringing organizations closer together. In their efforts to become more efficient, companies first integrated their information systems to speed up transactions such as ordering and billing. Later, collaboration on a multidimensional scale including logistics, production, and Research & Development became evident as companies expected substantial benefits from collaboration. However, one could also argue that the recent popularity of the concepts falling under Supply Chain Management (SCM), such as Vendor Managed Inventory and Collaborative Planning, Forecasting, and Replenishment, owes much to the marketing efforts of software vendors and consultants who provide these solutions. Nevertheless, reports from professional organizations as well as academia indicate that the trend towards interorganizational collaboration is gaining wider ground. For example, the ARC Advisory Group, a research organization on supply chain solutions, estimated that the market for SCM, which includes various kinds of collaboration tools and related services, would grow at an annual rate of 7.4% during 2004-2008, reaching $7.4 billion in 2008 (Engineeringtalk 2004).
Abstract:
Computational Intelligence Systems (CIS) are a class of advanced software. CIS have an important position in solving single-objective, reverse/inverse, and multi-objective design problems in engineering. This paper hybridises a CIS for optimisation with the concept of Nash equilibrium as an optimisation pre-conditioner to accelerate the optimisation process. The hybridised CIS (Hybrid Intelligence System), coupled to a Finite Element Analysis (FEA) tool and one type of Computer Aided Design (CAD) system, GiD, is applied to solve an inverse engineering design problem: the reconstruction of High Lift Systems (HLS). Numerical results obtained by the hybridised CIS are compared to the results obtained by the original CIS. The benefits of using the concept of Nash equilibrium are clearly demonstrated in terms of solution accuracy and optimisation efficiency.
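The paper's FEA-coupled Hybrid Intelligence System cannot be reproduced here, but the Nash-equilibrium idea it builds on can be sketched: design variables are split among "players", each optimising its own subset while the others' variables are frozen, iterating to an equilibrium. The objective below is an illustrative test function, not the HLS reconstruction problem.

# Sketch: Nash-game decomposition via best-response iteration.
# Two players each own one design variable of a coupled quadratic.
import numpy as np
from scipy.optimize import minimize

def objective(x):
    # Coupled quadratic: the cross term means players cannot ignore each other
    return (x[0] - 1) ** 2 + (x[1] + 2) ** 2 + 0.5 * x[0] * x[1]

x = np.zeros(2)  # player 1 owns x[0], player 2 owns x[1]
for iteration in range(50):
    x_prev = x.copy()
    # Player 1's best response, with x[1] frozen
    r1 = minimize(lambda v: objective([v[0], x[1]]), [x[0]])
    x[0] = r1.x[0]
    # Player 2's best response, with x[0] frozen
    r2 = minimize(lambda v: objective([x[0], v[0]]), [x[1]])
    x[1] = r2.x[0]
    if np.linalg.norm(x - x_prev) < 1e-8:
        break

print("Nash equilibrium point:", x)  # converges to (1.6, -2.4)

In the paper's setting, each player's inner solve would be a CIS optimisation evaluated through the FEA tool rather than a closed-form quadratic.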
Abstract:
Voltage drop and rise at network peak and off-peak periods, along with voltage unbalance, are the major power quality problems in low voltage distribution networks. Utilities usually adjust transformer tap changers as a solution to voltage drop, and try to distribute loads equally across phases as a solution to voltage unbalance. On the other hand, ever increasing energy demand, along with the necessity of cost reduction and higher reliability requirements, is driving modern power systems towards Distributed Generation (DG) units. These can take the form of small rooftop photovoltaic cells (PVs), Plug-in Electric Vehicles (PEVs), or Micro Grids (MGs). Rooftop PVs, typically with power levels ranging from 1-5 kW and installed by householders, are gaining popularity due to their financial benefits for the householders. PEVs will also soon emerge in residential distribution networks; they behave as a large residential load while being charged, and later generations are also expected to support the network as small DG units which transfer the energy stored in their batteries into the grid. Furthermore, MGs, which are clusters of loads and several DG units such as diesel generators, PVs, fuel cells, and batteries, have recently been introduced to distribution networks. Voltage unbalance in the network can increase due to uncertainties in the random connection points of PVs and PEVs, their nominal capacity, and their time of operation. Therefore, it is of high interest to investigate voltage unbalance in these networks as a result of MG, PV, and PEV integration into low voltage networks. In addition, the network might experience non-standard voltage drop due to high penetration of PEVs being charged at night, or non-standard voltage rise due to high penetration of PVs and PEVs generating electricity back into the grid during off-peak periods. In this thesis, a voltage unbalance sensitivity analysis and stochastic evaluation is carried out for householder-installed PVs versus their installation point, nominal capacity, and penetration level as different uncertainties. A similar analysis is carried out for PEV penetration in the network in two different modes: grid-to-vehicle and vehicle-to-grid. Furthermore, conventional methods for improving voltage unbalance within these networks are discussed. This is followed by new and efficient methods proposed for voltage profile improvement at network peak and off-peak periods and for voltage unbalance reduction. In addition, voltage unbalance reduction is investigated for MGs, and new improvement methods are proposed and applied to the MG test bed planned to be established at Queensland University of Technology (QUT). The MATLAB and PSCAD/EMTDC simulation packages are used to verify the analyses and proposals.
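The thesis itself relies on MATLAB and PSCAD/EMTDC, but the core quantity it studies is compact enough to illustrate: the voltage unbalance factor (VUF) is the ratio of the negative- to positive-sequence voltage magnitudes, computed from the three phase phasors via symmetrical components. The phasor values below are illustrative, not taken from the thesis test network.

# Sketch: voltage unbalance factor VUF = |V2| / |V1| * 100%
# from the symmetrical components of the three phase voltages.
import numpy as np

def vuf(va, vb, vc):
    a = np.exp(2j * np.pi / 3)  # 120-degree rotation operator
    v1 = (va + a * vb + a**2 * vc) / 3   # positive-sequence component
    v2 = (va + a**2 * vb + a * vc) / 3   # negative-sequence component
    return 100.0 * abs(v2) / abs(v1)

# Balanced case: VUF = 0
va = 230 * np.exp(1j * 0)
vb = 230 * np.exp(-2j * np.pi / 3)
vc = 230 * np.exp(2j * np.pi / 3)
print(f"balanced: {vuf(va, vb, vc):.2f}%")

# Phase a sags to 220 V, e.g. a heavily loaded phase charging a PEV
print(f"unbalanced: {vuf(220 * np.exp(1j * 0), vb, vc):.2f}%")  # ~1.47%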
Abstract:
Peeling is an essential phase of the post-harvest and processing industry; however, undesirable processing losses are unavoidable and have always been the main concern of the food processing sector. There are three methods of peeling fruits and vegetables, depending on the class and type of produce: mechanical, chemical, and thermal. By comparison, mechanical methods are the most preferred, since they do not create any harmful effects on the tissue and they keep the edible portions of produce fresh. The main disadvantage of mechanical peeling is the rate of material loss and deformation. Reducing material losses and increasing the quality of the process has a direct effect on the overall efficiency of the food processing industry, which calls for further study of the technological aspects of these operations. To enhance the effectiveness of industrial food practices it is essential to have a clear understanding of material properties and the behaviour of tissues under industrial processes. This paper presents a scheme of research that seeks to examine tissue damage of tough-skinned vegetables under the mechanical peeling process by developing a novel finite element (FE) model of the process using an explicit dynamic finite element analysis approach. A computer model of the mechanical peeling process will be developed in this study to simulate the energy consumption and the stress-strain interactions of cutter and tissue. Available finite element software and methods will be applied to establish the model. Improving knowledge of the interactions and the variables involved in food operations, particularly in the peeling process, is the main objective of the proposed study. Understanding these interrelationships will help researchers and designers of food processing equipment to develop new and more efficient technologies. The presented work reviews the available literature and previous work done in this area of research and identifies the current gap in modelling and simulation of food processes.
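The proposed model would be built in commercial FE software, but the explicit dynamic scheme it names is simple to sketch: a central-difference time-stepping loop in which accelerations from the current internal forces update velocities and displacements. Below, a 1-D chain of lumped masses and linear springs stands in for a tissue cross-section, with one end node driven at constant speed to mimic a cutter; all parameters are assumptions for illustration.

# Sketch: explicit central-difference integration, the scheme underlying
# explicit dynamic FE solvers. Toy 1-D mass-spring chain, not a real tissue model.
import numpy as np

n = 20                # number of nodes
m = 1e-4              # lumped nodal mass, kg
k = 500.0             # spring stiffness, N/m
dt = 0.5 * np.sqrt(m / k)  # stable step, half the ~sqrt(m/k) critical value here
cutter_speed = 0.01   # m/s, prescribed motion of the driven end node

u = np.zeros(n)       # displacements
v = np.zeros(n)       # velocities

for step in range(2000):
    # Internal spring forces assembled node by node
    f = np.zeros(n)
    stretch = np.diff(u)      # elongation of each spring
    f[:-1] += k * stretch     # spring pulls its left node forward
    f[1:] -= k * stretch      # and its right node backward
    # Explicit update: acceleration -> velocity -> displacement
    v += (f / m) * dt
    v[0] = 0.0                # node 0 is a fixed support
    u += v * dt
    u[-1] = cutter_speed * (step + 1) * dt  # prescribed cutter displacement

print(f"peak spring elongation: {np.max(np.abs(np.diff(u))):.3e} m")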
Abstract:
Good daylighting design in buildings not only provides a comfortable luminous environment, but also delivers energy savings and comfortable and healthy environments for building occupants. Yet there is still no consensus on how to assess what constitutes good daylighting design. Among current building performance guidelines, daylight factors (DF) or minimum illuminance values are the standard; however, previous research has shown the shortcomings of these metrics. New computer software for daylighting analysis offers more advanced climate-based daylight metrics (CBDM). Yet these tools (new metrics or simulation tools) are not currently understood by architects and are not used within architectural firms in Australia. A survey of architectural firms in Brisbane identified the tools most relevant to industry. The purpose of this paper is to assess and compare these computer simulation tools and the new tools available to architects and designers for daylighting. The tools are assessed in terms of their ease of use (e.g. previous knowledge required, complexity of geometry input), efficiency (e.g. speed, render capabilities), and outcomes (e.g. presentation of results). The study shows that the tools most accessible to architects are those that can import a wide variety of file types or can be integrated into current 3D modelling software or packages. Such software needs to be able to calculate both point-in-time simulations and annual analyses. There is a current need among these software solutions for an open-source program able to read raw data (in the form of spreadsheets) and show it graphically within a 3D medium. Currently, development of plug-in based software is trying to meet this need through third-party analysis; however, some of these packages are heavily reliant on their host program. Programs which allow dynamic daylighting simulation will make it easier to calculate accurate daylighting no matter which modelling platform the designer uses, while producing more tangible analysis, without the need to process raw data.
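As a minimal sketch of the gap the survey identifies, an open-source reader can take raw simulation output (a spreadsheet of lux values on a sensor grid) and display it graphically; the file name, grid layout, and 300 lx threshold below are illustrative assumptions, not the paper's method, and the result is a 2D false-colour map rather than the 3D medium the paper calls for.

# Sketch: read an illuminance grid from a spreadsheet (CSV) and plot it.
# Each CSV row is one row of sensor points; each cell an illuminance in lux.
import numpy as np
import matplotlib.pyplot as plt

grid = np.loadtxt("illuminance_grid.csv", delimiter=",")  # hypothetical file

fig, ax = plt.subplots()
image = ax.imshow(grid, cmap="viridis", origin="lower")
fig.colorbar(image, ax=ax, label="Illuminance (lux)")
ax.set_title("Workplane illuminance")
ax.set_xlabel("Sensor column")
ax.set_ylabel("Sensor row")

# Simple summary metric: fraction of the workplane above an assumed 300 lx target
print(f"area above 300 lx: {100 * (grid >= 300).mean():.1f}%")
plt.show()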