71 results for Editor of flow analysis methods
Abstract:
Owing to the successful use of non-invasive vibration analysis to monitor the progression of dental implant healing and stabilization, it is now being considered as a method to monitor femoral implants in transfemoral amputees. This study uses composite femur-implant physical models to investigate the ability of modal analysis to detect changes at the interface between the implant and bone simulating those that occur during osseointegration. Using electromagnetic shaker excitation, differences were detected in the resonant frequencies and mode shapes of the model when the implant fit in the bone was altered to simulate the two interface cases considered: firm and loose fixation. The study showed that it is beneficial to examine higher resonant frequencies and their mode shapes (rather than the fundamental frequency only) when assessing fixation. The influence of the model boundary conditions on the modal parameters was also demonstrated. Further work is required to more accurately model the mechanical changes occurring at the bone-implant interface in vivo, as well as further refinement of the model boundary conditions to appropriately represent the in vivo conditions. Nevertheless, the ability to detect changes in the model dynamic properties demonstrates the potential of modal analysis in this application and warrants further investigation.
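The modal comparison described above can be illustrated with a toy example (not part of the study): synthetic damped vibration responses for a hypothetical "firm" and "loose" fixation are compared by locating their dominant spectral peaks. All frequencies and damping values here are invented for illustration; a firmer interface is modelled simply as a stiffer, higher-frequency system.

```python
import numpy as np

def dominant_frequency(signal, fs):
    """Return the frequency (Hz) of the largest spectral peak."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return freqs[np.argmax(spectrum)]

# Synthetic vibration responses (hypothetical values, for illustration only):
# a firmly fixed implant is modelled as having a higher resonant frequency.
fs = 2000.0                      # sampling rate, Hz
t = np.arange(0, 1.0, 1.0 / fs)
firm = np.exp(-3 * t) * np.sin(2 * np.pi * 850 * t)    # "firm" fixation
loose = np.exp(-3 * t) * np.sin(2 * np.pi * 620 * t)   # "loose" fixation

f_firm = dominant_frequency(firm, fs)
f_loose = dominant_frequency(loose, fs)
```

In practice one would examine several resonant peaks and the associated mode shapes, as the abstract notes, rather than a single dominant frequency.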
Abstract:
Good daylighting design in buildings not only provides a comfortable luminous environment, but also delivers energy savings and healthy environments for building occupants. Yet there is still no consensus on how to assess what constitutes good daylighting design. Among current building performance guidelines, daylight factors (DF) and minimum illuminance values are the standard; however, previous research has shown the shortcomings of these metrics. Newer computer software for daylighting analysis implements more advanced, climate-based daylight metrics (CBDM). Yet these tools (both the new metrics and the simulation tools) are not well understood by architects and are not used within architectural firms in Australia. A survey of architectural firms in Brisbane identified the tools most commonly used by industry. The purpose of this paper is to assess and compare these computer simulation tools and the new daylighting tools available to architects and designers. The tools are assessed in terms of their ease of use (e.g. previous knowledge required, complexity of geometry input), efficiency (e.g. speed, render capabilities) and outcomes (e.g. presentation of results). The study shows that the tools most accessible to architects are those that import a wide variety of file formats, or that can be integrated into current 3D modelling software. Such software needs to be able to calculate both point-in-time and annual simulations. There is a current need for an open-source program able to read raw data (in the form of spreadsheets) and display it graphically within a 3D medium. Development of plug-in based software is attempting to meet this need through third-party analysis, although some of these packages are heavily reliant on their host program. Such plug-ins nevertheless allow dynamic daylighting simulation, making it easier to calculate accurate daylighting regardless of the modelling platform the designer uses, while producing more tangible analysis without the need to process raw data.
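As context for the daylight factor (DF) metric mentioned above, a minimal sketch of the standard DF calculation: the ratio of indoor horizontal illuminance to simultaneous unobstructed outdoor illuminance under an overcast sky, expressed as a percentage. The illuminance readings below are hypothetical.

```python
def daylight_factor(indoor_lux, outdoor_lux):
    """Daylight factor (%): indoor horizontal illuminance divided by
    simultaneous unobstructed outdoor illuminance under an overcast sky."""
    if outdoor_lux <= 0:
        raise ValueError("outdoor illuminance must be positive")
    return 100.0 * indoor_lux / outdoor_lux

# Hypothetical readings: 200 lx at a desk, 10,000 lx outdoors
df = daylight_factor(200.0, 10000.0)   # 2% daylight factor
```

CBDM approaches go beyond this static ratio by using annual climate data, which is precisely why the simulation tools discussed in the paper matter.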
Abstract:
This dissertation analyses how physical objects are translated into digital artworks using techniques which can lead to ‘imperfections’ in the resulting digital artwork that are typically removed to arrive at a ‘perfect’ final representation. The dissertation discusses the adaptation of existing techniques into an artistic workflow that acknowledges and incorporates the imperfections of translation into the final pieces. It presents an exploration of the relationship between physical and digital artefacts and the processes used to move between the two. The work explores the 'craft' of digital sculpting and the technology used in producing what the artist terms ‘a naturally imperfect form’, incorporating knowledge of traditional sculpture, an understanding of anatomy and an interest in the study of bones (Osteology). The outcomes of the research are presented as a series of digital sculptural works, exhibited as a collection of curiosities in multiple mediums, including interactive game spaces, augmented reality (AR), rapid prototype prints (RP) and video displays.
Abstract:
Conceptual modelling supports developers and users of information systems in areas of documentation, analysis or system redesign. The ongoing interest in the modelling of business processes has led to a variety of different grammars, raising the question of the quality of these grammars for modelling. An established way of evaluating the quality of a modelling grammar is by means of an ontological analysis, which can determine the extent to which grammars contain construct deficit, overload, excess or redundancy. While several studies have shown the relevance of most of these criteria, predictions about construct redundancy have yielded inconsistent results in the past, with some studies suggesting that redundancy may even be beneficial for modelling in practice. In this paper we seek to contribute to clarifying the concept of construct redundancy by introducing a revision to the ontological analysis method. Based on the concept of inheritance we propose an approach that distinguishes between specialized and distinct construct redundancy. We demonstrate the potential explanatory power of the revised method by reviewing and clarifying previous results found in the literature.
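One plausible, purely illustrative reading of the specialized-versus-distinct distinction can be sketched in code: grammar constructs map to ontological concepts, and redundancy between two constructs is classified by whether an inheritance relation holds between their target concepts. The concept hierarchy, construct names, and labels below are invented assumptions, not the paper's actual definitions.

```python
# Hypothetical concept hierarchy: child -> parent (invented for illustration)
parents = {"Event": "Thing", "Transformation": "Event", "State": "Thing"}

def is_descendant(concept, ancestor):
    """True if `concept` inherits (transitively) from `ancestor`."""
    while concept in parents:
        concept = parents[concept]
        if concept == ancestor:
            return True
    return False

def classify_redundancy(mapping, c1, c2):
    """Illustrative classification of two overlapping grammar constructs:
    'distinct' if both map to the very same concept, 'specialized' if one
    maps to a specialization of the other's concept (labels are assumptions)."""
    a, b = mapping[c1], mapping[c2]
    if a == b:
        return "distinct"
    if is_descendant(a, b) or is_descendant(b, a):
        return "specialized"
    return "none"

# Hypothetical grammar-to-ontology mapping
mapping = {"Task": "Event", "SubProcess": "Transformation", "Activity": "Event"}
```

Under this sketch, a construct mapped to a subtype of another's concept would not be flatly redundant, which hints at why redundancy can appear beneficial in practice.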
Abstract:
The pull-through/local dimpling failure strength of screwed connections is very important in the design of profiled steel cladding systems to help them resist storms and hurricanes. The current American and European provisions recommend four different test methods for screwed connections in tension, but the accuracy of these methods in determining the connection strength is not known. It is unlikely that the four test methods are equivalent in all cases, and thus it is necessary to reduce the number of methods recommended. This paper presents a review of these test methods based on laboratory tests on crest- and valley-fixed claddings, and then recommends alternative test methods that reproduce the real behaviour of the connections, including the bending and membrane deformations of the cladding around the screw fasteners and the tension load in the fastener.
Abstract:
Application of 'advanced analysis' methods suitable for non-linear analysis and design of steel frame structures permits direct and accurate determination of ultimate system strengths, without resort to simplified elastic methods of analysis and semi-empirical specification equations. However, the application of advanced analysis methods has previously been restricted to steel frames comprising only compact sections that are not influenced by the effects of local buckling. A research project has been conducted with the aim of developing concentrated plasticity methods suitable for practical advanced analysis of steel frame structures comprising non-compact sections. A series of large-scale tests were performed in order to provide experimental results for verification of the new analytical models. Each of the test frames comprised non-compact sections, and exhibited significant local buckling behaviour prior to failure. This paper presents details of the test program including the test specimens, set-up and instrumentation, procedure, and results.
Abstract:
Online business, or electronic commerce (EC), is becoming popular among customers today; as a result, a large number of product reviews have been posted online. This information is very valuable, not only for prospective customers making purchase decisions but also for companies gathering information about customers' satisfaction with their products. Opinion mining is used to capture customer reviews and separate them into subjective expressions (sentiment words) and objective expressions (non-sentiment words). This paper proposes a novel multi-dimensional model for opinion mining, which integrates customers' characteristics and their opinions about products. The model captures subjective expressions from product reviews and transfers them to a fact table before representing them along multiple dimensions, namely customer, product, time and location. Data warehouse techniques such as OLAP and data cubes are used to analyse the opinionated sentences. A comprehensive way to calculate customers' orientation towards products' features and attributes is also presented.
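The fact-table-and-rollup idea can be sketched as a tiny in-memory cube: each extracted subjective expression becomes one fact row, and aggregating mean sentiment over chosen dimensions corresponds to an OLAP roll-up. The rows, dimension values, and scores below are invented; a real deployment would use an OLAP engine rather than a dictionary.

```python
from collections import defaultdict

# Hypothetical fact table: one row per subjective expression extracted
# from a review, scored -1 (negative) to +1 (positive).
facts = [
    {"customer": "c1", "product": "phone",  "location": "NY", "month": "2024-01", "score":  1},
    {"customer": "c2", "product": "phone",  "location": "NY", "month": "2024-01", "score": -1},
    {"customer": "c3", "product": "phone",  "location": "LA", "month": "2024-02", "score":  1},
    {"customer": "c1", "product": "tablet", "location": "NY", "month": "2024-02", "score":  1},
]

def rollup(facts, dims):
    """Mean sentiment aggregated over the chosen dimensions
    (a minimal stand-in for an OLAP cube roll-up)."""
    sums, counts = defaultdict(float), defaultdict(int)
    for row in facts:
        key = tuple(row[d] for d in dims)
        sums[key] += row["score"]
        counts[key] += 1
    return {k: sums[k] / counts[k] for k in sums}

by_product = rollup(facts, ["product"])          # slice by product
by_prod_loc = rollup(facts, ["product", "location"])  # drill down by location
```

Slicing and drilling down are then just different `dims` arguments, mirroring how the paper's cube supports analysis by customer, product, time and location.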
Abstract:
This study reports an action research project undertaken at Queensland University of Technology. It evaluates the effectiveness of the integration of GIS within the substantive domains of an existing land use planning course in 2011. Using student performance, learning experience survey, and questionnaire survey data, it also evaluates the impacts of incorporating hybrid instructional methods (e.g., in-class and online instructional videos) in 2012 and 2013. Results show that: students (re)iterated the importance of GIS in the course, justifying the integration; the hybrid methods significantly increased student performance; and the videos are more suitable as a complement to, rather than a replacement for, in-class activity.
Abstract:
A control allocation system implements a function that maps the desired control forces generated by the vehicle motion controller into the commands of the different actuators. In this article, a survey of control allocation methods for over-actuated underwater vehicles is presented, with a focus on the mathematical representation and solvability of thruster allocation problems. The methods are applicable to both surface vessels and underwater vehicles. The paper is useful for university students and engineers who want an overview of state-of-the-art control allocation methods, as well as advanced methods for solving more complex problems.
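The core unconstrained allocation problem surveyed here is commonly solved with the Moore-Penrose pseudoinverse: given a thruster configuration matrix T with tau = T u, the minimum-norm thruster command is u = T⁺ tau. The four-thruster configuration matrix below is an invented example, not taken from the article.

```python
import numpy as np

# Hypothetical configuration matrix T for four thrusters and three
# controlled degrees of freedom (surge, sway, yaw): tau = T @ u.
T = np.array([
    [1.0,  1.0, 0.0,  0.0],    # surge force contributions
    [0.0,  0.0, 1.0,  1.0],    # sway force contributions
    [0.3, -0.3, 0.6, -0.6],    # yaw moment arms (m)
])

def allocate(tau):
    """Unconstrained allocation: the minimum-norm u satisfying T u = tau,
    computed via the Moore-Penrose pseudoinverse."""
    return np.linalg.pinv(T) @ tau

tau = np.array([2.0, 1.0, 0.5])   # desired surge force, sway force, yaw moment
u = allocate(tau)                  # individual thruster commands
```

Saturation limits, actuator rate constraints, and azimuth angles turn this into the constrained (often optimization-based) formulations that make up the "advanced methods" the survey covers.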
Abstract:
The focus of this research is the creation of a stage-directing training manual on the researcher's site at the National Institute of Dramatic Art. The directing procedures build on the work of Stanislavski's Active Analysis and findings from present-day visual cognition studies. Action research methodology and evidence-based data collection are employed to improve the efficacy of both the directing procedures and the pedagogical manual. The manual serves as a supplement to director training and a toolkit for the more experienced practitioner. The manual and research findings provide a unique and innovative contribution to the field of theatre directing.
Abstract:
This paper offers numerical modelling of a waste heat recovery system. A thin layer of metal foam is attached to a cold plate to absorb heat from hot gases leaving the system. The heat transferred from the exhaust gas is then transferred to a cold liquid flowing in a secondary loop. Two different foam PPI (Pores Per Inch) values are examined over a range of fluid velocities. Numerical results are then compared to both experimental data and theoretical results available in the literature. Challenges in getting the simulation results to match those of the experiments are addressed and discussed in detail. In particular, interface boundary conditions specified between a porous layer and a fluid layer are investigated. While physically one expects much lower fluid velocity in the pores compared to that of free flow, capturing this sharp gradient at the interface can add to the difficulties of numerical simulation. The existing models in the literature are modified by considering the pressure gradient inside and outside the foam. Comparisons against the numerical modelling are presented. Finally, based on experimentally-validated numerical results, thermo-hydraulic performance of foam heat exchangers as waste heat recovery units is discussed with the main goal of reducing the excess pressure drop and maximising the amount of heat that can be recovered from the hot gas stream.
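The pressure-gradient modelling mentioned above can be sketched with the standard Darcy-Forchheimer relation for flow through a porous medium, -dp/dx = (mu/K)v + (rho*c_F/sqrt(K))v^2, where higher PPI foams have smaller pores and hence lower permeability K. The permeability and inertial-coefficient values for the two foams below are invented placeholders, not the paper's data.

```python
def pressure_gradient(v, K, c_F, mu=1.8e-5, rho=1.2):
    """Darcy-Forchheimer pressure gradient (Pa/m) for gas flowing at
    superficial velocity v (m/s) through a foam of permeability K (m^2)
    with inertial coefficient c_F. Default mu, rho approximate air."""
    return mu / K * v + rho * c_F / K**0.5 * v * v

# Hypothetical foam properties: a 40 PPI foam has smaller pores and
# therefore lower permeability than a 10 PPI foam.
K_10ppi, K_40ppi = 2.0e-7, 4.0e-8   # permeabilities (m^2), invented values
c_F = 0.1                            # inertial coefficient, invented value

dp10 = pressure_gradient(2.0, K_10ppi, c_F)
dp40 = pressure_gradient(2.0, K_40ppi, c_F)
```

This captures the thermo-hydraulic trade-off the paper closes on: finer foams transfer more heat but incur a larger pressure drop, so K and c_F must be chosen (or measured) carefully.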
Abstract:
The Environmental Kuznets Curve (EKC) hypothesises an inverse U-shaped relationship between a measure of environmental pollution and per capita income levels. In this study, we apply non-parametric estimation of local polynomial regression (local quadratic fitting) to allow more flexibility in local estimation. This study uses a larger and globally representative sample of many local and global pollutants and natural resources including Biological Oxygen Demand (BOD) emission, CO2 emission, CO2 damage, energy use, energy depletion, mineral depletion, improved water source, PM10, particulate emission damage, forest area and net forest depletion. Copyright © 2009 Inderscience Enterprises Ltd.
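Local quadratic fitting of the kind applied in the study can be sketched in a few lines: at each evaluation point x0, a quadratic is fitted by weighted least squares with a Gaussian kernel, and the intercept gives the fitted value. The inverted-U data below are synthetic, not the paper's sample.

```python
import numpy as np

def local_quadratic(x0, x, y, h):
    """Fit a weighted quadratic around x0 (Gaussian kernel, bandwidth h)
    and return the fitted value at x0."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    X = np.column_stack([np.ones_like(x), x - x0, (x - x0) ** 2])
    sw = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
    return beta[0]   # intercept = fitted value at x0

# Synthetic EKC-like inverted-U: pollution rises then falls with income
rng = np.random.default_rng(0)
income = np.linspace(0.0, 10.0, 200)
pollution = -(income - 5.0) ** 2 + 25.0 + rng.normal(0.0, 0.5, income.size)

fit_peak = local_quadratic(5.0, income, pollution, h=1.0)   # near the turning point
fit_low = local_quadratic(0.5, income, pollution, h=1.0)    # low-income end
```

Because the fit is local, no global functional form is imposed, which is exactly the flexibility the non-parametric approach buys over the usual quadratic-in-income EKC specification.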
Abstract:
We implemented six different boarding strategies (Wilma, Steffen, Reverse Pyramid, Random, Blocks and By Letter) in order to investigate boarding times for Boeing 777 and Airbus 380 aircraft. We also introduced three new boarding methods in search of an optimum boarding strategy. Our models explicitly simulate the behaviour of groups of people travelling together, as well as the time taken to store luggage as part of the boarding process. Results from the simulation demonstrate that the Reverse Pyramid method is the best boarding method for the Boeing 777, and the Steffen method is the best for the Airbus 380. Among the newly suggested methods, the aisle-first method is the best strategy for the Boeing 777 and the row-arrangement method is the best for the Airbus 380. Overall, the best strategy is the aisle-first method for the Boeing 777 and the Steffen method for the Airbus 380.
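A toy single-aisle simulation, far simpler than the models in the paper and with invented parameters, illustrates why boarding order affects total time: passengers advance one row per tick when the aisle ahead is clear and block the aisle while stowing luggage, so orders that let stowing happen in parallel finish sooner.

```python
def boarding_time(seq, stow=3):
    """Ticks until all passengers are seated. `seq` lists target rows in
    boarding order; each passenger blocks the aisle for `stow` ticks at
    their row while storing luggage."""
    waiting = list(seq)
    active = []   # [aisle_position, target_row, stow_ticks_left]
    t = 0
    while waiting or active:
        t += 1
        occupied = {p for p, _, _ in active}
        moved = []
        for p, target, s in sorted(active, reverse=True):  # front-most first
            if p == target:
                s -= 1
                if s > 0:
                    moved.append([p, target, s])
                else:
                    occupied.discard(p)          # seated: aisle position freed
            elif p + 1 not in occupied:
                occupied.discard(p)
                occupied.add(p + 1)
                moved.append([p + 1, target, s])
            else:
                moved.append([p, target, s])     # blocked: wait in place
        active = moved
        if waiting and 1 not in {p for p, _, _ in active}:
            active.append([1, waiting.pop(0), stow])   # next passenger enters
    return t

front_to_back = boarding_time([1, 2, 3, 4, 5, 6])
back_to_front = boarding_time([6, 5, 4, 3, 2, 1])
```

Even this minimal model shows back-to-front beating front-to-back; the paper's richer models add seat interference, travel groups, and realistic luggage-time distributions on full 777 and A380 layouts.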
Abstract:
Existing crowd counting algorithms rely on holistic, local or histogram-based features to capture crowd properties. Regression is then employed to estimate the crowd size. Insufficient testing across multiple datasets has made it difficult to compare and contrast different methodologies. This paper presents an evaluation across multiple datasets to compare holistic, local and histogram-based methods, and to compare various image features and regression models. A K-fold cross-validation protocol is followed to evaluate performance across five public datasets: UCSD, PETS 2009, Fudan, Mall and Grand Central. Image features are categorised into five types: size, shape, edges, keypoints and textures. The regression models evaluated are: Gaussian process regression (GPR), linear regression, K nearest neighbours (KNN) and neural networks (NN). The results demonstrate that local features outperform equivalent holistic and histogram-based features; that optimal performance is observed using all image features except textures; and that GPR outperforms linear, KNN and NN regression.
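The K-fold protocol used in the evaluation can be sketched for a single feature and a linear regression model; the "foreground size" feature and counts below are synthetic stand-ins for the real datasets, and the real study of course compares many features and regressors.

```python
import numpy as np

def k_fold_indices(n, k, seed=0):
    """Shuffle indices and split them into k disjoint folds."""
    idx = np.random.default_rng(seed).permutation(n)
    return np.array_split(idx, k)

def cv_mae_linear(X, y, k=5):
    """K-fold cross-validated mean absolute error of least-squares regression."""
    folds = k_fold_indices(len(y), k)
    errs = []
    for i, test in enumerate(folds):
        train = np.concatenate([f for j, f in enumerate(folds) if j != i])
        A = np.column_stack([X[train], np.ones(len(train))])   # slope + intercept
        coef, *_ = np.linalg.lstsq(A, y[train], rcond=None)
        pred = np.column_stack([X[test], np.ones(len(test))]) @ coef
        errs.append(np.mean(np.abs(pred - y[test])))
    return float(np.mean(errs))

# Synthetic crowd data: count grows linearly with one "foreground size" feature
rng = np.random.default_rng(1)
size_feature = rng.uniform(0.0, 1.0, 300)
count = 40.0 * size_feature + rng.normal(0.0, 1.0, 300)

mae = cv_mae_linear(size_feature, count, k=5)
```

Swapping `cv_mae_linear`'s model for GPR, KNN or a neural network, and the single feature for the five feature categories, reproduces the shape of the comparison the paper carries out across the five datasets.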