954 results for 3D modelling
Abstract:
A better understanding of the behaviour of prepared cane and bagasse, especially the ability to model the mechanical behaviour of bagasse as it is squeezed in a milling unit to extract juice, would help identify how to improve the current milling process; for example, to reduce final bagasse moisture. Previous investigations have established that juice flow through bagasse obeys Darcy's permeability law, that the grip of the rough surface of the grooves on the bagasse can be represented by the Mohr-Coulomb failure criterion for soils, and that the internal mechanical behaviour of the bagasse can be represented by critical state behaviour similar to that of sand and clay. Current finite element models (FEM) available in commercial software have adequate permeability models; however, commercial software does not contain an adequate mechanical model for bagasse. Progress has been made in the last ten years towards implementing a mechanical model for bagasse in finite element software code. This paper builds on that progress and carries out a further step towards obtaining an adequate material model. In particular, it addresses the prediction of volume change during shearing of normally consolidated final bagasse.
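As a minimal illustration of the two constitutive ingredients named in this abstract, the sketch below evaluates Darcy's law for juice flow and a Mohr-Coulomb shear capacity check. All parameter values are illustrative assumptions, not data from the paper.

```python
import numpy as np

def darcy_flux(k, mu, dp, length):
    """Superficial velocity q = (k/mu) * dp/L for a pressure drop dp over length L (Darcy's law)."""
    return (k / mu) * (dp / length)

def mohr_coulomb_shear_capacity(sigma_n, cohesion, phi_deg):
    """Shear strength tau = c + sigma_n * tan(phi) (Mohr-Coulomb failure criterion)."""
    return cohesion + sigma_n * np.tan(np.radians(phi_deg))

# Illustrative permeability, viscosity, pressures and friction angle.
q = darcy_flux(k=1e-12, mu=1e-3, dp=5e5, length=0.05)                    # m/s
tau = mohr_coulomb_shear_capacity(sigma_n=2e6, cohesion=10e3, phi_deg=35.0)
print(f"Darcy flux: {q:.3e} m/s, M-C shear capacity: {tau / 1e3:.1f} kPa")
```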
Abstract:
Modelling power system load is a challenge since the load level and composition vary with time. An accurate load model is important because there is a substantial component of load dynamics in the frequency range relevant to system stability. The composition of loads needs to be characterised because the time constants of composite loads affect the damping contributions of the loads to power system oscillations, and their effects vary with the time of day, depending on the mix of motor loads. This chapter has two main objectives: 1) to describe small-signal load modelling using on-line measurements; and 2) to present a new approach to developing models that reflect the load response to large disturbances. Small-signal load characterisation based on on-line measurements allows the composition of load to be predicted with improved accuracy compared with post-mortem or classical load models. Rather than a generic dynamic model for small-signal modelling of the load, an explicit induction motor is used so that the performance for larger disturbances can be more reliably inferred. The relation between power and frequency/voltage can be explicitly formulated and the contribution of induction motors extracted. One of the main features of this work is that the induction motor component can be associated with nominal powers or equivalent motors.
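The explicit-motor idea can be caricatured in a few lines: a composite load in which a static exponential part and an induction-motor fraction respond differently to voltage and frequency. The exponent, motor fraction, frequency sensitivity and the motor's simplified constant-power voltage behaviour below are our illustrative assumptions, not the chapter's fitted model.

```python
def composite_load_power(V, f, P0=1.0, V0=1.0, f0=50.0,
                         motor_frac=0.4, a_static=1.5, kpf=1.0):
    """Active power response P(V, f) of a composite load.

    Static part: P_s = P0 * (V/V0)**a_static (voltage-dependent).
    Motor part:  roughly constant power with respect to voltage,
                 with a per-unit frequency sensitivity kpf.
    """
    p_static = (1 - motor_frac) * P0 * (V / V0) ** a_static
    p_motor = motor_frac * P0 * (1 + kpf * (f - f0) / f0)
    return p_static + p_motor

# Response to a simultaneous voltage sag and frequency dip.
print(composite_load_power(V=0.95, f=49.8))
```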
Abstract:
From the user's point of view, handling information overload online is a major challenge, especially when the number of websites is growing rapidly due to growth in e-commerce and other related activities. Personalization based on user needs is the key to solving the problem of information overload. Personalization methods help in identifying relevant information that a user may like. User profiles and object profiles are the important elements of a personalization system. When creating user and object profiles, most of the existing methods adopt two-dimensional similarity methods based on vector or matrix models in order to find inter-user and inter-object similarity. Moreover, for recommending similar objects to users, personalization systems use users-users, items-items and users-items similarity measures. In most cases, similarity measures such as Euclidean, Manhattan, cosine and many others based on vector or matrix methods are used to find the similarities. Web logs are high-dimensional datasets, consisting of multiple users and multiple searches, with many attributes for each. Two-dimensional data analysis methods may often overlook latent relationships that exist between users and items. In contrast to other studies, this thesis utilises tensors, high-dimensional data models, to build user and object profiles and to find the inter-relationships between users-users and users-items. To create an improved personalized Web system, this thesis proposes to build three types of profiles: individual user, group user and object profiles, utilising decomposition factors of tensor data models. A hybrid recommendation approach utilising group profiles (forming the basis of a collaborative filtering method) and object profiles (forming the basis of a content-based method) in conjunction with individual user profiles (forming the basis of a model-based approach) is proposed for making effective recommendations. A tensor-based clustering method is proposed that utilises the outcomes of popular tensor decomposition techniques such as PARAFAC, Tucker and HOSVD to group similar instances. An individual user profile, showing the user's highest interest, is represented by the top dimension values extracted from the component matrix obtained after tensor decomposition. A group profile, showing similar users and their highest interest, is built by clustering similar users based on tensor-decomposed values. A group profile is represented by the top association rules (containing various unique object combinations) that are derived from the searches made by the users of the cluster. An object profile is created to represent similar objects, clustered on the basis of the similarity of their features. Depending on the category of a user (known, anonymous or frequent visitor to the website), any of the profiles or their combinations is used for making personalized recommendations. A ranking algorithm is also proposed that utilises the personalized information to order and rank the recommendations. The proposed methodology is evaluated on data collected from a real-life car website. Empirical analysis confirms the effectiveness of recommendations made by the proposed approach over other collaborative filtering and content-based recommendation approaches based on two-dimensional data analysis methods.
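As a hedged sketch of the tensor route (not the thesis's exact pipeline), the pure-NumPy HOSVD below extracts a user factor matrix from a toy users x items x sessions tensor and computes inter-user cosine similarity in the resulting latent space. The tensor is random stand-in data, and the ranks are arbitrary.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding of a tensor into a matrix."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd_factors(T, ranks):
    """Truncated HOSVD: leading left singular vectors of each unfolding."""
    return [np.linalg.svd(unfold(T, n), full_matrices=False)[0][:, :r]
            for n, r in enumerate(ranks)]

# Toy users x items x sessions tensor standing in for web-log counts.
rng = np.random.default_rng(0)
T = rng.poisson(1.0, size=(20, 15, 5)).astype(float)

U_users, U_items, U_sessions = hosvd_factors(T, ranks=(4, 4, 2))

# Inter-user similarity: cosine between rows of the user factor matrix.
X = U_users / np.linalg.norm(U_users, axis=1, keepdims=True)
user_sim = X @ X.T
print(user_sim.shape)  # (20, 20)
```

Clustering the rows of `U_users` (e.g. with k-means) would then yield the group profiles the abstract describes; the same factor matrices serve the item side.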
Abstract:
Mixture models are a flexible tool for unsupervised clustering that have found popularity in a vast array of research areas. In studies of medicine, the use of mixtures holds the potential to greatly enhance our understanding of patient responses through the identification of clinically meaningful clusters that, given the complexity of many data sources, may otherwise be intangible. Furthermore, when developed in the Bayesian framework, mixture models provide a natural means for capturing and propagating uncertainty in different aspects of a clustering solution, arguably resulting in richer analyses of the population under study. This thesis aims to investigate the use of Bayesian mixture models in analysing varied and detailed sources of patient information collected in the study of complex disease. The first aim of this thesis is to showcase the flexibility of mixture models in modelling markedly different types of data. In particular, we examine three common variants on the mixture model, namely, finite mixtures, Dirichlet Process mixtures and hidden Markov models. Beyond the development and application of these models to different sources of data, this thesis also focuses on modelling different aspects relating to uncertainty in clustering. Examples of clustering uncertainty considered are uncertainty in a patient’s true cluster membership and accounting for uncertainty in the true number of clusters present. Finally, this thesis aims to address and propose solutions to the task of comparing clustering solutions, whether this be comparing patients or observations assigned to different subgroups or comparing clustering solutions over multiple datasets. To address these aims, we consider a case study in Parkinson’s disease (PD), a complex and commonly diagnosed neurodegenerative disorder. In particular, two commonly collected sources of patient information are considered. The first source of data concerns symptoms associated with PD, recorded using the Unified Parkinson’s Disease Rating Scale (UPDRS), and constitutes the first half of this thesis. The second half of this thesis is dedicated to the analysis of microelectrode recordings collected during Deep Brain Stimulation (DBS), a popular palliative treatment for advanced PD. Analysis of this second source of data centres on the problems of unsupervised detection and sorting of action potentials or "spikes" in recordings of multiple cell activity, providing valuable information on real time neural activity in the brain.
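A minimal sketch of one of the three variants, a Dirichlet Process mixture, using scikit-learn's truncated variational implementation on synthetic stand-in data. This is not the thesis's implementation; it only illustrates letting the data choose the number of active clusters and reading membership uncertainty from posterior responsibilities.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

# Synthetic two-cluster data standing in for patient measurements.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (100, 3)),
               rng.normal(4, 1, (80, 3))])

# Dirichlet-process-style mixture capped at 10 components; unneeded
# components receive negligible weight.
dpgmm = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_process",
    random_state=1,
).fit(X)

labels = dpgmm.predict(X)
resp = dpgmm.predict_proba(X)  # per-patient cluster membership uncertainty
print(np.unique(labels), resp.shape)
```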
Abstract:
Virtual environments can provide, through digital games and online social interfaces, extremely exciting forms of interactive entertainment. Because of their capability in displaying and manipulating information in natural and intuitive ways, such environments have found extensive applications in decision support, education and training in the health and science domains, amongst others. Currently, the burden of validating both the interactive functionality and the visual consistency of virtual environment content is entirely carried by developers and play-testers. While considerable research has been conducted in assisting the design of virtual world content and mechanics, to date, only limited contributions have been made regarding the automatic testing of the underpinning graphics software and hardware. The aim of this thesis is to determine whether the correctness of the images generated by a virtual environment can be quantitatively defined, and automatically measured, in order to facilitate the validation of the content. In an attempt to provide an environment-independent definition of visual consistency, a number of classification approaches were developed. First, a novel model-based object description was proposed in order to enable reasoning about the colour and geometry change of virtual entities during a play-session. From such an analysis, two view-based connectionist approaches were developed to map from geometry and colour spaces to a single, environment-independent, geometric transformation space; we used such a mapping to predict the correct visualization of the scene. Finally, an appearance-based aliasing detector was developed to show how incorrectness, too, can be quantified for debugging purposes. Since computer games heavily rely on the use of highly complex and interactive virtual worlds, they provide an excellent test bed against which to develop, calibrate and validate our techniques. Experiments were conducted on a game engine and other virtual world prototypes to determine the applicability and effectiveness of our algorithms. The results show that quantifying visual correctness in virtual scenes is a feasible enterprise, and that effective automatic bug detection can be performed through the techniques we have developed. We expect these techniques to find application in large 3D games and virtual world studios that require a scalable solution to testing their virtual world software and digital content.
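The thesis's connectionist mapping is beyond a few lines, but the underlying idea, that visual correctness can be reduced to a measurable quantity, can be illustrated with a far simpler per-pixel mismatch metric. The function name, tolerance and toy frames below are our own illustrative choices, not the thesis's detector.

```python
import numpy as np

def visual_mismatch(frame, reference, tol=0.02):
    """Fraction of pixels whose normalised RGB error exceeds tol."""
    err = np.abs(frame.astype(float) - reference.astype(float)) / 255.0
    return float((err.max(axis=-1) > tol).mean())

# Toy 64x64 RGB frames; a real harness would capture engine screenshots.
ref = np.zeros((64, 64, 3), dtype=np.uint8)
test = ref.copy()
test[10:20, 10:20] = 255           # simulated rendering defect
print(visual_mismatch(test, ref))  # ~0.024 -> flagged for review
```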
Abstract:
This paper argues for a renewed focus on statistical reasoning in the beginning school years, with opportunities for children to engage in data modelling. Results are reported from the first year of a 3-year longitudinal study in which three classes of first-grade children (6-year-olds) and their teachers engaged in data modelling activities. The theme of Looking after our Environment, part of the children’s science curriculum, provided the task context. The goals for the two activities addressed here included engaging children in core components of data modelling, namely, selecting attributes, structuring and representing data, identifying variation in data, and making predictions from given data. Results include the various ways in which children represented and re-represented collected data, including attribute selection, and the metarepresentational competence they displayed in doing so. The “data lenses” through which the children dealt with informal inference (variation and prediction) are also reported.
Abstract:
Nursing training for an Intensive Care Unit (ICU) is a resource-intensive process. High demands are made on staff, students and physical resources. Interactive 3D computer simulations, known as virtual worlds, are increasingly being used to supplement training regimes in the health sciences, especially in areas such as complex hospital ward processes. Such worlds have been found to be very useful in maximising the utilisation of training resources. Our aim is to design and develop a novel virtual world application for teaching and training Intensive Care nurses in the approach and method for shift handover, providing an independent but rigorous approach to teaching these important skills. In this paper we present a virtual world simulator for students to practise key steps in handing over the 24/7 care requirements of intensive care patients during the first hour of a shift. We describe the modelling process used to provide a convincing interactive simulation of the handover steps involved. The virtual world provides a practice tool for students to test their analytical skills with scenarios previously provided by simple physical simulations and live on-the-job training. Additional educational benefits include facilitation of remote learning, high flexibility in study hours and the automatic recording of a reviewable log from the session. To the best of our knowledge, this is a novel and original application of virtual worlds to an ICU handover process. The major outcome of the work was a virtual world environment for training nurses in the shift handover process, designed and developed for use by postgraduate nurses in training.
Abstract:
Three-dimensional wagon train models have been developed for crashworthiness analysis using a multi-body dynamics approach. The contributions of train size (number of wagons) to the frontal crash forces can be identified through the simulations. The effects of crash energy management (CEM) design and crash speed on train crashworthiness are examined. The CEM design can significantly improve train crashworthiness and the consequent vehicle stability performance, reducing derailment risks.
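A hedged one-dimensional caricature of such a simulation, with point masses and linear couplers standing in for the paper's 3D multi-body wagon models. All masses, stiffnesses and speeds are illustrative.

```python
import numpy as np

n, m, k = 5, 50e3, 5e7            # wagons, mass [kg], coupler stiffness [N/m]
v0, dt, steps = 10.0, 1e-4, 5000  # impact speed [m/s], time step [s], steps
gap = 15.0                        # coupler natural spacing [m]
x = 1.0 + gap * np.arange(n)      # lead wagon 1 m from a rigid wall at x = 0
v = np.full(n, -v0)               # whole train moving toward the wall

peak_decel = 0.0
for _ in range(steps):
    f = np.zeros(n)
    df = k * (np.diff(x) - gap)   # coupler forces between adjacent wagons
    f[:-1] += df
    f[1:] -= df
    if x[0] < 0.0:                # rigid wall contact on the lead wagon
        f[0] -= k * x[0]
    a = f / m
    peak_decel = max(peak_decel, abs(a[0]))
    v += a * dt                   # symplectic Euler integration
    x += v * dt

print(f"peak lead-wagon deceleration: {peak_decel / 9.81:.1f} g")
```

A CEM design would replace the linear wall and coupler springs with crush elements of limited force, which is exactly the mechanism by which peak decelerations, and hence instability and derailment risk, are reduced.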
Abstract:
The growth of solid tumours beyond a critical size is dependent upon angiogenesis, the formation of new blood vessels from an existing vasculature. Tumours may remain dormant at microscopic sizes for some years before switching to a mode in which growth of a supportive vasculature is initiated. The new blood vessels supply nutrients, oxygen, and access to routes by which tumour cells may travel to other sites within the host (metastasize). In recent decades an abundance of biological research has focused on tumour-induced angiogenesis in the hope that treatments targeted at the vasculature may result in a stabilisation or regression of the disease: a tantalizing prospect. The complex and fascinating process of angiogenesis has also attracted the interest of researchers in the field of mathematical biology, a discipline that is, for mathematics, relatively new. The challenge in mathematical biology is to produce a model that captures the essential elements and critical dependencies of a biological system. Such a model may ultimately be used as a predictive tool. In this thesis we examine a number of aspects of tumour-induced angiogenesis, focusing on growth of the neovasculature external to the tumour. Firstly we present a one-dimensional continuum model of tumour-induced angiogenesis in which elements of the immune system or other tumour-cytotoxins are delivered via the newly formed vessels. This model, based on observations from experiments by Judah Folkman et al., is able to show regression of the tumour for some parameter regimes. The modelling highlights a number of interesting aspects of the process that may be characterised further in the laboratory. The next model we present examines the initiation positions of blood vessel sprouts on an existing vessel, in a two-dimensional domain. This model hypothesises that a simple feedback inhibition mechanism may be used to describe the spacing of these sprouts, with the inhibitor being produced by breakdown of the existing vessel's basement membrane. Finally, we have developed a stochastic model of blood vessel growth and anastomosis in three dimensions. The model has been implemented in C++, includes an OpenGL interface, and uses a novel algorithm for calculating proximity of the line segments representing a growing vessel. This choice of programming language and graphics interface allows for near-simultaneous calculation and visualisation of blood vessel networks using a contemporary personal computer. In addition, the visualised results may be transformed interactively, and drop-down menus facilitate changes in the parameter values. Visualisation of results is of vital importance in the communication of mathematical information to a wide audience, and we aim to incorporate this philosophy in the thesis. As biological research further uncovers the intriguing processes involved in tumour-induced angiogenesis, we conclude with a comment from mathematical biologist Jim Murray: "Mathematical biology is ... the most exciting modern application of mathematics."
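The thesis's novel proximity algorithm is not reproduced here; as a baseline illustration, the standard closest-distance test between two 3D line segments (following Ericson's treatment) is the computation an anastomosis check must perform between a growing tip segment and existing vessel segments.

```python
import numpy as np

def segment_distance(p1, q1, p2, q2, eps=1e-12):
    """Minimum distance between 3D segments p1-q1 and p2-q2."""
    d1, d2, r = q1 - p1, q2 - p2, p1 - p2
    a, e = d1 @ d1, d2 @ d2
    b, c, f = d1 @ d2, d1 @ r, d2 @ r
    denom = a * e - b * b                 # zero when segments are parallel
    s = np.clip((b * f - c * e) / denom, 0.0, 1.0) if denom > eps else 0.0
    t = (b * s + f) / e if e > eps else 0.0
    if t < 0.0:                           # clamp t, then recompute s
        t = 0.0
        s = np.clip(-c / a, 0.0, 1.0) if a > eps else 0.0
    elif t > 1.0:
        t = 1.0
        s = np.clip((b - c) / a, 0.0, 1.0) if a > eps else 0.0
    return float(np.linalg.norm((p1 + s * d1) - (p2 + t * d2)))

# Two skew segments whose closest points are 1.0 apart.
d = segment_distance(np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0]),
                     np.array([0.5, -1.0, 1.0]), np.array([0.5, 1.0, 1.0]))
print(d)  # 1.0
```

A vessel-network code would run this test for each candidate pair and declare an anastomosis when the distance falls below the sum of the vessel radii.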
Abstract:
LiteSteel beam (LSB) is a new cold-formed steel hollow flange channel beam produced using a patented manufacturing process involving simultaneous cold-forming and dual electric resistance welding. It has the beneficial characteristics of torsionally rigid closed rectangular flanges combined with economical fabrication processes from a single strip of high strength steel. Although LSB sections are commonly used as flexural members, no research has been undertaken on the shear behaviour of LSBs. Therefore, experimental and numerical studies were undertaken to investigate the shear behaviour and strength of LSBs. In this research, finite element models of LSBs were developed to investigate their nonlinear shear behaviour, including their buckling characteristics and ultimate shear strength. They were validated by comparing their results with available experimental results. The models provided full details of the shear buckling and strength characteristics of LSBs, and showed considerable improvements to web shear buckling in LSBs and associated post-buckling strength. This paper presents the details of the finite element models of LSBs and the results. Both finite element analyses and experimental results showed that the current design rules in cold-formed steel codes are very conservative for the shear design of LSBs. The ultimate shear capacities from the finite element analyses confirmed the accuracy of the proposed shear strength equations for LSBs based on the North American specification and DSM design equations. The developed finite element models were also used to investigate the reduction in shear capacity of LSBs when full-height web side plates were not used or when only one web side plate was used, and these results are also presented in this paper.
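For orientation, the elastic shear buckling stress that such FE buckling analyses are benchmarked against is the classical plate result sketched below. The section dimensions are illustrative, not a specific LSB size, and kv = 5.34 is the simply supported long-plate coefficient; the improved web shear buckling the paper reports corresponds to an effective kv above this value because of the restraint from the rigid hollow flanges.

```python
import numpy as np

def tau_cr(kv, E, nu, d1, tw):
    """Elastic shear buckling stress of a web plate:
    tau_cr = kv * pi^2 * E / (12 * (1 - nu^2) * (d1/tw)^2)."""
    return kv * np.pi**2 * E / (12 * (1 - nu**2) * (d1 / tw) ** 2)

# Illustrative web: 250 mm clear depth, 2 mm thickness, steel in MPa.
print(f"tau_cr = {tau_cr(kv=5.34, E=200e3, nu=0.3, d1=250.0, tw=2.0):.1f} MPa")
```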
Abstract:
Recently, an innovative composite panel system was developed in which a thin insulation layer is used externally between two plasterboards to improve the fire performance of light gauge cold-formed steel frame walls. In this research, finite-element thermal models of both the traditional light gauge cold-formed steel frame wall panels with cavity insulation and the new light gauge cold-formed steel frame composite wall panels were developed to simulate their thermal behaviour under standard and realistic fire conditions. Suitable apparent thermal properties of gypsum plasterboard, insulation materials and steel were proposed and used. The developed models were then validated by comparing their results with available fire test results. This article presents the details of the developed finite-element models of small-scale non-load-bearing light gauge cold-formed steel frame wall panels and the results of the thermal analysis. It has been shown that accurate finite-element models can be used to simulate the thermal behaviour of small-scale light gauge cold-formed steel frame walls with varying configurations of insulations and plasterboards. The numerical results show that the use of cavity insulation was detrimental to the fire rating of light gauge cold-formed steel frame walls, while the use of external insulation offered superior thermal protection. The effects of real fire conditions are also presented.
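As a much-simplified illustration of what such a thermal model computes, the sketch below runs an explicit one-dimensional finite-difference conduction model of a single homogeneous layer under the ISO 834 standard fire curve. The constant diffusivity, thickness and crude boundary treatment are illustrative assumptions; the paper's models use multiple layers with temperature-dependent apparent properties.

```python
import numpy as np

L, nx = 0.016, 33                # 16 mm layer, grid points
alpha = 3.0e-7                   # assumed thermal diffusivity [m^2/s]
dx = L / (nx - 1)
dt = 0.4 * dx**2 / alpha         # stable explicit time step
T = np.full(nx, 20.0)            # initial/ambient temperature [C]

for step in range(int(1800 / dt)):           # 30 min of exposure
    t = step * dt
    # ISO 834 standard fire curve on the exposed face (t in minutes)
    T[0] = 20.0 + 345.0 * np.log10(8.0 * t / 60.0 + 1.0)
    T[-1] = T[-2]                             # crude adiabatic back face
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])

print(f"unexposed-face temperature after 30 min: {T[-1]:.0f} C")
```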
Abstract:
The complex interaction of the bones of the foot has been explored in detail in recent years, which has led to the acknowledgement in the biomechanics community that the foot can no longer be considered as a single rigid segment. With the advance of motion analysis technology it has become possible to quantify the biomechanics of simplified units or segments that make up the foot. Advances in technology, coupled with falling hardware prices, have resulted in the uptake of more advanced tools for clinical gait analysis. The increased use of these techniques in clinical practice requires defined standards for modelling and reporting of foot and ankle kinematics. This systematic review aims to provide a critical appraisal of commonly used foot and ankle marker sets designed to assess kinematics, and thus provide a theoretical background for the development of modelling standards.
Abstract:
Using complex event rules to capture dependencies between business processes is an emerging trend in enterprise information systems. In previous work, we identified a set of requirements for event extensions to business process modeling languages. This paper introduces a graphical language for modeling composite events in business processes, namely BEMN, that fulfills all these requirements. These include event conjunction, disjunction and inhibition, as well as cardinality of events, whose graphical expression can be factored into flow-oriented process modeling and event rule modeling. Formal semantics for the language are provided.
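To make the three core constructs concrete, here is a hedged sketch of their evaluation over a simple event log. The class and method names are ours for illustration; this is not BEMN's formal semantics.

```python
from dataclasses import dataclass, field

@dataclass
class EventLog:
    occurred: set = field(default_factory=set)

    def conjunction(self, *events):        # all listed events have occurred
        return all(e in self.occurred for e in events)

    def disjunction(self, *events):        # at least one event has occurred
        return any(e in self.occurred for e in events)

    def inhibition(self, event, blocker):  # event occurred, inhibitor did not
        return event in self.occurred and blocker not in self.occurred

log = EventLog({"order_received", "stock_checked"})
print(log.conjunction("order_received", "stock_checked"))   # True
print(log.inhibition("order_received", "order_cancelled"))  # True
```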
Abstract:
Digital human modeling (DHM), as a convenient and cost-effective tool, is increasingly incorporated into product and workplace design. In product design, it is predominantly used for the development of driver-vehicle systems. Most digital human modeling software tools, such as JACK, RAMSIS and DELMIA HUMANBUILDER, provide functions to predict posture and positions for drivers with selected anthropometry according to SAE (Society of Automotive Engineers) Recommended Practices and other ergonomics guidelines. However, few studies have presented 2nd-row passenger postural information, and digital human modeling of these passenger postures cannot be performed directly using the existing driver posture prediction functions. In this paper, the significant studies related to occupant posture and modeling were reviewed and a framework of determinants of driver vs. 2nd-row occupant posture modeling was extracted. The determinants, which are regarded as input factors for posture modeling, include target population anthropometry, vehicle package geometry and seat design variables, as well as task definitions. The differences between the determinants of driver and 2nd-row occupant posture models are significant, as driver posture modeling is primarily based on the position of the foot on the accelerator pedal (accelerator actuation point AAP, accelerator heel point AHP) and of the hands on the steering wheel (steering wheel centre point A-Point). This paper aims to investigate those differences between driver and passenger posture, and to supplement the existing parametric model for occupant posture prediction. Guided by the framework, the associated input parameters of occupant digital human models of both the driver and the second-row occupant are identified. Beyond the existing occupant posture models, a driver posture model could, for example, be modified to predict second-row occupant posture by adjusting the associated input parameters introduced in this paper. This study combines results from a literature review and the theoretical modeling stage of a second-row passenger posture prediction model project.
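A hedged sketch of the parametric form such occupant-posture models typically take: a linear regression in which 2nd-row inputs (seat height, cushion angle) replace the driver's pedal and steering-wheel references. The function name and all coefficients below are placeholders for illustration, not fitted values from any study.

```python
def second_row_hip_x(stature_mm, seat_h30_mm, cushion_angle_deg,
                     b0=-50.0, b1=0.08, b2=-0.3, b3=2.0):
    """Illustrative linear model for the hip point fore-aft location
    relative to a seat reference point; coefficients are placeholders."""
    return b0 + b1 * stature_mm + b2 * seat_h30_mm + b3 * cushion_angle_deg

# Example: a 1750 mm occupant, 320 mm seat height, 14 deg cushion angle.
print(f"{second_row_hip_x(1750, 320, 14.0):.1f} mm")
```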
Abstract:
In this work, a biomechanical model is used to simulate the muscle forces necessary to maintain posture in a car seat under different support conditions.
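A minimal static sketch of the kind of computation involved: a single equivalent extensor muscle balancing the torso's gravitational moment for a given backrest angle. The segment mass, lever arms and single-muscle simplification are illustrative assumptions, not the paper's model.

```python
import numpy as np

def extensor_force(torso_mass=40.0, com_dist=0.25, backrest_deg=25.0,
                   muscle_arm=0.05, g=9.81):
    """Static equilibrium: F * muscle_arm = m * g * com_dist * sin(angle)."""
    moment = torso_mass * g * com_dist * np.sin(np.radians(backrest_deg))
    return moment / muscle_arm

print(f"equivalent extensor force: {extensor_force():.0f} N")
```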