177 results for Physical modelling

in Queensland University of Technology - ePrints Archive


Relevance: 100.00%

Abstract:

“What did you think you were doing?” was the question posed by the conference organizers to me, as the inventor and constructor of the first working Tangible Interfaces over 40 years ago. I think the question was intended to encourage me to talk about the underlying ideas and intentionality rather than describe an endless sequence of electronic bricks, and that is what I shall do in this presentation. In the sixties the prevalent idea for a graphics interface was an analogue of sketching, which was somehow to be understood by the computer as three-dimensional form. I rebelled against this notion for reasons which I will explain in the presentation, and instead came up with tangible, physical, three-dimensional intelligent objects. I called these first prototypes “Intelligent Physical Modelling Systems”, which is a really dumb name for an obvious concept. I am eternally grateful to Hiroshi Ishii for coining the term “Tangible User Interfaces” - the same idea but with a much smarter name. Another motivator was user involvement in the design process, and that led to the Generator project (1979) with Cedric Price for the world’s first intelligent building, capable of organizing itself in response to the appetites of its users. The working model of that project is in MoMA. The same motivation led to a self-builders’ design kit (1980) for Walter Segal, which enabled self-builders to design their own houses. And indeed, as the organizers’ question implied, the motivation and intentionality of these projects developed over the years in step with advancing technology. I will attempt to articulate these changes with medical, psychological and educational examples, much of this later work stemming from the Media Lab, where we are speaking. Related topics such as “tangible thinking” and “intelligent teacups” will be introduced, and the presentation will end with some speculations about the future. The presentation will be given against a background of images of early prototypes, many of which have never previously been published.

Relevance: 70.00%

Abstract:

This design research concerns the generation of spaces that fully respond to people’s presence and activities, and spatialises the dynamics of a full-body massage. Researched through digital and physical modelling, the full-size physical form was constructed from ethylene vinyl acetate (EVA) foam, its three-dimensional shape defined by a computer-generated cutting pattern, and assembled into a non-linear articulated surface.

Relevance: 70.00%

Abstract:

This thesis is a comparative study of the modelling of mechanical behaviours of the F-actin cytoskeleton, an important structural component of living cells. A new granular model was developed for the F-actin cytoskeleton based on the concept of multiscale modelling. This framework overcomes the difficulties encountered in physical modelling of the cytoskeleton with conventional continuum mechanics, as well as the computational challenges of all-atom molecular dynamics simulation. The thermostat algorithm was further modified to better predict the thermodynamic properties of the F-actin cytoskeleton. This multiscale modelling framework was applied to explain the physical mechanisms of cytoskeleton responses to external mechanical loads.
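
The abstract does not specify how the thermostat algorithm was modified, so the sketch below only illustrates the unmodified starting point in a coarse-grained setting: a bead-spring filament advanced by velocity Verlet with Berendsen-style velocity rescaling toward a target temperature. The force law, parameter names and values are illustrative assumptions, not the thesis model.

import numpy as np

kB = 1.380649e-23  # Boltzmann constant (J/K)

def spring_forces(x, k, r0):
    """Harmonic nearest-neighbour forces on a chain of beads, x of shape (N, 3)."""
    bond = np.diff(x, axis=0)                         # bond vectors
    length = np.linalg.norm(bond, axis=1, keepdims=True)
    fb = k * (length - r0) * bond / length            # pulls stretched bonds back
    f = np.zeros_like(x)
    f[:-1] += fb
    f[1:] -= fb
    return f

def berendsen_rescale(v, m, T_target, dt, tau):
    """Berendsen-style velocity rescaling toward a target temperature."""
    T_inst = m * np.sum(v * v) / (v.size * kB)        # instantaneous temperature
    lam = np.sqrt(1.0 + (dt / tau) * (T_target / T_inst - 1.0))
    return lam * v

def step(x, v, m, k, r0, T_target, dt, tau):
    """One thermostatted velocity-Verlet step for the bead-spring filament."""
    v_half = v + 0.5 * dt * spring_forces(x, k, r0) / m
    x_new = x + dt * v_half
    v_new = v_half + 0.5 * dt * spring_forces(x_new, k, r0) / m
    return x_new, berendsen_rescale(v_new, m, T_target, dt, tau)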

Relevance: 60.00%

Abstract:

The geometry of ductile strain localization phenomena is related to the rheology of the deformed rocks. Both qualitative and quantitative rheological properties of natural rocks have been estimated from finite field structures such as folds and shear zones. We apply physical modelling to investigate the relationship between rheology and the temporal evolution of the width and transverse strain distribution in shear zones, both of which have been used previously as rheological proxies. Geologically relevant materials with well-characterized rheological properties (Newtonian, strain hardening, strain softening, Mohr-Coulomb) are deformed in a shear box and observed with particle image velocimetry (PIV). It is shown that the width and strain distribution histories in model shear zones display characteristic finite responses related to material properties, as predicted by previous studies. Application of the results to natural shear zones in the field is discussed. An investigation of the impact of 3D boundary conditions in the experiments demonstrates that quantitative methods for estimating rheology from finite natural structures must take these into account carefully.
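
The PIV post-processing chain is not detailed in the abstract; the snippet below is a minimal sketch, under assumed simple-shear kinematics, of how a transverse velocity profile from PIV could be reduced to the two proxies the study tracks: the shear-strain-rate distribution across the zone and a threshold-based zone width. The tanh velocity profile and the 10% threshold are illustrative choices, not the authors' method.

import numpy as np

def strain_rate_profile(y, vx):
    """Shear strain rate du/dy from boundary-parallel velocity vx
    sampled along the transverse coordinate y (simple shear assumed)."""
    return np.gradient(vx, y)

def zone_width(y, gamma_dot, frac=0.1):
    """Width of the region where the strain rate exceeds `frac`
    of its peak value (an arbitrary threshold criterion)."""
    mask = np.abs(gamma_dot) >= frac * np.abs(gamma_dot).max()
    return y[mask].max() - y[mask].min()

# Synthetic, localising (softening-style) velocity profile.
y = np.linspace(-0.05, 0.05, 201)      # transverse position (m)
vx = 1e-4 * np.tanh(y / 0.01)          # boundary-parallel velocity (m/s)
gd = strain_rate_profile(y, vx)
print(f"zone width ~ {zone_width(y, gd) * 1e3:.1f} mm")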

Relevance: 60.00%

Abstract:

Thunderstorm downbursts are important for wind engineers, as they have been shown to produce the design wind speeds for mid-to-high return periods in many regions of Australia [1]. In structural design codes (e.g. AS/NZS1170.02-02) an atmospheric boundary layer (ABL) is assumed, and a vertical profile is interpolated from recorded 10 m wind speeds. The ABL assumption is, however, inaccurate for the complex structure of a thunderstorm outflow and its effects on engineered structures. Several researchers have shown that the downburst, close to its point of divergence, is better represented by an impinging wall-jet profile than by the traditional ABL. Physical modelling is the generally accepted approach to estimating wind loads on structures, and it is therefore important to physically model the thunderstorm downburst so that its effects on engineered structures may be studied. The advance on simple impinging-jet theory addressed here is the addition of a pulsing mechanism to the jet, which allows not only the divergent characteristics of a downburst to be reproduced but also the associated leading ring vortex to develop. Modelling the ring vortex is considered very important for structural design, as it is within the horizontal vortex that the largest velocities occur [2]. This paper discusses the flow field produced by a pulsed wall jet, and the pressures that this type of flow induces on a scaled tall building.
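
To illustrate the contrast the paper draws, the sketch below evaluates a code-style log-law ABL profile anchored at a recorded 10 m wind speed against a purely schematic outflow shape with a low-level velocity maximum. The log law is the standard boundary-layer form; the outflow shape, roughness length and speeds are illustrative assumptions only, not a published downburst model.

import numpy as np

def abl_log_profile(z, u10, z0=0.02):
    """Log-law ABL profile anchored to a 10 m wind speed;
    z0 is an assumed roughness length (m)."""
    return u10 * np.log(z / z0) / np.log(10.0 / z0)

def schematic_outflow_profile(z, u_max, z_max=60.0):
    """Schematic downburst outflow: peaks at u_max at height z_max,
    then decays aloft. Illustrative shape only, NOT a published model."""
    s = z / z_max
    return u_max * s * np.exp(1.0 - s)

z = np.linspace(1.0, 300.0, 300)
u_abl = abl_log_profile(z, u10=30.0)               # increases monotonically
u_out = schematic_outflow_profile(z, u_max=45.0)   # near-ground maximum
print(f"schematic outflow peak at z = {z[np.argmax(u_out)]:.0f} m")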

Relevance: 30.00%

Abstract:

The design of a building is a complicated process: diverse components must be formulated through unique tasks, involving different personalities and organisations, in order to satisfy multi-faceted client requirements. To do this successfully, the project team must encapsulate an integrated design that accommodates various social, economic and legislative factors. In this era of increasing global competition, integrated design has therefore been increasingly recognised as a solution for delivering value to clients.

The ‘From 3D to nD modelling’ project at the University of Salford aims to support integrated design: to enable and equip the design and construction industry with a tool that allows users to create, share, contemplate and apply knowledge from multiple perspectives of user requirements (accessibility, maintainability, sustainability, acoustics, crime, energy simulation, scheduling, costing etc.). It thus takes the concept of 3-dimensional computer modelling of the built environment to an almost infinite number of dimensions, to cope with whole-life construction and asset management issues in the design of modern buildings. The project is funded by a four-year platform grant from the Engineering and Physical Sciences Research Council (EPSRC) in the UK, awarded to a multi-disciplinary research team to enable flexibility in the research strategy and to produce leading innovation. This paper reports on the development of a business process and IT vision for how integrated environments will allow nD-enabled construction and asset management to be undertaken. It further develops many of the key issues of a future vision arising from previous CIB W78 conferences.

Relevance: 30.00%

Abstract:

Background: Given escalating rates of chronic disease, broad-reach and cost-effective interventions to increase physical activity and improve dietary intake are needed. We examined the cost-effectiveness of a Telephone Counselling intervention to improve physical activity and diet, targeting adults with established chronic diseases in a low socio-economic area of a major Australian city. Methodology/Principal Findings: A cost-effectiveness modelling study was conducted using data collected between February 2005 and November 2007 from a cluster-randomised trial that compared Telephone Counselling with a “Usual Care” (brief intervention) alternative. Economic outcomes were assessed using a state-transition Markov model, which predicted the progress of participants through five health states relating to physical activity and dietary improvement for ten years after recruitment. The costs and health benefits of Telephone Counselling, Usual Care and an existing-practice (Real Control) group were compared. Telephone Counselling compared to Usual Care was not cost-effective ($78,489 per quality-adjusted life year gained). However, the Usual Care group did not represent existing practice and is not a useful comparator for decision making. Comparing Telephone Counselling outcomes to existing practice (Real Control), the intervention was found to be cost-effective ($29,375 per quality-adjusted life year gained). Usual Care (brief intervention) compared to existing practice (Real Control) was also cost-effective ($12,153 per quality-adjusted life year gained). Conclusions/Significance: This modelling study shows that a decision to adopt a Telephone Counselling program over existing practice (Real Control) is likely to be cost-effective. Choosing the “Usual Care” brief intervention over existing practice (Real Control) shows a lower cost per quality-adjusted life year, but the lack of supporting evidence for efficacy or sustainability is an important consideration for decision makers. The economics of behavioural approaches to improving health must be made explicit if decision makers are to be convinced that allocating resources toward such programs is worthwhile.
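
The abstract names the machinery (a five-state, ten-year state-transition Markov model yielding cost per quality-adjusted life year) but not its parameters. The sketch below is a minimal illustration of that machinery; every transition probability, utility weight and cost figure is hypothetical, not taken from the paper.

import numpy as np

def markov_qalys_costs(P, utilities, costs, start, years, disc=0.05):
    """Discrete-time Markov cohort model: discounted QALYs and costs
    per participant over the horizon, with annual cycles."""
    state = np.asarray(start, dtype=float)
    qalys = cost = 0.0
    for t in range(years):
        df = 1.0 / (1.0 + disc) ** t           # discount factor for year t
        qalys += df * state @ utilities
        cost  += df * state @ costs
        state = state @ P                      # advance the cohort one year
    return qalys, cost

# Hypothetical five-state model; rows of P must sum to 1.
P = np.array([[0.70, 0.20, 0.05, 0.03, 0.02],
              [0.10, 0.65, 0.15, 0.07, 0.03],
              [0.05, 0.15, 0.60, 0.15, 0.05],
              [0.02, 0.08, 0.20, 0.60, 0.10],
              [0.00, 0.00, 0.00, 0.00, 1.00]])
utilities = np.array([0.90, 0.80, 0.70, 0.55, 0.00])   # QALY weights
costs     = np.array([500., 800., 1200., 2000., 0.])   # annual care costs ($)
start     = np.array([1.0, 0.0, 0.0, 0.0, 0.0])        # cohort starts in best state
program   = np.array([300., 300., 300., 300., 0.])     # annual intervention cost

# A hypothetical intervention that modestly improves transitions.
P_int = P.copy()
P_int[0, :2] = [0.78, 0.12]    # more likely to stay in the best state

q0, c0 = markov_qalys_costs(P, utilities, costs, start, years=10)
q1, c1 = markov_qalys_costs(P_int, utilities, costs + program, start, years=10)
print(f"ICER = ${(c1 - c0) / (q1 - q0):,.0f} per QALY gained")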

Relevance: 30.00%

Abstract:

The central aim of the research undertaken in this PhD thesis is to develop a model for simulating water droplet movement on a leaf surface and to compare the model’s behaviour with experimental observations. A series of five papers has been presented to explain systematically the way in which this droplet modelling work has been realised. Knowing the path of a droplet on the leaf surface is important for understanding how a droplet of water, pesticide or nutrient will be absorbed through the leaf surface. An important aspect of the research is the generation of a leaf surface representation that acts as the foundation of the droplet model. Initially a laser scanner is used to capture the surface characteristics of two types of leaves in the form of a large scattered data set. After the identification of the leaf surface boundary, a set of internal points is chosen over which a triangulation of the surface is constructed. We present a novel hybrid approach for leaf surface fitting on this triangulation that combines Clough-Tocher (CT) and radial basis function (RBF) methods to achieve a surface with a continuously turning normal. The accuracy of the hybrid technique is assessed by numerical experimentation, and the hybrid CT-RBF method is shown to give good representations of Frangipani and Anthurium leaves. Such leaf models facilitate an understanding of plant development and permit the modelling of the interaction of plants with their environment. The motion of a droplet traversing this virtual leaf surface is affected by various forces, including gravity, friction and resistance between the surface and the droplet. The innovation of our model is the use of thin-film theory, in the context of droplet movement, to determine the thickness of the droplet as it moves on the surface. Experimental verification shows that the droplet model captures reality quite well and produces realistic droplet motion on the leaf surface. Most importantly, we observed that the simulated droplet motion follows the contours of the surface and spreads as a thin film. In the future, the model may be applied to determine the path of a droplet of pesticide along a leaf surface before it falls from, or comes to a standstill on, the surface. It will also be used to study the paths of many droplets of water or pesticide moving and colliding on the surface.
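
The hybrid Clough-Tocher/RBF scheme is the thesis's own contribution and is not reproduced here; as a rough sketch of the radial-basis-function half of such a fit, the snippet below smooths synthetic, scan-like scattered data with SciPy's thin-plate-spline interpolator. The synthetic patch and the smoothing value are placeholders for a real laser scan.

import numpy as np
from scipy.interpolate import RBFInterpolator

# Scattered (x, y) -> z samples standing in for a laser scan of a leaf.
rng = np.random.default_rng(0)
xy = rng.uniform(-1.0, 1.0, size=(500, 2))
z = 0.3 * xy[:, 0]**2 - 0.2 * xy[:, 1]**2 + 0.01 * rng.standard_normal(500)

# Thin-plate-spline RBF fit; `smoothing` trades fidelity against noise.
surf = RBFInterpolator(xy, z, kernel='thin_plate_spline', smoothing=1e-3)

# Evaluate on a grid, e.g. to render the virtual leaf or to supply
# heights (and, by differencing, normals) to a droplet-motion solver.
gx, gy = np.meshgrid(np.linspace(-1, 1, 50), np.linspace(-1, 1, 50))
z_fit = surf(np.column_stack([gx.ravel(), gy.ravel()])).reshape(gx.shape)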

Relevance: 30.00%

Abstract:

Background: In order to design appropriate environments for the performance and learning of movement skills, physical educators need a sound theoretical model of the learner and of the processes of learning. In physical education, this type of modelling informs the organization of learning environments and the effective and efficient use of practice time. An emerging theoretical framework in motor learning, relevant to physical education, advocates a constraints-led perspective on the acquisition of movement skills and game-play knowledge. This framework shows how physical educators could use task, performer and environmental constraints to channel the acquisition of movement skills and decision-making behaviours in learners. From this viewpoint, learners generate specific movement solutions to satisfy the unique combination of constraints imposed on them, a process which can be harnessed during physical education lessons. Purpose: In this paper the aim is to provide an overview of the motor learning approach emanating from the constraints-led perspective, and to examine how it can substantiate a platform for a new pedagogical framework in physical education: nonlinear pedagogy. We aim to demonstrate that it is only through theoretically valid and objective empirical work of an applied nature that a conceptually sound nonlinear pedagogy model can continue to evolve and support research in physical education. We present some important implications for designing practices in games lessons, showing how a constraints-led perspective on motor learning could assist physical educators in understanding how to structure learning experiences for learners at different stages, with a specific focus on the design of games teaching programmes in physical education, using exemplars from Rugby Union and Cricket. Findings: Research evidence from recent studies examining movement models demonstrates that physical education teachers need a strong understanding of sport performance so that task constraints can be manipulated while information-movement couplings are maintained in a learning environment representative of real performance situations. Physical educators should also understand that movement variability may not necessarily be detrimental to learning and could be an important phenomenon prior to the acquisition of a stable and functional movement pattern. We highlight how the nonlinear pedagogical approach is student-centred and empowers individuals to become active learners via a more hands-off approach to learning. Summary: A constraints-based perspective has the potential to provide physical educators with a framework for understanding how performer, task and environmental constraints shape each individual's physical education. Understanding the underlying neurobiological processes present in a constraints-led perspective on skill acquisition and game play can raise physical educators' awareness that teaching is a dynamic 'art' interwoven with the 'science' of motor learning theories.

Relevance: 30.00%

Abstract:

Physical infrastructure assets are important components of our society and our economy. They are usually designed to last for many years, are expected to be heavily used during their lifetime, carry considerable load, and are exposed to the natural environment. They are also normally major structures, and therefore represent a heavy investment, requiring constant management over their life cycle to ensure that they perform as required by their owners and users. Given a complex and varied infrastructure life cycle, constraints on available resources, and continuing requirements for effectiveness and efficiency, good management of infrastructure is important. While there is often no one best management approach, the choice of options is improved by better identification and analysis of the issues, by the ability to prioritise objectives, and by a scientific approach to the analysis process. The abilities to better understand the effect of inputs in the infrastructure life cycle on results, to minimise uncertainty, and to better evaluate the effect of decisions in a complex environment are important in allocating scarce resources and making sound decisions. Through the development of an infrastructure management modelling and analysis methodology, this thesis provides a process that assists the infrastructure manager in the analysis, prioritisation and decision-making process. This is achieved through the use of practical, relatively simple tools, integrated in a modular, flexible framework that aims to provide an understanding of the interactions and issues in the infrastructure management process. The methodology uses a combination of flowcharting and analysis techniques. It first charts the infrastructure management process and its underlying infrastructure life cycle through the time interaction diagram, a graphical flowcharting methodology that is an extension of methodologies for modelling data flows in information systems. This process divides the infrastructure management process over time into self-contained modules, each based on a particular set of activities, with the information flows between them defined by their interfaces and relationships. The modular approach also permits more detailed analysis, or aggregation, as the case may be. It also forms the basis of extending the infrastructure modelling and analysis process to infrastructure networks, using individual infrastructure assets and their related projects as the basis of the network analysis process. It is recognised that the infrastructure manager is required to meet, and balance, a number of different objectives, and therefore a number of high-level outcome goals for the infrastructure management process have been developed, based on common purpose or measurement scales. These goals form the basis of classifying the larger set of multiple objectives for analysis purposes. A two-stage approach that rationalises and then weights objectives, using a paired-comparison process, ensures that the objectives to be met are both kept to the minimum number required and fairly weighted. Qualitative variables are incorporated into the weighting and scoring process, with utility functions proposed where risk, or a trade-off situation, applies. Variability is considered important in the infrastructure life cycle; the approach used is based on analytical principles but incorporates randomness in variables where required.
The modular design of the process permits alternative processes to be used within particular modules, if this is considered a more appropriate way of analysis, provided boundary conditions and requirements for linkages to other modules are met. Development and use of the methodology has highlighted a number of infrastructure life cycle issues, including data and information aspects and the consequences of change over the life cycle, as well as variability and the other matters discussed above. It has also highlighted the need to use judgment where required, and for organisations that own and manage infrastructure to retain intellectual knowledge regarding that infrastructure. It is considered that the methodology discussed in this thesis, which to the author's knowledge has not been developed elsewhere, may be used for the analysis of alternatives, planning, prioritisation of a number of projects, and identification of the principal issues in the infrastructure life cycle.
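
The abstract describes a two-stage rationalise-then-weight step using paired comparisons but does not give the arithmetic. One common way to turn a reciprocal paired-comparison matrix into weights is the principal-eigenvector (AHP-style) method sketched below; this is offered as an assumption about the flavour of the calculation, not as the thesis's actual procedure, and the comparison values are hypothetical.

import numpy as np

def weights_from_paired_comparisons(A):
    """Objective weights from a reciprocal paired-comparison matrix
    via its principal eigenvector, normalised to sum to 1."""
    vals, vecs = np.linalg.eig(A)
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    return w / w.sum()

# Hypothetical comparisons of four outcome goals: A[i, j] > 1 means
# goal i was judged that many times more important than goal j.
A = np.array([[1.0,  3.0,  2.0,  5.0],
              [1/3., 1.0,  1/2., 2.0],
              [1/2., 2.0,  1.0,  3.0],
              [1/5., 1/2., 1/3., 1.0]])
print(weights_from_paired_comparisons(A))   # roughly [0.48, 0.16, 0.27, 0.09]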

Relevance: 30.00%

Abstract:

This thesis applies Monte Carlo techniques to the study of X-ray absorptiometric methods of bone mineral measurement. These studies seek to obtain information that can be used in efforts to improve the accuracy of bone mineral measurements. A Monte Carlo computer code for X-ray photon transport at diagnostic energies has been developed from first principles. This development was undertaken because there was no readily available code which included electron binding energy corrections for incoherent scattering, and one of the objectives of the project was to study the effects of including these corrections in Monte Carlo models. The code includes the main Monte Carlo program plus utilities for dealing with input data. A number of geometrical subroutines which can be used to construct complex geometries have also been written. The accuracy of the Monte Carlo code has been evaluated against the predictions of theory and the results of experiments. The results show a high correlation with theoretical predictions, and in comparisons of model results with those of direct experimental measurements, agreement to within the model and experimental variances is obtained. The code is an accurate and valid modelling tool. A study of the significance of including electron binding energy corrections for incoherent scatter in the Monte Carlo code has been made. The results show this significance to be very dependent upon the type of application; the most significant effect is a reduction of low-angle scatter flux for high atomic number scatterers. To effectively apply the Monte Carlo code to the study of bone mineral density measurement by photon absorptiometry, the results must be considered in the context of a theoretical framework for the extraction of energy-dependent information from planar X-ray beams. Such a theoretical framework is developed, and the two-dimensional nature of tissue decomposition based on attenuation measurements alone is explained. This theoretical framework forms the basis for analytical models of bone mineral measurement by dual energy X-ray photon absorptiometry techniques. Monte Carlo models of dual energy X-ray absorptiometry (DEXA) have been established. These models have been used to study the contribution of scattered radiation to the measurements. It has been demonstrated that the measurement geometry has a significant effect upon the scatter contribution to the detected signal; for the geometry of the models studied in this work, the scatter has no significant effect upon the results of the measurements. The model has also been used to study a proposed technique which involves dual energy X-ray transmission measurements plus a linear measurement of the distance along the ray path. This is designated the DPA(+) technique. The addition of the linear measurement enables the tissue decomposition to be extended to three components; bone mineral, fat and lean soft tissue are the components considered here. The results of the model demonstrate that the measurement of bone mineral using this technique is stable over a wide range of soft tissue compositions, and hence indicate its potential to overcome a major problem of the two-component DEXA technique. However, the results also show that the accuracy of the DPA(+) technique is highly dependent upon the composition of the non-mineral components of bone, and that it has poorer precision (approximately twice the coefficient of variation) than standard DEXA measurements. These factors may limit the usefulness of the technique.
These studies illustrate the value of Monte Carlo computer modelling of quantitative X-ray measurement techniques. The Monte Carlo models of bone densitometry measurement have:
1. demonstrated the significant effects of the measurement geometry upon the contribution of scattered radiation to the measurements;
2. demonstrated that the statistical precision of the proposed DPA(+) three-tissue-component technique is poorer than that of the standard DEXA two-tissue-component technique;
3. demonstrated that the proposed DPA(+) technique has difficulty providing accurate simultaneous measurement of body composition in terms of a three-component model of fat, lean soft tissue and bone mineral;
4. provided a knowledge base for input to decisions about development (or otherwise) of a physical prototype DPA(+) imaging system.
The Monte Carlo computer code, data, utilities and associated models represent a set of significant, accurate and valid modelling tools for quantitative studies of physical problems in the fields of diagnostic radiology and radiography.
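
The thesis's transport code models scattering with electron binding energy corrections, which is far beyond a few lines; the toy sketch below keeps only the bare principle behind attenuation-based measurements: sample exponential free paths, count photons whose first interaction lies beyond the slab, and cross-check against the analytic Beer-Lambert value. The attenuation coefficients and thickness are illustrative values, not measured data.

import numpy as np

rng = np.random.default_rng(42)

def transmitted_fraction(mu, thickness, n_photons=100_000):
    """Toy Monte Carlo: fraction of photons whose first interaction
    lies beyond the slab (no scatter tracking, unlike a real code)."""
    path = rng.exponential(1.0 / mu, size=n_photons)   # free path (cm)
    return np.mean(path > thickness)

mu_low, mu_high = 0.55, 0.25   # hypothetical coefficients (1/cm) at two energies
t = 5.0                        # slab thickness (cm)
for label, mu in [("low energy", mu_low), ("high energy", mu_high)]:
    mc = transmitted_fraction(mu, t)
    print(f"{label}: MC {mc:.4f} vs Beer-Lambert {np.exp(-mu * t):.4f}")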

Relevance: 30.00%

Abstract:

Continuum mechanics provides a mathematical framework for modelling the physical stresses experienced by a material. Recent studies show that physical stresses play an important role in a wide variety of biological processes, including dermal wound healing, soft tissue growth and morphogenesis. Thus, continuum mechanics is a useful mathematical tool for modelling a range of biological phenomena. Unfortunately, classical continuum mechanics is of limited use in biomechanical problems. As cells refashion the fibres that make up a soft tissue, they sometimes alter the tissue's fundamental mechanical structure. Advanced mathematical techniques are needed in order to accurately describe this sort of biological 'plasticity'. A number of such techniques have been proposed by previous researchers. However, models that incorporate biological plasticity tend to be very complicated. Furthermore, these models are often difficult to apply and/or interpret, making them of limited practical use. One alternative approach is to ignore biological plasticity and use classical continuum mechanics. For example, most mechanochemical models of dermal wound healing assume that the skin behaves as a linear viscoelastic solid. Our analysis indicates that this assumption leads to physically unrealistic results. In this thesis we present a novel and practical approach to modelling biological plasticity. Our principal aim is to combine the simplicity of classical linear models with the sophistication of plasticity theory. To achieve this, we perform a careful mathematical analysis of the concept of a 'zero stress state'. This leads us to a formal definition of strain that is appropriate for materials that undergo internal remodelling. Next, we consider the evolution of the zero stress state over time. We develop a novel theory of 'morphoelasticity' that can be used to describe how the zero stress state changes in response to growth and remodelling. Importantly, our work yields an intuitive and internally consistent way of modelling anisotropic growth. Furthermore, we are able to use our theory of morphoelasticity to develop evolution equations for elastic strain. We also present some applications of our theory. For example, we show that morphoelasticity can be used to obtain a constitutive law for a Maxwell viscoelastic fluid that is valid at large deformation gradients. Similarly, we analyse a morphoelastic model of the stress-dependent growth of a tumour spheroid. This work leads to the prediction that a tumour spheroid will always be in a state of radial compression and circumferential tension. Finally, we conclude by presenting a novel mechanochemical model of dermal wound healing that takes into account the plasticity of the healing skin.
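
The abstract does not reproduce the thesis's equations. For context only, one standard formalisation of a zero stress state in the wider morphoelasticity literature is the multiplicative decomposition of the deformation gradient,

\[
  \mathbf{F} \;=\; \mathbf{F}_e \, \mathbf{F}_g ,
\]

where \(\mathbf{F}_g\) carries the reference configuration to the evolving zero stress state (growth and remodelling) and \(\mathbf{F}_e\) is the elastic deformation measured from that state, so that stress depends on \(\mathbf{F}_e\) alone; the thesis's own strain definition and evolution equations may differ in detail.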

Relevance: 30.00%

Abstract:

This paper argues for a model of adaptive design for sustainable architecture within a framework of entropy evolution. The spectrum of sustainable architecture consists of the efficient use of energy and material resources over the life cycle of buildings, the active involvement of occupants in micro-climate control within buildings, and the natural environment as the physical context. The interactions amongst all these parameters compose a complex system of sustainable architectural design, for which conventional linear and fragmented design technologies are insufficient to indicate holistic and ongoing environmental performance. The latest interpretation of the Second Law of Thermodynamics states a microscopic formulation of the entropy evolution of complex open systems, which provides a design framework for optimisation in open systems: an adaptive system that evolves to optimise building environmental performance. The paper concludes that adaptive modelling in entropy evolution is a design alternative for sustainable architecture.

Relevance: 30.00%

Abstract:

Over recent years a significant amount of research has been undertaken to develop prognostic models that can be used to predict the remaining useful life of engineering assets. Implementations by industry have had only limited success. By design, models are subject to specific assumptions and approximations, some of which are mathematical, while others relate to practical implementation issues such as the amount of data required to validate and verify a proposed model. Therefore, appropriate model selection for successful practical implementation requires not only a mathematical understanding of each model type, but also an appreciation of how a particular business intends to utilise a model and its outputs. This paper discusses the business issues that need to be considered when selecting an appropriate modelling approach for trial. It also presents classification tables and process flow diagrams to assist industry and research personnel in selecting appropriate prognostic models for predicting the remaining useful life of engineering assets within their specific business environments. The paper then explores the strengths and weaknesses of the main prognostic model classes to establish what makes them better suited to certain applications than to others, and summarises how each has been applied to engineering prognostics. Consequently, this paper should provide a starting point for young researchers first considering options for remaining useful life prediction. The models described in this paper are Knowledge-based (expert and fuzzy), Life expectancy (stochastic and statistical), Artificial Neural Networks, and Physical models.