991 results for ANN modelling


Abstract:

In this computational study we investigate the role of turbulence in ideal axisymmetric vortex breakdown. A pipe geometry with a slight constriction near the inlet is used to stabilise the location of the breakdown within the computed domain. Eddy-viscosity and differential Reynolds stress models are used to model the turbulence. Changes in upstream turbulence levels and in the flow Reynolds and swirl numbers are considered. The computed solutions are monitored for indications of distinct breakdown flow configurations, and trends in vortex breakdown due to turbulent flow conditions are identified and discussed.
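As background, a common definition of the swirl number referred to above (the study may adopt a different but equivalent form) is the ratio of the axial flux of tangential momentum to the axial flux of axial momentum:

```latex
S \;=\; \frac{\displaystyle\int_0^R u_x\, u_\theta\, r^2 \, \mathrm{d}r}
             {R \displaystyle\int_0^R u_x^2\, r \, \mathrm{d}r}
```

where u_x and u_θ are the axial and tangential velocity components and R is the pipe radius; vortex breakdown is associated with sufficiently large S.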

Abstract:

This paper reviews the main developments in approaches to modelling urban public transit users’ route choice behaviour from the 1960s to the present. The approaches reviewed include the early heuristic studies on finding the least-cost transit route and all-or-nothing transit assignment, the bus common-lines problem and the corresponding network representation methods, the disaggregate discrete choice models based on random utility maximization assumptions, the deterministic user equilibrium and stochastic user equilibrium transit assignment models, and the recent dynamic transit assignment models using either frequency-based or schedule-based network formulations. In addition to reviewing past outcomes, the paper also gives an outlook on possible future directions for modelling transit users’ route choice behaviour. Based on a comparison with the development of models for motorists’ route choice and traffic assignment problems on urban road networks, the paper points out that transit route choice research stands to gain by drawing inspiration from the intellectual outcomes of the road traffic field. In particular, in light of recent advances in modelling motorists’ complex road route choice behaviour, the paper advocates that the modelling of transit users’ route choice should further explore the complexities of the problem.
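As background to the random-utility-based discrete choice models mentioned above, the standard formulation assumes each transit route alternative i in choice set C has utility U_i = V_i + ε_i; with i.i.d. Gumbel errors this yields the multinomial logit probability (a generic statement, not a model proposed in the review):

```latex
U_i = V_i + \varepsilon_i, \qquad
P(i \mid C) = \frac{e^{V_i}}{\sum_{j \in C} e^{V_j}}
```

where V_i is the systematic utility of alternative i (for example a function of in-vehicle time, waiting time and fare) and C is the set of feasible transit routes.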

Abstract:

The fracture healing process is modulated by the mechanical environment created by imposed loads and motion between the bone fragments. Contact between the fragments obviously results in a significantly different stress and strain environment from that of a uniform fracture gap containing only soft tissue (e.g. haematoma). The assumption of the latter in existing computational models of the healing process will hence exaggerate the inter-fragmentary strain in many clinically relevant cases. To address this issue, we introduce the concept of a contact zone that represents a variable degree of contact between cortices through the relative proportions of bone and soft tissue present. This is introduced as an initial condition in a two-dimensional iterative finite element model of a healing tibial fracture, in which material properties are defined by the volume fractions of each tissue present. The algorithm governing the formation of cartilage and bone in the fracture callus uses fuzzy logic rules based on the strain energy density resulting from axial compression. The model predicts that increasing the degree of initial bone contact reduces the amount of callus formed (periosteal callus thickness of 3.1 mm without contact, down to 0.5 mm with 10% bone in the contact zone). This is consistent with the greater effective stiffness of the contact zone and hence a smaller inter-fragmentary strain. These results demonstrate that the contact zone strategy reasonably simulates the differences in the healing sequence resulting from the closeness of reduction.
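A minimal sketch of how element stiffness can be derived from tissue volume fractions in such a model; the linear mixing rule and the moduli below are illustrative assumptions, not the values or the exact rule used in the published model:

```python
# Illustrative only: volume-fraction-weighted (rule-of-mixtures) stiffness
# for a finite element containing a mix of tissues. Moduli are placeholder
# values in MPa, not those used in the published model.
TISSUE_MODULUS = {          # hypothetical Young's moduli [MPa]
    "cortical_bone": 15000.0,
    "woven_bone": 1000.0,
    "cartilage": 10.0,
    "soft_tissue": 1.0,     # granulation tissue / haematoma
}

def effective_modulus(volume_fractions: dict[str, float]) -> float:
    """Linear rule of mixtures over the tissue fractions in one element."""
    total = sum(volume_fractions.values())
    return sum(TISSUE_MODULUS[t] * f for t, f in volume_fractions.items()) / total

# A contact-zone element with 10% bone and 90% soft tissue is far stiffer
# than a pure soft-tissue gap, which lowers the inter-fragmentary strain.
print(effective_modulus({"cortical_bone": 0.10, "soft_tissue": 0.90}))
```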

Abstract:

Building Information Modelling (BIM) is an information technology (IT)-enabled approach to managing design data in the AEC/FM (Architecture, Engineering and Construction / Facilities Management) industry. BIM enables improved interdisciplinary collaboration across distributed teams, intelligent documentation and information retrieval, greater consistency in building data, better conflict detection and enhanced facilities management. Despite these apparent benefits, the adoption of BIM in practice has been slow. Workshops with industry focus groups were conducted to identify the needs, concerns and expectations of participants who had implemented BIM or were BIM “ready”. Factors inhibiting BIM adoption include lack of training, low business incentives, a perceived lack of rewards, technological concerns, industry fragmentation related to uneven ICT adoption practices, contractual matters and resistance to changing current work practices. Successful BIM usage depends on collective adoption of BIM across the different disciplines and on support by the client. The relationship of current work practices to future BIM scenarios was identified as an important strategic concern, as the participants believed that BIM cannot be used efficiently with traditional practices and methods. The key to successful implementation is to establish the extent to which current work practices must change; currently there is a perception that all work practices and processes must adapt and change for BIM to be used effectively. It is acknowledged that new roles and responsibilities are emerging and that different parties will lead BIM on different projects. A contingency-based approach to the implementation problem was taken, relying on the integration of a BIM project champion, procurement strategy, team capability analysis, commercial software availability and applicability, and phase decision-making and event analysis. Organizations need to understand: (a) their own work processes and requirements; (b) the range of BIM applications available in the market and their capabilities; (c) the potential benefits of different BIM applications and their roles in different phases of the project lifecycle; and (d) collective supply chain adoption capabilities. A framework is proposed to support organizations in selecting BIM usage strategies that meet their project requirements, and case studies are being conducted to develop the framework. The results of a preliminary design management case study are presented for contractor-led BIM specific to the design-and-construct procurement strategy.

Abstract:

Building Information Modelling (BIM) is an IT-enabled technology that allows storage, management, sharing, access, updating and use of all the data relevant to a project throughout the project life-cycle in the form of a data repository. BIM enables improved interdisciplinary collaboration across distributed teams, intelligent documentation and information retrieval, greater consistency in building data, better conflict detection and enhanced facilities management. While the technology itself may not be new, and similar approaches have been in use in other sectors such as the aircraft and automobile industries for well over a decade, the AEC/FM (Architecture, Engineering and Construction / Facilities Management) industry has yet to catch up in its ability to exploit the benefits of the IT revolution. Although the potential benefits of the technology in terms of knowledge sharing, project management, project co-ordination and collaboration are close to obvious, the adoption rate has been rather lethargic, in spite of some well-directed efforts and the availability of supporting commercial tools. Since the technology itself has been well tested over the years in other domains, the plausible causes must be rooted well beyond the explanation offered by the ‘bell curve’ of innovation adoption. This paper discusses the preliminary findings of an ongoing research project funded by the Cooperative Research Centre for Construction Innovation (CRC-CI), which aims to identify these gaps and to produce specifications and guidelines that enable greater adoption of the BIM approach in practice. A detailed literature review was conducted covering similar research reported in recent years, a desktop audit of existing commercial tools that support BIM was carried out to identify technological issues and concerns, and a workshop was organized with industry partners and various players in the AEC industry to analyse needs and expectations and to gather feedback on the possible deterrents and inhibitions surrounding BIM adoption.

Abstract:

This article presents one approach to addressing the important issue of interdisciplinarity in the primary school mathematics curriculum, namely, through realistic mathematical modelling problems. Such problems draw upon other disciplines for their contexts and data. The article initially considers the nature of modelling with complex systems and discusses how such experiences differ from existing problem-solving activities in the primary mathematics curriculum. Principles for designing interdisciplinary modelling problems are then addressed, with reference to two mathematical modelling problems: one based in the scientific domain and the other in the literary domain. Examples of the models children have created in solving these problems follow. A reflection on the differences in the diversity and sophistication of these models raises issues regarding the design of interdisciplinary modelling problems. The article concludes with suggested opportunities for generating multidisciplinary projects within the regular mathematics curriculum.

Abstract:

The equations governing saltwater intrusion in coastal aquifers are complex. Backward Euler time stepping approaches are often used to advance the solution to these equations in time, which typically requires that small time steps be taken in order to ensure that an accurate solution is obtained. We show that a method of lines approach incorporating variable order backward differentiation formulas can greatly improve the efficiency of the time stepping process.
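A generic illustration of the approach described above (not the authors' code): a one-dimensional diffusion-type equation is discretised in space, and the resulting ODE system is advanced with SciPy's variable-order BDF integrator, which adapts both order and step size rather than taking uniformly small backward Euler steps.

```python
# Method-of-lines sketch: discretise a 1D diffusion equation in space and
# integrate the resulting ODE system with a variable-order BDF method.
import numpy as np
from scipy.integrate import solve_ivp

nx = 101
x = np.linspace(0.0, 1.0, nx)
dx = x[1] - x[0]
D = 1.0e-2                          # hypothetical diffusion coefficient

def rhs(t, c):
    # Second-order central differences in space; Dirichlet ends held fixed.
    dcdt = np.zeros_like(c)
    dcdt[1:-1] = D * (c[2:] - 2.0 * c[1:-1] + c[:-2]) / dx**2
    return dcdt

c0 = np.where(x < 0.5, 1.0, 0.0)    # sharp initial front
sol = solve_ivp(rhs, (0.0, 5.0), c0, method="BDF", rtol=1e-6, atol=1e-8)
print(f"{sol.t.size} accepted time steps")
```

Because the integrator controls the local error, it typically accepts far fewer steps than a fixed-step backward Euler scheme of comparable accuracy, which is the efficiency gain the abstract refers to.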

Abstract:

This work presents an extended Joint Factor Analysis (JFA) model that includes explicit modelling of unwanted within-session variability. The goals of the proposed extended JFA model are to improve verification performance with short utterances by compensating for the effects of limited or imbalanced phonetic coverage, and to produce a flexible JFA model that is effective over a wide range of utterance lengths without adjusting model parameters, such as by retraining the session subspaces. Experimental results on the 2006 NIST SRE corpus demonstrate the flexibility of the proposed model: it provides competitive results over a wide range of utterance lengths without retraining and also yields modest improvements in a number of conditions over the current state of the art.
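For context, the baseline JFA model on which such extensions build decomposes a speaker- and session-dependent GMM mean supervector as follows (standard notation from the JFA literature, not taken verbatim from this paper):

```latex
\mathbf{M} \;=\; \mathbf{m} + \mathbf{V}\mathbf{y} + \mathbf{U}\mathbf{x} + \mathbf{D}\mathbf{z}
```

where m is the speaker- and session-independent mean supervector, V and D capture speaker variability (eigenvoices and a diagonal residual), U captures session variability (eigenchannels), and y, x and z are the latent factors estimated for each utterance. The extension described above adds explicit terms for within-session variability on top of this decomposition.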

Abstract:

Traditionally, conceptual modelling of business processes involves the use of visual grammars for the representation of, amongst other things, activities, choices and events. These grammars, while very useful for experts, are difficult for naive stakeholders to understand. Annotations of such process models have been developed to assist in understanding aspects of these grammars via map-based approaches, and further work has looked at forms of 3D conceptual models. However, no one has sought to embed the conceptual models into a fully featured 3D world, using spatial annotations to explicate the underlying model clearly. In this paper, we present an approach to conceptual process model visualisation that enhances a 3D virtual world with annotations representing process constructs, facilitating insight into the developed model. We then present a prototype implementation of a 3D Virtual BPMN Editor that embeds BPMN process models into a 3D world. We show how this gives extra support for tasks performed by the conceptual modeller, providing better process model communication to stakeholders.

Abstract:

The central aim of the research undertaken in this PhD thesis is the development of a model for simulating water droplet movement on a leaf surface and the comparison of the model's behaviour with experimental observations. A series of five papers has been presented to explain systematically the way in which this droplet modelling work has been realised. Knowing the path of a droplet on the leaf surface is important for understanding how a droplet of water, pesticide or nutrient will be absorbed through the leaf surface. An important aspect of the research is the generation of a leaf surface representation that acts as the foundation of the droplet model. Initially, a laser scanner is used to capture the surface characteristics of two types of leaves in the form of a large scattered data set. After the identification of the leaf surface boundary, a set of internal points is chosen over which a triangulation of the surface is constructed. We present a novel hybrid approach for leaf surface fitting on this triangulation that combines Clough-Tocher (CT) and radial basis function (RBF) methods to achieve a surface with a continuously turning normal. The accuracy of the hybrid technique is assessed using numerical experimentation, and the hybrid CT-RBF method is shown to give good representations of Frangipani and Anthurium leaves. Such leaf models facilitate an understanding of plant development and permit the modelling of the interaction of plants with their environment. The motion of a droplet traversing this virtual leaf surface is affected by various forces, including gravity, friction and resistance between the surface and the droplet. The innovation of our model is the use of thin-film theory in the context of droplet movement to determine the thickness of the droplet as it moves on the surface. Experimental verification shows that the droplet model captures reality quite well and produces realistic droplet motion on the leaf surface. Most importantly, we observed that the simulated droplet motion follows the contours of the surface and spreads as a thin film. In the future, the model may be applied to determine the path of a droplet of pesticide along a leaf surface before it falls from, or comes to a standstill on, the surface. It will also be used to study the paths of many droplets of water or pesticide moving and colliding on the surface.
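As a much-simplified flavour of scattered-data surface fitting with radial basis functions (the hybrid Clough-Tocher/RBF construction on a triangulated leaf described above is considerably more involved), one might fit a smooth height field to scan points as follows; the point cloud and kernel settings here are purely illustrative:

```python
# Illustrative RBF surface fit to scattered "scan" points, z = f(x, y).
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)
xy = rng.uniform(-1.0, 1.0, size=(500, 2))            # stand-in for scan points
z = 0.3 * np.sin(2.0 * xy[:, 0]) * np.cos(xy[:, 1])   # stand-in leaf heights

# Thin-plate-spline RBF with light smoothing to tolerate scanner noise.
surface = RBFInterpolator(xy, z, kernel="thin_plate_spline", smoothing=1e-6)

# Evaluate the fitted surface on a regular grid for inspection.
gx, gy = np.meshgrid(np.linspace(-1, 1, 50), np.linspace(-1, 1, 50))
grid = np.column_stack([gx.ravel(), gy.ravel()])
z_fit = surface(grid).reshape(gx.shape)
```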

Abstract:

Financial processes may possess long memory and their probability densities may display heavy tails. Many models have been developed to deal with this tail behaviour, which reflects the jumps in the sample paths. On the other hand, the presence of long memory, which contradicts the efficient market hypothesis, is still an issue for further debate. These difficulties present challenges for the problems of memory detection and of modelling the co-presence of long memory and heavy tails. This PhD project aims to respond to these challenges.

The first part aims to detect memory in a large number of financial time series on stock prices and exchange rates using their scaling properties. Since financial time series often exhibit stochastic trends, a common form of nonstationarity, strong trends in the data can lead to false detection of memory. We take advantage of a technique known as multifractal detrended fluctuation analysis (MF-DFA) that can systematically eliminate trends of different orders. This method is based on the identification of scaling of the q-th-order moments and is a generalisation of the standard detrended fluctuation analysis (DFA), which uses only the second moment, that is, q = 2. We also consider the rescaled range (R/S) analysis and the periodogram method to detect memory in financial time series and compare their results with those of the MF-DFA. An interesting finding is that short memory is detected for stock prices of the American Stock Exchange (AMEX), while long memory is found to be present in the time series of two exchange rates, namely the French franc and the Deutsche mark. Electricity price series of the five states of Australia are also found to possess long memory, and for these series heavy tails are also pronounced in their probability densities.

The second part of the thesis develops models to represent the short-memory and long-memory financial processes detected in Part I. These models take the form of continuous-time AR(∞)-type equations whose kernel is the Laplace transform of a finite Borel measure. By imposing appropriate conditions on this measure, short memory or long memory in the dynamics of the solution will result. A specific form of the models, which has a good MA(∞)-type representation, is presented for the short-memory case. Parameter estimation for this type of model is performed via least squares, and the models are applied to the stock prices in the AMEX, which were established in Part I to possess short memory. By selecting the kernel in the continuous-time AR(∞)-type equations to have the form of a Riemann-Liouville fractional derivative, we obtain a fractional stochastic differential equation driven by Brownian motion. This type of equation is used to represent financial processes with long memory, whose dynamics are described by the fractional derivative in the equation. These models are estimated via quasi-likelihood, namely via a continuous-time version of the Gauss-Whittle method. The models are applied to the exchange rates and the electricity prices of Part I with the aim of confirming their possible long-range dependence established by the MF-DFA.

The third part of the thesis provides an application of the results established in Parts I and II to characterise and classify financial markets. We pay attention to the New York Stock Exchange (NYSE), the American Stock Exchange (AMEX), the NASDAQ Stock Exchange (NASDAQ) and the Toronto Stock Exchange (TSX). The parameters from the MF-DFA and those of the short-memory AR(∞)-type models are employed in this classification. We propose the Fisher discriminant algorithm to find a classifier in the two- and three-dimensional spaces of data sets and then use cross-validation to verify discriminant accuracies. This classification is useful for understanding and predicting the behaviour of different processes within the same market.

The fourth part of the thesis investigates the heavy-tailed behaviour of financial processes which may also possess long memory. We consider fractional stochastic differential equations driven by stable noise to model financial processes such as electricity prices. The long memory of electricity prices is represented by a fractional derivative, while the stable noise input models their non-Gaussianity via the tails of their probability density. A method using the empirical densities and the MF-DFA is provided to estimate all the parameters of the model and to simulate sample paths of the equation. The method is then applied to analyse daily spot prices for five states of Australia. Comparisons with the results obtained from the R/S analysis, the periodogram method and the MF-DFA are provided. The results from the fractional SDEs agree with those from the MF-DFA, which are based on multifractal scaling, while those from the periodograms, which are based on the second moment, seem to underestimate the long-memory dynamics of the process. This highlights the need for, and usefulness of, fractal methods in modelling non-Gaussian financial processes with long memory.
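For reference, MF-DFA characterises the scaling of a series through its q-th-order fluctuation function; a standard statement of the relation (the thesis implementation may differ in detail) is:

```latex
F_q(s) \;=\; \left\{ \frac{1}{N_s} \sum_{\nu=1}^{N_s} \left[ F^2(\nu, s) \right]^{q/2} \right\}^{1/q} \;\sim\; s^{h(q)}
```

where F^2(ν, s) is the detrended variance of segment ν at scale s and N_s is the number of segments. For q = 2, h(2) reduces to the standard DFA exponent, with h(2) ≈ 0.5 indicating absence of long memory and h(2) > 0.5 indicating persistent long-range dependence.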

Abstract:

These National Guidelines and Case Studies for Digital Modelling are the outcomes from one of a number of Building Information Modelling (BIM)-related projects undertaken by the CRC for Construction Innovation. Since the CRC opened its doors in 2001, the industry has seen a rapid increase in interest in BIM and widening adoption. These guidelines and case studies are thus very timely, as the industry moves to model-based working and starts to share models in a new context called integrated practice. Governments, both federal and state, and in New Zealand, are starting to outline the role they might take so that, in contrast to the adoption of 2D CAD in the early 1990s, a national, industry-wide benefit results from this new paradigm of working. Section 1 of the guidelines gives an overview of BIM: how it affects our current mode of working, and what we need to do to move to fully collaborative, model-based facility development. The role of open standards such as IFC is described as a mechanism to support new processes and to make the extensive design and construction information available to asset operators and managers. Digital collaboration modes, types of models, levels of detail, object properties and model management complete this section. It will be relevant for owners, managers and project leaders as well as direct users of BIM. Section 2 provides recommendations and guides for key areas of model creation and development, and for the move to simulation and performance measurement. These are the more practical parts of the guidelines, developed for design professionals, BIM managers, technical staff and ‘in the field’ workers. The guidelines are supported by six case studies, including a summary of lessons learnt about implementing BIM in Australian building projects. A key aspect of these publications is the identification of a number of important industry actions: the need for BIM-compatible product information and a national context for classifying product data; the need for an industry agreement and setting process for process definition; and finally, the need to ensure a national standard for sharing data between all of the participants in the facility-development process.