42 results for "Constraint based modeling"

in Deakin Research Online - Australia


Relevance: 100.00%

Abstract:

This paper presents a method for converting unrestricted fiction text into a time-based graphical form. Key concepts extracted from the text are used to formulate constraints describing the interaction of entities in a scene. The solution of these constraints over their respective time intervals provides the trajectories for these entities in a graphical representation.

Three types of entity are extracted from fiction books to describe the scene, namely Avatars, Areas and Objects. We present a novel method for modeling the temporal aspect of a fiction story using multiple time-line representations after which the information extracted regarding entities and time-lines is used to formulate constraints. A constraint solving technique based on interval arithmetic is used to ensure that the behaviour of the entities satisfies the constraints over multiple universally quantified time intervals. This approach is demonstrated by finding solutions to multiple time-based constraints, and represents a new contribution to the field of Text-to-Scene conversion. An example of the automatically produced graphical output is provided in support of our constraint-based conversion scheme.
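As a hedged illustration of the constraint-solving step (not the authors' implementation, and with hypothetical names throughout): a universally quantified constraint such as "the avatar stays at least some distance from an object for all t in an interval" can be checked with simple interval arithmetic, by bounding the range of the trajectory expression over the whole interval rather than sampling individual times.

```python
# Illustrative sketch, assuming a linear trajectory expression a*t + b.
# All function names and values are hypothetical.

def interval_eval_linear(a, b, t_lo, t_hi):
    """Range of a*t + b over [t_lo, t_hi] (monotone in t)."""
    v0, v1 = a * t_lo + b, a * t_hi + b
    return (min(v0, v1), max(v0, v1))

def satisfies_over_interval(a, b, t_lo, t_hi, lower_bound):
    """True if a*t + b >= lower_bound for ALL t in [t_lo, t_hi]
    (a universally quantified constraint)."""
    lo, hi = interval_eval_linear(a, b, t_lo, t_hi)
    return lo >= lower_bound

# An avatar moving away from an object: distance(t) = 2*t + 1 on t in [0, 5].
ok = satisfies_over_interval(2.0, 1.0, 0.0, 5.0, lower_bound=1.0)
```

Because the interval evaluation bounds the expression over the entire time interval, a single check certifies the constraint for every quantified time point, which is the property nonlinear solvers based on interval arithmetic exploit.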

Relevance: 100.00%

Abstract:

This paper discusses some experimental results on the influence of grain refinement on the final mechanical properties of IF and microalloyed steels designed for auto-body components. It also presents some modeling approaches to understanding the dynamic behavior of fine-grained materials. The Zerilli–Armstrong (Z–A) and Khan–Huang–Liang (KHL) models for the studied steels were implemented into FEM code in order to simulate dynamic compression tests at different strain rates.
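For orientation, a minimal sketch of evaluating a flow stress model of the kind implemented in the FEM code above. One commonly quoted form of the Zerilli–Armstrong model for bcc metals is sigma = C0 + C1*exp(-C3*T + C4*T*ln(eps_dot)) + C5*eps^n; the constants below are placeholders, not fitted values from the paper.

```python
import math

def za_bcc_flow_stress(strain, strain_rate, temp_K,
                       C0=50.0, C1=1000.0, C3=0.00698, C4=0.000415,
                       C5=250.0, n=0.3):
    """Zerilli-Armstrong (bcc form) flow stress in MPa.
    All material constants here are hypothetical placeholders."""
    thermal = C1 * math.exp(-C3 * temp_K + C4 * temp_K * math.log(strain_rate))
    return C0 + thermal + C5 * strain ** n

# Same strain and temperature, quasi-static vs dynamic strain rate:
sigma_quasistatic = za_bcc_flow_stress(0.1, 1e-3, 300.0)
sigma_dynamic = za_bcc_flow_stress(0.1, 1e3, 300.0)
```

The thermally activated term grows with strain rate, so the model reproduces the strain-rate hardening probed by the dynamic compression tests.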

Relevance: 100.00%

Abstract:

This paper utilizes a methodological approach called Multi-Level Modeling (MLM) that addresses two major shortcomings in the two-step analytic process traditionally adopted in the pertinent literature for modeling corporate collapse, thereby enhancing procedural efficiency. The robustness of MLM vis-à-vis the traditional two-step procedure is ascertained using a data sample of Australian publicly listed companies, equally split between collapsed and non-collapsed, during the period 1989 to 2006. The results indicate that not only does MLM improve procedural efficiency, it does so while enhancing the robustness of signaling corporate collapse; in particular, MLM signals collapse with an overall 6.6% increase in accuracy.

Relevance: 100.00%

Abstract:

Background: Constraint-based modeling of reconstructed genome-scale metabolic networks has been successfully applied to several microorganisms. In constraint-based modeling, network-based pathways such as extreme pathways and elementary flux modes are defined in order to characterize all allowable phenotypes. However, as the scale of a metabolic network grows, the number of extreme pathways and elementary flux modes increases exponentially. Uniform random sampling solves this problem to some extent, allowing the contents of the available phenotype space to be studied. After uniform random sampling, correlated reaction sets can be identified from the dependencies between reactions derived from the sampled phenotypes. In this paper, we study the relationship between extreme pathways and correlated reaction sets.

Results: Correlated reaction sets are identified for the E. coli core, red blood cell and Saccharomyces cerevisiae metabolic networks respectively. All extreme pathways are enumerated for the former two metabolic networks. For the Saccharomyces cerevisiae metabolic network, because of its large scale, we obtain a set of extreme pathways by sampling the whole extreme pathway space. In most cases, an extreme pathway covers a correlated reaction set in an 'all or none' manner: either all reactions in a correlated reaction set are used by a given extreme pathway, or none are. In rare cases, besides the 'all or none' manner, a correlated reaction set may be fully covered by a combination of a few extreme pathways with related function, which may bring redundancy and flexibility that improve the survivability of a cell. In short, extreme pathways show a strongly complementary relationship in their usage of reactions within the same correlated reaction set.

Conclusion: Both extreme pathways and correlated reaction sets are derived from the topology information of metabolic networks. The strong relationship between correlated reaction sets and extreme pathways suggests a possible mechanism: as a controllable unit, an extreme pathway is regulated by its corresponding correlated reaction sets, and a correlated reaction set is further regulated by the organism's regulatory network.
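To make the sampling step concrete, here is a hedged toy sketch (not the authors' code, and with a fabricated three-reaction "network") of how correlated reaction sets can be read off sampled flux vectors: reactions whose sampled fluxes are essentially perfectly correlated are grouped into one set.

```python
import math
import random

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    vy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (vx * vy)

def correlated_sets(samples, names, tol=0.999):
    """Greedily group reactions whose sampled fluxes have |r| >= tol
    with a group's representative flux column."""
    groups = []
    for j, name in enumerate(names):
        col = [s[j] for s in samples]
        for g in groups:
            if abs(pearson(col, g["flux"])) >= tol:
                g["members"].append(name)
                break
        else:
            groups.append({"flux": col, "members": [name]})
    return [g["members"] for g in groups]

random.seed(0)
# Toy "sampled phenotypes": R1 and R2 always carry proportional flux
# (as if stoichiometrically coupled); R3 varies independently.
samples = []
for _ in range(200):
    v = random.random()
    samples.append([v, 2.0 * v, random.random()])
sets_found = correlated_sets(samples, ["R1", "R2", "R3"])
```

On these samples R1 and R2 fall into one correlated reaction set while R3 remains alone, mirroring how coupled reactions in a real network are detected from sample phenotypes.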

Relevance: 90.00%

Abstract:

The recognition of behavioural elements in finance has caused major shifts in the analytic framework pertaining to ratio-based modeling of corporate collapse. The modeling approach so far has been based on the classical rational theory in behavioural economics, which assumes that the financial ratios (i.e., the predictors of collapse) are static over time. The paper argues that, in the absence of rational economic theory, a static model is flawed, and that a suitable model instead is one that reflects the heuristic behavioural framework, which is what characterises behavioural attributes of company directors and in turn influences the accounting numbers used in calculating the financial ratios. This calls for a dynamic model: dynamic in the sense that it does not rely on a coherent assortment of financial ratios for signaling corporate collapse over multiple time periods. This paper provides empirical evidence, using a data set of Australian publicly listed companies, to demonstrate that a dynamic model consistently outperforms its static counterpart in signaling the event of collapse. On average, the overall predictive power of the dynamic model is 86.83% compared to an average overall predictive power of 69.35% for the static model.

Relevance: 90.00%

Abstract:

The Nagara tradition of temple building created a rich corpus of Latina (single-spired) temples spread across Northern India between the fifth and thirteenth centuries. Computing methods offer a distinct methodology for reconstructing the genesis and evolution of geometry in this tradition over time. This paper reports a hybrid technique, comprising three distinct computations for recovering and explaining the geometry of temples. The application of the technique enables scholars to bring together fragments of evidence, construe "best-fit" strategies and unearth implicit or hidden relationships. The advantage of this approach is that changes in assumptions and testing of geometric alternatives can be easily simulated from multiple sources of information, such as texts, sacred diagrams and individual temples.

Relevance: 90.00%

Abstract:

In this paper we examine the geometrically constrained optimization approach to localization with hybrid bearing (angle of arrival, AOA) and time difference of arrival (TDOA) sensors. In particular, we formulate a constraint on the measurement errors which is then used along with constraint-based optimization tools to estimate the maximum likelihood values of the errors given an appropriate cost function. We focus on deriving a localization algorithm for stationary target localization in so-called adverse localization geometries, where the relative positioning of the sensors and the target does not readily permit accurate or convergent localization using traditional approaches. We illustrate this point via simulation and compare our approach to a number of different techniques discussed in the literature.
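As a simplified stand-in for the constrained estimator above (bearing-only, no TDOA, hypothetical sensor positions): each AOA measurement defines a line through the sensor, and the target estimate is the least-squares intersection of those lines, obtained from the 2x2 normal equations.

```python
import math

def localize_aoa(sensors, bearings):
    """Bearing-only least-squares localization in 2D.
    Each bearing th from sensor (sx, sy) gives the linear constraint
    [-sin(th), cos(th)] . [x, y] = -sin(th)*sx + cos(th)*sy."""
    a11 = a12 = a22 = b1 = b2 = 0.0
    for (sx, sy), th in zip(sensors, bearings):
        r1, r2 = -math.sin(th), math.cos(th)
        rhs = r1 * sx + r2 * sy
        a11 += r1 * r1; a12 += r1 * r2; a22 += r2 * r2
        b1 += r1 * rhs; b2 += r2 * rhs
    det = a11 * a22 - a12 * a12          # solve the 2x2 normal equations
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

# Three hypothetical sensors observing a target at (3, 4) with exact bearings.
target = (3.0, 4.0)
sensors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
bearings = [math.atan2(target[1] - sy, target[0] - sx) for sx, sy in sensors]
est = localize_aoa(sensors, bearings)
```

With noisy bearings and near-collinear sensor-target geometry the normal equations become ill-conditioned, which is exactly the "adverse geometry" regime the constrained formulation in the paper is designed to handle.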

Relevance: 90.00%

Abstract:

Constraint based tools for architectural design exploration need to satisfy aesthetic and functional criteria as well as combine discrete and continuous modes of exploration. In this paper, we examine the possibilities for stochastic processes in design space exploration.

Specifically, we address the application of a stochastic wind motion model to the subdivision of an external building envelope into smaller discrete components. Instead of deterministic subdivision constraints, we introduce explicit uncertainty into the system of subdivision. To address these aims, we develop a model of stochastic wind motion; create a subdivision scheme that is governed by the wind model and explore a design space of a facade subdivision problem. A discrete version of the facade, composed of light strips and panels, based on the bamboo elements deformed by continuous wind motion, is developed. The results of the experiments are presented in the paper.
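A hedged toy sketch of the stochastic-subdivision idea (hypothetical parameters; the random perturbation below simply stands in for the wind-motion model): instead of splitting a facade strip deterministically at the midpoint, each split point is drawn from an interval around it, so repeated runs explore different discrete panelizations.

```python
import random

def subdivide(width, min_panel, rng, noise=0.2):
    """Recursively split a strip of `width` into panels, perturbing each
    split point around the midpoint by a uniform noise term."""
    if width <= 2 * min_panel:
        return [width]
    split = width * (0.5 + rng.uniform(-noise, noise))
    split = max(min_panel, min(width - min_panel, split))  # respect min size
    return subdivide(split, min_panel, rng) + subdivide(width - split, min_panel, rng)

rng = random.Random(7)                  # seeded for a reproducible design
panels = subdivide(12.0, 1.0, rng)      # a 12 m strip, 1 m minimum panel
total = sum(panels)
```

Each call with a different seed yields a different member of the design space, while the constraints (total width preserved, minimum panel size respected) always hold.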

Relevance: 90.00%

Abstract:

Data-based modeling of haptic interaction simulation is a growing trend in research. These techniques offer a quick alternative to parametric modeling of the simulation. So far, most data-based techniques have been applied to static simulations. This paper introduces how to use data-based models in dynamic simulations. This ensures realistic behavior and produces results that are very close to parametric modeling. The results show that a quick and accurate response can be achieved using the proposed methods.
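To illustrate the general idea (a sketch only, with fabricated sample values, not the paper's method): a data-based haptic model replaces a parametric law such as a spring model with interpolation over recorded measurements, here a table of (displacement, force) pairs.

```python
# Hypothetical recorded samples: tool displacement (mm) -> reaction force (N).
recorded = [(0.0, 0.0), (1.0, 0.8), (2.0, 2.1), (3.0, 4.5)]

def force_from_data(displacement, table):
    """Piecewise-linear interpolation over recorded samples,
    clamped at both ends of the table."""
    if displacement <= table[0][0]:
        return table[0][1]
    for (x0, y0), (x1, y1) in zip(table, table[1:]):
        if displacement <= x1:
            t = (displacement - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    return table[-1][1]

f = force_from_data(1.5, recorded)      # query between two recorded samples
```

The lookup runs in constant time per segment, which is why data-based models can meet the fast update rates haptic rendering requires.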

Relevance: 90.00%

Abstract:

Commonly, surface and solid haptic effects are defined in such a way that they can hardly be rendered together. We propose a method for defining mixed haptic effects including surface, solid, and force fields. These haptic effects can be applied to virtual scenes containing various objects, including polygon meshes, point clouds, impostors, layered textures, voxel models, and function-based shapes. Accordingly, we propose a way to identify the location of the haptic tool in such virtual scenes, as well as to consistently and seamlessly determine haptic effects when the haptic tool moves in scenes with objects having different sizes, locations, and mutual penetrations. To provide efficient and flexible rendering of haptic effects, we propose to concurrently use explicit, implicit and parametric functions, and algorithmic procedures.
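As a hedged, highly simplified sketch of function-based haptics (a unit sphere and a penalty force, not the paper's rendering pipeline): an implicit function both locates the haptic tool relative to the solid and supplies the reaction force, proportional to penetration depth along the outward surface normal.

```python
import math

def sphere_force(tool, center, radius, stiffness=100.0):
    """Penalty reaction force for a tool point against an implicit sphere.
    Zero outside the surface; inside, proportional to penetration depth
    along the outward normal. Stiffness value is a hypothetical placeholder."""
    dx = [t - c for t, c in zip(tool, center)]
    dist = math.sqrt(sum(d * d for d in dx))
    penetration = radius - dist
    if penetration <= 0.0 or dist == 0.0:
        return (0.0, 0.0, 0.0)
    n = [d / dist for d in dx]                      # outward surface normal
    return tuple(stiffness * penetration * ni for ni in n)

f_inside = sphere_force((0.9, 0.0, 0.0), (0.0, 0.0, 0.0), 1.0)
f_outside = sphere_force((2.0, 0.0, 0.0), (0.0, 0.0, 0.0), 1.0)
```

Because the same implicit function answers both "where is the tool?" and "what force applies?", mixing several such effects reduces to combining their functions, which is the flexibility the abstract argues for.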

Relevance: 80.00%

Abstract:

An artificial neural network (NN) offers an alternative to conventional physics- or chemistry-based modeling techniques for solving complex, ill-defined problems. Neural networks trained from historical data are able to handle nonlinear problems and to find the relationship between input data and output data when there is no obvious one between them. Neural networks have been successfully used in control, robotics, pattern recognition and forecasting. This paper presents an application of neural networks to estimating key factors, e.g. the heat loss factor, in the power station modeling process. In conventional power station modeling, factors such as heat loss are normally determined by experience or "rule of thumb". Obtaining an accurate estimate of these factors requires special experiments, which is a very time-consuming process. In this paper the neural network technique is used to assist this difficult conventional modeling process. Historical data from an operating brown coal power station in Victoria has been used to train the neural network model, and the outcomes of the trained NN model will be used to determine these factors in the conventional energy model of the power station, which is under development as part of an ongoing ARC Linkage project aimed at detailed modeling of the internal energy flows in the power station.
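A minimal, hedged sketch of the underlying technique (synthetic data and a made-up input/output relation; real training would use the plant's historical records): a one-hidden-layer network trained by stochastic gradient descent to map a measured input to a "loss factor" estimate.

```python
import math
import random

random.seed(1)
H = 4                                     # hidden units
w1 = [random.uniform(-1, 1) for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

def predict(x):
    hidden = [math.tanh(w1[i] * x + b1[i]) for i in range(H)]
    return sum(w2[i] * hidden[i] for i in range(H)) + b2

# Synthetic stand-in for "operating load -> heat loss factor".
data = [(x / 10.0, 0.5 * (x / 10.0) + 0.1) for x in range(11)]

def mse():
    return sum((predict(x) - y) ** 2 for x, y in data) / len(data)

loss_before = mse()
lr = 0.1
for _ in range(500):                      # plain SGD with backpropagation
    for x, y in data:
        hidden = [math.tanh(w1[i] * x + b1[i]) for i in range(H)]
        err = sum(w2[i] * hidden[i] for i in range(H)) + b2 - y
        for i in range(H):
            grad_h = err * w2[i] * (1.0 - hidden[i] ** 2)
            w2[i] -= lr * err * hidden[i]
            w1[i] -= lr * grad_h * x
            b1[i] -= lr * grad_h
        b2 -= lr * err
loss_after = mse()
```

After training, the fitted network plays the role described in the abstract: it supplies factor estimates that would otherwise require costly dedicated experiments.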

Relevance: 80.00%

Abstract:

This paper investigates problems associated with interpretations of corporate collapse, and argues for a unified legal, rather than financial, definition of the event. In the absence of a formal definition of the event of corporate collapse, the integrity of sample selection becomes questionable; moreover, comparisons between empirical studies become less useful, if not altogether futile, due to the lack of a common ground in the basic building block. Upon close examination of 84 studies on ratio-based modeling of corporate collapse between 1968 and 2004, this paper finds evidence in favor of a legal interpretation of the event of corporate collapse. Specifically, studies that adopted a legal definition are five times as many as those that opted for a financial explanation.

Relevance: 80.00%

Abstract:

Content-based indexing is fundamental to support and sustain the ongoing growth of broadcast sports video. The main challenge is to design extensible frameworks to detect and index highlight events. This paper presents: 1) a statistically driven event detection approach that utilizes a minimum amount of manual knowledge and is based on a universal scope-of-detection and audio-visual features; 2) a semi-schema-based indexing approach that combines the benefits of schema-based modeling, to ensure that the video indexes are valid at all times without manual checking, with schema-less modeling, to allow several passes of instantiation in which additional elements can be declared. To demonstrate the performance of the event detection, a large dataset of sports videos totalling around 15 hours, including soccer, basketball and Australian football, is used.
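A hedged toy sketch of statistically driven event detection (fabricated feature values, not the paper's detector): flag candidate highlight segments where an audio-visual feature, such as crowd-audio energy, deviates from its own statistics by more than k standard deviations, so no sport-specific manual knowledge is needed.

```python
import math

def detect_events(feature, k=2.0):
    """Return indices where the feature exceeds mean + k * std."""
    n = len(feature)
    mean = sum(feature) / n
    std = math.sqrt(sum((f - mean) ** 2 for f in feature) / n)
    return [i for i, f in enumerate(feature) if f > mean + k * std]

# Synthetic per-second audio energy with a crowd-noise spike at t = 7.
energy = [1.0, 1.1, 0.9, 1.0, 1.2, 0.8, 1.0, 9.0, 1.1, 1.0]
events = detect_events(energy)
```

Because the threshold adapts to each video's own statistics, the same detector transfers across soccer, basketball and Australian football without retuning, which is the extensibility argument the abstract makes.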

Relevance: 80.00%

Abstract:

A particle-based method for multiscale modeling of multiphase materials such as Dual Phase (DP) and Transformation Induced Plasticity (TRIP) steels has been developed. The multiscale Particle-In-Cell (PIC) method combines many of the advantages of FEM and mesh-free methods, and bridges the micro and macro scales through homogenization. Conventional mesh-based modeling methods fail to give reasonable and accurate predictions for materials with complex microstructures. In the multiscale PIC method, by contrast, Lagrangian particles moving in an Eulerian grid represent the material deformation at both the micro and macro scales. The uniaxial tension test of two-phase and three-phase materials was simulated and compared with FE-based simulations. The predictions using the multiscale PIC method showed that the accuracy of field variables could be improved by up to 7%. This can lead to more accurate forming and springback predictions for materials with important multiphase microstructural effects.
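A hedged one-dimensional sketch of the core PIC mechanism (toy values, far simpler than the multiscale method above): Lagrangian particles carry material state, here just mass, and scatter it to the nodes of an Eulerian grid with linear tent-function weights; field equations are then solved on the grid.

```python
def scatter_to_grid(particles, n_cells, dx):
    """Linear (tent-function) particle-to-grid mass deposition in 1D.
    Each particle splits its mass between its two neighboring grid nodes."""
    grid = [0.0] * (n_cells + 1)            # node-based grid values
    for x, mass in particles:
        i = int(x / dx)                     # left node index
        frac = x / dx - i                   # fractional position in the cell
        grid[i] += (1.0 - frac) * mass
        grid[i + 1] += frac * mass
    return grid

# Two hypothetical particles: (position, mass) on a 4-cell grid of spacing 1.
particles = [(0.25, 1.0), (1.5, 2.0)]
grid = scatter_to_grid(particles, n_cells=4, dx=1.0)
```

The linear weights conserve the total transported quantity exactly, which is what lets the particles represent deformation consistently at both scales while the grid carries the solution.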

Relevance: 80.00%

Abstract:

In the present paper the effect of grain refinement on the dynamic response of ultra fine-grained (UFG) structures for C–Mn and HSLA steels is investigated. A physically based flow stress model (Khan–Huang–Liang, KHL) was used to predict the mechanical response of steel structures over a wide range of strain rates and grain sizes. However, the comparison was restricted to bcc ferrite structures. In previous work [K. Muszka, P.D. Hodgson, J. Majta, A physical based modeling approach for the dynamic behavior of ultra fine-grained structures, J. Mater. Process. Technol. 177 (2006) 456–460] it was shown that the KHL model has better accuracy for structures with a higher level of refinement (below 1 μm) than other flow stress models (e.g. the Zerilli–Armstrong model). In the present paper, simulation results using the KHL model were compared with experiments. To provide a wide range of experimental data, complex thermomechanical processing was applied. The mechanical behavior of the steels was examined using quasi-static tension and dynamic compression tests. The application of different deformation histories produced complex microstructure evolution that was reflected in the level of ferrite refinement.