11 results for Large modeling projects

at University of Queensland eSpace - Australia


Relevance:

100.00%

Publisher:

Abstract:

As process management projects have increased in size due to globalised and company-wide initiatives, a corresponding growth in the size of process modeling projects can be observed. Despite advances in languages, tools and methodologies, several aspects of these projects have been largely ignored by the academic community. This paper makes a first contribution to a potential research agenda in this field by defining the characteristics of large-scale process modeling projects and proposing a framework of related issues. These issues are derived from a semi-structured interview and six focus groups conducted in Australia, Germany and the USA with enterprise and modeling software vendors and customers. The focus groups confirm the existence of unresolved problems in business process modeling projects. The outcomes provide a research agenda which directs researchers toward further studies in global process management, process model decomposition and the overall governance of process modeling projects. It is expected that this research agenda will provide guidance to researchers and practitioners by focusing on areas of high theoretical and practical relevance.

Relevance:

100.00%

Publisher:

Abstract:

Much research has been devoted over the years to investigating and advancing the techniques and tools used by analysts when they model. In contrast to what academics, software providers and their resellers promote as best practice, the aim of this research was to determine whether practitioners still embrace conceptual modeling seriously. In addition, what are the most popular techniques and tools used for conceptual modeling? What are the major purposes for which conceptual modeling is used? The study found that the six most frequently used modeling techniques and methods were ER diagramming, data flow diagramming, systems flowcharting, workflow modeling, UML, and structured charts. Modeling technique use was found to decrease significantly from smaller to medium-sized organizations, but then to increase significantly in larger organizations (proxying for large, complex projects); technique use was thus found to follow a U-shaped curve, contrary to some prior explanations. Additionally, an important contribution of this study was the identification of the factors that uniquely influence the decision of analysts to continue to use modeling, viz., communication (using diagrams) to/from stakeholders, internal knowledge (lack of) of techniques, user expectations management, understanding models' integration into the business, and tool/software deficiencies. The highest-ranked purposes for which modeling was undertaken were database design and management, business process documentation, business process improvement, and software development. (c) 2005 Elsevier B.V. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

The relative stability and magnitude of genetic and environmental effects underlying major dimensions of adolescent personality across time were investigated. The Junior Eysenck Personality Questionnaire was administered to over 540 twin pairs at ages 12, 14 and 16 years. Their personality scores were analyzed using genetic simplex modeling, which explicitly took into account the longitudinal nature of the data. With the exception of the Lie dimension, multivariate model fitting results revealed that familial aggregation was entirely explained by additive genetic effects. Results from simplex model fitting suggest that large proportions of the additive genetic variance observed at ages 14 and 16 years could be explained by genetic effects present at the age of 12 years. There was also evidence for smaller but significant genetic innovations at 14 and 16 years of age for male and female neuroticism, at 14 years for male extraversion, at 14 and 16 years for female psychoticism, and at 14 years for male psychoticism.
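The simplex structure referred to above can be illustrated with a small numeric sketch: additive genetic variance at each age is the transmitted part of the previous age's variance plus a new innovation. The path coefficients and innovation variances below are hypothetical, not the study's estimates.

```python
# Toy sketch of a genetic simplex (autoregressive) model: variance at each
# age is the transmitted share of the previous age's variance plus an
# innovation. All numbers are invented for illustration.

def simplex_variances(init_var, paths, innovations):
    """var(t) = paths[t]**2 * var(t-1) + innovations[t]; also track the
    share of variance traceable to the initial (age-12) genetic effects."""
    variances = [init_var]
    transmitted = [init_var]
    for b, z in zip(paths, innovations):
        variances.append(b ** 2 * variances[-1] + z)
        transmitted.append(b ** 2 * transmitted[-1])
    return variances, transmitted

# Ages 12 -> 14 -> 16 with strong transmission and small innovations.
var_t, trans_t = simplex_variances(1.0, paths=[0.9, 0.9],
                                   innovations=[0.2, 0.1])
share_16 = trans_t[2] / var_t[2]  # fraction of age-16 variance from age 12
```

With these made-up values, roughly 71% of the age-16 additive genetic variance traces back to age-12 effects, mirroring the qualitative pattern reported above.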

Relevance:

30.00%

Publisher:

Abstract:

A stress-wave force balance for measurement of thrust, lift, and pitching moment on a large scramjet model (40 kg in mass, 1.165 m in length) in a reflected shock tunnel has been designed, calibrated, and tested. Transient finite element analysis was used to model the performance of the balance. This modeling indicates that good decoupling of signals and low sensitivity of the balance to the distribution of the load can be achieved with a three-bar balance. The balance was constructed and calibrated by applying a series of point loads to the model. A good comparison between finite element analysis and experimental results was obtained, with finite element analysis aiding in the interpretation of some experimental results. Force measurements were made in a shock tunnel both with and without fuel injection, and measurements were compared with predictions using simple models of the scramjet and combustion. Results indicate that the balance is capable of resolving lift, thrust, and pitching moments with and without combustion. However, vibrations associated with tunnel operation interfered with the signals, indicating the importance of vibration isolation for accurate measurements.
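The principle behind a stress-wave force balance can be sketched as deconvolution: the measured strain signal is the convolution of the applied load with the balance's impulse response, so the load is recovered by inverting that convolution. The impulse response below is an invented decaying oscillation standing in for one obtained from finite element analysis or calibration.

```python
# Single-component force recovery by discrete deconvolution, the principle
# behind stress-wave force balances. The impulse response g is a made-up
# decaying oscillation, not one from the paper's FEA model.
import math

def convolve(g, u):
    """y[n] = sum_k g[k] * u[n-k] (strain from load history)."""
    y = [0.0] * len(u)
    for n in range(len(u)):
        for k in range(n + 1):
            y[n] += g[k] * u[n - k]
    return y

def deconvolve(g, y):
    """Recover input u from output y given impulse response g (g[0] != 0)."""
    u = []
    for n in range(len(y)):
        acc = y[n]
        for k in range(1, n + 1):
            acc -= g[k] * u[n - k]
        u.append(acc / g[0])
    return u

g = [math.exp(-0.1 * n) * math.cos(0.5 * n) for n in range(50)]
u_true = [1.0 if 5 <= n < 20 else 0.0 for n in range(50)]  # step load
y = convolve(g, u_true)      # simulated strain signal
u_rec = deconvolve(g, y)     # recovered load history
```

In this noiseless sketch the load is recovered exactly; with tunnel vibration added to y, the recursion amplifies noise, which is why vibration isolation matters for accurate measurements.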

Relevance:

30.00%

Publisher:

Abstract:

We address the practical issue of using thermal image data without adjustment or calibration for projects which do not require actual temperatures per se. Large-scale airborne scanning in the thermal band at 8.5–13 μm was carried out over a mangrove and salt marsh in subtropical eastern Australia. For open sites, the raw image values were strongly positively correlated with ground-level temperatures. For sites under mangrove canopy cover, image values indicated temperatures 2–4°C lower than those measured on the ground. The raw image was useful in identifying water bodies under canopy and has the potential for locating channel lines of deeper water. This could facilitate modification to increase flushing in the system, thereby reducing mosquito larval survival.
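The strong positive correlation between raw image values and ground temperatures at open sites suggests a simple linear calibration would suffice where actual temperatures are needed; a minimal ordinary-least-squares sketch with invented numbers:

```python
# Relating raw thermal-band image values to ground-level temperatures by
# ordinary least squares. The raw values and temperatures are invented for
# illustration, not the study's data.
def ols(xs, ys):
    """Return (slope, intercept) of the least-squares line y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

raw = [120, 135, 150, 165, 180]        # hypothetical raw image values
temp = [24.0, 26.0, 28.0, 30.0, 32.0]  # hypothetical ground temps, deg C
slope, intercept = ols(raw, temp)
```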

Relevance:

30.00%

Publisher:

Abstract:

Increasingly, large areas of native tropical forests are being transformed into a mosaic of human-dominated land uses with scattered mature remnants and secondary forests. In general, at the end of the land clearing process, the landscape will have two forest components: a stable component of surviving mature forests, and a dynamic component of secondary forests of different ages. As the proportion of mature forests continues to decline, secondary forests play an increasing role in the conservation and restoration of biodiversity. This paper aims to predict and explain spatial and temporal patterns in the age of remnant mature and secondary forests in lowland Colombian landscapes. We analyse the age distributions of forest fragments, using detailed temporal land cover data derived from aerial photographs. Ordinal logistic regression analysis was applied to model the spatial dynamics of mature and secondary forest patches. In particular, the effect of soil fertility, accessibility and auto-correlated neighbourhood terms on forest age and time of isolation of remnant patches was assessed. In heavily transformed landscapes, forests account for approximately 8% of the total landscape area, of which three quarters are secondary forests. Secondary forest growth adjacent to mature forest patches increases mean patch size and core area, and therefore plays an important ecological role in maintaining landscape structure. The regression models show that forest age is positively associated with the amount of neighbouring forest, and negatively associated with the amount of neighbouring secondary vegetation, so the older a forest is, the less secondary vegetation is adjacent to it. Accessibility and soil fertility also have a negative but variable influence on the age of forest remnants. The probability of future clearing, if current conditions hold, is higher for regenerated forests than for mature forests.
The challenge of biodiversity conservation and restoration in dynamic and spatially heterogeneous landscape mosaics composed of mature and secondary forests is discussed. (c) 2004 Elsevier B.V. All rights reserved.
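Ordinal logistic regression of the kind applied above assigns probabilities to ordered forest-age classes through a proportional-odds form; a minimal sketch in which the cutpoints and the linear predictor (standing in for the neighbourhood, soil-fertility and accessibility terms) are hypothetical:

```python
# Proportional-odds (ordinal logistic) class probabilities. Cutpoints and
# the linear predictor value are invented for illustration.
import math

def ordinal_probs(eta, cutpoints):
    """P(Y <= j) = logistic(c_j - eta); return per-class probabilities
    for the len(cutpoints) + 1 ordered classes."""
    logistic = lambda z: 1.0 / (1.0 + math.exp(-z))
    cum = [logistic(c - eta) for c in cutpoints] + [1.0]
    return [cum[0]] + [cum[j] - cum[j - 1] for j in range(1, len(cum))]

# eta: hypothetical linear predictor from neighbouring forest, soil
# fertility and accessibility; cutpoints separate four age classes.
probs = ordinal_probs(eta=0.5, cutpoints=[-1.0, 0.0, 1.5])
```

Increasing eta shifts probability mass toward the older classes, which is how a positive neighbouring-forest coefficient expresses "more surrounding forest, older patch".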

Relevance:

30.00%

Publisher:

Abstract:

Automatic signature verification is a well-established and active area of research with numerous applications such as bank check verification, ATM access, etc. This paper proposes a novel approach to the problem of automatic off-line signature verification and forgery detection. The proposed approach is based on fuzzy modeling that employs the Takagi-Sugeno (TS) model. Signature verification and forgery detection are carried out using angle features extracted using a box approach. Each feature corresponds to a fuzzy set. The features are fuzzified by an exponential membership function involved in the TS model, which is modified to include structural parameters. The structural parameters are devised to take account of possible variations due to handwriting styles and to reflect moods. The membership functions constitute weights in the TS model. The optimization of the output of the TS model with respect to the structural parameters yields the solution for the parameters. We have also derived two TS models by considering a rule for each input feature in the first formulation (multiple rules) and a single rule for all input features in the second formulation. In this work, we have found that the TS model with multiple rules is better than the TS model with a single rule for detecting three types of forgeries: random, skilled and unskilled, from a large database of sample signatures, in addition to verifying genuine signatures. We have also devised three approaches, viz., an innovative approach and two intuitive approaches, using the TS model with multiple rules for improved performance. (C) 2004 Pattern Recognition Society. Published by Elsevier Ltd. All rights reserved.
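The TS evaluation described above can be sketched as a membership-weighted average of rule consequents, one rule per feature (the "multiple rules" formulation). The exponential membership form and the values of the structural parameters s and t below are illustrative stand-ins for those optimized in the paper.

```python
# Sketch of a Takagi-Sugeno evaluation with exponential memberships.
# The membership form exp(-s*(x - c)**2 + t) and all numbers are invented
# illustrations of the idea, not the paper's fitted model.
import math

def ts_output(features, centers, s, t, consequents):
    """Membership-weighted average of rule consequents (one rule/feature)."""
    mus = [math.exp(-s * (x - c) ** 2 + t)
           for x, c in zip(features, centers)]
    return sum(m * y for m, y in zip(mus, consequents)) / sum(mus)

# Features matching their fuzzy-set centers: full membership everywhere.
score = ts_output([1.0, 2.0], centers=[1.0, 2.0], s=1.0, t=0.0,
                  consequents=[5.0, 5.0])
# A far-off second feature contributes almost nothing to the output.
skewed = ts_output([1.0, 2.0], centers=[1.0, 5.0], s=1.0, t=0.0,
                   consequents=[10.0, 0.0])
```

Verification then amounts to comparing such a score against a threshold learned from genuine samples; a forged signature whose angle features sit far from the fuzzy-set centers receives low memberships and an outlying score.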

Relevance:

30.00%

Publisher:

Abstract:

Much mineral processing equipment employs the basic principles of gravity concentration in a flowing fluid film a few millimetres thick in small open channels, where the particles are distributed along the flow height based on their physical properties and the fluid flow characteristics. Fluid flow behaviour and slurry transportation characteristics in open channels have been a research topic for many years in many engineering disciplines. However, the open channels used in the mineral processing industries are different in terms of the size of the channel and the flow velocity used. An understanding of water split behaviour is, therefore, essential in modeling flowing film concentrators. In this paper, an attempt has been made to model the water split behaviour in an inclined open rectangular channel, resembling the actual size and flow velocity used by the mineral processing industries, based on Prandtl's mixing-length approach. (c) 2006 Elsevier B.V. All rights reserved.
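Prandtl's mixing-length argument leads to a logarithmic velocity profile, from which a water split above and below a given flow height follows by integrating the discharge; a sketch in which the shear velocity and roughness height are illustrative only, not the paper's channel parameters:

```python
# Log-law velocity profile from Prandtl's mixing-length argument, and the
# fraction of discharge carried above a splitter height. All parameter
# values are invented for illustration.
import math

def log_law_velocity(y, u_star, y0, kappa=0.41):
    """u(y) = (u*/kappa) * ln(y/y0)."""
    return (u_star / kappa) * math.log(y / y0)

def fraction_above(h_split, h_total, u_star, y0, n=10000):
    """Fraction of discharge above h_split (midpoint-rule integration)."""
    def q(h_lo, h_hi):
        dy = (h_hi - h_lo) / n
        return sum(log_law_velocity(h_lo + (i + 0.5) * dy, u_star, y0) * dy
                   for i in range(n))
    return q(h_split, h_total) / q(y0, h_total)

# A 5 mm deep film split at 2 mm: the upper 60% of the depth carries more
# than 60% of the water because velocity increases with height.
split = fraction_above(h_split=0.002, h_total=0.005, u_star=0.05, y0=1e-5)
```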

Relevance:

30.00%

Publisher:

Abstract:

Brugada syndrome (BS) is a genetic disease identified by an abnormal electrocardiogram (ECG), mainly abnormal ECGs associated with right bundle branch block and ST-elevation in the right precordial leads. BS can lead to increased risk of sudden cardiac death. Experimental studies on human ventricular myocardium with BS have been limited due to difficulties in obtaining data. Thus, the use of computer simulation is an important alternative. Most previous BS simulations were based on animal heart cell models. However, due to species differences, the use of human heart cell models, especially a model with three-dimensional whole-heart anatomical structure, is needed. In this study, we developed a model of the human ventricular action potential (AP) based on refining the ten Tusscher et al (2004 Am. J. Physiol. Heart Circ. Physiol. 286 H1573-89) model to incorporate newly available experimental data on some major ionic currents of human ventricular myocytes. The modified channels include the L-type calcium current (I_CaL), fast sodium current (I_Na), transient outward potassium current (I_to), rapidly and slowly delayed rectifier potassium currents (I_Kr and I_Ks) and inward rectifier potassium current (I_K1). Transmural heterogeneity of APs for epicardial, endocardial and mid-myocardial (M) cells was simulated by varying the maximum conductance of I_Ks and I_to. The modified AP models were then used to simulate the effects of BS on cellular AP and body surface potentials using a three-dimensional dynamic heart-torso model. Our main findings are as follows. (1) BS has little effect on the AP of endocardial or mid-myocardial cells, but has a large impact on the AP of epicardial cells. (2) A likely region of BS with abnormal cell AP is near the right ventricular outflow tract, and the resulting ST-segment elevation is located in the median precordium area. These simulation results are consistent with experimental findings reported in the literature.
The model can reproduce a variety of electrophysiological behaviors and provides a good basis for understanding the genesis of abnormal ECG under the condition of BS disease.
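The transmural heterogeneity described above amounts to one AP model in which epicardial, mid-myocardial and endocardial cells differ only in the maximum conductances of I_Ks and I_to; a sketch with placeholder baseline values and scale factors, not the paper's fitted ones:

```python
# Sketch of cell-type-specific maximum conductances for transmural
# heterogeneity. Baseline values and scale factors are placeholders for
# illustration, not the refined model's parameters.
BASE = {"g_Ks": 0.245, "g_to": 0.294}  # placeholder baseline conductances

SCALE = {
    "epi":  {"g_Ks": 1.0,  "g_to": 1.0},
    "M":    {"g_Ks": 0.25, "g_to": 1.0},   # M cells: much smaller I_Ks
    "endo": {"g_Ks": 1.0,  "g_to": 0.15},  # endo cells: much smaller I_to
}

def conductances(cell_type):
    """Return the scaled conductance set for one transmural cell type."""
    return {k: BASE[k] * SCALE[cell_type][k] for k in BASE}

m_cell = conductances("M")
```

Running the same ionic model with each conductance set yields the distinct epicardial, M-cell and endocardial AP shapes whose mismatch under BS conditions produces the simulated ST-segment elevation.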

Relevance:

30.00%

Publisher:

Abstract:

The ability to grow microscopic spherical birefringent crystals of vaterite, a calcium carbonate mineral, has allowed the development of an optical microrheometer based on optical tweezers. However, since these crystals are birefringent, and worse, are expected to have non-uniform birefringence, computational modeling of the microrheometer is a highly challenging task. Modeling the microrheometer - and optical tweezers in general - typically requires large numbers of repeated calculations for the same trapped particle. This places strong demands on the efficiency of computational methods used. While our usual method of choice for computational modelling of optical tweezers - the T-matrix method - meets this requirement of efficiency, it is restricted to homogeneous isotropic particles. General methods that can model complex structures such as the vaterite particles, such as finite-difference time-domain (FDTD) or finite-difference frequency-domain (FDFD) methods, are inefficient. Therefore, we have developed a hybrid FDFD/T-matrix method that combines the generality of volume-discretisation methods such as FDFD with the efficiency of the T-matrix method. We have used this hybrid method to calculate optical forces and torques on model vaterite spheres in optical traps. We present and compare the results of computational modelling and experimental measurements.
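The efficiency the T-matrix method offers for repeated calculations comes from precomputing T once: the scattered-field coefficients for any new incident beam then follow from a single matrix-vector product, p = T a. A toy sketch (a real T-matrix is large and complex-valued, and here it is simply invented):

```python
# Why the T-matrix method suits repeated optical-tweezers calculations:
# T is computed once per particle; each new incident beam only needs a
# matrix-vector product. The 2x2 matrix below is a toy stand-in.
def apply_T(T, a):
    """p_i = sum_j T[i][j] * a[j]."""
    return [sum(T[i][j] * a[j] for j in range(len(a)))
            for i in range(len(T))]

T = [[0.5, 0.1], [0.0, 0.3]]   # toy T-matrix (particle property)
a = [1.0, 2.0]                 # incident-beam expansion coefficients
p = apply_T(T, a)              # scattered-field coefficients
```

The hybrid approach described above uses a volume-discretisation method (FDFD) only once, to construct T for the inhomogeneous vaterite particle; every subsequent force and torque evaluation then reuses T at matrix-vector cost.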

Relevance:

30.00%

Publisher:

Abstract:

The development of models in the Earth sciences, e.g. for earthquake prediction and for the simulation of mantle convection, is far from finalized. There is therefore a need for a modelling environment that allows scientists to implement and test new models in an easy but flexible way. Once verified, the models should be easy to apply within their scope, typically by setting input parameters through a GUI or web services. It should be possible to link certain parameters to external data sources, such as databases and other simulation codes. Moreover, as typically large-scale meshes have to be used to achieve appropriate resolutions, the computational efficiency of the underlying numerical methods is important. Conceptually, this leads to a software system with three major layers: the application layer, the mathematical layer, and the numerical algorithm layer. The latter is implemented as a C/C++ library to solve a basic, computationally intensive linear problem, such as a linear partial differential equation. The mathematical layer allows the model developer to define his model and to implement high-level solution algorithms (e.g. the Newton-Raphson scheme or the Crank-Nicolson scheme) or to choose these algorithms from an algorithm library. The kernels of the model are generic, typically linear, solvers provided through the numerical algorithm layer. Finally, to provide an easy-to-use application environment, a web interface is (semi-automatically) built to edit the XML input file for the modelling code. In the talk, we will discuss the advantages and disadvantages of this concept in more detail. We will also present the modelling environment escript, which is a prototype implementation of such a software system in Python (see www.python.org). Key components of escript are the Data class and the PDE class. 
Objects of the Data class allow generating, holding, accessing, and manipulating data, in such a way that the actual representation, best suited to the particular context, is transparent to the user. They are also the key to establishing connections with external data sources. PDE class objects describe (linear) partial differential equations to be solved by a numerical library. The current implementation of escript has been linked to the finite element code Finley to solve general linear partial differential equations. We will give a few simple examples which illustrate the usage of escript. Moreover, we show the usage of escript together with Finley for the modelling of interacting fault systems and for the simulation of mantle convection.
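The three-layer split described above can be mimicked in a toy sketch: a generic numerical-layer solver, a mathematical-layer "PDE class", and an application layer that only sets parameters. Class and function names are invented for the illustration; this is not escript's actual API.

```python
# Toy illustration of the three-layer design: numerical layer (a generic
# tridiagonal solver), mathematical layer (a 1D Poisson "PDE class"), and
# an application layer that just picks f and the resolution. Names are
# invented for the sketch, not escript/Finley's API.

def solve_tridiagonal(lower, diag, upper, rhs):
    """Numerical layer: Thomas algorithm for a tridiagonal system."""
    n = len(diag)
    d, r = list(diag), list(rhs)
    for i in range(1, n):
        w = lower[i - 1] / d[i - 1]
        d[i] -= w * upper[i - 1]
        r[i] -= w * r[i - 1]
    x = [0.0] * n
    x[-1] = r[-1] / d[-1]
    for i in range(n - 2, -1, -1):
        x[i] = (r[i] - upper[i] * x[i + 1]) / d[i]
    return x

class Poisson1D:
    """Mathematical layer: -u'' = f on (0, 1), u(0) = u(1) = 0,
    discretised by second-order finite differences."""
    def __init__(self, n):
        self.n = n
        self.h = 1.0 / (n + 1)

    def solve(self, f):
        h2 = self.h * self.h
        rhs = [f((i + 1) * self.h) * h2 for i in range(self.n)]
        return solve_tridiagonal([-1.0] * (self.n - 1), [2.0] * self.n,
                                 [-1.0] * (self.n - 1), rhs)

# Application layer: for f = 2 the exact solution is u(x) = x(1 - x),
# which the second-order scheme reproduces to rounding error.
u = Poisson1D(n=99).solve(lambda x: 2.0)
```

The point of the layering, here as in escript, is that the application layer never touches the solver: swapping the Thomas algorithm for a parallel sparse solver (or Finley) changes nothing above the mathematical layer.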