161 results for Modeling methods
in University of Queensland eSpace - Australia
Abstract:
Within the information systems field, the task of conceptual modeling involves building a representation of selected phenomena in some domain. High-quality conceptual-modeling work is important because it facilitates early detection and correction of system development errors. It also plays an increasingly important role in activities like business process reengineering and documentation of best-practice data and process models in enterprise resource planning systems. Yet little research has been undertaken on many aspects of conceptual modeling. In this paper, we propose a framework to motivate research that addresses the following fundamental question: How can we model the world to better facilitate our developing, implementing, using, and maintaining more valuable information systems? The framework comprises four elements: conceptual-modeling grammars, conceptual-modeling methods, conceptual-modeling scripts, and conceptual-modeling contexts. We provide examples of the types of research that have already been undertaken on each element and illustrate research opportunities that exist.
Abstract:
Current physiologically based pharmacokinetic (PBPK) models are inductive. We present an additional, different approach that is based on the synthetic rather than the inductive approach to modeling and simulation. It relies on object-oriented programming. A model of the referent system in its experimental context is synthesized by assembling objects that represent components such as molecules, cells, aspects of tissue architecture, catheters, etc. The single-pass perfused rat liver has been well described in evaluating hepatic drug pharmacokinetics (PK) and is the system on which we focus. In silico experiments begin with administration of objects representing actual compounds. Data are collected in a manner analogous to that in the referent PK experiments. The synthetic modeling method allows for recognition and representation of discrete-event and discrete-time processes, as well as heterogeneity in organization, function, and spatial effects. An application is developed for sucrose and antipyrine, administered separately and together. PBPK modeling has made extensive progress in characterizing abstracted PK properties, but this has also been its limitation. Now, other important questions and possible extensions emerge. How are these PK properties and the observed behaviors generated? The inherent heuristic limitations of traditional models have hindered getting meaningful, detailed answers to such questions. Synthetic models of the type described here are specifically intended to help answer such questions. Analogous to wet-lab experimental models, they retain their applicability even when broken apart into sub-components. Having and applying this new class of models along with traditional PK modeling methods is expected to increase the productivity of pharmaceutical research at all levels that make use of modeling and simulation.
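As a hedged illustration of the synthetic, object-oriented style this abstract describes (this is not the authors' framework; the class names, the segment-chain structure, and all parameter values are invented for the sketch), a single-pass liver can be assembled from component objects and then perfused with drug objects:

import random

class SinusoidSegment:
    """One spatial segment of a sinusoid; may extract a drug object in transit."""
    def __init__(self, extraction_prob):
        self.extraction_prob = extraction_prob

    def passes(self, rng):
        # True if a drug object survives transit through this segment.
        return rng.random() >= self.extraction_prob

class SinglePassLiver:
    """Hypothetical synthetic model: a perfused liver as a chain of segments."""
    def __init__(self, n_segments, extraction_prob, seed=0):
        self.segments = [SinusoidSegment(extraction_prob) for _ in range(n_segments)]
        self.rng = random.Random(seed)

    def perfuse(self, n_drug_objects):
        # Administer drug objects at the inlet; count those reaching the outlet.
        survivors = sum(
            1 for _ in range(n_drug_objects)
            if all(seg.passes(self.rng) for seg in self.segments)
        )
        return survivors / n_drug_objects  # outflow fraction

liver = SinglePassLiver(n_segments=10, extraction_prob=0.05)
print("Simulated availability:", liver.perfuse(100_000))

The design choice mirrors the wet-lab analogy in the abstract: a sub-component such as SinusoidSegment remains a meaningful experimental object on its own, which is what lets the model "retain its applicability even when broken apart".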
Abstract:
In a recent study, severe distortions in the proton images of an excised, fixed, human brain in an 11.1 Tesla/40 cm MR instrument were observed, and the effect was modeled on phantom images using a finite difference time domain (FDTD) model. In the present study, we extend these simulations to a complete human head, employing a hybrid FDTD and method of moments (MoM) approach, which provides a validated method for simulating biological samples in coil structures. The effect of fixative on the image distortions is explored. Importantly, temperature distributions within the head are also simulated using a bioheat method based on parameters derived from the electromagnetic simulations. The MoM/FDTD simulations confirm that the transverse magnetic field (B1) from a ReCav resonator exhibits good homogeneity in air but strong inhomogeneity when loaded with the head, with or without fixative. The fixative serves to increase the distortions, but they remain significant in the in vivo simulations. The simulated signal intensity (SI) distribution within the sample confirms that the distortions in the experimental images are caused by the complex interactions of the incident electromagnetic fields with tissue, which is heterogeneous in terms of conductivity and permittivity. The temperature distribution is likewise heterogeneous, raising concerns that hot spots generated in the sample may exceed acceptable levels in future in vivo studies. As human imaging at 11.1 T is some time away, simulations are important for predicting potential safety issues as well as evaluating practical concerns about image quality. Simulation of a whole human head at 11.1 T implies that the wave behavior presents significant engineering challenges for ultra-high-field (UHF) MRI. Novel strategies will have to be employed in imaging technique and resonator design for UHF MRI to achieve the theoretical signal-to-noise ratio (SNR) improvements it offers over lower field systems. (C) 2005 Wiley Periodicals, Inc.
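Bioheat simulations of this kind are commonly based on a Pennes-type formulation; the abstract does not state which formulation the authors used, so the following standard form is given only for orientation, with the specific absorption rate (SAR) computed from the electromagnetic simulation acting as the heating source term:

\rho c \frac{\partial T}{\partial t} = \nabla \cdot (k \nabla T) + \rho_b c_b w_b (T_b - T) + Q_m + \rho\,\mathrm{SAR}

where \rho, c, and k are the tissue density, specific heat, and thermal conductivity; \rho_b, c_b, and w_b are the blood density, specific heat, and perfusion rate; T_b is the arterial blood temperature; and Q_m is the metabolic heat production. Tissue heterogeneity enters through spatially varying k, w_b, and SAR, which is why the computed temperature distribution is heterogeneous.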
Abstract:
Cpfg is a program for simulating and visualizing plant development, based on the theory of L-systems. A special-purpose programming language, used to specify plant models, is an essential feature of cpfg. We review postulates of L-system theory that have influenced the design of this language. We then present the main constructs of this language, and evaluate it from a user's perspective.
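For readers new to L-systems, the parallel rewriting process that underlies simulators such as cpfg can be shown in a few lines. The sketch below uses Lindenmayer's textbook algae system, not cpfg's own modeling language:

# Minimal L-system rewriter: each derivation step replaces every symbol
# in parallel according to its production rule (symbols without a rule
# are copied unchanged).
def lsystem(axiom, rules, steps):
    s = axiom
    for _ in range(steps):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Lindenmayer's classic algae system: A -> AB, B -> A
rules = {"A": "AB", "B": "A"}
for n in range(5):
    print(n, lsystem("A", rules, n))

The parallel application of productions, visible in the single pass over the string per step, is one of the L-system postulates that shaped the design of the cpfg language.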
Abstract:
L-studio/cpfg is a plant modeling software system designed for Windows 95/98/NT platforms. Its key components are the L-system-based plant simulator cpfg and the modeling environment called L-studio. We overview version 1.0 of this system from the user's perspective.
Abstract:
Land related information about the Earth's surface is commonly found in two forms: (1) map information and (2) satellite image data. Satellite imagery provides a good visual picture of what is on the ground, but complex image processing is required to interpret features in an image scene. Increasingly, methods are being sought to integrate the knowledge embodied in map information into the interpretation task, or, alternatively, to bypass interpretation and perform biophysical modeling directly on derived data sources. A cartographic modeling language, as a generic map analysis package, is suggested as a means to integrate geographical knowledge and imagery in a process-oriented view of the Earth. Specialized cartographic models may be developed by users, which incorporate mapping information in performing land classification. In addition, a cartographic modeling language may be enhanced with operators suited to processing remotely sensed imagery. We demonstrate the usefulness of a cartographic modeling language for pre-processing satellite imagery, and define two new cartographic operators that evaluate image neighborhoods as post-processing operations to interpret thematic map values. The language and operators are demonstrated with an example image classification task.
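The two operators themselves are not specified in the abstract. As a generic illustration of the kind of neighborhood (focal) operation a cartographic modeling language provides, the sketch below applies a majority filter to a thematic raster; the function name and window size are illustrative only:

import numpy as np
from collections import Counter

def focal_majority(raster, size=3):
    """Replace each cell with the most common class in its size x size window."""
    pad = size // 2
    padded = np.pad(raster, pad, mode="edge")
    out = np.empty_like(raster)
    rows, cols = raster.shape
    for r in range(rows):
        for c in range(cols):
            window = padded[r:r + size, c:c + size].ravel()
            out[r, c] = Counter(window).most_common(1)[0][0]
    return out

classes = np.array([[1, 1, 2, 2],
                    [1, 3, 2, 2],
                    [1, 1, 2, 1],
                    [1, 1, 1, 1]])
print(focal_majority(classes))

Applied as a post-processing step, such an operator smooths isolated misclassified cells in a thematic map by consulting their neighborhoods, which is the role the abstract assigns to its new operators.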
Abstract:
Application of novel analytical and investigative methods such as fluorescence in situ hybridization, confocal laser scanning microscopy (CLSM), microelectrodes, and advanced numerical simulation has led to new insights into micro- and macroscopic processes in bioreactors. However, the question is still open whether these new findings and the subsequent gain of knowledge are of significant practical relevance and, if so, where and how. To find suitable answers, it is necessary for engineers to know what can be expected by applying these modern analytical tools. Similarly, scientists could benefit significantly from an intensive dialogue with engineers in order to find out about practical problems and conditions existing in wastewater treatment systems. In this paper, an attempt is made to help bridge the gap between science and engineering in biological wastewater treatment. We provide an overview of recently developed methods in microbiology and in mathematical modeling and numerical simulation. A questionnaire is presented which may help generate a platform from which further technical and scientific developments can be accomplished. Both the paper and the questionnaire are aimed at encouraging scientists and engineers to enter into an intensive, mutually beneficial dialogue. (C) 2002 Elsevier Science Ltd. All rights reserved.
Abstract:
Models for the occurrence of the vibrational instability during rolling known as third octave chatter are presented and discussed. An analysis of rolling mill chatter was performed to identify characteristics of the vibrations and to determine any dependency on the rolling schedule. In particular, a stability criterion for the critical rolling speed is used to predict the maximum rolling speed without chatter instability on schedules from a 5-stand tandem mill rolling thin steel product. The results correlate well with measurements of critical speed occurring on the mill using a vibration monitor. This research provides significant insights into the chatter phenomenon and has been used to investigate control methods for suppression of the instability.
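The abstract does not reproduce the stability criterion itself. Third octave chatter criteria are often framed as negative-damping conditions: the rolling process contributes an effective damping that falls with rolling speed, and the critical speed is where total damping crosses zero. The sketch below is only a hedged illustration of that framing, not the authors' criterion, and every parameter value is a placeholder:

# Hedged sketch: find the rolling speed at which total effective damping
# (structural damping minus an assumed speed-proportional process term)
# reaches zero. All values are hypothetical placeholders.
c_structural = 6.0e4   # N*s/m, stand structural damping (illustrative)
k_process = 2.4e3      # N*s^2/m^2, assumed destabilizing process term

def total_damping(speed_m_per_s):
    return c_structural - k_process * speed_m_per_s

v_critical = c_structural / k_process  # damping crosses zero here
print(f"Critical speed (illustrative): {v_critical:.1f} m/s")
print("Stable at 10 m/s:", total_damping(10.0) > 0)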
Abstract:
Numerical modeling of the eddy currents induced in the human body by the pulsed field gradients in MRI presents a difficult computational problem. It requires an efficient and accurate computational method for high spatial resolution analyses with a relatively low input frequency. In this article, a new technique is described which allows the finite difference time domain (FDTD) method to be efficiently applied over a very large frequency range, including low frequencies. This is not the case in conventional FDTD-based methods. A method of implementing streamline gradients in FDTD is presented, as well as comparative analyses which show that the correct source injection in the FDTD simulation plays a crucial role in obtaining accurate solutions. In particular, making use of the derivative of the input source waveform is shown to provide distinct benefits in accuracy over direct source injection. In the method, no alterations to the properties of either the source or the transmission media are required. The method is essentially frequency independent and the source injection method has been verified against examples with analytical solutions. Results are presented showing the spatial distribution of gradient-induced electric fields and eddy currents in a complete body model.
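To make the distinction between the two injection styles concrete, here is a generic 1D free-space FDTD loop with a soft source. It illustrates direct versus derivative injection only; it is not the authors' low-frequency gradient formulation, and the grid and waveform parameters are arbitrary:

import numpy as np

# Generic 1D free-space FDTD (Yee grid) soft-source demo.
mu0, eps0, c = 4e-7 * np.pi, 8.854e-12, 3e8
nz, nt, src = 400, 1000, 100
dz = 1e-3
dt = dz / (2 * c)                 # Courant number 0.5: stable
ez, hy = np.zeros(nz), np.zeros(nz - 1)

def pulse(n, n0=150.0, spread=40.0):
    return np.exp(-0.5 * ((n - n0) / spread) ** 2)

use_derivative = True             # flip to compare the two injection styles
for n in range(nt):
    hy += dt / (mu0 * dz) * (ez[1:] - ez[:-1])
    ez[1:-1] += dt / (eps0 * dz) * (hy[1:] - hy[:-1])
    if use_derivative:
        ez[src] += pulse(n) - pulse(n - 1)   # discrete d/dt of the waveform
    else:
        ez[src] += pulse(n)                  # direct injection

print("Peak field on the grid after propagation:", float(ez.max()))

The derivative is injected here as a discrete difference of the waveform; the accuracy benefit at low frequencies is the paper's result and is not demonstrated by this toy loop.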
Abstract:
A new modeling approach, multiple mapping conditioning (MMC), is introduced to treat mixing and reaction in turbulent flows. The model combines the advantages of the probability density function and the conditional moment closure methods and is based on a certain generalization of the mapping closure concept. An equivalent stochastic formulation of the MMC model is given. The validity of the closure hypothesis of the model is demonstrated by a comparison with direct numerical simulation results for the three-stream mixing problem. (C) 2003 American Institute of Physics.
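For readers unfamiliar with the mapping closure concept that MMC generalizes, the core idea is to represent a scalar as a monotone mapping of a Gaussian reference field. The following minimal numerical illustration (of the mapping idea only, not the MMC model) constructs such a mapping so that the mapped samples reproduce a chosen target distribution:

import numpy as np
from math import erf

# Represent a scalar Z as Z = X(eta), where eta is a Gaussian reference
# variable and X is chosen so that Z has a target PDF (here uniform on [0,1]).
rng = np.random.default_rng(0)
eta = rng.standard_normal(100_000)          # Gaussian reference samples

# For a uniform target, the mapping X is the Gaussian CDF itself, since
# Phi(eta) is uniformly distributed when eta is standard normal.
gaussian_cdf = np.vectorize(lambda t: 0.5 * (1.0 + erf(t / 2.0 ** 0.5)))
z = gaussian_cdf(eta)

print("mean ~ 0.5:", round(float(z.mean()), 3),
      " var ~ 1/12:", round(float(z.var()), 4))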
Abstract:
A review is given of the fundamental studies of gas-carbon reactions using electronic structure methods over the last several decades. The three types of electronic structure methods, namely semi-empirical, ab initio, and density functional theory methods, are briefly introduced first, followed by the studies on carbon reactions with hydrogen and oxygen-containing gases (non-catalysed and catalysed). The problems yet to be solved and possible promising directions are discussed. (C) 2002 Elsevier Science Ltd. All rights reserved.
Abstract:
The advantages of antennas that can resemble the shape of the body to which they are attached are obvious. However, electromagnetic modeling of such unusually shaped antennas can be difficult. In this paper, the commercially available software SolidWorks(TM) is used for accurately drawing complex shapes in conjunction with the electromagnetic software FEKO(TM) to model the EM behavior of conformal antennas. The application of SolidWorks and custom-written software allows all the required information that forms the analyzed structure to be automatically inserted into FEKO, and gives the user complete control over the antenna being modeled. This approach is illustrated by a number of simulation examples of single, wideband, multi-band planar and curved patch antennas.
Abstract:
Much research has been devoted over the years to investigating and advancing the techniques and tools used by analysts when they model. In contrast to what academics, software providers, and their resellers promote as current practice, the aim of this research was to determine whether practitioners still embrace conceptual modeling seriously, which techniques and tools are the most popular for conceptual modeling, and the major purposes for which conceptual modeling is used. The study found that the six most frequently used modeling techniques and methods were ER diagramming, data flow diagramming, systems flowcharting, workflow modeling, UML, and structured charts. Modeling technique use was found to decrease significantly from smaller to medium-sized organizations, but then to increase significantly in larger organizations (proxying for large, complex projects); that is, technique use follows a U-shaped curve, contrary to some prior explanations. Additionally, an important contribution of this study was the identification of the factors that uniquely influence the decision of analysts to continue to use modeling, namely: communication to and from stakeholders (using diagrams), (lack of) internal knowledge of techniques, user expectations management, understanding of the models' integration into the business, and tool/software deficiencies. The highest-ranked purposes for which modeling was undertaken were database design and management, business process documentation, business process improvement, and software development. (c) 2005 Elsevier B.V. All rights reserved.
Abstract:
Due to complex field/tissue interactions, high-field magnetic resonance (MR) images suffer significant image distortions that result in compromised diagnostic quality. A new method that attempts to remove these distortions is proposed in this paper, based on the use of transceive phased arrays. The proposed system uses, in the examples presented herein, a shielded four-element transceive phased array head coil and involves performing two separate scans of the same slice, each scan using a different excitation during transmission. By optimizing the amplitudes and phases for each scan, antipodal signal profiles can be obtained, and by combining the two images, the image distortion can be reduced several fold. A combined hybrid method of moments (MoM)/finite element method (FEM) and finite-difference time-domain (FDTD) technique is proposed and used to elucidate the concept of the new method and to accurately evaluate the electromagnetic field (EMF) in a human head model. In addition, the proposed method is used in conjunction with the generalized auto-calibrating partially parallel acquisitions (GRAPPA) reconstruction technique to enable rapid imaging of the two scans. Simulation results reported herein for 11-T (470-MHz) brain imaging applications show that the new method with GRAPPA reconstruction theoretically results in improved image quality and that the proposed combined hybrid MoM/FEM and FDTD technique is suitable for high-field magnetic resonance imaging (MRI) numerical analysis.
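The combination step can be pictured with a toy calculation: if the two transmit configurations yield complementary (antipodal) sensitivity profiles across the slice, a pixel-wise combination flattens the overall profile. The sketch below uses a root-sum-of-squares combination as one plausible choice; the abstract does not specify the combination the authors use, and the profiles are synthetic:

import numpy as np

# Toy 1D slice profiles: two excitations with complementary sensitivity dips.
x = np.linspace(0.0, 1.0, 256)
profile_a = 0.5 + 0.5 * np.sin(np.pi * x) ** 2   # bright centre, dark edges
profile_b = 0.5 + 0.5 * np.cos(np.pi * x) ** 2   # antipodal: bright edges

combined = np.sqrt(profile_a ** 2 + profile_b ** 2)  # pixel-wise combination

for name, p in [("A", profile_a), ("B", profile_b), ("combined", combined)]:
    ripple = (p.max() - p.min()) / p.mean()
    print(f"{name}: min={p.min():.2f} max={p.max():.2f} ripple={ripple:.2f}")

Running this shows the combined profile's ripple falling several fold relative to either single scan, which is the intuition behind the several-fold distortion reduction claimed above.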
Abstract:
The purpose of this paper was to evaluate the psychometric properties of a stage-specific self-efficacy scale for physical activity with classical test theory (CTT), confirmatory factor analysis (CFA) and item response modeling (IRM). Women who enrolled in the Women On The Move study completed a 20-item stage-specific self-efficacy scale developed for this study [n = 226, 51.1% African-American and 48.9% Hispanic women, mean age = 49.2 (±7.0) years, mean body mass index = 29.7 (±6.4)]. Three analyses were conducted: (i) a CTT item analysis, (ii) a CFA to validate the factor structure and (iii) an IRM analysis. The CTT item analysis and the CFA results showed that the scale had high internal consistency (ranging from 0.76 to 0.93) and a strong factor structure. Results also showed that the scale could be improved by modifying or eliminating some of the existing items without significantly altering the content of the scale. The IRM results also showed that the scale had few items that targeted high self-efficacy, and the stage-specific assumption underlying the scale was rejected. In addition, the IRM analyses found that the five-point response format functioned more like a four-point response format. Overall, employing multiple methods to assess the psychometric properties of the stage-specific self-efficacy scale demonstrated the complementary nature of these methods and highlighted the strengths and weaknesses of this scale.
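As a pointer for readers comparing these frameworks: CTT internal-consistency figures of the kind quoted (0.76 to 0.93) are typically Cronbach's alpha values, computed from the item covariance structure. A minimal sketch with made-up response data (the latent-trait simulation is purely illustrative, not the study's data):

import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) matrix of scale responses."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return k / (k - 1) * (1.0 - item_vars / total_var)

rng = np.random.default_rng(1)
trait = rng.normal(size=(200, 1))                         # latent trait
responses = trait + rng.normal(scale=0.8, size=(200, 5))  # 5 correlated items
print(f"alpha = {cronbach_alpha(responses):.2f}")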