440 results for Modeling methods
at Queensland University of Technology - ePrints Archive
Abstract:
Recent studies have begun to explore context-awareness as a driver in the design of adaptable business processes. The emerging challenge of identifying and considering contextual drivers in the environment of a business process is well understood; however, typical business process modeling methods do not yet consider this additional contextual information in their process designs. In this chapter, we describe our research towards innovative and advanced process modeling methods that include mechanisms to incorporate relevant contextual drivers, and their impacts on business processes, in process design models. We report on our ongoing work with an Australian insurance provider and describe the design science approach we employed to develop these innovative and useful artifacts as part of a context-aware method framework. We discuss the utility of these artifacts in an application to the claims handling process at the case organization.
Abstract:
This chapter discusses reference modelling languages for business systems analysis and design. In particular, it reports on reference models in the context of the design-for/by-reuse paradigm, explains how traditional modelling techniques fail to provide the conceptual expressiveness needed for easy model reuse by configuration or adaptation, and elaborates on the need for reference modelling languages to be configurable. We discuss requirements for, and the development of, reference modelling languages that reflect this need for configurability. As an example, we report on the development, definition and configuration of configurable event-driven process chains. We further outline how configurable reference modelling languages and the corresponding design principles can be used in future scenarios such as process mining and data modelling.
Abstract:
In practical terms, conceptual modeling is at the core of systems analysis and design. The plurality of available modeling methods has, however, been regarded as detrimental, and as a strong indication that a common view or theoretical grounding of modeling is wanting. Such a theoretical foundation must universally address all potential matters to be represented in a model, which suggests ontology as the point of departure for theory development. The Bunge–Wand–Weber (BWW) ontology has become a widely accepted modeling theory. Its application has also led to the recognition that, although suitable as a meta-model, the BWW ontology needs greater expressiveness in empirical domains. In this paper, a first step in this direction is made by revisiting Bunge's ontology and by proposing the integration of a "hierarchy of systems" into the BWW ontology to accommodate domain-specific conceptualizations.
Abstract:
This paper summarises the development of a machine-readable model series for explaining Gaudi's use of ruled surface geometry in the Sagrada Familia in Barcelona, Spain. The first part discusses the modeling methods underlying the columns of the cathedral and the techniques required to translate them into built structures. The second part discusses the design and development of a tangible machine-readable model to explain column-modeling methods interactively in educational contexts such as art exhibitions. It is designed to explain the principles underlying the column design by means of physical interaction without using mathematical terms or language.
Abstract:
Dengue fever is one of the world's most important vector-borne diseases. The transmission area of the disease continues to expand due to many factors, including urban sprawl, increased travel and global warming. Current preventative techniques are based primarily on controlling mosquito vectors, as other prophylactic measures, such as a tetravalent vaccine, are unlikely to be available in the foreseeable future. However, the continually increasing dengue incidence suggests that this strategy alone is not sufficient. Epidemiological models attempt to predict future outbreaks using information on the risk factors of the disease. Through a systematic literature review, this paper analyzes the different modeling methods and their outputs in terms of accurately predicting disease outbreaks. We found that many previous studies have not sufficiently accounted for the spatio-temporal features of the disease in the modeling process. With advances in technology, however, models that incorporate such information, together with socio-environmental factors, can serve as early warning systems, albeit limited geographically to a local scale.
Abstract:
Topic modeling has been widely utilized in fields such as information retrieval, text mining and text classification. Most existing statistical topic modeling methods, such as LDA and pLSA, represent a topic by selecting single words from the multinomial word distribution over that topic. This term-based representation has two main shortcomings: first, popular or common words occur frequently across different topics, which makes the resulting topics ambiguous and hard to interpret; second, single words lack the coherent semantic meaning needed to represent topics accurately. To overcome these problems, in this paper we propose a two-stage model that combines text mining and pattern mining with statistical modeling to generate more discriminative and semantically rich topic representations. Experiments show that the optimized topic representations generated by the proposed methods outperform the typical statistical topic modeling method LDA in terms of accuracy and certainty.
Abstract:
Advanced grid stiffened composite cylindrical shells are widely adopted in advanced structures due to their exceptional mechanical properties. Buckling is a main failure mode of advanced grid stiffened structures in engineering, and it calls for increasing attention. In this paper, the buckling response of an advanced grid stiffened structure is investigated by three different means: an equivalent stiffness model, a finite element model, and a hybrid model (H-model) that combines the two. A buckling experiment was carried out on an advanced grid stiffened structure to validate the efficiency of the different modeling methods. Based on this comparison, the characteristics of each method are evaluated independently. We argue that, by accounting for material defects, the finite element model is a suitable numerical tool for the buckling analysis of advanced grid stiffened structures.
Abstract:
Background: Understanding the progression of prostate cancer to androgen independence/castrate resistance and developing preclinical testing models are important for developing new prostate cancer therapies. This report describes studies performed 30 years ago, which demonstrate the utility and shortfalls of xenografting for preclinical modeling. Methods: We subcutaneously implanted male nude mice with small prostate cancer fragments obtained by transurethral resection of the prostate (TURP) from 29 patients. Successful xenografts were passaged into new host mice. They were characterized using histology, immunohistochemistry for marker expression, flow cytometry for ploidy status, and in some cases by electron microscopy and response to testosterone. Two xenografts were karyotyped by G-banding. Results: Tissues from 3/29 donors (10%) gave rise to xenografts that were successfully serially passaged in vivo. Two of these (UCRU-PR-1, which was subsequently replaced by a mouse fibrosarcoma, and UCRU-PR-2, which combined epithelial and neuroendocrine features) have been described previously. The UCRU-PR-4 line was a poorly differentiated prostatic adenocarcinoma derived from a patient who had undergone estrogen therapy and bilateral castration after his cancer relapsed. Histologically, this comprised diffusely infiltrating small acinar cell carcinoma with more solid aggregates of poorly differentiated adenocarcinoma. The xenografted line showed histology consistent with a poorly differentiated adenocarcinoma and stained positively for prostatic acid phosphatase (PAcP), epithelial membrane antigen (EMA) and the cytokeratin cocktail CAM5.2, with weak staining for prostate specific antigen (PSA). The line failed to grow in female nude mice. Castration of three male nude mice after xenograft establishment resulted in cessation of growth in one, growth regression in another and transient growth in the third, suggesting that some cells had retained androgen sensitivity.
The karyotype (from passage 1) was 43–46, XY, dic(1;12)(p11;p11), der(3)t(3;?5)(q13;q13), -5, inv(7)(p15q35) x2, +add(7)(p13), add(8)(p22), add(11)(p14), add(13)(p11), add(20)(p12), -22, +r4[cp8]. Conclusions: Xenografts provide a clinically relevant model of prostate cancer, although establishing serially transplantable patient-derived prostate cancer xenografts is challenging and requires rigorous characterization and high-quality starting material. Xenografting from advanced prostate cancer is more likely to succeed, as xenografting from well differentiated, localized disease has not been achieved in our experience. Strong translational correlations can be demonstrated between the clinical disease state and the xenograft model.
Abstract:
New mathematical models relating four key techno-economic indexes of highway rapid passenger through transportation were established based on the principles of transportation economics. Research on feasible solutions for the associated parameters, which were then compared to actual values, revealed limitations in the existing transport organization method. To overcome these, two new transport organization methods, the CD (Collecting and Distributing) Method and the Relay Method, are put forward. Further research was conducted to assess their characteristics, such as feasibility, operation flows and fields of applicability. This analysis shows that the two methods can offset the shortcomings of rapid passenger through transportation. To ensure that highway rapid passenger transport develops harmoniously, a three-stage set of development targets is suggested to integrate the different organization methods.
Abstract:
This paper is a continuation of the paper titled "Concurrent multi-scale modeling of civil infrastructure for analyses on structural deteriorating—Part I: Modeling methodology and strategy", with emphasis on model updating and verification for the developed concurrent multi-scale model. The sensitivity-based parameter updating method was applied, and important issues such as the selection of reference data and model parameters, and the model updating procedure for the multi-scale model, were investigated based on a sensitivity analysis of the selected model parameters. Experimental modal data, as well as static responses in terms of component nominal stresses and hot-spot stresses at the locations of concern, were used for dynamic response- and static response-oriented model updating, respectively. The updated multi-scale model was further verified to act as the baseline model, which is assumed to be the finite-element model closest to the real situation of the structure and available for subsequent arbitrary numerical simulations. The comparison of dynamic and static responses between the results calculated by the final model and the measured data indicated that the updating and verification methods applied in this paper are reliable and accurate for multi-scale models of frame-like structures. General procedures for multi-scale model updating and verification were finally proposed for nonlinear physics-based modeling of large civil infrastructure, and they were applied to the model verification of a long-span bridge as an engineering application of the proposed procedures.
Abstract:
Background: We investigated the likely impact of vaccines on the prevalence of, and morbidity due to, Chlamydia trachomatis (chlamydia) infections in heterosexual populations. Methods: An individual-based mathematical model of chlamydia transmission was developed and linked to the course of infection in chlamydia-infected individuals. The model describes the impact of a vaccine through its effect on the chlamydial load required to infect susceptible individuals (the "critical load"), the load in infected individuals, and their subsequent infectiousness. The model was calibrated using behavioral, biological, and clinical data. Results: A fully protective chlamydia vaccine administered before sexual debut could theoretically eliminate chlamydia epidemics within 20 years. Partially effective vaccines can still greatly reduce the incidence of chlamydia infection. Vaccines should aim primarily to increase the critical load in susceptible individuals and secondarily to decrease the peak load and/or the duration of infection in vaccinated individuals who become infected. Vaccinating both sexes has a beneficial impact on chlamydia-related morbidity, but targeting women is more effective than targeting men. Conclusions: Our findings can be used in laboratory settings to evaluate vaccine candidates in animal models, by regulatory bodies in the promotion of candidates for clinical trials, and by public health authorities in deciding on optimal intervention strategies.
Abstract:
Purpose: All currently considered parametric models used for decomposing videokeratoscopy height data are viewer-centered and hence describe what the operator sees rather than what the surface is. The purpose of this study was to ascertain the applicability of an object-centered representation to the modeling of corneal surfaces. Methods: A three-dimensional surface decomposition into a series of spherical harmonics is considered and compared with the traditional Zernike polynomial expansion for a range of videokeratoscopic height data. Results: Spherical harmonic decomposition led to significantly better fits to corneal surfaces (in terms of root mean square error) than the corresponding Zernike polynomial expansions with the same number of coefficients, for all considered corneal surfaces, corneal diameters, and model orders. Conclusions: Spherical harmonic decomposition is a viable alternative to Zernike polynomial decomposition. It achieves better fits to videokeratoscopic height data and has the advantage of an object-centered representation that could be particularly suited to the analysis of multiple corneal measurements.
Abstract:
Purpose: To ascertain the effectiveness of object-centered three-dimensional representations for the modeling of corneal surfaces. Methods: Three-dimensional (3D) surface decompositions into series of basis functions, including (i) spherical harmonics, (ii) hemispherical harmonics, and (iii) 3D Zernike polynomials, were considered and compared to the traditional viewer-centered representation, a two-dimensional (2D) Zernike polynomial expansion, for a range of retrospective videokeratoscopic height data from three clinical groups. The data were collected using the Medmont E300 videokeratoscope. The groups included 10 normal corneas with corneal astigmatism less than −0.75 D, 10 astigmatic corneas with corneal astigmatism between −1.07 D and −3.34 D (Mean = −1.83 D, SD = ±0.75 D), and 10 keratoconic corneas. Only data from the subjects' right eyes were considered. Results: All object-centered decompositions led to significantly better fits to corneal surfaces (in terms of RMS error values) than the corresponding 2D Zernike polynomial expansions with the same number of coefficients, for all considered corneal surfaces, corneal diameters (2, 4, 6, and 8 mm), and model orders (4th to 10th radial orders). The best results (smallest RMS fit error) were obtained with spherical harmonic decomposition, which led to about a 22% reduction in RMS fit error compared to the traditional 2D Zernike polynomials. Hemispherical harmonics and 3D Zernike polynomials reduced the RMS fit error by about 15% and 12%, respectively. Larger reductions in RMS fit error were achieved for smaller corneal diameters and lower order fits. Conclusions: Object-centered 3D decompositions provide viable alternatives to the traditional viewer-centered 2D Zernike polynomial expansion of a corneal surface. They achieve better fits to videokeratoscopic height data and could be particularly suited to the analysis of multiple corneal measurements, where there can be slight variations in the position of the cornea from one map acquisition to the next.
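For readers unfamiliar with the baseline these studies compare against, a least-squares fit of low-order 2D Zernike terms to height data on the unit disk can be sketched as follows. The synthetic surface, sample count, and selection of only piston, tilt, and defocus terms are illustrative assumptions rather than the studies' actual data or model orders.

```python
# Baseline sketch: viewer-centered 2D Zernike fitting of height data by
# linear least squares. Real corneal fits use many more terms (4th to
# 10th radial orders) and measured videokeratoscopic heights.
import numpy as np

rng = np.random.default_rng(0)

# Random sample points on the unit disk (radius normalized to the
# analysis diameter)
r = np.sqrt(rng.random(500))
theta = 2 * np.pi * rng.random(500)

def zernike_basis(r, theta):
    """A few low-order Zernike terms evaluated at polar points."""
    return np.column_stack([
        np.ones_like(r),             # Z(0,0)  piston
        2 * r * np.cos(theta),       # Z(1,1)  tilt x
        2 * r * np.sin(theta),       # Z(1,-1) tilt y
        np.sqrt(3) * (2 * r**2 - 1), # Z(2,0)  defocus
    ])

# Synthetic "height data": mostly defocus plus small measurement noise
height = 0.8 * np.sqrt(3) * (2 * r**2 - 1) + 0.01 * rng.standard_normal(r.size)

A = zernike_basis(r, theta)
coeffs, *_ = np.linalg.lstsq(A, height, rcond=None)
rms = np.sqrt(np.mean((A @ coeffs - height) ** 2))
print(coeffs.round(3), round(float(rms), 4))
```

The object-centered alternatives in the abstracts keep exactly this least-squares machinery but swap the basis matrix for spherical harmonics, hemispherical harmonics, or 3D Zernike polynomials evaluated on the surface itself.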
Abstract:
In this paper, we propose a multivariate GARCH model with a time-varying conditional correlation structure. The new double smooth transition conditional correlation (DSTCC) GARCH model extends the smooth transition conditional correlation (STCC) GARCH model of Silvennoinen and Teräsvirta (2005) by including a second variable according to which the correlations change smoothly between states of constant correlations. A Lagrange multiplier test is derived to test the constancy of correlations against the DSTCC-GARCH model, and another to test for an additional transition in the STCC-GARCH framework. In addition, other specification tests, aimed at aiding the model building procedure, are considered. Analytical expressions for the test statistics and the required derivatives are provided. Applying the model to stock and bond futures data, we discover that the correlation pattern between them changed dramatically around the turn of the century. The model is also applied to a selection of world stock indices, and we find evidence of an increasing degree of integration in capital markets.
Abstract:
This paper presents a new approach to improving the effectiveness of autonomous systems that deal with dynamic environments. The basis of the approach is to find repeating patterns of behavior in the dynamic elements of the system, and then to use predictions of the repeating elements to better plan goal-directed behavior. It is a layered approach involving classifying, modeling, predicting and exploiting. Classifying uses observations to place the moving elements into previously defined classes. Modeling records features of the behavior on a coarse-grained grid. Exploitation is achieved by integrating predictions from the model into the behavior selection module to improve the utility of the robot's actions. This is in contrast to typical approaches that use the model to select between different strategies or plays. Three methods of adaptation to the dynamic features of the environment are explored, and the effectiveness of each is determined using statistical tests over a number of repeated experiments. The work is presented in the context of predicting opponent behavior in the highly dynamic, multi-agent robot soccer domain (RoboCup).