Abstract:
Accurate prediction of incident duration is not only important information for a Traffic Incident Management System, but also an effective input for travel time prediction. In this paper, hazard-based prediction models are developed for both incident clearance time and arrival time. The data are obtained from the Queensland Department of Transport and Main Roads’ STREAMS Incident Management System (SIMS) for one year ending in November 2010. The best-fitting distributions are derived for both clearance and arrival time for three types of incident: crash, stationary vehicle, and hazard. The results show that the Gamma, Log-logistic, and Weibull distributions are the best fits for crash, stationary vehicle, and hazard incidents, respectively. The significant impact factors are identified for crash clearance time and arrival time. The quantitative influences of these factors on crash and hazard incidents are presented for both clearance and arrival time. Model accuracy is analyzed at the end.
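As an illustration of the distribution-fitting step described above, the sketch below fits Gamma, Log-logistic, and Weibull distributions to a vector of clearance durations by maximum likelihood and compares their log-likelihoods. The data values and the use of scipy are assumptions for illustration, not part of the original study.

    # Minimal sketch (assumed setup, not the paper's code): compare candidate
    # duration distributions for incident clearance times by maximum likelihood.
    import numpy as np
    from scipy import stats

    # Hypothetical clearance durations in minutes (placeholder data).
    clearance_minutes = np.array([12.0, 25.0, 8.5, 40.0, 18.0, 55.0, 30.0, 22.0])

    candidates = {
        "Gamma": stats.gamma,
        "Log-logistic": stats.fisk,       # scipy's Fisk distribution is the log-logistic
        "Weibull": stats.weibull_min,
    }

    for name, dist in candidates.items():
        params = dist.fit(clearance_minutes, floc=0)   # fix the location parameter at zero
        loglik = np.sum(dist.logpdf(clearance_minutes, *params))
        print(f"{name:12s} log-likelihood = {loglik:.2f}")

In practice the candidates would be compared on the incident data for each incident type (crash, stationary vehicle, hazard), using a criterion such as AIC rather than raw log-likelihood when the distributions have different numbers of parameters.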
Abstract:
Commodity price modeling is normally approached in terms of structural time-series models, in which the different components (states) have a financial interpretation. The parameters of these models can be estimated using maximum likelihood. This approach results in a non-linear parameter estimation problem, and thus a key issue is how to obtain reliable initial estimates. In this paper, we focus on the initial parameter estimation problem for the Schwartz-Smith two-factor model commonly used in asset valuation. We propose the use of a two-step method. The first step considers a univariate model based only on the spot price and uses a transfer function model to obtain initial estimates of the fundamental parameters. The second step uses the estimates obtained in the first step to initialize a re-parameterized state-space-innovations based estimator, which includes information related to futures prices. The second step refines the estimates obtained in the first step and also gives estimates of the remaining parameters in the model. This paper is partly tutorial in nature and gives an introduction to aspects of commodity price modeling and the associated parameter estimation problem.
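For context, the Schwartz-Smith two-factor model referred to above decomposes the log spot price into a short-term, mean-reverting deviation and a long-term equilibrium level. A standard statement of the model, included here for orientation rather than taken from this paper, is:

    \ln S_t = \chi_t + \xi_t, \qquad
    d\chi_t = -\kappa\,\chi_t\,dt + \sigma_\chi\,dW_t^{\chi}, \qquad
    d\xi_t = \mu_\xi\,dt + \sigma_\xi\,dW_t^{\xi}, \qquad
    dW_t^{\chi}\,dW_t^{\xi} = \rho\,dt

Here \chi_t is the short-term deviation, reverting at rate \kappa, and \xi_t is the long-term equilibrium level. The parameters shown govern the spot-price dynamics; risk-premium and measurement-error parameters enter only through the pricing of futures contracts, which is consistent with refining the model against futures information in the second estimation step.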
Abstract:
The construction industry accounts for a tenth of global GDP. Still, challenges such as slow adoption of new work processes, islands of information, and legal disputes remain frequent, industry-wide occurrences despite various attempts to address them. In response, IT-based approaches have been adopted to explore collaborative ways of executing construction projects. Building Information Modelling (BIM) is an exemplar of integrative technologies whose 3D-visualisation capabilities have fostered collaboration, especially between clients and design teams. Yet the ways in which specification documents are created and used to capture clients' expectations based on industry standards have remained largely unchanged since the 18th century. As a result, specification-related errors are still commonplace in an industry where vast amounts of information are consumed as well as produced in the course of project implementation in the built environment. By implication, processes such as cost planning, which depend on specification-related information, remain largely inaccurate even with the use of BIM-based technologies. This paper briefly distinguishes between non-BIM-based and BIM-based specifications and reports on-going efforts geared towards the latter. We review exemplars aimed at extending Building Information Models to specification information embedded within the objects in a product library, and explore a viable way of reasoning about a semi-automated process of specification using our product library.
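As a purely illustrative sketch of what embedding specification information within product-library objects could look like, the fragment below attaches specification clauses to a product-library entry and assembles a draft specification section from them; the class and field names are hypothetical and do not reflect the paper's schema.

    # Hypothetical sketch: a product-library entry carrying specification clauses
    # alongside the BIM object data. Names and fields are illustrative only.
    from dataclasses import dataclass, field

    @dataclass
    class SpecificationClause:
        standard: str          # a referenced industry standard (illustrative)
        clause_id: str
        requirement: str

    @dataclass
    class ProductLibraryItem:
        object_id: str         # identifier of the BIM object in the product library
        description: str
        clauses: list[SpecificationClause] = field(default_factory=list)

        def to_specification_text(self) -> str:
            """Assemble a draft specification section from the attached clauses."""
            lines = [f"{self.description} ({self.object_id})"]
            lines += [f"  [{c.standard} {c.clause_id}] {c.requirement}" for c in self.clauses]
            return "\n".join(lines)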
Abstract:
The articles collected here in this special edition, Epithelial-Mesenchymal (EMT) and Mesenchymal-Epithelial Transitions (MET) in Cancer, provide a snapshot of the very rapidly progressing cinemascope of the involvement of these transitions in carcinoma progression. A PubMed analysis of EMT and cancer shows an exponential increase over the last few years in the number of papers and reviews published under these terms (Fig. 1). The last few years have seen these articles appearing in high-calibre journals including Nature, Nature Cell Biology, Cancer Cell, PNAS, JNCI, JCI, and Cell, signaling the acceptance and quality of work in this field.
Abstract:
We investigated the effects of the matrix metalloproteinase 13 (MMP13)-selective inhibitor, 5-(4-{4-[4-(4-fluorophenyl)-1,3-oxazol-2-yl]phenoxy}phenoxy)-5-(2-methoxyethyl) pyrimidine-2,4,6(1H,3H,5H)-trione (Cmpd-1), on primary tumor growth and breast cancer-associated bone remodeling using xenograft and syngeneic mouse models. We used human breast cancer MDA-MB-231 cells inoculated into the mammary fat pad and the left ventricle of BALB/c Nu/Nu mice, and spontaneously metastasizing 4T1.2-Luc mouse mammary cells inoculated into the mammary fat pad of BALB/c mice. In a prevention setting, treatment with Cmpd-1 markedly delayed the growth of primary tumors in both models, and reduced the onset and severity of osteolytic lesions in the MDA-MB-231 intracardiac model. Intervention treatment with Cmpd-1 on established MDA-MB-231 primary tumors also significantly inhibited subsequent growth. In contrast, no effects of Cmpd-1 were observed on soft organ metastatic burden following intracardiac or mammary fat pad inoculations of MDA-MB-231 and 4T1.2-Luc cells, respectively. MMP13 immunostaining of clinical primary breast tumors and experimental mouse tumors revealed intra-tumoral and stromal expression in most tumors, and vasculature expression in all. MMP13 was also detected in osteoblasts in clinical samples of breast-to-bone metastases. The data suggest that MMP13-selective inhibitors, which lack musculoskeletal side effects, may have therapeutic potential both in primary breast cancer and in cancer-induced bone osteolysis.
Abstract:
We construct a two-scale mathematical model for modern, high-rate LiFePO4 cathodes. We attempt to validate it against experimental data using two forms of the phase-field model developed recently to represent the concentration of Li+ in nano-sized LiFePO4 crystals. We also compare this with the shrinking-core based model we developed previously. Validating against high-rate experimental data, in which electronic and electrolytic resistances have been reduced, is an excellent test of the validity of the crystal-scale model used to represent the phase change that may occur in LiFePO4 material. We obtain poor fits with the shrinking-core based model, even with fitting based on “effective” parameter values. Surprisingly, using the more sophisticated phase-field models on the crystal scale results in poorer fits, though a significant parameter regime could not be investigated due to numerical difficulties. Separate to the fits obtained, using phase-field based models embedded in a two-scale cathodic model results in “many-particle” effects consistent with those reported recently.
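For readers unfamiliar with the crystal-scale phase-field approach mentioned above, a commonly used regular-solution (Cahn-Hilliard) free energy for the intercalated Li fraction c is shown below; this is a generic form often adopted for LiFePO4 phase-field models, not necessarily the exact formulation fitted in this work:

    G[c] = \int_V \Big( \Omega\,c(1-c) + RT\big[c\ln c + (1-c)\ln(1-c)\big]
           + \tfrac{\kappa}{2}\,|\nabla c|^{2} \Big)\,dV,
    \qquad
    \mu = \frac{\delta G}{\delta c} = \Omega(1-2c) + RT\ln\frac{c}{1-c} - \kappa\,\nabla^{2} c

With this form, separation into Li-rich and Li-poor regions occurs when the mixing enthalpy \Omega exceeds 2RT, which is the phase change the crystal-scale model is intended to capture.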
Abstract:
In this paper, we present fully Bayesian experimental designs for nonlinear mixed effects models, for which we develop simulation-based optimal design methods that search over both continuous and discrete design spaces. Although Bayesian inference has commonly been performed on nonlinear mixed effects models, there is a lack of research into performing Bayesian optimal design for nonlinear mixed effects models when searches must be performed over several design variables. This is likely because performing optimal experimental design for nonlinear mixed effects models is much more computationally intensive than performing inference in the Bayesian framework. In this paper, the design problem is to determine the optimal number of subjects and samples per subject, as well as the (near) optimal urine sampling times, for a population pharmacokinetic study in horses, so that the population pharmacokinetic parameters can be precisely estimated, subject to cost constraints. The optimal sampling strategies, in terms of the number of subjects and the number of samples per subject, were found to be substantially different between the examples considered in this work, which highlights the fact that the designs are rather problem-dependent and require optimisation using the methods presented in this paper.
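Schematically, the simulation-based search described above evaluates each candidate design (number of subjects, samples per subject, sampling times) by drawing parameters from the prior, simulating the study, and averaging a utility over the simulations. The sketch below is a generic Monte Carlo outline with a toy one-compartment model and a crude placeholder utility; the model, priors, and criterion are assumptions for illustration only, not the authors' implementation.

    # Generic sketch of simulation-based Bayesian design evaluation (illustrative only).
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_study(times, theta, rng):
        """Toy one-compartment model: concentration = dose/V * exp(-k t), lognormal noise."""
        k, V = theta
        conc = (100.0 / V) * np.exp(-k * np.asarray(times))
        return conc * np.exp(rng.normal(0.0, 0.1, size=len(times)))

    def utility(times, data):
        """Crude placeholder criterion; a real study would use e.g. posterior precision."""
        return np.var(np.log(data))

    def expected_utility(times, prior_draws, n_sim=200):
        """Monte Carlo estimate of the expected utility of one candidate sampling schedule."""
        total = 0.0
        for _ in range(n_sim):
            theta = prior_draws[rng.integers(len(prior_draws))]   # draw from the prior
            data = simulate_study(times, theta, rng)              # simulate a study outcome
            total += utility(times, data)                         # score the design
        return total / n_sim

    # Candidate sampling schedules (hours) and hypothetical priors, purely illustrative.
    candidates = [(0.5, 1, 2, 4), (1, 2, 6, 12), (2, 4, 8, 24)]
    prior_draws = np.column_stack([rng.lognormal(-1.5, 0.3, 500),   # elimination rate k
                                   rng.lognormal(3.0, 0.3, 500)])   # volume V
    best = max(candidates, key=lambda d: expected_utility(d, prior_draws))
    print("best candidate schedule:", best)

The same loop structure extends to discrete design variables by including the number of subjects and samples per subject in each candidate design and folding cost constraints into the utility.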
Abstract:
Most existing motorway traffic safety studies using disaggregate traffic flow data aim at developing models for identifying real-time traffic risks by comparing pre-crash and non-crash conditions. One of the serious shortcomings of those studies is that non-crash conditions are arbitrarily selected and hence not representative: the selected non-crash data might not be the right data to compare with pre-crash data, the non-crash/pre-crash ratio is arbitrarily decided and neglects the abundance of non-crash over pre-crash conditions, and so on. Here, we present a methodology for developing a real-time MotorwaY Traffic Risk Identification Model (MyTRIM) using individual vehicle data, meteorological data, and crash data. Non-crash data are clustered into groups called traffic regimes. Thereafter, pre-crash data are classified into regimes to match them with the relevant non-crash data. Of the eight traffic regimes obtained, four highly risky regimes were identified, and three regime-based Risk Identification Models (RIM) with sufficient pre-crash data were developed. MyTRIM memorizes the latest risk evolution identified by RIM to predict near-future risks. Traffic practitioners can choose MyTRIM’s memory size based on the trade-off between detection and false alarm rates: decreasing the memory size from 5 to 1 increases the detection rate from 65.0% to 100.0% and the false alarm rate from 0.21% to 3.68%. Moreover, critical factors in differentiating pre-crash and non-crash conditions are identified and can be used for developing preventive measures. MyTRIM can be used by practitioners in real time, either as an independent tool for making online decisions or integrated with existing traffic management systems.
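The abstract does not spell out the memory rule. One reading consistent with the reported trade-off is that an alarm requires the last m regime-based RIM outputs to all flag elevated risk, so a smaller m produces more alarms (higher detection, more false alarms). The sketch below implements that assumed rule, which is not necessarily MyTRIM's exact mechanism.

    # Illustrative sketch only: one way a memory of size m could trade off detection
    # against false alarms, consistent with the trend reported above (an assumption,
    # not necessarily MyTRIM's rule). An alarm is raised when the last m RIM outputs
    # all flag elevated risk.
    from collections import deque

    def risk_alarm_stream(rim_flags, memory_size):
        """Yield True when the last `memory_size` RIM outputs were all risky."""
        window = deque(maxlen=memory_size)
        for flag in rim_flags:
            window.append(flag)
            yield len(window) == memory_size and all(window)

    # With m=1 every risky interval triggers an alarm; with m=5 only sustained risk does.
    rim_flags = [0, 1, 1, 1, 1, 1, 0, 1, 0, 0]
    print(list(risk_alarm_stream(rim_flags, 1)))
    print(list(risk_alarm_stream(rim_flags, 5)))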
Abstract:
Existing techniques for automated discovery of process models from event logs largely focus on extracting flat process models. In other words, they fail to exploit the notion of subprocess, as well as structured error handling and repetition constructs provided by contemporary process modeling notations, such as the Business Process Model and Notation (BPMN). This paper presents a technique for automated discovery of BPMN models containing subprocesses, interrupting and non-interrupting boundary events, and loop and multi-instance markers. The technique analyzes dependencies between data attributes associated with events, in order to identify subprocesses and to extract their associated logs. Parent process and subprocess models are then discovered separately using existing techniques for flat process model discovery. Finally, the resulting models and logs are heuristically analyzed in order to identify boundary events and markers. A validation with one synthetic and two real-life logs shows that process models derived using the proposed technique are more accurate and less complex than those derived with flat process model discovery techniques.
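As a minimal illustration of the log-splitting step described above: once a data attribute linking child events to a parent activity has been identified, the event log can be partitioned into a parent log and a subprocess log that are then mined separately with an existing flat-discovery technique. The attribute names below are hypothetical, and this is not the authors' implementation.

    # Minimal sketch of partitioning an event log into parent and subprocess logs once
    # a linking data attribute has been identified; attribute names are hypothetical.
    from collections import defaultdict

    def split_log(events, subprocess_attr="parent_activity_id"):
        """Events carrying the linking attribute go to the subprocess log, keyed by it."""
        parent_log, subprocess_log = defaultdict(list), defaultdict(list)
        for e in events:
            if subprocess_attr in e:
                subprocess_log[e[subprocess_attr]].append(e)   # child event of a subprocess instance
            else:
                parent_log[e["case_id"]].append(e)             # top-level event of the parent process
        return parent_log, subprocess_log

    events = [
        {"case_id": "c1", "activity": "Receive order"},
        {"case_id": "c1", "activity": "Check item", "parent_activity_id": "c1-check"},
        {"case_id": "c1", "activity": "Approve order"},
    ]
    parents, subs = split_log(events)
    print(len(parents), len(subs))   # 1 parent case, 1 subprocess instance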
Abstract:
Tissue engineering is a multidisciplinary field with the potential to replace tissues lost as a result of trauma, cancer surgery, or organ dysfunction. The successful production, integration, and maintenance of any tissue-engineered product are a result of numerous molecular interactions inside and outside the cell. We consider the essential elements for successful tissue engineering to be a matrix scaffold, space, cells, and vasculature, each of which has a significant and distinct molecular underpinning (Fig. 1). Our approach capitalizes on these elements. Originally developed in the rat, our chamber model (Fig. 2) involves the placement of an arteriovenous loop (the vascular supply) in a polycarbonate chamber (protected space) with the addition of cells and an extracellular matrix such as Matrigel or endogenous fibrin (34, 153, 246, 247). This model has also been extended to the rabbit and pig (J. Dolderer, M. Findlay, W. Morrison, manuscript in preparation), and has been modified for the mouse to grow adipose tissue and islet cells (33, 114, 122) (Fig. 3)...
Abstract:
Invasion of extracellular matrices is crucial to a number of physiological and pathophysiological states, including tumor cell metastasis, arthritis, embryo implantation, wound healing, and early development. To isolate invasion from the additional complexities of these scenarios, a number of in vitro invasion assays have been developed over the years. Early studies employed intact tissues, such as denuded amniotic membrane (1) or embryonic chick heart fragments (2); more recently, however, purified matrix components or complex matrix extracts have been used to provide more uniform and often more rapid analyses (for examples, see the following integrin studies). Of course, the more holistic view of invasion offered by the earlier assays is valuable and cannot be fully reproduced in these more rapid assays, but the advantages of reproducibility among replicates, ease of preparation and analysis, and overall high throughput favor the newer assays. In this chapter, we will focus on providing detailed protocols for Matrigel-based assays (Matrigel = reconstituted basement membrane; reviewed in ref. (3)). Matrigel is an extract from the transplantable Engelbreth-Holm-Swarm murine sarcoma that deposits a multilamellar basement membrane. Matrigel is available commercially (Becton Dickinson, Bedford, MA) and can be manipulated as a liquid at 4°C into a variety of different formats. Alternatively, cell culture inserts precoated with Matrigel can be purchased for even greater simplicity.
Abstract:
Water management is vital for mine sites, both for production and for sustainability-related issues. Effective water management is a complex task, since the role of water on mine sites is multifaceted. Computer models are tools that represent mine site water interactions and can be used by mine sites to inform or evaluate their water management strategies. Several types of models can be used to represent mine site water interactions. This paper presents three such models: an operational model, an aggregated systems model, and a generic systems model. For each model, the paper provides a description and example, followed by an analysis of its advantages and disadvantages. The paper hypothesises that, since no model is optimal for all situations, each model should be applied in the situations where it is most appropriate based upon the scale of water interactions being investigated: unit (operational), inter-site (aggregated systems), or intra-site (generic systems).
Abstract:
There is a wide variety of drivers for business process modelling initiatives, ranging from business evolution and process optimisation, through compliance checking and process certification, to process enactment. That, in turn, results in models that differ in content because they serve different purposes. In particular, processes are modelled at different abstraction levels and assume different perspectives. Vertical alignment of process models aims at handling these deviations. While the advantages of such an alignment for inter-model analysis and change propagation are beyond question, a number of challenges still have to be addressed. In this paper, we discuss three main challenges for vertical alignment in detail. Against this background, the potential application of techniques from the field of process integration is critically assessed. Based thereon, we identify specific research questions that guide the design of a framework for model alignment.
Abstract:
This thesis investigates the use of building information models for access control and security applications in critical infrastructures and complex building environments. It examines current problems in security management for physical and logical access control and proposes novel solutions that exploit the detailed information available in building information models. The project was carried out as part of the Airports of the Future Project, and the research was modelled on real-world problems identified in collaboration with our industry partners in the project.
Abstract:
Western economies are highly dependent on service innovation for their growth and employment. An important driver of economic growth is, therefore, the development of new, innovative services such as electronic services, mobile end-user services, and new financial or personalized services. Service innovation brings together four trends that currently shape western economies: the growing importance of services, the need for innovation, changes in consumer and business markets, and the advancements in information and communication technology (ICT).