681 results for model calibration


Relevance: 20.00%

Abstract:

Airports represent the epitome of complex systems, with multiple stakeholders, multiple jurisdictions and complex interactions between many actors. The large number of existing models that capture different aspects of the airport is a testament to this. However, these existing models do not systematically consider modelling requirements, nor how stakeholders such as airport operators or airlines would make use of them. This can detrimentally impact the verification and validation of models and makes the development of extensible and reusable modelling tools difficult. This paper develops, from the Concept of Operations (CONOPS) framework, a methodology to help structure the review and development of modelling capabilities and usage scenarios. The method is applied to a review of existing airport terminal passenger models. It is found that existing models can be broadly categorised according to four usage scenarios: capacity planning, operational planning and design, security policy and planning, and airport performance review. The models, the performance metrics that they evaluate and their usage scenarios are discussed. Capacity and operational planning models predominantly focus on performance metrics such as waiting time, service time and congestion, whereas performance review models attempt to link these to passenger satisfaction outcomes. Security policy models, on the other hand, focus on probabilistic risk assessment. However, there is an emerging focus on the need to capture trade-offs between multiple criteria, such as security and processing time. Based on the CONOPS framework and the literature findings, guidance is provided for the development of future airport terminal models.

Relevance: 20.00%

Abstract:

This thesis provides a query model suitable for context-sensitive access to the wide range of distributed linked datasets available to scientists over the Internet. The model is designed around scientific research standards that require scientists to provide replicable methods in their publications. Although existing query models provide limited replicability, they do not contextualise the process by which different scientists select dataset locations based on trust and physical location. In different contexts, scientists need to perform different data cleaning actions, independent of the overall query, and the model was designed to accommodate this function. The query model was implemented as a prototype web application and its features were verified through its use as the engine behind a major scientific data access site, Bio2RDF.org. The prototype showed that it was possible to obtain context-sensitive behaviour for each of the three mirrors of Bio2RDF.org using a single set of configuration settings. The prototype provided executable query provenance that could be attached to scientific publications to fulfil replicability requirements. The model was designed to make it simple to independently interpret and execute the query provenance documents using context-specific profiles, without modifying the original provenance documents. Experiments using the prototype as the data access tool in workflow management systems confirmed that the design of the model made it possible to replicate results in different contexts with minimal additions, and no deletions, to the query provenance documents.

Relevance: 20.00%

Abstract:

We present a porous medium model of the growth and deterioration of the viable sublayers of an epidermal skin substitute. It comprises five species: cells, intracellular and extracellular calcium, tight junctions, and a hypothesised signal chemical emanating from the stratum corneum. The model is solved numerically in Matlab using a finite difference scheme. The predicted steady-state calcium distributions agree well with experimental data. Our model also demonstrates deterioration of the epidermal skin substitute when the calcium diffusion coefficient is reduced below values reported in the literature.
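The paper's five-species Matlab model is not reproduced in the abstract, but the numerical approach it describes can be illustrated with a much smaller analogue. The sketch below solves a single-species steady-state reaction-diffusion problem with a finite difference scheme in Python; the parameter values, source term and boundary conditions are illustrative assumptions, not those of the paper.

```python
import numpy as np

# Illustrative 1-D steady-state calcium profile via finite differences:
#   D * c'' - k * c + s(x) = 0  on [0, 1],
# with c(0) = c0 (Dirichlet, basal side) and c'(1) = 0 (zero flux, outer side).
# All values are placeholders chosen for illustration only.
D, k, c0, n = 1.0, 5.0, 1.0, 51
x = np.linspace(0.0, 1.0, n)
h = x[1] - x[0]
s = np.exp(-10 * (x - 0.8) ** 2)   # hypothesised signal-driven source term

A = np.zeros((n, n))
b = -s.copy()
for i in range(1, n - 1):          # central differences on interior nodes
    A[i, i - 1] = D / h**2
    A[i, i]     = -2 * D / h**2 - k
    A[i, i + 1] = D / h**2
A[0, 0] = 1.0; b[0] = c0                           # Dirichlet boundary
A[-1, -1] = 1.0; A[-1, -2] = -1.0; b[-1] = 0.0     # zero-flux boundary

c = np.linalg.solve(A, b)          # steady-state calcium profile
```

The same assembly pattern extends to several coupled species by stacking one block of equations per species into a larger linear (or linearised) system.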

Relevance: 20.00%

Abstract:

The driver response (reaction) time (t_r) of the second queuing vehicle at signalized intersections is generally longer than that of the other vehicles. Although this phenomenon was reported as early as 1972, it is still ignored in conventional departure models. This paper highlights the need for quantitative measurement and analysis of queuing vehicle performance during spontaneous discharge, because such analysis can improve microsimulation. Video recordings from major cities in Australia, plus twenty-two sets of vehicle trajectories extracted from the Next Generation Simulation (NGSIM) Peachtree Street dataset, were analyzed to better understand queuing vehicle performance during the discharge process. Findings from this research help characterize driver response time and can also be used for the calibration of microscopic traffic simulation models.
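The measurement described above can be sketched as the lag between consecutive vehicles' start-of-motion times when a queue discharges. The speed profiles below are synthetic stand-ins (not NGSIM data), and the 0.5 m/s start-of-motion threshold is an assumed convention.

```python
import numpy as np

# Sketch: per-vehicle response time t_r in queue discharge, measured as the
# lag between consecutive vehicles' start-of-motion times after signal onset.
dt = 0.1
t = np.arange(0.0, 15.0, dt)
true_starts = [1.0, 3.1, 4.3, 5.5]   # synthetic: 2nd vehicle reacts more slowly
speeds = [np.where(t >= s, 10.0, 0.0) for s in true_starts]

def start_time(v: np.ndarray, thresh: float = 0.5) -> float:
    """First timestamp at which the vehicle's speed exceeds `thresh` m/s."""
    return float(t[np.argmax(v > thresh)])

starts = [start_time(v) for v in speeds]
response_times = np.diff(starts)     # t_r of the 2nd, 3rd and 4th vehicles
```

With real trajectories, the same threshold-crossing logic applies; only the speed profiles would come from the extracted data instead of the synthetic step functions.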

Relevance: 20.00%

Abstract:

Traffic management, which controls traffic flow and freight movement, will be important for further noise reduction in the future. To implement traffic management measures effectively, a model is needed that predicts traffic flow across the citywide road network. For this purpose, an existing model named AVENUE was used as a macroscopic traffic flow prediction model. The traffic flow model was integrated with a sound power model for road vehicles, establishing a new road traffic noise prediction model with which a noise map of an entire city can be produced. In this study, the change in traffic flow on the road network after the construction of new roads was first estimated, and the resulting change in road traffic noise was predicted. The results show that the prediction model can estimate how traffic management changes the noise map. In addition, the macroscopic traffic flow model was combined with our conventional microscopic traffic flow model, expanding the coverage of the noise prediction model.
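A basic building block when coupling a traffic flow model with a vehicle sound power model is the energetic (logarithmic) summation of individual source levels into a segment level. The sketch below shows only this generic dB-combination step with illustrative values; it is not the AVENUE-based model itself.

```python
import math

def combine_levels(levels_db):
    """Energetically sum sound levels in dB: L = 10*log10(sum 10^(Li/10))."""
    return 10.0 * math.log10(sum(10 ** (l / 10.0) for l in levels_db))

# Two identical sources add +3 dB; ten identical sources add +10 dB.
print(round(combine_levels([70.0, 70.0]), 1))   # 73.0
print(round(combine_levels([70.0] * 10), 1))    # 80.0
```

In a noise map, this summation would be applied per road segment to the per-vehicle sound power levels predicted from the traffic flow model's vehicle counts and speeds.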

Relevance: 20.00%

Abstract:

Automatic detection of anomalous human behaviour is one of the goals of research on smart surveillance systems. Automatic detection addresses several human-factor issues underlying existing surveillance systems. To create such a detection system, contextual information needs to be considered, because context is required to correctly understand human behaviour. Unfortunately, the use of contextual information in automatic anomalous human behaviour detection remains limited. This paper proposes a context space model with two benefits: (a) it provides guidelines for system designers to select information that can be used to describe context; (b) it enables a system to distinguish between different contexts. A comparative analysis is conducted between a context-based system employing the proposed context space model and a system implemented using one of the existing approaches. The comparison uses a scenario constructed from video clips from the CAVIAR dataset. The results show that the context-based system outperforms the other system, because the context space model allows it to consider only knowledge learned from the relevant context.
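One minimal way to picture a context space is as a set of discrete descriptive dimensions, with learned behaviour models keyed by context so that only knowledge from the matching context is consulted. The dimension names and model labels below are illustrative assumptions, not the paper's actual context space.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Context:
    """A point in a toy context space (dimension names are hypothetical)."""
    location: str      # e.g. "corridor", "foyer"
    time_of_day: str   # e.g. "office_hours", "night"
    density: str       # e.g. "sparse", "crowded"

# Behaviour knowledge keyed by context: the system consults only the model
# learned in the matching context, never knowledge from an irrelevant one.
models = {
    Context("corridor", "office_hours", "sparse"): "model_A",
    Context("corridor", "night", "sparse"): "model_B",
}

current = Context("corridor", "night", "sparse")
selected = models.get(current)   # night-time corridor model, not the daytime one
```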

Relevance: 20.00%

Abstract:

With the advent of social web initiatives, some have argued that these emerging tools might be useful for tacit knowledge sharing because they provide interactive and collaborative technologies. However, there is still a paucity of literature on how, and to what extent, social media can facilitate tacit knowledge sharing. This paper therefore theoretically investigates and maps social media concepts and characteristics against the requirements of tacit knowledge creation and sharing. Through a systematic literature review, five major requirements were identified that need to be present in an environment involving tacit knowledge sharing. These requirements were analyzed against social media concepts and characteristics to see how they map together. The results show that social media can satisfy some of the main requirements of tacit knowledge sharing. The relationships are illustrated in a conceptual framework, and further empirical studies are suggested to validate the findings of this study.

Relevance: 20.00%

Abstract:

This article examines the current transfer pricing regime to consider whether it is a sound model to apply to modern multinational entities. The arm's length price methodology is examined to enable a discussion of the arguments in favour of such a regime. The article then refutes these arguments, concluding that, contrary to the very reason multinational entities exist, applying arm's length rules involves a legal fiction of imagining transactions between unrelated parties. Multinational entities exist precisely to operate in ways that independent entities would not, which the arm's length rules fail to take into account. As such, there is clearly an air of artificiality in applying the arm's length standard. To demonstrate this artificiality with respect to modern multinational entities, multinational banks are used as an example. The article concludes that the separate entity paradigm adopted by the traditional transfer pricing regime is incongruous with the economic theory of modern multinational enterprises.

Relevance: 20.00%

Abstract:

This paper presents a behavioral car-following model, based on empirical trajectory data, that is able to reproduce the spontaneous formation and ensuing propagation of stop-and-go waves in congested traffic. By analyzing individual drivers' car-following behavior throughout oscillation cycles, it is found that this behavior is consistent across drivers and can be captured by a simple model. Statistical analysis of the model's parameters reveals a strong correlation between driver behavior before and during an oscillation, and this correlation should not be ignored if one is interested in microscopic output. If macroscopic outputs are of interest, simulation results indicate that an existing model with fewer parameters can be used instead. This is shown for traffic oscillations caused by rubbernecking, as observed in the US 101 NGSIM dataset. The same experiment is used to establish the relationship between rubbernecking behavior and the period of the oscillations.
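The paper's calibrated behavioral model is not given in the abstract, but the qualitative mechanism (a braking episode by one vehicle propagating upstream as a stop-and-go wave) can be sketched with a simple Newell-style car-following rule. All parameter values below are illustrative, not calibrated to NGSIM data.

```python
import numpy as np

# Minimal Newell-style car-following sketch (not the paper's model): each
# follower moves at free speed unless constrained by the spacing to its leader.
vf, d, dt = 15.0, 7.0, 0.5            # free speed (m/s), jam spacing (m), step (s)
n_veh, n_steps = 20, 120
x = np.array([-i * 2 * d for i in range(n_veh)], float)   # initial positions
hist = [x.copy()]
for step in range(n_steps):
    new_x = x.copy()
    # Leader briefly brakes to a crawl (e.g. a rubbernecking episode).
    lead_v = 2.0 if 30 <= step < 60 else vf
    new_x[0] = x[0] + lead_v * dt
    for i in range(1, n_veh):
        free = x[i] + vf * dt          # unconstrained motion
        cong = x[i - 1] - d            # spacing constraint behind the leader
        new_x[i] = min(free, cong)
    x = new_x
    hist.append(x.copy())
hist = np.array(hist)
speeds = np.diff(hist, axis=0) / dt    # per-vehicle speed profiles over time
```

The last vehicle's speed profile dips well below free speed some time after the leader brakes, which is the stop-and-go wave travelling upstream through the platoon.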

Relevance: 20.00%

Abstract:

Encryption is a well-established technology for protecting sensitive data. However, once encrypted, the data can no longer be easily queried, and database performance depends on how the sensitive data are encrypted. In this paper we review conventional encryption methods, which support only partial querying, and propose an encryption method for numerical data that can be queried effectively. The proposed system includes the design of the service scenario and metadata.
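One conventional way to make encrypted numerical data partially queryable is bucketization: the server stores a keyed, coarse bucket tag next to each ciphertext, answers range queries over the tags, and the client decrypts and refines the candidates. The sketch below illustrates that general idea only; the XOR "cipher" is a stand-in for a real authenticated cipher, and this is not the paper's proposed scheme.

```python
import base64
import hashlib
import hmac

KEY = b"demo-key-not-secret"

def toy_encrypt(value: int) -> str:
    # Stand-in for a real cipher: XOR with a key-derived stream, base64-encoded.
    ks = hashlib.sha256(KEY).digest()
    raw = str(value).encode()
    return base64.b64encode(bytes(b ^ k for b, k in zip(raw, ks))).decode()

def toy_decrypt(token: str) -> int:
    ks = hashlib.sha256(KEY).digest()
    raw = base64.b64decode(token)
    return int(bytes(b ^ k for b, k in zip(raw, ks)).decode())

def bucket(value: int, width: int = 10) -> str:
    # Deterministic keyed bucket tag: leaks only coarse range membership.
    return hmac.new(KEY, str(value // width).encode(), hashlib.sha256).hexdigest()[:8]

# The server stores (bucket_tag, ciphertext) pairs and never sees plaintext.
table = [(bucket(v), toy_encrypt(v)) for v in [3, 17, 25, 42, 58, 61]]

def range_query(lo: int, hi: int, width: int = 10):
    wanted = {bucket(b * width) for b in range(lo // width, hi // width + 1)}
    candidates = [ct for tag, ct in table if tag in wanted]   # server-side filter
    # Client-side refinement: decrypt candidates and drop false positives.
    return sorted(v for v in map(toy_decrypt, candidates) if lo <= v <= hi)
```

The trade-off is typical of such schemes: wider buckets leak less but return more false-positive candidates for the client to decrypt and discard.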

Relevance: 20.00%

Abstract:

Background: Despite the increasing clinical problem of metaphyseal fractures, most experimental studies investigate the healing of diaphyseal fractures. Although the mouse would be the preferable species in which to study the molecular and genetic aspects of metaphyseal fracture healing, no murine model yet exists. Using a special locking plate system, we herein introduce a new model that allows the analysis of metaphyseal bone healing in mice. Methods: In 24 CD-1 mice, the distal metaphysis of the femur was osteotomized. After stabilization with the locking plate, bone repair was analyzed radiologically, biomechanically, and histologically after 2 wk (n = 12) and 5 wk (n = 12). Additionally, the stiffness of the bone-implant construct was tested biomechanically ex vivo. Results: The torsional stiffness of the bone-implant construct was low compared with nonfractured control femora (0.23 ± 0.1 Nmm/° versus 1.78 ± 0.15 Nmm/°, P < 0.05). The cause of failure was pullout of the distal screw. At 2 wk after stabilization, radiological analysis showed that most bones were partly bridged. At 5 wk, all bones showed radiological union. Accordingly, biomechanical analyses revealed a significantly higher torsional stiffness after 5 wk than after 2 wk. Successful healing was indicated by a torsional stiffness of 90% of that of the contralateral control femora. Histological analyses showed new woven bone bridging the osteotomy without external callus formation and in the absence of any cartilaginous tissue, indicating intramembranous healing. Conclusion: With the model introduced herein we report, for the first time, successful metaphyseal bone repair in mice. The model may be used to obtain deeper insights into the molecular mechanisms of metaphyseal fracture healing.

Relevance: 20.00%

Abstract:

Background: In-depth investigations of crash risks inform prevention and safety promotion programmes. Traditionally, such investigations are conducted using exposure-controlled or case-control methodologies. However, these studies need either observational data for control cases or exogenous exposure data, such as vehicle-kilometres travelled, entry flow, or the product of conflicting flows, for a particular traffic location or site. These data are not readily available and often require extensive system-wide data collection. Aim: The objective of this research is to propose an alternative methodology for investigating the crash risks of a road user group in different circumstances using readily available traffic police crash data. Methods: This study employs a combination of a log-linear model and the quasi-induced exposure technique to estimate the crash risks of a road user group. While the log-linear model reveals the significant interactions, and thus the prevalence of crashes of a road user group, under various sets of traffic, environmental and roadway factors, the quasi-induced exposure technique estimates the relative exposure of that road user group for the same set of explanatory variables. The combination of these two techniques therefore provides relative measures of crash risk under various roadway, environmental and traffic conditions. The proposed methodology is illustrated using five years of Brisbane motorcycle crash data. Results: Interpretation of the results for different combinations of interacting factors shows that the poor conspicuity of motorcycles is a predominant cause of motorcycle crashes. The inability of other drivers to correctly judge the speed and distance of an oncoming motorcyclist is also evident in right-of-way-violation motorcycle crashes at intersections. Discussion and Conclusions: The combination of a log-linear model and the quasi-induced exposure technique is a promising methodology and can be applied to better estimate the crash risks of other road users. This study also highlights the importance of considering interaction effects to better understand hazardous situations. A further study comparing the proposed methodology with the case-control method would be useful.
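The quasi-induced exposure step can be sketched with a toy example: not-at-fault parties in two-unit crashes approximate the exposed population, so a group's relative crash risk under a condition is its share of at-fault involvements divided by its share of not-at-fault involvements. The records below are invented for illustration, not the Brisbane data analysed in the paper.

```python
from collections import Counter

# Toy quasi-induced exposure calculation over invented crash records.
records = [
    # (road_user, role, lighting_condition)
    ("motorcycle", "at_fault", "dark"), ("motorcycle", "not_at_fault", "dark"),
    ("motorcycle", "at_fault", "dark"), ("motorcycle", "at_fault", "day"),
    ("car", "at_fault", "dark"),        ("car", "not_at_fault", "dark"),
    ("car", "not_at_fault", "dark"),    ("car", "at_fault", "day"),
    ("car", "not_at_fault", "day"),     ("motorcycle", "not_at_fault", "day"),
]

def relative_risk(user: str, condition: str) -> float:
    """Share of at-fault involvements / share of not-at-fault involvements."""
    at, not_at = Counter(), Counter()
    for u, role, cond in records:
        if cond != condition:
            continue
        (at if role == "at_fault" else not_at)[u] += 1
    at_share = at[user] / max(sum(at.values()), 1)
    exp_share = not_at[user] / max(sum(not_at.values()), 1)
    return at_share / exp_share if exp_share else float("inf")
```

In the combined methodology, a log-linear (Poisson) model fitted to the same cross-classified counts would additionally test which interactions of roadway, environmental and traffic factors are statistically significant.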

Relevance: 20.00%

Abstract:

Divergence dating studies, which combine temporal data from the fossil record with branch length data from molecular phylogenetic trees, represent a rapidly expanding approach to understanding the history of life. The National Evolutionary Synthesis Center hosted the first Fossil Calibrations Working Group (3–6 March 2011, Durham, NC, USA), bringing together palaeontologists, molecular evolutionists and bioinformatics experts to present perspectives from the disciplines that generate, model and use fossil calibration data. Presentations and discussions focused on channels for interdisciplinary collaboration, best practices for justifying, reporting and using fossil calibrations, and roadblocks to the synthesis of palaeontological and molecular data. Bioinformatics solutions were proposed, with the primary objective being a new database of vetted fossil calibrations with linkages to existing resources, targeted for a 2012 launch.

Relevance: 20.00%

Abstract:

For over half a century, it has been known that the rate of morphological evolution appears to vary with the time frame of measurement. Rates of microevolutionary change, measured between successive generations, were found to be far higher than rates of macroevolutionary change inferred from the fossil record. More recently, it has been suggested that rates of molecular evolution are also time dependent, with the estimated rate depending on the timescale of measurement. This followed surprising observations that estimates of mutation rates, obtained in studies of pedigrees and laboratory mutation-accumulation lines, exceeded long-term substitution rates by an order of magnitude or more. Although a range of studies have provided evidence for such a pattern, the hypothesis remains relatively contentious. Furthermore, there is ongoing discussion about the factors that can cause molecular rate estimates to be dependent on time. Here we present an overview of our current understanding of time-dependent rates. We provide a summary of the evidence for time-dependent rates in animals, bacteria and viruses. We review the various biological and methodological factors that can cause rates to be time dependent, including the effects of natural selection, calibration errors, model misspecification and other artefacts. We also describe the challenges in calibrating estimates of molecular rates, particularly on the intermediate timescales that are critical for an accurate characterization of time-dependent rates. This has important consequences for the use of molecular-clock methods to estimate timescales of recent evolutionary events.