1000 results for McGill Model
Abstract:
The existence of standards is a key issue for economic development and for the performance of organizations. Because their definition and control are a source of power, it is important to understand the concept and to examine the representations it authorizes, which give standards their direction and legitimacy. The difficulties classical microeconomics faces in establishing a theory of standardisation compatible with its fundamental axioms are underlined. We propose to reconsider the problem from the opposite direction: questioning the theoretical base by reformulating the assumptions about the autonomy of actors' choices. The theory of conventions offers both a theoretical framework and tools for understanding the systemic dimension and dynamic structure of standards, seen as a special case of conventions. This work thus aims to provide a sound basis for, and promote greater awareness in, the development of global project management standards, and to underline that social construction is not a matter of copyright but of open minds, a collective cognitive process and freedom for the common wealth.
Abstract:
Here we present a sequential Monte Carlo (SMC) algorithm that can be used for any one-at-a-time Bayesian sequential design problem in the presence of model uncertainty where discrete data are encountered. Our focus is on adaptive design for model discrimination, but the methodology is also applicable to other design objectives such as parameter estimation or prediction. An SMC algorithm is run in parallel for each model, and the algorithm relies on a convenient estimator of the evidence of each model that is essentially a function of importance sampling weights. Other methods for this task, such as quadrature, which is often used in design, suffer from the curse of dimensionality. Approximating posterior model probabilities in this way allows us to use model discrimination utility functions derived from information theory that were previously difficult to compute except for conjugate models. A major benefit of the algorithm is that it requires very little problem-specific tuning. We demonstrate the methodology on three applications, including discriminating between models for decline in motor neuron numbers in patients suffering from neurological diseases such as motor neuron disease.
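A minimal sketch of this kind of SMC-based sequential design loop for model discrimination, written under illustrative assumptions: two toy models for a binary response, a mutual-information-style utility, and evidence estimated from the running importance weights. The models, priors, and data-generating process are placeholders, and the resample/move steps of a full SMC are omitted for brevity.

```python
# Sketch of one-at-a-time sequential design for model discrimination with SMC.
import numpy as np

rng = np.random.default_rng(0)
N = 2000                                 # particles per model
designs = np.linspace(0.0, 10.0, 21)     # candidate design points

def prob_m0(theta, d):                   # M0: constant response probability
    return np.clip(theta[:, 0], 1e-6, 1 - 1e-6)

def prob_m1(theta, d):                   # M1: logistic dose-response
    return 1.0 / (1.0 + np.exp(-(theta[:, 0] + theta[:, 1] * d)))

models = [
    {"prob": prob_m0, "theta": rng.uniform(0, 1, (N, 1))},
    {"prob": prob_m1, "theta": rng.normal(0, 1, (N, 2))},
]
for m in models:
    m["w"] = np.full(N, 1.0 / N)         # normalised particle weights
    m["logZ"] = 0.0                      # running log-evidence estimate

def model_probs():
    logZ = np.array([m["logZ"] for m in models])
    z = np.exp(logZ - logZ.max())
    return z / z.sum()

def utility(d):
    # Mutual information between the model indicator and the next observation,
    # approximated from the particle predictive distributions.
    pm = model_probs()
    pred = np.array([np.sum(m["w"] * m["prob"](m["theta"], d)) for m in models])
    marg = np.sum(pm * pred)
    u = 0.0
    for y in (0, 1):
        py_m = pred if y == 1 else 1 - pred
        py = marg if y == 1 else 1 - marg
        u += np.sum(pm * py_m * np.log(py_m / py))
    return u

def simulate(d):
    # Assumed data-generating process, used only to simulate observations.
    return rng.random() < 1.0 / (1.0 + np.exp(-(-3.0 + 0.8 * d)))

for step in range(15):
    d = max(designs, key=utility)        # next one-at-a-time design point
    y = simulate(d)
    for m in models:
        like = m["prob"](m["theta"], d)
        like = like if y else 1 - like
        m["logZ"] += np.log(np.sum(m["w"] * like))   # evidence from the weights
        m["w"] *= like
        m["w"] /= m["w"].sum()
    print(f"step {step:2d}  d={d:4.1f}  y={int(y)}  P(M|data)={model_probs().round(3)}")
```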
Abstract:
Prefabricated construction is regarded by many as an effective and efficient approach to improving construction processes and productivity, ensuring construction quality and reducing time and cost in the construction industry. However, many problems occur with this approach in practice, including higher risk levels and cost or time overruns. To address these problems, it is proposed that the IKEA model from the manufacturing industry and VP technology be introduced into the prefabricated construction process. The concept of the IKEA model is identified in detail and VP technology is briefly introduced. In conjunction with VP technology, the applications of the IKEA model are presented in detail, i.e. design optimization, production optimization and installation optimization. Furthermore, through a case study of a prefabricated hotel project in Hong Kong, it is shown that the VP-based IKEA model can improve the efficiency and safety of prefabricated construction as well as reduce cost and time.
Abstract:
Airports represent the epitome of complex systems, with multiple stakeholders, multiple jurisdictions and complex interactions between many actors. The large number of existing models that capture different aspects of the airport is a testament to this. However, these existing models do not systematically consider modelling requirements, nor how stakeholders such as airport operators or airlines would use them. This can detrimentally affect the verification and validation of models and makes it difficult to develop extensible and reusable modelling tools. This paper develops from the Concept of Operations (CONOPS) framework a methodology to help structure the review and development of modelling capabilities and usage scenarios. The method is applied to a review of existing airport terminal passenger models. It is found that existing models can be broadly categorised according to four usage scenarios: capacity planning, operational planning and design, security policy and planning, and airport performance review. The models, the performance metrics they evaluate and their usage scenarios are discussed. Capacity and operational planning models predominantly focus on performance metrics such as waiting time, service time and congestion, whereas performance review models attempt to link these to passenger satisfaction outcomes. Security policy models, on the other hand, focus on probabilistic risk assessment. However, there is an emerging need to capture trade-offs between multiple criteria such as security and processing time. Based on the CONOPS framework and the literature findings, guidance is provided for the development of future airport terminal models.
Abstract:
This thesis provides a query model suitable for context sensitive access to a wide range of distributed linked datasets which are available to scientists using the Internet. The model is designed based on scientific research standards which require scientists to provide replicable methods in their publications. Although there are query models available that provide limited replicability, they do not contextualise the process whereby different scientists select dataset locations based on their trust and physical location. In different contexts, scientists need to perform different data cleaning actions, independent of the overall query, and the model was designed to accommodate this function. The query model was implemented as a prototype web application and its features were verified through its use as the engine behind a major scientific data access site, Bio2RDF.org. The prototype showed that it was possible to have context sensitive behaviour for each of the three mirrors of Bio2RDF.org using a single set of configuration settings. The prototype provided executable query provenance that could be attached to scientific publications to fulfil replicability requirements. The model was designed to make it simple to independently interpret and execute the query provenance documents using context specific profiles, without modifying the original provenance documents. Experiments using the prototype as the data access tool in workflow management systems confirmed that the design of the model made it possible to replicate results in different contexts with minimal additions, and no deletions, to query provenance documents.
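An illustrative sketch of the core idea described above: re-executing an unchanged query provenance document under context-specific profiles that resolve logical dataset names to different mirror endpoints and apply context-specific cleaning. The class names, endpoint URLs and profile format here are hypothetical; the prototype's actual provenance and profile formats are not reproduced.

```python
# Sketch: context-specific profiles interpret a provenance document without modifying it.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class ProvenanceQuery:
    dataset: str            # logical dataset name, e.g. "geneid"
    query: str              # query text recorded at execution time

@dataclass
class Profile:
    name: str
    endpoints: dict         # dataset -> preferred endpoint for this context
    cleaning: list = field(default_factory=list)   # context-specific cleaning steps

def execute(prov: ProvenanceQuery, profile: Profile):
    endpoint = profile.endpoints.get(prov.dataset)
    if endpoint is None:
        raise LookupError(f"{profile.name} has no endpoint for {prov.dataset}")
    cleaned = prov.query
    for step in profile.cleaning:   # data cleaning is applied per context, not per query
        cleaned = step(cleaned)
    # A real implementation would send the query to the endpoint; here we only
    # report what would be executed, so the provenance document stays untouched.
    return f"[{profile.name}] {prov.dataset} @ {endpoint}: {cleaned}"

prov = ProvenanceQuery("geneid", "SELECT * WHERE { ?s ?p ?o } LIMIT 10")
mirror_a = Profile("mirror-a", {"geneid": "http://example.org/a/sparql"})
mirror_b = Profile("mirror-b", {"geneid": "http://example.org/b/sparql"}, cleaning=[str.strip])
print(execute(prov, mirror_a))
print(execute(prov, mirror_b))
```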
Abstract:
We present a porous medium model of the growth and deterioration of the viable sublayers of an epidermal skin substitute. The model comprises five species: cells, intracellular and extracellular calcium, tight junctions, and a hypothesised signal chemical emanating from the stratum corneum. It is solved numerically in Matlab using a finite difference scheme. The predicted steady-state calcium distributions agree well with experimental data. The model also demonstrates deterioration of the epidermal skin substitute when the calcium diffusion coefficient is reduced below values reported in the literature.
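A minimal sketch of the numerical approach, not the paper's five-species model: an explicit finite difference solution of a one-dimensional diffusion-uptake equation for calcium across the viable layers, written in Python rather than Matlab. The geometry, boundary conditions and parameter values below are illustrative placeholders, not the reported ones.

```python
# Explicit finite-difference sketch of 1-D calcium diffusion with first-order uptake.
import numpy as np

L, nx = 50e-6, 101               # depth of the viable layers (m), grid points (assumed)
dx = L / (nx - 1)
D = 1e-10                        # calcium diffusion coefficient (m^2/s), placeholder
dt = 0.4 * dx**2 / D             # satisfies the explicit stability limit D*dt/dx^2 <= 0.5
k_uptake = 0.01                  # illustrative first-order uptake rate (1/s)
source = 1.0                     # fixed calcium concentration at the basal boundary

c = np.zeros(nx)
for _ in range(20000):
    lap = (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2
    c_new = c + dt * (D * lap - k_uptake * c)
    c_new[0] = source            # Dirichlet condition at the basal layer
    c_new[-1] = c_new[-2]        # zero-flux condition at the stratum corneum side
    c = c_new

print("near-steady calcium profile (first 5 nodes):", c[:5].round(4))
```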
Abstract:
For further noise reduction in the future, traffic management that controls traffic flow and physical distribution is important. To apply traffic management measures effectively, a model is needed that predicts traffic flow across the citywide road network. For this purpose, the existing model AVENUE was used as a macro-traffic flow prediction model. This traffic flow model was integrated with a road-vehicle sound power model to establish a new road traffic noise prediction model. With this prediction model, a noise map of the entire city can be produced. In this study, the change in traffic flow on the road network after new roads were built was first estimated, and the resulting change in road traffic noise was predicted. The results show that the prediction model can estimate how the noise map changes under traffic management measures. In addition, the macro-traffic flow model was combined with our conventional micro-traffic flow model, expanding the coverage of the noise prediction model.
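As a rough sketch of how link-level traffic volumes from a flow model can be combined with per-vehicle sound power levels to estimate receiver noise, the snippet below energy-sums the contributions of two hypothetical road links. The attenuation model (free-field spherical spreading) and all numerical values are simplifying assumptions, not the paper's AVENUE-based formulation.

```python
# Toy noise-level estimate from traffic volumes and vehicle sound power levels.
import math

def link_level(volume_per_hour, l_w, distance_m):
    """A-weighted level at the receiver from one road link (rough point-source sum)."""
    # sound power level -> sound pressure level at distance (spherical spreading)
    l_p = l_w - 20 * math.log10(distance_m) - 11
    # energy sum over the hourly vehicle passages, averaged over one hour
    return 10 * math.log10(volume_per_hour * 10 ** (l_p / 10) / 3600)

links = [
    # (vehicles/hour from the traffic model, sound power level in dB, distance in m)
    (1200, 95.0, 30.0),   # existing arterial (illustrative)
    (400,  98.0, 80.0),   # new road added to the network (illustrative)
]
total = 10 * math.log10(sum(10 ** (link_level(*lk) / 10) for lk in links))
print(f"estimated equivalent level at receiver: {total:.1f} dB")
```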
Abstract:
Reliable automatic detection of anomalous human behaviour is one of the goals of research on smart surveillance systems. Automatic detection addresses several human-factor issues that limit existing surveillance systems. Creating such a detection system requires contextual information, because context is needed to interpret human behaviour correctly. Unfortunately, the use of contextual information is still limited in existing approaches to anomalous human behaviour detection. This paper proposes a context space model which has two benefits: (a) it provides guidelines for system designers to select the information used to describe context; and (b) it enables a system to distinguish between different contexts. A comparative analysis is conducted between a context-based system that employs the proposed context space model and a system implemented using one of the existing approaches. The comparison is applied to a scenario constructed from video clips in the CAVIAR dataset. The results show that the context-based system outperforms the other system, because the context space model allows the system to consider only knowledge learned in the relevant context.
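A minimal sketch of the context space idea under illustrative assumptions (the paper's own context dimensions are not reproduced): each context is a region in a space of named attributes, and learned knowledge is reused only when the current observation falls inside the same region.

```python
# Sketch: contexts as regions in an attribute space; knowledge is reused per context.
from dataclasses import dataclass

@dataclass(frozen=True)
class Context:
    name: str
    location: str                 # e.g. "corridor", "entrance" (assumed dimensions)
    time_of_day: range            # hours during which this context applies
    max_crowd: int                # crowd-density bound for this context

    def contains(self, location: str, hour: int, crowd: int) -> bool:
        return (location == self.location
                and hour in self.time_of_day
                and crowd <= self.max_crowd)

contexts = [
    Context("quiet corridor, daytime", "corridor", range(8, 18), 5),
    Context("busy entrance, daytime", "entrance", range(8, 18), 50),
]

def select_context(location, hour, crowd):
    for c in contexts:
        if c.contains(location, hour, crowd):
            return c
    return None                    # unknown context: fall back to generic behaviour models

obs = ("corridor", 22, 1)          # observation at night in a corridor
print(select_context(*obs))        # -> None: daytime-corridor knowledge is not reused
```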
Abstract:
With the advent of social web initiatives, some have argued that these emerging tools may support tacit knowledge sharing by providing interactive and collaborative technologies. However, the literature on how, and to what extent, social media can facilitate tacit knowledge sharing remains scarce. This paper therefore theoretically investigates and maps social media concepts and characteristics against the requirements of tacit knowledge creation and sharing. A systematic literature review identified five major requirements that must be present in an environment that supports tacit knowledge sharing. These requirements were analyzed against social media concepts and characteristics to see how they map together. The results show that social media can satisfy some of the main requirements of tacit knowledge sharing. The relationships are illustrated in a conceptual framework, and further empirical studies are suggested to validate the findings of this study.
Abstract:
This article examines the current transfer pricing regime to consider whether it is a sound model to apply to modern multinational entities. The arm's length price methodology is examined to enable a discussion of the arguments in favour of such a regime. The article then refutes these arguments, concluding that applying arm's length rules involves the legal fiction of imagining transactions between unrelated parties, contrary to the very reason multinational entities exist. Multinational entities exist to operate in ways that independent entities would not, which the arm's length rules fail to take into account. As such, there is clearly an air of artificiality in applying the arm's length standard. To demonstrate this artificiality with respect to modern multinational entities, multinational banks are used as an example. The article concludes that the separate entity paradigm adopted by the traditional transfer pricing regime is incongruous with the economic theory of modern multinational enterprises.
Abstract:
This paper presents a behavioral car-following model based on empirical trajectory data that is able to reproduce the spontaneous formation and ensuing propagation of stop-and-go waves in congested traffic. By analyzing individual drivers’ car-following behavior throughout oscillation cycles it is found that this behavior is consistent across drivers and can be captured by a simple model. The statistical analysis of the model’s parameters reveals that there is a strong correlation between driver behavior before and during the oscillation, and that this correlation should not be ignored if one is interested in microscopic output. If macroscopic outputs are of interest, simulation results indicate that an existing model with fewer parameters can be used instead. This is shown for traffic oscillations caused by rubbernecking as observed in the US 101 NGSIM dataset. The same experiment is used to establish the relationship between rubbernecking behavior and the period of oscillations.
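A minimal sketch of how a simple car-following rule with heterogeneous driver parameters reproduces a propagating disturbance, not the paper's calibrated behavioral model: here a Newell-type rule is used, each driver gets an individual multiplier on the desired spacing, and a brief leader slowdown (standing in for rubbernecking) seeds the wave. All parameter values are illustrative.

```python
# Newell-type car-following sketch with per-driver spacing parameters.
import numpy as np

dt, T = 0.1, 120.0                 # time step (s), horizon (s)
n_veh = 30
tau, delta = 1.2, 7.0              # wave trip time (s) and jam spacing (m), assumed
v_free = 20.0                      # free-flow speed (m/s)

steps = int(T / dt)
lag = int(round(tau / dt))         # number of steps corresponding to tau
x = np.zeros((steps, n_veh))
x[0] = -np.arange(n_veh) * 40.0    # initial positions, 40 m apart

def leader_speed(t):
    # the leader briefly slows down, seeding the oscillation (e.g. rubbernecking)
    return 8.0 if 20.0 < t < 30.0 else v_free

# per-driver multiplier on the desired spacing; larger values behave more timidly
eta = np.random.default_rng(1).uniform(0.8, 1.4, n_veh)

for k in range(steps - 1):
    t = k * dt
    x[k + 1, 0] = x[k, 0] + leader_speed(t) * dt
    for i in range(1, n_veh):
        free = x[k, i] + v_free * dt
        if k + 1 - lag >= 0:
            # Newell's rule: follow the leader's trajectory shifted by (tau, eta*delta)
            congested = x[k + 1 - lag, i - 1] - eta[i] * delta
            x[k + 1, i] = min(free, congested)
        else:
            x[k + 1, i] = free

speeds = np.diff(x[:, -1]) / dt
print(f"last vehicle min/max speed: {speeds.min():.1f} / {speeds.max():.1f} m/s")
```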
Abstract:
Encryption is a well-established technology for protecting sensitive data. However, once encrypted, the data can no longer be easily queried, and database performance depends on how the sensitive data are encrypted. In this paper we review conventional encryption methods that support only partial querying and propose an encryption method for numerical data that can be queried effectively. The proposed system includes the design of the service scenario and the metadata.
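The abstract does not describe the scheme itself, so the sketch below illustrates one common way to make encrypted numerical columns queryable, not the paper's method: a coarse plaintext bucket index is stored next to each ciphertext, range predicates are evaluated on the buckets, and the small candidate set is decrypted and filtered exactly. It uses the third-party `cryptography` package for the symmetric cipher; bucket width and data are illustrative.

```python
# Bucketized range queries over encrypted numerical values (illustrative only).
from cryptography.fernet import Fernet

BUCKET_WIDTH = 100                      # coarser buckets leak less but filter more

key = Fernet.generate_key()
cipher = Fernet(key)

def encrypt_value(value: int):
    # ciphertext plus a plaintext bucket index used only for query routing
    return {"bucket": value // BUCKET_WIDTH,
            "ct": cipher.encrypt(str(value).encode())}

def range_query(rows, low: int, high: int):
    lo_b, hi_b = low // BUCKET_WIDTH, high // BUCKET_WIDTH
    # server-side step: select candidate rows by bucket, without decrypting anything
    candidates = [r for r in rows if lo_b <= r["bucket"] <= hi_b]
    # client-side step: decrypt the candidates and apply the exact predicate
    return [v for v in (int(cipher.decrypt(r["ct"])) for r in candidates)
            if low <= v <= high]

table = [encrypt_value(v) for v in (42, 150, 151, 870, 905, 1200)]
print(range_query(table, 100, 900))     # -> [150, 151, 870]
```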
Abstract:
Background: Despite the increasing clinical problems with metaphyseal fractures, most experimental studies investigate the healing of diaphyseal fractures. Although the mouse would be the preferable species in which to study the molecular and genetic aspects of metaphyseal fracture healing, a murine model does not yet exist. Using a special locking plate system, we herein introduce a new model that allows the analysis of metaphyseal bone healing in mice. Methods: In 24 CD-1 mice the distal metaphysis of the femur was osteotomized. After stabilization with the locking plate, bone repair was analyzed radiologically, biomechanically, and histologically after 2 wk (n = 12) and 5 wk (n = 12). Additionally, the stiffness of the bone-implant construct was tested biomechanically ex vivo. Results: The torsional stiffness of the bone-implant construct was low compared with nonfractured control femora (0.23 ± 0.1 Nmm/° versus 1.78 ± 0.15 Nmm/°, P < 0.05). The cause of failure was pullout of the distal screw. At 2 wk after stabilization, radiological analysis showed that most bones were partly bridged. At 5 wk, all bones showed radiological union. Accordingly, biomechanical analyses revealed a significantly higher torsional stiffness after 5 wk than after 2 wk. Successful healing was indicated by a torsional stiffness of 90% of that of the contralateral control femora. Histological analyses showed new woven bone bridging the osteotomy without external callus formation and in the absence of any cartilaginous tissue, indicating intramembranous healing. Conclusion: With the model introduced herein we report, for the first time, successful metaphyseal bone repair in mice. The model may be used to obtain deeper insights into the molecular mechanisms of metaphyseal fracture healing.
Abstract:
Prevention and safety promotion programmes. Traditionally, in-depth investigations of crash risks are conducted using exposure-controlled or case-control study methodologies. However, these studies require either observational data for control cases or exogenous exposure data, such as vehicle-kilometres travelled, entry flow, or the product of conflicting flows for a particular traffic location or site. These data are not readily available and often require extensive system-wide data collection effort. Aim: The objective of this research is to propose an alternative methodology for investigating the crash risks of a road user group in different circumstances using readily available traffic police crash data. Methods: This study employs a combination of a log-linear model and the quasi-induced exposure technique to estimate crash risks of a road user group. While the log-linear model reveals the significant interactions, and thus the prevalence of crashes of a road user group, under various sets of traffic, environmental and roadway factors, the quasi-induced exposure technique estimates the relative exposure of that road user group under the same set of explanatory variables. The combination of the two techniques therefore provides relative measures of crash risk under various roadway, environmental and traffic conditions. The proposed methodology is illustrated using five years of Brisbane motorcycle crash data. Results: Interpretation of the results for different combinations of interacting factors shows that poor conspicuity of motorcycles is a predominant cause of motorcycle crashes. The inability of other drivers to correctly judge the speed and distance of an oncoming motorcyclist is also evident in right-of-way-violation motorcycle crashes at intersections. Discussion and Conclusions: The combination of a log-linear model and the induced exposure technique is a promising methodology and can be applied to better estimate the crash risks of other road users. This study also highlights the importance of considering interaction effects to better understand hazardous situations. A further study comparing the proposed methodology with the case-control method would be useful.
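A minimal sketch of the two ingredients combined above, using a toy cross-classified crash table rather than the Brisbane data: a Poisson log-linear model surfaces factor interactions in crash counts, and the quasi-induced exposure idea treats not-at-fault involvements as an exposure proxy so that the at-fault/not-at-fault ratio gives a relative risk per condition. The factor levels and counts are invented for illustration; the snippet requires pandas and statsmodels.

```python
# Log-linear model plus quasi-induced exposure on a toy crash-count table.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# crash counts by lighting condition and road type, split by whether the
# motorcyclist was the at-fault ("induced") or not-at-fault party
data = pd.DataFrame({
    "light":    ["day", "day", "night", "night"] * 2,
    "road":     ["intersection", "midblock"] * 4,
    "at_fault": ["yes"] * 4 + ["no"] * 4,
    "count":    [120, 60, 90, 80, 100, 70, 40, 50],
})

# Poisson log-linear model with all two-way interactions: significant
# interaction terms indicate where crashes are over-represented
model = smf.glm("count ~ (C(light) + C(road) + C(at_fault))**2",
                data=data, family=sm.families.Poisson()).fit()
print(model.summary().tables[1])

# quasi-induced exposure: not-at-fault involvements proxy for exposure, so the
# at-fault / not-at-fault ratio gives a relative crash risk for each condition
pivot = data.pivot_table(index=["light", "road"], columns="at_fault",
                         values="count", aggfunc="sum")
print((pivot["yes"] / pivot["no"]).rename("relative_risk"))
```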