204 results for Multi-model inference
Abstract:
While the information systems (IS) function has gained widespread attention for over two decades, there is little consensus among IS researchers and practitioners on how best to evaluate the IS function's support performance. This paper reports on the preliminary findings of a larger research effort that proceeds from a central interest in the importance of evaluating the IS function's support in organisations. This study is the first to re-conceptualise the evaluation of the IS function's support as a multi-dimensional formative construct. We argue that a holistic measure for evaluating the IS function's support should consist of dimensions that together assess the variety of the support functions and the quality of the support services provided to end-users. Thus, the proposed model consists of two halves, "Variety" and "Quality", within which reside seven dimensions. The Variety half includes five dimensions: Training; Documentation; Data-related Support; Software-related Support; and Hardware-related Support. The Quality half includes two dimensions: IS Support Staff and Support Services Performance. The proposed model is derived from a directed content analysis of 83 studies from top IS outlets, employing the characteristics of analytic theory and consistent with formative construct development procedures.
Abstract:
Minimizing the complexity of group key exchange (GKE) protocols is an important milestone towards their practical deployment. An interesting approach to achieving this goal is to simplify the design of GKE protocols by using generic building blocks. In this paper we investigate the possibility of founding GKE protocols on a primitive called a multi key encapsulation mechanism (mKEM) and describe the advantages and limitations of this approach. In particular, we show how to design a one-round GKE protocol which satisfies the classical requirement of authenticated key exchange (AKE) security, yet without forward secrecy. As a result, we obtain the first one-round GKE protocol secure in the standard model. We also conduct our analysis using recent formal models that take into account both outsider and insider attacks, as well as the notion of key compromise impersonation resilience (KCIR). In contrast to previous models, we show how to model both outsider and insider KCIR within the definition of mutual authentication. Our analysis additionally implies that the insider security compiler by Katz and Shin from ACM CCS 2005 can be used to achieve more than what is shown in the original work, namely both outsider and insider KCIR.
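The construction itself is only summarised above; as a toy illustration of what a one-round, mKEM-based key agreement can look like, a minimal Python sketch follows. The ToyMKEM is an insecure placeholder (no actual encryption happens) and the key-combination step is our own invention for illustration, not the protocol from the paper.

```python
# Toy sketch of a one-round group key exchange built from an mKEM.
# This is NOT the paper's protocol and offers NO security: the "mKEM"
# below is a placeholder so the one-round message flow can be seen end to end.
import hashlib
import os

class ToyMKEM:
    """Stand-in for a multi KEM: one encapsulation addressed to many recipients."""
    @staticmethod
    def encaps(recipient_ids):
        key = os.urandom(16)                      # the encapsulated symmetric key
        ct = {rid: key for rid in recipient_ids}  # placeholder "ciphertext" per recipient
        return ct, key

    @staticmethod
    def decaps(ct, rid):
        return ct[rid]

def one_round_gke(party_ids):
    # Round 1 (the only round): every party broadcasts one mKEM encapsulation
    # addressed to all other parties.
    broadcasts, local_keys = {}, {}
    for pid in party_ids:
        others = [q for q in party_ids if q != pid]
        broadcasts[pid], local_keys[pid] = ToyMKEM.encaps(others)

    # Each party decapsulates every other party's message and hashes all
    # contributions (in a canonical order) into the session key.
    session_keys = {}
    for pid in party_ids:
        h = hashlib.sha256()
        for sender in sorted(party_ids):
            k = local_keys[sender] if sender == pid else ToyMKEM.decaps(broadcasts[sender], pid)
            h.update(sender.encode() + k)
        session_keys[pid] = h.hexdigest()
    return session_keys

keys = one_round_gke(["alice", "bob", "carol"])
assert len(set(keys.values())) == 1   # all parties agree on the session key
```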
Abstract:
This paper presents a novel study that aims to contribute to understanding the phenomenon of Enterprise Systems (ES) evaluation in Australasian universities. The proposed study addresses known limitations of arguably the most significant dependent variable in the Information Systems (IS) field - IS Success, or IS-Impact. This study adopts the IS-Impact measurement model, reported by Gable et al. (2008), as the primary commencing theory base and applies the research extension strategy described by Berthon et al. (2002), extending both the theory and the context. This study employs a longitudinal, multi-method research design with two interrelated phases – exploratory and confirmatory. The exploratory phase aims to investigate the applicability and sufficiency of the IS-Impact dimensions and measures in the new context. The confirmatory phase will gather quantitative data to statistically validate the IS-Impact model as a formative index.
Abstract:
Shell structures find use in many fields of engineering, notably the structural, mechanical, aerospace and nuclear-reactor disciplines. Axisymmetric shell structures are used as dome-type roofs, hyperbolic cooling towers, silos for the storage of grain, oil and industrial chemicals, and water tanks. Despite their thin walls, these structures derive strength from their curvature. The generally high strength-to-weight ratio of the shell form, combined with its inherent stiffness, has formed the basis of this vast application. With advances in computing technology, the finite element method and optimisation techniques, structural engineers have extremely versatile tools for the optimum design of such structures. Optimisation of shell structures can result not only in improved designs, but also in a large saving of material. The finite element method, being a general numerical procedure that can treat any shell problem to any desired degree of accuracy, requires several runs in order to obtain a complete picture of the effect of one parameter on the shell structure. This redesign/re-analysis cycle has been achieved via structural optimisation in the present research, and MSC/NASTRAN (a commercially available finite element code) has been used in this context for volume optimisation of axisymmetric shell structures under axisymmetric and non-axisymmetric loading conditions. A parametric study of different axisymmetric shell structures has revealed that the hyperbolic shape is the most economical solution for shells of revolution. To establish this, axisymmetric loading (self-weight and hydrostatic pressure) and non-axisymmetric loading (wind pressure and earthquake dynamic forces) have been modelled in a graphical pre- and post-processor (PATRAN) and analysed with two finite element codes (ABAQUS and NASTRAN); numerical model verification studies have been performed; and the optimum material volume required in the walls of cylindrical, conical, parabolic and hyperbolic forms of axisymmetric shell structures has been evaluated and reviewed. Free vibration and transient earthquake analyses of hyperbolic shells were then performed once it was established that the hyperbolic shape is the most economical under all the loading conditions considered. The effects of the important parameters of hyperbolic shell structures (shell wall thickness, height and curvature) have been evaluated, and empirical relationships have been developed to estimate an approximate value of the lowest (first) natural frequency of vibration. The outcome of this thesis has been the generation of new research information on the performance characteristics of axisymmetric shell structures that will facilitate improved designs of shells with a better choice of shapes and enhanced levels of economy and performance. Keywords: axisymmetric shell structures; finite element analysis; volume optimisation; free vibration; transient response.
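As a small, purely geometric illustration of one ingredient of such a study, the sketch below computes wall material volume (surface area of revolution times thickness) for cylindrical, conical, parabolic and hyperbolic meridians. The dimensions are invented, and real volume optimisation as described above also enforces stress and stability constraints inside the finite element code, which this sketch does not attempt.

```python
# Wall material volume of axisymmetric shells: surface area x thickness.
# Meridian shapes r(z) and all dimensions below are illustrative only.
import numpy as np

H, t = 30.0, 0.2   # shell height (m) and wall thickness (m)

def wall_volume(r_of_z):
    """Integrate 2*pi*r*sqrt(1 + (dr/dz)^2) dz over the height, then scale by t."""
    z = np.linspace(0.0, H, 2001)
    r = r_of_z(z)
    drdz = np.gradient(r, z)
    integrand = 2.0 * np.pi * r * np.sqrt(1.0 + drdz**2)
    area = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(z))  # trapezoid rule
    return area * t

shapes = {
    "cylindrical": lambda z: np.full_like(z, 10.0),
    "conical":     lambda z: 12.0 - (4.0 / H) * z,
    "parabolic":   lambda z: 12.0 - 4.0 * (z / H) ** 2,
    "hyperbolic":  lambda z: np.sqrt(8.0**2 + (6.0 * (z - 0.6 * H) / H) ** 2),
}
for name, f in shapes.items():
    print(f"{name:12s} {wall_volume(f):9.1f} m^3")
```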
Abstract:
Open access reforms to railway regulations allow multiple train operators to provide rail services on a common infrastructure. As railway operations are now independently managed by different stakeholders, conflicts in operations may arise, and there have been attempts to derive an effective access charge regime through which these conflicts may be resolved. One approach is direct negotiation between the infrastructure manager and the train service providers. Despite the substantial literature on the topic, few studies consider the benefits of employing computer simulation as an evaluation tool for railway operational activities such as access pricing. This article proposes a multi-agent system (MAS) framework for the railway open market and demonstrates its feasibility by modelling the negotiation between an infrastructure provider and a train service operator. Empirical results show that the model is capable of resolving operational conflicts according to market demand.
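As an illustration of the kind of agent interaction such a framework models, here is a minimal Python sketch of a single-issue access-charge negotiation with simple concession rules. The behaviours and figures are invented for illustration and are not the article's model.

```python
# Minimal two-agent access-charge negotiation with fixed concession rates.
# All behaviours and numbers are invented for illustration.

class InfrastructureManager:
    def __init__(self, asking_price, reserve):
        self.offer, self.reserve = asking_price, reserve
    def concede(self):
        self.offer = max(self.reserve, self.offer * 0.95)  # concede 5% per round
        return self.offer

class TrainOperator:
    def __init__(self, bid, budget):
        self.bid, self.budget = bid, budget
    def accepts(self, offer):
        return offer <= self.budget
    def counter(self):
        self.bid = min(self.budget, self.bid * 1.05)       # raise bid 5% per round
        return self.bid

def negotiate(im, op, max_rounds=50):
    for rnd in range(1, max_rounds + 1):
        offer = im.concede()
        if op.accepts(offer):
            return rnd, offer            # deal struck at the IM's offer
        bid = op.counter()
        if bid >= im.reserve:
            return rnd, bid              # deal struck at the operator's bid
    return None                          # no agreement within the deadline

print(negotiate(InfrastructureManager(asking_price=100.0, reserve=70.0),
                TrainOperator(bid=50.0, budget=80.0)))
```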
Abstract:
Many infrastructure and necessity systems, such as electricity and telecommunications in Europe and North America, used to be operated as monopolies, if not state-owned. However, they have now been disintegrated into groups of smaller companies managed by different stakeholders. Railways are no exception. Since the early 1980s, there have been reforms in the shape of restructuring of the national railways in different parts of the world, and continuous refinements are still being made to allow better utilisation of railway resources and quality of service. There has been growing interest in the industry in understanding the impacts of these reforms on operational efficiency and constraints. A number of post-evaluations have been conducted by analysing the performance of the stakeholders in terms of their profits (Crompton and Jupe 2003), quality of train service (Shaw 2001) and engineering operations (Watson 2001). Results from these studies are valuable for future improvement of the system, followed by a new cycle of post-evaluations. However, direct implementation of such changes is often costly and the consequences take a long period of time (e.g. years) to surface. With the advance of fast computing technologies, computer simulation is a cost-effective means of evaluating a hypothetical change in a system prior to actual implementation. For example, simulation suites have been developed to study a variety of traffic control strategies according to sophisticated models of train dynamics, traction and power systems (Goodman, Siu and Ho 1998, Ho and Yeung 2001). Unfortunately, under the restructured railway environment, it is by no means easy to model the complex behaviour of the stakeholders and the interactions between them. The multi-agent system (MAS) is a recently developed modelling technique which may be useful in assisting the railway industry to conduct simulations on the restructured railway system. In a MAS, a real-world entity is modelled as a software agent that is autonomous, reactive to changes, and able to initiate proactive actions and social communicative acts. MAS has been applied in the areas of supply-chain management processes (García-Flores, Wang and Goltz 2000, Jennings et al. 2000a, b) and e-commerce activities (Au, Ngai and Parameswaran 2003, Liu and You 2003), in which the objectives and behaviour of the buyers and sellers are captured by software agents. It is therefore beneficial to investigate the suitability and feasibility of applying agent modelling in railways and the extent to which it might help in developing better resource management strategies. This paper sets out to examine the benefits of using MAS to model the resource management process in railways. Section 2 first describes the business environment after the railway reforms. The problems emerging from the restructuring process are then identified in Section 3. Section 4 describes the realisation of a MAS for railway resource management under the restructured scheme and the feasibility studies expected from the model.
Abstract:
This paper introduces an event-based traffic model for railway systems adopting fixed-block signalling schemes. In this model, the events of trains' arrivals at and departures from signalling blocks constitute the states of the traffic flow. A state transition is equivalent to the progress of a train by one signalling block, and it is realised by referring to past and present states, as well as a number of pre-calculated look-up tables of run times in the signalling block under various signalling conditions. Simulation results are compared with those from a time-based multi-train simulator to study the improvements in processing time and accuracy.
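A minimal sketch of one such state transition, assuming a simplified two-aspect signalling rule and illustrative look-up values (the model described above also conditions on past states), might look as follows in Python:

```python
# One state transition in an event-based fixed-block traffic model: the next
# train advances one block, with its run time read from a pre-calculated
# look-up table keyed by (block, signal aspect). All values are illustrative.

RUN_TIME = {                      # pre-calculated run times (seconds)
    ("B1", "green"): 60.0, ("B1", "yellow"): 85.0,
    ("B2", "green"): 55.0, ("B2", "yellow"): 80.0,
}
NEXT_BLOCK = {"B1": "B2", "B2": "B3"}

def aspect_for(block, occupied):
    # Green unless the block ahead is occupied (simplified two-aspect signalling).
    return "yellow" if NEXT_BLOCK.get(block) in occupied else "green"

def step(state):
    """One transition: the train with the earliest block entry advances one block."""
    occupied = {blk for blk, _ in state.values()}
    train, (block, t_entry) = min(state.items(), key=lambda kv: kv[1][1])
    t_arrive = t_entry + RUN_TIME[(block, aspect_for(block, occupied))]
    state[train] = (NEXT_BLOCK[block], t_arrive)   # the next arrival event
    return train, t_arrive

state = {"T1": ("B2", 0.0), "T2": ("B1", 30.0)}    # train -> (block, entry time)
print(step(state))   # T1 reaches B3; T2's next move will then see B2 clear
```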
Abstract:
A composite line source emission (CLSE) model was developed to quantify exposure levels and describe the spatial variability of vehicle emissions in traffic-interrupted microenvironments. This model took into account the complexity of vehicle movements in the queue, as well as the different emission rates relevant to various driving conditions (cruise, decelerate, idle and accelerate), and it utilised multiple representative segments to capture the accurate emission distribution of real vehicle flow. Hence, the model was able to quickly quantify the time spent in each segment within the considered zone, as well as the composition and position of the requisite segments based on the vehicle fleet information, which not only helped to quantify the enhanced emissions at critical locations, but also helped to define the emission source distribution of the disrupted steady flow for further dispersion modelling. The model was then applied to estimate particle number emissions at a bi-directional bus station used by diesel and compressed natural gas fuelled buses. It was found that the acceleration distance was of critical importance when estimating particle number emissions, since the highest emissions occurred in sections where most of the buses were accelerating, while no significant increases were observed at locations where they idled. It was also shown that emissions at the front end of the platform were 43 times greater than at the rear. Although the CLSE model is intended to be applied in traffic management and transport analysis systems for the evaluation of exposure, as well as for the simulation of vehicle emissions in traffic-interrupted microenvironments, the bus station model can also be used as input for initial source definitions in future dispersion models.
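The segment idea can be illustrated with a short sketch: each segment carries the time a vehicle spends in each driving mode, and segment emissions are mode-specific emission rates weighted by those times and the vehicle count. All rates and times below are hypothetical placeholders, not values from the study.

```python
# Composite line source idea in miniature: the queueing zone is split into
# segments, each with a mix of driving modes; segment emissions are the sum
# of mode-specific emission rates weighted by time spent in each mode.

EMISSION_RATE = {   # particle number emission rate (particles/s), illustrative
    "cruise": 1.0e12, "decelerate": 0.6e12, "idle": 0.3e12, "accelerate": 3.5e12,
}

def segment_emissions(segments, n_vehicles):
    """segments: list of dicts mapping mode -> seconds a vehicle spends in it."""
    totals = []
    for seg in segments:
        per_vehicle = sum(EMISSION_RATE[mode] * t for mode, t in seg.items())
        totals.append(per_vehicle * n_vehicles)
    return totals

# Example: three segments along a bus platform (times in seconds per vehicle).
platform = [
    {"decelerate": 8.0, "idle": 20.0},   # rear of platform: braking, dwelling
    {"idle": 5.0, "accelerate": 4.0},    # middle: pull-away begins
    {"accelerate": 10.0},                # front end: most buses accelerating
]
print(segment_emissions(platform, n_vehicles=30))
```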
Abstract:
The multi-criteria decision-making methods Preference Ranking Organisation METHod for Enrichment Evaluations (PROMETHEE) and Graphical Analysis for Interactive Assistance (GAIA), and the two-way Positive Matrix Factorization (PMF) receptor model, were applied to airborne fine particle compositional data collected at three sites in Hong Kong during two monitoring campaigns held from November 2000 to October 2001 and November 2004 to October 2005. PROMETHEE/GAIA indicated that all three sites were worse during the later monitoring campaign, and that the order of air quality at the sites during each campaign was: rural site > urban site > roadside site. The PMF analysis, on the other hand, identified six common sources at all of the sites (diesel vehicle, fresh sea salt, secondary sulphate, soil, aged sea salt and oil combustion), which together accounted for approximately 68.8 ± 8.7% of the fine particle mass. In addition, road dust, gasoline vehicle, biomass burning, secondary nitrate and metal processing sources were identified at some of the sites. Secondary sulphate was found to be the largest contributor to the fine particle mass at the rural and urban sites, with vehicle emissions a major contributor at the roadside site. The PMF results are broadly similar to those obtained in a previous analysis by PCA/APCS; however, the PMF analysis resolved more factors at each site. In addition, the study demonstrated that combining results from multi-criteria decision-making analysis and receptor modelling can provide more detailed information to help formulate the scientific basis for mitigating air pollution in the region.
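The factorisation at the heart of PMF can be illustrated with plain non-negative matrix factorisation on synthetic data. Note that real PMF additionally weights each residual by its measurement uncertainty, which sklearn's NMF (used here only as a readily available stand-in) does not.

```python
# Receptor modelling as non-negative factorisation on synthetic data:
# X (samples x species) ~ G (source contributions) @ F (source profiles).
# Plain NMF stands in for PMF, which also weights residuals by uncertainties.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
true_G = rng.gamma(2.0, 1.0, size=(200, 3))    # 200 samples, 3 sources
true_F = rng.gamma(2.0, 1.0, size=(3, 12))     # 12 chemical species
X = true_G @ true_F + rng.normal(0, 0.05, size=(200, 12)).clip(min=0)

model = NMF(n_components=3, init="nndsvda", max_iter=500, random_state=0)
G = model.fit_transform(X)                     # estimated source contributions
F = model.components_                          # estimated source profiles

# Share of total mass attributed to each factor (cf. the ~68.8% figure above).
factor_mass = np.array([np.outer(G[:, k], F[k]).sum() for k in range(3)])
print(factor_mass / factor_mass.sum())
```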
Abstract:
The collaboration of clinicians with basic science researchers is crucial for addressing clinically relevant research questions. In order to initiate such mutually beneficial relationships, we propose a model in which early career clinicians spend a designated time embedded in established basic science research groups in order to pursue a postgraduate qualification. During this time, clinicians become integral members of the research team, fostering long-term relationships and opening up opportunities for continuing collaboration. For these collaborations to be successful, however, there are pitfalls to be avoided. Limited time and funding can lead to attempts to answer clinical challenges with highly complex research projects characterised by a large number of "clinical" factors introduced in the hope that the research outcomes will be more clinically relevant. As a result, the complexity of such studies and the variability of their outcomes may make it difficult to draw scientifically justified and clinically useful conclusions. Consequently, we stress that it is the obligation of both the basic science researcher and the clinician to be mindful of the limitations and challenges of such multi-factorial research projects. A systematic, step-by-step approach that addresses clinical research questions with limited but highly targeted and well-defined research projects provides the solid foundation from which a longer-term research program for addressing more challenging clinical problems may be developed. Ultimately, we believe that it is such models, encouraging the vital collaboration between clinicians and researchers on targeted, well-defined research projects, that will result in answers to the important clinical challenges of today.
Abstract:
Process modeling is an emergent area of Information Systems research that is characterized by an abundance of conceptual work and little empirical research. To fill this gap, this paper reports on the development and validation of an instrument to measure user acceptance of process modeling grammars. We advance an extended model for a multi-stage measurement instrument development procedure that incorporates feedback from both expert and user panels. We identify two main contributions. First, we provide a validated measurement instrument for the study of user acceptance of process modeling grammars, which can be used in further empirical studies investigating phenomena associated with the business process modeling domain. Second, in doing so, we describe in detail a procedural model for developing measurement instruments that ensures high levels of reliability and validity, which may assist fellow scholars in executing their empirical research.
Abstract:
The airport city concept has been embraced by many airports of different scales, and in varied ways, around the world. Airports everywhere have diversified into non-aviation commercial and industrial development on the landside in order to increase revenues and spread risk in the notoriously volatile aviation market. As intermodal hubs in a connected, globalised world, airports have evolved from transportation nodes into multi-faceted business enterprises. They have assumed a critical role as ‘transactional’ spaces in the global economy (Gottdiener 2001).
Abstract:
We present a novel approach for developing summary statistics for use in approximate Bayesian computation (ABC) algorithms by using indirect inference. ABC methods are useful for posterior inference in the presence of an intractable likelihood function. In the indirect inference approach to ABC the parameters of an auxiliary model fitted to the data become the summary statistics. Although applicable to any ABC technique, we embed this approach within a sequential Monte Carlo algorithm that is completely adaptive and requires very little tuning. This methodological development was motivated by an application involving data on macroparasite population evolution modelled by a trivariate stochastic process for which there is no tractable likelihood function. The auxiliary model here is based on a beta-binomial distribution. The main objective of the analysis is to determine which parameters of the stochastic model are estimable from the observed data on mature parasite worms.
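A stripped-down sketch of the indirect-inference idea follows: an auxiliary beta-binomial model is fitted to each simulated data set (here by a crude method of moments) and its fitted parameters serve as the summary statistics. The simulator is a toy stand-in for the trivariate macroparasite process, and plain rejection ABC replaces the adaptive sequential Monte Carlo algorithm for brevity.

```python
# Rejection ABC with indirect-inference summaries: the fitted parameters of an
# auxiliary beta-binomial model act as the summary statistics. The simulator
# below is a toy stand-in, not the paper's trivariate stochastic process.
import numpy as np

rng = np.random.default_rng(1)
N = 50                                      # worms examined per host (toy)

def simulate(theta, n_hosts=100):
    # Beta-distributed per-host infection probability, binomial worm counts.
    p = rng.beta(theta[0], theta[1], size=n_hosts)
    return rng.binomial(N, p)

def auxiliary_summary(counts):
    # Crude method-of-moments fit of the auxiliary beta-binomial model;
    # the fitted (alpha, beta) become the summary statistics.
    x = counts / N
    m, v = x.mean(), max(x.var(), 1e-9)
    common = m * (1.0 - m) / v - 1.0
    return np.array([m * common, (1.0 - m) * common])

s_obs = auxiliary_summary(simulate(np.array([2.0, 5.0])))   # pseudo-observed data

accepted = [theta for theta in rng.uniform(0.1, 10.0, size=(20000, 2))
            if np.linalg.norm(auxiliary_summary(simulate(theta)) - s_obs) < 0.5]
if accepted:
    print(len(accepted), np.mean(accepted, axis=0))   # crude posterior mean
```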
Abstract:
A configurable process model provides a consolidated view of a family of business processes. It promotes the reuse of proven practices by providing analysts with a generic modelling artifact from which to derive individual process models. Unfortunately, the scope of existing notations for configurable process modelling is restricted, thus hindering their applicability. Specifically, these notations focus on capturing tasks and control-flow dependencies, neglecting equally important ingredients of business processes such as data and resources. This research fills this gap by proposing a configurable process modelling notation incorporating features for capturing resources, data and physical objects involved in the performance of tasks. The proposal has been implemented in a toolset that assists analysts during the configuration phase and guarantees the correctness of the resulting process models. The approach has been validated by means of a case study from the film industry.
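As a rough illustration of what configuration with resource and object annotations involves, the sketch below derives an individual process variant from a configurable model by keeping or hiding configurable tasks, with a crude completeness check standing in for the toolset's correctness guarantees. The data structures are invented for illustration and are not the proposed notation.

```python
# Toy sketch of deriving an individual process model from a configurable one:
# each configurable task is kept or hidden, and annotations for resources,
# data and physical objects are carried into the derived variant.
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    role: str                                  # resource performing the task
    data_in: list = field(default_factory=list)
    objects: list = field(default_factory=list)
    configurable: bool = False

def configure(process, decisions):
    """decisions: task name -> 'keep' | 'hide' for each configurable task."""
    missing = [t.name for t in process if t.configurable and t.name not in decisions]
    if missing:
        raise ValueError(f"unconfigured tasks: {missing}")  # crude correctness check
    return [t for t in process if not t.configurable or decisions[t.name] == "keep"]

post_production = [                            # cf. the film industry case study
    Task("edit offline", "editor", data_in=["rushes"], objects=["edit suite"]),
    Task("negative cutting", "neg cutter", objects=["film negative"], configurable=True),
    Task("grade online", "colourist", data_in=["edit decision list"]),
]
print([t.name for t in configure(post_production, {"negative cutting": "hide"})])
```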
Abstract:
The aim of the exercise presented in this paper was to develop a Simulink (Matlab) control-system model of a heavy vehicle suspension. The objective facilitated by this outcome was a working model of a heavy vehicle (HV) suspension that could be used for future research. A working computer model is easier and cheaper to re-configure than a HV axle group installed on a truck; it presents less risk should something go wrong and allows more scope for variation and sensitivity analysis before embarking on further "real-world" testing. Empirical data recorded as the input and output signals of a HV suspension were used to develop the parameters for computer simulation of a linear time-invariant system described by a second-order differential equation of the form m d²y/dt² + c dy/dt + k y = f(t) (i.e. a "2nd-order" system). Using the empirical data as an input to the computer model allowed validation of its output against the empirical data. The errors ranged from less than 1% to approximately 3% for any parameter, when comparing like-for-like inputs and outputs. The model is presented along with the results of the validation. It will be used in future research in the QUT/Main Roads project Heavy vehicle suspensions – testing and analysis, particularly for a theoretical model of a multi-axle HV suspension with varying values of dynamic load sharing. Allowance will need to be made for the errors noted when using the computer models in this future work.
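A minimal sketch of the modelling and validation idea, assuming illustrative (not fitted) parameter values and a placeholder "measured" signal, might look as follows in Python with scipy:

```python
# Simulate a second-order LTI suspension model and quantify percentage error
# against a recorded response. Parameter values are illustrative; the project
# fitted its parameters from measured input/output signals.
import numpy as np
from scipy import signal

m, c, k = 4500.0, 18000.0, 4.0e5   # sprung mass (kg), damping, stiffness

# Road input u enters through spring and damper: m*y'' + c*y' + k*y = c*u' + k*u.
sys = signal.TransferFunction([c, k], [m, c, k])

t = np.linspace(0, 5, 2001)
u = 0.05 * np.sin(2 * np.pi * 1.5 * t)            # 1.5 Hz road-profile excitation
_, y, _ = signal.lsim(sys, U=u, T=t)

# With real test data, `measured` would be the recorded suspension response.
measured = y * 1.02                                # placeholder "measured" signal
pct_error = 100 * np.abs(y - measured).max() / np.abs(measured).max()
print(f"peak error: {pct_error:.1f}%")             # cf. the reported 1-3% range
```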