898 results for Analysis Model


Relevance: 30.00%

Abstract:

The phenomenon of consumer co-creation is often framed in terms of whether economic market forces or socio-cultural non-market forces ultimately dominate. We propose an alternate model of consumer co-creation in terms of co-evolution between markets and non-markets. Our model is based on a recent ethnographic study of a massively multiplayer online game through its development, release and ultimate failure, and is cast in terms of two explanatory models: multiple games and social network markets. We conclude that consumer co-creation is indeed complex, but in ways that relate to both emergent market expectations and the evolution of markets, not to the transcendence of markets.

Relevance: 30.00%

Abstract:

With the advent of Service Oriented Architecture, Web services have gained tremendous popularity. Because so many Web services are available, finding an appropriate service that meets the user's requirements is a challenge, and this warrants an effective and reliable Web service discovery process. A considerable body of research has emerged on methods to improve the accuracy of Web service discovery and match the best service. Discovery typically suggests many individual services that only partially fulfil the user's interest. Considering the semantic relationships of the words used to describe services, together with their input and output parameters, can lead to more accurate Web service discovery, and appropriately linking individually matched services should then fully satisfy the user's requirements. This research integrates a semantic model and a data mining technique to enhance the accuracy of Web service discovery through a novel three-phase methodology.

The first phase performs match-making to find semantically similar Web services for a user query. To perform semantic analysis on the content of the Web service description language document, a support-based latent semantic kernel is constructed using an innovative concept of binning and merging applied to a large quantity of text documents covering diverse domains of knowledge. A generic latent semantic kernel constructed with a large number of terms helps to uncover hidden meanings of the query terms that could not otherwise be found. Sometimes a single Web service cannot fully satisfy the user's requirement; in such cases, a composition of multiple inter-related Web services is presented to the user. The second phase checks the feasibility of linking multiple Web services; once feasibility is established, the objective is to provide the user with the best composition. In this link analysis phase, the Web services are modelled as nodes of a graph and an all-pairs shortest-path algorithm is applied to find the optimum path at minimum traversal cost. The third phase, system integration, combines the results of the preceding two phases using an original fusion algorithm in the fusion engine. Finally, the recommendation engine, an integral part of the system integration phase, makes the final recommendations of individual and composite Web services to the user.

To evaluate the performance of the proposed method, extensive experimentation was performed. Results of the proposed support-based semantic kernel method of Web service discovery are compared with a standard keyword-based information-retrieval method and a clustering-based machine-learning method; the proposed method outperforms both. Experimental results and statistical analysis also show that the best Web service compositions are obtained by considering 10 to 15 of the Web services found in phase I for linking. Empirical results further confirm that the fusion engine boosts the accuracy of Web service discovery by systematically combining the inputs from the semantic analysis (phase I) and the link analysis (phase II). Overall, the accuracy of Web service discovery with the proposed method shows a significant improvement over traditional discovery methods.
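
The link-analysis phase lends itself to a small illustration: matched services become nodes of a directed graph, feasible input/output links become weighted edges, and an all-pairs shortest-path algorithm yields the cheapest composition between any two services. The sketch below uses the Floyd-Warshall algorithm on a purely hypothetical set of services and costs; it is not taken from the thesis and only illustrates the idea of minimum-cost traversal described above.

import math

# Hypothetical directed graph of matched Web services; edge weights are
# composition costs for feasible input/output links between services.
services = ["FlightSearch", "FareCheck", "SeatMap", "Booking"]
cost = {
    ("FlightSearch", "FareCheck"): 1.0,
    ("FlightSearch", "SeatMap"): 4.0,
    ("FareCheck", "SeatMap"): 1.5,
    ("SeatMap", "Booking"): 2.0,
}

# Floyd-Warshall: all-pairs shortest paths over the composition graph.
n = len(services)
idx = {s: i for i, s in enumerate(services)}
dist = [[0.0 if i == j else math.inf for j in range(n)] for i in range(n)]
for (u, v), w in cost.items():
    dist[idx[u]][idx[v]] = w
for k in range(n):
    for i in range(n):
        for j in range(n):
            if dist[i][k] + dist[k][j] < dist[i][j]:
                dist[i][j] = dist[i][k] + dist[k][j]

print("cheapest FlightSearch -> Booking composition cost:",
      dist[idx["FlightSearch"]][idx["Booking"]])

For the hypothetical costs above this prints 4.5, corresponding to the FlightSearch -> FareCheck -> SeatMap -> Booking composition rather than the direct but more expensive FlightSearch -> SeatMap link.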

Relevance: 30.00%

Abstract:

Selecting an appropriate business process modelling technique is an important task within the methodological challenges of a business process management project. While a plethora of techniques has been developed over recent decades, there is an obvious shortage of well-accepted reference frameworks that can be used to evaluate and compare the capabilities of the different techniques. Academic progress has been made at least in the area of representational analyses that use ontology as a benchmark for such evaluations. This paper reflects on comprehensive experiences with the application of a model based on the Bunge ontology in this context. A brief overview of the underlying research model characterizes the different steps in such a research project. A comparative summary of previous representational analyses of process modelling techniques over time gives insights into the relative maturity of selected techniques. Based on these experiences, suggestions are made as to where ontology-based representational analyses could be further developed and what limitations are inherent to such analyses.

Relevance: 30.00%

Abstract:

This document provides an overview of the differences and similarities in the objectives and implementation frameworks of the training and employment policies applying to public construction projects in Western Australia and Queensland. The material in the document clearly demonstrates the extent to which approaches to the pursuit of training objectives in particular have been informed by the experiences of other jurisdictions. The two State governments now have very similar approaches to the promotion of training with the WA government basing a good part of its policy approach on the “Queensland model”. As the two States share many similar economic and other characteristics, and have very similar social and economic goals, this similarity is to be expected. The capacity to benefit from the experiences of other jurisdictions is to be welcomed. The similarity in policy approach also suggests a potential for ongoing collaborations between the State governments on research aimed at further improving training and employment outcomes via public construction projects.

Relevance: 30.00%

Abstract:

In this paper, the stability of an autonomous microgrid with multiple distributed generators (DGs) is studied through eigenvalue analysis. It is assumed that all the DGs are connected through Voltage Source Converters (VSCs) and all connected loads are passive. The VSCs are controlled by a state feedback controller to achieve the desired voltage and current outputs, which are decided by a droop controller. The state space models of each of the converters with its associated feedback are derived. These are then connected with the state space models of the droop controller, network and loads to form a homogeneous model, through which the eigenvalues are evaluated. The system stability is then investigated as a function of the droop controller's real and reactive power coefficients. These observations are verified through simulation studies using PSCAD/EMTDC. It is shown that the simulation results closely agree with the stability behavior predicted by the eigenvalue analysis.
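
The eigenvalue test used above can be pictured with a minimal sketch: build the closed-loop state matrix for a candidate droop coefficient, compute its eigenvalues, and declare the operating point unstable if any eigenvalue has a positive real part. The two-state matrix below is a deliberately simple hypothetical coupling, not the paper's converter/droop/network/load model; it only illustrates how stability can change with a droop gain k_p.

import numpy as np

def is_stable(A: np.ndarray) -> bool:
    """A continuous-time LTI system x' = A x is asymptotically stable
    when every eigenvalue of A has a strictly negative real part."""
    return bool(np.all(np.linalg.eigvals(A).real < 0))

def closed_loop_matrix(k_p: float) -> np.ndarray:
    # Hypothetical two-state system whose coupling grows with the droop gain.
    return np.array([[-1.0,  k_p],
                     [ k_p, -0.5]])

for k_p in (0.2, 0.5, 1.0, 2.0):
    A = closed_loop_matrix(k_p)
    eigs = np.linalg.eigvals(A)
    print(f"k_p={k_p}: eigenvalues={np.round(eigs, 3)}, stable={is_stable(A)}")

In this toy example the system loses stability once k_p exceeds roughly 0.7, mirroring the kind of droop-coefficient sweep performed in the eigenvalue analysis above.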

Relevance: 30.00%

Abstract:

Capital works procurement and its regulatory policy environment within a country can be complex. For example, by virtue of Australia's governmental division between the Commonwealth, state and local jurisdictions, and the associated procurement networks and responsibilities at each level, the tendering process is often convoluted. Four inter-related key themes are identified in the literature in relation to procurement disharmony: decentralisation, risk and risk mitigation, free trade and competition, and tendering costs. This paper defines and discusses these key areas of conflict that adversely impact the business environments of industry, through a literature review, policy analysis and consultation with capital works procurement stakeholders. The aim of this national study is to identify policy differences between jurisdictions in Australia and ascertain whether those differences are a barrier to productivity and innovation. This research forms an element of a broader investigation aimed at developing efficient, effective and nationally harmonised procurement systems.
Keywords: capital works, procurement policy reform
Acknowledgement: The research described in this paper was carried out by the Australian Cooperative Research Centre for Construction Innovation.

Relevance: 30.00%

Abstract:

This paper is a continuation of the paper titled "Concurrent multi-scale modeling of civil infrastructure for analyses on structural deteriorating—Part I: Modeling methodology and strategy", with the emphasis on model updating and verification for the developed concurrent multi-scale model. The sensitivity-based parameter updating method was applied, and important issues such as the selection of reference data and model parameters, and the model updating procedures for the multi-scale model, were investigated based on a sensitivity analysis of the selected model parameters. The experimental modal data, as well as the static response in terms of component nominal stresses and hot-spot stresses at the locations of concern, were used for dynamic response- and static response-oriented model updating, respectively. The updated multi-scale model was further verified to act as the baseline model, which is assumed to be the finite-element model closest to the real situation of the structure and available for any subsequent numerical simulation. The comparison of dynamic and static responses between the results calculated by the final model and the measured data indicated that the updating and verification methods applied in this paper are reliable and accurate for multi-scale models of frame-like structures. General procedures for multi-scale model updating and verification are finally proposed for nonlinear physics-based modeling of large civil infrastructure, and they were applied to the model verification of a long-span bridge as a practical engineering application of the proposed procedures.
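
Sensitivity-based parameter updating of the kind described above can be sketched as a least-squares fit of selected model parameters to measured modal data. The example below is a deliberately small, hypothetical two-degree-of-freedom system, not the paper's multi-scale bridge model: two stiffness parameters are updated so that the model's natural frequencies reproduce "measured" ones.

import numpy as np
from scipy.optimize import least_squares

# Hypothetical 2-DOF shear-frame model: masses are fixed, stiffnesses are updated.
M = np.diag([1.0, 1.0])

def natural_frequencies(k):
    """Natural frequencies (Hz) of the 2-DOF system for stiffnesses k1, k2."""
    k1, k2 = k
    K = np.array([[k1 + k2, -k2],
                  [-k2,      k2]])
    eigvals = np.linalg.eigvals(np.linalg.solve(M, K))
    return np.sqrt(np.sort(eigvals.real)) / (2.0 * np.pi)

# "Measured" modal frequencies, here generated from a reference parameter set.
f_measured = natural_frequencies([2000.0, 1500.0])

# Start from deliberately wrong stiffnesses and minimise the frequency residual.
result = least_squares(lambda k: natural_frequencies(k) - f_measured,
                       x0=[1000.0, 1000.0], bounds=(1.0, 1e5))

print("updated stiffnesses:", np.round(result.x, 1))
print("remaining frequency residual:", np.round(result.fun, 6))

A real updating exercise, as in the paper, would weight modal and static (nominal and hot-spot stress) residuals together and restrict itself to parameters that the sensitivity analysis has shown to be identifiable.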

Relevance: 30.00%

Abstract:

This paper proposes a novel relative entropy rate (RER) based approach for multiple HMM (MHMM) approximation of a class of discrete-time uncertain processes. Under different uncertainty assumptions, the model design problem is posed either as a min-max optimisation problem or as a stochastic minimisation problem on the RER between the joint laws describing the state and output processes (rather than the more usual RER between output processes). A suitable filter is proposed, and performance results are established that bound conditional-mean estimation performance and show that estimation performance improves as the RER is reduced. These filter consistency and convergence bounds are the first results characterising multiple HMM approximation performance, and they suggest that joint RER concepts provide a useful model selection criterion. The proposed model design process and MHMM filter are demonstrated on an important image-processing dim-target detection problem.
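
For orientation, a standard definition of the relative entropy rate between two process laws P and Q is the limiting per-sample Kullback-Leibler divergence shown below; the paper's formulation applies this idea to the joint laws of the state and output processes rather than to the output laws alone.

% Relative entropy rate between process laws P and Q (standard form).
\[
  \bar{D}(P \,\|\, Q)
    = \lim_{n \to \infty} \frac{1}{n}\, D\!\left( P_{X_{1:n}} \,\middle\|\, Q_{X_{1:n}} \right),
  \qquad
  D(p \,\|\, q) = \sum_{x} p(x) \log \frac{p(x)}{q(x)} .
\]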

Relevance: 30.00%

Abstract:

This work aims to take advantage of recent developments in joint factor analysis (JFA) in the context of a phonetically conditioned GMM speaker verification system. Previous work has shown performance advantages through phonetic conditioning, but this has not been shown to date with the JFA framework. Our focus is particularly on strategies for combining the phone-conditioned systems. We show that the classic fusion of the scores is suboptimal when using multiple GMM systems. We investigate several combination strategies in the model space, and demonstrate improvement over score-level combination as well as over a non-phonetic baseline system. This work was conducted during the 2008 CLSP Workshop at Johns Hopkins University.
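
As background, joint factor analysis models a speaker- and session-dependent GMM mean supervector as the sum of speaker and channel components; a standard statement of the model (generic JFA, not specific to the phone-conditioned systems combined in this work) is:

% Standard JFA decomposition of a GMM mean supervector M.
\[
  \mathbf{M} = \mathbf{m} + \mathbf{V}\mathbf{y} + \mathbf{U}\mathbf{x} + \mathbf{D}\mathbf{z},
\]
% m: UBM mean supervector; V, y: eigenvoice (speaker) subspace and speaker factors;
% U, x: eigenchannel (session) subspace and channel factors; D, z: diagonal residual term.

Broadly, the model-space combination strategies investigated here combine systems at the level of such model components, rather than simply fusing the systems' output scores.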

Relevance: 30.00%

Abstract:

Industrial employment growth has been one of the most dynamic areas of expansion in Asia; however, current trends in industrialised working environments have resulted in greater employee stress. Despite research showing that cultural values affect the way people cope with stress, there is a dearth of psychometrically established tools for measuring these constructs in non-Western countries. Studies of the Ways of Coping Checklist-Revised (WCCL-R) in the West suggest that the WCCL-R has good psychometric properties, but its applicability in the East is still understudied. A confirmatory factor analysis (CFA) is used to validate the WCCL-R constructs in an Asian population. This study used 1,314 participants from Indonesia, Sri Lanka, Singapore and Thailand. An initial exploratory factor analysis revealed that the original factor structure was not confirmed; however, a subsequent EFA and CFA supported a 38-item, five-factor model. The revised WCCL-R in the Asian sample was also found to have good reliability and sound construct and concurrent validity. The 38-item structure of the WCCL-R has considerable potential for future occupational stress research in Asian countries.
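
The exploratory step mentioned above can be sketched with off-the-shelf tooling. The snippet below fits a five-factor model to a hypothetical 1,314 x 38 item-response matrix using scikit-learn's FactorAnalysis; it is only a simplified stand-in, since the study's EFA/CFA would normally involve factor rotation, fit indices and dedicated structural equation modelling software.

import numpy as np
from sklearn.decomposition import FactorAnalysis

# Hypothetical Likert-style responses: 1,314 participants x 38 items (random here).
rng = np.random.default_rng(0)
responses = rng.integers(0, 4, size=(1314, 38)).astype(float)

# Fit a five-factor model and inspect the item loadings.
fa = FactorAnalysis(n_components=5, random_state=0)
fa.fit(responses)

loadings = fa.components_.T   # shape: (38 items, 5 factors)
for item, row in enumerate(loadings[:5], start=1):
    print(f"item {item:2d} loadings:", np.round(row, 2))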

Relevance: 30.00%

Abstract:

Current policy decision-making in Australia regarding non-health public investments (for example, transport, housing and social welfare programmes) does not quantify health benefits and costs systematically. To address this gap, this study proposes an economic model for quantifying the health impacts of public policies in dollar terms. The intention is to enable policy-makers to conduct economic evaluations of the health effects of non-health policies and to implement policies that reduce health inequalities as well as enhance the positive health gains of the target population. Health Impact Assessment (HIA) provides an appropriate framework for this study, since HIA assesses the beneficial and adverse effects of a programme or policy on public health, and on health inequalities through the distribution of those effects. However, HIA usually tries to influence the decision-making process using its scientific findings, mostly epidemiological and toxicological evidence. In reality, this evidence cannot establish causal links between policy and health impacts, since it cannot explain how an individual or a community reacts to changing circumstances. The proposed economic model addresses this health-policy linkage using a consumer choice approach that can explain changes in group and individual behaviour in a given economic setting. The model links epidemiological findings with economic analysis to estimate the health costs and benefits of public investment policies; that is, it estimates dollar impacts when the health status of the exposed population group is changed by public programmes – for example, transport initiatives that reduce congestion by building new roads, highways or tunnels, or by imposing congestion taxes. For policy evaluation purposes, the model is incorporated in the HIA framework by establishing associations among the identified factors that drive changes in the behaviour of the target population group and, in turn, in health outcomes. The economic variables identified to estimate health inequality and health costs include levels of income, unemployment, education, age group, disadvantaged population groups, and mortality/morbidity. Model validation using case studies and/or available databases from an Australian non-health policy area (say, transport) is planned as future work and is beyond the scope of the current paper.

Relevance: 30.00%

Abstract:

With increasingly complex engineering assets and tight economic requirements, asset reliability becomes more crucial in Engineering Asset Management (EAM). Improving the reliability of systems has always been a major aim of EAM. Reliability assessment using degradation data has become a significant approach for evaluating the reliability and safety of critical systems, since degradation data often provide more information than failure time data for assessing reliability and predicting the remnant life of systems. In general, degradation is the reduction in performance, reliability and life span of assets, and many failure mechanisms can be traced to an underlying degradation process. Degradation is a stochastic phenomenon and can therefore be modelled using several approaches; degradation modelling techniques have consequently generated a great amount of research in the reliability field. While degradation models play a significant role in reliability analysis, there are few review papers on the topic. This paper presents a review of the existing literature on commonly used degradation models in reliability analysis. Current research and developments in degradation models are reviewed and summarised, and the study synthesises these models and classifies them into groups. Additionally, it attempts to identify the merits, limitations and applications of each model, and it points to potential applications of these degradation models in asset health and reliability prediction.
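
As a concrete illustration of one commonly used class of degradation model, the sketch below simulates a linear Wiener (Brownian-motion-with-drift) degradation process and estimates the distribution of first-passage times across a failure threshold, a simple proxy for remaining useful life. The parameters are hypothetical and the model is chosen as an example of the families reviewed above, not taken from the paper.

import numpy as np

# Wiener-process degradation: X(t) = mu * t + sigma * B(t);
# failure is declared when X(t) first crosses the threshold.
rng = np.random.default_rng(42)
mu, sigma, threshold = 0.5, 0.8, 20.0    # hypothetical drift, diffusion, failure limit
dt, horizon, n_paths = 0.1, 100.0, 2000

steps = int(horizon / dt)
increments = mu * dt + sigma * np.sqrt(dt) * rng.standard_normal((n_paths, steps))
paths = np.cumsum(increments, axis=1)

# First-passage time of each simulated path (np.inf if the threshold is never crossed).
crossed = paths >= threshold
first_idx = np.where(crossed.any(axis=1), crossed.argmax(axis=1), -1)
failure_times = np.where(first_idx >= 0, (first_idx + 1) * dt, np.inf)

finite = failure_times[np.isfinite(failure_times)]
print(f"mean simulated time-to-failure: {finite.mean():.1f}")
print(f"theoretical mean first-passage time (threshold / mu): {threshold / mu:.1f}")

For a Wiener degradation process the first-passage time follows an inverse Gaussian distribution, which is one reason this family is popular for remaining-useful-life prediction.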