964 results for Policy Modelling
Abstract:
This work presents an extended Joint Factor Analysis (JFA) model that includes explicit modelling of unwanted within-session variability. The goals of the proposed extended JFA model are to improve verification performance with short utterances by compensating for the effects of limited or imbalanced phonetic coverage, and to produce a flexible JFA model that is effective over a wide range of utterance lengths without adjusting model parameters, such as by retraining session subspaces. Experimental results on the 2006 NIST SRE corpus demonstrate the flexibility of the proposed model: it provides competitive results over a wide range of utterance lengths without retraining, and also yields modest improvements over the current state of the art in a number of conditions.
Abstract:
Purpose – Several major infrastructure projects in the Hong Kong Special Administrative Region (HKSAR) have been delivered by the build-operate-transfer (BOT) model since the 1960s. Although the benefits of using BOT have been reported abundantly in the contemporary literature, some BOT projects were less successful than others. This paper aims to find out why this is so and to explore whether BOT is the best financing model for procuring major infrastructure projects. Design/methodology/approach – The benefits of BOT are first reviewed. Some completed BOT projects in Hong Kong are examined to ascertain how far the perceived benefits of BOT materialized in these projects. A high-profile project, the Hong Kong-Zhuhai-Macau Bridge, is then studied to explore the true value of the BOT financing model: the governments of the People's Republic of China, the Macau Special Administrative Region and the HKSAR had long promoted BOT as the preferred financing model for this project, but suddenly reverted to the traditional financing model, funded primarily by the three governments with public money. Findings – Six main reasons for this radical change are derived from the analysis: a shorter take-off time for the project; differences in legal systems causing difficulties in drafting BOT agreements; more government control over tolls; private-sector disinterest due to an unattractive economic package; avoidance of allegations of collusion between business and the governments; and the comfortable financial reserves possessed by the host governments. Originality/value – The findings of this paper are believed to provide a better understanding of the real benefits of BOT and of the governments' main decision criteria in delivering major infrastructure projects.
Abstract:
Traditionally, conceptual modelling of business processes involves the use of visual grammars for the representation of, amongst other things, activities, choices and events. These grammars, while very useful for experts, are difficult for naive stakeholders to understand. Annotations of such process models have been developed to assist in understanding aspects of these grammars via map-based approaches, and further work has looked at forms of 3D conceptual models. However, no one has sought to embed the conceptual models into a fully featured 3D world, using spatial annotations to explicate the underlying model clearly. In this paper, we present an approach to conceptual process model visualisation that enhances a 3D virtual world with annotations representing process constructs, facilitating insight into the developed model. We then present a prototype implementation of a 3D Virtual BPMN Editor that embeds BPMN process models into a 3D world. We show how this gives extra support for tasks performed by the conceptual modeller, providing better process model communication to stakeholders.
Abstract:
This fascinating handbook defines how knowledge contributes to social and economic life, and vice versa. It considers the five areas critical to acquiring a comprehensive understanding of the knowledge economy: the nature of the knowledge economy; social, cooperative, cultural, creative, ethical and intellectual capital; knowledge and innovation systems; policy analysis for knowledge-based economies; and knowledge management. In presenting the outcomes of an important body of research, the handbook enables knowledge policy and management practitioners to be more systematically guided in their thinking and actions. The contributors cover a wide disciplinary spectrum in an accessible way, presenting concise, to-the-point discussions of critical concepts and practices that will enable practitioners to make effective research, managerial and policy decisions. They also highlight important new areas of concern to knowledge economies such as wisdom, ethics, language and creative economies that are largely overlooked.
Abstract:
The article presents a criticism of the accounts of John Carey in his book "The Intellectuals and the Masses." The author focuses on Carey's argument that art is not an eternal category but an invention of the late eighteenth century, and that it no longer has any intellectual legitimacy other than that of provoking feelings which are no more and no less valuable than those provoked by any other form of entertainment or physical activity.
Abstract:
It has been suggested that the Internet is the most significant driver of international trade in recent years, to the extent that the term 'internetalisation' has been coined (Bell, Deans, Ibbotson & Sinkovics, 2001; Buttriss & Wilkinson, 2003). This term describes the Internet's effect on the internationalisation process of the firm. Consequently, researchers have argued that the internationalisation process of the firm has altered due to the Internet and hence is in need of further investigation. However, as there is limited research and understanding, ambiguity remains about how the Internet has influenced international market growth. Thus, the purpose of this study was to explore how the Internet influences firms' internationalisation process, specifically international market growth. To this end, Internet marketing and international market growth theories are used to illuminate this ambiguity in the body of knowledge. Thus, the research problem 'How and why does the Internet influence international market growth of the firm?' is justified for investigation. To explore the research question, a two-stage approach is used. Firstly, twelve case studies were used to evaluate key concepts, generate hypotheses and develop a model of internetalisation for testing. The participants held key positions within their firms, so that rich data could be drawn from international market growth decision makers. Secondly, a quantitative confirmation process analysed the identified themes or constructs, using two hundred and twenty-four valid responses. Constructs were evaluated through an exploratory factor analysis, confirmatory factor analysis and structural equation modelling process.
Structural equation modelling was used to test the model of internetalisation and to examine the interrelationships between the internationalisation process components: information availability, information usage, interaction communication, international mindset, business relationship usage, psychic distance, the Internet intensity of the firm and international market growth. This study found that the Internet intensity of the firm mediates information availability, information usage, international mindset and business relationships when firms grow in international markets. Therefore, these results provide empirical evidence that the Internet has a positive influence on international information, knowledge, entrepreneurship and networks, and that these in turn influence international market growth. The theoretical contributions are threefold. Firstly, the study identifies a holistic model of the impact the Internet has had on the outward internationalisation of the firm. This contribution extends the body of knowledge pertaining to Internet international marketing by mapping and confirming interrelationships between the Internet, internationalisation and growth concepts. Secondly, the study highlights the broad scope and accelerated rate of international market growth of firms. Evidence that the Internet influences the traditional and virtual networks for the pursuit of international market growth extends the current understanding. Thirdly, this study confirms that international information, knowledge, entrepreneurship and network concepts are valid in a single model. Thus, these three contributions identify constructs, measure constructs in a multi-item capacity, map interrelationships and confirm a single holistic model of 'internetalisation'. The main practical contribution is that the findings identified information, knowledge and entrepreneurial opportunities for firms wishing to maximise international market growth.
To capitalise on these opportunities, suggestions are offered to assist firms to develop greater Internet intensity and internationalisation capabilities. From a policy perspective, educational institutions and government bodies need to promote more applied programs for Internet international marketing. The study provides future researchers with a platform of identified constructs and interrelationships related to internetalisation with which to investigate. However, a single study has limited generalisability; thus, future research should replicate this study. Such replication or cross-validation will assist in the verification of the scales used in this research and enhance the validity of causal predictions. Furthermore, this study was undertaken in the Australian outward-bound context. Research in other nations, as well as research into inbound internationalisation, would be fruitful.
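The mediation finding above can be illustrated with a toy two-step regression (a sketch only, not the thesis's structural equation model: the variable names, effect sizes and data below are invented for demonstration):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 224                                          # matches the survey sample size
info = rng.standard_normal(n)                    # stand-in for information availability
intensity = 0.6 * info + 0.3 * rng.standard_normal(n)    # mediator: Internet intensity
growth = 0.5 * intensity + 0.3 * rng.standard_normal(n)  # outcome: market growth

# Path a (predictor -> mediator): simple regression slope.
a = np.polyfit(info, intensity, 1)[0]
# Path b (mediator -> outcome, controlling for the predictor).
X = np.column_stack([intensity, info, np.ones(n)])
b = np.linalg.lstsq(X, growth, rcond=None)[0][0]
print(a * b)                                     # indirect (mediated) effect, ~0.3
```

A non-zero product a*b is the classic indication that the mediator carries part of the predictor's influence on the outcome, which is the pattern the thesis reports for Internet intensity.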
Abstract:
The central aim of the research undertaken in this PhD thesis is to develop a model for simulating water droplet movement on a leaf surface and to compare the model behaviour with experimental observations. A series of five papers has been presented to explain systematically the way in which this droplet modelling work has been realised. Knowing the path of the droplet on the leaf surface is important for understanding how a droplet of water, pesticide or nutrient will be absorbed through the leaf surface. An important aspect of the research is the generation of a leaf surface representation that acts as the foundation of the droplet model. Initially, a laser scanner is used to capture the surface characteristics of two types of leaves in the form of a large scattered data set. After the identification of the leaf surface boundary, a set of internal points is chosen over which a triangulation of the surface is constructed. We present a novel hybrid approach for leaf surface fitting on this triangulation that combines Clough-Tocher (CT) and radial basis function (RBF) methods to achieve a surface with a continuously turning normal. The accuracy of the hybrid technique is assessed using numerical experimentation. The hybrid CT-RBF method is shown to give good representations of Frangipani and Anthurium leaves. Such leaf models facilitate an understanding of plant development and permit the modelling of the interaction of plants with their environment. The motion of a droplet traversing this virtual leaf surface is affected by various forces including gravity, friction and resistance between the surface and the droplet. The innovation of our model is the use of thin-film theory in the context of droplet movement to determine the thickness of the droplet as it moves on the surface. Experimental verification shows that the droplet model captures reality quite well and produces realistic droplet motion on the leaf surface.
Most importantly, we observed that the simulated droplet motion follows the contours of the surface and spreads as a thin film. In the future, the model may be applied to determine the path of a droplet of pesticide along a leaf surface before it falls from or comes to a standstill on the surface. It will also be used to study the paths of many droplets of water or pesticide moving and colliding on the surface.
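To give a flavour of the RBF half of the surface-fitting step, here is a minimal Gaussian radial basis function interpolant over scattered 2D points (a sketch under stated assumptions: the thesis uses a hybrid Clough-Tocher/RBF scheme on a triangulation of real scan data, whereas the kernel choice, shape parameter and sample data below are invented):

```python
import numpy as np

def rbf_fit(points, values, eps=5.0):
    """Fit a Gaussian RBF interpolant to scattered (x, y) -> z data."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    phi = np.exp(-(eps * d) ** 2)            # symmetric kernel matrix
    weights = np.linalg.solve(phi, values)   # interpolation weights
    def evaluate(q):
        dq = np.linalg.norm(q[:, None, :] - points[None, :, :], axis=-1)
        return np.exp(-(eps * dq) ** 2) @ weights
    return evaluate

rng = np.random.default_rng(1)
pts = rng.uniform(0.0, 1.0, size=(40, 2))    # synthetic scattered samples
z = np.sin(pts[:, 0]) + np.cos(pts[:, 1])    # synthetic surface heights
surf = rbf_fit(pts, z)
print(np.max(np.abs(surf(pts) - z)))         # small: interpolant hits the data
```

The same `surf` callable can then be evaluated anywhere inside the sampled region, which is what makes an RBF surface convenient as a smooth substrate for simulating droplet paths.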
Abstract:
Financial processes may possess long memory and their probability densities may display heavy tails. Many models have been developed to deal with this tail behaviour, which reflects the jumps in the sample paths. On the other hand, the presence of long memory, which contradicts the efficient market hypothesis, is still an issue for further debate. These difficulties pose challenges for memory detection and for modelling the co-presence of long memory and heavy tails. This PhD project aims to respond to these challenges. The first part aims to detect memory in a large number of financial time series on stock prices and exchange rates using their scaling properties. Since financial time series often exhibit stochastic trends, a common form of nonstationarity, strong trends in the data can lead to false detection of memory. We will take advantage of a technique known as multifractal detrended fluctuation analysis (MF-DFA) that can systematically eliminate trends of different orders. This method is based on the identification of scaling of the q-th-order moments and is a generalisation of the standard detrended fluctuation analysis (DFA), which uses only the second moment; that is, q = 2. We also consider the rescaled range (R/S) analysis and the periodogram method to detect memory in financial time series and compare their results with the MF-DFA. An interesting finding is that short memory is detected for stock prices of the American Stock Exchange (AMEX) and long memory is found present in the time series of two exchange rates, namely the French franc and the Deutsche mark. Electricity price series of the five states of Australia are also found to possess long memory. For these electricity price series, heavy tails are also pronounced in their probability densities. The second part of the thesis develops models to represent short-memory and long-memory financial processes as detected in Part I.
These models take the form of continuous-time AR(∞)-type equations whose kernel is the Laplace transform of a finite Borel measure. By imposing appropriate conditions on this measure, short memory or long memory in the dynamics of the solution will result. A specific form of the models, which has a good MA(∞)-type representation, is presented for the short-memory case. Parameter estimation for this type of model is performed via least squares, and the models are applied to the stock prices in the AMEX, which were established in Part I to possess short memory. By selecting the kernel in the continuous-time AR(∞)-type equations to have the form of a Riemann-Liouville fractional derivative, we obtain a fractional stochastic differential equation driven by Brownian motion. This type of equation is used to represent financial processes with long memory, whose dynamics are described by the fractional derivative in the equation. These models are estimated via quasi-likelihood, namely via a continuous-time version of the Gauss-Whittle method. The models are applied to the exchange rates and the electricity prices of Part I with the aim of confirming their possible long-range dependence established by MF-DFA. The third part of the thesis provides an application of the results established in Parts I and II to characterise and classify financial markets. We will pay attention to the New York Stock Exchange (NYSE), the American Stock Exchange (AMEX), the NASDAQ Stock Exchange (NASDAQ) and the Toronto Stock Exchange (TSX). The parameters from MF-DFA and those of the short-memory AR(∞)-type models will be employed in this classification. We propose the Fisher discriminant algorithm to find a classifier in the two- and three-dimensional spaces of the data sets and then use cross-validation to verify discriminant accuracies. This classification is useful for understanding and predicting the behaviour of different processes within the same market.
The fourth part of the thesis investigates the heavy-tailed behaviour of financial processes which may also possess long memory. We consider fractional stochastic differential equations driven by stable noise to model financial processes such as electricity prices. The long memory of electricity prices is represented by a fractional derivative, while the stable noise input models their non-Gaussianity via the tails of their probability density. A method using the empirical densities and MF-DFA will be provided to estimate all the parameters of the model and simulate sample paths of the equation. The method is then applied to analyse daily spot prices for five states of Australia. Comparisons with the results obtained from the R/S analysis, the periodogram method and MF-DFA are provided. The results from the fractional SDEs agree with those from MF-DFA, which are based on multifractal scaling, while those from the periodograms, which are based on the second order, seem to underestimate the long-memory dynamics of the process. This highlights the need for, and usefulness of, fractal methods in modelling non-Gaussian financial processes with long memory.
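The standard DFA that MF-DFA generalises (the q = 2 case) can be sketched in a few lines. This is an illustration on synthetic white noise, not the thesis's implementation; the choice of window sizes is arbitrary:

```python
import numpy as np

def dfa(x, scales, order=1):
    """Detrended fluctuation analysis: return the scaling exponent alpha
    from a log-log fit of the fluctuation function F(s) against scale s."""
    y = np.cumsum(x - np.mean(x))            # integrated profile
    F = []
    for s in scales:
        n = len(y) // s
        segs = y[: n * s].reshape(n, s)
        t = np.arange(s)
        # Subtract a polynomial trend of the given order from each window,
        # then take the RMS of the residuals.
        detr = np.array([seg - np.polyval(np.polyfit(t, seg, order), t)
                         for seg in segs])
        F.append(np.sqrt(np.mean(detr ** 2)))
    alpha, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return alpha

rng = np.random.default_rng(42)
noise = rng.standard_normal(4096)            # uncorrelated (short-memory) series
print(dfa(noise, [16, 32, 64, 128, 256]))    # ~0.5 for white noise
```

An exponent near 0.5 indicates no memory, while values well above 0.5 indicate long-range dependence; MF-DFA repeats this computation over a range of moment orders q rather than only q = 2.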
Abstract:
Leucodepletion, the removal of leucocytes from blood products, improves the safety of blood transfusion by reducing adverse events associated with the incidental non-therapeutic transfusion of leucocytes. Leucodepletion has been shown to have clinical benefit for immunosuppressed patients who require transfusion. The selective leucodepletion of blood products by bedside filtration for these patients has been widely practised. This study investigated the economic consequences in Queensland of moving from a policy of selective leucodepletion to one of universal leucodepletion, that is, providing all transfused patients with blood products leucodepleted during the manufacturing process. Using an analytic decision model, a cost-effectiveness analysis was conducted. An ICER of $16.3M per life-year gained was derived. Sensitivity analysis found this result to be robust to uncertainty in the parameters used in the model. This result argues against moving to a policy of universal leucodepletion. However, during the course of the study the policy decision for universal leucodepletion was made and implemented in Queensland in October 2008. This study has concluded that cost-effectiveness is not an influential factor in policy decisions regarding quality and safety initiatives in the Australian blood sector.
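The headline figure follows the standard incremental cost-effectiveness ratio: incremental cost divided by incremental health effect. A toy calculation (the numbers below are invented for demonstration and are not the study's inputs):

```python
def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost per unit of health effect (e.g. per life-year gained)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Hypothetical: the new policy costs $50M more and gains 3 life-years overall.
print(icer(cost_new=80e6, cost_old=30e6, effect_new=5.0, effect_old=2.0))
# -> about 16.7e6, i.e. ~$16.7M per life-year gained
```

A ratio of this magnitude sits far above conventional willingness-to-pay thresholds, which is why a figure like the study's $16.3M per life-year would normally argue against adoption.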
Abstract:
These National Guidelines and Case Studies for Digital Modelling are the outcomes from one of a number of Building Information Modelling (BIM)-related projects undertaken by the CRC for Construction Innovation. Since the CRC opened its doors in 2001, the industry has seen a rapid increase in interest in BIM, and widening adoption. These guidelines and case studies are thus very timely, as the industry moves to model-based working and starts to share models in a new context called integrated practice. Governments, both federal and state, and in New Zealand, are starting to outline the role they might take, so that in contrast to the adoption of 2D CAD in the early 90s, we ensure that a national, industry-wide benefit results from this new paradigm of working. Section 1 of the guidelines gives an overview of BIM: how it affects our current mode of working, and what we need to do to move to fully collaborative model-based facility development. The role of open standards such as IFC is described as a mechanism to support new processes and to make the extensive design and construction information available to asset operators and managers. Digital collaboration modes, types of models, levels of detail, object properties and model management complete this section. It will be relevant for owners, managers and project leaders as well as direct users of BIM. Section 2 provides recommendations and guides for key areas of model creation and development, and the move to simulation and performance measurement. These are the more practical parts of the guidelines, developed for design professionals, BIM managers, technical staff and 'in the field' workers. The guidelines are supported by six case studies, including a summary of lessons learnt about implementing BIM in Australian building projects.
A key aspect of these publications is the identification of a number of important industry actions: the need for BIM-compatible product information and a national context for classifying product data; the need for an industry agreement and setting process-for-process definition; and finally, the need to ensure a national standard for sharing data between all of the participants in the facility-development process.