44 results for Discrete Choice Model


Relevance: 30.00%

Abstract:

Building on a previous conceptual article, we present an empirically derived model of network learning - learning by a group of organizations as a group. Based on a qualitative, longitudinal, multiple-method empirical investigation, five episodes of network learning were identified. Treating each episode as a discrete analytic case, through cross-case comparison, a model of network learning is developed which reflects the common, critical features of the episodes. The model comprises three conceptual themes relating to learning outcomes, and three conceptual themes of learning process. Although closely related to conceptualizations that emphasize the social and political character of organizational learning, the model of network learning is derived from, and specifically for, more extensive networks in which relations among numerous actors may be arms-length or collaborative, and may be expected to change over time.

Relevance: 30.00%

Abstract:

In this paper, we present syllable-based duration modelling in the context of a prosody model for Standard Yorùbá (SY) text-to-speech (TTS) synthesis applications. Our prosody model is conceptualised around a modular holistic framework. This framework is implemented using the Relational Tree (R-Tree) techniques. An important feature of our R-Tree framework is its flexibility in that it facilitates the independent implementation of the different dimensions of prosody, i.e. duration, intonation, and intensity, using different techniques and their subsequent integration. We applied the Fuzzy Decision Tree (FDT) technique to model the duration dimension. In order to evaluate the effectiveness of FDT in duration modelling, we have also developed a Classification And Regression Tree (CART) based duration model using the same speech data. Each of these models was integrated into our R-Tree based prosody model. We performed both quantitative (i.e. Root Mean Square Error (RMSE) and Correlation (Corr)) and qualitative (i.e. intelligibility and naturalness) evaluations on the two duration models. The results show that CART models the training data more accurately than FDT. The FDT model, however, shows a better ability to extrapolate from the training data since it achieved a better accuracy for the test data set. Our qualitative evaluation results show that our FDT model produces synthesised speech that is perceived to be more natural than our CART model. In addition, we also observed that the expressiveness of FDT is much better than that of CART. That is because the representation in FDT is not restricted to a set of piece-wise or discrete constant approximation. We, therefore, conclude that the FDT approach is a practical approach for duration modelling in SY TTS applications. © 2006 Elsevier Ltd. All rights reserved.
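The quantitative evaluation above compares the two duration models by RMSE and correlation between predicted and observed durations. A minimal sketch of those two metrics; the syllable durations and model outputs below are hypothetical stand-ins, not the paper's SY speech data or trained FDT/CART models:

```python
import numpy as np

def rmse(pred, obs):
    """Root mean square error between predicted and observed durations."""
    pred, obs = np.asarray(pred, float), np.asarray(obs, float)
    return float(np.sqrt(np.mean((pred - obs) ** 2)))

def corr(pred, obs):
    """Pearson correlation between predicted and observed durations."""
    return float(np.corrcoef(pred, obs)[0, 1])

# Hypothetical syllable durations (seconds) for a held-out test set.
observed = np.array([0.18, 0.22, 0.15, 0.30, 0.25])
fdt_pred = np.array([0.19, 0.21, 0.16, 0.28, 0.24])   # stand-in FDT output
cart_pred = np.array([0.17, 0.24, 0.12, 0.33, 0.27])  # stand-in CART output

for name, pred in [("FDT", fdt_pred), ("CART", cart_pred)]:
    print(f"{name}: RMSE={rmse(pred, observed):.4f}, Corr={corr(pred, observed):.3f}")
```

A lower RMSE and higher correlation on the test set is the quantitative sense in which one model extrapolates better than the other.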

Relevance: 30.00%

Abstract:

The recent development of using negative stiffness inclusions to achieve extreme overall stiffness and mechanical damping of composite materials reveals a new avenue for constructing high performance materials. One of the negative stiffness sources can be obtained from phase transforming materials in the vicinity of their phase transition, as suggested by the Landau theory. To understand the underlying mechanism from a microscopic viewpoint, we theoretically analyze a 2D, nested triangular lattice cell with pre-chosen elements containing negative stiffness to demonstrate anomalies in overall stiffness and damping. Combining this with current knowledge from continuum composite-theory models, such as the Voigt, Reuss, and Hashin-Shtrikman models, we further explore the stability of the system using Lyapunov's indirect stability theorem. The evolution of the microstructure in terms of the discrete system is discussed. A potential application of the results presented here is to develop special thin films with unusual in-plane mechanical properties. © 2006 Elsevier B.V. All rights reserved.
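The Voigt and Reuss estimates invoked above can be sketched for a two-phase composite. The stiffness values below are hypothetical, chosen only to show how a small negative-stiffness fraction drives the series (Reuss) estimate to anomalously high values; the Hashin-Shtrikman bounds are omitted for brevity:

```python
def voigt(f, e1, e2):
    """Voigt (parallel) estimate: volume-weighted arithmetic mean of stiffnesses."""
    return f * e1 + (1 - f) * e2

def reuss(f, e1, e2):
    """Reuss (series) estimate: volume-weighted harmonic mean of stiffnesses."""
    return 1.0 / (f / e1 + (1 - f) / e2)

e_matrix = 100.0   # hypothetical matrix stiffness (GPa)
e_incl = -5.0      # hypothetical negative-stiffness inclusion near its transition
for f_incl in (0.02, 0.04):
    f_m = 1.0 - f_incl
    print(f"f={f_incl}: Voigt={voigt(f_m, e_matrix, e_incl):.1f}, "
          f"Reuss={reuss(f_m, e_matrix, e_incl):.1f}")
```

In the series arrangement the negative-stiffness phase partially cancels the compliance of the matrix, so the overall stiffness can exceed that of either constituent, which is the anomaly the lattice analysis probes microscopically.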

Relevance: 30.00%

Abstract:

Recently, we have developed the hierarchical Generative Topographic Mapping (HGTM), an interactive method for visualization of large high-dimensional real-valued data sets. In this paper, we propose a more general visualization system by extending HGTM in three ways, allowing the user to visualize a wider range of data sets and better supporting the model development process. 1) We integrate HGTM with noise models from the exponential family of distributions. The basic building block is the Latent Trait Model (LTM). This enables us to visualize data of an inherently discrete nature, e.g., collections of documents, in a hierarchical manner. 2) We give the user a choice of initializing the child plots of the current plot in either interactive or automatic mode. In the interactive mode, the user selects "regions of interest," whereas in the automatic mode, an unsupervised minimum message length (MML)-inspired construction of a mixture of LTMs is employed. The unsupervised construction is particularly useful when high-level plots are covered with dense clusters of highly overlapping data projections, making it difficult to use the interactive mode. Such a situation often arises when visualizing large data sets. 3) We derive general formulas for magnification factors in latent trait models. Magnification factors are a useful tool to improve our understanding of the visualization plots, since they can highlight the boundaries between data clusters. We illustrate our approach on a toy example and evaluate it on three more complex real data sets. © 2005 IEEE.

Relevance: 30.00%

Abstract:

Background: Coronary heart disease (CHD) is a public health priority in the UK. The National Service Framework (NSF) has set standards for the prevention, diagnosis and treatment of CHD, which include the use of cholesterol-lowering agents aimed at achieving targets of blood total cholesterol (TC) < 5.0 mmol/L and low density lipoprotein-cholesterol (LDL-C) < 3.0 mmol/L. In order to achieve these targets cost effectively, prescribers need to make an informed choice from the range of statins available. Aim: To estimate the average and relative cost effectiveness of atorvastatin, fluvastatin, pravastatin and simvastatin in achieving the NSF LDL-C and TC targets. Design: Model-based economic evaluation. Methods: An economic model was constructed to estimate the number of patients achieving the NSF targets for LDL-C and TC at each dose of statin, and to calculate the average drug cost and incremental drug cost per patient achieving the target levels. The population baseline LDL-C and TC, and drug efficacy and drug costs were taken from previously published data. Estimates of the distribution of patients receiving each dose of statin were derived from the UK national DIN-LINK database. Results: The estimated annual drug cost per 1000 patients treated with atorvastatin was £289 000, with simvastatin £315 000, with pravastatin £333 000 and with fluvastatin £167 000. The percentages of patients achieving target are 74.4%, 46.4%, 28.4% and 13.2% for atorvastatin, simvastatin, pravastatin and fluvastatin, respectively. Incremental drug cost per extra patient treated to LDL-C and TC targets compared with fluvastatin were £198 and £226 for atorvastatin, £443 and £567 for simvastatin and £1089 and £2298 for pravastatin, using 2002 drug costs. Conclusions: As a result of its superior efficacy, atorvastatin generates a favourable cost-effectiveness profile as measured by drug cost per patient treated to LDL-C and TC targets.
For a given drug budget, more patients would achieve NSF LDL-C and TC targets with atorvastatin than with any of the other statins examined.
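The incremental figures quoted above follow directly from the cost and target-attainment data; the sketch below reproduces the LDL-C incremental costs to within rounding of the published percentages:

```python
# Annual drug cost per 1000 patients (GBP, 2002) and % reaching the LDL-C target,
# as reported in the abstract above.
data = {
    "atorvastatin": (289_000, 74.4),
    "simvastatin":  (315_000, 46.4),
    "pravastatin":  (333_000, 28.4),
    "fluvastatin":  (167_000, 13.2),
}

ref_cost, ref_pct = data["fluvastatin"]
for drug in ("atorvastatin", "simvastatin", "pravastatin"):
    cost, pct = data[drug]
    extra_patients = (pct - ref_pct) * 10  # extra patients to target per 1000 treated
    icer = (cost - ref_cost) / extra_patients
    print(f"{drug}: ~£{icer:.0f} per extra patient to LDL-C target vs fluvastatin")
```

Small discrepancies from the published £198/£443/£1089 arise because the percentages are rounded to one decimal place.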

Relevance: 30.00%

Abstract:

Analysing the molecular polymorphism and interactions of DNA, RNA and proteins is of fundamental importance in biology. Predicting functions of polymorphic molecules is important in order to design more effective medicines. Analysing major histocompatibility complex (MHC) polymorphism is important for mate choice, epitope-based vaccine design and transplantation rejection. Most of the existing exploratory approaches cannot analyse these datasets because of the large number of molecules with a high number of descriptors per molecule. This thesis develops novel methods for data projection in order to explore high-dimensional biological datasets by visualising them in a low-dimensional space. With increasing dimensionality, some existing data visualisation methods such as generative topographic mapping (GTM) become computationally intractable. We propose variants of these methods, where we use log-transformations at certain steps of the expectation maximisation (EM) based parameter learning process, to make them tractable for high-dimensional datasets. We demonstrate these proposed variants on both a synthetic dataset and an electrostatic potential dataset of MHC class-I. We also propose to extend the latent trait model (LTM), suitable for visualising high-dimensional discrete data, to simultaneously estimate feature saliency as an integrated part of the parameter learning process of a visualisation model. This LTM variant not only gives better visualisation by modifying the projection map based on feature relevance, but also helps users to assess the significance of each feature. Another problem which is not addressed much in the literature is the visualisation of mixed-type data. We propose to combine GTM and LTM in a principled way, where appropriate noise models are used for each type of data, in order to visualise mixed-type data in a single plot. We call this model a generalised GTM (GGTM). We also propose to extend the GGTM model to estimate feature saliencies while training a visualisation model; this is called GGTM with feature saliency (GGTM-FS). We evaluate visualisation quality using quality metrics such as a distance distortion measure and rank-based measures: trustworthiness, continuity, and mean relative rank errors with respect to data space and latent space. In cases where the labels are known, we also use the quality metrics of KL divergence and nearest-neighbour classification error in order to determine the separation between classes. We demonstrate the efficacy of these proposed models on both synthetic and real biological datasets, with a main focus on the MHC class-I dataset.

Relevance: 30.00%

Abstract:

This paper analyzes the impact of load factor, facility and generator types on the productivity of Korean electric power plants. In order to capture important differences in the effect of load policy on power output, we use a semiparametric smooth coefficient (SPSC) model that allows us to model heterogeneous performances across power plants and over time by allowing underlying technologies to be heterogeneous. The SPSC model accommodates both continuous and discrete covariates. Various specification tests are conducted to assess the performance of the SPSC model. Using a unique generator level panel dataset spanning the period 1995-2006, we find that the impact of load factor, generator and facility types on power generation varies substantially in terms of magnitude and significance across different plant characteristics. The results have strong implications for generation policy in Korea as outlined in this study.
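A semiparametric smooth coefficient model lets the regression coefficients vary smoothly with a covariate. A minimal local-constant sketch on synthetic data; the paper's estimator additionally handles discrete covariates and the panel structure, and all names and values here are illustrative:

```python
import numpy as np

def spsc_fit(x, z, y, z0, h):
    """Kernel-weighted least squares estimate of beta(z0) in y = x' beta(z) + e.

    x: (N, p) regressors; z: (N,) smoothing covariate (e.g. load factor);
    y: (N,) outcome; z0: evaluation point; h: kernel bandwidth.
    """
    w = np.exp(-0.5 * ((z - z0) / h) ** 2)          # Gaussian kernel weights
    xw = x * w[:, None]
    return np.linalg.solve(x.T @ xw, xw.T @ y)      # weighted normal equations

# Synthetic data: the slope coefficient rises smoothly with z.
rng = np.random.default_rng(0)
n = 400
z = rng.uniform(0.0, 1.0, n)
x = np.column_stack([np.ones(n), rng.normal(size=n)])
y = 0.5 * x[:, 0] + (1.0 + 2.0 * z) * x[:, 1] + rng.normal(0.0, 0.1, n)

for z0 in (0.2, 0.8):
    b = spsc_fit(x, z, y, z0, h=0.1)
    print(f"beta(z={z0}) = {b[1]:.2f} (true {1.0 + 2.0 * z0:.2f})")
```

Evaluating beta at different z values is what lets the technology vary across plant characteristics instead of being forced to a single global coefficient.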

Relevance: 30.00%

Abstract:

This chapter contributes to the anthology on learning to research - researching to learn because it emphasises a need to design curricula that enable living research, and on-going researcher development, rather than one that restricts student and staff activities within a marketised approach towards time. In recent decades higher education (HE) has come to be valued for its contribution to the global economy. Referred to as the neo-liberal university, a strong prioritisation has been placed on meeting the needs of industry by providing a better workforce. This perspective emphasises the role of a degree in HE to secure future material affluence, rather than to study as an on-going investment in the self (Molesworth, Nixon & Scullion, 2009: 280). Students are treated primarily as consumers in this model, where through their tuition fees they purchase a product, rather than benefit from the transformative potential university education offers for the whole of life. Given that HE is now measured by the numbers of students it attracts, and later places into well-paid jobs, there is an intense pressure on time, which has led to a method where the learning experiences of students are broken down into discrete modules. Whilst this provides consistency, students can come to view research processes in a fragmented way within the modular system. Topics are presented chronologically, week-by-week, and students simply complete a set of tasks to ‘have a degree’, rather than to ‘be learners’ (Molesworth, Nixon & Scullion, 2009: 277) who are living their research in relation to their own past, present and future. The idea of living research in this context is my own adaptation of an approach suggested by C. Wright Mills (1959) in The Sociological Imagination. Mills advises that successful scholars do not split their work from the rest of their lives, but treat scholarship as a choice of how to live, as well as a choice of career.
The marketised slant in HE thus creates a tension: firstly, for students who are learning to research. Mills would encourage them to be creative, not instrumental, in their use of time, yet they are journeying through a system that is structured for a swift progression towards a highly paid job, rather than crafted for reflexive inquiry that transforms their understanding throughout life. Many universities are placing a strong focus on discrete skills for student employability, but I suggest that embedding the transformative skills emphasised by Mills empowers students and builds their confidence to help them make connections that aid their employability. Secondly, the marketised approach creates a problem for staff designing the curriculum, if students do not easily make links across time over their years of study and whole programmes. By researching to learn, staff can discover new methods to apply in their design of the curriculum, to help students make important and creative connections across their programmes of study.

Relevance: 30.00%

Abstract:

We compare spot patterns generated by Turing mechanisms with those generated by replication cascades, in a model one-dimensional reaction-diffusion system. We determine the stability region of spot solutions in parameter space as a function of a natural control parameter (feed-rate) where degenerate patterns with different numbers of spots coexist for a fixed feed-rate. While it is possible to generate identical patterns via both mechanisms, we show that replication cascades lead to a wider choice of pattern profiles that can be selected through a tuning of the feed-rate, exploiting hysteresis and directionality effects of the different pattern pathways.

Relevance: 30.00%

Abstract:

Statistical mechanics of two coupled vector fields is studied in the tight-binding model that describes propagation of polarized light in discrete waveguides in the presence of the four-wave mixing. The energy and power conservation laws enable the formulation of the equilibrium properties of the polarization state in terms of the Gibbs measure with positive temperature. The transition line T = ∞ is established, beyond which discrete vector solitons are created. In the limit of large nonlinearity, an analytical expression for the distribution of Stokes parameters is obtained, which is found to depend only on the statistical properties of the initial polarization state and not on the strength of nonlinearity. The evolution of the system to the final equilibrium state is shown to pass through an intermediate stage when the energy exchange between the waveguides is still negligible. The distribution of the Stokes parameters in this regime has a complex multimodal structure strongly dependent on the nonlinear coupling coefficients and the initial conditions.
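The Stokes parameters whose distribution is analysed above are the standard quadratic combinations of the two polarization components. A minimal sketch; note that the sign convention for S3 varies between references:

```python
import numpy as np

def stokes(ex, ey):
    """Stokes parameters of a polarization state from complex field amplitudes.

    Uses the convention S3 = -2 Im(Ex Ey*); some references flip this sign.
    """
    s0 = abs(ex) ** 2 + abs(ey) ** 2          # total power
    s1 = abs(ex) ** 2 - abs(ey) ** 2          # linear horizontal/vertical balance
    s2 = 2.0 * (ex * np.conj(ey)).real        # diagonal linear component
    s3 = -2.0 * (ex * np.conj(ey)).imag       # circular component
    return s0, s1, s2, s3

# Circular polarization: equal amplitudes with a 90-degree phase offset.
s0, s1, s2, s3 = stokes(1.0 / np.sqrt(2), 1j / np.sqrt(2))
print(s0, s1, s2, s3)
```

For a fully polarized state S0² = S1² + S2² + S3², so the state lives on the Poincaré sphere over which the equilibrium Gibbs distribution is defined.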

Relevance: 30.00%

Abstract:

Markovian models are widely used to analyse quality-of-service properties of both system designs and deployed systems. Thanks to the emergence of probabilistic model checkers, this analysis can be performed with high accuracy. However, its usefulness is heavily dependent on how well the model captures the actual behaviour of the analysed system. Our work addresses this problem for a class of Markovian models termed discrete-time Markov chains (DTMCs). We propose a new Bayesian technique for learning the state transition probabilities of DTMCs based on observations of the modelled system. Unlike existing approaches, our technique weighs observations based on their age, to account for the fact that older observations are less relevant than more recent ones. A case study from the area of bioinformatics workflows demonstrates the effectiveness of the technique in scenarios where the model parameters change over time.
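The age-weighting idea described above can be sketched as exponential discounting of Dirichlet pseudo-counts before each new observation is added. The decay factor and prior strength below are illustrative assumptions, not the paper's calibrated weighting scheme:

```python
import numpy as np

def learn_dtmc(observations, n_states, decay=0.95, prior=1.0):
    """Bayesian estimate of DTMC transition probabilities from a state trace,
    discounting older transitions exponentially so that recent behaviour
    dominates the estimate."""
    counts = np.full((n_states, n_states), prior)   # Dirichlet prior pseudo-counts
    for s, s_next in zip(observations, observations[1:]):
        counts *= decay                              # age all existing evidence
        counts[s, s_next] += 1.0                     # add the newest observation
    return counts / counts.sum(axis=1, keepdims=True)

# A 2-state system whose behaviour shifts mid-trace: the recent 1->1 self-loops
# dominate the estimate because the older 0->0 evidence has been discounted.
trace = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
p = learn_dtmc(trace, n_states=2)
print(np.round(p, 3))
```

With decay=1.0 this reduces to ordinary Dirichlet-multinomial counting, which is the un-weighted baseline the paper's technique improves on when parameters drift.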

Relevance: 30.00%

Abstract:

We investigate the mobility of nonlinear localized modes in a generalized discrete Ginzburg-Landau-type model, describing a one-dimensional waveguide array in an active Kerr medium with intrinsic, saturable gain and damping. It is shown that exponentially localized, traveling discrete dissipative breather-solitons may exist as stable attractors supported only by intrinsic properties of the medium, i.e., in the absence of any external field or symmetry-breaking perturbations. Through an interplay between the gain and damping effects, the moving soliton may overcome the Peierls-Nabarro barrier, present in the corresponding conservative system, by self-induced time-periodic oscillations of its power (norm) and energy (Hamiltonian), yielding exponential decays to zero with different rates in the forward and backward directions. In certain parameter windows, bistability appears between fast modes with small oscillations and slower, large-oscillation modes. The velocities and the oscillation periods are typically related by lattice commensurability and exhibit period-doubling bifurcations to chaotically "walking" modes under parameter variations. If the model is augmented by intersite Kerr nonlinearity, thereby reducing the Peierls-Nabarro barrier of the conservative system, the existence regime for moving solitons increases considerably, and a richer scenario appears including Hopf bifurcations to incommensurately moving solutions and phase-locking intervals. Stable moving breathers also survive in the presence of weak disorder. © 2014 American Physical Society.

Relevance: 30.00%

Abstract:

Discrete-event simulation (DES) is an established technique used to model manufacturing and service systems. Although the importance of modelling people in a DES has been recognised, there is little guidance on how this can be achieved in practice. The results of a literature review were used to identify examples of the use of DES to model people, and each article was examined to determine the method used to model people within the simulation study. It was found that there is no common method but a diverse range of approaches used to model human behaviour in DES. This paper provides an outline of the approaches used to model people in terms of their decision making, availability for work, task performance and arrival rate. The outcome brings together the current knowledge in this area and will be of interest to researchers considering developing a methodology for modelling people in DES and to practitioners engaged in a simulation project involving the modelling of people's behaviour.
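One of the aspects listed above, availability for work, can be illustrated with a deliberately minimal single-server sketch in which the person serving jobs is sometimes on a break. All rates are illustrative assumptions, not drawn from any specific reviewed study:

```python
import random

def simulate(n_jobs=200, seed=1, arrival_rate=1.0, service_rate=1.2, avail=0.8):
    """Minimal DES sketch of a FIFO single-server queue where the worker is
    only available with probability `avail` when a job is ready to start;
    an unavailable worker delays the job by an exponential break time."""
    random.seed(seed)
    clock, busy_until, waits = 0.0, 0.0, []
    for _ in range(n_jobs):
        clock += random.expovariate(arrival_rate)    # next job arrives
        start = max(clock, busy_until)               # wait for the server
        if random.random() > avail:                  # worker on a break:
            start += random.expovariate(2.0)         # job waits for their return
        busy_until = start + random.expovariate(service_rate)
        waits.append(start - clock)
    return sum(waits) / len(waits)

print(f"mean wait = {simulate():.2f} time units")
```

Comparing runs at different `avail` values shows how modelling a person's availability, rather than treating them as an always-on machine, changes the predicted queue behaviour.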

Relevance: 30.00%

Abstract:

Our goal here is a more complete understanding of how information about luminance contrast is encoded and used by the binocular visual system. In two-interval forced-choice experiments we assessed observers' ability to discriminate changes in contrast that could be an increase or decrease of contrast in one or both eyes, or an increase in one eye coupled with a decrease in the other (termed IncDec). The base or pedestal contrasts were either in-phase or out-of-phase in the two eyes. The opposed changes in the IncDec condition did not cancel each other out, implying that along with binocular summation, information is also available from mechanisms that do not sum the two eyes' inputs. These might be monocular mechanisms. With a binocular pedestal, monocular increments of contrast were much easier to see than monocular decrements. These findings suggest that there are separate binocular (B) and monocular (L,R) channels, but only the largest of the three responses, max(L,B,R), is available to perception and decision. Results from contrast discrimination and contrast matching tasks were described very accurately by this model. Stimuli, data, and model responses can all be visualized in a common binocular contrast space, allowing a more direct comparison between models and data. Some results with out-of-phase pedestals were not accounted for by the max model of contrast coding, but were well explained by an extended model in which gratings of opposite polarity create the sensation of lustre. Observers can discriminate changes in lustre alongside changes in contrast.
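The max(L, B, R) rule described above can be sketched with a toy transducer. With these illustrative choices (not the paper's fitted model), a monocular decrement on a binocular pedestal is masked by the unchanged eye while a matched increment is not, and the opposed IncDec change is signalled through a monocular channel even though it cancels in the binocular one:

```python
def channel_responses(c_left, c_right, p=2.4, z=5.0):
    """Three-channel sketch: the available response is the largest of a left
    monocular, right monocular, and binocular channel. The transducer constants
    and the averaging binocular combination are illustrative assumptions."""
    def transduce(c):
        return c ** p / (z + c ** p)                 # compressive transducer
    left = transduce(c_left)
    right = transduce(c_right)
    binoc = transduce((c_left + c_right) / 2.0)      # binocular combination
    return max(left, binoc, right)

for label, (cl, cr) in {
    "binocular pedestal":  (10.0, 10.0),
    "monocular increment": (12.0, 10.0),
    "monocular decrement": (8.0, 10.0),
    "IncDec":              (12.0, 8.0),
}.items():
    print(f"{label}: max(L, B, R) = {channel_responses(cl, cr):.4f}")
```

Here the decrement stimulus produces exactly the pedestal response (the unchanged eye's monocular channel floors the max), while the increment and IncDec stimuli raise it, mirroring the asymmetry and the failure of pure binocular summation reported above.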