898 results for Bayesian shared component model


Relevance: 100.00%

Abstract:

The purpose of this thesis is to explain the attitudinal and psychological factors that affect customer-owners' customer behavior in the presence of financial incentives. Customer behavior is divided into three forms: word-of-mouth behavior, the relative concentration of purchases, and switching intention. The different aspects of customer behavior are explained using the concepts of organizational identification, the three components of commitment, organizational image, and psychological ownership, and the formation mechanisms of these concepts are examined in a customer context. The thesis is a quantitative study in which collected survey data are analyzed with path analysis to identify the relationships and effects between the concepts. The results show that, in addition to financial incentives, several psychological states experienced by customer-owners affect the elements of customer behavior. The study also found that psychological ownership and organizational identification are relevant concepts for studying the customer behavior of the members of a cooperative company, even though they have rarely been studied in such contexts before. The formation mechanisms of the concepts were found to largely follow the views presented in the literature.

Relevance: 100.00%

Abstract:

The advent of new advances in mobile computing has changed the way we do our daily work, even enabling us to perform collaborative activities. However, current groupware approaches do not offer an integrated and efficient solution that jointly tackles the flexibility and heterogeneity inherent to mobility as well as the awareness aspects intrinsic to collaborative environments. Issues related to the diversity of contexts of use are collected under the term plasticity. A great number of tools have emerged offering solutions to some of these issues, although always focused on individual scenarios. We are working on reusing and specializing some existing plasticity tools for groupware design. The aim is to offer the benefits of plasticity and awareness jointly, in order to achieve real collaboration and a deeper understanding of multi-environment groupware scenarios. In particular, this paper presents a conceptual framework intended to serve as a reference for the generation of plastic User Interfaces for collaborative environments in a systematic and comprehensive way. Starting from a previous conceptual framework for individual environments, inspired by the model-based approach, we introduce specific components and considerations related to groupware.

Relevance: 100.00%

Abstract:

The remote production of business support services represents a new form of internationalization. The rise of emerging markets, combined with the internationalization of companies' value chain activities, has created growing pressure on companies to seek the best location for their operations. Multinational companies have increasingly replaced local HR services by moving to a global model of shared services delivery. This master's thesis was carried out to support UPM's human resources function in establishing a global service center in Poland. The aim of the study is to broaden understanding of the factors and motives that led to the renewal of the HR service delivery model. The main goal of the empirical study is to support the transfer of administrative recruitment work to the global service center while keeping service quality at least at its previous level. The results of the study emphasize the strategic perspective on the change. The strategic reasons for establishing UPM's global HR service center include reducing overcapacity and overlapping activities in different countries. The change increases service flexibility and promotes operational transparency, predictability, and cost control. Successfully implemented shared services can serve as a good starting point for producing efficient HR services.

Relevance: 100.00%

Abstract:

The objective of the study is to examine the process of establishing a global financial shared service center and the benefits associated with the shared services model. The aim is to define the phases involved in establishing a shared service center, as well as the critical factors that must be taken into account before and during the establishment process. The empirical part was carried out as a qualitative study by conducting four semi-structured theme interviews in organizations that have established a global financial shared service center. The results show that the most significant benefits associated with the shared services model are improvements in productivity, efficiency, and quality, together with cost savings. The most critical success factors relate to planning and managing the change. The challenges associated with the model mainly concern the diversity of the global operating environment, personnel, technology, and processes. The results are highly consistent with previous research. Based on the results of the empirical study and earlier research, a five-phase model of the process of establishing a global financial shared service center has been created.

Relevance: 100.00%

Abstract:

This study aimed to evaluate the interference of the tuberculin test with the gamma-interferon (INFg) assay, to estimate the sensitivity and specificity of the INFg assay under Brazilian conditions, and to simulate multiple testing using the comparative tuberculin test and the INFg assay. Three hundred and fifty cattle from two TB-free and two TB-infected herds were subjected to the comparative tuberculin test and the INFg assay. The comparative tuberculin test was performed using avian and bovine PPD. The INFg assay was performed with the Bovigam™ kit (CSL Veterinary, Australia), according to the manufacturer's specifications. The sensitivity and specificity of the INFg assay were assessed by a Bayesian latent class model, and these diagnostic parameters were also estimated for multiple testing. The results of the INFg assay on days 0 (D0) and 3 (D3) after the comparative tuberculin test were compared by McNemar's test and kappa statistics. The mean optical densities from the INFg assay on both days were similar. The sensitivity and specificity of the INFg assay varied (95% confidence intervals) from 72 to 100% and from 74 to 100%, respectively. The sensitivity of parallel testing was over 97.5%, while the specificity of serial testing was over 99.7%. The INFg assay proved to be a very useful diagnostic method.
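As a rough illustration of the multiple-testing simulation described above, the sketch below combines two tests in parallel and in series under the simplifying assumption of conditional independence. The point estimates for sensitivity and specificity are hypothetical values chosen within the reported intervals, not figures from the study.

```python
# Illustrative only: combines two diagnostic tests assuming conditional
# independence, a simplification of the Bayesian latent class approach used above.

def parallel_test(se1, sp1, se2, sp2):
    """Parallel interpretation: an animal is positive if either test is positive."""
    se = 1 - (1 - se1) * (1 - se2)
    sp = sp1 * sp2
    return se, sp

def serial_test(se1, sp1, se2, sp2):
    """Serial interpretation: an animal is positive only if both tests are positive."""
    se = se1 * se2
    sp = 1 - (1 - sp1) * (1 - sp2)
    return se, sp

# Hypothetical point estimates within the reported 72-100% / 74-100% ranges.
se_tuberculin, sp_tuberculin = 0.80, 0.97
se_ifng, sp_ifng = 0.88, 0.96

print("parallel (Se, Sp):", parallel_test(se_tuberculin, sp_tuberculin, se_ifng, sp_ifng))
print("serial   (Se, Sp):", serial_test(se_tuberculin, sp_tuberculin, se_ifng, sp_ifng))
```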

Relevance: 100.00%

Abstract:

The amount of biological data has grown exponentially in recent decades. Modern biotechnologies, such as microarrays and next-generation sequencing, can produce massive amounts of biomedical data in a single experiment. As the amount of data grows rapidly, there is an urgent need for reliable computational methods for analyzing and visualizing it. This thesis addresses this need by studying how to efficiently and reliably analyze and visualize high-dimensional data, especially data obtained from gene expression microarray experiments. First, we study ways to improve the quality of microarray data by replacing (imputing) the missing data entries with estimated values. Missing value imputation is commonly used to make an originally incomplete data set complete, making it easier to analyze with statistical and computational methods. Our novel approach was to use curated external biological information as a guide for the missing value imputation. Secondly, we studied the effect of missing value imputation on downstream data analysis methods such as clustering. We compared multiple recent imputation algorithms on eight publicly available microarray data sets. It was observed that missing value imputation is indeed a rational way to improve the quality of biological data. The research revealed differences between the clustering results obtained with different imputation methods. On most data sets the simple and fast k-NN imputation was good enough, but more advanced imputation methods, such as Bayesian principal component analysis (BPCA), were sometimes needed. Finally, we studied the visualization of biological network data. Biological interaction networks are an example of the outcome of multiple biological experiments, such as those using gene microarray techniques. Such networks are typically very large and highly connected, so there is a need for fast algorithms that produce visually pleasing layouts. A computationally efficient way to produce layouts of large biological interaction networks was developed. The algorithm uses multilevel optimization within a standard force-directed graph layout algorithm.
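The following is a minimal sketch of the k-NN imputation step described above, not the specific implementations or the biologically guided approach evaluated in the thesis: it knocks out entries from a toy expression matrix (synthetic data) and imputes them with scikit-learn's KNNImputer.

```python
# Minimal sketch of k-NN missing value imputation on a synthetic expression matrix.
import numpy as np
from sklearn.impute import KNNImputer

rng = np.random.default_rng(0)
expression = rng.normal(size=(20, 8))        # 20 genes x 8 arrays (toy data)
mask = rng.random(expression.shape) < 0.1    # knock out roughly 10% of the entries
incomplete = expression.copy()
incomplete[mask] = np.nan

imputer = KNNImputer(n_neighbors=5)          # impute from the 5 nearest gene profiles
completed = imputer.fit_transform(incomplete)

rmse = np.sqrt(np.mean((completed[mask] - expression[mask]) ** 2))
print(f"RMSE on imputed entries: {rmse:.3f}")
```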

Relevance: 100.00%

Abstract:

Towards a contextual understanding of B2B salespeople's selling competencies: an exploratory study among purchasing decision-makers of internationally oriented technology firms.

The characteristics of modern selling can be classified as follows: customer retention and loyalty targets, database and knowledge management, customer relationship management, marketing activities, problem solving and system selling, and satisfying needs and creating value. For salespeople to be successful in this environment, they need a wide range of competencies. Salespeople's selling skills are well documented in seller-side literature through quantitative methods, but the knowledge, skills and competencies seen from the buyer's perspective are under-researched. The existing research on selling competencies should be broadened and updated through a qualitative research perspective because of the dynamic nature and contextual dependence of selling competencies. The purpose of the study is to increase understanding of the professional salesperson's selling competencies from the industrial purchasing decision-makers' viewpoint within the relationship selling context. In this study, competencies are defined as sales-related knowledge and skills. The scope of the study includes goods, materials and services managed by a company's purchasing function and used by an organization on a daily basis. The abductive approach and 'systematic combining' were applied as the research strategy. Data were generated through semi-structured, person-to-person interviews with open-ended questions. The study was conducted among purchasing decision-makers in the technology industry in Finland. The branches consisted of the electronics and electro-technical industries and the mechanical engineering and metals industries. A total of 30 companies, and one purchasing decision-maker from each company, were purposively chosen for the sample. The sample covers different company sizes based on revenue and differing ownership structures, varying from public to family companies and representing both domestic and international ownership. Before analysis, the data were organized by the purchasing orientation of the buyers: the buying, procurement or supply management orientation. Thematic analysis was chosen as the analysis method, and after the analysis the results were contrasted with the theory; there was continuous interaction between the empirical data and the theory. Based on the findings, a total of 19 major types of knowledge and skills were identified from the buyers' perspective. The specific knowledge and skills, viewed through the customers' prevalent purchasing orientations, were divided into two categories, generic and contextual. The generic knowledge and skills apply to all purchasing orientations, while the contextual knowledge and skills depend on the customers' prevalent purchasing orientations. Generic knowledge and skills relate to price setting, negotiation, and communication and interaction skills, while contextual ones relate to knowledge brokering, the ability to present solutions, and relationship skills. Buying-oriented buyers value salespeople who are 'action-oriented experts, however at a bit of an arm's length', procurement buyers value salespeople who are 'experts deeply dedicated to the customer and fostering the relationship', and supply management buyers value salespeople who are 'corporate-oriented experts'.

In addition, the buyers' perceptions of knowledge and selling skills differ from the sellers'. The buyer side emphasizes mastery of the subject matter, consisting of expertise, understanding the customer's business and needs, creating a customized solution and creating value, reliability, and the ability to build long-term relationships, while the seller side emphasizes communication, interaction and salesmanship skills. The study integrates the selling skills of the current three-component model (technical knowledge, salesmanship skills and interpersonal skills) with relationship skills and purchasing orientations into a selling competency model. The findings deepen and update the content of this knowledge and these skills in the B2B setting and create new insights into them from the buyer's perspective, and thus the study increases the contextual understanding of selling competencies. It generates new knowledge of the salesperson's competencies for the relationship selling, personal selling and sales management literature, and it adds knowledge of buying orientations to the buying behavior literature. The findings challenge sales management to view salespeople's selling skills from both a contingency and a competence perspective. The study has several managerial implications: it increases understanding of which selling knowledge and skills are critical from the buyer's point of view, of how salespeople effectively implement the relationship marketing concept, of how sales management can manage the sales process more effectively and efficiently, and of how sales management should develop a salesperson's selling competencies when managing and developing the sales force. Keywords: selling competencies, knowledge, selling skills, relationship skills, purchasing orientations, B2B selling, abductive approach, technology firms

Relevance: 100.00%

Abstract:

The software packages used are Splus and R.

Relevance: 100.00%

Abstract:

Subclinical mastitis is a frequent and costly health problem. Intramammary infections (IMI) are often detected using somatic cell count (SCC) measurements. Bacteriological culture of milk is nevertheless required to identify the causative pathogen. Because of this difficulty, practically all research on subclinical mastitis has focused on the prevalence of IMI, and the risk factors for the incidence or elimination of IMI are poorly known. The main objective of this thesis was to identify the modifiable risk factors associated with the incidence, elimination, and prevalence of important IMI in Canadian dairy herds. First, a systematic review of the literature on the associations between on-farm practices and SCC was carried out. Management practices consistently associated with SCC were identified and distinguished from those supported only by anecdotal reports. A bilingual questionnaire was then developed, validated, and used to measure the management practices of a sample of 90 Canadian dairy herds. To validate the tool, the repeatability and validity of the questionnaire items were analyzed, and the equivalence of the English and French versions was assessed. These analyses identified problematic items that had to be recategorized, where possible, or excluded from subsequent analyses to ensure data quality. Most of the herds studied already used post-milking teat disinfection and blanket dry cow therapy, but many of the recommended practices were little used. Next, the modifiable risk factors associated with the incidence, elimination, and prevalence of Staphylococcus aureus IMI were investigated longitudinally in the 90 selected herds. The incidence of IMI appeared to be a more important determinant of herd IMI prevalence than the elimination of IMI. Wearing gloves during milking, pre-milking teat disinfection, and good teat-end condition showed desirable associations with the various IMI measures. These results underline the importance of milking procedures for achieving a long-term reduction in IMI prevalence. Finally, the modifiable risk factors associated with the incidence, elimination, and prevalence of coagulase-negative staphylococci (CNS) IMI were studied in a similar way. However, to take into account the limitations of bacteriological milk culture for identifying IMI caused by this group of pathogens, a semi-Bayesian approach using latent class models was used. Unadjusted estimates of incidence, elimination, prevalence, and associations with exposures all appeared considerably biased by the imperfections of the diagnostic procedure, and this bias was generally toward the null. Once again, IMI incidence was the main determinant of herd IMI prevalence. Sand and wood-product bedding, as well as access to pasture, were associated with a lower incidence and prevalence of CNS.
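The semi-Bayesian latent class adjustment mentioned above corrects for the imperfect sensitivity and specificity of milk culture. The toy sketch below illustrates the underlying misclassification bias with the simpler, classical Rogan-Gladen correction rather than the latent class models fitted in the thesis; the prevalence, sensitivity and specificity values are hypothetical.

```python
# Illustration of misclassification bias: apparent prevalence under an imperfect
# test versus the Rogan-Gladen corrected estimate. The values below are
# hypothetical and not taken from the thesis.
def apparent_prevalence(true_prev, se, sp):
    return true_prev * se + (1 - true_prev) * (1 - sp)

def rogan_gladen(app_prev, se, sp):
    return (app_prev + sp - 1) / (se + sp - 1)

true_prev = 0.30        # hypothetical true prevalence of CNS intramammary infection
se, sp = 0.60, 0.95     # hypothetical sensitivity/specificity of milk culture

app = apparent_prevalence(true_prev, se, sp)
print(f"apparent prevalence: {app:.3f}")                      # biased toward the null
print(f"corrected estimate : {rogan_gladen(app, se, sp):.3f}")
```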

Relevance: 100.00%

Abstract:

A system described by a large number of strongly interdependent elements is complex and difficult to understand and maintain. An object-oriented application is thus often complex, because it contains hundreds of classes with numerous more or less explicit dependencies. The same application, built with the component paradigm, would contain a smaller number of elements, loosely coupled with one another and with clearly defined interdependencies. This is because the component paradigm provides a good high-level representation of complex systems. This paradigm can therefore be used as a "projection space" for object-oriented systems. Such a projection can ease the comprehension stage of a system, a necessary prerequisite for any maintenance and/or evolution activity. Moreover, this representation can be used as a model for completely restructuring an operational object-oriented application into an equivalent, equally operational component-based application, with the new application benefiting from all the good properties associated with the component paradigm. The objective of my thesis is to propose a semi-automatic method for identifying a component-based architecture in an object-oriented application. This architecture must not only help in understanding the original application, but also simplify its projection onto a concrete component model. The identification of a component-based architecture is carried out in three main steps: i) obtaining the data needed for the identification process, which correspond to the dependencies between classes and are collected through dynamic analysis of the target application; ii) identifying the components, for which three methods were explored: the first uses a Galois lattice, the second two metaheuristics, and the third a multi-objective metaheuristic; iii) identifying the component-based architecture of the target application, which is done by identifying the required and provided interfaces of each component. To validate this identification process, as well as the various choices made during its development, I carried out several case studies. Finally, I show the feasibility of projecting the identified component-based architecture onto a concrete component model.
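For illustration only, the sketch below groups a toy class-dependency graph into candidate components using modularity-based community detection from networkx. This is a stand-in technique, not the Galois lattice or metaheuristic methods explored in the thesis, and the class names are hypothetical.

```python
# Toy grouping step: cluster a class-dependency graph into candidate components
# using modularity-based community detection (a stand-in technique, not the
# methods from the thesis). Class names are hypothetical.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

dependencies = [
    ("OrderService", "OrderRepository"), ("OrderService", "Invoice"),
    ("Invoice", "OrderRepository"),
    ("UserService", "UserRepository"), ("UserService", "Session"),
    ("Session", "UserRepository"),
    ("OrderService", "UserService"),   # single cross-component dependency
]

graph = nx.Graph()
graph.add_edges_from(dependencies)

candidate_components = greedy_modularity_communities(graph)
for i, component in enumerate(candidate_components, start=1):
    print(f"component {i}: {sorted(component)}")
```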

Relevance: 100.00%

Abstract:

This thesis is entitled "Modelling and Analysis of Recurrent Event Data with Multiple Causes". Survival data is a term used for data that measure the time to the occurrence of an event; in survival studies, this time is generally referred to as the lifetime. Recurrent event data are commonly encountered in longitudinal studies when individuals are followed to observe repeated occurrences of certain events. In many practical situations, individuals under study are exposed to failure from more than one cause, and the eventual failure can be attributed to exactly one of these causes. The proposed models are useful in real-life situations for studying the effect of covariates on recurrences of certain events due to different causes. In Chapter 3, an additive hazards model for gap-time distributions of recurrent event data with multiple causes was introduced, and its parameter estimation and asymptotic properties were discussed. In Chapter 4, a shared frailty model for the analysis of bivariate competing risks data was presented, and estimation procedures for the shared gamma frailty model, without and with covariates, using the EM algorithm were discussed. In Chapter 6, two nonparametric estimators for the bivariate survivor function of paired recurrent event data were developed and their asymptotic properties were studied. The proposed estimators were applied to a real-life data set, and simulation studies were carried out to assess their efficiency.
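The shared gamma frailty model mentioned above induces dependence between paired event times. The sketch below simulates that mechanism under simplifying assumptions (an exponential baseline hazard, no covariates, an arbitrary frailty variance); it illustrates the induced association rather than the EM estimation procedure used in the thesis.

```python
# Simulation sketch of a shared gamma frailty model for paired survival times.
# Assumptions (not from the thesis): exponential baseline hazard, no covariates,
# frailty variance theta chosen arbitrarily.
import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(1)
n_pairs = 5000
theta = 0.8                      # frailty variance (hypothetical)
baseline_rate = 0.1              # exponential baseline hazard rate

# One shared frailty per pair, with mean 1 and variance theta.
frailty = rng.gamma(shape=1.0 / theta, scale=theta, size=n_pairs)

# Two event times per pair, conditionally independent given the frailty.
t1 = rng.exponential(1.0 / (frailty * baseline_rate))
t2 = rng.exponential(1.0 / (frailty * baseline_rate))

# For the gamma frailty model, Kendall's tau equals theta / (theta + 2).
tau_hat, _ = kendalltau(t1, t2)
print(f"theoretical Kendall tau: {theta / (theta + 2):.3f}")
print(f"empirical Kendall tau:   {tau_hat:.3f}")
```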

Relevance: 100.00%

Abstract:

Motion instability is an important issue during the operation of towed underwater vehicles (TUV), and it considerably affects the accuracy of the high-precision acoustic instrumentation housed inside them. Of the various parameters responsible for this, disturbances from the tow-ship are the most significant. The present study focuses on the motion dynamics of an underwater towing system with ship-induced disturbances as the input, and in particular on an innovative system called two-part towing. The methodology involves numerical modeling of the tow system, which consists of modeling the tow-cables and formulating the vehicles. A previous study in this direction used a segmental approach for modeling the cable. Although that model was successful in predicting the heave response of the tow-body, instabilities were observed in the numerical solution. The present study devises a simple approach, the lumped mass spring model (LMSM), for the cable formulation. In this work, the traditional LMSM has been modified in two ways: first, by implementing advanced time integration procedures, and second, by using a modified beam model that involves only translational degrees of freedom in solving the beam equation. A number of time integration procedures, such as Euler, Houbolt, Newmark and HHT-α, were implemented in the traditional LMSM, and the strengths and weaknesses of each scheme were evaluated numerically. In most previous studies, hydrodynamic forces acting on the tow system, such as drag and lift, are approximated as analytical expressions of the velocities. This approach restricts these models to simple cylinder-shaped towed bodies and may not be applicable to modern tow systems, which are diverse in shape and complexity. Hence, in this study the hydrodynamic parameters of the tow system, such as drag and lift, are estimated using CFD techniques. To achieve this, a RANS-based CFD code has been developed. Further, a new convection interpolation scheme for CFD simulation, called BNCUS, which is a blend of cell-based and node-based formulations, was proposed and tested numerically. Because solving the fluid dynamic equations takes considerable time, a dedicated parallel computing setup has been developed. Two types of computational parallelism are explored in the current study: a model for shared-memory processors and one for distributed-memory processors. The shared-memory model was used for the structural dynamic analysis of the towing system, while the distributed-memory model was used for solving the fluid dynamic equations.
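As an illustration of one of the time integration schemes named above, the sketch below applies the Newmark-beta method (average acceleration, beta = 1/4, gamma = 1/2) to a single-degree-of-freedom mass-spring oscillator. This is a generic textbook example, not the cable or tow-body formulation from the study; the mass, stiffness and step size are arbitrary.

```python
# Newmark-beta time integration (average acceleration) for a single-DOF
# mass-spring-damper in free vibration. Parameters are arbitrary, not from the study.
import numpy as np

m, c, k = 1.0, 0.0, 4.0 * np.pi**2        # mass, damping, stiffness (natural period = 1 s)
beta, gamma = 0.25, 0.5                   # average acceleration scheme
dt, n_steps = 0.01, 500                   # 5 s of simulated time

u, v = 1.0, 0.0                           # initial displacement and velocity
a = (0.0 - c * v - k * u) / m             # initial acceleration from the equation of motion

k_eff = k + gamma / (beta * dt) * c + m / (beta * dt**2)
for _ in range(n_steps):
    # Effective load (external force is zero in free vibration).
    f_eff = (m * (u / (beta * dt**2) + v / (beta * dt) + (1.0 / (2.0 * beta) - 1.0) * a)
             + c * (gamma / (beta * dt) * u + (gamma / beta - 1.0) * v
                    + dt * (gamma / (2.0 * beta) - 1.0) * a))
    u_new = f_eff / k_eff
    a_new = (u_new - u) / (beta * dt**2) - v / (beta * dt) - (1.0 / (2.0 * beta) - 1.0) * a
    v_new = v + dt * ((1.0 - gamma) * a + gamma * a_new)
    u, v, a = u_new, v_new, a_new

print(f"Newmark displacement at t=5 s: {u:.4f}  (exact: {np.cos(2 * np.pi * 5.0):.4f})")
```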

Relevance: 100.00%

Abstract:

As the number of processors in distributed-memory multiprocessors grows, efficiently supporting a shared-memory programming model becomes difficult. We have designed the Protocol for Hierarchical Directories (PHD) to allow shared-memory support for systems containing massive numbers of processors. PHD eliminates bandwidth problems by using a scalable network, decreases hot-spots by not relying on a single point to distribute blocks, and uses a scalable amount of space for its directories. PHD provides a shared-memory model by synthesizing a global shared memory from the local memories of processors. PHD supports sequentially consistent read, write, and test-and-set operations. This thesis also introduces a method of describing locality for hierarchical protocols and employs this method in the derivation of an abstract model of the protocol behavior. An embedded model, based on the work of Johnson [ISCA19], describes the protocol behavior when mapped to a k-ary n-cube. The thesis uses these two models to study the average height in the hierarchy that operations reach, the longest path messages travel, the number of messages that operations generate, the inter-transaction issue time, and the protocol overhead for different locality parameters, degrees of multithreading, and machine sizes. We determine that multithreading is only useful for approximately two to four threads; any additional interleaving does not decrease the overall latency. For small machines and high locality applications, this limitation is due mainly to the length of the running threads. For large machines with medium to low locality, this limitation is due mainly to the protocol overhead being too large. Our study using the embedded model shows that in situations where the run length between references to shared memory is at least an order of magnitude longer than the time to process a single state transition in the protocol, applications exhibit good performance. If separate controllers for processing protocol requests are included, the protocol scales to 32k-processor machines as long as the application exhibits hierarchical locality: at least 22% of the global references must be able to be satisfied locally, and at most 35% of the global references are allowed to reach the top level of the hierarchy.
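As a back-of-the-envelope illustration of the hierarchical locality requirement quoted above, the sketch below uses a deliberately simplified geometric model (my own assumption, not the embedded model from the thesis): each level of the hierarchy is assumed to satisfy a fixed fraction of the references that reach it, so the share reaching the top level falls off geometrically.

```python
# Simplified geometric locality model (an assumption for illustration, not the
# embedded model from the thesis): if each level satisfies a fraction p of the
# references that reach it, then (1 - p)**levels of the references reach the top.
def fraction_reaching_top(p_per_level: float, levels: int) -> float:
    return (1.0 - p_per_level) ** levels

for p in (0.22, 0.35, 0.50):
    share = fraction_reaching_top(p, levels=4)
    print(f"per-level hit rate {p:.2f}: {share:.1%} of references reach the top of a 4-level hierarchy")
```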

Relevance: 100.00%

Abstract:

Financial integration has been pursued aggressively across the globe in the last fifty years; however, there is no conclusive evidence on the diversification gains (or losses) of such efforts. These gains (or losses) are related to the degree of comovement and synchronization among increasingly integrated global markets. We quantify the degree of comovement within the integrated Latin American market (MILA). We use dynamic correlation models to quantify comovements across securities, as well as a direct integration measure. Our results show an increase in comovements when we look at the country indexes; however, the increasing trend in correlation predates the institutional efforts to establish an integrated market in the region. On the other hand, when we look at sector indexes and an integration measure, we find a decrease in comovements among a representative sample of securities from the integrated market.
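As a simplified stand-in for the dynamic correlation models mentioned above, the sketch below computes a rolling-window correlation between two synthetic return series that share a common factor; the data are simulated, not MILA index returns.

```python
# Simplified comovement illustration: rolling-window correlation between two
# simulated return series, as a stand-in for the dynamic conditional correlation
# models used in the paper. The data here are synthetic, not MILA indexes.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
n = 1000
common = rng.normal(scale=0.01, size=n)                  # shared regional factor
r1 = 0.6 * common + rng.normal(scale=0.01, size=n)       # "country index" 1
r2 = 0.6 * common + rng.normal(scale=0.01, size=n)       # "country index" 2

returns = pd.DataFrame({"index_1": r1, "index_2": r2})
rolling_corr = returns["index_1"].rolling(window=120).corr(returns["index_2"])

print(rolling_corr.describe())                           # summary of the time-varying comovement
```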