927 results for Endogenous Information Structure
Abstract:
Customer knowledge management (CKM) practices enable organizations to create customer competence through the systematic use of customer information that is integrated throughout the organization. Nonetheless, organizations are not able to fully exploit the vast amount of data available. Previous research on the use of customer information is limited, especially in a multichannel environment. The aim of this study was to identify the main obstacles to utilizing customer information efficiently across multiple sales channels. The study was conducted as a single case study in order to gain a deeper understanding of the research problem. The empirical findings indicate that the lack of CKM practices and of a common goal are major challenges obstructing effective utilization of customer information. Furthermore, a decentralized organizational structure and insufficient analytical skills create obstacles for information sharing and for the capabilities to process information and create new knowledge. The implications of the study suggest that, in order to create customer competence, organizations should shift their focus from technology to the organizational factors affecting the use of information and implement CKM practices throughout the organization.
Abstract:
Previous genetic association studies have overlooked the potential for biased results when analyzing different population structures in ethnically diverse populations. The purpose of the present study was to quantify this bias in two-locus association studies conducted on an admixed urban population. We studied the genetic structure distribution of angiotensin-converting enzyme insertion/deletion (ACE I/D) and angiotensinogen methionine/threonine (M/T) polymorphisms in 382 subjects from three subgroups in a highly admixed urban population. Group I included 150 white subjects; group II, 142 mulatto subjects; and group III, 90 black subjects. We conducted sample size simulation studies using these data in different genetic models of gene action and interaction, and used genetic distance calculation algorithms to help determine the population structure for the studied loci. Our results showed a statistically different population structure distribution of both the ACE I/D (P = 0.02, OR = 1.56, 95% CI = 1.05-2.33 for the D allele, white versus black subgroup) and the angiotensinogen M/T polymorphism (P = 0.007, OR = 1.71, 95% CI = 1.14-2.58 for the T allele, white versus black subgroup). Different sample sizes are predicted to be determinants of the power to detect a given genotypic association with a particular phenotype when conducting two-locus association studies in admixed populations. In addition, the postulated genetic model is also a major determinant of the power to detect any association in a given sample size. The present simulation study helped to demonstrate the complex interrelation among ethnicity, power of the association, and the postulated genetic model of action of a particular allele in the context of clustering studies. This information is essential for the correct planning and interpretation of future association studies conducted on this population.
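Allele-based odds ratios with Wald confidence intervals, like those reported above, are computed from a 2x2 table of allele counts. A minimal sketch follows; the counts used here are hypothetical illustrations, not the study's raw data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table of allele counts:
    a/b = risk/reference allele counts in one group, c/d in the other."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)          # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical allele counts (NOT the study's data), e.g. D vs I alleles
# in two subgroups:
or_, lo, hi = odds_ratio_ci(180, 120, 90, 90)
```

The interval is computed on the log scale, where the OR estimator is approximately normal, and then exponentiated; this is the usual approach behind the "OR, 95% CI" pairs quoted in the abstract.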
Abstract:
It has long been known that amino acids are the building blocks of proteins and govern their folding into specific three-dimensional structures. However, the details of this process are still unknown and represent one of the main problems in structural bioinformatics, a highly active research area focused on the prediction of three-dimensional structure and its relationship to protein function. The protein structure prediction procedure encompasses several different steps, from searches and analyses of sequences and structures, through sequence alignment, to the creation of the structural model. Careful evaluation and analysis ultimately result in a hypothetical structure, which can be used to study biological phenomena in, for example, research at the molecular level, biotechnology and especially drug discovery and development. In this thesis, the structures of five proteins were modeled with template-based methods, which use proteins with known structures (templates) to model related or structurally similar proteins. The resulting models were an important asset for the interpretation and explanation of biological phenomena, such as amino acids and interaction networks that are essential for the function and/or ligand specificity of the studied proteins. The five proteins represent different case studies with their own challenges, such as varying template availability, which resulted in different structure prediction processes. This thesis presents the techniques and considerations that should be taken into account in the modeling procedure to overcome limitations and produce a reliable hypothetical three-dimensional structure.
As each project shows, the reliability is highly dependent on the extensive incorporation of experimental data or known literature and, although experimental verification of in silico results is always desirable to increase reliability, the presented projects show that experimental studies can also greatly benefit from structural models. With the help of in silico studies, experiments can be targeted and precisely designed, thereby saving both money and time. As the programs used in structural bioinformatics are constantly improved and the range of templates increases through structural genomics efforts, the mutual benefits between in silico and experimental studies become even more prominent. Hence, reliable models of protein three-dimensional structures, achieved through careful planning and thoughtful execution, are, and will continue to be, valuable and indispensable sources of structural information to be combined with functional data.
Abstract:
The research was inspired by the researcher's observation of a need for business-oriented, up-to-date and realistic research on the North Korean market that would describe the market's existing and missing structures and explore possibilities to overcome the missing structures. Institutional theory was chosen as a suitable framework for describing and exploring the market structure. The research question was formulated as follows: "How can foreign companies respond to missing market structures in North Korea?". The research question was divided into three sub-questions: (1) What is the institutional environment of the North Korean market like? (2) What are the most significant missing market structures in North Korea? (3) What possibilities might foreign companies have to respond to the missing market structures? The study was conducted as qualitative research because the research question is descriptive. The data were collected by means of expert interviews and qualitative content analysis. The primary data consist of 2 expert interviews, and the secondary data consist of 95 articles collected from 40 sources. The data were analyzed using qualitative content analysis: they were coded, classified and presented as wholes with the help of a classification frame drawn up according to the theoretical framework constructed for the study. The results and conclusions can be summarized as follows. (1) The institutions of the North Korean market are shaped by a dual structure in which a formal, socialist structure and an informal, market-driven structure operate side by side. (2) Missing structures exist both in the market's context and at the market level. The gaps are partly a consequence of old structures being replaced by new ones that have not yet become institutionalized. (3) Companies can draw on the same possibilities for responding to missing market structures in North Korea that have been proposed in the context of emerging markets.
This was interpreted as weakening the perception that the North Korean market is too peculiar for business activity. (4) A growing middle class and the increasingly significant role of entrepreneurship and of women in business are driving bottom-up development in the market. These are signs of recent developments that have not received wide attention in Western media, which underlines the need for further business-oriented, up-to-date research on the North Korean market.
Abstract:
The research was sparked by an exchange in South Korea, where the author identified a gap in research providing economic, up-to-date and realistic information in English about the North Korean market. A need was identified for research that would describe the market's existing and missing market structures and explore possibilities to overcome the missing ones. Institutional theory was chosen as a suitable framework to describe and explore the market. The research question was formulated as follows: "How can foreign companies overcome institutional voids in the North Korean market?". To answer the research question, it was divided into three sub-questions: (1) What is the institutional environment in North Korea like? (2) What are the major institutional voids in the North Korean market? (3) What possibilities do foreign companies have to overcome institutional voids? The research is qualitative due to the descriptive and exploratory nature of the research question. Data collection consisted of expert interviews and content analysis, resulting in primary data of two interviews and secondary data of 95 articles from 40 different sources. The data were analyzed with the systematic technique of content analysis: they were coded, classified and presented as concepts with the help of a classification system built following the theoretical framework adapted for this study. The findings can be summarized as follows. (1) The market institutions are characterized by an overlapping dual system of formal, socialist structures and informal, market-oriented structures. (2) Institutional voids prevail at both the market's contextual level and the market level. They are partly a result of old institutions being replaced by new institutions that lack institutionalization. (3) The identified possibilities to overcome institutional voids correspond with possibilities drawn from previous research.
This weakens the image of North Korea as an impossibly unique market to operate in. (4) An emerging middle class, rapidly growing entrepreneurial activity and women's increasing role in business drive a bottom-up change in the market. This signals recent development of the market, yet it has been overlooked in the Western media. Thus there is a need for further economic, up-to-date research concerning North Korea.
Abstract:
The research on the interaction between radiation and biomolecules provides valuable information for both radiobiology and molecular physics. While radiobiology is interested in the damage inflicted on the molecule upon irradiation, molecular physics exploits these studies to obtain information about the physical properties of the molecule and the quantum mechanical processes involved in the interaction. This thesis work investigated how a small change in the structure or composition of a biomolecule changes the response of the molecule to ionizing radiation. Altogether eight different biomolecules were studied: the nucleosides uridine, 5-methyluridine and thymidine; the amino acids alanine, cysteine and serine; and the halogenated acetic acids chloro- and bromoacetic acid. The effect of ionizing radiation on these molecules was studied at the molecular level, investigating the samples in the gas phase. Synchrotron radiation in the VUV or soft X-ray range was used to ionize the sample molecules, and the subsequent fragmentation processes were investigated with ion mass spectroscopy and ion-ion-electron coincidence spectroscopy. The comparison between the three nucleosides revealed that adding or removing a single functional group can affect not only the bonds from which the molecule ruptures upon ionization but also the charge localization in the formed fragments. Studies on the amino acids and halogenated acetic acids indicated that one simple substitution in the molecule can dramatically change the extent of fragmentation. This thesis work also demonstrates that in order to steer the radiation-induced fragmentation of the molecules, it is not always necessary to alter the amount of energy deposited on the molecules; selecting a suitable substitution may suffice.
Abstract:
This case study examines the impact of a computer information system as it was being implemented in one Ontario hospital. The attitudes of a cross-section of the hospital staff acted as a barometer to measure their perceptions of the implementation process. With The Mississauga Hospital in the early stages of an extensive computer implementation project, the opportunity existed to identify staff attitudes toward the computer system and their overall knowledge of it, and to compare the findings with the literature. The goal of the study was to develop a broader understanding of the affective domain in the relationship between people and the computer system. Eight exploratory questions shaped the focus of the investigation. Data were collected from three sources: a survey questionnaire, focused interviews, and internal hospital documents. Both quantitative and qualitative data were analyzed. Instrumentation in the study consisted of a survey distributed at two points in time to randomly selected hospital employees who represented all staff levels. Other sources of data included hospital documents and twenty-five focused interviews with staff who replied to both surveys. Leavitt's socio-technical system, with its four subsystems (task, structure, technology, and people), was used to classify staff responses to the research questions. The study findings revealed that the majority of respondents felt positive about using the computer as part of their jobs. No apparent correlations were found between sex, age, or staff group and feelings about using the computer. Differences in attitudes, and attitude changes, were found in potential relationship to the element of time. Another difference was found between staff groups in the perception of being involved in the decision-making process. These findings, and other evidence about the role of change agents in this change process, help to emphasize that planning change is one thing; managing the transition is another.
Abstract:
The nucleotide sequence of a genomic DNA fragment previously thought, by genetic criteria, to contain the dihydrofolate reductase gene (DFR1) of Saccharomyces cerevisiae was determined. This DNA fragment of 1,784 base pairs contains a large open reading frame from position 800 to 1432, which encodes an enzyme with a predicted molecular weight of 24,229.8 daltons. Analysis of the amino acid sequence of this protein revealed that the yeast polypeptide contains 211 amino acids, compared to the 186 residues commonly found in the polypeptides of other eukaryotes. The difference in size of the gene product can be attributed mainly to an insert in the yeast gene. Within this region, several consensus sequences required for processing of yeast nuclear and class II mitochondrial introns were identified, but these appear not to be sufficient for RNA splicing. The primary structure of the yeast DHFR protein has considerable sequence homology with analogous polypeptides from other organisms, especially in the consensus residues involved in cofactor and/or inhibitor binding. Analysis of the nucleotide sequence also revealed the presence of a number of canonical sequences identified in yeast as having some function in the regulation of gene expression. These include UAS elements (TGACTC) required for the amino acid general control response, and "TATA" boxes, as well as several consensus sequences thought to be required for transcriptional termination and polyadenylation. Analysis of the codon usage of the yeast DFR1 coding region revealed a codon bias index of 0.0083; this value, very close to zero, suggests that the gene is expressed at a relatively low level under normal physiological conditions. The information concerning the organization of DFR1 was used to construct a variety of fusions of its 5' regulatory region with the coding region of the lacZ gene of E. coli.
Some of these fused genes encoded a fusion product that was expressed in E. coli and/or in yeast under the control of the 5' regulatory elements of DFR1. Further studies with these fusion constructions revealed that the beta-galactosidase activity encoded on multicopy plasmids was stimulated transiently by prior exposure of yeast host cells to UV light. This suggests that the yeast DFR1 gene is induced by UV light and may imply a novel function of the DHFR protein in the cellular responses to DNA damage. Another novel feature of yeast DHFR was revealed during preliminary studies of a diploid strain containing a heterozygous DFR1 null allele. The strain was constructed by insertion of a URA3 gene within the coding region of DFR1. Sporulation of this diploid revealed that meiotic products segregated 2:0 for uracil prototrophy when spore clones were germinated on medium supplemented with 5-formyltetrahydrofolate (folinic acid). This finding suggests that, in addition to its catalytic activity, the DFR1 gene product may play some role in the anabolism of folinic acid. Alternatively, this result may indicate that Ura+ haploid segregants were inviable, suggesting that the enzyme has an essential cellular function in this species.
Abstract:
The crystal structure of Cu(PM)2(NO3)2·H2O (where PM is pyridoxamine, C8H12N2O2) has been determined from three-dimensional X-ray diffraction data. The crystals are triclinic, space group P1̄, a = 14.248 (2), b = 8.568 (1), c = 9.319 (1) Å, α = 94.08 (1), β = 89.73 (1), γ = 99.18 (1)°, Z = 2, μ(MoKα) = 10.90 cm⁻¹, ρo = 1.61 g/cm³ and ρc = 1.61 g/cm³. The structure was solved by Patterson techniques from data collected on a Picker 4-circle diffractometer to 2θmax = 45°. All atoms, including hydrogens, have been located. Anisotropic thermal parameters have been refined for all non-hydrogen atoms. For the 2390 independent reflections with F ≥ 3σ(F), R = 0.0408. The results presented here provide the first detailed structural information on a metal complex with PM itself. The copper atoms are located on centres of symmetry and each is chelated by two PM zwitterions through the amino groups and phenolate oxygen atoms. The zwitterionic form found in this structure involves the loss of a proton from the phenolate group and protonation of the pyridine ring nitrogen atoms. The two independent Cu(PM)2 moieties are symmetrically bridged by a single oxygen atom from one of the nitrate groups. The second nitrate group is not coordinated to the copper atoms but is central to an extensive hydrogen-bonding network involving the water molecule and uncoordinated functional groups of PM.
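As a consistency check on the cell parameters reported above, the unit-cell volume of a triclinic lattice follows from the standard formula V = abc·sqrt(1 − cos²α − cos²β − cos²γ + 2·cosα·cosβ·cosγ). A short sketch using the reported a, b, c and angles:

```python
import math

def triclinic_volume(a, b, c, alpha, beta, gamma):
    """Unit-cell volume for a triclinic lattice; lengths in angstroms,
    angles in degrees. Returns volume in cubic angstroms."""
    ca, cb, cg = (math.cos(math.radians(t)) for t in (alpha, beta, gamma))
    return a * b * c * math.sqrt(1 - ca*ca - cb*cb - cg*cg + 2*ca*cb*cg)

# Cell parameters as reported in the abstract above:
V = triclinic_volume(14.248, 8.568, 9.319, 94.08, 89.73, 99.18)
```

The result is roughly 1120 Å³; together with Z = 2 and the formula mass of Cu(PM)2(NO3)2·H2O, this is consistent with the reported calculated density of 1.61 g/cm³.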
Abstract:
Experimental Extended X-ray Absorption Fine Structure (EXAFS) spectra carry information about the chemical structure of metal protein complexes. However, predicting the structure of such complexes from EXAFS spectra is not a simple task. Currently, methods such as Monte Carlo optimization or simulated annealing are used in structure refinement of EXAFS. These methods have proven somewhat successful in structure refinement but have not been successful in finding the global minima. Multiple population-based algorithms, including a genetic algorithm, a restarting genetic algorithm, differential evolution, and particle swarm optimization, are studied for their effectiveness in structure refinement of EXAFS. The oxygen-evolving complex in S1 is used as a benchmark for comparing the algorithms. These algorithms were successful in finding new atomic structures that produced improved calculated EXAFS spectra over atomic structures previously found.
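Differential evolution, one of the population-based algorithms compared above, can be sketched as follows. The misfit function here is a toy stand-in for the real EXAFS spectrum mismatch, and the population size, mutation factor F, crossover rate CR, and bounds are all illustrative, not the study's settings.

```python
import random

random.seed(0)  # reproducible demo run

def differential_evolution(fitness, bounds, pop_size=20, F=0.8, CR=0.9, gens=200):
    """Minimal DE/rand/1/bin: mutate with scaled difference vectors,
    cross over binomially, and keep a trial only if it improves fitness."""
    dim = len(bounds)
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    scores = [fitness(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = random.sample([j for j in range(pop_size) if j != i], 3)
            jrand = random.randrange(dim)              # guarantee one mutated gene
            trial = []
            for d in range(dim):
                if random.random() < CR or d == jrand:
                    v = pop[a][d] + F * (pop[b][d] - pop[c][d])
                    v = min(max(v, bounds[d][0]), bounds[d][1])  # clamp to bounds
                else:
                    v = pop[i][d]
                trial.append(v)
            s = fitness(trial)
            if s < scores[i]:                          # greedy selection
                pop[i], scores[i] = trial, s
    best = min(range(pop_size), key=scores.__getitem__)
    return pop[best], scores[best]

# Toy stand-in for the EXAFS misfit: squared distance to a known optimum.
target = [1.0, -2.0, 0.5]
misfit = lambda x: sum((xi - ti) ** 2 for xi, ti in zip(x, target))
best, score = differential_evolution(misfit, [(-5.0, 5.0)] * 3)
```

In a real refinement, the candidate vector would encode atomic coordinates, and the fitness would compare a calculated EXAFS spectrum against the experimental one.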
Abstract:
The model studies information sharing and the stability of cooperation in cost reducing Research Joint Ventures (RJVs). In a four-stage game-theoretic framework, firms decide on participation in a RJV, information sharing, R&D expenditures, and output. An important feature of the model is that voluntary information sharing between cooperating firms increases information leakage from the RJV to outsiders. It is found that it is the spillover from the RJV to outsiders which determines the decision of insiders whether to share information, while it is the spillover affecting all firms which determines the level of information sharing within the RJV. RJVs representing a larger portion of firms in the industry are more likely to share information. It is also found that when sharing information is costless, firms never choose intermediate levels of information sharing: they share all the information or none at all. The size of the RJV is found to depend on three effects: a coordination effect, an information sharing effect, and a competition effect. Depending on the relative magnitudes of these effects, the size of the RJV may increase or decrease with spillovers. The effect of information sharing on the profitability of firms as well as on welfare is studied.
Abstract:
This paper addresses the question of whether R&D should be carried out by an independent research unit or be produced in-house by the firm marketing the innovation. We define two organizational structures. In an integrated structure, the firm that markets the innovation also carries out and finances research leading to the innovation. In an independent structure, the firm that markets the innovation buys it from an independent research unit which is financed externally. We compare the two structures under the assumption that the research unit has some private information about the real cost of developing the new product. When development costs are negatively correlated with revenues from the innovation, the integrated structure dominates. The independent structure dominates in the opposite case.
Abstract:
In this paper we propose exact likelihood-based mean-variance efficiency tests of the market portfolio in the context of the Capital Asset Pricing Model (CAPM), allowing for a wide class of error distributions which include normality as a special case. These tests are developed in the framework of multivariate linear regressions (MLR). It is well known, however, that despite their simple statistical structure, standard asymptotically justified MLR-based tests are unreliable. In financial econometrics, exact tests have been proposed for a few specific hypotheses [Jobson and Korkie (Journal of Financial Economics, 1982), MacKinlay (Journal of Financial Economics, 1987), Gibbons, Ross and Shanken (Econometrica, 1989), Zhou (Journal of Finance, 1993)], most of which depend on normality. For the Gaussian model, our tests correspond to Gibbons, Ross and Shanken's mean-variance efficiency tests. In non-Gaussian contexts, we reconsider mean-variance efficiency tests allowing for multivariate Student-t and Gaussian mixture errors. Our framework allows us to cast more evidence on whether the normality assumption is too restrictive when testing the CAPM. We also propose exact multivariate diagnostic checks (including tests for multivariate GARCH and a multivariate generalization of the well-known variance ratio tests) and goodness-of-fit tests, as well as a set estimate for the intervening nuisance parameters. Our results [over five-year subperiods] show the following: (i) multivariate normality is rejected in most subperiods, (ii) residual checks reveal no significant departures from the multivariate i.i.d. assumption, and (iii) mean-variance efficiency of the market portfolio is not rejected as frequently once the possibility of non-normal errors is allowed for.
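The Gaussian benchmark mentioned above is the Gibbons, Ross and Shanken (GRS) statistic. A minimal single-factor sketch follows; the simulated returns, betas, and noise levels are illustrative only, not the paper's data, and the implementation is a textbook version rather than the authors' code.

```python
import numpy as np

def grs_test(excess_returns, market_excess):
    """Gibbons-Ross-Shanken F statistic for mean-variance efficiency of
    the market portfolio (single-factor case). Under normality and zero
    pricing errors (alphas), the statistic is distributed F(N, T - N - 1)."""
    T, N = excess_returns.shape
    X = np.column_stack([np.ones(T), market_excess])   # constant + market factor
    B, *_ = np.linalg.lstsq(X, excess_returns, rcond=None)
    alpha = B[0]                                       # intercepts = pricing errors
    resid = excess_returns - X @ B
    Sigma = resid.T @ resid / (T - 2)                  # residual covariance
    mu, s2 = market_excess.mean(), market_excess.var(ddof=1)
    return (T - N - 1) / N * (alpha @ np.linalg.solve(Sigma, alpha)) / (1 + mu**2 / s2)

# Simulated excess returns where the CAPM holds exactly (all alphas zero):
rng = np.random.default_rng(0)
T, N = 500, 3
mkt = rng.normal(0.05, 1.0, T)
rets = np.outer(mkt, [0.8, 1.0, 1.2]) + rng.normal(0.0, 0.5, (T, N))
stat = grs_test(rets, mkt)
```

Because the simulated alphas are zero, the statistic should be an unremarkable draw from the F(N, T − N − 1) null distribution rather than a large rejection.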
Abstract:
This article reviews the origins of the Documentation, Information and Research Branch (the 'Documentation Center') of Canada's Immigration and Refugee Board (IRB), established in 1988 as part of a major revision of the procedure for determination of refugee status. The Documentation Center conducts research to produce documents describing conditions in refugee-producing countries, and also disseminates information from outside sources. The information is available to decision-makers, IRB staff, counsel and claimants. Given the importance of decisions on refugee status, the article looks at the credibility and authoritativeness of the information by analyzing the structure of the information used. It recalls the different types of information 'packages' produced, such as country profiles, the Question and Answer Series, the Weekly Media Review, the 'Perspectives' series, Responses to Information Requests and Country files, and considers the trend towards standardization across the country. The research process is reviewed, as are the hiring criteria for researchers, the composition of the 'collection', how acquisitions are made, and the development of databases, particularly on countries of origin (human rights material) and legal information, which are accessible on-line. The author examines how documentary information can be used by decision-makers to draw conclusions as to whether the claim has a credible basis or the claimant has a well-founded fear of persecution. Relevant case law is available to assess and weigh the claim. The experience of Amnesty International in similar work is cited for comparative purposes. A number of 'safeguards' are mentioned which contribute to the goal of impartiality in research or which otherwise enhance the credibility of the information, and the author suggests that guidelines might be drafted to explain and assist in the realization of these aims.
Greater resources might also enable the Center to undertake the task of 'certifying' the authoritativeness of sources. The author concludes that, as a new institution in Canadian administrative law, the Documentation Center opens interesting avenues for the future. Because it ensures an acceptable degree of impartiality in its research and the documents it produces, it may be a useful model for other tribunals adjudicating in fields where evidence is either difficult to gather or is otherwise complex.
Abstract:
This thesis surveys the principal halftoning methods, from analog screening to direct binary search by way of ordered dithering, with particular attention to error diffusion. These methods are compared from the modern perspective of structure awareness. A new error-diffusion halftoning method is presented and subjected to various evaluations. The proposed method aims to be original, simple, as capable of preserving the structural character of images as the state-of-the-art method, and faster than the latter by two to three orders of magnitude. First, the image is decomposed into characteristic local frequencies. Then, the base behaviour of the proposed method is given. Next, a carefully chosen set of parameters makes it possible to modify this behaviour so as to match the different local frequency characters. Finally, a calibration determines the right parameters to associate with each possible frequency. Once the algorithm is assembled, any image can be processed very quickly: each pixel is assigned its own frequency, this frequency serves as an index into the calibration table, the appropriate diffusion parameters are retrieved, and the output colour determined for the pixel contributes, in expectation, to emphasizing the structure to which it belongs.
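The error-diffusion mechanism that the thesis builds on can be sketched with the classic Floyd-Steinberg algorithm; this is the standard baseline, not the structure-aware method proposed in the thesis, and the 8x8 constant-gray test image is purely illustrative.

```python
def floyd_steinberg(img):
    """Classic error-diffusion halftoning: threshold each pixel to 0 or 1
    and push the quantization error onto the unprocessed neighbours using
    the Floyd-Steinberg weights 7/16, 3/16, 5/16, 1/16."""
    h, w = len(img), len(img[0])
    img = [row[:] for row in img]                 # grayscale values in [0, 1]
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            old = img[y][x]
            new = 1 if old >= 0.5 else 0          # quantize
            out[y][x] = new
            err = old - new
            for dx, dy, wgt in ((1, 0, 7/16), (-1, 1, 3/16),
                                (0, 1, 5/16), (1, 1, 1/16)):
                nx, ny = x + dx, y + dy
                if 0 <= nx < w and 0 <= ny < h:
                    img[ny][nx] += err * wgt      # diffuse the error forward
    return out

halftone = floyd_steinberg([[0.5] * 8 for _ in range(8)])
```

Because the error is carried forward rather than discarded, the binary output preserves the average tone of the input; per-pixel variation of the diffusion weights is what methods like the one above exploit to preserve structure.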