11 results for Genetic group model
in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland
Abstract:
The size and complexity of software development projects are growing very fast. At the same time, the proportion of successful projects is still quite low according to previous research. Although almost every project team knows the main areas of responsibility that would help to finish a project on time and on budget, this knowledge is rarely used in practice. It is therefore important to evaluate the success of existing software development projects and to suggest a method for evaluating success chances that can be used in software development projects. The main aim of this study is to evaluate the success of projects in the selected geographical region (Russia-Ukraine-Belarus). The second aim is to compare existing models of success prediction and to determine their strengths and weaknesses. The research was done as an empirical study. A survey with structured forms and theme-based interviews were used as the data collection methods. The information gathering was done in two stages. At the first stage, a project manager or someone with similar responsibilities answered the questions over the Internet. At the second stage, the participant was interviewed; his or her answers were discussed and refined. This made it possible to get accurate information about each project and to avoid errors. It was found that there are many problems in software development projects. These problems are widely known and have been discussed in the literature many times. The research showed that most of the projects have problems with schedule, requirements, architecture, quality, and budget. A comparison of two models of success prediction showed that The Standish Group model overestimates problems in a project. At the same time, McConnell's model can help to identify problems in time and avoid trouble in the future. A framework for evaluating success chances in distributed projects is suggested. The framework is similar to The Standish Group model but is customized for distributed projects.
Abstract:
This study examines how firms interpret new, potentially disruptive technologies in their own strategic context. The work presents a cross-case analysis of four potentially disruptive technologies or technical operating models: Bluetooth, WLAN, Grid computing and the Mobile Peer-to-Peer paradigm. The technologies were investigated from the perspective of three mobile operators, a device manufacturer and a software company in the ICT industry. The theoretical background for the study consists of the resource-based view of the firm with a dynamic perspective, theories on the nature of technology and innovations, and the concept of the business model. The literature review builds up a propositional framework for estimating the amount of radical change in the companies' business model with two middle variables: the disruptiveness potential of a new technology, and the strategic importance of a new technology to a firm. The data were gathered in group discussion sessions in each company. The results of each case analysis were brought together to evaluate how firms interpret the potential disruptiveness in terms of changes in product characteristics and added value, technology and market uncertainty, changes in product-market positions, possible competence disruption and changes in value network positions. The results indicate that perceived disruptiveness in terms of product characteristics does not necessarily translate into strategic importance. In addition, firms did not see the new technologies as a threat in terms of potential competence disruption.
Abstract:
The evaluation of investments in advanced technology is one of the most important decision-making tasks. The importance is even more pronounced considering the huge budgets involved in the strategic, economic and analytic justification needed to shorten design and development time. Choosing the most appropriate technology requires an accurate and reliable system that can guide the decision makers through such a complicated task. Currently, several Information and Communication Technologies (ICT) manufacturers that design global products are seeking local firms to act as their sales and services representatives (called distributors) to the end user. At the same time, the end user or customer is also searching for the best possible deal for their investment in ICT projects. Therefore, the objective of this research is to present a holistic decision support system to assist the decision maker in Small and Medium Enterprises (SMEs) - working either as individual decision makers or in a group - in the evaluation of the investment to become an ICT distributor or an ICT end user. The model is composed of the Delphi/MAH (Maximising Agreement Heuristic) Analysis, a well-known quantitative method in Group Support Systems (GSS), which is applied to gather the average ranking data from amongst Decision Makers (DMs). After that, the Analytic Network Process (ANP) is brought in to analyse the problem holistically: it performs quantitative and qualitative analysis simultaneously. The illustrative data are obtained from industrial entrepreneurs by using the Group Support System (GSS) laboratory facilities at Lappeenranta University of Technology, Finland and in Thailand. The result of the research, which is currently implemented in Thailand, can provide benefits to the industry in the evaluation of becoming an ICT distributor or an ICT end user, particularly in the assessment of an Enterprise Resource Planning (ERP) programme.
After the model is put to the test in in-depth collaboration with industrial entrepreneurs in Finland and Thailand, a sensitivity analysis is also performed to validate the robustness of the model. The contribution of this research lies in developing a new approach and the Delphi/MAH software to obtain an analysis of the value of becoming an ERP distributor or end user that is flexible and applicable to entrepreneurs who are looking for the most appropriate investment to become an ERP distributor or end user. The main advantage of this research over others is that the model can deliver the value of becoming an ERP distributor or end user as a single number, which makes it easier for DMs to choose the most appropriate ERP vendor. An associated advantage is that the model can include qualitative data as well as quantitative data, as results from using quantitative data alone can be misleading and inadequate. There is a need to use quantitative and qualitative analysis together, as can be seen from the case studies.
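The Delphi/MAH step described above gathers rankings from several decision makers and aggregates them into an average ranking before the ANP analysis. As a simplified illustration of that aggregation step only (the vendor names and rank values below are hypothetical, not from the study):

```python
# Hypothetical rankings of three ERP vendors by three decision makers (1 = best).
rankings = {
    "DM1": {"Vendor X": 1, "Vendor Y": 2, "Vendor Z": 3},
    "DM2": {"Vendor X": 2, "Vendor Y": 1, "Vendor Z": 3},
    "DM3": {"Vendor X": 1, "Vendor Y": 3, "Vendor Z": 2},
}

def average_ranking(rankings):
    """Aggregate individual rankings into one average ranking (lowest = best)."""
    vendors = next(iter(rankings.values())).keys()
    avg = {v: sum(r[v] for r in rankings.values()) / len(rankings) for v in vendors}
    return sorted(avg.items(), key=lambda kv: kv[1])

print(average_ranking(rankings))
```

The actual MAH goes further, iterating toward the ranking that maximises agreement among the DMs; this sketch shows only the average-ranking aggregation the abstract mentions.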
Abstract:
The aim of the thesis was to identify the special characteristics of the ASP (application service provision) model and to define preliminary target segments for enterprise resource planning, financial management and human resources management systems delivered as ASP solutions. The work is in two parts. The theoretical part of the study focuses on segmentation theories, on the motives for and the advantages and drawbacks of outsourcing information systems, and on the ASP model as a whole. For the empirical part, a qualitative study was carried out, consisting of 20 interviews in 18 different organizations. The interviewees were mainly IT managers of customer companies. The preliminary target segments were formed on the basis of the special characteristics of the ASP model presented in the theoretical part and the information obtained from the interview study. ASP is a new model for acquiring information systems; at the time of writing, the concept was only about three years old. It is a model for which research institutes have made promising growth forecasts in the IT market, although market growth has not matched these forecasts. An information system acquired as an ASP solution is a comprehensive outsourcing model: everything related to the system or application acquired as an ASP solution is outsourced to the service provider. The study found that there is interest in the model, and its advantages are seen as greater than its drawbacks. By adopting an ASP solution, the customer company is able to focus its resources on its core business.
Abstract:
Phlorotannins are the least studied group of tannins and are found only in brown algae. Hitherto the roles of phlorotannins, e.g. in plant-herbivore interactions, have been studied by quantifying the total contents of the soluble phlorotannins with a variety of methods. Little attention has been given to either quantitative variation in cell-wall-bound and exuded phlorotannins or to qualitative variation in individual compounds. A quantification procedure was developed to measure the amount of cell-wall-bound phlorotannins. The quantification of soluble phlorotannins was adjusted for both large- and small-scale samples and used to estimate the amounts of exuded phlorotannins using bladder wrack (Fucus vesiculosus) as a model species. In addition, separation of individual soluble phlorotannins to produce a phlorotannin profile from the phenolic crude extract was achieved by high-performance liquid chromatography (HPLC). Along with these methodological studies, attention was focused on the factors in the procedure which generated variation in the yield of phlorotannins. The objective was to enhance the efficiency of the sample preparation procedure. To resolve the problem of rapid oxidation of phlorotannins in HPLC analyses, ascorbic acid was added to the extractant. The widely used colourimetric method was found to produce a variation in the yield that was dependent upon the pH and concentration of the sample. Using these developed, adjusted and modified methods, the phenotypic plasticity of phlorotannins was studied with respect to nutrient availability and herbivory. An increase in nutrients decreased the total amount of soluble phlorotannins but did not affect the cell-wall-bound phlorotannins, the exudation of phlorotannins or the phlorotannin profile achieved with HPLC. 
The presence of the snail Theodoxus fluviatilis on the thallus induced production of soluble phlorotannins, and grazing by the herbivorous isopod Idotea baltica increased the exudation of phlorotannins. To study whether the among-population variations in phlorotannin contents arise from genetic divergence, from a plastic response of the algae, or both, algae from separate populations were reared in a common garden. Genetic variation among local populations was found in both the phlorotannin profile and the content of total phlorotannins. Phlorotannins were also genetically variable within populations. This suggests that local algal populations have diverged in their contents of phlorotannins, and that they may respond to natural selection and evolve both quantitatively and qualitatively.
Abstract:
The objective of the thesis was to develop a competitors' financial performance monitoring model for management reporting. The research consisted of the selection of the comparison group and the performance meters as well as the actual creation of the model. A brief analysis of the current situation was also made. The aim of the results was to improve the financial reporting quality in the case organization by adding external business environment observation to the management reports. The comparison group for the case company was selected to include five companies that were all involved in power equipment engineering and project-type business. The most limiting factor related to the comparison group selection was the availability of quarterly financial reporting. The most suitable performance meters were defined to be the development of revenue, order backlog and EBITDA. These meters should be monitored systematically on a quarterly basis and reported to the company management in a brief and informative way. The monitoring model was based on a spreadsheet construction, with usability, flexibility and simplicity as key characteristics. The model acts as a centralized storage for financial competitor information as well as a reporting tool. The current market situation is strongly affected by the economic boom of recent years, and future challenges can clearly be seen in declining order backlogs. The case company has succeeded well relative to its comparison group during the observation period, since its business volume and profitability have developed in the best way.
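The quarterly monitoring described above (tracking the development of revenue, order backlog and EBITDA across a comparison group) can be sketched in spreadsheet-like form; the company names and figures below are hypothetical placeholders, not data from the study:

```python
# Hypothetical quarterly figures (e.g. EUR millions) for a comparison group.
figures = {
    "Company A": {"revenue": [120, 125, 130, 128], "ebitda": [12, 13, 14, 12]},
    "Company B": {"revenue": [80, 82, 79, 85], "ebitda": [8, 9, 7, 10]},
}

def qoq_development(series):
    """Quarter-over-quarter change of a performance meter, in percent."""
    return [round(100 * (b - a) / a, 1) for a, b in zip(series, series[1:])]

# Report each meter's development per company, one line per meter.
for company, meters in figures.items():
    for meter, values in meters.items():
        print(f"{company} {meter}: {qoq_development(values)} %")
```

A real model of this kind would also cover the order backlog meter and read the figures from the spreadsheet rather than from literals; the point here is only the quarter-over-quarter comparison logic.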
Abstract:
Biology is turning into an information science. The science of systems biology seeks to understand the genetic networks that govern organism development and functions. In this study the chicken was used as a model organism in the study of B cell regulatory factors. These studies open new avenues for plasma cell research by connecting the downregulation of the B cell gene expression program directly to the initiation of plasma cell differentiation. The unique advantages of the DT40 avian B cell model system, specifically its high homologous recombination rate, were utilized to study gene regulation in Pax5 knockout cell lines and to gain new insights into the B cell to plasma cell transitions that underlie the secretion of antibodies as part of the adaptive immune response. The Pax5 transcription factor is central to the commitment, development and maintenance of the B cell phenotype. Mice lacking the Pax5 gene have an arrest in development at the pro-B lymphocyte stage, while DT40 cells have been derived from cells at a more mature stage of development. The DT40 Pax5-/- cells exhibited gene expression similarities with primary chicken plasma cells. The expression of the plasma cell transcription factors Blimp-1 and XBP-1 was significantly upregulated, while the expression of the germinal centre factor BCL6 was diminished in Pax5-/- cells; this alteration was normalized by Pax5 re-introduction. The Pax5-deficient cells further manifested substantially elevated secretion of IgM into the supernatant, another characteristic of plasma cells. These results indicated for the first time that downregulation of the Pax5 gene in B cells promotes plasma cell differentiation. Cross-species meta-analysis of chicken and mouse Pax5 gene knockout studies uncovers genes and pathways whose regulatory relationship to Pax5 has remained unchanged for over 300 million years.
Restriction of the hematopoietic stem cell fate to the T, B and NK cell lineages is dependent on Ikaros and its molecular partners, the closely related Helios and Aiolos. Ikaros family members are zinc finger proteins that act as transcriptional repressors while helping to activate lymphoid genes. Helios in mice is expressed from the hematopoietic stem cell level onwards, although later in development its expression seems to predominate in the T cell lineage. This study establishes the emergence and sequence of the chicken Ikaros family members. Helios expression in the bursa of Fabricius, germinal centres and B cell lines suggested a role for Helios in the avian B cell lineage, too. Phylogenetic studies of the Ikaros family connect the expansion of the family, and thus possibly the emergence of the adaptive immune system, with the second round of genome duplications originally proposed by Ohno. Paralogs that have arisen as a result of genome-wide duplications are sometimes termed ohnologs; Ikaros family proteins appear to fit that definition. This study highlighted the opportunities afforded by the genome sequencing efforts and somatic cell reverse genetics approaches using the DT40 cell line. The DT40 cell line and the avian model system promise to remain a fruitful model for mechanistic insight in the post-genomic era as well.
Abstract:
Previous studies on pencil grip have typically dealt with the developmental aspects in young children, while handwriting research is mainly concerned with speed and legibility. Studies linking these areas are few. Evaluation of the existing pencil grip studies is hampered by methodological inconsistencies. The operational definitions of pencil grip are rational but tend to be oversimplified, while detailed descriptors tend to be impractical due to their multiplicity. The present study introduces a descriptive two-dimensional model for the categorisation of pencil grip suitable for research applications in a classroom setting. The model is used in four empirical studies of children during the first six years of writing instruction. Study 1 describes the pencil grips observed in a large group of pupils in Finland (n = 504). The results indicate that in Finland the majority of grips resemble the traditional dynamic tripod grip. Significant gender-related differences in pencil grip were observed. Study 2 is a longitudinal exploration of grip stability vs. change (n = 117). Both expected and unexpected changes were observed in about 25 per cent of the children's grips over four years. A new finding emerged using the present model for categorisation: whereas pencil grips would change, either in terms of ease of grip manipulation or grip configuration, no instances were found where a grip would have changed concurrently on both dimensions. Study 3 is a cross-cultural comparison of grips observed in Finland and the USA (n = 793). The distribution of the pencil grips observed in the American pupils was significantly different from those found in Finland. The cross-cultural disparity is most likely related to the differences in the onset of writing instruction.
The differences between the boys' and girls' grips in the American group were non-significant. An implication of Studies 2 and 3 is that the initial pencil grip is of foremost importance, since pencil grips are largely stable over time. Study 4 connects the pencil grips to assessment of the mechanics of writing (n = 61). It seems that certain previously not recommended pencil grips might nevertheless be included among those accepted, since they did not appear to hamper either fluency or legibility.
Abstract:
In this thesis, I conduct a series of molecular systematic studies on the large phytophagous moth superfamily Noctuoidea (Insecta, Lepidoptera) to clarify deep divergences and evolutionary affinities of the group, based on material from every zoogeographic region of the globe. Noctuoidea is the most speciose radiation of butterflies and moths on earth, comprising about a quarter of all lepidopteran diversity. The general aim of these studies was to apply suitably conservative genetic markers (DNA sequences of mitochondrial (mtDNA) and nuclear (nDNA) gene regions) to reconstruct, as the initial step, a robust skeleton phylogenetic hypothesis for the superfamily, then to build up robust phylogenetic frameworks for the circumscribed monophyletic entities (i.e., families), as well as clarifying the internal classification of monophyletic lineages (subfamilies and tribes), in order to develop an understanding of the major lineages at various taxonomic levels within the superfamily Noctuoidea and their inter-relationships. The approaches applied included: i) stabilizing a robust family-level classification for the superfamily; ii) resolving the phylogeny of the most speciose radiation of Noctuoidea, the family Erebidae; iii) reconstructing the ancestral feeding behaviors and evolution of the vampire moths (Erebidae, Calpinae); iv) elucidating the evolutionary relationships within the family Nolidae; and v) clarifying the basal lineages of Noctuidae sensu stricto. Thus, in this thesis I present a well-resolved molecular phylogenetic hypothesis for the higher taxa of Noctuoidea consisting of six strongly supported families: Oenosandridae, Notodontidae, Euteliidae, Erebidae, Nolidae, and Noctuidae.
The studies in my thesis highlight the importance of molecular data in systematic and phylogenetic studies, in particular DNA sequences of nuclear genes, and of an extensive sampling strategy that includes representatives of all known major lineages of the entire world fauna of Noctuoidea from every biogeographic region. This is crucial, especially when the model organism is as species-rich, highly diverse, cosmopolitan and heterogeneous as the Noctuoidea, traits that represent obstacles to the use of morphology at this taxonomic level.
Abstract:
This thesis presents an approach for formulating and validating a space averaged drag model for coarse mesh simulations of gas-solid flows in fluidized beds using the two-fluid model. Proper modeling of the fluid dynamics is central to understanding any industrial multiphase flow. The gas-solid flows in fluidized beds are heterogeneous and usually simulated with an Eulerian description of the phases. Such a description requires the use of fine meshes and small time steps for proper prediction of the hydrodynamics. This constraint on mesh and time step size results in a large number of control volumes and long computational times, which are unaffordable for simulations of large-scale fluidized beds. If proper closure models are not included, coarse mesh simulations of fluidized beds do not give reasonable results: the coarse mesh simulation fails to resolve the mesoscale structures and produces uniform solids concentration profiles. For a circulating fluidized bed riser, such predicted profiles result in a higher drag force between the gas and solid phases and an overestimated solids mass flux at the outlet. Thus, there is a need to formulate closure correlations that can accurately predict the hydrodynamics using coarse meshes. This thesis uses the space averaging modeling approach in the formulation of closure models for coarse mesh simulations of the gas-solid flow in fluidized beds with Geldart group B particles. In the analysis for formulating the closure correlation for the space averaged drag model, the main modeling parameters were found to be the averaging size, the solid volume fraction, and the distance from the wall. The closure model for the gas-solid drag force was formulated and validated for coarse mesh simulations of the riser, which verified this modeling approach. Coarse mesh simulations using the corrected drag model resulted in lower values of solids mass flux.
Such an approach is a promising tool in the formulation of appropriate closure models which can be used in coarse mesh simulations of large scale fluidized beds.
Abstract:
With the shift towards many-core computer architectures, dataflow programming has been proposed as one potential solution for producing software that scales to a varying number of processor cores. Programming for parallel architectures is considered difficult, as the current popular programming languages are inherently sequential and introducing parallelism is typically up to the programmer. Dataflow, however, is inherently parallel, describing an application as a directed graph, where nodes represent calculations and edges represent data dependencies in the form of queues. These queues are the only allowed communication between the nodes, making the dependencies between the nodes explicit and thereby also the parallelism. Once a node has sufficient inputs available, it can, independently of any other node, perform calculations, consume inputs, and produce outputs. Dataflow models have existed for several decades and have become popular for describing signal processing applications, as the graph representation is a very natural representation within this field. Digital filters are typically described with boxes and arrows, also in textbooks. Dataflow is also becoming more interesting in other domains, and in principle, any application working on an information stream fits the dataflow paradigm. Such applications are, among others, network protocols, cryptography, and multimedia applications. As an example, the MPEG group standardized a dataflow language called RVC-CAL to be used within reconfigurable video coding. Describing a video coder as a dataflow network instead of with conventional programming languages makes the coder more readable, as it describes how the video data flows through the different coding tools. While dataflow provides an intuitive representation for many applications, it also introduces some new problems that need to be solved in order for dataflow to be more widely used.
The explicit parallelism of a dataflow program is descriptive and enables an improved utilization of the available processing units; however, the independent nodes also imply that some kind of scheduling is required. The need for efficient scheduling becomes even more evident when the number of nodes is larger than the number of processing units and several nodes are running concurrently on one processor core. There exist several dataflow models of computation, with different trade-offs between expressiveness and analyzability. These vary from rather restricted but statically schedulable models, with minimal scheduling overhead, to dynamic models where each firing requires a firing rule to be evaluated. The model used in this work, namely RVC-CAL, is a very expressive language, and in the general case it requires dynamic scheduling; however, the strong encapsulation of dataflow nodes enables analysis, and the scheduling overhead can be reduced by using quasi-static, or piecewise static, scheduling techniques. The scheduling problem is concerned with finding the few scheduling decisions that must be made at run-time, while most decisions are pre-calculated. The result is then a set of static schedules, as small as possible, that are dynamically scheduled. To identify these dynamic decisions and to find the concrete schedules, this thesis shows how quasi-static scheduling can be represented as a model checking problem. This involves identifying the relevant information to generate a minimal but complete model to be used for model checking. The model must describe everything that may affect the scheduling of the application while omitting everything else in order to avoid state space explosion. This kind of simplification is necessary to make the state space analysis feasible. For the model checker to find the actual schedules, a set of scheduling strategies is defined which are able to produce quasi-static schedulers for a wide range of applications.
The results of this work show that actor composition with quasi-static scheduling can be used to transform dataflow programs to fit many different computer architectures with different types and numbers of cores. This, in turn, enables dataflow to provide a more platform-independent representation, as one application can be fitted to a specific processor architecture without changing the actual program representation. Instead, the program representation is optimized, in the context of design space exploration, by the development tools to fit the target platform. This work focuses on representing the dataflow scheduling problem as a model checking problem and is implemented as part of a compiler infrastructure. The thesis also presents experimental results as evidence of the usefulness of the approach.
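The dataflow execution model described above (nodes that communicate only through queues and fire once sufficient inputs are available) can be sketched in simplified form. The sketch below uses Python rather than RVC-CAL, and the `Actor` class and the one-token-per-queue firing rule are illustrative assumptions, not the thesis's actual machinery:

```python
from collections import deque

class Actor:
    """A dataflow node: fires when every input queue holds enough tokens."""
    def __init__(self, name, fn, n_inputs):
        self.name = name
        self.fn = fn                                 # calculation performed on firing
        self.inputs = [deque() for _ in range(n_inputs)]
        self.outputs = []                            # queues of downstream actors

    def can_fire(self):
        # Simplified firing rule: at least one token on every input queue.
        return all(q for q in self.inputs)

    def fire(self):
        tokens = [q.popleft() for q in self.inputs]  # consume inputs
        result = self.fn(*tokens)
        for q in self.outputs:                       # produce outputs downstream
            q.append(result)
        return result

# Build a tiny graph: two token streams feed an adder node.
add = Actor("add", lambda a, b: a + b, n_inputs=2)
add.inputs[0].extend([1, 2, 3])
add.inputs[1].extend([10, 20, 30])

results = []
while add.can_fire():        # a naive fully dynamic scheduler
    results.append(add.fire())

print(results)  # [11, 22, 33]
```

The `while can_fire()` loop is exactly the dynamic scheduling overhead the thesis targets: a quasi-static scheduler would replace most of these run-time firing-rule checks with pre-calculated static schedules.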