903 results for "many core"
Abstract:
The application of information technology (IT) in customer relationship management (CRM) is growing rapidly as many companies implement CRM systems to support their numerous customer-facing activities. However, failure rates of CRM projects remain notably high, as they deliver inadequate solutions and suffer poor user acceptance. It is therefore justified to study previously researched CRM success factors and apply them to CRM system implementation. The aim of this master's thesis was to become acquainted with relevant academic theories, frameworks and practices concerning CRM and agile development, and to use them to generate a modified CRM project strategy to support the successful execution of the case company Process Vision Oy's CRM implementation project. The empirical CRM system implementation project was conducted simultaneously with the writing of this thesis, so its theoretical findings could be transferred into practice through active participation in the CRM system development and deployment work. The project's main goal was to produce and take into use a functioning CRM system. The goal was met, since at the time of printing this thesis the first system release had been successfully published to its users in Process Vision's marketing and sales departments. The key success elements in the CRM project were cyclic, iterative system development, a customer-oriented approach, user inclusion and flexible project management. Applying agile development practices ensured the ability to respond quickly to changes arising during the progress of the CRM project. Thorough modelling of the core sales process formed a strong basis on which the CRM system's operational and analytical functionalities were built. End users were included in the initial specification of system requirements and provided feedback on the system's usage.
To conclude, the chosen theoretical CRM roadmaps and agile development practices proved beneficial in the successful planning and execution of the agile CRM system implementation project at Process Vision.
Abstract:
Modern sophisticated telecommunication devices require increasingly comprehensive testing to ensure quality. The number of test cases needed to ensure sufficient test coverage has grown rapidly, and this increased demand can no longer be met by manual testing alone. New agile development models also require the execution of all test cases in every iteration. This has led manufacturers to use test automation more than ever to achieve adequate testing coverage and quality. This thesis is divided into three parts. The first part begins with the evolution of cellular networks and also examines software testing, test automation and the influence of the development model on testing. The second part describes the process used to implement a test automation scheme for functional testing of the LTE core network MME element; agile development models and the Robot Framework test automation tool were used in the implementation. The third part presents two alternative models for integrating this test automation scheme into a continuous integration process. As a result, the test automation scheme for functional testing was implemented, and almost all new functional-level test cases can now be automated with it. In addition, two models for integrating the scheme into a wider continuous integration pipeline were introduced. The shift in testing from a traditional waterfall model to a new agile development based model also proved successful.
Abstract:
XML-based data representation is increasingly used to present structured data. The intent is to provide a general-purpose, reusable way to share common data across different interfaces. XML techniques are also used to correct shortcomings found in earlier applications and to improve their operation. This master's thesis presents a driver redesign for Teleste's LabView-based test application environment. The earlier driver model was improved by applying XML-based functionality to it. The aim was to reduce the programming effort required in test application development by replacing features hard-coded into the applications with XML-based configuration files. The system is based on a general-purpose driver that uses Teleste's own EMS protocol to communicate with the products under test. The driver model uses XML-based configuration files to define the properties of the products under test, while XML schema files describe the message types of the communication protocol used by the driver and their structures. As a result, a new driver model utilizing XML techniques was successfully created. The model, based on a single shared driver, unifies the implementation of test applications and reduces the programming effort required. Use of the driver was further eased by implementing a dedicated editor in the test application development environment, with which driver-based functions can easily be created.
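The idea of replacing hard-coded product properties with XML configuration files can be illustrated in a few lines of Python. The element and attribute names below are invented for the sketch and are not Teleste's actual configuration format.

```python
# Illustrative sketch only: a generic driver reads product properties
# from an XML configuration file instead of having them hard-coded.
import xml.etree.ElementTree as ET

CONFIG = """
<product name="amplifier-01">
  <property id="gain" type="int" writable="true"/>
  <property id="temperature" type="float" writable="false"/>
</product>
"""

def load_properties(xml_text):
    """Parse a product configuration into a dict a generic driver can use."""
    root = ET.fromstring(xml_text)
    return {
        prop.get("id"): {
            "type": prop.get("type"),
            "writable": prop.get("writable") == "true",
        }
        for prop in root.findall("property")
    }
```

Supporting a new product then means writing a new configuration file rather than new driver code, which is the programming-effort reduction the abstract describes.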
Abstract:
The purpose of this thesis was to study how certificates could be used to improve the security of mobile devices. The theoretical part explains how certificates are used to improve security. In the practical part, a concept of certificate-handling middleware is introduced and implemented to demonstrate what kind of functionality is needed to improve on the current state of mobile device security. The certificate-handling middleware is a concept that would work better if implemented directly into a mobile device's core functionality. Many mobile devices have a certificate store of some kind, but it is often not used to store other people's certificates. A certificate store combined with the address book, extended with the possibility to attach attributes such as group memberships to people, would be sufficient to satisfy the needs of many emerging sharing and social applications.
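The proposed combination of a certificate store and an address book with free-form attributes might be sketched roughly as follows. All class and field names are hypothetical; real middleware would wrap the device's own certificate and contact APIs.

```python
# Hedged sketch of the concept: an address book whose entries carry other
# people's certificates and attributes such as group memberships.
from dataclasses import dataclass, field

@dataclass
class Contact:
    name: str
    certificate_pem: str = ""                       # other person's certificate
    attributes: dict = field(default_factory=dict)  # e.g. {"groups": [...]}

class CertificateAddressBook:
    def __init__(self):
        self._contacts = {}

    def add(self, contact):
        self._contacts[contact.name] = contact

    def members_of(self, group):
        """Find contacts in a group, e.g. for a sharing application."""
        return [c for c in self._contacts.values()
                if group in c.attributes.get("groups", [])]
```

A sharing application could then look up a group, collect the members' certificates, and encrypt content for exactly those recipients.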
Abstract:
Methane-rich landfill gas is generated when biodegradable organic wastes disposed of in landfills decompose under anaerobic conditions. Methane is a significant greenhouse gas, and landfills are its major source in Finland. Methane production in a landfill depends on many factors, such as the composition of the waste and landfill conditions, and it can vary considerably both temporally and spatially. Methane generation from waste can be estimated with various models. In this thesis three spreadsheet applications, a reaction equation and a triangular model for estimating gas generation were introduced. The spreadsheet models are the IPCC Waste Model (2006), Metaanilaskentamalli by Jouko Petäjä of the Finnish Environment Institute, and LandGEM (3.02) of the U.S. Environmental Protection Agency; all three are based on the first order decay (FOD) method. Gas recovery methods and gas emission measurements were also examined. Vertical wells and horizontal trenches are the most commonly used gas collection systems. Of emission measurement techniques, the chamber method, the tracer method, soil core and isotope measurements, micrometeorological mass-balance and eddy covariance methods, and gas-measuring FID technology were discussed. Methane production at the Ämmässuo landfill of HSY Helsinki Region Environmental Services Authority was estimated with the methane generation models, and the results were compared with the volumes of collected gas. All spreadsheet models underestimated the methane generation at some point. LandGEM with default parameters and Metaanilaskentamalli with modified parameters corresponded best with the gas recovery figures. One reason for the differences between estimated and collected volumes may be that the parameter values for degradable organic carbon (DOC) and the fraction of decomposable degradable organic carbon (DOCf) do not represent the real values well enough. Notable uncertainty is associated with the modelling results and model parameters.
However, no simple explanation for the discovered differences can be given within this thesis.
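The first order decay method shared by all three spreadsheet models can be sketched in a few lines: the decomposable carbon deposited in the landfill decays exponentially, and the carbon lost in each year is converted to a methane mass via the gas fraction and the CH4/C molecular weight ratio. The parameter values below are illustrative, not those used in the thesis.

```python
# Simplified first order decay (FOD) sketch in the spirit of the IPCC
# Waste Model; parameters are illustrative placeholders.
import math

def ch4_generated(ddocm0, k, years, f=0.5):
    """Yearly CH4 mass (same unit as ddocm0) from one waste deposit.

    ddocm0 : decomposable degradable organic carbon at deposition
             (derived from DOC and DOCf in the real models)
    k      : first order decay rate constant [1/year]
    f      : fraction of CH4 in the generated landfill gas
    16/12  : molecular weight ratio CH4/C
    """
    out = []
    for t in range(years):
        remaining_start = ddocm0 * math.exp(-k * t)
        remaining_end = ddocm0 * math.exp(-k * (t + 1))
        decomposed = remaining_start - remaining_end
        out.append(decomposed * f * 16.0 / 12.0)
    return out
```

A full model sums such curves over every year's deposited waste, which is why the DOC and DOCf parameter values mentioned above dominate the result.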
Abstract:
The aim of the study was to create an easily upgradable product costing model for laser-welded hollow core steel panels to support pricing decisions. The theory section includes a literature review of the traditional and modern cost accounting methodologies used by manufacturing companies. It also presents the basics of steel panel structures, their manufacturing methods and, based on previous research, their manufacturing costs. Activity-based costing turned out to be the most appropriate methodology for the costing model because of the wide variation of products. Activity analysis and the determination of cost drivers, based on observations and interviews, were the key steps in the creation of the model. The created model was used to test how panel parameters affect the costs caused by the main manufacturing stages and materials. By comparing cost structures, it was possible to find the panel types that are the most economic and uneconomic to manufacture. A sensitivity analysis proved that the model gives sufficiently reliable cost information to support pricing decisions. More reliable cost information could be achieved by determining the cost drivers more accurately. Alternative methods for manufacturing the cores were compared with the model. The comparison proved that roll forming can be more advantageous and flexible than press brake bending. However, further investigation showed that roll forming is possible only when the cores are designed to be manufactured by roll forming. Therefore, the possibility of using roll forming should be considered when new panels are designed.
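The core of an activity-based costing calculation of the kind described is a sum of driver quantities times activity rates on top of direct material cost. The activities, rates and quantities below are invented for illustration and are not the thesis model's actual cost drivers.

```python
# Minimal activity-based costing sketch with hypothetical activities
# and rates (cost per unit of cost driver).
ACTIVITY_RATES = {
    "laser_welding_m": 2.5,  # cost per metre of weld
    "cutting_ops": 4.0,      # cost per cutting operation
    "assembly_h": 30.0,      # cost per assembly hour
}

def panel_cost(driver_quantities, material_cost):
    """Total cost = direct material + sum(driver quantity * activity rate)."""
    activity_cost = sum(ACTIVITY_RATES[activity] * qty
                        for activity, qty in driver_quantities.items())
    return material_cost + activity_cost
```

Varying the driver quantities per panel type is what lets such a model expose which panel variants are economic or uneconomic to manufacture.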
Abstract:
Lipopolysaccharide (LPS), present on the outer leaflet of Gram-negative bacteria, is important for the adaptation of the bacteria to the environment. Structurally, LPS can be divided into three parts: lipid A, core and O-polysaccharide (OPS). OPS is the outermost and also the most diverse moiety. When OPS is composed of identical sugar residues it is called homopolymeric, and when it is composed of repeating units of oligosaccharides it is called heteropolymeric. Bacteria synthesize LPS at the inner membrane via two separate pathways, lipid A-core via one and OPS via the other. These are ligated together in the periplasmic space, and the completed LPS molecule is translocated to the surface of the bacteria. The genes directing OPS biosynthesis are often clustered, and the clusters directing the biosynthesis of heteropolymeric OPS often contain genes for i) the biosynthesis of the required NDP-sugar precursors, ii) the glycosyltransferases needed to build up the repeating unit, iii) translocation of the completed O-unit to the periplasmic side of the inner membrane (flippase) and iv) polymerization of the repeating units to complete the OPS. The aim of this thesis was to characterize the biosynthesis of the outer core (OC) of Yersinia enterocolitica serotype O:3 (YeO3). Y. enterocolitica is a member of the Gram-negative Yersinia genus and causes diarrhea, sometimes followed by reactive arthritis. The chemical structure of the OC and the nucleotide sequence of the gene cluster directing its biosynthesis were already known; however, no experimental evidence had been provided for the predicted functions of the gene products. The hypothesis was that OC biosynthesis would follow the pathway described for heteropolymeric OPS, i.e. a Wzy-dependent pathway. In this work the biochemical activities of two enzymes involved in NDP-sugar biosynthesis were established.
Gne was determined to be a UDP-N-acetylglucosamine 4-epimerase catalyzing the conversion of UDP-GlcNAc to UDP-GalNAc, and WbcP was shown to be a UDP-GlcNAc 4,6-dehydratase catalyzing the reaction that converts UDP-GlcNAc to the rare UDP-2-acetamido-2,6-dideoxy-d-xylo-hex-4-ulopyranose (UDP-Sugp). The linkage specificities and the order in which the different glycosyltransferases build up the OC onto the lipid carrier were also investigated. In addition, using a site-directed mutagenesis approach, the catalytically important amino acids of Gne and of two of the characterized glycosyltransferases were identified. Evidence was also provided for the enzymes involved in the ligation of the OC and the OPS to the lipid A inner core. The importance of the OC to the physiology of Y. enterocolitica O:3 was defined by determining the minimum requirements for the OC to be recognized by a bacteriophage, a bacteriocin and a monoclonal antibody. The biological importance of the rare keto sugar (Sugp) was also shown. In conclusion, this work provides an extensive overview of the biosynthesis of the YeO3 OC, supplying a substantial amount of information on the stepwise and coordinated synthesis of the YeO3 OC hexasaccharide and detailed information on its properties as a receptor.
Abstract:
The purpose of this thesis was to comprehensively analyze and develop the spare part business in Company Oy's five biggest product groups by identifying development issues related both to individual spare parts' supply chains and to the spare part business process, making implementation plans for them, and implementing the plans where possible. The items were classified based on the special characteristics of spare parts and on their actual sales volumes, and the created item classes were examined to find improvement possibilities. Management strategies for the classified items were suggested. Vendors and customers were analyzed to support the comprehensive supply network development work. The effectiveness of the current spare part business process was analyzed in co-operation with the spare part teams in three business unit locations. Several items were removed from inventories as needlessly stocked. It was suggested that the price list for core items agreed with one of the main product group's core item manufacturers be expanded in Town A. Developing the supply chain management of refinement equipment seal items was seen as important in Town B. A new internal business process model was created to minimize and enhance the internal business between the Company's business units. Changes to SAP inventory reports and several other features were suggested, and continuous development of SAP material data management was also seen as very important. Many other development issues related to spare parts' supply chains and to the work done in the business process were found. The need to investigate the development possibilities more deeply became very clear during the project.
Abstract:
The development of correct programs is a core problem in computer science. Although formal verification methods for establishing correctness with mathematical rigor are available, programmers often find them difficult to put into practice. One hurdle is deriving the loop invariants and proving that the code maintains them. So-called correct-by-construction methods aim to alleviate this issue by integrating verification into the programming workflow. Invariant-based programming is a practical correct-by-construction method in which the programmer first establishes the invariant structure and then incrementally extends the program in steps, adding code and proving after each addition that the code is consistent with the invariants. In this way, the program is kept internally consistent throughout its development, and the construction of the correctness arguments (proofs) becomes an integral part of the programming workflow. A characteristic of the approach is that programs are described as invariant diagrams, a graphical notation similar to the state charts familiar to programmers. Invariant-based programming is a new method that has not yet been evaluated in large-scale studies. The most important prerequisite for feasibility on a larger scale is a high degree of automation. The goal of the Socos project has been to build tools to assist the construction and verification of programs using the method. This thesis describes the implementation and evaluation of a prototype tool in the context of the Socos project. The tool supports the drawing of the diagrams, automatic derivation and discharging of verification conditions, and interactive proofs. It is used to develop programs that are correct by construction. The tool consists of a diagrammatic environment connected to a verification condition generator and an existing state-of-the-art theorem prover.
Its core is a semantics for translating diagrams into verification conditions, which are sent to the underlying theorem prover. We describe a concrete method for 1) deriving sufficient conditions for the total correctness of an invariant diagram; 2) sending the conditions to the theorem prover for simplification; and 3) reporting the results of the simplification to the programmer in a way that is consistent with the invariant-based programming workflow and that allows errors in the program specification to be detected efficiently. The tool uses an efficient automatic proof strategy to prove as many conditions as possible automatically and lets the remaining conditions be proved interactively. The tool is based on the verification system PVS and uses the SMT (Satisfiability Modulo Theories) solver Yices as a catch-all decision procedure. Conditions that are not discharged automatically may be proved interactively using the PVS proof assistant. The programming workflow is very similar to the process by which a mathematical theory is developed inside a computer-supported theorem prover environment such as PVS. The programmer reduces a large verification problem with the aid of the tool into a set of smaller problems (lemmas) and can substantially improve the degree of proof automation by developing specialized background theories and proof strategies to support the specification and verification of a specific class of programs. We demonstrate this workflow by describing in detail the construction of a verified sorting algorithm. Tool-supported verification often has little to no presence in computer science (CS) curricula. Furthermore, program verification is frequently introduced as an advanced and purely theoretical topic that is not connected to the workflow taught in the early and practically oriented programming courses.
Our hypothesis is that verification could be introduced early in the CS education, and that verification tools could be used in the classroom to support the teaching of formal methods. A prototype of Socos has been used in a course at Åbo Akademi University targeted at first and second year undergraduate students. We evaluate the use of Socos in the course as part of a case study carried out in 2007.
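The invariant-first workflow can be approximated at runtime with assertions: state the invariant, check it before the loop, after every step, and at exit. Socos discharges the corresponding verification conditions statically with a theorem prover rather than executing them, so the sketch below is only an executable analogy, and the sorting example is illustrative rather than the thesis's verified algorithm.

```python
# Runtime approximation of invariant-based programming: the invariant is
# written first and every code addition is checked against it.
def sorted_prefix(xs, n):
    """Invariant: the first n elements of xs are in ascending order."""
    return all(xs[i] <= xs[i + 1] for i in range(n - 1))

def insertion_sort(xs):
    xs = list(xs)
    assert sorted_prefix(xs, 1)              # invariant established
    for i in range(1, len(xs)):
        j = i
        while j > 0 and xs[j - 1] > xs[j]:
            xs[j - 1], xs[j] = xs[j], xs[j - 1]
            j -= 1
        assert sorted_prefix(xs, i + 1)      # invariant maintained each step
    assert sorted_prefix(xs, len(xs))        # invariant implies postcondition
    return xs
```

In the static setting, each of these assertions becomes a verification condition sent to the theorem prover, and a condition that fails to discharge points at the exact step where the code and the invariants disagree.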
Abstract:
Programming and mathematics are core areas of computer science (CS) and consequently also important parts of CS education. Introductory instruction in these two topics is, however, not without problems. Studies show that CS students find programming difficult to learn and that teaching mathematical topics to CS novices is challenging. One reason for the latter is the disconnection between mathematics and programming found in many CS curricula, which results in students not seeing the relevance of the subject for their studies. In addition, reports indicate that students' mathematical capability and maturity levels are dropping. The challenges faced when teaching mathematics and programming at CS departments can also be traced back to gaps in students' prior education. In Finland the high school curriculum does not include CS as a subject; instead, focus is on learning to use the computer and its applications as tools. Similarly, many of the mathematics courses emphasize application of formulas, while logic, formalisms and proofs, which are important in CS, are avoided. Consequently, high school graduates are not well prepared for studies in CS. Motivated by these challenges, the goal of the present work is to describe new approaches to teaching mathematics and programming aimed at addressing these issues: Structured derivations is a logic-based approach to teaching mathematics, where formalisms and justifications are made explicit. The aim is to help students become better at communicating their reasoning using mathematical language and logical notation at the same time as they become more confident with formalisms. The Python programming language was originally designed with education in mind, and has a simple syntax compared to many other popular languages. 
The aim of using it in instruction is to address algorithms and their implementation in a way that puts the focus on learning algorithmic thinking and programming rather than on learning a complex syntax. Invariant-based programming is a diagrammatic approach to developing programs that are correct by construction. The approach is based on elementary propositional and predicate logic, and makes explicit the underlying mathematical foundations of programming. The aim is also to show how mathematics in general, and logic in particular, can be used to create better programs.
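As an example of the syntax-light algorithmic code this approach favours, consider binary search, whose Python form follows almost directly from its invariant. This is a hypothetical classroom example, not one taken from the thesis.

```python
# Binary search written from its invariant: if target is present,
# it lies in xs[lo:hi].
def binary_search(xs, target):
    """Return an index of target in sorted xs, or -1 if absent."""
    lo, hi = 0, len(xs)
    while lo < hi:
        mid = (lo + hi) // 2
        if xs[mid] < target:
            lo = mid + 1        # target, if present, is right of mid
        elif xs[mid] > target:
            hi = mid            # target, if present, is left of mid
        else:
            return mid
    return -1
```

The point for instruction is that the logical reasoning (the shrinking interval invariant) carries the whole algorithm, while the language adds almost no syntactic overhead.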
Abstract:
The biotechnological revolution of recent decades has resulted in the development of an almost limitless power over human life. This context demands from professionals a global view of the ethical and social problems of the contemporary era, grounded in solid philosophical and legal foundations, and makes it necessary to promote new competencies and skills related to professional life. In this sense, the teaching of bioethics emerges as a possibility for curricular innovation, an alternative to the traditional prescriptive and normative model. This article reports the experience of the UNESCO Chair of Bioethics at the University of Brasília in using the Core Curriculum proposed by UNESCO as a didactic-pedagogical instrument suited to the teaching of bioethics. Among the pedagogical dilemmas faced by bioethics as a discipline are the construction of its contents, its structuring, the theoretical conceptions to be followed, and its objectives. Contextualizing and refining the strategy proposed by the Core Curriculum can provide important facilitating instruments for teachers who seek to organize innovative didactic-pedagogical practices in bioethics with the aim of achieving effective results in the education of their students.
Abstract:
Added engraved title page: The history of Lapland.
Abstract:
The paper industry is constantly looking for new ideas to improve paper products as competition and raw material prices increase. Many paper products are pigment coated. The coating layer is the top layer of paper; thus, by modifying the coating pigment, the paper itself can be altered and value added to the final product. In this thesis, the synthesis of new plastic and hybrid pigments and their performance in paper and paperboard coating are reported. Two types of plastic pigments were studied: core-shell latexes and solid beads of maleimide copolymers. Core-shell latexes with a partially crosslinked hydrophilic polymer core of poly(n-butyl acrylate-co-methacrylic acid) and a hard hydrophobic polystyrene shell were prepared to improve the optical properties of coated paper. In addition, the effect of different crosslinkers was analyzed; the best overall performance was achieved with ethylene glycol dimethacrylate (EGDMA). Furthermore, the possibility of modifying the core-shell latex was investigated by introducing a new polymerizable optical brightening agent, 1-[(4-vinylphenoxy)methyl]-4-(2-phenylethylenyl)benzene, which gave promising results. The prepared core-shell latex pigments also performed smoothly in pilot coating and printing trials. The results demonstrated that by optimizing the polymer composition, the optical and surface properties of coated paper can be significantly enhanced. The optimal reaction conditions were established for the thermal imidization of poly(styrene-co-maleimide) (SMI) and poly(octadecene-co-maleimide) (OMI) from the respective maleic anhydride copolymer precursors and ammonia in a solvent-free process. The obtained aqueous dispersions of nanoparticle copolymers exhibited glass transition temperatures (Tg) between 140 and 170 °C and particle sizes from 50 to 230 nm. Furthermore, the maleimide copolymers were evaluated as additional pigments in paperboard coating.
The maleimide copolymer nanoparticles were partly embedded into the porous coating structure, and therefore the full potential of optical property enhancement for paperboard was not achieved by this method. The possibility of modifying the maleimide copolymers was also studied. Modifications were carried out via N-substitution by replacing part of the ammonia in the imidization reaction with amines such as triacetonediamine (TAD), aspartic acid (ASP) and fluorinated amines (2,2,2-trifluoroethylamine, TFEA, and 2,2,3,3,4,4,4-heptafluorobutylamine, HFBA). The obtained functional nanoparticles varied in size between 50 and 217 nm and in Tg from 150 to 180 °C. During the coating process the produced plastic pigments exhibited good runnability. No significant improvements in light stability were achieved with the TAD-modified copolymers, whereas nanoparticles modified with aspartic acid and those containing fluorinated groups showed the desired changes in the surface properties of the coated paperboard. Finally, preliminary studies with organic-inorganic hybrids are reported. The hybrids, prepared by an in situ polymerization reaction, consisted of 30 wt% poly(styrene-co-maleimide) (SMI) and a high level, 70 wt%, of inorganic components: kaolin and/or alumina trihydrate. Scanning Electron Microscopy (SEM) images and characterization by Fourier Transform Infrared Spectroscopy (FTIR) and X-Ray Diffraction (XRD) revealed that the hybrids had a conventional composite structure, with the inorganic components covered by precipitated SMI nanoparticles attached to the surface via hydrogen bonding. In paper coating, the hybrids had a beneficial effect in increasing gloss levels.
Abstract:
Cloud computing enables on-demand network access to shared resources (e.g., computation, networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort. Cloud computing refers both to the applications delivered as services over the Internet and to the hardware and system software in the data centers. Software as a service (SaaS) is part of cloud computing and one of its service models. SaaS is software deployed as a hosted service and accessed over the Internet: the consumer uses the provider's applications running in the cloud. SaaS separates the possession and ownership of software from its use. The applications can be accessed from any device through a thin client interface; a typical SaaS application is used with a web browser and priced monthly. In this thesis, the characteristics of cloud computing and SaaS are presented, and a few implementation platforms for SaaS are discussed. Then, four different SaaS implementation cases and one transformation case are examined, and the pros and cons of SaaS are studied. This is done based on literature references and an analysis of the SaaS implementations and the transformation case, from both the customer's and the service provider's points of view. In addition, the pros and cons of on-premises software are listed. The purpose of this thesis is to find when SaaS should be utilized and when it is better to choose traditional on-premises software. The qualities of SaaS bring many benefits to the customer as well as to the provider. A customer should utilize SaaS when it provides cost savings, ease, and scalability over on-premises software. SaaS is reasonable when the customer does not need tailoring but only a simple, general-purpose service, and the application supports the customer's core business.
A provider should utilize SaaS when it offers cost savings, scalability, faster development, and a wider customer base over on-premises software. It is wise to choose SaaS when the application is cheap, is aimed at the mass market, needs frequent updating, needs high-performance computing, needs to store large amounts of data, or there is some other direct value from the cloud infrastructure.
Abstract:
The purpose of this thesis is to study organizational core values and their application in practice. With the help of literature, the thesis discusses the implementation of core values and the benefits that companies can gain by doing it successfully. Ways in which companies can improve the application of their values in everyday work are also presented. The case company's value implementation is evaluated through a survey of its employees. The true power of values lies in their application; therefore, core values should be the basis for all organizational behavior, integrated into everything a company does. Applying values in practice is an ongoing process, and companies should continuously work towards creating a more value-based organizational culture. If a company does this effectively, it will most likely become more successful with stakeholders as well as financially. Companies looking to turn their values into actions should start with a self-assessment. Employee surveys are effective in assessing the current level of value implementation, since employees have valuable first-hand information about the situations and behaviors they face in their everyday work. After the self-assessment, factors such as management commitment, communication, training, and support are key to successful value implementation.