122 results for Machine Tools
Abstract:
Agile methods have become increasingly popular in software engineering. While they are now generally considered applicable to software projects of many different kinds, they have not been widely adopted in embedded systems development. This is partly due to the natural constraints present in embedded systems development (e.g. hardware–software interdependencies) that challenge the utilization of agile values, principles and practices. Research on agile embedded systems development has been very limited, and this thesis tackles an even less researched theme within it: the suitability of different project management tools for agile embedded systems development. The thesis covers the basic aspects of many different agile tool types, from physical tools, such as task boards and cards, to web-based agile tools that offer all-round solutions for application lifecycle management. Between these two extremes, there is also a wide range of lighter agile tools that focus on the core agile practices, such as backlog management. Non-agile tools, such as bug trackers, can also be used to support agile development, for instance through plug-ins. To investigate the special tool requirements of agile embedded development, the author observed tool-related issues and solutions in a case study involving three companies operating in the field of embedded systems development. All three companies started from a distinct situation, so the tool solutions varied from a backlog spreadsheet built from scratch to plug-in development for an existing agile software tool. Detailed reports are presented for all three tool cases. Based on the knowledge gathered from agile tools and the case study experiences, it is concluded that there are tool-related issues in the pilot phase, such as backlog management and user motivation. These can be overcome in various ways depending on the type of team in question. Finally, five principles are formulated to give guidelines for tool selection and usage in agile embedded systems development.
Abstract:
Can crowdsourcing solutions serve many masters? Can they benefit both laymen and native speakers of minority languages on the one hand and serious linguistic research on the other? How did an infrastructure that was designed to support linguistics turn out to be a solution for raising awareness of native languages? Since 2012 the National Library of Finland has been developing the Digitisation Project for Kindred Languages, whose key objective is to support a culture of openness and interaction in linguistic research and to promote crowdsourcing as a tool for the participation of the language community in research. In the course of the project, over 1,200 monographs and nearly 111,000 pages of newspapers in Finno-Ugric languages will be digitised and made available in the Fenno-Ugrica digital collection. This material was published in the Soviet Union in the 1920s and 1930s, and users have had only sporadic access to it. The publication of open-access, searchable materials from this period is a goldmine for researchers. Historians, social scientists and laymen with an interest in specific local publications can now find text materials pertinent to their studies. The linguistically oriented can also find writings to delight them: (1) lexical items specific to a given publication, and (2) orthographically documented specifics of phonetics. In addition to the open-access collection, we developed an open-source OCR editor that enables the editing of machine-encoded text for the benefit of linguistic research. This tool was necessary because these rare and peripheral prints often include archaic characters that are neglected by modern OCR software developers but belong to the historical context of the kindred languages and are thus an essential part of the linguistic heritage. When modelling the OCR editor, it was essential to consider both the needs of researchers and the capabilities of lay citizens, and to have them participate in the planning and execution of the project from the very beginning. By implementing the feedback iteratively from both groups, it was possible to turn the requested changes into tools for research that not only supported the work of linguists but also encouraged citizen scientists to take up the challenge and work with the crowdsourcing tools for the benefit of research. This presentation deals not only with the technical aspects, developments and achievements of the infrastructure but also highlights the way in which the user groups, researchers and lay citizens, were engaged as an active and communicative group of users, and how their contributions were made to mutual benefit.
Abstract:
Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
Biomedical natural language processing (BioNLP) is a subfield of natural language processing, an area of computational linguistics concerned with developing programs that work with natural language: written texts and speech. Biomedical relation extraction concerns the detection of semantic relations such as protein-protein interactions (PPI) from scientific texts. The aim is to enhance information retrieval by detecting relations between concepts, not just individual concepts as with a keyword search. In recent years, events have been proposed as a more detailed alternative to simple pairwise PPI relations. Events provide a systematic, structural representation for annotating the content of natural language texts. Events are characterized by annotated trigger words, directed and typed arguments and the ability to nest other events. For example, the sentence “Protein A causes protein B to bind protein C” can be annotated with the nested event structure CAUSE(A, BIND(B, C)). Converted to such formal representations, the information in natural language texts can be used by computational applications. Biomedical event annotations were introduced by the BioInfer and GENIA corpora, and event extraction was popularized by the BioNLP'09 Shared Task on Event Extraction. In this thesis we present a method for automated event extraction, implemented as the Turku Event Extraction System (TEES). A unified graph format is defined for representing event annotations, and the problem of extracting complex event structures is decomposed into a number of independent classification tasks. These classification tasks are solved using SVM and RLS classifiers, utilizing rich feature representations built from full dependency parsing. Building on earlier work on pairwise relation extraction and using a generalized graph representation, the resulting TEES system is capable of detecting binary relations as well as complex event structures. We show that this event extraction system performs well, taking first place in the BioNLP'09 Shared Task on Event Extraction. Subsequently, TEES has achieved several first ranks in the BioNLP'11 and BioNLP'13 Shared Tasks, as well as showing competitive performance in the binary-relation Drug-Drug Interaction Extraction 2011 and 2013 shared tasks. The Turku Event Extraction System is published as a freely available open-source project, documenting the research in detail as well as making the method available for practical applications. In particular, in this thesis we describe the application of the event extraction method to PubMed-scale text mining, showing that the developed approach not only performs well but is also generalizable and applicable to large-scale real-world text mining projects. Finally, we discuss related literature, summarize the contributions of the work and present some thoughts on future directions for biomedical event extraction. This thesis includes and builds on six original research publications. The first of these introduces the analysis of dependency parses that led to the development of TEES. The entries in the three BioNLP Shared Tasks, as well as in the DDIExtraction 2011 task, are covered in four publications, and the sixth demonstrates the application of the system to PubMed-scale text mining.
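To make the nested event representation concrete, the following minimal Python sketch (an illustration only, not the actual TEES data model; the class names, field layout and the argument roles "Cause" and "Theme" are assumptions) encodes the example annotation CAUSE(A, BIND(B, C)) with typed, trigger-anchored, nestable events:

    from dataclasses import dataclass, field
    from typing import List, Tuple, Union

    @dataclass
    class Protein:
        name: str  # a named entity annotated in the text

    @dataclass
    class Event:
        type: str     # event type, e.g. "CAUSE" or "BIND"
        trigger: str  # the trigger word anchoring the event in the sentence
        # directed, typed arguments; a filler is an entity or a nested event
        args: List[Tuple[str, Union[Protein, "Event"]]] = field(default_factory=list)

    # "Protein A causes protein B to bind protein C" -> CAUSE(A, BIND(B, C))
    a, b, c = Protein("A"), Protein("B"), Protein("C")
    bind = Event("BIND", "bind", [("Theme", b), ("Theme", c)])
    cause = Event("CAUSE", "causes", [("Cause", a), ("Theme", bind)])

Seen this way, an event annotation is a directed graph whose nodes are triggers and entities and whose edges are argument roles, which is what allows the extraction problem to be decomposed into independent classification tasks.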
Abstract:
Remote monitoring of a power boiler allows the supplying company to make sure that the equipment is used as intended, and it offers a good opportunity for process optimization. This improves co-operation between the supplier and the customer and creates trust that helps secure future contracts. Remote monitoring is already in use with recovery boilers, but the goal is to expand it especially to biomass-fired BFB boilers. To make remote monitoring possible, data has to be measured reliably on site, and the link between the power plant and the supplying company's server has to work reliably. Data can be gathered either with the supplier's sensors or with the measurements originally installed in the power plant, if the plant in question was not originally built by the supplying company. The main goals of remote monitoring are process optimization and the avoidance of unnecessary accidents. This can be achieved, for instance, by following the efficiency curves and the fouling in different parts of the process and comparing them to past values. The final number of calculations depends on the amount of data gathered. Sudden changes in efficiency or fouling require further attention, and in such a case it is important that communication towards the power plant in question also works.
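As a simple illustration of the comparison against past values described above, the following Python sketch flags sudden deviations of a monitored indicator from its recent baseline; the window size, threshold and readings are illustrative assumptions, not values from the thesis:

    from collections import deque

    def make_deviation_monitor(window=24, threshold=0.03):
        # Flag a reading that deviates from the trailing-window mean by more
        # than the given relative threshold (e.g. hourly boiler efficiency).
        history = deque(maxlen=window)

        def check(value):
            alert = False
            if len(history) >= 3:  # need some history before comparing
                baseline = sum(history) / len(history)
                if abs(value - baseline) / abs(baseline) > threshold:
                    alert = True  # sudden change: warrants further attention
            history.append(value)
            return alert

        return check

    # Illustrative use on a stream of efficiency readings from the plant
    monitor = make_deviation_monitor()
    for reading in [0.88, 0.87, 0.88, 0.86, 0.79]:
        if monitor(reading):
            print("Efficiency deviates from recent baseline:", reading)

The same pattern extends to fouling indicators for different parts of the process, with the window and threshold tuned per signal.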
Abstract:
Macroalgae are the main primary producers of temperate rocky shores, providing a three-dimensional habitat, food and nursery grounds for many other species. During the past decades, the state of the coastal waters has deteriorated due to increasing human pressures, resulting in dramatic changes in coastal ecosystems, including macroalgal communities. To reverse the deterioration of the European seas, the EU has adopted the Water Framework Directive (WFD) and the Marine Strategy Framework Directive (MSFD), aiming at improved status of the coastal waters and the marine environment. Further, the Habitats Directive (HD) calls for the protection of important habitats and species (many of which are marine) and the Maritime Spatial Planning Directive for sustainability in the use of resources and human activities at sea and by the coasts. To efficiently protect important marine habitats and communities, we need knowledge of their spatial distribution. Ecological knowledge is also needed to assess the status of marine areas using biological indicators, as required by the WFD and the MSFD; knowledge of how biota changes with human-induced pressures is essential, but to reliably assess change, we also need to know how biotic communities vary over natural environmental gradients. This is especially important in sea areas such as the Baltic Sea, where the natural environmental gradients create substantial differences in biota between areas. In this thesis, I studied the variation occurring in macroalgal communities across the environmental gradients of the northern Baltic Sea, including eutrophication-induced changes. The aim was to produce knowledge to support the reliable use of macroalgae as indicators of the ecological status of marine areas and to test practical metrics that could potentially be used in status assessments. A further aim was to develop a methodology for mapping the HD Annex I habitat reefs, using the best available data on geology and bathymetry. The results showed that the large-scale variation in the macroalgal community composition of the northern Baltic Sea is largely driven by salinity and exposure. Exposure is important also on smaller spatial scales, affecting species occurrence, community structure and the depth penetration of algae. Consequently, this natural variability complicates the use of macroalgae as indicators of human-induced changes. Of the studied indicators, the number of perennial algal species, the perennial cover, the fraction of annual algae, and the lower limit of occurrence of red and brown perennial algae showed potential as usable indicators of ecological status. However, the cumulated cover of algae, commonly used as an indicator in fully marine environments, showed a weak response to eutrophication in the area. Although the mere occurrence of perennial algae did not show clear indicator potential, a distinct discrepancy in the occurrence of bladderwrack, Fucus vesiculosus, was found between two areas with differing eutrophication histories, the Bothnian Sea and the Archipelago Sea. The absence of Fucus from many potential sites in the outer Archipelago Sea is likely due to its inability to recover from its disappearance from the area 30-40 years ago, highlighting the importance of past events for macroalgal occurrence.
The methodology presented for mapping the potential distribution and the ecological value of reefs showed that relatively high mapping accuracy can be achieved by combining existing available data, and the maps produced serve as valuable background information for more detailed surveys. Taken together, the results of the thesis contribute significantly to the knowledge of the macroalgal communities of the northern Baltic Sea and can be directly applied in various management contexts.
Abstract:
Cardiac troponin (cTn) I and T are the recommended biomarkers for the diagnosis and risk stratification of patients with suspected acute coronary syndrome (ACS), a major cause of cardiovascular death and disability worldwide. It has recently been demonstrated that cTn-specific autoantibodies (cTnAAb) can negatively interfere with cTnI detection by immunoassays, to the extent that cTnAAb-positive patients may be falsely designated cTnI-negative. The aim of this thesis was to develop and optimize immunoassays for the detection of both cTnI and cTnAAb, which would eventually enable exploring the clinical impact of these autoantibodies on cTnI testing and subsequent patient management. The extent of cTnAAb interference in different cTnI assay configurations and the molecular characteristics of cTnAAbs were investigated in publications I and II, respectively. The findings showed that the midfragment-targeting cTnI immunoassays used predominantly in clinical practice are affected by cTnAAb interference, which can be circumvented by using a novel 3+1-type assay design with three capture antibodies against the N-terminus, midfragment and C-terminus and one tracer antibody against the C-terminus. The use of this assay configuration was further supported by the epitope specificity study, which showed that although the midfragment is most commonly targeted by cTnAAbs, the interference basically encompasses the whole molecule, and there may be remarkable individual variation at the affected sites. In publications III and IV, all the data obtained in the previous studies were utilized to develop an improved version of an existing cTnAAb assay and a sensitive cTnI assay free of this specific analytical interference. The results of the thesis showed that approximately one in ten patients with suspected ACS have detectable amounts of cTnAAbs in their circulation and that cTnAAbs can inhibit cTnI determination when targeted against the binding sites of the assay antibodies used in its immunological detection. In the light of these observations, the risk of clinical misclassification caused by the presence of cTnAAbs remains a valid and reasonable concern. Because the titers, affinities and epitope specificities of cTnAAbs and the concentration of endogenous cTnI determine the final effect of circulating cTnAAbs, appropriately sized studies on their clinical significance are warranted. The new cTnI and cTnAAb assays could serve as analytical tools for establishing the impact of cTnAAbs on cTnI testing and also for unraveling the etiology of cTn-related autoimmune responses.
Abstract:
This study examines information security as a process (information securing) in terms of what it does, especially beyond its obvious role as protector. It investigates concepts related to the ‘ontology of becoming’ and examines what it is that information securing produces. The research is theory-driven and draws upon three fields: sociology (especially actor-network theory), philosophy (especially Gilles Deleuze and Félix Guattari’s concepts of ‘machine’, ‘territory’ and ‘becoming’, and Michel Serres’s concept of the ‘parasite’), and information systems science (the subject of information security). Social engineering (used here in the sense of breaking into systems through non-technical means) and software cracker groups (groups which remove copy protection systems from software) are analysed as examples of breaches of information security. Firstly, the study finds that information securing is always interruptive: every entity (regardless of whether or not it is malicious) that becomes connected to information security is interrupted. Furthermore, every entity changes, becomes different, as it makes a connection with information security (ontology of becoming). Moreover, information security organizes entities into different territories. However, the territories – the insides and outsides of information systems – are ontologically similar; the only difference is in the order of the territories, not in the ontological status of the entities that inhabit them. In other words, malicious software is ontologically similar to benign software; both are users in terms of the system. The difference is based on the order of the system and its users: who uses the system and what the system is used for. Secondly, the research shows that information security is always external (in the terms of this study, a ‘parasite’) to the information system that it protects. Information securing creates and maintains order while simultaneously disrupting the existing order of the system that it protects. For example, in terms of the software itself, the implementation of a copy protection system is an entirely external addition. In fact, this parasitic addition makes the software different. Thus, information security disrupts that which it is supposed to defend from disruption. Finally, it is asserted that, in its interruption, information security is a connector that creates passages; it connects users to systems while also creating its own threats. For example, copy protection systems invite crackers, and information security policies entice social engineers to use and exploit information security techniques in novel ways.
Abstract:
Presentation at "Soome-ugri keelte andmebaasid ja e-leksikograafia" at Eesti Keele Instituut (Institution of Estonian Languages) in Tallnn on the 18th of November 2014.
Abstract:
The purpose of this study is to examine how the use of smartphones can support pupils' reflection on their own craft process, and at the same time to see how pupils document their craft process with the help of Talking Tools (TT). The focus of this study is on the documentation of the pupil's craft process, but I also address the presentation of craft products and the different possibilities for, and ways in which, pupils use TT on a smartphone in craft education. A change in how information and communication technology (ICT) is used is taking place, and will continue to take place, in the comprehensive school. When a new core curriculum for basic education is introduced in 2016, ICT will have a considerably larger role. This change puts pressure on teachers and on the school in general to create appropriate methods for the use of ICT. The study is based on interviews with pupils who used TT during a unit of work in craft education. Eleven pupils first presented their craft products to the researcher, after which they were interviewed. The pupils had access to their craft blogs made in TT during the interviews. The interviews were filmed and combined with the blogs to simplify the analysis of the material. The study found that the documentation of the craft process contributed to reflection when pictures and texts from the blogs were used during the presentations. As the demands on documenting schoolwork grow with the 2016 curriculum, I believe that Talking Tools can be an aid for pupils and teachers in realizing, among other things, the use of ICT in schoolwork. This presupposes that pupils are given time for its use and that the technology works.
Abstract:
In recent times the packaging industry has been seeking ways to maximize profit. Wood used to be the most advantageous and most common material for packaging, worktables, counters, constructions, interiors and tools, and for materials and utensils in food companies around the world. The use of wood has since declined sharply, and other materials like plastic, ceramics, stainless steel, concrete and aluminum have taken its place. One way the industry could reduce its costs is by finding ways of using wood for primary packaging, after which it can be safely recycled or burned as a carbon source for energy. Therefore, the main objective of this thesis is to investigate the possibility of press-forming a wood film into primary packaging. In order to achieve the stated objectives, the major characteristics of wood in terms of structure, types and applications were studied. Two wood species, pine and birch, were used for the experimental work. These were provided by a local carpentry workshop in Lappeenranta and a workshop in Ruokolahti supervised by Professor Timo Kärki. Laboratory tests were carried out at the Lappeenranta University of Technology FMS workshop on a Stenhøj EPS40 M hydraulic C-frame press coupled with a National Instruments VI Logger, and on the adjustable packaging line machine at the LUT Packaging laboratory. The tests succeeded better on the LUT packaging line than on the Stenhøj equipment due to the integrated heating system in the machine. However, there is much work to be done before the quality of a tray produced from wood film is comparable to that of a wood-plastic composite tray.
Abstract:
Analytical calculation methods for all the major components of the synchronous inductance of tooth-coil permanent magnet synchronous machines are reevaluated in this paper. Inductance estimation in the tooth-coil machine differs from that in the traditional rotating-field winding machine. The accuracy of the analytical torque calculation depends highly on the estimated synchronous inductance. Despite powerful finite element method (FEM) tools, an accurate and fast analytical method is required at an early design stage to find an initial machine design structure with the desired performance. The results of the analytical inductance calculation are verified and assessed in terms of accuracy against the FEM simulation results and the prototype measurement results.
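For orientation, the major components referred to above can be written out using the conventional textbook decomposition of the synchronous inductance (a standard formulation assumed here for illustration; the paper's own notation and component split may differ):

    L_s = L_m + L_\sigma, \qquad L_\sigma = L_\delta + L_u + L_d + L_w

where L_m is the magnetizing (air-gap) inductance and the leakage inductance L_\sigma collects the air-gap harmonic leakage L_\delta, the slot leakage L_u, the tooth-tip leakage L_d and the end-winding leakage L_w. In tooth-coil (fractional-slot concentrated) windings, the harmonic leakage typically forms a much larger share of L_s than in distributed windings, which is why the estimation differs from that of the traditional rotating-field machine.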
Abstract:
Marketing communications has gone through significant changes during the last decades, and new online tools have been leading this change for the last decade. Now, in the digital age, companies that want to be successful need to experiment with new things, seize the opportunities the online environment provides and adapt to the new environment. However, during this time the marketing communications mix and the meanings given to its various components have not changed dramatically, and personal selling, direct marketing, sales promotion, advertising and public relations activities are still regarded as important tools in the marketing communications mix. The purpose of this study was to examine business-to-business marketing communications and the tools used by companies in their marketing communications efforts in the digital age and in a global environment. The research questions dealt with the marketing communication tools and their roles, the role of online marketing communication, and the way it has shaped the field of b2b marketing communications. In order to answer these questions, a qualitative approach was chosen and the data was collected through theme interviews with six representatives of Finnish global companies from the b2b sector. The theoretical framework covers the general field of b2b marketing communications and its main elements. The online environment, the concept of a global marketplace and an integrated approach to marketing communication activities are also discussed. The interviews supported the theory regarding the activities and roles of the marketing communication tools, and both the theory and the interviews found personal selling to be a vital tool. However, the importance of online has grown, and online marketing activities have risen right next to personal selling. The use of analytics and marketing automation was found to be of great interest in the interviews, and they were seen as a growing domain in b2b marketing communications. The importance of targeted and personalised messages in relevant media was a repeating theme, as was a customer-centric approach to marketing communication activities. Offline and online tools and channels were also seen as something that should be treated together as an entity rather than as separate activities. Relevant content created according to the needs of the customer, with data gathered from analytics, was seen as the future of b2b marketing communications. Online has added its input to the more traditionally perceived tools, and they are now executed within the framework of the digital age. Nevertheless, even though online has increased its presence in the b2b marketing communications mix, the more traditionally perceived marketing communication tools, especially personal selling, have not lost their meaning or place in it.