908 results for Machine Tools
Abstract:
Remote monitoring of a power boiler allows the supplying company to ensure that the equipment is used as intended and offers a good opportunity for process optimization. This improves co-operation between the supplier and the customer and builds trust that helps secure future contracts. Remote monitoring is already in use with recovery boilers, but the goal is to expand it especially to biomass-fired BFB boilers. To make remote monitoring possible, data has to be measured reliably on site, and the link between the power plant and the supplying company's server has to work reliably. Data can be gathered either with the supplier's sensors or, if the plant in question was not originally built by the supplying company, with the measurements originally installed in the power plant. The main goals of remote monitoring are process optimization and the avoidance of unnecessary accidents. This can be achieved, for instance, by following the efficiency curves and the fouling in different parts of the process and comparing them with past values. The final number of calculations depends on the amount of data gathered. Sudden changes in efficiency or fouling require further attention, and in such a case it is important that the dialogue with the power plant in question also works.
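The comparison of current efficiency against past values can be sketched as a simple rolling-baseline check. This is an illustrative sketch only, not the monitoring system described in the abstract; the window length and alarm threshold are made-up parameters.

```python
from collections import deque

def make_drift_detector(window=24, threshold=0.03):
    """Flag efficiency readings that drop sharply below the recent baseline."""
    history = deque(maxlen=window)

    def check(efficiency):
        # Compare the new reading against the mean of the past `window` values.
        if len(history) >= window:
            baseline = sum(history) / len(history)
            alarm = (baseline - efficiency) > threshold
        else:
            alarm = False  # not enough history yet
        history.append(efficiency)
        return alarm

    return check

# Hypothetical boiler efficiency readings; the last one drops suddenly.
check = make_drift_detector(window=4, threshold=0.02)
readings = [0.91, 0.90, 0.91, 0.90, 0.905, 0.86]
alarms = [check(x) for x in readings]  # only the final reading triggers an alarm
```

The same pattern extends to fouling indicators: each monitored quantity gets its own baseline, and a triggered alarm would prompt the dialogue with the plant mentioned above.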
Abstract:
Macroalgae are the main primary producers of temperate rocky shores, providing a three-dimensional habitat, food and nursery grounds for many other species. During the past decades, the state of the coastal waters has deteriorated due to increasing human pressures, resulting in dramatic changes in coastal ecosystems, including macroalgal communities. To reverse the deterioration of the European seas, the EU has adopted the Water Framework Directive (WFD) and the Marine Strategy Framework Directive (MSFD), aiming at an improved status of the coastal waters and the marine environment. Further, the Habitats Directive (HD) calls for the protection of important habitats and species (many of which are marine), and the Maritime Spatial Planning Directive for sustainability in the use of resources and human activities at sea and by the coasts. To efficiently protect important marine habitats and communities, we need knowledge of their spatial distribution. Ecological knowledge is also needed to assess the status of marine areas with biological indicators, as required by the WFD and the MSFD; knowledge of how biota changes with human-induced pressures is essential, but to reliably assess change, we also need to know how biotic communities vary over natural environmental gradients. This is especially important in sea areas such as the Baltic Sea, where the natural environmental gradients create substantial differences in biota between areas. In this thesis, I studied the variation occurring in macroalgal communities across the environmental gradients of the northern Baltic Sea, including eutrophication-induced changes. The aim was to produce knowledge to support the reliable use of macroalgae as indicators of the ecological status of marine areas and to test practical metrics that could potentially be used in status assessments.
Further, the aim was to develop a methodology for mapping the HD Annex I habitat reefs, using the best available data on geology and bathymetry. The results showed that the large-scale variation in the macroalgal community composition of the northern Baltic Sea is largely driven by salinity and exposure. Exposure is important also on smaller spatial scales, affecting species occurrence, community structure and the depth penetration of algae. Consequently, the natural variability complicates the use of macroalgae as indicators of human-induced changes. Of the studied indicators, the number of perennial algal species, the perennial cover, the fraction of annual algae, and the lower limit of occurrence of red and brown perennial algae showed potential as usable indicators of ecological status. However, the cumulated cover of algae, commonly used as an indicator in fully marine environments, showed low responses to eutrophication in the area. Although the mere occurrence of perennial algae did not show clear indicator potential, a distinct discrepancy in the occurrence of bladderwrack, Fucus vesiculosus, was found between two areas with differing eutrophication histories, the Bothnian Sea and the Archipelago Sea. The absence of Fucus from many potential sites in the outer Archipelago Sea is likely due to its inability to recover from its disappearance from the area 30-40 years ago, highlighting the importance of past events in macroalgal occurrence. The methodology presented for mapping the potential distribution and the ecological value of reefs showed that relatively high accuracy in mapping can be achieved by combining existing available data, and the maps produced serve as valuable background information for more detailed surveys. Taken together, the results of the thesis contribute significantly to the knowledge of the macroalgal communities of the northern Baltic Sea and can be directly applied in various management contexts.
Abstract:
Cardiac troponin (cTn) I and T are the recommended biomarkers for the diagnosis and risk stratification of patients with suspected acute coronary syndrome (ACS), a major cause of cardiovascular death and disability worldwide. It has recently been demonstrated that cTn-specific autoantibodies (cTnAAb) can negatively interfere with cTnI detection by immunoassays, to the extent that cTnAAb-positive patients may be falsely designated as cTnI-negative. The aim of this thesis was to develop and optimize immunoassays for the detection of both cTnI and cTnAAb, which would eventually enable exploring the clinical impact of these autoantibodies on cTnI testing and subsequent patient management. The extent of cTnAAb interference in different cTnI assay configurations and the molecular characteristics of cTnAAbs were investigated in publications I and II, respectively. The findings showed that the cTnI midfragment-targeting immunoassays used predominantly in clinical practice are affected by cTnAAb interference, which can be circumvented by using a novel 3+1-type assay design with three capture antibodies against the N-terminus, midfragment and C-terminus and one tracer antibody against the C-terminus. The use of this assay configuration was further supported by the epitope specificity study, which showed that although the midfragment is most commonly targeted by cTnAAbs, the interference essentially encompasses the whole molecule, and there may be remarkable individual variation at the affected sites. In publications III and IV, all the data obtained in the previous studies were utilized to develop an improved version of an existing cTnAAb assay and a sensitive cTnI assay free of this specific analytical interference.
The results of the thesis showed that approximately one in 10 patients with suspected ACS have detectable amounts of cTnAAbs in their circulation and that cTnAAbs can inhibit cTnI determination when targeted against the binding sites of assay antibodies used in its immunological detection. In the light of these observations, the risk of clinical misclassification caused by the presence of cTnAAbs remains a valid and reasonable concern. Because the titers, affinities and epitope specificities of cTnAAbs and the concentration of endogenous cTnI determine the final effect of circulating cTnAAbs, appropriately sized studies on their clinical significance are warranted. The new cTnI and cTnAAb assays could serve as analytical tools for establishing the impact of cTnAAbs on cTnI testing and also for unraveling the etiology of cTn-related autoimmune responses.
Abstract:
This study examines information security as a process (information securing) in terms of what it does, especially beyond its obvious role as protector. It investigates concepts related to the 'ontology of becoming' and examines what it is that information securing produces. The research is theory-driven and draws upon three fields: sociology (especially actor-network theory), philosophy (especially Gilles Deleuze and Félix Guattari's concepts of 'machine', 'territory' and 'becoming', and Michel Serres's concept of the 'parasite'), and information systems science (the subject of information security). Social engineering (used here in the sense of breaking into systems through non-technical means) and software cracker groups (groups which remove copy protection systems from software) are analysed as examples of breaches of information security. Firstly, the study finds that information securing is always interruptive: every entity (regardless of whether or not it is malicious) that becomes connected to information security is interrupted. Furthermore, every entity changes, becomes different, as it makes a connection with information security (ontology of becoming). Moreover, information security organizes entities into different territories. However, the territories, the insides and outsides of information systems, are ontologically similar; the only difference is in the order of the territories, not in the ontological status of the entities that inhabit them. In other words, malicious software is ontologically similar to benign software; both are users in terms of a system. The difference is based on the order of the system and its users: who uses the system and what the system is used for. Secondly, the research shows that information security is always external (in the terms of this study, a 'parasite') to the information system that it protects.
Information securing creates and maintains order while simultaneously disrupting the existing order of the system that it protects. For example, in terms of software itself, the implementation of a copy protection system is an entirely external addition. In fact, this parasitic addition makes software different. Thus, information security disrupts that which it is supposed to defend from disruption. Finally, it is asserted that, in its interruption, information security is a connector that creates passages; it connects users to systems while also creating its own threats. For example, copy protection systems invite crackers and information security policies entice social engineers to use and exploit information security techniques in a novel manner.
Abstract:
Presentation at "Soome-ugri keelte andmebaasid ja e-leksikograafia" at Eesti Keele Instituut (Institute of the Estonian Language) in Tallinn on 18 November 2014.
Abstract:
Recent research has shown that receptor-ligand interactions between surfaces of communicating cells are necessary prerequisites for cell proliferation, cell differentiation and immune defense. Cell-adhesion events have also been proposed for pathological conditions such as cancer growth, metastasis, and host-cell invasion by parasites such as Trypanosoma cruzi. RNA and DNA aptamers (aptus = Latin, fit) that have been selected from combinatorial nucleic acid libraries are capable of binding to cell-adhesion receptors leading to a halt in cellular processes induced by outside signals as a consequence of blockage of receptor-ligand interactions. We outline here a novel approach using RNA aptamers that bind to T. cruzi receptors and interrupt host-cell invasion in analogy to existing procedures of blocking selectin adhesion and function in vitro and in vivo.
Abstract:
We have developed software called pp-Blast that uses the publicly available Blast package and PVM (Parallel Virtual Machine) to partition a multi-sequence query across a set of nodes with replicated or shared databases. Benchmark tests show that pp-Blast running on a cluster of 14 PCs outperformed conventional Blast running on large servers. In addition, using pp-Blast and the cluster, we were able to map all human cDNAs onto the draft of the human genome in less than six days. We propose that the cost/benefit ratio of pp-Blast makes it appropriate for large-scale sequence analysis. The source code and configuration files for pp-Blast are available at http://www.ludwig.org.br/biocomp/tools/pp-blast.
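The core idea of partitioning a multi-sequence query can be sketched as follows. This is a minimal illustration of the partitioning step only, not the pp-Blast implementation; the PVM dispatch of each chunk to a worker node and the database handling are omitted.

```python
def split_fasta(text, n_nodes):
    """Partition a multi-sequence FASTA query into roughly equal chunks,
    one per worker node, without splitting any single record."""
    records = [">" + r for r in text.split(">") if r.strip()]
    chunks = [[] for _ in range(n_nodes)]
    for i, rec in enumerate(records):
        chunks[i % n_nodes].append(rec)  # round-robin assignment
    return ["".join(c) for c in chunks if c]

# Toy query with four sequences, split across two nodes:
query = ">s1\nACGT\n>s2\nGGCC\n>s3\nTTAA\n>s4\nCCGG\n"
parts = split_fasta(query, 2)
# parts[0] holds s1 and s3; parts[1] holds s2 and s4
```

Each chunk would then be handed to a separate node running a standard Blast process against its local or shared database copy, and the per-node result files merged afterwards.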
Abstract:
The purpose of this study is to examine how the use of smartphones can support pupils' reflection on their own craft process, and at the same time to see how pupils document their craft process with the help of Talking Tools (TT). The focus of this study is on the documentation of the pupil's craft process, but I also address the presentation of craft products and the different possibilities for, and ways in which, pupils use TT on a smartphone in craft education. A change in how information and communication technology (ICT) is used is taking place, and will continue to take place, in comprehensive school. When a new curriculum for basic education comes into use in 2016, ICT will have a considerably larger role. This change puts pressure on teachers and on the school in general to create appropriate methods for using ICT. The study is based on interviews with pupils who used TT during a unit of craft work. Eleven pupils first presented their craft products to the researcher, after which the pupils were interviewed. The pupils had access to their craft blogs made in TT during the interviews. The interviews were filmed and combined with the blogs to simplify the analysis of the material. The study found that the documentation of the craft process contributed to reflection when pictures and texts from the blogs were used during the presentations. As the demands on the documentation of school work grow with the 2016 curriculum, I believe that Talking Tools can be an aid for pupils and teachers in realizing, among other things, the use of ICT in school work. This presupposes that the pupils are given time for its use and that the technology works.
Abstract:
The packaging industry is currently seeking ways to maximize profit. Wood used to be the most advantageous and commonplace material for packaging, worktables, counters, constructions, interiors, and tools, and for materials and utensils in food companies around the world. The use of wood has declined sharply, and other materials such as plastic, ceramic, stainless steel, concrete, and aluminum have taken its place. One way the industry could reduce its costs is by finding possibilities for using wood in primary packaging, after which it can be safely recycled or burned as a carbon source for energy. Therefore, the main objective of this thesis is to investigate the possibility of press-forming a wood film into primary packaging. To achieve the stated objectives, the major characteristics of wood in terms of structure, types, and applications were studied. Two wood species, pine and birch, were used for the experimental work. These were provided by a local carpentry workshop in Lappeenranta and a workshop in Ruokolahti supervised by Professor Timo Kärki. Laboratory tests were carried out at the Lappeenranta University of Technology FMS workshop on a Stenhøj EPS40 M hydraulic C-frame press coupled with a National Instruments VI Logger, and on the adjustable packaging line machine at the LUT Packaging laboratory. The tests succeeded better on the LUT packaging line than on the Stenhøj equipment due to the integrated heating system in the machine. However, much work remains before the quality of a tray produced from wood film is comparable to that of a wood-plastic composite tray.
Abstract:
Analytical calculation methods for all the major components of the synchronous inductance of tooth-coil permanent magnet synchronous machines are re-evaluated in this paper. The inductance estimation differs in the tooth-coil machine compared with the traditional rotating-field winding machine. The accuracy of the analytical torque calculation depends strongly on the estimated synchronous inductance. Despite powerful finite element method (FEM) tools, an accurate and fast analytical method is required at an early design stage to find an initial machine design with the desired performance. The results of the analytical inductance calculation are verified and assessed in terms of accuracy against the FEM simulation results and the prototype measurement results.
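In standard machine-design practice, the synchronous inductance is the sum of the magnetizing (air-gap) inductance and the leakage components; in tooth-coil windings the air-gap harmonic leakage is typically a large share of the total. The sketch below only illustrates that bookkeeping; the component breakdown is assumed from general design practice rather than taken from the paper, and the numerical values are made up.

```python
def synchronous_inductance(components):
    """Sum the per-component inductances (in henries) into the
    synchronous inductance L_s = L_m + L_sigma."""
    return sum(components.values())

# Illustrative (made-up) per-phase values in henries:
comps = {
    "magnetizing": 1.2e-3,
    "air-gap harmonic leakage": 2.0e-3,  # often dominant in tooth-coil windings
    "slot leakage": 0.8e-3,
    "tooth-tip leakage": 0.3e-3,
    "end-winding leakage": 0.2e-3,
}
L_s = synchronous_inductance(comps)  # 4.5e-3 H for these values
```

An analytical design tool would compute each of these components from the machine geometry and winding data before summing them; the paper's point is that the accuracy of each component estimate feeds directly into the torque calculation.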
Abstract:
Marketing communications has gone through significant changes during the last decades, and new online tools have led this change for the last decade. Now, in the digital age, companies that want to be successful need to experiment with new things, seize the opportunities the online environment provides, and adapt to the new environment. However, during this time the marketing communications mix and the meanings given to its various components have not changed dramatically, and personal selling, direct marketing, sales promotion, advertising and public relations activities are still regarded as important tools in the marketing communications mix. The purpose of this study was to examine business-to-business marketing communications and the tools companies use in their marketing communications efforts in the digital age and in a global environment. The research questions dealt with the marketing communication tools and their roles, as well as the role of online marketing communication and the way it has shaped the field of b2b marketing communications. To answer these questions, a qualitative approach was chosen, and the data was collected through theme interviews with six representatives of Finnish global companies from the b2b sector. The theoretical framework covers the general field of b2b marketing communications and its main elements. The online environment, the concept of a global marketplace, and an integrated approach to marketing communication activities are also discussed. The interviews supported the theory regarding the activities and roles of the marketing communication tools, and both the theory and the interviews found personal selling to be a vital tool. However, the importance of online has grown, and online marketing activities have risen right next to personal selling.
The use of analytics and marketing automation attracted great interest in the interviews, and they were seen as a growing domain in b2b marketing communications. The importance of targeted and personalised messages in relevant media was a recurring theme, as was a customer-centric approach to marketing communication activities. Offline and online tools and channels were also seen as something that should be treated together as an entity rather than as separate activities. Relevant content, created according to the needs of the customer with data gathered from analytics, was seen as the future of b2b marketing communications. Online has added its input to the more traditionally perceived tools, and they are now executed within the framework of the digital age. Nevertheless, even though online has increased its presence in the b2b marketing communications mix, the more traditionally perceived marketing communication tools, especially personal selling, have not lost their meaning or place in the b2b marketing communications mix.
Abstract:
Electrical machines have significant improvement potential. Nevertheless, the field is characterized by incremental innovations. Admittedly, steady improvement has been achieved, but no breakthrough development. Radical development in the field would require the introduction of new elements that could change the whole electrical machine industry. Recent technological advancements in nanomaterials have opened up new horizons for the macroscopic application of carbon nanotube (CNT) fibres. With values of 100 MS/m measured on individual CNTs, CNT fibre materials hold promise for conductivities far beyond those of metals. Highly conductive, lightweight and strong CNT yarn is finally within reach; it could replace copper as a potentially better winding material. Although not yet providing low resistivity, the newest CNT yarn offers attractive prospects for accelerated efficiency improvement of electrical machines. In this article, the potential for using new CNT materials to replace copper in machine windings is introduced. This is done, firstly, by describing the environment for a change that could revolutionize the industry and, secondly, by presenting the breakthrough results of a prototype construction. In the test motor, which is to our knowledge the first of its kind, the presently most electrically conductive carbon nanotube yarn replaces the usual copper in the windings.
Abstract:
A direct-driven permanent magnet synchronous machine for a small urban electric vehicle is presented. The measured performance of the machine on the test bench, as well as its performance over the modified New European Drive Cycle, is given. The effect of optimal current components, which maximize the efficiency while taking the iron loss into account, is compared with simple id = 0 control. The machine currents and losses during the drive cycle are calculated and compared with each other.
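The difference between id = 0 control and an optimal choice of current components can be illustrated with the standard dq-axis torque equation of a PMSM. This sketch uses made-up machine parameters and shows only the copper-side maximum-torque-per-ampere idea; the iron-loss term that the paper's optimal control accounts for is omitted.

```python
import math

def torque(p, psi, Ld, Lq, i_d, i_q):
    """Electromagnetic torque of a PMSM, standard dq model:
    T = 1.5 * p * (psi * i_q + (Ld - Lq) * i_d * i_q)."""
    return 1.5 * p * (psi * i_q + (Ld - Lq) * i_d * i_q)

# Hypothetical machine parameters (pole pairs, PM flux linkage, inductances):
p, psi, Ld, Lq = 4, 0.1, 1.0e-3, 2.5e-3
I = 100.0  # stator current magnitude, A

# id = 0 control: all current on the q-axis, no reluctance torque.
T_id0 = torque(p, psi, Ld, Lq, 0.0, I)

# Sweep the current angle at constant current magnitude to find the
# maximum-torque-per-ampere operating point (brute-force search).
T_mtpa = max(
    torque(p, psi, Ld, Lq, -I * math.sin(b), I * math.cos(b))
    for b in (i * 0.001 for i in range(0, 1571))
)
# For a salient machine (Lq > Ld), T_mtpa exceeds T_id0 at the same current.
```

At equal stator current, the negative d-axis current adds reluctance torque, which is why the optimal-current control yields lower losses for a given torque demand over the drive cycle.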
Abstract:
Electrical machine drives are the largest consumers of electrical energy worldwide. The largest proportion of drives is found in industrial applications. There are, however, many other applications that are also based on the use of electrical machines, because they have a relatively high efficiency, a low noise level, and do not produce local pollution. Electrical machines can be classified into several categories. One of the most commonly used electrical machine types (especially in industry) is the induction motor, also known as the asynchronous machine. Induction motors have a mature production process and a robust rotor construction. However, in a world pursuing higher energy efficiency with reasonable investments, not every application benefits from using this type of motor drive. The main drawback of induction motors is that they need slip-caused, and thus loss-generating, current in the rotor, and additional stator current for magnetic field production along with the torque-producing current. This can reduce the electric motor drive efficiency, especially in low-speed, low-power applications. Often, when high torque density is required together with low losses, it is desirable to apply permanent magnet technology, because in this case there is no need to use current to produce the basic excitation of the machine. This promotes the effectiveness of copper use in the stator, and furthermore, there is no rotor current in these machines. Moreover, if permanent magnets with a high remanent flux density are used, the air-gap flux density can be higher than in conventional induction motors. These advantages have raised the popularity of PMSMs in some challenging applications, such as hybrid electric vehicles (HEVs), wind turbines, and home appliances. Usually, a correctly designed PMSM has a higher efficiency, and consequently lower losses, than its induction machine counterparts.
Therefore, the use of these electrical machines reduces the energy consumption of the whole system to some extent, which provides good motivation to apply permanent magnet technology to electrical machines. However, the cost of the high-performance rare-earth permanent magnets in these machines may not be affordable in many industrial applications, because the tight competition between manufacturers dictates the rules of low-cost and highly robust solutions, where asynchronous machines seem more feasible at the moment. The two main electromagnetic components of an electrical machine are the stator and the rotor. In the case of a conventional radial-flux PMSM, the stator contains the magnetic circuit lamination and the stator winding, and the rotor consists of rotor steel (laminated or solid) and permanent magnets. The lamination itself does not significantly influence the total cost of the machine, even though it can considerably increase the construction complexity, as it requires a special assembly arrangement. However, thin metal sheet processing methods are very effective and economically feasible. Therefore, the cost of the machine is mainly affected by the stator winding and the permanent magnets. The work proposed in this doctoral dissertation comprises a description and analysis of two approaches to PMSM cost reduction: one on the rotor side and the other on the stator side. The first approach, on the rotor side, includes the use of low-cost and abundant ferrite magnets together with a tooth-coil winding topology and an outer-rotor construction. The second approach, on the stator side, exploits the use of a modular stator structure instead of a monolithic one. PMSMs with the proposed structures were thoroughly analysed with finite element method (FEM) based tools. It was found that by implementing the described principles, some favourable characteristics of the machine (mainly concerning the machine size) will inevitably be compromised.
However, the main target of the proposed approaches is not to compete with conventional rare earth PMSMs, but to reduce the price at which they can be implemented in industrial applications, keeping their dimensions at the same level or lower than those of a typical electrical machine used in the industry at the moment. The measurement results of the prototypes show that the main performance characteristics of these machines are at an acceptable level. It is shown that with certain specific actions it is possible to achieve a desirable efficiency level of the machine with the proposed cost reduction methods.
Abstract:
Emerging technologies have recently challenged libraries to reconsider their role as mere mediators between collections, researchers, and wider audiences (Sula, 2013), and libraries, especially nationwide institutions such as national libraries, have not always managed to face the challenge (Nygren et al., 2014). In the Digitization Project of Kindred Languages, the National Library of Finland has become a node that connects the partners to interplay and work towards shared goals and objectives. In this paper, I draw a picture of the crowdsourcing methods that have been established during the project to support both linguistic research and lingual diversity. The National Library of Finland has been executing the Digitization Project of Kindred Languages since 2012. The project seeks to digitize and publish approximately 1,200 monograph titles and more than 100 newspaper titles in various, in some cases endangered, Uralic languages. Once the digitization has been completed in 2015, the Fenno-Ugrica online collection will consist of 110,000 monograph pages and around 90,000 newspaper pages, to which all users will have open access regardless of their place of residence. The majority of the digitized literature was originally published in the 1920s and 1930s in the Soviet Union, the genesis and consolidation period of the literary languages. This was the era when many Uralic languages were converted into media of popular education, enlightenment, and dissemination of information pertinent to the developing political agenda of the Soviet state. The 'deluge' of popular literature in the 1920s and 1930s suddenly challenged the lexical and orthographic norms of the limited ecclesiastical publications from the 1880s onward. Newspapers were now written in orthographies and in word forms that the locals would understand. Textbooks were written to address the separate needs of both adults and children. New concepts were introduced into the language.
This was the beginning of a renaissance and a period of enlightenment (Rueter, 2013). The linguistically oriented population can also find writings to their delight: especially lexical items specific to a given publication, and orthographically documented specifics of phonetics. The project is financially supported by the Kone Foundation in Helsinki and is part of the Foundation's Language Programme. One of the key objectives of the Kone Foundation Language Programme is to support a culture of openness and interaction in linguistic research, but also to promote citizen science as a tool for the participation of the language community in research. In addition to sharing this aspiration, our objective within the Language Programme is to ensure that old and new corpora in Uralic languages are made available for the open and interactive use of the academic community as well as the language societies. Wordlists are available in 17 languages, but without tokenization, lemmatization, and so on. This approach was verified with the scholars, and we consider the wordlists raw data for linguists. Our data is used for creating morphological analyzers and online dictionaries at the Universities of Helsinki and Tromsø, for instance. To reach these targets, we will produce not only the digitized materials but also development tools for supporting linguistic research and citizen science. The Digitization Project of Kindred Languages is thus linked with research on language technology. The mission is to improve the usage and usability of the digitized content. During the project, we have advanced methods that refine the raw data for further use, especially in linguistic research. How does the library meet these objectives, which appear to be beyond its traditional playground?
The written materials from this period are a gold mine, so how could we retrieve these hidden treasures of languages from a stack that contains more than 200,000 pages of literature in various Uralic languages? The problem is that the machine-encoded text (OCR output) often contains too many mistakes to be used as such in research, so the mistakes in the OCRed texts must be corrected. To enhance the OCRed texts, the National Library of Finland developed an open-source OCR editor that enables the editing of machine-encoded text for the benefit of linguistic research. This tool was necessary to implement, since these rare and peripheral prints often include characters that have since fallen out of use, which are sadly neglected by modern OCR software developers but belong to the historical context of the kindred languages and are thus an essential part of the linguistic heritage (van Hemel, 2014). Our crowdsourcing application is essentially an editor of the ALTO XML format. It consists of a back-end for managing users, permissions, and files, communicating through a REST API with a front-end interface, that is, the actual editor for correcting the OCRed text. The enhanced XML files can be retrieved from the Fenno-Ugrica collection for further purposes. Could the crowd do this work to support academic research? The challenge in crowdsourcing lies in its nature. The targets in traditional crowdsourcing have often been split into several microtasks that do not require any special skills from the anonymous people, a faceless crowd. This way of crowdsourcing may produce quantitative results, but from the research point of view there is a danger that the needs of linguists are not necessarily met. Another remarkable downside is the lack of a shared goal or social affinity: there is no reward in the traditional methods of crowdsourcing (de Boer et al., 2012).
There has also been criticism that digital humanities makes the humanities too data-driven and oriented towards quantitative methods, losing the values of critical qualitative methods (Fish, 2012). On top of that, the downsides of traditional crowdsourcing become more apparent when you leave the Anglophone world. Our potential crowd is geographically scattered across Russia. This crowd is linguistically heterogeneous, speaking 17 different languages. In many cases the languages are close to extinction or in need of revitalization, and the native speakers do not always have Internet access, so an open call for crowdsourcing would not have produced satisfying results for linguists. Thus, one has to identify carefully the potential niches in which the needed tasks can be completed. When using the help of a crowd in a project that aims to support both linguistic research and the survival of endangered languages, the approach has to be different. In nichesourcing, the tasks are distributed amongst a small crowd of citizen scientists (communities). Although communities provide smaller pools from which to draw resources, their specific richness in skill suits them for the complex tasks with high-quality product expectations found in nichesourcing. Communities have a purpose and an identity, and their regular interaction engenders social trust and reputation. These communities can correspond to research needs more precisely (de Boer et al., 2012). Instead of repetitive and rather trivial tasks, we try to utilize the knowledge and skills of citizen scientists to produce qualitative results. In nichesourcing, we hand out assignments that precisely fill gaps in linguistic research. A typical task would be editing and collecting words in fields of vocabulary where the researchers require more information; for instance, there is a lack of Hill Mari words and terminology in anatomy.
We have digitized books on medicine, and we could try to track the words related to human organs by assigning citizen scientists to edit and collect words with the OCR editor. From the nichesourcing perspective, it is essential that altruism plays a central role when the language communities are involved. In nichesourcing, our goal is to reach a certain level of interplay in which the language communities benefit from the results. For instance, the corrected words in Ingrian will be added to an online dictionary that is made freely available to the public, so the society can benefit too. This objective of interplay can be understood as an aspiration to support the endangered languages and the maintenance of lingual diversity, but also as serving 'two masters': research and society.
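A word-level correction stored in an ALTO file can be sketched as an edit of the CONTENT attributes of String elements. This is a deliberately simplified illustration, not the project's actual editor code: real ALTO files carry an XML namespace, coordinates, and confidence attributes, all omitted here.

```python
import xml.etree.ElementTree as ET

# Minimal, namespace-free stand-in for an ALTO page (illustrative only):
ALTO = """<alto><Layout><Page><PrintSpace><TextBlock><TextLine>
<String CONTENT="Tallnn"/><String CONTENT="1933"/>
</TextLine></TextBlock></PrintSpace></Page></Layout></alto>"""

def correct(alto_xml, fixes):
    """Apply word-level corrections to the CONTENT attributes of an
    ALTO document, as a citizen scientist's edit would be stored."""
    root = ET.fromstring(alto_xml)
    for s in root.iter("String"):
        word = s.get("CONTENT")
        if word in fixes:
            s.set("CONTENT", fixes[word])
    return ET.tostring(root, encoding="unicode")

fixed = correct(ALTO, {"Tallnn": "Tallinn"})  # OCR error corrected in place
```

Because the layout elements are untouched, the corrected file keeps its link to the page image, which is what lets the enhanced XML flow back into the Fenno-Ugrica collection and, from there, into dictionaries and morphological analyzers.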