27 results for complex tasks


Relevance:

20.00%

Publisher:

Abstract:

The changing business environment demands that chemical industrial processes be designed so that they meet multi-objective requirements and enhance innovative design activities. The requirements and key issues of conceptual process synthesis have changed and are no longer those of conventional process design; there is an increased emphasis on innovative research to develop new concepts, novel techniques and processes. A central issue, how to enhance the creativity of the design process, requires further research into methodologies. The thesis presents a conflict-based methodology for conceptual process synthesis. The motivation of the work is to support decision-making in design and synthesis and to enhance the creativity of design activities. It addresses the multi-objective requirements and combinatorially complex nature of process synthesis. The work is based on a new concept and design paradigm adapted from the Theory of Inventive Problem Solving (TRIZ). TRIZ is claimed to be a 'systematic creativity' framework thanks to its knowledge-based and evolutionarily directed nature. The conflict concept, when applied to process synthesis, sheds new light on design problems and activities. The conflict model is proposed as a way of describing design problems and handling design information: design tasks are represented as groups of conflicts, and a conflict table is built as the design tool. A general design paradigm is formulated to handle conflicts in both the early and detailed design stages. The methodology developed reflects the conflict-laden nature of process design and synthesis. The method is implemented and verified through case studies of distillation system design, reactor/separator network design and waste minimization.
Handling the various levels of conflicts evolves possible design alternatives through a systematic procedure that establishes an efficient and compact solution space for the detailed design stage. The approach also provides the information needed to bridge the gap between the application of qualitative knowledge in the early stage and quantitative techniques in the detailed design stage. Creativity is enhanced both through the better understanding of design problems gained from the conflict concept and through the improvement in engineering design practice brought by the systematic nature of the approach.
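The conflict-table device described above can be illustrated with a small data structure. This is a minimal sketch only; the design objectives and resolution principles below are hypothetical examples, not taken from the thesis.

```python
# Hypothetical sketch of a TRIZ-style conflict table for process synthesis.
# Keys are pairs of conflicting design objectives; values are candidate
# resolution principles that span the solution space for detailed design.

from itertools import combinations

# Illustrative conflict entries (assumed, not from the thesis).
conflicts = {
    ("energy efficiency", "product purity"): ["heat integration", "side-draw column"],
    ("capital cost", "product purity"): ["reactive distillation"],
    ("waste minimization", "throughput"): ["recycle structure", "solvent substitution"],
}

def resolutions(obj_a, obj_b):
    """Look up candidate resolution principles for a pair of objectives."""
    return conflicts.get((obj_a, obj_b)) or conflicts.get((obj_b, obj_a)) or []

def design_alternatives(objectives):
    """Enumerate conflicts among the chosen objectives and collect the
    candidate principles: a compact solution space for detailed design."""
    space = {}
    for a, b in combinations(objectives, 2):
        r = resolutions(a, b)
        if r:
            space[(a, b)] = r
    return space

space = design_alternatives(["energy efficiency", "product purity", "capital cost"])
```

The table lookup is symmetric in the two objectives, mirroring the idea that a conflict is a property of the objective pair, not of their ordering.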

Relevance:

20.00%

Publisher:

Abstract:

This Master's thesis belongs to the field of telecommunication network planning research and focuses fundamentally on network modelling. Telecommunication network planning is a complex and demanding problem involving intricate and time-consuming tasks. This thesis introduces a "multilayer network model" intended to help network planners cope with the complexity of these problems and to reduce the time consumed by network planning. The multilayer network model is based on generic objects that are common to all telecommunication networks. This makes the model applicable to arbitrary networks, regardless of network-specific features or the technologies used to implement the network. The model defines precise terminology and employs three concepts: plane separation, layering and partitioning. These concepts are described in detail in this work. The internal structure and behaviour of the multilayer network model are specified using Unified Modelling Language (UML) notation. This work presents the model's use case, package and class diagrams. The thesis also presents results obtained by comparing the multilayer network model with other network models. The results show that the multilayer network model has advantages over the other models.
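The model's three concepts can be sketched as generic objects in code. The class and attribute names below are illustrative assumptions, not the thesis's actual UML classes.

```python
# Sketch of the multilayer network model's concepts: plane separation,
# layering, and partitioning, expressed over generic network objects.
# All names here are illustrative assumptions.

class Node:
    def __init__(self, name):
        self.name = name

class Layer:
    """Layering: a node in an upper layer may be realized by nodes
    in the layer below."""
    def __init__(self, name):
        self.name = name
        self.nodes = []
        self.served_by = {}  # upper-layer node -> list of lower-layer nodes

class Plane:
    """Plane separation: e.g. control and user planes, each holding
    its own stack of layers."""
    def __init__(self, name):
        self.name = name
        self.layers = []

def partition(nodes, key):
    """Partitioning: split a set of nodes into disjoint subnetworks."""
    parts = {}
    for n in nodes:
        parts.setdefault(key(n), []).append(n)
    return parts

user_plane = Plane("user")
ip_layer = Layer("IP")
ip_layer.nodes = [Node("r1"), Node("r2"), Node("a1")]
user_plane.layers.append(ip_layer)
# Partition by a hypothetical naming convention: routers ("r…") vs. access ("a…").
parts = partition(ip_layer.nodes, key=lambda n: n.name[0])
```

Because the objects carry no technology-specific attributes, the same structure can describe arbitrary networks, which is the point of the generic-object design.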

Relevance:

20.00%

Publisher:

Abstract:

The objective of the thesis was to create a framework that can be used to define a manufacturing strategy with the help of the product life cycle method, enabling PQP enhancements. The starting point was to study the simultaneous implementation of cost leadership and differentiation strategies at different stages of the life cycle. It was soon observed that Porter's strategies were too generic for a complex and dynamic environment in which customer needs vary by market and by product. The strategy formulation process is therefore based on Terry Hill's order-winner and qualifier concepts. Manufacturing strategy formulation begins with the definition of order-winning and qualifying criteria; from these criteria, product-specific proposals for action and production-site-specific key manufacturing tasks can be derived that must be answered in order to meet customer and market needs. As future research, it is suggested that the process of capturing order-winners and qualifiers be developed so that it is simple and streamlined at Wallac Oy. In addition, the defined strategy process should be integrated into PerkinElmer's SGS (Strategic Goal Setting) process, one of PerkinElmer's core management processes.
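Hill's distinction between qualifiers and order-winners can be sketched as a two-stage evaluation: qualifiers are pass/fail gates a product must clear to compete, while order-winners are weighted criteria that decide among qualified offerings. The criteria, thresholds, weights and scores below are hypothetical.

```python
# Sketch of Terry Hill's order-winner / qualifier logic.
# Qualifiers: must-meet thresholds; order-winners: weighted scoring.
# All criteria and numbers are illustrative assumptions.

def evaluate(offering, qualifiers, order_winners):
    """Return None if any qualifier fails, else the weighted order-winner score."""
    for criterion, threshold in qualifiers.items():
        if offering.get(criterion, 0) < threshold:
            return None  # disqualified: cannot compete for the order at all
    return sum(w * offering.get(c, 0) for c, w in order_winners.items())

qualifiers = {"quality": 8, "delivery_reliability": 9}  # must-meet levels (0-10)
order_winners = {"price": 0.6, "lead_time": 0.4}        # relative weights

offering_a = {"quality": 9, "delivery_reliability": 9, "price": 7, "lead_time": 8}
offering_b = {"quality": 7, "delivery_reliability": 9, "price": 9, "lead_time": 9}

score_a = evaluate(offering_a, qualifiers, order_winners)  # qualifies, gets a score
score_b = evaluate(offering_b, qualifiers, order_winners)  # fails the quality gate
```

The gate-then-score structure captures why a strong order-winner score is irrelevant if a qualifying criterion is missed.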

Relevance:

20.00%

Publisher:

Abstract:

Tightening international competition is forcing automation system manufacturers to adopt new methods for improving the performance and flexibility of their systems. Agent technology has been proposed for use with existing automation systems to meet the new challenges posed to automation. Agents are autonomous, social actors that carry out tasks assigned to them in advance. They offer a uniform framework for implementing advanced functionality. With agent technology, an automation system can be made to operate flexibly and in a fault-tolerant manner. This thesis describes the ideas and concepts of agent technology, examines its suitability for developing complex control systems, and searches for applications in a plate mill. The work also covers the ideas that have led to the use of agent technology in automation systems, and describes the structure and test results of an agent-assisted example application. As a result of the study, several applications for agent technology in a plate mill were identified. The example application shows that agent technology is well suited to implementing advanced functionality in automation systems.
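The agent properties named above (autonomy, communication, predefined tasks, fault tolerance) can be sketched with a minimal message-passing agent. The agent roles and messages are hypothetical, not the thesis's example application.

```python
# Illustrative sketch of cooperating agents supervising an automation task.
# A monitor agent watches sensor messages and delegates recovery to a peer,
# giving a simple flavour of fault-tolerant, agent-based control.

import queue

class Agent:
    """An autonomous, communicating actor carrying out a predefined task."""
    def __init__(self, name, bus):
        self.name = name
        self.inbox = queue.Queue()
        self.bus = bus          # shared name -> agent registry
        bus[name] = self

    def send(self, to, msg):
        self.bus[to].inbox.put((self.name, msg))

    def step(self):
        """Handle one queued message; subclasses implement handle()."""
        sender, msg = self.inbox.get_nowait()
        return self.handle(sender, msg)

class MonitorAgent(Agent):
    def handle(self, sender, msg):
        # On an out-of-range reading, flag a fault and ask the recovery
        # agent to act: the fault-tolerance behaviour is distributed.
        if msg.get("temp", 0) > 90:
            self.send("recovery", {"action": "cool_down"})
            return "fault"
        return "ok"

bus = {}
monitor = MonitorAgent("monitor", bus)
recovery = Agent("recovery", bus)
monitor.inbox.put(("sensor", {"temp": 120}))
status = monitor.step()
```

The uniform Agent base class is what gives the "unified framework" flavour: new behaviours are added as subclasses without touching the messaging machinery.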

Relevance:

20.00%

Publisher:

Abstract:

In the near future, the commercial potential of wireless systems will be enormous. To explore future needs, this Master's thesis presents how an open wireless client-server system can be designed and implemented. Bluetooth was chosen as the system: of the wireless standards studied, it is best suited to a battery-powered device that must be versatile. Great commercial expectations have also been attached to Bluetooth, and one goal of this work was to examine whether these expectations are realistic. Bluetooth was found to be surrounded by considerable hype, and it proved to be complex. It nevertheless has many features, and through its various usage profiles it can be applied to many kinds of tasks. The designed system runs a socket server on top of a Bluetooth connection. Sockets specialized for particular types of traffic provide the required extensibility. The server was implemented as a Linux thread; it manages the Bluetooth protocol stack and the applications executed on the server, whose services are available to other devices over Bluetooth.
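The threaded socket-server design can be sketched in miniature. A plain TCP socket on localhost stands in for the Bluetooth transport here, and the single-request "upper-case" handler is an assumed stand-in for the thesis's specialized traffic handlers.

```python
# Sketch of a threaded socket server with a pluggable, traffic-specialized
# handler. TCP on localhost stands in for the Bluetooth RFCOMM transport;
# the handler and protocol are illustrative assumptions.

import socket
import threading

ready = threading.Event()
addr = {}

def serve_once(handler):
    """Bind an ephemeral port, accept one client, reply via the handler."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 0))          # port 0: let the OS pick a free port
    srv.listen(1)
    addr["port"] = srv.getsockname()[1]
    ready.set()                          # signal the client that we are listening
    conn, _ = srv.accept()
    data = conn.recv(1024)
    conn.sendall(handler(data))
    conn.close()
    srv.close()

# A "specialized socket" handler: this one upper-cases the traffic it serves.
server = threading.Thread(target=serve_once, args=(bytes.upper,))
server.start()
ready.wait()

cli = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
cli.connect(("127.0.0.1", addr["port"]))
cli.sendall(b"hello")
reply = cli.recv(1024)
cli.close()
server.join()
```

Swapping the handler function is the extensibility point: each traffic type gets its own handler while the accept/serve loop stays unchanged.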

Relevance:

20.00%

Publisher:

Abstract:

The information flood and organizational complexity have created a need for knowledge management. The goal of this study is to identify the changes that introducing a portal as a knowledge management tool requires. The study also compares the new tools with existing ones and assesses the organization's ability to transfer knowledge virtually. Literature on comparable projects has not been available, since the technology being introduced is quite new; the same technology has been used in a somewhat different area than is targeted in this project. The study is a case study whose principal sources are documents produced in various meetings. The researcher participated actively in the project work, so some of the background information is based on the researcher's observations and on discussions. The theoretical part addresses knowledge sharing from the perspectives of knowledge management and virtuality, and change management is treated briefly in connection with the introduction of a knowledge management tool. The study is part of Stora Enso Consumer Boards' knowledge management project.

Relevance:

20.00%

Publisher:

Abstract:

Rapid technological development and the competitive pressure brought by internationalization force companies into continuous business process development. Changes in organizational structures and company processes have become common measures. One of the most visible operational reforms has been the introduction of an enterprise resource planning (ERP) system. The structure and development of the ERP system usually cause the greatest difficulties when building an information system environment that makes business processes transparent. In this study, business processes and ERP processes are aligned using so-called ERP change tools. These change tools are deployed in the companies' information system environment, and with them technical problems can be corrected and the processes themselves can be changed. The empirical part of the study applies the case study method in the process development department of Kone Oyj. The aim of the study was to improve the processes of the ERP change tools in order to integrate and harmonize business processes with the ERP system. To meet this aim, process management concepts were used to find improvement proposals for the change tool processes. These concepts involve studying and exploiting the process map, the activities of the process and the costs of the process; the process management concept also covers models for the continuous improvement and re-engineering of business processes. A description of the ERP environment, as the second part of the theory, provides the basis for using the change tool processes. As results, it can be stated that the research area is very complex and difficult, and little theory has been written on ERP systems apart from studies made by companies themselves. Nevertheless, improvement proposals were found for the processes examined, as well as characteristics of a so-called optimal process model.

Relevance:

20.00%

Publisher:

Abstract:

Innovation is the word of this decade. According to common innovation definitions, a company's product or service has not been an innovation unless it has had a positive sales impact and gained a meaningful market share. The research problem of this Master's thesis is to determine the innovation process of complex new consumer products and services in the new innovation paradigm. The objective is to answer two research questions: 1) What are the critical success factors, that is, what should a company do when implementing the paradigm change in mass-market consumer business with complex products and services? 2) What process or framework could a firm follow? The research problem is examined from the points of view of one company's innovation creation process, networking, and organizational change management, with a special focus on an existing company entering a new business area. An innovation process management framework for complex new consumer products and services in the new innovation paradigm was created with the support of several existing innovation theories. The new framework includes the critical innovation process elements that companies should take into account in their daily activities when implementing innovation in a new business. The case company's location-based business implementation activities are studied through the new framework. The case study showed how important it is to manage the process, to follow how the target market and its competition develop during the company's own innovation process, to make decisions at the right time, and to plan and implement organizational change management from the beginning as one activity within the innovation process. Finally, this thesis showed that every company needs to create its own innovation process master plan with milestones and activities.
One plan does not fit all, but any company can start its planning from the innovation process introduced in this thesis.

Relevance:

20.00%

Publisher:

Abstract:

Current-day web search engines (e.g., Google) do not crawl and index a significant portion of the Web and, hence, web users relying on search engines alone are unable to discover and access a large amount of information in the non-indexable part of the Web. Specifically, dynamic pages generated from parameters provided by a user via web search forms (or search interfaces) are not indexed by search engines and cannot be found in searchers' results. Such search interfaces provide web users with online access to myriads of databases on the Web. To obtain information from a web database of interest, a user issues a query by specifying query terms in a search form and receives the query results: a set of dynamic pages that embed the required information from the database. At the same time, issuing a query via an arbitrary search interface is an extremely complex task for any kind of automatic agent, including web crawlers, which, at least up to the present day, do not even attempt to pass through web forms on a large scale. In this thesis, our primary object of study is the huge portion of the Web (hereafter referred to as the deep Web) hidden behind web search interfaces. We concentrate on three classes of problems around the deep Web: characterization of the deep Web, finding and classifying deep web resources, and querying web databases. Characterizing the deep Web: though the term deep Web was coined in 2000, which is long ago for any web-related concept or technology, we still do not know many important characteristics of the deep Web. Another matter of concern is that the surveys of the deep Web conducted so far are predominantly based on studies of deep web sites in English. One can then expect that the findings of these surveys may be biased, especially owing to the steady increase in non-English web content.
Surveying national segments of the deep Web is therefore of interest not only to national communities but to the whole web community as well. In this thesis, we propose two new methods for estimating the main parameters of the deep Web. We use the suggested methods to estimate the scale of one specific national segment of the Web and report our findings. We also build and make publicly available a dataset describing more than 200 web databases from that national segment. Finding deep web resources: the deep Web has been growing at a very fast pace, and it has been estimated that there are hundreds of thousands of deep web sites. Owing to the huge volume of information in the deep Web, there has been significant interest in approaches that allow users and computer applications to leverage this information. Most approaches have assumed that the search interfaces to web databases of interest are already discovered and known to query systems. However, such assumptions do not hold, mostly because of the large scale of the deep Web: for any given domain of interest there are too many web databases with relevant content. Thus, the ability to locate search interfaces to web databases becomes a key requirement for any application accessing the deep Web. In this thesis, we describe the architecture of the I-Crawler, a system for finding and classifying search interfaces. The I-Crawler is intentionally designed to be used in deep Web characterization studies and for constructing directories of deep web resources. Unlike almost all other approaches to the deep Web so far, the I-Crawler is able to recognize and analyze JavaScript-rich and non-HTML searchable forms. Querying web databases: retrieving information by filling out web search forms is a typical task for a web user, all the more so as the interfaces of conventional search engines are also web forms.
At present, a user must manually provide input values to search interfaces and then extract the required data from the result pages. Manually filling out forms is cumbersome and infeasible for complex queries, yet such queries are essential for many web searches, especially in the area of e-commerce. The automation of querying and retrieving data behind search interfaces is therefore desirable and essential for tasks such as building domain-independent deep web crawlers and automated web agents, searching for domain-specific information (vertical search engines), and extracting and integrating information from various deep web resources. We present a data model for representing search interfaces and discuss techniques for extracting field labels, client-side scripts and structured data from HTML pages. We also describe a representation of result pages and discuss how to extract and store the results of form queries. In addition, we present a user-friendly and expressive form query language that allows one to retrieve information behind search interfaces and extract useful data from the result pages based on specified conditions. We implement a prototype system for querying web databases and describe its architecture and component design.
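One small step in analyzing search interfaces, collecting the named fields of each HTML form on a page, can be sketched with the standard library. This heuristic is an illustrative assumption, not the I-Crawler's actual algorithm (which also handles JavaScript-rich and non-HTML forms).

```python
# Sketch: extract form actions and named input fields from an HTML page,
# the raw material for representing a search interface. Illustrative only.

from html.parser import HTMLParser

class FormFieldParser(HTMLParser):
    """Collect the action and named fields of each <form> on a page."""
    def __init__(self):
        super().__init__()
        self.forms = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "form":
            self.forms.append({"action": a.get("action", ""), "fields": []})
        elif tag in ("input", "select", "textarea") and self.forms:
            name = a.get("name")
            if name:  # unnamed controls (e.g. submit buttons) carry no data
                self.forms[-1]["fields"].append(name)

page = """
<form action="/search">
  <input type="text" name="title">
  <select name="category"><option>books</option></select>
  <input type="submit" value="Go">
</form>
"""

parser = FormFieldParser()
parser.feed(page)
```

The extracted (action, fields) pairs are exactly what an automatic agent needs in order to fill out and submit the form programmatically.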

Relevance:

20.00%

Publisher:

Abstract:

The management and conservation of coastal waters in the Baltic is challenged by a number of complex environmental problems, including eutrophication and habitat degradation. Demands for a more holistic, integrated and adaptive framework of ecosystem-based management emphasize the importance of appropriate information on the status and changes of the aquatic ecosystems. The thesis focuses on the spatiotemporal aspects of environmental monitoring in the extensive and geomorphologically complex coastal region of SW Finland, where the acquisition of spatially and temporally representative monitoring data is inherently challenging. Furthermore, the region is subject to multiple human interests and uses. A holistic geographical approach is emphasized, as it is ultimately the physical conditions that set the frame for any human activity. Characteristics of the coastal environment were examined using water quality data from the database of the Finnish environmental administration and Landsat TM/ETM+ images. A basic feature of the complex aquatic environment in the Archipelago Sea is its high spatial and temporal variability; this foregrounds the importance of geographical information as a basis of environmental assessments. While evidence of a consistent water turbidity pattern was observed, the coastal hydrodynamic realm is also characterized by high spatial and temporal variability. It is therefore also crucial to consider the spatial and temporal representativeness of field monitoring data. Remote sensing may facilitate evaluation of hydrodynamic conditions in the coastal region and the spatial extrapolation of in situ data despite their restrictions. Additionally, remotely sensed images can be used in the mapping of many of those coastal habitats that need to be considered in environmental management. 
With regard to surface water monitoring, only a small fraction of the data currently stored in the Hertta-PIVET register can be used effectively in scientific studies and environmental assessments. Long-term, consistent data collection from established sampling stations should be emphasized, but research-type seasonal assessments producing abundant data should also be encouraged; a more comprehensive coordination of field work efforts is thus called for. The integration of remote sensing and various field measurement techniques would be especially useful in complex coastal waters. The integration and development of monitoring systems in Finnish coastal areas also require further scientific assessment of monitoring practices. A holistic approach to the gathering and management of environmental monitoring data could be a cost-effective way of serving a multitude of information needs, and would fit the holistic, ecosystem-based management regimes currently being strongly promoted in Europe.

Relevance:

20.00%

Publisher:

Abstract:

The primary objective is to identify the critical factors that naturally affect a performance measurement system. It is important to make correct decisions about measurement systems in a complex business environment, since the performance measurement system involves highly complex, non-linear factors. The Six Sigma methodology is seen as one potential approach at every organizational level; it is linked to performance and financial measurement as well as to the analytical thinking on which the management viewpoint depends. The complex-systems perspective is connected to the customer relationship study. The primary throughput is a new, well-defined performance measurement structure, supported by an analytical multifactor system. At the same time, these critical factors should also be seen as a business innovation opportunity. This Master's thesis is divided into two theoretical parts, and the empirical part combines action-oriented and constructive research approaches in an empirical case study. The secondary objective is to seek a competitive advantage factor with a new analytical tool and Six Sigma thinking. Process and product capabilities are linked to the contribution of the complex system, and the critical barriers are identified by the performance measurement system. The secondary throughput is product and process cost efficiency, achieved through the advantage of management. The performance measurement potential is related to different productivity analyses; productivity can be seen as an essential part of the competitive advantage factor.
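The link between process/product capability and measurement can be made concrete with the standard Six Sigma capability indices Cp and Cpk. The formulas are standard Six Sigma practice, not thesis-specific; the specification limits and sample data below are hypothetical.

```python
# Standard Six Sigma process-capability indices, sketched to show how
# process capability can feed a performance measurement system.
# Cp compares spec width to process spread; Cpk also penalizes
# off-center processes. Data and spec limits are hypothetical.

from statistics import mean, stdev

def capability(samples, lsl, usl):
    """Return (Cp, Cpk) for measurements against lower/upper spec limits.

    Cp  = (USL - LSL) / (6 * sigma)
    Cpk = min(USL - mu, mu - LSL) / (3 * sigma)
    """
    mu, sigma = mean(samples), stdev(samples)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# Hypothetical measurements of a process centered at 10.0, specs 9.4-10.6.
samples = [9.9, 10.1, 10.0, 10.2, 9.8, 10.0, 10.1, 9.9]
cp, cpk = capability(samples, lsl=9.4, usl=10.6)
```

For a perfectly centered process Cpk equals Cp; a drop of Cpk below Cp is itself a measurable signal that the process has drifted off target.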

Relevance:

20.00%

Publisher:

Abstract:

Biomedical research currently faces a new type of challenge: an excess of information, both in terms of raw data from experiments and in the number of scientific publications describing their results. Mirroring the focus on data mining techniques for structured data, there has recently been great interest in developing and applying text mining techniques to make more effective use of the knowledge contained in biomedical scientific publications, which is accessible only in the form of natural human language. This thesis describes research done within the broader scope of projects aiming to develop methods, tools and techniques for text mining tasks in general and for the biomedical domain in particular. More specifically, the work described here addresses the goal of extracting information from statements concerning relations of biomedical entities, such as protein-protein interactions. The approach taken uses full parsing (syntactic analysis of the entire structure of sentences) and machine learning, aiming to develop reliable methods that can further be generalized to other domains. The five papers at the core of this thesis describe research on a number of distinct but related topics in text mining. In the first of these studies, we assessed the applicability of two popular general-English parsers to biomedical text mining and, finding their performance limited, identified several specific challenges to the accurate parsing of domain text. In a follow-up study focusing on parsing issues related to specialized domain terminology, we evaluated three lexical adaptation methods. We found that the accurate resolution of unknown words can considerably improve parsing performance, and we introduced a domain-adapted parser that reduced the error rate of the original by 10% while also roughly halving parsing time.
To establish the relative merits of parsers that differ in their formalisms and in the representation given to their syntactic analyses, we also developed evaluation methodology, considering different approaches to producing comparable dependency-based evaluation results. We introduced a methodology for creating highly accurate conversions between different parse representations, demonstrating the feasibility of unifying diverse syntactic schemes under a shared, application-oriented representation. In addition to allowing formalism-neutral evaluation, we argue that such unification can also increase the value of parsers for domain text mining. As a further step in this direction, we analysed the characteristics of publicly available biomedical corpora annotated for protein-protein interactions and created tools for converting them into a shared form, thus contributing to the unification of text mining resources as well. The unified corpora allowed us to perform a task-oriented comparative evaluation of biomedical text mining corpora. This evaluation established clear limits on the comparability of results for text mining methods evaluated on different resources, prompting further efforts toward standardization. To support this and other research, we also designed and annotated BioInfer, the first domain corpus of its size to combine annotation of syntax and biomedical entities with detailed annotation of their relationships. The corpus represents a major design and development effort of the research group, with manual annotation identifying over 6,000 entities, 2,500 relationships and 28,000 syntactic dependencies in 1,100 sentences. In addition to combining these key annotations for a single set of sentences, BioInfer was also the first domain resource to introduce a representation of entity relations that is supported by ontologies and able to capture complex, structured relationships.
Part I of this thesis presents a summary of this research in the broader context of a text mining system, and Part II contains reprints of the five included publications.
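The unification idea, mapping parser-specific dependency labels onto a shared scheme so that downstream relation extraction is formalism-neutral, can be sketched in a few lines. The label mapping, the toy extraction rule, and the example analysis are all illustrative assumptions, not the thesis's actual conversion methodology.

```python
# Sketch of formalism unification: map one parser's dependency labels onto
# a shared, application-oriented scheme, then run a toy relation extractor
# over the unified triples. All labels and the example are assumptions.

SHARED_LABELS = {
    # parser-specific label -> shared label
    "ncsubj": "subj",
    "nsubj": "subj",
    "dobj": "obj",
}

def to_shared(parse):
    """Convert (head, label, dependent) triples to the shared scheme,
    dropping dependencies with no agreed mapping."""
    return [(h, SHARED_LABELS[l], d) for h, l, d in parse if l in SHARED_LABELS]

def interacts(parse, proteins):
    """Toy extraction rule: two proteins attached to the same verb as
    subject and object are reported as an interaction candidate."""
    subj = {h: d for h, l, d in parse if l == "subj" and d in proteins}
    obj = {h: d for h, l, d in parse if l == "obj" and d in proteins}
    return [(subj[h], h, obj[h]) for h in subj if h in obj]

# "ProteinA activates ProteinB" as analysed by a hypothetical parser:
raw = [("activates", "ncsubj", "ProteinA"), ("activates", "dobj", "ProteinB")]
shared = to_shared(raw)
pairs = interacts(shared, {"ProteinA", "ProteinB"})
```

Because the extractor only ever sees shared labels, the same rule works unchanged over output from any parser for which a label mapping exists, which is the practical payoff of unification.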