934 results for Observation Ontology
Abstract:
The importance of the customer as a source of knowledge, and of finding out customer needs, has been emphasised both in the business literature and in the interviews made for this study. Latent customer needs in particular are seen as important for future competitiveness. However, methods for finding out customer needs concentrate on present, clearly phrased needs. There is a need for rich customer-based knowledge, yet customer contacts are underutilised as a source of it. The objective of this study is to find a method to utilise customer contacts as a source of knowledge more efficiently. To reach this objective, customer observation is presented. A concrete goal is the development of a general method of customer observation and its application to the needs of the case company Fastems. Fastems hopes that the method raises the number of ideas that flow into the product and service development processes and increases their customer orientation. The method is tested in practice in a piloting stage carried out in the service organisations of Fastems in several countries. The observations received during the piloting phase are analysed and feedback is given to the observers. An important part of the study is finding appropriate methods for motivating the observers, since the successful implementation and sustenance of the method are seen as goals of high importance. Later on, the customer observation method is implemented throughout the company. The results of the piloting are promising, and the described method has aroused interest among other companies as well.
Abstract:
To determine the morphological differences in the epithelium of the airways of recovered and susceptible pigs after Mycoplasma hyopneumoniae challenge, twenty-four 4-week-old M. hyopneumoniae-free pigs were intratracheally inoculated with 10^7 CCU/ml of a pure low-passaged culture of the P5722-3 strain of M. hyopneumoniae challenge material. Eight pigs (group I) were challenged at the beginning of the experiment and rechallenged 3 months later. Group II pigs were also challenged at the beginning of the experiment and necropsied 3 months later. Group III pigs were challenged at the same time as the rechallenge of group I pigs. Eight nonchallenged pigs served as controls (group IV). Three days after the second challenge of group I and the first challenge of group III, and every 3 and 4 days thereafter, two pigs from each group were euthanatized by electrocution and necropsied. Samples of bronchi and lung tissue were examined using light and electron microscopy (SEM and TEM). Macroscopic lesions were observed in the lungs of all group III pigs (average = 4.74%) and were characterized by purple-red areas of discoloration and increased firmness affecting the cranioventral aspect of the lungs. Macroscopic lesions of pneumonia in groups I and II were minimal (less than 1%). There were no gross lesions of pneumonia in control (group IV) pigs. Microscopic lesions were characterized by hyperplasia of the peribronchial lymphoid tissue and mild neutrophilic infiltrates in alveoli. Electron microscopy showed patchy areas with loss of cilia and presence of leukocytes and mycoplasmas in bronchi of susceptible pigs (group III). The bronchial epithelium of rechallenged (group I), recovered (group II), and control (group IV) pigs was ultrastructurally similar, indicating recovery of the former two groups.
Although mycoplasmas were seen among cilia, a second challenge of group I pigs neither produced another episode of the disease nor enhanced morphological changes, suggesting that those pigs could become carriers of M. hyopneumoniae.
Abstract:
Software plays an important role in our society and economy. Software development is an intricate process, and it comprises many different tasks: gathering requirements, designing new solutions that fulfill these requirements, and implementing these designs using a programming language into a working system. As a consequence, the development of high quality software is a core problem in software engineering. This thesis focuses on the validation of software designs. The analysis of designs is of great importance, since errors originating in designs may appear in the final system. It is considered economical to rectify problems as early in the software development process as possible. Practitioners often create and visualize designs using modeling languages, one of the most popular being the Unified Modeling Language (UML). The analysis of designs can be done manually, but for large systems, mechanisms that analyze these designs automatically are needed. In this thesis, we propose an automatic approach to analyze UML based designs using logic reasoners. This approach firstly proposes translations of UML based designs into a language understandable by reasoners, in the form of logic facts, and secondly shows how to use logic reasoners to infer the logical consequences of these facts. We have implemented the proposed translations in the form of a tool that can be used with any standard compliant UML modeling tool. Moreover, we validate the proposed approach by automatically analyzing hundreds of UML based designs, consisting of thousands of model elements, available in an online model repository. The proposed approach is limited in scope, but it is fully automatic and does not require any expertise in logic languages from the user. We exemplify the proposed approach with two applications: the validation of domain specific languages and the validation of web service interfaces.
Abstract:
A growing concern for organisations is how they should deal with increasing amounts of collected data. With fierce competition and smaller margins, organisations that are able to fully realize the potential in the data they collect can gain an advantage over their competitors. It is almost impossible to avoid imprecision when processing large amounts of data. Still, many of the available information systems are not capable of handling imprecise data, even though it can offer various advantages. Expert knowledge stored as linguistic expressions is a good example of imprecise but valuable data, i.e. data that is hard to pin down to a definitive value. There is an obvious concern among organisations about how this problem should be handled; finding new methods for processing and storing imprecise data is therefore a key issue. Additionally, it is equally important to show that tacit knowledge and imprecise data can be used with success, which encourages organisations to analyse their imprecise data. The objective of the research conducted was therefore to explore how fuzzy ontologies could facilitate the exploitation and mobilisation of tacit knowledge and imprecise data in organisational and operational decision making processes. The thesis introduces both practical and theoretical advances on how fuzzy logic, ontologies (fuzzy ontologies) and OWA operators can be utilized for different decision making problems. It is demonstrated how a fuzzy ontology can model tacit knowledge collected from wine connoisseurs. The approach can be generalised and applied to other practically important problems, such as intrusion detection. Additionally, a fuzzy ontology is applied in a novel consensus model for group decision making. By combining the fuzzy ontology with Semantic Web affiliated techniques, novel applications have been designed. These applications show how the mobilisation of knowledge can successfully utilize imprecise data as well.
An important part of decision making processes is undeniably aggregation, which in combination with a fuzzy ontology provides a promising basis for demonstrating the benefits that can be gained from handling imprecise data. The new aggregation operators defined in the thesis provide new possibilities to handle imprecision and expert opinions. This is demonstrated through both theoretical examples and practical implementations. The thesis shows the benefits of utilizing all the data one possesses, including imprecise data. By combining the concept of fuzzy ontology with the Semantic Web movement, it aspires to show the corporate world and industry the benefits of embracing fuzzy ontologies and imprecision.
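As a concrete example of the aggregation mentioned above, here is a minimal sketch of a standard Ordered Weighted Averaging (OWA) operator. The weights and input values are made up for illustration; the thesis defines its own, more specialised operators.

```python
def owa(values, weights):
    """OWA: sort the inputs in descending order, then take the weighted sum
    with positional weights (so weights attach to ranks, not to sources)."""
    if abs(sum(weights) - 1.0) > 1e-9:
        raise ValueError("OWA weights must be non-negative and sum to 1")
    ordered = sorted(values, reverse=True)
    return sum(w * v for w, v in zip(weights, ordered))

# With weights (0.5, 0.3, 0.2) the largest expert score gets weight 0.5:
# 0.5*0.9 + 0.3*0.6 + 0.2*0.4 = 0.71
score = owa([0.4, 0.9, 0.6], [0.5, 0.3, 0.2])
```

Because the weights apply to ranks, the same operator family covers the maximum (weights 1, 0, 0), the minimum (0, 0, 1) and the plain mean (equal weights), which is what makes it attractive for combining expert opinions of varying optimism.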
Abstract:
This study examines information security as a process (information securing) in terms of what it does, especially beyond its obvious role of protector. It investigates concepts related to the ‘ontology of becoming’, and examines what it is that information securing produces. The research is theory driven and draws upon three fields: sociology (especially actor-network theory), philosophy (especially Gilles Deleuze and Félix Guattari’s concepts of ‘machine’, ‘territory’ and ‘becoming’, and Michel Serres’s concept of the ‘parasite’), and information systems science (the subject of information security). Social engineering (used here in the sense of breaking into systems through non-technical means) and software cracker groups (groups which remove copy protection systems from software) are analysed as examples of breaches of information security. Firstly, the study finds that information securing is always interruptive: every entity (regardless of whether or not it is malicious) that becomes connected to information security is interrupted. Furthermore, every entity changes, becomes different, as it makes a connection with information security (ontology of becoming). Moreover, information security organizes entities into different territories. However, the territories – the insides and outsides of information systems – are ontologically similar; the only difference is in the order of the territories, not in the ontological status of the entities that inhabit them. In other words, malicious software is ontologically similar to benign software; both are users in terms of a system. The difference is based on the order of the system and its users: who uses the system and what the system is used for. Secondly, the research shows that information security is always external (in the terms of this study, a ‘parasite’) to the information system that it protects.
Information securing creates and maintains order while simultaneously disrupting the existing order of the system that it protects. For example, in terms of software itself, the implementation of a copy protection system is an entirely external addition. In fact, this parasitic addition makes software different. Thus, information security disrupts that which it is supposed to defend from disruption. Finally, it is asserted that, in its interruption, information security is a connector that creates passages; it connects users to systems while also creating its own threats. For example, copy protection systems invite crackers and information security policies entice social engineers to use and exploit information security techniques in a novel manner.
Abstract:
The main characteristic of the nursing Interactive Observation Scale for Psychiatric Inpatients (IOSPI) is the necessity of interaction between raters and patients during assessment. The aim of this study was to evaluate the reliability and validity of the scale in the "real" world of daily ward practice and to determine whether the IOSPI can increase the interaction time between raters and patients and influence the raters' opinion about mental illness. All inpatients of a general university hospital psychiatric ward were assessed daily over a period of two months by 9 nursing aides during the morning and afternoon shifts, yielding 273 pairs of daily observations. Once a week the patients were interviewed by a psychiatrist who filled in the Brief Psychiatric Rating Scale (BPRS). The IOSPI total score was found to show significant test-retest reliability (intraclass correlation coefficient = 0.83) and significant correlation with the BPRS total score (r = 0.69), meeting the criteria of concurrent validity. The instrument can also discriminate patients in need of further inpatient treatment from those about to be discharged (negative predictive value for discharge = 0.91). Using this scale, the interaction time between nursing aides and patients increased significantly (t = 2.93, P<0.05) and their opinion about mental illness changed. The "social restrictiveness" factor of the opinion scale about mental illness showed a significant reduction (t = 4.27, P<0.01) and the "interpersonal etiology" factor tended to increase (t = 1.98, P = 0.08). The IOSPI was confirmed as a reliable and valid scale and as an efficient tool to stimulate the therapeutic attitudes of the nursing staff.
Abstract:
Ontology matching is an important task when data from multiple data sources is integrated. Problems of ontology matching have been studied widely in the research literature, and many different solutions and approaches have been proposed, also in commercial software tools. In this survey, well-known approaches to ontology matching, and its subtype schema matching, are reviewed and compared. The aim of this report is to summarize the knowledge about the state-of-the-art solutions from the research literature, discuss how the methods work on different application domains, and analyze the pros and cons of different open source and academic tools in the commercial world.
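To make element-level matching concrete, the sketch below aligns two hypothetical schemas by normalized name similarity, a basic building block that many surveyed matchers combine with structural and instance-based evidence. The schema names, the normalization, and the 0.8 threshold are illustrative assumptions.

```python
from difflib import SequenceMatcher

def normalize(name):
    """Reduce naming-convention noise before comparing element names."""
    return name.replace("_", "").replace("-", "").lower()

def match(schema_a, schema_b, threshold=0.8):
    """Return (element_a, element_b, similarity) pairs above the threshold."""
    pairs = []
    for a in schema_a:
        for b in schema_b:
            sim = SequenceMatcher(None, normalize(a), normalize(b)).ratio()
            if sim >= threshold:
                pairs.append((a, b, round(sim, 2)))
    return pairs

# "customer_name" and "CustomerName" normalize to the same string,
# while "zip" vs "zipCode" falls below the threshold and is not matched.
correspondences = match(["customer_name", "zip"], ["CustomerName", "zipCode"])
```

Real matchers typically aggregate several such similarity measures and then select a consistent alignment, which is where the approaches compared in the survey differ most.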
Abstract:
The text examines Sergej Nikolajevič Bulgakov's description of the philosopheme as thoroughly "immanent" (viz., the immanence of man qua being, such that ontology in Bulgakov becomes a conceptual analogue for immanence) and the corollary that such immanence necessarily excludes the problematic of the "creation of the world." Because of this resolute immanence and the notion that the creation of the world in the form of creatio ex nihilo requires a non-immanent or non-ontological thought and concept, the problematic for Bulgakov is approached only by a theologeme. Appropriating this argument as material for a cursory philosopheme, the text attempts to transform Bulgakov's theologeme into a philosopheme through an elision of God and dogma that overdetermines the theologeme. This philosopheme (nascent within Bulgakov's work itself, in both his hesitation to the overdetermination of immanence and the commitment to the problem of creation) would be a thoroughly non-ontological philosopheme, one that allows for the treatment of the problematic of "creation" or singular ontogenesis, yet with the corollary that this philosopheme must rely on an "ontological zero". Such a philosopheme qua ontologically empty formula nevertheless remains ontologically significant insofar as it is to evince the limit of ontology, in the ontological zero's non-relationality to ontology.
Abstract:
Posiva Oy’s final disposal facility’s encapsulation plant will start to operate in the 2020s. Once operation starts, the facility is designed to run for more than a hundred years. The encapsulation plant will be the first of its kind in the world, forming part of the solution to the global issue of final disposal of nuclear waste. In the encapsulation plant’s fuel handling cell, the spent nuclear fuel will be processed to be deposited into the Finnish bedrock, into ONKALO. In the fuel handling cell, the environment is highly radioactive, forming a permit-required enclosed space. Remote observation is needed in order to monitor the fuel handling process. The purpose of this thesis is to map (Part I) and compare (Part II) remote observation methods for observing Posiva Oy’s fuel handling cell’s process, and to provide a possible theoretical solution for this case. A secondary purpose of this thesis is to provide resources for other remote observation cases, as well as to inform about possible future technology to enable readiness in the design of the encapsulation plant. The approach was to theoretically analyze the mapped remote observation methods. Firstly, the methods were filtered by three environmental challenges: the high levels of radiation, the permit-required confined space, and the hundred-year timespan. Secondly, the most promising methods were selected by the experts designing the facility. Thirdly, a customized feasibility analysis was created and performed on the selected methods to rank them with scores. The results are the mapped methods and the feasibility analysis scores. The three highest scoring methods were the radiation tolerant camera, the fiberscope and the audio feed. A combination of these three methods was given as a possible theoretical solution for this case. As this case is the first in the world, remote observation methods for it had not been thoroughly researched.
The findings in this thesis will act as initial data for the design of the fuel handling cell’s remote observation systems and can potentially affect the overall design of the facility by providing unique and case specific information. In addition, this thesis could provide resources for other remote observation cases.
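The customized feasibility analysis described above, ranking candidate methods with scores, can be illustrated with a simple weighted-sum sketch. The criteria, weights and per-method scores below are hypothetical placeholders, not the thesis's actual figures.

```python
# Hypothetical criteria weights for the three environmental challenges
# (weights must sum to 1 for scores to stay on the criterion scale).
weights = {"radiation_tolerance": 0.5, "lifetime": 0.3, "enclosure_fit": 0.2}

# Hypothetical 0-10 criterion scores for each candidate method.
methods = {
    "radiation-tolerant camera": {"radiation_tolerance": 9, "lifetime": 7, "enclosure_fit": 8},
    "fiberscope":                {"radiation_tolerance": 8, "lifetime": 8, "enclosure_fit": 7},
    "audio feed":                {"radiation_tolerance": 7, "lifetime": 9, "enclosure_fit": 9},
}

def score(criteria):
    """Weighted sum of a method's criterion scores."""
    return sum(weights[c] * v for c, v in criteria.items())

# Rank methods from highest to lowest feasibility score.
ranking = sorted(methods, key=lambda m: score(methods[m]), reverse=True)
```

The value of such a scheme is less the exact numbers than that it makes the experts' trade-offs between radiation tolerance, lifetime and the confined space explicit and comparable.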
Abstract:
1928/09.
Abstract:
1930/05.
Abstract:
1935/01.
Abstract:
1925/03.