36 results for "all substring common subsequence problem"


Relevance: 30.00%

Abstract:

The basic goal of this study is to extend old and propose new ways to generate knapsack sets suitable for use in public key cryptography. The knapsack problem and its cryptographic use are reviewed in the introductory chapter. Terminology is based on common cryptographic vocabulary; for example, solving the knapsack problem (which is here a subset sum problem) is termed decipherment. Chapter 1 also reviews the most famous knapsack cryptosystem, the Merkle-Hellman system. It is based on a superincreasing knapsack and uses modular multiplication as a trapdoor transformation. The insecurity caused by these two properties exemplifies the two general categories of attacks against knapsack systems. These categories provide the motivation for Chapters 2 and 4. Chapter 2 discusses the density of a knapsack and the dangers of having a low density. Chapter 3 briefly interrupts the more abstract treatment by showing examples of small injective knapsacks and extrapolating conjectures on some characteristics of knapsacks of larger size, especially their density and number. The most common trapdoor technique, modular multiplication, is likely to cause insecurity, but as argued in Chapter 4, it is difficult to find any other simple trapdoor techniques. This discussion also provides a basis for the introduction of various categories of non-injectivity in Chapter 5. Besides general ideas on the non-injectivity of knapsack systems, Chapter 5 introduces and evaluates several ways to construct such systems, most notably the "exceptional blocks" in superincreasing knapsacks and the usage of "too small" a modulus in modular multiplication as a trapdoor technique. The author believes that non-injectivity is the most promising direction for the development of knapsack cryptosystems. Chapter 6 modifies two well-known knapsack schemes, the Merkle-Hellman multiplicative trapdoor knapsack and the Graham-Shamir knapsack. The main interest is in aspects other than non-injectivity, although that is also exploited. At the end of the chapter, constructions proposed by Desmedt et al. are presented to serve as a comparison for the developments of the subsequent three chapters. Chapter 7 provides a general framework for the iterative construction of injective knapsacks from smaller knapsacks, together with a simple example, the "three elements" system. In Chapters 8 and 9 the general framework is put into practice in two different ways. Modularly injective small knapsacks are used in Chapter 8 to construct a large knapsack, which is called the congruential knapsack. The addends of a subset sum can be found by decrementing the sum iteratively, using each of the small knapsacks and their moduli in turn. The construction is also generalized to the non-injective case, which can lead to especially good results in the density without complicating the deciphering process too much. Chapter 9 presents three related ways to realize the general framework of Chapter 7. The main idea is to join iteratively small knapsacks, each element of which would satisfy the superincreasing condition. As a whole, none of these systems needs to become superincreasing, though their density develops no better than in the superincreasing case. The new knapsack systems are injective, but they can be deciphered with the same searching method as the non-injective knapsacks with the "exceptional blocks" of Chapter 5. The final Chapter 10 first reviews the Chor-Rivest knapsack system, which has withstood all cryptanalytic attacks.
A couple of modifications to the use of this system are presented in order to further increase the security or make the construction easier. The latter goal is attempted by reducing the size of the Chor-Rivest knapsack embedded in the modified system.
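As background for the terminology reviewed in the first chapters (superincreasing knapsacks, decipherment, and modular multiplication as a trapdoor), the following sketch shows a toy Merkle-Hellman-style encipherment and decipherment in Python. The parameters are illustrative only and have no cryptographic strength; the sketch is not taken from the thesis.

```python
# Toy Merkle-Hellman-style knapsack, for illustration only.

def decipher_superincreasing(weights, total):
    """Greedy decipherment of a superincreasing knapsack (subset sum)."""
    bits = []
    for w in reversed(weights):          # largest weight first
        if w <= total:
            bits.append(1)
            total -= w
        else:
            bits.append(0)
    assert total == 0, "sum is not reachable with this knapsack"
    return list(reversed(bits))

# Private superincreasing knapsack: each element exceeds the sum of its predecessors.
private = [2, 3, 7, 14, 30, 57, 120, 251]
modulus = 491            # larger than the sum of all private elements
multiplier = 41          # invertible modulo `modulus`

# Public knapsack produced by the modular-multiplication trapdoor.
public = [(multiplier * w) % modulus for w in private]

# Encipherment: subset sum of the public knapsack selected by the message bits.
message = [1, 0, 1, 1, 0, 0, 1, 0]
ciphertext = sum(b * w for b, w in zip(message, public))

# Decipherment: undo the modular multiplication, then solve the easy knapsack.
inv = pow(multiplier, -1, modulus)
recovered = decipher_superincreasing(private, (ciphertext * inv) % modulus)
print(recovered == message)              # True
```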

Relevance: 30.00%

Abstract:

Efficient problem solving in cellular networks is important when enhancing network performance and reliability. Protocol-level analysis of calls and packet-switched sessions between the network elements is an important part of this process, as it can provide very detailed information about error situations which would otherwise be difficult to recognise. In this thesis we seek solutions for monitoring GPRS/EDGE sessions in two specific interfaces simultaneously, in such a manner that all information important to the users is provided in an easily understandable form. The thesis focuses on the Abis and AGPRS interfaces of the GSM radio network and introduces a solution for managing the correlation between these interfaces by using signalling messages and common parameters as linking elements. Finally, the thesis presents an implementation of a GPRS/EDGE session monitoring application for the Abis and AGPRS interfaces and evaluates its benefits to the end users. The application is implemented as part of a Windows-based 3G/GSM network analyser.
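As an illustration of the correlation idea described above, the sketch below pairs decoded messages captured on two interfaces by a shared linking parameter and a small timestamp window. The field names (`tlli`, `timestamp`), the tolerance value and the data are invented for the example; the actual signalling messages and common parameters used in the thesis are not specified here.

```python
# Minimal sketch of cross-interface correlation, assuming each captured
# message has already been decoded into a dict with a shared identifier
# (here hypothetically called "tlli") and a capture timestamp in seconds.
from collections import defaultdict

MAX_SKEW = 2.0  # hypothetical tolerance between capture clocks, in seconds

def correlate(abis_msgs, agprs_msgs, key="tlli", max_skew=MAX_SKEW):
    """Pair messages from two interfaces that share a linking parameter
    and lie within a small time window of each other."""
    by_key = defaultdict(list)
    for msg in agprs_msgs:
        by_key[msg[key]].append(msg)

    pairs = []
    for msg in abis_msgs:
        for candidate in by_key.get(msg[key], []):
            if abs(msg["timestamp"] - candidate["timestamp"]) <= max_skew:
                pairs.append((msg, candidate))
    return pairs

abis = [{"tlli": 0xC0FFEE01, "timestamp": 10.2, "info": "UL TBF request"}]
agprs = [{"tlli": 0xC0FFEE01, "timestamp": 10.9, "info": "PDP context accept"}]
print(correlate(abis, agprs))   # one correlated pair
```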

Relevance: 30.00%

Abstract:

The aim of this study is to produce a model for implementing global telecommunications network service production in a forest industry company. The main problem is to determine how telecommunications network service production should be implemented, taking into account the required actors, cost efficiency and information management. The problem was approached from the perspectives of IT management, outsourcing, collaboration networks, processes, common tools and standardisation. The current state and the target state of the company's telecommunications networks were studied with a qualitative survey. Based on the study, a model was arrived at in which the quality and efficiency of telecommunications network service production are essentially influenced by the actors, common tools, standardisation at all levels, and well-considered, calculated outsourcing. A central role is played by standard processes and the common tools that support them, implemented centrally and, for the critical parts, also locally. The processes and tools must support information management as a natural part of daily operations. The interfaces between the actors must be unambiguously defined, and the actors must be able to communicate using common tools and to share information openly.

Relevance: 30.00%

Abstract:

The skill of programming is a key asset for every computer science student. Many studies have shown that this is a hard skill to learn and that the outcomes of programming courses have often been substandard. Thus, a range of methods and tools have been developed to assist students' learning processes. One of the biggest fields in computer science education is the use of visualizations as a learning aid, and many visualization-based tools have been developed to aid the learning process during the last few decades. The studies conducted in this thesis focus on two different visualization-based tools, TRAKLA2 and ViLLE. This thesis includes results from multiple empirical studies about what kind of effects the introduction and usage of these tools have on students' opinions and performance, and what kind of implications there are from a teacher's point of view. The results from the studies in this thesis show that students preferred to do web-based exercises and felt that those exercises contributed to their learning. The usage of the tool motivated students to work harder during their course, which was shown in overall course performance and drop-out statistics. We have also shown that visualization-based tools can be used to enhance the learning process, and that one of the key factors is a higher and more active level of engagement (see the Engagement Taxonomy by Naps et al., 2002). The automatic grading, accompanied by immediate feedback, helps students to overcome obstacles during the learning process and to grasp the key elements of the learning task. These kinds of tools can help us to cope with the fact that many programming courses are overcrowded and have limited teaching resources. They allow us to tackle this problem by utilizing automatic assessment in exercises that are most suitable to be done on the web (like tracing and simulation), since this supports students' independent learning regardless of time and place. In summary, we can use our courses' resources more efficiently to increase the quality of the learning experience of the students and the teaching experience of the teacher, and even to increase the performance of the students. There are also methodological results from this thesis which contribute to developing insight into the conduct of empirical evaluations of new tools or techniques. When we evaluate a new tool, especially one accompanied by visualization, we need to give a proper introduction to it and to the graphical notation used by the tool. The standard procedure should also include capturing the screen together with audio to confirm that the participants of the experiment are doing what they are supposed to do. By taking such measures in studies of the learning impact of visualization support, we can avoid drawing false conclusions from our experiments. As computer science educators, we face two important challenges. Firstly, we need to start to deliver the message, in our own institutions and all over the world, about new, scientifically proven innovations in teaching such as TRAKLA2 and ViLLE. Secondly, we have the relevant experience of conducting teaching-related experiments, and thus we can support our colleagues in learning the essential know-how of research-based improvement of their teaching. This change can transform work on academic teaching into publications, and by utilizing this approach we can significantly increase the adoption of the new tools and techniques and overall increase the knowledge of best practices.
In the future, we need to combine our forces and tackle these universal and common problems together by creating multi-national and multi-institutional research projects. We need to create a community and a platform in which we can share these best practices and, at the same time, conduct multi-national research projects easily.

Relevance: 30.00%

Abstract:

Diabetes is a rapidly increasing worldwide problem characterised by defective metabolism of glucose that causes long-term dysfunction and failure of various organs. The most common complication of diabetes is diabetic retinopathy (DR), which is one of the primary causes of blindness and visual impairment in adults. The rapid increase of diabetes pushes the limits of current DR screening capabilities, for which digital imaging of the eye fundus (retinal imaging) and automatic or semi-automatic image analysis algorithms provide a potential solution. In this work, the use of colour in the detection of diabetic retinopathy is statistically studied using a supervised algorithm based on one-class classification and Gaussian mixture model estimation. The presented algorithm distinguishes a certain diabetic lesion type from all other possible objects in eye fundus images by estimating only the probability density function of that lesion type. For the training and ground truth estimation, the algorithm combines manual annotations of several experts, for which the best practices were experimentally selected. By assessing the algorithm's performance in experiments with colour space selection, illuminance and colour correction, and background class information, the use of colour in the detection of diabetic retinopathy was quantitatively evaluated. Another contribution of this work is the benchmarking framework for eye fundus image analysis algorithms needed for the development of automatic DR detection algorithms. The benchmarking framework provides guidelines on how to construct a benchmarking database that comprises true patient images, ground truth and an evaluation protocol. The evaluation is based on standard receiver operating characteristic analysis, and it follows medical practice in decision making by providing protocols for image- and pixel-based evaluations. During the work, two public medical image databases with ground truth were published: DIARETDB0 and DIARETDB1. The framework, the DR databases and the final algorithm are made public on the web to set the baseline results for automatic detection of diabetic retinopathy. Although deviating from the general context of the thesis, a simple and effective optic disc localisation method is also presented. The optic disc localisation is discussed since normal eye fundus structures are fundamental in the characterisation of DR.
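A minimal sketch of the one-class idea described above: estimate the probability density of a single lesion class with a Gaussian mixture model and flag samples whose likelihood exceeds a threshold. The feature choice (raw RGB values), the threshold rule and the data are assumptions made for illustration, not the configuration used in the thesis.

```python
# Generic one-class detection via Gaussian mixture density estimation.
# Features, data and the threshold rule are illustrative assumptions only.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Stand-in training data: RGB values sampled around a lesion-like colour.
lesion_pixels = rng.normal(loc=[180, 60, 50], scale=12, size=(5000, 3))

# Estimate the probability density function of the lesion class only.
gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0)
gmm.fit(lesion_pixels)

# Choose an operating point: keep roughly 95% of the training likelihoods.
threshold = np.quantile(gmm.score_samples(lesion_pixels), 0.05)

def classify(pixels):
    """Return True for pixels whose likelihood under the lesion model
    exceeds the threshold, i.e. pixels flagged as lesion candidates."""
    return gmm.score_samples(pixels) >= threshold

test = np.array([[182, 58, 52],     # lesion-like colour
                 [30, 120, 40]])    # background-like colour
print(classify(test))               # expected: [ True False]
```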

Relevance: 30.00%

Abstract:

The aim of this study is to analyse the content of the interdisciplinary conversations in Göttingen between 1949 and 1961. The task is to compare models for describing reality presented by quantum physicists and theologians. Descriptions of reality in different disciplines are conditioned by the development of the concept of reality in philosophy, physics and theology. Our basic problem is stated in the question: How is it possible for the intramental image to match the external object? Cartesian knowledge presupposes clear and distinct ideas in the mind prior to observation resulting in a true correspondence between the observed object and the cogitative observing subject. The Kantian synthesis between rationalism and empiricism emphasises an extended character of representation. The human mind is not a passive receiver of external information, but is actively construing intramental representations of external reality in the epistemological process. Heidegger's aim was to reach a more primordial mode of understanding reality than what is possible in the Cartesian Subject-Object distinction. In Heidegger's philosophy, ontology as being-in-the-world is prior to knowledge concerning being. Ontology can be grasped only in the totality of being (Dasein), not only as an object of reflection and perception. According to Bohr, quantum mechanics introduces an irreducible loss in representation, which classically understood is a deficiency in knowledge. The conflicting aspects (particle and wave pictures) in our comprehension of physical reality cannot be completely accommodated into an entire and coherent model of reality. What Bohr rejects is not realism, but the classical Einsteinian version of it. By the use of complementary descriptions, Bohr tries to save a fundamentally realistic position. The fundamental question in Barthian theology is the problem of God as an object of theological discourse. Dialectics is Barth's way to express knowledge of God avoiding a speculative theology and a human-centred religious self-consciousness. In Barthian theology, the human capacity for knowledge, independently of revelation, is insufficient to comprehend the being of God. Our knowledge of God is real knowledge in revelation and our words are made to correspond with the divine reality in an analogy of faith. The point of the Bultmannian demythologising programme was to claim the real existence of God beyond our faculties. We cannot simply define God as a human ideal of existence or a focus of values. The theological programme of Bultmann emphasised the notion that we can talk meaningfully of God only insofar as we have existential experience of his intervention. Common to all these twentieth-century philosophical, physical and theological positions is a form of anti-Cartesianism. Consequently, in regard to their epistemology, they can be labelled antirealist. This common insight also made it possible to find a common meeting point between the different disciplines. In this study, the different standpoints from all three areas and the conversations in Göttingen are analysed in the framework of realism/antirealism. One of the first tasks in the Göttingen conversations was to analyse the nature of the likeness between the complementary structures in quantum physics introduced by Niels Bohr and the dialectical forms in the Barthian doctrine of God.
The reaction against epistemological Cartesianism, metaphysics of substance and deterministic description of reality was the common point of departure for theologians and physicists in the Göttingen discussions. In his complementarity, Bohr anticipated the crossing of traditional epistemic boundaries and the generalisation of epistemological strategies by introducing interpretative procedures across various disciplines.

Relevance: 30.00%

Abstract:

This study focuses on regional innovation strategy (RIS) and sustainability aspects in selected regions of European Union (EU) countries. It is known that an RIS helps a region to innovate locally and to compete globally, and it is considered one of the main EU policy tools for innovation support at the regional level. This study is conducted to explore the existence and adoption of RIS in different regions of selected EU countries, and to highlight and compare regional RIS characteristics. The study also aims to identify the factors that characterise the formulation and implementation of RIS, as well as the problems associated with them. In this study, six regions of EU countries are considered: Päijät-Häme Region (Finland); London Region (United Kingdom); Mid-West Region (Ireland); Veneto Region (Italy); Eastern Region (Poland); and West Region (Romania). Data and information were collected by sending questionnaires to the respective regional authorities of these selected regions. Based on the gathered information and analysis, an RIS or equivalent strategy document serves as a blueprint for forwarding innovative programmes towards regional sustainability. The objectives of RIS in these regions were found to depend on the priority sectors and the state of the region's development. The current environmental sustainability aspects focus on eco-design, eco-products and eco-innovation, although each region also has its own specific aspects supported by the RIS. Likewise, regional policies typically follow the RIS, translated into various sectoral focuses or priority areas. The main enhancing factors supporting RIS among the selected regions show some similarities and variations; among others, some regions are strongly supported by the EU, while others have support from their own regional agencies, organisations and professional networks. RIS implementation is not without challenges, and despite the differences in challenges, almost all of the reviewed regions consider financial resources a common problem. Generally, this study shows that RIS and regional sustainability mutually reinforce each other. A strong focus is given to environmental sustainability in the regions, although regional sustainability also includes economic and social aspects. A well-focused and prioritised RIS is beneficial for regional sustainable development.

Relevance: 30.00%

Abstract:

The objective of the research was to understand the success factors of the Danish energy service industry. The research phenomenon has been studied extensively, but the aim here was to examine it from the service-logic point of view. The research was threefold, examining the phenomenon at the company, industry and national levels. The purpose of the multi-level study was to understand all the success factors and to examine how they combine. First, the research problem was approached through a literature review. After that, the empirical part of the study was conducted as a case study, and the data were collected through theme interviews. The collected data were analysed from a theoretical point of view and compared with earlier studies. This study shows that the most important success factor was the country itself, because it has affected the other aspects of success. Because the actors of the industry are tightly linked together, communication and a common understanding of the business are essential to the industry's success. The new energy technologies do not directly produce added value for the customers. This has shifted the energy business towards service business, and the customers have been included in the value creation process.

Relevance: 30.00%

Abstract:

In the automotive trade, the transaction costs of car sales form a significant part of the sales costs. Companies seek to lower these costs by pursuing more volume and by focusing more strongly on volume deals. The purpose of this study is to find means by which the profitability of volume sales in the car trade can be improved. The theoretical framework of the study presented theoretical models which companies could utilise to succeed better in the fierce competition of the car trade. The research method was a non-experimental research design in which the data were collected with structured questions. The research material consisted of information obtained through interviews with people at the target company and at a number of competing companies. As a conclusion, it can be stated that in the car trade, management theories and practice follow each other quite closely in many respects, although certain contradictions can be observed. The productivity of volume sales can already be improved by improving the quality of the information obtained from the companies' information systems, by adhering to the agreed operating processes and by questioning existing ways of working.

Relevance: 30.00%

Abstract:

All over the world, power systems grow bigger every day: new equipment is installed, new feeders are constructed and new power units are commissioned. Some old elements of the network, however, are not replaced in time. As a result, "bottlenecks" for capacity transmission can occur. The locked power problem usually means the situation in which a power plant's installed capacity exceeds the power it can actually deliver. Regime-, scheme- or even technical-restriction-related issues usually cause this kind of problem. The problem is important because, from the regime point of view, it is a typical decision to keep a mobile capacity reserve in case of malfunctions, and, what can be even more significant, the power plant owner (JSC Fortum in our case) loses money by selling less electrical energy. The goal of this master's thesis is to analyze the current state of the Chelyabinsk power system, and of the CHP-3 (combined heat and power plant) in particular, in relation to its ability to deliver the whole capacity of the CHP in its existing state, and also to take into consideration the prospect of the installation of power unit 3 by the fourth quarter of 2010. The thesis contains some general information about the UPS of Russia, the CPS of Ural, the power system of Chelyabinsk and the Chelyabinsk region itself. The CHP-3 is then described from a technical point of view, with an overview of its equipment. Regimes for the present power system and for the system after the installation of power unit 3 are reviewed. The problems that occur are described and, finally, a solution is offered.

Relevance: 30.00%

Abstract:

TRIZ is one of the well-known tools for creative problem solving based on analytical methods. This thesis suggests an adapted version of the contradiction matrix, a powerful tool of TRIZ, and a few principles based on the concepts of the original TRIZ. It is believed that the proposed version would aid in problem solving, especially for problems encountered in chemical process industries with unit operations. In addition, this thesis would help fresh process engineers to recognise the importance of the various available methods for creative problem solving and to learn the TRIZ method of creative problem solving. This thesis mainly provides an idea of how to modify a TRIZ-based method according to one's requirements to fit a particular niche area and to solve problems efficiently in a creative way. In this case, the contradiction matrix developed is based on a review of common problems encountered in the chemical process industry, particularly in unit operations, and the resolutions are based on approaches used in the past to handle those issues.
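The core mechanics of a contradiction matrix are easy to illustrate: one parameter to improve and one that worsens index a cell listing candidate inventive principles. In the sketch below, the parameter names and the principles assigned to each pair are placeholders invented for the example, not the adapted matrix developed in the thesis.

```python
# Toy contradiction-matrix lookup. Parameter names and the principles
# suggested for each pair are placeholders, not the thesis's adapted matrix.

PRINCIPLES = {
    1: "Segmentation",
    10: "Preliminary action",
    21: "Rushing through",
    35: "Parameter change",
}

# (improving parameter, worsening parameter) -> candidate principle numbers
MATRIX = {
    ("separation efficiency", "energy consumption"): [35, 10],
    ("throughput", "product purity"): [1, 21],
}

def suggest(improving, worsening):
    """Return the inventive principles suggested for a contradiction."""
    numbers = MATRIX.get((improving, worsening), [])
    return [(n, PRINCIPLES[n]) for n in numbers]

print(suggest("separation efficiency", "energy consumption"))
# [(35, 'Parameter change'), (10, 'Preliminary action')]
```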

Relevance: 30.00%

Abstract:

This study is a part of the Ecologically Friendly Port Ust-Luga (EFP) project. The purpose of this study is to examine the environmental status of the Finnish ports and, more specifically, the Port of HaminaKotka. An analysis of the environmental status is performed mainly as a literature review, because the Finnish ports must comply with Finnish and EU legislation and with the binding international regulations and conventions created by different organisations. The International Maritime Organisation (IMO) has done groundbreaking work in the field of maritime safety and maritime environmental protection. The MARPOL convention has had a great impact on decreasing pollution from international shipping, and it applies to 99% of the world's merchant tonnage. Pollution prevention covers oil pollution, chemical pollution, air pollution and GHG emissions, dumping of wastes and other matter, garbage, sewage, port reception facilities, special areas under MARPOL, and particularly sensitive sea areas. Pollution prevention is also addressed in other treaties, covering anti-fouling systems used on ships, the transfer of alien species in ships' ballast water and the environmentally sound recycling of ships. There are more than twenty different EU and international regulations that influence ports and port operations in Finland. In addition, there is also national legislation that has an effect on Finnish ports. For the most part, the legislation for ports is common in the EU area, but the biggest and most important difference between the legislation in Finland and in other EU countries is due to the Act on Environmental Impact Assessment Procedure. The Act states that the environmental impact assessment procedure shall be applied to projects that may have significant adverse environmental impacts due to the special features of Finland's nature and environment. In this Act, the term environmental impact refers to the direct and indirect effects, inside and outside Finnish territory, of a project or operations on human health, living conditions and amenity; on soil, water, air, climate, organisms, the interaction between them and biodiversity; on community structure, buildings, landscape, townscape and cultural heritage; and on the utilisation of natural resources. In Finland, the Environmental Permit requires that ports collect all necessary information concerning environmental effects and make the required reports to the Finnish authorities, stakeholders and the public. Commonly, environmental reporting is public, and environmental achievements are emphasised in reporting and in the media. At the moment, the problem in environmental reporting is that it is difficult to compare data from different ports. There is enough data concerning environmental effects and performance, but the manner of reporting and the quality of the data vary between ports. There are differences in the units and codes used, in some cases the information is not sufficient, and it can even be rather unreliable. There are also differences regarding the subjects that are emphasised in reporting.

Relevance: 30.00%

Abstract:

End-user development is a very common but often largely overlooked phenomenon in information systems research and practice. End-user development means that regular people, the end-users of software, and not professional developers, are doing software development. A large number of people are directly or indirectly affected by the results of these non-professional development activities. The number of users performing end-user development activities is difficult to ascertain precisely, but it is very large and still growing. Computer adoption is approaching 100% and many new types of computational devices are continually being introduced. In addition, devices that were not previously programmable are becoming so. This means that, at this very moment, hundreds of millions of people are likely struggling with development problems. Furthermore, software itself is continually being adapted for more flexibility, enabling users to change the behaviour of their software themselves. New software and services are helping to transform users from consumers to producers, and much of this is now found online. The problem for the end-user developer is that little of this development is supported by anyone. Organisations often do not notice end-user development and consequently neither provide support for it nor are equipped to do so. Many end-user developers do not belong to any organisation at all. The end-user development process itself may also be aggravating the problem. End-users are usually not really committed to the development process, which tends to be more iterative and ad hoc, so support becomes a distant third behind getting the job done and figuring out the development issues needed to get the job done. Sometimes the software itself may exacerbate the issue by simplifying the development process and de-emphasising the difficulty of the task being undertaken. Online support could be the lifeline the end-user developer needs. Going online, one can find all the knowledge one could ever need; however, that still does not help the end-user apply this information or knowledge in practice. A virtual community, through its ability to adopt the end-user's specific context, could surmount this final obstacle. This thesis explores the concept of end-user development and how it could be supported through online sources, in particular virtual communities, which, it is argued here, seem to fit the end-user developer's needs very well. The experiences of real end-user developers and prior literature were used in this process. The emphasis has been on those end-user developers, e.g. small business owners, who may have literally nowhere to turn to for support. Adopting the viewpoint of the end-user developer, the thesis examines the question of how an end-user could use a virtual community effectively, improving the results of the support process, assuming the common situation where the demand for support outstrips the supply.

Relevance: 30.00%

Abstract:

Linguistic modelling is a rather new branch of mathematics that is still undergoing rapid development. It is closely related to fuzzy set theory and fuzzy logic, but knowledge and experience from other fields of mathematics, as well as other fields of science including linguistics and behavioral sciences, are also necessary to build appropriate mathematical models. This topic has received considerable attention as it provides tools for mathematical representation of the most common means of human communication: natural language. Adding a natural language level to mathematical models can provide an interface between the mathematical representation of the modelled system and the user of the model, one that is sufficiently easy to use and understand, yet conveys all the information necessary to avoid misinterpretations. It is, however, not a trivial task, and the link between the linguistic and computational levels of such models has to be established and maintained properly during the whole modelling process. In this thesis, we focus on the relationship between the linguistic and the mathematical level of decision support models. We discuss several important issues concerning the mathematical representation of the meaning of linguistic expressions, their transformation into the language of mathematics and the retranslation of mathematical outputs back into natural language. In the first part of the thesis, our view of linguistic modelling for decision support is presented and the main guidelines for building linguistic models for real-life decision support that are the basis of our modelling methodology are outlined. From the theoretical point of view, the issues of representation of the meaning of linguistic terms, computations with these representations and the retranslation process back into the linguistic level (linguistic approximation) are studied in this part of the thesis. We focus on the reasonability of operations with the meanings of linguistic terms, the correspondence between the linguistic and mathematical levels of the models and the proper presentation of appropriate outputs. We also discuss several issues concerning the ethical aspects of decision support, particularly the loss of meaning due to the transformation of mathematical outputs into natural language and the issue of responsibility for the final decisions. In the second part, several case studies of real-life problems are presented. These provide background and the necessary context and motivation for the mathematical results and models presented in this part. A linguistic decision support model for disaster management is presented here, formulated as a fuzzy linear programming problem, and a heuristic solution to it is proposed. Uncertainty of outputs, expert knowledge concerning disaster response practice and the necessity of obtaining outputs that are easy to interpret (and available in a very short time) are reflected in the design of the model. Saaty's analytic hierarchy process (AHP) is considered in two case studies: first in the context of the evaluation of works of art, where a weak consistency condition is introduced and an adaptation of AHP for large matrices of preference intensities is presented. The second AHP case study deals with the fuzzified version of AHP and its use for evaluation purposes; in particular, the integration of peer review into the evaluation of R&D outputs is considered.
In the context of HR management, we present a fuzzy rule-based evaluation model (academic faculty evaluation is considered) constructed to provide outputs that do not require linguistic approximation and are easily transformed into graphical information. This is achieved by designing a specific form of fuzzy inference. Finally, the last case study is from the area of the humanities: psychological diagnostics is considered, and a linguistic fuzzy model for the interpretation of outputs of multidimensional questionnaires is suggested. The issue of the quality of data in mathematical classification models is also studied here. A modification of the receiver operating characteristic (ROC) method is presented to reflect variable quality of data instances in the validation set during classifier performance assessment. Twelve publications in which the author participated are appended as a third part of this thesis. These summarize the mathematical results and provide a closer insight into the issues of the practical applications that are considered in the second part of the thesis.
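The thesis's specific ROC modification is not reproduced here, but the sketch below illustrates one generic way per-instance quality weights could enter a ROC computation: weighted true and false positive rates are accumulated instead of raw counts. All weights and data are synthetic and invented for the example.

```python
# Generic quality-weighted ROC sketch: each validation instance carries a
# weight in [0, 1] expressing data quality, and weighted true/false positive
# rates are accumulated instead of raw counts. This is an illustrative
# construction, not the modification proposed in the thesis.
import numpy as np

def weighted_roc(labels, scores, weights):
    """Return arrays of weighted FPR and TPR, sweeping the decision threshold."""
    order = np.argsort(-scores)                    # descending score
    labels, weights = labels[order], weights[order]

    pos_total = np.sum(weights * labels)
    neg_total = np.sum(weights * (1 - labels))

    tpr = np.cumsum(weights * labels) / pos_total
    fpr = np.cumsum(weights * (1 - labels)) / neg_total
    return fpr, tpr

rng = np.random.default_rng(1)
labels = rng.integers(0, 2, size=200)              # ground-truth classes
scores = labels + rng.normal(scale=0.8, size=200)  # noisy classifier scores
weights = rng.uniform(0.3, 1.0, size=200)          # per-instance data quality

fpr, tpr = weighted_roc(labels, scores, weights)
auc = np.trapz(tpr, fpr)                           # weighted area under the curve
print(round(float(auc), 3))
```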