968 results for Client feedback framework
Abstract:
Learning of preference relations has recently received significant attention in the machine learning community. It is closely related to classification and regression analysis and can be reduced to these tasks. However, preference learning involves predicting an ordering of the data points rather than a single numerical value, as in regression, or a class label, as in classification. Therefore, studying preference relations within a separate framework not only facilitates a better theoretical understanding of the problem, but also motivates the development of efficient algorithms for the task. Preference learning has many applications in domains such as information retrieval, bioinformatics, and natural language processing. For example, algorithms that learn to rank are frequently used in search engines for ordering the documents retrieved by a query. Preference learning methods have also been applied to collaborative filtering problems for predicting individual customer choices from the vast amount of user-generated feedback. In this thesis we propose several algorithms for learning preference relations. These algorithms stem from the well-founded and robust class of regularized least-squares methods and have many attractive computational properties. To improve the performance of our methods, we introduce several non-linear kernel functions. Thus, the contribution of this thesis is twofold: kernel functions for structured data, which take advantage of various non-vectorial data representations, and preference learning algorithms suitable for different tasks, namely efficient learning of preference relations, learning with large amounts of training data, and semi-supervised preference learning. The proposed kernel-based algorithms and kernels are applied to parse ranking in natural language processing, document ranking in information retrieval, and remote homology detection in bioinformatics.
Training kernel-based ranking algorithms can be infeasible when the training set is large. This problem is addressed by proposing a preference learning algorithm whose computational complexity scales linearly with the number of training data points. We also introduce a sparse approximation of the algorithm that can be trained efficiently with large amounts of data. For situations where only a small amount of labeled data but a large amount of unlabeled data is available, we propose a co-regularized preference learning algorithm. To conclude, the methods presented in this thesis address not only the efficient training of the algorithms but also fast regularization parameter selection, multiple output prediction, and cross-validation. Furthermore, the proposed algorithms lead to notably better performance in many of the preference learning tasks considered.
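The abstract describes ranking built on regularized least squares with kernels. The following sketch illustrates the general idea only: fit a kernel regularized least-squares model to item utilities and rank new items by their predicted scores. The data, the RBF kernel choice, and all function names are illustrative assumptions, not the thesis's actual algorithms.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix between rows of X and rows of Y.
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def fit_rls(K, y, lam=1.0):
    # Regularized least squares in the dual: solve (K + lam*I) a = y.
    return np.linalg.solve(K + lam * np.eye(len(y)), y)

# Toy data: 2-D items with a hidden linear utility to learn (illustrative).
rng = np.random.default_rng(0)
X_train = rng.normal(size=(20, 2))
y_train = X_train[:, 0] - 0.5 * X_train[:, 1]

K = rbf_kernel(X_train, X_train)
alpha = fit_rls(K, y_train, lam=0.1)

# Rank new items by predicted utility (higher score = preferred).
X_test = rng.normal(size=(5, 2))
scores = rbf_kernel(X_test, X_train) @ alpha
ranking = np.argsort(-scores)
```

Swapping in a different kernel function (e.g. one defined on structured, non-vectorial data) changes only `rbf_kernel`, which is why the kernel and the learning algorithm can be treated as separate contributions.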
Abstract:
This article analyses the regulatory framework of citizen participation in local government, which organises direct and participatory democracy at the local level, and identifies the laws and mechanisms through which the constitutional requirements for participation are fulfilled. Municipalities, as the authority closest to citizens, are the most suitable level of government for directly involving civil society in the decision-making process, testing the scope and appropriateness of the instruments through which participation is channelled.
Abstract:
Marketing research has studied the permanence of a client within an enterprise because it is a key element in estimating the customer lifetime value (CLV). The research developed so far is based on deterministic or stochastic models that allow the client's permanence, and hence the CLV, to be estimated. However, when these schemes cannot be applied because the panel data they require is unavailable, the period of time a client stays with the enterprise is uncertain. The value of the present work is to offer an alternative way of estimating this period from subjective information, drawing on the theory of uncertainty.
Abstract:
Data traffic caused by mobile advertising client software when it is communicating with the network server can be a pain point for many application developers who are considering advertising-funded application distribution, since the cost of the data transfer might scare their users away from using the applications. For the thesis project, a simulation environment was built to mimic the real client-server solution for measuring the data transfer over varying types of connections with different usage scenarios. For optimising data transfer, a few general-purpose compressors and XML-specific compressors were tried for compressing the XML data, and a few protocol optimisations were implemented. For optimising the cost, cache usage was improved and pre-loading was enhanced to use free connections to load the data. The data traffic structure and the various optimisations were analysed, and it was found that the cache usage and pre-loading should be enhanced and that the protocol should be changed, with report aggregation and compression using WBXML or gzip.
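The thesis finds that compressing the XML reports (with WBXML or gzip) reduces the data transfer. A minimal sketch of the gzip part of that idea, using a hypothetical, made-up ad-report payload to show why repetitive XML markup compresses well:

```python
import gzip
import zlib

# A small, hypothetical ad-report XML payload (illustrative only,
# not the actual protocol from the thesis).
xml_report = (
    "<report>"
    + "".join(f"<impression ad='banner{i}' ts='17000000{i:02d}'/>" for i in range(50))
    + "</report>"
).encode("utf-8")

raw_size = len(xml_report)
gzip_size = len(gzip.compress(xml_report))
zlib_size = len(zlib.compress(xml_report, 9))

# Repeated tag and attribute names dominate the payload, so a
# general-purpose compressor removes most of the redundancy.
print(raw_size, gzip_size, zlib_size)
```

WBXML takes the complementary approach of tokenising the XML vocabulary itself, which can beat general-purpose compression on very small messages where gzip's header overhead matters.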
Abstract:
The threats caused by global warming motivate different stakeholders to address and control them. This Master's thesis analyses carbon trade permits in an optimization framework. The studied model determines the optimal emission and uncertainty levels that minimize the total cost. Research questions are formulated and answered using different optimization tools. The model is developed and calibrated using available consistent data on carbon emission technology and control. Data and some basic modeling assumptions were extracted from reports and the existing literature. Data collected from the countries in the Kyoto treaty are used to estimate the cost functions. The theory and methods of constrained optimization are briefly presented. A two-level optimization problem (within and between the parties) is analyzed using several optimization methods. The combined cost optimization between the parties leads to a multivariate model and calls for advanced techniques; the Lagrangian method, Sequential Quadratic Programming, and the Differential Evolution (DE) algorithm are referred to. The role of inherent measurement uncertainty in the monitoring of emissions is discussed, and we briefly investigate an approach in which emission uncertainty is described in a stochastic framework. MATLAB software has been used to provide visualizations, including the relationship between the decision variables and objective function values. Interpretations in the context of carbon trading are briefly presented. Suggestions for future work are given in stochastic modeling, emission trading, and the coupled analysis of energy prices and carbon permits.
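The abstract mentions Sequential Quadratic Programming for the combined cost optimization between parties. A toy sketch of that setup, using SciPy's SLSQP solver: two parties with made-up quadratic abatement-cost functions and a joint emission cap. The coefficients and the cap are invented for illustration, not the thesis's calibrated Kyoto data.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative quadratic abatement-cost functions: reducing emissions
# below each party's unconstrained level (10 and 12 units) costs money.
def total_cost(e):
    e1, e2 = e
    return 2.0 * (10.0 - e1) ** 2 + 1.0 * (12.0 - e2) ** 2

# Joint cap: combined emissions must not exceed 15 units.
cap = {"type": "ineq", "fun": lambda e: 15.0 - (e[0] + e[1])}

res = minimize(total_cost, x0=[5.0, 5.0], method="SLSQP",
               constraints=[cap], bounds=[(0.0, 10.0), (0.0, 12.0)])
e1_opt, e2_opt = res.x
# At the optimum the cap binds and marginal abatement costs are equal
# across parties: e1 = 23/3, e2 = 22/3.
```

Equalising marginal costs at the binding cap is exactly the condition a Lagrangian analysis of the two-level problem yields, which is why the numerical and analytic approaches can be cross-checked.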
Abstract:
The direct effect of human capital on economic growth has been widely analysed in the economic literature. This paper, however, focuses on its indirect effect as a stimulus for private investment in physical capital. The methodological framework used is the duality theory, estimating a cost system aggregated with human capital. Empirical evidence is given for Spain for the period 1980-2000. We provide evidence on the indirect effect of human capital in making private capital investment more attractive. Among the main explanations for this process, we observe that higher worker skill levels enable higher returns to be extracted from investment in physical capital.
Abstract:
The purpose of this work is to optimise the COMPSs workflow management system by characterising the behaviour of different memory devices in terms of energy consumption and execution time. To this end, a cache service was implemented for COMPSs to make it aware of the memory hierarchy, and multiple experiments were carried out to characterise the memory devices and the resulting performance improvements.
Abstract:
Validation and verification operations encounter various challenges in the product development process. Requirements for increasing the pace of the development cycle place new demands on the component development process. Verification and validation usually represent the largest activities, consuming up to 40-50 % of R&D resources. This research studies validation and verification as part of the case company's component development process. The target is to define a framework that can be used to evaluate and develop validation and verification capability in display module development projects. The definition and background of validation and verification are studied, along with theories of project management, systems, organisational learning, and causality. The framework and key findings of the research are presented, and a feedback system based on the framework is defined and implemented at the case company. The research is divided into a theory part, conducted as a literature review, and an empirical part, carried out as a case study using the constructive and design research methods. A framework for capability evaluation and development was defined and developed as the result of this research. A key finding was that a double-loop learning approach combined with the validation and verification V+ model enables the definition of a feedback reporting solution. As additional results, some minor changes to the validation and verification process were proposed. A few concerns are expressed about the validity and reliability of the study, the most important being the selected research method and the selected model itself: the final state can be normative, as the researcher may set the study results before the actual study and, in the initial state, may describe expectations for it. Finally, the reliability and validity of this work are examined.
Abstract:
This thesis was done as a complementary part of the active magnetic bearing (AMB) control software development project at Lappeenranta University of Technology. Its main focus is to examine the idea of a real-time operating system (RTOS) framework that operates in a dedicated digital signal processor (DSP) environment. General-purpose real-time operating systems do not necessarily provide a sufficient platform for periodic control algorithms. In addition, the application program interfaces found in real-time operating systems are commonly non-existent or provided as chip-support libraries, thus hindering platform-independent software development. Hence, two divergent real-time operating systems and additional periodic extension software are examined together with the framework design to find solutions to the research problems. The research is carried out by tracing the selected real-time operating system, formulating requirements for the system, and designing the real-time operating system framework (OSFW). The OSFW is formed by programming the framework and conjoining the outcome with the RTOS and the periodic extension. The system is tested and the functionality of the software is evaluated in the theoretical context of Rate Monotonic Scheduling (RMS) theory. The performance of the OSFW and the substance of the approach are discussed in relation to the research theme. The findings of the thesis demonstrate that the developed real-time operating system framework is a viable groundwork solution for periodic control applications.
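The evaluation against RMS theory typically uses the Liu and Layland utilization bound: a set of n periodic tasks is schedulable under rate-monotonic priorities if its total utilization stays below n(2^(1/n) - 1). A sketch of that test with a hypothetical control task set (the execution times and periods are invented, not the thesis's measurements):

```python
def rms_utilization_bound(n):
    # Liu & Layland (1973) sufficient schedulability bound for n
    # periodic tasks under rate-monotonic priority assignment.
    return n * (2.0 ** (1.0 / n) - 1.0)

def is_schedulable(tasks):
    # tasks: list of (worst_case_execution_time, period) pairs,
    # in the same time unit. Returns (passes_test, utilization).
    u = sum(c / t for c, t in tasks)
    return u <= rms_utilization_bound(len(tasks)), u

# Hypothetical periodic control task set: (execution time, period) in ms.
tasks = [(1.0, 10.0), (2.0, 20.0), (4.0, 40.0)]
ok, u = is_schedulable(tasks)
# u = 0.30, bound for n = 3 is about 0.7798, so the set passes.
```

The bound is sufficient but not necessary: a task set that exceeds it may still be schedulable, and an exact response-time analysis would then be needed.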
Abstract:
Starting from the widespread distinction between open legal systems, such as English and Anglo-American law, which have been linked in the past to Roman law, and closed or codified ones, such as the laws of continental Europe, and after examining the terminological origin of both systems and their rigid opposition, this paper seeks to show that Rome and its law did not embrace an open system in all its purity either.
Abstract:
The research around performance measurement and management has focused mainly on the design, implementation, and use of performance measurement systems. However, there is little evidence about the actual impacts of performance measurement on the different levels of business and operations of organisations, or about the underlying factors that lead to a positive impact of performance measurement. The study thus focuses on this research gap, which can be considered both important and challenging to cover. The first objective of the study was to examine the impacts of performance measurement on different aspects of management, leadership, and the quality of working life, after which the factors that facilitate and improve performance and performance measurement at the operative level of an organisation were examined. The second objective was to study how these factors operate in practice. The third objective focused on the construction of a framework for successful operative-level performance measurement and the utilisation of the factors in organisations. The research objectives have been studied through six research papers utilising empirical data from three separate studies, including two sets of interview data and one set of quantitative data. The study applies mainly the hermeneutical research approach. As a contribution of the study, a framework for successful operative-level performance measurement was formed by matching the findings of the current study with performance measurement theory. The study extends prior research regarding the impacts of performance measurement and the factors that have a positive effect on operative-level performance and performance measurement. The results indicate that, under suitable circumstances, performance measurement has positive impacts on different aspects of management, leadership, and the quality of working life.
The results reveal, for example, that the perceptions of the employees and the management regarding the impacts of performance measurement on leadership style differ considerably. Furthermore, the fragmented literature has been reorganised into six factors that facilitate and improve the performance of the operations and employees and the use of performance measurement at the operative level of an organisation. Regarding the managerial implications of the study, managers who work with performance measurement can utilise the framework, for example, by putting its different phases into practice.
Abstract:
Cost estimation is an important but challenging process when designing a new product or a feature of it, verifying the product prices given by suppliers, or planning cost-saving actions for existing products. It is even more challenging when the product is highly modular rather than a bulk product. In general, cost estimation techniques can be divided into two main groups, qualitative and quantitative, which can be further classified into more detailed methods. Qualitative techniques are generally preferable when comparing alternatives, and quantitative techniques when cost relationships can be found. The main objective of this thesis was to develop a method for estimating the costs of internally manufactured and commercial elevator landing doors. Because of the challenging product structure, the proposed cost estimation framework operates on three different levels depending on the past cost information available, and combines features of both qualitative and quantitative techniques. The starting point for the whole cost estimation process is an unambiguous, hierarchical product structure, so that the product can be divided into controllable parts and is thus easier to handle. These controllable parts can then be compared with existing cost knowledge of similar past parts to create cost estimates that are as accurate as possible.
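The roll-up idea behind a hierarchical product structure can be sketched in a few lines: estimate each controllable leaf part from past cost data and sum recursively up the tree. The part names and figures below are made up for illustration; they are not actual elevator landing door data from the thesis.

```python
# Hypothetical hierarchical product structure: parent -> child parts.
structure = {
    "landing_door": ["panel_assembly", "frame", "mechanism"],
    "panel_assembly": ["panel", "coating"],
}

# Leaf estimates, e.g. analogies drawn from past costs of similar
# parts (EUR); all values are illustrative.
leaf_cost = {"panel": 120.0, "coating": 35.0, "frame": 80.0, "mechanism": 210.0}

def estimate(part):
    # Recursively sum child estimates; leaves come from past cost data.
    if part in structure:
        return sum(estimate(child) for child in structure[part])
    return leaf_cost[part]

total = estimate("landing_door")  # 120 + 35 + 80 + 210 = 445.0
```

The same traversal works regardless of which of the three estimation levels supplies a leaf value, which is what makes the unambiguous structure the natural starting point of the process.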
Abstract:
The GMO Risk Assessment and Communication of Evidence (GRACE; www.grace-fp7.eu) project is funded by the European Commission within the 7th Framework Programme. A key objective of GRACE is to conduct 90-day animal feeding trials, animal studies with an extended time frame, and analytical, in vitro, and in silico studies on genetically modified (GM) maize in order to comparatively evaluate their use in GM plant risk assessment. In the present study, the results of two 90-day feeding trials with two different GM maize MON810 varieties, their near-isogenic non-GM varieties, and four additional conventional maize varieties are presented. The feeding trials were performed taking into account the guidance for such studies published by the EFSA Scientific Committee in 2011 and the OECD Test Guideline 408. The results obtained show that MON810 maize at a level of up to 33 % in the diet did not induce adverse effects in male and female Wistar Han RCC rats after subchronic exposure, independently of the two different genetic backgrounds of the event.
Abstract:
Feedback-related negativity (FRN) is an ERP component that distinguishes positive from negative feedback. The FRN has been hypothesized to be the product of an error signal that may be used to adjust future behavior. In addition, associative learning models assume that the trial-to-trial learning of cue-outcome mappings involves the minimization of an error term. This study evaluated whether the FRN is a possible electrophysiological correlate of this error term in a predictive learning task where human subjects were asked to learn different cue-outcome relationships. Specifically, we evaluated the sensitivity of the FRN to the course of learning when different stimuli interact or compete to become predictors of certain outcomes. Importantly, some of these cues were blocked by more informative or predictive cues (i.e., the blocking effect). Interestingly, the present results show that both learning and blocking affect the amplitude of the FRN component. Furthermore, independent analyses of positive and negative feedback event-related signals showed that the learning effect was restricted to the ERP component elicited by positive feedback. The blocking test showed differences in FRN magnitude between a predictive and a blocked cue. Overall, the present results show that ERPs related to feedback processing correspond to the main predictions of associative learning models.
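The error term these models minimize is exemplified by the classic Rescorla-Wagner update, which predicts the blocking effect the abstract tests: once cue A fully predicts the outcome, the prediction error on A+B compound trials is near zero, so B acquires little associative strength. A toy simulation of that logic (the learning rate, trial counts, and cue labels are illustrative assumptions, not the study's design):

```python
def train(trials, V, alpha=0.3, lam=1.0):
    # Rescorla-Wagner: each cue's strength moves by a fraction of the
    # prediction error, (outcome magnitude) - (summed strength of all
    # cues present on the trial).
    for cues, outcome in trials:
        pred = sum(V.get(c, 0.0) for c in cues)
        error = (lam if outcome else 0.0) - pred
        for c in cues:
            V[c] = V.get(c, 0.0) + alpha * error
    return V

V = {}
# Phase 1: cue A alone is paired with the outcome (40 trials).
train([(["A"], True)] * 40, V)
# Phase 2: compound A+B paired with the outcome; A "blocks" B.
train([(["A", "B"], True)] * 40, V)
# Control: a novel compound C+D trained from scratch shares the
# outcome's associative strength between its two cues.
train([(["C", "D"], True)] * 40, V)
# V["B"] stays near zero while V["C"] and V["D"] settle near 0.5 each.
```

If the FRN indexes this error term, its amplitude should track `error` across trials, shrinking as a cue becomes predictive and staying flat for a blocked cue, which is the pattern of predictions the study tests.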
Abstract:
Both the competitive environment and the internal structure of an industrial organization are typically included in the processes that describe the strategic management of the firm, but less attention has been paid to the interdependence between these views. Therefore, this research focuses on explaining the particular conditions of an industry change that lead managers to realign the firm with respect to its environment in order to generate competitive advantage. The research question that directs the development of the theoretical framework is: Why do firms outsource some of their functions? The three general stages of the analysis are related to the following research topics: (i) understanding the forces that shape the industry, (ii) estimating the impacts of transforming customer preferences, rivalry, and changing capability bases on the relevance of existing assets and activities and on the emergence of new business models, and (iii) developing optional structures for future value chains and understanding the general boundaries for market emergence. The defined research setting contributes to the managerial research questions "Why do firms reorganize their value chains?" and "Why and how are decisions made?" Combining Transaction Cost Economics (TCE) and the Resource-Based View (RBV) within an integrated framework makes it possible to evaluate two dimensions of a company's resources, namely their strategic value and transferability. The final restructuring decision is made on the basis of an analysis of the actual business potential of the outsourcing, in which benefits and risks are evaluated. The firm focuses on the risk of opportunism, hold-up problems, pricing, the opportunities to reach a complete contract, and finally the direct benefits and risks for financial performance. The supplier, in turn, analyzes the business potential of the activity outside the specific customer, the amount of customer-specific investment required, the service provider's competitive position, its ability to generate revenue gains in generic segments, and its long-term dependence on the customer.