820 results for Performance Based Assessment


Relevance: 40.00%

Abstract:

A novel superabsorbent hydrogel (SH) composite based on a poly(acrylamide-co-acrylate) matrix filled with nontronite (NONT), an Fe(III)-rich member of the smectite group of clay minerals, is described in this manuscript. A variety of techniques, including FTIR, XRD, TGA, and SEM/EDX, were used to characterize this original composite. Experimental data confirmed the formation of the SH composite and suggested that NONT was completely dispersed in the polymeric matrix. Additionally, NONT improved the water uptake capacity of the final material, which exhibited fast absorption, low sensitivity to the presence of salt, high water retention, and pH-sensitive properties. These preliminary data show that the SH composite prepared here possesses highly attractive properties for applications in areas such as agriculture, particularly as a soil conditioner.
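
As a hedged illustration of how water uptake capacity is commonly quantified, the sketch below computes a gravimetric swelling ratio. This is a standard convention in the superabsorbent hydrogel literature, not necessarily the exact protocol used in this work, and the masses are made up.

# Hypothetical sketch: gravimetric swelling ratio commonly used for
# superabsorbent hydrogels; this work's exact protocol is not given.

def swelling_ratio(swollen_mass_g: float, dry_mass_g: float) -> float:
    """Grams of absorbed water per gram of dry hydrogel."""
    return (swollen_mass_g - dry_mass_g) / dry_mass_g

# Example with made-up masses: a 0.10 g dry sample swells to 25.1 g.
print(swelling_ratio(25.1, 0.10))  # -> 250.0 g water / g dry gel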

Relevance: 40.00%

Abstract:

The skill of programming is a key asset for every computer science student. Many studies have shown that this is a hard skill to learn, and the outcomes of programming courses have often been substandard. Thus, a range of methods and tools have been developed to assist students' learning processes. One of the biggest fields in computer science education is the use of visualizations as a learning aid, and many visualization-based tools have been developed to support the learning process during the last few decades. The studies conducted in this thesis focus on two visualization-based tools, TRAKLA2 and ViLLE. This thesis includes results from multiple empirical studies of the effects that the introduction and usage of these tools have on students' opinions and performance, and of the implications from a teacher's point of view. The results show that students preferred to do web-based exercises and felt that those exercises contributed to their learning. The usage of the tools motivated students to work harder during their course, which was reflected in overall course performance and drop-out statistics. We have also shown that visualization-based tools can be used to enhance the learning process, and that one of the key factors is a higher, more active level of engagement (see the Engagement Taxonomy by Naps et al., 2002). Automatic grading accompanied by immediate feedback helps students to overcome obstacles during the learning process and to grasp the key elements of the learning task. Such tools can help us cope with the fact that many programming courses are overcrowded and have limited teaching resources. They allow us to tackle this problem by using automatic assessment for the exercises that are best suited to the web (such as tracing and simulation), since this supports students' independent learning regardless of time and place. In summary, we can use a course's resources more efficiently to increase the quality of the learning experience of the students and the teaching experience of the teacher, and even to increase the performance of the students. This thesis also contributes methodological results that develop insight into the conduct of empirical evaluations of new tools or techniques. When we evaluate a new tool, especially one accompanied by visualization, we need to give a proper introduction to it and to the graphical notation used by the tool. The standard procedure should also include capturing the screen with audio to confirm that the participants of the experiment are doing what they are supposed to do. By taking such measures in studies of the learning impact of visualization support, we can avoid drawing false conclusions from our experiments. As computer science educators, we face two important challenges. First, we need to start delivering the message, in our own institutions and all over the world, about new, scientifically proven innovations in teaching such as TRAKLA2 and ViLLE. Second, we have relevant experience in conducting teaching-related experiments, and thus we can support our colleagues in learning the essential know-how of research-based improvement of their teaching. This change can turn academic teaching into publications, and by utilizing this approach we can significantly increase the adoption of new tools and techniques and increase the overall knowledge of best practices.
In the future, we need to combine our forces and tackle these universal and common problems together by creating multi-national and multi-institutional research projects. We need to create a community and a platform in which we can share these best practices and, at the same time, conduct multi-national research projects easily.

Relevance: 40.00%

Abstract:

Machine learning provides tools for the automated construction of predictive models in data-intensive areas of engineering and science. The family of regularized kernel methods has in recent years become one of the mainstream approaches to machine learning, due to a number of advantages the methods share. The approach provides theoretically well-founded solutions to the problems of under- and overfitting, allows learning from structured data, and has been empirically demonstrated to yield high predictive performance on a wide range of application domains. Historically, the problems of classification and regression have received the majority of attention in the field. In this thesis we focus on another type of learning problem: learning to rank. In learning to rank, the aim is to learn, from a set of past observations, a ranking function that can order new objects according to how well they match some underlying criterion of goodness. As an important special case of the setting, we can recover the bipartite ranking problem, corresponding to maximizing the area under the ROC curve (AUC) in binary classification. Ranking applications appear in a large variety of settings; examples encountered in this thesis include document retrieval in web search, recommender systems, information extraction, and automated parsing of natural language. We consider the pairwise approach to learning to rank, in which ranking models are learned by minimizing the expected probability of ranking any two randomly drawn test examples incorrectly. The development of computationally efficient kernel methods based on this approach has in the past proven challenging. Moreover, it is not clear which techniques for estimating the predictive performance of learned models are the most reliable in the ranking setting, or how these techniques can be implemented efficiently. The contributions of this thesis are as follows. First, we develop RankRLS, a computationally efficient kernel method for learning to rank that is based on minimizing a regularized pairwise least-squares loss. In addition to training methods, we introduce a variety of algorithms for tasks such as model selection, multi-output learning, and cross-validation, based on computational shortcuts from matrix algebra. Second, we improve the fastest known training method for the linear version of the RankSVM algorithm, one of the most well-established methods for learning to rank. Third, we study the combination of the empirical kernel map and reduced set approximation, which allows the large-scale training of kernel machines using linear solvers, and propose computationally efficient solutions for cross-validation when using this approach. Next, we explore the problem of reliable cross-validation when using AUC as a performance criterion, through an extensive simulation study. We demonstrate that the proposed leave-pair-out cross-validation approach leads to more reliable performance estimation than commonly used alternatives. Finally, we present a case study on applying machine learning to information extraction from biomedical literature, which combines several of the approaches considered in the thesis. The thesis is divided into two parts. Part I provides the background for the research work and summarizes the most central results, while Part II consists of the five original research articles that are the main contribution of this thesis.
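
As a hedged illustration of the pairwise approach described above, the sketch below fits a toy linear ranker by minimizing a regularized least-squares loss over all pairwise score differences. It is a minimal sketch only: the thesis works with kernels and with matrix-algebra shortcuts that avoid enumerating the pairs explicitly, neither of which is reproduced here.

import numpy as np

# Hedged sketch: a linear pairwise least-squares ranker in the spirit of
# RankRLS. The actual thesis uses kernels and computational shortcuts
# that avoid materializing all pairs; this toy version enumerates them.

def fit_pairwise_rls(X, y, lam=1.0):
    n, d = X.shape
    rows, diffs = [], []
    for i in range(n):          # all ordered pairs (i, j), i != j
        for j in range(n):
            if i != j:
                rows.append(X[i] - X[j])
                diffs.append(y[i] - y[j])
    Xd, yd = np.array(rows), np.array(diffs)
    # Regularized least squares on the pairwise differences.
    return np.linalg.solve(Xd.T @ Xd + lam * np.eye(d), Xd.T @ yd)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + rng.normal(scale=0.1, size=50)
w = fit_pairwise_rls(X, y)
scores = X @ w  # rank new objects by descending score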

Relevance: 40.00%

Abstract:

The objective of this thesis was to study the role of capabilities in purchasing and supply management. For a pre-understanding of the research topic, the development of purchasing and supply management, together with the multidimensional, unstructured, and complex nature of purchasing and supply management performance, was studied in a literature review. In addition, a capability-based purchasing and supply management performance framework was researched and structured for the empirical research. Due to the unstructured nature of the research topic, the empirical research in this study is three-pronged, combining three different research methods: the Delphi method, semi-structured interviews, and case research. As a result, a purchasing and supply management capability assessment tool was structured to measure the current level of capabilities and the impact of capabilities on purchasing and supply management performance. The final results indicate that capabilities are enablers of purchasing and supply management performance, and are therefore critical to it.

Relevance: 40.00%

Abstract:

In this thesis, traditional investment strategies (value and growth) are compared with modern investment strategies (momentum, contrarian, and GARP) in terms of risk, performance, and cumulative returns. The strategies are compared in the Finnish stock market over the period from 1996 to 2010. The data include all listed main-list stocks and dividends, and are adjusted for splits, mergers, and acquisitions. The strategies are tested using different holding periods (6, 12, and 36 months), and the data are divided into tercile portfolios based on different ranking criteria. Contrarian and growth strategies are the only strategies whose cumulative returns improve when longer holding periods are used. Momentum (52-week high price, with stocks ranked by current price divided by 52-week highest price) and GARP strategies based on a short holding period have the best performance, and contrarian and growth strategies the worst. Momentum strategies (52-week high price), along with short-holding-period contrarian strategies (52-week low price, with stocks ranked by current price divided by 52-week lowest price), have the lowest risk. The strategies with the highest risk are both growth strategies and two momentum strategies (52-week low price). The empirical results support the efficiency of momentum, GARP, and value strategies. The least efficient strategies are the contrarian and growth strategies in terms of risk, performance, and cumulative returns. Most strategies outperform the market portfolio on all three measures.
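
To make the ranking criteria concrete, the sketch below computes the 52-week high price criterion (current price divided by the 52-week highest price) and splits stocks into tercile portfolios. It is a hedged sketch, assuming pandas, daily data with roughly 252 trading days per year, and hypothetical column names; the thesis' exact data layout and rebalancing rules are not reproduced.

import pandas as pd

# Hedged sketch of the 52-week-high ranking criterion. Column names
# ('price', 'date', 'ticker') are hypothetical placeholders.

def high_52w_ratio(prices: pd.DataFrame) -> pd.Series:
    """current price / 52-week highest price, per ticker."""
    rolling_high = (prices.sort_values("date")
                          .groupby("ticker")["price"]
                          .transform(lambda p: p.rolling(252, min_periods=1).max()))
    return prices["price"] / rolling_high

def tercile_portfolios(ratio: pd.Series) -> pd.Series:
    """Split stocks into three portfolios by the ranking criterion."""
    return pd.qcut(ratio, 3, labels=["bottom", "middle", "top"])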

Relevance: 40.00%

Abstract:

Over the past decade, organizations worldwide have begun to widely adopt agile software development practices, which offer greater flexibility in the face of frequently changing business requirements, better cost effectiveness through the minimization of waste, faster time-to-market, and closer collaboration between business and IT. At the same time, IT services continue to be increasingly outsourced to third parties, providing organizations with the ability to focus on their core capabilities as well as to take advantage of better demand scalability, access to specialized skills, and cost benefits. An output-based pricing model, in which customers pay directly for the functionality that was delivered rather than for the effort spent, is quickly becoming a new trend in IT outsourcing; it transfers risk away from the customer while offering much stronger incentives for the supplier to optimize processes and improve efficiency, producing a true win-win outcome. Despite the widespread adoption of both agile practices and output-based outsourcing, there is little formal research available on how the two can be effectively combined in practice. Moreover, little practical guidance exists on how companies can measure the performance of agile projects delivered in an output-based outsourced environment. This research attempted to shed light on this issue by developing a practical project monitoring framework that organizations may readily apply to monitor the performance of agile projects in an output-based outsourcing context, thus taking advantage of the combined benefits of such an arrangement. Adapted from the action research approach, this research was divided into two cycles, each consisting of Identification, Analysis, Verification, and Conclusion phases. During Cycle 1, a list of six Key Performance Indicators (KPIs) was proposed and accepted by the professionals in the studied multinational organization; this list formed the core of the proposed framework and answered the first research sub-question of what needs to be measured. In Cycle 2, a more in-depth analysis was provided for each of the suggested KPIs, including the techniques for capturing, calculating, and evaluating the information provided by each KPI. In the course of Cycle 2, the second research sub-question was answered, clarifying how the data for each KPI needs to be measured, interpreted, and acted upon. Consequently, after two incremental research cycles, the primary research question was answered, describing the practical framework that may be used for monitoring the performance of agile IT projects delivered in an output-based outsourcing context. This framework was evaluated by professionals within the context of the studied organization and received positive feedback across all four evaluation criteria set forth in this research: the low overhead of data collection, the high value of the provided information, the understandability of the metric dashboard, and the high generalizability of the proposed framework.

Relevance: 40.00%

Abstract:

Value-based selling is a salesperson behavioral mode that concentrates on generating superior customer value. Although service-dominant logic emphasizes customer value as a central tenet for achieving strategic objectives, the sales management literature has largely circumvented the subject of customer value. The purpose of this thesis is to demonstrate the distinctiveness and the positive sales performance outcomes of value-based selling. Additionally, the performance outcomes of value-based selling are contrasted with those of other key sales behaviors, selling skills, and motivational orientations. As part of this thesis, a large-scale survey of 730 respondents was collected. The survey was tailored to the needs of a value-based selling research group led by Harri Terho, Ph.D. The research group used convenience sampling to select the salespeople of 25 medium- and large-scale companies in Finland that currently either practice value-based selling or are considering developing such activities. This thesis contains three key findings: value-based selling is established as a distinct sales behavior, it relates directly and positively to salesperson performance, and it explains the link between customer-oriented selling and salesperson performance. Value-based selling relates to salesperson performance especially in the following GICS sectors: energy, industrials, and materials. However, relationship selling relates most strongly to performance in the energy sector, and adaptive selling in the industrials sector. In sum, it is evident that actively crafting customer value is a successful sales behavior in many business-to-business marketing environments, while other sales behaviors, excluding customer-oriented selling, still uphold their significance.

Relevance: 40.00%

Abstract:

To describe the change of purchasing from an administrative to a strategic function, academics have put forward maturity models that help practitioners compare their purchasing activities with those of industry top performers and with best practices. However, none of the models aim to assess purchasing maturity from the after-sales point of view, even though after-sales activities are acknowledged as a relevant source of revenue, profit, and competitive advantage in most manufacturing firms. The maturity of purchasing and supply management practices has a large impact on the overall performance of the spare parts supply chain and, ultimately, on value creation and relationship building for the end customer. The research was done as a case study for a European after-sales organization that is part of a globally operating industrial firm specialized in heavy machinery. The study mapped the current state of the purchasing practices in the case organization and identified the relevant areas for future development. The study was based on the purchasing maturity model developed by Schiele (2007) and also investigated how applicable the maturity model is in the spare parts supply chain context. Data for the assessment were gathered through five expert interviews within the case organization and with other parties involved in the company's spare parts supply chain. An inventory management dimension was added to the original maturity model in order to better capture the important areas of a spare parts supply chain. The five added questions were deduced from the spare parts management literature and verified as relevant areas by the case organization's personnel. The results indicate that the largest needs for development in the case organization are: better collaboration between the sourcing and operative procurement functions, use of installed-base information in spare parts management, development of a training plan for new buyers, assessment of aligned KPIs between the supply chain parties, and better definition of the role of after-sales sourcing. The purchasing maturity model used in this research worked well in the HR & Leading, Controlling, and Inventory Management dimensions. The assessment was more difficult to conduct in the Supplier-related processes, Process integration, and Organizational structure dimensions, mainly because the assessment in these sections would in some parts require a more company-wide assessment. The results also indicate that the purchasing maturity model developed by Schiele (2007) captures the relevant areas of the spare parts supply chain as well.

Relevance: 40.00%

Abstract:

In the past decade, customer loyalty programs have become very popular, and almost every retail chain seems to have one. Through loyalty programs, companies are able to collect information about customer behavior and to use this information in business and marketing management to guide decision making and resource allocation. The benefits for the loyalty program member are often monetary, which affects the profitability of the loyalty program. Not all loyalty program members are equally profitable, as some purchase products at the recommended retail price and some buy only discounted products. If the company spends a similar amount of resources on all members, the customer margin is lower for a customer who buys only discounted products. It is vital for a company to measure the profitability of its members in order to be able to calculate customer value. Several different customer value metrics can be used for this calculation. In recent years, customer lifetime value in particular has received a lot of attention and is seen as superior to other customer value metrics. In this master's thesis, customer lifetime value is applied to the case company's customer loyalty program. The data were collected from the customer loyalty program's database and represent the year 2012 in the Finnish market. The data were not complete enough to take full advantage of customer lifetime value, and it can be concluded that a new key performance indicator, customer margin, should be acquired in order to run the loyalty program's business profitably. Through the customer margin, the company would be able to compute customer lifetime value on a regular basis, enabling efficient resource allocation in marketing.
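
As a hedged illustration, the sketch below computes a textbook discounted-margin customer lifetime value. The retention and discount structure is a common convention, not necessarily the formulation used in the thesis, and all numbers are made up.

# Hedged sketch: a textbook customer lifetime value (CLV) formula using
# discounted expected margins; the thesis' exact model is not specified,
# so the retention/discount structure here is an assumption.

def customer_lifetime_value(margin_per_period: float,
                            retention_rate: float,
                            discount_rate: float,
                            periods: int) -> float:
    return sum(margin_per_period * retention_rate ** t / (1 + discount_rate) ** t
               for t in range(1, periods + 1))

# Example with made-up numbers: 40 EUR yearly margin, 80% retention,
# 10% discount rate, 5-year horizon.
print(round(customer_lifetime_value(40.0, 0.8, 0.1, 5), 2))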

Relevance: 40.00%

Abstract:

Although echocardiography has been used in rats, few studies have determined its efficacy for estimating myocardial infarct size. Our objective was to estimate myocardial infarct size and to evaluate anatomic and functional variables of the left ventricle. Myocardial infarction was produced in 43 female Wistar rats by ligature of the left coronary artery. Echocardiography was performed 5 weeks later to measure left ventricular diameter and transverse area (mean of 3 transverse planes), infarct size (percentage of the arc with infarct on 3 transverse planes), systolic function by the fractional area change, and diastolic function by mitral inflow parameters. The histologic measurement of myocardial infarct size was similar to the echocardiographic one. Myocardial infarct size ranged from 4.8 to 66.6% when determined by histology and from 5 to 69.8% when determined by echocardiography, with good correlation between the methods (r = 0.88; P < 0.05; Pearson correlation coefficient). Left ventricular diameter and mean diastolic transverse area correlated with myocardial infarct size by histology (r = 0.57 and r = 0.78; P < 0.0005). The fractional area change ranged from 28.5 ± 5.6% (large-size myocardial infarction) to 53.1 ± 1.5% (control) and correlated with myocardial infarct size by echocardiography (r = -0.87; P < 0.00001) and histology (r = -0.78; P < 0.00001). The E/A wave ratio of mitral inflow velocity for animals with large-size myocardial infarction (5.6 ± 2.7) was significantly higher than for all others (control: 1.9 ± 0.1; small-size myocardial infarction: 1.9 ± 0.4; moderate-size myocardial infarction: 2.8 ± 2.3). There was good agreement between echocardiographic and histologic estimates of myocardial infarct size in rats.
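
As a hedged illustration of the method comparison reported above, the sketch below computes a Pearson correlation between two sets of infarct-size estimates. The values are invented for demonstration; the study's raw data are not reproduced here.

from scipy.stats import pearsonr

# Hedged sketch: comparing two measurement methods with the Pearson
# correlation coefficient. All infarct-size values are hypothetical.

histology = [4.8, 12.3, 25.0, 38.7, 51.2, 66.6]   # % of LV, made up
echo      = [5.0, 13.1, 23.8, 40.2, 49.5, 69.8]   # % of LV, made up

r, p = pearsonr(histology, echo)
print(f"r = {r:.2f}, P = {p:.4g}")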

Relevance: 40.00%

Abstract:

This study aimed to verify the association between the contribution of energy systems during an incremental exercise test (IET), pacing, and performance during a 10-km running time trial. Thirteen male recreational runners completed an incremental exercise test on a treadmill to determine the respiratory compensation point (RCP), maximal oxygen uptake (V̇O2max), peak treadmill speed (PTS), and energy system contributions, and a 10-km running time trial (T10-km) to determine endurance performance. The fractions of the aerobic (WAER) and glycolytic (WGLYCOL) contributions were calculated for each stage based on the oxygen uptake and on the oxygen energy equivalents derived from blood lactate accumulation, respectively. The total metabolic demand (WTOTAL) was the sum of these two contributions. Endurance performance during the T10-km was moderately correlated with RCP, V̇O2max, and PTS (P < 0.05), and moderately-to-highly correlated with WAER, WGLYCOL, and WTOTAL (P < 0.05). In addition, WAER, WGLYCOL, and WTOTAL were also significantly correlated with running speed in the middle (P < 0.01) and final (P < 0.01) sections of the T10-km. These findings suggest that the assessment of energy contributions during an IET is potentially useful as an alternative variable in the evaluation of endurance runners, especially because of its relationship with specific parts of a long-distance race.
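
As a hedged illustration, the sketch below estimates per-stage aerobic and glycolytic fractions using a common literature convention: an oxygen equivalent of roughly 3 ml O2 per kg per mmol/L of accumulated blood lactate. The exact constants and procedure used in the study may differ; all numbers here are assumptions.

# Hedged sketch: estimating energy system contributions per stage.
# The aerobic share comes from oxygen uptake; the glycolytic share from
# blood lactate accumulation converted with a literature O2 equivalent.

O2_EQUIV_LACTATE = 3.0  # ml O2 / kg per mmol/L, literature convention

def stage_contributions(vo2_ml_kg: float, delta_lactate_mmol_l: float):
    w_aer = vo2_ml_kg                                   # aerobic, ml O2/kg
    w_glycol = delta_lactate_mmol_l * O2_EQUIV_LACTATE  # glycolytic
    w_total = w_aer + w_glycol
    return w_aer / w_total, w_glycol / w_total

# Example with made-up stage data: 45 ml/kg O2 cost, 1.5 mmol/L lactate rise.
aer_frac, glycol_frac = stage_contributions(45.0, 1.5)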

Relevance: 40.00%

Abstract:

Software is a key component in many of the devices and products that we use every day. Most customers demand not only that their devices function as expected but also that the software be of high quality: reliable, fault tolerant, efficient, and so on. In short, it is not enough that a calculator gives the correct result of a calculation; we want the result instantly, in the right form, with minimal use of battery, and so on. One of the key aspects of succeeding in today's industry is delivering high quality. In most software development projects, high-quality software is achieved by rigorous testing and good quality assurance practices. However, customers today are asking for these high-quality software products at an ever-increasing pace, which leaves companies with less time for development. Software testing is an expensive activity because it requires much manual work. Testing, debugging, and verification are estimated to consume 50 to 75 per cent of the total development cost of complex software projects. Further, the most expensive software defects are those that have to be fixed after the product is released. One of the main challenges in software development is reducing the cost and time of software testing without sacrificing the quality of the developed software. It is often not enough to demonstrate only that a piece of software is functioning correctly; usually, many other aspects of the software, such as performance, security, scalability, and usability, also need to be verified. Testing these aspects of the software is traditionally referred to as non-functional testing. One of the major challenges of non-functional testing is that it is usually carried out at the end of the software development process, when most of the functionality is implemented. This is due to the fact that non-functional aspects, such as performance or security, apply to the software as a whole. In this thesis, we study the use of model-based testing. We present approaches for automatically generating tests from behavioral models to solve some of these challenges, and we show that model-based testing is applicable not only to functional testing but also to non-functional testing. In its simplest form, performance testing is performed by executing multiple test sequences at once while observing the software in terms of responsiveness and stability rather than of its output. The main contribution of the thesis is a coherent model-based testing approach for testing functional and performance-related issues in software systems. We show how we go from system models, expressed in the Unified Modeling Language, to test cases and back to models again. The system requirements are traced throughout the entire testing process, and this requirements traceability facilitates finding faults in the design and implementation of the software. In the research field of model-based testing, many newly proposed approaches suffer from poor tool support or the lack of it. Therefore, the second contribution of this thesis is proper tool support for the proposed approach, integrated with leading industry tools. We offer independent tools, tools that are integrated with industry-leading tools, and complete tool-chains when necessary. Many model-based testing approaches proposed by the research community suffer from poor empirical validation in an industrial context. In order to demonstrate the applicability of our proposed approach, we apply our research to several systems, including industrial ones.
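
As a hedged illustration of generating tests from behavioral models, the sketch below enumerates event sequences from a toy finite-state machine as abstract test cases. The model, its states, and its events are hypothetical stand-ins; the thesis works from UML models with full requirements traceability, which this sketch does not reproduce.

from collections import deque

# Hedged sketch: deriving abstract test sequences from a behavioral
# model, in the spirit of model-based testing. The FSM below is a
# hypothetical model of a login component, not taken from the thesis.

transitions = {
    ("idle", "enter_credentials"): "filled",
    ("filled", "submit_ok"): "logged_in",
    ("filled", "submit_bad"): "idle",
    ("logged_in", "logout"): "idle",
}

def test_sequences(start="idle", max_len=3):
    """Enumerate event sequences up to max_len as abstract test cases."""
    queue, tests = deque([(start, [])]), []
    while queue:
        state, path = queue.popleft()
        if path:
            tests.append(path)
        if len(path) < max_len:
            for (src, event), dst in transitions.items():
                if src == state:
                    queue.append((dst, path + [event]))
    return tests

for seq in test_sequences():
    print(" -> ".join(seq))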

Relevance: 40.00%

Abstract:

The lack of research on private real estate is a well-known problem, and earlier studies have mostly concentrated on the USA or the UK. This master's thesis therefore offers more information about the performance and risk associated with private real estate investments in the Nordic countries, and especially in Finland. The thesis is divided into two independent sections based on the research questions. In the first section, a database analysis is performed to assess the risk-return ratio of direct real estate investment in the Nordic countries. Risk-return ratios are also assessed for different property sectors and economic regions. Finally, a review of diversification strategies based on property sectors and economic regions is performed. However, standard deviation by itself is usually not a sufficient method for evaluating the riskiness of private real estate; there is demand for a more explicit assessment of property risk. One solution is property risk scoring. In the second section, a scorecard-based risk tool is built to make different real estate investments comparable in terms of risk. To this end, nine real estate professionals were interviewed to refine the structure of the theory-based risk scorecard and to assess the weights of the different risk factors.
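
As a hedged illustration of property risk scoring, the sketch below computes a weighted-sum risk score over a handful of factors. The factor names and weights are hypothetical; in the thesis, the scorecard structure and weights come from the expert interviews.

# Hedged sketch: a weighted-sum risk scorecard of the kind described
# above. Factors and weights are invented for illustration only.

RISK_WEIGHTS = {          # hypothetical factors, weights sum to 1.0
    "location": 0.30,
    "tenant_quality": 0.25,
    "building_condition": 0.20,
    "lease_length": 0.15,
    "liquidity": 0.10,
}

def risk_score(scores: dict[str, float]) -> float:
    """Weighted average of per-factor scores (e.g. 1 = low risk, 5 = high)."""
    return sum(RISK_WEIGHTS[f] * scores[f] for f in RISK_WEIGHTS)

example = {"location": 2, "tenant_quality": 3, "building_condition": 4,
           "lease_length": 1, "liquidity": 5}
print(risk_score(example))  # -> 2.8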

Relevance: 40.00%

Abstract:

Over time, the demand for quantitative portfolio management has increased among financial institutions, but there is still a lack of practical tools. In 2008, the EDHEC Risk and Asset Management Research Centre conducted a survey of European investment practices. It revealed that the majority of asset and fund management companies, pension funds, and institutional investors do not use more sophisticated models to compensate for the flaws of Markowitz mean-variance portfolio optimization. Furthermore, tactical asset allocation managers employ a variety of methods to estimate the return and risk of assets, but they also need sophisticated portfolio management models to outperform their benchmarks. Recent developments in portfolio management suggest that new innovations are slowly gaining ground but still need to be studied carefully. This thesis aims to provide a practical tactical asset allocation (TAA) application of the Black–Litterman (B–L) approach and an unbiased evaluation of the B–L model's qualities. The mean-variance framework, issues related to asset allocation decisions, and return forecasting are examined carefully to uncover issues affecting active portfolio management. European fixed income data are employed in an empirical study that tries to reveal whether a B–L model based TAA portfolio is able to outperform its strategic benchmark. The tactical asset allocation utilizes a Vector Autoregressive (VAR) model to create return forecasts from lagged values of asset classes as well as economic variables. The sample data (31.12.1999–31.12.2012) are divided into two parts: the in-sample data are used for calibrating a strategic portfolio, and the out-of-sample period is used for testing the tactical portfolio against the strategic benchmark. The results show that the B–L model based tactical asset allocation outperforms the benchmark portfolio in terms of risk-adjusted return and mean excess return. The VAR model is able to pick up changes in investor sentiment, and the B–L model adjusts portfolio weights in a controlled manner. The TAA portfolio shows promise especially in moderately shifting allocation toward riskier assets while the market is turning bullish, but without overweighting investments with high beta. Based on the findings of this thesis, the Black–Litterman model offers a good platform for active asset managers to quantify their views on investments and implement their strategies. The B–L model shows potential and offers interesting research avenues. However, the success of tactical asset allocation is still highly dependent on the quality of the input estimates.
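
As a hedged illustration, the sketch below computes the standard Black–Litterman posterior mean that combines equilibrium returns with investor views; in the thesis, the views would come from the VAR-model forecasts. All parameter values are illustrative assumptions.

import numpy as np

# Hedged sketch: the standard Black-Litterman posterior mean combining
# equilibrium returns (pi) with views (P, q, Omega). Toy numbers only.

def black_litterman_mu(pi, Sigma, P, q, Omega, tau=0.05):
    inv_tS = np.linalg.inv(tau * Sigma)
    inv_O = np.linalg.inv(Omega)
    A = inv_tS + P.T @ inv_O @ P
    b = inv_tS @ pi + P.T @ inv_O @ q
    return np.linalg.solve(A, b)

pi = np.array([0.02, 0.03, 0.04])      # equilibrium excess returns
Sigma = np.diag([0.01, 0.02, 0.03])    # covariance (toy, diagonal)
P = np.array([[1.0, -1.0, 0.0]])       # one relative view: asset 1 vs asset 2
q = np.array([0.01])                   # view: outperforms by 1%
Omega = np.array([[0.0005]])           # view uncertainty
mu_bl = black_litterman_mu(pi, Sigma, P, q, Omega)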

Relevance: 40.00%

Abstract:

This work presents a synopsis of efficient strategies used in power management for achieving the most economical power and energy consumption in multicore systems, FPGA, and NoC platforms. A practical approach was taken in an effort to validate the significance of the Adaptive Power Management Algorithm (APMA) proposed for the system developed for this thesis project. This system comprises an arithmetic and logic unit, up and down counters, an adder, a state machine, and a multiplexer. The aims of this project were, first, to develop a system to be used for the power management work; second, to perform an area and power synopsis of the system on several scalable technology platforms, UMC 90 nm nanotechnology at 1.2 V, UMC 90 nm nanotechnology at 1.32 V, and UMC 0.18 μm nanotechnology at 1.80 V, in order to examine the differences in the area and power consumption of the system across the platforms; and third, to explore various strategies for reducing the system's power consumption and to propose an adaptive power management algorithm for that purpose. The strategies introduced in this work comprise Dynamic Voltage and Frequency Scaling (DVFS) and task parallelism. After development, the system was run on an FPGA board and on NoC platforms, as well as on the technology platforms listed above. The system synthesis was successfully accomplished; the simulated results show that the system meets all functional requirements, and the power consumption and area utilization were recorded and analyzed in Chapter 7 of this work. This work extensively reviewed various strategies for managing power consumption drawn from quantitative research by many researchers and companies; it is a mixture of study analysis and experimental lab work, and it condenses and presents the basic concepts of power management strategy from quality technical papers.
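
As a hedged illustration of why DVFS saves power, the sketch below evaluates the classical dynamic power relation P_dyn = C_eff · V² · f at two operating points. The capacitance and operating points are illustrative, not taken from this work's UMC technology libraries.

# Hedged sketch: the classical dynamic power relation behind DVFS.
# All values below are hypothetical, chosen only to show the scaling.

def dynamic_power_w(c_eff_farads: float, v_volts: float, f_hz: float) -> float:
    return c_eff_farads * v_volts ** 2 * f_hz

C_EFF = 1e-9  # effective switched capacitance, hypothetical
full_speed = dynamic_power_w(C_EFF, 1.32, 200e6)   # higher-voltage point
scaled     = dynamic_power_w(C_EFF, 1.20, 100e6)   # DVFS: lower V and f
print(f"power saved: {100 * (1 - scaled / full_speed):.1f}%")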