811 results for dynamic performance appraisal
Abstract:
The maintenance and evolution of software systems has become a highly critical task in recent years due to the diversity and high demand of features, devices, and users. Understanding and analyzing how new changes impact the quality attributes of such systems' architecture is an essential prerequisite for avoiding the deterioration of their quality during their evolution. This thesis proposes an automated approach for analyzing variation of the performance quality attribute in terms of execution time (response time). It is implemented by a framework that adopts dynamic analysis and software repository mining techniques to provide an automated way of revealing potential sources (commits and issues) of performance variation in scenarios during the evolution of software systems. The approach defines four phases: (i) preparation, in which the scenarios are chosen and the target releases are prepared; (ii) dynamic analysis, in which the performance of scenarios and methods is determined by computing their execution times; (iii) variation analysis, in which the dynamic analysis results for different releases are processed and compared; and (iv) repository mining, in which issues and commits associated with the detected performance variation are identified. Empirical studies were carried out to evaluate the approach from different perspectives. An exploratory study analyzed the feasibility of applying the approach to systems from different domains in order to automatically identify source code elements with performance variation and the changes that affected those elements during an evolution. This study analyzed three systems: (i) SIGAA, a web system for academic management; (ii) ArgoUML, a UML modeling tool; and (iii) Netty, a framework for network applications. Another study performed an evolutionary analysis by applying the approach to multiple releases of Netty and of the web frameworks Wicket and Jetty.
That study analyzed 21 releases (seven from each system), totaling 57 scenarios. In summary, 14 scenarios with significant performance variation were found for Netty, 13 for Wicket, and 9 for Jetty. Additionally, feedback was obtained from eight developers of these systems through an online form. Finally, in the last study, a performance regression model was developed to indicate which commit properties are most likely to cause performance degradation. Overall, 997 commits were mined: 103 were retrieved from degraded source code elements and 19 from optimized ones, while 875 had no impact on execution time. The number of days before the release and the day of the week turned out to be the most relevant variables of performance-degrading commits in our model. The area under the Receiver Operating Characteristic (ROC) curve of the regression model is 60%, meaning that using the model to decide whether or not a commit will cause degradation is 10% better than a random decision.
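The variation-analysis phase described above (comparing dynamic-analysis results across releases and classifying scenarios as degraded or optimized) can be sketched as follows. This is a minimal illustration, not the thesis's actual implementation: the function name, the use of mean execution time, and the 10% change threshold are all assumptions.

```python
# Sketch of variation analysis: compare mean execution times of the same
# scenario in two releases and flag significant variation. The 10%
# threshold is an illustrative assumption.

def classify_variation(old_ms, new_ms, threshold=0.10):
    """Classify a scenario as degraded, optimized, or unchanged."""
    mean_old = sum(old_ms) / len(old_ms)
    mean_new = sum(new_ms) / len(new_ms)
    change = (mean_new - mean_old) / mean_old
    if change > threshold:
        return "degraded"
    if change < -threshold:
        return "optimized"
    return "unchanged"

# Execution times (ms) for one scenario in two consecutive releases.
release_1 = [120, 118, 122, 121]
release_2 = [150, 149, 152, 151]
print(classify_variation(release_1, release_2))  # → degraded
```

In the full approach, scenarios flagged this way would then drive the repository-mining phase, which looks for commits and issues touching the affected source code elements.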
Abstract:
The main focus of this research is to design and develop a high performance linear actuator based on a four bar mechanism. The present work includes the detailed analysis (kinematics and dynamics), design, implementation, and experimental validation of the newly designed actuator. High performance is characterized by the acceleration of the actuator end effector. The principle of the newly designed actuator is to exploit the four bar rhombus configuration (where some bars are extended to form an X shape) to attain high acceleration. Firstly, a detailed kinematic analysis of the actuator is presented and kinematic performance is evaluated through MATLAB simulations. A dynamic equation of the actuator is derived using the Lagrangian dynamic formulation. A SIMULINK control model of the actuator is developed using the dynamic equation. In addition, Bond Graph methodology is presented for the dynamic simulation. The Bond Graph model comprises individual component modeling of the actuator along with control. The required torque was simulated using the Bond Graph model. Results indicate that high acceleration (around 20 g) can be achieved with modest (3 N·m or less) torque input. A practical prototype of the actuator is designed using SOLIDWORKS and then produced to verify the proof of concept. The design goal was to achieve a peak acceleration of more than 10 g at the middle point of the travel length, when the end effector travels the stroke length (around 1 m). The actuator is primarily designed to operate in standalone condition, and later to be used in the 3RPR parallel robot. A DC motor is used to operate the actuator. A quadrature encoder is attached to the DC motor to control the end effector. The associated control scheme of the actuator is analyzed and integrated with the physical prototype. In standalone experimentation of the actuator, around 17 g acceleration was achieved by the end effector (the stroke length was 0.2 m to 0.78 m).
Results indicate that the developed dynamic model is in good agreement with the experimental results. Finally, a Design of Experiments (DOE) based statistical approach is also introduced to identify the parametric combination that yields the greatest performance. Data are collected using the Bond Graph model. This approach is helpful in designing the actuator without much complexity.
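The rhombus (scissor-like) geometry described above can be illustrated with a toy kinematic model: for bar length L and joint half-angle theta, the distance between the two far joints is x = 2·L·cos(theta), and velocity follows by differentiation. The specific geometry, dimensions, and formulas below are illustrative assumptions, not the thesis's exact linkage model.

```python
import math

# Toy kinematics of a rhombus four-bar linkage (assumed geometry):
# end-effector distance x = 2*L*cos(theta) for bar length L and joint
# half-angle theta.

def end_effector_position(L, theta):
    """Distance between the two far joints of the rhombus linkage."""
    return 2.0 * L * math.cos(theta)

def end_effector_velocity(L, theta, theta_dot):
    """dx/dt = -2*L*sin(theta)*theta_dot."""
    return -2.0 * L * math.sin(theta) * theta_dot

L = 0.5                                        # bar length in m (assumed)
print(end_effector_position(L, math.pi / 3))   # 2*0.5*cos(60°) = 0.5 m
```

Differentiating once more gives the end-effector acceleration, which is the quantity the thesis maximizes; the acceleration grows sharply as the linkage approaches its extended configuration.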
Abstract:
Purpose: This paper aims to explore the role of internal and external knowledge-based linkages across the supply chain in achieving better operational performance. It investigates how knowledge is accumulated, shared, and applied to create organization-specific knowledge resources that increase and sustain the organization's competitive advantage. Design/methodology/approach: This paper uses a single case study with multiple, embedded units of analysis, and social network analysis (SNA), to demonstrate the impact of internal and external knowledge-based linkages across multiple tiers in the supply chain on organizational operational performance. The focal company of the case study is an Italian manufacturer supplying rubber components to European automotive enterprises. Findings: With the aid of SNA, the internal knowledge-based linkages can be mapped and visualized. We found that the most central nodes, those having the most connections with other nodes in the linkages, are the most crucial members in terms of knowledge exploration and exploitation within the organization. We also revealed that the effective management of external knowledge-based linkages, such as the buyer company, competitors, universities, suppliers, and subcontractors, can help improve operational performance. Research limitations/implications: First, our hypothesis was tested on a single case. The analysis of multiple case studies using SNA would provide a deeper understanding of the relationship between the knowledge-based linkages at all levels of the supply chain and the integration of knowledge. Second, only the static nature of knowledge flows was studied in this research. Future research could also consider ongoing monitoring of dynamic linkages and the dynamic characteristics of knowledge flows.
Originality/value: To the best of our knowledge, the phrase 'knowledge-based linkages' has not been used in the literature, and there is a lack of investigation of the relationship between the management of internal and external knowledge-based linkages and operational performance. To bridge this knowledge gap, this paper shows the importance of understanding the composition and characteristics of knowledge-based linkages and their knowledge nodes. In addition, it shows that effective management of knowledge-based linkages leads to the creation of new knowledge and improves organizations' operational performance.
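The SNA finding above, that the most-connected nodes act as the organization's knowledge brokers, amounts to ranking nodes by degree centrality. A minimal sketch of that idea follows; the node names and edges are invented for illustration and do not reflect the paper's actual network of roles inside the manufacturer.

```python
# Toy degree-centrality computation over an assumed knowledge-linkage
# network: the node with the most connections is treated as the key
# knowledge broker.

edges = [
    ("R&D", "Production"), ("R&D", "Quality"), ("R&D", "Buyer"),
    ("R&D", "University"), ("Production", "Subcontractor"),
    ("Quality", "Supplier"),
]

def degree_centrality(edge_list):
    """Count connections per node from an undirected edge list."""
    degree = {}
    for a, b in edge_list:
        degree[a] = degree.get(a, 0) + 1
        degree[b] = degree.get(b, 0) + 1
    return degree

deg = degree_centrality(edges)
hub = max(deg, key=deg.get)
print(hub, deg[hub])  # → R&D 4
```

In practice, SNA tooling also offers betweenness and closeness centrality, which capture brokerage across otherwise disconnected parts of the network rather than raw connection counts.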
Abstract:
Purpose: The purpose of this paper is to ascertain how today’s international marketers can perform better on the global scene by harnessing spontaneity. Design/methodology/approach: The authors draw on contingency theory to develop a model of the spontaneity – international marketing performance relationship, and identify three potential moderators, namely, strategic planning, centralization, and market dynamism. The authors test the model via structural equation modeling with survey data from 197 UK exporters. Findings: The results indicate that spontaneity is beneficial to exporters in terms of enhancing profit performance. In addition, greater centralization and strategic planning strengthen the positive effects of spontaneity. However, market dynamism mitigates the positive effect of spontaneity on export performance (when customer needs are volatile, spontaneous decisions do not function as well in terms of ensuring success). Practical implications: Learning to be spontaneous when making export decisions appears to result in favorable outcomes for the export function. To harness spontaneity, export managers should look to develop company heuristics (increase centralization and strategic planning). Finally, if operating in dynamic export market environments, the role of spontaneity is weaker, so more conventional decision-making approaches should be adopted. Originality/value: The international marketing environment typically requires decisions to be flexible and fast. In this context, spontaneity could enable accelerated and responsive decision-making, allowing international marketers to realize superior performance. Yet, there is a lack of research on decision-making spontaneity and its potential for international marketing performance enhancement.
Abstract:
Previous research has established that relationships with authority figures and procedural justice perceptions are important in terms of the way in which employees react to organizational procedures that affect them. What is less clear are the reasons why exchange quality with authorities is related to perceptions of process fairness and the role of procedural justice climate in this process. Results indicate that individual-level perceptions of procedural justice, but not performance ratings, partially mediate the relationship between exchange quality and reactions to performance appraisals, and that procedural justice climate is positively related to perceptions of procedural justice and appraisal reactions. These results support a more relational than instrumental view of justice perceptions in organizational procedures bound by exchange quality with an authority figure. Our study suggests that it is essential for managers to actively monitor and manage employee perceptions of process fairness at the group and individual levels. © 2015 Wiley Periodicals, Inc.
Abstract:
A class of multi-process models is developed for collections of time indexed count data. Autocorrelation in counts is achieved with dynamic models for the natural parameter of the binomial distribution. In addition to modeling binomial time series, the framework includes dynamic models for multinomial and Poisson time series. Markov chain Monte Carlo (MCMC) and Pólya-Gamma data augmentation (Polson et al., 2013) are critical for fitting multi-process models of counts. To facilitate computation when the counts are high, a Gaussian approximation to the Pólya-Gamma random variable is developed.
Three applied analyses are presented to explore the utility and versatility of the framework. The first analysis develops a model for complex dynamic behavior of themes in collections of text documents. Documents are modeled as a “bag of words”, and the multinomial distribution is used to characterize uncertainty in the vocabulary terms appearing in each document. State-space models for the natural parameters of the multinomial distribution induce autocorrelation in themes and their proportional representation in the corpus over time.
The second analysis develops a dynamic mixed membership model for Poisson counts. The model is applied to a collection of time series which record neuron level firing patterns in rhesus monkeys. The monkey is exposed to two sounds simultaneously, and Gaussian processes are used to smoothly model the time-varying rate at which the neuron’s firing pattern fluctuates between features associated with each sound in isolation.
The third analysis presents a switching dynamic generalized linear model for the time-varying home run totals of professional baseball players. The model endows each player with an age specific latent natural ability class and a performance enhancing drug (PED) use indicator. As players age, they randomly transition through a sequence of ability classes in a manner consistent with traditional aging patterns. When the performance of the player significantly deviates from the expected aging pattern, he is identified as a player whose performance is consistent with PED use.
All three models provide a mechanism for sharing information across related series locally in time. The models are fit with variations on the Pólya-Gamma Gibbs sampler, MCMC convergence diagnostics are developed, and reproducible inference is emphasized throughout the dissertation.
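The Gaussian approximation mentioned above is, in its standard form, a moment-matching idea: for large counts the Pólya-Gamma draw PG(b, c) concentrates, so it can be replaced by a normal draw with the known PG mean and variance. The sketch below uses the published moment formulas; the dissertation's approximation may differ in detail, so treat this as an illustrative assumption.

```python
import math
import random

# Moment-matched Gaussian sketch for omega ~ PG(b, c), using the
# standard Pólya-Gamma moments (Polson et al., 2013):
#   E[omega]   = b/(2c) * tanh(c/2)
#   Var[omega] = b/(4c^3) * (sinh(c) - c) * sech^2(c/2)

def pg_mean(b, c):
    """E[omega] for omega ~ PG(b, c)."""
    return b / (2.0 * c) * math.tanh(c / 2.0)

def pg_var(b, c):
    """Var[omega] for omega ~ PG(b, c); sech(x) = 1/cosh(x)."""
    return b / (4.0 * c**3) * (math.sinh(c) - c) / math.cosh(c / 2.0) ** 2

def pg_gaussian_draw(b, c, rng=random):
    """Approximate PG(b, c) draw via the moment-matched Gaussian."""
    return rng.gauss(pg_mean(b, c), math.sqrt(pg_var(b, c)))

print(round(pg_mean(1.0, 2.0), 3))  # tanh(1)/4 ≈ 0.19
```

Within the Gibbs sampler, this replaces the exact (and expensive, at high counts) PG sampling step while leaving the rest of the data-augmentation scheme unchanged.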
Abstract:
The oceans take up more than 1 million tons of CO2 from the air per hour, about one-quarter of the anthropogenically released amount, leading to disrupted seawater chemistry due to increasing CO2 emissions. Based on the fossil fuel-intensive CO2 emission scenario (A1F1; Houghton et al., 2001), the H+ concentration or acidity of surface seawater will increase by about 150% (pH drop by 0.4) by the end of this century, the process known as ocean acidification (OA; Sabine et al., 2004; Doney et al., 2009; Gruber et al., 2012). Seawater pH is suggested to decrease faster in the coastal waters than in the pelagic oceans due to the interactions of hypoxia, respiration, and OA (Cai et al., 2011). Therefore, responses of coastal algae to OA are of general concern, considering the economic and social services provided by the coastal ecosystem that is adjacent to human living areas and that is dependent on coastal primary productivity. On the other hand, dynamic environmental changes in the coastal waters can interact with OA (Beardall et al., 2009).
Abstract:
This paper proposes a three-step method of evaluating high performance coaches involving feedback from the athletes. First, data are collected using an instrument such as the Coaching Behavior Scale for Sport (CBS-S: Côté, Yardley, Hay, Sedgwick, & Baker, 1999). Second, a summary report is prepared with descriptive information regarding the frequency of behaviors demonstrated by the coach that can be compared to previous results or to a criterion measure. The third step involves appropriate personnel reviewing the report and subsequently providing guidance for individual coach development. This three-step appraisal method provides useful evaluative feedback to coaches and has been used in several sport programs in Canada, the United States, and Australia.
Abstract:
This study examined the effect of a spanwise angle of attack gradient on the growth and stability of a dynamic stall vortex in a rotating system. It was found that a spanwise angle of attack gradient induces a corresponding spanwise vorticity gradient, which, in combination with spanwise flow, results in a redistribution of circulation along the blade. Specifically, when modelling the angle of attack gradient experienced by a wind turbine at the 30% span position during a gust event, the spanwise vorticity gradient was aligned such that circulation was transported from areas of high circulation to areas of low circulation, increasing the local dynamic stall vortex growth rate, which corresponds to an increase in the lift coefficient, and a decrease in the local vortex stability at this point. Reversing the relative alignment of the spanwise vorticity gradient and spanwise flow results in circulation transport from areas of low circulation generation to areas of high circulation generation, acting to reduce local circulation and stabilise the vortex. This circulation redistribution behaviour describes a mechanism by which the fluctuating loads on a wind turbine are magnified, which is detrimental to turbine lifetime and performance. Therefore, an understanding of this phenomenon has the potential to facilitate optimised wind turbine design.
Abstract:
This article presents a global vision for sport through a new framework that incorporates the elements necessary for a developmentally sound approach to youth sport involvement. This framework proposes that youth sport involvement includes three basic elements: (1) taking part in activities (what), while creating relationships with others (who), in a specific setting (where). When these three elements positively interact, they create a context that, when repeated on a regular basis, leads to changes in the personal assets of the participants. Changes in individuals’ personal assets, such as Competence, Confidence, Connection, and Character (4 C’s), have long been associated with positive sport experiences, which in turn lead to long-term outcomes, including continued sport Participation, higher levels of Performance in sport, and Personal development through sport (3 P’s). Research linking the three basic elements of youth sport (activities, relationships, and settings) to positive changes in personal assets (4 C’s) and long-term outcomes (3 P’s) is discussed, and the Personal Assets Framework is presented.
Abstract:
The purpose of this study was to report the knowledge used by expert high performance gymnastic coaches in the organization of training and competition. In-depth interviews were conducted with 9 coaches who worked with male gymnasts and 8 coaches who worked with female gymnasts. Qualitative analyses showed that coaches of males and coaches of females planned training similarly, except that coaches of females appeared to emphasize esthetic and nutritional issues to a greater extent. Coaches of males revealed more concerns about the organization of gymnasts' physical conditioning. Analysis indicated that expert gymnastic coaches of males and females are constantly involved in dynamic social interactions with gymnasts, parents, and assistant coaches. Many areas of coaches' organizational work, such as dealing with the athletes' personal concerns and working with parents, are not part of the structure of coaches' training programs and emerged as crucial tasks of expert gymnastic coaches for developing elite gymnasts.
Abstract:
Thermal and fatigue cracking are two of the major pavement distress phenomena that contribute significantly to increased premature pavement failures in Ontario. This in turn puts a massive burden on provincial budgets, as the government spends huge sums of money on the repair and rehabilitation of roads every year. Governments therefore need to rethink and re-evaluate their current measures in order to prevent this in future. The main objectives of this study include: the investigation of fatigue distress of 11 contract samples at 10 °C, 15 °C, 20 °C, and 25 °C and the use of crack-tip-opening-displacement (CTOD) requirements at temperatures other than 15 °C; the investigation of thermal and fatigue distress through a comparative analysis of 8 Ministry of Transportation (MTO) recovered and straight asphalt samples using the double-edge-notched-tension (DENT) test and extended bending beam rheometry (EBBR); chemical testing of all samples through X-ray fluorescence (XRF) and Fourier transform infrared (FTIR) analysis; Dynamic Shear Rheometer (DSR) high and intermediate temperature grading; and a case study of a local Kingston road. The majority of the 11 contract samples showed satisfactory performance at all temperatures, except one sample. The study of CTOD at various temperatures found a strong correlation between CTOD and temperature. All recovered samples showed poor performance in terms of their ability to resist thermal and fatigue distress relative to their corresponding straight asphalt, as evident in the DENT test and EBBR results. XRF and FTIR testing of all samples showed the addition of waste engine oil (WEO) to be the root cause of pavement failures. DSR high temperature grading showed superior performance of recovered binders relative to straight asphalt. The local Kingston road showed extensive signs of damage due to thermal and fatigue distress, as evident from the DENT test, EBBR results, and pictures taken in the field.
In light of these facts, the use of waste engine oil and recycled asphalt in pavements should be avoided, as these have been shown to cause premature failure in pavements. The existing DENT test CTOD requirements should be implemented at other temperatures in order to prevent the occurrence of premature pavement failures in future.
Abstract:
Modern software applications are becoming more dependent on database management systems (DBMSs). DBMSs are usually used as black boxes by software developers. For example, Object-Relational Mapping (ORM) is one of the most popular database abstraction approaches that developers use nowadays. Using ORM, objects in Object-Oriented languages are mapped to records in the database, and object manipulations are automatically translated to SQL queries. As a result of such conceptual abstraction, developers do not need deep knowledge of databases; however, all too often this abstraction leads to inefficient and incorrect database access code. Thus, this thesis proposes a series of approaches to improve the performance of database-centric software applications that are implemented using ORM. Our approaches focus on troubleshooting and detecting inefficient (i.e., performance problems) database accesses in the source code, and we rank the detected problems based on their severity. We first conduct an empirical study on the maintenance of ORM code in both open source and industrial applications. We find that ORM performance-related configurations are rarely tuned in practice, and there is a need for tools that can help improve/tune the performance of ORM-based applications. Thus, we propose approaches along two dimensions to help developers improve the performance of ORM-based applications: 1) helping developers write more performant ORM code; and 2) helping developers configure ORM configurations. To provide tooling support to developers, we first propose static analysis approaches to detect performance anti-patterns in the source code. We automatically rank the detected anti-pattern instances according to their performance impacts. Our study finds that by resolving the detected anti-patterns, the application performance can be improved by 34% on average. 
We then discuss our experience and lessons learned when integrating our anti-pattern detection tool into industrial practice. We hope our experience can help improve the industrial adoption of future research tools. However, as static analysis approaches are prone to false positives and lack runtime information, we also propose dynamic analysis approaches to further help developers improve the performance of their database access code. We propose automated approaches to detect redundant data access anti-patterns in the database access code, and our study finds that resolving such redundant data access anti-patterns can improve application performance by an average of 17%. Finally, we propose an automated approach to tune performance-related ORM configurations using both static and dynamic analysis. Our study shows that our approach can help improve application throughput by 27--138%. Through our case studies on real-world applications, we show that all of our proposed approaches can provide valuable support to developers and help improve application performance significantly.
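A classic instance of the inefficient database access patterns targeted above is the N+1 (one-query-per-object) anti-pattern, resolved by batching. The sketch below uses an invented fake ORM that merely counts issued queries; the thesis's actual detection operates on real ORM code via static and dynamic analysis, not on a toy like this.

```python
# Illustration of the N+1 ORM anti-pattern and its batched fix.
# FakeOrm is a stand-in that counts how many SQL queries would be issued.

class FakeOrm:
    def __init__(self):
        self.queries = 0

    def fetch_user(self, uid):
        self.queries += 1            # one SQL query per call
        return {"id": uid}

    def fetch_users(self, uids):
        self.queries += 1            # one batched query (WHERE id IN ...)
        return [{"id": u} for u in uids]

ids = list(range(100))

slow = FakeOrm()
users = [slow.fetch_user(u) for u in ids]   # anti-pattern: 100 queries
fast = FakeOrm()
users = fast.fetch_users(ids)               # resolved: 1 query

print(slow.queries, fast.queries)  # → 100 1
```

Ranking detected instances by severity, as the thesis does, would here amount to weighting each anti-pattern occurrence by how many redundant queries (and how much execution time) it actually incurs at runtime.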
Abstract:
With the emerging prevalence of smart phones and 4G LTE networks, the demand for faster-better-cheaper mobile services anytime and anywhere is ever growing. The Dynamic Network Optimization (DNO) concept emerged as a solution that optimally and continuously tunes the network settings in response to varying network conditions and subscriber needs. Yet, the realization of DNO is still in its infancy, largely hindered by the bottleneck of lengthy optimization runtimes. This paper presents the design and prototype of a novel cloud based parallel solution that further enhances the scalability of our prior work on parallel solutions that accelerate network optimization algorithms. The solution aims to satisfy the high performance required by DNO, preliminarily on a sub-hourly basis. The paper subsequently presents a design and a full cycle of a DNO system. A set of potential solutions to large-network and real-time DNO is also proposed. Overall, this work represents a breakthrough towards the realization of DNO.