848 results for performance management framework
Abstract:
Current water management practices in South Florida have negatively impacted many species inhabiting Florida Bay. Variable and high salinity has been identified as a key stressor in these estuaries. The Comprehensive Everglades Restoration Plan (CERP) includes water redistribution projects that will restore natural freshwater flows to northeastern Florida Bay. My studies focused on the following central theme and hypotheses: biological performance measures (i.e., growth, reproduction, survival), behavior (i.e., habitat preference and locomotor behavior) and diversity of estuarine fish will be controlled by the changes in salinity and water quality that will occur as a result of the restoration of freshwater flow to the bay. A series of acute and subchronic physiological toxicity studies was conducted to determine the effects of salinity changes on the life stages (embryo/larval, juvenile, adult) and fecundity of four native estuarine fish (Cyprinodon variegatus, Floridichthys carpio, Poecilia latipinna, and Gambusia holbrooki). Fish were exposed to a range of salinities (freshwater to hypersaline) based on salinity profiles in the study areas. Growth (length, weight) and survival were measured. Salinity trials included both rapid and gradual change events. Results show negative effects of acute, abrupt salinity changes on fish survival, development and reproductive success as a result of salinity stress. Other studies targeted reproduction and critical embryo-larval/neonate development as key areas for detecting long-term population effects of salinity change in Florida Bay. Adults of C. variegatus and P. latipinna were also examined for behavioral responses to pulsed salinity changes, including changes in swimming performance, locomotor behavior and zone preference. Finally, an ecological risk assessment was conducted for adverse salinity conditions in northeastern Florida Bay. Using the U.S. EPA's framework, the risk to estuarine fish species diversity was assessed against regional salinity profiles from a 17-year database. Based on the risk assessment, target salinity profiles for these areas are recommended for managers.
Abstract:
The maintenance and evolution of software systems has become a critical task over recent years due to the diversity and high demand of features, devices and users. Understanding and analyzing how new changes impact the quality attributes of the architecture of such systems is an essential prerequisite for preventing quality deterioration during their evolution. This thesis proposes an automated approach for analyzing variation of the performance quality attribute in terms of execution time (response time). It is implemented by a framework that adopts dynamic analysis and software repository mining techniques to provide an automated way of revealing potential sources – commits and issues – of performance variation in scenarios during the evolution of software systems. The approach defines four phases: (i) preparation – choosing the scenarios and preparing the target releases; (ii) dynamic analysis – determining the performance of scenarios and methods by computing their execution times; (iii) variation analysis – processing and comparing the dynamic analysis results for different releases; and (iv) repository mining – identifying issues and commits associated with the detected performance variation. Empirical studies were conducted to evaluate the approach from different perspectives. An exploratory study analyzed the feasibility of applying the approach to systems from different domains in order to automatically identify source code elements with performance variation and the changes that affected those elements during an evolution. This study analyzed three systems: (i) SIGAA – a web system for academic management; (ii) ArgoUML – a UML modeling tool; and (iii) Netty – a framework for network applications. Another study performed an evolutionary analysis by applying the approach to multiple releases of Netty and of the web frameworks Wicket and Jetty. In that study, 21 releases were analyzed (seven from each system), totaling 57 scenarios. In summary, 14 scenarios with significant performance variation were found for Netty, 13 for Wicket and 9 for Jetty. Additionally, feedback was obtained from eight developers of these systems through an online form. Finally, in the last study, a performance regression model was developed to indicate properties of commits that are more likely to cause performance degradation. Overall, 997 commits were mined, of which 103 were retrieved from degraded source code elements and 19 from optimized ones, while 875 had no impact on execution time. The number of days before the release and the day of the week proved to be the most relevant variables of performance-degrading commits in our model. The area under the Receiver Operating Characteristic (ROC) curve of the regression model is 60%, which means that using the model to decide whether or not a commit will cause degradation is 10% better than a random decision.
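To make the variation-analysis phase concrete, the following minimal sketch (not the thesis's framework, which also instruments methods and mines repositories) compares execution times of the same scenario measured on two releases and classifies the change; the scenario name, timings and 10% threshold are hypothetical.

```python
# Minimal sketch of the "variation analysis" phase: compare execution times of the
# same scenario measured on two releases and flag significant variation.
# Scenario names, timings and the 10% threshold are hypothetical.
from statistics import mean

def classify(old_ms: list[float], new_ms: list[float], threshold: float = 0.10) -> str:
    """Return 'degraded', 'optimized' or 'unchanged' based on relative change in mean time."""
    old_avg, new_avg = mean(old_ms), mean(new_ms)
    change = (new_avg - old_avg) / old_avg
    if change > threshold:
        return "degraded"
    if change < -threshold:
        return "optimized"
    return "unchanged"

# Execution times (ms) of one scenario over repeated runs on two releases (hypothetical).
release_a = [120.3, 118.9, 121.7, 119.5]
release_b = [141.2, 139.8, 143.0, 140.5]

verdict = classify(release_a, release_b)
print(f"scenario 'handle request': {verdict}")
# A repository-mining step would then list the commits/issues landed between the two
# releases that touched the methods exercised by a degraded scenario.
```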
Abstract:
“Availability” is the terminology used in asset-intensive industries such as petrochemical and hydrocarbon processing to describe the readiness of equipment, systems or plants to perform their designed functions. It is a measure of a facility’s capability to meet targeted production in a safe working environment. Availability is also vital because it encompasses reliability and maintainability, allowing engineers to manage and operate facilities by focusing on one performance indicator. These benefits make availability a highly demanded and desired area of interest and research for both industry and academia. In this dissertation, new models, approaches and algorithms have been explored to estimate and manage the availability of complex hydrocarbon processing systems. The risk of equipment failure and its effect on availability is vital in the hydrocarbon industry and is also explored in this research. The importance of availability has encouraged companies to invest in this domain by putting effort and resources into developing novel techniques for system availability enhancement. Most of the work in this area is focused on individual equipment rather than facility- or system-level availability assessment and management. This research is focused on developing new systematic methods to estimate system availability. The main focus areas are availability estimation and management through physical asset management, risk-based availability estimation strategies, availability and safety using a failure assessment framework, and availability enhancement using early equipment fault detection and maintenance scheduling optimization.
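Since the abstract stresses that availability encompasses reliability and maintainability, a short worked example helps: the sketch below uses the standard steady-state formula A = MTBF / (MTBF + MTTR) and the usual series/parallel combinations. The equipment values are hypothetical and this is not the dissertation's specific model.

```python
# Minimal sketch: steady-state availability from standard reliability formulas.
# Equipment values are hypothetical; the dissertation's models are more elaborate.

def availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Steady-state availability A = MTBF / (MTBF + MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

def series(*avails: float) -> float:
    """Components in series: the system is up only if all components are up."""
    a = 1.0
    for x in avails:
        a *= x
    return a

def parallel(*avails: float) -> float:
    """Redundant (parallel) components: the system is down only if all are down."""
    down = 1.0
    for x in avails:
        down *= (1.0 - x)
    return 1.0 - down

pump = availability(mtbf_hours=4000, mttr_hours=24)        # hypothetical pump
compressor = availability(mtbf_hours=8000, mttr_hours=72)  # hypothetical compressor
train = series(pump, compressor)                           # both needed for production
with_spare_pump = series(parallel(pump, pump), compressor) # redundant pump raises availability
print(f"pump={pump:.4f} train={train:.4f} with spare pump={with_spare_pump:.4f}")
```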
Abstract:
This thesis aims to assess the degree of implementation and use of performance measurement systems (PMS) by decision-makers in rehabilitation organizations and to understand the contextual factors that influenced their implementation. To do so, a multiple case study was conducted with two data sources: individual interviews with senior managers of Quebec rehabilitation organizations, and organizational documents. The Consolidated Framework for Implementation Research was used to guide data collection and analysis. Both within-case and cross-case analyses were performed. Our results show that organizational readiness to implement a PMS was high and that the PMSs were successfully implemented and used in several ways. Organizations used them passively (as an information tool), in a targeted way (to try to improve underperforming areas) and politically (as a negotiation tool with government authorities). This diversified use of PMSs arises from the complex interaction of factors stemming from each organization's internal context, the characteristics of the PMS, the implementation process applied, and the external context in which these organizations operate. Regarding the internal context, the continuous commitment and leadership of senior management were decisive in PMS implementation through their influence on identifying the need for a PMS, engaging the intended users in the project, the organizational priority given to the PMS, the resources allocated to its implementation, the quality of communications and the organizational learning climate. However, even though some of these factors, such as the resources allocated to implementation, the organizational priority of the PMS and the learning climate, turned out to be barriers to implementation, ultimately these barriers were not significant enough to hinder the use of the PMS. This study also confirmed the importance of the characteristics of the PMS, particularly the perceived quality and usefulness of the information. On their own, however, these characteristics are insufficient to ensure implementation success. The implementation analysis also revealed that, even though the implementation process did not follow formal steps, a PMS development plan, the participation and commitment of decision-makers, and the appointment of a project lead all facilitated implementation. However, the lack of evaluation and collective reflection on the implementation process limited the potential for organizational learning, a prerequisite for performance improvement. As for the external context, support from an external body proved to be an indispensable facilitator of PMS implementation by rehabilitation organizations despite the absence of government policies and incentives to that effect. This study adds to knowledge about contextual factors and their interactions in the use of innovations such as PMSs and confirms the importance of approaching implementation analysis from a systems perspective.
Abstract:
This paper explores a data-driven approach called Sales Resource Management (SRM) that can provide real insight into sales management. The DSMT (Diagnosis, Strategy, Metrics and Tools) framework can be used to solve field sales management challenges. The paper focuses on the 6P's strategy of SRM and illustrates how to use it to solve the CAPS (Concentration, Attrition, Performance and Spend) challenges. © 2010 IEEE.
Abstract:
Drawing on the organizational capabilities literature, the authors developed and tested a model of how supportive human resource management (HRM) improved firms’ financial performance, as perceived by marketing managers, through fostering the implementation of a customer-oriented strategy. Customer-linking capability, which is the capability to manage close customer relationships, indicated the implementation of the customer-oriented strategy. Data collected from two emerging economies – China and Hungary – established that supportive HRM partially mediated the relationship between customer-oriented strategy and customer-linking capability. Customer-linking capability further explained how supportive HRM contributed to perceived financial performance. This study explicates the implications of customer-oriented strategy for HRM and reveals the importance of HRM in strategy implementation. It also sheds some light on the ‘black box’ between HRM and performance. While making important contributions to the fields of strategy, HRM and marketing, this study also offers useful practical implications.
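As a hedged illustration of the partial-mediation claim (not the paper's actual analysis or data), the toy sketch below simulates the strategy-HRM-capability chain and estimates total, direct and indirect effects with ordinary least squares; all values are invented.

```python
# Toy sketch of a mediation check in the spirit of the study's model (not its actual
# analysis): X = customer-oriented strategy, M = supportive HRM, Y = customer-linking
# capability. Data are simulated; the study used survey data from China and Hungary.
import numpy as np

rng = np.random.default_rng(0)
n = 300
X = rng.normal(size=n)                                  # customer-oriented strategy
M = 0.6 * X + rng.normal(scale=0.8, size=n)             # supportive HRM, partly driven by X
Y = 0.3 * X + 0.5 * M + rng.normal(scale=0.8, size=n)   # customer-linking capability

def ols(y, *regressors):
    """Coefficients of y regressed on an intercept plus the given regressors."""
    A = np.column_stack([np.ones(len(y)), *regressors])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta

total = ols(Y, X)[1]               # total effect of X on Y
a = ols(M, X)[1]                   # path X -> M
_, b, direct = ols(Y, M, X)        # path M -> Y (b) and direct X -> Y (c')
print(f"total={total:.2f} direct={direct:.2f} indirect={a * b:.2f}")
# Partial mediation shows up as a direct effect that shrinks, but stays non-zero,
# once the mediator M is included in the regression.
```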
Abstract:
The rapid development of diverse hardware platforms is twofold: on one side, the push for exascale performance for big data processing and management; on the other, mobile and embedded devices for data collection and human-machine interaction. This has driven a highly hierarchical evolution of programming models. GVirtuS is a general virtualization system developed in 2009 and first introduced in 2010, enabling a completely transparent layer between GPUs and VMs. This paper presents the latest achievements and developments of GVirtuS, which now supports CUDA 6.5, memory management and scheduling. Thanks to new and improved remoting capabilities, GVirtuS now enables GPU sharing among physical and virtual machines based on x86 and ARM CPUs on local workstations, computing clusters and distributed cloud appliances.
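The transparent layer the abstract describes follows a frontend/backend (split-driver) remoting pattern: API calls issued inside a VM are serialized and executed on the host that owns the physical GPU. The toy sketch below illustrates that general pattern over TCP and JSON only; it is not GVirtuS's actual implementation (which is native and speaks real CUDA), and every name in it is hypothetical.

```python
# Toy illustration of the split-driver / API-remoting pattern: a guest-side "frontend"
# serializes a call, a host-side "backend" that owns the physical GPU executes it and
# returns the result. Generic sketch with hypothetical names; not GVirtuS code.
import json
import socket
import threading

HOST, PORT = "127.0.0.1", 9999  # hypothetical local endpoint

server = socket.create_server((HOST, PORT))  # backend listens before any frontend call

def backend(srv: socket.socket) -> None:
    """Host side: receive one call descriptor and pretend to execute it on the GPU."""
    conn, _ = srv.accept()
    with conn:
        request = json.loads(conn.recv(4096).decode())
        # A real backend would dispatch to the native CUDA runtime/driver here.
        result = {"call": request["call"], "status": "ok", "value": sum(request["args"])}
        conn.sendall(json.dumps(result).encode())

def frontend(call: str, args: list) -> dict:
    """Guest side: what the application links against instead of the real library."""
    with socket.create_connection((HOST, PORT)) as s:
        s.sendall(json.dumps({"call": call, "args": args}).encode())
        return json.loads(s.recv(4096).decode())

threading.Thread(target=backend, args=(server,), daemon=True).start()
print(frontend("vectorAdd", [1, 2, 3]))  # the guest application sees an ordinary local call
server.close()
```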
Abstract:
Effectiveness in achieving mission is fundamental to evaluating charity performance, and is of central concern to stakeholders who fund, regulate and otherwise engage with such organisations. Exploring the meaning of transparency in the context of stakeholder engagement, and utilising previous research and authoritative sector discussion, this paper develops a novel framework of transparent, stakeholder-focused effectiveness reporting. It is contended that such reporting can assist the charity sector in discharging accountability, gaining legitimacy, and sharpening mission-centred managerial decision making. Applying the framework to UK charities’ publicly available communications, the paper then highlights significant challenges and weaknesses in current effectiveness reporting.
Abstract:
Today a number of studies are published on how organizational strategy is developed and how organizations contribute to local and regional development through the realization of these strategies. There are also many articles dealing with the success of a project by identifying the criteria and the factors that influence it. This article introduces the project-oriented strategic planning process, which reveals how projects contribute to local and regional development, and demonstrates the relationship between this approach and the regional competitiveness model as well as the KRAFT concept. A large body of research focuses on sustainability in business, arguing that sustainability is very important to the future success of a business. The Project Excellence Model, which analyses project success, does not contain sustainability criteria, while the GPM P5 standard consists of sustainability components related only to the organizational level. To fill this gap, a Project Sustainability Excellence Model (PSEM) was developed. The model was tested through interviews with managers of Hungarian for-profit and non-profit organizations. This paper introduces the PSEM and highlights the most important elements of the empirical analysis.
Abstract:
This paper addresses the two opposing extremes of standardisation in franchising and the dynamics of sales in search of a juncture point, in order to reduce franchisees’ uncertainties in sales and improve sales performance. A conceptual framework is developed based on both theory and practice in order to investigate the sales process of a specific franchise network. The research is conducted over a period of six weeks in the form of a customised sales report built around the sales funnel concept and performance indicators along the sales process. The quantitative data received are analysed through descriptive statistics and logistic regressions with respect to what variations in the sales process can be discovered and which practices yield higher performance. The results indicate an advantage of a prioritisation guideline for the activities and choices a salesperson makes over strict standardisation. Defining the sales funnel and engaging in the process of monitoring sales has in itself proven to be a way of reducing uncertainty, as franchisor and franchisees alike gain a greater understanding of the process. The knowledge gained from this research allows for both practical and theoretical implications and expands understanding of the standardisation of sales and the appropriateness of the sales funnel and its management for dealing with the dilemma between standardisation and flexibility of sales in franchising contexts.
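To illustrate the kind of logistic-regression analysis the abstract mentions (not the study's actual report fields or data, which are not given here), the sketch below fits a model predicting lead conversion from invented sales-funnel indicators.

```python
# Toy logistic regression in the spirit of the study's analysis: predict conversion of a
# lead from sales-funnel activity indicators. Data and feature names are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 200
followup_calls = rng.integers(0, 5, size=n)     # hypothetical: calls after first contact
demo_given = rng.integers(0, 2, size=n)         # hypothetical: product demo held (0/1)
days_in_funnel = rng.integers(1, 60, size=n)    # hypothetical: days since lead entered funnel

# Simulate conversions so that calls and demos help while long dwell time hurts.
logit = -1.0 + 0.6 * followup_calls + 1.2 * demo_given - 0.03 * days_in_funnel
converted = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = np.column_stack([followup_calls, demo_given, days_in_funnel])
model = LogisticRegression().fit(X, converted)
for name, coef in zip(["followup_calls", "demo_given", "days_in_funnel"], model.coef_[0]):
    print(f"{name}: {coef:+.3f}")  # sign/size hints at which practices align with conversion
```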