912 results for Feedback multi-source


Abstract:

Summary: In an era of heightened economic competition, organizations increasingly feel the need to measure the performance of their employees, including that of their leaders. To this end, evaluation remains a preferred management tool. Among existing evaluation systems, multi-source feedback (MSF), or 360° feedback, is gaining ground. Unfortunately, the impact of this type of system is still not well understood: the literature says little about what it actually delivers and, in particular, about how those evaluated react after receiving their evaluation report. Some studies nevertheless indicate that executives, and business owners in particular, feel left to their own devices when it comes to the evaluation of their competencies. A few years ago, at the request of the Groupement des chefs d'entreprise du Québec (GCEQ), a multi-source instrument was designed by the Laboratoire de recherche sur la performance des entreprises (LaRePe) to measure specifically the performance of business leaders as leaders. At this stage, its promoters want to better understand the impact of using their tool, the PDG-Leadership. To fill the gaps in the literature, and to meet the GCEQ's need, this thesis examines how business leaders react after receiving their evaluation report. The purpose of the research is twofold: to examine the variables that influence whether those evaluated undertake actions following their feedback (theoretical consideration) and to learn more about the actions undertaken, in short, what the multi-source feedback (MSF) system really delivers (practical consideration). To carry out the research, a residency was completed; it provided the context for developing a survey questionnaire aimed specifically at business leaders. The survey reached 351 executives who had been evaluated at least once with the PDG-Leadership; of these, 87 respondents, members of the Groupement, took part. The conceptual framework is an adaptation of the model proposed by Smither, London and Reilly (2005a); it comprises seven variables, from which five research hypotheses were drawn. Four hypotheses were rejected, while another was supported only for the women in the sample. Interestingly, it is not the feedback (report) that triggers acceptance and then action, but a personal attitude represented by the perceived possibility of change (V4). Among business leaders, there is therefore no chain reaction of the kind assumed by the theoretical model; rather, it is the perceived possibility of change, akin to the sense of self-efficacy defined by Bandura (2007), that underlies the decision to take action. The data collected also served to generate new knowledge and to bring out a list of 112 actions that the leaders report having undertaken after receiving their evaluation report. This list made it possible to categorize the actions taken. The actions they undertake are, however, directed more toward improving the organization than toward their own improvement. This is one of the contributions of this thesis.

Abstract: In a context of intense economic competition, organizations are increasingly using performance evaluation instruments; multi-source feedback, or 360-degree feedback, is one of them. The literature remains largely silent on what this type of evaluation actually delivers and on the reactions it generates among those evaluated. In response to a request from the Groupement des chefs d'entreprise du Québec (GCEQ), a multi-source assessment system, the PDG-Leadership, was designed by the Laboratoire de recherche sur la performance des entreprises (LaRePe) to measure specifically the skills of SME managers as leaders. After some years of use, its developers want to better understand its impact in order to improve it. To address these theoretical and practical considerations, a survey was conducted among 87 business leaders from Quebec who had already been assessed using this tool. The purpose of this research is twofold: to validate a preliminary model proposed by Smither, London, and Reilly (2005a) by examining the variables that influence whether those evaluated undertake actions as a result of their feedback, and to identify those actions, in short, what the multi-source feedback (MSF) system really delivers. From the analysis of the data collected, a list of 112 actions was established, which in turn led to a categorization of the actions taken. Although the MSF system is effective, it should be noted that entrepreneurs seem to react differently from other categories of evaluated individuals.

Abstract:

We present a new penalty-based genetic algorithm for the multi-source and multi-sink minimum vertex cut problem, and illustrate the algorithm’s usefulness with two real-world applications. It is proved in this paper that the genetic algorithm always produces a feasible solution by exploiting some domain-specific knowledge. The genetic algorithm has been implemented on the example applications and evaluated to show how well it scales as the problem size increases.
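
The abstract does not spell out the chromosome encoding, the penalty scheme, or the domain-specific repair that guarantees feasibility, so the sketch below is only a generic, hypothetical illustration of a penalty-based GA for this problem: candidates are subsets of removable vertices, and candidates that leave any source connected to any sink are penalized in proportion to the number of surviving source-sink pairs. All names and parameter values are invented for the example.

```python
import random
from collections import deque

def connected_pairs(adj, removed, sources, sinks):
    """Number of source-sink pairs still connected once `removed` vertices are deleted."""
    count = 0
    for s in sources:
        if s in removed:
            continue
        seen, queue = {s}, deque([s])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in removed and v not in seen:
                    seen.add(v)
                    queue.append(v)
        count += sum(1 for t in sinks if t in seen)
    return count

def ga_min_vertex_cut(adj, sources, sinks, pop_size=60, generations=200,
                      pmut=0.02, penalty=100.0):
    """Penalty-based GA sketch: minimise (cut size + penalty * unseparated pairs)."""
    terminals = set(sources) | set(sinks)
    vertices = [v for v in adj if v not in terminals]   # terminals themselves are not removable
    n = len(vertices)

    def fit(bits):
        removed = {v for v, b in zip(vertices, bits) if b}
        return len(removed) + penalty * connected_pairs(adj, removed, sources, sinks)

    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    best = min(pop, key=fit)
    for _ in range(generations):
        nxt = [best]                                    # elitism: carry the best-so-far over
        while len(nxt) < pop_size:
            p1 = min(random.sample(pop, 2), key=fit)    # binary tournament selection
            p2 = min(random.sample(pop, 2), key=fit)
            cut = random.randrange(1, n) if n > 1 else 1
            child = p1[:cut] + p2[cut:]                 # one-point crossover
            child = [1 - b if random.random() < pmut else b
                     for b in child]                    # bit-flip mutation
            nxt.append(child)
        pop = nxt
        best = min(pop, key=fit)
    return {v for v, b in zip(vertices, best) if b}
```

On a concrete graph this would be called as, e.g., `ga_min_vertex_cut(adj, ["s1", "s2"], ["t1", "t2"])`. Because a pure penalty formulation does not by itself guarantee feasibility, the returned set still needs a separation check; closing that gap is precisely what the paper attributes to its domain-specific knowledge.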

Abstract:

Network coding is a method for achieving channel capacity in networks. The key idea is to allow network routers to linearly mix packets as they traverse the network so that recipients receive linear combinations of packets. Network coded systems are vulnerable to pollution attacks where a single malicious node floods the network with bad packets and prevents the receiver from decoding correctly. Cryptographic defenses to these problems are based on homomorphic signatures and MACs. These proposals, however, cannot handle mixing of packets from multiple sources, which is needed to achieve the full benefits of network coding. In this paper we address integrity of multi-source mixing. We propose a security model for this setting and provide a generic construction.
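
As background for readers unfamiliar with the mixing step, here is a minimal, hypothetical sketch (not the paper's construction, and with no integrity protection at all) of random linear network coding over GF(2): nodes forward random combinations of packets together with their coefficient vectors, and a receiver recovers the originals by Gaussian elimination once it holds enough independent combinations. It also makes the pollution problem concrete: a single corrupted combination would propagate into every decoded packet, which is what homomorphic signatures and MACs are meant to prevent.

```python
import os
import random

PKT_LEN = 8        # bytes per source packet (toy size)
N_SOURCES = 4      # number of original packets being mixed

def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def encode(packets):
    """One coded packet: a random GF(2) coefficient vector plus the XOR of the selected packets."""
    coeffs = [random.randint(0, 1) for _ in packets]
    if not any(coeffs):
        coeffs[random.randrange(len(packets))] = 1   # avoid the useless all-zero combination
    payload = bytes(PKT_LEN)
    for c, p in zip(coeffs, packets):
        if c:
            payload = xor_bytes(payload, p)
    return coeffs, payload

def decode(received):
    """Gauss-Jordan elimination over GF(2) on rows [coeffs | payload]; raises if rank-deficient."""
    rows = [(list(c), p) for c, p in received]
    for col in range(N_SOURCES):
        pivot = next((i for i in range(col, len(rows)) if rows[i][0][col]), None)
        if pivot is None:
            raise ValueError("not enough independent combinations yet")
        rows[col], rows[pivot] = rows[pivot], rows[col]
        for i in range(len(rows)):
            if i != col and rows[i][0][col]:
                rows[i] = ([a ^ b for a, b in zip(rows[i][0], rows[col][0])],
                           xor_bytes(rows[i][1], rows[col][1]))
    return [rows[i][1] for i in range(N_SOURCES)]

packets = [os.urandom(PKT_LEN) for _ in range(N_SOURCES)]
received = []
while True:                       # keep collecting coded packets until the system is solvable
    received.append(encode(packets))
    try:
        recovered = decode(received)
        break
    except ValueError:
        continue
print(recovered == packets)       # True: the receiver recovers every original packet
```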

Abstract:

Opportunistic relay selection in a multiple source-destination (MSD) cooperative system requires quickly allocating to each source-destination (SD) pair a suitable relay based on channel gains. Since the channel knowledge is available only locally at a relay and not globally, efficient relay selection algorithms are needed. For an MSD system, in which the SD pairs communicate in a time-orthogonal manner with the help of decode-and-forward relays, we propose three novel relay selection algorithms, namely, contention-free en masse assignment (CFEA), contention-based en masse assignment (CBEA), and a hybrid algorithm that combines the best features of CFEA and CBEA. En masse assignment exploits the fact that a relay can often aid not one but multiple SD pairs, and, therefore, can be assigned to multiple SD pairs. This drastically reduces the average time required to allocate an SD pair when compared to allocating the SD pairs one by one. We show that the algorithms are much faster than other selection schemes proposed in the literature and yield significantly higher net system throughputs. Interestingly, CFEA is as effective as CBEA over a wider range of system parameters than in single SD pair systems.
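
The timer-based contention mechanisms behind CFEA and CBEA are defined in the paper and not reproduced here; the toy sketch below (all names, thresholds, and channel models are assumptions) only illustrates the en masse idea itself: each SD pair is given the relay with the best end-to-end decode-and-forward quality among the relays that decoded its source, and nothing forces the assignment to be one-to-one, so a single strong relay may end up serving several pairs.

```python
import random

def en_masse_assignment(num_pairs, num_relays, decode_threshold=0.5):
    """Toy 'en masse' relay assignment: every SD pair independently picks its best
    decoding relay, so one relay may be assigned to several pairs at once."""
    # Rayleigh-like channel power gains (exponentially distributed), S->R and R->D
    g_sr = [[random.expovariate(1.0) for _ in range(num_relays)] for _ in range(num_pairs)]
    g_rd = [[random.expovariate(1.0) for _ in range(num_relays)] for _ in range(num_pairs)]

    assignment = {}
    for p in range(num_pairs):
        # Decode-and-forward: a relay is a candidate only if its S->R link is good enough
        candidates = [r for r in range(num_relays) if g_sr[p][r] > decode_threshold]
        if candidates:
            # End-to-end quality of a DF relay is limited by the weaker of its two hops
            assignment[p] = max(candidates, key=lambda r: min(g_sr[p][r], g_rd[p][r]))
    return assignment

print(en_masse_assignment(num_pairs=4, num_relays=3))   # e.g. {0: 2, 1: 2, 2: 0, 3: 2}
```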

Abstract:

This paper proposes a hierarchical energy management system (EMS) for multi-source multi-product (MSMP) microgrids. The traditional energy-hub-based scheduling method is combined with a hierarchical control structure to incorporate the transient characteristics of natural gas flow and the dynamics of energy converters in microgrids. The hierarchical EMS includes a supervisory control layer, an optimizing control layer, and an execution control layer. To accommodate the system's multi-time-scale characteristics efficiently, the optimizing control layer is decomposed into three sub-layers: slow, medium, and fast. The thermal, gas, and electrical management systems are integrated into the slow, medium, and fast control layers, respectively. Compared with wind energy, solar energy is easier to integrate and more suitable for the microgrid environment; therefore, the potential impacts of the hierarchical EMS on MSMP microgrids are investigated on a building energy system integrating photovoltaics and microturbines. Numerical studies indicate that with a hierarchical EMS, MSMP microgrids can be operated economically and that interactions among the thermal, gas, and electrical systems can be effectively managed.
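
The abstract does not give the optimization models used in each layer, so the following is only a hypothetical skeleton (all function names, periods, and numbers are assumptions) of how the three optimizing sub-layers could be scheduled on one simulated clock: a slow thermal schedule feeds a medium-rate gas schedule, which in turn constrains the fast electrical dispatch.

```python
def schedule_thermal(t):
    # Slow layer (15 min period): decide how much heat demand to cover
    return 50.0                                   # kW of heat

def schedule_gas(t, heat_kw):
    # Medium layer (5 min period): gas flow consistent with the thermal schedule
    return heat_kw / 30.0                         # m^3/h, fixed conversion efficiency assumed

def dispatch_electrical(t, gas_flow):
    # Fast layer (1 min period): split load between PV, microturbine, and grid exchange
    return 20.0                                   # kW electrical setpoint

def run_hierarchical_ems(horizon_s=1800, slow_dt=900, medium_dt=300, fast_dt=60):
    """One simulated clock; each optimizing sub-layer re-plans at its own period and
    passes its setpoints down to the faster layers (the print stands in for execution)."""
    heat = gas = power = 0.0
    for t in range(0, horizon_s, fast_dt):
        if t % slow_dt == 0:
            heat = schedule_thermal(t)
        if t % medium_dt == 0:
            gas = schedule_gas(t, heat)
        power = dispatch_electrical(t, gas)
        print(f"t={t:5d}s  heat={heat:.0f} kW  gas={gas:.2f} m3/h  power={power:.0f} kW")

run_hierarchical_ems()
```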

Abstract:

Background: Feedback is considered to be one of the most important drivers of learning. One form of structured feedback used in medical settings is multisource feedback (MSF). This technique provides the opportunity to gain a differentiated view of a doctor's performance from several perspectives, using a questionnaire and a facilitating conversation in which learning goals are formulated. While many studies have been conducted on the validity, reliability and feasibility of the instrument, little is known about the factors that might influence the effects of MSF on clinical performance. Summary of Work: To study under which circumstances MSF is most effective, we performed a literature review on Google Scholar with a focus on MSF and on feedback in general. The main keywords were MSF, multi-source-feedback, multi source feedback, and feedback, each combined with influencing/hindering/facilitating factors, effective, effectiveness, doctors-in-training, and surgery. Summary of Results: Based on the literature, we developed a preliminary model of facilitating factors. This model includes five main factors influencing MSF: the questionnaire, the doctor-in-training, the group of raters, the facilitating supervisor, and the facilitating conversation. Discussion and Conclusions: In particular, the following points that might influence MSF have not yet been sufficiently studied: the facilitating conversation with the supervisor, individual aspects of doctors-in-training, and the causal relations between influencing factors. Overall, only very few studies focus on the impact of MSF on actual and long-term performance. We developed a preliminary model of hindering and facilitating factors for MSF. Further studies are needed to better understand under which circumstances MSF is most effective. Take-home messages: The preliminary model might help to guide further studies on how to implement MSF so that it is used to its full potential.

Abstract:

The use of fixed-point arithmetic is a widespread design choice in systems with tight area, power, or performance constraints. To produce implementations in which costs are minimized without negatively affecting the accuracy of the results, a careful assignment of word-lengths must be carried out. Finding the optimal combination of fixed-point word-lengths for a given system is an NP-hard combinatorial problem to which designers devote between 25 and 50% of the design cycle. Reconfigurable hardware platforms such as FPGAs also benefit from the advantages of fixed-point arithmetic, since it compensates for the lower clock frequencies and the less efficient hardware utilization of these platforms with respect to ASICs. As FPGAs become popular for scientific computing, designs grow in size and complexity to the point where they can no longer be handled efficiently by current signal and quantization-noise modelling and word-length optimization techniques. In this Ph.D. thesis we explore different aspects of the quantization problem and present new methodologies for each of them. Techniques based on interval extensions have made it possible to obtain very accurate models of signal and quantization-noise propagation in systems with non-linear operations. We take this approach a step further by introducing elements of Multi-Element Generalized Polynomial Chaos (ME-gPC) and combining them with a modern technique based on statistical Modified Affine Arithmetic (MAA) in order to model systems that contain control-flow structures. Our methodology generates the different execution paths automatically, determines the regions of the input domain that will exercise each of them, and extracts the statistical moments of the system from those partial solutions. We use this technique to estimate both the dynamic range and the round-off noise in systems with the aforementioned control-flow structures, and we show the accuracy of our approach, which in certain case studies with non-linear operators deviates by as little as 0.04% from the reference values obtained by simulation. A known drawback of techniques based on interval extensions is the combinatorial explosion of terms as the size of the systems under study grows, which leads to scalability problems. To address this problem we present a clustered noise-injection technique that groups the signals of the system, introduces the noise sources for each group separately, and finally combines the results of each of them. In this way, the number of noise sources is kept under control at all times and, as a result, the combinatorial explosion is minimized. We also present a multi-way partitioning algorithm aimed at minimizing the deviation of the results caused by the loss of correlation between noise terms, so as to keep the results as accurate as possible. This thesis also addresses the development of word-length optimization methodologies based on Monte-Carlo simulations that run in reasonable times.
To this end we present two new techniques that tackle the reduction of the execution time from different angles. First, the interpolative method applies a simple but accurate interpolator to estimate the sensitivity of each signal, which is then used during the optimization stage. Second, the incremental method builds on the fact that, although it is strictly necessary to maintain a given confidence interval for the final results of the search, more relaxed confidence levels, and therefore fewer simulation runs, can be used in the initial stages of the search, when we are still far from the optimized solutions. With these two approaches we show that the execution time of classical greedy-search algorithms can be accelerated by factors of up to ×240 for small/medium-sized problems. Finally, this book presents HOPLITE, an automated, flexible, and modular quantization framework that includes the implementation of the above techniques and is publicly available. Its aim is to offer developers and researchers a common environment in which to easily prototype and verify new quantization methodologies. We describe its workflow, justify the design decisions taken, explain its public API, and give a step-by-step demonstration of its operation. We also show, through a simple example, how to connect new extensions to the tool using the existing interfaces in order to expand and improve the capabilities of HOPLITE.

Abstract: Using fixed-point arithmetic is one of the most common design choices for systems where area, power, or throughput are heavily constrained. In order to produce implementations where the cost is minimized without negatively impacting the accuracy of the results, a careful assignment of word-lengths is required. The problem of finding the optimal combination of fixed-point word-lengths for a given system is a combinatorial NP-hard problem to which developers devote between 25 and 50% of the design-cycle time. Reconfigurable hardware platforms such as FPGAs also benefit from the advantages of fixed-point arithmetic, as it compensates for the slower clock frequencies and less efficient area utilization of the hardware platform with respect to ASICs. As FPGAs become commonly used for scientific computation, designs constantly grow larger and more complex, up to the point where they cannot be handled efficiently by current signal and quantization noise modelling and word-length optimization methodologies. In this Ph.D. thesis we explore different aspects of the quantization problem and we present new methodologies for each of them. The techniques based on extensions of intervals have allowed us to obtain accurate models of the signal and quantization noise propagation in systems with non-linear operations. We take this approach a step further by introducing elements of Multi-Element Generalized Polynomial Chaos (ME-gPC) and combining them with a state-of-the-art statistical Modified Affine Arithmetic (MAA) based methodology in order to model systems that contain control-flow structures. Our methodology produces the different execution paths automatically, determines the regions of the input domain that will exercise them, and extracts the system statistical moments from the partial results.
We use this technique to estimate both the dynamic range and the round-off noise in systems with the aforementioned control-flow structures. We show the good accuracy of our approach, which in some case studies with non-linear operators shows a 0.04% deviation with respect to the simulation-based reference values. A known drawback of the techniques based on extensions of intervals is the combinatorial explosion of terms as the size of the targeted systems grows, which leads to scalability problems. To address this issue we present a clustered noise injection technique that groups the signals in the system, introduces the noise terms in each group independently, and then combines the results at the end. In this way, the number of noise sources in the system at a given time is controlled and, because of this, the combinatorial explosion is minimized. We also present a multi-way partitioning algorithm aimed at minimizing the deviation of the results due to the loss of correlation between noise terms, in order to keep the results as accurate as possible. This Ph.D. thesis also covers the development of methodologies for word-length optimization based on Monte-Carlo simulations that run in reasonable times. We do so by presenting two novel techniques that explore the reduction of the execution times by approaching the problem in two different ways: First, the interpolative method applies a simple but precise interpolator to estimate the sensitivity of each signal, which is later used to guide the optimization effort. Second, the incremental method revolves around the fact that, although we strictly need to guarantee a certain confidence level in the simulations for the final results of the optimization process, we can use more relaxed levels, which in turn implies using a considerably smaller number of samples, in the initial stages of the process, when we are still far from the optimized solution. Through these two approaches we demonstrate that the execution time of classical greedy techniques can be accelerated by factors of up to ×240 for small/medium-sized problems. Finally, this book introduces HOPLITE, an automated, flexible, and modular framework for quantization that includes the implementation of the previous techniques and is provided for public access. The aim is to offer a common ground for developers and researchers to prototype and verify new techniques for system modelling and word-length optimization easily. We describe its workflow, justify the design decisions taken, explain its public API, and give a step-by-step demonstration of its execution. We also show, through an example, the way new extensions to the flow should be connected to the existing interfaces in order to expand and improve the capabilities of HOPLITE.
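
The thesis's own models and tools (ME-gPC, statistical MAA, HOPLITE) cannot be reconstructed from the abstract, but the Monte-Carlo side of the word-length problem can be illustrated with a small, self-contained toy. The sketch below (the example dataflow, signal names, and error budget are all assumptions) quantizes each signal of a tiny expression to a per-signal number of fractional bits, estimates the worst observed output error by simulation, and runs a greedy bit-reduction pass in which early iterations use few samples and later ones use more, in the spirit of the incremental method described above.

```python
import random

def quantize(x, frac_bits):
    """Round x to a fixed-point grid with `frac_bits` fractional bits."""
    scale = 1 << frac_bits
    return round(x * scale) / scale

def fixed_point_output(x1, x2, wl):
    """Toy dataflow y = (a*x1 + b*x2) * x1 with a per-signal fractional word-length map."""
    a, b = 0.7071, 0.4142
    s1 = quantize(a * quantize(x1, wl["x1"]), wl["m1"])
    s2 = quantize(b * quantize(x2, wl["x2"]), wl["m2"])
    return quantize((s1 + s2) * quantize(x1, wl["x1"]), wl["y"])

def reference_output(x1, x2):
    a, b = 0.7071, 0.4142
    return (a * x1 + b * x2) * x1

def max_error(wl, samples):
    rng = random.Random(0)               # fixed seed: identical stimuli for every candidate
    err = 0.0
    for _ in range(samples):
        x1, x2 = rng.uniform(-1, 1), rng.uniform(-1, 1)
        err = max(err, abs(fixed_point_output(x1, x2, wl) - reference_output(x1, x2)))
    return err

def greedy_wordlength_search(start_bits=16, err_limit=1e-3):
    wl = {s: start_bits for s in ("x1", "x2", "m1", "m2", "y")}
    samples = 200                         # relaxed accuracy while far from the optimum
    improved = True
    while improved:
        improved = False
        for sig in list(wl):              # try shaving one bit off each signal in turn
            trial = {**wl, sig: wl[sig] - 1}
            if trial[sig] >= 1 and max_error(trial, samples) <= err_limit:
                wl, improved = trial, True
        samples = min(5000, samples * 2)  # tighten the confidence as the search converges
    return wl, max_error(wl, 20000)       # final, high-sample check of the chosen solution

wl, err = greedy_wordlength_search()
print(wl, f"max observed error = {err:.2e}")
```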