954 results for Internal process-level performance


Relevance:

100.00%

Publisher:

Abstract:

Purpose – A growing body of literature points to the importance of public service motivation (PSM) for the performance of public organizations. The purpose of this paper is to assess the method predominantly used for studying this linkage by comparing the findings it yields without and with a correction suggested by Brewer (2006), which removes the common-method bias arising from employee-specific response tendencies. Design/methodology/approach – First, the authors conduct a systematic review of published empirical research on the effects of PSM on performance and show that all studies found have been conducted at the individual level. Performance indicators in all but three studies were obtained by surveying the same employees who were also asked about their PSM. Second, the authors conduct an empirical analysis. Using survey data from 240 organizational units within the Swiss federal government, the paper compares results from an individual-level analysis (comparable to existing research) to two analyses where the data are aggregated to the organizational level, one without and one with the correction for common-method bias suggested by Brewer (2006). Findings – Looking at the Attraction to Policy-Making dimension of PSM, there is an interesting contrast: While this variable is positively correlated with performance in both the individual-level analysis and the aggregated data analysis without the correction for common-method bias, it is not statistically associated with performance in the aggregated data analysis with the correction. Originality/value – The analysis is the first to assess the robustness of the performance-PSM linkage to a correction for common-method bias. The findings place the validity of at least one part of the individual-level linkage between PSM and performance into question.
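The aggregation-with-correction step can be sketched as follows. All data and names here are invented, and the split-half averaging is one simple reading of the correction attributed to Brewer (2006), in which no individual contributes to both the unit's PSM score and its performance score; it is not the study's actual procedure or code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: respondents nested in organizational units, each
# reporting their own PSM and rating unit performance (same-source data).
units = np.repeat(np.arange(40), 6)          # 40 units, 6 respondents each
psm = rng.normal(3.5, 0.6, units.size)
perf = rng.normal(4.0, 0.5, units.size)

def corrected_unit_correlation(units, psm, perf, rng):
    """Split-sample correction for common-method bias: within each unit,
    PSM is averaged over one random half of respondents and performance
    over the other half, so employee-specific response tendencies cannot
    inflate the unit-level correlation."""
    psm_u, perf_u = [], []
    for u in np.unique(units):
        idx = np.flatnonzero(units == u)
        rng.shuffle(idx)
        half = idx.size // 2
        psm_u.append(psm[idx[:half]].mean())     # raters used for PSM only
        perf_u.append(perf[idx[half:]].mean())   # different raters for performance
    return np.corrcoef(psm_u, perf_u)[0, 1]

r = corrected_unit_correlation(units, psm, perf, rng)
print(round(r, 3))
```

With simulated same-source noise the corrected unit-level correlation shrinks toward zero, which is the pattern the paper reports for the Attraction to Policy-Making dimension.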

Relevance:

100.00%

Publisher:

Abstract:

The process by which young talents develop into top-class players once they reach the age of maximum performance is influenced by numerous factors. Among the exogenous factors, the family plays a central role. In the context of a research project carried out in cooperation with the Swiss Football Association (SFV), 159 former members of the national youth football team were interviewed retrospectively, among other things concerning their family circumstances. The study addresses two issues: 1) It examines which family conditions – compared with average Swiss families – lead to success in adolescence (nomination for a national youth team). 2) Since success in adolescence by no means guarantees top-level performance at the age of maximum performance, the heterogeneity of the sample's adult level of performance is used to compare players who later achieve greater success with the less successful players. It is found that these players come from families with many children and a strong affinity for sports. Those players who are particularly successful at the age of maximum performance also felt they received more support from their parents and siblings during childhood and adolescence than the players who went on to be less successful.

Relevance:

100.00%

Publisher:

Abstract:

Individuals differ in their preference for processing information on the basis of taxonomic, feature-based similarity, or thematic, relation-based similarity. These differences, which have been investigated in a recently emerging research stream in cognitive psychology, affect innovative behavior and thus constitute an important antecedent of individual performance in research and development (R&D) that has been overlooked so far in the literature on innovation management. To fill this research gap, survey and test data from the employees of a multinational information technology services firm are used to examine the relationship between thematic thinking and R&D professionals' individual performance. A moderated mediation model is applied to investigate the proposed relationships of thematic thinking and individual-level performance indicators. Results show a positive relationship between thematic thinking and innovativeness, as well as individual job performance. While the results do not support the postulated moderation of the innovativeness–job performance relationship by employees' political skill, they show that the relationship between thematic thinking and job performance is fully mediated by R&D professionals' innovativeness. The present study is thus the first to reveal a positive relationship between thematic thinking and innovative performance.
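The mediation logic (thematic thinking → innovativeness → job performance) can be illustrated with a toy simulation. Variable names, effect sizes, and data are invented, and this plain OLS effect decomposition stands in for the moderated mediation model actually estimated in the study:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical standardized scores for n R&D professionals.
n = 200
thematic = rng.normal(size=n)                  # thematic-thinking score (X)
innov = 0.5 * thematic + rng.normal(size=n)    # innovativeness (mediator M)
job_perf = 0.6 * innov + rng.normal(size=n)    # job performance (outcome Y)

def ols_slope(x, y):
    """Slope from a simple OLS regression of y on x (with intercept)."""
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

# Mediation decomposition: total effect c, path a (X -> M), and from the
# two-predictor model the direct effect c' and path b (M -> Y given X).
c = ols_slope(thematic, job_perf)
a = ols_slope(thematic, innov)
X2 = np.column_stack([np.ones(n), thematic, innov])
coef = np.linalg.lstsq(X2, job_perf, rcond=None)[0]
c_prime, b = coef[1], coef[2]
print(f"total={c:.2f} indirect={a * b:.2f} direct={c_prime:.2f}")
```

Full mediation, as reported in the abstract, corresponds to a direct effect c' near zero once the indirect path a·b is accounted for; in OLS the identity c = c' + a·b holds exactly.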

Relevance:

100.00%

Publisher:

Abstract:

The research project is an extension of a series of administrative science and health care research projects evaluating the influence of external context, organizational strategy, and organizational structure upon organizational success or performance. The research rests on the assumption that there is no single best approach to the management of organizations (contingency theory). Because organizational effectiveness depends on an appropriate mix of factors, organizations may be equally effective based on differing combinations of factors. The external context of the organization is expected to influence internal organizational strategy and structure, and in turn these internal measures affect performance (discriminant theory). The research considers the relationship of external context and organization performance. The unit of study for the research is the health maintenance organization (HMO): an organization that accepts, in exchange for a fixed, advance capitation payment, contractual responsibility to assure the delivery of a stated range of health services to a voluntarily enrolled population. With the current Federal resurgence of interest in the HMO as a major component in the health care system, attention must be directed at maximizing the development of HMOs from the limited resources available. Increased skill is needed in both Federal and private evaluation of HMO feasibility, in order to prevent resource investment in projects that will fail while concurrently identifying potentially successful projects that would not be considered under current standards. The research considers 192 factors measuring the contextual milieu (social, educational, economic, legal, demographic, health and technological factors). Through intercorrelation and principal-components data reduction techniques these were reduced to 12 variables.
Two measures of HMO performance were identified: (1) HMO status (operational or defunct), and (2) a principal-components factor score combining eight measures of performance. The relationship between HMO context and performance was analysed using correlation and stepwise multiple regression methods. In each case it was concluded that the external contextual variables are not predictive of the success or failure of the study Health Maintenance Organizations. This suggests that the performance of an HMO may depend on internal organizational factors. These findings have policy implications, as contextual measures are used as a major determinant in HMO feasibility analysis and as a factor in the allocation of limited Federal funds.
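The data-reduction-then-regression pipeline described above can be sketched in a few lines. The matrix sizes mirror the study (192 contextual measures reduced to 12 variables), but all data are randomly generated stand-ins, and numpy's SVD replaces whatever statistical package the original work used:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical stand-in: 192 contextual measures observed for 100 HMOs,
# plus a continuous performance score per HMO.
X = rng.normal(size=(100, 192))
perf = rng.normal(size=100)

# Principal-components reduction: standardize, then project the centered
# data onto its top-k right singular vectors.
Z = (X - X.mean(0)) / X.std(0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
k = 12
scores = Z @ Vt[:k].T            # 12 component scores per HMO

# Regress performance on the reduced contextual variables; a near-zero
# R^2, as the study found, would suggest external context does not
# predict HMO success or failure.
A = np.column_stack([np.ones(len(scores)), scores])
beta, *_ = np.linalg.lstsq(A, perf, rcond=None)
resid = perf - A @ beta
r2 = 1 - (resid ** 2).sum() / ((perf - perf.mean()) ** 2).sum()
print(round(r2, 3))
```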

Relevance:

100.00%

Publisher:

Abstract:

El objetivo de la presente investigación es el desarrollo de un modelo de cálculo rápido, eficiente y preciso, para la estimación de los costes finales de construcción, en las fases preliminares del proyecto arquitectónico. Se trata de una herramienta a utilizar durante el proceso de elaboración de estudios previos, anteproyecto y proyecto básico, no siendo por tanto preciso para calcular el “predimensionado de costes” disponer de la total definición gráfica y literal del proyecto. Se parte de la hipótesis de que en la aplicación práctica del modelo no se producirán desviaciones superiores al 10 % sobre el coste final de la obra proyectada. Para ello se formulan en el modelo de predimensionado cinco niveles de estimación de costes, de menor a mayor definición conceptual y gráfica del proyecto arquitectónico. Los cinco niveles de cálculo son: dos que toman como referencia los valores “exógenos” de venta de las viviendas (promoción inicial y promoción básica) y tres basados en cálculos de costes “endógenos” de la obra proyectada (estudios previos, anteproyecto y proyecto básico). El primer nivel de estimación de carácter “exógeno” (nivel .1) se calcula en base a la valoración de mercado de la promoción inmobiliaria y a su porcentaje de repercusión de suelo sobre el valor de venta de las viviendas. El quinto nivel de valoración, también de carácter “exógeno” (nivel .5), se calcula a partir del contraste entre el valor externo básico de mercado, los costes de construcción y los gastos de promoción estimados de la obra proyectada.
Este contraste entre la “repercusión del coste de construcción” y el valor de mercado, supone una innovación respecto a los modelos de predimensionado de costes existentes, como proceso metodológico de verificación y validación extrínseca, de la precisión y validez de las estimaciones resultantes de la aplicación práctica del modelo, que se denomina Pcr.5n (Predimensionado costes de referencia con .5niveles de cálculo según fase de definición proyectual / ideación arquitectónica). Los otros tres niveles de predimensionado de costes de construcción “endógenos”, se estiman mediante cálculos analíticos internos por unidades de obra y cálculos sintéticos por sistemas constructivos y espacios funcionales, lo que se lleva a cabo en las etapas iniciales del proyecto correspondientes a estudios previos (nivel .2), anteproyecto (nivel .3) y proyecto básico (nivel .4). Estos cálculos teóricos internos son finalmente evaluados y validados mediante la aplicación práctica del modelo en obras de edificación residencial, de las que se conocen sus costes reales de liquidación final de obra. Según va evolucionando y se incrementa el nivel de definición y desarrollo del proyecto, desde los estudios previos hasta el proyecto básico, el cálculo se va perfeccionando en su nivel de eficiencia y precisión de la estimación, según la metodología aplicada: [aproximaciones sucesivas en intervalos finitos], siendo la hipótesis básica como anteriormente se ha avanzado, lograr una desviación máxima de una décima parte en el cálculo estimativo del predimensionado del coste real de obra. El cálculo del coste de ejecución material de la obra, se desarrolla en base a parámetros cúbicos funcionales “tridimensionales” del espacio proyectado y parámetros métricos constructivos “bidimensionales” de la envolvente exterior de cubierta/fachada y de la huella del edificio sobre el terreno. 
Los costes funcionales y constructivos se ponderan en cada fase del proceso de cálculo con sus parámetros “temáticos/específicos” de gestión (Pg), proyecto (Pp) y ejecución (Pe) de la concreta obra presupuestada, para finalmente estimar el coste de construcción por contrata, como resultado de incrementar al coste de ejecución material el porcentaje correspondiente al parámetro temático/específico de la obra proyectada. El modelo de predimensionado de costes de construcción Pcr.5n será una herramienta de gran interés y utilidad en el ámbito profesional, para la estimación del coste correspondiente al Proyecto Básico previsto en el marco técnico y legal de aplicación. Según el Anejo I del Código Técnico de la Edificación (CTE), es de obligado cumplimiento que el proyecto básico contenga una “Valoración aproximada de la ejecución material de la obra proyectada por capítulos”, es decir, que el Proyecto Básico ha de contener al menos un “presupuesto aproximado”, por capítulos, oficios o tecnologías. El referido cálculo aproximado del presupuesto en el Proyecto Básico necesariamente se ha de realizar mediante la técnica del predimensionado de costes, dado que en esta fase del proyecto arquitectónico aún no se dispone de cálculos de estructura, planos de acondicionamiento e instalaciones, ni de la resolución constructiva de la envolvente, por cuanto no se han desarrollado las especificaciones propias del posterior proyecto de ejecución. Esta estimación aproximada del coste de la obra es sencilla de calcular mediante la aplicación práctica del modelo desarrollado, y ello tanto para estudiantes como para profesionales del sector de la construcción. Como se contiene y justifica en el presente trabajo, la aplicación práctica del modelo para el cálculo de costes en las fases preliminares del proyecto es rápida y certera, siendo de sencilla aplicación tanto en vivienda unifamiliar (aisladas y pareadas), como en viviendas colectivas (bloques y manzanas).
También, el modelo es de aplicación en el ámbito de la valoración inmobiliaria, tasaciones, análisis de viabilidad económica de promociones inmobiliarias, estimación de costes de obras terminadas y, en general, cuando no se dispone del proyecto de ejecución y sea preciso calcular los costes de construcción de las obras proyectadas. Además, el modelo puede ser de aplicación para el chequeo de presupuestos calculados por el método analítico tradicional (estado de mediciones pormenorizadas por sus precios unitarios y costes descompuestos), tanto en obras de iniciativa privada como en obras promovidas por las Administraciones Públicas. Por último, como líneas abiertas a futuras investigaciones, el modelo de “predimensionado costes de referencia 5 niveles de cálculo” se podría adaptar y aplicar para otros usos y tipologías diferentes a la residencial, como edificios de equipamientos y dotaciones públicas, valoración de edificios históricos, obras de urbanización interior y exterior de parcela, proyectos de parques y jardines, etc. Estas líneas de investigación suponen trabajos paralelos al aquí desarrollado, y a modo de avance parcial se recogen en las comunicaciones presentadas en los Congresos internacionales Scieconf/Junio 2013, Rics-Cobra/Septiembre 2013 y en el IV Congreso nacional de patología en la edificación-Ucam/Abril 2014.

ABSTRACT

The aim of this research is to develop a fast, efficient and accurate calculation model to estimate the final costs of construction during the preliminary stages of the architectural project. It is a tool to be used during the preliminary study, drafting and basic project stages; it is therefore not necessary to have the complete graphic and written definition of the project in order to calculate the cost-scaling. The working hypothesis is that the practical application of the model will not produce deviations of more than 10% from the final cost of the projected work.
To that purpose, five levels of cost estimation are formulated in the scaling model, from a lower to a higher conceptual and graphic definition of the architectural project. The five calculation levels are: two that take as their point of reference the "exogenous" values of house sales (initial development and basic development), and three based on calculations of "endogenous" costs (preliminary study, drafting and basic project). The first "exogenous" estimation level (level .1) is calculated from the market valuation of the real estate development and the proportion the cost of land has over the value of the houses. The fifth level of valuation, also an "exogenous" one (level .5), is calculated from the contrast between the basic external market value, the construction costs, and the estimated development costs of the projected work. This contrast between the "repercussion of construction costs" and the market value is an innovation with respect to the existing cost-scaling models, serving as a methodological process of extrinsic verification and validation of the accuracy and validity of the estimations obtained from the practical application of the model, which is called Pcr.5n (reference cost-scaling with .5 calculation levels according to the stage of project definition / architectural conceptualization). The other three levels of "endogenous" construction cost-scaling are estimated from internal analytical calculations by project units and synthetic calculations by construction systems and functional spaces. This is performed during the initial stages of the project, corresponding to the preliminary study (level .2), drafting (level .3) and basic project (level .4). These theoretical internal calculations are finally evaluated and validated through the practical application of the model to residential buildings whose real costs on final settlement of the works are known.
As the level of definition and development of the project evolves, from preliminary study to basic project, the calculation improves in its efficiency and estimation accuracy, following the applied methodology of [successive approximations at finite intervals]. The basic hypothesis, as stated above, is to achieve a maximum deviation of one tenth in the cost-scaling estimate of the real cost of the work. The cost of material execution of the works is calculated from functional "three-dimensional" cubic parameters of the planned space and constructive "two-dimensional" metric parameters of the outer roof/facade envelope and of the building's footprint on the plot. The functional and construction costs are weighted at every stage of the calculation process with the "thematic/specific" parameters of management (Pg), project (Pp) and execution (Pe) of the specific work being budgeted, and finally the contract construction cost is estimated as the result of increasing the cost of material execution by the percentage corresponding to the thematic/specific parameter of the projected work. The construction cost-scaling model Pcr.5n will be a useful tool of great interest in the professional field for estimating the cost of the Basic Project as prescribed in the applicable technical and legal framework. According to Annex I of the Technical Building Code (CTE), it is compulsory for the basic project to contain an "approximate valuation of the material execution of the projected work, by chapters", that is, the basic project must contain at least an "approximate estimate" by chapters, trades or technologies.
This approximate estimate in the Basic Project has to be performed through the cost-scaling technique, given that structural calculations, services and fit-out drawings, and the definitive construction details of the envelope are still not available at this stage of the architectural project, insofar as the specifications belonging to the later execution project have not yet been developed. This approximate estimate of the cost of the works is easy to calculate through the practical application of the model, both for students and for professionals of the building sector. As explained and justified in this work, the application of the model for cost-scaling during the preliminary stages is fast and accurate, as well as easy to apply both to single-family houses (detached and semi-detached) and to collective housing (blocks). The model can also be applied in the field of real-estate valuation, official appraisals, analysis of the economic viability of real estate developments, estimation of the cost of finished works and, in general, whenever an execution project is not available and it is necessary to calculate the building costs of the projected works. The model can likewise be used to check estimates calculated by the traditional analytical method (detailed bills of quantities with unit prices and cost breakdowns), both in private works and in works promoted by Public Authorities. Finally, as open lines for future research, the "reference cost-scaling with 5 calculation levels" model could be adapted and applied to uses and typologies other than the residential one, such as service buildings and public facilities, valuation of historical buildings, interior and exterior site development works, park and garden projects, etc. These lines of research run parallel to the one developed here and, by way of a partial preview, are presented in the papers delivered at the international congresses Scieconf/June 2013, Rics-Cobra/September 2013 and at the IV National Congress on Building Pathology-Ucam/April 2014.
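The contract-cost step described in the abstract (material-execution cost increased by a thematic/specific percentage, with a stated 10% maximum-deviation hypothesis) can be sketched as follows. The abstract does not specify how Pg, Pp and Pe combine, so an additive rule is assumed, and all figures are illustrative, not taken from the thesis:

```python
def contract_cost(pem_eur, pg, pp, pe):
    """Contract construction cost from the material-execution cost (PEM).
    pg, pp, pe: management, project and execution parameters, as fractions;
    an additive combination is assumed here (the source does not say)."""
    return pem_eur * (1.0 + pg + pp + pe)

def within_tolerance(estimate_eur, final_cost_eur, tol=0.10):
    """The model's working hypothesis: the estimate deviates from the
    real final cost of the work by at most 10%."""
    return abs(estimate_eur - final_cost_eur) / final_cost_eur <= tol

# Illustrative figures only:
pem = 600_000.0                                  # material-execution cost, euros
estimate = contract_cost(pem, pg=0.13, pp=0.06, pe=0.04)
print(estimate, within_tolerance(estimate, 750_000.0))
```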

Relevance:

100.00%

Publisher:

Abstract:

Este proyecto fin de grado presenta dos herramientas, Papify y Papify-Viewer, para medir y visualizar, respectivamente, las prestaciones a bajo nivel de especificaciones RVC-CAL basándose en eventos hardware. RVC-CAL es un lenguaje de flujo de datos estandarizado por MPEG y utilizado para definir herramientas relacionadas con la codificación de vídeo. La estructura de los programas descritos en RVC-CAL se basa en unidades funcionales llamadas actores, que a su vez se subdividen en funciones o procedimientos llamados acciones. ORCC (Open RVC-CAL Compiler) es un compilador de código abierto que utiliza como entrada descripciones RVC-CAL y genera a partir de ellas código fuente en un lenguaje dado, como por ejemplo C. Internamente, el compilador ORCC se divide en tres etapas distinguibles: front-end, middle-end y back-end. La implementación de Papify consiste en modificar la etapa del back-end del compilador, encargada de la generación de código, de modo tal que los actores, al ser traducidos a lenguaje C, queden instrumentados con PAPI (Performance Application Programming Interface), una herramienta utilizada como interfaz a los registros contadores de rendimiento (PMC) de los procesadores. Además, también se modifica el front-end para permitir identificar cierto tipo de anotaciones en las descripciones RVC-CAL, utilizadas para que el diseñador pueda indicar qué actores o acciones en particular se desean analizar. Los actores instrumentados, además de conservar su funcionalidad original, generan una serie de ficheros que contienen datos sobre los distintos eventos hardware que suceden a lo largo de su ejecución. Los eventos incluidos en estos ficheros son configurables dentro de las anotaciones previamente mencionadas.
La segunda herramienta, Papify-Viewer, utiliza los datos generados por Papify y los procesa, obteniendo una representación visual de la información a dos niveles: por un lado, representa cronológicamente la ejecución de la aplicación, distinguiendo cada uno de los actores a lo largo de la misma. Por otro lado, genera estadísticas sobre la cantidad de eventos disparados por acción, actor o núcleo de ejecución y las representa mediante gráficos de barra. Ambas herramientas pueden ser utilizadas en conjunto para verificar el funcionamiento del programa, balancear la carga de los actores o la distribución por núcleos de los mismos, mejorar el rendimiento y diagnosticar problemas.

ABSTRACT

This diploma project presents two tools, Papify and Papify-Viewer, used to measure and visualize the low-level performance of RVC-CAL specifications based on hardware events. RVC-CAL is a dataflow language standardized by MPEG which is used to define video codec tools. The structure of the applications described in RVC-CAL is based on functional units called actors, which are in turn divided into smaller procedures called actions. ORCC (Open RVC-CAL Compiler) is an open-source compiler capable of transforming RVC-CAL descriptions into source code in a given language, such as C. Internally, the compiler is divided into three distinguishable stages: front-end, middle-end and back-end. Papify's implementation consists of modifying the compiler's back-end stage, which is responsible for generating the final source code, so that actors translated into C code are instrumented with PAPI (Performance Application Programming Interface), a tool that provides an interface to the microprocessor's performance monitoring counters (PMC). In addition, the front-end is also modified in such a way that it allows the identification of a certain type of annotation in the RVC-CAL descriptions, allowing the designer to set the actors or actions to be included in the measurement.
Besides preserving their initial behavior, the instrumented actors will also generate a set of files containing data about the different events triggered throughout the program's execution. The events included in these files can be configured inside the previously mentioned annotations. The second tool, Papify-Viewer, processes the files generated by Papify and provides a visual representation of the information in two different ways: on the one hand, a chronological representation of the application's execution where each actor has its own timeline; on the other hand, statistical information about the number of triggered events per action, actor or core. Both tools can be used together to verify the normal functioning of the program, balance the load between actors or cores, improve performance and identify problems.
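The Papify-Viewer aggregation step (per-actor and per-action event totals for the bar charts) can be sketched as follows. The actual file format Papify writes is not given in the text, so a simple (actor, action, event, count) record layout and all names and counts are assumptions:

```python
from collections import defaultdict

# Hypothetical records, as if parsed from Papify's output files.
records = [
    ("decoder", "parse_header", "PAPI_TOT_CYC", 12_400),
    ("decoder", "parse_header", "PAPI_L1_DCM",     310),
    ("decoder", "decode_block", "PAPI_TOT_CYC", 98_700),
    ("display", "render",       "PAPI_TOT_CYC", 45_100),
]

def totals_by(records, level):
    """Aggregate hardware-event counts per actor (level=0) or per action
    (level=1), mirroring the per-actor/per-action statistics described
    above."""
    sums = defaultdict(int)
    for actor, action, event, count in records:
        key = (actor, event) if level == 0 else (actor, action, event)
        sums[key] += count
    return dict(sums)

per_actor = totals_by(records, level=0)
print(per_actor[("decoder", "PAPI_TOT_CYC")])   # 12400 + 98700
```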

Relevance:

100.00%

Publisher:

Abstract:

Nodulation in legumes provides a major conduit of available nitrogen into the biosphere. The development of nitrogen-fixing nodules results from a symbiotic interaction between soil bacteria, commonly called rhizobia, and legume plants. Molecular genetic analysis in both model and agriculturally important legume species has resulted in the identification of a variety of genes that are essential for the establishment, maintenance and regulation of this symbiosis. Autoregulation of nodulation (AON) is a major internal process by which nodule numbers are controlled through prior nodulation events. Characterisation of AON-deficient mutants has revealed a novel systemic signal transduction pathway controlled by a receptor-like kinase. This review reports our present level of understanding on the short- and long-distance signalling networks controlling early nodulation events and AON.

Relevance:

100.00%

Publisher:

Abstract:

Hospital employees, who work in an environment with zero tolerance for error, face several stressors that may result in psychological, physiological, and behavioural strains and, subsequently, in suboptimal performance. This thesis includes two studies which investigate the stressor-to-strain-to-performance relationships in hospitals. The first study is a cross-sectional, multi-group investigation based on secondary data from 65,142 respondents in 172 acute/specialist UK NHS trusts. The model proposes that senior management leadership predicts social support and job design which, in turn, moderate stressor-to-strain relationships across team structures. The results confirm the model's robustness. Regression analysis provides support for the main effects and minimal support for the moderation hypotheses. Based on its conclusions and inherent limitations, study one therefore lays the framework for study two. The second study is a cross-sectional, multilevel investigation of the strain-reducing effects of the social environment on externally rated unit-level performance, based on primary data from 1,137 employees in 136 units of a hospital in Malta. The term "social environment" refers to the prediction of the moderator variables, namely social support and decision latitude/control, by transformational leadership and team climate across hospital units. This study demonstrates that transformational leadership is positively associated with social support, whereas team climate is positively associated with both moderators. At the same time, it identifies a number of moderating effects which social support and decision latitude/control, both separately and together, have on specific stressor-to-strain relationships. The results show significant mediated stressor-to-strain-to-performance relationships.
Furthermore, at the higher level, unit-level performance is positively associated with shared unit-level team climate and with unit-level vision, the latter being one of the five sub-dimensions of transformational leadership. At the same time, performance is also positively related to both transformational leadership and team climate when the two constructs are tested together. Few studies have linked the buffering effects of the social environment in occupational stress with performance. This research therefore strives to make a significant contribution to the occupational stress and performance literature with a focus on hospital practice. Indeed, the study highlights the wide-ranging and far-reaching implications that these findings have for theory, management, and practice.

Relevance:

100.00%

Publisher:

Abstract:

This thesis investigates the modelling of drying processes for the promotion of market-led Demand Side Management (DSM) as applied to the UK Public Electricity Suppliers. A review of DSM in the electricity supply industry is provided, together with a discussion of the relevant drivers supporting market-led DSM and energy services (ES). The potential opportunities for ES in a fully deregulated energy market are outlined. It is suggested that targeted industrial sector energy efficiency schemes offer significant opportunity for long term customer and supplier benefit. On a process level, industrial drying is highlighted as offering significant scope for the application of energy services. Drying is an energy-intensive process used widely throughout industry. The results of an energy survey suggest that 17.7 per cent of total UK industrial energy use derives from drying processes. Comparison with published work indicates that energy use for drying shows an increasing trend against a background of reducing overall industrial energy use. Airless drying is highlighted as offering potential energy saving and production benefits to industry. To this end, a comprehensive review of the novel airless drying technology and its background theory is made. Advantages and disadvantages of airless operation are defined and the limited market penetration of airless drying is identified, as are the key opportunities for energy saving. Limited literature has been found which details the modelling of energy use for airless drying. A review of drying theory and previous modelling work is made in an attempt to model energy consumption for drying processes. The history of drying models is presented as well as a discussion of the different approaches taken and their relative merits. The viability of deriving energy use from empirical drying data is examined. 
Adaptive neuro-fuzzy inference systems (ANFIS) are successfully applied to the modelling of drying rates for three drying technologies, namely convective air, heat pump and airless drying. The ANFIS systems are then integrated into a novel energy services model for the prediction of relative drying times, energy costs and atmospheric carbon dioxide emission levels. The author believes that this work constitutes the first use of fuzzy systems for the modelling of drying performance as an energy services approach to DSM. To gain an insight into the 'real world' use of energy for drying, this thesis presents a unique first-order energy audit of every ceramic sanitaryware manufacturing site in the UK. Previously unknown patterns of energy use are highlighted. Supplementary comments on the timing and use of drying systems are also made. The limitations of such large-scope energy surveys are discussed.
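ANFIS networks implement first-order Takagi-Sugeno fuzzy inference, whose evaluation step can be shown with a toy two-rule system mapping drying-air temperature to a drying rate. The membership shapes, rule consequents, and all numbers are invented for illustration; the thesis's trained models are far richer:

```python
def trimf(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def drying_rate(temp_c):
    """Weighted average of two linear (Sugeno) rule consequents:
    IF temp is LOW  THEN rate = 0.02*temp + 0.1
    IF temp is HIGH THEN rate = 0.05*temp - 1.0
    In ANFIS, the memberships and consequent coefficients are the
    parameters fitted to the empirical drying data."""
    w_low = trimf(temp_c, 10.0, 30.0, 60.0)
    w_high = trimf(temp_c, 40.0, 80.0, 110.0)
    if w_low + w_high == 0.0:
        return 0.0                      # outside all rule supports
    y_low = 0.02 * temp_c + 0.1
    y_high = 0.05 * temp_c - 1.0
    return (w_low * y_low + w_high * y_high) / (w_low + w_high)

print(round(drying_rate(50.0), 3))
```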

Relevance:

100.00%

Publisher:

Abstract:

The purpose of this paper is to delineate a green supply chain (GSC) performance measurement framework using an intra-organisational collaborative decision-making (CDM) approach. A fuzzy analytic network process (ANP)-based green balanced scorecard (GrBSc) has been used within the CDM approach to assist in arriving at a consistent, accurate and timely data flow across all cross-functional areas of a business. A green causal relationship is established and linked to the fuzzy ANP approach. The causal relationship involves organisational commitment, eco-design, GSC process, social performance and sustainable performance constructs. Sub-constructs and sub-sub-constructs are also identified and linked to the causal relationship to form a network. The fuzzy ANP approach suitably handles the vagueness of the linguistic information in the CDM approach. The CDM approach is implemented in a UK-based carpet-manufacturing firm. The performance measurement approach, in addition to traditional financial performance and accounting measures, aids firms' decision-making with regard to overall organisational goals. The implemented approach assists the firm in identifying further requirements for collaborative data across the supply chain and for information about customers and markets. Overall, the CDM-based GrBSc approach assists managers in deciding whether suppliers' performance meets industry and environmental standards with effective human resources. © 2013 Taylor & Francis.
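The numerical core of any ANP-based framework like this is the limit supermatrix: a column-stochastic matrix of interdependence weights is raised to successive powers until its columns converge to the global priorities. The 3x3 matrix below is invented for illustration and does not represent the paper's constructs:

```python
import numpy as np

def limit_priorities(W, tol=1e-10, max_iter=10_000):
    """Raise a column-stochastic supermatrix to successive powers until it
    converges; any column of the limit matrix then gives the global
    priority vector."""
    P = W.copy()
    for _ in range(max_iter):
        P_next = P @ W
        if np.max(np.abs(P_next - P)) < tol:
            return P_next[:, 0]
        P = P_next
    raise RuntimeError("supermatrix did not converge")

# Toy interdependence weights for three clusters (columns sum to 1):
W = np.array([[0.2, 0.5, 0.3],
              [0.5, 0.2, 0.4],
              [0.3, 0.3, 0.3]])
p = limit_priorities(W)
print(np.round(p, 3))
```

The fuzzy extension used in the paper would derive the entries of W from fuzzy pairwise-comparison judgements before this limit step.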

Relevância:

100.00% 100.00%

Publicador:

Resumo:

Using plant-level data from a global survey with multiple time frames, the first begun in the late 1990s, this paper introduces measures of supply chain integration (SCI) and discusses the dynamic relationship between the level of integration and a set of internal and external performance measurements. Specifically, data from Hungary, the Netherlands and the People's Republic of China are used in the analyses. The time frames considered range from the late 1990s to 2009, encompassing major changes and transitions. Our results indicate that SCI has an underlying structure of four sets of indicators, namely: (1) delivery frequency from the supplier or to the customer; (2) sharing internal processes with suppliers; (3) sharing internal processes with buyers; and (4) joint facility location with partners. The differences between groups in terms of several performance measures proved to be small and mostly statistically insignificant, but the ANOVA results suggest that, in this sample of companies, those having a joint location with their partners outperform the others.
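The group comparison the abstract refers to is a one-way ANOVA across integration groups. A minimal hand-computed sketch, with invented performance scores for plants with and without a joint location (two groups rather than the study's full design):

```python
import numpy as np

def one_way_anova(groups):
    """F statistic for a one-way ANOVA over a list of sample arrays."""
    all_x = np.concatenate(groups)
    grand = all_x.mean()
    k, n = len(groups), len(all_x)
    ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# invented on-time-delivery scores (1-5 scale)
joint = np.array([4.1, 4.3, 3.9, 4.5, 4.2])   # joint location with partners
other = np.array([3.6, 3.8, 3.5, 4.0, 3.7])   # no joint location
print(round(one_way_anova([joint, other]), 2))  # → 13.24
```

The F statistic would then be compared against the F distribution with (k-1, n-k) degrees of freedom to judge significance, which is what the paper's ANOVA table reports.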

Relevância:

100.00% 100.00%

Publicador:

Resumo:

Improving the performance of public administration and reforming the public financing system have been on the agenda in Hungary for many years, in accordance with international trends. However, governments have not expected or supported the creation of a performance-oriented public administration in a comprehensive and explicit way. Nevertheless, there are bottom-up initiatives at the organizational level which target performance-oriented organizational functioning. The research focuses on organizations of central public administration where, based on the international literature, the successful application of performance management methods is most likely. These are the so-called agency-type organizations, which in Hungary comprise autonomous state-administration organizations independent of the Government (e.g. the Hungarian Competition Authority), government bureaus (e.g. the Hungarian Central Statistical Office), and central offices subordinated to the government, either the cabinet or a ministry (e.g. the Hungarian Meteorological Service). The studied agencies are legally independent organizations with managerial autonomy based on public law. The purpose of this study is to provide an overview of the organizational-level performance management tools applied by Hungarian agencies and to reveal the reasons for and drivers of the application of these tools. The empirical research is based on a mixed-methods approach which combines quantitative methods and qualitative procedures. The first, quantitative, phase of the author's research was a content analysis of the homepages of the studied organizations, which yielded information about all agencies and their practice with respect to several performance management tools. The second, qualitative, phase was based on semi-structured face-to-face interviews with senior managers of selected agencies. The author selected the interviewees based on the results of the first phase, relatively strong performance orientation being an important selection criterion.

Relevância:

100.00% 100.00%

Publicador:

Resumo:

Recently, researchers have begun to investigate the benefits of cross-training teams. It has been hypothesized that cross-training should help improve team processes and team performance (Cannon-Bowers, Salas, Blickensderfer, & Bowers, 1998; Travillian, Volpe, Cannon-Bowers, & Salas, 1993). The current study extends previous research by examining different methods of cross-training (positional clarification and positional modeling) and the impact they have on team process and performance in both more complex and less complex environments. One hundred and thirty-five psychology undergraduates were placed in 45 three-person teams. Participants were randomly assigned to roles within teams. Teams were asked to “fly” a series of missions on a PC-based helicopter flight simulation. Results suggest that cross-training improves team mental model accuracy and similarity. Accuracy of team mental models was found to be a predictor of coordination quality, but similarity of team mental models was not. Neither similarity nor accuracy of team mental models was found to be a predictor of backup behavior (quality and quantity). As expected, both team coordination (quality) and backup behaviors (quantity and quality) were significant predictors of overall team performance. Contrary to expectations, there was no interaction between cross-training and environmental complexity. Results from this study further cross-training research by establishing positional clarification and positional modeling as training strategies for improving team performance.
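Team-mental-model accuracy and similarity are commonly operationalised as correlations between members' task-relatedness ratings and an expert referent (accuracy) or another member (similarity); a sketch under that assumption, with hypothetical 3x3 rating matrices rather than the study's data:

```python
import numpy as np

def upper(m):
    """Off-diagonal upper-triangle entries of a relatedness matrix."""
    i, j = np.triu_indices_from(m, k=1)
    return m[i, j]

def mm_accuracy(member, expert):
    """Accuracy: correlation of a member's ratings with the expert referent."""
    return float(np.corrcoef(upper(member), upper(expert))[0, 1])

def mm_similarity(a, b):
    """Similarity: correlation between two members' ratings."""
    return float(np.corrcoef(upper(a), upper(b))[0, 1])

# hypothetical relatedness ratings over three task elements
expert = np.array([[1, .9, .2], [.9, 1, .4], [.2, .4, 1]])
m1     = np.array([[1, .8, .3], [.8, 1, .5], [.3, .5, 1]])
m2     = np.array([[1, .7, .6], [.7, 1, .6], [.6, .6, 1]])

print(round(mm_accuracy(m1, expert), 3), round(mm_similarity(m1, m2), 3))
```

The study's finding that accuracy, not similarity, predicted coordination quality would then follow from regressing coordination scores on these two indices.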

Relevância:

100.00% 100.00%

Publicador:

Resumo:

This paper addresses the two opposing extremes of standardisation in franchising and the dynamics of sales, in search of a juncture point that reduces franchisees’ uncertainty in sales and improves sales performance. A conceptual framework is developed based on both theory and practice in order to investigate the sales process of a specific franchise network. The research is conducted over a period of six weeks in the form of a customised sales report based on the sales funnel concept and performance indicators along the sales process. The quantitative data received are analysed through descriptive statistics and logistic regressions to establish what variations in the sales process can be discovered and which practices yield higher performance. The results indicate an advantage of a prioritisation guideline for a salesperson’s activities and choices over strict standardisation. Defining the sales funnel and engaging in the process of monitoring sales has in itself proven to be a way of reducing uncertainty, as franchisor and franchisees alike gain a greater understanding of the process. The knowledge gained from this research yields both practical and theoretical implications and expands the knowledge on the standardisation of sales and on the appropriateness of the sales funnel and its management for dealing with the dilemma between standardisation and flexibility of sales in franchising contexts.
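The logistic regressions mentioned relate sales-process variables to a binary outcome such as winning a deal. A minimal sketch of that kind of model, fitted with plain gradient descent on invented lead records (the features and data are hypothetical, not the paper's report fields):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lr=0.1, steps=5000):
    """Gradient-descent logistic regression: P(deal won) from funnel features."""
    Xb = np.c_[np.ones(len(X)), X]          # prepend intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        grad = Xb.T @ (sigmoid(Xb @ w) - y) / len(y)
        w -= lr * grad
    return w

# invented leads: [follow-up calls, demo given?] -> deal won?
X = np.array([[1, 0], [2, 0], [2, 1], [3, 1], [4, 1], [5, 1], [1, 1], [4, 0]])
y = np.array([0, 0, 0, 1, 1, 1, 0, 1])
w = fit_logistic(X, y)
print(round(float(sigmoid(np.r_[1.0, 4.0, 1.0] @ w)), 2))  # P(win | 4 calls, demo)
```

In practice one would read the sign and size of each coefficient to see which funnel activities are associated with winning, which is how such regressions support a prioritisation guideline.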

Relevância:

100.00% 100.00%

Publicador:

Resumo:

The dissertation investigates the artistic existence of human beings with different bodies in society at different historical moments. Within this scenario, the study describes how stigmas are produced, established and spread, and how they interfere with sociability between human beings regarded as normal and those with different bodies. With regard to the scenic arts, the text discusses the participation of artists with different bodies on stage, specifically in the freak show and in postdramatic theatre. It also examines aspects of the biography and work of the Mexican artist Frida Kahlo, which underpin the methodological procedures and contribute to the creative process of the performance Kahlo em mim Eu e(m) Kahlo, the scenic practice investigated in this dissertation.