890 results for key performance indicator(s)
Abstract:
Corporate social responsibility (CSR) is today a widely recognized concept that is rapidly gaining popularity, especially in the business world. The pressure on companies to conduct their business in ethical ways that promote the wellbeing of the environment and society is coming from all directions and all stakeholders. Alstom, a French multinational conglomerate operating in the rail transport and energy industries, is no exception to this norm. The company, which is used as the case example in this thesis, is under growing pressure to engage in CSR practices and to do business to high ethical standards. That CSR is being put on a pedestal is by no means a negative phenomenon – quite the opposite. Rather than practicing CSR merely as window dressing to meet stakeholder requirements, many corporations actually strive to benefit from the practice of corporate social business. In addition to benefiting external parties, a corporation such as Alstom can itself gain from being involved in CSR. The purpose of this thesis is to evaluate the current strategic value and future perspectives of CSR at Alstom, and moreover the added value that the practice of CSR could bring to Alstom as a business. A set of perspectives from a futures studies viewpoint is examined, together with a critical review of the company's current corporate practices and of the CSR-related studies and theories written for corporations. On this basis, solutions and practices are suggested to Alstom so that it can fully exploit the potential of corporate social business and the value it can bring in the most probable futures the company is expected to face.
By utilizing the Soft Systems Methodology (SSM), a method mainly used in organizations to address problematic issues in management and policy contexts, a process is developed to identify improvements that would help Alstom embed CSR in its business practices even more than it currently does. Alstom is already deeply involved in the practice of CSR, and its vision places a strong emphasis on this popular concept. To stay in the game and to use CSR as a competitive advantage, Alstom ought to embed corporate social practices even deeper in its organizational culture, using them as a tool to reduce risk and costs, increase employee commitment and customer loyalty, and attract socially responsible investors, to name a few. CSR as a concept is seen to have great potential in the future – an opportunity Alstom will not miss.
Abstract:
This paper deals with the energy consumption and performance evaluation of air supply systems for a ventilated room, involving high- and low-level supplies. The energy performance assessment is based on the airflow rate, which is related to the fan power consumption when achieving the same environmental quality for each case. Four different ventilation systems are considered: wall displacement ventilation, confluent jets ventilation, impinging jet ventilation and a high-level mixing ventilation system. The ventilation performance of these systems is examined by requiring the same Air Distribution Index (ADI) for the different cases. The widely used high-level supplies require much more fan power than low-level supplies to achieve the same value of ADI. In addition, the supply velocity, and hence the supply dynamic pressure, for a high-level supply is much larger than for low-level supplies, which further increases the power consumption of high-level supply systems. The paper considers these factors and attempts to provide some guidelines on the difference in energy consumption between high- and low-level air supply systems. This will be useful information for designers, and to the authors' knowledge little information is available in the literature on this area of room air distribution. The energy performance of the above-mentioned ventilation systems has been evaluated on the basis of the fan power consumed, which is related to the airflow rate required to provide an equivalent indoor environment. The Air Distribution Index (ADI) is used to evaluate the indoor environment produced in the room by the ventilation strategy being used. The results reveal that mixing ventilation requires the highest fan power and confluent jets ventilation the lowest in order to achieve nearly the same value of ADI.
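The fan-power comparison above rests on the basic relation P = Q·Δp/η: electrical fan power grows with both the airflow rate and the pressure rise the fan must deliver. A minimal sketch of this arithmetic follows; the airflow, pressure and efficiency figures are purely illustrative assumptions, not values from the paper:

```python
def fan_power(airflow_m3_s: float, pressure_rise_pa: float, fan_efficiency: float = 0.6) -> float:
    """Electrical fan power (W) = volumetric flow rate * pressure rise / fan efficiency."""
    return airflow_m3_s * pressure_rise_pa / fan_efficiency

# Illustrative (hypothetical) figures: a high-level mixing supply typically needs
# both a larger airflow and a larger supply dynamic pressure than a low-level
# displacement supply to reach the same ADI.
p_high = fan_power(airflow_m3_s=0.15, pressure_rise_pa=120.0)  # high-level mixing supply
p_low = fan_power(airflow_m3_s=0.08, pressure_rise_pa=40.0)    # low-level displacement supply
print(round(p_high, 1), round(p_low, 1))  # → 30.0 5.3
```

With these assumed numbers the high-level supply draws several times the fan power of the low-level one, which is the qualitative ordering the paper reports.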
How self-determined choice facilitates performance: a key role of the ventromedial prefrontal cortex
Abstract:
Recent studies have documented that self-determined choice does indeed enhance performance. However, the precise neural mechanisms underlying this effect are not well understood. We examined the neural correlates of the facilitative effects of self-determined choice using functional magnetic resonance imaging (fMRI). Participants played a game-like stopwatch task with either a stopwatch they selected (self-determined-choice condition) or one assigned to them without choice (forced-choice condition). Our results showed that self-determined choice enhanced performance on the stopwatch task, despite the fact that the choices were clearly irrelevant to task difficulty. Neuroimaging results showed that failure feedback, compared with success feedback, elicited a drop in ventromedial prefrontal cortex (vmPFC) activation in the forced-choice condition, but not in the self-determined-choice condition, indicating that the negative reward value associated with failure feedback vanished in the self-determined-choice condition. Moreover, the vmPFC resilience to failure in the self-determined-choice condition was significantly correlated with the increased performance. Striatal responses to failure and success feedback were not modulated by the choice condition, indicating a dissociation between the vmPFC and striatal activation patterns. These findings suggest that the vmPFC plays a unique and critical role in the facilitative effects of self-determined choice on performance.
Abstract:
There is increasing recognition that agricultural landscapes meet multiple societal needs and demands beyond the provision of economic and environmental goods and services. Accordingly, there have been significant calls for the inclusion of societal, amenity and cultural values in agri-environmental landscape indicators to assist policy makers in monitoring the wider impacts of land-based policies. However, capturing the amenity and cultural values that rural agrarian areas provide, by use of such indicators, presents significant challenges. The EU social awareness of landscape indicator represents a new class of generalized social indicator using a top-down methodology to capture the social dimensions of landscape without reference to the specific structural and cultural characteristics of individual landscapes. This paper reviews this indicator in the context of existing agri-environmental indicators and their differing design concepts. Using a stakeholder consultation approach in five case study regions, the potential and limitations of the indicator are evaluated, with a particular focus on its perceived meaning, utility and performance in the context of different user groups and at different geographical scales. This analysis supplements previous EU-wide assessments through regional-scale assessment of the limitations and potentialities of the indicator and of the need for further data collection. The evaluation finds that the perceived meaning of the indicator does not vary with scale, but, in common with all mapped indicators, the usefulness of the indicator to different user groups does change with the scale of presentation. The indicator is viewed as most useful when presented at the scale of governance at which end users operate. The relevance of the different sub-components of the indicator is also found to vary across regions.
Abstract:
Introduction: The objective of this study was to analyze the spatial behavior of trachoma cases detected in the city of Bauru, State of São Paulo, Brazil, in 2006, in order to use the information collected to set priority areas for the optimization of health resources. Methods: The trachoma cases identified in 2006 were georeferenced. The data evaluated were: the schools attended by the trachoma cases studied, data from the 2000 Census, census tract, type of housing, water supply conditions, income distribution and education levels of household heads. Descriptive spatial analyses and kernel density estimates were produced using the Google Earth® and TerraView® software. Each area was studied by interpolation of the density surfaces of the events to facilitate recognition of clusters. Results: Of the 66 cases detected, only one (1.5%) was not a resident of the city's outskirts. A positive association was detected between trachoma cases and the percentage of household heads with income below three minimum wages and schooling under eight years of education. Conclusions: The spatial distribution of trachoma cases coincided with the areas of greatest social inequality in the city of Bauru. The micro-areas identified are those that should be prioritized in the rationalization of health resources. The trachoma cases detected can potentially be used as a performance indicator for priority micro-area health programs.
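The kernel density step described above can be sketched in a few lines. This is only an illustrative sketch on synthetic coordinates (the study worked with real georeferenced cases in Google Earth® and TerraView®), and `gaussian_kde` stands in for whichever kernel and bandwidth the authors actually used:

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical georeferenced case coordinates (e.g. projected x/y positions);
# 66 points, matching the number of cases reported in the study.
rng = np.random.default_rng(0)
cases = np.vstack([rng.normal(0, 1, 66), rng.normal(0, 1, 66)])  # shape (2, 66)

# Fit a Gaussian kernel density estimate and evaluate it on a regular grid.
kde = gaussian_kde(cases)
grid_x, grid_y = np.mgrid[-3:3:50j, -3:3:50j]
density = kde(np.vstack([grid_x.ravel(), grid_y.ravel()])).reshape(50, 50)

# Peaks of the interpolated density surface point to candidate priority micro-areas.
peak = np.unravel_index(density.argmax(), density.shape)
print(peak)
```

In the study, such density peaks were then compared against census-tract socioeconomic variables to identify the clusters of greatest need.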
Abstract:
Long-haul drivers work irregular schedules due to load delivery demands. In general, driving and sleeping occur at irregular times and, consequently, partial sleep deprivation and/or circadian misalignment may emerge and result in sleepiness at the wheel. The aim of this study was therefore to verify changes in the postural control parameters of professional drivers after a night of work. Eight male truck drivers working at night (night drivers, ND) and nine day drivers (DD) volunteered to participate in this study. The night drivers' postural stability was assessed immediately before and after a journey of approximately 430 km, using two identical force platforms at the departure and arrival sites. The DD group was measured before and after a day's work. An interaction effect of time of day and type of shift on the amplitude of mediolateral movements was observed in both conditions: eyes open (p < 0.01) and eyes closed (p < 0.001). Postural stability, measured by force platform, is affected by a night of work, suggesting that this could be an effect of circadian and homeostatic influences on postural control.
Abstract:
The world's rising demand for energy turns the development of sustainable and more efficient technologies for energy production and storage into an inevitable task. Thermoelectric generators, composed of pairs of n-type and p-type semiconducting materials, directly transform waste heat into useful electricity. The efficiency of a thermoelectric material depends on its electronic and lattice properties, summarized in its figure of merit ZT. Desirable are a high electrical conductivity and Seebeck coefficient, and a low thermal conductivity. Half-Heusler materials are very promising candidates for thermoelectric applications in the medium-temperature range, such as industrial and automotive waste heat recovery. The advantages of Heusler compounds are excellent electronic properties and high thermal and mechanical stability, as well as their low toxicity and elemental abundance. Thus, the main obstacle to further enhancing their thermoelectric performance is their relatively high thermal conductivity.

In this work, the thermoelectric properties of the p-type material (Ti/Zr/Hf)CoSb1-xSnx were optimized in a multistep process. The concept of an intrinsic phase separation has recently become a focus of research in the compatible n-type (Ti/Zr/Hf)NiSn system as a way to achieve low thermal conductivities and boost the TE performance. This concept is successfully transferred to the TiCoSb system. The phase separation approach can form a significant alternative to the previous nanostructuring approach via ball milling and hot pressing, saving processing time and energy consumption while increasing the thermoelectric efficiency. A fundamental concept for tuning the performance of thermoelectric materials is charge carrier concentration optimization. The optimum carrier concentration is reached at a Sn substitution level of x = 0.15, enhancing ZT by about 40% compared to previous state-of-the-art samples with x = 0.2.
The TE performance can be enhanced further by fine-tuning the Ti-to-Hf ratio. A correlation between the microstructure and the thermoelectric properties is observed, and a record figure of merit ZT = 1.2 at 710°C was reached with the composition Ti0.25Hf0.75CoSb0.85Sn0.15.

Towards application, the long-term stability of the material under actual operating conditions is an important issue. The impact of such a heat treatment on the structural and thermoelectric properties is investigated. In particular, the best and most reliable performance is achieved in Ti0.5Hf0.5CoSb0.85Sn0.15, which reached a maximum ZT of 1.1 at 700°C. The intrinsic phase separation and resulting microstructure are stable even after 500 heating and cooling cycles.
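The figure of merit quoted above is defined as ZT = S²σT/κ, where S is the Seebeck coefficient, σ the electrical conductivity, κ the thermal conductivity and T the absolute temperature. A minimal sketch of this arithmetic follows; the property values are hypothetical, chosen only to be of a plausible order of magnitude for a half-Heusler near 710°C (the abstract does not give the measured transport values):

```python
def figure_of_merit(seebeck_v_per_k: float, elec_conductivity_s_per_m: float,
                    thermal_conductivity_w_per_mk: float, temperature_k: float) -> float:
    """Dimensionless thermoelectric figure of merit ZT = S^2 * sigma * T / kappa."""
    return (seebeck_v_per_k ** 2 * elec_conductivity_s_per_m * temperature_k
            / thermal_conductivity_w_per_mk)

# Hypothetical transport properties at ~983 K (710 °C), illustrative only:
zt = figure_of_merit(seebeck_v_per_k=230e-6,            # 230 µV/K
                     elec_conductivity_s_per_m=8.0e4,   # 800 S/cm
                     thermal_conductivity_w_per_mk=3.5,
                     temperature_k=983.0)
print(round(zt, 2))  # → 1.19
```

The formula makes clear why lowering κ via phase separation raises ZT directly, independently of the electronic terms in the numerator.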
Abstract:
Objective. The study reviewed one year of Texas hospital discharge data and Trauma Registry data for the 22 trauma service regions in Texas to identify regional variations in capacity, process of care and clinical outcomes for trauma patients, and to analyze the statistical associations among capacity, process of care, and outcomes. Methods. Cross-sectional study design covering one year of state-wide Texas data. Indicators of trauma capacity, trauma care processes, and clinical outcomes were defined and data were collected on each indicator. Descriptive analyses of regional variations in trauma capacity, process of care, and clinical outcomes were conducted at all trauma centers, at Level I and II trauma centers, and at Level III and IV trauma centers. Multilevel regression models were used to test the relations among trauma capacity, process of care, and outcome measures at all trauma centers, at Level I and II trauma centers, and at Level III and IV trauma centers, while controlling for confounders such as age, gender, race/ethnicity, injury severity, level of trauma center and urbanization. Results. Significant regional variation was found among the 22 trauma service regions across Texas in trauma capacity, process of care, and clinical outcomes. The regional trauma bed rate, the average number of staffed beds per 100,000 population, varied significantly by trauma service region. Pre-hospital trauma care processes (EMS time, transfer time, and triage) also varied significantly by region. Clinical outcomes, including mortality, hospital and intensive care unit length of stay, and hospital charges, likewise varied significantly by region. In multilevel regression analysis, the average trauma bed rate was significantly related to trauma care processes including ambulance delivery time, transfer time, and triage after controlling for age, gender, race/ethnicity, injury severity, level of trauma center, and urbanization at all trauma centers.
Among the processes of care, only transfer time was significantly associated with the average regional trauma bed rate at Level III and IV centers. Among the outcome measures, only trauma mortality was significantly associated with the average regional trauma bed rate at all trauma centers, and only hospital charges were statistically related to the trauma bed rate at Level I and II trauma centers. The effect of confounders such as age, gender, race/ethnicity, injury severity, and urbanization on processes and outcomes was found to vary significantly by level of trauma center. Conclusions. Regional variation in trauma capacity, process, and outcomes in Texas was extensive. Trauma capacity, age, gender, race/ethnicity, injury severity, level of trauma center and urbanization were significantly associated with trauma process and clinical outcomes, depending on the level of trauma center. Key words: regionalized trauma systems, trauma capacity, pre-hospital trauma care, process, trauma outcomes, trauma performance, evaluation measures, regional variations
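A multilevel model of the kind described above (patients nested within regions, a region-level bed-rate predictor, patient-level confounders) can be sketched with `statsmodels`. The data below are synthetic and the coefficients are hypothetical assumptions, not the study's estimates; a continuous outcome (length of stay) is used since `mixedlm` fits linear mixed models:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic illustration: 400 patients nested in 22 regions.
rng = np.random.default_rng(1)
n, regions = 400, 22
df = pd.DataFrame({
    "region": rng.integers(0, regions, n),
    "age": rng.normal(45, 18, n),
    "iss": rng.normal(15, 8, n),          # injury severity score (hypothetical scale)
})
bed_rate = rng.normal(10, 2, regions)     # staffed trauma beds per 100,000 (assumed)
df["bed_rate"] = bed_rate[df["region"]]

# Hypothetical data-generating process: LOS rises with severity, falls with bed rate.
df["los"] = 3 + 0.2 * df["iss"] - 0.1 * df["bed_rate"] + rng.normal(0, 1, n)

# Random intercept per region captures the multilevel (region) structure.
model = smf.mixedlm("los ~ bed_rate + age + iss", df, groups=df["region"]).fit()
print(model.params["iss"].round(2))
```

The fitted `iss` coefficient should land near the 0.2 used to generate the data, illustrating how patient-level confounders and a region-level capacity measure enter one model.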
Abstract:
Core competencies form the basis of an organization's skills and the basic element of successful strategic execution. Identifying and strengthening core competencies enhances flexibility, thereby strategically positioning a firm to respond to competition in a dynamic marketplace, and can be the difference in quality among firms that follow the same business model. A correct understanding of the concept of business models, employing the right core competencies, organizing them effectively and building the business model around competencies that are constantly gained and assimilated can result in enhanced business performance, with implications for firms that want to innovate their business models. Flexibility can be understood as a firm's agility in shifting focus in response to external factors such as changing markets, new technologies or competition, and a firm's success can be gauged by the ability it displays in this transition. Although industry transformations generally emanate from technological changes, recent examples suggest they may also be due to the introduction of new business models, and nowhere is this more relevant than in the airline industry. An analysis of the business model flexibility of 17 airlines from Asia, Europe and Oceania, conducted with core competence as the indicator, reveals a picture of inconsistencies in the core competence strategy of certain airlines and a corresponding reduction in business performance. The performance variations are explained by a service-oriented core competence strategy employed by airlines, which ultimately gives them a flexible business model that not only increases business performance but also helps reduce uncertainties in the internal and external operating environments. This is all the more relevant in the airline industry, as the product (the air transportation of passengers) minus the service competence is all the same.
Abstract:
The hypothesis of this thesis is: "The optimization of the window considering simultaneously energy aspects and aspects relating to indoor environmental quality (hygrothermal, lighting and acoustic comfort) is compatible, provided that the synergies existing between them are known and considered from the earliest design phases." At present, the implications of many of the decisions made regarding the window are unknown; for its efficiency with respect to all the aspects mentioned to become effective, a tool is needed that provides more information in the design process than is currently available, thus allowing integral optimization according to the specific circumstances of each project. In the initial phase of this research, a first approach to the subject is made through the state of the art of the window, analyzing the existing regulations, the components, the performance, the experimental elements and the research. It is observed that, on occasion, high energy-efficiency requirements can entail a reduction in the system's performance with respect to indoor environmental quality, which is why an interest arises in integrating into the energy analysis aspects relating to indoor environmental quality, such as lighting and acoustic performance and air renewal. At this point, the need is detected to carry out an integral study that incorporates the different aspects and to evaluate the synergies between the different functions the window fulfils. In addition, from the analysis of innovative and experimental solutions it is observed that it is difficult to determine to what extent such solutions are efficient, since they are complex solutions that have not been characterized and are not incorporated into calculation methodologies or into the databases of simulation programs.
A second need therefore arises: to generate an experimental methodology for the characterization and efficiency analysis of innovative systems. To address this double need, optimization is proposed through an evaluation of the glazed element that integrates energy efficiency and indoor environmental quality, combining theoretical and experimental research. In the theoretical field, simulations, calculations and information gathering are carried out for different opening typologies, in relation to each function independently (acoustics, lighting, ventilation). Despite having started with an integrative approach, that integration proves difficult, and a lack of available tools is detected. In the experimental field, a methodology is developed for evaluating the performance and environmental aspects of innovative elements that are difficult to assess using the theoretical methodology. This evaluation consists of an experimental comparative analysis between the innovative element and a standard element; to carry out this analysis, two identical spaces, which we call experimental modules, were designed and fitted with the two systems; these spaces were monitored, obtaining data on consumption, temperature, illuminance and relative humidity. Measurements were taken over a period of nine months and the results were analyzed and compared, thus obtaining the real behavior of the system. After the theoretical and experimental analysis, and as a consequence of the need to integrate existing knowledge, a tool for the integral evaluation of the glazed element is proposed. This tool is developed on the basis of the indoor environmental quality (IEQ) diagnostic procedure in accordance with the standard UNE 171330, "Indoor environmental quality", incorporating the energy-efficiency factor.
From the first part of the process, the theoretical part and the state of the art, the determining parameters and the reference values for those parameters are obtained. On the basis of the relevant parameters obtained, the tool takes shape: a product indicator for windows that integrates all the factors analyzed and is developed according to the standard UNE 21929, "Sustainability in building construction. Sustainability indicators". ABSTRACT The hypothesis of this thesis is: "The optimization of windows considering energy and indoor environmental quality issues simultaneously (hygrothermal comfort, lighting comfort, and acoustic comfort) is compatible, provided that the synergies between these issues are known and considered from the early stages of design". The implications of many of the decisions made on this item are currently unclear. So that savings can be made, an effective tool is needed that provides more information during the design process than is currently available, thus enabling optimization of the system according to the specific circumstances of each project. The initial phase deals with the study from an energy-efficiency point of view, performing a qualitative and quantitative analysis of commercial, innovative and experimental windows. It is observed that high energy-efficiency requirements may sometimes mean a reduction in the system's performance in relation to user comfort and health, which is why there is an interest in performing an integrated analysis of indoor environment aspects and energy efficiency. At this point, a need is detected for a comprehensive study incorporating the different aspects, in order to evaluate the synergies that exist between the various functions the window fulfils.
Moreover, from the analysis of experimental and innovative windows, a difficulty in establishing to what extent these solutions are efficient is observed. A second need therefore arises: to generate an experimental methodology for characterizing and analyzing the efficiency of innovative systems. To address this dual need, the optimization of windows through an integrated evaluation is proposed, considering energy efficiency and indoor environmental quality, and combining theoretical and experimental research. In the theoretical field, simulations and calculations are performed, and information about the different aspects of the indoor environment (acoustics, lighting, ventilation) is gathered independently. Despite having started with an integrative approach, this integration proves difficult, revealing a lack of available tools. In the experimental field, a methodology for evaluating energy efficiency and indoor environmental quality is developed, to be applied to innovative elements that are difficult to evaluate using a theoretical methodology. This evaluation is an experimental comparative analysis between an innovative element and a standard element. To carry out this analysis, two equal spaces, called experimental cells, were designed. These cells were monitored, obtaining consumption, temperature, illuminance and relative humidity data. Measurements were performed over nine months and the results were analyzed and compared, yielding the actual behavior of the system. To advance this optimization, windows have been studied from the point of view of energy performance and of performance in relation to user comfort and health (thermal comfort, acoustic comfort, lighting comfort and air quality), and the development of a methodology for an integrated analysis including energy efficiency and indoor environmental quality is proposed.
After the theoretical and experimental analysis, and as a result of the need to integrate existing knowledge, a comprehensive evaluation procedure for windows is proposed. This evaluation procedure is developed according to the standard UNE 171330, "Indoor Environmental Quality", also incorporating energy efficiency and cost as factors to evaluate. From the first part of the research process, the determining parameters are chosen and reference values for these parameters are set. Finally, based on the parameters obtained, a product indicator for windows is proposed. The indicator integrates all the factors analyzed and is developed according to ISO 21929-1:2011, "Sustainability in building construction. Sustainability indicators. Part 1: Framework for the development of indicators and a core set of indicators for buildings".
Abstract:
Quantum Key Distribution (QKD) is carving out its place among the tools used to secure communications. While a difficult technology, it enjoys benefits that set it apart from the rest, the most prominent being its provable security based on the laws of physics. QKD requires not only the mastering of signals at the quantum level, but also classical processing to extract a secret key from them. This postprocessing has customarily been studied in terms of efficiency, a figure of merit that offers a biased view of the performance of real devices. Here we argue that throughput is the significant magnitude in practical QKD, especially in the case of high-speed devices, where the differences are more marked, and give some examples contrasting the usual postprocessing schemes with new ones from modern coding theory. A good understanding of its implications is very important for the design of modern QKD devices.
Abstract:
The postprocessing or secret-key distillation process in quantum key distribution (QKD) mainly involves two well-known procedures: information reconciliation and privacy amplification. Information or key reconciliation has customarily been studied in terms of efficiency. During reconciliation, some information must be disclosed in order to reconcile discrepancies in the exchanged keys. The leakage of information is lower bounded by a theoretical limit and is usually parameterized by the reconciliation efficiency (or inefficiency), i.e. the ratio of the information disclosed over the Shannon limit. Most techniques for reconciling errors in QKD try to optimize this parameter. For instance, the well-known Cascade protocol (probably the most widely used procedure for reconciling errors in QKD) was recently shown to have an average efficiency of 1.05 at the cost of high interactivity (number of exchanged messages). Modern coding techniques, such as rate-adaptive low-density parity-check (LDPC) codes, were also shown to achieve similar efficiency values while exchanging only one message, or even better values with little interactivity and shorter block-length codes.
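The efficiency parameter described above is the ratio of the information actually disclosed to the Shannon limit n·h(QBER), where h is the binary entropy. A minimal sketch of this bookkeeping follows; the key length is an illustrative assumption, while the 1.05 factor echoes the Cascade efficiency quoted in the abstract:

```python
import math

def binary_entropy(p: float) -> float:
    """Shannon binary entropy h(p) in bits per symbol."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def reconciliation_efficiency(leaked_bits: float, n: int, qber: float) -> float:
    """f = leak / (n * h(QBER)); f = 1 is the Shannon limit, f > 1 means extra disclosure."""
    return leaked_bits / (n * binary_entropy(qber))

# Example: a 10,000-bit sifted key (assumed length) at 2% QBER, with a
# Cascade-like leakage 5% above the theoretical minimum.
n, qber = 10_000, 0.02
shannon_limit = n * binary_entropy(qber)  # minimum number of bits that must be disclosed
f = reconciliation_efficiency(1.05 * shannon_limit, n, qber)
print(round(f, 2))  # → 1.05
```

Every bit disclosed above the Shannon limit must later be removed by privacy amplification, which is why lowering f directly raises the final secret-key throughput.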