871 results for Measuring and Performance System


Relevance:

100.00%

Publisher:

Abstract:

The main objective of this work is to implement and present a theoretical description of different Physical Layer Network Coding schemes. Using a basic scheme as a starting point, the project presents the construction and analysis of several communication schemes whose complexity increases as the project progresses. The work is structured in several parts: first, an introduction to Physical Layer Network Coding and to Lattice Network Codes is presented. Next, the mathematical tools needed to understand the CF System are introduced. The first basic scheme is then analyzed and implemented; building on it, a vector version of the CF System and a version coded with a q-ary Hamming code are implemented. Finally, different strategies for improving the coefficient matrix A are studied and implemented.
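The abstract states no formulas; for orientation only, the sketch below records the standard compute-and-forward relation that schemes of this kind build on. The notation follows Nazer and Gastpar's compute-and-forward framework, which is an assumption here, since the thesis text itself is not shown.

```latex
% Assumed background sketch: a relay observes a noisy linear combination
%   y = \sum_{l=1}^{L} h_l x_l + z
% of L lattice codewords and, instead of the individual messages, decodes
% an integer combination with coefficient vector a. The achievable
% computation rate for channel vector h and power P is
\[
  R(\mathbf{h},\mathbf{a}) \;=\; \frac{1}{2}\,
  \log_2^{+}\!\left(\left(\lVert\mathbf{a}\rVert^{2}
    \;-\; \frac{P\,\lvert\mathbf{h}^{\mathsf T}\mathbf{a}\rvert^{2}}
               {1 + P\,\lVert\mathbf{h}\rVert^{2}}\right)^{-1}\right),
\]
% so "improving the coefficient matrix A" amounts to choosing short integer
% vectors a (the rows of A) well aligned with h, subject to A remaining
% invertible over the finite ring used by the lattice code.
```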

Relevance:

100.00%

Publisher:

Abstract:

INTRODUCTION: To describe the patients of a geriatric structure offering short-stay hospitalizations, in an ambulatory-care context, for common geriatric situations in the canton of Geneva (Switzerland), and to measure the performance of this structure in terms of quality of care and costs. METHOD: Data on the clinical, functioning, and participation profiles of the first 100 patients were collected over an eight-month period, together with data on services, resources, and effects (readmissions, deaths, satisfaction, complications), in order to measure various quality and cost indicators. Observed values were systematically compared with expected values calculated from the patient case mix. RESULTS: Explicit admission criteria were set to exclude situations in which other structures offer better-suited care. The specificity of this intermediate structure was to ensure continuity of care and to organize the return home from the outset through ambulatory liaison services. The low rate of potentially avoidable readmissions, high patient satisfaction, the absence of premature deaths, and the small number of iatrogenic complications suggest that medical and nursing care was delivered with good quality. After adjustment for case mix, costs proved markedly lower than those of comparable hospital stays. CONCLUSION: This pilot experience demonstrated that a short-stay hospitalization unit is feasible and useful under safe conditions. Follow-up of the patient by the attending physician ensures continuity of care and avoids both the loss of information during transitions and irrelevant examinations.

Relevance:

100.00%

Publisher:

Abstract:

Strategic group theory provides an intermediate level of analysis, between the single company and the whole industry, for identifying issues about a company's competitive position and strategic choices. Strategic groups are companies within an industry with similar strategic characteristics or competing on similar bases, and their strategic choices are aligned with the firms' resources. The purpose of this study was to identify the strategic groups in the European wind energy industry and to examine whether membership in a certain group results in financial performance differences. Altogether 80 European wind energy companies were included in the study and clustered into four strategic groups according to their age and growth rate, each group corresponding to a different strategy. The results show that wind energy companies can be clustered according to the chosen strategic characteristics. Strategic decisions were investigated with characteristic variables, and performance variables measuring the profitability, liquidity, and solvency of the groups were used in the analysis. Overall, these strategic choices did not have a significant influence on the firms' performance: although the more mature and slower-growing group proved the most successful, the differences between groups were generally not statistically significant. The only statistically significant difference found was in the solvency ratio between the Mature Slow and Young Rapid groups. Measured with these variables, more mature and slower-growing companies performed better, so a certain strategic group membership can be associated with performance differences.
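The abstract names the clustering variables (company age and growth rate) but not the algorithm. The sketch below shows one plausible way to reproduce such a four-group clustering with k-means, purely as an illustration: the data, the use of scikit-learn, and k-means itself are assumptions, not details from the study.

```python
# Illustrative only: cluster wind energy firms into four strategic groups
# by age and growth rate, as the study describes (method assumed, not stated).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical data: one row per firm -> [age_years, revenue_growth_pct]
firms = np.array([
    [25, 3.0], [30, 1.5], [4, 40.0], [6, 55.0],
    [18, 8.0], [3, 70.0], [22, 4.5], [7, 35.0],
])

# Standardize so age (years) and growth (%) carry comparable weight.
X = StandardScaler().fit_transform(firms)

# Four clusters, matching the four strategic groups in the study.
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
print(labels)  # group index per firm, e.g. "Mature Slow" vs "Young Rapid"
```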

Relevance:

100.00%

Publisher:

Abstract:

The objective of this Master's thesis was to analyze the cost efficiency and performance of Stora Enso's payroll center and its human resources management system, SAP HR. Benchmarking was used in the study, with five large Finnish companies participating. The main focus of the benchmarking was a cost comparison between the companies; the survey also examined the performance of the companies' systems. Based on the results, Stora Enso's payroll center offers a cost-efficient and competitive solution that compares well with other Finnish companies.

Relevance:

100.00%

Publisher:

Abstract:

The analysis of efficiency and productivity in banking has received a great deal of attention for almost three decades now. However, most of the literature to date has not explicitly accounted for risk when measuring efficiency. We propose an analysis of profit efficiency that takes into account how the inclusion of a variety of bank risk measures might bias efficiency scores. Our measures of risk are partly inspired by the literature on earnings management and earnings quality, keeping in mind that loan loss provisions, a generally accepted proxy for risk, can be adjusted to manage earnings and regulatory capital. We also consider some variants of traditional models of profit efficiency in which different regimes are stipulated so that financial institutions can be evaluated in different dimensions (prices, quantities, or prices and quantities simultaneously). We perform this analysis on the Spanish banking industry, whose institutions have been deeply affected by the current international financial crisis and where re-regulation is taking place. Our results can be explored in multiple dimensions but, in general, they indicate that the impact of earnings management on profit efficiency is smaller than might be expected a priori and that, on the whole, savings banks have performed less well than commercial banks. However, savings banks are adapting to the new regulatory scenario and rapidly catching up with commercial banks, especially in some dimensions of performance.

Relevance:

100.00%

Publisher:

Abstract:

The present paper reports a bacteria autonomous controlled concentrator prototype with a user-friendly interface for bench-top applications. It is based on a micro-fluidic lab-on-a-chip and its associated custom instrumentation, which consists in a dielectrophoretic actuator, to pre-concentrate the sample, and an impedance analyser, to measure concentrated bacteria levels. The system is composed by a single micro-fluidic chamber with interdigitated electrodes and a instrumentation with custom electronics. The prototype is supported by a real-time platform connected to a remote computer, which automatically controls the system and displays impedance data used to monitor the status of bacteria accumulation on-chip. The system automates the whole concentrating operation. Performance has been studied for controlled volumes of Escherichia coli (E. coli) samples injected into the micro-fluidic chip at constant flow rate of 10 μL/min. A media conductivity correcting protocol has been developed, as the preliminary results showed distortion of the impedance analyser measurement produced by bacterial media conductivity variations through time. With the correcting protocol, the measured impedance values were related to the quantity of bacteria concentrated with a correlation of 0.988 and a coefficient of variation of 3.1%. Feasibility of E. coli on-chip automated concentration, using the miniaturized system, has been demonstrated. Furthermore, the impedance monitoring protocol had been adjusted and optimized, to handle changes in the electrical properties of the bacteria media over time.
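The abstract does not specify how the conductivity correction works. The sketch below illustrates one common approach, normalizing each impedance reading against a concurrent cell-free media reference, purely as a hypothetical reconstruction: the function, the reference-channel idea, and the linear normalization are all assumptions, not the paper's protocol.

```python
# Hypothetical sketch of a media-conductivity correction: normalize each
# measured impedance against a concurrent reference reading taken on
# bacteria-free media, so drift in media conductivity cancels out.
def corrected_impedance(z_measured: float, z_reference: float,
                        z_reference_t0: float) -> float:
    """Scale the measurement by the drift of the cell-free reference.

    z_measured      -- impedance at the concentration electrodes (ohms)
    z_reference     -- impedance of a bacteria-free media reference now
    z_reference_t0  -- reference impedance at the start of the run
    """
    drift = z_reference / z_reference_t0  # >1 or <1 as conductivity drifts
    return z_measured / drift

# Example: a 5% downward drift in the reference impedance is removed
# before the reading is related to bacteria quantity.
print(corrected_impedance(1200.0, 950.0, 1000.0))  # ~1263 ohms
```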

Relevance:

100.00%

Publisher:

Abstract:

Corporate events, as an effective part of a marketing communications strategy, seem to be underestimated in Finnish companies. In the rest of Europe and in the USA, investments in events are increasing, and their share of the marketing budget is significant. The growth of the industry may be explained by the numerous advantages and opportunities that events provide for attendees, such as face-to-face marketing, enhancing corporate image, building relationships, increasing sales, and gathering information. In order to maximize these benefits and the return on investment, specific measurement strategies are required, yet there seems to be a lack of understanding of how event performance should be perceived or evaluated. To address this research gap, this research attempts to describe the perceptions of and strategies for evaluating corporate event performance in the Finnish events industry. First, corporate events are discussed in terms of definitions and characteristics, typologies, and their role in marketing communications. Second, different theories on evaluating corporate event performance are presented and analyzed. Third, a conceptual model based on the literature review is presented, which serves as the basis for the empirical research conducted as an online questionnaire. The empirical findings are to a great extent in line with the existing literature, suggesting that there remains a lack of understanding of corporate event performance evaluation and that challenges arise in determining appropriate measurement procedures for it. Setting clear objectives for events is a significant aspect of the evaluation process, since the outcomes of events are usually evaluated against the preset objectives. The respondent companies utilize many of the individual techniques recognized in the theory, such as counting sales leads and delegates. However, some of the measurement tools may require further investments and resources, thus restricting their application especially in smaller companies. In addition, there seems to be a lack of knowledge of the most appropriate methods in different contexts, taking into account the characteristics of the organizing party as well as the size and nature of the event. The lack of in-house expertise increases the need for third-party service providers in solving problems of corporate event measurement.

Relevance:

100.00%

Publisher:

Abstract:

Thesis: A liquid-cooled, direct-drive, permanent-magnet, synchronous generator with helical, double-layer, non-overlapping windings formed from a copper conductor with a coaxial internal coolant conduit offers an excellent combination of attributes to reliably provide economic wind power for the coming generation of wind turbines with power ratings between 5 and 20 MW. A generator based on the liquid-cooled architecture proposed here will be reliable and cost effective. Its smaller size and mass will reduce build, transport, and installation costs. Summary: Converting wind energy into electricity and transmitting it to an electrical power grid to supply consumers is a relatively new and rapidly developing method of electricity generation. In the most recent decade, the increase in wind energy's share of overall energy production has been remarkable. Thousands of land-based and offshore wind turbines have been commissioned around the globe, and thousands more are being planned. The technologies have evolved rapidly and are continuing to evolve, and wind turbine sizes and power ratings are continually increasing. Many of the newer wind turbine designs feature drivetrains based on Direct-Drive, Permanent-Magnet, Synchronous Generators (DD-PMSGs). Being low-speed, high-torque machines, the diameters of air-cooled DD-PMSGs become very large to generate higher levels of power. The largest direct-drive wind turbine generator in operation today, rated just below 8 MW, is 12 m in diameter and approximately 220 tonnes. To generate higher powers, traditional DD-PMSGs would need to become extraordinarily large. A 15 MW air-cooled direct-drive generator would be of colossal size and tremendous mass and no longer economically viable. One alternative to increasing diameter is instead to increase torque density. In a permanent magnet machine, this is best done by increasing the linear current density of the stator windings. However, greater linear current density results in more Joule heating, and the additional heat cannot be removed practically using a traditional air-cooling approach. Direct liquid cooling is more effective, and when applied directly to the stator windings, higher linear current densities can be sustained, leading to substantial increases in torque density. The higher torque density, in turn, makes possible significant reductions in DD-PMSG size. Over the past five years, a multidisciplinary team of researchers has applied a holistic approach to explore the application of liquid cooling to permanent-magnet wind turbine generator design. The approach has considered wind energy markets and the economics of wind power, system reliability, electromagnetic behaviors and design, thermal design and performance, mechanical architecture and behaviors, and the performance modeling of installed wind turbines. This dissertation is based on seven publications that chronicle the work. The primary outcomes are the proposal of a novel generator architecture, a multidisciplinary set of analyses to predict the behaviors, and experimentation to demonstrate some of the key principles and validate the analyses. The proposed generator concept is a direct-drive, surface-magnet, synchronous generator with fractional-slot, duplex-helical, double-layer, non-overlapping windings formed from a copper conductor with a coaxial internal coolant conduit to accommodate liquid coolant flow. The novel liquid-cooling architecture is referred to as LC DD-PMSG.
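As a back-of-envelope companion to the torque-density argument above, the standard machine-sizing relations below show why raising the linear current density raises torque without raising diameter. The notation and the rms shear-stress approximation are textbook conventions, not values or formulas taken from the dissertation.

```latex
% Torque of a rotating machine in terms of air-gap shear stress \sigma,
% bore diameter D and active length L:
\[
  T = \frac{\pi}{2}\,\sigma\, D^{2} L ,
  \qquad
  \sigma \approx \frac{A\,\hat{B}_{\delta}}{\sqrt{2}} ,
\]
% where A is the rms linear current density (A/m) of the stator winding and
% \hat{B}_{\delta} the peak air-gap flux density (T). For fixed D, L, and
% magnets (fixed \hat{B}_{\delta}), torque grows linearly with A, but so do
% the I^2 R losses per unit of stator surface, which is what direct liquid
% cooling of the conductors makes sustainable.
```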
The first of the seven publications summarized in this dissertation discusses the technological and economic benefits and limitations of DD-PMSGs as applied to wind energy. The second publication addresses the long-term reliability of the proposed LC DD-PMSG design. Publication 3 examines the machine's electromagnetic design, and Publication 4 introduces an optimization tool developed to quickly define basic machine parameters. The static and harmonic behaviors of the stator and rotor wheel structures are the subject of Publication 5. Finally, Publications 6 and 7 examine steady-state and transient thermal behaviors. There have been a number of ancillary concrete outcomes associated with the work, including the following:
- Intellectual Property (IP) for direct liquid cooling of stator windings via an embedded coaxial coolant conduit, IP for a lightweight wheel structure for low-speed, high-torque electrical machinery, and IP for numerous other details of the LC DD-PMSG design
- Analytical demonstrations of the equivalent reliability of the LC DD-PMSG; validated electromagnetic, thermal, structural, and dynamic prediction models; and an analytical demonstration of the superior partial-load efficiency and annual energy output of an LC DD-PMSG design
- A set of LC DD-PMSG design guidelines and an analytical tool to establish optimal geometries quickly and early on
- Proposed 8 MW LC DD-PMSG concepts for both inner and outer rotor configurations
Furthermore, three technologies introduced could be relevant across a broader spectrum of applications. 1) The cost optimization methodology developed as part of this work could be further improved to produce a simple tool to establish base geometries for various electromagnetic machine types. 2) The layered sheet-steel element construction technology used for the LC DD-PMSG stator and rotor wheel structures has potential for a wide range of applications. And finally, 3) the direct liquid-cooling technology could be beneficial in higher-speed electromotive applications such as vehicular electric drives.

Relevance:

100.00%

Publisher:

Abstract:

Software is a key component in many of the devices and products that we use every day. Most customers demand not only that their devices function as expected but also that the software be of high quality, reliable, fault tolerant, efficient, etc. In short, it is not enough that a calculator gives the correct result of a calculation; we want the result instantly, in the right form, with minimal use of battery, etc. One of the key aspects of succeeding in today's industry is delivering high quality. In most software development projects, high-quality software is achieved by rigorous testing and good quality assurance practices. However, today, customers are asking for these high-quality software products at an ever-increasing pace, which leaves companies with less time for development. Software testing is an expensive activity, because it requires much manual work. Testing, debugging, and verification are estimated to consume 50 to 75 per cent of the total development cost of complex software projects. Further, the most expensive software defects are those which have to be fixed after the product is released. One of the main challenges in software development is reducing the associated cost and time of software testing without sacrificing the quality of the developed software. It is often not enough to only demonstrate that a piece of software is functioning correctly; usually, many other aspects of the software, such as performance, security, scalability, usability, etc., need also to be verified. Testing these aspects of the software is traditionally referred to as non-functional testing. One of the major challenges with non-functional testing is that it is usually carried out at the end of the software development process, when most of the functionality is implemented. This is due to the fact that non-functional aspects, such as performance or security, apply to the software as a whole. In this thesis, we study the use of model-based testing. We present approaches to automatically generate tests from behavioral models for solving some of these challenges. We show that model-based testing is applicable not only to functional testing but also to non-functional testing. In its simplest form, performance testing is performed by executing multiple test sequences at once while observing the software in terms of responsiveness and stability rather than the output. The main contribution of the thesis is a coherent model-based testing approach for testing functional and performance related issues in software systems. We show how we go from system models, expressed in the Unified Modeling Language, to test cases and back to models again. The system requirements are traced throughout the entire testing process. Requirements traceability facilitates finding faults in the design and implementation of the software. In the research field of model-based testing, many newly proposed approaches suffer from poor tool support or the lack of it. Therefore, the second contribution of this thesis is proper tool support for the proposed approach that is integrated with leading industry tools. We offer independent tools, tools that are integrated with other industry-leading tools, and complete tool chains when necessary. Many model-based testing approaches proposed by the research community suffer from poor empirical validation in an industrial context. In order to demonstrate the applicability of our proposed approach, we apply our research to several systems, including industrial ones.
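To make the idea of generating tests from behavioral models concrete, here is a minimal sketch of the core mechanism: a state-machine model traversed so that every transition is covered by some test sequence. It illustrates the general technique only, not the UML-based tool chain the thesis builds, and all names in it are invented.

```python
# Minimal model-based testing sketch: a behavioral model as a finite state
# machine, from which abstract test sequences are generated by covering
# every transition at least once (names and model are illustrative only).
from collections import deque

# Model of a hypothetical media player: state -> {event: next_state}
MODEL = {
    "stopped": {"play": "playing"},
    "playing": {"pause": "paused", "stop": "stopped"},
    "paused":  {"play": "playing", "stop": "stopped"},
}

def generate_tests(model, start="stopped"):
    """Breadth-first walk emitting one event sequence per new transition."""
    tests, seen = [], set()
    queue = deque([(start, [])])
    while queue:
        state, path = queue.popleft()
        for event, nxt in model[state].items():
            if (state, event) not in seen:
                seen.add((state, event))
                tests.append(path + [event])
                queue.append((nxt, path + [event]))
    return tests

for seq in generate_tests(MODEL):
    print(" -> ".join(seq))   # each line is one abstract test case
```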

Relevance:

100.00%

Publisher:

Abstract:

This work presents a synopsis of efficient strategies used in power management for achieving the most economical power and energy consumption in multicore systems, FPGAs, and NoC platforms. A practical approach was taken in an effort to validate the significance of the Adaptive Power Management Algorithm (APMA) proposed for the system developed for this thesis project. The system comprises an arithmetic logic unit, up and down counters, an adder, a state machine, and a multiplexer. The project had three aims: first, to develop the system used for this power management work; second, to perform area and power synopses of the system on several scalable technology platforms (UMC 90 nm nanotechnology at 1.2 V, UMC 90 nm nanotechnology at 1.32 V, and UMC 0.18 μm nanotechnology at 1.80 V) in order to examine the differences in the system's area and power consumption across platforms; and third, to explore various strategies for reducing the system's power consumption and to propose an adaptive power management algorithm for doing so. The strategies introduced in this work comprise Dynamic Voltage and Frequency Scaling (DVFS) and task parallelism. After development, the system was run on an FPGA board, essentially on NoC platforms, and on the technology platforms listed above. The system synthesis was successfully accomplished, the simulated result analysis shows that the system meets all functional requirements, and the power consumption and area utilization were recorded and analyzed in Chapter 7 of this work. This work also extensively reviews strategies for managing power consumption drawn from quantitative research by many researchers and companies; it is a mixture of study analysis and experimental lab work, and it condenses and presents the basic concepts of power management strategy from quality technical papers.
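Since the abstract names DVFS as a core strategy, the sketch below illustrates the arithmetic behind it: dynamic CMOS power scales as P ≈ aCV²f, so lowering voltage and frequency together cuts power superlinearly. The policy, operating points, and constants are illustrative assumptions, not the APMA or the measurements from the thesis.

```python
# Illustrative DVFS arithmetic: dynamic power P = a * C * V^2 * f.
# Operating points and capacitance are made-up values, not thesis data.
OPERATING_POINTS = [  # (voltage V, frequency Hz), fastest first
    (1.32, 200e6),
    (1.20, 150e6),
    (0.90, 100e6),
]
ACTIVITY = 0.15          # a: switching activity factor (assumed)
CAPACITANCE = 1.0e-9     # C: switched capacitance in farads (assumed)

def dynamic_power(v: float, f: float) -> float:
    return ACTIVITY * CAPACITANCE * v * v * f

def pick_point(utilization: float):
    """Simple adaptive policy: pick the slowest operating point whose
    frequency still covers the load, measured against the fastest point."""
    f_max = OPERATING_POINTS[0][1]
    for v, f in reversed(OPERATING_POINTS):   # slowest first
        if f >= utilization * f_max:
            return v, f
    return OPERATING_POINTS[0]

v, f = pick_point(0.4)  # 40% load -> the 100 MHz point suffices
print(f"{f/1e6:.0f} MHz at {v} V -> {dynamic_power(v, f)*1e3:.2f} mW")
```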

Relevance:

100.00%

Publisher:

Abstract:

The strength and nature of the video game practice effect on tests of visual and perceptual skills were examined using high-functioning Grade Four and Grade Five students who had been tested with the WISC-R for the purpose of gifted identification and placement. The control group, who did not own and play video games on a sustained basis, and the experimental group, who did own a video game system and had some mastery of video games, including the Nintendo game "Tetris", were each composed of 18 junior-grade students and were chosen from pre-existing conditions. The experimental group corresponded to the control group in terms of age, sex, and community. Data on the Verbal and Performance I.Q. scores were collected for both groups, and the author was interested in the difference between the Verbal and Performance scores within each group, anticipating a P > V outcome for the experimental group. The results showed a significant P > V difference in the experimental, video-game-playing group, as expected, but no significant difference between the Performance scores of the control and experimental groups. The results thus indicated lower Verbal I.Q. scores in the experimental group relative to the control group. The study concluded that information about a subject's video game experience and learning style preference is important for a clear interpretation of the Verbal and Performance I.Q. scores of the WISC-R. Although the time spent on video game play may indeed increase Performance scores relative to Verbal scores for an individual, the possibilities exist that the time borrowed and spent away from language-based activities may retard verbal growth, and/or that the cognitive style associated with some Performance I.Q. subtests may have a negative effect on the approach to the tasks on the Verbal I.Q. scale. The study also discussed the possibility that exposure to the video game experience in pre-puberty can provide spatial instruction which will result in improved spatial skills. Strong spatial skills have been linked to improved performance and preference in mathematics, science, and engineering, and it was suggested that appropriate video game play might be a way to involve girls more in the fields of mathematics and science.

Relevance:

100.00%

Publisher:

Abstract:

This thesis explores the representation of Swinging London in three examples of 1960s British cinema: Blowup (Michelangelo Antonioni, 1966), Smashing Time (Desmond Davis, 1967) and Performance (Donald Cammell and Nicolas Roeg, 1970). It suggests that the films chronologically signify the evolution, commodification and dissolution of the Swinging London era. The thesis explores how the concept of Swinging London is both critiqued and perpetuated in each film through the use of visual tropes: the reconstruction of London as a cinematic space; the Pop photographer; the dolly; representations of music performance and fashion; the appropriation of signs and symbols associated with the visual culture of Swinging London. Using fashion, music performance, consumerism and cultural symbolism as visual narratives, each film also explores the construction of youth identity through the representation of manufactured and mediated images. Ultimately, these films reinforce Swinging London as a visual economy that circulates media images as commodities within a system of exchange. With this in view, the signs and symbols that comprise the visual culture of Swinging London are as central and significant to the cultural era as their material reality. While they attempt to destabilize prevailing representations of the era through the reproduction and exchange of such symbols, Blowup, Smashing Time, and Performance nevertheless contribute to the nostalgia for Swinging London in larger cultural memory.

Relevance:

100.00%

Publisher:

Abstract:

Imaging studies have shown reduced frontal lobe resources following total sleep deprivation (TSD). The anterior cingulate cortex (ACC) in the frontal region plays a role in performance monitoring and cognitive control; both error detection and response inhibition are impaired following sleep loss. Event-related potentials (ERPs) are an electrophysiological tool used to index the brain's response to stimuli and information processing. In the Flanker task, the error-related negativity (ERN) and error positivity (Pe) ERPs are elicited after erroneous button presses. In a Go/NoGo task, NoGo-N2 and NoGo-P3 ERPs are elicited during high-conflict stimulus processing. Research investigating the impact of sleep loss on ERPs during performance monitoring is equivocal, possibly due to task differences, sample size differences, and varying degrees of sleep loss. Based on the effects of sleep loss on frontal function and prior research, it was expected that the sleep deprivation group would have lower accuracy, slower reaction time, and impaired remediation on performance monitoring tasks, along with attenuated and delayed stimulus- and response-locked ERPs. In the current study, 49 young adults (24 male) were screened to be healthy good sleepers and then randomly assigned to a sleep-deprived (n = 24) or rested control (n = 25) group. Participants slept in the laboratory on a baseline night, followed by a second night of sleep or wake. Flanker and Go/NoGo tasks were administered in a battery at 10:30 a.m. (i.e., 27 hours awake for the sleep deprivation group) to measure performance monitoring. On the Flanker task, the sleep deprivation group was significantly slower than controls (ps < .05), but the groups did not differ on accuracy. No group differences were observed in post-error slowing, but a trend was observed for less remedial accuracy in the sleep-deprived group compared to controls (p = .09), suggesting impairment in the ability to take remedial action following TSD. Delayed P300s were observed in the sleep-deprived group on congruent and incongruent Flanker trials combined (p = .001). On the Go/NoGo task, the hit rate (i.e., Go accuracy) was significantly lower in the sleep-deprived group compared to controls (p < .001), but no differences were found on false alarm rates (i.e., NoGo accuracy). For the sleep-deprived group, the Go-P3 was significantly smaller (p = .045) and there was a trend for a smaller NoGo-N2 compared to controls (p = .08). The ERN amplitude was reduced in the TSD group compared to controls in both the Flanker and Go/NoGo tasks. Error rate was significantly correlated with the amplitude of response-locked ERNs in the control (r = -.55, p = .005) and sleep-deprived groups (r = -.46, p = .021); error rate was also correlated with Pe amplitude in controls (r = .46, p = .022), and a trend was found in the sleep-deprived participants (r = .39, p = .052). An exploratory analysis showed significantly larger Pe mean amplitudes (p = .025) in the sleep-deprived group compared to controls for participants who made more than 40 errors on the Flanker task. Altered stimulus processing, as indexed by delayed P3 latency during the Flanker task and smaller-amplitude Go-P3s during the Go/NoGo task, indicates impairment in stimulus evaluation and/or context updating during frontal lobe tasks. ERN and NoGo-N2 reductions in the sleep-deprived group confirm impairments in the monitoring system. These data add to a body of evidence showing that the frontal brain region is particularly vulnerable to sleep loss.
Understanding the neural basis of these deficits in performance monitoring abilities is particularly important for our increasingly sleep-deprived society and for safety and productivity in situations like driving and sustained operations.

Relevance:

100.00%

Publisher:

Abstract:

This paper analyzes the measure of systemic importance ΔCoVaR proposed by Adrian and Brunnermeier (2009, 2010) within the context of a similar class of risk measures used in the risk management literature. In addition, we develop a series of testing procedures, based on ΔCoVaR, to identify and rank the systemically important institutions. We stress the importance of statistical testing in interpreting the measure of systemic importance. An empirical application illustrates the testing procedures, using equity data for three European banks.
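For readers unfamiliar with the measure, the defining relations of CoVaR and ΔCoVaR as given in Adrian and Brunnermeier's papers can be sketched as follows; the abstract itself states no formulas, so the notation below is the standard one from that literature:

```latex
% VaR of institution i at level q:
\[
  \Pr\!\left( X^{i} \le \mathrm{VaR}^{i}_{q} \right) = q .
\]
% CoVaR: VaR of the system conditional on institution i sitting at its VaR:
\[
  \Pr\!\left( X^{\mathrm{system}} \le
      \mathrm{CoVaR}^{\,\mathrm{system}\mid i}_{q}
      \;\middle|\; X^{i} = \mathrm{VaR}^{i}_{q} \right) = q .
\]
% Delta-CoVaR: institution i's contribution, relative to its median state:
\[
  \Delta\mathrm{CoVaR}^{\,\mathrm{system}\mid i}_{q}
    = \mathrm{CoVaR}^{\,\mathrm{system}\mid X^{i}=\mathrm{VaR}^{i}_{q}}_{q}
    - \mathrm{CoVaR}^{\,\mathrm{system}\mid X^{i}=\mathrm{VaR}^{i}_{50\%}}_{q} .
\]
```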

Relevance:

100.00%

Publisher:

Abstract:

This paper deals with the energy consumption and performance evaluation of air supply systems for a ventilated room involving high- and low-level supplies. The energy performance assessment is based on the airflow rate, which is related to the fan power consumption, while achieving the same environmental quality performance for each case. Four different ventilation systems are considered: wall displacement ventilation, confluent jets ventilation, impinging jet ventilation, and a high-level mixing ventilation system. The ventilation performance of these systems is examined by means of achieving the same Air Distribution Index (ADI) for the different cases. The widely used high-level supplies require much more fan power than low-level supplies to achieve the same value of ADI. In addition, the supply velocity, and hence the supply dynamic pressure, for a high-level supply is much larger than for low-level supplies, which further increases the power consumption of high-level supply systems. The paper considers these factors and attempts to provide some guidelines on the difference in energy consumption between high- and low-level air supply systems. This will be useful information for designers and, to the authors' knowledge, there is a lack of information available in the literature on this area of room air distribution. The energy performance of the above-mentioned ventilation systems has been evaluated on the basis of the fan power consumed, which is related to the airflow rate required to provide an equivalent indoor environment. The Air Distribution Index (ADI) is used to evaluate the indoor environment produced in the room by the ventilation strategy being used. The results reveal that mixing ventilation requires the highest fan power and confluent jets ventilation the lowest fan power in order to achieve nearly the same value of ADI.
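The abstract relies on two quantities it does not write out: the fan power implied by an airflow rate, and the Air Distribution Index. A sketch of the usual definitions follows, assuming Awbi's ADI formulation, which this paper appears to follow; normalization conventions for the two sub-indices vary across sources, so treat these as indicative rather than the paper's exact equations.

```latex
% Fan power for airflow rate Q (m^3/s) against pressure rise \Delta p (Pa)
% at overall fan efficiency \eta:
\[
  P_{\mathrm{fan}} = \frac{Q\,\Delta p}{\eta} .
\]
% Heat-removal and contaminant-removal effectiveness (e = exhaust,
% s = supply, m = mean value in the occupied zone):
\[
  \varepsilon_t = \frac{T_e - T_s}{\bar{T}_m - T_s},
  \qquad
  \varepsilon_c = \frac{C_e - C_s}{\bar{C}_m - C_s}.
\]
% Air Distribution Index, combining both effectivenesses with the predicted
% percentage dissatisfied for thermal comfort (PPD) and air quality (PD):
\[
  N_t = \frac{\varepsilon_t}{\mathrm{PPD}},
  \qquad
  N_c = \frac{\varepsilon_c}{\mathrm{PD}},
  \qquad
  \mathrm{ADI} = \sqrt{N_t\,N_c}.
\]
```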