895 results for system performance evaluation
Abstract:
Today, modern System-on-a-Chip (SoC) designs have grown rapidly in processing power while maintaining the size of the hardware circuit. The number of transistors on a chip continues to increase, but current SoC designs may not be able to exploit this potential performance, especially as energy consumption and chip area become two major concerns. Traditional SoC designs usually separate software and hardware, so improving system performance is a complicated task for both software and hardware designers. The aim of this research is to develop a hardware acceleration workflow for software applications, so that system performance can be improved under energy consumption and on-chip resource constraints. The characteristics of software applications are identified using profiling tools. Hardware acceleration can yield significant performance improvements for highly mathematical calculations or repeatedly executed functions, so system performance can be improved by using hardware acceleration on the elements that incur performance overheads. The concepts presented in this study can be readily applied to a variety of sophisticated software applications. The contributions of SoC-based hardware acceleration in the hardware-software co-design platform are as follows: (1) Software profiling methods are applied to an H.264 Coder-Decoder (CODEC) core. The hotspot function of the target application is identified using critical attributes such as cycles per loop and loop rounds. (2) A hardware acceleration method based on a Field-Programmable Gate Array (FPGA) is used to resolve system bottlenecks and improve system performance. The identified hotspot function is converted to a hardware accelerator and mapped onto the hardware platform. Two types of hardware acceleration methods, a central bus design and a co-processor design, are implemented for comparison in the proposed architecture.
(3) System specifications, such as performance, energy consumption, and resource costs, are measured and analyzed, and the trade-offs among these three factors are compared and balanced. Different hardware accelerators are implemented and evaluated against system requirements. (4) A system verification platform is designed based on the Integrated Circuit (IC) workflow, with hardware optimization techniques applied for higher performance and lower resource costs. Experimental results show that the proposed hardware acceleration workflow for software applications is an efficient technique: the system reaches a 2.8X performance improvement and saves 31.84% of energy consumption with the Bus-IP design, while the co-processor design achieves a 7.9X performance improvement and saves 75.85% of energy consumption.
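The payoff from accelerating a profiled hotspot is bounded by how much of the runtime that hotspot occupies, which Amdahl's law captures directly; the sketch below is a generic illustration (the fractions and factors are made up, not measurements from this thesis):

```python
def amdahl_speedup(hotspot_fraction, accel_factor):
    """Overall speedup when a fraction of the runtime is accelerated.

    hotspot_fraction: share of total runtime spent in the hotspot (0..1)
    accel_factor: speedup of the hotspot itself on the accelerator
    """
    return 1.0 / ((1.0 - hotspot_fraction) + hotspot_fraction / accel_factor)

# A hotspot taking 90% of runtime, accelerated 10x, yields ~5.3x overall.
overall = amdahl_speedup(0.9, 10.0)
```

This is why profiling (step 1) precedes accelerator design (step 2): the achievable system-level speedup is capped by the hotspot's share of total runtime.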
Abstract:
Orthogonal Frequency-Division Multiplexing (OFDM) has proved to be a promising technology for transmitting at higher data rates. Multicarrier Code-Division Multiple Access (MC-CDMA) is a transmission technique that combines the advantages of both OFDM and Code-Division Multiple Access (CDMA), allowing high transmission rates over severely time-dispersive multi-path channels without requiring a complex receiver implementation. MC-CDMA also exploits frequency diversity via the different subcarriers, allowing high-code-rate systems to achieve good Bit Error Rate (BER) performance. Furthermore, spreading in the frequency domain makes the time-synchronization requirement much less stringent than in traditional direct-sequence CDMA schemes. Some problems remain when using MC-CDMA. One is the high Peak-to-Average Power Ratio (PAPR) of the transmitted signal: high PAPR leads to nonlinear distortion in the amplifier and results in inter-carrier self-interference plus out-of-band radiation. Suppressing Multiple Access Interference (MAI) is another crucial problem in MC-CDMA systems. Imperfect cross-correlation characteristics of the spreading codes and multipath fading destroy the orthogonality among users, causing MAI and serious BER degradation. Moreover, in the uplink the signals received at a base station are always asynchronous; this also destroys the orthogonality among users and hence generates MAI that degrades system performance. Beyond these two problems, external interference must always be considered seriously in any communication system. In this dissertation, we design a novel MC-CDMA system with low PAPR and mitigated MAI. New semi-blind channel estimation and multi-user data detection based on Parallel Interference Cancellation (PIC) are applied in the system.
Low-Density Parity-Check (LDPC) codes are also introduced into the system to improve performance. Different interference models are analyzed in multi-carrier communication systems, and effective interference suppression for MC-CDMA systems is employed in this dissertation. The experimental results indicate that our system not only significantly reduces the PAPR and MAI but also effectively suppresses outside interference with low complexity. Finally, we present a practical cognitive application of the proposed system on a software-defined radio platform.
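PAPR, the first problem named above, is straightforward to measure on a baseband signal; a minimal NumPy sketch follows (the 64-subcarrier QPSK example is illustrative, not the dissertation's actual system):

```python
import numpy as np

def papr_db(x):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    power = np.abs(x) ** 2
    return 10.0 * np.log10(power.max() / power.mean())

# OFDM-like example: the IFFT of 64 random QPSK symbols exhibits a high
# PAPR, whereas a constant-envelope signal has a PAPR of exactly 0 dB.
rng = np.random.default_rng(0)
qpsk = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=64)
ofdm_symbol = np.fft.ifft(qpsk)
```

PAPR-reduction techniques aim to shrink `papr_db(ofdm_symbol)` so the power amplifier can operate closer to saturation without nonlinear distortion.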
Abstract:
Today, smart-phones have revolutionized the wireless communication industry and ushered in an era of mobile data. To cater for the ever-increasing data traffic demand, it is of utmost importance to have more spectrum resources, and sharing under-utilized spectrum bands is an effective solution. In particular, the 4G broadband Long Term Evolution (LTE) technology and its foreseen 5G successor will benefit immensely if their operation can be extended to the under-utilized unlicensed spectrum. In this thesis, we first analyze WiFi 802.11n and LTE coexistence performance in the unlicensed spectrum, considering multi-layer cell layouts through system-level simulations. We consider a time-division duplexing (TDD) LTE system with an FTP traffic model for performance evaluation. Simulation results show that WiFi performance is more vulnerable to LTE interference, while LTE performance is degraded only slightly. Based on these initial findings, we propose a Q-Learning based dynamic duty-cycle selection technique for configuring LTE transmission gaps, so that satisfactory throughput is maintained for both LTE and WiFi systems. Simulation results show that the proposed approach can enhance overall capacity by 19% and WiFi capacity by 77%, enabling effective coexistence of LTE and WiFi systems in the unlicensed band.
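A tabular Q-learning loop of the kind proposed can be sketched in a few lines; the states, duty-cycle actions, reward model, and learning rates below are illustrative assumptions, not the thesis's simulation parameters:

```python
import random

# Candidate LTE duty cycles (fraction of time LTE transmits; WiFi uses the gaps).
ACTIONS = [0.2, 0.4, 0.6, 0.8]

def choose_action(q, state, eps=0.1):
    """Epsilon-greedy action selection over the Q-table row for `state`."""
    if random.random() < eps:
        return random.randrange(len(ACTIONS))
    row = q[state]
    return row.index(max(row))

def q_update(q, state, action, reward, next_state, alpha=0.5, gamma=0.9):
    """Standard Q-learning update; the reward would combine LTE and WiFi
    throughput so that neither system starves the other."""
    target = reward + gamma * max(q[next_state])
    q[state][action] += alpha * (target - q[state][action])

# Two hypothetical congestion states, all action-values initialized to zero.
q_table = [[0.0] * len(ACTIONS) for _ in range(2)]
q_update(q_table, 0, 1, 1.0, 1)
```

Over many simulated frames, the learned Q-values steer the duty cycle toward the gap configuration that balances LTE and WiFi throughput.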
Abstract:
Parallel processing is prevalent in many manufacturing and service systems. Many manufactured products are built and assembled from several components fabricated on parallel lines. An example of this manufacturing configuration is observed at a facility equipped to assemble and test web servers. A typical web server assembly line is characterized by multiple products, job circulation, and parallel processing. The primary objective of this research was to develop analytical approximations to predict performance measures of manufacturing systems with job failures and parallel processing. The analytical formulations extend previous queueing models used in assembly manufacturing systems in that they can handle serial and various parallel-processing configurations with multiple product classes and job circulation due to random part failures. In addition, correction terms obtained via regression analysis were added to the approximations to minimize the error between the analytical approximations and the simulation models. Markovian and general-type manufacturing systems were studied, with multiple product classes, job circulation due to failures, and fork-join stations to model parallel processing. In both the Markovian and general cases, the approximations without correction terms performed quite well for one- and two-product problem instances. However, the flow-time error increased as the number of products and the net traffic intensity increased. Therefore, correction terms for single and fork-join stations were developed via regression analysis to handle more than two products. The numerical comparisons showed that the approximations perform remarkably well when the correction factors are used. On average, the flow-time error was reduced from 38.19% to 5.59% in the Markovian case, and from 26.39% to 7.23% in the general case.
All the equations in the analytical formulations were implemented as a set of Matlab scripts. Using these scripts, operations managers of web server assembly lines, or of manufacturing and service systems with similar characteristics, can estimate various system performance measures and make judicious decisions, especially when setting delivery due dates, planning capacity, and mitigating bottlenecks.
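For a flavor of the queueing building blocks behind such approximations, the expected flow time of a single M/M/1 station is a one-liner; this is a textbook sketch, far simpler than the thesis's fork-join formulations with regression-based correction terms:

```python
def mm1_flow_time(arrival_rate, service_rate):
    """Expected flow time (waiting + service) at an M/M/1 station.

    E[T] = 1 / (mu - lambda), valid only while the traffic intensity
    rho = lambda / mu stays below 1.
    """
    if arrival_rate >= service_rate:
        raise ValueError("utilization must be below 1 for a stable queue")
    return 1.0 / (service_rate - arrival_rate)

# 2 jobs/hour arriving at a station serving 4 jobs/hour: 0.5 h in system.
flow = mm1_flow_time(2.0, 4.0)
```

The flow-time error mentioned above is exactly the gap between such analytical estimates and the simulated values, which the correction terms shrink.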
Abstract:
Through numerous technological advances in recent years, along with the popularization of computing devices, society is moving towards an "always connected" paradigm. Computer networks are everywhere, and the advent of IPv6 paves the way for the explosion of the Internet of Things, a concept that enables the sharing of data between computing machines and everyday objects. One of the areas encompassed by the Internet of Things is vehicular networks. However, the information generated by an individual vehicle is small in volume and, taken in isolation, does not contribute to improving traffic. This proposal presents Infostructure, a system intended to ease the effort and reduce the cost of developing high-level, semantically rich context-aware applications for Internet of Things scenarios, by allowing data to be managed, stored, and combined in order to generate broader context. To this end, we present a reference architecture that shows the major components of Infostructure. A prototype is then presented and used to validate whether our work reaches the desired level of high-level semantic contextualization, along with a performance evaluation of the behavior of the subsystem responsible for managing contextual information over large amounts of data. A statistical analysis is then performed on the results obtained in the evaluation. Finally, we present the conclusions of the work, some open problems, such as the lack of guarantees about the integrity of the sensor data entering Infostructure, and future work that considers the implementation of other modules so that tests can be conducted in real environments.
Abstract:
This study concerns the valorization of elementary school teachers in Natal/RN (PCCR - Law No. 058/2004), specifically horizontal promotion through performance evaluation. It takes as its reference education policy in the legislative field, its hegemony, and the managerial model. The analysis of teacher valorization is based on the policy of funds (Fundef and Fundeb) in the Brazilian social and educational agenda, and focuses on the Plan for Positions, Career, and Remuneration (PCCR) of teachers in the period 2004-2010. The thesis argues for the need to adopt a direct relationship between career development and horizontal promotion across fifteen additional classes, independent of any conditioning variables. In addition, under the current rules the evaluated performance takes intervals of up to 25 years to reach even the minimum provided in the law, which determines a salary adjustment of 5% every two years, as set out in the PCCR regarding teacher remuneration and qualifications. A bibliographic and documentary review on education funding, the valorization of educational work, career concepts, and promotion and performance evaluation was carried out, based on expert authors in this field. The survey was organized to articulate quantitative and qualitative information, analyzing data on teacher pay (payrolls and paychecks) and applying a questionnaire. It was found that, after the implementation of the PCCR, the wage indices for horizontal promotion during the teaching career are tied to a performance evaluation strategy that devalues teachers' salaries, limited to a minimum percentage of 25% (over up to 25 years), and that other elements further disturb the promotion strategy.
The national minimum wage for teachers was set at three salaries by the PSPN (Law No. 11.738/2008), but it has never reached three salaries in the Natal/RN educational system. Moreover, the elements that structure horizontal promotion into fifteen classes over the whole career flout the minimum years of teaching work, long established as 25 years. In addition, salary increases depend on individual efforts at professional development through further qualifications. Regarding the career, although the category approved its PCCR, neither this instrument nor the funds policy managed to establish effective rules for valuing teachers in the district educational system. It is necessary to ensure, in percentage and financial terms, the real remuneration of teachers upon attainment of horizontal promotion, reviewing the elements that structure the career and the determinants of performance evaluation.
Abstract:
In Brazil, the National Agency of Electric Energy (ANEEL) is the energy regulator. Tariff reviews have been one of its main tasks, establishing prices at a level that covers efficient operating costs and provides an appropriate return on the distributors' investments. The changes in the procedures for redefining efficient costs, and the several studies on the methodologies employed to regulate this segment, denote the challenge regulators face in choosing the best methodological strategy. In this context, this research proposes a benchmarking evaluation applied to the national regulatory system for establishing the efficient operating costs of electricity distribution utilities. The model is formulated to promote the development of the electricity market, in partnership with government policies and to the benefit of society. To conduct this research, an integration of Data Envelopment Analysis (DEA) with Stochastic Frontier Analysis (SFA) is adopted in a three-stage procedure that corrects efficiency for environmental effects: (i) a DEA evaluation measures the operating-cost slacks of the utilities, with environmental variables omitted; (ii) the slacks calculated in the first stage are regressed on a set of environmental variables by means of SFA, and operating costs are adjusted to account for environmental impact and statistical noise; and (iii) the performance of the electric power distribution utilities is reassessed by means of DEA. With this methodology it is possible to obtain a performance evaluation expressed exclusively in terms of management efficiency, in which the effects of the operating environment and statistical noise are controlled.
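For intuition on the DEA stages, the single-input, single-output special case of the input-oriented CCR score reduces to a ratio comparison against the best performer; a minimal sketch (real applications, including this one, use multiple inputs and outputs solved via linear programming):

```python
def dea_efficiency(inputs, outputs):
    """CCR efficiency scores for single-input, single-output units.

    Each unit's score is its output/input ratio divided by the best
    ratio in the sample; a score of 1.0 marks the efficient frontier.
    """
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical example: utility B spends twice the operating cost of
# utility A for the same output, so it scores 0.5 against A's frontier.
scores = dea_efficiency([2.0, 4.0], [1.0, 1.0])
```

The second (SFA) stage then asks how much of a sub-1.0 score is genuine mismanagement versus a harsher operating environment or noise.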
Abstract:
Reducing energy consumption is the main requirement in refrigeration and air conditioning by mechanical vapor-compression systems, and automotive systems are no different. Thermal analyses of these systems are crucial for better automotive air conditioner performance. This work evaluates the operating conditions of the R134a refrigerant (used in vehicles) and compares it with R437A (an alternative refrigerant), varying the speed of the electric fan on the evaporator. All tests were performed on an ATR600 automotive air conditioning unit, simulating the thermal conditions of the system. The equipment is instrumented to acquire temperatures, condensing and evaporating pressures, and the electrical power consumed, from which the coefficient of performance of the cycle is determined. The system was tested at rotations of 800, 1600, and 2400 rpm with a constant charge of R134a, and then under the same conditions with R437A, both as recommended by the manufacturer. The results show that the best system performance occurs at 800 rpm for both refrigerants.
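The coefficient of performance extracted from those measurements is simply the useful cooling delivered per unit of electrical power consumed; a sketch with hypothetical numbers (not the study's measured values):

```python
def cop(cooling_capacity_w, electrical_power_w):
    """Coefficient of performance of a vapor-compression cycle:
    useful cooling effect divided by electrical input power."""
    return cooling_capacity_w / electrical_power_w

# e.g. 3.0 kW of cooling for 1.2 kW of electrical input gives COP = 2.5
example_cop = cop(3000.0, 1200.0)
```

Comparing refrigerants at each fan speed amounts to comparing these COP values under matched thermal loads.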
Abstract:
The aim of this study was to propose a performance evaluation system for the outsourced employees of the University Restaurant (UR) of the Federal University of Rio Grande do Norte, addressing the lack of evaluative instruments. According to Service Provision Contracts nº050/2010 and nº055/2011 between FURN and SAFE LOCAÇÃO DE MÃO DE OBRA LTDA ME, the contracted company is responsible for conducting periodic evaluations of the functional performance of the outsourced employees, but this is not done. The performance evaluation process serves to verify whether employees are carrying out their tasks according to the organization's objectives and goals; it also helps to identify service failures and employee training needs, thus contributing to improved working conditions and to the overall performance of the organization. To elaborate the evaluation proposal, action research was chosen, with the participation of all stakeholders, employees and managers of the UR. For data collection, outsourced employees and management servers were first interviewed in order to capture existing perceptions about performance evaluation. From these data and from observation of the work routine, a performance evaluation proposal was elaborated, which was then reviewed, criticized, and adjusted by the actors involved (employees and managers) before the final formulation of the instrument. This study also presents the steps necessary to implement the performance evaluation system. The proposed system can be applied by FURN, which would take over this process after modification of the contract terms and approval by ADCON. It can also serve as an example to other units that rely on outsourced services, enabling performance evaluation to become part of the management policy for all people working at FURN.
Abstract:
The rate of non-full-time faculty members has increased rapidly over the last decade (Louis, 2009; MacKay, 2014; Meranze & Newfield, 2013), as fluctuating enrolment, fiscal and operational challenges, and the need to hire specialized skill sets have required post-secondary institutions to rely heavily on this demographic. In the Ontario Colleges of Applied Arts and Technology (CAATs) system, institutions have tried to preserve and enhance educational quality with fewer resources through greater reliance on non-full-time faculty. The purpose of this study was to explore the perceptions and experiences of teaching and support of non-full-time faculty at one Eastern Ontario college. Employing a narrative inquiry methodology, data were collected from four participants, who wrote three individual letters, one at the end of each month, and participated in one interview at the end of the contract period. The data were analyzed and coded. This analysis revealed five themes: motivation, connection and engagement, compensation, teaching and development, and performance evaluation. Differences in the participants' perceptions tended to reflect divergences across career stage: retired versus early career. The compensation package provided to non-full-time faculty was considered inadequate by those in the early career stage, especially compared to that of full-time faculty. In addition, the amount of previous teaching experience was an important indicator of the appropriate level of teaching resources and support to be provided by the institution. Newer faculty members required a higher level of support to combat feelings of role isolation. The temporary nature of the role made it difficult to establish a strong connection to the institution and, subsequently, to find opportunities to engage further and deepen the relationship.
Despite these differences across participants, autonomous motivators were consistent across all narratives, as participants expressed their desire to teach and share their knowledge to help students achieve their goals. Participants concluded their narratives by sharing future advice for faculty interested in pursuing the role. The narratives provided areas for improvement that would help increase the level of job satisfaction for non-full-time college faculty members: (a) establishing a more thorough performance evaluation process to align with institutional supports, (b) offering more diverse teaching resources to better prepare faculty and enhance teaching practices, (c) overhauling the compensation package to better recognize the amount of time and effort spent in the role and aligning with the compensation provided to full-time faculty, and (d) including rewards and incentives as part of the compensation package to enhance the level of commitment and availability for the role. These changes might well increase the job satisfaction and improve the retention of non-full-time faculty members.
Abstract:
The inherently analogue nature of medical ultrasound signals, in conjunction with the abundant merits of digital image acquisition and the increasing use of relatively simple front-end circuitry, has created considerable demand for single-bit beamformers in digital ultrasound imaging systems. Furthermore, the increasing need to design lightweight ultrasound systems with low power consumption and low noise provides ample justification for development and innovation in the use of single-bit beamformers in ultrasound imaging systems. The overall aim of this research program is to investigate, establish, develop, and confirm, through a combination of theoretical analysis and detailed simulations using raw phantom data sets, suitable techniques for designing simple-to-implement, hardware-efficient digital ultrasound beamformers that address the requirements of 3D scanners with large channel counts, as well as portable and lightweight ultrasound scanners for point-of-care applications and intravascular imaging systems. In addition, the stability boundaries of higher-order High-Pass (HP) and Band-Pass (BP) Σ−Δ modulators for single- and dual-sinusoidal inputs are determined using quasi-linear modeling together with the describing-function method, to model the modulator quantizer more accurately. The theoretical results are shown to be in good agreement with the simulation results for a variety of input amplitudes, bandwidths, and modulator orders. The proposed mathematical models of the quantizer will greatly help speed up the design of higher-order HP and BP Σ−Δ modulators applicable to digital ultrasound beamformers. Finally, a user-friendly design and performance evaluation tool for LP, BP, and HP modulators is developed. This toolbox, which uses various design methodologies and covers an assortment of modulator topologies, is intended to accelerate the design process and the evaluation of modulators.
This design tool is further developed to enable the design, analysis, and evaluation of beamformer structures, including noise analysis of the final B-scan images. The tool will thus allow researchers and practitioners to design and verify different reconstruction filters and analyze the results directly on the B-scan ultrasound images, saving considerable time and effort.
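The single-bit quantization at the heart of such beamformers can be illustrated with a first-order Σ−Δ modulator; the research above concerns higher-order HP and BP designs, so this low-pass first-order loop is only a minimal illustrative sketch:

```python
import numpy as np

def first_order_sdm(x):
    """First-order single-bit sigma-delta modulator.

    The integrator accumulates the error between the input and the
    fed-back 1-bit output; the bitstream's running average tracks the
    input for amplitudes within (-1, 1).
    """
    integrator = 0.0
    previous_bit = 0.0
    bits = np.empty(len(x))
    for n, sample in enumerate(x):
        integrator += sample - previous_bit
        previous_bit = 1.0 if integrator >= 0.0 else -1.0
        bits[n] = previous_bit
    return bits

# A DC input of 0.5 produces a +/-1 bitstream whose mean is ~0.5.
stream = first_order_sdm(np.full(1000, 0.5))
```

The quantization noise this loop shapes out of band is exactly what the reconstruction filters in the beamformer toolbox are designed to remove.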
Abstract:
This work studies the uplink of a cellular network with zero-forcing (ZF) receivers under imperfect channel state information at the base station (BS). More specifically, apart from pilot contamination, we investigate the effect of channel time variation due to users' movement relative to the BS. Our contributions include analytical expressions for the sum-rate with a finite number of BS antennas, as well as the asymptotic limits with infinite power and an infinite number of BS antennas, respectively. The numerical results provide interesting insights into how user mobility degrades system performance, extending previous results in the literature.
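The ZF receiver analyzed here inverts the channel via its pseudo-inverse; below is a textbook sketch under a noiseless, perfectly known channel (the paper's focus is precisely what happens when the channel estimate is imperfect and time-varying):

```python
import numpy as np

def zf_detect(H, y):
    """Zero-forcing detection: left-multiply the received vector by the
    Moore-Penrose pseudo-inverse of the (BS antennas x users) channel H."""
    return np.linalg.pinv(H) @ y

# Hypothetical setup: 8 BS antennas, 4 users, noiseless uplink.
# With a full-column-rank H, ZF recovers the transmitted symbols exactly.
rng = np.random.default_rng(1)
H = rng.standard_normal((8, 4))
x = np.array([1.0, -1.0, 1.0, -1.0])
y = H @ x
x_hat = zf_detect(H, y)
```

With pilot contamination and user mobility, the BS applies the pseudo-inverse of an outdated estimate of H instead, which is the source of the sum-rate degradation quantified in the paper.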
Abstract:
We evaluate the impact of the Eurozone sovereign debt crisis on the performance and performance persistence of a survivorship-bias-free sample of bond funds from a small market, identified as one of the most affected by this event, during the 2001–2012 period. Besides avoiding data mining, we also introduce a methodological innovation in assessing bond fund performance persistence. Our results show that bond funds underperform significantly during both crisis and non-crisis periods. In addition, we find strong evidence of performance persistence, for both short- and longer-term horizons, during non-crisis periods but not during the debt crisis. The persistence phenomenon in small markets thus seems to occur only during non-crisis periods, and this is valuable information for bond fund investors to exploit.
Abstract:
This paper provides the first investigation of bond mutual fund performance during recession and expansion periods separately. Based on multi-factor performance evaluation models, the results show that bond funds significantly underperform the market during both phases of the business cycle. Nevertheless, unlike equity funds, bond funds exhibit considerably higher alphas during good economic states than during market downturns. These results, however, seem entirely driven by the global financial crisis subperiod. In contrast, during the recession associated with the Euro sovereign debt crisis, bond funds achieve neutral performance. This improved performance throughout the debt crisis seems to be related to more conservative investment strategies, which reflect an increase in managers' risk aversion.
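Multi-factor performance evaluation of this kind boils down to estimating the intercept (alpha) of a regression of fund excess returns on factor returns; a minimal OLS sketch with synthetic data (the papers' actual factor sets and conditional specifications are richer):

```python
import numpy as np

def estimate_alpha(fund_excess_returns, factor_returns):
    """OLS alpha of a linear multi-factor model: the intercept of
    r_fund = alpha + beta' * factors + error."""
    T = len(fund_excess_returns)
    X = np.column_stack([np.ones(T), factor_returns])
    coefs, *_ = np.linalg.lstsq(X, fund_excess_returns, rcond=None)
    return coefs[0]

# Synthetic fund with a known monthly alpha of 1% and two factor betas;
# with noiseless data OLS recovers the alpha exactly.
rng = np.random.default_rng(2)
factors = rng.standard_normal((120, 2))
returns = 0.01 + factors @ np.array([0.5, 0.3])
alpha = estimate_alpha(returns, factors)
```

Splitting the sample into recession and expansion months and estimating alpha separately in each subsample is the essence of the state-dependent evaluation described above.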