783 results for Performance evaluation
Abstract:
Internet Protocol TV (IPTV) is predicted to be a key technology winner in the future. Efforts are underway to accelerate the deployment of the centralized IPTV model, which combines the VHO, encoders, a controller, the access network and the home network. Regardless of whether the network is delivering live TV, VOD or time-shifted TV, all content and network traffic resulting from subscriber requests must traverse the entire network, from the super-headend all the way to each subscriber's set-top box (STB). IPTV services require very stringent QoS guarantees. When IPTV traffic shares network resources with other traffic such as data and voice, ensuring its QoS while efficiently utilizing the network resources is a key and challenging issue. QoS is measured in the network-centric terms of delay jitter, packet loss and bounds on delay. The main focus of this thesis is optimized bandwidth allocation and smooth data transmission: a traffic model is proposed for smooth delivery of video services over an IPTV network, together with its QoS performance evaluation. Following Maglaris et al. [5], the coding bit rate of a single video source is analyzed first. Various statistical quantities are derived from bit rate data collected with a conditional replenishment interframe coding scheme. Two correlated Markov process models (one in discrete time and one in continuous time) are shown to fit the experimental data and are used to model the input rates of several independent sources feeding a statistical multiplexer. A preventive control mechanism, comprising connection admission control (CAC) and traffic policing, is used for traffic control. The QoS of a common bandwidth scheduler (FIFO) is evaluated using fluid models with a Markovian queuing method, and the results are analyzed both by simulation and analytically, measuring packet loss, overflow and mean waiting time among the network users.
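The two-state Markov source model referenced above can be sketched in a few lines of simulation. The following discrete-time on/off model is an illustration only, with made-up transition probabilities and rates, not the parameters fitted in the thesis or in Maglaris et al.:

```python
import random

def simulate_onoff_sources(n_sources, p_on, p_off, rate_on, steps, seed=0):
    """Aggregate bit rate of n independent two-state (on/off) Markov
    sources feeding a statistical multiplexer, in discrete time.

    p_on    : probability an OFF source turns ON in the next slot
    p_off   : probability an ON source turns OFF in the next slot
    rate_on : bit rate contributed by a source while ON
    """
    rng = random.Random(seed)
    states = [False] * n_sources          # all sources start OFF
    aggregate = []
    for _ in range(steps):
        for i in range(n_sources):
            if states[i]:
                if rng.random() < p_off:
                    states[i] = False
            elif rng.random() < p_on:
                states[i] = True
        aggregate.append(sum(states) * rate_on)
    return aggregate

rates = simulate_onoff_sources(n_sources=10, p_on=0.1, p_off=0.3,
                               rate_on=1.0, steps=5000)
mean_rate = sum(rates) / len(rates)
# steady-state ON probability is p_on / (p_on + p_off) = 0.25,
# so the mean aggregate rate should settle near 10 * 0.25 = 2.5
```

Feeding such an aggregate rate trace into a finite buffer served at a fixed capacity is the usual next step for estimating overflow and loss under a FIFO scheduler.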
Abstract:
Hybrid photovoltaic thermal (PVT) collectors are an emerging technology that combines PV and solar thermal systems in a single solar collector, producing heat and electricity simultaneously. The focus of this thesis work is to evaluate the performance of an unglazed open-loop PVT air system integrated on a garage roof in Borlänge, as it is thought to have significant potential for preheating the building's ventilation air and improving the PV modules' electrical efficiency. The performance evaluation is important to optimize the cooling strategy of the collector, in order to enhance its electrical efficiency and maximize the production of thermal energy. The evaluation process involves monitoring the electrical and thermal energies for a certain period of time and investigating the cooling effect on performance by controlling the air mass flow provided by a variable-speed fan connected to the collector by an air distribution duct. The distribution duct transfers the heated outlet air from the collector into the building. The PVT air collector consists of 34 Solibro CIGS-type PV modules (115 Wp each) which are roof integrated and have replaced the traditional roof material. The collector is oriented toward the south-west with a tilt of 29°. The collector consists of 17 parallel air ducts formed between the PV modules and the insulated roof surface. Each air duct has a depth of 0.05 m, a length of 2.38 m and a width of 2.38 m. The air ducts are connected to each other through holes. The monitoring system is based on T-type thermocouples to measure the relevant temperatures and an air flow sensor to measure the air mass flow; these parameters are needed to calculate the thermal energy. The monitoring system also contains voltage dividers to measure the PV module voltage, shunt resistances to measure the PV current, and AC energy meters, which are needed to calculate the produced electrical energy.
All signals from the thermocouples, voltage dividers and shunt resistances are recorded by data loggers. The cooling strategy in this work was based on switching the fan on only when the difference between the air duct temperature (under the middle of the top of the PV column) and the room temperature exceeds 5 °C. This strategy was effective in terms of avoiding high electrical consumption by the fan, and it is recommended for further development. The temperature difference of 5 °C is the minimum value needed to compensate for the heat losses in the collecting duct and distribution duct. The PVT air collector has an area of Ac = 32 m2 and an air mass flow of 0.002 kg/s m2. The nominal output power of the collector is 4 kWp (34 CIGS modules with 115 Wp each). With cooling, the collector produces a thermal output energy of 6.88 kWhth/day (0.21 kWhth/m2 day) and an electrical output energy of 13.46 kWhel/day (0.42 kWhel/m2 day). The PVT air collector has a daily thermal energy yield of 1.72 kWhth/kWp and a daily PV electrical energy yield of 3.36 kWhel/kWp. The fan energy requirement in this case was 0.18 kWh/day, which is very small compared to the electrical energy generated by the PV collector. The obtained thermal efficiency was 8%, which is small compared to the results reported in the literature for PVT air collectors; this was due to the small operating air mass flow. Therefore, the study suggests increasing the air mass flow by a factor of 25. The electrical efficiency fluctuated around 14%, which is higher than the theoretical efficiency of the PV modules; this discrepancy was due to the poor method of recording the solar irradiance at the location. Due to shading effects, it would have been better to use more than one pyranometer.
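The thermal output quoted above is conventionally computed from the measured air mass flow and the collector's temperature rise, Q = ṁ·cp·(Tout − Tin). A minimal sketch; the specific heat of air and the operating numbers below are illustrative assumptions, not measurements from the study:

```python
def thermal_power(flow_per_m2, area_m2, t_out, t_in, cp_air=1005.0):
    """Useful thermal power of an air collector in watts:
    Q = m_dot * cp * (T_out - T_in), with m_dot in kg/s
    and cp_air the specific heat of air in J/(kg K)."""
    m_dot = flow_per_m2 * area_m2           # total mass flow, kg/s
    return m_dot * cp_air * (t_out - t_in)

# Illustrative case: 32 m2 collector, 0.002 kg/s m2 specific flow,
# and an assumed 10 K air temperature rise across the collector.
q_watts = thermal_power(0.002, 32.0, t_out=30.0, t_in=20.0)
```

Integrating this power over the monitored hours gives the daily thermal energy figure (kWhth/day) reported in the abstract.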
Abstract:
Studies were conducted to evaluate the nutritional value and inclusion levels of babassu meal (BM) in the diet of grower layer pullets as a substitute for wheat midds. Digestibility, metabolism and growth trials were conducted. Twelve cecectomized roosters were used in the digestibility assay to determine the coefficients of standardized digestibility of amino acids (CSDAA). The metabolism trial was conducted with 30 adult roosters to determine the apparent metabolizable energy corrected for nitrogen (AMEn) of BM. A growth trial was performed to determine replacement levels of wheat midds by BM using 360 six-week-old commercial layer pullets. BM was included at 0, 75 and 150 g/kg during the grower and development rearing phases, respectively. Feed intake, body weight gain, and feed conversion were evaluated. BM AMEn was determined as 1,474 kcal/kg on an as-fed basis. The CSDAA determined for BM were below 88% for all amino acids. The inclusion of BM in the feed of grower layers (7-18 weeks) significantly decreased feed intake (p < 0.05), but significantly improved body weight gain and feed conversion ratio (p < 0.05) at the 15% inclusion level. Considering the nutritional value and performance results, BM can replace wheat midds in diets of grower layer pullets.
Abstract:
Most consumers consider the fat of chicken meat undesirable for a healthy diet, due to its high levels of saturated fatty acids and cholesterol. The purpose of this experiment was to investigate the influence of changes in dietary metabolizable energy level, associated with a proportional variation in nutrient density, on broiler chicken performance and on the lipid composition of the meat. Male and female Cobb 500 broilers were evaluated separately. Performance evaluation followed a completely randomized design with a 6x3 factorial arrangement: six energy levels (2,800, 2,900, 3,000, 3,100, 3,200 and 3,300 kcal/kg) and three slaughter ages (42, 49 and 56 days). Response surface methodology was used to establish a mathematical model explaining live weight, feed intake and feed conversion behavior. Total lipids and cholesterol were determined in skinned breast meat and in thigh meat with and without skin. For the lipid composition analysis, a 3x3x2 factorial arrangement in a completely randomized design was used: three dietary metabolizable energy levels (2,800, 3,000 and 3,300 kcal/kg), three slaughter ages (42, 49 and 56 days) and two sexes. Reducing the dietary metabolizable energy down to about 3,000 kcal/kg did not affect live weight, but below this value live weight decreased. Feed intake was lower when the dietary energy level was higher. Feed conversion improved in direct proportion to the increase in the energy level of the diet. The performance of all birds was within the range considered appropriate for the lineage. Breast meat had fewer total lipids and less cholesterol than thigh meat. Thigh with skin had more than double the total lipids of skinned thigh, but the cholesterol content did not differ with the removal of the skin, suggesting that cholesterol content is not associated with subcutaneous fat. Intramuscular fat content was lower in the meat from birds fed diets with a lower energy level.
These results may help to define the most appropriate nutritional management. Despite the decrease in the birds' productive performance, restricting energy in broiler chicken feed may be a viable alternative if consumers are willing to pay more for meat with less fat.
Abstract:
The web services (WS) technology provides a comprehensive solution for representing, discovering, and invoking services in a wide variety of environments, including Service Oriented Architectures (SOA) and grid computing systems. At the core of WS technology lie a number of XML-based standards, such as the Simple Object Access Protocol (SOAP), that have successfully ensured WS extensibility, transparency, and interoperability. Nonetheless, there is an increasing demand to enhance WS performance, which is severely impaired by XML's verbosity. SOAP communications produce considerable network traffic, making them unfit for distributed, loosely coupled, and heterogeneous computing environments such as the open Internet. They also introduce higher latency and processing delays than other technologies, like Java RMI and CORBA. WS research has recently focused on SOAP performance enhancement. Many approaches build on the observation that SOAP message exchange usually involves highly similar messages (those created by the same implementation usually have the same structure, and those sent from a server to multiple clients tend to show similarities in structure and content). Similarity evaluation and differential encoding have thus emerged as SOAP performance enhancement techniques. The main idea is to identify the common parts of SOAP messages, to be processed only once, avoiding a large amount of overhead. Other approaches investigate nontraditional processor architectures, including micro- and macro-level parallel processing solutions, so as to further increase the processing rates of SOAP/XML software toolkits. This survey paper provides a concise, yet comprehensive review of the research efforts aimed at SOAP performance enhancement. A unified view of the problem is provided, covering almost every phase of SOAP processing, ranging over message parsing, serialization, deserialization, compression, multicasting, security evaluation, and data/instruction-level processing.
Abstract:
The study aims to calculate an innovative numerical index for bit performance evaluation called the Bit Index (BI), applied to a new type of bit database named the Formation Drillability Catalogue (FDC). A dedicated research programme (developed by Eni E&P and the University of Bologna) studied a drilling model for bit performance evaluation named BI, derived from data recorded while drilling (bit records, master log, wireline log, etc.) and from dull bit evaluation. This index is calculated with data collected inside the FDC, a novel classification of Italian formations aimed at their geotechnical and geomechanical characterization and at subdividing the formations into units called Minimum Intervals (MI). The FDC was conceived and prepared at the Eni E&P Division and contains a large number of significant drilling parameters. Five wells were identified inside the FDC and tested for bit performance evaluation. The values of BI are calculated for each bit run and compared with the corresponding cost per metre. The case study analyzes bits of the same type and diameter, run in the same formation. The BI methodology implemented on the MI classification of the FDC can consistently improve bit performance evaluation, and it helps to identify the best-performing bits. Moreover, the FDC turned out to be functional to BI, since it discloses and organizes formation details that are not easily detectable or usable from bit records or master logs, allowing targeted bit performance evaluations. At this stage of development, the BI methodology proved to be economical and reliable. The quality of bit performance analysis obtained with BI also seems more effective than the traditional "quick look" analysis performed on bit records, or than pure cost-per-metre evaluation.
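The cost-per-metre benchmark that BI is compared against is conventionally computed as C = (Cb + Cr(Tt + Td)) / F, where Cb is the bit cost, Cr the rig rate, Tt the trip time, Td the drilling time, and F the footage drilled. A minimal sketch with purely illustrative numbers (not figures from the study):

```python
def cost_per_metre(bit_cost, rig_rate, trip_hours, drill_hours, metres):
    """Classic cost-per-metre formula for evaluating a bit run:
    C = (Cb + Cr * (Tt + Td)) / F
    bit_cost in currency units, rig_rate in currency/hour,
    times in hours, metres = footage drilled by the bit."""
    return (bit_cost + rig_rate * (trip_hours + drill_hours)) / metres

# Hypothetical bit run: 12,000 bit cost, 800/h rig rate,
# 6 h tripping, 40 h on bottom, 450 m drilled.
c = cost_per_metre(bit_cost=12000.0, rig_rate=800.0,
                   trip_hours=6.0, drill_hours=40.0, metres=450.0)
```

A lower C favours the bit run; the abstract's point is that BI adds formation-level context that this single economic figure cannot capture.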
Abstract:
The task considered in this paper is the performance evaluation of region segmentation algorithms in the ground-truth-based paradigm. Given a machine segmentation and a ground-truth segmentation, performance measures are needed. We propose to consider the image segmentation problem as one of data clustering and, as a consequence, to use measures for comparing clusterings developed in statistics and machine learning. By doing so, we obtain a variety of performance measures which have not been used before in image processing. In particular, some of these measures have the highly desired property of being a metric. Experimental results are reported on both synthetic and real data to validate the measures and compare them with existing ones.
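One of the simplest clustering-comparison measures of the kind described is the Rand index: the fraction of point pairs on which two partitions agree (both place the pair together, or both place it apart). A minimal sketch, suitable for small label arrays only since it enumerates all pairs; it is an illustration of the general idea, not the specific measures proposed in the paper:

```python
from itertools import combinations

def rand_index(labels_a, labels_b):
    """Rand index between two labelings of the same pixels/points.
    Returns a value in [0, 1]; 1.0 means the partitions are identical
    up to a relabeling of the clusters."""
    assert len(labels_a) == len(labels_b)
    agree = 0
    pairs = 0
    for i, j in combinations(range(len(labels_a)), 2):
        same_a = labels_a[i] == labels_a[j]
        same_b = labels_b[i] == labels_b[j]
        agree += (same_a == same_b)
        pairs += 1
    return agree / pairs

# Label names differ, but the partitions are identical -> 1.0
score = rand_index([0, 0, 1, 1], [1, 1, 0, 0])
```

In a segmentation context, the label arrays would hold the region id of each pixel in the machine and ground-truth segmentations respectively.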
Abstract:
Objective. The study reviewed one year of Texas hospital discharge data and Trauma Registry data for the 22 trauma service regions in Texas to identify regional variations in capacity, process of care and clinical outcomes for trauma patients, and to analyze the statistical associations among capacity, process of care, and outcomes.
Methods. Cross-sectional study design covering one year of state-wide Texas data. Indicators of trauma capacity, trauma care processes, and clinical outcomes were defined, and data were collected on each indicator. Descriptive analyses were conducted of regional variations in trauma capacity, process of care, and clinical outcomes at all trauma centers, at Level I and II trauma centers, and at Level III and IV trauma centers. Multilevel regression models were used to test the relations among trauma capacity, process of care, and outcome measures at all trauma centers, at Level I and II trauma centers, and at Level III and IV trauma centers, while controlling for confounders such as age, gender, race/ethnicity, injury severity, level of trauma center and urbanization.
Results. Significant regional variation was found among the 22 trauma service regions across Texas in trauma capacity, process of care, and clinical outcomes. The regional trauma bed rate, the average number of staffed beds per 100,000 population, varied significantly by trauma service region. Pre-hospital trauma care processes (EMS time, transfer time, and triage) were significantly variable by region. Clinical outcomes, including mortality, hospital and intensive care unit length of stay, and hospital charges, also varied significantly by region. In the multilevel regression analysis, the average trauma bed rate was significantly related to trauma care processes, including ambulance delivery time, transfer time, and triage, after controlling for age, gender, race/ethnicity, injury severity, level of trauma center, and urbanization at all trauma centers. Among process-of-care measures, only transfer time was significantly associated with the average trauma bed rate by region at Level III and IV centers. Among outcome measures, only trauma mortality was significantly associated with the average trauma bed rate by region at all trauma centers, and only hospital charges were statistically related to the trauma bed rate at Level I and II trauma centers. The effect of confounders on processes and outcomes, such as age, gender, race/ethnicity, injury severity, and urbanization, was found to vary significantly by level of trauma center.
Conclusions. Regional variation in trauma capacity, process, and outcomes in Texas was extensive. Trauma capacity, age, gender, race/ethnicity, injury severity, level of trauma center and urbanization were significantly associated with trauma process and clinical outcomes, depending on the level of trauma center.
Key words: regionalized trauma systems, trauma capacity, pre-hospital trauma care, process, trauma outcomes, trauma performance, evaluation measures, regional variations
Abstract:
Strategic control is defined as the use of qualitative and quantitative tools for the evaluation of strategic organizational performance. Most research in strategic planning has focused on strategy formulation and implementation, but little work has been done on strategic performance evaluation, particularly in the area of cancer research. The objective of this study was to identify the strategic control approaches and financial performance metrics used by major cancer centers in the country, as an initial step in expanding the theory and practice behind strategic organizational performance. Focusing on hospitals that share a similar mandate and resource constraints was expected to improve measurement precision. The results indicate that most cancer centers use a wide selection of evaluation tools, but sophisticated analytical approaches were less common. In addition, there was evidence that high-performing centers tend to invest more resources in strategic performance analysis than centers showing lower financial results. The conclusions point to the need to incorporate a higher degree of analytical power in order to improve the tracking of strategic performance. This study is one of the first to concentrate on the area of strategic control.
Abstract:
This Capstone provides an overview of the generic performance evaluation process and the characteristics of Generation X and Y employees in the workplace, presents first- and second-hand research on effective performance evaluations for Generation X and Y employees, and recommends different approaches to performance evaluations for these employees in order to increase their effectiveness.
Abstract:
This thesis is developed from a real-life application: performance evaluation of small and medium-sized enterprises (SMEs) in Vietnam. The thesis presents two main methodological developments on evaluating the impact of a dichotomous environment variable on technical efficiency. Taking selection bias into account, the thesis proposes a revised frontier separation approach for the seminal Data Envelopment Analysis (DEA) model developed by Charnes, Cooper, and Rhodes (1981). The revised frontier separation approach pairs treated SMEs with their counterfactuals through nearest-neighbour matching on the propensity score. The thesis also develops an order-m frontier conditioned on the propensity score, building on the conditional order-m approach proposed by Cazals, Florens, and Simar (2002) and advocated by Daraio and Simar (2005). This development allows the conditional order-m approach to be applied with a dichotomous environment variable while accounting for the self-selection problem of impact evaluation. Monte Carlo style simulations were built to examine the effectiveness of these developments. The methodological developments are applied in empirical studies evaluating the impact of training programmes on the performance of food processing SMEs and the impact of exporting on the technical efficiency of textile and garment SMEs in Vietnam. The analysis shows that training programmes have no significant impact on the technical efficiency of food processing SMEs. Moreover, the analysis confirms the conclusion of the export literature that exporters self-select into the sector. The thesis finds no significant impact of exporting activities on the technical efficiency of textile and garment SMEs; however, a large bias is eliminated by the proposed approach.
The results of the empirical studies contribute to the understanding of the impact of different environment variables on the performance of SMEs, and help policy makers design proper policies to support the development of Vietnamese SMEs.
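The nearest-neighbour propensity score matching step described above can be sketched compactly. This illustration assumes the propensity scores have already been estimated (e.g. by a logistic regression of treatment status on firm characteristics) and performs matching with replacement; it is a generic sketch, not the thesis's implementation:

```python
def nearest_neighbour_match(treated_scores, control_scores):
    """Pair each treated unit with the control unit whose propensity
    score is closest (matching with replacement).
    Returns a list of (treated_index, control_index) pairs."""
    pairs = []
    for i, ps in enumerate(treated_scores):
        j = min(range(len(control_scores)),
                key=lambda k: abs(control_scores[k] - ps))
        pairs.append((i, j))
    return pairs

# Hypothetical scores: two treated firms, three control firms.
matches = nearest_neighbour_match([0.8, 0.3], [0.1, 0.35, 0.75])
# -> [(0, 2), (1, 1)]
```

Efficiency comparisons (DEA or order-m) are then run on the matched sample, so that treated firms are compared against observationally similar controls.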
Abstract:
This study is concerned with several proposals concerning multiprocessor systems and with the various possible methods of evaluating such proposals. After discussing the advantages and disadvantages of several performance evaluation tools, the author concludes that simulation is the only tool powerful enough to develop a model of practical use in the design, comparison and extension of systems. The main aims of the simulation package developed as part of this study are cost effectiveness, ease of use and generality. The methodology on which the simulation package is based is described in detail. The fundamental principles are that model design should reflect actual system design, that measuring procedures should be carried out alongside design, that models should be well documented and easily adaptable, and that models should be dynamic. The simulation package itself is modular, and in this way reflects current design trends; this approach also aids documentation and ensures that the model is easily adaptable. It contains a skeleton structure and a library of segments which can be added to, or directly swapped with, segments of the skeleton structure to form a model fitting a user's requirements. The study also contains the results of some experimental work carried out using the model: the first part tests the model's capabilities by simulating a large operating system, the ICL George 3 system; the second part deals with general questions and some of the many proposals concerning multiprocessor systems.
Abstract:
Impairment characterization and performance evaluation of Raman-amplified unrepeated DP-16QAM transmissions are conducted. Experimental results indicate that a small gain in the forward direction enhances the system signal-to-noise ratio for longer reach without introducing a noticeable penalty.
Abstract:
WiMAX has been introduced as a competitive alternative among metropolitan broadband wireless access technologies. It is connection-oriented and can provide very high data rates, large service coverage, and flexible quality of service (QoS). Due to the large number of connections and the flexible QoS supported by WiMAX, uplink access in WiMAX networks is very challenging, since the medium access control (MAC) protocol must efficiently manage the bandwidth and related channel allocations. In this paper, we propose and investigate a cost-effective WiMAX bandwidth management scheme, named the WiMAX partial sharing scheme (WPSS), in order to provide good QoS while achieving better bandwidth utilization and network throughput. The proposed bandwidth management scheme is compared with a simple but inefficient scheme, named the WiMAX complete sharing scheme (WCPS). A maximum entropy (ME) based analytical model (MEAM) is proposed for the performance evaluation of the two bandwidth management schemes. MEAM is used because it can efficiently model a large-scale system in which the number of stations or connections is generally very high, whereas traditional simulation and analytical approaches (e.g., Markov models) cannot perform well due to their high computational complexity. We model the bandwidth management scheme as a queuing network model (QNM) consisting of interacting multiclass queues for different service classes. Closed-form expressions for the state and blocking probability distributions are derived for these schemes. Simulation results verify the MEAM numerical results and show that WPSS can significantly improve the network's performance compared to WCPS.
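Blocking probability, the key metric derived in closed form above, is easiest to see in the classic single-class special case: a pool of identical channels with Poisson connection arrivals, where the Erlang-B formula gives the probability a new connection finds all channels busy. The following is a generic illustration of that simpler case, not the paper's multiclass maximum-entropy model:

```python
def erlang_b(servers, offered_load):
    """Erlang-B blocking probability for `servers` channels and
    `offered_load` Erlangs of offered traffic, computed with the
    numerically stable recursion:
        B(0) = 1,  B(n) = a*B(n-1) / (n + a*B(n-1))."""
    b = 1.0
    for n in range(1, servers + 1):
        b = offered_load * b / (n + offered_load * b)
    return b

# 10 channels offered 5 Erlangs: blocking is roughly 1.8%.
p_block = erlang_b(servers=10, offered_load=5.0)
```

In the multiclass setting of the paper, each service class contributes its own arrival stream and bandwidth demand, and the partial-sharing policy determines how much of the channel pool each class may occupy.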