792 results for communication performance evaluation
Abstract:
This study discusses the implementation of management contracts in Brazilian public administration, particularly the experience initiated by the Government of the State of São Paulo in 1991. The main objective of the management contract is to concentrate governmental control on the results achieved by the entities, which allows a gradual simplification of normative structures and the introduction of a system of sanctions and rewards. Such a system can enable new forms of performance evaluation and contribute to increased productivity in the public sector. The management contract is an instrument that encourages dialogue and partnership and makes the intentions and orientations of the contracting parties transparent. It can also serve as an instrument of administrative rationalization and of internal communication for the institution itself. Prior planning is one of the preconditions for implementing a management contract: goals and objectives must be specified precisely, clearly, and unambiguously, and must reflect the entity's real conditions and capacities. Some problems can be avoided if an initial stage is set aside to train the staff responsible for implementing and monitoring the contracts in the concepts and instruments essential to the process. It is also fundamental to provide a period for negotiating the support of the main decision-makers and opinion leaders, and for raising the awareness of and preparing the entities' staff.
Abstract:
The Balanced Scorecard (BSC) methodology focuses on the major critical issues of modern organizations, whether for-profit or nonprofit. For the latter, effective performance is measured by evaluating how successfully the organizational strategy has been implemented. The aim of this paper is to present the development of a strategic performance measurement system for a nonprofit organization, the Associação de Apoio as Comunidades do Campo (AACC), in the context of Kaplan and Norton's BSC methodology. The case study is exploratory, descriptive, and qualitative, and diagnoses the coherence of the organization's Strategy Map based on its strategic planning for 2010 to 2012. A literature review was first conducted covering the main aspects of strategy maps and performance evaluation, including the translation of the BSC and strategy evaluation. The main results of the proposed approach are overall scores for each dimension of the BSC methodology: financial, customer, internal processes, and learning and growth. These results can help the organization evaluate and revise its strategy and, more generally, adopt more accurate management methods. Data collection centered on interviews with a semi-structured questionnaire. The findings highlight imbalance and misalignment of strategic objectives, low causality in the map, and insufficient, fragmented strategic communication. For the interviewees, organizational culture is the biggest impediment to structuring an indicator-based management model, and the strategic process should be initiated gradually with non-financial indicators. The performance indicators of the AACC/RN portray the operational procedures of social projects in the Strategic Map with a short-term rather than a long-term focus. Nevertheless, there is evidence of improved performance and strategic management since the strategy map was structured and taken as a basis for planning.
Nonprofits therefore need to adopt a form of management that enables planning and the setting of objectives and targets that ensure the continuity of their activities, generating instruments that can measure both financial and non-financial performance in order to develop strategic actions for growth and sustainability.
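The per-dimension scoring the abstract describes can be sketched as a weighted mean of indicator scores per BSC perspective. Everything below (indicator values, weights, the 0-10 scale) is illustrative, not taken from the AACC study.

```python
# Hypothetical sketch: each BSC perspective gets an overall score computed as
# a (optionally weighted) mean of its indicator scores on a 0-10 scale.

def dimension_score(indicators, weights=None):
    """Weighted mean of indicator scores for one BSC perspective."""
    if weights is None:
        weights = [1.0] * len(indicators)
    return sum(s * w for s, w in zip(indicators, weights)) / sum(weights)

scores = {
    "financial":       dimension_score([6.0, 8.0]),
    "customer":        dimension_score([7.0, 5.0, 9.0]),
    "internal":        dimension_score([4.0, 6.0], weights=[2.0, 1.0]),
    "learning_growth": dimension_score([5.0]),
}
```

A real BSC rollout would derive weights from strategic priorities; the uniform default here is only a placeholder.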
Abstract:
This paper aims to present, through a set of guidelines, how to apply the conservative distributed simulation paradigm (the CMB protocol) to develop efficient applications. Using these guidelines, even a user with little experience in distributed simulation and computer architecture can obtain good performance from distributed simulations using conservative synchronization protocols for parallel processes. The set of guidelines focuses on a specific application domain, the performance evaluation of computer systems, considering models with coarse granularity and few logical processes, running over two platforms: parallel (high-performance communication environment) and distributed (low-performance communication environment).
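As a minimal illustration of the conservative rule these guidelines build on (a sketch, not the authors' code): a logical process may only execute events whose timestamps do not exceed the minimum clock announced on its input channels, and null messages advance those clocks. All names and timestamps below are illustrative.

```python
# Sketch of the Chandy-Misra-Bryant (CMB) conservative safety rule.
import heapq

def safe_horizon(channel_clocks):
    """An LP may process events with timestamp <= min of its input clocks."""
    return min(channel_clocks.values())

def process_safe_events(event_queue, channel_clocks):
    """Pop and return every pending event the CMB rule allows us to execute."""
    horizon = safe_horizon(channel_clocks)
    executed = []
    while event_queue and event_queue[0][0] <= horizon:
        executed.append(heapq.heappop(event_queue))
    return executed

events = [(3, "job_a"), (5, "job_b"), (9, "job_c")]
heapq.heapify(events)
clocks = {"lp1": 4, "lp2": 6}        # last timestamp seen per input channel
done = process_safe_events(events, clocks)   # only t=3 is safe so far
clocks["lp1"] = 8                    # a null message raises lp1's clock
done += process_safe_events(events, clocks)  # now t=5 is safe too
```

The cost of the null messages that keep `clocks` advancing is precisely what makes the communication environment (parallel vs. distributed) matter for performance.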
Abstract:
Over the last 20 years, the economy and technology have evolved in many directions and into new areas. Many of these developments created opportunities that are now being considered in the design of future communication networks. These new possibilities relate above all to using the Internet to access services, and encompass: mobility; low-cost technologies; growth and jobs (through the Internet one participates in every business and production process); services; education (opportunities for people to grow and develop); entertainment (virtual worlds for leisure, shopping, and games); and higher traffic volumes (text, voice, images, video). As a consequence, the Internet has become a public good, much like electricity or water. With almost 2 billion users (approximately 28% of the world's population), the Internet is increasingly becoming a pervasive infrastructure offering connectivity and services anywhere, at any time. Today's Internet is the result of successive changes since its emergence, which have made the communication infrastructure critically important. Among communication technologies, mobile wireless systems hold a special place owing to their exceptional diffusion over the last decade; together with the Internet, they have enabled the emergence of smart devices and the introduction of innovative new services, demanding an environment that supports innovation and creativity. However, the multiplicity of network standards, especially for last-mile access, is a drawback from the user's perspective, since users must subscribe to each of these networks and frequently need specific terminals for access. The idea of a single standard for these networks did not yield satisfactory results, and one solution points to integrating these networks to provide single, transparent access to the user.
This work therefore presents an embedded solution for integrating heterogeneous wireless communication standards: IEEE 802.15.4 (ZigBee), GSM/GPRS, and IEEE 802.11 (Wi-Fi). This heterogeneity of wireless technologies allows a user on the move, through a local or remote terminal, to access applications and services transparently. The performance of the solution was evaluated using two types of services: home automation and telemedicine. The results indicate that the proposed solution can integrate and deliver the services securely and reliably.
Abstract:
Studies were conducted to evaluate the nutritional value and inclusion levels of babassu meal (BM) in the diet of grower layer pullets as a substitute for wheat middlings. Digestibility, metabolism, and growth trials were conducted. Twelve cecectomized roosters were used in the digestibility assay to determine the coefficients of standardized digestibility of amino acids (CSDAA). The metabolism trial was conducted with 30 adult roosters to determine the apparent metabolizable energy corrected for nitrogen (AMEn) of BM. A growth trial was performed to determine replacement levels of wheat midds by BM using 360 six-week-old commercial layer pullets. BM was included at 0, 75, and 150 g/kg during the grower and development rearing phases, respectively. Feed intake, body weight gain, and feed conversion were evaluated. The AMEn of BM was determined as 1,474 kcal/kg on an as-fed basis. The CSDAA determined for BM were below 88% for all amino acids. The inclusion of BM in the feed of grower layers (7-18 weeks) significantly decreased feed intake (p < 0.05), but significantly improved body weight gain and feed conversion ratio (p < 0.05) at the 15% inclusion level. Considering the nutritional value and performance results, BM can replace wheat midds in diets of grower layer pullets.
Abstract:
Most consumers consider the fat of chicken meat undesirable for a healthy diet, due to its high levels of saturated fatty acids and cholesterol. The purpose of this experiment was to investigate the influence of changes in dietary metabolizable energy level, associated with a proportional variation in nutrient density, on broiler chicken performance and on the lipid composition of the meat. Male and female Cobb 500 broilers were evaluated separately. Performance evaluation followed a completely randomized design with a 6x3 factorial arrangement: six energy levels (2,800, 2,900, 3,000, 3,100, 3,200 and 3,300 kcal/kg) and three slaughter ages (42, 49 and 56 days). Response surface methodology was used to establish a mathematical model explaining live weight, feed intake, and feed conversion. Total lipids and cholesterol were determined in skinless breast meat and in thigh meat with and without skin. For the lipid composition analysis, a 3x3x2 factorial arrangement in a completely randomized design was used: three dietary metabolizable energy levels (2,800, 3,000 and 3,300 kcal/kg), three slaughter ages (42, 49 and 56 days), and two sexes. Reducing dietary metabolizable energy down to about 3,000 kcal/kg did not affect live weight, but below this value live weight decreased. Feed intake was lower at higher dietary energy levels. Feed conversion improved in direct proportion to the increase in dietary energy level. The performance of all birds was within the range considered appropriate for the lineage. Breast meat had less total lipids and cholesterol than thigh meat. Thigh with skin had more than double the total lipids of skinless thigh, but cholesterol content did not differ with the removal of the skin, suggesting that cholesterol content is not associated with subcutaneous fat. Intramuscular fat content was lower in the meat of birds fed lower-energy diets.
These results may help define the most appropriate nutritional management. Despite the decrease in the birds' productive performance, restricting energy in broiler feed may be a viable alternative if consumers are willing to pay more for meat with less fat.
Abstract:
The web services (WS) technology provides a comprehensive solution for representing, discovering, and invoking services in a wide variety of environments, including Service Oriented Architectures (SOA) and grid computing systems. At the core of WS technology lie a number of XML-based standards, such as the Simple Object Access Protocol (SOAP), that have successfully ensured WS extensibility, transparency, and interoperability. Nonetheless, there is an increasing demand to enhance WS performance, which is severely impaired by XML's verbosity. SOAP communications produce considerable network traffic, making them unfit for distributed, loosely coupled, and heterogeneous computing environments such as the open Internet. Also, they introduce higher latency and processing delays than other technologies, like Java RMI and CORBA. WS research has recently focused on SOAP performance enhancement. Many approaches build on the observation that SOAP message exchange usually involves highly similar messages (those created by the same implementation usually have the same structure, and those sent from a server to multiple clients tend to show similarities in structure and content). Similarity evaluation and differential encoding have thus emerged as SOAP performance enhancement techniques. The main idea is to identify the common parts of SOAP messages, to be processed only once, avoiding a large amount of overhead. Other approaches investigate nontraditional processor architectures, including micro- and macro-level parallel processing solutions, so as to further increase the processing rates of SOAP/XML software toolkits. This survey paper provides a concise, yet comprehensive review of the research efforts aimed at SOAP performance enhancement. A unified view of the problem is provided, covering almost every phase of SOAP processing, ranging over message parsing, serialization, deserialization, compression, multicasting, security evaluation, and data/instruction-level processing.
Abstract:
This study aims to calculate an innovative numerical index for bit performance evaluation called the Bit Index (BI), applied to a new type of bit database named the Formation Drillability Catalogue (FDC). A dedicated research programme (developed by Eni E&P and the University of Bologna) studied a drilling model for bit performance evaluation, the BI, derived from data recorded while drilling (bit records, master log, wireline log, etc.) and from dull bit evaluation. This index is calculated with data collected in the FDC, a novel classification of Italian formations aimed at the geotechnical and geomechanical characterization and subdivision of the formations into units called Minimum Intervals (MI). The FDC was conceived and prepared at Eni E&P Div. and contains a large number of significant drilling parameters. Five wells inside the FDC were identified and tested for bit performance evaluation. The BI values are calculated for each bit run and compared with the corresponding values of cost per metre. The case study analyzes bits of the same type and diameter, run in the same formation. The BI methodology, implemented on the MI classification of the FDC, can consistently improve bit performance evaluation and helps identify the best-performing bits. Moreover, the FDC turned out to be functional to the BI, since it discloses and organizes formation details that are not easily detectable or usable from bit records or master logs, allowing targeted bit performance evaluations. At this stage of development, the BI methodology proved to be economical and reliable. The quality of bit performance analysis obtained with the BI also appears more effective than the traditional "quick look" analysis performed on bit records, or than a pure cost-per-metre evaluation.
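For context on the benchmark the BI is compared against, the classic drilling cost-per-metre formula is sketched below. The formula itself is standard drilling economics; the input figures are illustrative, not from the FDC.

```python
# Classic cost-per-metre of a bit run:
#   C = (Cb + Cr * (Td + Tt)) / F
# where Cb = bit cost, Cr = rig rate, Td = drilling time, Tt = trip time,
# and F = footage (metres drilled by the bit).

def cost_per_metre(bit_cost, rig_rate, drilling_hours, trip_hours, metres):
    """Cost of a bit run per metre drilled."""
    return (bit_cost + rig_rate * (drilling_hours + trip_hours)) / metres

c = cost_per_metre(bit_cost=12_000, rig_rate=900,
                   drilling_hours=40, trip_hours=8, metres=600)
```

Cost per metre mixes bit performance with rig economics, which is one reason a formation-aware index such as the BI can rank bits differently than cost alone.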
Abstract:
The thesis deals with channel coding theory applied to the upper layers of the protocol stack of a communication link, and it is the outcome of a four-year research activity. A specific aspect of this activity has been the continuous interaction between the natural curiosity of academic blue-sky research and the system-oriented design deriving from collaboration with European industry in the framework of European funded research projects. In this dissertation, classical channel coding techniques, traditionally applied at the physical layer, find their application at upper layers, where the encoding units (symbols) are packets of bits rather than single bits; such upper layer coding techniques are therefore usually referred to as packet layer coding. The rationale behind adopting packet layer techniques is that physical layer channel coding is a suitable countermeasure against small-scale fading, but is less efficient against large-scale fading. This is mainly due to the limited time diversity inherent in the need to adopt a physical layer interleaver of reasonable size, so as to avoid increasing modem complexity and the latency of all services. Packet layer techniques, thanks to the longer codeword duration (each codeword is composed of several packets of bits), have intrinsically longer protection against long fading events. Furthermore, being implemented at an upper layer, packet layer techniques have the indisputable advantages of simpler implementation (essentially a software implementation) and of selective applicability to different services, thus enabling a better match with service requirements (e.g. latency constraints).
Packet layer coding has been widely recognized in recent communication standards as a viable and efficient coding solution: Digital Video Broadcasting standards, like DVB-H, DVB-SH, and DVB-RCS mobile, and 3GPP standards (MBMS) employ packet coding techniques working at layers higher than the physical one. In this framework, the aim of the research work has been the study of state-of-the-art coding techniques working at the upper layer (UL), the performance evaluation of these techniques in realistic propagation scenarios, and the design of new coding schemes for upper layer applications. After a review of the most important packet layer codes, i.e. Reed-Solomon, LDPC and Fountain codes, the thesis focuses on the performance evaluation of ideal codes (i.e. Maximum Distance Separable codes) working at the UL. In particular, we analyze the performance of UL-FEC techniques in Land Mobile Satellite channels. We derive an analytical framework which is a useful system design tool, allowing the performance of the upper layer decoder to be foreseen. We also analyze a system in which upper layer and physical layer codes work together, and we derive the optimal splitting of redundancy when a frequency non-selective, slowly varying fading channel is taken into account. The whole analysis is supported and validated through computer simulation. In the last part of the dissertation, we propose LDPC Convolutional Codes (LDPCCC) as a possible coding scheme for future UL-FEC applications. Since one of the main drawbacks of packet layer codes is the large decoding latency, we introduce a latency-constrained decoder for LDPCCC (called the windowed erasure decoder). We analyze the performance of state-of-the-art LDPCCC when our decoder is adopted. Finally, we propose a design rule which allows performance and latency to be traded off.
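The "ideal code" analysis mentioned above rests on a simple property of MDS codes: a codeword of n packets decodes if and only if at least k packets survive. On a memoryless packet erasure channel with loss probability p, the decoding failure probability is the binomial tail sketched below (the (n, k, p) values are illustrative, not the thesis's scenario).

```python
# Decoding failure probability of an (n, k) MDS code on a memoryless
# packet erasure channel with per-packet loss probability p:
#   P(fail) = P(fewer than k of n packets received)
from math import comb

def mds_failure_prob(n, k, p):
    """Binomial tail: probability that fewer than k packets arrive."""
    return sum(comb(n, r) * (1 - p) ** r * p ** (n - r) for r in range(k))

p_fail = mds_failure_prob(n=10, k=8, p=0.1)
```

Real LDPC or Fountain codes need slightly more than k received packets on average, so this MDS bound is the optimistic baseline against which practical UL-FEC schemes are judged.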
Abstract:
Progress in the miniaturization of electronic components and the design of wireless systems has paved the way towards ubiquitous and pervasive communications, enabling anywhere and anytime connectivity. Wireless devices present on, inside, or around the human body are becoming commonly used, leading to the class of body-centric communications. The presence of the body, with all its peculiar characteristics, has to be properly taken into account in the development and design of wireless networks in this context. This thesis addresses various aspects of body-centric communications, with the aim of investigating the network performance achievable in different scenarios. The main original contributions pertain to performance evaluation for Wireless Body Area Networks (WBANs) at the Medium Access Control layer: the application of Link Adaptation to these networks is proposed, Carrier Sense Multiple Access with Collision Avoidance algorithms used for WBANs are extensively investigated, and coexistence with other wireless systems is examined. Then, an analytical model for interference in wireless access networks is developed, which can be applied to the study of communication between devices located on humans and fixed nodes of an external infrastructure. Finally, results of experimental activities investigating human mobility and sociality are presented.
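A standard building block in MAC-layer performance evaluations of the kind described above (a generic slotted contention model, not the thesis's specific WBAN analysis) is the probability that exactly one of n contending nodes transmits in a slot, which is what makes a slot successful:

```python
# P(success) in a slot where each of n nodes independently transmits
# with probability p: exactly one transmitter, n-1 silent nodes.

def slot_success_prob(n, p):
    """Probability that exactly one of n contenders transmits in a slot."""
    return n * p * (1 - p) ** (n - 1)

success = slot_success_prob(n=10, p=0.1)
```

Choosing p = 1/n maximizes this expression, which is why contention windows in CSMA/CA-style protocols are scaled with the expected number of contenders.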
Abstract:
Body-centric communications are emerging as a new paradigm in the panorama of personal communications. Being concerned with human behaviour, they are suitable for a wide variety of applications. Advances in the miniaturization of portable devices to be placed on or around the body foster the diffusion of these systems, in which the human body is the key element defining communication characteristics. This thesis investigates the human impact on body-centric communications in its distinctive aspects. First of all, the unique propagation environment defined by the body is described through a scenario-based channel modeling approach, according to the communication scenario considered, i.e., on-body or on- to off-body. The novelty introduced pertains to the description of radio channel features accounting for multiple sources of variability at the same time. Secondly, the importance of proper channel characterisation is shown by integrating the on-body channel model into a system-level simulator, allowing a more realistic comparison of different Physical and Medium Access Control layer solutions. Finally, the structure of a comprehensive simulation framework for system performance evaluation is proposed. It aims to merge into one tool the mobility and social features typical of human beings, together with the propagation aspects, in a scenario where multiple users interact, sharing space and resources.
Abstract:
The task considered in this paper is performance evaluation of region segmentation algorithms in the ground-truth-based paradigm. Given a machine segmentation and a ground-truth segmentation, performance measures are needed. We propose to consider the image segmentation problem as one of data clustering and, as a consequence, to use measures for comparing clusterings developed in statistics and machine learning. By doing so, we obtain a variety of performance measures which have not been used before in image processing. In particular, some of these measures have the highly desired property of being a metric. Experimental results are reported on both synthetic and real data to validate the measures and compare them with others.
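The paper's core proposal, scoring a machine segmentation against a ground truth with clustering-comparison measures, can be illustrated with the Rand index, computed over all pairs of elements (pixels). The toy 1-D labelings below are illustrative only.

```python
# Rand index: fraction of element pairs on which two labelings agree
# (both place the pair in the same cluster, or both in different clusters).
from itertools import combinations

def rand_index(a, b):
    """Agreement between two labelings of the same elements, in [0, 1]."""
    agree = total = 0
    for i, j in combinations(range(len(a)), 2):
        total += 1
        agree += (a[i] == a[j]) == (b[i] == b[j])
    return agree / total

ri = rand_index([0, 0, 1, 1], [0, 0, 1, 2])  # one split region disagrees
```

Pair-counting measures like this are label-permutation invariant, which is exactly what segmentation comparison needs: region identities carry no meaning, only the grouping does.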
Abstract:
A tandem mass spectral database system consists of a library of reference spectra and a search program. State-of-the-art search programs show a high tolerance for variability in compound-specific fragmentation patterns produced by collision-induced decomposition and enable sensitive and specific 'identity search'. In this communication, the performance characteristics of two search algorithms combined with the 'Wiley Registry of Tandem Mass Spectral Data, MSforID' (Wiley Registry MSMS, John Wiley and Sons, Hoboken, NJ, USA) were evaluated. The search algorithms tested were the MSMS search algorithm implemented in the NIST MS Search program 2.0g (NIST, Gaithersburg, MD, USA) and the MSforID algorithm (John Wiley and Sons, Hoboken, NJ, USA). Sample spectra were acquired on different instruments, thus covering a broad range of possible experimental conditions, or were generated in silico. For each algorithm, more than 30,000 matches were performed. Statistical evaluation of the library search results revealed that, in principle, both search algorithms can be combined with the Wiley Registry MSMS to create a reliable identification tool. It appears, however, that a higher degree of spectral similarity is necessary to obtain a correct match with the NIST MS Search program. This characteristic of the NIST MS Search program has a positive effect on specificity, as it helps to avoid false positive matches (type I errors), but reduces sensitivity. Thus, particularly with sample spectra acquired on instruments whose setup differs from tandem-in-space-type fragmentation, a comparably higher number of false negative matches (type II errors) was observed when searching the Wiley Registry MSMS.
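A hedged sketch of the 'identity search' step: a sample spectrum is scored against each library spectrum with a similarity measure, and a simple cosine (dot-product) match on a common m/z grid is shown below. Real algorithms such as those in the NIST MS Search program and MSforID weight intensities and m/z values and tolerate peak shifts; the intensity values here are made up.

```python
# Cosine similarity between two spectra represented as intensity vectors
# on an aligned m/z grid (a common baseline for spectral library search).
from math import sqrt

def cosine_match(ref, sample):
    """Cosine similarity of two intensity vectors; 1.0 means identical shape."""
    dot = sum(r * s for r, s in zip(ref, sample))
    norm = sqrt(sum(r * r for r in ref)) * sqrt(sum(s * s for s in sample))
    return dot / norm

score = cosine_match([100, 40, 0, 15], [95, 42, 3, 10])
```

A stricter match threshold on such a score raises specificity at the cost of sensitivity, which is the specificity/sensitivity trade-off the abstract reports between the two programs.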
Abstract:
Information Centric Networking (ICN), as an emerging paradigm for the Future Internet, initially focused rather on bandwidth savings in wired networks, but it may also have significant potential to support communication in mobile wireless networks as well as in opportunistic network scenarios, where end systems have spontaneous but time-limited contact to exchange data. This chapter addresses why ICN has an important role in mobile and opportunistic networks by identifying several challenges in mobile and opportunistic Information-Centric Networks and discussing appropriate solutions for them. In particular, it discusses the issues of receiver and source mobility. Source mobility needs special attention; solutions based on routing protocol extensions, indirection, and the separation of name resolution and data transfer are discussed. Moreover, the chapter presents solutions for problems in opportunistic Information-Centric Networks. Among those are mechanisms for efficient content discovery in neighbour nodes, resume mechanisms to recover from intermittent connectivity disruptions, a novel agent delegation mechanism to offload content discovery and delivery to mobile agent nodes, and the exploitation of overhearing to populate the routing tables of mobile nodes. Some preliminary performance evaluation results of these mechanisms are provided.
Abstract:
Objective. The study reviewed one year of Texas hospital discharge data and Trauma Registry data for the 22 trauma service regions in Texas to identify regional variations in capacity, process of care, and clinical outcomes for trauma patients, and to analyze the statistical associations among capacity, process of care, and outcomes.
Methods. Cross-sectional study design covering one year of statewide Texas data. Indicators of trauma capacity, trauma care processes, and clinical outcomes were defined, and data were collected on each indicator. Descriptive analyses were conducted of regional variations in trauma capacity, process of care, and clinical outcomes at all trauma centers, at Level I and II trauma centers, and at Level III and IV trauma centers. Multilevel regression models were used to test the relations among trauma capacity, process of care, and outcome measures at all trauma centers, at Level I and II trauma centers, and at Level III and IV trauma centers, while controlling for confounders such as age, gender, race/ethnicity, injury severity, trauma center level, and urbanization.
Results. Significant regional variation was found among the 22 trauma service regions across Texas in trauma capacity, process of care, and clinical outcomes. The regional trauma bed rate (the average number of staffed beds per 100,000 population) varied significantly by trauma service region. Pre-hospital trauma care processes were significantly variable by region: EMS time, transfer time, and triage. Clinical outcomes, including mortality, hospital and intensive care unit length of stay, and hospital charges, also varied significantly by region. In the multilevel regression analysis, the average trauma bed rate was significantly related to trauma care processes, including ambulance delivery time, transfer time, and triage, after controlling for age, gender, race/ethnicity, injury severity, trauma center level, and urbanization at all trauma centers.
Among process-of-care measures, only transfer time was significantly associated with the regional average trauma bed rate at Level III and IV trauma centers. At all trauma centers, trauma mortality was the only outcome measure significantly associated with the regional average trauma bed rate, while hospital charges were the only outcome measure statistically related to the trauma bed rate at Level I and II trauma centers. The effect of confounders such as age, gender, race/ethnicity, injury severity, and urbanization on processes and outcomes varied significantly by trauma center level.
Conclusions. Regional variation in trauma capacity, process, and outcomes in Texas was extensive. Trauma capacity, age, gender, race/ethnicity, injury severity, trauma center level, and urbanization were significantly associated with trauma process and clinical outcomes, depending on trauma center level.
Key words: regionalized trauma systems, trauma capacity, pre-hospital trauma care, process, trauma outcomes, trauma performance, evaluation measures, regional variations