781 results for Portfolio Performance Evaluation
Abstract:
The acquisition of machinery used in timber harvesting requires high financial investment, which calls for assessments that define more precisely which machine, or set of machines, is most suitable for streamlining the operation. This study aimed to evaluate, technically and economically, the performance of a harvester in the first-cut harvest of a Eucalyptus forest. The technical analysis included a time-and-motion study, productivity, operational efficiency and mechanical availability. The economic analysis included operational cost, harvesting cost and energy consumption. The results obtained for the technical-economic parameters showed that diameter at breast height directly influenced harvester productivity; consequently, the lowest forest harvesting costs were obtained for the compartments with larger-diameter trees.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
The study proposes a constrained least squares (CLS) pre-distortion scheme for multiple-input single-output (MISO) multiple access ultra-wideband (UWB) systems. In this scheme, a simple objective function is defined that can be solved efficiently by a gradient-based algorithm. For the performance evaluation, scenarios CM1 and CM3 of the IEEE 802.15.3a channel model are considered. Results show that the CLS algorithm converges quickly and offers a good trade-off between the reduction of intersymbol interference (ISI) and multiple access interference (MAI) on one hand and the preservation of the signal-to-noise ratio (SNR) on the other, performing better than time-reversal (TR) pre-distortion.
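The abstract does not specify the exact objective function; as an illustrative sketch only, a CLS pre-distortion design can be pictured as a least squares fit under a transmit-energy constraint, solved by projected gradient descent. The channel taps, matrix sizes and step size below are all hypothetical:

```python
import numpy as np

def cls_predistortion(H, d, n_iter=500, step=0.05):
    """Illustrative constrained least squares pre-filter design.

    Minimizes ||H g - d||^2 subject to ||g|| <= 1 (unit transmit
    energy) via projected gradient descent. H is a hypothetical
    channel convolution matrix, d the desired overall response.
    """
    g = np.zeros(H.shape[1])
    for _ in range(n_iter):
        grad = 2 * H.T @ (H @ g - d)   # gradient of the LS objective
        g = g - step * grad            # gradient step
        norm = np.linalg.norm(g)
        if norm > 1.0:                 # project back onto the unit ball
            g = g / norm
    return g

# Toy example: a 3-tap channel, target a clean impulse (ISI suppression).
h = np.array([1.0, 0.5, 0.2])
L = 8
H = np.zeros((L + len(h) - 1, L))
for i in range(L):
    H[i:i + len(h), i] = h             # build the convolution matrix
d = np.zeros(H.shape[0])
d[0] = 1.0                             # desired response: an impulse
g = cls_predistortion(H, d)
```

The projection step is what makes the problem "constrained": without it this is plain least squares; with it, transmit energy never exceeds the budget.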
Abstract:
Background: In the analysis of the effects of cell treatment, such as drug dosing, identifying changes in gene network structure between normal and treated cells is a key task. A possible way to identify the changes is to compare the structures of networks estimated separately from data on normal and treated cells. However, this approach usually fails to estimate accurate gene networks due to the limited length of the time series data and measurement noise. Thus, approaches that identify changes in regulation by using time series data from both conditions in an efficient manner are needed. Methods: We propose a new statistical approach that is based on the state space representation of the vector autoregressive model and estimates gene networks under two different conditions in order to identify changes in regulation between the conditions. In the mathematical model of our approach, hidden binary variables are introduced to indicate the presence of regulation under each condition. The use of the hidden binary variables enables efficient data usage: data from both conditions are used for regulations that exist in both, while condition-specific regulations are estimated only from the corresponding data. In addition, the similarity of the networks under the two conditions is automatically taken into account through the design of the potential function for the hidden binary variables. To estimate the hidden binary variables, we derive a new variational annealing method that searches for the configuration of the binary variables maximizing the marginal likelihood. Results: For the performance evaluation, we use time series data from two topologically similar synthetic networks and confirm that our proposed approach estimates commonly existing regulations, as well as changes in regulation, with higher coverage and precision than other existing approaches in almost all experimental settings.
As a real data application, our approach is applied to time series data from normal human lung cells and human lung cells treated by stimulating EGF receptors and dosing the anticancer drug Gefitinib. In the treated lung cells, a cancer cell condition is simulated by the stimulation of EGF receptors, but the effect should be counteracted by the selective inhibition of EGF receptors by Gefitinib. Nevertheless, gene expression profiles actually differ between the conditions, and the genes related to the identified changes are considered possible off-targets of Gefitinib. Conclusions: On the synthetically generated time series data, our proposed approach identifies changes in regulation more accurately than existing methods. By applying the proposed approach to the time series data on normal and treated human lung cells, candidate off-target genes of Gefitinib were found. According to published clinical information, one of these genes may be related to a factor in interstitial pneumonia, which is known as a side effect of Gefitinib.
Abstract:
The objective of this study was to estimate, in a population of crossbred cattle, the non-additive genetic effects for weight at 205 and 390 days and scrotal circumference, and to evaluate the inclusion of these effects in the prediction of sires' breeding values using different estimation methodologies. In method 1, the data were pre-adjusted for the non-additive effects obtained by the least squares means method in a model that considered the direct additive, maternal and fixed non-additive genetic effects: direct and total maternal heterozygosities, and epistasis. In method 2, the non-additive effects were included as covariates in the genetic model. Breeding values for adjusted and non-adjusted data were predicted considering direct additive and maternal effects and, for weight at 205 days, also the permanent environmental effect, as random effects in the model. The breeding values of the sire categories for weight at 205 days were organized in files in order to verify changes in the magnitude of the predictions and in the ranking of animals under the two methods of correcting the data for non-additive effects. The non-additive effects were not similar in magnitude and direction between the two estimation methods, nor across the traits evaluated. Pearson and Spearman correlations between breeding values were higher than 0.94, so the use of different methods does not imply changes in the selection of animals.
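For readers unfamiliar with the two correlation measures quoted above: Pearson compares the breeding values themselves, while Spearman compares only their rankings, which is why values above 0.94 indicate little re-ranking of animals between methods. A minimal sketch with hypothetical breeding values:

```python
import numpy as np

def pearson(x, y):
    """Pearson correlation: linear association between raw values."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float(xc @ yc / np.sqrt((xc @ xc) * (yc @ yc)))

def spearman(x, y):
    """Spearman correlation: Pearson applied to ranks (no ties
    assumed), so it measures agreement in the ordering of animals."""
    rank = lambda v: np.argsort(np.argsort(v)).astype(float)
    return pearson(rank(np.asarray(v_ := x, dtype=object) if False else x),
                   rank(y))

# Hypothetical breeding values of the same six sires under two methods.
ebv_method1 = [12.1, 8.4, 15.0, 9.9, 11.2, 14.1]
ebv_method2 = [11.8, 8.9, 14.6, 10.3, 11.0, 13.8]
r_pearson = pearson(ebv_method1, ebv_method2)
r_spearman = spearman(ebv_method1, ebv_method2)
```

Here the two methods rank the sires identically, so Spearman is exactly 1 even though the raw values differ slightly.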
Abstract:
As part of the AMAZE-08 campaign during the wet season in the rainforest of central Amazonia, an ultraviolet aerodynamic particle sizer (UV-APS) was operated for continuous measurements of fluorescent biological aerosol particles (FBAP). In the coarse particle size range (>1 µm) the campaign median and quartiles of FBAP number and mass concentration were 7.3 × 10^4 m^-3 (4.0-13.2 × 10^4 m^-3) and 0.72 µg m^-3 (0.42-1.19 µg m^-3), respectively, accounting for 24% (11-41%) of total particle number and 47% (25-65%) of total particle mass. During the five-week campaign in February-March 2008 the concentration of coarse-mode Saharan dust particles was highly variable. In contrast, FBAP concentrations remained fairly constant over the course of weeks and had a consistent daily pattern, peaking several hours before sunrise, suggesting that the observed FBAP were dominated by nocturnal spore emission. This conclusion was supported by the consistent FBAP number size distribution peaking at 2.3 µm, also attributed to fungal spores and mixed biological particles by scanning electron microscopy (SEM), light microscopy and biochemical staining. A second primary biological aerosol particle (PBAP) mode between 0.5 and 1.0 µm was also observed by SEM, but exhibited little fluorescence and no true fungal staining. This mode may have consisted of single bacterial cells, brochosomes, various fragments of biological material, and small Chromalveolata (Chromista) spores. Particles liquid-coated with mixed organic-inorganic material constituted a large fraction of the observations, and these coatings contained salts likely of primary biological origin. We provide key support for the suggestion that real-time laser-induced fluorescence (LIF) techniques using 355 nm excitation provide size-resolved concentrations of FBAP as a lower limit for the atmospheric abundance of biological particles in a pristine environment.
We also show some limitations of using the instrument for ambient monitoring of weakly fluorescent particles <2 µm. Our measurements confirm that primary biological particles, fungal spores in particular, are an important fraction of supermicron aerosol in the Amazon and may contribute significantly to hydrological cycling, especially when coated with mixed inorganic material.
Abstract:
The controllability principle holds that managers should be evaluated based on controllable factors. Consequently, managerial incentives would be related to management accounting practices capable of preventing managers from being held accountable for financial results beyond managerial control, such as responsibility-center analysis, standard costing, transfer pricing, budgeting and performance evaluation. This article develops a field study to investigate whether there is a relationship between the presence of managerial incentives and the management accounting practices associated with the controllability principle. On-site interviews were conducted to collect data at the organizational level, and non-parametric statistical tests were used for the data analysis. Among the management accounting practices examined, the results suggest that only the annual budget, responsibility-center analysis and performance evaluation are associated with the presence of incentive systems in the companies interviewed.
Abstract:
Current commercial and academic OLAP tools do not process XML data that contains XLink. To overcome this limitation, this paper proposes an analytical system built around LMDQL, an analytical query language. In addition, the XLDM metamodel is provided to model cubes of XML documents with XLink and to deal with the syntactic, semantic and structural heterogeneities commonly found in XML documents. Since the current W3C query languages for navigating XML documents do not support XLink, XLPath is discussed in this article to provide features for LMDQL query processing. A prototype system enabling the analytical processing of XML documents that use XLink is also detailed. This prototype includes a driver, named sql2xquery, which maps SQL queries into XQuery. To validate the proposed system, a case study and its performance evaluation are presented, analyzing the impact of analytical processing over XML/XLink documents.
Abstract:
This study aimed to build a methodology for evaluating the performance of municipalities in the state of São Paulo regarding technical efficiency in the application of public resources to primary health care, and to analyze the influence of non-controllable variables on the production process in that area. Technical efficiency is one of the parameters for evaluating the performance of public managers, reflecting an entity's capacity to obtain maximum outputs with the lowest consumption of inputs. This metric can be hindered or favored by environmental (non-controllable) variables which, if not considered in the performance evaluation, may generate biases. Accordingly, using two-stage Data Envelopment Analysis (DEA), the efficiency scores of the municipalities were estimated and then adjusted by means of regression analysis. The results indicated that it would be possible to considerably increase the amount of services provided to the population without new budget allocations in most municipalities. Furthermore, a higher proportion of elderly people in a jurisdiction makes service provision more expensive, while higher population density, degree of urbanization and scale of health facilities favor efficient public spending. The five municipalities considered most efficient were Tuiuti, Nova Guataporanga, Sabino, Lins and Santos.
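As a rough illustration of the two-stage logic (not the full DEA linear programs used in the study), with a single input and a single output the efficiency score reduces to a best-ratio comparison, and the second stage adjusts the scores for a non-controllable variable by regression. All figures below are hypothetical:

```python
import numpy as np

# Stage 1 (simplified): with one input and one output, the DEA
# efficiency of each municipality reduces to its output/input
# ratio divided by the best ratio in the sample.
spending = np.array([100.0, 80.0, 120.0, 90.0])  # input: health budget
services = np.array([50.0, 48.0, 54.0, 27.0])    # output: services delivered
ratio = services / spending
efficiency = ratio / ratio.max()                 # 1.0 = on the frontier

# Stage 2: regress efficiency on a non-controllable variable
# (e.g. share of elderly residents) and keep the residuals as
# environment-adjusted scores.
elderly_share = np.array([0.10, 0.12, 0.09, 0.20])
X = np.column_stack([np.ones_like(elderly_share), elderly_share])
beta, *_ = np.linalg.lstsq(X, efficiency, rcond=None)
adjusted = efficiency - X @ beta                 # residual (adjusted) scores
```

The residuals compare each unit only against what its environment would predict, which is the point of running the second stage.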
Abstract:
Companies have increasingly perceived the advantages of establishing alliances with one another, forming networks. These advantages may be linked to a wide variety of objectives, which are measured by different performance indicators. This research aimed to determine the structural characteristics that are most suitable, considering a series of possible performance indicators, for a collaborative inter-organizational network. To this end, an exploratory literature survey was conducted in two databases, and a qualitative approach was used to systematize the information found. As the main result of this work, a theoretical framework was built that systematizes the relationship between the structure and the performance of a network. We believe this article can serve as an initial contribution to a still little-explored field of research.
Abstract:
Today, third-generation networks are a consolidated reality, and user expectations of new applications and services are growing ever higher. Therefore, new systems and technologies are necessary to meet market needs and user requirements. This has driven the development of fourth-generation networks. "Wireless networks for the fourth generation" is the expression used to describe the next step in wireless communications. There is no formal definition of what these fourth-generation networks are; however, we can say that next-generation networks will be based on the coexistence of heterogeneous networks, on integration with existing radio access networks (e.g. GPRS, UMTS, WiFi) and, in particular, on new emerging architectures that are gaining more and more relevance, such as Wireless Ad Hoc and Sensor Networks (WASN). Thanks to their characteristics, fourth-generation wireless systems will be able to offer custom-made solutions and applications personalized according to user requirements; they will offer all types of services at an affordable cost, with flexibility, scalability and reconfigurability. This PhD work focused on WASNs: self-configuring networks that are not based on a fixed infrastructure, where devices must automatically generate the network in the initial phase and maintain it through reconfiguration procedures (when node mobility, energy drain, and similar events cause disconnections). The main part of the PhD activity focused on an analytical study of connectivity models for wireless ad hoc and sensor networks, although a small part of the work was experimental. Both the theoretical and experimental activities shared a common aim: the performance evaluation of WASNs.
Concerning the theoretical analysis, the objective of the connectivity studies was the evaluation of models for interference estimation, since interference is the most important cause of performance degradation in WASNs. It is therefore important to find an accurate model for its investigation, and I tried to obtain a model as realistic and general as possible, in particular for the evaluation of the interference coming from bounded interfering areas (e.g. a WiFi hotspot or a wireless-covered research laboratory). On the other hand, the experimental activity led to throughput and packet error rate measurements on a real IEEE 802.15.4 wireless sensor network.
Abstract:
The thesis deals with channel coding theory applied to the upper layers of the protocol stack of a communication link and is the outcome of four years of research activity. A specific aspect of this activity has been the continuous interaction between the natural curiosity of academic blue-sky research and the system-oriented design deriving from collaboration with European industry in the framework of European funded research projects. In this dissertation, the classical channel coding techniques traditionally applied at the physical layer find their application at upper layers, where the encoding units (symbols) are packets of bits rather than single bits; such upper layer coding techniques are therefore usually referred to as packet layer coding. The rationale behind the adoption of packet layer techniques is that physical layer channel coding is a suitable countermeasure against small-scale fading, while it is less efficient against large-scale fading. This is mainly due to the limited time diversity inherent in the need to adopt a physical layer interleaver of reasonable size so as to avoid increasing modem complexity and the latency of all services. Packet layer techniques, thanks to their longer codeword duration (each codeword is composed of several packets of bits), offer intrinsically longer protection against long fading events. Furthermore, being implemented at an upper layer, packet layer techniques have the indisputable advantages of simpler implementation (very close to a software implementation) and of selective applicability to different services, thus enabling a better match with service requirements (e.g. latency constraints).
Packet layer coding has been widely recognized in recent communication standards as a viable and efficient coding solution: Digital Video Broadcasting standards, like DVB-H, DVB-SH and DVB-RCS mobile, and 3GPP standards (MBMS) employ packet coding techniques working at layers higher than the physical one. In this framework, the aim of the research work has been the study of state-of-the-art coding techniques working at the upper layer, the performance evaluation of these techniques in realistic propagation scenarios, and the design of new coding schemes for upper layer applications. After a review of the most important packet layer codes, i.e. Reed-Solomon, LDPC and Fountain codes, the thesis focuses on the performance evaluation of ideal codes (i.e. Maximum Distance Separable codes) working at the upper layer. In particular, we analyze the performance of UL-FEC techniques in Land Mobile Satellite channels. We derive an analytical framework, a useful tool for system design that allows the performance of the upper layer decoder to be foreseen. We also analyze a system in which upper layer and physical layer codes work together, and we derive the optimal splitting of redundancy when a frequency non-selective, slowly varying fading channel is taken into account. The whole analysis is supported and validated through computer simulation. In the last part of the dissertation, we propose LDPC Convolutional Codes (LDPCCC) as a possible coding scheme for future UL-FEC applications. Since one of the main drawbacks of adopting packet layer codes is the large decoding latency, we introduce a latency-constrained decoder for LDPCCC (called the windowed erasure decoder). We analyze the performance of state-of-the-art LDPCCC when our decoder is adopted. Finally, we propose a design rule which allows performance and latency to be traded off.
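The benefit of an ideal Maximum Distance Separable packet code can be illustrated numerically: an (n, k) MDS code decodes whenever at least k of the n transmitted packets survive, so over an i.i.d. packet erasure channel the failure probability is a binomial tail. A minimal sketch (the parameters n = 20, k = 10 and the 10% erasure rate are hypothetical, not from the thesis):

```python
from math import comb

def mds_failure_prob(n, k, p):
    """Probability that an ideal (n, k) MDS packet code cannot
    decode over an i.i.d. packet erasure channel with erasure
    probability p: decoding fails when more than n - k of the
    n packets are erased."""
    return sum(comb(n, e) * p**e * (1 - p)**(n - e)
               for e in range(n - k + 1, n + 1))

# Uncoded packet loss rate vs coded failure rate at 10% erasures.
p = 0.10
uncoded = p
coded = mds_failure_prob(n=20, k=10, p=p)
```

Spreading a codeword over many packets is what buys protection against long fading events: a burst must erase more than n - k packets before any data is lost.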
Abstract:
Sports biomechanics describes human movement from a performance enhancement and an injury reduction perspective. In this respect, the purpose of sports scientists is to support coaches and physicians with reliable information about athletes' technique. The lack of methods allowing for in-field athlete evaluation, as well as for accurate joint force estimates, represents, to date, the main limitation to this purpose. The investigations illustrated in the present thesis aimed at contributing to the development of the above-mentioned methods. Two complementary approaches were adopted: a Low Resolution Approach, related to performance assessment, in which wearable inertial measurement units are exploited during different phases of sprint running; and a High Resolution Approach, related to joint kinetics estimation for injury prevention, in which subject-specific, non-rigid constraints for the knee joint kinematic model used in multi-body optimization techniques are defined. Results obtained using the Low Resolution Approach indicated that, due to their portability and low cost, inertial measurement systems are a valid alternative to laboratory-based instrumentation for in-field performance evaluation of sprint running. Using acceleration and angular velocity data, the following quantities were estimated: trunk inclination and angular velocity, instantaneous horizontal velocity and displacement of a point approximating the centre of mass, and stride and support phase durations. As concerns the High Resolution Approach, results indicated that the lengths of the anterior cruciate and lateral collateral ligaments decreased, while that of the deep bundle of the medial collateral ligament increased significantly during flexion. Variations in the lengths of the posterior cruciate ligament and the superficial bundle of the medial collateral ligament were concealed by the experimental indeterminacy.
A mathematical model was provided that allows the estimation of subject-specific ligament lengths as a function of knee flexion and that can be integrated into a multi-body optimization procedure.
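As an aside on the Low Resolution Approach, estimating instantaneous velocity and displacement from inertial data can be sketched as numerical integration of acceleration samples. This is only an illustrative, hypothetical example; real processing must also handle sensor orientation, bias and drift:

```python
import numpy as np

def integrate_acceleration(acc, dt, v0=0.0):
    """Estimate instantaneous velocity and displacement from
    forward-acceleration samples by trapezoidal integration."""
    v = v0 + np.concatenate(
        ([0.0], np.cumsum((acc[1:] + acc[:-1]) / 2 * dt)))
    x = np.concatenate(
        ([0.0], np.cumsum((v[1:] + v[:-1]) / 2 * dt)))
    return v, x

# Hypothetical 2 s of constant 3 m/s^2 forward acceleration at 100 Hz,
# roughly mimicking the initial push-off of a sprint start.
dt = 0.01
acc = np.full(201, 3.0)
v, x = integrate_acceleration(acc, dt)
```

With constant acceleration the trapezoidal rule is exact, so the final velocity is a·t = 6 m/s and the displacement a·t²/2 = 6 m, which makes the toy case easy to check.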
Abstract:
Gossip protocols have been analyzed as a feasible solution for data dissemination in peer-to-peer networks. In this thesis, a new data dissemination protocol is proposed and compared with other known gossip mechanisms. The performance evaluation is based on simulation.
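As a toy illustration of the kind of simulation-based evaluation mentioned above (the thesis protocol itself is not specified here; the push model, fanout and network size are hypothetical), a basic push-gossip round count can be simulated in a few lines:

```python
import random

def push_gossip_rounds(n, fanout=1, seed=42):
    """Simulate push gossip: each round, every informed node
    forwards the message to `fanout` peers chosen uniformly at
    random. Returns the number of rounds until all n nodes are
    informed (a toy model of gossip-based data dissemination)."""
    rng = random.Random(seed)
    informed = {0}                     # node 0 holds the message initially
    rounds = 0
    while len(informed) < n:
        targets = {rng.randrange(n)
                   for _src in informed for _ in range(fanout)}
        informed |= targets
        rounds += 1
    return rounds

rounds = push_gossip_rounds(n=1000)
```

Since the informed set can at most double each round with fanout 1, at least log2(n) rounds are always needed; simulations like this are typically used to compare dissemination speed across gossip variants.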