431 results for Métricas


Relevance: 10.00%

Abstract:

This work presents logistic regression modeling aimed at predicting default among the clients that make up the portfolio of a large Brazilian financial institution. It explores the idea of provisioning purely and simply through the estimation of a probability of default given by one or more statistical models built over the course of the work, as encouraged by the Basel Committee. One of the models is built after first segmenting the population into clusters; the other technique explored is a single model with no segmentation. The objective is to compare the two risk-classification metrics, examine the trade-offs between them, and assess the impact of macroeconomic variables on these models.
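As a rough sketch of the comparison described above (a single pooled probability-of-default model versus cluster-segmented models), the following Python example uses synthetic data and invented column names; it illustrates the idea, not the dissertation's actual models:

```python
# Hypothetical sketch: pooled vs. cluster-segmented logistic regression for
# probability of default (PD). Data and column names are invented.
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5_000
df = pd.DataFrame({
    "income": rng.lognormal(8, 1, n),
    "debt_ratio": rng.uniform(0, 1, n),
    "unemployment": rng.normal(8, 2, n),   # stand-in macroeconomic variable
})
# Synthetic default flag, just to make the sketch runnable.
z = -3 + 2 * df["debt_ratio"] + 0.1 * df["unemployment"]
df["default"] = rng.uniform(size=n) < 1 / (1 + np.exp(-z))

X, y = df.drop(columns="default"), df["default"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Pooled model: one logistic regression for the whole portfolio.
pooled = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("pooled AUC:", roc_auc_score(y_te, pooled.predict_proba(X_te)[:, 1]))

# Segmented model: cluster first, then fit one PD model per cluster.
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X_tr)
models = {c: LogisticRegression(max_iter=1000)
             .fit(X_tr[km.labels_ == c], y_tr[km.labels_ == c])
          for c in range(3)}
te_clusters = km.predict(X_te)
pd_hat = np.array([models[c].predict_proba(x.reshape(1, -1))[0, 1]
                   for c, x in zip(te_clusters, X_te.to_numpy())])
print("segmented AUC:", roc_auc_score(y_te, pd_hat))
```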

Relevance: 10.00%

Abstract:

This work studies the closure of multistrategy investment funds in the Brazilian market with respect to the influence of the following variables: management and performance fees, age, performance, net assets, and net inflows. The selected funds are open-end funds aimed at investors in general, observed over two kinds of windows, annual and quarterly, from 2008 to 2014. The analyses are run separately for two types of funds: Master funds and funds of funds (FICs). Fund performance is measured with two different metrics: the return net of the DI rate and the Sharpe ratio. The variables were evaluated under two distinct models, the first using the variables in absolute values and the second classifying them into quintiles. The results were obtained through panel logit, panel, and cross-section regressions. They show the importance of net assets and net inflows, followed by age. Performance, whether measured as excess return or as the Sharpe ratio, had less influence, and the management and performance fees had little significance. A second finding is the difference in the significance of the variables between Master and FIC funds: in the Master group, only net assets showed a consistent influence on closure events in annual windows. Another important finding is the difference between quarterly and annual windows: most of the variables studied, especially the performance variables, are more significant in the shorter windows.
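A minimal sketch of the kind of closure model described above: a logit of a fund-closure indicator on fund characteristics, fitted with statsmodels. The data and variable names are invented for illustration:

```python
# Illustrative logit of fund closure on fund characteristics. All data here
# is synthetic; the study's actual panel structure is not reproduced.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2_000
funds = pd.DataFrame({
    "age_years": rng.uniform(0, 15, n),
    "net_assets": rng.lognormal(16, 1, n),
    "net_inflows": rng.normal(0, 1e6, n),
    "sharpe": rng.normal(0.3, 0.5, n),
    "mgmt_fee": rng.uniform(0.005, 0.03, n),
})
# Synthetic closure flag: small funds with outflows close more often.
z = -1 - 0.5 * np.log(funds["net_assets"] / 1e7) - funds["net_inflows"] / 1e6
funds["closed"] = (rng.uniform(size=n) < 1 / (1 + np.exp(-z))).astype(int)

X = sm.add_constant(np.column_stack([
    funds["age_years"],
    np.log(funds["net_assets"]),
    funds["net_inflows"] / 1e6,
    funds["sharpe"],
    funds["mgmt_fee"],
]))
res = sm.Logit(funds["closed"], X).fit(disp=0)
print(res.summary())
```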

Relevance: 10.00%

Abstract:

Measuring country risk is of utmost importance at a time of frequent international portfolio diversification. This work seeks to understand which variables matter in these metrics, with a primary focus on institutional aspects. To that end, we analyze the Credit Default Swap (CDS) and the Emerging Markets Bond Index (EMBI), which, besides measuring country risk, are also financial products bought and sold by hedgers and speculators; their prices are therefore set by the market. The intention here is to analyze whether the institutional aspects of countries, as well as changes in them, matter in determining this risk, without, of course, neglecting each country's economic variables. By institutional aspects we mean the structure of the state: the quality of democracy and the level of corruption in each country, press freedom, the socioeconomic level of the population, whether the country is parliamentary, the influence of the legal system, among other variables.

Relevance: 10.00%

Abstract:

In this dissertation we advocate a new way of measuring the software product, based on measures used in complex systems theory. We consider the use of these measures advantageous compared with traditional software engineering measures. The novelty of this dissertation lies in treating the software product as a complex system, endowed with a multi-level structure, and in proposing the long-range correlation as a measure of the structural complexity of source code. This measure, invariant with respect to the scale of each level of the structure, can be computed automatically. In the dissertation we first describe the software development process and the existing measures for that process and product, and introduce complex systems theory. We conclude that the process has the characteristics of a complex system and propose that it be measured as such. We then study the structure of the product and the dynamics of its development process. We present an experimental study of algorithms coded in C, which we use to validate hypotheses about the complexity of the product's structure. We propose the long-range correlation as a measure of structural complexity and extend this measure to a sample coded in Java. We conclude by pointing out the limitations and potential of this measure and its application in Software Engineering.
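The dissertation's exact long-range-correlation measure is not reproduced here. Detrended fluctuation analysis (DFA) is one standard estimator of long-range correlation in a sequence; the sketch below applies it to a crude, invented proxy for program structure (per-line indentation depth), purely as a toy illustration:

```python
# Toy DFA estimate of long-range correlation in a source-code-derived series.
import numpy as np

def dfa_exponent(series, scales=(8, 16, 32, 64, 128)):
    """Estimate the DFA scaling exponent alpha of a 1-D series.

    alpha ~ 0.5 suggests no long-range correlation; alpha > 0.5 suggests
    persistent long-range correlation.
    """
    x = np.asarray(series, dtype=float)
    profile = np.cumsum(x - x.mean())          # integrated series
    fluctuations = []
    for s in scales:
        n_seg = len(profile) // s
        segs = profile[: n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        # Detrend each segment with a least-squares line, keep the RMS residual.
        rms = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)
            rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
        fluctuations.append(np.mean(rms))
    alpha, _ = np.polyfit(np.log(scales), np.log(fluctuations), 1)
    return alpha

# Toy usage (run as a saved script): map each of this file's nonblank source
# lines to its indentation depth, a crude proxy for nesting structure.
source = open(__file__).read().splitlines()
depth = [len(line) - len(line.lstrip()) for line in source if line.strip()]
depth = depth * 20                              # pad so the largest scale fits
print("DFA alpha:", round(dfa_exponent(depth), 3))
```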

Relevance: 10.00%

Abstract:

PEDROSA, Diogo P. F.; MEDEIROS, Adelardo A. D.; ALSINA, Pablo J. Uma Proposta de SLAM com Determinação de Informações Geométricas do Ambiente. In: CONGRESSO BRASILEIRO DE AUTOMÁTICA, 16., Salvador, BA, 2006. Anais... Salvador: CBA, 2006. v. 1, p. 1704-1709.

Relevance: 10.00%

Abstract:

VANTI, Nadia. Links hipertextuais na comunicação científica: análise webométrica dos sítios acadêmicos latino-americanos em Ciências Sociais. 2007. 292 f. Tese (Doutorado em Comunicação e Informação) – Universidade Federal do Rio Grande do Sul, Porto Alegre, 2007.

Relevance: 10.00%

Abstract:

This study investigates the influence of asset class and the breakdown of tangibility as determinants of the capital structure of companies listed on the BM&FBOVESPA in the period 2008-2012. Two classes of current assets were composed; grouped by liquidity, they are also the classes financial institutions analyze when granting credit: current resources (cash, bank deposits, and financial investments) and trade operations (inventories and receivables). The breakdown of tangible assets was based on their main components pledged as collateral for loans, namely machinery & equipment and land & buildings. To extend the analysis, three leverage metrics (accounting, financial, and market) were applied and the sample was divided into the economic sectors adopted by BM&FBOVESPA. A dynamic panel data model estimated by two-step system GMM was used owing to its robustness to endogeneity problems as well as omitted-variable bias. The results suggest that current resources are determinants of capital structure, possibly because they are characterized as proxies for financial solvency, their relationship with debt being positive. The sectoral analysis confirmed the results for current resources. The tangibility of assets is inversely related to leverage. When tangibility is broken down into its main components, the significant negative influence of machinery & equipment was most marked in the Industrial Goods sector. This result shows that, on average, the assets most specific to a company's operating activities work toward a lower use of third-party resources. As complementary results, leverage was found to be persistent, which is linked with the static trade-off theory. Specifically for financial leverage, persistence is relevant when the lagged current-asset-class variables are controlled for. The proxy for growth opportunities, measured by the market-to-book ratio, has a coefficient whose sign contradicts expectations. Company size is positively related to debt, in favor of the static trade-off theory. Profitability is the most consistent variable across all estimations, showing a strong, significant negative relationship with leverage, as the pecking order theory predicts.
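Two-step system GMM requires specialized tooling; as a simplified stand-in, the sketch below fits a fixed-effects panel regression of leverage on tangibility, size, and profitability with the linearmodels package (assumed installed), on invented data:

```python
# Simplified stand-in for the paper's dynamic-panel GMM: a fixed-effects
# panel regression of leverage on firm characteristics. Data is synthetic.
import numpy as np
import pandas as pd
from linearmodels.panel import PanelOLS

rng = np.random.default_rng(2)
firms, years = 100, 5
idx = pd.MultiIndex.from_product(
    [range(firms), range(2008, 2008 + years)], names=["firm", "year"])
df = pd.DataFrame({
    "tangibility": rng.uniform(0, 1, firms * years),
    "size": rng.normal(14, 2, firms * years),        # log total assets
    "profitability": rng.normal(0.08, 0.05, firms * years),
}, index=idx)
df["leverage"] = (0.3 - 0.2 * df["tangibility"] + 0.01 * df["size"]
                  - 0.5 * df["profitability"]
                  + rng.normal(0, 0.05, len(df)))

# Entity (firm) fixed effects; standard errors clustered by firm.
res = PanelOLS(df["leverage"],
               df[["tangibility", "size", "profitability"]],
               entity_effects=True).fit(cov_type="clustered",
                                        cluster_entity=True)
print(res)
```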

Relevance: 10.00%

Abstract:

The main goal of the present study is to propose a methodological approach to the teaching of Geometry and, in particular, to the construction of the concepts of circle (circumference) and ellipse by 7th and 8th grade students. In order to aid the students in the construction of these concepts, we developed a module based on mathematical modeling and on both Urban Geometry (Taxicab Geometry) and Isoperimetric Geometry. Our analysis was based on Jean Piaget's Equilibration Theory. Emphasizing the use of intuition grounded in accumulated past experience, the students were encouraged to come up with a hypothesis, try it out, discuss it with their peers, and derive conclusions. Although the graphs of circles and ellipses assume different shapes in Urban and Isoperimetric Geometry than they do in standard Euclidean Geometry, their definitions are identical regardless of the metric used. Thus, by comparing the graphs produced under the different metrics, the students were able to consolidate their understanding of these concepts. The intervention took place in a series of small-group activities. At the end of the study, the 53 seventh-grade and 55 eighth-grade students had a better understanding of the concepts of circle and ellipse.
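A small sketch of the point the module relies on: the definition of a circle (all points at distance r from a center) is metric-independent, but the resulting shape is not. On an integer grid, the Euclidean circle of radius 3 picks out 4 points while the taxicab circle picks out 12, tracing a diamond:

```python
# Same definition of "circle", two different metrics, two different shapes.
import numpy as np

def euclidean(p, q):
    return np.hypot(p[0] - q[0], p[1] - q[1])

def taxicab(p, q):
    return abs(p[0] - q[0]) + abs(p[1] - q[1])

center, r = (0.0, 0.0), 3.0
xs, ys = np.meshgrid(np.linspace(-4, 4, 9), np.linspace(-4, 4, 9))
for dist, name in [(euclidean, "Euclidean"), (taxicab, "taxicab")]:
    on_circle = [(x, y) for x, y in zip(xs.ravel(), ys.ravel())
                 if abs(dist((x, y), center) - r) < 1e-9]
    print(f"{name} circle of radius {r}: {len(on_circle)} grid points")
    print(on_circle)
```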

Relevance: 10.00%

Abstract:

This work addresses issues related to the analysis and development of multivariable predictive controllers based on bilinear multi-models. Linear Generalized Predictive Control (GPC), both monovariable and multivariable, is presented, with its properties, key features, and industrial applications highlighted. Bilinear GPC, the basis for the development of this thesis, is presented through the time-step quasilinearization approach. Some results using this controller are presented to show its better performance compared with linear GPC, since bilinear models better represent the dynamics of certain processes. Because time-step quasilinearization is an approximation, it introduces a prediction error, which limits the performance of this controller as the prediction horizon increases. To minimize this error, Bilinear GPC with iterative compensation is presented, seeking better performance than the classic Bilinear GPC; results of the iterative compensation algorithm are shown. The use of multi-models is discussed in this thesis in order to correct the deficiency of controllers based on a single model when they are applied over wide operating ranges. Methods of measuring the distance between models, also called metrics, are the main contribution of this thesis. Several applications on simulated distillation columns, which are close enough to their actual behaviour, were carried out, and the results were satisfactory.
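To make the quasilinearization point concrete, the toy sketch below (with made-up parameters) compares an exact bilinear simulation against a multi-step prediction whose input gain is frozen at the current output; the prediction error grows with the horizon, as described above:

```python
# Toy bilinear model: y(k+1) = a*y(k) + b*u(k) + c*y(k)*u(k).
a, b, c = 0.8, 0.5, 0.2          # invented parameters
y0, u = 1.0, 1.0                 # initial output, constant input
horizon = 10

# Exact bilinear simulation.
y_true = [y0]
for _ in range(horizon):
    y_true.append(a * y_true[-1] + b * u + c * y_true[-1] * u)

# Quasilinearized multi-step prediction: the input gain (b + c*y) is frozen
# at the current output y0 over the whole horizon.
b_eff = b + c * y0
y_qlin = [y0]
for _ in range(horizon):
    y_qlin.append(a * y_qlin[-1] + b_eff * u)

for k in (1, 5, 10):
    print(f"{k}-step ahead: exact={y_true[k]:.4f}, "
          f"frozen-gain={y_qlin[k]:.4f}, error={y_true[k] - y_qlin[k]:.4f}")
```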

Relevance: 10.00%

Abstract:

Internet applications such as media streaming, collaborative computing, and massively multiplayer games are on the rise. This leads to a need for multicast communication, but group communication support based on IP multicast has unfortunately not been widely adopted, due to a combination of technical and non-technical problems. A number of different application-layer multicast schemes have therefore been proposed in the recent literature to overcome these drawbacks. In addition, these applications often behave as both providers and clients of services, being called peer-to-peer applications, and their participants come and go very dynamically. Server-centric architectures for membership management thus have well-known problems related to scalability and fault tolerance, and even traditional peer-to-peer solutions need some mechanism that takes members' volatility into account. The idea of location awareness is to distribute the participants in the overlay network according to their proximity in the underlying network, allowing better performance. Given this context, this thesis proposes an application-layer multicast protocol, called LAALM, which takes the actual network topology into account when assembling the overlay network. The membership algorithm uses a new metric, IPXY, to provide location awareness through the processing of local information, and it was implemented using a distributed, shared, bi-directional tree. The algorithm also has a sub-optimal heuristic to minimize the cost of the membership process. The protocol was evaluated in two ways: first, through a simulator developed in this work, where the quality of the distribution tree was evaluated with metrics such as out-degree and path length; second, through real-life scenarios built in the ns-3 network simulator, where the protocol's network performance was evaluated with metrics such as stress, stretch, time to first packet, and group reconfiguration time.
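The IPXY metric itself is not specified in this summary, so the sketch below uses measured RTT as a placeholder proximity measure; it only illustrates the general shape of location-aware parent selection under an out-degree cap, not LAALM's actual membership algorithm:

```python
# Placeholder sketch of location-aware parent selection in an overlay tree.
# RTT stands in for the thesis's IPXY proximity metric; values are invented.
MAX_OUT_DEGREE = 4

def choose_parent(candidates, rtt_ms, children_count):
    """Pick the closest candidate that still has a free child slot."""
    usable = [(rtt_ms[c], c) for c in candidates
              if children_count.get(c, 0) < MAX_OUT_DEGREE]
    return min(usable)[1] if usable else None

rtt_ms = {"A": 12.0, "B": 4.0, "C": 30.0}
children = {"A": 1, "B": 4, "C": 0}   # B's out-degree is already at the cap
print(choose_parent(["A", "B", "C"], rtt_ms, children))  # prints A
```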

Relevance: 10.00%

Abstract:

Most onshore Oil and Gas monitoring environments are currently based on wireless solutions. However, these solutions rely on an out-of-date technological configuration, mainly analog radios and inefficient communication topologies. Solutions based on digital radios, on the other hand, can be more efficient with respect to energy consumption, security, and fault tolerance. This paper therefore evaluates whether Wireless Sensor Networks, a communication technology based on digital radios, are adequate for monitoring onshore Oil and Gas wells. The percentage of successfully transmitted packets, energy consumption, communication delay, and routing techniques applied to a mesh topology are used as metrics to validate the proposal across the different routing techniques, using the NS-2 network simulation tool.
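As an illustration of how such metrics are derived from a simulation run, the sketch below post-processes a simplified, invented trace format (not the actual NS-2 format) into packet delivery ratio and average delay:

```python
# Invented trace records: (event, packet_id, time_s). Not NS-2's real format.
events = [
    ("send", 1, 0.00), ("recv", 1, 0.12),
    ("send", 2, 0.50), ("recv", 2, 0.71),
    ("send", 3, 1.00),                      # packet 3 was lost
]

sent = {pid: t for ev, pid, t in events if ev == "send"}
recv = {pid: t for ev, pid, t in events if ev == "recv"}

pdr = len(recv) / len(sent)                 # packet delivery ratio
avg_delay = sum(recv[p] - sent[p] for p in recv) / len(recv)
print(f"packet delivery ratio: {pdr:.0%}, "
      f"average delay: {avg_delay * 1000:.0f} ms")
```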

Relevance: 10.00%

Abstract:

Ensuring dependability requirements is essential for industrial applications, since faults may cause failures whose consequences result in economic losses, environmental damage, or harm to people. Given the relevance of the topic, this thesis proposes a methodology for the dependability evaluation of industrial wireless networks (WirelessHART, ISA100.11a, WIA-PA) at the early design phase; the proposal can also be easily adapted to the maintenance and expansion stages of a network. The proposal uses graph theory and the fault-tree formalism to automatically create, from a given industrial wireless network topology, an analytical model on which dependability can be evaluated. The supported evaluation metrics are reliability, availability, MTTF (mean time to failure), importance measures of devices, redundancy aspects, and common-cause failures. It must be emphasized that the proposal is independent of any particular tool for quantitatively evaluating the target metrics; for validation purposes, however, a tool widely accepted in academia (SHARPE) was used. In addition, an algorithm for generating minimal cut sets, originally applied in graph theory, was adapted to the fault-tree formalism to guarantee the scalability of the methodology in industrial wireless network environments (< 100 devices). Finally, the methodology was validated on typical scenarios found in industrial environments, such as star, line, cluster, and mesh topologies. Scenarios with common-cause failures were also evaluated, along with best practices to guide the design of an industrial wireless network. To verify the scalability requirements, the performance of the methodology was analyzed in different scenarios, and the results show the applicability of the proposal to networks typically found in industrial environments.
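One way the fault-tree metrics above are computed is the rare-event approximation over minimal cut sets: system unavailability is approximated by the sum, over cut sets, of the product of the component unavailabilities in each set. The sketch below uses an invented toy topology:

```python
# Rare-event approximation of system unavailability from minimal cut sets.
from math import prod

# Per-device unavailability (probability the device is down); invented values.
q = {"gw": 1e-4, "r1": 1e-3, "r2": 1e-3, "s1": 5e-3}

# Minimal cut sets for a toy mesh: the sensor fails, the gateway fails, or
# both redundant routers fail together.
cut_sets = [{"s1"}, {"gw"}, {"r1", "r2"}]

unavailability = sum(prod(q[d] for d in cs) for cs in cut_sets)
print(f"approx. system unavailability: {unavailability:.2e}")
print(f"approx. availability: {1 - unavailability:.6f}")
```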

Relevance: 10.00%

Abstract:

This work presents experimental studies of VoIP connections over WiFi 802.11b networks with handoff. Indoor and outdoor network experiments were carried out to measure the QoS parameters delay, throughput, jitter, and packet loss. The performance parameters are obtained with the software tools Ekiga, Iperf, and Wimanager, which provide, respectively, VoIP connection simulation, network traffic generation, and acquisition of the throughput, jitter, and packet-loss metrics. The average delay is obtained from the measured throughput and the concept of packet virtual transmission time. The experimental data are validated against the QoS level accepted as adequate for each metric in the specialized literature.
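As an illustration of how the jitter and delay metrics above are typically computed from packet timestamps, the sketch below applies the RFC 3550 interarrival-jitter estimator to invented send/receive times:

```python
# RFC 3550 interarrival-jitter estimate plus average one-way delay.
packets = [  # (send_time_s, recv_time_s); invented measurements
    (0.00, 0.080), (0.02, 0.105), (0.04, 0.121), (0.06, 0.150),
]

jitter = 0.0
prev_transit = None
for send, recv in packets:
    transit = recv - send
    if prev_transit is not None:
        d = abs(transit - prev_transit)
        jitter += (d - jitter) / 16           # RFC 3550 running estimate
    prev_transit = transit

avg_delay = sum(r - s for s, r in packets) / len(packets)
print(f"average delay: {avg_delay * 1000:.1f} ms, "
      f"jitter: {jitter * 1000:.2f} ms")
```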