61 results for Performance Estimation
at Instituto Politécnico do Porto, Portugal
Residential property loans and performance during property price booms: evidence from European banks
Abstract:
Understanding the performance of banks is of the utmost relevance because of the impact of this sector on economic growth and financial stability. Of all the assets that make up a bank's portfolio, residential mortgage loans are among the most important. Using the dynamic panel data method, we analyse the influence of residential mortgage loans on bank profitability and risk, using a sample of 555 banks in the European Union (EU-15) over the period from 1995 to 2008. We find that banks with larger weights of residential mortgage loans show lower credit risk in good times. This result explains why banks rush to lend on property during booms, given the positive effect this lending has on credit risk. The results further show that credit risk and profitability are lower during the upturn in the residential property price cycle. The results also reveal the existence of a non-linear relationship (U-shaped marginal effect), as a function of the bank's risk, between profitability and residential mortgage loan exposure. For banks with high credit risk, a large exposure to residential mortgage loans is associated with higher risk-adjusted profitability, through lower risk. For banks with moderate to low credit risk, the effects of higher residential mortgage loan exposure on risk-adjusted profitability are also positive or marginally positive.
Abstract:
In the last few years, the number of systems and devices that use voice-based interaction has grown significantly. For these systems to be used continuously, the interface must be reliable and pleasant in order to provide an optimal user experience. However, there are currently very few studies that evaluate, from a perceptual point of view, how pleasant a voice is when the final application is a speech-based interface. In this paper we present an objective definition of voice pleasantness based on the composition of a representative feature subset, and a new automatic voice pleasantness classification and intensity estimation system. Our study is based on a database of European Portuguese female voices, but the methodology can be extended to male voices or to other languages. In the objective performance evaluation, the system achieved a 9.1% error rate for voice pleasantness classification and a 15.7% error rate for voice pleasantness intensity estimation.
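The abstract does not detail the classifier itself, so the following is only a minimal sketch of the two tasks it describes: categorical pleasantness classification and continuous intensity estimation from a subset of acoustic features. The feature set, the SVM/SVR models and the synthetic data are assumptions for illustration, not the authors' system.

```python
# Hypothetical sketch: pleasantness classification + intensity regression
# from acoustic features. The feature set and models are assumptions,
# not the system described in the paper.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC, SVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Placeholder data: 200 utterances x 12 acoustic features (e.g. pitch,
# jitter, shimmer, HNR, formants...) -- purely synthetic here.
X = rng.normal(size=(200, 12))
y_class = rng.integers(0, 2, size=200)        # pleasant / not pleasant
y_intensity = rng.uniform(0, 1, size=200)     # pleasantness intensity in [0, 1]

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
reg = make_pipeline(StandardScaler(), SVR(kernel="rbf"))

# Error rate = 1 - accuracy for classification; mean absolute error
# for intensity estimation on the 0-1 scale.
acc = cross_val_score(clf, X, y_class, cv=5, scoring="accuracy").mean()
mae = -cross_val_score(reg, X, y_intensity, cv=5,
                       scoring="neg_mean_absolute_error").mean()
print(f"classification error rate: {1 - acc:.1%}")
print(f"intensity estimation error: {mae:.3f} (on a 0-1 scale)")
```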
Abstract:
Radio link quality estimation in Wireless Sensor Networks (WSNs) has a fundamental impact on network performance and also affects the design of higher-layer protocols. Therefore, for about a decade, it has attracted a vast body of research. Reported works on link quality estimation are typically based on different assumptions, consider different scenarios, and provide radically different (and sometimes contradictory) results. This article provides a comprehensive survey of the related literature, covering the characteristics of low-power links, the fundamental concepts of link quality estimation in WSNs, a taxonomy of existing link quality estimators, and their performance analysis. To the best of our knowledge, this is the first survey tackling link quality estimation in WSNs in detail. We believe our effort will serve as a reference to orient researchers and system designers in this area.
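For context on the commodity estimators such a survey covers, below is a minimal sketch of two of the most common ones, PRR and ETX, computed from per-window packet counts; the window size and the textbook ETX definition 1/(PRR_fwd × PRR_bwd) are standard forms assumed here, not taken from the article.

```python
# Minimal sketch of two standard link quality estimators from the WSN
# literature: PRR (Packet Reception Ratio) and ETX (Expected Transmission
# Count). Window size and example counts are illustrative choices.

def prr(received: int, sent: int) -> float:
    """Packet Reception Ratio over a window of 'sent' probe packets."""
    return received / sent if sent else 0.0

def etx(prr_forward: float, prr_backward: float) -> float:
    """Expected number of transmissions (including retransmissions) for one
    successful delivery plus acknowledgement on this link."""
    p = prr_forward * prr_backward
    return float("inf") if p == 0 else 1.0 / p

# Example: 87 of 100 probes received forward, 92 of 100 backward.
pf, pb = prr(87, 100), prr(92, 100)
print(f"PRR fwd={pf:.2f}, PRR bwd={pb:.2f}, ETX={etx(pf, pb):.2f}")
```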
Abstract:
Radio link quality estimation is essential for protocols and mechanisms such as routing, mobility management and localization, particularly for low-power wireless networks such as wireless sensor networks. Commodity Link Quality Estimators (LQEs), e.g. PRR, RNP, ETX, four-bit and RSSI, can only provide a partial characterization of links, as they ignore several link properties such as channel quality and stability. In this paper, we propose F-LQE (Fuzzy Link Quality Estimator), a holistic metric that estimates link quality on the basis of four link quality properties (packet delivery, asymmetry, stability, and channel quality) that are expressed and combined using fuzzy logic. We demonstrate through an extensive experimental analysis that F-LQE is more reliable than existing estimators (e.g., PRR, WMEWMA, ETX, RNP, and four-bit), as it provides a finer-grained link classification. It is also more stable, as it has a lower coefficient of variation of link estimates. Importantly, we evaluate the impact of F-LQE on the performance of tree routing, specifically the Collection Tree Protocol (CTP). For this purpose, we adapted F-LQE to build a new routing metric for CTP, which we dubbed F-LQE/RM. Extensive experimental results obtained with widely used state-of-the-art testbeds show that F-LQE/RM significantly improves CTP routing performance over four-bit (the default LQE of CTP) and ETX (another popular LQE). F-LQE/RM improves end-to-end packet delivery by up to 16%, reduces the number of packet retransmissions by up to 32%, reduces the hop count by up to 4%, and improves topology stability by up to 47%.
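The abstract states that the four link properties are combined with fuzzy logic but does not give the membership functions or the combination rule, so the sketch below uses simple piecewise-linear memberships and a min (fuzzy AND) combination purely to illustrate the idea; it is not the paper's actual F-LQE definition.

```python
# Illustrative fuzzy combination of four link properties in the spirit of
# F-LQE. Membership functions and the AND operator (min) are assumptions;
# the paper's actual definitions may differ.

def membership_high(x, low, high):
    """Piecewise-linear degree to which x is 'high' on the [low, high] scale."""
    if x <= low:
        return 0.0
    if x >= high:
        return 1.0
    return (x - low) / (high - low)

def fuzzy_link_quality(delivery, asymmetry, stability, channel_quality):
    """Combine the four properties into a single [0, 1] quality score.
    Asymmetry is a 'bad' property, so its complement is used."""
    mu = [
        membership_high(delivery, 0.2, 0.95),          # packet delivery
        1.0 - membership_high(asymmetry, 0.0, 0.4),    # low asymmetry is good
        membership_high(stability, 0.3, 0.9),          # stability
        membership_high(channel_quality, 0.3, 0.9),    # channel quality (e.g. SNR-based)
    ]
    return min(mu)  # fuzzy AND; other t-norms (product, weighted mean) are possible

print(fuzzy_link_quality(delivery=0.9, asymmetry=0.1, stability=0.8, channel_quality=0.7))
```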
Abstract:
This work aims to shed some light on longshore sediment transport (LST) on the highly energetic northwest coast of Portugal. Data obtained through a sand-tracer experiment are compared with data obtained from the original and the newly re-evaluated longshore sediment transport formulas (the USACE Waterways Experiment Station's Coastal Engineering Research Center, Kamphuis, and Bayram bulk formulas) to assess their performance. The field experiment with dyed sand was held at Ofir Beach during one tidal cycle under medium wave-energy conditions. Local hydrodynamic conditions and beach topography were recorded. The tracer was driven southward in response to the local swell and the wind- and wave-induced currents (Hsb = 0.75 m, Tp = 11.5 s, θb = 8–12°). The LST was estimated using a linear sediment transport flux approach. The value obtained (2.3×10⁻³ m³·s⁻¹) approached the estimate provided by the original Bayram formula (2.5×10⁻³ m³·s⁻¹). The other formulas overestimated the transport, but the estimates resulting from the newly re-evaluated formulas also yield approximate results. Therefore, the results of this work indicate that the Bayram formula may give satisfactory results for predicting longshore sediment transport at Ofir Beach.
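To make the bulk-formula approach concrete, the sketch below evaluates a CERC-type formula from breaking-wave height and angle. The coefficient K ≈ 0.39 (significant-wave-height form), the breaker index and the conversion to a volumetric rate are standard textbook values assumed here; they are not the exact coefficients of the original or re-evaluated formulas tested in the paper.

```python
# Minimal sketch of a CERC-type bulk longshore transport estimate from
# breaking-wave parameters. K = 0.39 (Hs-based), gamma = 0.78, porosity 0.4
# and quartz sand density are standard textbook values, used here as
# assumptions rather than the paper's calibrated coefficients.
import math

def cerc_lst(hs_b, theta_b_deg, rho=1025.0, rho_s=2650.0, n=0.4,
             K=0.39, gamma=0.78, g=9.81):
    """Volumetric longshore sediment transport rate (m^3/s)."""
    e_b = 0.125 * rho * g * hs_b ** 2                      # wave energy density at breaking
    d_b = hs_b / gamma                                     # breaking depth from breaker index
    cg_b = math.sqrt(g * d_b)                              # shallow-water group celerity
    theta = math.radians(theta_b_deg)
    p_l = e_b * cg_b * math.sin(theta) * math.cos(theta)   # longshore component of wave power
    i_l = K * p_l                                          # immersed-weight transport rate
    return i_l / ((rho_s - rho) * g * (1.0 - n))           # convert to a volume rate

# Conditions reported in the field experiment: Hsb = 0.75 m, theta_b ~ 10 deg.
# This comes out around 1.5e-2 m^3/s, above the tracer-based value, which is
# consistent with the overestimation by the original CERC formula noted above.
print(f"{cerc_lst(0.75, 10.0):.2e} m^3/s")
```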
Abstract:
In this work, kriging with covariates is used to model and map the spatial distribution of salinity measurements gathered by an autonomous underwater vehicle in a sea outfall monitoring campaign, with the aim of distinguishing the effluent plume from the receiving waters and characterizing its spatial variability in the vicinity of the discharge. Four different geostatistical linear models for salinity were assumed, in which the distance to the diffuser, the west-east positioning, and the south-north positioning were used as covariates. Sample variograms were fitted with Matérn models using the weighted least squares and maximum likelihood estimation methods, as a way to detect possible discrepancies. Typically, the maximum likelihood method estimated very low ranges, which limited the kriging process. So, at least for these data sets, weighted least squares proved to be the most appropriate estimation method for variogram fitting. The kriged maps clearly show the spatial variation of salinity, and it is possible to identify the effluent plume in the area studied. The results obtained provide some guidelines for sewage monitoring when a geostatistical analysis of the data is intended. It is important to handle anomalous values properly and to adopt a sampling strategy that includes transects parallel and perpendicular to the effluent dispersion.
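As a rough illustration of the modelling step described here, a linear trend on the covariates plus a spatially correlated residual with a Matérn covariance, the sketch below uses scikit-learn's maximum-likelihood Gaussian-process fit on synthetic data; the weighted-least-squares variogram fitting the authors favour would normally be done with a dedicated geostatistics package, and all parameters and data below are placeholders.

```python
# Rough illustration of the modelling approach: a linear trend on the
# covariates (distance to diffuser, easting, northing) plus a spatially
# correlated residual with a Matern covariance, fitted by maximum likelihood.
# All data below are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern, WhiteKernel

rng = np.random.default_rng(1)
coords = rng.uniform(0, 500, size=(150, 2))             # east-north positions (m)
dist_to_diffuser = np.linalg.norm(coords - [250, 250], axis=1)
X_cov = np.column_stack([dist_to_diffuser, coords])      # covariates of the linear model
salinity = 35.8 - 0.004 * dist_to_diffuser + rng.normal(0, 0.05, 150)

trend = LinearRegression().fit(X_cov, salinity)          # external-drift (trend) part
residuals = salinity - trend.predict(X_cov)

gp = GaussianProcessRegressor(
    kernel=Matern(length_scale=100.0, nu=1.5) + WhiteKernel(0.01),
    normalize_y=True,
).fit(coords, residuals)                                 # spatial structure of the residuals

# Prediction at a new location = trend + kriged residual.
new_xy = np.array([[300.0, 260.0]])
new_cov = np.column_stack([np.linalg.norm(new_xy - [250, 250], axis=1), new_xy])
pred = trend.predict(new_cov) + gp.predict(new_xy)
print(pred[0])
```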
Abstract:
In this work, an adaptive modeling and spectral estimation scheme based on dual Discrete Kalman Filtering (DKF) is proposed for speech enhancement. Both the speech and noise signals are modeled by an autoregressive structure, which provides an underlying time-frame dependency and improves time-frequency resolution. The model parameters are arranged to obtain a combined state-space model and are also used to calculate instantaneous power spectral density estimates. The speech enhancement is performed by a dual discrete Kalman filter that simultaneously gives estimates for the models and the signals. This approach is particularly useful as a pre-processing module for parametric speech recognition systems that rely on time-dependent spectral models. The system performance has been evaluated by a set of human listeners and by spectral distances. In both cases the use of this pre-processing module led to improved results.
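A full dual-filter implementation is beyond the scope of an abstract, but the signal half of the scheme can be sketched: clean speech is modelled as an AR(p) process written in companion (state-space) form, and a standard discrete Kalman filter estimates it from the noisy observation. The AR order, the fixed coefficients and the noise variances below are placeholders; in the dual scheme a second filter would re-estimate these parameters frame by frame.

```python
# Sketch of the signal half of a dual Kalman filter for speech enhancement:
# an AR(p) model of the clean signal in companion (state-space) form, filtered
# from a noisy observation. AR coefficients and noise variances are fixed
# placeholders; the dual scheme re-estimates them per frame with a second filter.
import numpy as np

def ar_state_space(a):
    """Companion-form transition matrix for x_t = a1*x_{t-1} + ... + ap*x_{t-p}."""
    p = len(a)
    F = np.zeros((p, p))
    F[0, :] = a
    F[1:, :-1] = np.eye(p - 1)
    return F

def kalman_enhance(y, a, q, r):
    """Filter noisy samples y with an AR model (coefficients a, driving variance q,
    observation noise variance r); returns the estimated clean signal."""
    p = len(a)
    F = ar_state_space(a)
    H = np.zeros((1, p)); H[0, 0] = 1.0
    Q = np.zeros((p, p)); Q[0, 0] = q
    x = np.zeros((p, 1)); P = np.eye(p)
    out = np.empty_like(y)
    for t, yt in enumerate(y):
        x = F @ x                          # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + r                # innovation variance
        K = P @ H.T / S                    # Kalman gain
        x = x + K * (yt - (H @ x).item())  # update with the new observation
        P = (np.eye(p) - K @ H) @ P
        out[t] = x[0, 0]
    return out

rng = np.random.default_rng(2)
a = np.array([1.6, -0.8])                  # placeholder stable AR(2) "speech" model
clean = np.zeros(500)
for t in range(2, 500):
    clean[t] = a[0] * clean[t - 1] + a[1] * clean[t - 2] + rng.normal(0, 0.1)
noisy = clean + rng.normal(0, 0.3, 500)
enhanced = kalman_enhance(noisy, a, q=0.1**2, r=0.3**2)
print(np.std(noisy - clean), np.std(enhanced - clean))  # error reduced after filtering
```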
Abstract:
There has been a growing interest in research on performance measurement and management practices, which seems to reflect researchers' response to calls to increase the relevance of management accounting research. However, despite the development of the new public management literature, studies involving public sector organizations are relatively few compared to those involving business organizations, and extremely limited when it comes to public primary health care organizations. Yet, the economic significance of public health care organizations in the economies of developed countries and the criticisms these organizations regularly face from the public suggest there is a need for research. This is particularly true of research that may lead to improvements in performance measurement and management practices and, ultimately, to improvements in the way health care organizations use their limited resources to provide services to their communities. This study reports on a field study involving three public primary health care organizations. The evidence obtained from interviews and archival data suggests that performance management practices in these institutions lacked consistency and coherence, potentially leading to decreased performance. Hierarchical controls seemed to be very weak and accountability limited, leading to a lack of direction, low motivation and, in some circumstances, insufficient managerial abilities and skills. The performance management systems also revealed a number of weaknesses, which suggests that there are various opportunities for improvement in performance in the organizations studied.
Abstract:
This paper proposes a particle swarm optimization (PSO) approach to support electricity producers in multiperiod optimal contract allocation. The producer's risk preference is stated by a utility function (U) expressing the trade-off between the expectation and variance of the return. The variance estimation and expected return are based on a forecasted scenario interval determined by a price range forecasting model developed by the authors. A certain confidence level is associated with each forecasted scenario interval. The proposed model makes use of contracts with physical (spot and forward) and financial (options) settlement. PSO performance was evaluated by comparing it with a genetic algorithm-based approach. This model can be used by producers in deregulated electricity markets but can easily be adapted to load serving entities and retailers. Moreover, it can easily be adapted to the use of other types of contracts.
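The abstract defines U as a trade-off between the expectation and variance of the return; a common form is U(w) = E[R(w)] − λ·Var[R(w)]. The sketch below maximizes such a utility over contract allocation weights with a bare-bones PSO; the scenario data, the risk-aversion parameter λ and the constraint handling are illustrative assumptions rather than the paper's model.

```python
# Bare-bones PSO maximizing a mean-variance utility U(w) = E[R] - lambda*Var[R]
# over allocation weights for a few contract types (e.g. spot, forward, option).
# The return scenarios, risk aversion and constraint handling are illustrative
# assumptions, not the paper's model.
import numpy as np

rng = np.random.default_rng(3)
scenarios = rng.normal([50, 48, 47], [8, 3, 5], size=(1000, 3))  # return scenarios per contract

def utility(w, lam=0.05):
    w = np.clip(w, 0, None)
    w = w / w.sum() if w.sum() > 0 else np.full_like(w, 1 / len(w))  # weights sum to 1
    returns = scenarios @ w
    return returns.mean() - lam * returns.var()

def pso(fitness, dim, n_particles=30, iters=200, w_inertia=0.7, c1=1.5, c2=1.5):
    pos = rng.uniform(0, 1, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), np.array([fitness(p) for p in pos])
    gbest = pbest[pbest_val.argmax()].copy()
    for _ in range(iters):
        r1, r2 = rng.uniform(size=(2, n_particles, dim))
        vel = w_inertia * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([fitness(p) for p in pos])
        improved = vals > pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmax()].copy()
    return gbest, pbest_val.max()

best_w, best_u = pso(utility, dim=3)
best_w = np.clip(best_w, 0, None); best_w /= best_w.sum()
print("allocation:", best_w.round(3), "utility:", round(best_u, 3))
```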
Abstract:
This paper proposes a swarm intelligence long-term hedging tool to support electricity producers in competitive electricity markets. This tool investigates the long-term hedging opportunities available to electric power producers through the use of contracts with physical (spot and forward) and financial (options) settlement. To find the optimal portfolio, the producer's risk preference is stated by a utility function (U) expressing the trade-off between the expectation and the variance of the return. The variance estimation and the expected return are based on a forecasted scenario interval determined by a long-term price range forecast model, developed by the authors, whose explanation is outside the scope of this paper. The proposed tool makes use of Particle Swarm Optimization (PSO), and its performance has been evaluated by comparing it with a Genetic Algorithm (GA) based approach. To validate the risk management tool, a case study using real historical price data from the mainland Spanish market is presented to demonstrate the effectiveness of the proposed methodology.
Abstract:
This paper addresses the optimal involvement of a power producer in derivatives electricity markets to hedge against pool price volatility. To achieve this aim, a long-term risk management tool based on a swarm intelligence meta-heuristic optimization technique is proposed. This tool investigates the long-term risk-hedging opportunities available to electric power producers through the use of contracts with physical (spot and forward contracts) and financial (options contracts) settlement. The producer's risk preference is formulated as a utility function (U) expressing the trade-off between the expectation and the variance of the return. The variance and expectation of the return are based on a forecasted scenario interval determined by a long-term price range forecasting model. This model also makes use of particle swarm optimization (PSO) to find the parameters that achieve better forecasting results. On the other hand, the price estimation depends on load forecasting. This work therefore also presents a regressive long-term load forecast model that makes use of PSO to find the best parameters, as in the price estimation. The performance of the PSO technique has been evaluated by comparison with a Genetic Algorithm (GA) based approach. A case study is presented and the results are discussed taking into account real price and load historical data from the mainland Spanish electricity market, demonstrating the effectiveness of the methodology in handling this type of problem. Finally, conclusions are duly drawn.
Abstract:
This article analyses performance as an artistic genre that calls for reflection on ritual and on the transposition of everyday acts into the field of art, but above all as a manifestation which implies the inevitable recognition that it is a scenic resource no longer grounded in the word, acting as a determining factor for post-modernist theatre, which constantly rejects the text in favour of so-called post-dramatic theatre.
Abstract:
Over time, the XML markup language has acquired considerable importance in application development, standards definition and the representation of large volumes of data, such as databases. Today, processing XML documents in a short period of time is a critical activity in a wide range of applications, which imposes choosing the most appropriate mechanism to parse XML documents quickly and efficiently. When using a programming language for XML processing, such as Java, it becomes necessary to use effective mechanisms, e.g. APIs, which allow large documents to be read and processed in an appropriate manner. This paper presents a performance study of the main existing Java APIs that deal with XML documents, in order to identify the most suitable one for processing large XML files.
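The study benchmarks Java APIs. As a language-neutral illustration of the trade-off it measures (tree-building parsers such as DOM versus streaming parsers such as SAX/StAX), the sketch below times Python's built-in minidom and sax modules on a generated file; it illustrates the benchmarking methodology only and is not the set of Java APIs compared in the paper.

```python
# Illustration of the tree-vs-streaming parsing trade-off the study measures,
# using Python's built-in xml.dom.minidom (tree) and xml.sax (streaming).
# This is an analogue of the benchmarking methodology only; the paper
# compares Java APIs (e.g. DOM, SAX, StAX).
import time
import xml.sax
from xml.dom import minidom

path = "large.xml"
with open(path, "w") as f:                       # generate a reasonably large test document
    f.write("<items>")
    f.writelines(f"<item id='{i}'>value {i}</item>" for i in range(100_000))
    f.write("</items>")

t0 = time.perf_counter()
dom = minidom.parse(path)                        # builds the whole tree in memory
n_dom = len(dom.getElementsByTagName("item"))
t_dom = time.perf_counter() - t0

class Counter(xml.sax.ContentHandler):           # streaming: roughly constant memory
    def __init__(self):
        super().__init__()
        self.n = 0
    def startElement(self, name, attrs):
        if name == "item":
            self.n += 1

t0 = time.perf_counter()
handler = Counter()
xml.sax.parse(path, handler)
t_sax = time.perf_counter() - t0

print(f"DOM: {n_dom} items in {t_dom:.2f}s")
print(f"SAX: {handler.n} items in {t_sax:.2f}s")
```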