23 results for Analysis Model
in Repositório Científico do Instituto Politécnico de Lisboa - Portugal
Abstract:
The Cultural Property Risk Analysis Model was applied in 2006 to a Portuguese archive located in Lisbon. Its results highlighted the need for the institution to address risks related to fire, physical forces, and relative humidity. Five years after this first analysis, the results are revisited and a few changes are introduced in light of recent events: fire and high humidity remain important hazards but are now accompanied by a pressing contaminants problem. Improvements in storage systems were responsible for a large decrease in calculated risk magnitude and proved to be very cost-effective.
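For readers unfamiliar with the model, the risk magnitude it computes is commonly described as a product of a few scored components. A minimal sketch, assuming the FS × LV × P × E form often cited for this model (the exact scoring scales used in the 2006 assessment are not given here):

```python
def risk_magnitude(fs, lv, p, e):
    """CPRAM-style magnitude of risk (assumed form): the product of
    Fraction Susceptible, Loss in Value, Probability and Extent,
    each expressed as a fraction between 0 and 1."""
    return fs * lv * p * e

# Illustrative only: a rare fire affecting most of the collection can
# outweigh a continual humidity problem touching a small fraction of it.
print(risk_magnitude(0.9, 1.0, 0.01, 1.0))   # fire-like risk profile
print(risk_magnitude(0.2, 0.3, 1.0, 0.05))   # humidity-like risk profile
```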
Abstract:
Master's in Socio-Organizational Intervention in Health - Specialization area: Health Services Management and Administration Policies.
Abstract:
Cluster analysis for categorical data has been an active area of research. A well-known problem in this area is the determination of the number of clusters, which is unknown and must be inferred from the data. To estimate the number of clusters, one often resorts to information criteria such as BIC (Bayesian information criterion), MML (minimum message length, proposed by Wallace and Boulton, 1968), and ICL (integrated classification likelihood). In this work, we adopt the approach developed by Figueiredo and Jain (2002) for clustering continuous data. They use an MML criterion to select the number of clusters and a variant of the EM algorithm to estimate the model parameters; this EM variant seamlessly integrates model estimation and selection in a single algorithm. For clustering categorical data, we assume a finite mixture of multinomial distributions and implement a new EM algorithm, following a previous version (Silvestre et al., 2008). Results obtained with synthetic datasets are encouraging. The main advantage of the proposed approach over the criteria referred to above is its speed of execution, which is especially relevant when dealing with large datasets.
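As a rough illustration of the integrated estimation-and-selection idea, here is a minimal sketch of an EM loop for a multinomial mixture with a Figueiredo-Jain style penalized weight update that annihilates superfluous components. Variable names, defaults, and initialization are ours, not the paper's:

```python
import numpy as np

def mml_em_multinomial(X, k_max=10, min_weight=1e-3, n_iter=200, seed=0):
    """Sketch: EM for a finite mixture of multinomials with an MML-penalized
    weight update that drives weak components to zero (X: n-by-d count matrix)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    theta = rng.dirichlet(np.ones(d), size=k_max)  # per-component category probs
    w = np.full(k_max, 1.0 / k_max)                # mixing weights
    for _ in range(n_iter):
        # E-step: responsibilities, computed in the log domain for stability
        log_p = X @ np.log(theta.T + 1e-12) + np.log(w + 1e-12)
        log_p -= log_p.max(axis=1, keepdims=True)
        r = np.exp(log_p)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: subtract half the free parameters per component
        # ((d - 1) multinomial probabilities) before renormalizing
        w = np.maximum(r.sum(axis=0) - (d - 1) / 2.0, 0.0)
        w /= w.sum()
        alive = w > min_weight                     # annihilate dead components
        w, theta, r = w[alive] / w[alive].sum(), theta[alive], r[:, alive]
        theta = r.T @ X + 1e-12
        theta /= theta.sum(axis=1, keepdims=True)
    return w, theta                                # len(w) = selected clusters
```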
Abstract:
Dissertation presented to the Escola Superior de Comunicação Social in partial fulfillment of the requirements for the master's degree in Advertising and Marketing.
Abstract:
In recent years, the electricity industry has undergone a restructuring process. Among the aims of this process was increased competition, especially in the generation activity, where firms would have an incentive to become more efficient. However, the competitive behavior of generating firms might jeopardize the expected benefits of electricity industry liberalization. The present paper proposes a conjectural variations model to study the competitive behavior of generating firms acting in liberalized electricity markets. The model computes a parameter that represents the degree of competition of each generating firm in each trading period. In this regard, the proposed model provides a powerful methodology for regulatory and competition authorities to monitor the competitive behavior of generating firms. As an application of the model, a study of the day-ahead Iberian electricity market (MIBEL) was conducted to analyze the impact of the integration of the Portuguese and Spanish electricity markets on the behavior of generating firms, taking into account the hourly results for June and July 2007. The advantages of the proposed methodology over other methodologies used to address market power, namely the Residual Supply Index and the Lerner Index, are highlighted.
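For context, the two benchmark indices mentioned at the end have simple closed forms. A sketch using their standard textbook definitions (not taken from the paper):

```python
def lerner_index(price, marginal_cost):
    """Lerner index: relative price-cost markup; 0 under perfect competition."""
    return (price - marginal_cost) / price

def residual_supply_index(total_capacity, firm_capacity, demand):
    """RSI: how well the rest of the market can cover demand without the firm;
    values below 1 signal that the firm is pivotal in that trading period."""
    return (total_capacity - firm_capacity) / demand
```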
Abstract:
The Wyner-Ziv video coding (WZVC) rate-distortion performance is highly dependent on the quality of the side information, an estimate of the original frame created at the decoder. This paper characterizes WZVC efficiency when motion-compensated frame interpolation (MCFI) techniques are used to generate the side information, a difficult problem in WZVC, especially because the decoder only has some decoded reference frames available. The proposed WZVC compression-efficiency rate model relates the power spectral density of the estimation error to the accuracy of the MCFI motion field. From it, some interesting conclusions may be derived regarding the impact of motion field smoothness, and of its correlation to the true motion trajectories, on compression performance.
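To make the side-information problem concrete, below is a deliberately naive block-based sketch: it matches blocks between two decoded reference frames and blends each matched pair. It is a toy stand-in for the MCFI techniques the paper models, not the paper's method (real MCFI projects motion halfway and handles holes and overlaps):

```python
import numpy as np

def mcfi_side_information(f0, f2, block=8, search=4):
    """Naive MCFI sketch: for each block of reference frame f0, find the best
    SAD match in reference frame f2 and estimate the intermediate frame as
    the average of the matched pair."""
    f0 = f0.astype(np.float64)
    f2 = f2.astype(np.float64)
    h, w = f0.shape
    f1 = 0.5 * (f0 + f2)                      # fallback for uncovered borders
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            ref = f0[by:by + block, bx:bx + block]
            best_sad, best = np.inf, ref
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if 0 <= y <= h - block and 0 <= x <= w - block:
                        cand = f2[y:y + block, x:x + block]
                        sad = np.abs(ref - cand).sum()
                        if sad < best_sad:
                            best_sad, best = sad, cand
            # blend the block with its motion-compensated match from f2
            f1[by:by + block, bx:bx + block] = 0.5 * (ref + best)
    return f1
```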
Abstract:
A previously developed model is used to numerically simulate real clinical cases of the surgical correction of scoliosis. This model consists of one-dimensional finite elements with spatial deformation in which (i) the column is represented by its axis; (ii) the vertebrae are assumed to be rigid; and (iii) the deformability of the column is concentrated in springs that connect the successive rigid elements. The metallic rods used for the surgical correction are modeled by beam elements with linear elastic behavior. To obtain the forces at the connections between the metallic rods and the vertebrae, geometrically non-linear finite element analyses are performed. The tightening sequence determines the magnitude of the forces applied to the patient's column, and it is desirable to keep those forces as small as possible. In this study, a Genetic Algorithm optimization is applied to this model in order to determine the sequence that minimizes the corrective forces applied during the surgery. This amounts to finding the optimal permutation of the integers 1, ..., n, with n being the number of vertebrae involved; as such, we are faced with a combinatorial optimization problem isomorphic to the Traveling Salesman Problem. The fitness evaluation requires one computationally intensive Finite Element Analysis per candidate solution and, thus, a parallel implementation of the Genetic Algorithm is developed.
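A minimal serial sketch of such a permutation GA follows, with a dummy stand-in for the per-candidate finite element run; all names, operators, and GA settings are illustrative, not the paper's:

```python
import random

def fea_corrective_forces(sequence):
    """Stand-in for one geometrically non-linear FEA run, which in the real
    model returns the peak corrective force for a tightening order."""
    return sum(abs(a - b) for a, b in zip(sequence, sequence[1:]))  # dummy cost

def order_crossover(p1, p2):
    """OX crossover: keep a slice of p1, fill the rest in p2's order."""
    a, b = sorted(random.sample(range(len(p1)), 2))
    child = [None] * len(p1)
    child[a:b] = p1[a:b]
    fill = [g for g in p2 if g not in child]
    for i in range(len(child)):
        if child[i] is None:
            child[i] = fill.pop(0)
    return child

def ga_tightening_sequence(n_vertebrae=10, pop=40, gens=100, p_mut=0.2):
    """Serial GA over permutations; the paper parallelizes the fitness calls."""
    popl = [random.sample(range(n_vertebrae), n_vertebrae) for _ in range(pop)]
    for _ in range(gens):
        popl.sort(key=fea_corrective_forces)          # lower force is better
        elite = popl[:pop // 2]
        children = [order_crossover(*random.sample(elite, 2))
                    for _ in range(pop - len(elite))]
        for c in children:                            # swap mutation
            if random.random() < p_mut:
                i, j = random.sample(range(n_vertebrae), 2)
                c[i], c[j] = c[j], c[i]
        popl = elite + children
    return min(popl, key=fea_corrective_forces)
```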
Abstract:
Interest rate risk is one of the major financial risks faced by banks due to the very nature of the banking business. The most common approach in the literature has been to estimate the impact of interest rate risk on banks using a simple linear regression model. However, the relationship between interest rate changes and bank stock returns need not be exclusively linear. This article provides a comprehensive analysis of the interest rate exposure of the Spanish banking industry, employing both parametric and nonparametric estimation methods. Its main contribution is to use, for the first time in the context of banks' interest rate risk, a nonparametric regression technique that avoids the assumption of a specific functional form. On the one hand, it is found that the Spanish banking sector exhibits a remarkable degree of interest rate exposure, although the impact of interest rate changes on bank stock returns has declined significantly following the introduction of the euro; further, a pattern of positive exposure emerges during the post-euro period. On the other hand, the results of the nonparametric model support expanding the conventional linear model to gain greater insight into the actual degree of exposure.
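The nonparametric alternative can be as simple as a kernel regression of bank stock returns on interest rate changes. A minimal Nadaraya-Watson sketch (the paper's exact estimator and bandwidth choice are not specified here):

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_eval, bandwidth=0.5):
    """Gaussian-kernel regression: a locally weighted mean of y that imposes
    no functional form on the return/interest-rate relationship."""
    u = (x_eval[:, None] - x_train[None, :]) / bandwidth
    weights = np.exp(-0.5 * u ** 2)
    return (weights @ y_train) / weights.sum(axis=1)
```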
Abstract:
We are concerned with providing more empirical evidence on forecast failure, developing forecast models, and examining the impact of events such as audit reports. A joint consideration of classic financial ratios and relevant external indicators leads us to build a basic prediction model focused on non-financial Galician SMEs. The explanatory variables are financial indicators that are relevant from the viewpoint of financial logic and financial failure theory. The paper explores three mathematical models: discriminant analysis, Logit, and linear multivariate regression. We conclude that, even though both offer high explanatory and predictive ability, the Logit and MDA models should be used and interpreted jointly.
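A sketch of one way to read Logit and MDA jointly, on synthetic stand-in data (the actual Galician SME variables and agreement rule are not reproduced here):

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in: X holds financial ratios, y = 1 flags a failed firm.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
beta = np.array([1.2, -0.8, 0.5, 0.0, 0.3])
y = (X @ beta + rng.normal(scale=0.5, size=200) > 0).astype(int)

logit = LogisticRegression().fit(X, y)
mda = LinearDiscriminantAnalysis().fit(X, y)

# One joint reading: trust a failure flag only when both models agree,
# and inspect the disagreements by hand.
both_agree = logit.predict(X) == mda.predict(X)
print(f"models agree on {both_agree.mean():.0%} of firms")
```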
Abstract:
The aim of this paper is to analyze the forecasting ability of the CARR model proposed by Chou (2005) using the S&P 500. We extend the data sample, allowing for the analysis of different stock market circumstances, and propose the use of various range estimators in order to compare their forecasting performance. Our results show that there are two range-based models that outperform the forecasting ability of the GARCH model: the Parkinson model is better for upward trends and for volatilities above or below the mean, while the CARR model is better for downward trends and volatilities around the mean.
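For reference, the Parkinson estimator underlying one of the competing range-based models has a simple closed form:

```python
import numpy as np

def parkinson_volatility(high, low):
    """Parkinson (1980) range estimator: a variance proxy built from the
    per-period high/low range; returns volatility averaged over the sample."""
    r = np.log(np.asarray(high) / np.asarray(low))
    return np.sqrt(np.mean(r ** 2) / (4.0 * np.log(2.0)))
```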
Abstract:
In this paper, a stochastic programming approach is proposed for trading wind energy in a market environment under uncertainty. Uncertainty in energy market prices is the main cause of the high volatility of profits achieved by power producers; the volatile and intermittent nature of wind energy represents another source of uncertainty. Hence, each uncertain parameter is modeled by scenarios, where each scenario represents a plausible realization of the uncertain parameters with an associated occurrence probability. An appropriate risk measure is also considered. The proposed approach is applied to a realistic case study based on a wind farm in Portugal. Finally, conclusions are duly drawn.
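In scenario form, a risk measure such as CVaR is commonly traded off against expected profit in this class of models. A minimal evaluation sketch under that assumption (the weighting factor and function names are ours):

```python
import numpy as np

def cvar(profits, probs, alpha=0.95):
    """Conditional Value-at-Risk of profit: the probability-weighted mean
    of the (1 - alpha) worst-profit scenarios."""
    profits, probs = np.asarray(profits), np.asarray(probs)
    order = np.argsort(profits)               # worst scenarios first
    p, x = probs[order], profits[order]
    tail = np.cumsum(p) <= (1.0 - alpha) + 1e-12
    tail[0] = True                            # keep at least one scenario
    return np.average(x[tail], weights=p[tail])

def risk_adjusted_objective(profits, probs, beta=0.5):
    """Expected profit plus a beta-weighted CVaR term; beta > 0 is risk-averse."""
    profits, probs = np.asarray(profits), np.asarray(probs)
    return float(probs @ profits) + beta * cvar(profits, probs)
```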
Abstract:
We classify all possible implementations of an Abelian symmetry in the two-Higgs-doublet model with fermions. We identify those symmetries which are consistent with nonvanishing quark masses and a Cabibbo-Kobayashi-Maskawa (CKM) quark-mixing matrix which is not block-diagonal. Our analysis takes us from a plethora of possibilities down to 246 relevant cases, requiring only 34 distinct matrix forms. We show that applying Z_n with n ≥ 4 to the scalar sector leads to a continuous U(1) symmetry in the whole Lagrangian. Finally, we address the possibilities of spontaneous CP violation and of natural suppression of flavor-changing neutral currents. We explain why our work is relevant even for non-Abelian symmetries.
Abstract:
The analysis of the Higgs boson data by the ATLAS and CMS Collaborations appears to exhibit an excess of h → γγ events above the Standard Model (SM) expectations, whereas no significant excess is observed in h → ZZ* → four-lepton events, albeit with large statistical uncertainty due to the small data sample. These results (assuming they persist with further data) could be explained by a pair of nearly mass-degenerate scalars, one of which is an SM-like Higgs boson and the other a scalar with suppressed couplings to W⁺W⁻ and ZZ. In the two-Higgs-doublet model, the observed γγ and ZZ* → four-lepton data can be reproduced by an approximately degenerate CP-even (h) and CP-odd (A) Higgs boson for values of sin(β − α) near unity and 0.70 ≲ tan β ≲ 1. An enhanced γγ signal can also arise in cases where m_h ≃ m_H, m_H ≃ m_A, or m_h ≃ m_H ≃ m_A. Since the ZZ* → four-lepton signal derives primarily from an SM-like Higgs boson whereas the γγ signal receives contributions from two (or more) nearly mass-degenerate states, one would expect slightly different invariant mass peaks in the ZZ* → four-lepton and γγ channels. The phenomenological consequences of such models can be tested with the additional Higgs data that will be collected at the LHC in the near future. DOI: 10.1103/PhysRevD.87.055009.
Abstract:
One of the most effective ways of controlling vibrations in plate or beam structures is by means of constrained viscoelastic damping treatments. Contrary to the unconstrained configuration, the design of constrained and integrated layer damping treatments is multifaceted because the thickness of the viscoelastic layer acts distinctly on the two main counterparts of the strain energy: the volume of viscoelastic material and the shear strain field. In this work, five parametric studies are performed exploring the effect that the design parameters, namely the thickness/length ratio, constraining layer thickness, material properties, natural mode and boundary conditions, have on these two counterparts and, subsequently, on the treatment efficiency. The results reveal an interesting effect when dealing with very thin viscoelastic layers that contradicts the standard relation between treatment efficiency and layer thickness; hence, the potential optimisation of constrained and integrated viscoelastic treatments through the use of properly designed thin multilayer configurations is justified. This work presents a dimensionless analysis and provides useful general guidelines for the efficient design of constrained and integrated damping treatments based on single- or multi-layer configurations.
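The two "counterparts" can be made concrete with the classical modal strain energy estimate, in which the treatment's damping scales with the viscoelastic layer's share of the total strain energy. This is a standard approximation offered for intuition, not the paper's full model:

```python
def modal_loss_factor(eta_material, strain_energy_visco, strain_energy_total):
    """Modal Strain Energy estimate: modal loss factor = material loss factor
    times the fraction of modal strain energy stored in the viscoelastic layer.
    Thinning the layer shrinks its volume but can raise its shear strain,
    which is the trade-off the parametric studies explore."""
    return eta_material * (strain_energy_visco / strain_energy_total)
```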
Abstract:
In a heterogeneous cellular networks environment, users' behaviour and network deployment configuration parameters have an impact on the overall Quality of Service. This paper proposes a new and simple model that, on the one hand, explores the impact of users' behaviour on the network by taking mobility, multi-service usage, and traffic generation profiles as inputs, and, on the other, enables evaluation of the impact of the network setup configuration on Joint Radio Resource Management (JRRM), assessing some basic JRRM performance indicators, such as Vertical Handover (VHO) probabilities, average bit rates, and the number of active users, among others. VHO plays an important role in fulfilling seamless transfer of users' sessions when mobile terminals cross the boundaries of different Radio Access Technologies (RATs). Results show that high-bit-rate RATs receive and exert more influence from/on other RATs, producing additional signalling traffic towards the JRRM entity. Results also show that the VHO probability can range from 5% up to 65%, depending on the RAT cluster radius and the users' mobility profile.