Abstract:
A methodology based on data mining techniques to support the analysis of zonal prices in real transmission networks is proposed in this paper. The methodology uses clustering algorithms to group the buses into typical classes, each containing a set of buses with similar Locational Marginal Price (LMP) values. Two different clustering algorithms have been used to determine the LMP clusters: the two-step and K-means algorithms. In order to evaluate the quality of the partition and to identify the best-performing algorithm, adequacy measurement indices are used. The paper includes a case study using an LMP database from the California ISO (CAISO) in order to identify zonal prices.
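A minimal sketch of the zonal-grouping idea: a one-dimensional K-means over made-up average LMP values. The bus prices and the quantile-based initialization are assumptions for illustration, not CAISO data or the paper's exact algorithm.

```python
import numpy as np

def kmeans_1d(values, k, iters=20):
    """Minimal K-means for one-dimensional price data (quantile-initialized)."""
    x = np.asarray(values, dtype=float)
    centers = np.quantile(x, np.linspace(0.0, 1.0, k))  # deterministic init
    for _ in range(iters):
        # assign each bus to its nearest cluster centre
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        # move each centre to the mean of its assigned buses
        centers = np.array([x[labels == j].mean() if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels, centers

# Illustrative average LMPs ($/MWh) for eight buses -- not real CAISO data.
lmp = [31.2, 30.8, 31.0, 45.5, 46.1, 45.9, 60.3, 59.8]
labels, centers = kmeans_1d(lmp, k=3)  # buses sharing a label form a price zone
```

Buses whose LMPs cluster together end up with the same label, which is the zonal-price grouping the abstract describes.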
Abstract:
In this work, we investigated the structural, morphological, electrical, and optical properties of a set of Cu2ZnSnS4 thin films grown by sulfurization of metallic precursors deposited on soda-lime glass substrates coated with or without molybdenum. X-ray diffraction and Raman spectroscopy measurements revealed the formation of single-phase Cu2ZnSnS4 thin films. Good crystallinity and grain compactness of the films were found by scanning electron microscopy. The grown films are copper-poor and zinc-rich, a composition close to that of the Cu2ZnSnS4 solar cells with the best reported efficiency. Electrical conductivity and Hall effect measurements showed a high doping level and strong compensation. The temperature dependence of the free hole concentration showed that the films are nondegenerate. Photoluminescence spectroscopy showed an asymmetric broadband emission. The experimental behaviour with increasing excitation power or temperature cannot be explained by donor-acceptor pair transitions. A model of radiative recombination of an electron with a hole bound to an acceptor level, broadened by potential fluctuations of the valence-band edge, was proposed. An ionization energy in the range 29–40 meV was estimated for the acceptor level, and a value of 172 ± 2 meV was obtained for the potential fluctuations of the valence-band edge.
Abstract:
Master's in Accounting and Management of Financial Institutions
Abstract:
There is no single definition of a long-memory process. Such a process is usually defined as a series whose correlogram decays slowly or whose spectrum is infinite at zero frequency. It is also said that a series with this property is characterized by long-range dependence and by non-periodic long cycles, or that this characteristic describes the correlation structure of a series at long lags, or that it is conventionally expressed in terms of the power-law decay of the autocovariance function. The growing interest of international research in this topic is justified by the search for a better understanding of the dynamic nature of the time series of financial asset prices. First, the lack of consistency among existing results calls for new studies and the use of several complementary methodologies. Second, the confirmation of long-memory processes has relevant implications for (1) theoretical and econometric modelling (i.e., martingale price models and technical trading rules), (2) statistical tests of equilibrium and pricing models, (3) optimal consumption/saving and portfolio decisions, and (4) the measurement of efficiency and rationality. Third, empirical scientific questions remain about the identification of the most suitable general theoretical market model for modelling the diffusion of the series. Fourth, regulators and risk managers need to know whether there are persistent, and therefore inefficient, markets that may thus produce abnormal returns. The research objective of this dissertation is twofold. On the one hand, it aims to provide additional knowledge to the long-memory debate by examining the behaviour of the daily return series of the main EURONEXT stock indices.
On the other hand, it aims to contribute to the improvement of the capital asset pricing model (CAPM) by considering an alternative risk measure capable of overcoming the constraints of the efficient market hypothesis (EMH) in the presence of financial series whose processes lack independent and identically distributed (i.i.d.) increments. The empirical study indicates the possibility of using long-maturity treasury bonds (OTs) as an alternative in the calculation of market returns, given that their behaviour in sovereign debt markets reflects investors' confidence in the financial condition of the States and measures how investors assess the respective economies based on the performance of their assets in general. Although the price diffusion model defined by geometric Brownian motion (gBm) is claimed to provide a good fit to financial time series, its assumptions of normality, stationarity and independence of the residual innovations are contradicted by the empirical data analysed. Therefore, in the search for evidence of the long-memory property in the markets, rescaled-range analysis (R/S) and detrended fluctuation analysis (DFA) are used, under the fractional Brownian motion (fBm) approach, to estimate the Hurst exponent H for the complete data series and to compute the "local" Hurst exponent H_t in moving windows. In addition, statistical hypothesis tests are performed using the rescaled-range test (R/S), the modified rescaled-range test (M-R/S) and the fractional differencing test (GPH). In terms of a single conclusion from all methods about the nature of dependence for the stock market in general, the empirical results are inconclusive. This means that the degree of long memory, and thus any classification, depends on each particular market.
Nevertheless, the mostly positive overall results support the presence of long memory, in the form of persistence, in the stock returns of Belgium, the Netherlands and Portugal. This suggests that these markets are more subject to greater predictability (the "Joseph effect"), but also to trends that may be unexpectedly interrupted by discontinuities (the "Noah effect"), and therefore tend to be riskier to trade. Although the evidence of fractal dynamics has weak statistical support, in line with most international studies, it refutes the random walk hypothesis with i.i.d. increments, which is the basis of the EMH in its weak form. Accordingly, contributions to the improvement of the CAPM are proposed, through a new fractal capital market line (FCML) and a new fractal security market line (FSML). The new proposal suggests that the risk element (for the market and for an asset) be given by the Hurst exponent H for long lags of stock returns. The exponent H measures the degree of long memory in stock indices, both when the return series follow an uncorrelated i.i.d. process, described by gBm (where H = 0.5, confirming the EMH and making the CAPM adequate), and when they follow a process with statistical dependence, described by fBm (where H differs from 0.5, rejecting the EMH and making the CAPM inadequate). The advantage of the FCML and the FSML is that the long-memory measure, defined by H, is the appropriate reference for expressing risk in models that can be applied to data series following i.i.d. processes as well as processes with nonlinear dependence. These formulations therefore include the EMH as a possible particular case.
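The rescaled-range estimate of the Hurst exponent H can be sketched as follows. This is a simplified R/S implementation on synthetic i.i.d. data; the doubling window sizes and the absence of the small-sample (Anis-Lloyd) correction are simplifying assumptions, not the dissertation's exact procedure.

```python
import numpy as np

def hurst_rs(returns, min_window=8):
    """Estimate the Hurst exponent H by rescaled-range (R/S) analysis.

    For each window size n, the series is split into blocks; in each block
    the range of the cumulative mean-adjusted sum is divided by the block's
    standard deviation. H is the slope of log(mean R/S) against log(n).
    """
    x = np.asarray(returns, dtype=float)
    N = len(x)
    sizes, rs_means = [], []
    n = min_window
    while n <= N // 2:
        rs_vals = []
        for start in range(0, N - n + 1, n):
            block = x[start:start + n]
            z = np.cumsum(block - block.mean())   # cumulative deviations
            r = z.max() - z.min()                 # range of the profile
            s = block.std(ddof=0)                 # block standard deviation
            if s > 0:
                rs_vals.append(r / s)
        if rs_vals:
            sizes.append(n)
            rs_means.append(np.mean(rs_vals))
        n *= 2                                    # double the window size
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_means), 1)
    return slope

rng = np.random.default_rng(0)
h = hurst_rs(rng.standard_normal(4096))  # i.i.d. noise: H should be near 0.5
```

Values of H persistently above 0.5 would indicate the persistence ("Joseph effect") discussed above; small-sample bias pushes the raw R/S estimate slightly above 0.5 even for i.i.d. data, which is why refined tests (M-R/S, GPH) are also used.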
Abstract:
Variations of manufacturing process parameters and environmental aspects may affect the quality and performance of composite materials, which consequently affects their structural behaviour. Reliability-based design optimisation (RBDO) and robust design optimisation (RDO) search for safe structural systems with minimal variability of response when subjected to uncertainties in the material design parameters. An approach that simultaneously considers reliability and robustness is proposed in this paper. Depending on a given reliability index imposed on composite structures, a trade-off is established between the performance targets and robustness. Robustness is expressed in terms of the coefficient of variation of the constrained structural response weighted by its nominal value. The normed Pareto front is built, and the point nearest to the origin is taken as the best solution of the bi-objective optimisation problem.
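The selection of the best trade-off on the normed Pareto front can be sketched as follows. The objective values are hypothetical placeholders, not the paper's structural responses; the sketch only shows the normalize-then-take-nearest-to-origin step.

```python
import numpy as np

# Hypothetical bi-objective Pareto points: column 0 = performance objective,
# column 1 = robustness objective (both to be minimized). Invented values.
front = np.array([[0.10, 0.90],
                  [0.25, 0.55],
                  [0.40, 0.35],
                  [0.70, 0.20],
                  [0.95, 0.12]])

# Normalize each objective to [0, 1] so neither dominates the distance metric.
normed = (front - front.min(axis=0)) / (front.max(axis=0) - front.min(axis=0))

# Best trade-off: the point of the normed front nearest to the origin.
best = front[np.argmin(np.linalg.norm(normed, axis=1))]
```

Without the normalization step, the objective with the larger numeric range would dominate the Euclidean distance, which is why the front is normed first.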
Abstract:
Constrained and unconstrained nonlinear optimization problems often appear in many engineering areas. In some of these cases it is not possible to use derivative-based optimization methods, because the objective function is unknown, too complex, or non-smooth. In these cases Direct Search Methods may be the most suitable optimization methods. An Application Programming Interface (API) including some of these methods was implemented using Java technology. This API can be accessed either by applications running on the computer where it is installed or remotely, through a LAN or the Internet, using web services. From the engineering point of view, the information needed from the API is the solution to the provided problem. From the point of view of optimization researchers, however, the solution alone is not enough: additional information about the iterative process is also useful, such as the number of iterations, the value of the solution at each iteration, and the stopping criteria. This paper presents the features added to the API to give users access to the iterative process data.
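As an illustration of the kind of per-iteration data described above, here is a minimal derivative-free compass (coordinate) search that records the point and objective value at each iteration and stops when the mesh size falls below a tolerance. This is a generic direct search sketch in Python, not the API's actual Java implementation.

```python
def compass_search(f, x0, step=1.0, tol=1e-6, max_iter=1000):
    """Derivative-free compass search with an iteration trace."""
    x = list(x0)
    fx = f(x)
    trace = []                       # (iteration, point, f(point))
    for it in range(max_iter):
        trace.append((it, list(x), fx))
        improved = False
        for i in range(len(x)):      # poll +/- step along each coordinate
            for d in (step, -step):
                trial = list(x)
                trial[i] += d
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
        if not improved:
            step /= 2.0              # shrink the mesh
            if step < tol:           # stopping criterion
                break
    return x, fx, trace

# Usage: minimize a smooth test function without using derivatives.
xmin, fmin, trace = compass_search(lambda v: (v[0] - 3) ** 2 + (v[1] + 1) ** 2,
                                   [0.0, 0.0])
```

The `trace` list plays the role of the iterative-process data the API exposes: iteration count, the iterate, and its objective value, with the mesh-size test acting as the stopping criterion.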
Abstract:
The problem of selecting suppliers/partners is a crucial part of the decision-making process for companies that intend to perform competitively in their area of activity. The selection of a supplier/partner is a time- and resource-consuming task that involves data collection and a careful analysis of the factors that can positively or negatively influence the choice. Nevertheless, it is a critical process that significantly affects the operational performance of each company. In this work, five broad selection criteria were identified: Quality, Financial, Synergies, Cost, and Production System. Within these criteria, five sub-criteria were also included. After identifying the criteria, a survey was prepared and companies were contacted in order to understand which factors carry more weight in their decisions when choosing partners. After interpreting the results and processing the data, a linear weighting model was adopted to reflect the importance of each factor. The model has a hierarchical structure and can be applied with the Analytic Hierarchy Process (AHP) method or with Value Analysis. The goal of the paper is to provide a selection reference model that can serve as a guideline for decision making in the supplier/partner selection process.
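The linear weighting step can be sketched as follows. The criteria names come from the abstract, but the weights and supplier grades are invented for illustration; the paper's survey-derived weights are not reproduced here.

```python
# Five criteria from the paper; weights are hypothetical (normalized to sum to 1).
criteria = ["Quality", "Financial", "Synergies", "Cost", "Production System"]
weights = [0.30, 0.15, 0.10, 0.25, 0.20]

# Hypothetical grades of two candidate suppliers on each criterion (1-5 scale).
suppliers = {
    "Supplier A": [4, 3, 2, 5, 3],
    "Supplier B": [5, 4, 3, 3, 4],
}

# Linear weighting: each supplier's score is the weighted sum of its grades.
scores = {name: sum(w * g for w, g in zip(weights, grades))
          for name, grades in suppliers.items()}
best = max(scores, key=scores.get)
```

In the full AHP variant, the weights themselves would be derived from pairwise comparison matrices rather than assigned directly, but the final aggregation is the same weighted sum.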
Abstract:
Cloud SLAs compensate customers with credits when average availability drops below certain levels. This is too inflexible, because consumers lose non-measurable amounts of performance and are only compensated later, in subsequent charging cycles. We propose to schedule virtual machines (VMs) driven by range-based non-linear reductions of utility, different for each class of users and across different ranges of resource allocations: partial utility. This customer-defined metric allows providers to transfer resources between VMs in meaningful and economically efficient ways. We define a comprehensive cost model incorporating the partial utility given by clients to a certain level of degradation when VMs are allocated in overcommitted environments (Public, Private, Community Clouds). CloudSim was extended to support our scheduling model. Several simulation scenarios with synthetic and real workloads are presented, using datacenters of different dimensions regarding the number of servers and computational capacity. We show that partial utility-driven scheduling allows more VMs to be allocated. It brings benefits to providers regarding revenue and resource utilization, allowing more revenue per resource allocated and scaling well with the size of the datacenter when compared with a utility-oblivious redistribution of resources. Regarding clients, their workloads' execution time is also improved by incorporating an SLA-based redistribution of their VMs' computational power.
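A sketch of how a class-dependent, range-based partial-utility function could map degraded allocations to retained revenue. The user classes, allocation breakpoints, and utility values below are invented for illustration; they are not the paper's cost model.

```python
# Hypothetical partial-utility curves: fraction of the agreed price a client of
# a given class still yields when its VM keeps only `alloc` (0..1) of the
# contracted resources. "gold" tolerates little degradation; "bronze" more.
PARTIAL_UTILITY = {
    # (lower bound of allocation ratio, utility retained in that range)
    "gold":   [(0.9, 1.0), (0.7, 0.5), (0.0, 0.0)],
    "bronze": [(0.9, 1.0), (0.5, 0.8), (0.2, 0.4), (0.0, 0.0)],
}

def utility(user_class, alloc):
    """Return the partial utility for the first range containing `alloc`."""
    for lower, u in PARTIAL_UTILITY[user_class]:
        if alloc >= lower:
            return u
    return 0.0

# Revenue for a degraded VM = agreed price x partial utility of the allocation.
price = 10.0
revenue_gold = price * utility("gold", 0.75)
revenue_bronze = price * utility("bronze", 0.75)
```

Under overcommitment, a scheduler can compare these per-class utilities to decide which VMs to degrade first, taking resources where the least utility (and revenue) is lost.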
Abstract:
A comparative study of the robustness of a novel Fixed Point Transformations/Singular Value Decomposition (FPT/SVD)-based adaptive controller and the Slotine-Li (S&L) approach is given by numerical simulations using a three-degree-of-freedom paradigm of typical classical mechanical systems, the cart + double pendulum. The effects of imprecision in the available dynamical model, the presence of dynamic friction at the axles of the drives, and the existence of external disturbance forces unknown to and not modeled by the controller are considered. While the Slotine-Li approach tries to identify the parameters of the formally precise analytical model available for the controlled system, with the implicit assumption that the generalized forces are precisely known, the novel approach makes do with a very rough affine form and a formally more precise approximate model of that system, and uses temporal observations of its desired vs. realized responses. Furthermore, it does not assume the absence of unknown perturbations caused by internal friction and/or external disturbances. Another advantage is that the SVD, a relatively time-consuming operation, has to be executed on a grid of a rough system model only once, before the commencement of the control cycle, within which only simple computations are performed. The simulation examples demonstrate the superiority of the FPT/SVD-based control, which otherwise has the deficiency that it can leave the region of its convergence. Its design and use therefore require preliminary simulation investigations. However, the simulations also show that its convergence can be guaranteed for various practical purposes.
Abstract:
Adopting standard-based weblab infrastructures can be an added value for spreading their influence and acceptance in education. This paper suggests a solution based on the IEEE1451.0 Std. and FPGA technology for creating reconfigurable weblab infrastructures using Instruments and Modules (I&Ms) described through standard Hardware Description Language (HDL) files. It describes a methodology for creating and binding I&Ms into an IEEE1451-module embedded in an FPGA-based board that can be remotely controlled/accessed using IEEE1451-HTTP commands. At the end, an example of a step-motor controller module bound to that IEEE1451-module is described.
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologia, Universidade Nova de Lisboa, for the degree of Master in Environmental Engineering, Environmental Systems Management profile
Abstract:
5th Portuguese Conference on Automatic Control, September 5-7, 2002, Aveiro, Portugal
Abstract:
In this study, a new waste management solution for thermoset glass fibre reinforced polymer (GFRP) based products was assessed. A mechanical recycling approach, reducing GFRP waste to powdered and fibrous materials, was applied, and the prospective added value of the obtained recyclates was experimentally investigated as raw material for polyester-based mortars. Different GFRP-waste-admixed mortar formulations were analysed, varying the content of the GFRP powder and fibre mix waste from 4% to 12% by weight. The effect of incorporating a silane coupling agent was also assessed. Design of experiments and data treatment were accomplished through a full factorial design and analysis of variance (ANOVA). The added value of the potential recycling solution was assessed by means of the flexural and compressive loading capacity of GFRP-waste-admixed mortars with regard to unmodified polymer mortars. The key findings of this study show a viable technological option for improving the quality of polyester-based mortars and highlight a potentially cost-effective waste management solution for thermoset composite materials in the production of sustainable concrete-polymer based products.
Abstract:
Pultruded products are being targeted by a growing demand due to their excellent mechanical properties and low chemical reactivity, which ensure a low level of maintenance operations and allow an easier assembly process than equivalent steel bars. In order to improve the mechanical drawing process and solve some acoustic and thermal insulation problems, pultruded pipes of glass fibre reinforced plastic (GFRP) can be filled with special products that increase their performance regarding the issues referred to above. The great challenge of this work was to design new equipment able to produce pultruded pipes filled with cork or polymeric pre-shaped bars in a continuous process. The project was carried out successfully, and the new equipment was built and integrated into the existing pultrusion line, making it possible to obtain new products with higher added value in the market and covering needs previously identified in the field of civil construction.
Abstract:
The development and applications of thermoset polymeric composites, namely fibre reinforced plastics (FRP), have shifted in the last decades more and more into the mass market [1]. Despite all the advantages associated with FRP-based products, the increasing production and consumption also lead to an increasing amount of FRP waste, either end-of-lifecycle products or scrap and by-products generated by the manufacturing process itself. Whereas thermoplastic FRPs can be easily recycled by remelting and remoulding, the recycling of thermosetting FRPs is a more difficult task due to the cross-linked nature of the resin matrix. To date, most thermoset-based FRP waste is incinerated or landfilled, leading to negative environmental impacts and supplementary costs for FRP producers and suppliers. This framework is putting increasing pressure on the industry to address the options available for FRP waste management and is an important driver for applied research into cost-efficient recycling methods [1-2]. In spite of this, research on recycling solutions for thermoset composites is still at an elementary stage. Thermal and/or chemical recycling processes, with partial fibre recovery, have been investigated mostly for carbon fibre reinforced plastics (CFRP), due to the inherent value of the carbon fibre reinforcement, whereas for glass fibre reinforced plastics (GFRP) mechanical recycling, by means of milling and grinding processes, has been considered a more viable recycling method [1-2]. At the moment, however, few solutions for reusing mechanically recycled GFRP composites in value-added products are being explored. Aiming to fill this gap, in this study a new waste management solution for thermoset GFRP-based products was assessed.
The mechanical recycling approach, reducing GFRP waste to powdered and fibrous materials, was applied, and the potential added value of the obtained recyclates was experimentally investigated as raw material for polyester-based mortars. The use of a cementless concrete as host material for the GFRP recyclates, instead of a conventional Portland cement based concrete, presents an important asset in avoiding the incompatibility problems that may arise from the alkali-silica reaction between glass fibres and the cementitious binder matrix. Additionally, due to the hermetic nature of the resin binder, polymer-based concretes present a greater ability to incorporate recycled waste products [3]. Under this scope, different GFRP-waste-admixed polymer mortar (PM) formulations were analysed, varying the size grading and content of the GFRP powder and fibre mix waste. The added value of the potential recycling solution was assessed by means of the flexural and compressive loading capacities of the modified mortars with regard to waste-free polymer mortars.