998 results for Test scenario
Abstract:
Final Master's project submitted for the degree of Master in Electronics and Telecommunications Engineering
Abstract:
The limitation of photoactivation of dual-polymerized resin cements along the margins of metal restorations may adversely affect the mechanical properties of these cements, thus impairing the retention of restorations. The aim of this study was to assess the bond strength of cast metal crowns cemented with three dual-polymerized resin cements, using a chemically activated resin cement and zinc phosphate as controls. Fifty nickel-chromium alloy crowns were cast and randomly assigned to five groups of equal size. Castings were cemented on their corresponding metal dies with one of the tested luting agents: Scotchbond Resin Cement, Enforce and Panavia F (dual-polymerized resin cements), Cement-It (chemically activated resin cement) and Zinc Phosphate Cement (zinc phosphate). Specimens were stored in distilled water at 37 °C for 24 h and then loaded in tension until failure. Panavia F and Zinc Phosphate Cement provided the highest and lowest mean bond strengths, respectively. Scotchbond Resin Cement, Enforce and Cement-It exhibited similar intermediate values, with statistically significant differences compared to the other materials (P < 0.05). Even with restricted or absent light activation, all tested dual-polymerized resin cements produced significantly higher bond strengths than the zinc phosphate cement and yielded similar or better results than the chemically activated cement. It should be pointed out that the findings of this study relate to a test scenario that does not mimic clinical circumstances, and further work is required to identify the clinical significance of the reported tensile bond strength differences between the luting materials.
Abstract:
Computational grids allow users to share the resources of distributed machines, even if those machines belong to different organizations. The scheduling of applications must be performed with performance goals in mind, focusing on choosing which processes may access specific resources, and which resources those are. In this article we discuss aspects of application scheduling in grid computing environments. We also present a tool for scheduling simulation, along with test scenarios and results.
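The scheduling problem described above — mapping application processes onto distributed grid resources while pursuing a performance goal — can be illustrated with a minimal simulation. The greedy earliest-finish heuristic and all names below are illustrative assumptions, not the tool presented in the article:

```python
# Minimal grid-scheduling simulation sketch (illustrative only):
# assign each task to the resource that would finish it earliest.

def schedule(tasks, speeds):
    """tasks: list of workloads (abstract work units).
    speeds: processing speed of each grid resource.
    Returns (assignment per task, busy-until time per resource)."""
    finish = [0.0] * len(speeds)      # when each resource becomes free
    assignment = []
    for work in tasks:
        # completion time of this task on each candidate resource
        candidates = [finish[r] + work / speeds[r] for r in range(len(speeds))]
        best = min(range(len(speeds)), key=lambda r: candidates[r])
        finish[best] = candidates[best]
        assignment.append(best)
    return assignment, finish

if __name__ == "__main__":
    tasks = [4.0, 2.0, 8.0, 1.0]      # four processes to place
    speeds = [1.0, 2.0]               # resource 1 is twice as fast
    assign, finish = schedule(tasks, speeds)
    print(assign, max(finish))        # placement and resulting makespan
```

The makespan (the largest per-resource finish time) is the usual performance goal in such test scenarios; real grid schedulers also weigh data locality and resource ownership, which this sketch deliberately omits.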
Abstract:
While the system-stabilizing function of reciprocity is widely acknowledged, much less attention has been paid to the argument that reciprocity might initiate social cooperation in the first place. This paper tests Gouldner's early assumption that reciprocity may act as a 'starting mechanism' of social cooperation in consolidating societies. The empirical test scenario builds on unequal civic engagement between immigrants and nationals, as this engagement gap can be read as a lack of social cooperation in consolidating immigration societies. Empirical analyses using survey data on reciprocal norms, based on Bayesian hierarchical modelling, lend support to Gouldner's thesis, underlining the relevance of reciprocity in today's increasingly diverse societies: individual norms of altruistic reciprocity elevate immigrants' propensity to volunteer, thereby reducing the engagement gap between immigrants and natives in the area of informal volunteering. In other words, compliance with altruistic reciprocity may trigger cooperation in social strata where it is less likely to occur. The positive moderation of the informal engagement gap through altruistic reciprocity turns out to be most pronounced for immigrants who are otherwise least likely to engage in informal volunteering, that is, immigrants with low but also with high levels of education.
Abstract:
Matsukawa and Habeck (2007) analyse the main instruments for risk mitigation in infrastructure financing with Multilateral Financial Institutions (MFIs). Their review coincided with the global financial crisis of 2007-08 and is highly relevant in current times, considering the sovereign debt crisis, the lack of available capital and the increases in bank regulation in Western economies. The current macroeconomic environment has seen a slowdown in the level of finance for infrastructure projects, as they pose a higher credit risk given their requirement for long-term investment. The rationale for this work is to look for innovative solutions focused on the credit risk mitigation of infrastructure and energy projects whilst optimizing the economic capital allocation for commercial banks. This objective is achieved through risk-sharing with MFIs and by seeking capital relief in project finance transactions. This research answers the main question: "What is the impact of risk-sharing with MFIs on project finance transactions to increase their efficiency and viability?", and is developed from the perspective of a commercial bank, assessing the economic capital used and analysing the relevant variables: Probability of Default, Loss Given Default and Recovery Rates (Altman, 2010). An overview of project finance for the infrastructure and energy sectors in terms of the volume of transactions worldwide is outlined, along with a summary of risk-sharing financing with MFIs. The current regulatory framework underpinning risk-sharing in structured finance with MFIs is also reviewed. From here, the impact of risk-sharing and the diversification effect in infrastructure and energy projects is assessed from the perspective of economic capital allocation for a commercial bank. CreditMetrics (J. P.
Morgan, 1997) is applied over an existing, well-diversified portfolio of project finance infrastructure and energy investments, working with the main risk capital measures: economic capital, RAROC and EVA. The conclusions of this research show that economic capital allocation on a portfolio of project finance, combined with risk-sharing with MFIs, has a substantial impact on capital relief while increasing performance profitability for commercial banks. There is an outstanding diversification effect due to the portfolio, which is combined with risk mitigation and an improvement in recovery rates through Partial Credit Guarantees issued by MFIs. A stress test scenario analysis is applied to the current assumptions and credit risk model, considering a downgrade in the rating of the commercial bank (lender) and an increase of defaults in emerging countries, showing a direct impact on economic capital through an increase in expected loss and a decrease in performance profitability. Obtaining capital relief through risk-sharing makes it more viable for commercial banks to finance infrastructure and energy projects, with the beneficial effect of a direct impact of these investments on GDP growth and employment. The main contribution of this work is to promote strategic economic capital allocation in infrastructure and energy financing through innovative risk-sharing with MFIs and economic pricing, to create economic value added for banks and to allow the financing of more infrastructure and energy projects. This work suggests several topics for further research in relation to the issues analysed.
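The risk capital measures named in the abstract have standard textbook formulations, sketched below. These are the usual definitions (EL = EAD × PD × LGD, RAROC as risk-adjusted return over economic capital, EVA as the excess over the cost of capital), not the thesis's exact model, and all figures are illustrative:

```python
# Textbook risk-capital measures (illustrative, not the thesis's data).

def expected_loss(ead, pd, lgd):
    """EL = exposure at default * probability of default * loss given default."""
    return ead * pd * lgd

def raroc(net_income, el, economic_capital):
    """Risk-adjusted return on capital: (income - expected loss) / capital."""
    return (net_income - el) / economic_capital

def eva(net_income, el, economic_capital, hurdle_rate):
    """Economic value added: risk-adjusted return above the cost of capital."""
    return (net_income - el) - hurdle_rate * economic_capital

if __name__ == "__main__":
    ead, pd_, lgd = 100.0, 0.02, 0.45       # exposure 100, PD 2%, LGD 45%
    el = expected_loss(ead, pd_, lgd)       # 0.9
    cap, income, hurdle = 8.0, 2.5, 0.10    # illustrative portfolio figures
    print(raroc(income, el, cap))           # 0.2 -> 20% RAROC
    print(eva(income, el, cap, hurdle))     # 0.8 -> positive EVA
```

Under these definitions, a Partial Credit Guarantee that raises recovery rates lowers LGD, hence lowers expected loss and economic capital, which is exactly the capital-relief channel the abstract describes.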
Abstract:
The Comprehensive Assessment conducted by the European Central Bank (ECB) represents a considerable step forward in enhancing transparency in euro-area banks' balance sheets. The most notable progress since the previous European stress test has been the harmonisation of the definition of non-performing loans and other concepts, as well as the uncovering of hidden losses, which resulted in a €34 billion aggregate capital charge net of taxes. Despite this tightening, most banks were able to meet the 5.5% common equity tier 1 (CET1) threshold applied in the test, which suggests that the large majority of euro-area banks have improved their financial position sufficiently that they should no longer be constrained in financing the economy. As shown in this CEPS Policy Brief by Willem Pieter de Groen, however, the detailed results provide a more nuanced picture: a large number of banks in the euro area remain highly leveraged and, in many cases, unable under the adverse stress test scenario to meet the regulatory capital requirements that will be introduced in the coming years.
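The pass/fail logic of the threshold mentioned above is simple arithmetic: a bank passes the adverse scenario if CET1 capital divided by risk-weighted assets stays at or above 5.5%. A minimal sketch, with made-up bank figures:

```python
# Illustrative check of the 5.5% CET1 threshold from the ECB stress
# test's adverse scenario (the bank figures below are invented).

ADVERSE_THRESHOLD = 0.055   # 5.5% common equity tier 1

def cet1_ratio(cet1_capital, risk_weighted_assets):
    """CET1 ratio = CET1 capital / risk-weighted assets (RWA)."""
    return cet1_capital / risk_weighted_assets

def passes_adverse(cet1_capital, risk_weighted_assets):
    """True if the bank meets the adverse-scenario threshold."""
    return cet1_ratio(cet1_capital, risk_weighted_assets) >= ADVERSE_THRESHOLD

if __name__ == "__main__":
    banks = {"Bank A": (6.0, 100.0), "Bank B": (4.8, 100.0)}  # (CET1, RWA), billions
    for name, (cap, rwa) in banks.items():
        print(name, f"{cet1_ratio(cap, rwa):.1%}", passes_adverse(cap, rwa))
```

Note that the CET1 ratio is risk-weighted; the leverage concern raised in the brief is a separate, unweighted measure (capital over total exposure), which a bank can fail even while passing the CET1 test.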
Abstract:
The electricity sector is undergoing major changes at both the management and the market level. One of the keys accelerating this change is the ever-increasing penetration of Distributed Generation systems (DER), which are giving the user a greater role in the management of the electrical system. The complexity of the scenario foreseen for the near future requires grid equipment capable of interacting in a far more dynamic system than today's, where the connection interface must be endowed with the necessary intelligence and communication capability so that the whole system can be managed effectively as a whole. We are currently witnessing the transition from the traditional electrical system model to a new, active and intelligent system known as the Smart Grid. This thesis presents the study of an Intelligent Electronic Device (IED) aimed at providing solutions for the needs arising from the evolution of the electrical system, capable of being integrated into current and future grid equipment, adding functionality and therefore value to these systems. To situate the needs of these IEDs, an extensive background study was carried out, beginning with the historical evolution of these systems, the characteristics of the electrical interconnection they must control, and the various functions and solutions they must provide, concluding with a review of the current state of the art. This background also includes a review of international and national standards, necessary to understand the various requirements these devices must meet. The specifications and considerations required for the design are then presented, together with its multifunctional architecture.
At this point of the work, some original design approaches are proposed, concerning the IED architecture and how data should be synchronised depending on the nature of the events and the different functionalities. The development of the system continues with the design of its constituent subsystems, where some novel algorithms are presented, such as the anti-islanding approach based on weighted multiple detection. Once the IED architecture and functions are designed, the development of a prototype based on a hardware platform is described. The necessary requirements are analysed, and the choice of a high-performance embedded platform including a processor and an FPGA is justified. The developed prototype is subjected to a Class A test protocol, according to the IEC 61000-4-30 and IEC 62586-2 standards, to verify the monitoring of parameters. Various tests are also presented in which the delays involved in the protection-related algorithms were estimated. Finally, a real test scenario is discussed, within the context of a National Research Plan project, in which this prototype was integrated into an inverter, providing it with the intelligence required for a future Smart Grid context.
Abstract:
The introduction of electric vehicles will affect cities' environments and urban mobility policies. Network system operators will have to consider electric vehicles in planning and operation activities, due to the vehicles' dependency on the electricity grid. The present paper presents test cases using an Electric Vehicle Scenario Simulator (EVeSSi) being developed by the authors. The test cases include two scenarios on a 33-bus network with up to 2,000 electric vehicles in the urban area. The scenarios consider electric vehicle penetrations of 10% (200 of 2,000), 30% (600) and 100% (2,000). The first scenario evaluates network impacts and the second evaluates CO2 emissions and fuel consumption.
Abstract:
Background: In areas with limited infrastructure for microscopy diagnosis, rapid diagnostic tests (RDT) have been demonstrated to be effective. Method: The cost-effectiveness of the OptiMAL® test and thick smear microscopy was estimated and compared. Data were collected in remote areas of 12 municipalities in the Brazilian Amazon. Data sources included the National Malaria Control Programme of the Ministry of Health, the National Healthcare System reimbursement table, hospitalization records, primary data collected from the municipalities, and the scientific literature. The perspective was that of the Brazilian public health system, the analytical horizon was from the start of fever until the diagnostic results were provided to the patient, and the temporal reference was the year 2006. The results were expressed in costs per adequately diagnosed case, in 2006 U.S. dollars. Sensitivity analysis was performed considering key model parameters. Results: In the base case scenario, considering 92% and 95% sensitivity for thick smear microscopy for Plasmodium falciparum and Plasmodium vivax, respectively, and 100% specificity for both species, thick smear microscopy is more costly and more effective, with an incremental cost estimated at US$ 549.9 per adequately diagnosed case. In sensitivity analysis, when the sensitivity and specificity of microscopy for P. vivax were 0.90 and 0.98, respectively, and when its sensitivity for P. falciparum was 0.83, the RDT was more cost-effective than microscopy. Conclusion: Microscopy is more cost-effective than OptiMAL® in these remote areas if the high accuracy of microscopy is maintained in the field. Decisions regarding the use of rapid tests for the diagnosis of malaria in these areas depend on current microscopy accuracy in the field.
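The "incremental cost per adequately diagnosed case" reported above is a standard incremental cost-effectiveness ratio (ICER): the extra cost of one strategy over the other, divided by the extra effect. A minimal sketch with illustrative numbers (not the study's data):

```python
# ICER sketch: extra cost per additional adequately diagnosed case
# (numbers are illustrative placeholders, not the study's figures).

def icer(cost_new, effect_new, cost_old, effect_old):
    """Incremental cost-effectiveness ratio: incremental cost divided
    by incremental effect of the new strategy over the old one."""
    return (cost_new - cost_old) / (effect_new - effect_old)

if __name__ == "__main__":
    # hypothetical comparison per 1,000 febrile patients
    cost_micro, diagnosed_micro = 12000.0, 940.0   # thick smear microscopy
    cost_rdt, diagnosed_rdt = 9000.0, 900.0        # rapid diagnostic test
    print(icer(cost_micro, diagnosed_micro, cost_rdt, diagnosed_rdt))
    # extra dollars per additional adequately diagnosed case
```

The sensitivity-analysis result in the abstract follows directly from this structure: lowering microscopy's assumed sensitivity shrinks the effect difference in the denominator, inflating the ICER until the RDT becomes the more cost-effective option.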
Abstract:
Common sense tells us that the future is an essential element in any strategy. In addition, there is a good deal of literature on scenario planning, an important tool for considering the future in terms of strategy. However, in many organizations there is serious resistance to the development of scenarios, and they are not broadly implemented by companies. Yet even organizations that do not rely heavily on the development of scenarios do, in fact, construct visions to guide their strategies. It might be asked, then: what happens when this vision is not consistent with the future? To address this problem, the present article proposes a method for checking the content and consistency of an organization's vision of the future, no matter how it was conceived. The proposed method is grounded in theoretical concepts from the field of futures studies, which are described in this article. This study was motivated by the search for new ways of improving and using scenario techniques as a method for making strategic decisions. The method was then tested on a company in the field of information technology in order to check its operational feasibility. The test showed that the proposed method is, in fact, operationally feasible and was capable of analyzing the vision of the company being studied, indicating both its shortcomings and points of inconsistency. (C) 2007 Elsevier Ltd. All rights reserved.
Abstract:
This thesis examines whether scenario planning supports organizational strategy as a method for addressing uncertainty. The main issues are why, what and how scenario planning fits into organizational strategy, and how the process could be supported to make it more effective. The study follows the constructive approach. It starts with an examination of competitive advantage, the way an organization develops strategy, and how it addresses uncertainty in its operational environment. Based on the literature review conducted, scenario methods seem to provide a versatile platform for addressing future uncertainties. The construction is formed by examining the scenario methods and presenting suitable support methods, resulting in a theoretical proposition for a supported scenario process. The theoretical framework is tested in laboratory conditions, and the results from the test sessions are used as a basis for scenario stories. The process of forming the scenarios and the results are illustrated and presented for scrutiny.
Abstract:
We compare hypothetical and observed (experimental) willingness to pay (WTP) for a gradual improvement in the environmental performance of a marketed good (an office table). First, following usual practices in marketing research, subjects’ stated WTP for the improvement is obtained. Second, the same subjects participate in a real reward experiment designed to replicate the scenario valued in the hypothetical question. Our results show that, independently of the degree of the improvement, there are no significant median differences between stated and experimental data. However, subjects reporting extreme values of WTP (low or high) exhibit a more moderate behavior in the experiment.
Abstract:
In the current socio-environmental scenario, concern for natural resources has increased, along with interest in the reuse of products and by-products. Recycling is the approach by which a material or energy is reintroduced into the productive system. It reduces the volume of garbage dumped into the environment, saves energy and decreases the need for natural resources. In general, expanded polystyrene waste is deposited in sanitary landfills or uncontrolled dumps, where it occupies a large volume and spreads easily by aeolian action, with consequent environmental pollution; recycling avoids this misuse and reduces the amount obtained from petroleum. In this work, expanded polystyrene was recycled via melting and/or dissolution in solvents for the production of integrated circuit boards. The obtained material was characterized in flexural mode according to ASTM D 790, and the results were compared with phenolite, the material traditionally used. Specimen fractures were observed by scanning electron microscopy in order to establish patterns. The recycled expanded polystyrene, as well as the phenolite, were also thermally analysed by TGA and DSC. The dissolution method produced very brittle materials. The melting method showed no void formation and did not increase the brittleness of the material. The recycled polystyrene presented a strength value significantly lower than that of the phenolite. (C) 2011 Published by Elsevier Ltd. Selection and peer-review under responsibility of ICM11
Abstract:
We propose an alternative, nonsingular cosmic scenario based on gravitationally induced particle production. The model is an attempt to evade the coincidence and cosmological constant problems of the standard model (ΛCDM) and also to connect the early and late-time accelerating stages of the Universe. Our space-time emerges from a pure initial de Sitter stage, thereby providing a natural solution to the horizon problem. Subsequently, due to an instability provoked by the production of massless particles, the Universe evolves smoothly to the standard radiation-dominated era, thereby ending the production of radiation as required by conformal invariance. Next, the radiation becomes subdominant, with the Universe entering the cold dark matter dominated era. Finally, the negative pressure associated with the creation of cold dark matter (CCDM model) particles accelerates the expansion and drives the Universe to a final de Sitter stage. The late-time cosmic expansion history of the CCDM model is exactly like that of the standard ΛCDM model; however, there is no dark energy. The model evolves between two limiting (early and late-time) de Sitter regimes. All the stages are also discussed in terms of a scalar field description. This complete scenario is fully determined by two extreme energy densities, or equivalently, the associated de Sitter Hubble scales, connected by ρ_I/ρ_f = (H_I/H_f)² ∼ 10¹²², a result that has no correlation with the cosmological constant problem. We also study the linear growth of matter perturbations at the final accelerating stage. It is found that the CCDM growth index can be written as a function of the Λ growth index, γ_Λ ≃ 6/11. In this framework, we also compare the observed growth rate of clustering with that predicted by the current CCDM model. Performing a χ² statistical test, we show that the CCDM model provides growth rates that match sufficiently well the observed growth rate of structure.
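The χ² comparison named in the abstract can be sketched in a few lines: residuals between observed and predicted growth rates, weighted by measurement errors, using the usual growth-index parametrisation f = Ω_m^γ. All data points below are illustrative placeholders, not the measurements used in the paper:

```python
# Sketch of a chi-squared fit of growth rates (illustrative data only).

def chi2(observed, predicted, sigma):
    """Sum of squared, error-weighted residuals."""
    return sum((o - p) ** 2 / s ** 2 for o, p, s in zip(observed, predicted, sigma))

def growth_rate(omega_m, gamma=6 / 11):
    """f = Omega_m^gamma, the standard growth-index parametrisation;
    gamma = 6/11 is the Lambda growth index quoted in the abstract."""
    return omega_m ** gamma

if __name__ == "__main__":
    omegas = [0.3, 0.5, 0.8]          # matter density at three sample redshifts
    predicted = [growth_rate(om) for om in omegas]
    observed = [0.52, 0.69, 0.90]     # made-up 'measurements' with errors below
    sigma = [0.05, 0.05, 0.05]
    print(chi2(observed, predicted, sigma))
```

A model "matches sufficiently well" when the resulting χ² is comparable to the number of data points minus fitted parameters; the paper's claim is that the CCDM prediction passes this kind of test against the observed clustering growth rate.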