890 results for real option analysis


Relevance:

30.00%

Publisher:

Abstract:

Electricity markets are complex environments involving a large number of entities with specific characteristics and objectives, making their decisions and interacting in a dynamic scene. Game theory has been widely used to support decisions in competitive environments, so its application to electricity markets can prove to be a high-potential tool. This paper proposes a new scenario analysis algorithm that applies game theory to evaluate and preview different scenarios, giving players the ability to react strategically and exhibit the behavior that best fits their objectives. The model forecasts competitor players' actions to build models of their behavior and thereby define the most probable expected scenarios. Once the scenarios are defined, game theory is applied to support the choice of the action to perform. Our use of game theory is intended to support one specific agent, not to reach the market equilibrium. MASCEM (Multi-Agent System for Competitive Electricity Markets) is a multi-agent electricity market simulator that models market players and simulates their operation in the market. The scenario analysis algorithm has been tested within MASCEM, and our experimental findings from a case study based on real data from the Iberian Electricity Market are presented and discussed.
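As a toy illustration of the scenario-based choice described above (not MASCEM's actual algorithm), the sketch below picks the bid that maximizes the expected payoff over a set of forecast competitor scenarios; all prices, probabilities and the payoff rule are hypothetical:

```python
# Illustrative best-response choice over forecast scenarios (hypothetical data).
# Each scenario is a forecast of competitors' aggregate behaviour with a
# probability; payoff(action, scenario) is the player's profit in that scenario.

def expected_payoff(action, scenarios, payoff):
    """Probability-weighted payoff of one candidate action."""
    return sum(p * payoff(action, s) for s, p in scenarios)

def best_response(actions, scenarios, payoff):
    """Action maximizing expected payoff across the forecast scenarios."""
    return max(actions, key=lambda a: expected_payoff(a, scenarios, payoff))

# Toy example: candidate bid prices vs. forecast market clearing prices.
scenarios = [(40.0, 0.5), (50.0, 0.3), (60.0, 0.2)]  # (clearing price, prob.)
cost = 35.0

def payoff(bid, clearing_price):
    # Dispatched (and paid the clearing price) only if bid <= clearing price.
    return (clearing_price - cost) if bid <= clearing_price else 0.0

choice = best_response([38.0, 45.0, 55.0], scenarios, payoff)
```

The lowest bid wins here because it is dispatched in every scenario; with other probabilities a higher, riskier bid could dominate.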

Relevance:

30.00%

Publisher:

Abstract:

Electric power networks, namely distribution networks, have undergone several changes in recent years due to changes in power system operation towards the implementation of smart grids. Several approaches to operating network resources have been introduced, such as demand response, which make use of the new capabilities of smart grids. In the early stages of smart grid implementation, only limited amounts of data, namely consumption data, are generated. The methodology proposed in this paper uses demand response consumer performance evaluation methods to determine the expected consumption of a given consumer. Potential commercial losses are then identified using monthly historic consumption data. Real consumption data is used in the case study to demonstrate the application of the proposed method.
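A minimal sketch of the idea (the paper's actual evaluation methods are more elaborate): estimate the expected consumption from monthly history and flag months that fall far below it as potential commercial losses. The threshold and all figures are hypothetical:

```python
from statistics import mean

def flag_potential_losses(history, recent, threshold=0.6):
    """Flag months whose consumption drops below `threshold` times the
    expected value estimated from monthly historic data (here, the mean)."""
    expected = mean(history)
    return [i for i, kwh in enumerate(recent) if kwh < threshold * expected]

history = [320, 310, 330, 305, 325, 315]  # kWh per month (toy data)
recent = [318, 140, 322, 95]
suspicious = flag_potential_losses(history, recent)
```

Months 1 and 3 are flagged because their consumption is far below the historical expectation, a pattern compatible with metering fraud or faults.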

Relevance:

30.00%

Publisher:

Abstract:

Most distributed generation and smart grid research is dedicated to studies of network operation parameters, reliability, and similar topics. However, many of these works use traditional test systems, for instance the IEEE test systems. This paper proposes voltage magnitude and reliability studies in the presence of fault conditions, considering realistic conditions found in countries like Brazil. The methodology is a hybrid of fuzzy sets and Monte Carlo simulation, based on fuzzy-probabilistic models, combined with a remedial action algorithm based on optimal power flow. To illustrate the application of the proposed method, the paper includes a case study of a real 12-bus sub-transmission network.
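An illustrative sketch of the hybrid idea, not the paper's model: an uncertain load is drawn from a triangular, membership-like distribution while fault states are sampled with Monte Carlo, and the undervoltage probability is estimated. All parameters are invented:

```python
import random

def hybrid_fuzzy_mc(n_trials, seed=42):
    """Toy hybrid sketch: sample an uncertain load from a triangular
    ("fuzzy-like") membership via random.triangular, sample a fault state
    with Monte Carlo, and estimate the probability of undervoltage."""
    rng = random.Random(seed)
    undervoltage = 0
    for _ in range(n_trials):
        load = rng.triangular(0.8, 1.2, 1.0)   # p.u. load (low, high, mode)
        fault = rng.random() < 0.05            # 5% chance of a line fault
        drop = 0.08 * load + (0.10 if fault else 0.0)
        if 1.0 - drop < 0.9:                   # below a 0.9 p.u. voltage limit
            undervoltage += 1
    return undervoltage / n_trials

p = hybrid_fuzzy_mc(10_000)
```

In a real study the remedial action algorithm (optimal power flow) would be run inside the loop instead of the fixed voltage-drop rule used here.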

Relevance:

30.00%

Publisher:

Abstract:

Many-core platforms are an emerging technology in the real-time embedded domain. These devices offer various options for power savings and cost reductions and contribute to overall system flexibility; however, issues such as unpredictability, scalability and analysis pessimism are serious challenges to their integration into this domain. The focus of this work is on many-core platforms using a limited migrative model (LMM). LMM is an approach based on the fundamental concepts of the multi-kernel paradigm, which is a promising step towards scalable and predictable many-cores. In this work, we formulate the problem of real-time application mapping on a many-core platform using LMM and propose a three-stage method to solve it. An extended version of an existing analysis is used to ensure that the derived mappings (i) guarantee that the timing constraints posed on the worst-case communication delays of individual applications are met, and (ii) provide an environment for load balancing for energy/thermal management, fault tolerance and/or performance reasons.
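The mapping problem above can be illustrated with a simple load-balancing heuristic (worst-fit placement under a per-core utilization bound); this is only a sketch, not the paper's three-stage method:

```python
def map_applications(utilizations, n_cores, bound=1.0):
    """Worst-fit decreasing mapping: place each application on the least
    loaded core, rejecting mappings that exceed the per-core bound.
    Returns per-core lists of application indices, or None if infeasible."""
    loads = [0.0] * n_cores
    mapping = [[] for _ in range(n_cores)]
    for idx, u in sorted(enumerate(utilizations), key=lambda t: -t[1]):
        core = min(range(n_cores), key=lambda c: loads[c])
        if loads[core] + u > bound:
            return None  # no feasible placement under this heuristic
        loads[core] += u
        mapping[core].append(idx)
    return mapping

mapping = map_applications([0.6, 0.3, 0.5, 0.2], n_cores=2)
```

Placing each application on the least loaded core keeps the per-core loads balanced, which is the property exploited for energy/thermal management in the text above.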

Relevance:

30.00%

Publisher:

Abstract:

The last decade has witnessed a major shift towards the deployment of embedded applications on multi-core platforms. However, real-time applications have not been able to fully benefit from this transition, as the computational gains offered by multi-cores are often offset by performance degradation due to shared resources, such as main memory. To use multi-core platforms efficiently for real-time systems, it is hence essential to tightly bound the interference when accessing shared resources. Although there has been much recent work in this area, a remaining key problem is to address the diversity of memory arbiters in the analysis so that it applies to a wide range of systems. This work handles diverse arbiters by proposing a general framework to compute the maximum interference caused by the shared memory bus and its impact on the execution time of the tasks running on the cores, considering different bus arbiters. Our novel approach clearly demarcates the arbiter-dependent and arbiter-independent stages in the analysis of these upper bounds. The arbiter-dependent phase takes the arbiter and the task memory-traffic pattern as inputs and produces a model of the availability of the bus to a given task. Then, based on the availability of the bus, the arbiter-independent phase determines the worst-case request-release scenario that maximizes the interference experienced by the tasks due to contention for the bus. We show that the framework addresses the diversity problem by applying it to a memory bus shared by a fixed-priority arbiter, a time-division multiplexing (TDM) arbiter, and an unspecified work-conserving arbiter, using applications from the MediaBench test suite. We also experimentally evaluate the quality of the analysis by comparing it with a state-of-the-art TDM analysis approach, consistently showing a considerable reduction in maximum interference.
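As a rough illustration of an arbiter-dependent bound (not the paper's framework), the sketch below computes a coarse worst-case interference under a TDM arbiter, assuming each request may arrive just after its own slot closed and wait for all other slots; all timing figures are hypothetical:

```python
def tdm_worst_case_interference(n_requests, n_slots, slot_len):
    """Coarse illustrative upper bound on bus interference under TDM:
    every request of the task waits for all other cores' slots."""
    per_request_wait = (n_slots - 1) * slot_len
    return n_requests * per_request_wait

def worst_case_delay(exec_time, n_requests, n_slots, slot_len, access_time):
    """Execution time inflated by bus access and worst-case waiting
    (illustrative; real analyses are far less pessimistic)."""
    return (exec_time
            + n_requests * access_time
            + tdm_worst_case_interference(n_requests, n_slots, slot_len))

wcrt = worst_case_delay(exec_time=1000, n_requests=20, n_slots=4,
                        slot_len=10, access_time=2)
```

This kind of per-request pessimism is exactly what the request-release analysis described above tightens by modelling actual bus availability.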

Relevance:

30.00%

Publisher:

Abstract:

Microcystin-LR (MC-LR) is a dangerous toxin found in environmental waters, typically quantified by high-performance liquid chromatography and/or enzyme-linked immunosorbent assays. Quick, low-cost, on-site analysis is thus required to ensure human safety and enable wide screening programs. This work proposes label-free potentiometric sensors made of solid-contact electrodes coated with a surface-imprinted polymer grown on multi-walled carbon nanotubes (CNTs) incorporated in a polyvinyl chloride membrane. The imprinting effect was checked using non-imprinted materials. The MC-LR-sensitive sensors were evaluated, characterized and applied successfully to spiked environmental waters. The presented method offers the advantages of low cost, portability, easy operation and suitability for adaptation to flow methods.

Relevance:

30.00%

Publisher:

Abstract:

Monitoring organic environmental contaminants is of crucial importance to ensure public health, which requires simple, portable and robust devices for on-site analysis. For this purpose, a low-temperature co-fired ceramics (LTCC) microfluidic potentiometric device (LTCC/μPOT) was developed for the first time for an organic compound: sulfamethoxazole (SMX). The sensory materials relied on newly designed plastic antibodies, merging sol–gel, self-assembled monolayer and molecular-imprinting techniques. Silica beads were amine-modified and linked to SMX via glutaraldehyde modification. Condensation polymerization was conducted around SMX to fill the vacant spaces, and SMX was subsequently removed, leaving behind imprinted sites of complementary shape. The obtained particles were used as ionophores in plasticized PVC membranes. The most suitable membrane composition was selected in steady-state assays, and its suitability for flow analysis was verified in flow-injection studies with regular tubular electrodes. The LTCC/μPOT device integrated a bidimensional mixer, an embedded Ag/AgCl reference electrode and an Ag-based contact screen-printed under a micromachined cavity of 600 μm depth. The sensing membranes were deposited over this contact and acted as indicating electrodes. Under optimum conditions, the SMX sensor displayed slopes of about −58.7 mV/decade in a range from 12.7 to 250 μg/mL, providing a detection limit of 3.85 μg/mL and a sampling throughput of 36 samples/h with a reagent consumption of 3.3 mL per sample. The system was later adjusted to multiple-analyte detection by including a second potentiometric cell on the LTCC/μPOT device; no additional reference electrode was required. This concept was applied to trimethoprim (TMP), which is always administered concomitantly with sulphonamide drugs, and tested in fish-farming waters. The biparametric microanalyzer displayed Nernstian behaviour, with average slopes of −54.7 (SMX) and +57.8 (TMP) mV/decade. To demonstrate its capabilities in real applications, it was successfully applied to the single and simultaneous determination of SMX and TMP in aquaculture waters.
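The Nernstian calibration reported above can be inverted to recover a concentration from a measured potential. The slope is the SMX value from the abstract; the intercept E0 is an assumed placeholder, since the abstract does not report one:

```python
def concentration_from_potential(E, E0, slope):
    """Invert the Nernstian calibration E = E0 + slope * log10(C).
    `slope` is in mV/decade; concentrations come out in the units of
    the calibration standards (here, micrograms per mL)."""
    return 10 ** ((E - E0) / slope)

E0 = 120.0      # mV, hypothetical intercept (assumed, not from the abstract)
slope = -58.7   # mV/decade, SMX slope reported above

# A reading two decades along the calibration line recovers C = 100 ug/mL.
C = concentration_from_potential(E0 + slope * 2, E0, slope)
```

The negative slope means the potential decreases as SMX concentration rises, matching the anionic response reported for the sensor.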

Relevance:

30.00%

Publisher:

Abstract:

This paper studies the statistical distributions of worldwide earthquakes from year 1963 up to year 2012. A Cartesian grid, dividing Earth into geographic regions, is considered. Entropy and the Jensen–Shannon divergence are used to analyze and compare real-world data. Hierarchical clustering and multi-dimensional scaling techniques are adopted for data visualization. Entropy-based indices have the advantage of leading to a single parameter expressing the relationships between the seismic data. Classical and generalized (fractional) entropy and Jensen–Shannon divergence are tested. The generalized measures lead to a clear identification of patterns embedded in the data and contribute to better understand earthquake distributions.
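The entropy-based measures used above can be sketched directly; the distributions below are toy data, and the classical (non-fractional) forms are shown:

```python
import math

def shannon_entropy(p):
    """Shannon entropy (in nats) of a discrete distribution."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def js_divergence(p, q):
    """Jensen-Shannon divergence between two discrete distributions:
    entropy of the mixture minus the mean of the individual entropies."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return shannon_entropy(m) - (shannon_entropy(p) + shannon_entropy(q)) / 2

uniform = [0.25] * 4   # e.g. earthquakes spread evenly over four regions
peaked = [0.7, 0.1, 0.1, 0.1]
d = js_divergence(uniform, peaked)
```

Because the JS divergence is bounded (by ln 2 in nats) and symmetric, it is well suited to the pairwise region comparisons fed into the clustering and multi-dimensional scaling described above.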

Relevance:

30.00%

Publisher:

Abstract:

Presented at the 21st IEEE International Conference on Embedded and Real-Time Computing Systems and Applications (RTCSA 2015), 19-21 Aug 2015, Hong Kong, China, pp. 122-131.

Relevance:

30.00%

Publisher:

Abstract:

Stroke is one of the most common conditions requiring rehabilitation, and its motor impairments are a major cause of permanent disability. Hemiparesis is observed in 80% of patients after acute stroke. Neuroimaging studies have shown that real and imagined movements produce similar brain activation, supplying evidence that both rely on the same process. Within this context, combining mental practice (MP) with physical and occupational therapy appears to be a natural complement based on neurorehabilitation concepts. Our study investigates whether MP is an effective adjunct therapy for stroke rehabilitation of the upper limbs. Searches of PubMed (Medline), ISI Web of Knowledge (Institute for Scientific Information) and SciELO (Scientific Electronic Library) were completed on 20 February 2015. Data were collected on the following variables: sample size, type of supervision, configuration of the mental practice, setting of the physical practice (intensity, number of sets and repetitions, duration of contractions, rest interval between sets, weekly and total duration), measures of sensorimotor deficits used in the main studies, and significant results. Random-effects models, which take into account the variance within and between studies, were used. Seven articles were selected. There was no statistically significant difference between the two groups (MP vs. control), which showed an effect of −0.6 (95% CI: −1.27 to 0.04) for upper limb motor restoration after stroke. The present meta-analysis concluded that MP is not effective as an adjunct therapeutic strategy for upper limb motor restoration after stroke.
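The random-effects model family used in the meta-analysis can be sketched with the DerSimonian-Laird estimator; the study effects and variances below are hypothetical, not the data of the seven selected articles:

```python
def dersimonian_laird(effects, variances):
    """Random-effects pooled effect via the DerSimonian-Laird estimator:
    estimate between-study variance tau^2 from Cochran's Q, then pool
    with inverse-variance weights that include tau^2."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c) if c > 0 else 0.0
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = (1.0 / sum(w_star)) ** 0.5
    return pooled, se

# Hypothetical standardized mean differences and their variances.
pooled, se = dersimonian_laird([-0.8, -0.4, -0.6], [0.04, 0.05, 0.03])
```

When the heterogeneity statistic Q falls below its degrees of freedom, tau^2 is truncated to zero and the estimate coincides with the fixed-effect pooling, as in this toy example.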

Relevance:

30.00%

Publisher:

Abstract:

Small companies recognize the need to systematize their procedures in order to succeed in their business proposals. This standardization presupposes an assessment of the existing social and market context and of the companies' own values, mission and objectives. Constant technological innovation and social change are reflected in the development and strategy that organizations must follow as they operate in a very competitive and dynamic environment, which demands constant and careful attention. In analyzing a Portuguese micro-enterprise that manages and promotes properties for short-term rental, we found gaps in its operational management model. The scarcity of financial and human resources, together with weak knowledge of organizational and strategic management practices, limits the company's performance and may jeopardize its short-term viability. This observation was the starting point for this project which, after extensive reading, a survey of the existing processes, and the analysis and weighing of the various possible solutions, resulted in the proposal of an Information System, the option that seemed most appropriate. The proposal was approved by the company's management and the Information System will be implemented. The aim of this work was to help the organization improve its performance. To achieve this, it was necessary to list the company's strengths and weaknesses so as to gather in a single document the needs it exhibited, addressing existing shortcomings that could, in the near future, lead to its dissolution. The methodology adopted followed a descriptive research strategy using the action-research method. Data collection was based on interviews with the company's management team and employees and on documentation gathered at the company concerning management processes and institutional information, whose contents were analyzed from a qualitative perspective.

Relevance:

30.00%

Publisher:

Abstract:

The increase observed in recent years in mergers and acquisitions (M&A) results from managers' choice to face growing competitive pressure and/or the need for rapid growth. The literature on the effects of M&A processes is diverse and its conclusions are contradictory. In this work we aim to verify whether an M&A process translates into value creation for the shareholders of the acquiring company. As a valuation metric we use the Discounted Cash-Flow (DCF) method. For this purpose we carry out a case study of a Portuguese insurance company, Lusitania Companhia de Seguros, S.A., which recently acquired Real Seguros, S.A. through a merger by incorporation. We perform a quantitative analysis based on data from the annual report and accounts, and find evidence that performance declined after the transaction.
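The DCF metric used above can be sketched as the sum of discounted forecast cash flows plus a Gordon-growth terminal value; all cash flows and rates below are hypothetical, not Lusitania's figures:

```python
def discounted_cash_flow(cash_flows, rate, terminal_growth=0.0):
    """Enterprise value as the present value of explicit forecast cash
    flows plus a Gordon-growth terminal value (illustrative sketch)."""
    pv = sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))
    # Terminal value at the end of the forecast horizon, then discounted back.
    terminal = cash_flows[-1] * (1 + terminal_growth) / (rate - terminal_growth)
    pv += terminal / (1 + rate) ** len(cash_flows)
    return pv

# Toy forecast: three years of cash flows, 10% discount rate, no growth.
value = discounted_cash_flow([100.0, 110.0, 120.0], rate=0.10)
```

In an M&A setting, comparing this value computed with and without the acquisition (and against the price paid) indicates whether the deal created shareholder value.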

Relevance:

30.00%

Publisher:

Abstract:

A Work Project, presented as part of the requirements for the Award of a Master's Degree in Management from the NOVA – School of Business and Economics

Relevance:

30.00%

Publisher:

Abstract:

The reduction of greenhouse gas emissions is one of the big global challenges for the coming decades due to its severe impact on the atmosphere, leading to climate change and other environmental effects. One of the main sources of greenhouse gases is energy consumption, so a number of initiatives and calls for awareness and sustainability in energy use have been issued among different types of institutions and organizations. In 2007 the European Council adopted energy and climate change objectives targeting a 20% improvement by 2020, and all European countries are required to use energy more efficiently. Several steps can be taken towards energy reduction: understanding building behavior over time, revealing the factors that influence consumption, applying the right measures for reduction and sustainability, visualizing the hidden connections between our daily habits and their impact on the natural world, and promoting a more sustainable life. Researchers have suggested that feedback visualization can effectively encourage conservation, with energy reduction rates of 18%. Furthermore, researchers have identified a set of factors that are very likely to influence consumption, such as occupancy level, occupant behavior, environmental conditions, building thermal envelope and climate zone. Nowadays, the amount of energy consumed on university campuses is huge, and great effort is needed to meet the reduction requested by the European Council as well as the desired cost reduction. The present study was therefore performed on university buildings as a use case to: a. investigate the most dynamic factors influencing energy consumption on campus; b. implement prediction models for electricity consumption using different techniques, from traditional regression to alternative machine learning techniques; and c. assist energy management by providing real-time energy feedback and visualization on campus for more awareness and better decision making. The methodology is applied to the use case of University Jaume I (UJI), located in Castellón, Spain.
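The traditional regression technique mentioned in objective b can be sketched with ordinary least squares on a single predictor; the temperature/load data below are invented, not UJI measurements:

```python
def fit_linear(xs, ys):
    """Ordinary least squares for y = a + b*x: the slope is the ratio of
    the covariance of x and y to the variance of x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

# Hypothetical daily data: outdoor temperature (deg C) vs. campus load (kWh).
temps = [10, 15, 20, 25, 30]
loads = [500, 520, 540, 560, 580]
a, b = fit_linear(temps, loads)
predicted = a + b * 22  # expected consumption on a 22 deg C day
```

A campus deployment would add more predictors (occupancy, calendar, building envelope) and compare this baseline against the machine learning alternatives mentioned above.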

Relevance:

30.00%

Publisher:

Abstract:

Field Lab of Entrepreneurial Innovative Ventures