33 results for Residual-Based Panel Cointegration Test
Abstract:
Ancillary services represent a good business opportunity that must be considered by market players. This paper presents a new methodology for ancillary services market dispatch. The method considers the bids submitted to the market and includes a market-clearing mechanism based on deterministic optimization. An Artificial Neural Network is used for day-ahead prediction of Regulation Down, Regulation Up, Spin Reserve and Non-Spin Reserve requirements. Two test cases based on California Independent System Operator data concerning the dispatch of these four services are included to illustrate the application of the proposed method: (1) dispatch considering simple bids; (2) dispatch considering complex bids.
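As a loose illustration of the day-ahead prediction step described above, the sketch below trains a small feed-forward network on synthetic hourly data; the calendar features and the requirement series are assumptions for illustration, not the paper's actual inputs or network.

```python
# Minimal sketch (assumptions): forecast a day-ahead ancillary-service
# requirement (e.g., Regulation Up, in MW) from calendar features with
# a small feed-forward network, loosely mirroring the ANN step above.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
hours = np.arange(24 * 365)
# Synthetic stand-in for one year of historical requirement data (MW).
y = 100 + 30 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 5, hours.size)
X = np.column_stack([np.sin(2 * np.pi * hours / 24),
                     np.cos(2 * np.pi * hours / 24),
                     (hours // 24) % 7])          # hour-of-day, day-of-week

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X[:-24], y[:-24])                       # train on all but last day
next_day = model.predict(X[-24:])                 # 24-hour-ahead forecast
print(next_day.round(1))
```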
Abstract:
There is no single definition of a long-memory process. Such a process is generally defined as a series whose correlogram decays slowly or whose spectrum is infinite at zero frequency. It is also said that a series with this property is characterized by long-range dependence and non-periodic long cycles, that this feature describes the correlation structure of a series at long lags, or that it is conventionally expressed in terms of a power-law decay of the autocovariance function. The growing interest of international research in this topic is justified by the search for a better understanding of the dynamic nature of the time series of financial asset prices. First, the lack of consistency among existing results calls for new studies and the use of several complementary methodologies. Second, the confirmation of long-memory processes has relevant implications for (1) theoretical and econometric modelling (i.e., martingale price models and technical trading rules), (2) statistical tests of equilibrium and pricing models, (3) optimal consumption/saving and portfolio decisions, and (4) the measurement of efficiency and rationality. Third, empirical scientific questions remain about identifying the general theoretical market model best suited to modelling the diffusion of the series. Fourth, regulators and risk managers need to know whether there are persistent, and therefore inefficient, markets that may consequently produce abnormal returns. The research objective of this dissertation is twofold. On the one hand, it seeks to provide additional knowledge for the long-memory debate by examining the behaviour of the daily return series of the main EURONEXT stock indices. On the other hand, it seeks to contribute to the improvement of the capital asset pricing model (CAPM) by considering an alternative risk measure capable of overcoming the constraints of the efficient market hypothesis (EMH) in the presence of financial series whose processes lack independent and identically distributed (i.i.d.) increments. The empirical study indicates the possibility of alternatively using long-maturity treasury bonds (OT's) to compute market returns, given that their behaviour in sovereign debt markets reflects investors' confidence in the financial condition of states and measures how investors assess the respective economies based on the performance of their assets in general. Although the price diffusion model defined by geometric Brownian motion (gBm) is claimed to provide a good fit to financial time series, its assumptions of normality, stationarity and independence of the residual innovations are contradicted by the empirical data analysed. Therefore, in searching for evidence of the long-memory property in these markets, rescaled-range analysis (R/S) and detrended fluctuation analysis (DFA) are employed, under the fractional Brownian motion (fBm) approach, to estimate the Hurst exponent H for the complete data series and to compute the "local" Hurst exponent H_t over moving windows. In addition, statistical hypothesis tests are performed using the rescaled-range test (R/S), the modified rescaled-range test (M-R/S) and the fractional differencing test (GPH).
As regards a single conclusion from all the methods about the nature of dependence in the stock market in general, the empirical results are inconclusive. This means that the degree of long memory, and hence any classification, depends on each particular market. Nevertheless, the mostly positive overall results support the presence of long memory, in the form of persistence, in the stock returns of Belgium, the Netherlands and Portugal. This suggests that these markets are more subject to greater predictability (the "Joseph effect"), but also to trends that may be unexpectedly interrupted by discontinuities (the "Noah effect"), and therefore tend to be riskier to trade. Although the evidence of fractal dynamics has weak statistical support, in line with most international studies it refutes the random walk hypothesis with i.i.d. increments, which underlies the weak form of the EMH. Accordingly, contributions to the improvement of the CAPM are proposed, through a new fractal capital market line (FCML) and a new fractal security market line (FSML). The new proposal suggests that the risk element (for the market and for an asset) be given by the Hurst exponent H for long lags of stock returns. The exponent H measures the degree of long memory in stock indices, both when the return series follow an uncorrelated i.i.d. process described by gBm (where H = 0.5, confirming the EMH and making the CAPM adequate) and when they follow a process with statistical dependence described by fBm (where H differs from 0.5, rejecting the EMH and rendering the CAPM inadequate). The advantage of the FCML and the FSML is that the long-memory measure defined by H is an adequate reference for expressing risk in models applicable to data series following i.i.d. processes as well as processes with non-linear dependence. These formulations thus include the EMH as a possible particular case.
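For readers unfamiliar with the estimation step, the sketch below implements classical rescaled-range (R/S) analysis, one of the methods named above, to estimate the Hurst exponent H from a return series; the window sizes and the data are illustrative.

```python
# Minimal sketch: classical R/S analysis. For each window size n, split
# the series into blocks, compute the rescaled range R/S per block,
# average, and fit the log-log slope, which estimates H.
import numpy as np

def hurst_rs(returns, window_sizes):
    log_n, log_rs = [], []
    for n in window_sizes:
        blocks = returns[:len(returns) // n * n].reshape(-1, n)
        rs = []
        for block in blocks:
            dev = np.cumsum(block - block.mean())   # cumulative deviations
            r = dev.max() - dev.min()               # range
            s = block.std(ddof=1)                   # standard deviation
            if s > 0:
                rs.append(r / s)
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs)))
    return np.polyfit(log_n, log_rs, 1)[0]          # slope = H estimate

rng = np.random.default_rng(1)
iid = rng.normal(size=4096)                         # uncorrelated i.i.d. series
print(round(hurst_rs(iid, [16, 32, 64, 128, 256, 512]), 2))
```

On an uncorrelated i.i.d. series the fitted slope should come out near H = 0.5 (classical R/S is known to be slightly biased upward in small samples), while persistent series push H above 0.5.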
Abstract:
This paper proposes a dynamic scheduler that supports the coexistence of guaranteed and non-guaranteed bandwidth servers to efficiently handle overloads of soft tasks by making additional capacity available from two sources: (i) residual capacity allocated but unused when jobs complete in less than their budgeted execution time; (ii) capacity stolen from inactive non-isolated servers used to schedule best-effort jobs. The effectiveness of the proposed approach in reducing the mean tardiness of periodic jobs is demonstrated through extensive simulations. The achieved results become even more significant when the tasks' computation times have a large variance.
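A toy sketch of reclaiming source (i) above, under stated assumptions (a greedy hand-out of leftover budget; the job model and names are hypothetical, not the paper's scheduler):

```python
# Toy sketch (assumptions): budget left unused when a job finishes early
# is pooled and handed to pending soft jobs, cutting their tardiness.
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    budget: float      # budgeted execution time
    actual: float      # actual (or remaining) execution time

def reclaim_residual(completed, soft_backlog):
    pool = sum(max(j.budget - j.actual, 0.0) for j in completed)
    served = []
    for job in soft_backlog:               # greedy hand-out of spare time
        if pool >= job.actual:
            pool -= job.actual
            served.append(job.name)
    return served, pool

done = [Job("hard1", budget=5.0, actual=3.2), Job("hard2", 4.0, 4.0)]
soft = [Job("soft1", 0.0, 1.0), Job("soft2", 0.0, 1.5)]
print(reclaim_residual(done, soft))        # ['soft1'] served, leftover pool
```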
Abstract:
The advent of Wireless Sensor Network (WSN) technologies is paving the way for a panoply of new ubiquitous computing applications, some of them with critical requirements. In the ART-WiSe framework, we are designing a two-tiered communication architecture for supporting real-time and reliable communications in WSNs. Within this context, we have been developing a test-bed application for testing, validating and demonstrating our theoretical findings: a search & rescue / pursuit-evasion application. Basically, a WSN deployment is used to detect, localize and track a target robot, and a station controls a rescuer/pursuer robot until it gets close enough to the target robot. This paper describes how this application was engineered, focusing particularly on the implementation of the localization mechanism.
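As a hedged illustration of one generic localization technique (not necessarily the mechanism implemented in the paper), the sketch below estimates a target position as the weighted centroid of the detecting nodes, weighting each node's known position by its reading strength:

```python
# Minimal sketch (generic technique, illustrative numbers): weighted-
# centroid localization from per-node detection strengths.
import numpy as np

nodes = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0], [4.0, 4.0]])
rssi  = np.array([0.9, 0.4, 0.5, 0.1])    # toy detection strengths

estimate = (nodes * rssi[:, None]).sum(axis=0) / rssi.sum()
print(estimate)                            # estimated target position
```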
Abstract:
The ART-WiSe (Architecture for Real-Time communications in Wireless Sensor Networks) framework aims at the design of new communication architectures and mechanisms for time-sensitive Wireless Sensor Networks (WSNs). We adopted a two-tiered architecture where an overlay Wireless Local Area Network (Tier 2) serves as a backbone for a WSN (Tier 1), relying on existing standard communication protocols and commercial off-the-shelf (COTS) technologies – IEEE 802.15.4/ZigBee for Tier 1 and IEEE 802.11 for Tier 2. In this line, a test-bed application is being developed for assessing, validating and demonstrating the ART-WiSe architecture. A pursuit-evasion application was chosen because it fulfils a number of requirements: it is feasible and appealing, and it stresses the architecture in terms of timeliness. To develop the test-bed based on the aforementioned technologies, an implementation of the IEEE 802.15.4/ZigBee protocols is being carried out, since no open-source implementation is available to the community. This paper highlights some relevant aspects of the ART-WiSe architecture, provides some intuition on the protocol stack implementation and presents a general view of the envisaged test-bed application.
Abstract:
The characteristics of carbon fibre reinforced laminates have widened their use, from aerospace to domestic appliances. A common requirement is drilling for assembly purposes. It is known that a drilling process that reduces the drill thrust force can decrease the risk of delamination. In this work, delamination assessment methods based on radiographic data are compared and correlated with mechanical test results (bearing test).
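One widely used radiographic measure of drilling-induced damage in this literature (offered here only as orientation; whether it is among the methods the paper compares is an assumption) is the delamination factor, the ratio of the maximum damaged diameter around the hole to the nominal hole diameter:

```latex
% Delamination factor: D_max is the maximum diameter of the damaged
% zone measured on the radiograph, D_0 the nominal (drill) diameter;
% F_d = 1 corresponds to no measurable delamination.
\[
  F_d = \frac{D_{\max}}{D_0}, \qquad F_d \ge 1
\]
```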
Abstract:
The increasing complexity of VLSI circuits and the reduced accessibility of modern packaging and mounting technologies restrict the usefulness of conventional in-circuit debugging tools, such as in-circuit emulators for microprocessors and microcontrollers. However, this same trend enables the development of more complex products, which in turn require more powerful debugging tools. These conflicting demands could be met if the standard scan test infrastructures now common in most complex components were able to match the debugging requirements of design verification and prototype validation. This paper analyses the main debug requirements in the design of microprocessor-based applications and the feasibility of their implementation using the mandatory, optional and additional operating modes of the standard IEEE 1149.1 test infrastructure.
Abstract:
Weblabs are spreading their influence in Science and Engineering (S&E) courses, providing a way to conduct real experiments remotely. Typically, they are implemented through different architectures and infrastructures supported by Instruments and Modules (I&Ms) that can be remotely controlled and observed. Besides the lack of a standard solution for implementing weblabs, their reconfiguration is limited to a setup procedure that interconnects a set of preselected I&Ms into an Experiment Under Test (EUT). Moreover, those I&Ms cannot be replicated or shared by different weblab infrastructures, since they are usually based on hardware platforms. To overcome these limitations, this paper proposes a standard solution that uses I&Ms embedded in Field-Programmable Gate Array (FPGA) devices. An architecture based on the IEEE 1451.0 Std. is presented, supported by an FPGA-based weblab infrastructure that can be remotely reconfigured with I&Ms, described through standard Hardware Description Language (HDL) files, using a Reconfiguration Tool (RecTool).
Abstract:
In competitive electricity markets, a profit-seeking load-serving entity (LSE) must optimally adjust the financial incentives it offers to end users who buy electricity at regulated rates, so that they reduce consumption during periods of high market prices. The LSE in this model manages demand response (DR) by offering financial incentives to retail customers in order to maximize its expected profit and reduce the risk of experiencing market power. The stochastic formulation is implemented on a test system in which a number of loads are supplied through LSEs.
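A toy sketch of the incentive decision, with made-up numbers and a hypothetical linear response curve (not the paper's stochastic formulation): the LSE picks the incentive rate that maximizes expected profit across price scenarios.

```python
# Toy sketch (made-up numbers): an LSE choosing an incentive rate x
# ($/kWh) to maximize expected profit when demand reduction grows with x.
import numpy as np

retail_rate = 0.12                          # $/kWh paid by customers
prices = np.array([0.30, 0.45, 0.60])       # high-price scenarios, $/kWh
probs  = np.array([0.5, 0.3, 0.2])          # scenario probabilities
base_load = 1000.0                          # kWh bought at the regulated rate

def expected_profit(x):
    # Hypothetical response: up to 30% of base load curtailed at x >= 0.20.
    reduction = base_load * min(x / 0.20, 1.0) * 0.3
    # Avoided purchases at market price, minus incentive paid out.
    return float(np.dot(probs, (prices - retail_rate) * reduction) - x * reduction)

candidates = np.linspace(0.0, 0.25, 26)     # grid of incentive rates
best = max(candidates, key=expected_profit)
print(round(best, 2), round(expected_profit(best), 2))
```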
Abstract:
The goal of this study was to propose a new functional magnetic resonance imaging (fMRI) paradigm using a language-free adaptation of a 2-back working memory task to avoid cultural and educational bias. We additionally provide an index of the validity of the proposed paradigm and test whether the experimental task discriminates the behavioural performances of healthy participants from those of individuals with working memory deficits. Ten healthy participants and nine patients presenting working memory (WM) deficits due to acquired brain injury (ABI) performed the developed task. To inspect whether the paradigm activates brain areas typically involved in visual working memory (VWM), brain activation of the healthy participants was assessed with fMRI. To examine the task's capacity to discriminate behavioural data, performances of the healthy participants in the task were compared with those of ABI patients. Data were analysed with GLM-based random effects procedures and t-tests. We found an increase of the BOLD signal in the specialized areas of VWM. Concerning behavioural performances, healthy participants showed the predicted pattern of more hits, fewer omissions and a tendency toward fewer false alarms, more self-corrected responses and faster reaction times, when compared with subjects presenting WM impairments. The results suggest that this task activates brain areas involved in VWM and discriminates behavioural performances of clinical and non-clinical groups. It can thus be used as a research methodology for behavioural and neuroimaging studies of VWM in block-design paradigms.
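As a minimal sketch of how the behavioural measures above can be scored from trial logs (the data layout is hypothetical, not the study's), a 2-back trial is a target when the current stimulus matches the one presented two positions earlier:

```python
# Minimal sketch (hypothetical data layout): scoring one 2-back block
# into hits, omissions and false alarms, as compared in the study.
stimuli   = ["A", "B", "A", "C", "C", "B", "C", "B"]
responses = [None, None, True, True, None, None, None, True]  # True = "match" press

hits = omissions = false_alarms = 0
for i in range(2, len(stimuli)):
    is_target = stimuli[i] == stimuli[i - 2]   # 2-back match
    pressed = responses[i] is True
    if is_target and pressed:
        hits += 1
    elif is_target and not pressed:
        omissions += 1
    elif not is_target and pressed:
        false_alarms += 1

print(hits, omissions, false_alarms)           # -> 2 1 1 for this toy block
```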
Abstract:
The signal-to-interference ratio (SIR) performance of a multiband orthogonal frequency division multiplexing ultra-wideband system with residual timing offset is investigated. To this end, an exact mathematical expression for the SIR of this system is derived. The analysis shows that, unlike a cyclic-prefixing-based system, a zero-padding-based system is sensitive to residual timing offset.
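For orientation only, the SIR in such analyses is conventionally defined per subcarrier as the ratio of desired-signal power to the power of the interference (ICI/ISI) induced by the offset; the paper's exact closed form depends on its system model and is not reproduced here.

```latex
% Generic definition (not the paper's closed form): with desired
% component U_k and offset-induced interference I_k on subcarrier k,
\[
  \mathrm{SIR}_k \;=\; \frac{\mathbb{E}\!\left[\lvert U_k \rvert^{2}\right]}
                            {\mathbb{E}\!\left[\lvert I_k \rvert^{2}\right]}
\]
```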
Abstract:
Astringency is an organoleptic property resulting mostly from the interaction of salivary proteins with dietary polyphenols. It is of great importance to consumers, but since it is typically measured by sensory panels, its assessment is subjective and expensive. The main goal of the present work is to develop a sensory system to estimate astringency based on protein/polyphenol interactions. For this purpose, a model protein was immobilized on a sensory gold surface and its subsequent interaction with polyphenols was measured by Surface Plasmon Resonance (SPR). α-Amylase and pentagalloyl glucose (PGG) were selected as the model protein and polyphenol, respectively. To ensure specific binding between them, various surface chemistries were tested. A carboxylic-terminated thiol decreased the binding ability of PGG and allowed covalent attachment of α-amylase to the surface. A pH of 5 was the optimal condition for α-amylase immobilization on the surface. Further studies will focus on a localized SPR sensor and its application to wine samples, providing objectivity compared with a trained panel.
Abstract:
This paper presents the design of a low-cost, small autonomous surface vehicle for missions in coastal waters and specifically for the challenging surf zone. The main objective of the vehicle design described here is to combine the capability of operating at sea in relatively challenging conditions with a very low set of operational requirements (ease of deployment). This vehicle is a first step towards performing general-purpose missions (such as data gathering or patrolling) and, at least over relatively short distances, being usable in rescue operations with very low handling requirements, such as carrying support to humans in the water. The USV is based on a commercially available fibreglass hull and uses a directional waterjet powered by an electric brushless motor for propulsion; the absence of a protruding propeller reduces danger in rescue operations. Its small dimensions (1.5 m length) and weight allow versatility and ease of deployment. The vehicle design is described from both hardware and software points of view. A characterization of the vehicle in terms of energy consumption and performance is provided, from both test-tank and operational-scenario tests. An example application in search and rescue is also presented and discussed, with the integration of this vehicle in the European ICARUS (7th Framework) research project, which addresses the development and integration of robotic tools for large-scale search and rescue operations.
Abstract:
This work aims to evaluate the feasibility of using image-based cytometry (IBC) for algal cell quantification and viability analysis, using Pseudokirchneriella subcapitata as a cell model. Cell concentration determined by IBC was linear in the range between 1 × 10⁵ and 8 × 10⁶ cells mL⁻¹. Algal viability was defined on the basis that the intact membrane of viable cells excludes the SYTOX Green (SG) probe; the disruption of membrane integrity represents irreversible damage and consequently results in cell death. Using IBC, we were able to successfully discriminate between live (SG-negative) and dead algal cells (heat-treated at 65 °C for 60 min; SG-positive). The observed viability of algal populations containing different proportions of killed cells correlated well (R² = 0.994) with the theoretical viability. The technology was validated by exposing algal cells of P. subcapitata to a copper stress test for 96 h; IBC allowed us to follow the evolution of cell concentration and viability of the copper-exposed algal populations. This technology overcomes several drawbacks usually associated with microscopy counting, such as labour-intensive and tedious work and the limited representativeness of the cell counts. In conclusion, IBC allowed fast and automated determination of the total number of algal cells and analysis of their viability. It can provide a useful tool for a wide variety of fields that utilise microalgae, such as aquatic toxicology and biotechnology.
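A minimal sketch of the validation step with synthetic numbers (the actual measurements are not restated here): fit observed viability against the theoretical mixing proportions and report the coefficient of determination.

```python
# Minimal sketch (synthetic numbers): observed vs. theoretical viability
# with a linear fit, mirroring the R^2 = 0.994 validation reported above.
import numpy as np

theoretical = np.array([0, 25, 50, 75, 100])           # % viable cells mixed in
observed    = np.array([1.2, 24.1, 51.3, 73.8, 98.9])  # % SG-negative measured

slope, intercept = np.polyfit(theoretical, observed, 1)
pred = slope * theoretical + intercept
r2 = 1 - ((observed - pred) ** 2).sum() / ((observed - observed.mean()) ** 2).sum()
print(f"slope={slope:.3f}, R^2={r2:.3f}")
```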
Abstract:
In the last two decades, the small-strain shear modulus became one of the most important geotechnical parameters for characterizing soil stiffness. Finite element analyses have shown that the in-situ stiffness of soils and rocks is much higher than previously thought, and that the stress-strain behaviour of these materials is non-linear in most cases at small strain levels, especially in the ground around retaining walls, foundations and tunnels, typically on the order of 10⁻² to 10⁻⁴ strain. Although the best approach to estimating the shear modulus seems to be measuring seismic wave velocities, deriving the parameter through correlations with in-situ tests is usually considered very useful for design practice. The use of Neural Networks for modelling systems has become widespread, particularly in areas where the great amount of available data and the complexity of the systems make the problem difficult to treat with traditional data-analysis methodologies. In this work, the use of Neural Networks and Support Vector Regression is proposed to estimate the small-strain shear modulus of sedimentary soils from the basic or intermediate parameters derived from the Marchetti Dilatometer Test. The results are discussed and compared with some of the most common methodologies available for this evaluation.
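A minimal sketch of the regression step, assuming hypothetical DMT-derived features (p0, p1, depth) and synthetic data rather than the study's dataset:

```python
# Minimal sketch (assumptions): Support Vector Regression of the
# small-strain shear modulus G0 from Marchetti Dilatometer parameters.
# Feature names and the synthetic target are illustrative only.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(2)
n = 200
p0    = rng.uniform(100, 600, n)           # corrected first DMT reading
p1    = rng.uniform(200, 1200, n)          # corrected second DMT reading
depth = rng.uniform(1, 20, n)              # test depth (m)
X = np.column_stack([p0, p1, depth])
G0 = 5.0 * p1 - 2.0 * p0 + 30.0 * depth + rng.normal(0, 50, n)  # synthetic target

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0))
model.fit(X[:150], G0[:150])
print(model.score(X[150:], G0[150:]))      # held-out R^2
```

Scaling the inputs before the RBF-kernel SVR is the standard choice here, since the kernel is sensitive to feature magnitudes.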