32 results for Frequency response model
Abstract:
This manuscript analyses the data generated by a Zero Length Column (ZLC) diffusion experimental set-up for 1,3-diisopropylbenzene in a 100% alumina matrix with variable particle size. The time evolution of the phenomena resembles that of fractional-order systems, namely a fast initial transient followed by long, slow tails. The experimental measurements are best fitted by the Harris model, revealing power-law behavior.
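As a rough, hypothetical illustration of the kind of power-law fit the abstract mentions (the Harris model itself is not reproduced here), the following Python sketch fits a power-law tail to a synthetic desorption curve; all names and numbers are invented.

```python
# Hedged sketch: fitting a power-law tail to a synthetic ZLC-like desorption
# curve, to expose the slow, heavy-tailed decay the abstract attributes to
# fractional-order dynamics. The functional form and parameters are
# illustrative, not the paper's Harris model.
import numpy as np
from scipy.optimize import curve_fit

def power_law(t, A, b):
    """Asymptotic power-law decay c(t) = A * t**(-b)."""
    return A * t**(-b)

# Synthetic "measurements": fast initial transient + slow power-law tail.
t = np.linspace(0.1, 100.0, 500)
c = 0.8 * np.exp(-5.0 * t) + 0.2 * t**(-0.6)
c += np.random.default_rng(0).normal(0.0, 1e-3, t.size)

# Fit only the tail, where the power law dominates the exponential part.
tail = t > 5.0
(A, b), _ = curve_fit(power_law, t[tail], c[tail], p0=(1.0, 0.5))
print(f"fitted tail exponent b = {b:.3f}")  # ~0.6 for this synthetic data
```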
Abstract:
There is no single definition of a long-memory process. Such a process is usually defined as a series that has a slowly decaying correlogram or an infinite spectrum at zero frequency. It is also said that a series with this property is characterized by long-range dependence and non-periodic long cycles, that this feature describes the correlation structure of a series at long lags, or that it is conventionally expressed in terms of the power-law decay of the autocovariance function. The growing interest of international research in this topic is justified by the search for a better understanding of the dynamic nature of financial asset price time series. First, the lack of consistency among results calls for new studies and the use of several complementary methodologies. Second, the confirmation of long-memory processes has relevant implications for (1) theoretical and econometric modelling (i.e., martingale price models and technical trading rules), (2) statistical tests of equilibrium and pricing models, (3) optimal consumption/saving and portfolio decisions, and (4) the measurement of efficiency and rationality. Third, empirical scientific questions remain about identifying the general theoretical market model best suited to modelling the diffusion of the series. Fourth, regulators and risk managers need to know whether there are persistent, and therefore inefficient, markets that may produce abnormal returns. The research objective of this dissertation is twofold. On the one hand, it aims to provide additional knowledge for the long-memory debate, focusing on the behaviour of the daily return series of the main EURONEXT stock indices. On the other hand, it aims to contribute to the improvement of the capital asset pricing model (CAPM) by considering an alternative risk measure capable of overcoming the constraints of the efficient market hypothesis (EMH) in the presence of financial series whose processes lack independent and identically distributed (i.i.d.) increments. The empirical study indicates the possibility of alternatively using long-maturity treasury bonds (OTs) in the calculation of market returns, given that their behaviour in sovereign debt markets reflects investors' confidence in the financial conditions of states and measures how investors assess the respective economies based on the performance of their assets in general. Although the price diffusion model defined by geometric Brownian motion (gBm) claims to provide a good fit to financial time series, its assumptions of normality, stationarity and independence of the residual innovations are contradicted by the empirical data analysed. Therefore, in the search for evidence of the long-memory property in the markets, rescaled-range analysis (R/S) and detrended fluctuation analysis (DFA) are used, under the fractional Brownian motion (fBm) approach, to estimate the Hurst exponent H for the complete data series and to compute the "local" Hurst exponent H_t in moving windows. In addition, statistical hypothesis tests are performed using the rescaled-range test (R/S), the modified rescaled-range test (M-R/S) and the fractional differencing test (GPH).
In terms of a single conclusion from all methods about the nature of dependence in the stock market in general, the empirical results are inconclusive. This means that the degree of long memory, and thus any classification, depends on each particular market. Nevertheless, the mostly positive overall results support the presence of long memory, in the form of persistence, in the stock returns of Belgium, the Netherlands and Portugal. This suggests that these markets are more subject to greater predictability (the "Joseph effect"), but also to trends that may be unexpectedly interrupted by discontinuities (the "Noah effect"), and therefore tend to be riskier to trade. Although the evidence of fractal dynamics has weak statistical support, in line with most international studies, it refutes the random walk hypothesis with i.i.d. increments, which is the basis of the EMH in its weak form. In view of this, contributions to the improvement of the CAPM are put forward through the proposal of a new fractal capital market line (FCML) and a new fractal security market line (FSML). The new proposal suggests that the risk element (for the market and for an asset) be given by the Hurst exponent H at long lags of stock returns. The exponent H measures the degree of long memory in stock indices, both when the return series follow an uncorrelated i.i.d. process, described by gBm (where H = 0.5, confirming the EMH and making the CAPM adequate), and when they follow a process with statistical dependence, described by fBm (where H differs from 0.5, rejecting the EMH and making the CAPM inadequate). The advantage of the FCML and the FSML is that the long-memory measure, defined by H, is the appropriate reference for translating risk in models that can be applied to data series following i.i.d. processes as well as processes with nonlinear dependence. These formulations thus include the EMH as a possible particular case.
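As a hedged illustration of the rescaled-range (R/S) estimation of the Hurst exponent H described above, the following Python sketch applies the classical procedure to synthetic i.i.d. returns; the window sizes and test series are illustrative choices, not the dissertation's setup.

```python
# Hedged sketch of classical rescaled-range (R/S) estimation of the Hurst
# exponent H: the log-log slope of the average R/S statistic against the
# window size estimates H.
import numpy as np

def hurst_rs(returns, window_sizes):
    """Estimate H as the log-log slope of the average R/S statistic."""
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_values = []
        for start in range(0, len(returns) - n + 1, n):
            block = returns[start:start + n]
            dev = block - block.mean()           # mean-adjusted series
            z = np.cumsum(dev)                   # cumulative deviation
            r = z.max() - z.min()                # range
            s = block.std(ddof=1)                # standard deviation
            if s > 0:
                rs_values.append(r / s)
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs_values)))
    return np.polyfit(log_n, log_rs, 1)[0]      # slope = H estimate

rng = np.random.default_rng(1)
iid_returns = rng.normal(size=4096)             # gBm-like i.i.d. increments
H = hurst_rs(iid_returns, [16, 32, 64, 128, 256, 512])
print(f"H ~ {H:.2f}")  # roughly 0.5 for i.i.d. increments (EMH-consistent)
```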
Abstract:
The paper proposes a Flexibility Requirements Model and a Factory Templates Framework to support dynamic Virtual Organization (VO) decision-makers in responding effectively to emergent business opportunities while ensuring profitability. Through the construction and analysis of the flexibility requirements model, network managers can conceive better strategies to model and breed new dynamic VOs. The paper also presents the leagility concept as a new paradigm fit to equip network management with a hybrid approach that better tackles the performance challenges imposed by new and competitive business environments.
Abstract:
In the current context of serious climate change, where the increased frequency of some extreme events can raise the rate of periods prone to high-intensity forest fires, the National Forest Authority often implements, in several Portuguese forest areas, a regular set of measures to control the amount of available fuel mass (PNDFCI, 2008). In the present work we present a preliminary analysis of the consequences of prescribed fire measures used to control fuel mass, in terms of soil recovery, in particular its water retention capacity, organic matter content, pH and iron content. This work is part of a larger study (Meira-Castro, 2009(a); Meira-Castro, 2009(b)). Given the established practice of data collection, embodied in multidimensional matrices of n columns (variables under analysis) by p rows (areas sampled at different depths), and considering the quantitative nature of the data in this study, we chose a methodological approach based on multivariate statistical analysis, in particular Principal Component Analysis (PCA) (Góis, 2004). The experiments were carried out on a soil cover over a natural site of andalusitic schist in Gramelas, Caminha, NW Portugal, which had remained free of prescribed burnings for four years and was subjected to prescribed fire in March 2008. The soil samples were collected from five different plots at six different time periods. The methodological option adopted allowed us to identify the most relevant relational structures within the n variables, within the p samples, and in both sets simultaneously (Garcia-Pereira, 1990). Consequently, in addition to the traditional PCA outputs, we analyzed the influence of both sampling depth and geomorphological environment on the behavior of all variables involved.
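As a minimal, hypothetical sketch of the PCA step described above (random data standing in for the measured soil properties):

```python
# Hedged sketch: PCA of a p-samples-by-n-variables soil matrix. The random
# data are placeholders for the measured soil properties (water retention,
# organic matter, pH, iron).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
X = rng.normal(size=(30, 4))   # p=30 samples (plots x depths x dates), n=4 variables

X_std = StandardScaler().fit_transform(X)   # PCA on standardized variables
pca = PCA()
scores = pca.fit_transform(X_std)           # sample coordinates on the PCs

print("explained variance ratio:", pca.explained_variance_ratio_)
print("loadings of PC1:", pca.components_[0])  # weight of each soil variable
```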
Abstract:
The ecotoxicological response of living organisms in an aquatic system depends on physical, chemical and bacteriological variables, as well as on the interactions between them. An important challenge for scientists is to understand the interaction and behaviour of the factors involved in a multidimensional process such as the ecotoxicological response. With this aim, multiple linear regression (MLR) and principal component regression were applied to the ecotoxicity bioassay responses of Chlorella vulgaris and Vibrio fischeri in water collected at seven sites on the Leça river during five monitoring campaigns (February, May, June, August and September of 2006). The river water characterization included the analysis of 22 physicochemical and 3 microbiological parameters. The model that best fitted the data was MLR, which shows: (i) a negative correlation with dissolved organic carbon, zinc and manganese, and a positive one with turbidity and arsenic, regarding the C. vulgaris toxic response; (ii) a negative correlation with conductivity and turbidity and a positive one with phosphorus, hardness, iron, mercury, arsenic and faecal coliforms, concerning the V. fischeri toxic response. This integrated assessment may allow the evaluation of the effect of future pollution abatement measures on the water quality of the Leça River.
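A minimal sketch of the MLR modelling described above, with synthetic data and illustrative predictor names standing in for the measured parameters:

```python
# Hedged sketch: multiple linear regression of a toxic response on a few
# water-quality predictors. The coefficients' signs play the role of the
# positive/negative correlations reported in the abstract.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 35                                   # e.g. 7 sites x 5 campaigns
X = rng.normal(size=(n, 3))              # stand-ins for DOC, turbidity, arsenic
y = -0.8 * X[:, 0] + 0.5 * X[:, 1] + 0.3 * X[:, 2] + rng.normal(0, 0.2, n)

model = sm.OLS(y, sm.add_constant(X)).fit()
print(model.params)                      # signs indicate +/- correlations
print(model.rsquared)                    # goodness of fit
```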
Abstract:
OBJECTIVE: To evaluate the predictive value of genetic polymorphisms in the context of BCG immunotherapy outcome and to create a predictive profile that may allow discrimination of the risk of recurrence. MATERIAL AND METHODS: In a dataset of 204 patients treated with BCG, we evaluated 42 genetic polymorphisms in 38 genes involved in the BCG mechanism of action, using Sequenom MassARRAY technology. Stepwise multivariate Cox regression was used for data mining. RESULTS: In agreement with previous studies, we observed that gender, age, tumor multiplicity and treatment scheme were associated with BCG failure. Using stepwise multivariate Cox regression analysis, we propose the first predictive profile of BCG immunotherapy outcome and a risk score based on polymorphisms in immune system molecules (SNPs in TNFA-1031T/C (rs1799964), IL2RA rs2104286 T/C, IL17A-197G/A (rs2275913), IL17RA-809A/G (rs4819554), IL18R1 rs3771171 T/C, ICAM1 K469E (rs5498), FASL-844T/C (rs763110) and TRAILR1-397T/G (rs79037040)) in association with clinicopathological variables. This risk score allows the categorization of patients into risk groups: patients in the Low Risk group have a 90% chance of successful treatment, whereas patients in the High Risk group have a 75% chance of recurrence after BCG treatment. CONCLUSION: We have established the first predictive score of BCG immunotherapy outcome combining clinicopathological characteristics and a panel of genetic polymorphisms. Further studies using an independent cohort are warranted. Moreover, the inclusion of other biomarkers may help to improve the proposed model.
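As a hedged illustration of how such a Cox-regression risk score can be assembled (synthetic cohort and placeholder column names; not the study's data or final model):

```python
# Hedged sketch: a Cox proportional-hazards fit combining clinical covariates
# with SNP genotypes coded as minor-allele counts (0/1/2). The linear
# predictor can serve as a recurrence risk score to be thresholded into
# Low/High Risk groups, as the abstract describes.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 204
df = pd.DataFrame({
    "time_to_recurrence": rng.exponential(24, n),   # months, synthetic
    "recurred": rng.integers(0, 2, n),              # event indicator
    "age": rng.normal(68, 9, n),
    "multifocal_tumor": rng.integers(0, 2, n),
    "rs1799964_C": rng.integers(0, 3, n),           # placeholder genotype
    "rs2275913_A": rng.integers(0, 3, n),           # placeholder genotype
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_to_recurrence", event_col="recurred")
risk_score = cph.predict_partial_hazard(df)         # per-patient risk score
print(cph.summary[["coef", "p"]])
```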
Abstract:
In today's healthcare paradigm, optimal sedation during anesthesia plays an important role both in patient welfare and in the socio-economic context. For the closed-loop control of general anesthesia, two drugs have proven to have stable, rapid onset times: propofol and remifentanil. These drugs are related to their effect on the bispectral index, a measure derived from the EEG signal. In this paper, wavelet time–frequency analysis is used to extract useful information from the clinical signals, since they are time-varying and mark important changes in the patient's response to drug dose. Model-based predictive control algorithms are employed to regulate the depth of sedation by manipulating these two drugs. The results of identification from real data and the simulation of the closed-loop control performance suggest that the proposed approach can bring an improvement of 9% in overall robustness and may be suitable for clinical practice.
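A minimal sketch of wavelet time–frequency feature extraction in the spirit of the paper, on a synthetic time-varying signal; the wavelet, scales and sampling rate are assumptions, and the predictive control layer is not reproduced:

```python
# Hedged sketch: continuous wavelet transform of a synthetic signal whose
# dominant frequency drifts over time, mimicking a changing drug response.
import numpy as np
import pywt

fs = 10.0                                  # Hz, assumed sampling rate
t = np.arange(0, 60, 1 / fs)
signal = np.sin(2 * np.pi * (0.1 + 0.005 * t) * t)   # drifting frequency

scales = np.arange(1, 64)
coefs, freqs = pywt.cwt(signal, scales, "morl", sampling_period=1 / fs)

# Track the dominant frequency over time as a simple extracted feature.
dominant = freqs[np.abs(coefs).argmax(axis=0)]
print(dominant[:5], "...", dominant[-5:])
```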
Abstract:
Master's degree in Informatics Engineering, Specialization in Knowledge and Decision Technologies
Abstract:
Demand response is assumed to be an essential resource for fully achieving the operating benefits of smart grids, namely in the context of competitive markets and of the increasing use of renewable-based energy sources. Some advantages of Demand Response (DR) programs and of smart grids can only be achieved through the implementation of Real-Time Pricing (RTP). The integration of the expected increasing amounts of distributed energy resources, as well as of new players, requires new approaches to the changing operation of power systems. The methodology proposed in this paper aims at minimizing the operation costs in a distribution network operated by a virtual power player that manages the available energy resources, focusing on hour-ahead re-scheduling. When facing lower wind power generation than expected from the day-ahead forecast, demand response is used to minimize the impacts of such a change in wind availability. In this way, consumers actively participate in regulation-up and spinning-reserve ancillary services through demand response programs. Real-time pricing is also applied. The proposed model is especially useful when the actual and day-ahead wind forecasts differ significantly. Its application is illustrated in this paper using the characteristics of a real resource-conditions scenario in a 33-bus distribution network with 32 consumers and 66 distributed generators.
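As a toy, hypothetical illustration of hour-ahead re-dispatch under a wind shortfall (invented costs and limits; not the paper's full model):

```python
# Hedged sketch: when actual wind falls short of the day-ahead forecast,
# the shortfall is covered by demand response and other DG at minimum cost.
from scipy.optimize import linprog

wind_shortfall = 12.0          # MW below day-ahead forecast (assumed)
# Decision variables: [dr_MW, dg_MW]; costs in EUR/MWh (illustrative).
c = [40.0, 55.0]
A_eq = [[1.0, 1.0]]            # dr + dg must cover the shortfall
b_eq = [wind_shortfall]
bounds = [(0.0, 8.0),          # DR capacity limit
          (0.0, 10.0)]         # DG headroom

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
print(res.x, res.fun)          # cheaper DR is used to its limit first
```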
Abstract:
In competitive electricity markets, a profit-seeking load-serving entity (LSE) must optimally adjust the financial incentives offered to end users who buy electricity at regulated rates, so that they reduce consumption during high market prices. The LSE in this model manages demand response (DR) by offering financial incentives to retail customers, in order to maximize its expected profit and reduce the risk of experiencing market power. The stochastic formulation is implemented on a test system where a number of loads are supplied through LSEs.
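A hedged, single-period sketch of the incentive choice problem: the LSE picks the incentive that maximizes expected profit over wholesale-price scenarios; the linear response model and all numbers are invented, not the paper's formulation:

```python
# Hedged sketch: grid search over incentive levels; demand reduction
# responds linearly to the incentive (an assumed behavioural model).
import numpy as np

retail_rate = 60.0                         # EUR/MWh, regulated
base_load = 100.0                          # MWh during the high-price hour
scenarios = np.array([80.0, 150.0, 300.0]) # wholesale prices, EUR/MWh
probs = np.array([0.5, 0.3, 0.2])

def expected_profit(incentive):
    reduction = min(0.2 * incentive, 30.0)   # assumed linear response, capped
    load = base_load - reduction
    # Profit per scenario: retail revenue - wholesale cost - incentive payout.
    profit = retail_rate * load - scenarios * load - incentive * reduction
    return float(probs @ profit)

incentives = np.linspace(0.0, 50.0, 501)
best = max(incentives, key=expected_profit)
print(best, expected_profit(best))
```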
Abstract:
Following the deregulation experience of retail electricity markets in most countries, the majority of the new entrants to the liberalized retail market were pure REPs (retail electricity providers). These entities were subject to financial risks because of unexpected price variations, price spikes, volatile loads and the potential exertion of market power by GENCOs (generation companies). A REP can manage market risks by employing DR (demand response) programs and using its generation and storage assets at the distribution network to serve the customers. The proposed model suggests how a REP with light physical assets, such as DG (distributed generation) units and ESS (energy storage systems), can survive in a competitive retail market. The paper discusses effective risk management strategies for REPs to deal with the uncertainties of the DAM (day-ahead market) and how to hedge financial losses in the market. A two-stage stochastic programming problem is formulated. It aims to establish the financial incentive-based DR programs and the optimal dispatch of the DG units and ESSs. The uncertainty of the forecasted day-ahead load demand and electricity price is also taken into account with a scenario-based approach. The principal advantage of this model for REPs is the reduced risk of financial losses in DAMs, and the main benefit for the whole system is market power mitigation by virtually increasing the price elasticity of demand and reducing the peak demand.
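A toy two-stage sketch in the spirit of the formulation above: a first-stage DG dispatch is fixed before the scenario is revealed, and the residual load is bought at the realized day-ahead price; scenario data and costs are invented:

```python
# Hedged sketch of a tiny two-stage stochastic program: stage 1 fixes the
# REP's own DG level; stage 2 (recourse) buys the remaining load at the
# realized spot price in each scenario.
from scipy.optimize import minimize_scalar

dg_cost = 70.0                                       # EUR/MWh, own generation
scenarios = [(90.0, 180.0), (140.0, 200.0), (220.0, 230.0)]  # (price, load)
probs = [0.5, 0.3, 0.2]

def expected_cost(dg):
    cost = 0.0
    for (price, load), p in zip(scenarios, probs):
        spot_purchase = max(load - dg, 0.0)          # second-stage recourse
        cost += p * (dg_cost * dg + price * spot_purchase)
    return cost

res = minimize_scalar(expected_cost, bounds=(0.0, 250.0), method="bounded")
print(res.x, res.fun)   # first-stage DG level balancing own cost vs price risk
```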
Abstract:
The high penetration of distributed energy resources (DER) in distribution networks and the competitive environment of electricity markets impose the use of new approaches in several domains. The network cost allocation, traditionally used in transmission networks, should be adapted and used in distribution networks considering the specifications of the connected resources. The main goal is to develop a fairer methodology trying to distribute the distribution network use costs to all players which are using the network in each period. In this paper, a model considering different types of costs (fixed, losses, and congestion costs) is proposed, comprising the use of a large set of DER, namely distributed generation (DG), demand response (DR) of direct load control type, energy storage systems (ESS), and electric vehicles with the capability of discharging energy to the network, which is known as vehicle-to-grid (V2G). The proposed model includes three distinct phases of operation. The first phase of the model consists in an economic dispatch based on an AC optimal power flow (AC-OPF); in the second phase Kirschen's and Bialek's tracing algorithms are used and compared to evaluate the impact of each resource in the network. Finally, the MW-mile method is used in the third phase of the proposed model. A distribution network of 33 buses with large penetration of DER is used to illustrate the application of the proposed model.
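As a minimal illustration of the MW-mile charging idea used in the third phase (the player-attributed flows, line lengths and rate are invented placeholders; in the paper the flows would come from the tracing phase):

```python
# Hedged sketch: each player's network-use charge is proportional to the MW
# it imposes on each line times the line's length, at a fixed rate.
line_length_km = {"L1": 5.0, "L2": 8.0, "L3": 3.0}
rate_eur_per_mw_km = 2.0
# Flow attributed to each player on each line (e.g. from Bialek tracing).
player_flows_mw = {
    "DG_unit": {"L1": 1.2, "L2": 0.0, "L3": 0.4},
    "EV_V2G":  {"L1": 0.3, "L2": 0.6, "L3": 0.0},
}

for player, flows in player_flows_mw.items():
    charge = sum(flows[line] * line_length_km[line] * rate_eur_per_mw_km
                 for line in line_length_km)
    print(f"{player}: {charge:.2f} EUR")
```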
Abstract:
A new technology, EDLCs, also known as supercapacitors, has become an important and appealing area of interest. They are governed by the same fundamental principles as classical capacitors, but achieve higher capacitances thanks to a larger surface area and a thinner dielectric. This particularity yields a higher energy density compared with classical capacitors and a higher power density compared with batteries. Consequently, the use of supercapacitors has increased, already representing a reliable, safe and environmentally friendly alternative to common batteries. The main objectives of this project are thus to identify the different types of supercapacitors, present the advantages of each type, explore their response in both the frequency and time domains, and finally model them using classical electrical components, namely resistors and capacitors. The modelling was carried out in MATLAB, using the fminunc minimization function, and four equivalent models were built in order to model the response of the various EDLCs analysed. Owing to time constraints, the main focus of the analysis fell on the 0.022 F EDLC.
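A hedged sketch of the fitting step, using scipy's unconstrained minimizer as a Python stand-in for MATLAB's fminunc used in the project; the single series R–C model and synthetic charge curve are simplifications of the project's four equivalent models:

```python
# Hedged sketch: estimating an equivalent series resistance R and
# capacitance C of an EDLC from a constant-current charge curve by
# unconstrained least-squares minimization.
import numpy as np
from scipy.optimize import minimize

I = 0.01                                   # A, assumed charge current
t = np.linspace(0, 10, 200)                # s
v_meas = I * 0.5 + I * t / 0.022           # ideal R = 0.5 ohm, C = 0.022 F
v_meas += np.random.default_rng(5).normal(0, 1e-3, t.size)

def sse(params):
    R, C = params
    v_model = I * R + I * t / C            # series R-C under constant current
    return np.sum((v_model - v_meas) ** 2)

res = minimize(sse, x0=[1.0, 0.01])        # quasi-Newton, like fminunc
print(res.x)                               # -> approx [0.5, 0.022]
```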
Abstract:
3rd Workshop on High-performance and Real-time Embedded Systems (HIRES 2015), 21 January 2015, Amsterdam, Netherlands.