16 results for Evolutionary Polynomial Regression (EPR) for HydroSystems
at Instituto Politécnico do Porto, Portugal
Abstract:
Auditory event-related potentials (AERPs) are widely used in diverse fields of today's neuroscience, concerning auditory processing, speech perception, language acquisition, neurodevelopment, attention and cognition in normal aging, gender, developmental, neurologic and psychiatric disorders. However, their transposition to clinical practice has remained minimal, mainly due to the scarce literature on normative data across age, the wide spectrum of results, the variety of auditory stimuli used, and the different neuropsychological meanings given to AERP components by different authors. One of the most prominent AERP components studied in recent decades is N1, which reflects auditory detection and discrimination. Subsequently, N2 indicates attention allocation and phonological analysis. The simultaneous analysis of N1 and N2 elicited by feasible novelty experimental paradigms, such as the auditory oddball, seems an objective method to assess central auditory processing. The aim of this systematic review was to bring forward normative values for auditory oddball N1 and N2 components across age. EBSCO, PubMed, Web of Knowledge and Google Scholar were systematically searched for studies that elicited N1 and/or N2 by an auditory oddball paradigm. A total of 2,764 papers were initially identified in the databases, of which 19 resulted from hand search and additional references, published between 1988 and 2013, the last 25 years. A final total of 68 studies met the eligibility criteria, with a total of 2,406 participants from control groups for N1 (age range 6.6–85 years; mean 34.42) and 1,507 for N2 (age range 9–85 years; mean 36.13). Polynomial regression analysis revealed that N1 latency decreases with aging at Fz and Cz; N1 amplitude at Cz decreases from childhood to adolescence and stabilizes after 30–40 years, while at Fz the decrement ends by 60 years of age and increases markedly after this age. Regarding N2, latency did not covary with age, but amplitude showed a significant decrement at both Cz and Fz. The results suggest reliable normative values for the Cz and Fz electrode locations; however, changes in brain development and component topography over age should be considered in clinical practice.
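As an illustration of the kind of model used here, the following minimal Python sketch fits a polynomial regression of an AERP measure against age; the data points are hypothetical placeholders, not values from the review.

```python
# Minimal sketch: fitting a polynomial regression of an AERP measure
# (e.g. N1 latency at Cz) against age, in the spirit of the review's
# analysis. The data below are hypothetical placeholders, not study values.
import numpy as np

age = np.array([7, 12, 18, 25, 34, 45, 58, 70, 82], dtype=float)
n1_latency_ms = np.array([118, 112, 106, 102, 100, 98, 97, 96, 95], dtype=float)

# Fit a second-order polynomial and inspect the age trend.
coeffs = np.polyfit(age, n1_latency_ms, deg=2)
model = np.poly1d(coeffs)

for a in (10, 30, 50, 70):
    print(f"age {a:2d}: predicted N1 latency = {model(a):.1f} ms")
```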
Abstract:
Long-term contractual decisions are the basis of efficient risk management. However, such decisions have to be supported by a robust price forecasting methodology. This paper reports a different approach to long-term price forecasting that tries to answer that need. Making use of regression models, the proposed methodology's main objective is to find the maximum and minimum Market Clearing Price (MCP) for a specific programming period, with a desired confidence level α. Due to the problem's complexity, the meta-heuristic Particle Swarm Optimization (PSO) was used to find the best regression parameters, and the results were compared with those obtained using a Genetic Algorithm (GA). To validate these models, results from realistic data are presented and discussed in detail.
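A minimal sketch of the underlying idea, assuming a toy linear price model and standard PSO settings (not the paper's actual MCP formulation or confidence-bound procedure):

```python
# PSO used to fit regression parameters (here a simple linear price
# model). Data, swarm settings and the objective are illustrative
# assumptions for the sketch.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)                  # e.g. programming periods
y = 3.0 * x + 5.0 + rng.normal(0, 1, 50)    # synthetic price series

def objective(params):                       # params = (slope, intercept)
    pred = params[0] * x + params[1]
    return np.mean((y - pred) ** 2)

n_particles, n_iter = 30, 200
pos = rng.uniform(-10, 10, (n_particles, 2))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([objective(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((n_particles, 1)), rng.random((n_particles, 1))
    # Inertia + cognitive + social terms (typical PSO update rule).
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos += vel
    vals = np.array([objective(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("PSO estimate (slope, intercept):", gbest)
```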
Abstract:
A multi-residue methodology based on solid phase extraction followed by gas chromatography–tandem mass spectrometry was developed for trace analysis of 32 compounds in water matrices, including estrogens and several pesticides from different chemical families, some of them with endocrine-disrupting properties. Matrix standard calibration solutions were prepared by adding known amounts of the analytes to a residue-free sample, to compensate for the matrix-induced chromatographic response enhancement observed for certain pesticides. Validation was done mainly according to the International Conference on Harmonisation recommendations, as well as some European and American validation guidelines with specifications for pesticide analysis and/or GC–MS methodology. As the assumption of homoscedasticity was not met for the analytical data, a weighted least squares linear regression procedure was applied as a simple and effective way to counteract the greater influence of the higher concentrations on the fitted regression line, improving accuracy at the lower end of the calibration curve. The method was considered validated for 31 compounds after consistent evaluation of the key analytical parameters: specificity, linearity, limits of detection and quantification, range, precision, accuracy, extraction efficiency, stability and robustness.
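For illustration, a minimal weighted least squares calibration in Python, assuming the common 1/x² weighting scheme and invented calibration data:

```python
# Weighted least squares calibration: weights (here 1/concentration**2,
# a common choice) down-weight the higher standards so the low end of
# the curve fits more accurately. Calibration data are invented.
import numpy as np

conc = np.array([0.5, 1, 2, 5, 10, 20, 50], dtype=float)   # ng/mL
resp = np.array([0.9, 2.1, 4.3, 10.8, 20.5, 43.0, 108.0])  # peak area

w = 1.0 / conc**2          # heteroscedastic data: variance grows with conc.
W = np.diag(w)
X = np.column_stack([conc, np.ones_like(conc)])

# Solve (X^T W X) b = X^T W y for slope and intercept.
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ resp)
print(f"slope = {beta[0]:.4f}, intercept = {beta[1]:.4f}")
```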
Abstract:
The higher education system in Europe is currently under stress, and the debates over its reform and future are gaining momentum. Now that, for most countries, this is a time of change in the overall society and the whole education system, the legal and political dimensions have gained prominence, but this has not been matched by a more integrative approach to the problem of order, its reform and the issue of regulation, beyond the typical static and classical cost-benefit analyses. The two classical approaches for studying (and for designing the policy measures of) the reform of the higher education system, cost-benefit analysis and legal scholarship description, have to be integrated. Our paper argues that the very integration of economic and legal approaches, what Warren Samuels called the legal-economic nexus, is meaningful and necessary, especially if we want to address the problem of order (as formulated by Joseph Spengler) and the overall regulation of the system. On the one hand, without neglecting the interest and insights gained from cost-benefit analysis or other value-for-money assessment approaches, we focus our study on the legal, social and political aspects of the regulation of the higher education system and its reform in Portugal. On the other hand, the economic and financial problems have to be taken into account, but in a more inclusive way with regard to the indirect and other socio-economic costs not contemplated in traditional or standard assessments of policies for the tertiary education sector. In the first section of the paper, we discuss the theoretical and conceptual underpinnings of our analysis, focusing on the evolutionary approach, the role of critical institutions, the legal-economic nexus and the problem of order. All these elements are related to the institutional tradition, from Veblen and Commons to Spengler and Samuels. The second section states the problem of regulation in the higher education system and the issue of policy formulation for tackling it. The current situation is clearly one of crisis, with the expansion of the cohorts of young students coming to an end and recurrent scandals in private institutions. In the last decade, after a protracted period of extension or expansion of the system, i.e. the continuous growth of students, universities and other institutions have been competing harder to attract students and have seen their financial situation put at risk. It seems that we are entering a period of radical uncertainty and higher competition, and the new configuration slowly building up is growth in intensity, which means upgrading the quality of higher learning and greater involvement in vocational training and life-long learning. With this change, along with other deep changes in Portuguese society and the economy, the current regulation has shown signs of maladjustment. The third section presents our conclusions on the current issue of regulation and the policy challenge. First, we underline the importance of an evolutionary approach to a process of change that is essentially dynamic; special attention is given to issues related to an evolutionary construal of policy analysis and formulation. Second, the integration of law and economics, through the notion of the legal-economic nexus, allows us to better define the issues of regulation and the concrete problems that universities are facing. One aspect is the instability of the political measures regarding the public administration, on which the higher education system depends financially, legally and institutionally, to say the least; a corollary is the lack of a clear strategy in the policy reforms. Third, our research criticizes several studies, such as the one made by the OECD in late 2006 for the Ministry of Science, Technology and Higher Education, for being too static and for neglecting fundamental aspects of regulation such as the logic of the actors, groups and organizations who are major players in the system. Finally, simply changing the legal rules will not per se change the behaviours that the authorities want to change. By this we mean that the policy maker is remiss in ignoring some of the critical issues of regulation, namely the continued disregard by the academic management and administrative bodies of universities for the legal rules that were once promulgated. Changing the rules does not change the problem, especially without the necessary debates from the different relevant quarters that make up the higher education system; the issues of social interaction remain intact. Our treatment of the matter is organized as follows. In the first section, the theoretical principles are developed in order to study more adequately the transformation of higher education, with a modest evolutionary theory and a legal-economic nexus of the interactions of the system and the policy challenges. After describing, in the second section, the recent evolution and current working of higher education in Portugal, we analyze the legal framework and the current regulatory practices and problems in light of the adopted theoretical framework. We end with some conclusions on the current problems of regulation and the policy measures that have been discussed in recent years.
CIDER - envisaging a COTS communication infrastructure for evolutionary dependable real-time systems
Abstract:
It is foreseen that future dependable real-time systems will also have to meet flexibility, adaptability and reconfigurability requirements. Considering the distributed nature of these computing systems, a communication infrastructure that makes it possible to fulfil all those requirements is thus of major importance. Although Ethernet has been used primarily as an information network, there is a strong belief that some very recent technological advances will enable its use in dependable applications with real-time requirements. Indeed, several recently standardised mechanisms associated with Switched Ethernet seem promising for enabling communication infrastructures to support hard real-time, reliable and flexible distributed applications. This paper describes the motivation and the work being developed within the CIDER (Communication Infrastructure for Dependable Evolvable Real-Time Systems) project, which envisages the use of COTS Ethernet as an enabling technology for future dependable real-time systems. It is foreseen that the CIDER approach will constitute a relevant stream of research, since it will bring together cutting-edge research in the field of real-time and dependable distributed systems and the industry's eagerness to expand Ethernet's responsibilities to support dependable real-time applications.
Abstract:
The trajectory planning of redundant robots is an important area of research, and efficient optimization algorithms have been investigated in recent years. This paper presents a new technique that combines the closed-loop pseudoinverse method with genetic algorithms. In this case, trajectory planning is formulated as an optimization problem with constraints.
Abstract:
This paper presents a brief history of western music: from its genesis to serialism and the Darmstadt school. Some mathematical aspects of music are then presented and confronted with music as a form of art. The question is: are these two distinct aspects compatible? Can computers be of real help in automatic composition? The most appealing algorithmic approach is evolutionary computation, as it offers creative potential. Therefore, Evolutionary Algorithms are introduced, and some results of applying GAs and GPs to music generation are analysed.
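As a loose illustration of GA-based composition (the representation and fitness below are assumptions, not any system from the paper), a toy sketch that evolves note sequences toward the C-major scale and stepwise motion:

```python
# Toy GA for melody generation: individuals are note sequences, fitness
# rewards membership of the C-major scale and small melodic steps.
import random

SCALE = {0, 2, 4, 5, 7, 9, 11}            # C-major pitch classes
random.seed(1)

def fitness(melody):
    in_scale = sum(1 for n in melody if n % 12 in SCALE)
    smooth = sum(1 for a, b in zip(melody, melody[1:]) if abs(a - b) <= 2)
    return in_scale + smooth

def mutate(melody):
    m = melody[:]
    m[random.randrange(len(m))] = random.randint(60, 72)  # MIDI range
    return m

pop = [[random.randint(60, 72) for _ in range(8)] for _ in range(40)]
for _ in range(100):
    pop.sort(key=fitness, reverse=True)
    # Elitism plus mutation of the fitter half.
    pop = pop[:20] + [mutate(random.choice(pop[:20])) for _ in range(20)]

print("best melody (MIDI notes):", max(pop, key=fitness))
```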
Abstract:
The trajectory planning of redundant robots is an important area of research, and efficient optimization algorithms are needed. Pseudoinverse control is not repeatable, causing a drift in joint space that is undesirable for physical control. This paper presents a new technique that combines the closed-loop pseudoinverse method with genetic algorithms, leading to an optimization criterion for repeatable control of redundant manipulators and avoiding the joint angle drift problem. Computer simulations based on redundant and hyper-redundant planar manipulators show that, when the end-effector traces a closed path in the workspace, the robot returns to its initial configuration. The solution is repeatable for a workspace with and without obstacles in the sense that, after executing several cycles, the initial and final states of the manipulator are very close.
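A minimal sketch of the closed-loop pseudoinverse scheme for a planar three-link redundant arm tracking a closed path; the GA-derived optimization criterion from the paper is omitted, and the link lengths, gain and path are illustrative assumptions:

```python
# Closed-loop pseudoinverse control of a planar 3-link (redundant) arm.
import numpy as np

L = np.array([1.0, 0.8, 0.6])              # link lengths (assumed)

def fk(q):                                  # end-effector position
    s = np.cumsum(q)
    return np.array([np.sum(L * np.cos(s)), np.sum(L * np.sin(s))])

def jacobian(q):
    s = np.cumsum(q)
    J = np.zeros((2, 3))
    for i in range(3):
        J[0, i] = -np.sum(L[i:] * np.sin(s[i:]))
        J[1, i] = np.sum(L[i:] * np.cos(s[i:]))
    return J

q = np.array([0.3, 0.4, 0.5])
q0 = q.copy()
K, dt, steps = 5.0, 0.01, 2000
for k in range(steps):
    t = 2 * np.pi * k / steps               # one closed circular path
    xd = np.array([1.5 + 0.3 * np.cos(t), 0.8 + 0.3 * np.sin(t)])
    e = xd - fk(q)                          # closed-loop position error
    dq = np.linalg.pinv(jacobian(q)) @ (K * e)
    q = q + dq * dt

# With plain pseudoinverse control this drift is generally non-zero,
# which is the repeatability problem the paper's GA criterion addresses.
print("joint drift after one closed cycle:", q - q0)
```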
Abstract:
This paper analyses the performance of a genetic algorithm (GA) in the synthesis of digital circuits using two novel approaches. The first concept consists of improving the static fitness function by including a discontinuity evaluation. The measure of variability in the error of the Boolean table has similarities with the function continuity issue in classical calculus. The second concept extends the static fitness by introducing a fractional-order dynamical evaluation.
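A hedged sketch of the first concept only, reducing the circuit encoding to its truth-table output column for brevity; the XOR target, penalty weight and GA settings are illustrative assumptions, not the paper's formulation:

```python
# Static fitness with a discontinuity term: score a candidate by its
# Boolean-table matches, then penalize variability of the per-row error
# across adjacent table rows.
import itertools
import random

random.seed(2)
ROWS = list(itertools.product([0, 1], repeat=2))
TARGET = [a ^ b for a, b in ROWS]           # truth table for XOR

def evaluate(candidate):                    # candidate: output per row
    errors = [abs(c - t) for c, t in zip(candidate, TARGET)]
    matches = len(errors) - sum(errors)
    # Discontinuity term: variability of the error along the table.
    jumps = sum(abs(e1 - e2) for e1, e2 in zip(errors, errors[1:]))
    return matches - 0.5 * jumps

pop = [[random.randint(0, 1) for _ in ROWS] for _ in range(20)]
for _ in range(50):
    pop.sort(key=evaluate, reverse=True)
    children = []
    for _ in range(10):
        child = random.choice(pop[:10])[:]
        child[random.randrange(len(child))] ^= 1   # bit-flip mutation
        children.append(child)
    pop = pop[:10] + children

print("best outputs:", max(pop, key=evaluate), "target:", TARGET)
```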
Abstract:
Predicting the time and efficiency of the remediation of contaminated soils using soil vapor extraction remains a difficult challenge for the scientific community and consultants. This work reports the development of multiple linear regression and artificial neural network models to predict the remediation time and efficiency of soil vapor extractions performed in soils contaminated separately with benzene, toluene, ethylbenzene, xylene, trichloroethylene, and perchloroethylene. The results demonstrated that the artificial neural network approach performs better than the multiple linear regression models. The artificial neural network model allowed an accurate prediction of remediation time and efficiency based only on soil and pollutant characteristics, thereby allowing a simple and quick preliminary evaluation of the process viability.
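An illustrative comparison of the two model families on synthetic (not experimental) data, using scikit-learn stand-ins for the paper's models:

```python
# Multiple linear regression vs. a small feed-forward neural network,
# fitted to synthetic soil/pollutant features (invented for the sketch).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
X = rng.uniform(0, 1, (300, 4))            # e.g. water content, organic
                                           # matter, vapour pressure, solubility
y = 10 + 5 * X[:, 0] + 8 * X[:, 1] ** 2 + rng.normal(0, 0.5, 300)  # nonlinear

Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

mlr = LinearRegression().fit(Xtr, ytr)
ann = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                   random_state=0).fit(Xtr, ytr)

# The ANN should capture the nonlinear term that the MLR misses.
print("MLR R^2:", round(mlr.score(Xte, yte), 3))
print("ANN R^2:", round(ann.score(Xte, yte), 3))
```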
Abstract:
Radiotherapy is one of the main treatments used against cancer. It uses radiation to destroy cancerous cells while trying, at the same time, to minimize the damage to healthy tissues. The planning of a radiotherapy treatment is patient-dependent, resulting in a lengthy trial-and-error procedure until a treatment complying as closely as possible with the medical prescription is found. Intensity Modulated Radiation Therapy (IMRT) is a radiation treatment technique that allows a high degree of conformity between the area to be treated and the dose absorbed by healthy tissues. Nevertheless, it is still not possible to completely eliminate the treatments' potential side-effects. In this retrospective study we use clinical data from patients with head-and-neck cancer treated at the Portuguese Institute of Oncology of Coimbra and explore the possibility of classifying new and untreated patients according to the probability of xerostomia 12 months after the beginning of IMRT treatments, using a logistic regression approach. The results obtained show that the classifier presents a high discriminative ability in predicting the binary response "at risk for xerostomia at 12 months".
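A minimal sketch of such a classifier, with synthetic stand-in data (a hypothetical mean parotid dose feature and an assumed dose-response link) rather than the study's clinical variables:

```python
# Logistic regression for a binary outcome such as "at risk for
# xerostomia at 12 months". Features and labels are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
mean_parotid_dose = rng.uniform(10, 50, 200)              # Gy, synthetic
p_risk = 1 / (1 + np.exp(-(mean_parotid_dose - 28) / 4))  # assumed link
y = rng.binomial(1, p_risk)

X = mean_parotid_dose.reshape(-1, 1)
clf = LogisticRegression().fit(X, y)

# Discriminative ability is commonly reported as the ROC AUC.
auc = roc_auc_score(y, clf.predict_proba(X)[:, 1])
print("AUC:", round(auc, 3))
print("P(risk) at 35 Gy:", round(clf.predict_proba([[35.0]])[0, 1], 3))
```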
Abstract:
The objective of this work is the development of automated software testing frameworks. This type of testing is normally associated with the evolutionary model and agile software development methodologies, whereas manual testing is related to the waterfall model and traditional methodologies. A comparative study of the existing types of methodologies and tests was therefore carried out, to decide which best suited the project and to answer the question "Is it really worth performing (automated) tests?". Once the study was concluded, two frameworks were developed: the first for implementing functional and unit tests without dependencies, to be used by LabOrders' curricular interns, and the second for implementing unit tests with external database and service dependencies, to be used by the company's employees. Over the last two decades, agile software development methodologies have not stopped evolving, yet automation tools have failed to keep pace with this progress. Many areas are not covered by the tests, so some have to be performed manually. Accordingly, several innovative features were created to increase test coverage and make the frameworks as intuitive as possible, namely: 1. Automatic file download through Internet Explorer 9 (and later versions). 2. Analysis of the content of .pdf files (through the tests). 3. Retrieval of web elements and their attributes through jQuery code using the WebDriver API with PHP bindings (see the sketch below). 4. Display of custom error messages when a given element cannot be found. The implemented frameworks are also prepared for the creation of other tests (load, integration, regression) that may become necessary in the future. They were tested in a working context by the collaborators and clients of the company where the master's project was carried out, and the results showed that adopting a software development methodology with automated tests can increase productivity, reduce failures, and help organizations meet project budgets and deadlines.
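A hedged sketch of feature 3 from the list above: the original frameworks used the WebDriver API with PHP bindings, whereas this equivalent uses Python, and the URL, selector and the page's jQuery availability are illustrative assumptions:

```python
# Retrieving a web element attribute through jQuery executed via the
# WebDriver API (Python bindings here; the original used PHP bindings).
from selenium import webdriver

driver = webdriver.Chrome()                 # assumes chromedriver installed
driver.get("https://example.com")           # placeholder URL

# Run jQuery in the page context (assumes the page ships jQuery).
href = driver.execute_script(
    "return jQuery('a').first().attr('href');"
)
if href is None:
    # Custom error message when the element cannot be found, mirroring
    # the framework's friendlier failure reporting.
    raise AssertionError("No anchor element found on the page")

print("first link target:", href)
driver.quit()
```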
Abstract:
In the last two decades, the small strain shear modulus became one of the most important geotechnical parameters for characterizing soil stiffness. Finite element analyses have shown that the in-situ stiffness of soils and rocks is much higher than previously thought, and that the stress-strain behaviour of these materials is non-linear in most cases at small strain levels, especially in the ground around retaining walls, foundations and tunnels, typically in the order of 10⁻² to 10⁻⁴ of strain. Although the best approach to estimating the shear modulus seems to be based on measuring seismic wave velocities, deriving the parameter through correlations with in-situ tests is usually considered very useful for design practice. The use of Neural Networks for modeling systems has become widespread, in particular in areas where the large amount of available data and the complexity of the systems make the problem very difficult to treat with traditional data analysis methodologies. In this work, the use of Neural Networks and Support Vector Regression is proposed to estimate the small strain shear modulus of sedimentary soils from the basic or intermediate parameters derived from the Marchetti Dilatometer Test. The results are discussed and compared with some of the most common methodologies available for this evaluation.
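An illustrative Support Vector Regression sketch mapping assumed DMT-derived features (material index ID, horizontal stress index KD, dilatometer modulus ED) to the small strain shear modulus G0, with synthetic data standing in for real test results:

```python
# SVR mapping Marchetti Dilatometer Test (DMT) parameters to G0.
# Feature ranges and the synthetic target are assumptions.
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
# Columns: material index ID, horizontal stress index KD,
# dilatometer modulus ED (synthetic ranges).
X = np.column_stack([rng.uniform(0.1, 10, 200),
                     rng.uniform(1, 20, 200),
                     rng.uniform(1, 100, 200)])
g0 = 5 * X[:, 2] + 2 * X[:, 1] ** 1.5 + rng.normal(0, 5, 200)  # MPa, synthetic

# Scaling matters for RBF kernels, hence the pipeline.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0))
model.fit(X, g0)

sample = np.array([[1.5, 8.0, 40.0]])      # hypothetical DMT reading
print("predicted G0 (MPa):", round(float(model.predict(sample)[0]), 1))
```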