19 results for Time-varying variable selection

at Instituto Politécnico do Porto, Portugal


Relevance:

100.00%

Publisher:

Abstract:

In this work, a microwave-assisted extraction (MAE) methodology was compared with several conventional extraction methods (Soxhlet, Bligh & Dyer, modified Bligh & Dyer, Folch, modified Folch, Hara & Radin, Roese-Gottlieb) for quantification of the total lipid content of three fish species: horse mackerel (Trachurus trachurus), chub mackerel (Scomber japonicus), and sardine (Sardina pilchardus). The influence of species, extraction method, and frozen storage time (from fresh to 9 months of freezing) on total lipid content was analysed in detail. The MAE, Bligh & Dyer, Folch, modified Folch, and Hara & Radin methods had the highest efficiencies and, although not statistically different, they differed in variability, with MAE showing the highest repeatability (CV = 0.034). The Roese-Gottlieb, Soxhlet, and modified Bligh & Dyer methods performed poorly in terms of both efficiency and repeatability (CV between 0.13 and 0.18).
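The repeatability figures above are coefficients of variation (CV = standard deviation divided by the mean). A minimal sketch of that computation, using hypothetical replicate lipid measurements (the values below are illustrative, not from the study):

```python
import statistics

def coefficient_of_variation(replicates):
    """CV = sample standard deviation divided by the mean."""
    return statistics.stdev(replicates) / statistics.mean(replicates)

# Hypothetical replicate total-lipid measurements (g lipid / 100 g sample)
mae_replicates = [6.1, 6.3, 6.2, 6.4, 6.2]
cv = coefficient_of_variation(mae_replicates)
print(f"CV = {cv:.3f}")
```

A low CV across replicates, as reported for MAE, means the method returns nearly the same value each time it is repeated on the same material.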

Relevance:

100.00%

Publisher:

Abstract:

We propose a graphical method to visualize possible time-varying correlations between fifteen stock market values. The method is useful for observing stable or emerging clusters of stock markets with similar behaviour. The graphs, obtained by applying multidimensional scaling (MDS) techniques, may also guide the construction of multivariate econometric models.
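A minimal sketch of the underlying idea, assuming a small synthetic set of return series (the data and the five "markets" are illustrative): pairwise correlations are converted into distances and embedded in two dimensions with classical MDS, so correlated markets land close together.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic daily returns for 5 hypothetical markets: two correlated groups
base_a = rng.normal(size=250)
base_b = rng.normal(size=250)
returns = np.column_stack([
    base_a + 0.3 * rng.normal(size=250),
    base_a + 0.3 * rng.normal(size=250),
    base_b + 0.3 * rng.normal(size=250),
    base_b + 0.3 * rng.normal(size=250),
    rng.normal(size=250),
])

corr = np.corrcoef(returns, rowvar=False)
dist = np.sqrt(2.0 * (1.0 - corr))      # a common correlation-to-distance map

# Classical MDS: double-centre the squared distances, then eigendecompose
n = dist.shape[0]
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ (dist ** 2) @ J
eigvals, eigvecs = np.linalg.eigh(B)
order = np.argsort(eigvals)[::-1]
coords = eigvecs[:, order[:2]] * np.sqrt(np.maximum(eigvals[order[:2]], 0))

print(coords)   # 2-D positions; correlated markets plot near each other
```

Plotting `coords` gives the kind of cluster map the abstract describes; recomputing it over successive time windows reveals stable or emerging clusters.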

Relevance:

100.00%

Publisher:

Abstract:

A procedure for coupling mesoscale and CFD codes is presented, enabling the inclusion of realistic stratification flow regimes and boundary conditions in CFD simulations of relevance to site and resource assessment studies in complex terrain. Two distinct techniques are derived: (i) in the first one, boundary conditions are extracted from mesoscale results to produce time-varying CFD solutions; (ii) in the second case, a statistical treatment of mesoscale data leads to steady-state flow boundary conditions believed to be more representative than the idealised profiles which are current industry practice. Results are compared with measured data and traditional CFD approaches.
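The second technique reduces a mesoscale time series to representative steady-state inflow conditions. A minimal sketch of one such statistical treatment, assuming synthetic hourly wind-speed profiles (the heights, 30-degree sectors, and log-law profile shape are illustrative assumptions, not the paper's method):

```python
import numpy as np

rng = np.random.default_rng(1)
heights = np.array([10.0, 40.0, 80.0, 120.0])   # metres above ground
n_hours = 1000

# Synthetic mesoscale output: hourly wind direction and a log-law-like
# speed profile driven by a varying friction velocity
directions = rng.uniform(0.0, 360.0, size=n_hours)
u_star = rng.uniform(0.2, 0.6, size=n_hours)
profiles = (u_star[:, None] / 0.4) * np.log(heights[None, :] / 0.05)

# Statistical treatment: mean profile per 30-degree direction sector,
# usable as a steady-state CFD inflow boundary condition for that sector
sector = (directions // 30).astype(int)
bc_profiles = np.array([
    profiles[sector == s].mean(axis=0) for s in range(12)
])
print(bc_profiles.shape)   # one representative profile per sector
```

Each row is then imposed at the CFD inlet for its sector, replacing the idealised profiles that are current industry practice.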

Relevance:

100.00%

Publisher:

Abstract:

In today’s healthcare paradigm, optimal sedation during anesthesia plays an important role both in patient welfare and in the socio-economic context. For the closed-loop control of general anesthesia, two drugs have proven to have stable, rapid onset times: propofol and remifentanil. The effect of these drugs is reflected in the bispectral index (BIS), a measure derived from the EEG signal. In this paper, wavelet time–frequency analysis is used to extract useful information from the clinical signals, since they are time-varying and mark important changes in the patient’s response to drug dose. Model-based predictive control algorithms are employed to regulate the depth of sedation by manipulating these two drugs. The results of identification from real data and the simulation of the closed-loop control performance suggest that the proposed approach can bring an improvement of 9% in overall robustness and may be suitable for clinical practice.
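Wavelet time–frequency analysis of this kind decomposes a non-stationary signal into approximation (slow-trend) and detail (transient) coefficients. A minimal single-level sketch using the Haar wavelet on a synthetic BIS-like signal (the wavelet family and signal are illustrative; the paper does not specify them here):

```python
import numpy as np

def haar_step(signal):
    """One level of the Haar wavelet transform.

    Returns (approximation, detail): pairwise averages capture the slow
    trend, pairwise differences capture fast, transient changes.
    """
    x = np.asarray(signal, dtype=float)
    even, odd = x[0::2], x[1::2]
    approx = (even + odd) / np.sqrt(2.0)
    detail = (even - odd) / np.sqrt(2.0)
    return approx, detail

# Synthetic, BIS-like time-varying signal: a slow drift plus a sharp jump
# (the kind of abrupt change in patient response the analysis must flag)
t = np.arange(256)
signal = 60.0 - 0.05 * t + 10.0 * (t > 128)
approx, detail = haar_step(signal)

# The jump between t = 128 and t = 129 appears as a spike in the details
print(int(np.argmax(np.abs(detail))))
```

In a time-varying signal the detail coefficients localise exactly when the response changes, which is the information a predictive controller can exploit.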

Relevance:

100.00%

Publisher:

Abstract:

This paper applies multidimensional scaling (MDS) techniques and the Fourier transform to visualize possible time-varying correlations between 25 stock market values. The method is useful for observing clusters of stock markets with similar behavior.
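The time-varying aspect of such correlations can be captured by computing them over a sliding window. A minimal sketch on two synthetic return series (the window length and data are illustrative):

```python
import numpy as np

def rolling_correlation(x, y, window):
    """Pearson correlation of x and y over each sliding window."""
    out = np.empty(len(x) - window + 1)
    for i in range(len(out)):
        out[i] = np.corrcoef(x[i:i + window], y[i:i + window])[0, 1]
    return out

rng = np.random.default_rng(2)
common = rng.normal(size=500)
x = common + 0.5 * rng.normal(size=500)
# y tracks the common factor only in the second half of the sample,
# i.e. the correlation between the two "markets" emerges over time
y = np.where(np.arange(500) < 250,
             rng.normal(size=500),
             common + 0.5 * rng.normal(size=500))

corr = rolling_correlation(x, y, window=60)
print(corr[0], "->", corr[-1])
```

Feeding such window-by-window correlation matrices into MDS produces the evolving cluster maps the abstract describes.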

Relevance:

30.00%

Publisher:

Abstract:

This work aimed to optimize the growth conditions of algal biomass for its use as a lipid source for biofuels. Two strains were inoculated, Dunaliella tertiolecta (saltwater) and Tetraselmis subcordiformis (brackish water), and Dunaliella tertiolecta was selected because it grew faster. Once the strain was chosen, the influence of the composition of the species' culture medium was evaluated by varying the concentrations of macronutrients (magnesium, potassium, nitrogen, phosphorus) and micronutrients (manganese, zinc, iron, cobalt) in the medium to 10 and 20 times those of the standard culture medium, the Artificial Seawater Medium with Vitamins. Algal growth was evaluated at a temperature of 25 ºC ± 2 ºC, with a light intensity of 5000 lux (daylight lamps) and 12:12 h photoperiods, while monitoring for possible contamination of the cultures under study. For the trials with Dunaliella tertiolecta, the best results for average and maximum biomass productivity, 63.06 mg dry biomass/L.day and 141.79 mg dry biomass/L.day respectively, were obtained in the trial in which the nitrogen concentration (as nitrate) was increased 10-fold. The most satisfactory results for lipid content and maximum lipid productivity, 33.45% and 47.43 mg oil/L.day respectively, were also obtained in the trial in which the nitrogen concentration (as nitrate) was increased 10-fold (with lipid extraction using the Bligh and Dyer method). Two solvents were tested for lipid extraction, chloroform and hexane; chloroform gave better results than hexane, except in the trial in which the phosphorus concentration in the microalgae culture medium was increased 20-fold.
In all trials the stationary state was reached at roughly the same time, about 25 days after the start of the study, except in the trials in which the cobalt concentration was varied, in which the cultures did not adapt to the changes in the medium and died after 15 days. The addition of the macronutrients and micronutrients used in the trials, in the amounts tested, did not significantly influence lipid productivity, with the exception of nitrogen and iron. It is concluded that increasing the nitrogen concentration to 10x the standard value raises the maximum lipid productivity to more than double (3.6 times; standard: 13.25 mg oil/L.day; 10x N: 47.43 mg oil/L.day) and that increasing the iron concentration to 10x the standard value raises it to approximately double (1.9 times; standard: 14.61 mg oil/L.day; 10x Fe: 28.04 mg oil/L.day). In the trials with added nitrogen or iron, the results obtained for concentration, lipid content, and maximum lipid productivity were always higher than those of the corresponding standard, so these trials can be considered the most promising of this study, the most satisfactory being the one in which the nitrogen concentration was raised to 10 times the standard value.
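The maximum lipid productivity reported above follows from multiplying maximum biomass productivity by lipid content. A quick check of the figures from the abstract:

```python
# Values reported in the abstract (10x nitrogen trial)
max_biomass_productivity = 141.79   # mg dry biomass / L.day
lipid_content = 0.3345              # 33.45% of dry biomass

# Lipid productivity = biomass productivity x lipid fraction
max_lipid_productivity = max_biomass_productivity * lipid_content
print(f"{max_lipid_productivity:.2f} mg oil/L.day")   # 47.43, matching the abstract
```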

Relevance:

30.00%

Publisher:

Abstract:

Resource constraints are becoming a problem as many wireless mobile devices grow in generality. Our work tries to address this growing demand on resources and performance by proposing the dynamic selection of neighbour nodes for cooperative service execution. This selection is influenced by the user's quality-of-service requirements expressed in the request, tailoring the provided service to the user's specific needs. In this paper we improve our proposal's formulation algorithm with the ability to trade off time for the quality of the solution. At any given time a complete solution for service execution exists, and the quality of that solution is expected to improve over time.
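Trading time for solution quality is the defining property of an anytime algorithm: a complete solution exists from the first instant and is refined while time remains. A minimal sketch of that pattern, with a hypothetical task-to-node cost model including a congestion penalty (the names, costs, and search scheme are invented for illustration):

```python
import random

def anytime_assignment(tasks, nodes, base_cost, max_steps):
    """Anytime scheme: a complete solution exists immediately and its
    quality can only improve as more computation time is granted."""
    def total_cost(assign):
        # Base execution cost plus a congestion penalty per shared node
        load = {}
        for n in assign.values():
            load[n] = load.get(n, 0) + 1
        return (sum(base_cost[(t, n)] for t, n in assign.items())
                + sum(l * (l - 1) for l in load.values()))

    # Step 0: greedy complete solution (ignores congestion)
    solution = {t: min(nodes, key=lambda n: base_cost[(t, n)]) for t in tasks}
    history = [total_cost(solution)]

    rng = random.Random(3)
    for _ in range(max_steps):            # each step = extra time granted
        candidate = dict(solution)
        candidate[rng.choice(tasks)] = rng.choice(nodes)
        c = total_cost(candidate)
        if c < history[-1]:               # keep only improvements
            solution = candidate
            history.append(c)
    return solution, history

tasks = ["decode", "filter", "render"]
nodes = ["n1", "n2", "n3"]
base_cost = {(t, n): j for t in tasks for j, n in enumerate(nodes)}
sol, hist = anytime_assignment(tasks, nodes, base_cost, max_steps=200)
print(hist[0], "->", hist[-1])
```

Interrupting the loop at any step still yields a valid, complete assignment, and `history` is non-increasing, which is exactly the trade-off the abstract describes.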

Relevance:

30.00%

Publisher:

Abstract:

Moving towards autonomous operation and management of increasingly complex open distributed real-time systems poses very significant challenges. This is particularly true when reaction to events must be done in a timely and predictable manner while guaranteeing Quality of Service (QoS) constraints imposed by users, the environment, or applications. In these scenarios, the system should be able to maintain a globally feasible QoS level while allowing individual nodes to adapt autonomously under different constraints of resource availability and input quality. This paper shows how decentralised coordination of a group of autonomous interdependent nodes can emerge with little communication, based on the robust self-organising principle of feedback. Positive feedback is used to reinforce the selection of the new desired global service solution, while negative feedback discourages nodes from acting in a greedy fashion, as this adversely impacts the service levels provided at neighbouring nodes. The proposed protocol is general enough to be used in a wide range of scenarios characterised by a high degree of openness and dynamism, where coordination tasks need to be time-dependent. As the reported results demonstrate, it requires fewer messages to be exchanged and reaches a globally acceptable near-optimal solution faster than other available approaches.
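A minimal sketch of that feedback principle, assuming a ring of nodes that adjust a local service level toward a shared global target (the topology, gains, and target are illustrative, not the paper's protocol):

```python
# Decentralised feedback coordination sketch: each node sees only its two
# ring neighbours, yet the group converges on a feasible global level.
def coordinate(levels, target_total, gain=0.2, rounds=200):
    n = len(levels)
    for _ in range(rounds):
        new = levels[:]
        for i in range(n):
            left, right = levels[(i - 1) % n], levels[(i + 1) % n]
            # Positive feedback: pull toward this node's share of the target
            pull = target_total / n - levels[i]
            # Negative feedback: a greedy deviation above the neighbours'
            # mean is damped, protecting their service levels
            damp = levels[i] - (left + right) / 2.0
            new[i] = levels[i] + gain * pull - gain * 0.5 * damp
        levels = new
    return levels

levels = coordinate([9.0, 1.0, 5.0, 0.0, 3.0], target_total=20.0)
print([round(v, 2) for v in levels])   # nodes settle near 20 / 5 = 4.0
```

Despite very unequal starting levels and purely local information, the combination of the two feedback terms drives every node to the same feasible share.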

Relevance:

30.00%

Publisher:

Abstract:

Embedded real-time applications increasingly present high computation requirements that need to be completed within specific deadlines, but that follow highly variable patterns depending on the set of data available at a given instant. The current trend towards parallel processing in the embedded domain provides higher processing power; however, it does not address the variability in the processing pattern. Dimensioning each device for its worst-case scenario implies lower average utilization and increases the processing capacity that is available, but unusable, in the overall system. A solution to this problem is to extend the parallel execution of the applications, allowing networked nodes to distribute the workload to neighbour nodes in peak situations. In this context, this report proposes a framework to develop parallel and distributed real-time embedded applications, transparently using OpenMP and the Message Passing Interface (MPI) within a programming model based on OpenMP. The technical report also devises an integrated timing model, which enables structured reasoning about the timing behaviour of these hybrid architectures.
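The core idea, locally parallel execution that spills peak workload to neighbour nodes, can be sketched independently of OpenMP/MPI. The node names, capacity model, and round-robin policy below are illustrative assumptions, not the report's API; local threads stand in for the OpenMP region and the placement map for the MPI distribution:

```python
from concurrent.futures import ThreadPoolExecutor

def execute_with_offload(jobs, local_capacity, neighbours):
    """Run up to local_capacity jobs here; offload the peak to neighbours
    round-robin, mirroring the OpenMP (local) / MPI (distributed) split."""
    local_jobs = jobs[:local_capacity]
    overflow = jobs[local_capacity:]

    placement = {"local": local_jobs}
    for i, job in enumerate(overflow):
        placement.setdefault(neighbours[i % len(neighbours)], []).append(job)

    # Local part: parallel execution (stand-in for the OpenMP region)
    with ThreadPoolExecutor() as pool:
        local_results = list(pool.map(lambda j: j * j, local_jobs))
    return placement, local_results

jobs = list(range(10))
placement, results = execute_with_offload(jobs, local_capacity=6,
                                          neighbours=["nodeB", "nodeC"])
print(placement)
print(results)
```

In the report's setting, the overflow entries would become MPI sends to the neighbour nodes rather than entries in a dictionary.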

Relevance:

30.00%

Publisher:

Abstract:

This manuscript analyses the data generated by a Zero Length Column (ZLC) diffusion experimental set-up for 1,3-di-isopropylbenzene in a 100% alumina matrix with variable particle size. The time evolution of the phenomena resembles that of fractional-order systems, namely a fast initial transient followed by long, slow tails. The experimental measurements are best fitted by the Harris model, revealing a power-law behaviour.
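A power-law tail of the kind described can be identified by linear regression on log-log axes. A minimal sketch on synthetic desorption-like data (the exponent and noise level are invented; this illustrates the power-law check, not a fit of the Harris model itself):

```python
import numpy as np

rng = np.random.default_rng(5)
t = np.linspace(1.0, 100.0, 200)
# Synthetic long-tailed response c(t) ~ t^(-0.6), typical of fractional-order
# dynamics: a fast initial transient followed by a slow power-law tail
c = t ** -0.6 * np.exp(rng.normal(0.0, 0.02, size=t.size))

# Power-law check: the data fall on a straight line in log-log coordinates,
# and the slope of that line is the power-law exponent
slope, intercept = np.polyfit(np.log(t), np.log(c), 1)
print(f"estimated exponent: {slope:.2f}")
```

A straight log-log trace is the signature that distinguishes power-law tails from the exponential decay of integer-order systems.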

Relevance:

30.00%

Publisher:

Abstract:

This paper proposes a one-step decentralised coordination model based on an effective feedback mechanism to reduce the complexity of the interactions needed among the interdependent nodes of a cooperative distributed system until a collective adaptation behaviour is determined. Positive feedback is used to reinforce the selection of the new desired global service solution, while negative feedback discourages nodes from acting in a greedy fashion, as this adversely impacts the service levels provided at neighbouring nodes. The reduced complexity and overhead of the proposed decentralised coordination model are validated through extensive evaluations.

Relevance:

30.00%

Publisher:

Abstract:

The problem of selecting suppliers/partners is a crucial part of the decision-making process for companies that intend to compete effectively in their area of activity. Supplier/partner selection is a time- and resource-consuming task that involves data collection and a careful analysis of the factors that can positively or negatively influence the choice. Nevertheless, it is a critical process that significantly affects the operational performance of each company. In this work, five broad selection criteria were identified: Quality, Financial, Synergies, Cost, and Production System. Five sub-criteria were also included within these criteria. After identifying the criteria, a survey was prepared and companies were contacted in order to understand which factors carry more weight in their decisions when choosing partners. Once the results were interpreted and the data processed, a linear weighting model was adopted to reflect the importance of each factor. The model has a hierarchical structure and can be applied with the Analytic Hierarchy Process (AHP) method or Value Analysis. The goal of the paper is to provide a selection reference model that can serve as an orientation/pattern for decision making in the supplier/partner selection process.
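The AHP step turns pairwise comparisons of criteria into weights via the principal eigenvector, which then feed the linear weighting model. A minimal sketch with an invented comparison matrix for three of the criteria named above (the judgments and supplier scores are illustrative, not the survey's results):

```python
import numpy as np

# Hypothetical pairwise comparisons (Saaty scale) for Quality, Cost, Financial:
# A[i, j] = how many times criterion i is preferred over criterion j
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# AHP priorities: principal eigenvector of A, normalised to sum to 1
eigvals, eigvecs = np.linalg.eig(A)
principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
weights = principal / principal.sum()
print(np.round(weights, 3))   # Quality gets the largest weight

# Linear weighting model: score each supplier as a weighted sum of its
# (normalised) performance on the criteria
suppliers = np.array([
    [0.9, 0.4, 0.7],   # supplier 1 scores on the three criteria
    [0.6, 0.8, 0.9],   # supplier 2
])
print(suppliers @ weights)    # the highest score identifies the best partner
```

With a full criteria/sub-criteria hierarchy, the same eigenvector step is repeated per level and the weights multiplied down the tree.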

Relevance:

30.00%

Publisher:

Abstract:

The process of selecting resources systems plays an important part in the integration of Distributed/Agile/Virtual Enterprises (D/A/V Es). However, resources systems selection is still a difficult problem to solve in a D/A/VE, as this paper points out. Globally, the selection problem has been approached from different angles, giving rise to different kinds of models/algorithms to solve it. To assist the development of an intelligent and flexible web prototype tool (broker tool) that integrates all the selection model activities and tools, and that can adapt to each D/A/VE project or instance (the major goal of our final project), this paper presents a formulation of one kind of resources selection problem and the limitations of the algorithms proposed to solve it. We formulate a particular case of the problem as an integer program, which is solved using simplex and branch-and-bound algorithms, and identify their performance limitations (in terms of processing time) based on simulation results. These limitations depend on the number of processing tasks and on the number of pre-selected resources per processing task, defining the domain of applicability of the algorithms for the problem studied. The limitations detected show the need for other kinds of algorithms (approximate solution algorithms) outside the domain of applicability found for the simulated algorithms. For a broker tool, however, knowledge of the algorithms' limitations is very important, so that, based on the problem's features, the most suitable algorithm can be developed and selected to guarantee good performance.
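The selection problem described above can be written as a 0-1 program: pick exactly one pre-selected resource per processing task, minimising total cost. A minimal exhaustive-search sketch (the cost table is invented), which also makes visible the combinatorial growth behind the reported processing-time limits:

```python
from itertools import product

# cost[t][r]: cost of assigning pre-selected resource r to processing task t
cost = [
    [4.0, 2.0, 7.0],   # task 0: three candidate resources
    [3.0, 6.0, 1.0],   # task 1
    [5.0, 5.0, 2.0],   # task 2
]

# Exhaustive search over all combinations: the search space has
# |candidates|^|tasks| points, so it grows exponentially with both the
# number of tasks and the number of pre-selected resources per task
best_cost, best_pick = min(
    (sum(row[r] for row, r in zip(cost, pick)), pick)
    for pick in product(*[range(len(row)) for row in cost])
)
print(best_pick, best_cost)
```

Branch-and-bound prunes much of this space using LP (simplex) relaxation bounds, but, as the paper observes, past a certain instance size even that becomes too slow and approximate algorithms are needed.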

Relevance:

30.00%

Publisher:

Abstract:

The scheduling function plays an important role in production systems. Scheduling systems aim to generate a scheduling plan that allows a set of tasks that must be executed in the same time period, by the same resources, to be managed efficiently. However, dynamic adaptation and optimization are a critical need in scheduling systems, since production organizations are dynamic in nature. In these organizations, disturbances in working conditions and requirements occur regularly and unexpectedly. Some examples of these disturbances are: the arrival of a new task, the cancellation of a task, a change in a due date, among others. These dynamic events must be taken into account, since they can affect the plan in place, rendering it inefficient. Production environments therefore need an immediate response to these events, using a real-time rescheduling method to minimize their effect on the production system. Scheduling systems should thus be able, in an automatic and intelligent way, to adapt the scheduling plan the organization is following to unexpected events in real time. This dissertation addresses the problem of incorporating new tasks into an existing scheduling plan. To this end, an optimization approach is proposed, a Constructive Selection Hyper-heuristic for Dynamic Scheduling, to deal with dynamic events that may occur in a production environment, in order to keep the scheduling plan as robust as possible. The approach is inspired by evolutionary computation and hyper-heuristics. From the computational study carried out, it was possible to conclude that the use of the constructive selection hyper-heuristic can be advantageous in solving dynamic-adaptation optimization problems.
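A selection hyper-heuristic of this constructive kind can be sketched as choosing, for each arriving task, the low-level insertion heuristic whose resulting schedule scores best. A minimal illustration on a one-machine schedule with two simple low-level heuristics (the job data, heuristics, and tardiness objective are invented for illustration):

```python
# Low-level constructive heuristics: each proposes a position in the
# existing sequence for the newly arrived job
def insert_first(schedule, job):
    return [job] + schedule                      # run the new job first

def insert_by_due_date(schedule, job):
    out = list(schedule)
    pos = next((i for i, j in enumerate(out) if j["due"] > job["due"]),
               len(out))
    out.insert(pos, job)
    return out

def total_tardiness(schedule):
    t, tardiness = 0, 0
    for job in schedule:
        t += job["proc"]
        tardiness += max(0, t - job["due"])
    return tardiness

def hyperheuristic_insert(schedule, job, heuristics):
    """Selection hyper-heuristic: try every low-level heuristic and keep
    the candidate schedule with the lowest total tardiness."""
    return min((h(schedule, job) for h in heuristics), key=total_tardiness)

schedule = [{"proc": 2, "due": 3}, {"proc": 4, "due": 9}]
new_job = {"proc": 1, "due": 10}
new_schedule = hyperheuristic_insert(schedule, new_job,
                                     [insert_first, insert_by_due_date])
print(total_tardiness(new_schedule))
```

Because only the affected part of the plan is rebuilt, the response to a dynamic event is fast, which is the real-time rescheduling requirement the dissertation targets.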