62 results for Combinatorial reasoning
Abstract:
Postural Control regulates the position of the body in space and is a prerequisite for movement. At the periphery, this postural control process can also be identified through variations in stiffness. Stroke is the pathology in which subjects are typically reported as presenting altered stiffness, and changes in this variable may be observed in the context of neuro-motor rehabilitation. Objective: To describe the behavior of ankle stiffness, in both lower limbs, in post-stroke individuals, in response to a physiotherapy intervention based on a clinical reasoning process. Methods: 5 subjects participated in the study; a rehabilitation program was implemented for each subject over a period of 3 months, with 2 assessment moments (M0 and M1). Ankle torque and joint range of motion were monitored with an isokinetic dynamometer during passive dorsiflexion at different velocities (5°/s, 1°/s and 0.25°/s). The electromyographic activity of the Medial Gastrocnemius and Soleus muscles was also recorded. The stiffness value was calculated from the torque/position relationship. Results: In all subjects, the stiffness of the limb contralateral to the lesion generally showed a decrease across all ranges of motion at M1. In subjects A and C, the stiffness of the ipsilateral limb also showed a decrease at M1 (at intermediate ranges). In subjects B, D and E, stiffness showed no changes. Stiffness did not vary with velocity. Conclusion: Stiffness tended to decrease in the limb contralateral to the lesion in all subjects, and in the limb ipsilateral to the lesion in subjects A and C at intermediate ranges.
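As a rough illustration of the torque/position calculation mentioned above, the sketch below estimates passive stiffness as the slope of a linear fit to torque-angle samples from a single dorsiflexion trial; the data values and function name are hypothetical, not taken from the study.

```python
import numpy as np

def passive_stiffness(angle_deg, torque_nm):
    """Estimate passive ankle stiffness (N·m/deg) as the slope of the
    torque-angle relationship, obtained from a least-squares linear fit."""
    slope, _intercept = np.polyfit(np.asarray(angle_deg, dtype=float),
                                   np.asarray(torque_nm, dtype=float), 1)
    return slope

# Hypothetical samples from one slow passive dorsiflexion trial
angles = [-10, -5, 0, 5, 10, 15]          # ankle angle in degrees
torques = [1.2, 2.0, 3.1, 4.5, 6.2, 8.4]  # resistive torque in N·m
print(f"stiffness = {passive_stiffness(angles, torques):.2f} N·m/deg")
```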
Abstract:
Introduction: Upper-limb movement is unequivocally directed towards solving neuromotor problems. The reaching gesture is the clearest example of this segment's capacity to organize itself in space with specific goals related to accomplishing a motor purpose. The need to resort to compensatory strategies can be reduced through an intervention based on a clinical reasoning process, grounded in the understanding of the specific components of movement and motor control: the Bobath Concept (BC). Objective: To analyze the changes in trunk displacement, movement time, number of movement units and peak hand velocity during the reaching gesture in 4 individuals with neuromotor impairments resulting from a stroke, following a BC-based intervention program. Methods: The study presents four cases of individuals with stroke who underwent BC-based physiotherapy for 12 weeks. Before and after the intervention, trunk displacement, movement time, movement units and peak hand velocity during reaching were analyzed using the Qualisys Track Manager. Compensatory movements during reaching were assessed with the Reach Performance Test, and motor impairment of the upper limb with the Fugl-Meyer Assessment of Motor Recovery after Stroke. Results: After the intervention, most of the individuals showed a reduction of compensatory movements during reaching, with decreases in trunk displacement, movement time and movement units, and an increase in hand velocity. Conclusion: The BC-based intervention had positive effects on postural control of the trunk and on the upper limb in the four individuals with stroke.
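The kinematic variables named above are commonly derived from the hand speed profile recorded by the motion-capture system. The sketch below shows one conventional way of obtaining movement units (velocity peaks) and peak hand velocity; the thresholds and function names are illustrative assumptions, not the study's processing pipeline.

```python
import numpy as np
from scipy.signal import find_peaks

def movement_units(speed, min_height=0.02, min_distance=10):
    """Count movement units as local maxima of the tangential hand speed
    profile; threshold values are illustrative and depend on the capture
    setup (sampling rate, filtering)."""
    peaks, _ = find_peaks(np.asarray(speed, dtype=float),
                          height=min_height, distance=min_distance)
    return len(peaks)

def peak_hand_velocity(speed):
    """Maximum tangential hand speed reached during the reach."""
    return float(np.max(speed))
```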
Abstract:
From the outset, expert systems were not seen merely as systems that could replace experts. Experts, in turn, regardless of the field in which they operate, have been regarded as individuals who reached excellence through experience, study and total dedication, sometimes over many years. Today, more than taking on the role of an expert in a given field, expert systems stand out for their continuous availability, ease of access, reduced costs, operational stability and consistency of reasoning. Driven by current advances in networks, in terms of speed and global reach, these systems are now made available in a simpler and more accessible way. This work aims to evolve the current system into a more appealing, functional and reliable version. One of the objectives is to move from the current mode of operation, local use by a single user, to a mode of operation that allows deployment in a globally accessible environment, available at any time and from any place.
Abstract:
Background Information: The incorporation of distance learning activities by institutions of higher education is considered an important contribution to creating new opportunities for teaching, in both initial and continuing training. In Medicine and Nursing, papers illustrating the adaptation of technological components and teaching methods are prolific; however, in the Pharmaceutical Education area, examples are scarce. In that sense, this project demonstrates the implementation and assessment of a B-Learning Strategy for Therapeutics using a “case based learning” approach. Setting: Academic Pharmacy. Methods: This is an exploratory study involving 2nd year students of the Pharmacy Degree at the School of Allied Health Sciences of Oporto. The study population consists of 61 students, divided into groups of 3-4 elements. The b-learning model was implemented over a period of 8 weeks. Results: A b-learning environment and digital learning objects were successfully created and implemented. Collaboration and assessment techniques were carefully developed to ensure the active participation and fair assessment of all students. Moodle records show consistent student activity during the assignments. E-portfolios were also developed using Wikispaces, which promoted reflective writing and clinical reasoning. Conclusions: Our exploratory study suggests that the “case based learning” method can be successfully combined with technological components to create and maintain a feasible online learning environment for the teaching of therapeutics.
Abstract:
A flow-spectrophotometric method is proposed for the routine determination of tartaric acid in wines. The reaction between tartaric acid and vanadate in acetic medium is carried out under flow conditions and the resulting colored complex is monitored at 475 nm. The stability of the complex and the corresponding formation constant are presented. The effect of wavelength and pH was evaluated in batch experiments. The selected conditions were then transposed to a flow-injection analytical system, and several flow parameters, such as reactor lengths, flow rate and injection volume, were optimized. Under the optimized conditions, linear behavior was observed up to 1000 µg mL⁻¹ of tartaric acid, with a molar extinction coefficient of 450 L mg⁻¹ cm⁻¹ and a repeatability of ±1 %. Sample throughput was 25 samples per hour. The flow-spectrophotometric method was satisfactorily applied to the quantification of tartaric acid (TA) in wines from different sources. Its accuracy was confirmed by statistical comparison with the conventional Rebelein procedure and with a certified analytical method carried out in a routine laboratory.
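The quantification step rests on the usual Beer-Lambert relation between the absorbance of the colored complex measured at 475 nm and the tartaric acid concentration; within the reported linear range the concentration follows directly from the calibration:

\[ A = \varepsilon \, b \, c \quad\Longrightarrow\quad c = \frac{A - A_0}{\varepsilon \, b}, \]

where \(A\) is the measured absorbance, \(A_0\) the blank (calibration intercept), \(\varepsilon\) the extinction coefficient, \(b\) the optical path length of the flow cell and \(c\) the tartaric acid concentration.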
Abstract:
This paper presents a complete quadratic programming formulation of the standard thermal unit commitment problem in power generation planning, together with a novel iterative optimization algorithm for its solution. The algorithm, based on a mixed-integer formulation of the problem, considers piecewise linear approximations of the quadratic fuel cost function that are dynamically updated in an iterative way, converging to the optimum; this avoids the need to resort to quadratic programming and makes the solution process much quicker. In extensive computational tests on a broad set of benchmark instances of this problem, the algorithm was found to be flexible and capable of easily incorporating different problem constraints. Indeed, it is able to tackle ramp constraints, which, although very important in practice, were rarely considered in previous publications. Most importantly, optimal solutions were obtained for several well-known benchmark instances, including instances of practical relevance, which were not previously known to have been solved to optimality. The computational experiments and their results show that the proposed method is both simple and extremely effective.
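The core idea, replacing the quadratic fuel cost with linear segments and refining the approximation where the current solution lies, can be sketched as follows; the coefficients, limits and refinement rule here are purely illustrative, not the paper's actual procedure.

```python
import bisect

def fuel_cost(p, a=100.0, b=20.0, c=0.05):
    """Quadratic fuel cost C(p) = a + b*p + c*p^2 (illustrative coefficients)."""
    return a + b * p + c * p * p

def piecewise_cost(p, breakpoints):
    """Linear interpolation of the cost between consecutive breakpoints."""
    i = max(1, bisect.bisect_left(breakpoints, p))
    i = min(i, len(breakpoints) - 1)
    p0, p1 = breakpoints[i - 1], breakpoints[i]
    c0, c1 = fuel_cost(p0), fuel_cost(p1)
    return c0 + (c1 - c0) * (p - p0) / (p1 - p0)

# Iterative refinement: add a breakpoint at the power level chosen by the
# (hypothetical) mixed-integer solution, shrinking the approximation error there.
breakpoints = [50.0, 250.0, 450.0]   # MW
p_dispatch = 180.0                   # power level from the last MIP solve
gap = piecewise_cost(p_dispatch, breakpoints) - fuel_cost(p_dispatch)
breakpoints = sorted(set(breakpoints + [p_dispatch]))
print(f"approximation gap closed at p={p_dispatch}: {gap:.2f}")
```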
Abstract:
In recent years several countries have set up policies that allow exchange of kidneys between two or more incompatible patient–donor pairs. These policies lead to what is commonly known as kidney exchange programs. The underlying optimization problems can be formulated as integer programming models. Previously proposed models for kidney exchange programs have exponential numbers of constraints or variables, which makes them fairly difficult to solve when the problem size is large. In this work we propose two compact formulations for the problem, explain how these formulations can be adapted to address some problem variants, and provide results on the dominance of some models over others. Finally we present a systematic comparison between our models and two previously proposed ones via thorough computational analysis. Results show that compact formulations have advantages over non-compact ones when the problem size is large.
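For context, the non-compact models the paper improves upon are usually stated over the exchange cycles of a directed compatibility graph, whose number grows very fast with instance size. The toy sketch below enumerates such short cycles; it illustrates only the standard cycle view of the problem, not the compact formulations proposed in the work.

```python
from itertools import permutations

def feasible_cycles(compatible, max_len=3):
    """Enumerate exchange cycles up to max_len in a directed compatibility
    graph: compatible[i] is the set of pairs whose patient can receive a
    kidney from pair i's donor. Toy-scale enumeration for illustration only."""
    pairs = list(compatible)
    cycles = set()
    for length in range(2, max_len + 1):
        for combo in permutations(pairs, length):
            closed = all(combo[(k + 1) % length] in compatible[combo[k]]
                         for k in range(length))
            if closed:
                # store each cycle once, under a canonical rotation
                rot = min(range(length), key=lambda s: combo[s:] + combo[:s])
                cycles.add(combo[rot:] + combo[:rot])
    return cycles

# Hypothetical 4-pair instance
compat = {1: {2}, 2: {1, 3}, 3: {4}, 4: {2}}
print(feasible_cycles(compat))   # e.g. {(1, 2), (2, 3, 4)}
```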
Abstract:
This paper presents a modified Particle Swarm Optimization (PSO) methodology to solve the problem of energy resources management with a high penetration of distributed generation and Electric Vehicles (EVs) with gridable capability (V2G). The objective of the day-ahead scheduling problem in this work is to minimize operation costs, namely energy costs, regarding the management of these resources in the smart grid context. The modifications applied to the PSO aim to improve its suitability for this problem. The proposed Application Specific Modified Particle Swarm Optimization (ASMPSO) includes an intelligent mechanism to adjust velocity limits during the search process, as well as self-parameterization of the PSO parameters, making it more user-independent. It presents better robustness and convergence characteristics than the tested PSO variants, as well as better constraint handling. This enables its use for addressing real-world large-scale problems in much shorter times than deterministic methods, providing system operators with adequate decision support and achieving efficient resource scheduling, even when a significant number of alternative scenarios has to be considered. The paper includes two realistic case studies with different numbers of gridable vehicles (1000 and 2000). The proposed methodology is about 2600 times faster than the Mixed-Integer Non-Linear Programming (MINLP) reference technique, reducing the time required from 25 h to 36 s for the scenario with 2000 vehicles, with a difference of about one percent in the objective function cost value.
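The ASMPSO mechanisms themselves are not reproduced here; the sketch below is a generic PSO with velocity clamping, the ingredient the abstract singles out, in which the velocity limit is simply a fixed fraction of the search range (all parameters are illustrative assumptions).

```python
import random

def pso_minimize(f, dim, bounds, swarm=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Generic PSO with velocity clamping; the adaptive limit adjustment
    described in the abstract is only mimicked by a fixed fraction of the
    search range (illustrative choice)."""
    lo, hi = bounds
    v_max = 0.2 * (hi - lo)                       # velocity limit (heuristic)
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(swarm), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                vel[i][d] = max(-v_max, min(v_max, vel[i][d]))  # clamp velocity
                pos[i][d] = max(lo, min(hi, pos[i][d] + vel[i][d]))
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy usage: minimize a 5-dimensional sphere function
best, cost = pso_minimize(lambda x: sum(v * v for v in x), dim=5, bounds=(-10, 10))
```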
Abstract:
Embedded real-time applications increasingly present high computation demands that need to be completed within specific deadlines, but which exhibit highly variable patterns depending on the set of data available at a given instant. The current trend of providing parallel processing in the embedded domain delivers higher processing power; however, it does not address the variability of the processing pattern. Dimensioning each device for its worst-case scenario implies lower average utilization and processing capacity that is available in the overall system but left unusable. A solution to this problem is to extend the parallel execution of the applications, allowing networked nodes to distribute the workload, in peak situations, to neighbour nodes. In this context, this report proposes a framework to develop parallel and distributed real-time embedded applications, transparently using OpenMP and the Message Passing Interface (MPI), within a programming model based on OpenMP. The technical report also devises an integrated timing model, which enables structured reasoning about the timing behaviour of these hybrid architectures.
Abstract:
Master's Degree in Pre-School Education
Abstract:
The resource constrained project scheduling problem (RCPSP) is a difficult problem in combinatorial optimization, for which extensive research has been devoted to the development of efficient algorithms. Over the last few years many heuristic procedures have been developed for this problem, but these procedures still often fail to find near-optimal solutions. This paper proposes a genetic algorithm for the resource constrained project scheduling problem. The chromosome representation of the problem is based on random keys. The schedule is constructed using a heuristic priority rule in which the priorities and delay times of the activities are defined by the genetic algorithm. The approach was tested on a set of standard problems taken from the literature and compared with other approaches. The computational results validate the effectiveness of the proposed algorithm.
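To make the random-key idea concrete, the sketch below decodes a chromosome of keys into a precedence-feasible activity sequence by always picking, among the eligible activities, the one with the highest key. It is a stripped-down illustration only, omitting the GA-defined delay times and the full schedule generation scheme.

```python
def decode_priorities(chromosome, predecessors):
    """Turn a random-key chromosome (one key in [0,1) per activity) into a
    precedence-feasible activity sequence: at each step, among the activities
    whose predecessors are already scheduled, pick the one with the highest key."""
    unscheduled = set(predecessors)
    sequence = []
    while unscheduled:
        eligible = [a for a in unscheduled
                    if all(p not in unscheduled for p in predecessors[a])]
        nxt = max(eligible, key=lambda a: chromosome[a])
        sequence.append(nxt)
        unscheduled.remove(nxt)
    return sequence

# Hypothetical 5-activity project: activity -> set of predecessor activities
preds = {0: set(), 1: {0}, 2: {0}, 3: {1, 2}, 4: {2}}
keys = {0: 0.9, 1: 0.3, 2: 0.7, 3: 0.5, 4: 0.1}
print(decode_priorities(keys, preds))   # -> [0, 2, 1, 3, 4]
```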
Abstract:
This paper presents a methodology for applying scheduling algorithms using Monte Carlo simulation. The methodology is based on a decision support system (DSS) and combines a genetic algorithm with a new local search that uses the Monte Carlo method. The methodology is applied to the job shop scheduling problem (JSSP). The JSSP is a difficult problem in combinatorial optimization, for which extensive research has been devoted to the development of efficient algorithms. The methodology is tested on a set of standard instances taken from the literature and compared with other approaches. The computational results validate the effectiveness of the proposed methodology. The DSS developed can be used in a common industrial or construction environment.
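The paper's DSS is not reproduced here; the sketch below only illustrates a Monte Carlo style local search on a job sequence, sampling random swap moves and keeping any move that improves the objective (the makespan evaluation is left as a placeholder).

```python
import random

def monte_carlo_local_search(sequence, makespan, samples=500, seed=None):
    """Randomly sample pairwise swaps of the sequence and keep a swap whenever
    it improves the objective. `makespan` stands in for the actual JSSP
    schedule evaluation, which is not reproduced here."""
    rng = random.Random(seed)
    best = list(sequence)
    best_val = makespan(best)
    for _ in range(samples):
        i, j = rng.sample(range(len(best)), 2)
        cand = best[:]
        cand[i], cand[j] = cand[j], cand[i]
        val = makespan(cand)
        if val < best_val:
            best, best_val = cand, val
    return best, best_val
```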
Abstract:
Swarm Intelligence (SI) is the property of a system whereby the collective behaviors of (unsophisticated) agents interacting locally with their environment cause coherent functional global patterns to emerge. Particle swarm optimization (PSO) is a form of SI and a population-based search algorithm that is initialized with a population of random solutions, called particles. These particles fly through hyperspace and have two essential reasoning capabilities: memory of their own best position and knowledge of the swarm's best position. In a PSO scheme, each particle flies through the search space with a velocity that is dynamically adjusted according to its historical behavior. The particles therefore tend to fly towards the best search area as the search progresses. This work proposes a PSO-based algorithm for logic circuit synthesis. The results show the statistical characteristics of this algorithm with respect to the number of generations required to reach the solutions. A comparison with two other evolutionary algorithms, namely Genetic and Memetic Algorithms, is also presented.
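The two reasoning capabilities described above correspond, in the standard textbook form of PSO (not necessarily the exact variant used in this work), to the cognitive and social terms of the velocity update:

\[
v_i^{t+1} = w\, v_i^{t} + c_1 r_1 \big(p_i^{\text{best}} - x_i^{t}\big) + c_2 r_2 \big(g^{\text{best}} - x_i^{t}\big),
\qquad
x_i^{t+1} = x_i^{t} + v_i^{t+1},
\]

where \(p_i^{\text{best}}\) is particle \(i\)'s own best position (its memory), \(g^{\text{best}}\) the swarm's best position (its social knowledge), \(w\) the inertia weight, \(c_1, c_2\) acceleration coefficients and \(r_1, r_2\) random numbers drawn from \([0, 1]\).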
Abstract:
This paper presents an optimization approach for the job shop scheduling problem (JSSP). The JSSP is a difficult problem in combinatorial optimization, for which extensive research has been devoted to the development of efficient algorithms. The proposed approach is based on a genetic algorithm technique. Scheduling rules such as SPT and MWKR are integrated into the process of genetic evolution. The chromosome representation of the problem is based on random keys. The schedules are constructed using a priority rule in which the priorities and delay times of the operations are defined by the genetic algorithm, together with a procedure that generates parameterized active schedules. After a schedule is obtained, a local search heuristic is applied to improve the solution. The approach is tested on a set of standard instances taken from the literature and compared with other approaches. The computational results validate the effectiveness of the proposed approach.
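The parameterized active-schedule generator is not reproduced here; the sketch below only shows the two dispatching rules the abstract names, SPT (shortest processing time) and MWKR (most work remaining), as they would be used to choose the next operation among those ready to start. The data and identifiers are hypothetical.

```python
def spt(ready_ops, proc_time, work_remaining):
    """Shortest Processing Time: pick the ready operation with the
    smallest processing time."""
    return min(ready_ops, key=lambda op: proc_time[op])

def mwkr(ready_ops, proc_time, work_remaining):
    """Most Work Remaining: pick the ready operation whose job still has
    the largest amount of processing left."""
    return max(ready_ops, key=lambda op: work_remaining[op])

# Hypothetical snapshot: three operations are ready on the same machine
proc = {"J1-op2": 4, "J2-op1": 7, "J3-op3": 5}         # processing times
remaining = {"J1-op2": 12, "J2-op1": 20, "J3-op3": 5}  # work left in each job
ready = ["J1-op2", "J2-op1", "J3-op3"]
print(spt(ready, proc, remaining))    # -> "J1-op2"
print(mwkr(ready, proc, remaining))   # -> "J2-op1"
```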