978 results for Normalization constraint
Abstract:
Stroke (Acidente Vascular Encefálico) is one of the leading causes of death, making rehabilitation processes that minimize its sequelae increasingly pressing, in particular the upper-limb limitations that hinder engagement in activities of daily living. Constraint-Induced Movement Therapy emerges as an approach that increases the use of the more affected upper limb. The present investigation is a multiple-case study. It aims to verify whether there are improvements in the functionality of the more affected upper limb, to analyse in which activities of daily living functional improvements are visible, and to understand whether greater engagement in daily activities is directly related to the improvement in functional capacity. It is also intended that the values obtained in the Wolf Motor Function Test contribute to its validation for the Portuguese population. A questionnaire was used to collect personal and clinical data (ranges of motion, pain and spasticity); the Wolf Motor Function Test and the Action Research Arm Test were used to assess the functionality of the more affected upper limb; and the Motor Activity Log was used to evaluate engagement in activities of daily living. The group consists of 3 patients who had suffered a first stroke with up to 9 months of evolution, admitted to the Santa Casa da Misericórdia de Monção, and who met the inclusion criteria. The programme was implemented three hours/day for 10 days, with the restraint kept on the less affected upper limb for 90% of waking hours. As this is a multiple-case study, each participant was analysed individually and the difference between final and initial results was computed for each variable. The results reveal gains in range of motion, speed of execution and functional capacity of the more affected upper limb, namely in the grip and pinch functions of the hand, as well as a reduction of the learned nonuse phenomenon. Functional gains in activities of daily living were observed in all participants, although they differed from participant to participant. Two participants stated that they would take part in the programme again. It is thus concluded that the technique yields functional gains in these patients, indicating an alternative path to other rehabilitation approaches.
Abstract:
Due to usage conditions, hazardous environments or intentional causes, physical and virtual systems are subject to faults in their components, which may affect their overall behaviour. In a ‘black-box’ agent modelled by a set of propositional logic rules, in which just a subset of components is externally visible, such faults may only be recognised by examining some output function of the agent. A (fault-free) model of the agent’s system provides the expected output given some input. If the real output differs from that predicted output, then the system is faulty. However, some faults may only become apparent in the system output when appropriate inputs are given. A number of problems regarding both testing and diagnosis thus arise, such as testing a fault, testing the whole system, finding possible faults and differentiating them to locate the correct one. The corresponding optimisation problems of finding solutions that require minimum resources are also very relevant in industry, as is minimal diagnosis. In this dissertation we use a well-established set of benchmark circuits to address such diagnosis-related problems, and we propose and develop models with different logics that we formalise and generalise as much as possible. We also prove that all techniques generalise to agents and to multiple faults. The developed multi-valued logics extend the usual Boolean logic (suitable for fault-free models) by encoding values with some dependency (usually on faults). Such logics thus allow modelling an arbitrary number of diagnostic theories. Each problem is subsequently solved with CLP solvers that we implement and discuss, together with a new efficient search technique that we present. We compare our results with other approaches such as SAT (which requires substantial duplication of circuits), showing the effectiveness of constraints over multi-valued logics, and also the adequacy of a general set constraint solver (with special inferences over set functions such as cardinality) on other problems. In addition, for an optimisation problem, we integrate local search with a constructive approach (branch-and-bound) using a variety of logics to improve an existing efficient tool based on SAT and ILP.
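As a toy illustration of the testing problem just described (a hypothetical two-gate circuit, not one of the dissertation's benchmark circuits or its CLP encoding), the sketch below injects a stuck-at fault on an internal line and finds the test inputs on which the faulty output differs from the fault-free model's prediction:

```python
from itertools import product

# Fault-free model of a tiny two-gate circuit: c = AND(a, b), out = XOR(c, a).
# A fault is modelled as a stuck-at value on the internal line 'c'.
def circuit(a, b, fault=None):
    c = a and b
    if fault == "c_stuck_at_0":
        c = False
    elif fault == "c_stuck_at_1":
        c = True
    return c != a   # XOR of c and a

# A test for a fault is any input vector on which the faulty circuit's
# output differs from the output predicted by the fault-free model.
def tests_for(fault):
    return [(a, b) for a, b in product([False, True], repeat=2)
            if circuit(a, b) != circuit(a, b, fault)]

print(tests_for("c_stuck_at_0"))   # -> [(True, True)]
print(tests_for("c_stuck_at_1"))   # the remaining inputs expose this fault
```

Brute-force enumeration only works for toy circuits; the point of the constraint-based encodings described above is to find such tests, and to discriminate between fault candidates, without enumerating the input space.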
Abstract:
The basic motivation of this work was the integration of biophysical models within the interval constraints framework for decision support. Comparing the major features of biophysical models with the expressive power of the existing interval constraints framework, it was clear that the most important inadequacy was related to the representation of differential equations. System dynamics is often modelled through differential equations, but there was no way of expressing a differential equation as a constraint and integrating it within the constraints framework. Consequently, the goal of this work is focused on the integration of ordinary differential equations within the interval constraints framework, which for this purpose is extended with the new formalism of Constraint Satisfaction Differential Problems. Such a framework allows the specification of ordinary differential equations, together with related information, by means of constraints, and provides efficient propagation techniques for pruning the domains of their variables. This enabled the integration of all such information in a single constraint whose variables may subsequently be used in other constraints of the model. The specific method used for pruning its variable domains can then be combined with the pruning methods associated with the other constraints in an overall propagation algorithm for reducing the bounds of all model variables. The application of the constraint propagation algorithm for pruning the variable domains, that is, the enforcement of local consistency, turned out to be insufficient to support decisions in practical problems that include differential equations. The domain pruning achieved is not, in general, sufficient to allow safe decisions, and the main reason derives from the non-linearity of the differential equations. Consequently, as a complementary goal, this work proposes a new strong consistency criterion, Global Hull-consistency, particularly suited to decision support with differential models, which presents an adequate trade-off between domain pruning and computational effort. Several alternative algorithms are proposed for enforcing Global Hull-consistency and, due to their complexity, an effort was made to provide implementations able to supply any-time pruning results. Since the consistency criterion depends on the existence of canonical solutions, a local search approach is proposed that can be integrated with constraint propagation in continuous domains and, in particular, with the enforcing algorithms to anticipate the finding of canonical solutions. The last goal of this work is the validation of the approach as an important contribution to the integration of biophysical models within decision support. Consequently, a prototype application that integrates all the proposed extensions to the interval constraints framework is developed and used for solving problems in different biophysical domains.
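For readers unfamiliar with the underlying machinery, here is a minimal sketch of the kind of interval domain pruning (hull consistency for the algebraic constraint x + y = z) that the differential-equation extension is combined with; the names and structure are illustrative, not the dissertation's actual framework:

```python
# Hull-consistency narrowing for the constraint x + y = z over intervals.
def intersect(a, b):
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    if lo > hi:
        raise ValueError("inconsistent domains")
    return (lo, hi)

def prune_sum(x, y, z):
    """Narrow the interval domains of x, y and z under x + y = z."""
    z = intersect(z, (x[0] + y[0], x[1] + y[1]))   # z within x + y
    x = intersect(x, (z[0] - y[1], z[1] - y[0]))   # x within z - y
    y = intersect(y, (z[0] - x[1], z[1] - x[0]))   # y within z - x
    return x, y, z

def propagate(x, y, z):
    """Apply the narrowing operator until a fixed point is reached."""
    while True:
        nx, ny, nz = prune_sum(x, y, z)
        if (nx, ny, nz) == (x, y, z):
            return x, y, z
        x, y, z = nx, ny, nz

print(propagate((0.0, 8.0), (0.0, 8.0), (10.0, 10.0)))
# -> ((2.0, 8.0), (2.0, 8.0), (10.0, 10.0))
```

The dissertation's contribution can be read as supplying a narrowing operator of this kind for a constraint whose relation is defined by an ordinary differential equation, so that it composes with ordinary algebraic constraints in the same propagation loop.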
Abstract:
Introduction: The Standard Uptake Value (SUV) is a measurement of the uptake in a tumor normalized on the basis of a distribution volume, and it is used to quantify 18F-Fluorodeoxyglucose (FDG) uptake in tumors such as primary lung tumors. Several sources of error can affect its accuracy. Normalization can be based on body weight, body surface area (BSA) or lean body mass (LBM). The aim of this study is to compare the influence of 3 normalization volumes in the calculation of SUV: body weight (SUVW), BSA (SUVBSA) and LBM (SUVLBM), with and without glucose correction, in patients with a known primary lung tumor. The correlation between SUV and weight, height, blood glucose level, injected activity and time between injection and image acquisition is also evaluated. Methods: The sample included 30 subjects (8 female and 22 male) with primary lung tumors and clinical indication for 18F-FDG Positron Emission Tomography (PET). Images were acquired on a Siemens Biograph according to the department’s protocol. The maximum-pixel SUVW was obtained for each abnormal uptake focus through a semiautomatic VOI with 3D isocontour quantification (threshold 2.5). The radioactivity concentration (kBq/ml) was derived from SUVW, and SUVBSA, SUVLBM and the glucose-corrected SUVs were then calculated mathematically. Results: Statistically significant differences between SUVW, SUVBSA and SUVLBM, and between SUVWgluc, SUVBSAgluc and SUVLBMgluc, were observed (p<0.001). The blood glucose level showed significant positive correlations with SUVW (r=0.371; p=0.043) and SUVLBM (r=0.389; p=0.034). SUVBSA was independent of variations in the blood glucose level. Conclusion: The measurement of radiopharmaceutical tumor uptake still varies with the distribution volume used for normalization. Further investigation on this subject is recommended.
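For concreteness, a hedged sketch of the three normalizations follows. The DuBois formula for BSA, the James formula for LBM and the 100 mg/dL glucose reference are common choices assumed here for illustration; the abstract does not state which variants were used, and decay correction of the injected activity is omitted:

```python
# SUV = tissue concentration / (injected activity / normalization factor).
def suv(conc_kbq_ml, injected_kbq, norm_factor):
    return conc_kbq_ml / (injected_kbq / norm_factor)

def bsa_dubois(weight_kg, height_cm):
    """Body surface area in m^2 (DuBois & DuBois formula, assumed)."""
    return 0.007184 * weight_kg ** 0.425 * height_cm ** 0.725

def lbm_james(weight_kg, height_cm, female):
    """Lean body mass in kg (James formula, assumed)."""
    a, b = (1.07, 148.0) if female else (1.10, 128.0)
    return a * weight_kg - b * (weight_kg / height_cm) ** 2

def glucose_corrected(suv_value, glucose_mg_dl, reference_mg_dl=100.0):
    """Scale an SUV by blood glucose relative to a reference level."""
    return suv_value * glucose_mg_dl / reference_mg_dl

# Example: 25 kBq/ml tumor uptake, 250 MBq injected, 75 kg / 172 cm male.
suv_w   = suv(25.0, 250e3, 75.0 * 1000.0)                    # weight in g
suv_bsa = suv(25.0, 250e3, bsa_dubois(75.0, 172.0))          # BSA in m^2
suv_lbm = suv(25.0, 250e3, lbm_james(75.0, 172.0, False) * 1000.0)
print(suv_w, suv_bsa, suv_lbm, glucose_corrected(suv_w, 95.0))
```

Note that the variants carry different units in the normalization factor (grams versus m²), which is one reason their absolute values, and their sensitivity to blood glucose, differ.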
Abstract:
Dissertation presented to obtain the degree of Doctor in Informatics Engineering, Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia
Abstract:
Work presented in the context of the European Master in Computational Logic, as a partial requirement for graduating as Master in Computational Logic
Abstract:
Work presented in the context of the European Master's program in Computational Logic, as a partial requirement for obtaining the Master of Science degree in Computational Logic
Abstract:
Dissertation presented to obtain the degree of Doctor in Informatics Engineering, Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia
Abstract:
Dissertation presented to obtain the degree of Doctor in Chemical and Biochemical Engineering
Abstract:
Optimization aims at obtaining the best possible value of an objective function; continuous optimization is optimization over real-valued domains. There are many global and local search techniques. Global search techniques try to find the global optima of the optimization problem, whereas local search techniques, which are more widely used, seek a locally minimal solution within a region of the search space. In Continuous Constraint Satisfaction Problems (CCSPs), constraints are viewed as relations between variables, and the computations are supported by interval analysis. The continuous constraint programming framework provides branch-and-prune algorithms for covering the sets of solutions of the constraints with sets of interval boxes, which are Cartesian products of intervals. These algorithms begin with an initial crude cover of the feasible space (the Cartesian product of the initial variable domains), which is recursively refined by interleaving pruning and branching steps until a stopping criterion is satisfied. In this work, we seek a convenient way to combine the advantages of CCSP branch-and-prune with the local search techniques of global optimization, applied locally over each pruned branch of the CCSP. We apply local search techniques of continuous optimization over the pruned boxes output by the CCSP techniques. We mainly use the steepest descent technique with different characteristics, such as penalty calculation and step length, and we implement two main local search algorithms. We use “Procure”, a constraint reasoning and global optimization framework, to implement our techniques, and we present our results over a set of benchmarks.
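As an illustration of the combination described above, here is a minimal sketch of steepest descent run locally inside one pruned interval box, projecting each iterate back onto the box bounds; the objective, fixed step length and box are illustrative, and this is not Procure's actual interface:

```python
def project(x, box):
    """Clip a point onto the Cartesian product of intervals (the box)."""
    return [min(max(xi, lo), hi) for xi, (lo, hi) in zip(x, box)]

def steepest_descent_in_box(f, grad, box, steps=200, alpha=0.05):
    x = [(lo + hi) / 2.0 for lo, hi in box]   # start at the box centre
    for _ in range(steps):
        g = grad(x)
        x = project([xi - alpha * gi for xi, gi in zip(x, g)], box)
    return x, f(x)

# Toy objective whose unconstrained minimum (3, -1) lies outside the
# pruned box, so the box bounds become active.
f    = lambda x: (x[0] - 3.0) ** 2 + (x[1] + 1.0) ** 2
grad = lambda x: [2.0 * (x[0] - 3.0), 2.0 * (x[1] + 1.0)]
box  = [(0.0, 2.0), (-0.5, 0.5)]
print(steepest_descent_in_box(f, grad, box))   # -> ([2.0, -0.5], 1.25)
```

Running one such local search per pruned box yields a candidate optimum per branch, and the best of these candidates bounds the global optimum from above.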
Abstract:
This work studies the combination of safe and probabilistic reasoning through the hybridization of Monte Carlo integration techniques with continuous constraint programming. In continuous constraint programming there are variables ranging over continuous domains (represented as intervals) together with constraints over them (relations between variables), and the goal is to find values for those variables that satisfy all the constraints (consistent scenarios). Constraint programming “branch-and-prune” algorithms produce safe enclosures of all consistent scenarios. Specially proposed algorithms for probabilistic constraint reasoning compute the probability of sets of consistent scenarios, which implies the calculation of an integral over these sets (quadrature). In this work we propose to extend the “branch-and-prune” algorithms with Monte Carlo integration techniques to compute such probabilities. This approach can be useful in robotics for localization problems. Traditional approaches are based on probabilistic techniques that search for the most likely scenario, which may not satisfy the model constraints. We show how to apply our approach to cope with this problem while providing functionality in real time.
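A minimal sketch of the proposed hybridization, under assumed details (uniform prior and a toy constraint; not the paper's implementation): Monte Carlo sampling inside the boxes produced by branch-and-prune to estimate the probability of the consistent scenarios:

```python
import random

def box_volume(box):
    v = 1.0
    for lo, hi in box:
        v *= hi - lo
    return v

def probability(boxes, consistent, domain_volume, samples_per_box=10000):
    """Estimate P(consistent) under a uniform prior on the initial domain."""
    prob = 0.0
    for box in boxes:
        hits = sum(consistent([random.uniform(lo, hi) for lo, hi in box])
                   for _ in range(samples_per_box))
        # consistent fraction of the box, weighted by its relative volume
        prob += (hits / samples_per_box) * box_volume(box) / domain_volume
    return prob

# Example: P(x^2 + y^2 <= 1) over the domain [-2,2]^2; one enclosing box
# suffices here. The exact value is pi/16, approximately 0.196.
inside = lambda p: p[0] ** 2 + p[1] ** 2 <= 1.0
print(probability([((-1.0, 1.0), (-1.0, 1.0))], inside, 16.0))
```

Because sampling is confined to the safe enclosures rather than the whole domain, no samples are wasted on regions already proved inconsistent, which is what makes the hybrid attractive for real-time use.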
Abstract:
Shifting from chemical to biotechnological processes is one of the cornerstones of 21st century industry. The production of a great range of chemicals via biotechnological means is a key challenge on the way toward a bio-based economy. However, this shift is occurring at a pace slower than initially expected. The development of efficient cell factories that allow for competitive production yields is of paramount importance for this leap to happen. Constraint-based models of metabolism, together with in silico strain design algorithms, promise to reveal insights into the best genetic design strategies, a step further toward achieving that goal. In this work, a thorough analysis of the main in silico constraint-based strain design strategies and algorithms is presented, their application in real-world case studies is analyzed, and a path for the future is discussed.
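To make the modelling framework concrete, the following toy flux balance analysis shows the core linear program behind constraint-based models of metabolism: maximize a biomass flux subject to steady-state mass balance and flux bounds. The three-reaction network and its bounds are hypothetical, not taken from the case studies analysed in this work:

```python
import numpy as np
from scipy.optimize import linprog

# Stoichiometric matrix S (metabolites x reactions) for a linear toy
# pathway: substrate uptake (v1) -> conversion (v2) -> biomass (v3).
S = np.array([[ 1.0, -1.0,  0.0],    # metabolite A balance
              [ 0.0,  1.0, -1.0]])   # metabolite B balance
bounds = [(0.0, 10.0),               # v1: uptake capped at 10 units
          (0.0, None),               # v2: internal conversion
          (0.0, None)]               # v3: biomass formation
c = [0.0, 0.0, -1.0]                 # linprog minimizes, so negate biomass

# Steady state: S @ v = 0 (no metabolite accumulates).
res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("max biomass flux:", -res.fun, "fluxes:", res.x)  # -> 10.0, [10 10 10]
```

Strain design algorithms build on exactly this kind of model, searching over gene or reaction deletions (forcing selected fluxes to zero) for designs that couple growth to the production of a target compound.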
Abstract:
This chapter presents a general methodology for the formulation of the kinematic constraint equations at the position, velocity and acceleration levels. A brief characterization of the different types of constraints is also offered, namely holonomic and nonholonomic constraints. The kinematic constraints described here are formulated using generalized coordinates. The chapter ends with a general approach to the kinematic analysis of multibody systems.
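In the standard formulation (a sketch consistent with the chapter's description; the chapter's own notation may differ), the velocity- and acceleration-level equations follow by successive differentiation of the position-level constraints on the generalized coordinates q(t):

```latex
\Phi(q, t) = 0
   % position level (holonomic constraints)
\Phi_q \,\dot{q} = -\Phi_t \equiv \nu
   % velocity level, with \Phi_q the constraint Jacobian
\Phi_q \,\ddot{q} = -(\Phi_q \dot{q})_q \,\dot{q} - 2\,\Phi_{qt}\,\dot{q} - \Phi_{tt} \equiv \gamma
   % acceleration level (right-hand side gamma)
```

A nonholonomic constraint, by contrast, is stated directly at the velocity level, typically as B(q,t)·q̇ + b(q,t) = 0, and cannot be integrated to an equivalent position-level form.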
Abstract:
The Achala batholith is one of the largest granitic massifs of the Sierras Pampeanas, located in the Sierras Grandes de Córdoba. Although the Achala batholith has been the subject of various geological studies, mainly because of its uranium deposits, it still lacks an unequivocal petrogenetic model. Nor is there, at present, an unequivocal model explaining the preconcentration of uranium in the granitic rocks that host this element. The general objective of this project is to carry out petrological and geochemical studies in the region known as CAÑADA del PUERTO, a strategically chosen site owing to the abundance of fine- and/or medium-grained equigranular biotite granites, emplaced during the development of late magmatic shear zones, which would constitute the uranium source rocks. The specific objective requires detailed studies of the different facies of the Achala batholith in the selected area, including petrological investigations, whole-rock geochemistry, radioactive-isotope geochemistry and mineral chemistry, in order to define a PETROGENETIC MODEL that explains: (a) the origin of the parent magma and the subsequent crystallization process of the different granitic facies outcropping in the study area, and (b) the main process that led to the uranium PRECONCENTRATION of the granitic magmas channelled into the late magmatic shear zones. The two objectives are complementary rather than watertight compartments, since achieving both will allow a better understanding of the geochemical process that governed the distribution and concentration of U. In this way, we will attempt to define a URANIUM PRECONCENTRATION MODEL EXTRAPOLATABLE to other uranium-enriched granitic areas, constituting a powerful research tool applied to uranium exploration. In particular, knowledge of uranium resources is part of a national strategy aiming to triple the current energy availability before 2025, in which case uranium is the raw material for the nuclear power plants now being planned and built. Moreover, Argentina adhered to the Kyoto Protocol and, together with the other adhering countries, must progressively reduce the use of fossil fuels (which produce greenhouse gases), replacing them with other energy sources, among them NUCLEAR ENERGY. Although this is NOT a mining exploration and/or prospecting project, it is fully consistent with the national energy policy promoted by the Ministerio de Planificación Federal, Inversión Pública y Servicios (see the CNEA website), which has invested substantial sums since 2006 within the framework of the Nuclear Activity Reactivation Programme. The studies will be led by Drs. Dahlquist (CONICET-UNC) and Zarco (CNEA), who will combine their experience in the basic sciences with that gained in the applied sciences, respectively.
The aim is therefore to apply academic and scientific knowledge to a geological problem of potential economic and energy significance, linking the aforementioned institutions, CONICET-UNC and CNEA, in order to contribute to the socioeconomic activity of the province of Córdoba in particular and of Argentina in general. Finally, convinced that the progress of science and technological development is intimately linked to the solid training of human resources, this project is also intended to contribute SIGNIFICANTLY to the doctoral research to be initiated by the geologist Carina Bello, currently a CNEA fellow.
Abstract:
n.s. no. 43 (1988)