941 results for Closed-Loop Control
Abstract:
Background Europe is living in an unsustainable situation. Since 2008, the economic crisis has been reducing governments' resources. The European population is ageing at a steady pace, to the point that by 2050 there are expected to be only two workers per retiree [54]. Added to this is the rising incidence of age-related chronic diseases, whose cost can reach 7% of a country's GDP [51]. A paradigm shift is needed: a new way of caring for people's health that is sustainable, effective and preventive rather than curative. Several studies advocate personalised health care (pHealth), a model in which medical practices are adapted and individualised to the patient, from the detection of risk factors to the customisation of treatments based on the individual's response [81]. Personalised health care is often associated with the use of information and communication technologies (ICT), whose exponential development offers interesting opportunities for improving health. The paradigm shift towards pHealth is slowly taking place, both in research and in industry, but not yet significantly. Many barriers remain, related to economics, politics and culture. There are also purely technological barriers, such as the lack of interoperable information systems [199]. Although interoperability aspects are evolving, a reference design is still missing, particularly one addressing the implementation and large-scale deployment of pHealth-based systems. This thesis is an attempt to organise the discipline of applying ICT to personalised health care into a reference model that enables the creation of software development platforms to simplify common development tasks in this domain. 
Research questions RQ1 Is it possible to define a model, based on software engineering techniques, that represents the personalised health care domain in an abstract and representative way? RQ2 Is it possible to build a development platform based on this model? RQ3 Does this platform help developers create complex, integrated pHealth systems? Methods The ISO/IEC/IEEE 42010 standard [25] was adopted for describing the model, as it is general and abstract enough for the broad scope of this thesis. The model is defined in several parts: a conceptual model, expressed through concept maps representing the stakeholders, the artefacts and the shared information; and scenarios and use cases describing its functionality. The model was developed from the information obtained in a literature analysis covering 7 industrial and scientific reports, 9 standards, 10 conference papers, 37 journal papers, 25 web pages and 5 books. Based on the model, the requirements for building the development platform were defined, enriched with further requirements gathered in a survey of 11 engineers with experience in the field. The platform was developed following the continuous integration methodology [74], which made it possible to run automatic tests on a server and to deploy applications on a web page. As the validation methodology, a framework for theory building in software engineering was adopted [181]. It requires the development of models and propositions to be validated within a defined research scope, and serves to guide the researcher in the search for the evidence needed to justify them. The model was validated through an online survey conducted in three rounds with a growing number of invitees. 
The questionnaire was sent to 134 contacts and distributed through public channels such as mailing lists and social networks. Its objective was to assess the model's readability, its level of coverage of the domain and its potential usefulness in the design of derived systems. The questionnaire included quantitative Likert-scale questions and free-text fields for comments. The development platform was validated in two stages. In the first stage, the platform was used in a small-scale experiment consisting of a 12-hour training session in which 4 developers had to implement a set of use cases and then meet in a focus group to discuss its use. The second stage took place during the tests of a large-scale project called HeartCycle [160]. In this project a team of designers and programmers developed three applications in the field of cardiovascular diseases. One of these applications was tested in a clinical trial with real patients. At the end of the project, the development team met in a focus group to identify the advantages and disadvantages of the platform and its usefulness. Results As regards the model describing the pHealth domain, its conceptual part includes a description of the main roles and concerns of the participants, a model of the ICT artefacts commonly used, and a model representing the typical data that need to be formalised and exchanged between pHealth-based systems. 
The functional model includes a set of 18 scenarios, divided into the assisted person's view, the caregiver's view, the developer's view, the technology providers' view and the authorities' view; and a set of 52 use cases grouped into 6 categories: assisted person's activities, system reactions, caregiver's activities, user engagement, developer's activities and deployment activities. As a result of the model validation questionnaire, a total of 65 people reviewed the model, providing their level of agreement on the assessed dimensions and a total of 248 comments on how to improve it. The participants' backgrounds ranged from software engineering (70%) to medical specialities (15%), with declared interest in eHealth (24%), mHealth (16%), Ambient Assisted Living (21%), personalised medicine (5%), pHealth-based systems (15%), medical informatics (10%) and biomedical engineering (8%), and an average of 7.25±4.99 years of experience in these areas. The survey results show that the contacted experts consider the model easy to read (mean of 1.89±0.79, where 1 is the most favourable value and 5 the worst), sufficiently abstract (1.99±0.88) and formal (2.13±0.77), with sufficient coverage of the domain (2.26±0.95), and useful for describing the domain (2.02±0.7) and for generating more specific systems (2±0.75). The experts also report a partial interest in using the model in their work (2.48±0.91). Thanks to their comments, the model was improved and enriched with missing concepts, although an improvement in the assessed dimensions could not be demonstrated, given the different composition of participants across the three evaluation rounds. From the model, a development platform called the "pHealth Patient Platform (pHPP)" was generated. 
The platform includes libraries, programming and development tools, a tutorial and a sample application. Four main modules of the architecture were defined: the Data Collection Engine, which abstracts data sources such as sensors or external services, mapping data to databases or ontologies and enabling event-based interaction; the GUI Engine, which abstracts the user interface into a message-based interaction model; the Workflow Engine, which allows the application's user-interaction flows to be programmed graphically; and the Rule Engine, which gives developers a simple means of programming the application's logic in the form of "if-then" rules. After the pHPP platform had been used for 5 years in the HeartCycle project, 5 developers met in a focus group to analyse and evaluate it. From these evaluations it is concluded that the platform was designed to fit the needs of engineers working in the field, enabling the separation of concerns between the different specialities and simplifying some development tasks such as data management and asynchronous interaction. Nevertheless, some shortcomings were found, owing to the immaturity of some of the technologies employed and the absence of some domain-specific tools, such as for data processing or for certain health-related communication protocols. Within the HeartCycle project the platform was used to develop the "Guided Exercise" application, an ICT system for the rehabilitation of patients who have suffered a myocardial infarction. The system was tested in a randomised clinical trial in which 55 patients were given the system to use for 21 weeks. From the technical results of the trial it can be concluded that, despite some minor bugs promptly corrected during the study, the platform is stable and reliable. 
Conclusions The research carried out in this thesis and the results obtained provide answers to the three research questions that motivated the work: RQ1 A model was developed to represent the domain of personalised health systems. The evaluation by experts in the field concludes that the model represents the domain accurately, with an appropriate balance between abstraction and detail. RQ2 A development platform based on the model was successfully developed. RQ3 The platform was shown to help developers create complex pHealth software. The advantages of the platform were demonstrated within a large-scale project, although the generic approach adopted indicates that it could offer benefits in other contexts as well. The results of these evaluations suggest that both the model and the platform are good candidates to become a reference for future pHealth system developments. ABSTRACT Background Europe is living in an unsustainable situation. The economic crisis has been reducing governments' economic resources since 2008 and threatening social and health systems, while the proportion of older people in the European population continues to increase, so that it is foreseen that in 2050 there will be only two workers per retiree [54]. To this must be added the rise, strongly related to age, of chronic diseases, the burden of which has been estimated at up to 7% of a country's gross domestic product [51]. There is a need for a paradigm shift: a new way of caring for people's health, shifting the focus from curing conditions that have already arisen to a sustainable, effective approach with the emphasis on prevention. 
Some advocate the adoption of personalised health care (pHealth), a model in which medical practices are tailored to the patient's unique life, from the detection of risk factors to the customisation of treatments based on each individual's response [81]. Personalised health is often associated with the use of Information and Communications Technology (ICT), which, with its exponential development, offers interesting opportunities for improving healthcare. The shift towards pHealth is slowly taking place, both in research and in industry, but the change is not yet significant. Many barriers still exist related to economics, politics and culture, while others are purely technological, such as the lack of interoperable information systems [199]. Though interoperability aspects are evolving, there is still a need for a reference design, especially one tackling the implementation and large-scale deployment of pHealth systems. This thesis contributes to organising the subject of ICT systems for personalised health into a reference model that allows for the creation of software development platforms to ease common development issues in the domain. Research questions RQ1 Is it possible to define a model, based on software engineering techniques, for representing the personalised health domain in an abstract and representative way? RQ2 Is it possible to build a development platform based on this model? RQ3 Does the development platform help developers create complex integrated pHealth systems? Methods As the method for describing the model, the ISO/IEC/IEEE 42010 framework [25] is adopted for its generality and high level of abstraction. The model is specified in several parts: a conceptual model, which makes use of concept maps, for representing stakeholders, artefacts and shared information; and scenarios and use cases for representing the functionalities of pHealth systems. 
The model was derived from a literature analysis covering 7 industrial and scientific reports, 9 standards, 10 conference papers, 37 journal papers, 25 websites and 5 books. Based on the reference model, requirements were drawn up for building the development platform, enriched with a set of requirements gathered in a survey run among 11 experienced engineers. For developing the platform, the continuous integration methodology [74] was adopted, which made it possible to perform automatic tests on a server and to deploy packaged releases on a web site. As the validation methodology, a theory-building framework for software engineering was adopted from [181]. The framework, chosen as a guide to finding evidence for justifying the research questions, imposed the creation of theories based on models and propositions to be validated within a defined scope. The validation of the model was conducted as an online survey in three rounds, encompassing a growing number of participants. The survey was submitted to 134 experts in the field and distributed on public channels such as relevant mailing lists and social networks. Its objective was to assess the model's readability, its level of coverage of the domain and its potential usefulness in the design of actual, derived systems. The questionnaires included quantitative Likert-scale questions and free-text inputs for comments. The development platform was validated in two settings. In a small-scale experiment, the platform was used in a 12-hour training session where 4 developers had to perform an exercise consisting of developing a set of typical pHealth use cases. At the end of the session, a focus group was held to identify benefits and drawbacks of the platform. The second validation was held as a test-case study in a large-scale research project called HeartCycle, the aim of which was to develop a closed-loop disease management system for heart failure and coronary heart disease patients [160]. 
During this project three applications were developed by a team of programmers and designers. One of these applications was tested in a clinical trial with actual patients. At the end of the project, the team was interviewed in a focus group to assess the role the platform had played within the project. Results As regards the model describing the pHealth domain, its conceptual part includes a description of the main roles and concerns of pHealth stakeholders, a model of the ICT artefacts that are commonly adopted, and a model representing the typical data that need to be formalised and exchanged among pHealth systems. The functional model includes a set of 18 scenarios, divided into the assisted person's view, the caregiver's view, the developer's view, the technology and services providers' view and the authority's view, and a set of 52 use cases grouped in 6 categories: assisted person's activities, system reactions, caregiver's activities, user engagement, developer's activities and deployer's activities. As for the validation of the model, a total of 65 people participated in the online survey, providing their level of agreement on all the assessed dimensions and a total of 248 comments on how to improve and complete the model. Participants' backgrounds spanned from engineering and software development (70%) to medical specialities (15%), with declared interest in the fields of eHealth (24%), mHealth (16%), Ambient Assisted Living (21%), Personalized Medicine (5%), Personal Health Systems (15%), Medical Informatics (10%) and Biomedical Engineering (8%), and with an average of 7.25±4.99 years of experience in these fields. 
From the analysis of the answers it can be observed that the contacted experts considered the model easily readable (average of 1.89±0.79, where 1 is the most favourable score and 5 the worst), sufficiently abstract (1.99±0.88) and formal (2.13±0.77) for its purpose, with a sufficient coverage of the domain (2.26±0.95), and useful for describing the domain (2.02±0.7) and for generating more specific systems (2±0.75); they also reported a partial interest in using the model in their job (2.48±0.91). Thanks to their comments, the model was improved and enriched with concepts that were missing at the beginning; nonetheless, it was not possible to prove an improvement across the iterations, owing to the diversity of the participants in the three rounds. From the model, a development platform for the pHealth domain, called the pHealth Patient Platform (pHPP), was generated. The platform includes a set of libraries, programming and deployment tools, a tutorial and a sample application. The four main modules of the architecture are: the Data Collection Engine, which allows abstracting sources of information such as sensors or external services, mapping data to databases and ontologies, and allowing event-based interaction and filtering; the GUI Engine, which abstracts the user interface into a message-like interaction model; the Workflow Engine, which allows programming the application's user-interaction flows with graphical workflows; and the Rule Engine, which gives developers a simple means of programming the application's logic in the form of "if-then" rules. After the 5-year experience of HeartCycle, partially programmed with pHPP, 5 developers met in a focus group to discuss the advantages and drawbacks of the platform. 
The view that emerged from the training course and the focus group was that the platform is well suited to the needs of the engineers working in the field: it allowed the separation of concerns among the different specialities and it simplified some common development tasks such as data management and asynchronous interaction. Nevertheless, some deficiencies were pointed out in terms of a lack of maturity of some technological choices, and the absence of some domain-specific tools, e.g. for data processing or for health-related communication protocols. Within HeartCycle, the platform was used to develop part of the Guided Exercise system, a composition of ICT tools for the physical rehabilitation of patients who had suffered a myocardial infarction. The system developed using the platform was tested in a randomized controlled clinical trial, in which 55 patients used the system for 21 weeks. The technical results of this trial showed that the system was stable and reliable. Some minor bugs were detected, but these were promptly corrected using the platform. This shows that the platform, as well as facilitating the development task, can be successfully used to produce reliable software. Conclusions The research work carried out in developing this thesis provides responses to the three research questions that were the motivation for the work. RQ1 A model was developed representing the domain of personalised health systems, and the assessment of experts in the field was that it represents the domain accurately, with an appropriate balance between abstraction and detail. RQ2 A development platform based on the model was successfully developed. RQ3 The platform has been shown to assist developers in creating complex pHealth software. This was demonstrated within the scope of one large-scale project, but the generic approach adopted provides indications that it would offer benefits more widely. 
The results of these evaluations provide indications that both the model and the platform are good candidates for being a reference for future pHealth developments.
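The "if-then" rule style that the abstract attributes to the pHPP Rule Engine can be illustrated with a minimal sketch. This is not the actual platform API: the `RuleEngine` class, its methods and the event format below are hypothetical, chosen only to show the event-driven, rule-based programming model being described.

```python
# Hypothetical sketch of an "if-then" rule engine in the style described
# for pHPP; names and APIs are illustrative, not the real platform.

class RuleEngine:
    def __init__(self):
        self.rules = []  # list of (condition, action) pairs

    def add_rule(self, condition, action):
        """condition: event dict -> bool; action: event dict -> None."""
        self.rules.append((condition, action))

    def on_event(self, event):
        """Fire every rule whose condition matches the incoming event."""
        fired = []
        for condition, action in self.rules:
            if condition(event):
                action(event)
                fired.append(action)
        return fired

engine = RuleEngine()
alerts = []

# "if heart rate above 120 during exercise, then raise an alert"
engine.add_rule(
    lambda e: e.get("type") == "heart_rate" and e["value"] > 120,
    lambda e: alerts.append(f"High heart rate: {e['value']} bpm"),
)

engine.on_event({"type": "heart_rate", "value": 135})
print(alerts)  # ['High heart rate: 135 bpm']
```

In this style, application logic is expressed as small condition/action pairs reacting to data events rather than as imperative control flow, which fits the separation of concerns the focus groups highlighted.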
Abstract:
The rise of the "Internet of Things" (IoT) and its associated technologies has enabled its application in many domains, including forest ecosystem monitoring, disaster and emergency management, home automation, industrial automation, smart city services, building energy efficiency, intrusion detection and the monitoring of body signals, among many others. The drawback of an IoT network is that, once deployed, it is left unattended: it is subject, among other things, to changing weather conditions and exposed to natural disasters, software or hardware failures, and malicious third-party attacks, so such networks can be considered failure-prone. The main requirement on the nodes of an IoT network is that they must be able to keep operating despite errors in the system itself. The network's ability to recover from unexpected internal and external failures is what is currently known as network "resilience". Therefore, when designing and deploying IoT applications or services, the network is expected to be fault-tolerant and to be self-configuring, self-adaptive and self-optimising with respect to new conditions that may appear during execution. This leads to the analysis of a fundamental problem in the study of IoT networks: the "connectivity" problem. A network is said to be connected if every pair of nodes in the network can find at least one communication path between them. However, the network can become disconnected for several reasons, such as battery depletion or the destruction of a node. 
It is therefore necessary to manage the resilience of the network in order to maintain connectivity between its nodes, so that each IoT node can provide continuous services to other nodes, other networks, or other services and applications. In this context, the main objective of this doctoral thesis is the study of the IoT connectivity problem, and more specifically the development of models for the analysis and management of resilience, put into practice through wireless sensor networks (WSNs), with the aim of improving the fault tolerance of the nodes that make up the network. This challenge is addressed from two distinct angles. On the one hand, unlike other conventional device networks, the nodes of an IoT network are prone to losing their connection, because they are deployed in isolated environments or in environments with extreme conditions. On the other hand, the nodes are usually resources with low capabilities in terms of processing, storage and battery, among others, which requires the design of their resilience management to be lightweight, distributed and energy-efficient. In this regard, this thesis develops self-adaptive techniques that allow an IoT network, from the perspective of its topology control, to be resilient to node failures. To this end, techniques based on fuzzy logic and on proportional-integral-derivative (PID) control are used to improve the connectivity of the network, bearing in mind that energy consumption must be preserved as much as possible. Likewise, the control algorithm must be distributed because, in general, centralised approaches are not feasible for large-scale deployments. 
This thesis involves several challenges concerning network connectivity, including: the creation and analysis of mathematical models describing the network; the proposal of a self-adaptive control system responding to node failures; the optimisation of the control system's parameters; validation through an implementation following a software engineering approach; and, finally, evaluation in a real application. Addressing these challenges, this work justifies, through mathematical analysis, the relationship between the "node degree" (defined as the number of nodes in the neighbourhood of a given node) and network connectivity, and proves the effectiveness of several types of controllers that adjust the transmission power of the network nodes in response to failures, taking energy consumption into account as part of the control objectives. The work also evaluates and compares the approach with other representative algorithms, showing that it tolerates more random node failures and is more energy-efficient. In addition, the use of bio-inspired algorithms has enabled the optimisation of the control parameters of large dynamic networks. With respect to the implementation in a real system, the proposals of this thesis have been integrated into an OSGi ("Open Services Gateway Initiative") programming model in order to create a self-adaptive middleware that improves resilience management, especially the runtime reconfiguration of software components when a failure has occurred. In conclusion, the results of this doctoral thesis contribute to the theoretical research and practical application of resilient topology control in large distributed networks. 
The designs and algorithms presented can be seen as novel trials of some techniques for the coming IoT era. The main contributions of this thesis are summarised as follows: (1) Properties related to network connectivity have been analysed mathematically. It is studied, for example, how the probability of network connectivity varies as the communication range of the nodes is modified, and what the minimum number of nodes is that must be added to a disconnected system to reconnect it. (2) Control systems based on fuzzy logic have been proposed to achieve the desired node degree while maintaining full network connectivity. Different types of fuzzy-logic controllers have been evaluated through simulations, and the results have been compared with other representative algorithms. (3) The two-loop control system, a simpler and more applicable approach, has been investigated further, and its control parameters have been optimised using heuristic algorithms such as the Cross-Entropy method (CE), Particle Swarm Optimization (PSO) and Differential Evolution (DE). (4) Most of the designs presented here have been evaluated through simulation; in addition, part of the work has been implemented and validated in a real application combining self-adaptive software techniques, such as those of a service-oriented architecture (SOA). ABSTRACT The advent of the Internet of Things (IoT) enables a tremendous number of applications, such as forest monitoring, disaster management, home automation, factory automation, smart cities, etc. However, various kinds of unexpected disturbances may cause node failure in the IoT, for example battery depletion, software/hardware malfunctions and malicious attacks. 
So, the IoT can be considered prone to failure. The ability of the network to recover from unexpected internal and external failures is known as the "resilience" of the network. Resilience usually serves as an important non-functional requirement when designing the IoT, and can further be broken down into "self-*" properties, such as self-adaptation, self-healing, self-configuration and self-optimization. One of the consequences that node failure brings to the IoT is that some nodes may be disconnected from others, such that they are not capable of providing continuous services to other nodes, networks and applications. In this sense, the main objective of this dissertation focuses on the IoT connectivity problem. A network is regarded as connected if any pair of distinct nodes can communicate with each other either directly or via a limited number of intermediate nodes. More specifically, this thesis focuses on the development of models for the analysis and management of resilience, implemented through wireless sensor networks (WSNs), which is a challenging task. On the one hand, unlike other conventional network devices, nodes in the IoT are more likely to be disconnected from each other due to their deployment in hostile or isolated environments. On the other hand, nodes are resource-constrained in terms of limited processing capability, storage and battery capacity, which requires the design of resilience management for the IoT to be lightweight, distributed and energy-efficient. In this context, the thesis presents self-adaptive techniques for the IoT, with the aim of making it resilient against node failures from the network topology control point of view. Fuzzy-logic and proportional-integral-derivative (PID) control techniques are leveraged to improve the network connectivity of the IoT in response to node failures, while taking into consideration that energy consumption must be preserved as much as possible. 
The control algorithm itself is designed to be distributed, because centralized approaches are usually not feasible in large-scale IoT deployments. The thesis involves various aspects concerning network connectivity, including: the creation and analysis of mathematical models describing the network, the proposal of self-adaptive control systems in response to node failures, control-system parameter optimization, implementation using a software engineering approach, and evaluation in a real application. This thesis also justifies the relation between the "node degree" (the number of neighbours of a node) and network connectivity through mathematical analysis, and proves the effectiveness of various types of controllers that can adjust the transmission power of IoT nodes in response to node failures. The controllers also take energy consumption into consideration as part of the control goals. An evaluation is performed and a comparison is made with other representative algorithms. The simulation results show that the proposals in this thesis can tolerate more random node failures and save more energy than those representative algorithms. Additionally, the simulations demonstrate that the use of bio-inspired algorithms allows optimizing the parameters of the controller. With respect to the implementation in a real system, the OSGi (Open Service Gateway Initiative) programming model is integrated with the proposals in order to create a self-adaptive middleware, especially for reconfiguring software components at runtime when failures occur. The outcomes of this thesis contribute to the theoretical research and practical application of resilient topology control for large, distributed networks. The presented controller designs and optimization algorithms can be viewed as novel trials of control and optimization techniques for the coming era of the IoT. 
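As a rough illustration of the PID-based topology control described above, the sketch below runs a discrete PID loop in which a node adjusts its transmission range so that its measured node degree tracks a desired target. Everything here is an assumption made for illustration: the plant model (expected degree proportional to the covered area), the node density and the gains are toy values, not the thesis's actual controller or parameters.

```python
# Illustrative sketch (not the thesis implementation): a discrete PID
# controller that a node could run locally, adjusting its transmission
# range so that its node degree approaches a desired target.

class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error):
        self.integral += error
        derivative = error - self.prev_error
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def degree(range_, density):
    """Toy plant model: expected neighbours grow with the covered area."""
    return density * 3.14159 * range_ ** 2

target_degree = 6.0
density = 0.01            # nodes per square metre (assumed)
range_ = 5.0              # initial transmission range in metres (assumed)
pid = PID(kp=0.3, ki=0.02, kd=0.05)

for _ in range(150):
    error = target_degree - degree(range_, density)
    range_ += pid.step(error)   # increase range when the degree is too low

print(round(degree(range_, density), 2))  # settles near the target degree of 6
```

The same loop structure would apply with a fuzzy-logic controller in place of `PID.step`, which is the alternative the thesis also evaluates.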
The contributions of this thesis can be summarized as follows: (1) Mathematically, the fault-tolerant probability of a large-scale stochastic network is analyzed. It is studied how the probability of network connectivity depends on the communication range of the nodes, and what minimum number of neighbors must be added for network re-connection. (2) A fuzzy-logic control system is proposed, which obtains the desired node degree and in turn maintains network connectivity when the network is subject to node failures. Different types of fuzzy-logic controllers are evaluated by simulation, and the results demonstrate improved fault tolerance compared with other representative algorithms. (3) A simpler but more practical approach, a two-loop control system, is further investigated, and its control parameters are optimized using heuristic algorithms such as Cross Entropy (CE), Particle Swarm Optimization (PSO) and Differential Evolution (DE). (4) Most of the designs are evaluated by means of simulations, but some of the proposals are also implemented and tested in a real-world application by combining self-adaptive software techniques with the control algorithms presented in this thesis.
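As an illustration of how a heuristic such as PSO can tune controller parameters, the following is a minimal, generic PSO sketch. The quadratic cost function and the "ideal" gains are hypothetical stand-ins; a real evaluation would score each candidate gain vector by simulating the controlled network:

```python
import random

def pso(cost, dim, n_particles=20, iters=50, w=0.7, c1=1.5, c2=1.5, lo=0.0, hi=2.0):
    """Minimal Particle Swarm Optimization: each particle remembers its own
    best position, the swarm shares a global best, and velocities blend
    inertia with attraction towards both bests."""
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_cost = [cost(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_cost[i])
    gbest, gbest_cost = pbest[g][:], pbest_cost[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            c = cost(pos[i])
            if c < pbest_cost[i]:
                pbest[i], pbest_cost[i] = pos[i][:], c
                if c < gbest_cost:
                    gbest, gbest_cost = pos[i][:], c
    return gbest, gbest_cost

# Hypothetical stand-in cost: distance of candidate (kp, ki) gains from
# assumed ideal values (0.8, 0.3).
gains, err = pso(lambda v: (v[0] - 0.8) ** 2 + (v[1] - 0.3) ** 2, dim=2)
```

CE and DE would slot into the same role: all three only require a black-box cost, which is why they suit controller tuning where no gradient is available.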
Abstract:
This paper studies the impact that different approaches to modeling the real-time use of secondary regulation reserves have on the joint energy and reserve hourly scheduling of a price-taker pumped-storage hydropower plant. The unexpected imbalance costs due to the error between the forecasted real-time use of the reserves and the actual value are also studied and evaluated for the different approaches. The proposed methodology is applied to a daily-cycle, closed-loop pumped-storage hydropower plant. Preliminary results show that the deviations in the water volume at the end of the day are significant when the percentage of real-time use of the reserves is unknown in advance, and also that the total income in all approaches, after correcting these deviations, is significantly lower than the maximum theoretical income.
Abstract:
BACKGROUND & AIMS The liver performs a panoply of complex activities coordinating metabolic, immunologic and detoxification processes. Despite the liver's robustness and unique self-regeneration capacity, viral infection, autoimmune disorders, fatty liver disease, alcohol abuse and drug-induced hepatotoxicity contribute to the increasing prevalence of liver failure. Liver injuries impair the clearance of bile acids from the hepatic portal vein which leads to their spill over into the peripheral circulation where they activate the G-protein-coupled bile acid receptor TGR5 to initiate a variety of hepatoprotective processes. METHODS By functionally linking activation of ectopically expressed TGR5 to an artificial promoter controlling transcription of the hepatocyte growth factor (HGF), we created a closed-loop synthetic signalling network that coordinated liver injury-associated serum bile acid levels to expression of HGF in a self-sufficient, reversible and dose-dependent manner. RESULTS After implantation of genetically engineered human cells inside auto-vascularizing, immunoprotective and clinically validated alginate-poly-(L-lysine)-alginate beads into mice, the liver-protection device detected pathologic serum bile acid levels and produced therapeutic HGF levels that protected the animals from acute drug-induced liver failure. CONCLUSIONS Genetically engineered cells containing theranostic gene circuits that dynamically interface with host metabolism may provide novel opportunities for preventive, acute and chronic healthcare. LAY SUMMARY Liver diseases leading to organ failure may go unnoticed as they do not trigger any symptoms or significant discomfort. We have designed a synthetic gene circuit that senses excessive bile acid levels associated with liver injuries and automatically produces a therapeutic protein in response. 
When integrated into mammalian cells and implanted into mice, the circuit detects the onset of liver injuries and coordinates the production of a protein pharmaceutical which prevents liver damage.
Abstract:
The results of two experiments are reported that examined how performance in a simple interceptive action (hitting a moving target) was influenced by the speed of the target, the size of the intercepting effector and the distance moved to make the interception. In Experiment 1, target speed and the width of the intercepting manipulandum (bat) were varied. The hypothesis that people make briefer movements when the temporal accuracy and precision demands of the task are high predicts that bat width and target speed will interact divisively in their effect on movement time (MT) and that shorter MTs will be associated with a smaller temporal variable error (VE). An alternative hypothesis, that people initiate movement when the rate of expansion (ROE) of the target's image reaches a specific, fixed criterion value, predicts that bat width will have no effect on MT. The results supported the first hypothesis: a statistically reliable interaction of the predicted form was obtained and the temporal VE was smaller for briefer movements. In Experiment 2, distance to move and target speed were varied. MT increased in direct proportion to distance and there was a divisive interaction between distance and speed; as in Experiment 1, temporal VE was smaller for briefer movements. The pattern of results could not be explained by the strategy of initiating movement at a fixed value of the ROE, or at a fixed value of any other perceptual variable potentially available for initiating movement. It is argued that the results support pre-programming of MT, with movement initiated when the target's time to arrival at the interception location reaches a criterion value matched to the pre-programmed MT. The data supported completely open-loop control when MT was below approximately 200-240 ms, with corrective sub-movements becoming increasingly frequent for movements of longer duration.
Abstract:
PURPOSE: To evaluate the hypothesis that objective measures of open- and closed-loop ocular accommodation are related to systemic cardiovascular function, and ipso facto autonomic nervous system activity. METHODS: Sixty subjects (29 male; 31 female) varying in age from 18 to 33 years (average: 20.3 +/- 2.9 years) with a range of refractive errors [mean spherical equivalent (MSE): -7.12 to +1.82 D] participated in the study. Five 20-s continuous objective recordings of the accommodative response, measured with an open-view IR autorefractor (Shin-Nippon SRW-5000), were obtained for a variety of open- and closed-loop accommodative demands, while heart rate was recorded continuously for 5 min with a finger-mounted piezo-electric pulse transducer. Fast Fourier Transformation of cardiovascular function allowed the absolute and relative power of the autonomic components to be assessed in the frequency domain, whereas heart period gave an indication of the time-domain response. RESULTS: Increasing closed-loop accommodative demand led to a concurrent increase in heart rate of approximately 2 beats/min for a 4.0 D increase in accommodative demand. The increase was attributable to a reduction in the absolute (p < 0.05) and normalised (p < 0.001) input of the systemic parasympathetic nervous system, and was unaffected by refractive group; the interaction with refractive group failed to reach significance. CONCLUSIONS: For sustained accommodative effort, the data demonstrate covariation between the oculomotor and cardiovascular systems, which implies that a near visual task can significantly influence cardiovascular behaviour. Accommodative effort alone, however, is not a sufficient stimulus to induce autonomic differences between refractive groups. The data suggest that the responses of both the oculomotor and cardiovascular systems are predominantly attributable to changes in the systemic parasympathetic nervous system.
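The frequency-domain decomposition described, separating heart-rate variability into bands dominated by the sympathetic and parasympathetic branches, is conventionally done by computing spectral power in the LF (0.04-0.15 Hz) and HF (0.15-0.40 Hz) bands. A minimal sketch on a synthetic heart-period signal follows; the band edges are the common convention and the signal is fabricated for illustration, not data from the study:

```python
import math

def band_power(signal, fs, f_lo, f_hi):
    """Spectral power of a uniformly sampled signal inside [f_lo, f_hi),
    computed with a plain discrete Fourier transform (adequate for short
    recordings; an FFT library would be used in practice)."""
    n = len(signal)
    mean = sum(signal) / n
    x = [s - mean for s in signal]              # remove the DC component
    power = 0.0
    for k in range(1, n // 2):                  # positive frequencies only
        f = k * fs / n
        if f_lo <= f < f_hi:
            re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            power += (re * re + im * im) / (n * n)
    return power

# Synthetic 5-minute heart-period signal resampled at 4 Hz, with one
# LF (0.10 Hz) and one stronger HF (0.25 Hz) oscillation.
fs = 4.0
t = [i / fs for i in range(1200)]
hp = [0.8 + 0.02 * math.sin(2 * math.pi * 0.10 * s)
          + 0.03 * math.sin(2 * math.pi * 0.25 * s) for s in t]

lf = band_power(hp, fs, 0.04, 0.15)             # sympathetic + parasympathetic
hf = band_power(hp, fs, 0.15, 0.40)             # predominantly parasympathetic
hf_norm = hf / (lf + hf)                        # normalised HF power
```

A drop in `hf_norm` under increasing accommodative demand would correspond to the reduced normalised parasympathetic input reported above.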
Abstract:
Wastewater treatment coupled with energy crop cultivation provides an attractive source of cheap feedstock. This study reviews an advanced, closed-loop bioenergy conversion process [biothermal valorisation of biomass (BtVB)], in which a pyroformer is coupled to a gasifier. The BtVB process was developed at the European Bioenergy Research Institute (EBRI), Aston University, UK, and demonstrates an improved method for the thermal conversion of ash-rich biomass.
Abstract:
Increasingly complicated medication regimens, associated with the need for repeated dosing of the multiple agents used in treating pulmonary disease, have been shown to compromise both disease management and patient convenience. In this study the viability of spray drying to introduce controlled-release vectors into dry powders for inhalation was investigated. The first experimental section highlights the use of leucine in producing highly respirable spray-dried powders, with in vitro respirable fractions (fine particle fraction, FPF: <5 µm) exceeding 80% of the total dose. The second experimental chapter introduces the biocompatible polymer chitosan (mw 190-310 kDa) to formulations containing leucine, with findings of increased FPF with increasing leucine concentration (up to 82%) and prolonged release of the active markers terbutaline sulfate (up to 2 hours) and beclometasone dipropionate (BDP: up to 12 hours) with increasing chitosan molecular weight. Next, the thesis details the use of a double emulsion format in delivering the active markers salbutamol sulfate and BDP at differing rates; using the polymers poly-lactide co-glycolide (PLGA 50:50 and PLGA 75:25) and/or chitosan, incorporating leucine as an aerosolisation enhancer, the duration of in vitro release of both agents reached 19 days with FPF exceeding 60%. The final experimental chapter involves dual aqueous and organic closed-loop spray drying to create controlled-release dry powders for inhalation with in vitro sustained release exceeding 28 days and FPF surpassing 55% of the total loaded dose. In conclusion, potentially highly respirable sustained-release dry powders for inhalation have been produced by this research using the polymers chitosan and/or PLGA as drug-release modifiers and leucine as an aerosolisation enhancer.
Abstract:
It has been proposed that early-onset myopia (EOM), i.e. myopia onset before the age of 15, is primarily inherited, whereas late-onset myopia (LOM), i.e. myopia onset after the age of 16, is induced by environmental factors, principally sustained near vision. No consensus exists as to which aspect of the near vision response (accommodation, vergence or their synergistic cross-links) promotes LOM development. Furthermore, the mechanism by which near vision could induce elongation of the posterior chamber is obscure, although there is evidence that ciliary muscle tone plays an important role. By comparing the accommodation and vergence responses of emmetropes (EMMs), EOMs and LOMs under both open- and closed-loop conditions, this thesis aims to define further the oculomotor correlates of myopic development. A Canon Autoref R-1 optometer was used to measure accommodation responses, while an Apple IIe controlled the flashed Maddox Rod sequence used when measuring vergence. Both techniques permitted open- and closed-loop measures to be obtained. The results presented demonstrate that it is unlikely that individuals susceptible to LOM can be distinguished by their oculomotor responses or the innervational characteristics of the ciliary muscle. The aetiology of LOM may be associated with ciliary muscle function, but account needs to be taken of interactions between the ciliary muscle, choroid, sclera and intraocular pressure, and further research is necessary before those EMMs susceptible to LOM can be identified.
Abstract:
It is well established that a synkinetic relationship exists between the accommodation and vergence components of the oculomotor near response, such that increased accommodation will initiate a vergence response (i.e. accommodative convergence) and, conversely, increased vergence will drive accommodation (i.e. convergent accommodation). The synkinesis associated with sustained near vision was examined in a student population consisting of emmetropes, late-onset myopes (LOMs), i.e. myopia onset at 15 years of age or later, and early-onset myopes (EOMs), i.e. myopia onset prior to 15 years of age. Oculomotor synkinesis was investigated both under closed-loop conditions and with either accommodation or vergence open-loop. Objective measures of the accommodative response were made using an infra-red optometer. Differences in near-response characteristics were observed between LOMs and EOMs under both open- and closed-loop conditions. LOMs exhibit significantly higher levels of disparity-induced accommodation (accommodation driven by vergence under closed-loop conditions) and lower response accommodative convergence/accommodation (AC/A) ratios when compared with EOMs. However, no difference in convergent accommodation/convergence (CA/C) ratios was found between the three refractive groups. Accommodative adaptation was examined by comparing the pre- to post-task shift in dark focus (DF) following near-vision tasks. Accommodative adaptation was observed following tasks as brief as 15 s. Following a 45 s near-vision task, subjects with a pre-task DF greater than +0.75 D exhibited a marked negative shift in post-task DF, which was shown to be induced by beta-adrenergic innervation to the ciliary muscle. However, no evidence was found to support the proposal of reduced adrenergic innervation to the ciliary muscle in LOMs.
Disparity-vergence produced a reduction in accommodative adaptation suggesting that oculomotor adaptation was not driven by the output of the near-response crosslinks. In order to verify this proposition, the effect of vergence adaptation on CA/C was investigated and it was observed that prism adaptation produced no significant change in the CA/C ratio. This would indicate that in a model of accommodation-vergence interaction, the near response cross-links occur after the input to the adaptive components of the oculomotor response rather than before the adaptive elements as reported in previous literature. The findings of this thesis indicate differences in the relative composition of the aggregate accommodation and vergence responses in the three refractive groups examined. They may also have implications with regard to the aetiology of late-onset myopia.
Abstract:
A Hamamatsu Video Area Analyser has been coupled with a modified Canon IR automatic optometer. This has allowed simultaneous recordings of pupil diameter and accommodation response to be made both statically and continuously, a feature not common in previous studies. The experimental work concerned pupil and accommodation responses during near-vision tasks under a variety of conditions. The effects of sustained near-vision tasks on accommodation have usually been demonstrated by taking post-task measures under darkroom conditions. The possibility of similar effects on pupil diameter was assessed using static and continuous recordings following a near-vision task. Results showed that if luminance levels remained unchanged, by using a pre- and post-task bright empty field, then, although accommodation regressed to pre-task levels, pupil diameter remained for several minutes at the constricted level induced by the task. An investigation into the effect of a sinusoidally modulated blur-only accommodative stimulus on the pupil response demonstrated that the response may be reduced or absent despite robust accommodation responses. This suggests that blur-driven accommodation alone may not be sufficient to produce a pupil near response and that the presence of other cues may be necessary. The pupil response was investigated using a looming stimulus which produced an inferred-proximity cue. It was found that a pupil response could be induced which was in synchrony with the stimulus while the closed-loop accommodation response was kept constant by the constraints of optical blur. The pupil diameter of young and elderly subjects undertaking a 5-minute reading task was measured to assess the contribution of pupil constriction to near-vision function in terms of depth-of-focus. Results showed that in the young subjects pupil diameter was too large to have a significant effect on depth-of-focus, although the effect may be increased in the elderly subjects.
Pupil and accommodation responses to a temporally modulated stimulus containing all the cues present in a normal visual environment were assessed, and the results showed that as stimulus temporal frequency increased, the pupil response showed increasing phase lag relative to closed-loop accommodation. The results of this study suggest that it may be necessary to change the accepted view of the function of the pupil response as part of the near-vision triad, and that further study would be of benefit, in particular to designers of vision aids such as, for example, bifocal contact lenses.
Abstract:
Accommodating intraocular lenses (IOLs), multifocal IOLs (MIOLs) and toric IOLs are designed to provide a greater level of spectacle independence after cataract surgery. All of these IOLs rely on the accurate calculation of intraocular lens power, determined through reliable ocular biometry. A standardised defocus area metric and a reading performance index metric were devised for evaluating the range of focus and the reading ability of subjects implanted with presbyopia-correcting IOLs. The range of clear vision after implantation of an MIOL is extended by a second focal point; however, this results in the prevalence of dysphotopsia. A bespoke halometer was designed and validated to assess this photopic phenomenon. There is a lack of standardisation in the methods used for determining IOL orientation and thus rotation. A repeatable, objective method was developed to allow the accurate assessment of IOL rotation, which was used to determine the rotational and positional stability of a closed-loop haptic IOL. A new commercially available biometry device was validated for use with subjects prior to cataract surgery. The optical low-coherence reflectometry instrument proved to be a valid method for assessing ocular biometry and covered a wider range of ocular parameters than previous instruments. The advantages of MIOLs were shown to include an extended range of clear vision, translating into greater reading ability. However, an increased prevalence of dysphotopsia was shown with the bespoke halometer, which was dependent on the MIOL optic design. Implantation of a single-optic accommodating IOL did not improve reading ability but achieved high subjective ratings of near vision. The closed-loop haptic IOL displayed excellent rotational stability in the late period but relatively poor rotational stability in the early period post implantation.
The orientation error was compounded by the high frequency of positional misalignment leading to an extensive overall misalignment of the IOL. This thesis demonstrates the functionality of new IOL lens designs and the importance of standardised testing methods, thus providing a greater understanding of the consequences of implanting these IOLs. Consequently, the findings of the thesis will influence future designs of IOLs and testing methods.
Abstract:
Using suitably coupled Navier-Stokes equations for an incompressible Newtonian fluid, we investigate the linear and non-linear steady-state solutions for both a homogeneously and a laterally heated fluid with finite Prandtl number (Pr = 7) in the vertical orientation of the channel. Both models are studied within the large-aspect-ratio narrow-gap approximation and under constant-flux conditions with the channel closed. We use direct numerics to identify the linear stability criterion in parametric terms as a function of the Grashof number (Gr) and the streamwise infinitesimal perturbation wavenumber (making use of the generalised Squire's theorem). We find that higher-harmonic solutions with a 1:3 resonance exist at lower wavenumbers for both of the heating models considered. We proceed to identify 2D secondary steady-state solutions, which bifurcate from the laminar state. Our studies show that pure-mode 2D solutions do not exist in certain regions of the manifold, where instead 1:3 resonant-mode 2D solutions exist for low-wavenumber perturbations. For the homogeneously heated fluid, we notice a jump phenomenon between the pure- and resonant-mode secondary solutions for very specific wavenumbers. We attempt to verify whether mixed-mode solutions are present for this model by considering the laterally heated model with the same geometry. We find mixed-mode solutions for the laterally heated model, showing that a bridge exists between the pure- and 1:3 resonant-mode 2D solutions, some of which are stationary and some travelling. Further, we show for the homogeneously heated fluid that the 2D solutions bifurcate in Hopf bifurcations, and that there exists a manifold where the 2D solutions are stable according to the Eckhaus criterion; within this manifold we proceed to identify 3D tertiary solutions and find that the stability of these 3D bifurcations is not phase-locked to the 2D state.
For the homogeneously heated model we identify a closed loop within the neutral stability curve at higher perturbation wavenumbers and analyse the nature of the multiple 2D bifurcations around this loop for identical wavenumbers, finding that a temperature inversion occurs within this loop. We conclude that for a homogeneously heated fluid it is possible to have abrupt transitions between the pure and resonant 2D solutions, and that for the laterally heated model there exists a transient bifurcation via mixed-mode solutions.
Abstract:
Several levels of complexity are available for the modelling of wastewater treatment plants. Modelling local effects relies on computational fluid dynamics (CFD) approaches, whereas activated sludge models (ASM) represent the global methodology. By applying both modelling approaches to pilot-plant and full-scale systems, this paper evaluates the value of each method and especially their potential combination. Model structure identification for ASM is discussed based on the modelling of a full-scale closed-loop oxidation ditch. It is illustrated how, and in what circumstances, information obtained via CFD analysis, residence time distribution (RTD) and other experimental means can be used. Furthermore, CFD analysis of the multiphase flow mechanisms is employed to obtain a correct description of the oxygenation capacity of the system studied, including an easy implementation of this information in classical ASM modelling (e.g. oxygen transfer). The combination of CFD and activated sludge modelling of wastewater treatment processes is applied to three reactor configurations: a perfectly mixed reactor, a pilot-scale activated sludge basin (ASB) and a real-scale ASB. The application of the biological models to the CFD model is validated against experimentation for the pilot-scale ASB and against a classical global ASM model response. A first step in the evaluation of the potential of the combined CFD-ASM model is performed using a full-scale oxidation ditch system as a testing scenario.
Abstract:
The miniaturization, sophistication, proliferation, and accessibility of technologies are enabling the capture of more and previously inaccessible phenomena in Parkinson's disease (PD). However, more information has not translated into a greater understanding of disease complexity to satisfy diagnostic and therapeutic needs. Challenges include noncompatible technology platforms, the need for wide-scale and long-term deployment of sensor technology (among vulnerable elderly patients in particular), and the gap between the "big data" acquired with sensitive measurement technologies and their limited clinical application. Major opportunities could be realized if new technologies are developed as part of open-source and/or open-hardware platforms that enable multichannel data capture sensitive to the broad range of motor and nonmotor problems that characterize PD and are adaptable into self-adjusting, individualized treatment delivery systems. The International Parkinson and Movement Disorders Society Task Force on Technology is entrusted to convene engineers, clinicians, researchers, and patients to promote the development of integrated measurement and closed-loop therapeutic systems with high patient adherence that also serve to (1) encourage the adoption of clinico-pathophysiologic phenotyping and early detection of critical disease milestones, (2) enhance the tailoring of symptomatic therapy, (3) improve subgroup targeting of patients for future testing of disease-modifying treatments, and (4) identify objective biomarkers to improve the longitudinal tracking of impairments in clinical care and research. This article summarizes the work carried out by the task force toward identifying challenges and opportunities in the development of technologies with potential for improving the clinical management and the quality of life of individuals with PD. © 2016 International Parkinson and Movement Disorder Society.