975 results for patient monitoring


Relevance:

60.00%

Publisher:

Abstract:

BACKGROUND AND OBJECTIVES: Myotonic dystrophies are autosomal dominant neuromuscular diseases. Among them, myotonic dystrophy type 1 (DM1), or Steinert disease, is the most common in adults and, besides muscular involvement, has important systemic manifestations. Myotonic dystrophy type 1 poses a challenge to the anesthesiologist: these patients are more sensitive to anesthetics and prone to cardiac and pulmonary complications, and the possibility of malignant hyperthermia and myotonic episodes is also present. CASE REPORT: A 39-year-old patient with DM1 underwent general anesthesia for videolaparoscopic cholecystectomy. Total intravenous anesthesia with propofol, remifentanil, and rocuronium was the technique chosen. No complications were observed during the 90-minute surgical procedure, but after extubation the patient developed respiratory failure and myotonia, which made tracheal intubation impossible. A laryngeal mask was used, allowing adequate oxygenation, and mechanical ventilation was maintained until full recovery of respiratory function. The patient did not develop further complications. CONCLUSIONS: Myotonic dystrophy type 1 presents several particularities to the anesthesiologist. Detailed knowledge of its systemic involvement, along with the differentiated action of anesthetic drugs in these patients, will provide a safer anesthetic-surgical procedure.


We report an unusual case of verruciform xanthoma in a patient with neurofibromatosis and some clinical features of oral lichen planus. © 2010 The British Association of Oral and Maxillofacial Surgeons.


Background and objectives: The introduction of extracorporeal circulation into clinical practice was decisive for the development of modern cardiovascular surgery. The addition of new procedures and equipment, however, brings inherent risks and complications. The objective of this report is to describe a malfunction of the oxygenation system and to emphasize the importance of interaction among medical team members in preventing errors and complications. Case Report: During valve replacement and IVC correction surgery, we observed a darker shade of red in the blood at the oxygenator outlet. Laboratory tests demonstrated severe acidosis and hypoxemia. The entire system was evaluated, but the cause of the malfunction was not found. Measures to reduce damage were successfully instituted. After the surgery, the whole system underwent technical evaluation. Conclusions: Interaction among the medical team members, early diagnosis, and immediate intervention were fundamental for a favorable outcome. © 2011 Elsevier Editora Ltda.


Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)


The Internet of Things is a new paradigm in which smart embedded devices and systems are connected to the Internet. In this context, Wireless Sensor Networks (WSNs) are becoming an important alternative for sensing and actuating in critical applications such as industrial automation, remote patient monitoring, and domotics. The IEEE 802.15.4 protocol has been adopted as a standard for WSNs, and the 6LoWPAN protocol has been proposed to overcome the challenges of integrating WSNs and Internet protocols. In this paper, the mechanisms for header compression and fragmentation of IPv6 datagrams proposed in the 6LoWPAN standard were evaluated through field experiments using a gateway prototype and IEEE 802.15.4 nodes.
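The fragmentation mechanism evaluated above can be sketched in a few lines. The frame payload size and the in-order reassembly helper below are simplifying assumptions for illustration; the FRAG1/FRAGN header layout follows RFC 4944, on which the 6LoWPAN standard is based.

```python
# Minimal sketch of 6LoWPAN-style fragmentation (RFC 4944) of an IPv6
# datagram into IEEE 802.15.4-sized fragments. The frame budget and the
# lossless, in-order reassembly helper are illustrative assumptions.
import struct

MAX_FRAME_PAYLOAD = 96          # assumed room left in an 802.15.4 frame

def fragment(datagram: bytes, tag: int):
    """Split `datagram` into FRAG1/FRAGN fragments with RFC 4944 headers."""
    size = len(datagram)
    assert size < 2048, "datagram_size field is 11 bits"
    fragments = []
    # FRAG1: 5-bit pattern 11000, 11-bit size, 16-bit tag (4-byte header)
    first_len = (MAX_FRAME_PAYLOAD - 4) & ~7       # payload multiple of 8
    hdr = struct.pack("!HH", 0xC000 | size, tag)
    fragments.append(hdr + datagram[:first_len])
    offset = first_len
    while offset < size:
        # FRAGN: pattern 11100, size, tag, offset in 8-byte units (5 bytes)
        chunk = datagram[offset:offset + ((MAX_FRAME_PAYLOAD - 5) & ~7)]
        hdr = struct.pack("!HHB", 0xE000 | size, tag, offset // 8)
        fragments.append(hdr + chunk)
        offset += len(chunk)
    return fragments

def reassemble(fragments):
    """Rebuild the datagram from in-order fragments (lossless case)."""
    size = struct.unpack("!H", fragments[0][:2])[0] & 0x07FF
    buf = bytearray(size)
    buf[:len(fragments[0]) - 4] = fragments[0][4:]
    for frag in fragments[1:]:
        off = frag[4] * 8
        buf[off:off + len(frag) - 5] = frag[5:]
    return bytes(buf)
```

A full implementation must also handle out-of-order arrival, a reassembly timeout, and the header-compression step that precedes fragmentation.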


Autologous hematopoietic stem cell transplantation is a procedure used to treat some hematologic diseases and to consolidate the treatment of others. In the field of nursing, few scientific studies have been published on nursing care and early hospital discharge of transplant patients. Knowledge of the diseases treated by hematopoietic stem cell transplantation, guidance for patients and caregivers, and patient monitoring are important nursing activities in this process; guidance may contribute to long-term goals through patients' short-term needs. The objective was to analyze the results of early hospital discharge on the treatment of patients submitted to autologous transplantation and the influence of nursing care on this approach. A retrospective, quantitative, descriptive, cross-sectional study was conducted. The hospital records of 112 consecutive patients submitted to autologous transplantation from January to December 2009 were reviewed. Of these, 12 patients who remained in hospital for more than ten days after transplantation were excluded from the study. The medical records of 100 patients with a median age of 48.5 years (range: 19-69 years) were analyzed. All patients were mobilized, and hematopoietic stem cells were collected by leukapheresis. The most common conditioning regimens were BU12Mel100 and BEAM 400. Toxicity during conditioning was easily managed in the outpatient clinic. Gastrointestinal toxicity, mostly Grades I and II, was seen in 69% of the patients: 62% had diarrhea, 61% had nausea and vomiting, and 58% had Grade I and II mucositis. Ten patients required hospitalization due to the conditioning regimen. Febrile neutropenia was seen in 58% of patients. Two patients died before Day +60 due to infections, one while in aplasia.
The median times to granulocyte and platelet engraftment were 12 days and 15 days, respectively, with medians of three red blood cell units and four platelet units transfused until discharge. Twenty-three patients required rehospitalization before being discharged from the outpatient clinic. During the aplasia phase, few patients were hospitalized or suffered infections; the toxicity of the conditioning regimen was the leading cause of rehospitalization. The nursing staff participated by providing guidance to patients during the mobilization, transplant, and outpatient follow-up phases, thus helping to successfully manage toxicity.


Abstract Background: The expression of the glucocorticoid receptor (GR) seems to be a key mechanism in the regulation of glucocorticoid (GC) sensitivity and is potentially involved in cases of GC resistance or hypersensitivity. The aim of this study is to describe a method for quantitation of GR alpha isoform (GRα) expression using real-time PCR (qrt-PCR) with the analytical capabilities needed to monitor patients, offering standard-curve reproducibility as well as intra- and inter-assay precision. Results: Standard curves were constructed by employing standardized Jurkat cell culture procedures, both for GRα and for BCR (breakpoint cluster region) as a normalizing gene. We evaluated the standard curves using five different sets of cell culture passages, RNA extraction, reverse transcription, and qrt-PCR quantification. Intra-assay precision was evaluated using 12 replicates of each gene, for 2 patients, in a single experiment. Inter-assay precision was evaluated over 8 experiments, using duplicate tests of each gene for two patients. The standard curves were reproducible, with a CV (coefficient of variation) of less than 11% and Pearson correlation coefficients above 0.990 for most comparisons. Intra-assay and inter-assay CVs were 2% and 7%, respectively. Conclusion: This is the first method for quantitation of GRα expression with technical characteristics that permit patient monitoring in a fast, simple and robust way.
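The standard-curve quantitation and precision figures described above rest on two small computations, sketched here with hypothetical slope, intercept and Ct values (the paper's actual curve parameters are not given in the abstract):

```python
# Illustrative sketch of standard-curve quantitation and assay CV as
# described for the GRalpha/BCR qrt-PCR method. Slopes, intercepts and
# Ct values used with these functions are hypothetical, not the paper's.
import statistics

def quantity_from_ct(ct, slope, intercept):
    # Standard curve: Ct = slope * log10(quantity) + intercept
    return 10 ** ((ct - intercept) / slope)

def relative_expression(ct_gra, ct_bcr, curve_gra, curve_bcr):
    # Target quantity normalized by the BCR reference gene
    return (quantity_from_ct(ct_gra, *curve_gra) /
            quantity_from_ct(ct_bcr, *curve_bcr))

def cv_percent(replicates):
    # Coefficient of variation used for intra-/inter-assay precision
    return 100 * statistics.stdev(replicates) / statistics.mean(replicates)
```

With a hypothetical curve of slope -3.32 and intercept 38.0 (perfect amplification efficiency), a sample one Ct cycle unit from the intercept maps back to a ten-fold quantity difference.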


The full blood cell (FBC) count is the most common indicator of disease. At present, hematology analyzers are used for blood cell characterization, but recently there has been interest in techniques that take advantage of microscale devices and intrinsic properties of cells for increased automation and decreased cost. Microfluidic technologies offer solutions for handling and processing small volumes of blood (2-50 µL taken by finger prick) for point-of-care (PoC) applications. Several PoC blood analyzers are in use and may have applications in telemedicine, outpatient monitoring, and medical care in resource-limited settings. They have the advantage of being portable and much cheaper than traditional analyzers, which require bulky instruments and consume large amounts of reagents. The development of miniaturized point-of-care diagnostic tests may be enabled by chip-based technologies for cell separation and sorting. Many current diagnostic tests depend on fractionated blood components: plasma, red blood cells (RBCs), white blood cells (WBCs), and platelets. Specifically, white blood cell differentiation and counting provide valuable information for diagnostic purposes. For example, a low number of WBCs, called leukopenia, may be an indicator of bone marrow deficiency or failure, collagen-vascular diseases, or disease of the liver or spleen. Leukocytosis, a high number of WBCs, may be due to anemia, infectious diseases, leukemia or tissue damage. In the Laboratory of Hybrid Biodevices at the University of Southampton, a functioning micro-impedance cytometer technology for WBC differentiation and counting was developed. It can classify cells and particles on the basis of their dielectric properties, in addition to their size, without the need for labeling, in a flow format similar to that of a traditional flow cytometer.
It was demonstrated that the micro-impedance cytometer system can detect and differentiate monocytes, neutrophils and lymphocytes, the three major human leukocyte populations. The simplicity and portability of the microfluidic impedance chip offer a range of potential applications in cell analysis, including point-of-care diagnostic systems. The microfluidic device has been integrated into a sample-preparation cartridge that semi-automatically performs erythrocyte lysis before leukocyte analysis. Erythrocytes are generally lysed manually according to a specific chemical lysis protocol, but this process has been automated in the cartridge. In this research work, the chemical lysis protocol defined in patent US 5155044 A was optimized in order to improve the white blood cell differentiation and count performed by the integrated cartridge.
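As a rough illustration of the label-free classification idea, impedance events can be gated on cell size (low-frequency magnitude) and membrane opacity (the high-to-low frequency ratio). The thresholds and the direction of the opacity split below are invented for the sketch, not taken from the Southampton device:

```python
# Hypothetical gating sketch for impedance-based leukocyte
# differentiation: events are classified by size (low-frequency
# impedance magnitude) and opacity (high/low frequency ratio).
# All thresholds here are illustrative assumptions.
def classify_event(mag_lf, mag_hf):
    opacity = mag_hf / mag_lf
    if mag_lf < 0.8:                 # small cells
        return "lymphocyte"
    if opacity < 0.9:                # large cells, lower opacity
        return "monocyte"
    return "neutrophil"              # large cells, higher opacity

def differential(events):
    """Three-part WBC differential over (mag_lf, mag_hf) event pairs."""
    counts = {"lymphocyte": 0, "monocyte": 0, "neutrophil": 0}
    for lf, hf in events:
        counts[classify_event(lf, hf)] += 1
    return counts
```

In a real device the gates would be calibrated against reference beads and validated against a conventional hematology analyzer.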


Mediastinal mass syndrome remains an anaesthetic challenge that cannot be underestimated. Depending on the localization and size of the mediastinal tumour, the clinical presentation is variable, ranging from a complete lack of symptoms to severe cardiorespiratory problems. The administration of general anaesthesia can be associated with acute intraoperative or postoperative cardiorespiratory decompensation that may result in death due to tumour-related compression syndromes. The role of the anaesthesiologist, as part of the interdisciplinary treatment team, is to ensure a safe perioperative period. However, there is still no structured protocol available for perioperative anaesthetic management. The aim of this article is to summarize the genesis of and the diagnostic options for mediastinal mass syndrome and to provide a solid, detailed methodology for its safe perioperative management, based on a review of the latest literature and our own clinical experience. Proper anaesthetic management of patients with mediastinal mass syndrome begins with an assessment of preoperative status, directed foremost at establishing the localization of the tumour and, on the basis of the clinical and radiological findings, discerning whether any vital mediastinal structures are affected. We have found it helpful to assign a 'severity grade' (using a three-grade clinical classification scale: 'safe', 'uncertain', 'unsafe'), whereby each grade triggers appropriate action in terms of staffing and apparatus, such as the provision of alternatives for airway management, cardiopulmonary bypass and additional specialists. During the preoperative period, we are guided by a 12-point plan that also takes into account the special features of transportation into the operating theatre and patient monitoring.
Tumour compression on the airways or the great vessels may create a critical respiratory and/or haemodynamic situation, and therefore the standard of intraoperative management includes induction of anaesthesia in the operating theatre on an adjustable surgical table, the use of short-acting anaesthetics, avoidance of muscle relaxants and maintenance of spontaneous respiration. In the case of severe clinical symptoms and large mediastinal tumours, we consider it absolutely essential to cannulate the femoral vessels preoperatively under local anaesthesia and to provide for the availability of cardiopulmonary bypass in the operating theatre, should extracorporeal circulation become necessary. The benefits of establishing vascular access under local anaesthesia clearly outweigh any associated degree of patient discomfort. In the case of patients classified as 'safe' or 'uncertain', a preoperative consensus with the surgeons should be reached as to the anaesthetic approach and the management of possible complications.


Background: WHO's 2013 revisions to its Consolidated Guidelines on antiretroviral drugs recommend routine viral load monitoring, rather than clinical or immunological monitoring, as the preferred monitoring approach on the basis of clinical evidence. However, HIV programmes in resource-limited settings require guidance on the most cost-effective use of resources in view of other competing priorities such as expansion of antiretroviral therapy coverage. We assessed the cost-effectiveness of alternative patient monitoring strategies. Methods: We evaluated a range of monitoring strategies, including clinical, CD4 cell count, and viral load monitoring, alone and together, at different frequencies and with different criteria for switching to second-line therapies. We used three independently constructed and validated models simultaneously. We estimated costs on the basis of resource use projected in the models and associated unit costs; we quantified impact as disability-adjusted life years (DALYs) averted. We compared alternatives using incremental cost-effectiveness analysis. Findings: All models show that clinical monitoring delivers significant benefit compared with a hypothetical baseline scenario with no monitoring or switching. Regular CD4 cell count monitoring confers a benefit over clinical monitoring alone, at an incremental cost that makes it affordable in more settings than viral load monitoring, which is currently more expensive. Viral load monitoring without CD4 cell count every 6-12 months provides the greatest reductions in morbidity and mortality, but incurs a high cost per DALY averted, resulting in lost opportunities to generate health gains if implemented instead of increasing antiretroviral therapy coverage or expanding antiretroviral therapy eligibility.
Interpretation: The priority for HIV programmes should be to expand antiretroviral therapy coverage, firstly at a CD4 cell count lower than 350 cells per μL, and then at a CD4 cell count lower than 500 cells per μL, using lower-cost clinical or CD4 monitoring. At current costs, viral load monitoring should be considered only after high antiretroviral therapy coverage has been achieved. Point-of-care technologies and other factors reducing costs might make viral load monitoring more affordable in future. Funding: Bill & Melinda Gates Foundation, WHO.
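The incremental comparison underlying these findings can be sketched as follows; the costs and DALYs averted are made-up illustrative numbers, not outputs of the study's three models:

```python
# Hedged sketch of the incremental cost-effectiveness comparison used
# to rank monitoring strategies. All figures are invented for
# illustration; only the ordering mirrors the abstract's conclusion.
def icer(strategy, comparator):
    """Incremental cost per DALY averted versus a comparator strategy."""
    d_cost = strategy["cost"] - comparator["cost"]
    d_dalys = strategy["dalys_averted"] - comparator["dalys_averted"]
    return d_cost / d_dalys

clinical = {"cost": 1_000_000, "dalys_averted": 4_000}   # hypothetical
cd4      = {"cost": 1_600_000, "dalys_averted": 4_500}   # hypothetical
viral    = {"cost": 3_200_000, "dalys_averted": 4_800}   # hypothetical

# Strategies ordered by effectiveness; each ICER is taken against the
# next-less-effective option, as in standard incremental analysis.
icer_cd4 = icer(cd4, clinical)    # cost per extra DALY from adding CD4
icer_viral = icer(viral, cd4)     # far higher cost per DALY averted
```

With these illustrative inputs, adding CD4 monitoring buys each extra DALY far more cheaply than stepping up to viral load monitoring, which is the shape of the trade-off the abstract describes.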


OBJECTIVE: To evaluate the treatment response of hepatocellular carcinoma (HCC) after transarterial chemoembolization (TACE) with a new real-time imaging fusion technique combining contrast-enhanced ultrasound (CEUS) with multi-slice computed tomography (CT), in comparison to conventional post-interventional follow-up. MATERIAL AND METHODS: 40 patients with HCC (26 male, ages 46-81 years) were evaluated 24 hours after TACE using CEUS with ultrasound volume navigation and image fusion with CT, compared to non-enhanced CT and follow-up contrast-enhanced CT after 6-8 weeks. Reduction of tumor vascularization to less than 25% was regarded as "successful" treatment, whereas reduction to levels >25% was considered a "partial" treatment response. Homogeneous lipiodol retention was regarded as successful treatment on non-enhanced CT. RESULTS: Post-interventional image fusion of CEUS with CT was feasible in all 40 patients. In 24 patients (24/40), post-interventional image fusion with CEUS revealed residual tumor vascularity, which was confirmed by contrast-enhanced CT 6-8 weeks later in 24/24 patients. In 16 patients (16/40), post-interventional image fusion with CEUS demonstrated successful treatment, but follow-up CT detected residual viable tumor in 6/16. Non-enhanced CT did not identify any case of treatment failure. Image fusion with CEUS assessed treatment efficacy with a specificity of 100%, a sensitivity of 80% and a positive predictive value of 1 (negative predictive value 0.63). CONCLUSIONS: Image fusion of CEUS with CT allows a reliable, highly specific post-interventional evaluation of embolization response with good sensitivity and without any further radiation exposure. It can detect residual viable tumor at an early stage, enabling close patient monitoring or re-therapy.
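The reported accuracy figures follow directly from the counts in the abstract, arranged as a 2x2 table against the follow-up CT reference:

```python
# Reproducing the abstract's accuracy figures from its counts:
# 24/24 CEUS-positive patients confirmed by follow-up CT, and among the
# 16 CEUS-negative patients, 6 residual tumours found at follow-up.
tp, fp = 24, 0          # fusion CEUS positive; all confirmed (no false alarms)
fn, tn = 6, 10          # fusion CEUS negative; 6 missed, 10 true successes

sensitivity = tp / (tp + fn)     # 24/30 = 0.80
specificity = tn / (tn + fp)     # 10/10 = 1.00
ppv = tp / (tp + fp)             # 24/24 = 1.00
npv = tn / (tn + fn)             # 10/16 = 0.625 (reported as 0.63)
```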


BACKGROUND: The number of colonoscopies has increased tremendously in recent years and will rise further in the near future. Given patients' growing expectation of comfort during medical procedures, it is not surprising that the demand for sedation is also expanding. Propofol in combination with alfentanil is known to provide excellent analgosedation; however, its use is associated with respiratory and cardiovascular depression. Acupuncture could be a technique to reduce drug requirements while providing the same level of sedation and analgesia. METHODS/DESIGN: The study will be performed as a single-centre, randomised, placebo-controlled trial. 153 patients scheduled for propofol/alfentanil sedation during colonoscopy will be randomly assigned to receive electroacupuncture (P6, ST36, LI4), sham acupuncture, or placebo acupuncture. Following endoscopy, patients and gastroenterologists will fill in questionnaires about their sedation experiences. Additionally, patients will complete the Trieger test before and after the procedure. Patient monitoring includes time-adapted HR, SpO2, ECG, NIBP, exCO2, OAA/S, and the Aldrete score. The primary outcome parameter is the dosage of propofol necessary for an adequate level of sedation to tolerate the procedure (OAA/S < 4). Effectiveness of sedation, classified by satisfaction levels measured by questionnaires, is the secondary outcome parameter. DISCUSSION: Moderate to deep sedation using propofol is increasingly applied during colonoscopies, with a high satisfaction level among patients despite the well-known hemodynamic and respiratory side effects of this hypnotic agent. Acupuncture is known to attenuate gastrointestinal discomfort and pain. We hypothesize that combining conventional sedation techniques with acupuncture may result in equally satisfied patients with a lower risk of respiratory and hemodynamic events during colonoscopies.
TRIAL REGISTRATION: This trial is registered in the Netherlands Trial Register, NTR 4325. The first patient was randomized on 13 February 2014.


The role of clinical chemistry has traditionally been to evaluate acutely ill or hospitalized patients. Traditional statistical methods have serious drawbacks in that they use univariate techniques. To demonstrate alternative methodology, a multivariate analysis of covariance model was developed and applied to the data from the Cooperative Study of Sickle Cell Disease (CSSCD). The purpose of developing the model for the laboratory data from the CSSCD was to evaluate the comparability of the results from the different clinics. Several variables were incorporated into the model in order to control for possible differences among the clinics that might confound any real laboratory differences. Differences for LDH, alkaline phosphatase and SGOT were identified which will necessitate adjustments by clinic whenever these data are used. In addition, aberrant clinic values for LDH, creatinine and BUN were also identified. The use of any statistical technique, including multivariate analysis, without thoughtful consideration may lead to spurious conclusions that may not be corrected for some time, if ever. However, the advantages of multivariate analysis far outweigh its potential problems. If its use increases as it should, its applicability to the analysis of laboratory data in prospective patient monitoring, quality control programs, and the interpretation of data from cooperative studies could well have a major impact on the health and well-being of a large number of individuals.
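The advantage of testing laboratory variables jointly rather than one at a time can be illustrated with Hotelling's T-squared, a simple multivariate analogue of the t-test that accounts for correlation between analytes (the two-analyte clinic data used with this sketch are invented; the study itself used a multivariate analysis of covariance):

```python
# Sketch of a two-group, two-analyte multivariate comparison via
# Hotelling's T-squared. Separate univariate t-tests would ignore the
# covariance between the analytes; the quadratic form below uses it.
def mean_vec(rows):
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(2)]

def pooled_cov(a, b):
    ma, mb = mean_vec(a), mean_vec(b)
    s = [[0.0, 0.0], [0.0, 0.0]]
    for rows, m in ((a, ma), (b, mb)):
        for r in rows:
            d = [r[0] - m[0], r[1] - m[1]]
            for i in range(2):
                for j in range(2):
                    s[i][j] += d[i] * d[j]
    dof = len(a) + len(b) - 2
    return [[s[i][j] / dof for j in range(2)] for i in range(2)]

def hotelling_t2(a, b):
    """T^2 = (na*nb/(na+nb)) * d' S^-1 d for two 2-variate samples."""
    ma, mb = mean_vec(a), mean_vec(b)
    c = pooled_cov(a, b)
    det = c[0][0] * c[1][1] - c[0][1] * c[1][0]
    inv = [[c[1][1] / det, -c[0][1] / det],
           [-c[1][0] / det, c[0][0] / det]]
    d = [ma[0] - mb[0], ma[1] - mb[1]]
    quad = sum(d[i] * inv[i][j] * d[j] for i in range(2) for j in range(2))
    na, nb = len(a), len(b)
    return (na * nb) / (na + nb) * quad
```

Comparing, say, (LDH, SGOT) pairs from two clinics this way would flag a joint shift even when neither analyte differs convincingly on its own.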


Advances in hardware allow huge volumes of data to be collected, and applications are emerging that must provide information in near-real time, e.g., patient monitoring or health monitoring of water pipes. The data streaming model emerges to serve these applications, overcoming the traditional store-then-process model. With the store-then-process model, data is stored before being consulted; in streaming, data is processed on the fly, producing continuous responses without being stored in its entirety. This view imposes challenges for processing data on the fly: 1) responses must be produced continuously whenever new data arrives in the system; 2) data is accessed only once and is generally not retained in its entirety; and 3) the processing time per data item needed to produce a response must be low. Two models exist for computing continuous responses: the evolving model and the sliding-window model; the latter fits better in applications that must compute over only the most recently received data rather than the whole history. In recent years, research on data stream mining has focused mainly on the evolving model. In the sliding-window model, the body of work is smaller, since these algorithms must not only be incremental but must also delete the information that expires as the window slides, while still meeting the three challenges above. Clustering is one of the fundamental tasks in data mining: given a data set, the goal is to find representative groups that provide a concise description of the data being processed. Clustering is critical in applications such as network intrusion detection or customer segmentation in marketing and advertising.
Due to the huge amounts of data that must be processed by such applications (up to millions of events per second), centralized solutions are usually unable to cope with the timing restrictions and resort to shedding techniques, discarding data during load peaks. To avoid discarding data, stream processing (including clustering) must be distributed and adapted to environments where the information itself is distributed. In streaming, research does not only focus on designs for general tasks, such as clustering, but also on finding new approaches that fit particular scenarios better. As an example, an ad-hoc grouping mechanism turns out to be more adequate than traditional k-means for defense against Distributed Denial of Service (DDoS) attacks. This thesis contributes to stream clustering in both centralized and distributed environments. We present a centralized clustering algorithm that discovers clusters of high quality in low time, and we provide a comparison with existing state-of-the-art solutions in an extensive evaluation. We have also worked on a data structure that significantly reduces memory requirements while controlling the error of the cluster statistics. In addition, we provide two distributed clustering protocols, focusing on the analysis of two key features: the impact on clustering quality when computation is distributed, and the requirements for reducing processing time compared to the centralized solution. Finally, with respect to ad-hoc grouping techniques, we have developed a DDoS detection framework based on clustering. We have characterized the attacks detected and evaluated the efficiency and effectiveness of mitigating the attack impact.
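A minimal sketch of the sliding-window clustering setting described above: each point is processed once on arrival, assigned to the nearest centre (or opens a new one), and its contribution is deleted when it slides out of the window. The radius and window length are illustrative parameters, not the thesis's algorithm.

```python
# Sketch of single-pass, sliding-window clustering on a 1-D stream.
# Cluster statistics (count, sum) are updated incrementally on arrival
# and decremented when the oldest point expires from the window.
from collections import deque

class SlidingWindowClusterer:
    def __init__(self, window=100, radius=5.0):
        self.window = deque()            # (point, cluster_id), FIFO
        self.size = window
        self.radius = radius
        self.clusters = {}               # id -> [count, sum]
        self.next_id = 0

    def _nearest(self, x):
        best = None
        for cid, (n, s) in self.clusters.items():
            d = abs(x - s / n)
            if d <= self.radius and (best is None or d < best[1]):
                best = (cid, d)
        return best[0] if best else None

    def add(self, x):
        cid = self._nearest(x)
        if cid is None:                  # no centre in range: new cluster
            cid = self.next_id
            self.next_id += 1
            self.clusters[cid] = [0, 0.0]
        self.clusters[cid][0] += 1
        self.clusters[cid][1] += x
        self.window.append((x, cid))
        if len(self.window) > self.size:     # expire the oldest point
            old_x, old_cid = self.window.popleft()
            c = self.clusters[old_cid]
            c[0] -= 1
            c[1] -= old_x
            if c[0] == 0:
                del self.clusters[old_cid]

    def centres(self):
        return sorted(s / n for n, s in self.clusters.values())
```

Note how a cluster whose last point expires disappears entirely: this is the deletion-on-slide requirement that distinguishes the sliding-window model from the evolving model.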


Hoy en día asistimos a un creciente interés por parte de la sociedad hacia el cuidado de la salud. Esta afirmación viene apoyada por dos realidades. Por una parte, el aumento de las prácticas saludables (actividad deportiva, cuidado de la alimentación, etc.). De igual manera, el auge de los dispositivos inteligentes (relojes, móviles o pulseras) capaces de medir distintos parámetros físicos como el pulso cardíaco, el ritmo respiratorio, la distancia recorrida, las calorías consumidas, etc. Combinando ambos factores (interés por el estado de salud y disponibilidad comercial de dispositivos inteligentes) están surgiendo multitud de aplicaciones capaces no solo de controlar el estado actual de salud, también de recomendar al usuario cambios de hábitos que lleven hacia una mejora en su condición física. En este contexto, los llamados dispositivos llevables (weareables) unidos al paradigma de Internet de las cosas (IoT, del inglés Internet of Things) permiten la aparición de nuevos nichos de mercado para aplicaciones que no solo se centran en la mejora de la condición física, ya que van más allá proponiendo soluciones para el cuidado de pacientes enfermos, la vigilancia de niños o ancianos, la defensa y la seguridad, la monitorización de agentes de riesgo (como bomberos o policías) y un largo etcétera de aplicaciones por llegar. El paradigma de IoT se puede desarrollar basándose en las existentes redes de sensores inalámbricos (WSN, del inglés Wireless Sensor Network). La conexión de los ya mencionados dispositivos llevables a estas redes puede facilitar la transición de nuevos usuarios hacia aplicaciones IoT. Pero uno de los problemas intrínsecos a estas redes es su heterogeneidad. En efecto, existen multitud de sistemas operativos, protocolos de comunicación, plataformas de desarrollo, soluciones propietarias, etc. 
El principal objetivo de esta tesis es realizar aportaciones significativas para solucionar no solo el problema de la heterogeneidad, sino también de dotar de mecanismos de seguridad suficientes para salvaguardad la integridad de los datos intercambiados en este tipo de aplicaciones. Algo de suma importancia ya que los datos médicos y biométricos de los usuarios están protegidos por leyes nacionales y comunitarias. Para lograr dichos objetivos, se comenzó con la realización de un completo estudio del estado del arte en tecnologías relacionadas con el marco de investigación (plataformas y estándares para WSNs e IoT, plataformas de implementación distribuidas, dispositivos llevables y sistemas operativos y lenguajes de programación). Este estudio sirvió para tomar decisiones de diseño fundamentadas en las tres contribuciones principales de esta tesis: un bus de servicios para dispositivos llevables (WDSB, Wearable Device Service Bus) basado en tecnologías ya existentes tales como ESB, WWBAN, WSN e IoT); un protocolo de comunicaciones inter-dominio para dispositivos llevables (WIDP, Wearable Inter-Domain communication Protocol) que integra en una misma solución protocolos capaces de ser implementados en dispositivos de bajas capacidades (como lo son los dispositivos llevables y los que forman parte de WSNs); y finalmente, la tercera contribución relevante es una propuesta de seguridad para WSN basada en la aplicación de dominios de confianza. Aunque las contribuciones aquí recogidas son de aplicación genérica, para su validación se utilizó un escenario concreto de aplicación: una solución para control de parámetros físicos en entornos deportivos, desarrollada dentro del proyecto europeo de investigación “LifeWear”. 
In this scenario, all the elements needed to validate the main contributions of this thesis were deployed; in addition, a mobile application was developed by one of the project partners (which provided an external validation of the solution). The wearable devices used in this scenario included a smartwatch, a smartphone running the Android operating system, and a wireless heart-rate monitor able to obtain several of the athlete's physiological parameters. Various validation tests were run on this scenario, yielding satisfactory results. ABSTRACT Nowadays, society is showing a growing interest in and concern for health care. This phenomenon can be acknowledged by two facts: first, the increasing number of people practising some kind of healthy activity (sports, a balanced diet, etc.); secondly, the growing number of commercial wearable smart devices (smartwatches or bands) able to measure physiological parameters such as heart rate, breathing rate, distance, or calories burned. A large number of applications combining both facts are appearing. These applications are not only able to monitor the health status of the user, but also to provide recommendations about routines in order to improve that health status. In this context, wearable devices merged with the Internet of Things (IoT) paradigm enable the proliferation of new market segments for these wearable-based health applications. Furthermore, these applications can provide solutions for elderly or baby care, in-hospital or in-home patient monitoring, the security and defence fields, or an unforeseen number of future applications. The introduced IoT paradigm can be developed on top of existing Wireless Sensor Networks (WSNs) by connecting the novel wearable devices to them. In this way, the migration of new users and actors to the IoT environment will be eased.
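As an illustration of the kind of data such a wearable pushes onto the network, the following minimal sketch serializes a hypothetical heart-rate reading as compact JSON. The field names, units, and helper functions are illustrative assumptions, not the actual message format used in the LifeWear project:

```python
import json

# Hypothetical, minimal message format for a wearable heart-rate reading.
# Short keys keep the payload small for low-capacity wearable devices.
def encode_reading(device_id, metric, value, unit, timestamp):
    """Serialize a sensor reading as a compact JSON string (no whitespace)."""
    message = {
        "dev": device_id,   # wearable identifier
        "m": metric,        # measured parameter, e.g. "hr" for heart rate
        "v": value,         # measured value
        "u": unit,          # unit of measurement
        "t": timestamp,     # Unix epoch seconds
    }
    return json.dumps(message, separators=(",", ":"))

def decode_reading(payload):
    """Parse a JSON payload back into a dictionary."""
    return json.loads(payload)

payload = encode_reading("band-01", "hr", 128, "bpm", 1338508800)
reading = decode_reading(payload)
```

A payload of this size fits comfortably inside a single frame of the low-power radio links typically used by WSN nodes, which is the kind of constraint that motivates lightweight encodings.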
However, a major issue appears in this environment: heterogeneity. In fact, there is a large number of operating systems, hardware platforms, communication and application protocols, and programming languages, each with unique features. The main objective of this thesis is to define and implement a solution for intelligent service management in wearable and ubiquitous devices, so as to solve the heterogeneity issues that arise when dealing with the interoperability and interconnectivity of devices and software of different natures. Additionally, a security schema based on trust domains is proposed as a solution to the privacy problems that arise when private data (e.g., biomedical parameters or user identification) are broadcast over a wireless network. The proposal was made after a comprehensive state-of-the-art analysis, and includes the design of a Wearable Device Service Bus (WDSB) built on the technologies collected in the requirements analysis (ESB, WWBAN, WSN, and IoT). Applications are able to access WSN services regardless of the platform and operating system on which they run. Besides, the proposal also includes the design of a Wearable Inter-Domain communication Protocol set (WIDP), which integrates lightweight protocols suitable for low-capacity devices (REST, JSON, AMQP, CoAP, etc.). Furthermore, a security solution for service management, based on a trust-domain model for deploying security services in WSNs, has been designed. Although the proposal is a generic framework for applications based on services provided by wearable devices, an application scenario has been included for testing purposes. This validation scenario presents an autonomous physical-condition monitoring system, based on a WSN, that brings together several elements of an IoT scenario: a smartwatch, a physiological monitoring device, and a smartphone.
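As a rough illustration of the trust-domain idea, the sketch below authenticates each reading with a key shared by the nodes of one domain, so that messages from outside the domain can be rejected. The shared key, the HMAC-SHA256 choice, and the function names are assumptions made for illustration only; they are not the mechanism designed in the thesis, and key distribution is deliberately left out of scope:

```python
import hashlib
import hmac

# Illustrative shared secret for one trust domain (an assumption; a real
# deployment would provision per-domain keys through a secure channel).
DOMAIN_KEY = b"shared-secret-of-the-trust-domain"

def sign(payload: bytes, key: bytes = DOMAIN_KEY) -> bytes:
    """Compute an HMAC-SHA256 tag over a message payload."""
    return hmac.new(key, payload, hashlib.sha256).digest()

def verify(payload: bytes, tag: bytes, key: bytes = DOMAIN_KEY) -> bool:
    """Constant-time check that the tag was produced with the domain key."""
    return hmac.compare_digest(sign(payload, key), tag)

reading = b'{"dev":"band-01","m":"hr","v":128}'
tag = sign(reading)
accepted = verify(reading, tag)                        # same domain
rejected = verify(reading, sign(reading, b"foreign"))  # different domain
```

Symmetric message authentication of this kind is a common fit for WSN nodes, since it avoids the cost of public-key operations on constrained hardware.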
In summary, the general objective of this thesis is to solve the heterogeneity and security challenges that arise when developing applications for WSNs and wearable devices. As presented in the thesis, the proposed solution was successfully validated in a real scenario, and the results obtained were satisfactory.