951 results for third party liability


Relevance: 80.00%

Publisher:

Abstract:

Software development companies seek to reduce costs through designs that allow: (a) easy distribution of development work with minimal communication between the parties; (b) changeability, so that a module can be modified without disturbing the others; and (c) comprehensibility, so that one module of the system can be studied at a time. These elementary software design characteristics are achieved by designing nearly decomposable systems, whose theoretical model was introduced by Simon in his search for a general theory of systems. In the field of software design, Parnas proposed a practical way to achieve nearly decomposable systems: the Information Hiding Principle. The Principle is a distinct criterion for decomposing a system into modules, and its application yields the desirable characteristics of an efficient design at the level of the development and maintenance process. The Principle and the object-oriented approach are related because the object-oriented approach facilitates the implementation of the Principle; this is why, as objects gained ground, difficulties in learning object-oriented software design appeared alongside them, and they persist to this day, as reported in the literature. These learning difficulties have a great impact both in the classroom and in the profession. Detecting them allows teachers to correct or redirect them before they are carried into industry; industry, in turn, can be forewarned of potential problems in the software development process. This thesis investigates the difficulties in learning object-oriented software design through an empirical study.
The study was conducted as a qualitative case study in three parts. The first was an initial study aimed at understanding students' knowledge of the Information Hiding Principle before instruction began. The second was carried out throughout the instruction period to identify software design difficulties and their level of persistence. The third studied the essential learning difficulties and their possible origins. The participants belonged to the Software Design course of the European Master in Software Engineering at the Escuela Técnica Superior de Ingenieros Informáticos of the Universidad Politécnica de Madrid. The qualitative data used for the analysis came from observations during lectures and student presentations, interviews with the students, and exercises submitted throughout the instruction period. The difficulties presented in this thesis, from their different perspectives, provide concrete knowledge of a particular case study and make relevant contributions to software design, teaching, industry, and methodology.
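Parnas's design criterion can be illustrated with a small sketch (a hypothetical example, not code from the thesis): a module that hides its internal data representation behind a stable interface, so clients are unaffected when that representation changes.

```python
class SymbolTable:
    """Hides its storage representation behind a small, stable interface.

    Clients depend only on add() and lookup(); the internal choice of a
    dict (versus a sorted list or a trie) is a hidden design decision
    that can change without touching any client code.
    """

    def __init__(self):
        self._entries = {}  # hidden representation: swappable at will

    def add(self, name, value):
        self._entries[name] = value

    def lookup(self, name):
        return self._entries.get(name)


table = SymbolTable()
table.add("x", 42)
print(table.lookup("x"))  # -> 42
```

Replacing `self._entries` with any other associative structure changes no client code, which is exactly the modifiability and comprehensibility the abstract describes.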

Relevance: 80.00%

Publisher:

Abstract:

Ontology evaluation, which includes ontology diagnosis and repair, is a complex activity that should be carried out in every ontology development project to check the technical quality of the ontology. However, there is an important gap between the methodological work on ontology evaluation and the tools that support it. In particular, few approaches provide concrete guidance on how to diagnose ontologies and how to repair them accordingly. This thesis aims to advance the state of the art of ontology evaluation, specifically the ontology diagnosis activity. Its main goals are (a) to help ontology engineers diagnose their ontologies in order to find common pitfalls and (b) to lessen the effort this requires by providing suitable technological support. The thesis presents the following main contributions:
• A catalogue of 41 common pitfalls that ontology engineers may introduce during ontology development.
• A quality model for ontology diagnosis that aligns the pitfall catalogue with existing quality models for semantic technologies.
• The design and implementation of 48 methods for detecting 33 of the 41 pitfalls defined in the catalogue.
• OOPS! (OntOlogy Pitfall Scanner!), a system that allows ontology engineers to diagnose their ontologies (semi)automatically.
According to the feedback gathered and the satisfaction tests carried out, the approach developed and presented in this thesis effectively helps users to improve the quality of their ontologies. OOPS! has been broadly accepted by a large number of users worldwide, having been used around 3,000 times from 60 different countries. It has been integrated into third-party software and installed locally in private enterprises, where it is used both for ontology development and in training courses.
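The kind of automated check OOPS! performs can be sketched minimally as follows. This is a deliberately simplified, hypothetical illustration operating on a toy in-memory ontology, not OOPS!'s actual OWL-based implementation; the pitfall checked (a property declared without a domain or a range) is one commonly cited example.

```python
# Toy ontology: each property maps to its declared domain and range (or None).
# A stand-in for illustration only, not the model OOPS! operates on.
ontology = {
    "hasAuthor": {"domain": "Document", "range": "Person"},
    "hasTitle":  {"domain": "Document", "range": None},   # missing range
    "relatedTo": {"domain": None, "range": None},         # missing both
}

def detect_missing_domain_or_range(props):
    """Return (property, problem) pairs for one common pitfall:
    properties declared without a domain or without a range."""
    findings = []
    for name, decl in props.items():
        if decl["domain"] is None:
            findings.append((name, "missing domain"))
        if decl["range"] is None:
            findings.append((name, "missing range"))
    return findings

for prop, problem in detect_missing_domain_or_range(ontology):
    print(f"{prop}: {problem}")
```

Each of the 48 detection methods mentioned above can be thought of as one such check, run over the real ontology model rather than a dictionary.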

Relevance: 80.00%

Publisher:

Abstract:

Pervasive (ubiquitous) computing is extending from field-specific settings into everyday use; the Internet of Things (IoT) is the clearest example of its application and of its intrinsic complexity compared with classical application development. The main characteristic that differentiates pervasive computing from other forms lies in how contextual information is used. Classical applications either do not use contextual information at all or use only a small part of it, integrated ad hoc with an application-specific implementation. The reason for this one-off treatment is the difficulty of sharing context across applications. Indeed, what counts as contextual information depends on the application: for an image editor, the image is the information and its metadata, such as the time of the shot or the camera settings, are the context, whereas for a file system the image together with the camera settings is the information, and the context is the metadata external to the file, such as the modification date or the last-access timestamp. Because contextual information is therefore hard to share, a communication middleware that supports context explicitly eases application development in pervasive computing. At the same time, the use of context must not be mandatory; otherwise the middleware would lose compatibility with applications that do not use context, reducing it to a mere context middleware.
SilboPS, our implementation of content-based publish/subscribe inspired by SIENA [11, 9], solves this problem by extending the paradigm with two new elements: the context and the context function. The context represents the contextual information attached to the message being sent, or that required by the subscriber in order to receive notifications, while the context function is evaluated over the publisher's and the subscriber's contexts to decide whether the current message and context are useful to the subscriber. This decouples context management from the logic of the context function, increasing the flexibility of communication across applications. Since the default context is empty, classical and context-aware applications can share the same SilboPS, resolving the syntactic mismatch between the two categories. A possible semantic mismatch may still remain, because it depends on how each application interprets the data and cannot be resolved by an agnostic third party.
The IoT environment introduces not only context challenges but also scalability challenges. The number of sensors, the volume of data they produce and the number of applications that may want to harvest those data are growing continuously. Today's answer to this need is cloud computing, which requires applications not only to scale but to do so elastically [22]. Unfortunately, no distributed-system slicing primitive supports internal state partitioning [33] together with hot swapping, and current cloud systems such as OpenStack or OpenNebula do not provide elastic monitoring out of the box. This leaves a two-sided problem: 1) how an application can scale elastically, and 2) how to monitor that application to know when it should scale in or out.
E-SilboPS, the elastic version of SilboPS, is well suited as a solution to the monitoring problem thanks to its content-based publish/subscribe nature and, unlike other solutions [5], it scales efficiently to meet workload demand without overprovisioning or underprovisioning resources. It is based on a newly designed algorithm that shows how to add elasticity to an application under different state constraints: stateless, isolated stateful with external coordination, and shared stateful with general coordination. Its evaluation shows that remarkable speedups can be achieved, with the network layer as the main limiting factor: the calculated efficiency (see Figure 5.8) shows how each configuration performs with respect to adjacent ones, providing insight into the trend of the whole system and predicting whether the next configuration would offset its cost with the resulting gain in notification throughput. Particular attention has been paid to evaluating same-cost deployments in order to find the best one for a given workload. Finally, the overhead introduced by the different configurations has been estimated to identify the primary factor limiting throughput; this helps determine the intrinsic sequential part and the base overhead [26] of an optimal versus a suboptimal deployment. Depending on the type of workload, the estimate can be as low as 10% at a local optimum or as high as 60% when a configuration overprovisioned for the workload is deployed. This Karp-Flatt metric estimation is important for the management system because it indicates the direction (scale in or out) in which the deployment has to be changed to improve performance, instead of simply applying a scale-out policy.
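The Karp-Flatt metric mentioned above estimates the experimentally determined serial fraction of a parallel computation from the measured speedup. A minimal sketch using the standard textbook formula (not code from the thesis):

```python
def karp_flatt(speedup, p):
    """Experimentally determined serial fraction (Karp & Flatt, 1990):
    e = (1/speedup - 1/p) / (1 - 1/p)
    for a measured speedup on p processing elements."""
    if p <= 1:
        raise ValueError("need p > 1 processing elements")
    return (1.0 / speedup - 1.0 / p) / (1.0 - 1.0 / p)

# If 8 workers yield only a 4x speedup, roughly 14% of the work is serial:
print(round(karp_flatt(4.0, 8), 4))  # -> 0.1429
```

A serial-fraction estimate that rises with the number of workers signals that overhead, rather than the intrinsic sequential part alone, is limiting throughput; this is what makes the metric useful to a management system deciding whether to scale in or out.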

Relevance: 80.00%

Publisher:

Abstract:

Funding The International Primary Care Respiratory Group (IPCRG) provided funding for this research project as an UNLOCK group study for which the funding was obtained through an unrestricted grant by Novartis AG, Basel, Switzerland. The latter funders had no role in study design, data collection and analysis, decision to publish or preparation of the manuscript. Database access for the OPCRD was provided by the Respiratory Effectiveness Group (REG) and Research in Real Life; the OPCRD statistical analysis was funded by REG. The Bocholtz Study was funded by PICASSO for COPD, an initiative of Boehringer Ingelheim, Pfizer and the Caphri Research Institute, Maastricht University, The Netherlands.

Relevance: 80.00%

Publisher:

Abstract:

This study was supported by the UK Natural Environment Research Council (NE/H019456/1) to CJvdG, by the Wellcome Trust (WT 098051) to AWW and JP for sequencing costs, and by The Anna Trust (KB2008) to KDB. AWW and The Rowett Institute of Nutrition and Health, University of Aberdeen, receive core funding support from the Scottish Government Rural and Environmental Science and Analysis Service (RESAS). We thank Paul Scott, Richard Rance and the Wellcome Trust Sanger Institute’s sequencing team for generating 16S rRNA gene sequence data.

Relevance: 80.00%

Publisher:

Abstract:

This work was supported in Taipei by the Institute of Biomedical Sciences, Academia Sinica, and by grants from the Ministry of Science and Technology, Taiwan (NSC100-2321-B-001-018, NSC102-2321-B-001-056, NSC102-2320-B-001-021-MY3 and MOST104-2325-B-001-011), and in Aberdeen by the Institute of Medical Sciences, University of Aberdeen, UK. We thank Dr David J. Anderson and Dr Yoshihiro Yoshihara for providing plasmids containing cDNA of eGFP-f and WGA, respectively, and Dr John N. Wood, Dr Bai-Chuang Shyu and Dr Yu-Ting Yan for providing transgenic lines, including Nav1.8-Cre, Parvalbumin-Cre, ROSA-Gt26 reporter and CAG-STOPfloxed-GFP reporter mice. We also thank Dr Silvia Arber for providing the Parvalbumin-Cre-specific genotyping primer sequence, Dr Philip LeDuc for critical reading of the manuscript, the Transgenic Core Facility of Academia Sinica for help with the generation of the two Asic3 mutant mice, and Dr Sin-Jhong Cheng of NPAS for technical support on electrophysiology.

Relevance: 80.00%

Publisher:

Abstract:

Acknowledgements This work was funded by the Office of Naval Research (N00014-13-1-0696). We thank C Asher for her comments on an earlier version of this manuscript.

Relevance: 80.00%

Publisher:

Abstract:

We would like to thank the animal house staff and all members of the Energetics group for their invaluable help at various stages throughout the project. This work was supported by a Natural Environment Research Council grant (NERC, NE/C004159/1). YG was supported by a scholarship from the Rotary Foundation. LV was supported by a Rubicon grant from the Netherlands Organisation for Scientific Research (NWO).

Relevance: 80.00%

Publisher:

Abstract:

We thank Donna Wallace and the animal house staff for their help with the animal studies. We thank Pat Bain for help in preparing the figures. This work was supported by the Biotechnology and Biological Sciences Research Council (BBSRC), grant number BB/K001043/1 (G.H., A.W.R., P.N.S., P.J.Mc. and P.J.M.), and the Scottish Government (A.W.R., L.M.T., C.D.M. and P.J.M.).

Relevance: 80.00%

Publisher:

Abstract:

We thank Karim Gharbi and Urmi Trivedi for their assistance with RNA sequencing, carried out in the GenePool genomics facility (University of Edinburgh). We also thank Susan Fairley and Eduardo De Paiva Alves (Centre for Genome Enabled Biology and Medicine, University of Aberdeen) for help with the initial bioinformatics analysis. We thank Aaron Mitchell for kindly providing the ALS3 mutant, Julian Naglik for the gift of TR146 cells, and Jon Richardson for technical assistance. We thank the Genomics and Bioinformatics core of the Faculty of Health Sciences for next-generation sequencing and bioinformatics support, the Information and Communication Technology Office at the University of Macau for providing access to a high-performance computer, and Jacky Chan and William Pang for their expert support on that computer. Finally, we thank Amanda Veri for generating CaLC2928. M.D.L. is supported by a Sir Henry Wellcome Postdoctoral Fellowship (Wellcome Trust 096072); R.A.F. by a Wellcome Trust-Massachusetts Institute of Technology (MIT) Postdoctoral Fellowship; L.E.C. by a Canada Research Chair in Microbial Genomics and Infectious Disease and by Canadian Institutes of Health Research Grants MOP-119520 and MOP-86452; A.J.P.B. by the UK Biotechnology and Biological Sciences Research Council (BB/F00513X/1) and by the European Research Council (ERC-2009-AdG-249793-STRIFE); K.H.W. by the Science and Technology Development Fund of Macau S.A.R. (FDCT) (085/2014/A2) and the Research and Development Administrative Office of the University of Macau (SRG2014-00003-FHS); and R.T.W. by the Burroughs Wellcome Fund and NIH R15AO094406.
Data availability: RNA-sequencing data sets are available at ArrayExpress (www.ebi.ac.uk) under accession code E-MTAB-4075. ChIP-seq data sets are available at the NCBI SRA database (http://www.ncbi.nlm.nih.gov) under accession code SRP071687.
The authors declare that all other data supporting the findings of this study are available within the article and its supplementary information files, or from the corresponding author upon request.

Relevance: 80.00%

Publisher:

Abstract:

Peer reviewed

Relevance: 80.00%

Publisher:

Abstract:

Funding The IPCRG provided funding for this research project as an UNLOCK Group study for which the funding was obtained through an unrestricted grant by Novartis AG, Basel, Switzerland. Novartis has no role in study design, data collection and analysis, decision to publish or preparation of the manuscript. This study will include data from the Optimum Patient Care Research Database and is undertaken in collaboration with Optimum Patient Care and the Respiratory Effectiveness Group.

Relevance: 80.00%

Publisher:

Abstract:

Athymic mice grafted at birth with allogeneic thymic epithelium (TE) from day-10 embryos, before hematopoietic cell colonization, reconstitute normal numbers of T cells and exhibit full lifelong tolerance to skin grafts of the TE haplotype. Intravenous transfer of splenic cells from these animals to adult syngeneic athymic recipients reconstitutes the T-cell compartments and the ability to reject third-party skin grafts. The transfer of specific tolerance to skin grafts of the TE donor strain, however, is not observed in all reconstituted recipients, and the fraction of nontolerant recipients increases as the number of cells transferred decreases. Furthermore, transfer of high numbers of total or CD4+ T cells from TE chimeras to T-cell receptor anti-H-Y antigen transgenic, immunocompetent syngeneic hosts specifically hinders the rejection of skin grafts of the TE haplotype that normally occurs in such recipients. These observations demonstrate (i) that mice tolerized by allogeneic TE and bearing healthy skin grafts harbor peripheral immunocompetent T cells capable of rejecting that very same graft; and (ii) that TE selects for regulatory T cells that can inhibit the effector activities of graft-reactive cells.

Relevance: 80.00%

Publisher:

Abstract:

Work in the laboratory of PB was funded by the Scottish Government Rural and Environment Science and Analytical Services Division, the BBSRC (grant BB/M001504/1) and the British Society for Neuroendocrinology (research visit grant to IP). Work in the laboratory of SS was supported by a grant from the DFG (Ste 331/8-1). We thank Siegried Hilken, Marianne Brüning, Dr. Esther Lipokatic-Takacs and Dr. Frank Scherbarth at UVMH for technical assistance. We thank Graham Horgan of Biomathematics and Statistics Scotland for assistance with some of the statistical tests.

Relevance: 80.00%

Publisher:

Abstract:

Introduction: The prevalence of chronic diseases, especially among the older population, creates a need for longitudinal models of care. Individuals are increasingly made responsible for managing their own health through the use of monitoring devices such as glucometers and blood pressure monitors. This new reality culminates in decision-making within the home itself. Objectives: To identify the decision-making of older adults in the home monitoring of chronic conditions; to identify whether the variables sex, education and income influence decision-making; to identify older adults' perceptions of care actions at home; and to identify the difficulties and strategies involved in handling monitoring devices. Materials and methods: A quantitative, exploratory, cross-sectional study. Sample: 150 subjects aged 60 or over, without cognitive impairment or depression, who use a glucometer and/or a blood pressure monitor at home. Instruments for participant selection: (1) Mini-Mental State Examination; (2) Geriatric Depression Scale; and (3) Lawton and Brody Instrumental Activities of Daily Living Scale. Data collection: carried out in the city of Ribeirão Preto, SP, between September 2014 and October 2015. Instruments: (1) a socioeconomic questionnaire; (2) a questionnaire on decision-making in home health monitoring; and (3) a classification of the use of electronic health-care devices. Data analysis: descriptive statistics and absolute and percentage counts were used to identify the relationship between decision-making and sex, education and income. Results: 150 older adults participated, 117 women and 33 men, with a mean age of 72 years. Of these, 113 were hypertensive and 62 diabetic.
Regarding immediate decision-making, most users of both the blood pressure monitor (n=128) and the glucometer (n=62) reported seeking medical help, followed by taking the prescribed medication and by alternative treatment options. In the medium term, seeking professional help stood out for most older adults in both groups. A small difference in decision-making by sex was noted. Regarding education, older adults with more years of schooling tend to seek the health service more than those with less schooling. Income showed no influence among glucometer users, whereas among blood pressure monitor users those with higher incomes tend to seek the health service more. Most participants view home health monitoring positively, mainly because of the convenience of not leaving home, the rapid results and the possibility of continuous control of the disease. The main difficulties in handling the glucometer concern the use of the lancet and test strip, followed by checking the stored results; difficulties in using the blood pressure monitor concern checking the result after each measurement and positioning the body correctly during monitoring. In both groups the strategies used are asking others for help, and trial and error. Conclusion: Older adults have proved receptive to home health monitoring. In general, their immediate decisions involve actions within the home to control symptoms, which reinforces the need to invest in quality information and health education so that home management can become a component of comprehensive care in the treatment of chronic conditions.