Abstract:
Objective. To evaluate the host risk factors associated with rifamycin-resistant Clostridium difficile (C. diff) infection in hospitalized patients compared with rifamycin-susceptible C. diff infection.

Background. C. diff is the most common definable cause of nosocomial diarrhea, affecting elderly hospitalized patients taking antibiotics for prolonged durations. The epidemiology of C. diff-associated disease is changing with reports of a new hypervirulent strain causing hospital outbreaks; this strain is associated with increased disease severity and mortality. Conventional therapy for C. diff includes metronidazole and vancomycin, but high recurrence rates and treatment failures are becoming a major concern. Rifamycin antibiotics are being developed as a new therapeutic option for C. diff infection after their efficacy was established in a few in vivo and in vitro studies. Some recent studies report an association between the hypervirulent strain and emerging rifamycin resistance. These findings underscore the need for clinical studies to better understand the efficacy of rifamycin drugs against C. diff.

Methods. This is a hospital-based, matched case-control study using de-identified data drawn from two prospective cohort studies involving C. diff patients at St Luke's Hospital. The C. diff isolates from these patients were screened for rifamycin resistance using agar dilution methods for minimum inhibitory concentrations (MIC) as part of Dr Zhi-Dong Jiang's study. Twenty-four rifamycin-resistant C. diff cases were identified, and each was matched with one rifamycin-susceptible C. diff control on the basis of age (±10 years) and hospitalization within 30 days before or after the case. De-identified data for the 48 subjects were obtained from Dr Kevin Garey's clinical study enrolling C. diff patients at St Luke's Hospital and reviewed to gather information about host risk factors, outcome variables, and relevant clinical characteristics.

Results. Medical diagnosis at the time of admission (p = 0.0281) and history of chemotherapy (p = 0.022) were identified as significant risk factors, while hospital stay ranging from 1 week to 1 month and artificial feeding were identified as important outcome variables (p = 0.072 and p = 0.081, respectively). Horn's Index, assessing the severity of underlying illness, and duration of antibiotics showed no significant difference between cases and controls.

Conclusion. This was a small project designed to identify host risk factors and understand the clinical implications of rifamycin resistance. The study was underpowered, and a larger sample size is needed to validate the results.
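The 1:1 matching rule in the Methods can be made concrete with a small sketch. The field names (`age`, `admit_date`) and data layout below are hypothetical illustrations of the stated ±10-year and ±30-day criteria, not the study's actual code.

```python
from datetime import timedelta

def is_eligible_control(case, control):
    """A control matches a case if ages differ by <= 10 years and
    hospitalization dates fall within 30 days of each other."""
    return (abs(case["age"] - control["age"]) <= 10 and
            abs(case["admit_date"] - control["admit_date"]) <= timedelta(days=30))

def match_controls(cases, candidates):
    """Greedy 1:1 matching: each case takes the first unused eligible control."""
    matched, used = [], set()
    for case in cases:
        for i, ctrl in enumerate(candidates):
            if i not in used and is_eligible_control(case, ctrl):
                matched.append((case, ctrl))
                used.add(i)
                break
    return matched
```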
Abstract:
Human lipocalin 2 is also described as neutrophil gelatinase-associated lipocalin (NGAL). The lipocalin 2 gene encodes a small, secreted glycoprotein that possesses a variety of functions, of which the best characterized is organic iron-binding activity. Elevated NGAL expression has been observed in many human cancers, including breast, colorectal, pancreatic and ovarian cancers. I focused on the characterization of NGAL function in chronic myelogenous leukemia (CML) and breast cancer. Using a leukemic xenograft mouse model, we demonstrated that over-expression of NGAL in K562 cells, a leukemic cell line, led to a higher apoptotic rate and an atrophy phenotype in the spleen of inoculated mice compared to K562 cells alone. These results indicate that NGAL plays a primary role in suppressing hematopoiesis by inducing apoptosis within normal hematopoietic cells. In the breast cancer project, we analyzed two microarray data sets of breast cancer cell lines (n = 54) and primary breast cancer samples (n = 318), and demonstrated that high NGAL expression is significantly correlated with several tumor characteristics, including negative estrogen receptor (ER) status, positive HER2 status, high tumor grade, and lymph node metastasis. Ectopic NGAL expression in non-aggressive (ZR75.1 and MCF7) cells led to aggressive tumor phenotypes in vitro and in vivo. Conversely, knockdown of NGAL expression in various breast cancer cell lines by shRNA lentiviral infection significantly decreased the migration, invasion, and metastatic activities of tumor cells both in vitro and in vivo. It has been previously reported that transgenic mice with a mutation (V664E) in the transmembrane domain of HER2 develop mammary tumors that progress to lung metastasis. We observed that genetic deletion of the 24p3 gene, a mouse homolog of NGAL, in these HER2 transgenic mice by breeding with 24p3-null mice resulted in a significant delay of mammary tumor formation and a reduction of lung metastasis. Strikingly, we also found that treatment with affinity-purified 24p3 antibodies in 4T1 breast cancer mice strongly reduced lung metastasis. Our studies provide evidence that NGAL plays a critical role in breast cancer development and progression, and thus has potential as a new therapeutic target in breast cancer.
Abstract:
The most common molecular alterations observed in prostate cancer are increased bcl-2 protein expression and mutations in p53. Understanding the molecular alterations associated with prostate cancer is critical for successful treatment and for designing new therapeutic interventions. Hormone-ablation therapy remains the most effective nonsurgical treatment; however, most patients will relapse with hormone-independent, refractory disease. This study addresses how hormone-ablation therapy may increase bcl-2, develops a transgenic model to elucidate the role of bcl-2 in multistep prostate carcinogenesis, and assesses how bcl-2 may confer resistance to cell death induced by adenoviral wild-type p53 gene therapy.

Two potential androgen response elements were identified in the bcl-2 promoter. Bcl-2 promoter luciferase constructs were transfected into the hormone-sensitive LNCaP prostate cell line. In the presence of dihydrotestosterone (DHT), the activity of one bcl-2 promoter luciferase construct was repressed 40% compared to control cells grown in charcoal-stripped serum. Additionally, it was demonstrated that both bcl-2 mRNA and protein were downregulated in LNCaP cells grown in the presence of DHT. This suggests that DHT represses bcl-2 expression through possible direct and indirect mechanisms and that hormone-ablation therapy may actually increase bcl-2 protein.

To determine the role of bcl-2 in prostate cancer progression in vivo, probasin-bcl-2 mice were generated in which human bcl-2 was targeted to the prostate. Increased bcl-2 expression rendered the ventral prostate more resistant to apoptosis induction following castration. When the probasin-bcl-2 mice were crossed with TRAMP mice, the latency to tumor formation was decreased. The expression of bcl-2 in the double transgenic mice did not affect the incidence of metastases. The double transgenic model will facilitate the study of the in vivo effects of specific genetic lesions during the pathogenesis of prostate cancer.

The effects of increased bcl-2 protein on adenoviral wild-type p53-mediated cell death were determined in prostatic cell lines. Increased bcl-2 protected the PC3 and DU145 cell lines, which possess mutant p53, from p53-mediated cell death and reductions in cell viability. Bcl-2 did not provide the same protective effect in the LNCaP cell line, which expresses wild-type p53. This suggests that the ability of bcl-2 to protect against p53-mediated cell death is dependent upon the endogenous status of p53.
Abstract:
Differences in gene expression patterns have been documented not only between multiple sclerosis patients and healthy controls but also in the relapse stage of the disease. Recently a new gene expression modulator has been identified: the microRNA, or miRNA. The aim of this work is to analyze the possible role of miRNAs in multiple sclerosis, focusing on the relapse stage. We analyzed the expression patterns of 364 miRNAs in PBMCs obtained from multiple sclerosis patients in relapse, patients in remission, and healthy controls. The expression patterns of the miRNAs with significantly different expression were validated in an independent set of samples. In order to determine the effect of these miRNAs, the expression of some of their predicted target genes was studied by qPCR. Gene interaction networks were constructed in order to obtain a co-expression and multivariate view of the experimental data. The data analysis and subsequent validation reveal that two miRNAs (hsa-miR-18b and hsa-miR-599) may be relevant at the time of relapse and that another miRNA (hsa-miR-96) may be involved in remission. The genes targeted by hsa-miR-96 are involved in immunological pathways such as interleukin signaling and in other pathways such as Wnt signaling. This work highlights the importance of miRNA expression in the molecular mechanisms implicated in the disease. Moreover, the proposed involvement of these small molecules in multiple sclerosis opens up a new therapeutic approach to explore and highlights some candidate biomarker targets in MS.
Abstract:
This PhD thesis contributes to the problem of resource and service discovery in the context of the composable web. In the current web, mashup technologies allow developers to reuse services and contents to build new web applications. However, developers face information overload when searching for appropriate services or resources to combine. To help overcome this problem, a framework for the discovery of services and resources is defined. In this framework, discovery is performed at three levels: content, service, and agent. The content level involves the information available in web resources. The web follows the Representational State Transfer (REST) architectural style, in which resources are returned as representations from servers to clients. These representations usually employ the HyperText Markup Language (HTML), which, along with Cascading Style Sheets (CSS), describes the markup employed to render representations in a web browser. Although the use of Semantic Web standards such as the Resource Description Framework (RDF) makes this architecture suitable for automatic processes to use the information present in web resources, these standards are too often not employed, so automation must rely on processing HTML. This process, often referred to as Screen Scraping in the literature, constitutes content discovery in the proposed framework. At this level, discovery rules indicate how the different pieces of data in resources' representations are mapped onto semantic entities. By processing discovery rules on web resources, semantically described contents can be obtained from them. The service level involves the operations that can be performed on the web. The current web allows users to perform different tasks such as search, blogging, e-commerce, or social networking. To describe the possible services in RESTful architectures, a high-level, feature-oriented service methodology is proposed at this level. This lightweight description framework allows defining service discovery rules to identify operations in interactions with REST resources. Discovery is thus performed by applying discovery rules to contents discovered in REST interactions, in a novel process called service probing. Service discovery can also be performed by modelling services as contents, i.e., by retrieving Application Programming Interface (API) documentation and API listings in service registries such as ProgrammableWeb. For this, a unified model for composable components in Mashup-Driven Development (MDD) has been defined after the analysis of service repositories from the web. The agent level involves the orchestration of the discovery of services and contents. At this level, agent rules allow specifying behaviours for crawling and executing services, resulting in the fulfilment of a high-level goal. Agent rules are plans that introspect the data and services discovered from the web, as well as the knowledge present in service and content discovery rules, to anticipate the contents and services to be found on specific web resources. By defining plans, an agent can be configured to target specific resources. The discovery framework has been evaluated on different scenarios, each covering different levels of the framework. The Contenidos a la Carta project deals with mashing up news from electronic newspapers; here the framework was used for the discovery and extraction of news items from the web.
Similarly, the Resulta and VulneraNET projects cover the discovery of ideas and of security knowledge on the web, respectively. The service level is covered in the OMELETTE project, where mashup components such as services and widgets are discovered in component repositories on the web. The agent level is applied to the crawling of services and news in these scenarios, highlighting how the semantic description of rules and extracted data can support complex behaviours and orchestrations of tasks on the web. The main contributions of the thesis are the unified discovery framework, which allows configuring agents to perform automated tasks; a scraping ontology defined for the construction of mappings for scraping web resources; a novel first-order logic rule induction algorithm for the automated construction and maintenance of these mappings out of the visual information in web resources; and a common unified model for the discovery of services, which allows sharing service descriptions. Future work comprises the further extension of service probing, resource ranking, the extension of the scraping ontology, extensions of the agent model, and constructing a base of discovery rules.
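As an illustration of content discovery, the following is a minimal sketch assuming a hypothetical rule format that pairs CSS selectors with semantic property names; the thesis's actual scraping ontology and rule-induction algorithm are richer than this.

```python
# Hypothetical content discovery rule: CSS selectors mapped onto semantic
# properties of an entity type. Requires: pip install beautifulsoup4
from dataclasses import dataclass
from bs4 import BeautifulSoup

@dataclass
class DiscoveryRule:
    entity: str      # semantic type of the extracted item
    selectors: dict  # property name -> CSS selector

NEWS_RULE = DiscoveryRule(
    entity="NewsItem",
    selectors={"headline": "article h1", "summary": "article p.lead"},
)

def apply_rule(html, rule):
    """Map pieces of a resource representation onto a semantic entity."""
    soup = BeautifulSoup(html, "html.parser")
    item = {"@type": rule.entity}
    for prop, selector in rule.selectors.items():
        node = soup.select_one(selector)
        if node:
            item[prop] = node.get_text(strip=True)
    return item

html = "<article><h1>Title</h1><p class='lead'>Summary text.</p></article>"
print(apply_rule(html, NEWS_RULE))  # {'@type': 'NewsItem', 'headline': 'Title', ...}
```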
Abstract:
Infrared Thermography (IRT) is a safe, non-invasive and low-cost technique that allows the rapid, non-contact recording of the irradiated energy released from the body (Akimov & Son'kin, 2011; Merla et al., 2005; Ng et al., 2009; Costello et al., 2012; Hildebrandt et al., 2010). It has been used since the early 1960s, but due to poor results as a diagnostic tool and a lack of methodological standards and quality assurance (Head et al., 2002), it was rejected from the medical field. Nevertheless, the technological improvements of IRT in recent years have made a resurgence of this technique possible (Jiang et al., 2005; Vainer et al., 2005; Cheng et al., 2009; Spalding et al., 2011; Skala et al., 2012), paving the way to new applications not focused solely on diagnostic use. Among the new applications, we highlight those in the physical activity and sport fields, where it has recently been shown that high-resolution thermal images can provide interesting information about the complex thermoregulation system of the body (Hildebrandt et al., 2010), information that can be used for training workload quantification (Čoh & Širok, 2007), assessment of fitness and performance conditions (Chudecka et al., 2010, 2012; Akimov et al., 2009, 2011; Merla et al., 2010; Arfaoui et al., 2012), prevention and monitoring of injuries (Hildebrandt et al., 2010, 2012; Badža et al., 2012; Gómez Carmona, 2012) and even detection of Delayed Onset Muscle Soreness (DOMS) (Al-Nakhli et al., 2012). In this context, there is a clear need to broaden the knowledge about the factors influencing the application of IRT on humans, and to better explore and describe the thermal response of skin temperature (Tsk) in normal conditions and under the influence of different types of exercise. Consequently, this study first presents a literature review of the factors affecting the application of IRT on human beings and proposes a classification of them. We analysed the reliability of the Termotracker® software, as well as the reproducibility of Tsk measurements in young, healthy, normal-weight subjects. Finally, we examined the thermal response of Tsk before endurance, speed and strength training sessions, immediately after, and during an 8-hour recovery period. Concerning the literature review, we propose a classification that organises the factors into three main groups: environmental, individual and technical factors.
Better exploring and describing these factors should form the basis of further investigations, so that IRT can be used under optimal conditions and its accuracy and results improved. Regarding reproducibility, the outcomes showed excellent values for consecutive images, but the reproducibility of Tsk decreased slightly with time, above all in the colder Regions of Interest (ROIs) (i.e., distal and joint areas). The side-to-side differences (ΔT) (normally used to follow the evolution of injured or overloaded ROIs) also showed highly accurate results, in this case with better values for joint and central ROIs (i.e., knees, ankles, dorsal and pectoral areas) than for the warmer muscle ROIs (such as the thighs or hamstrings). The reliability results of the Termotracker® software were excellent in all conditions and parameters. In the part of the study on the effects of endurance, speed and strength training on Tsk, the results demonstrated specific responses depending on the type of training, the ROI, the moment of assessment and the function of the ROI considered. Most muscular ROIs remained significantly warmer 8 hours after training, indicating that the effect of exercise on Tsk lasts at least 8 hours in most ROIs and that IRT could help quantify the recovery status of the athlete as a workload assimilation indicator. These results could be very useful for better understanding the complex thermoregulatory behaviour of the skin and, therefore, for using IRT in a more objective, accurate and professional way to improve the new IRT applications in the physical activity and sport sector.
Abstract:
The Internet of Things (IoT) is growing at a fast pace, with new devices getting connected all the time. A newly emerging group of these devices are wearable devices, and Wireless Sensor Networks (WSNs) are a good way to integrate them into the IoT concept and bring new experiences to daily life activities. In this paper we present an everyday-life application involving a WSN as the base of a novel context-aware sports scenario in which physiological parameters are measured by wearable devices and sent to the WSN. Applications with several hardware components introduce the problem of heterogeneity in the network. In order to integrate different hardware platforms and to introduce a service-oriented semantic middleware solution into a single application, we propose the use of an Enterprise Service Bus (ESB) as a bridge for guaranteeing interoperability and integration of the different environments, thus introducing the semantic added value needed in the world of IoT-based systems. This approach places all the acquired data (e.g., via Internet data access) at application developers' disposal, opening the system to new user applications. The user can then access the data through a wide variety of devices (smartphones, tablets, computers) and operating systems (Android, iOS, Windows, Linux, etc.).
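As a hedged illustration of the interoperability role an ESB plays here, the sketch below normalizes readings from two hypothetical wearable platforms into one common message format; the platform names, field names and adapter registry are invented for the example and are not the paper's actual API.

```python
import json

ADAPTERS = {}  # platform identifier -> translation function

def adapter(platform):
    """Register a translation function for one hardware platform."""
    def register(fn):
        ADAPTERS[platform] = fn
        return fn
    return register

@adapter("platform_a")
def from_platform_a(raw):
    # Platform A reports heart rate under the key "hr".
    return {"sensor": "heart_rate", "value": raw["hr"], "unit": "bpm"}

@adapter("platform_b")
def from_platform_b(raw):
    # Platform B reports the same parameter under "pulse_bpm".
    return {"sensor": "heart_rate", "value": raw["pulse_bpm"], "unit": "bpm"}

def to_canonical(platform, raw):
    """Route a platform-specific reading through its adapter, ESB-style."""
    return ADAPTERS[platform](raw)

print(json.dumps(to_canonical("platform_a", {"hr": 72})))
print(json.dumps(to_canonical("platform_b", {"pulse_bpm": 75})))
```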
Abstract:
Usability is a critical quality attribute of highly interactive software systems.
The human-computer interaction field proposes recommendations for achieving an acceptable level of system usability. The discipline of software engineering has established that some of these recommendations affect not only the user interface but also the core system functionality. This type of usability recommendation must be taken into account from the early activities and throughout the software development process, as is done for attributes like security, ease of maintenance or performance. Software engineering has conducted studies and put forward proposals for tackling usability in the early development activities, particularly requirements elicitation and architecture design. These proposals have a high level of abstraction. This research addresses usability in later activities of the development process: detailed design and programming. The goal of this research is to discover, specify and validate reusable usability solutions for detailed design and programming. Abort Operation, Progress Feedback and Preferences, three usability functionalities identified as having a high impact on design, are selected for the study. An inductive method, whereby a general solution is induced from particular web applications built for the purpose, is used to discover reusable elements. During the construction of the applications, the traceability of the elements related to each usability functionality is maintained. At the end of the process, the common and potentially reusable elements are analysed. The findings are specified as implementation-oriented design patterns and as programming patterns for each of the languages used: PHP, VB .NET and Java. The solutions specified as patterns are validated using the case study methodology. Independent developers use the patterns to build the three usability functionalities into two new web applications. As a result, the developers successfully use the proposed solutions for two of the functionalities: Abort Operation and Preferences. The Progress Feedback functionality cannot be fully implemented. We conclude that it is possible to discover reusable elements for implementing each usability functionality. These elements include: application scenarios, which are combinations of cases that generate usability functionalities; common responsibilities to cover the scenarios; common components to fulfil the responsibilities; design elements associated with the components; and code implementing the design. Specifying solutions as patterns is useful for communicating findings to other developers, and the patterns improve through further use in other development projects. Reusability depends on features of the usability functionality implementation, particularly the level of coupling between the usability functionality and the application functionalities, and the internal complexity of the solution.
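As a rough illustration of the Abort Operation functionality (the thesis formalizes its patterns in PHP, VB .NET and Java; this Python transcription only conveys the general idea, not the thesis's actual patterns), a long-running task can poll a cancellation flag so the user can abort it and the application can clean up:

```python
import threading
import time

class AbortableTask:
    """Long-running operation that a user-facing 'Cancel' control can abort."""

    def __init__(self):
        self._abort = threading.Event()
        self.result = None

    def abort(self):
        # Called from the UI's "Cancel" control.
        self._abort.set()

    def run(self, steps=100):
        for _ in range(steps):
            if self._abort.is_set():
                self._cleanup()
                self.result = "aborted"
                return
            time.sleep(0.01)  # stand-in for one unit of real work
        self.result = "completed"

    def _cleanup(self):
        pass  # undo partial work here (rollback, close resources, ...)

task = AbortableTask()
worker = threading.Thread(target=task.run)
worker.start()
task.abort()        # simulate the user clicking "Cancel"
worker.join()
print(task.result)  # "aborted"
```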
Abstract:
The Internet of Things makes use of a wide variety of technologies at very different levels that support one another to accomplish goals previously regarded as unthinkable in terms of ubiquity or scalability. If the Internet of Things is expected to interconnect everyday devices or appliances and enable communications between them, a broad range of new services, applications and products can be foreseen. For example, monitoring is a process in which sensors are widely used to measure environmental parameters (temperature, light, chemical agents, etc.), but obtaining readings at the exact physical point of interest, or for the exact parameter wanted, can be a clumsy, time-consuming task that is not easily adaptable to new requirements. In order to tackle this challenge, we present here in detail a proposal for a system that can monitor any conceivable environment and, additionally, is able to monitor the status of its own components and heal some of the most common issues of a Wireless Sensor Network, covering all the layers that shape it in terms of devices, communications and services.
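As a hedged sketch of the self-monitoring and healing idea (the node identifiers, timeout value and heal action are hypothetical, not the system's actual design), a watchdog can track per-node heartbeats and trigger recovery for unresponsive nodes:

```python
import time

HEARTBEAT_TIMEOUT = 30  # seconds without a heartbeat before a node is suspect

# Timestamp of the last heartbeat received from each known node.
last_seen = {"node-1": time.time(), "node-2": time.time()}

def record_heartbeat(node_id):
    """Called whenever a node reports in."""
    last_seen[node_id] = time.time()

def heal(node_id):
    # Stand-in for a real recovery action (reset command, route update, alert).
    print(f"healing {node_id}: rerouting traffic and scheduling a reset")

def check_network():
    """Flag and heal any node whose heartbeat has gone stale."""
    now = time.time()
    for node_id, seen in last_seen.items():
        if now - seen > HEARTBEAT_TIMEOUT:
            heal(node_id)
```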
Abstract:
In the late nineteenth and early twentieth centuries, the appearance of new materials, such as steel and reinforced concrete, and experimentation with industrial procedures caused a change in the concept of the façade and in the way of building. The façade is released and becomes independent of the main structural frame, and the new building enclosure must respond to the architectural and construction principles of the moment. A new façade is therefore sought: a light, thin, self-supporting, multilayer, dry-mounted, large-dimension façade that meets the requirements of any building enclosure. It can be said that the technique of lightweight façades did not develop fully until Jean Prouvé experimented with different materials and fabrication systems. His work contains applications of new materials and techniques, and investigations into lightweight prefabrication in steel and aluminium, in an attempt to apply industrial mass production to construction. This thesis carries out an in-depth graphic and written analysis of the vertical enclosure panels in Jean Prouvé's work, treating them not as isolated objects but as parts of particular, complete architectural works. The analysis serves to classify the panels according to the essential functions a façade must satisfy: insulate, illuminate, ventilate and protect, and to understand the keys, resources and intentions Prouvé used to achieve this purpose. The result of the research is presented in two different ways.
In the first, critical reflections extract the important themes from the analyzed elements, which enables comparison with other architects and widens the field of vision. In the second, graphic, form, an atlas of the different types of vertical façade panels developed by Jean Prouvé is compiled.
Abstract:
T helper (Th) cells can be categorized according to their cytokine expression. The differential induction of Th cells expressing Th1 and/or Th2 cytokines is key to the regulation of both protective and pathological immune responses. Cytokines are expressed transiently, and there is a lack of stably expressed surface molecules specific to functionally different types of Th cells. Such molecules are of utmost importance for the analysis and selective functional modulation of Th subsets and would provide new therapeutic strategies for the treatment of allergic or autoimmune diseases. To this end, we have identified potential target genes preferentially expressed in Th2 cells, which express interleukin (IL)-4, IL-5, and/or IL-10, but not interferon-γ. One such gene, T1/ST2, is expressed stably on both Th2 clones and Th2-polarized cells activated in vivo or in vitro. T1/ST2 expression is independent of induction by IL-4, IL-5, or IL-10. T1/ST2 plays a critical role in Th2 effector function. Administration of either a mAb against T1/ST2 or a recombinant T1/ST2 fusion protein attenuates eosinophilic inflammation of the airways and suppresses IL-4 and IL-5 production in vivo following adoptive transfer of Th2 cells.
Abstract:
Previous work has shown that glucocorticoid hormones facilitate the behavioral and dopaminergic effects of morphine. In this study we examined the possible role in these effects of the two central corticosteroid receptor types: mineralocorticoid receptor (MR), and glucocorticoid receptor (GR). To accomplish this, specific antagonists of these receptors were infused intracerebroventricularly and 2 hr later we measured: (i) locomotor activity induced by a systemic injection of morphine (2 mg/kg); (ii) locomotor activity induced by an infusion of morphine (1 μg per side) into the ventral tegmental area, which is a dopamine-dependent behavioral response to morphine; (iii) morphine-induced dopamine release in the nucleus accumbens, a dopaminergic projection site mediating the locomotor and reinforcing effects of drugs of abuse. Blockade of MRs by spironolactone had no significant effects on locomotion induced by systemic morphine. In contrast, blockade of GRs by either RU38486 or RU39305, which is devoid of antiprogesterone effects, reduced the locomotor response to morphine, and this effect was dose dependent. GR antagonists also reduced the locomotor response to intraventral tegmental area morphine as well as the basal and morphine-induced increase in accumbens dopamine, as measured by microdialysis in freely moving rats. In contrast, spironolactone did not modify dopamine release. In conclusion, glucocorticoids, via GRs, facilitate the dopamine-dependent behavioral effects of morphine, probably by facilitating dopamine release. The possibility of decreasing the behavioral and dopaminergic effects of opioids by an acute administration of GR antagonists may open new therapeutic strategies for treatment of drug addiction.
Abstract:
Steroids, thyroid hormones, vitamin D3, and retinoids are lipophilic small molecules that regulate diverse biological effects such as cell differentiation, development, and homeostasis. The actions of these hormones are mediated by steroid/nuclear receptors which function as ligand-dependent transcriptional regulators. Transcriptional activation by ligand-bound receptors is a complex process requiring dissociation and recruitment of several additional cofactors. We report here the cloning and characterization of receptor-associated coactivator 3 (RAC3), a human transcriptional coactivator for steroid/nuclear receptors. RAC3 interacts with several liganded receptors through a mechanism which requires their respective ligand-dependent activation domains. RAC3 can activate transcription when tethered to a heterologous DNA-binding domain. Overexpression of RAC3 enhances the ligand-dependent transcriptional activation by the receptors in mammalian cells. Sequence analysis reveals that RAC3 is related to steroid receptor coactivator 1 (SRC-1) and transcriptional intermediate factor 2 (TIF2), two of the most potent coactivators for steroid/nuclear receptors. Thus, RAC3 is a member of a growing coactivator network that should be useful as a tool for understanding hormone action and as a target for developing new therapeutic agents that can block hormone-dependent neoplasia.
Abstract:
Triabin, a 142-residue protein from the saliva of the blood-sucking triatomine bug Triatoma pallidipennis, is a potent and selective thrombin inhibitor. Its stoichiometric complex with bovine α-thrombin was crystallized, and its crystal structure was solved by Patterson search methods and refined at 2.6-Å resolution to an R value of 0.184. The analysis revealed that triabin is a compact one-domain molecule essentially consisting of an eight-stranded β-barrel. The eight strands A to H are arranged in the order A-C-B-D-E-F-G-H, with the first four strands exhibiting a hitherto unobserved up-up-down-down topology. Except for the B-C inversion, the triabin fold exhibits the regular up-and-down topology of lipocalins. In contrast to the typical ligand-binding lipocalins, however, the triabin barrel encloses a hydrophobic core intersected by a unique salt-bridge cluster. Triabin interacts with thrombin exclusively via its fibrinogen-recognition exosite. Surprisingly, most of the interface interactions are hydrophobic. A prominent exception is thrombin's Arg-77A side chain, which extends into a hydrophobic triabin pocket, forming partially buried salt bridges with Glu-128 and Asp-135 of the inhibitor. The fully accessible active site of thrombin in this complex is in agreement with its retained hydrolytic activity toward small chromogenic substrates. The impairment of thrombin's fibrinogen-converting activity and of its thrombomodulin-mediated protein C activation capacity upon triabin binding is explained by the overlapping interaction sites of fibrinogen, thrombomodulin, and triabin on thrombin. These data demonstrate that triabin inhibits thrombin via a novel and unique mechanism that might be of interest in the context of potential therapeutic applications.
Abstract:
A novel fungal metabolite, apicidin [cyclo(N-O-methyl-l-tryptophanyl-l-isoleucinyl-d-pipecolinyl-l-2-amino-8-oxodecanoyl)], that exhibits potent, broad-spectrum antiprotozoal activity in vitro against Apicomplexan parasites has been identified. It is also orally and parenterally active in vivo against Plasmodium berghei malaria in mice. Many Apicomplexan parasites cause serious, life-threatening human and animal diseases, such as malaria, cryptosporidiosis, toxoplasmosis, and coccidiosis, and new therapeutic agents are urgently needed. Apicidin's antiparasitic activity appears to be due to low nanomolar inhibition of Apicomplexan histone deacetylase (HDA), which induces hyperacetylation of histones in treated parasites. The acetylation-deacetylation of histones is thought to play a central role in transcriptional control in eukaryotic cells. Other known HDA inhibitors were also evaluated and found to possess antiparasitic activity, suggesting that HDA is an attractive target for the development of novel antiparasitic agents.