849 results for "Integration of methods"


Relevance: 90.00%

Abstract:

Los sistemas transaccionales tales como los programas informáticos para la planificación de recursos empresariales (ERP software) se han implementado ampliamente mientras que los sistemas analíticos para la gestión de la cadena de suministro (SCM software) no han tenido el éxito deseado por la industria de tecnología de información (TI). Aunque se documentan beneficios importantes derivados de las implantaciones de SCM software, las empresas industriales son reacias a invertir en este tipo de sistemas. Por una parte, esto es debido a la falta de métodos que sean capaces de detectar los beneficios de emplear esos sistemas, y por otra parte porque el coste asociado no está identificado, detallado y cuantificado suficientemente. Los esquemas de coordinación basados únicamente en sistemas ERP son alternativas válidas en la práctica industrial siempre que la relación coste-beneficio sea favorable. Por lo tanto, la evaluación de formas organizativas teniendo en cuenta explícitamente el coste debido a procesos administrativos, en particular por ciclos iterativos, es de gran interés para la toma de decisiones en el ámbito de inversiones en TI. Con el fin de cerrar la brecha, el propósito de esta investigación es proporcionar métodos de evaluación que permitan la comparación de diferentes formas de organización y niveles de soporte por sistemas informáticos. La tesis proporciona una amplia introducción, analizando los retos a los que se enfrenta la industria. Concluye con las necesidades de la industria de SCM software: unas herramientas que faciliten la evaluación integral de diferentes propuestas de organización. A continuación, la terminología clave se detalla centrándose en la teoría de la organización, las peculiaridades de inversión en TI y la tipología de software de gestión de la cadena de suministro. La revisión de la literatura clasifica las contribuciones recientes sobre la gestión de la cadena de suministro, tratando ambos conceptos, el diseño de la organización y su soporte por las TI. La clasificación incluye criterios relacionados con la metodología de la investigación y su contenido. Los estudios empíricos en el ámbito de la administración de empresas se centran en tipologías de redes industriales. Nuevos algoritmos de planificación y esquemas de coordinación innovadores se desarrollan principalmente en el campo de la investigación de operaciones con el fin de proponer nuevas funciones de software. Artículos procedentes del área de la gestión de la producción se centran en el análisis de coste y beneficio de las implantaciones de sistemas. La revisión de la literatura revela que el éxito de las TI para la coordinación de redes industriales depende en gran medida de características de tres dimensiones: la configuración de la red industrial, los esquemas de coordinación y las funcionalidades del software. La literatura disponible está enfocada sobre todo en los beneficios de las implantaciones de SCM software. Sin embargo, la coordinación de la cadena de suministro, basándose en el sistema ERP, sigue siendo la práctica industrial generalizada, pero el coste de coordinación asociado no ha sido abordado por los investigadores. Los fundamentos de diseño organizativo eficiente se explican en detalle en la medida necesaria para la comprensión de la síntesis de las diferentes formas de organización. Se han generado varios esquemas de coordinación variando los siguientes parámetros de diseño: la estructura organizativa, los mecanismos de coordinación y el soporte por TI.
Las diferentes propuestas de organización desarrolladas son evaluadas por un método heurístico y otro basado en la simulación por eventos discretos. Para ambos métodos, se tienen en cuenta los principios de la teoría de la organización. La falta de rendimiento empresarial se debe a las dependencias entre actividades que no se gestionan adecuadamente. Dentro del método heurístico, se clasifican las dependencias y se mide su intensidad basándose en factores contextuales. A continuación, se valora la idoneidad de cada elemento de diseño organizativo para cada dependencia específica. Por último, cada forma de organización se evalúa basándose en la contribución de los elementos de diseño tanto al beneficio como al coste. El beneficio de coordinación se refiere a la mejora en el rendimiento logístico - este concepto es el objeto central en la mayoría de modelos de evaluación de la gestión de la cadena de suministro. Por el contrario, el coste de coordinación en el que se debe incurrir para lograr los beneficios no se suele considerar en detalle. Los procesos iterativos son costosos si se ejecutan manualmente. Este es el caso cuando el software SCM no está implementado y el sistema ERP es el único instrumento de coordinación disponible. El modelo heurístico proporciona un procedimiento simplificado para la clasificación sistemática de las dependencias, la cuantificación de los factores de influencia y la identificación de configuraciones que indican el uso de formas organizativas y de soporte de TI más o menos complejas. La simulación de eventos discretos se aplica en el segundo modelo de evaluación utilizando el paquete de software ‘Plant Simulation’. Con respecto al rendimiento logístico, por un lado se mide el coste de fabricación, de inventario y de transporte y las penalizaciones por pérdida de ventas. Por otro lado, se cuantifica explícitamente el coste de la coordinación teniendo en cuenta los ciclos de coordinación iterativos. El método se aplica a una configuración de cadena de suministro ejemplar considerando diversos parámetros. Los resultados de la simulación confirman que, en la mayoría de los casos, el beneficio aumenta cuando se intensifica la coordinación. Sin embargo, en ciertas situaciones en las que se aplican ciclos de planificación manuales e iterativos, el coste de coordinación adicional no siempre conduce a mejor rendimiento logístico. Estos resultados inesperados no se pueden atribuir a ningún parámetro particular. La investigación confirma la gran importancia de nuevas dimensiones hasta ahora ignoradas en la evaluación de propuestas organizativas y herramientas de TI. A través del método heurístico se puede comparar de forma rápida, pero sólo aproximada, la eficiencia de diferentes formas de organización. Por el contrario, el método de simulación es más complejo pero da resultados más detallados, teniendo en cuenta parámetros específicos del contexto del caso concreto y del diseño organizativo.

ABSTRACT

Transactional systems such as Enterprise Resource Planning (ERP) systems have been implemented widely while analytical software like Supply Chain Management (SCM) add-ons are adopted less by manufacturing companies. Although significant benefits are reported stemming from SCM software implementations, companies are reluctant to invest in such systems. On the one hand, this is due to the lack of methods that are able to detect benefits from the use of SCM software; on the other hand, the associated costs are not identified, detailed and quantified sufficiently.
Coordination schemes based only on ERP systems are valid alternatives in industrial practice because significant investment in IT can be avoided. Therefore, the evaluation of these coordination procedures, in particular the cost due to iterations, is of high managerial interest and corresponding methods are comprehensive tools for strategic IT decision making. The purpose of this research is to provide evaluation methods that allow the comparison of different organizational forms and software support levels. The research begins with a comprehensive introduction dealing with the business environment that industrial networks are facing and concludes by highlighting the challenges for the supply chain software industry. Afterwards, the central terminology is addressed, focusing on organization theory, IT investment peculiarities and supply chain management software typology. The literature review classifies recent supply chain management research referring to organizational design and its software support. The classification encompasses criteria related to research methodology and content. Empirical studies from management science focus on network types and organizational fit. Novel planning algorithms and innovative coordination schemes are developed mostly in the field of operations research in order to propose new software features. Researchers in operations and production management carry out cost-benefit analyses of IT software implementations. The literature review reveals that the success of software solutions for network coordination depends strongly on the fit of three dimensions: network configuration, coordination scheme and software functionality. The reviewed literature is mostly centered on the benefits of SCM software implementations. However, ERP-system-based supply chain coordination is still widespread industrial practice, but the associated coordination cost has not been addressed by researchers. Fundamentals of efficient organizational design are explained in detail as far as required for the understanding of the synthesis of different organizational forms. Several coordination schemes have been shaped through the variation of the following design parameters: organizational structuring, coordination mechanisms and software support. The different organizational proposals are evaluated using a heuristic approach and a simulation-based method. For both cases, the principles of organization theory are respected. A lack of performance is due to dependencies between activities which are not managed properly. Therefore, within the heuristic method, dependencies are classified and their intensity is measured based on contextual factors. Afterwards, the suitability of each organizational design element for the management of a specific dependency is determined. Finally, each organizational form is evaluated based on the contribution of the sum of design elements to coordination benefit and to coordination cost. Coordination benefit refers to improvement in logistic performance – this is the core concept of most supply chain evaluation models. Unfortunately, the coordination cost which must be incurred to achieve benefits is usually not considered in detail. Iterative processes are costly when manually executed. This is the case when SCM software is not implemented and the ERP system is the only available coordination instrument.
The heuristic model provides a simplified procedure for the classification of dependencies, quantification of influence factors and systematic search for adequate organizational forms and IT support. Discrete event simulation is applied in the second evaluation model using the software package ‘Plant Simulation’. On the one hand, logistic performance is measured by manufacturing, inventory and transportation cost and penalties for lost sales. On the other hand, coordination cost is explicitly considered, taking into account iterative coordination cycles. The method is applied to an exemplary supply chain configuration considering various parameter settings. The simulation results confirm that, in most cases, benefit increases when coordination is intensified. However, in some situations when manual, iterative planning cycles are applied, additional coordination cost does not always lead to improved logistic performance. These unexpected results cannot be attributed to any particular parameter. The research confirms the great importance of previously disregarded dimensions when evaluating SCM concepts and IT tools. The heuristic method provides a quick, but only approximate, comparison of coordination efficiency for different organizational forms. In contrast, the more complex simulation method delivers detailed results, taking into consideration specific parameter settings of network context and organizational design.
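To make the heuristic evaluation described above concrete, the following sketch shows one way the scoring could be organized in code: dependencies carry an intensity derived from contextual factors, each organizational form carries a suitability score per dependency, and the net score is coordination benefit minus coordination cost. All names and numbers are illustrative assumptions, not values from the thesis.

```python
# Hypothetical sketch of the heuristic evaluation described in the abstract.
# Dependencies are weighted by intensity, each organizational form is scored for
# how well its design elements manage each dependency, and forms are compared by
# coordination benefit minus coordination cost. All values are invented.

dependencies = {                      # dependency -> intensity from contextual factors (0..1)
    "shared_capacity": 0.8,
    "sequential_flow": 0.6,
    "demand_uncertainty": 0.9,
}

suitability = {                       # form -> {dependency: how well it is managed (0..1)}
    "ERP_only":     {"shared_capacity": 0.3, "sequential_flow": 0.5, "demand_uncertainty": 0.2},
    "ERP_plus_SCM": {"shared_capacity": 0.8, "sequential_flow": 0.7, "demand_uncertainty": 0.8},
}

# Coordination cost per form: manual iterative cycles dominate for ERP_only,
# software investment and operation dominate for ERP_plus_SCM (invented values).
coordination_cost = {"ERP_only": 0.6, "ERP_plus_SCM": 0.9}

def net_score(form: str) -> float:
    """Coordination benefit (intensity-weighted suitability) minus coordination cost."""
    benefit = sum(intensity * suitability[form][dep] for dep, intensity in dependencies.items())
    return benefit - coordination_cost[form]

for form in suitability:
    print(f"{form}: net score = {net_score(form):.2f}")
```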

Relevance: 90.00%

Abstract:

Parte de la investigación biomédica actual se encuentra centrada en el análisis de datos heterogéneos. Estos datos pueden tener distinto origen, estructura y semántica. Gran cantidad de datos de interés para los investigadores se encuentran en bases de datos públicas, que recogen información de distintas fuentes y la ponen a disposición de la comunidad de forma gratuita. Para homogeneizar estas fuentes de datos públicas con otras de origen privado, existen diversas herramientas y técnicas que permiten automatizar los procesos de homogeneización de datos heterogéneos. El Grupo de Informática Biomédica (GIB) [1] de la Universidad Politécnica de Madrid colabora en el proyecto europeo P-medicine [2], cuya finalidad reside en el desarrollo de una infraestructura que facilite la evolución de los procedimientos médicos actuales hacia la medicina personalizada. Una de las tareas enmarcadas en el proyecto P-medicine que tiene asignado el grupo consiste en elaborar herramientas que ayuden a los usuarios en el proceso de integración de datos contenidos en fuentes de información heterogéneas. Algunas de estas fuentes de información son bases de datos públicas de ámbito biomédico contenidas en la plataforma NCBI [3] (National Center for Biotechnology Information). Una de las herramientas que el grupo desarrolla para integrar fuentes de datos es Ontology Annotator. En una de sus fases, la labor del usuario consiste en recuperar información de una base de datos pública y seleccionar de forma manual los resultados relevantes. Para automatizar el proceso de búsqueda y selección de resultados relevantes, por un lado existe un gran interés en conseguir generar consultas que guíen hacia resultados lo más precisos y exactos posible; por otro lado, existe un gran interés en extraer información relevante de elevadas cantidades de documentos, lo cual requiere de sistemas que analicen y ponderen los datos que los caracterizan. En el campo informático de la inteligencia artificial, dentro de la rama de la recuperación de la información, existen diversos estudios acerca de la expansión de consultas a partir de retroalimentación relevante que podrían ser de gran utilidad para dar solución a la cuestión. Estos estudios se centran en técnicas para reformular o expandir la consulta inicial utilizando como realimentación los resultados que en una primera instancia fueron relevantes para el usuario, de forma que el nuevo conjunto de resultados tenga mayor proximidad con los que el usuario realmente desea. El objetivo de este trabajo de fin de grado consiste en el estudio, implementación y experimentación de métodos que automaticen el proceso de extracción de información trascendente de documentos, utilizándola para expandir o reformular consultas. De esta forma se pretende mejorar la precisión y el ranking de los resultados asociados. Dichos métodos serán integrados en la herramienta Ontology Annotator y enfocados a la fuente de datos de PubMed [4].

ABSTRACT

Part of current biomedical research is focused on the analysis of heterogeneous data. These data may have different origin, structure and semantics. A large quantity of data of interest to researchers is contained in public databases, which gather information from different sources and make it freely available to the community. In order to homogenize these public data sources with others whose origin is private, there are tools and techniques that allow automating the processes of integrating heterogeneous data.
The Biomedical Informatics Group of the Universidad Politécnica de Madrid cooperates in the European project P-medicine, whose main purpose is to create an infrastructure and models to facilitate the transition from current medical practice to personalized medicine. One of the project tasks that the group is in charge of consists of the development of tools that will help users in the process of integrating data from diverse sources. Some of these sources are biomedical public databases from the NCBI platform (National Center for Biotechnology Information). One of the tools the group is currently working on for the integration of data sources is called Ontology Annotator. In this tool there is a phase in which the user has to retrieve information from a public database and manually select the relevant data contained in it. To automate the process of searching and selecting data, there is, on the one hand, an interest in automatically generating queries that guide towards results that are as precise as possible. On the other hand, there is an interest in retrieving relevant information from large quantities of documents, which requires systems that analyze and weigh the data so that the relevant items can be located. In the computer science field of artificial intelligence, within the branch of information retrieval, there are diverse studies about query expansion from relevance feedback that could be used to solve the problem. The main purpose of these studies is to obtain a set of results that is as close as possible to the information that the user really wants to retrieve. To reach this purpose, different techniques are used to reformulate or expand the initial query, using as feedback the results that were initially relevant for the user, so that the new set of results is closer to the ones that the user really desires. The goal of this final degree project consists of the study, implementation and experimentation of methods that automate the extraction of relevant information from documents, using this information to expand or reformulate queries. In this way, the precision and ranking of the associated results will be improved. These methods will be integrated in the Ontology Annotator tool and will focus on the PubMed data source.
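As a sketch of the relevance-feedback idea this project builds on, the classical Rocchio formulation expands the query by moving its term-weight vector toward relevant results and away from non-relevant ones. The vectors and weights below are toy illustrations; the methods actually implemented in Ontology Annotator may differ.

```python
import numpy as np

def rocchio(query, relevant, nonrelevant, alpha=1.0, beta=0.75, gamma=0.15):
    """Classical Rocchio relevance feedback: shift the query vector toward the
    centroid of relevant documents and away from non-relevant ones."""
    expanded = alpha * query
    if len(relevant):
        expanded += beta * np.mean(relevant, axis=0)
    if len(nonrelevant):
        expanded -= gamma * np.mean(nonrelevant, axis=0)
    return np.clip(expanded, 0.0, None)        # negative term weights are usually dropped

# Toy term-weight vectors over a 4-term vocabulary (illustrative only).
query = np.array([1.0, 0.0, 0.0, 0.5])
relevant = np.array([[0.9, 0.8, 0.0, 0.4], [1.0, 0.6, 0.1, 0.3]])
nonrelevant = np.array([[0.0, 0.1, 0.9, 0.0]])
print(rocchio(query, relevant, nonrelevant))
```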

Relevance: 90.00%

Abstract:

All of the 17 autistic children studied in the present paper showed disturbances of movement that with our methods could be detected clearly at the age of 4–6 months, and sometimes even at birth. We used the Eshkol–Wachman Movement Analysis System in combination with still-frame videodisc analysis to study videos obtained from parents of children who had been diagnosed as autistic by conventional methods, usually around 3 years old. The videos showed their behaviors when they were infants, long before they had been diagnosed as autistic. The movement disorders varied from child to child. Disturbances were revealed in the shape of the mouth and in some or all of the milestones of development, including lying, righting, sitting, crawling, and walking. Our findings support the view that movement disturbances play an intrinsic part in the phenomenon of autism, that they are present at birth, and that they can be used to diagnose the presence of autism in the first few months of life. They indicate the need for the development of methods of therapy to be applied from the first few months of life in autism.

Relevance: 90.00%

Abstract:

Clustered DNA damages—two or more closely spaced damages (strand breaks, abasic sites, or oxidized bases) on opposing strands—are suspects as critical lesions producing lethal and mutagenic effects of ionizing radiation. However, as a result of the lack of methods for measuring damage clusters induced by ionizing radiation in genomic DNA, neither the frequencies of their production by physiological doses of radiation, nor their repairability, nor their biological effects are known. On the basis of methods that we developed for quantitating damages in large DNAs, we have devised and validated a way of measuring ionizing radiation-induced clustered lesions in genomic DNA, including DNA from human cells. DNA is treated with an endonuclease that induces a single-strand cleavage at an oxidized base or abasic site. If there are two closely spaced damages on opposing strands, such cleavage will reduce the size of the DNA on a nondenaturing gel. We show that ionizing radiation does induce clustered DNA damages containing abasic sites, oxidized purines, or oxidized pyrimidines. Further, the frequency of each of these cluster classes is comparable to that of frank double-strand breaks; among all complex damages induced by ionizing radiation, double-strand breaks are only about 20%, with other clustered damage constituting some 80%. We also show that even low doses (0.1–1 Gy) of high linear energy transfer ionizing radiation induce clustered damages in human cells.
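As a hedged illustration of how the gel-based size reduction described above can be turned into a cluster frequency, the sketch below applies the standard random-breakage relation (clusters per base approximately equals 1/Ln for treated DNA minus 1/Ln for control DNA, where Ln is the number-average fragment length); the numbers are invented and the paper's exact quantitation procedure may differ.

```python
def clusters_per_megabase(ln_control_bp: float, ln_treated_bp: float) -> float:
    """Estimate cluster frequency from number-average fragment lengths (in base pairs)
    measured on a nondenaturing gel before and after cleavage with a lesion-specific
    endonuclease, using: breaks per base = 1/Ln(treated) - 1/Ln(control)."""
    per_base = 1.0 / ln_treated_bp - 1.0 / ln_control_bp
    return per_base * 1e6          # convert to clusters per megabase pair

# Illustrative numbers only (not from the paper):
print(clusters_per_megabase(ln_control_bp=2.0e6, ln_treated_bp=1.0e6))   # -> 0.5 clusters/Mbp
```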

Relevance: 90.00%

Abstract:

The application of gene therapy to human disease is currently restricted by the relatively low efficiency and potential hazards of methods of oligonucleotide or gene delivery. Antisense or transcription factor decoy oligonucleotides have been shown to be effective at altering gene expression in cell culture experiments, but their in vivo application is limited by the efficiency of cellular delivery, the intracellular stability of the compounds, and their duration of activity. We report herein the development of a highly efficient method for naked oligodeoxynucleotide (ODN) transfection into cardiovascular tissues by using controlled, nondistending pressure without the use of viral vectors, lipid formulations, or exposure to other adjunctive, potentially hazardous substances. In this study, we have documented the ability of ex vivo, pressure-mediated transfection to achieve nuclear localization of fluorescent (FITC)-labeled ODN in approximately 90% and 50% of cells in intact human saphenous vein and rat myocardium, respectively. We have further documented that pressure-mediated delivery of antisense ODN can functionally inhibit target gene expression in both of these tissues in a sequence-specific manner at the mRNA and protein levels. This oligonucleotide transfection system may represent a safe means of achieving the intraoperative genetic engineering of failure-resistant human bypass grafts and may provide an avenue for the genetic manipulation of cardiac allograft rejection, allograft vasculopathy, or other transplant diseases.

Relevance: 90.00%

Abstract:

Mutations in the VHL tumor suppressor gene result in constitutive expression of many hypoxia-inducible genes, at least in part because of increases in the cellular level of hypoxia-inducible transcription factor HIF1α, which in normal cells is rapidly ubiquitinated and degraded by the proteasome under normoxic conditions. The recent observation that the VHL protein is a subunit of an Skp1-Cul1/Cdc53-F-box (SCF)-like E3 ubiquitin ligase raised the possibility that VHL may be directly responsible for regulating cellular levels of HIF1α by targeting it for ubiquitination and proteolysis. In this report, we test this hypothesis directly. We report development of methods for production of the purified recombinant VHL complex and present direct biochemical evidence that it can function with an E1 ubiquitin-activating enzyme and E2 ubiquitin-conjugating enzyme to activate HIF1α ubiquitination in vitro. Our findings provide new insight into the function of the VHL tumor suppressor protein, and they provide a foundation for future investigations of the mechanisms underlying VHL regulation of oxygen-dependent gene expression.

Relevance: 90.00%

Abstract:

Although the zebrafish possesses many characteristics that make it a valuable model for genetic studies of vertebrate development, one deficiency of this model system is the absence of methods for cell-mediated gene transfer and targeted gene inactivation. In mice, embryonic stem cell cultures are routinely used for gene transfer and provide the advantage of in vitro selection for rare events such as homologous recombination and targeted mutation. Transgenic animals possessing a mutated copy of the targeted gene are generated when the selected cells contribute to the germ line of a chimeric embryo. Although zebrafish embryo cell cultures that exhibit characteristics of embryonic stem cells have been described, successful contribution of the cells to the germ-cell lineage of a host embryo has not been reported. In this study, we demonstrate that short-term zebrafish embryo cell cultures maintained in the presence of cells from a rainbow trout spleen cell line (RTS34st) are able to produce germ-line chimeras when introduced into a host embryo. Messenger RNA encoding the primordial germ-cell marker, vasa, was present for more than 30 days in embryo cells cocultured with RTS34st cells or their conditioned medium and disappeared by 5 days in the absence of the spleen cells. The RTS34st cells also inhibited melanocyte and neuronal cell differentiation in the embryo cell cultures. These results suggest that the RTS34st splenic–stromal cell line will be a valuable tool in the development of a cell-based gene transfer approach to targeted gene inactivation in zebrafish.

Relevance: 90.00%

Abstract:

The false memory/recovered memory debate, research regarding the malleability of memory, and the current lack of methods for validating recovered memories all support the view that heightened care is required of therapists dealing with clients whom they suspect have been sexually abused. The judgmental heuristics that underlie the major clinical inference biases of confirmatory bias, biased covariation, base rate fallacies, and schematic processing errors are all relevant to the processes leading to therapist-client constructions of memories of sexual abuse. Suggestions for minimizing each of these biases are offered. Personal motivations of the client and client suggestibility are factors that may contribute to the construction of memories of sexual abuse, and suggestions for minimizing the impact of these motivations are offered. In conclusion, general suggestions for minimizing the impact of clinical inference biases within the sexual abuse treatment context are summarized.

Relevance: 90.00%

Abstract:

Online education is no longer a trend; rather, it is mainstream. In the Fall of 2012, 69.1% of chief academic leaders indicated online learning was critical to their long-term strategy and of the 20.6 million students enrolled in higher education, 6.7 million were enrolled in an online course (Allen & Seaman, 2013; United States Department of Education, 2013). The advent of online education and its rapid growth have forced academic institutions and faculty to question the current styles and techniques for teaching and learning. As developments in educational technology continue to advance, the ways in which we deliver and receive knowledge in both the traditional and online classrooms will further evolve. It is necessary to investigate and understand the progression and advancements in educational technology and the variety of methods used to deliver knowledge to improve the quality of education we provide today and motivate, inspire, and educate the students of the 21st century. This paper explores the evolution of distance education beginning with correspondence and the use of parcel post, to radio, then to television, and finally to online education.

Relevance: 90.00%

Abstract:

Analysis of vibrations and displacements is a hot topic in structural engineering. Although there is a wide variety of methods for vibration analysis, direct measurement of displacements in the mid and high frequency range is not well solved and accurate devices tend to be very expensive. Low-cost systems can be achieved by applying adequate image processing algorithms. In this paper, we propose the use of a commercial pocket digital camera, which is able to register more than 420 frames per second (fps) at low resolution, for accurate measuring of small vibrations and displacements. The method is based on tracking elliptical targets with sub-pixel accuracy. Our proposal is demonstrated at a 10 m distance with a spatial resolution of 0.15 mm. A practical application over a simple structure is given, and the main parameters of an attenuated movement of a steel column after an impulsive impact are determined with a spatial accuracy of 4 µm.
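A minimal sketch of the target-tracking idea described above, assuming a high-contrast dark elliptical target and OpenCV: each frame is thresholded, the largest blob is fitted with an ellipse, and the sub-pixel center gives the displacement. The file name, thresholding choices, and the pixel-to-millimetre scale are placeholders, and the paper's actual sub-pixel algorithm may differ.

```python
import cv2
import numpy as np

def track_vertical_displacement(video_path: str, mm_per_px: float) -> np.ndarray:
    """Track the center of a dark elliptical target frame by frame and return its
    vertical displacement (mm) relative to the first frame."""
    cap = cv2.VideoCapture(video_path)
    ys = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
        if not contours:
            continue
        target = max(contours, key=cv2.contourArea)       # assume the target is the largest blob
        if len(target) < 5:                               # cv2.fitEllipse needs at least 5 points
            continue
        (cx, cy), _, _ = cv2.fitEllipse(target)           # sub-pixel ellipse center
        ys.append(cy)
    cap.release()
    ys = np.asarray(ys)
    return (ys - ys[0]) * mm_per_px

# Example call (placeholder path and calibration):
# displacement_mm = track_vertical_displacement("column_impact.avi", mm_per_px=0.15)
```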

Relevance: 90.00%

Abstract:

This RILEM Technical Recommendation intends to give a general description of methods of sampling for obtaining chloride concentration profiles in concrete, applicable to laboratory-cast concrete specimens, to concrete cores taken from structures, and to testing on site. These sampling procedures may be applied for obtaining concentration profiles of any other chemical species present in concrete.
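The recommendation itself covers sampling, but once increment-by-increment chloride contents have been obtained they are commonly fitted to the error-function solution of Fick's second law to estimate the surface concentration and apparent diffusion coefficient. The sketch below illustrates that conventional post-processing step with invented data; it is not part of the RILEM text.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erf

# Invented example profile: mean depth of each ground-off increment (mm) and the
# chloride content measured for that increment (% by mass of binder).
depth_mm = np.array([1.0, 3.0, 5.0, 7.5, 10.5, 14.0, 18.0])
chloride = np.array([0.95, 0.78, 0.60, 0.42, 0.27, 0.15, 0.08])
t_seconds = 2.0 * 365.25 * 24 * 3600          # assumed 2-year exposure

def fick_profile(x_mm, cs, da):
    """C(x, t) = Cs * (1 - erf(x / (2 * sqrt(Da * t)))), with initial chloride taken as zero."""
    x_m = x_mm / 1000.0
    return cs * (1.0 - erf(x_m / (2.0 * np.sqrt(da * t_seconds))))

(cs_fit, da_fit), _ = curve_fit(fick_profile, depth_mm, chloride, p0=(1.0, 1e-12))
print(f"Cs = {cs_fit:.2f} % binder, Da = {da_fit:.2e} m^2/s")
```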

Relevance: 90.00%

Abstract:

The microfoundations research agenda presents an expanded theoretical perspective because it considers individuals, their characteristics, and their interactions as relevant variables to help us understand firm-level strategic issues. However, microfoundations empirical research faces unique challenges because processes take place at different levels of analysis and these multilevel processes must be considered simultaneously. We describe multilevel modeling and mixed methods as methodological approaches whose use will allow for theoretical advancements. We describe key issues regarding the use of these two types of methods and, more importantly, discuss pressing substantive questions and topics that can be addressed with each of these methodological approaches with the goal of making theoretical advancements regarding the microfoundations research agenda and strategic management studies in general.
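A minimal sketch of the kind of multilevel (random-intercept) model the authors point to, with individuals nested in firms; the variable names and the simulated data are illustrative only, not drawn from the article.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated illustration: an individual-level predictor explaining an individual-level
# outcome, with a random intercept capturing firm-level (level-2) variation.
rng = np.random.default_rng(0)
n_firms, n_per_firm = 30, 20
firm = np.repeat(np.arange(n_firms), n_per_firm)
firm_effect = rng.normal(0.0, 1.0, n_firms)[firm]
capability = rng.normal(0.0, 1.0, n_firms * n_per_firm)
outcome = 0.5 * capability + firm_effect + rng.normal(0.0, 1.0, n_firms * n_per_firm)

df = pd.DataFrame({"outcome": outcome, "capability": capability, "firm": firm})

# Random-intercept model: outcome ~ capability, with intercepts varying by firm.
result = smf.mixedlm("outcome ~ capability", data=df, groups=df["firm"]).fit()
print(result.summary())
```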

Relevance: 90.00%

Abstract:

As human populations and resource consumption increase, it is increasingly important to monitor the quality of our environment. While laboratory instruments offer useful information, portable, easy-to-use sensors would allow environmental analysis to occur on-site, at lower cost, and with minimal operator training. We explore the synthesis, modification, and applications of modified polysiloxane in environmental sensing. Multiple methods of producing modified siloxanes were investigated. Oligomers were formed by using functionalized monomers, producing siloxane materials containing silicon hydride, methyl, and phenyl side chains. Silicon hydride-functionalized oligomers were further modified by hydrosilylation to incorporate methyl ester and naphthyl side chains. Modifications to the siloxane materials were also carried out using post-curing treatments. Methyl ester-functionalized siloxane was incorporated into the surface of a cured poly(dimethylsiloxane) film by siloxane equilibration. The materials containing methyl esters were hydrolyzed to reveal carboxylic acids, which could later be used for covalent protein immobilization. Finally, the siloxane surfaces were modified to incorporate antibodies by covalent, affinity, and adsorption-based attachment. These modifications were characterized by a variety of methods, including contact angle, attenuated total reflectance Fourier transform infrared spectroscopy, dye labels, and 1H nuclear magnetic resonance spectroscopy. The modified siloxane materials were employed in a variety of sensing schemes. Volatile organic compounds were detected using methyl, phenyl, and naphthyl-functionalized materials on a Fabry-Perot interferometer and a refractometer. The Fabry-Perot interferometer was found to detect the analytes upon siloxane extraction by deformation of the Bragg reflectors. The refractometer was used to determine that naphthyl-functionalized siloxanes had elevated refractive indices, rendering these materials more sensitive to some analytes. Antibody-modified siloxanes were used to detect biological analytes through a solid phase microextraction-mediated enzyme linked immunosorbent assay (SPME ELISA). The SPME ELISA was found to have higher analyte sensitivity compared to a conventional ELISA system. The detection scheme was used to detect Escherichia coli at 8500 CFU/mL. These results demonstrate the variety of methods that can be used to modify siloxanes and the wide range of applications of modified siloxanes in chemical and biological sensing schemes.

Relevance: 90.00%

Abstract:

Hypertrophic cardiomyopathy (HCM) is a cardiovascular disease where the heart muscle is partially thickened and blood flow is - potentially fatally - obstructed. It is one of the leading causes of sudden cardiac death in young people. Electrocardiography (ECG) and Echocardiography (Echo) are the standard tests for identifying HCM and other cardiac abnormalities. The American Heart Association has recommended using a pre-participation questionnaire for young athletes instead of ECG or Echo tests due to considerations of cost and time involved in interpreting the results of these tests by an expert cardiologist. Initially we set out to develop a classifier for automated prediction of young athletes’ heart conditions based on the answers to the questionnaire. Classification results and further in-depth analysis using computational and statistical methods indicated significant shortcomings of the questionnaire in predicting cardiac abnormalities. Automated methods for analyzing ECG signals can help reduce cost and save time in the pre-participation screening process by detecting HCM and other cardiac abnormalities. Therefore, the main goal of this dissertation work is to identify HCM through computational analysis of 12-lead ECG. ECG signals recorded on one or two leads have been analyzed in the past for classifying individual heartbeats into different types of arrhythmia as annotated primarily in the MIT-BIH database. In contrast, we classify complete sequences of 12-lead ECGs to assign patients into two groups: HCM vs. non-HCM. The challenges and issues we address include missing ECG waves in one or more leads and the dimensionality of a large feature-set. We address these by proposing imputation and feature-selection methods. We develop heartbeat-classifiers by employing Random Forests and Support Vector Machines, and propose a method to classify full 12-lead ECGs based on the proportion of heartbeats classified as HCM. The results from our experiments show that the classifiers developed using our methods perform well in identifying HCM. Thus the two contributions of this thesis are the utilization of computational and statistical methods for discovering shortcomings in a current screening procedure and the development of methods to identify HCM through computational analysis of 12-lead ECG signals.
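A minimal sketch of the two-stage idea described above: a heartbeat-level classifier followed by a patient-level decision based on the proportion of beats labelled HCM. The feature matrices, labels, and the 0.5 threshold are placeholder assumptions, not the dissertation's actual pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def predict_patient_hcm(beat_features, beat_labels, new_patient_beats, threshold=0.5):
    """Train a heartbeat-level Random Forest, then call the patient HCM (returns 1)
    if the proportion of their beats classified as HCM exceeds `threshold`."""
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(beat_features, beat_labels)               # beat_labels: 1 = HCM beat, 0 = non-HCM
    beat_predictions = clf.predict(new_patient_beats)
    return int(beat_predictions.mean() > threshold)

# Toy example with random placeholders standing in for real 12-lead beat features:
rng = np.random.default_rng(1)
X_train = rng.normal(size=(500, 24))
y_train = (X_train[:, 0] + X_train[:, 1] > 0).astype(int)   # synthetic beat labels
X_new_patient = rng.normal(loc=0.8, size=(40, 24))          # 40 beats from one new patient
print("Patient predicted HCM:", predict_patient_hcm(X_train, y_train, X_new_patient))
```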