17 results for Isa Label

at Universidad Politécnica de Madrid


Relevance:

20.00%

Publisher:

Abstract:

We have recently demonstrated a biosensor based on a lattice of SU8 pillars on a 1 μm SiO2/Si wafer by measuring the vertical reflectivity as a function of wavelength. Biodetection has been proven with the combination of Bovine Serum Albumin (BSA) protein and its antibody (antiBSA). A BSA layer is attached to the pillars; the biorecognition of antiBSA involves a shift of the reflectivity curve related to the concentration of antiBSA. A detection limit on the order of 2 ng/ml is achieved for a rhombic lattice of pillars with a lattice parameter (a) of 800 nm, a height (h) of 420 nm and a diameter (d) of 200 nm. These results correlate with calculations using the 3D finite-difference time-domain (FDTD) method. A simplified 2D model is proposed, consisting of a multilayer model in which the pillars are replaced by a 420 nm layer with an effective refractive index obtained using a Beam Propagation Method (BPM) algorithm. Results provided by this model are in good agreement with the experimental data while reducing the computation time from one day to 15 minutes, giving a fast but accurate tool to optimize the design, maximize sensitivity, and analyze the influence of the different variables (diameter, height and lattice parameter). Sensitivity is obtained for a variety of configurations, reaching a limit of detection under 1 ng/ml. The optimum design is chosen not only for its sensitivity but also for its feasibility, from both the fabrication (limited by the aspect ratio and proximity of the pillars) and the fluidic points of view. (© 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
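
As an illustration of the simplified multilayer model described above, the sketch below computes normal-incidence reflectivity with a standard transfer-matrix calculation, replacing the pillar lattice by a single 420 nm layer of effective index. The refractive-index values, including the effective index itself, are assumed placeholders (the paper derives the effective index with a BPM algorithm), so this is a hedged sketch of the approach rather than the authors' code.

```python
import numpy as np

def reflectivity(wavelength_nm, layers, n_substrate, n_ambient=1.0):
    """Normal-incidence reflectivity of a thin-film stack (characteristic
    matrix method). layers: list of (refractive_index, thickness_nm),
    ordered from the ambient side toward the substrate."""
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        delta = 2 * np.pi * n * d / wavelength_nm
        M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
    B, C = M @ np.array([1.0, n_substrate])
    r = (n_ambient * B - C) / (n_ambient * B + C)
    return abs(r) ** 2

# Illustrative stack: effective pillar layer (n_eff ~ 1.2, 420 nm) over
# 1000 nm of SiO2 (n ~ 1.45) on a Si substrate (n ~ 3.6); all values assumed.
wavelengths = np.linspace(500, 900, 401)
R = [reflectivity(wl, [(1.2, 420.0), (1.45, 1000.0)], n_substrate=3.6)
     for wl in wavelengths]
# Attaching a protein layer slightly raises the effective index, and the
# resulting shift of this curve is the sensing signal described above.
```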

Relevance:

20.00%

Publisher:

Abstract:

Previous studies of methionine+cystine recommendations for laying hens are very numerous, but the results obtained show great variability and, in some cases, are contradictory. This variability is explained by the conditions under which each study was carried out, the age of the hens, their genetics and the parameter to be optimized. In this regard, Novak et al. (2004) observed that total methionine+cystine requirements were higher to maximize egg weight than to optimize egg production or feed efficiency. These differences were less important from 20 to 43 weeks (8%) than from 44 to 63 weeks of age (16%). In addition, the recommendations to optimize production and egg weight were 17% and 11% higher, respectively, in the first period than in the second. In contrast, Waldroup and Hellwig (1995) found that total methionine+cystine requirements to optimize egg production and egg mass were higher (12 and 10%, respectively) from 51 to 71 weeks of age than from 25 to 45. When the recommendations are expressed in digestible units, the range of digestible methionine+cystine requirements relative to digestible lysine varies from 81 to 107% (81%: Coon and Zhang, 1999; 90%: FEDNA, 2008; 91%: Rostagno et al., 2005; 93%: CVB, 1996; 94%: Bregendahl et al., 2008; 99%: Brumano et al., 2010a; 100%: Cupertino et al., 2009; Brumano et al., 2010a; 101%: Brumano et al., 2010b; 107%: Schmidt et al., 2009). As a consequence of this high variability, further research is needed on the optimum digestible methionine+cystine to digestible lysine ratio for optimizing the performance of laying hens. Therefore, the objective of this work is to determine the optimum digestible methionine+cystine requirement relative to digestible lysine for Isa Brown hens from 34 to 42 weeks of age.
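
To show how such ratios are used in practice, the sketch below converts a few of the cited ratios into daily digestible Met+Cys allowances for an assumed digestible lysine intake; the 750 mg/day figure is an illustrative assumption, not a value from the study.

```python
# assumed daily digestible lysine intake per hen (illustrative, not from the study)
digestible_lys_mg_per_day = 750.0
# digestible Met+Cys / digestible Lys ratios cited above
ratios = {"Coon and Zhang, 1999": 0.81, "FEDNA, 2008": 0.90,
          "Rostagno et al., 2005": 0.91, "CVB, 1996": 0.93,
          "Bregendahl et al., 2008": 0.94, "Schmidt et al., 2009": 1.07}

for source, ratio in ratios.items():
    met_cys = ratio * digestible_lys_mg_per_day
    print(f"{source}: {met_cys:.0f} mg/day digestible Met+Cys")
# The 26-percentage-point spread in the ratio translates into roughly a
# 200 mg/day spread in the Met+Cys allowance for the same lysine intake,
# which is the variability motivating this experiment.
```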

Relevance:

20.00%

Publisher:

Abstract:

In previous works we demonstrated the benefits of using micro-nano patterned materials as bio-photonic sensing cells (BICELLs), referred to as micro-nano photonic structures having immobilized bioreceptors on their surface, with the capability of recognizing molecular binding by optical transduction. Gestrinone/anti-gestrinone and BSA/anti-BSA pairs were tested under different optical configurations to experimentally validate the biosensing capability of these bio-sensitive photonic architectures. Moreover, three-dimensional Finite-Difference Time-Domain (FDTD) models were employed to simulate the optical response of these structures. For this article, we have developed an effective analytical simulation methodology capable of simulating complex biophotonic sensing architectures. This simulation method has been tested and compared with previous experimental results and FDTD models. Moreover, this effective simulation methodology can be used to efficiently design and optimize any structure acting as a BICELL. In particular, for this article six different BICELL types have been optimized. To carry out this optimization we have considered three figures of merit: optical sensitivity, Q-factor and signal amplitude. The final objective of this paper is not only to validate a suitable and efficient optical simulation methodology but also to demonstrate the capability of this method for analyzing the performance of a given number of BICELLs for label-free biosensing.
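
For concreteness, the sketch below shows how the three figures of merit named above could be extracted from a simulated reflectivity dip; the spectra and the refractive-index step passed to these functions are assumed inputs, not values from the paper.

```python
import numpy as np

def dip_figures_of_merit(wavelengths, reflectivity):
    """Resonance wavelength, Q-factor and signal amplitude of a single dip."""
    i_min = np.argmin(reflectivity)
    lam0 = wavelengths[i_min]
    amplitude = reflectivity.max() - reflectivity.min()   # signal amplitude
    half = reflectivity.min() + amplitude / 2.0           # half-depth level
    in_dip = wavelengths[reflectivity <= half]
    fwhm = in_dip.max() - in_dip.min()
    return lam0, lam0 / fwhm, amplitude

def optical_sensitivity(lam0_bare, lam0_coated, delta_n):
    """Dip shift per refractive-index unit (nm/RIU) between two simulations."""
    return (lam0_coated - lam0_bare) / delta_n
```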

Relevance:

20.00%

Publisher:

Abstract:

The label-free immunoassay sector is a ferment of activity, experiencing rapid growth as new technologies come forward and achieve acceptance. The landscape is changing in a "bottom-up" fashion, as individual companies promote individual technologies and find a market for them. Therefore, each of the companies operating in the label-free immunoassay sector offers a technology that is in some way unique and proprietary. However, not many label-free technologies are currently on the market for Point-of-Care (PoC) and High-Throughput Screening (HTS) applications, where mature labeled technologies have taken the market.

Relevance:

20.00%

Publisher:

Abstract:

The field of optical label-free biosensors has become a topic of interest over the past years, with devices based on the detection of angular or wavelength shifts of optical modes [1]. Common parameters used to characterize their performance are the Limit of Detection (LOD), defined as the minimum change of refractive index upon the sensing surface that the device is able to detect, and the BioLOD, which represents the minimum amount of target analyte accurately resolved by the system, expressed in units of concentration (common units are ppm, ng/ml, or nM). The LOD gives a first value with which to compare different biosensors, and is obtained both theoretically (using photonic calculation tools) and experimentally, by covering the sensing area with fluids of different refractive indexes.
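
As a worked example of how these two figures are typically quoted, the sketch below turns an assumed readout resolution and an assumed bulk sensitivity into an LOD; neither number comes from the abstract.

```python
wavelength_resolution_nm = 0.01       # assumed 3-sigma noise of the spectral readout
bulk_sensitivity_nm_per_riu = 100.0   # assumed dip shift per refractive-index unit

lod_riu = wavelength_resolution_nm / bulk_sensitivity_nm_per_riu
print(f"LOD ~ {lod_riu:.0e} RIU")     # 1e-04 RIU for these placeholder numbers
# The BioLOD is estimated the same way, but from a calibration curve of signal
# versus analyte concentration (e.g., ng/ml), which is why it carries
# concentration units rather than refractive-index units.
```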

Relevance:

20.00%

Publisher:

Abstract:

The biological detection sectors continuously demand more efficient and precise analysis and diagnostic techniques to identify diseases and develop new drugs. There is currently a great need to develop diagnostic tools able to guarantee sensitivity, speed, simplicity and affordability for applications in sectors such as health, food, the environment or security. In the clinical field, deep technological advances are needed that can offer fast, accurate, reliable and affordable analyses, leading to clinical and economic improvement through efficient diagnosis. In particular, there is growing interest in the decentralization of clinical diagnosis through detection platforms close to the end user, known as POCs (Point-of-Care devices). The use of POCs (referring to diagnosis close to the end user or outside the clinical analysis laboratory), through in vitro detection (IVD), will be extremely useful in health centres, clinics or hospital units, workplaces, or even at home. Furthermore, the development of genomics, proteomics and other technologies known as "omics" (the suffix referring, for example, to genomics, transcriptomics, proteomics, metabolomics, lipidomics) is increasing the demand for much more advanced technologies with a clear orientation towards personalized medicine and the need to cope with changes in treatment in the case of complex diseases. Biophotonic Sensing Cells (BICELLs) have recently been defined as a novel methodology for the detection of biological agents, offering a series of attractive features: multiplexing capability, high sensitivity, the possibility of measuring in a drop, and compatibility with other technologies. In this work, different types of BICELLs are studied and optimized, and a series of figures of merit to be taken into account from the point of view of the optical reader to be used are assessed.

Relevance:

20.00%

Publisher:

Abstract:

The use of Biophotonic Sensing Cells (BICELLs) based on micro-nano patterned photonic architectures has recently been proven as an efficient methodology for label-free biosensing by optical interrogation [1]. Accordingly, we have studied the optical response of a specific BICELL typology consisting of SU-8 structures. This material is biocompatible, and different types of biomolecules can be immobilized on its sensing surface. In particular, we have measured the optical response for a biomarker used in the clinical diagnosis of dry eye. Although different proteins related with the ocular surface (dry eye) could be studied, such as PRDX5, ANXA1, ANXA11, CST4, PLAA and S100A6, in this work PLAA (phospholipase A2) is studied by means of label-free biosensing based on BICELLs, analyzing the performance and specificity according to mean concentration values in ROC curves.
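
To make the last step concrete, the sketch below builds a ROC curve from replicate optical shifts of positive and control samples; the shift values and the scikit-learn based analysis are illustrative assumptions, not the study's data.

```python
import numpy as np
from sklearn.metrics import roc_curve, auc

# 1 = sample containing the target biomarker, 0 = control
labels = np.array([1, 1, 1, 1, 1, 0, 0, 0, 0, 0])
# measured resonance shifts in nm (hypothetical values)
shifts = np.array([1.9, 2.3, 1.7, 2.8, 2.1, 0.4, 0.9, 0.6, 1.1, 0.7])

fpr, tpr, thresholds = roc_curve(labels, shifts)
print(f"AUC = {auc(fpr, tpr):.2f}")
# The threshold giving the best sensitivity/specificity trade-off can then be
# read from (fpr, tpr, thresholds), which is how specificity is reported here.
```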

Relevance:

20.00%

Publisher:

Abstract:

Multi-label classification (MLC) is the supervised learning problem where an instance may be associated with multiple labels. Modeling dependencies between labels allows MLC methods to improve their performance at the expense of an increased computational cost. In this paper we focus on the classifier chains (CC) approach for modeling dependencies. On the one hand, the original CC algorithm makes a greedy approximation, and is fast but tends to propagate errors down the chain. On the other hand, a recent Bayes-optimal method improves the performance, but is computationally intractable in practice. Here we present a novel double-Monte Carlo scheme (M2CC), both for finding a good chain sequence and performing efficient inference. The M2CC algorithm remains tractable for high-dimensional data sets and obtains the best overall accuracy, as shown on several real data sets with input dimension as high as 1449 and up to 103 labels.
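
A rough sketch of the inference half of such a scheme (assuming binary labels and the default left-to-right chain order): sample full label vectors from a fitted scikit-learn ClassifierChain and keep the most frequent one. This is a hedged approximation of Monte Carlo chain inference, not the authors' M2CC implementation, and the base learner is an arbitrary choice.

```python
import numpy as np
from collections import Counter
from sklearn.linear_model import LogisticRegression
from sklearn.multioutput import ClassifierChain

def mc_chain_predict(chain, x, n_samples=200, rng=None):
    """Approximate the joint mode of a fitted ClassifierChain by sampling."""
    rng = rng or np.random.default_rng(0)
    counts = Counter()
    for _ in range(n_samples):
        y, feats = [], list(x)
        for est in chain.estimators_:               # follow the chain order
            p1 = est.predict_proba([feats])[0][1]   # P(label = 1 | x, earlier labels)
            bit = int(rng.random() < p1)            # sample instead of thresholding
            y.append(bit)
            feats.append(bit)                       # feed the sample down the chain
        counts[tuple(y)] += 1
    return np.array(counts.most_common(1)[0][0])

# usage sketch:
#   chain = ClassifierChain(LogisticRegression()).fit(X_train, Y_train)
#   y_hat = mc_chain_predict(chain, X_test[0])
```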

Relevance:

20.00%

Publisher:

Abstract:

The aim of this paper is to develop a probabilistic modeling framework for the segmentation of structures of interest from a collection of atlases. Given a subset of atlases registered to the target image for a particular Region of Interest (ROI), a statistical model of appearance and shape is computed for fusing the labels. Segmentations are obtained by minimizing an energy function associated with the proposed model, using a graph-cut technique. We test different label fusion methods on publicly available MR images of human brains.
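
For reference, the simplest of those label fusion methods, per-voxel majority voting over the propagated atlas labels, can be written as below; the toy arrays are invented, and the paper's actual model (statistical appearance/shape with graph cuts) is more elaborate.

```python
import numpy as np

def majority_vote_fusion(atlas_labels):
    """atlas_labels: (n_atlases, *image_shape) integer labels already
    propagated (registered) to the target image grid.
    Returns the per-voxel most frequent label."""
    atlas_labels = np.asarray(atlas_labels)
    n_labels = atlas_labels.max() + 1
    votes = np.stack([(atlas_labels == k).sum(axis=0) for k in range(n_labels)])
    return votes.argmax(axis=0)

# toy example: 3 atlases, a 2x3 image, labels {0: background, 1: ROI}
atlases = np.array([[[0, 1, 1], [0, 0, 1]],
                    [[0, 1, 0], [0, 1, 1]],
                    [[1, 1, 1], [0, 0, 1]]])
print(majority_vote_fusion(atlases))   # per-voxel consensus: [[0 1 1], [0 0 1]]
```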

Relevance:

20.00%

Publisher:

Abstract:

Previous studies of methionine+cystine recommendations for laying hens are very numerous, but the results obtained show great variability and, in some cases, are contradictory. This variability is explained by the conditions under which each study was carried out, the age of the hens, their genetics and the parameter to be optimized.

Relevance:

20.00%

Publisher:

Abstract:

Bayesian network classifiers are widely used in machine learning because they intuitively represent causal relations. Multi-label classification problems require each instance to be assigned a subset of a defined set of h labels. This problem is equivalent to finding a multi-valued decision function that predicts a vector of h binary classes. In this paper we obtain the decision boundaries of two widely used Bayesian network approaches for building multi-label classifiers: Multi-label Bayesian network classifiers built using the binary relevance method and Bayesian network chain classifiers. We extend our previous single-label results to multi-label chain classifiers, and we prove that, as expected, chain classifiers provide a more expressive model than the binary relevance method.
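
A small sketch of the two approaches being compared, using Gaussian naive Bayes base models from scikit-learn as stand-ins for general Bayesian network classifiers; the synthetic data and the base learner are assumptions made only to show the structural difference.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.multioutput import MultiOutputClassifier, ClassifierChain

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
# two correlated binary labels (synthetic): label 1 depends on label 0
y0 = (X[:, 0] + X[:, 1] > 0).astype(int)
y1 = ((X[:, 2] > 0) ^ (y0 == 1)).astype(int)
Y = np.column_stack([y0, y1])

# binary relevance: one independent classifier per label
br = MultiOutputClassifier(GaussianNB()).fit(X, Y)
# chain classifier: label 0's prediction becomes an extra feature for label 1,
# which is what makes the chain model strictly more expressive
chain = ClassifierChain(GaussianNB(), order=[0, 1]).fit(X, Y)

print("binary relevance:", br.predict(X[:3]))
print("chain classifier:", chain.predict(X[:3]))
```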

Relevance:

20.00%

Publisher:

Abstract:

Interneuron classification is an important and long-debated topic in neuroscience. A recent study provided a data set of digitally reconstructed interneurons classified by 42 leading neuroscientists according to a pragmatic classification scheme composed of five categorical variables, namely, of the interneuron type and four features of axonal morphology. From this data set we now learned a model which can classify interneurons, on the basis of their axonal morphometric parameters, into these five descriptive variables simultaneously. Because of differences in opinion among the neuroscientists, especially regarding neuronal type, for many interneurons we lacked a unique, agreed-upon classification, which we could use to guide model learning. Instead, we guided model learning with a probability distribution over the neuronal type and the axonal features, obtained, for each interneuron, from the neuroscientists’ classification choices. We conveniently encoded such probability distributions with Bayesian networks, calling them label Bayesian networks (LBNs), and developed a method to predict them. This method predicts an LBN by forming a probabilistic consensus among the LBNs of the interneurons most similar to the one being classified. We used 18 axonal morphometric parameters as predictor variables, 13 of which we introduce in this paper as quantitative counterparts to the categorical axonal features. We were able to accurately predict interneuronal LBNs. Furthermore, when extracting crisp (i.e., non-probabilistic) predictions from the predicted LBNs, our method outperformed related work on interneuron classification. Our results indicate that our method is adequate for multi-dimensional classification of interneurons with probabilistic labels. Moreover, the introduced morphometric parameters are good predictors of interneuron type and the four features of axonal morphology and thus may serve as objective counterparts to the subjective, categorical axonal features.
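
A minimal sketch of the prediction principle described above, assuming Euclidean similarity over the morphometric parameters and plain averaging as the consensus rule (the paper's actual consensus over label Bayesian networks is richer than this):

```python
import numpy as np

def consensus_prediction(x_new, X_train, label_dists, k=5):
    """X_train: (n, d) morphometric parameters of already-characterized cells.
    label_dists: (n, c) per-cell probability distributions over c categories
    (e.g., interneuron type), as elicited from the neuroscientists.
    Returns the averaged (consensus) distribution for the new cell."""
    dists = np.linalg.norm(X_train - x_new, axis=1)
    nearest = np.argsort(dists)[:k]
    return label_dists[nearest].mean(axis=0)

def crisp_prediction(consensus):
    """Non-probabilistic prediction: most probable category of the consensus."""
    return int(np.argmax(consensus))
```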

Relevance:

20.00%

Publisher:

Abstract:

A solution to the problem of software reusability for batch production systems is proposed. It is based on the ISA S88 standard, which prescribes the abstraction of the elements of the manufacturing system (equipment, processes and procedures) required to make a product batch. An easy-to-apply data scheme, compatible with the standard, is developed for the management of production information. In addition to the flexibility provided by the S88 standard, software reusability requires a solution supporting manufacturing equipment reconfigurability; toward this end, a coupling mechanism is developed. A software tool implementing these solutions was developed and validated at laboratory level, using product manufacturing information from an actual plant.
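
A minimal sketch, under assumed names and structure, of how an S88-style data scheme might separate the recipe procedure from the physical equipment so that recipes survive plant reconfiguration; it is not the paper's actual scheme.

```python
from dataclasses import dataclass, field
from typing import List, Dict

@dataclass
class Phase:                      # smallest procedural element (e.g., "charge", "heat")
    name: str
    parameters: Dict[str, float] = field(default_factory=dict)

@dataclass
class Operation:                  # ordered set of phases
    name: str
    phases: List[Phase] = field(default_factory=list)

@dataclass
class UnitProcedure:              # operations carried out in one unit
    name: str
    operations: List[Operation] = field(default_factory=list)

@dataclass
class Recipe:                     # procedure, decoupled from physical equipment
    product: str
    unit_procedures: List[UnitProcedure] = field(default_factory=list)
    # coupling mechanism: procedural units are bound to equipment by name only,
    # so swapping "reactor_1" for "reactor_2" does not change the recipe logic
    equipment_binding: Dict[str, str] = field(default_factory=dict)

recipe = Recipe(
    product="batch_A",
    unit_procedures=[UnitProcedure("mixing",
                                   [Operation("blend", [Phase("charge", {"kg": 50})])])],
    equipment_binding={"mixing": "reactor_1"},
)
```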

Relevance:

20.00%

Publisher:

Abstract:

The Integrated Safety Analysis (ISA) methodology, developed in the Modelling and Simulation (MOSI) branch of the Consejo de Seguridad Nuclear (CSN), is an integrated safety analysis method that is being evaluated and analyzed through several applications promoted by the CSN. Integrated safety analysis combines the evolved techniques of the safety analyses in current use, deterministic and probabilistic, and is considered suitable to support Risk-Informed Regulation (RIR), the current approach to nuclear safety that is being developed and applied around the world. Within this context fall the Safety Margin Action Plan (SMAP) and Safety Margin Assessment Application (SM2A) projects, promoted by the Committee on the Safety of Nuclear Installations (CSNI) of the Nuclear Energy Agency (NEA) of the Organisation for Economic Co-operation and Development (OECD), aimed at developing a suitable approach for using integrated methodologies to assess the change in safety margins due to changes in the conditions of nuclear power plants. The committee provides a forum for the exchange of technical information and for cooperation among member organizations, which contribute their own ideas in research, development and engineering. The CSN proposal is the application of the ISA methodology, which is especially well suited to the approach developed in the SMAP project: obtaining best-estimate values, with uncertainty, of the safety variables that are compared with the safety limits, in order to obtain the frequency with which those limits are exceeded. The advantage offered by ISA is that it allows a selective and discrete analysis of the ranges of the uncertain parameters with the greatest influence on the exceedance of the safety limits, or limit exceedance frequency, thus making it possible to evaluate changes produced by variations in the design or operation of the plant that would be imperceptible, or complicated to quantify, with other types of methodologies. ISA belongs to the family of discrete dynamic PSA methodologies that use the generation of dynamic event trees (DET) and is based on the Theory of Stimulated Dynamics (TSD), a simplified dynamic reliability theory that allows the risk of each sequence to be quantified. With ISA, all the relevant interactions in a plant are modelled and simulated: design, operating conditions, maintenance, operator actions, stochastic events, etc. It therefore requires the integration of codes for thermal-hydraulic simulation and operating procedures, event tree delineation, fault tree and event tree quantification, uncertainty treatment and risk integration. The thesis contains the application of the ISA methodology to the integrated analysis of the initiating event of loss of the Component Cooling Water System (CCWS), which generates sequences of loss of reactor coolant through the seals of the main reactor coolant pumps (SLOCA). It is used to test the change in margins, with respect to the maximum peak cladding temperature limit (1477 K), that would be possible under a potential 10% power uprate in the pressurized water reactor of the Zion nuclear power plant.
The work carried out for this thesis, the result of the collaboration of the Escuela Técnica Superior de Ingenieros de Minas y Energía and the technological solutions company Ekergy Software S.L. (NFQ Solutions) with the MOSI branch of the CSN, has been the basis for the CSN contribution to the SM2A exercise. This exercise has been used as an assessment of the development of some of the ideas, suggestions and algorithms behind the ISA methodology. The result obtained is a slight increase in the damage exceedance frequency (DEF) caused by the power uprate. This result demonstrates the viability of the ISA methodology for measuring the variations in safety margins produced by plant modifications, and shows that it is especially suitable for scenarios in which stochastic events or operator recovery and mitigation actions can play a relevant role in the risk. The results have no validity beyond demonstrating the viability of the ISA methodology: the nuclear power plant to which the study is applied has been shut down and the information on its safety analyses is limited, so unverified assumptions and approximations based on generic studies or on other plants have been necessary. Three phases were established in the analysis process: first, obtaining the reference dynamic event tree; second, uncertainty analysis and obtaining the damage domains; and third, risk quantification. Several applications of the methodology and its advantages over classical PSA have been shown, and a contribution has also been made to the development of the prototype tool for the application of the ISA methodology (SCAIS).
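
As a toy illustration of the kind of figure the exercise produces, the sketch below estimates a damage exceedance frequency by sampling two uncertain parameters of a surrogate sequence model and weighting by the initiating-event frequency; every number and the surrogate temperature model are invented placeholders, not thesis results.

```python
import numpy as np

rng = np.random.default_rng(1)
initiating_event_freq = 1e-2          # assumed CCWS-loss frequency, per reactor-year
n_samples = 100_000

# stand-in surrogate for the thermal-hydraulic simulation: peak cladding
# temperature as a function of two uncertain parameters (invented model)
seal_leak_area = rng.uniform(0.5, 2.0, n_samples)        # relative units
operator_delay = rng.exponential(20.0, n_samples)        # minutes
peak_clad_temp = 900.0 + 250.0 * seal_leak_area + 6.0 * operator_delay

# fraction of the uncertain-parameter space falling inside the damage domain
p_exceed = np.mean(peak_clad_temp > 1477.0)
damage_exceedance_freq = initiating_event_freq * p_exceed
print(f"DEF ~ {damage_exceedance_freq:.2e} per reactor-year")
```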