922 results for "Application specific algorithm"
Abstract:
Master's dissertation in Systems Engineering
Abstract:
The authors acknowledge Sofia Neves from ICVS for her help with antibody selection.
Abstract:
Cardiopulmonary arrest is a medical emergency in which the time elapsed between event onset and the initiation of basic and advanced life support measures, as well as correct care based on specific protocols for each clinical situation, are decisive factors for successful therapy. Because of its fulminant nature, cardiopulmonary arrest care cannot be restricted to the hospital setting. This necessitates the creation of new concepts, strategies and structures, such as the chain of survival, cardiopulmonary resuscitation courses for professionals working in emergency medical services, the automated external defibrillator, the implantable cardioverter-defibrillator, and mobile intensive care units, among others. New concepts, strategies and structures motivated by new advances have also modified treatment and improved the results of cardiopulmonary resuscitation in the hospital setting. Among them we can cite the concept of cerebral resuscitation, the application of the chain of survival, the creation of the universal life support algorithm, the adjustment of drug doses, new techniques such as measurement of end-tidal carbon dioxide levels and of coronary perfusion pressure, and new drugs under research.
Abstract:
Under the framework of constraint-based modeling, genome-scale metabolic models (GSMMs) have been used for several tasks, such as metabolic engineering and phenotype prediction. More recently, their application in health-related research has spanned drug discovery, biomarker identification and host-pathogen interactions, targeting diseases such as cancer, Alzheimer's disease, obesity and diabetes. In recent years, the development of novel techniques for genome sequencing and other high-throughput methods, together with advances in Bioinformatics, has allowed the reconstruction of GSMMs for human cells. Considering the diversity of cell types and tissues present in the human body, it is imperative to develop tissue-specific metabolic models. Methods to automatically generate these models, based on generic human metabolic models and a plethora of omics data, have been proposed, but their results have not yet been adequately and critically evaluated and compared. This work presents a survey of the most important tissue- or cell-type-specific metabolic model reconstruction methods, which use literature, transcriptomics, proteomics and metabolomics data, together with a global template model. As a case study, we analyzed the consistency between several omics data sources and reconstructed distinct metabolic models of hepatocytes using different methods and data sources as inputs. The results show that the omics data sources overlap poorly and, in some cases, are even contradictory. Additionally, the generated hepatocyte metabolic models are in many cases unable to perform metabolic functions known to be present in liver tissue. We conclude that reliable methods for a priori omics data integration are required to support the reconstruction of complex models of human cells.
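As a minimal illustration of the constraint-based modeling framework mentioned above, the sketch below runs a flux balance analysis on a toy three-reaction network with scipy.optimize.linprog; the stoichiometric matrix, bounds and objective are invented placeholders, not part of any of the reconstructed hepatocyte models.

```python
# Minimal flux-balance-analysis sketch for a toy network (illustrative only).
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix: rows = metabolites, columns = reactions
# R0: uptake -> A, R1: A -> B, R2: B -> biomass (objective)
S = np.array([
    [1, -1,  0],   # metabolite A
    [0,  1, -1],   # metabolite B
])

lb = [0.0, 0.0, 0.0]          # irreversible reactions
ub = [10.0, 1000.0, 1000.0]   # uptake limited to 10 flux units

c = np.array([0.0, 0.0, -1.0])  # maximize flux through R2 (linprog minimizes)

res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]),
              bounds=list(zip(lb, ub)), method="highs")
print("optimal biomass flux:", -res.fun)   # expected: 10.0 (limited by uptake)
print("flux distribution:", res.x)
```

The steady-state constraint S v = 0 together with the flux bounds is exactly the linear program that tissue-specific reconstruction methods restrict further using omics evidence.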
Abstract:
The Smart Drug Search is publicly accessible at http://sing.ei.uvigo.es/sds/. The BIOMedical Search Engine Framework is freely available for non-commercial use at https://github.com/agjacome/biomsef
Abstract:
Environmental pollution by heavy metals such as chromium and by organic compounds such as phenols is a serious worldwide problem because of their toxicity and their adverse effects on humans, flora and fauna, both through their accumulation in the food chain and through their persistence in the environment. In a preliminary study carried out by our laboratory, high levels of these pollutants were detected in sediments and effluents from industrial zones in the south of Córdoba Province, which raises the need to remove them. Among the available technologies, bioremediation, which is based on the use of biological systems such as microorganisms to detoxify and degrade pollutants, appears as an alternative that is likely more effective and less expensive than conventional techniques. However, the application of this technology depends to a large extent on the particular, site-specific characteristics of the area to be remediated. Consequently, the sampling area will first be characterized, and native microorganisms of the region tolerant to chromium and phenol will be isolated and identified from soil, water and sediment samples, since they could constitute a suitable biotechnological tool, better adapted to the site to be treated. The bioremediation of Cr and phenol by these microorganisms will then be studied, analyzing their capacity to biotransform, bioaccumulate or biosorb these pollutants, and the optimal conditions for the treatment will be determined. The possible physiological, biochemical and molecular mechanisms involved in remediation will be analyzed, a crucial step for the design of an adequate and efficient strategy. Finally, this technology will be applied at reactor scale, as a first approximation to larger-scale treatment. In this way, we expect to reduce the levels of these pollutants and thus minimize the environmental impact they produce on soils and aquifers. In the future, the use of the selected microorganisms, individually or as consortia, for the treatment of industrial effluents prior to their release into the environment, or their use in bioaugmentation, would constitute possible applications. The main scientific and technological impacts of the project will be: (a) the generation of a new biological technology for chromium and phenol decontamination, offering solutions to an environmental problem that affects our region but is also common to most countries; (b) the training of new human resources in the area; and (c) collaborative work with other research groups that stand out in the field of environmental biotechnology.
Abstract:
PROBLEM IDENTIFICATION. Non-biodegradable water-soluble organic substances such as certain herbicides, industrial dyes and metabolites of widely used drugs are one of the main sources of contamination in groundwater of agricultural areas and in industrial and domestic effluents. Reactions photocatalyzed by UV-visible irradiation and organic or inorganic sensitizers are among the most economical and convenient methods for decomposing pollutants into harmless and/or biodegradable by-products. In many applications, a high degree of specificity, effectiveness and speed is desirable in the degradation of a given pollutant present in a complex mixture of organic substances in solution. In particular, nano/micro-particulate systems that form stable aqueous suspensions are highly desirable because they allow easy application and effective decontamination of large volumes of fluids. HYPOTHESIS AND OBJECTIVES. The general objective of this project is to develop nano/micro-particulate systems formed by molecularly imprinted polymers (MIPs) and photosensitizers (PS). An MIP is a polymer specifically synthesized to recognize a given analyte (the template molecule). The specific binding activity of MIPs, together with the photocatalytic capacity of the sensitizers, can be used to achieve the specific photodecomposition of "template" molecules (in this case a given pollutant) in solutions containing complex mixtures of organic substances. MATERIALS AND METHODS. Mini-emulsion polymerization techniques will be used to synthesize the nano/micro MIP-PS systems aimed at degrading certain compounds of interest. Various spectroscopic techniques (steady-state and time-resolved) and chromatography (HPLC and GC) will be used to characterize the efficiency, mechanisms and specificity of photodegradation in these systems. Likewise, single-molecule/single-particle fluorescence techniques will be used to directly measure distributions of binding affinities and photodegradation efficiencies. These determinations will yield important results for analyzing the factors that affect photodegradation efficiency at the nano/micro scale, such as the amount and location of photosensitizers in the polymer matrices and the binding efficiency of the template and the degradation products to the MIP. EXPECTED RESULTS. The proposed studies aim at a better understanding of photoinitiated processes in nano/micro-particulate environments, in order to apply this knowledge to the design of optimized systems for the selective photodestruction of socially relevant aqueous pollutants, such as herbicides, industrial waste and metabolites of widely used drugs. IMPORTANCE OF THE PROJECT. The nano/micro-particulate MIP-PS systems proposed in this project are ideal candidates for specific treatments of industrial and domestic effluents in which the selective degradation of organic compounds is desired. The knowledge acquired will be essential to build a versatile platform of specific photocatalytic systems for the degradation of various organic pollutants of social interest.
Regarding the training of human resources, the proposed project will directly contribute to the training of 3 graduate students and 2 undergraduate students. In terms of institutional capabilities, it will contribute to the refurbishment of the Laboratory for Advanced Optical Microscopy (LMOA) in the Department of Chemistry at UNRC and to the assembly of a fluorescence microscope system that will allow the application of advanced single-molecule fluorescence spectroscopy techniques.
Abstract:
Advances in computing power today come from parallel processing, given the characteristics of the new hardware architectures. Using this hardware appropriately accelerates the algorithms (programs) being executed. However, properly converting an algorithm into its parallel form is complex and, moreover, that form is specific to each type of parallel hardware. The most common general-purpose processors today are multicore parallel processors, also called Symmetric Multi-Processors (SMP). Nowadays it is hard to find a desktop processor that lacks SMP-style parallelism, and the development trend is toward processors with an ever larger number of available cores. Graphics Processing Units (GPU), in turn, have increased their computing power by incorporating multiple processing units in their electronic design, to the point that it is now common to find GPU boards capable of running 200 to 400 parallel processing threads. These processors are very fast and specific to the task for which they were designed, mainly video processing. However, since this kind of processing has much in common with scientific computing, these devices have been reoriented under the name General-Purpose Graphics Processing Unit (GPGPU). Unlike the SMP processors mentioned above, GPGPUs are not general purpose and are harder to use for general work, owing to the limited amount of memory available on each board and to the kind of parallel processing required for their use to be productive. Programmable logic devices (FPGAs) are capable of performing large numbers of operations in parallel, so they can be used to implement specific algorithms that exploit the parallelism they offer. Their drawback is the complexity of programming and testing the algorithm instantiated on the device. Given this diversity of parallel processors, the objective of our work is to analyze the specific characteristics of each of them and their impact on the structure of algorithms, so that their use yields processing performance commensurate with the resources employed, and to combine them in such a way that they complement one another beneficially. Specifically, starting from the hardware characteristics, we aim to determine the properties a parallel algorithm must have in order to be accelerated. The characteristics of the parallel algorithms will, in turn, determine which of these types of hardware is the most suitable for their instantiation. In particular, we will take into account the degree of data dependency, the need for synchronization during parallel processing, the size of the data to be processed, and the complexity of parallel programming on each type of hardware.
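As a small illustration of the SMP case discussed above, the following sketch distributes an independent, CPU-bound kernel across cores with Python's multiprocessing module; the workload, data and chunk size are placeholders chosen only to show the low-data-dependency, no-synchronization pattern that maps well onto multicore processors.

```python
# A minimal SMP sketch: a data-parallel, low-dependency kernel distributed
# across cores with multiprocessing; the workload itself is illustrative.
from multiprocessing import Pool
import math

def kernel(x: float) -> float:
    # Independent, CPU-bound work item: no shared state, no synchronization
    return sum(math.sin(x + i) ** 2 for i in range(10_000))

if __name__ == "__main__":
    data = [float(i) for i in range(1_000)]
    with Pool() as pool:                        # one worker per available core
        results = pool.map(kernel, data, chunksize=50)
    print(len(results), sum(results))
```

Kernels with strong data dependencies or frequent synchronization would not scale this way, which is precisely the kind of property the project proposes to examine when matching algorithms to SMP, GPGPU or FPGA targets.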
Abstract:
The parameterized expectations algorithm (PEA) involves a long simulation and a nonlinear least squares (NLS) fit, both embedded in a loop. Both steps are natural candidates for parallelization. This note shows that parallelization can lead to important speedups for the PEA. I provide example code for a simple model that can serve as a template for parallelization of more interesting models, as well as a download link for an image of a bootable CD that allows creation of a cluster and execution of the example code in minutes, with no need to install any software.
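The note's own example code is not reproduced here; the following is a hedged Python sketch of the same structure, under the assumption that the long simulation can be split into independent blocks with separate seeds. Each PEA iteration runs the simulation blocks in worker processes, pools the paths, and then performs the NLS fit. The toy law of motion, the parametric rule expectation_rule, and all numbers are placeholders, not the note's model.

```python
# Hedged sketch of one way to parallelize the PEA loop: independent simulation
# blocks (separate seeds) run in worker processes, paths are pooled, and the
# parameterized expectation is re-fitted by nonlinear least squares.
import numpy as np
from multiprocessing import Pool
from scipy.optimize import curve_fit

def simulate_block(args):
    """Simulate one independent block of the state path given current beta."""
    seed, n, beta = args
    rng = np.random.default_rng(seed)
    logk = np.zeros(n)
    shocks = rng.normal(0.0, 0.02, size=n)
    for t in range(1, n):
        expect = beta[0] + beta[1] * logk[t - 1]   # parameterized expectation term
        logk[t] = 0.8 * logk[t - 1] + 0.1 * expect + shocks[t]
    return np.exp(logk)                            # capital path, strictly positive

def expectation_rule(k, b0, b1):
    """Parametric form of the conditional expectation as a function of the state."""
    return b0 + b1 * np.log(k)

if __name__ == "__main__":
    beta = np.array([0.0, 0.5])
    for _ in range(5):                             # outer PEA iterations
        with Pool() as pool:                       # parallel simulation step
            blocks = pool.map(simulate_block,
                              [(seed, 2_000, beta) for seed in range(8)])
        # NLS step: fit the rule to the realized next-period quantity within each block
        x = np.concatenate([b[:-1] for b in blocks])
        y = np.concatenate([np.log(b[1:]) for b in blocks])
        beta, _ = curve_fit(expectation_rule, x, y, p0=beta)
    print("fitted expectation parameters:", beta)
```

The same skeleton (parallel simulation, serial fit, outer loop until the expectation parameters converge) is what makes the speedups reported in the note possible on a cluster.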
Abstract:
The classical Lojasiewicz inequality and its extensions to partial differential equation problems (Simon) and to o-minimal structures (Kurdyka) have a considerable impact on the analysis of gradient-like methods and related problems: minimization methods, complexity theory, asymptotic analysis of dissipative partial differential equations, tame geometry. This paper provides alternative characterizations of this type of inequality for nonsmooth lower semicontinuous functions defined on a metric or a real Hilbert space. In a metric context, we show that a generalized form of the Lojasiewicz inequality (hereby called the Kurdyka-Lojasiewicz inequality) relates to metric regularity and to the Lipschitz continuity of the sublevel mapping, yielding applications to discrete methods (strong convergence of the proximal algorithm). In a Hilbert setting we further establish that asymptotic properties of the semiflow generated by -∂f are strongly linked to this inequality. This is done by introducing the notion of a piecewise subgradient curve: such curves have uniformly bounded lengths if and only if the Kurdyka-Lojasiewicz inequality is satisfied. Further characterizations are given in terms of talweg lines (a concept linked to the location of the least steep points on the level sets of f) and integrability conditions. In the convex case these results are significantly reinforced, allowing us in particular to establish the asymptotic equivalence of discrete gradient methods and continuous gradient curves. On the other hand, a counterexample of a convex C2 function in R2 is constructed to illustrate the fact that, contrary to our intuition, and unless a specific growth condition is satisfied, convex functions may fail to satisfy the Kurdyka-Lojasiewicz inequality.
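For reference, a standard statement of the Kurdyka-Lojasiewicz inequality in the nonsmooth setting is sketched below; the notation (desingularizing function φ, limiting subdifferential ∂f, critical point x̄) follows common usage and is not necessarily the paper's.

```latex
% Standard form of the Kurdyka--Lojasiewicz inequality (notation is generic).
\[
  \varphi'\bigl(f(x) - f(\bar{x})\bigr)\,
  \operatorname{dist}\bigl(0, \partial f(x)\bigr) \;\ge\; 1
  \qquad \text{for all } x \in U \text{ with } f(\bar{x}) < f(x) < f(\bar{x}) + \eta,
\]
where $U$ is a neighborhood of $\bar{x}$ and $\varphi : [0,\eta) \to \mathbb{R}_{+}$
is continuous, concave, with $\varphi(0)=0$, $\varphi \in C^{1}(0,\eta)$ and
$\varphi' > 0$. The classical Lojasiewicz inequality is the special case
$\varphi(s) = c\, s^{1-\theta}$ with $\theta \in [\tfrac12, 1)$, i.e.
\[
  \lvert f(x) - f(\bar{x}) \rvert^{\theta} \;\le\; C\, \lVert \nabla f(x) \rVert .
\]
```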
Abstract:
Fluctuations in ammonium (NH4+), measured as NH4-N loads using an ion-selective electrode installed at the inlet of a sewage treatment plant, showed a distinctive pattern associated with weekly (i.e., commuter) and seasonal (i.e., holiday) fluctuations of the population. Moreover, population size estimates based on NH4-N loads were lower than census data. Diurnal profiles of benzoylecgonine (BE) and 11-nor-9-carboxy-Δ9-tetrahydrocannabinol (THC-COOH) were shown to be strongly correlated with NH4-N. Characteristic patterns, which reflect the prolonged nocturnal activity of people during the weekend, could be observed for BE, cocaine, and a major metabolite of MDMA (i.e., 4-hydroxy-3-methoxymethamphetamine). Additional 24 h composite samples were collected between February and September 2013. Per-capita loads (i.e., grams per day per 1000 inhabitants) were computed using census data and NH4-N measurements. Normalization with NH4-N did not modify the overall pattern, suggesting that the magnitude of fluctuations in the size of the population is negligible compared to that of illicit drug loads. The results show that fluctuations in the size of the population over longer periods of time or during major events can be monitored using NH4-N loads, either as raw NH4-N loads or as NH4-N-based population size estimates, provided that information about site-specific NH4-N population equivalents is available.
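A minimal sketch of the normalization step described above: per-capita loads computed once with census population and once with an NH4-N-based population estimate. All numerical values, including the per-capita NH4-N excretion figure used as the population equivalent, are illustrative placeholders rather than the study's site-specific parameters.

```python
# Hedged sketch of per-capita load normalization; all numbers are placeholders.
def population_from_nh4n(nh4n_load_g_per_day: float,
                         nh4n_per_person_g_per_day: float = 8.0) -> float:
    """Estimate the population served from the daily NH4-N load at the plant inlet."""
    return nh4n_load_g_per_day / nh4n_per_person_g_per_day

def per_capita_load(drug_load_g_per_day: float, population: float) -> float:
    """Load expressed in grams per day per 1000 inhabitants."""
    return 1000.0 * drug_load_g_per_day / population

census_population = 250_000
nh4n_load = 1_800_000.0          # g NH4-N per day (placeholder)
be_load = 95.0                   # g benzoylecgonine per day (placeholder)

pop_nh4n = population_from_nh4n(nh4n_load)
print("census-based:  %.2f g/d/1000 inh" % per_capita_load(be_load, census_population))
print("NH4-N-based:   %.2f g/d/1000 inh" % per_capita_load(be_load, pop_nh4n))
```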
Abstract:
The World Health Organization (WHO) criteria for the diagnosis of osteoporosis are mainly applicable to dual X-ray absorptiometry (DXA) measurements at the spine and hip. There is a growing demand for cheaper devices free of ionizing radiation, such as the promising quantitative ultrasound (QUS). As in many other countries, QUS measurements are increasingly used in Switzerland without adequate clinical guidelines. The T-score approach developed for DXA cannot be applied to QUS, although well-conducted prospective studies have shown that ultrasound can be a valuable predictor of fracture risk. As a consequence, an expert committee named the Swiss Quality Assurance Project (SQAP), whose main mission is the establishment of quality assurance procedures for DXA and QUS in Switzerland, was mandated by the Swiss Association Against Osteoporosis (ASCO) in 2000 to propose operational clinical recommendations for the use of QUS in the management of osteoporosis for two QUS devices sold in Switzerland. Device-specific weighted "T-scores", based on the risk of osteoporotic hip fracture as well as on the prediction of DXA-defined osteoporosis at the hip according to the WHO definition, were calculated for the Achilles (Lunar, General Electric, Madison, Wis.) and Sahara (Hologic, Waltham, Mass.) ultrasound devices. Several studies (totaling a few thousand subjects) were used to calculate age-adjusted odds ratios (OR) and areas under the receiver operating characteristic curve (AUC) for the prediction of osteoporotic fracture, taking into account a weighting score depending on the design of each study included in the calculation. The ORs were 2.4 (1.9-3.2) and the AUC 0.72 (0.66-0.77) for the Achilles, and 2.3 (1.7-3.1) and 0.75 (0.68-0.82), respectively, for the Sahara device. To translate risk estimates into thresholds for clinical application, a sensitivity of 90% was used to define low fracture and low osteoporosis risk, and a specificity of 80% was used to define subjects at high risk of fracture or of having osteoporosis at the hip. From the combination of the fracture model with the hip DXA osteoporosis model, we found T-score thresholds of -1.2 and -2.5 for the stiffness index (Achilles), identifying low- and high-risk subjects, respectively. Similarly, we found T-score thresholds of -1.0 and -2.2 for the QUI index (Sahara). A screening strategy combining QUS, DXA and clinical factors for the identification of women needing treatment was then proposed. The application of this approach will help to minimize the inappropriate use of QUS from which the whole field currently suffers.
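The device-specific triage implied by these thresholds can be written as a short rule; the sketch below uses the T-score cut-offs quoted in the abstract, while the handling of the intermediate band (referral for DXA and clinical assessment) is an assumption about the proposed screening strategy rather than a quotation of it.

```python
# Illustration of the device-specific T-score triage; cut-offs from the abstract,
# intermediate-band handling assumed.  Not the full clinical algorithm.
THRESHOLDS = {
    "achilles_stiffness": (-1.2, -2.5),   # (low-risk cutoff, high-risk cutoff)
    "sahara_qui":         (-1.0, -2.2),
}

def qus_triage(device: str, t_score: float) -> str:
    low_cutoff, high_cutoff = THRESHOLDS[device]
    if t_score > low_cutoff:
        return "low risk"
    if t_score <= high_cutoff:
        return "high risk (manage as osteoporosis)"
    return "intermediate (assumed: refer for DXA and clinical assessment)"

print(qus_triage("achilles_stiffness", -0.8))
print(qus_triage("sahara_qui", -1.6))
print(qus_triage("sahara_qui", -2.6))
```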
Abstract:
Point-of-care (POC) tests offer potentially substantial benefits for the management of infectious diseases, mainly by shortening the time to result and by making the test available at the bedside or at remote care centres. Commercial POC tests are already widely available for the diagnosis of bacterial and viral infections and for parasitic diseases, including malaria. Infectious diseases specialists and clinical microbiologists should be aware of the indications and limitations of each rapid test, so that they can use them appropriately and correctly interpret their results. The clinical applications and performance of the most relevant and commonly used POC tests are reviewed. Some of these tests exhibit insufficient sensitivity, and should therefore be coupled to confirmatory tests when the results are negative (e.g. Streptococcus pyogenes rapid antigen detection test), whereas the results of others need to be confirmed when positive (e.g. malaria). New molecular-based tests exhibit better sensitivity and specificity than former immunochromatographic assays (e.g. Streptococcus agalactiae detection). In the coming years, further evolution of POC tests may lead to new diagnostic approaches, such as panel testing, targeting not just a single pathogen, but all possible agents suspected in a specific clinical setting. To reach this goal, the development of serology-based and/or molecular-based microarrays/multiplexed tests will be needed. The availability of modern technology and new microfluidic devices will provide clinical microbiologists with the opportunity to be back at the bedside, proposing a large variety of POC tests that will allow quicker diagnosis and improved patient care.
Abstract:
In this project, three different steganographic algorithms are implemented using JPEG2000 as the carrier of the message, the performance of each is computed, and the results are compared in a chart. The goal is to show, for specific cases, that the algorithm based on the product of two perfect linear codes achieves better performance than algorithms such as F5 and LSB.
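As a hedged illustration of the embedding-efficiency metric being compared (message bits embedded per cover change), the sketch below contrasts plain LSB replacement with F5-style matrix embedding based on a [7,4] Hamming code; it operates on random bits and is neither the project's JPEG2000 implementation nor its product-code scheme.

```python
# Sketch of the embedding-efficiency comparison: LSB replacement versus
# matrix embedding with a [7,4] Hamming code (the mechanism behind F5).
import numpy as np

rng = np.random.default_rng(0)
p, n = 3, 7                                    # embed 3 message bits per 7 cover bits
H = np.array([[int(b) for b in format(col, "03b")]
              for col in range(1, n + 1)]).T   # 3x7 parity-check matrix

def matrix_embed(cover: np.ndarray, msg: np.ndarray) -> np.ndarray:
    """Embed p message bits into n cover bits, changing at most one bit."""
    syndrome = (H @ cover + msg) % 2
    stego = cover.copy()
    if syndrome.any():
        col = int("".join(map(str, syndrome)), 2) - 1   # column matching the syndrome
        stego[col] ^= 1
    return stego

def matrix_extract(stego: np.ndarray) -> np.ndarray:
    return (H @ stego) % 2

blocks, changes_matrix, changes_lsb = 10_000, 0, 0
for _ in range(blocks):
    cover = rng.integers(0, 2, n)
    msg = rng.integers(0, 2, p)
    stego = matrix_embed(cover, msg)
    assert np.array_equal(matrix_extract(stego), msg)
    changes_matrix += int(np.sum(stego != cover))
    changes_lsb += int(np.sum(cover[:p] != msg))        # LSB replacement of p bits

print("matrix embedding efficiency:", p * blocks / changes_matrix)  # ~3.43 bits/change
print("LSB embedding efficiency:   ", p * blocks / changes_lsb)     # ~2 bits/change
```

With enough blocks the matrix-embedding efficiency approaches 24/7 ≈ 3.43 bits per change versus about 2 for LSB replacement, which is the kind of gap a performance chart of this sort makes visible.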