995 results for Graphics processing units
Abstract:
For years, silk fibroin from the domestic silkworm Bombyx mori has been recognized as a valuable material and used extensively. In recent decades, new application fields have emerged for this versatile material. The final, specific application of silk dictates how it is processed in industry and research. This review describes various approaches to laboratory-scale silk downstream processing, grouped into several categories. A detailed description of possible workflows, from the naturally occurring material to a final formulated product, is presented. Considerable attention is given to (bio-)chemical approaches to silk fibroin transformation, particularly its enzyme-driven modifications. This literature survey focuses exclusively on methods applied in research, not in industry.
Abstract:
This article evaluates the density and the mechanical, acoustic and thermal properties of compression-moulded plates composed of granulate from electrical cable waste. The cable waste is the insulation part of electric cables and is composed of PVC, PE, EMP and PEX rubber. After these materials lose their initial properties and, due to safety requirements, cease to be useful as insulation, they can be reused in new applications such as industrial or playground flooring, as sound insulation material applied in walls or floors, or to dampen vibrations from equipment. Recovering electric cable waste has been a major concern of the European Commission because of its toxicity levels when incineration and landfilling are used to dispose of this material. The European Commission's study for DG XI [1] suggested that recycling may be the most favourable future waste-management option.
Abstract:
Background: Abnormalities in emotional prosody processing have been consistently reported in schizophrenia and are related to poor social outcomes. However, the role of stimulus complexity in abnormal emotional prosody processing is still unclear. Method: We recorded event-related potentials in 16 patients with chronic schizophrenia and 16 healthy controls to investigate: 1) the temporal course of emotional prosody processing; and 2) the relative contribution of prosodic and semantic cues in emotional prosody processing. Stimuli were prosodic single words presented in two conditions: with intelligible (semantic content condition—SCC) and unintelligible semantic content (pure prosody condition—PPC). Results: Relative to healthy controls, schizophrenia patients showed reduced P50 for happy PPC words, and reduced N100 for both neutral and emotional SCC words and for neutral PPC stimuli. Also, increased P200 was observed in schizophrenia for happy prosody in SCC only. Behavioral results revealed higher error rates in schizophrenia for angry prosody in SCC and for happy prosody in PPC. Conclusions: Together, these data further demonstrate the interactions between abnormal sensory processes and higher-order processes in bringing about emotional prosody processing dysfunction in schizophrenia. They further suggest that impaired emotional prosody processing is dependent on stimulus complexity.
Abstract:
Recent studies have demonstrated the positive effects of musical training on the perception of vocally expressed emotion. This study investigated the effects of musical training on event-related potential (ERP) correlates of emotional prosody processing. Fourteen musicians and fourteen control subjects listened to 228 sentences with neutral semantic content, differing in prosody (one third with neutral, one third with happy and one third with angry intonation), presented with intelligible semantic content (semantic content condition—SCC) and unintelligible semantic content (pure prosody condition—PPC). Reduced P50 amplitude was found in musicians. A difference between the SCC and PPC conditions was found in P50 and N100 amplitude in non-musicians only, and in P200 amplitude in musicians only. Furthermore, musicians were more accurate in recognizing angry prosody in PPC sentences. These findings suggest that the auditory expertise developed through extensive musical training may impact different stages of vocal emotional processing.
Abstract:
Integrated master's dissertation in Information Systems Engineering and Management
Abstract:
Integrated master's dissertation in Civil Engineering
Abstract:
DNA strand-breaks (SBs) with non-ligatable ends are generated by ionizing radiation, oxidative stress, various chemotherapeutic agents, and also as base excision repair (BER) intermediates. Several neurological diseases have already been identified as being due to a deficiency in DNA end-processing activities. Two common dirty ends, 3'-P and 5'-OH, are processed by mammalian polynucleotide kinase 3'-phosphatase (PNKP), a bifunctional enzyme with 3'-phosphatase and 5'-kinase activities. We have made the unexpected observation that PNKP stably associates with Ataxin-3 (ATXN3), a polyglutamine repeat-containing protein mutated in spinocerebellar ataxia type 3 (SCA3), also known as Machado-Joseph Disease (MJD). This disease is one of the most common dominantly inherited ataxias worldwide; the defect in SCA3 is due to CAG repeat expansion (from the normal 14-41 to 55-82 repeats) in the ATXN3 coding region. However, how the expanded form gains its toxic function is still not clearly understood. Here we report that purified wild-type (WT) ATXN3 stimulates, and by contrast the mutant form specifically inhibits, PNKP's 3' phosphatase activity in vitro. ATXN3-deficient cells also show decreased PNKP activity. Furthermore, transgenic mice conditionally expressing the pathological form of human ATXN3 also showed decreased 3'-phosphatase activity of PNKP, mostly in the deep cerebellar nuclei, one of the most affected regions in MJD patients' brain. Finally, long amplicon quantitative PCR analysis of human MJD patients' brain samples showed a significant accumulation of DNA strand breaks. Our results thus indicate that the accumulation of DNA strand breaks due to functional deficiency of PNKP is etiologically linked to the pathogenesis of SCA3/MJD.
Abstract:
[Excerpt] Introduction: Thermal processing is probably the most important process in the food industry and has been used since prehistoric times, when it was discovered that heat enhanced the palatability and life of heat-treated food. Thermal processing involves heating foods at a defined temperature for a certain length of time. However, in some foods, the high thermotolerance of certain enzymes and microorganisms, their physical properties (e.g., high viscosity), or their components (e.g., solid fractions) require the application of extreme heat treatments that not only are energy intensive but also adversely affect the nutritional and organoleptic properties of the food. Technologies such as ohmic heating, dielectric heating (which includes microwave heating and radiofrequency heating), inductive heating, and infrared heating are available to replace, or complement, the traditional heat-dependent technologies (heating with superheated steam, hot air, hot water, or another hot liquid, achieved either through direct contact with those agents – mostly superheated steam – or through contact with a hot surface that is in turn heated by such agents). Given that the “traditional” heat-dependent technologies are thoroughly described in the literature, this text is mainly devoted to the so-called “novel” thermal technologies. (...)
Abstract:
This research work explores a new way of presenting and representing information about patients in critical care: the use of a timeline to display information. This is accomplished through the development of an interactive Pervasive Patient Timeline that gives intensivists real-time access to an environment containing patients' clinical information, from the moment patients are admitted to the Intensive Care Unit (ICU) until their discharge. This solution allows intensivists to analyse data on vital signs, medication, exams, and data-mining predictions, among others. Owing to its pervasive features, intensivists can access the timeline anywhere and at any time, allowing them to make decisions when they need to be made. The platform is patient-centred and is prepared to support the decision process, allowing intensivists to provide better care to patients thanks to the inclusion of clinical forecasts.
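As a loose sketch of the chronological merging such a timeline requires (the event streams, field layout, and sample values below are invented for illustration and are not taken from the Pervasive Patient Timeline itself), events from separate clinical sources can be combined and ordered by timestamp:

```python
from datetime import datetime

def build_timeline(*streams):
    """Merge event streams -- each a list of (timestamp, source, detail)
    tuples -- into one chronologically ordered patient timeline."""
    events = [event for stream in streams for event in stream]
    return sorted(events, key=lambda event: event[0])

# Hypothetical example data: two sources for one ICU stay.
vitals = [(datetime(2016, 3, 1, 8, 0), "vitals", "HR 92 bpm"),
          (datetime(2016, 3, 1, 9, 0), "vitals", "HR 88 bpm")]
meds = [(datetime(2016, 3, 1, 8, 30), "medication", "noradrenaline started")]

timeline = build_timeline(vitals, meds)
```

A real implementation would also need per-patient filtering and live updates as new events arrive; sorting a merged list is only the core ordering step.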
Abstract:
Decision support models in intensive care units are developed to support medical staff in their decision-making process. However, optimizing these models is particularly difficult owing to their dynamic, complex and multidisciplinary nature. Thus, new algorithms capable of extracting knowledge from large volumes of data are constantly being researched and developed in order to obtain better predictive results than current algorithms. To test the optimization techniques, a case study with real data provided by the INTCare project was explored. The data concern extubation cases. On this dataset, several models, such as Evolutionary Fuzzy Rule Learning, Lazy Learning, Decision Trees and many others, were analysed in order to detect early extubation. The hybrid Decision Trees Genetic Algorithm, Supervised Classifier System and KNNAdaptive achieved the highest accuracy rates (93.2%, 93.1% and 92.97%, respectively), thus showing their feasibility for a real environment.
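As a hedged illustration of how such accuracy rates are computed, the sketch below uses a plain k-nearest-neighbours classifier (not the KNNAdaptive variant cited above) on invented toy data; both the data and the choice of k are assumptions of this example:

```python
import math
from collections import Counter

def knn_predict(train, labels, x, k=3):
    """Classify x by majority vote among its k nearest training points."""
    nearest = sorted(range(len(train)),
                     key=lambda i: math.dist(train[i], x))[:k]
    votes = Counter(labels[i] for i in nearest)
    return votes.most_common(1)[0][0]

def accuracy(train, labels, test, test_labels, k=3):
    """Fraction of test points whose predicted label matches the truth."""
    correct = sum(knn_predict(train, labels, x, k) == y
                  for x, y in zip(test, test_labels))
    return correct / len(test)
```

In a study like the one above, the same accuracy computation would be applied to held-out extubation records for each candidate model before comparing the resulting rates.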
Abstract:
It is estimated that 5 to 8 million individuals with chest pain or other symptoms suggestive of myocardial ischemia are seen each year in emergency departments (EDs) in the United States [1,2], corresponding to 5 to 10% of all visits [3,4]. Most of these patients are hospitalized for evaluation of possible acute coronary syndrome (ACS), at an estimated cost of 3 to 6 thousand dollars per patient [5,6]. From this evaluation process, about 1.2 million patients receive a diagnosis of acute myocardial infarction (AMI), and roughly the same number have unstable angina. Therefore, about one half to two thirds of these patients with chest pain do not have a cardiac cause for their symptoms [2,3]. Thus, the emergency physician faces the difficult challenge of identifying those with ACS – a life-threatening disease – in order to treat them properly, and of discharging the others to suitable outpatient investigation and management.
Abstract:
New biotechnologies, such as DNA molecular markers, make it possible to characterize the plant genome. Using the genomic information produced for hundreds or thousands of chromosomal positions allows superior genotypes to be identified in less time than traditional phenotypic selection requires. Most agronomically and economically important traits of cultivated plant species are controlled by polygenes, producing a phenotype with continuous variation that is highly affected by the environment. Their inheritance is complex, since it results from interactions between genes, on the same or different chromosomes, and from genotype-by-environment interaction, which hinders selection. These biotechnologies produce databases with large amounts of information and complex correlation structures that require specific biometric methods and models for their processing. Statistical models aimed at explaining the phenotype from massive genomic information require the estimation of a large number of parameters. No methods within parametric statistics can address this problem efficiently. In addition, the models must accommodate non-additivities (interactions) between gene effects, and between these and the environment, which are also difficult to handle from a parametric standpoint. We hypothesize that the association between phenotypic traits and molecular genotypes characterized by abundant genomic information could be analysed efficiently in the context of semiparametric mixed models and/or non-parametric methods based on machine-learning techniques. The objective of this project is to develop new data-analysis methods that allow the efficient use of massive genomic information in genetic evaluations of agro-biotechnological interest.
The specific objectives include comparing the statistical and computational properties of parametric analytical strategies with semiparametric and non-parametric ones. We will work with regression approximations of quantitative trait locus analysis under different strategies and scenarios (real and simulated) with different volumes of molecular-marker data. In the parametric area, special emphasis will be placed on mixed models, while in the non-parametric area we will evaluate neural-network algorithms, support vector machines, multivariate filters, LOESS-type smoothers and recently introduced kernel-based methods. The semiparametric proposal will be based on a two-stage analysis strategy aimed at: 1) reducing the dimensionality of the genomic data and 2) modelling the phenotype by introducing only the most significant molecular signals. With this work we expect to provide researchers in our field with new tools and analysis procedures that maximize the efficiency of the resources devoted to massive genomic data capture and their application in agro-biotechnological developments.
Abstract:
Analog single-event transients (ASETs) are produced by the interaction of a heavy ion or a high-energy proton with a sensitive device in an analog circuit. The interaction of the ion with a bipolar or MOS field-effect transistor induces electron-hole pairs that cause spikes which can propagate to the output of the analog component, producing transients that may induce system-level faults. The most serious problems due to this type of phenomenon occur in the space environment, which is very rich in heavy ions; the on-board computers of satellites and other spacecraft are typical cases. However, owing to the continual shrinking of transistor dimensions (which brings increased sensitivity), this phenomenon has begun to be observed at sea level, caused mainly by the impact of atmospheric neutrons. These effects can cause severe problems for computer systems with analog interfaces from which they obtain data for processing, and they have become one of the most serious problems facing designers of highly integrated systems.
Typical cases are Systems-on-Chip that include high-performance processing modules alongside analog interfaces. The general objective of the project is to study the susceptibility of computer systems to ASETs in their analog sections, proposing strategies for error mitigation. The specific objectives are: to propose new ASET models based on device-level simulations solved by the finite element method; to use these models to identify the sections most prone to producing errors, and therefore the best candidates for radiation-hardening techniques; to use these models to study the nature of the errors produced in data processing systems; to propose novel solutions for mitigating these effects within the analog circuits themselves, preventing their propagation to the digital sections; and to propose solutions for mitigating the effects at the system level. The project follows a bottom-up research procedure, starting from physical-level descriptions and subsequently raising the level of abstraction at which the circuit is modelled. Physical modelling of the MOS devices and their solution by the finite element method is proposed. Injecting charges into the sensitive zones of the models will make it possible to determine the profiles of the current pulses that must be injected at the circuit level to emulate these effects.
These procedures will be carried out for the different building blocks of the analog interfaces, proposing error-mitigation strategies at different levels. The expected results of this project include hardware for error detection and tolerance of this type of event, increasing the reliability of information processing systems, as well as new data on radiation effects in semiconductors, new transient-fault models enabling circuit-level simulation of these events, and the identification of the sensitive zones of typical analog interfaces that must be radiation-hardened.
Abstract:
Quinoa (Chenopodium quinoa Willd) is a pseudocereal originating in the Andean region. It was used as a staple food by native peoples; quinoa, potato and maize formed the three-crop base of the indigenous diet of this continent. Spanish colonization marginalized its cultivation in favour of European wheat and other grains, displacing it to the highlands of the Andean region. Quinoa has recently gained considerable attention, mainly for its protein quality and lack of gluten. Its use is widespread in the Andean countries, especially Bolivia and Peru, with a notable increase in the area planted. In our country, this crop is grown mainly in the northern provinces of Salta and Jujuy. In recent years its cultivation has been promoted, and grains freed of saponins are considered an excellent food, recognized by the WHO, the FAO and NASA. In addition to the quality of its lipids and vitamins and its high starch content, quinoa has a protein of excellent nutritional value that is gluten-free, making the grain particularly suitable for feeding people with celiac disease or irritable bowel syndrome. This project aims at the integral use of the quinoa grain. Our intention here is to demonstrate that this grain, grown in the province of Córdoba, can yield food products as well as products derived from its industrialization.
For this objective, the team has access to the facilities of the Pilot Plant of the Instituto de Ciencia y Tecnología de los Alimentos (ICTA) of the UNC, and to suitable modern equipment such as HPLC, GC, a UV-Vis spectrophotometer, laboratory and industrial rotary evaporators, cold storage, analytical and precision balances, muffle furnaces, ovens, mills and sieves; we also have a professional staff, some of whom are doing their doctoral theses on this subject. With regard to the objectives pursued, we expect to obtain products such as soups, baby food, bakery and biscuit products, and sauces. At the industrial level, we aim to produce protein concentrates, starch and saponins. As noted above, quinoa has begun to expand its frontiers internationally, and today Bolivia, the world's leading producer of this grain, exports a significant share of its production. Growing world demand has made quinoa a strategic, high-value crop, with international prices around US$1,200 per tonne. Given, in addition, that the plant is highly drought resistant and adapts well to saline, sandy and poor soils, its importance for our province is clear, since the province contains geographic areas potentially suitable for its cultivation.
Abstract:
As digital image processing techniques become increasingly used in a broad range of consumer applications, the critical need to evaluate algorithm performance has become recognised by developers as an area of vital importance. With digital image processing algorithms now playing a greater role in security and protection applications, it is of crucial importance that we are able to study their performance empirically. Apart from the field of biometrics, little emphasis has been placed on algorithm performance evaluation until now, and where evaluation has taken place, it has been carried out in a somewhat cumbersome and unsystematic fashion, without any standardised approach. This paper presents a comprehensive testing methodology and framework aimed at automating the evaluation of image processing algorithms. Ultimately, the test framework aims to shorten the algorithm development life cycle by helping to identify algorithm performance problems quickly and more efficiently.
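A minimal sketch of an automated evaluation harness of this kind might look like the following; the mean-absolute-error metric, the pass threshold, and the way algorithms are registered are assumptions of this illustration, not details of the framework described in the paper:

```python
def mean_abs_error(output, reference):
    """Pixel-wise mean absolute error between two equally sized images
    (images are nested lists of intensity values)."""
    diffs = [abs(o - r) for orow, rrow in zip(output, reference)
             for o, r in zip(orow, rrow)]
    return sum(diffs) / len(diffs)

def evaluate(algorithms, test_cases, threshold=5.0):
    """Run each named algorithm over every (input, reference) pair and
    record its average error and whether it stays under the threshold."""
    report = {}
    for name, algo in algorithms.items():
        errors = [mean_abs_error(algo(img), ref) for img, ref in test_cases]
        avg = sum(errors) / len(errors)
        report[name] = (avg, avg <= threshold)
    return report
```

The point of such a harness is that adding a new algorithm or a new test case requires no change to the evaluation loop itself, which is what makes systematic, repeatable comparison possible.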