939 results for sparse coding
Abstract:
Aims: The present study focuses on the analysis of novelty emergence in the classic Gloria films with Rogers, Perls, and Ellis, to understand how the same client formulated her own problem and whether and how change occurred in those three sessions. Method: The Innovative Moments Coding System was applied to track innovative moments (IMs) and their themes. Results: The session with Rogers showed more diversity in disclosed problems and themes of IMs, as well as a higher proportion of reflection IMs. The session with Perls demonstrated a high proportion of protest IMs. The session with Ellis showed less innovation than the other sessions. The changes found were based mostly on reflection and protest IMs across the three sessions. Conclusion: Narrative innovations occurred in all three single sessions. The type of dominant innovation is consistent with the therapeutic model and the IMs model. The exploration of the IMs' themes allowed a more precise identification of Gloria's new narrative positions and their development throughout those sessions.
Abstract:
The identification of new and druggable targets in bacteria is a critical endeavour in pharmaceutical research on novel antibiotics to fight infectious agents. The rapid emergence of resistant bacteria makes today's antibiotics increasingly ineffective, raising the need for new pharmacological targets and novel classes of antibacterial drugs. A new model that combines the singular value decomposition (SVD) technique with biological filters, comprising a set of protein properties associated with bacterial drug targets and similarity to protein-coding essential genes of E. coli, has been developed to predict potential drug targets in the Enterobacteriaceae family [1]. This model identified 99 potential target proteins in the studied bacterial family, exhibiting eight different functions, which suggests that disrupting the activities of these proteins is critical for cells. Out of these candidates, one was selected for target confirmation. To find target modulators, receptor-based pharmacophore hypotheses were built and used to screen a virtual library of compounds. Post-screening filters based on physicochemical and topological similarity to known Gram-negative antibiotics were applied to the retrieved compounds. Screening hits passing all filters were docked into the protein's catalytic groove, and the 15 most promising compounds were purchased from chemical vendors to be experimentally tested in vitro. To the best of our knowledge, this is the first attempt to rationalize the search for compounds to probe the relevance of this candidate as a new pharmacological target.
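For illustration, the SVD step of such a model can be sketched as below; this is a minimal sketch under assumed inputs (a random protein-property matrix and essential-gene profiles, with hypothetical dimensions), not the published model of [1]:

```python
# Minimal sketch of SVD-based target prioritization (hypothetical data,
# not the published model [1]): proteins are rows, descriptive
# properties are columns; proximity to known E. coli essential-gene
# profiles in the reduced space acts as a biological filter.
import numpy as np

rng = np.random.default_rng(0)
proteins = rng.normal(size=(500, 40))   # 500 candidate proteins x 40 properties
essential = rng.normal(size=(30, 40))   # profiles of known essential genes

# Project both sets onto the top-k right singular vectors of the candidates.
_, _, vt = np.linalg.svd(proteins, full_matrices=False)
k = 5
p_red = proteins @ vt[:k].T
e_red = essential @ vt[:k].T

# Score each candidate by its distance to the nearest essential profile.
dists = np.linalg.norm(p_red[:, None, :] - e_red[None, :, :], axis=2).min(axis=1)
candidates = np.argsort(dists)[:99]     # keep the 99 closest, as in the study
print(candidates[:10])
```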
Abstract:
Master's dissertation in Industrial Engineering
Abstract:
Alzheimer's disease (AD) is commonly associated with marked memory deficits; however, nonamnestic variants have been consistently described as well. Posterior cortical atrophy (PCA) is a progressive degenerative condition in which posterior regions of the brain are predominantly affected, resulting in a pattern of distinctive and marked visuospatial symptoms such as apraxia, alexia, and spatial neglect. Despite the growing number of studies on the cognitive and neural bases of the visual variant of AD, intervention studies remain relatively sparse. Current pharmacological treatments offer modest efficacy, and complementary nonpharmacological interventions are scarce, with only two previous studies in PCA. Here we describe a highly educated 57-year-old patient diagnosed with a visual variant of AD who participated in a cognitive intervention program (comprising reality orientation, cognitive stimulation, and cognitive training exercises). Neuropsychological assessment was performed at three time points (baseline, postintervention, follow-up) and focused mainly on verbal and visual memory. Baseline neuropsychological assessment showed deficits in perceptive and visual-constructive abilities, learning and memory, and temporal orientation. After neuropsychological rehabilitation, we observed small improvements in the patient's cognitive functioning, namely in verbal memory, attention, and psychomotor abilities. This study shows evidence of small beneficial effects of cognitive intervention in PCA and is the first report of this approach with a highly educated patient in a moderate stage of the disease. Controlled studies are needed to assess the potential efficacy of cognition-focused approaches in these patients and, if relevant, to support their availability as a complementary therapy to pharmacological treatment and visual aids.
Abstract:
Olive oil quality grading is traditionally assessed by human sensory evaluation of positive and negative attributes (olfactory, gustatory, and final olfactory-gustatory sensations). However, it is not guaranteed that trained panelists can correctly classify monovarietal extra-virgin olive oils according to olive cultivar. In this work, the potential of human (sensory panelists) and artificial (electronic tongue) sensory evaluation of olive oils was studied with the aim of discriminating eight single-cultivar extra-virgin olive oils. Linear discriminant, partial least squares discriminant, and sparse partial least squares discriminant analyses were evaluated. The best predictive classification was obtained using linear discriminant analysis with a simulated annealing selection algorithm. A low-level data fusion approach (18 electronic tongue signals and nine sensory attributes) enabled 100% correct leave-one-out cross-validation classification, improving on the discrimination capability of sensor profiles or sensory attributes used individually (70% and 57% leave-one-out correct classifications, respectively). Thus, human sensory evaluation and electronic tongue analysis may be used as complementary tools allowing successful monovarietal olive oil discrimination.
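A minimal sketch of the low-level fusion and leave-one-out evaluation described above, in Python with scikit-learn; the data here are placeholders, and the paper's simulated-annealing variable selection step is omitted for brevity:

```python
# Low-level data fusion + LDA with leave-one-out cross-validation
# (hypothetical data standing in for the e-tongue and panel measurements).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(1)
n_samples = 64
etongue = rng.normal(size=(n_samples, 18))   # 18 electronic-tongue signals
sensory = rng.normal(size=(n_samples, 9))    # 9 sensory-panel attributes
y = rng.integers(0, 8, size=n_samples)       # 8 single-cultivar classes

# Low-level fusion: concatenate both blocks into one 27-variable matrix.
X = np.hstack([etongue, sensory])

acc = cross_val_score(LinearDiscriminantAnalysis(), X, y,
                      cv=LeaveOneOut()).mean()
print(f"leave-one-out accuracy: {acc:.2f}")
```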
Abstract:
In the present work we explored the properties of the ABP-CM4 peptide from Bombyx mori for the creation of biopolymers with broad antimicrobial activity. An antimicrobial recombinant protein-based polymer (rPBP) was designed by cloning the DNA sequence coding for ABP-CM4 in frame with the N-terminus of an elastin-like recombinamer consisting of 200 repetitions of the pentamer VPAVG, here named A200. The new rPBP, named CM4-A200, was purified via a simplified non-chromatographic method that exploits the thermoresponsive behavior of the A200 polymer. The ABP-CM4 peptide was also purified through the incorporation of a formic acid cleavage site between the peptide and the A200 sequence. In the soluble state, the antimicrobial activity of both the CM4-A200 polymer and the ABP-CM4 peptide was poor. However, when the CM4-A200 polymer was processed into free-standing films, high antimicrobial activity against Gram-positive and Gram-negative bacteria, yeasts, and filamentous fungi was observed. The antimicrobial activity of CM4-A200 depended on physical contact of the cells with the film surface. Furthermore, CM4-A200 films revealed no cytotoxic effect on either normal human skin fibroblasts or human keratinocytes. Finally, we developed an optimized ex vivo assay with pig skin demonstrating the antimicrobial properties of the CM4-A200 cast films for skin applications.
Abstract:
It is well known that color coding facilitates search and identification in real-life tasks. The aim of this work was to compare reaction times of normal color observers and dichromats in a visual search experiment. A unique distracter color was used to avoid the vulnerability of abnormal color vision to background complexity. Reaction times for normal color observers and dichromats were estimated for 2° central vision at 48 directions around a white point in CIE L*a*b* color space, allowing a systematic examination of the mechanisms of dichromatic color perception. The results show that mean search times for dichromats were twice as long as those for normal color observers, across all directions. The difference between the copunctal confusion lines and the confusion direction measured experimentally was 5.5° for protanopes and 7.5° for deuteranopes.
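For illustration, stimulus chromaticities at 48 equally spaced directions around a white point in CIE L*a*b* could be generated as in the sketch below; the lightness and chroma values are assumptions, not the study's parameters:

```python
# Sketch of stimulus generation: 48 equally spaced chromatic directions
# around a white point in CIE L*a*b* (illustrative values only).
import numpy as np

L_star = 70.0        # assumed constant lightness of the targets
chroma = 20.0        # assumed radial distance from the white point
angles = np.deg2rad(np.arange(0, 360, 7.5))   # 48 directions, 7.5 deg apart

a_star = chroma * np.cos(angles)
b_star = chroma * np.sin(angles)
stimuli = np.column_stack([np.full(48, L_star), a_star, b_star])
print(stimuli.shape)  # (48, 3) Lab triplets, one per search direction
```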
Abstract:
Doctoral thesis in Chemical and Biological Engineering.
Abstract:
International master's dissertation in Sustainability of the Built Environment
Abstract:
Spinocerebellar ataxia type 3 (SCA3), also known as Machado-Joseph disease (MJD), is an untreatable autosomal dominant neurodegenerative disease, and the most common such inherited ataxia worldwide. The mutation in SCA3 is the expansion of a polymorphic CAG tri-nucleotide repeat sequence in the C-terminal coding region of the ATXN3 gene at chromosomal locus 14q32.1. The mutant ATXN3 protein, carrying an expanded polyglutamine (polyQ) sequence, interacts with multiple proteins in vivo and is deposited as aggregates in the SCA3 brain. A large body of literature suggests that the loss of function of the native ATXN3-interacting proteins deposited in the polyQ aggregates contributes to cellular toxicity, systemic neurodegeneration, and the pathogenic mechanism in SCA3. Nonetheless, a central part of the disease etiology of SCA3, namely the molecular mechanism by which the polyQ expansions in mutant ATXN3 induce neurodegeneration, has remained elusive. In the present study, we show that the essential DNA strand break repair enzyme PNKP (polynucleotide kinase 3'-phosphatase) interacts with, and is inactivated by, the mutant ATXN3, resulting in inefficient DNA repair, persistent accumulation of DNA damage/strand breaks, and subsequent chronic activation of the DNA damage-response ataxia telangiectasia-mutated (ATM) signaling pathway in SCA3. We report that persistent accumulation of DNA damage/strand breaks and chronic activation of the serine/threonine kinase ATM and the downstream p53 and protein kinase C-δ pro-apoptotic pathways trigger neuronal dysfunction and eventually neuronal death in SCA3. Either PNKP overexpression or pharmacological inhibition of ATM dramatically blocked mutant ATXN3-mediated cell death. Discovery of the mechanism by which mutant ATXN3 induces DNA damage and amplifies pro-death signaling pathways provides a molecular basis for neurodegeneration due to PNKP inactivation in SCA3 and, for the first time, suggests a possible approach to treatment.
Abstract:
DNA strand breaks (SBs) with non-ligatable ends are generated by ionizing radiation, oxidative stress, and various chemotherapeutic agents, and also arise as base excision repair (BER) intermediates. Several neurological diseases have already been identified as being due to deficiencies in DNA end-processing activities. Two common "dirty" ends, 3'-P and 5'-OH, are processed by mammalian polynucleotide kinase 3'-phosphatase (PNKP), a bifunctional enzyme with 3'-phosphatase and 5'-kinase activities. We have made the unexpected observation that PNKP stably associates with Ataxin-3 (ATXN3), a polyglutamine repeat-containing protein mutated in spinocerebellar ataxia type 3 (SCA3), also known as Machado-Joseph disease (MJD). This disease is one of the most common dominantly inherited ataxias worldwide; the defect in SCA3 is due to a CAG repeat expansion (from the normal 14-41 to 55-82 repeats) in the ATXN3 coding region. However, how the expanded form gains its toxic function is still not clearly understood. Here we report that purified wild-type (WT) ATXN3 stimulates, whereas the mutant form specifically inhibits, PNKP's 3'-phosphatase activity in vitro. ATXN3-deficient cells also show decreased PNKP activity. Furthermore, transgenic mice conditionally expressing the pathological form of human ATXN3 also showed decreased 3'-phosphatase activity of PNKP, mostly in the deep cerebellar nuclei, one of the most affected regions in MJD patients' brains. Finally, long-amplicon quantitative PCR analysis of brain samples from MJD patients showed a significant accumulation of DNA strand breaks. Our results thus indicate that accumulation of DNA strand breaks due to functional deficiency of PNKP is etiologically linked to the pathogenesis of SCA3/MJD.
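For context, strand-break frequencies in long-amplicon qPCR are commonly estimated from the drop in amplification via a Poisson (zero-class) model; the sketch below uses illustrative numbers, not the patients' data, and is not necessarily the authors' exact analysis:

```python
# Common Poisson-based estimate used with long-amplicon qPCR: a strand
# break blocks the polymerase, so the relative amplification of damaged
# vs. control DNA yields the average lesion frequency per amplicon.
# (Illustrative values, not data from the study.)
import math

amp_damaged = 0.62    # amplification of the damaged/patient sample
amp_control = 1.00    # amplification of the control sample
amplicon_kb = 10.0    # size of the long amplicon in kb (assumed)

lesions_per_amplicon = -math.log(amp_damaged / amp_control)
lesions_per_10kb = lesions_per_amplicon * (10.0 / amplicon_kb)
print(f"{lesions_per_10kb:.2f} lesions per 10 kb")
```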
Abstract:
Problem identification and characterization. One of the most important problems associated with building software is its correctness. In the quest to provide guarantees of correct software behavior, a variety of development techniques with solid mathematical and logical foundations, known as formal methods, have emerged. Owing to their nature, formal methods require considerable experience and expertise, above all in mathematics and logic, so their application is costly in practice. As a result, their use has been confined mainly to critical systems, that is, systems whose malfunction can cause serious damage, even though the benefits of these techniques are relevant to all kinds of software. Bringing the benefits of formal methods to software development contexts broader than critical systems would have a high impact on productivity in those contexts. Hypothesis. Automated analysis tools are an element of great importance. Examples include several powerful analysis tools based on formal methods that are applied directly to source code. For the vast majority of these tools, the gap between the notions developers are accustomed to and those needed to apply these formal analysis tools remains too wide. Many tools use assertion languages outside developers' usual knowledge and habits, and in many cases the output produced by the analysis tool requires some command of the underlying formal method. This problem can be alleviated by producing suitable tools. Another problem intrinsic to automated analysis techniques is how they behave as the size and complexity of the artifacts under analysis grow (scalability). This limitation is widely known and is considered critical for the practical applicability of formal analysis methods. One way to attack this problem is to exploit information and characteristics of specific application domains. Objectives. This project aims to build formal analysis tools that contribute to the quality, in terms of functional correctness, of specifications, models, or code in the context of software development. More precisely, it seeks to identify specific settings in which certain automated analysis techniques, such as analysis based on SMT or SAT solving, or model checking, can reach scalability levels beyond those known for these techniques in general settings. We will attempt to implement the adaptations of the chosen techniques in tools usable by developers familiar with the application context but not necessarily versed in the underlying methods or techniques. Materials and methods. The materials will be literature relevant to the area and computing equipment; the methods will be those of discrete mathematics, logic, and software engineering. Expected results. One expected result of the project is the identification of specific application domains for formal analysis methods.
The project is also expected to yield analysis tools whose usability makes them suitable for developers without specific training in the formal methods employed. Importance of the project. The main impact of this project will be its contribution to the practical application of formal analysis techniques at different stages of software development, with the goal of increasing software quality and reliability. A crucial factor for software quality is correctness. Traditionally, formal approaches to software development concentrate on functional correctness and tackle this problem by building on well-defined notations with solid mathematical foundations. Their precise semantics makes formal methods well suited for analysis, but they are usually more complex and require familiarity and experience with the manipulation of mathematical definitions. Their acceptance by software engineers is therefore rather limited, and formal methods applications have been confined to critical systems, although the advantages formal methods provide obviously apply to any kind of software system. It is accepted that appropriate tool support for formal analysis is essential if one seeks to support software development based on formal methods. Indeed, some of the relatively recent successes of formal methods are accompanied by good-quality tools that automate powerful analysis mechanisms and are even integrated into widely used development environments. Still, most of these tools concentrate on code analysis, and in many cases they are far from being simple enough to be employed by software engineers without experience in formal methods. Another important problem for the adoption of tool support for formal methods is scalability. Automated software analysis is intrinsically complex, and its techniques do not scale well in the general case. In this project, we will attempt to identify particular modelling, design, specification, or coding activities in software development processes where automated formal analysis techniques can be applied. By focusing on very specific application domains, we expect to find characteristics that can be exploited to increase the scalability of the corresponding analyses compared to the general case.
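As a taste of the kind of automated check the project envisions, the sketch below uses the Z3 SMT solver's Python bindings to find a counterexample to an innocuous-looking code idiom; the property is illustrative only and is not a tool produced by this project:

```python
# SMT-based analysis sketch with Z3: check whether the classic binary
# search midpoint idiom (lo + hi) / 2 can overflow 32-bit signed ints.
from z3 import BitVec, Solver, sat

lo, hi = BitVec('lo', 32), BitVec('hi', 32)
s = Solver()
s.add(0 <= lo, lo <= hi)   # signed bounds, as in an int binary search
mid = (lo + hi) / 2        # the suspect midpoint computation
s.add(mid < lo)            # overflow symptom: midpoint falls below lo
if s.check() == sat:       # sat means a concrete counterexample exists
    print("counterexample:", s.model())
```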
Abstract:
Monkey, neuron, auditory cortex, temporal processing, nonlinear interaction, sequence, temporal coding
Abstract:
Object-oriented simulation, mechatronic systems, non-iterative algorithm, electric components, piezo-actuator, symbolic computation, Maple, Sparse-Tableau, Library of components
Abstract:
Speaker Recognition, Speaker Verification, Sparse Kernel Logistic Regression, Support Vector Machine