963 results for Algebra, Abstract
Abstract:
In this article, we describe a novel methodology for extracting semantic characteristics from protein structures using linear algebra in order to compose structural signature vectors, which can be used efficiently to compare and classify protein structures into fold families. These signatures are built from the pattern of hydrophobic intrachain interactions using Singular Value Decomposition (SVD) and Latent Semantic Indexing (LSI) techniques. Considering proteins as documents and contacts as terms, we have built a retrieval system that is able to find conserved contacts in samples of the myoglobin fold family and to retrieve these proteins from among proteins of varied folds with a precision of up to 80%. The classifier is a web tool available at our laboratory website. Users can search for chains similar to a given PDB entry, view and compare their contact maps, and browse their structures using a JMol plug-in.
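As a rough illustration of the document/term analogy described above (not the authors' implementation), the sketch below builds LSI-style signature vectors from a hypothetical contact-by-protein matrix with NumPy and ranks proteins by cosine similarity to a query chain; the matrix layout, rank, and toy data are assumptions made only for this example.

```python
# Illustrative sketch only -- not the authors' implementation. It assumes a
# hypothetical contact-by-protein matrix built from hydrophobic intrachain
# contacts (rows = contact "terms", columns = protein "documents").
import numpy as np

def lsi_signatures(contact_matrix: np.ndarray, rank: int = 50):
    """Project proteins into a low-rank latent space via truncated SVD (LSI)."""
    # contact_matrix: contacts x proteins, e.g. binary presence/absence
    U, s, Vt = np.linalg.svd(contact_matrix, full_matrices=False)
    k = min(rank, len(s))
    # Each column of Vt[:k], scaled by the singular values, is a protein signature.
    return (np.diag(s[:k]) @ Vt[:k, :]).T  # proteins x k signature vectors

def cosine_rank(signatures: np.ndarray, query_idx: int):
    """Rank proteins by cosine similarity to the query protein's signature."""
    q = signatures[query_idx]
    norms = np.linalg.norm(signatures, axis=1) * np.linalg.norm(q) + 1e-12
    sims = signatures @ q / norms
    return np.argsort(-sims)

# Toy usage: 6 contacts x 4 proteins
M = np.array([[1, 1, 0, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 1],
              [1, 0, 0, 0],
              [0, 1, 1, 0],
              [0, 0, 1, 1]], dtype=float)
sigs = lsi_signatures(M, rank=2)
print(cosine_rank(sigs, query_idx=0))
```

In this kind of setup, retrieval quality is driven by the choice of rank, which discards noisy contact dimensions while keeping the conserved ones.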
Abstract:
The nature of concepts is a matter of intense debate in the cognitive sciences. While traditional views claim that conceptual knowledge is represented in a unitary symbolic system, recent Embodied and Grounded Cognition (EGC) theories propose that the conceptual system is grounded in our body and influenced by the environment (Barsalou, 2008). One of the major challenges for EGC is posed by abstract concepts (ACs), such as fantasy. Recently, some EGC proposals have addressed this criticism, arguing that ACs comprise multifaceted exemplars that rely on different grounding sources beyond the sensorimotor one, including interoception, emotions, language, and sociality (Borghi et al., 2018). However, little is known about how the representation of ACs varies as a function of life experiences and their use in communication. The theoretical arguments and empirical studies comprised in this dissertation aim to provide evidence on the multiple grounding of ACs, taking into account their variety and flexibility. Study I analyzed multiple ratings on a large sample of ACs and identified four distinct subclusters. Study II validated this classification with an interference paradigm involving the motor/manual, interoceptive, and linguistic systems during a difficulty rating task. The results confirm that different grounding sources are activated depending on the kind of AC. Studies III-IV investigated the variability of institutional concepts, showing that the higher the level of expertise in law, the stronger the concrete/emotional determinants in their representation. Study V introduced a novel interactive task in which abstract and concrete sentences serve as cues to simulate conversation. Analysis of language production revealed that uncertainty and interactive exchanges increase with abstractness, with participants generating more questions and requests for clarification with abstract than with concrete sentences. Overall, the results confirm that ACs are multidimensional, heterogeneous, and flexible constructs and that social and linguistic interactions are crucial in shaping their meanings. Investigating ACs in real-time dialogues may be a promising direction for future research.
Abstract:
The abundance of visual data and the push for robust AI are driving the need for automated visual sensemaking. Computer Vision (CV) faces growing demand for models that can discern not only what images "represent" but also what they "evoke." This is a demand for tools that mimic human perception at a high semantic level, categorizing images based on concepts like freedom, danger, or safety. However, automating this process is challenging due to entropy, scarcity, subjectivity, and ethical considerations. These challenges not only impact performance but also underscore the critical need for interpretability. This dissertation focuses on abstract concept-based (AC) image classification, guided by three technical principles: situated grounding, performance enhancement, and interpretability. We introduce ART-stract, a novel dataset of cultural images annotated with ACs, which serves as the foundation for a series of experiments across four key domains: assessing the effectiveness of the end-to-end DL paradigm, exploring cognitively inspired semantic intermediaries, incorporating cultural and commonsense aspects, and neuro-symbolic integration of sensory-perceptual data with cognition-based knowledge. Our results demonstrate that integrating CV approaches with semantic technologies yields methods that surpass the current state of the art in AC image classification, outperforming the end-to-end deep vision paradigm. The results emphasize the role semantic technologies can play in developing systems that are both effective and interpretable, through capturing, situating, and reasoning over knowledge related to visual data. Furthermore, this dissertation explores the complex interplay between technical and socio-technical factors. By merging technical expertise with an understanding of human and societal aspects, we advocate for responsible labeling and training practices in visual media. These insights and techniques not only advance efforts in CV and explainable artificial intelligence but also propel us toward an era of AI development that harmonizes technical prowess with a deep awareness of its human and societal implications.
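As a loose illustration of combining visual features with knowledge-derived ones, the following sketch is a generic late-fusion baseline, not the dissertation's neuro-symbolic method and not based on ART-stract; the embeddings, concept scores, and labels are random placeholders standing in for precomputed features.

```python
# Minimal late-fusion sketch (hypothetical data, not the dissertation's method):
# concatenate visual embeddings with knowledge-derived concept scores and train
# a linear classifier to predict abstract-concept labels.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_images, d_visual, d_knowledge, n_concepts = 200, 128, 16, 4

visual = rng.normal(size=(n_images, d_visual))        # e.g. CNN/ViT embeddings
knowledge = rng.normal(size=(n_images, d_knowledge))  # e.g. commonsense scores
labels = rng.integers(0, n_concepts, size=n_images)   # abstract-concept classes

fused = np.hstack([visual, knowledge])  # simple feature-level fusion
X_tr, X_te, y_tr, y_te = train_test_split(fused, labels, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```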
Abstract:
The ΛCDM model is the simplest, yet so far the most successful, cosmological model for describing the evolution of the universe. It is based on Einstein's theory of General Relativity and accounts for the accelerated expansion of the universe by introducing the cosmological constant Λ, which represents the contribution of so-called dark energy, an entity about which very little is known with certainty. Alternative theoretical models have nonetheless been proposed to describe the effects of this mysterious quantity, for example by introducing additional degrees of freedom, as in Horndeski theory. The main goal of this thesis is to study these models using the tensor computer algebra package xAct. In particular, our aim is to implement a universal procedure that makes it possible to derive, starting from the action, the equations of motion and the time evolution of any generic model.
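For context, a standard textbook example of the kind of derivation such a procedure automates (shown here independently of xAct and not taken from the thesis) is obtaining the Friedmann equation from the Einstein-Hilbert action with a cosmological constant on a flat FLRW background:

```latex
S = \frac{1}{2\kappa}\int d^4x\,\sqrt{-g}\,\bigl(R - 2\Lambda\bigr) + S_m,
\qquad
\delta S = 0 \;\Rightarrow\; G_{\mu\nu} + \Lambda g_{\mu\nu} = \kappa T_{\mu\nu},
\qquad
H^2 \equiv \left(\frac{\dot a}{a}\right)^2 = \frac{\kappa}{3}\,\rho + \frac{\Lambda}{3}.
```

Here κ = 8πG, and the last equality is the 00-component of the field equations evaluated on the flat FLRW metric ds² = −dt² + a²(t) δ_ij dx^i dx^j; in Horndeski-type models the additional scalar degree of freedom modifies these equations, which is exactly the step the action-to-equations-of-motion procedure is meant to automate.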
Biased Random-Key Genetic Algorithms for the Winner Determination Problem in Combinatorial Auctions.
Abstract:
In this paper, we address the problem of selecting a subset of bids in a general combinatorial auction so as to maximize the overall profit under the first-price model. This winner determination problem assumes that a single bidding round is held to determine both the winners and the prices to be paid. We introduce six variants of biased random-key genetic algorithms for this problem. Three of them use a novel initialization technique that takes solutions of intermediate linear programming relaxations of an exact mixed-integer linear programming model as initial chromosomes of the population. An experimental evaluation compares the effectiveness of the proposed algorithms with the standard mixed-integer linear programming formulation, a specialized exact algorithm, and the best-performing heuristics proposed for this problem. The proposed algorithms are competitive and offer strong results, particularly for large-scale auctions.
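To make the algorithmic idea concrete, here is a minimal BRKGA sketch for the winner determination problem under the first-price model. It is only an illustration, not the paper's implementation, and it omits the proposed LP-relaxation-based initialization: the population here is purely random, and the bid data, parameters, and greedy decoder are assumptions made for the example.

```python
# Illustrative BRKGA sketch for the winner determination problem.
# Each bid is (price, set of items); a chromosome is a vector of random keys,
# decoded by sorting bids by key and greedily accepting non-overlapping ones.
import random

def decode(keys, bids):
    chosen, taken, profit = [], set(), 0.0
    for i in sorted(range(len(bids)), key=lambda i: keys[i]):
        price, items = bids[i]
        if taken.isdisjoint(items):   # accept bid only if its items are still free
            chosen.append(i)
            taken |= items
            profit += price
    return profit, chosen

def brkga(bids, pop=50, elite=10, mutants=5, rho=0.7, gens=200, seed=0):
    rng = random.Random(seed)
    n = len(bids)
    P = [[rng.random() for _ in range(n)] for _ in range(pop)]
    for _ in range(gens):
        P.sort(key=lambda c: -decode(c, bids)[0])          # best chromosomes first
        nxt = P[:elite]                                    # copy elite unchanged
        nxt += [[rng.random() for _ in range(n)] for _ in range(mutants)]
        while len(nxt) < pop:                              # biased crossover
            e, o = rng.choice(P[:elite]), rng.choice(P[elite:])
            nxt.append([e[j] if rng.random() < rho else o[j] for j in range(n)])
        P = nxt
    best = max(P, key=lambda c: decode(c, bids)[0])
    return decode(best, bids)

# Toy auction: items are integers, bids are (price, set of items).
bids = [(6, {1, 2}), (5, {2, 3}), (4, {3}), (7, {1, 4}), (2, {4})]
print(brkga(bids))
```

A design point worth noting: the decoder guarantees feasibility (no item is sold twice), so the genetic operators only ever manipulate unconstrained random keys.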
Abstract:
The aim of this study was to evaluate three transfer techniques used to obtain working casts for implant-supported prostheses, based on the marginal misfit and the strain induced in a metallic framework. Thirty working casts were obtained from a metallic master cast, each containing two implant analogues simulating a clinical situation of a three-unit implant-supported fixed prosthesis, according to the following transfer impression techniques: Group A, squared transfers splinted with dental floss and acrylic resin, sectioned and re-splinted; Group B, squared transfers splinted with dental floss and bis-acrylic resin; and Group N, squared transfers not splinted. A metallic framework was made from the metallic master cast for the marginal misfit and strain measurements. The misfit between the metallic framework and the working casts was evaluated with an optical microscope following the single-screw test protocol. Under the same conditions, the strain was evaluated using strain gauges placed on the metallic framework. The data were analyzed by one-way ANOVA followed by Tukey's test (α = 5%). For both marginal misfit and strain, there were statistically significant differences between Groups A and N (p < 0.01) and between Groups B and N (p < 0.01), with greater values for Group N. According to Pearson's test, there was a positive correlation between misfit and strain (r = 0.5642). The results of this study showed that the impression techniques with splinted transfers provided better accuracy than the non-splinted technique, regardless of the splinting material used.
Abstract:
Introduction: Hypertension (HTN) is a preventable cause of cardiovascular morbidity and mortality. Our objective was to compare the prevalence, awareness, treatment, and control of HTN between urban and riverside populations in Porto Velho, in the Amazon region. We conducted a cross-sectional study between July and December 2013 based on a household survey of individuals aged 35-80 years. Interviews using a standardized questionnaire and measurements of blood pressure (BP), weight, height, and waist circumference were performed. HTN was defined when individuals reported having the disease, received antihypertensive medications, or had a systolic BP ≥ 140 mm Hg or a diastolic BP ≥ 90 mm Hg. Awareness was based on self-reports and the use of antihypertensive medications. Control was defined as a BP ≤ 140/90 mm Hg. Among the 1410 participants, 750 (53.19%) had HTN and 473 (63.06%) were aware of the diagnosis, of whom 404 (85.41%) received pharmacological treatment but with a low control rate. The prevalence and treatment rates were higher in the urban area (55.48% vs. 48.87% [p = 0.02] and 61.25% vs. 52.30% [p < 0.01], respectively). HTN awareness was higher in the riverside area (61.05% vs. 67.36%; p < 0.01), but the control rates showed no statistically significant difference (22.11% vs. 23.43%; p = 0.69). HTN prevalence was higher in the urban population than in the riverside population. Of the hypertensive individuals in both areas, fewer than 25% had controlled HTN. Comprehensive public health measures are needed to improve the prevention and treatment of systemic arterial HTN and to prevent other cardiovascular diseases.
Abstract:
Objective. The aim of this study was to evaluate the alteration of human enamel bleached with high concentrations of hydrogen peroxide associated with different activators. Materials and methods. Fifty enamel/dentin blocks (4 × 4 mm) were obtained from human third molars and randomly divided according to the bleaching procedure (n = 10): G1 = 35% hydrogen peroxide (HP - Whiteness HP Maxx); G2 = HP + halogen lamp (HL); G3 = HP + 7% sodium bicarbonate (SB); G4 = HP + 20% sodium hydroxide (SH); and G5 = 38% hydrogen peroxide (OXB - Opalescence Xtra Boost). The bleaching treatments were performed in three sessions with a 7-day interval between them. The enamel content, before (baseline) and after bleaching, was determined using an FT-Raman spectrometer and was based on the concentrations of phosphate, carbonate, and organic matrix. Statistical analysis was performed using two-way ANOVA for repeated measures and Tukey's test. Results. The results showed no significant differences between times of analysis (p = 0.5175) for most treatments and peak areas analyzed, nor among bleaching treatments (p = 0.4184). The comparisons during and after bleaching revealed a significant difference in the HP group for the peak areas of carbonate and organic matrix, and for the organic matrix in the OXB and HP + SH groups. Tukey's analysis determined that the differences among peak areas and the interaction among treatment, time, and peak were statistically significant (p < 0.05). Conclusion. The association of activators with hydrogen peroxide was effective in altering the enamel, mainly with regard to the organic matrix.
Abstract:
This article is divided into two parts. The first presents two abstract arguments that constitute the starting point for the development of the second, which discusses some aspects of distance learning policies, i.e., the role of public authorities in encouraging and shaping this type of education. First, it seeks to establish some assumptions for understanding the emergence of this phenomenon and some of its conditioning factors and consequences; it then suggests some analytical tools that allow us to reflect on public policy in this area.
Abstract:
This article shows that the term functionalism, very often understood as a single or uniform approach in linguistics, has to be understood from different perspectives. I start by presenting an opposition similar to that between I-language and E-language in Chomsky (1986). As in the former conception, language can be understood as an abstract model of a mind-internal mechanism responsible for language production and perception or, as in the latter, as a description of the external use of language. Also, as with formalists, there are functionalists who look for cross-linguistic variation (and universals of language use) and functionalists who look for language-internal variation. It is also shown that functionalists can differ in the extent to which social variables are considered in the explanation of linguistic form.
Abstract:
This paper presents an overview of the concept of parameter in the Principles and Parameters theory, showing that (a) in the first stage, parameters were conceived as variation associated with the Principles, and (b) in the second stage, as properties of the lexicon, and more specifically as properties of functional categories. The latter view has also developed from a substantive conception of functional categories to a more formal, abstract characterization of functional heads. The paper also discusses parameters related to different levels of representation.
Abstract:
The theme of human formation is at the centre of the philosophy of education, whose object is precisely the process of human development brought about by education. Starting from the critical vigilance proper to philosophy, the text sketches a phenomenology of the present time, finding that the ideas currently prevailing in education are centred on the critique of reason and of the notions of truth and objectivity. This neo-pragmatism, which in its attempt to oppose metaphysics becomes deeply metaphysical by reducing everything to language, is contested by the authors by drawing on Marx's thought as a historicising philosophy, one that concerns not abstract subjects but real individuals, historical subjects constituted as a synthesis of social relations. To that end, the authors turn to the historical-ontological reflection on human formation contained in Marx's Economic and Philosophical Manuscripts of 1844. The article concludes by defending the proposition that access to the classics is a necessary condition for human formation.
Abstract:
Universidade Estadual de Campinas. Faculdade de Educação Física