944 results for Logic, Symbolic and mathematical
Abstract:
In the context of computer numerical control (CNC) and computer-aided manufacturing (CAM), programming-language capabilities such as symbolic and intuitive programming, program portability and a rich geometrical portfolio are especially important: they save time, help avoid errors during part programming, and permit code re-use. Our updated literature review indicates that the current state of the art presents gaps in parametric programming, program portability and programming flexibility. In response, this article presents a compiler implementation for EGCL (Extended G-code Language), a new, enriched CNC programming language that allows descriptive variable names, geometrical functions and flow-control statements (if-then-else, while). Our compiler produces low-level, generic, elementary ISO-compliant G-code, thus allowing flexibility in the choice of the executing CNC machine and in portability. Our results show that readable variable names and flow-control statements make part programming simpler and more intuitive and permit re-use of the programs. Future work includes allowing the programmer to define custom functions in EGCL itself, in contrast to the current status of having them only as built-in library functions.
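The abstract describes compiling parametric code with flow control down to elementary G-code. The sketch below is purely illustrative (it is not the authors' compiler, and the EGCL-style names, parameters and G-code dialect are assumptions): a parametric while loop over descriptive variables is expanded into a flat sequence of elementary moves.

```python
# Minimal sketch (not the authors' compiler): expanding a parametric,
# EGCL-style drilling loop into elementary ISO-style G-code. Variable
# names, loop bounds and the dialect here are illustrative assumptions.

def drill_row(hole_count, spacing_mm, depth_mm, feed_mm_min):
    """Emit elementary G-code for a row of equally spaced drill holes."""
    lines = ["G90", "G21"]          # absolute positioning, millimetres
    x = 0.0
    i = 0
    while i < hole_count:           # flow control resolved at expansion time
        lines.append(f"G0 X{x:.3f} Z5.000")                   # rapid to clearance
        lines.append(f"G1 Z{-depth_mm:.3f} F{feed_mm_min}")   # drill down
        lines.append("G0 Z5.000")                             # retract
        x += spacing_mm
        i += 1
    return lines

program = drill_row(hole_count=3, spacing_mm=10.0, depth_mm=4.0, feed_mm_min=120)
print("\n".join(program))
```

A change to a single parameter (e.g. `spacing_mm`) regenerates the whole low-level program, which is the kind of re-use the abstract attributes to parametric programming.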
Abstract:
The Ukraine crisis and Russia’s contribution to it have raised numerous concerns regarding the possible emergence of a new ‘Cold War’ in Europe. At the same time, Ukraine’s popular choice and enthusiasm for European integration, expressed clearly on the streets of Kyiv, seem to have caused Russia to adopt a (neo)revisionist attitude. In this context, relations between Russia and the EU (and the West, for that matter) have been limited, frozen and directed on a path towards conflict. This article analyses how the traditional dichotomy between conflict and cooperation in EU–Russia relations was replaced by conflict in the context of the Ukraine crisis. The article contends that the breakdown of the symbolic and peaceful cohabitation between the EU and Russia has been influenced by the fact that both actors chose to ignore key tensions that characterized their post-Cold War interactions. The article identifies three such tensions: the first emphasizes divisions between EU member states and their impact on forging a common EU approach towards Russia; the second (geopolitical) tension highlights the almost mutually exclusive way in which the EU’s and Russia’s security interests have developed in the post-Soviet space; finally, the third contends that a clash of values and worldviews between the EU and Russia makes conflict virtually unavoidable.
Abstract:
Dyscalculia is a brain-based condition that makes it hard to make sense of numbers and mathematical concepts. Some adolescents with dyscalculia cannot grasp basic number concepts; they work hard to learn and memorize basic number facts. They may know what to do in mathematics classes but not understand why they are doing it; in other words, they miss the logic behind it. However, the condition can be worked on in order to decrease its severity. For example, disMAT, an app developed for Android, may help children apply mathematical concepts without much effort, which makes it a promising tool for dyscalculia treatment. Thus, this work focuses on the development of an Intelligent System to estimate evidence of dyscalculia in children, based on data obtained on-the-fly with disMAT. The computational framework is built on top of a Logic Programming framework for Knowledge Representation and Reasoning, complemented with a Case-Based problem-solving approach to computing that allows for the handling of incomplete, unknown, or even contradictory information.
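The abstract combines Case-Based Reasoning with incomplete information. One common way such systems cope with missing data is to compare cases only over the features known in both. The sketch below is illustrative only (it is not the paper's system; the feature names, the toy case base and the similarity rule are all assumptions):

```python
# Minimal Case-Based Reasoning sketch (illustrative; not the paper's
# system): retrieve the most similar past case while ignoring unknown
# features, one simple way to handle incomplete information.

UNKNOWN = None  # marker for an unanswered disMAT-style exercise

def similarity(case_a, case_b):
    """Fraction of matching answers over features known in both cases."""
    shared = [k for k in case_a
              if case_a[k] is not UNKNOWN and case_b.get(k) is not UNKNOWN]
    if not shared:
        return 0.0
    return sum(case_a[k] == case_b[k] for k in shared) / len(shared)

def retrieve(case_base, new_case):
    """Return the stored case most similar to the new, possibly incomplete one."""
    return max(case_base, key=lambda past: similarity(past["answers"], new_case))

case_base = [  # hypothetical labelled cases
    {"answers": {"count": 1, "compare": 0, "add": 0}, "label": "evidence"},
    {"answers": {"count": 1, "compare": 1, "add": 1}, "label": "no evidence"},
]
new_case = {"count": 1, "compare": 0, "add": UNKNOWN}  # one answer missing
best = retrieve(case_base, new_case)
print(best["label"])
```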
Abstract:
The soil carries out a wide range of functions, and it is important to study the effects of land use on soil quality in order to promote more sustainable practices. Three field trials were considered to assess soil quality and functionality after human alteration, and to determine the power of soil enzymatic activities, biochemical indexes and mathematical models in the evaluation of soil status. The first field was characterized by conventional and organic management, in which tillage effects were also tested. The second was characterized by conventional, organic and agro-ecological management. Finally, the third was a beech forest where the effects of N deposition on soil organic carbon sequestration were tested. Results highlight that both enzyme activities and biochemical indexes can be valid parameters for soil quality evaluation. Conventional management and plowing negatively affected soil quality and functionality, with intensive tillage leading to a decline in microbial biomass and activity. Both organic and agro-ecological management proved to be good practices for the maintenance of soil functionality, with better microbial activity and metabolic efficiency; this also positively affected soil organic carbon content. At the eutrophic forest, enzyme activities and biochemical indexes responded positively to the treatments, but one year of experimentation proved not to be enough to observe variation in soil organic carbon content. Mathematical models and biochemical indicators proved to be valid tools for assessing soil quality; nonetheless, it would be better to include the microbial component in the mathematical model and to consider more than one index if the aim of the work is to evaluate the overall soil quality and functionality.
In conclusion, the forest site is the richest in terms of organic carbon, microbial biomass and activity, while the organic and the agro-ecological management seem to be the most sustainable, though without taking yield into consideration.
Abstract:
This Thesis is composed of a collection of works written in the period 2019-2022, whose aim is to find methodologies of Artificial Intelligence (AI) and Machine Learning to detect and classify patterns and rules in argumentative and legal texts. We call our approach “hybrid”, since we aimed at designing hybrid combinations of symbolic and sub-symbolic AI, involving both “top-down” structured knowledge and “bottom-up” data-driven knowledge. A first group of works is dedicated to the classification of argumentative patterns. Following the Waltonian model of argument and the related theory of Argumentation Schemes, these works focused on the detection of argumentative support and opposition, showing that argumentative evidence can be classified at fine-grained levels without resorting to highly engineered features. To show this, our methods involved not only traditional approaches such as TF-IDF, but also some novel methods based on Tree Kernel algorithms. After the encouraging results of this first phase, we explored the use of some emerging methodologies promoted by actors like Google, which have deeply changed NLP since 2018-19, i.e., Transfer Learning and language models. These new methodologies markedly improved our previous results, providing us with best-performing NLP tools. Using Transfer Learning, we also performed a Sequence Labelling task to recognize the exact span of argumentative components (i.e., claims and premises), thus connecting portions of natural language to portions of arguments (i.e., to the logical-inferential dimension). The last part of our work was dedicated to the employment of Transfer Learning methods for the detection of rules and deontic modalities. In this case, we explored a hybrid approach which combines structured knowledge coming from two LegalXML formats (i.e., Akoma Ntoso and LegalRuleML) with sub-symbolic knowledge coming from pre-trained (and then fine-tuned) neural architectures.
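Among the baselines the thesis names is TF-IDF. A minimal, self-contained sketch of that representation (the two-sentence corpus and whitespace tokenisation below are illustrative assumptions, not the thesis's data) looks like this:

```python
# Minimal TF-IDF sketch of the kind of baseline representation mentioned
# in the abstract (illustrative only; corpus and tokenisation are assumed).
import math
from collections import Counter

docs = [
    "expert testimony supports the claim",
    "the witness opposes the claim",
]
tokenised = [d.split() for d in docs]
vocab = sorted({w for doc in tokenised for w in doc})

def tf_idf(doc_tokens):
    """Map one document to a dict of TF-IDF weights over the vocabulary."""
    counts = Counter(doc_tokens)
    vec = {}
    for w in vocab:
        tf = counts[w] / len(doc_tokens)
        df = sum(w in d for d in tokenised)          # document frequency
        idf = math.log(len(tokenised) / df) if df else 0.0
        vec[w] = tf * idf
    return vec

v0 = tf_idf(tokenised[0])
# words shared by every document get idf = 0; distinctive words do not
print(v0["the"], v0["testimony"] > 0)
```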
Abstract:
Allostery is a phenomenon of fundamental importance in biology, allowing regulation of function and dynamic adaptability of enzymes and proteins. Although the allosteric effect was first observed more than a century ago, allostery remains a biophysical enigma, defined as the “second secret of life”. The challenge is mainly associated with the rather complex nature of allosteric mechanisms, which manifest themselves as the alteration of the biological function of a protein/enzyme (e.g. ligand/substrate binding at the active site) by the binding of an “other object” (“allos stereos” in Greek) at a site distant (> 1 nanometer) from the active site, namely the effector site. Thus, at the heart of allostery there is signal propagation from the effector to the active site through a dense protein matrix, with a fundamental challenge being the elucidation of the physico-chemical interactions between amino acid residues that allow communication between the two binding sites, i.e. the “allosteric pathways”. Here, we propose a multidisciplinary approach based on a combination of computational chemistry, involving molecular dynamics simulations of protein motions; (bio)physical analysis of allosteric systems, including multiple sequence alignments of known allosteric systems; and mathematical tools based on graph theory and machine learning, which can greatly help in understanding the complexity of the dynamical interactions involved in different allosteric systems. The project aims at developing robust and fast tools to identify unknown allosteric pathways. The characterization and prediction of such allosteric spots could elucidate and fully exploit the power of allosteric modulation in enzymes and DNA-protein complexes, with great potential applications in enzyme engineering and drug discovery.
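The graph-theory idea the abstract mentions can be made concrete in a few lines: treat residues as nodes, spatial contacts as edges, and take a shortest contact chain from an effector-site residue to an active-site residue as one candidate allosteric pathway. The contact graph below is a made-up toy, not a real protein, and the project's actual algorithms are not specified in the abstract:

```python
# Sketch of the graph-theory idea (illustrative; the residue-contact graph
# below is a hypothetical toy): a candidate "allosteric pathway" as the
# shortest chain of residue contacts between effector and active sites.
from collections import deque

contacts = {  # hypothetical residue-contact graph
    "EFF1": ["R10"], "R10": ["EFF1", "R25"], "R25": ["R10", "R40"],
    "R40": ["R25", "ACT1"], "ACT1": ["R40"],
}

def shortest_path(graph, start, goal):
    """Breadth-first search for the shortest contact chain between residues."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no chain of contacts links the two sites

print(shortest_path(contacts, "EFF1", "ACT1"))
```

Real pipelines weight edges by interaction strength or correlated motion from molecular dynamics rather than treating all contacts equally.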
Abstract:
In this thesis, we investigate the role of applied physics in epidemiological surveillance through the application of mathematical models, network science and machine learning. The spread of a communicable disease depends on many biological, social, and health factors. The large masses of data available make it possible, on the one hand, to monitor the evolution and spread of pathogenic organisms; on the other hand, to study the behavior of people, their opinions and habits. Presented here are three lines of research in which an attempt was made to solve real epidemiological problems through data analysis and the use of statistical and mathematical models. In Chapter 1, we applied language-inspired Deep Learning models to transform influenza protein sequences into vectors encoding their information content. We then attempted to reconstruct the antigenic properties of different viral strains using regression models and to identify the mutations responsible for vaccine escape. In Chapter 2, we constructed a compartmental model to describe the spread of a bacterium within a hospital ward. The model was informed and validated on time series of clinical measurements, and a sensitivity analysis was used to assess the impact of different control measures. Finally (Chapter 3) we reconstructed the network of retweets among COVID-19 themed Twitter users in the early months of the SARS-CoV-2 pandemic. By means of community detection algorithms and centrality measures, we characterized users’ attention shifts in the network, showing that scientific communities, initially the most retweeted, lost influence over time to national political communities. In the Conclusion, we highlighted the importance of the work done in light of the main contemporary challenges for epidemiological surveillance. 
In particular, we present reflections on the importance of nowcasting and forecasting, the relationship between data and scientific research, and the need to unite the different scales of epidemiological surveillance.
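The compartmental model of Chapter 2 is not specified in the abstract; a minimal discrete-time sketch of the general idea (an SIS-type model of colonised vs. susceptible patients in a ward, with every parameter value an assumption) is:

```python
# Minimal compartmental sketch (SIS-type; the thesis's actual model
# structure and parameters are not given in the abstract, so the rates
# and ward size here are assumptions).

def simulate_sis(n_patients=20, i0=1, beta=0.08, gamma=0.05, days=200):
    """Discrete-time SIS: daily time series of colonised patients."""
    s, i = n_patients - i0, i0
    series = [i]
    for _ in range(days):
        new_col = beta * s * i / n_patients   # transmission within the ward
        new_dec = gamma * i                   # decolonisation / discharge
        s += new_dec - new_col
        i += new_col - new_dec
        series.append(i)
    return series

series = simulate_sis()
# with beta > gamma the colonised count grows towards an endemic level
print(round(series[-1], 2))
```

A sensitivity analysis of the kind the abstract describes would rerun such a simulation while varying `beta` and `gamma` (e.g. to mimic hygiene or screening measures) and compare the resulting trajectories.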
Abstract:
My thesis falls within the framework of physics education and the teaching of mathematics. The objective of this report was pursued by using geometrical (in mathematics) and qualitative (in physics) problems. We prepared four open-answer exercises for mathematics and three for physics. The test population was selected across two school phases: the end of middle school (third year, 8th grade) and the beginning of high school (second and third year, 10th and 11th grades respectively). High school students achieved the best results in almost every problem, but 10th grade students got the best overall results. Moreover, a clear tendency not to even attempt the qualitative problems emerged from the first collection of graphs, regardless of subject and grade. In order to improve students' problem-solving skills, it is worth investing in vertical learning and spiral curricula. It would make sense to establish a stronger and clearer connection between physics and mathematical knowledge through an interdisciplinary approach.
Abstract:
One of the great challenges for the scientific community working on theories of genetic information, genetic communication and genetic coding is to determine a mathematical structure related to DNA sequences. In this paper we propose a model of an intra-cellular transmission system of genetic information, similar to a model of a power- and bandwidth-efficient digital communication system, in order to identify a mathematical structure in DNA sequences where such sequences are biologically relevant. The model of a transmission system of genetic information is concerned with the identification, reproduction and mathematical classification of the nucleotide sequence of single-stranded DNA by the genetic encoder. Hence, a genetic encoder is devised in which labelings and cyclic codes are established. Establishing the algebraic structure of the corresponding code alphabets, mappings, labelings, primitive polynomials (p(x)) and code generator polynomials (g(x)) is quite important in characterizing subclasses of G-linear error-correcting codes. These latter codes are useful for the identification, reproduction and mathematical classification of DNA sequences. The characterization of this model may contribute to the development of a methodology that can be applied in the analysis of mutations and polymorphisms, the production of new drugs and genetic improvement, among other things, resulting in reduced time and laboratory costs.
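The two building blocks the abstract names, labelings and cyclic codes, can be illustrated concretely. The mapping below (nucleotides to the ring Z4) is one common choice and not necessarily the paper's labeling, and the four-word "code" is a toy, not a G-linear code from the paper:

```python
# Minimal sketch of the labeling idea (the specific mapping is an assumed,
# common choice): nucleotides are mapped to Z4, and a code is "cyclic"
# when every cyclic shift of a codeword is again a codeword.

LABEL = {"A": 0, "C": 1, "G": 2, "T": 3}  # assumed labeling over Z4

def to_z4(seq):
    return [LABEL[nt] for nt in seq]

def cyclic_shifts(word):
    return {tuple(word[i:] + word[:i]) for i in range(len(word))}

def is_cyclic(code):
    """True if the set of Z4 words is closed under cyclic shifts."""
    words = {tuple(w) for w in code}
    return all(cyclic_shifts(list(w)) <= words for w in words)

# toy code: all cyclic shifts of ACGT, expressed as Z4 words
code = [to_z4(s) for s in ("ACGT", "CGTA", "GTAC", "TACG")]
print(is_cyclic(code))
```

In the paper's setting, the algebraic structure (generator polynomial g(x) over the chosen alphabet) is what singles out codewords, rather than an explicit list as here.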
Abstract:
Development of processing technology and equipment requires new methods and better quality of the processed product. In continuous drying processes, the use of equipment that increases the transfer coefficients becomes of major interest. The use of vibrational energy applied to dispersed media has been recommended for such materials. Thus, a literature review on mass transfer and drying in vibro-fluidized beds was carried out, presenting experimental results and mathematical modeling.
Abstract:
This paper discusses theoretical results of the research project Linguistic Identity and Identification: A Study of Functions of Second Language in Enunciating Subject Constitution. It focuses on non-cognitive factors that have a crucial incidence on the degree of success and on the ways of accomplishment of the second language acquisition process. A transdisciplinary perspective is adopted, mobilising categories from Discourse Analysis and Psychoanalysis. The most relevant ones are: discursive formation, intradiscourse, interdiscourse, forgetting n° 1, forgetting n° 2 (Pêcheux, 1982), identity, identification (Freud, 1966; Lacan, 1977; Nasio, 1995). Revuz's views (1991) are discussed. Her main claim is that during the process of learning a foreign language, the foundations of psychical structure, and consequently of the first language, are required. After examining how nomination and predication processes work in first and second languages, components of identity and identification processes are focused on, in an attempt to show how second language acquisition strategies depend on them. It is stated that the methodological affairs of language teaching, the learner's explicit motivation and the like are subordinated to the comprehension of deeper non-cognitive factors that determine the accomplishment of the second language acquisition process. It is also pointed out that those factors are to be approached by questioning the bipolar biological-social conception of subjectivity in the study of language acquisition and use, and by including in the analysis the symbolic and significant dimensions of the discourse constitution process.
Abstract:
In response to methodological concerns associated with previous research into the educational characteristics of students with high or low self-concept, the topic was re-examined using a significantly more representative sample and a contemporary self-concept measure. From an initial screening of 515 preadolescent, coeducational students in 18 schools, students significantly high or low in self-concept were compared using standardized tests in reading, spelling, and mathematics, and teacher interviews to determine students' academic and nonacademic characteristics. The teachers were not informed of the self-concept status of the students. Compared to students with low self-concept, students with high self-concept were rated by teachers as being more popular, cooperative, and persistent in class, showed greater leadership, were lower in anxiety, had more supportive families, and had higher teacher expectations for their future success. Teachers observed that students with low self-concept were quiet and withdrawn, while peers with high self-concept were talkative and more dominating with peers. Students with lower self-concepts were also lower than their peers in reading, spelling, and mathematical abilities. The findings support the notion that there is an interactive relationship between self-concept and achievement. (C) 1998 John Wiley & Sons, Inc.
Abstract:
New techniques in air-displacement plethysmography seem to have overcome many of the previous problems of poor reproducibility and validity. These have made body-density measurements available to a larger range of individuals, including children, elderly and sick patients who often have difficulties in being submerged underwater in hydrodensitometry systems. The BOD POD air-displacement system (BOD POD body composition system; Life Measurement Instruments, Concord, CA, USA) is more precise than hydrodensitometry, is simple and rapid to operate (approximately 1 min measurements), and its results agree closely with those of hydrodensitometry (e.g. +/-3.4% for estimation of body fat). Body line scanners employing the principles of three-dimensional photography are potentially able to measure the surface area and volume of the body and its segments even more rapidly (approximately 10 s), but the validity of the measurements needs to be established. Advances in IR spectroscopy and mathematical modelling for calculating the area under the curve have improved precision for measuring enrichment of ²H₂O in studies of water dilution (CV 0.1-0.9% within the range of 400-1000 µl/l) in saliva, plasma and urine. The technique is rapid and compares closely with mass spectrometry (bias 1 (SD 2)%). Advances in bedside bioelectrical-impedance techniques are making it potentially possible to measure skinfold thicknesses and limb muscle mass electronically. Preliminary results suggest that the electronic method is more reproducible (intra- and inter-individual reproducibility for measuring skinfold thicknesses) and associated with less bias (+12%) than anthropometry (+40%). In addition to these selected examples, the 'mobility' or transfer of reference methods between centres has made the distinction between reference and bedside or field techniques less distinct than in the past.
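The body-fat estimates in the abstract are derived from whole-body density. One standard conversion (not necessarily the one used in the work above) is the Siri equation:

```python
# Body density to body-fat percentage via the Siri (1961) equation,
# %fat = 495 / density - 450, with density in g/cm^3. This is one
# standard conversion, not necessarily the equation used in the abstract.

def siri_percent_fat(density_g_cm3):
    """Estimate body-fat percentage from whole-body density."""
    return 495.0 / density_g_cm3 - 450.0

print(round(siri_percent_fat(1.05), 1))  # → 21.4
```

Because the denominator is the measured density, the quoted +/-3.4% agreement between plethysmography and hydrodensitometry in fat estimates traces back to small differences in the density each method reports.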
Abstract:
An equivalent algorithm is proposed to simulate thermal effects of the magma intrusion in geological systems, which are composed of porous rocks. Based on the physical and mathematical equivalence, the original magma solidification problem with a moving boundary between the rock and intruded magma is transformed into a new problem without the moving boundary but with a physically equivalent heat source. From the analysis of an ideal solidification model, the physically equivalent heat source has been determined in this paper. The major advantage in using the proposed equivalent algorithm is that the fixed finite element mesh with a variable integration time step can be employed to simulate the thermal effect of the intruded magma solidification using the conventional finite element method. The related numerical results have demonstrated the correctness and usefulness of the proposed equivalent algorithm for simulating the thermal effect of the intruded magma solidification in geological systems. (C) 2003 Elsevier B.V. All rights reserved.
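The fixed-mesh strategy the abstract describes resembles the well-known apparent-heat-capacity treatment of latent heat: the energy released over the solidification interval is folded into an enlarged effective heat capacity, so no moving boundary needs to be tracked. The 1D explicit sketch below is illustrative only; the paper's actual equivalent source term, geometry and parameters are not given in the abstract, and every number here is an assumption:

```python
# Illustrative fixed-mesh sketch (an apparent-heat-capacity analogue of
# the equivalent-source idea; all parameters are assumptions): latent heat
# over the solidification interval enlarges the effective heat capacity,
# so a fixed grid with standard time stepping suffices.

def step(T, alpha, dt, dx, T_sol, T_liq, latent_boost):
    """One explicit finite-difference step with an effective heat capacity."""
    new = T[:]
    for i in range(1, len(T) - 1):
        # effective capacity is larger inside the solidification interval
        c_eff = 1.0 + (latent_boost if T_sol < T[i] < T_liq else 0.0)
        new[i] = T[i] + (alpha * dt / (c_eff * dx * dx)) * (T[i-1] - 2*T[i] + T[i+1])
    return new

# hot "magma" column cooling into cold "rock" boundaries held at 0
T = [0.0] + [1200.0] * 8 + [0.0]
for _ in range(500):
    T = step(T, alpha=1.0, dt=0.1, dx=1.0,
             T_sol=700.0, T_liq=1000.0, latent_boost=4.0)
print(round(max(T), 1))
```

The enlarged `c_eff` slows cooling through the solidification interval, mimicking the latent-heat release that the equivalent heat source represents, while the mesh stays fixed as in the paper's finite element setting.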
Abstract:
The concept of local concurrence is used to quantify the entanglement between a single qubit and the remainder of a multiqubit system. For the ground state of the BCS model in the thermodynamic limit the set of local concurrences completely describes the entanglement. As a measure for the entanglement of the full system we investigate the average local concurrence (ALC). We find that the ALC satisfies a simple relation with the order parameter. We then show that for finite systems with a fixed particle number, a relation between the ALC and the condensation energy exposes a threshold coupling. Below the threshold, entanglement measures besides the ALC are significant.
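For a single qubit of a *pure* state, the local concurrence has a closed form: C = 2·sqrt(det ρ), where ρ is the qubit's reduced density matrix. The sketch below illustrates this on a toy two-qubit state of the form α|00⟩ + β|11⟩, not on the BCS ground state studied in the paper:

```python
# Toy illustration of local concurrence (a two-qubit pure state, not the
# BCS ground state): for one qubit of a pure state, C = 2*sqrt(det rho),
# where rho is that qubit's reduced density matrix.
import math

def local_concurrence(alpha, beta):
    """Concurrence of either qubit in alpha|00> + beta|11> (|a|^2+|b|^2 = 1)."""
    # the reduced density matrix is diag(|alpha|^2, |beta|^2),
    # so det rho = |alpha|^2 * |beta|^2 and C = 2*|alpha*beta|
    det_rho = (abs(alpha) ** 2) * (abs(beta) ** 2)
    return 2.0 * math.sqrt(det_rho)

print(local_concurrence(1 / math.sqrt(2), 1 / math.sqrt(2)))  # Bell state: maximal
```

For a product state (α = 1, β = 0) the concurrence vanishes; the paper's ALC averages such single-qubit concurrences over the whole system, which is what links it to the BCS order parameter.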