926 results for Discrete Mathematics in Computer Science
Abstract:
The logic PJ is a probabilistic logic defined by adding (non-iterated) probability operators to the basic justification logic J. In this paper we establish upper and lower bounds for the complexity of the derivability problem in the logic PJ. The main result of the paper is that the complexity of the derivability problem in PJ remains the same as the complexity of the derivability problem in the underlying logic J, namely Π^p_2-complete. This implies that the probability operators do not increase the complexity of the logic, although they arguably enrich the expressiveness of the language.
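As a rough illustration of the non-iterated restriction (a sketch based on standard presentations of probabilistic justification logics, not taken from the paper itself), probability operators apply to formulas of J but cannot be nested:

```latex
% Illustrative sketch; notation assumed from standard presentations of PJ.
% t : \alpha reads "term t justifies formula \alpha" and is a formula of J.
% Well-formed in PJ: a single, non-iterated probability operator over a J-formula.
\[
  P_{\geq s}\,(t : \alpha), \qquad s \in \mathbb{Q} \cap [0,1]
\]
% Not well-formed in PJ: nested probability operators, e.g.
% P_{\geq s}\bigl(P_{\geq r}\,(t : \alpha)\bigr).
```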
Abstract:
We present a probabilistic justification logic, PPJ, to study rational belief, degrees of belief and justifications. We establish soundness and completeness for PPJ and show that its satisfiability problem is decidable. In the last part we use PPJ to provide a solution to the lottery paradox.
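For readers unfamiliar with the lottery paradox, the following sketch (with illustrative notation, not the paper's formalization) shows the tension that PPJ is used to resolve:

```latex
% Assumed notation: an n-ticket lottery with exactly one winning ticket.
% If rational belief is probability at least 1 - 1/n, then for every ticket i
\[
  P_{\geq 1-\frac{1}{n}}\,(\neg \mathit{win}_i), \qquad i = 1, \dots, n,
\]
% so each \neg win_i is believed; yet believing the conjunction
% \neg win_1 \wedge \dots \wedge \neg win_n contradicts the certainty
% that some ticket wins.
```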
Abstract:
Development of homology modeling methods will remain an area of active research. These methods aim to build increasingly accurate three-dimensional structures of as-yet uncrystallized, therapeutically relevant proteins, e.g. Class A G-Protein Coupled Receptors. Incorporating protein flexibility is one way to achieve this goal. Here, I discuss the enhancement and validation of ligand-steered modeling, originally developed by Dr. Claudio Cavasotto, via cross-modeling of newly crystallized GPCR structures. This method uses known ligands and known experimental information to optimize relevant protein binding sites by incorporating protein flexibility. The ligand-steered models reasonably reproduced the binding sites and the co-crystallized native ligand poses of the β2 adrenergic and Adenosine 2A receptors using a single template structure. They also outperformed the template choice and crude models in small-scale high-throughput docking experiments and compound selectivity studies. Next, the application of this method to develop high-quality homology models of Cannabinoid Receptor 2, an emerging non-psychotic pain management target, is discussed. These models were validated by their ability to rationalize structure-activity relationship data for two series of compounds, one of inverse agonists and one of agonists. The method was also applied to improve the virtual screening performance of the β2 adrenergic crystal structure by optimizing the binding site using β2-specific compounds. These results show the feasibility of optimizing only the pharmacologically relevant protein binding sites and the method's applicability to structure-based drug design projects.
Abstract:
In this paper we generalize the Continuous Adversarial Queuing Theory (CAQT) model (Blesa et al. in MFCS, Lecture Notes in Computer Science, vol. 3618, pp. 144–155, 2005) by considering the possibility that the router clocks in the network are not synchronized. We name the new model Non-Synchronized CAQT (NSCAQT). Clearly, this extension to the model only affects those scheduling policies that use some form of timing. In a first approach we consider the case in which, although not synchronized, all clocks run at the same speed, maintaining constant differences. In this case we show that all universally stable policies in CAQT that use the injection time and the remaining path to schedule packets remain universally stable. These policies include, for instance, Shortest in System (SIS) and Longest in System (LIS). Then, we study the case in which clock differences can vary over time, but the maximum difference is bounded. In this model we show the universal stability of two families of policies related to SIS and LIS respectively (the priority of a packet in these policies depends on the arrival time and a function of the path traversed). The bounds we obtain in this case depend on the maximum difference between clocks. This is a necessary requirement, since we also show that LIS is not universally stable in systems without bounded clock difference. We then present a new policy that we call Longest in Queues (LIQ), which gives priority to the packet that has been waiting the longest in edge queues. This policy is universally stable and, if clocks maintain constant differences, the bounds we prove do not depend on them. Finally, we provide simulation results that compare the behavior of some of these policies in a network with stochastic injection of packets.
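To make the timing-based policies concrete, here is a minimal sketch of how the three priorities could be computed; the data fields and function names are assumptions for illustration, not the paper's implementation:

```python
# Minimal sketch of the LIS, SIS and LIQ priorities; all names are illustrative.
from dataclasses import dataclass

@dataclass
class Packet:
    injection_time: float  # stamped by the injecting router's (possibly skewed) clock
    queue_wait: float      # waiting time accumulated in edge queues

def lis_key(p: Packet, now: float) -> float:
    """Longest in System: the packet with the largest time in system is served first."""
    return now - p.injection_time

def sis_key(p: Packet, now: float) -> float:
    """Shortest in System: the packet with the smallest time in system is served first."""
    return -(now - p.injection_time)

def liq_key(p: Packet) -> float:
    """Longest in Queues: uses only locally measured waiting time, so constant
    clock offsets between routers cancel out of the priority comparison."""
    return p.queue_wait

def schedule_liq(queue: list[Packet]) -> Packet:
    # Serve the packet that has waited longest in edge queues.
    return max(queue, key=liq_key)
```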
Abstract:
We present in this paper a neural-like membrane system solving the SAT problem in linear time. These neural P systems are nets of cells working with multisets. Each cell has a finite-state memory, processes multisets of symbol-impulses, and can send impulses ("excitations") to the neighboring cells. The maximal mode of rule application and the replicative mode of communication between cells are at the core of the efficiency of these systems.
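The following toy sketch (invented rules and names, not the paper's system) illustrates the maximally parallel rule-application mode credited with the systems' efficiency: within one step, rules fire until no left-hand side fits in the remaining multiset, and products only become available afterwards.

```python
# Toy sketch of one maximally parallel step over a multiset of symbol-impulses.
from collections import Counter

def maximal_step(cell: Counter, rules: list[tuple[Counter, Counter]]) -> Counter:
    """Fire rules until none is applicable; products go to `out` so they
    cannot be consumed within the same step."""
    out = Counter()
    changed = True
    while changed:
        changed = False
        for lhs, rhs in rules:
            while all(cell[s] >= n for s, n in lhs.items()):
                cell.subtract(lhs)   # consume the left-hand side
                out.update(rhs)      # produce the right-hand side
                changed = True
    cell.update(out)
    return +cell  # drop zero counts

# Example: a replicative rule a -> bb doubles impulses in one step, the kind
# of growth that lets such systems explore exponentially many SAT assignments
# within linearly many steps.
print(maximal_step(Counter("aaa"), [(Counter("a"), Counter("bb"))]))
# Counter({'b': 6})
```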
Abstract:
In the near future, wireless sensor networks (WSNs) will experience broad, large-scale deployment (millions of nodes at the national level) with multiple information sources per node, and with very specific requirements for signal processing. In parallel, the broad deployment of WSNs facilitates the definition and execution of ambitious studies, with a large input data set and high computational complexity. These computation resources, very often heterogeneous and driven on demand, can only be satisfied by high-performance Data Centers (DCs). The high economic and environmental impact of energy consumption in DCs requires aggressive energy optimization policies. The need for such policies has already been identified, but effective policies have yet to be proposed. In this context, this paper presents the following ongoing research lines and obtained results. In the field of WSNs: energy optimization in the processing nodes from different abstraction levels, including reconfigurable application-specific architectures, efficient customization of the memory hierarchy, energy-aware management of the wireless interface, and design automation for signal processing applications. In the field of DCs: energy-optimal workload assignment policies in heterogeneous DCs, resource management policies with energy consciousness, and efficient cooling mechanisms that will cooperate in minimizing the electricity bill of the DCs that process the data provided by the WSNs.
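As one concrete example of the DC-side problem, the toy sketch below (an assumption for illustration, not one of the authors' policies) greedily assigns jobs to the heterogeneous servers with the lowest marginal energy cost:

```python
# Toy energy-aware workload assignment for a heterogeneous data center.
from dataclasses import dataclass

@dataclass
class Server:
    capacity: float         # remaining compute capacity
    joules_per_unit: float  # marginal energy cost per unit of work

def assign(jobs: list[float], servers: list[Server]) -> float:
    """Place each job on the cheapest server that still fits it;
    returns the total energy consumed."""
    total = 0.0
    for load in sorted(jobs, reverse=True):  # largest jobs first
        best = min((s for s in servers if s.capacity >= load),
                   key=lambda s: s.joules_per_unit)
        best.capacity -= load
        total += load * best.joules_per_unit
    return total

servers = [Server(10, 1.0), Server(10, 2.5)]
print(assign([4, 3, 3, 5], servers))  # 24.0: the efficient server fills up first
```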
Learning and Assessing Competencies: New Challenges for Mathematics in Engineering Degrees in Spain.
Abstract:
The introduction of new degrees adapted to the European Area of Higher Education (EAHE) has involved a radically different approach to the curriculum. The new programs are structured around competencies that should be acquired. Based on these competencies, teachers must define and develop learning objectives, design teaching methods and establish appropriate evaluation systems. While most Spanish universities have incorporated methodological innovations and evaluation systems different from traditional exams, there is considerable confusion about how to teach and assess competencies and learning outcomes, as teaching and assessment have traditionally focused on knowledge. In this paper we analyze the state of the art in the mathematical courses of the new engineering degrees at some Spanish universities.
Abstract:
Cloud-based infrastructure has been increasingly adopted by industry in distributed software development (DSD) environments. Its proponents claim that its benefits include reduced cost, increased speed and greater productivity in software development. Empirical evaluations, however, are at a nascent stage of examining both the benefits and the risks of cloud-based infrastructure. The objective of this paper is to identify potential benefits and risks of using the cloud in a DSD project conducted by teams based in Helsinki and Madrid. A cross-case qualitative analysis is performed based on focus groups conducted at the Helsinki and Madrid sites. Participant observations are used to supplement the analysis. The results of the analysis indicate that the main benefits of using the cloud are rapid development, continuous integration, cost savings, code sharing, and faster ramp-up. The key risks identified by the project are dependencies, unavailability of access to the cloud, code commitment and integration, technical debt, and additional support costs. The results reveal that if such environments are not planned and set up carefully, the benefits of using the cloud in DSD projects might be overshadowed by the associated risks.
Abstract:
Writing an effective abstract is always difficult and significant work in academic writing. What kinds of abstracts are well regarded in sport science? To answer this question, 20 abstracts from top journals of sport science were analyzed in the current research. The number of words and the rhetorical moves were studied to assess the structures of the abstracts. In addition, key clauses, citations, the use of the first person pronoun, the adoption of abbreviations and acronyms, hedging and the main tense were included in the analysis of the writing skills. The results show: (1) almost all of the abstracts were non-structured, and the length varied considerably, but the average word count was about 210-220; (2) the use of writing skills such as key clauses, citations and hedging differed depending on the preference of the journal where the abstract appeared, and the main tense was selected based on the context of the abstract. In most cases, abbreviations and acronyms were allowed, while the first person pronoun was always avoided.
Abstract:
In this paper, we study a robot swarm that has to perform task allocation in an environment that features periodic properties. In this environment, tasks appear in different areas following periodic temporal patterns. The swarm has to reallocate its workforce periodically, performing a temporal task allocation that must be synchronized with the environment to be effective. We tackle temporal task allocation using methods and concepts that we borrow from the signal processing literature. In particular, we propose a distributed temporal task allocation algorithm that synchronizes robots of the swarm with the environment and with each other. In this algorithm, robots use only local information and a simple visual communication protocol based on light blinking. Our results show that a robot swarm that uses the proposed temporal task allocation algorithm performs considerably more tasks than a swarm that uses a greedy algorithm.
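A toy model of the blink-based synchronization idea (pulse-coupled oscillators, a standard construction; parameters and names are assumptions, not the paper's algorithm) might look as follows: each robot advances a phase, blinks and resets when the phase wraps, and observed blinks nudge the other robots' phases forward until blinks align.

```python
# Toy pulse-coupled (firefly-style) synchronization; all parameters illustrative.
import random

def step(phases: list[float], dt: float = 0.001, eps: float = 0.02) -> list[float]:
    phases = [p + dt for p in phases]                     # phases advance
    blinkers = [i for i, p in enumerate(phases) if p >= 1.0]
    for i in blinkers:
        phases[i] -= 1.0                                  # blink and reset
    for i in range(len(phases)):
        if i not in blinkers:                             # observed blinks pull
            phases[i] = min(phases[i] + eps * len(blinkers), 1.0)
    return phases

phases = [random.random() for _ in range(8)]
for _ in range(50_000):
    phases = step(phases)
print(sorted(round(p, 3) for p in phases))  # phases typically end up tightly clustered
```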
Abstract:
This paper presents a study in which the relationship between basic subjects (Mathematics and Physics) and applied engineering subjects (related to Machinery, Electrical Engineering, Topography and Buildings) in higher engineering education curricula is evaluated. The analysis was conducted using the academic records of 206 students over five years. Furthermore, 34 surveys and personal interviews were conducted to analyze the connections between the contents taught in each subject and to identify student perceptions of the correlation with other subjects or disciplines. At the same time, the content of the different subjects has been analyzed to verify the relationship among the disciplines. Proper coordination among subjects will allow students to relate and interconnect topics of different subjects, even those learnt in previous courses, while also helping to reduce dropout rates and student failures in successfully completing the different courses.
Abstract:
This paper describes a corpus-based analysis of the humanizing metaphor and argues that constitutive metaphor in science and technology may be highly metaphorical and active. The study, grounded in Lakoff's Theory of Metaphor and in Langacker's relational networks, consists of two phases: first, Earth Science metaphorical terms were extracted from databases and dictionaries and then contextualized by means of the "WordSmith" tool in a digitalized corpus created to establish their productivity. Second, the terms were classified to disclose the main conceptual metaphors underlying them; then the mappings and the relational networks of the metaphor were described. The results confirm the systematicity and productivity of the metaphor in this field, show evidence that the metaphoricity of scientific terms is gradable, and support the claim that Earth Science metaphors are created not only in terms of concrete salient properties and attributes, but also in terms of abstract anthropocentric projections.
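As an illustration of the contextualization phase (a toy substitute for WordSmith, with invented data), a keyword-in-context search like the following pulls each occurrence of a candidate humanizing term, such as a river's "mouth", together with its surrounding corpus text:

```python
# Toy keyword-in-context (KWIC) extraction; corpus and term are invented examples.
import re

def kwic(corpus: str, term: str, width: int = 25) -> list[str]:
    """Return one snippet per occurrence of `term`, with `width` characters
    of context on each side."""
    snippets = []
    for m in re.finditer(rf"\b{re.escape(term)}\b", corpus, re.IGNORECASE):
        left = corpus[max(0, m.start() - width):m.start()]
        right = corpus[m.end():m.end() + width]
        snippets.append(f"...{left}[{m.group()}]{right}...")
    return snippets

corpus = ("The river's mouth widens near the delta. "
          "Sediment accumulates at the mouth of the estuary.")
for line in kwic(corpus, "mouth"):
    print(line)
```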
Abstract:
New devices have made their way into everyday life in recent years, opening the doors to new ways of interacting with computers and providing different, and potentially better, solutions to some problems. But this raises the question of whether there is any way of measuring whether these new devices are suitable. This paper presents a strategy for evaluating the suitability of new interaction devices in the context of teaching children with special educational needs.