Abstract:
In this dissertation, the theoretical principles governing molecular modeling were applied to the electronic characterization of the oligopeptide α3 and its variants (5Q, 7Q)-α3, as well as to the quantum description of the interaction between the aminoglycoside hygromycin B and the 30S subunit of the bacterial ribosome. In the first study, the linear, neutral dipeptides that make up these oligopeptides were modeled and then optimized toward structures of lower potential energy and appropriate dihedral angles. Three successive geometry optimization procedures, based on classical Newtonian mechanics, semi-empirical methods, and density functional theory (DFT), explored the energy landscape of each dipeptide in the search for ideal minimum-energy structures. Finally, the best conformers were characterized in terms of their electrostatic potential, ionization energy (amino acids), frontier molecular orbitals, and hopping term. From the hopping terms described in this study, it was possible in subsequent studies to characterize the charge transport properties of these peptide models. This envisions a new biosensor technology capable of diagnosing amyloid diseases, which are related to the accumulation of misfolded proteins, based on the conductivity displayed by the patient's proteins. In the second part of this dissertation, a quantum molecular modeling study of the interaction energy between a ribosomal aminoglycoside antibiotic and its receptor was carried out. Hygromycin B (hygB) is an aminoglycoside antibiotic that affects ribosomal translocation by direct interaction with the small subunit of the bacterial ribosome (30S), specifically with nucleotides in helix 44 of the 16S ribosomal RNA (16S rRNA). Given the strong electrostatic character of this binding, an energetic investigation of the binding mechanism of this complex was proposed using different values of the dielectric constant (ε = 0, 4, 10, 20, and 40), values widely used to study the electrostatic properties of biomolecules. To this end, increasing radii centered on the hygB centroid were measured from the 30S-hygB crystal structure (1HNZ.pdb), and the individual interaction energy of each enclosed nucleotide was determined by quantum calculations using the molecular fractionation with conjugate caps (MFCC) strategy. Larger dielectric constants attenuate the individual interaction energies, allowing the convergence state to be reached quickly. Only for ε = 40, however, does the total binding energy of the drug-receptor interaction stabilize, at r = 18 Å, which defines an appropriate binding pocket because it encompasses the main residues that interact most strongly with hygB: C1403, C1404, G1405, A1493, G1494, U1495, U1498, and C1496. Thus, a dielectric constant of approximately 40 is suitable for treating systems with many electric charges. By comparing the individual binding energies of the 16S rRNA nucleotides with experimental assays that determine the minimum inhibitory concentration (MIC) of hygB, it is believed that residues with high binding energies generate bacterial resistance to the drug when mutated. By the same reasoning, since residues with low interaction energies do not effectively influence the affinity of hygB for its binding site, there is no loss of effectiveness if they are replaced.
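A minimal sketch of the radial convergence analysis described above, assuming the per-nucleotide MFCC interaction energies have already been computed; all names, distances, and energies below are invented for illustration, not taken from the 1HNZ structure.

```python
# Hypothetical sketch: total binding energy vs. pocket radius, in the spirit
# of the MFCC-based analysis above. Data are illustrative, not the thesis's.

# (nucleotide, distance from the hygB centroid in angstroms, energy in kcal/mol)
nucleotides = [
    ("C1403", 4.2, -12.1), ("G1405", 5.8, -9.4),
    ("A1493", 7.1, -6.3),  ("U1498", 9.5, -3.0),
    ("G1489", 14.0, -0.4), ("A1519", 19.2, -0.1),
]

def total_binding_energy(radius):
    """Sum the pairwise interaction energies of all nucleotides inside `radius`."""
    return sum(e for _, d, e in nucleotides if d <= radius)

# Convergence check: grow the pocket until adding a shell barely changes the
# total (the abstract reports stabilization at r = 18 A for eps = 40).
for r in range(4, 22, 2):
    print(f"r = {r:2d} A  E_total = {total_binding_energy(r):7.2f} kcal/mol")
```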
Abstract:
The progress of the Internet and telecommunications has been changing the concepts of Information Technology (IT), especially with regard to outsourced services, through which organizations seek cost cutting and a sharper focus on their core business. Along with the development of such outsourcing, a new model named Cloud Computing (CC) evolved, proposing to migrate both data processing and information storage to the Internet. Among the key points of Cloud Computing are cost cutting, benefits, risks, and changes to IT paradigms. Nonetheless, the adoption of this model creates difficulties for decision-making by IT managers, mainly with regard to which solutions may go to the cloud and which service providers best fit the organization's reality. The overall aim of this research is to apply the AHP (Analytic Hierarchy Process) method to decision-making in Cloud Computing. To that end, the methodology was exploratory, with a case study applied to a nationwide organization (Federation of Industries of RN). Data collection was performed through two structured questionnaires answered electronically by IT technicians and by the company's Board of Directors. The data analysis was carried out in a qualitative and comparative way, using Web-HIPRE, a software tool for the AHP method. The results obtained confirmed the importance of applying the AHP method to decisions about the adoption of Cloud Computing, mainly because, at the time the research was carried out, the studied company already showed interest in and need for adopting CC, given the internal problems with infrastructure and availability of information that the company faced. The organization sought to adopt CC but had doubts regarding the cloud model and which service provider would best meet its real necessities. The application of the AHP thus worked as a guiding tool for choosing the best alternative, which pointed to the Hybrid Cloud as the ideal choice for starting off in Cloud Computing, considering the following aspects: the Infrastructure as a Service (IaaS) layer (processing and storage) should stay partly in the Public Cloud and partly in the Private Cloud; the Platform as a Service (PaaS) layer (software development and testing) showed a preference for the Private Cloud; and the Software as a Service (SaaS) layer (e-mails/applications) was divided, with e-mails going to the Public Cloud and applications to the Private Cloud. The research also identified the important factors in hiring a Cloud Computing provider.
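For readers unfamiliar with AHP, a minimal sketch of its core computation follows: priority weights are the principal eigenvector of a pairwise comparison matrix, checked by a consistency ratio. The matrix values are invented for illustration; this is not the thesis's Web-HIPRE model.

```python
# Minimal AHP sketch: derive priority weights for three hypothetical cloud
# options (Public, Private, Hybrid) from a pairwise comparison matrix.
import numpy as np

# A[i, j] = how strongly option i is preferred over option j (Saaty's 1-9 scale).
A = np.array([
    [1.0, 1/3, 1/5],
    [3.0, 1.0, 1/2],
    [5.0, 2.0, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)              # index of the principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                 # normalized priority vector

# Consistency ratio CR = CI / RI, with Saaty's random index RI = 0.58 for n = 3;
# CR below 0.1 is conventionally considered acceptable.
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
print("priorities:", weights.round(3), " CR =", round(ci / 0.58, 3))
```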
Abstract:
Frequency selective surfaces (FSS) are structures consisting of periodic arrays of conducting elements, called patches, which are usually very thin and printed on dielectric layers, or of apertures perforated in very thin metallic surfaces, for applications in the microwave and millimeter-wave bands. These structures are often used in aircraft, missiles, satellites, radomes, reflector antennas, high-gain antennas, and microwave ovens, for example. Their main purpose is to filter frequency bands, either transmitting or rejecting them, depending on the required application. In turn, modern communication systems such as GSM (Global System for Mobile Communications), RFID (Radio Frequency Identification), Bluetooth, Wi-Fi, and WiMAX, whose services are highly demanded by society, have required the development of antennas whose main features are low profile, low cost, and reduced dimensions and weight. In this context, the microstrip antenna presents itself as an excellent choice for today's communication systems because, in addition to intrinsically meeting the requirements mentioned, planar structures are easy to manufacture and to integrate with other components of microwave circuits. Consequently, the analysis and synthesis of these devices, owing to the wide variety of possible shapes, sizes, and operating frequencies of their elements, has been carried out with full-wave methods, such as the finite element method, the method of moments, and the finite-difference time-domain method. These methods are accurate, but at the cost of great computational effort. In this context, computational intelligence (CI) has been used successfully, as a very appropriate auxiliary tool, in the design and optimization of planar microwave structures, given the complexity of the geometry of the antennas and FSS considered. Computational intelligence is inspired by natural phenomena such as learning, perception, and decision, using techniques such as artificial neural networks, fuzzy logic, fractal geometry, and evolutionary computation. This work studies the application of computational intelligence, using metaheuristics such as genetic algorithms and swarm intelligence, to the optimization of antennas and frequency selective surfaces. Genetic algorithms are computational search methods, based on genetics and on the theory of natural selection proposed by Darwin, used to solve complex problems, e.g., problems whose search space grows with the size of the problem. Particle swarm optimization relies on collective intelligence and has been applied to optimization problems in many research areas. The main objective of this work is the use of computational intelligence in the analysis and synthesis of antennas and FSS. We considered the structures of a ring-type planar microstrip monopole and a cross-dipole FSS. We developed optimization algorithms and obtained results for the optimized geometries of the antennas and FSS considered. To validate the results, several prototypes were designed, built, and measured. The measured results showed excellent agreement with the simulated ones. Moreover, the results obtained in this study were compared with simulations from commercial software, again with excellent agreement.
Specifically, the efficiency of the CI techniques employed was evidenced by simulated and measured results, in optimizing the bandwidth of an antenna for ultra-wideband (UWB) operation using a genetic algorithm, and in optimizing the bandwidth of a pair of frequency selective surfaces, by specifying the length of the air gap between them, using a particle swarm optimization algorithm.
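A minimal particle swarm optimization sketch of the kind of search described above, tuning a single design variable (the air-gap length). The objective function is a made-up stand-in for a full-wave bandwidth simulation; none of this is the thesis's actual code or model.

```python
# Illustrative PSO: find the air-gap length g (mm) maximizing a toy bandwidth
# model that peaks near g = 6 mm. A real design would call a full-wave solver.
import random

def bandwidth(g):
    """Fictitious objective standing in for a simulated FSS bandwidth."""
    return -(g - 6.0) ** 2 + 36.0

n_particles, iters = 20, 50
w, c1, c2 = 0.7, 1.5, 1.5                        # inertia and acceleration constants
pos = [random.uniform(0.0, 12.0) for _ in range(n_particles)]
vel = [0.0] * n_particles
pbest = pos[:]                                    # each particle's personal best
gbest = max(pos, key=bandwidth)                   # swarm's global best

for _ in range(iters):
    for i in range(n_particles):
        r1, r2 = random.random(), random.random()
        vel[i] = w * vel[i] + c1 * r1 * (pbest[i] - pos[i]) + c2 * r2 * (gbest - pos[i])
        pos[i] += vel[i]
        if bandwidth(pos[i]) > bandwidth(pbest[i]):
            pbest[i] = pos[i]
        if bandwidth(pos[i]) > bandwidth(gbest):
            gbest = pos[i]

print(f"best air gap ~ {gbest:.2f} mm, bandwidth {bandwidth(gbest):.2f}")
```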
Abstract:
Image segmentation is one of the image processing problems that deserves special attention from the scientific community. This work studies unsupervised clustering and pattern recognition methods applicable to medical image segmentation. Methods based on Natural Computing have proven very attractive in such tasks and are studied here as a way to verify their applicability to medical image segmentation. This work implements the following methods: GKA (Genetic K-means Algorithm), GFCMA (Genetic FCM Algorithm), PSOKA (clustering algorithm based on PSO and K-means), and PSOFCM (clustering algorithm based on PSO and FCM). In addition, to evaluate the results given by the algorithms, clustering validity indexes are used as quantitative measures. Visual and qualitative evaluations are also carried out, mainly using data from the BrainWeb brain simulator as ground truth.
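For context, a minimal K-means sketch, the building block that GKA and PSOKA wrap with genetic and swarm search; it runs on toy 1-D intensity values rather than a real MR volume, and is purely illustrative.

```python
# Minimal K-means on scalar intensities (the core step inside GKA/PSOKA).
import random

def kmeans(values, k, iters=100):
    centers = random.sample(values, k)            # random initial centers
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:                          # assign to nearest center
            j = min(range(k), key=lambda c: abs(v - centers[c]))
            clusters[j].append(v)
        centers = [sum(c) / len(c) if c else centers[i]   # recompute means
                   for i, c in enumerate(clusters)]
    return centers

# Toy "image": three tissue-like intensity populations.
pixels = [random.gauss(m, 5) for m in (30, 90, 150) for _ in range(200)]
print(sorted(round(c, 1) for c in kmeans(pixels, 3)))
```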
Abstract:
Hospital automation is an area in constant growth. The emergence of new technologies and hardware is making processes more efficient. Nevertheless, some hospital processes are still performed manually, such as patient monitoring, which is considered critical because it involves human lives. One of the factors that should be taken into account in monitoring is the agility in detecting any abnormality in the patients' vital signs, as well as in warning the medical team involved about the anomaly. Thus, this master's thesis aims to develop an architecture to automate this process of monitoring and reporting alerts to professionals, so that emergency care can be delivered effectively. Mobile computing was used to improve communication by distributing messages between a central server located in the hospital and the mobile devices carried by the staff on duty.
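A minimal sketch of the alerting idea, assuming threshold-based anomaly detection; the reference ranges, function names, and dispatch mechanism below are hypothetical, not the thesis's architecture.

```python
# Illustrative vital-sign check and alert dispatch (stand-in for the real
# hospital messaging infrastructure).
NORMAL_RANGES = {                       # hypothetical adult reference ranges
    "heart_rate": (60, 100),            # beats per minute
    "spo2": (95, 100),                  # % oxygen saturation
    "temperature": (36.0, 37.8),        # degrees Celsius
}

def check_vitals(patient_id, vitals):
    """Return an alert message for every vital sign outside its normal range."""
    alerts = []
    for sign, value in vitals.items():
        low, high = NORMAL_RANGES[sign]
        if not (low <= value <= high):
            alerts.append(f"ALERT patient {patient_id}: {sign} = {value}")
    return alerts

def dispatch(alerts):
    """Stand-in for pushing messages to the on-duty team's mobile devices."""
    for msg in alerts:
        print("sending ->", msg)

dispatch(check_vitals("A-102", {"heart_rate": 132, "spo2": 91, "temperature": 37.1}))
```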
Abstract:
The aim of this work is to derive the Ward identity for the low-energy effective theory of a fermionic system with a hyperbolic Fermi surface coupled to a U(1) gauge field in 2+1 dimensions. These identities are important because they establish requirements for the theory to be gauge invariant. We will see that the Ward identity (WI) of the model is not preserved at one-loop order. This feature signals the presence of a quantum anomaly; in other words, a classical symmetry is broken dynamically by quantum fluctuations. Furthermore, we consider that the system is close to a quantum phase transition, and in the vicinity of a quantum critical point the fermionic excitations near the Fermi surface decay through a Landau damping mechanism. All these ingredients need to be taken explicitly into account, and this leads us to calculate vertex corrections as well as self-energy effects, which in turn lead to one-particle propagators with a non-trivial frequency dependence.
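For reference, the standard Ward-Takahashi identity that a U(1)-invariant theory is expected to satisfy relates the vertex function to the fermion propagator; this is the textbook schematic form, not necessarily the exact expression derived in the thesis, and the anomaly reported above means it fails at one-loop order:

```latex
% Schematic Ward-Takahashi identity for a U(1) gauge theory: the contracted
% vertex \Gamma^\mu equals a difference of inverse propagators G^{-1}.
\[
  q_\mu \, \Gamma^\mu(p+q,\, p) \;=\; G^{-1}(p+q) \;-\; G^{-1}(p)
\]
```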
Abstract:
One of the current challenges of Ubiquitous Computing is the development of complex applications, those that are more than simple alarms triggered by sensors or simple systems that configure the environment according to user preferences. Such applications are hard to develop because they are composed of services provided by different middleware platforms, and it is necessary to know the peculiarities of each of them, mainly their communication and context models. This thesis presents OpenCOPI, a platform that integrates various service providers, including context provision middleware. It provides a unified ontology-based context model, as well as an environment that enables the easy development of ubiquitous applications via the definition of semantic workflows containing an abstract description of the application. These semantic workflows are converted into concrete workflows, called execution plans. An execution plan consists of a workflow instance containing activities that are automated by a set of Web services. OpenCOPI supports automatic Web service selection and composition, enabling the use of services provided by distinct middleware in an independent and transparent way. Moreover, the platform supports execution adaptation in case of service failures, user mobility, and degradation of service quality. OpenCOPI is validated through case studies, specifically applications for the oil industry. In addition, this work evaluates the overhead introduced by OpenCOPI, comparing it with the benefits it provides, and the efficiency of OpenCOPI's selection and adaptation mechanisms.
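A minimal sketch of QoS-aware service selection in the spirit of the mechanism described above; the candidate services, metadata fields, and weights are invented, and this is not OpenCOPI's actual API.

```python
# Illustrative QoS-weighted selection among functionally equivalent services.
candidates = [
    # (service name, availability, latency in ms, cost per call)
    ("TempSensorA", 0.99, 120, 0.002),
    ("TempSensorB", 0.95,  40, 0.001),
    ("TempSensorC", 0.90,  30, 0.000),
]

def score(availability, latency, cost,
          w_avail=0.5, w_lat=0.3, w_cost=0.2):
    """Weighted utility: higher availability and lower latency/cost are better."""
    return (w_avail * availability
            - w_lat * latency / 1000.0
            - w_cost * cost * 100.0)

best = max(candidates, key=lambda s: score(*s[1:]))
print("selected service:", best[0])
```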
Abstract:
In this work, the technique of differential cryptanalysis, introduced in 1990 by Biham and Shamir, is applied to the Papílio cryptosystem, developed by Karla Ramos, to test it and, most importantly, to prove the technique's relevance to it, as with other block ciphers such as DES, Blowfish, and FEAL-N(X). This technique is based on the analysis of the differences between plaintexts and their respective ciphertexts, in search of patterns that assist in the discovery of the subkeys and, consequently, of the master key. These differences are obtained through XOR operations. Through this analysis, in addition to obtaining Papílio's patterns, we also seek its main characteristics and its behavior throughout its 16 rounds, identifying and replacing, when necessary, factors that can be improved in accordance with its pre-established definitions, thus providing greater security in the use of the algorithm.
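The core object of differential cryptanalysis is the difference distribution table of a cipher's S-boxes; the sketch below builds one for a small public S-box (the first row of DES S1) purely as an illustration, since Papílio's internals are not reproduced here.

```python
# Difference distribution table (DDT) of a 4-bit S-box: ddt[dx][dy] counts
# input pairs with XOR difference dx whose outputs differ by dy.
SBOX = [0xE, 0x4, 0xD, 0x1, 0x2, 0xF, 0xB, 0x8,
        0x3, 0xA, 0x6, 0xC, 0x5, 0x9, 0x0, 0x7]   # row 0 of DES S1, for example

ddt = [[0] * 16 for _ in range(16)]
for x in range(16):
    for dx in range(16):
        dy = SBOX[x] ^ SBOX[x ^ dx]
        ddt[dx][dy] += 1

# The highest-count nonzero differential is the pattern an attacker exploits
# to recover subkey bits round by round.
best = max(((dx, dy, n) for dx in range(1, 16) for dy, n in enumerate(ddt[dx])),
           key=lambda t: t[2])
print("best differential: dx=%x -> dy=%x holding for %d/16 inputs" % best)
```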
Abstract:
With the advance of the Cloud Computing paradigm, a single service offered by a cloud platform may not be enough to meet all application requirements. To fulfill such requirements, it may be necessary, instead of a single service, a composition of services that aggregates services provided by different cloud platforms. In order to generate aggregated value for the user, this composition of services provided by several Cloud Computing platforms requires a solution for platform integration, which encompasses the manipulation of a large number of non-interoperable APIs and protocols from different platform vendors. In this scenario, this work presents Cloud Integrator, a middleware platform for composing services provided by different Cloud Computing platforms. Besides providing an environment that facilitates the development and execution of applications that use such services, Cloud Integrator works as a mediator, providing mechanisms for building applications through the composition and selection of semantic Web services that take into account metadata about the services, such as QoS (Quality of Service), prices, etc. Moreover, the proposed middleware platform provides an adaptation mechanism that can be triggered in case of failure or quality degradation of one or more services used by the running application, in order to ensure its quality and availability. In this work, through a case study consisting of an application that uses services provided by different cloud platforms, Cloud Integrator is evaluated in terms of the efficiency of the service composition, selection, and adaptation processes performed, as well as the potential of using this middleware in heterogeneous computational cloud scenarios.
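A minimal sketch of the failure-triggered adaptation idea described above: equivalent providers are tried in QoS order, falling through to the next on failure. Names and the exception type are invented; this is not Cloud Integrator's actual interface.

```python
# Illustrative failover adaptation across providers from different clouds.
class ServiceUnavailable(Exception):
    pass

def call_with_adaptation(ranked_services, request):
    """Try equivalent services in QoS order; switch provider on failure."""
    for service in ranked_services:
        try:
            return service(request)
        except ServiceUnavailable:
            continue                  # adapt: fall through to the next provider
    raise RuntimeError("no provider could satisfy the request")

# Toy providers standing in for two hypothetical cloud platforms.
def storage_cloud_a(req):
    raise ServiceUnavailable("platform A is down")

def storage_cloud_b(req):
    return f"stored '{req}' on platform B"

print(call_with_adaptation([storage_cloud_a, storage_cloud_b], "report.pdf"))
```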
Abstract:
Considering a quantum gas, the foundations of standard thermostatistics are investigated in the context of the non-Gaussian statistical mechanics introduced by Tsallis and Kaniadakis. The new formalism is based on the following generalizations: i) the Maxwell-Boltzmann-Gibbs entropy and ii) the deduction of the H-theorem. Based on this investigation, we calculate a new entropy using a generalization of combinatorial analysis based on two different counting methods. The basic ingredients used in the H-theorem were a generalized quantum entropy and a generalization of the collisional term of the Boltzmann equation. The power-law distributions are parameterized by the parameters q and κ, which measure the degree of non-Gaussianity of the quantum gas. In the limit q → 1 (κ → 0), the standard results are recovered.
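For reference, the Tsallis q-entropy is the standard form of the first generalization mentioned above (shown here in its textbook form, not necessarily the exact quantum expression derived in the thesis), and it reduces to the Boltzmann-Gibbs entropy in the Gaussian limit:

```latex
% Tsallis q-entropy and its q -> 1 limit, which recovers Boltzmann-Gibbs.
\[
  S_q \;=\; k_B \, \frac{1 - \sum_i p_i^{\,q}}{q - 1},
  \qquad
  \lim_{q \to 1} S_q \;=\; -\,k_B \sum_i p_i \ln p_i
\]
```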
Abstract:
The need for precision and for approximation in numerical results gave rise to several theories; among them, we highlight Interval Mathematics. Interval Mathematics emerged in the 1960s with the research of Moore (MOORE, 1959), in which he proposed working with a mathematics based on the notion of a real interval, rather than with a single number as an approximation. This created the need to revisit and reformulate the concepts and results of classical mathematics on the basis of Moore's notion of interval. One of the areas of classical mathematics that has had many applications in engineering and science is Numerical Analysis, one of whose pillars is integral calculus and, in particular, the line integral. It is therefore highly desirable to have an integral calculus within Interval Mathematics itself. This work presents a notion of interval line integral based on the extension of integration proposed by Bedregal in (BEDREGAL; BEDREGAL, 2010). As groundwork, we first present an introduction to the perspective in which the work was carried out, considering some historical and evolutionary aspects of classical mathematics; the concepts of the classical line integral, as well as some of its most important applications; and some concepts of Interval Mathematics necessary for understanding the work. Finally, we propose an application of the line integral to a classic experiment of quantum mechanics (the diffraction of an electron through a slit) which, thanks to the use of Interval Mathematics, gives us a more detailed focus that is closer to reality.
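For readers unfamiliar with Moore's framework, these are the standard interval arithmetic operations on which any interval integral ultimately rests (stated here for reference; the thesis's interval line integral builds on the Bedregal extension cited above):

```latex
% Basic Moore interval arithmetic on closed real intervals.
\[
  [a,b] + [c,d] = [\,a+c,\; b+d\,], \qquad
  [a,b] \cdot [c,d] = [\,\min S,\; \max S\,], \quad S = \{ac,\, ad,\, bc,\, bd\}
\]
```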