996 results for NMR quantum computing
Abstract:
From a narratological perspective, this paper addresses theoretical issues concerning the functioning of the so-called «narrative bifurcation» in data presentation and information retrieval. Its use in cyberspace calls for its reassessment as a storytelling device. Films have shown its fundamental role in the creation of suspense. Interactive fiction and games have unveiled the possibility of plots with multiple choices, giving continuity to cinema's split-screen experiences. Using practical examples, this paper shows how this storytelling tool returns to its primitive form and ends up conditioning cloud computing interface design.
Abstract:
The Graphics Processing Unit (GPU) is present in almost every modern personal computer. Despite its special-purpose design, it has been increasingly used for general computations, with very good results. Hence, there is a growing effort from the community to seamlessly integrate this kind of device into everyday computing. However, to fully exploit the potential of a system comprising GPUs and CPUs, these devices should be presented to the programmer as a single platform. The efficient combination of the power of CPU and GPU devices is highly dependent on each device's characteristics, resulting in platform-specific applications that cannot be ported to different systems. Moreover, the most efficient work balance among devices is highly dependent on the computations to be performed and the respective data sizes. In this work, we propose a solution for heterogeneous environments based on the abstraction level provided by algorithmic skeletons. Our goal is to take full advantage of the power of all CPU and GPU devices present in a system, without the need for different kernel implementations or explicit work distribution. To that end, we extended Marrow, an algorithmic skeleton framework for multi-GPUs, to support CPU computations and to efficiently balance the workload between devices. Our approach is based on an offline training execution that identifies the ideal work balance and platform configurations for a given application and input data size. The evaluation of this work shows that combining CPU and GPU devices can significantly boost the performance of our benchmarks in the tested environments, when compared to GPU-only executions.
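As a rough sketch of the offline-training idea described above (this is not Marrow's actual API; run_on_cpu and run_on_gpu are hypothetical stand-ins for the per-device back-ends), a calibration pass can time a handful of CPU/GPU partition ratios for a given application and input size and keep the one with the shortest critical path:

    import time

    def run_on_cpu(chunk):
        # Hypothetical CPU back-end; stands in for a skeleton's CPU kernel.
        return [x * x for x in chunk]

    def run_on_gpu(chunk):
        # Hypothetical GPU back-end; a real framework would launch a kernel.
        return [x * x for x in chunk]

    def calibrate_split(data, ratios=(0.1, 0.25, 0.5, 0.75, 0.9)):
        """Offline training: try each CPU share and keep the fastest.

        At execution time the devices run concurrently, so the cost of a
        partition is the slower of its two sides (the critical path)."""
        best_ratio, best_time = ratios[0], float("inf")
        for ratio in ratios:
            cut = int(len(data) * ratio)
            t0 = time.perf_counter()
            run_on_cpu(data[:cut])
            t_cpu = time.perf_counter() - t0
            t0 = time.perf_counter()
            run_on_gpu(data[cut:])
            t_gpu = time.perf_counter() - t0
            if max(t_cpu, t_gpu) < best_time:
                best_ratio, best_time = ratio, max(t_cpu, t_gpu)
        return best_ratio

    # The chosen ratio would be stored per (application, input size) pair.
    print(calibrate_split(list(range(200_000))))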
Abstract:
Breast cancer is the most common cancer among women and a major public health problem. Worldwide, X-ray mammography is the current gold standard for medical imaging of breast cancer. However, it has some well-known limitations: false-negative rates of up to 66% in symptomatic women and false-positive rates of up to 60% are a continued source of concern and debate. These drawbacks have prompted the development of other imaging techniques for breast cancer detection, among which is Digital Breast Tomosynthesis (DBT). DBT is a 3D radiographic technique that reduces the obscuring effect of tissue overlap and appears to address both the false-negative and the false-positive rates. The 3D images in DBT can only be obtained through image reconstruction methods. These methods play an important role in a clinical setting, since the reconstruction process needs to be both accurate and fast. This dissertation deals with the optimization of iterative algorithms through parallel computing on Graphics Processing Units (GPUs), using the Compute Unified Device Architecture (CUDA) to make the 3D reconstruction faster. Iterative algorithms have been shown to produce the highest-quality DBT images, but because they are computationally intensive, their clinical use has so far been impractical. These algorithms also have the potential to reduce patient dose in DBT scans. A method of integrating CUDA into Interactive Data Language (IDL) is proposed in order to accelerate the DBT image reconstructions; this method has never before been attempted for DBT. In this work the system matrix calculation, the most computationally expensive part of the iterative algorithms, is accelerated. A speedup of 1.6 is achieved, demonstrating that GPUs can accelerate the IDL implementation.
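For context, the sketch below shows one family of such iterative algorithms, an MLEM-style multiplicative update on a toy dense system matrix (the dissertation's specific algorithm and its IDL/CUDA integration are not reproduced here; names and sizes are illustrative):

    import numpy as np

    rng = np.random.default_rng(0)
    n_rays, n_pixels = 128, 64
    A = rng.random((n_rays, n_pixels))        # toy system matrix (ray weights)
    x_true = rng.random(n_pixels)             # synthetic object
    b = A @ x_true                            # noiseless simulated projections

    x = np.ones(n_pixels)                     # flat initial estimate
    sens = A.T @ np.ones(n_rays)              # sensitivity image, computed once
    for _ in range(200):
        ratio = b / np.maximum(A @ x, 1e-12)  # measured vs. estimated data
        x *= (A.T @ ratio) / sens             # multiplicative MLEM update

    print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))

In a real DBT reconstruction the system matrix is enormous, which is why its calculation dominates the run time and is the natural target for GPU acceleration.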
Abstract:
Spin-lattice relaxation, self-diffusion coefficients and Residual Dipolar Couplings (RDCs) are the basis of well-established Nuclear Magnetic Resonance techniques for the physicochemical study of small molecules (typically organic compounds and natural products with MW < 1000 Da), as they have proved to be a powerful and complementary source of information about structural dynamic processes in solution. The work developed in this thesis consists of the application of the aforementioned NMR techniques to explore, analyze and systematize patterns of the molecular dynamic behavior of selected small molecules under particular experimental conditions. Two systems were chosen to investigate molecular dynamic behavior by these techniques: the dynamics of ion-pair formation and ion interaction in ionic liquids (ILs), and the dynamics of molecular reorientation when molecules are placed in oriented phases (alignment media). NMR spin-lattice relaxation and self-diffusion measurements were applied to study the rotational and translational molecular dynamics of the IL 1-butyl-3-methylimidazolium tetrafluoroborate, [BMIM][BF4]. The cation-anion dynamics in the neat IL and in IL-water mixtures was systematically investigated by combining multinuclear NMR relaxation techniques with diffusion data (using 1H, 13C and 19F NMR spectroscopy). Spin-lattice relaxation times (T1), self-diffusion coefficients and nuclear Overhauser effect experiments were combined to determine the conditions that favor the formation of long-lived [BMIM][BF4] ion pairs in water. For this purpose, and using the self-diffusion coefficients of cation and anion as a probe, different IL-water compositions were screened (from neat IL to infinite dilution) to find the conditions where cation and anion present equal diffusion coefficients (8% water fraction at 25 ºC). This condition, as well as the neat IL and infinite dilution, was then further studied by 13C NMR relaxation in order to determine correlation times (τc) for the molecular reorientational motion, using an iterative mathematical procedure and experimental data obtained in the temperature range between 273 and 353 K. The behavior of the self-diffusion and relaxation data obtained in our experiments points to the combination of 8% molar fraction and 298 K as the most favorable condition for the formation of long-lived ion pairs. When molecules undergo soft anisotropic motion by being placed in certain special media, Residual Dipolar Couplings (RDCs) can be measured, because of the partial alignment induced by these media. RDCs are emerging as a powerful routine tool in conformational analysis, as they complement and even outperform approaches based on the classical NMR NOE or 3J couplings. In this work, three different alignment media were characterized and evaluated in terms of integrity using 2H and 1H 1D-NMR spectroscopy, namely stretched and compressed PMMA gels, and the lyotropic liquid crystals CpCl/n-hexanol/brine and cromolyn/water. The influence that different media and degrees of alignment have on the dynamic properties of several molecules was explored. Sugars of different sizes were used, and their self-diffusion coefficients were determined, as well as conformational features using RDCs.
The results indicate that, within the alignment degree range studied (3, 5 and 6% CpCl/n-hexanol/brine for diffusion, and 5 and 7.5% for conformation), neither the diffusion nor the conformational features of the small molecules were affected. It was also possible to determine that the small-molecule diffusion coefficients measured in the alignment media were close to those observed in water, reinforcing the idea that such media do not condition molecular properties.
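As a small numerical illustration of the equal-diffusion criterion used in the ion-pair study (the numbers below are invented for the sketch, not the thesis measurements), the composition at which cation and anion diffuse equally can be located by interpolating the sign change of the difference of their diffusion coefficients:

    import numpy as np

    # Hypothetical self-diffusion coefficients vs. water fraction (m^2/s).
    x_w   = np.array([0.00, 0.02, 0.05, 0.08, 0.12, 0.20])
    D_cat = np.array([2.00, 2.40, 3.00, 3.60, 4.50, 6.50]) * 1e-11
    D_an  = np.array([1.60, 2.10, 2.85, 3.58, 4.75, 7.40]) * 1e-11

    diff = D_cat - D_an
    i = np.flatnonzero(np.sign(diff[:-1]) != np.sign(diff[1:]))[0]
    # Linear interpolation between the two bracketing compositions:
    x_eq = x_w[i] - diff[i] * (x_w[i + 1] - x_w[i]) / (diff[i + 1] - diff[i])
    print(f"cation and anion diffuse equally near x_w ~ {x_eq:.3f}")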
Abstract:
In the current innovation context, a large number of studies have analyzed the potential of the Open Innovation model. Henry Chesbrough (2003), considered the father of Open Innovation, states that companies are experiencing a "paradigm shift" in the way they develop their innovation processes and commercialize technology and knowledge. The Open Innovation model thus argues that companies can and should use resources available beyond their boundaries, this combination of internal and external ideas and technologies being crucial to achieving a leading market position. Chesbrough (2003) had already asserted that innovation is not done in isolation, and the very dynamism of the current scenario reinforces this idea. Thus, the risks inherent to the innovation process can be mitigated through partnerships between companies and institutions. The adoption of the Open Innovation model is grounded in the abundance of available knowledge, which may also provide value to the company that created it, as in the case of patent licensing. The present study aimed to identify Open Innovation practices among the partnerships mentioned by Cloud Computing providers. Using Social Network Analysis, matrices were built from the partnerships mentioned by the companies and from information obtained from secondary sources (Sousa, 2012). These relationship matrices (networks) were analyzed and represented through diagrams. It was thus possible to outline an overview of the partnerships considered strategic by the interviewed companies and to identify which of them constitute, in fact, Open Innovation practices. Of the 26 strategic partnerships mentioned in the interviews, only 11 were characterized as practices of the open model. The analysis of the practices conducted by the interviewed companies reveals some limitations in the exploitation of the Open Innovation model. Finally, some recommendations are made on the implementation of this model by small and medium-sized enterprises based on emerging technologies, as is the case of the cloud computing concept.
Abstract:
This study discusses some fundamental issues for the development and diffusion of services based on cloud computing to happen positively in several countries. To present this subject, it discusses public initiatives by the countries most advanced in cloud computing adoption, and the Brazilian position in this context. Based on the evidence presented here, it appears that the essential elements for the development and diffusion of cloud computing in Brazil have taken important steps and show signs of maturity, such as the cybercrime legislation. However, other elements still require analysis and adaptations specific to the cloud computing case, such as Intellectual Property Rights. Although broadband services are still lacking, one cannot disregard the government's effort to facilitate access for the whole of society. In contrast, the large volume of the Brazilian IT market is an attractive factor for companies seeking to invest in the country.
Abstract:
In the following text I will develop three major aspects. The first is to draw attention to what seem to have been the disciplinary fields where, despite everything, the Digital Humanities (in the broad perspective adopted here) have asserted themselves most comprehensively. I think it is here that I run the greatest risks, not only for the reasons mentioned above, but also because a significant part of the achievements and of the researchers may have escaped the look I sought to cast upon the past few decades, always influenced by my own experience and by the work carried out in the field of History. But this can be considered a work in progress, open to criticism and suggestions. A second point to note is that emphasis will be given to the main lines of development in the relationship between historical research and digital methodologies, resources and tools. Finally, I will attempt a brief analysis of how the Digital Humanities discourse has been appropriated in recent years, with admittedly debatable data and methods, because studies are still scarce and little systematic information is available that would allow one to go beyond an introductory reflection.
Abstract:
Self-assembly is a phenomenon that occurs frequently throughout the universe. In this work, two self-assembling systems were studied: the formation of reverse micelles in isooctane and in supercritical CO2 (scCO2), and the formation of gels in organic solvents. The goal was the physicochemical study of these systems and the development of an NMR methodology to study them. AOT was used as a model molecule, both to comprehensively study the widely researched water/AOT/isooctane system at different water concentrations and to assess its aggregation in supercritical carbon dioxide at different pressures. To this end, an NMR methodology was devised with which it was possible to accurately determine the hydrodynamic radius of the micelle (in agreement with DLS measurements) using diffusion ordered spectroscopy (DOSY), as well as the micellar stability and dynamics. These were mostly assessed by 1H NMR relaxation studies, which allowed the determination of correlation times and of the size of the correlating water population, in agreement with the size of the shell that interacts with the micellar layer. The encapsulation of differently-sized carbohydrates was also studied, providing an understanding of the dynamics and stability of the aggregates under such conditions. A W/CO2 microemulsion was prepared using AOT and water in scCO2, with ethanol as cosurfactant. The behaviour of the components of the system at different pressures was assessed, and it is likely that reverse microemulsions were achieved above 130 bar. The homogeneity of the system was also determined by NMR. The formation of the gel network by two small-molecule organogelators in toluene-d8 was studied by DOSY. A methodology using One-shot DOSY to acquire the spectra was designed and applied successfully. This yielded an understanding of the role of the solvent and the gelator in the aggregation process, as well as an estimate of the gelation time.
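A minimal sketch of the DOSY-based size determination mentioned above, under standard assumptions (synthetic noiseless data; Stejskal-Tanner attenuation for the gradient series, then the Stokes-Einstein relation for the hydrodynamic radius; the viscosity is a textbook value for isooctane, not a measurement from this work):

    import numpy as np

    gamma = 2.675e8              # 1H gyromagnetic ratio, rad s^-1 T^-1
    delta, Delta = 2e-3, 50e-3   # gradient pulse length and diffusion delay, s
    g = np.linspace(0.05, 0.5, 10)                 # gradient strengths, T/m
    b = (gamma * g * delta) ** 2 * (Delta - delta / 3)

    D_true = 1.0e-10                  # m^2/s, plausible for a reverse micelle
    I = np.exp(-b * D_true)           # Stejskal-Tanner: I = I0 exp(-b D)

    D_fit = -np.polyfit(b, np.log(I), 1)[0]   # slope of ln(I) vs b gives -D

    kB, T, eta = 1.380649e-23, 298.15, 4.9e-4  # eta: isooctane, Pa s (approx.)
    r_h = kB * T / (6 * np.pi * eta * D_fit)   # Stokes-Einstein radius
    print(f"D = {D_fit:.2e} m^2/s -> r_h = {r_h * 1e9:.1f} nm")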
Abstract:
To find sustainable solutions for energy production, it is necessary to create photovoltaic technologies that make every photon count. To address this need, the present work developed zinc oxide photodetectors embedded with nano-structured materials that significantly raise the conversion of solar energy into electric energy. The novelty of this work lies in the development of processing methodologies in which all steps are performed in solution: quantum dot synthesis, passivation of their surface, and sol-gel deposition. The quantum dot solutions with different capping agents were characterized by UV-visible absorption spectroscopy, spectrofluorimetry, dynamic light scattering and transmission electron microscopy. The obtained quantum dots have dimensions between 2 and 3 nm. These particles were suspended in zinc acetate solutions and used to produce doped zinc oxide films with embedded quantum dots, whose electric response was tested. The produced nano-structured zinc oxide materials show superior performance over the bulk in terms of the photo-current produced. This indicates that an intermediate-band material may have been produced, which can act as a photovoltaic medium for solar cells. The results are currently being compiled in a scientific article in preparation for possible submission to the Energy and Environmental Science or Nanoscale journals.
Abstract:
Human activity is very dynamic and subtle, and most physical environments are also highly dynamic, supporting a vast range of social practices that do not map directly into any immediate ubiquitous computing functionality. Identifying what is valuable to people is very hard and obviously leads to great uncertainty regarding the type of support needed and the resources required to create it. We have addressed the issues of system development through the adoption of a crowdsourced software development model [13]. We have designed and developed Anywhere places, an open and flexible system support infrastructure for ubiquitous computing based on a balanced combination of global services and applications with situated devices. Evaluation, however, is still an open problem. The characteristics of ubiquitous computing environments make their evaluation very complex: there are no globally accepted metrics, and it is very difficult to evaluate large-scale, long-term environments in real contexts. In this paper, we describe a first proposal of a hybrid 3D simulated prototype of Anywhere places that combines simulated and real components to generate a mixed reality which can be used to assess the envisaged ubiquitous computing environments [17].
Abstract:
This paper presents a proposal for a management model based on reliability requirements for Cloud Computing (CC). The proposal is grounded in a literature review focused on the problems, challenges and ongoing studies related to the safety and reliability of Information Systems (IS) in this technological environment. The review examined the existing obstacles and challenges from the point of view of respected authors on the subject. The main issues are addressed and structured as a model, called the "Trust Model for Cloud Computing environment". This is a proactive proposal that aims to organize and discuss management solutions for the CC environment, seeking improved reliability in the operation of IS applications for both providers and their customers. Central to trust, one of the CC challenges is the development of models for mutual audit management agreements, so that a formal relationship can be established involving the relevant legal responsibilities. To establish and control the appropriate contractual requirements, it is necessary to adopt technologies that can collect the data needed to inform risk decisions, such as access usage, security controls, location and other references related to the use of the service. In this process, cloud service providers and consumers themselves must have metrics and controls to support cloud-use management in compliance with the SLAs agreed between the parties. Organizing these studies and disseminating them in the market as a conceptual model able to establish parameters for a reliable relationship between providers and users of IT services in the CC environment offers a useful instrument to guide providers, developers and users in delivering secure and reliable services and applications.
Abstract:
The MAP-i Doctoral Program of the Universities of Minho, Aveiro and Porto.
Abstract:
Renal failure means that one's kidneys have unexpectedly stopped functioning, i.e., once chronic disease is present, the presence or degree of kidney dysfunction and its progression must be assessed, and the underlying syndrome diagnosed. Although the patient's history and physical examination may denote good practice, some key information has to be obtained from the evaluation of the glomerular filtration rate and the analysis of serum biomarkers. Indeed, chronic kidney disease denotes abnormal kidney function and/or structure, and there is evidence that treatment may avoid or delay its progression, either by reducing or by preventing the development of some associated complications, namely hypertension, obesity, diabetes mellitus, and cardiovascular complications. Acute kidney injury appears abruptly, with a rapid deterioration of renal function, but is often reversible if it is recognized early and treated promptly. In both situations, i.e., acute kidney injury and chronic kidney disease, early intervention can significantly improve the prognosis. The assessment of these pathologies is therefore mandatory, although it is hard to accomplish with traditional methodologies and existing problem-solving tools. Hence, in this work we focus on the development of a hybrid decision support system, in terms of its knowledge representation and reasoning procedures, based on Logic Programming, which allows one to consider incomplete, unknown, and even contradictory information, complemented with a computational approach centered on Artificial Neural Networks to weigh the Degree-of-Confidence that one has in such a happening. The present study involved 558 patients with an average age of 51.7 years, and chronic kidney disease was observed in 175 cases. The dataset comprises twenty-four variables, grouped into five main categories. The proposed model showed a good performance in the diagnosis of chronic kidney disease, with sensitivity and specificity values ranging between 93.1–94.9% and 91.9–94.2%, respectively.
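For reference, the reported sensitivity and specificity are the standard confusion-matrix rates; a minimal sketch of how they are computed (the label vectors are toy examples, not the study's data):

    def sensitivity_specificity(y_true, y_pred):
        # 1 = chronic kidney disease present, 0 = absent.
        tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
        tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
        fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
        fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
        sens = tp / (tp + fn)   # true positive rate: diseased cases detected
        spec = tn / (tn + fp)   # true negative rate: healthy cases cleared
        return sens, spec

    y_true = [1, 1, 1, 0, 0, 0, 0, 1]
    y_pred = [1, 1, 0, 0, 0, 1, 0, 1]
    print(sensitivity_specificity(y_true, y_pred))   # (0.75, 0.75)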
Abstract:
Recently, CdTe semiconductor quantum dots (QDs) have attracted great interest due to their unique properties [1]. Their dispersion into polymeric matrices would be very attractive for several optoelectronic applications. Despite its importance, relatively little work has been done on charge transport in QD polymeric films [2], which is mainly affected by their structural and morphological properties. In the present work, polymer-quantum dot nanocomposite films, based on polymers optically transparent in the visible spectral range and on CdTe QDs with controlled particle size and emission wavelength, were prepared via solvent casting. Photoluminescence (PL) measurements indicate different emission intensities for the nanocomposites. A blue shift of the emission peak compared to that of the QDs in solution occurred, which is attributed to changes in the QDs' environment. The morphological and structural properties of the CdTe nanocomposites were evaluated. PMMA seemed to be the most promising matrix, since it gave the best QD dispersion. Electrical measurements indicate an ohmic behavior.
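As a hedged illustration of why emission tracks particle size in this regime, the Brus effective-mass model estimates the size-dependent gap of a CdTe dot; the material parameters below are textbook values, not results from this work, and the simple model overestimates confinement for the smallest dots, so the numbers are only qualitative:

    import numpy as np

    h    = 6.62607015e-34    # Planck constant, J s
    e    = 1.602176634e-19   # elementary charge, C
    eps0 = 8.8541878128e-12  # vacuum permittivity, F/m
    m0   = 9.1093837015e-31  # electron rest mass, kg

    Eg     = 1.50 * e              # bulk CdTe band gap, J (approx.)
    me, mh = 0.11 * m0, 0.35 * m0  # CdTe effective masses (literature values)
    eps_r  = 10.2                  # CdTe dielectric constant

    def brus_gap_eV(radius):
        """First excitonic transition of a QD of the given radius (m)."""
        confinement = (h ** 2 / (8 * radius ** 2)) * (1 / me + 1 / mh)
        coulomb = 1.8 * e ** 2 / (4 * np.pi * eps_r * eps0 * radius)
        return (Eg + confinement - coulomb) / e

    for r_nm in (1.5, 2.0, 3.0):   # illustrative radii
        E = brus_gap_eV(r_nm * 1e-9)
        print(f"r = {r_nm} nm -> gap ~ {E:.2f} eV ({1239.8 / E:.0f} nm)")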