629 results for functionalities
Abstract:
The expansion of computing, new technologies and the Internet in recent years has been driven not only by the evolution of the underlying hardware, but also by the evolution of software development and the growth in the number of developers. This growth has taken software from file-based management systems, with practically no graphical interface and a few thousand lines of code, to large multi-platform distributed systems. Developing such large systems requires many people, and development tools have grown accordingly to support analysis, design, coding, testing, deployment and maintenance. The basis of these tools is provided by the development platforms themselves, but developer experience can contribute countless utilities and techniques that speed up development and satisfy software requirements by reusing sufficiently tested and optimised solutions. Grouping these tools in an orderly way creates custom frameworks, with tools of all kinds (classes, controls, interfaces, design patterns) that provide reusable, tailored solutions to a wide range of problems: either by setting development guidelines through the use of patterns, or by encapsulating complexity so that developers already have components that absorb certain logic, lightening the construction phase.

This work addresses the base technologies and development platforms needed to build a custom framework, the needs to evaluate before undertaking it, and the techniques to apply to achieve it, oriented towards the documentation, maintenance and extension of the framework. The theoretical part presents and evaluates the requirements for creating a framework and the requirements on the development platform; explains how the major current development platforms work, which elements compose them and how they operate; and sets out structuring and naming guidelines that the development of a framework must follow to remain maintainable and extensible. For the methodology, a subset of Métrica V3 has been used, since the full methodology does not apply to the development of controls, but it does provide the requirements catalogue, use cases, class diagrams, sequence diagrams, and so on.

Beyond the theoretical concepts, a practical case with didactic aims shows how to parameterise and configure development on the .NET platform. The case consists of extending a generic user control of the .NET platform, applying concepts that go beyond simply writing functions such as those an API provides: how to extend and modify existing controls that interact with other controls through events, so that the new control can become part of a widely distributed library of custom user controls. User controls have not only a functional part but also a visual part, and their functional definitions differ from those typical of business software: they must handle events, manage what is displayed while those events occur, and meet non-functional requirements such as performance optimisation. The practical case uses the .NET Framework platform in all its versions: the control to extend is the ListView control, which is made editable. This control is present in every version of the .NET Framework and has a high degree of reuse. The extension also shows how easily extensions of this kind can be migrated across framework versions. Several versions of Visual Studio were used as development environments to demonstrate this compatibility, although the development accompanying this document was carried out in Visual Studio 2013.
Abstract:
The main goal of this project is the development of an underwater communications simulator that characterises the channel from real data and uses that characterisation to establish a link between two points with different modulation techniques. The simulator, named UWACOMSIM, offers an easy-to-use graphical interface and was developed in MatLab, building on Bellhop [14] and Simulink. It was used to run simulations in different scenarios with real ocean data extracted from the WOD database [2]. The project is divided into six parts: INTRODUCTION, THEORETICAL FRAMEWORK, IMPLEMENTATION, CONCLUSIONS, MANUAL and DIDACTIC PROPOSAL, described as follows. The first part introduces the project: the motivations that led to its development, its objectives, and a historical review of underwater communications up to the current state of the art. The second part covers the theoretical foundations needed for the project: on one hand, acoustic waves and their propagation through water; on the other, the digital modulation techniques employed. The third part describes the implementation of the simulator, explaining its functionality and summarising how it was developed and its architecture, which should ease its use in future projects. The fourth part analyses the simulations carried out in several significant scenarios, using both real and artificial data for water temperature and salinity, and draws conclusions from the results. The fifth part provides a user manual so that the simulator can be used correctly, including the procedure for extracting compatible data from WOD. Finally, the didactic proposal outlines a laboratory session for the subject P.A.S.
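The abstract states that the simulator establishes a link between two points using different digital modulation techniques. As a language-agnostic illustration of the simplest such chain (the simulator itself is written in MatLab/Simulink, so this is only a sketch and not UWACOMSIM code; the SNR value and function names are illustrative), here is BPSK over an additive white Gaussian noise channel in Python:

```python
import numpy as np

rng = np.random.default_rng(0)

def bpsk_modulate(bits):
    # Map bits {0, 1} -> antipodal symbols {-1.0, +1.0}
    return 2.0 * bits - 1.0

def awgn(symbols, snr_db, rng):
    # Add white Gaussian noise for a given SNR in dB, assuming unit symbol power
    noise_power = 10.0 ** (-snr_db / 10.0)
    noise = rng.normal(0.0, np.sqrt(noise_power), size=symbols.shape)
    return symbols + noise

def bpsk_demodulate(received):
    # Hard decision: positive sample -> 1, non-positive -> 0
    return (received > 0).astype(int)

bits = rng.integers(0, 2, size=10_000)
rx = bpsk_demodulate(awgn(bpsk_modulate(bits), snr_db=10.0, rng=rng))
ber = np.mean(rx != bits)
print(f"BER at 10 dB SNR: {ber:.4f}")
```

A real channel model such as the one the thesis builds from Bellhop ray tracing would replace the plain AWGN step with the simulated impulse response of the ocean channel.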
Analysis of the ORCC and Vivado HLS Tools for the Synthesis of RVC-CAL Dataflow Models
Abstract:
This Final Degree Project studies how to generate VHDL (VHSIC Hardware Description Language) models from dataflow models written in RVC-CAL (Reconfigurable Video Coding – CAL Actor Language) by means of Vivado HLS (Vivado High-Level Synthesis), one of the tools available in Xilinx's Vivado suite. Once the resulting VHDL model is obtained, the intention is to use the Xilinx tools to program it into an FPGA (Field Programmable Gate Array) or into the Zynq device, also developed by Xilinx. RVC-CAL is a dataflow language that describes the functionality of functional blocks called actors. The functionality of an actor is defined as actions, which may differ within the same actor. Actors can communicate with one another and form a network of actors. Vivado HLS can produce a VHDL design from a model written in C, so generating VHDL models from RVC-CAL models requires a preliminary phase in which the RVC-CAL models are compiled into their C equivalents. The ORCC compiler (Open RVC-CAL Compiler) is the tool that produces C designs from RVC-CAL models. ORCC does not create executable code directly; it generates source code to be compiled by another tool, in this project the GCC (GNU C Compiler) under Linux. In short, the project comprises three distinct points of study: 1. Starting from dataflow models in RVC-CAL, which are compiled by ORCC to obtain their C translations. 2. Synthesising the equivalent C designs in Vivado HLS to obtain the VHDL models. 3. Using the Xilinx tools to turn the resulting VHDL models into a bitstream programmed into an FPGA or into the Zynq device.

The study of the second point revealed a number of conflicting elements that affect the Vivado HLS synthesis of the C designs generated by ORCC. These elements are related to how the C specification generated by ORCC is structured, which Vivado HLS cannot handle at certain stages of the synthesis. A "manual" transformation of the ORCC-generated designs has therefore been proposed that changes the original models as little as possible while allowing the Vivado HLS synthesis to complete and produce a correct VHDL file.

The document is structured as a research work. First, the motivations behind the work and the objectives it hopes to reach are presented. Next, the state of the art of the elements required for its development is analysed, providing the basic concepts needed to follow the document: a description of the RVC-CAL and VHDL languages, and an introduction to the ORCC and Vivado tools, analysing the strengths and main features of both. Once the behaviour of both tools is known, the solutions developed in our study of the synthesis of RVC-CAL models are described, highlighting the aforementioned conflicting points that Vivado HLS cannot handle when synthesising the C designs generated by the ORCC compiler. The solutions proposed for these synthesis errors are then presented; they aim at a C specification better suited to a correct synthesis in Vivado HLS, and thus at the appropriate VHDL models. Finally, a set of conclusions is drawn from all the analyses and developments carried out, and a series of future lines of work is proposed with which the study could be continued and the research developed in this document completed.
Abstract:
Oligonucleotides that recapitulate the acceptor stems of tRNAs are substrates for aminoacylation by many tRNA synthetases in vitro, even though these substrates are missing the anticodon trinucleotides of the genetic code. In the case of tRNAAla a single acceptor stem G⋅U base pair at position 3·70 is essential, based on experiments where the wobble pair has been replaced by alternatives such as I⋅U, G⋅C, and A⋅U, among others. These experiments led to the conclusion that the minor-groove free 2-amino group (of guanosine) of the G⋅U wobble pair is essential for charging. Moreover, alanine-inserting tRNAs (amber suppressors) that replace G⋅U with mismatches such as G⋅A and C⋅A are partially active in vivo and can support growth of an Escherichia coli tRNAAla knockout strain, leading to the hypothesis that a helix irregularity and nucleotide functionalities are important for recognition. Herein we investigate the charging in vitro of oligonucleotide and full-length tRNA substrates that contain mismatches at the position of the G⋅U pair. Although most of these substrates have undetectable activity, G⋅A and C⋅A variants retain some activity, which is, nevertheless, reduced by at least 100-fold. Thus, the in vivo assays are much less sensitive to large changes in aminoacylation kinetic efficiency of 3·70 variants than is the in vitro assay system. Although these functional data do not clarify all of the details, it is now clear that specific atomic groups are substantially more important in determining kinetic efficiency than is a helical distortion. By implication, the activity of mutant tRNAs measured in the in vivo assays appears to be more dependent on factors other than aminoacylation kinetic efficiency.
Abstract:
Betidamino acids (a contraction of "beta" position and "amide") are N'-monoacylated (optionally, N'-monoacylated and N-mono- or N,N'-dialkylated) aminoglycine derivatives in which each N'-acyl/alkyl group may mimic naturally occurring amino acid side chains or introduce novel functionalities. Betidamino acids are most conveniently generated on the solid supports used for peptide synthesis, by selective acylation of one of the two amino functions of orthogonally protected aminoglycine(s) to generate the side chain either before or after elongation of the main chain. We have used unresolved Nalpha-tert-butyloxycarbonyl-N'alpha-fluorenylmethoxycarbonyl aminoglycine and Nalpha-(Nalpha-methyl)-tert-butyloxycarbonyl-N'alpha-fluorenylmethoxycarbonyl aminoglycine as the templates for the introduction of betidamino acids into Acyline [Ac-D2Nal-D4Cpa-D3Pal-Ser-4Aph(Ac)-D4Aph(Ac)-Leu-Ilys-Pro-DAla-NH2, where 2Nal is 2-naphthylalanine, 4Cpa is 4-chlorophenylalanine, 3Pal is 3-pyridylalanine, Aph is 4-aminophenylalanine, and Ilys is Nepsilon-isopropyllysine], a potent gonadotropin-releasing hormone antagonist, in order to test the biocompatibility of these derivatives. The diastereomeric peptides could be separated in most cases by reverse-phase HPLC. Biological results indicated small differences in relative potency (<5-fold) between the D and L nonalkylated betidamino acid-containing Acyline derivatives. Importantly, most betide diastereomers were equipotent with Acyline. In an attempt to correlate structure and observed potency, Ramachandran-type plots were calculated for a series of betidamino acids and their methylated homologues. According to these calculations, betidamino acids have access to a more limited and distinct number of conformational states (including those associated with alpha-helices, beta-sheets, or turn structures), with deeper minima than those observed for natural amino acids.
Abstract:
One of the challenges facing chemistry is the design of molecules able to modulate protein-protein and protein-ligand interactions, since these are involved in many physiological and pathological processes. The interactions between proteins and their natural counterparts can take place through reciprocal recognition of rather large surface areas, through recognition of single contact points and single residues, or through inclusion of the substrates in specific, more or less deep binding sites. In many cases, the design of synthetic molecules able to interfere with processes involving proteins can benefit from the multivalent effect. Multivalency, widespread in Nature, consists of the simultaneous formation between two entities (cell-cell, cell-protein, protein-protein) of multiple equivalent ligand-recognition site complexes, which makes the overall interaction particularly strong and specific. Calixarenes furnish a very interesting scaffold for the preparation of multivalent ligands, and in recent years calixarene-based ligands have demonstrated a remarkable capability to recognize and to inhibit or restore the activity of different proteins, with high efficiency and selectivity in several recognition phenomena. The relevance and versatility of these ligands stems from the different geometries in which the binding units can be exposed by exploiting the conformational properties of these macrocycles, from the wide variety of functionalities that can be attached to their structure at different distances from the aromatic units, and from their intrinsically multivalent nature. With the aim of creating new multivalent systems for protein targeting, the work reported in this thesis concerns the synthesis and properties of glycocalix[n]arenes and guanidino calix[4]arenes for different purposes.

First, a new bolaamphiphilic glycocalix[4]arene in the 1,3-alternate geometry, bearing cellobiose, was synthesized for the preparation of liposome-based targeted drug-delivery systems. The stable mixed liposomes obtained by mixing the macrocycle with DOPC proved able to exploit the sugar units emerging from the lipid bilayer to agglutinate Concanavalin A, a glucose-specific lectin. Moreover, again thanks to the presence of the glycocalixarene in the layer, preliminary experiments showed that these liposomes are taken up by cancer cells overexpressing glucose receptors on their exterior surface more efficiently than simple DOPC liposomes lacking glucose units. A small library of glycocalix[n]arenes of different valency and geometry was then prepared with a view to creating potentially active immunostimulants against Streptococcus pneumoniae, particularly the 19F serotype, one of the most virulent. These glycocalixarenes, bearing β-N-acetylmannosamine as the antigenic unit, were compared with the natural polysaccharide in binding to the specific anti-19F human polyclonal antibody, to verify their inhibition potency. Among them, the glycocalixarene based on the conformationally mobile calix[4]arene proved the most efficient ligand, probably because of its greater ability to explore the antibody surface and to arrange the antigenic units properly for the interaction process. These results highlight how the different multivalent presentation of the glycosyl units in space can influence the recognition phenomena. Finally, NMR studies, in particular 1H-15N HSQC experiments, were performed on selected glycocalix[6]arenes and on guanidino calix[4]arenes blocked in the cone geometry, in order to better understand protein-ligand interactions.

The glycosylated compounds were studied with the Ralstonia solanacearum lectin to better understand the nature of carbohydrate-lectin interactions in solution. The series of cationic calixarenes was tested with three different acidic proteins: GB1, Fld and alpha-synuclein. GB1 and Fld in particular were observed to interact with all five cationic calix[4]arenes, though with different behaviours and affinities.
Abstract:
The main aim of this thesis is the controlled and reproducible synthesis of functional materials at the nanoscale. The first chapter presents a tuning of the morphology and magnetic properties of magnetite nanoparticles, achieved through an innovative approach in which an organic macrocycle (a calixarene) is used to induce the oriented aggregation of the NPs during synthesis. The method is potentially applicable to the preparation of other metal oxide NPs by thermal decomposition of the respective precursors. The products obtained, in particular the multi-core nanoparticles, show remarkable magnetic and colloidal properties, making them very interesting for biomedical applications. The synthesis and functionalisation of plasmonic Au and Ag nanoparticles is presented in the second chapter. Here a supramolecular approach was exploited to achieve controlled and potentially reversible aggregation between Au and Ag NPs, followed by UV-visible spectroscopy and dynamic light scattering. In the final chapters, plasmonic and magnetic functionalities were combined through the preparation of dimeric nanostructures. Au - Fe oxide heterodimeric nanoparticles were prepared and their magnetic properties thoroughly characterised. The results demonstrate the formation of FeO (wustite), together with magnetite, during the thermal decomposition of the iron precursor. An oxidation process that preserves the Au in the dimeric structures removed the wustite completely, forming magnetite and/or maghemite, which behave much better magnetically. The plasmon resonance of Au is damped by the presence of the iron oxide, a material with a high refractive index, but it persists when the Au domain of the nanoparticles is exposed towards the bulk. Finally, remarkable hyperthermia performance, also in vitro, was found for these structures.
Abstract:
Improving the performance of structures, whether in terms of strength, cost or comfort, is a constant goal in engineering. Improvements have been achieved through the growing use of composite materials, whose distinctive physical properties can meet design requirements. Alongside the use of composites, the study of plasticity offers an interesting alternative for increasing structural performance by conferring additional load-bearing capacity on the assembly. However, the elastoplastic analysis of composites poses some problems, in addition to the difficulties inherent in incorporating fibres into the matrix in the case of reinforced composites. The way a fibre-reinforced composite and its phases are represented and simulated is extremely important to ensure that the results obtained are compatible with reality. As more refined models are developed, problems arise concerning computational cost, as well as the need to match the degrees of freedom between the nodes of the finite element meshes of the matrix and of the reinforcement, often requiring those meshes to coincide. The present work uses formulations that represent fibre-reinforced composites without requiring mesh coincidence, and allows both the medium and the reinforcement to be simulated in the elastoplastic regime in order to better study the real behaviour. The constitutive model adopted for plasticity is the associative 2D von Mises model with positive linear hardening, and it is solved through an iterative process. The positional finite element formulation is adopted, with a Total Lagrangian description, taking the positions of the body in space as nodal parameters. To verify the correct implementation of the formulations considered, examples were analysed to validate and demonstrate the functionality of the computational code developed.
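The abstract above adopts an associative von Mises model with positive linear hardening, solved iteratively. As a hedged illustration of the underlying idea, the following is a textbook one-dimensional return-mapping step (elastic predictor / plastic corrector) in Python; the moduli and yield stress are illustrative values, and this is a sketch of the classical algorithm rather than the thesis code:

```python
# One-dimensional rate-independent plasticity with linear isotropic hardening.
# Textbook return-mapping sketch; material parameters are illustrative.
E = 200e3         # Young's modulus [MPa]
H = 10e3          # linear hardening modulus [MPa]
sigma_y0 = 250.0  # initial yield stress [MPa]

def return_map(eps, eps_p, alpha):
    """Update stress and internal variables for a given total strain."""
    sigma_trial = E * (eps - eps_p)                    # elastic predictor
    f_trial = abs(sigma_trial) - (sigma_y0 + H * alpha)
    if f_trial <= 0.0:
        return sigma_trial, eps_p, alpha               # elastic step
    # Plastic corrector: consistency condition is closed-form for linear hardening
    dgamma = f_trial / (E + H)
    sign = 1.0 if sigma_trial >= 0.0 else -1.0
    sigma = sigma_trial - E * dgamma * sign            # return to the yield surface
    return sigma, eps_p + dgamma * sign, alpha + dgamma

# Strain-driven loading past the yield point
eps_p, alpha = 0.0, 0.0
for eps in [0.0005, 0.0010, 0.0020, 0.0030]:
    sigma, eps_p, alpha = return_map(eps, eps_p, alpha)
```

In the 2D associative von Mises setting used in the thesis, the scalar consistency condition is no longer closed-form in general, which is why an iterative solution process is employed; the predictor/corrector structure, however, is the same.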
Abstract:
Mobile phone applications that collect personal data are increasingly present in the routine of the ordinary citizen. Associated with these applications are controversies over security risks and invasion of privacy, which can become obstacles to user acceptance of these systems. On the other hand, there is debate over the Privacy Paradox, in which consumers voluntarily reveal more personal information despite declaring that they recognize the risks. There is little consensus in academic research on the reasons for this paradox, or even on whether the phenomenon actually exists. The objective of this research is to analyze how the collection of sensitive information influences the choice of mobile applications. The methodology is the study of applications available in mobile app stores using qualitative and quantitative techniques. The results indicate that the most popular products in the store are those that collect the most personal data. However, a closer analysis shows that the most sought-after applications also belong to companies with good reputations and offer more functionalities, which require greater access to the phone's private data. In the survey conducted afterwards, consumers were found to reduce their use of applications when they consider that the product collects data excessively, although the strategy adopted to protect this information varies. Among users of applications that collect data excessively, the primary motive for sharing personal information is functionality. Furthermore, this research confirms that comparing the data requested by applications with the consumer's initial expectation is a complementary construct for assessing privacy concerns, rather than simply analyzing the amount of information collected. The research process also illustrated that, depending on the analysis method used, it is possible to reach opposite conclusions about whether or not the paradox occurs. This may offer clues about the reasons for the lack of consensus on the subject.
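The quantitative side of this kind of study (popularity versus amount of data collected) can be illustrated with a simple correlation computation. The sketch below is purely illustrative: the sample data and the `pearson` helper are assumptions, not the study's actual code or dataset.

```java
// Illustrative sketch (hypothetical data): correlating the number of
// permissions an app requests with its popularity, in the spirit of the
// study's quantitative analysis. Names and values are assumptions.
public class PrivacyParadox {
    // Pearson correlation coefficient between two equal-length samples.
    static double pearson(double[] x, double[] y) {
        int n = x.length;
        double sx = 0, sy = 0, sxx = 0, syy = 0, sxy = 0;
        for (int i = 0; i < n; i++) {
            sx += x[i]; sy += y[i];
            sxx += x[i] * x[i]; syy += y[i] * y[i];
            sxy += x[i] * y[i];
        }
        double cov = sxy - sx * sy / n;   // scaled covariance
        double vx = sxx - sx * sx / n;    // scaled variance of x
        double vy = syy - sy * sy / n;    // scaled variance of y
        return cov / Math.sqrt(vx * vy);
    }

    public static void main(String[] args) {
        // Hypothetical sample: permissions requested vs. downloads (millions).
        double[] permissions = {3, 5, 8, 12, 15};
        double[] downloads   = {0.5, 1.2, 4.0, 9.5, 20.0};
        System.out.printf("r = %.3f%n", pearson(permissions, downloads));
    }
}
```

A strongly positive coefficient on such data would mirror the abstract's finding that the most popular products are those that collect the most personal data; the complementary qualitative analysis is what disentangles this from reputation and functionality.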
Abstract:
In the electric power sector, the area dedicated to studying the addition of new generating plants to the system is called generation expansion planning. In this area, decisions on the siting and installation of new plants must be extensively analyzed in order to obtain the various scenarios produced by the generated alternatives. Owing to a number of factors, the Brazilian power generation system, predominantly hydroelectric, tends to be gradually altered by the addition of thermoelectric plants (UTEs). The UTE siting problem involves a large number of variables, and it must be possible to analyze the importance and contribution of each one. The general objective of this work is the development of a thermoelectric plant siting model, here named SIGTE (Geographic Information System for Thermoelectric Generation), which integrates the functionalities of GIS tools (Geographic Information Systems) with multicriteria decision methods. Starting from a global view of the study area, the spatial components of the problem (location of municipalities, types of transport, transmission lines of different voltages, environmental preservation areas, etc.) can be represented more realistically, and environmental criteria can be included in the analysis. In addition, SIGTE allows the insertion of new decision variables without compromising the approach. The developed model was applied to the State of São Paulo, while making clear the feasibility of using the model for another system or region, with the appropriate updating of the corresponding databases. This model is designed to assist entrepreneurs interested in building a plant, as well as government agencies responsible for evaluating and granting (or denying) plant installation and operation licenses.
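The integration of GIS layers with multicriteria decision methods typically reduces, at its core, to scoring candidate sites against weighted criteria. The sketch below shows a weighted-sum ranking under hypothetical criteria and weights; the abstract does not specify SIGTE's actual decision method, so this is an illustration of the general technique, not the model itself.

```java
// Minimal sketch of a weighted-sum multicriteria ranking of candidate
// plant sites. Criteria, weights and scores are hypothetical; SIGTE's
// actual decision method is not specified in the abstract.
import java.util.*;

public class SiteRanking {
    // Each site holds normalized criterion scores in [0, 1]
    // (e.g., transport access, proximity to transmission lines,
    // environmental impact inverted so that higher is better).
    static double score(double[] criteria, double[] weights) {
        double s = 0;
        for (int i = 0; i < criteria.length; i++) s += criteria[i] * weights[i];
        return s;
    }

    public static void main(String[] args) {
        double[] weights = {0.4, 0.35, 0.25};  // must sum to 1
        Map<String, double[]> sites = new LinkedHashMap<>();
        sites.put("Site A", new double[]{0.9, 0.6, 0.7});
        sites.put("Site B", new double[]{0.5, 0.9, 0.8});
        // Print sites from best to worst aggregate score.
        sites.entrySet().stream()
             .sorted((a, b) -> Double.compare(score(b.getValue(), weights),
                                              score(a.getValue(), weights)))
             .forEach(e -> System.out.printf("%s: %.3f%n",
                     e.getKey(), score(e.getValue(), weights)));
    }
}
```

In a GIS-backed model, each criterion score would come from a spatial layer (distances, zoning, voltages), which is what allows new decision variables to be added without changing the ranking machinery.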
Abstract:
This study describes the electrochemical characterization of N-doped carbon xerogels in the form of microspheres and of carbon aerogels with varied porosities and surface oxygen complexes. The interfacial capacitance of the N-doped carbon xerogels decreased with increasing micropore surface area, as determined by N2 adsorption at −196 °C. The interfacial capacitance correlated well with the areal nitrogen concentration determined by XPS (NXPS), and best with the areal concentration of pyrrolic or pyridonic nitrogen functionalities. The gravimetric capacitance decreased with greater xerogel microsphere diameter. The interfacial capacitance of the carbon aerogels increased with a higher percentage of porosity, as determined from particle and true densities, and showed a linear relationship with the areal oxygen concentration and with the areal concentrations of CO- and CO2-evolving groups.
Ammonia removal using activated carbons: effect of the surface chemistry in dry and moist conditions
Abstract:
The effect of surface chemistry (nature and amount of oxygen groups) on the removal of ammonia was studied using a modified resin-based activated carbon. NH3 breakthrough column experiments show that modifying the original activated carbon with nitric acid, that is, incorporating oxygen surface groups, greatly improves the adsorption behavior at room temperature. There appears to be a linear relationship between the total adsorption capacity and the amount of the more acidic and less stable oxygen surface groups. Similar experiments using moist air clearly show that the effect of humidity depends strongly on the surface chemistry of the carbon used. Moisture greatly improves the adsorption behavior of samples with a low concentration of oxygen functionalities, probably owing to the preferential adsorption of ammonia via dissolution into water. By contrast, moisture has only a small effect on samples with a rich surface chemistry, owing to the preferential adsorption pathway via Brønsted and Lewis acid centers on the carbon surface. FTIR analyses of the exhausted oxidized samples confirm both the formation of NH4+ species interacting with the Brønsted acid sites and the presence of NH3 species coordinated, through the lone electron pair, to Lewis acid sites on the graphene layers.
Abstract:
One option for optimizing carbon materials for supercapacitor applications is the generation of surface functional groups that contribute to the pseudocapacitance without losing the designed physical properties. This requires suitable functionalization techniques able to selectively introduce a given amount of electroactive oxygen groups. In this work, the influence of chemical and electrochemical oxidation methods on the chemical and physical properties of a zeolite templated carbon (ZTC), as a model carbon material, has been studied and compared. Although both oxidation methods generally produce a loss of the original ZTC physical properties as the degree of oxidation increases, the electrochemical method shows much better controllability and, unlike the chemical treatments, enables the generation of a large number of oxygen groups (O = 11000-3300 μmol/g), with a higher proportion of active functionalities, while retaining a high surface area (between 1900 and 3500 m2/g), a high microporosity and an ordered 3-D structure.
Abstract:
The monoliths studied in this work show large specific surface areas (up to 1600 m2 g-1), high densities (up to 1.17 g cm-3) and high electrical conductivities (up to 9.5 S cm-1). They are microporous carbons with pore sizes up to 1.3 nm, most of them below 0.75 nm, and they also contain oxygen functionalities. The electrochemical behavior of the monoliths is studied in three-electrode cells with aqueous H2SO4 solution as the electrolyte. This work deals with the contribution of sulfate ions and protons to the specific capacitance of carbon monoliths having different surface areas and different contents of oxygen groups. Protons contribute a pseudocapacitance (up to 152 F g-1) in addition to the double layer capacitance, whereas sulfate ions contribute a double layer capacitance only. At the double layer, the capacitance of the sulfate ions (up to 291 F g-1) is slightly higher than that of the protons (up to 251 F g-1); both capacitances increase as the surface area increases. The preference of protons to be electroadsorbed at the double layer and the broader voltage window of these ions account for their higher contribution (70%) to the double layer capacitance.
Abstract:
This article presents an interactive Java software platform that enables any user to easily create advanced virtual laboratories (VLs) for Robotics. This novel tool provides both support for developing applications with a full 3D interactive graphical interface and a complete functional framework for the modelling and simulation of arbitrary serial-link manipulators. In addition, its software architecture includes a large number of functionalities as high-level tools, allowing any user to develop complex interactive robotic simulations with a minimum of programming. To show the features of the platform, the article describes, step by step, the implementation methodology of a complete VL for Robotics education using the presented approach. Finally, some educational results from the experience of implementing this approach are reported.
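The kernel of any serial-link manipulator simulation of the kind the platform supports is forward kinematics: mapping joint angles to the end-effector pose. The sketch below illustrates this for a planar 2-link arm; it is a minimal, self-contained example of the general technique, and does not reflect the platform's actual API, whose class names the abstract does not give.

```java
// Minimal illustration of serial-link manipulator modelling:
// forward kinematics of a planar 2-link arm. This is a generic
// sketch, not the platform's actual API.
public class TwoLinkArm {
    // End-effector position for link lengths l1, l2 and
    // joint angles t1, t2 in radians (t2 relative to link 1).
    static double[] forward(double l1, double l2, double t1, double t2) {
        double x = l1 * Math.cos(t1) + l2 * Math.cos(t1 + t2);
        double y = l1 * Math.sin(t1) + l2 * Math.sin(t1 + t2);
        return new double[]{x, y};
    }

    public static void main(String[] args) {
        // Elbow bent 90 degrees back from a vertical first link.
        double[] p = forward(1.0, 1.0, Math.PI / 2, -Math.PI / 2);
        System.out.printf("x = %.3f, y = %.3f%n", p[0], p[1]);
    }
}
```

For arbitrary serial-link manipulators, the same idea generalizes by chaining one homogeneous transform per joint (e.g., from Denavit-Hartenberg parameters), which is the standard way such simulation frameworks model a kinematic chain.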