34 results for "Solução arquitetural"
Resumo:
The literary critic Terry Eagleton gained notoriety in academic circles when he achieved intellectual recognition for his bestselling book Literary Theory: An Introduction. In this book, the English author boldly proposes the end of literature and literary criticism. Years earlier, however, in Criticism and Ideology (1976), Eagleton had proposed a scientific system for the analysis of literary texts that seemed less radical, both in theory and in method, than his later theoretical proposal. Based on this, the objective of this dissertation is to present the English literary critic's initial method, explaining the reasons that led him to abandon his initial project - of developing a method of analysis of the literary text from a Marxist scientific perspective - and to propose, in the following years, in his most famous book and others, a revolutionary vision that would go beyond textual analysis and give literary texts a practical intervention in society. Finally, we explain what his idea of revolutionary criticism would be.
Resumo:
In this work the organosilanes aminopropyltriethoxysilane, 3-mercaptopropyltriethoxysilane and N-[3-(trimethoxysilyl)propyl]ethylenetriamine, as well as tetraethyl orthosilicate (TEOS), were employed to produce organofunctionalized silica samples by the sol-gel method. The prepared samples were characterized by elemental analysis, thermogravimetry and infrared spectroscopy. These samples were employed to adsorb Cd2+, Pb2+, Ni2+ and Zn2+ from aqueous solutions (10, 20, 40, 60 and 80 mg L-1). In typical experiments, 50 mg of the organomatrix was suspended in 20 mL of metal cation solution at four different contact times: 30, 60, 90 and 120 minutes. The total amount of adsorbed cations was measured by atomic absorption spectrometry. For all investigated matrices, the following adsorption capacity order was observed: Ni2+ > Zn2+ > Cd2+ > Pb2+. This sequence is closely related to the cation radius, as well as the cation hardness.
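Batch adsorption results such as these are typically quantified as an adsorption capacity q = (C0 - Ce)·V/m. A minimal sketch of that calculation, with illustrative numbers (the equilibrium concentration below is hypothetical, not the dissertation's measured data):

```python
# Sketch: adsorption capacity (mg of cation per g of adsorbent) from
# initial (C0) and equilibrium (Ce) concentrations: q = (C0 - Ce) * V / m.
# The Ce value below is illustrative, not a measured result.

def adsorption_capacity(c0_mg_L, ce_mg_L, volume_L, mass_g):
    """q in mg/g for a batch adsorption experiment."""
    return (c0_mg_L - ce_mg_L) * volume_L / mass_g

# 50 mg of matrix in 20 mL of a 40 mg/L solution, hypothetical Ce = 12 mg/L:
q = adsorption_capacity(40.0, 12.0, 0.020, 0.050)
print(q)  # ~11.2 mg/g
```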
Resumo:
This study aimed to analyze the effect of a saline solution on the growth and chemical composition of Atriplex nummularia, a salt-absorbing shrub used both in animal diets and in the management of saline water and soils. Seedlings of this plant were planted and grown in a reserved area at the Federal University of Rio Grande do Norte. The plantation was divided into two blocks: one was irrigated with a saline solution containing 2840 mg L-1 of NaCl and the other with drinking water. After six months, the plants were collected, harvested and divided into three parts: leaf, thin stem and thick stem. Dimension measurements were carried out monthly to catalog the growth of Atriplex. Ion Chromatography (IC) and Inductively Coupled Plasma Optical Emission Spectroscopy (ICP-OES) were used to analyze the chemical composition of the partitioned plant parts. The results of these analyses revealed an absorption process of anions and cations by the Atriplex nummularia plant during its growth, in particular a higher concentration of sodium and chloride ions. Scanning electron microscopy images confirmed the presence of small crystals on the leaf surface. Electrical conductivity and pH measurements of the aerial parts of the plant showed that the leaf is the part with the largest concentration of ions. In addition, specific surface measurements showed that the plants irrigated with saline solution achieved a higher surface area in all cases. The monthly dimension measurements showed that the plants irrigated with water grew 5% more than those irrigated with saline solution. Based on the results obtained, the Atriplex plant showed a high potential to survive and adapt to environments (aquatic or geological) with high salinity levels, a property that can be used as a tool for removing salts/metals from contaminated industrial soils and effluents.
Resumo:
Combating soil pollution is a challenge that has concerned researchers from different areas and motivated the search for technologies aimed at the recovery of degraded soils. The literature describes numerous processes proposed for remediating soils contaminated by oil and other by-products of the oil industry; considering that the available processes generally have high operating costs, this work proposes a cost-effective alternative for the treatment of diesel-contaminated soils. The washing solutions were prepared using water as the aqueous phase, saponified coconut oil (OCS) as surfactant and n-butanol as co-surfactant. The soil was characterized by physical and chemical analyses. The study of diesel desorption from the soil was carried out in batch, using hexane and washing solutions containing 10 and 20 wt.% active matter (AM - co-surfactant/surfactant), respectively. The influence of active matter concentration and temperature in agitated batch was studied using an experimental design. A bed percolation system for washing the soil was also developed, and the influence of active matter concentration and washing solution volume was studied through an experimental design. The optimal times for hexane extraction were 30 and 180 min, while the best results were reached at 60 min with 10% AM and at 120 min with 20% AM. The batch experimental design showed that the maximum diesel removal, 99.92% of the oil, was obtained at 20 wt.% AM and 50 °C. For the experiments in the soil bed percolation system, the maximum diesel removal was reached when the washing solution volume was 5 L and the AM concentration was 20%. It was concluded that the AM concentration and the temperature were decisive in the batch experiments for diesel removal, while in the soil bed percolation system only the AM concentration influenced the soil remediation.
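The removal figures reported above follow the usual efficiency definition, removal = (initial - residual) / initial × 100. A minimal sketch with hypothetical masses (only the 99.92% figure comes from the study):

```python
# Sketch: percent removal of diesel from washed soil,
# removal = (initial - residual) / initial * 100.
# The masses below are illustrative, not the study's measurements.

def percent_removal(initial_mg, residual_mg):
    return 100.0 * (initial_mg - residual_mg) / initial_mg

# Hypothetical: 5000 mg of diesel initially, 4 mg remaining after washing.
print(round(percent_removal(5000.0, 4.0), 2))  # 99.92, the best batch result
```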
Resumo:
Model-oriented strategies have been used to facilitate product customization in the software product line (SPL) context and to generate the source code of the derived products through variability management. Most of these strategies use a UML (Unified Modeling Language)-based model specification. Despite its wide application, UML-based model specification has some limitations: it is essentially graphical, it has deficiencies in precisely describing the semantic representation of the system architecture, and it generates large models, hampering the visualization and comprehension of the system elements. In contrast, architecture description languages (ADLs) provide graphical and textual support for the structural representation of architectural elements, their constraints and their interactions. This thesis introduces ArchSPL-MDD, a model-driven strategy in which models are specified and configured using the LightPL-ACME ADL. The strategy is associated with a generic process with systematic activities that enable customized source code to be generated automatically from the product model. The ArchSPL-MDD strategy integrates aspect-oriented software development (AOSD), model-driven development (MDD) and SPL, enabling the explicit modeling as well as the modularization of variabilities and crosscutting concerns. The process is instantiated by the ArchSPL-MDD tool, which supports the specification of domain models (the focus of the development) in LightPL-ACME. ArchSPL-MDD uses the Ginga Digital TV middleware as a case study. In order to evaluate the efficiency, applicability, expressiveness and complexity of the ArchSPL-MDD strategy, a controlled experiment was carried out to compare the ArchSPL-MDD tool with the GingaForAll tool, which instantiates the process of the GingaForAll UML-based strategy. Both tools were used to configure the products of the Ginga SPL and to generate the product source code.
Resumo:
Over the years, the use of application frameworks designed for the View and Controller layers of the MVC architectural pattern adapted to web applications has become very popular. These frameworks are classified as action-oriented or component-oriented, according to the solution strategy adopted by the tool. The choice of strategy leads the system architecture design to acquire non-functional characteristics caused by the way the framework influences the developer to implement the system. Component reusability is one of those characteristics and plays a very important role in development activities such as system evolution and maintenance. This dissertation analyzes how reusability can be influenced by the use of Web frameworks. To accomplish this, small academic management applications were developed using the latest versions of the Apache Struts and JavaServer Faces frameworks, the main representatives of Web frameworks for the Java platform. For this assessment, a software quality model was used that associates internal attributes, which can be measured objectively, with the characteristics in question. The attributes and metrics defined for the model were based on related work discussed in the document.
Resumo:
The visualization of three-dimensional (3D) images is increasingly being used in the area of medicine, helping physicians diagnose diseases. The advances achieved in the scanners used for the acquisition of these 3D exams, such as computed tomography (CT) and magnetic resonance imaging (MRI), enable the generation of images with higher resolutions, thus generating much larger files. Currently, the visualization of these images is computationally expensive, demanding a high-end computer. Direct remote access to these images through the internet is also inefficient, since all images have to be transferred to the user's equipment before the 3D visualization process can start. With these problems in mind, this work proposes and analyzes a solution for the remote rendering of 3D medical images, called Remote Rendering (RR3D). In RR3D, the whole rendering process is performed on a server, or a cluster of servers, with high computational power, and only the resulting image is transferred to the client, while still allowing the client to perform operations such as rotation, zoom, etc. The solution was developed using web services written in Java and an architecture that uses the scientific visualization package ParaView, the ParaViewWeb framework and the PACS server DCM4CHEE. The solution was tested in two scenarios, in which the rendering process was performed by a server with graphics hardware (GPUs) and by a server without GPUs. In the scenario without GPUs, the solution was executed in parallel with different numbers of cores (processing units) dedicated to it. In order to compare our solution to other medical visualization applications, a third scenario was used in which the rendering process was done locally. In all three scenarios, the solution was tested at different network speeds. The solution satisfactorily solved the problem of the delay in the transfer of the DICOM files, while allowing the use of low-end computers, and even tablets and smartphones, as clients for visualizing the exams.
Resumo:
The Traveling Purchaser Problem (TPP) is a variant of the Traveling Salesman Problem in which there is a set of markets and a set of products. Each product is available in a subset of the markets, and its unit cost depends on the market where it is available. The objective is to buy all the products, departing from and returning to a domicile, at the least possible cost, defined as the sum of the weights of the edges in the tour plus the cost paid to acquire the products. A Transgenetic Algorithm, an evolutionary algorithm based on endosymbiosis, is applied to the capacitated and uncapacitated versions of this problem. Evolution in Transgenetic Algorithms is simulated through the interaction and information sharing between populations of individuals from distinct species. The computational results show that this is a very effective approach for the TPP regarding both solution quality and runtime. Seventeen and nine new best results are presented for instances of the capacitated and uncapacitated versions, respectively.
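The objective function described above (tour weight plus cheapest purchase of each product among the visited markets) can be sketched as follows; the data structures and instance are illustrative, not the dissertation's benchmark format:

```python
# Sketch of the uncapacitated TPP objective: total cost = sum of tour
# edge weights + cost of buying each product at the cheapest market
# visited. Names and the tiny instance below are illustrative.

def tpp_cost(tour, dist, prices):
    """tour: list of markets starting/ending at the domicile (index 0);
    dist: dict mapping (i, j) -> edge weight;
    prices: dict mapping product -> {market: unit cost}."""
    routing = sum(dist[(tour[i], tour[i + 1])] for i in range(len(tour) - 1))
    visited = set(tour)
    purchasing = 0
    for product, offers in prices.items():
        available = [cost for market, cost in offers.items() if market in visited]
        if not available:            # infeasible: some product cannot be bought
            return float("inf")
        purchasing += min(available)
    return routing + purchasing

# Example: domicile 0, two markets, two products.
dist = {(0, 1): 4, (1, 2): 3, (2, 0): 5}
prices = {"p1": {1: 10, 2: 7}, "p2": {1: 2}}
print(tpp_cost([0, 1, 2, 0], dist, prices))  # 12 (edges) + 7 + 2 = 21
```

An evolutionary method such as the Transgenetic Algorithm searches over tours (and, in the capacitated case, over purchase quantities) to minimize this cost.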
Resumo:
The soil heat flux and soil thermal diffusivity are important components of the surface energy balance, especially in arid and semi-arid regions. The objective of this work was to estimate the soil heat flux from the soil temperature measured at a single depth, based on the half-order time derivative method proposed by Wang and Bras (1999), and to establish a method capable of estimating the thermal diffusivity of the soil, based on the half-order derivative, from the time series of soil temperature at two depths. The estimates of soil heat flux were compared with values measured through flux plates, and the estimated thermal diffusivity was compared with measurements carried out in situ. The results showed excellent agreement between the estimated and measured soil heat flux, with correlation (r), coefficient of determination (R2) and standard error (W/m2) of: r = 0.99093, R2 = 0.98194 and error = 2.56 W/m2 for an estimation period of 10 days; r = 0.99069, R2 = 0.98147 and error = 2.59 W/m2 for 30 days; and r = 0.98974, R2 = 0.97958 and error = 2.77 W/m2 for 120 days. The values of thermal diffusivity estimated by the proposed method proved coherent and consistent with the in situ measured values and with values found in the literature using conventional methods.
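The half-order time derivative at the heart of the method can be approximated numerically with a Grünwald-Letnikov sum. The sketch below shows only that generic operator, verified against the closed form for f(t) = t; the soil-specific scaling factor and the exact Wang and Bras (1999) discretization are not reproduced here:

```python
# Minimal sketch of a half-order (Gruenwald-Letnikov) time derivative,
# the operator underlying the Wang and Bras (1999) flux method.
# The soil calibration factor and their exact scheme are NOT included.
import math

def half_derivative(samples, h):
    """Approximate D^(1/2) f at the last point of an evenly spaced
    series `samples` (spacing h), assuming f(0) = 0."""
    n = len(samples) - 1
    w, acc = 1.0, samples[n]          # w_0 = 1
    for k in range(1, n + 1):
        w *= (k - 1.5) / k            # w_k = w_{k-1} * (k - 1 - 1/2) / k
        acc += w * samples[n - k]
    return acc / math.sqrt(h)

# Sanity check against the closed form D^(1/2) t = 2*sqrt(t/pi):
h = 0.001
ts = [i * h for i in range(1001)]     # t in [0, 1]
approx = half_derivative(ts, h)
exact = 2 * math.sqrt(1 / math.pi)    # ~1.1284 at t = 1
```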
Resumo:
Ubiquitous computing is a paradigm in which devices with processing and communication capabilities are embedded in the ordinary elements of our lives (houses, cars, cameras, phones, schools, museums, etc.), providing services with a high degree of mobility and transparency. The development of ubiquitous systems is a complex task, since it involves several areas of computing, such as Software Engineering, Artificial Intelligence and Distributed Systems. This task becomes even more complex due to the absence of a reference architecture to guide the development of such systems. Reference architectures have been used to provide a common basis and to give guidelines for the construction of software architectures for different classes of systems. On the other hand, architecture description languages (ADLs) provide a syntax for the structural representation of architectural elements, their constraints and interactions, making it possible to express the architectural model of systems. Currently there are no ADLs in the literature based on reference architectures for the ubiquitous computing domain. In order to enable the architectural modeling of ubiquitous applications, the main objective of this work is to specify UbiACME, an architecture description language for ubiquitous applications, and to provide the UbiACME Studio tool, which will allow software architects to perform modeling using UbiACME. To this end, we initially carried out a systematic review in order to investigate, in the literature related to ubiquitous systems, the elements common to these systems that should be considered in the design of UbiACME. In addition, based on the systematic review, we defined a reference architecture for ubiquitous systems, RA-Ubi, which is the basis for the definition of the elements required for architectural modeling and, therefore, provides input for the definition of the UbiACME elements. Finally, in order to validate the language and the tool, we present a controlled experiment in which architects model a ubiquitous application using UbiACME Studio and compare it with the modeling of the same application in SysML.
Resumo:
The main objective of this thesis is to implement a supporting architecture for autonomic hardware systems, capable of managing the hardware running in reconfigurable devices. The proposed architecture implements manipulation, generation and communication functionalities, using the Context-Oriented Active Repository approach. The solution consists of a hardware/software architecture called the Autonomic Hardware Manager (AHM), which contains an Active Repository of Hardware Components. Using the repository, the architecture is able to manage the connected systems at run time, allowing the implementation of autonomic features such as self-management, self-optimization, self-description and self-configuration. The proposed architecture also contains a meta-model that allows the representation of the Operating Context of hardware systems. This meta-model is used as the basis for the context-sensing modules required in the Active Repository architecture. In order to demonstrate the functionalities of the proposed architecture and to support the thesis hypothesis and objectives, three experiments were planned and implemented: the Hardware Reconfigurable Filter, an application that implements digital filters using reconfigurable hardware; the Autonomic Image Segmentation Filter, which shows the design and implementation of an autonomic image processing application; and, finally, the Autonomic Autopilot application, an autopilot for unmanned aerial vehicles. In this work, the application architectures were organized in modules according to their functionalities. Some modules were implemented in HDL and synthesized in hardware, while others were kept in software. The applications were then integrated into the AHM to allow their adaptation to different Operating Contexts, making them autonomic.
Resumo:
Research on organoclays has been the subject of great interest due to their wide application in industry and in the removal of environmental pollutants. The organoclays were obtained using bentonite (BEN) and the cationic surfactants hexadecyltrimethylammonium bromide (HDTMA-Br) and trimethyloctadecylammonium bromide (TMOA-Br), in ratios of 50 and 100% of the clay's ion exchange capacity. The materials were characterized by X-ray diffraction (XRD), infrared spectroscopy (IR), X-ray fluorescence (XRF), thermal analysis (TA) and scanning electron microscopy (SEM). The bentonite and the organobentonites were used for the adsorption of the dyes Remazol Blue RR (AZ) and Remazol Red RR (VM) from aqueous solution. The Langmuir and Freundlich adsorption models were used for the mathematical description of the sorption equilibrium data and to obtain the isotherm constants. The Freundlich model fitted the equilibrium adsorption data of bentonite, whereas the Langmuir model fitted the adsorption tests of the organoclays. The adsorption processes on the HDTMA-Br-intercalated adsorbent showed endothermic and exothermic character for the two dyes, respectively.
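The two isotherm models named above have simple closed forms: Langmuir, q = q_max·K_L·Ce/(1 + K_L·Ce), and Freundlich, q = K_F·Ce^(1/n). A minimal sketch with illustrative parameter values (not the dissertation's fitted constants):

```python
# Sketch of the two isotherm models used in the study; the parameter
# values in the example are illustrative, not fitted constants.

def langmuir(ce, q_max, k_l):
    """q = q_max * K_L * Ce / (1 + K_L * Ce); saturates at q_max."""
    return q_max * k_l * ce / (1 + k_l * ce)

def freundlich(ce, k_f, n):
    """q = K_F * Ce^(1/n); no saturation plateau."""
    return k_f * ce ** (1.0 / n)

# The Langmuir model approaches q_max at high equilibrium concentration:
print(round(langmuir(1000.0, 50.0, 0.2), 1))  # 49.8, approaching q_max = 50
```

In practice the constants are obtained by fitting these expressions (or their linearized forms) to the measured q vs. Ce equilibrium data.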
Resumo:
The task of expression undertaken by the performer falls largely on the guitarist's right hand. Aware of this fact, past and present masters have left their contributions to the development of right-hand technique. It is clear that, with rare exceptions, educational and interpretative proposals have so far addressed the attack on the strings from the flexion of the fingers. This work, however, presents a technical resource called imalt, which includes the extension movement in the attack action. Some techniques used in specific circumstances, such as the dedillo, the alzapúa, the tremolo and the rasgueado, also use extension movements in the attack; they are put in perspective with the imalt, providing a panoramic view of their individual characteristics. The use of the imalt in the traditional guitar repertoire is exemplified in Villa-Lobos, Ponce and Brouwer. Three pieces were composed for this work: Shravana, Alegoria and Vandana. Compositional techniques such as melodic contour application and ostinato were reviewed and used in the preparation of these compositions. A detailed record of the compositional trajectory is presented, using the Model for Compositional Process Accompaniment according to Silva (2007). Some events that have brought the imalt into evidence are reported, such as the launch and distribution of the compact disc (CD) Imalt, the publication of scores, and interviews. Finally, concluding comments are presented, pointing out possibilities opened up by this work.
Resumo:
Given the growing demand for the development of mobile applications, driven by the increasingly common use of smartphones and tablets, the need has grown for full remote data access in mobile applications operating in environments without connectivity, where network access is not available at all times. Given this reality, this work proposes a framework whose main functions are the provision of persistence, replication and data synchronization mechanisms, contemplating the creation, deletion, update and display of persisted or requested data even when the mobile device has no network connectivity. From the point of view of architecture and programming practices, this is reflected in the definition of strategies through which the main functions of the framework are fulfilled. A controlled study was carried out to validate the proposed solution, and gains were found in the reduction of the number of lines of code and of the time required to develop an application, without any significant overhead for the operations.
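The persistence-and-synchronization behavior described above follows the offline-first pattern: operations are persisted in a local queue while the device is offline and replayed against the server on reconnection. A minimal sketch of that pattern; all names here are hypothetical, not the framework's actual API:

```python
# Illustrative sketch of the offline-first pattern the framework
# implements: writes go to a local replica and a pending queue, and
# the queue is replayed against the server when connectivity returns.
# Class and method names are hypothetical.

class OfflineStore:
    def __init__(self):
        self.local = {}       # local replica of the data
        self.pending = []     # operations awaiting synchronization

    def put(self, key, value):
        """Persist locally and queue the operation for later sync."""
        self.local[key] = value
        self.pending.append(("put", key, value))

    def sync(self, server):
        """Replay queued operations once connectivity returns."""
        for op, key, value in self.pending:
            if op == "put":
                server[key] = value
        self.pending.clear()

server = {}
store = OfflineStore()
store.put("task1", "draft report")      # offline: persisted and queued
store.sync(server)                      # reconnected: replicated to server
print(server)  # {'task1': 'draft report'}
```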