34 results for "Solução arquitetural" (architectural solution)

at Universidade Federal do Rio Grande do Norte (UFRN)


Relevance:

60.00%

Publisher:

Abstract:

It is increasingly common for a single computing system to be used through different devices - personal computers, mobile phones and others - and software platforms - graphical user interface systems, Web systems and others. Depending on the technologies involved, different software architectures may be employed. For example, Web systems usually adopt a client-server architecture, commonly extended into three layers, while systems with graphical interfaces often follow the MVC style. The use of architectures with different styles hinders the interoperability of systems that span multiple platforms. A further complication is that the user interface often has a different structure, appearance and behaviour on each device, which leads to low usability. Finally, building a user interface specific to each of the devices involved, with distinct features and technologies, is work that must be done individually and does not scale. This study sought to address some of these problems by presenting a platform-independent reference architecture that allows the user interface to be built from an abstract specification described in a user interface specification language, MML. This solution is designed to offer greater interoperability between different platforms, greater consistency between the user interfaces, and greater flexibility and scalability for the incorporation of new devices
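As an illustration of the central idea - deriving concrete interfaces for different platforms from a single abstract description - the sketch below uses a simplified, hypothetical abstract widget model (not the actual MML notation) and two illustrative renderers.

```python
# Minimal sketch of rendering one abstract UI description on two platforms.
# The widget model and renderer names are hypothetical; MML itself is not shown.
from dataclasses import dataclass, field

@dataclass
class AbstractWidget:
    kind: str                      # e.g. "form", "text_input", "button"
    label: str
    children: list = field(default_factory=list)

def render_html(widget: AbstractWidget) -> str:
    """Render the abstract description as a Web (HTML) interface."""
    if widget.kind == "form":
        body = "".join(render_html(c) for c in widget.children)
        return f"<form><h1>{widget.label}</h1>{body}</form>"
    if widget.kind == "text_input":
        return f"<label>{widget.label}<input type='text'/></label>"
    if widget.kind == "button":
        return f"<button>{widget.label}</button>"
    raise ValueError(widget.kind)

def render_text_ui(widget: AbstractWidget, indent: int = 0) -> str:
    """Render the same description as a plain-text UI for a simpler device."""
    pad = "  " * indent
    lines = [f"{pad}[{widget.kind}] {widget.label}"]
    lines += [render_text_ui(c, indent + 1) for c in widget.children]
    return "\n".join(lines)

login = AbstractWidget("form", "Login", [
    AbstractWidget("text_input", "User"),
    AbstractWidget("text_input", "Password"),
    AbstractWidget("button", "Enter"),
])

print(render_html(login))
print(render_text_ui(login))
```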

Relevance:

30.00%

Publisher:

Abstract:

A self-adaptive software system is able to change its structure and/or behaviour at runtime in response to changes in its requirements, environment or components. One way to achieve self-adaptation is to use sequences of actions (known as adaptation plans), which are typically defined at design time. This is the approach adopted by Cosmos - a framework to support the configuration and management of resources in distributed environments. In order to deal with the variability inherent to self-adaptive systems, such as the appearance of new components that allow configurations not envisioned at development time, this dissertation aims to give Cosmos the capability of generating adaptation plans at runtime. To this end, it was necessary to re-engineer the Cosmos framework in order to allow its integration with a mechanism for the dynamic generation of adaptation plans, and our work focused on this reengineering. Among the changes made to Cosmos, we highlight the redefinition of the metamodel used to represent components and applications, which is now based on an architectural description language. These changes were propagated to the implementation of a new Cosmos prototype, which was then used to develop a case study application as a proof of concept. Another effort was to make Cosmos more attractive by integrating it with another platform - in the case of this dissertation, the OSGi platform, which is well known and accepted by industry
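As an illustration of what generating an adaptation plan at runtime can mean, the sketch below derives a plan as the difference between the current and a target component configuration; the component names and the action vocabulary are hypothetical and do not reproduce the Cosmos metamodel.

```python
# Hypothetical sketch: derive an adaptation plan at runtime as the set of
# actions that takes the current component configuration to a target one.
def generate_plan(current: dict, target: dict) -> list:
    """current/target map component names to their configuration parameters."""
    plan = []
    for name in current.keys() - target.keys():
        plan.append(("stop", name))
    for name in target.keys() - current.keys():
        plan.append(("start", name, target[name]))
    for name in current.keys() & target.keys():
        if current[name] != target[name]:
            plan.append(("reconfigure", name, target[name]))
    return plan

current = {"logger": {"level": "INFO"}, "cache": {"size": 128}}
target  = {"logger": {"level": "DEBUG"}, "monitor": {"period_s": 5}}
print(generate_plan(current, target))
# [('stop', 'cache'), ('start', 'monitor', {'period_s': 5}),
#  ('reconfigure', 'logger', {'level': 'DEBUG'})]
```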

Relevance:

20.00%

Publisher:

Abstract:

This study explores the architectural conception process using Architecturologie as an instrument of analysis, based on Enseigner la Conception Architecturale (2000), developed by Boudon - a French architect, lecturer and researcher. It begins with a selection of works by a local architect, João Maurício Fernandes de Miranda, which resulted in six non-residential projects developed between 1961 and 1981. Architectural readings were then carried out, with emphasis on the identification of architecturological scales, their functions, relationships and modes of occurrence

Relevance:

20.00%

Publisher:

Abstract:

Continuous synthesis by solution combustion was employed in this work to obtain nanostructured tin dioxide. Basically, a precursor solution is prepared and then atomized and sprayed into a flame, where its combustion occurs, leading to the formation of particles. This is a recent technique with enormous potential for oxide deposition, mainly because of the low cost of the equipment and precursors employed. Nanostructured tin dioxide (SnO2) has been widely used in various applications, especially as gas sensors and varistors. In the case of sensors based on semiconducting ceramics, where surface reactions are responsible for the detection of gases, the importance of surface area and particle size is even greater. The preference for a nanostructured material is based on its significant increase in surface area compared with conventional microcrystalline powders and on its small particle size, which may benefit certain properties such as high electrical conductivity and high thermal, mechanical and chemical stability. In this work, the precursor solution was tin chloride dihydrate diluted in anhydrous ethyl alcohol. A chloride/solvent molar ratio of 0.75 was used in order to investigate its influence on the microstructure of the produced powder. The precursor solution flux was 3 mL/min. X-ray diffraction analysis indicated that a precursor solution with a chloride/solvent molar ratio of 0.75 leads to a single-phase crystalline powder in which all peaks are attributed to the SnO2 phase. Parameters such as the distance from the atomizer to the flame, the distance from the capture system to the pilot, the molar ratio and the solution flux do not affect the presence of tin dioxide in the produced powder. The obtained powder was characterized by thermogravimetric analysis (TGA), differential thermal analysis (DTA), laser diffraction particle size analysis (GDL), crystallographic analysis by X-ray diffraction (XRD), morphology by scanning electron microscopy (SEM) and transmission electron microscopy (TEM), specific surface area (BET) and electrical conductivity analysis. These techniques revealed that the SnO2 exhibits the behaviour of a semiconductor material and is a potentially promising material for application in varistors and gas sensor systems

Relevance:

20.00%

Publisher:

Abstract:

The increasing consumption of herbal medicines by parts of the population has generated a greater need for studies. Substitutions of drug material and changes and adulterations in production techniques are commonplace in the trade of plant-derived drugs, leading governmental drug-control agencies around the world to adopt various analytical practices for medicinal plants. However, agronomic and technological issues cause variation in the characteristics and chemical composition of the drug, a problem to be addressed by researchers in the field. The present work aims to obtain a spray-dried extract from an extractive solution of Psidium guajava L. leaves, based on the literature that stresses the advantages of intermediate dosage forms. It also seeks to validate useful methodologies for the quality control of both the raw material and its derivatives. Using eight batches of the spray-dried extract (with Eudragit®, Aerosil® and Avicel PH101® as drying adjuvants), the study proposes analytical methods based on techniques commonly applied to herbal medicines and their intermediate forms. As a result, a viable spray-dried extract was obtained from a standardized extractive solution. Among the adjuvants studied, the combination of Aerosil® with Eudragit® showed the drying yield, rheology, moisture and tannin content values that best met the requirements of the Brazilian Pharmacopoeia

Relevance:

20.00%

Publisher:

Abstract:

In the Brazilian legal context, conflict resolution is studied and analyzed predominantly from a jurisdictional point of view, which is one of the reasons for a litigation culture that places all hope of resolution in the courts. The practical impact of this reality is a loss of quality in the public service of the judicial function, marked, as a rule, by overcrowded courts, slow legal procedures and the relegation of peaceful resolution methods to a peripheral role. However, the Federal Constitution of 1988, following the phenomenon of constitutionalization of ordinary law, provides specific guidance about the values that should orient the resolution of disputes. The study therefore aims to examine the constitutionalization of conflict resolution in order to identify, through scientific-spiritual interpretation in conjunction with the systematic paradigm, what these values are, as well as how these measures operate and are expressed in legal practice. In this sense, the thesis takes as its starting point the analysis of theories of conflict and of explanations for the litigation culture, combined with concepts of legal creation and interpretation, constitutionalization, access to justice and public policies of social pacification. To this end, the logical-deductive method is used, with the aid of the dialectic immanent in Law

Relevance:

20.00%

Publisher:

Abstract:

Combinatorial optimization problems have drawn a large number of researchers into the search for approximate solutions, since it is generally accepted that they cannot be solved exactly in polynomial time. Initially, these solutions were based on heuristics; currently, metaheuristics are used more for this task, especially those based on evolutionary algorithms. The two main contributions of this work are: the creation of what is called an "Operon" heuristic for the construction of the information chains necessary for the implementation of transgenetic (evolutionary) algorithms, relying mainly on statistical methodology - Cluster Analysis and Principal Component Analysis; and the use of statistical analyses that are adequate for evaluating the performance of the algorithms developed to solve these problems. The aim of the Operon is to construct good-quality dynamic information chains to promote an "intelligent" search in the space of solutions. The Traveling Salesman Problem (TSP) is the intended application, addressed with a transgenetic algorithm known as ProtoG. A strategy is also proposed for the renewal of part of the chromosome population, triggered by adopting a minimum limit on the coefficient of variation of the fitness function of the individuals, computed over the population. Statistical methodology is used to evaluate the performance of four algorithms: the proposed ProtoG, two memetic algorithms and a Simulated Annealing algorithm. Three performance analyses of these algorithms are proposed. The first is carried out through Logistic Regression, based on the probability of the algorithm under test finding an optimal solution for a TSP instance. The second is carried out through Survival Analysis, based on the distribution of the time observed until an optimal solution is reached. The third is carried out by means of a non-parametric Analysis of Variance, considering the Percent Error of the Solution (PES), i.e. the percentage by which the solution found exceeds the best solution available in the literature. Six experiments were conducted on sixty-one instances of the Euclidean TSP with sizes of up to 1,655 cities. The first two experiments deal with the adjustment of four parameters used in the ProtoG algorithm in an attempt to improve its performance. The last four were undertaken to evaluate the performance of ProtoG in comparison with the three algorithms adopted. For these sixty-one instances, statistical tests provide evidence that ProtoG performs better than these three algorithms in fifty instances. In addition, for the thirty-six instances considered in the last three experiments, in which performance was evaluated through the PES, the average PES obtained with ProtoG was less than 1% in almost half of the instances, reaching its largest value, 3.52%, for an instance of 1,173 cities. Therefore, ProtoG can be considered a competitive algorithm for solving the TSP, since it is not rare to find average PES values greater than 10% reported in the literature for instances of this size.
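The PES used in the third analysis is a simple relative error with respect to the best known solution; a minimal sketch of its computation, with purely illustrative costs (not results from the dissertation), is shown below.

```python
def percent_error_of_solution(found_cost: float, best_known_cost: float) -> float:
    """Percent Error of the Solution (PES): percentage by which the tour found
    exceeds the best tour cost reported in the literature."""
    return 100.0 * (found_cost - best_known_cost) / best_known_cost

# Illustrative values only:
print(percent_error_of_solution(found_cost=21650.0, best_known_cost=21282.0))  # ~1.73 %
```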

Relevance:

20.00%

Publisher:

Abstract:

Deaf people face serious difficulties in accessing information. Support for sign languages is rarely addressed in Information and Communication Technologies (ICT). Furthermore, the scientific literature lacks work on machine translation for sign languages in real-time and open-domain scenarios, such as TV. To minimize these problems, in this work we propose a solution for the automatic generation of Brazilian Sign Language (LIBRAS) video tracks in captioned digital multimedia content. These tracks are generated by a real-time machine translation strategy, which performs the translation from a Brazilian Portuguese subtitle stream (e.g., a movie subtitle or a closed caption stream). Furthermore, the proposed solution is open-domain and has a set of mechanisms that exploit human computation to generate and maintain its linguistic constructions. Implementations of the proposed solution were developed for digital TV, Web and Digital Cinema platforms, and a set of experiments with deaf users was carried out to evaluate the main aspects of the solution. The results showed that the proposed solution is efficient, able to generate and embed LIBRAS tracks in real-time scenarios, and is a practical and feasible alternative to reduce the barriers faced by deaf people in accessing information, especially when human interpreters are not available
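The abstract does not detail the translation strategy; purely as an illustration of the overall flow - a Portuguese subtitle line coming in, a sequence of LIBRAS glosses going out, and a gloss dictionary maintained by human contributors - a simplified sketch follows. All names and dictionary entries are placeholders, and a real system would also involve grammatical restructuring and avatar/video rendering, none of which is shown.

```python
import re

# Dictionary from Portuguese tokens to LIBRAS glosses, maintained collaboratively
# (human computation) in the proposed solution; these entries are placeholders.
gloss_dictionary = {
    "filme": "FILME",
    "começa": "COMEÇAR",
    "agora": "AGORA",
}

def translate_subtitle_line(line: str) -> list[str]:
    """Map one Brazilian Portuguese subtitle line to a sequence of sign glosses."""
    tokens = re.findall(r"\w+", line.lower())
    glosses = []
    for token in tokens:
        # Unknown words could be queued for human contributors and fingerspelled meanwhile.
        glosses.append(gloss_dictionary.get(token, f"DACTYL:{token.upper()}"))
    return glosses

print(translate_subtitle_line("O filme começa agora"))
```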

Relevance:

20.00%

Publisher:

Abstract:

This dissertation proposes alternative models to allow the interconnection of the data communication networks of COSERN - Companhia Energética do Rio Grande do Norte. These networks comprise the corporate data network, based on the TCP/IP architecture, and the automation system linking remote electric energy distribution substations to the main Operation Centre, based on digital radio links and using the IEC 60870-5-101 protocol. The envisaged interconnection aims to provide automation data originating from the substations with a contingency route to the Operation Centre in moments of failure or maintenance of the digital radio links. Among the models presented, the one chosen for development consists of a computational prototype based on a standard personal computer, running the Linux operating system and an application, developed in the C language, which functions as a gateway between the protocols of the TCP/IP stack and the IEC 60870-5-101 suite. The analysis, implementation and functionality and performance tests of this model are described. During the test phase, the delay introduced by the TCP/IP network when transporting automation data was measured, in order to guarantee that it was consistent with the time constraints of the automation network. In addition, extra modules are suggested for the prototype in order to handle other issues, such as security and prioritization of the automation system data whenever they traverse the TCP/IP network. Finally, a study was carried out aiming to integrate the two networks in a more complete way, using the IP platform as a convergence solution for the communication subsystem of a unified network, as the most recent market tendencies for supervisory and other automation systems indicate
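The gateway itself was implemented in C, which the abstract only summarizes; the sketch below is a schematic outline (in Python, for brevity) of the tunnelling idea - reading IEC 60870-5-101 link frames from the radio/serial side and forwarding them, length-prefixed, over a TCP connection towards the Operation Centre. The function read_101_frame(), the host name and the port are placeholders, and frame parsing, error handling and the reverse direction are omitted.

```python
import socket
import struct

def read_101_frame() -> bytes:
    """Placeholder: would read one IEC 60870-5-101 frame from the serial link."""
    raise NotImplementedError

def run_gateway(center_host: str, center_port: int) -> None:
    with socket.create_connection((center_host, center_port)) as tcp:
        while True:
            frame = read_101_frame()
            # Prefix each frame with its length so the receiver can delimit frames
            # inside the TCP byte stream before handing them to the 101 stack.
            tcp.sendall(struct.pack("!H", len(frame)) + frame)

# run_gateway("operation-center.example", 2404)  # host and port are illustrative
```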

Relevance:

20.00%

Publisher:

Abstract:

Hospital automation is an area that is constantly growing, and the emergence of new technologies and hardware is making its processes more efficient. Nevertheless, some hospital processes are still performed manually, such as the monitoring of patients, which is considered critical because it involves human lives. One of the factors that should be taken into account during monitoring is the agility in detecting any abnormality in the vital signs of patients, as well as in warning the medical team involved about this anomaly. This master's thesis therefore aims to develop an architecture to automate this process of monitoring and of reporting possible alerts to a professional, so that emergency care can be provided effectively. Mobile computing was used to improve communication by distributing messages between a central server located in the hospital and the mobile devices carried by the staff on duty
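The abstract does not specify how abnormalities are detected or how alerts are formatted; a minimal sketch of the general idea - checking vital signs against reference ranges and dispatching an alert message towards the on-duty professional's mobile device - is shown below, with all thresholds, names and the delivery function assumed for illustration.

```python
# Illustrative sketch only: thresholds, message format and send_to_mobile()
# are assumptions, not the thesis design.
NORMAL_RANGES = {
    "heart_rate_bpm": (50, 120),
    "spo2_percent": (92, 100),
    "temperature_c": (35.0, 38.5),
}

def check_vitals(patient_id: str, vitals: dict) -> list[str]:
    """Return one alert string for each vital sign outside its reference range."""
    alerts = []
    for name, value in vitals.items():
        low, high = NORMAL_RANGES[name]
        if not (low <= value <= high):
            alerts.append(f"ALERT patient={patient_id} {name}={value} outside [{low}, {high}]")
    return alerts

def send_to_mobile(message: str) -> None:
    """Placeholder for delivery via the hospital central to the duty's mobile device."""
    print("dispatching:", message)

for alert in check_vitals("bed-12", {"heart_rate_bpm": 134, "spo2_percent": 96, "temperature_c": 37.1}):
    send_to_mobile(alert)
```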

Relevance:

20.00%

Publisher:

Abstract:

In this work, a new online algorithm for solving the k-Server Problem (KSP) is proposed. Its performance is compared with that of other algorithms in the literature, namely the Harmonic and Work Function algorithms, which have been shown to be competitive and are therefore significant benchmarks. An algorithm that performs efficiently relative to them tends to be competitive as well, although this would obviously have to be proved; such a proof, however, is beyond the scope of the present work. The algorithm presented for solving the KSP is based on reinforcement learning techniques. To this end, the problem was modelled as a multi-stage decision process, to which the Q-Learning algorithm is applied - one of the most popular methods for establishing optimal policies in this type of decision problem. It should be noted, however, that the size of the storage structure used by reinforcement learning to obtain the optimal policy grows as a function of the number of states and actions, which in turn is proportional to the number n of nodes and k of servers. When this growth is analyzed (mathematically, ), it can be seen that it is exponential, limiting the application of the method to smaller problems, in which the numbers of nodes and servers are small. This problem, known as the curse of dimensionality, was introduced by Bellman and implies the impossibility of executing an algorithm for certain instances of a problem due to the exhaustion of the computational resources required to obtain its output. To prevent the proposed solution, based exclusively on reinforcement learning, from being restricted to small applications, an alternative solution is proposed for more realistic problems involving larger numbers of nodes and servers. This alternative solution is hierarchical and uses two methods for solving the KSP: reinforcement learning, applied to a reduced number of nodes obtained through an aggregation process, and a greedy method, applied to the subsets of nodes resulting from the aggregation, in which the criterion for scheduling the servers is the smallest distance to the demand location
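To make the reinforcement-learning formulation concrete, the sketch below applies tabular Q-Learning to a toy k-server instance (n = 4 nodes, k = 2 servers) in which the state is the pair (server positions, demand node) and the action chooses which server to move; the distance matrix, learning parameters and random demand model are illustrative assumptions, not the dissertation's experimental setup.

```python
import random
from collections import defaultdict

dist = [  # symmetric distance matrix for n = 4 nodes (illustrative values)
    [0, 2, 5, 4],
    [2, 0, 3, 6],
    [5, 3, 0, 2],
    [4, 6, 2, 0],
]
n, k = 4, 2
alpha, gamma, epsilon, steps = 0.1, 0.9, 0.2, 20000

Q = defaultdict(float)  # Q[(servers, demand, action)] ~ expected discounted cost

def greedy_action(servers, demand):
    """Pick the server whose estimated discounted cost of serving the demand is lowest."""
    return min(range(k), key=lambda a: Q[(servers, demand, a)])

servers, demand = (0, 1), random.randrange(n)
for _ in range(steps):
    a = random.randrange(k) if random.random() < epsilon else greedy_action(servers, demand)
    cost = dist[servers[a]][demand]                               # distance moved by server a
    next_servers = tuple(sorted(demand if i == a else s for i, s in enumerate(servers)))
    next_demand = random.randrange(n)                             # demands arrive at random here
    best_next = min(Q[(next_servers, next_demand, b)] for b in range(k))
    key = (servers, demand, a)
    Q[key] += alpha * (cost + gamma * best_next - Q[key])         # minimise discounted cost
    servers, demand = next_servers, next_demand

print(greedy_action((0, 1), 3))  # server the learned policy would send to node 3
```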

Relevance:

20.00%

Publisher:

Abstract:

In practically all vertical markets and in every region of the planet, loyalty marketers have adopted the tactic of recognition and reward to identify, retain and increase the yield of their customers. Several strategies have been adopted by companies, and the most popular among them is the loyalty program, which uses a loyalty club to manage these rewards. The problem with loyalty programs, however, is that customer identification and the transfer of loyalty points are carried out in a semi-automatic way. With this in mind, this master's work presents an embedded business automation solution called e-Points. The goal of e-Points is to provide loyalty clubs with fully automated technology to identify customers directly at the point of sale, ensuring greater control over the loyalty of associated members. To this end, we developed a hardware platform with an embedded system and RFID technology to be used at the merchant's point of sale, a smart card to accumulate points with every purchase, and a web server that provides services of interest to retailers and to the club's member customers
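As an illustration only of the accrual step that e-Points automates - a customer identified by an RFID tag at the point of sale having points credited for a purchase - consider the sketch below; the tag-to-account mapping and the one-point-per-currency-unit rule are assumptions, not the e-Points design.

```python
# Accounts keyed by RFID tag UID; data below is illustrative.
accounts = {"04A2B9C1": {"name": "Maria", "points": 120}}

def credit_points(tag_uid: str, purchase_value: float, points_per_unit: float = 1.0) -> int:
    """Credit loyalty points for a purchase read at the embedded point-of-sale device."""
    account = accounts.get(tag_uid)
    if account is None:
        raise KeyError(f"unknown tag {tag_uid}: customer not enrolled in the club")
    earned = int(purchase_value * points_per_unit)
    account["points"] += earned
    return earned

print(credit_points("04A2B9C1", 57.90))   # points earned on this purchase
print(accounts["04A2B9C1"]["points"])     # updated balance
```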

Relevance:

20.00%

Publisher:

Abstract:

This work presents a new and little-explored approach to the simultaneous localization and mapping (SLAM) problem. The purpose is to make a mobile robot operate in an indoor environment: the robot should map the environment and localize itself on that map. The robot used in the tests has an upward-facing camera and encoders on its wheels. The landmarks of the built map are the light spots that the ceiling luminaires produce in the camera images. This work develops a solution to the SLAM problem based on the Extended Kalman Filter, using an observation model developed for this setting. The tests carried out and the software developed to perform the SLAM experiments are described in detail
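As a minimal illustration of the EKF predict/update cycle used in this kind of solution, the sketch below tracks a planar robot pose and a single ceiling-light landmark; the motion model, the observation model (landmark position expressed in the robot frame) and all noise values and measurements are simplifying assumptions rather than the dissertation's actual models.

```python
import numpy as np

# State: [x, y, theta, lx, ly] -> robot pose plus one landmark position.
mu = np.array([0.0, 0.0, 0.0, 2.0, 1.0])
Sigma = np.diag([0.01, 0.01, 0.01, 1.0, 1.0])
R = np.diag([0.02, 0.02, 0.005, 0.0, 0.0])   # motion noise (landmark is static)
Q = np.diag([0.05, 0.05])                    # measurement noise (image-derived)

def predict(mu, Sigma, d, dtheta):
    """Odometry-based prediction: move d forward, rotate by dtheta."""
    x, y, th = mu[0], mu[1], mu[2]
    mu = mu.copy()
    mu[0], mu[1], mu[2] = x + d * np.cos(th), y + d * np.sin(th), th + dtheta
    F = np.eye(5)
    F[0, 2], F[1, 2] = -d * np.sin(th), d * np.cos(th)
    return mu, F @ Sigma @ F.T + R

def update(mu, Sigma, z):
    """Correction from the landmark position measured in the robot frame."""
    x, y, th, lx, ly = mu
    dx, dy = lx - x, ly - y
    c, s = np.cos(th), np.sin(th)
    h = np.array([c * dx + s * dy, -s * dx + c * dy])          # expected measurement
    H = np.array([[-c, -s, -s * dx + c * dy,  c, s],
                  [ s, -c, -c * dx - s * dy, -s, c]])          # Jacobian of h
    S = H @ Sigma @ H.T + Q
    K = Sigma @ H.T @ np.linalg.inv(S)                         # Kalman gain
    mu = mu + K @ (z - h)
    return mu, (np.eye(5) - K @ H) @ Sigma

mu, Sigma = predict(mu, Sigma, d=0.5, dtheta=0.1)
mu, Sigma = update(mu, Sigma, z=np.array([1.45, 0.95]))        # illustrative measurement
print(mu)
```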

Relevance:

20.00%

Publisher:

Abstract:

The oil and petrochemical industry is responsible for generating a large amount of waste and wastewater. Among these effluents it is possible to find benzene, toluene, ethylbenzene and the xylene isomers, compounds known as BTEX. These compounds are highly volatile, toxic to the environment and potentially carcinogenic to humans. Advanced oxidation processes (AOP) are unconventional treatments that may be applied to the treatment and removal of these compounds. Fenton is a type of AOP that uses the Fenton reagent - hydrogen peroxide and a ferrous salt - to promote organic degradation, while the Photo-Fenton process uses the Fenton reagent plus UV (ultraviolet) radiation. According to the literature, these two types of AOP may be applied to complex BTEX systems. This project consists in evaluating the use of the Fenton and Photo-Fenton technologies on aqueous solutions containing 100 ppm of each BTEX compound, simulating conditions close to those of petrochemical effluents. Different reactors were used for each type of AOP. For the analytical determination of the amount removed, the SPME technique (solid-phase microextraction) was used for the extraction of these analytes in the gas phase, together with gas chromatography/mass spectrometry. The mechanical arrangement of the Photo-Fenton system showed large losses of these compounds by volatilization. The Fenton system was able to degrade the benzene and toluene compounds, with mass removal percentages close to 99%.

Relevance:

20.00%

Publisher:

Abstract:

Polymer particles in the nanometer range are of fundamental interest today, especially when used as carrier systems for the controlled release of drugs, cosmetics and nutraceuticals, as well as in coating materials with magnetic properties. The main objective of the present study is the production of submicron particles of poly(methyl methacrylate) (PMMA) by crystallization of a polymer solution through thermally controlled cooling. In this work, PMMA solutions in ethanol and 1-propanol were prepared at different concentrations (1% to 5% by weight) and crystallized at different, linearly controlled cooling rates (0.2 to 0.8 °C/min). Particle size distribution analysis (DLS/CILAS) and scanning electron microscopy (SEM) were performed in order to evaluate the morphological characteristics of the produced particles. The results demonstrated that it is possible to obtain perfectly spherical submicron polymer particles using the technique discussed in this study. It was also observed that, depending on the cooling rate and the concentration of the polymer solution, it is possible to achieve a high yield in the formation of submicron particles. In addition, preliminary tests were performed in order to verify the ability of this technique to form particulate carrier material with magnetic properties. The results showed that the developed technique can be an interesting alternative for obtaining polymer particles with magnetic properties