84 results for Application matching


Relevance:

20.00%

Publisher:

Abstract:

Java Card technology allows the development and execution of small applications embedded in smart cards. A Java Card application is composed of an external card client and of an on-card application that implements the services available to the client through an Application Programming Interface (API). These applications usually manipulate and store sensitive information, such as cash balances and confidential data of their owners. It is therefore necessary to develop smart card applications rigorously in order to improve their quality and trustworthiness. The use of formal methods in the development of these applications is a way to meet these quality requirements. The B method is one of many formal methods for system specification. Development in B starts with the functional specification of the system, continues with the application of optional refinements to the specification and, from the last refinement level, code can be generated for some programming language. The B formalism has good tool support, and its application to Java Card is a natural fit, since the specification and development of APIs is one of the major applications of B. The BSmart method proposed here aims to promote the rigorous development of Java Card applications up to code generation, based on the refinement of their formal specification written in the B notation. This development is supported by the BSmart tool, composed of programs that automate each stage of the method, and by a library of B modules and Java Card classes that model primitive types, essential Java Card API classes and reusable data structures.
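The core idea of a B machine (state constrained by an invariant, operations guarded by preconditions) can be illustrated with a toy sketch. The sketch below is plain Python, not B notation, and the purse machine and its names are hypothetical, not part of the BSmart method.

```python
# Toy illustration (Python, not B notation) of the shape of a B machine:
# state, an INVARIANT, and operations with PREconditions that preserve it.
# The PurseMachine example and its limits are made up for illustration.

class PurseMachine:
    """Models a smart card purse: the balance must stay within [0, MAX]."""
    MAX = 1000

    def __init__(self):
        self.balance = 0
        assert self.invariant()

    def invariant(self):
        # In B this would be the INVARIANT clause, proven for every operation.
        return 0 <= self.balance <= self.MAX

    def credit(self, amount):
        # In B: PRE amount >= 0 & balance + amount <= MAX
        assert amount >= 0 and self.balance + amount <= self.MAX
        self.balance += amount
        assert self.invariant()

    def debit(self, amount):
        # In B: PRE 0 <= amount & amount <= balance
        assert 0 <= amount <= self.balance
        self.balance -= amount
        assert self.invariant()

p = PurseMachine()
p.credit(300)
p.debit(120)
print(p.balance)  # 180
```

In the B method these preconditions and the invariant are discharged by proof obligations at specification time rather than checked at run time; the assertions above only mimic that discipline.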

Relevance:

20.00%

Publisher:

Abstract:

Over the years, application frameworks designed for the View and Controller layers of the MVC architectural pattern, adapted to web applications, have become very popular. These frameworks are classified as action-oriented or component-oriented, according to the solution strategy they adopt. The choice of strategy leads the system architecture to acquire non-functional characteristics caused by the way the framework influences the developer to implement the system. Component reusability is one of those characteristics and plays a very important role in development activities such as system evolution and maintenance. This dissertation analyzes how reusability can be influenced by the use of web frameworks. To accomplish this, small academic management applications were developed using the latest versions of the Apache Struts and JavaServer Faces frameworks, the main representatives of web frameworks for the Java platform. For this assessment, a software quality model was used that associates internal attributes, which can be measured objectively, with the characteristics in question. The attributes and metrics defined for the model were based on related work discussed in the document.

Relevance:

20.00%

Publisher:

Abstract:

This work presents a proposal for a multi-middleware environment for developing distributed applications that abstracts the different underlying middleware platforms. It describes: (i) the reference architecture designed for the environment; (ii) an implementation that aims to validate the specified architecture by integrating CORBA and EJB; (iii) a case study illustrating the use of the environment; and (iv) a performance analysis. The proposed environment provides interoperability between middleware platforms, allowing the reuse of components from different kinds of middleware in a way that is transparent to the developer and without major performance losses. As part of the implementation, an Eclipse plugin was also developed that allows developers to gain greater productivity when building distributed applications with the proposed environment.

Relevance:

20.00%

Publisher:

Abstract:

The increasing complexity of applications has demanded hardware that is ever more flexible and able to achieve higher performance. Traditional hardware solutions have not been successful in meeting these applications' constraints. General purpose processors have inherent flexibility, since they perform several tasks; however, they cannot reach high performance when compared to application-specific devices. Conversely, since application-specific devices perform only a few tasks, they achieve high performance but have less flexibility. Reconfigurable architectures emerged as an alternative to the traditional approaches and have become an area of rising interest over the last decades. The purpose of this new paradigm is to modify the device's behavior according to the application, making it possible to balance flexibility and performance and to meet the applications' constraints. This work presents the design and implementation of a coarse-grained hybrid reconfigurable architecture for stream-based applications. The architecture, named RoSA, consists of reconfigurable logic attached to a processor. Its goal is to exploit the instruction-level parallelism of data-flow intensive applications to accelerate their execution on the reconfigurable logic. Instruction-level parallelism is extracted at compile time, so this work also presents an optimization phase for the RoSA architecture to be included in the GCC compiler. To design the architecture, this work also presents a methodology based on hardware reuse of datapaths, named RoSE. RoSE organizes the reconfigurable units into reusability levels, which provides area savings and datapath simplification. The architecture was implemented in a hardware description language (VHDL) and validated through simulation and prototyping. For performance characterization, some benchmarks were used, demonstrating a speedup of 11x on the execution of some applications.
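The instruction-level parallelism mentioned above can be illustrated with a small scheduling sketch: in a data-flow expression such as (a*b) + (c*d), the two multiplications have no data dependency and can run in the same cycle on different units. The greedy list scheduler below is a generic illustration of that idea, not RoSA's or GCC's actual extraction pass.

```python
# Illustrative sketch of instruction-level parallelism extraction from a
# data-flow expression: independent operations land in the same cycle.
# The op list and the greedy scheduler are hypothetical, for illustration.

ops = [
    ("t1", "mul", ("a", "b")),   # t1 = a * b
    ("t2", "mul", ("c", "d")),   # t2 = c * d  (independent of t1)
    ("t3", "add", ("t1", "t2")), # t3 = t1 + t2 (depends on both)
]

def schedule(ops):
    """Greedy list scheduling: an op runs as soon as its inputs are ready."""
    ready_at, cycles = {}, []
    for name, _, (x, y) in ops:
        cycle = max(ready_at.get(x, 0), ready_at.get(y, 0))
        while len(cycles) <= cycle:
            cycles.append([])
        cycles[cycle].append(name)
        ready_at[name] = cycle + 1   # result available in the next cycle
    return cycles

print(schedule(ops))  # [['t1', 't2'], ['t3']]
```

The two multiplications share cycle 0 while the dependent addition waits for cycle 1, which is the kind of parallelism a reconfigurable fabric can exploit spatially.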

Relevance:

20.00%

Publisher:

Abstract:

Ubiquitous computing systems operate in environments where the available resources change significantly during system operation, thus requiring adaptive and context-aware mechanisms to sense changes in the environment and adapt to new execution contexts. Motivated by this requirement, a framework for developing and executing adaptive context-aware applications is proposed. The PACCA framework employs aspect-oriented techniques to modularize the adaptive behavior and keep it apart from the application logic. PACCA uses the abstract aspect concept to provide flexibility: new adaptive concerns are added by extending the abstract aspect. Furthermore, PACCA has a default aspect model that covers the adaptive concerns commonly found in ubiquitous applications. It exploits the synergy between aspect orientation and dynamic composition to achieve context-aware adaptation, guided by predefined policies, and aims to allow on-demand loading of software modules, making better use of mobile devices and their limited resources. A development process for conceiving ubiquitous applications is also proposed, with a set of activities that guide the adaptive context-aware developer. Finally, a quantitative, metrics-based study evaluates the aspect and dynamic composition approach for the construction of ubiquitous applications.
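The separation between application logic and adaptive behavior can be sketched with a decorator that plays the role of an aspect. This is a minimal, hypothetical illustration of the idea, not PACCA's actual API or aspect model.

```python
# Minimal sketch (hypothetical, not PACCA's API) of modularizing adaptive
# behavior apart from the application logic: a decorator acts as an aspect
# that senses the execution context and swaps in an adapted implementation.

def context_aware(low_resource_variant):
    """Aspect-like decorator: pick an implementation based on context."""
    def wrap(fn):
        def adapted(context, *args):
            if context.get("battery", 100) < 20:    # sensed context change
                return low_resource_variant(*args)  # adapted behavior
            return fn(*args)                        # normal behavior
        return adapted
    return wrap

def render_text_only(video):
    return f"text summary of {video}"

@context_aware(render_text_only)          # adaptation policy woven in here;
def render_video(video):                  # the logic below stays oblivious
    return f"full playback of {video}"

print(render_video({"battery": 80}, "lecture.mp4"))  # full playback of lecture.mp4
print(render_video({"battery": 10}, "lecture.mp4"))  # text summary of lecture.mp4
```

The application logic (`render_video`) never mentions batteries or contexts; the adaptive concern lives entirely in the decorator, which is the modularity the aspect-oriented approach is after.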

Relevance:

20.00%

Publisher:

Abstract:

Brazil is going through the transition from analog to digital television transmission. In addition to providing high quality audio and video, this new technology also allows applications to execute on the television. Equipment called a set-top box is needed to receive the new signal and create the environment necessary to execute these applications. At first, the only way to interact with these applications is through the remote control. However, the remote control has serious usability problems when used to interact with some types of applications. This research proposes a software implementation that creates an environment allowing a smartphone to interact with such applications. Besides this implementation, a comparative study is performed between using the remote control and using smartphones to interact with digital television applications, taking into account usability-related parameters. From the analysis of the data collected in the comparative study, it is possible to identify which device provides the more interesting interactive experience for users.

Relevance:

20.00%

Publisher:

Abstract:

The development of smart card applications requires a high level of reliability, and formal methods provide the means for this reliability to be achieved. The BSmart method and tool contribute to the development of smart card applications with the support of the B method, generating Java Card code from B specifications. For development with BSmart to be effectively rigorous without overloading the user, it is important to have a library of reusable components built in B. The goal of KitSmart is to provide this support. The first research on the composition of this library was an undergraduate project at Universidade Federal do Rio Grande do Norte, carried out by Thiago Dutra in 2006. That first version of the kit resulted in a B specification of the Java Card primitive types byte, short and boolean, and in the creation of reusable components for application development. The present work improves KitSmart by adding a B specification of the Java Card API and a guide for the creation of new components. The Java Card API in B, besides being available for application development, is also useful as documentation of each API class. The reusable components correspond to modules that manipulate specific structures, such as date and time, which are available in neither B nor Java Card. These components for Java Card are generated from formally verified B specifications. The guide contains quick references on how to specify some structures and on how some situations were adapted from object orientation to the B method. This work was evaluated through a case study carried out with the BSmart tool, which makes use of the KitSmart library. In this case study, it is possible to see the contribution of the components to a B specification. The kit should be useful for B method users and Java Card application developers.

Relevance:

20.00%

Publisher:

Abstract:

The increasing complexity of integrated circuits has boosted the development of communication architectures such as Networks-on-Chip (NoCs) as an architectural alternative for the interconnection of Systems-on-Chip (SoCs). Networks-on-Chip favor component reuse, parallelism and scalability, enhancing reusability in projects of dedicated applications. Many proposals have been made in the literature, suggesting different configurations for network-on-chip architectures. Among them, the IPNoSys architecture is a non-conventional one, since it executes operations while the communication process is performed. This study aims to evaluate the execution of data-flow based applications on IPNoSys, focusing on their adaptation to its design constraints. Data-flow based applications are characterized by a continuous stream of data on which operations are executed. We expect that applications of this type can be improved when running on IPNoSys, because their programming model is similar to the execution model of this network. By observing the behavior of these applications on IPNoSys, changes were made to the execution model of the IPNoSys network to allow the implementation of instruction-level parallelism. For these purposes, implementations of dataflow applications were analyzed and compared.

Relevance:

20.00%

Publisher:

Abstract:

Mainstream programming languages provide built-in exception handling mechanisms to support robust and maintainable implementation of exception handling in software systems. Most of these modern languages, such as C#, Ruby, Python and many others, are often claimed to have more appropriate exception handling mechanisms. They reduce programming constraints on exception handling to favor agile changes in the source code; we call these maintenance-driven exception handling mechanisms. The adoption of these mechanisms is expected to improve software maintainability without hindering software robustness. However, there is still little empirical knowledge about the impact that adopting them has on software robustness. This work addresses this gap with an empirical study aimed at understanding the relationship between changes in C# programs and their robustness. In particular, we evaluated how changes in the normal and exceptional code were related to exception handling faults. We applied change impact analysis and control flow analysis to 100 versions of 16 C# programs. The results showed that: (i) most of the problems hindering software robustness in those programs are caused by changes in the normal code; (ii) many potential faults were introduced even when improving the exception handling in C# code; and (iii) faults are often facilitated by the maintenance-driven flexibility of the exception handling mechanism. Moreover, we present a series of change scenarios that decrease program robustness.
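The kind of fault the study measures, a change in the normal code silently invalidating an existing handler, can be sketched in a few lines. The study itself analyzes C# programs; the example below uses Python and invented function names purely to illustrate the failure mode.

```python
# Illustration (in Python; the study concerns C#) of an exception handling
# fault introduced by a change in the NORMAL code: after the change, the
# raised exception type no longer matches the existing handler.

def parse_config_v1(text):
    # original normal code: raises ValueError on malformed input
    return int(text)

def parse_config_v2(text):
    # changed normal code: a lookup typo now raises KeyError instead
    return {"port": text}["prot"]

def load(parser, text):
    try:
        return parser(text)
    except ValueError:        # handler still targets the OLD failure mode
        return None

print(load(parse_config_v1, "oops"))  # None -> fault handled as intended
try:
    load(parse_config_v2, "8080")
except KeyError:
    print("escaped the handler")      # robustness decreased by the change
```

Because the language imposes no checked-exception constraint, nothing forces the maintainer of `load` to learn that the failure mode changed, which is exactly the maintenance-driven flexibility the study links to introduced faults.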

Relevance:

20.00%

Publisher:

Abstract:

In the Fazenda Belém oil field (Potiguar Basin, Ceará State, Brazil), sinkholes and sudden terrain collapses occur frequently, associated with an unconsolidated sedimentary cap covering the Jandaíra karst. This research was carried out in order to understand the mechanisms that generate these collapses. The main tool used was Ground Penetrating Radar (GPR). The work develops along two lines: one concerns methodological improvements in GPR data processing, while the other concerns the geological study of the Jandaíra karst. The second line was strongly supported both by the analysis of outcropping karst structures (in other regions of the Potiguar Basin) and by the interpretation of radargrams of the subsurface karst at Fazenda Belém. An adequate flow for processing GPR data was designed and tested, adapted from a usual flow for processing seismic data. The changes were introduced to take into account important differences between GPR and reflection seismic methods, in particular: poor coupling between source and ground, the mixed phase of the wavelet, low signal-to-noise ratio, monochannel acquisition, and the strong influence of wave propagation effects, notably dispersion. High frequency components of the GPR pulse suffer more pronounced attenuation than low frequency components, resulting in resolution losses in the radargrams. At Fazenda Belém, a suitable GPR processing flow is all the more necessary because of both the very high level of aerial events and the complexity of the imaged subsurface karst structures. The key point of the processing flow is an improved correction of the attenuation effects on the GPR pulse, based on their influence on the amplitude and phase spectra of GPR signals.
In low- and moderate-loss dielectric media, the propagated signal suffers significant changes only in its amplitude spectrum; that is, the phase spectrum of the propagated signal remains practically unaltered over the usual travel time ranges. Based on this fact, it is shown with real data that the judicious application of the well known tools of time gain and spectral balancing can efficiently correct the attenuation effects. The proposed approach can be applied to heterogeneous media and does not require precise knowledge of the attenuation parameters of the media. As an additional benefit, judicious spectral balancing promotes a partial deconvolution of the data without changing its phase; in other words, spectral balancing acts like a zero-phase deconvolution. In GPR data, the resolution increase obtained with spectral balancing is greater than that obtained with spike and predictive deconvolutions. The evolution of the Jandaíra karst in the Potiguar Basin is associated with at least three events of subaerial exposure of the carbonate platform during the Turonian, Santonian and Campanian. In the Fazenda Belém region, during the mid Miocene, the Jandaíra karst was covered by continental siliciclastic sediments. These sediments partially filled the void space associated with the dissolution structures and fractures. Therefore, the development of the karst in this region was attenuated in comparison with other places in the Potiguar Basin where the karst is exposed.
At Fazenda Belém, the generation of sinkholes and terrain collapses is controlled mainly by: (i) the presence of an unconsolidated sedimentary cap thick enough to cover the karst completely, but with a sediment volume lower than the available space associated with the dissolution structures in the karst; (ii) the existence of important SW-NE and NW-SE structural alignments, which promote a localized increase in hydraulic connectivity, channeling the underground water and thus facilitating carbonate dissolution; and (iii) the existence of a hydraulic barrier to the groundwater flow, associated with the Açu-4 Unit. The terrain collapse mechanism at Fazenda Belém evolves as follows. Meteoric water infiltrates through the unconsolidated sedimentary cap and remobilizes it into the void space associated with the dissolution structures of the Jandaíra Formation. This remobilization starts at the base of the sedimentary cap, where the flow becomes more abrasive due to a change from laminar to turbulent regime as the underground water reaches the open karst structures. The remobilized sediments progressively fill the karst void space from bottom to top, so the void space continuously migrates upwards, ultimately reaching the surface and causing the sudden terrain collapses observed. This phenomenon is particularly active during the rainy season, when the water table, normally located in the karst, may temporarily rise into the unconsolidated sedimentary cap.
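The two amplitude corrections discussed above, exponential time gain followed by spectral balancing that whitens amplitudes while preserving phase, can be sketched numerically. The trace, sample interval and attenuation-compensation rate below are assumed values for illustration, not parameters from the Fazenda Belém survey.

```python
# Sketch of the two amplitude corrections on a synthetic attenuated trace:
# (1) exponential time gain to compensate attenuation, (2) spectral balancing
# that flattens the amplitude spectrum while leaving the phase unchanged.
# All numeric parameters are assumed, for illustration only.
import numpy as np

dt = 0.5e-9                          # sample interval in seconds (assumed)
alpha = 2.0e8                        # attenuation rate in 1/s (assumed)
t = np.arange(256) * dt
trace = np.exp(-alpha * t) * np.sin(2 * np.pi * 400e6 * t)  # decaying signal

# 1) time gain: multiply each sample by exp(+alpha * t)
gained = trace * np.exp(alpha * t)

# 2) spectral balancing: whiten amplitudes, keep the phase spectrum
spec = np.fft.rfft(gained)
eps = 1e-3 * np.abs(spec).max()      # stabilizer against noise blow-up
balanced = np.fft.irfft(spec / (np.abs(spec) + eps), n=t.size)

# late arrivals, almost invisible before, are restored by the gain
print(np.abs(gained[-64:]).max() > 1e6 * np.abs(trace[-64:]).max())  # True
```

Note that the balancing step manipulates only `np.abs(spec)`, never the phase angles, which is why it behaves like the zero-phase deconvolution described above.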

Relevance:

20.00%

Publisher:

Abstract:

In this work we present a proposal to contribute to the teaching and learning of the affine function in the first year of high school, having as prerequisite the mathematical knowledge of basic education. The proposal focuses on some properties, special cases and applications of affine functions, in order to show the importance of proofs while awakening students' interest by showing how this function helps to solve everyday problems.
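One everyday problem of the kind the proposal targets: a taxi fare is an affine function f(x) = ax + b, where b is the flag-down fee and a the price per kilometer. The numeric values below are made up for illustration.

```python
# The taxi fare is an affine function f(x) = a*x + b: b is the fixed
# flag-down fee (the y-intercept) and a the price per km (the slope).
# The prices used here are invented, for illustration only.

def fare(km, a=2.50, b=5.00):
    """Affine function: total cost of a ride of `km` kilometers."""
    return a * km + b

print(fare(0))              # 5.0  -> f(0) = b, the y-intercept
print(fare(10))             # 30.0
print(fare(11) - fare(10))  # 2.5  -> equal steps in x give equal steps in f(x)
```

The last line shows the defining property of affine functions, a constant rate of change, which is what distinguishes them from general functions in the classroom discussion.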

Relevance:

20.00%

Publisher:

Abstract:

The need for precision and for good approximation of numerical results gave rise to several theories; among them we highlight Interval Mathematics. Interval Mathematics emerged in the 1960s with the research work of Moore (MOORE, 1959), in which he proposed working with a mathematics based on the notion of a real interval rather than on a single number as an approximation. This created the need to revisit and reformulate the concepts and results of classical mathematics on the basis of Moore's notion of interval. One of the areas of classical mathematics with many applications in engineering and science is numerical analysis, one of whose pillars is integral calculus and, in particular, line integrals. It is therefore highly desirable to have an integral calculus within Interval Mathematics itself. This work presents a notion of interval line integral based on the extension of integration proposed by Bedregal in (BEDREGAL; BEDREGAL, 2010). As groundwork, we first present an introduction to the perspective in which the work was carried out, considering some historical and evolutionary aspects of classical mathematics; the concepts of the classical line integral, as well as some of its most important applications; and some concepts of Interval Mathematics needed to understand the work. Finally, we propose an application of the line integral to a classic experiment of quantum mechanics (the diffraction of an electron through a slit) which, thanks to the use of Interval Mathematics, gives us a more detailed view, closer to reality.
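Moore's basic idea, representing a real quantity by an interval guaranteed to contain it and defining operations that preserve that guarantee, can be sketched briefly. The class below implements only addition and multiplication, the standard textbook rules, and is not the notation used in the thesis.

```python
# Minimal sketch of Moore-style interval arithmetic: a quantity is an
# interval [lo, hi] guaranteed to contain the true value, and operations
# keep that enclosure guarantee. Only + and * are shown here.

class Interval:
    def __init__(self, lo, hi):
        assert lo <= hi
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        # [a,b] + [c,d] = [a+c, b+d]
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # [a,b] * [c,d] = [min of products, max of products], since signs
        # may flip which endpoint combination is extremal
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

x = Interval(1.0, 2.0)
y = Interval(-1.0, 3.0)
print(x + y)   # [0.0, 5.0]
print(x * y)   # [-2.0, 6.0]
```

Whatever real values lie inside x and y, their sum and product are guaranteed to lie inside the computed intervals, which is the enclosure property the interval line integral builds on.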

Relevance:

20.00%

Publisher:

Abstract:

Surfactants are versatile organic compounds that have, in a single molecule, a double chemical affinity. The surfactant molecule is composed of a hydrophobic tail group, a hydrocarbon chain (linear, branched, or mixed), and a hydrophilic head group containing polar groups, which makes it suitable for the organophilization of natural clays. Microemulsions are microheterogeneous blends composed of a surfactant, an oily phase (non-polar solvent), an aqueous phase and, sometimes, a co-surfactant (a short-chain alcohol). They are thermodynamically stable, transparent systems with high solubilizing power. Vermiculite is a clay mineral with an expandable crystalline structure and a high cation exchange capacity. In this work, vermiculite was used to obtain organoclays. The ionic surfactants dodecylammonium chloride (DDAC) and cetyltrimethylammonium bromide (C16TAB) were used in the organophilization process, as aqueous surfactant solutions and, for DDAC, also as a microemulsion system. The organoclays were used to promote the separation of binary mixtures of xylene isomers (ortho- and meta-xylene). Different analytical techniques were used to characterize the microemulsion systems and the nanoclays. A water-rich microemulsion system with an average droplet diameter of 0.92 nm was produced. The vermiculite used in this work has a cation exchange capacity of 172 meq/100 g, with magnesium as the main cation (24.25%). The basal spacings of natural vermiculite and the organo-vermiculites were obtained by X-ray diffraction: 1.48 nm for natural vermiculite, 4.01 nm for CTAB-vermiculite (CTAB 4) and 3.03 nm for DDAC-vermiculite (DDAC M1A), which confirms the intercalation process. Separation tests were carried out in glass columns using three binary mixtures of xylene (ortho-xylene and meta-xylene).
The results showed that the organovermiculite presented an enhanced chemical affinity for the hydrocarbon mixture when compared with the natural vermiculite, as well as a preference for ortho-xylene. A 2² factorial experimental design with triplicate at the central point was used to optimize the xylene separation process. The experimental design revealed that the initial concentration of isomers in the mixture and the mass of organovermiculite were the significant factors for an improved separation of the isomers. In the experiments carried out with a binary mixture of ortho-xylene and meta-xylene (2:1), after percolation through the organovermiculite bed (DDAC M1), the organoclay showed a preference for the ortho-xylene isomer, which was retained in greater quantity than the meta-xylene one. At the end of the treatment, a final meta-xylene concentration of 47.52% was obtained.

Relevance:

20.00%

Publisher:

Abstract:

The main hypothesis of this thesis is that, to develop industrial automation applications efficiently, a good structuring of the data to be handled is needed. Thus, with the aim of structuring the knowledge involved in the context of industrial processes, this thesis proposes an ontology called OntoAuto that conceptually models the elements involved in the description of industrial processes. To validate the proposed ontology, several applications are presented. In the first, two typical industrial processes are modeled conceptually: a DEA (diethanolamine) treatment unit and a kiln. In the second application, the ontology is used to perform semantic filtering of alarms which, together with correlation analysis, provides temporal relationships between alarms from an industrial process. In the third application, the ontology was used for modeling and analyzing the construction and operation costs of processes. In the fourth application, the ontology is adopted to analyze the reliability and availability of an industrial plant. For both the cost application and the reliability application it was necessary to create new ontologies, OntoEcon and OntoConf respectively, which import the knowledge represented in OntoAuto while adding specific information. The main conclusion of the thesis is that ontology approaches are well suited for structuring the knowledge of industrial processes and that, based on them, various advanced applications in industrial automation can be developed.
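The flavor of such an application, querying process knowledge to filter alarms semantically, can be sketched with a toy triple store. Every name below (units, predicates, alarm ids) is hypothetical and does not reflect OntoAuto's actual vocabulary.

```python
# Toy sketch (hypothetical vocabulary, not OntoAuto's) of the core idea:
# industrial-process knowledge as subject-predicate-object triples that
# applications can query, e.g. to filter alarms semantically.

triples = {
    ("DEA_unit", "is_a", "TreatmentUnit"),
    ("kiln_1", "is_a", "Kiln"),
    ("alarm_42", "raised_by", "DEA_unit"),
    ("alarm_43", "raised_by", "kiln_1"),
    ("DEA_unit", "feeds", "kiln_1"),
}

def query(s=None, p=None, o=None):
    """Return triples matching a pattern; None acts as a wildcard."""
    return [(ts, tp, to) for ts, tp, to in triples
            if s in (None, ts) and p in (None, tp) and o in (None, to)]

# Semantic filtering: keep only alarms raised by units that feed the kiln.
upstream = {s for s, _, _ in query(p="feeds", o="kiln_1")}
alarms = [s for s, _, o in query(p="raised_by") if o in upstream]
print(alarms)  # ['alarm_42']
```

A real ontology adds class hierarchies, axioms and reasoning on top of this, but the payoff is the same: the alarm filter is written against the structured process knowledge, not against raw tag names.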

Relevance:

20.00%

Publisher:

Abstract:

Opuntia ficus-indica (L.) Mill. is a cactacea present in the Caatinga ecosystem whose chemical composition includes flavonoids, galacturonic acid and sugars. Different hydroglycolic (EHG001 and EHG002) and subsequently lyophilized hydroethanolic (EHE001 and EHE002) extracts were developed. The preliminary phytochemical composition of EHE002 was investigated by thin layer chromatography (TLC), and a predominance of flavonoids was observed. Different formulations were prepared: emulsions with Sodium Polyacrylate (and) Hydrogenated Polydecene (and) Trideceth-6 (Rapithix® A60) and with Polyacrylamide (and) C13-14 Isoparaffin (and) Laureth-7 (Sepigel® 305), and a gel with Sodium Polyacrylate (Rapithix® A100). The sensory evaluation was conducted by the check-all-that-apply method. There were no significant differences between the scores assigned to the formulations; however, we noted a preference for those formulated with 1.5% Rapithix® A100 and 3.0% Sepigel® 305. These, and the formulation with 3% Rapithix® A60, were tested for preliminary and accelerated stability. In the accelerated stability study, samples were stored at different temperatures for 90 days, and their organoleptic characteristics, pH values and rheological behavior were assessed. The emulsions formulated with 3.0% Sepigel® 305 and 1.5% Rapithix® A60 were stable, with pseudoplastic and thixotropic behavior. The moisturizing clinical efficacy of the emulsions containing 3.0% Sepigel® 305 and 1 or 3% EHG001 was assessed using the capacitance method (Corneometer®) and transepidermal water loss (TEWL) evaluation (Tewameter®). The results showed that the formulation with 3% EHG001 increased skin moisturization compared with the vehicle and the extractor solvent formulation after five hours. The formulations containing 1 and 3% EHG001 increased the skin barrier effect by reducing transepidermal water loss for up to four hours after application.