980 results for ArcGIS Runtime SDK for Android


Relevance:

10.00%

Publisher:

Abstract:

Formal methods should be used to specify and verify on-card software in Java Card applications. Furthermore, the Java Card programming style requires runtime verification of all input conditions of all on-card methods, the main goal being to preserve the data on the card. Design by Contract, and in particular the JML language, is an option for this kind of development and verification, since runtime verification is part of the Design by Contract method as implemented by JML. However, JML and its currently available runtime verification tools were not designed with Java Card limitations in mind and are not Java Card compliant. In this thesis, we analyze how much of this situation is really intrinsic to Java Card limitations and how much is simply a matter of a complete redesign of JML and its tools. We propose the requirements for a new, Java Card compliant language and indicate the lines along which a compiler for this language should be built. The resulting language, JCML, strips from JML the aspects not supported by Java Card, such as concurrency and unsupported types. This would not be enough, however, without a considerable optimization effort on the verification code generated by its compiler, as this verification code must run on the card. The JCML compiler, although much more restricted than the JML one, is able to generate Java Card compliant verification code for some lightweight specifications. In conclusion, we present JCML (Java Card Modeling Language), a Java Card compliant variant of JML, together with a preliminary version of its compiler.
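
As an illustration of the Design by Contract style the thesis targets, the fragment below sketches a JML-style lightweight specification on a hypothetical on-card method. The class, method, data type and exception handling are illustrative assumptions, not taken from the thesis, and the concrete JCML syntax may differ.

    // Hypothetical on-card class; names and limits are illustrative only.
    public class Wallet {
        private short balance; // Java Card code typically favors short over int

        //@ requires amount > 0 && amount <= balance;
        //@ ensures balance == \old(balance) - amount;
        public void debit(short amount) {
            // A JCML-style compiler would weave an optimized precondition check here
            // (for example, throwing javacard.framework.ISOException on violation)
            // instead of relying on the heavyweight standard JML runtime checker.
            balance -= amount;
        }
    }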

Relevance:

10.00%

Publisher:

Abstract:

Distributed multimedia systems have highly variable characteristics: new requirements arise as new technologies become available, and the systems must adjust themselves to the amount of available resources. Such systems should therefore support dynamic adaptation in order to adjust their structures and behaviors at runtime. This paper presents a model-based approach to adaptation and proposes a reflective, component-based framework for the construction and support of self-adaptive distributed multimedia systems, providing many facilities for the development and evolution of such systems, such as dynamic adaptation. The proposal is to keep one or more models representing the system at runtime, so that an external entity can analyze these models, identifying problems and trying to solve them. These models integrate the reflective meta-level, acting as a self-representation of the system. The framework defines a meta-model for the description of self-adaptive distributed multimedia applications, which can represent components and their relationships, policies for QoS specification, and adaptation actions. Additionally, this paper proposes an ADL and an architecture for model-based adaptation. As a case study, the paper presents some scenarios that demonstrate the application of the framework in practice, with and without the use of the ADL, and examines some characteristics related to dynamic adaptation.
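
The Java sketch below illustrates the general idea of a model kept at the reflective meta-level and inspected by an external analyzer. All class names, the QoS property and the adaptation action are hypothetical and do not reproduce the framework's actual meta-model or API.

    import java.util.List;

    // Hypothetical runtime self-representation of a multimedia pipeline component.
    class ComponentModel {
        String name;
        double measuredFrameRate; // value reflected from the running system
        ComponentModel(String name, double fps) { this.name = name; this.measuredFrameRate = fps; }
    }

    // Hypothetical QoS policy attached to the model.
    class QosPolicy {
        double minFrameRate;
        QosPolicy(double minFrameRate) { this.minFrameRate = minFrameRate; }
    }

    // External entity that analyzes the models and decides on adaptation actions.
    class Analyzer {
        void analyze(List<ComponentModel> model, QosPolicy policy) {
            for (ComponentModel c : model) {
                if (c.measuredFrameRate < policy.minFrameRate) {
                    System.out.println("Adapt " + c.name + ": switch to a lighter decoder");
                }
            }
        }
    }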

Relevance:

10.00%

Publisher:

Abstract:

Checking the conformity between implementation and design rules in a system is an important activity for ensuring that no degradation occurs between the architectural patterns defined for the system and what is actually implemented in the source code. Especially for systems which require a high level of reliability, it is important to define specific design rules for exceptional behavior. Such rules describe how exceptions should flow through the system by defining which elements are responsible for catching exceptions thrown by other system elements. However, current approaches to automatically checking design rules do not provide suitable mechanisms to define and verify design rules related to the exception handling policy of applications. This work proposes a practical approach to preserving the exceptional behavior of an application or family of applications, based on the definition and automatic runtime checking of design rules for exception handling in systems developed in Java or AspectJ. To support this approach, a tool called VITTAE (Verification and Information Tool to Analyze Exceptions) was developed in the context of this work; it extends the JUnit framework and automates testing activities for exceptional design rules. We conducted a case study with the primary objective of evaluating the effectiveness of the proposed approach on a software product line. In addition, an experiment was conducted to perform a comparative analysis between the proposed approach and an approach based on a tool called JUnitE, which also proposes testing exception handling code with JUnit tests. The results show how exception handling design rules evolve along different versions of a system and that VITTAE can aid in the detection of defects in exception handling code.
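
To make the idea of an exceptional design rule concrete, the sketch below expresses one such rule as a plain JUnit 4 test. VITTAE's actual API is not described in the abstract, and the application classes used here (AccountFacade, PersistenceException) are hypothetical; the rule illustrated is "persistence exceptions thrown by the data layer must be handled by the facade layer, never propagated to callers".

    import static org.junit.Assert.fail;
    import org.junit.Test;

    // Hypothetical application exception and facade used only for illustration.
    class PersistenceException extends RuntimeException { }

    class AccountFacade {
        // Design rule: persistence failures must be handled here, never propagated.
        void withdraw(String account, int amount) {
            try {
                // ... call to a data-access layer that may throw PersistenceException ...
            } catch (PersistenceException e) {
                // handle/translate the exception inside the facade
            }
        }
    }

    public class ExceptionFlowRuleTest {
        @Test
        public void facadeMustHandlePersistenceExceptions() {
            try {
                new AccountFacade().withdraw("acc-1", 100);
            } catch (PersistenceException e) {
                fail("Design rule violated: PersistenceException escaped the facade layer");
            }
        }
    }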

Relevance:

10.00%

Publisher:

Abstract:

A self-adaptive software system is able to change its structure and/or behavior at runtime in response to changes in its requirements, environment or components. One way to achieve self-adaptation is to use a sequence of actions (known as an adaptation plan), typically defined at design time. This is the approach adopted by Cosmos, a framework to support the configuration and management of resources in distributed environments. In order to deal with the variability inherent to self-adaptive systems, such as the appearance of new components that allow configurations not envisioned at development time, this dissertation aims to give Cosmos the capability of generating adaptation plans at runtime. To this end, it was necessary to reengineer the Cosmos framework in order to allow its integration with a mechanism for the dynamic generation of adaptation plans. In this context, our work focused on this reengineering of Cosmos. Among the changes made, we highlight the metamodel used to represent components and applications, which was redefined based on an architectural description language. These changes were propagated to the implementation of a new Cosmos prototype, which was then used to develop a case study application as a proof of concept. Another effort was to make Cosmos more attractive by integrating it with another platform, in the case of this dissertation the OSGi platform, which is well known and accepted by industry.

Relevance:

10.00%

Publisher:

Abstract:

One way to deal with the high complexity of current software systems is through self-adaptive systems. A self-adaptive system must be able to monitor itself and its environment, analyze the monitored data to determine the need for adaptation, decide how the adaptation will be performed, and finally make the necessary adjustments. One way to adapt a system is to generate, at runtime, the process that will perform the adaptation. An advantage of this approach is the possibility of taking into account features that can only be evaluated at runtime, such as the emergence of new components that allow architectural arrangements not foreseen at design time. The main objective of this work is to use a framework for the dynamic generation of processes to generate architectural adaptation plans in an OSGi environment. Our main interest is to evaluate how this framework for the dynamic generation of processes behaves in new environments.
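
As a rough illustration of the monitor-analyze-plan-execute cycle described above, the Java sketch below generates a toy plan at runtime. The probe, the QoS threshold and the plan actions are hypothetical and are not taken from the actual framework or its OSGi integration.

    import java.util.ArrayList;
    import java.util.List;

    public class AdaptationLoop {

        // Monitor: stand-in for a real probe of the running system.
        static double monitorResponseTimeMs() {
            return 180.0;
        }

        // Analyze: compare the observation against a (hypothetical) QoS goal.
        static boolean needsAdaptation(double responseTimeMs) {
            return responseTimeMs > 150.0;
        }

        // Plan: built at runtime, so it may use components discovered only after
        // deployment (for example, a newly installed bundle in an OSGi environment).
        static List<String> generatePlan() {
            List<String> plan = new ArrayList<>();
            plan.add("stop overloaded component A");
            plan.add("bind newly discovered component B in its place");
            return plan;
        }

        // Execute: here, just print each planned action.
        public static void main(String[] args) {
            if (needsAdaptation(monitorResponseTimeMs())) {
                for (String action : generatePlan()) {
                    System.out.println("execute: " + action);
                }
            }
        }
    }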

Relevance:

10.00%

Publisher:

Abstract:

Alongside technological advances, embedded systems are increasingly present in our everyday lives. Due to the increasing demand for functionality, many tasks are split among processors, requiring more efficient communication architectures such as networks on chip (NoCs). NoCs are structures whose routers, interconnected by point-to-point channels, link the cores of a system on chip (SoC) and provide communication. Several networks on chip are described in the literature, each with its specific characteristics. Among these, this work adopts the Integrated Processing System NoC (IPNoSyS), a network on chip whose characteristics differ from general NoCs because its routing components also accumulate a processing function, i.e., they have functional units able to execute instructions. In this model, packets are processed and routed by the router architecture. This work aims at improving the performance of applications that contain loops, since such applications spend most of their execution time repeatedly executing the same instructions. Thus, this work proposes to optimize the runtime of these structures by employing an instruction-level parallelism technique, in order to better exploit the resources offered by the architecture. The applications are tested on a dedicated simulator and the results are compared with those of the original version of the architecture, which implements only packet-level parallelism.

Relevance:

10.00%

Publisher:

Abstract:

The Traveling Purchaser Problem is a variant of the Traveling Salesman Problem in which there is a set of markets and a set of products. Each product is available in a subset of the markets and its unit cost depends on the market where it is available. The objective is to buy all the products, departing from and returning to a domicile, at the least possible cost, defined as the sum of the weights of the edges in the tour plus the cost paid to acquire the products. A Transgenetic Algorithm, an evolutionary algorithm based on endosymbiosis, is applied to the capacitated and uncapacitated versions of this problem. Evolution in Transgenetic Algorithms is simulated through the interaction and information sharing between populations of individuals from distinct species. The computational results show that this is a very effective approach for the TPP regarding solution quality and runtime. Seventeen and nine new best results are presented for instances of the capacitated and uncapacitated versions, respectively.
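
The objective function described above can be made concrete with a short evaluation routine. The Java sketch below computes the cost of a candidate tour for the uncapacitated case; the data structures (a distance matrix and a product-to-market price map) are illustrative and do not come from the dissertation, and real instances also carry availability and, in the capacitated case, supply constraints.

    import java.util.List;
    import java.util.Map;

    public class TppCost {

        // Sum of edge weights along the tour, returning to the domicile (first market).
        static double tourCost(List<Integer> tour, double[][] dist) {
            double cost = 0.0;
            for (int i = 0; i < tour.size(); i++) {
                int from = tour.get(i);
                int to = tour.get((i + 1) % tour.size());
                cost += dist[from][to];
            }
            return cost;
        }

        // Cheapest purchase of each product among the visited markets.
        // price.get(product).get(market) = unit cost of the product at that market.
        static double purchaseCost(List<Integer> tour, Map<Integer, Map<Integer, Double>> price) {
            double cost = 0.0;
            for (Map<Integer, Double> marketPrices : price.values()) {
                double best = Double.POSITIVE_INFINITY;
                for (int market : tour) {
                    Double p = marketPrices.get(market);
                    if (p != null && p < best) best = p;
                }
                cost += best; // assumes every product is available in some visited market
            }
            return cost;
        }

        static double totalCost(List<Integer> tour, double[][] dist,
                                Map<Integer, Map<Integer, Double>> price) {
            return tourCost(tour, dist) + purchaseCost(tour, price);
        }
    }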

Relevance:

10.00%

Publisher:

Abstract:

The mangrove is a coastal ecosystem of great ecological importance, showing high fragility in the face of natural processes and human interventions in the coastal zone. This research aims to analyze the relation between the distribution of mangrove species and the variation of geochemical parameters of water and soil in the Apodi/Mossoró estuary, located on the northern coastline of Rio Grande do Norte state. The results were obtained from floristic and structural analyses of the vegetation and from the interpretation of QuickBird satellite images (acquired in 2006), processed with the ENVI 4.3 and ArcGIS 9.2 software. The estuary is characterized by a salinity gradient along roughly 40 kilometers, with values between 50 and 90 g/L. A mixed vegetation formation was identified at the estuary mouth, where water salinity does not differ widely from that of seawater (36 g/L), comprising the species Rhizophora mangle L., Laguncularia racemosa (L.) C. F. Gaertn, Avicennia schaueriana Stapf & Leechman and Avicennia germinans L. Along the estuary there is a fringe formation of vegetation composed of Avicennia spp. and L. racemosa. In the upper estuary, where salinity values stay above 60 g/L, only A. germinans predominates, in dwarf form. In this sense, salinity acts as a limiting stress factor on the mangrove vegetation along the estuary, and this parameter should be taken into account when drawing up management and environmental restoration plans for the estuary in question.

Relevance:

10.00%

Publisher:

Abstract:

The aim of this study is to investigate eco-environmental vulnerability, its changes and its causes, in order to develop a management system for eco-environmental vulnerability and risk assessment in the Apodi-Mossoró estuary, Northeast Brazil. The analysis focuses on the interference of landscape conditions and their changes due to the following factors: the oil and natural gas industry, the tropical fruit industry, shrimp farms, the marine salt industry, occupation of sensitive areas, demand for land, vegetation degradation, siltation of rivers, severe flooding, sea level rise (SLR), coastal dynamics, low and flat topography, the high ecological and tourism value of the region, and the rapid growth of urbanization. Conventional and remote sensing data were analyzed using modeling techniques based on the ArcGIS, ER Mapper, ERDAS Imagine and ENVI software. Digital images were initially processed by Principal Component Analysis and the maximum noise fraction transformation, and then all bands were normalized to reduce errors caused by bands of different ranges. They were integrated in a Geographic Information System analysis to detect changes and to generate digital elevation models, geomorphic indices and other variables of the study area. A three-band color combination of multispectral bands was used to monitor changes of land and vegetation cover from 1986 to 2009. This task also included the analysis of various secondary data, such as field data, socioeconomic data, environmental data and growth prospects. The main objective of this study was to improve our understanding of eco-environmental vulnerability and risk assessment; its causes basically show the intensity and distribution of the human-environment effect on the ecosystem. The study also identifies the areas of high and low sensitivity, the areas of inundation due to future SLR, and the loss of land due to coastal erosion in the Apodi-Mossoró estuary, in order to establish a strategy for sustainable land use. The developed model includes basic factors such as geology, geomorphology, soils, land use / land cover, vegetation cover, slope, topography and hydrology. The numerical results indicate that 9.86% of the total study area was under very high vulnerability, 29.12% high vulnerability, 52.90% moderate vulnerability and 2.23% very low vulnerability. The analysis indicates that areas of 216.1 km² and 362.8 km² would be flooded under 1 m and 10 m of sea level rise, respectively. The sectors most affected were residential, industrial and recreational areas, agricultural land, and ecosystems of high environmental sensitivity. The results showed that changes in eco-environmental vulnerability have a significant impact on the sustainable development of the RN state, since the indicator is a function of sensitivity, exposure and status in relation to a level of damage. The model is presented as a tool to assist in indexing vulnerability in order to optimize actions and assess the implications of decisions and policies regarding the management of coastal and estuarine areas. In this context, aspects such as population growth, vegetation degradation, land use / land cover, the amount and type of industrialization, SLR and government policies for environmental protection were considered the main factors affecting the eco-environmental changes over the last three decades in the Apodi-Mossoró estuary.
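
Vulnerability models of this kind are often computed as a weighted overlay of the thematic layers listed above. The Java sketch below shows that general idea for a single raster cell; the scores, weights and class breaks are purely hypothetical and are not the ones used in this study.

    public class VulnerabilityIndex {

        // Combine per-layer scores (geology, geomorphology, soils, land use, ...)
        // with weights; weights are assumed to sum to 1.
        static double combine(double[] layerScores, double[] weights) {
            double v = 0.0;
            for (int i = 0; i < layerScores.length; i++) {
                v += layerScores[i] * weights[i];
            }
            return v;
        }

        // Hypothetical class breaks for the vulnerability categories.
        static String classify(double v) {
            if (v >= 0.8) return "very high";
            if (v >= 0.6) return "high";
            if (v >= 0.4) return "moderate";
            if (v >= 0.2) return "low";
            return "very low";
        }

        public static void main(String[] args) {
            double[] scores  = {0.9, 0.7, 0.6, 0.8, 0.5, 0.4, 0.6, 0.7}; // one cell, 8 layers
            double[] weights = {0.15, 0.15, 0.10, 0.20, 0.15, 0.10, 0.05, 0.10};
            System.out.println(classify(combine(scores, weights)));
        }
    }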

Relevance:

10.00%

Publisher:

Abstract:

Nowadays, there are many aspect-oriented middleware implementations that take advantage of the modularity provided by the aspect-oriented paradigm. Although such works always present an assessment of the middleware according to some quality attribute, there is no specific set of metrics to assess them in a comprehensive way, covering various quality attributes. This work proposes a suite of metrics for the assessment of aspect-oriented middleware systems at different development stages: design, refactoring, implementation and runtime. The work presents the metrics and how they are applied at each development stage. The suite is composed of metrics associated with static properties (modularity, maintainability, reusability, flexibility, complexity, stability and size) and dynamic properties (performance and memory consumption). These metrics are based on existing assessment approaches for object-oriented and aspect-oriented systems. The proposed metrics are used in the context of OiL (Orb in Lua), a middleware based on CORBA and implemented in Lua, and of AO-OiL, a refactoring of OiL that follows a reference architecture for aspect-oriented middleware systems. The case study performed with OiL and AO-OiL is a system for monitoring oil wells. This work also presents the CoMeTA-Lua tool, which automates the collection of coupling and size metrics from Lua source code.

Relevance:

10.00%

Publisher:

Abstract:

Web services are loosely coupled applications that use XML documents as a way of integrating distinct systems on the internet. Such documents are used in standards such as SOAP, WSDL and UDDI, which establish, respectively, integrated patterns for the representation of messages and for the description and publication of services, thus facilitating the interoperability between heterogeneous systems. Often a single service does not meet the users' needs, so new systems can be designed from the composition of two or more services; this is the design goal behind the Service Oriented Architecture. In parallel to this scenario, we have the PEWS (Predicate Path-Expressions for Web Services) language, which specifies the behaviour of composite web service interfaces. The development of the PEWS language is divided into two parts: front-end and back-end. From a PEWS program, the front-end performs the lexical, syntactic and semantic analysis of compositions and finally generates XML code. The function of the back-end is to execute the PEWS composition. This master's dissertation aims to: (i) reformulate the proposed architecture for the runtime system of the language; and (ii) implement the back-end for PEWS by using .NET Framework tools to execute PEWS programs on the Windows Workflow Foundation.

Relevance:

10.00%

Publisher:

Abstract:

The present work was developed on the dune systems of Parque das Dunas and Barreira do Inferno, located in the municipalities of Natal and Parnamirim (RN, Brazil), respectively. Its purpose is to develop a deterministic model of a specific blowout at Parque das Dunas, based on the geophysical interpretation of lines acquired with Ground Penetrating Radar (GPR) and on planialtimetric surveys of the topographic surface of the terrain. Analyses of the vulnerability/susceptibility of these dune systems in relation to human pressures were also carried out. To develop the deterministic model it is necessary to acquire the inner and outer geometries of the blowout. To depict the inner geometries beneath the surface, GPR is used, with altimetric control for the topographic correction of the GPR lines. As for the outer geometries, geodetic GPS provides planialtimetric points (x, y and z) with millimetric precision, resulting in high-resolution surfaces. By interpolating the planialtimetric points, it was possible to create Digital Elevation Models (DEMs) of these surfaces. In total, 1,161.4 meters of GPR lines were acquired over the blowout at Parque das Dunas and 3,735.27 meters over the blowout at Barreira do Inferno. These lines were acquired with a 200 MHz antenna, except for lines 7 and 8, for which a 100 MHz antenna was used. The data were processed and interpreted, making it possible to identify first-, second- and third-order boundary surfaces. The first-order boundary surface is related to the contact between the rocks of the Barreiras Group and the aeolian deposits. These deposits were divided into two groups (Group 1 and Group 2), related to the geometry of the strata and the dip of their stratifications. Group 1 presented strata of sigmoidal and irregular geometries and involved bodies whose reflectors showed dips varying from 20 to 28 degrees for the Parque das Dunas blowout and from 22 to 29 degrees for the Barreira do Inferno blowout; it was usually bounded at the base by the first-order surface and at the top by the second-order surface. Group 2 presented strata of trough, wedge or lens geometries, bounded at the base by the second-order surface, and the corresponding deposits showed smoother reflectors or low-angle dips. The deterministic and digital elevation models were developed from the integration and interpretation of the 2D data with the GOCAD® program. In the Digital Elevation Models it was possible to recognize, at both localities, corridor- or trough-shaped blowouts. In the deterministic model it was possible to see the first- and second-order boundary surfaces. For the vulnerability/susceptibility of the dune systems, the methodology proposed by Bodéré et al. (1991) was applied; however, it proved inadequate because it evaluates active coastal dunes, that is, dunes in equilibrium with the current environmental conditions. Therefore, a new methodology was proposed which characterizes sediment supply and activity as well as human pressures. According to the methodology developed in this work, both localities presented good management. Parque das Dunas was characterized as a relict dune system and Barreira do Inferno as a palimpsest dune system. Two thematic maps were also produced for the environmental characterization of the studied dune systems, using the ArcGIS 8.3 software, together with their respective databases.

Relevance:

10.00%

Publisher:

Abstract:

This study presents the results of an analysis, by remote sensing, of areas susceptible to degradation in a semi-arid region, a matter of concern that affects the whole population; this process is driven mainly by deforestation of the savanna and by improper soil use practices. The objective of this research is to use biophysical parameters from MODIS/Terra and TM/Landsat-5 images to determine areas susceptible to degradation in the semi-arid region of Paraíba. The study area is located in the central interior of Paraíba, in the sub-basin of the Taperoá River, with average annual rainfall below 400 mm and an average annual temperature of 28 °C. To draw up the vegetation map, TM/Landsat-5 images were used, specifically the 5R4G3B color composition, commonly used for mapping land use. This map was produced by unsupervised classification using the maximum likelihood method. The legend corresponds to the following targets: sparse and dense savanna vegetation, riparian vegetation and exposed soil. The biophysical parameters used from MODIS were emissivity, albedo and the Normalized Difference Vegetation Index (NDVI). The GIS programs used were the MODIS Reprojection Tool and the Georeferenced Information Processing System (SPRING), in which the database of MODIS and TM data was set up and processed, together with the ArcGIS software for producing more customizable maps. Initially, we evaluated the behavior of the vegetation emissivity by adapting Bastiaanssen's NDVI-based equation to spatialize emissivity and observe its changes during 2006. The albedo was used to assess its percentage increase between December 2003 and December 2004. Landsat TM images were used for December 2005, according to image availability and to periods of low emissivity. These applications were implemented in LEGAL, the spatial algebra language of SPRING, a programming routine that allows performing various types of algebra on spatial data and maps. The detection of areas susceptible to environmental degradation took into account the behavior of the savanna emissivity, which showed a seasonal pattern coinciding with the rainy season, reaching maximum emissivity from April to July and low emissivity in the remaining months. From the albedo images of December 2003 and December 2004, the percentage increase was computed, which allowed the generation of two distinct classes: areas with a percentage variation of 1 to 11.6% and areas with an albedo variation of less than 1%. It was then possible to generate the map of susceptibility to environmental degradation by intersecting the exposed soil class with the albedo percentage variation, resulting in classes of susceptibility to environmental degradation.
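
For reference, the NDVI and the Bastiaanssen (SEBAL) emissivity relation referred to above are usually written as shown below; the exact coefficients adopted in the study are not given in the abstract, so the 1.009 and 0.047 values are the commonly cited ones and should be read as an assumption.

    \mathrm{NDVI} = \frac{\rho_{\mathrm{NIR}} - \rho_{\mathrm{red}}}{\rho_{\mathrm{NIR}} + \rho_{\mathrm{red}}},
    \qquad
    \varepsilon_{0} = 1.009 + 0.047\,\ln(\mathrm{NDVI}) \quad (\mathrm{NDVI} > 0)

Here ρ_NIR and ρ_red are the near-infrared and red surface reflectances, and ε_0 is the narrowband surface emissivity spatialized from NDVI.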

Relevance:

10.00%

Publisher:

Abstract:

Interactive visual representations complement traditional statistical and machine learning techniques for data analysis, allowing users to play a more active role in the knowledge discovery process and making the whole process more understandable. Though visual representations are applicable to several stages of the knowledge discovery process, a common use of visualization is in the initial stages, to explore and organize a sometimes unknown and complex data set. In this context, the integrated and coordinated use of multiple graphical representations (that is, user actions should be capable of affecting multiple visualizations when desired) allows data to be observed from several perspectives and offers richer information than isolated representations. In this paper we propose an underlying model for an extensible and adaptable environment that allows independently developed visualization components to be gradually integrated into a user-configured knowledge discovery application. Because a major requirement when using multiple visual techniques is the ability to link them, so that user actions executed on one representation propagate to others if desired, the model also allows runtime configuration of coordinated user actions over different visual representations. We illustrate how this environment is being used to assist data exploration and organization in a climate classification problem.
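
The coordination requirement can be pictured with the small Java sketch below: independently developed views register with a coordinator, and a user action on one view (here, a selection) is propagated to the others on request. The interface and class names are hypothetical and do not reproduce the paper's model.

    import java.util.ArrayList;
    import java.util.List;

    // Contract each visualization component implements to receive coordinated actions.
    interface View {
        void applySelection(List<Integer> selectedItems);
    }

    // Runtime-configurable coordinator that links the registered views.
    class Coordinator {
        private final List<View> views = new ArrayList<>();

        void register(View v) { views.add(v); }

        // Called by the view where the user acted; 'source' is skipped to avoid echoing.
        void propagateSelection(View source, List<Integer> selectedItems) {
            for (View v : views) {
                if (v != source) v.applySelection(selectedItems);
            }
        }
    }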

Relevance:

10.00%

Publisher:

Abstract:

This work analyzes the application and execution time of a numerical algorithm that simulates incompressible and isothermal flows. The explicit scheme of the Characteristic Based Split (CBS) algorithm was used, together with the Artificial Compressibility (AC) scheme for coupling the pressure and velocity equations. The discretization was done with the finite element method on a grid of bilinear elements. The free software GNU Octave was used to implement and run the routines. The results were analyzed using the classic lid-driven cavity problem, and tests were run for several Reynolds numbers. The results show good agreement with previous results from the literature. The runtime analysis of the code also shows that matrix assembly is the most time-consuming part of the implementation.
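
For context, in its common form the artificial compressibility scheme replaces the incompressibility constraint by a pseudo-transient pressure equation of the kind shown below; the specific formulation and the value of the parameter used in this work are not given in the abstract.

    \frac{1}{\beta^{2}}\,\frac{\partial p}{\partial t} + \nabla \cdot \mathbf{u} = 0

Here p is the pressure, u the velocity field and β the artificial compressibility (artificial wave speed) parameter; the CBS steps advance the momentum equation explicitly and the pressure is then updated from this relation.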