939 results for Lattes Platform


Relevance:

20.00%

Publisher:

Abstract:

The growing capacity to integrate transistors has made it possible to build complete systems, with several components, on a single chip; these are called SoCs (System-on-Chip). However, the interconnection subsystem can limit the scalability of SoCs when it is a bus, or it can be an ad hoc solution, such as a bus hierarchy. The Network-on-Chip (NoC) is therefore regarded as the ideal interconnection subsystem for SoCs: NoCs allow simultaneous point-to-point channels between components and can be reused in other designs. On the other hand, NoCs can increase design complexity, chip area and power dissipation, so it is necessary either to change the way they are used or to change the development paradigm. This work thus proposes a NoC-based system in which applications are described as packages and executed in each router between source and destination, without traditional processors. To execute applications regardless of the number of instructions and of the NoC dimensions, the spiral complement algorithm was developed, which selects further destinations until all instructions have been executed. The objective is to study the feasibility of developing this system, named the IPNoSys system. A cycle-accurate simulation tool was developed in SystemC to execute applications written in a package description language, also developed in this study. Through the simulation tool, several results were obtained to evaluate the system's performance. The methodology for describing applications transforms a high-level application into a data-flow graph, which in turn becomes one or more packages. This methodology was applied to three applications: a counter, a 2-D DCT and a floating-point addition. The counter was used to evaluate a deadlock solution and to execute a parallel application; the DCT was used for comparison with the STORM platform; finally, the floating-point addition evaluated the efficiency of a software routine that performs an instruction not implemented in hardware. The simulation results confirm the feasibility of the IPNoSys system: they show that applications described as packages can be executed, sequentially or in parallel, without interruptions caused by deadlock, and that IPNoSys executes them faster than the STORM platform.
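As a rough illustration of the packet-driven execution model described above, the following Python sketch lets each router on a path consume one instruction from a travelling package, leaving any remainder for a further destination, as the spiral complement algorithm would. All names are invented for illustration; the real IPNoSys tool is a cycle-accurate SystemC simulator.

```python
# Hypothetical sketch of packet-driven execution in an IPNoSys-style NoC:
# a package carries a list of instructions, and each router on the path
# executes one instruction before forwarding the rest.

def execute_packet(instructions, path):
    """Consume one instruction per router hop; return (leftover, results)."""
    results = []
    hops = iter(path)
    for instr in instructions:
        try:
            router = next(hops)
        except StopIteration:
            # Destination reached with work left over: the spiral
            # complement algorithm would pick a new destination here.
            return instructions[len(results):], results
        op, a, b = instr
        results.append((router, a + b if op == "add" else a * b))
    return [], results

leftover, done = execute_packet(
    [("add", 1, 2), ("mul", 3, 4), ("add", 5, 6)],
    [(0, 0), (0, 1)],          # only two routers on this leg of the route
)
# one instruction remains for the next leg chosen by the spiral route
```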

Relevance:

20.00%

Publisher:

Abstract:

It is increasingly common for a single computer system to be used through different devices - personal computers, cellular telephones and others - and software platforms - systems with graphical user interfaces, Web systems and others. Depending on the technologies involved, different software architectures may be employed: Web systems, for example, use a client-server architecture, usually extended to three tiers, while systems with graphical interfaces commonly adopt the MVC style. The use of architectures with different styles hinders the interoperability of systems across multiple platforms. A further complication is that the user interface often has a different structure, appearance and behaviour on each device, which leads to low usability. Finally, building user interfaces specific to each device, with distinct features and technologies, is work that must be done individually and does not scale. This study addresses some of these problems by presenting a platform-independent reference architecture that allows the user interface to be built from an abstract specification described in a user interface specification language, MML. This solution is designed to offer greater interoperability between different platforms, greater consistency between the user interfaces, and greater flexibility and scalability for the incorporation of new devices.
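The abstract-specification idea can be illustrated roughly as follows: one device-independent description of a form is rendered to two different platforms. The dictionary format and both back-ends are invented for illustration and are not the actual MML language.

```python
# One abstract UI description (hypothetical format, not real MML)
abstract_ui = {"form": "login",
               "widgets": [("input", "username"), ("input", "password"),
                           ("action", "submit")]}

def render(ui, platform):
    """Render the same abstract spec for two invented platform back-ends."""
    if platform == "web":
        rows = [f'<input name="{n}"/>' if k == "input" else f"<button>{n}</button>"
                for k, n in ui["widgets"]]
        return "\n".join(rows)
    if platform == "mobile":
        return "\n".join(f"{k.upper()}:{n}" for k, n in ui["widgets"])
    raise ValueError(f"unknown platform: {platform}")
```

The point of the sketch is that adding a device means adding a back-end, not rewriting the application's interface description.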

Relevance:

20.00%

Publisher:

Abstract:

The constant increase in the complexity of computer applications demands more powerful hardware to support them. With processor operating frequencies reaching their limit, the most viable solution is parallelism. The concept of the MPSoC (Multi-Processor System-on-Chip) builds on parallelism techniques and on the progressive growth in the number of transistors that can be integrated on a single chip. MPSoCs will eventually become a cheaper and faster alternative to supercomputers and clusters, and applications developed for these high-performance systems will migrate to computers equipped with MPSoCs containing dozens to hundreds of computation cores. Applications in oil and natural gas exploration, in particular, are characterized by high processing requirements and would benefit greatly from such systems. This work evaluates a traditional and complex application of the oil and gas industry, reservoir simulation, by developing a solution for integrated computational systems on a single chip with hundreds of functional units. Since the STORM (MPSoC Directory-Based Platform) platform already provided a shared memory model, a new distributed memory model was developed, along with a message-passing library that follows the MPI standard.
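The message-passing style such a library offers can be sketched, by analogy only, with Python threads standing in for MPSoC cores. The names `send`/`recv` mirror the blocking MPI_Send/MPI_Recv semantics; nothing here is the actual STORM code.

```python
import queue, threading

mailboxes = [queue.Queue() for _ in range(2)]   # one inbox per "core"

def send(dest, data):          # cf. MPI_Send
    mailboxes[dest].put(data)

def recv(rank):                # cf. MPI_Recv (blocks until a message arrives)
    return mailboxes[rank].get()

def worker(rank, out):
    if rank == 0:
        # e.g. a block of reservoir-grid boundary cells for a neighbour core
        send(1, {"cells": [1.0, 2.0, 3.0]})
    else:
        out.append(recv(1))

out = []
threads = [threading.Thread(target=worker, args=(r, out)) for r in (0, 1)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# out[0] now holds the message received by "core" 1
```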

Relevance:

20.00%

Publisher:

Abstract:

Research on Wireless Sensor Networks (WSNs) has evolved, with potential applications in several domains. However, building WSN applications is hampered by the need to program with the low-level abstractions provided by sensor operating systems, and by the need for specific knowledge of each application domain and each sensor platform. We propose an MDA (Model-Driven Architecture) approach to developing WSN applications. This approach allows domain experts to contribute directly to application development without low-level knowledge of WSN platforms and, at the same time, allows network experts to program WSN nodes to meet application requirements without specific knowledge of the application domain. Our approach also promotes the reuse of the developed software artifacts, allowing an application model to be reused across different sensor platforms and a platform model to be reused for different applications.
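A toy sketch of the model-driven idea follows: a platform-independent application model (what to sense, how often) is combined with a platform model to yield platform-specific code. Both the application model and the platform APIs are invented; the actual approach works on richer MDA models.

```python
# Platform-independent application model: the domain expert's view
app_model = {"sense": "temperature", "period_s": 30}

# Platform models: the network expert's view (APIs are hypothetical)
platform_models = {
    "telosb": {"read": "Sensor.readTemp()", "sleep": "Timer.wait({p})"},
    "micaz":  {"read": "ADC.sample(TEMP)",  "sleep": "delay_s({p})"},
}

def transform(app, platform):
    """Weave one application model with one platform model."""
    pm = platform_models[platform]
    return [pm["read"], pm["sleep"].format(p=app["period_s"])]
```

Reuse falls out of the separation: the same `app_model` targets every platform, and each platform model serves every application.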

Relevance:

20.00%

Publisher:

Abstract:

The World Wide Web has been consolidated over recent years as a standard platform for providing software systems on the Internet. Nowadays, a great variety of user applications is available on the Web, from corporate applications to banking, and from electronic commerce to government services. Given the amount of information available and the number of users of their services, many Web systems offer usage recommendations as part of their functionality, so that users can make better use of the available services based on their profile, navigation history and system usage. In this context, this dissertation proposes an agent-based framework that offers recommendations to users of Web systems. The work involves the conception, design and implementation of an object-oriented framework whose agents can be plugged into or unplugged from existing Web applications in a non-invasive way, using aspect-oriented techniques. The framework is evaluated through its instantiation in three different Web systems.
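The non-invasive attachment can be illustrated, by analogy, with a Python wrapper applied to an existing function from the outside: the base code is not edited, only woven. All names and the recommendation rule are invented; the framework itself uses aspect-oriented techniques on Web applications.

```python
def recommend_advice(history):
    """After-advice: append recommendations based on the user's history."""
    def weave(view):
        def wrapped(user):
            page = view(user)                       # run the original view
            seen = history.get(user, [])
            recs = [p for p in ("news", "sports", "tech") if p not in seen]
            return page + " | recommended: " + ", ".join(recs[:2])
        return wrapped
    return weave

def home(user):                       # existing, unmodified base application
    return f"home({user})"

# Weaving happens outside the base code, so it can be "unplugged"
# simply by not applying the wrapper.
home = recommend_advice({"ana": ["news"]})(home)
```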

Relevance:

20.00%

Publisher:

Abstract:

Aspect-Oriented Software Development (AOSD) complements Object-Oriented Software Development (OOSD) by modularizing several concerns that OOSD approaches do not modularize appropriately. However, the current state of the art in AOSD suffers under software evolution, mainly because aspect definitions can stop working correctly when base elements evolve. A promising way to deal with this problem is the definition of model-based pointcuts, in which pointcuts are defined over a conceptual model rather than over the base elements directly; this strategy makes pointcuts less fragile in the face of software evolution. Following this strategy, this work defines a high-level conceptual model in which software patterns and architectures can be specified and, through Model-Driven Development (MDD) techniques, instantiated and composed in an architecture description language that supports aspect modeling at the architectural level. Our MDD approach propagates architecture-level concepts to other abstraction levels (the design level, for example) through MDA transformation rules. This work also presents AOADLwithCM, a plug-in for the Eclipse platform created to support the proposed development process. The plug-in was used to describe a case study based on the MobileMedia system, which shows step by step how the conceptual-model approach can mitigate the fragile pointcut problem caused by software evolution. The MobileMedia case study was also used to analyse software evolution according to the metrics proposed by Khatchadourian, Greenwood and Rashid, and to examine how evolution in the base model affects maintenance of the aspectual model with and without the conceptual-model approach.
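The model-based pointcut idea can be sketched as follows: the pointcut quantifies over a conceptual-model role instead of matching element names, so renaming a base element does not break it as long as the element's mapping to the concept is updated. The mapping and element names are invented, and the real approach operates on architecture models, not Python dictionaries.

```python
# Base elements mapped to conceptual-model concepts (illustrative only)
conceptual_model = {
    "Album.store":   "persistence",
    "Photo.persist": "persistence",
    "Album.render":  "ui",
}

def model_pointcut(concept):
    """Select join points by concept, not by (fragile) element names."""
    return [element for element, c in conceptual_model.items() if c == concept]

# If Album.store is later renamed, only its mapping entry changes;
# the pointcut definition itself stays valid.
join_points = model_pointcut("persistence")
```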

Relevance:

20.00%

Publisher:

Abstract:

The game industry has lately been experiencing a consistent increase in the production costs of games. Part of this increase stems from the current trend toward bigger, more interactive and more replayable environments, which translates into larger teams and longer development times, making game development an even riskier investment and potentially reducing innovation in the area. As a possible solution to this problem, the scientific community has been focusing on procedural content generation and, more specifically, on procedurally generated levels. Given the great diversity and complexity of games, most works choose to deal with a specific genre, platform games being one of the most studied. This work proposes a procedural level generation method for platform/adventure games, a considerably more complex genre than most classic platformers and one that has not yet been the subject of other studies. The level generation process is divided into two steps, planning and visual generation, respectively responsible for generating a compact representation of the level and for determining its appearance. The planning stage is itself divided into game design and level design, and uses a goal-oriented process to output a set of rooms; the visual generation step receives this set of rooms and fills their interiors with appropriate pieces of previously authored geometry.
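A toy illustration of the planning step follows, assuming a simple chain of goal-gated rooms (the abstract does not describe the method's actual design rules, so everything below is invented): the planner emits only a compact room list, and a separate visual pass would later fill each room with authored geometry.

```python
def plan_level(goal_items):
    """Goal-oriented planning sketch: chain rooms so that each goal item
    gates progress to the next room, entrance first."""
    rooms = [{"id": 0, "contents": "entrance", "locked_by": None}]
    for i, item in enumerate(goal_items, start=1):
        rooms.append({
            "id": i,
            "contents": item,
            # each room after the first goal is locked by the previous goal
            "locked_by": goal_items[i - 2] if i > 1 else None,
        })
    return rooms

rooms = plan_level(["key", "double-jump", "boss"])
# The visual generation step would now decorate each room's interior.
```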

Relevance:

20.00%

Publisher:

Abstract:

The increasing demand for processing power in recent years has pushed the integrated circuit industry to look for ways of providing more processing power with less heat dissipation, power consumption and chip area. This goal was long achieved by increasing the circuit clock, but since that approach has physical limits, a new solution has emerged: the multiprocessor system-on-chip (MPSoC). This approach demands new tools and basic software infrastructure to take advantage of the inherent parallelism of these architectures. One of the first activities in the oil exploration industry is deciding whether to explore an oil field; such decisions are aided by reservoir simulations that demand high processing power, and MPSoCs may offer greater performance if their parallelism is well exploited. This work presents a micro-kernel operating system and auxiliary libraries aimed at the STORM MPSoC platform, and analyses their influence on the reservoir simulation problem.

Relevance:

20.00%

Publisher:

Abstract:

With the advance of the Cloud Computing paradigm, a single service offered by a cloud platform may not be enough to meet all of an application's requirements. To fulfill such requirements it may be necessary to use, instead of a single service, a composition of services drawn from different cloud platforms. In order to generate aggregated value for the user, composing services from several Cloud Computing platforms requires a solution for platform integration, which involves handling a large number of non-interoperable APIs and protocols from different platform vendors. In this scenario, this work presents Cloud Integrator, a middleware platform for composing services provided by different Cloud Computing platforms. Besides providing an environment that facilitates the development and execution of applications that use such services, Cloud Integrator acts as a mediator, providing mechanisms for building applications through the composition and selection of semantic Web services that take into account service metadata such as QoS (Quality of Service) and price. Moreover, the proposed middleware provides an adaptation mechanism that can be triggered in case of failure or quality degradation of one or more services used by the running application, in order to ensure its quality and availability. Through a case study consisting of an application that uses services provided by different cloud platforms, Cloud Integrator is evaluated in terms of the efficiency of its service composition, selection and adaptation processes, as well as the potential for using the middleware in heterogeneous cloud scenarios.
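QoS-aware selection of the kind described above can be sketched as a weighted scoring over candidate services; the providers, figures and weights below are all invented for illustration and do not come from Cloud Integrator itself.

```python
# Candidate services from different (hypothetical) cloud providers,
# annotated with QoS metadata
candidates = [
    {"name": "storageA", "latency_ms": 120, "price": 0.10, "uptime": 0.999},
    {"name": "storageB", "latency_ms": 60,  "price": 0.25, "uptime": 0.995},
]

def score(svc, w_latency=0.5, w_price=0.3, w_uptime=0.2):
    """Weighted QoS score: lower latency/price is better, higher uptime is better."""
    return (-w_latency * svc["latency_ms"] / 100
            - w_price * svc["price"]
            + w_uptime * svc["uptime"])

best = max(candidates, key=score)
```

An adaptation mechanism could re-run the same selection when the chosen service fails or its measured QoS degrades, swapping in the next-best candidate.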

Relevance:

20.00%

Publisher:

Abstract:

The widespread growth in the use of smart cards (by banks, transport services, cell phones, etc.) has brought with it an important need: tools that can verify such cards, so as to guarantee the correctness of their software. As the vast majority of cards being developed nowadays use JavaCard technology as their software layer, the Java Modeling Language (JML) is a natural choice for specifying their programs. JML is a formal language tailored to Java, inspired by methodologies from Larch and Eiffel, and it has been widely adopted as the de facto language for specifying Java programs. Various tools that make use of JML have been developed, covering a wide range of functionality, such as runtime and static checking. But the existing tools for static checking are not fully automated, and those that are do not offer an adequate level of soundness and completeness. Our objective is to contribute a series of techniques for the fully automated and trustworthy verification of JavaCard applets, and this work presents the first steps in that direction. Using a software platform comprising Krakatoa, Why and haRVey, we developed a set of techniques to reduce the size of the theory needed to verify the specifications. These techniques yielded very good results, with gains of almost 100% in all tested cases, and they have proved valuable not only here but in most real-world problems related to automatic verification.

Relevance:

20.00%

Publisher:

Abstract:

The way a company deals with its information assets is nowadays a key factor not only for success but for survival in the global market, and the number of information security incidents has grown in recent years. Establishing information security policies that keep the security requirements of assets at the desired levels is therefore a major priority for companies. This dissertation proposes a unified process for the elaboration, maintenance and development of information security policies, the Processo Unificado para Políticas de Segurança da Informação (PUPSI). The elaboration of this proposal began with the construction of a body of knowledge based on official documents and standards about security policies and information security published over the last two decades. Based on the examined documents, the model defines the security policies that need to be established in the organization, their workflow, and the hierarchical sequence among them; the entities participating in the process are also modelled. Because the problem addressed by the model is complex, involving all the security policies a company must have, PUPSI takes an iterative and incremental approach, obtained by instantiating the RUP (Rational Unified Process) model. RUP is an object-oriented software development platform from Rational Software (an IBM group company) that incorporates best practices known to the market. From RUP, PUPSI inherits a process structure that offers functionality, ease of dissemination and comprehension, and agility in process adjustment, as well as the capacity to adapt to technological and structural changes in the market and in the company.

Relevance:

20.00%

Publisher:

Abstract:

Given the security vulnerabilities of distributed systems, mechanisms are needed to guarantee the security requirements of communications between distributed objects. Middleware platforms (component integration platforms) provide security functions that typically offer services for auditing, message protection, authentication and access control. To support these functions, middleware platforms use digital certificates that are provided and managed by external entities. However, most middleware platforms do not define requirements for obtaining, maintaining, validating and delegating digital certificates. In addition, most digital certification systems use X.509 certificates, which are complex and carry many attributes. To address these problems, this work proposes a generic digital certification service for middleware platforms. The service provides flexibility through the joint use of public-key certificates, which implement the authentication function, and attribute certificates, which implement the authorization function; it also supports delegation, and certificate-based access control is transparent to objects. The proposed service defines the digital certificate format, the storage and retrieval system, certificate validation, and support for delegation. To validate the proposed architecture, this work presents an implementation of the digital certification service for the CORBA middleware platform and a case study that illustrates the service's functionality.
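The pairing of identity and attribute certificates can be sketched as follows: a public-key (identity) certificate authenticates the subject, while a separate attribute certificate grants a role and could be issued or delegated independently. Signatures are faked with HMAC purely for illustration; none of this reflects the service's actual certificate format.

```python
import hmac, hashlib

CA_KEY = b"ca-secret"          # stand-in for the certification authority's key

def sign(payload: str) -> str:
    return hmac.new(CA_KEY, payload.encode(), hashlib.sha256).hexdigest()

# Identity (public-key) certificate: used for authentication
identity_cert = {"subject": "alice", "pubkey": "pk-alice"}
identity_cert["sig"] = sign(identity_cert["subject"] + identity_cert["pubkey"])

# Attribute certificate: used for authorization, tied to the same holder
attribute_cert = {"holder": "alice", "role": "auditor"}
attribute_cert["sig"] = sign(attribute_cert["holder"] + attribute_cert["role"])

def authorize(id_cert, attr_cert, needed_role):
    """Validate both certificates, then check holder binding and role."""
    ok_id = hmac.compare_digest(
        id_cert["sig"], sign(id_cert["subject"] + id_cert["pubkey"]))
    ok_attr = hmac.compare_digest(
        attr_cert["sig"], sign(attr_cert["holder"] + attr_cert["role"]))
    return (ok_id and ok_attr
            and attr_cert["holder"] == id_cert["subject"]
            and attr_cert["role"] == needed_role)
```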

Relevance:

20.00%

Publisher:

Abstract:

A self-adaptive software system is able to change its structure and/or behavior at runtime in response to changes in its requirements, environment or components. One way to achieve self-adaptation is to use a sequence of actions (known as an adaptation plan), typically defined at design time. This is the approach adopted by Cosmos, a framework that supports the configuration and management of resources in distributed environments. To deal with the variability inherent in self-adaptive systems, such as the appearance of new components that enable configurations not envisioned at development time, this dissertation aims to give Cosmos the capability of generating adaptation plans at runtime. This required a reengineering of the Cosmos framework to allow its integration with a mechanism for the dynamic generation of adaptation plans. Among the changes made to Cosmos, we highlight the metamodel used to represent components and applications, which was redefined based on an architecture description language. These changes were propagated to a new Cosmos prototype, which was then used to develop a case study application as a proof of concept. A further effort was to make Cosmos more attractive by integrating it with another platform, in this dissertation the OSGi platform, which is well known and accepted by industry.
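Runtime generation of an adaptation plan, as opposed to a plan fixed at design time, can be sketched as deriving the stop/remove/add/start actions from the difference between the current and the target configuration. Component names are invented, and Cosmos derives its plans from richer architectural models than bare sets.

```python
def generate_plan(current, target):
    """Derive an adaptation plan at runtime from two configurations
    (sets of component names)."""
    plan = []
    for comp in sorted(current - target):     # components to retire
        plan += [("stop", comp), ("remove", comp)]
    for comp in sorted(target - current):     # components to introduce
        plan += [("add", comp), ("start", comp)]
    return plan

# A new component (cacheV2) unknown at design time can still be planned for:
plan = generate_plan({"logger", "cacheV1"}, {"logger", "cacheV2"})
```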

Relevance:

20.00%

Publisher:

Abstract:

Yeasts are becoming a common cause of nosocomial fungal infections affecting immunocompromised patients. Such infections can evolve into sepsis, whose mortality rate is high. This study aimed to evaluate the viability of Candida species identification by the automated Vitek system (bioMérieux, Durham, USA). Ninety-eight medical charts referencing the Candida spp. samples available for the study were retrospectively analyzed. The Vitek system with the Candida identification card is recommended for routine laboratory use and showed 80.6% agreement with the reference method. Analysing species separately, 13.5% of C. parapsilosis samples differed from the reference method, the Vitek system wrongly identifying them as C. tropicalis, C. lusitaniae or C. albicans. C. glabrata presented a discrepancy in only one sample (25%), identified by Vitek as C. parapsilosis; C. guilliermondii also differed in only one sample (33.3%), identified as Candida spp. All C. albicans, C. tropicalis and C. lusitaniae samples were identified correctly.

Relevance:

20.00%

Publisher:

Abstract:

The Borborema Province (BP) is a geologic domain located in Northeastern Brazil. It is limited to the south by the São Francisco craton, to the west by the Parnaíba basin, and to the north and east by coastal sedimentary basins. Although the surface geology of the BP is well known, several key aspects of its evolution remain open, notably: i) its tectonic compartmentalization established after the Brasiliano orogenesis; ii) the architecture of its Cretaceous continental margin; iii) the elastic properties of its lithosphere; and iv) the causes of the magmatism and uplift that occurred in the Cenozoic. In this thesis, a regional coverage of geophysical data (elevation, gravity, magnetics, geoid height, and surface-wave global tomography) was integrated with surface geologic information to reach a better understanding of these questions. In the Riacho do Pontal belt and in the western sector of the Sergipano belt, the Neoproterozoic suture of the collision of the Sul domain of the BP with the Sanfranciscana plate (SFP) correlates with an expressive dipolar gravity anomaly: its positive lobe is due to uplift of the BP lower continental crust, while its negative lobe is due to the supracrustal nappes overthrusting the SFP. In the eastern sector of the Sergipano belt this dipolar gravity anomaly does not exist; however, the suture can still be identified at the southern edge of the Marancó arc complex, along the Porto da Folha shear zone, where the N-S geophysical alignments of the SFP are truncated. The boundary associated with the collision of the Ceará domain of the BP with the West African craton also correlates with a dipolar gravity anomaly, whose positive lobe coincides with the Sobral-Pedro II shear zone and whose negative lobe is associated with the Santa Quitéria magmatic arc.
Judging by their geophysical signatures, the major internal boundaries of the BP are: i) the western sector of the Pernambuco shear zone and its eastern continuation as the Congo shear zone; ii) the Patos shear zone; and iii) the Jaguaribe shear zone and its southwestern continuation as the Tatajuba shear zone. By these geophysical criteria, the boundaries divide the BP into five tectonic domains: Sul, Transversal, Rio Grande do Norte, Ceará, and Médio Coreaú. The Sul domain is characterized by geophysical signatures associated with the BP-SFP collision. The fact that the Congo shear zone is now proposed as part of the Transversal domain boundary implies an important change in the original definition of this domain. The Rio Grande do Norte domain presents a highly magnetized crust resulting from the superposition of Precambrian and Phanerozoic events. The Ceará domain is divided by the Senador Pompeu shear zone into two subdomains: the eastern one corresponds to the Orós-Jaguaribe belt and the western one to the Ceará-Central subdomain. The latter exhibits a positive ENE-WSW gravity anomaly associated with a crustal discontinuity, which would have acted as a rampart against the N-S Brasiliano orogenic nappes. The Médio Coreaú domain also presents a dipolar gravity anomaly, with a positive lobe due to granulitic rocks and a negative lobe caused by supracrustal rocks. The boundary between the Médio Coreaú and Ceará domains can be traced beneath the sediments of the Parnaíba basin by its geophysical signature. The joint analysis of free-air anomalies, free-air admittances, and effective elastic thickness estimates (Te) revealed that the Brazilian East and Equatorial continental margins have quite different elastic properties: in the former 10 km < Te < 20 km, whereas in the latter Te ≤ 10 km. The weakening of the Equatorial margin lithosphere was caused by the Cenozoic magmatism.
The BP continental margin is segmented, partly through inheritance from Precambrian structures and domains, and the segmentation matches several sedimentary basin features, described below from south to north. The limit between the Sergipe and Alagoas subbasins coincides with the suture between the BP and the SFP. Te estimates concordantly indicate that Te is around 20 km in the Sergipe subbasin and around 10 km in the Alagoas subbasin, revealing that the lithosphere beneath the Sergipe subbasin is more rigid than that beneath the Alagoas subbasin. In addition, a very dense body (underplating or crustal heritage?) occurs within the crust beneath the Sergipe subbasin that is absent beneath the Alagoas subbasin. The continental margin of the Pernambuco basin (15 km < Te < 25 km) presents a very distinct free-air edge effect displaying two anomalies, indicating the existence of a relatively thick crust in the Pernambuco plateau. In the Paraíba basin the free-air edge effect is quite uniform, Te ≈ 15 km, and the lower crust is abnormally dense, probably owing to its alteration by magmatic underplating in the Cenozoic. The segmentation of the Potiguar basin into three parts was corroborated by the Te estimates: Te ≅ 5 km in the Potiguar rift, Te ≅ 25 km in the Aracati platform, and Te ≅ 10 km in the Touros platform. The observed weakness of the lithosphere in the Potiguar rift segment is due to high heat flux, while the relatively high strength of the lithosphere in the Touros platform may be due to the existence of an Archaean crust. The Ceará basin, in the region of the Mundaú and Icaraí subbasins, presents a quite uniform free-air edge effect, with Te ranging from 10 to 15 km. Analysis of the Bouguer admittance revealed that isostasy in the BP can be explained by an isostatic model in which combined surface and buried loads are present; the estimated ratio of buried to surface loading is 15.
In addition, the lower crust in the BP is abnormally dense. These statements hold particularly well for the northern portion of the BP, where the fit of the observed data to the isostatic model is quite good. Using the same isostatic model to calculate the coherence function, it was found that a single Te estimate for the entire BP must be lower than 60 km, and that the northern portion of the BP has Te around 20 km. Using the conventional elastic flexural model of isostasy, an inversion for crustal thickness was performed. Two regions of thickened crust were identified in the BP: one below the Borborema plateau (associated with uplift in the Cenozoic) and the other in the Ceará domain beneath the Santa Quitéria magmatic arc (a residue associated with the Brasiliano orogenesis). On the other hand, along the Cariri-Potiguar trend the crust is thinned due to an aborted rifting in the Cretaceous. From the interpretation of free-air anomalies, the existence of large-scale magmatism in the oceanic crust surrounding the BP was inferred, in contrast with the incipient magmatism in the continent shown by surface geology. A quite important positive geoid anomaly exists in the BP, spatially correlated with the Borborema plateau and the Macaú-Queimadas volcanic lineament. The integrated interpretation of geoid height anomaly data, a global shear-velocity model, and geologic data makes it possible to propose that Edge-Driven Convection (EDC) may have caused the Cenozoic magmatism. EDC is an instability that presumably occurs at the boundary between thick stable lithosphere and thin oceanic lithosphere. In the BP lithosphere, the EDC mechanism would have dragged cold lithospheric mantle into the hot asthenospheric mantle, causing a positive density contrast that would have generated the main component of the geoid height anomaly.
In addition, the compatibility of the gravity data with the isostatic model in which combined surface and buried loads are present, together with the temporal correlation between the Cenozoic magmatism and the uplift of the Borborema plateau, makes it possible to propose that this uplift was caused by the buoyancy of a crustal root generated by magmatic underplating in the Cenozoic.
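For reference, the effective elastic thickness Te quoted throughout relates to lithospheric strength via the standard thin elastic plate formulation; these are textbook relations, not equations taken from this abstract:

```latex
% Flexural rigidity of a thin elastic plate of effective thickness T_e,
% Young's modulus E and Poisson's ratio \nu
D = \frac{E\,T_e^{3}}{12\,(1-\nu^{2})}

% Flexure w(x) of the plate under a load q(x), with \Delta\rho the density
% contrast across the compensating interface and g the gravity acceleration
D\,\nabla^{4} w + \Delta\rho\, g\, w = q(x)
```

A larger Te thus means a rigid plate that distributes loads regionally, whereas the low Te of the Equatorial margin and the Potiguar rift implies near-local (Airy-like) compensation.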