17 results for Cloud computing, OpenNebula, synchronization, replication, wide area network
at Universidade Federal do Rio Grande do Norte (UFRN)
Abstract:
With the advance of the Cloud Computing paradigm, a single service offered by a cloud platform may not be enough to meet all of an application's requirements. To fulfill such requirements, a composition that aggregates services provided by different cloud platforms may be needed instead of a single service. In order to generate added value for the user, this composition of services provided by several Cloud Computing platforms requires a solution for platform integration, which involves handling a large number of non-interoperable APIs and protocols from different platform vendors. In this scenario, this work presents Cloud Integrator, a middleware platform for composing services provided by different Cloud Computing platforms. Besides providing an environment that facilitates the development and execution of applications that use such services, Cloud Integrator acts as a mediator, providing mechanisms for building applications through the composition and selection of semantic Web services that take into account metadata about the services, such as QoS (Quality of Service) and price. Moreover, the proposed middleware platform provides an adaptation mechanism that can be triggered upon failure or quality degradation of one or more services used by the running application, in order to ensure its quality and availability. Through a case study consisting of an application that uses services provided by different cloud platforms, Cloud Integrator is evaluated in terms of the efficiency of its service composition, selection and adaptation processes, as well as the potential of using this middleware in heterogeneous computational cloud scenarios.
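To make the service-selection step concrete, the sketch below ranks semantically equivalent candidate services by a weighted utility over their QoS metadata (response time, availability, price), as the abstract describes. The names, metrics and weights are illustrative assumptions, not Cloud Integrator's actual API.

```python
# Minimal sketch of QoS-aware service selection. Service names,
# metrics, and weights are illustrative, not from Cloud Integrator.
from dataclasses import dataclass

@dataclass
class CandidateService:
    name: str
    response_time_ms: float  # lower is better
    availability: float      # 0..1, higher is better
    price_per_call: float    # lower is better

def score(s: CandidateService, weights=(0.4, 0.4, 0.2)) -> float:
    """Combine normalized QoS metadata into a single utility value."""
    w_rt, w_av, w_pr = weights
    return (w_rt * (1.0 / (1.0 + s.response_time_ms))
            + w_av * s.availability
            + w_pr * (1.0 / (1.0 + s.price_per_call)))

def select_best(candidates: list[CandidateService]) -> CandidateService:
    # All candidates are assumed semantically equivalent (same capability),
    # so the choice is driven purely by QoS metadata.
    return max(candidates, key=score)

candidates = [
    CandidateService("storage-provider-a", 120.0, 0.999, 0.002),
    CandidateService("storage-provider-b", 45.0, 0.990, 0.004),
]
print(select_best(candidates).name)
```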
Abstract:
Cloud Computing is a paradigm that enables simple, pervasive, network-based access to shared and configurable computing resources. Such resources can be offered to users on demand in a pay-per-use model. With the advance of this paradigm, a single service offered by a cloud platform might not be enough to meet all of a client's requirements; hence, it becomes necessary to compose services provided by different cloud platforms. However, current cloud platforms are not built on common standards: each has its own APIs and development tools, which is a barrier to composing different services. In this context, Cloud Integrator, a service-oriented middleware platform, provides an environment that facilitates the development and execution of multi-cloud applications. The applications are compositions of services from different cloud platforms, represented by abstract workflows. However, Cloud Integrator has some limitations: (i) applications are executed locally; (ii) users cannot specify an application in terms of its inputs and outputs; and (iii) experienced users cannot directly determine the concrete Web services that will execute the workflow. To address these limitations, this work proposes Cloud Stratus, a middleware platform that extends Cloud Integrator and offers different ways to specify an application: as an abstract workflow or as a complete/partial execution flow. The platform enables application deployment on cloud virtual machines, so that several users can access it through the Internet. It also supports the access and management of virtual machines on different cloud platforms, and provides mechanisms for monitoring services and assessing QoS parameters. Cloud Stratus was validated through a case study consisting of an application that uses services provided by different cloud platforms, and evaluated through computational experiments that analyze the performance of its processes.
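The difference between an abstract workflow and a partial execution flow can be illustrated with a small data model: activities carry only a semantic capability plus inputs and outputs, and an experienced user may optionally pin a concrete service to a step. This is a hypothetical sketch, not Cloud Stratus's real specification format.

```python
# Hypothetical sketch of the two specification styles: an abstract
# workflow (capabilities only) versus a partial execution flow in
# which the user pins some concrete services. Not Cloud Stratus's API.
from dataclasses import dataclass, field

@dataclass
class Activity:
    capability: str              # semantic description of what must be done
    inputs: list[str]
    outputs: list[str]
    concrete_service: str | None = None  # None = resolved by the platform

@dataclass
class Application:
    activities: list[Activity] = field(default_factory=list)

    def unresolved(self) -> list[Activity]:
        return [a for a in self.activities if a.concrete_service is None]

app = Application([
    Activity("image-resize", ["raw_image"], ["thumbnail"]),
    # Partial execution flow: the user fixes this step to a concrete service.
    Activity("object-storage-put", ["thumbnail"], ["url"],
             concrete_service="provider-x/storage/v1"),
])
print([a.capability for a in app.unresolved()])  # left for the middleware
```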
Abstract:
Multi-cloud applications are composed of services offered by multiple cloud platforms, where the user/developer has full knowledge of the use of such platforms. Using multiple cloud platforms avoids the following problems: (i) vendor lock-in, i.e., the dependency of an application on a particular cloud platform, which is harmful in the case of degradation or failure of platform services, or of price increases for service usage; and (ii) degradation or failure of the application due to fluctuations in the quality of service (QoS) provided by some cloud platform, or due to the failure of any service. In a multi-cloud scenario, it is possible to replace a failed service, or one with QoS problems, with an equivalent service from another cloud platform. For an application to adopt the multi-cloud perspective, it is necessary to create mechanisms capable of selecting which cloud services/platforms should be used, in accordance with the requirements determined by the programmer/user. In this context, the major challenges in developing such applications include: (i) choosing which underlying services and cloud computing platforms should be used, based on user-defined requirements for functionality and quality; (ii) continually monitoring dynamic information about cloud services (such as response time, availability and price), in addition to coping with the wide variety of services; and (iii) adapting the application whenever QoS violations affect user-defined requirements. This PhD thesis proposes an approach for the dynamic adaptation of multi-cloud applications, to be applied when a service becomes unavailable or when the requirements set by the user/developer indicate that another available multi-cloud configuration meets them more efficiently. The proposed strategy is composed of two phases. The first phase consists of modeling the application, exploiting the capacity for representing commonalities and variability proposed in the context of the Software Product Line (SPL) paradigm. This phase uses an extended feature model to specify the cloud service configuration to be used by the application (commonalities) and the different possible providers for each service (variability). Furthermore, the non-functional requirements associated with cloud services are specified as properties in this model, describing dynamic information about these services. The second phase consists of an autonomic process based on the MAPE-K control loop, which is responsible for optimally selecting a multi-cloud configuration that meets the established requirements and for performing the adaptation. The proposed adaptation strategy is independent of the programming technique used to perform the adaptation; in this work, it is implemented using several techniques, namely aspect-oriented, context-oriented, and component- and service-based programming. Based on the proposed phases, we assess: (i) whether the modeling process and the specification of non-functional requirements ensure effective monitoring of user satisfaction; (ii) whether the optimal selection process yields significant gains compared to a sequential approach; and (iii) which techniques offer the best trade-off between development effort/modularity and performance.
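A minimal skeleton of the MAPE-K loop described in the second phase might look as follows: monitored QoS measurements are analyzed against the user's requirements, a replacement provider is planned when a violation is detected, and the reconfiguration is executed against the shared knowledge base. The thresholds, providers and selection rule are placeholder assumptions; the thesis's optimal selection over an extended feature model is considerably richer.

```python
# Placeholder MAPE-K control-loop skeleton for multi-cloud adaptation.
knowledge = {
    "requirements": {"max_response_ms": 200, "min_availability": 0.99},
    "current_config": {"storage": "provider-a"},
    "alternatives": {"storage": ["provider-a", "provider-b"]},
}

def monitor() -> dict:
    # Placeholder: in practice, dynamic QoS data is collected from providers.
    return {"provider-a": {"response_ms": 350, "availability": 0.97},
            "provider-b": {"response_ms": 120, "availability": 0.999}}

def analyze(measurements: dict) -> bool:
    # Does the current configuration violate any user-defined requirement?
    cfg = knowledge["current_config"]["storage"]
    req = knowledge["requirements"]
    m = measurements[cfg]
    return (m["response_ms"] > req["max_response_ms"]
            or m["availability"] < req["min_availability"])

def plan(measurements: dict) -> str:
    # Toy selection rule: pick the fastest alternative provider.
    return min(knowledge["alternatives"]["storage"],
               key=lambda p: measurements[p]["response_ms"])

def execute(new_provider: str) -> None:
    knowledge["current_config"]["storage"] = new_provider
    print(f"reconfigured storage -> {new_provider}")

measurements = monitor()
if analyze(measurements):
    execute(plan(measurements))
```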
Abstract:
This work is a case study of the Belém Jewelry Pole, whose main question is to understand how the social network in which the Pole is embedded influences the innovation process in this area. The main objective is to analyze how interorganizational networks have impacted, and still impact, the potential for innovation, creating both limits and opportunities for the companies' development. The adopted method analyzed the historical trajectory of the jewelry industry from the beginning of mineral extraction in the city of Itaituba (Pará State) to the present day. Primary and secondary data were used, allowing a view of the network's dynamics during periods of transformation of the main actors involved in the process. The structural embeddedness perspective, used as the analysis technique, allowed verifying the quality of the actors' ties as well as visualizing their structures. Over the jewelry industry's trajectory, a change in the quality of social relations was observed, modifying the flow of information, trust and the associations among various links in the production chain. Both direct and indirect ties facilitated access to remote networks, bringing in new information about new products, processes and market aspects. This interaction raised the innovation potential, causing a qualitative and quantitative improvement in the competitiveness of the organizations. Some embedded ties allowed the formation of partnerships, bringing various economic gains to those involved in the relationship. Thus, it becomes clear how aspects related to the position, architecture and quality of ties in a wide social network influenced the innovation process and the eventual trajectory of the jewelry industry.
Abstract:
This study proposes a computing mechanism capable of enabling tactile communication between individuals with visual impairment (blindness or low vision) through the Internet or through a local area network (LAN). The work was developed within the research projects currently carried out at the LAI (Laboratory of Integrated Accessibility) of the Federal University of Rio Grande do Norte. The research involved a prototype for geometry recognition, tested with blind students from the Institute of Education and Rehabilitation of the Blind of Rio Grande do Norte (IERC-RN), located in the Alecrim neighborhood of Natal/RN. In addition, another prototype was developed to test communication over a local network and the Internet. To analyze the data, a qualitative and quantitative approach was used, employing simple statistical techniques such as percentages and averages to support subjective interpretations. The results offer an analysis of the extent to which the implementation can contribute to the socialization and learning of the visually impaired. Finally, some recommendations are made for future research to improve the proposed mechanism.
Abstract:
In the last two decades of the past century, following the consolidation of the Internet as the worldwide computer network, applications generating more robust data flows started to appear. The increasing use of videoconferencing stimulated the creation of a new form of point-to-multipoint transmission called IP Multicast. Companies working on software and hardware development for network videoconferencing have adjusted their products and developed new solutions for the use of multicast. However, configuring such diverse solutions is not easy, especially when changes to the operating system are also required. Besides, the existing free tools have limited functionality, and the current commercial solutions are heavily dependent on specific platforms. With the maturity of IP Multicast technology and its inclusion in all current operating systems, object-oriented programming languages gained classes able to handle multicast traffic. Thus, with the help of Java APIs for networking, databases and hypertext, it became possible to develop an integrated environment able to handle multicast traffic, which is the main objective of this work. This document describes the implementation of that environment, which provides many functions for using and managing multicast traffic, functions that previously existed only in limited form and in few tools, normally commercial ones. The environment serves different kinds of users: common users who want to join multimedia Internet sessions, as well as more advanced users, such as engineers and network administrators, who may need to monitor and manage multicast traffic.
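The thesis builds on Java's multicast classes; purely for illustration, the sketch below shows the core operation such an environment performs, joining a multicast group and receiving datagrams, here written with Python's socket API. The group address and port are arbitrary example values.

```python
# Illustrative multicast group join and receive (Python stand-in for
# the Java APIs used in the thesis). Group and port are examples.
import socket
import struct

GROUP, PORT = "239.1.1.1", 5004  # administratively scoped multicast group

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))

# IP_ADD_MEMBERSHIP tells the kernel (and, via IGMP, the routers)
# that this host wants traffic sent to GROUP.
mreq = struct.pack("4s4s", socket.inet_aton(GROUP), socket.inet_aton("0.0.0.0"))
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

data, sender = sock.recvfrom(65536)  # blocks until a session packet arrives
print(f"{len(data)} bytes from {sender}")
```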
Abstract:
This work aims to understand how cloud computing fits into the government IT decision agenda, in light of the multiple streams model, considering the current status of public IT policies, the dynamics of agenda setting for the area, the interface between the various institutions, and existing initiatives on the use of cloud computing in government. To this end, a qualitative study was conducted through interviews with two groups: policy makers and IT managers. As analysis techniques, this work used content analysis and document analysis, with some results presented as word clouds. Among the main results is the overregulation of the area, usually scattered across various agencies of the federal government, which hinders managers' performance. A lack of knowledge of standards, government programs, regulations and guidelines was identified, notably a poor understanding of the TI Maior program, the perceived ineffectiveness of the National Broadband Plan in the respondents' view, and the influence of the Internet Landmark (Marco Civil da Internet) as an element that can hold back advances in the use of cloud computing in the Brazilian government. Also noteworthy is the bureaucratization of the acquisition of IT goods and services, which in many cases limits technological advances. Regarding the influence of the actors, it was not possible to identify the presence of a policy entrepreneur, and a lack of political force was noticed; the politics stream was affected only by changes within the government. Fragmentation was a major factor in weakening the formation of an agenda around the theme. Information security was pointed out by respondents as the main limitation, coupled with the lack of training of public servants. In terms of benefits, savings of resources stand out, followed by improved efficiency. Finally, the discussion about cloud computing needs to advance within the public sphere, whereas international experience is already far ahead, framing cloud computing as an element responsible for improving processes, services and the economy of public resources.
Abstract:
Cloud computing can be defined as a distributed computing model through which resources (hardware, storage, development platforms and communication) are shared as paid services, accessible with minimal management effort and interaction. A great benefit of this model is that it enables the use of multiple providers (i.e., a multi-cloud architecture) to compose a set of services in order to obtain an optimal configuration for performance and cost. However, multi-cloud usage is hindered by the problem of cloud lock-in: the dependency between an application and a cloud platform. It is commonly addressed by three strategies: (i) using an intermediate layer that stands between consumers of cloud services and the provider; (ii) using standardized interfaces to access the cloud; or (iii) using models with open specifications. This work outlines an approach for evaluating these strategies; applying it showed that, despite the advances they bring, none of them actually solves the cloud lock-in problem. In this sense, this work proposes the use of Semantic Web technologies to avoid cloud lock-in, where RDF models are used to specify the features of a cloud and are managed through SPARQL queries. In this direction, this work: (i) presents an evaluation model that quantifies the cloud lock-in problem; (ii) evaluates cloud lock-in across three multi-cloud solutions and three cloud platforms; (iii) proposes using RDF and SPARQL for the management of cloud resources; (iv) presents the Cloud Query Manager (CQM), a SPARQL server that implements the proposal; and (v) compares three multi-cloud solutions against CQM in terms of response time and effectiveness in resolving cloud lock-in.
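The RDF/SPARQL proposal can be illustrated with a short sketch using the rdflib library: offerings from different providers are described in one neutral RDF model and selected with a SPARQL query instead of provider-specific API calls. The vocabulary (ex:Instance, ex:provider, ex:vcpus) is invented for illustration and is not CQM's actual schema.

```python
# Minimal sketch of managing cloud resources as RDF + SPARQL (rdflib).
# The ex: vocabulary is hypothetical, not CQM's schema.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/cloud#")
g = Graph()

# Describe two VM offerings from different providers in one neutral model.
for name, provider, vcpus in [("vm1", "ProviderA", 2), ("vm2", "ProviderB", 4)]:
    node = EX[name]
    g.add((node, RDF.type, EX.Instance))
    g.add((node, EX.provider, Literal(provider)))
    g.add((node, EX.vcpus, Literal(vcpus)))

# A SPARQL query selects resources without provider-specific API calls.
results = g.query("""
    PREFIX ex: <http://example.org/cloud#>
    SELECT ?i ?p WHERE {
        ?i a ex:Instance ; ex:provider ?p ; ex:vcpus ?c .
        FILTER (?c >= 4)
    }""")
for instance, provider in results:
    print(instance, provider)
```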
Abstract:
Tangará da Serra is located in southwestern Mato Grosso, on the dispersion route of pollutants originating in the deforestation area of the Legal Amazon. The region also has a wide area of sugarcane cultivation, leaving the site highly exposed to atmospheric pollutants. The objective of this work was to evaluate the genotoxicity of three different concentrations of organic particulate matter collected from August through December 2008 in Tangará da Serra, using the micronucleus test in Tradescantia pallida (Trad-MCN). The levels of particulate matter smaller than 10 μm (PM10) and black carbon (BC) collected on the Teflon and polycarbonate filters were also determined, and the alkanes and polycyclic aromatic hydrocarbons (PAHs) in the samples from the burning period were identified and quantified by gas chromatography with flame ionization detection (GC-FID). The results of the alkane analysis indicate an anthropic influence. Among the PAHs, retene, an indicator of biomass burning, was found in the highest quantity. The compounds indeno(1,2,3-cd)pyrene and benzo(k)fluoranthene, considered potentially mutagenic and carcinogenic, were also identified in the samples. Using Trad-MCN, a significant increase in micronucleus frequency was observed during the burning period, a fact that may be related to the mutagenic PAHs found in these extracts. Comparing the period with fewer burnings to the negative control group, no significant difference in micronucleus rate was noted; for the period of intense burning, on the other hand, statistically significant differences were evident. This study showed that Trad-MCN was sensitive and efficient in evaluating the genotoxic potential of organic matter from biomass burning, and it emphasizes the importance of chemical composition analysis for a complete diagnosis in environmental risk control.
Abstract:
The transport of fluids through pipes is widely used in the oil industry, pipelines being an important link in the fluid logistics chain. However, pipeline walls deteriorate due to several factors, which may cause fluid loss to the environment, justifying investment in leak detection techniques and methods that minimize fluid loss and environmental damage. This work presents the development of a supervisory module intended to inform the operator of a leak in the monitored pipeline in the shortest possible time, so that the operator can trigger the procedure that stops the leak. This module is a component of a system designed to detect leaks in oil pipelines using sonic technology, wavelets and neural networks. The plant used in the development and testing of the module was the LAMP tank system, with its LAN as the monitoring network. The proposal basically consists of two stages: first, assessing the performance of the supervisory module's communication infrastructure; then, simulating leaks so that the DSP sends information to the supervisory module, which calculates the leak location and indicates which sensor the leak is closer to. Using the LAMP tank system, the pressure in the monitored pipeline is captured by piezoresistive sensors, processed by the DSP, and sent to the supervisory module to be presented to the user in real time.
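For intuition, the leak-location calculation mentioned above can be sketched from the arrival times of the leak-induced pressure wave at sensors placed at both ends of a monitored segment: a leak at distance x from sensor 1 satisfies t1 = x/v and t2 = (L - x)/v, hence x = (L + v(t1 - t2))/2. The values below are hypothetical; the actual system derives its timing information via wavelets and neural networks.

```python
# Hypothetical sketch of locating a leak from the arrival-time
# difference of its pressure wave at two end-point sensors.
def leak_position(length_m: float, wave_speed_ms: float,
                  t_sensor1_s: float, t_sensor2_s: float) -> float:
    """Distance of the leak from sensor 1, in meters.

    A leak at distance x from sensor 1 is heard at t1 = x / v and at
    t2 = (L - x) / v, so x = (L + v * (t1 - t2)) / 2.
    """
    return (length_m + wave_speed_ms * (t_sensor1_s - t_sensor2_s)) / 2.0

L, v = 500.0, 1200.0  # 500 m segment, ~1200 m/s acoustic wave (examples)
x = leak_position(L, v, t_sensor1_s=0.10, t_sensor2_s=0.30)
print(f"leak ~{x:.1f} m from sensor 1"
      f" (closer to sensor {1 if x < L / 2 else 2})")
```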
Abstract:
Advances in the Internet and telecommunications have been changing the concepts of Information Technology (IT), especially with regard to outsourced services, through which organizations seek cost cutting and a better focus on their core business. Along with the development of such outsourcing, a new model named Cloud Computing (CC) evolved, proposing to migrate both data processing and information storage to the Internet. Key points around Cloud Computing include cost cutting, benefits, risks and changes in IT paradigms. Nonetheless, adopting this model brings difficulties to decision-making by IT managers, mainly regarding which solutions may go to the cloud and which service providers are more appropriate to the organization's reality. The overall aim of this research is to apply the AHP (Analytic Hierarchy Process) method to decision-making in Cloud Computing. To that end, the methodology was exploratory, with a case study applied to a nationwide organization (Federation of Industries of RN). Data collection was performed through two structured questionnaires, answered electronically by IT technicians and by the company's Board of Directors. The data analysis was carried out in a qualitative and comparative way, using the AHP software Web-HIPRE. The results confirmed the importance of applying the AHP method to decision-making on the adoption of Cloud Computing, mainly because, at the time the research was carried out, the studied company already showed interest in and need for adopting CC, given the internal problems with infrastructure and availability of information that the company faces. The organization sought to adopt CC but had doubts regarding the cloud model and which service provider would best meet its real needs. The application of AHP thus worked as a guiding tool for choosing the best alternative, pointing to the Hybrid Cloud as the ideal choice for starting in Cloud Computing, with the following aspects: the Infrastructure as a Service (IaaS) layer (processing and storage) should stay partly in the Public Cloud and partly in the Private Cloud; the Platform as a Service (PaaS) layer (software development and testing) showed a preference for the Private Cloud; and the Software as a Service (SaaS) layer was divided, with e-mail going to the Public Cloud and applications to the Private Cloud. The research also identified the important factors in hiring a Cloud Computing provider.
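For reference, the core AHP computation applied in such a study reduces a pairwise comparison matrix to a priority vector and checks the consistency of the judgments. The matrix below is an invented example over the three cloud models, not the judgments actually collected at the studied organization.

```python
# Sketch of the AHP priority-vector calculation with invented judgments.
import numpy as np

alternatives = ["Public", "Private", "Hybrid"]
# A[i][j] = how strongly alternative i is preferred over j (Saaty's 1-9 scale).
A = np.array([[1.0, 1/3, 1/5],
              [3.0, 1.0, 1/3],
              [5.0, 3.0, 1.0]])

# Geometric-mean approximation of the principal eigenvector.
w = np.prod(A, axis=1) ** (1.0 / A.shape[0])
w /= w.sum()

# Consistency ratio: judgments should satisfy CR < 0.1 to be usable.
lam = (A @ w / w).mean()
n = A.shape[0]
CI = (lam - n) / (n - 1)
CR = CI / 0.58                      # 0.58 = random index for n = 3
for name, weight in zip(alternatives, w):
    print(f"{name}: {weight:.3f}")
print(f"CR = {CR:.3f}")
```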
Abstract:
This work presents a theoretical and numerical analysis of structures using frequency selective surfaces applied to patch antennas. The FDTD method is used to determine the reflected fields in the time domain. Applications of frequency selective surfaces and patch antennas cover a wide area of telecommunications, especially mobile communications, filters and wideband antennas. Scattering parameters are obtained from the Fourier transform of the transmitted and reflected fields in the time domain. A PML is used as the absorbing boundary condition, allowing the fields to be determined with little interference from reflections at the limits of the discretized space. Rectangular patches on a dielectric layer, fed by microstrip lines, are considered. Frequency selective surfaces with periodic and quasi-periodic structures are analyzed on both sides of the antenna. A literature review of the use of frequency selective surfaces in patch antennas is also presented. Numerical results are compared with measured return-loss results for the analyzed structures, and suggestions for continuing this work are given.
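The scattering-parameter extraction mentioned above can be sketched in a few lines: S11 is the ratio, in the frequency domain, of the Fourier transforms of the reflected and incident time-domain probes. The Gaussian pulses below are synthetic stand-ins for actual FDTD field output.

```python
# Sketch of extracting S11 from time-domain field probes via the FFT.
# The Gaussian pulses stand in for real FDTD output.
import numpy as np

dt = 1e-12                       # FDTD time step (s), example value
t = np.arange(4096) * dt
incident = np.exp(-((t - 200 * dt) / (40 * dt)) ** 2)          # incident probe
reflected = -0.3 * np.exp(-((t - 600 * dt) / (40 * dt)) ** 2)  # reflected probe

freqs = np.fft.rfftfreq(t.size, dt)
S11 = np.fft.rfft(reflected) / np.fft.rfft(incident)  # frequency-domain ratio
s11_db = 20 * np.log10(np.abs(S11) + 1e-12)

band = (freqs > 1e9) & (freqs < 10e9)                 # report 1-10 GHz
print(f"min |S11| in band: {s11_db[band].min():.1f} dB")
```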
Abstract:
In Simultaneous Localization and Mapping (SLAM), a robot placed at an unknown location in an arbitrary environment must be able to build a representation of this environment (a map) and localize itself within it simultaneously, using only information captured by the robot's sensors and known control signals. Recently, driven by the advance of computing power, work in this area has proposed using a video camera as the sensor, giving rise to Visual SLAM. There are several approaches to it, the vast majority of which work by extracting features from the environment, computing the necessary correspondences, and through these estimating the required parameters. This work presents a monocular Visual SLAM system that uses direct image registration to compute the image reprojection error, and optimization methods that minimize this error, thereby obtaining the parameters of the robot pose and the environment map directly from the pixels of the images. In this way, the feature extraction and matching steps are not needed, enabling our system to work well in environments where traditional approaches have difficulty. Moreover, by addressing the SLAM problem as proposed in this work, we avoid a very common problem of traditional approaches, known as error propagation. Concerned with the high computational cost of this approach, we tested several types of optimization methods in order to find a good balance between estimate quality and processing time. The results presented in this work show the success of this system in different environments.
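A toy version of the direct-registration idea helps to see the contrast with feature-based methods: the warp parameters (here reduced to a 2-D image translation) are estimated by minimizing the photometric error computed straight from pixel intensities. A real monocular system optimizes a 6-DoF pose and scene structure; this sketch only shows the principle.

```python
# Toy direct image registration: estimate a translation by minimizing
# photometric error, with no feature extraction or matching.
import numpy as np
from scipy.ndimage import shift as warp_shift
from scipy.optimize import minimize

# Synthetic smooth "reference frame": a Gaussian blob.
x, y = np.meshgrid(np.arange(64), np.arange(64), indexing="ij")
reference = np.exp(-((x - 32.0) ** 2 + (y - 32.0) ** 2) / 80.0)
# "Current frame": the reference translated by an unknown amount.
true_t = np.array([3.0, -2.0])
current = warp_shift(reference, true_t, order=1, mode="nearest")

def photometric_error(t):
    # Warp the current frame back by the candidate translation and
    # compare intensities pixel by pixel.
    warped = warp_shift(current, -t, order=1, mode="nearest")
    return float(np.mean((warped - reference) ** 2))

result = minimize(photometric_error, x0=np.zeros(2), method="Nelder-Mead")
print("estimated translation:", result.x.round(2))  # ~ [ 3. -2.]
```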
Abstract:
This work presents an application of a hybrid Fuzzy-ELECTRE-TOPSIS multicriteria approach to a Cloud Computing service selection problem. The research was exploratory, using a case study based on the actual requirements of professionals in the field of Cloud Computing. The results were obtained by conducting an experiment aligned with the case study, using the distinct profiles of three decision makers; the Fuzzy-TOPSIS and Fuzzy-ELECTRE-TOPSIS methods were applied and their results compared. The solution incorporates fuzzy set theory so that it can handle imprecise or subjective information, thus facilitating the interpretation of the decision makers' judgments in the decision-making process. The results show that both methods were able to rank the problem's alternatives as expected, but the Fuzzy-ELECTRE-TOPSIS method was able to attenuate the compensatory character of the Fuzzy-TOPSIS method, resulting in a different ranking of alternatives. This attenuation of the compensatory character stood out positively in ranking the alternatives, because it prioritized more balanced alternatives than the Fuzzy-TOPSIS method, a factor that proved important in the validation of the case study, since, when composing a mix of services under constraints, balanced alternatives form a more consistent mix.
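The TOPSIS half of the hybrid method can be sketched compactly: ratings are triangular fuzzy numbers, and alternatives are ranked by their closeness coefficient, i.e., their relative distance to the fuzzy positive and negative ideal solutions. The ratings below are invented, and the ELECTRE outranking step that attenuates compensation is omitted for brevity.

```python
# Compact sketch of Fuzzy-TOPSIS with triangular fuzzy numbers (l, m, u).
# Ratings are invented; the ELECTRE outranking step is omitted.
import numpy as np

# ratings[alternative][criterion] = (l, m, u), already weighted/normalized
ratings = np.array([
    [[0.5, 0.7, 0.9], [0.3, 0.5, 0.7]],   # alternative A
    [[0.1, 0.3, 0.5], [0.7, 0.9, 1.0]],   # alternative B
])

fpis = ratings.max(axis=0)   # fuzzy positive ideal solution, per criterion
fnis = ratings.min(axis=0)   # fuzzy negative ideal solution

def fuzzy_dist(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    # Vertex distance between triangular fuzzy numbers.
    return np.sqrt(((a - b) ** 2).mean(axis=-1))

d_plus = fuzzy_dist(ratings, fpis).sum(axis=1)   # distance to ideal
d_minus = fuzzy_dist(ratings, fnis).sum(axis=1)  # distance to anti-ideal
closeness = d_minus / (d_plus + d_minus)

for name, cc in zip("AB", closeness):
    print(f"alternative {name}: CC = {cc:.3f}")
```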