17 results for software industry
at the Universidade Federal do Rio Grande do Norte (UFRN)
Abstract:
Several authors have pointed to the need to understand the process of technological structuring in contemporary firms. From this perspective, the software industry is a very important element because it provides products and services directly to organizations in many fields. The Brazilian software industry, in particular, has peculiarities that distinguish it from industries located in developed countries, which makes understanding it even more relevant. There is evidence that local firms adopt different strategies and structural configurations to enter a market naturally dominated by large multinational firms. This study therefore aims to understand not only the structural configurations assumed by domestic firms but also the dynamics and the process that lead to these different configurations. To do so, this PhD dissertation investigates the institutional environment, its entities and the isomorphic movements, by employing an exploratory, descriptive and explanatory multiple-case study. Eight software development companies from Recife's information technology cluster were visited; at each one, a questionnaire was administered and an interview with one of the firm's key professionals was conducted. Although the study is predominantly qualitative, part of the data was analyzed through charts and graphs, providing an overview of the companies and their environment that supported the analysis carried out through interpretation of the interviews. As a result, it was found that the companies are structured around hybrid business models derived from two ideal types of software development company: the software factory and the technology-based company. Regarding the development process, a balanced distribution between the traditional and agile development paradigms was found. Among the traditional methodologies, the Rational Unified Process (RUP) is predominant; Scrum is the most widely used methodology among the organizations based on the Agile Manifesto's principles. Regarding the structuring process, each institutional entity acts in ways that generate different isomorphic pressures. Emphasis was given to entities such as customers, research agencies, clusters, market-leading businesses, public universities, incubators, software industry organizations, technology vendors, development tool suppliers and the managers' education and background, because they relate closely to the software firms. In this relationship, a dual, bilateral influence was found. Finally, the structuring level of the organizational field was identified as low, which gives organizational actors room to act independently.
Abstract:
Nowadays, the importance of using software processes is well established and considered fundamental to the success of software development projects. Large and medium software projects demand the definition and continuous improvement of software processes in order to promote the productive development of high-quality software. Customizing and evolving existing software processes to address the variety of scenarios, technologies, cultures and scales is a recurrent challenge in the software industry: it involves adapting software process models to the reality of each project, and it must also promote the reuse of past experience in defining and developing software processes for new projects. Adequate management and execution of software processes can bring better quality and productivity to the software systems produced. This work explores the use and adaptation of consolidated software product line techniques to manage the variabilities of software process families. To achieve this aim: (i) a systematic literature review was conducted to identify and characterize variability management approaches for software processes; (ii) an annotative approach for variability management in software process lines was proposed and developed; and (iii) empirical studies and a controlled experiment assessed and compared the proposed annotative approach against a compositional one. One study, a comparative qualitative study, analyzed the annotative and compositional approaches from different perspectives, such as modularity, traceability, error detection, granularity, uniformity, adoption, and systematic variability management. Another study, a comparative quantitative study, considered internal attributes of software process line specifications, such as modularity, size and complexity. Finally, the last study, a controlled experiment, evaluated the effort required to use, and the understandability of, the investigated approaches when modeling and evolving software process line specifications. The studies provide evidence of several benefits of the annotative approach, and of its potential for integration with the compositional approach, in supporting the variability management of software process lines.
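To make the annotative idea concrete, here is a minimal sketch in Python, with invented activity names and feature flags; it is not the dissertation's actual notation or tooling. Each process activity carries a presence condition over features, and a concrete process variant is derived by evaluating every condition against a feature selection.

```python
# Hypothetical sketch of annotation-based variability in a software process
# line: activities carry presence conditions over features, and a concrete
# process variant is derived by evaluating each condition against a selection.
# Activity and feature names are invented for illustration.

from dataclasses import dataclass

FEATURES = {"agile", "regulated"}

@dataclass
class Activity:
    name: str
    condition: str  # boolean expression over feature names

PROCESS_LINE = [
    Activity("Requirements workshop", "True"),
    Activity("Sprint planning", "agile"),
    Activity("Formal change control board", "not agile"),
    Activity("Regulatory compliance audit", "regulated"),
]

def derive_variant(activities, selection):
    """Keep only activities whose presence condition holds for the selection."""
    env = {f: (f in selection) for f in FEATURES}
    return [a for a in activities if eval(a.condition, {"__builtins__": {}}, env)]

if __name__ == "__main__":
    for activity in derive_variant(PROCESS_LINE, {"agile"}):
        print(activity.name)  # Requirements workshop, Sprint planning
```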
Abstract:
In recent years, the academic community and the software industry have shown substantial interest in approaches and technologies related to model-driven development (MDD). At the same time, industry continues its relentless pursuit of technologies that raise productivity and quality in the development of software products. This work explores these two trends through an experiment carried out with MDD technology, evaluating its use in solving a real problem in the security context of enterprise systems. By building and using a tool, a visual DSL named CALV3, inspired by the software factory approach (a synergy between software product lines, domain-specific languages and MDD), we evaluate the gains in abstraction and productivity through a systematic case study conducted with a development team. The results and lessons learned from evaluating this tool in industry are the main contributions of this work.
Abstract:
Software product line engineering promotes large-scale software reuse by developing a system family that shares a set of core features and enables the selection and customization of a set of variabilities that distinguish each product of the family from the others. To address time-to-market pressures, the software industry has been using the clone-and-own technique to create and manage new software products or product lines. Despite its advantages, the clone-and-own approach brings several difficulties for the evolution and reconciliation of software product lines, especially because of the code conflicts generated by the simultaneous evolution of the original software product line, called Source, and its cloned products, called Target. This thesis proposes an approach to evolve and reconcile cloned products based on mining software repositories and code conflict analysis techniques. The approach supports the identification of the different kinds of code conflicts (lexical, structural and semantic) that can occur when integrating development tasks (bug fixes, enhancements and new use cases) from the original, evolved software product line into the cloned product line. We also conducted an empirical study characterizing the code conflicts produced during the evolution and merging of two large-scale web information system product lines. The results of our study demonstrate the approach's potential to automatically or semi-automatically resolve several existing code conflicts, thus helping to reduce the complexity and cost of reconciling cloned software product lines.
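As a rough illustration of the simplest of the three conflict kinds, the Python sketch below (with invented file names and line ranges; the thesis's actual analysis is far richer) flags a lexical conflict whenever a Source task and a Target change touch overlapping line ranges of the same file.

```python
# Hypothetical illustration of detecting *lexical* conflicts when integrating
# a task from the Source product line into a cloned Target: two change sets
# that touch overlapping line ranges of the same file cannot merge textually.

def edited_ranges(changes):
    """changes: list of (path, first_line, last_line) tuples for one branch."""
    by_file = {}
    for path, lo, hi in changes:
        by_file.setdefault(path, []).append((lo, hi))
    return by_file

def lexical_conflicts(source_changes, target_changes):
    src, tgt = edited_ranges(source_changes), edited_ranges(target_changes)
    conflicts = []
    for path in src.keys() & tgt.keys():
        for s_lo, s_hi in src[path]:
            for t_lo, t_hi in tgt[path]:
                if s_lo <= t_hi and t_lo <= s_hi:  # line ranges overlap
                    conflicts.append((path, max(s_lo, t_lo), min(s_hi, t_hi)))
    return conflicts

if __name__ == "__main__":
    source = [("billing/Invoice.java", 10, 25)]  # bug fix in Source
    target = [("billing/Invoice.java", 20, 40)]  # enhancement in Target clone
    print(lexical_conflicts(source, target))     # [('billing/Invoice.java', 20, 25)]
```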
Abstract:
Recently, the focus given to Web Services and Semantic Web technologies has led to several research projects addressing the Web service composition problem in different ways. Meanwhile, creating an environment in which an abstract business process can be specified and then automatically and dynamically realized by a composite service remains an open problem. WSDL and BPEL, the industry standards, support only manual service composition because they lack the semantics needed for Web services to be discovered, selected and combined by software agents. Service ontologies provided by the Semantic Web enrich the syntactic descriptions of Web services to facilitate the automation of tasks such as discovery and composition. This work presents an environment for specifying and executing ad hoc Web-service-based business processes, named WebFlowAH. WebFlowAH employs a common domain ontology to describe both Web services and business processes, and allows processes to be specified in terms of user goals or desires expressed with the concepts of that ontology. This approach allows processes to be specified at a high level of abstraction, freeing the user from the underlying details needed to actually run the process workflow.
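A minimal sketch of the goal-driven idea, assuming a toy travel ontology and invented service names (WebFlowAH's actual machinery is not described in the abstract): each service advertises the ontology concepts it consumes and produces, and a plan is chained forward until the user's goal concept is covered.

```python
# Hypothetical sketch of goal-driven service composition over a shared domain
# ontology: each service advertises input/output concepts; a plan is built by
# chaining services whose outputs satisfy the concepts still needed.

SERVICES = {
    "QuoteFlight":  ({"Itinerary"}, {"FlightOffer"}),
    "BookFlight":   ({"FlightOffer", "Payment"}, {"Ticket"}),
    "IssueInvoice": ({"Ticket"}, {"Invoice"}),
}

def compose(goal_concepts, known_concepts):
    """Greedy forward chaining: apply any service whose inputs are all known."""
    plan, known = [], set(known_concepts)
    progress = True
    while progress and not goal_concepts <= known:
        progress = False
        for name, (inputs, outputs) in SERVICES.items():
            if name not in plan and inputs <= known:
                plan.append(name)
                known |= outputs
                progress = True
    return plan if goal_concepts <= known else None

if __name__ == "__main__":
    print(compose({"Invoice"}, {"Itinerary", "Payment"}))
    # ['QuoteFlight', 'BookFlight', 'IssueInvoice']
```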
Abstract:
The general objective of this thesis was the seasonal monitoring (on a quarterly time scale) of the coastal and estuarine areas of a stretch of the northern coast of Rio Grande do Norte, Brazil, an environmentally sensitive area with intense sediment erosion and oil industry activity, in order to underpin projects for erosion containment and for mitigating the impacts of coastal dynamics. To achieve this general objective, the work was carried out systematically in three stages, which constituted the specific objectives. The first stage was the implementation of the geodetic reference infrastructure for the geodetic surveying of the study area. This included the implementation of the RGLS (Northern Coast of RN GPS Network), consisting of stations with precise geodetic coordinates and orthometric heights; the positioning of benchmarks and the evaluation of the available gravimetric geoid, for use in precise GPS altimetry; and the development of software for precise GPS altimetry. The second stage was the development and improvement of methodologies for the collection, processing, representation, integration and analysis of coastlines (CL) and Digital Elevation Models (DEM) obtained by geodetic positioning techniques. This stage included the choice of equipment and positioning methods, according to the required precision and the implanted infrastructure, and the definition of the CL indicator and the geodetic references best suited to precise coastal monitoring. The third stage was the seasonal geodetic monitoring of the study area. The execution times of the geodetic surveys were defined by analyzing the pattern of sediment dynamics of the study area; surveys were performed to calculate and locate the areas and volumes of erosion and accretion (areal and volumetric sediment balance) occurring on the CL and on the beach and island surfaces throughout the year; and the correlations between the variations measured (in area and volume) between successive surveys and the action of coastal dynamic agents were studied. The results allowed an integrated study of the spatial and temporal interrelationships of the causes and consequences of the intense coastal processes operating in the area, especially the measurement of the variability of sediment erosion, transport, balance and supply over the annual cycle of construction and destruction of the beaches. In analyzing the results, it was possible to identify the causes and consequences of the severe coastal erosion on exposed beaches and to analyze the recovery of beaches and the accretion occurring in tidal inlets and estuaries. From the perspective of the seasonal variations in the CL, human interventions for erosion containment were proposed with the aim of restoring the previous state of the beaches undergoing erosion.
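The volumetric sediment balance between two surveys reduces to differencing the corresponding DEMs. The sketch below (Python with NumPy, on synthetic toy grids rather than the thesis's survey data) integrates the elevation change over the grid cell area and separates accretion from erosion.

```python
# Hypothetical sketch of the volumetric sediment balance between two surveys:
# subtract two gridded Digital Elevation Models and integrate the elevation
# change over the cell area, splitting accretion (positive) from erosion.

import numpy as np

def sediment_balance(dem_t1, dem_t2, cell_area_m2):
    """Return (accretion_m3, erosion_m3, net_m3) between two DEM grids."""
    dz = dem_t2 - dem_t1                          # elevation change per cell (m)
    accretion = dz[dz > 0].sum() * cell_area_m2
    erosion = -dz[dz < 0].sum() * cell_area_m2
    return accretion, erosion, accretion - erosion

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    before = rng.uniform(0.0, 3.0, (100, 100))          # toy 100 x 100 beach grid
    after = before + rng.normal(0.0, 0.1, (100, 100))   # simulated next survey
    print(sediment_balance(before, after, cell_area_m2=1.0))
```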
Abstract:
This work addresses the theme of environmental management, on the basis of the ISO 14001 standard and the learning organization concept. The study was carried out as an exploratory survey in a fuel transport company located in Natal/RN. The objective of this research was to investigate the environmental management practices carried out within the ISO 14001 environmental management system implemented in the researched organization, from the perspective of the learning organization. The methodology is quantitative, combining the exploratory and descriptive types, and uses questionnaires, having as the scope of the research the company's managers, coordinators, supervisors and employees, both in-house and contracted. The data were analyzed using Excel and Statistica 6.0, in two parts: descriptive analysis and cluster analysis. On the basis of the theory studied, as well as the results of the research, the ISO 14001 environmental management system implemented in the researched organization presents elements that promote the learning organization. From the results, it can be concluded that the company uses external information in decision making on environmental problems, that employees are mobilized to generate ideas and collect environmental information, and that the company has formed partnerships with other companies for activities in the environmental area. All of these practices can contribute to the generation of organizational knowledge. It can also be concluded that the company has evaluated past occurrences of environmental errors and has carried out environmental benchmarking, both good ways for a company to acquire knowledge. The results also show that employees have no difficulty accomplishing their tasks when their sector manager is absent, which suggests that the company has good knowledge diffusion.
Abstract:
Due to industry's current need to integrate production data originating from several sources and to transform it into information useful for decision making, there is an ever-growing demand for information visualization systems that support this functionality. At the same time, given the market's high competitiveness, it is now common practice to develop industrial systems with characteristics of modularity, distribution, flexibility, scalability, adaptability, interoperability, reusability and web access. These characteristics provide extra agility and make it easier to adapt to frequent changes in market demand. Based on the arguments above, this work specifies a component-based architecture, with the corresponding development of a system based on that architecture, for the visualization of industrial data. The system was conceived to supply on-line information and, optionally, historical information on production variables. This work shows that the component-based architecture developed has the requirements necessary for a robust, reliable and easily maintained system, and is thus in line with industrial needs. The architecture also allows components to be added, removed or updated at run time, through a web-based component manager, further streamlining the process of adapting and updating the system.
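A minimal sketch of that run-time behavior, assuming invented component names and a far simpler rendering interface than a real industrial system would use:

```python
# Hypothetical sketch of the run-time component management the architecture
# describes: visualization components can be registered, replaced or removed
# while the system keeps serving data, without a restart.

from typing import Callable, Dict, List

class ComponentManager:
    def __init__(self) -> None:
        self._components: Dict[str, Callable[[dict], str]] = {}

    def add(self, name: str, render: Callable[[dict], str]) -> None:
        self._components[name] = render          # re-adding a name hot-updates it

    def remove(self, name: str) -> None:
        self._components.pop(name, None)

    def render_all(self, sample: dict) -> List[str]:
        return [render(sample) for render in self._components.values()]

if __name__ == "__main__":
    mgr = ComponentManager()
    mgr.add("gauge", lambda s: f"temperature gauge: {s['temp']} C")
    mgr.add("trend", lambda s: f"pressure trend point: {s['press']} bar")
    print(mgr.render_all({"temp": 81.5, "press": 12.3}))
    mgr.remove("trend")                          # removed at run time
    print(mgr.render_all({"temp": 82.0, "press": 12.1}))
```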
Abstract:
In this research we investigated how new technologies can help the design and manufacture of furniture in small manufacturers in the state of Rio Grande do Norte. Google SketchUp, a 3D modeling tool, opens its internal structures through the SketchUp API for Ruby, which can be accessed by programs written in the Ruby language (plugins). Using the concepts of Group Technology and the flexibility of adding new functionality to this software, a Furniture Modeling Methodology, a Coding System and a plugin for Google's tool implementing the methodology were created. As a result, the following facilities are available: users may create and repeatedly reuse the library's models; reports on material and manufacturing process costs are provided; and detailed drawings are generated, achieving better integration between furniture design and the manufacturing process.
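To illustrate the Group Technology idea behind such a coding system, here is a sketch in Python with invented attribute tables; the dissertation's actual coding scheme is not reproduced. Each digit of the code captures one design attribute, so similar parts receive similar codes and can be grouped into families for reuse.

```python
# Hypothetical sketch of a Group Technology coding scheme for furniture parts.
# Attribute tables and digit layout are invented for illustration only.

PART_TYPE = {"panel": "1", "shelf": "2", "door": "3", "drawer_front": "4"}
MATERIAL  = {"mdf": "1", "plywood": "2", "solid_wood": "3"}
EDGE      = {"raw": "0", "banded": "1", "profiled": "2"}

def gt_code(part_type: str, material: str, edge: str, thickness_mm: int) -> str:
    """Compose a 5-digit GT code: type | material | edge | thickness class."""
    thickness_class = f"{min(thickness_mm // 10, 9):02d}"  # 2-digit bucket
    return PART_TYPE[part_type] + MATERIAL[material] + EDGE[edge] + thickness_class

if __name__ == "__main__":
    print(gt_code("shelf", "mdf", "banded", 18))  # -> "21101"
```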
Abstract:
In this work, an information system was developed to apply the concepts of 3D CAD-BIM technology to the design activities of the furniture industry. The system was built on an architecture comprising two modules: a web interface for managing the metadata of the models in a furniture library, and the combination of three-dimensional CAD software with a specific plugin to access the information in these models. A Database Management System (DBMS) was also used, designed to store the model information hierarchically, based on the concepts of Group Technology (GT). The centralization of information in a single database makes any change automatically available to all participants involved in a given project as soon as it happens; each module of the system has its own connection to this database. Finally, a prototype 3D virtual environment was developed to help create Virtual Reality projects on the web, supported by a study of the available technologies for building 3D applications that run in websites. The interconnection between the modules and the database allowed the assembly of a system architecture supporting the construction and exhibition of furniture industry projects in accordance with the concepts proposed by BIM (Building Information Modeling), using the furniture industry of the state of Rio Grande do Norte as the object of study.
Abstract:
This work presents a design method proposed to build software components rigorously, from the software functional model down to the assembly code level. The method is based on the B method, which was developed with the support and interest of British Petroleum (BP). One goal of this methodology is to contribute to solving an important problem known as the Verifying Compiler. In addition, this work describes a formal model of the Z80 microcontroller and of a real system from the petroleum field. The formal model of the Z80 was developed and documented because it is a key component for verification down to the assembly level. To refine the methodology, it was applied to a petroleum production test system, which is presented in this work. Part of the technique is performed manually; however, most of these activities can be automated by a specific compiler, and the formal model of the microcontroller and the model of the production test system should provide relevant knowledge and experience for the design of such a compiler. In summary, this work aims to improve the viability of one of the most stringent criteria of formal verification: speeding up the verification process, reducing design time, and increasing the quality and reliability of the final software product. All of these qualities are very important for systems that involve serious risks or require high confidence, which is very common in the petroleum industry.
Abstract:
The principal effluent of the oil industry is produced water, which is commonly associated with the produced oil. It is generated in large volumes and can affect the environment and society if discharged inappropriately, so establishing and maintaining careful management of it is indispensable. The traditional treatment of produced water usually includes two techniques: flocculation and flotation. In flocculation, the traditional flocculant agents are poorly specified in technical data sheets and are still expensive. Flotation is the step in which the particles suspended in the effluent are separated. Dissolved air flotation (DAF) is a technique that has been consolidating itself economically and environmentally, showing great reliability compared with other processes, and it is widely used in various fields of water and wastewater treatment around the globe. In this regard, this study aimed to evaluate the potential of an alternative natural flocculant agent based on Moringa oleifera to reduce the total oil and grease (TOG) content of produced water from the oil industry by the flocculation/DAF method. The natural flocculant was evaluated for its efficacy, as well as for its efficiency compared with two commercial flocculants normally used by the petroleum industry. The experiments followed an experimental design, and the overall efficiencies of all flocculants were treated statistically using the STATISTICA 10.0 software. Contour surfaces were obtained from the experimental design and interpreted in terms of the response variable, TOG removal efficiency. The design also yielded mathematical models for calculating the response variable under the studied conditions. The commercial flocculants showed similar behavior, with an average overall oil removal efficiency of 90%, so the economic analysis is the decisive factor in choosing between them. The natural alternative flocculant based on Moringa oleifera showed lower separation efficiency than the commercial ones (70% on average); on the other hand, it causes less environmental impact and is less expensive.
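The response variable itself is a one-line calculation; a minimal sketch with illustrative concentrations (not the study's measured data):

```python
# Hypothetical sketch of the response variable used in the study: TOG removal
# efficiency, computed from inlet and outlet oil-and-grease concentrations.
# The sample figures below are illustrative, not the dissertation's data.

def tog_removal_efficiency(tog_in_mg_l: float, tog_out_mg_l: float) -> float:
    """Percentage of total oil and grease removed by flocculation/DAF."""
    return 100.0 * (tog_in_mg_l - tog_out_mg_l) / tog_in_mg_l

if __name__ == "__main__":
    # e.g. 300 mg/L in the raw produced water, 90 mg/L after treatment -> 70%
    print(f"{tog_removal_efficiency(300.0, 90.0):.1f}% removal")
```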
Abstract:
The oil industry's need to ensure the safety of its facilities, employees and the environment, together with the search for maximum efficiency of its facilities, leads it to pursue a high level of excellence at every stage of its production processes in order to obtain the required quality of the final product. Knowing the reliability of equipment, and what it represents for a system, is of fundamental importance for ensuring operational safety. Reliability analysis has been increasingly applied in the oil industry as a tool for predicting faults and undesirable events that can affect business continuity. It is an applied scientific methodology that combines engineering and statistics to assess and analyze the performance of components, equipment and systems, in order to ensure that they perform their function without failure, for a given period of time and under specified conditions. The results of reliability analyses help in deciding on the best maintenance strategy for petrochemical plants. In this work, reliability analysis was applied to a motor-driven centrifugal fan over the period 2010-2014 at the Petrobras Guamaré industrial complex, situated in the rural part of the municipality of Guamaré, in the state of Rio Grande do Norte; field data were collected, the equipment history was analyzed, and the behavior of failures and their impacts were observed. The data were processed in the commercial reliability software ReliaSoft BlockSim 9, and the results were compared with a study conducted by experts in the field in order to obtain the best maintenance strategy for the system studied. The reliability analysis tools made it possible to determine the availability of the motor-driven centrifugal fan and its impact on the safety of the process units should it fail. A new maintenance strategy was established to improve reliability, availability and maintainability and to decrease the likelihood of failures of the motor-driven centrifugal fan: a series of actions to increase system reliability and consequently extend the asset's life cycle. The strategy sets out preventive measures to reduce the probability of failure and mitigating measures aimed at minimizing its consequences.
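Two quantities central to such an analysis can be sketched directly; the parameter values below are illustrative, not the fitted values from the BlockSim study:

```python
# Hypothetical sketch of two standard reliability quantities: the Weibull
# reliability function R(t) = exp(-(t/eta)^beta) and steady-state availability
# A = MTBF / (MTBF + MTTR). All parameter values are illustrative.

import math

def weibull_reliability(t_hours: float, beta: float, eta_hours: float) -> float:
    """Probability the equipment survives to time t without failure."""
    return math.exp(-((t_hours / eta_hours) ** beta))

def steady_state_availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Fraction of time the equipment is operational in the long run."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

if __name__ == "__main__":
    print(f"R(8760 h) = {weibull_reliability(8760, beta=1.3, eta_hours=20000):.3f}")
    print(f"A = {steady_state_availability(mtbf_hours=4000, mttr_hours=24):.4f}")
```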
Abstract:
Soft skills and teamwork practices have been identified as the main deficiencies of recent graduates of computing courses. This issue led to a qualitative study aimed at investigating the challenges faced by professors of those courses in conducting, monitoring and assessing collaborative software development projects. Professors reported different challenges, including difficulties in assessing students at both the collective and individual levels. In this context, a quantitative study was conducted with the aim of mapping students' soft skills to a set of indicators that can be extracted from software repositories using data mining techniques. These indicators are aimed at measuring soft skills such as teamwork, leadership, problem solving and the pace of communication. A peer assessment approach was then applied in a collaborative software development course of the software engineering program at the Federal University of Rio Grande do Norte (UFRN). This research presents a correlation study between the students' soft skill scores and indicators based on mining software repositories. This study contributes: (i) by presenting professors' perceptions of the difficulties and opportunities for improving management and monitoring practices in collaborative software development projects; (ii) by investigating relationships between soft skills and the activities performed by students, using software repositories; (iii) by encouraging the development of soft skills and the use of software repositories among software engineering students; and (iv) by contributing to the state of the art of three important areas of software engineering, namely software engineering education, educational data mining, and human aspects of software engineering.
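As a rough illustration of the correlation study's mechanics, the sketch below correlates one hypothetical repository-mined indicator (commit counts, invented here) with peer-assessed teamwork scores; all numbers are illustrative, not the thesis data.

```python
# Hypothetical sketch of correlating a repository-mined indicator with peer-
# assessed soft skill scores, in the spirit of the study. The indicator chosen
# (commits per student) and every number below are illustrative only.

from scipy.stats import spearmanr

# Per-student indicator mined from the project repository (e.g. commit count)
commits = [42, 17, 55, 8, 31, 26]
# Matching peer-assessment scores for "teamwork" on a 1-5 scale
teamwork_scores = [4.2, 3.1, 4.8, 2.5, 3.9, 3.6]

rho, p_value = spearmanr(commits, teamwork_scores)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
```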