26 results for Tool switch minimization problem

at Universidade Federal do Rio Grande do Norte (UFRN)


Relevance:

100.00%

Publisher:

Abstract:

The topology optimization problem characterizes and determines the optimum distribution of material over a domain. In other words, once the boundary conditions are defined on a pre-established domain, the problem is how to distribute the material so as to solve the minimization problem. The objective of this work is to propose a competitive formulation for determining optimum structural topologies in 3D problems, one able to provide high-resolution layouts. The procedure combines the Galerkin Finite Element Method with an optimization method, looking for the best material distribution over the fixed design domain. The layout topology optimization method is based on the material approach proposed by Bendsoe & Kikuchi (1988) and considers a homogenized constitutive equation that depends only on the relative density of the material. The finite element used is a four-node tetrahedron with a selective integration scheme, which interpolates not only the components of the displacement field but also the relative density field. The proposed procedure consists in solving a sequence of layout optimization problems applied to compliance minimization problems and to mass minimization problems under local stress constraints. The microstructure used in this procedure was SIMP (Solid Isotropic Material with Penalization). The approach considerably reduces the computational cost and proved to be efficient and robust. The results provided well-defined structural layouts, with a sharp distribution of material and well-defined boundaries. The layout quality was proportional to the mean element size, and a considerable reduction in the number of design variables was observed thanks to the tetrahedral element.
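For reference, a standard statement of SIMP-penalized compliance minimization (a generic textbook form; the abstract does not spell out the exact formulation adopted in the thesis) is

\[
\min_{\boldsymbol{\rho}} \; c(\boldsymbol{\rho}) = \mathbf{f}^{T}\mathbf{u}(\boldsymbol{\rho})
\quad \text{s.t.} \quad
\mathbf{K}(\boldsymbol{\rho})\,\mathbf{u} = \mathbf{f}, \qquad
\sum_{e} \rho_{e}\,v_{e} \le V_{\max}, \qquad
0 < \rho_{\min} \le \rho_{e} \le 1,
\]

where each element stiffness is interpolated from its relative density as \(E_{e} = \rho_{e}^{\,p} E_{0}\), the penalization exponent \(p > 1\) driving intermediate densities toward 0 or 1.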

Relevance:

30.00%

Publisher:

Abstract:

Because of social exclusion in Brazil, and with digital inclusion as its focus, a project was started at the Federal University of Rio Grande do Norte to address, at the same time, concepts of collaborative learning and educational robotics, focused on digitally excluded children. In this context, a methodology was created that approaches many subjects, from technological elements (e.g. informatics and robotics) to school subjects (e.g. Portuguese, Mathematics, Geography, History), contextualized in everyday situations. We observed educational concepts of collaborative learning and the development of capacities in those students, such as group work, logical knowledge, and learning ability. This work proposes an educational software for robotics teaching called RoboEduc, created to be used by digitally excluded children from primary school. Its design prioritizes a friendly interface that makes the concepts of robotics and programming easy and fun to teach. With this new tool, users without previous knowledge of informatics or robotics are able to control a robot, previously assembled with Lego kits, or even program it to carry out some activities. This work covers the implementation of the second version of the software, which already provides control of the robot; afterwards, the different programming levels, linked to the users' different learning levels, were implemented, each with its own interfaces and functions. A third version is currently being implemented, improving each of the stages mentioned. In order to validate, prove, and test the efficiency of the methodology developed for RoboEduc, experiments were carried out, through robotics practice, with fourth- and fifth-grade primary school children at the municipal school Professor Ascendino de Almeida, on the outskirts of Natal (west zone), Rio Grande do Norte. As a preliminary result, we verified that the use of robots combined with well-designed software can be extended to users who know very little about the subject, without the need for previous advanced technological knowledge. Robots and software therefore proved to be accessible and efficient tools in the process of digital inclusion.

Relevance:

30.00%

Publisher:

Abstract:

As hardware and software technologies advance, the development models of computational systems are also changing. New methodologies for user interface specification are being created around user interface description languages (UIDLs). UIDLs are a way to obtain a precise description in a more abstract language, independent of how the interface will be implemented. A great problem is that, even with these current methodologies, there is still a large distance between a UIDL and its design, that is, between the abstract and the concrete. The BRIDGE tool (Interface Design Generator Environment) was created with the intention of being a bridge between a specification language (the Interactive Message Modeling Language, IMML) and its implementation in Java, linking the abstract (specification) to the concrete (implementation). IMML is a model-based language that allows the designer to work at distinct abstraction levels, each model being a distinct abstraction level. IMML is an XML language that uses concepts from Semiotic Engineering, which treats the computational system, with its user interface and interface elements, as a metacommunication artifact: these elements must transmit a message to the user about what task is to be performed and the way to reach this goal. With BRIDGE, we intend to provide extensive support to the design task, user interface prototyping being the greatest of them. BRIDGE allows design to become easier and more intuitive, starting from an interface specification language.

Relevance:

30.00%

Publisher:

Abstract:

This work proposes a model-based approach for pointcut management in the presence of evolution in aspect-oriented systems. The proposed approach, called conceptual-views-based pointcuts, is motivated by the shortcomings of traditional approaches to pointcut definition, which generally refer directly to the software's structure and/or behavior, thereby creating a strong coupling between the pointcut definitions and the base code. This coupling causes what is known as the pointcut fragility problem and hinders the evolution of aspect-oriented systems: whenever the software changes or evolves, all the pointcuts of each aspect must be reviewed to ensure that they remain valid after the changes. Our approach focuses on defining pointcuts over a conceptual model, which describes the system's structure at a more abstract level. The conceptual model consists of classifications (called conceptual views) of the business model elements, based on common characteristics, together with relationships between these views. Pointcut definitions are thus created over the conceptual model rather than referencing the base model directly. Moreover, the conceptual model contains a set of relationships that allows automatic verification of whether the classifications in the conceptual model remain valid after a software change. To this end, all development using the conceptual-views-based pointcuts approach is supported by a conceptual framework called CrossMDA2 and by an MDA-based development process, both also proposed in this work. As a proof of concept, we present two versions of a case study, setting up an evolution scenario that shows how the use of conceptual-views-based pointcuts helps to detect and minimize pointcut fragility. To evaluate the proposal, the Goal/Question/Metric (GQM) technique is used together with metrics for analyzing the efficiency of the pointcut definitions.
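As a rough illustration of the underlying idea (a hypothetical Python sketch, not the CrossMDA2 implementation, which targets MDA and AspectJ-style artifacts): a conceptual view classifies business classes, and the pointcut-like rule references the view instead of concrete class names, so renaming or adding a class does not silently break the rule.

    import functools

    # Hypothetical sketch: a "conceptual view" classifies business classes;
    # advice is attached to the view, never to concrete class names.
    registry = {}  # view name -> set of classes

    def conceptual_view(view_name):
        """Classify a business class under a named conceptual view."""
        def decorator(cls):
            registry.setdefault(view_name, set()).add(cls)
            return cls
        return decorator

    def advise_before(view_name, advice):
        """Pointcut-like rule: wrap every public method of every class
        currently classified under the view."""
        for cls in registry.get(view_name, ()):
            for name, attr in list(vars(cls).items()):
                if callable(attr) and not name.startswith("_"):
                    @functools.wraps(attr)
                    def wrapper(*args, _method=attr, **kwargs):
                        advice(_method.__qualname__)
                        return _method(*args, **kwargs)
                    setattr(cls, name, wrapper)

    @conceptual_view("PersistentEntity")
    class Customer:
        def save(self):
            return "customer saved"

    # The rule refers to the view, not to Customer; a renamed or newly
    # classified class is still matched without editing the rule.
    advise_before("PersistentEntity", lambda where: print("log before", where))
    print(Customer().save())  # prints the advice message, then "customer saved"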

Relevance:

30.00%

Publisher:

Abstract:

There is a growing interest in the Computer Science education community in including testing concepts in introductory programming courses. Aiming to contribute to this issue, we introduce POPT, a Problem-Oriented Programming and Testing approach for introductory programming courses. POPT's main goal is to improve the traditional method of teaching introductory programming, which concentrates mainly on implementation and neglects testing. POPT extends the POP (Problem-Oriented Programming) methodology proposed in the PhD thesis of Andrea Mendonça (UFCG). In both methodologies, POPT and POP, students' skills in dealing with ill-defined problems must be developed from the first programming courses on. In POPT, however, students are stimulated to clarify ill-defined problem specifications guided by the definition of test cases (in a table-like manner). This paper presents POPT and TestBoot, a tool developed to support the methodology. In order to evaluate the approach, a case study and a controlled experiment (which adopted the Latin Square design) were performed in an introductory programming course of the Computer Science and Software Engineering undergraduate programs at the Federal University of Rio Grande do Norte, Brazil. The study results have shown that, when compared to a blind-testing approach, POPT stimulates the implementation of programs of better external quality: the first program version submitted by POPT students passed twice as many (professor-defined) test cases as those of non-POPT students. Moreover, POPT students submitted fewer program versions and spent more time before submitting the first version to the automatic evaluation system, which leads us to think that POPT students are stimulated to think harder about the solution they are implementing. The controlled experiment confirmed the influence of the proposed methodology on the quality of the code developed by POPT students.
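As an illustration of the kind of test-case table POPT advocates (a hypothetical example, not taken from the thesis or from TestBoot), a student can pin down an ill-defined "grade rounding" specification with explicit cases before writing the program:

    # Hypothetical POPT-style test-case table: each row records an input,
    # the expected output, and the specification decision it pins down
    # for an otherwise ill-defined problem statement.
    import unittest

    def round_grade(grade: int) -> int:
        """Round a grade up to the next multiple of 5 when the gap is < 3;
        grades below 40 are never rounded (a rule the test table forces
        the student to make explicit)."""
        if grade < 40:
            return grade
        remainder = grade % 5
        if remainder and 5 - remainder < 3:
            return grade + (5 - remainder)
        return grade

    class TestRoundGrade(unittest.TestCase):
        # (input, expected, clarified rule)
        TABLE = [
            (73, 75, "gap of 2 -> round up"),
            (67, 67, "gap of 3 -> keep"),
            (38, 38, "below 40 -> never round"),
            (40, 40, "exact multiple -> keep"),
        ]

        def test_table(self):
            for grade, expected, rule in self.TABLE:
                with self.subTest(rule=rule, grade=grade):
                    self.assertEqual(round_grade(grade), expected)

    if __name__ == "__main__":
        unittest.main()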

Relevance:

30.00%

Publisher:

Abstract:

The main goal of Regression Testing (RT) is to reuse the test suite of the previous version of a software system in its current version, in order to maximize the value of the tests already developed and to ensure that old features continue to work after the new changes. Even with reuse, it is common that not all tests need to be executed again. For that reason, the use of Regression Test Selection (RTS) techniques is encouraged; they aim to select, from the full suite, only the tests that reveal faults, which reduces costs and makes this an interesting practice for testing teams. Several recent research works evaluate the quality of the selections performed by RTS techniques, identifying which one presents the best results as measured by metrics such as inclusion and precision. RTS techniques should search the System Under Test (SUT) for tests that reveal faults. However, because this problem has no viable solution, they alternatively look for tests that reveal changes, where faults may occur. Nevertheless, these changes may modify the execution flow of the program itself, so that some tests no longer exercise the same code. In this context, this dissertation investigates whether changes performed in a SUT affect the quality of the test selection performed by an RTS technique and, if so, which characteristics of the changes cause errors, leading the RTS to wrongly include or exclude tests. For this purpose, a tool was developed in Java to automate the measurement of the average inclusion and precision achieved by a regression test selection technique for a particular type of change. In order to validate this tool, an empirical study was conducted evaluating the RTS technique Pythia, based on textual differencing, on a large web information system, taking as the change characteristic the types of tasks performed to evolve the SUT.
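For reference, a minimal sketch of the two evaluation metrics named above, in their usual RTS formulation (selected versus modification-revealing test sets; the data here are hypothetical):

    # Minimal sketch of the inclusion and precision metrics used to evaluate
    # an RTS technique: inclusion is the fraction of modification-revealing
    # tests that were selected; precision is the fraction of selected tests
    # that are modification-revealing.

    def inclusion(selected: set, revealing: set) -> float:
        if not revealing:
            return 100.0  # nothing to reveal: the selection is trivially safe
        return 100.0 * len(selected & revealing) / len(revealing)

    def precision(selected: set, revealing: set) -> float:
        if not selected:
            return 100.0  # an empty selection includes no superfluous test
        return 100.0 * len(selected & revealing) / len(selected)

    # Hypothetical run: the technique picks t1, t2, t4 while the change is
    # actually revealed by t2, t4 and t5.
    selected = {"t1", "t2", "t4"}
    revealing = {"t2", "t4", "t5"}
    print(inclusion(selected, revealing))  # 66.7 -> one revealing test missed
    print(precision(selected, revealing))  # 66.7 -> one test selected in vain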

Relevance:

30.00%

Publisher:

Abstract:

Problems associated with longitudinal interactions in buried pipelines are three-dimensional in nature and can lead to different soil-pipe issues. Despite the progress achieved in research on buried pipelines, little attention has been given to the three-dimensional nature of the problem over the last decades; most studies simplify it by assuming plane strain conditions. This dissertation presents a study of the behavior of buried pipelines under localized settlement or elevation, using three-dimensional simulations carried out with the finite element code Plaxis 3D. Particular aspects of the numerical modeling were evaluated, parametric analyses were performed, and the effects of soil arching were investigated in three dimensions. The main variables investigated were: relative density, displacement of the elevation or settlement zone, size of the elevated zone, height of soil cover, and pipe diameter-to-thickness ratio. The simulations were performed in two stages. The first stage involved validating the numerical analysis against the physical models put forward by Costa (2005). In the second stage, numerical analyses of a full-scale pipeline subjected to a localized elevation were performed. The results allowed a detailed evaluation of the redistribution of stresses in the soil mass and of the deflections along the pipe. A reduction of stresses in the soil mass and of pipe deflections was observed when the height of soil cover was decreased over the regions of the pipe subjected to elevation. It was also shown, for the analyzed situation, that longitudinal thrusts were higher than circumferential thrusts and exceeded the allowable stresses and deflections. Furthermore, the benefits of stress-minimizing techniques such as the false trench, the compressible cradle, and a combination of both, applied to the simulated pipeline, were verified.

Relevance:

30.00%

Publisher:

Abstract:

Solar activity indicators over the Sun's photosphere, such as sunspot numbers, sunspot areas, and flares, are not symmetric between the northern and southern hemispheres of the Sun. This behavior is known as the North-South asymmetry of the different solar indices. Among the conclusions obtained by several authors, we can point out that the N-S asymmetry is a real and systematic phenomenon and is not due to random variability. In the present work, the probability distributions from the Marshall Space Flight Center (MSFC) database are investigated using a statistical tool arising from the non-extensive statistical mechanics proposed by C. Tsallis in 1988. We present our results and discuss their physical implications with the help of a theoretical model and observations. We find a strong dependence between the non-extensive entropic parameter q and the long-term solar variability present in the sunspot area data. Among the most important results, we highlight that the asymmetry index q reveals the dominance of the North over the South. This behavior has been discussed and confirmed by several authors, but it had never before been attributed to a property of a statistical model. We therefore conclude that this parameter can be considered an effective measure for diagnosing long-term variations of the solar dynamo. Finally, our dissertation opens a new approach for investigating time series in astrophysics from the perspective of non-extensivity.
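For context, the Tsallis entropy that defines the entropic index q (the standard 1988 form; the abstract does not detail the estimator actually fitted to the sunspot data) is

\[
S_q = k\,\frac{1 - \sum_i p_i^{\,q}}{q - 1},
\qquad
\lim_{q \to 1} S_q = -k \sum_i p_i \ln p_i ,
\]

so that q measures the departure from standard Boltzmann-Gibbs statistics, which is recovered in the limit \(q \to 1\).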

Relevance:

30.00%

Publisher:

Abstract:

An important problem faced by the oil industry is how to distribute multiple oil products through pipelines. Distribution is done in a network composed of refineries (source nodes), storage parks (intermediate nodes), and terminals (demand nodes) interconnected by a set of pipelines transporting oil and derivatives between adjacent areas. Constraints related to storage limits, delivery time, source availability, and sending and receiving limits, among others, must be satisfied. Some researchers treat this problem from a discrete viewpoint in which the flow in the network is seen as the sending of batches. Usually there is no separation device between batches of different products, and the losses due to interfaces may be significant. Minimizing delivery time is a typical objective adopted by engineers when scheduling product shipments in pipeline networks. However, the costs incurred due to losses at interfaces cannot be disregarded. The cost also depends on pumping expenses, which are mostly due to the cost of electricity. Since the industrial electricity tariff varies over the day, pumping at different time periods has different costs. This work presents an experimental investigation of computational methods designed to deal with the problem of distributing oil derivatives in networks considering three minimization objectives simultaneously: delivery time, losses due to interfaces, and electricity cost. The problem is NP-hard and is addressed with hybrid evolutionary algorithms. The hybridizations mainly combine Transgenetic Algorithms with classical multi-objective evolutionary algorithm architectures such as MOEA/D, NSGA2, and SPEA2, yielding three architectures named MOTA/D, NSTA, and SPETA, which are applied to the problem. An experimental study compares the algorithms on thirty test cases. To analyze the results, Pareto-compliant quality indicators are used, and the significance of the results is evaluated with non-parametric statistical tests.
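As background for the three-objective setting above, here is a minimal sketch of the Pareto dominance test on which all of the cited architectures rely, for minimization objectives (the schedule tuples are hypothetical):

    # Minimal sketch of Pareto dominance for the three minimization
    # objectives described above: delivery time, interface losses and
    # electricity cost. A schedule dominates another if it is no worse in
    # every objective and strictly better in at least one.

    def dominates(a, b):
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))

    def pareto_front(solutions):
        return [s for s in solutions
                if not any(dominates(o, s) for o in solutions if o is not s)]

    # Hypothetical schedules: (delivery time [h], interface loss [m3], cost [$])
    schedules = [(40, 12.0, 900), (38, 15.5, 950), (40, 12.0, 880), (45, 20.0, 1100)]
    print(pareto_front(schedules))
    # -> [(38, 15.5, 950), (40, 12.0, 880)]; (45, 20.0, 1100) is dominated,
    #    and (40, 12.0, 900) is dominated by the cheaper (40, 12.0, 880).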

Relevance:

20.00%

Publisher:

Abstract:

PEDRINI, Aldomar; SZOKOLAY, Steven. Recomendações para o desenvolvimento de uma ferramenta de suporte às primeiras decisões projetuais visando ao desempenho energético de edificações de escritório em clima quente. Ambiente Construído, Porto Alegre, v. 5, n. 1, p. 39-54, jan./mar. 2005. Trimestral. Acesso em: 04 out. 2010.

Relevance:

20.00%

Publisher:

Abstract:

RAMOS, A. S. M.; FERREIRA, L. B. Tecnologia da informação: commodity ou ferramenta estratégica? Revista de Gestão da Tecnologia e Sistemas de Informação, USP, São Paulo, v. 2, n. 1, p. 69-79, 2005.

Relevance:

20.00%

Publisher:

Abstract:

In recent decades the public sector has come under pressure to improve its performance, and the use of Information Technology (IT) has increasingly been a tool for reaching that goal. It has thus become an important issue for public organizations, particularly institutions of higher education, to determine which factors influence the acceptance and use of technology, since they impact the success of implementation and the desired organizational results. The Technology Acceptance Model (TAM), which rests on the constructs perceived usefulness and perceived ease of use, was used as the basis for this study. However, since integrated management systems are complex to implement, organizational factors were added to the model in order to better explain the acceptance of such systems. Five constructs related to critical success factors in ERP system implementation were therefore added to TAM: top management support, communication, training, cooperation, and technological complexity (BUENO and SALMERON, 2008). Based on the foregoing, the following research problem is posed: which factors influence the acceptance and use of the SIE academic module at the Federal University of Pará, from the perception of its users, teachers and technicians? The purpose of this study was to identify the influence of organizational factors and behavioral antecedents on the behavioral intention to use the SIE academic module at UFPA from the perspective of teacher and technician users. This is applied, exploratory, and descriptive quantitative research carried out as a survey; data collection occurred through a structured questionnaire applied to a sample of 229 teachers and 30 technical-administrative staff. Data analysis was carried out through descriptive statistics and structural equation modeling with the partial least squares (PLS) technique. The measurement model was assessed first, and reliability, convergent validity, and discriminant validity were verified for all indicators and constructs. The structural model was then analyzed using the bootstrap resampling technique. In the assessment of statistical significance, all hypotheses were supported. The coefficient of determination (R²) was high or average for five of the six endogenous variables, and the model explains 47.3% of the variation in behavioral intention. It is noteworthy that, among the antecedents of behavioral intention (BI) analyzed in this study, perceived usefulness (PU) is the variable with the greatest effect on behavioral intention, followed by perceived ease of use (PEU) and attitude (AT). Among the organizational aspects (critical success factors) studied, technological complexity (TC) and training (ERT) had the greatest effect on behavioral intention to use, although these effects were lower than those produced by the behavioral factors (originating from TAM). It is further pointed out that top management support (TMS) showed, among all variables, the smallest effect on the intention to use (BI), followed by communication (COM) and cooperation (CO), which also exert a low effect on behavioral intention (BI). Therefore, as in other studies on TAM, the constructs were adequate for the present research. The study thus contributed evidence that the Technology Acceptance Model can be applied to predict the acceptance of integrated management systems, even in public organizations.
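Schematically, the extended model estimated by PLS relates the constructs named above roughly as

\[
\mathrm{BI} = \beta_1\,\mathrm{PU} + \beta_2\,\mathrm{PEU} + \beta_3\,\mathrm{AT} + \beta_4\,\mathrm{TC} + \beta_5\,\mathrm{ERT} + \beta_6\,\mathrm{TMS} + \beta_7\,\mathrm{COM} + \beta_8\,\mathrm{CO} + \varepsilon,
\qquad R^2 = 0.473,
\]

where the path coefficients \(\beta_i\) are what the PLS estimation fits. This linear form is only a sketch: the dissertation's actual path diagram may route some constructs through PU, PEU, or AT rather than directly into BI.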

Relevance:

20.00%

Publisher:

Abstract:

This research investigates the microclimate and the morphological features of the central campus of UFRN, in Natal-RN, through the use of bioclimatic analysis tools, in order to assist the implementation of the campus' Master Plan. It develops a diagnosis of the evolution and growth of the surveyed urban space by analyzing its initial plan and the basic urban conception behind it, as well as the morphology and typologies employed. The study makes a qualitative analysis of the local microclimate using Katzschner's (1997) methodology, with maps of land use, topography, building heights, vegetation, and soil cover. It also makes use of the methodology proposed by Oliveira (1993), which examines, from the bioclimatic standpoint, the human environment in relation to the urban form (site and built mass). It identifies zones whose climatic characteristics are representative of the local microclimate and classifies them into areas to be strictly preserved, areas to be protected, and areas to be improved. By means of the methodology for spatial and environmental assessment developed by Bustos Romero (2001), the survey selects characteristic points of each area in order to register the environmental data for the two basic seasons of the region where the campus is located, the dry and the rainy season, so as to evaluate changes in the environment which might have been caused by urban densification, by arborization, or by the influence of the urban form. It then proceeds to a quantitative and statistical analysis of the collected data with the purpose of evaluating the degree of influence of the identified features on the environmental variables at the different scales of approach. The study shows the existence of different microclimates and emphasizes the relevance of bioclimatic analysis of the built environment as a tool for the decision-making process during the development of the Master Plan for the UFRN central campus.

Relevance:

20.00%

Publisher:

Abstract:

Noise pollution degrades the quality of the environment and is one of the most common environmental problems in big cities. The acoustic study of a complex urban environment needs to consider the contribution of various noise sources. Accordingly, computational models for mapping and predicting the acoustic scene become important, because they enable calculations, analyses, and reports that allow a satisfactory interpretation of the results. The study area is the neighborhood of Lagoa Nova, a central area of the city of Natal, which will undergo major changes in its urban space due to the urban mobility projects planned for the area around the stadium and the consequent changes in urban form and traffic. This study therefore aims to evaluate the noise impact caused by the road and morphological changes around the Arena das Dunas stadium in the neighborhood of Lagoa Nova, through on-site measurements and mapping with the computational model SoundPLAN for the year 2012, and through the predicted evolution of the acoustic scenario for the year 2017. For this analysis, a first acoustic map was built from a diagnosis of the neighborhood's current acoustic situation, based on physical mapping, classified vehicle counts, and sound pressure level measurements; to build the noise prediction, the modifications planned for traffic, urban form, and mobility works in the study area were taken into account. The study concludes that the sound pressure levels for both 2012 and 2017 exceed the limits of current legislation. The noise prediction shows numerous changes in the acoustic scene: the planned urban mobility works will improve traffic flow and thus reduce the sound pressure level where the interventions are expected.
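For reference, the A-weighted equivalent continuous sound pressure level normally used in such measurements and noise maps (the abstract does not state the exact descriptor adopted) is

\[
L_{Aeq,T} = 10 \log_{10}\!\left( \frac{1}{T} \int_{0}^{T} \frac{p_A^{2}(t)}{p_0^{2}}\,dt \right) \; \mathrm{dB},
\qquad p_0 = 20\ \mu\mathrm{Pa},
\]

where \(p_A(t)\) is the A-weighted instantaneous sound pressure over the measurement interval \(T\).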

Relevance:

20.00%

Publisher:

Abstract:

Natural ventilation is the most important passive strategy for providing thermal comfort in hot and humid climates, and a significant low-energy strategy. However, a naturally ventilated building requires more attention during architectural design than a conventional building with air-conditioning systems, and the results are less reliable. Therefore, this thesis focuses on software and methods for predicting natural ventilation performance from the point of view of the architect, who has limited resources and limited knowledge of fluid mechanics. A typical prefabricated building was modelled because of its simple geometry, low cost, and frequent occurrence on the local campus. Firstly, the study emphasized the use of computational fluid dynamics (CFD) software to simulate the air flow outside and inside the building. A series of approaches was developed to make the simulations feasible, at some cost to the fidelity of the results. Secondly, the results of the CFD simulations were used as input to an energy tool, to simulate the thermal performance under different air renewal rates. Thirdly, the resulting temperatures were assessed in terms of thermal comfort. Complementary simulations were carried out to refine the analyses. The results show the potential of these tools; however, the discussions concerning the simplifications of the approaches, the limitations of the tools, and the level of knowledge of the average architect are the major contribution of this study.