27 results for Unconstrained minimization
Abstract:
As the demand for and occupation of space in the coastal zone steadily increase, so does concern about the erosion processes affecting this area. The main problem in coastal areas is that urbanization occurs in a disorderly and unsustainable fashion, further aggravating the problems of coastal dynamics. The study area of this work is located on the southern stretch of Pirangi do Norte beach, about 20 km south of Natal, the capital of Rio Grande do Norte, in the municipality of Parnamirim. The area is approximately 1 km long and was divided into three sections (Western, Central and Eastern), with a morphology consisting of a tableland at the top, sea cliffs in the Western and Central sections and sand dunes in the Eastern section, both vegetated, and a coastal plain at the lower part associated with the presence of beachrocks. This study aimed to analyze the erosion processes operating along this stretch of Pirangi do Norte beach and to assess the feasibility of monitoring them with DGPS (Differential Global Positioning System). The work comprised a physical description of the area through photo-interpretation and field survey, followed by measurements of the shoreline between November 2004 and November 2009 and of beach profiles between August 2005 and July 2006. The analysis of the annual surveys showed variations of the shoreline along the surveyed stretch. Sites of apparent seaward advance of the coast were identified, but in loco inspection revealed erosion with deposition of material at the foot of the coastal bluff, the former position of the shoreline, giving a false impression of shoreline advance. This leads to the conclusion that shoreline survey data should always be accompanied by photographic records of the sites with the highest erosion rates, thus avoiding the mistake of treating deposited material as evidence of coastal advance. At the end of the study, after a review of various works aimed at mitigating erosion in the coastal zone, artificial beach nourishment is recommended for the study area in order to minimize the erosive effects of the tides. It is also suggested that the monitoring be continued, that the existing vegetation be maintained and that occupation along the edge of the sea cliffs be controlled.
Abstract:
Health care practices related to gestational processes tend to be organized according to the context and place of work, and are thus dependent on social and economic conditions as well as on the physical structure and functioning of the services. The high mortality associated with this process has motivated, since 1986, studies by the World Health Organization (WHO) on the technical aspects and social inequalities that influence this situation in different geographical contexts. This culminated in recommendations proposing a reorientation of the dynamics of these practices, with a focus on the safety of maternity care. In 2000, Brazil adopted the WHO's suggestions, emphasizing humanization as the main rationale for these actions. However, this discussion tends not to consider the problems caused by social inequalities and by the epidemiological and social conditions that shape the actions of the Unified Health System (Sistema Único de Saúde, SUS). In this field, this research seeks to analyze the practices, the care provided and the symbolic universe surrounding childbirth assistance in the SUS. Besides the analysis of public documents on this subject, an ethnographic study was carried out in a maternity hospital in Natal/RN, considered a model of humanization after receiving the Galba de Araújo prize in 2002. At this stage, the methodological strategies adopted were observation and individual interviews with workers and users of the service. The data analysis made evident that both the professionals and the women who gave birth tend to conform to the standards that delimit the production and reproduction of these practices, which favors the absence of a critical stance towards the actions directed at the population. It also became evident that institutional difficulties, combined with economic, cultural and political problems, hinder the workers' involvement and reflection in favor of changes in the care process. A prescriptive notion of humanization is also employed in the everyday life of these social actors, without reflection on its meaning. Some workers express, in their statements, a concern with the social and economic aspects that affect these practices and with the limitations of a humanization discourse disconnected from the needs of those involved in the training process, but soon return to discussing humanization as a practice characterized by the minimization of interventionist actions. The users of the system, in turn, submit to the dynamics of the services, accepting whatever assistance is offered, without questioning and/or reflecting on their usual unmet needs. Therefore, thinking about changes in the knowledge and practice of childbirth care implies reflecting on the everyday production of these practices and on the social contexts that influence the care process. Only then would it be possible to envisage the appropriation, by the different workers, of their own distresses and needs, making them active in the pursuit of their rights as citizens.
Abstract:
With the increasing complexity of software systems, there is also increasing concern about their faults, which can cause financial losses and even loss of life. We therefore propose in this work the minimization of software faults through the use of formally specified tests. The combination of testing and formal specifications has been gaining strength in research, mainly through MBT (Model-Based Testing). Developing software from formal specifications, when the entire refinement process is carried out rigorously, ensures that what is specified will be implemented, so that the implementation generated from the specifications accurately reflects what was specified. However, specifications are not always refined down to the level of implementation and code generation, and in these cases the tests generated from the specification tend to find faults. Additionally, the generation of so-called "invalid tests", i.e., tests that exercise application scenarios not addressed in the specification, complements the formal development process even more significantly. This work therefore proposes a method, structured in pseudo-code, for generating tests from B formal specifications. The method is based on the systematization of the black-box testing techniques of boundary value analysis and equivalence partitioning, as well as the orthogonal pairs technique. The method was applied to a B specification, and B test machines that generate implementation-language-independent test cases were produced. To validate the method, the test cases were manually converted into JUnit test cases, and the application, created from the B specification and developed in Java, was tested. Faults were found when the JUnit test cases were executed.
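As an illustration only (not taken from the thesis), the sketch below shows, in Python, the idea behind two of the black-box techniques named in the abstract, boundary value analysis and equivalence partitioning, applied to a hypothetical integer input whose valid range would come from the specification; values outside the range play the role of the "invalid tests" mentioned above. The range 1..100, the function names and the use of Python rather than B/JUnit are all assumptions made for the example.

# Illustrative sketch only: generating test inputs for a hypothetical integer
# parameter whose specified valid range is 1..100.
def boundary_values(lo, hi):
    # Boundary value analysis: values at and immediately around the limits.
    return [lo - 1, lo, lo + 1, hi - 1, hi, hi + 1]

def partition_representatives(lo, hi):
    # Equivalence partitioning: one representative per class (below, inside, above).
    return [lo - 10, (lo + hi) // 2, hi + 10]

if __name__ == "__main__":
    LO, HI = 1, 100                          # hypothetical specified range
    candidates = boundary_values(LO, HI) + partition_representatives(LO, HI)
    valid = sorted({v for v in candidates if LO <= v <= HI})
    invalid = sorted({v for v in candidates if not LO <= v <= HI})
    print("valid test inputs:  ", valid)     # exercise specified behaviour
    print("invalid test inputs:", invalid)   # the "invalid tests" of the abstract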
Abstract:
This work addresses the Scheduling Workover Rigs Problem (SWRP) for maintaining the wells of an oil field, a problem that, although difficult to solve, is extremely important from economic, technical and environmental standpoints. A mathematical formulation of the problem is presented and an algorithmic approach is developed. The problem consists of finding the best schedule of well servicing by the workover rigs, taking into account the minimization of a composite objective combining the costs of the workover rigs and the total oil loss suffered by the wells. The problem is similar to the Vehicle Routing Problem (VRP), which belongs to the NP-hard class. The goal of this research is to develop an algorithmic approach to solve the SWRP using metaheuristics such as the Memetic Algorithm and GRASP. Instances with data close to reality were generated to analyze the computational performance of the approaches mentioned above. A comparison of the performance and of the quality of the results obtained by each technique is then carried out.
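Purely as an illustration of the GRASP metaheuristic named in the abstract (and not the thesis's algorithm or cost model), the sketch below builds rig schedules with a greedy randomized construction followed by a simple local search; the oil-loss cost function, the instance data and all identifiers are hypothetical.

import random

def total_cost(schedule, service_time, loss_rate):
    # Sum, over rigs, of the oil loss accumulated while each well waits for (and
    # receives) its service; a deliberately simplified stand-in for the SWRP cost.
    cost = 0.0
    for rig, wells in schedule.items():
        t = 0.0
        for w in wells:
            t += service_time[w]
            cost += loss_rate[w] * t
    return cost

def local_search(schedule, service_time, loss_rate):
    # First-improvement local search: move one well to another rig while it helps.
    improved = True
    while improved:
        improved = False
        base = total_cost(schedule, service_time, loss_rate)
        for r1 in schedule:
            for i, w in enumerate(schedule[r1]):
                for r2 in schedule:
                    if r1 == r2:
                        continue
                    trial = {r: list(ws) for r, ws in schedule.items()}
                    trial[r1].pop(i)
                    trial[r2].append(w)
                    if total_cost(trial, service_time, loss_rate) < base:
                        schedule, improved = trial, True
                        break
                if improved:
                    break
            if improved:
                break
    return schedule

def grasp(wells, rigs, service_time, loss_rate, iters=30, alpha=0.3):
    best, best_cost = None, float("inf")
    for _ in range(iters):
        # Greedy randomized construction: costliest wells first, rig drawn from a
        # restricted candidate list (RCL) controlled by alpha.
        schedule = {r: [] for r in rigs}
        for w in sorted(wells, key=lambda w: -loss_rate[w]):
            scores = []
            for r in rigs:
                schedule[r].append(w)
                scores.append((total_cost(schedule, service_time, loss_rate), r))
                schedule[r].pop()
            lo, hi = min(s for s, _ in scores), max(s for s, _ in scores)
            rcl = [r for s, r in scores if s <= lo + alpha * (hi - lo)]
            schedule[random.choice(rcl)].append(w)
        schedule = local_search(schedule, service_time, loss_rate)
        c = total_cost(schedule, service_time, loss_rate)
        if c < best_cost:
            best, best_cost = schedule, c
    return best, best_cost

# Tiny hypothetical instance: five wells, two rigs.
wells = ["w1", "w2", "w3", "w4", "w5"]
service_time = {"w1": 3, "w2": 5, "w3": 2, "w4": 4, "w5": 1}
loss_rate = {"w1": 10, "w2": 1, "w3": 7, "w4": 3, "w5": 8}
print(grasp(wells, ["r1", "r2"], service_time, loss_rate))

A real SWRP cost model would also account for rig rental costs and travel between wells, which this toy cost function ignores.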
Abstract:
In the urbanization process that Brazilian cities have been going through, Natal/RN is no different from the others: it has undergone fast and disorderly urbanization, with planning that was not put into practice, which has caused a sharp increase in social and environmental problems. One of the most worrying problems observed is the change in the coastal landscape, which has caused serious damage to the city's population, more specifically in the Ponta Negra beach neighborhood. For geographical studies, the issue of beach occupation, which has grown steadily in recent decades, is extremely important, because beaches, in addition to being used as housing in the new urban configuration, have incorporated new forms of environmental interference without a corresponding advance in the knowledge that would be necessary for a more suitable and rational use of coastal spaces. Thus, the present work focused on the coastal landscape of Ponta Negra beach, in the city of Natal/RN, identifying and analyzing the effects caused by anthropic and natural action, the way these effects are reflected in the quality of life of residents, workers and visitors, and the landscape transformations in the study area from 1970 to 2010. The methodology followed two stages: the first concerned the theoretical work, with bibliographic survey and review; the second, the empirical work, with environmental characterization and the application of questionnaires. We conclude that Ponta Negra is highly susceptible to environmental change, caused both by the natural dynamics of the beach and by human (societal) action on this fragile and changeable space, and that it therefore needs a more thorough and systematic study of its coastal landscape. In order to minimize landscape change in coastal zones, there must be integrated management of these environments, based on the planning of actions and on the territorial reordering of the occupation of these spaces, which are so important both environmentally and socioeconomically. Only in this way will we achieve sustainable development and a suitable use of that space.
Abstract:
An important problem faced by the oil industry is the distribution of multiple oil products through pipelines. Distribution is done in a network composed of refineries (source nodes), storage parks (intermediate nodes) and terminals (demand nodes), interconnected by a set of pipelines transporting oil and derivatives between adjacent areas. Constraints related to storage limits, delivery times, source availability, and sending and receiving limits, among others, must be satisfied. Some researchers treat this problem from a discrete viewpoint in which the flow in the network is seen as the sending of batches. Usually there is no separation device between batches of different products, and the losses due to interfaces may be significant. Minimizing delivery time is a typical objective adopted by engineers when scheduling product shipments in pipeline networks. However, the costs incurred due to losses at interfaces cannot be disregarded. The cost also depends on pumping expenses, which are mostly due to the cost of electricity. Since the industrial electricity tariff varies over the day, pumping at different time periods has different costs. This work presents an experimental investigation of computational methods designed to deal with the problem of distributing oil derivatives in networks considering three minimization objectives simultaneously: delivery time, losses due to interfaces and electricity cost. The problem is NP-hard and is addressed with hybrid evolutionary algorithms. The hybridizations are mainly focused on Transgenetic Algorithms and classical multi-objective evolutionary algorithm architectures such as MOEA/D, NSGA2 and SPEA2. Three architectures, named MOTA/D, NSTA and SPETA, are applied to the problem. An experimental study compares the algorithms on thirty test cases. To analyse the results, Pareto-compliant quality indicators are used, and the significance of the results is evaluated with non-parametric statistical tests.
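The following minimal sketch (illustrative only, with hypothetical solution values) shows the Pareto-dominance relation that underlies both the multi-objective algorithms and the Pareto-compliant quality indicators mentioned in the abstract, for the three minimization objectives delivery time, interface losses and electricity cost; it is not part of the thesis's MOTA/D, NSTA or SPETA implementations.

def dominates(a, b):
    # a dominates b if it is no worse in every objective and strictly better in one.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    # Keep only the solutions that no other solution dominates.
    return [s for s in solutions if not any(dominates(t, s) for t in solutions if t is not s)]

# Hypothetical candidate schedules: (delivery time, interface loss, electricity cost).
candidates = [(10, 5.0, 120), (12, 4.0, 110), (10, 6.0, 100), (11, 5.5, 130)]
print(pareto_front(candidates))   # the last candidate is dominated and dropped

Pareto-compliant quality indicators are, roughly speaking, those whose rankings never contradict this dominance relation between approximation sets.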
Development of the base cell of periodic composite microstructures under topology optimization
Abstract:
This thesis develops a new technique for the design of composite microstructures through topology optimization, in order to maximize stiffness, making use of the strain energy method and of an h-adaptive refinement scheme to better define the topological contours of the microstructure. This is done by distributing material optimally within a pre-established design region called the base cell. The Finite Element Method is used to describe the domain and to solve the governing equation. The mesh is refined iteratively so that refinement is applied to all elements representing solid material and to all void elements containing at least one node in a solid-material region. The finite element chosen for the model is the three-node linear triangle. The constrained nonlinear programming problem is solved with the Augmented Lagrangian method together with a minimization algorithm based on quasi-Newton search directions and the Armijo-Wolfe conditions to assist the descent process. The base cell that represents the composite is found from the equivalence between a fictitious material and a prescribed material distributed optimally in the design region. The use of the strain energy method is justified by its lower computational cost, owing to a formulation simpler than that of the traditional homogenization method. Results are presented for variations in the prescription, variations in the displacement, volume constraints and various initial values of the relative densities.
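As a generic illustration of the unconstrained-minimization ingredient mentioned in the abstract (and not the thesis's topology-optimization code), the sketch below combines, in Python, a quasi-Newton (BFGS) direction with a backtracking line search that enforces the Armijo sufficient-decrease condition; the full Wolfe curvature test is omitted for brevity, and the test function is a hypothetical textbook example.

import numpy as np

def armijo_step(f, grad, x, d, alpha=1.0, beta=0.5, c1=1e-4):
    # Backtrack until f(x + alpha*d) <= f(x) + c1*alpha*grad(x)^T d (Armijo condition).
    fx, gTd = f(x), grad(x) @ d
    while f(x + alpha * d) > fx + c1 * alpha * gTd:
        alpha *= beta
    return alpha

def bfgs(f, grad, x0, tol=1e-6, max_iter=200):
    x = np.asarray(x0, dtype=float)
    H = np.eye(len(x))                      # inverse Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                          # quasi-Newton descent direction
        a = armijo_step(f, grad, x, d)
        x_new = x + a * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        if s @ y > 1e-12:                   # curvature safeguard before the update
            rho = 1.0 / (s @ y)
            I = np.eye(len(x))
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Hypothetical test problem: the Rosenbrock function, whose minimum is at (1, 1).
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(bfgs(f, grad, [-1.2, 1.0]))

In an Augmented Lagrangian scheme such as the one described above, a routine of this kind would be called repeatedly on the penalized subproblems, with the multipliers updated between calls.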
Abstract:
Electronic journals are channels for the publication and dissemination of scientific information. Through them, users can disseminate their studies as well as develop new research. One of the systems used for the creation and management of e-journals is the Electronic System for Journal Publishing (SEER), used both in the construction of journal portals and in the creation of individual journals. In this regard, it is believed that systems for creating and managing e-journals should be developed (internally and externally) according to the needs of their users. In the case of internal development, some of these processes concern author registration and article submission, which are relevant tasks in the editorial process. Thus, the proposed study, on the usability of scientific journals, aims to analyze the usability of the author registration and article submission processes in the Electronic System for Journal Publishing, taking as a case the journal BiblioCanto, part of the Electronic Journals Portal of the Federal University of Rio Grande do Norte (UFRN). Two evaluation techniques were used: a usability test with a total of twenty participants, and a cooperative evaluation with the same number of participants, divided into four categories considered the journal's target audience, namely undergraduate students, graduate students, teachers and librarians. The results indicated that the two processes analyzed (author registration and article submission) need improvement. In the registration process, the needs identified are: clearer signposting of the environment in which registration is carried out, and the description and removal of some of the information requested on the registration form. In the article submission process, the aspects to be improved are: the initial steps leading up to submission, the signaling of required fields, a concise description of the steps, and the minimization and revision of the steps. In general, it is concluded that SEER only partially meets the needs of its users regarding the usability of the software.
Abstract:
A significant observational effort has been directed to investigating the nature of the so-called dark energy. In this dissertation we derive constraints on dark energy models using three different observables: measurements of the Hubble rate H(z) (compiled by Meng et al. in 2015); distance moduli of 580 Type Ia supernovae (Union2.1 compilation, 2011); and observations of baryon acoustic oscillations (BAO) and of the cosmic microwave background (CMB), combined in the so-called CMB/BAO ratio for six BAO peaks (one peak determined from the 6dFGS survey data, two from the SDSS and three from WiggleZ). The statistical analysis used was the minimum χ2 method (marginalized or minimized over h whenever possible) to constrain the cosmological parameters Ωm, ω and δω0. These tests were applied to two parameterizations of the parameter ω of the dark energy equation of state, p = ωρ (here, p is the pressure and ρ is the energy density of the component). In one, ω is considered constant and less than -1/3, known as the XCDM model; in the other, the equation-of-state parameter varies with redshift, in what we call the GS model. This last model is based on arguments arising from the theory of cosmological inflation. For comparison, the ΛCDM model was also analyzed. Comparing the cosmological models with different observations leads to different best-fit configurations. Thus, to rank the observational viability of the different theoretical models we use two information criteria, the Bayesian information criterion (BIC) and the Akaike information criterion (AIC). The Fisher matrix tool was incorporated into our tests to provide the uncertainties on the parameters of each theoretical model. We found that the complementarity of the tests is necessary in order to avoid degenerate parameter spaces. From the minimization process we found (at 68% confidence), for the XCDM model, the best-fit parameters Ωm = 0.28 ± 0.012 and ωX = −1.01 ± 0.052, while for the GS model the best-fit values are Ωm = 0.28 ± 0.011 and δω0 = 0.00 ± 0.059. Performing a marginalization we found (at 68% confidence), for the XCDM model, Ωm = 0.28 ± 0.012 and ωX = −1.01 ± 0.052, while for the GS model Ωm = 0.28 ± 0.011 and δω0 = 0.00 ± 0.059.
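As an illustration of the minimum χ2 procedure described in the abstract (and not the dissertation's actual analysis), the sketch below fits a flat XCDM model, H(z) = H0 [Ωm(1+z)^3 + (1−Ωm)(1+z)^(3(1+ω))]^(1/2), to a handful of hypothetical H(z) points and computes the AIC and BIC from the minimum χ2; the data values, the absence of marginalization over h and the omission of the SNe Ia and CMB/BAO tests are simplifications made for the example.

import numpy as np
from scipy.optimize import minimize

# Hypothetical H(z) data points (km/s/Mpc); the real analysis uses the compiled
# H(z), SNe Ia and CMB/BAO samples cited in the abstract.
z     = np.array([0.1, 0.4, 0.9, 1.3, 1.75])
H_obs = np.array([73.0, 87.0, 117.0, 145.0, 185.0])
sigma = np.array([5.0, 6.0, 8.0, 10.0, 12.0])

def chi2(theta):
    # Flat XCDM: H(z) = H0 * sqrt(Om*(1+z)^3 + (1-Om)*(1+z)^(3*(1+w))).
    H0, Om, w = theta
    E2 = Om * (1 + z)**3 + (1 - Om) * (1 + z)**(3 * (1 + w))
    if np.any(E2 <= 0):
        return 1e10                          # penalize unphysical parameter values
    return np.sum(((H_obs - H0 * np.sqrt(E2)) / sigma)**2)

res = minimize(chi2, x0=[70.0, 0.3, -1.0], method="Nelder-Mead")
k, N = 3, len(z)                             # free parameters, number of data points
aic = res.fun + 2 * k                        # Akaike information criterion
bic = res.fun + k * np.log(N)                # Bayesian information criterion
print("best fit (H0, Om, w):", res.x, " chi2_min:", res.fun, " AIC:", aic, " BIC:", bic)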
Abstract:
The use of flexibility in façade design makes façades adaptable to adverse weather conditions, resulting both in the minimization of environmental discomfort and in improved energy efficiency. The present study highlights the potential of flexible façades as a resource to reduce the rigidity and formal repetition usually found in condominiums of standardized houses; as such, the work contributes to the study of architectural design strategies for adapting and integrating buildings into the local climate context. Two façade options were designed using bionics and kinetics, and their applications to architectural constructions, as references. This resulted in two lightweight and dynamic structures that meet comfort requirements through combinations of movements that control the impact of solar radiation and of cooling on the environment. The efficacy and technical functionality of the façades were tested with comfort-analysis and graphic computing software, as well as with physical models. Thus, the current research contributes to the improvement of architectural solutions based on passive energy strategies, offering better quality both for users and for the sustainability of the planet.