989 results for Interior Point Methods
Abstract:
A total of 288 Isa Babcock hens were used to evaluate the effect of light beak trimming, severe beak trimming, and no beak trimming on the performance of laying hens. The birds were distributed in a completely randomized design in a 3x3 factorial scheme (first beak trimming x second beak trimming), with four replicates of eight birds each. The first beak trimming was performed at nine days of age and the second at 12 weeks. Feed intake, body weight, laying percentage, and feed conversion were evaluated over four 28-day periods. According to the results, birds subjected to severe beak trimming showed lower feed intake and a lower laying percentage (P<0.05).
Abstract:
Over the last century the Six Sigma strategy has been the focus of study for many researchers; among its findings is the importance of process data for manufacturing error-free products. This work therefore focuses on the importance of data quality in an enterprise. A descriptive-exploratory study of seventeen compounding pharmacies in Rio Grande do Norte was undertaken with the objective of building a base structure model to classify enterprises according to their databases. Statistical methods such as cluster and discriminant analyses were applied to a questionnaire designed specifically for this study. The data collected identified four groups, showing the strong and weak characteristics of each group and how the groups differ from one another.
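A minimal sketch of the cluster-plus-discriminant pipeline mentioned in the abstract, assuming synthetic questionnaire scores in place of the actual survey data; the number of items and the four-group cut are illustrative choices, not the study's real variables.

```python
# Minimal sketch of the cluster + discriminant analysis pipeline described above,
# using synthetic questionnaire scores (the real survey data is not reproduced here).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
# 17 pharmacies x 10 hypothetical questionnaire items scored 1..5
scores = rng.integers(1, 6, size=(17, 10)).astype(float)

# Hierarchical cluster analysis (Ward linkage) cut into 4 groups,
# mirroring the four groups reported in the study.
groups = fcluster(linkage(scores, method="ward"), t=4, criterion="maxclust")

# Discriminant analysis to check how well the items separate the groups.
lda = LinearDiscriminantAnalysis().fit(scores, groups)
print("resubstitution accuracy:", lda.score(scores, groups))
```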
Abstract:
This study presents a comparative analysis of methodologies for the weighted factors considered in selecting areas for the deployment of sanitary landfills, applying the classification methodologies with scoring bands of Gomes, Coelho, Erba & Veronez (2000) and Waquil et al. (2000), that is, the Scoring System used by the Union of Municipalities of Bahia and the Landfill Quality Index (IQR), both applied in this study to the Massaranduba Sanitary Landfill located in the municipality of Ceará-Mirim/RN, in northeastern Brazil. The study was conducted in order to rank the methodologies and to support future studies in the environmental management field, with the main goal of proposing suitable methodologies that ensure safety and rigor during the selection, deployment, and management of sanitary landfills in Brazilian municipalities, thereby helping them in the process of closing their open dumps, in accordance with the Brazilian National Plan of Solid Waste. During this investigation we studied the morphological, hydrogeological, environmental, and socio-economic characteristics of the site that permit its installation. We also consider it important to mention the need for the Rio Grande do Norte State Secretariat of Environment and Water Resources (SEMARH), the Institute of Sustainable Development and Environment of RN (IDEMA), and the Federal and Municipal Governments to put in place public policies for the integrated management of urban solid waste that address environmental preservation and the improvement of the health conditions of the population of Rio Grande do Norte.
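For illustration only, a toy weighted-factor score of the kind the compared methodologies compute; the criteria names, weights, and scores below are hypothetical and do not reproduce the scoring bands of the cited methods.

```python
# Illustrative weighted-factor scoring for a candidate landfill site.
# Criteria, weights and scores are made up for the sketch, not the actual bands
# of the Gomes et al. or Waquil et al. methodologies.
criteria = {          # weight, score assigned to the site (0-10)
    "distance_to_water_bodies": (0.30, 8),
    "soil_permeability":        (0.25, 6),
    "distance_to_urban_area":   (0.25, 9),
    "road_access":              (0.20, 7),
}

total = sum(w * s for w, s in criteria.values())
print(f"weighted suitability score: {total:.1f} / 10")
```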
Abstract:
Skin cancer is the most common of all cancers, and the increase in its incidence is due, in part, to people's behavior regarding sun exposure. In Brazil, non-melanoma skin cancer is the most frequent in most regions. Dermatoscopy and videodermatoscopy are the main types of examination for the diagnosis of dermatological skin diseases. The field that involves the use of computational tools to support medical diagnosis of dermatological lesions is very recent. Some methods have been proposed for the automatic classification of skin pathologies from images. The present work aims to present a new intelligent methodology for the analysis and classification of skin cancer images, based on digital image processing techniques for the extraction of color, shape, and texture features, using the Wavelet Packet Transform (WPT) and a learning technique called the Support Vector Machine (SVM). The Wavelet Packet Transform is applied to extract texture features from the images. The WPT consists of a set of basis functions that represent the image in different frequency bands, each with a distinct resolution corresponding to each scale. In addition, the color features of the lesion are computed, which depend on the visual context and are influenced by the colors present in its surroundings, and the shape attributes are obtained through Fourier descriptors. The Support Vector Machine, used for the classification task, is based on the principle of structural risk minimization from statistical learning theory. The SVM aims to construct optimal hyperplanes that represent the separation between classes; the generated hyperplane is determined by a subset of the training samples, called support vectors. For the database used in this work, the results showed good performance, with an overall accuracy of 92.73% for melanoma and 86% for non-melanoma and benign lesions. The extracted descriptors and the SVM classifier constitute a method capable of recognizing and classifying the analyzed skin lesions.
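A minimal sketch of the described feature-extraction and classification chain (wavelet-packet energies as texture descriptors feeding an SVM), assuming synthetic grayscale patches and toy labels; the color and Fourier shape descriptors are omitted here.

```python
# Wavelet-packet texture energies fed to an SVM, on synthetic data
# (the clinical image database is not reproduced).
import numpy as np
import pywt
from sklearn.svm import SVC

def wpt_energy_features(image, wavelet="db2", level=2):
    """Energy of each wavelet-packet node at the given level (texture descriptor)."""
    wp = pywt.WaveletPacket2D(data=image, wavelet=wavelet, maxlevel=level)
    return np.array([np.sum(node.data ** 2) for node in wp.get_level(level, order="natural")])

rng = np.random.default_rng(1)
images = rng.normal(size=(40, 64, 64))          # 40 synthetic grayscale lesion patches
labels = rng.integers(0, 2, size=40)            # 0 = benign, 1 = melanoma (toy labels)

X = np.array([wpt_energy_features(img) for img in images])
clf = SVC(kernel="rbf", C=1.0).fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```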
Abstract:
The usual programs for load flow calculation were, in general, developed with the aim of simulating electric energy transmission, subtransmission, and distribution systems. However, the mathematical methods and algorithms used by these formulations were based, for the most part, only on the characteristics of transmission systems, which were the main concern of engineers and researchers. The physical characteristics of those systems, though, are quite different from those of distribution systems. In transmission systems, the voltage levels are high and the lines are generally very long. These aspects mean that the capacitive and inductive effects of the system have a considerable influence on the quantities of interest and must therefore be taken into account. Also, in transmission systems the loads have a macro nature, such as cities, neighborhoods, or large industries. These loads are, in general, practically balanced, which reduces the need for a three-phase methodology for load flow calculation. Distribution systems, on the other hand, present different characteristics: the voltage levels are low in comparison to transmission, which practically cancels the capacitive effects of the lines. The loads, in this case, are transformers whose secondaries supply small consumers, often single-phase ones, so the probability of finding an unbalanced circuit is high. Thus, the use of three-phase methodologies becomes important. Moreover, equipment such as voltage regulators, which simultaneously use the concepts of phase and line voltage in their operation, requires a three-phase methodology in order to simulate its real behavior. For these reasons, a method for three-phase load flow calculation was initially developed, within the scope of this work, to simulate the steady-state behavior of distribution systems. To achieve this goal, the Power Summation Algorithm was used as the basis for developing the three-phase method. This algorithm had already been widely tested and approved by researchers and engineers for the simulation of radial electric energy distribution systems, mainly in single-phase representation. In our formulation, lines are modeled as three-phase circuits, considering the magnetic coupling between the phases, while the earth effect is accounted for through the Carson reduction. It is important to point out that, although the loads are normally connected to the transformer secondaries, the hypothesis of star- or delta-connected loads on the primary circuit was also considered. To simulate voltage regulators, a new model was used that allows the simulation of various types of configurations, according to their real operation. Finally, the representation of switches with current measurement at various points of the feeder was considered: the loads are adjusted during the iterative process so that the current in each switch converges to the measured value specified in the input data. In a second stage of the work, sensitivity parameters were derived, taking the described load flow as a basis, with the objective of supporting subsequent optimization processes. These parameters are obtained by calculating the partial derivatives of one variable with respect to another, in general voltages, losses, and reactive powers.
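A single-phase sketch of the backward/forward power-summation sweep on which the three-phase formulation is built, assuming a short hypothetical radial feeder; phase coupling, the Carson reduction, voltage regulators, and switch current matching are omitted.

```python
# Single-phase power-summation sweep on a 3-branch radial feeder (per-unit values).
import numpy as np

# Node 0 is the substation; branch i feeds node i+1.
Z = np.array([0.02 + 0.04j, 0.03 + 0.05j, 0.03 + 0.05j])      # branch impedances (pu)
S_load = np.array([0.10 + 0.05j, 0.15 + 0.07j, 0.08 + 0.03j])  # loads at nodes 1..3 (pu)
V = np.ones(4, dtype=complex)                                   # flat start voltages

for _ in range(20):                                  # fixed-point iterations
    # Backward sweep: accumulate downstream power plus branch losses.
    S_branch = np.zeros(3, dtype=complex)
    for i in reversed(range(3)):
        downstream = S_branch[i + 1] if i + 1 < 3 else 0
        S_branch[i] = S_load[i] + downstream
        S_branch[i] += Z[i] * abs(S_branch[i] / V[i + 1]) ** 2   # branch loss
    # Forward sweep: update node voltages from the substation outwards.
    for i in range(3):
        I = np.conj(S_branch[i] / V[i + 1])
        V[i + 1] = V[i] - Z[i] * I

print(np.abs(V))   # voltage magnitudes along the feeder
```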
After describing the calculation of the sensitivity parameters, the Gradient Method is presented, which uses these parameters to optimize an objective function defined for each type of study. The first study concerns the reduction of technical losses in a medium voltage feeder through the installation of capacitor banks; the second concerns the correction of the voltage profile through the installation of capacitor banks or voltage regulators. For loss reduction, the objective function is the sum of the losses in all parts of the system. For voltage profile correction, the objective function is the sum of the squared voltage deviations at each node with respect to the rated voltage. At the end of the work, results of the application of the described methods to several feeders are presented, in order to give insight into their performance and accuracy.
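A toy illustration of a gradient step for the voltage-profile objective, assuming a hypothetical stand-in function for the load flow and a finite-difference sensitivity in place of the analytic partial derivatives described in the work.

```python
# Minimize the sum of squared deviations of node voltages from 1.0 pu with respect
# to a capacitor injection q. node_voltages() is a made-up stand-in for the load flow.
import numpy as np

def node_voltages(q):
    """Hypothetical voltages (pu) as a linear function of the reactive injection q."""
    return np.array([1.0, 0.97 + 0.02 * q, 0.95 + 0.03 * q, 0.94 + 0.035 * q])

def objective(q):
    return np.sum((node_voltages(q) - 1.0) ** 2)

q, step, eps = 0.0, 200.0, 1e-6          # step size tuned for this toy problem
for _ in range(50):
    grad = (objective(q + eps) - objective(q - eps)) / (2 * eps)   # sensitivity dF/dq
    q -= step * grad
print(f"optimal injection q = {q:.3f} pu, objective = {objective(q):.6f}")
```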
Evaluation of two methods for semen conditioning and collection in four species of the genus Mazama
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
The reconfiguration of a distribution network is a change in its topology, performed by changing the status of its switches in order to attain specific operating conditions, and it can be carried out regardless of any system anomaly. Service restoration is a particular case of reconfiguration and should be performed whenever there is a network failure or whenever one or more sections of a feeder have been taken out of service for maintenance. In such cases, loads supplied through line sections downstream of the portions removed for maintenance may be supplied by closing switches to other feeders. With classical reconfiguration methods, several switching operations may be required beyond those strictly needed to perform the service restoration, including switching between feeders of the same substation or of substations that have no direct connection to the faulted feeder. These operations can cause discomfort, losses, and dissatisfaction among consumers, as well as a negative reputation for the energy company. The purpose of this thesis is to develop a heuristic for the reconfiguration of a distribution network upon the occurrence of a failure, performing switching only on the feeders directly involved with the failed segment. The switching applied is related exclusively to isolating the failed sections and buses and to supplying electricity to the islands created by the fault, with a significant reduction in the number of load flow executions, owing to the use of sensitivity parameters to estimate the voltages and currents on the buses and lines of the feeders directly involved with the failed segment. A comparison between this procedure and classical methods is performed on different test networks from the literature on network reconfiguration.
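A minimal sketch of the restoration idea, assuming a small hypothetical two-feeder network rather than the literature test systems: open the faulted section, find the de-energized island, and close a tie switch of a directly adjacent feeder.

```python
# Isolate a faulted line section and re-energize the resulting island through a tie switch.
import networkx as nx

g = nx.Graph()
g.add_edges_from([("sub_A", 1), (1, 2), (2, 3), ("sub_B", 4), (4, 5)])
tie_switches = [(3, 5)]                 # normally open switch between the two feeders

faulted_section = (1, 2)                # fault detected on this line section
g.remove_edge(*faulted_section)         # open the switches that isolate the fault

# Buses left without a path to any substation form the de-energized island.
island = {n for n in g.nodes
          if not any(nx.has_path(g, n, s) for s in ("sub_A", "sub_B"))}

# Close the first tie switch that reconnects the island to a healthy feeder.
for u, v in tie_switches:
    if (u in island) != (v in island):
        g.add_edge(u, v)
        print(f"closing tie switch {u}-{v}; restored buses: {sorted(island, key=str)}")
        break
```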
Abstract:
In this work we present a new clustering method that groups the points of a data set into classes. The method is based on an algorithm that links auxiliary clusters obtained by traditional vector quantization techniques. Several approaches developed during the work are described, based on measures of distance or dissimilarity (divergence) between the auxiliary clusters. The new method uses only two pieces of a priori information: the number of auxiliary clusters Na and a threshold distance dt used to decide whether or not to link auxiliary clusters. The number of classes can be found automatically by the method, based on the chosen threshold distance dt, or it can be given as additional information to help in the choice of the correct threshold. Several analyses are carried out and the results are compared with traditional clustering methods. Different dissimilarity metrics are analyzed and a new one is proposed, based on the concept of negentropy. Besides grouping the points of a set into classes, a method is proposed for the statistical modeling of the classes, aiming to obtain an expression for the probability that a point belongs to each class. Experiments with several values of Na and dt are carried out on test sets, and the results are analyzed in order to study the robustness of the method and to devise heuristics for the choice of the correct threshold. Throughout the work, aspects of information theory applied to the calculation of the divergences are explored, specifically the different measures of information and divergence based on the Rényi entropy. The results obtained with the different metrics are compared and discussed. The work also includes an appendix presenting real applications of the proposed method.
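A sketch of the two-stage scheme: vector quantization into Na auxiliary clusters, then linkage of auxiliary clusters whose centroids are closer than the threshold dt. Plain Euclidean distance stands in for the divergence measures discussed in the work, and Na and dt below are arbitrary illustrative values.

```python
# Auxiliary clusters by k-means, then single-linkage merging under a threshold dt.
import numpy as np
import networkx as nx
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=300, centers=3, cluster_std=0.6, random_state=0)

Na, dt = 12, 2.0                                   # number of auxiliary clusters, threshold
km = KMeans(n_clusters=Na, n_init=10, random_state=0).fit(X)

# Link auxiliary clusters whose centroids are closer than dt.
g = nx.Graph()
g.add_nodes_from(range(Na))
for i in range(Na):
    for j in range(i + 1, Na):
        if np.linalg.norm(km.cluster_centers_[i] - km.cluster_centers_[j]) < dt:
            g.add_edge(i, j)

# Each connected component of linked auxiliary clusters is a final class.
components = list(nx.connected_components(g))
final_label = {aux: k for k, comp in enumerate(components) for aux in comp}
labels = np.array([final_label[a] for a in km.labels_])
print("number of classes found:", len(components))
```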
Abstract:
A 2.5D ray-tracing propagation model is proposed to predict radio loss in indoor environments. Specifically, we opted for the Shooting and Bouncing Rays (SBR) method, together with the Geometrical Theory of Diffraction (GTD). Besides line-of-sight (LOS) propagation, we consider that the radio waves may experience reflection, refraction, and diffraction (NLOS). In the SBR method, the transmitter antenna launches a bundle of rays that may or may not reach the receiver. Considering the transmitting antenna as a point, the rays are launched from this position and can reach the receiver either directly or after reflections, refractions, diffractions, or any combination of these effects. To model the environment, a database is built to record the geometrical characteristics and information on the constituent materials of the scenario. The database works independently of the simulation program, providing robustness and flexibility for modeling other scenarios. Each propagation mechanism is treated separately. In line-of-sight propagation, the main contribution to the received signal comes from the direct ray, while reflected, refracted, and diffracted signals dominate when the line of sight is blocked. In this case, the transmitted signal reaches the receiver through more than one path, resulting in multipath fading. The transmission channel of a mobile system is simulated by moving either the transmitter or the receiver around the environment. The validity of the method is verified through simulations and measurements. The computed path losses are compared with values measured at 1.8 GHz. The results were obtained for the main corridor and the classrooms adjacent to it, and reasonable agreement is observed. The numerical predictions are also compared with published data at 900 MHz and 2.44 GHz, showing good convergence.
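A very small illustration of the ray-combination bookkeeping, assuming a direct ray plus one specular wall reflection (image method) with a fixed reflection coefficient; SBR ray launching, refraction, and GTD diffraction are not reproduced here, and the geometry and coefficient are made-up values.

```python
# Two-ray (direct + one wall reflection) path loss at 1.8 GHz, image method.
import numpy as np

c, f = 3e8, 1.8e9
lam, k = c / f, 2 * np.pi * f / c
tx, rx = np.array([1.0, 2.0]), np.array([9.0, 2.5])      # positions in a corridor (m)
wall_y = 0.0                                             # reflecting wall along y = 0
gamma = -0.7                                             # assumed reflection coefficient

d_los = np.linalg.norm(rx - tx)
tx_image = np.array([tx[0], 2 * wall_y - tx[1]])         # image of the transmitter
d_ref = np.linalg.norm(rx - tx_image)

# Sum the complex field contributions of the two rays (free-space amplitude 1/d).
field = (lam / (4 * np.pi)) * (np.exp(-1j * k * d_los) / d_los
                               + gamma * np.exp(-1j * k * d_ref) / d_ref)
path_loss_db = -20 * np.log10(np.abs(field))
print(f"two-ray path loss: {path_loss_db:.1f} dB")
```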
Abstract:
The Electrical Submersible Pump (ESP) has been one of the most appropriate artificial lift solutions for onshore and offshore applications. Typical conditions for this application are adverse temperatures, viscous fluids, and gassy environments. The difficulties of equipment maintenance and setup contribute to the increasing cost of oil production in deep water; therefore, optimization through automation can be an excellent approach to reduce costs and failures in subsurface equipment. This work describes a computer simulator of the ESP artificial lift method. The tool reproduces the dynamic behavior of the ESP system, including a model of the electric energy source and of the transmission line to the motor, the electric motor model (including thermal calculation), the flow in the tubing, the centrifugal pump behavior with fluid-nature effects, and the reservoir inflow. In addition, there is a three-dimensional animation for each ESP subsystem (transformer, motor, pump, seal, gas separator, command unit). The proposed simulator aims to improve the monitoring of oil wells in order to maximize well production. Currently, proprietary simulators are tied to specific equipment manufacturers, so it is not possible to simulate equipment from other manufacturers; the proposed approach supports equipment from diverse manufacturers.
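An illustrative fragment of a pump sub-model of the kind the simulator includes, assuming a made-up single-stage head curve scaled by the affinity laws; it is not the simulator's actual pump, motor, or thermal model.

```python
# Single-stage head-versus-flow curve at 60 Hz, scaled by the affinity laws
# when the motor is driven at another frequency. Coefficients are invented.
import numpy as np

def pump_head(flow_m3d, freq_hz, ref_freq_hz=60.0):
    """Head (m) of one stage as a function of flow, corrected by the affinity laws."""
    ratio = freq_hz / ref_freq_hz
    q_ref = flow_m3d / ratio                              # flow referred to the reference speed
    head_ref = 12.0 - 0.002 * q_ref - 1.5e-5 * q_ref**2   # hypothetical catalogue curve
    return head_ref * ratio**2                            # head scales with the square of speed

stages = 100
for f in (50.0, 60.0):
    q = np.linspace(0, 600, 4)                            # flow rates in m3/day
    print(f"{f} Hz:", np.round(stages * pump_head(q, f), 1), "m of head")
```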
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
This work proposes a kinematic control scheme using visual feedback for a robot arm with five degrees of freedom. Using computational vision techniques, a method was developed to determine the Cartesian 3D position and orientation (pose) of the robot arm from a robot image obtained through a camera. A colored triangular label is placed on the robot manipulator tool, and efficient heuristic rules are used to locate the vertices of this label in the image. The tool pose is obtained from these vertices through numerical methods. A color calibration scheme based on the K-means algorithm was implemented to guarantee the robustness of the vision system in the presence of lighting variations. The extrinsic camera parameters are computed from the image of four coplanar points whose Cartesian 3D coordinates, relative to a fixed frame, are known. Two distinct tool poses, initial and final, obtained from the image, are interpolated to generate a desired trajectory in Cartesian space. The error signal in the proposed control scheme is the difference between the desired tool pose and the current tool pose. Gains are applied to the error signal, and the resulting signal is mapped to joint increments using the pseudoinverse of the manipulator Jacobian matrix. These increments are applied to the manipulator joints, moving the tool to the desired pose.
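A sketch of the control law described above, mapping the pose error through a gain and the Jacobian pseudoinverse to joint increments; the 6x5 Jacobian and the poses below are made-up numeric examples, not the real 5-DOF robot's values.

```python
# Resolved-rate style step: dq = pinv(J) * (gain * pose error).
import numpy as np

def joint_increments(pose_desired, pose_current, jacobian, gain=0.5):
    """Map the Cartesian pose error to joint increments via the Jacobian pseudoinverse."""
    error = pose_desired - pose_current              # position + orientation error
    return np.linalg.pinv(jacobian) @ (gain * error)

J = np.random.default_rng(2).normal(size=(6, 5))     # hypothetical 6x5 Jacobian (5 DOF)
pose_d = np.array([0.40, 0.10, 0.30, 0.0, 0.0, 0.1]) # desired pose (x, y, z, roll, pitch, yaw)
pose_c = np.array([0.35, 0.12, 0.28, 0.0, 0.1, 0.0]) # pose estimated from the camera image
dq = joint_increments(pose_d, pose_c, J)
print("joint increments (rad):", np.round(dq, 4))
```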
Abstract:
In practically all vertical markets and in every region of the planet, loyalty marketers have adopted the tactic of recognition and reward to identify, retain, and increase the yield of their customers. Several strategies have been adopted by companies, the most popular being the loyalty program, which uses a loyalty club to manage these rewards. The problem with loyalty programs, however, is that customer identification and the transfer of loyalty points are performed only semi-automatically. With this in mind, this master's work presents an embedded commercial automation solution called e-Points. The goal of e-Points is to equip loyalty clubs with fully automated technology to identify customers directly at the point of sale, ensuring greater control over the loyalty of club members. To this end, we developed a hardware platform with an embedded system and RFID technology to be used at the retailers' PCs, a smart card to accumulate points with every purchase, and a web server that provides services of interest to retailers and to the customers belonging to the club.
Abstract:
This work proposes a collaborative system for marking dangerous points along transport routes and generating alerts to drivers. It consists of a proximity warning system for danger points, fed by the drivers through a mobile device equipped with GPS. The system consolidates the data provided by several different drivers and generates a set of common points to be used by the warning system. Although the application is designed to protect drivers, the data generated by it can also serve as input for the responsible authorities to improve the signage and maintenance of public roads.
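A sketch of the two steps described above, assuming consolidation by simple coordinate rounding and a fixed alert radius; the coordinates and the 300 m radius are illustrative, not the system's actual parameters.

```python
# Consolidate danger points reported by several drivers and raise a proximity alert.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS coordinates."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

reports = [(-5.81234, -35.20101), (-5.81241, -35.20110), (-5.80050, -35.19500)]
# Reports falling in the same ~100 m grid cell are treated as the same danger point.
consolidated = {(round(lat, 3), round(lon, 3)) for lat, lon in reports}

vehicle = (-5.8121, -35.2012)
for lat, lon in consolidated:
    if haversine_m(vehicle[0], vehicle[1], lat, lon) < 300:
        print(f"warning: danger point ahead near ({lat}, {lon})")
```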