1000 results for Landscape Metrics
Abstract:
Landforms can act as indicators of the variation of soil attributes, since this variability is caused by small changes in slope that affect pedogenetic processes as well as the transport and storage of water in the soil profile. The study was carried out in Catanduva (SP, Brazil) with the objective of characterizing the spatial variability of soil attributes and erosion factors in different landforms under sugarcane cultivation. According to the Troeh model, the relief was classified into two landforms, concave and convex. Elevations were surveyed with a DGPS, establishing a grid with regular 50 m intervals, with 270 points on the concave landform and 353 points on the convex landform, totaling 623 points collected at a depth of 0.0-0.2 m over an area of 200 ha. At each grid point the soil chemical attributes, particle-size distribution, soil depth and local erosion factors were determined, namely erosivity (R), erodibility (K), topographic factor (LS), land use and management (C), conservation practices (P), natural erosion potential (PNE), soil loss (A) and erosion risk (RE). The data were first submitted to exploratory statistical analysis, computing the mean, median, variance, coefficient of variation, skewness, kurtosis and a normality test. Spatial dependence was then assessed by geostatistical techniques using semivariograms. The highest soil loss, erosion risk and natural erosion potential, and the smallest soil depth, occurred on the convex landform, indicating strong spatial dependence on the relief form. The concave landform showed greater spatial variability, demonstrating that the relief form conditions distinct patterns of variability. The magnitude of soil attribute variability is influenced more by relief form than by erosion.
The thickness of the A+E horizon, combined with the relief form, is an indicator of erosive processes for the Argissolos (Ultisols) class.
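The geostatistical step described above can be sketched with an experimental semivariogram; the sample coordinates, attribute values and lag settings below are illustrative, not the Catanduva data set:

```python
import numpy as np

# Illustrative sample locations and soil-attribute values (not real data).
rng = np.random.default_rng(42)
coords = rng.uniform(0, 1000, size=(200, 2))   # sample locations (m)
values = rng.normal(20.0, 3.0, size=200)       # e.g. clay content (%)

def experimental_semivariogram(coords, values, lags, tol):
    """gamma(h) = mean of 0.5*(z_i - z_j)^2 over pairs separated by ~h."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    iu = np.triu_indices(len(values), k=1)     # count each pair once
    dist = d[iu]
    dz2 = 0.5 * (values[iu[0]] - values[iu[1]]) ** 2
    return [dz2[np.abs(dist - h) <= tol].mean() for h in lags]

lags = np.arange(50, 500, 50)                  # matches the 50 m grid spacing
gamma = experimental_semivariogram(coords, values, lags, tol=25.0)
```

Fitting a model (spherical, exponential, ...) to these points then yields the nugget, sill and range used to judge the strength of spatial dependence.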
Abstract:
This work addresses the analysis and development of multivariable predictive controllers based on bilinear multi-models. Monovariable and multivariable linear Generalized Predictive Control (GPC) is reviewed, highlighting its properties, key features and industrial applications. Bilinear GPC, the basis for the development of this thesis, is presented via the time-step quasilinearization approach. Results using this controller show its superior performance compared with linear GPC, since bilinear models better represent the dynamics of certain processes. Because time-step quasilinearization is an approximation, it introduces a prediction error that limits the performance of this controller as the prediction horizon increases. To minimize this error, Bilinear GPC with iterative compensation is presented, seeking better performance than classical Bilinear GPC, and results of the iterative compensation algorithm are shown. The use of multi-models is discussed in this thesis in order to correct the deficiency of controllers based on a single model when they are applied over wide operating ranges. Methods for measuring the distance between models, also called metrics, are the main contribution of this thesis. Several applications to simulated distillation columns, close enough to their actual behaviour, were carried out, and the results were satisfactory.
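The horizon-dependent prediction error of time-step quasilinearization can be illustrated on a toy SISO bilinear model; the model structure and coefficients below are assumptions for illustration, not taken from the thesis:

```python
import numpy as np

# Hypothetical SISO bilinear model: y[k+1] = a*y[k] + b*u[k] + c*y[k]*u[k]
a, b, c = 0.7, 0.4, 0.15   # illustrative coefficients

def simulate(y0, u_seq):
    """Exact bilinear response."""
    y, out = y0, []
    for u in u_seq:
        y = a * y + b * u + c * y * u
        out.append(y)
    return np.array(out)

def quasilinear_predict(y0, u_prev, u_seq):
    """Time-step quasilinearization: the bilinear coefficient is frozen at
    the previous input, so each step is treated as linear. Because the
    frozen term is not updated along the horizon, the approximation error
    grows as the prediction horizon increases."""
    y, out = y0, []
    for u in u_seq:
        y = (a + c * u_prev) * y + b * u   # u_prev frozen, not updated
        out.append(y)
    return np.array(out)

u_seq = np.ones(10) * 0.5
exact = simulate(1.0, u_seq)
approx = quasilinear_predict(1.0, 0.0, u_seq)  # frozen at u_prev = 0
err = np.abs(exact - approx)                   # grows with the horizon
```

The iterative compensation scheme mentioned above attacks exactly this growing error by re-evaluating the frozen term with the latest predictions.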
Abstract:
Internet applications such as media streaming, collaborative computing and massively multiplayer games are on the rise. This leads to the need for multicast communication, but group communication support based on IP multicast has unfortunately not been widely adopted, due to a combination of technical and non-technical problems. Therefore, a number of application-layer multicast schemes have been proposed in the recent literature to overcome these drawbacks. In addition, these applications often behave as both providers and clients of services, being called peer-to-peer applications, in which participants come and go very dynamically. Thus, server-centric architectures for membership management have well-known problems related to scalability and fault tolerance, and even traditional peer-to-peer solutions need some mechanism that takes members' volatility into account. The idea of location awareness is to distribute the participants in the overlay network according to their proximity in the underlying network, allowing better performance. In this context, this thesis proposes an application-layer multicast protocol, called LAALM, which takes the actual network topology into account when assembling the overlay network. The membership algorithm uses a new metric, IPXY, to provide location awareness through the processing of local information, and it was implemented using a distributed, shared, bi-directional tree. The algorithm also has a sub-optimal heuristic to minimize the cost of the membership process. The protocol was evaluated in two ways. First, through a simulator developed in this work, where the quality of the distribution tree was evaluated by metrics such as out-degree and path length. Second, real-life scenarios were built in the ns-3 network simulator, where the protocol's performance was evaluated by metrics such as stress, stretch, time to first packet and group reconfiguration time.
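The stress and stretch metrics used in the evaluation can be sketched on a toy topology; the links, members and hop counts below are illustrative, not taken from the LAALM experiments:

```python
from collections import Counter

# Direct unicast distances (physical hops) from the source S to each member.
unicast_hops = {"A": 2, "B": 3}

# Star-shaped overlay: S sends one copy to A and one to B; each overlay
# edge is a unicast path listed as its physical links.
overlay_paths = {
    ("S", "A"): ["S-r1", "r1-A"],
    ("S", "B"): ["S-r1", "r1-r2", "r2-B"],
}

# Stretch: overlay path length divided by the direct unicast length.
overlay_hops = {"A": 2, "B": 3}
stretch = {m: overlay_hops[m] / unicast_hops[m] for m in unicast_hops}

# Stress: how many copies of the same packet each physical link carries.
stress = Counter(link for path in overlay_paths.values() for link in path)
max_stress = max(stress.values())   # link S-r1 carries two copies here
```

This star overlay has unit stretch but stress 2 on the source's access link; location-aware protocols such as LAALM trade between exactly these two quantities.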
Abstract:
Most current monitoring of onshore Oil and Gas environments is based on wireless solutions. However, these solutions rely on out-of-date technology, mainly analog radios and inefficient communication topologies. Solutions based on digital radios, on the other hand, can be more efficient with respect to energy consumption, security and fault tolerance. Thus, this work evaluated whether Wireless Sensor Networks, a communication technology based on digital radios, are adequate for monitoring onshore Oil and Gas wells. The percentage of successfully transmitted packets, energy consumption and communication delay were used as metrics to validate the proposal, with different routing techniques applied to a mesh topology and evaluated in the NS-2 network simulation tool.
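The delivery and energy metrics named above are typically extracted from simulator traces; the record layout and numbers below are a simplified sketch, not the actual NS-2 trace format:

```python
# Toy trace: each record is one send or receive event for a data packet.
events = [
    {"type": "send", "pkt": 1}, {"type": "recv", "pkt": 1},
    {"type": "send", "pkt": 2},                      # packet 2 was lost
    {"type": "send", "pkt": 3}, {"type": "recv", "pkt": 3},
]
sent = sum(1 for e in events if e["type"] == "send")
received = sum(1 for e in events if e["type"] == "recv")
pdr = 100.0 * received / sent    # percent of packets delivered

# Per-node energy consumption: initial charge minus residual charge
# (joule values are assumed for illustration).
initial = 100.0
residual = {"n1": 82.5, "n2": 91.0}
consumed = {n: initial - r for n, r in residual.items()}
```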
Abstract:
Ensuring dependability requirements is essential for industrial applications, since faults may cause failures whose consequences result in economic losses, environmental damage or harm to people. Given the relevance of the topic, this thesis proposes a methodology for the dependability evaluation of industrial wireless networks (WirelessHART, ISA100.11a, WIA-PA) at the early design phase, although the proposal can easily be adapted to the maintenance and expansion stages of a network. The proposal uses graph theory and the fault tree formalism to automatically create an analytical model from a given industrial wireless network topology, on which dependability can be evaluated. The supported evaluation metrics are reliability, availability, MTTF (mean time to failure), importance measures of devices, redundancy aspects and common cause failures. It must be emphasized that the proposal is independent of any particular tool for quantitatively evaluating the target metrics; however, for validation purposes a tool widely accepted in academia (SHARPE) was used. In addition, an algorithm for generating minimal cut sets, originally applied in graph theory, was adapted to the fault tree formalism to guarantee the scalability of the methodology in industrial wireless network environments (< 100 devices). Finally, the proposed methodology was validated on typical scenarios found in industrial environments, such as star, line, cluster and mesh topologies. Scenarios with common cause failures were also evaluated, along with best practices to guide the design of an industrial wireless network. To verify the scalability requirements, the performance of the methodology was analyzed in different scenarios, and the results showed the applicability of the proposal to networks typically found in industrial environments.
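The way minimal cut sets feed a quantitative dependability figure can be sketched as follows; the devices, failure probabilities and cut sets are illustrative, assuming independent failures:

```python
import itertools

# q[d]: probability that device d fails over the mission time (assumed).
q = {"gw": 0.01, "r1": 0.05, "r2": 0.05, "sensor": 0.02}

# Minimal cut sets: the network fails iff every device in some set fails.
cut_sets = [{"gw"}, {"r1", "r2"}, {"sensor"}]

def system_unreliability(q, cut_sets):
    """Exact unreliability by inclusion-exclusion over the cut sets."""
    total = 0.0
    for k in range(1, len(cut_sets) + 1):
        for combo in itertools.combinations(cut_sets, k):
            devices = set().union(*combo)   # union of the chosen cut sets
            prob = 1.0
            for d in devices:
                prob *= q[d]
            total += (-1) ** (k + 1) * prob
    return total

Q = system_unreliability(q, cut_sets)
R = 1.0 - Q    # system reliability
```

Inclusion-exclusion is exponential in the number of cut sets, which is why scalable cut-set generation matters for networks approaching 100 devices.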
Abstract:
This work presents experimental studies of VoIP connections over WiFi 802.11b networks with handoff. Indoor and outdoor network experiments were carried out to measure the QoS parameters delay, throughput, jitter and packet loss. The performance parameters were obtained with the software tools Ekiga, Iperf and Wimanager, which provide, respectively, VoIP connection simulation, network traffic generation, and acquisition of the throughput, jitter and packet loss metrics. The average delay is derived from the measured throughput and the concept of packet virtual transmission time. The experimental data are validated against the QoS level for each metric accepted as adequate in the specialized literature.
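One common way to compute the jitter metric mentioned above is RFC 3550's smoothed interarrival jitter; the timestamps below are illustrative, not measurements from these experiments:

```python
# Send and arrival timestamps of consecutive VoIP packets (seconds, toy data).
send = [0.00, 0.02, 0.04, 0.06]
recv = [0.10, 0.125, 0.141, 0.166]

jitter = 0.0
for i in range(1, len(send)):
    # D = variation in transit time between consecutive packets
    d = abs((recv[i] - send[i]) - (recv[i - 1] - send[i - 1]))
    jitter += (d - jitter) / 16.0   # RFC 3550 smoothing gain of 1/16
```

The running estimate converges toward the typical transit-time variation, damping out isolated spikes such as those caused by a handoff.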
Abstract:
This work aimed to evaluate the evolution of land use in the municipality of Botucatu - SP over a three-year period, considering six types of vegetation cover (sugarcane, reforestation, native forest, pasture, citrus and others), based on Landsat 5 satellite images, bands 3, 4 and 5, orbit 220, point 76, quadrant A, acquired on June 8, 1999. The Geographic Information System IDRISI for Windows 3.2 was used for the analyses. The results showed that this software was effective in supporting the identification and mapping of land-use areas, facilitating data processing. The TM/LANDSAT 5 satellite images provided an excellent database for supervised classification. The municipality is not being environmentally preserved, since less than 20% of its area is covered by native forest, the minimum required by law. Pasture areas, the main component of the municipal landscape, confirm the region's vocation for livestock farming.
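The per-class area percentages behind conclusions like the 20% native-forest threshold come directly from the classified raster; the grid and class codes below are assumptions for illustration:

```python
import numpy as np
from collections import Counter

# Toy supervised-classification result. Assumed class codes:
# 1=sugarcane 2=reforestation 3=native forest 4=pasture 5=citrus 6=other
classified = np.array([
    [4, 4, 3, 1],
    [4, 2, 3, 1],
    [4, 4, 5, 6],
    [4, 4, 1, 1],
])
counts = Counter(classified.ravel().tolist())
total = classified.size
pct = {c: 100.0 * n / total for c, n in counts.items()}

# Legal-minimum check for native forest (class 3 here).
native_forest_ok = pct.get(3, 0.0) >= 20.0
```

On a real scene each cell would carry the pixel area, so percentages translate directly into hectares per cover type.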
Abstract:
Recent years have seen growing acceptance and adoption of parallel processing, both for high-performance scientific computing and for general-purpose applications. This acceptance has been favored mainly by the development of massively parallel processing (MPP) environments and of distributed computing. A point in common between distributed systems and MPP architectures is the notion of message passing, which allows communication between processes. A message-passing environment consists basically of a communication library that, acting as an extension of programming languages such as C, C++ and Fortran, allows parallel applications to be written. In the development of parallel applications, a fundamental aspect is their performance analysis. Several metrics can be used in this analysis: execution time, efficiency in the use of the processing elements, and scalability of the application with respect to the number of processors or to the size of the problem instance. Establishing models or mechanisms for this analysis can be quite complicated, given the parameters and degrees of freedom involved in implementing a parallel application. One alternative is the use of tools for the collection and visualization of performance data, which allow the user to identify bottlenecks and sources of inefficiency in an application. Efficient visualization requires identifying and collecting data on the execution of the application, a stage called instrumentation.
This work first presents a study of the main techniques used for performance data collection, followed by a detailed analysis of the main available tools for Beowulf-cluster parallel architectures running Linux on the x86 platform with MPI (Message Passing Interface) communication libraries such as LAM and MPICH. This analysis is validated on parallel applications that train perceptron neural networks using back-propagation. The conclusions show the potential and ease of use of the analyzed tools.
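The classic metrics named above reduce to two simple ratios; the timings below are illustrative, not measurements from the cluster in this work:

```python
# Wall-clock timings (seconds): serial run and runs on p processors (assumed).
t_serial = 120.0
t_parallel = {2: 65.0, 4: 36.0, 8: 22.0}

# Speedup S(p) = T(1) / T(p); efficiency E(p) = S(p) / p.
speedup = {p: t_serial / t for p, t in t_parallel.items()}
efficiency = {p: speedup[p] / p for p in t_parallel}
```

Efficiency typically falls as processors are added (communication and instrumentation overheads grow), which is exactly what the visualization tools help to localize.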
Abstract:
The use of wireless sensor and actuator networks in industry has been increasing in recent years, bringing multiple benefits compared to wired systems, such as network flexibility and manageability. Such networks consist of a possibly large number of small, autonomous sensor and actuator devices with wireless communication capabilities. The data collected by the sensors are sent, directly or through intermediate nodes, to a base station called the sink node. Data routing in this environment is an essential matter, since it is strictly bound to energy efficiency and hence to the network lifetime. This work investigates the application of a routing technique based on Reinforcement Learning's Q-Learning algorithm to a wireless sensor network, using an NS-2 simulated environment. Several metrics, such as energy consumption, data packet delivery rate and delay, are used to validate the proposal, comparing it with other solutions in the literature.
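The core of a Q-Learning routing scheme is the standard value update applied each time a packet is forwarded; the reward shape, nodes and constants below are assumptions for illustration:

```python
# Learning rate and discount factor (illustrative values).
alpha, gamma = 0.5, 0.9

# Q[node][neighbor]: estimated cost-to-sink of forwarding via that neighbor.
Q = {"n1": {"n2": 0.0, "n3": 0.0}}

def update(node, neighbor, reward, q_next_min):
    """Standard Q-learning update after forwarding one packet.
    reward could penalize energy spent and hop delay; q_next_min is the
    neighbor's reported best Q-value toward the sink."""
    old = Q[node][neighbor]
    Q[node][neighbor] = old + alpha * (reward + gamma * q_next_min - old)

# n1 forwards via n2; n2 reaches the sink directly, so its best Q is 0.
update("n1", "n2", reward=-1.0, q_next_min=0.0)
```

Each node then forwards to the neighbor with the best Q-value (with occasional exploration), so routes adapt as energy is consumed along the network.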
Abstract:
This work analyzes the performance of a parallel implementation of Coupled Simulated Annealing (CSA) for the unconstrained optimization of continuous-variable problems. Parallel processing is an efficient form of information processing that emphasizes the exploitation of simultaneous events in the execution of software. It arises primarily from the demand for high computational performance and the difficulty of increasing the speed of a single processing core. Although multicore processors are easily found nowadays, several algorithms are not yet suitable for running on parallel architectures. The CSA algorithm is characterized by a group of Simulated Annealing (SA) optimizers working together to refine the solution, each running in a separate thread executed by a different processor. In the analysis of parallel performance and scalability, the following metrics were investigated: the execution time; the speedup of the algorithm with respect to the number of processors; and the efficiency in the use of processing elements with respect to the size of the treated problem. Furthermore, the quality of the final solution was verified. For this study, a parallel version of CSA and its equivalent serial version are proposed, and both algorithms were analyzed on 14 benchmark functions. For each of these functions, CSA was evaluated using 2 to 24 optimizers. The results are presented and discussed in light of these metrics, and the conclusions characterize CSA as a good parallel algorithm, both in the quality of its solutions and in its parallel scalability and efficiency.
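What couples the SA optimizers in CSA is a shared normalization term in the acceptance probability; the sketch below follows the usual Boltzmann-style coupled acceptance rule with illustrative energies (in practice the exponents are shifted by the maximum energy for numerical stability, which leaves the probabilities unchanged):

```python
import math

T_acc = 1.0                        # acceptance temperature (illustrative)
energies = [3.0, 2.5, 4.0]         # current energy of each SA optimizer

# Coupling term: computed once over the energies of all current solutions,
# then shared by every optimizer in the ensemble.
gamma = sum(math.exp(e / T_acc) for e in energies)

def accept_probability(e_current):
    """Coupled acceptance probability of the optimizer at e_current."""
    return math.exp(e_current / T_acc) / gamma

probs = [accept_probability(e) for e in energies]
```

Because every optimizer needs the same coupling term, a parallel implementation synchronizes the threads once per iteration to exchange energies, which is the main source of parallel overhead measured in the scalability analysis.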
Abstract:
The increasing demand for high-performance wireless communication systems has exposed the inefficiency of the current model of fixed radio spectrum allocation. In this context, cognitive radio appears as a more efficient alternative, providing opportunistic spectrum access with the maximum possible bandwidth. To meet these requirements, the transmitter must identify transmission opportunities and the receiver must recognize the parameters defined for the communication signal. Techniques based on cyclostationary analysis can be applied to both spectrum sensing and modulation classification, even in low signal-to-noise ratio (SNR) environments. However, despite its robustness, one of the main disadvantages of cyclostationarity is the high computational cost of calculating its functions. This work proposes efficient architectures for obtaining cyclostationary features to be employed in both spectrum sensing and automatic modulation classification (AMC). In the context of spectrum sensing, a parallelized algorithm for extracting cyclostationary features of communication signals is presented, and the performance of this parallelization is evaluated by the speedup and parallel efficiency metrics. The spectrum sensing architecture is analyzed for several configurations of false alarm probability, SNR levels and observation times for BPSK and QPSK modulations. In the context of AMC, the reduced alpha-profile is proposed as a cyclostationary signature calculated over a reduced set of cyclic frequencies. This signature is validated by a modulation classification architecture based on pattern matching. The AMC architecture is investigated for correct classification rates of AM, BPSK, QPSK, MSK and FSK modulations, considering several scenarios of observation length and SNR levels. The numerical performance results obtained in this work show the efficiency of the proposed architectures.
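The basic cyclostationary feature behind such architectures is the cyclic autocorrelation estimate; the sketch below shows it peaking at the symbol rate of a toy rectangular-pulse BPSK baseband signal (all signal parameters are assumptions for illustration):

```python
import numpy as np

fs, fsym, n = 1000, 50, 4000            # sample rate, symbol rate, length
rng = np.random.default_rng(0)

# Rectangular-pulse BPSK baseband: random +/-1 symbols, 20 samples each.
symbols = rng.choice([-1.0, 1.0], size=n // (fs // fsym))
x = np.repeat(symbols, fs // fsym)

def cyclic_autocorr(x, alpha, tau, fs):
    """Estimate R_x^alpha(tau) ~ <x(t) x(t+tau) e^{-j 2 pi alpha t}>."""
    t = np.arange(len(x) - tau) / fs
    return np.mean(x[: len(x) - tau] * x[tau:] * np.exp(-2j * np.pi * alpha * t))

at_rate = abs(cyclic_autocorr(x, alpha=fsym, tau=1, fs=fs))        # cycle freq
off_rate = abs(cyclic_autocorr(x, alpha=fsym * 1.37, tau=1, fs=fs))  # not one
```

Evaluating this estimator over a grid of cyclic frequencies alpha (and lags tau) is what makes the feature extraction expensive, and is exactly the loop that the parallelized algorithm and the reduced alpha-profile attack.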
Abstract:
The manufacture of prostheses for lower-limb amputees (transfemoral and transtibial) requires the preparation of a socket with an appropriate, custom fit to the profile of each patient. The traditional process, mainly in public hospitals in Brazil, begins with the completion of a form identifying types of equipment, accessories, measurements, levels of amputation and so on. Currently, this work is carried out manually, using a common measuring tape and a wooden caliper to take the measurements of the stump, a very rudimentary approach that leaves a high degree of uncertainty in the geometry of the final product. To address this problem, it was necessary to act in two simultaneous, correlated directions. First, an integrated 3D CAD viewing tool for transfemoral and transtibial prostheses, called OrtoCAD I, was developed. At the same time, it was necessary to design and build a mechanical reading device (a sort of simplified three-dimensional scanner) able to obtain, automatically and accurately, the geometric information of either the stump or the healthy leg. The methodology includes the application of reverse engineering concepts to computationally generate the representation of the stump and/or the mirror image of the healthy limb. The materials used in the manufacture of prostheses do not always follow technical-scientific criteria: while they may meet strength requirements, they bring serious problems, mainly due to excess weight, causing the user various disorders from poor fit. That problem was addressed by creating a hybrid composite material for the manufacture of prosthetic sockets. Using the mechanical reader and OrtoCAD, the new composite material, which combines the mechanical properties of strength and stiffness with important parameters such as low weight and low cost, can be specified in an optimal way.
Moreover, it reduces the number of steps in the current manufacturing processes, and may even make new industrial processes feasible for obtaining prostheses. In this sense, the hybridization of the composite by combining natural and synthetic fibers can be a viable solution to the challenges described above.
Abstract:
Due to advances in the manufacturing processes of orthopedic prostheses, the need for shape-reading techniques of better quality (i.e. with less uncertainty) for the residual limbs of amputees became a challenge. Overcoming this problem means being able to obtain accurate geometric information about the limb and, consequently, better manufacturing processes for both transfemoral and transtibial prosthetic sockets. The key point is to customize these readings so that they are as faithful as possible to the real profile of each patient. In this context, two prototype versions (α and β) of a 3D mechanical scanner for reading residual limb shape, based on reverse engineering techniques, were first designed. Prototype β is an improved version of prototype α, although it still works in analog mode. Both prototypes can produce a CAD representation of the limb via appropriate graphical sheets and were conceived to work by purely mechanical means. The first results were encouraging, achieving a great decrease in measurement uncertainty compared with traditional methods, which are very inaccurate and outdated; it is not unusual to see these archaic methods in action, exploring the limb's shape with ordinary household measuring tapes. Although prototype β improved the readings, it still required someone to enter the plotted points (i.e. those marked on disk-shaped graphical sheets) into an academic CAD software package called OrtoCAD. This is done by manual typing, which is time-consuming and of very limited reliability. Furthermore, the number of coordinates obtained from the purely mechanical system is limited by the subdivisions of the graphical sheet (it records a point every 10 degrees with a resolution of one millimeter).
These drawbacks were overcome in the second release of prototype β, in which an electronic variation of the reading table components was developed, now capable of performing automatic readings (i.e. in digital mode, with no human intervention). Interface software (i.e. a driver) was built to facilitate data transfer. Much better results were obtained, meaning a lower degree of uncertainty (it records a point every 2 degrees with a resolution of 1/10 mm). Additionally, an algorithm was proposed to convert the CAD geometry used by OrtoCAD to an appropriate format, enabling the use of rapid prototyping equipment and aiming at future automation of the manufacturing process of prosthetic sockets.
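Turning the scanner's polar readings (one radius every 2 degrees per cross-section) into Cartesian CAD points is a direct polar-to-Cartesian conversion; the function name and radii below are illustrative, not the OrtoCAD implementation:

```python
import math

def section_to_points(radii_mm, z_mm):
    """Convert one cross-section of scanner readings to 3D points.
    radii_mm[i] is the radius measured at angle 2*i degrees; z_mm is the
    height of the cross-section along the limb axis."""
    pts = []
    for i, r in enumerate(radii_mm):
        theta = math.radians(2 * i)          # 2-degree angular step
        pts.append((r * math.cos(theta), r * math.sin(theta), z_mm))
    return pts

# One full cross-section: 180 readings of a circular 45.0 mm slice.
section = section_to_points([45.0] * 180, z_mm=10.0)
```

Stacking such sections at successive heights yields the point cloud that a CAD or rapid-prototyping format (e.g. a mesh export) can consume.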
Abstract:
In agricultural crops, weeds must be controlled so as not to negatively affect the yield and quality of the harvested product. Thus, small quantities of weeds in a field are, in most cases, not a problem, except in seed production. In turfgrass, by contrast, there is no production component to be harvested. The value of a turf lies in its inherent aesthetic quality and usability. Aesthetic quality is the beauty and value that the turf adds to a managed landscape. Usability may be the durability of a sports field or the reduction of soil loss through water or wind erosion. The presence of any weed in turf can reduce its aesthetic quality and usability. While it is possible to reduce weed populations using cultural or mechanical practices, they cannot be completely eliminated. The use of herbicides is the only way to completely control weeds in turfgrass areas. This review covers the main herbicides used on turf in the United States with respect to their modes of action, herbicide families and primary turf uses.
Abstract:
The pegmatite rocks of Rio Grande do Norte are responsible for much of the production of industrial minerals such as quartz and feldspar. Quartz and feldspar are pegmatite minerals that may occur in pockets of metric to centimetric dimensions or as millimetric to sub-millimetric intergrowths. The proper physical liberation of the mineral of interest, in the case of intergrowths, requires an appropriate particle size, obtained by size-reduction operations. Flotation is the mineral processing method with high efficiency in recovering fine particles. The main purpose of the present study is to evaluate the recovery of quartz and potassium feldspar by dissolved air flotation (DAF), using a cationic diamine and a quaternary ammonium salt as collectors. The tests were performed according to a 2^4 central composite design, by which the influence of the process variables was statistically verified: concentrations of the quaternary ammonium salt and diamine collectors, pH and conditioning time. The flotation efficiency was calculated from the removal of turbidity from the solution. Maximum flotation efficiencies (60%) were found, in the level curves, under conditions of low collector concentrations (1.0 × 10⁻⁵ mol L⁻¹). These high flotation efficiencies were obtained when operating at pH 4 to 8 with conditioning times ranging from 3 to 5 minutes. Thus, the results showed that the process variables play important roles in the floatability of the minerals in the dissolved air flotation process.
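The efficiency figure quoted above follows directly from the turbidity-removal definition; the NTU values below are illustrative, not the experimental measurements:

```python
def flotation_efficiency(turbidity_initial, turbidity_final):
    """Flotation efficiency (%) as the fraction of turbidity removed,
    the definition used to score each run of the central composite design."""
    return 100.0 * (turbidity_initial - turbidity_final) / turbidity_initial

# Example: a run that removes 150 of 250 NTU corresponds to the 60% maximum.
eff = flotation_efficiency(250.0, 100.0)
```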