62 results for "Time-varying copulas"


Relevance:

20.00%

Publisher:

Abstract:

The predictive control technique has gained many adherents in recent years because its parameters are easy to tune, its concepts extend naturally to multi-input/multi-output (MIMO) systems, nonlinear process models can be linearized around an operating point and used directly in the controller, and, above all, it is the only methodology that can take the limitations of the control signals and of the process output into account during controller design. The time-varying weighting generalized predictive control (TGPC) studied in this work is one more alternative among the many existing predictive controllers. It is a modification of generalized predictive control (GPC) that uses a reference model, computed from design parameters previously established by the designer, together with a new criterion function whose minimization yields the best controller parameters. A genetic algorithm is used to minimize the proposed criterion function, and the robustness of the TGPC is demonstrated through performance, stability and robustness criteria. The results obtained with the TGPC controller are compared with those of GPC and proportional-integral-derivative (PID) controllers, with all techniques applied to stable, unstable and non-minimum-phase plants. The simulated examples are carried out in MATLAB. The alterations implemented in the TGPC confirm the efficiency of the algorithm
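The abstract does not reproduce the TGPC criterion function, so the sketch below stands in a hypothetical quadratic cost with a known minimum and minimizes it with a simple real-coded genetic algorithm (truncation selection, arithmetic crossover, Gaussian mutation, elitism), the kind of scheme the abstract refers to. All names and parameter values here are illustrative assumptions, not the thesis's actual settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def cost(theta):
    # Hypothetical quadratic criterion standing in for the TGPC cost;
    # its minimum is at theta = (1.0, -0.5).
    return (theta[0] - 1.0) ** 2 + (theta[1] + 0.5) ** 2

def genetic_minimize(cost, dim=2, pop_size=40, generations=80,
                     mut_scale=0.1, elite=4):
    """Minimize `cost` with a simple real-coded genetic algorithm."""
    pop = rng.uniform(-5, 5, size=(pop_size, dim))
    for _ in range(generations):
        fitness = np.array([cost(ind) for ind in pop])
        pop = pop[np.argsort(fitness)]        # best individuals first
        parents = pop[:pop_size // 2]         # truncation selection
        children = []
        while len(children) < pop_size - elite:
            a, b = parents[rng.integers(len(parents), size=2)]
            w = rng.random()
            child = w * a + (1 - w) * b       # arithmetic crossover
            child = child + rng.normal(0, mut_scale, size=dim)  # mutation
            children.append(child)
        pop = np.vstack([pop[:elite], np.array(children)])  # elitism
    fitness = np.array([cost(ind) for ind in pop])
    return pop[np.argmin(fitness)]

best = genetic_minimize(cost)
```

Because the elite individuals survive unchanged, the best cost found never worsens from one generation to the next, which is why elitism is the usual safeguard when a GA is used as a numerical minimizer.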

Abstract:

The present work proposes an algorithm for controlling and optimizing the idle time of oil production wells equipped with beam pumps. The algorithm was designed entirely from existing papers and from data acquired from two pilot wells in the Potiguar Basin. Petroleum engineering concepts such as submergence, pump-off, basic sediments and water (BSW), inflow performance relationship (IPR), reservoir pressure and inflow pressure, among others, were incorporated into the algorithm through a mathematical treatment developed for a typical well and then extended to the general case. The optimization maximizes the use of the well's production potential with the smallest number of pumping-unit cycles, directly reducing operating costs and electricity consumption

Abstract:

We propose a new approach to the reduction and abstraction of visual information for robot vision applications. Essentially, we use a multi-resolution representation combined with a moving fovea to reduce the amount of information extracted from an image. We introduce a mathematical formalization of the moving-fovea approach, together with mapping functions that support the use of the model, and propose two indexes (resolution and cost) that help in choosing the model variables. With this new theoretical approach it is possible to apply several filters, compute disparity and perform motion analysis in real time (less than 33 ms to process an image pair on a notebook with an AMD Turion Dual Core at 2 GHz). As the main result, the moving fovea usually allows the robot to keep a region of interest visible in both images without physically moving its devices. We validate the proposed model with experimental results
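The abstract's mapping functions are not given, so the following is only a minimal sketch of the underlying idea: sample a small fixed-size window at full resolution around the fovea and progressively coarser, wider windows around it, so the data volume grows with the number of levels rather than with image area. The function name, window size and level count are illustrative assumptions.

```python
import numpy as np

def foveated_levels(image, fovea, levels=3, window=8):
    """Extract a multi-resolution stack centered on a moving fovea.

    Level 0 is a (window x window) patch at full resolution around
    `fovea`; each further level covers twice the area at half the
    sampling density, so every level keeps the same pixel count.
    """
    h, w = image.shape
    fy, fx = fovea
    stack = []
    for k in range(levels):
        step = 2 ** k                    # coarser sampling per level
        half = window // 2 * step
        ys = np.clip(np.arange(fy - half, fy + half, step), 0, h - 1)
        xs = np.clip(np.arange(fx - half, fx + half, step), 0, w - 1)
        stack.append(image[np.ix_(ys, xs)])
    return stack

img = np.arange(256 * 256, dtype=float).reshape(256, 256)
stack = foveated_levels(img, fovea=(128, 128))
```

Moving the fovea is then just a matter of changing the `fovea` argument between frames, which is the cheap alternative to moving the camera that the abstract highlights.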

Abstract:

The incorporation of industrial automation into the medical area requires mechanisms for the safe and efficient establishment of communication between biomedical devices. One solution to this problem is MP-HA (Multicycles Protocol for Hospital Automation), which defines a network segmented by beds and coordinated by an element called the Service Provider. The goal of this work is to model this Service Provider and to analyze the performance of the activities it executes in the establishment and maintenance of hospital networks

Abstract:

Several mobile robots exhibit non-linear behavior, mainly due to friction between the mechanical parts of the robot or between the robot and the ground. Linear models are adequate in some cases, but the robot's non-linearity must be taken into account when precise displacement and positioning are required. In this work, a parametric model identification procedure is proposed for a differential-drive mobile robot that considers the dead-zone in the robot's actuators. The method divides the system into Hammerstein systems and then uses the key-term separation principle to write the input-output relations in a form that exposes the parameters of both the linear and the non-linear blocks. The parameters are then estimated simultaneously with a recursive least-squares algorithm. The results show that it is possible to identify the dead-zone thresholds together with the linear parameters

Abstract:

The seismic method is of extreme importance in geophysics. Mainly associated with oil exploration, this line of research attracts most of the investment in the area. A seismic study comprises the acquisition, processing and interpretation of seismic data; seismic processing, in particular, is focused on producing an image that represents the geological structures in the subsurface. Seismic processing has evolved significantly in recent decades, driven by the demands of the oil industry and by advances in hardware whose greater storage and processing capabilities enabled more sophisticated algorithms, such as those that exploit parallel architectures. One of the most important steps in seismic processing is imaging. Migration is one of the techniques used for imaging, with the goal of obtaining a seismic section that represents the geological structures as accurately and faithfully as possible. The result of migration is a 2D or 3D image in which it is possible to identify faults and salt domes, among other structures of interest, such as potential hydrocarbon reservoirs. However, a migration performed with quality and accuracy can be very time-consuming, owing to the heuristics of the mathematical algorithms and the large volumes of input and output data involved; it may take days, weeks or even months of uninterrupted execution on supercomputers, representing large computational and financial costs that could make these methods impractical. Aiming at better performance, this work parallelized the core of a Reverse Time Migration (RTM) algorithm using the Open Multi-Processing (OpenMP) parallel programming model, given the large computational effort this migration technique requires. Furthermore, speedup and efficiency analyses were performed and, ultimately, the degree of scalability of the algorithm was identified with respect to the technological advances expected in future processors
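Speedup and parallel efficiency, the two metrics named in the abstract, have standard definitions that the sketch below computes from hypothetical wall-clock times (the thesis's own measurements are not reproduced here).

```python
# Standard parallel-performance metrics: speedup S = T_serial / T_parallel,
# efficiency E = S / n_threads (E = 1.0 would be ideal linear scaling).
def speedup(t_serial, t_parallel):
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, n_threads):
    return speedup(t_serial, t_parallel) / n_threads

# Hypothetical example: 1200 s serially, 200 s on 8 OpenMP threads.
s = speedup(1200.0, 200.0)
e = efficiency(1200.0, 200.0, 8)
```

An efficiency well below 1.0 on many threads is what a scalability study like this one uses to locate the serial fraction and memory-bandwidth limits of the RTM kernel.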

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior

Abstract:

Hard metals are composites, developed in 1923 by Karl Schröter, with wide application because of their high hardness, wear resistance and toughness. They consist of a brittle phase, WC, and a ductile phase, Co. The mechanical properties of hardmetals depend strongly on the microstructure of the WC-Co and are additionally affected by the microstructure of the WC powders before sintering. An important feature is that toughness and hardness increase simultaneously as the WC is refined; the development of nanostructured WC-Co hardmetals has therefore been extensively studied. There are many methods for manufacturing WC-Co hard metals, including the spray conversion process, co-precipitation, the displacement reaction process, mechanochemical synthesis and high-energy ball milling. High-energy ball milling is a simple and efficient way of producing fine powders with a nanostructure. In this process, continuous impacts on the powders promote pronounced changes: the brittle phase is refined to the nanometric scale and embedded in the ductile matrix, while the ductile phase is deformed, re-welded and hardened. The goal of this work was to investigate the effect of high-energy milling time on the microstructural changes of the WC-Co particulate composite, particularly the refinement of the crystallite size and the lattice strain. The starting powders were WC (average particle size D50 0.87 μm), supplied by Wolfram Bergbau- u. Hütten GmbH, and Co (average particle size D50 0.93 μm), supplied by H.C. Starck. A mixture of 90% WC and 10% Co was milled in a planetary ball mill for 2, 10, 20, 50, 70, 100 and 150 hours, with a ball-to-powder ratio (BPR) of 15:1 at 400 rpm. The starting powders and the milled composite samples were characterized by X-ray diffraction (XRD) and scanning electron microscopy (SEM) to identify phases and morphology, and the crystallite size and lattice strain were measured by Rietveld's method, which provided more precise information on the influence of each factor on the microstructure. The results show that high-energy milling is an efficient manufacturing process for the WC-Co composite and that milling time has a great influence on the microstructure of the final particles, crushing the WC to the nanometric scale and dispersing it finely in the Co particles

Abstract:

To achieve process stability and a quality weld bead, an adequate parameter set is necessary (base current and base time, pulse current and pulse time), because these influence the metal transfer mode and the weld quality in pulsed MIG (MIG-P), sometimes requiring special power sources with synergic modes and external control to maintain stability. This work analyzes and compares the effects of the pulse parameters and of the droplet size on arc stability in MIG-P. Four sets of pulse parameters were analyzed: Ip = 160 A, tp = 5.7 ms; Ip = 300 A, tp = 2 ms; Ip = 350 A, tp = 1.2 ms; and Ip = 350 A, tp = 0.8 ms. Each was analyzed with three droplet diameters: equal to, larger than, and smaller than the diameter of the wire electrode. For comparison purposes, the same relation between average current and welding speed was imposed, generating a constant (Im/Vs = K) for all parameters. Bead-on-plate welds by simple deposition were performed with MIG-P at a constant contact-tip-to-workpiece distance (DBCP); subsequently, bead-on-plate welds were made on a plate inclined at 10 degrees so that the DBCP varied, which made it possible to assess how MIG-P behaves in that situation and to evaluate MIG-P with adaptive control intended to keep the arc stability constant. High-speed filming synchronized with current and voltage acquisition (oscillograms) was also carried out, for a better interpretation of the transfer mechanism and a better evaluation of process stability. It is concluded that parameter sets 3 and 4 exhibited greater versatility; that droplet diameters equal to or slightly smaller than the wire diameter exhibited better stability, owing to their higher detachment frequency; and that detachment of the droplet base does not harm the maintenance of the arc height

Abstract:

Among the main challenges in industrial beer production is supplying the market at the lowest cost and with high quality, in order to meet the expectations of customers and consumers. The fermentation stage represents approximately 70% of the total production time, demanding strict process controls to keep it from becoming a bottleneck. This stage is responsible for the formation of a series of by-products that compose the aroma/bouquet of beer; some of these, if produced in larger quantities, confer an unpleasant taste and odor on the final product. Among the by-products formed during fermentation, total vicinal diketones are the main component, since they limit the transfer of the product to subsequent steps, have a low perception threshold for the consumer, and give an undesirable taste and odor. Owing to the unstable quality of the main raw materials and to the process controls required during fermentation, developing alternative forms of beer production without affecting total fermentation time and final product quality is a great challenge for breweries. In this work, the yeast slurry was acidified beforehand with food-grade phosphoric acid, reducing the yeast pH from about 5.30 to 2.20 and changing its character from flocculent to powdery during fermentation. A six-fold increase in the number of yeast cells in suspension was observed in the second fermentation stage compared with fermentations using yeast without prior acidification. By altering two input variables, the temperature curve and cell multiplication, with the goal of minimizing the maximum diketone values detected in the fermenter tank, the peak of diacetyl formed was reduced, contributing to reductions in fermentation time and in total process time. Several experiments were performed with these process changes to verify their influence on the total fermentation time and on the total vicinal diketone concentration at the end of fermentation. The best production result was a total fermentation time of 151 hours with a total vicinal diketone concentration of 0.08 ppm. The mass of yeast in suspension in the second phase of fermentation increased from 2.45 x 10^6 to 16.38 x 10^6 cells/mL, a fact that is key to greater efficiency in reducing the total vicinal diketones in the medium. This confirms that prior yeast acidification, together with control of temperature and of yeast cell multiplication during fermentation, enhances diketone reduction and consequently reduces the total fermentation time, with a diketone concentration below the expected limit (max. 0.10 ppm)

Abstract:

The optimization and control of a chemical process are strongly correlated with the amount of information that can be obtained from the system. In biotechnological processes, where the transforming agent is a cell, many variables can interfere with the process, changing the metabolism of the microorganism and affecting the quantity and quality of the final product. Continuous monitoring of the variables that interfere with the bioprocess is therefore crucial for acting on certain system variables and keeping the process under desirable operational conditions and control. In general, during a fermentation process the analysis of important parameters such as substrate, product and cell concentrations is done off-line, requiring sampling, pretreatment and analytical procedures; these steps take significant time and consume high-purity chemical reagents. In order to implement a real-time monitoring system for a benchtop bioreactor, this study was conducted in two steps: (i) development of software providing a communication interface between the bioreactor and a computer, based on acquiring and recording the process variables (pH, temperature, dissolved oxygen, level, foam level and agitation frequency) and on entering the setpoints of the operational parameters of the bioreactor control unit; (ii) development of an analytical method using near-infrared spectroscopy (NIRS) to monitor substrate, product and cell concentrations during a fermentation process for ethanol production with the yeast Saccharomyces cerevisiae. Three fermentation runs were conducted (F1, F2 and F3), monitored by NIRS with subsequent sampling for analytical characterization. The data obtained were used for calibration and validation, applying pretreatments, combined or not with smoothing filters, to the spectral data. The most satisfactory results were obtained when the calibration models were built from real samples of culture medium taken from fermentation assays F1, F2 and F3, showing that the NIRS-based analytical method can be used as a fast and effective way to quantify cell, substrate and product concentrations, enabling in-situ real-time monitoring of fermentation processes
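The abstract does not name the regression method behind its calibration models; NIRS calibrations are typically linear multivariate models mapping spectra to concentrations. The sketch below fits such a model by ordinary least squares on synthetic spectra (two invented Gaussian component signatures standing in for real absorption bands); every signal and constant here is an assumption for illustration only.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic "NIRS" data: 60 calibration samples, 50 wavelengths.
# Each spectrum is a linear mix of two component signatures plus noise;
# the concentrations are what the calibration model must predict.
wavelengths = np.arange(50)
sig_substrate = np.exp(-0.5 * ((wavelengths - 15) / 4) ** 2)
sig_ethanol = np.exp(-0.5 * ((wavelengths - 35) / 4) ** 2)
signatures = np.vstack([sig_substrate, sig_ethanol])

conc = rng.uniform(0, 10, size=(60, 2))          # [substrate, ethanol]
spectra = conc @ signatures
spectra += rng.normal(0, 0.01, spectra.shape)    # measurement noise

# Calibration: least-squares mapping from spectra to concentrations.
B, *_ = np.linalg.lstsq(spectra, conc, rcond=None)

# Validation on a fresh (noise-free) sample.
c_new = np.array([[3.0, 7.0]])
s_new = c_new @ signatures
c_pred = s_new @ B
```

Chemometric practice usually replaces plain least squares with PLS regression to cope with collinear wavelengths, but the calibrate-then-validate workflow the abstract describes is the same.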

Abstract:

During salt production, the first salt crystals formed are discarded as industrial waste. This waste consists basically of gypsum, composed of calcium sulfate dihydrate (CaSO4.2H2O), known as "carago cru" or "malacacheta". After being calcined to produce plaster (CaSO4.0.5H2O), it can be put to use in the cement industry. This work aims to optimize the time and temperature of the calcination of the gypsum waste (carago) in order to obtain beta plaster that meets the specifications of civil construction standards. The experiments involved the chemical and mineralogical characterization of the gypsum (carago) from the crystallizers, and of the plaster produced in a salt company located in Mossoró, using the following techniques: X-ray diffraction (XRD), X-ray fluorescence (XRF), thermogravimetric analysis (TG/DTG) and scanning electron microscopy (SEM) with EDS. To optimize the time and temperature of the calcination process, a three-level factorial design was used, with response surfaces for the compressive strength tests and setting time, according to standard NBR-13207 (Plasters for civil construction), together with X-ray diffraction of the beta plasters (carago) obtained by calcination. The STATISTICA 7.0 software was used to fit the experimental data to a statistical model. The calcination of the gypsum (carago) was studied in the temperature range from 120 °C to 160 °C and times from 90 to 210 minutes, in an oven at atmospheric pressure. It was found that increasing the temperature to 160 °C and the calcination time to 210 minutes yields compressive strengths above 10 MPa, conforming to the required standard (> 8.40 MPa), and that the X-ray diffractograms show the predominance of the beta hemihydrate phase. The result is a good-quality beta plaster that complies with the current standards, giving this by-product of the salt industry employability in civil construction

Abstract:

This research aims to understand the relationship between the media, capitalism and the appropriation of free time for leisure practices in industrial and post-industrial societies. It thus seeks a conceptual framework that takes into account the kind of ideology that naturalizes the relation of leisure to the foundations of the contemporary media, and of the media to leisure alone, forgetting their insertion in labor and industrial relations in society. We intend to demonstrate that, in the capitalist system, every mode of production entails a mode of reproduction. Methodologically, this is a first approximation, based on theoretical concerns already developed, constituting a theoretical, bibliographic and descriptive study. The results lead us to conclude that the spheres of work and leisure tend to be less and less differentiated, since both remain activities of product management governed by the same intellective protocols, based on information and communication technology, and that, accordingly, the media favor an expansion of productive activity even during leisure time

Abstract:

The research presented here, carried out in the field of metaphysics, gathers presuppositions for an ontological foundation of information technology, based on the philosophy of Martin Heidegger and, fundamentally, on the existential analytic of Dasein in Being and Time. Starting from a reflection on "what is today", it investigates on what foundations the New Technology was erected, such that we are engaged in the project of digitizing beings which, at the same time as it destines man to the forgetting of Being, offers him the possibility of transformation. The relation between the question of Being and the question of technology is analyzed as crossed paths, and at this crossroads it becomes possible to think what technology is, what information is for Heidegger, and in what way the existential modes of Dasein are suited to characterizing man within information technology. Through this appropriation, it remains to think how the opening of a perspective of leading man back to the truth of Being is possible. Finally, the structuring of these foundations makes possible the general discursive reflection: what we occupy ourselves with, how we are, and in what direction we are heading, the general themes, respectively, of the three chapters. The points of investigation of the first chapter are: a) the precise characterization of Dasein, supported by considerations from Benedito Nunes, Hans-Georg Gadamer, Jacques Derrida and Rüdiger Safranski; b) the concept of technology and its essence in Heidegger; c) the distinction between technique and technology, supported by the thought of J. Ellul, Michel Séris, Otto Pöggeler, Michel Haar and Dominique Janicaud; d) the concept of cybernetics in Heidegger and in Norbert Wiener; e) the preliminary characterization of information, its etymological and philosophical analysis, Heidegger's view and the theories of Rafael Capurro; f) the analysis of the phenomenon of the digitization of beings, considerations from Virilio, and the analysis of a concept of the virtual with Henri Bergson and Gilles Deleuze. The second chapter analyzes the existentials of Dasein, moving toward a summary of the basic foundations for characterizing information technology as a philosophical problem. Finally, after presenting the introductory concepts that delimit the question, followed by the ontological indications and presuppositions found in Being and Time, the third chapter discusses the danger, that which saves, and releasement, the three key words of Heidegger's thought on technology that allow a concluding approach to the question

Abstract:

This thesis aims at a better understanding of the relation between time and evil in Schelling's Freiheitsschrift, taking as its starting point approximations to Gnosticism. For that purpose, before approaching that relation, the question of Gnosticism is reviewed (chapter I): a strain of thought essentially concerned with the problem of time, permeated by the belief in an evil nature of creation, and alleged to have significantly influenced certain ideas of Schelling. An evaluation of the approximations between Gnosticism, gnosis and German thought follows (chapter II), as well as an evaluation of Schellingian approximations to Gnosticism (chapter III). The Freiheitsschrift is then analyzed as the text in which Schelling, through a very distinct appropriation of Gnosticism, goes beyond Kantian theodicy (chapter IV). Some interrogations are then addressed about whether key ideas of Schellingian philosophy (on gnosis, creation, duality, time and evil) are conceived in a way essentially different from that of historic Gnosticism, despite the much that has been said to the contrary (chapter V). The proposal of a Platonic-Plotinian key to understanding the relations between time and evil in the Freiheitsschrift comes next (chapter VI), giving way to the concluding remarks (chapter VII). We perceive that Gnosticism and Neoplatonism are systems of thought that sometimes converge, and that German thought is one of the places of this convergence. Notwithstanding this perception, it is possible to affirm that Schellingian thought, with its valorization of time and of a certain perception of evil, is essentially anti-Gnostic, despite some observations to the contrary