8 results for nanocomposite, nanoparticles, multi-component solvent systems
at Universidade Federal do Rio Grande do Norte (UFRN)
Abstract:
In Brazil and around the world, oil companies are looking for new technologies and processes that can increase the oil recovery factor of mature reservoirs in a simple and inexpensive way. Recent research has developed a process called Gas Assisted Gravity Drainage (GAGD), classified as a gas-injection IOR method. The process, which is undergoing pilot testing in the field, has been extensively studied through physical scaled models and laboratory core floods because of its high oil recoveries compared with other gas-injection IOR methods. It consists of injecting gas at the top of a reservoir through horizontal or vertical injector wells and displacing the oil, taking advantage of the natural gravity segregation of the fluids, toward a horizontal producer well placed at the bottom of the reservoir. To study the process, a homogeneous reservoir and a multi-component fluid model with characteristics similar to those of Brazilian light-oil fields were built in a compositional simulator in order to optimize the operational parameters. The model was simulated in GEM (CMG, 2009.10). The operational parameters studied were the gas injection rate, the type of injected gas, and the locations of the injector and producer wells; the presence of a water drive was also investigated. The results showed that the maximum vertical spacing between the two wells yielded the maximum oil recovery in GAGD, and that the highest injection rates produced the highest recovery factors. The injection rate controls the velocity of the injected gas front and determines whether gravitational forces dominate the recovery process. Natural gas performed better than CO2, and the presence of an aquifer had little influence on the process. The economic analysis showed that injecting natural gas is more profitable than injecting CO2.
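The abstract notes that the injection rate sets the velocity of the gas front and thereby decides whether gravity dominates the displacement. A common way to express this balance, not taken from the thesis itself, is a dimensionless gravity number, the ratio of gravitational to viscous forces; the sketch below, with purely hypothetical rock and fluid properties, shows how raising the Darcy velocity (i.e. the injection rate) lowers that ratio.

    # Illustrative only: a generic gravity number N_g = k * d(rho) * g / (mu * u),
    # the ratio of gravitational to viscous forces often used to discuss
    # gravity-dominated gas injection. All property values are hypothetical.
    def gravity_number(perm_m2, delta_rho_kg_m3, visc_pa_s, darcy_velocity_m_s):
        g = 9.81  # gravitational acceleration, m/s^2
        return perm_m2 * delta_rho_kg_m3 * g / (visc_pa_s * darcy_velocity_m_s)

    k = 5e-13            # about 500 mD permeability (hypothetical)
    delta_rho = 600.0    # oil-gas density difference, kg/m^3 (hypothetical)
    mu_gas = 2e-5        # gas viscosity, Pa.s (hypothetical)
    for u in (1e-6, 5e-6, 2e-5):   # Darcy velocities imposed by the injection rate, m/s
        print(f"u = {u:.0e} m/s -> N_g = {gravity_number(k, delta_rho, mu_gas, u):.1f}")

A larger N_g indicates gravity-dominated drainage, while a smaller N_g indicates viscous forces taking over; this is the trade-off that the injection-rate study probes.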
Abstract:
In recent years, predictive control has gained an increasing number of adherents because its parameters are easy to tune, its concepts extend to multi-input/multi-output (MIMO) systems, nonlinear process models can be linearized around an operating point and used directly in the controller, and, above all, because it is one of the few methodologies that can take into account, during controller design, the constraints on the control signals and on the process output. Time-varying weighting generalized predictive control (TGPC), studied in this work, is one more alternative among the existing predictive controllers. It is a modification of generalized predictive control (GPC) that uses a reference model, computed from design parameters previously established by the designer, and a new criterion function whose minimization yields the best controller parameters. A genetic algorithm is used to minimize the proposed criterion function, and the robustness of TGPC is demonstrated through performance, stability, and robustness criteria. The results of the TGPC controller are compared with GPC and proportional-integral-derivative (PID) controllers, with all techniques applied to stable, unstable, and non-minimum-phase plants. The simulated examples were carried out in MATLAB. The modifications introduced in TGPC confirm the efficiency of the algorithm.
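The criterion function of the TGPC is minimized with a genetic algorithm. The dissertation's own criterion is not reproduced here; the fragment below is only a minimal, generic real-coded genetic algorithm minimizing a placeholder cost, to illustrate the kind of search involved (population, selection, crossover, mutation, elitism). All parameter values are arbitrary.

    # Minimal, generic GA sketch (placeholder cost, not the TGPC criterion).
    import random

    def cost(params):
        # Hypothetical stand-in for the controller criterion function.
        return sum((p - 0.5) ** 2 for p in params)

    def genetic_minimize(n_params=3, pop_size=30, generations=50,
                         mutation_rate=0.1, elite=2):
        pop = [[random.uniform(0, 1) for _ in range(n_params)] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=cost)                  # best individuals first
            next_pop = pop[:elite]              # elitism: keep the best as-is
            while len(next_pop) < pop_size:
                p1, p2 = random.sample(pop[:pop_size // 2], 2)  # parents from best half
                cut = random.randrange(1, n_params)             # one-point crossover
                child = p1[:cut] + p2[cut:]
                if random.random() < mutation_rate:             # occasional mutation
                    i = random.randrange(n_params)
                    child[i] += random.gauss(0, 0.1)
                next_pop.append(child)
            pop = next_pop
        return min(pop, key=cost)

    print(genetic_minimize())   # parameter vector minimizing the placeholder cost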
Abstract:
The wax appearance temperature (WAT) is the temperature at which wax crystals begin to appear; at this temperature the first crystals form as paraffin/solvent systems are cooled. Paraffins are mixtures of saturated hydrocarbons of high molecular weight. Removing these deposits from wells and production lines adds cost to the produced oil, so solubilizing the deposits formed by thermodynamic changes has been a constant challenge for oil exploration companies. This study combines paraffin solubilization by microemulsion systems, the determination of the WAT of paraffin/solvent systems, and the performance of surfactants in reducing crystallization. Two methods were used, a rheological method and a photoelectric-signal method; the latter, developed to optimize the data obtained given the sensitivity of the equipment used, was validated against the former. Methods developed to describe wax precipitation are often in poor agreement with experimental data and tend to underestimate the amount of wax at temperatures below the cloud point. The Won method and the ideal-solution method were applied to the WAT data obtained in solvent systems; the data were best represented by the second iteration of the Won method using naphtha, hexane, and LCO as solvents. The WAT values obtained from the photoelectric signal occurred earlier than those obtained from viscosity, demonstrating the greater sensitivity of the method developed. The ionic surfactant reduced the viscosity of the solvent systems because it modified the crystalline structure and, consequently, the pour point. The curves show that the experimental WAT data are, in general, closer to the modeling performed by the Won method than to that of the ideal-solution method, because the latter underestimates the curve that predicts the onset of paraffin crystallization. This occurs because the temperature actually measured is the crystallization temperature, whereas the method is based on the fusion temperature.
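For reference, the ideal-solution model cited above is usually written from solid-liquid equilibrium as follows (textbook form with heat-capacity corrections neglected; this is general background, not an expression reproduced from the thesis):

    % Ideal-solution solubility of wax-forming component i:
    % x_i          - mole fraction of i remaining in solution at temperature T
    % T_{f,i}      - melting (fusion) temperature of i
    % \Delta H_{f,i} - enthalpy of fusion of i
    \ln x_i = -\frac{\Delta H_{f,i}}{R}\left(\frac{1}{T} - \frac{1}{T_{f,i}}\right)

Because the relation is anchored on the fusion temperature of each paraffin, it is consistent with the observation above that the ideal-solution curve underestimates the onset of crystallization; Won-type methods add activity-coefficient corrections for the non-ideality of the liquid and solid phases.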
Abstract:
Among the new drugs launched on the market since 1980, up to 30% belong to the class of natural products or have a semisynthetic origin. Between 40% and 70% of new chemical entities (or lead compounds) have poor water solubility, which may impair their commercial use. One alternative for administering poorly water-soluble drugs is to vehiculate them in drug delivery systems such as micelles, microemulsions, nanoparticles, liposomes, and cyclodextrin systems. In this work, microemulsion-based drug delivery systems were obtained using pharmaceutically acceptable components: a 3:1 mixture of Tween 80 and Span 20 as surfactant, isopropyl myristate or oleic acid as the oil phase, bidistilled water, and, in some formulations, ethanol as cosurfactant. Self-Microemulsifying Drug Delivery Systems (SMEDDS) were also obtained using propylene glycol or sorbitol as cosurfactant. All formulations were characterized with respect to rheological behavior, droplet size, and electrical conductivity. The bioactive natural product trans-dehydrocrotonin, as well as some extracts and fractions from Croton cajucara Benth (Euphorbiaceae), Anacardium occidentale L. (Anacardiaceae), and Phyllanthus amarus Schum. & Thonn. (Euphorbiaceae), were satisfactorily solubilized in the microemulsion formulations. Two other natural products from Croton cajucara, trans-crotonin and acetyl aleuritolic acid, showed poor solubility in these formulations. Evaluation of the antioxidant capacity of the plant extracts loaded into microemulsions by the DPPH method evidenced the antioxidant activity of the Phyllanthus amarus and Anacardium occidentale extracts; for the Phyllanthus amarus extract, the use of microemulsions doubled its antioxidant efficiency. A hydroalcoholic extract from Croton cajucara incorporated into a SMEDDS formulation showed bacteriostatic activity against colonies of Bacillus cereus and Escherichia coli. In addition, Molecular Dynamics simulations were performed on micellar drug delivery systems containing the sugar-based surfactants N-dodecylamino-1-deoxylactitol and N-dodecyl-D-lactosylamine. The computational simulations indicated that the micellization process is more favorable for N-dodecylamino-1-deoxylactitol than for N-dodecyl-D-lactosylamine.
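The antioxidant evaluation mentioned above uses the DPPH radical assay, which is normally quantified as the percentage decrease in absorbance of the radical solution (typically read near 517 nm) after contact with the sample. The snippet below only illustrates that standard calculation with made-up absorbance readings; it does not reproduce any measurement from the work.

    # Standard DPPH radical-scavenging calculation; absorbance values are
    # hypothetical and do not come from the thesis.
    def dpph_scavenging(abs_control, abs_sample):
        # Percent scavenging = (A_control - A_sample) / A_control * 100
        return (abs_control - abs_sample) / abs_control * 100.0

    a_control = 0.82   # DPPH solution without extract (hypothetical)
    samples = [("free extract", 0.41), ("extract in microemulsion", 0.20)]
    for name, a in samples:
        print(f"{name}: {dpph_scavenging(a_control, a):.1f} % scavenging")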
Abstract:
Multimedia systems must incorporate middleware concepts in order to abstract away hardware and operating-system issues. Applications in such systems may run on different kinds of platforms, and their components need to communicate with one another. In this context, specific communication mechanisms must be defined for the transmission of information flows. This work presents an interconnection component model for distributed multimedia environments, together with its implementation details. The model offers specific communication mechanisms for transmitting information flows between software components, taking into account the requirements of the Cosmos framework in order to support dynamic component reconfiguration.
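The abstract does not detail the model's interfaces, so the sketch below is only a generic illustration of the central idea: an interconnection (connector) component that carries a stream between two software components and allows its transport mechanism to be swapped at runtime, i.e. dynamic reconfiguration. All class and method names are hypothetical and are not the Cosmos framework API.

    # Hypothetical connector sketch (not the Cosmos API): the transport behind a
    # stream connection can be replaced without touching producer or consumer.
    class Transport:
        def send(self, chunk: bytes) -> None:
            raise NotImplementedError

    class InMemoryTransport(Transport):
        def __init__(self, deliver):
            self.deliver = deliver              # consumer-side callback
        def send(self, chunk: bytes) -> None:
            self.deliver(chunk)

    class Connector:
        # Decouples endpoints and supports dynamic reconfiguration.
        def __init__(self, transport: Transport):
            self._transport = transport
        def push(self, chunk: bytes) -> None:
            self._transport.send(chunk)
        def reconfigure(self, new_transport: Transport) -> None:
            self._transport = new_transport     # swap transport at runtime

    received = []
    connector = Connector(InMemoryTransport(received.append))
    connector.push(b"frame-1")
    connector.reconfigure(InMemoryTransport(received.append))  # e.g. switch to a network transport
    connector.push(b"frame-2")
    print(received)   # [b'frame-1', b'frame-2']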
Abstract:
Although individual supervised Machine Learning (ML) techniques, also known as classifiers or classification algorithms, provide solutions that are usually considered efficient, experimental results obtained with large pattern sets, or with sets containing a significant amount of irrelevant or incomplete data, show a decrease in the precision of these techniques. In other words, such techniques cannot recognize patterns efficiently in complex problems. In order to improve the performance and efficiency of ML techniques, the idea arose of making several ML algorithms work together, giving rise to the term Multi-Classifier System (MCS). An MCS is composed of different ML algorithms, called base classifiers, and combines the results obtained by these algorithms to reach a final result. For an MCS to perform better than its base classifiers, the results obtained by each base classifier must exhibit a certain diversity, that is, a difference between the results obtained by the classifiers that compose the system; there is no point in building an MCS whose base classifiers give identical answers to the same patterns. Although MCSs present better results than individual systems, there is a continuing search to improve the results obtained by this type of system. Aiming at this improvement, at more consistent results, and at greater diversity among the classifiers of an MCS, methodologies have recently been investigated that use weights, or confidence values. These weights can describe the importance of a given classifier's output when it associates a pattern with a given class, and they are used, together with the classifiers' outputs, during the recognition (use) phase of the MCS. There are different ways of calculating these weights, which can be divided into two categories: static weights and dynamic weights. Weights of the first category keep their values unchanged during the classification process, whereas weights of the second category have their values modified during classification. In this work, an analysis is carried out to verify whether the use of weights, both static and dynamic, can increase the performance of MCSs compared with individual systems. Moreover, the diversity obtained by the MCSs is analyzed in order to verify whether there is a relation between the use of weights in MCSs and different levels of diversity.
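As a concrete illustration of the weighting idea discussed above, the sketch below combines the outputs of three base classifiers with static weights (for example, each classifier's validation accuracy); a dynamic scheme would recompute the weights for every pattern instead. All labels, weights, and names here are hypothetical.

    # Weighted-vote combination for an MCS; static weights are fixed in advance,
    # dynamic weights would be recomputed per pattern. Values are hypothetical.
    from collections import defaultdict

    def weighted_vote(predictions, weights):
        # predictions: one class label per base classifier
        # weights: one static weight per classifier
        scores = defaultdict(float)
        for label, w in zip(predictions, weights):
            scores[label] += w
        return max(scores, key=scores.get)

    preds = ["cat", "dog", "cat"]              # three base classifiers disagree
    static_weights = [0.70, 0.90, 0.65]        # e.g. validation accuracies
    print(weighted_vote(preds, static_weights))  # -> "cat" (0.70 + 0.65 > 0.90)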