992 results for Main artificial lifting


Relevance: 30.00%

Publisher:

Abstract:

The generic concept of the artificial meteorite experiment STONE is to fix rock samples bearing microorganisms on the heat shield of a recoverable space capsule and to study their modifications during atmospheric re-entry. The STONE-5 experiment was performed mainly to answer astrobiological questions. The rock samples mounted on the heat shield were used (i) as carriers for microorganisms and (ii) as internal controls to verify whether physical conditions during atmospheric re-entry were comparable to those experienced by "real" meteorites. Samples of dolerite (an igneous rock), sandstone (a sedimentary rock), and gneiss impactite from Haughton Crater carrying endolithic cyanobacteria were fixed to the heat shield of the unmanned recoverable capsule FOTON-M2. Holes drilled on the back side of each rock sample were loaded with bacterial and fungal spores and with dried vegetative cryptoendoliths. The front of the gneissic sample was also soaked with cryptoendoliths.

The mineralogical differences between pre- and post-flight samples are detailed. Despite intense ablation resulting in deeply eroded samples, all rocks at least partly survived atmospheric re-entry. Temperatures attained during re-entry were high enough to melt the dolerite, the silica, and the gneiss impactite sample. The formation of fusion crusts in STONE-5 was a real novelty and strengthens the link with real meteorites. The exposed part of the dolerite is covered by a fusion crust consisting of silicate glass formed from the rock sample with an admixture of holder material (silica). Compositionally, the fusion crust varies from silica-rich areas (undissolved silica fibres of the holder material) to areas whose composition is "basaltic". Likewise, the fusion crust on the exposed gneiss surface was formed from gneiss with an admixture of holder material; its composition varies from silica-rich areas to areas with "gneiss" composition (the main component being potassium-rich feldspar). The sandstone sample was retrieved intact and did not develop a fusion crust: thermal decomposition of the calcite matrix, followed by disintegration and liberation of the silicate grains, prevented the formation of a melt.

Furthermore, the non-exposed surfaces of all samples experienced strong thermal alteration. Hot gases released during ablation pervaded the empty space between sample and sample holder, leading to intense local heating. This heating below the protective sample holder led to surface melting of the dolerite and to the formation of calcium-silicate rims on quartz grains in the sandstone sample. (c) 2008 Elsevier Ltd. All rights reserved.

Relevance: 30.00%

Publisher:

Abstract:

The main objective of this work is to show how the choice of the temporal dimension and of the spatial structure of the population influences an artificial evolutionary process. In the field of Artificial Evolution we can observe a common trend of synchronously evolving panmictic populations, i.e., populations in which any individual can be recombined with any other individual. Already in the '90s, the works of Spiessens and Manderick, Sarma and De Jong, and Gorges-Schleuter pointed out that, if a population is structured according to a mono- or bi-dimensional regular lattice, the evolutionary process shows a different dynamic with respect to the panmictic case. In particular, Sarma and De Jong studied the selection pressure (i.e., the diffusion of a best individual when the only active operator is selection) induced by a regular bi-dimensional structure of the population, proposing a logistic modelling of the selection pressure curves. This model supposes that the diffusion of a best individual in a population follows an exponential law. We show that such a model is inadequate to describe the process, since the growth speed must be quadratic or sub-quadratic in the case of a bi-dimensional regular lattice. New linear and sub-quadratic models are proposed for modelling the selection pressure curves in, respectively, mono- and bi-dimensional regular structures. These models are extended to describe the process when asynchronous evolutions are employed. Different dynamics of the populations imply different search strategies of the resulting algorithm when the evolutionary process is used to solve optimisation problems. A benchmark of both discrete and continuous test problems is used to study the search characteristics of the different topologies and updates of the populations.
In the last decade, the pioneering studies of Watts and Strogatz have shown that most real networks, both in the biological and sociological worlds as well as in man-made structures, have mathematical properties that set them apart from regular and random structures. In particular, they introduced the concept of small-world graphs, and they showed that this new family of structures has interesting computing capabilities. Populations structured according to these new topologies are proposed, and their evolutionary dynamics are studied and modelled. We also propose asynchronous evolutions for these structures, and the resulting evolutionary behaviours are investigated. Many man-made networks have grown, and are still growing, incrementally, and explanations have been proposed for their actual shape, such as Albert and Barabasi's preferential attachment growth rule. However, many actual networks seem to have undergone some kind of Darwinian variation and selection. Thus, how these networks might have come to be selected is an interesting yet unanswered question. In the last part of this work, we show how a simple evolutionary algorithm can enable the emergence of these kinds of structures for two prototypical problems of the automata networks world, the majority classification and the synchronisation problems.
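The claim that takeover on a bi-dimensional lattice grows quadratically rather than exponentially can be illustrated with a toy simulation. This is not the thesis code: it assumes a deterministic "copy the best von Neumann neighbour" rule, a single best individual, and synchronous updates.

```python
def takeover_counts(n=51, steps=60):
    """Count copies of the best individual on an n x n toroidal lattice
    when selection simply copies the best von Neumann neighbour."""
    grid = [[0] * n for _ in range(n)]
    grid[n // 2][n // 2] = 1  # single best individual in the centre
    counts = []
    for _ in range(steps):
        new = [row[:] for row in grid]
        for i in range(n):
            for j in range(n):
                # selection only: adopt the best value in the neighbourhood
                new[i][j] = max(grid[i][j],
                                grid[(i - 1) % n][j], grid[(i + 1) % n][j],
                                grid[i][(j - 1) % n], grid[i][(j + 1) % n])
        grid = new
        counts.append(sum(map(sum, grid)))
    return counts

counts = takeover_counts()
# Until the front wraps around the torus, the occupied region after t steps
# is a diamond of 2*t**2 + 2*t + 1 cells: quadratic, not exponential, growth.
```

A panmictic population under the same "copy the best" pressure would instead roughly double the count each generation, which is the exponential regime the logistic model assumes.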

Relevance: 30.00%

Publisher:

Abstract:

Robotics is nowadays one of the main pillars of industry, and a piece of good news for engineers concerns sales figures: in 2013, around 179,000 industrial robots were sold worldwide, again an all-time high and 12% more than in 2012, according to data from the IFR (International Federation of Robotics). Alongside this, collaborative robotics comes into play when robots and humans must share the workplace without humans being displaced by the machines; the aim is for robots to improve the quality of work by taking over the dangerous, tedious and dirty tasks that are not possible or safe for humans. Another very important and directly related concept, much in vogue and heard of only relatively recently, is the "Factory of the Future", which seeks to bring operators and robots into harmony in the working environment, with robots regarded as collaborative rather than substitutive machinery; it is considered one of the great production niches in full expansion. Setting aside these technical concepts, which should never be forgotten by anyone whose professional career is focused on this industrial field, the central topic of this project is, as could not be otherwise, robotics combined with computer vision; the result of this fusion is a robotic manipulator endowed with a certain "intelligence". A simple but feasible production process has been devised, capable of autonomously storing pieces of different shapes and colours, guided only by the image captured by a webcam integrated in the equipment.

The system consists of a support structure delimiting a work area in which purpose-designed pieces are placed, to be stored in their corresponding locations by the robotic manipulator. This parallel-kinematics manipulator is based on cable-driven technology and is commanded by four motors that give it three degrees of freedom (±X, ±Y, ±Z); the end effector is suspended over the work area and moves so as to identify the position, colour and shape of the pieces, storing them in an orderly fashion according to a set of initial premises.
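The classify-then-store step can be sketched as a pure decision function. The colour and shape labels and the slot layout below are invented for illustration; the project's actual image processing (webcam capture, segmentation) is not reproduced here.

```python
# Hypothetical piece classes; the project does not specify these names.
COLOURS = ("red", "green", "blue")
SHAPES = ("circle", "square", "triangle")

def dominant_colour(rgb):
    """Classify a piece's colour from its mean RGB value (toy heuristic)."""
    r, g, b = rgb
    return COLOURS[(r, g, b).index(max(r, g, b))]

def storage_slot(colour, shape):
    """Return the (row, column) of the slot reserved for this piece class."""
    return COLOURS.index(colour), SHAPES.index(shape)

slot = storage_slot(dominant_colour((200, 40, 30)), "square")
```

In the real system the manipulator would then move the end effector over `slot` and release the piece; the point of the sketch is that the vision output reduces to a small discrete decision.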

Relevance: 30.00%

Publisher:

Abstract:

The spatial distribution of illuminance and the electricity consumption of the artificial lighting system are among the main problems related to broiler production. Therefore, the aim of this study was to evaluate the spatial distribution of illuminance and the energy efficiency of different lighting systems for broiler houses. Six types of lamps were tested in two different configurations, targeting minimum illuminance levels of 20 and 5 lux. The tested lamps were incandescent (IL) 100 W, compact fluorescent (CFL) 34 W, mixed (ML) 160 W, sodium vapour (SVL) 70 W, T8 fluorescent tube (T8 FTL) 40 W and T5 fluorescent tube (T5 FTL) 28 W. The first four were evaluated with and without a reflective light fixture, and the latter two without a fixture. The tested systems with light fixtures negatively affected the spatial distribution of illuminance inside the house. The systems composed of IL and ML without fixtures gave the best results in meeting the minimum illuminance of 20 lux and 5 lux, respectively. T5 FTL presented the lowest energy demand.

Relevance: 30.00%

Publisher:

Abstract:

In this master's thesis, wind speeds and directions were modelled with the aim of developing suitable models for hourly, daily, weekly and monthly forecasting. Artificial neural networks implemented in MATLAB were used to perform the forecasts. Three main types of artificial neural network were built: feed-forward neural networks, Jordan Elman neural networks and cascade-forward neural networks. Four sub-models of each type were built, corresponding to the four forecast horizons, for both wind speeds and directions. A single network topology was used for each forecast horizon, regardless of model type. All models were trained with real wind speed and direction data collected over a period of two years in the municipal region of Puumala in Finland. The first 70% of the data was used for training, validation and testing of the models, the second-to-last 15% was presented to the trained models for verification, and the model outputs were then compared with the last 15% of the original data by measuring the mean square and sum square errors between them. Based on the results, the feed-forward networks returned the lowest generalisation errors for hourly, weekly and monthly forecasts of wind speeds, while Jordan Elman networks returned the lowest errors for daily wind speeds. Cascade-forward networks gave the lowest errors for daily, weekly and monthly wind directions, and Jordan Elman networks the lowest for hourly wind directions. The errors were relatively low during training of the models but rose sharply when the models were simulated with new inputs. In addition, a combination of hyperbolic tangent transfer functions in both the hidden and output layers returned better results than other combinations of transfer functions.

In general, wind speeds were more predictable than wind directions, opening up opportunities for further research into better models for wind direction forecasting.
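The data handling described above can be sketched as follows. The split fractions come from the text; the helper names, and the use of plain Python rather than the MATLAB tooling the thesis actually used, are illustrative.

```python
def split_series(data, train_frac=0.70, verify_frac=0.15):
    """Chronological split: 70% for training/validation/testing,
    the next 15% fed to the trained models for verification, and the
    last 15% held back for comparison against the model outputs."""
    n = len(data)
    a = int(n * train_frac)
    b = int(n * (train_frac + verify_frac))
    return data[:a], data[a:b], data[b:]

def mse(predicted, observed):
    """Mean square error between forecast and held-out observations."""
    return sum((p - o) ** 2 for p, o in zip(predicted, observed)) / len(observed)

def sse(predicted, observed):
    """Sum square error between forecast and held-out observations."""
    return sum((p - o) ** 2 for p, o in zip(predicted, observed))

train, verify, holdout = split_series(list(range(100)))
```

Keeping the split chronological, as the thesis does, matters for time series: shuffling before splitting would leak future information into training.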

Relevance: 30.00%

Publisher:

Abstract:

A review of the problem of the philosophy of Artificial Intelligence in the light of reflective equilibrium. The problem is revisited to show how "can machines think?" has only ever been evaluated in human terms. Reflective equilibrium is proposed as a tool for defining concepts in such a way that experience and precepts are brought into balance, and with it to construct a definition of thinking that is not limited exclusively to "thinking the way humans do".

Relevance: 30.00%

Publisher:

Abstract:

Invasive plant species have been shown to alter the microbial community composition of the soils they invade, and it is suggested that this below-ground perturbation of potential pathogens, decomposers or symbionts may feed back positively to allow invasive success. Whether these perturbations are mediated through specific components of root exudation is not understood. We focussed on 8-hydroxyquinoline, a putative allelochemical of Centaurea diffusa (diffuse knapweed), and used an artificial root system to differentiate the effects of 8-hydroxyquinoline against a background of total rhizodeposition, as mimicked through supply of a synthetic exudate solution. In soil proximal (0-10 cm) to the artificial root, synthetic exudates had a highly significant (P < 0.001) influence on dehydrogenase, fluorescein diacetate hydrolysis and urease activities. In addition, 8-hydroxyquinoline was significant (P = 0.003) as a main effect on dehydrogenase activity and interacted with synthetic exudates to affect urease activity (P = 0.09). Hierarchical cluster analysis of 16S rDNA-based DGGE band patterns also identified a primary effect of synthetic exudates and a secondary effect of 8-hydroxyquinoline on bacterial community structure. Thus, we show that the artificial rhizosphere produced by the synthetic exudates was the predominant effect, but that the influence of the 8-hydroxyquinoline signal on the activity and structure of soil microbial communities could also be detected. (C) 2009 Elsevier Ltd. All rights reserved.

Relevance: 30.00%

Publisher:

Abstract:

The redistribution of a finite amount of martian surface dust during global dust storms and in the intervening periods has been modelled in a dust lifting version of the UK Mars General Circulation Model. When using a constant, uniform threshold in the model’s wind stress lifting parameterisation and assuming an unlimited supply of surface dust, multiannual simulations displayed some variability in dust lifting activity from year to year, arising from internal variability manifested in surface wind stress, but dust storms were limited in size and formed within a relatively short seasonal window. Lifting thresholds were then allowed to vary at each model gridpoint, dependent on the rates of emission or deposition of dust. This enhanced interannual variability in dust storm magnitude and timing, such that model storms covered most of the observed ranges in size and initiation date within a single multiannual simulation. Peak storm magnitude in a given year was primarily determined by the availability of surface dust at a number of key sites in the southern hemisphere. The observed global dust storm (GDS) frequency of roughly one in every 3 years was approximately reproduced, but the model failed to generate these GDSs spontaneously in the southern hemisphere, where they have typically been observed to initiate. After several years of simulation, the surface threshold field—a proxy for net change in surface dust density—showed good qualitative agreement with the observed pattern of martian surface dust cover. The model produced a net northward cross-equatorial dust mass flux, which necessitated the addition of an artificial threshold decrease rate in order to allow the continued generation of dust storms over the course of a multiannual simulation. 
At standard model resolution, for the southward mass flux due to cross-equatorial flushing storms to offset the northward flux due to GDSs on a timescale of ∼3 years would require an increase in the former by a factor of 3–4. Results at higher model resolution and uncertainties in dust vertical profiles mean that quasi-periodic redistribution of dust on such a timescale nevertheless appears to be a plausible explanation for the observed GDS frequency.
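The gridpoint-dependent lifting threshold could take, for instance, the following form. The linear coefficients, the clamping, and the variable names are assumptions; the abstract states only that thresholds respond to the rates of emission and deposition of dust and that a constant artificial decrease rate had to be added.

```python
def update_threshold(tau, emission, deposition,
                     k_e=1.0, k_d=1.0, decay=0.01, tau_min=0.0):
    """One gridpoint, one timestep. Emission depletes easily mobilised
    surface dust, raising the wind-stress threshold; deposition replenishes
    it, lowering the threshold. The constant `decay` term stands in for the
    artificial decrease rate mentioned in the abstract. All coefficients
    here are placeholders, not values from the UK Mars GCM."""
    tau = tau + k_e * emission - k_d * deposition - decay
    return max(tau, tau_min)

tau_next = update_threshold(1.0, emission=0.5, deposition=0.2)
```

Because the threshold field integrates net emission minus deposition, it doubles as the proxy for surface dust cover that the abstract compares with the observed dust distribution.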

Relevance: 30.00%

Publisher:

Abstract:

Solar radiation sustains and affects all life forms on Earth. The increase in solar UV radiation at environmental levels, due to depletion of the stratospheric ozone layer, raises serious issues of social concern. This is all the more dramatic in tropical and subtropical regions, where radiation intensity is even higher. There is therefore a need to evaluate the harmful effects of solar UV radiation on the DNA molecule as a basis for assessing the risks to human health, biological productivity and ecosystems. In order to evaluate the profile of DNA damage induced by this form of radiation and its genotoxic effects, plasmid DNA samples were exposed to artificial UV lamps and directly to sunlight. The induction of cyclobutane pyrimidine dimer photoproducts (CPDs) and oxidative DNA damage in these molecules was evaluated by means of specific DNA repair enzymes. The biological effects of such lesions were determined through analysis of the DNA inactivation rate and mutation frequency after replication of the damaged pCMUT vector in an Escherichia coli MBL50 strain. The results indicated the induction of a significant number of CPDs after exposure to increasing doses of UVC, UVB and UVA radiation and sunlight. Interestingly, these photoproducts are the lesions that correlate best with plasmid inactivation as well as mutagenesis, while the induced oxidative DNA damage correlates only weakly with these effects. The results indicate that DNA photoproducts play the main role in the induction of genotoxic effects by artificial UV radiation sources and sunlight. (C) 2010 Elsevier B.V. All rights reserved.

Relevance: 30.00%

Publisher:

Abstract:

Over the last decade the problem of surface inspection has received great attention from the scientific community; quality control and product maintenance are key points in several industrial applications. Railway associations spend large sums checking the railway infrastructure, a field in which periodic surface inspection can help the operator prevent critical situations, so its maintenance and monitoring are important concerns. Surface inspection also matters to railroad authorities when investigating track components, identifying problems and finding ways to solve them. In the railway industry, problems usually appear in sleepers, overhead lines, fasteners, rail heads, switches and crossings, and in the ballast section. In this thesis work I have reviewed research papers on AI techniques combined with non-destructive testing (NDT) techniques, which can collect data from the test object without causing any damage. The reviewed works demonstrate that AI-based systems can address most of these problems in this transportation domain, and do so reliably and efficiently. I have also reviewed solutions and products offered by different companies based on AI techniques, along with white papers provided by some of those companies. AI-based techniques such as machine vision, stereo vision, laser-based techniques and neural networks are used in most cases to solve problems otherwise handled manually by railway engineers. These techniques are applied within the NDT approach, a very broad, interdisciplinary field that plays a critical role in assuring that structural components and systems perform their function in a reliable and cost-effective fashion.

The NDT approach verifies the uniformity, quality and serviceability of materials without damaging the material being tested. Common methods include visual and optical testing, radiography, magnetic particle testing, ultrasonic testing, penetrant testing, electromechanical testing and acoustic emission testing. Inspection is carried out periodically for better maintenance, performed by railway engineers with the aid of AI-based techniques. The main idea of this thesis is to show how problems in this transportation area can be reduced, based on the work done by different researchers and companies; I also provide comments on those works and propose better inspection methods where needed. The scope of this thesis is the automatic interpretation of NDT data, with the goal of detecting flaws accurately and efficiently; AI techniques such as neural networks, machine vision, knowledge-based systems and fuzzy logic have been applied to a wide spectrum of problems in this area. A further aim is to provide insight into possible research methods concerning railway sleeper, fastener, ballast and overhead inspection through automatic interpretation of data. I discuss problems arising in railway sleepers, fasteners, overhead lines and ballasted track, review research papers related to these areas, and demonstrate how the proposed systems work and what results they achieve, highlighting the advantages of AI techniques over the manual systems that existed previously.

In summary, this work brings together the findings of a large number of research papers deploying artificial intelligence (AI) techniques for the automatic interpretation of data from non-destructive testing, focusing on the rail transport domain: the inspection of railway sleepers, fasteners, ballast and overhead lines.

Relevance: 30.00%

Publisher:

Abstract:

Forest nurseries are essential for producing good-quality seedlings and are thus a key element in the reforestation process. With increasing climate change awareness, nursery managers are looking for new tools that can help reduce the environmental effects of their operations. The ZEPHYR project, funded by the European Commission under the Seventh Framework Programme (FP7), has the objective of finding new alternatives for nurseries by developing innovative zero-impact technologies for forest plant production. Because of their direct relationship to the energy consumption of the nurseries, one of the main elements addressed is the grow lights used for pre-cultivation. New LED luminaires with a light spectrum tailored to the seedlings' needs are being studied and compared against traditional fluorescent lamps. Seedlings of Picea abies and Pinus sylvestris were grown under five different light spectra (one fluorescent and four LED) for 5 weeks with a photoperiod of 16 hours at 100 μmol m-2 s-1 and 60% humidity. In order to evaluate whether these seedlings were able to cope with real field stress conditions, a forest field trial was also designed; the terrain chosen was a typical planting site in mid-Sweden after clear-cutting. Two vegetation periods after outplanting, the seedlings pre-cultivated under the LED lamps had performed at least as well as those grown under fluorescent lights. These results show that there is good potential for lighting substitution in forestry nurseries.
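Although the abstract does not quote it, the stated growing conditions fix the daily light integral (DLI) the seedlings received, which is the quantity usually compared across lighting treatments:

```python
# DLI from the conditions stated in the abstract (the DLI figure itself is
# computed here, not taken from the text).
ppfd = 100e-6                        # photon flux density, mol m^-2 s^-1
photoperiod_h = 16                   # hours of light per day
dli = ppfd * photoperiod_h * 3600    # daily light integral, mol m^-2 day^-1
# 100e-6 * 16 * 3600 = 5.76 mol m^-2 per day
```

Expressing both the fluorescent and LED treatments as a common DLI is what makes the growth comparison fair: only the spectrum differs, not the total photon dose.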

Relevance: 30.00%

Publisher:

Abstract:

This work proposes an environment for programming the programmable logic controllers (PLCs) applied to oil wells that use the progressing cavity pump (BCP) artificial lift method. The environment provides an editor based on the Sequential Function Chart (SFC) diagram for programming PLCs. This language was chosen because it is high-level and accepted by the international standard IEC 61131-3. The use of these control programs in a real PLC is made possible by an intermediate language level based on the PLCopen TC6 XML specification. For testing and validation of the control programs, an area is available for viewing variables obtained through communication with a real PLC. Thus, the main contribution of this work is a computational environment that allows modelling, testing and validating controls represented in SFC and applied to oil wells using the BCP artificial lift method.
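The idea of an SFC intermediate representation serialised to XML can be sketched as follows. The element and attribute names below are invented for illustration; the actual environment targets the PLCopen TC6 XML schema, which is far richer than this.

```python
import xml.etree.ElementTree as ET

def sfc_to_xml(steps, transitions):
    """Serialise a tiny SFC model (steps plus guarded transitions) to XML.
    Simplified stand-in for a PLCopen TC6 export, not the real schema."""
    root = ET.Element("sfc")
    for name, initial in steps:
        ET.SubElement(root, "step", name=name, initial=str(initial).lower())
    for src, dst, cond in transitions:
        ET.SubElement(root, "transition", source=src, target=dst,
                      condition=cond)
    return ET.tostring(root, encoding="unicode")

# One initial step, one target step, one transition guarded by a condition.
doc = sfc_to_xml([("Idle", True), ("Pumping", False)],
                 [("Idle", "Pumping", "start_cmd")])
```

An XML intermediate of this kind is what lets the same edited diagram be validated in the environment and then handed to vendor tooling for download to a real PLC.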

Relevance: 30.00%

Publisher:

Abstract:

In the Electrical Submersible Pump (ESP) artificial lift method, energy is transmitted to the bottom of the well through a flat electrical cable, where it is converted into mechanical energy by a subsurface motor connected to a centrifugal pump. The pump transmits energy to the fluid in the form of pressure, bringing it to the surface. In this method the subsurface equipment is basically divided into pump, seal and motor. The main function of the seal is to protect the motor, preventing the motor oil from being contaminated by the produced fluid and the consequent burning of the motor. Over time the seal wears, and contamination of the motor oil begins, causing it to lose its insulating characteristics. This work presents the design of a magnetic sensor capable of detecting contamination of the insulating oil used in the ESP artificial lift method. The objective of this sensor is to generate an alarm signal at the moment contamination of the insulating oil appears, enabling predictive maintenance. The prototype was designed to work in harsh conditions, reaching depths of 2000 m and temperatures up to 150 °C. Simulation software was used to define the mechanical and electromagnetic variables, and field experiments were performed to validate the prototype. The final tests, performed on an ESP system with a 62 HP motor, showed good reliability and fast response of the prototype.
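The "alarm only when contamination is present" behaviour suggests simple threshold logic; adding a hysteresis band keeps a noisy sensor signal from toggling the alarm around the trip point. Everything here, the normalised signal and both thresholds, is a hypothetical sketch, not the prototype's electronics.

```python
def contamination_alarm(readings, on_level=0.8, off_level=0.6):
    """Hysteresis alarm over a normalised contamination signal (0..1).
    The signal must cross `on_level` to raise the alarm and fall below
    `off_level` to clear it. Threshold values are placeholders."""
    alarm, states = False, []
    for x in readings:
        if not alarm and x >= on_level:
            alarm = True          # contamination detected: latch the alarm
        elif alarm and x <= off_level:
            alarm = False         # signal well clear of the band: release
        states.append(alarm)
    return states

states = contamination_alarm([0.1, 0.85, 0.7, 0.5, 0.9])
```

The gap between the two levels sets how much sensor noise the alarm tolerates; a single threshold (`on_level == off_level`) would chatter on a signal hovering near the trip point.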

Relevance: 30.00%

Publisher:

Abstract:

This work proposes a computer simulator for sucker-rod-pumped vertical wells. The simulator represents the dynamic behaviour of the system and computes several important parameters, allowing easy visualisation of the relevant phenomena. It allows tests to be run at lower cost and in shorter time than experiments on real wells. The simulation uses a model based on the dynamic behaviour of the rod string, represented by a second-order partial differential equation; through this model, several common field situations can be verified. Moreover, the simulation includes 3D animations, facilitating physical understanding of the process through better visual interpretation of the phenomena. Another important characteristic is the emulation of the main sensors used in sucker rod pumping automation. The emulation of the sensors is implemented through a microcontrolled interface between the simulator and industrial controllers; by means of this interface, the controllers interpret the simulator as a real well. A "fault module" was included in the simulator, incorporating the six most important faults found in sucker rod pumping. The analysis and verification of these problems through the simulator allows the user to identify situations that could otherwise be observed only in the field. The simulation of these faults receives special treatment due to the different boundary conditions imposed on the numerical solution of the problem. Possible applications of the simulator include the design and analysis of wells, training of technicians and engineers, execution of tests on controllers and supervisory systems, and validation of control algorithms.
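In sucker rod pumping models, the second-order PDE mentioned is typically a damped wave equation for the rod displacement u(x, t): u_tt = a^2 u_xx - c u_t (Gibbs' model). A minimal explicit finite-difference sketch follows; the coefficients and the simplified boundary conditions (sinusoidal polished-rod motion at the surface, a free end at the pump) are illustrative, not the thesis implementation.

```python
import math

def rod_displacement(n=50, steps=400, a=4900.0, length=1000.0, c=0.5,
                     stroke=1.0, omega=2 * math.pi / 10):
    """Explicit finite differences for u_tt = a^2 u_xx - c u_t.
    n grid nodes along a rod of `length` metres; `a` is the stress-wave
    speed in steel, `c` a viscous damping coefficient (both placeholders)."""
    dx = length / (n - 1)
    dt = 0.5 * dx / a                  # satisfies the CFL stability condition
    r2 = (a * dt / dx) ** 2
    u_prev = [0.0] * n
    u = [0.0] * n
    for step in range(1, steps + 1):
        t = step * dt
        u_next = [0.0] * n
        u_next[0] = stroke * math.sin(omega * t)   # polished-rod position
        for i in range(1, n - 1):
            u_next[i] = (2 * u[i] - u_prev[i]
                         + r2 * (u[i + 1] - 2 * u[i] + u[i - 1])
                         - c * dt * (u[i] - u_prev[i]))
        u_next[-1] = u_next[-2]        # free end (du/dx = 0) at the pump
        u_prev, u = u, u_next
    return u

u = rod_displacement()
```

Faults such as fluid pound or a stuck pump are modelled in simulators of this kind precisely by swapping the boundary condition at the pump end, which is why the abstract notes that fault simulation needs different numerical treatment.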

Relevance: 30.00%

Publisher:

Abstract:

Wireless sensor networks are a reality nowadays. The growing need for connectivity between existing industrial plant equipment drives the research and development of several technologies. The IEEE 802.15.4 LR-WPAN standard is a viable low-cost, power-saving solution, both important concerns when making decisions on remote sensing projects. This study proposes a wireless communication system that makes possible the monitoring of analog and/or digital variables (e.g., the pressure studied here) involved in artificial lift methods for oil and gas. The main goals are: to develop software based on the SMAC standard in order to create a wireless network for monitoring analog and/or digital variables; to evaluate the communication link based on the number of lost packets, tested in different environments (indoor and outdoor); and to propose an instrumentation system consisting of wireless devices.
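Evaluating the link by counting lost packets can be done from the sequence numbers of the packets actually received. A minimal sketch (the window handling and parameter names are assumptions, not the study's software):

```python
def packet_loss_rate(received_seq, first, last):
    """Fraction of packets lost in the window [first, last], judged from
    the sequence numbers actually received; duplicates are ignored."""
    expected = last - first + 1
    got = len({s for s in received_seq if first <= s <= last})
    return (expected - got) / expected

# Window of 5 packets; packet 3 never arrived and packet 2 was duplicated.
rate = packet_loss_rate([1, 2, 2, 4, 5], first=1, last=5)
```

Running the same count over indoor and outdoor captures gives directly comparable loss rates, which is the comparison the study proposes for qualifying the 802.15.4 link.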