16 results for periodically poled
at Universidade Federal do Rio Grande do Norte (UFRN)
Abstract:
This study aims to identify the critical success factors of local family businesses from a competitive perspective. To that end, it examines the culture and management of successful family businesses, identifies their essential areas of performance, establishes the factors that restrict success, and analyzes how strongly the critical success factors influence the competitiveness of this type of company. Because the subject has been little explored, and because this study provides an overview of the factors that lead family businesses to success, the research is exploratory. At the same time, since it describes the characteristics of prominent family businesses on the local scene and is concerned with practical performance, it is also descriptive. The sample is non-probabilistic and intentional, selected by accessibility. Data were collected through direct contact, using a research instrument composed of variables such as management, culture, critical success factors, and competitiveness. The study shows that, with regard to the management and culture of successful family businesses, some variables deviate from the pattern of conventional family businesses described in the literature. In general, successful family businesses exhibit a higher level of professionalization of management. Regarding the value given to knowledge, the study shows that conventional family businesses attach little importance to it, in contrast to the prominent family businesses, which strongly value the pursuit of knowledge. It is also shown that successful family businesses, although not fully professionalized, have a higher level of management professionalization, which helps explain why most of them periodically carry out formal Strategic Planning. In short, the results point to 17 critical success factors for family businesses, notably product and service quality and the use of technology.
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior
Abstract:
Several mobile robot navigation methods require measuring the robot's position and orientation in its workspace. For wheeled mobile robots, odometry-based techniques determine the robot's localization by integrating the incremental displacements of its wheels. However, this technique is subject to errors that accumulate with the distance traveled, making its exclusive use unfeasible. Other methods rely on detecting natural or artificial landmarks present in the environment at known locations. This technique does not generate cumulative errors, but it may require more processing time than odometry-based methods. Thus, many methods combine both techniques, so that odometry errors are periodically corrected using measurements obtained from landmarks. Following this approach, this work proposes a hybrid localization system for wheeled mobile robots in indoor environments based on odometry and natural landmarks. The landmarks are the straight lines defined by the junctions in the environment's floor, forming a two-dimensional grid. Landmark detection in digital images is performed with the Hough transform, combined with heuristics that allow it to run in real time. To reduce the landmark search time, we propose mapping the odometry errors onto an area of the captured image that has a high probability of containing the sought mark.
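A minimal sketch of the kind of detection step described above, using OpenCV's standard Canny edge detector and Hough line transform restricted to a region of interest predicted from odometry. The region of interest, thresholds, and file name are illustrative assumptions, not parameters from the thesis.

    import cv2
    import numpy as np

    def detect_floor_lines(gray, roi, canny_lo=50, canny_hi=150, votes=80):
        """Detect straight floor-junction lines inside a region of interest.

        gray : full grayscale frame from the robot camera
        roi  : (x, y, w, h) window where odometry predicts the landmark lies
        """
        x, y, w, h = roi
        window = gray[y:y + h, x:x + w]          # restrict the search to the ROI
        edges = cv2.Canny(window, canny_lo, canny_hi)
        # Standard Hough transform: each returned (rho, theta) pair is a line candidate
        lines = cv2.HoughLines(edges, 1, np.pi / 180, votes)
        return [] if lines is None else [tuple(l[0]) for l in lines]

    # Example usage with an illustrative image file
    frame = cv2.imread("floor_frame.png", cv2.IMREAD_GRAYSCALE)
    if frame is not None:
        print(detect_floor_lines(frame, roi=(100, 200, 320, 240)))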
Abstract:
A serious problem affecting an oil refinery's processing units is the deposition of solid particles, or fouling, on the equipment. These residues are naturally present in the oil or are by-products of chemical reactions during its transport. A fouled heat exchanger loses its capacity to heat the oil adequately and must be shut down periodically for cleaning. Knowing in advance the best time to shut down the exchanger can improve the energy and production efficiency of the plant. In this work we develop a system to predict fouling in a heat exchanger of the Potiguar Clara Camarão Refinery, based on data collected in a partnership with Petrobras. Recurrent neural networks are used to predict the heat exchanger's flow at future times. This variable is the main indicator of fouling, because its value decreases gradually as deposits on the tubes reduce their diameter. The prediction can be used to tell when the flow will have dropped below an acceptable value, indicating when the exchanger must be shut down for cleaning.
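A hedged sketch of one way such a flow predictor could be set up with a simple recurrent network in Keras; the window length, layer sizes, training settings, and file name are illustrative assumptions rather than the configuration used in the thesis.

    import numpy as np
    import tensorflow as tf

    WINDOW = 24  # assumed number of past flow readings fed to the network

    def make_windows(series, window=WINDOW):
        """Turn a 1-D flow series into (samples, window, 1) inputs and next-step targets."""
        X = np.stack([series[i:i + window] for i in range(len(series) - window)])
        y = series[window:]
        return X[..., np.newaxis], y

    model = tf.keras.Sequential([
        tf.keras.layers.SimpleRNN(32, input_shape=(WINDOW, 1)),
        tf.keras.layers.Dense(1),          # predicted flow at the next time step
    ])
    model.compile(optimizer="adam", loss="mse")

    # flow = np.loadtxt("exchanger_flow.csv")  # hypothetical measured flow series
    # X, y = make_windows(flow)
    # model.fit(X, y, epochs=50, validation_split=0.2)

Once trained, the model is applied recursively to forecast the flow forward in time until the prediction crosses the acceptable threshold, giving an estimate of the next cleaning shutdown.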
Abstract:
In Natal, individual disposal systems for domestic sewage still predominate, since only 29% of the city is served by a sewerage network. The waste that accumulates in these individual treatment systems must be pumped out periodically, a service provided by collection companies, some of which cause considerable damage to the environment. In Natal, only two companies have their own septage treatment systems, and these were designed with domestic-sewage parameters, resulting in strained and inefficient systems. Characterization of the septage is therefore essential as a source of design parameters. Thus, this work presents the physical-chemical and microbiological characterization of waste pumped from individual sewage treatment systems. Samples were collected weekly from five different trucks at the reception point of the treatment plant, at the preliminary treatment stage. From each truck, five samples were taken during discharge to form a composite sample. The samples were then taken to the laboratory and analyzed for temperature, pH, conductivity, BOD, COD, nitrogen (ammonia and organic), alkalinity, oils, phosphorus, solids, faecal coliforms, and helminth eggs. The results were treated as a single database and classified according to generating source (multi- and single-family housing, lodging, health, services and/or food), area of origin (metropolitan, south, and north), and type of system (cesspit, septic tank and/or sink). These data show that the type of system adopted by most of Natal and the metropolitan region is the cesspit, and they confirm the difference between the septage of areas whose populations have different social and economic characteristics. The septage was found to have higher concentrations than domestic sewage, except for thermotolerant coliforms, which showed concentrations of 1.38E+07. Among the parameters studied, the median values were COD (3,549 mg/L), BOD (973 mg/L), and total solids (3,557 mg/L). The volatile fraction constitutes about 70% of the total solids of the septage. For helminths, the median was 7 eggs/L. In general, the characteristics of the waste followed the variability found in the literature for all variables, showing wide ranges.
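A minimal sketch of the kind of grouping used above to rank the results by generating source, area of origin, and system type; the column names and input file are illustrative assumptions, not the thesis database.

    import pandas as pd

    # Hypothetical table: one row per composite sample with its classification and measurements
    df = pd.read_csv("septage_samples.csv")  # assumed columns: source, area, system, cod, bod, total_solids

    # Median of each parameter per generating source, area of origin, and system type
    medians = df.groupby(["source", "area", "system"])[["cod", "bod", "total_solids"]].median()
    print(medians)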
Abstract:
In this work, we present a theoretical study of the propagation of electromagnetic waves in multilayer structures called photonic crystals. For this purpose, we investigate the phonon-polariton band gaps in periodic and quasi-periodic (Fibonacci-type) multilayers made up of both positive- and negative-refractive-index materials in the terahertz (THz) region. The behavior of the polaritonic band gaps as a function of the multilayer period is investigated systematically. We use a theoretical model based on the transfer-matrix formalism to simplify the algebra involved in obtaining the dispersion relation of the phonon-polaritons (bulk and surface modes). We also present a quantitative analysis of the results, pointing out the distribution of the allowed polaritonic bandwidths for high Fibonacci generations, which gives good insight into their localization and power laws. We calculate the emittance spectrum of electromagnetic radiation at THz frequencies, normally and obliquely incident (s- and p-polarized modes) on a one-dimensional multilayer structure composed of positive- and negative-refractive-index materials arranged periodically and quasi-periodically. We model the negative-refractive-index material as an effective medium whose electric permittivity is characterized by a phonon-polariton frequency-dependent dielectric function, while the magnetic permeability has a Drude-like frequency dependence. Similar to the one-dimensional photonic crystal, this layered effective medium, called a polaritonic crystal, allows us to control electromagnetic propagation, generating regions named polaritonic band gaps. The emittance spectra are determined by means of a well-known theoretical model based on Kirchhoff's second law, together with a transfer-matrix formalism. Our results show that omnidirectional band gaps appear in the THz regime, in a well-defined interval, and are independent of polarization in both the periodic and the quasi-periodic cases.
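For reference, the textbook transfer-matrix result for a periodic two-layer (A/B) stack at normal incidence takes the form below; it is the standard relation the formalism above builds on, not an equation quoted from this abstract.

    \cos(QD) = \cos(k_A d_A)\cos(k_B d_B) - \frac{1}{2}\left(\frac{Z_A}{Z_B} + \frac{Z_B}{Z_A}\right)\sin(k_A d_A)\sin(k_B d_B)

Here Q is the Bloch wavevector, D = d_A + d_B is the period, k_j = n_j \omega / c, and Z_j is the impedance of layer j; frequencies for which the right-hand side exceeds unity in magnitude admit no real Q and lie inside a band gap.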
Abstract:
We study magnetic interface roughness in F/AF bilayers. Two kinds of roughness were considered. The first consists of isolated defects that divide the substrate into two regions, each with one AF sub-lattice. The interface exchange coupling is considered uniform and changes abruptly at the defect line, favoring Néel wall nucleation. Our results show how the threshold thickness for the reorientation of the magnetization in the ferromagnetic film depends on the interface field. Angular profiles show the relaxation of the magnetization from a Néel wall at the interface to the reoriented state at the surface. An external magnetic field perpendicular to the easy axis of the substrate favors the reoriented state. Depending on the intensity of the external magnetic field parallel to the easy axis of the AF, the magnetization profile at the surface can be parallel or perpendicular to the field direction. The second kind consists of periodically distributed defects. The shape of the hysteresis curves, the exchange bias, and the coercivity were characterized as functions of the interface field intensity and the roughness pattern. Our results show that dipolar effects decrease the exchange bias and the coercivity.
Abstract:
Context-aware applications are typically dynamic and use services provided by several sources, with different quality levels. The quality of context information is expressed in terms of Quality of Context (QoC) metadata, such as precision, correctness, refreshment, and resolution. Service quality, in turn, is expressed via Quality of Service (QoS) metadata such as response time, availability, and error rate. To ensure that an application is using services and context information that meet its requirements, it is essential to monitor these metadata continuously. For this purpose, a QoS and QoC monitoring mechanism is needed that meets the following requirements: (i) support measurement and monitoring of QoS and QoC metadata; (ii) support synchronous and asynchronous operation, enabling the application both to gather the monitored metadata periodically and to be notified asynchronously whenever a given metadata item becomes available; (iii) use ontologies to represent information, in order to avoid ambiguous interpretation. This work presents QoMonitor, a module for QoS and QoC metadata monitoring that meets the above requirements. The architecture and implementation of QoMonitor are discussed. To support asynchronous communication, QoMonitor uses two protocols: JMS and Light-PubSubHubbub. To illustrate QoMonitor in the development of ubiquitous applications, it was integrated into OpenCOPI (Open COntext Platform Integration), a middleware platform that integrates several context-provision middleware systems. To validate QoMonitor, we used two applications as proofs of concept: an oil and gas monitoring application and a healthcare application. This work also evaluates the performance of QoMonitor for both synchronous and asynchronous requests.
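A hedged sketch of the synchronous/asynchronous monitoring pattern described in requirement (ii); the class and method names are hypothetical illustrations, not QoMonitor's actual API.

    import threading
    import time
    from typing import Callable, Dict

    class MetadataMonitor:
        """Illustrative QoS/QoC monitor: clients may poll synchronously or subscribe for callbacks."""

        def __init__(self, probe: Callable[[], Dict[str, float]], period_s: float = 5.0):
            self._probe = probe              # function that measures the current metadata
            self._period = period_s
            self._latest: Dict[str, float] = {}
            self._subscribers = []
            threading.Thread(target=self._loop, daemon=True).start()

        def get(self) -> Dict[str, float]:
            """Synchronous access: return the most recently measured metadata."""
            return dict(self._latest)

        def subscribe(self, callback: Callable[[Dict[str, float]], None]) -> None:
            """Asynchronous access: notify the callback whenever new metadata are available."""
            self._subscribers.append(callback)

        def _loop(self) -> None:
            while True:
                self._latest = self._probe()
                for cb in self._subscribers:
                    cb(self._latest)
                time.sleep(self._period)

    # Example: a fake probe standing in for a real QoS measurement of a service
    monitor = MetadataMonitor(lambda: {"response_time_ms": 42.0, "availability": 0.99})
    monitor.subscribe(lambda md: print("update:", md))
    time.sleep(0.1)
    print("poll:", monitor.get())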
Abstract:
Northeastern Brazil is mainly formed by crystalline terrains (around 60% of its area). Moreover, the region has a semi-arid climate and is periodically subject to droughts. Furthermore, ground water extracted from wells usually presents poor quality because of its high salinity content. Nevertheless, ground water is still a very important source of water for human and animal consumption in this region. Well siting in hard-rock terrains in Northeastern Brazil has a mean success index of about 60%, a successful siting being defined as a well producing at least 0.5 m³/h. This low index reveals a lack of knowledge about the true conditions of storage and percolation of ground water in crystalline rocks. Two models for structures that store and produce ground water in crystalline rocks in Northeastern Brazil have been proposed in the literature. In the first model, traditionally used for well siting since the sixties, drainages are controlled by faults or fracture zones. This model is commonly referred to in the Brazilian hydrogeological literature as the "creek-crack" model (riacho-fenda in Portuguese). Sites that appear to present a dense drainage network are preferred for well siting, particularly at points where the drainages cross-cut each other. Field follow-up work is usually based only on geological criteria. The second model is the "eluvio-alluvial trough" (calha eluvio-aluvionar in Portuguese); it is also described in the literature but has not yet been incorporated into well-siting practice. This model is based on the hypothesis that rectilinear drainages can also be controlled by the foliation of the rock. Eventually, depending on the degree of weathering, a trough-shaped structure filled with sediments (alluvium and regolith) can develop, which can store ground water and from which water can be produced. Using several field case studies, this thesis presents a thorough analysis of the two models cited above and proposes a new model. The analysis is based on an integrated methodological approach using geophysics and structural geology. Both land (resistivity and ground-penetrating radar, GPR) and airborne (magnetic and frequency-domain electromagnetic) surveys were used. Structural analysis emphasized neotectonic aspects; in general, it was found that fractures in the E-W direction are relatively open compared to fractures in the N-S direction, probably because the E-W fractures were opened by the neotectonic stress regime in Northeastern Brazil, which is controlled by E-W compression and N-S extension. The riacho-fenda model is valid where drainages are controlled by fractures. The degree of fracturing and the associated weathering dictate the hydrogeological potential of the structure. Field work in structural analogues reveals that subvertical fractures show consistent directions both at outcrop and at aerial-photograph scales. Geophysical surveys reveal subvertical conductive anomalies associated with the fracture network controlling the drainage; one of the borders of the conductive anomaly usually coincides with the drainage. An aspect of particular importance for the validation of fracture control is the possible presence of relatively deep conductive anomalies without continuation or propagation to the surface. The conductive nature of the anomaly is due to the presence of weathered rock and sediments (alluvium and/or regolith) storing ground water, which occur associated with the fracture network.
Magnetic surveys are not very sensitive to these structures. If the soil or covering sediments are resistive (> 100 Ohm.m), GPR can be used to image the fracture network precisely. A major limitation of the riacho-fenda model, revealed by GPR images, is that subhorizontal fractures play a very important role in connecting the fracture network, besides connecting shallow recharge zones to relatively deep subvertical fracture zones. If fractures exert only a secondary control on the drainage, however, the riacho-fenda model may have very limited validity; in these cases, large portions of the drainage do not coincide with fractures, and most of the wells located near the drainage turn out dry. Usually, a secondary control of the drainage by the fracture network can be revealed only by a detailed geophysical survey. The calha eluvio-aluvionar model is valid where drainages are controlled by foliation. The degree of weathering of the foliation planes dictates the hydrogeological potential of the structure. Outcrop analysis reveals that foliation and drainage directions are parallel and that no fractures occur, or only fractures with directions different from the drainage direction. Geophysical surveys reveal slab-shaped conductive anomalies associated with the trough of weathered rock and sediments (alluvium and/or regolith). Magnetic surveys can offer very good control on the foliation direction. An important aspect for validating foliation control is the presence of conductive anomalies whose shallow and deep portions are linked. If there is an extensive soil cover, riacho-fenda and calha eluvio-aluvionar controls can easily be misinterpreted in the absence of geophysical control. This fact could certainly explain at least part of the failure index in well siting. The weathering-sack model (bolsão de intemperismo in Portuguese) is proposed to explain cases where very intensive weathering occurs over the crystalline rock, so that a secondary interstitial porosity is created. The water is then stored in the pores of the regolith, in a manner similar to sedimentary rocks. A possible example of this model was detected by a land geophysical survey, in which a relatively very deep, isolated, slab-shaped conductive anomaly was found. If this structure does store ground water, there must certainly be a link between the deep structure and the surface in order to provide water feeding. This model might explain anomalous water yields as great as 50 m³/h that can sometimes occur in crystalline rocks in Northeastern Brazil.
Abstract:
Epidemiological surveys are important for obtaining information on the prevalence and etiology of oral diseases, since the data collected allow health actions to be planned, performed, and assessed. Methodological uniformity is necessary, however, to maintain reproducibility, validity, and reliability, and to allow national and international comparisons. The initiative of the World Health Organization (WHO) as an advisor for ongoing surveys has been extremely useful, stimulating standardization in all countries. In 1991, a Portuguese version of the 1987 third edition of Oral Health Surveys: Basic Methods, an instruction manual for performing epidemiological surveys, was published and became a reference for many parts of Brazil and the world. The present analysis found conflicting points in relation to sample size, calibration of the examiners, and criteria for evaluating oral health and treatment needs. In conclusion, given the dynamic nature of scientific knowledge and the regional differences in the development of oral diseases, we recommend that proposals for standardizing surveys be reviewed periodically. Other important issues may not have been detected in this analysis, urging a thorough discussion within the dentistry community as a whole.
Abstract:
Multi-objective problems may have many optimal solutions, which together form the Pareto-optimal set. A class of heuristic algorithms for these problems, called optimizers in this work, produces approximations of this optimal set. The approximation set kept by the optimizer may be limited or unlimited. The benefit of using an unlimited archive is the guarantee that all nondominated solutions generated during the process will be saved. However, due to the large number of solutions that can be generated, keeping such an archive and frequently comparing new solutions to the stored ones may demand a high computational cost. The alternative is to use a limited archive. The problem that emerges in this case is the need to discard nondominated solutions when the archive is full. Some techniques have been proposed to handle this problem, but investigations show that none of them can reliably prevent the deterioration of the archives. This work investigates a technique to be used together with ideas previously proposed in the literature for dealing with limited archives. The technique consists of keeping discarded solutions in a secondary archive and periodically recycling them, bringing them back into the optimization. Three recycling methods are presented. To verify whether these ideas can improve the archive contents during the optimization, they were implemented together with other techniques from the literature. A computational experiment with the NSGA-II, SPEA2, PAES, MOEA/D, and NSGA-III algorithms, applied to several classes of problems, is presented. The potential and the difficulties of the proposed techniques are evaluated through statistical tests.
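A minimal sketch of the recycling idea described above: a limited nondominated archive plus a secondary archive of discarded solutions that is periodically fed back into the optimizer. The dominance test, the capacity, and the discard policy are illustrative assumptions, not the exact scheme evaluated in the dissertation.

    import random
    from typing import List, Tuple

    Solution = Tuple[float, ...]  # one objective vector (minimization assumed)

    def dominates(a: Solution, b: Solution) -> bool:
        """True if a is at least as good as b in every objective and better in at least one."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    class RecyclingArchive:
        def __init__(self, capacity: int = 50):
            self.capacity = capacity
            self.primary: List[Solution] = []    # limited nondominated archive
            self.secondary: List[Solution] = []  # discarded solutions kept for recycling

        def add(self, s: Solution) -> None:
            if any(dominates(p, s) for p in self.primary):
                return
            # move solutions dominated by s into the secondary archive
            dominated = [p for p in self.primary if dominates(s, p)]
            self.primary = [p for p in self.primary if not dominates(s, p)]
            self.secondary.extend(dominated)
            self.primary.append(s)
            if len(self.primary) > self.capacity:
                # archive full: discard a random member instead of losing it entirely
                self.secondary.append(self.primary.pop(random.randrange(len(self.primary))))

        def recycle(self, k: int) -> List[Solution]:
            """Return up to k stored discarded solutions to be reinserted into the optimizer."""
            picked = self.secondary[:k]
            self.secondary = self.secondary[k:]
            return picked

An optimizer would call add() for every new nondominated solution it generates and, every few generations, inject the solutions returned by recycle(k) back into its population.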
Abstract:
This work seeks an alternative to the tantalum electrolytic capacitors currently on the market, because of their high cost. Niobium is a potential replacement, since it is lighter and cheaper than tantalum. The two elements belong to the same group of the periodic table and therefore exhibit several similar physical and chemical properties. Niobium is used in many technologically important applications, and Brazil holds the largest reserves, around 96%. Electrolytic capacitors have high specific capacitance, so they can store high energy in small volumes compared with other types of capacitors. This is the main attraction of this type of capacitor, given the growing demand for capacitors with increasingly high specific capacitance driven by the miniaturization of devices such as GPS receivers, televisions, computers, telephones, and many others. The capacitors were produced by powder metallurgy. The initial niobium powder was first characterized by XRD, SEM, and laser particle-size analysis and then sieved to a 400-mesh particle size. The powder was compacted at a pressure of 150 MPa and sintered at 1400, 1450, and 1500 °C, using two sintering times, 30 and 60 min. Sintering is an important part of the process, as it affects properties such as porosity and the surface cleanliness of the samples, which strongly influence the quality of the capacitor. After sintering, the samples underwent anodic oxidation (anodizing), which creates a thin film of niobium pentoxide over the whole surface of the sample; this film is the capacitor dielectric. The anodizing process variables strongly influence film formation and, consequently, the capacitor. The samples were characterized by electrical measurements of capacitance, loss factor, and ESR (equivalent series resistance). Sintering affected the porosity and, in turn, the specific area of the samples. The capacitor area is directly related to the capacitance: the larger the specific area, the higher the capacitance. Higher sintering temperatures decrease the surface area but eliminate more impurities. The best results were obtained at 1400 °C for 60 minutes. The most relevant results of specific capacitance and ESR were compared across all samples.
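The link between specific area and capacitance mentioned above follows from the parallel-plate relation; this is the standard formula, not one quoted from the dissertation:

    C = \varepsilon_0 \varepsilon_r \frac{A}{d}

where A is the effective (anodized) dielectric area, d the oxide-film thickness, and \varepsilon_r the relative permittivity of the niobium pentoxide film; a porous sintered pellet packs a large A into a small volume, which is why porosity and specific area govern the achievable capacitance.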