19 results for Fourier, Analise de
at Universidade Federal do Rio Grande do Norte (UFRN)
Abstract:
The management of organizational processes has been studied by administrative science as a way to break with the paradigm of the functional organizational structure through process management. Business Process Management (BPM), aligned with organizational strategies and increasingly supported by Information Technology (IT), provides clarity at the various points of the process, contributing to its continuous improvement with the goal of generating added value for the customer. Healthcare organizations are among the service providers that have been little studied with respect to process management. Thus, this study analyzed, through an empirical study of a qualitative nature, how hospital organizational processes are being conducted in light of BPM best practices. The research was carried out through a multiple case study conducted in two hospital organizations in the city of Natal/RN. The reference literature presented several factors for optimized performance in BPM, treated in this research as BPM best practices. From the literature review, a synthesis of BPM best practices was prepared, which served as the basis for the research model used for data collection and analysis. This model indicated eleven categories that were used to prepare the script for semi-structured interviews, analyzed through the content analysis technique with closed-grid categorization. The categories were grouped into two dimensions: elements related to management (governance, leadership, strategic alignment, culture, and knowledge) and elements related to processes (design, process owner, performers, information technology, and indicators); a third category, the process office, was also identified. For the selection of the research subjects, the chain or snowball sampling strategy was adopted. It was possible to identify that all the categories indicated in the research model emerge among the factors sought by hospital organizations for process management, with emphasis on the categories culture, knowledge, design, information technology, and indicators. In addition to the analysis categories, difficulties related to communication and to the integration of the various links of the process were identified. Furthermore, it was found that in the investigated hospitals there is a deviation from the concept of BPM with regard to its ultimate goal: adding value to the customer. The research concluded that process management in the investigated hospital organizations is at an initial or developing stage, and that it is necessary to overcome communication barriers and build an organizational culture oriented toward customer needs in order to apply BPM best practices. Future research on this topic in other hospital organizations may thus enable comparative studies and broaden knowledge on the subject.
Abstract:
The search for sustainable urban mobility has reshaped public policies for transport and circulation for all, in order to contribute to economic, social, and environmental well-being. Within this context, the main objective of this work is to review, in the city of Natal, state of Rio Grande do Norte, the deployment of the new road infrastructure of the Bernardo Vieira Avenue transport corridor, and to verify, at least with regard to the urban and environmental indicators chosen here to assess sustainable urban mobility, that the theory has been well constructed but that, in practice, little has been done to apply the proposed sustainability guidelines. To achieve this, a literature review is first carried out with the principal researchers on the subject, covering the concepts and indicators of sustainable urban mobility. In a second stage, a case study is conducted, using the methodology of environmental awareness, through the analysis of photographs, field notes, and testimonies in the study area, in order to reach the conclusions.
Abstract:
In this work, biodiesel was produced from castor oil, with glycerin as a by-product. The molar ratio between oil and alcohol, as well as the use of the catalyst (KOH) to promote the chemical reaction, is based on the literature. The best results were obtained using 1 mol of castor oil (260 g) to 3 moles of methyl alcohol (138 g), with 1.0% KOH as catalyst, at a temperature of 260 °C and with stirring at 120 rpm. The oil used was commercially available, and the process involves the transesterification reaction of a vegetable oil with methyl alcohol. The product of this reaction is an ester, biodiesel being the main product and glycerin the by-product, which underwent treatment for use as raw material for the production of allyl alcohol. The great advantage of using glycerin to obtain allyl alcohol is that its use eliminates the large amount of waste from biodiesel production and various forms of damage to the environment. The reaction for the formation of allyl alcohol was conducted from formic acid and glycerin in a 1/1 ratio, at a temperature of 260 °C in a heating mantle, the vapor being condensed by a spiral condenser over a period of 2 hours; the product obtained contains mostly allyl alcohol. The monitoring of the reactions was performed by UV-Visible spectrophotometry and Fourier transform infrared spectroscopy (FTIR); the analysis showed spectral changes indicating the formation of the product allyl alcohol (prop-2-en-1-ol) in the presence of water. This alcohol was named Alcohol GL. The absorption bands confirm that the reaction occurred: (υ C=C) at 1470-1600 cm-1 and (υ O-H) at 3610-3670 cm-1, attributed to the C=C and O-H groups, respectively. The thermal analysis was carried out in an SDT Q600 thermogravimetric analyzer, in which mass and temperature are displayed against time, allowing the approximate heating rate to be checked. The innovative methodology developed in the laboratory (LABTAM, UFRN) was able to treat the glycerin produced by the transesterification of castor oil and use it as raw material for the production of allyl alcohol, with an alcohol yield of 80%. This alcohol is of great importance in the manufacture of polymers, pharmaceuticals, organic compounds, herbicides, pesticides, and other chemicals.
Abstract:
The acidic galactan (AG) was obtained from the eggs of the mollusc Pomacea lineata by extraction, proteolysis, and acetone precipitation. Its structure was elucidated by a combination of chemical analysis, intrinsic viscosity, and 1D and 2D NMR spectroscopy. Biological aspects of AG were evaluated by in vivo assays of wound healing and induced peritonitis (anti-inflammatory activity) and by in vitro cytotoxicity assays (MTT). This polymer showed a simple structure, without sulfate or uronic acids. Its intrinsic and relative viscosities were evaluated at 0.44 ± 0.05 and 1.744 ± 0.07 dl.g-1. Spectroscopy showed that AG is composed predominantly of β-D-galactose, with N-acetyl-β-D-glucosamine present in a smaller proportion in the chain. The acidic character of this polysaccharide is given by the presence of pyruvate in the molecule, forming a six-membered cyclic acetal located on the β-D-galactose. The involvement of AG in the healing process was evaluated, and histological analysis revealed that, early in the healing process, there was strong stimulation of macrophages with granuloma formation, suggesting that AG may have promoted the progression of the biological events required for tissue healing. In the induced-peritonitis assay, AG showed a dose-dependent response, demonstrating an anti-inflammatory effect at concentrations above 20 mg/kg and an inflammatory character at the concentration of 1 mg/kg. In vitro tests using AG at a concentration of 1000 μg/mL showed proliferative activity by stimulating the growth of 3T3 cells, corroborating the in vivo findings and demonstrating the absence of cytotoxic activity.
Abstract:
In the current conjuncture, the environmental factor has been changing the position of companies, which are practicing or at least minimally adopting environmental management. Such a tool has been used by companies to face the problems caused by solid waste, in particular green coconut waste, which is constantly among the materials discarded by society (companies/consumers). The green coconut is a typical tropical fruit whose fresh water is very beneficial for human health, and its popularization has caused a progressive increase in its consumption. Following this line of thought, the present work carried out an analysis of strengths, weaknesses, opportunities, and threats (SWOT analysis) of green coconut solid waste management at two agribusiness companies in the state of Rio Grande do Norte (RN), Brazil, aiming to understand the challenges and the potential of this kind of waste. According to the approach to the problem, this work is descriptive, exploratory, and qualitative research. Data were collected through a questionnaire and a structured interview, in order to evaluate the strategic posture of the agribusiness companies through the SWOT analysis, an acronym for Strengths, Weaknesses, Opportunities and Threats. The SWOT analysis is an effective tool to analyze the internal and external environment of an organization. This tool helps to position the company within the environment in question and, when well applied, it enables the detection of mistakes, the strengthening of correct procedures, the avoidance of threats, and the pursuit of opportunities. The agribusiness companies studied have very similar profiles, such as a long business life span and a strategy that extends the useful life of the fruit by using its waste for the manufacturing of new by-products. In both, the daily quantity of waste resulting from this process reaches approximately 20 thousand units of the fruit in high season, making a focus on the use and/or treatment of this waste necessary. From the SWOT analysis, it was ascertained that agribusiness company A follows a defensive marketing strategy and acts in a vulnerable way, in other words, it is unable to act in this market segment, since it has decided to stop using the waste due to a lack of equipment and technology. On the other hand, agribusiness company B has adopted an offensive marketing strategy because, even without equipment, technology, and appropriate internal installations, it still insists on the use and exploitation of green coconut waste in its agribusiness. Thus, it is considered that the potential of green coconut waste management for the production of several by-products reduces the impacts caused by inappropriate disposal and generates profits in the short, medium, and long term. Such profits are both tangible and intangible, since the interest in sustainability actions is not only a matter of obtaining return on capital but also an important condition for staying in business, as it is no longer enough to have quality in products and processes. It is necessary to establish socio-environmental practices aimed at the company's image, which plays a prevailing role in consumers' buying decisions.
Abstract:
In the globalized modern world, telecommunications have assumed a key role within companies, causing a large increase in demand for wireless communication technology; in recent years the number of applications using this technology has grown considerably. Due to this demand, new materials are developed to enable new mechanisms for the control and propagation of electromagnetic waves. Research to develop new technologies for wireless communication is a multidisciplinary effort that ranges from new geometries for passive and active antennas to the development of materials for devices that improve performance in the operating frequency range. Recently, planar antennas have attracted interest due to their characteristics and advantages when compared with other types of antennas. In the area of mobile communications, antennas of this type are increasingly used, due to the intensive development of systems that need to operate in multiple frequency bands and over wide bandwidths. Microstrip antennas have a narrow bandwidth due to dielectric losses generated by radiation. Another limitation is the degradation of the radiation pattern due to the generation of surface waves in the substrate. Some techniques have been developed to minimize this bandwidth limitation, such as the study of PBG (Photonic Band Gap) materials to form the dielectric. The main objective of this work is the design of a multilayer slot resonator on a PBG-type substrate; the design was optimized through numerical analysis, and the device was initially proposed for the band of the electromagnetic spectrum between 3 and 9 GHz, which basically covers the S to X bands. RT/Duroid 5870 and RT/Duroid 6010.LM were used as dielectric materials; both are ceramic-filled PTFE laminates, with dielectric constants of 2.33 and 10.2, respectively. Through an experimental investigation, simulated and measured results were compared by observing the behavior of the radiation characteristics as the heights of the multilayer dielectric substrates were varied. The LTT method was also applied to rectangular slot resonator structures with multiple layers of PBG photonic material in order to obtain the resonance frequency and the full theory involving the electromagnetic parameters of the structure under consideration. The analysis developed in this work was performed using the LTT (Transverse Transmission Line) method in the Fourier transform domain, which uses a component propagating in the y direction (transverse to the real direction of propagation z) to treat the general equations of the electric and magnetic fields. The PBG theory is applied to obtain the relative permittivity for the s and p polarizations of the photonic composite substrate material. The results are obtained with the commercial software Ansoft HFSS, used for accurate analysis of the electromagnetic behavior of the planar device under study through the Finite Element Method (FEM). Numerical and computational results are presented in graphical form, in two and three dimensions, for the parameters of return loss, resonance frequency, radiation pattern, radiation efficiency, and surface current of the device under study, with photonic materials as substrates, simulated in an appropriate computational tool.
With respect to the planar device design, simulated and measured results are presented and show good agreement. These results consist mainly in the identification of the resonance modes and the determination of the characteristics of the designed device, such as resonant frequency, return loss, and radiation pattern.
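For a rough sense of how substrate permittivity and resonator dimensions set the operating frequency, here is a minimal sketch using the standard transmission-line model for a rectangular microstrip patch. This is a textbook approximation, not the LTT/FEM analysis used in the work; the patch dimensions are illustrative assumptions, and only the two Duroid permittivities come from the abstract.

```python
import math

def patch_resonant_frequency(W, L, h, eps_r, c=3.0e8):
    """Approximate dominant-mode resonant frequency of a rectangular
    microstrip patch using the classic transmission-line model.
    W, L, h in metres; eps_r is the substrate relative permittivity."""
    # Effective permittivity accounting for fringing fields
    eps_eff = (eps_r + 1) / 2 + (eps_r - 1) / 2 * (1 + 12 * h / W) ** -0.5
    # Length extension due to fringing at the radiating edges
    dL = 0.412 * h * ((eps_eff + 0.3) * (W / h + 0.264)) / \
         ((eps_eff - 0.258) * (W / h + 0.8))
    L_eff = L + 2 * dL
    return c / (2 * L_eff * math.sqrt(eps_eff))

# Illustrative dimensions (not from the thesis) on the two Duroid substrates
for eps_r in (2.33, 10.2):
    f = patch_resonant_frequency(W=16e-3, L=12e-3, h=1.57e-3, eps_r=eps_r)
    print(f"eps_r = {eps_r:5.2f} -> f_r ~ {f / 1e9:.2f} GHz")
```

With these assumed dimensions the two substrates give resonances of roughly 7.6 GHz and 3.8 GHz, both inside the 3-9 GHz band targeted by the design.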
Abstract:
A water heating system intended to supply hot water for bathing at home was studied; the absorbing surface of the collector is formed by a single polycarbonate plate. The polycarbonate plate is 6 mm thick, 1,050 mm wide, and 1,500 mm long, with an area of 1.575 m². The plate was attached by its parallel edges to 32 mm PVC tubes. The system worked under thermosiphon and was tested in two configurations: absorber plate with and without 30 mm thick EPS insulation on the bottom surface, in order to minimize heat losses from the bottom. The thermal reservoir of the heating system is an alternative, low-cost one, since it was built from a 200-litre polyethylene water-storage tank. Data are presented on the thermal efficiency, heat losses, and water temperature in the thermal reservoir at the end of the bath simulation process. The thermal, economic, and material feasibility of the proposed collector for its intended purpose is demonstrated.
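A minimal sketch of how the collector's instantaneous thermal efficiency can be estimated from measured quantities; the formula is the standard energy-balance definition, and the flow rate, irradiance, and temperatures below are illustrative assumptions, not data from the study (only the 1.575 m² plate area comes from the abstract).

```python
# Estimate the instantaneous thermal efficiency of a flat-plate collector:
# eta = useful heat gained by the water / solar radiation incident on the plate.

CP_WATER = 4186.0  # specific heat of water, J/(kg*K)

def collector_efficiency(mass_flow_kg_s, t_in_c, t_out_c,
                         irradiance_w_m2, area_m2):
    """Standard energy-balance efficiency of a solar collector."""
    useful_power = mass_flow_kg_s * CP_WATER * (t_out_c - t_in_c)  # W
    incident_power = irradiance_w_m2 * area_m2                     # W
    return useful_power / incident_power

# Illustrative values: assumed thermosiphon flow and midday irradiance
eta = collector_efficiency(mass_flow_kg_s=0.008, t_in_c=28.0, t_out_c=46.0,
                           irradiance_w_m2=900.0, area_m2=1.575)
print(f"Instantaneous efficiency ~ {eta:.0%}")
```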
Abstract:
In this work a new method is presented for the determination of the orbital period (Porb) of eclipsing binary systems, based on the wavelet technique. The method is applied to 18 eclipsing binary systems detected by the CoRoT (Convection, Rotation and planetary Transits) satellite. The periods obtained by wavelet were compared with those obtained by conventional methods: box fitting (EEBLS) for detached and semi-detached eclipsing binaries, and polynomial methods (ANOVA) for contact binary systems. Comparing the phase diagrams obtained by the different techniques, the wavelet method determines Porb better than EEBLS. In the case of contact binary systems, the wavelet method shows better results than the ANOVA method most of the time, but when the number of data points per orbital cycle is small, ANOVA gives more accurate results. Thus, the wavelet technique seems to be a great tool for the analysis of data with the quality and precision given by CoRoT and by upcoming photometric missions.
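For context, here is a minimal sketch of a brute-force period search by phase folding on a synthetic eclipsing light curve. This phase-dispersion idea is a much simpler stand-in for the wavelet, EEBLS, and ANOVA methods named above, and every value in the sketch is synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic eclipsing-binary light curve: flat flux with periodic dips
true_period = 2.73                       # days (illustrative)
t = np.sort(rng.uniform(0.0, 60.0, 4000))
phase = (t / true_period) % 1.0
flux = 1.0 - 0.3 * (np.abs(phase - 0.5) < 0.05) + rng.normal(0, 0.01, t.size)

def phase_dispersion(t, flux, period, nbins=50):
    """Sum of within-bin variances of the phase-folded light curve;
    it is minimised near the true period."""
    ph = (t / period) % 1.0
    bins = np.floor(ph * nbins).astype(int)
    return sum(flux[bins == b].var() for b in range(nbins) if np.any(bins == b))

# Scan trial periods and keep the one that folds the data most coherently
trial_periods = np.linspace(1.0, 5.0, 4000)
scores = np.array([phase_dispersion(t, flux, p) for p in trial_periods])
best = trial_periods[np.argmin(scores)]
print(f"recovered period ~ {best:.3f} d (true {true_period} d)")
```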
Abstract:
Oil prospecting is one of the most complex and important activities of the oil industry. Direct prospecting methods, such as drilling and well logging, are very expensive; consequently, indirect methods are preferred. Among the indirect prospecting techniques, seismic imaging is a relevant method. The seismic method is based on artificial seismic waves that are generated, travel through the geological medium undergoing diffraction and reflection, and return to the surface, where they are recorded and analyzed to construct seismograms. However, the seismogram contains not only actual geological information but also noise, and one of the main components of this noise is the ground roll. Noise attenuation is essential for a good geological interpretation of the seismogram. It is common to study seismograms by using time-frequency transformations that map the seismic signal into a frequency space where it is easier to remove or attenuate noise; the data are then reconstructed in the original space in such a way that geological structures are shown in more detail. The curvelet transform is a new and effective spectral transformation that has been used in the analysis of complex data. In this work, we employ the curvelet transform to represent seismic data using basis functions that are directional in space. This particular basis can represent two-dimensional objects with contours and lines more effectively. The curvelet analysis maps real space into frequency scales and angular sectors in such a way that we can distinguish in detail the sub-spaces where the noise lies and remove the coefficients corresponding to the undesired data. In this work we develop and apply this denoising analysis to remove the ground roll from seismograms. We apply the technique to a synthetic seismogram and to a real one; in both cases we obtain good noise attenuation.
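To illustrate the general transform, mute, and reconstruct workflow described above, the sketch below filters a toy shot gather in the plain 2-D Fourier (f-k) domain by muting low-apparent-velocity energy. The actual work uses the curvelet transform, whose directional, multiscale basis is better suited to separating ground roll; the array sizes, velocities, and cut-off here are illustrative assumptions.

```python
import numpy as np

def fk_groundroll_filter(gather, dt, dx, vmin=1200.0):
    """Attenuate low-apparent-velocity (ground-roll-like) energy in a shot
    gather by zeroing f-k coefficients with |f/k| < vmin.
    gather: 2-D array (time samples x traces). A crude stand-in for
    curvelet-domain coefficient removal."""
    nt, nx = gather.shape
    spec = np.fft.fft2(gather)
    f = np.fft.fftfreq(nt, d=dt)[:, None]   # temporal frequencies (Hz)
    k = np.fft.fftfreq(nx, d=dx)[None, :]   # spatial wavenumbers (1/m)
    # Apparent velocity of each (f, k) pair; keep only fast (steep) events
    v_app = np.abs(f) / np.maximum(np.abs(k), 1e-12)
    mask = v_app >= vmin
    return np.real(np.fft.ifft2(spec * mask))

# Toy gather: weak random background plus a slow linear "ground-roll" event
rng = np.random.default_rng(1)
nt, nx, dt, dx = 500, 60, 0.004, 10.0
gather = 0.05 * rng.normal(size=(nt, nx))
for ix in range(nx):
    t_roll = int((ix * dx / 600.0) / dt)     # 600 m/s linear event
    if t_roll < nt:
        gather[t_roll, ix] += 1.0
filtered = fk_groundroll_filter(gather, dt, dx, vmin=1200.0)
print("energy before/after:", np.sum(gather**2), np.sum(filtered**2))
```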
Abstract:
Nowadays, the chemistry content taught in high school continues to be presented in a fragmented and decontextualized manner by teachers and textbooks, even though it is known that contextualization and interdisciplinarity play an important role in the process of chemistry teaching and learning. Therefore, the present study aims at highlighting the importance of these methodological foundations in the learning of chemistry. The data acquisition on contextualization and interdisciplinarity in chemistry teaching was carried out through bibliographical research on chemistry textbooks, focused on the analysis of the topics of acids and bases, since this theme is studied throughout all three years of high school. The study also used questionnaires, applied in order to analyze to what extent chemistry teachers work in a contextualized and interdisciplinary manner throughout the chemistry teaching/learning process. The results show that contextualized and interdisciplinary teaching contributes to a more meaningful acquisition of chemistry knowledge, in a dynamic and interactive way, but there are still many obstacles to achieving this kind of chemistry teaching/learning process.
Abstract:
Two-level factorial designs are widely used in industrial experimentation. However, a design with many factors requires a large number of runs, and many replications of the treatments may not be feasible given limitations of resources and time, making the experiment expensive. In such cases, unreplicated designs are used. With only one replicate, however, there is no internal estimate of the experimental error on which to base judgements about the significance of the observed effects. One possible solution to this problem is to use normal plots or half-normal plots of the effects. Many experimenters use the normal plot, while others prefer the half-normal plot, often, in both cases, without justification. The controversy about the use of these two graphical techniques motivates this work, since there is no record of a formal procedure or statistical test that indicates which one is best. The choice between the two plots seems to be a subjective issue. The central objective of this master's thesis is, then, to perform a comparative experimental study of the normal plot and the half-normal plot in the context of the analysis of unreplicated 2^k factorial experiments. This study involves the construction of simulated scenarios, in which the performance of the plots in detecting significant effects and identifying outliers is evaluated in order to address the following questions: can one plot be better than the other? In which situations? What kind of information does one plot add to the analysis of the experiment that might complement the information provided by the other? What are the restrictions on the use of these plots? In this way, the work intends to confront the two techniques and to examine them simultaneously in order to identify similarities, differences, or relationships that contribute to the construction of a theoretical reference to justify, or to aid in, the experimenter's decision about which of the two graphical techniques to use and the reason for this use. The simulation results show that the half-normal plot is better for assisting in the judgement of the effects, while the normal plot is recommended for detecting outliers in the data.
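A minimal sketch of how effect estimates from an unreplicated 2^k design can be placed on a half-normal plot, the technique discussed above. The design matrix and response values are made up for illustration, and only numpy, scipy, and matplotlib are assumed.

```python
import numpy as np
from scipy.stats import norm
import matplotlib.pyplot as plt
from itertools import product

# Unreplicated 2^3 design: factors A, B, C at levels -1/+1 (illustrative data)
levels = np.array(list(product([-1, 1], repeat=3)))  # 8 runs
A, B, C = levels.T
y = np.array([45, 71, 48, 65, 68, 60, 80, 65], dtype=float)  # made-up responses

# Effect of a contrast column in a 2^k design: 2 * (column . y) / n
columns = {"A": A, "B": B, "C": C,
           "AB": A * B, "AC": A * C, "BC": B * C, "ABC": A * B * C}
effects = {name: 2.0 * col @ y / len(y) for name, col in columns.items()}

# Half-normal plot: ordered |effects| against half-normal quantiles
names = sorted(effects, key=lambda name: abs(effects[name]))
abs_eff = np.array([abs(effects[n]) for n in names])
m = len(abs_eff)
quantiles = norm.ppf(0.5 + 0.5 * (np.arange(1, m + 1) - 0.5) / m)

plt.scatter(quantiles, abs_eff)
for q, e, n in zip(quantiles, abs_eff, names):
    plt.annotate(n, (q, e))
plt.xlabel("half-normal quantile")
plt.ylabel("|effect|")
plt.title("Half-normal plot of effects (2^3 unreplicated design)")
plt.show()
```

Effects lying near the straight line through the origin behave like noise; points that break away from that line at the upper right are the candidates for significant effects.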
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico
Abstract:
The Borborema Province, Northeastern Brazil, has had its internal structure investigated by different geophysical methods, such as gravity, magnetics, and seismics. Additionally, many geological studies have also been carried out to define the structural domains of this province. Despite this plethora of studies, many important aspects of its evolution remain open. Here, we study the S-wave velocity structure of the crust using surface-wave dispersion. The dispersion of surface waves allows an estimate of the average thickness of the crust across the region between the stations. The inversion of the velocity structure was carried out using the inter-station dispersion of Rayleigh and Love surface waves. The teleseismic events used come mainly from the edges of the South and North American plates. The data were collected between 2007 and 2010, and we selected 7 events with magnitude above 5.0 MW and depth of up to 40 km. The difference between the event back-azimuths and the inter-station path was not greater than 10°. We also know the depth of the Moho from Receiver Function results (Novo Barbosa, 2008) and use those as constraints in the inversion. Even using different model parameterizations in the inversion, our results for the mean S-wave velocity profiles were very similar. For station pairs located in the Ceará Central Domain of the Borborema Province, there are depth ranges for which the S-wave velocities are very close. Most of the profiles show complexity near the Moho, complicating their interpretation at that depth, which is consistent with the geology of the region, where there are many shear zones. In particular, the profile whose inter-station path crosses the Potiguar Basin showed low velocities in the crust. We combine these results with the results of gravimetry and magnetometry (Oliveira, 2008) and of receiver functions (Novo Barbosa, 2008). Finally, we present the first results on the behavior of the S-wave velocity structure with depth in the Borborema Province.
Abstract:
In oil prospecting research, seismic data are usually irregularly and sparsely sampled along the spatial coordinates due to obstacles in the placement of geophones. Fourier methods provide a way to regularize seismic data and are efficient if the input data are sampled on a regular grid. However, when these methods are applied to a set of irregularly sampled data, the orthogonality among the Fourier components is broken and the energy of a Fourier component may "leak" to other components, a phenomenon called "spectral leakage". The objective of this research is to study the spectral representation of irregularly sampled data. In particular, the basic structure of the NDFT (nonuniform discrete Fourier transform) representation is presented, its properties are studied, and its potential in the processing of the seismic signal is demonstrated. Along the way we study the FFT (fast Fourier transform) and the NFFT (nonuniform fast Fourier transform), which rapidly calculate the DFT (discrete Fourier transform) and the NDFT. We compare the recovery of the signal using the FFT, DFT, and NFFT. We then address the interpolation of seismic traces using the ALFT (antileakage Fourier transform) to overcome the problem of spectral leakage caused by uneven sampling. Applications to synthetic and real data showed that the ALFT method works well on seismic data from complex geology, suffers little from irregular spatial sampling of the data and from edge effects, and is robust and stable with noisy data. However, it is not as efficient as the FFT, and its reconstruction is not as good in the case of irregular sampling with large holes in the acquisition.
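A minimal sketch of a direct (slow) nonuniform DFT evaluated at irregular sample positions, illustrating the spectral leakage that methods such as the ALFT are designed to remove. The signal and the sample positions are synthetic, and no fast NFFT library is assumed.

```python
import numpy as np

def ndft(x, samples, freqs):
    """Direct nonuniform DFT: sum_n samples[n] * exp(-2j*pi*f*x[n]) for each
    frequency f. O(N*M): the slow reference implementation, not the NFFT."""
    return np.array([np.sum(samples * np.exp(-2j * np.pi * f * x))
                     for f in freqs])

# Synthetic trace: a single 12.5 Hz sinusoid (an exact FFT bin on this grid)
rng = np.random.default_rng(3)
n, dt = 128, 0.01                       # nominal regular grid: 128 samples, 10 ms
t_regular = np.arange(n) * dt
signal = lambda t: np.sin(2 * np.pi * 12.5 * t)

# Regular sampling: clean spectrum via the FFT
freqs = np.fft.rfftfreq(n, d=dt)
spec_fft = np.fft.rfft(signal(t_regular)) / n

# Irregular sampling (missing traces plus position jitter): the NDFT leaks energy
t_irregular = np.sort(rng.choice(t_regular, size=96, replace=False)
                      + rng.uniform(-0.3 * dt, 0.3 * dt, 96))
spec_ndft = ndft(t_irregular, signal(t_irregular), freqs) / t_irregular.size

for name, spec in (("FFT ", spec_fft), ("NDFT", spec_ndft)):
    peak = np.argmax(np.abs(spec))
    sidelobe = np.max(np.abs(np.delete(spec, peak))) / np.abs(spec[peak])
    print(f"{name} peak at {freqs[peak]:.2f} Hz, relative leakage {sidelobe:.3f}")
```

On the regular grid the energy stays in the 12.5 Hz bin, while the same frequencies evaluated on the irregular samples show energy leaking into neighbouring components, which is exactly the effect the antileakage iteration attacks.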