11 results for GALAXIES: SPIRAL

at Universidade Federal do Rio Grande do Norte (UFRN)


Relevance:

10.00%

Abstract:

In this work, biodiesel was produced from castor oil, with glycerin obtained as a by-product. The molar ratio between oil and alcohol, as well as the use of KOH as catalyst to promote the chemical reaction, was based on the literature. The best results were obtained using 1 mol of castor oil (260 g) to 3 moles of methyl alcohol (138 g), with 1.0% KOH as catalyst, at a temperature of 260 °C and stirring at 120 rpm. The oil used was commercially available, and the process involves the transesterification of a vegetable oil with methyl alcohol. The product of this reaction is an ester, biodiesel being the main product and glycerin the by-product, which was treated for use as raw material for the production of allyl alcohol. The great advantage of using glycerin to obtain allyl alcohol is that it eliminates the large amount of waste from biodiesel production and various forms of harm to the environment. The reaction forming allyl alcohol was conducted from formic acid and glycerin in a 1:1 ratio, at a temperature of 260 °C in a heating mantle, the vapor being condensed in a spiral condenser over a period of 2 hours; the product obtained consists mostly of allyl alcohol. The reactions were monitored by UV-Visible spectrophotometry and Fourier-transform infrared (FTIR) spectroscopy; the spectra showed changes indicating the formation of the product, allyl alcohol (prop-2-en-1-ol), in the presence of water. This alcohol was named Alcohol GL. The absorption bands confirm the reaction: ν(C=C) at 1470-1600 cm⁻¹ and ν(O-H) at 3610-3670 cm⁻¹, attributed to the C=C and O-H groups, respectively. Thermal analysis was carried out in an SDT Q600 thermogravimetric analyzer, in which mass and temperature are recorded against time, allowing the approximate heating rate to be checked. The innovative methodology developed in the laboratory (LABTAM, UFRN) was able to treat the glycerin produced by the transesterification of castor oil and use it as raw material for the production of allyl alcohol, with a yield of 80%. This alcohol is of great importance in the manufacture of polymers, pharmaceuticals, organic compounds, herbicides, pesticides and other chemicals.
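
As general chemical background (standard schemes stated here for reference, not equations reproduced from the thesis), the two reactions described in this abstract can be written as:

```latex
% Base-catalyzed transesterification: one triglyceride + 3 methanol
%   -> 3 methyl esters (biodiesel) + glycerol
\begin{equation}
\mathrm{C_3H_5(OOCR)_3} + 3\,\mathrm{CH_3OH}
\;\xrightarrow{\mathrm{KOH}}\;
3\,\mathrm{RCOOCH_3} + \mathrm{C_3H_5(OH)_3}
\end{equation}
% Conversion of glycerol with formic acid (1:1) to allyl alcohol,
% releasing CO2 and water (balanced overall scheme):
\begin{equation}
\mathrm{C_3H_8O_3} + \mathrm{HCOOH}
\;\longrightarrow\;
\mathrm{CH_2{=}CH{-}CH_2OH} + \mathrm{CO_2} + 2\,\mathrm{H_2O}
\end{equation}
```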

Relevance:

10.00%

Abstract:

Birth models of care are discussed in the light of classical and contemporary social science theory, with emphasis on the humanistic model. The double spiral of the sociology of absences and the sociology of emergences is detailed, based, on one hand, on the translation of experiences of knowledge and, on the other, on the translation of experiences of information and communication, by revealing the movement articulated by Brazilian women on blogs that defend and bring to light initiatives aiming to recover natural and humanized birth. A cartography of the thematic ideas in the birth literature is produced, resulting in a synthetic map of obstetric models of care in contemporaneity, pointing out the consequences of the obstetric model that has become hegemonic in contemporary societies and comparing that model with others that work more efficaciously for mothers and babies. A symbolic cartography of the activism for humanizing birth in the Brazilian blogosphere is configured through an analytical map synthesizing the main mottos defended by the movement: normal humanized birth; against obstetric violence; and planned home birth. The superposition of the map of obstetric models of care and the analytical map of the rebirth of birth indicates that three main measures must be reinforced in order to make a paradigmatic turn in contemporary birth models of care possible: pave the way for humanistic assistance in normal birth, by defending and highlighting practices and professionals that act in compliance with evidence-based medicine, respecting the physiology of birth; denaturalize obstetric violence, by showing how routine procedures and interventions can become means of aggression, jeopardizing the autonomy, the protagonism and the respect owed to women; and motivate initiatives of planned home birth, the best place for the occurrence of holistic experiences of birth. It is concluded that Internet tools have allowed a pioneering mobilization in favor of women's reproductive rights in Brazil and that the potential of the crowd's biopower residing in the blogosphere can turn blogs into a hegemonic alternative way to reach more democratic forms of social organization. In that condition of being virtually hegemonic in contesting established power, these blogs can therefore be understood as potentially great counter-hegemonic channels for the rebirth of birth and for the reinvention of social emancipation, as their authors articulate and organize themselves to strive against the waste of experience, trying to create reciprocal intelligibility among different experiences of the world.

Relevance:

10.00%

Abstract:

This work presents the development of new microwave structures, filters and a high-gain antenna, through the cascading of frequency selective surfaces that use Dürer and Minkowski fractal patches as elements, in addition to an element obtained from the combination of two simple ones, the cross dipole and the square spiral. Frequency selective surfaces (FSS) comprise a large area of telecommunications and have been widely used due to their low cost, low weight and ability to integrate with other microwave circuits. They are especially important in several applications, such as airplanes, antenna systems, radomes, rockets, missiles, etc. FSS applications in high frequency ranges have been investigated, as well as cascaded (multilayer) structures and active FSS. In this work, we present simulated and measured transmission characteristics of cascaded (multilayer) structures, aiming to investigate their behavior in terms of bandwidth, one of the major problems presented by frequency selective surfaces. Comparisons are made between simulated results, obtained using commercial software such as Ansoft Designer™ v3, and results measured in the laboratory. Finally, some suggestions are presented for future work on this subject.
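
One common way to study the bandwidth behavior of cascaded FSS, in the spirit of the multilayer analysis described above, is an equivalent-circuit model in which each screen is a shunt LC branch on a transmission line and the layers are cascaded through ABCD matrices. The Python sketch below is illustrative only; the LC values and spacer length are assumed, not taken from this work:

```python
import numpy as np

# Equivalent-circuit sketch: two identical FSS screens, each modeled as a
# shunt series-LC admittance, separated by an air spacer, cascaded via ABCD
# matrices. All element values are assumed for illustration.

Z0 = 377.0                 # free-space wave impedance (ohms)
L, C = 2.0e-9, 0.05e-12    # assumed equivalent L and C of one FSS screen
d = 6.0e-3                 # assumed air spacer between the two layers (m)
c0 = 3.0e8                 # speed of light (m/s)

def fss_abcd(f):
    """Shunt series-LC branch representing one FSS screen."""
    w = 2 * np.pi * f
    Y = 1 / (1j * w * L + 1 / (1j * w * C))
    return np.array([[1, 0], [Y, 1]], dtype=complex)

def spacer_abcd(f):
    """Air-filled transmission-line section of length d."""
    beta = 2 * np.pi * f / c0
    return np.array([[np.cos(beta * d), 1j * Z0 * np.sin(beta * d)],
                     [1j * np.sin(beta * d) / Z0, np.cos(beta * d)]])

freqs = np.linspace(1e9, 20e9, 500)
s21_db = []
for f in freqs:
    A = fss_abcd(f) @ spacer_abcd(f) @ fss_abcd(f)   # cascade: FSS-spacer-FSS
    a, b, cc, dd = A.ravel()
    t = 2 / (a + b / Z0 + cc * Z0 + dd)              # S21 of the cascade
    s21_db.append(20 * np.log10(abs(t)))

# The transmission null near the screens' LC resonance shows the stop band;
# varying d shifts how the two resonances combine, i.e. the bandwidth.
print(f"minimum |S21| = {min(s21_db):.1f} dB")
```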

Relevance:

10.00%

Abstract:

This work analyzes a microstrip antenna structure designed for application in ultra wideband (UWB) systems. It is a prospective analytical study in which changes to the antenna geometry were tested, observing their suitability to the proposed objectives. A UWB antenna must operate over a range of at least 500 MHz and present a fractional bandwidth greater than or equal to 25%. It is also desirable that the antenna meet the band specifications determined by the FCC (Federal Communications Commission), which regulated the system in 2002, designating for UWB a bandwidth of 7.5 GHz, extending from 3.1 GHz to 10.6 GHz, setting the maximum power spectral density of operation at -41.3 dBm/MHz, and defining the fractional bandwidth as 20%. The study starts from a structure whose geometry has the form of a stylized @, which evolves through changes in its shape, simulated in the commercial software CST MICROWAVE STUDIO, version 5.3.1, and then verified using Ansoft HFSS, version 9. These variations are based on observations from publications available in the literature on planar microstrip monopole antennas. As a result, an antenna is proposed, called the Almost Rectangular Spiral Planar Monopole Antenna for applications in UWB systems (AMQEUWB), whose simulated and measured results are satisfactory and consistent with the objectives of the study. Some proposals for future work are mentioned.
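
As a quick sanity check on the bandwidth figures quoted above, the standard fractional-bandwidth definition can be evaluated directly (a minimal Python sketch; the function name is mine):

```python
# Fractional bandwidth relative to the band's center frequency:
# FBW = 2 * (f_high - f_low) / (f_high + f_low)

def fractional_bandwidth(f_low_hz: float, f_high_hz: float) -> float:
    """Standard fractional-bandwidth definition."""
    return 2.0 * (f_high_hz - f_low_hz) / (f_high_hz + f_low_hz)

# FCC UWB band: 3.1 GHz to 10.6 GHz
fbw = fractional_bandwidth(3.1e9, 10.6e9)
print(f"FCC UWB fractional bandwidth: {fbw:.1%}")   # ~109.5%, well above 25%
```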

Relevance:

10.00%

Abstract:

The technical and economic viability of solar heating for swimming pools is unquestionable: it avoids the high costs and environmental impacts of conventional energy supply and allows the pool heating process to be optimized. This work applies thermodynamic principles of the greenhouse effect, heat retention and temperature equalization, to optimize solar heating equipment, reducing the collector area required by as much as 40% (still an estimated value) compared with commercial collectors, with minor architectural and aesthetic impacts on the surroundings. It features an alternative solar heating system for pools whose main characteristics are low cost, simplicity of manufacture and assembly, and faster heating. The system consists of two collectors made of spiral polyethylene hoses of one hundred meters each, working under forced flow with a single pass of the working fluid inside the coils; the pool's own water-treatment pump is used to obtain the desired flow. One of the collectors is exposed to direct solar radiation, while the other is covered by a glass sheet and closed laterally, thus providing the greenhouse effect. The collectors are installed in parallel and simultaneously exposed to the sun in order to obtain comparative data on their effectiveness. Results of thermal tests are presented for the two cases, with and without the transparent cover, and the thermal, economic and material feasibility of these systems for heating swimming pools is demonstrated by comparison.
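
For context, thermal tests of this kind are typically reduced to a useful heat gain and an efficiency figure. The Python sketch below shows that standard calculation with invented numbers; none of the values are measurements from this work:

```python
# Minimal sketch of how a collector's useful heat gain and thermal efficiency
# could be estimated from test data. All numbers below are assumed for
# illustration only.

RHO_WATER = 998.0    # kg/m^3
CP_WATER = 4186.0    # J/(kg*K), specific heat of water

def useful_heat_w(flow_m3_h: float, t_in_c: float, t_out_c: float) -> float:
    """Q_u = m_dot * c_p * (T_out - T_in)."""
    m_dot = flow_m3_h / 3600.0 * RHO_WATER   # kg/s
    return m_dot * CP_WATER * (t_out_c - t_in_c)

def efficiency(q_useful_w: float, irradiance_w_m2: float, area_m2: float) -> float:
    """eta = Q_u / (G * A): useful gain over incident solar power."""
    return q_useful_w / (irradiance_w_m2 * area_m2)

# Hypothetical test point: 0.5 m^3/h through one coil, 1.2 K temperature rise,
# 900 W/m^2 irradiance on an effective aperture of 1.6 m^2.
q = useful_heat_w(0.5, 28.0, 29.2)
print(f"useful gain: {q:.0f} W, efficiency: {efficiency(q, 900.0, 1.6):.1%}")
```

Comparing this efficiency for the covered and uncovered coils, under the same flow and irradiance, is how the benefit of the greenhouse cover would show up in the data.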

Relevance:

10.00%

Abstract:

There is nowadays a growing demand for localized cooling and temperature stabilization in optical and electronic devices, as well as for portable cooling systems that allow greater independence in several activities. Thermoelectric cooling modules are heat pumps that use the Peltier effect, which consists of the production of a temperature gradient when an electric current is applied to a thermoelectric pair formed by two different conductors. This effect belongs to a class of thermoelectric effects typical of junctions between electric conductors. The modules are manufactured with semiconductors; the one used here is bismuth telluride (Bi2Te3), arranged in a periodic sequence. In this context, the idea arose of analyzing a system that obeys the Fibonacci sequence. The Fibonacci sequence is connected with the golden ratio and can be found in the reproductive pattern of bees, in the behavior of light and of atoms, in the growth of plants and in the study of galaxies, among many other instances. A one-dimensional apparatus was set up with the objective of investigating the thermal behavior of a module that obeys a Fibonacci-type growth rule. The results demonstrate that modules with a periodic arrangement are more efficient.
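
A Fibonacci-type growth rule of this kind is usually implemented with the two-letter substitution A → AB, B → A. A minimal Python sketch follows; the mapping of the letters onto the thermoelectric elements (for example, A and B as the two kinds of stacking blocks in the module) is my assumption for illustration, not a detail given in the abstract:

```python
# Generate the Fibonacci word by repeatedly applying the substitution
# A -> AB, B -> A, starting from "A". The word lengths follow the
# Fibonacci numbers 1, 2, 3, 5, 8, ...

def fibonacci_word(generations: int) -> str:
    """Apply the substitution rule A->AB, B->A 'generations' times."""
    word = "A"
    for _ in range(generations):
        word = "".join("AB" if ch == "A" else "A" for ch in word)
    return word

for n in range(5):
    w = fibonacci_word(n)
    print(n, len(w), w)   # e.g. generation 4 -> ABAABABA (length 8)
```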

Relevance:

10.00%

Abstract:

Lithium (Li) is a chemical element with atomic number 3 and is among the lightest known elements in the Universe. In nature, lithium is generally found in the form of two stable isotopes, 6Li and 7Li. The latter is the dominant one and accounts for about 93% of the Li found in the Universe. Because of its fragility, this element is widely used in astrophysics, especially for understanding the physical processes that have occurred since the Big Bang, through the evolution of galaxies and stars. For the primordial nucleosynthesis at the epoch of the Big Bang (BBN), theoretical calculations predict the production of Li along with other light elements such as deuterium and beryllium. For Li, BBN theory predicts a primordial abundance of log ε(Li) = 2.72 dex on a logarithmic scale relative to H. The Li abundance found in metal-poor stars (Population II stars), referred to as the primordial Li abundance, is measured as log ε(Li) = 2.27 dex. In the interstellar medium (ISM), which reflects the current value, the lithium abundance is log ε(Li) = 3.2 dex. This value is of great importance for our comprehension of the chemical evolution of the Galaxy. The process responsible for the increase over the primordial Li value is still not clearly understood. There is a real contribution of Li from low-mass giant stars, and this contribution needs to be well quantified if we want to understand our Galaxy. The main obstacle in this logical sequence is the appearance of some low-mass giant stars of G and K spectral types whose atmospheres are highly enriched in Li. Such elevated values are exactly the opposite of what is expected for the typical abundance of low-mass giants, whose convective envelopes undergo a deepening in mass in which all the Li should be diluted, leading to abundances around log ε(Li) ∼ 1.4 dex according to stellar evolution models. Three suggestions are found in the literature that try to reconcile the theoretical and observed Li abundances in these Li-rich giants, but none of them brings conclusive answers. In the present work, we propose a qualitative study of the evolutionary state of the Li-rich stars in the literature, together with the recent discovery of the first Li-rich star observed by the Kepler satellite. The main objective of this work is to promote a solid discussion of the evolutionary state based on the characteristics obtained from the seismic analysis of the object observed by Kepler. We used evolutionary tracks and simulations performed with the population synthesis code TRILEGAL, intending to evaluate as precisely as possible the evolutionary state and internal structure of these groups of stars. The results indicate a characteristic time that is very short compared with the evolutionary scale associated with the enrichment of these stars.
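
For reference, log ε(X) above is the standard astronomical abundance notation (a convention stated here for the reader, not a formula quoted from the thesis):

```latex
% Logarithmic abundance by number, normalized so that log eps(H) = 12:
\begin{equation}
\log \epsilon(\mathrm{Li}) \;=\;
\log_{10}\!\left(\frac{N_{\mathrm{Li}}}{N_{\mathrm{H}}}\right) + 12
\end{equation}
% so the quoted values 2.72, 2.27 and 3.2 dex correspond to number ratios
% N_Li/N_H of 10^{2.72-12}, 10^{2.27-12} and 10^{3.2-12}, respectively.
```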

Relevance:

10.00%

Abstract:

Recent astronomical observations indicate that the Universe has null spatial curvature, is accelerating, and has a matter-energy content composed of circa 30% matter (baryons + dark matter) and 70% dark energy, a relativistic component with negative pressure. However, in order to build more realistic models it is necessary to consider the evolution of small density perturbations to explain the richness of observed structures on the scale of galaxies and clusters of galaxies. The structure formation process was first described by Press and Schechter (PS) in 1974, by means of the galaxy cluster mass function. The PS formalism assumes a Gaussian distribution for the primordial density perturbation field. Besides a serious normalization problem, such an approach does not explain the recent cluster X-ray data, and it is also in disagreement with the most up-to-date computational simulations. In this thesis, we discuss several applications of the nonextensive (non-Gaussian) q-statistics, proposed in 1988 by C. Tsallis, with special emphasis on the cosmological process of large-scale structure formation. Initially, we investigate the statistics of the primordial fluctuation field of the density contrast, since the most recent data from the Wilkinson Microwave Anisotropy Probe (WMAP) indicate a deviation from Gaussianity. We assume that such deviations may be described by the nonextensive statistics, because it reduces to the Gaussian distribution in the limit of the free parameter q = 1, thereby allowing a direct comparison with the standard theory. We study its application to a galaxy cluster catalog based on the ROSAT All-Sky Survey (hereafter HIFLUGCS). We conclude that the standard Gaussian model applied to HIFLUGCS does not agree with the most recent data independently obtained by WMAP. Using the nonextensive statistics, we obtain values much more aligned with the WMAP results. We also demonstrate that the Burr distribution corrects the normalization problem. The cluster mass function formalism was also investigated in the presence of dark energy; in this case, constraints on several cosmic parameters were also obtained. The nonextensive statistics was further applied to two distinct problems: (i) the plasma probe and (ii) the description of Bremsstrahlung radiation (the primary radiation from X-ray clusters), a problem of considerable interest in astrophysics. In another line of development, using supernova data and the gas mass fraction from galaxy clusters, we discuss a redshift variation of the equation-of-state parameter, considering two distinct expansions. An interesting aspect of this work is that the results do not require a prior on the mass parameter, as usually occurs in analyses involving only supernova data. Finally, we obtain a new estimate of the Hubble parameter through a joint analysis involving the Sunyaev-Zel'dovich effect (SZE), X-ray data from galaxy clusters and baryon acoustic oscillations. We show that the degeneracy of the observational data with respect to the mass parameter is broken when the signature of the baryon acoustic oscillations, as given by the Sloan Digital Sky Survey (SDSS) catalog, is considered. Our analysis, based on SZE/X-ray data for a sample of 25 galaxy clusters with triaxial morphology, yields a Hubble parameter in good agreement with independent studies provided by the Hubble Space Telescope project and the recent WMAP estimates.
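
As background on the q-statistics mentioned above (standard Tsallis definitions, not expressions taken from the thesis): the q-exponential generalizes the ordinary exponential, and the corresponding q-Gaussian reduces to the Gaussian when q → 1, which is why a direct comparison with the standard theory is possible:

```latex
% q-exponential and its q -> 1 limit:
\begin{equation}
e_q(x) \;=\; \left[\,1 + (1-q)\,x\,\right]^{\frac{1}{1-q}},
\qquad \lim_{q \to 1} e_q(x) = e^{x}
\end{equation}
% q-Gaussian for the density contrast delta, recovering the Gaussian at q = 1:
\begin{equation}
p_q(\delta) \;\propto\;
\left[\,1 - (1-q)\,\frac{\delta^2}{2\sigma^2}\right]^{\frac{1}{1-q}}
\;\xrightarrow[\;q \to 1\;]{}\; e^{-\delta^2/2\sigma^2}
\end{equation}
```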

Relevance:

10.00%

Abstract:

Understanding the way in which large-scale structures, like galaxies, form remains one of the most challenging problems in cosmology today. The standard theory for the origin of these structures is that they grew by gravitational instability from small, perhaps quantum-generated, fluctuations in the density of dark matter, baryons and photons over a uniform primordial Universe. After recombination, the baryons began to fall into the pre-existing gravitational potential wells of the dark matter. In this dissertation, a study is first made of the primordial recombination era, the epoch of the formation of neutral hydrogen atoms. We then analyze the evolution of the density contrast (of baryonic and dark matter) in dark matter clouds with masses between 10⁴ M⊙ and 10¹⁰ M⊙. In particular, we take into account the several physical mechanisms that act on the baryonic component during and after the recombination era. The analysis of the formation of these primordial objects was made in the context of three dark energy background models: Quintessence, ΛCDM (Cosmological Constant plus Cold Dark Matter) and Phantom. We show that dark matter is the fundamental agent for the formation of the structures observed today; dark energy is also of great importance at the epoch of their formation.
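
In the linear regime, the evolution of the density contrast described above is governed by the standard growth equation (quoted here as textbook background; the dark energy model enters through the Hubble rate H(t)):

```latex
% Linear growth of the matter density contrast delta over a homogeneous
% background of mean density rho_m-bar:
\begin{equation}
\ddot{\delta} + 2H(t)\,\dot{\delta} - 4\pi G\,\bar{\rho}_m\,\delta = 0,
\qquad
\delta \equiv \frac{\rho_m - \bar{\rho}_m}{\bar{\rho}_m}
\end{equation}
```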

Relevance:

10.00%

Abstract:

The next generation of computers is expected to rely on architectures with multiple processors and/or multicore processors. In this context there are challenges related to interconnection features, operating frequency, on-chip area, power dissipation, performance and programmability. The interconnection and communication mechanism considered ideal for this type of architecture is the network-on-chip, due to its scalability, reusability and intrinsic parallelism. Communication on networks-on-chip is accomplished by transmitting packets that carry the data and instructions representing requests and responses between the processing elements interconnected by the network. The transmission of packets is carried out as in a pipeline between the routers of the network, from the source to the destination of the communication, even allowing simultaneous communications between different source/destination pairs. Based on this fact, it is proposed to transform the entire communication infrastructure of the network-on-chip, using its routing, arbitration and storage mechanisms, into a high-performance parallel processing system. In this proposal, the packets are formed by the instructions and data that represent the applications, which are executed in the routers as they are transmitted, exploiting the pipeline and the parallel transmissions. In contrast, traditional processors are not used, only simple cores that control access to memory. An implementation of this idea is called IPNoSys (Integrated Processing NoC System), which has its own programming model and a routing algorithm that guarantees the execution of all instructions in the packets, preventing situations of deadlock, livelock and starvation. This architecture provides mechanisms for input and output, interrupts, and operating system support. As proof of concept, a programming environment and a simulator for this architecture were developed in SystemC, allowing the configuration of various parameters and the collection of several results for its evaluation.
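
A conceptual sketch of the central idea (a packet that carries its own instruction stream and has one instruction executed at each router it traverses) is given below. This is not IPNoSys code; the packet layout and the two-operation instruction set are invented for illustration:

```python
# Conceptual sketch: each router on the path executes the packet's next
# pending instruction before forwarding the packet, so computation happens
# "in transit" instead of in a traditional processor.

from dataclasses import dataclass, field

@dataclass
class Packet:
    instructions: list                      # e.g. [("ADD", 0, 1)] - hypothetical
    operands: list
    results: list = field(default_factory=list)

def router_step(packet: Packet) -> None:
    """One router executes the packet's next instruction as it forwards it."""
    if not packet.instructions:
        return
    op, i, j = packet.instructions.pop(0)
    a, b = packet.operands[i], packet.operands[j]
    value = a + b if op == "ADD" else a * b
    packet.operands.append(value)           # result becomes a new operand
    packet.results.append((op, value))

# A 3-hop path executes three instructions in pipeline fashion.
pkt = Packet([("ADD", 0, 1), ("MUL", 2, 1), ("ADD", 3, 0)], [2, 5])
for hop in range(3):
    router_step(pkt)
print(pkt.results)   # [('ADD', 7), ('MUL', 35), ('ADD', 37)]
```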

Relevance:

10.00%

Abstract:

The increase in the capacity to integrate transistors has made it possible to develop complete systems, with several components, on a single chip; these are called SoCs (Systems-on-Chip). However, the interconnection subsystem can limit the scalability of SoCs, as with buses, or lead to ad hoc solutions, such as bus hierarchies. Thus, the interconnection subsystem considered ideal for SoCs is the Network-on-Chip (NoC). NoCs allow simultaneous point-to-point channels between components and can be reused in other projects. However, NoCs can increase design complexity, on-chip area and power dissipation. It is therefore necessary either to modify the way they are used or to change the development paradigm. Thus, a NoC-based system is proposed in which applications are described as packets and executed in each router between source and destination, without traditional processors. To execute applications regardless of the number of instructions and of the NoC dimensions, the spiral complement algorithm was developed, which finds a new destination until all instructions have been executed. The objective, therefore, is to study the feasibility of developing this system, named the IPNoSys system. In this study, a cycle-accurate tool was developed in SystemC to simulate the system executing applications, which were implemented in a packet description language also developed for this study. Through the simulation tool, several results were obtained that could be used to evaluate the system's performance. The methodology used to describe an application consists of transforming the high-level application into a data-flow graph that becomes one or more packets. This methodology was used in three applications: a counter, a 2-D DCT and a floating-point addition. The counter was used to evaluate a deadlock solution and to execute a parallel application. The DCT was used for comparison with the STORM platform. Finally, the floating-point addition aimed to evaluate the efficiency of a software routine that performs an instruction not implemented in hardware. The simulation results confirm the feasibility of developing the IPNoSys system. They show that it is possible to execute applications described as packets, sequentially or in parallel, without interruptions caused by deadlock, and also that the execution time of IPNoSys is more efficient than that of the STORM platform.
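
The abstract only summarizes the spiral complement algorithm (the packet keeps being assigned a further destination until its instructions run out). One plausible illustration of such a traversal, assumed here rather than taken from the thesis, is a spiral-ordered walk over the 2-D mesh of routers:

```python
# Illustrative spiral walk over a rows x cols mesh of routers. The packet
# visits one router per hop and executes one instruction there, continuing
# along the spiral until its instruction list is exhausted.

def spiral_order(rows: int, cols: int):
    """Yield (x, y) router coordinates in a clockwise inward spiral."""
    top, bottom, left, right = 0, rows - 1, 0, cols - 1
    while top <= bottom and left <= right:
        for x in range(left, right + 1):               # left -> right, top row
            yield (x, top)
        for y in range(top + 1, bottom + 1):           # top -> bottom, right col
            yield (right, y)
        if top < bottom:
            for x in range(right - 1, left - 1, -1):   # right -> left, bottom
                yield (x, bottom)
        if left < right:
            for y in range(bottom - 1, top, -1):       # bottom -> top, left col
                yield (left, y)
        top, bottom, left, right = top + 1, bottom - 1, left + 1, right - 1

# A packet with more instructions than one shortest path can absorb keeps
# moving to the next router in this order until they are all executed.
remaining_instructions = 7
for hop, router in enumerate(spiral_order(3, 3)):
    if remaining_instructions == 0:
        break
    remaining_instructions -= 1   # assume one instruction executed per router
    print(hop, router)
```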