Abstract:
This study analyzes the effects of the basic-education financing policy implemented through the accounting funds Fundef and Fundeb, and their proposal for valorizing the teaching profession, considering the career and remuneration of public-school teachers in the state of Rio Grande do Norte between 1996 and 2010. To understand the constraints of evaluating public policies, we also drew on the contribution of Marx (1996), for whom "the concrete is concrete", taking the dialectic of the concrete as support in the attempt to grasp the phenomenon under study. We further used the bibliographic references on education financing and teacher valorization found in the literature on the two dimensions of the object (Fundef and Fundeb; career and remuneration). In the documentary research, besides the relevant national and local legislation and guidelines, we used funding data available from Finbra, the National Treasury, SIOPE/RN and INEP/MEC, payroll summaries and payroll records from the State Secretariat of Education and Culture (SEEC), and 289 pay slips of 21 teachers. Semi-structured interviews on career were conducted with 9 teachers, and a questionnaire on remuneration was applied to 12 teachers. Regarding the educational indicators, during the Fundef period there was a reduction of 37% in enrollments in state schools and in teaching positions in Fundamental Education. From the start of Fundeb (2007-2010), these indicators leveled off. Over the whole period, 1996 to 2010, enrollments grew by 119.03% and teaching positions by 77.44%.
Regarding the financing data, we found that, of the minimum share (60%) of the funds required to be applied to teacher remuneration, more than the minimum was applied throughout the period of both funds, namely between 83.29% and 98.89%. The effects of the funds on the careers of the 9 teachers were not satisfactory with respect to promotion and progression. As for promotion, only one of the 9 teachers advanced in level (by qualification), while at the same time regressing in progression. As for progression, 8 of the 9 teachers had their progression delayed by 2 to 5 classes, causing a loss ranging from 10% to 45% of their remuneration; the difference from one class to the next corresponds to 5% of the salary. We find that financial benefits contribute a higher share of remuneration than the base salary does, a share that decreased under Fundeb. Concerning remuneration, a teacher with 24 years of service and training does not even earn 2 minimum wages. A teacher with 30 years of service and a master's degree received, in 2010, a salary below 3 minimum wages (a ratio of 2.82) and a remuneration slightly above 3 minimum wages (a ratio of 3.66). Teaching compares unfavorably with other professions that also require higher education, which harms the perception of teaching as a profession. Concerning the effects on remuneration, we conclude that there was an improvement, but still an insufficient one, especially when compared with the annual minimum wage. We conclude that Fundef and Fundeb were not able to promote the valorization of teaching in terms of career and remuneration.
Some negative results of the funds policy are also observed, related to the inability of such a policy to promote the intended valorization of teaching, one of the causes being financing under budget restrictions.
Abstract:
Continued Teacher Education Policies: settings, guidelines and practices is a study of the continued teacher education policies implemented by the current government. Its purpose is to analyze aspects of the in-service continued education of teachers working in the initial grades of fundamental education, developed in Natal through the School Learning Management Program (GESTAR) from 2002 to 2005. The empirical field of the study was the in-service education experience developed by GESTAR in a school of the municipal network of Natal. Among the procedures that materialized the research, we highlight: a bibliographic review of the literature analyzing the new demands placed on education by the transformations in the world setting, and of studies on the theme of continued teacher education; documentary research on continued teacher education policies and on the GESTAR program in the municipal network of Natal; and interviews with four teachers and the pedagogic coordinator of the school studied. The study found a positive evaluation of the program by the participating teachers in Natal. They pointed out that this strategy of in-service continued education made it possible to study Portuguese Language and Mathematics contents together with a new way of working with them in the classroom; to understand certain contents they previously had difficulty with; and to understand that classroom activities should be meaningful for the student, favoring comprehension of the subject studied. Among the limits observed, we point out: teachers not reading the material; teachers' difficulty in reconciling daily activities with individual distance study; the absence of systematic follow-up of the teachers' pedagogic practice; the use of the exam as the only evaluation instrument; and the teachers' difficulty in carrying on GESTAR's pedagogic proposal after the end of the program.
Abstract:
The present work analyzes the reorganization process of rural education in Jardim de Piranhas-RN in the context of education policies, in particular in the period 1999-2006, taking as reference the political, cultural and socio-economic transformations at the national, regional and local levels, above all from the 1990s onward. Studies carried out on diverse sources made it possible to understand the context in which education policies developed, in particular those directed at the rural milieu, as well as their mediation with the reorganization of education at the local level. Besides these research procedures, we carried out semi-structured interviews with managers and teachers, and analyzed documents ranging from those produced at the national level to local ones. From the theoretical-methodological viewpoint, we focus on the national discussion that has been developing under a new political-ideological configuration, named Field Education, understood as a policy directed at the specificities of education in this sector and consolidated in the Operational Guidelines for Basic Education in Field Schools (CNE/MEC/2002). As a particularity of this object in Jardim de Piranhas-RN, we emphasize the events that marked the reorganization of rural education in that municipality, especially the creation of the Rural Education Center Teacher Maria Edite Batista. The studies show that, until the creation of the Center, the schools functioned in rather precarious infrastructure and physical conditions, without electricity or water supply, and lacking school meals and a management structure; there was no specific pedagogical project or accompaniment for the sector. Moreover, teachers predominantly worked with multi-grade classes while also serving as managers, caretakers and cooks, and in some cases as school secretaries.
However, even with the creation of the Rural Education Center, the municipal education system was not able to overcome problems such as dropout and school failure, to reduce the teachers' work overload, or to give greater consistency to the pedagogical project of the field schools in that municipality.
Abstract:
Photo-oxidation processes for toxic organic compounds have been widely studied. This work addresses the application of the photo-Fenton process to the degradation of hydrocarbons in water. Gasoline from the refinery, without additives or alcohol, was used as the model pollutant. The effects of the concentrations of the following substances were evaluated: hydrogen peroxide (100-200 mM), iron ions (0.5-1 mM) and sodium chloride (200-2000 ppm). The experiments were carried out in a reactor with a UV lamp and in a falling-film solar reactor. The photo-oxidation process was monitored by measurements of absorption spectra, total organic carbon (TOC) and chemical oxygen demand (COD). The experimental results demonstrated that the photo-Fenton process is feasible for the treatment of wastewaters containing aliphatic hydrocarbons, even in the presence of salts. These conditions are similar to those of the produced water from petroleum fields, generated in the extraction and production of petroleum. A neural network model of the process correlated well with the observed data for the photo-oxidation of hydrocarbons.
Abstract:
Cashew nut processing plants often generate wastewaters with a high content of toxic organic compounds. The presence of these compounds is due mainly to the so-called cashew nut shell liquid (CNSL, as it is commercially known in Brazil): a viscous, dark-brown oil with a high toxicity index owing to its chemical composition of phenolic compounds, such as anacardic acid, cardol, 2-methylcardol and the monophenol cardanol. These compounds resist conventional treatments, and the corresponding wastewaters present a high content of TOC (total organic carbon). Given this high degree of toxicity, it is very important to study and develop treatments for these wastewaters before discharge to the environment. This research aims to decompose these compounds using advanced oxidation processes (AOP) based on the photo-Fenton system. The advantage of this system is the fast, non-selective oxidation promoted by hydroxyl radicals (●OH), which under suitable conditions can totally convert the organic pollutants to CO2 and H2O. To evaluate the decomposition of the organic load, samples of real wastewater were taken from a cashew nut processing plant located in the interior of the state of Rio Grande do Norte. The experiments were carried out in an annular photochemical reactor equipped with a UV (ultraviolet) lamp. Based on preliminary experiments, a Doehlert experimental design with a total of 13 runs was defined to optimize the concentrations of H2O2 and Fe(II). The experimental conditions were set to pH 3 and a temperature of 30°C. Lamp powers of 80 W, 125 W and 250 W were applied. To evaluate the decomposition rate, TOC measurements were taken over 4 hours of each experiment. According to the results, the organic removal in terms of TOC ranged from a minimum of 80% to a maximum of 95%.
Furthermore, a minimum time of 49 minutes was obtained for the removal of 30% of the initial TOC. Based on the experimental results, the photo-Fenton system performs very satisfactorily as a complementary treatment for the wastewater studied.
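The removal figures quoted above follow from the standard definition of percentage TOC removal relative to the initial load; a minimal sketch (the numeric values below are hypothetical illustrations, not the thesis's data):

```python
def toc_removal_percent(toc_initial, toc_current):
    """Percent of the initial total organic carbon (TOC) removed so far."""
    return 100.0 * (toc_initial - toc_current) / toc_initial

# Hypothetical readings: an 80% removal means the final TOC is 20% of the start.
print(toc_removal_percent(100.0, 20.0))  # → 80.0
print(toc_removal_percent(50.0, 35.0))   # → 30.0
```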
Abstract:
The oil and petrochemical industry is responsible for generating a large amount of waste and wastewater. Among its effluents one finds benzene, toluene, ethylbenzene and the xylene isomers, known collectively as BTEX. These compounds are highly volatile, toxic to the environment and potentially carcinogenic to humans. Advanced oxidation processes (AOP) are unconventional treatments which may be applied to the removal of these compounds. Fenton is a type of AOP which uses the Fenton reagent, hydrogen peroxide and a ferrous salt, to promote organic degradation, while the photo-Fenton type uses the Fenton reagent plus UV (ultraviolet) radiation. According to the literature, these two types of AOP may be applied to the complex BTEX system. This project evaluates the use of the Fenton and photo-Fenton technologies on aqueous solutions containing 100 ppm of each BTEX compound, simulating conditions close to those of petrochemical effluents. Different reactors were used for each type of AOP. For the analytical determination of removal, the SPME technique (solid-phase microextraction) was used to extract the analytes in the gas phase, followed by gas chromatography/mass spectrometry. The mechanical arrangement of the photo-Fenton system showed large losses of these compounds by volatilization. The Fenton system proved capable of degrading the benzene and toluene compounds, with mass removal percentages near 99%.
Abstract:
Effluents from pesticide industries are very difficult to decontaminate and are characterized by high organic load and toxicity. The research group of the Center for Chemical Systems Engineering (CESQ) at the Department of Chemical Engineering of the Polytechnic School of the University of São Paulo and the Department of Chemical Engineering of the Federal University of Rio Grande do Norte have been applying Advanced Oxidation Processes (AOPs) to the degradation of various types of pollutants. These processes are based on the generation of hydroxyl radicals, highly reactive species. This dissertation explores the photo-Fenton process, which has proven quite effective in removing organic load, applying it to the degradation of the fungicide thiophanate-methyl (C12H14N4O4S2) in aqueous systems using an annular reactor (with a Philips HPLN 125 W lamp) and a solar reactor. Samples collected during the experiments were analyzed for dissolved organic carbon (TOC) using a Shimadzu TOC analyzer (Shimadzu 5050A and VCP). A Doehlert experimental design was used to evaluate the influence of the ultraviolet radiation and of the concentrations of thiophanate-methyl, hydrogen peroxide (H2O2) and iron ions (Fe2+). Among these parameters, the best experimental conditions were [Fe2+] = 0.6 mmol/L and [H2O2] = 0.038 mol/L, in experiments EXP 5 and SOL 5, yielding TOC removals of 60% in the annular reactor and 75% in the solar reactor.
Abstract:
This work is a detailed study of self-similar models for the expansion of extragalactic radio sources. A review is made of the definitions of AGN, the unified model is discussed and the main characteristics of double radio sources are examined. Three classification schemes are outlined and the self-similar models found in the literature are studied in detail. A self-similar model is proposed that generalizes the models found in the literature. In this model, the area of the head of the jet varies with the size of the jet following a power law with exponent γ. The atmosphere has a variable density that may or may not be spherically symmetric, and the time variation of the kinematic luminosity of the jet is taken into account according to a power law with exponent h. It can be shown that the Type I, II and III models are particular cases of the general model, and the evolution of the sources' radio luminosity is also discussed. The evolutionary curves of the general model are compared with the particular cases and with observational data in a P-D diagram. The results show that the model allows a better agreement with the observations, depending on the appropriate choice of the model parameters.
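The two power laws of the proposed model can be summarized as follows (a sketch in our own notation, since the abstract does not give the thesis's symbols: A_h for the jet-head area, ℓ for the jet size, L_j for the kinematic luminosity of the jet):

```latex
A_h \propto \ell^{\gamma}, \qquad L_j(t) \propto t^{h}
```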
Abstract:
In the present work we use a Tsallis maximum-entropy distribution law to fit observations of projected rotational velocities of stars in the Pleiades open cluster. This distribution function, which generalizes the Maxwell-Boltzmann one, is derived from the non-extensive generalization of the Boltzmann-Gibbs entropy. We also present a comparison between results from the generalized distribution and the Maxwellian law, and show that the generalized distribution fits the observational data more closely. In addition, we compare the q values of the generalized distribution determined for the V sin i distribution of main-sequence stars (Pleiades) with those found for the observed distribution of evolved stars (subgiants). We observe a correlation between the q values and the stellar evolutionary stage for a certain range of stellar mass.
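The generalized distribution referred to above is commonly written in the literature as a q-Gaussian; a sketch of the usual form, in our notation (σ is a width parameter; the q → 1 limit recovers the Maxwellian factor exp(−v²/σ²)):

```latex
f_q(v) \;\propto\; \left[\, 1 - (1-q)\,\frac{v^{2}}{\sigma^{2}} \,\right]^{\frac{1}{1-q}}
```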
Abstract:
This work presents a new method, based on the wavelet technique, for determining the orbital period (Porb) of eclipsing binary systems. The method is applied to 18 eclipsing binaries detected by the CoRoT (Convection, Rotation and planetary Transits) satellite. The periods obtained by wavelet were compared with those obtained by conventional methods: box fitting (EEBLS) for detached and semi-detached eclipsing binaries, and a polynomial method (ANOVA) for contact binaries. Comparing the phase diagrams obtained by the different techniques, the wavelet method determines Porb better than EEBLS. For contact binaries the wavelet method gives better results than ANOVA most of the time, but when the number of data points per orbital cycle is small, ANOVA is more accurate. The wavelet technique thus appears to be a valuable tool for analyzing data of the quality and precision delivered by CoRoT and the incoming photometric missions.
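The abstract does not detail the wavelet algorithm itself; as an illustration of how an orbital period can be recovered from an eclipsing-binary light curve, here is a minimal phase-dispersion sketch in Python (a simpler stand-in for the wavelet and EEBLS methods discussed above; the function names and the synthetic light curve are ours):

```python
import numpy as np

def phase_fold(t, period):
    """Fold time stamps into orbital phase in [0, 1)."""
    return (t % period) / period

def phase_dispersion(t, flux, period, nbins=10):
    """Sum of within-bin variances of the phase-folded light curve;
    lower values mean the candidate period lines the eclipses up better."""
    phase = phase_fold(t, period)
    bins = np.floor(phase * nbins).astype(int)
    return sum(flux[bins == b].var() for b in range(nbins) if np.any(bins == b))

# Synthetic eclipsing-binary light curve with a true period of 2.5 (arbitrary units)
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 50.0, 500))
flux = np.where(phase_fold(t, 2.5) < 0.1, 0.8, 1.0)  # eclipse dips

# Scan candidate periods and keep the one with the least phase dispersion
candidates = np.linspace(2.0, 3.0, 101)
best = candidates[np.argmin([phase_dispersion(t, flux, p) for p in candidates])]
# `best` recovers a value close to the true period of 2.5
```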
Abstract:
In this thesis we analyze the formation of Maxwellian tails in the distributions of rotational velocity in the context of out-of-equilibrium Boltzmann-Gibbs statistical mechanics. We start from a unified model for the angular-momentum loss rate, which makes possible the construction of a general theory of rotational decay in which, finally, by combining the standard Maxwellian with the rotational-decay relation, we define the (_, _) Maxwellian distributions. The results reveal that out-of-equilibrium Boltzmann-Gibbs statistics supplies results as good as those of the Tsallis and Kaniadakis generalized statistics, besides allowing fits controlled by physical properties extracted from the theory of stellar rotation itself. In addition, our results indicate that these generalized statistics converge to Boltzmann-Gibbs statistics when a rotational velocity defined as a distribution is inserted into their respective distribution functions.
Abstract:
Double radio sources have been studied since the discovery of extragalactic radio sources in the 1930s. Since then, several numerical studies and analytical models have been proposed seeking a better understanding of the physical phenomena that determine the origin and evolution of such objects. In this thesis we study the evolution of double radio sources on two fronts. On the first, we develop an analytical self-similar model that generalizes most models found in the literature and solves some existing problems related to the evolution of the jet head. We deal with this problem by using samples of hot-spot sizes to find a power-law relation between the jet head dimension and the source length. Using our model, we draw the evolutionary curves of double sources in a P-D diagram, for both compact sources (GPS and CSS) and the extended sources of the 3CR catalogue. We have also developed a computational tool that generates synthetic radio maps of double sources; the objective is to determine the principal physical parameters of these objects by comparing synthetic and observed radio maps. On the second front, we use numerical simulations to study the interaction of extragalactic jets with the environment. We simulate situations in which the jet propagates in a medium containing gas clouds of high density contrast, capable of blocking the jet's forward motion and forming the distorted structures observed in the morphology of real sources. We also analyze the situation in which the jet changes its propagation direction due to a change of the source's main axis, creating the X-shaped sources. The comparison between our simulations and real double radio sources enables us to determine the values of the main physical parameters responsible for the distortions observed in these objects.
Abstract:
Mirror therapy (MT) is used as a rehabilitation tool in various conditions, including stroke. Although some studies have shown its effectiveness, little is known about the neural mechanisms that underlie the rehabilitation process. This study therefore aimed to assess cortical neuromodulation after a single MT intervention in ischemic stroke survivors, by means of functional Magnetic Resonance Imaging (fMRI) and Transcranial Magnetic Stimulation (TMS). Fifteen patients participated in a single thirty-minute MT session. fMRI data were analyzed bilaterally in the following regions of interest (ROI): Supplementary Motor Area (SMA), premotor cortex (PMC), primary motor cortex (M1), primary sensory cortex (S1) and cerebellum. In each ROI, changes in the percentage of occupation and in beta values were computed. Group fMRI data showed a significant decrease in the percentage of occupation in the PMC and cerebellum contralateral to the affected hand (p < 0.05). A significant increase in beta values was observed in the following contralateral motor areas: SMA, cerebellum, PMC and M1 (p < 0.005). Moreover, a significant decrease was observed in the ipsilateral PMC and M1 (p < 0.001), and a significant bilateral decrease was observed in S1 (p < 0.0005). TMS consisted of the analysis of the Motor Evoked Potential (MEP) at the M1 hotspot. A significant increase in MEP amplitude was observed after therapy in the group (p < 0.0001) and individually in 4 patients (p < 0.05). Altogether, our results imply that a single MT intervention is already capable of promoting changes in neurobiological markers toward the patterns observed in healthy subjects. Furthermore, the changes in the contralateral-hemisphere motor areas are opposite to those on the ipsilateral side, suggesting increased system homeostasis.
Abstract:
This work presents an algorithmic study of the Multicast Packing Problem under a multiobjective approach. The first step was an extensive review of the problem, which served as the reference point for defining the multiobjective mathematical model. Next, the instances used in the experimentation process were defined; they were created based on the main characteristics reported in the literature. With the mathematical model and the instances defined, several algorithms were implemented, based on classical approaches to multiobjective optimization: NSGA-II (3 versions), SPEA2 (3 versions), and two versions of GRASP adapted to handle multiple objectives. The algorithms were composed of three recombination operators (C1, C2 and C3), two solution-construction operators, a mutation operator and a local search procedure. Finally, a long experimentation process was carried out in three stages: the first adjusted the parameters; the second identified the best version of each algorithm; afterwards, the best versions of each algorithm were compared in order to identify the best algorithm overall. The algorithms were evaluated using the quality indicators Hypervolume and Multiplicative Epsilon.
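All the algorithms mentioned above (NSGA-II, SPEA2 and the multiobjective GRASP variants) rest on the notion of Pareto dominance. A minimal sketch of the dominance test and of nondominated-front extraction, assuming minimization of all objectives (the function names and sample points are ours, not the thesis's):

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all objectives minimized):
    a is no worse than b everywhere and strictly better somewhere."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated(points):
    """Return the nondominated (Pareto) front of a list of objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Hypothetical bi-objective values: (3, 3) and (5, 5) are dominated by (2, 2)
pts = [(1, 5), (2, 2), (4, 1), (3, 3), (5, 5)]
print(nondominated(pts))  # → [(1, 5), (2, 2), (4, 1)]
```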
Abstract:
Nonogram is a logic puzzle whose associated decision problem is NP-complete. It has applications in pattern recognition and data compression, among others. The puzzle consists in determining an assignment of colors to pixels distributed in an N × M matrix that satisfies row and column constraints. A Nonogram is encoded by a vector whose elements specify the number of pixels in each row and column of a figure without specifying their coordinates. This work presents exact and heuristic approaches to solving Nonograms. Depth-first search was one of the chosen exact approaches, being a typical example of a brute-force algorithm that is easy to implement. Another exact approach was based on the Las Vegas algorithm, with which we investigate whether the randomness introduced by the Las Vegas-based algorithm gives an advantage over depth-first search. The Nonogram is also transformed into a Constraint Satisfaction Problem. Three heuristic approaches are proposed: a Tabu Search and two memetic algorithms. A new way of calculating the objective function is also proposed. The approaches are applied to 234 instances, ranging in size from 5 × 5 to 100 × 100 and including both logical and random Nonograms.
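The thesis's own solvers are not reproduced in the abstract; a minimal depth-first Nonogram solver in the spirit described above might look like the following sketch (black-and-white only; the puzzle encoding as run-length clues per row and column, and all helper names, are ours):

```python
def row_patterns(clue, width):
    """Enumerate all 0/1 rows of length `width` whose runs of 1s equal `clue`."""
    patterns = []
    def build(blocks, row):
        if not blocks:
            patterns.append(tuple(row + [0] * (width - len(row))))
            return
        for gap in range(width - len(row) + 1):
            prefix = row + [0] * gap + [1] * blocks[0]
            if len(prefix) > width:
                break
            # later blocks need at least one separating 0
            build(blocks[1:], prefix + ([0] if len(blocks) > 1 else []))
    build(list(clue), [])
    return patterns

def col_runs(grid, j):
    """Lengths of the runs of 1s seen so far in column j."""
    runs, run = [], 0
    for row in grid:
        if row[j]:
            run += 1
        elif run:
            runs.append(run)
            run = 0
    if run:
        runs.append(run)
    return runs

def col_prefix_ok(grid, col_clues):
    """Can every column of the partial grid still match its clue?"""
    for j, clue in enumerate(col_clues):
        runs, run = [], 0
        for row in grid:
            if row[j]:
                run += 1
            elif run:
                runs.append(run)
                run = 0
        if list(clue[:len(runs)]) != runs:  # completed runs must be a clue prefix
            return False
        if run and (len(runs) >= len(clue) or run > clue[len(runs)]):
            return False                    # an open run must fit the next block
    return True

def solve(row_clues, col_clues):
    """Depth-first search over row patterns, pruning by column feasibility."""
    width = len(col_clues)
    options = [row_patterns(c, width) for c in row_clues]
    grid = []
    def dfs(i):
        if i == len(row_clues):
            return all(col_runs(grid, j) == list(col_clues[j]) for j in range(width))
        for opt in options[i]:
            grid.append(opt)
            if col_prefix_ok(grid, col_clues) and dfs(i + 1):
                return True
            grid.pop()
        return False
    return grid if dfs(0) else None

# A 3x3 "plus sign" puzzle: rows and columns both clued (1), (3), (1)
print(solve([(1,), (3,), (1,)], [(1,), (3,), (1,)]))
# → [(0, 1, 0), (1, 1, 1), (0, 1, 0)]
```

The column-prefix pruning is what keeps the brute-force search tractable on small instances; without it the DFS would enumerate every combination of row patterns.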