957 results for GALAXIES: SPIRAL


Relevance:

10.00%

Publisher:

Abstract:

Objective: To investigate whether a home environment test battery can be used to measure effects of Parkinson's disease (PD) treatment intervention and disease progression.

Background: Seventy-seven patients diagnosed with advanced PD were recruited in an open longitudinal 36-month study at 10 clinics in Sweden and Norway; 40 of them were treated with levodopa-carbidopa intestinal gel (LCIG) and 37 were candidates for switching from oral PD treatment to LCIG. They utilized a mobile device test battery, consisting of self-assessments of symptoms and objective measures of motor function through a set of fine motor tests (tapping and spiral drawings), in their homes. Both the LCIG-naïve and LCIG-non-naïve patients used the test battery four times per day during week-long test periods.

Methods (assessments): The LCIG-naïve patients used the test battery at baseline (before LCIG), month 0 (first visit; at least 3 months after intraduodenal LCIG), and thereafter quarterly for the first year and biannually for the second and third years. The LCIG-non-naïve patients used the test battery from the first visit, i.e. month 0. Of the 77 patients, 65 utilized the test battery; 35 were LCIG-non-naïve and 30 LCIG-naïve. In 20 of the LCIG-naïve patients, assessments with the test battery were available during oral treatment and for at least one test period after starting infusion treatment. Three LCIG-naïve patients did not use the test battery at baseline but had at least one test period of assessments thereafter; hence n=23 in the LCIG-naïve group. In total, symptom assessments in the full sample (both patient groups) were collected during 379 test periods and 10,079 test occasions. For 369 of these test periods, clinical assessments including UPDRS and PDQ-39 were performed in the afternoon at the start of the test period.
The repeated measurements of the test battery were processed and summarized into scores representing patients' symptom severities over a test period, using statistical methods. Six conceptual dimensions were defined; four subjectively reported ('walking', 'satisfied', 'dyskinesia', and 'off') and two objectively measured ('tapping' and 'spiral'). In addition, an 'overall test score' (OTS) was defined to represent the global health condition of the patient during a test period.

Statistical methods: Change in the test battery scores over time, i.e. between baseline and follow-up test periods, was assessed with linear mixed-effects models with patient ID as a random effect and test period as a fixed effect of interest. The within-patient variability of OTS was assessed for the two patient groups using the intra-class correlation coefficient (ICC). Correlations between clinical rating scores and test battery scores were assessed using Spearman's rank correlations (rho).

Results: In LCIG-naïve patients, mean OTS was significantly improved over baseline from the first test period on LCIG treatment until month 24. In contrast, there were no significant changes in mean OTS for LCIG-non-naïve patients, except for a worse mean OTS at month 36 (p<0.01, n=16). The mean scores of all subjectively reported dimensions improved significantly throughout the study, except 'walking' at month 36 (p=0.41, n=4). However, there were no significant differences in mean scores of the objectively measured dimensions between baseline and other test periods, except improved 'tapping' at months 6 and 36 and 'spiral' at month 3 (p<0.05). The LCIG-naïve patients had a higher within-subject variability in their OTS scores (ICC=0.67) than LCIG-non-naïve patients (ICC=0.71). The OTS correlated adequately with total UPDRS (rho=0.59) and total PDQ-39 (rho=0.59).

Conclusions: In this 3-year follow-up study of advanced PD patients treated with LCIG, we found that it is possible to monitor PD progression over time using a home environment test battery. The significant improvements in mean OTS indicate that the test battery is able to measure functional improvement with LCIG sustained over at least 24 months.

Relevance:

10.00%

Publisher:

Abstract:

The aim of this study was to investigate whether a telemetry test battery can be used to measure effects of Parkinson's disease (PD) treatment intervention and disease progression in patients with fluctuations. Sixty-five patients diagnosed with advanced PD were recruited in an open longitudinal 36-month study; 35 were treated with levodopa-carbidopa intestinal gel (LCIG) and 30 were candidates for switching from oral PD treatment to LCIG. They utilized a test battery, consisting of self-assessments of symptoms and fine motor tests (tapping and spiral drawings), four times per day in their homes during week-long test periods. The repeated measurements were summarized into an overall test score (OTS) representing the global condition of the patient during a test period. Clinical assessments included ratings on the Unified PD Rating Scale (UPDRS) and the 39-item PD Questionnaire (PDQ-39). In LCIG-naïve patients, mean OTS was significantly improved over baseline from the first test period on LCIG treatment until month 24. In LCIG-non-naïve patients, there were no significant changes in mean OTS until month 36. The OTS correlated adequately with total UPDRS (rho = 0.59) and total PDQ-39 (rho = 0.59). Responsiveness measured as effect size was 0.696 for OTS and 0.536 for UPDRS. The trends of the test scores were similar to those of the clinical rating scores, but the dropout rate was high. Correlations between OTS and clinical rating scales were adequate, indicating that the test battery captures important elements of the information in well-established scales. The responsiveness and reproducibility were better for OTS than for total UPDRS.
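The abstract reports responsiveness as an effect size without stating the formula. One common choice, sketched below under that assumption, is the standardized response mean: the mean change between baseline and follow-up divided by the standard deviation of the change scores. The data values are hypothetical.

```python
# Responsiveness as a standardized response mean (mean change / SD of
# change). The formula choice and the sample scores are assumptions for
# illustration, not taken from the study.

def effect_size(baseline, follow_up):
    deltas = [f - b for b, f in zip(baseline, follow_up)]
    n = len(deltas)
    mean = sum(deltas) / n
    sd = (sum((d - mean) ** 2 for d in deltas) / (n - 1)) ** 0.5
    return mean / sd

base = [3.0, 2.5, 4.0, 3.5, 2.8]     # hypothetical baseline scores
after = [2.2, 2.0, 3.1, 2.6, 2.3]    # hypothetical follow-up scores
print(round(abs(effect_size(base, after)), 2))
```

A larger absolute effect size means the instrument detects the treatment change more sharply, which is the sense in which OTS (0.696) outperformed UPDRS (0.536).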

Relevance:

10.00%

Publisher:

Abstract:

This study proposes and applies a multipolar qualitative methodology for investigating the meanings of corporate work under the impact of technical-scientific knowledge. Based on constructionist epistemology and a Heideggerian approach, this methodology made it possible to describe the phenomenon not only from the perspective of Management but also from those of Psychology, Sociology, and Philosophy. The study starts from empirical data, collected through interviews and publications, and analyzes them using conceptual contributions from the four poles. As a result, the investigation proposes reflexivity, impermanence, and entanglement as meanings of corporate work that culminate in paradoxical work. It also demonstrates that the impact of knowledge occurs in a reflexive spiral of complexification, continuously affecting individuals and work processes; describes the impermanent character of work ruled by time, by neophilia, and by uncertainties that make the worker's very subjective experience precarious; and proposes that this work becomes entangled because of the way it is managed, and is itself entangling, ensnaring the individual in its techniques, logics, and dissonances. The paradoxical essence that emerges from this appraisal of corporate work is described in the following respects: its reflexivity does not prevent its rationality from being fragile; its dynamism coexists with the difficulty of achieving substantive organizational change; its entanglement compromises the possibility of an apparently reachable well-being; its focus on the present ends up emptied; and its purpose of individual sustenance becomes unsustainable because of hyperconsumption. The conclusions of the study do not, however, describe a definitive phenomenon. In closing, it points out possibilities for living with paradoxical work as well as possibilities for its transformation.

Relevance:

10.00%

Publisher:

Abstract:

Abstract. Interplanetary scintillation observations of 48 of the 55 Augusto et al. (1998) flat-spectrum radio sources were carried out at 111 MHz on the Large Phased Array (LPA) in Russia. Because of the large size of the LPA beam (1° × 0.5°), a careful inspection of all possible confusing sources was made using extant large radio surveys: 37 of the 48 sources are not confused. We were able to estimate the scintillating flux densities of 13 sources, obtaining upper limits for the remaining 35. Gathering more VLBI data on these sources, or improving the extant data, might significantly improve our results. This proof-of-concept project shows that compact (<1″) flat-spectrum radio sources scintillate strongly enough at 111 MHz to establish or constrain the low-frequency end of their spectra. Key words. galaxies: general – galaxies: active – galaxies: quasars: general

Relevance:

10.00%

Publisher:

Abstract:

In this work, biodiesel was produced from castor oil, with glycerin as a byproduct. The molar ratio between oil and alcohol, as well as the use of a KOH catalyst to promote the chemical reaction, was based on the literature. The best results were obtained using 1 mol of castor oil (260 g) to 3 moles of methyl alcohol (138 g), with 1.0% KOH as catalyst, at a temperature of 260 °C and shaken at 120 rpm. The oil used was commercially available; the process involves the transesterification reaction of a vegetable oil with methyl alcohol. The product of this reaction is an ester, biodiesel being the main product, with glycerin as the by-product, which was treated for use as a raw material for the production of allyl alcohol. The great advantage of using the glycerin to obtain allyl alcohol is that it eliminates a large amount of biodiesel waste and various forms of harm to the environment. The reaction forming allyl alcohol was conducted from formic acid and glycerin in a 1:1 ratio, at a temperature of 260 °C in a heating mantle, with the vapors condensed in a spiral condenser over a period of 2 hours; the product obtained consists mostly of allyl alcohol. The reactions were monitored by UV-Visible spectrophotometry and Fourier-transform infrared (FTIR) spectroscopy; the spectral changes indicated the formation of the product, allyl alcohol (prop-2-en-1-ol), in the presence of water. This alcohol was named Alcohol GL. The observed absorption bands confirm the reaction: ν(C=C) at 1470-1600 cm⁻¹ and ν(O-H) at 3610-3670 cm⁻¹, attributed to the C=C and OH groups respectively. The thermal analysis was carried out in an SDT Q600 thermogravimetric analyzer, in which mass and temperature are displayed against time, allowing the approximate heating rate to be checked.
The innovative methodology developed in the laboratory (LABTAM, UFRN) was able to treat the glycerin produced by the transesterification of castor oil and use it as a raw material for the production of allyl alcohol, with a yield of 80%. This alcohol is of great importance in the manufacture of polymers, pharmaceuticals, organic compounds, herbicides, pesticides, and other chemicals.
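The stated reaction quantities can be checked arithmetically. The sketch below uses only the figures quoted in the abstract; the "implied" molar masses are simply mass divided by mole count from those figures, not reference values.

```python
# Arithmetic check of the quoted transesterification quantities:
# 1 mol of castor oil (260 g) to 3 mol of alcohol (138 g), KOH at 1.0%
# of the oil mass. All figures come from the abstract itself.

oil_mass_g, oil_mol = 260.0, 1.0
alcohol_mass_g, alcohol_mol = 138.0, 3.0

implied_oil_molar_mass = oil_mass_g / oil_mol              # g/mol implied by the text
implied_alcohol_molar_mass = alcohol_mass_g / alcohol_mol  # g/mol implied by the text
molar_ratio = alcohol_mol / oil_mol                        # alcohol : oil
koh_mass_g = 0.010 * oil_mass_g                            # 1.0% catalyst loading

print(implied_alcohol_molar_mass, molar_ratio, koh_mass_g)
```

Note that the implied alcohol molar mass of 46 g/mol matches ethanol rather than methanol (about 32 g/mol), so either the 138 g figure or the identification of the alcohol as methyl alcohol in the source appears internally inconsistent.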

Relevance:

10.00%

Publisher:

Abstract:

The birth models of care are discussed in the light of classical and contemporary social science theory, with emphasis on the humanistic model. The double spiral of the sociology of absences and the sociology of emergences is detailed, based on the one hand on the translation of experiences of knowledge and, on the other, on the translation of experiences of information and communication, by revealing the movement articulated by Brazilian women on blogs that defend and bring to light initiatives aiming to recover natural and humanized birth. A cartography of the thematic ideas in the birth literature is produced, resulting in a synthetic map of obstetric models of care in contemporaneity, pointing out the consequences of the obstetric model that has become hegemonic in contemporary societies and comparing that model to others that work more efficaciously for mothers and babies. A symbolic cartography of the activism for humanizing birth on the Brazilian blogosphere is configured through an analytical map synthesizing the main mottos defended by the movement: normal humanized birth; against obstetric violence; and planned home birth. The superposition of the obstetric models of care map and the rebirth-of-birth analytical map indicates that three main measures must be reinforced to make a paradigmatic turn in contemporary birth models of care possible: pave the way for humanistic assistance in normal birth, by defending and highlighting practices and professionals that act in compliance with evidence-based medicine, respecting the physiology of birth; denaturalize obstetric violence, by showing how routine procedures and interventions can be means of aggression, jeopardizing the autonomy, the protagonism, and the respect owed to women; and motivate initiatives of planned home birth, the best place for holistic experiences of birth.
It is concluded that Internet tools have allowed a pioneering mobilization for respecting women's reproductive rights in Brazil and that the potential of the crowd's biopower residing in the blogosphere can turn blogs into an alternative way to reach more democratic forms of social organization. In that condition of being virtually hegemonic in contesting the established power, these blogs can therefore be understood as potentially great counter-hegemonic channels for the rebirth of birth and for the reinvention of social emancipation, as their authors articulate and organize themselves to strive against the waste of experience, trying to create reciprocal intelligibility among different experiences of the world.

Relevance:

10.00%

Publisher:

Abstract:

This work presents the development of new microwave structures, filters and a high-gain antenna, through the cascading of frequency selective surfaces that use Dürer and Minkowski fractal patches as elements, in addition to an element obtained from the combination of two other simple elements, the cross dipole and the square spiral. Frequency selective surfaces (FSS) cover a large area of telecommunications and have been widely used due to their low cost, low weight, and ability to integrate with other microwave circuits. They are especially important in several applications, such as airplanes, antenna systems, radomes, rockets, missiles, etc. FSS applications in high frequency ranges have been investigated, as well as applications of cascaded (multilayer) structures and active FSS. In this work, we present simulated and measured transmission characteristics of cascaded (multilayer) structures, aiming to investigate their behavior in terms of bandwidth, one of the major problems presented by frequency selective surfaces. Comparisons are made between simulated results, obtained using commercial software such as Ansoft Designer v3, and results measured in the laboratory. Finally, some suggestions are presented for future work on this subject.
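A common first-order way to reason about cascaded FSS layers, sketched below under simplifying assumptions, is an equivalent transmission-line model: each layer is a shunt admittance, the dielectric spacer between layers is a line section, and the cascade is a product of ABCD matrices. The element values here are illustrative, not fitted to the fractal FSS of this work.

```python
import cmath

# Transmission-line sketch of a two-layer FSS cascade: shunt admittance
# (layer) x line section (spacer) x shunt admittance (layer), multiplied
# as ABCD matrices, then converted to |S21|. Values are illustrative.

Z0 = 377.0  # free-space wave impedance, ohms

def shunt(Y):
    """ABCD matrix of a shunt admittance Y."""
    return [[1, 0], [Y, 1]]

def line(beta_l, Zc):
    """ABCD matrix of a lossless line of electrical length beta_l."""
    c, s = cmath.cos(beta_l), cmath.sin(beta_l)
    return [[c, 1j * Zc * s], [1j * s / Zc, c]]

def mul(A, B):
    """2x2 matrix product."""
    return [[A[0][0]*B[0][0] + A[0][1]*B[1][0], A[0][0]*B[0][1] + A[0][1]*B[1][1]],
            [A[1][0]*B[0][0] + A[1][1]*B[1][0], A[1][0]*B[0][1] + A[1][1]*B[1][1]]]

def s21(abcd):
    """Transmission coefficient of an ABCD network between Z0 ports."""
    A, B, C, D = abcd[0][0], abcd[0][1], abcd[1][0], abcd[1][1]
    return 2 / (A + B / Z0 + C * Z0 + D)

# Two identical capacitive layers with a quarter-wave spacer
Y_layer = 1j * 0.004   # illustrative shunt admittance, siemens
cascade = mul(mul(shunt(Y_layer), line(cmath.pi / 2, Z0)), shunt(Y_layer))
t = abs(s21(cascade))
print(round(t, 3))
```

Sweeping `beta_l` with frequency is what produces the wider (or multi-band) transmission responses that motivate cascading FSS layers.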

Relevance:

10.00%

Publisher:

Abstract:

This work analyzes a microstrip antenna structure designed for application in ultra-wideband (UWB) systems. It is a prospective analytical study in which changes to the antenna geometry were tested and their suitability to the proposed objectives observed. A UWB antenna must operate over a range of at least 500 MHz and present a fractional bandwidth greater than or equal to 25%. It is also desirable that the antenna meet the band specifications of the FCC (Federal Communications Commission), which regulated the system in 2002, designating for UWB a bandwidth of 7.5 GHz, ranging from 3.1 GHz to 10.6 GHz, setting the maximum power spectral density of operation at -41.3 dBm/MHz, and defining the minimum fractional bandwidth as 20%. The study starts from a structure with the geometry of a stylized @, which evolves through changes in its form, simulated in the commercial software CST MICROWAVE STUDIO, version 5.3.1, and then tested using Ansoft HFSS, version 9. These variations were based on observations from the available literature on planar monopole microstrip antennas. As a result, an antenna called the Almost Rectangular Spiral Planar Monopole Antenna for UWB applications (AMQEUWB) is proposed, whose simulated and measured results are satisfactory and consistent with the objectives of the study. Some proposals for future work are mentioned.
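The fractional bandwidth criterion quoted above is straightforward to compute: FBW = 2(fH − fL)/(fH + fL). Applying it to the FCC band cited in the text gives a value far above the 20-25% thresholds.

```python
# Fractional bandwidth, FBW = 2 * (fH - fL) / (fH + fL), applied to the
# FCC UWB band (3.1-10.6 GHz) quoted in the abstract.

def fractional_bandwidth(f_low_ghz, f_high_ghz):
    return 2 * (f_high_ghz - f_low_ghz) / (f_high_ghz + f_low_ghz)

fbw = fractional_bandwidth(3.1, 10.6)
print(f"{fbw:.1%}")   # about 109.5% over the full FCC band
```

Any antenna covering the whole 3.1-10.6 GHz band therefore satisfies the fractional-bandwidth definition of UWB with a wide margin.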

Relevance:

10.00%

Publisher:

Abstract:

The technical and economic viability of solar heating for swimming pools is unquestionable: it avoids the high costs and environmental impacts of a conventional energy supply and optimizes pool heating. This work applies the principles of the greenhouse effect (advanced thermodynamics, heat retention, and equalization of temperature) to optimize solar heating equipment, reducing the collector area required by as much as 40% (a still-estimated value) relative to commercial collectors, with minor architectural and aesthetic impacts on the surroundings. It presents an alternative solar heating system for pools whose main characteristics are low cost, simplicity of manufacture and assembly, and faster heating. The system consists of two spiral collectors made of polyethylene hose, one hundred meters each, working under forced flow with only one pass of the working fluid through the coils; the pool's own water-treatment pump is used to obtain the desired flow. One of the collectors is exposed to direct solar radiation, while the other is covered by a glass sheet and closed laterally, thus providing the greenhouse effect. The equipment is installed in parallel and exposed to the sun simultaneously in order to obtain comparative data on their effectiveness. Results of thermal tests are presented for the two cases, with and without the transparent cover. The thermal, economic, and material feasibility of these systems for heating swimming pools is demonstrated by comparison.
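The comparison between the two collectors can be quantified as the useful heat gain from the measured flow and temperature rise, Q = ṁ·cp·ΔT. The flow rate and temperatures below are hypothetical, chosen only to illustrate the calculation, not measured values from this work.

```python
# Useful heat gain of a pool-heating collector: Q = m_dot * cp * (Tout - Tin).
# Flow and temperatures are hypothetical illustration values.

CP_WATER = 4186.0  # specific heat of water, J/(kg K)

def useful_gain_w(flow_kg_s, t_in_c, t_out_c):
    return flow_kg_s * CP_WATER * (t_out_c - t_in_c)

open_collector = useful_gain_w(0.05, 26.0, 30.0)     # bare hoses (hypothetical)
covered_collector = useful_gain_w(0.05, 26.0, 33.5)  # glass-covered (hypothetical)
print(open_collector, covered_collector)
```

Running both collectors at the same flow, as in the parallel installation described, makes the outlet-temperature difference a direct measure of the greenhouse cover's benefit.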

Relevance:

10.00%

Publisher:

Abstract:

There is nowadays a growing demand for localized cooling and stabilization in optical and electronic devices, as well as for portable cooling systems that allow greater independence in several activities. Thermoelectric cooling modules are heat pumps that use the Peltier effect, which consists of the production of a temperature gradient when an electric current is applied to a thermoelectric pair formed by two different conductors. That effect belongs to a class of thermoelectric effects typical of junctions between electrical conductors. The modules are manufactured with semiconductors; the one used is bismuth telluride (Bi2Te3), arranged in a periodic sequence. In this context, the idea arose of analyzing a system that obeys the Fibonacci sequence. The Fibonacci sequence is connected with the golden ratio and can be found in the study of bee reproduction, in the behavior of light and of atoms, as well as in the growth of plants and in the study of galaxies, among many other applications. A one-dimensional apparatus was set up with the objective of investigating the thermal behavior of a module that obeys a Fibonacci-type growth rule. The results demonstrate that the modules with a periodic arrangement are more efficient.
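A standard way to build a Fibonacci-type one-dimensional arrangement, sketched below, is the substitution rule A → AB, B → A, whose word lengths follow the Fibonacci numbers. How the two letters map onto the module's element types is an assumption here; the abstract does not specify the construction.

```python
# Fibonacci substitution rule A -> AB, B -> A, commonly used to generate
# quasiperiodic one-dimensional stacks. The mapping of A/B to physical
# elements is an assumption for illustration.

def fibonacci_word(generations):
    word = "A"
    for _ in range(generations):
        word = "".join("AB" if ch == "A" else "A" for ch in word)
    return word

for g in range(1, 7):
    w = fibonacci_word(g)
    print(g, len(w), w if len(w) <= 13 else w[:13] + "...")
```

Each generation's length is the sum of the two previous lengths (2, 3, 5, 8, 13, 21, ...), which is what makes the resulting stack quasiperiodic rather than periodic.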

Relevance:

10.00%

Publisher:

Abstract:

Lithium (Li) is the chemical element with atomic number 3 and is among the lightest known elements in the universe. In nature, lithium is generally found in the form of two stable isotopes, 6Li and 7Li. The latter is dominant and accounts for about 93% of the Li found in the Universe. Because of its fragility, this element is widely used in astrophysics, especially for understanding the physical processes that have occurred since the Big Bang, through the evolution of galaxies and stars. For the primordial nucleosynthesis at the time of the Big Bang (BBN), theoretical calculations predict the production of Li along with all the light elements, such as deuterium and beryllium. For Li, BBN theory predicts a primordial abundance of log ε(Li) = 2.72 dex on a logarithmic scale relative to H. The Li abundance found in metal-poor stars, or Pop II stars, is referred to as the primordial Li abundance and is measured as log ε(Li) = 2.27 dex. In the interstellar medium (ISM), which reflects the current value, the lithium abundance is log ε(Li) = 3.2 dex. This value is of great importance for our comprehension of the chemical evolution of the Galaxy. The process responsible for the increase over the primordial value of Li is still not clearly understood. There is in fact a real contribution of Li from low-mass giant stars, and this contribution needs to be well constrained if we want to understand our Galaxy. The main obstacle in this logical sequence is the appearance of some low-mass giant stars of G and K spectral types whose atmospheres are highly enriched in Li.
Such elevated values are exactly the opposite of what is expected for the typical abundance of low-mass giant stars, whose convective envelopes deepen in mass so that all the Li should be diluted, leaving abundances around log ε(Li) ∼ 1.4 dex according to stellar evolution models. Three suggestions are found in the literature that try to reconcile the theoretical and observed Li abundances in these Li-rich giants, but none of them brings conclusive answers. In the present work, we propose a qualitative study of the evolutionary state of the Li-rich stars in the literature, together with the recent discovery of the first Li-rich star observed by the Kepler satellite. The main objective of this work is to promote a solid discussion of the evolutionary state based on the characteristics obtained from the seismic analysis of the object observed by Kepler. We used evolutionary tracks and simulations done with the population synthesis code TRILEGAL, intending to evaluate as precisely as possible the evolutionary state and internal structure of these groups of stars. The results indicate a very short characteristic time, when compared to the evolutionary scale, related to the enrichment of these stars.
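The abundance scale used throughout is log ε(X) = log10(N_X/N_H) + 12, so differences in dex translate into linear factors of 10^Δ. The sketch below converts the values quoted in the abstract.

```python
import math  # noqa: imported for clarity; 10 ** x suffices here

# log eps(Li) = log10(N_Li / N_H) + 12, so the linear factor between two
# abundances is 10 ** (difference in dex). Values are those quoted above.

def enrichment_factor(log_eps_a, log_eps_b):
    return 10 ** (log_eps_a - log_eps_b)

ism, pop2, diluted = 3.2, 2.27, 1.4   # ISM, Pop II, post-dilution values
print(round(enrichment_factor(ism, pop2), 1))      # ISM vs primordial
print(round(enrichment_factor(pop2, diluted), 1))  # primordial vs diluted giants
```

In linear terms the ISM is thus nearly an order of magnitude richer in Li than the Pop II (primordial) value, which is the enrichment the Galactic sources must explain.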

Relevance:

10.00%

Publisher:

Abstract:

Recent astronomical observations indicate that the universe has null spatial curvature, is accelerating, and has a matter-energy content composed of circa 30% matter (baryons + dark matter) and 70% dark energy, a relativistic component with negative pressure. However, in order to build more realistic models, it is necessary to consider the evolution of small density perturbations to explain the richness of observed structures on the scale of galaxies and clusters of galaxies. The structure formation process was first described by Press and Schechter (PS) in 1974, by means of the galaxy cluster mass function. The PS formalism establishes a Gaussian distribution for the primordial density perturbation field. Besides a serious normalization problem, such an approach does not explain the recent cluster X-ray data, and it is also in disagreement with the most up-to-date computational simulations. In this thesis, we discuss several applications of the nonextensive (non-Gaussian) q-statistics, proposed in 1988 by C. Tsallis, with special emphasis on the cosmological process of large-scale structure formation. Initially, we investigate the statistics of the primordial fluctuation field of the density contrast, since the most recent data from the Wilkinson Microwave Anisotropy Probe (WMAP) indicate a deviation from Gaussianity. We assume that such deviations may be described by the nonextensive statistics, because it reduces to the Gaussian distribution in the limit of the free parameter q = 1, thereby allowing a direct comparison with the standard theory. We study its application to a galaxy cluster catalog based on the ROSAT All-Sky Survey (hereafter HIFLUGCS). We conclude that the standard Gaussian model applied to HIFLUGCS does not agree with the most recent data independently obtained by WMAP; using the nonextensive statistics, we obtain values much more aligned with the WMAP results.
We also demonstrate that the Burr distribution corrects the normalization problem. The cluster mass function formalism was also investigated in the presence of dark energy, and in this case constraints on several cosmic parameters were obtained. The nonextensive statistics was further applied to two distinct problems: (i) the plasma probe and (ii) the description of Bremsstrahlung radiation (the primary radiation from X-ray clusters), a problem of considerable interest in astrophysics. In another line of development, using supernova data and the gas mass fraction of galaxy clusters, we discuss a redshift variation of the equation-of-state parameter, considering two distinct expansions. An interesting aspect of this work is that the results do not need a prior on the mass parameter, as usually occurs in analyses involving only supernova data. Finally, we obtain a new estimate of the Hubble parameter through a joint analysis involving the Sunyaev-Zeldovich effect (SZE), X-ray data from galaxy clusters, and the baryon acoustic oscillations. We show that the degeneracy of the observational data with respect to the mass parameter is broken when the signature of the baryon acoustic oscillations, as given by the Sloan Digital Sky Survey (SDSS) catalog, is considered. Our analysis, based on SZE/X-ray data for a sample of 25 galaxy clusters with triaxial morphology, yields a Hubble parameter in good agreement with independent studies, such as those provided by the Hubble Space Telescope project and the recent WMAP estimates.
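The key limit invoked above, that the nonextensive statistics reduces to the standard (Gaussian/exponential) case as q → 1, can be illustrated with the Tsallis q-exponential, exp_q(x) = [1 + (1 − q)x]^(1/(1−q)), the building block of the q-Gaussian. This is a generic sketch of that limit, not the thesis's fitting code.

```python
import math

# Tsallis q-exponential: exp_q(x) = [1 + (1-q)x] ** (1/(1-q)), with the
# convention exp_q(x) = 0 where the base is non-positive. It reduces to
# math.exp(x) as q -> 1; a q-Gaussian is exp_q(-beta * x**2).

def q_exp(x, q):
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0

x = -0.5
for q in (1.5, 1.1, 1.01, 1.001):
    print(q, q_exp(x, q))
print("exp:", math.exp(x))   # the q -> 1 limit
```

For q > 1 the q-Gaussian has heavier tails than the Gaussian, which is precisely the kind of departure from Gaussianity the thesis tests against the HIFLUGCS and WMAP data.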

Relevance:

10.00%

Publisher:

Abstract:

Understanding the way in which large-scale structures like galaxies form remains one of the most challenging problems in cosmology today. The standard theory for the origin of these structures is that they grew by gravitational instability from small, perhaps quantum-generated, fluctuations in the density of dark matter, baryons, and photons in a uniform primordial Universe. After recombination, the baryons began to fall into the pre-existing gravitational potential wells of the dark matter. In this dissertation, a study is first made of the primordial recombination era, the epoch of the formation of neutral hydrogen atoms. In addition, we analyze the evolution of the density contrast (of baryonic and dark matter) in clouds of dark matter with masses between 10^4 and 10^10 solar masses. In particular, we take into account the several physical mechanisms acting on the baryonic component during and after the recombination era. The analysis of the formation of these primordial objects was made in the context of three dark energy background models: quintessence, ΛCDM (cosmological constant plus cold dark matter), and phantom. We show that dark matter is the fundamental agent for the formation of the structures observed today, while dark energy also has great importance at the epoch of their formation.
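The gravitational-instability growth described above can be illustrated in its simplest limit, a matter-dominated (Einstein-de Sitter) background, where the linear density contrast obeys δ'' + (3/2a)δ' − (3/2a²)δ = 0 (primes with respect to the scale factor a) and the growing mode is δ ∝ a. The sketch below integrates only this EdS limit as a consistency check; the dark-energy backgrounds studied in the dissertation (quintessence, ΛCDM, phantom) modify these coefficients and are not implemented here.

```python
# Linear growth of the matter density contrast in an Einstein-de Sitter
# background: delta'' + (3/(2a)) delta' - (3/(2a^2)) delta = 0,
# with growing mode delta ~ a. Integrated with a fixed-step RK4 scheme.

def rhs(a, delta, ddelta):
    """Second derivative of delta with respect to the scale factor a."""
    return (3.0 / (2.0 * a * a)) * delta - (3.0 / (2.0 * a)) * ddelta

def grow(a0, a1, steps=20000):
    h = (a1 - a0) / steps
    a, d, v = a0, a0, 1.0          # growing-mode initial data: delta = a
    for _ in range(steps):
        k1d, k1v = v, rhs(a, d, v)
        k2d, k2v = v + h*k1v/2, rhs(a + h/2, d + h*k1d/2, v + h*k1v/2)
        k3d, k3v = v + h*k2v/2, rhs(a + h/2, d + h*k2d/2, v + h*k2v/2)
        k4d, k4v = v + h*k3v, rhs(a + h, d + h*k3d, v + h*k3v)
        d += h * (k1d + 2*k2d + 2*k3d + k4d) / 6
        v += h * (k1v + 2*k2v + 2*k3v + k4v) / 6
        a += h
    return d

print(grow(0.001, 0.01))   # should track delta ~ a, i.e. about 0.01
```

Growing the contrast by a factor of ten while the universe expands tenfold is the EdS benchmark against which the dark-energy models' slower late-time growth is compared.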

Relevance:

10.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

10.00%

Publisher:

Abstract:

The next generation of computers is expected to be based on architectures with multiple processors and/or multicore processors. In this context there are challenges related to interconnection features, operating frequency, on-chip area, power dissipation, performance, and programmability. The interconnection and communication mechanism considered ideal for this type of architecture is the network-on-chip, due to its scalability, reusability, and intrinsic parallelism. Communication in networks-on-chip is accomplished by transmitting packets that carry data and instructions representing requests and responses between the processing elements interconnected by the network. The transmission of packets proceeds as in a pipeline between the routers of the network, from the source to the destination of the communication, even allowing simultaneous communications between different source-destination pairs. From this fact, it is proposed to transform the entire communication infrastructure of the network-on-chip (its routing, arbitration, and storage mechanisms) into a high-performance parallel processing system. In this proposal, the packets are formed by instructions and data that represent the applications, and these are executed by the routers as the packets are transmitted, exploiting the pipeline and the parallel communication transmissions. Traditional processors are not used; instead, simple cores only control access to memory. An implementation of this idea is called IPNoSys (Integrated Processing NoC System), which has its own programming model and a routing algorithm that guarantees the execution of all instructions in the packets, preventing deadlock, livelock, and starvation. This architecture provides mechanisms for input and output, interrupts, and operating system support.
As a proof of concept, a programming environment and a simulator for this architecture were developed in SystemC, allowing the configuration of various parameters and the collection of several results to evaluate the architecture.
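The central idea, a packet whose instructions are executed by the routers it traverses, can be illustrated with a toy model. The instruction set, packet layout, and router names below are invented for illustration; they are not the actual IPNoSys formats or ISA.

```python
# Toy model of execute-while-routing: a packet carries an accumulator and
# a list of instructions, and each router along the path consumes one
# instruction before forwarding. Hypothetical format, not real IPNoSys.

def execute_hop(packet):
    """One router: pop the next instruction and apply it to the accumulator."""
    op, arg = packet["instructions"].pop(0)
    if op == "ADD":
        packet["acc"] += arg
    elif op == "MUL":
        packet["acc"] *= arg
    return packet

def route(packet, path):
    """Send the packet through a path of routers, executing as it goes."""
    for router in path:
        if packet["instructions"]:
            packet = execute_hop(packet)
        packet["trace"].append(router)
    return packet

pkt = {"acc": 2, "instructions": [("ADD", 3), ("MUL", 4)], "trace": []}
result = route(pkt, ["R00", "R01", "R11"])
print(result["acc"], result["trace"])   # (2 + 3) * 4 = 20 over 3 hops
```

By the time the packet reaches its destination the computation is already done, which is how the architecture turns the communication pipeline itself into the processing fabric.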