953 results for Galaxy: open clusters and associations: general


Relevance:

100.00%

Publisher:

Abstract:

Latent class regression models are useful tools for assessing associations between covariates and latent variables. However, evaluation of key model assumptions cannot be performed using methods from standard regression models due to the unobserved nature of latent outcome variables. This paper presents graphical diagnostic tools to evaluate whether latent class regression models adhere to two standard model assumptions: conditional independence and non-differential measurement. An integral part of these methods is the use of a Markov chain Monte Carlo (MCMC) estimation procedure. Unlike standard maximum likelihood implementations for latent class regression model estimation, the MCMC approach allows us to calculate posterior distributions and point estimates of any function of the parameters. It is this convenience that makes the diagnostic methods we introduce possible. As a motivating example, we present a latent class regression analysis of the association between depression and socioeconomic status using data from the Epidemiologic Catchment Area study, where the latent variable depression is regressed on education and income indicators, in addition to age, gender, and marital status variables. While the fitted latent class regression model yields interesting results, the model parameters are found to be invalid due to violations of the model assumptions, violations that are clearly identified by the presented diagnostic plots. These methods can be applied to standard latent class and latent class regression models, and the general principle can be extended to evaluate model assumptions in other types of models.
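
To make the role of MCMC concrete: the diagnostics rely on having posterior draws for arbitrary functions of the model parameters, which maximum likelihood does not provide directly. Below is a minimal, hypothetical sketch of that idea for a two-class latent class model with binary indicators, fitted by Gibbs sampling; the data, priors, and the derived quantity at the end are illustrative assumptions, not the paper's actual model or diagnostics.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: n subjects answer J binary symptom items, generated
# from a two-class model (prevalence and item probabilities are made up).
n, J = 500, 5
true_pi = 0.3
true_p = np.array([[0.8] * J, [0.2] * J])
z_true = rng.random(n) < true_pi
Y = (rng.random((n, J)) < np.where(z_true[:, None], true_p[0], true_p[1])).astype(int)

def gibbs_lca(Y, n_iter=2000, burn=500):
    """Gibbs sampler for a 2-class latent class model with binary items.
    (Ignores label switching, which is adequate for well-separated classes.)"""
    n, J = Y.shape
    pi, p = 0.5, rng.uniform(0.25, 0.75, size=(2, J))
    draws = []
    for t in range(n_iter):
        # 1) class memberships z_i | parameters, data
        ll1 = (Y * np.log(p[0]) + (1 - Y) * np.log(1 - p[0])).sum(1) + np.log(pi)
        ll0 = (Y * np.log(p[1]) + (1 - Y) * np.log(1 - p[1])).sum(1) + np.log(1 - pi)
        z = rng.random(n) < 1.0 / (1.0 + np.exp(ll0 - ll1))
        # 2) class prevalence pi | z (uniform prior -> Beta posterior)
        pi = rng.beta(1 + z.sum(), 1 + (~z).sum())
        # 3) item probabilities | z, Y (uniform priors -> Beta posteriors)
        for k, mask in enumerate([z, ~z]):
            s, nk = Y[mask].sum(0), mask.sum()
            p[k] = rng.beta(1 + s, 1 + nk - s)
        if t >= burn:
            draws.append((pi, p.copy()))
    return draws

draws = gibbs_lca(Y)
# The point of MCMC here: a posterior sample for *any* function of the
# parameters, e.g. the between-class gap in item 0's response probability.
gap = np.array([p[0, 0] - p[1, 0] for _, p in draws])
print("posterior mean:", gap.mean(), "95% CI:", np.percentile(gap, [2.5, 97.5]))
```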

Relevance:

100.00%

Publisher:

Abstract:

Large parts of the world are subject to one or more natural hazards, such as earthquakes, tsunamis, landslides, tropical storms (hurricanes, cyclones and typhoons), coastal inundation and flooding, and virtually the entire world is at risk of man-made hazards. In recent decades, rapid population growth and economic development in hazard-prone areas have greatly increased the potential of multiple hazards to damage and destroy buildings, bridges, power plants, and other infrastructure, posing a grave danger to communities and disrupting economic and societal activities. Although an individual hazard is significant in many parts of the United States (U.S.), in certain areas more than one hazard may pose a threat to the constructed environment. In such areas, structural design and construction practices should address multiple hazards in an integrated manner to achieve structural performance that is consistent with owner expectations and general societal objectives. The growing importance of multiple-hazard engineering has recently been recognized, spurring the evolution of multiple-hazard risk-assessment frameworks and design approaches that have paved the way for future research toward sustainable construction of new and improved structures and retrofitting of existing structures. This report provides a review of the literature and the current state of practice for assessment, design and mitigation of the impact of multiple hazards on structural infrastructure. It also presents an overview of future research needs related to the multiple-hazard performance of constructed facilities.

Relevance:

100.00%

Publisher:

Abstract:

Life-Patterns on the Periphery: A Humanities Base for Development Imperatives and their Application in the Chicago City-Region is informed by the need to bring diverse fields together to tackle issues related to the contemporary city-region. By honouring the long-term economic, social, political, and ecological imperatives that form the fabric of healthy, productive, sustainable communities, it becomes possible to set up the political structures and citizen will to develop distinct places that result in overlapping citizen life patterns, setting the stage for citizen action and interaction. Grounded in humanities scholarship, the four imperatives act as checks on each other so that no single imperative is honoured alone in development. Informed by Heidegger, Arendt, de Certeau, Casey, and others, their foundation in the humanities underlines their importance, while creating a stage on which all fields can contribute to actualizing this balance in practice. Theoretical assistance is borrowed extensively from architecture, planning theory, urban theory, and landscape urbanism, including scholarship from Saskia Sassen, John Friedmann, William Cronon, Jane Jacobs, Joel Garreau, Alan Berger, and many others. This project uses the Chicago city-region as its site, specifically the Interstate 80 and 88 corridors extending west from Chicago. Both transportation corridors are divided into study regions, providing the opportunity to examine a broad variety of population and development densities. Through observational research, a picture of each study region is extrapolated, analyzed, and understood with respect to the four imperatives. This analysis yields region-specific suggestions for future development, culminating in universal steps that can be taken to build stronger communities and set both the research site specifically, and North American city-regions in general, on a path toward healthy, productive, sustainable development.

Relevance:

100.00%

Publisher:

Abstract:

The past decade has seen the energy consumption of servers and Internet Data Centers (IDCs) skyrocket. A recent survey estimated that worldwide spending on server power and cooling has risen above $30 billion and is likely to exceed spending on new server hardware. This rapid rise in energy consumption poses a serious threat to both energy resources and the environment, making green computing not only worthwhile but necessary. This dissertation tackles the twin challenges of reducing the energy consumption of server systems and reducing costs for Online Service Providers (OSPs). Two distinct subsystems account for most of an IDC's power: the server system, which accounts for 56% of total power consumption, and the cooling and humidification systems, which account for about 30%. The server system dominates the energy consumption of an IDC, and its power draw can vary drastically with data center utilization. In this dissertation, we propose three models to achieve energy efficiency in web server clusters: an energy-proportional model, an optimal server allocation and frequency adjustment strategy, and a constrained Markov model. The proposed models combine Dynamic Voltage/Frequency Scaling (DVFS) and Vary-On, Vary-Off (VOVF) mechanisms that work together for greater energy savings, and corresponding strategies are proposed to deal with the transition overheads. We further extend server energy management to IDC cost management, helping OSPs conserve and manage their electricity costs and lower their carbon emissions. We have developed an optimal energy-aware load dispatching strategy that periodically maps more requests to the locations with lower electricity prices. A carbon emission limit is imposed, and the volatility of the carbon offset market is also considered. Two energy-efficient strategies are applied to the server system and the cooling system, respectively. With the rapid development of cloud services, we also carry out research to reduce server energy in cloud computing environments. In this work, we propose a new live virtual machine (VM) placement scheme that can effectively map VMs to Physical Machines (PMs) with substantial energy savings in a heterogeneous server cluster. A VM/PM mapping probability matrix is constructed, in which each VM request is assigned a probability of running on each PM. The matrix takes into account resource limitations, VM operation overheads, server reliability and energy efficiency. The evolution of Internet Data Centers and the increasing demands of web services raise great challenges for improving the energy efficiency of IDCs, and we identify several potential areas for future research in each chapter.
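
As a rough illustration of the price-aware dispatching idea, a greedy sketch that routes request load to the cheapest locations first, subject to capacity, might look like the following; the prices and capacities are hypothetical, and the dissertation's actual strategy additionally enforces the carbon-emission limit and handles market volatility.

```python
import numpy as np

def dispatch_load(total_requests, prices, capacities):
    """Greedy price-aware dispatching: fill the cheapest locations first,
    subject to each location's capacity."""
    order = np.argsort(prices)                 # cheapest location first
    alloc = np.zeros(len(capacities))
    remaining = float(total_requests)
    for i in order:
        take = min(remaining, capacities[i])   # don't exceed site capacity
        alloc[i] = take
        remaining -= take
        if remaining <= 0:
            break
    if remaining > 0:
        raise ValueError("demand exceeds total capacity")
    return alloc

# Hypothetical electricity prices ($/kWh) and request capacities per site.
prices = np.array([0.12, 0.07, 0.10])
capacities = np.array([4000, 3000, 5000])
print(dispatch_load(9000, prices, capacities))  # -> [1000. 3000. 5000.]
```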

Relevance:

100.00%

Publisher:

Abstract:

The lack of access to sufficient water and sanitation facilities is one of the largest hindrances to the sustainable development of the poorest 2.2 billion people in the world. Rural Uganda is one of the areas where such inaccessibility seriously hampers development. Many rural Ugandans must travel several kilometers to fetch adequate water, and many still do not have adequate sanitation facilities. Such poor access to clean water forces Ugandans to spend an inordinate amount of time and energy collecting water - time and energy that could be used for more productive endeavors. Furthermore, the difficulty of getting water means that people use less water than they need for optimal health and well-being. Access to other sanitation facilities can also have a large impact, particularly on the health of young children and the elderly, whose immune systems are less than optimal. Hand-washing, the presence of a sanitary latrine, general household cleanliness, maintenance of the safe water chain, and the household's knowledge of and adherence to sound sanitation practices may be as important as access to clean water sources. This report investigates these problems using the results of two studies. The first examines how access to water affects people's use of it - in particular, how much water households use as a function of the perceived effort required to fetch it. Operationally, this was accomplished by surveying nearly 1,500 residents in three districts around Uganda about their water usage and the time and distance they must travel to fetch water. The study found no statistically significant correlation between a family's water usage and the perceived effort they must put forth to fetch it. On average, people use around 15 liters per person per day; rural Ugandan residents apparently require a certain amount of water and will travel as far or as long as necessary to collect it. The second study, entitled "What Works Best in Diarrheal Disease Prevention?", examined the effectiveness of five different water and sanitation facilities in reducing the incidence of diarrheal disease among children under five, by surveying five communities before and after the implementation of improvements. It found that household water treatment devices provide the best means of preventing diarrheal diseases, likely because water often becomes contaminated before it is consumed, even when it was collected from a protected source.

Relevance:

100.00%

Publisher:

Abstract:

Demand for renewable energy is growing, and the manufacture of solar cells and photovoltaic arrays has advanced dramatically in recent years: photovoltaic production has doubled every two years since 2002, increasing by an average of 48% each year. After a general overview of solar cell operation and modeling, this thesis covers the three generations of photovoltaic solar cell technology and the motivation for dedicating research to nanostructured solar cells. For current-generation solar cells, efficiency depends on several factors - photon capture, photon reflection, carrier generation by photons, carrier transport and collection - and in particular on the absorption of photons. The absorption coefficient, α, and its dependence on the wavelength, λ, are therefore of major concern for improving efficiency. Nano-silicon structures (quantum wells and quantum dots) have a unique advantage over bulk and thin-film crystalline silicon: multiple direct and indirect band gaps can be realized by appropriate size control of the quantum wells, enabling photons at multiple wavelengths of the solar spectrum to be absorbed efficiently. There is limited research on calculating the absorption coefficient in silicon nanostructures. We present a theoretical approach to calculating the absorption coefficient using quantum mechanical calculations of the interaction of photons with the electrons of the valence band. One model holds that the oscillator strength of direct optical transitions is enhanced by the quantum-confinement effect in Si nanocrystallites; such quantum wells can be realized in practice in porous silicon. The absorption coefficient shows a peak of 64,638.2 cm⁻¹ at λ = 343 nm, at a photon energy of ξ = 3.49 eV (λ = 355.532 nm). I have shown that a value of the absorption coefficient α comparable to that of bulk silicon is possible in silicon QDs because of carrier confinement. Our results show that we can enhance the absorption coefficient by an order of magnitude while maintaining a nearly constant absorption coefficient across the visible spectrum. The validity of these plots is verified by correlation with experimental photoluminescence plots. A generic efficiency comparison is given for a p-i-n junction solar cell with and without QDs, and the design and fabrication technique is discussed in brief. I have shown that by using QDs in the intrinsic region of a cell, we can improve the efficiency by a factor of 1.865; thus, a first-generation solar cell with 26% efficiency can reach nearly 48.5% efficiency using QDs.
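
As a quick consistency check on the quoted numbers, the photon energy-wavelength relation λ = hc/E connects the two reported values:

```python
# Photon energy-wavelength relation: lambda = h*c / E.
hc_eV_nm = 1239.84      # h*c expressed in eV*nm
E_eV = 3.49             # photon energy quoted in the abstract
print(hc_eV_nm / E_eV)  # ~355.3 nm, consistent with the quoted 355.532 nm
```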

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Chronic pain is an important outcome variable after inguinal hernia repair that is generally not assessed by objective methods. The aim of this study was to objectively investigate chronic pain and hypoesthesia after inguinal hernia repair using three types of operation: open suture, open mesh, and laparoscopic. METHODS: A total of 96 patients were included in the study, with a median follow-up of 4.7 years. Open suture repair was performed in 40 patients (group A), open mesh repair in 20 patients (group B), and laparoscopic repair in 36 patients (group C). Hypoesthesia and pain were assessed using von Frey monofilaments, and quality of life was investigated with the Short Form 36. RESULTS: Pain occurring at least once a week was found in 7 (17.5%) patients of group A, 5 (25%) patients of group B, and 6 (16.6%) patients of group C. The area and intensity of hyposensibility were significantly greater after open non-mesh and mesh repair than after laparoscopy (p = 0.01). Hyposensibility in patients who had laparoscopic hernia repair was significantly associated with postoperative pain (p = 0.03). Postoperative pain was somatic in 19 (61%), neuropathic in 9 (29%), and visceral in 3 (10%) patients, without significant differences between the three groups. CONCLUSIONS: The incidence of hypoesthesia in patients who had laparoscopic hernia repair is significantly lower than in those who had open hernia repair, and hypoesthesia after laparoscopic, but not open, repair is significantly associated with postoperative pain. Von Frey monofilaments are important tools for assessing inguinal hypoesthesia and pain after inguinal hernia repair, allowing quantitative and qualitative comparison between surgical techniques.

Relevance:

100.00%

Publisher:

Abstract:

Moisture-induced distresses are the prevalent distress type driving the deterioration of both asphalt and concrete pavement sections. While various surface techniques have been employed over the years to minimize the ingress of moisture into pavement structural sections, subsurface drainage components like open-graded base courses remain the best alternative for minimizing the time pavement structural sections are exposed to saturated conditions. This research therefore focuses on assessing the performance and cost-effectiveness of pavement sections containing both treated and untreated open-graded aggregate base materials. Three common roadway aggregates - two virgin aggregates and one recycled aggregate - were investigated using four open gradations and two binder types. Laboratory tests were conducted to determine the hydraulic, mechanical, and durability characteristics of treated and untreated open-graded mixes made from these three aggregates. Results of the experimental program show that, for the same gradation and mix design, limestone samples have the greatest drainage capacity, stability under traffic loads, and resistance to degradation from environmental conditions such as freeze-thaw. However, depending on the gradation and mix design used, all three aggregate types - limestone, natural gravel, and recycled concrete - can meet the minimum coefficient of hydraulic conductivity required for good drainage in most pavements. Test results for both asphalt- and cement-treated open-graded samples indicate that an air void content in the range of 15-25 percent will produce a treated open-graded base course with sufficient drainage capacity and long-term stability under both traffic and environmental loads. Using the new Mechanistic-Empirical Pavement Design Guide (MEPDG) software, computer simulations were conducted on pavement sections containing these open-graded aggregate base materials to determine whether the predicted pavement performance is sensitive to drainage. Across three truck traffic levels and four climatic regions, the simulations indicate that the predicted performance was not sensitive to the drainage characteristics of the open-graded base course. Based on the MEPDG-predicted performance, the cost-effectiveness of the pavement sections with open-graded base was computed on the assumption that the increased service life of these sections is attributable to the positive effects of subsurface drainage. The two cost analyses gave contrasting results, with one indicating that the inclusion of open-graded base courses can lead to substantial savings.

Relevance:

100.00%

Publisher:

Abstract:

This thesis develops high-performance real-time signal processing modules for direction-of-arrival (DOA) estimation in localization systems. It proposes highly parallel algorithms for subspace decomposition and polynomial rooting, which are traditionally implemented with sequential algorithms, addressing the emerging need for real-time localization in a wide range of applications. As the antenna array size increases, the complexity of the signal processing algorithms grows, making it increasingly difficult to satisfy real-time constraints. This thesis addresses real-time implementation by proposing parallel algorithms that offer considerable improvement over traditional ones, especially for systems with larger numbers of antenna array elements. Singular value decomposition (SVD) and polynomial rooting are the two computationally complex steps and act as the bottlenecks to achieving real-time performance. The proposed algorithms are suitable for implementation on field-programmable gate arrays (FPGAs), single instruction, multiple data (SIMD) hardware, or application-specific integrated circuits (ASICs), all of which offer large numbers of processing elements that can be exploited for parallel processing. The designs proposed in this thesis are modular, easily expandable, and easy to implement. First, the thesis proposes a fast-converging SVD algorithm. The proposed method reduces the number of iterations required to converge to the correct singular values, coming closer to real-time performance. A general algorithm and a modular system design are provided, making it easy for designers to replicate and extend the design to larger matrix sizes. Moreover, the method is highly parallel, which can be exploited on the hardware platforms mentioned earlier. A fixed-point implementation of the proposed SVD algorithm is presented. The FPGA design is pipelined to the maximum extent to increase the maximum achievable frequency of operation, and the system was developed with the objective of achieving high throughput. Various modern cores available in FPGAs were used to maximize performance, and these modules are described in detail. Finally, a parallel polynomial rooting technique based on Newton's method, applicable exclusively to root-MUSIC polynomials, is proposed. Unique characteristics of the root-MUSIC polynomial's complex dynamics were exploited to derive this rooting method. The technique exhibits parallelism and converges to the desired roots within a fixed number of iterations, making it suitable for rooting polynomials of large degree. We believe this is the first time the complex dynamics of the root-MUSIC polynomial have been analyzed to propose an algorithm. In all, the thesis addresses two major bottlenecks in direction-of-arrival estimation systems by providing simple, high-throughput, parallel algorithms.
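
To give a flavor of the parallelism in Newton-based rooting: each candidate root is refined independently of the others, so each iteration below could run on its own processing element. This is a generic Newton iteration on an illustrative polynomial, not the thesis's root-MUSIC-specific derivation:

```python
import numpy as np

def newton_root(coeffs, z0, tol=1e-12, max_iter=50):
    """Newton iteration z <- z - p(z)/p'(z) for a polynomial with
    coefficients in descending-degree order. Each starting point is
    refined independently, which is where the parallelism comes from."""
    dcoeffs = np.polyder(coeffs)
    z = z0
    for _ in range(max_iter):
        step = np.polyval(coeffs, z) / np.polyval(dcoeffs, z)
        z -= step
        if abs(step) < tol:
            break
    return z

# Illustrative polynomial z^2 - (1 + 1j); in root-MUSIC the polynomial is
# built from the noise subspace, and the roots nearest the unit circle
# encode the arrival angles.
coeffs = np.array([1, 0, -(1 + 1j)])
starts = [1 + 0.5j, -1 - 0.5j]           # independent starting points
print([newton_root(coeffs, z0) for z0 in starts])
```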

Relevance:

100.00%

Publisher:

Abstract:

Three-dimensional flow visualization plays an essential role in many areas of science and engineering, such as the aero- and hydrodynamic systems that underlie many physical and natural phenomena. For popular methods such as streamline visualization to be effective, they should capture the underlying flow features while facilitating clear observation and understanding of the flow field. My research focuses on the analysis and visualization of flow fields using various techniques, e.g. information-theoretic techniques and graph-based representations. Since streamline visualization is a popular technique in flow field visualization, how to select good streamlines that capture flow patterns and how to pick good viewpoints from which to observe flow fields become critical. We treat streamline selection and viewpoint selection as symmetric problems and solve them simultaneously using a dual information channel [81]. To the best of my knowledge, this is the first attempt in flow visualization to combine these two selection problems in a unified approach. This work selects streamlines in a view-independent manner, so the selected streamlines do not change across viewpoints. Another work of mine [56] uses an information-theoretic approach to evaluate the importance of each streamline under various sample viewpoints and presents a solution for view-dependent streamline selection that guarantees coherent streamline updates as the view changes gradually. When projecting 3D streamlines to 2D images for viewing, occlusion and clutter become inevitable. To address this challenge, we designed FlowGraph [57, 58], a novel compound graph representation that organizes field line clusters and spatiotemporal regions hierarchically for occlusion-free and controllable visual exploration, enabling observation and exploration of the relationships among field line clusters, spatiotemporal regions, and their interconnections in the transformed space. Most viewpoint selection methods consider only external viewpoints outside the flow field, which convey no clear observation when the flow field is cluttered near the boundary. We therefore propose a new way to explore flow fields: selecting several internal viewpoints around the flow features inside the field and generating a B-spline curve path traversing these viewpoints, providing users with close-up views for detailed observation of hidden or occluded internal flow features [54]. This work is also extended to unsteady flow fields. Beyond flow field visualization, other visualization topics also attract my attention. In iGraph [31], we leverage a distributed system along with a tiled display wall to provide users with high-resolution visual analytics of big image and text collections in real time. Developing pedagogical visualization tools forms my other research focus. Since most cryptography algorithms use sophisticated mathematics, it is difficult for beginners to understand both what an algorithm does and how it does it; we therefore develop a set of visualization tools that provide users with an intuitive way to learn and understand these algorithms.
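
As a sketch of the internal-viewpoint tour idea (the viewpoint coordinates below are hypothetical; in [54] they are chosen around detected flow features), fitting a B-spline camera path through selected viewpoints might look like:

```python
import numpy as np
from scipy.interpolate import splprep, splev

# Hypothetical internal viewpoints (x, y, z) chosen around flow features.
viewpoints = np.array([
    [0.0, 0.0, 0.0],
    [1.0, 2.0, 0.5],
    [2.5, 1.5, 1.0],
    [3.0, 0.0, 2.0],
])

# Fit a cubic B-spline through the viewpoints (s=0 forces interpolation)
# and sample a smooth camera path along it.
tck, _ = splprep(viewpoints.T, k=3, s=0)
u = np.linspace(0.0, 1.0, 100)
path = np.array(splev(u, tck)).T   # 100 camera positions along the tour
print(path[0], path[-1])           # starts/ends at the first/last viewpoint
```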

Relevance:

100.00%

Publisher:

Abstract:

Rooted in critical scholarship, this dissertation is an interdisciplinary study contending that having a history is a basic human right. Advocating a newly conceived framework it terms Solidarity-inspired History, the dissertation argues for, and then delivers, a restorative voice for working-class historical actors in the 1916 Minnesota Iron Ore Strike. Using an interdisciplinary methodological framework, it combines research methods from the humanities and the social sciences to form a working-class history that corrects standardized studies of labor in the late 19th and early 20th centuries. Class interests and power relationships often determine the dominant perspectives or voices established in history, disregarding people and organizations that run counter to customary or traditional American themes of patriotism, the Protestant work ethic, adherence to capitalist dogma, or United States exceptionalism. This dissertation counteracts these traditional narratives with a unique, perhaps even revolutionary, examination of the 1916 Minnesota Iron Ore Strike. The intention of its critical perspective is to poke, prod, and prompt academics, historians, and the general public to rethink, and then think again, about the place of those who have been dislocated from, or altogether forgotten, misplaced, or underrepresented in, the historical record. The purpose of the dissertation is thus to give voice to historical actors in the dismembered past. Historical actors who ran counter to traditional American narratives often have their body of "evidence" disjointed or completely dislocated from the story of our nation. This disremembering creates an artificial recollection of our collective past, which de-articulates past struggles from contemporary groups seeking solidarity and social justice in the present. Class-conscious actors, immigrants, women, the GLBTQ community, and people of color have the right to be remembered on their own terms, using primary sources and resources they produced. Therefore, like the Wobblies' industrial union and its rank and file, this dissertation seeks to fan the flames of discontented historical memory by offering a working-class perspective on the 1916 Strike that interprets the actions, events, people, and places of the strike anew, restoring the voices of these marginalized historical actors.

Relevance:

100.00%

Publisher:

Abstract:

The five counties discussed in this paper are the northernmost and westernmost counties in Montana. On the eastern boundary are Glacier National Park and the Continental Divide; on the southern boundary are Missoula and Powell counties; Idaho lies to the southwest and west; and the Canadian border lies along the northern edge. The region is on the Pacific Ocean side of the Rocky Mountains. Three major rivers, the Clark Fork, the Flathead, and the Kootenai, drain this area into the Columbia River.

Relevance:

100.00%

Publisher:

Abstract:

Nuclear morphometry (NM) uses image analysis to measure features of the cell nucleus, classified as bulk properties, shape or form, and DNA distribution. Studies have used these measurements as diagnostic and prognostic indicators of disease, with inconclusive results. The distributional properties of these variables have not been systematically investigated, although much medical data exhibit non-normal distributions. Measurements are made on several hundred cells per patient, so summary measures reflecting the underlying distribution are needed. Distributional characteristics of 34 NM variables from prostate cancer cells were investigated using graphical and analytical techniques, with 52 to 458 cells per sample. A small sample of patients with benign prostatic hyperplasia (BPH), representing non-cancer cells, was used for general comparison with the cancer cells. Data transformations such as log, square root, and 1/x did not yield normality as measured by the Shapiro-Wilk test; a modulus transformation, used for distributions with abnormal kurtosis values, also did not produce normality. Kernel density histograms of the 34 variables exhibited non-normality, and 18 variables also exhibited bimodality. A bimodality coefficient was calculated, and the 3 variables showing the strongest evidence of bimodality - DNA concentration, shape, and elongation - were studied further. Two analytical approaches were used to obtain a summary measure per variable per patient: cluster analysis to determine significant clusters, and a mixture model analysis using a two-component Gaussian model with equal variances. The mixture component parameters were used to bootstrap the log-likelihood ratio to determine the significant number of components, 1 or 2. These summary measures were used as predictors of disease severity in several proportional odds logistic regression models. The disease severity scale had 5 levels and was constructed from 3 components - extracapsular penetration (ECP), lymph node involvement (LN+), and seminal vesicle involvement (SV+) - which represent surrogate measures of prognosis. The summary measures were not strong predictors of disease severity, though the mixture model results gave some indication of changes in the mean levels and proportions of the components at the lower severity levels.
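
The abstract does not specify which bimodality coefficient was used; a common choice, sketched below as an assumption, is the sample bimodality coefficient built from skewness and excess kurtosis, with values above 5/9 conventionally read as evidence of bimodality:

```python
import numpy as np
from scipy.stats import kurtosis, skew

def bimodality_coefficient(x):
    """Sample bimodality coefficient b = (g^2 + 1) / (k + c), where g is
    skewness, k is excess kurtosis, and c corrects for sample size;
    b > 5/9 (~0.555) is commonly taken to suggest bimodality."""
    n = len(x)
    g, k = skew(x), kurtosis(x)               # kurtosis() is excess kurtosis
    c = 3 * (n - 1) ** 2 / ((n - 2) * (n - 3))
    return (g ** 2 + 1) / (k + c)

rng = np.random.default_rng(1)
unimodal = rng.normal(size=500)
bimodal = np.concatenate([rng.normal(-2, 1, 250), rng.normal(2, 1, 250)])
print(bimodality_coefficient(unimodal))   # ~0.33, below the 5/9 threshold
print(bimodality_coefficient(bimodal))    # above 5/9, flagging bimodality
```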

Relevance:

100.00%

Publisher:

Abstract:

PLATO 2.0 has recently been selected for ESA's M3 launch opportunity (2022/24). Providing accurate key planet parameters (radius, mass, density and age) in statistical numbers, it addresses fundamental questions such as: How do planetary systems form and evolve? Are there other systems with planets like ours, including potentially habitable planets? The PLATO 2.0 instrument consists of 34 small-aperture telescopes (32 with 25 s readout cadence and 2 with 2.5 s cadence) providing a wide field of view (2,232 deg²) and a large photometric magnitude range (4-16 mag). It focuses on bright (4-11 mag) stars in wide fields to detect and characterize planets down to Earth-size by photometric transits, whose masses can then be determined by ground-based radial-velocity follow-up measurements. Asteroseismology will be performed for these bright stars to obtain highly accurate stellar parameters, including masses and ages. The combination of bright targets and asteroseismology results in high accuracy for the bulk planet parameters: 2%, 4-10% and 10% for planet radii, masses and ages, respectively. The planned baseline observing strategy includes two long pointings (2-3 years) to detect and bulk-characterize planets reaching into the habitable zone (HZ) of solar-like stars, and an additional step-and-stare phase to cover in total about 50% of the sky. PLATO 2.0 will observe up to 1,000,000 stars and detect and characterize hundreds of small planets and thousands of planets in the Neptune-to-gas-giant regime out to the HZ. It will therefore provide the first large-scale catalogue of bulk-characterized planets with accurate radii, masses, mean densities and ages. This catalogue will include terrestrial planets at intermediate orbital distances, where surface temperatures are moderate; coverage of this parameter range with statistical numbers of bulk-characterized planets is unique to PLATO 2.0. The catalogue will allow us, for example, to:
- complete our knowledge of planet diversity for low-mass objects,
- correlate the planet mean density-orbital distance distribution with predictions from planet formation theories,
- constrain the influence of planet migration and scattering on the architecture of multiple systems, and
- specify how planet and system parameters change with host star characteristics, such as type, metallicity and age.
The catalogue will also allow us to study planets and planetary systems at different evolutionary phases. It will further provide a census of small, low-mass planets, serving to identify objects that retained their primordial hydrogen atmosphere and, in general, the typical characteristics of planets in this low-mass, low-density range. Planets detected by PLATO 2.0 will orbit bright stars, and many of them will be targets for future spectroscopy exploring their atmospheres. Furthermore, the mission has the potential to detect exomoons, planetary rings, and binary and Trojan planets. The planetary science possible with PLATO 2.0 is complemented by its impact on stellar and galactic science via asteroseismology, as well as light curves of all kinds of variable stars, together with observations of stellar clusters of different ages. This will allow us to improve stellar models and study stellar activity. A large number of well-determined ages of red giant stars will probe the structure and evolution of our Galaxy, and asteroseismic ages of bright stars at different phases of stellar evolution will allow calibration of stellar age-rotation relationships. Together with the results of ESA's Gaia mission, the results of PLATO 2.0 will provide a huge legacy to planetary, stellar and galactic science.
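
For context on the photometric precision transit detection requires, a transit's depth is the squared planet-to-star radius ratio; a back-of-the-envelope check for an Earth analog around a Sun-like star (standard radii, not values from the abstract):

```python
# Transit depth for an Earth analog crossing a Sun-like star:
# depth = (R_planet / R_star)^2.
R_earth_km = 6371.0
R_sun_km = 695700.0
depth = (R_earth_km / R_sun_km) ** 2
print(f"{depth:.2e}")   # ~8.4e-05, i.e. a dip of roughly 84 parts per million
```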

Relevance:

100.00%

Publisher:

Abstract:

The stress of dental treatment often elicits negative emotions in children, expressed as dental fear or anxiety. Highly anxious children obstruct treatment and avoid therapy, further amplifying oral health problems. The aim of this study was to examine the neuroendocrine and autonomic nervous system responses to dental treatment and their possible interactions and associations with psychometric indices of anxiety, caries, previous dental experience, anesthesia, age, and gender in school children. Upon informed consent, saliva was obtained from 97 children (59% males, mean age ± SD: 89.73 ± 15 months) in the pediatric dentistry clinic before treatment, immediately post-treatment, and at the recall visit to determine cortisol and salivary alpha-amylase (sAA) levels. Dental and general anxiety were assessed through specific questionnaires completed by the children. Compared to pre-treatment, cortisol levels were increased following treatment, while sAA levels were higher at the recall visit. Pre- and post-treatment cortisol and sAA responses were positively correlated, and dental and general anxiety questionnaire scores were also significantly correlated with each other. The integrated autonomic and neuroendocrine responses prior to treatment were correlated with state anxiety, and those following treatment with dental anxiety. However, univariable and multivariable linear regression analyses associated post-treatment cortisol levels, but not sAA levels, with dental anxiety. No associations of cortisol or sAA responses with caries, age, gender, previous dental experience, or anesthesia were detected. These data provide some evidence that both sAA and cortisol levels are altered in children in anticipation of or during dental treatment, but only cortisol levels are associated with dental anxiety.
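
As an illustration of the kind of multivariable model reported here, a sketch regressing post-treatment cortisol on dental anxiety with age and gender as covariates is shown below; the data frame, variable names, and effect sizes are all invented for demonstration.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical data standing in for the study's variables.
rng = np.random.default_rng(2)
n = 97
df = pd.DataFrame({
    "dental_anxiety": rng.normal(10, 3, n),
    "age_months": rng.normal(90, 15, n),
    "male": rng.integers(0, 2, n),
})
df["cortisol_post"] = 0.05 * df["dental_anxiety"] + rng.normal(0, 0.5, n)

# Multivariable model: post-treatment cortisol regressed on dental anxiety,
# adjusting for age and gender.
X = sm.add_constant(df[["dental_anxiety", "age_months", "male"]])
fit = sm.OLS(df["cortisol_post"], X).fit()
print(fit.params)     # coefficient estimates
print(fit.pvalues)    # per-covariate significance
```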