931 results for Topological data analysis


Relevance: 90.00%

Abstract:

Helicobacter pylori infection is frequently acquired during childhood. This microorganism is known to cause gastritis and duodenal ulcer in pediatric patients; however, most children remain completely asymptomatic. There is currently no consensus in favor of treating H. pylori infection in asymptomatic children. The first-line treatment for this population is triple therapy, combining two antibacterial agents and one proton pump inhibitor over a two-week course. Eradication rates below 75% have been documented with this first-line therapy, but novel tinidazole-containing quadruple sequential therapies seem worth investigating. None of the previous studies of such therapy was conducted in the United States of America. As part of an iron deficiency anemia study in asymptomatic H. pylori-infected children of El Paso, Texas, we conducted a secondary analysis of the data collected in this trial to assess the effectiveness of this tinidazole-containing sequential quadruple therapy, compared with placebo, in clearing the infection. Subjects were selected from a group of asymptomatic children identified through household visits to 11,365 randomly selected dwelling units. After obtaining parental consent and child assent, a total of 1,821 children 3-10 years of age were screened; 235 were positive on a novel urine immunoglobulin G antibody test for H. pylori infection and were confirmed as infected using a 13C urea breath test, with a urea hydrolysis rate >10 μg/min as the cut-off value. Of those, 119 study subjects had a complete physical exam and baseline blood work and were randomly allocated to four groups: two received active H. pylori eradication medication, alone or in combination with iron, while the other two received iron only or placebo only. Follow-up visits to their homes were made to assess compliance and the occurrence of adverse events, and at 45+ days post-treatment a second urea breath test was performed to assess infection status. Effectiveness was primarily assessed on an intent-to-treat basis (i.e., according to treatment allocation), with the proportion of children who cleared their infection (urea hydrolysis rate cut-off >10 μg/min) as the primary outcome. We also conducted analyses on a per-protocol basis and according to the cytotoxin-associated gene A (CagA) product status of the infecting H. pylori, and we compared the rate of adverse events across the treatment arms. On intent-to-treat and per-protocol analyses, 44.3% and 52.9%, respectively, of the children receiving the novel quadruple sequential eradication cleared their infection, compared with 12.2% and 15.4% in the arms receiving iron or placebo only, respectively. These differences were statistically significant (p<0.001). The study medications were well accepted and safe. In conclusion, we found in this study population of mostly asymptomatic H. pylori-infected children, living in the US along the border with Mexico, that the quadruple sequential eradication therapy cleared the infection in only about half of the children receiving it. Research is needed to assess the antimicrobial susceptibility of the H. pylori strains infecting this population in order to formulate more effective therapies.
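As a sketch of the intent-to-treat comparison described above, the clearance proportions in the two arms can be compared with a chi-square test. The cell counts below are hypothetical placeholders chosen only to mirror the reported 44.3% vs. 12.2% proportions, not the trial's actual data:

```python
# Sketch of the intent-to-treat comparison of clearance proportions.
# Counts are hypothetical placeholders approximating the reported
# proportions (44.3% vs. 12.2%), not the trial's actual cell counts.
from scipy.stats import chi2_contingency

cleared_treated, total_treated = 27, 61      # hypothetical
cleared_control, total_control = 7, 57       # hypothetical

table = [
    [cleared_treated, total_treated - cleared_treated],
    [cleared_control, total_control - cleared_control],
]
chi2, p, dof, expected = chi2_contingency(table)
print(f"ITT clearance: {cleared_treated/total_treated:.1%} vs "
      f"{cleared_control/total_control:.1%}, p = {p:.4f}")
```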

Relevance: 90.00%

Abstract:

Objective. The goal of this study is to characterize the current workforce of Certified Industrial Hygienists (CIHs) and the lengths of the professional practice careers of past and current CIHs. Methods. This is a secondary analysis of data compiled from nearly 50 annual roster listings of the American Board of Industrial Hygiene (ABIH), covering CIHs active in each year since 1960. Survival analysis was used to measure the primary outcome of interest, with the Kaplan-Meier method used to estimate the survival function. Study subjects: The population studied is all Certified Industrial Hygienists. A CIH is defined by the ABIH as an individual who has achieved the minimum requirements for education and working experience and, through examination, has demonstrated a minimum level of knowledge and competency in the prevention of occupational illnesses. Results. A Cox proportional hazards model was fitted across different start-time cohorts of CIHs, with cohort 1 as the reference. The estimated relative risk of the event (defined as retirement, or absence from the listings for 5 consecutive years) for cohorts 2, 3, 4, and 5 relative to cohort 1 was 0.385, 0.214, 0.234, and 0.299, respectively. The results show that cohort 2 (CIHs certified between 1970 and 1980) has the lowest hazard ratio, indicating the lowest retirement rate. Conclusion. The number of CIHs still actively practicing up to the end of 2009 increased substantially starting in 1980 and has plateaued in recent decades. This suggests that the supply of and demand for the profession may have reached equilibrium. More demographic information and additional variables are needed to actually predict the future number of CIHs required.
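A minimal sketch of the survival analysis described above, using the Python lifelines package; the column names and toy records are hypothetical stand-ins for the ABIH roster data:

```python
# Kaplan-Meier estimate plus a Cox proportional hazards model with
# cohort 1 as the reference category. Records are hypothetical.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.DataFrame({
    "years_listed": [12, 30, 8, 25, 40, 5],   # years from certification
    "retired": [1, 0, 1, 1, 0, 1],            # event: dropped from rosters
    "cohort": [1, 2, 3, 2, 1, 5],             # certification-decade cohort
})

km = KaplanMeierFitter()
km.fit(df["years_listed"], event_observed=df["retired"])
print(km.survival_function_)

# dummy-code cohorts; dropping the first makes cohort 1 the reference
X = pd.get_dummies(df, columns=["cohort"], drop_first=True).astype(float)
cph = CoxPHFitter(penalizer=0.1)  # light penalty: toy data are tiny
cph.fit(X, duration_col="years_listed", event_col="retired")
cph.print_summary()
```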

Relevance: 90.00%

Abstract:

When choosing among models to describe categorical data, the need to consider interactions makes selection more difficult. With just four variables and all their interactions, there are 166 different hierarchical models and many more non-hierarchical models. Two procedures have been developed for categorical data that will produce the "best" subset or subsets of each model size, where size refers to the number of effects in the model. Both procedures are patterned after the leaps-and-bounds approach used by Furnival and Wilson for continuous data and do not generally require fitting all models. For hierarchical models, likelihood ratio statistics (G²) are computed using iterative proportional fitting, and "best" is determined by comparing, among models with the same number of effects, Pr(χ²_k ≥ G²_ij), where k is the degrees of freedom for the ith model of size j. To fit non-hierarchical as well as hierarchical models, a weighted least squares procedure has been developed. The procedures are applied to published occupational data on the occurrence of byssinosis, and these results are compared with previously published analyses of the same data. The procedures are also applied to published data on symptoms in psychiatric patients and again compared with previously published analyses. These procedures will make categorical data analysis more accessible to researchers who are not statisticians. They should also encourage more complex exploratory analyses of epidemiologic data and contribute to the development of new hypotheses for study.
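As an illustration of the machinery these procedures rely on, the sketch below fits the hierarchical log-linear model [XY][Z] to a hypothetical 2x2x2 table by iterative proportional fitting and computes the G² statistic and its tail probability Pr(χ²_k ≥ G²); it is a toy version of one model evaluation, not the authors' leaps-and-bounds implementation:

```python
# Iterative proportional fitting (IPF) for the log-linear model [XY][Z]
# on a 2x2x2 table, followed by the G^2 likelihood ratio statistic.
# The counts are hypothetical.
import numpy as np
from scipy.stats import chi2

obs = np.array([[[20, 14], [11, 25]],
                [[12, 22], [23, 9]]], dtype=float)  # hypothetical counts

fit = np.full_like(obs, obs.sum() / obs.size)  # start from a flat table
for _ in range(100):
    # alternately match the XY margin and the Z margin
    fit *= (obs.sum(axis=2) / fit.sum(axis=2))[:, :, None]
    fit *= obs.sum(axis=(0, 1)) / fit.sum(axis=(0, 1))

g2 = 2.0 * np.sum(obs * np.log(obs / fit))
dof = 3  # 2x2x2 table: 7 free cells minus 4 free parameters of [XY][Z]
print(f"G^2 = {g2:.2f}, p = {chi2.sf(g2, dof):.4f}")
```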

Relevance: 90.00%

Abstract:

The purpose of this study is to descriptively analyze the current program at the Ben Taub Pediatric Weight Management Program in Houston, Texas, a program designed to help overweight children ages three to eighteen lose weight. In Texas, approximately one in every three children is overweight or obese. Obesity is seen at an even greater level within Ben Taub because of the hospital's high rate of service to underserved minority populations (Dehghan et al., 2005; Tyler and Horner, 2008; Hunt, 2009). The weight management program consists of nutritional, behavioral, physical activity, and medical counseling. Analysis will focus on changes in weight, BMI, cholesterol levels, and blood pressure from 2007 to 2010 for all participants who attended at least two weight management sessions. Recommendations will be given in response to the results of the data analysis.
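A minimal sketch of the kind of pre/post comparison such an analysis involves; the column names and BMI values are hypothetical, not the program's actual records:

```python
# Change in BMI between first and second visits for participants with
# at least two recorded sessions. All data here are hypothetical.
import pandas as pd
from scipy.stats import ttest_rel

visits = pd.DataFrame({
    "patient_id": [1, 1, 2, 2, 3, 3],
    "visit":      [1, 2, 1, 2, 1, 2],
    "bmi":        [31.2, 30.4, 28.9, 29.1, 35.0, 33.8],
})

wide = visits.pivot(index="patient_id", columns="visit", values="bmi")
eligible = wide.dropna()                    # kept: >= 2 recorded sessions
t, p = ttest_rel(eligible[1], eligible[2])  # paired first vs. second visit
print(f"mean BMI change: {(eligible[2] - eligible[1]).mean():+.2f}, p = {p:.3f}")
```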

Relevance: 90.00%

Abstract:

Objective: In this secondary data analysis, three statistical methodologies for handling cases with missing data were applied to a motivational interviewing and feedback study. The aim was to evaluate the impact these methodologies have on the data analysis. Methods: We first evaluated whether the assumption of missing completely at random held for this study. We then conducted a secondary analysis using a mixed linear model with three missing-data strategies: (a) complete-case analysis; (b) multiple imputation with an explicit model containing the outcome variables, time, and the time-by-treatment interaction; and (c) multiple imputation with an explicit model containing the outcome variables, time, the time-by-treatment interaction, and additional covariates (e.g., age, gender, smoking status, years in school, marital status, housing, race/ethnicity, and whether participants played on an athletic team). Several comparisons were conducted: 1) the motivational interviewing with feedback group (MIF) vs. the assessment-only group (AO), the motivational interviewing group (MIO) vs. AO, and the feedback-only group (FBO) vs. AO; 2) MIF vs. FBO; and 3) MIF vs. MIO. Results: The patterns of missingness indicated that about 13% of participants showed monotone missing patterns and about 3.5% showed non-monotone missing patterns. We then evaluated the missing-completely-at-random assumption with Little's MCAR test; the chi-square test statistic was 167.8 with 125 degrees of freedom (p=0.006), indicating that the data could not be assumed to be missing completely at random. We then assessed whether the three strategies reached the same results. For the comparison between MIF and AO, as well as the comparison between MIF and FBO, only multiple imputation with additional covariates under uncongenial versus congenial models reached different results. For the comparison between MIF and MIO, all the methodologies for handling missing values produced different results. Discussion: The study indicated, first, that missingness was crucial in this study and, second, that understanding the model assumptions is important, since we could not determine whether the data were missing at random or missing not at random. Future research should therefore focus on sensitivity analyses under the missing-not-at-random assumption.
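A compact sketch of strategies (a) and (b), using statsmodels; the variable names and simulated data are hypothetical, and ordinary least squares stands in for the study's mixed linear model:

```python
# Complete-case analysis vs. multiple imputation pooled with Rubin's
# rules. Variables are hypothetical; OLS is a stand-in for the study's
# mixed linear model.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.imputation import mice

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "time": rng.integers(0, 3, n),
    "treat": rng.integers(0, 2, n),
})
df["outcome"] = 1.0 + 0.5 * df["treat"] * df["time"] + rng.normal(size=n)
df.loc[rng.random(n) < 0.15, "outcome"] = np.nan  # inject missingness

# (a) complete-case analysis
cc = sm.OLS.from_formula("outcome ~ time * treat", df.dropna()).fit()
print(cc.params)

# (b) multiple imputation, pooled across imputed data sets
imp = mice.MICEData(df)
mi = mice.MICE("outcome ~ time * treat", sm.OLS, imp).fit(10, 10)
print(mi.summary())
```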

Relevance: 90.00%

Abstract:

These three manuscripts are presented as a PhD dissertation on the use of a GeoVis application to evaluate telehealth programs. The primary aim of this research was to understand how GeoVis applications can be designed and developed by combining a human-centered (HC) approach with cognitive fit theory (CFT), and in turn used to evaluate a telehealth program in Brazil.

The first manuscript presents background on the use of geovisualization to facilitate visual exploration of public health data. It covers the challenges associated with the adoption of existing GeoVis applications, combines the principles of the HC approach and CFT, and develops a framework from this combination that lays the foundation of this research. The framework is then used to propose the design, development, and evaluation of "the SanaViz" for telehealth data in Brazil, as a proof of concept.

The second manuscript is a methods paper describing the approaches employed to design and develop "the SanaViz" based on the proposed framework. Defining the various elements of the HC approach and CFT, it applies a mixed-methods design built on card sorting and sketching techniques. A representative sample of 20 participants involved in the telehealth program at the NUTES telehealth center at UFPE, Recife, Brazil was enrolled. The findings helped us understand the needs of the diverse group of telehealth users and the tasks they perform, and helped us determine the essential features to include in the proposed GeoVis application "the SanaViz".

The third manuscript used a mixed-methods approach to compare the effectiveness and usefulness of the HC GeoVis application "the SanaViz" against a conventional GeoVis application, "Instant Atlas". The same group of 20 participants from Aim 2 was enrolled, and a combination of quantitative and qualitative assessments was conducted. Effectiveness was gauged by the time participants took to complete the tasks with both GeoVis applications, the ease with which they completed them, and the number of attempts each task required. Usefulness was assessed with the System Usability Scale (SUS), a validated questionnaire tested in prior studies. In-depth interviews were conducted to gather opinions about both applications. This manuscript demonstrated the usefulness and effectiveness of HC GeoVis applications for visual exploration of telehealth data, as a proof of concept.

Together, these three manuscripts address the challenges of combining the principles of the HC approach and cognitive fit theory to design and develop GeoVis applications as a method to evaluate telehealth data. To our knowledge, this is the first study to explore the usefulness and effectiveness of GeoVis for visual exploration of telehealth data. The research enabled us to develop a framework for the design and development of GeoVis applications in public health, and especially telehealth; it characterized the varied users involved with the telehealth program and the tasks they perform; and it identified the components essential to such GeoVis applications. The research yielded the following findings: (a) telehealth users vary in their level of understanding of GeoVis; (b) interaction features such as zooming, sorting, and linking and multiple views, and representation features such as bar charts and choropleth maps, were considered the most essential features of the GeoVis applications; (c) comparing and sorting were two important tasks that telehealth users would perform for exploratory data analysis; and (d) an HC GeoVis prototype application is more effective and useful for exploring telehealth data than a conventional GeoVis application. Future studies should incorporate the proposed HC GeoVis framework to enable comprehensive assessment of users and the tasks they perform, and to identify the features that such GeoVis applications should include. The results of this study demonstrate a novel approach to comprehensively and systematically enhance the evaluation of telehealth programs using the proposed GeoVis framework.
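For reference, the System Usability Scale score used in the third manuscript is computed with a fixed scoring rule; a short sketch follows (the responses shown are hypothetical):

```python
# Standard SUS scoring: ten Likert items answered 1-5; odd items
# contribute (response - 1), even items contribute (5 - response);
# the summed contributions are scaled by 2.5 onto a 0-100 range.
def sus_score(responses):
    assert len(responses) == 10
    total = sum(r - 1 if i % 2 == 0 else 5 - r
                for i, r in enumerate(responses))
    return total * 2.5

print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # one participant -> 85.0
```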

Relevance: 90.00%

Abstract:

An important competence of human data analysts is to interpret and explain the meaning of the results of data analysis to end users. However, existing automatic solutions for intelligent data analysis provide limited help in interpreting and communicating information to non-expert users. In this paper we present a general approach to generating explanatory descriptions of the meaning of quantitative sensor data. We propose a type of web application: a virtual newspaper with automatically generated news stories that describe the meaning of sensor data. This solution integrates a variety of techniques from intelligent data analysis into a web-based multimedia presentation system. We validated our approach on a real-world problem and demonstrate its generality using data sets from several domains. Our experience shows that this solution can facilitate the use of sensor data by general users and can therefore increase the utility of sensor network infrastructures.
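A toy sketch of the underlying idea, turning a quantitative sensor series into an explanatory sentence; the thresholds and wording are illustrative assumptions, not the paper's actual generation rules:

```python
# Template-based description of a sensor series: classify the overall
# trend, then render it as a short news-style sentence.
def describe(name, unit, series):
    delta = series[-1] - series[0]
    trend = ("held steady" if abs(delta) < 0.5
             else "rose" if delta > 0 else "fell")
    return (f"{name.capitalize()} {trend} from {series[0]}{unit} to "
            f"{series[-1]}{unit}, peaking at {max(series)}{unit}.")

print(describe("temperature", " °C", [18.2, 19.0, 22.5, 21.1]))
```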

Relevance: 90.00%

Abstract:

In energy markets, thorough data analysis is crucial when designing sophisticated models that are able to capture the most critical market drivers. In this study we investigate the structure of Spanish natural gas prices to improve understanding of the role they play in the determination of electricity prices and to inform future decisions about price modelling. To further understand the potential for modelling, the study focuses on the nature and characteristics of the different gas price data available. The fact that the existing gas market in Spain lacks sufficient trading liquidity makes it even more important to analyze in detail the available gas price information, which ultimately helps explain how electricity prices are affected by natural gas markets. Representative Spanish gas prices are typically difficult to explore because there is not yet a transparent gas market and all the gas imported into the country is negotiated and purchased by private companies on confidential terms.
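A sketch of the kind of exploratory analysis this implies, relating a gas price series to electricity prices; the series and values below are hypothetical, since no transparent public Spanish gas index is assumed to exist:

```python
# Co-movement between a (hypothetical) gas price series and electricity
# prices: correlation of levels and of period-over-period returns.
import pandas as pd

prices = pd.DataFrame({
    "gas_eur_mwh":  [22.1, 23.4, 21.8, 25.0, 26.3, 24.9],
    "elec_eur_mwh": [48.0, 50.2, 47.5, 55.1, 57.8, 53.4],
})
print(prices.corr().loc["gas_eur_mwh", "elec_eur_mwh"])  # level correlation
print(prices.pct_change().corr())                        # return correlation
```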

Relevance: 90.00%

Abstract:

Embryogenesis is the process by which a single cell turns into a living organism. Through several stages of development, the cell population proliferates while the embryo takes shape and the organs develop and gain their functionality. This is possible through genetic, biochemical, and mechanical factors involved in a complex interaction of processes organized at different levels and at different spatio-temporal scales. Through this complexity, embryogenesis develops in a robust and reproducible way, yet with a variability that makes possible the diversity of individuals of the same species. Advances in microscope optics and the advent of fluorescent proteins that can be attached to expression chains, reporting on structural and functional elements of the cell, have enabled the in-vivo observation of embryogenesis. The imaging process yields sequences of high spatio-temporal resolution 3D+time data, a digital representation of the embryo that can be analyzed further, provided new image processing and data analysis techniques are developed. One of the most relevant and challenging lines of research in the field is the quantification of the mechanical factors and processes involved in the shaping of the embryo, in isolation and in interaction with other embryogenesis factors such as genetic expression. Because of the complexity of these processes, studies have focused on specific problems and scales controlled in experiments, posing and testing hypotheses to gain new biological insight; the resulting methodologies, however, are often difficult to export to other biological phenomena or specimens.

This PhD thesis is framed within this paradigm and proposes a systematic methodology to quantify the deformation patterns that emerge from the organized motion of cells, estimated from in-vivo images of embryogenesis at different stages of development, globally or in specific tissues. With this strategy it becomes possible to quantify not only local mechanisms but also to discover and characterize the scales of mechanical interaction and organization within the embryo. The framework focuses on quantifying the kinematics of the motion (deformation and strains) from images in a non-invasive way, setting aside the causes of the motion (forces), since experimental and methodological challenges currently hamper a systematic quantification of exerted forces and of the mechanical properties of tissues. A descriptive framework of deformation patterns nevertheless provides valuable insight into the organization and scales of the mechanical interactions along embryo development, and such a characterization should help improve mechanical models and progressively unravel the complexity of embryogenesis. The framework relies on a Lagrangian representation of the cell dynamics system, based on the trajectories of material points moving with the deformation (rather than fixed spatial points), estimated from the images by cell tracking or image registration techniques. This enables the reconstruction of the mechanical patterning as experienced by the cells and tissues, building temporal profiles of deformation along stages of development that comprise both instantaneous events and the deformation history accumulated with respect to an initial state, for any domain of the embryo.

Applying this framework to 3D+time data of zebrafish embryogenesis revealed mechanical profiles that stabilize over time, forming structures that organize at a scale comparable to the map of cell differentiation (fate map) and that show signs of correlation with genetic expression patterns. The framework was also applied to the analysis of the amnioserosa tissue during Drosophila (fruit fly) dorsal closure, revealing that the oscillatory contraction triggered by the acto-myosin network couples sub-cellular, cellular, and supra-cellular scales in a complex way: local force generation foci, cellular morphology control mechanisms, and tissue geometrical constraints. In summary, this PhD thesis proposes a novel strategy for the analysis of multi-scale cell dynamics that quantifies mechanical patterns directly and offers a representation that reconstructs the evolution of the processes as the cells experience them, rather than as the microscope captures them. The framework thus enables new forms of quantitative analysis and comparison of embryos and tissues during embryogenesis from in-vivo images.
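A minimal sketch of the Lagrangian kinematics the framework quantifies: estimating the deformation gradient F from the displacement of tracked neighbours around a material point, then deriving the Green-Lagrange strain E = (FᵀF − I)/2; the point coordinates are hypothetical:

```python
# Local deformation gradient and Green-Lagrange strain from tracked
# point trajectories. Coordinates are hypothetical 2D positions of a
# material point and three neighbours at two time points.
import numpy as np

X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])    # t0
x = np.array([[0.0, 0.0], [1.1, 0.1], [-0.05, 0.9], [1.05, 1.0]])  # t1

# relative positions around the first point, at both times
dX, dx = X[1:] - X[0], x[1:] - x[0]
A, *_ = np.linalg.lstsq(dX, dx, rcond=None)  # least-squares fit dx ≈ dX @ A
F = A.T                                      # column convention: dx = F @ dX
E = 0.5 * (F.T @ F - np.eye(2))              # Green-Lagrange strain tensor
print(E)
```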

Relevance: 90.00%

Abstract:

The large amount of data recorded daily in organizations' database systems has created the need to analyze it. However, organizations face the complexity of processing huge volumes of data with traditional methods of analysis. Moreover, in a globalized and competitive environment, organizations constantly seek to improve their processes, for which they require tools that allow them to make better decisions. This means being better informed and knowing their digital history, in order to describe their processes and to anticipate (predict) unforeseen events. These new requirements for data analysis have driven the growing development of data-mining projects. The data-mining process seeks to obtain, from a massive data set, models that describe the data or predict new instances in the set. It involves stages of data preparation and of partially or fully automated processing to identify models in the data, producing as output patterns, relationships, or rules. This output must represent new knowledge for the organization, useful and understandable for end users, and capable of being integrated into its processes to support decision-making. The greatest difficulty, however, is precisely that the data analyst involved throughout this process must identify models, a complex task that often requires the experience not only of the data analyst but also of an expert in the problem domain. One way to support the analysis of data, models, and patterns is through their visual representation, exploiting the visual perception capabilities of human beings, which can detect patterns more easily. Under this approach, visualization has been used in data mining mostly for the descriptive analysis of the data (input) and the presentation of the patterns (output), leaving this paradigm underexploited for the analysis of models. This document describes the development of the doctoral thesis entitled "Nuevos Esquemas de Visualizaciones para Mejorar la Comprensibilidad de Modelos de Data Mining" ("New Visualization Schemes to Improve the Understandability of Data-Mining Models"). This research aims to provide a visualization approach to support the understanding of data-mining models, proposing the metaphor of visually augmented models.

Relevance: 90.00%

Abstract:

Context. The Gaia-ESO Public Spectroscopic Survey is obtaining high-quality spectroscopy of some 100 000 Milky Way stars using the FLAMES spectrograph at the VLT, down to V = 19 mag, systematically covering all the main components of the Milky Way and providing the first homogeneous overview of the distributions of kinematics and chemical element abundances in the Galaxy. Observations of young open clusters, in particular, are giving new insights into their initial structure and kinematics and their subsequent evolution. Aims. This paper describes the analysis of UVES and GIRAFFE spectra acquired in the fields of young clusters whose population includes pre-main sequence (PMS) stars. The analysis is applied to all stars in such fields, regardless of any prior information on membership, and provides fundamental stellar atmospheric parameters, elemental abundances, and PMS-specific parameters such as veiling, accretion, and chromospheric activity. Methods. When feasible, different methods were used to derive raw parameters (e.g., line equivalent widths), fundamental atmospheric parameters, and derived parameters (e.g., abundances). To derive some of these parameters, we used methods that have been extensively used in the past as well as new ones developed in the context of the Gaia-ESO Survey. The internal precision of these quantities was estimated by inter-comparing the results obtained by the different methods, while the accuracy was estimated by comparison with independent external data, such as effective temperature and surface gravity derived from angular diameter measurements, on a sample of benchmark stars. A validation procedure based on these comparisons was applied to discard spurious or doubtful results and produce recommended parameters. Specific strategies were implemented to resolve problems of fast rotation, accretion signatures, chromospheric activity, and veiling. Results. The analysis carried out on spectra acquired in young cluster fields during the first 18 months of observations, up to June 2013, is presented in preparation for the first release of advanced data products. These include targets in the fields of the ρ Oph, Cha I, NGC 2264, γ Vel, and NGC 2547 clusters. Stellar parameters obtained with the higher resolution and larger wavelength coverage of UVES are reproduced with comparable accuracy and precision using the smaller wavelength range and lower resolution of the GIRAFFE setup adopted for young stars, which allows us to provide stellar parameters with confidence for the much larger GIRAFFE sample. Precisions are estimated to be ≈120 K rms in Teff, ≈0.3 dex rms in log g, and ≈0.15 dex rms in [Fe/H] for the UVES and GIRAFFE setups.
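A sketch of the inter-comparison used to estimate internal precision: the rms scatter of each star's effective temperature across independent analysis methods; the per-method values below are hypothetical:

```python
# Method-to-method dispersion of Teff: per-star rms across methods,
# summarized by the median. Values are hypothetical.
import numpy as np

# rows: stars, columns: independent analysis methods
teff = np.array([[5210., 5280., 5150.],
                 [4820., 4900., 4760.],
                 [6030., 5950., 6100.]])

scatter = teff.std(axis=1, ddof=1)  # per-star method-to-method rms
print(f"median Teff dispersion: {np.median(scatter):.0f} K")
```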

Relevance: 90.00%

Abstract:

Context. The ongoing Gaia-ESO Public Spectroscopic Survey is using FLAMES at the VLT to obtain high-quality medium-resolution Giraffe spectra for about 10^5 stars and high-resolution UVES spectra for about 5000 stars. With UVES, the Survey has already observed 1447 FGK-type stars. Aims. These UVES spectra are analyzed in parallel by several state-of-the-art methodologies. Our aim is to present how these analyses were implemented, to discuss their results, and to describe how a final recommended parameter scale is defined. We also discuss the precision (method-to-method dispersion) and accuracy (biases with respect to the reference values) of the final parameters. These results are part of the Gaia-ESO second internal release and will be part of its first public release of advanced data products. Methods. The final parameter scale is tied to the scale defined by the Gaia benchmark stars, a set of stars with fundamental atmospheric parameters. In addition, a set of open and globular clusters is used to evaluate the physical soundness of the results. Each of the implemented methodologies is judged against the benchmark stars to define weights in three different regions of the parameter space. The final recommended results are the weighted medians of those from the individual methods. Results. The recommended results successfully reproduce the atmospheric parameters of the benchmark stars and the expected Teff–log g relation of the calibrating clusters. Atmospheric parameters and abundances have been determined for 1301 FGK-type stars observed with UVES. The median of the method-to-method dispersion of the atmospheric parameters is 55 K for Teff, 0.13 dex for log g, and 0.07 dex for [Fe/H]. Systematic biases are estimated to be between 50–100 K for Teff, 0.10–0.25 dex for log g, and 0.05–0.10 dex for [Fe/H]. Abundances were derived for 24 elements: C, N, O, Na, Mg, Al, Si, Ca, Sc, Ti, V, Cr, Mn, Fe, Co, Ni, Cu, Zn, Y, Zr, Mo, Ba, Nd, and Eu. The typical method-to-method dispersion of the abundances varies between 0.10 and 0.20 dex. Conclusions. The Gaia-ESO sample of high-resolution spectra of FGK-type stars will be among the largest of its kind analyzed in a homogeneous way. The extensive list of elemental abundances derived for these stars will enable significant advances in the areas of stellar evolution and Milky Way formation and evolution.
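A sketch of the combination step described above, taking the weighted median of per-method results with benchmark-derived weights; the inputs are hypothetical:

```python
# Weighted median of per-method parameter estimates. The weights would
# come from each method's benchmark performance in the relevant region
# of parameter space; the values here are hypothetical.
import numpy as np

def weighted_median(values, weights):
    order = np.argsort(values)
    v, w = np.asarray(values)[order], np.asarray(weights)[order]
    cum = np.cumsum(w)
    return v[np.searchsorted(cum, 0.5 * cum[-1])]

teff_by_method = [5820., 5755., 5790., 5900.]  # one star, four methods
method_weights = [1.0, 0.5, 1.0, 0.25]         # benchmark-derived weights
print(weighted_median(teff_by_method, method_weights))
```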

Relevance: 90.00%

Abstract:

The goal of my study is to investigate the relationship between selected deictic shields on the pronoun 'I' and the involvement/detachment dichotomy in a sample of television news interviews. I focus on the use of personal pronouns in political discourse. Drawing upon Caffi's (2007) classification of mitigating devices into bushes, hedges, and shields, I focus on deictic shields on the pronoun 'I': I examine the way a selection of 'I'-related deictic shields is employed in a collection of news interviews broadcast during the electoral campaign prior to the UK 2015 General Election. My purpose is to uncover the frequencies of the selected linguistic items and their pragmatic functions within the involvement/detachment dichotomy. The research is structured as follows. Chapter 1 provides an account of previous studies in the three main areas of research: speech event analysis, institutional interaction and the news interview, and the UK 2015 General Election television programmes. Chapter 2 is centred on the involvement/detachment dichotomy: I provide an overview of non-linguistic and linguistic features of involvement and detachment at all levels of sentence structure. Chapter 3 contains a detailed account of the data collection and data analysis process. Chapter 4 provides a detailed description of the results in three steps: quantitative analysis, qualitative analysis, and discussion of the pragmatic functions of the selected linguistic features of involvement and detachment. Chapter 5 includes a brief summary of the investigation, reviews the main findings, and indicates limitations of the study and possible directions for further research. The results of the analysis confirm that, while some of the linguistic items examined point toward involvement, others have a detaching effect. I therefore conclude that deictic shields on the pronoun 'I' permit the realisation of the involvement/detachment dichotomy in the speech genre of the news interview.
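A toy sketch of the quantitative step, counting occurrences of selected 'I'-related deictic shields in transcript text; the shield list and the sample utterance are hypothetical illustrations, not the study's actual inventory or data:

```python
# Frequency counts of candidate deictic shields in a transcript string.
import re
from collections import Counter

shields = ["i think", "i believe", "i mean", "i guess", "i suppose"]
text = "I think we were clear, and I mean that; I believe voters agree."

counts = Counter()
for shield in shields:
    counts[shield] = len(re.findall(r"\b" + shield + r"\b", text.lower()))
print(counts.most_common())
```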

Relevance: 90.00%

Abstract:

Federal Highway Administration, Office of Safety and Traffic Operations, Washington, D.C.

Relevance: 90.00%

Abstract:

Federal Highway Administration, Washington, D.C.