940 results for Methods: Data Analysis


Relevance: 100.00%

Publisher:

Abstract:
The magnetic field in the local interstellar medium (ISM) provides a key indicator of the galactic environment of the Sun and influences the shape of the heliosphere. We have studied the interstellar magnetic field (ISMF) in the solar vicinity using polarized starlight for stars within 40 pc of the Sun and 90 degrees of the heliosphere nose. In Frisch et al. (Paper I), we developed a method for determining the local ISMF direction by finding the best match to a group of interstellar polarization position angles obtained toward nearby stars, based on the assumption that the polarization is parallel to the ISMF. In this paper, we extend the analysis by utilizing weighted fits to the position angles and by including new observations acquired for this study. We find that the local ISMF is pointed toward the galactic coordinates l, b = 47 +/- 20 degrees, 25 +/- 20 degrees. This direction is close to the direction of the ISMF that shapes the heliosphere, l, b = 33 +/- 4 degrees, 55 +/- 4 degrees, as traced by the center of the "Ribbon" of energetic neutral atoms discovered by the Interstellar Boundary Explorer (IBEX) mission. Both the magnetic field direction and the kinematics of the local ISM are consistent with a scenario where the local ISM is a fragment of the Loop I superbubble. A nearby ordered component of the local ISMF has been identified in the region l ~ 0-80 degrees and b ~ 0-30 degrees, where PlanetPol data show a distance-dependent increase of polarization strength. The ordered component extends to within 8 pc of the Sun and implies a weak curvature in the nearby ISMF of +/-0.25 degrees pc^-1. This conclusion is conditioned on the small sample of stars available for defining this rotation. Variations from the ordered component suggest a turbulent component of +/-23 degrees. The ordered component and standard relations between polarization, color excess, and H I column density predict a reasonable increase of N(H) with distance in the local ISM. The similarity of the ISMF directions traced by the polarizations, the IBEX Ribbon, and pulsars inside the Local Bubble in the third galactic quadrant suggests that the ISMF is relatively uniform over spatial scales of 8-200 pc and is more similar to interarm than spiral-arm magnetic fields. The ISMF direction from the polarization data is also consistent with small-scale spatial asymmetries detected in GeV-TeV cosmic rays with a galactic origin. The peculiar geometrical relation found earlier between the cosmic microwave background dipole moment, the heliosphere nose, and the ISMF direction is supported by this study. The interstellar radiation field at ~975 angstrom does not appear to play a role in grain alignment for the low-density ISM studied here.
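Any weighted fit to polarization position angles has to respect their 180-degree ambiguity: a position angle theta and theta + 180 describe the same orientation on the sky. As a minimal, illustrative sketch of one ingredient of such a fit (not the paper's full method; the function name and the uniform weights are hypothetical), the standard doubled-angle statistics for weighted axial data look like this:

```python
import numpy as np

def weighted_axial_mean(theta_deg, weights):
    """Weighted mean of axial data such as polarization position angles,
    which are only defined modulo 180 degrees.

    Angles are doubled (so theta and theta + 180 map to the same point),
    averaged as weighted unit vectors, then halved back."""
    t = np.radians(2.0 * np.asarray(theta_deg, dtype=float))
    w = np.asarray(weights, dtype=float)
    s = np.sum(w * np.sin(t)) / np.sum(w)
    c = np.sum(w * np.cos(t)) / np.sum(w)
    mean_pa = (0.5 * np.degrees(np.arctan2(s, c))) % 180.0
    resultant = np.hypot(s, c)  # 1 = perfectly aligned, 0 = random
    return mean_pa, resultant

# 10 deg and 170 deg are only 20 deg apart as axes; their axial mean is 0 deg.
pa, r = weighted_axial_mean([10.0, 170.0], [1.0, 1.0])
```

A full direction fit would compare such weighted means and dispersions against the position angles predicted by a trial field direction over a grid of galactic coordinates (l, b).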


The Gaia-ESO Survey is a large public spectroscopic survey that aims to derive radial velocities and fundamental parameters of about 10^5 Milky Way stars in the field and in clusters. Observations are carried out with the multi-object optical spectrograph FLAMES, using simultaneously the medium-resolution (R ~ 20 000) GIRAFFE spectrograph and the high-resolution (R ~ 47 000) UVES spectrograph. In this paper we describe the methods and the software used for the data reduction, the derivation of the radial velocities, and the quality control of the FLAMES-UVES spectra. Data reduction has been performed using a workflow specifically developed for this project. This workflow runs the ESO public pipeline, optimizing the data reduction for the Gaia-ESO Survey; it automatically performs sky subtraction, barycentric correction, and normalisation, and calculates radial velocities and a first guess of the rotational velocities. The quality control is performed using the output parameters from the ESO pipeline, by a visual inspection of the spectra, and by analysis of the signal-to-noise ratio of the spectra. Using the observations of the first 18 months, specifically targets observed multiple times at different epochs, stars observed with both GIRAFFE and UVES, and observations of radial velocity standards, we estimated the precision and the accuracy of the radial velocities. The statistical error on the radial velocities is σ ~ 0.4 km s^-1 and is mainly due to uncertainties in the zero point of the wavelength calibration. However, we found a systematic bias with respect to the GIRAFFE spectra (~0.9 km s^-1) and to the radial velocities of the standard stars (~0.5 km s^-1) retrieved from the literature. This bias will be corrected in future data releases, once a common zero point for all the set-ups and instruments used for the survey has been established.
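The σ ~ 0.4 km s^-1 statistical error is estimated from targets observed at multiple epochs. A minimal sketch of that idea on synthetic data (not the survey pipeline), assuming each epoch carries the same independent measurement error:

```python
import numpy as np

def rv_precision_from_repeats(rv_epoch1, rv_epoch2):
    """Per-measurement RV precision from pairs of repeat observations.

    If each epoch has the same independent error sigma, the epoch-to-epoch
    differences have standard deviation sigma * sqrt(2)."""
    diff = np.asarray(rv_epoch1, float) - np.asarray(rv_epoch2, float)
    return np.std(diff, ddof=1) / np.sqrt(2.0)

# synthetic demo: constant true RVs, 0.4 km/s Gaussian error per epoch
rng = np.random.default_rng(42)
e1 = rng.normal(0.0, 0.4, 5000)
e2 = rng.normal(0.0, 0.4, 5000)
sigma_hat = rv_precision_from_repeats(e1, e2)
```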


National Highway Traffic Safety Administration, Washington, D.C.


We present the largest catalogue to date of optical counterparts for H I radio-selected galaxies, HOPCAT. Of the 4315 H I radio-detected sources from the H I Parkes All Sky Survey (HIPASS) catalogue, we find optical counterparts for 3618 (84 per cent) galaxies. Of these, 1798 (42 per cent) have confirmed optical velocities and 848 (20 per cent) are single matches without confirmed velocities. Some galaxy matches are members of galaxy groups. From these multiple galaxy matches, 714 (16 per cent) have confirmed optical velocities and a further 258 (6 per cent) galaxies are without confirmed velocities. For 481 (11 per cent), multiple galaxies are present but no single optical counterpart can be chosen and 216 (5 per cent) have no obvious optical galaxy present. Most of these 'blank fields' are in crowded fields along the Galactic plane or have high extinctions. Isolated 'dark galaxy' candidates are investigated using an extinction cut of A(Bj) < 1 mag and the blank-fields category. Of the 3692 galaxies with an A(Bj) extinction < 1 mag, only 13 are also blank fields. Of these, 12 are eliminated either with follow-up Parkes observations or are in crowded fields. The remaining one has a low surface brightness optical counterpart. Hence, no isolated optically dark galaxies have been found within the limits of the HIPASS survey.
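The 'dark galaxy' candidate selection combines two cuts: low foreground extinction (A_Bj < 1 mag, so obscuration cannot explain the missing counterpart) and membership in the blank-fields category. A toy sketch with hypothetical columns for a HIPASS-like match table:

```python
import numpy as np

# Hypothetical columns: B_j-band extinction in magnitudes, and a flag
# marking 'blank fields' (H I detections with no optical counterpart).
a_bj = np.array([0.3, 2.1, 0.8, 0.1, 1.5])
blank_field = np.array([True, True, False, True, False])

# Isolated dark-galaxy candidates: low extinction AND no optical galaxy.
candidates = (a_bj < 1.0) & blank_field
```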


We present an application of Mathematical Morphology (MM) for the classification of astronomical objects, both for star/galaxy differentiation and for galaxy morphology classification. We demonstrate that, for CCD images, 99.3 +/- 3.8% of galaxies can be separated from stars using MM, with 19.4 +/- 7.9% of the stars being misclassified. We demonstrate that, for photographic plate images, the number of galaxies correctly separated from the stars can be increased using our MM diffraction spike tool, which allows 51.0 +/- 6.0% of the high-brightness galaxies that are inseparable with current techniques to be correctly classified, with only 1.4 +/- 0.5% of the high-brightness stars contaminating the population. We demonstrate that elliptical (E) and late-type spiral (Sc-Sd) galaxies can be classified using MM with an accuracy of 91.4 +/- 7.8%. The method involves fewer 'free parameters' than current techniques, especially automated machine-learning algorithms. The limitations of MM galaxy morphology classification imposed by seeing and distance are also presented. We examine various star/galaxy differentiation and galaxy morphology classification techniques commonly used today, and show that our MM techniques compare very favourably.
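The core MM idea is that a morphological opening erases features smaller than the structuring element, separating compact (star-like) from extended (galaxy-like) detections. A toy sketch using scipy.ndimage on a synthetic segmentation map (the blob sizes and the 3x3 element are illustrative choices, not the paper's parameters):

```python
import numpy as np
from scipy.ndimage import binary_opening

# Toy segmentation map: True = detected pixel.
img = np.zeros((20, 20), dtype=bool)
img[2:4, 2:4] = True      # compact source ("star"): 2x2 pixels
img[8:16, 8:16] = True    # extended source ("galaxy"): 8x8 pixels

# Opening (erosion then dilation) with a 3x3 structuring element removes
# any feature smaller than the element; what survives is extended.
selem = np.ones((3, 3), dtype=bool)
galaxies = binary_opening(img, structure=selem)
stars = img & ~galaxies
```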


Photometry of moving sources typically suffers from a reduced signal-to-noise ratio (S/N) or from flux measurements biased to incorrectly low values through the use of circular apertures. To address this issue, we present the software package TRIPPy: TRailed Image Photometry in Python. TRIPPy introduces the pill aperture, the natural extension of the circular aperture appropriate for linearly trailed sources. The pill shape is a rectangle with two semicircular end-caps and is described by three parameters: the trail length, the trail angle, and the radius. The TRIPPy software package also includes a new technique to generate accurate model point-spread functions (PSFs) and trailed PSFs (TSFs) from stationary background sources in sidereally tracked images. The TSF is simply the convolution of the model PSF, which consists of a Moffat profile and a super-sampled lookup table. From the TSF, accurate pill aperture corrections can be estimated as a function of pill radius, with an accuracy of 10 mmag for highly trailed sources. Analogous to the use of small circular apertures and associated aperture corrections, small-radius pill apertures can be used to preserve the S/N of low-flux sources, with the appropriate aperture correction applied to provide an accurate, unbiased flux measurement at all S/Ns.
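The pill aperture is defined purely geometrically: every pixel within one radius of the trail segment. A minimal sketch of such a mask (pixel-center sampling with no sub-pixel weighting, which a real photometry package would handle more carefully):

```python
import numpy as np

def pill_mask(shape, p0, p1, radius):
    """Boolean mask of a pill aperture: all pixels whose centers lie
    within `radius` of the segment p0-p1 (a rectangle capped by two
    semicircles). p0 and p1 are (x, y) and must differ."""
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    p0 = np.asarray(p0, dtype=float)
    p1 = np.asarray(p1, dtype=float)
    d = p1 - p0
    # parameter of the closest point on the segment, clipped to [0, 1]
    t = ((xx - p0[0]) * d[0] + (yy - p0[1]) * d[1]) / (d @ d)
    t = np.clip(t, 0.0, 1.0)
    dist = np.hypot(xx - (p0[0] + t * d[0]), yy - (p0[1] + t * d[1]))
    return dist <= radius

# a trail from (3, 5) to (7, 5) with a 2-pixel pill radius
mask = pill_mask((11, 11), (3, 5), (7, 5), 2.0)
```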


Aims: Several studies suggest that the activity level of a planet-host star can be influenced by the presence of a close-by orbiting planet. Moreover, the interaction mechanisms that have been proposed, magnetic interaction and tidal interaction, exhibit a very different dependence on the orbital separation between the star and the planet. A detection of activity enhancement and characterization of its dependence on planetary orbital distance can, in principle, allow us to characterize the physical mechanism behind the activity enhancement. Methods: We used the HARPS-N spectrograph to measure the stellar activity level of HD 80606 during the planetary periastron passage and compared the activity measured to that close to apastron. Being characterized by an eccentricity of 0.93 and an orbital period of 111 days, the system's extreme variation in orbital separation makes it a perfect target to test our hypothesis. Results: We find no evidence for a variation in the activity level of the star as a function of planetary orbital distance, as measured by all activity indicators employed: log(R'HK), Hα, NaI, and HeI. None of the models employed, whether magnetic interaction or tidal interaction, provides a good description of the data. The photometry revealed no variation either, but it was strongly affected by poor weather conditions. Conclusions: We find no evidence for star-planet interaction in HD 80606 at the moment of the periastron passage of its very eccentric planet. The straightforward explanation for the non-detection is the absence of interaction as a result of a low magnetic field strength on either the planet or the star and of the low level of tidal interaction between the two. However, we cannot exclude two scenarios: i) the interaction can be instantaneous and of magnetic origin, being concentrated on the substellar point and its surrounding area; and ii) the interaction can lead to a delayed activity enhancement. 
In either scenario, a star-planet interaction would not be detectable with the dataset described in this paper.
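The "extreme variation in orbital separation" is easy to quantify. Assuming a tidal forcing that scales roughly as 1/d^3 (a standard approximation, not a model from this paper), the eccentricity e = 0.93 implies an enormous periastron-to-apastron contrast:

```python
# Periastron/apastron geometry for HD 80606 b (e = 0.93, from the text).
e = 0.93
d_peri_over_a = 1.0 - e   # periastron distance in units of the semi-major axis
d_apo_over_a = 1.0 + e    # apastron distance in the same units

# With tidal forcing ~ 1/d^3, the expected periastron/apastron contrast is:
tidal_contrast = (d_apo_over_a / d_peri_over_a) ** 3  # roughly 2 x 10^4
```

This is why a null result at periastron is a meaningful constraint: any d^-3 interaction should be four orders of magnitude stronger there than near apastron.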


Background: Meta-analyses based on individual patient data (IPD) are regarded as the gold standard for systematic reviews. However, the methods used for analysing and presenting results from IPD meta-analyses have received little discussion. Methods: We review 44 IPD meta-analyses published during the years 1999-2001. We summarize whether they obtained all the data they sought, what types of approaches were used in the analysis, including assumptions of common or random effects, and how they examined the effects of covariates. Results: Twenty-four of the 44 analyses focused on time-to-event outcomes, and most analyses (28) estimated treatment effects within each trial and then combined the results assuming a common treatment effect across trials. Three analyses failed to stratify by trial, analysing the data as if they came from a single mega-trial. Only nine analyses used random-effects methods. Covariate-treatment interactions were generally investigated by subgrouping patients. Seven of the meta-analyses included data from fewer than 80% of the randomized patients sought, but did not address the resulting potential biases. Conclusions: Although IPD meta-analyses have many advantages in assessing the effects of health care, several aspects could be further developed to make fuller use of the potential of these time-consuming projects. In particular, IPD could be used to investigate more fully the influence of covariates on heterogeneity of treatment effects, both within and between trials. The impact of heterogeneity, and the use of random effects, are seldom discussed. There is thus considerable scope for enhancing the methods of analysis and presentation of IPD meta-analysis.
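The dominant approach in the reviewed analyses — estimate a treatment effect within each trial, then combine assuming a common effect — is the classical inverse-variance fixed-effect combination. A minimal sketch (function name hypothetical):

```python
import numpy as np

def fixed_effect_combine(effects, ses):
    """Inverse-variance weighted combination of per-trial treatment
    effects under a common-effect assumption (the two-stage approach
    most of the reviewed meta-analyses used)."""
    effects = np.asarray(effects, dtype=float)
    w = 1.0 / np.asarray(ses, dtype=float) ** 2   # weights = 1 / variance
    pooled = np.sum(w * effects) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    return pooled, pooled_se

# two equally precise trials: the pooled effect is their simple average
pooled, se = fixed_effect_combine([1.0, 3.0], [1.0, 1.0])
```

A random-effects version would add a between-trial variance term to each weight, which is exactly the modelling choice the review found to be rare.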


Quantitative real-time polymerase chain reaction (qPCR) is a sensitive gene quantitation method that has been widely used in the biological and biomedical fields. The currently used methods for qPCR data analysis, including the threshold cycle (CT) method and linear and non-linear model-fitting methods, all require subtracting background fluorescence. However, the removal of background fluorescence is usually inaccurate and can therefore distort results. Here, we propose a new method, the taking-difference linear regression method, to overcome this limitation. Briefly, for each two consecutive PCR cycles, we subtracted the fluorescence in the former cycle from that in the latter cycle, transforming the n-cycle raw data into (n-1)-cycle data. Linear regression was then applied to the natural logarithm of the transformed data. Finally, amplification efficiencies and initial DNA molecule numbers were calculated for each PCR run. To evaluate this new method, we compared it in terms of accuracy and precision with the original linear regression method under three background corrections: the mean of cycles 1-3, the mean of cycles 3-7, and the minimum. Three criteria, namely threshold identification, max R2, and max slope, were employed to search for target data points. Considering that PCR data are time-series data, we also applied linear mixed models. Collectively, when the threshold identification criterion was applied and when the linear mixed model was adopted, the taking-difference linear regression method was superior, as it gave an accurate estimation of the initial DNA amount and a reasonable estimation of the PCR amplification efficiencies. When the criteria of max R2 and max slope were used, the original linear regression method gave an accurate estimation of the initial DNA amount. Overall, the taking-difference linear regression method avoids the error of subtracting an unknown background and is thus theoretically more accurate and reliable. This method is easy to perform, and the taking-difference strategy can be extended to all current methods for qPCR data analysis.
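The taking-difference idea can be sketched directly from the description above: consecutive differences cancel an additive background, leaving a pure exponential whose logarithm is linear in cycle number. This is a noise-free toy; real use would first restrict the fit to exponential-phase cycles, e.g. via threshold identification:

```python
import numpy as np

def taking_difference_fit(fluor):
    """Taking-difference linear regression for qPCR (sketch).

    For F_n = F0 * E**n + background, the consecutive differences
    D_n = F_{n+1} - F_n = F0 * E**n * (E - 1) are background-free,
    so ln(D_n) is linear in n with slope ln(E)."""
    fluor = np.asarray(fluor, dtype=float)
    n = np.arange(fluor.size - 1)
    d = np.diff(fluor)                     # background cancels here
    slope, intercept = np.polyfit(n, np.log(d), 1)
    eff = np.exp(slope)                    # amplification efficiency E
    f0 = np.exp(intercept) / (eff - 1.0)   # initial fluorescence ~ DNA amount
    return eff, f0

# noise-free toy run: F0 = 2, E = 1.9, constant background = 5
cycles = np.arange(12)
f = 2.0 * 1.9 ** cycles + 5.0
eff, f0 = taking_difference_fit(f)
```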


In this study, regression models are evaluated for grouped survival data when the effect of censoring time is considered in the model and the regression structure is modeled through four link functions. The methodology for grouped survival data is based on life tables, and the times are grouped into k intervals so that ties are eliminated. Thus, the data modeling is performed by considering discrete lifetime regression models. The model parameters are estimated using the maximum likelihood and jackknife methods. To detect influential observations in the proposed models, we use diagnostic measures based on case deletion, termed global influence, and influence measures based on small perturbations of the data or of the model, referred to as local influence; the total local influence estimate is also employed. Various simulation studies are performed to compare the performance of the four link functions of the regression models for grouped survival data under different parameter settings, sample sizes, and numbers of intervals. Finally, a data set is analyzed using the proposed regression models. (C) 2010 Elsevier B.V. All rights reserved.
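Grouping the times into k intervals for a life table is the starting point of this methodology. A minimal actuarial life-table sketch (the half-exposure convention for censored cases is a common textbook choice, not necessarily the one used here):

```python
import numpy as np

def life_table(times, events, edges):
    """Actuarial life-table sketch: group survival times into intervals
    and estimate the discrete hazard in each interval.

    times  : observed times (event or censoring)
    events : 1 = event occurred, 0 = censored
    edges  : interval boundaries, length k + 1
    """
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    hazards = []
    at_risk = len(times)
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_interval = (times >= lo) & (times < hi)
        d = int(np.sum(events[in_interval]))   # events in interval
        c = int(np.sum(in_interval)) - d       # censored in interval
        # actuarial convention: a censored case counts as half an exposure
        n_eff = at_risk - c / 2.0
        hazards.append(d / n_eff if n_eff > 0 else 0.0)
        at_risk -= d + c
    return hazards

h = life_table([1, 2, 3, 4], [1, 1, 0, 1], [0, 2.5, 5])
```

A discrete-time regression model would then relate these interval hazards to covariates through a link function such as the logit or complementary log-log.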


Functional magnetic resonance imaging (fMRI) based on the BOLD signal has been used to indirectly measure the local neural activity induced by cognitive tasks or stimulation. Most fMRI data analysis is carried out using the general linear model (GLM), a statistical approach that predicts the changes in the observed BOLD response based on an expected hemodynamic response function (HRF). When the task is cognitively complex, or in cases of disease, variations in the shape and/or delay of the response may reduce the reliability of the results. This paper introduces a novel exploratory method for fMRI data, which attempts to discriminate neurophysiological signals induced by the stimulation protocol from artifacts and other confounding factors. The new method is based on the fusion of correlation analysis and the discrete wavelet transform, and identifies similarities in the time course of the BOLD signal across a group of volunteers. We illustrate the usefulness of this approach by analyzing fMRI data from normal subjects presented with standardized pictures of human faces expressing different degrees of sadness. The results show that the proposed wavelet correlation analysis has greater statistical power than conventional GLM or time-domain intersubject correlation analysis. (C) 2010 Elsevier B.V. All rights reserved.
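The fusion of wavelet decomposition with correlation analysis can be sketched with a one-level Haar transform: two subjects' BOLD time courses are compared band by band rather than sample by sample (Haar is chosen here purely for simplicity; the paper's wavelet choice may differ):

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar discrete wavelet transform: returns
    (approximation, detail) coefficients with orthonormal scaling."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # low-pass (slow trends)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # high-pass (fast changes)
    return a, d

def wavelet_correlation(x, y):
    """Correlate two time courses scale by scale: similarity is measured
    within each wavelet band instead of in the raw time domain."""
    ax, dx = haar_dwt(x)
    ay, dy = haar_dwt(y)
    return np.corrcoef(ax, ay)[0, 1], np.corrcoef(dx, dy)[0, 1]

# two "subjects" whose responses differ only by gain and offset
t = np.arange(16, dtype=float)
x = np.sin(t)
y = 2.0 * x + 3.0
corr_approx, corr_detail = wavelet_correlation(x, y)
```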


Objectives: The purpose of this article is to identify differences between surveys conducted with paper questionnaires and those conducted online. The author draws on long experience with survey-based research, e.g. the limitations of postal and online questionnaires. Methods: Paper and online questionnaires were used in physician studies carried out in 1995 (doctors graduated in 1982-1991), 2000 (doctors graduated in 1982-1996), 2005 (doctors graduated in 1982-2001), and 2011 (doctors graduated in 1977-2006), and in a study of 457 family doctors in 2000. The response rates were 64%, 68%, 64%, 49%, and 73%, respectively. Results: The physician studies showed differences between the methods, connected with the use of paper-based versus online questionnaires and with the response rate. The online survey gave a lower response rate than the postal survey. The major advantages of the online survey were the short response time, the very low cost, and the fact that the data were loaded directly into the data-analysis software, saving the time and resources associated with the data entry process. Conclusions: This article helps researchers plan the study design and choose the right data collection method.


3rd SMTDA Conference Proceedings, 11-14 June 2014, Lisbon, Portugal.


Seismic data are difficult to analyze, and classical mathematical tools reveal strong limitations in exposing hidden relationships between earthquakes. In this paper, we study earthquake phenomena from the perspective of complex systems. Global seismic data covering the period from 1962 to 2011 are analyzed. The events, characterized by their magnitude, geographic location, and time of occurrence, are divided into groups, either according to the Flinn-Engdahl (F-E) seismic regions of the Earth or using a rectangular grid based on latitude and longitude coordinates. Two methods of analysis are considered and compared in this study. In the first method, the distributions of magnitudes are approximated by Gutenberg-Richter (G-R) distributions, and the fitted parameters are used to reveal the relationships among regions. In the second method, the mutual information is calculated and adopted as a measure of similarity between regions. In both cases, clustering analysis is used to generate visualization maps, providing an intuitive and useful representation of the complex relationships present among seismic data. Such relationships might not be perceived on classical geographic maps, so the generated charts are a valid alternative to other visualization tools for understanding the global behavior of earthquakes.
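Fitting a Gutenberg-Richter distribution to a region's magnitudes reduces, in the simplest case, to estimating the b-value. A sketch using Aki's maximum-likelihood estimator on synthetic data (completeness-magnitude handling is simplified here; it is not necessarily the fitting procedure used in the paper):

```python
import numpy as np

def gr_b_value(mags, m_min):
    """Maximum-likelihood Gutenberg-Richter b-value (Aki's estimator):
    b = log10(e) / (mean(M) - m_min), using magnitudes M >= m_min."""
    m = np.asarray(mags, dtype=float)
    m = m[m >= m_min]
    return np.log10(np.e) / (m.mean() - m_min)

# synthetic catalogue: G-R with b = 1 corresponds to magnitude excesses
# that are exponential with rate b * ln(10)
rng = np.random.default_rng(1)
mags = 2.0 + rng.exponential(scale=1.0 / np.log(10.0), size=20000)
b = gr_b_value(mags, 2.0)
```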


OBJECTIVE: To develop an assessment tool to evaluate the efficiency of federal university general hospitals. METHODS: Data envelopment analysis, a linear programming technique, creates a best-practice frontier by comparing observed production given the amount of resources used. The model is output-oriented and considers variable returns to scale. Network data envelopment analysis considers link variables belonging to more than one dimension (in the model: medical residents, adjusted admissions, and research projects). Dynamic network data envelopment analysis uses carry-over variables (in the model: financing budget) to analyze frontier shifts in subsequent years. Data were gathered from the information system of the Brazilian Ministry of Education (MEC), 2010-2013. RESULTS: The mean scores for health care, teaching, and research over the period were 58.0%, 86.0%, and 61.0%, respectively. In 2012, the best-performing year, for all units to reach the frontier it would be necessary to have a mean increase of 65.0% in outpatient visits, 34.0% in admissions, 12.0% in undergraduate students, 13.0% in multi-professional residents, 48.0% in graduate students, and 7.0% in research projects, besides a decrease of 9.0% in medical residents. In the same year, an increase of 0.9% in the financing budget would be necessary to improve the care output frontier. In the dynamic evaluation, there was progress in teaching efficiency, oscillation in medical care, and no variation in research. CONCLUSIONS: The proposed model generates public health planning and programming parameters by estimating efficiency scores and making projections to reach the best-practice frontier.
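The core building block — an output-oriented DEA model with variable returns to scale — can be posed as one linear program per hospital. A minimal sketch with scipy, covering the basic envelopment form only (the network and dynamic extensions used in the paper are beyond this sketch; variable and function names are hypothetical):

```python
import numpy as np
from scipy.optimize import linprog

def dea_output_vrs(X, Y, o):
    """Output-oriented DEA efficiency (variable returns to scale,
    envelopment form) for decision-making unit `o`.

    X: (n_units, n_inputs), Y: (n_units, n_outputs).
    Returns phi >= 1; phi = 1 means the unit is on the frontier, and
    phi > 1 is the factor by which all its outputs could be expanded."""
    n = X.shape[0]
    # decision variables: [phi, lambda_1, ..., lambda_n]; maximize phi
    c = np.r_[-1.0, np.zeros(n)]
    A_ub, b_ub = [], []
    for i in range(X.shape[1]):      # inputs: sum_j lam_j x_ij <= x_io
        A_ub.append(np.r_[0.0, X[:, i]])
        b_ub.append(X[o, i])
    for r in range(Y.shape[1]):      # outputs: phi * y_ro <= sum_j lam_j y_rj
        A_ub.append(np.r_[Y[o, r], -Y[:, r]])
        b_ub.append(0.0)
    A_eq = [np.r_[0.0, np.ones(n)]]  # VRS: lambdas sum to 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (n + 1))
    return res.x[0]

# toy example: two hospitals with equal input, unit 0 producing twice
# the output of unit 1 -> unit 0 is efficient, unit 1 could double output
X = np.array([[1.0], [1.0]])
Y = np.array([[2.0], [1.0]])
phi0 = dea_output_vrs(X, Y, 0)
phi1 = dea_output_vrs(X, Y, 1)
```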