873 results for Data acquisition.
Abstract:
The Laser Interferometer Gravitational-Wave Observatory (LIGO) consists of two complex large-scale laser interferometers designed for the direct detection of gravitational waves from distant astrophysical sources in the frequency range 10 Hz - 5 kHz. Direct detection of space-time ripples will support Einstein's general theory of relativity and provide invaluable information and new insight into the physics of the Universe.
The initial phase of LIGO started in 2002, and since then data have been collected during six science runs. The instrument sensitivity improved from run to run thanks to the efforts of the commissioning team. Initial LIGO reached its design sensitivity during the last science run, which ended in October 2010.
In parallel with commissioning and data analysis with the initial detector, the LIGO group worked on research and development of the next generation of detectors. The major instrument upgrade from initial to Advanced LIGO started in 2010 and lasted until 2014.
This thesis describes the results of commissioning work done at the LIGO Livingston site from 2013 until 2015, in parallel with and after the installation of the instrument. It also discusses new techniques and tools developed at the 40m prototype, including adaptive filtering, estimation of quantization noise in digital filters, and the design of isolation kits for ground seismometers.
The first part of this thesis is devoted to methods for bringing the interferometer to the linear regime, where the collection of data becomes possible. The states of the longitudinal and angular controls of the interferometer degrees of freedom during the lock acquisition process and in the low-noise configuration are discussed in detail.
Once the interferometer is locked and transitioned to the low-noise regime, the instrument produces astrophysical data that must be calibrated in units of meters or strain. The second part of this thesis describes the online calibration technique set up at both observatories to monitor the quality of the collected data in real time. A sensitivity analysis was done to understand and eliminate noise sources of the instrument.
The coupling of noise sources to the gravitational wave channel can be reduced if robust feedforward and optimal feedback control loops are implemented. The last part of this thesis describes static and adaptive feedforward noise cancellation techniques applied to the Advanced LIGO interferometers and tested at the 40m prototype. Applications of optimal time-domain feedback control techniques and estimators to aLIGO control loops are also discussed.
Commissioning work is still ongoing at the sites. The first science run of Advanced LIGO is planned for September 2015 and will last for 3-4 months. This run will be followed by a set of small instrument upgrades that will be installed on a time scale of a few months. The second science run will start in spring 2016 and last for about six months. Since the current sensitivity of Advanced LIGO is already more than a factor of 3 better than that of the initial detectors and keeps improving on a monthly basis, the upcoming science runs have a good chance of making the first direct detection of gravitational waves.
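Adaptive feedforward cancellation of the kind this abstract describes is commonly built on LMS-style adaptive filters. The Python sketch below illustrates the idea on synthetic data; the witness channel, coupling coefficients, tap count, and step size are invented for illustration and are not aLIGO parameters.

```python
import numpy as np

def lms_feedforward(witness, target, n_taps=32, mu=0.005):
    """Adapt an FIR filter so the filtered witness channel cancels
    the correlated component of the target channel (LMS update)."""
    w = np.zeros(n_taps)
    residual = np.zeros_like(target)
    for n in range(n_taps - 1, len(target)):
        x = witness[n - n_taps + 1:n + 1][::-1]  # newest sample first
        y = w @ x                                # feedforward estimate
        e = target[n] - y                        # residual after subtraction
        w += 2 * mu * e * x                      # LMS weight update
        residual[n] = e
    return residual, w

# Synthetic example: a "seismic" witness couples linearly into the target.
rng = np.random.default_rng(0)
witness = rng.standard_normal(20000)
coupling = np.array([0.5, 0.3, 0.1])                # invented coupling filter
target = np.convolve(witness, coupling)[:len(witness)]
target += 0.05 * rng.standard_normal(len(witness))  # uncorrelated sensor noise

residual, weights = lms_feedforward(witness, target)
print(np.std(target), np.std(residual[5000:]))      # residual RMS drops sharply
```

After convergence the learned taps approximate the coupling filter and the residual approaches the uncorrelated noise floor; in practice such a filter would run on witness sensors (e.g. seismometers) against the interferometer control signals.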
Abstract:
Due to concerns about environmental protection and resource utilization, product lifecycle management for end-of-life (EOL) has received increasing attention in many industrial sectors, including manufacturing, maintenance/repair, and recycling/refurbishing of the product. To support these functions, crucial issues are studied to realize a product recovery management system (PRMS), including: (1) an architecture design for EOL services, such as remanufacturing and recycling; (2) a product data model required for EOL activity based on international standards; and (3) an infrastructure for information acquisition and mapping to product lifecycle information. The presented work is illustrated via a realistic scenario.
Abstract:
A 2.5-D and 3-D multi-fold GPR survey was carried out in the Archaeological Park of Aquileia (northern Italy). The primary objective of the study was the identification of targets of potential archaeological interest in an area designated by the local archaeological authorities. The second, geophysical objective was to test 2-D and 3-D multi-fold methods and to study localised targets of unknown shape and dimensions in hostile soil conditions. Several portions of the acquisition grid were processed in common offset (CO), common shot (CSG) and common midpoint (CMP) geometry. An 8×8 m area was studied with orthogonal CMPs, thus achieving 3-D subsurface coverage with an azimuthal range limited to two normal components. Coherent noise components were identified in the pre-stack domain and removed by means of FK filtering of the CMP records. Stack velocities were obtained from conventional velocity analysis and azimuthal velocity analysis of 3-D pre-stack gathers. Two major discontinuities were identified in the area of study. The deeper one most probably coincides with the paleosol at the base of the layer associated with human activity in the area over the last 2500 years. This interpretation is in agreement with the results obtained from nearby cores and excavations. The shallower discontinuity is observed in part of the investigated area and shows local interruptions with a linear distribution on the grid. Such interruptions may correspond to buried targets of archaeological interest. The prominent enhancement of the subsurface images obtained by means of multi-fold techniques, compared with the relatively poor quality of the conventional single-fold georadar sections, indicates that multi-fold methods are well suited to high-resolution studies in archaeology.
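The stack velocities mentioned in this abstract come from fitting the normal-moveout (NMO) hyperbola t(x) = sqrt(t0^2 + x^2/v^2) to CMP travel times. A minimal Python sketch of that fit, using invented antenna offsets and a typical dry-soil GPR velocity rather than the survey's values:

```python
import numpy as np

# Invented CMP gather geometry: antenna offsets (m) and two-way times (s).
offsets = np.array([0.5, 1.0, 1.5, 2.0, 2.5])
t0_true = 40e-9          # 40 ns zero-offset two-way time (assumed)
v_true = 1.0e8           # 0.1 m/ns, a typical dry-soil GPR velocity
times = np.sqrt(t0_true**2 + (offsets / v_true)**2)

# The NMO relation t(x)^2 = t0^2 + x^2 / v^2 is linear in x^2,
# so a least-squares line through (x^2, t^2) recovers v and t0.
slope, intercept = np.polyfit(offsets**2, times**2, 1)
v_stack = 1.0 / np.sqrt(slope)
t0_est = np.sqrt(intercept)
print(v_stack, t0_est)
```

Real velocity analysis scans a range of trial velocities and picks the one that maximises stack coherence, but the hyperbolic moveout relation being fitted is the same.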
A Diffusion MRI Tractography Connectome of the Mouse Brain and Comparison with Neuronal Tracer Data.
Abstract:
Interest in structural brain connectivity has grown with the understanding that abnormal neural connections may play a role in neurologic and psychiatric diseases. Small animal connectivity mapping techniques are particularly important for identifying aberrant connectivity in disease models. Diffusion magnetic resonance imaging tractography can provide nondestructive, 3D, brain-wide connectivity maps, but has historically been limited by low spatial resolution, low signal-to-noise ratio, and the difficulty of estimating multiple fiber orientations within a single image voxel. Small animal diffusion tractography can be substantially improved through the combination of ex vivo MRI with exogenous contrast agents, advanced diffusion acquisition and reconstruction techniques, and probabilistic fiber tracking. Here, we present a comprehensive, probabilistic tractography connectome of the mouse brain at microscopic resolution, and a comparison of these data with neuronal tracer-based connectivity data from the Allen Brain Atlas. This work serves as a reference database for future tractography studies in the mouse brain, and demonstrates the fundamental differences between tractography and neuronal tracer data.
Abstract:
In this experiment, we examined the extent to which the spatiotemporal reorganization of muscle synergies mediates skill acquisition in a two degree-of-freedom (df) target-acquisition task. Eight participants completed five practice sessions on consecutive days. During each session they practiced movements to eight target positions presented on a visual display. The movements required combinations of flexion/extension and pronation/supination of the elbow joint complex. During practice sessions, eight targets displaced 5.4 cm from the start position (representing joint excursions of 54 degrees) were presented 16 times. During pre- and posttests, participants acquired the targets at two distances (3.6 cm [36 degrees] and 7.2 cm [72 degrees]). EMG data were recorded from eight muscles contributing to the movements during the pre- and posttests. Most targets were acquired more rapidly after the practice period. Performance improvements were, in most target directions, accompanied by increases in the smoothness of the movement trajectories. When target acquisition required movement in both dfs, there were also practice-related decreases in the extent to which the trajectories deviated from a direct path to the target. The contribution of monofunctional muscles (those producing torque in a single df) increased with practice during movements in which they acted as agonists. The activity in bifunctional muscles (those contributing torque in both dfs) remained at pretest levels in most movements. The results suggest that performance gains were mediated primarily by changes in the spatial organization of muscle synergies. These changes were expressed most prominently in the magnitude of activation of the monofunctional muscles.
Abstract:
BACKGROUND: Published work assessing psychosocial stress (job strain) as a risk factor for coronary heart disease is inconsistent and subject to publication bias and reverse causation bias. We analysed the relation between job strain and coronary heart disease with a meta-analysis of published and unpublished studies. METHODS: We used individual records from 13 European cohort studies (1985-2006) of men and women without coronary heart disease who were employed at the time of baseline assessment. We measured job strain with questions from validated job-content and demand-control questionnaires. We extracted data in two stages, such that acquisition and harmonisation of the job strain measure and covariables occurred before linkage to records for coronary heart disease. We defined incident coronary heart disease as the first non-fatal myocardial infarction or coronary death. FINDINGS: 30,214 (15%) of 197,473 participants reported job strain. In 1.49 million person-years at risk (mean follow-up 7.5 years [SD 1.7]), we recorded 2358 events of incident coronary heart disease. After adjustment for sex and age, the hazard ratio for job strain versus no job strain was 1.23 (95% CI 1.10-1.37). This effect estimate was higher in published (1.43, 1.15-1.77) than in unpublished (1.16, 1.02-1.32) studies. Hazard ratios were likewise raised in analyses addressing reverse causality by exclusion of events of coronary heart disease that occurred in the first 3 years (1.31, 1.15-1.48) and 5 years (1.30, 1.13-1.50) of follow-up. We noted an association between job strain and coronary heart disease across sexes, age groups, socioeconomic strata, and regions, and after adjustment for socioeconomic status, lifestyle, and conventional risk factors. The population attributable risk for job strain was 3.4%.
INTERPRETATION: Our findings suggest that prevention of workplace stress might decrease disease incidence; however, this strategy would have a much smaller effect than would tackling of standard risk factors, such as smoking. FUNDING: Finnish Work Environment Fund, the Academy of Finland, the Swedish Research Council for Working Life and Social Research, the German Social Accident Insurance, the Danish National Research Centre for the Working Environment, the BUPA Foundation, the Ministry of Social Affairs and Employment, the Medical Research Council, the Wellcome Trust, and the US National Institutes of Health.
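The population attributable risk quoted in the Findings follows from Levin's formula, PAR = Pe(HR - 1) / (1 + Pe(HR - 1)), where Pe is the exposure prevalence. A quick check in Python with the reported prevalence (15%) and adjusted hazard ratio (1.23); the small gap from the published 3.4% presumably reflects rounding of the published inputs.

```python
# Levin's formula for population attributable risk (PAR),
# using the prevalence and hazard ratio reported in the abstract.
prevalence = 0.15    # proportion of participants reporting job strain
hazard_ratio = 1.23  # age- and sex-adjusted estimate

excess = prevalence * (hazard_ratio - 1)
par = excess / (1 + excess)
print(f"PAR = {par:.1%}")   # about 3.3%, close to the reported 3.4%
```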
Abstract:
Burkholderia cepacia complex organisms are important transmissible pathogens found in cystic fibrosis (CF) patients. In recent years, the rates of cross-infection with epidemic strains have declined due to effective infection control efforts. However, cases of sporadic B. cepacia complex infection continue to occur in some centers. The acquisition pathways and clinical outcomes of sporadic B. cepacia complex infection are unclear. We sought to determine the patient clinical characteristics, outcomes, incidence, and genotypic relatedness for all cases of B. cepacia complex infection at two CF centers. We also sought to study the external conditions that influence the acquisition of infection. From 2001 to 2011, 67 individual organisms were cultured from the respiratory samples of 64 patients. Sixty-five percent of the patients were adults, in whom chronic infections were more common (68%) (P = 0.006). The incidence of B. cepacia complex infection increased by a mean of 12% (95% confidence interval [CI], 3 to 23%) per year. The rates of transplantation and death in incident cases who developed chronic infection were similar to those in patients with chronic Pseudomonas aeruginosa infection. Multilocus sequence typing revealed 50 individual strains among 65 isolates. Overall, 85% of the patients were infected with unique strains, suggesting sporadic acquisition of infection. The yearly incidence of nonepidemic B. cepacia complex infection was positively correlated with the amount of rainfall at the two sites examined: subtropical Brisbane (r = 0.65, P = 0.031) and tropical Townsville (r = 0.82, P = 0.002). This study demonstrates that despite strict cohort segregation, new cases of unrelated B. cepacia complex infection continue to occur. These data also support an environmental origin of infection and suggest that climate conditions may be associated with the acquisition of B. cepacia complex infections.
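The rainfall association reported above is a Pearson correlation between yearly rainfall and yearly incident cases. The sketch below shows that computation on invented yearly figures (not the Brisbane or Townsville data):

```python
import numpy as np

# Invented yearly totals for 11 years (2001-2011); NOT the study's data.
rainfall_mm = np.array([1100, 950, 1300, 800, 1250, 900,
                        1400, 1000, 1500, 1200, 1350])
new_cases = np.array([4, 3, 6, 2, 5, 3, 7, 4, 8, 5, 6])

# Pearson correlation coefficient between the two yearly series.
r = np.corrcoef(rainfall_mm, new_cases)[0, 1]
print(f"Pearson r = {r:.2f}")
```

With only 11 yearly points, the significance test behind the quoted P values has few degrees of freedom, which is worth keeping in mind when interpreting such correlations.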
Abstract:
Context. The Public European Southern Observatory Spectroscopic Survey of Transient Objects (PESSTO) began as a public spectroscopic survey in April 2012. PESSTO classifies transients from publicly available sources and wide-field surveys, and selects science targets for detailed spectroscopic and photometric follow-up. PESSTO runs for nine months of the year, January - April and August - December inclusive, and typically has allocations of 10 nights per month.
Aims. We describe the data reduction strategy and data products that are publicly available through the ESO archive as the Spectroscopic Survey data release 1 (SSDR1).
Methods. PESSTO uses the New Technology Telescope with the instruments EFOSC2 and SOFI to provide optical and NIR spectroscopy and imaging. We target supernovae and optical transients brighter than 20.5 mag for classification. Science targets are selected for follow-up based on the PESSTO science goal of extending knowledge of the extremes of the supernova population. We use standard EFOSC2 set-ups providing spectra with resolutions of 13-18 Å between 3345-9995 Å. A subset of the brighter science targets are selected for SOFI spectroscopy with the blue and red grisms (0.935-2.53 μm and resolutions 23-33 Å) and imaging with broadband JHKs filters.
Results. This first data release (SSDR1) contains flux-calibrated spectra from the first year (April 2012-2013). A total of 221 confirmed supernovae were classified, and we released calibrated optical spectra and classifications publicly within 24 h of the data being taken (via WISeREP). The data in SSDR1 supersede those rapidly released spectra: they have more reliable and quantifiable flux calibrations, corrections for telluric absorption, and are made available in standard ESO Phase 3 formats. We estimate the absolute accuracy of the flux calibrations for EFOSC2 across the whole survey in SSDR1 to be typically ∼15%, although a number of spectra will have less reliable absolute flux calibration because of weather and slit losses. Acquisition images for each spectrum are available which, in principle, allow the user to refine the absolute flux calibration. The standard NIR reduction process does not produce high-accuracy absolute spectrophotometry, but synthetic photometry with the accompanying JHKs imaging can improve this. Whenever possible, reduced SOFI images are provided to allow this.
Conclusions. Future data releases will focus on improving the automated flux calibration of the data products. The rapid turnaround between discovery and classification, together with access to reliable pipeline-processed data products, has enabled early science papers within the first few months of the survey.
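The synthetic-photometry improvement mentioned in the Results amounts to integrating the reduced spectrum through a filter response, comparing with the flux implied by the accompanying imaging, and rescaling. A schematic Python version; the filter curve, flux values, and boxcar response are illustrative stand-ins, not the actual SOFI passbands or pipeline:

```python
import numpy as np

def synthetic_flux(wl, flux, filt_wl, filt_resp):
    """Response-weighted mean flux of a spectrum through a filter curve."""
    resp = np.interp(wl, filt_wl, filt_resp, left=0.0, right=0.0)
    return np.sum(flux * resp) / np.sum(resp)

# Illustrative flat spectrum (arbitrary flux units) over 1.0-1.5 microns.
wl = np.linspace(1.0, 1.5, 500)
flux = np.full_like(wl, 1e-16)

# Toy J-band-like boxcar response (made up for the example).
filt_wl = np.array([1.15, 1.20, 1.30, 1.35])
filt_resp = np.array([0.0, 1.0, 1.0, 0.0])

f_spec = synthetic_flux(wl, flux, filt_wl, filt_resp)
f_imaging = 2e-16              # flux implied by the imaging photometry
scale = f_imaging / f_spec     # grey rescaling factor for the spectrum
flux_calibrated = flux * scale
print(scale)                   # ~2.0 for this flat toy spectrum
```

A real implementation would work in magnitudes with filter zero points, and could fit a wavelength-dependent correction across several bands rather than a single grey factor.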
Abstract:
Doctoral thesis, Informatics (Bioinformatics), Universidade de Lisboa, Faculdade de Ciências, 2014
Abstract:
Thesis (Ph.D.)--University of Washington, 2013
Abstract:
Post-MAPS is a web platform that collects gastroenterological exam data from several European hospital centers to be used in future clinical studies; it was developed in partnership with experts from the gastroenterological area and information technology (IT) technicians. However, although functional, this platform has some issues that are crucial to its functioning and can render user interaction unpleasant and exhausting. Accordingly, we proposed the development of a new web platform, in which we aimed for improvement in terms of usability, data unification, and interoperability. Therefore, it was necessary to identify and study different ways of acquiring clinical data and to review some of the existing clinical databases in order to understand how they work and what type of data they store, as well as their impact on and contribution to clinical knowledge. Closely linked to the data model is the ability to share data with other systems, so we also studied the concept of interoperability and analyzed some of the most widely used international standards, such as DICOM, HL7, and openEHR. As one of the primary objectives of this project was to achieve a better level of usability, practices related to Human-Computer Interaction, such as requirement analysis, creation of conceptual models, prototyping, and evaluation, were also studied. Before development began, we conducted an analysis of the previous platform from a functional point of view, which allowed us to gather not only a list of architectural and interface issues but also a list of improvement opportunities. A small preliminary study was also performed to evaluate the platform's usability, in which we observed that perceived usability differs between users and, in some aspects, varies according to their location, age, and years of experience.
Based on the information gathered during the platform's analysis and on the conclusions of the preliminary study, a new platform was developed, prepared for all potential users, from the inexperienced to those most comfortable with technology. It presents major improvements in terms of usability and provides several new features that simplify the users' work, improving their interaction with the system and making their experience more enjoyable.
Abstract:
In this research we conducted a mixed-methods study, using qualitative and quantitative analysis, of the relationship between mobile advertisement and mobile app user acquisition and of the conclusions companies can derive from it. Data were gathered from the management of mobile advertisement campaigns for a portfolio of three different mobile apps. We found that a number of implications can be extracted from this intersection, namely for product development, internationalisation, and management of the marketing budget. We propose further research on alternative app user sources, the impact of revenue on apps, and the exploitation of product segments: wearable technology and the Internet of Things.
Abstract:
Recent studies have shown that providing learners Knowledge of Results (KR) after “good trials” rather than “poor trials” is superior for learning. The present study examined whether requiring participants to estimate their three best or three worst trials in a series of six-trial blocks before receiving KR would prove superior for learning compared to not estimating their performance. Participants were required to push and release a slide along a confined pathway using their non-dominant hand to a target distance (133 cm). The retention and transfer data suggest that participants who received KR after good trials demonstrated superior learning and performance estimation compared to those receiving KR after poor trials. The results of the present experiment offer an important theoretical extension of our understanding of the role of KR content and performance estimation in motor skill learning.
Abstract:
Very little research has examined K–12 educational technology decision-making in Canada. This collective case study explores the technology procurement process in Ontario’s publicly funded school districts to determine whether it is informed by the relevant research, grounded in best practices, and enhances student learning. Using a qualitative approach, 10 senior leaders (i.e., chief information officers, superintendents, etc.) were interviewed. A combination of open-ended and closed-ended questions was used to reveal the most important factors driving technology acquisition, research support, governance procedures, data use, and the assessment and return on investment (ROI) measures utilized by school districts in their implementation of educational technology. After participants were interviewed, the data were transcribed, member-checked, and then submitted to “Computer-assisted NCT analysis” (Friese, 2014) using ATLAS.ti. The findings show that senior leaders are making acquisitions that are not aligned with current scholarship and do not have student learning as the focus. It was also determined that districts struggle to use data-driven decision-making to support the governance of educational technology spending. Finally, the results showed that districts do not have effective assessment measures in place to determine the efficacy or ROI of a purchased technology. Although data are limited to the responses of 10 senior leaders, the findings represent the technology leadership for approximately 746,000 Ontario students. The study is meant to serve as an informative resource for senior leaders and presents strategic and research-validated approaches to technology procurement. Further, the study has the potential to refine technology decision-making, policies, and practices in K–12 education.
Abstract:
Thesis digitized by the Division de la gestion de documents et des archives of the Université de Montréal