874 results for "Data acquisition card"


Relevance: 30.00%

Abstract:

Distributed compressed sensing exploits the information redundancy built into multi-signal ensembles with inter- as well as intra-signal correlations to reconstruct undersampled signals. In this paper we revisit this problem from a different perspective: streaming data from several correlated sources are taken as input to a real-time system which, without any a priori information, incrementally learns and admits each source into the system.
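As background, the single-signal version of the reconstruction problem above can be sketched as follows, assuming a Gaussian random sensing matrix and Orthogonal Matching Pursuit (OMP) as the decoder; the paper's streaming, multi-source scheme is far more elaborate, so this is only a minimal illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

n, m, k = 128, 60, 4                      # signal length, measurements, sparsity

x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.normal(size=k)           # the k-sparse ground truth

A = rng.normal(size=(m, n)) / np.sqrt(m)  # random sensing ensemble
y = A @ x                                 # undersampled measurements (m < n)

# OMP: greedily pick the column most correlated with the residual,
# then re-fit the coefficients on the selected support by least squares.
residual, picked = y.copy(), []
for _ in range(k):
    picked.append(int(np.argmax(np.abs(A.T @ residual))))
    coef, *_ = np.linalg.lstsq(A[:, picked], y, rcond=None)
    residual = y - A[:, picked] @ coef

x_hat = np.zeros(n)
x_hat[picked] = coef
print(float(np.linalg.norm(x_hat - x)))   # near zero when recovery succeeds
```

The joint, multi-source setting improves on this by sharing measurements across correlated signals, which is what the paper's incremental admission scheme exploits.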

Relevance: 30.00%

Abstract:

Following rising demand for GPS positioning, low-cost receivers are becoming widely available, but their energy demands are still too high. For energy-efficient GPS sensing in delay-tolerant applications, the possibility of offloading a few milliseconds of raw signal samples and leveraging the greater processing power of the cloud to obtain a position fix is being actively investigated. To reduce the energy cost of this data offloading operation, we propose Sparse-GPS, a new computing framework for GPS acquisition via sparse approximation. Within the framework, GPS signals can be efficiently compressed by random ensembles. The sparse acquisition information pertaining to the visible satellites embedded within these limited measurements can subsequently be recovered with our proposed representation dictionary. Through extensive empirical evaluations, we demonstrate the acquisition quality and energy gains of Sparse-GPS. We show that it is twice as energy efficient as offloading uncompressed data, and has 5-10 times lower energy costs than standalone GPS, with a median positioning accuracy of 40 m.
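The compression step can be pictured as follows. This is an illustrative sketch only — the sampling rate, bit depth, and 2x compression ratio are assumptions, not Sparse-GPS's actual parameters:

```python
import numpy as np

rng = np.random.default_rng(1)

# Compress 1 ms of raw GPS front-end samples with a random +/-1 ensemble
# before offloading, halving the amount of data sent over the radio.
fs = 4_092_000                                  # assumed front-end rate, Hz
block = rng.integers(-2, 3, size=fs // 1000)    # 1 ms of coarsely quantized samples

m = block.size // 2                             # keep half as many measurements
phi = rng.choice([-1.0, 1.0], size=(m, block.size))
y = phi @ block                                 # measurements sent to the cloud

print(block.size, y.size)                       # 4092 2046
```

The cloud side then recovers the satellite acquisition parameters from `y` using a sparsifying dictionary, which is the paper's main contribution.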

Relevance: 30.00%

Abstract:

This article presents a new method for acquiring three-dimensional (3-D) volumes of ultrasonic axial strain data. The method uses a mechanically-swept probe to sweep out a single volume while applying a continuously varying axial compression. Acquisition of a volume takes 15-20 s. A strain volume is then calculated by comparing frame pairs throughout the sequence. The method uses strain quality estimates to automatically pick out high quality frame pairs, and so does not require careful control of the axial compression. In a series of in vitro and in vivo experiments, we quantify the image quality of the new method and also assess its ease of use. Results are compared with those for the current best alternative, which calculates strain between two complete volumes. The volume pair approach can produce high quality data, but skillful scanning is required to acquire two volumes with appropriate relative strain. In the new method, the automatic quality-weighted selection of image pairs overcomes this difficulty and the method produces superior quality images with a relatively relaxed scanning technique.
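The quality-weighted selection idea can be sketched as follows. This is hypothetical: the paper's strain-quality estimator is not reproduced here, and random scores stand in for it:

```python
import numpy as np

rng = np.random.default_rng(2)

# Each consecutive frame pair in the swept sequence gets a strain-quality
# score; only the best-scoring pairs contribute to the strain volume, so the
# operator need not control the axial compression precisely.
n_frames = 50
quality = rng.uniform(0.0, 1.0, size=n_frames - 1)   # score for pair (i, i+1)

threshold = np.percentile(quality, 75)               # keep the top quarter
good_pairs = [(i, i + 1) for i, q in enumerate(quality) if q >= threshold]

print(len(good_pairs))                               # 13 of 49 pairs kept
```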

Relevance: 30.00%

Abstract:

Drift cards were released in Monterey Bay, California, to detect seasonal variations in the California Current system and seasonal and diurnal wind variations in the immediate vicinity of the bay. About 23% of the cards were recovered, although the recovery rate varied from about 5% in the winter to about 60% in the late summer. Drift card speeds ranged from 1 km/day in the winter months to 8 km/day in the summer months. Good agreement was observed between geostrophic current, wind, drogue, and drift card data, although drift cards were observed to be primarily wind driven. A weekend bias in drift card recoveries was observed for the entire period of study; however, it was less pronounced for cards released during the summer months. Two bogus releases were used to estimate the discovery lag time, reported position accuracy, and longshore drift currents. Diurnal winds observed during a 24-hour study indicated that daily variations in the wind field may be as important as seasonal changes in moving surface water. Drift card speed was observed to be about 3% of the wind velocity, and 1 m/sec was estimated as the minimum effective wind. The wind factor, ranging from 2.2% to 4.0%, was used to estimate the actual paths of drift cards and to examine the role of diurnal winds in surface water movement. (PDF contains 79 pages)
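A worked example of the wind-factor relation reported above: drift-card speed is roughly 3% of the wind speed, with 1 m/s as the minimum effective wind. The 3 m/s wind value here is an assumed illustration:

```python
wind_speed = 3.0                  # m/s, assumed example value
wind_factor = 0.03                # drift speed as a fraction of wind speed
min_effective_wind = 1.0          # m/s, below which cards are not wind driven

drift_speed = wind_factor * wind_speed if wind_speed >= min_effective_wind else 0.0
km_per_day = drift_speed * 86400 / 1000
print(round(km_per_day, 1))       # 7.8 km/day, within the observed 1-8 km/day range
```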

Relevance: 30.00%

Abstract:

A multi-disciplinary investigation was conducted in southern Biscayne Bay and Card Sound from 1968 to 1973. The purpose of the investigation was to conduct an integrated study of the ecology of southern Biscayne Bay with special emphasis on the effects of the heated effluent from the Turkey Point fossil fuel power plant, and to predict the impact of additional effluent from the planned conversion of the plant to nuclear fuel. The results of this investigation have been discussed in numerous publications. This report contains the unpublished biology data that resulted from the investigation. (PDF contains 44 pages)

Relevance: 30.00%

Abstract:

The Laser Interferometer Gravitational-Wave Observatory (LIGO) consists of two complex large-scale laser interferometers designed for the direct detection of gravitational waves from distant astrophysical sources in the frequency range 10 Hz to 5 kHz. Direct detection of space-time ripples will support Einstein's general theory of relativity and provide invaluable information and new insight into the physics of the Universe.

The initial phase of LIGO started in 2002, and since then data have been collected during six science runs. Instrument sensitivity improved from run to run thanks to the efforts of the commissioning team, and initial LIGO reached its design sensitivity during the last science run, which ended in October 2010.

In parallel with commissioning and data analysis on the initial detector, the LIGO group worked on research and development of the next generation of detectors. The major instrument upgrade from initial to Advanced LIGO started in 2010 and lasted until 2014.

This thesis describes the results of commissioning work done at the LIGO Livingston site from 2013 until 2015, in parallel with and after the installation of the instrument. It also discusses new techniques and tools developed at the 40 m prototype, including adaptive filtering, estimation of quantization noise in digital filters, and the design of isolation kits for ground seismometers.

The first part of this thesis is devoted to methods for bringing the interferometer into the linear regime, where the collection of data becomes possible. The states of the longitudinal and angular controls of the interferometer degrees of freedom during the lock acquisition process and in the low-noise configuration are discussed in detail.

Once the interferometer is locked and transitioned to the low-noise regime, the instrument produces astrophysical data that must be calibrated to units of meters or strain. The second part of this thesis describes the online calibration technique set up at both observatories to monitor the quality of the collected data in real time. A sensitivity analysis was performed to understand and eliminate noise sources in the instrument.

The coupling of noise sources into the gravitational wave channel can be reduced if robust feedforward and optimal feedback control loops are implemented. The last part of this thesis describes static and adaptive feedforward noise cancellation techniques applied to the Advanced LIGO interferometers and tested at the 40 m prototype. Applications of optimal time-domain feedback control techniques and estimators to aLIGO control loops are also discussed.
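A minimal sketch of the adaptive feedforward idea, assuming a single witness sensor and a plain LMS update (the actual aLIGO schemes use far more elaborate multi-sensor Wiener and adaptive filters): a witness channel sees the disturbance, and an adaptive FIR learns the coupling path so it can be subtracted from the target channel.

```python
import numpy as np

rng = np.random.default_rng(3)

n, taps, mu = 20000, 8, 0.01
witness = rng.normal(size=n)                    # e.g. a ground seismometer
h_true = np.array([0.5, -0.2, 0.1])             # assumed coupling path (unknown to the filter)
target = np.convolve(witness, h_true)[:n]       # disturbance seen in the channel

w = np.zeros(taps)                              # adaptive FIR weights
err = np.zeros(n)                               # residual after cancellation
for i in range(taps, n):
    x = witness[i - taps + 1:i + 1][::-1]       # most recent samples first
    y = w @ x                                   # the filter's cancellation estimate
    err[i] = target[i] - y                      # what remains in the channel
    w += 2 * mu * err[i] * x                    # LMS weight update

# After adaptation the residual is far below the raw disturbance.
print(np.std(target[-1000:]), np.std(err[-1000:]))
```

In practice the update runs in the real-time control system, and the static variant fixes `w` from an offline Wiener-filter fit instead of adapting online.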

Commissioning work is still ongoing at the sites. The first science run of Advanced LIGO is planned for September 2015 and will last three to four months. It will be followed by a set of small instrument upgrades installed on a time scale of a few months. The second science run will start in spring 2016 and last about six months. Since the current sensitivity of Advanced LIGO is already more than a factor of three higher than that of the initial detectors, and keeps improving on a monthly basis, the upcoming science runs have a good chance of yielding the first direct detection of gravitational waves.

Relevance: 30.00%

Abstract:

Due to concerns about environmental protection and resource utilization, product lifecycle management for end-of-life (EOL) has received increasing attention in many industrial sectors, including manufacturing, maintenance/repair, and recycling/refurbishing. To support these functions, crucial issues are studied to realize a product recovery management system (PRMS), including: (1) an architecture design for EOL services such as remanufacturing and recycling; (2) a product data model required for EOL activity, based on international standards; and (3) an infrastructure for information acquisition and mapping to product lifecycle information. The presented work is illustrated via a realistic scenario.

Relevance: 30.00%

Abstract:

A 2.5-D and 3-D multi-fold GPR survey was carried out in the Archaeological Park of Aquileia (northern Italy). The primary objective of the study was the identification of targets of potential archaeological interest in an area designated by local archaeological authorities. The second geophysical objective was to test 2-D and 3-D multi-fold methods and to study localised targets of unknown shape and dimensions in hostile soil conditions. Several portions of the acquisition grid were processed in common offset (CO), common shot (CSG) and common mid point (CMP) geometry. An 8×8 m area was studied with orthogonal CMPs thus achieving a 3-D subsurface coverage with azimuthal range limited to two normal components. Coherent noise components were identified in the pre-stack domain and removed by means of FK filtering of CMP records. Stack velocities were obtained from conventional velocity analysis and azimuthal velocity analysis of 3-D pre-stack gathers. Two major discontinuities were identified in the area of study. The deeper one most probably coincides with the paleosol at the base of the layer associated with activities of man in the area in the last 2500 years. This interpretation is in agreement with the results obtained from nearby cores and excavations. The shallow discontinuity is observed in a part of the investigated area and it shows local interruptions with a linear distribution on the grid. Such interruptions may correspond to buried targets of archaeological interest. The prominent enhancement of the subsurface images obtained by means of multi-fold techniques, compared with the relatively poor quality of the conventional single-fold georadar sections, indicates that multi-fold methods are well suited for the application to high resolution studies in archaeology.
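The velocity analysis step above rests on the normal-moveout model: for a flat reflector, two-way traveltime at antenna offset x follows t(x) = sqrt(t0² + (x/v)²), and the stacking velocity is the v that best flattens this curve across the CMP gather. A sketch with assumed values (the depth and soil velocity are illustrative, not from the survey):

```python
import numpy as np

t0 = 20e-9                        # two-way zero-offset time, s (~1 m depth)
v = 0.1e9                         # radar velocity, m/s (0.1 m/ns, a typical soil)
offsets = np.array([0.0, 0.5, 1.0, 2.0])   # antenna offsets, m

t = np.sqrt(t0**2 + (offsets / v)**2)      # hyperbolic moveout curve
print(np.round(t * 1e9, 2))                # arrival times in nanoseconds
```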

Relevance: 30.00%

Abstract:

Interest in structural brain connectivity has grown with the understanding that abnormal neural connections may play a role in neurologic and psychiatric diseases. Small animal connectivity mapping techniques are particularly important for identifying aberrant connectivity in disease models. Diffusion magnetic resonance imaging tractography can provide nondestructive, 3D, brain-wide connectivity maps, but has historically been limited by low spatial resolution, low signal-to-noise ratio, and the difficulty in estimating multiple fiber orientations within a single image voxel. Small animal diffusion tractography can be substantially improved through the combination of ex vivo MRI with exogenous contrast agents, advanced diffusion acquisition and reconstruction techniques, and probabilistic fiber tracking. Here, we present a comprehensive, probabilistic tractography connectome of the mouse brain at microscopic resolution, and a comparison of these data with neuronal tracer-based connectivity data from the Allen Brain Atlas. This work serves as a reference database for future tractography studies in the mouse brain, and demonstrates the fundamental differences between tractography and neuronal tracer data.

Relevance: 30.00%

Abstract:

In this experiment, we examined the extent to which the spatiotemporal reorganization of muscle synergies mediates skill acquisition on a two degree-of-freedom (df) target-acquisition task. Eight participants completed five practice sessions on consecutive days. During each session they practiced movements to eight target positions presented on a visual display. The movements required combinations of flexion/extension and pronation/supination of the elbow joint complex. During practice sessions, eight targets displaced 5.4 cm from the start position (representing joint excursions of 54°) were presented 16 times. During pre- and posttests, participants acquired the targets at two distances (3.6 cm [36°] and 7.2 cm [72°]). EMG data were recorded from eight muscles contributing to the movements during the pre- and posttests. Most targets were acquired more rapidly after the practice period. Performance improvements were, in most target directions, accompanied by increases in the smoothness of the movement trajectories. When target acquisition required movement in both dfs, there were also practice-related decreases in the extent to which the trajectories deviated from a direct path to the target. The contribution of monofunctional muscles (those producing torque in a single df) increased with practice during movements in which they acted as agonists. The activity in bifunctional muscles (those contributing torque in both dfs) remained at pretest levels in most movements. The results suggest that performance gains were mediated primarily by changes in the spatial organization of muscle synergies. These changes were expressed most prominently in the magnitude of activation of the monofunctional muscles.

Relevance: 30.00%

Abstract:

BACKGROUND: Published work assessing psychosocial stress (job strain) as a risk factor for coronary heart disease is inconsistent and subject to publication bias and reverse causation bias. We analysed the relation between job strain and coronary heart disease with a meta-analysis of published and unpublished studies. METHODS: We used individual records from 13 European cohort studies (1985-2006) of men and women without coronary heart disease who were employed at the time of baseline assessment. We measured job strain with questions from validated job-content and demand-control questionnaires. We extracted data in two stages, such that acquisition and harmonisation of the job strain measure and covariables occurred before linkage to records for coronary heart disease. We defined incident coronary heart disease as the first non-fatal myocardial infarction or coronary death. FINDINGS: 30,214 (15%) of 197,473 participants reported job strain. In 1.49 million person-years at risk (mean follow-up 7.5 years [SD 1.7]), we recorded 2358 events of incident coronary heart disease. After adjustment for sex and age, the hazard ratio for job strain versus no job strain was 1.23 (95% CI 1.10-1.37). This effect estimate was higher in published (1.43, 1.15-1.77) than unpublished (1.16, 1.02-1.32) studies. Hazard ratios were likewise raised in analyses addressing reverse causality by exclusion of coronary heart disease events that occurred in the first 3 years (1.31, 1.15-1.48) and 5 years (1.30, 1.13-1.50) of follow-up. We noted an association between job strain and coronary heart disease across sexes, age groups, socioeconomic strata, and regions, and after adjustment for socioeconomic status, lifestyle, and conventional risk factors. The population attributable risk for job strain was 3.4%.
INTERPRETATION: Our findings suggest that prevention of workplace stress might decrease disease incidence; however, this strategy would have a much smaller effect than would tackling of standard risk factors, such as smoking. FUNDING: Finnish Work Environment Fund, the Academy of Finland, the Swedish Research Council for Working Life and Social Research, the German Social Accident Insurance, the Danish National Research Centre for the Working Environment, the BUPA Foundation, the Ministry of Social Affairs and Employment, the Medical Research Council, the Wellcome Trust, and the US National Institutes of Health.
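A worked check of the reported population attributable risk, assuming the standard formula PAR = p(HR − 1) / (p(HR − 1) + 1) with the paper's 15% job-strain prevalence and adjusted hazard ratio of 1.23 (the small gap from the reported 3.4% comes from rounding of the hazard ratio):

```python
p, hr = 0.15, 1.23                # prevalence of job strain, adjusted hazard ratio
par = p * (hr - 1) / (p * (hr - 1) + 1)
print(round(100 * par, 1))        # 3.3, close to the reported 3.4%
```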

Relevance: 30.00%

Abstract:

Burkholderia cepacia complex organisms are important transmissible pathogens found in cystic fibrosis (CF) patients. In recent years, the rates of cross-infection of epidemic strains have declined due to effective infection control efforts. However, cases of sporadic B. cepacia complex infection continue to occur in some centers. The acquisition pathways and clinical outcomes of sporadic B. cepacia complex infection are unclear. We sought to determine the patient clinical characteristics, outcomes, incidence, and genotypic relatedness for all cases of B. cepacia complex infection at two CF centers. We also sought to study the external conditions that influence the acquisition of infection. From 2001 to 2011, 67 individual organisms were cultured from the respiratory samples of 64 patients. Sixty-five percent of the patients were adults, in whom chronic infections were more common (68%) (P = 0.006). The incidence of B. cepacia complex infection increased by a mean of 12% (95% confidence interval [CI], 3 to 23%) per year. The rates of transplantation and death were similar in the incident cases who developed chronic infection compared to those in patients with chronic Pseudomonas aeruginosa infection. Multilocus sequence typing revealed 50 individual strains from 65 isolates. Overall, 85% of the patients were infected with unique strains, suggesting sporadic acquisition of infection. The yearly incidence of nonepidemic B. cepacia complex infection was positively correlated with the amount of rainfall in the two sites examined: subtropical Brisbane (r = 0.65, P = 0.031) and tropical Townsville (r = 0.82, P = 0.002). This study demonstrates that despite strict cohort segregation, new cases of unrelated B. cepacia complex infection continue to occur. These data also support an environmental origin of infection and suggest that climate conditions may be associated with the acquisition of B. cepacia complex infections.

Relevance: 30.00%

Abstract:

Context. The Public European Southern Observatory Spectroscopic Survey of Transient Objects (PESSTO) began as a public spectroscopic survey in April 2012. PESSTO classifies transients from publicly available sources and wide-field surveys, and selects science targets for detailed spectroscopic and photometric follow-up. PESSTO runs for nine months of the year, January - April and August - December inclusive, and typically has allocations of 10 nights per month. 

Aims. We describe the data reduction strategy and data products that are publicly available through the ESO archive as the Spectroscopic Survey data release 1 (SSDR1). 

Methods. PESSTO uses the New Technology Telescope with the instruments EFOSC2 and SOFI to provide optical and NIR spectroscopy and imaging. We target supernovae and optical transients brighter than 20.5 mag for classification. Science targets are selected for follow-up based on the PESSTO science goal of extending knowledge of the extremes of the supernova population. We use standard EFOSC2 set-ups providing spectra with resolutions of 13-18 Å between 3345-9995 Å. A subset of the brighter science targets are selected for SOFI spectroscopy with the blue and red grisms (0.935-2.53 μm, resolutions 23-33 Å) and imaging with broadband JHKs filters. 

Results. This first data release (SSDR1) contains flux-calibrated spectra from the first year (April 2012-2013). A total of 221 confirmed supernovae were classified, and calibrated optical spectra and classifications were released publicly within 24 h of the data being taken (via WISeREP). The data in SSDR1 supersede those previously released spectra: they have more reliable and quantifiable flux calibrations, are corrected for telluric absorption, and are made available in standard ESO Phase 3 formats. We estimate the absolute accuracy of the flux calibrations for EFOSC2 across the whole survey in SSDR1 to be typically ∼15%, although a number of spectra will have less reliable absolute flux calibration because of weather and slit losses. Acquisition images for each spectrum are available which, in principle, allow the user to refine the absolute flux calibration. The standard NIR reduction process does not produce high-accuracy absolute spectrophotometry, but synthetic photometry with the accompanying JHKs imaging can improve this. Whenever possible, reduced SOFI images are provided to allow this. 

Conclusions. Future data releases will focus on improving the automated flux calibration of the data products. The rapid turnaround between discovery and classification, together with access to reliable pipeline-processed data products, has enabled early science papers within the first few months of the survey.

Relevance: 30.00%

Abstract:

Doctoral thesis, Informatics (Bioinformatics), Universidade de Lisboa, Faculdade de Ciências, 2014