968 results for High-precision Radiocarbon Dating


Relevance:

100.00%

Publisher:

Abstract:

In intersatellite semiconductor laser communication systems, assessing the wavefront quality of the transmitted beam is a difficult problem. To address it, after a brief introduction to the white-light lateral double-shearing interferometer, we report the use of this interferometer to measure the wavefront of a near-diffraction-limited semiconductor laser beam, and on that basis derive a formula for computing the far-field divergence. Experiments measured a near-field wavefront height difference of 0.2λ, and the beam divergence obtained via Fraunhofer diffraction was only 64.8 μrad, indicating that the beam is close to the optical diffraction limit. The results also show that the double-shearing interferometer offers high sensitivity and good practicality.
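As a rough companion to the far-field divergence calculation described above, the sketch below estimates a beam's divergence from its near field via a Fraunhofer-diffraction (Fourier-transform) step. The wavelength and Gaussian waist are illustrative assumptions, not values from the paper; the result is checked against the textbook Gaussian divergence λ/(πw0).

```python
import numpy as np

# Minimal sketch (not the paper's method): far-field divergence of a
# near-diffraction-limited beam from its near field via Fraunhofer diffraction.
wavelength = 0.85e-6          # m, typical diode-laser wavelength (assumed)
w0 = 5e-3                     # m, 1/e^2 Gaussian waist radius (assumed)

n = 4096
dx = 64 * w0 / n              # sample a window much wider than the beam
x = (np.arange(n) - n // 2) * dx
field = np.exp(-(x / w0) ** 2)            # Gaussian near-field amplitude

# Far field = Fourier transform of the near field (Fraunhofer regime)
far = np.fft.fftshift(np.fft.fft(np.fft.ifftshift(field)))
intensity = np.abs(far) ** 2
theta = wavelength * np.fft.fftshift(np.fft.fftfreq(n, d=dx))  # angle, rad

# For a Gaussian profile the 1/e^2 half-angle is twice the RMS angular width
sigma = np.sqrt(np.sum(intensity * theta ** 2) / np.sum(intensity))
divergence = 2 * sigma
# compare numerical estimate with the analytic value lambda/(pi*w0), in microrad
print(divergence * 1e6, wavelength / (np.pi * w0) * 1e6)
```

With these assumed parameters the divergence comes out in the tens of microradians, the same order as the 64.8 μrad quoted in the abstract.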

Relevance:

100.00%

Publisher:

Abstract:

Precision polarimetry of the cosmic microwave background (CMB) has become a mainstay of observational cosmology. The ΛCDM model predicts a polarization of the CMB at the level of a few μK, with a characteristic E-mode pattern. On small angular scales, a B-mode pattern arises from the gravitational lensing of E-mode power by the large-scale structure of the universe. Inflationary gravitational waves (IGW) may be a source of B-mode power on large angular scales, and their relative contribution to primordial fluctuations is parameterized by a tensor-to-scalar ratio r. BICEP2 and Keck Array are a pair of CMB polarimeters at the South Pole designed and built for optimal sensitivity to the primordial B-mode peak around multipole l ~ 100. The BICEP2/Keck Array program intends to achieve a sensitivity to r ≥ 0.02. Auxiliary science goals include the study of gravitational lensing of E-mode into B-mode signal at medium angular scales and a high-precision survey of Galactic polarization. These goals require low noise and tight control of systematics. We describe the design and calibration of the instrument. We also describe the analysis of the first three years of science data. BICEP2 observes a significant B-mode signal at 150 GHz in excess of the level predicted by the lensed-ΛCDM model, and Keck Array confirms the excess signal at > 5σ. We combine the maps from the two experiments to produce 150 GHz Q and U maps which have a depth of 57 nK deg (3.4 μK arcmin) over an effective area of 400 deg² for an equivalent survey weight of 248000 μK⁻². We also show preliminary Keck Array 95 GHz maps. A joint analysis with the Planck collaboration reveals that much of BICEP2/Keck Array's observed 150 GHz signal at low l is more likely a Galactic dust foreground than a measurement of r. Marginalizing over dust and r, lensing B-modes are detected at 7.0σ significance.
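The quoted map depth can be checked with simple unit arithmetic. In the survey-weight line below, the factor of two (one Q plus one U map) is an assumption made for illustration, not something stated in the abstract.

```python
# Unit check of the quoted BICEP2/Keck map depth: 57 nK deg in uK arcmin
depth_nK_deg = 57.0
depth_uK_arcmin = depth_nK_deg * 1e-3 * 60.0    # nK -> uK, deg -> arcmin
print(depth_uK_arcmin)                          # ~3.42, i.e. the quoted 3.4

# Rough survey-weight estimate: weight = area / (map noise)^2 per map;
# the factor of 2 for Q plus U is an illustrative assumption.
area_deg2 = 400.0
weight_uK2 = 2.0 * area_deg2 / (depth_nK_deg * 1e-3) ** 2
print(round(weight_uK2))                        # ~246000, near the quoted 248000
```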

Relevance:

100.00%

Publisher:

Abstract:

This thesis has two basic themes: the investigation of new experiments which can be used to test relativistic gravity, and the investigation of new technologies and new experimental techniques which can be applied to make gravitational wave astronomy a reality.

Advancing technology will soon make possible a new class of gravitation experiments: pure laboratory experiments with laboratory sources of non-Newtonian gravity and laboratory detectors. The key advance in technology is the development of resonant sensing systems with very low levels of dissipation. Chapter 1 considers three such systems (torque balances, dielectric monocrystals, and superconducting microwave resonators), and it proposes eight laboratory experiments which use these systems as detectors. For each experiment it describes the dominant sources of noise and the technology required.

The coupled electro-mechanical system consisting of a microwave cavity and its walls can serve as a gravitational radiation detector. A gravitational wave interacts with the walls, and the resulting motion induces transitions from a highly excited cavity mode to a nearly unexcited mode. Chapter 2 describes briefly a formalism for analyzing such a detector, and it proposes a particular design.

The monitoring of a quantum mechanical harmonic oscillator on which a classical force acts is important in a variety of high-precision experiments, such as the attempt to detect gravitational radiation. Chapter 3 reviews the standard techniques for monitoring the oscillator; and it introduces a new technique which, in principle, can determine the details of the force with arbitrary accuracy, despite the quantum properties of the oscillator.

The standard method for monitoring the oscillator is the "amplitude-and-phase" method (position or momentum transducer with output fed through a linear amplifier). The accuracy obtainable by this method is limited by the uncertainty principle. To do better requires a measurement of the type which Braginsky has called "quantum nondemolition." A well-known quantum nondemolition technique is "quantum counting," which can detect an arbitrarily weak force, but which cannot provide good accuracy in determining its precise time-dependence. Chapter 3 considers extensively a new type of quantum nondemolition measurement: a "back-action-evading" measurement of the real part X1 (or the imaginary part X2) of the oscillator's complex amplitude. In principle X1 can be measured arbitrarily quickly and arbitrarily accurately, and a sequence of such measurements can lead to an arbitrarily accurate monitoring of the classical force.
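The classical analogue of the X1/X2 idea can be sketched numerically. The quadratures X1 = x cos(ωt) − (p/mω) sin(ωt) and X2 = x sin(ωt) + (p/mω) cos(ωt) are constants of the free motion, so any change in X1 is entirely due to the applied force; a resonant force drives X1 secularly while X2 stays bounded. All numerical values below are illustrative, not from the thesis.

```python
import numpy as np

# Classical sketch of a back-action-evading quadrature: integrate a driven
# oscillator and track X1, X2. Parameters are illustrative assumptions.
m, w, F0 = 1.0, 2 * np.pi, 0.1       # mass, angular frequency, force amplitude

def deriv(t, y):
    x, p = y
    F = F0 * np.sin(w * t)           # weak resonant classical force
    return np.array([p / m, -m * w**2 * x + F])

# Fixed-step RK4 integration over 10 oscillation periods
dt, steps = 1e-3, 10000
t, y = 0.0, np.array([1.0, 0.0])     # start at x=1, p=0, so X1=1, X2=0
for _ in range(steps):
    k1 = deriv(t, y)
    k2 = deriv(t + dt / 2, y + dt / 2 * k1)
    k3 = deriv(t + dt / 2, y + dt / 2 * k2)
    k4 = deriv(t + dt, y + dt * k3)
    y = y + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
    t += dt

x, p = y
X1 = x * np.cos(w * t) - p / (m * w) * np.sin(w * t)
X2 = x * np.sin(w * t) + p / (m * w) * np.cos(w * t)
# The resonant force makes X1 drift at the secular rate -F0/(2 m w),
# while X2 returns to its initial value after whole periods.
print(X1, 1 - F0 * t / (2 * m * w), X2)
```

Monitoring X1 alone thus reads out the force history, which is the classical shadow of the arbitrarily accurate force monitoring claimed in the text.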

Chapter 3 describes explicit gedanken experiments which demonstrate that X1 can be measured arbitrarily quickly and arbitrarily accurately; it considers approximate back-action-evading measurements; and it develops a theory of quantum nondemolition measurement for arbitrary quantum mechanical systems.

In Rosen's "bimetric" theory of gravity the (local) speed of gravitational radiation vg is determined by the combined effects of cosmological boundary values and nearby concentrations of matter. It is possible for vg to be less than the speed of light. Chapter 4 shows that emission of gravitational radiation prevents particles of nonzero rest mass from exceeding the speed of gravitational radiation. Observations of relativistic particles place limits on vg and the cosmological boundary values today, and observations of synchrotron radiation from compact radio sources place limits on the cosmological boundary values in the past.
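The bound in Chapter 4 can be made numerical in one line: a particle observed with Lorentz factor γ must move slower than the local speed of gravity, so vg/c > v/c = √(1 − 1/γ²), i.e. vg can lie below c by at most about 1/(2γ²). The Lorentz factor below is an illustrative assumption, not a value from the thesis.

```python
import math

# Rough numeric version of the Chapter 4 argument: emission of gravitational
# radiation forbids v > vg, so an observed particle speed bounds vg from below.
gamma = 1e4                                # hypothetical observed Lorentz factor
v_over_c = math.sqrt(1.0 - 1.0 / gamma**2)
print(1.0 - v_over_c)                      # ~5e-9: allowed deficit of vg below c
```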

Relevance:

100.00%

Publisher:

Abstract:

In petawatt laser systems, the gratings used in the pulse compressor are so large that at present they can only be produced by arraying small-aperture gratings to form a single large one, an approach referred to as grating tiling. Theory and experiment have demonstrated that the coherent addition of multiple small gratings to form a larger grating is viable; the key technology is controlling the relative position and orientation of each grating with high precision. Based on the main factors that affect tiling performance, a 5-DOF ultraprecision stage was developed for the grating-tiling experiment. The mechanism has a serial structure; its motion is guided by flexure hinges and driven by piezoelectric actuators, with a movement resolution at the nanometer level. To keep the mechanism stable, capacitive position sensors with nanometer accuracy are mounted on it to provide feedback for closed-loop control; through voltage control and a digital PID algorithm, the positioning precision of the mechanism is held within a few nanometers. Experimental results indicate that the performance of the mechanism meets the precision requirements of grating tiling.
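The closed-loop scheme described above (capacitive-sensor feedback driving a piezo actuator through a digital PID law) can be sketched as below. The gains, the first-order actuator model, and the control period are illustrative assumptions, not the paper's values; gains are expressed per control step, so the sample period is absorbed into them.

```python
# Minimal sketch of a digital PID position loop for a piezo-driven stage
# with capacitive-sensor feedback. All numbers are illustrative assumptions.
kp, ki, kd = 0.4, 0.05, 0.1        # per-step PID gains (assumed)
setpoint = 100e-9                  # m, commanded displacement

pos = 0.0                          # current stage position, m
integral, prev_err = 0.0, 0.0
for _ in range(200):               # 200 control periods
    err = setpoint - pos           # capacitive sensor reading vs. setpoint
    integral += err
    u = kp * err + ki * integral + kd * (err - prev_err)  # actuator command
    prev_err = err
    pos += 0.5 * (u - pos)         # crude first-order piezo response (assumed)

print(abs(setpoint - pos))         # residual error settles to the nm level
```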

Relevance:

100.00%

Publisher:

Abstract:

A high-precision capacitive displacement sensor was developed. The paper details its basic working principle and the key techniques for improving its accuracy, adopts a modular design for the sensor circuitry, and analyzes the factors affecting accuracy and stability. Using a full equipotential-shielding technique, the sinusoidal excitation circuit, reference capacitor, sensor probe, and power supply were improved, and the sensor was then calibrated as a complete system. Experiments show a measurement range of ±5 to ±40 μm, a resolution better than 10 nm, and a measurement accuracy better than 20 nm.
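A simple parallel-plate model gives a feel for the signal levels such a sensor must resolve. The electrode area and nominal gap below are illustrative assumptions, not the paper's design values.

```python
# Illustrative parallel-plate model (not the paper's circuit):
# C = eps0 * A / d, so a gap change maps to a fractional capacitance change.
eps0 = 8.854e-12          # F/m, vacuum permittivity
A = 1e-4                  # m^2, probe electrode area (assumed)
d0 = 100e-6               # m, nominal gap (assumed)

def capacitance(d):
    return eps0 * A / d

C0 = capacitance(d0)
# A 10 nm displacement (the quoted resolution) changes C by only ~0.01%:
dC = capacitance(d0 - 10e-9) - C0
print(C0, dC / C0)
```

Resolving a 1e-4 fractional change on a picofarad-scale capacitance is why the abstract emphasizes equipotential shielding and a stable excitation and reference chain.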

Relevance:

100.00%

Publisher:

Abstract:

Two-dimensional smoothing by spectral dispersion (SSD) combined with a lens array (LA) is used to improve the uniformity of target irradiation in laser drivers. A diffraction-suppressing lens array yields a focal spot with a steep envelope and good uniformity at small and medium spatial scales. When a two-dimensional SSD unit is inserted into the beam path, the beam is spectrally dispersed in two mutually perpendicular directions, and the fine fringes caused by multi-beam interference are largely smoothed out; if transverse thermal-conduction smoothing is also taken into account, high-spatial-frequency intensity fluctuations can be suppressed further. Two-dimensional simulations show that this scheme produces a focal spot with a flat top and steep edges, and that it does not require careful adjustment of the target position, which makes it convenient in practice.
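The smoothing principle can be illustrated with a toy one-dimensional model: interference of many beamlets produces speckle, and when the phases sweep in time (as spectral dispersion causes), the time-averaged intensity is far smoother. The beamlet count, phase statistics, and frame count below are invented for illustration and are not the paper's simulation.

```python
import numpy as np

# Toy 1-D illustration of smoothing by phase sweeping (not the paper's model).
rng = np.random.default_rng(0)
n_points, n_beamlets, n_frames = 512, 64, 200
x = np.linspace(0.0, 1.0, n_points)

def speckle(phases):
    # far-field intensity of many interfering beamlets with given phases
    field = np.zeros(n_points, dtype=complex)
    for k, ph in enumerate(phases):
        field += np.exp(1j * (2 * np.pi * k * x + ph))
    return np.abs(field) ** 2

base = rng.uniform(0, 2 * np.pi, n_beamlets)
static = speckle(base)                      # frozen interference pattern
# phase sweeping: each "frame" sees a different phase realization
avg = np.mean([speckle(rng.uniform(0, 2 * np.pi, n_beamlets))
               for _ in range(n_frames)], axis=0)

def contrast(I):
    return I.std() / I.mean()               # RMS nonuniformity

print(contrast(static), contrast(avg))      # averaged pattern is much smoother
```

The static speckle has contrast near unity, while averaging over many independent phase states reduces it roughly as 1/√N, the qualitative effect the abstract describes.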

Relevance:

100.00%

Publisher:

Abstract:

[EU]Complete dental prostheses offer an alternative to single-tooth implants for people with partial or total edentulism, replacing all (or several) of the upper or lower teeth with a single piece [11]. The manufacture of each component of these prosthetic systems demands high precision, and several factors must be examined if the final product is to function properly. Many of these aspects have already been addressed in previous work, with studies on variables such as the tolerance gap and the screw-tightening sequence [1]. This work aims to consider, in addition to those two factors, the effect of bone relaxation, in order to measure its impact on an All-on-Four system. The goal is thus to establish the minimum acceptable conditions for the manufacture and fitting of the prosthesis, so that it is suitable for the user.

Relevance:

100.00%

Publisher:

Abstract:

[Es]This document explains the procedure followed to develop the final stage of a DVB-T2 decoder: the extraction of a video file from the binary file produced by the rest of the decoder. The decoder is the software of a receiver developed in 2010 by the TSR (Signal Processing and Radiocommunications) department of the Bilbao School of Engineering. That software can analyze the received DVB-T2 signal to compute the error rate and determine other relevant parameters, such as the modulation scheme used. However, to assess the improvements of DVB-T2 subjectively, and even to determine how errors affect picture quality, the transmitted video must be displayed. A project was therefore started whose objective is to program new software that produces a file containing that video. The software is written in Matlab; it takes the receiver's output file as input and processes it to obtain a new file with the video. Once programmed and tested for correctness, it is run after the TSR department's receiver. With the video available, picture quality can be compared across different communication error rates, simulating transmissions in different environments, each with its corresponding noise. In this way, the behavior of a real transmission can be estimated with high accuracy as a function of weather and other factors affecting the signal-to-noise ratio.
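The core of such an extraction step is locating the MPEG transport stream inside the raw byte dump: every TS packet is 188 bytes long and starts with the sync byte 0x47. The sketch below (in Python, whereas the project itself used Matlab) shows the sync-lock idea in its simplest naive form; the packet framing is standard MPEG-2 TS, but the function and demo data are invented for illustration.

```python
# Illustrative sketch: recover an MPEG transport stream from a raw byte dump
# by finding an offset where 0x47 sync bytes repeat every 188 bytes.
TS_SYNC, TS_LEN = 0x47, 188

def extract_ts(raw: bytes) -> bytes:
    # naive sync lock: try each candidate start offset in the first packet
    for off in range(min(TS_LEN, len(raw))):
        if all(raw[i] == TS_SYNC
               for i in range(off, len(raw) - TS_LEN, TS_LEN)):
            return raw[off:]
    return b""

# toy demo: three well-formed packets preceded by two junk bytes
packet = bytes([TS_SYNC]) + bytes(187)
dump = b"\x00\x01" + packet * 3
ts = extract_ts(dump)
print(len(ts) // TS_LEN)   # -> 3 recovered packets
```

A real tool would additionally demultiplex the video PID from the recovered packets before writing the playable file.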

Relevance:

100.00%

Publisher:

Abstract:

A microgal absolute gravimeter based on a high-precision, high-stability differential interferometer is presented. The gravimeter's distance-measurement technique is studied in detail; free-fall distance-measurement experiments show that the proposed high-precision differential interferometer meets the requirements of a microgal absolute gravimeter with a relative uncertainty of 6.4×10⁻⁹. Moreover, the differential interferometer is more stable than the Mach-Zehnder interferometers currently in wide use in absolute gravimeters.
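The free-fall measurement reduces to fitting the quadratic trajectory z(t) = z0 + v0·t + g·t²/2 to the (time, distance) samples the interferometer produces. The sketch below shows that step on simulated, noiseless data; all numbers are illustrative, not the paper's.

```python
import numpy as np

# Minimal sketch of the free-fall fit in an absolute gravimeter:
# recover g from (t, z) samples of the trajectory. Values are illustrative.
g_true = 9.81234
t = np.linspace(0.0, 0.2, 400)                    # 200 ms drop (assumed)
z = 1e-4 + 0.002 * t + 0.5 * g_true * t**2        # noiseless trajectory

coef = np.polyfit(t, z, 2)                        # coefficients [g/2, v0, z0]
g_fit = 2.0 * coef[0]
print(g_fit)                                      # recovers g_true
```

In the real instrument the fit residuals and the stability of the fringe-counting interferometer set the achievable relative uncertainty, the 6.4×10⁻⁹ figure quoted above.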

Relevance:

100.00%

Publisher:

Abstract:

This work was prepared within the scope of the research projects HAR2011-26364, "Human communities of the upper Ebro basin in the Pleistocene-Holocene transition," of the Ministry of Science and Innovation, and CGL2009-12703-C03-03, "Geology, geochronology and paleobiology of the Sierra de Atapuerca sites," of the Ministry of Education and Science. It also falls within the work of the Prehistory Research Group of the University of the Basque Country (UPV/EHU), IT-288-07/UFI 11-09.

Relevance:

100.00%

Publisher:

Abstract:

Red bream (Beryx decadactylus) is a commercially important deep-sea benthopelagic fish with a circumglobal distribution on insular and continental slopes and seamounts. In the United States, small numbers are caught incidentally in the wreckfish (Polyprion americanus) fishery, which operates off the southeastern coast, but no biological information exists for the management of the U.S. red bream population. For this study, otoliths (n=163) and gonads (n=161) were collected from commercially caught red bream between 2003 and 2008 to determine life history parameters. Specimens ranged in size from 410 to 630 mm fork length and were all determined to be mature by histological examination of the gonads. Females in spawning condition were observed from June through September, and reproductively active males were found year-round. Sectioned otoliths were difficult to interpret, but maximum age estimates were much higher than the 15 years previously reported for this species from the eastern North Atlantic based on whole-otolith analysis. Estimated ages ranged from 8 to 69 years, and a minimum lifespan of 49 years was validated by using bomb radiocarbon dating. Natural mortality was estimated at 0.06/yr. This study shows that red bream are longer lived and more vulnerable to overfishing than previously assumed and should be managed carefully to prevent overexploitation.
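The estimated natural mortality can be given a quick interpretation under the standard exponential survivorship model N(t) = N0·e^(−Mt); the choice of that model for this illustration is mine, with M = 0.06/yr and the ages taken from the abstract.

```python
import math

# Survivorship implied by the estimated natural mortality M = 0.06/yr,
# assuming the standard exponential model N(t) = N0 * exp(-M * t).
M = 0.06
for age in (15, 49, 69):          # old max age, validated min lifespan, oldest fish
    print(age, math.exp(-M * age))
```

Roughly 5% of a cohort would survive to the validated 49-year lifespan, consistent with a long-lived, low-turnover population that is vulnerable to overfishing.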

Relevance:

100.00%

Publisher:

Abstract:

The Compact Muon Solenoid (CMS) is one of the main detectors installed at the LHC, enabling the study of many aspects of physics, from the Standard Model to dark matter. This general-purpose detector was built to measure muons with high precision, and all of its subdetectors were built with high granularity, making it possible to identify and characterize the kinematic properties of the final-state particles of a collision. The event-reconstruction algorithm includes jet identification; that is, the signature of parton production in the collision can be identified, and measuring multijet production cross sections is one way to probe the contributions of perturbative Quantum Chromodynamics (QCD), allowing the predictions implemented in event simulations to be evaluated. With a view to characterizing QCD-related processes in proton-proton collisions at a center-of-mass energy of 7 TeV, the measurement of the inclusive multijet production cross section at CMS is presented. The measurement uses real data collected in 2010, when there were few collisions per bunch crossing, with an integrated luminosity of L = 2.869 pb⁻¹, and uses jets spanning nearly the entire accessible phase space in pseudorapidity, |η| ≤ 4.8, and transverse momentum, pT ≥ 30 GeV/c. Detector effects were unfolded from the result, which is compared with simulated predictions.
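The basic arithmetic behind any such counting measurement is σ = N / (ε·L): selected events divided by selection efficiency times integrated luminosity. Only the luminosity below comes from the abstract; the event count and efficiency are invented placeholders, so the resulting number is purely illustrative.

```python
# Illustrative cross-section arithmetic, sigma = N / (efficiency * luminosity).
lumi_pb = 2.869          # pb^-1, integrated luminosity from the abstract
n_events = 12000         # hypothetical selected multijet events
efficiency = 0.85        # hypothetical selection x trigger efficiency

sigma_pb = n_events / (efficiency * lumi_pb)
print(sigma_pb)          # cross section in pb (illustrative only)
```

The real analysis replaces this single division with bin-by-bin unfolding of detector effects, as the abstract notes.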

Relevance:

100.00%

Publisher:

Abstract:

Reef fishes are conspicuous and essential components of coral reef ecosystems and economies of southern Florida and the United States Virgin Islands (USVI). Throughout Florida and the USVI, reef fish are under threat from a variety of anthropogenic and natural stressors including overfishing, habitat loss, and environmental changes. The South Florida/Caribbean Network (SFCN), a unit of the National Park Service (NPS), is charged with monitoring reef fishes, among other natural and cultural resources, within six parks in the South Florida - Caribbean region (Biscayne National Park, BISC; Buck Island Reef National Monument, BUIS; Dry Tortugas National Park, DRTO; Everglades National Park, EVER; Salt River Bay National Historic Park and Ecological Preserve, SARI; Virgin Islands National Park, VIIS). Monitoring data is intended for park managers who are and will continue to be asked to make decisions to balance environmental protection, fishery sustainability and park use by visitors. The range and complexity of the issues outlined above, and the need for NPS to invest in a strategy of monitoring, modeling, and management to ensure the sustainability of its precious assets, will require strategic investment in long-term, high-precision, multispecies reef fish data that increases inherent system knowledge and reduces uncertainty. The goal of this guide is to provide the framework for park managers and researchers to create or enhance a reef fish monitoring program within areas monitored by the SFCN. The framework is expected to be applicable to other areas as well, including the Florida Keys National Marine Sanctuary and Virgin Islands Coral Reef National Monument. The favored approach is characterized by an iterative process of data collection, dataset integration, sampling design analysis, and population and community assessment that evaluates resource risks associated with management policies. 
Using this model, a monitoring program can adapt its survey methods to increase accuracy and precision of survey estimates as new information becomes available, and adapt to the evolving needs and broadening responsibilities of park management.

Relevance:

100.00%

Publisher:

Abstract:

Perhaps the most difficult job of the ecotoxicologist is extrapolating data generated in laboratory experiments, with their high precision and accuracy, to the real world of highly dynamic aquatic environments. The establishment of baseline laboratory toxicity data for individual compounds, together with ecologically relevant field studies, serves as a precursor to the ecosystem-level studies needed for ecological risk assessment. The first stage in the field portion of risk assessment is the determination of actual environmental concentrations of the contaminant being studied and the matching of those concentrations with laboratory toxicity tests. Risk estimates can then be produced via risk quotients, which indicate the probability that adverse effects may occur. In this first stage of risk assessment, environmental realism is often not achieved. This is due, in part, to the fact that single-species laboratory toxicity tests, while highly controlled, do not account for the complex interactions (chemical, physical, and biological) that take place in the natural environment. By controlling as many variables in the laboratory as possible, an experiment can be designed so that the real effects of a compound on a particular test organism can be determined. This approach, however, makes comparison with real-world data difficult. Conversely, field-oriented studies fall short in the interpretation of ecological risk because of low statistical power, lack of adequate replication, and the enormous amount of time and money needed to perform them. Unlike in a controlled laboratory bioassay, stressors other than the chemical compound in question affect organisms in the environment, ranging from natural occurrences (such as changes in temperature, salinity, and community interactions) to other confounding anthropogenic inputs.
Therefore, an improved aquatic toxicity test that will enhance environmental realism and increase the accuracy of future ecotoxicological risk assessments is needed.
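The risk-quotient screening step mentioned above reduces to a ratio of a measured environmental concentration to an effect (or no-effect) concentration. The function name, the example concentrations, and the conventional RQ ≥ 1 concern threshold below are illustrative assumptions, not values from the text.

```python
# Illustrative risk-quotient screen: RQ = measured environmental
# concentration / effect concentration; RQ >= 1 conventionally flags concern.
def risk_quotient(mec_ug_l: float, pnec_ug_l: float) -> float:
    # both concentrations in the same units, e.g. ug/L
    return mec_ug_l / pnec_ug_l

rq = risk_quotient(mec_ug_l=0.5, pnec_ug_l=2.0)
print(rq, rq >= 1.0)   # -> 0.25 False: no concern flagged for this site
```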

Relevance:

100.00%

Publisher:

Abstract:

The Biogeography Branch’s Sampling Design Tool for ArcGIS provides a means to effectively develop sampling strategies in a geographic information system (GIS) environment. The tool was produced as part of an iterative process of sampling design development, whereby existing data inform new design decisions. The objective of this process, and hence a product of this tool, is an optimal sampling design which can be used to achieve accurate, high-precision estimates of population metrics at minimum cost. Although NOAA’s Biogeography Branch focuses on marine habitats and some examples reflect this, the tool can be used to sample any type of population defined in space, be it coral reefs or corn fields.
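One classic calculation behind "high-precision estimates at minimum cost" is Neyman allocation for stratified sampling: allocate effort to each stratum in proportion to N_h·S_h (stratum size times standard deviation), which minimizes the variance of the stratified mean for a fixed total sample size. This is a textbook sketch, not the tool's code, and the stratum sizes and standard deviations are invented example values.

```python
# Illustrative Neyman allocation: n_h proportional to N_h * S_h.
def neyman_allocation(sizes, stdevs, n_total):
    weights = [N * s for N, s in zip(sizes, stdevs)]
    total = sum(weights)
    return [round(n_total * w / total) for w in weights]

# e.g. three hypothetical habitat strata (reef, seagrass, sand)
alloc = neyman_allocation(sizes=[100, 300, 600], stdevs=[8.0, 4.0, 1.0],
                          n_total=120)
print(alloc)   # more effort goes to large and highly variable strata
```

Iterating this allocation as new survey data sharpen the stratum variance estimates is exactly the data-informs-design loop the abstract describes.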