864 results for Sensor Data Fusion Applications
Abstract:
Describes a series of experiments in the Joint European Torus (JET), culminating in the first tokamak discharges in a deuterium-tritium fuelled mixture. The experiments were undertaken within limits imposed by restrictions on vessel activation and tritium usage. The objectives were: (i) to produce more than one megawatt of fusion power in a controlled way; (ii) to validate transport codes and provide a basis for accurately predicting the performance of deuterium-tritium plasmas from measurements made in deuterium plasmas; (iii) to determine tritium retention in the torus systems and to establish the effectiveness of discharge cleaning techniques for tritium removal; (iv) to demonstrate the technology related to tritium usage; and (v) to establish safe procedures for handling tritium in compliance with the regulatory requirements. A single-null X-point magnetic configuration, diverted onto the upper carbon target, with reversed toroidal magnetic field was chosen. Deuterium plasmas were heated by high power, long duration deuterium neutral beams from fourteen sources and fuelled also by up to two neutral beam sources injecting tritium. The results from three of these high performance hot ion H-mode discharges are described: a high performance pure deuterium discharge; a deuterium-tritium discharge with a 1% mixture of tritium fed to one neutral beam source; and a deuterium-tritium discharge with 100% tritium fed to two neutral beam sources. The TRANSP code was used to check the internal consistency of the measured data and to determine the origin of the measured neutron fluxes. In the best deuterium-tritium discharge, the tritium concentration was about 11% at the time of peak performance, when the total neutron emission rate was 6.0 × 10^17 neutrons/s. The integrated total neutron yield over the high power phase, which lasted about 2 s, was 7.2 × 10^17 neutrons, with an accuracy of ±7%. The actual fusion amplification factor, Q_DT, was about 0.15.
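The quoted neutron figures can be cross-checked with a back-of-the-envelope estimate. The sketch below is an illustrative calculation only, not taken from the paper; it assumes essentially all neutrons come from D-T reactions releasing about 17.6 MeV each (roughly 14.1 MeV to the neutron and 3.5 MeV to the alpha particle).

```python
# Rough cross-check of D-T fusion power from the quoted neutron rate.
# Assumption (not from the paper): every neutron marks one D-T reaction
# releasing ~17.6 MeV in total.

MEV_TO_J = 1.602e-13        # joules per MeV
E_DT_MEV = 17.6             # total energy per D-T reaction, MeV

neutron_rate = 6.0e17       # neutrons per second at peak performance
fusion_power_w = neutron_rate * E_DT_MEV * MEV_TO_J
print(f"Estimated peak fusion power: {fusion_power_w / 1e6:.1f} MW")   # ~1.7 MW

total_neutrons = 7.2e17     # integrated yield over the ~2 s high power phase
fusion_energy_mj = total_neutrons * E_DT_MEV * MEV_TO_J / 1e6
print(f"Estimated fusion energy over the phase: {fusion_energy_mj:.1f} MJ")  # ~2 MJ
```

The resulting estimate of roughly 1.7 MW is consistent with the stated objective of producing more than one megawatt of fusion power in a controlled way.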
Abstract:
The Alliance for Coastal Technologies (ACT) convened a workshop on "Wave Sensor Technologies" in St. Petersburg, Florida on March 7-9, 2007, hosted by the University of South Florida (USF) College of Marine Science, an ACT partner institution. The primary objectives of this workshop were to: 1) define the present state of wave measurement technologies, 2) identify the major impediments to their advancement, and 3) make strategic recommendations for future development and on the necessary steps to integrate wave measurement sensors into operational coastal ocean observing systems. The participants were drawn from various sectors, including research scientists, technology developers and industry providers, and technology users such as operational coastal managers and coastal decision makers. Waves are consistently ranked as a critical variable for numerous coastal issues, from maritime transportation to beach erosion to habitat restoration. For the purposes of this workshop, the participants focused on measuring "wind waves" (i.e., waves on the water surface, generated by the wind, restored by gravity, with periods between approximately 3 and 30 seconds), although it was recognized that a wide range of both forced and free waves exist on and in the oceans. Although the workshop emphasized the nearshore coastal component of wave measurements, the participants also stressed the importance of open ocean surface wave measurements. Wave sensor technologies presently available for both environments include bottom-mounted pressure gauges, surface-following buoys, wave staffs, acoustic Doppler current profilers, and shore-based remote sensing radar instruments. One of the recurring themes of workshop discussions was the dichotomous nature of wave data users. The two separate groups, open ocean wave data users and nearshore/coastal wave data users, have different requirements. Generally, user requirements increase both in spatial/temporal resolution and in precision as one moves closer to shore. Most ocean-going mariners are adequately satisfied with measurements of wave period, wave height, and a general wave direction. However, most coastal and nearshore users require at least the first five Fourier parameters (the "First 5"): wave energy and the first four directional Fourier coefficients. Furthermore, wave research scientists would like sensors capable of providing measurements beyond the first four Fourier coefficients. It was debated whether high precision wave observations in one location can take the place of less precise measurements at a different location; this could be accomplished by advancing wave models and using them to extend data to nearby areas. However, the consensus was that models are no substitute for in situ wave data. [PDF contains 26 pages]
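As a generic illustration of the non-directional bulk parameters that satisfy most ocean-going mariners, the sketch below estimates significant wave height and peak period from a surface-elevation (heave) record via its variance spectrum. The synthetic record, sampling rate, and spectral settings are assumptions, not values from any workshop sensor; "First 5" directional processing would additionally derive the first four directional Fourier coefficients from cross-spectra of heave with pitch/roll or horizontal displacements.

```python
# Generic estimate of significant wave height (Hs) and peak period (Tp)
# from a surface-elevation time series. Illustrative only; the record is synthetic.
import numpy as np
from scipy.signal import welch

fs = 2.0                                    # sampling rate, Hz (assumed)
t = np.arange(0, 1800, 1 / fs)              # 30-minute record
rng = np.random.default_rng(0)
# Synthetic "wind wave" heave: 8 s swell plus noise, in metres.
eta = 0.8 * np.sin(2 * np.pi * t / 8.0) + 0.2 * rng.standard_normal(t.size)

f, S = welch(eta, fs=fs, nperseg=512)       # variance density spectrum, m^2/Hz
m0 = np.trapz(S, f)                         # zeroth spectral moment (variance)
Hs = 4.0 * np.sqrt(m0)                      # significant wave height
Tp = 1.0 / f[np.argmax(S)]                  # peak period

print(f"Hs = {Hs:.2f} m, Tp = {Tp:.1f} s")
```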
Abstract:
The Alliance for Coastal Technologies (ACT) Workshop entitled "Biological Platforms as Sensor Technologies and their Use as Indicators for the Marine Environment" was held in Seward, Alaska, September 19-21, 2007. The workshop was co-hosted by the University of Alaska Fairbanks (UAF) and the Alaska SeaLife Center (ASLC). The workshop was attended by 25 participants representing a wide range of research scientists, managers, and manufacturers who develop and deploy sensor equipment using aquatic vertebrates as the mode of transport. Eight recommendations were made by participants at the conclusion of the workshop and are presented here without prioritization: 1. Encourage research toward development of energy scavenging devices of suitable sizes for use in remote sensing packages attached to marine animals. 2. Encourage funding sources for development of new sensor technologies and animal-borne tags. 3. Develop animal-borne environmental sensor platforms that offer more combined systems and improved data recovery methodologies, and expand the geographic scope of complementary fixed sensor arrays. 4. Engage the oceanographic community by: a. Offering a mini workshop at an AGU ocean sciences conference for people interested in developing an ocean carbon program that utilizes animal-borne sensor technology. b. Outreach to chemical oceanographers. 5. Mine and merge technologies from other disciplines that may be applied to marine sensors (e.g., the biomedical field). 6. Encourage the NOAA Permitting Office to: a. Make a more predictable, reliable, and consistent permitting system for using animal platforms. b. Establish an evaluation process. c. Adhere to established standards. 7. Promote the expanded use of calibrated hydrophones as part of existing animal platforms. 8. Encourage the Integrated Ocean Observing System (IOOS) to promote animal tracking as an effective sampler of the marine environment, and the use of animals as ocean sensor technology platforms. [PDF contains 20 pages]
Abstract:
The use of self-contained, low-maintenance sensor systems installed on commercial vessels is becoming an important monitoring and scientific tool in many regions around the world. These systems integrate data from meteorological and water quality sensors with GPS data into a data stream that is automatically transferred from ship to shore. To begin linking some of this developing expertise, the Alliance for Coastal Technologies (ACT) and the European Coastal and Ocean Observing Technology (ECOOT) organized a workshop on this topic in Southampton, United Kingdom, October 10-12, 2006. The participants included technology users, technology developers, and shipping representatives. They collaborated to identify sensors currently employed on integrated systems, users of these data, limitations associated with these systems, and ways to overcome these limitations. The group also identified additional technologies that could be employed on future systems and examined whether standard architectures and data protocols for integrated systems should be established. Participants at the workshop defined 17 different parameters currently being measured by integrated systems. They identified that diverse user groups, ranging from resource management agencies such as the Environmental Protection Agency (EPA) to local tourism groups and educational organizations, utilize information from these systems. Among the limitations identified were instrument compatibility and interoperability, data quality control and quality assurance, and sensor calibration and/or maintenance frequency. Standardization of these integrated systems was viewed as both advantageous and disadvantageous; while participants believed that standardization could be beneficial on many levels, they also felt that users may be hesitant to purchase a suite of instruments from a single manufacturer, and that a "plug and play" system including sensors from multiple manufacturers may be difficult to achieve. A priority recommendation and conclusion for the general integrated sensor system community was to provide vessel operators with real-time access to relevant data (e.g., ambient temperature and salinity to increase the efficiency of water treatment systems, and meteorological data for increased vessel safety and operating efficiency) for broader system value. Simplified data displays are also required for education and public outreach/awareness. Other key recommendations were to encourage the use of integrated sensor packages within observing systems such as IOOS and EuroGOOS, identify additional customers of sensor system data, and publish results of previous work in peer-reviewed journals to increase agency and scientific awareness and confidence in the technology. Priority recommendations and conclusions for ACT entailed highlighting the value of integrated sensor systems for vessels of opportunity through articles in the popular press and the marine science literature. [PDF contains 28 pages]
Abstract:
The co-organized Alliance for Coastal Technologies (ACT) and National Data Buoy Center (NDBC) "Meteorological Buoy Sensors Workshop" convened in Solomons, Maryland, April 19 to 21, 2006, sponsored by the University of Maryland Center for Environmental Science (UMCES) Chesapeake Biological Laboratory (CBL), an ACT partner institution. Participants from various sectors, including resource managers and industry representatives, collaborated to focus on technologies and sensors that measure the near-surface variables of wind speed and direction, barometric pressure, humidity, and air temperature. The vendor list was accordingly targeted at companies that produce these types of sensors. The managers represented a cross section of federal, regional, and academic marine observing interests from around the country. Workshop discussions focused on the challenges associated with making marine meteorological observations in general and on problems specific to particular variables. Discussions also explored methods to mitigate these challenges through the adoption of best practices, improved technologies, and increased standardization. Some of the key workshop outcomes and recommendations included: Ocean.US should establish a committee devoted to observations, which would have a key role in developing observing standards. The community should adopt the target cost, reliability, and performance standards drafted for a typical meteorological package to be used by a regional observing system. A forum should be established to allow users and manufacturers to share best practices for the employment of marine meteorological sensors; the ACT website would host the forum. Federal activities that evaluate meteorological sensors should make their results publicly available. ACT should extend its evaluation process to include meteorological sensors. A follow-on workshop should be conducted covering the meteorological variables not addressed by this workshop. (pdf contains 18 pages)
Abstract:
The Alliance for Coastal Technologies (ACT) held a Workshop on Sensor Technology for Assessing Groundwater-Surface Water Interactions in the Coastal Zone on March 7 to 9, 2005, in Savannah, GA. The main goal of the workshop was to summarize the general parameters that have been found useful in assessing groundwater-surface water (GW-SW) interactions in the coastal zone. The workshop participants (Appendix I) were specifically charged with identifying the types of sensor systems, if any, that have been used to obtain time-series data and with indicating which parameters may be the most amenable to the development/application of sensor technology. The group consisted of researchers, industry representatives, and environmental managers. Four general recommendations were made: 1. Educate coastal managers and agencies on the importance of GW-SW interactions, keeping in mind that regulatory agencies are driven by a different set of rules than researchers: the focus is on understanding the significance of the problem and providing solutions. ACT could facilitate this process in two ways. First, given that the research literature on this subject is fairly diffuse, ACT could provide links from its web site to fact sheets or other literature. Second, ACT could organize a focused meeting for managers and/or agency groups. 2. Encourage development of primary tools for quantifying flow. The most promising technology in this respect is flow meters designed for flux chambers, mainly because they should be simple to use and can be made relatively inexpensively. However, it should be kept in mind that they provide only point measurements, and several would need to be deployed as a network in order to obtain reliable flow estimates. For evaluating system-wide GW-SW interactions, tools that integrate the signal over large areas would be required. Suggestions include user-friendly hydrogeologic models (keeping in mind that freshwater flow is not the entire story) or continuous radon monitors. Though the latter would be slightly more difficult to use in terms of background knowledge, such an instrument would be low power and easy to operate and maintain. ACT could facilitate this recommendation by identifying funding opportunities on its web site and/or performing evaluations of existing technologies that could be summarized on the web site. (pdf contains 18 pages)
Abstract:
This thesis presents theories, analyses, and algorithms for detecting and estimating parameters of geospatial events with today's large, noisy sensor networks. A geospatial event is initiated by a significant change in the state of points in a region in a 3-D space over an interval of time. After the event is initiated it may change the state of points over larger regions and longer periods of time. Networked sensing is a typical approach for geospatial event detection. In contrast to traditional sensor networks comprised of a small number of high quality (and expensive) sensors, trends in personal computing devices and consumer electronics have made it possible to build large, dense networks at a low cost. The changes in sensor capability, network composition, and system constraints call for new models and algorithms suited to the opportunities and challenges of the new generation of sensor networks. This thesis offers a single unifying model and a Bayesian framework for analyzing different types of geospatial events in such noisy sensor networks. It presents algorithms and theories for estimating the speed and accuracy of detecting geospatial events as a function of parameters from both the underlying geospatial system and the sensor network. Furthermore, the thesis addresses network scalability issues by presenting rigorous scalable algorithms for data aggregation for detection. These studies provide insights to the design of networked sensing systems for detecting geospatial events. In addition to providing an overarching framework, this thesis presents theories and experimental results for two very different geospatial problems: detecting earthquakes and hazardous radiation. The general framework is applied to these specific problems, and predictions based on the theories are validated against measurements of systems in the laboratory and in the field.
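As a deliberately simplified illustration of Bayesian detection with many cheap, noisy sensors (an illustration only, not the thesis's actual model), the sketch below fuses independent binary sensor reports into a posterior probability that an event has occurred. The detection rate, false-alarm rate, prior, and conditional-independence assumption are all invented for the example.

```python
# Naive-Bayes fusion of binary reports from many noisy sensors.
# Illustrative only: assumes conditionally independent sensors with
# known detection and false-alarm probabilities.
import math

def event_posterior(reports, p_detect=0.3, p_false_alarm=0.05, prior=1e-3):
    """Posterior P(event | reports) for a list of 0/1 sensor reports."""
    log_odds = math.log(prior / (1.0 - prior))
    for r in reports:
        if r:   # sensor fired
            log_odds += math.log(p_detect / p_false_alarm)
        else:   # sensor stayed quiet
            log_odds += math.log((1.0 - p_detect) / (1.0 - p_false_alarm))
    return 1.0 / (1.0 + math.exp(-log_odds))

# 100 sensors, 25 of which fired: individually weak detectors
# (30% detection rate) become decisive in aggregate.
print(event_posterior([1] * 25 + [0] * 75))
```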
Abstract:
The effect of laser spot size on the neutron yield of table-top nuclear fusion from the explosion of deuterium clusters heated by an intense femtosecond laser pulse is investigated using a simplified model, in which the cluster size distribution and the attenuation of the laser energy as it propagates through the cluster jet are taken into account. It is found that there is an optimal laser spot size that maximizes the fusion neutron yield for a given laser pulse and a specific deuterium gas cluster jet. This optimal spot size, which depends on the laser and cluster jet parameters, has been calculated and compared with the available experimental data. Reasonable agreement between the calculated results and the published experimental results is found.
Abstract:
Viruses possess very specific methods of targeting and entering cells. These methods would be extremely useful if they could also be applied to drug delivery, but little is known about the molecular mechanisms of the viral entry process. In order to gain further insight into mechanisms of viral entry, chemical and spectroscopic studies in two systems were conducted, examining hydrophobic protein-lipid interactions during Sendai virus membrane fusion, and the kinetics of bacteriophage λ DNA injection.
Sendai virus glycoprotein interactions with target membranes during the early stages of fusion were examined using time-resolved hydrophobic photoaffinity labeling with the lipid-soluble carbene generator 3-(trifluoromethyl)-3-(m-[125I]iodophenyl)diazirine (TID). The probe was incorporated in target membranes prior to virus addition and photolysis. During Sendai virus fusion with liposomes composed of cardiolipin (CL) or phosphatidylserine (PS), the viral fusion (F) protein is preferentially labeled at early time points, supporting the hypothesis that hydrophobic interaction of the fusion peptide at the N-terminus of the F_1 subunit with the target membrane is an initiating event in fusion. Correlation of the hydrophobic interactions with independently monitored fusion kinetics further supports this conclusion. Separation of proteins after labeling shows that the F_1 subunit, containing the putative hydrophobic fusion sequence, is exclusively labeled, and that the F_2 subunit does not participate in fusion. Labeling shows temperature and pH dependence consistent with a need for protein conformational mobility and with fusion at neutral pH. Higher amounts of labeling during fusion with CL vesicles than during virus-PS vesicle fusion reflect membrane packing regulation of peptide insertion into target membranes. Labeling of the viral hemagglutinin/neuraminidase (HN) at low pH indicates that HN-mediated fusion is triggered by hydrophobic interactions, after titration of acidic amino acids. HN labeling under nonfusogenic conditions reveals that viral binding may involve hydrophobic as well as electrostatic interactions. Controls for diffusional labeling exclude a major contribution from this source. Labeling during reconstituted Sendai virus envelope-liposome fusion shows that functional reconstitution involves retention of the proteins' ability to undergo hydrophobic interactions.
Examination of Sendai virus fusion with erythrocyte membranes indicates that hydrophobic interactions also trigger fusion between biological membranes, and that HN binding may involve hydrophobic interactions as well. Labeling of the erythrocyte membranes revealed close membrane association of spectrin, which may play a role in regulating membrane fusion. The data show that hydrophobic fusion protein interaction with both artificial and biological membranes is a triggering event in fusion. Correlation of these results with earlier studies of membrane hydration and fusion kinetics provides a more detailed view of the mechanism of fusion.
The kinetics of DNA injection by bacteriophage λ into liposomes bearing reconstituted receptors were measured using fluorescence spectroscopy. LamB, the bacteriophage receptor, was extracted from bacteria and reconstituted into liposomes by detergent-removal dialysis. The DNA-binding fluorophore ethidium bromide was encapsulated in the liposomes during dialysis. Enhanced fluorescence of ethidium bromide upon binding to injected DNA was monitored, showing that injection is a rapid, one-step process. The bimolecular rate law, determined by the method of initial rates, revealed that injection occurs several times faster than indicated by earlier studies employing indirect assays.
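For readers unfamiliar with the method of initial rates mentioned above, the sketch below shows how a second-order rate constant could be extracted from initial-rate data by a least-squares fit. All concentrations and rates are hypothetical and are not the thesis's measurements.

```python
# Method of initial rates for a bimolecular reaction, v0 = k [A][B].
# All numbers below are hypothetical, purely to show the fitting step.
import numpy as np

conc_phage    = np.array([1e-10, 2e-10, 4e-10, 8e-10])          # M (hypothetical)
conc_liposome = np.array([5e-9,  5e-9,  5e-9,  5e-9])           # M (hypothetical)
v0            = np.array([1.0e-13, 2.1e-13, 3.9e-13, 8.2e-13])  # M/s (hypothetical)

# For v0 = k [A][B], a plot of v0 against [A][B] is a line through the
# origin with slope k; fit it by least squares with no intercept.
x = conc_phage * conc_liposome
k = np.sum(x * v0) / np.sum(x * x)
print(f"k ≈ {k:.2e} M^-1 s^-1")
```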
It is hoped that these studies will increase understanding of the mechanisms of virus entry into cells and facilitate the development of virus-mimetic drug delivery strategies.
Abstract:
A novel fiber Bragg grating (FBG) sensor system based on an interrogation technique using two parallel matched gratings was designed and discussed theoretically. With the interrogation grating simultaneously serving as a temperature compensation grating, the wavelength drifts induced by temperature and strain were discriminated. Expressions for temperature and strain were deduced for this solution, and the dual-value problem and cross-sensitivity were solved simultaneously through data processing. The influence of the FBG parameters on the dynamic range and precision was discussed. In addition, changes in the ambient temperature do not affect the dynamic range of the sensor system, owing to temperature tuning. The system proposed in this paper should help accelerate real engineering applications of FBG sensing techniques. (c) 2007 Elsevier GmbH. All rights reserved.
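The abstract does not reproduce the deduced expressions, but the usual way two gratings with different sensitivities discriminate strain from temperature is by inverting a 2x2 sensitivity matrix. The sketch below illustrates that generic approach with typical 1550 nm FBG coefficients; the numbers and the matrix form are assumptions for illustration, not the coefficients of this particular system.

```python
# Generic two-grating discrimination of strain and temperature:
#   [d_lambda_1]   [K_e1  K_T1] [d_strain]
#   [d_lambda_2] = [K_e2  K_T2] [d_temp  ]
# Coefficient values are typical 1550 nm figures, assumed for illustration.
import numpy as np

K = np.array([[1.2e-3, 10.0e-3],     # grating 1: nm/microstrain, nm/degC
              [0.7e-3, 12.0e-3]])    # grating 2 (different sensitivities)

d_lambda = np.array([0.080, 0.085])  # measured wavelength shifts, nm (assumed)

d_strain, d_temp = np.linalg.solve(K, d_lambda)
print(f"strain change ≈ {d_strain:.1f} microstrain, "
      f"temperature change ≈ {d_temp:.1f} degC")
```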
Abstract:
The key issues for engineering application of the dual-grating parallel matched interrogation method are expanding the measurable range, improving usability, and lowering cost by adopting a compact and simple setup based on existing conditions, while improving the precision of the data-processing scheme. A credible and effective data-processing scheme based on a novel divisional look-up table is proposed, building on the advantages of other schemes. Each undetermined measurement belongs to a particular section, which is confirmed first; the value is then looked up within that section of the table to obtain the corresponding microstrain. The scheme not only solves the inherent problems of the traditional approach (the double-value problem and the small measurable range) but also enhances the precision, improving the performance of the system. From the experimental results, the measurable range of the system is 525 με and the precision is ±1 με with normal matched gratings. The system works in real time, which is adequate for most engineering measurement requirements. (C) 2007 Elsevier GmbH. All rights reserved.
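To make the idea of a divisional look-up table concrete, the sketch below shows one possible organization: the calibration curve is split into monotonic sections, the section of an incoming reading is confirmed first, and the reading is then interpolated within that section only. The calibration data, the intensity-like measurand, and the way the section is supplied are all assumptions for illustration; the paper's actual table and section criterion are not given in the abstract.

```python
# One possible "divisional look-up" organization (illustrative only):
# split a non-monotonic calibration curve into monotonic sections,
# confirm the section first, then interpolate within it.
import numpy as np

# Hypothetical calibration: reflected power (a.u.) vs microstrain,
# rising then falling, so a raw look-up would be double-valued.
strain = np.arange(0, 525 + 1, 25)                       # microstrain
power = np.where(strain <= 250, strain / 250.0,          # rising section
                 (525 - strain) / 275.0)                 # falling section

sections = {
    "rising":  strain <= 250,
    "falling": strain >= 250,
}

def lookup(measured_power, section):
    """Interpolate microstrain within the confirmed section only."""
    mask = sections[section]
    s, p = strain[mask], power[mask]
    order = np.argsort(p)                  # np.interp needs increasing x
    return float(np.interp(measured_power, p[order], s[order]))

print(lookup(0.60, "rising"))   # ~150 microstrain
print(lookup(0.60, "falling"))  # ~360 microstrain
```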
Abstract:
In the ninth beamline of the Shenguang-II (SG-II) ICF high-power laser facility, a tunable Fabry-Perot (F-P) filter is used to compensate for the amplitude modulation effect. Based on the technical requirements of the compensation device, a system is proposed that uses a capacitive displacement sensor with nanometer-level precision to monitor the spacing stability of the tunable F-P filter, and the structure and working principle of the monitoring system are discussed in detail. The design of the drive circuit for the capacitive displacement sensor and of the data-processing and control software is presented, and the precision of the capacitive displacement sensor is calibrated. Experimental results show that the displacement monitoring system keeps the spacing stability of the tunable F-P filter within 15 nm/h and the modulation depth of the amplitude modulation effect below 4%.
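The abstract does not give the sensor's transfer function; as a rough illustration of how capacitance readings could be turned into a gap-drift figure comparable to the quoted 15 nm/h, the sketch below uses the ideal parallel-plate relation d = ε0·A/C for an air gap and a linear fit over time. The plate area, readings, and timing are all assumptions, not the SG-II system's data.

```python
# Illustrative conversion of capacitance readings to plate spacing and
# drift rate, using the ideal parallel-plate model d = eps0 * A / C.
# Geometry and readings are assumptions, not the SG-II system's data.
import numpy as np

EPS0 = 8.854e-12          # F/m
AREA = 1.0e-4             # plate area, m^2 (assumed 1 cm^2)

t_hours = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
cap_pf  = np.array([8.8540, 8.8539, 8.8538, 8.8537, 8.8536])   # assumed readings

gap_nm = EPS0 * AREA / (cap_pf * 1e-12) * 1e9      # spacing in nanometres

drift_nm_per_h = np.polyfit(t_hours, gap_nm, 1)[0]  # slope of linear fit
print(f"spacing ≈ {gap_nm[0]:.1f} nm ... {gap_nm[-1]:.1f} nm, "
      f"drift ≈ {drift_nm_per_h:.1f} nm/h")          # well within 15 nm/h here
```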
Abstract:
A new supervised burned area mapping software named BAMS (Burned Area Mapping Software) is presented in this paper. The tool was built from standard ArcGIS (TM) libraries. It computes several of the spectral indexes most commonly used in burned area detection and implements a two-phase supervised strategy to map areas burned between two Landsat multitemporal images. The only input required from the user is the visual delimitation of a few burned areas, from which burned perimeters are extracted. After the discrimination of burned patches, the user can visually assess the results, and iteratively select additional sampling burned areas to improve the extent of the burned patches. The final result of the BAMS program is a polygon vector layer containing three categories: (a) burned perimeters, (b) unburned areas, and (c) non-observed areas. The latter refer to clouds or sensor observation errors. Outputs of the BAMS code meet the requirements of file formats and structure of standard validation protocols. This paper presents the tool's structure and technical basis. The program has been tested in six areas located in the United States, for various ecosystems and land covers, and then compared against the National Monitoring Trends in Burn Severity (MTBS) Burned Area Boundaries Dataset.
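The abstract does not list the indices BAMS computes, but the Normalized Burn Ratio (NBR) and its multitemporal difference (dNBR) are among the indices most commonly used for Landsat burned area work; the sketch below shows that generic computation. The band arrays and the candidate threshold are illustrative assumptions and are not BAMS code.

```python
# Generic NBR / dNBR computation from Landsat NIR and SWIR2 reflectance.
# Arrays and threshold are illustrative; this is not the BAMS implementation.
import numpy as np

def nbr(nir, swir2):
    """Normalized Burn Ratio: (NIR - SWIR2) / (NIR + SWIR2)."""
    nir = nir.astype(np.float64)
    swir2 = swir2.astype(np.float64)
    return (nir - swir2) / np.clip(nir + swir2, 1e-6, None)

# Pre- and post-fire reflectance for two hypothetical pixels.
nir_pre,  swir2_pre  = np.array([[0.35, 0.30]]), np.array([[0.15, 0.14]])
nir_post, swir2_post = np.array([[0.12, 0.29]]), np.array([[0.20, 0.15]])

dnbr = nbr(nir_pre, swir2_pre) - nbr(nir_post, swir2_post)
burned_candidates = dnbr > 0.27          # threshold is an assumption
print(dnbr.round(2), burned_candidates)  # first pixel flagged as burned
```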