861 results for direct load control
Abstract:
Traditionally, nitrogen control has been considered an important component of reducing lake eutrophication and cyanobacterial blooms. Recently, however, this viewpoint has been refuted by researchers in China and North America. In the present paper, we point out that the traditional viewpoint of nitrogen control lacks a scientific basis: the N/P hypothesis is merely a subjective assumption, and bottle bioassay experiments fail to simulate the natural process of nitrogen fixation. Our multi-year comparative research in more than 40 Yangtze lakes indicates that phosphorus is the key factor determining phytoplankton growth regardless of nitrogen concentrations, and that total phytoplankton biomass is determined by total phosphorus, not total nitrogen, concentrations. These results imply that, in the field, nitrogen control will not decrease phytoplankton biomass. This finding is supported by a long-term whole-lake experiment in North America. These outcomes can be generalized as follows: a reduction in nitrogen loading may not decrease total phytoplankton biomass, because it can stimulate blooms of nitrogen-fixing cyanobacteria. To mitigate eutrophication, it is phosphorus, not nitrogen, that should be reduced, unless nitrogen concentrations are high enough to have direct toxic impacts on humans or other organisms. Finally, details are provided on how to relax controls on nitrogen and how to mitigate eutrophication. (C) 2009 National Natural Science Foundation of China and Chinese Academy of Sciences. Published by Elsevier Limited and Science in China Press. All rights reserved.
Abstract:
The variety of laser systems available to industrial laser users is growing, and the choice of the correct laser for a target material application is often based on an empirical assessment. Industrial master oscillator power amplifier systems with tuneable temporal pulse shapes have now entered the market, providing enormous pulse parameter flexibility in an already crowded parameter space. In this paper, an approach is developed to design interaction parameters based on observations of material responses. Energy and material transport mechanisms are studied using pulsed digital holography, post-process analysis techniques and finite-difference modelling to understand the key response mechanisms for a variety of temporal pulse envelopes incident on a silicon (111) substrate. The temporal envelope is shown to be the primary control parameter of the source term that determines the subsequent material response and the resulting surface morphology. A double-peak energy-bridged temporal pulse shape designed through direct application of holographic imaging data is shown to substantially improve surface quality. © 2014 IEEE.
Sonar gain control in echolocating finless porpoises (Neophocaena phocaenoides) in an open water (L)
Abstract:
Source levels of echolocating free-ranging Yangtze finless porpoises (Neophocaena phocaenoides asiaeorientalis) were calculated using a range estimated by measuring the time delays of the signals arriving via the surface and bottom reflection paths to the hydrophone, relative to the direct signal. Peak-to-peak source levels for the finless porpoise ranged from 163.7 to 185.6 dB re 1 μPa. The source levels are highly range dependent and varied approximately as a function of the one-way transmission loss for signals traveling from the animals to the hydrophone. (c) 2006 Acoustical Society of America.
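The back-calculation described in this abstract can be sketched numerically: image-source geometry recovers the range from the surface-reflection time delay, and the source level follows from the received level plus one-way transmission loss. The sound speed, depths, and absorption coefficient below are illustrative assumptions, not values from the paper; spherical spreading is assumed.

```python
import math

C = 1500.0  # assumed nominal sound speed in water, m/s

def horizontal_range(dt, d_animal, d_hydro):
    """Find the horizontal range at which the surface-reflected path
    arrives dt seconds after the direct path (image-source geometry).
    delay(r) decreases monotonically with r, so bisection suffices."""
    def delay(r):
        direct = math.hypot(r, d_animal - d_hydro)
        reflected = math.hypot(r, d_animal + d_hydro)
        return (reflected - direct) / C

    lo, hi = 0.1, 10_000.0
    for _ in range(200):
        mid = (lo + hi) / 2
        if delay(mid) > dt:   # delay too large -> range too small
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def source_level(received_db, range_m, alpha_db_per_m=0.03):
    """Back-calculate source level from received level using one-way
    transmission loss: 20*log10(r) spreading plus linear absorption."""
    return received_db + 20 * math.log10(range_m) + alpha_db_per_m * range_m
```

For example, with an animal at 2 m depth and a hydrophone at 5 m, a multipath delay of about 0.27 ms corresponds to a range of roughly 50 m.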
Abstract:
We report a direct observation of excitonic polarons in InAs/GaAs quantum dots using photoluminescence (PL) spectroscopy. We observe that a new peak, s', emerges below the s shell and shows an anomalous temperature dependence of its emission energy. The peak s' anticrosses with s at a certain temperature, with a large anticrossing gap of up to 31 meV. The behavior of the new peak, which cannot be interpreted using the Huang-Rhys model, provides direct evidence for strong coupling between the exciton and LO phonons, and for the formation of the excitonic polaron. The strong coupling between exciton and phonons opens a way to coherently control the polaron states.
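The anticrossing reported here can be illustrated with the standard two-level coupling model: the two branches are the eigenvalues of a 2x2 Hamiltonian whose off-diagonal coupling g is half the anticrossing gap. Only the 31 meV gap is taken from the abstract; the bare-state energies in the sketch are hypothetical.

```python
import math

def anticrossing(e_s, e_sprime, gap_meV=31.0):
    """Eigenvalues (in meV) of the two-level Hamiltonian
    [[e_s, g], [g, e_sprime]] with g = gap/2; at resonance
    (e_s == e_sprime) the branches are split by exactly the gap."""
    g = gap_meV / 2.0
    mean = (e_s + e_sprime) / 2.0
    split = math.hypot((e_s - e_sprime) / 2.0, g)
    return mean - split, mean + split
```

At resonance the splitting equals the full 31 meV gap; away from resonance the branches bend apart, reproducing the characteristic anticrossing shape.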
Abstract:
A DC-offset cancellation scheme for a 5 GHz direct-conversion receiver compliant with the IEEE 802.11a wireless LAN standard is described in this paper. It uses an analog feedback loop to eliminate the DC offset at the output of the double-balanced mixer. The mixer has a simulated voltage conversion gain of IMB at 5.2 GHz, a noise figure of 9.67 dB, and an IIP3 of 7.6 dBm. The solution provides a 39.1 dB reduction relative to the leakage value at the LO and mixer load resistors; the additional noise figure added to the mixer is less than 0.9 dB, and the added power dissipation is 0.1 mW. The circuit was fabricated in a 60 GHz 0.35 μm SiGe BiCMOS technology.
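The analog feedback loop described above can be illustrated behaviourally in discrete time: an integrator in the feedback path accumulates the residual output and subtracts its estimate from the mixer output, servoing the DC term toward zero. This is a behavioural analogy under an assumed loop gain, not the actual circuit.

```python
def dc_offset_servo(samples, mu=0.05):
    """Discrete-time sketch of DC-offset cancellation by feedback:
    subtract the current offset estimate from each sample, then
    integrate the residual to update the estimate (loop gain mu
    is an illustrative assumption)."""
    est = 0.0
    out = []
    for x in samples:
        y = x - est      # corrected output
        est += mu * y    # integrator in the feedback path
        out.append(y)
    return out
```

Fed a constant (pure DC) input, the output decays geometrically toward zero, which is the desired high-pass behaviour of the loop.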
Abstract:
Because of their high energy density, direct current (dc) thermal plasmas are widely accepted as a processing medium that offers high processing rates, high fluxes of radical species, the potential for smaller installations, a wide choice of reactants, and high quench rates [1]. A broad range of industrial processing methods has been developed based on dc plasma technology. However, nonstationary features have limited new applications of dc plasma in advanced processing, where reliability, reproducibility, and precise controllability are required. These challenges call for a better understanding of arc and jet behavior over a wide range of generating parameters and for comprehensive control of every aspect of the plasma processing.
Abstract:
A new vinyl acyl azide monomer, 4-(azidocarbonyl)phenyl methacrylate, has been synthesized and characterized by NMR and FTIR spectroscopy. The thermal stability of the new monomer has been investigated with FTIR and thermal gravimetry/differential thermal analysis (TG/DTA), and the monomer has been demonstrated to be stable below 50 degrees C in the solid state. The copolymerizations of the new monomer with methyl acrylate have been carried out at room temperature under Co-60 gamma-ray irradiation in the presence of benzyl 1H-imidazole-1-carbodithioate. The results show that the polymerizations bear all the characteristics of controlled/living free-radical polymerizations, such as the molecular weight increasing linearly with the monomer conversion, the molecular weight distribution being narrow (< 1.20), and a linear relationship existing between ln([M]0/[M]) and the polymerization time. The data from 1H NMR and FTIR confirm that no change in the acyl azide groups has occurred in the polymerization process and that acyl azide copolymers have been obtained. The thermal stability of the polymers has also been investigated with TG/DTA and FTIR.
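The quoted signatures of a controlled/living polymerization can be made concrete with ideal first-order kinetics: ln([M]0/[M]) grows linearly in time, and Mn grows linearly with conversion. The rate constant, [M]0/[I]0 ratio, and monomer molar mass below are hypothetical, chosen only to illustrate the relationships.

```python
import math

def living_polymerization(k_app, m0_over_i0, monomer_mw, times):
    """Ideal living-polymerization kinetics: at each time t,
    conversion x = 1 - exp(-k_app * t), ln([M]0/[M]) = k_app * t,
    and Mn = x * ([M]0/[I]0) * monomer_mw (one chain per initiator)."""
    rows = []
    for t in times:
        conversion = 1.0 - math.exp(-k_app * t)
        first_order = k_app * t              # ln([M]0/[M])
        mn = conversion * m0_over_i0 * monomer_mw
        rows.append((t, conversion, first_order, mn))
    return rows
```

In this ideal picture the first-order plot is a straight line through the origin and Mn tracks conversion exactly, which is what the abstract's linearity observations diagnose experimentally.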
Abstract:
Self-doped polyaniline (PANI) micro-rings have been successfully generated electrochemically. The polymer rings were about 100 nm wide, and the ring diameter is tunable from several to dozens of micrometres depending on the current density. The morphology of the nanostructured polyaniline rings was investigated and confirmed by field-emission scanning electron microscopy (FE-SEM). Furthermore, the film was characterized using UV/visible spectroscopy and cyclic voltammetry. A bubble-template formation mechanism for the micro-rings is also proposed. Such electrochemically synthesized nanostructured materials open up a new approach to surface morphology control.
Abstract:
In this paper, through extensive study and design, a technical plan for establishing the exploration database center is developed that combines imported and self-developed techniques. Through research and repeated experiment, a modern database center has been set up, with hardware and network of advanced performance, a well-configured system, complete data storage and management, and fast, direct data support. Through study of decision theory, methods, and models, an exploration decision assistant schema is designed, with one decision plan, the well location decision support system, evaluated and put into action.

1. Establishment of the Shengli exploration database center. Research was carried out on the hardware configuration of the database center, including its workstations and all connected hardware and systems. The hardware of the database center is formed by connecting workstations, microcomputer workstations, disk arrays, and the equipment used for seismic processing and interpretation. Research on data storage and management covered the contents to be managed, data flow, data standards, data QC, backup and restore policy, and optimization of the database system. A reasonable data management regulation and workflow were established, creating a scientific exploration data management system. Data loading followed a worked-out schedule, and more than 200 seismic survey projects have been loaded, amounting to 25 TB.

2. Exploration work support system and its application. The seismic data processing support system has the following features: automatic extraction of seismic attributes, GIS navigation, data ordering, extraction of data cubes of any size, pseudo huge-capacity disk array, standard output exchange format, etc. Prestack data can be accessed by the processing system directly, or data can be transferred to other processing systems through a standard exchange format. For the seismic interpretation system, supported features include automatic scanning and storage of interpretation results and internal data quality control; the interpretation system is connected directly to the database center for real-time support with seismic, formation, and well data. Comprehensive geological study is supported through the intranet, with the ability to query or display data graphically on the navigation system under geological constraints. The production management support system is mainly used to collect, analyze, and display production data, with its core technology in controlled data collection and the creation of multiple standard forms.

3. Exploration decision support system design. By classifying the workflow and data flow of all exploration stages and studying decision theory and methods, the target of each decision step, the decision models, and the requirements, three concept models have been formed for the Shengli exploration decision support system: the exploration distribution support system, the well location support system, and the production management support system. The well location decision support system has passed evaluation and been put into action.

4. Technical advances. Hardware and software are matched for high performance in the database center. By combining a parallel computer system, database server, huge-capacity ATL, disk array, network, and firewall, the first exploration database center in China was created, with reasonable configuration and high performance, able to manage the whole exploration data set. A technology for managing huge volumes of exploration data has been formed, in which exploration data standards and management regulations guarantee data quality, safety, and security. A multifunction query and support system provides comprehensive exploration information support, including support for geological study, seismic processing and interpretation, and production management. Many new database and computer technologies are used in the system to provide real-time information support for exploration work. Finally, the Shengli exploration decision support system was designed.

5. Application and benefit. Data storage has reached 25 TB, with thousands of users in the Shengli oil field accessing data and improving work efficiency many times over. The technology has also been applied by many other units of SINOPEC. In providing data to the project "Exploration Achievements and Evaluation of Favorable Targets in the Hekou Area", it shortened the data preparation period from 30 days to 2 days and enriched data abundance by 15 percent, with full information support from the database center. In providing previously processed results for the project "Pre-stack Depth Migration in the Guxi Fracture Zone", it reduced the amount of repeated processing, shortened the work period by one month, improved processing precision and quality, and saved 30 million yuan of data processing investment. In automatically providing a project database for the project "Geological and Seismic Study of the Southern Slope Zone of the Dongying Sag", it shortened data preparation time so that researchers had more time for research, improving interpretation precision and quality.
Abstract:
The buckling of compressively loaded members is one of the most important factors limiting the overall strength and stability of a structure. I have developed novel techniques for using active control to wiggle a structural element in such a way that buckling is prevented. I present the results of analysis, simulation, and experimentation to show that buckling can be prevented through computer-controlled adjustment of dynamical behavior.

I have constructed a small-scale railroad-style truss bridge that contains compressive members that actively resist buckling through the use of piezo-electric actuators. I have also constructed a prototype actively controlled column in which the control forces are applied by tendons, as well as a composite steel column that incorporates piezo-ceramic actuators used to counteract buckling. Active control of buckling allows this composite column to support 5.6 times more load than would otherwise be possible.

These techniques promise to lead to intelligent physical structures that are both stronger and lighter than would otherwise be possible.
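The idea of stabilizing a member above its buckling load can be sketched with a toy lumped model: past the critical load the effective lateral stiffness turns negative and any deflection grows, while feedback (here an ideal PD law standing in for the piezo and tendon controllers, which are not reproduced from the thesis) restores positive stiffness and damping. All parameters are illustrative.

```python
def simulate_column(p_over_pcr, kp, kd, x0=0.001, dt=1e-3, steps=5000):
    """Unit-mass, unit-stiffness lumped model of lateral deflection x:
    effective stiffness is (1 - P/Pcr), negative above the critical
    load; PD feedback -kp*x - kd*v restores stability. Returns the
    final |deflection| after explicit-Euler integration."""
    x, v = x0, 0.0
    for _ in range(steps):
        a = -(1.0 - p_over_pcr) * x - kp * x - kd * v
        v += a * dt
        x += v * dt
    return abs(x)
```

At 1.5 times the critical load, the uncontrolled deflection grows exponentially, while even modest PD gains drive it back toward zero.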
Abstract:
Literature on the nonprofit sector focuses on charities and their interactions with clients or governmental agencies; donors are studied less often. Studies on philanthropy do examine donors but tend to focus on microlevel factors to explain their behavior. This study, in contrast, draws on institutional theory to show that macrolevel factors affect donor behavior. It also extends the institutional framework by examining the field‐level configurations in which donors and fundraisers are embedded. Employing the case of workplace charity, this new model highlights how the composition of the organizational field structures fundraisers and donors alike, shaping fundraisers’ strategies of solicitation and, therefore, the extent of donor control.
Abstract:
Wireless sensor networks are characterized by limited energy resources. To conserve energy, application-specific aggregation (fusion) of data reports from multiple sensors can be beneficial in reducing the amount of data flowing over the network. Furthermore, controlling the topology by scheduling the activity of nodes between active and sleep modes has often been used to uniformly distribute the energy consumption among all nodes by de-synchronizing their activities. We present an integrated analytical model to study the joint performance of in-network aggregation and topology control. We define performance metrics that capture the tradeoffs among delay, energy, and fidelity of the aggregation. Our results indicate that to achieve high fidelity levels under medium to high event reporting load, shorter and fatter aggregation/routing trees (toward the sink) offer the best delay-energy tradeoff as long as topology control is well coordinated with routing.
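The "shorter and fatter trees" conclusion can be illustrated with a toy model (our simplification, not the paper's analytical model): with perfect in-network aggregation each node transmits one fused report per epoch, so energy scales with node count while end-to-end delay scales with tree depth.

```python
def tree_metrics(fanout, depth, per_hop_wait_ms=10.0, tx_cost=1.0):
    """Metrics for a uniform aggregation tree rooted at the sink:
    node count (sink included), delay (each hop waits to fuse its
    children's reports), and energy (one transmission per non-sink
    node, assuming perfect aggregation)."""
    nodes = sum(fanout ** level for level in range(depth + 1))
    delay = depth * per_hop_wait_ms
    energy = (nodes - 1) * tx_cost
    return nodes, delay, energy
```

A fan-out-9, depth-2 tree (91 nodes) delivers fused reports three times faster than a fan-out-2, depth-6 tree (127 nodes) at comparable per-node energy, matching the short-and-fat intuition.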
Abstract:
Speculative service implies that a client's request for a document is serviced by sending, in addition to the document requested, a number of other documents (or pointers thereto) that the server speculates will be requested by the client in the near future. This speculation is based on statistical information that the server maintains for each document it serves. The notion of speculative service is analogous to prefetching, which is used to improve cache performance in distributed/parallel shared memory systems, with the exception that servers (not clients) control when and what to prefetch. Using trace simulations based on the logs of our departmental HTTP server http://cs-www.bu.edu, we show that both server load and service time could be reduced considerably if speculative service is used. This is above and beyond what is currently achievable using client-side caching [3] and server-side dissemination [2]. We identify a number of parameters that could be used to fine-tune the level of speculation performed by the server.
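The statistics-driven speculation described above can be sketched as follows: the server keeps, for each document, counts of which documents clients tend to request next, and bundles the top-k candidates with every response. The per-client last-request tracking and the value of k are implementation assumptions, not details from the paper.

```python
from collections import defaultdict

class SpeculativeServer:
    """Minimal sketch of server-controlled speculative service."""
    def __init__(self, k=2):
        self.k = k
        self.next_counts = defaultdict(lambda: defaultdict(int))
        self.last_request = {}  # client -> previously requested document

    def request(self, client, doc):
        prev = self.last_request.get(client)
        if prev is not None:
            self.next_counts[prev][doc] += 1   # update transition stats
        self.last_request[client] = doc
        followers = self.next_counts[doc]
        speculative = sorted(followers, key=followers.get,
                             reverse=True)[:self.k]
        return [doc] + speculative             # doc plus speculations
```

Once a few clients have followed document a with document b, a later request for a is answered with b speculatively attached, saving the follow-up round trip.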
Abstract:
High-speed networks, such as ATM networks, are expected to support diverse Quality of Service (QoS) constraints, including real-time QoS guarantees. Real-time QoS is required by many applications, such as those that involve voice and video communication. To support such services, routing algorithms that allow applications to reserve the needed bandwidth over a Virtual Circuit (VC) have been proposed. Commonly, these bandwidth-reservation algorithms assign VCs to routes using the least-loaded concept, and thus result in balancing the load over the set of all candidate routes. In this paper, we show that for such reservation-based protocols, which allow for the exclusive use of a preset fraction of a resource's bandwidth for an extended period of time, load balancing is not desirable, as it results in resource fragmentation, which adversely affects the likelihood of accepting new reservations. In particular, we show that load-balancing VC routing algorithms are not appropriate when the main objective of the routing protocol is to increase the probability of finding routes that satisfy incoming VC requests, as opposed to equalizing the bandwidth utilization along the various routes. We present an on-line VC routing scheme that is based on the concept of "load profiling", which allows a distribution of "available" bandwidth across a set of candidate routes to match the characteristics of incoming VC QoS requests. We show the effectiveness of our load-profiling approach when compared to traditional load-balancing and load-packing VC routing schemes.
Abstract:
To support the diverse Quality of Service (QoS) requirements of real-time (e.g. audio/video) applications in integrated services networks, several routing algorithms that allow for the reservation of the needed bandwidth over a Virtual Circuit (VC) established on one of several candidate routes have been proposed. Traditionally, such routing is done using the least-loaded concept, and thus results in balancing the load across the set of candidate routes. In a recent study, we have established the inadequacy of this load balancing practice and proposed the use of load profiling as an alternative. Load profiling techniques allow the distribution of "available" bandwidth across a set of candidate routes to match the characteristics of incoming VC QoS requests. In this paper we thoroughly characterize the performance of VC routing using load profiling and contrast it to routing using load balancing and load packing. We do so both analytically and via extensive simulations of multi-class traffic routing in Virtual Path (VP) based networks. Our findings confirm that for routing guaranteed bandwidth flows in VP networks, load balancing is not desirable as it results in VP bandwidth fragmentation, which adversely affects the likelihood of accepting new VC requests. This fragmentation is more pronounced when the granularity of VC requests is large. Typically, this occurs when a common VC is established to carry the aggregate traffic flow of many high-bandwidth real-time sources. For VP-based networks, our simulation results show that our load-profiling VC routing scheme performs better or as well as the traditional load-balancing VC routing in terms of revenue under both skewed and uniform workloads. Furthermore, load-profiling routing improves routing fairness by proactively increasing the chances of admitting high-bandwidth connections.
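The fragmentation argument of these two abstracts can be reproduced with a small admission experiment. Below, least-loaded (worst-fit) selection stands in for load balancing, and a best-fit rule stands in for load profiling; best-fit is a deliberate simplification of the papers' profile-matching scheme, which is not reproduced here.

```python
def least_loaded(residual, demand):
    """Load balancing: route the VC on the candidate with the most
    residual bandwidth, spreading load evenly (worst-fit)."""
    feasible = [r for r in residual if residual[r] >= demand]
    return max(feasible, key=residual.get) if feasible else None

def best_fit(residual, demand):
    """Profiling-style stand-in: route on the feasible candidate with
    the least residual bandwidth, preserving large residuals for
    future high-bandwidth requests."""
    feasible = [r for r in residual if residual[r] >= demand]
    return min(feasible, key=residual.get) if feasible else None

def admit(capacities, demands, chooser):
    """Sequentially route VC demands; return the number accepted."""
    residual = dict(capacities)
    accepted = 0
    for d in demands:
        route = chooser(residual, d)
        if route is not None:
            residual[route] -= d
            accepted += 1
    return accepted
```

With two 10-unit routes and demands of 5, 5, then 10, balancing splits the small VCs across both routes and fragments the bandwidth, rejecting the large request; best-fit packs them onto one route and accepts all three.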