Abstract:
The current philosophy and trends in quality assurance through nondestructive testing are briefly discussed. The techniques currently in use and those being developed for newer and advanced materials, such as composites, are reviewed. 27 ref.
Abstract:
To protect and restore lake ecosystems under threats posed by the increasing human population, information on their ecological quality is needed. Lake sediments provide a data-rich archive that allows identification of the various biological components present prior to anthropogenic alterations, as well as a continuous record of subsequent changes. By providing a longer dimension of time than any ongoing monitoring programme, palaeolimnological methods can help in understanding natural variability and long-term ecological changes in lakes. As zooplankton have a central role in the lake food web, their remains can potentially provide versatile information on past trophic structure. However, various taphonomic processes operating in lakes still raise questions concerning how subfossil assemblages reflect living communities. This thesis work aimed to improve the use of sedimentary zooplankton remains in the reconstruction of past zooplankton communities and trophic structure in lakes. To quantify interspecific differences in the accumulation of remains, the subfossils of nine pelagic zooplankton taxa in annually laminated sediments were compared with monitoring results for live zooplankton in Lake Vesijärvi. This lake has a known history of eutrophication and recovery, which resulted from reduced external loading and effective fishing of plankti-benthivorous fish. The response of zooplankton assemblages to these known changes was resolved using annually laminated sediments. The generality of the responses observed in Lake Vesijärvi was further tested with a set of 31 lakes in Southern Finland, relating subfossils in surface sediments to contemporary water quality and fish density, as well as to lake morphometry. The results demonstrated differential preservation and retention of cladoceran species in the sediment. Daphnia, Diaphanosoma and Ceriodaphnia were clearly underrepresented in the sediment samples in comparison to the well-preserved Bosmina species, Chydorus, Limnosida and Leptodora. For well-preserved species, the annual net accumulation rate was similar to or above the expected values, reflecting effective sediment focusing and accumulation in the deepest part of the lake. The decreased fish density and improved water quality led to subtle changes in zooplankton community composition. The abundance of Diaphanosoma and Limnosida increased after the reduction in fish density, while Ceriodaphnia and rotifers decreased. The most sensitive indicator of fish density was the mean size of Daphnia ephippia and of Bosmina (E.) crassicornis ephippia and carapaces. The concentration of plant-associated species increased, reflecting expanding littoral vegetation along with increasing transparency. Several of the patterns observed in Lake Vesijärvi could also be found within the set of 31 lakes. According to this thesis work, the most useful cladoceran-based indices for nutrient status and planktivorous fish density in Finnish lakes were the relative abundances of certain pelagic taxa and the mean size of Bosmina spp. carapaces, especially those of Bosmina (E.) cf. coregoni. The abundance of plant-associated species reflected the potential area for aquatic plants. Lake morphometry and sediment organic content, however, explained a relatively high proportion of the variance in the species data, and more studies are needed to quantify lake-specific differences in the accumulation and preservation of remains.
Commonly occurring multicollinearity between environmental variables obstructs the cladoceran-based reconstruction of single environmental variables. As taphonomic factors and several direct and indirect structuring forces in lake ecosystems simultaneously affect zooplankton, subfossil assemblages should be studied in a holistic way before drawing final conclusions about trophic structure and changes in lake ecological quality.
Abstract:
Many next-generation distributed applications, such as grid computing, require a single source to communicate with a group of destinations. Traditionally, such applications are implemented using multicast communication. A typical multicast session requires creating the shortest-path tree to a fixed number of destinations. The fundamental issue in multicasting data to a fixed set of destinations is receiver blocking: if one of the destinations is not reachable, the entire multicast request (say, a grid task request) may fail. Manycasting is a generalized variation of multicasting that provides the freedom to choose the best subset of destinations from a larger set of candidate destinations. We propose an impairment-aware algorithm to provide a manycasting service in the optical layer, specifically in optical burst switching (OBS). We compare the performance of the proposed manycasting algorithm with traditional multicasting and with multicast with over-provisioning. Our results show a significant reduction in blocking probability with optical-layer manycasting.
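To make the contrast with plain multicast concrete, the Python sketch below illustrates only the destination-selection step that defines manycasting: instead of failing when any one fixed destination is unreachable, the k best destinations are picked from a larger candidate set. The per-destination cost is an abstract placeholder (it could be hop count or an impairment estimate); this is not the impairment-aware OBS algorithm proposed in the paper.

```python
import heapq

def select_manycast_destinations(candidates, k, cost):
    """Pick the k 'best' destinations (lowest cost) from a larger candidate set.

    candidates : iterable of destination identifiers
    k          : number of destinations the session actually needs
    cost       : dict mapping destination -> path cost (e.g. hop count or an
                 impairment estimate); None marks an unreachable destination

    Returns the chosen subset, or None if fewer than k candidates are reachable.
    """
    reachable = [d for d in candidates if cost.get(d) is not None]
    if len(reachable) < k:
        return None  # the request blocks only when fewer than k destinations are usable
    return heapq.nsmallest(k, reachable, key=lambda d: cost[d])

# Example: 7 candidate grid sites, the task needs any 4 of them.
path_cost = {"A": 3, "B": 5, "C": None, "D": 2, "E": 7, "F": 4, "G": 6}
print(select_manycast_destinations(path_cost.keys(), 4, path_cost))
# -> ['D', 'A', 'F', 'B']  (unreachable 'C' is simply skipped, unlike in multicast)
```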
Abstract:
In lake-rich regions, the gathering of information about water quality is challenging because only a small proportion of the lakes can be assessed each year by conventional methods. One of the techniques for improving the spatial and temporal representativeness of lake monitoring is remote sensing from satellites and aircraft. The experimental material included detailed optical measurements in 11 lakes, air- and spaceborne remote sensing measurements with concurrent field sampling, automatic raft measurements and a national dataset of routine water quality measurements from over 1100 lakes. The analyses of the spatially high-resolution airborne remote sensing data from eutrophic and mesotrophic lakes showed that one or a few discrete water quality observations obtained by conventional monitoring can yield a clear over- or underestimation of the overall water quality in a lake. The use of TM-type satellite instruments in addition to routine monitoring results substantially increases the number of lakes for which water quality information can be obtained. The preliminary results indicated that coloured dissolved organic matter (CDOM) can be estimated with TM-type satellite instruments, which could possibly be utilised as an aid in estimating the role of lakes in global carbon budgets. Based on the results of reflectance modelling and experimental data, the MERIS satellite instrument has optimal or near-optimal channels for the estimation of turbidity, chlorophyll a and CDOM in Finnish lakes. MERIS images with a 300 m spatial resolution can provide water quality information for different parts of large and medium-sized lakes and can help to fill the gaps left by conventional monitoring. Algorithms that do not require simultaneous field data for training would increase the amount of remote sensing-based information available for lake monitoring. The MERIS Boreal Lakes processor, trained with the optical data and concentration ranges provided by this study, enabled turbidity estimation with good accuracy without the need for algorithm correction with field measurements, while chlorophyll a and CDOM estimation requires further development of the processor. The accuracy of chlorophyll a interpretation via semi-empirical algorithms can be improved by classifying lakes prior to interpretation according to their CDOM level and trophic status. Optical modelling indicated that the spectral diffuse attenuation coefficient can be estimated with reasonable accuracy from the measured water quality concentrations. This provides more detailed information on light attenuation from routine monitoring measurements than is available through Secchi disk transparency. The results of this study improve the interpretation of lake water quality by remote sensing and encourage the use of remote sensing in lake monitoring.
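As an illustration of how a diffuse attenuation coefficient might be derived from routine monitoring concentrations, the sketch below uses one common bio-optical formulation in which Kd at a given band is pure-water attenuation plus contributions scaling linearly with chlorophyll a, CDOM and suspended matter. The coefficient values and band keys are illustrative placeholders, not values calibrated in this study.

```python
def diffuse_attenuation(chl, cdom, tsm, band):
    """Very simplified bio-optical estimate of the diffuse attenuation
    coefficient Kd (1/m) at one wavelength band.

    chl  : chlorophyll a concentration (ug/l)
    cdom : CDOM absorption at a reference wavelength (1/m)
    tsm  : total suspended matter (mg/l)
    band : wavelength key in nm, e.g. 490

    Kd is modelled as pure-water attenuation plus contributions that scale
    linearly with each constituent; the specific coefficients are placeholders.
    """
    k_water = {490: 0.016, 560: 0.062}   # pure water, 1/m
    k_chl   = {490: 0.016, 560: 0.007}   # per ug/l chlorophyll a
    k_cdom  = {490: 0.60,  560: 0.25}    # per unit CDOM absorption
    k_tsm   = {490: 0.045, 560: 0.050}   # per mg/l suspended matter
    return (k_water[band]
            + k_chl[band] * chl
            + k_cdom[band] * cdom
            + k_tsm[band] * tsm)

# Routine monitoring values for a hypothetical humic, mesotrophic lake:
print(round(diffuse_attenuation(chl=12.0, cdom=2.5, tsm=3.0, band=490), 2))
```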
Abstract:
This paper proposes a method of sharing power/energy between multiple sources and multiple loads using an integrated magnetic circuit as a junction between sources and sinks. It also presents a particular use of the magnetic circuit as an ac power supply that delivers sinusoidal voltage to the load irrespective of the presence of the grid, while drawing only active power from the grid. The proposed magnetic circuit is a three-energy-port unit, viz.: 1) power/energy from the grid; 2) power/energy from the battery-inverter unit; and 3) power/energy delivery to the load in its particular application as a quality ac power supply (QPS). The proposed QPS provides sinusoidal regulated output voltage, input power-factor correction, electrical isolation between the sources and loads, low battery voltage, and control simplicity. Unlike in conventional series-shunt-compensated uninterruptible power supply topologies with low battery voltage, the isolation is provided by a single magnetic circuit, which results in smaller size and lower cost. The circuit operating principles and analysis, as well as simulation and experimental results, are presented for this QPS.
Abstract:
Frequency-domain scheduling and rate adaptation have helped next-generation orthogonal frequency division multiple access (OFDMA) based wireless cellular systems such as Long Term Evolution (LTE) achieve significantly higher spectral efficiencies. To overcome the severe uplink feedback bandwidth constraints, LTE uses several techniques to reduce the feedback required by a frequency-domain scheduler about the channel state information of all subcarriers of all users. In this paper, we analyze the throughput achieved by the User Selected Subband feedback scheme of LTE, in which a user feeds back only the indices of its best M subbands and a single 4-bit estimate of the average rate achievable over those M subbands. In addition, we compare the performance with the subband-level feedback scheme of LTE, and highlight the role of the scheduler by comparing the performance of the unfair greedy scheduler with that of the proportional fair (PF) scheduler. Our analysis provides several insights into the operation of the feedback reduction techniques used in LTE.
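The best-M feedback idea can be conveyed with a short simulation sketch: each user reports only the indices of its M best subbands plus one coarsely quantized average rate, and a greedy scheduler assigns each subband among the users that reported it. The 4-bit quantizer and rate cap below are toy simplifications, not the CQI mapping defined in the LTE specifications.

```python
import random

NUM_SUBBANDS, M, NUM_USERS = 12, 3, 4
MAX_RATE = 6.0   # bits/s/Hz cap used only by the toy 4-bit quantizer

def user_feedback(rates, m):
    """Best-M feedback: indices of the m best subbands plus one coarse
    (4-bit, i.e. 16-level) estimate of the average rate over those subbands."""
    best = sorted(range(len(rates)), key=lambda s: rates[s], reverse=True)[:m]
    avg = sum(rates[s] for s in best) / m
    level = min(15, int(avg / MAX_RATE * 16))   # the 4-bit index actually fed back
    return best, level * MAX_RATE / 16          # scheduler's reconstructed rate estimate

# Per-user, per-subband achievable rates (would come from channel estimation).
rates = {u: [random.uniform(0.1, MAX_RATE) for _ in range(NUM_SUBBANDS)]
         for u in range(NUM_USERS)}
reports = {u: user_feedback(rates[u], M) for u in range(NUM_USERS)}

# Greedy scheduler: each subband goes to the claimant with the highest reported
# rate; subbands that no user listed are left unscheduled.
assignment = {}
for s in range(NUM_SUBBANDS):
    claimants = [u for u in reports if s in reports[u][0]]
    if claimants:
        assignment[s] = max(claimants, key=lambda u: reports[u][1])
print(assignment)
```

A proportional fair variant would divide each user's reported rate by its running average throughput before taking the maximum, trading peak throughput for fairness.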
Abstract:
Tower platforms, with instrumentation at six levels above the surface to a height of 30 m, were used to record various atmospheric parameters in the surface layer. Sensors for measuring both mean and fluctuating quantities were used, the majority of them indigenously built. Soil temperature sensors, down to a depth of 30 cm below the surface, were also connected to the mean data logger. A PC-based data acquisition system built at the Centre for Atmospheric Sciences, IISc, was used to acquire the data from the fast-response sensors. This paper reports the various components of a typical MONTBLEX tower observatory and describes the actual experiments carried out in the surface layer at four sites over the monsoon trough region as part of the MONTBLEX programme. It also describes and discusses several checks made on randomly selected tower data sets acquired during the experiment. The checks include visual inspection of time traces from various sensors, comparative plots of sensors measuring the same variable, wind and temperature profile plots, calculation of roughness lengths, statistical and stability parameters, diurnal variation of stability parameters, and plots of probability density and energy spectrum for the different sensors. Results from these checks are found to be very encouraging and reveal the potential for further detailed analysis to understand more about surface layer characteristics.
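As an example of one of the checks listed above, the roughness length can be estimated from the tower's mean wind profile under near-neutral conditions by fitting the logarithmic law u(z) = (u*/κ) ln(z/z0). The sketch below uses made-up profile values and level heights, not MONTBLEX data.

```python
import math

KAPPA = 0.4  # von Karman constant

def roughness_length(heights_m, mean_wind_ms):
    """Estimate roughness length z0 (m) and friction velocity u* (m/s) from mean
    winds at several tower levels, assuming a near-neutral surface layer where
    u(z) = (u*/kappa) * ln(z / z0).

    A least-squares line u = a*ln(z) + b gives u* = kappa*a and z0 = exp(-b/a).
    """
    x = [math.log(z) for z in heights_m]
    n = len(x)
    mx = sum(x) / n
    my = sum(mean_wind_ms) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, mean_wind_ms))
         / sum((xi - mx) ** 2 for xi in x))
    b = my - a * mx
    return math.exp(-b / a), KAPPA * a

# Six illustrative tower levels up to 30 m with a roughly logarithmic profile:
z = [1, 2, 4, 8, 15, 30]
u = [2.1, 2.5, 2.9, 3.3, 3.7, 4.1]
z0, ustar = roughness_length(z, u)
print(f"z0 = {z0:.3f} m, u* = {ustar:.2f} m/s")
```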
Abstract:
We consider the problem of wireless channel allocation to multiple users. A slot is given to the user with the highest metric (e.g., channel gain) in that slot. The scheduler may not know the channel states of all the users at the beginning of each slot. In this scenario, opportunistic splitting is an attractive solution. However, this algorithm requires that the metrics of different users form independent, identically distributed (i.i.d.) sequences with the same distribution, and that their distribution and number be known to the scheduler. This limits the usefulness of opportunistic splitting. In this paper we develop a parametric version of the algorithm in which the optimal parameters are learnt online through a stochastic approximation scheme. Our algorithm does not require the metrics of different users to have the same distribution. The statistics of these metrics and the number of users can be unknown and may also vary with time; each metric sequence can be Markov. We prove the convergence of the algorithm and show its utility by scheduling the channel to maximize its throughput while satisfying fairness and/or quality-of-service constraints.
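The flavour of such a scheme can be conveyed with a much-simplified sketch: a single threshold stands in for the algorithm's parameters and is adapted by a Robbins-Monro (stochastic approximation) step so that, on average, one user exceeds it per slot. The mini-slot splitting phase and the fairness/QoS constraints of the actual algorithm are omitted; the user metric distributions are illustrative only.

```python
import random

def exp_metric(mean):
    """Toy metric stream (e.g. channel gain) with a mean unknown to the scheduler."""
    while True:
        yield random.expovariate(1.0 / mean)

def threshold_scheduler(streams, slots=5000, step0=0.5):
    """Each slot, users whose current metric exceeds theta contend; theta is
    adapted online by stochastic approximation so that the expected number of
    contenders is driven towards one, without knowing the metric distributions."""
    theta, served = 0.0, [0] * len(streams)
    for n in range(1, slots + 1):
        metrics = [next(s) for s in streams]
        above = [u for u, m in enumerate(metrics) if m > theta]
        if len(above) == 1:              # success: exactly one contender wins the slot
            served[above[0]] += 1
        # SA update: collisions push theta up, idle slots pull it down
        theta += (step0 / n ** 0.6) * (len(above) - 1)
    return theta, served

streams = [exp_metric(mu) for mu in (1.0, 2.0, 0.5)]   # heterogeneous users
print(threshold_scheduler(streams))
```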
Abstract:
Electron Diffraction Structure Analysis (EDSA) with data from standard selected-area electron diffraction (SAED) is still the method of choice for structure determination of nano-sized single crystals. The recently determined heavy-atom structure α-Ti2Se (Albe & Weirich, 2003) is used as an example to illustrate the developed procedure for structure determination from two-dimensional SAED data via direct methods and kinematical least-squares refinement. Although the investigated crystallite had a relatively large effective thickness of about 230 Å, as determined from dynamical calculations, the structural model obtained from the SAED data was found to be in good agreement with the result of an earlier single-crystal X-ray study (Weirich, Pöttgen & Simon, 1996). Arguments supporting the validity of the quasi-kinematical approach used are given in the text. The influences of dynamical and secondary scattering on the quality of the data and the structure solution are discussed. Moreover, the usefulness of first-principles calculations for verifying the results from EDSA is demonstrated by two examples, one of which concerns a structure that was unattainable by conventional X-ray diffraction.
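For readers unfamiliar with the kinematical approach, the sketch below shows the two quantities at its core: the kinematical structure factor computed from a trial model, and the residual (R-value) comparing calculated amplitudes with measured ones. The atom positions, scattering factors and "observed" amplitudes are made up for illustration; scaling, geometry corrections and the actual α-Ti2Se model are omitted.

```python
import cmath

def structure_factor(hkl, atoms):
    """Kinematical structure factor F(hkl) = sum_j f_j * exp(2*pi*i*(h*x + k*y + l*z)).

    atoms: list of (f_j, (x, y, z)) with fractional coordinates; f_j stands in
    for the (angle-dependent) electron scattering factor of atom j.
    """
    h, k, l = hkl
    return sum(f * cmath.exp(2j * cmath.pi * (h * x + k * y + l * z))
               for f, (x, y, z) in atoms)

def r_value(observed, calculated):
    """Conventional residual R = sum(| |Fobs| - |Fcalc| |) / sum(|Fobs|),
    used to judge how well a refined model reproduces the measured amplitudes."""
    return (sum(abs(fo - abs(fc)) for fo, fc in zip(observed, calculated))
            / sum(observed))

# Hypothetical two-atom cell and three reflections, purely for illustration:
atoms = [(22.0, (0.0, 0.0, 0.0)), (16.0, (0.25, 0.25, 0.5))]
reflections = [(1, 0, 0), (1, 1, 0), (0, 0, 2)]
f_calc = [structure_factor(hkl, atoms) for hkl in reflections]
f_obs = [36.0, 9.5, 37.8]   # made-up "measured" amplitudes
print(round(r_value(f_obs, f_calc), 3))
```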
Abstract:
Because of frequent topology changes and node failures, providing quality-of-service routing in mobile ad hoc networks is a very critical issue. Quality of service can be provided by routing the data along multiple paths. Such a selection of multiple paths helps to improve reliability and load balancing and to reduce the delay introduced by route rediscovery in the presence of path failures. There are basically two issues in such multipath routing. First, the sender node needs to obtain accurate topology information. Since the nodes are continuously roaming, obtaining exact topology information is a difficult task. Here, we propose an algorithm which constructs a highly accurate network topology with minimum overhead. The second issue is that the paths in the path set should offer the best reliability and network throughput. This is achieved in two ways: 1) by the choice of a proper metric, which is a function of residual power and of the traffic load on the node and in the surrounding medium, and 2) by allowing reliable links to be shared between different paths.
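A minimal sketch of the kind of composite metric described in point 1) is given below, together with a plain Dijkstra search over it. The weighting of residual power against node and medium load is a placeholder choice, not the metric defined in the paper; additional paths of the multipath set could be obtained by re-running the search with already-used links penalised rather than removed, in the spirit of allowing reliable links to be shared between paths.

```python
import heapq

def link_weight(residual_energy, node_load, medium_load,
                w_energy=0.5, w_node=0.3, w_medium=0.2):
    """Composite routing metric sketch: a link becomes costlier as the downstream
    node's residual battery drops and as the traffic load on the node and in its
    surrounding medium grows.  Weights are illustrative; inputs are normalised to [0, 1]."""
    return (w_energy * (1.0 - residual_energy)
            + w_node * node_load
            + w_medium * medium_load)

def cheapest_path(graph, src, dst):
    """Plain Dijkstra over the composite weights (yields one path of a multipath set).
    graph: {node: {neighbour: weight}}; assumes dst is reachable from src."""
    dist, prev, heap = {src: 0.0}, {}, [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue          # stale heap entry
        for v, w in graph[u].items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    path, node = [], dst
    while node != src:
        path.append(node)
        node = prev[node]
    return [src] + path[::-1]

# Tiny example topology with precomputed composite weights:
g = {"s": {"a": link_weight(0.9, 0.2, 0.1), "b": link_weight(0.4, 0.6, 0.5)},
     "a": {"d": link_weight(0.8, 0.3, 0.2)},
     "b": {"d": link_weight(0.7, 0.4, 0.3)},
     "d": {}}
print(cheapest_path(g, "s", "d"))   # -> ['s', 'a', 'd']
```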