Abstract:
Ecological coherence is a multifaceted conservation objective that includes some potentially conflicting concepts. These concepts include the extent to which the network maximises diversity (including genetic diversity) and the extent to which protected areas interact with non-reserve locations. To examine the consequences of different selection criteria, the preferred location to complement protected sites was examined using samples taken from four locations around each of two marine protected areas: Strangford Lough and Lough Hyne, Ireland. Three different measures of genetic distance were used: FST, Dest and a measure of allelic dissimilarity, along with a direct assessment of the total number of alleles in different candidate networks. Standardized site scores were used for comparisons across methods and selection criteria. The average score for Castlehaven, a site relatively close to Lough Hyne, was highest, implying that this site would capture the most genetic diversity while ensuring the highest degree of interaction between protected and unprotected sites. Patterns around Strangford Lough were more ambiguous, potentially reflecting the weaker genetic structure around this protected area in comparison to Lough Hyne. Similar patterns were found across species with different dispersal capacities, indicating that methods based on genetic distance could be used to help maximise ecological coherence in reserve networks. ► Ecological coherence is a key component of marine protected area network design. ► Coherence contains a number of competing concepts. ► Genetic information from field populations can help guide assessments of coherence. ► Average choice across different concepts of coherence was consistent among species. ► Measures can be combined to compare the coherence of different network designs.
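As a purely illustrative aside (not part of the study's methods), the sketch below shows the kind of calculation such comparisons rest on: a pairwise FST for a single biallelic locus computed from within- and total-population heterozygosity, followed by standardization of the resulting site scores so that values from different genetic-distance measures can be averaged. All site names and allele frequencies are hypothetical.

```python
import numpy as np

# Hypothetical allele frequencies at one biallelic locus for a protected area
# and four candidate sites (illustrative values only).
freqs = {"MPA": 0.60, "SiteA": 0.55, "SiteB": 0.70, "SiteC": 0.40, "SiteD": 0.62}

def pairwise_fst(p1, p2):
    """Wright's FST for one biallelic locus: (HT - HS) / HT."""
    hs = np.mean([2 * p1 * (1 - p1), 2 * p2 * (1 - p2)])  # mean within-population heterozygosity
    p_bar = (p1 + p2) / 2.0
    ht = 2 * p_bar * (1 - p_bar)                          # expected heterozygosity of the pooled pair
    return 0.0 if ht == 0 else (ht - hs) / ht

# Genetic distance of each candidate site from the protected area under this one measure.
fst_scores = {s: pairwise_fst(freqs["MPA"], p) for s, p in freqs.items() if s != "MPA"}

def standardize(scores):
    """Z-score a dict of site scores so scores from different measures are comparable."""
    vals = np.array(list(scores.values()))
    return dict(zip(scores, (vals - vals.mean()) / vals.std()))

print(standardize(fst_scores))  # standardized scores, ready to average with other measures
```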
Sequential antimicrobial therapy: treatment of severe lower respiratory tract infections in children
Abstract:
Although there have been a number of studies in adults, to date there has been little research into sequential antimicrobial therapy (SAT) in paediatric populations. The present study evaluates the impact of a SAT protocol for the treatment of severe lower respiratory tract infection in paediatric patients. The study involved 89 paediatric patients (44 control and 45 SAT). The SAT patients had a shorter length of hospital stay (4.0 versus 8.3 days) and a shorter duration of inpatient antimicrobial therapy (4.0 versus 7.9 days), with the period of iv therapy reduced from a mean of 5.6 to 1.7 days. Total healthcare costs were reduced by 52%. Resolution of severe lower respiratory tract infection with a short course of iv antimicrobials, followed by conversion to oral therapy, yielded clinical outcomes comparable to those achieved using longer-term iv therapy. SAT proved to be an important cost-minimizing tool for realizing substantial healthcare cost savings.
Abstract:
Traditional static analysis fails to auto-parallelize programs with complex control and data flow. Furthermore, thread-level parallelism in such programs is often restricted to pipeline parallelism, which can be hard for a programmer to discover. In this paper we propose a tool that, based on profiling information, helps the programmer to discover parallelism. The programmer hand-picks code transformations from among the proposed candidates, which are then applied by automatic code transformation techniques.
This paper contributes to the literature by presenting a profiling tool for discovering thread-level parallelism. We track dependencies at the whole-data-structure level rather than at the element or byte level in order to limit the profiling overhead. We perform a thorough analysis of the needs and costs of this technique. Furthermore, we present and validate the hypothesis that programs with complex control and data flow contain significant amounts of exploitable coarse-grain pipeline parallelism in their outer loops. This observation validates our approach based on whole-data-structure dependencies. As state-of-the-art compilers focus on loops iterating over data structure members, it also explains why our approach finds coarse-grain pipeline parallelism in cases that have remained out of reach for state-of-the-art compilers. In cases where traditional compilation techniques do find parallelism, our approach discovers higher degrees of parallelism, yielding a 40% speedup over traditional compilation techniques. Moreover, we demonstrate real speedups on multiple hardware platforms.
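To make the idea of whole-data-structure dependence tracking concrete, here is a minimal, hypothetical sketch (not the tool described above, which targets compiled programs): a profiler records only the last loop iteration that wrote each named object and reports cross-iteration read-after-write dependences, which are what permit or forbid pipelining of outer-loop stages. The stage structure and object names are invented for illustration.

```python
# Minimal sketch of whole-data-structure dependence profiling: instead of
# logging every element access, record one "last writer" per object and
# report read-after-write dependences that cross loop iterations.

last_writer = {}          # object name -> iteration that last wrote it
cross_iter_deps = set()   # (producer_iter, consumer_iter, object) triples

def track_write(obj_name, iteration):
    last_writer[obj_name] = iteration

def track_read(obj_name, iteration):
    writer = last_writer.get(obj_name)
    if writer is not None and writer != iteration:
        cross_iter_deps.add((writer, iteration, obj_name))

# Toy outer loop with stages communicating through whole objects.
for i in range(4):
    track_write("buffer", i)   # stage 1 produces the buffer in iteration i
    track_read("buffer", i)    # stage 2 consumes it in the same iteration (pipelinable)
    track_read("stats", i)     # stage 3 reads a structure written in the previous iteration
    track_write("stats", i)

# Only 'stats' carries cross-iteration dependences: (0,1), (1,2), (2,3).
print(cross_iter_deps)
```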
Abstract:
The recollision model has been applied to separate the probability for double ionization into contributions from electron-impact ionization and electron-impact excitation for intensities at which the dielectronic interaction is important for generating double ionization. For a wavelength of 780 nm, electron-impact excitation dominates just above the threshold intensity for double ionization, approximately 1.2 × 10^14 W cm^-2, with electron-impact ionization becoming more important at higher intensities. For a wavelength of 390 nm, the ratio between electron-impact ionization and electron-impact excitation remains fairly constant for all intensities above the threshold intensity for double ionization, approximately 6 × 10^14 W cm^-2. The results point to an explanation of the experimental results, but more detailed calculations on the behaviour of excited He+ ions are required.
Abstract:
Support vector machines (SVMs), though accurate, are not preferred in applications requiring high classification speed or deployment on systems with limited computational resources, because of the large number of support vectors involved in the model. To overcome this problem we have devised a primal SVM method with the following properties: (1) it solves for the SVM representation without the need to invoke the representer theorem, (2) forward and backward selections are combined to approach the final globally optimal solution, and (3) a criterion is introduced for identification of support vectors, leading to a much reduced support vector set. In addition to introducing this method, the paper analyzes the complexity of the algorithm and presents test results on three public benchmark problems and a human activity recognition application. These applications demonstrate the effectiveness and efficiency of the proposed algorithm.
Abstract:
At the U.S. DOE Oak Ridge Integrated Field Research Challenge (ORIFRC) site, the iron content of shallow subsurface materials (i.e. weathered saprolite) is relatively high (up to 5-6% w/w), and therefore the forms of the iron species present play a critical role in the long-term sequestration of uranium. A long-term pilot-scale study of the bioreduction and reoxidation of uranium conducted at the ORIFRC Area 3 site, adjacent to the former S-3 disposal ponds (source zone), has provided us with the opportunity to study the impact of iron species on the sequestration of U(VI). The aqueous U(VI) concentrations at the site were decreased to below the EPA MCL through the intermittent injection of ethanol as the electron donor. Previous field tests indicated that both oxygen and nitrate could oxidize the bioreduced U(IV) and cause a short-term rebound of aqueous-phase uranium concentration after the oxidative agents were delivered directly to the bioreduced zone.
A field test was conducted to examine the long-term effect of exposing the bioreduced sediments to nitrate in contaminated groundwater for more than 1,380 days at the Area 3 site. Contaminated groundwater was allowed to invade the previously bioreduced zone via the natural groundwater gradient after an extended period in which reducing conditions were maintained and the bioreduced zone was protected from the influx of upgradient contaminated groundwater. The geochemical response to the invasion of contaminated groundwater depended on whether the monitoring location was in the middle or at the fringe of the previously bioreduced zone. In general, nitrate concentrations in the previously bioreduced area increased gradually from near zero to ~50-300 mM within 200 days and then stabilized. The pH declined from bioreduced levels of 6.2-6.7 to below 5.0. Uranium concentrations rebounded in all monitoring wells, but at different rates. At most locations U concentrations rebounded, declined and then rebounded again. Methane gas disappeared, while a significant level (20,000 to 44,000 ppmv) of N2O was found in the groundwater of monitoring wells after three years of reoxidation.
The U(IV) in the sediments was mainly reoxidized to U(VI) species. Based on XANES analysis, the predominant uranium form in all samples after reoxidation was similar to a uranyl nitrate form, but the U content in the sediment remained as high as that determined after bioreduction activities were completed, indicating that much of the U is still sequestered in situ. SEM observations of surged fine sediments revealed that clusters of colloidal-sized (200-500 nm) U-containing precipitates appeared to have formed in situ, regardless of whether the samples came from FW106 in the non-biostimulated control area or from the pre-bioreduced wells FW101-2 and FW102-3. Additionally, SEM-EDS and microprobe analyses showed that the U-containing precipitates (~1% U) in FW106 are notably higher in Fe compared to the precipitates (~1-2.5% U) from FW101-2 and FW102-3. However, XRF analysis indicated that the U content in the pre-bioreduced FW101-2 and FW102-3 remained as high as 2,180 and 1,810 mg/kg, with U/Fe ratios of 0.077 and 0.055 g/g, respectively, versus 0.037 g/g, suggesting that more U is sequestered by Fe in the pre-bioreduced sediments.
Abstract:
Gastric carcinogenesis has been well documented in the step-wise histopathological model known as the Correa pathway. Several biomarkers, including CD44, Musashi-1 and CD133, have been reported as putative stem cell (PSC) markers.
Abstract:
We introduce a general scheme for sequential one-way quantum computation in which static systems with long-lived quantum coherence (memories) interact with moving systems that may possess very short coherence times. Both the generation of the cluster state needed for the computation and its consumption by measurements are carried out simultaneously. As a consequence, effective clusters of one spatial dimension fewer than in the standard approach are sufficient for computation. In particular, universal computation requires only a one-dimensional array of memories. The scheme applies to discrete-variable systems of any dimension as well as to continuous-variable ones, and both are treated equivalently through the local complementation of graphs. In this way our formalism introduces a general framework that encompasses and generalizes, in a unified manner, some previous system-dependent proposals. The procedure is intrinsically well suited for implementations with atom-photon interfaces.
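Local complementation, the graph operation referred to above, replaces the subgraph induced by the neighbourhood of a chosen vertex with its complement. A minimal sketch using networkx, independent of the authors' formalism:

```python
import networkx as nx

def local_complement(graph, v):
    """Return a copy of `graph` with the neighbourhood of `v` complemented:
    every edge between two neighbours of v is removed, every missing one added."""
    g = graph.copy()
    nbrs = list(g.neighbors(v))
    for i, a in enumerate(nbrs):
        for b in nbrs[i + 1:]:
            if g.has_edge(a, b):
                g.remove_edge(a, b)
            else:
                g.add_edge(a, b)
    return g

# Example: the graph of a 4-node linear cluster, 0-1-2-3.
path = nx.path_graph(4)
print(sorted(local_complement(path, 1).edges()))  # adds edge (0, 2), keeps the path edges
```

For qubit graph states, local complementation corresponds to local Clifford operations, which is why graphs related by it describe equivalent computational resources.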
Abstract:
Human listeners seem remarkably adept at recognising acoustic sound sources on the basis of timbre cues. Here we describe a psychophysical paradigm to estimate the time it takes to recognise a set of complex sounds differing only in timbre cues, both in terms of the minimum duration of the sounds and the inferred neural processing time. Listeners had to respond to the human voice while ignoring a set of distractors. All sounds were recorded from natural sources over the same pitch range and equalised to the same duration and power. In a first experiment, stimuli were gated in time with a raised-cosine window of variable duration and random onset time, and a voice/non-voice (yes/no) task was used. Performance, as measured by d′, remained above chance for the shortest sounds tested (2 ms); d′ values above 1 were observed for durations of 8 ms or longer. We then constructed sequences of short sounds presented in rapid succession, and listeners were asked to report the presence of a single voice token that could occur at a random position within the sequence. This method is analogous to the "rapid serial visual presentation" (RSVP) paradigm, which has been used to evaluate neural processing time for images. For 500-ms sequences made of 32-ms and 16-ms sounds, d′ remained above chance for presentation rates of up to 30 sounds per second. There was no effect of the pitch relation between successive sounds, whether identical for all sounds in the sequence or random for each sound. This implies that the task was not determined by streaming or forward masking, as both phenomena would predict better performance in the random-pitch condition. Overall, the recognition of familiar sound categories such as the voice seems to be surprisingly fast, both in terms of the acoustic duration required and of the underlying neural time constants.
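For reference, d′ is the standard sensitivity index for a yes/no task; a minimal sketch of its computation from hit and false-alarm counts is given below. The counts are hypothetical and not data from the study.

```python
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    """d' = z(hit rate) - z(false-alarm rate) for a yes/no task,
    with a standard 1/(2N) correction to avoid rates of exactly 0 or 1."""
    n_signal = hits + misses
    n_noise = false_alarms + correct_rejections
    hit_rate = min(max(hits / n_signal, 0.5 / n_signal), 1 - 0.5 / n_signal)
    fa_rate = min(max(false_alarms / n_noise, 0.5 / n_noise), 1 - 0.5 / n_noise)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Hypothetical counts: 40 voice trials and 40 distractor trials.
print(round(d_prime(hits=33, misses=7, false_alarms=10, correct_rejections=30), 2))
```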
Abstract:
Simple meso-scale capacitor structures have been made by incorporating thin (300 nm) single-crystal lamellae of KTiOPO4 (KTP) between two coplanar Pt electrodes. The influence that either patterned protrusions in the electrodes or focused-ion-beam-milled holes in the KTP have on the nucleation of reverse domains during switching was mapped using piezoresponse force microscopy imaging. The objective was to assess whether variations in the magnitude of field enhancement at localised "hot spots" caused by such patterning could be used to control both the exact locations and the bias voltages at which nucleation events occurred. It was found that both the patterning of electrodes and the milling of various hole geometries into the KTP could allow controlled sequential injection of domain wall pairs at different bias voltages; this capability could have implications for the design and operation of domain wall electronic devices, such as memristors, in the future.