982 results for cache coherence protocols


Relevance: 20.00%

Abstract:

The aim of this study was to compare the outcomes associated with two differing right unilateral (RUL) electroconvulsive therapy (ECT) dosing protocols: 2-3X seizure threshold (2-3X ST) and fixed high dose (FHD) at 353 mC. A retrospective chart review was performed to compare patient outcomes during the implementation of the two dosing protocols: 2-3X ST from October 2000 to May 2001 and FHD from June 2001 to February 2002. A total of 56 patients received ECT under the 2-3X ST protocol and 46 under the FHD protocol. In total, 13.6% of patients treated according to the 2-3X ST protocol received more than 12 ECT treatments, whereas none of the FHD group did. The mean number of ECT treatments per course decreased significantly from 7.6 to 5.7 following the switch from the 2-3X ST protocol to the FHD protocol. There were no significant differences between the two groups in the incidence of adverse cognitive effects. ECT practitioners adhered to the 2-3X ST protocol for only 51.8% of ECT courses, with adherence improving to 87% following introduction of the FHD protocol. Although this naturalistic retrospective chart survey had significant methodological limitations, it found that practitioners are more likely to adhere correctly to a fixed-dose protocol, thereby increasing its 'real-world' effectiveness compared with titrated suprathreshold dosing techniques. The FHD protocol was associated with shorter courses of ECT than the 2-3X ST protocol, with no significant difference between the two protocols in clinically discernible adverse cognitive effects.

Relevance: 20.00%

Abstract:

Full-field Fourier-domain optical coherence tomography (3F-OCT) is a full-field version of spectral-domain/swept-source optical coherence tomography. A set of two-dimensional Fourier holograms is recorded at discrete wavenumbers spanning the swept-source tuning range. The resultant three-dimensional data cube contains comprehensive information on the three-dimensional morphological layout of the sample, which can be reconstructed in software via a three-dimensional discrete Fourier transform. This method of recording the OCT signal confers a signal-to-noise ratio improvement over "flying-spot" time-domain OCT. The spatial resolution of the reconstructed 3F-OCT image, however, is degraded by the presence of a phase cross-term, whose origin and effects are addressed in this paper. We present a theoretical and experimental study of the imaging performance of 3F-OCT, with particular emphasis on eliminating the deleterious effects of the phase cross-term.
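The core reconstruction step lends itself to a short illustration. The following is a minimal sketch, assuming the recorded holograms are stacked into a complex data cube with one axis per wavenumber and two transverse spatial-frequency axes; the array names, dimensions, and axis convention are illustrative, not taken from the paper.

```python
import numpy as np

# Hypothetical stack of 2D Fourier holograms, one per swept-source
# wavenumber: shape (Nk, Ny, Nx), complex-valued. In practice this cube
# would be filled by the acquisition system.
Nk, Ny, Nx = 64, 256, 256
holograms = np.zeros((Nk, Ny, Nx), dtype=complex)

# Reconstruct the 3D morphological layout with a single 3D inverse DFT:
# the wavenumber axis maps to depth and the spatial-frequency axes map
# back to the transverse image plane.
volume = np.fft.ifftn(holograms, axes=(0, 1, 2))
intensity = np.abs(volume) ** 2  # reflectivity-like magnitude image
```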

Relevance: 20.00%

Abstract:

While others have attempted to determine, by way of mathematical formulae, optimal resource duplication strategies for random walk protocols, this paper is concerned with studying the emergent effects of dynamic resource propagation and replication. In particular, we show, via modelling and experimentation, that under any given decay (purge) rate the number of nodes that have knowledge of a particular resource converges to a fixed point or a limit cycle. We also show that even for high rates of decay, that is, when few nodes have knowledge of a particular resource, the number of hops required to find that resource is small.
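A toy simulation makes the claimed behaviour easy to reproduce qualitatively. This is a sketch under assumed dynamics (gossip-style replication on a random neighbourhood structure with an independent per-copy purge probability); the topology, parameters, and update rule are illustrative and not the authors' model.

```python
import random

def simulate(n_nodes=200, degree=4, p_replicate=0.3, p_decay=0.1,
             steps=500, seed=1):
    """Track how many nodes hold a resource as replication fights decay."""
    rng = random.Random(seed)
    # Illustrative random neighbourhoods (not necessarily symmetric).
    neighbours = [rng.sample(range(n_nodes), degree) for _ in range(n_nodes)]
    holders = {0}  # one initial copy of the resource
    history = []
    for _ in range(steps):
        new_holders = set(holders)
        for node in holders:
            if rng.random() < p_replicate:          # propagate a copy
                new_holders.add(rng.choice(neighbours[node]))
        # Purge: each copy decays independently at the given rate.
        holders = {n for n in new_holders if rng.random() >= p_decay}
        history.append(len(holders))
    return history

# The tail of the series typically settles near a fixed point (or cycles).
print(simulate()[-10:])
```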

Relevance: 20.00%

Abstract:

Security protocols preserve essential properties, such as confidentiality and authentication, of electronically transmitted data. However, such properties cannot be directly expressed or verified in contemporary formal methods. Via a detailed example, we describe the phases needed to formalise and verify the correctness of a security protocol in the state-oriented Z formalism.
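The paper works in Z, but the underlying idea of state-oriented verification can be illustrated in a few lines of Python: enumerate the protocol's reachable states and check that a security property holds in every one. The two-step protocol, the state shape, and the confidentiality property below are all invented for illustration.

```python
# Hypothetical protocol: A sends a secret to B encrypted; an attacker may
# observe traffic. Confidentiality: the attacker never sees the plaintext.
SECRET = "K"

def step(state):
    """Yield successor states of the (sent, observed) protocol state."""
    sent, observed = state
    if not sent:
        yield (True, observed)        # A encrypts and sends the secret
    elif observed is None:
        yield (sent, "ciphertext")    # attacker intercepts ciphertext only

def reachable(initial):
    seen, frontier = {initial}, [initial]
    while frontier:
        for nxt in step(frontier.pop()):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return seen

states = reachable((False, None))
assert all(observed != SECRET for _, observed in states)
print(f"checked {len(states)} states; confidentiality invariant holds")
```

In Z, the analogous artefacts would be a state schema, operation schemas for each protocol step, and a proof that the invariant is preserved; the exhaustive search above is just a finite-state counterpart of that argument.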

Relevance: 20.00%

Abstract:

Security protocols are often modelled at a high level of abstraction, potentially overlooking implementation-dependent vulnerabilities. Here we use the Z specification language's rich set of data structures to formally model potentially ambiguous messages that may be exploited in a 'type flaw' attack. We then show how to formally verify whether or not such an attack is actually possible in a particular protocol using Z's schema calculus.
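The essence of a type flaw is easy to demonstrate: on the wire, messages are untyped bit strings, so a receiver expecting one field type may accept a value of another type if the lengths line up. The encoding and field sizes below are invented for illustration; the paper models such ambiguity with Z data structures and checks attack feasibility via the schema calculus.

```python
import os

# Equal lengths make a nonce and a key indistinguishable on the wire.
NONCE_LEN = KEY_LEN = 16

def encode(*fields: bytes) -> bytes:
    """Naive untagged concatenation: the encoding carries no type tags."""
    return b"".join(fields)

def parse_as_nonce_and_key(msg: bytes):
    """A receiver that trusts field position rather than field type."""
    return msg[:NONCE_LEN], msg[NONCE_LEN:NONCE_LEN + KEY_LEN]

na, nb = os.urandom(NONCE_LEN), os.urandom(NONCE_LEN)
wire = encode(na, nb)                      # sender meant: a pair of nonces
nonce, key = parse_as_nonce_and_key(wire)  # receiver reads: a nonce and a key
assert key == nb  # nonce nb is silently accepted as a session key
```

Tagging each field with its type (or making the types distinguishable by length) removes the ambiguity; whether a given protocol actually needs such a defence is what the formal analysis establishes.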

Relevance: 20.00%

Abstract:

Freshwater is extremely precious, and clean freshwater even more so. Although two thirds of our planet is covered in water, a century of industrial activity has contaminated the globe with chemicals on an unprecedented scale, harming humans and wildlife. A new scientific mindset is needed to face this problem and protect this vital resource. The Water Framework Directive (WFD; European Parliament and the Council, 2000) is a milestone piece of legislation that transformed water quality monitoring across all Member States by introducing the Ecological and Chemical Status, with a "good or higher" Ecological Status expected for all European waterbodies by 2015. Yet for most European waterbodies judged to be at risk, or of moderate to bad quality, further information is required before adequate remediation strategies can be implemented.

To date, water quality evaluation is based on five biological components (phytoplankton, macrophytes and benthic algae, macroinvertebrates, and fish) and on various hydromorphological and physicochemical elements. The evaluation of Chemical Status rests principally on 33 priority substances and 12 further xenobiotics considered dangerous for the environment. This approach covers only a fraction of the xenobiotics that may be present in surface waters and cannot reveal all the possible causes of ecotoxicological stress acting on a water section; mixtures of toxic chemicals may pose an ecological risk that is not predictable from the concentrations of the individual components. To improve water quality, sources of contamination and causes of ecological alteration need to be identified. On the other hand, analysis of community structure, which reflects multiple processes including hydrological constraints and physico-chemical stress, provides only a "snapshot" of the current status of a site without revealing the causes and sources of the perturbation. A multidisciplinary approach that integrates information obtained by different methods, such as community structure analysis and eco-genotoxicological studies, could help overcome some of the difficulties in properly identifying the different causes of stress in risk assessment. In short, the ecological status of a river results from a combination of multiple pressures that, for management and quality-improvement purposes, have to be disentangled from one another. To reduce the present uncertainty in risk assessment, methods are needed that establish quantitative links between levels of contamination and community alterations. Analysis of macrobenthic invertebrate community structure has been widely used to identify sites subject to perturbation, and trait-based descriptors of community structure are a useful tool in ecological risk assessment. The diagnostic capacity of freshwater biomonitoring could be further improved by chronic sublethal toxicity testing of water and sediment samples: because the exposure time covers most of a species' life cycle, chronic toxicity tests can reveal negative effects on life-history traits at contaminant concentrations well below the acute toxicity level.

Furthermore, the responses of high-level endpoints (growth, fecundity, mortality) can be integrated to evaluate the impact on population dynamics, a highly relevant endpoint from an ecological point of view. To gain more accurate information about the potential causes and consequences of environmental contamination, adverse effects must also be evaluated at the physiological, biochemical and genetic levels. Different biomarkers and toxicity tests can provide information about the sub-lethal and toxic load of environmental compartments. Biomarkers give essential information about exposure to toxicants, such as endocrine-disrupting compounds and genotoxic substances, whose negative effects cannot be detected using high-level toxicological endpoints alone. The increasing presence of genotoxic pollutants in the environment has raised concern about the potential harmful effects of xenobiotics on human health, and interest in the development of new, more sensitive methods for assessing mutagenic and carcinogenic risk. Within the WFD, biomarkers and bioassays are regarded as important tools for gaining lines of evidence on cause-effect relationships in ecological quality assessment. Although the scientific community clearly recognises the advantages and necessity of an ecotoxicological approach within ecological quality assessment, a recent review reports that, more than a decade after publication of the WFD, only a few studies have attempted to integrate ecological water status assessment with biological methods (namely biomarkers or bioassays), and none of the fifteen reviewed studies included both. The integrated approach developed in this PhD thesis comprises a set of laboratory bioassays (Daphnia magna acute and chronic toxicity tests, the Comet Assay and the FPG-Comet Assay) that were newly developed, modified by taking a cue from existing standardized protocols, or applied to freshwater quality testing (ecotoxicological, genotoxicological and toxicogenomic assays), coupled with field investigations of macrobenthic community structure (the SPEAR and EBI indices). Alongside the development of new bioassays with Daphnia magna, the feasibility of eco-genotoxicological testing of freshwater and sediment quality with Heterocypris incongruens was evaluated (Comet Assay and a protocol for chronic toxicity). The Comet Assay, although standardized, was not applied to freshwater samples because this species showed a lack of sensitivity after 24 h of exposure to relatively high (and not environmentally relevant) concentrations of reference genotoxicants; the species also proved unsuitable for chronic toxicity testing, owing to the difficulty of evaluating fecundity as a sub-lethal endpoint of exposure and to complications arising from its biology and behaviour. The study was applied to a pilot hydrographic sub-basin by selecting sections subject to different levels of anthropogenic pressure. This allowed us to establish reference conditions, to select the most significant endpoints, and to evaluate the coherence of the responses across the different lines of evidence (alteration of community structure, eco-genotoxicological responses, alteration of gene expression profiles) and, finally, the diagnostic capacity of the monitoring strategy.

Significant correlations were found between the genotoxicological parameter Tail Intensity % (TI%) and the macrobenthic community descriptors SPEAR (p<0.001) and EBI (p<0.05); between the genotoxicological parameter describing DNA oxidative stress (ΔTI%) and mean nitrate levels (p<0.01); and between reproductive impairment (Failed Development % from D. magna chronic bioassays) and both TI% (p<0.001) and EBI (p<0.001). While the correlations among parameters demonstrate a general coherence in the response to increasing impact, the ability of each single endpoint to respond to specific sources of stress underpins the diagnostic capacity of the integrated approach, as demonstrated by stations showing a mismatch among the different lines of evidence. The chosen set of bioassays, and the selected endpoints, do not provide redundant indications of water quality status; on the contrary, they contribute complementary pieces of information about the several stressors acting simultaneously on a waterbody section, giving this monitoring strategy a solid diagnostic capacity. Our approach should create opportunities for integrating biological effects into surface water monitoring programmes, especially investigative monitoring, and should provide a more realistic assessment of the impact on, and exposure of, aquatic organisms to contaminants. Finally, it should help evaluate the drivers of change in biodiversity and their consequences for the provision of ecosystem functions and services, that is, the direct and indirect contributions to human well-being.
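As an aside on the statistics, a correlation between a bioassay endpoint and a community descriptor can be computed station by station as sketched below. The thesis does not state which correlation method was used; Spearman's rank correlation is assumed here, and the data are placeholders standing in for the measured TI% and SPEAR values.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
# Placeholder per-station values for demonstration only; in the study these
# would be the measured Tail Intensity % (Comet Assay) and SPEAR scores.
tail_intensity = rng.uniform(5, 40, size=12)
spear_index = 50 - tail_intensity + rng.normal(0, 3, size=12)

# Rank correlation is a reasonable default for small, possibly non-normal
# monitoring datasets (an assumption, not the thesis's stated choice).
rho, p_value = spearmanr(tail_intensity, spear_index)
print(f"rho = {rho:.2f}, p = {p_value:.3g}")
```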

Relevance: 20.00%

Abstract:

Purpose: The aim of this study was to compare a developmental optical coherence tomography (OCT)-based contact lens inspection instrument with a widely used geometric inspection instrument (the Optimec JCF), to establish the capability of a market-focused OCT system. Methods: Measurements of 27 soft spherical contact lenses were made using the Optimec JCF and a new OCT-based instrument, the Optimec is830. Twelve of the lenses analysed were specially commissioned in a traditional hydrogel (Contamac GM Advance 49%) and twelve in a silicone hydrogel (Contamac Definitive 65), each set spanning a range of back optic zone radius (BOZR) and centre thickness (CT) values. Three commercial lenses were also measured: CooperVision MyDay (Stenfilcon A) in −10D, −3D and +6D powers. Two measurements of BOZR, CT and total diameter were made for each lens in temperature-controlled saline on both instruments. Results: The is830 and JCF measurements were comparable, but the is830 had a better repeatability coefficient for BOZR (0.065 mm versus 0.151 mm) and CT (0.008 mm versus 0.027 mm); the instruments performed similarly for total diameter (0.041 mm versus 0.044 mm). Conclusions: The OCT-based instrument assessed in this study matches, and improves on, the JCF instrument for the measurement of total diameter, back optic zone radius and centre thickness of soft contact lenses in temperature-controlled saline.
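For readers unfamiliar with the repeatability coefficient, one common definition (Bland-Altman, estimated from duplicate measurements) is sketched below; whether the study used exactly this convention is an assumption, and the variable names are illustrative.

```python
import numpy as np

def repeatability_coefficient(first, second):
    """Bland-Altman repeatability from paired duplicate measurements.

    The within-subject SD estimated from duplicates is
    Sw = sqrt(mean(d**2) / 2), where d is the difference between the two
    measurements of each lens; the repeatability coefficient is then
    1.96 * sqrt(2) * Sw (approximately 2.77 * Sw).
    """
    d = np.asarray(first, dtype=float) - np.asarray(second, dtype=float)
    sw = np.sqrt(np.mean(d ** 2) / 2.0)
    return 1.96 * np.sqrt(2.0) * sw

# Usage with the two BOZR readings per lens from one instrument:
# rc_bozr = repeatability_coefficient(bozr_first, bozr_second)
```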

Relevance: 20.00%

Abstract:

Spectral and coherence methodologies are ubiquitous in the analysis of multiple time series. Partial coherence analysis may be used to try to determine graphical models of brain functional connectivity. The outcome of such an analysis can be considerably influenced by factors such as the degree of spectral smoothing, line and interference removal, matrix inversion stabilization, the suppression of effects caused by side-lobe leakage, the combination of results from different epochs and people, and multiple hypothesis testing. This paper examines each of these steps in turn and provides a possible path that produces relatively ‘clean’ connectivity plots. In particular, we show how spectral matrix diagonal up-weighting can simultaneously stabilize spectral matrix inversion and reduce effects caused by side-lobe leakage, and we use the stepdown multiple hypothesis test procedure to help determine interaction strength.
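A single-frequency sketch of the diagonal up-weighting idea is given below, using the standard definition of squared partial coherence from the inverse spectral matrix; the function name and the loading factor are illustrative choices, not values from the paper.

```python
import numpy as np

def partial_coherence(S, loading=0.01):
    """Squared partial coherence from a p x p spectral matrix S at one frequency.

    Adding a small multiple of S's own diagonal before inversion ('diagonal
    up-weighting') stabilises the inverse, in the spirit of the approach
    described in the paper.
    """
    S_loaded = S + loading * np.diag(np.diag(S))
    G = np.linalg.inv(S_loaded)
    d = np.sqrt(np.abs(np.diag(G)))
    # |G_ij|^2 / (G_ii * G_jj): squared partial coherence between channels
    pc = np.abs(G / np.outer(d, d)) ** 2
    np.fill_diagonal(pc, 1.0)
    return pc

# Usage: S = cross-spectral matrix at one frequency (Hermitian, p x p);
# pc = partial_coherence(S) gives the pairwise partial coherence estimates.
```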