910 results for cache coherence protocols
Abstract:
Full-field Fourier-domain optical coherence tomography (3F-OCT) is a full-field version of spectral-domain/swept-source optical coherence tomography. A set of two-dimensional Fourier holograms is recorded at discrete wavenumbers spanning the swept-source tuning range. The resultant three-dimensional data cube contains comprehensive information on the three-dimensional morphological layout of the sample, which can be reconstructed in software via a three-dimensional discrete Fourier transform. This method of recording the OCT signal confers a signal-to-noise ratio improvement in comparison with "flying-spot" time-domain OCT. The spatial resolution of the 3F-OCT reconstructed image, however, is degraded due to the presence of a phase cross-term, whose origin and effects are addressed in this paper. We present a theoretical and experimental study of the imaging performance of 3F-OCT, with particular emphasis on elimination of the deleterious effects of the phase cross-term.
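A minimal sketch of the reconstruction step described above, assuming the recorded holograms are stacked into a (wavenumber × x × y) NumPy array; the array sizes and contents below are placeholders, not measured data:

```python
import numpy as np

# Placeholder data cube: Nk two-dimensional Fourier holograms recorded at
# discrete wavenumbers spanning the swept-source tuning range (hypothetical sizes).
Nk, Nx, Ny = 256, 128, 128
holograms = np.zeros((Nk, Nx, Ny), dtype=complex)  # would hold the measured holograms

# Software reconstruction: a single three-dimensional discrete Fourier transform
# maps the data cube to the sample's three-dimensional morphological layout.
volume = np.fft.fftshift(np.fft.fftn(holograms))
reflectivity = np.abs(volume)  # magnitude image of the reconstructed volume
```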
Abstract:
While others have attempted to determine, by way of mathematical formulae, optimal resource duplication strategies for random walk protocols, this paper is concerned with studying the emergent effects of dynamic resource propagation and replication. In particular, we show, via modelling and experimentation, that under any given decay (purge) rate the number of nodes that have knowledge of a particular resource converges to a fixed point or a limit cycle. We also show that even for high rates of decay - that is, when few nodes have knowledge of a particular resource - the number of hops required to find that resource is small.
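The convergence behaviour can be illustrated with a toy simulation. This is a sketch under assumed conditions (a fully connected topology and per-step replication and purge probabilities), none of which come from the paper:

```python
import random

def simulate(n_nodes=1000, p_replicate=0.3, p_decay=0.2, steps=200, seed=1):
    """Count the nodes holding a resource under dynamic propagation and decay."""
    random.seed(seed)
    holders = set(range(10))           # a few seed nodes know the resource
    history = []
    for _ in range(steps):
        # replication: each holder copies the resource to a random node
        gained = {random.randrange(n_nodes)
                  for _ in holders if random.random() < p_replicate}
        # decay (purge): each existing copy is dropped independently
        holders = {n for n in holders if random.random() > p_decay} | gained
        history.append(len(holders))
    return history                     # settles near a fixed point for these rates

print(simulate()[-5:])                 # tail of the trajectory
```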
Abstract:
Security protocols preserve essential properties, such as confidentiality and authentication, of electronically transmitted data. However, such properties cannot be directly expressed or verified in contemporary formal methods. Via a detailed example, we describe the phases needed to formalise and verify the correctness of a security protocol in the state-oriented Z formalism.
Abstract:
Security protocols are often modelled at a high level of abstraction, potentially overlooking implementation-dependent vulnerabilities. Here we use the Z specification language's rich set of data structures to formally model potentially ambiguous messages that may be exploited in a 'type flaw' attack. We then show how to formally verify whether or not such an attack is actually possible in a particular protocol using Z's schema calculus.
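To make the idea concrete, here is a small illustration (in Python rather than Z, with an invented field layout) of how an untagged wire format invites a type flaw, while a tagged one can be checked:

```python
# Hypothetical wire formats for a protocol field that should contain a key.
# Without type tags, nonce bytes are indistinguishable from key bytes, which
# is exactly the ambiguity a 'type flaw' attack exploits.

def parse_untagged(field: bytes) -> bytes:
    return field                      # blindly trusted to be a key

def parse_tagged(field: bytes) -> bytes:
    tag, body = field[:1], field[1:]  # assumed layout: 1-byte tag + payload
    if tag != b"K":
        raise ValueError("type flaw: field is not tagged as a key")
    return body

nonce = b"N" + b"\x00" * 16
parse_untagged(nonce)                 # silently accepted as a 'key'
try:
    parse_tagged(nonce)
except ValueError as e:
    print(e)                          # the tag check rejects the nonce
```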
Abstract:
Freshwater is extremely precious; but even more precious than freshwater is clean freshwater. Although 2/3 of our planet is covered in water, we have contaminated our globe, in an unprecedented way, with the chemicals used by industrial activities over the last century, causing harm to humans and wildlife. We must adopt a new scientific mindset in order to face this problem and protect this important resource. The Water Framework Directive (European Parliament and the Council, 2000) is a milestone legislative document that transformed the way water quality monitoring is undertaken across all Member States by introducing the Ecological and Chemical Status. A "good or higher" Ecological Status was expected to be achieved for all waterbodies in Europe by 2015. Yet for most European waterbodies, which are determined to be at risk or of moderate to bad quality, further information will be required so that adequate remediation strategies can be implemented. To date, water quality evaluation is based on five biological components (phytoplankton, macrophytes and benthic algae, macroinvertebrates, and fishes) and various hydromorphological and physicochemical elements. The evaluation of the chemical status is principally based on 33 priority substances and on 12 xenobiotics considered dangerous for the environment. This approach takes into account only a fraction of the numerous xenobiotics that can be present in surface waters and cannot reveal all the possible causes of ecotoxicological stress acting on a water section. Mixtures of toxic chemicals may constitute an ecological risk that is not predictable on the basis of single-component concentrations. To improve water quality, sources of contamination and causes of ecological alterations need to be identified. On the other hand, the analysis of community structure, which is the result of multiple processes, including hydrological constraints and physico-chemical stress, gives back only a "photograph" of the actual status of a site without revealing the causes and sources of the perturbation. A multidisciplinary approach, able to integrate the information obtained by different methods, such as community structure analysis and eco-genotoxicological studies, could help overcome some of the difficulties in properly identifying the different causes of stress in risk assessment. In synthesis, the river ecological status is the result of a combination of multiple pressures that, for management purposes and quality improvement, have to be disentangled from each other. To reduce the present uncertainty in risk assessment, methods that establish quantitative links between levels of contamination and community alterations are needed. The analysis of macrobenthic invertebrate community structure has been widely used to identify sites subjected to perturbation. Trait-based descriptors of community structure constitute a useful method in ecological risk assessment. The diagnostic capacity of freshwater biomonitoring could be improved by chronic sublethal toxicity testing of water and sediment samples. Because they require an exposure time that covers most of the species' life cycle, chronic toxicity tests are able to reveal negative effects on life-history traits at contaminant concentrations well below the acute toxicity level.
Furthermore, the responses of high-level endpoints (growth, fecundity, mortality) can be integrated in order to evaluate the impact on population dynamics, a highly relevant endpoint from the ecological point of view. To gain more accurate information about the potential causes and consequences of environmental contamination, the evaluation of adverse effects at the physiological, biochemical and genetic levels is also needed. The use of different biomarkers and toxicity tests can give information about the sub-lethal and toxic load of environmental compartments. Biomarkers give essential information about exposure to toxicants, such as endocrine-disrupting compounds and genotoxic substances, whose negative effects cannot be detected using only high-level toxicological endpoints. The increasing presence of genotoxic pollutants in the environment has caused concern regarding the potential harmful effects of xenobiotics on human health, and interest in the development of new and more sensitive methods for the assessment of mutagenic and carcinogenic risk. Within the WFD, biomarkers and bioassays are regarded as important tools to gain lines of evidence for cause-effect relationships in ecological quality assessment. Although the scientific community clearly acknowledges the advantages and necessity of an ecotoxicological approach within ecological quality assessment, a recent review reports that, more than one decade after the publication of the WFD, only a few studies have attempted to integrate ecological water status assessment and biological methods (namely biomarkers or bioassays). None of the fifteen reviewed studies included both biomarkers and bioassays. The integrated approach developed in this PhD Thesis comprises a set of laboratory bioassays (Daphnia magna acute and chronic toxicity tests, Comet Assay and FPG-Comet) that were newly developed, adapted from existing standardized protocols, or applied for freshwater quality testing (ecotoxicological, genotoxicological and toxicogenomic assays), coupled with field investigations of macrobenthic community structure (SPEAR and EBI indices). Together with the development of new bioassays with Daphnia magna, the feasibility of eco-genotoxicological testing of freshwater and sediment quality with Heterocypris incongruens was evaluated (Comet Assay and a protocol for chronic toxicity). However, the Comet Assay, although standardized, was not applied to freshwater samples because of the lack of sensitivity of this species observed after 24 h of exposure to relatively high (and not environmentally relevant) concentrations of reference genotoxicants. Furthermore, this species also proved unsuitable for chronic toxicity testing, owing to the difficulty of evaluating fecundity as a sub-lethal endpoint of exposure and to complications arising from its biology and behaviour. The study was applied to a pilot hydrographic sub-basin by selecting sections subjected to different levels of anthropogenic pressure: this allowed us to establish the reference conditions, to select the most significant endpoints, and to evaluate the coherence of the responses of the different lines of evidence (alteration of community structure, eco-genotoxicological responses, alteration of gene expression profiles) and, finally, the diagnostic capacity of the monitoring strategy.
Significant correlations were found between the genotoxicological parameter Tail Intensity % (TI%) and the macrobenthic community descriptors SPEAR (p<0.001) and EBI (p<0.05), between the genotoxicological parameter describing DNA oxidative stress (ΔTI%) and mean levels of nitrates (p<0.01), and between reproductive impairment (Failed Development % from D. magna chronic bioassays) and TI% (p<0.001) as well as EBI (p<0.001). While the correlation among parameters demonstrates a general coherence in the response to increasing impacts, the concomitant ability of each single endpoint to respond to specific sources of stress is at the basis of the diagnostic capacity of the integrated approach, as demonstrated by stations presenting a mismatch among the different lines of evidence. The chosen set of bioassays, as well as the selected endpoints, do not provide redundant indications of water quality status but, on the contrary, contribute complementary pieces of information about the several stressors that act simultaneously on a waterbody section, providing this monitoring strategy with a solid diagnostic capacity. Our approach should provide opportunities for the integration of biological effects into monitoring programmes for surface water, especially in investigative monitoring. Moreover, it should provide a more realistic assessment of the impact and exposure of aquatic organisms to contaminants. Finally, this approach should provide an evaluation of the drivers of change in biodiversity and of their effects on the provision of ecosystem functions/services, that is, the direct and indirect contributions to human well-being.
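As a pointer to how such endpoint-descriptor correlations are computed, a minimal sketch with made-up station values (the real data are in the thesis, and the correlation type is assumed here to be Pearson):

```python
from scipy.stats import pearsonr

# Illustrative per-station values only; not the thesis data.
ti_percent = [12.1, 18.4, 25.3, 30.2, 41.7]   # Comet Assay Tail Intensity %
spear      = [55.0, 48.2, 36.9, 30.1, 22.4]   # SPEAR index at the same stations

r, p = pearsonr(ti_percent, spear)
print(f"r = {r:.2f}, p = {p:.4f}")            # an inverse TI%-SPEAR relationship
```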
Abstract:
Purpose: The aim of this study was to compare a developmental optical coherence tomography (OCT)-based contact lens inspection instrument with a widely used geometric inspection instrument (the Optimec JCF), to establish the capability of a market-focused OCT system. Methods: Measurements of 27 soft spherical contact lenses were made using the Optimec JCF and a new OCT-based instrument, the Optimec is830. Twelve of the lenses analysed were specially commissioned from a traditional hydrogel (Contamac GM Advance 49%) and twelve from a silicone hydrogel (Contamac Definitive 65), each set covering a range of back optic zone radius (BOZR) and centre thickness (CT) values. Three commercial lenses were also measured: CooperVision MyDay (Stenfilcon A) in −10D, −3D and +6D powers. Two measurements of BOZR, CT and total diameter were made for each lens in temperature-controlled saline on both instruments. Results: The results showed that the is830 and JCF measurements were comparable, but that the is830 had a better repeatability coefficient for BOZR (0.065 mm compared with 0.151 mm) and CT (0.008 mm compared with 0.027 mm). Both instruments gave similar results for total diameter (0.041 mm compared with 0.044 mm). Conclusions: The OCT-based instrument assessed in this study matches or improves on the JCF instrument for the measurement of total diameter, back optic zone radius and centre thickness for soft contact lenses in temperature-controlled saline.
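The abstract does not state how the repeatability coefficient was computed; a common Bland-Altman style definition from two repeated measurements per lens looks like the sketch below (an assumption, not the paper's stated method):

```python
import numpy as np

def repeatability_coefficient(first, second):
    """CoR = 1.96 * sqrt(2) * s_w, with the within-subject standard deviation
    s_w estimated from paired differences of two repeats per lens."""
    d = np.asarray(first, float) - np.asarray(second, float)
    s_w = np.sqrt(np.sum(d ** 2) / (2 * len(d)))
    return 1.96 * np.sqrt(2) * s_w

# e.g. two BOZR readings (mm) per lens on the same instrument (invented values)
print(repeatability_coefficient([8.40, 8.55, 8.61], [8.42, 8.52, 8.64]))
```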
Abstract:
Spectral and coherence methodologies are ubiquitous in the analysis of multiple time series. Partial coherence analysis may be used to try to determine graphical models for brain functional connectivity. The outcome of such an analysis may be considerably influenced by factors such as the degree of spectral smoothing, line and interference removal, matrix inversion stabilization, the suppression of effects caused by side-lobe leakage, the combination of results from different epochs and people, and multiple hypothesis testing. This paper examines each of these steps in turn and provides a possible path to relatively 'clean' connectivity plots. In particular, we show how up-weighting the spectral matrix diagonal can simultaneously stabilize spectral matrix inversion and reduce effects caused by side-lobe leakage, and we use the step-down multiple hypothesis test procedure to help quantify interaction strength.
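A sketch of the diagonal up-weighting step and the partial-coherence computation at a single frequency; the loading fraction is chosen arbitrarily and is not the paper's exact weighting scheme:

```python
import numpy as np

def partial_coherence(S, load=0.01):
    """Partial coherence from a p x p Hermitian spectral matrix S.

    Adding a small multiple of the average power to the diagonal stabilizes
    the matrix inversion and damps side-lobe leakage effects."""
    p = S.shape[0]
    S_loaded = S + load * (np.trace(S).real / p) * np.eye(p)
    G = np.linalg.inv(S_loaded)
    d = np.sqrt(np.abs(np.diag(G)))
    pc = np.abs(G) / np.outer(d, d)       # |G_ij| / sqrt(G_ii * G_jj)
    np.fill_diagonal(pc, 0.0)
    return pc ** 2                        # squared partial coherences in [0, 1]
```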
Abstract:
Background: Sense of coherence (SOC) is an individual characteristic related to a positive life orientation leading to effective coping. A weak SOC has been associated with indicators of general morbidity and mortality. However, the relationship between SOC and diabetes has not been studied in a prospective design. The present study prospectively examined the relationship between a weak SOC and the incidence of diabetes. Methods: The relationship between a weak SOC and the incidence of diabetes was investigated among 5827 Finnish male employees aged 18–65 at baseline (1986). SOC was measured by questionnaire survey at baseline. Data on prescribed diabetes drugs from 1987 to 2004 were obtained from the Drug Reimbursement Register held by the Social Insurance Institution. Results: During the follow-up, 313 cases of diabetes were recorded. A weak SOC was associated with a 46% higher risk of diabetes in participants who had been ≤50 years of age on entry into the study. This association was independent of age, education, marital status, psychological distress, self-rated health, smoking status, binge drinking and physical activity. No similar association was observed in older employees. Conclusion: The results suggest that besides focusing on well-known risk factors for diabetes, strengthening the SOC of employees aged ≤50 years may also play a role in attempts to tackle increasing rates of diabetes.
Abstract:
EV is a child with a talent for learning language combined with Asperger syndrome. EV’s talent is evident in the unusual circumstances of her acquisition of both her first (Bulgarian) and second (German) languages and the unique patterns of both receptive and expressive language (in both the L1 and L2), in which she shows subtle dissociations in competence and performance consistent with an uneven cognitive profile of skills and abilities. We argue that this case provides support for theories of language learning and usage that require more general underlying cognitive mechanisms and skills. One such account, the Weak Central Coherence (WCC) hypothesis of autism, provides a plausible framework for the interpretation of the simultaneous co-occurrence of EV’s particular pattern of cognitive strengths and weaknesses. Furthermore, we show that specific features of the uneven cognitive profile of Asperger syndrome can help explain the observed language talent displayed by EV. Thus, rather than demonstrating a case where language learning takes place despite the presence of deficits, EV’s case illustrates how a pattern of strengths within this profile can specifically promote language learning.
Abstract:
We describe an all-fibre, passive scheme for making extended-range interferometric measurements based on the dual-wavelength technique. The coherence-tuned interferometer network is illuminated with a single superfluorescent fibre source at 1.55 µm and the two wavelengths are synthesised at the output by means of chirped fibre Bragg gratings. We demonstrate an unambiguous sensing range of 270 µm, with a dynamic range of 2.7 × 10⁵.
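The extended range follows from the synthetic-wavelength relation. As a hedged check (the abstract gives neither the second wavelength nor the range convention, so the 4.5 nm separation below is an assumption chosen to reproduce the reported figure):

```python
# Synthetic wavelength of a dual-wavelength interferometer:
# Lambda = lambda1 * lambda2 / |lambda1 - lambda2|
lam1 = 1550e-9                 # m, the superfluorescent source band
dlam = 4.5e-9                  # m, assumed separation set by the Bragg gratings
lam2 = lam1 + dlam

synthetic = lam1 * lam2 / abs(lam2 - lam1)
print(f"synthetic wavelength ~ {synthetic * 1e6:.0f} µm")      # ~535 µm
print(f"unambiguous range    ~ {synthetic / 2 * 1e6:.0f} µm")  # ~268 µm, near 270 µm
```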
Abstract:
Purpose. To evaluate the repeatability and reproducibility of subfoveal choroidal thickness (CT) calculations performed manually using optical coherence tomography (OCT). Methods. The CT was imaged in vivo at each of two visits in 11 healthy volunteers (mean age, 35.72 ± 13.19 years) using spectral-domain OCT. CT was measured manually after applying ImageJ processing filters on 15 radial subfoveal scans; the radial scans were spaced 12° apart and each contained 2500 A-scans. The coefficient of variability, coefficient of repeatability (CoR), coefficient of reproducibility, and intraclass correlation coefficient were used to determine the reproducibility and repeatability of the calculation. Axial length (AL) and mean spherical equivalent refractive error were measured with the IOLMaster and an open-view autorefractor to study their potential relationship with CT. Results. The within-visit and between-visit coefficient of variability, CoR, coefficient of reproducibility, and intraclass correlation coefficient were 0.80, 2.97%, 2.44%, and 99%, respectively. The subfoveal CT correlated significantly with AL (R = -0.60, p = 0.05). Conclusions. The subfoveal CT could be measured manually in vivo using OCT, and the readings obtained from the healthy subjects evaluated were repeatable and reproducible. It is proposed that OCT could be a useful instrument for in vivo assessment and monitoring of CT changes in retinal disease. The preliminary results suggest a negative correlation between subfoveal CT and AL, such that CT decreases with increasing AL but not with refractive error.
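For the intraclass correlation coefficient, a one-way random-effects ICC(1,1) sketch is shown below; the paper does not specify which ICC form was used, so treat this as illustrative:

```python
import numpy as np

def icc_1_1(measurements):
    """One-way random-effects ICC(1,1) from an (n_subjects, k_repeats) array."""
    x = np.asarray(measurements, float)
    n, k = x.shape
    grand = x.mean()
    ms_between = k * np.sum((x.mean(axis=1) - grand) ** 2) / (n - 1)
    ms_within = np.sum((x - x.mean(axis=1, keepdims=True)) ** 2) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# e.g. subfoveal CT (µm) for three subjects, two visits each (invented values)
print(icc_1_1([[310, 314], [285, 283], [342, 339]]))
```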