925 results for cache coherence protocols
Abstract:
Security protocols are often modelled at a high level of abstraction, potentially overlooking implementation-dependent vulnerabilities. Here we use the Z specification language's rich set of data structures to formally model potentially ambiguous messages that may be exploited in a 'type flaw' attack. We then show how to formally verify whether or not such an attack is actually possible in a particular protocol using Z's schema calculus.
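The abstract above refers to Z schemas, which are not reproduced here. As a rough illustration of the underlying idea only (not the authors' formal model), the hypothetical Python sketch below shows how two differently typed protocol messages can become indistinguishable once serialized, which is exactly the ambiguity a 'type flaw' attack exploits.

```python
# Toy illustration (not the paper's Z model): a 'type flaw' arises when two
# differently typed protocol messages serialize to indistinguishable bit
# strings, so a receiver may accept one in place of the other.
import os

def encode_nonce_msg(sender_id: bytes, nonce: bytes) -> bytes:
    # {A, Na}: identity followed by a 16-byte nonce
    return sender_id + nonce

def encode_key_msg(sender_id: bytes, session_key: bytes) -> bytes:
    # {A, Kab}: identity followed by a 16-byte session key
    return sender_id + session_key

def ambiguous(msg1: bytes, msg2: bytes) -> bool:
    # With no type tags, equal length is enough for the encodings to collide.
    return len(msg1) == len(msg2)

alice = b"A"
nonce = os.urandom(16)
key = os.urandom(16)
# True: a nonce-bearing message could be accepted where a key is expected.
print(ambiguous(encode_nonce_msg(alice, nonce), encode_key_msg(alice, key)))
```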
Abstract:
Freshwater is extremely precious; but even more precious than freshwater is clean freshwater. Although two thirds of our planet is covered in water, over the last century we have contaminated our globe, in an unprecedented way, with chemicals used by industrial activities, causing harm to humans and wildlife. We have to adopt a new scientific mindset to face this problem and protect this important resource. The Water Framework Directive (European Parliament and the Council, 2000) is a milestone legislative document that transformed the way water quality monitoring is undertaken across all Member States by introducing the Ecological and Chemical Status. A “good or higher” Ecological Status is expected to be achieved for all waterbodies in Europe by 2015. Yet, for most of the European waterbodies determined to be at risk, or of moderate to bad quality, further information will be required so that adequate remediation strategies can be implemented. To date, water quality evaluation is based on five biological components (phytoplankton, macrophytes and benthic algae, macroinvertebrates and fishes) and various hydromorphological and physicochemical elements. The evaluation of the chemical status is principally based on 33 priority substances and on 12 xenobiotics considered dangerous for the environment. This approach takes into account only a part of the numerous xenobiotics that can be present in surface waters and cannot reveal all the possible causes of ecotoxicological stress that can act on a water section. Mixtures of toxic chemicals may constitute an ecological risk that is not predictable from the concentrations of the single components. To improve water quality, sources of contamination and causes of ecological alterations need to be identified. On the other hand, the analysis of community structure, which is the result of multiple processes, including hydrological constraints and physico-chemical stress, gives back only a “photograph” of the current status of a site without revealing the causes and sources of the perturbation. A multidisciplinary approach, able to integrate the information obtained by different methods, such as community structure analysis and eco-genotoxicological studies, could help overcome some of the difficulties in properly identifying the different causes of stress in risk assessment. In summary, the ecological status of a river is the result of a combination of multiple pressures that, for management purposes and quality improvement, have to be disentangled from each other. To reduce the current uncertainty in risk assessment, methods that establish quantitative links between levels of contamination and community alterations are needed. The analysis of macrobenthic invertebrate community structure has been widely used to identify sites subjected to perturbation. Trait-based descriptors of community structure constitute a useful method in ecological risk assessment. The diagnostic capacity of freshwater biomonitoring could be improved by chronic sublethal toxicity testing of water and sediment samples. Requiring an exposure time that covers most of the species’ life cycle, chronic toxicity tests are able to reveal negative effects on life-history traits at contaminant concentrations well below the acute toxicity level.
Furthermore, the responses of high-level endpoints (growth, fecundity, mortality) can be integrated in order to evaluate the impact on population dynamics, a highly relevant endpoint from the ecological point of view. To gain more accurate information about the potential causes and consequences of environmental contamination, the evaluation of adverse effects at the physiological, biochemical and genetic level is also needed. The use of different biomarkers and toxicity tests can give information about the sub-lethal and toxic load of environmental compartments. Biomarkers give essential information about exposure to toxicants, such as endocrine disrupting compounds and genotoxic substances, whose negative effects cannot be evidenced by using only high-level toxicological endpoints. The increasing presence of genotoxic pollutants in the environment has caused concern regarding the potential harmful effects of xenobiotics on human health, and interest in the development of new and more sensitive methods for the assessment of mutagenic and carcinogenic risk. Within the WFD, biomarkers and bioassays are regarded as important tools to gain lines of evidence for cause-effect relationships in ecological quality assessment. Although the scientific community clearly recognises the advantages and necessity of an ecotoxicological approach within ecological quality assessment, a recent review reports that, more than one decade after the publication of the WFD, only a few studies have attempted to integrate ecological water status assessment and biological methods (namely biomarkers or bioassays). None of the fifteen reviewed studies included both biomarkers and bioassays. The integrated approach developed in this PhD Thesis comprises a set of laboratory bioassays (Daphnia magna acute and chronic toxicity tests, Comet Assay and FPG-Comet) that were newly developed, modified taking a cue from existing standardized protocols, or applied to freshwater quality testing (ecotoxicological, genotoxicological and toxicogenomic assays), coupled with field investigations of macrobenthic community structure (SPEAR and EBI indexes). Together with the development of new bioassays with Daphnia magna, the feasibility of eco-genotoxicological testing of freshwater and sediment quality with Heterocypris incongruens was evaluated (Comet Assay and a protocol for chronic toxicity). However, the Comet Assay, although standardized, was not applied to freshwater samples due to the lack of sensitivity of this species observed after 24 h of exposure to relatively high (and not environmentally relevant) concentrations of reference genotoxicants. Furthermore, this species also proved unsuitable for chronic toxicity testing, owing to the difficulty of evaluating fecundity as a sub-lethal endpoint of exposure and to complications arising from its biology and behaviour. The study was applied to a pilot hydrographic sub-basin by selecting sections subjected to different levels of anthropogenic pressure: this allowed us to establish the reference conditions, to select the most significant endpoints and to evaluate the coherence of the responses of the different lines of evidence (alteration of community structure, eco-genotoxicological responses, alteration of gene expression profiles) and, finally, the diagnostic capacity of the monitoring strategy.
Significant correlations were found between the genotoxicological parameter Tail Intensity % (TI%) and the macrobenthic community descriptors SPEAR (p<0.001) and EBI (p<0.05), between the genotoxicological parameter describing DNA oxidative stress (ΔTI%) and mean nitrate levels (p<0.01), and between reproductive impairment (Failed Development % from D. magna chronic bioassays) and TI% (p<0.001) as well as EBI (p<0.001). While the correlation among parameters demonstrates a general coherence in the response to increasing impacts, the concomitant ability of each single endpoint to respond to specific sources of stress is the basis of the diagnostic capacity of the integrated approach, as demonstrated by stations presenting a mismatch among the different lines of evidence. The chosen set of bioassays, as well as the selected endpoints, does not provide redundant indications of water quality status but, on the contrary, contributes complementary pieces of information about the several stressors that act simultaneously on a waterbody section, giving this monitoring strategy a solid diagnostic capacity. Our approach should provide opportunities for the integration of biological effects into monitoring programmes for surface water, especially in investigative monitoring. Moreover, it should provide a more realistic assessment of the impact and exposure of aquatic organisms to contaminants. Finally, this approach should provide an evaluation of the drivers of change in biodiversity and of their consequences for ecosystem function/service provision, that is, the direct and indirect contributions to human well-being.
Abstract:
Purpose: The aim of this study was to compare a developmental optical coherence tomography (OCT) based contact lens inspection instrument to a widely used geometric inspection instrument (Optimec JCF), to establish the capability of a market focused OCT system. Methods: Measurements of 27 soft spherical contact lenses were made using the Optimec JCF and a new OCT based instrument, the Optimec is830. Twelve of the lenses analysed were specially commissioned from a traditional hydrogel (Contamac GM Advance 49%) and 12 from a silicone hydrogel (Contamac Definitive 65), each set with a range of back optic zone radius (BOZR) and centre thickness (CT) values. Three commercial lenses were also measured; CooperVision MyDay (Stenfilcon A) in −10D, −3D and +6D powers. Two measurements of BOZR, CT and total diameter were made for each lens in temperature controlled saline on both instruments. Results: The results showed that the is830 and JCF measurements were comparable, but that the is830 had a better repeatability coefficient for BOZR (0.065 mm compared to 0.151 mm) and CT (0.008 mm compared to 0.027 mm). Both instruments had similar results for total diameter (0.041 mm compared to 0.044 mm). Conclusions: The OCT based instrument assessed in this study is able to match and improve on the JCF instrument for the measurement of total diameter, back optic zone radius and centre thickness for soft contact lenses in temperature controlled saline.
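For context on the repeatability figures quoted above, the sketch below assumes the common Bland-Altman style definition of the repeatability coefficient (1.96 × the standard deviation of the differences between paired repeat measurements); the paper may use a different formulation, and the BOZR values shown are purely illustrative.

```python
# Minimal sketch of a repeatability coefficient of the kind quoted above,
# assuming the common Bland-Altman definition: 1.96 x SD of the differences
# between two repeated measurements of each lens. Data below are made up.
import numpy as np

def repeatability_coefficient(first: np.ndarray, second: np.ndarray) -> float:
    diffs = first - second                 # paired differences, one per lens
    return 1.96 * np.std(diffs, ddof=1)    # 95% limit for a repeat measurement

# Hypothetical BOZR readings (mm) for a handful of lenses, measured twice.
bozr_run1 = np.array([8.40, 8.62, 8.71, 8.55, 8.48])
bozr_run2 = np.array([8.43, 8.60, 8.69, 8.57, 8.46])
print(f"CoR = {repeatability_coefficient(bozr_run1, bozr_run2):.3f} mm")
```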
Abstract:
Spectral and coherence methodologies are ubiquitous for the analysis of multiple time series. Partial coherence analysis may be used to try to determine graphical models for brain functional connectivity. The outcome of such an analysis may be considerably influenced by factors such as the degree of spectral smoothing, line and interference removal, matrix inversion stabilization and the suppression of effects caused by side-lobe leakage, the combination of results from different epochs and people, and multiple hypothesis testing. This paper examines each of these steps in turn and provides a possible path which produces relatively ‘clean’ connectivity plots. In particular we show how spectral matrix diagonal up-weighting can simultaneously stabilize spectral matrix inversion and reduce effects caused by side-lobe leakage, and use the stepdown multiple hypothesis test procedure to help formulate an interaction strength.
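A minimal sketch of two of the steps named above, diagonal up-weighting of the spectral matrix before inversion and forming partial coherences from the inverse, is given below. The loading fraction `delta` and the exact normalisation are assumptions for illustration, not the authors' settings.

```python
# Sketch (not the authors' exact pipeline): stabilise inversion of a spectral
# matrix S(f) by up-weighting its diagonal, then form partial coherences from
# the inverse. delta is an assumed loading fraction.
import numpy as np

def partial_coherence(S: np.ndarray, delta: float = 0.05) -> np.ndarray:
    """S: (p, p) complex spectral matrix at a single frequency."""
    p = S.shape[0]
    loaded = S + delta * (np.trace(S).real / p) * np.eye(p)  # diagonal up-weighting
    G = np.linalg.inv(loaded)                                # stabilised inverse
    d = np.diag(G).real
    PC = np.abs(G) ** 2 / np.outer(d, d)                     # |G_ij|^2 / (G_ii G_jj)
    np.fill_diagonal(PC, 1.0)
    return PC

# Tiny demonstration on a random, well-conditioned spectral matrix.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 200)) + 1j * rng.standard_normal((4, 200))
S = X @ X.conj().T / 200
print(np.round(partial_coherence(S), 3))
```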
Abstract:
Background: Sense of coherence (SOC) is an individual characteristic related to a positive life orientation leading to effective coping. A weak SOC has been associated with indicators of general morbidity and mortality. However, the relationship between SOC and diabetes has not been studied in a prospective design. The present study prospectively examined the relationship between a weak SOC and the incidence of diabetes. Methods: The relationship between a weak SOC and the incidence of diabetes was investigated among 5827 Finnish male employees aged 18–65 at baseline (1986). SOC was measured by questionnaire survey at baseline. Data on prescription diabetes drugs from 1987 to 2004 were obtained from the Drug Imbursement Register held by the Social Insurance Institution. Results: During the follow-up, 313 cases of diabetes were recorded. A weak SOC was associated with a 46% higher risk of diabetes in participants who had been ≤50 years of age on entry into the study. This association was independent of age, education, marital status, psychological distress, self-rated health, smoking status, binge drinking and physical activity. No similar association was observed in older employees. Conclusion: The results suggest that besides focusing on well-known risk factors for diabetes, strengthening SOC in employees ≤50 years of age can also play a role in attempts to tackle increasing rates of diabetes.
Abstract:
EV is a child with a talent for learning language combined with Asperger syndrome. EV’s talent is evident in the unusual circumstances of her acquisition of both her first (Bulgarian) and second (German) languages and the unique patterns of both receptive and expressive language (in both the L1 and L2), in which she shows subtle dissociations in competence and performance consistent with an uneven cognitive profile of skills and abilities. We argue that this case provides support for theories of language learning and usage that require more general underlying cognitive mechanisms and skills. One such account, the Weak Central Coherence (WCC) hypothesis of autism, provides a plausible framework for the interpretation of the simultaneous co-occurrence of EV’s particular pattern of cognitive strengths and weaknesses. Furthermore, we show that specific features of the uneven cognitive profile of Asperger syndrome can help explain the observed language talent displayed by EV. Thus, rather than demonstrating a case where language learning takes place despite the presence of deficits, EV’s case illustrates how a pattern of strengths within this profile can specifically promote language learning.
Abstract:
We describe an all-fibre, passive scheme for making extended range interferometric measurements based on the dual wavelength technique. The coherence tuned interferometer network is illuminated with a single superfluorescent fibre source at 1.55 µm and the two wavelengths are synthesised at the output by means of chirped fibre Bragg gratings. We demonstrate an unambiguous sensing range of 270 µm, with a dynamic range of 2.7 × 10⁵.
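For background (not quoted from the paper), the dual wavelength technique rests on the synthetic-wavelength relation below; the abstract does not give the two synthesised wavelengths, so the symbols are generic.

```latex
% Background relation for the dual-wavelength technique (not taken from the paper):
% two wavelengths \lambda_1, \lambda_2 act like a single, much longer synthetic wavelength
\[
  \Lambda \;=\; \frac{\lambda_1 \lambda_2}{\lvert \lambda_1 - \lambda_2 \rvert},
\]
% so the unambiguous measurement range grows as the wavelength separation shrinks.
```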
Abstract:
Purpose. To evaluate the repeatability and reproducibility of subfoveal choroidal thickness (CT) calculations performed manually using optical coherence tomography (OCT). Methods. The CT was imaged in vivo at each of two visits on 11 healthy volunteers (mean age, 35.72 ± 13.19 years) using spectral domain OCT. CT was manually measured after applying ImageJ processing filters on 15 radial subfoveal scans. Each radial scan was spaced 12° from the next and contained 2500 A-scans. The coefficient of variability, coefficient of repeatability (CoR), coefficient of reproducibility, and intraclass correlation coefficient determined the reproducibility and repeatability of the calculation. Axial length (AL) and mean spherical equivalent refractive error were measured with the IOLMaster and an open view autorefractor to study their potential relationship with CT. Results. The within-visit and between-visit coefficient of variability, CoR, coefficient of reproducibility, and intraclass correlation coefficient were 0.80, 2.97%, 2.44%, and 99%, respectively. The subfoveal CT correlated significantly with AL (R = -0.60, p = 0.05). Conclusions. The subfoveal CT could be measured manually in vivo using OCT, and the readings obtained from the healthy subjects evaluated were repeatable and reproducible. It is proposed that OCT could be a useful instrument to perform in vivo assessment and monitoring of CT changes in retinal disease. The preliminary results suggest a negative correlation between subfoveal CT and AL, such that CT decreases with increasing AL but not with refractive error.
Abstract:
This research is concerned with the development of distributed real-time systems, in which software is used for the control of concurrent physical processes. These distributed control systems are required to periodically coordinate the operation of several autonomous physical processes, with the property of an atomic action. The implementation of this coordination must be fault-tolerant if the integrity of the system is to be maintained in the presence of processor or communication failures. Commit protocols have been widely used to provide this type of atomicity and ensure consistency in distributed computer systems. The objective of this research is the development of a class of robust commit protocols, applicable to the coordination of distributed real-time control systems. Extended forms of the standard two-phase commit protocol, which provide fault-tolerant and real-time behaviour, were developed. Petri nets are used for the design of the distributed controllers, and to embed the commit protocol models within these controller designs. This composition of controller and protocol model allows the analysis of the complete system in a unified manner. A common problem for Petri net based techniques is that of state space explosion; a modular approach to both design and analysis would help cope with this problem. Although extensions to Petri nets that allow module construction exist, generally the modularisation is restricted to the specification, and analysis must be performed on the (flat) detailed net. The Petri net designs for the type of distributed systems considered in this research are both large and complex. The top-down, bottom-up and hybrid synthesis techniques that are used to model large systems in Petri nets are considered. A hybrid approach to Petri net design for a restricted class of communicating processes is developed. Designs produced using this hybrid approach are modular and allow re-use of verified modules. In order to use this form of modular analysis, it is necessary to project an equivalent but reduced behaviour onto the modules used. These projections conceal events local to modules that are not essential for the purpose of analysis. To generate the external behaviour, each firing sequence of the subnet is replaced by an atomic transition internal to the module, and the firing of these transitions transforms the input and output markings of the module. Thus local events are concealed through the projection of the external behaviour of modules. This hybrid design approach preserves properties of interest, such as boundedness and liveness, while the systematic concealment of local events allows the management of state space. The approach presented in this research is particularly suited to distributed systems, as the underlying communication model is used as the basis for the interconnection of modules in the design procedure. This hybrid approach is applied to the Petri net based design and analysis of distributed controllers for two industrial applications that incorporate the robust, real-time commit protocols developed. Temporal Petri nets, which combine Petri nets and temporal logic, are used to capture and verify causal and temporal aspects of the designs in a unified manner.
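As a minimal illustration of the Petri net mechanics underlying such designs (not the thesis' actual controller nets), the hypothetical Python sketch below models places, transitions and marking updates for a toy two-site commit.

```python
# Minimal Petri net sketch (illustrative only, not the thesis' controller nets):
# places carry token counts, and a transition fires when every input place holds
# enough tokens; exploring reachable markings underlies boundedness/liveness analysis.
from dataclasses import dataclass

@dataclass
class Transition:
    name: str
    inputs: dict   # place -> tokens consumed
    outputs: dict  # place -> tokens produced

def enabled(marking: dict, t: Transition) -> bool:
    return all(marking.get(p, 0) >= n for p, n in t.inputs.items())

def fire(marking: dict, t: Transition) -> dict:
    m = dict(marking)
    for p, n in t.inputs.items():
        m[p] -= n
    for p, n in t.outputs.items():
        m[p] = m.get(p, 0) + n
    return m

# Two-site 'commit' toy: both sites must vote before the coordinator commits.
t_vote_a = Transition("vote_A", {"ready_A": 1}, {"voted_A": 1})
t_vote_b = Transition("vote_B", {"ready_B": 1}, {"voted_B": 1})
t_commit = Transition("commit", {"voted_A": 1, "voted_B": 1}, {"committed": 1})

m = {"ready_A": 1, "ready_B": 1}
for t in (t_vote_a, t_vote_b, t_commit):
    if enabled(m, t):
        m = fire(m, t)
print(m)  # ends with one token in 'committed', all vote places empty
```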
Abstract:
Modern distributed control systems comprise a set of processors which are interconnected using a suitable communication network. For use in real-time control environments, such systems must be deterministic and generate specified responses within critical timing constraints. Also, they should be sufficiently robust to survive predictable events such as communication or processor faults. This thesis considers the problem of coordinating and synchronizing a distributed real-time control system under normal and abnormal conditions. Distributed control systems need to periodically coordinate the actions of several autonomous sites. Often the type of coordination required is the all-or-nothing property of an atomic action. Atomic commit protocols have been used to achieve this atomicity in distributed database systems which are not subject to deadlines. This thesis addresses the problem of applying time constraints to atomic commit protocols so that decisions can be made within a deadline. A modified protocol is proposed which is suitable for real-time applications. The thesis also addresses the problem of ensuring that atomicity is provided even if processor or communication failures occur. Previous work has considered the design of atomic commit protocols for use in non-time-critical distributed database systems. However, in a distributed real-time control system a fault must not allow stringent timing constraints to be violated. This thesis proposes commit protocols using synchronous communications which can be made resilient to a single processor or communication failure and still satisfy deadlines. Previous formal models used to design commit protocols have had adequate state coverability but have omitted timing properties. They also assumed that sites communicated asynchronously and omitted the communications from the model. Timed Petri nets are used in this thesis to specify and design the proposed protocols, which are analysed for consistency and timeliness. Also, the communication system is modelled within the Petri net specifications so that communication failures can be included in the analysis. Analysis of the Timed Petri net and the associated reachability tree is used to show that the proposed protocols always terminate consistently and satisfy timing constraints. Finally, the applications of this work are described. Two different types of applications are considered: real-time databases and real-time control systems. It is shown that it may be advantageous to use synchronous communications in distributed database systems, especially if predictable response times are required. Emphasis is given to the application of the developed commit protocols to real-time control systems. Using the same analysis techniques as those used for the design of the protocols, it can be shown that the overall system performs as expected both functionally and temporally.
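The following hypothetical Python sketch illustrates the general idea of a deadline-aware two-phase commit decision, committing only if every vote arrives in time and otherwise aborting consistently. It is not the protocol developed in the thesis, and all timings are invented.

```python
# Sketch of a deadline-aware two-phase commit decision (illustrative, not the
# thesis' protocol): the coordinator commits only if every participant votes
# 'yes' before the deadline; a missing or late vote forces a consistent abort.
import time

def coordinate(participants, vote_fn, deadline_s: float) -> str:
    start = time.monotonic()
    for site in participants:                      # phase 1: collect votes
        remaining = deadline_s - (time.monotonic() - start)
        if remaining <= 0:
            return "ABORT"                         # deadline missed -> abort
        if vote_fn(site, timeout=remaining) != "yes":
            return "ABORT"                         # any 'no' or timeout -> abort
    return "COMMIT"                                # phase 2: broadcast commit

# Hypothetical voting function: site B is slow and may exceed its share of the deadline.
def vote_fn(site, timeout):
    delays = {"A": 0.01, "B": 0.05}
    if delays[site] > timeout:
        return "timeout"
    time.sleep(delays[site])
    return "yes"

print(coordinate(["A", "B"], vote_fn, deadline_s=0.1))   # COMMIT within the deadline
print(coordinate(["A", "B"], vote_fn, deadline_s=0.03))  # ABORT: site B cannot answer in time
```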
Abstract:
We assess the accuracy of the Visante anterior segment optical coherence tomographer (AS-OCT) and present improved formulas for measurement of surface curvature and axial separation. Measurements are made in physical model eyes. Accuracy is compared for measurements of corneal thickness (d1) and anterior chamber depth (d2) using the built-in AS-OCT software versus the improved scheme. The improved scheme enables measurements of lens thickness (d3) and surface curvature, in the form of conic sections specified by vertex radii and conic constants. These parameters are converted to surface coordinates for error analysis. The built-in AS-OCT software typically overestimates (mean ± standard deviation (SD)) d1 by +62 ± 4 μm and d2 by +4 ± 88 μm. The improved scheme reduces d1 (-0.4 ± 4 μm) and d2 (0 ± 49 μm) errors while also reducing d3 errors from +218 ± 90 μm (uncorrected) to +14 ± 123 μm (corrected). Surface x coordinate errors gradually increase toward the periphery. Considering the central 6-mm zone of each surface, the x coordinate errors for the anterior and posterior corneal surfaces reached +3 ± 10 and 0 ± 23 μm, respectively, with the improved scheme. Those of the anterior and posterior lens surfaces reached +2 ± 22 and +11 ± 71 μm, respectively. Our improved scheme reduced AS-OCT errors and could, therefore, enhance pre- and postoperative assessments of keratorefractive or cataract surgery, including measurement of accommodating intraocular lenses. © 2007 Society of Photo-Optical Instrumentation Engineers.
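For reference, a conic surface with vertex radius R and conic constant Q, the parameterisation mentioned above, has the standard sag relation below; this is background, not the paper's correction formulas.

```latex
% Standard conic-section sag (background to the vertex-radius / conic-constant
% parameterisation mentioned above; not the paper's correction formulas):
\[
  z(y) \;=\; \frac{y^{2}/R}{1 + \sqrt{1 - (1+Q)\,y^{2}/R^{2}}},
\]
% where R is the vertex radius, Q the conic constant, y the radial coordinate,
% and z the axial surface coordinate used for point-by-point error analysis.
```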
Abstract:
Background: A new commercially available optical low coherence reflectometry device (Lenstar, Haag-Streit, or Allegro Biograph, Wavelight) provides high-resolution non-contact measurements of ocular biometry. The study evaluates the validity and repeatability of these measurements compared with current clinical instrumentation. Method: Measurements were taken with the LenStar and IOLMaster on 112 patients aged 41–96 years listed for cataract surgery. A subgroup of 21 patients also had A-scan applanation ultrasonography (OcuScan) performed. Intersession repeatability of the LenStar measurements was assessed on 32 patients. Results: LenStar measurements of white-to-white were similar to the IOLMaster (average difference 0.06 (SD 0.03) mm; p = 0.305); corneal curvature measurements were similar to the IOLMaster (average difference -0.04 (0.20) D; p = 0.240); anterior chamber depth measurements were significantly longer than the IOLMaster (by 0.10 (0.40) mm) and ultrasound (by 0.32 (0.62) mm; p < 0.001); crystalline lens thickness measurements were similar to ultrasound (difference 0.16 (0.83) mm, p = 0.382); axial length measurements were significantly longer than the IOLMaster (by 0.01 (0.02) mm) but shorter than ultrasound (by 0.14 (0.15) mm; p < 0.001). The LenStar was unable to take measurements due to dense media opacities in a similar number of patients to the IOLMaster (9–10%). The LenStar biometric measurements were found to be highly repeatable (variability ≤2% of the average value). Conclusions: Although there were some statistical differences in ocular biometry measurements between the LenStar and current clinical instruments, they were not clinically significant. LenStar measurements were highly repeatable and the instrument easy to use.
Abstract:
Gestalt grouping rules imply a process or mechanism for grouping together local features of an object into a perceptual whole. Several psychophysical experiments have been interpreted as evidence for constrained interactions between nearby spatial filter elements and this has led to the hypothesis that element linking might be mediated by these interactions. A common tacit assumption is that these interactions result in response modulation which disturbs a local contrast code. We addressed this possibility by performing contrast discrimination experiments using two-dimensional arrays of multiple Gabor patches arranged either (i) vertically, (ii) in circles (coherent conditions), or (iii) randomly (incoherent condition), as well as for a single Gabor patch. In each condition, contrast increments were applied to either the entire test stimulus (experiment 1) or a single patch whose position was cued (experiment 2). In experiment 3, the texture stimuli were reduced to a single contour by displaying only the central vertical strip. Performance was better for the multiple-patch conditions than for the single-patch condition, but whether the multiple-patch stimulus was coherent or not had no systematic effect on the results in any of the experiments. We conclude that constrained local interactions do not interfere with a local contrast code for our suprathreshold stimuli, suggesting that, in general, this is not the way in which element linking is achieved. The possibility that interactions are involved in enhancing the detectability of contour elements at threshold remains unchallenged by our experiments.
Abstract:
Recent advances in technology have produced a significant increase in the availability of free sensor data over the Internet. With affordable weather monitoring stations now available to individual meteorology enthusiasts a reservoir of real time data such as temperature, rainfall and wind speed can now be obtained for most of the United States and Europe. Despite the abundance of available data, obtaining useable information about the weather in your local neighbourhood requires complex processing that poses several challenges. This paper discusses a collection of technologies and applications that harvest, refine and process this data, culminating in information that has been tailored toward the user. In this case we are particularly interested in allowing a user to make direct queries about the weather at any location, even when this is not directly instrumented, using interpolation methods. We also consider how the uncertainty that the interpolation introduces can then be communicated to the user of the system, using UncertML, a developing standard for uncertainty representation.
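As a minimal sketch of the kind of interpolation such a query service might use (the paper does not specify its method, and the stations below are invented), inverse-distance weighting estimates a value at an uninstrumented point from nearby readings; communicating the uncertainty of such an estimate, e.g. via UncertML, is a separate step not shown here.

```python
# Sketch of one simple spatial interpolation (inverse-distance weighting) that
# can answer "what is the temperature at my location?" from nearby stations.
# Illustrative only; the paper's system may use a different method.
import math

def idw(query, stations, power: float = 2.0) -> float:
    """stations: list of ((lat, lon), value). Returns the interpolated value at query."""
    num, den = 0.0, 0.0
    for (lat, lon), value in stations:
        d = math.hypot(query[0] - lat, query[1] - lon)
        if d == 0:
            return value                      # query sits exactly on a station
        w = 1.0 / d ** power                  # nearer stations weigh more
        num += w * value
        den += w
    return num / den

# Made-up station coordinates and temperature readings (degrees C).
stations = [((52.48, -1.90), 14.2), ((52.45, -1.95), 13.8), ((52.50, -1.85), 14.9)]
print(f"{idw((52.47, -1.92), stations):.1f} °C")
```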
Abstract:
We describe how an acousto-optic tunable filter can be used to both demultiplex the signals from multiple fibre Bragg grating sensors and simultaneously provide wide bandwidth signal demodulation in a system using interferometric wavelength shift detection. In an experimental demonstration, the approach provided a noise-limited strain resolution of 24.9 nε/√Hz at 15 Hz.
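For background (not taken from the paper), the strain sensitivity of a fibre Bragg grating in silica fibre follows the standard relation below, which is what an interferometric wavelength-shift demodulator ultimately converts into strain.

```latex
% Background relation for FBG strain sensing in silica fibre (not quoted from
% the paper): an axial strain \varepsilon shifts the Bragg wavelength \lambda_B by
\[
  \frac{\Delta\lambda_B}{\lambda_B} \;=\; (1 - p_e)\,\varepsilon ,
\]
% where p_e \approx 0.22 is the effective photo-elastic coefficient, so small
% wavelength shifts measured interferometrically map linearly onto strain.
```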