982 results for cache coherence protocols


Relevance:

20.00%

Publisher:

Abstract:

A compact, fiber-based spectrometer for biomedical applications utilizing a tilted fiber Bragg grating (TFBG) as an integrated dispersive element is demonstrated. Based on a 45° UV-written PS750 TFBG, a refractive spectrometer with a dispersion of 2.06 rad/μm and a numerical aperture of 0.1 was set up and tested as an integrated detector for an optical coherence tomography (OCT) system. The grating features a 23 mm long active region along the fiber; the dispersed spectrum is collimated vertically by a cylindrical lens and focused onto the detector array by an achromatic doublet. Covering 740 nm to 860 nm, the spectrometer was connected optically to a broadband white-light interferometer and a wide-field scan head, and electronically to an acquisition and control computer. Tomograms of ophthalmic and dermal samples were acquired with the frequency-domain OCT system, achieving 2.84 μm axial and 7.6 μm lateral resolution. © 2014 SPIE.
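As a quick plausibility check (not part of the abstract), the theoretical axial resolution of a spectral-domain OCT system with a Gaussian spectrum depends only on the centre wavelength and bandwidth; taking λ0 ≈ 800 nm as the midpoint of the stated 740-860 nm range and Δλ = 120 nm gives a value consistent with the reported 2.84 μm:

```latex
\delta z \;=\; \frac{2\ln 2}{\pi}\,\frac{\lambda_0^2}{\Delta\lambda}
        \;=\; 0.441 \times \frac{(0.80\,\mu\mathrm{m})^2}{0.12\,\mu\mathrm{m}}
        \;\approx\; 2.35\,\mu\mathrm{m}
```

The measured 2.84 μm sits slightly above this Gaussian ideal, as expected for a real (non-Gaussian) source spectrum and finite pixel sampling.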

Relevance:

20.00%

Publisher:

Abstract:

Several analysis protocols have been tested to identify early visual field losses in glaucoma patients using the mfVEP technique. Some succeeded in detecting field defects comparable to those found by standard automated perimetry (SAP), while others were less informative and required further adjustment and research. In this study we implemented a novel analysis approach and evaluated its validity and whether it can be used effectively for early detection of visual field defects in glaucoma. The purpose of this study is to examine the benefit of adding the mfVEP hemifield intersector analysis protocol to the standard HFA test when glaucomatous visual field loss is suspected. Three groups were tested: normal controls (38 eyes), glaucoma patients (36 eyes) and glaucoma suspects (38 eyes). All subjects underwent two standard Humphrey visual field (HFA 24-2) tests, optical coherence tomography of the optic nerve head, and a single mfVEP test in one session. The mfVEP results were analysed with the new protocol, the Hemifield Sector Analysis (HSA) protocol. Retinal nerve fibre layer (RNFL) thickness was recorded to identify subjects with suspicious RNFL loss. The hemifield intersector analysis of the mfVEP results showed that the signal-to-noise ratio (SNR) difference between superior and inferior hemifields differed significantly between the three groups (ANOVA, p<0.001, 95% CI). The superior-inferior differences were statistically significant in all 11/11 sectors in the glaucoma group (t-test, p<0.001), in 5/11 sectors in the glaucoma suspect group (t-test, p<0.01), and in only 1/11 sectors in the normal group (t-test, p<0.9). Sensitivity and specificity of the HSA protocol in detecting glaucoma were 97% and 86% respectively; for glaucoma suspects they were 89% and 79%. Using SAP and mfVEP results together in subjects with suspicious glaucomatous visual field defects, identified by low RNFL thickness, is beneficial in confirming early visual field defects. The new HSA protocol can detect glaucomatous visual field defects in both glaucoma and glaucoma suspect patients. Used in addition to SAP analysis, it provides information about focal visual field differences across the horizontal midline and confirms suspicious field defects. Sensitivity and specificity of the mfVEP test were very promising and correlated with other anatomical changes of glaucomatous field loss. The intersector analysis protocol can detect early field changes missed by the standard HFA test.
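A minimal sketch of the kind of per-sector comparison the HSA protocol describes, assuming hypothetical arrays of superior/inferior SNR values per sector (the paired-test choice, sector layout and data here are illustrative, not the study's):

```python
import numpy as np
from scipy import stats

N_SECTORS = 11  # the HSA protocol compares 11 paired sectors across the midline

def hemifield_sector_analysis(snr_superior, snr_inferior, alpha=0.001):
    """Compare superior vs. inferior hemifield SNR for each sector.

    snr_superior, snr_inferior: arrays of shape (n_subjects, N_SECTORS).
    Returns the sectors showing a statistically significant asymmetry.
    """
    significant = []
    for sector in range(N_SECTORS):
        t, p = stats.ttest_rel(snr_superior[:, sector], snr_inferior[:, sector])
        if p < alpha:
            significant.append(sector)
    return significant

# Illustrative data only: 36 "glaucoma" subjects with a depressed superior hemifield.
rng = np.random.default_rng(0)
sup = rng.normal(1.2, 0.3, size=(36, N_SECTORS))   # lower SNR (field loss)
inf = rng.normal(2.0, 0.3, size=(36, N_SECTORS))   # preserved hemifield
print(hemifield_sector_analysis(sup, inf))          # most sectors flagged
```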

Relevance:

20.00%

Publisher:

Abstract:

Objectives: To determine the best photographic surrogate markers for detecting sight-threatening macular oedema (MO) in people with diabetes attending UK national screening programmes. Design: A multicentre, prospective, observational cohort study of 3170 patients with photographic signs of diabetic retinopathy visible within the macular region [exudates within two disc diameters, microaneurysms/dot haemorrhages (M/DHs) and blot haemorrhages (BHs)] who were recruited from seven study centres. Setting: All patients were recruited and imaged at one of seven study centres in Aberdeen, Birmingham, Dundee, Dunfermline, Edinburgh, Liverpool and Oxford. Participants: Subjects with features of diabetic retinopathy visible within the macular region attending one of seven diabetic retinal screening programmes. Interventions: Alternative referral criteria for suspected MO based on photographic surrogate markers; an optical coherence tomographic examination in addition to the standard digital retinal photograph. Main outcome measures: (1) To determine the best method to detect sight-threatening MO in people with diabetes using photographic surrogate markers. (2) Sensitivity and specificity estimates to assess the costs and consequences of using alternative strategies. (3) Modelled long-term costs and quality-adjusted life-years (QALYs). Results: Prevalence of MO was strongly related to the presence of lesions and was roughly five times higher in subjects with exudates or BHs or more than two M/DHs within one disc diameter. Having worse visual acuity was associated with about a fivefold higher prevalence of MO. Current manual screening grading schemes that ignore visual acuity or the presence of M/DHs could be improved by taking these into account. Health service costs increase substantially with more sensitive/less specific strategies. A fully automated strategy, using the automated detection of patterns of photographic surrogate markers, is superior to all current manual grading schemes for detecting MO in people with diabetes. The addition of optical coherence tomography (OCT) to each strategy, prior to referral, results in a reduction in costs to the health service with no decrement in the number of MO cases detected. Conclusions: Compared with all current manual grading schemes, for the same sensitivity, a fully automated strategy, using the automated detection of patterns of photographic surrogate markers, achieves a higher specificity for detecting MO in people with diabetes, especially if visual acuity is included in the automated strategy. Overall, costs to the health service are likely to increase if more sensitive referral strategies are adopted over more specific screening strategies for MO, for only very small gains in QALYs. The addition of OCT to each screening strategy, prior to referral, results in a reduction in costs to the health service with no decrement in the number of MO cases detected. © Queen's Printer and Controller of HMSO 2013.
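To make the surrogate-marker logic concrete, here is a hedged sketch of a referral rule of the kind the study evaluates; the thresholds, the way visual acuity is combined, and the function name are illustrative assumptions, not the trial's published criteria:

```python
def refer_for_suspected_mo(exudates_within_1dd: bool,
                           blot_haemorrhages_within_1dd: bool,
                           m_dh_count_within_1dd: int,
                           visual_acuity_reduced: bool) -> bool:
    """Illustrative referral rule for suspected macular oedema (MO).

    Mirrors the abstract's finding that MO prevalence is roughly five times
    higher with exudates, blot haemorrhages, or more than two microaneurysms/
    dot haemorrhages within one disc diameter, and about fivefold higher again
    with reduced visual acuity. The combination below is an assumption.
    """
    lesion_signal = (exudates_within_1dd
                     or blot_haemorrhages_within_1dd
                     or m_dh_count_within_1dd > 2)
    # Including visual acuity improves on grading schemes that ignore it.
    return lesion_signal or visual_acuity_reduced
```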

Relevance:

20.00%

Publisher:

Abstract:

Since the external dimension of the European Union's Justice and Home Affairs (JHA) policy began to be considered, a substantial literature has discussed how the EU cooperates with non-member states to counter problems such as terrorism, organized crime and illegal migration. According to the EU, security interconnectedness has become so pronounced that threats can be adequately controlled only through effective concerted regional action. This reasoning has led the EU to develop a number of instruments that export certain elements of its JHA policies, whether through negotiation or socialization. Although the literature has explored how this transfer applies to terrorism and immigration, very little has been written on the externalisation of knowledge, practice and norms in the area of organized crime. This article proposes to bridge this gap by examining EU practice in developing the external dimension of its organized crime policies through the theoretical lens of the EU governance framework.

Relevance:

20.00%

Publisher:

Abstract:

Purpose: To develop and validate a classification system for focal vitreomacular traction (VMT), with and without macular hole, based on spectral domain optical coherence tomography (SD-OCT), intended to aid decision-making and prognostication.

Methods: A panel of retinal specialists convened to develop this system. A literature review followed by discussion of a wide range of cases formed the basis for the proposed classification. Key features on OCT were identified and analysed for their utility in clinical practice. A final classification was devised based on two sequential, independent validation exercises to improve interobserver variability.

Results: This classification tool pertains to idiopathic focal VMT assessed by a horizontal line scan using SD-OCT. The system uses width (W), interface features (I), foveal shape (S), retinal pigment epithelial changes (P), elevation of vitreous attachment (E), and inner and outer retinal changes (R) to give the acronym WISPERR. Each category is scored hierarchically. Results from the second independent validation exercise indicated a high level of agreement between graders: intraclass correlation ranged from 0.84 to 0.99 for continuous variables and Fleiss' kappa ranged from 0.76 to 0.95 for categorical variables.

Conclusions: We present an OCT-based classification system for focal VMT that allows anatomical detail to be scrutinised and scored qualitatively and quantitatively using a simple, pragmatic algorithm, which may be of value in clinical practice as well as in future research studies.
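A compact way to picture one WISPERR grading record; the field types and example values below are assumptions for illustration, not the published scoring bands:

```python
from dataclasses import dataclass

@dataclass
class WisperrGrade:
    """One SD-OCT horizontal line scan graded with the WISPERR categories."""
    width_um: float              # W: width of the vitreomacular attachment
    interface_features: int      # I: hierarchical interface-feature score
    foveal_shape: int            # S: foveal contour category
    rpe_changes: bool            # P: retinal pigment epithelial changes
    elevation_um: float          # E: elevation of the vitreous attachment
    inner_retinal_changes: bool  # R: inner retinal changes
    outer_retinal_changes: bool  # R: outer retinal changes

# Hypothetical scan, scored hierarchically category by category.
scan = WisperrGrade(width_um=450.0, interface_features=1, foveal_shape=2,
                    rpe_changes=False, elevation_um=120.0,
                    inner_retinal_changes=True, outer_retinal_changes=False)
```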

Relevance:

20.00%

Publisher:

Abstract:

The effect of a mild detemplation method based on Fenton chemistry (with and without prior solvent extraction) was compared with calcination in terms of the drug-uptake capacity of SBA-15 materials. A number of characterization techniques were applied to evaluate and compare the materials' properties, including TGA, CHN elemental analysis, N2 physisorption and 29Si NMR. The mild Fenton detemplation rendered a nearly pristine SBA-15 with no structural shrinkage, low residual template, and improved surface area, pore volume and silanol concentration. Drug (ibuprofen) adsorption experiments were carried out by immersing the powdered samples in solution. The mildly detemplated samples showed enhanced uptake, which can be explained by their higher density of silanols (mmol/g), originating from the absence of calcination in the Fenton approaches. © 2014 Elsevier B.V.

Relevance:

20.00%

Publisher:

Abstract:

The tragic events of September 11th ushered in a new era of unprecedented challenges. Our nation has to be protected from the alarming threats of adversaries, threats that exploit the nation's critical infrastructures and affect all sectors of the economy. There is a need for pervasive monitoring and decentralized control of the nation's critical infrastructures. The communication needs of monitoring and control of critical infrastructures were traditionally met by wired systems, which ensure high reliability and bandwidth but are expensive, inflexible, and do not support mobility or pervasive monitoring. Their Ethernet-based protocols use contention access, which results in frequent unsuccessful transmissions and high delay. An emerging class of wireless networks, embedded wireless sensor and actuator networks, has potential benefits for real-time monitoring and control of critical infrastructures. Using embedded wireless networks for this purpose requires secure, reliable and timely exchange of information among controllers, distributed sensors and actuators. This exchange takes place over shared wireless media, which are highly unpredictable due to path loss, shadow fading and ambient noise, while monitoring and control applications have stringent requirements on reliability, delay and security. The primary issue addressed in this dissertation is the impact of the wireless medium, in harsh industrial environments, on the reliable and timely delivery of critical data. In the first part of the dissertation, a combined networking and information-theoretic approach is adopted to determine the transmit power required to maintain a minimum wireless channel capacity for reliable data transmission. The second part describes a channel-aware scheduling scheme that ensures efficient utilization of the wireless link and guarantees delay. Analytical evaluations and simulations validate the feasibility of the methodologies and demonstrate that the protocols achieve reliable and real-time data delivery over wireless industrial networks.
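The first part's power calculation can be illustrated with the Shannon capacity relation, solving C = B·log2(1 + P·g/(N0·B)) for the transmit power P; the numbers and the aggregated path-gain model below are assumptions for illustration, not the dissertation's:

```python
import math

def required_transmit_power(capacity_bps, bandwidth_hz, noise_psd_w_per_hz, channel_gain):
    """Transmit power needed to sustain a minimum Shannon capacity.

    Inverts C = B * log2(1 + P * g / (N0 * B)) for P. The channel gain g
    aggregates path loss and shadow fading, as in the dissertation's setting.
    """
    snr_required = 2 ** (capacity_bps / bandwidth_hz) - 1
    return snr_required * noise_psd_w_per_hz * bandwidth_hz / channel_gain

# Illustrative numbers: 250 kb/s over a 2 MHz channel, -174 dBm/Hz thermal
# noise, and 90 dB of combined path loss and fading.
n0 = 10 ** ((-174 - 30) / 10)        # dBm/Hz -> W/Hz
g = 10 ** (-90 / 10)
print(required_transmit_power(250e3, 2e6, n0, g))  # watts (sub-microwatt here)
```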

Relevance:

20.00%

Publisher:

Abstract:

Understanding how decisions about international investments are made, and how this affects the overall pattern of investments and firms' performance, is of particular importance in both strategy and international business research. This dissertation first introduced home-host country relatedness (HHCR), the degree to which countries are efficiently combined within the investment portfolios of firms. It theorized and demonstrated that HHCR varies with the motivation for investments along at least two key dimensions: the nature of foreign investments and the connectedness of potential host countries to the rest of the world. Drawing on cognitive psychology and decision-making research, it developed a theory of strategic decision making proposing that strategic solutions are chosen close to a convenient anchor. Building on research on memory imprinting, it also proposed that managers tend to rely on older knowledge representations. In the context of international investment decisions, managers use their home country as an anchor and are more likely to choose host countries that are 'close' to it as sites for foreign investment. These decisions also rely more strongly on closeness in time-invariant country factors of a historic and geographic nature than in time-variant institutions. Empirical tests using comprehensive investment data on all public multinational companies (MNCs) worldwide, covering over 15,000 MNCs with over half a million subsidiaries, support these claims. Finally, the dissertation introduced the concept of international coherence (IC), defined as the degree to which an MNE's network comprises countries that are related. It was hypothesized that maintaining a high level of coherence is important for firm performance and enhances it, and that international coherence mitigates some of the negative effects of unrelated product diversification. Empirical tests using data on the foreign investments of over 20,000 public firms, together with a home-host country relatedness index developed for up to 24,300 home-host pairs, provided support for the theory advanced.

Relevance:

20.00%

Publisher:

Abstract:

I conducted this study to deepen understanding of the association between culture and writing by building, assessing, and refining a conceptual model of second language writing. To do this, I examined culture and coherence, and the relationship between them, through a mixed-methods research design. Coherence is an important and complex concept in ESL/EFL writing. I studied it in the research context of contrastive rhetoric, comparing the quality of coherence in argumentative essays written by undergraduates in Mainland China and by their U.S. peers. To analyze this complex concept, I synthesized five linguistic theories of coherence: Halliday and Hasan's cohesion theory, Carroll's theory of coherence, Enkvist's theory of coherence, Topical Structure Analysis, and Toulmin's Model. From this synthesis, 16 variables were generated. Across these 16 variables, Hotelling's T² test was conducted to test for differences in argumentative coherence between essays written by the two groups of participants. To complement the statistical analysis, I interviewed 30 of the writers; their responses were analyzed with open and axial coding. Analysis of the empirical data led me to refine the conceptual model by adding categories and establishing associations among them. The study found that U.S. students made more use of pronominal reference, while Chinese students adopted more lexical reiteration and extended parallel progression. The interview data suggested that this difference may be associated with differences in linguistic features and rhetorical conventions between Chinese and English. As far as Toulmin's Model is concerned, Chinese students scored higher on data than their U.S. peers. According to the interviews, this may be because Toulmin's Model, modified as three elements of argument, has long and widely been taught in Chinese writing instruction, whereas the U.S. participants said they had not been taught to write essays according to it. Implications drawn from the textual data analysis and from the structural model defining coherence are aimed at informing writing instruction, assessment, peer review, and self-revision.
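The multivariate comparison across the 16 coherence variables is the classic two-sample Hotelling T² setting; a minimal sketch, assuming two matrices of per-essay scores (SciPy has no built-in for this, so the statistic is computed directly; the variable names in the commented call are hypothetical):

```python
import numpy as np
from scipy import stats

def hotelling_t2(x, y):
    """Two-sample Hotelling T^2 test for a difference in mean vectors.

    x, y: arrays of shape (n_essays, n_variables), e.g. 16 coherence measures.
    Returns the T^2 statistic and an F-based p-value.
    """
    nx, p = x.shape
    ny = y.shape[0]
    diff = x.mean(axis=0) - y.mean(axis=0)
    pooled = ((nx - 1) * np.cov(x, rowvar=False)
              + (ny - 1) * np.cov(y, rowvar=False)) / (nx + ny - 2)
    t2 = (nx * ny) / (nx + ny) * diff @ np.linalg.solve(pooled, diff)
    f = t2 * (nx + ny - p - 1) / (p * (nx + ny - 2))
    p_value = stats.f.sf(f, p, nx + ny - p - 1)
    return t2, p_value

# Illustrative call: two groups of essays, each scored on the 16 variables.
# t2, p = hotelling_t2(china_scores, us_scores)
```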

Relevance:

20.00%

Publisher:

Abstract:

The deployment of wireless communications, coupled with the popularity of portable devices, has led to significant research in the area of mobile data caching. Prior research has focused on solutions that allow applications to run in wireless environments using proxy-based techniques. Most of these approaches are semantics-based and do not adequately represent the context of a user (i.e., the interpreted human intention). Although context may be treated implicitly, it is still crucial to data management. To address this challenge, this dissertation focuses on predicting two things: (i) the future location of the user and (ii) the locations over which a fetched data item remains a valid answer to the query. With this approach, more complete information about the dynamics of an application environment is maintained. The contribution of this dissertation is a novel data caching mechanism for pervasive computing environments that adapts dynamically to a mobile user's context. We design and develop a conceptual model and context-aware protocols for wireless data-caching management. Our replacement policy uses the validity of the data fetched from the server, and of the neighboring locations, to decide which cache entries are least likely to be needed in the future and are therefore good candidates for eviction when cache space is needed. The context-aware prefetching algorithm exploits the query context, defined by the mobile user's movement pattern and the requested information context, to guide the prefetching process. Numerical results and simulations show that the proposed prefetching and replacement policies significantly outperform conventional ones. Anticipated applications of these solutions include biomedical engineering, tele-health, medical information systems and business.
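A minimal sketch of the validity-and-location-aware eviction idea described above; the scoring heuristic, class name, and geometry are assumptions for illustration, not the dissertation's protocol:

```python
import time

class ContextAwareCache:
    """Toy cache that evicts the entry least likely to be needed next.

    Each entry stores the data plus the region where the server's answer is
    valid; entries valid near the user's predicted next location are kept.
    """
    def __init__(self, capacity):
        self.capacity = capacity
        self.entries = {}  # key -> (value, valid_region_center, expiry_time)

    def _score(self, entry, predicted_location):
        value, center, expiry = entry
        if time.time() > expiry:
            return float("-inf")          # expired data: evict first
        dx = center[0] - predicted_location[0]
        dy = center[1] - predicted_location[1]
        return -(dx * dx + dy * dy)       # closer to predicted path = keep

    def put(self, key, value, valid_region_center, ttl, predicted_location):
        if len(self.entries) >= self.capacity:
            victim = min(self.entries,
                         key=lambda k: self._score(self.entries[k], predicted_location))
            del self.entries[victim]
        self.entries[key] = (value, valid_region_center, time.time() + ttl)
```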

Relevance:

20.00%

Publisher:

Abstract:

The promise of Wireless Sensor Networks (WSNs) is the autonomous collaboration of a collection of sensors to accomplish goals that a single sensor cannot. Sensor networking serves a range of applications by providing raw data as the basis for further analysis and action. Imprecision in the collected data can badly mislead the decision-making of sensor-based applications, causing them to be ineffective or to fail outright. Because inherent WSN characteristics routinely corrupt raw sensor readings, many research efforts attempt to improve the accuracy of such "dirty" sensor data, which must be cleaned or corrected. However, existing data-cleaning solutions restrict themselves to static WSNs, in which deployed sensors rarely move during operation. Many emerging applications of WSNs need sensor mobility to improve efficiency and flexibility of use: the locations of deployed sensors are dynamic, and each sensor functions independently and contributes its own resources. Sensors mounted on vehicles to monitor traffic conditions are one prospective example. Sensor mobility makes the network topology, and the correlation among sensor streams, transient; existing methods for cleaning sensor data in static WSNs, which rely on static relationships among sensors, are invalid in such mobile scenarios. A data-cleaning solution that accounts for sensor movement is therefore actively needed. This dissertation aims to improve the quality of sensor data by considering the consequences of the various trajectory relationships of autonomous mobile sensors in the system. First, we address the dynamic network topology caused by sensor mobility: the concept of a virtual sensor is introduced and used for spatio-temporal selection of neighboring sensors to help clean sensor data streams. This is among the first methods to clean data in mobile sensor environments. We also study the mobility patterns of moving sensors relative to the boundaries of sub-areas of interest, and develop a belief-based analysis that determines reliable sets of neighboring sensors to improve cleaning performance, especially when node density is relatively low. Finally, we design a novel sketch-based technique to clean data from internal sensors, where spatio-temporal relationships among sensors cannot yield data correlations among sensor streams.
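A hedged sketch of spatio-temporal neighbour selection for cleaning one mobile sensor's reading; the radius, time window, and inverse-distance weighting are illustrative assumptions, not the dissertation's virtual-sensor algorithm:

```python
from dataclasses import dataclass

@dataclass
class Reading:
    sensor_id: str
    x: float          # position at sampling time (m)
    y: float
    t: float          # timestamp (s)
    value: float

def clean_reading(target: Reading, stream: list[Reading],
                  radius: float = 50.0, window: float = 5.0) -> float:
    """Replace a suspect reading with a distance-weighted neighbour average.

    Neighbours qualify only if they were within `radius` metres and `window`
    seconds of the target sample, which tracks the transient topology of a
    mobile WSN instead of relying on a fixed neighbour table.
    """
    weights, total = 0.0, 0.0
    for r in stream:
        if r.sensor_id == target.sensor_id:
            continue
        d2 = (r.x - target.x) ** 2 + (r.y - target.y) ** 2
        if d2 <= radius ** 2 and abs(r.t - target.t) <= window:
            w = 1.0 / (1.0 + d2)          # nearer neighbours weigh more
            weights += w
            total += w * r.value
    return total / weights if weights else target.value
```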

Relevance:

20.00%

Publisher:

Abstract:

The most important goal for future wireless connectivity will be to fully exploit the potential offered by all the network interfaces of mobile devices. For this reason, multihoming will in all likelihood be a mandatory requirement for applications that aim to deliver the best possible user experience. Concisely, multihoming can be defined as the complex process by which an end-host or an end-site has multiple points of attachment to the network. In practice, however, multihoming has proved difficult to implement and even harder to optimize. Indeed, despite years of research and development in the field, multihoming is still far from being a standard feature in network deployments, because protocol support for it is almost always inadequate. Naturally, for Android, as the most widely used mobile platform in the world, supporting multihoming is essential to broaden the range of functionality offered to its users. In light of this, this thesis surveys the state of the art of multihoming support in Android, comparing several network protocols and testing the solution that appears by far the most promising: LISP. Having examined the state of the art of protocols with multihoming support and the software architecture of LISPmob for Android, the main operational objective of this research is twofold: (a) to test seamless roaming across the various network interfaces of an Android device, which is precisely one of the goals of multihoming, using LISPmob; and (b) to run a large number of tests in order to obtain, from experimental data, key parameters of LISP performance and understand how realistic it is for an end user to adopt it as an effective multihoming solution.
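The mechanism LISP uses for multihoming can be pictured as a mapping from a stable endpoint identifier (EID) to several routing locators (RLOCs), one per active interface. A toy sketch, with made-up addresses and a trivial priority rule (LISPmob's real map-cache and probing logic is considerably more involved):

```python
from dataclasses import dataclass

@dataclass
class Rloc:
    address: str
    interface: str   # e.g. "wlan0" (Wi-Fi), "rmnet0" (cellular)
    priority: int    # lower = preferred
    up: bool

def select_rloc(map_cache: dict, eid: str) -> str:
    """Pick the best reachable locator for an EID (lowest priority wins)."""
    candidates = [r for r in map_cache[eid] if r.up]
    return min(candidates, key=lambda r: r.priority).address

# One EID reachable over Wi-Fi and cellular: losing Wi-Fi simply reroutes
# traffic to the cellular RLOC while the EID (and any session bound to it)
# stays constant -- the seamless roaming this thesis sets out to test.
cache = {"153.16.1.1": [Rloc("192.0.2.10", "wlan0", 1, True),
                        Rloc("198.51.100.7", "rmnet0", 2, True)]}
print(select_rloc("153.16.1.1", cache))      # Wi-Fi preferred
cache["153.16.1.1"][0].up = False            # Wi-Fi goes down
print(select_rloc("153.16.1.1", cache))      # falls back to cellular
```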