23 results for "Calculation tool in reliability"


Relevance:

100.00%

Publisher:

Abstract:

Vaccines remain a key tool in the defence against major diseases. However, vaccine development requires a trade-off between safety and efficacy: newer vaccines, based on sub-unit proteins and peptides, display improved safety profiles yet suffer from low efficacy. Adjuvants can be employed to improve their potency, but currently only a limited number of adjuvant systems are licensed for clinical use. Of the new adjuvants being investigated, particulate systems offer several advantages, including passive targeting to the antigen-presenting cells of the immune system, protection against antigen degradation, and the capacity for sustained antigen release. A range of particulate vaccine delivery systems has been outlined in recent patents, including polymer-based microspheres (generally focused on synthetic polymers, in particular the polyesters) and surfactant-based vesicles. Within these formulations, several patented systems exploit cationic lipids which, despite their limitations in gene therapy, clearly offer strong potential as adjuvants. This review discusses the current range of particulate-system technologies being investigated as potential adjuvants, with regard to both their respective advantages and the hurdles which must be overcome for such systems to become successful pharmaceutical products.

Relevance:

100.00%

Publisher:

Abstract:

Previous research into formulaic language has focussed on specialised groups of people (e.g. L1 acquisition by infants and adult L2 acquisition), with ordinary adult native speakers of English receiving less attention. Additionally, whilst some features of formulaic language have been used as evidence of authorship (e.g. the Unabomber’s use of you can’t eat your cake and have it too), there has been no systematic investigation into this as a potential marker of authorship. This thesis reports the first full-scale study into the use of formulaic sequences by individual authors. The theory of formulaic language hypothesises that formulaic sequences contained in the mental lexicon are shaped by experience combined with what each individual has found to be communicatively effective. Each author’s repertoire of formulaic sequences should therefore differ. To test this assertion, three automated approaches to the identification of formulaic sequences are tested on a specially constructed corpus containing 100 short narratives. The first approach explores a limited subset of formulaic sequences using recurrence across a series of texts as the criterion for identification. The second approach focuses on a word which frequently occurs as part of formulaic sequences and also investigates alternative non-formulaic realisations of the same semantic content. Finally, a reference list approach is used. Whilst claiming authority for any reference list can be difficult, the proposed method utilises internet examples derived from lists prepared by others, a procedure which, it is argued, is akin to asking large groups of judges to reach consensus about what is formulaic. The empirical evidence supports the notion that formulaic sequences have potential as a marker of authorship since in some cases a Questioned Document was correctly attributed. Although this marker of authorship is not universally applicable, it does promise to become a viable new tool in the forensic linguist’s tool-kit.
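The first, recurrence-based approach can be illustrated with a minimal sketch: collect word n-grams and keep those that recur across a minimum number of texts. The function name, the naive whitespace tokenisation, and the thresholds are illustrative assumptions, not the thesis's actual implementation.

```python
from collections import defaultdict

def recurrent_ngrams(texts, n=3, min_texts=3):
    """Return word n-grams that occur in at least `min_texts` texts.

    Recurrence across a series of texts is the identification
    criterion; tokenisation and thresholds are illustrative only."""
    seen_in = defaultdict(set)  # n-gram -> indices of texts containing it
    for i, text in enumerate(texts):
        words = text.lower().split()
        for j in range(len(words) - n + 1):
            seen_in[tuple(words[j:j + n])].add(i)
    return {ng for ng, idxs in seen_in.items() if len(idxs) >= min_texts}

texts = [
    "at the end of the day it was fine",
    "at the end of the day we went home",
    "they left at the end of the day",
]
print(recurrent_ngrams(texts))  # includes ('at', 'the', 'end'), ('of', 'the', 'day')
```

A real study would of course need proper tokenisation and a corpus far larger than three toy sentences, but the core criterion — recurrence across texts — reduces to this set-membership count.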

Relevance:

100.00%

Publisher:

Abstract:

Developers of interactive software are confronted by an increasing variety of software tools to help engineer the interactive aspects of software applications. Typically resorting to ad hoc means of tool selection, developers are often dissatisfied with their chosen tool because it lacks required functionality or does not fit seamlessly within the context in which it is to be used. This paper describes a system for evaluating the suitability of user interface development tools for use in software development organisations and projects, such that the selected tool appears ‘invisible’ within its anticipated context of use. The paper also outlines and presents the results of an informal empirical study and a series of observational case studies of the system.

Relevance:

100.00%

Publisher:

Abstract:

Liquid-level sensing technologies have gained considerable prominence because such measurements are essential to industrial applications such as fuel storage, flood warning and the biochemical industry. Traditional liquid level sensors are based on electromechanical techniques; however, they suffer from intrinsic safety concerns in explosive environments. In recent years, given that optical fiber sensors have many well-established advantages such as high accuracy, cost-effectiveness, compact size, and ease of multiplexing, several optical fiber liquid level sensors have been investigated which are based on different operating principles, such as side-polishing the cladding and a portion of the core, using a spiral side-emitting optical fiber, or using silica fiber gratings. The present work proposes a novel and highly sensitive liquid level sensor making use of polymer optical fiber Bragg gratings (POFBGs). The key elements of the system are a set of POFBGs embedded in silicone rubber diaphragms. This is a new development building on the idea of determining liquid level by measuring the pressure at the bottom of a liquid container; however, it has a number of critical advantages. The system features several FBG-based pressure sensors as described above placed at different depths. Any sensor above the surface of the liquid will read the same ambient pressure, while sensors below the surface will read pressures that increase linearly with depth. The position of the liquid surface can therefore be approximately identified as lying between the first sensor to read an above-ambient pressure and the next higher sensor. This level of precision would not in general be sufficient for most liquid level monitoring applications; however, a much more precise determination of liquid level can be made by linear regression on the pressure readings from the sub-surface sensors. There are numerous advantages to this multi-sensor approach. First, linear regression over multiple sensors is inherently more accurate than estimating depth from a single pressure reading. Second, common-mode temperature-induced wavelength shifts in the individual sensors are automatically compensated. Third, temperature-induced changes in the sensor pressure sensitivity are also compensated. Fourth, the approach provides the possibility to detect and compensate for malfunctioning sensors. Finally, the system is immune to changes in the density of the monitored fluid and even to changes in the effective force of gravity, as might be obtained in an aerospace application. The performance of an individual sensor was characterized and displays a sensitivity of 54 pm/cm, more than twice that of a sensor head configuration based on a silica FBG published in the literature, a consequence of the much lower elastic modulus of POF. Furthermore, the temperature/humidity behavior and measurement resolution were also studied in detail. The proposed configuration also displays a highly linear response, high resolution and good repeatability. The results suggest the new configuration can be a useful tool in many different applications, such as aircraft fuel monitoring, and biochemical and environmental sensing, where accuracy and stability are fundamental. © (2015) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
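The regression step described above can be sketched as follows. Sub-surface sensors see pressure rising linearly as sensor height decreases, so fitting pressure against height and solving for the height at which the fit returns ambient pressure gives the surface position. The function name, unit choices, and toy data are illustrative assumptions; the wavelength-to-pressure conversion of the actual POFBG interrogation is omitted.

```python
import numpy as np

def liquid_level(heights, pressures, ambient=0.0, eps=1e-6):
    """Estimate the liquid surface height by linear regression
    over the sub-surface pressure sensors.

    heights:   sensor heights above the container bottom
    pressures: gauge pressure read at each sensor (same order)
    Sensors reading ~ambient are above the surface and are excluded;
    below the surface p(h) is linear, so solving the fitted line
    p(h*) = ambient yields the surface height h*."""
    heights = np.asarray(heights, dtype=float)
    pressures = np.asarray(pressures, dtype=float)
    wet = pressures > ambient + eps            # keep sub-surface sensors only
    slope, intercept = np.polyfit(heights[wet], pressures[wet], 1)
    return (ambient - intercept) / slope       # height where fit meets ambient

# Illustrative data: unit density*gravity, surface at height 3.4
heights = [0, 1, 2, 3, 4, 5]
pressures = [3.4, 2.4, 1.4, 0.4, 0.0, 0.0]
print(liquid_level(heights, pressures))  # ≈ 3.4
```

Because the fit uses every sub-surface sensor, a single noisy or malfunctioning reading pulls the estimate far less than it would in a single-sensor scheme, which is the first advantage listed above.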

Relevance:

100.00%

Publisher:

Abstract:

Magnetoencephalographic (MEG) signals, like electroencephalographic (EEG) measures, are the direct extracranial manifestations of neuronal activation. The two techniques can detect time-varying changes in electromagnetic activity with a sub-millisecond time resolution. Extracranial electromagnetic measures are the cornerstone of the non-invasive diagnostic armamentarium in patients with epilepsy. Their extremely high temporal resolution – comparable to intracranial recordings – is the basis for a precise definition of the onset and propagation of ictal and interictal abnormalities. Given the cost of the infrastructure and equipment, MEG has yet to develop into a routinely applicable diagnostic tool in clinical settings. However, in recent years, an increasing number of patients with epilepsy have been investigated – usually in the context of presurgical evaluation of refractory epilepsies – and initial encouraging results have been reported. We will briefly review the principles and the technology behind MEG and its contribution to the diagnostic work-up of patients with epilepsy.

Relevance:

100.00%

Publisher:

Abstract:

Floods are one of the most dangerous and common disasters worldwide, and these disasters are closely linked to the geography of the affected area. As a result, several papers in the academic field of humanitarian logistics have incorporated the use of Geographical Information Systems (GIS) for disaster management. However, most contributions in the literature use these systems for network analysis and display, with just a few papers exploiting the capabilities of GIS to improve planning and preparedness. To show the capabilities of GIS for disaster management, this paper uses raster GIS to analyse potential flooding scenarios and provide input to an optimisation model. The combination is applied to two real-world floods in Mexico to evaluate the value of incorporating GIS for disaster planning. The results provide evidence that including GIS analysis in a decision-making tool for disaster management can improve the outcome of disaster operations by reducing the number of facilities used that are at risk of flooding. The empirical results underline the importance of integrating advanced remote sensing images and GIS in future systems for humanitarian logistics.
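The coupling described above — a raster flood scenario feeding an optimisation model — can be sketched minimally: screen each candidate facility cell against the modelled flood-depth raster and pass only safe candidates to the downstream model. The function name, the toy raster, and the depth threshold are illustrative assumptions, not the paper's actual data or model.

```python
import numpy as np

def facilities_at_risk(flood_depth, facilities, threshold=0.0):
    """Return candidate facilities whose raster cell floods.

    flood_depth: 2-D array of modelled flood depths (m) per raster cell
    facilities:  list of (row, col) cells of candidate facilities
    Facilities whose cell depth exceeds the threshold would be
    excluded from the optimisation model's candidate set."""
    return [(r, c) for r, c in facilities if flood_depth[r, c] > threshold]

# Illustrative 4x4 flood-depth raster (m) and three candidate cells
depth = np.array([
    [0.0, 0.0, 0.5, 1.2],
    [0.0, 0.1, 0.8, 1.5],
    [0.0, 0.0, 0.0, 0.3],
    [0.0, 0.0, 0.0, 0.0],
])
candidates = [(0, 0), (1, 2), (3, 3)]
print(facilities_at_risk(depth, candidates))  # -> [(1, 2)]
```

In practice the raster would come from a hydrological model or remote-sensing imagery, but the planning benefit reported above reduces to exactly this kind of pre-filter: facilities in flood-prone cells never enter the facility-location optimisation.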