27 results for Field assisted sintering technique

in Aston University Research Archive


Relevance: 100.00%

Abstract:

Epilepsy is one of the most common neurological disorders, a large fraction of which is resistant to pharmacotherapy. In this light, understanding the mechanisms of epilepsy, and of its intractable forms in particular, could create new targets for pharmacotherapeutic intervention. The current project explores the dynamic changes in neuronal network function in chronic temporal lobe epilepsy (TLE) in rat and human brain in vitro. I focused on the process of establishment of epilepsy (epileptogenesis) in the temporal lobe. Rhythmic behaviour of the hippocampal neuronal networks in healthy animals was explored using spontaneous oscillations in the gamma frequency band (SγO). The use of an improved brain slice preparation technique resulted in the natural occurrence (in the absence of pharmacological stimulation) of rhythmic activity, which was then pharmacologically characterised and compared to other models of gamma oscillations (KA- and CCh-induced oscillations) using the local field potential recording technique. The results showed that SγO differed from pharmacologically driven models, suggesting higher physiological relevance of SγO. Network activity was also explored in the medial entorhinal cortex (mEC), where spontaneous slow wave oscillations (SWO) were detected. To investigate the course of chronic TLE establishment, a refined Li-pilocarpine-based model of epilepsy (RISE) was developed. The model significantly reduced animal mortality and demonstrated reduced intensity, yet high morbidity, with an almost 70% mean success rate of developing spontaneous recurrent seizures. We used SγO to characterise changes in the hippocampal neuronal networks throughout epileptogenesis. The results showed that the network remained largely intact, demonstrating the subtle nature of the RISE model.
Despite this, a reduction in network activity was detected during the so-called latent (no seizure) period, which was hypothesised to occur due to network fragmentation and an abnormal function of kainate receptors (KAr). We therefore explored the function of KAr by challenging SγO with kainic acid (KA). The results demonstrated a remarkable decrease in the KAr response during the latent period, suggesting KAr dysfunction or altered expression, which will be further investigated using a variety of electrophysiological and immunocytochemical methods. The entorhinal cortex, together with the hippocampus, is known to play an important role in TLE. Considering this, we investigated neuronal network function of the mEC during epileptogenesis using SWO. The results demonstrated a striking difference in AMPAr function, with possible receptor upregulation or abnormal composition in the early development of epilepsy. Alterations in receptor function inevitably lead to changes in network function, which may play an important role in the development of epilepsy. Preliminary investigations were made using slices of human brain tissue taken following surgery for intractable epilepsy. Initial results showed that oscillogenesis could be induced in human brain slices and that such network activity was pharmacologically similar to that observed in rodent brain. Overall, our findings suggest that excitatory glutamatergic transmission is heavily involved in the process of epileptogenesis. Together with other types of receptors, KAr and AMPAr contribute to epilepsy establishment and may be the key to uncovering its mechanism.

Relevance: 100.00%

Abstract:

To extend our understanding of the early visual hierarchy, we investigated the long-range integration of first- and second-order signals in spatial vision. In our first experiment we performed a conventional area summation experiment where we varied the diameter of (a) luminance-modulated (LM) noise and (b) contrast-modulated (CM) noise. Results from the LM condition replicated previous findings with sine-wave gratings in the absence of noise, consistent with long-range integration of signal contrast over space. For CM, the summation function was much shallower than for LM, suggesting, at first glance, that the signal integration process was spatially less extensive than for LM. However, an alternative possibility was that the high spatial frequency noise carrier for the CM signal was attenuated by peripheral retina (or cortex), thereby impeding our ability to observe area summation of CM in the conventional way. To test this, we developed the 'Swiss cheese' stimulus of Meese and Summers (2007), in which signal area can be varied without changing the stimulus diameter, providing some protection against inhomogeneity of the retinal field. Using this technique and a two-component subthreshold summation paradigm we found that (a) CM is spatially integrated over at least five stimulus cycles (possibly more), (b) spatial integration follows square-law signal transduction for both LM and CM and (c) the summing device integrates over spatially interdigitated LM and CM signals when they are co-oriented, but not when cross-oriented. The spatial pooling mechanism that we have identified would be a good candidate component for a module involved in representing visual textures, including their spatial extent.
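The square-law transduction finding implies a simple quantitative prediction: if squared (transduced) signals are pooled linearly over area, detection thresholds fall as the inverse square root of signal area. A minimal sketch of that prediction (illustrative only, not the authors' model):

```python
# Illustrative sketch (not the authors' model): square-law transduction
# followed by linear pooling predicts detection thresholds that fall as
# 1/sqrt(n) with the number n of stimulus regions carrying signal.

def pooled_response(contrasts):
    """Linear sum of squared (square-law transduced) regional signals."""
    return sum(c ** 2 for c in contrasts)

def threshold(n_regions, criterion=1.0):
    """Per-region contrast at which the pooled response reaches the
    detection criterion: n * c**2 = criterion  =>  c = sqrt(criterion / n)."""
    return (criterion / n_regions) ** 0.5

# Doubling signal area lowers the threshold by a factor of sqrt(2):
ratio = threshold(1) / threshold(2)
```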

Relevance: 40.00%

Abstract:

We report the fabrication of a refractive index (RI) sensor based on a liquid-core fibre Bragg grating (FBG). A micro-slot FBG was created in standard telecom optical fibre by tightly focused femtosecond laser inscription followed by chemical etching. A micro-slot with dimensions of 5.74 (h) × 125 (w) × 1388.72 (l) μm was engraved across the whole fibre and along a 1 mm-long FBG, which gives the advantage of a relatively robust liquid-core waveguide. The device exhibited a refractive index sensitivity of up to about 742.72 nm/RIU. © 2011 Copyright Society of Photo-Optical Instrumentation Engineers (SPIE).
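As a back-of-envelope illustration of the reported sensitivity (assuming a locally linear wavelength response, which the abstract does not state):

```python
# Value from the abstract; linearity of the response is an assumption
# made here for illustration only.
SENSITIVITY_NM_PER_RIU = 742.72  # reported peak sensitivity

def bragg_shift_nm(delta_n_riu):
    """Approximate Bragg wavelength shift for an index change delta_n_riu."""
    return SENSITIVITY_NM_PER_RIU * delta_n_riu

shift = bragg_shift_nm(0.01)  # an index step of 0.01 RIU -> roughly 7.4 nm
```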

Relevance: 30.00%

Abstract:

Recently, we introduced a new 'GLM-beamformer' technique for MEG analysis that enables accurate localisation of both phase-locked and non-phase-locked neuromagnetic effects, and their representation as statistical parametric maps (SPMs). This provides a useful framework for comparison of the full range of MEG responses with fMRI BOLD results. This paper reports a 'proof of principle' study using a simple visual paradigm (static checkerboard). The five subjects each underwent both MEG and fMRI paradigms. We demonstrate, for the first time, the presence of a sustained (DC) field in the visual cortex, and its co-localisation with the visual BOLD response. The GLM-beamformer analysis method is also used to investigate the main non-phase-locked oscillatory effects: an event-related desynchronisation (ERD) in the alpha band (8-13 Hz) and an event-related synchronisation (ERS) in the gamma band (55-70 Hz). We show, using SPMs and virtual electrode traces, the spatio-temporal covariance of these effects with the visual BOLD response. Comparisons between MEG and fMRI data sets generally focus on the relationship between the BOLD response and the transient evoked response. Here, we show that the stationary field and changes in oscillatory power are also important contributors to the BOLD response, and should be included in future studies on the relationship between neuronal activation and the haemodynamic response. © 2005 Elsevier Inc. All rights reserved.
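The ERD/ERS effects reported here are conventionally expressed as a percentage band-power change relative to a pre-stimulus baseline (a generic definition, not the GLM-beamformer pipeline itself; the numbers below are invented):

```python
# Generic definition of event-related (de)synchronisation as a
# percentage band-power change relative to a baseline window (not the
# paper's GLM-beamformer pipeline; power values are invented).

def erd_percent(power_active, power_baseline):
    """Negative values indicate ERD (power drop); positive values ERS."""
    return 100.0 * (power_active - power_baseline) / power_baseline

alpha_change = erd_percent(0.6, 1.0)   # power drop: alpha-band ERD
gamma_change = erd_percent(1.5, 1.0)   # power rise: gamma-band ERS
```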

Relevance: 30.00%

Abstract:

We have investigated the effect of ageing on the visual system using the relatively new technique of magnetoencephalography (MEG). This technique measures the magnetic signals produced by the visual system using a SQUID magnetometer. The magnetic visual evoked field (VEF) was measured over the occipital cortex in response to pattern and flash stimuli in 86 normal subjects aged 15-86 years. Factors that influenced subjects defocusing or defixating the stimulus, or their selective attention, were controlled as far as possible. The latency of the major positive component to the pattern reversal stimulus (P100M) increased with age, particularly after the age of 55 years, while the amplitude of the P100M decreased over the life span. The latency of the major flash component (P2M) increased much more slowly with age, while its amplitude decreased in only a proportion of elderly subjects. Changes in the P100M with age may reflect senile changes in the eye and optic nerve, e.g. senile miosis or degenerative changes in the retina. The P2M may be more susceptible to senile changes in the retina. The data suggest that the spatial frequency channels deteriorate more rapidly with age than the luminance channels, and that MEG may be an effective method of studying ageing in the visual system.

Relevance: 30.00%

Abstract:

Grafting of antioxidants and other modifiers onto polymers by reactive extrusion has been performed successfully by the Polymer Processing and Performance Group at Aston University. Traditionally, the optimum conditions for the grafting process have been established within a Brabender internal mixer. Transfer of this batch process to a continuous processor, such as an extruder, has typically been empirical. To have more confidence in the success of direct transfer of the process requires knowledge of, and comparison between, residence times, mixing intensities, shear rates and flow regimes in the internal mixer and in the continuous processor. The continuous processor chosen for the current work is the closely intermeshing, co-rotating twin-screw extruder (CICo-TSE). CICo-TSEs contain screw elements that convey material with a self-wiping action and are widely used for polymer compounding and blending. Of the different mixing modules contained within the CICo-TSE, the trilobal elements, which impose intensive mixing, and the mixing discs, which impose extensive mixing, are of importance when establishing the intensity of mixing. In this thesis, the flow patterns within the various regions of the single-flighted conveying screw elements, and within both the trilobal element and mixing disc zones, of a Betol BTS40 CICo-TSE have been modelled using the computational fluid dynamics package Polyflow. A major obstacle encountered when solving the flow problem within all of these sets of elements arises from both the complex geometry and the time-dependent flow boundaries as the elements rotate about their fixed axes. Simulation of the time-dependent boundaries was overcome by selecting a number of sequential 2D and 3D geometries, used to represent partial mixing cycles.
The flow fields were simulated using the ideal rheological properties of polypropylene and characterised in terms of velocity vectors, shear stresses generated and a parameter known as the mixing efficiency. The majority of the large 3D simulations were performed on the Cray J90 supercomputer situated at the Rutherford Appleton Laboratory, with pre- and post-processing operations achieved via a Silicon Graphics Indy workstation. A mechanical model was constructed, consisting of various CICo-TSE elements rotating within a transparent outer barrel. A technique has been developed using coloured viscous clays whereby the flow patterns and mixing characteristics within the CICo-TSE may be visualised. In order to test and verify the simulated predictions, the patterns observed within the mechanical model were compared with the flow patterns predicted by the computational model. The flow patterns within the single-flighted conveying screw elements in particular showed good agreement between the experimental and simulated results.

Relevance: 30.00%

Abstract:

Some critical aspects of a new kind of on-line measurement technique for micro- and nanoscale surface measurement are described. The technique attempts to use spatial light-wave scanning to replace mechanical stylus scanning, and an optical fibre interferometer to replace optically bulky interferometers for measuring surfaces. The basic principle is to measure the phase shift of a reflected optical signal. Wavelength-division multiplexing and fibre Bragg grating techniques are used to carry out wavelength-to-field transformation and phase-to-depth detection, allowing a large dynamic measurement ratio (range/resolution) and a high signal-to-noise ratio with remote access. In effect, the paper consists of two parts: multiplexed fibre interferometry and a remote on-machine surface detection sensor (an optical dispersive probe). The paper aims to investigate the metrology properties of a multiplexed fibre interferometer and to verify its feasibility by both theoretical and experimental studies. Two types of optical probe, using a dispersive prism and a blazed grating respectively, are introduced to realize wavelength-to-spatial scanning.
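The phase-to-depth principle can be illustrated with the generic reflection-interferometry relation h = φλ/(4π) (an illustration of the stated principle, not the paper's exact signal model):

```python
import math

# Generic reflection-interferometry relation (illustrating the stated
# principle, not the paper's exact signal model): a height step h changes
# the round-trip optical path by 2h, so the phase shift is
# phi = 4*pi*h/lambda, i.e. h = phi * lambda / (4 * pi).

def height_from_phase(phase_rad, wavelength_nm):
    """Surface height (nm) corresponding to a measured phase shift (rad)."""
    return phase_rad * wavelength_nm / (4 * math.pi)

# One full 2*pi phase cycle at 1550 nm corresponds to half a wavelength:
step = height_from_phase(2 * math.pi, 1550.0)  # 775.0 nm
```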

Relevance: 30.00%

Abstract:

The ERS-1 Satellite was launched in July 1991 by the European Space Agency into a polar orbit at about 800 km, carrying a C-band scatterometer. A scatterometer measures the amount of backscatter microwave radiation reflected by small ripples on the ocean surface induced by sea-surface winds, and so provides instantaneous snap-shots of wind flow over large areas of the ocean surface, known as wind fields. Inherent in the physics of the observation process is an ambiguity in wind direction; the scatterometer cannot distinguish if the wind is blowing toward or away from the sensor device. This ambiguity implies that there is a one-to-many mapping between scatterometer data and wind direction. Current operational methods for wind field retrieval are based on the retrieval of wind vectors from satellite scatterometer data, followed by a disambiguation and filtering process that is reliant on numerical weather prediction models. The wind vectors are retrieved by the local inversion of a forward model, mapping scatterometer observations to wind vectors, and minimising a cost function in scatterometer measurement space. This thesis applies a pragmatic Bayesian solution to the problem. The likelihood is a combination of conditional probability distributions for the local wind vectors given the scatterometer data. The prior distribution is a vector Gaussian process that provides the geophysical consistency for the wind field. The wind vectors are retrieved directly from the scatterometer data by using mixture density networks, a principled method to model multi-modal conditional probability density functions. The complexity of the mapping and the structure of the conditional probability density function are investigated. A hybrid mixture density network, that incorporates the knowledge that the conditional probability distribution of the observation process is predominantly bi-modal, is developed. 
The optimal model, which generalises across a swathe of scatterometer readings, is better on key performance measures than the current operational model. Wind field retrieval is approached from three perspectives. The first is a non-autonomous method that confirms the validity of the model by retrieving the correct wind field 99% of the time from a test set of 575 wind fields. The second technique takes the maximum a posteriori probability wind field retrieved from the posterior distribution as the prediction. For the third technique, Markov Chain Monte Carlo (MCMC) techniques were employed to estimate the mass associated with significant modes of the posterior distribution, and make predictions based on the mode with the greatest mass associated with it. General methods for sampling from multi-modal distributions were benchmarked against a specific MCMC transition kernel designed for this problem. It was shown that the general methods were unsuitable for this application due to computational expense. On a test set of 100 wind fields the MAP estimate correctly retrieved 72 wind fields, whilst the sampling method correctly retrieved 73 wind fields.
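The bi-modal conditional density that the hybrid mixture density network captures can be illustrated with a plain Gaussian mixture (an illustrative sketch, not the thesis code; mode positions and widths below are invented):

```python
import math

# Illustrative sketch, not the thesis code: the directional ambiguity
# makes p(wind direction | scatterometer) predominantly bi-modal, which
# a mixture density network represents as a Gaussian mixture. The mode
# positions and widths below are invented.

def gauss(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def mixture_pdf(x, weights, mus, sigmas):
    """One-dimensional Gaussian mixture density at x."""
    return sum(w * gauss(x, m, s) for w, m, s in zip(weights, mus, sigmas))

# Two equally weighted modes 180 degrees apart (toward/away ambiguity):
p_at_mode = mixture_pdf(45.0, [0.5, 0.5], [45.0, 225.0], [10.0, 10.0])
p_between = mixture_pdf(135.0, [0.5, 0.5], [45.0, 225.0], [10.0, 10.0])
```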

Relevance: 30.00%

Abstract:

During the last decade, the use of randomised gene libraries has had an enormous impact in the field of protein engineering. Such libraries comprise many variations of a single gene, in which codon replacements are used to substitute key residues of the encoded protein. The expression of such libraries generates a library of randomised proteins, which can subsequently be screened for desired or novel activities. Randomisation in this fashion has predominantly been achieved by the inclusion of the codons NNN or NNG/C or T, in which N represents any of the four bases A, C, G or T. The use of these codons, however, necessitates the cloning of redundant codons at each position of randomisation, in addition to those required to encode the twenty possible amino acid substitutions. As degenerate codons must be included at each position of randomisation, this results in a progressive loss of randomisation efficiency as the number of randomised positions is increased. The ratio of genes to proteins in these libraries rises exponentially with each position of randomisation, creating large gene libraries which generate protein libraries of limited diversity upon expression. In addition to these problems of library size, the cloning of redundant codons also results in the generation of protein libraries in which substituted amino acids are unevenly represented. As several of the randomised codons may encode the same amino acid (for example serine, which is encoded six times using the codon NNN), an inherent bias may be introduced into the resulting protein library during the randomisation procedure. The work outlined here describes the development of a novel randomisation technique aimed at eliminating codon redundancy from randomised gene libraries, thus addressing the problems of library size and bias associated with the cloning of redundant codons.
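The codon redundancy described here is easy to quantify from the standard genetic code (an illustrative calculation, not the thesis method):

```python
# Quantifying the redundancy described above from the standard genetic
# code (an illustrative calculation, not the thesis method). Bases are
# ordered T, C, A, G; '*' marks stop codons.
BASES = "TCAG"
AA = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODON_TABLE = {b1 + b2 + b3: AA[16 * i + 4 * j + k]
               for i, b1 in enumerate(BASES)
               for j, b2 in enumerate(BASES)
               for k, b3 in enumerate(BASES)}

def codons_per_amino_acid(third_bases="TCAG"):
    """Codon count per amino acid when the third base is restricted to
    `third_bases` ('TCAG' gives NNN; 'GT' gives the common NNK scheme)."""
    counts = {}
    for codon, aa in CODON_TABLE.items():
        if codon[2] in third_bases and aa != "*":
            counts[aa] = counts.get(aa, 0) + 1
    return counts

nnn = codons_per_amino_acid()      # serine: 6 codons; Met and Trp: 1 each
nnk = codons_per_amino_acid("GT")  # 32 codons still cover all 20 amino acids
```

The six-fold over-representation of serine under NNN, versus single codons for methionine and tryptophan, is exactly the bias the abstract refers to.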

Relevance: 30.00%

Abstract:

This thesis documents the design, manufacture and testing of a passive and non-invasive micro-scale planar particle-from-fluid filter for segregating cell types from a homogeneous suspension. The microfluidics system can be used to separate spermatogenic cells from testis biopsy samples, providing a mechanism for filtrate retrieval for assisted reproduction therapy. The system can also be used for point-of-service diagnostics applications in hospitals, lab-on-a-chip pre-processing, and field applications such as clinical testing in the third world. Various design concepts are developed and manufactured, and are assessed based on etched structure morphology, robustness to variations in the manufacturing process, and design impacts on fluid flow and particle separation characteristics. Segregation, measured using image processing algorithms, demonstrates an efficiency of more than 55% for 1 µl volumes at populations exceeding 1 × 10^7. The technique supports a significant reduction in time over conventional processing in the separation and identification of particle groups, offering a potential reduction in the associated cost of the targeted procedure. The thesis develops a model of quasi-steady wetting flow within the micro channel and identifies the forces across the system during post-wetting equalisation. The model and its underlying assumptions are validated empirically in microfabricated test structures through a novel Micro-Particle Image Velocimetry technique. The prototype devices do not require ancillary equipment or additional filtration media, and therefore offer fewer opportunities for sample contamination than conventional processing methods. The devices are disposable, with minimal reagent volumes and process waste. Optimal processing parameters and production methods are identified, along with improvements that could enhance performance in a number of identified potential applications.

Relevance: 30.00%

Abstract:

The study developed statistical techniques to evaluate visual field progression for use with the Humphrey Field Analyzer (HFA). The long-term fluctuation (LF) was evaluated in stable glaucoma. The magnitude of both LF components showed little relationship with MD, CPSD and SF. An algorithm was proposed for determining the clinical necessity for a confirmatory follow-up examination. The between-examination variability was determined for the HFA Standard and FASTPAC algorithms in glaucoma. FASTPAC exhibited greater between-examination variability than the Standard algorithm across the range of sensitivities and with increasing eccentricity. The difference in variability between the algorithms had minimal clinical significance. The effect of repositioning the baseline in the Glaucoma Change Probability Analysis (GCPA) was evaluated. The global baseline of the GCPA limited the detection of progressive change at a single stimulus location. A new technique, pointwise univariate linear regression (ULR) of absolute sensitivity, and of pattern deviation, against time to follow-up, was developed. In each case, pointwise ULR was more sensitive to localised progressive changes in sensitivity than ULR of MD alone. Small changes in sensitivity were more readily determined by pointwise ULR than by the GCPA. A comparison between the outcomes of pointwise ULR for all fields and for the last six fields manifested linear and curvilinear declines in absolute sensitivity and pattern deviation. A method for delineating progressive loss in glaucoma, based upon the error in the forecasted sensitivity of a multivariate model, was developed. Multivariate forecasting exhibited little agreement with GCPA in glaucoma, but showed promise for monitoring visual field progression in OHT patients. The recovery of sensitivity in optic neuritis over time was modelled with a cumulative Gaussian function. The rate and level of recovery was greater in the peripheral than the central field.
Probability models to forecast the field of recovery were proposed.
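The pointwise ULR idea can be sketched as an ordinary least-squares slope fitted independently at each stimulus location (a minimal illustration with invented sensitivity values, not the study's implementation):

```python
# Minimal sketch of pointwise univariate linear regression (not the
# study's implementation): fit sensitivity against follow-up time
# independently at each stimulus location; negative slopes flag
# progressive loss. Sensitivity values below are invented.

def fit_slope(times, values):
    """Ordinary least-squares slope of values regressed on times."""
    n = len(times)
    mt = sum(times) / n
    mv = sum(values) / n
    num = sum((t - mt) * (v - mv) for t, v in zip(times, values))
    den = sum((t - mt) ** 2 for t in times)
    return num / den

times = [0.0, 0.5, 1.0, 1.5, 2.0]             # follow-up, years
stable = [30.1, 29.9, 30.0, 30.2, 29.8]       # dB, fluctuating only
progressing = [30.0, 29.0, 28.1, 27.0, 26.0]  # dB, steady decline

slopes = [fit_slope(times, s) for s in (stable, progressing)]
# The second location declines at about -2 dB/year; the first is flat.
```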

Relevance: 30.00%

Abstract:

We experimentally demonstrate the use of full-field electronic dispersion compensation (EDC) to achieve a bit error rate of 5 × 10^-5 at 22.3 dB optical signal-to-noise ratio for a single-channel 10 Gbit/s on-off keyed signal after transmission over 496 km of field-installed single-mode fibre with an amplifier spacing of 124 km. This performance is achieved by designing the EDC so as to avoid electronic amplification of the noise content of the signal during full-field reconstruction. We also investigate the tolerance of the system to key signal processing parameters, and numerically demonstrate that single-channel 2160 km single-mode fibre transmission without in-line optical dispersion compensation can be achieved using this technique with 80 km amplifier spacing and optimised system parameters.
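For context, a bit error rate can be related to a Gaussian Q-factor using the textbook on-off-keying relation BER = 0.5 · erfc(Q/√2) (a generic sanity check, not part of the paper):

```python
import math

# Generic on-off-keying sanity check (textbook relation, not from the
# paper): BER = 0.5 * erfc(Q / sqrt(2)), inverted here by bisection.

def ber_from_q(q):
    return 0.5 * math.erfc(q / math.sqrt(2))

def q_from_ber(target_ber, lo=0.0, hi=20.0):
    """Bisection on the monotonically decreasing ber_from_q."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if ber_from_q(mid) > target_ber:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

q = q_from_ber(5e-5)  # roughly 3.9 for the reported bit error rate
```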

Relevance: 30.00%

Abstract:

We numerically investigate the combination of full-field detection and a feed-forward equalizer (FFE) for adaptive chromatic dispersion compensation up to 2160 km in a 10 Gbit/s on-off keyed optical transmission system. Compared with earlier reports, the technique incorporates several important implementation modules, including an algorithm for adaptive equalization of the gain imbalance between the two receiver chains, compensation of the phase misalignment of the asymmetric Mach-Zehnder interferometer, and a simplified implementation of the field calculation. We also show that, in addition to enabling fast adaptation and simplifying the field calculation, full-field FFE exhibits enhanced tolerance to sampling phase misalignment and to a reduced sampling rate when compared to a full-field implementation using a dispersive transmission line.
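A feed-forward equalizer of the kind named here is, generically, a tapped delay line adapted by least-mean-squares; the sketch below (invented two-tap channel and parameters, unrelated to the paper's full-field receiver) shows the adaptation loop:

```python
import random

# Generic least-mean-squares (LMS) feed-forward equalizer sketch with an
# invented two-tap channel; not the paper's full-field receiver.

def lms_ffe(samples, desired, n_taps=5, mu=0.01):
    """Tapped delay line whose taps adapt from the error vs known symbols."""
    taps = [0.0] * n_taps
    out = []
    for i in range(len(desired)):
        window = [samples[i - k] if i - k >= 0 else 0.0 for k in range(n_taps)]
        y = sum(t * x for t, x in zip(taps, window))
        err = desired[i] - y
        taps = [t + mu * err * x for t, x in zip(taps, window)]
        out.append(y)
    return taps, out

random.seed(0)
syms = [random.choice([-1.0, 1.0]) for _ in range(2000)]
# Channel with simple inter-symbol interference: rx[i] = s[i] + 0.5*s[i-1]
rx = [s + 0.5 * p for s, p in zip(syms, [0.0] + syms[:-1])]

taps, out = lms_ffe(rx, syms)
late_mse = sum((d - y) ** 2 for d, y in zip(syms[-200:], out[-200:])) / 200
```

After training, the residual error on the last symbols is small, showing the taps have converged toward the channel inverse.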

Relevance: 30.00%

Abstract:

Electrical and thermal transport properties of carbon nanotube bulk material compacted by spark plasma sintering have been investigated. The electrical conductivity of the as-prepared sample shows a lnT dependence from 4 to 50 K, after which the conductivity begins to increase approximately linearly with temperature. A magnetic field applied perpendicularly to the sample increases the electrical conductivity in the range of 0-8 T at all testing temperatures, indicating that the sample exhibits two-dimensional weak localization at lower temperatures (<50 K), while behaving like a semimetal at higher temperatures (>50 K). This material acts like a uniform compact consisting of randomly distributed two-dimensional graphene layers. For the same material, the thermal conductivity is found to decrease almost linearly with decreasing temperature, similar to that of a single multi-walled carbon nanotube. Magnetic fields applied perpendicularly to the sample cause the thermal conductivity to decrease significantly, but the influence of the magnetic fields becomes weak as temperature increases.
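The quoted lnT dependence can be checked by regressing conductivity against ln T; the sketch below uses synthetic numbers, not the measured data:

```python
import math

# Checking the quoted ln(T) dependence with synthetic numbers (not the
# measured data): a conductivity of the form sigma = a + b*ln(T) is a
# straight line when regressed against ln(T).

def fit_line(xs, ys):
    """Least-squares slope and intercept."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

temps_k = [4.0, 8.0, 16.0, 32.0, 50.0]
sigma = [10.0 + 2.0 * math.log(t) for t in temps_k]  # synthetic: a=10, b=2

b, a = fit_line([math.log(t) for t in temps_k], sigma)
# Recovering b ~ 2 and a ~ 10 is the linear-in-ln(T) signature probed here.
```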

Relevance: 30.00%

Abstract:

Diffusion-ordered spectroscopy (DOSY) is a powerful technique for mixture analysis, but in its basic form it cannot separate the component spectra of species with very similar diffusion coefficients. It has recently been demonstrated that the component spectra of a mixture of isomers with nearly identical diffusion coefficients (the three dihydroxybenzenes) can be resolved using matrix-assisted DOSY (MAD), in which diffusion is perturbed by the addition of a co-solute such as a surfactant [R. Evans, S. Haiber, M. Nilsson, G. A. Morris, Anal. Chem. 2009, 81, 4548-4550]. However, little is known about the conditions required for such a separation, for example the concentrations and concentration ratios of surfactant and solutes. The aim of this study was to explore the concentration range over which matrix-assisted DOSY using the surfactant SDS can achieve diffusion resolution of a simple model set of isomers, the monomethoxyphenols. The results show that the separation is remarkably robust with respect to both the concentrations and the concentration ratios of surfactant and solutes, supporting the idea that MAD may become a valuable tool for mixture analysis. © 2010 John Wiley & Sons, Ltd.
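The diffusion coefficients that DOSY resolves come from gradient-dependent signal attenuation via the standard Stejskal-Tanner relation (generic NMR background, not code from this paper; pulse timings below are invented):

```python
import math

# Standard Stejskal-Tanner relation (generic NMR background, not code
# from this paper): I/I0 = exp(-D * gamma^2 * g^2 * delta^2 * (DELTA - delta/3)),
# so ln(I/I0) is linear in g^2 and its slope yields D.

GAMMA_1H = 2.675e8  # 1H gyromagnetic ratio, rad s^-1 T^-1

def b_value(g, delta=2e-3, big_delta=0.1):
    """Diffusion weighting for gradient amplitude g (T/m); timings invented."""
    return GAMMA_1H ** 2 * g ** 2 * delta ** 2 * (big_delta - delta / 3)

def attenuation(diff_coeff, g, **kw):
    """Relative signal I/I0 for diffusion coefficient diff_coeff (m^2/s)."""
    return math.exp(-diff_coeff * b_value(g, **kw))

def diffusion_from_two_points(g1, g2, i1, i2, **kw):
    """Recover D from attenuations measured at two gradient amplitudes."""
    return (math.log(i1) - math.log(i2)) / (b_value(g2, **kw) - b_value(g1, **kw))

d_true = 5e-10  # m^2/s, typical for a small molecule in water
i1, i2 = attenuation(d_true, 0.1), attenuation(d_true, 0.3)
d_est = diffusion_from_two_points(0.1, 0.3, i1, i2)
```

In MAD, binding to the surfactant shifts each isomer's effective D, which is what makes the otherwise identical decays separable.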