42 results for Elements, Electrophysiology, Real-Time Acquisition, Real-Time Analysis, High Throughput Data


Relevance:

100.00%

Publisher:

Abstract:

A novel interrogation technique for fully distributed linearly chirped fiber Bragg grating (LCFBG) strain sensors with simultaneous high temporal and spatial resolution, based on optical time-stretch frequency-domain reflectometry (OTS-FDR), is proposed and experimentally demonstrated. LCFBGs are promising candidates for fully distributed sensing thanks to their longer grating length and broader reflection bandwidth compared with normal uniform FBGs. In the proposed system, two identical LCFBGs are employed in a Michelson interferometer setup, with one grating serving as the reference and the other as the sensing element. A broadband spectral interferogram is formed, and the strain information is encoded in the wavelength-dependent free spectral range (FSR). Ultrafast interrogation is achieved through dispersion-induced time stretch, such that the target spectral interferogram is mapped to a temporal interference waveform that can be captured in real time using a single-pixel photodetector. The distributed strain along the sensing grating can then be reconstructed from the instantaneous RF frequency of the captured waveform. High spatial resolution is also obtained thanks to the high-speed data acquisition. In a proof-of-concept experiment, ultrafast real-time interrogation of fully distributed grating sensors with various strain distributions is demonstrated. An ultrarapid measurement speed of 50 MHz, a high spatial resolution of 31.5 μm over a gauge length of 25 mm, and a strain resolution of 9.1 με have been achieved.
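The reconstruction step above rests on extracting the instantaneous frequency of the captured waveform. A minimal sketch of that extraction using a Hilbert transform is shown below; the sampling rate and test frequency are invented for illustration and are not the paper's values.

```python
# Hypothetical sketch: estimating the instantaneous frequency of a captured
# waveform via the analytic signal. Parameters are illustrative only.
import numpy as np
from scipy.signal import hilbert

fs = 1_000.0                                  # sampling rate, Hz (assumed)
t = np.arange(0, 1.0, 1.0 / fs)
f0 = 5.0                                      # stand-in for the RF beat frequency
waveform = np.cos(2 * np.pi * f0 * t)

analytic = hilbert(waveform)                  # analytic signal via FFT
phase = np.unwrap(np.angle(analytic))         # instantaneous phase
inst_freq = np.diff(phase) * fs / (2 * np.pi) # instantaneous frequency, Hz

mean_f = inst_freq[50:-50].mean()             # trim edge artefacts
```

In the sensor itself, `inst_freq` as a function of time would then be mapped back to strain as a function of position along the grating.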

Relevance:

100.00%

Publisher:

Abstract:

Exploratory analysis of data in all sciences seeks to find common patterns to gain insights into the structure and distribution of the data. Typically, visualisation methods such as principal component analysis are used, but these methods cannot easily deal with missing data, nor can they capture non-linear structure in the data. One approach to discovering complex, non-linear structure is through the use of linked plots, or brushing, while ignoring the missing data. In this technical report we discuss a complementary approach based on a non-linear probabilistic model. The generative topographic mapping enables the visualisation of the effects of very many variables on a single plot, which can incorporate far more structure than a two-dimensional principal components plot, while at the same time dealing with missing data. We show that the generative topographic mapping provides us with an optimal method to explore the data while being able to replace missing values in a dataset, particularly where a large proportion of the data is missing.
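To make the contrast concrete: classical PCA cannot ingest missing values directly, so a naive workaround is mean imputation before the decomposition, whereas the GTM handles missingness inside its probabilistic model. This toy numpy sketch (data and missingness rate invented) shows the imputation-then-SVD route:

```python
# Illustrative only: mean-impute missing entries, then project with PCA (SVD).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
X[rng.random(X.shape) < 0.1] = np.nan          # knock out ~10% of entries

col_means = np.nanmean(X, axis=0)
X_filled = np.where(np.isnan(X), col_means, X) # naive mean imputation

Xc = X_filled - X_filled.mean(axis=0)          # centre the data
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                         # 2-D projection for plotting
```

The imputation step discards any information the missingness pattern carries, which is one reason a model-based approach such as the GTM is preferable when much of the data is missing.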

Relevance:

100.00%

Publisher:

Abstract:

Research on the diagnosis of chronic illness indicates that this is an emotional time for patients. Information provision is especially salient for diabetes management, yet current orthodoxy suggests that too much information at the time of diagnosis is unhelpful for patients. In this study, we used in-depth interviews with 40 newly diagnosed type 2 diabetes (T2DM) patients in Scotland to explore their emotional reactions to diagnosis and their views about information provision at the time of diagnosis. Data were analysed using a thematic approach. Our results showed three main 'routes' to diagnosis: a 'suspected diabetes' route, an 'illness' route, and a 'routine' route. Those within the 'routine' route described the most varied emotional reactions to their diagnosis. We found that most patients, irrespective of their route to diagnosis, wanted more information about diabetes management at the time of diagnosis. We suggest that practitioners would benefit from being sensitive to the route patients follow to diagnosis, and that prompt, simple but detailed advice about T2DM management would be helpful for newly diagnosed patients. © 2004 Elsevier Ireland Ltd. All rights reserved.

Relevance:

100.00%

Publisher:

Abstract:

This thesis examines options for high-capacity all-optical networks. Specifically, optical time division multiplexed (OTDM) networks based on electro-optic modulators are investigated experimentally, and comparisons with alternative approaches are carried out. The thesis is intended to form the basis of a comparison between optical time division multiplexed networks and the more mature approach of wavelength division multiplexed networks. Following an introduction to optical networking concepts, the required component technologies are discussed. In particular, various optical pulse sources are described with the demanding restrictions of optical multiplexing in mind. This is followed by a discussion of the construction of multiplexers and demultiplexers, including favoured techniques for high-speed clock recovery. Theoretical treatments of the performance of Mach-Zehnder and electroabsorption modulators support the design criteria established for the construction of simple optical time division multiplexed systems. Having established appropriate end terminals for an optical network, the thesis examines transmission issues associated with high-speed RZ data signals. Propagation of RZ signals over both installed (standard fibre) and newly commissioned fibre routes is considered in turn. For standard fibre systems, the use of dispersion compensation is summarised and the application of mid-span spectral inversion is experimentally investigated. For green-field sites, soliton-like propagation of high-speed data signals is demonstrated. In this case the particular restrictions of high-speed soliton systems are discussed and experimentally investigated, namely the increasing impact of timing jitter and the downward pressure on repeater spacings imposed by the constraint of the average soliton model.
These issues are addressed through investigations of active soliton control for OTDM systems and of novel fibre types respectively. Finally, the particularly remarkable networking potential of optical time division multiplexed systems is established, and infinite node cascadability using soliton control is demonstrated. A final comparison of the various technologies for optical multiplexing is presented in the conclusions, where the relative merits of the technologies for optical networking emerge as the key differentiator.

Relevance:

100.00%

Publisher:

Abstract:

Today, the data available to tackle many scientific challenges are vast in quantity and diverse in nature. The exploration of heterogeneous information spaces requires suitable mining algorithms as well as effective visual interfaces. Most existing systems concentrate either on mining algorithms or on visualization techniques. Though visual methods developed in information visualization have been helpful, improved understanding of a complex, large, high-dimensional dataset requires an effective projection of the dataset onto a lower-dimensional (2D or 3D) manifold. This paper introduces a flexible visual data mining framework which combines advanced projection algorithms developed in the machine learning domain with visual techniques developed in the information visualization domain. The framework follows Shneiderman’s mantra to provide an effective user interface. The advantage of such an interface is that the user is directly involved in the data mining process. We integrate principled projection methods, such as Generative Topographic Mapping (GTM) and Hierarchical GTM (HGTM), with powerful visual techniques, such as magnification factors, directional curvatures, parallel coordinates, billboarding, and user interaction facilities, to provide an integrated visual data mining framework. Results on a real-life high-dimensional dataset from the chemoinformatics domain are reported and discussed. Projection results of GTM are analytically compared with those from other traditional projection methods, and it is shown that the HGTM algorithm provides additional value for large datasets. The computational complexity of these algorithms is discussed to demonstrate their suitability for the visual data mining framework.

Relevance:

100.00%

Publisher:

Abstract:

Genome sequences from many organisms, including humans, have been completed, and high-throughput analyses have produced burgeoning volumes of 'omics' data. Bioinformatics is crucial for the management and analysis of such data and is increasingly used to accelerate progress in a wide variety of large-scale and object-specific functional analyses. Refined algorithms enable biotechnologists to follow 'computer-aided strategies' based on experiments driven by high-confidence predictions. In order to address compound problems, current efforts in immuno-informatics and reverse vaccinology are aimed at developing and tuning integrative approaches and user-friendly, automated bioinformatics environments. This will herald a move to 'computer-aided biotechnology': smart projects in which time-consuming and expensive large-scale experimental approaches are progressively replaced by prediction-driven investigations.

Relevance:

100.00%

Publisher:

Abstract:

We demonstrate a single-step method for the generation of collagen and poly-L-lysine (PLL) micropatterns on a poly(ethylene glycol) (PEG)-functionalized glass surface for cell-based assays. The method involves establishing a reliable silanization procedure to create an effective non-adhesive PEG layer on glass that inhibits cell attachment, followed by the spotting of collagen or PLL solutions using non-contact piezoelectric printing. We show for the first time that the spotted protein micropatterns remain stable on the PEG surface even after extensive washing, significantly simplifying protein pattern formation. We found that adherence and spreading of NIH-3T3 fibroblasts were confined to the PLL and collagen areas of the micropatterns. In contrast, primary rat hepatocytes adhered and spread only on the collagen micropatterns, where they formed uniform, well-defined, functionally active cell arrays. The differing affinity of hepatocytes and NIH-3T3 fibroblasts for collagen and PLL patterns was used to develop a simple technique for creating a co-culture of the two cell types. This has the potential to form structured arrays that mimic the in vivo hepatic environment and is easily integrated within a miniaturized analytical platform for developing high-throughput toxicity analysis in vitro.

Relevance:

100.00%

Publisher:

Abstract:

This thesis investigates the pricing-to-market (PTM) behaviour of the UK export sector. Unlike previous studies, this study econometrically tests for seasonal unit roots in the export prices prior to estimating PTM behaviour; prior studies have seasonally adjusted the data automatically. The results show that monthly export prices contain little evidence of seasonal unit roots, implying that information about the data generating process of the series is lost when PTM is estimated using seasonally-adjusted data. Prior studies have also ignored the econometric properties of the data despite the existence of autoregressive conditional heteroscedasticity (ARCH) effects in such data; the standard approach has been to estimate PTM models using Ordinary Least Squares (OLS). For this reason, both EGARCH and GJR-EGARCH (hereafter GJR) estimation methods are used to estimate both a standard and an error correction model (ECM) of PTM. The results indicate that PTM behaviour varies across UK sectors. The variables used in the PTM models are cointegrated, and an ECM is a valid representation of pricing behaviour. The study also finds that price adjustment is slower when the analysis is performed on real prices, i.e., data adjusted for inflation. There is strong evidence of ARCH effects, meaning that the PTM parameter estimates of prior studies have been inefficiently estimated. Surprisingly, there is very little evidence of asymmetry, which suggests that exporters appear to PTM at a relatively constant rate. This finding might also explain the failure of prior studies to find evidence of asymmetric exposure to foreign exchange (FX) rates. This study also provides a cross-sectional analysis to explain the implications of the observed PTM for producers' marginal cost, market share and product differentiation. The cross-sectional regressions are estimated using OLS, Generalised Method of Moments (GMM) and logit estimations.
Overall, the results suggest that market share affects PTM positively. Exporters with smaller market share are more likely to operate PTM. Product differentiation, by contrast, is negatively associated with PTM, so industries with highly differentiated products are less likely to adjust their prices. Marginal costs, however, seem not to be significantly associated with PTM. Exporters perform PTM to limit the pass-through of FX rate effects to their foreign customers, but they also avoid exploiting PTM to the full, since doing so can substantially reduce their profits.
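The cointegration-plus-ECM logic above can be sketched numerically. The toy below uses the simpler two-step (Engle-Granger) OLS route rather than the thesis's EGARCH/GJR estimators, and simulated series rather than the thesis's data; it only illustrates the structure of an error correction model.

```python
# Hedged sketch: two-step estimation of an error correction model on
# simulated, cointegrated price/exchange-rate series (not the thesis's data).
import numpy as np

rng = np.random.default_rng(1)
n = 500
fx = np.cumsum(rng.normal(size=n))                 # random-walk FX rate
price = 0.6 * fx + rng.normal(scale=0.5, size=n)   # cointegrated export price

# Step 1: long-run relation  price_t = b * fx_t + u_t
b = np.linalg.lstsq(fx[:, None], price, rcond=None)[0][0]
u = price - b * fx                                 # equilibrium error

# Step 2: ECM  d(price)_t = a * d(fx)_t + g * u_{t-1} + e_t
dp, dfx = np.diff(price), np.diff(fx)
Z = np.column_stack([dfx, u[:-1]])
a, g = np.linalg.lstsq(Z, dp, rcond=None)[0]
# g < 0 means prices adjust back toward the long-run relation,
# which is what makes the ECM a valid representation of pricing behaviour.
```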

Relevance:

100.00%

Publisher:

Abstract:

The process of astrogliosis, or reactive gliosis, is a typical response of astrocytes to a wide range of physical and chemical injuries. Up-regulation of the astrocyte-specific glial fibrillary acidic protein (GFAP) is a hallmark of reactive gliosis and is widely used as a marker to identify the response. In order to develop a reliable, sensitive and high throughput astrocyte toxicity assay more relevant to the human response than existing animal-cell-based models, the U251-MG, U373-MG and CCF-STTG1 human astrocytoma cell lines were investigated for their ability to exhibit reactive-like changes following exposure to ethanol, chloroquine diphosphate, trimethyltin chloride and acrylamide. Cytotoxicity analysis showed that the astrocytic cells were generally more resistant to the cytotoxic effects of the agents than the SH-SY5Y neuroblastoma cells. Retinoic acid-induced differentiation of the SH-SY5Y line was also seen to confer some degree of resistance to toxicant exposure, particularly in the case of ethanol. Using a cell-based ELISA for GFAP together with concurrent assays for metabolic activity and cell number, each of the three cell lines responded to toxicant exposure by an increase in GFAP immunoreactivity (GFAP-IR) or by increased metabolic activity. Ethanol, chloroquine diphosphate, trimethyltin chloride and bacterial lipopolysaccharide all induced either GFAP or MTT increases depending upon the cell line, dose and exposure time. Preliminary investigations of additional aspects of astrocytic injury indicated that IL-6, but not TNF-α or nitric oxide, is released following exposure to each of the compounds, with the exception of acrylamide. It is clear that these human astrocytoma cell lines are capable of responding to toxicant exposure in a manner typical of reactive gliosis and are therefore a valuable cellular model for the assessment of in vitro neurotoxicity.

Relevance:

100.00%

Publisher:

Abstract:

This thesis describes the procedure and results from four years research undertaken through the IHD (Interdisciplinary Higher Degrees) Scheme at Aston University in Birmingham, sponsored by the SERC (Science and Engineering Research Council) and Monk Dunstone Associates, Chartered Quantity Surveyors. A stochastic networking technique VERT (Venture Evaluation and Review Technique) was used to model the pre-tender costs of public health, heating ventilating, air-conditioning, fire protection, lifts and electrical installations within office developments. The model enabled the quantity surveyor to analyse, manipulate and explore complex scenarios which previously had defied ready mathematical analysis. The process involved the examination of historical material costs, labour factors and design performance data. Components and installation types were defined and formatted. Data was updated and adjusted using mechanical and electrical pre-tender cost indices and location, selection of contractor, contract sum, height and site condition factors. Ranges of cost, time and performance data were represented by probability density functions and defined by constant, uniform, normal and beta distributions. These variables and a network of the interrelationships between services components provided the framework for analysis. The VERT program, in this particular study, relied upon Monte Carlo simulation to model the uncertainties associated with pre-tender estimates of all possible installations. The computer generated output in the form of relative and cumulative frequency distributions of current element and total services costs, critical path analyses and details of statistical parameters. From this data alternative design solutions were compared, the degree of risk associated with estimates was determined, heuristics were tested and redeveloped, and cost significant items were isolated for closer examination. 
The resultant models successfully combined cost, time and performance factors and provided the quantity surveyor with an appreciation of the cost ranges associated with the various engineering services design options.
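The Monte Carlo core of the VERT approach described above can be sketched in a few lines: each services element gets a cost distribution (uniform, normal, beta), the totals are simulated, and the spread of the pre-tender estimate is read off the percentiles. The element names and figures below are invented for illustration and are not the thesis's data.

```python
# Minimal Monte Carlo cost model in the spirit of the VERT analysis.
# All distributions and parameters are assumptions, not the study's values.
import numpy as np

rng = np.random.default_rng(42)
N = 100_000                                          # simulation runs

heating = rng.uniform(90_000, 110_000, N)            # uniform cost range
electrical = rng.normal(150_000, 10_000, N)          # normally spread cost
lifts = 60_000 + 20_000 * rng.beta(2, 5, N)          # skewed beta element

total = heating + electrical + lifts                 # total services cost
p10, p50, p90 = np.percentile(total, [10, 50, 90])   # estimate spread
```

From the resulting distribution the quantity surveyor can read off the cost range and the risk attached to any single-figure estimate, exactly the kind of appreciation the abstract describes.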

Relevance:

100.00%

Publisher:

Abstract:

2,5-Hexanedione (2,5HD) is the neurotoxic metabolite of the aliphatic hydrocarbon n-hexane. Its isomers, 2,3-hexanedione (2,3HD) and 3,4-hexanedione (3,4HD), are used as food additives. Although the neurotoxicity of 2,5HD is well established, there are no human data on the possible toxicity of the 2,3- and 3,4-isomers. MTT and flow cytometry were used to determine the cytotoxicity of hexanedione isomers in neuroblastoma cells. The neuroblastoma cell lines SK-N-SH and SH-SY5Y are sufficiently neuron-like to provide a preliminary assessment of the neurotoxic potential of these isomers, in comparison with toxicity towards human non-neuronal cells. Initial studies showed that 2,5HD was the least toxic in all cell lines at all time points (4, 24 and 48 h). Although considerably lower than for 2,5HD, the IC50s for the α isomers were in general not significantly different from each other and, apart from the 4 h exposure, the SH-SY5Y cells were significantly more sensitive to 2,3HD and 3,4HD than the SK-N-SH cells. All three isomers caused varying degrees of apoptosis in the neuroblastoma lines, with 3,4HD more potent than 2,3HD. Flow cytometry highlighted cell cycle arrest indicative of DNA damage with 2,3- and 3,4HD. The toxicity of the isomers towards three non-neuronal cell lines (MCF7, HepG2 and CaCo-2) was assessed by MTT assay; all three hexanedione isomers proved cytotoxic in all non-neuronal cell lines at all time points. These data suggest cytotoxicity of 2,3- and 3,4HD (mM range), but it is difficult to define this as specific neurotoxicity in the absence of specific neurotoxic endpoints. However, the neuroblastomas were significantly more susceptible to the cytotoxic effects of the α hexanedione isomers at exposures of 4 and 24 hours, compared to the non-neuronal lines. Finally, a mechanism of toxicity is suggested for the α HD isomers whereby inhibition of the oxoglutarate carrier (OGC) releases apoptosis-inducing factor (AIF), causing apoptosis-like cell death.
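An IC50 of the kind compared above is typically read off an MTT dose-response curve. One common shortcut is linear interpolation in log-dose between the two doses bracketing 50% viability; the sketch below uses made-up viability numbers, not the thesis's measurements.

```python
# Hypothetical sketch: reading an IC50 off a dose-response curve by
# interpolating in log-dose. The data are invented for illustration.
import numpy as np

doses = np.array([0.1, 1.0, 10.0, 100.0])        # mM (illustrative)
viability = np.array([95.0, 80.0, 40.0, 10.0])   # % of control (illustrative)

log_d = np.log10(doses)
i = np.argmax(viability < 50.0)                  # first dose below 50% viability
frac = (viability[i - 1] - 50.0) / (viability[i - 1] - viability[i])
ic50 = 10 ** (log_d[i - 1] + frac * (log_d[i] - log_d[i - 1]))  # interpolated
```

A fuller analysis would fit a four-parameter logistic (Hill) curve, but the interpolation gives a quick first estimate.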

Relevance:

100.00%

Publisher:

Abstract:

This thesis applies a hierarchical latent trait model system to a large quantity of data. The motivation was the lack of viable approaches for analysing High Throughput Screening datasets, which may include thousands of data points of high dimension. High Throughput Screening (HTS) is an important tool in the pharmaceutical industry for discovering leads which can be optimised and further developed into candidate drugs. Since the development of new robotic technologies, the ability to test the activities of compounds has increased considerably in recent years. Traditional methods, which look at tables and graphical plots to analyse relationships between measured activities and the structure of compounds, are not feasible when facing a large HTS dataset. Instead, data visualisation provides a method for analysing such large datasets, especially those of high dimension. So far, a few visualisation techniques for drug design have been developed, but most of them cope with only a few properties of compounds at a time. We believe that a latent variable model with a non-linear mapping from the latent space to the data space is a preferred choice for visualising a complex high-dimensional data set. As a type of latent variable model, the latent trait model (LTM) can deal with either continuous or discrete data, which makes it particularly useful in this domain. In addition, with the aid of differential geometry, we can infer the distribution of the data from magnification factor and curvature plots. Rather than obtaining the useful information from a single plot, a hierarchical LTM arranges a set of LTMs and their corresponding plots in a tree structure. We model the whole data set with an LTM at the top level, which is broken down into clusters at deeper levels of the hierarchy. In this manner, refined visualisation plots can be displayed at deeper levels and sub-clusters may be found.
The hierarchy of LTMs is trained using the expectation-maximisation (EM) algorithm to maximise its likelihood with respect to the data sample. Training proceeds interactively in a recursive, top-down fashion: the user subjectively identifies interesting regions on the visualisation plot that they would like to model in greater detail. At each stage of hierarchical LTM construction, the EM algorithm alternates between the E- and M-steps. Another problem that can occur when visualising a large data set is that there may be significant overlaps between data clusters, making it very difficult for the user to judge where the centres of regions of interest should be placed. We address this problem by employing the minimum message length technique, which helps the user to decide the optimal structure of the model. In this thesis we also demonstrate the applicability of the hierarchy of latent trait models in the field of document data mining.
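The E-/M-step alternation described above can be illustrated on a much simpler latent variable model: a two-component 1-D Gaussian mixture. The data, initialisation and iteration count below are invented; an LTM replaces the Gaussian components with a structured non-linear mapping, but the alternation is the same.

```python
# Toy EM illustration (not an LTM): fit a 2-component 1-D Gaussian mixture.
import numpy as np

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-3, 1, 200), rng.normal(3, 1, 200)])

mu = np.array([-1.0, 1.0])       # initial means (assumed)
sigma = np.array([1.0, 1.0])     # initial standard deviations
pi = np.array([0.5, 0.5])        # initial mixing weights

for _ in range(50):
    # E-step: posterior responsibility of each component for each point
    # (the 1/sqrt(2*pi) constant cancels in the normalisation)
    dens = pi * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / sigma
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate parameters from the responsibilities
    Nk = resp.sum(axis=0)
    mu = (resp * x[:, None]).sum(axis=0) / Nk
    sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / Nk)
    pi = Nk / len(x)
```

Each pass raises the likelihood, and after a few iterations the means settle near the true cluster centres.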

Relevance:

100.00%

Publisher:

Abstract:

A new family of commercial zinc alloys designated ZA8, ZA12 and ZA27, high damping capacity alloys including Cosmal and Supercosmal, and the aluminium alloy LM25 were investigated for compressive creep and load relaxation behaviour under a series of temperatures and stresses. A compressive creep machine was designed to test sand-cast hollow cylindrical specimens of these alloys. For each compressive creep experiment the variation of creep strain was presented as a graph of percentage creep strain (%) versus time in seconds (s). In all cases, the curves showed the same general form of the creep curve, i.e. a primary creep stage followed by a linear steady-state region (secondary creep). In general, alloy ZA8 had the least primary creep among the commercial zinc-based alloys and ZA27 the greatest. The extent of primary creep increased with aluminium content up to that of ZA27 and then declined towards Supercosmal. The overall creep strength of ZA27 was generally less than that of ZA8 and ZA12, but it showed better creep strength than ZA8 and ZA12 at high temperature and high stress. Among the high damping capacity alloys, Supercosmal had less primary creep and longer secondary creep regions, and also the lowest minimum creep rate of all the tested alloys. LM25 exhibited almost no creep at the maximum temperature and stress used in this work. Total creep elongation was shown to be well correlated using an empirical equation. Stress exponents and activation energies were calculated and found to be consistent with a creep mechanism of dislocation climb. The primary α and β phases in the as-cast structures decomposed to lamellar phases on cooling, with some particulates at dendrite edges and grain boundaries. Further breakdown into particulate bodies occurred during creep testing, and zinc bands developed at the highest test temperature of 160°C.
The results of load relaxation testing showed that load loss initially proceeded rapidly and then diminished gradually with time. Load loss increased with temperature, and almost all the curves approximated a logarithmic decay of preload with time. ZA alloys exhibited almost the same load loss at lower temperatures, but at 120°C ZA27 improved its relative performance with the passage of time. The high damping capacity alloys and LM25 had much better resistance to load loss than the ZA alloys, with LM25 the best against load loss among these alloys. A preliminary equation was derived to correlate the retained load with time and temperature.
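The stress exponents and activation energies mentioned above come from the standard power-law creep relation, rate = A·σⁿ·exp(−Q/RT), which is linear in log form and so can be fitted by least squares. The sketch below uses synthetic data with assumed constants, not the thesis's measurements.

```python
# Hedged sketch: extracting the stress exponent n and activation energy Q
# from minimum creep rates via ln(rate) = ln A + n ln(sigma) - Q/(R*T).
# A, n_true and Q_true are assumed values, not measured ones.
import numpy as np

R = 8.314                                    # gas constant, J/(mol K)
A, n_true, Q_true = 1e-3, 5.0, 100e3         # assumed material constants

stresses = np.array([40.0, 60.0, 80.0])      # MPa (illustrative)
temps = np.array([353.0, 393.0, 433.0])      # K, roughly 80-160 degC
S, T = np.meshgrid(stresses, temps)
rate = A * S**n_true * np.exp(-Q_true / (R * T))   # synthetic creep rates

# linear least squares on the log-transformed relation
X = np.column_stack([np.ones(S.size), np.log(S).ravel(), 1.0 / (R * T.ravel())])
coef, *_ = np.linalg.lstsq(X, np.log(rate).ravel(), rcond=None)
n_fit, Q_fit = coef[1], -coef[2]
```

A stress exponent near 4-5 with an activation energy close to that for lattice diffusion is the usual signature of dislocation-climb creep.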

Relevance:

100.00%

Publisher:

Abstract:

The drug efflux pump P-glycoprotein (P-gp) (ABCB1) confers multidrug resistance, a major cause of failure in the chemotherapy of tumours, exacerbated by a shortage of potent and selective inhibitors. A high throughput assay using purified P-gp to screen and characterise potential inhibitors would greatly accelerate their development. However, long-term stability of purified reconstituted ABCB1 can only be reliably achieved with storage at -80 °C. For example, at 20 °C, the activity of ABCB1 was abrogated with a half-life of <1 day. The aim of this investigation was to stabilise purified, reconstituted ABCB1 to enable storage at higher temperatures and thereby enable design of a high throughput assay system. The ABCB1 purification procedure was optimised to allow successful freeze drying by substitution of glycerol with the disaccharides trehalose or maltose. Addition of disaccharides resulted in ATPase activity being retained immediately following lyophilisation with no significant difference between the two disaccharides. However, during storage trehalose preserved ATPase activity for several months regardless of the temperature (e.g. 60% retention at 150 days), whereas ATPase activity in maltose purified P-gp was affected by both storage time and temperature. The data provide an effective mechanism for the production of resilient purified, reconstituted ABCB1.
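As a quick sanity check on the storage figures above: assuming first-order (exponential) loss of ATPase activity, the reported 60% retention at 150 days implies a half-life of roughly 200 days for the trehalose preparation, compared with under a day at 20 °C without stabilisation.

```python
# Back-of-envelope check assuming first-order decay of ATPase activity.
import math

retention, days = 0.60, 150.0            # figures from the abstract
k = -math.log(retention) / days          # first-order decay constant, 1/day
half_life = math.log(2) / k              # implied half-life, days (~203)
```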

Relevance:

100.00%

Publisher:

Abstract:

We analyze a Big Data set of geo-tagged tweets for a year (Oct. 2013–Oct. 2014) to understand regional linguistic variation in the U.S. Prior work on regional linguistic variation usually took a long time to collect data and focused on either rural or urban areas. Geo-tagged Twitter data offers an unprecedented database with rich linguistic representation at fine spatiotemporal resolution and continuity. From the one-year Twitter corpus, we extract lexical characteristics for Twitter users by summarizing the frequencies of a set of lexical alternations that each user has used. We spatially aggregate and smooth each lexical characteristic to derive county-based linguistic variables, from which orthogonal dimensions are extracted using principal component analysis (PCA). Finally, a regionalization method is used to discover hierarchical dialect regions using the PCA components. The regionalization results reveal interesting regional linguistic variations in the U.S. The discovered regions not only confirm past research findings in the literature but also provide new insights and a more detailed understanding of very recent linguistic patterns in the U.S.
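The last two stages of this pipeline can be sketched on a toy county-by-lexical-variable matrix: PCA via SVD, then hierarchical (Ward) clustering of the leading components. The matrix below is random with an artificial "dialect" shift injected; a real regionalization also enforces spatial contiguity, which this sketch omits.

```python
# Minimal sketch of PCA + hierarchical clustering for dialect regions.
# The 30x8 county matrix and the two-region split are invented for illustration.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
counties = rng.normal(size=(30, 8))              # 30 counties, 8 lexical vars
counties[:15] += 4.0                             # fake dialect-region shift

Xc = counties - counties.mean(axis=0)            # centre before PCA
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
pcs = Xc @ Vt[:3].T                              # top 3 principal components

Z = linkage(pcs, method="ward")                  # hierarchical clustering
regions = fcluster(Z, t=2, criterion="maxclust") # cut into two dialect regions
```

Cutting the dendrogram at different depths yields the nested, hierarchical dialect regions the paper describes.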