21 results for MS-based methods

in Aston University Research Archive


Relevance:

100.00%

Publisher:

Abstract:

Many inflammatory diseases have an oxidative aetiology, which leads to oxidative damage to biomolecules, including proteins. It is now increasingly recognized that oxidative post-translational modifications (oxPTMs) of proteins affect cell signalling and behaviour, and can contribute to pathology. Moreover, oxidized proteins have potential as biomarkers for inflammatory diseases. Although many assays for generic protein oxidation and breakdown products of protein oxidation are available, only advanced tandem mass spectrometry approaches have the power to localize specific oxPTMs in identified proteins. While much work has been carried out using untargeted or discovery mass spectrometry approaches, identification of oxPTMs in disease has benefitted from the development of sophisticated targeted or semi-targeted scanning routines, combined with chemical labeling and enrichment approaches. Nevertheless, many potential pitfalls exist which can result in incorrect identifications. This review explains the limitations, advantages and challenges of all of these approaches to detecting oxidatively modified proteins, and provides an update on recent literature in which they have been used to detect and quantify protein oxidation in disease.
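In practice, the targeted identification of oxPTMs described above rests on matching characteristic monoisotopic mass shifts between observed and unmodified peptide masses. A minimal sketch of that matching step, using standard oxidation mass shifts; the function name, peptide mass and tolerance are illustrative, not from the review:

```python
# Hedged sketch: match an observed peptide mass against candidate oxidative
# modifications by monoisotopic mass shift (Da). The shift values are the
# standard ones; the peptide mass and tolerance below are illustrative.
OXPTM_SHIFTS = {
    "oxidation (e.g. Met -> MetO)": 15.9949,
    "dioxidation": 31.9898,
    "trioxidation (e.g. Cys -> cysteic acid)": 47.9847,
}

def match_oxptm(peptide_mass, observed_mass, tol_da=0.01):
    """Return names of oxPTMs whose mass shift explains the difference."""
    delta = observed_mass - peptide_mass
    return [name for name, shift in OXPTM_SHIFTS.items()
            if abs(delta - shift) <= tol_da]

# A peptide observed ~15.995 Da heavier than its unmodified mass:
print(match_oxptm(1000.500, 1016.495))  # → ['oxidation (e.g. Met -> MetO)']
```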

Relevance:

100.00%

Publisher:

Abstract:

This paper advances a philosophically informed rationale for the broader, reflexive and practical application of arts-based methods to benefit research, practice and pedagogy. It addresses the complexity and diversity of learning and knowing, foregrounding a cohabitative position and recognition of a plurality of research approaches, tailored and responsive to context. Appreciation of art and aesthetic experience is situated in the everyday, underpinned by multi-layered exemplars of pragmatic visual-arts narrative inquiry undertaken in the third, creative and communications sectors. Discussion considers semi-guided use of arts-based methods as a conduit for topic engagement, reflection and intersubjective agreement, alongside observation and interpretation of organically employed approaches used by participants within daily norms. Techniques span handcrafted (drawing), digital (photography), hybrid (cartooning), performance dimensions (improvised installations) and music (metaphor and structure). The process of creation, the artefact/outcome produced and experiences of consummation are all significant, with specific reflexivity impacts. Exploring methodology and epistemology, both the "doing" and its interpretation are explicated to inform method selection, replication, utility, evaluation and development of cross-media skills literacy. Approaches are found engaging, accessible and empowering, with nuanced capabilities to alter relationships with phenomena, experiences and people. By building a discursive space that reduces barriers, emancipation, interaction, polyphony, letting-go and the progressive unfolding of thoughts are supported, benefiting ways of knowing, narrative (re)construction, sensory perception and capacities to act. This can also present underexplored researcher risks with respect to emotion work, self-disclosure, identity and agenda.
The paper therefore elucidates complex, intricate relationships between form and content, the represented and the representation or performance, researcher and participant, and the self and other. This benefits understanding of phenomena including personal experience, sensitive issues, empowerment, identity, transition and liminality. Observations are relevant to qualitative and mixed methods researchers and a multidisciplinary audience, with explicit identification of challenges, opportunities and implications.

Relevance:

100.00%

Publisher:

Abstract:

The research described in this PhD thesis focuses on proteomics approaches to study the effect of oxidation on the modification status and protein-protein interactions of PTEN, a redox-sensitive phosphatase involved in a number of cellular processes including metabolism, apoptosis, cell proliferation, and survival. While direct evidence of a redox regulation of PTEN and its downstream signaling has been reported, the effect of cellular oxidative stress or direct PTEN oxidation on PTEN structure and interactome is still poorly defined. In a first study, GST-tagged PTEN was directly oxidized over a range of hypochlorous acid (HOCl) concentrations, assayed for phosphatase activity, and oxidative post-translational modifications (oxPTMs) were quantified using LC-MS/MS-based label-free methods. In a second study, GST-tagged PTEN was prepared in a reduced and reversibly H2O2-oxidized form, immobilized on a resin support and incubated with HCT116 cell lysate to capture PTEN-interacting proteins, which were analyzed by LC-MS/MS and comparatively quantified using label-free methods. In parallel experiments, HCT116 cells transfected with a GFP-tagged PTEN were treated with H2O2 and PTEN-interacting proteins were immunoprecipitated using standard methods. Several high-abundance HOCl-induced oxPTMs were mapped, including those taking place at amino acids known to be important for PTEN phosphatase activity and protein-protein interactions, such as Met35, Tyr155, Tyr240 and Tyr315. A PTEN redox interactome was also characterized, which identified a number of PTEN-interacting proteins that vary with the reversible inactivation of PTEN caused by H2O2 oxidation. These included new PTEN interactors as well as the redox proteins peroxiredoxin-1 (Prdx1) and thioredoxin (Trx), which are known to be involved in the recycling of the PTEN active site following H2O2-induced reversible inactivation.
The results suggest that the oxidative modification of PTEN causes functional alterations in PTEN structure and interactome, with fundamental implications for the PTEN signaling role in many cellular processes, such as those involved in the pathophysiology of disease and ageing.

Relevance:

90.00%

Publisher:

Abstract:

Many planning and control tools, especially network analysis, have been developed in the last four decades. The majority of them were created in military organizations to solve the problem of planning and controlling research and development projects. The original version of the network model (i.e. C.P.M./PERT) was transplanted to the construction industry without consideration of the special nature and environment of construction projects. It suited the purpose of setting up targets and defining objectives, but it failed to satisfy the requirement of detailed planning and control at the site level. Several analytical and heuristic rule-based methods were designed and combined with the structure of C.P.M. to eliminate its deficiencies, but none of them provides a complete solution to the problem of resource, time and cost control. VERT was designed to deal with new ventures and is suitable for project evaluation at the development stage. CYCLONE, on the other hand, is concerned with the design and micro-analysis of the production process. This work introduces an extensive critical review of the available planning techniques and addresses the problem of planning for site operation and control. Based on an outline of the nature of site control, this research developed a simulation-based network model which combines part of the logic of both VERT and CYCLONE. Several new nodes were designed to model the availability and flow of resources and the overhead and operating costs, along with special nodes for evaluating time and cost. A large software package was written to handle the input, the simulation process and the output of the model. This package is designed to run on any microcomputer using the MS-DOS operating system. Data from real-life projects were used to demonstrate the capability of the technique.
Finally, a set of conclusions is drawn regarding the features and limitations of the proposed model, and recommendations for future work are outlined at the end of this thesis.
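The C.P.M. logic that this thesis critiques and extends reduces to a forward pass (earliest start/finish times) and a backward pass (latest times); activities with zero total float form the critical path. A minimal sketch under illustrative activity data (the function and the example network are not from the thesis):

```python
# Hedged sketch of the C.P.M. forward/backward pass. Durations and the
# dependency map (activity -> predecessors) are illustrative.
def cpm(durations, deps):
    """Return (project_duration, critical_path_activities)."""
    order, seen = [], set()
    def visit(a):                     # topological order via DFS
        if a in seen:
            return
        seen.add(a)
        for p in deps.get(a, []):
            visit(p)
        order.append(a)
    for a in durations:
        visit(a)
    es, ef = {}, {}
    for a in order:                   # forward pass: earliest start/finish
        es[a] = max((ef[p] for p in deps.get(a, [])), default=0)
        ef[a] = es[a] + durations[a]
    project_end = max(ef.values())
    succs = {a: [b for b in durations if a in deps.get(b, [])] for a in durations}
    lf, ls = {}, {}
    for a in reversed(order):         # backward pass: latest start/finish
        lf[a] = min((ls[s] for s in succs[a]), default=project_end)
        ls[a] = lf[a] - durations[a]
    critical = [a for a in order if es[a] == ls[a]]  # zero total float
    return project_end, critical

dur = {"A": 3, "B": 2, "C": 4, "D": 2}
dep = {"C": ["A", "B"], "D": ["C"]}
print(cpm(dur, dep))  # → (9, ['A', 'C', 'D'])
```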

Relevance:

90.00%

Publisher:

Abstract:

This study presents quantitative evidence from a number of simulation experiments on the accuracy of the productivity-growth estimates derived from growth-accounting (GA) and frontier-based methods (namely data envelopment analysis-, corrected ordinary least squares-, and stochastic frontier analysis-based Malmquist indices) under various conditions. These include the presence of technical inefficiency, measurement error, misspecification of the production function (for the GA and parametric approaches) and increased input and price volatility from one period to the next. The study finds that the frontier-based methods usually outperform GA, but the overall performance varies by experiment. Parametric approaches generally perform best when there is no functional-form misspecification, but their accuracy diminishes greatly otherwise. The results also show that the deterministic approaches perform adequately even under conditions of (modest) measurement error; when measurement error becomes larger, the accuracy of all approaches (including stochastic approaches) deteriorates rapidly, to the point that their estimates could be considered unreliable for policy purposes.
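The growth-accounting benchmark in the study is the Solow-residual idea: TFP growth is output growth minus the share-weighted growth of inputs. A minimal sketch under an assumed Cobb-Douglas technology with constant returns and capital share `alpha` (all numbers illustrative, not the paper's simulation data):

```python
import math

# Hedged sketch of the growth-accounting (Solow residual) estimate:
# TFP growth = log output growth minus share-weighted log input growth.
def solow_residual(y0, y1, k0, k1, l0, l1, alpha=0.3):
    dy = math.log(y1 / y0)            # log growth of output
    dk = math.log(k1 / k0)            # log growth of capital
    dl = math.log(l1 / l0)            # log growth of labour
    return dy - alpha * dk - (1 - alpha) * dl

# Output up 5%, capital and labour each up 2%: residual ~ 2.9% TFP growth.
tfp = solow_residual(y0=100, y1=105, k0=50, k1=51, l0=20, l1=20.4)
print(round(tfp, 4))
```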

Relevance:

90.00%

Publisher:

Abstract:

Large-scale disasters occur constantly around the world, and in many cases the evacuation of regions of a city is needed. Operational Research/Management Science (OR/MS) has been widely used in emergency planning for over five decades. Warning dissemination, evacuee transportation and shelter management are three 'Evacuation Support Functions' (ESFs) generic to many hazards. This thesis has adopted a case study approach to illustrate the importance of an integrated approach to evacuation planning and particularly the role of OR/MS models. In the warning dissemination phase, uncertainty in household behaviour as 'warning informants' has been investigated along with uncertainties in the warning system. An agent-based model (ABM) was developed for ESF-1 with households as agents and 'warning informant' behaviour as the agent behaviour. The model was used to study warning dissemination effectiveness under various conditions of the official channel. In the transportation phase, uncertainties in household behaviour such as departure time (a function of ESF-1), means of transport and destination have been investigated. Households could evacuate as pedestrians, by car or by evacuation bus. An ABM was developed to study the evacuation performance (measured in evacuation travel time). In this thesis, a holistic approach for planning public evacuation shelters, called the 'Shelter Information Management System' (SIMS), has been developed. A generic framework was developed to allocate available shelter capacity to the shelter demand by considering the evacuation travel time; this was formulated using integer programming. In the sheltering phase, the uncertainty in household shelter choices (nearest, allocated or convenient) has been studied for its impact on allocation policies using sensitivity analyses.
Using analyses from the models and detailed examination of household states from 'warning to safety', it was found that the three ESFs, though sequential in time, have many interdependencies from the perspective of evacuation planning. This thesis has illustrated an OR/MS-based integrated approach encompassing and going beyond single-ESF preparedness. The developed approach will help in understanding the inter-linkages of the three evacuation phases and in preparing a multi-agency-based evacuation plan.
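The allocation step in SIMS matches shelter demand to capacity while minimising total evacuation travel time, which the thesis formulates as an integer programme. A toy sketch of the same objective, solved here by exhaustive search rather than integer programming for clarity; group names, capacities and travel times are illustrative:

```python
from itertools import product

# Hedged sketch: assign evacuee groups to shelters minimising total
# person-minutes of travel, subject to shelter capacity. All data illustrative.
def allocate(demand, capacity, travel_time):
    """demand[g]: group size; capacity[s]: shelter cap;
    travel_time[g][s]: minutes. Returns (best_assignment, total_time)."""
    groups, shelters = list(demand), list(capacity)
    best, best_cost = None, float("inf")
    for choice in product(range(len(shelters)), repeat=len(groups)):
        load = [0] * len(shelters)
        cost = 0
        for gi, si in enumerate(choice):
            load[si] += demand[groups[gi]]
            cost += demand[groups[gi]] * travel_time[groups[gi]][shelters[si]]
        feasible = all(load[i] <= capacity[shelters[i]] for i in range(len(shelters)))
        if feasible and cost < best_cost:
            best = {groups[gi]: shelters[si] for gi, si in enumerate(choice)}
            best_cost = cost
    return best, best_cost

demand = {"G1": 100, "G2": 80}
capacity = {"S1": 120, "S2": 150}
tt = {"G1": {"S1": 10, "S2": 20}, "G2": {"S1": 5, "S2": 8}}
print(allocate(demand, capacity, tt))  # → ({'G1': 'S1', 'G2': 'S2'}, 1640)
```

A real instance would use an integer-programming solver; the brute-force loop only makes the objective and capacity constraint explicit.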

Relevance:

90.00%

Publisher:

Abstract:

This paper aims to reduce the difference between sketches and photos by synthesizing sketches from photos, and vice versa, and then performing sketch-sketch/photo-photo recognition with subspace-learning-based methods. Pseudo-sketch/pseudo-photo patches are synthesized with an embedded hidden Markov model. Because these patches are assembled by averaging their overlapping areas in most local-strategy-based methods, which leads to a blurring effect in the resulting pseudo-sketch/pseudo-photo, we integrate the patches with image quilting. Experiments are carried out to demonstrate that the proposed method is effective in producing pseudo-sketches/pseudo-photos of high quality and achieves promising recognition results.
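The blurring problem and the quilting remedy can be seen on a 1-D toy example: averaging the overlap introduces intermediate pixel values at the seam, whereas a quilting-style cut keeps a hard boundary. A minimal sketch; the functions and pixel values are illustrative, not the paper's algorithm (true image quilting picks the cut along a minimum-error boundary):

```python
# Hedged sketch: two 1-D "patches" joined with a 2-pixel overlap.
left = [10, 10, 10, 10]      # last `overlap` pixels overlap the next patch
right = [40, 40, 40, 40]     # first `overlap` pixels overlap the previous one

def average_blend(a, b, overlap):
    """Averaging the overlap: seam pixels become intermediate values (blur)."""
    seam = [(x + y) / 2 for x, y in zip(a[-overlap:], b[:overlap])]
    return a[:-overlap] + seam + b[overlap:]

def quilt_blend(a, b, overlap, cut):
    """Quilting-style: keep a's pixels up to the cut, b's after (hard seam)."""
    return a[:-overlap] + a[-overlap:][:cut] + b[:overlap][cut:] + b[overlap:]

print(average_blend(left, right, 2))   # → [10, 10, 25.0, 25.0, 40, 40]
print(quilt_blend(left, right, 2, 1))  # → [10, 10, 10, 40, 40, 40]
```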

Relevance:

90.00%

Publisher:

Abstract:

In this work, we introduce the periodic nonlinear Fourier transform (PNFT) method as an alternative and efficacious tool for the compensation of nonlinear transmission effects in optical fiber links. In Part I, we introduce the algorithmic platform of the technique, describing in detail the direct and inverse PNFT operations, also known as the inverse scattering transform for the periodic (in the time variable) nonlinear Schrödinger equation (NLSE). We pay special attention to explaining the potential advantages of PNFT-based processing over the previously studied nonlinear Fourier transform (NFT) based methods. Further, we elucidate the issue of numerical PNFT computation: we compare the performance of four known numerical methods applicable to the calculation of the nonlinear spectral data (the direct PNFT), in particular taking the main spectrum (utilized further in Part II for modulation and transmission) associated with some simple example waveforms as the quality indicator for each method. We show that the Ablowitz-Ladik discretization approach for the direct PNFT provides the best performance in terms of accuracy and computational time.
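The direct PNFT computation can be sketched via the monodromy matrix over one period and its Floquet discriminant Δ(ζ) = trace(M)/2, whose values Δ = ±1 locate the main spectrum. The 2×2 transfer matrix below follows one common Ablowitz-Ladik convention and may differ in normalization from the paper's; it is a sketch, not the paper's implementation:

```python
import cmath

# Hedged sketch of an Ablowitz-Ladik-style direct PNFT step: accumulate the
# monodromy matrix over one period of samples q(t_n) and return the Floquet
# discriminant. Convention: z = exp(-i*zeta*h), per-step matrix
# (1+|Q|^2)^(-1/2) * [[z, Q], [-conj(Q), 1/z]] with Q = q*h (assumed form).
def mat_mul(a, b):
    return [[a[0][0]*b[0][0] + a[0][1]*b[1][0], a[0][0]*b[0][1] + a[0][1]*b[1][1]],
            [a[1][0]*b[0][0] + a[1][1]*b[1][0], a[1][0]*b[0][1] + a[1][1]*b[1][1]]]

def floquet_discriminant(q_samples, h, zeta):
    m = [[1, 0], [0, 1]]
    z = cmath.exp(-1j * zeta * h)
    for q in q_samples:
        Q = q * h
        norm = (1 + abs(Q) ** 2) ** -0.5
        t = [[norm * z, norm * Q], [-norm * Q.conjugate(), norm / z]]
        m = mat_mul(t, m)             # left-multiply: M = T_N ... T_1
    return (m[0][0] + m[1][1]) / 2

# Sanity check: for q = 0 the discriminant reduces to cos(N * zeta * h).
n, h, zeta = 8, 0.25, 0.7
delta = floquet_discriminant([0.0] * n, h, zeta)
print(abs(delta - cmath.cos(n * zeta * h)) < 1e-12)  # → True
```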

Relevance:

80.00%

Publisher:

Abstract:

In this paper we review recent theoretical approaches for analysing the dynamics of on-line learning in multilayer neural networks using methods adopted from statistical physics. The analysis is based on monitoring a set of macroscopic variables from which the generalisation error can be calculated. A closed set of dynamical equations for the macroscopic variables is derived analytically and solved numerically. The theoretical framework is then employed for defining optimal learning parameters and for analysing the incorporation of second order information into the learning process using natural gradient descent and matrix-momentum based methods. We will also briefly explain an extension of the original framework for analysing the case where training examples are sampled with repetition.
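The macroscopic-variable picture can be illustrated on a toy single-layer case: a Hebbian on-line rule is simulated and the student weight vector J is monitored only through the order parameters R = J·B/N (teacher overlap) and Q = J·J/N, from which the perceptron generalisation error arccos(R/√Q)/π follows. This simplified setup is for illustration only; the review treats multilayer networks analytically.

```python
import math
import random

# Hedged sketch: on-line Hebbian learning of a teacher perceptron B,
# monitored via the macroscopic order parameters R and Q. Toy scale only.
random.seed(0)
N = 200
B = [1.0] * N                      # teacher vector, |B|^2 = N
J = [0.0] * N                      # student starts at zero

def gen_error(J, B):
    R = sum(j * b for j, b in zip(J, B)) / N
    Q = sum(j * j for j in J) / N
    return math.acos(R / math.sqrt(Q)) / math.pi

for step in range(20 * N):         # 20 examples per weight ("time" t = 20)
    x = [random.gauss(0, 1) for _ in range(N)]
    label = 1.0 if sum(b * xi for b, xi in zip(B, x)) >= 0 else -1.0
    for i in range(N):             # Hebbian update: J <- J + label * x / N
        J[i] += label * x[i] / N

print(round(gen_error(J, B), 3))   # well below the 0.5 of random guessing
```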

Relevance:

80.00%

Publisher:

Abstract:

Road traffic accident involvement rates show that younger males are over-represented in accidents. A number of studies have shown individual differences in accident involvement. Questionnaire-based methods used to investigate individual and group differences in driver stress and risk perceptions, reported in chapters 2 and 3, revealed that neuroticism was associated with heightened perception of personal risk, driver stress, and inefficient coping strategies. Younger drivers and female drivers reported higher levels of stress. Young male drivers assessed their personal risk and driving abilities less realistically than did other age and sex groups. Driving simulator-based methods reported in chapter 4 revealed that young drivers and male drivers drive faster, overtake more often, and commit more 'high risk' overtakes than do other age and sex groups. Middle-aged and elderly drivers were poorer at maintaining a fixed distance from a lead 'vehicle'. Older drivers adopt a slower, more cautious driving style, but appear to be worse at controlling distance from a 'lead' vehicle. Results are consistent with individual and group differences in accident involvement rates. Findings are discussed with reference to the implementation of driver education programs to reduce stress, the adoption of more realistic perceptions of risk among younger drivers, and the training of compensation strategies to counteract age-related changes in older drivers.

Relevance:

80.00%

Publisher:

Abstract:

Conventional methods of form-roll design and manufacture for the cold roll-forming of thin-walled metal sections have been entirely manual, time-consuming and prone to errors, resulting in inefficiency and high production costs. With the use of computers, lead time can be significantly improved, particularly for those aspects involving routine but tedious human decisions and actions. This thesis describes the development of computer-aided tools for producing form-roll designs for NC manufacture in the CAD/CAM environment. The work was undertaken to modernise the existing activity of a company manufacturing thin-walled sections. The investigated areas of the activity, including the design and drafting of the finished section, the flower patterns, the 10-to-1 templates, and the rolls complete with pinch-difference surfaces, side-rolls and extension-contours, have been successfully computerised by software development. Data generated by the developed software can be further processed for roll manufacturing using NC lathes. The software has been specially designed for portability to facilitate its implementation on different computers. The Opening-Radii method of forming was introduced as a substitute for the conventional method to achieve better forming. Most of the essential aspects of roll design have been successfully incorporated in the software. With computerisation, extensive standardisation of existing roll design practices and the use of more reliable and scientifically-based methods have been achieved. Satisfactory and beneficial results have also been obtained by the company in using the software through a terminal linked to the University by a GPO line. Both lead time and productivity in roll design and manufacture have been significantly improved. It is therefore concluded that computerisation in the design of form-rolls for automation, achieved by software development, is viable. The work also demonstrated the promising nature of the CAD/CAM approach.

Relevance:

80.00%

Publisher:

Abstract:

In today's market, global competition has put manufacturing businesses under great pressure to respond rapidly to dynamic variations in demand patterns across products and changing product mixes. To achieve substantial responsiveness, the manufacturing activities associated with production planning and control must be integrated dynamically, efficiently and cost-effectively. This paper presents an iterative agent-bidding mechanism, which performs dynamic integration of process planning and production scheduling to generate optimised process plans and schedules in response to dynamic changes in the market and production environment. The iterative bidding procedure is carried out based on currency-like metrics, in which all operations (e.g. machining processes) to be performed are assigned virtual currency values, and resource agents bid for the operations if the costs incurred in performing them are lower than the currency values. The currency values are adjusted iteratively and resource agents re-bid for the operations based on the new set of currency values until the total production cost is minimised. A simulated annealing optimisation technique is employed to optimise the currency values iteratively. The feasibility of the proposed methodology has been validated using a test case, and the results obtained show that the method outperforms non-agent-based methods.
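The bidding loop can be caricatured as follows: the currency value for an operation is lowered until only the cheapest resource agent still finds it profitable to bid. This toy version uses a fixed decrement in place of the paper's simulated-annealing adjustment; the operations, agents and costs are illustrative, not from the paper:

```python
# Hedged sketch of iterative currency-based bidding: agents bid for an
# operation while their cost does not exceed its currency value, and the
# currency is lowered until a single (cheapest) bidder remains.
def award_operation(costs, start_currency=100.0, step=1.0):
    """costs: {agent: cost}. Returns (winning_agent, final_currency).
    Assumes a unique cheapest agent; ties would need a tie-break rule."""
    currency = start_currency
    while True:
        bidders = [a for a, c in costs.items() if c <= currency]
        if len(bidders) == 1:
            return bidders[0], currency
        currency -= step

ops = {"milling": {"M1": 40.0, "M2": 55.0},
       "drilling": {"M1": 30.0, "M2": 25.0}}
schedule = {op: award_operation(costs) for op, costs in ops.items()}
print(schedule)  # → {'milling': ('M1', 54.0), 'drilling': ('M2', 29.0)}
```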

Relevance:

80.00%

Publisher:

Abstract:

The field of free radical biology and medicine continues to move at a tremendous pace, with a constant flow of ground-breaking discoveries. The following collection of papers in this issue of Biochemical Society Transactions highlights several key areas of topical interest, including the crucial role of validated measurements of radicals and reactive oxygen species in underpinning nearly all research in the field, the important advances being made as a result of the overlap of free radical research with the reinvigorated field of lipidomics (driven in part by innovations in MS-based analysis), the acceleration of new insights into the role of oxidative protein modifications (particularly to cysteine residues) in modulating cell signalling, and the effects of free radicals on the functions of mitochondria, extracellular matrix and the immune system. In the present article, we provide a brief overview of these research areas, but, throughout this discussion, it must be remembered that it is the availability of reliable analytical methodologies that will be a key factor in facilitating continuing developments in this exciting research area.

Relevance:

80.00%

Publisher:

Abstract:

Signal transduction pathways control cell fate, survival and function. They are organized as intricate biochemical networks which enable biochemical protein activities, crosstalk and subcellular localization to be integrated and tuned to produce highly specific biological responses in a robust and reproducible manner. Post-translational modifications (PTMs) play major roles in regulating these processes through a wide variety of mechanisms that include changes in protein activities, interactions, and subcellular localizations. Determining and analyzing PTMs poses enormous challenges. Recent progress in mass spectrometry (MS)-based proteomics has enhanced our capability to map and identify many PTMs. Here we review the current state of proteomic PTM analysis relevant to signal transduction research, focusing on two areas: phosphorylation, which is well established as a widespread key regulator of signal transduction; and oxidative modifications, which, from being primarily viewed as protein damage, are now starting to emerge as important regulatory mechanisms.

Relevance:

80.00%

Publisher:

Abstract:

Peptides have great therapeutic potential as vaccines and drugs. Knowledge of physicochemical descriptors, including the partition coefficient logP, is useful for the development of predictive Quantitative Structure-Activity Relationships (QSARs). We have investigated the accuracy of available programs for the prediction of logP values for peptides with known experimental values obtained from the literature. Eight prediction programs were tested, of which seven were fragment-based methods: XLogP, LogKow, PLogP, ACDLogP, ALogP, Interactive Analysis's LogP (IALogP) and MLogP; and one program used a whole-molecule approach: QikProp. The predictive accuracy of the programs was assessed using r² values, with ALogP being the most effective (r² = 0.822) and MLogP the least (r² = 0.090). We also examined three distinct types of peptide structure: blocked, unblocked, and cyclic. For each study (all peptides, and the blocked, unblocked and cyclic subsets) the performance of the programs, ranked from best to worst, is as follows: all peptides - ALogP, QikProp, PLogP, XLogP, IALogP, LogKow, ACDLogP, and MLogP; blocked peptides - PLogP, XLogP, ACDLogP, IALogP, LogKow, QikProp, ALogP, and MLogP; unblocked peptides - QikProp, IALogP, ALogP, ACDLogP, MLogP, XLogP, LogKow and PLogP; cyclic peptides - LogKow, ALogP, XLogP, MLogP, QikProp, ACDLogP, IALogP. In summary, all programs gave better predictions for blocked peptides, while, in general, logP values for cyclic peptides were under-predicted and those of unblocked peptides were over-predicted.
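The r² assessment used in the study is the squared Pearson correlation between predicted and experimental logP values. A minimal sketch with illustrative values (not the paper's dataset):

```python
# Hedged sketch: squared Pearson correlation between predicted and
# experimental logP, as used to rank the prediction programs.
def r_squared(pred, exp):
    n = len(pred)
    mp, me = sum(pred) / n, sum(exp) / n
    cov = sum((p - mp) * (e - me) for p, e in zip(pred, exp))
    vp = sum((p - mp) ** 2 for p in pred)
    ve = sum((e - me) ** 2 for e in exp)
    return cov * cov / (vp * ve)

experimental = [-2.1, -1.4, -0.3, 0.8, 1.5]   # illustrative logP values
predicted    = [-1.8, -1.6, -0.1, 0.5, 1.9]
print(round(r_squared(predicted, experimental), 3))  # → 0.959
```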