995 results for Peptide mapping
Abstract:
A precise and rapid method for identifying sites of interaction between proteins was demonstrated; the basis of the method is direct mass spectrometric readout from the complex to determine the specific components of the proteins that interact, a method termed affinity-directed mass spectrometry. The strategy was used to define the region of interaction of a protein growth factor with a monoclonal antibody. A combination of proteolytic digestion and affinity-directed mass spectrometry was used to rapidly determine the approximate location of a continuous binding epitope within the growth factor. The precise boundaries of the binding epitope were determined by affinity-directed mass spectrometric analysis of sets of synthetic peptide ladders that span the approximate binding region. In addition to the mapping of such linear epitopes, affinity-directed mass spectrometry can be applied to the mapping of other types of molecule-molecule contacts, including ligand-receptor and protein-oligonucleotide interactions.
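To make the peptide-ladder step concrete, the following is a minimal sketch (not taken from the paper; the example sequence and function names are illustrative) of how a nested set of synthetic peptide truncations can be converted to monoisotopic masses for comparison against affinity-directed MS signals:

```python
# Minimal sketch: monoisotopic masses for an N-terminal truncation ladder,
# as might be compared against affinity-directed MS signals.
# The example sequence is hypothetical, not from the paper.

MONO = {  # monoisotopic residue masses (Da)
    "G": 57.02146, "A": 71.03711, "S": 87.03203, "P": 97.05276,
    "V": 99.06841, "T": 101.04768, "C": 103.00919, "L": 113.08406,
    "I": 113.08406, "N": 114.04293, "D": 115.02694, "Q": 128.05858,
    "K": 128.09496, "E": 129.04259, "M": 131.04049, "H": 137.05891,
    "F": 147.06841, "R": 156.10111, "Y": 163.06333, "W": 186.07931,
}
WATER = 18.01056  # mass of H2O added to the residue sum

def peptide_mass(seq: str) -> float:
    """Monoisotopic mass of an unmodified linear peptide."""
    return sum(MONO[a] for a in seq) + WATER

def truncation_ladder(seq: str):
    """Yield (peptide, mass) for successive N-terminal truncations."""
    for i in range(len(seq)):
        sub = seq[i:]
        yield sub, peptide_mass(sub)

if __name__ == "__main__":
    for pep, m in truncation_ladder("ACDEFGHIK"):  # hypothetical ladder
        print(f"{pep:>10s}  {m:9.4f} Da")
```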
Abstract:
The structure of the small hepatitis B virus surface antigen (HBsAg) was investigated by epitope mapping of four anti-HBsAg monoclonal antibodies (mAbs). Amino acid sequences of epitopes were derived from affinity-enrichment experiments (biopanning) using a filamentous phage peptide library. The library consists of 10^9 different clones bearing a 30-residue peptide fused to gene III. Sequence homologies between peptides obtained from panning the library against the antibodies and the native HBsAg sequence allowed for precise description of the binding regions. Three of the four mAbs were found to bind to distinct discontinuous epitopes between amino acid residues 101 and 207 of HBsAg. The fourth mAb was demonstrated to bind to residues 121-124. The sequence data are supported by ELISA assays demonstrating the binding of the HBsAg-specific peptides on filamentous phage to the mAbs. The sequence data were used to map the surface of HBsAg and to derive a topological model for the alpha-carbon trace of the 101-207 region of HBsAg. The approach should be useful for other proteins for which the crystal structure is not available but a representative set of mAbs can be obtained.
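As a rough illustration of the homology step, the sketch below (an assumed approach, not the authors' exact procedure; the sequences are placeholders, not real HBsAg or phage-display data) slides a phage-selected peptide along a target sequence and reports the window with the most positional identities:

```python
# Minimal sketch (assumed approach, not the authors' exact method):
# locate the best-matching region of a target sequence for a
# phage-selected peptide by sliding-window identity scoring.

def best_match(peptide: str, target: str):
    """Return (start_index, identities) of the window in `target`
    with the most positional identities to `peptide`."""
    best = (0, -1)
    for start in range(len(target) - len(peptide) + 1):
        window = target[start:start + len(peptide)]
        ident = sum(a == b for a, b in zip(peptide, window))
        if ident > best[1]:
            best = (start, ident)
    return best

# Placeholder sequences, not real data.
selected_peptide = "QWERTYK"
target_sequence = "MKLVQWERSYKAAGH"
start, ident = best_match(selected_peptide, target_sequence)
print(f"best window starts at residue {start + 1} with {ident} identities")
```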
Abstract:
There is a growing awareness that inflammatory diseases have an oxidative pathology, which can result in specific oxidation of amino acids within proteins. It is known that patients with inflammatory disease have higher levels of plasma protein nitro-tyrosine than healthy controls. Fibrinogen is an abundant plasma protein that is highly susceptible to such oxidative modifications, and is therefore a potential marker for oxidative protein damage. The aim of this study was to map tyrosine nitration in fibrinogen under oxidative conditions and identify susceptible residues. Fibrinogen was oxidised with 0.25 mM and 1 mM SIN-1, a peroxynitrite generator, and methionine was used to quench excess oxidant in the samples. The carbonyl assay was used to confirm oxidation in the samples; carbonyl levels were 2.3, 8.72 and 11.5 nmol/mg protein in the 0, 0.25 mM and 1 mM SIN-1 samples, respectively. The samples were run on an SDS-PAGE gel and tryptically digested before analysis by HPLC-MS/MS. All three chains of fibrinogen were observed under all treatment conditions. The overall sequence coverage for fibrinogen determined by Mascot was between 60% and 75%. The oxidised samples showed increases in oxidative modifications in both the alpha and beta chains, most commonly methionine sulfoxide and tyrosine nitration, correlating with increasing SIN-1 treatment. The most susceptible tyrosines were Tyr135 (tryptic peptide YLQEIYNSNNQK) and Tyr277 (tryptic peptide GGSTSYGTGSETESPR), but several other nitrated tyrosines were also identified with high confidence. Identification of these susceptible peptides will allow the design of sequence-specific biomarkers of oxidative and nitrative damage to plasma proteins in inflammatory conditions.
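For illustration, the following sketch (not the Mascot workflow itself; the tolerance and the example masses are assumptions) classifies an observed peptide mass against its theoretical tryptic mass by delta mass, flagging the two modifications named above:

```python
# Minimal sketch (not the Mascot search itself): classify observed peptide
# masses against theoretical tryptic masses by delta mass, flagging common
# oxidative modifications. Tolerance and input masses are illustrative.

MODS = {  # monoisotopic mass shifts (Da)
    "nitration (Tyr)": 44.98508,
    "oxidation (Met -> sulfoxide)": 15.99491,
}
PPM_TOL = 10.0

def classify(observed: float, theoretical: float):
    """Return the modification whose shift explains observed - theoretical
    within PPM_TOL, or None if no listed modification fits."""
    delta = observed - theoretical
    for name, shift in MODS.items():
        if abs(delta - shift) / theoretical * 1e6 <= PPM_TOL:
            return name
    return None

# Hypothetical example: a theoretical tryptic peptide mass and an observed
# mass shifted by ~ +44.985 Da, consistent with nitro-tyrosine.
theo = 1497.7368
obs = theo + 44.9851
print(classify(obs, theo))  # -> "nitration (Tyr)"
```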
Abstract:
Circulating low density lipoproteins (LDL) are thought to play a crucial role in the onset and development of atherosclerosis, though the detailed molecular mechanisms responsible for their biological effects remain controversial. The complexity of the biomolecules (lipids, glycans and protein) and structural features (isoforms and chemical modifications) found in LDL particles hampers a complete understanding of the mechanism underlying its atherogenicity. For this reason, screening LDL for features that discriminate a particular pathology, in search of biomarkers, is of high importance. Three major biomolecule classes (lipids, protein and glycans) in LDL particles were screened using mass spectrometry coupled to liquid chromatography. Dual-polarity screening resulted in good lipidome coverage, identifying over 300 lipid species from 12 lipid sub-classes. Multivariate analysis was used to investigate potential discriminators in the individual lipid sub-classes for different study groups (age, gender, pathology). Additionally, the high protein sequence coverage of ApoB-100 routinely achieved (≥70%) assisted in the search for protein modifications correlating with aging and pathology. The large size and complexity of the datasets required the use of chemometric methods (Partial Least Squares Discriminant Analysis, PLS-DA) for their analysis and for the identification of ions that discriminate between study groups. The peptide profile from enzymatically digested ApoB-100 can be correlated with the high structural complexity of the lipids associated with ApoB-100 using exploratory data analysis. In addition, using targeted scanning modes, glycosylation sites carrying neutral and acidic sugar residues in ApoB-100 are also being explored. Together or individually, knowledge of the profiles and modifications of the major biomolecules in LDL particles will contribute towards an in-depth understanding, will help to map the structural features that contribute to the atherogenicity of LDL, and may allow identification of reliable, pathology-specific biomarkers. This research was supported by a Marie Curie Intra-European Fellowship within the 7th European Community Framework Program (IEF 255076). The work of A. Rudnitskaya was supported by the Portuguese Science and Technology Foundation, through the European Social Fund (ESF) and "Programa Operacional Potencial Humano - POPH".
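As an illustration of the chemometric step, here is a minimal PLS-DA sketch (random stand-in data, not the study's actual pipeline) using scikit-learn's PLSRegression fitted against binary group labels over a feature matrix:

```python
# Minimal PLS-DA sketch (illustrative only, not the study's pipeline):
# PLS regression against binary group labels over a lipid/peptide feature
# matrix, then inspection of sample scores. Data are random stand-ins.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 300))      # 40 samples x 300 lipid/peptide features
y = np.repeat([0, 1], 20)           # two study groups (e.g. pathology vs control)

X_scaled = StandardScaler().fit_transform(X)
pls = PLSRegression(n_components=2).fit(X_scaled, y)

scores = pls.transform(X_scaled)    # latent-variable scores per sample
weights = pls.x_weights_            # feature contributions per component
print(scores.shape, weights.shape)  # (40, 2) (300, 2)
```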
Abstract:
For robots to operate in human environments they must be able to make their own maps, because it is unrealistic to expect a user to enter a map into the robot’s memory, existing floorplans are often incorrect, and human environments tend to change. Traditionally, robots have used sonar, infra-red or laser range finders to perform the mapping task. Digital cameras have become very cheap in recent years and have opened up new possibilities as a sensor for robot perception. Any robot that must interact with humans can reasonably be expected to have a camera for tasks such as face recognition, so it makes sense to also use the camera for navigation. Cameras have advantages over other sensors, such as colour information (not available from any other sensor), better immunity to noise (compared to sonar), and not being restricted to operating in a plane (unlike laser range finders). However, there are disadvantages too, the principal one being the effect of perspective. This research investigated ways to use a single colour camera as a range sensor to guide an autonomous robot and allow it to build a map of its environment, a process referred to as Simultaneous Localization and Mapping (SLAM). An experimental system was built using a robot controlled via a wireless network connection. Using the on-board camera as the only sensor, the robot successfully explored and mapped indoor office environments. The quality of the resulting maps is comparable to those that have been reported in the literature for sonar or infra-red sensors. Although the maps are not as accurate as ones created with a laser range finder, the solution using a camera is significantly cheaper and is more appropriate for toys and early domestic robots.
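The following is a minimal occupancy-grid sketch (illustrative only, not the implementation described in the thesis) of how range estimates, here assumed to come from the camera, can be folded into a map via log-odds updates:

```python
# Minimal occupancy-grid sketch (illustrative only, not the thesis code):
# log-odds update of grid cells along a ray from the robot to a range
# reading, as would be produced here by a camera-based range estimate.
import numpy as np

GRID = np.zeros((100, 100))   # log-odds, 0 = unknown
L_OCC, L_FREE = 0.85, -0.4    # assumed sensor-model increments
CELL = 0.05                   # metres per grid cell

def update_ray(grid, x0, y0, bearing, range_m):
    """Mark cells along the ray as free and the end cell as occupied."""
    steps = int(range_m / CELL)
    for k in range(steps):
        cx = int(x0 + k * np.cos(bearing))
        cy = int(y0 + k * np.sin(bearing))
        if 0 <= cx < grid.shape[0] and 0 <= cy < grid.shape[1]:
            grid[cx, cy] += L_FREE
    ex = int(x0 + steps * np.cos(bearing))
    ey = int(y0 + steps * np.sin(bearing))
    if 0 <= ex < grid.shape[0] and 0 <= ey < grid.shape[1]:
        grid[ex, ey] += L_OCC

update_ray(GRID, 50, 50, bearing=0.3, range_m=1.5)
prob = 1.0 - 1.0 / (1.0 + np.exp(GRID))   # convert log-odds to P(occupied)
print(prob.max())
```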
Abstract:
This chapter reports on Australian and Swedish experiences in the iterative design, development, and ongoing use of interactive educational systems we call ‘Media Maps.’ Like maps in general, Media Maps are usefully understood as complex cultural technologies; that is, they are not only physical objects, tools and artefacts, but also information creation and distribution technologies, the use and development of which are embedded in systems of knowledge and social meaning. Drawing upon Australian and Swedish experiences with one Media Map technology, this chapter illustrates this three-layered approach to the development of media mapping. It shows how media mapping is being used to create authentic learning experiences for students preparing for work in the rapidly evolving media and communication industries. We also contextualise media mapping as a response to various challenges for curriculum and learning design in Media and Communication Studies that arise from shifts in tertiary education policy in a global knowledge economy.
Abstract:
As regulators, governments are often criticised for over-regulating industries. This research project examines the regulation affecting the construction industry in a federal system of government. It uses a case study of the Australian system of government to focus on the implications of regulation in the construction industry. Having established the extent of the regulatory environment, the research project considers the costs associated with this environment and then evaluates ways in which the regulatory burden on industry can be reduced. The Construction Industry Business Environment project is working with industry and government agencies to improve regulatory harmonisation in Australia, and thereby reduce the regulatory burden on industry. It is found that, while taxation and compliance costs are not likely to be reduced in the short term, the costs arising from having to adapt to variation between regulatory regimes in a federal system of government offer the most promising avenue for reducing regulatory costs. Identifying and reducing these adaptive costs across jurisdictions is argued to present a novel approach to regulatory reform.
Abstract:
There are currently a number of issues of great importance affecting universities and the way in which their programs are offered. Many of these issues are largely being driven top-down and have an impact at both the university-wide and the individual discipline level. This paper provides a brief history of cartography and digital mapping education at the Queensland University of Technology (QUT). It also provides an overview of what curriculum mapping is and presents some interesting findings from the program review process. Further, this review process has triggered discussion and action on the review, mapping and embedding of graduate attributes within the spatial science major program. Some form of practice-based learning is expected in vocationally oriented degrees that lead to professional accreditation, and is generally regarded as good learning exposure. With the restructure of academic programs across the Faculty of Built Environment and Engineering in 2006, spatial science and surveying students now undertake a formal work integrated learning unit. There is little doubt that students acquire the skills of their discipline (mapping science, spatial) by being immersed in the industry culture: learning how to process information and solve real-world problems within context. The broad theme of where geo-spatial mapping skills are embedded in this broad-based tertiary education course is examined, with some focused discussion on the learning objectives, outcomes and examples of some student learning experiences.
Abstract:
Network crawling and visualisation tools and other datamining systems are now advanced enough to provide significant new impulses to the study of cultural activity on the Web. A growing range of studies focus on communicative processes in the blogosphere – including, for example, Adamic & Glance’s 2005 map of political allegiances during the 2004 U.S. presidential election and Kelly & Etling’s 2008 study of blogging practices in Iran. However, a number of significant shortcomings remain in the application of such tools and methodologies to the study of blogging; these relate both to how the content of blogs is analysed and to how the network maps resulting from such studies are understood. Our project highlights and addresses these shortcomings.
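For context, the network maps in question ultimately rest on a hyperlink graph; a minimal sketch (made-up blog names and links, not data from the cited studies) using networkx:

```python
# Minimal sketch (illustrative): build a directed blog-link network and
# rank blogs by in-degree, the kind of structure underlying blogosphere
# network maps. Blog names and links are made up.
import networkx as nx

links = [  # (source blog, target blog) hyperlinks, hypothetical
    ("blogA", "blogB"), ("blogA", "blogC"),
    ("blogB", "blogC"), ("blogD", "blogC"),
]
G = nx.DiGraph()
G.add_edges_from(links)

by_indegree = sorted(G.in_degree(), key=lambda kv: kv[1], reverse=True)
print(by_indegree)  # e.g. [('blogC', 3), ('blogB', 1), ...]
```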
Abstract:
The last three decades have seen consumers’ environmental consciousness grow as the environment has moved into the mainstream. Results from our study of green marketing blog-site comments in the first half of 2009 identify thirteen prominent concepts: carbon, consumers, global and energy were the largest themes, while crisis, power, people, water, fuel, product, work, time, organic, content and interest were the others. However, sub-issues were also identified, as the driving force behind this information is consumer-led social networks. While marketers hold some power, consumers are the key influence for change. They want to drive change and, importantly, they have the power to do so. Power to the people.
Abstract:
The paper analyses the expected value of OD volumes from probe data with a fixed error, an error proportional to zone size, and an error inversely proportional to zone size. To add realism to the analysis, real trip ODs in the Tokyo Metropolitan Region are synthesised. The results show that for small zone coding, with an average radius of 1.1 km, and a fixed measurement error of 100 m, an accuracy of 70% can be expected. The equivalent accuracy for medium zone coding, with an average radius of 5 km, would translate into a fixed error of approximately 300 m. As expected, small zone coding is more sensitive than medium zone coding, as the chances of the probe error envelope falling into adjacent zones are higher. For the same error radii, an error proportional to zone size would deliver a higher level of accuracy. As over half (54.8%) of trip ends start or end in zones with an equivalent radius of ≤1.2 km, and only 13% of trip ends occur in zones with an equivalent radius of ≥2.5 km, a measurement error that is proportional to zone size, such as that of a mobile phone, would deliver a higher level of accuracy. The synthesis of real ODs with different probe error characteristics has shown that an expected value of >85% is difficult to achieve for small zone coding with an average radius of 1.1 km. For most transport applications, an OD matrix at medium zone coding is sufficient for transport management. From this study it can be concluded that GPS, with an error range between 2 and 5 m, would provide OD estimates at medium zone coding (average radius of 5 km) greater than 90% of the expected value. However, for a typical mobile phone operating error range, the expected value at medium zone coding would be lower than 85%. This paper assumes the transmission of one origin and one destination position from the probe. However, if multiple positions within the origin and destination zones are transmitted, map matching to the transport network could be performed, which would greatly improve the accuracy of the probe data.
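As a rough illustration of the error-envelope reasoning (not the paper's synthesis, which used real trip ends and actual zone geometry, so the numbers will differ), the following Monte Carlo sketch estimates the chance that a probe-reported trip end stays inside an idealised circular zone for a given fixed error radius:

```python
# Minimal Monte Carlo sketch (illustrative, not the paper's synthesis):
# probability that a probe-reported trip end stays inside its true
# circular zone for a fixed measurement-error radius. Zones are
# idealised as circles of the stated equivalent radius.
import numpy as np

rng = np.random.default_rng(1)

def zone_accuracy(zone_radius_m, error_m, n=100_000):
    # true trip ends, uniform over the circular zone
    r = zone_radius_m * np.sqrt(rng.random(n))
    th = 2 * np.pi * rng.random(n)
    x, y = r * np.cos(th), r * np.sin(th)
    # measurement error of fixed magnitude in a random direction
    phi = 2 * np.pi * rng.random(n)
    xm, ym = x + error_m * np.cos(phi), y + error_m * np.sin(phi)
    return np.mean(np.hypot(xm, ym) <= zone_radius_m)

print(zone_accuracy(1100, 100))   # small zone coding, fixed 100 m error
print(zone_accuracy(5000, 300))   # medium zone coding, ~300 m error
```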
Abstract:
The CDIO (Conceive-Design-Implement-Operate) Initiative has been globally recognised as an enabler for engineering education reform. With the CDIO process, the CDIO Standards and the CDIO Syllabus, many scholarly contributions have been made around cultural change, curriculum reform and learning environments. In the Australasian region, reform is gaining significant momentum within the engineering education community, the profession, and higher education institutions. This paper presents the CDIO Syllabus cast into the Australian context by mapping it to the Engineers Australia Graduate Attributes, the Washington Accord Graduate Attributes and the Queensland University of Technology Graduate Capabilities. Furthermore, in recognition that many secondary schools and technical training institutions offer introductory engineering technology subjects, this paper presents an extended self-rating framework suited to recognising developing levels of proficiency at a preparatory level. A demonstrator mapping tool has been created to illustrate the application of this extended graduate attribute mapping framework as a precursor to an integrated curriculum information model.
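A minimal sketch of what such a mapping structure might look like (the item names, attribute codes and proficiency levels below are illustrative assumptions, not data from the actual tool):

```python
# Minimal sketch (hypothetical structure, not the demonstrator tool itself):
# map CDIO Syllabus items to accreditation attribute codes with a
# self-rated proficiency level, then aggregate coverage per attribute.
from collections import defaultdict

# proficiency scale assumed for illustration: 0 = preparatory ... 5 = lead
mapping = [
    # (CDIO syllabus item, attribute code, proficiency level) - illustrative
    ("2.1 Analytical Reasoning and Problem Solving", "EA 2.1", 3),
    ("2.1 Analytical Reasoning and Problem Solving", "QUT GC 1", 3),
    ("3.2 Communications", "EA 3.2", 2),
]

coverage = defaultdict(list)
for syllabus_item, attribute, level in mapping:
    coverage[attribute].append((syllabus_item, level))

for attribute, items in coverage.items():
    top = max(level for _, level in items)
    print(f"{attribute}: {len(items)} syllabus item(s), max level {top}")
```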