873 results for vision-based performance analysis


Relevance:

100.00%

Publisher:

Abstract:

The cupin superfamily is a group of functionally diverse proteins that are found in all three kingdoms of life: Archaea, Eubacteria, and Eukaryota. These proteins have a characteristic signature domain comprising two histidine-containing motifs separated by an intermotif region of variable length. This domain consists of six beta strands within a conserved beta-barrel structure. Most cupins, such as microbial phosphomannose isomerases (PMIs), AraC-type transcriptional regulators, and cereal oxalate oxidases (OXOs), contain only a single domain, whereas others, such as seed storage proteins and oxalate decarboxylases (OXDCs), are bi-cupins with two pairs of motifs. Although some cupins have known functions and have been characterized at the biochemical level, the majority are known only from gene cloning or sequencing projects. In this study, phylogenetic analyses were conducted on the conserved domain to investigate the evolution and structure/function relationships of cupins, with an emphasis on single-domain plant germin-like proteins (GLPs). An unrooted phylogeny of cupins from a wide spectrum of evolutionary lineages identified three main clusters: microbial PMIs, OXDCs, and plant GLPs. The sister group to the plant GLPs in the global analysis was then used to root a phylogeny of all available plant GLPs. The resulting phylogeny contained three main clades, classifying the GLPs into distinct subfamilies. It is suggested that these subfamilies correlate with functional categories, one of which contains the bifunctional barley germin that has both OXO and superoxide dismutase (SOD) activity. It is proposed that GLPs function primarily as SODs, enzymes that protect plants from the effects of oxidative stress. Closer inspection of the DNA sequence encoding the intermotif region in plant GLPs showed global conservation of thymine in the second codon position, a character associated with hydrophobic residues. Since many of these proteins are multimeric and enzymatically inactive in their monomeric state, this conservation of hydrophobicity is thought to be associated with the need to maintain the various monomer-monomer interactions. The type of structure-based predictive analysis presented in this paper is an important approach for understanding gene function and evolution in an era when genomes from a wide range of organisms are being sequenced at a rapid rate.
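The second-codon-position observation above follows directly from the genetic code: NTN codons (thymine in the middle position) encode the hydrophobic residues Phe, Leu, Ile, Met and Val. A minimal sketch of such a tally, using a short hypothetical coding sequence rather than real GLP data:

```python
# Sketch: tally second-codon-position bases in a coding sequence, as in
# the GLP intermotif analysis described above. The demo sequence is a
# hypothetical example, not a real GLP gene.
from collections import Counter

def second_position_counts(cds: str) -> Counter:
    """Count nucleotides at the second position of each complete codon."""
    usable = len(cds) - len(cds) % 3
    codons = [cds[i:i + 3] for i in range(0, usable, 3)]
    return Counter(codon[1] for codon in codons)

# Phe-Leu-Ile-Val-Met: every codon is NTN, so all second positions are T.
demo = "TTTCTGATTGTGATG"
counts = second_position_counts(demo)
print(counts["T"] / sum(counts.values()))  # 1.0 for this all-hydrophobic example
```

A T-rich second position across aligned sequences is therefore a simple signature of a conserved hydrophobic stretch.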

Relevance:

100.00%

Publisher:

Abstract:

Statistical approaches have been applied to examine amino acid pairing preferences within parallel beta-sheets. The main chain hydrogen bonding pattern in parallel beta-sheets means that, for each residue pair, only one of the residues is involved in main chain hydrogen bonding with the strand containing the partner residue. We call this the hydrogen bonded (HB) residue and the partner residue the non-hydrogen bonded (nHB) residue, and differentiate between the favourability of a pair and that of its reverse pair, e.g. Asn(HB)-Thr(nHB) versus Thr(HB)-Asn(nHB). Significantly (p ≤ 0.000001) favoured pairings were rationalised using stereochemical arguments. For instance, Asn(HB)-Thr(nHB) and Arg(HB)-Thr(nHB) were favoured pairs, where the residues adopted favoured chi1 rotamer positions that allowed side-chain interactions to occur. In contrast, Thr(HB)-Asn(nHB) and Thr(HB)-Arg(nHB) were not significantly favoured, and could only form side-chain interactions if the residues involved adopted less favourable chi1 conformations. The favourability of hydrophobic pairs, e.g. Ile(HB)-Ile(nHB), Val(HB)-Val(nHB) and Leu(HB)-Ile(nHB), was explained by the residues adopting their most preferred chi1 and chi2 conformations, which enabled them to form nested arrangements. Cysteine-cysteine pairs are significantly favoured, although these do not form intrasheet disulphide bridges. Interactions between positively and negatively charged residues were asymmetrically preferred: those with the negatively charged residue at the HB position were more favoured. This trend was accounted for by the presence of general electrostatic interactions, which, based on analysis of distances between charged atoms, were likely to be stronger when the negatively charged residue is the HB partner. The Arg(HB)-Asp(nHB) interaction was an exception to this trend and its favourability was rationalised by the formation of specific side-chain interactions. This research provides rules that could be applied to protein structure prediction, comparative modelling and protein engineering and design. The methods used to analyse the pairing preferences are automated and detailed results are available (http://www.rubic.rdg.ac.uk/betapairprefsparallel/).
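A pairing preference of the kind analysed above is commonly scored as a log-odds ratio of observed to expected pair frequencies, keeping the (HB, nHB) order so that a pair and its reverse are scored separately. A sketch under that assumption, with invented toy counts rather than the study's observed parallel-sheet data:

```python
# Sketch: log-odds propensity for ordered (HB, nHB) residue pairs, in the
# spirit of the statistical analysis above. The toy counts are invented
# for illustration; the real study used observed parallel-sheet pairs.
import math
from collections import Counter

def pair_propensity(pairs, hb, nhb):
    """ln(observed / expected) for the ordered pair (hb, nhb).
    pairs: list of (HB_residue, nHB_residue) tuples."""
    n = len(pairs)
    pair_counts = Counter(pairs)
    hb_counts = Counter(p[0] for p in pairs)
    nhb_counts = Counter(p[1] for p in pairs)
    observed = pair_counts[(hb, nhb)] / n
    expected = (hb_counts[hb] / n) * (nhb_counts[nhb] / n)
    return math.log(observed / expected)

# Toy data where N(HB)-T(nHB) is enriched relative to its reverse pair;
# order matters, so ('N','T') and ('T','N') score differently.
toy = ([('N', 'T')] * 6 + [('T', 'N')] * 2 + [('T', 'I')] * 4 +
       [('I', 'N')] * 4 + [('I', 'I')] * 4 + [('N', 'I')] * 2)
print(pair_propensity(toy, 'N', 'T') > pair_propensity(toy, 'T', 'N'))  # True
```

Positive values mark favoured pairings; the asymmetry between a pair and its reverse is exactly what the stereochemical arguments above rationalise.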


Relevance:

100.00%

Publisher:

Abstract:

The elaC gene of Escherichia coli encodes a binuclear zinc phosphodiesterase (ZiPD). ZiPD homologs from various species act as 3' tRNA processing endoribonucleases, and although the homologous gene in Bacillus subtilis is essential for viability [EMBO J. 22 (2003) 4534], the physiological function of E. coli ZiPD has remained enigmatic. To investigate the function of E. coli ZiPD, we generated and characterized an E. coli elaC deletion mutant. Surprisingly, the E. coli elaC deletion mutant was viable and had wild-type-like growth properties. Microarray-based transcriptional analysis indicated expression of the E. coli elaC gene at basal levels during aerobic growth. The elaC gene deletion had no effect on the expression of genes coding for RNases or aminoacyl-tRNA synthetases, or any other gene among a total of > 1300 genes probed. 2D-PAGE analysis showed that the elaC mutation, likewise, had no effect on the proteome. These results strengthen doubts about the involvement of E. coli ZiPD in tRNA maturation and suggest functional diversity within the ZiPD/ElaC1 protein family. In addition to these unexpected features of the E. coli elaC deletion mutant, a sequence comparison of ZiPD (ElaC1) proteins revealed regions specific to either enterobacterial or mammalian ZiPD (ElaC1) proteins.

Relevance:

100.00%

Publisher:

Abstract:

Organisms generally respond to iron deficiency by increasing their capacity to take up iron and by consuming intracellular iron stores. Escherichia coli, in which iron metabolism is particularly well understood, contains at least 7 iron-acquisition systems encoded by 35 iron-repressed genes. This Fe-dependent repression is mediated by a transcriptional repressor, Fur (ferric uptake regulation), which also controls genes involved in other processes such as iron storage, the tricarboxylic acid cycle, pathogenicity, and redox-stress resistance. Our macroarray-based global analysis of iron- and Fur-dependent gene expression in E. coli has revealed several novel Fur-repressed genes likely to specify at least three additional iron-transport pathways. Interestingly, a large group of energy metabolism genes was found to be iron and Fur induced. Many of these genes encode iron-rich respiratory complexes. This iron- and Fur-dependent regulation appears to represent a novel iron-homeostatic mechanism whereby the synthesis of many iron-containing proteins is repressed under iron-restricted conditions. This mechanism thus accounts for the low iron contents of fur mutants and explains how E. coli can modulate its iron requirements. Analysis of Fe-55-labeled E. coli proteins revealed a marked decrease in iron-protein composition for the fur mutant, and visible and EPR spectroscopy showed major reductions in cytochrome b and d levels, and in iron-sulfur cluster contents for the chelator-treated wild-type and/or fur mutant, correlating well with the array and quantitative RT-PCR data. In combination, the results provide compelling evidence for the regulation of intracellular iron consumption by the Fe2+-Fur complex.

Relevance:

100.00%

Publisher:

Abstract:

Performance analysis has been used for many applications, including providing feedback to coaches and players, media applications, scoring of sports performance, and scientific research into sports performance. The current study used performance analysis to generate knowledge about the demands of netball competition, which was then used in the development of a Netball Specific Fitness Test (NSFT). A modified version of the Bloomfield movement classification was used to provide a detailed analysis of player movement during netball competition, and this informed a needs analysis when proposing the structure of the NSFT. A series of pilot versions was tested during an evolutionary prototyping process that resulted in the final version of the NSFT, which was found to be representative of movement in netball competition and distinguished between recreational club players and players of university first-team level or above. The test is incremental and involves forward, backward and sideways movement, jumping, lunging, turning and choice reaction.

Relevance:

100.00%

Publisher:

Abstract:

The question of what Monte Carlo methods can and cannot do efficiently is discussed for some functional spaces that define the regularity of the input data. Data classes important for practical computations are considered: classes of functions with bounded derivatives and Hölder-type conditions, as well as Korobov-like spaces. A theoretical performance analysis of some algorithms with an unimprovable rate of convergence is given. Estimates of the computational complexity of two classes of algorithms, deterministic and randomized, are presented for two problems: numerical multidimensional integration and the calculation of linear functionals of the solution of a class of integral equations.
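The benchmark against which such complexity estimates are made is plain Monte Carlo integration, whose stochastic error decays like N^(-1/2) regardless of dimension. A minimal sketch, with an illustrative integrand and dimension not taken from the paper:

```python
# Sketch: plain Monte Carlo estimation of a multidimensional integral,
# illustrating the O(N^(-1/2)) stochastic error rate discussed above.
# The integrand and dimension are illustrative choices.
import math
import random

def mc_integrate(f, dim, n, rng):
    """Plain Monte Carlo estimate of the integral of f over [0,1]^dim."""
    return sum(f([rng.random() for _ in range(dim)]) for _ in range(n)) / n

rng = random.Random(42)
f = lambda x: math.prod(x)      # exact integral over [0,1]^5 is 2**-5
exact = 0.5 ** 5
for n in (1_000, 16_000):
    err = abs(mc_integrate(f, 5, n, rng) - exact)
    print(n, err)               # error shrinks roughly like n**-0.5
```

Deterministic rules, by contrast, have rates that degrade with dimension for these smoothness classes, which is the trade-off the complexity estimates quantify.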

Relevance:

100.00%

Publisher:

Abstract:

In this paper we deal with the performance analysis of Monte Carlo algorithms for large linear algebra problems. We consider the applicability and efficiency of Markov chain Monte Carlo for large problems, i.e., problems involving matrices with between one million and one billion non-zero elements. We concentrate on the analysis of the almost optimal Monte Carlo (MAO) algorithm for evaluating bilinear forms of matrix powers, since these form the so-called Krylov subspaces. Results are presented comparing the performance of the robust and non-robust Monte Carlo algorithms. The algorithms are tested on large dense matrices as well as on large unstructured sparse matrices.
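The quantity targeted above, a bilinear form h^T A^k f of a matrix power, is classically estimated by averaging weighted random walks whose initial and transition probabilities are proportional to |h_i| and |a_ij| (the "almost optimal" densities). The sketch below is that generic importance-sampling scheme, not the paper's exact MAO formulation or its robust variant, and the matrix and vectors are toys:

```python
# Sketch: Markov chain Monte Carlo estimate of the bilinear form h^T A^k f,
# the kind of Krylov-subspace quantity discussed above. Generic scheme with
# toy data; probabilities proportional to |h_i| and |a_ij|.
import random

def mc_bilinear_form(A, h, f, k, n_chains=20000, rng=None):
    """Average of weighted k-step Markov-chain walks estimating h^T A^k f."""
    rng = rng or random.Random(1)
    n = len(A)
    p0 = [abs(v) for v in h]
    s0 = sum(p0)
    total = 0.0
    for _ in range(n_chains):
        i = rng.choices(range(n), weights=p0)[0]
        w = h[i] / (p0[i] / s0)          # importance weight for the start state
        for _ in range(k):
            row = A[i]
            pr = [abs(v) for v in row]
            sr = sum(pr)
            j = rng.choices(range(n), weights=pr)[0]
            w *= row[j] / (pr[j] / sr)   # weight update along the transition
            i = j
        total += w * f[i]
    return total / n_chains

A = [[0.5, 0.2], [0.1, 0.4]]
h = [1.0, 1.0]
f = [1.0, 2.0]
print(mc_bilinear_form(A, h, f, k=2))    # exact value of h^T A^2 f is 1.08
```

Each chain costs O(k) row samples regardless of matrix size, which is why the approach scales to the matrix sizes quoted above.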

Relevance:

100.00%

Publisher:

Abstract:

Background: There is general agreement across all interested parties that a process of working together is the best way to determine which school or educational setting is right for an individual child with autism spectrum disorder. In the UK, families and local authorities both desire a constructive working relationship and see this as the best means by which to reach an agreement to determine where a child should be educated. It has been shown in published work (Batten and colleagues, Make schools make sense. Autism and education: the reality for families today; London: The National Autistic Society, 2006) that a constructive working relationship is not always achieved. Purpose: This small-scale study aims to explore the views of both parents and local authorities, focussing on how both parties perceive and experience the process of determining educational provision for children with autism spectrum disorders (ASD) within an English context. Sample, design and method: Parental opinion was gathered through a questionnaire with closed and open responses. The questionnaire was distributed to two national charities, two local charities and 16 specialist schools, which offered it to parents of children with ASD, resulting in an opportunity sample of 738 returned surveys. The views of local authority personnel from five local authorities were gathered through semi-structured interviews. Data analyses included quantitative analysis of the closed-response questionnaire items, and theme-based qualitative analysis of the open responses and interviews with local authority personnel. Results: In the majority of cases, parents in the survey obtained their first-choice placement for their child. Despite this positive outcome, survey data indicated that parents found the process bureaucratic, stressful and time-consuming. Parents tended to perceive alternative placement suggestions as financially motivated rather than in the best interests of the child. Interviews with local authority personnel showed an awareness of these concerns and of the complex considerations involved in determining what is best for an individual child. Conclusions: This small-scale study highlights the need for more effective communication between parents of children with ASD and local authority personnel at all stages of the process.

Relevance:

100.00%

Publisher:

Abstract:

Searching for the optimum tap-length that best balances the complexity and steady-state performance of an adaptive filter has attracted attention recently. Among existing algorithms that can be found in the literature, two of which, namely the segmented filter (SF) and gradient descent (GD) algorithms, are of particular interest as they can search for the optimum tap-length quickly. In this paper, at first, we carefully compare the SF and GD algorithms and show that the two algorithms are equivalent in performance under some constraints, but each has advantages/disadvantages relative to the other. Then, we propose an improved variable tap-length algorithm using the concept of the pseudo fractional tap-length (FT). Updating the tap-length with instantaneous errors in a style similar to that used in the stochastic gradient [or least mean squares (LMS)] algorithm, the proposed FT algorithm not only retains the advantages from both the SF and the GD algorithms but also has significantly less complexity than existing algorithms. Both performance analysis and numerical simulations are given to verify the new proposed algorithm.
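The core idea above can be sketched as follows: a real-valued (pseudo fractional) tap-length leaks downward each iteration and is pushed up whenever the full filter clearly outperforms a version truncated by Delta taps. This is a simplified illustration of the fractional tap-length concept, not the paper's exact update; parameter names and values (mu, alpha, gamma, delta) and the system-identification setup are illustrative assumptions:

```python
# Sketch: variable tap-length LMS with a pseudo fractional tap-length lf.
# Simplified illustration of the FT idea; parameters are not the paper's.
import random

def ft_lms(x, d, mu=0.01, alpha=0.01, gamma=2.0, delta=4, l0=4, l_max=64):
    """Returns the weights, the tap-length history, and the error history."""
    w = [0.0] * l_max
    lf, L = float(l0), l0
    lengths, errors = [], []
    for n in range(len(x)):
        u = [x[n - i] if n - i >= 0 else 0.0 for i in range(l_max)]
        y_full = sum(w[i] * u[i] for i in range(L))
        y_trunc = sum(w[i] * u[i] for i in range(max(L - delta, 1)))
        e_full = d[n] - y_full
        e_trunc = d[n] - y_trunc
        for i in range(L):                     # LMS update on the active taps
            w[i] += mu * e_full * u[i]
        # Fractional tap-length: leak down by alpha, grow when the longer
        # filter clearly beats its truncated-by-delta version.
        lf = min(max(lf - alpha + gamma * (e_trunc ** 2 - e_full ** 2), 1.0), l_max)
        L = max(round(lf), 1)
        lengths.append(L)
        errors.append(e_full)
    return w, lengths, errors

# Hypothetical setup: identify an unknown 8-tap FIR system from white noise.
rng = random.Random(0)
sys = [1.0, 0.5, -0.3, 0.2, 0.1, 0.05, 0.02, 0.01]
x = [rng.gauss(0, 1) for _ in range(4000)]
d = [sum(sys[i] * (x[n - i] if n - i >= 0 else 0.0) for i in range(len(sys)))
     for n in range(len(x))]
w, lengths, errors = ft_lms(x, d)
```

Because the tap-length update uses instantaneous squared errors, it adds only O(Delta) work per sample on top of the LMS update, which is the complexity advantage claimed above.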

Relevance:

100.00%

Publisher:

Abstract:

This article presents findings and seeks to establish the theoretical markers that indicate the growing importance of fact-based drama in screen and theatre performance to the wider Anglophone culture. During the final decade of the twentieth century and the opening one of the twenty-first, television docudrama and documentary theatre have grown in visibility and importance in the UK, providing key responses to social, cultural and political change over the millennial period. Actors were the prime focus for the enquiry, principally because so little research has been done into the special demands that fact-based performance makes on them. The main emphasis in actor training (in the UK at any rate) is, as it always has been, on preparation for fictional drama. Preparation in acting schools is also heavily geared towards stage performance. Our thesis was that performers called upon to play the roles of real people, in whatever medium, have added responsibilities both towards history and towards real individuals and their families. Actors must engage with ethical questions whether they like it or not, and we found them keenly aware of this. In the course of the research, we conducted 30 interviews with a selection of actors ranging from the experienced to the recently trained. We also interviewed a few industry professionals and actor trainers. Once the interviews started it was clear that actors themselves made little or no distinction between how they set about their work for television and film. The essential disciplines for work in front of the camera, they told us, are the same whether the camera is electronic or photographic. Some adjustments become necessary, of course, in the multi-camera TV studio. But much serious drama for the screen is made on film anyway. We found it was also the case that young actors now tend to get their first paid employment before a camera rather than on a stage.

The screen-before-stage tendency, along with the fundamental re-shaping that has gone on in the British theatre since at least the early 1980s, had implications for actor training. We have also found that theatre work still tends to be most valued by actors. For all the actors we interviewed, theatre was what they liked doing best because it was there they could practise and develop their skills, there they could work most collectively towards performance, and there they could more directly experience audience feedback in the real time of the stage play. The current world of television has been especially constrained in regard to rehearsal time in comparison to theatre (and, to a lesser extent, film). This has also affected actors’ valuation of their work. Theatre is, and is not, the most important medium in which they find work. Theatre is most important spiritually and intellectually, because theatre work is collaborative, intensive, and involving; theatre is not as important in financial and career terms, because it is not as lucrative and not as visible to a large public as acting for the screen. Many actors took the view that, for all the industrial differences that do affect them and inevitably interest the academic, acting for the visible media of theatre, film and television involved fundamentally the same process with slightly different emphases.

Relevance:

100.00%

Publisher:

Abstract:

We propose a new satellite mission to deliver high quality measurements of upper air water vapour. The concept centres around a LiDAR in limb sounding by occultation geometry, designed to operate as a very long path system for differential absorption measurements. We present a preliminary performance analysis with a system sized to send 75 mJ pulses at 25 Hz at four wavelengths close to 935 nm, to up to 5 microsatellites in a counter-rotating orbit, carrying retroreflectors characterized by a reflected beam divergence of roughly twice the emitted laser beam divergence of 15 µrad. This provides water vapour profiles with a vertical sampling of 110 m; preliminary calculations suggest that the system could detect concentrations of less than 5 ppm. A secondary payload of a fairly conventional medium resolution multispectral radiometer allows wide-swath cloud and aerosol imaging. The total weight and power of the system are estimated at 3 tons and 2,700 W respectively. This novel concept presents significant challenges, including the performance of the lasers in space, the tracking between the main spacecraft and the retroreflectors, the refractive effects of turbulence, and the design of the telescopes to achieve a high signal-to-noise ratio for the high precision measurements. The mission concept was conceived at the Alpbach Summer School 2010.
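The retrieval principle behind the four-wavelength design above is differential absorption: comparing received power at an on-line and an off-line wavelength gives the mean absorber density along the path. A sketch of that relation under simplifying assumptions (one-way path, identical instrument response at both wavelengths); all numerical values are illustrative, not mission parameters:

```python
# Sketch: the differential-absorption (on/off line) principle behind the
# proposed water-vapour measurement. Values are illustrative only.
import math

def dial_number_density(p_on, p_off, sigma_on, sigma_off, path_m):
    """Mean absorber number density (m^-3) over a one-way path of length
    path_m, from on-/off-line received powers and absorption cross
    sections (m^2), assuming identical instrument response at both
    wavelengths."""
    return math.log(p_off / p_on) / ((sigma_on - sigma_off) * path_m)

# Illustrative: 1e-24 m^2 differential cross section, 300 km path,
# 10% extra on-line attenuation.
n = dial_number_density(p_on=0.9, p_off=1.0, sigma_on=1.1e-24,
                        sigma_off=0.1e-24, path_m=3e5)
print(f"{n:.3e} per m^3")
```

The very long occultation path is what makes small mixing ratios detectable: the differential optical depth, and hence the measurable power ratio, grows linearly with path length.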

Relevance:

100.00%

Publisher:

Abstract:

Progressive telomere shortening from cell division (replicative aging) provides a barrier for human tumor progression. This program is not conserved in laboratory mice, which have longer telomeres and constitutive telomerase. Wild species that do/do not use replicative aging have been reported, but the evolution of the different phenotypes and a conceptual framework for understanding their uses of telomeres are lacking. We examined telomeres/telomerase in cultured cells from > 60 mammalian species to place the different uses of telomeres in a broad mammalian context. Phylogeny-based statistical analysis reconstructed ancestral states. Our analysis suggested that the ancestral mammalian phenotype included short telomeres (< 20 kb, as we now see in humans) and repressed telomerase. We argue that the repressed telomerase was a response to a higher mutation load brought on by the evolution of homeothermy. With telomerase repressed, we then see the evolution of replicative aging. Telomere length inversely correlated with lifespan, while telomerase expression co-evolved with body size. Multiple times, smaller, shorter-lived species independently changed to having longer telomeres and expressing telomerase. Trade-offs involving reducing the energetic/cellular costs of specific oxidative protection mechanisms (needed to protect < 20 kb telomeres in the absence of telomerase) could explain this abandonment of replicative aging. These observations provide a conceptual framework for understanding the different uses of telomeres in mammals, support a role for human-like telomeres in allowing longer lifespans to evolve, demonstrate the need to include telomere length in the analysis of comparative studies of oxidative protection in the biology of aging, and identify which mammals can be used as appropriate model organisms for the study of the role of telomeres in human cancer and aging. Key words: evolution of telomeres; immortalization; telomerase; replicative aging; senescence.

Relevance:

100.00%

Publisher:

Abstract:

With the introduction of new observing systems based on asynoptic observations, the analysis problem has changed in character. In the near future we may expect that a considerable part of meteorological observations will be unevenly distributed in four dimensions, i.e. three dimensions in space and one in time. The term analysis, or objective analysis in meteorology, means the process of interpolating observed meteorological observations from unevenly distributed locations to a network of regularly spaced grid points. Necessitated by the requirement of numerical weather prediction models to solve the governing finite difference equations on such a grid lattice, the objective analysis is a three-dimensional (or mostly two-dimensional) interpolation technique. As a consequence of the structure of the conventional synoptic network, with separated data-sparse and data-dense areas, four-dimensional analysis has in fact been intensively used for many years. Weather services have thus based their analysis not only on synoptic data at the time of the analysis and climatology, but also on the fields predicted from the previous observation hour and valid at the time of the analysis. The inclusion of the time dimension in objective analysis will be called four-dimensional data assimilation. From one point of view it seems possible to apply the conventional technique to the new data sources by simply reducing the time interval in the analysis-forecasting cycle. This could in fact be justified also for the conventional observations. We have a fairly good coverage of surface observations 8 times a day, and several upper air stations are making radiosonde and radiowind observations 4 times a day. If we have a 3-hour step in the analysis-forecasting cycle instead of 12 hours, which is applied most often, we may without any difficulty treat all observations as synoptic. No observation would thus be more than 90 minutes off time, and the observations even during strong transient motion would fall within a horizontal mesh of 500 km × 500 km.
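The interpolation step described above is classically done with a distance-weighted scheme such as the single-pass Cressman analysis sketched below. The grid, influence radius, and observations are toy values chosen for illustration:

```python
# Sketch: single-pass Cressman-style objective analysis, interpolating
# irregularly placed observations onto a regular grid, as described above.
# Grid, radius, and observations are toy values.

def cressman(obs, grid, radius):
    """obs: list of (x, y, value); grid: list of (x, y) points.
    Returns the analysed value at each grid point (None where no
    observation lies within the influence radius)."""
    out = []
    for gx, gy in grid:
        num = den = 0.0
        for ox, oy, val in obs:
            r2 = (gx - ox) ** 2 + (gy - oy) ** 2
            if r2 < radius ** 2:
                w = (radius ** 2 - r2) / (radius ** 2 + r2)  # Cressman weight
                num += w * val
                den += w
        out.append(num / den if den > 0 else None)
    return out

obs = [(0.0, 0.0, 10.0), (1.0, 0.0, 12.0), (0.0, 1.0, 11.0)]
grid = [(0.0, 0.0), (0.5, 0.5), (5.0, 5.0)]
vals = cressman(obs, grid, radius=2.0)
print(vals)  # last grid point has no observations in range, hence None
```

In the four-dimensional setting discussed above, the same weighting idea extends to time: a short analysis-forecasting cycle keeps every observation close enough in time to be treated as synoptic.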

Relevance:

100.00%

Publisher:

Abstract:

The influence of gut microbiota on the toxicity and metabolism of hydrazine has been investigated in germ-free and ‘conventional’ Sprague Dawley rats using 1H NMR based metabonomic analysis of urine and plasma. Toxicity was more severe in germ-free rats compared with conventional rats for equivalent exposures indicating that bacterial presence altered the nature or extent of response to hydrazine and that the toxic response can vary markedly in the absence of a functional microbiome.