957 results for experimental methods
Abstract:
[Excerpt] Purine nucleobases are fundamental biochemicals in living organisms. They have been a valuable inspiration for drug design, since they play several key roles in the cell.1 To the best of our knowledge, reported routes to 8-aminopurines remain scarce because of the difficulty of introducing amino groups at this position of the purine ring. Here we report a novel, inexpensive and facile synthetic method for generating N3,N6-disubstituted 6,8-diaminopurines. In our research group, a number of substituted purines have been obtained from a common imidazole precursor, the 5-amino-4-cyanoformimidoyl imidazole 1. Recently, a comprehensive study of the reactivity of imidazoles 1 with nucleophiles under acidic conditions led us to develop experimental methods for incorporating primary amines into the cyanoformimidoyl group.2 (...)
Abstract:
Over the last few years, there has been a surge of work in a new field called "moral psychology", which uses experimental methods to test the psychological processes underlying human moral activity. In this paper, I shall follow this line of approach with the aim of working out a model of how people form value judgements and how they are motivated to act morally. I call this model an "affective picture": 'picture' because it remains strictly at the descriptive level, and 'affective' because it assigns an important role to affects and emotions. This affective picture is grounded in a number of plausible and empirically supported hypotheses. The main idea is that we should distinguish between various kinds of value judgements by focusing on the state of mind people are in while uttering a judgement. "Reasoned judgements" are products of rational considerations and are based on a preliminary acceptance of norms and values. By contrast, "basic value judgements" are affective, primitive and non-reflective ways of assessing the world. As we shall see, this analysis has consequences for the traditional internalism-externalism debate in philosophy: it highlights the fact that motivation is primarily linked to "basic value judgements", and that the judgements we openly defend might not have a particular effect on our actions unless we are inclined to have an emotional attitude that conforms to them.
Abstract:
Epidemiological methods have become useful tools for assessing the effectiveness and safety of health care technologies. Experimental methods, namely randomized controlled trials (RCTs), provide the best evidence of the effect of a technology. However, ethical issues and the very nature of the intervention under study sometimes make it difficult to carry out an RCT; therefore, quasi-experimental and non-experimental study designs are also applied. The critical issues concerning these designs are discussed. The results of evaluative studies matter to decision-makers in health policy. Measurement of the impact of a medical technology should go beyond a statement of its effectiveness, because the essential outcome of an intervention or programme is the health status and quality of life of the individuals and populations concerned.
Abstract:
A number of experimental methods have been reported for estimating the number of genes in a genome, or the closely related coding density of a genome, defined as the fraction of base pairs in codons. Recently, DNA sequence data representative of the genome as a whole have become available for several organisms, making the problem of estimating coding density amenable to sequence analytic methods. Estimates of coding density for a single genome vary widely, so that methods with characterized error bounds have become increasingly desirable. We present a method to estimate the protein coding density in a corpus of DNA sequence data, in which a 'coding statistic' is calculated for a large number of windows of the sequence under study, and the distribution of the statistic is decomposed into two normal distributions, assumed to be the distributions of the coding statistic in the coding and noncoding fractions of the sequence windows. The accuracy of the method is evaluated using known data and application is made to the yeast chromosome III sequence and to C. elegans cosmid sequences. It can also be applied to fragmentary data, for example a collection of short sequences determined in the course of STS mapping.
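The decomposition described above can be sketched as fitting a two-component one-dimensional Gaussian mixture by EM; the mixing weight of the higher-mean component then estimates the coding fraction. This is a minimal illustration, not the authors' implementation, and it assumes the coding statistic (one value per window) has already been computed elsewhere.

```python
import numpy as np

def fit_two_component_mixture(x, n_iter=200):
    """Fit a two-component 1-D Gaussian mixture by EM.
    x holds one coding-statistic value per sequence window.
    Returns (weights, means, stds)."""
    x = np.asarray(x, dtype=float)
    # Crude initialization: put the components at the quartiles.
    mu = np.array([np.percentile(x, 25.0), np.percentile(x, 75.0)])
    sigma = np.array([x.std(), x.std()])
    w = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: responsibility of each component for each window
        dens = (w[:, None]
                * np.exp(-0.5 * ((x - mu[:, None]) / sigma[:, None]) ** 2)
                / (sigma[:, None] * np.sqrt(2.0 * np.pi)))
        resp = dens / dens.sum(axis=0)
        # M-step: re-estimate weights, means and standard deviations
        nk = resp.sum(axis=1)
        w = nk / len(x)
        mu = (resp * x).sum(axis=1) / nk
        sigma = np.sqrt((resp * (x - mu[:, None]) ** 2).sum(axis=1) / nk)
    return w, mu, sigma

def coding_density(x):
    """Mixing weight of the higher-mean component, read as the
    estimated fraction of coding windows (assumes the statistic
    is larger in coding regions)."""
    w, mu, _ = fit_two_component_mixture(x)
    return w[np.argmax(mu)]
```

On synthetic data drawn 70%/30% from two well-separated normals, `coding_density` recovers a coding fraction close to 0.3.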
Abstract:
Gene correction at the site of the mutation in the chromosome is the definitive way to truly cure a genetic disease. The oligonucleotide (ODN)-mediated gene repair technology uses an ODN perfectly complementary to the genomic sequence except for a mismatch at the base that is mutated. The endogenous repair machinery of the targeted cell then mediates substitution of the desired base in the gene, resulting in a completely normal sequence. Theoretically, it avoids the potential gene silencing or random integration associated with common viral gene augmentation approaches and preserves intact regulation of expression of the therapeutic protein. The eye is a particularly attractive target for gene repair because of its unique features (a small organ, easily accessible, with low diffusion into the systemic circulation). Moreover, therapeutic effects on visual impairment could be obtained with modest levels of repair. This chapter describes in detail the optimized method for targeting active ODNs to the nuclei of photoreceptors in the neonatal mouse using (1) an electric current applied at the eye surface (saline transpalpebral iontophoresis), (2) combined with an intravitreous injection of ODNs, as well as the experimental methods for (3) the dissection of adult neural retinas, (4) their immuno-labelling, and (5) flat-mounting for direct observation of photoreceptor survival, a relevant criterion of treatment outcome for retinal degeneration.
Abstract:
BACKGROUND: We present the results of EGASP, a community experiment to assess the state-of-the-art in genome annotation within the ENCODE regions, which span 1% of the human genome sequence. The experiment had two major goals: the assessment of the accuracy of computational methods to predict protein coding genes; and the overall assessment of the completeness of the current human genome annotations as represented in the ENCODE regions. For the computational prediction assessment, eighteen groups contributed gene predictions. We evaluated these submissions against each other based on a 'reference set' of annotations generated as part of the GENCODE project. These annotations were not available to the prediction groups prior to the submission deadline, so that their predictions were blind and an external advisory committee could perform a fair assessment. RESULTS: The best methods had at least one gene transcript correctly predicted for close to 70% of the annotated genes. Nevertheless, the multiple transcript accuracy, taking into account alternative splicing, reached only approximately 40% to 50% accuracy. At the coding nucleotide level, the best programs reached an accuracy of 90% in both sensitivity and specificity. Programs relying on mRNA and protein sequences were the most accurate in reproducing the manually curated annotations. Experimental validation shows that only a very small percentage (3.2%) of the selected 221 computationally predicted exons outside of the existing annotation could be verified. CONCLUSION: This is the first such experiment in human DNA, and we have followed the standards established in a similar experiment, GASP1, in Drosophila melanogaster. We believe the results presented here contribute to the value of ongoing large-scale annotation projects and should guide further experimental methods when being scaled up to the entire human genome sequence.
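The per-nucleotide sensitivity and specificity figures quoted above follow the gene-prediction convention in which specificity is the fraction of predicted coding bases that are truly coding (what other fields call precision). A minimal sketch with hypothetical position sets, not the EGASP evaluation code itself:

```python
def coding_nucleotide_accuracy(predicted, annotated):
    """Per-base accuracy of a gene prediction.
    `predicted` and `annotated` are sets of coding base positions.
    Sensitivity = TP / |annotated|; specificity (gene-prediction
    convention) = TP / |predicted|."""
    tp = len(predicted & annotated)          # correctly predicted coding bases
    sensitivity = tp / len(annotated)
    specificity = tp / len(predicted)
    return sensitivity, specificity

# Hypothetical example: prediction overlaps half of an annotated exon
sn, sp = coding_nucleotide_accuracy(set(range(150, 250)), set(range(100, 200)))
```

With the hypothetical sets above, both values come out to 0.5: half the annotated bases are found, and half the predicted bases are correct.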
Abstract:
DNA methylation is involved in a diversity of processes in bacteria, including maintenance of genome integrity and regulation of gene expression. Here, using Caulobacter crescentus as a model, we exploit genome-wide experimental methods to uncover the functions of CcrM, a DNA methyltransferase conserved in most Alphaproteobacteria. Using single molecule sequencing, we provide evidence that most CcrM target motifs (GANTC) switch from a fully methylated to a hemi-methylated state when they are replicated, and back to a fully methylated state at the onset of cell division. We show that DNA methylation by CcrM is not required for the control of the initiation of chromosome replication or for DNA mismatch repair. By contrast, our transcriptome analysis shows that >10% of the genes are misexpressed in cells lacking or constitutively over-expressing CcrM. Strikingly, GANTC methylation is needed for the efficient transcription of dozens of genes that are essential for cell cycle progression, in particular for DNA metabolism and cell division. Many of them are controlled by promoters methylated by CcrM and co-regulated by other global cell cycle regulators, demonstrating an extensive cross talk between DNA methylation and the complex regulatory network that controls the cell cycle of C. crescentus and, presumably, of many other Alphaproteobacteria.
Abstract:
BACKGROUND: Small RNAs (sRNAs) are widespread among bacteria and have diverse regulatory roles. Most of these sRNAs have been discovered by a combination of computational and experimental methods. In Pseudomonas aeruginosa, a ubiquitous Gram-negative bacterium and opportunistic human pathogen, the GacS/GacA two-component system positively controls the transcription of two sRNAs (RsmY, RsmZ), which are crucial for the expression of genes involved in virulence. In the biocontrol bacterium Pseudomonas fluorescens CHA0, three GacA-controlled sRNAs (RsmX, RsmY, RsmZ) regulate the response to oxidative stress and the expression of extracellular products including biocontrol factors. RsmX, RsmY and RsmZ contain multiple unpaired GGA motifs and control the expression of target mRNAs at the translational level, by sequestration of translational repressor proteins of the RsmA family. RESULTS: A combined computational and experimental approach enabled us to identify 14 intergenic regions encoding sRNAs in P. aeruginosa. Eight of these regions encode newly identified sRNAs. The intergenic region 1698 was found to specify a novel GacA-controlled sRNA termed RgsA. GacA regulation appeared to be indirect. In P. fluorescens CHA0, an RgsA homolog was also expressed under positive GacA control. This 120-nt sRNA contained a single GGA motif and, unlike RsmX, RsmY and RsmZ, was unable to derepress translation of the hcnA gene (involved in the biosynthesis of the biocontrol factor hydrogen cyanide), but contributed to the bacterium's resistance to hydrogen peroxide. In both P. aeruginosa and P. fluorescens the stress sigma factor RpoS was essential for RgsA expression. CONCLUSION: The discovery of an additional sRNA expressed under GacA control in two Pseudomonas species highlights the complexity of this global regulatory system and suggests that the mode of action of GacA control may be more elaborate than previously suspected. Our results also confirm that several GGA motifs are required in an sRNA for sequestration of the RsmA protein.
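Since sequestration of RsmA appears to require several GGA motifs, counting the motif in a candidate sRNA sequence is a natural first screen. A trivial illustrative sketch, not the authors' discovery pipeline:

```python
def count_gga_motifs(seq):
    """Count occurrences of the GGA motif in an RNA or DNA
    sequence, including overlapping matches."""
    seq = seq.upper()
    return sum(1 for i in range(len(seq) - 2) if seq[i:i + 3] == "GGA")
```

For example, `count_gga_motifs("ggacggagga")` returns 3, while an RgsA-like sequence with a single motif would return 1.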
Abstract:
When decommissioning a nuclear facility it is important to be able to estimate the activity levels of potentially radioactive samples and to compare them with clearance values defined by regulatory authorities. This paper presents a method of calibrating a clearance box monitor based on practical experimental measurements and Monte Carlo simulations. Adjusting the simulation against experimental data obtained with a simple point source permits the computation of absolute calibration factors for more complex geometries, with an uncertainty of slightly more than 20%. The uncertainty of the calibration factor can be improved to about 10% when the simulation is used relatively, in direct comparison with a measurement performed in the same geometry but with another nuclide. The simulation can also be used to validate the experimental calibration procedure when the sample is supposed to be homogeneous but the calibration factor is derived from a plate phantom. For more realistic geometries, such as a small gravel dumpster, Monte Carlo simulation shows that the calibration factor obtained with a larger homogeneous phantom is correct to within about 20%, provided sample density is taken as the influencing parameter. Finally, simulation can be used to estimate the effect of a contamination hotspot. The research supporting this paper shows that, if the sample is assumed to be homogeneously contaminated, activity could be largely underestimated in the event of a centrally located hotspot and overestimated for a peripherally located hotspot. This demonstrates the usefulness of complementing experimental methods with Monte Carlo simulations to estimate calibration factors that cannot be measured directly because of a lack of available material or of specific geometries.
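The relative use of the simulation described above amounts to transferring a calibration factor measured for one nuclide to another nuclide in the same geometry via the ratio of simulated detection efficiencies. A minimal sketch with hypothetical numbers (taking the calibration factor as counts registered per unit activity; the actual values depend on geometry, density and nuclide):

```python
def relative_calibration_factor(cf_measured_ref, eff_sim_ref, eff_sim_target):
    """Transfer a measured calibration factor (counts per Bq) from a
    reference nuclide to a target nuclide in the same counting
    geometry, scaling by the ratio of Monte Carlo efficiencies."""
    return cf_measured_ref * (eff_sim_target / eff_sim_ref)

# Hypothetical values: reference nuclide measured experimentally,
# target nuclide available only in simulation.
cf_target = relative_calibration_factor(
    cf_measured_ref=2.0e-3,  # counts/s per Bq, measured
    eff_sim_ref=0.045,       # simulated efficiency, reference nuclide
    eff_sim_target=0.030,    # simulated efficiency, target nuclide
)
```

Because only the efficiency ratio enters, systematic errors common to both simulations largely cancel, which is why the relative approach reaches the better (about 10%) uncertainty quoted above.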
Abstract:
This paper reviews experimental methods for the study of the responses of people to violence in digital media, and in particular considers the issues of internal validity and ecological validity, or generalisability of results to events in the real world. Experimental methods typically involve a significant level of abstraction from reality, with participants required to carry out tasks that are far removed from violence in real life, and hence their ecological validity is questionable. On the other hand, studies based on field data, while having ecological validity, cannot control multiple confounding variables that may have an impact on observed results, so that their internal validity is questionable. It is argued that immersive virtual reality may provide a unification of these two approaches. Since people tend to respond realistically to situations and events that occur in virtual reality, and since virtual reality simulations can be completely controlled for experimental purposes, studies of responses to violence within virtual reality are likely to have both ecological and internal validity. This depends on a property that we call "plausibility", including the fidelity of the depicted situation with prior knowledge and expectations. We illustrate this with data from a previously published experiment, a virtual reprise of Stanley Milgram's 1960s obedience experiment, and also with pilot data from a new study being developed that looks at bystander responses to violent incidents.
Abstract:
This Master's thesis deals with the determination of the conductor losses of a microstrip structure by experimental and analytical methods. The properties of the microstrip structure are reviewed, and the most important methods presented in the literature for calculating conductor losses are described. The losses of the microstrip structure were modelled with a modern field-computation simulator based on the method of moments. The operating principle of the simulator is presented, and its application to simulating the microstrip structure is described. A microstrip structure was modelled over the frequency range 1-10 GHz, and the simulated conductor losses were compared with analytically determined ones. The response of the microstrip structure was measured and compared with the response of the modelled structure. Based on the results, the suitability of the simulator for modelling the conductor losses of microstrip structures is discussed. Ideas for further research on determining the conductor losses of microstrip structures from measurements are presented.
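For analytical comparison with simulated losses over 1-10 GHz, a common textbook closed-form approximation (the wide-strip formula, not necessarily the method used in the thesis) gives the conductor attenuation as alpha_c ≈ Rs/(Z0·W), where Rs is the skin-effect surface resistance. A sketch with hypothetical line dimensions:

```python
import math

MU0 = 4.0e-7 * math.pi  # vacuum permeability (H/m)

def conductor_loss_db_per_m(freq_hz, z0_ohm, width_m, conductivity_s_per_m):
    """Approximate microstrip conductor attenuation in dB/m using the
    wide-strip formula alpha_c ~= Rs / (Z0 * W)."""
    # Skin-effect surface resistance (ohms per square)
    rs = math.sqrt(math.pi * freq_hz * MU0 / conductivity_s_per_m)
    alpha_np_per_m = rs / (z0_ohm * width_m)  # nepers per metre
    return 8.686 * alpha_np_per_m             # convert Np/m -> dB/m

# Hypothetical example: 50-ohm copper line, 1 mm strip width, at 1 GHz
loss_1ghz = conductor_loss_db_per_m(1e9, 50.0, 1e-3, 5.8e7)
```

Since Rs grows as the square root of frequency, this model predicts that conductor loss doubles between 1 GHz and 4 GHz, a useful sanity check against field-solver results.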
Abstract:
The objective of this Master's thesis was to clarify problems related to sludge treatment at the wastewater treatment plant of the Stora Enso Varkaus Mill. From time to time these problems have caused emissions to exceed the permitted limits. The case was studied by examining existing data, fibre length fractions and experimental methods. Changes at the mill have reduced total solids emissions; at the same time, the need for tertiary treatment has grown, and tertiary sludge is hard to treat. In the future, emission limits will tighten and the need for tertiary treatment will grow further. The results of the thesis may therefore have a great impact on the Stora Enso Varkaus Mill: they provide valuable information for follow-up research and guidelines for sludge treatment, and they encourage monitoring the matters studied over a longer period of time. The recommendations emphasize the importance of maintenance and of systematic, appropriate experiments.
Abstract:
The rationale and historical background of tachistoscopic presentation to the visual hemifields are set out, and some fundamental aspects concerning the subjects, apparatus, stimuli, responses and procedures of experimental designs using this technique are discussed.
Abstract:
Broadband access is a key factor in economic and social development. However, providing broadband to rural areas is not attractive to private telecommunications operators because of its low or zero return on investment. To deal with broadband provision in rural areas, different governance systems based on private and public cooperation have appeared. This paper not only identifies and defines public and private cooperation models but also assesses their impact on overcoming the digital divide in rural areas. The results show that the policy of publicly owned infrastructure under private management had positive effects on reducing the broadband digital divide and was applied to the areas with the greatest divide; subsidies to private operators had positive effects only on reducing the broadband digital divide; public infrastructure with public management programmes did not. The results, obtained using quasi-experimental methods, underline the importance of incentives and control mechanisms in broadband universal service provision plans.
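One widely used quasi-experimental estimator (the paper does not necessarily use this exact one) is difference-in-differences, which compares the change in an outcome such as broadband penetration in areas covered by a policy against the change in comparable uncovered areas:

```python
def diff_in_diff(treated_pre, treated_post, control_pre, control_post):
    """Difference-in-differences estimate of a policy effect:
    the outcome change in treated areas minus the change in
    control areas, which nets out common time trends."""
    return (treated_post - treated_pre) - (control_post - control_pre)

# Hypothetical broadband penetration rates (%) before/after a programme
effect = diff_in_diff(10.0, 25.0, 12.0, 18.0)  # -> 9.0 percentage points
```

The identifying assumption is that, absent the programme, treated and control areas would have followed parallel trends; the incentive and control mechanisms mentioned above matter precisely because they shape the treated areas' deviation from that trend.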