977 results for "Scientists reflexivity"


Relevance:

10.00%

Publisher:

Abstract:

Marine sponges have been an abundant source of new metabolites in recent years. The symbiotic association between bacteria and sponge has enabled scientists to access the bacterial diversity present within the bacterial/sponge ecosystem. This study focussed on accessing the bacterial diversity in two Irish coastal marine sponges, namely Amphilectus fucorum and Eurypon major. A novel species from the genus Aquimarina was isolated from the sponge Amphilectus fucorum. The study also resulted in the identification of an α-Proteobacterium, Pseudovibrio sp., as a potential producer of antibiotics. Thus a targeted approach to specifically cultivate Pseudovibrio sp. may prove useful for the development of new metabolites from this particular genus. Bacterial isolates from the marine sponge Haliclona simulans were screened for anti-fungal activity, and one isolate, namely Streptomyces sp. SM8, displayed activity against all five fungal strains tested. The strain was also tested for anti-bacterial activity and showed activity against both B. subtilis and P. aeruginosa. Hence a combined approach involving both biochemical and genomic techniques was employed in an attempt to identify the bioactive compounds with these activities which were being produced by this strain. Culture broths from Streptomyces sp. SM8 were extracted and purified by various techniques such as reverse-phase HPLC, MPLC and flash chromatography. Anti-bacterial activity was observed in a fraction which contained a hydroxylated saturated fatty acid and also another compound with an m/z of 227, but further structural elucidation of these compounds proved unsuccessful. The anti-fungal fractions from SM8 were shown to contain antimycin-like compounds, with some of these compounds having retention times different from that of an antimycin standard.
A high-throughput assay was developed to screen for novel calcineurin inhibitors using yeast as a model system, and three putative bacterial extracts were found to be positive using this screen. One of these extracts, from SM8, was subsequently analysed using NMR, and the calcineurin inhibition activity was confirmed to belong to a butenolide-type compound. A H. simulans metagenomic library was also screened using the novel calcineurin inhibitor high-throughput assay system, and eight clones displaying putative calcineurin inhibitory activity were detected. The clone which displayed the best inhibitory activity was subsequently sequenced, and following the use of other genetic approaches it became clear that the inhibition was being caused by a hypothetical protein with similarity to a hypothetical Na+/Ca2+ exchanger protein. The Streptomyces sp. SM8 genome was sequenced from a fragment library using Roche 454 pyrosequencing technology to identify potential secondary metabolism clusters. The draft genome was annotated by IMG/ER using the Prodigal pipeline. The Whole Genome Shotgun project has been deposited at DDBJ/EMBL/GenBank under the accession AMPN00000000. The genome contains genes which appear to encode several polyketide synthases (PKS), non-ribosomal peptide synthetases (NRPS), terpene and siderophore biosynthesis enzymes, and ribosomal peptides. Transcriptional analyses led to the identification of three hybrid clusters, of which one is predicted to be involved in the synthesis of antimycin, while the functions of the others are as yet unknown. Two NRPS clusters were also identified, of which one may be involved in gramicidin biosynthesis while the function of the other is unknown. A Streptomyces sp. SM8 NRPS antC gene knockout was constructed, and extracts from this strain were shown to possess only a mild anti-fungal activity when compared to the SM8 wild-type.
Subsequent LC-MS analysis of antC mutant extracts confirmed the absence of antimycin in the extract, indicating that the observed anti-fungal activity may involve metabolite(s) other than antimycin. Anti-bacterial activity of the antC gene knockout strain against P. aeruginosa was reduced when compared to the SM8 wild-type, indicating that antimycin may be contributing to the observed anti-bacterial activity in addition to the metabolite(s) already identified during the chemical analyses. This is the first report of antimycins exhibiting anti-bacterial activity against P. aeruginosa. One of the hybrid clusters potentially involved in secondary metabolism in SM8, which displayed high and consistent levels of gene expression in RNA studies, was analysed in an attempt to identify the metabolite being produced by the pathway. A number of unusual features were observed following bioinformatic analysis of the gene sequence of the cluster, including a formylation domain within the NRPS cluster which may add a formyl group to the growing chain. Another unusual feature is the lack of AT domains on two of the PKS modules. Other unusual features observed in this cluster are the lack of a KR domain in module 3 and an aminotransferase domain in module 4 for which no clear role has been hypothesised.


The wonder of the last century has been the rapid development of technology. One of the sectors it has touched immensely is the electronics industry. There has been exponential development in the field, and scientists are pushing new horizons. Individuals from every stratum of society are increasingly dependent on technology. Atomic Layer Deposition (ALD) is a unique technique for growing thin films and is widely used in the semiconductor industry. Films as thin as a few nanometers can be deposited using this technique. Although the process has been explored for a variety of oxides, sulphides and nitrides, a proper method for the deposition of many metals is missing. Metals are often used in the semiconductor industry and hence are of significant importance. A deficiency in understanding the basic chemistry of the possible reactions at the nanoscale has delayed improvements in metal ALD. In this thesis, we study the intrinsic chemistry involved in Cu ALD. This work reports a computational study using Density Functional Theory as implemented in the TURBOMOLE program. Both gas-phase and surface reactions are studied in most cases. The merits and demerits of a promising transmetallation reaction are evaluated at the beginning of the study. Further improvements in the structure of the precursors and co-reagent are then proposed. This has led to the proposal of metallocenes as co-reagents and Cu(I) carbene compounds as a new set of precursors. A three-step process for Cu ALD that generates a ligand-free Cu layer after every ALD pulse has also been studied. Although the chemistry has been studied under the umbrella of Cu ALD, the basic principles hold true for ALD of other metals (e.g. Co, Ni, Fe), and also for other areas of science, such as thin-film deposition methods other than ALD and electrochemical reactions.


This qualitative research expands understanding of how information about a range of Novel Food Technologies (NFTs) is used and assimilated, and the implications of this for the evolution of attitudes and acceptance. This work enhances theoretical and applied understanding of citizens’ evaluative processes around these technologies. The approach applied involved observations of interactive exchanges between citizens and information providers (i.e. food scientists), during which they discussed a specific technology. This flexible, yet structured, approach revealed how individuals construct meaning around information about specific NFTs. A rich dataset of 42 ‘deliberate discourse’ and 42 post-discourse transcripts was collected. Data analysis encompassed three stages: an initial descriptive account of the complete dataset based on the top-down bottom-up (TDBU) model of attitude formation, followed by inductive and deductive thematic analysis across the selected technology groups. The hybrid thematic analysis undertaken identified a Conceptual Model, which represents a holistic perspective on the influences and associated features directing ‘sense-making’ and ultimate evaluations around the technology clusters. How individuals make sense of these technologies is shaped by: their beliefs, values and personal characteristics; their perceptions of power and control over the application of the technology; and the assumed relevance of the technology and its applications within different contexts. These influences form the frame for the creation of sense-making around the technologies. Internal negotiations between these influences are evident, and evaluations are based on the relative importance of each influence to the individual, which tends to contribute to attitude ambivalence and instability.
The findings indicate the processes of forming and changing attitudes towards these technologies are: complex; dependent on characteristics of the individual, technology, application and product; and, impacted by the nature and forms of information provided. Challenges are faced in engaging with the public about these technologies, as levels of knowledge, understanding and interest vary.


After 9/11, it has become increasingly obvious that strongly held religious convictions about the end of the world cannot be dismissed as the predictable consequences of deprivation, as several generations of social scientists once claimed. Instead, it has become clear that these kinds of ideas, having a life of their own, may establish discourses with an extraordinary capacity to cross nations, cultures and even religions, encouraging passive withdrawal from the political world as well as inspiring vicious and sometimes violent attempts at its subjugation, underwriting the ‘war on terror’ as well as inspiring some of those intent on the destruction of the United States. This article describes one of Ireland’s most successful intellectual exports – a very specific system of thinking about the end of the world known as ‘dispensational premillennialism.’ The article will move from County Wicklow in the early nineteenth century, through the troubled decades of American modernity, to arrive, perhaps unexpectedly, in the company of the soldiers of radical jihad. It will describe the globalisation of a discourse which was developed among the most privileged classes of early nineteenth-century Ireland to explain and justify their attempt to withdraw from the world, and which has more recently been used to explain and justify sometimes violent political interventions by both prominent Western politicians and some of the most marginalised and desperate inhabitants of our broken twenty-first century.


For many academic physician-scientists, the yearly Tri-Societies meeting of the ASCI, AAP, and AFCR during the 1960s, '70s, and '80s was an annual rite of spring and the focal point of the academic year. In this brief essay, I set down some miscellaneous recollections of these meetings and some thoughts about why they were of such central importance in the careers of those of my generation.


BACKGROUND: Scientists rarely reuse expert knowledge of phylogeny, in spite of years of effort to assemble a great "Tree of Life" (ToL). A notable exception involves the use of Phylomatic, which provides tools to generate custom phylogenies from a large, pre-computed, expert phylogeny of plant taxa. This suggests great potential for a more generalized system that, starting with a query consisting of a list of any known species, would rectify non-standard names, identify expert phylogenies containing the implicated taxa, prune away unneeded parts, and supply branch lengths and annotations, resulting in a custom phylogeny suited to the user's needs. Such a system could become a sustainable community resource if implemented as a distributed system of loosely coupled parts that interact through clearly defined interfaces. RESULTS: With the aim of building such a "phylotastic" system, the NESCent Hackathons, Interoperability, Phylogenies (HIP) working group recruited 2 dozen scientist-programmers to a weeklong programming hackathon in June 2012. During the hackathon (and a three-month follow-up period), 5 teams produced designs, implementations, documentation, presentations, and tests including: (1) a generalized scheme for integrating components; (2) proof-of-concept pruners and controllers; (3) a meta-API for taxonomic name resolution services; (4) a system for storing, finding, and retrieving phylogenies using semantic web technologies for data exchange, storage, and querying; (5) an innovative new service, DateLife.org, which synthesizes pre-computed, time-calibrated phylogenies to assign ages to nodes; and (6) demonstration projects. These outcomes are accessible via a public code repository (GitHub.com), a website (http://www.phylotastic.org), and a server image. 
CONCLUSIONS: Approximately 9 person-months of effort (centered on a software development hackathon) resulted in the design and implementation of proof-of-concept software for 4 core phylotastic components, 3 controllers, and 3 end-user demonstration tools. While these products have substantial limitations, they suggest considerable potential for a distributed system that makes phylogenetic knowledge readily accessible in computable form. Widespread use of phylotastic systems will create an electronic marketplace for sharing phylogenetic knowledge that will spur innovation in other areas of the ToL enterprise, such as annotation of sources and methods and third-party methods of quality assessment.
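The pruning step at the heart of such a "phylotastic" workflow can be sketched in a few lines. The tree encoding (nested tuples), taxon names and topology below are invented for illustration and are not taken from any actual phylotastic component:

```python
# Sketch of pruning an expert phylogeny down to a user's species list.
# A tree is either a leaf (a taxon name string) or a tuple of subtrees;
# internal nodes left with a single surviving child are collapsed.

def prune(tree, keep):
    """Return the subtree of `tree` containing only taxa in `keep`, or None."""
    if isinstance(tree, str):                      # leaf node
        return tree if tree in keep else None
    children = [p for c in tree if (p := prune(c, keep)) is not None]
    if not children:
        return None
    return children[0] if len(children) == 1 else tuple(children)

# Toy "expert tree" of plant genera (hypothetical topology).
expert_tree = (("Quercus", "Fagus"), (("Rosa", "Prunus"), "Acer"))

custom = prune(expert_tree, {"Quercus", "Rosa", "Prunus"})
print(custom)   # ('Quercus', ('Rosa', 'Prunus'))
```

A real system would additionally rectify non-standard names and carry branch lengths and annotations through the pruning, as the abstract describes.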


We expect scientists to follow a code of honor and conduct and to report their research honestly and accurately, but so-called scientific misconduct, which includes plagiarism, faked data, and altered images, has led to a tenfold increase in the number of retractions over the past decade. Among the reasons for this troubling upsurge is increased competition for journal placement, grant money, and prestigious appointments. The solutions are not easy, but reform and greater vigilance are needed.


Family dogs and dog owners offer a potentially powerful way to conduct citizen science to answer questions about animal behavior that are difficult to answer with more conventional approaches. Here we evaluate the quality of the first data on dog cognition collected by citizen scientists using the Dognition.com website. We conducted analyses to understand if data generated by over 500 citizen scientists replicates internally and in comparison to previously published findings. Half of participants participated for free while the other half paid for access. The website provided each participant a temperament questionnaire and instructions on how to conduct a series of ten cognitive tests. Participation required internet access, a dog and some common household items. Participants could record their responses on any PC, tablet or smartphone from anywhere in the world and data were retained on servers. Results from citizen scientists and their dogs replicated a number of previously described phenomena from conventional lab-based research. There was little evidence that citizen scientists manipulated their results. To illustrate the potential uses of relatively large samples of citizen science data, we then used factor analysis to examine individual differences across the cognitive tasks. The data were best explained by multiple factors in support of the hypothesis that nonhumans, including dogs, can evolve multiple cognitive domains that vary independently. This analysis suggests that in the future, citizen scientists will generate useful datasets that test hypotheses and answer questions as a complement to conventional laboratory techniques used to study dog psychology.
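The factor-structure argument above can be illustrated with a small simulation: if two cognitive domains vary independently, tasks loading on the same latent factor correlate with each other but not with tasks from the other factor. The factor and task names, loadings and sample size here are invented for the sketch, not taken from the Dognition dataset:

```python
# Simulate two independent latent cognitive factors and four observed
# task scores, then check the resulting correlation structure.
import numpy as np

rng = np.random.default_rng(0)
n = 500                                   # simulated dogs
memory = rng.normal(size=n)               # latent factor 1
communication = rng.normal(size=n)        # latent factor 2, independent of factor 1

noise = lambda: 0.5 * rng.normal(size=n)
scores = np.column_stack([
    memory + noise(),          # task 1: loads on memory
    memory + noise(),          # task 2: loads on memory
    communication + noise(),   # task 3: loads on communication
    communication + noise(),   # task 4: loads on communication
])

r = np.corrcoef(scores, rowvar=False)     # 4x4 task correlation matrix
print(np.round(r, 2))
```

Within-factor correlations come out high (around 0.8 with these loadings) while cross-factor correlations sit near zero, which is the pattern a factor analysis would summarise as multiple independent domains.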


In this review, we discuss recent work by the ENIGMA Consortium (http://enigma.ini.usc.edu) - a global alliance of over 500 scientists spread across 200 institutions in 35 countries collectively analyzing brain imaging, clinical, and genetic data. Initially formed to detect genetic influences on brain measures, ENIGMA has grown to over 30 working groups studying 12 major brain diseases by pooling and comparing brain data. In some of the largest neuroimaging studies to date - of schizophrenia and major depression - ENIGMA has found replicable disease effects on the brain that are consistent worldwide, as well as factors that modulate disease effects. In partnership with other consortia including ADNI, CHARGE, IMAGEN and others(1), ENIGMA's genomic screens - now numbering over 30,000 MRI scans - have revealed at least 8 genetic loci that affect brain volumes. Downstream of gene findings, ENIGMA has revealed how these individual variants - and genetic variants in general - may affect both the brain and risk for a range of diseases. The ENIGMA consortium is discovering factors that consistently affect brain structure and function that will serve as future predictors linking individual brain scans and genomic data. It is generating vast pools of normative data on brain measures - from tens of thousands of people - that may help detect deviations from normal development or aging in specific groups of subjects. We discuss challenges and opportunities in applying these predictors to individual subjects and new cohorts, as well as lessons we have learned in ENIGMA's efforts so far.


The phrase “not much mathematics required” can imply a variety of skill levels. When this phrase is applied to computer scientists, software engineers, and clients in the area of formal specification, the word “much” can be widely misinterpreted with disastrous consequences. A small experiment in reading specifications revealed that students already trained in discrete mathematics and the specification notation performed very poorly; much worse than could reasonably be expected if formal methods proponents are to be believed.
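The abstract does not reproduce the specification notation used in the experiment, so as a hypothetical illustration of why such specifications are easy to misread: even a small pre/postcondition mixes quantifiers and implications. Below, a contract for a "find" operation is written as an executable check; the operation and predicate names are invented here, not drawn from the study:

```python
# A "find" operation and its postcondition, stated executably.
# Spec (informally): if target occurs in xs, result indexes its first
# occurrence; otherwise result == len(xs). Readers often misinterpret
# the "otherwise" branch as an error case rather than part of the contract.

def find(xs, target):
    """Return the index of the first occurrence of target, or len(xs)."""
    for i, x in enumerate(xs):
        if x == target:
            return i
    return len(xs)

def postcondition(xs, target, result):
    # (exists i. xs[i] == target)  =>  xs[result] == target,
    # with no earlier occurrence before `result`.
    if target in xs:
        return xs[result] == target and target not in xs[:result]
    return result == len(xs)       # absent: result is one past the end

assert postcondition([3, 1, 4, 1], 1, find([3, 1, 4, 1], 1))   # found at 1
assert postcondition([3, 1, 4, 1], 9, find([3, 1, 4, 1], 9))   # absent: 4
```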


In judicial decision making, the doctrine of chances takes the odds explicitly into account. There is more to forensic statistics, as well as various probabilistic approaches, which taken together form the object of an enduring controversy in the scholarship of legal evidence. In this paper, we reconsider the circumstances of the Jama murder and inquiry (dealt with in Part I of this paper: "The Jama Model. On Legal Narratives and Interpretation Patterns"), to illustrate yet another kind of probability or improbability. What is improbable about the Jama story is actually a given, which contributes in terms of dramatic underlining. In literary theory, concepts of narratives being probable or improbable date back to the eighteenth century, when both prescientific and scientific probability were infiltrating several domains, including law. An understanding of such a backdrop throughout the history of ideas is, I claim, necessary for AI researchers who may be tempted to apply statistical methods to legal evidence. The debate for or against probability (and especially Bayesian probability) in accounts of evidence has been flourishing among legal scholars. Nowadays both the Bayesians (e.g. Peter Tillers) and the Bayesio-skeptics (e.g. Ron Allen) among those legal scholars who are involved in the controversy are willing to give AI research a chance to prove itself and strive towards models of plausibility that would go beyond probability as narrowly meant. This debate within law, in turn, has illustrious precedents: take Voltaire, who was critical of the application of probability even to litigation in civil cases; or take Boole, who was a starry-eyed believer in probability applications to judicial decision making (Rosoni 1995). Not unlike Boole, the founding father of computing, computer scientists approaching the field nowadays may happen to do so without full awareness of the pitfalls. Hence the usefulness of the conceptual landscape I sketch here.


In judicial decision making, the doctrine of chances takes explicitly into account the odds. There is more to forensic statistics, as well as various probabilistic approaches, which taken together form the object of an enduring controversy in the scholarship of legal evidence. In this paper, I reconsider the circumstances of the Jama murder and inquiry (dealt with in Part I of this paper: 'The JAMA Model and Narrative Interpretation Patterns'), to illustrate yet another kind of probability or improbability. What is improbable about the Jama story is actually a given, which contributes in terms of dramatic underlining. In literary theory, concepts of narratives being probable or improbable date back from the eighteenth century, when both prescientific and scientific probability were infiltrating several domains, including law. An understanding of such a backdrop throughout the history of ideas is, I claim, necessary for Artificial Intelligence (AI) researchers who may be tempted to apply statistical methods to legal evidence. The debate for or against probability (and especially Bayesian probability) in accounts of evidence has been flourishing among legal scholars; nowadays both the Bayesians (e.g. Peter Tillers) and the Bayesio-skeptics (e.g. Ron Allen), among those legal scholars who are involved in the controversy, are willing to give AI research a chance to prove itself and strive towards models of plausibility that would go beyond probability as narrowly meant. This debate within law, in turn, has illustrious precedents: take Voltaire, he was critical of the application of probability even to litigation in civil cases; take Boole, he was a starry-eyed believer in probability applications to judicial decision making. Not unlike Boole, the founding father of computing, nowadays computer scientists approaching the field may happen to do so without full awareness of the pitfalls. Hence, the usefulness of the conceptual landscape I sketch here.
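The "odds" reasoning of the doctrine of chances can be made concrete with Bayes' rule in odds form: posterior odds = prior odds × likelihood ratio. The numbers below are invented purely for illustration and do not refer to the Jama case:

```python
# Bayes' rule in odds form, as used in discussions of legal evidence.

def posterior_odds(prior_odds, p_evidence_if_guilty, p_evidence_if_innocent):
    """Update prior odds of guilt by the likelihood ratio of the evidence."""
    likelihood_ratio = p_evidence_if_guilty / p_evidence_if_innocent
    return prior_odds * likelihood_ratio

# Hypothetical figures: prior odds of 1:1000; the evidence is 100x more
# probable under guilt (0.99) than under innocence (0.0099).
odds = posterior_odds(1 / 1000, 0.99, 0.0099)
prob = odds / (1 + odds)              # convert odds back to a probability
print(round(odds, 3), round(prob, 4)) # 0.1 0.0909
```

Even strongly diagnostic evidence leaves a small posterior probability when the prior is low, which is the kind of result that fuels the Bayesian versus Bayesio-skeptic debate the abstract describes.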


The Digital Art Weeks program (DAW06) is concerned with the application of digital technology in the arts. Consisting again this year of a symposium, workshops and performances, the program offers insight into current research and innovations in art and technology, and illustrates the resulting synergies in a series of performances, making artists aware of new impulses in technology and scientists aware of the possibilities of applying technology in the arts.


Purpose. To examine the thermal transition(s) between different polymorphic forms of Nifedipine and to define experimental conditions that lead to the generation of polymorph IV. Methods. Experiments were performed using a DSC 823e (Mettler Toledo). Nifedipine exists in four polymorphic forms, as well as an amorphous state. Nifedipine was examined using the following method: cycle 1: 25ºC to 190ºC, 190ºC to 25ºC (formation of amorphous Nifedipine); cycle 2: 25ºC to X (X = 60, 70, 80 ... 150ºC), X to 25ºC; cycle 3: 25ºC to 190ºC; holding isothermally for 5 min between cycles (heating/cooling rate of 10ºC/min). Results. Amorphous Nifedipine can sustain heating up to 90ºC without significant changes in its composition. Cycle 2 of amorphous material heated up to 90ºC shows only the glass transition at ~44ºC. In cycle 3 of the same material, a glass transition was recorded at ~44ºC, followed by two exotherms (~100ºC and ~115ºC; crystallisation of polymorphs III and II, respectively) and an endotherm (169ºC; melting of polymorphs I/II). Samples that were heated to temperatures between 100ºC and 120ºC in the second cycle showed a glass transition at ~44ºC and an additional exotherm at ~95ºC (crystallisation of polymorph III); on cooling, an exotherm was observed at ~40ºC (crystallisation of polymorph IV). The same material showed no glass transition in cycle 3, but an endotherm at around 62ºC (melting of polymorph IV), an exotherm (~98ºC) and an endotherm (169ºC; melting of polymorphs I/II). Heating the sample to temperatures greater than 130ºC in cycle 2 results in a glass transition at ~44ºC and two exotherms at ~102ºC and 125ºC (crystallisation of polymorphs III and I, respectively). Conclusions. The DSC data suggest that polymorph IV can only be produced from amorphous or polymorph III samples. The presence of polymorph I or II drives the conversion of the less stable polymorphic form IV into the most stable form, I.
Although form IV of Nifedipine can easily be created under defined experimental conditions, it may only coexist with the amorphous or polymorph III states. When polymorphs I and II are present in the sample, polymorph IV cannot be detected.
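The cycling protocol above can be written down as a temperature program and its segment timings computed from the stated 10ºC/min heating/cooling rate with a 5 min isothermal hold between cycles. This is an illustrative sketch, not instrument control code:

```python
# Compute ramp durations for the DSC temperature program described above.

RATE = 10.0          # degrees C per minute (heating and cooling)
HOLD = 5.0           # minutes, isothermal hold between cycles

def segment_minutes(t_start, t_end):
    """Time to ramp between two temperatures at the fixed rate."""
    return abs(t_end - t_start) / RATE

# Cycle 2 with an upper limit X = 90 C, as in the text: heat 25->90, cool 90->25.
cycle2 = [(25, 90), (90, 25)]
minutes = sum(segment_minutes(a, b) for a, b in cycle2)
print(minutes)       # 13.0 minutes of ramping for this cycle
```

The same arithmetic gives 16.5 min for each 25ºC to 190ºC ramp of cycles 1 and 3, which makes clear how long the sample spends at elevated temperature between the transitions the thermograms record.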


Purpose: Nicardipine is a member of a family of calcium channel blockers named dihydropyridines, which are known to be photolabile and may cause phototoxicity. It is therefore vital to develop an analytical method which can study the photodegradation of nicardipine. Method: Forced acid degradation of nicardipine was conducted by heating 12 ml of 1 mg/ml nicardipine with 3 ml of 2.5 M HCl for two hours. A gradient HPLC method was developed using an Agilent Technologies 1200 series quaternary system. Separation was achieved with a Hichrome (250 x 4.6 mm) 5 μm C18 reversed-phase column and a mobile phase composition of 70% A (100% v/v water) and 30% B (99% v/v acetonitrile + 1% v/v formic acid) at time zero; the composition was then changed to 60% v/v A; 40% v/v B at 10 minutes, 50% v/v A; 50% v/v B at 30 minutes, and 70% v/v A; 30% v/v B at 35 minutes. 20 μl of 0.8 mg/ml nicardipine degradation mixture was injected at room temperature (25ºC). The gradient method was transferred onto an HPLC-ESI-MS system (HP 1050 series - AQUAMAX mass detector) and analysis was conducted with an acid degradation concentration of 0.25 mg/ml and a 20 μl injection volume. ESI spectra were acquired in positive ionisation mode (MRM, 0-600 m/z). Results: Eleven nicardipine degradation products were detected in the HPLC analysis, and the resolutions (RS) between the respective degradants were 1.0, 1.2, 6.0, 0.4, 1.7, 3.7, 1.8, 1.0, and 1.7 respectively. Nine degradation products were identified in the ESI spectra with the respective m/z ratios 171.0, 166.1, 441.2, 423.2, 455.2, 455.2, 331.1, 273.1, and 290.1. The possible molecular formulae for each degradant were unambiguously determined. Conclusion: A sensitive and specific method was developed for the analysis of nicardipine degradants. The method enables detection and quantification of nicardipine degradation products and can be used to study the kinetics of nicardipine degradation processes.
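The gradient timetable stated in the method can be expressed as (time, %B) points, with the composition changing linearly between points. This is a sketch of the stated program only, not vendor software; %A is simply 100 minus %B:

```python
# Mobile phase %B (acetonitrile + 1% formic acid) over the stated gradient.

GRADIENT = [(0, 30), (10, 40), (30, 50), (35, 30)]   # (minutes, %B)

def percent_b(t):
    """Linearly interpolate %B at time t within the gradient program."""
    for (t0, b0), (t1, b1) in zip(GRADIENT, GRADIENT[1:]):
        if t0 <= t <= t1:
            return b0 + (b1 - b0) * (t - t0) / (t1 - t0)
    raise ValueError("time outside gradient program")

print(percent_b(5), percent_b(20), percent_b(32.5))   # 35.0 45.0 40.0
```

Writing the program this way makes the organic ramp explicit: 1 %B/min up to 10 minutes, 0.5 %B/min to 30 minutes, then a fast 4 %B/min return to initial conditions.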