985 results for Scientists


Relevance: 10.00%

Abstract:

BACKGROUND: Scientists rarely reuse expert knowledge of phylogeny, in spite of years of effort to assemble a great "Tree of Life" (ToL). A notable exception involves the use of Phylomatic, which provides tools to generate custom phylogenies from a large, pre-computed, expert phylogeny of plant taxa. This suggests great potential for a more generalized system that, starting with a query consisting of a list of any known species, would rectify non-standard names, identify expert phylogenies containing the implicated taxa, prune away unneeded parts, and supply branch lengths and annotations, resulting in a custom phylogeny suited to the user's needs. Such a system could become a sustainable community resource if implemented as a distributed system of loosely coupled parts that interact through clearly defined interfaces. RESULTS: With the aim of building such a "phylotastic" system, the NESCent Hackathons, Interoperability, Phylogenies (HIP) working group recruited 2 dozen scientist-programmers to a weeklong programming hackathon in June 2012. During the hackathon (and a three-month follow-up period), 5 teams produced designs, implementations, documentation, presentations, and tests including: (1) a generalized scheme for integrating components; (2) proof-of-concept pruners and controllers; (3) a meta-API for taxonomic name resolution services; (4) a system for storing, finding, and retrieving phylogenies using semantic web technologies for data exchange, storage, and querying; (5) an innovative new service, DateLife.org, which synthesizes pre-computed, time-calibrated phylogenies to assign ages to nodes; and (6) demonstration projects. These outcomes are accessible via a public code repository (GitHub.com), a website (http://www.phylotastic.org), and a server image. CONCLUSIONS: Approximately 9 person-months of effort (centered on a software development hackathon) resulted in the design and implementation of proof-of-concept software for 4 core phylotastic components, 3 controllers, and 3 end-user demonstration tools. While these products have substantial limitations, they suggest considerable potential for a distributed system that makes phylogenetic knowledge readily accessible in computable form. Widespread use of phylotastic systems will create an electronic marketplace for sharing phylogenetic knowledge that will spur innovation in other areas of the ToL enterprise, such as annotation of sources and methods and third-party methods of quality assessment.
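
The pruning step described above, reducing a large expert tree to just the taxa in a user's query, can be illustrated with a minimal sketch. The Node class and prune() helper below are hypothetical stand-ins, not part of any actual Phylotastic component's API.

```python
# Hypothetical illustration of "prune away unneeded parts": keep only the
# query taxa and collapse internal nodes left with a single child, summing
# branch lengths so path lengths between retained leaves are preserved.

class Node:
    def __init__(self, name=None, children=None, length=1.0):
        self.name = name                # taxon label (leaves) or None
        self.children = children or []
        self.length = length            # branch length to parent

def prune(node, keep):
    """Return a copy of the subtree containing only leaves named in `keep`."""
    if not node.children:               # leaf: keep it or drop it
        return Node(node.name, length=node.length) if node.name in keep else None
    kept = [c for c in (prune(c, keep) for c in node.children) if c]
    if not kept:
        return None
    if len(kept) == 1:                  # collapse a unary node into its child
        child = kept[0]
        child.length += node.length
        return child
    return Node(node.name, kept, node.length)

# Example query over a 4-taxon tree:
tree = Node(children=[
    Node(children=[Node("Quercus_robur"), Node("Fagus_sylvatica")]),
    Node(children=[Node("Pinus_pinea"), Node("Abies_alba")]),
])
subtree = prune(tree, {"Quercus_robur", "Pinus_pinea"})
```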

Relevance: 10.00%

Abstract:

We expect scientists to follow a code of honor and conduct and to report their research honestly and accurately, but so-called scientific misconduct, which includes plagiarism, faked data, and altered images, has led to a tenfold increase in the number of retractions over the past decade. Among the reasons for this troubling upsurge is increased competition for journal placement, grant money, and prestigious appointments. The solutions are not easy, but reform and greater vigilance are needed.

Relevance: 10.00%

Abstract:

Family dogs and dog owners offer a potentially powerful way to conduct citizen science to answer questions about animal behavior that are difficult to answer with more conventional approaches. Here we evaluate the quality of the first data on dog cognition collected by citizen scientists using the Dognition.com website. We conducted analyses to understand if data generated by over 500 citizen scientists replicates internally and in comparison to previously published findings. Half of participants participated for free while the other half paid for access. The website provided each participant a temperament questionnaire and instructions on how to conduct a series of ten cognitive tests. Participation required internet access, a dog and some common household items. Participants could record their responses on any PC, tablet or smartphone from anywhere in the world and data were retained on servers. Results from citizen scientists and their dogs replicated a number of previously described phenomena from conventional lab-based research. There was little evidence that citizen scientists manipulated their results. To illustrate the potential uses of relatively large samples of citizen science data, we then used factor analysis to examine individual differences across the cognitive tasks. The data were best explained by multiple factors in support of the hypothesis that nonhumans, including dogs, can evolve multiple cognitive domains that vary independently. This analysis suggests that in the future, citizen scientists will generate useful datasets that test hypotheses and answer questions as a complement to conventional laboratory techniques used to study dog psychology.
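
As a sketch of the factor-analytic step described above, the snippet below fits a latent-factor model to a matrix of per-dog task scores with scikit-learn. The data, task count, and two-factor choice are placeholders, not the study's actual variables or model-selection procedure.

```python
# Exploratory factor analysis over citizen-science task scores (illustrative).
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
scores = rng.normal(size=(500, 10))    # placeholder: 500 dogs x 10 tasks

fa = FactorAnalysis(n_components=2, random_state=0)
factor_scores = fa.fit_transform(scores)   # per-dog scores on each factor
loadings = fa.components_.T                # task-by-factor loadings
print(loadings.round(2))
```

In practice the number of factors would be chosen by comparing model fit, and the loading pattern inspected for interpretable cognitive domains.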

Relevance: 10.00%

Abstract:

In this review, we discuss recent work by the ENIGMA Consortium (http://enigma.ini.usc.edu) - a global alliance of over 500 scientists spread across 200 institutions in 35 countries collectively analyzing brain imaging, clinical, and genetic data. Initially formed to detect genetic influences on brain measures, ENIGMA has grown to over 30 working groups studying 12 major brain diseases by pooling and comparing brain data. In some of the largest neuroimaging studies to date - of schizophrenia and major depression - ENIGMA has found replicable disease effects on the brain that are consistent worldwide, as well as factors that modulate disease effects. In partnership with other consortia including ADNI, CHARGE, IMAGEN and others [1], ENIGMA's genomic screens - now numbering over 30,000 MRI scans - have revealed at least 8 genetic loci that affect brain volumes. Downstream of gene findings, ENIGMA has revealed how these individual variants - and genetic variants in general - may affect both the brain and risk for a range of diseases. The ENIGMA consortium is discovering factors that consistently affect brain structure and function that will serve as future predictors linking individual brain scans and genomic data. It is generating vast pools of normative data on brain measures - from tens of thousands of people - that may help detect deviations from normal development or aging in specific groups of subjects. We discuss challenges and opportunities in applying these predictors to individual subjects and new cohorts, as well as lessons we have learned in ENIGMA's efforts so far.
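
One simple reading of "detecting deviations from normal development or aging" is a normative z-score: comparing an individual's brain measure against a large reference distribution. The toy sketch below is purely illustrative and is not ENIGMA's actual normative-modelling pipeline.

```python
# Toy normative-deviation check for a single regional volume (illustrative).
import numpy as np

normative = np.random.default_rng(1).normal(4000.0, 300.0, size=20000)  # mm^3
subject_volume = 3100.0

z = (subject_volume - normative.mean()) / normative.std()
print(f"z = {z:.2f}")   # |z| well beyond 2 would flag an unusual volume
```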

Relevance: 10.00%

Abstract:

The phrase “not much mathematics required” can imply a variety of skill levels. When this phrase is applied to computer scientists, software engineers, and clients in the area of formal specification, the word “much” can be widely misinterpreted, with disastrous consequences. A small experiment in reading specifications revealed that students already trained in discrete mathematics and the specification notation performed very poorly, much worse than could reasonably be expected if formal methods proponents are to be believed.

Relevance: 10.00%

Abstract:

In judicial decision making, the doctrine of chances explicitly takes the odds into account. There is more to forensic statistics, as well as various probabilistic approaches, which taken together form the object of an enduring controversy in the scholarship of legal evidence. In this paper, we reconsider the circumstances of the Jama murder and inquiry (dealt with in Part I of this paper: "The Jama Model. On Legal Narratives and Interpretation Patterns"), to illustrate yet another kind of probability or improbability. What is improbable about the Jama story is actually a given, which contributes in terms of dramatic underlining. In literary theory, concepts of narratives being probable or improbable date back to the eighteenth century, when both prescientific and scientific probability were infiltrating several domains, including law. An understanding of such a backdrop throughout the history of ideas is, I claim, necessary for AI researchers who may be tempted to apply statistical methods to legal evidence. The debate for or against probability (and especially Bayesian probability) in accounts of evidence has been flourishing among legal scholars. Nowadays both the Bayesians (e.g. Peter Tillers) and the Bayesio-skeptics (e.g. Ron Allen), among those legal scholars who are involved in the controversy, are willing to give AI research a chance to prove itself and strive towards models of plausibility that would go beyond probability as narrowly meant. This debate within law, in turn, has illustrious precedents: take Voltaire, who was critical of the application of probability even to litigation in civil cases; or take Boole, who was a starry-eyed believer in probability applications to judicial decision making (Rosoni 1995). Not unlike Boole, the founding father of computing, computer scientists approaching the field nowadays may do so without full awareness of the pitfalls. Hence the usefulness of the conceptual landscape I sketch here.

Relevance: 10.00%

Abstract:

In judicial decision making, the doctrine of chances explicitly takes the odds into account. There is more to forensic statistics, as well as various probabilistic approaches, which taken together form the object of an enduring controversy in the scholarship of legal evidence. In this paper, I reconsider the circumstances of the Jama murder and inquiry (dealt with in Part I of this paper: 'The JAMA Model and Narrative Interpretation Patterns'), to illustrate yet another kind of probability or improbability. What is improbable about the Jama story is actually a given, which contributes in terms of dramatic underlining. In literary theory, concepts of narratives being probable or improbable date back to the eighteenth century, when both prescientific and scientific probability were infiltrating several domains, including law. An understanding of such a backdrop throughout the history of ideas is, I claim, necessary for Artificial Intelligence (AI) researchers who may be tempted to apply statistical methods to legal evidence. The debate for or against probability (and especially Bayesian probability) in accounts of evidence has been flourishing among legal scholars; nowadays both the Bayesians (e.g. Peter Tillers) and the Bayesio-skeptics (e.g. Ron Allen), among those legal scholars who are involved in the controversy, are willing to give AI research a chance to prove itself and strive towards models of plausibility that would go beyond probability as narrowly meant. This debate within law, in turn, has illustrious precedents: take Voltaire, who was critical of the application of probability even to litigation in civil cases; or take Boole, who was a starry-eyed believer in probability applications to judicial decision making. Not unlike Boole, the founding father of computing, computer scientists approaching the field nowadays may do so without full awareness of the pitfalls. Hence the usefulness of the conceptual landscape I sketch here.
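
For readers tempted to apply such methods, the Bayesian position in forensic statistics is usually stated in likelihood-ratio form. The formulation below is the standard textbook identity, not an analysis of the Jama case itself:

```latex
\frac{P(H_p \mid E)}{P(H_d \mid E)}
  = \underbrace{\frac{P(E \mid H_p)}{P(E \mid H_d)}}_{\text{likelihood ratio}}
  \times \frac{P(H_p)}{P(H_d)}
```

Here H_p and H_d are the prosecution and defence hypotheses and E is the evidence: the court's prior odds are updated by the likelihood ratio of the evidence. A recurring Bayesio-skeptic objection is that prior odds and likelihoods for singular narrative events resist principled quantification.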

Relevance: 10.00%

Abstract:

The Digital Art Weeks PROGRAM (DAW06) is concerned with the application of digital technology in the arts. Consisting again this year of a symposium, workshops, and performances, the program offers insight into current research and innovations in art and technology, and illustrates the resulting synergies in a series of performances, making artists aware of impulses in technology and scientists aware of the possibilities of applying technology in the arts.

Relevance: 10.00%

Abstract:

Purpose. To examine the thermal transition(s) between different polymorphic forms of nifedipine and to define experimental conditions that lead to the generation of polymorph IV. Methods. Experiments were performed using a DSC 823e (Mettler Toledo). Nifedipine exists in four polymorphic forms, as well as an amorphous state. Nifedipine was examined using the following cycles (heating/cooling rate of 10°C/min, holding isothermally for 5 min between cycles): cycle 1: 25°C to 190°C, then 190°C to 25°C (formation of amorphous nifedipine); cycle 2: 25°C to X (X = 60, 70, 80 ... 150°C), then X to 25°C; cycle 3: 25°C to 190°C. Results. Amorphous nifedipine can sustain heating up to 90°C without significant changes in its composition. Cycle 2 of amorphous material heated up to 90°C shows only the glass transition at ~44°C. In cycle 3 of the same material, a glass transition was recorded at ~44°C, followed by two exotherms at ~100°C and ~115°C (crystallisation of polymorphs III and II, respectively) and an endotherm at 169°C (melting of polymorphs I/II). Samples heated to temperatures between 100°C and 120°C in the second cycle showed a glass transition at ~44°C and an additional exotherm at ~95°C (crystallisation of polymorph III); on cooling, an exotherm was observed at ~40°C (crystallisation of polymorph IV). The same material showed no glass transition in cycle 3, but an endotherm at around 62°C (melting of polymorph IV), an exotherm (~98°C), and an endotherm at 169°C (melting of polymorphs I/II). Heating the sample to temperatures greater than 130°C in cycle 2 results in a glass transition at ~44°C and two exotherms at ~102°C and ~125°C (crystallisation of polymorphs III and I, respectively). Conclusions. The DSC data suggest that polymorph IV can only be produced from amorphous or polymorph III samples. The presence of polymorph I or II drives the conversion of the less stable form IV into the most stable form, I. Although form IV of nifedipine can easily be created under defined experimental conditions, it can only coexist with the amorphous or polymorph III states. When polymorph I or II is present in the sample, polymorph IV cannot be detected.
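
The cycling protocol above can be encoded compactly as lists of temperature segments; the helper below is a hypothetical sketch, not the instrument's actual control-script format.

```python
# Hypothetical encoding of the DSC temperature programmes described above,
# all at 10 degC/min; not the Mettler Toledo instrument's script format.
RATE = 10.0  # degC/min

def cycle(*setpoints):
    """Expand a list of temperature setpoints into (start, end, rate) ramps."""
    return [(a, b, RATE) for a, b in zip(setpoints, setpoints[1:])]

cycle1 = cycle(25, 190, 25)                              # form the amorphous state
cycle2 = [cycle(25, x, 25) for x in range(60, 151, 10)]  # X = 60, 70, ... 150 degC
cycle3 = cycle(25, 190)                                  # final heating to 190 degC
# 5 min isothermal holds separate consecutive cycles.
```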

Relevance: 10.00%

Abstract:

Purpose: Nicardipine is a member of a family of calcium channel blockers named dihydropyridines, which are known to be photolabile and may cause phototoxicity. It is therefore vital to develop an analytical method that can study the photodegradation of nicardipine. Method: Forced acid degradation of nicardipine was conducted by heating 12 ml of 1 mg/ml nicardipine with 3 ml of 2.5 M HCl for two hours. A gradient HPLC method was developed using an Agilent Technologies 1200 series quaternary system. Separation was achieved with a Hichrome (250 x 4.6 mm) 5 μm C18 reversed-phase column and a mobile phase of 70% v/v A (100% v/v water) and 30% v/v B (99% v/v acetonitrile + 1% v/v formic acid) at time zero; the composition was then changed to 60% v/v A / 40% v/v B at 10 minutes, 50% v/v A / 50% v/v B at 30 minutes, and 70% v/v A / 30% v/v B at 35 minutes. 20 μl of 0.8 mg/ml nicardipine degradation solution was injected at room temperature (25°C). The gradient method was transferred onto an HPLC-ESI-MS system (HP 1050 series with an AQUAMAX mass detector) and analysis was conducted with an acid degradation concentration of 0.25 mg/ml and a 20 μl injection volume. ESI spectra were acquired in positive ionisation mode (MRM, 0-600 m/z). Results: Eleven nicardipine degradation products were detected in the HPLC analysis, and the resolutions (RS) between the respective degradants were 1.0, 1.2, 6.0, 0.4, 1.7, 3.7, 1.8, 1.0, and 1.7. Nine degradation products were identified in the ESI spectra, with respective m/z ratios of 171.0, 166.1, 441.2, 423.2, 455.2, 455.2, 331.1, 273.1, and 290.1. The possible molecular formulae for each degradant were only ambiguously determined. Conclusion: A sensitive and specific method was developed for the analysis of nicardipine degradants. The method enables detection and quantification of nicardipine degradation products and can be used to study the kinetics of nicardipine degradation processes.
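
Gradient controllers interpolate linearly between timetable breakpoints, so the programme above fully determines the mobile-phase composition at any time; the helper below (illustrative names) reproduces it.

```python
# The reported gradient timetable: %B (acetonitrile + formic acid) over time.
import numpy as np

time_min  = [0.0, 10.0, 30.0, 35.0]   # breakpoints from the method above
percent_b = [30.0, 40.0, 50.0, 30.0]  # % mobile phase B at each breakpoint

def composition_b(t):
    """%B at time t (minutes), assuming linear interpolation between steps."""
    return float(np.interp(t, time_min, percent_b))

print(composition_b(20.0))   # 45.0: halfway through the 10-30 min ramp
```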

Relevance: 10.00%

Abstract:

Purpose: To study the impact of powder flow properties on dosator filling systems, with particular focus on improvements in dose weight accuracy and repeatability. Method: This study evaluates a range of critical powder flow properties: flow function, cohesion, wall friction, adhesion to wall surfaces, density/compressibility data, stress ratio “K”, and gas permeability. The characterisation of the powders considered in this study was undertaken using an annular shear cell with a sample size of 0.5 litres. This tester also incorporated the facility to measure bed expansion during shear, in addition to contraction under consolidation forces. A modified Jenike-type linear wall friction tester was used to develop the failure loci for the powder samples in conjunction with multiple wall samples (representing a variety of material types and surface finishes). The ratio of applied normal stress to lateral stress was measured using a piece of test equipment specifically designed for the purpose. Results: The correct characterisation of powders and the incorporation of these data into the design of process equipment are recognised as critical for reliable and accurate operation. One example from this work is the stress ratio “K”, a characteristic that, despite its importance, is in many cases not well understood or correctly interpreted. Fig 1 [Omitted] illustrates a sample of test data: the slope of the line gives the stress ratio in a uniaxial compaction system, indicating the behaviour of the material under compaction during dosing processes. Conclusions: A correct assessment of the bulk powder properties for a given formulation can allow prediction of: cavity filling behaviour (and hence dosage); efficiency of release from the dosator; and strength and stability of the extruded dose en route to capsule filling. The effectiveness of dosator systems has been shown to be influenced by: bed pre-compaction history; gas permeability in the bed (with respect to local density effects); and friction effects for the materials of construction of dosators.
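
Since the stress ratio “K” is read off as the slope of lateral versus applied normal stress in uniaxial compaction, it can be estimated by a straight-line fit; the readings below are hypothetical, standing in for the omitted Fig 1.

```python
# Stress ratio K as the slope of lateral vs applied normal stress in a
# uniaxial compaction test. Readings are hypothetical (the paper's Fig 1
# is omitted above).
import numpy as np

normal_stress  = np.array([5.0, 10.0, 20.0, 40.0, 80.0])   # kPa, applied
lateral_stress = np.array([2.1,  4.0,  8.3, 16.1, 32.4])   # kPa, measured

K, intercept = np.polyfit(normal_stress, lateral_stress, 1)
print(f"K = {K:.2f}")   # about 0.40 for these illustrative readings
```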

Relevance: 10.00%

Abstract:

Purpose: To develop an improved mathematical model for the prediction of dose accuracy of dosators, based upon the geometry of the machine in conjunction with measured flow properties of the powder. Methods: A mathematical model has been created, based on an analytical method of differential slices incorporating measured flow properties. The key flow properties of interest in this investigation were: flow function, effective angle of wall friction, wall adhesion, bulk density, stress ratio K, and permeability. To simulate the real process and, very importantly, validate the model, a dosator test-rig was used to measure the forces acting on the dosator during the filling stage, the force required to eject the dose, and the dose weight. Results: Preliminary results were obtained from the dosator test-rig. Figure 1 [Omitted] shows the dose weight for different depths to the bottom of the powder bed at the end of the stroke and different levels of pre-compaction of the powder bed. A strong influence on dose weight, arising from the proximity between the dosator and the bottom of the powder bed at the end of the stroke and from the condition of the powder bed, has been established. Conclusions: The model will provide a useful tool to predict dosing accuracy and thus optimise the future design of dosator-based equipment, based on measured bulk properties of the powder to be handled. The condition of the powder bed and the clearance between the dosator and the bottom of the powder bed are further factors with a significant influence on dosator processes.
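
The method of differential slices classically leads to a Janssen-type stress profile along the powder column inside the dosator bore. The sketch below shows that generic textbook form under simple assumptions (constant bulk density, wall friction, and stress ratio K); it is not the authors' exact model, and all parameter values are placeholders.

```python
# Generic Janssen-type axial stress profile from the method of differential
# slices; a textbook form, not the authors' exact dosator model.
import math

def axial_stress(z, rho=600.0, g=9.81, D=0.003, mu_w=0.3, K=0.4, sigma0=0.0):
    """Axial stress (Pa) at depth z (m) in a bore of diameter D (m).
    rho: bulk density (kg/m^3); mu_w: wall friction coefficient;
    K: stress ratio; sigma0: surcharge stress at the powder surface."""
    a = 4.0 * mu_w * K / D
    saturation = rho * g / a            # asymptotic stress at large depth
    return saturation + (sigma0 - saturation) * math.exp(-a * z)

print(f"{axial_stress(0.01):.1f} Pa")   # stress 10 mm into the bore
```

The qualitative point is that wall friction makes the axial stress saturate with depth rather than grow hydrostatically, which is consistent with the observed sensitivity of dose weight to the clearance between the dosator and the bottom of the bed.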

Relevance: 10.00%

Abstract:

Purpose. To study the thermal stability of aspirin and define the thermal events associated with its thermal degradation. Methods. Experiments were performed using a DSC 823e (Mettler Toledo, Switzerland). Aspirin is prone to thermal degradation upon exposure to high temperatures; its melting point is 140.1±0.4°C (DSC). Aspirin was examined by heating samples to 120°C, 155°C, and 185°C, with subsequent cooling to -55°C and a final heating to 155°C. Although different heating and cooling rates were used, only results obtained at a rate of 10°C/min are presented. All runs were conducted in hermetically sealed pans. Results. Upon heating the sample to 120°C, no significant thermal event was detected. After cooling the sample and reheating, a glass transition was observed at ~-8°C, followed by the melting of aspirin at ~139°C. On heating the sample to 155°C, melting of aspirin was detected at ~139°C; on cooling and subsequent heating, a glass transition occurred at ~-32°C, followed by a broad crystallisation (onset at ~38°C, peak maximum at ~57°C) and a broad melting event with an onset at 94°C and peak maximum at ~112°C. Finally, on heating the sample to 185°C, melting at ~139°C was observed; upon cooling and reheating, a glass transition was detected at ~-26°C and no further events were recorded. Conclusions. This research demonstrates that the degradation steps of aspirin depend on the thermal treatment. Although the main degradation products of the different thermal treatments are currently unknown, it is clear that acetic acid, one of the degradation products, acts as a plasticiser by lowering the glass transition temperature. In addition, because the degradation products are present in liquid form (observed by hot-stage microscopy), aspirin is still present in the sample; it recrystallises during the second heating step and melts at much lower temperatures.
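
The plasticising effect described in the conclusions can be estimated with the Gordon-Taylor equation for the glass transition of a mixture. The sketch below uses an aspirin Tg of about -8°C, consistent with the data above; the acetic acid Tg and the fitting constant k are illustrative assumptions, not fitted values.

```python
# Gordon-Taylor estimate of Tg depression by a liquid degradation product
# (acetic acid). tg2 and k are illustrative assumptions, not fitted values.
def gordon_taylor(w2, tg1=265.0, tg2=180.0, k=0.3):
    """Mixture Tg (K): tg1 = amorphous aspirin (~ -8 degC, per the data
    above), tg2 = plasticiser, w2 = plasticiser weight fraction."""
    w1 = 1.0 - w2
    return (w1 * tg1 + k * w2 * tg2) / (w1 + k * w2)

for w2 in (0.00, 0.05, 0.10):
    print(f"w2 = {w2:.2f}: Tg = {gordon_taylor(w2) - 273.15:.1f} degC")
```

Even a few percent of a low-Tg liquid depresses the mixture Tg, in line with the progressively lower glass transitions observed after more aggressive thermal treatments.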

Relevance: 10.00%

Abstract:

Although some countries plan to build new nuclear power plants in the near future, in aggregate the data indicate that nuclear power's influence will continue to dwindle across the globe in the coming decades.

Relevance: 10.00%

Abstract:

This document provides details of the transfer of the Norman Holme archive data held in the National Marine Biological Library onto a modern database, specifically Marine Recorder. A key part of the creation of the database was the retrieval of a large amount of information recorded in field notebooks and on loosely bound sheets of paper. As this work involved amending, interpreting, and updating the available information, it was felt that an accurate record of the process should exist so that future scientists can clearly link the modern database to the archive material. This document also provides details of the external information sources that were used to enhance and qualify the historical interpretation, such as estimates of volumes and species abundances.