984 results for commons based peer production


Relevance: 30.00%

Abstract:

Escherichia coli-based bioreporters for arsenic detection are typically based on the natural feedback loop that controls ars operon transcription. Feedback loops are known to show a wide-range linear response at the expense of the overall amplification of the incoming signal. While this is a favourable feature for controlling arsenic detoxification in the cell, a feedback loop is not necessarily optimal for obtaining the highest sensitivity and response in a designed cellular reporter for arsenic detection. Here we systematically explore the effects of uncoupling the topology of the arsenic-sensing circuitry on the reporter signal developed as a function of the arsenite concentration input. A model was developed to describe relative ArsR and GFP levels in feedback and uncoupled circuitry, and was used to explore new ArsR-based synthetic circuits. The expression of arsR was then placed under the control of a series of constitutive promoters that differed in promoter strength and could be further modulated by TetR repression. Expression of the reporter gene was maintained under the ArsR-controlled Pars promoter. ArsR expression in these systems was measured using ArsR-mCherry fusion proteins. We find that stronger constitutive ArsR production decreases arsenite-dependent EGFP output from Pars, and vice versa. This leads to a tunable series of arsenite-dependent EGFP outputs in a variety of systematically characterized circuitries. The higher expression levels and sensitivities of the response curves in the uncoupled circuits may be useful for improving field-test assays using arsenic bioreporters.
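As a rough illustration of the feedback-versus-uncoupled comparison described above (not the authors' published model), the following minimal ODE sketch assumes Hill-type repression of Pars by ArsR that is relieved when arsenite sequesters ArsR; all rate constants are invented for clarity.

```python
# Minimal, illustrative ODE sketch (not the authors' published model) of an
# ArsR/EGFP reporter in feedback vs. uncoupled topologies. The Hill-type Pars
# repression term and every parameter value are assumptions.
import numpy as np
from scipy.integrate import odeint

def pars_activity(arsR, arsenite, K_r=1.0, K_as=5.0, n=2):
    """Fraction of Pars activity: ArsR represses Pars, arsenite sequesters ArsR."""
    free_arsR = arsR / (1.0 + (arsenite / K_as) ** n)   # arsenite-bound ArsR is inactive
    return 1.0 / (1.0 + (free_arsR / K_r) ** n)

def feedback(y, t, arsenite):
    arsR, gfp = y
    p = pars_activity(arsR, arsenite)
    return [10.0 * p - 0.1 * arsR,     # arsR transcribed from Pars (feedback loop)
            50.0 * p - 0.05 * gfp]     # egfp transcribed from the same Pars

def uncoupled(y, t, arsenite, k_const=2.0):
    arsR, gfp = y
    p = pars_activity(arsR, arsenite)
    return [k_const - 0.1 * arsR,      # arsR from a constitutive promoter
            50.0 * p - 0.05 * gfp]     # egfp still under ArsR-controlled Pars

t = np.linspace(0, 300, 600)
for topology in (feedback, uncoupled):
    gfp_end = [odeint(topology, [0.0, 0.0], t, args=(a,))[-1, 1] for a in (0, 1, 5, 10)]
    print(topology.__name__, [round(g, 1) for g in gfp_end])
```

With these made-up numbers the uncoupled circuit shows a larger fold-change in EGFP across the arsenite range, which is the qualitative behaviour the abstract describes.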

Relevance: 30.00%

Abstract:

I joined Prof. McCammon's research group (University of California San Diego) as a postdoctoral researcher with a Beatriu de Pinós fellowship on 1 December 2010, and carried out my research there until 1 April 2012. Prof. McCammon is a world reference in the application of molecular dynamics (MD) simulations to biological systems of human interest. His most important contribution to the simulation of biological systems is the development of the accelerated molecular dynamics (AMD) method. Conventional MD simulations, which are limited to the nanosecond timescale (~10⁻⁹ s), are not suitable for studying biological systems relevant at longer timescales (µs, ms, ...). AMD makes it possible to explore rare molecular events that are key to understanding many biological systems and that could not otherwise be observed. During my stay at the University of California San Diego, I worked on different applications of AMD simulations, including photochemistry and computer-aided drug design. Specifically, I first successfully developed a combination of AMD and Car-Parrinello simulations to improve the exploration of deactivation pathways (conical intersections) in photoactivated chemical reactions. Second, I applied statistical techniques (Replica Exchange) together with AMD to describe protein-ligand interactions. Finally, I carried out a computer-aided drug design study on the Rho G protein (involved in the development of human cancer), combining structural analyses and AMD simulations. The projects in which I participated have been published (or are still under review) in several scientific journals and have been presented at international conferences. The report that follows contains further details of each project.
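The AMD method referenced above adds a boost potential whenever the instantaneous potential energy falls below a threshold, flattening energy basins so rare transitions happen more often. The sketch below shows the standard boost formula; the threshold E and acceleration parameter α are illustrative numbers, not values from this report.

```python
# Illustrative sketch of the standard accelerated-MD boost potential: when the
# instantaneous potential energy V falls below a threshold E, the boost
#   dV = (E - V)^2 / (alpha + E - V)
# is added to the potential. E and alpha below are arbitrary illustrative
# values, not parameters used in the work described above.

def amd_boost(V, E=100.0, alpha=20.0):
    """Return the boost energy added to a conformation with potential energy V."""
    if V >= E:
        return 0.0                               # above the threshold: no acceleration
    return (E - V) ** 2 / (alpha + E - V)        # smooth boost that grows as V drops

for V in (120.0, 100.0, 80.0, 40.0):
    print(V, round(amd_boost(V), 2))
```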

Relevance: 30.00%

Abstract:

Nanomotors are nanoscale devices capable of converting energy into movement and forces. Among them, self-propelled nanomotors offer considerable promise for developing novel bioanalytical and biosensing strategies based on the direct isolation of target biomolecules or on changes in their movement in the presence of target analytes. The main achievement of this project is the development of receptor-functionalized nanomotors that offer direct and rapid target detection, isolation and transport from raw biological samples without preparatory and washing steps. For example, microtube engines functionalized with aptamer, antibody, lectin and enzyme receptors were used for the direct isolation of analytes of biomedical interest, including proteins and whole cells, among others. A target protein was also isolated from a complex sample by using an antigen-functionalized microengine navigating through the reservoirs of a lab-on-a-chip device. The new nanomotor-based biomarker detection strategy not only offers a highly sensitive, rapid, simple and low-cost alternative for the isolation and transport of target molecules, but also represents a new dimension of analytical information based on motion. The recognition events can easily be visualized by optical microscopy (without any sophisticated analytical instrument) to reveal the target's presence and concentration. The use of artificial nanomachines has proved useful not only for (bio)recognition and (bio)transport but also for the detection and remediation of environmental contamination. In this context, micromotors modified with a superhydrophobic layer effectively interacted with, captured, transported and removed oil droplets from oil-contaminated samples. Finally, a micromotor-based strategy for water-quality testing that mimics live-fish water-quality testing, based on changes in the propulsion behavior of artificial biocatalytic microswimmers in the presence of aquatic pollutants, was also developed. The attractive features of the new micromachine-based target isolation and signal transduction protocols developed in this project offer numerous potential applications in biomedical diagnostics, environmental monitoring, and forensic analysis.
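The pollutant-sensing mode described above reads out changes in micromotor speed. The sketch below is not the project's tracking software: it only illustrates, with hypothetical coordinates, frame rate and threshold, how a speed drop relative to a clean-water baseline could be flagged.

```python
# Minimal illustrative sketch of the motion-based readout described above:
# estimate a micromotor's mean speed from tracked (x, y) positions and flag a
# slowdown relative to a clean-water baseline. Coordinates, frame rate and the
# 30% threshold are hypothetical, not values from the project.
import numpy as np

def mean_speed(track_xy, fps=30.0):
    """track_xy: (N, 2) positions in micrometres, one row per video frame."""
    steps = np.diff(track_xy, axis=0)                     # displacement per frame
    return np.linalg.norm(steps, axis=1).mean() * fps     # mean speed in um/s

rng = np.random.default_rng(0)

def toy_track(drift_per_frame):                           # directed swim plus small jitter
    return np.cumsum(rng.normal(0.0, 0.1, (300, 2)), axis=0) + \
           np.outer(np.arange(300), [drift_per_frame, 0.0])

baseline = mean_speed(toy_track(0.6))                     # clean-water control
sample = mean_speed(toy_track(0.3))                       # suspected polluted sample
print("baseline %.1f um/s, sample %.1f um/s, flagged: %s"
      % (baseline, sample, sample < 0.7 * baseline))
```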

Relevance: 30.00%

Abstract:

Study carried out during a stay at the Institut für Komplexe Materialien, Leibniz-Institut für Festkörper- und Werkstoffforschung Dresden, between 2010 and 2011. The effect of processing conditions and the influence of alloying elements on the glass-forming ability, structure, and thermal and magnetic properties of Fe-based bulk metallic glasses and nanocrystalline materials were explored. The production of these materials in the form of ribbons about 20 µm thick has been widely studied, and they have been shown to display excellent properties as soft-magnetic materials. The general purpose of this project was to obtain optimal compositions with high glass-forming ability and excellent soft-magnetic properties combined with good mechanical properties. The project took as its starting point the [FeCoBSi]96Nb4 alloy, since it shows the best glass-forming ability together with high saturation magnetization and a low coercive field. The fundamental factors involved in the formation of the glassy state were studied. The above composition was varied by adding other elements in order to study how these new elements affect the properties, glass formation, and structure of the resulting alloys, with the aim of improving their magnetic properties and glass-forming ability. Among others, Zr, Mo, Y and Gd were used to improve glass formation, and Co and Ni to improve the high-temperature magnetic properties. The relationships between glass-forming ability and thermal stability, resistance to crystallization, and the structure of the alloy resulting after solidification were studied. For this study, the mechanisms that control the transformation and its kinetics were determined, as well as the phases formed during heat treatment, allowing the formulation of predictive models.
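The abstract does not specify how the transformation kinetics were extracted; as one common way such kinetics are analysed, the sketch below shows a Kissinger-type fit of calorimetry peak temperatures measured at several heating rates. The heating rates, peak temperatures and the resulting activation energy are made up for illustration and are not data from this project.

```python
# Illustrative Kissinger-type analysis of crystallization kinetics:
# ln(beta / Tp^2) vs 1/Tp is linear with slope -Ea/R, so the apparent
# activation energy Ea follows from DSC peak temperatures Tp measured at
# several heating rates beta. All numbers below are hypothetical.
import numpy as np

R = 8.314                                          # gas constant, J/(mol K)
beta = np.array([5.0, 10.0, 20.0, 40.0])           # heating rates, K/min (hypothetical)
Tp = np.array([823.0, 831.0, 840.0, 849.0])        # crystallization peak temps, K (hypothetical)

slope, intercept = np.polyfit(1.0 / Tp, np.log(beta / Tp**2), 1)
Ea = -slope * R / 1000.0                           # apparent activation energy, kJ/mol
print("apparent activation energy ~ %.0f kJ/mol" % Ea)
```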

Relevance: 30.00%

Abstract:

From the second quarter of the 1st century BC and, especially, during the reign of Augustus, a production system centred on wine-growing agricultural exploitation with a clearly commercial purpose developed in the ancient province of Tarraconensis. Most wine-producing settlements are located on the Catalan coast, sometimes associated with figlinae that manufactured the amphorae used to transport and trade the wine surplus. However, in the area of the Vallès Occidental and the Baix Llobregat there is a series of villae linked to wine production and amphora manufacture that have yielded very significant evidence of this area's contribution to the economic expansion of the province. The archaeological and archaeometric characterization of a large number of amphorae from several pottery workshops located in the Vallès Occidental and the Baix Llobregat, using various chemical, mineralogical and petrographic analytical techniques, has made it possible to establish which amphora types were manufactured in each workshop and how. Some of the technological processes of the operational chain have been identified: the selection and processing of the raw materials used to prepare the paste, generally sourced from the area around each production centre, and the shaping, drying and firing of the vessels. In some of the cases analysed, the types of containers imported to the settlement and their provenance have been identified. The integration of these results into the analytical database held by ERAAUB has made it possible to assess the degree of standardization of the technological processes in this area. The final comparison with the historical and archaeological data contributes to the archaeological knowledge of the wine amphorae of Tarraconensis and, through them, to the knowledge of the societies that manufactured, traded and used them.

Relevance: 30.00%

Abstract:

Some of the elements that characterize the globalization of food and agriculture are the industrialization and intensification of agriculture and the liberalization of agricultural markets, which favour the lengthening of the food chain and the homogenization of food habits (nutrition transition), among other impacts. As a result, the probability of food contamination has increased with the distance travelled and the number of "hands" that may touch the food (critical points); the nutritional quality of food has been reduced because of increased transport and the longer time from collection to consumption; and the number of food-related diseases due to changes in eating patterns has increased. In this context, different agencies and regulations are intended to ensure food safety at different levels; at the international level, for example, the Codex Alimentarius develops standards and regulations for the marketing of food in a global market. Although governments determine the legal framework, the food industry manages the safety of its products and thus develops its own standards for their marketing, such as the Good Agricultural Practices (GAP) programs. The participation of the private sector in the creation of regulatory standards strengthens the free trade of food products, favouring mostly large agribusiness companies. These standards are in most cases unattainable for small producers, and food safety regulations favour the displacement of the peasantry and increase concentration and control of the food system by industrial actors. Women in particular, who have traditionally been in charge of artisanal processing, may be more affected by these norms than men. In this project I analyse the impact of food safety norms on small farms, based on the case of artisanal production by women in Spain.

Relevance: 30.00%

Abstract:

The following article describes the experience of a peer-assessment learning methodology applied to the professionalization of social educators. At both school and university level there are numerous published experiences relating to peer-assessment methodology. In addition, several sections of the "Documentos profesionalizadores" justify training skills that should be present in professional practice (for example, in the definition, the code of ethics, and the catalogue of functions and competences of the social educator). In these pages we aim to present the experience of training future educators in aspects directly related to evaluation among social educators.

Relevance: 30.00%

Abstract:

This paper describes a failure alert system and a methodology for content reuse in a new instructional design system called InterMediActor (IMA). IMA provides an environment for instructional content design, production and reuse, and for student assessment based on content specification through a hierarchical structure of competences. The student assessment process and the information extraction process for content reuse are explained.
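IMA's own data model is not given here; as a rough illustration of what assessment over a hierarchical structure of competences can look like, the sketch below marks a competence as achieved when all of its sub-competences are achieved. All competence names and results are hypothetical.

```python
# Rough illustration (not IMA's actual data model) of assessment over a
# hierarchy of competences: a competence counts as achieved when all of its
# sub-competences are achieved; leaves carry directly assessed results.
# Every name and result below is hypothetical.
from dataclasses import dataclass, field

@dataclass
class Competence:
    name: str
    achieved: bool = False                       # used only for leaf competences
    children: list["Competence"] = field(default_factory=list)

    def is_achieved(self) -> bool:
        if not self.children:
            return self.achieved
        return all(child.is_achieved() for child in self.children)

course = Competence("Digital signal processing", children=[
    Competence("Sampling theory", achieved=True),
    Competence("Filter design", children=[
        Competence("FIR filters", achieved=True),
        Competence("IIR filters", achieved=False),
    ]),
])
print(course.is_achieved())                      # False until every leaf is achieved
```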

Relevance: 30.00%

Abstract:

Background: The COSMIN checklist is a tool for evaluating the methodological quality of studies on measurement properties of health-related patient-reported outcomes. The aim of this study is to determine the inter-rater agreement and reliability of each item score of the COSMIN checklist (n = 114). Methods: 75 articles evaluating measurement properties were randomly selected from the bibliographic database compiled by the Patient-Reported Outcome Measurement Group, Oxford, UK. Raters were asked to assess the methodological quality of three articles using the COSMIN checklist. In a one-way design, percentage agreement and intraclass kappa coefficients or quadratic-weighted kappa coefficients were calculated for each item. Results: 88 raters participated. Of the 75 selected articles, 26 were rated by four to six participants and 49 by two or three participants. Overall, percentage agreement was appropriate (68% of items reached more than 80% agreement), while the kappa coefficients for the COSMIN items were low (61% were below 0.40, 6% above 0.75). Reasons for low inter-rater agreement were the need for subjective judgement and raters being accustomed to different standards, terminology and definitions. Conclusions: The results indicate that raters often choose the same response option, but that at item level it is difficult to distinguish between articles. When using the COSMIN checklist in a systematic review, we recommend obtaining some training and experience, having the checklist completed by two independent raters, and reaching consensus on one final rating. The instructions for using the checklist have been improved.
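The two agreement statistics reported above can be reproduced in outline with standard tooling. The sketch below computes percentage agreement and quadratic-weighted kappa for one checklist item scored by two raters; the ratings are invented, not data from the study.

```python
# Minimal sketch of the two agreement statistics used above, for one COSMIN
# item scored by two raters on a 4-point scale. The ratings are invented.
import numpy as np
from sklearn.metrics import cohen_kappa_score

rater_a = np.array([4, 3, 4, 2, 4, 3, 1, 4, 3, 4])
rater_b = np.array([4, 3, 3, 2, 4, 4, 1, 4, 2, 4])

pct_agreement = np.mean(rater_a == rater_b)                         # raw agreement
kappa_w = cohen_kappa_score(rater_a, rater_b, weights="quadratic")  # chance-corrected

print(f"percentage agreement: {pct_agreement:.0%}, weighted kappa: {kappa_w:.2f}")
```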

Relevance: 30.00%

Abstract:

Conventional methods of gene prediction rely on the recognition of DNA-sequence signals, the coding potential or the comparison of a genomic sequence with a cDNA, EST, or protein database. Reasons for limited accuracy in many circumstances are species-specific training and the incompleteness of reference databases. Lately, comparative genome analysis has attracted increasing attention. Several analysis tools that are based on human/mouse comparisons are already available. Here, we present a program for the prediction of protein-coding genes, termed SGP-1 (Syntenic Gene Prediction), which is based on the similarity of homologous genomic sequences. In contrast to most existing tools, the accuracy of SGP-1 depends little on species-specific properties such as codon usage or the nucleotide distribution. SGP-1 may therefore be applied to nonstandard model organisms in vertebrates as well as in plants, without the need for extensive parameter training. In addition to predicting genes in large-scale genomic sequences, the program may be useful to validate gene structure annotations from databases. To this end, SGP-1 output also contains comparisons between predicted and annotated gene structures in HTML format. The program can be accessed via a Web server at http://soft.ice.mpg.de/sgp-1. The source code, written in ANSI C, is available on request from the authors.
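SGP-1's internals are not reproduced here; as a toy illustration of the underlying idea, i.e. using cross-species conservation of homologous genomic sequences to support candidate coding regions, the sketch below runs a sliding-window identity scan over a gapless toy alignment. The sequences, window size and identity threshold are made up.

```python
# Toy illustration of the idea behind syntenic gene prediction: regions of a
# genomic sequence that are highly conserved in a homologous sequence from
# another species are candidate coding exons. This is NOT the SGP-1 algorithm,
# just a sliding-window identity scan over a pre-aligned, gapless toy pair.

def conserved_windows(seq_a, seq_b, window=12, min_identity=0.9):
    """Return (start, end, identity) for windows above the identity threshold."""
    hits = []
    for start in range(len(seq_a) - window + 1):
        a, b = seq_a[start:start + window], seq_b[start:start + window]
        identity = sum(x == y for x, y in zip(a, b)) / window
        if identity >= min_identity:
            hits.append((start, start + window, identity))
    return hits

human = "ATGGCCATTGTAATGGGCCGCTGAAAGGGTGCCCGATAG"   # made-up sequences
mouse = "ATGGCCATCGTAATGGGCCGCTGAAAGGGTTCCCGATAG"
for start, end, ident in conserved_windows(human, mouse):
    print(f"{start:3d}-{end:3d}  identity {ident:.2f}")
```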

Relevance: 30.00%

Abstract:

The information provided by the alignment-independent GRid-INdependent Descriptors (GRIND) can be condensed by principal component analysis into a small number of principal properties (GRIND-PP), which are more suitable for describing molecular similarity. The objective of the present study is to optimize the diverse parameters involved in obtaining the GRIND-PP and to validate their suitability for applications requiring a biologically relevant description of molecular similarity. With this aim, GRIND-PP computed with a collection of diverse settings were used to carry out ligand-based virtual screening (LBVS) under standard conditions. The quality of the results obtained was remarkable and comparable with other LBVS methods, and their detailed statistical analysis allowed us to identify the method settings most determinant for the quality of the results, together with their optimum values. Remarkably, some of these optimum settings differ significantly from those used in previously published applications, revealing their unexplored potential. Their applicability to large compound databases was also explored by comparing the equivalence of the results obtained using either computed or projected principal properties. In general, the results of the study confirm the suitability of GRIND-PP for practical applications and provide useful hints about how they should be computed to obtain optimum results.
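Computing GRIND descriptors themselves requires dedicated software; the sketch below only illustrates the condensation-plus-similarity step described above, with a random matrix standing in for a real descriptor table and an arbitrary number of principal components.

```python
# Generic sketch of the two steps described above: condense a descriptor
# matrix with PCA and rank database compounds by similarity to a query in
# the principal-property space. The random matrix stands in for real GRIND
# descriptors, and 10 components is an arbitrary choice.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.metrics.pairwise import euclidean_distances

rng = np.random.default_rng(0)
descriptors = rng.normal(size=(500, 300))      # 500 compounds x 300 GRIND variables (fake)

pca = PCA(n_components=10).fit(descriptors)    # principal properties ("GRIND-PP")
pp = pca.transform(descriptors)

query = pp[:1]                                 # treat the first compound as the query
ranking = euclidean_distances(query, pp)[0].argsort()
print("nearest neighbours of compound 0:", ranking[1:6])
```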

Relevance: 30.00%

Abstract:

Background: Despite the continuous production of genome sequence for a number of organisms, reliable, comprehensive, and cost-effective gene prediction remains problematic. This is particularly true for genomes for which there is not a large collection of known gene sequences, such as the recently published chicken genome. We used the chicken sequence to test comparative and homology-based gene-finding methods followed by experimental validation as an effective genome annotation method. Results: We performed experimental evaluation by RT-PCR of three different computational gene finders, Ensembl, SGP2 and TWINSCAN, applied to the chicken genome. A Venn diagram was computed and each component of it was evaluated. The results showed that de novo comparative methods can identify up to about 700 chicken genes with no previous evidence of expression, and can correctly extend about 40% of homology-based predictions at the 5' end. Conclusions: De novo comparative gene prediction followed by experimental verification is effective at enhancing the annotation of the newly sequenced genomes provided by standard homology-based methods.
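The Venn-diagram comparison mentioned above amounts to intersecting the three prediction sets before selecting components for RT-PCR testing. The sketch below shows the set operations involved; the gene identifiers are made up and the components listed are only a subset of the full diagram.

```python
# Small sketch of the Venn-style comparison described above: intersect the
# prediction sets of three gene finders and report some components, so that
# predictions unique to the comparative tools can be selected for RT-PCR
# testing. Gene identifiers are made up.
ensembl = {"g1", "g2", "g3", "g5"}
sgp2 = {"g2", "g3", "g4", "g6"}
twinscan = {"g3", "g4", "g5", "g7"}

components = {
    "all three": ensembl & sgp2 & twinscan,
    "SGP2+TWINSCAN only": (sgp2 & twinscan) - ensembl,   # de novo, no homology support
    "Ensembl only": ensembl - sgp2 - twinscan,
}
for name, genes in components.items():
    print(f"{name}: {sorted(genes)}")
```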

Relevance: 30.00%

Abstract:

Following the federal vote asking that complementary medicines be taken into account, a consensus was sought in fourteen services and units of the Centre hospitalier universitaire vaudois (CHUV). Confronted with the data in the literature (more than 2000 publications in "evidence-based complementary medicine" since 1998), the care providers were all surprised by the extent of the clinical results currently available. All identified a need for training and information on the subject. An official position statement from the institution was also desired, establishing teaching and research on complementary medicines and ensuring the production of rigorous and clinically relevant information. [Abstract] While a popular vote supported a new article on complementary and alternative medicines (CAM) in the Swiss Constitution, this assessment in 14 wards of the University Hospital of Lausanne, Switzerland, attempted to answer the question: how can CAM use be better taken into account and patients informed with more rigour and respect for their choices? Confronted with a review of the literature (>2000 publications in "evidence-based complementary medicine" since 1998), respondents declared their ignorance of the clinical data presently available on CAM. All were in favour of more teaching and information on the subject, plus an official statement from the hospital direction, ensuring the production and diffusion of rigorous and clinically significant information on CAM.

Relevance: 30.00%

Abstract:

Background: Recent advances in high-throughput technologies have produced a vast number of protein sequences, while the number of high-resolution structures has seen only a limited increase. This has prompted the development of many strategies to build protein structures from their sequences, generating a considerable number of alternative models. The selection of the model closest to the native conformation has thus become crucial for structure prediction. Several methods have been developed to score protein models by energies, knowledge-based potentials, or combinations of both. Results: Here, we present and demonstrate a theory to split knowledge-based potentials into biologically meaningful scoring terms and to combine them into new scores for predicting near-native structures. Our strategy circumvents the problem of defining the reference state. In this approach we give the proof for a simple, linear application that can be further improved by optimizing the combination of Z-scores. Using the simplest composite score () we obtained predictions similar to those of state-of-the-art methods. Besides, our approach has the advantage of identifying the most relevant terms involved in the stability of the protein structure. Finally, we also use the composite Z-scores to assess the conformation of models and to detect local errors. Conclusion: We have introduced a method to split knowledge-based potentials and to solve the problem of defining a reference state. The new scores detect near-native structures as accurately as state-of-the-art methods and have been successful in identifying wrongly modeled regions of many near-native conformations.
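In outline, the composite scoring described above reduces to standardizing each energy term across the set of alternative models and combining the resulting Z-scores. The sketch below shows that step with an equal-weight sum and random numbers standing in for real per-term energies; it is not the authors' trained score.

```python
# Minimal sketch of the composite-score idea described above: turn each
# knowledge-based energy term into a Z-score across the decoy set, sum the
# terms, and rank models by the combined score. The energy matrix is random;
# real terms would come from the split knowledge-based potentials.
import numpy as np

rng = np.random.default_rng(1)
energies = rng.normal(size=(200, 4))          # 200 alternative models x 4 energy terms (fake)

zscores = (energies - energies.mean(axis=0)) / energies.std(axis=0)
composite = zscores.sum(axis=1)               # simple linear combination, equal weights

best = composite.argsort()[:5]                # lowest composite score = predicted near-native
print("top-ranked models:", best)
```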

Relevance: 30.00%

Abstract:

Background: Choosing an adequate measurement instrument depends on the proposed use of the instrument, the concept to be measured, the measurement properties (e.g. internal consistency, reproducibility, content and construct validity, responsiveness, and interpretability), the requirements, the burden for subjects, and the costs of the available instruments. As far as measurement properties are concerned, there are no sufficiently specific standards for the evaluation of measurement properties of instruments to measure health status, and also no explicit criteria for what constitutes good measurement properties. In this paper we describe the protocol for the COSMIN study, the objective of which is to develop a checklist that contains COnsensus-based Standards for the selection of health Measurement INstruments, including explicit criteria for satisfying these standards. We will focus on evaluative health-related patient-reported outcomes (HR-PROs), i.e. patient-reported health measurement instruments used in a longitudinal design as an outcome measure, excluding health care related PROs, such as satisfaction with care or adherence. The COSMIN standards will be made available in the form of an easily applicable checklist. Method: An international Delphi study will be performed to reach consensus on which measurement properties should be assessed and how, and on criteria for good measurement properties. Two sources of input will be used for the Delphi study: (1) a systematic review of properties, standards and criteria of measurement properties found in systematic reviews of measurement instruments, and (2) an additional literature search of methodological articles presenting a comprehensive checklist of standards and criteria. The Delphi study will consist of four (written) Delphi rounds, with approximately 30 expert panel members with different backgrounds in clinical medicine, biostatistics, psychology, and epidemiology. The final checklist will subsequently be field-tested by assessing the inter-rater reproducibility of the checklist. Discussion: Since the study will mainly be anonymous, problems that are commonly encountered in face-to-face group meetings, such as the dominance of certain persons in the communication process, will be avoided. By performing a Delphi study and involving many experts, the likelihood that the checklist will have sufficient credibility to be accepted and implemented will increase.