945 results for Alchemical experiments
Abstract:
Context: Empirical Software Engineering (ESE) replication researchers need to store and manipulate experimental data for several purposes, in particular analysis and reporting. Current research needs also call for the sharing and preservation of experimental data. In a previous work, we analyzed Replication Data Management (RDM) needs and proposed a novel concept, called the Empirical Ecosystem, to solve current deficiencies in RDM approaches. The Empirical Ecosystem provides replication researchers with a common framework that transparently integrates local heterogeneous data sources. A typical situation where the Empirical Ecosystem is applicable is when several members of a research group, or several collaborating research groups, need to share and access each other's experimental results. However, to apply the Empirical Ecosystem concept and deliver all of its promised benefits, the software architectures and tools that can properly support it must be analyzed.
Abstract:
This paper shares our experience with the initial negotiation and topic elicitation process for conducting industry experiments in six software development organizations in Finland. The process involved interaction with company representatives in the form of both multiple group discussions and separate face-to-face meetings. Fitness criteria developed by the researchers were applied to the list of generated topics to decide on a common topic. The challenges we faced include the diversity of proposed topics, communication gaps, skepticism about research methods, an initial disconnect between research and industry needs, and the lack of a prior working relationship. Lessons learned include allowing enough time to establish trust with partners, leveraging the training and skill development benefits inherent in the experimental approach, uniquely positioning the experimental approach within the landscape of other validation approaches more familiar to industrial partners, and introducing the fitness criteria early in the process.
Abstract:
Pest management practices that rely on pesticides are becoming less effective and, in many cases, environmentally inappropriate, so the search for alternatives is now a research focus. Excluding pests from the crop by means of pesticide-treated screens can be an eco-friendly method of crop protection, especially when the pests are vectors of important diseases. The mesh size of the net is crucial in determining whether insects can cross the barrier or are excluded, because insect size varies greatly among species. Long-lasting insecticide-treated nets (LLITNs), factory pre-treated, have been used for years to fight mosquitoes that are vectors of malaria and are able to retain their biological efficacy under field conditions for three years. In agriculture, nets treated with different insecticides have shown efficacy in controlling some insects and mites, so they appear to be a useful tool for solving some pest problems. However, treated nets must be carefully evaluated because they can diminish air flow, increase temperature and humidity, and decrease light transmission, which may affect plant growth, pests and natural enemies. Because biological control is considered a key factor in IPM today, the potential negative effects of treated nets on natural enemies need to be studied carefully. In this work, the effects of a bifenthrin-treated net (3 g/kg), supplied by the company Intelligent Insect Control (IIC), on the natural enemies of aphids were tested in a cucumber crop in Central Spain in autumn 2011. The crop was sown in 8 x 6.5 m tunnels divided into two sealed compartments with control or treated nets, which were simple yellow netting of 25 mesh (10 x 10 threads/cm²; 1 x 1 mm hole size). Pieces of the treated net, 2 m high, were placed along the lateral sides of one of the two tunnel compartments in each of the three available tunnels (replicates); the rest was covered by a commercial untreated net of a similar mesh. The pest, Aphis gossypii Glover (Aphidae), the parasitoid Aphidius colemani (Haliday) (Braconidae) and the predator Adalia bipunctata L. (Coccinellidae) were artificially introduced into the crop. Weekly sampling recorded the presence or absence of the pest and the natural enemies (NE) on the 42 plants per compartment, as well as the number of insects on 11 marked plants. Environmental conditions (temperature, relative humidity, UV and PAR radiation) were recorded. The results show that when aphids were artificially released inside the tunnels, neither their number per plant nor their distribution was affected by the treated net. No negative effect of the insecticide-treated net on natural enemies was observed either. Adalia bipunctata did not establish in the crop, and only short-term control of aphids was observed one week after release. On the other hand, A. colemani did establish in the crop, and a longer-term effect on the number of aphids per plant was detected irrespective of the type of net. KEY WORDS: bifenthrin-treated net, Adalia bipunctata, Aphidius colemani, Aphis gossypii, semi-field
Abstract:
Reproducibility of scientific studies and results is a goal that every scientist must pursue when announcing research outcomes. The rise of computational science, as a way of conducting empirical studies using mathematical models and simulations, has opened a new range of challenges in this context. The adoption of workflows as a way of detailing the scientific procedure of these experiments, along with the experimental data conservation initiatives undertaken during the last decades, has partially eased this problem. However, in order to fully address it, the conservation and reproducibility of the computational equipment related to these workflows must also be considered. The wide range of software and hardware resources required to execute a scientific workflow means that a comprehensive description detailing what those resources are and how they must be arranged is necessary. In this thesis we address the reproducibility of execution environments for scientific workflows by documenting them in a formalized way, which can later be used to obtain an equivalent environment. To do so, we propose a set of semantic models for representing and relating the relevant information about those environments, a set of tools that use these models to generate a description of the infrastructure, and an algorithmic process that consumes these descriptions to derive a new execution environment specification, which can be enacted to recreate an equivalent environment using virtualization solutions. We apply these three contributions to a set of representative scientific experiments belonging to different scientific domains, each exposing different software and hardware requirements. The obtained results demonstrate the feasibility of the proposed approach, which successfully reproduced the target experiments under different virtualization environments.
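As a rough illustration of the description-to-environment idea in this abstract, here is a minimal Python sketch; the schema, its field names, and the to_dockerfile helper are hypothetical, not the thesis's actual semantic models or tools:

import json

# Hypothetical formal description of a workflow's execution environment.
environment = {
    "os_image": "ubuntu:14.04",                 # base system (assumed value)
    "packages": ["python2.7", "make", "wget"],  # required software components
    "env_vars": {"WF_HOME": "/opt/workflow"},   # workflow configuration
}

def to_dockerfile(env):
    # Render the description as a specification that a container or
    # virtualization tool can enact to recreate an equivalent environment.
    lines = ["FROM " + env["os_image"]]
    if env["packages"]:
        lines.append("RUN apt-get update && apt-get install -y "
                     + " ".join(env["packages"]))
    for name, value in env["env_vars"].items():
        lines.append("ENV {}={}".format(name, value))
    return "\n".join(lines)

print(json.dumps(environment, indent=2))  # the stored, shareable description
print(to_dockerfile(environment))         # the derived, enactable specification

The point of the split mirrors the abstract's argument: the description is the preserved artifact, and the enactable specification is derived from it on demand.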
Abstract:
Our recent studies have shown that deregulated expression of R2, the rate-limiting component of ribonucleotide reductase, enhances transformation and malignant potential by cooperating with activated oncogenes. We now demonstrate that the R1 component of ribonucleotide reductase has tumor-suppressing activity. Stable expression of a biologically active ectopic R1 in ras-transformed mouse fibroblast 10T½ cell lines, with or without R2 overexpression, led to significantly reduced colony-forming efficiency in soft agar. The decreased anchorage independence was accompanied by markedly suppressed malignant potential in vivo. In three ras-transformed cell lines, R1 overexpression resulted in abrogation or marked suppression of tumorigenicity. In addition, the ability of cells overexpressing R1 to form lung metastases was reduced by >85%. Metastasis-suppressing activity was also observed in the highly malignant mouse 10T½-derived RMP-6 cell line, which was transformed by a combination of oncogenic ras, myc, and mutant p53. Furthermore, in support of the above observations with the R1-overexpressing cells, NIH 3T3 cells cotransfected with an R1 antisense sequence and oncogenic ras showed significantly increased anchorage independence compared with control ras-transfected cells. Finally, characteristics of reduced malignant potential were also demonstrated with R1-overexpressing human colon carcinoma cells. Taken together, these results indicate that the two components of ribonucleotide reductase are both unique malignancy determinants that play opposing roles in regulating malignancy; that there is a novel control point, important in mechanisms of malignancy, which involves the balance between R1 and R2 expression levels; and that alterations in this balance can significantly modify transformation, tumorigenicity, and metastatic potential.
Abstract:
The vibrational energy relaxation of carbon monoxide in the heme pocket of sperm whale myoglobin was studied by using molecular dynamics simulation and normal mode analysis methods. Molecular dynamics trajectories of solvated myoglobin were run at 300 K for both the δ- and ε-tautomers of the distal His-64. Vibrational population relaxation times of 335 ± 115 ps for the δ-tautomer and 640 ± 185 ps for the ε-tautomer were estimated by using the Landau–Teller model. Normal mode analysis was used to identify those protein residues that act as the primary “doorway” modes in the vibrational relaxation of the oscillator. Although the CO relaxation rates in both the ε- and δ-tautomers are similar in magnitude, the simulations predict that the vibrational relaxation of the CO is faster in the δ-tautomer with the distal His playing an important role in the energy relaxation mechanism. Time-resolved mid-IR absorbance measurements were performed on photolyzed carbonmonoxy hemoglobin (Hb13CO). From these measurements, a T1 time of 600 ± 150 ps was determined. The simulation and experimental estimates are compared and discussed.
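For orientation, here is a schematic Python sketch of a classical Landau–Teller estimate of T1; it is an illustration, not the authors' code. Quantum correction factors are omitted, units are assumed consistent, and the bath-force samples dF along the CO bond are assumed to come from an MD trajectory:

import numpy as np

def landau_teller_T1(dF, dt, omega0, mu, kBT):
    # Classical Landau-Teller estimate: the relaxation rate of a harmonic
    # oscillator is set by the bath friction (force fluctuation spectrum)
    # evaluated at the oscillator frequency omega0:
    #   1/T1 = (1 / (mu * kBT)) * sum_t cos(omega0 * t) * <dF(0) dF(t)> * dt
    dF = np.asarray(dF, dtype=float)
    dF = dF - dF.mean()
    n = len(dF)
    # force autocorrelation function <dF(0) dF(t)>, averaged per lag count
    acf = np.correlate(dF, dF, mode="full")[n - 1:] / np.arange(n, 0, -1)
    t = dt * np.arange(n)
    rate = np.sum(np.cos(omega0 * t) * acf) * dt / (mu * kBT)
    return 1.0 / rate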
Abstract:
The NMR assignment of 13C, 15N-labeled proteins with the use of triple resonance experiments is limited to molecular weights below ∼25,000 Daltons, mainly because of low sensitivity due to rapid transverse nuclear spin relaxation during the evolution and recording periods. For experiments that exclusively correlate the amide proton (1HN), the amide nitrogen (15N), and 13C atoms, this size limit has been previously extended by additional labeling with deuterium (2H). The present paper shows that the implementation of transverse relaxation-optimized spectroscopy ([15N,1H]-TROSY) into triple resonance experiments results in several-fold improved sensitivity for 2H/13C/15N-labeled proteins and approximately twofold sensitivity gain for 13C/15N-labeled proteins. Pulse schemes and spectra recorded with deuterated and protonated proteins are presented for the [15N,1H]-TROSY-HNCA and [15N,1H]-TROSY-HNCO experiments. A theoretical analysis of the HNCA experiment shows that the primary TROSY effect is on the transverse relaxation of 15N, which is only slightly affected by deuteration, and predicts sensitivity enhancements that are in close agreement with the experimental data.
Abstract:
The observation of light metal ions in nucleic acid crystals is generally a fortuitous event. Sodium ions in particular are notoriously difficult to detect because their X-ray scattering contributions are virtually identical to those of water and Na+…O distances are only slightly shorter than strong hydrogen bonds between well-ordered water molecules. We demonstrate here that replacement of Na+ by K+, Rb+ or Cs+ and precise measurements of anomalous differences in intensities provide a particularly sensitive method for detecting alkali metal ion-binding sites in nucleic acid crystals. Not only can alkali metal ions be readily located in such structures, but the presence of Rb+ or Cs+ also allows structure determination by the single-wavelength anomalous diffraction technique. Besides allowing identification of high occupancy binding sites, the combination of high resolution and anomalous diffraction data established here can also pinpoint binding sites that feature only partial occupancy. Conversely, high resolution of the data alone does not necessarily allow differentiation between water and partially ordered metal ions, as demonstrated with the crystal structure of a DNA duplex determined to a resolution of 0.6 Å.
Abstract:
Laser tweezers and atomic force microscopes are increasingly used to probe the interactions and mechanical properties of individual molecules. Unfortunately, using such time-dependent perturbations to force rare molecular events also drives the system away from equilibrium. Nevertheless, we show how equilibrium free energy profiles can be extracted rigorously from repeated nonequilibrium force measurements on the basis of an extension of Jarzynski's remarkable identity between free energies and the irreversible work.
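The identity referred to here is exp(-ΔF/kT) = ⟨exp(-W/kT)⟩, where the average runs over repeated nonequilibrium realizations of the same pulling protocol. A minimal Python estimator follows (illustrative only; the exponential average is dominated by rare low-work trajectories, so many repetitions are needed in practice):

import numpy as np

def jarzynski_delta_F(work, kT=1.0):
    # Estimate the free energy difference from nonequilibrium work values W
    # via Jarzynski's identity: exp(-dF/kT) = <exp(-W/kT)>.
    # A shifted (log-sum-exp style) average avoids numerical underflow.
    w = np.asarray(work, dtype=float) / kT
    w_min = w.min()
    log_avg = -w_min + np.log(np.mean(np.exp(-(w - w_min))))
    return -kT * log_avg

# Example: work values (in units of kT) from repeated pulling measurements.
print(jarzynski_delta_F([2.1, 1.7, 3.0, 0.9, 2.4]))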
Abstract:
This paper describes the first participation of the IR-n system in Spoken Document Retrieval, focusing on the experiments we carried out before participating and the results we obtained. IR-n is an Information Retrieval system based on passages, using the recognition of sentences to define them. The main goal of this experiment is therefore to adapt the IR-n system to the structure of spoken documents by means of an utterance splitter and the overlapping passage technique, allowing utterances and sentences to be matched.
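As a simplified illustration of the overlapping passage technique in isolation (a Python sketch; the function and its size and stride parameters are hypothetical, not IR-n's actual settings):

def overlapping_passages(sentences, size=3, stride=1):
    # Group consecutive sentences (or recognized utterances) into fixed-size
    # passages; with stride < size, consecutive passages overlap, so relevant
    # content straddling a passage boundary is not lost at retrieval time.
    last_start = max(len(sentences) - size, 0)
    return [sentences[i:i + size] for i in range(0, last_start + 1, stride)]

# Example with four utterances and passages of two:
print(overlapping_passages(["u1", "u2", "u3", "u4"], size=2, stride=1))
# -> [['u1', 'u2'], ['u2', 'u3'], ['u3', 'u4']]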
Abstract:
Three sets of laboratory column experiments concerning the hydrogeochemistry of seawater intrusion have been modelled using two codes: ACUAINTRUSION (Chemical Engineering Department, University of Alicante) and PHREEQC (U.S.G.S.). These reactive models use the hydrodynamic parameters determined with the ACUAINTRUSION TRANSPORT software and fit the chloride breakthrough curves perfectly. The ACUAINTRUSION code was improved, and its instabilities were studied with respect to the discretisation. Relative square errors were obtained for different combinations of the spatial and temporal steps: a global error for the total experimental data and a partial error for each element. Good simulations of the three experiments were obtained with ACUAINTRUSION using slight variations of the selectivity coefficients determined for both sediments in batch experiments with fresh water. The cation exchange parameters included in ACUAINTRUSION follow the Gapon convention, with modified exponents for the Ca/Mg exchange. PHREEQC simulations performed using the Gaines–Thomas convention were unsatisfactory with the exchange coefficients from the PHREEQC database (or their reported range), while those determined with fresh water and natural sediment allowed only an approximation to be obtained. For the treated sediment, adjusted exchange coefficients were determined to improve the simulation; they are vastly different from the PHREEQC database values and the batch experiment values, but they are of an order similar to the others determined under dynamic conditions. The two software packages simulated different cation concentrations; this disparity can be attributed to the defined selectivity coefficients, which affect the gypsum equilibrium. Consequently, each code calculates different sulphate concentrations, with ACUAINTRUSION predicting a smaller mismatch. In general, the simulations by ACUAINTRUSION and PHREEQC produced similar results, making predictions consistent with the experimental data. However, the simulated results are not identical to the experimental data: sulphate (total S) is overpredicted by both models, most likely due to factors such as gypsum kinetics, possible variations in the exchange coefficients with salinity, and the neglect of other processes.
Abstract:
This article describes an effective procedure for reducing the water content of the excess sludge produced by a wastewater treatment plant by increasing its concentration and, as a consequence, minimizing the volume of sludge to be managed. It consists of a sludge pre-dewatering process, used as a preliminary step or as an alternative to thickening, and comprises two discontinuous sequential stages: first re-settling, then filtration through a porous medium. The process is strictly physical; no chemical additives or electromechanical equipment intervene. The experiment was carried out in a pilot-scale system consisting of a sedimentation column that incorporates a filter medium. Different sludge heights over the filter were tested to verify the influence of hydrostatic pressure on the final concentration reached in each stage. The results show that the initial sludge concentration may increase by more than 570% by the end of the process, with the final volume of sludge reduced in a similar proportion, and that hydrostatic pressure has a limited effect on this final concentration. Moreover, the hydrostatic pressure at which the critical specific cake resistance is reached is established.