905 results for Replicated Microarray Experiments
Abstract:
Questions of scale have received ample attention in physical scale modeling and experimentation, but have not been discussed with regard to economic experimentation. In this article I distinguish between two kinds of experiments, "generic" and "specific" experiments. Using a comparison between two experimental laboratory studies on the "posted price effect", I then show that scale issues become important in specific laboratory experiments because of the scaling down of time in the target market to laboratory dimensions. This entails choices in the material configuration of the experiment as well as role changes of experimental subjects. My discussion thus adds to recent literature on external validity and on the materiality of experiments.
Abstract:
Glucose is the most important metabolic substrate of the retina, and maintenance of normoglycemia is an essential challenge for diabetic patients. Chronic, exaggerated glycemic excursions can lead to cardiovascular disease, nephropathy, neuropathy and retinopathy. We recently showed that hypoglycemia induces retinal cell death in the mouse via caspase 3 activation and a decrease in glutathione (GSH). Ex vivo experiments in 661W photoreceptor cells confirmed the low-glucose induction of death via superoxide production and activation of caspase 3, concomitant with a decrease in GSH content. We evaluate herein retinal gene expression 4 h and 48 h after insulin-induced hypoglycemia. Microarray analysis revealed clusters of genes whose expression was modified by hypoglycemia, and we discuss the potential implication of those genes in retinal cell death. In addition, we identified, by gene set enrichment analysis, three important pathways: lysosomal function, GSH metabolism and apoptosis. We then tested the effect of recurrent hypoglycemia (three successive 4 h periods of hypoglycemia separated by 48 h of recovery) on retinal cell death. Interestingly, exposure to multiple hypoglycemic events prevented the GSH decrease and retinal cell death, adapting the retina to external stress by restoring the GSH level to one comparable to the control situation. We hypothesize that the scavenger GSH is a key compound in this apoptotic process, and that maintaining a "normal" GSH level, as well as strict glycemic control, represents a therapeutic challenge in order to avoid side effects of diabetes, especially diabetic retinopathy.
Abstract:
JXTA is a mature set of open protocols, with more than 10 years of history, that enable the creation and deployment of peer-to-peer (P2P) networks, allowing the execution of services in a distributed manner. Throughout its lifecycle, it has slowly evolved in order to appeal to a broad set of different applications. Part of this evolution includes providing basic security capabilities in its protocols in order to achieve some degree of message privacy and authentication. However, in some contexts, more advanced security requirements should be met, such as anonymity. There are several methods to attain anonymity in generic P2P networks. In this paper, we propose how to adapt a replicated message-based approach to JXTA, by taking advantage of its idiosyncrasies and capabilities.
Abstract:
The main objective of this work is to apply computer vision techniques to locate and track the limbs of mice within the test environment of the optogenetics research of the Neuroscience Institute research group at Princeton University, New Jersey.
Abstract:
The structural relaxation of pure amorphous silicon (a-Si) and hydrogenated amorphous silicon (a-Si:H), which occurs during thermal annealing experiments, has been analyzed by Raman spectroscopy and differential scanning calorimetry. Unlike a-Si, the heat evolved from a-Si:H cannot be explained by relaxation of the Si-Si network strain; instead, it reveals a de-relaxation of the bond-angle strain. Since the state of relaxation after annealing is very similar for the pure and hydrogenated materials, our results give strong experimental support to the predicted configurational gap between a-Si and crystalline silicon.
Abstract:
The aim of this project is to get used to another kind of programming. Until now, I have used very complex programming languages to develop applications or even to program microcontrollers, but the PicoCricket system is evidence that we do not need such complex development tools to get functional devices. The PicoCricket system is a clear example of simple programming that makes devices work the way we programmed them. There is an easy but effective way to program small devices, just by saying what we want them to do. We cannot write complex algorithms and mathematical operations, but we can program these devices in a short time. Nowadays, the easier and faster we produce, the more we earn. So the tendency is to develop fast, cheap and easy, and the PicoCricket system makes that possible.
Abstract:
In this article we summarize four experiments of an interdisciplinary nature carried out in four different secondary education centres. The common thread of these didactic proposals is an examination of values in sport and the critical capacity of the students from distinct perspectives: violence, mass media, politics and gender, and the treatment of the body in our society.
Abstract:
Ultrafast 2D NMR is a powerful methodology that allows recording of a 2D NMR spectrum in a fraction of a second. However, due to the numerous non-conventional parameters involved in this methodology, its implementation is no trivial task. Here, an optimized experimental protocol is carefully described to ensure efficient implementation of ultrafast NMR. The ultrafast spectra resulting from this implementation are presented, based on the example of two widely used 2D NMR experiments, COSY and HSQC, obtained in 0.2 s and 41 s, respectively.
Abstract:
The effect of Heterodera glycines on photosynthesis, leaf area and yield of soybean (Glycine max) was studied in two experiments carried out under greenhouse conditions. Soybean seeds were sown in 1.5 l (Experiment 1) or 5.0 l (Experiment 2) clay pots filled with a 1:1 mixture of field soil and sand sterilized with methyl bromide. Eight days after sowing, seedlings were thinned to one per pot and, one day later, inoculated with 0; 1,200; 3,600; 10,800; 32,400 or 97,200 second-stage juveniles (J2) of H. glycines. Experiment 1 was carried out during the first 45 days after inoculation, while Experiment 2 was conducted during the whole cycle of the crop. Measurements of photosynthetic rate, stomatal conductance, chlorophyll fluorescence, leaf color, leaf area and leaf chlorophyll content were taken at ten-day intervals throughout the experiments. Data on fresh root weight, top dry weight, grain yield, number of eggs per gram of roots and nematode reproduction factor were obtained at the end of the trials. Each treatment was replicated ten times. There was a marked reduction in both photosynthetic rate and chlorophyll content, as well as an evident yellowing of the leaves of the infected plants. Even at the lowest initial population density (Pi), the effects of H. glycines on top dry weight and grain yield were quite severe. Despite the parasitism, soybean yield was highly correlated with the integrated leaf area; accordingly, the use of this parameter was suggested for the design of potential damage prediction models that include physiological aspects of nematode-diseased plants.
Abstract:
In this thesis three experiments with atomic hydrogen (H) at low temperatures, T < 1 K, are presented. The experiments were carried out with two- (2D) and three-dimensional (3D) H gas, and with H atoms trapped in a solid H2 matrix. The main focus of this work is on interatomic interactions, which have certain specific features in the three systems considered. A common feature is the very high density of atomic hydrogen; the systems are close to quantum degeneracy. Short-range interactions in collisions between atoms are important in gaseous H. The system of H in H2 differs dramatically because the atoms remain fixed in the H2 lattice and the properties are governed by long-range interactions with the solid matrix and with other H atoms. The main tools in our studies were the methods of magnetic resonance, with electron spin resonance (ESR) at 128 GHz being used as the principal detection method. For the first time in experiments with H in high magnetic fields and at low temperatures, we combined ESR and NMR to perform electron-nuclear double resonance (ENDOR) as well as coherent two-photon spectroscopy. This allowed us to distinguish between different types of interactions in the magnetic resonance spectra. Experiments with 2D H gas utilized the thermal compression method in a homogeneous magnetic field, developed in our laboratory. In this work, methods were developed for direct studies of 3D H at high density and for creating high-density samples of H in H2. We measured magnetic resonance line shifts due to collisions in the 2D and 3D H gases. First we observed that the cold collision shift in 2D H gas composed of atoms in a single hyperfine state is much smaller than predicted by mean-field theory. This motivated us to carry out similar experiments with 3D H. In 3D H, the cold collision shift was found to be an order of magnitude smaller for atoms in a single hyperfine state than for a mixture of atoms in two different hyperfine states.
The collisional shifts were found to be in fair agreement with the theory, which takes into account the symmetrization of the wave functions of the colliding atoms. The origin of the small shift in 2D H composed of single-hyperfine-state atoms is not yet understood. The measurement of the shift in 3D H provides an experimental determination of the difference of the scattering lengths of ground-state atoms. The experiment with H atoms captured in an H2 matrix at temperatures below 1 K originated from our work with H gas. We found that samples of H in H2 were formed during recombination of gas-phase H, enabling sample preparation at temperatures below 0.5 K. Alternatively, we created the samples by electron-impact dissociation of H2 molecules in situ in the solid. By the latter method we reached the highest densities of H atoms reported so far, 3.5(5)×10^19 cm^-3. The H atoms were found to be stable for weeks at temperatures below 0.5 K. The observation of dipolar interaction effects provides a verification of the density measurement. Our results point to two different sites for H atoms in the H2 lattice. The steady-state nuclear polarizations of the atoms were found to be non-thermal. The possibility of further increasing the impurity H density is considered. At higher densities and lower temperatures it might be possible to observe phenomena related to quantum degeneracy in the solid.
Abstract:
Filtration is a widely used unit operation in chemical engineering. The huge variation in the properties of the materials to be filtered makes the study of filtration a challenging task. One of the objectives of this thesis was to show that conventional filtration theories are difficult to use when the system to be modelled contains all of the stages and features that are present in a complete solid/liquid separation process. Furthermore, most of the filtration theories require experimental work to be performed in order to obtain the critical parameters required by the theoretical models. Creating a good overall understanding of how the variables affect the final product in filtration is practically impossible on a purely theoretical basis. The complexity of solid/liquid separation processes requires experimental work, and when tests are needed, it is advisable to use experimental design techniques so that the goals can be achieved. The statistical design of experiments provides the necessary tools for recognising the effects of variables. It also helps to perform experimental work more economically. Design of experiments is a prerequisite for creating empirical models that can describe how the measured response is related to changes in the values of the variables. A software package was developed that provides a filtration practitioner with experimental designs and calculates the parameters for linear regression models, along with a graphical representation of the responses. The developed software consists of two modules, LTDoE and LTRead. The LTDoE module is used to create experimental designs for different filter types. The filter types considered in the software are the automatic vertical pressure filter, double-sided vertical pressure filter, horizontal membrane filter press, vacuum belt filter and ceramic capillary action disc filter.
It is also possible to create experimental designs for cases where the variables are totally user-defined, say for a customized filtration cycle or a different piece of equipment. The LTRead module is used to read the experimental data gathered from the experiments, to analyse the data and to create models for each of the measured responses. Introducing the structure of the software in more detail and showing some of the practical applications is the main part of this thesis. This approach to the study of cake filtration processes, as presented in this thesis, has been shown to have good practical value when making filtration tests.
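The workflow this abstract describes, an experimental design whose results are fitted with a linear regression model, can be sketched in a few lines. This is a minimal illustration, not the thesis software: the two factors (pressure, slurry concentration), their levels and the response values are invented for the example.

```python
# Sketch of a design-of-experiments workflow: a 2^2 full factorial design
# in coded units, fitted with a linear model y = b0 + b1*x1 + b2*x2 + b12*x1*x2.
# Factor names and response values are hypothetical.
import numpy as np

# Coded levels (-1/+1) for two factors, e.g. pressure and slurry concentration
design = np.array([
    [-1, -1],
    [+1, -1],
    [-1, +1],
    [+1, +1],
])

# Hypothetical measured response, e.g. cake moisture (%)
response = np.array([22.0, 18.0, 25.0, 19.0])

# Model matrix: intercept, two main effects, one interaction term
X = np.column_stack([
    np.ones(len(design)),
    design[:, 0],
    design[:, 1],
    design[:, 0] * design[:, 1],
])

# Least-squares estimate of the model coefficients
coeffs, *_ = np.linalg.lstsq(X, response, rcond=None)
b0, b1, b2, b12 = coeffs
print(f"intercept={b0:.2f}  main effects={b1:.2f}, {b2:.2f}  interaction={b12:.2f}")
```

With a saturated two-level design like this, the fit is exact and each coefficient is half the classical factorial effect, which is what makes such designs economical: four runs already separate both main effects and their interaction.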
Abstract:
Particle Image Velocimetry (PIV) is an optical measuring technique for obtaining velocity information about a flow of interest. With PIV it is possible to obtain two- or three-dimensional velocity vector fields from a measurement area instead of a single point in the flow. The measured flow can be either liquid or gas. PIV is nowadays widely applied to flow field studies. Here, PIV is needed to obtain validation data for the Computational Fluid Dynamics programs that have been used to model blowdown experiments in the PPOOLEX test facility at Lappeenranta University of Technology. In this thesis, PIV and its theoretical background are presented. All the subsystems that can be considered part of a PIV system are also presented in detail. Emphasis is placed on the mathematics behind the image evaluation. The work also included the selection and successful testing of a PIV system, as well as the planning of its installation in the PPOOLEX facility. Already in the preliminary testing, PIV was found to be a good addition to the measuring equipment of the Nuclear Safety Research Unit of LUT. The installation in the PPOOLEX facility was successful even though there were many restrictions to consider. All parts of the PIV system worked and were found to be appropriate for the planned use. The results and observations presented in this thesis are a good background for further PIV use.
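The image-evaluation mathematics mentioned above boils down to cross-correlating small interrogation windows between two successive frames and taking the correlation peak as the most probable particle displacement. A minimal sketch of that core step, on synthetic data with an assumed known shift (not the thesis setup):

```python
# Sketch of the core PIV evaluation step: FFT-based cross-correlation of an
# interrogation window between two frames to find the particle displacement.
import numpy as np

def displacement(window_a, window_b):
    """Estimate the integer-pixel displacement of window_b relative to
    window_a via circular cross-correlation (periodic boundaries assumed)."""
    fa = np.fft.fft2(window_a - window_a.mean())
    fb = np.fft.fft2(window_b - window_b.mean())
    corr = np.fft.ifft2(fa.conj() * fb).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Map the peak index to a signed shift (large indices wrap to negatives)
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

# Synthetic frame pair: a random particle pattern shifted by (3, 5) pixels
rng = np.random.default_rng(0)
frame_a = rng.random((32, 32))
frame_b = np.roll(frame_a, shift=(3, 5), axis=(0, 1))
print(displacement(frame_a, frame_b))  # recovers the (3, 5) shift
```

Real PIV software refines this with sub-pixel peak interpolation, window overlap, and outlier validation, but the correlation peak search shown here is the mathematical heart of the method.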
Abstract:
Protein engineering aims to improve the properties of enzymes and affinity reagents by genetic changes. Typical engineered properties are affinity, specificity, stability, expression, and solubility. Because proteins are complex biomolecules, the effects of specific genetic changes are seldom predictable. Consequently, a popular strategy in protein engineering is to create a library of genetic variants of the target molecule and subject the population to a selection process that sorts the variants by the desired property. This technique, called directed evolution, is a central tool for tailoring protein-based products used in a wide range of applications from laundry detergents to anti-cancer drugs. New methods are continuously needed to generate larger gene repertoires and compatible selection platforms to shorten the development timeline for new biochemicals. In the first study of this thesis, primer extension mutagenesis was revisited to establish higher-quality gene variant libraries in Escherichia coli cells. In the second study, recombination was explored as a method to expand the number of screenable enzyme variants. A selection platform was developed to improve antigen binding fragment (Fab) display on filamentous phages in the third article and, in the fourth study, novel design concepts were tested with two differentially randomized recombinant antibody libraries. Finally, in the last study, the performance of the same antibody repertoire was compared in phage display selections as a genetic fusion to different phage capsid proteins and in different antibody formats, Fab vs. single-chain variable fragment (ScFv), in order to find the most suitable display platform for the library at hand. As a result of these studies, a novel gene library construction method, termed selective rolling circle amplification (sRCA), was developed.
The method increases the mutagenesis frequency to close to 100% in the final library and the number of transformants over 100-fold compared to traditional primer extension mutagenesis. In the second study, Cre/loxP recombination was found to be an appropriate tool for resolving the DNA concatemer resulting from error-prone RCA (epRCA) mutagenesis into monomeric circular DNA units for higher-efficiency transformation into E. coli. Library selections against antigens of various sizes in the fourth study demonstrated that diversity placed closer to the antigen binding site of antibodies supports the generation of antibodies against haptens and peptides, whereas diversity at more peripheral locations is better suited for targeting proteins. The conclusion from a comparison of the display formats was that the truncated capsid protein three (p3Δ) of filamentous phage was superior to the full-length p3 and protein nine (p9) in obtaining a high number of uniquely specific clones. Especially for digoxigenin, a difficult hapten target, the antibody repertoire as ScFv-p3Δ provided the clones with the highest binding affinity. This thesis on the construction, design, and selection of gene variant libraries contributes to the practical know-how of directed evolution and contains useful information to support scientists in the field in their undertakings.