991 results for Simple interest


Relevance:

20.00%

Publisher:

Abstract:

A large number of polymorphic simple sequence repeats (SSRs), or microsatellites, are needed to develop a genetic map for shrimp. However, developing an SSR map is very time-consuming and expensive, and most SSRs are not specifically linked to gene loci of immediate interest. We report here on our strategy to develop polymorphic markers from expressed sequence tags (ESTs) by designing primers flanking single or multiple SSRs with three or more repeats. A subtracted cDNA library was prepared using RNA from specific pathogen-free (SPF) Litopenaeus vannamei juveniles (~1 g) collected before (0 h) and after (48 h) inoculation with the China isolate of white spot syndrome virus (WSSV). A total of 224 clones were sequenced, 194 of which were useful for homology comparisons against annotated genes in the NCBI nonredundant (nr) and protein databases, yielding 179 sequences encoded by nuclear DNA, 4 by mitochondrial DNA, and 11 similar to portions of the WSSV genome. The nuclear sequences clustered into 43 groups: 11 were homologous to various ESTs of unknown function, 4 had no homology to any sequence, and 28 showed similarities to known genes of invertebrates and vertebrates, representative of cellular metabolic processes such as calcium ion balance, cytoskeleton mRNAs, and protein synthesis. A few sequences were homologous to immune system-related (allergen) genes, and two were similar to motifs of the sex-lethal gene of Drosophila. A large number of EST sequences were similar to domains of the EF-hand superfamily (Ca2+-binding motif and FRQ protein domain of myosin light chains). Single or multiple SSRs with three or more repeats were found in approximately 61% of the 179 nuclear sequences. Primer sets were designed from 28 sequences representing 19 known or putative genes and tested for polymorphism (EST-SSR markers) in a small test panel of 16 individuals.
Ten (53%) of the 19 putative or unknown-function genes were polymorphic, 4 were monomorphic, and 3 either failed to amplify genomic DNA satisfactorily or require further optimization of the amplification conditions. Five polymorphic ESTs were genotyped with the entire reference mapping family; two of them (actin, accession #CX535973, and shrimp allergen arginine kinase, accession #CX535999) did not amplify in all offspring of the IRMF panel, suggesting the presence of null alleles, and three amplified in most of the IRMF offspring and were used for linkage analysis. The EF-hand motif of myosin light chain (accession #CX535935) was placed in ShrimpMap's linkage group 7, whereas ribosomal protein S5 (accession #CX535957) and troponin I (accession #CX535976) remained unassigned. Results indicate that (a) a large number of ESTs isolated from this cDNA library are similar to cytoskeleton mRNAs and may reflect a normal pathway of the cellular response after i.m. infection with WSSV, and (b) primers flanking single or multiple SSRs with three or more repeats from shrimp ESTs could be an efficient approach to develop polymorphic markers useful for linkage mapping. Work is underway to map additional SSR-containing ESTs from this and other cDNA libraries as a plausible strategy to increase marker density in ShrimpMap.
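The marker-development step described above, scanning EST sequences for single or multiple SSRs with three or more tandem repeats, can be sketched as a simple motif scan. The regex-based approach and the example sequence below are illustrative, not the authors' actual pipeline:

```python
import re

def find_ssrs(seq, min_repeats=3, motif_lens=(2, 3, 4, 5, 6)):
    """Scan a nucleotide sequence for simple sequence repeats (SSRs):
    a motif of 2-6 bp repeated min_repeats or more times in tandem.
    Returns (start_position, motif, repeat_count) tuples."""
    hits = []
    for k in motif_lens:
        # (motif)\2{min_repeats-1,} matches >= min_repeats tandem copies
        pattern = re.compile(r"(([ACGT]{%d})\2{%d,})" % (k, min_repeats - 1))
        for m in pattern.finditer(seq):
            hits.append((m.start(), m.group(2), len(m.group(1)) // k))
    return hits

# Example: an (AG)x5 microsatellite embedded in flanking sequence
print(find_ssrs("TTAGAGAGAGAGCC"))  # → [(2, 'AG', 5)]
```

Primers would then be designed against the flanking sequence on either side of each reported repeat.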

Relevance:

20.00%

Publisher:

Abstract:

To study how the air and sea interact during El Niño/La Niña onsets, extended associate pattern analysis (EAPA) is applied to the simple ocean data assimilation (SODA) data. The results show that the two "parents" of El Niño/La Niña behave quite differently: there is no relatively independent tropical atmosphere, but there is a relatively independent tropical Pacific Ocean, because the atmosphere is heated from the bottom surface rather than the top, has much stronger baroclinic instability than the sea, and has a very large intertropical convergence zone covering most of the tropical Pacific Ocean. The idea that westerly bursts and wind convergence, coming directly from the middle latitudes, produce the eastward movement and meridional convergence of seawater in the upper levels, and so give rise to the typical warm sea surface temperature signal of El Niño, is confirmed again.

Relevance:

20.00%

Publisher:

Abstract:

Two N-dichloroacetyl oxazolidines were synthesized by a simple, mild, and convenient method. All the compounds were characterized by IR, ¹H NMR, and elemental analysis. A preliminary biological test showed that the compounds protected maize, to some extent, against injury by some herbicides.

Relevance:

20.00%

Publisher:

Abstract:

Dissolved inorganic carbon (DIC) accounts for more than 95% of the total carbon in seawater, so reliable and precise measurements of DIC are necessary for studying the marine carbon cycle. To establish a simple and rapid method, an airtight gas extraction-absorption device was designed, and a simple procedure for determining DIC in seawater was developed through a large number of experiments. The procedure is as follows: 100-150 mL of seawater is placed in a conical flask and 10% H3PO4 is added; the DIC in the sample is converted to CO2 gas and carried by pure N2; the CO2 is then absorbed by two stages of 0.1 mol/L NaOH solution. Finally, the absorbing solution is titrated with 0.01000 mol/L HCl standard solution, the endpoints being detected with phenolphthalein and a bromocresol green-methyl red mixed indicator. The precision and accuracy of the method were satisfactory. The method was used to analyse seawater samples from Jiaozhou Bay in June 2003. The results show that inside the bay the average DIC is 2066 μmol/L in surface seawater and 2075 μmol/L in bottom seawater, whereas outside the bay it is 1949 μmol/L in surface seawater and 2147 μmol/L in bottom seawater.
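Under the usual two-endpoint acid-base chemistry (phenolphthalein marks CO3²⁻ → HCO3⁻, the mixed indicator marks HCO3⁻ → CO2), the back-calculation from titrant volumes can be sketched as below. The function name and the example volumes are hypothetical, not values from the study:

```python
def dic_umol_per_l(v_sample_ml, c_hcl_mol_l, v_hcl_ep1_ml, v_hcl_ep2_ml):
    """Back-calculate dissolved inorganic carbon (absorbed as carbonate in
    NaOH and titrated with HCl) from a two-endpoint titration.

    v_hcl_ep1_ml: HCl volume to the phenolphthalein endpoint (CO3^2- -> HCO3-)
    v_hcl_ep2_ml: total HCl volume to the mixed-indicator endpoint (HCO3- -> CO2)

    The increment between the endpoints equals the moles of carbonate,
    i.e. the moles of CO2 stripped from the sample.
    """
    n_co2_mol = c_hcl_mol_l * (v_hcl_ep2_ml - v_hcl_ep1_ml) / 1000.0
    return n_co2_mol / (v_sample_ml / 1000.0) * 1e6  # µmol/L

# Hypothetical run: 120 mL sample, 0.01000 mol/L HCl,
# endpoints at 10.00 mL and 34.80 mL
print(round(dic_umol_per_l(120.0, 0.01000, 10.00, 34.80), 1))  # → 2066.7
```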

Relevance:

20.00%

Publisher:

Abstract:

Although respiration of organisms and biomass, as well as fossil fuel burning and industrial production, are identified as the major sources, the CO2 flux is still unclear owing to the lack of proper measurements. A mass-balance approach that exploits differences in the carbon isotopic signature (δ¹³C) of CO2 sources and sinks was introduced and may provide a means of reducing uncertainties in the atmospheric budget. δ¹³C measurements of atmospheric CO2 yielded an average of -10.3‰ relative to the Peedee Belemnite standard; soil and plants had a narrow range from -25.09‰ to -26.51‰, averaging -25.80‰. Based on the steady fractionation and enrichment during mitochondrial respiration, we obtained a CO2 emission of 35.451 mol m⁻² a⁻¹ and a CO2 flux of 0.2149 μmol m⁻² s⁻¹. The positive CO2 flux indicates that the Haibei Alpine Meadow Ecosystem is a source rather than a sink. The mass-balance model can be applied to other ecosystems, and even to global carbon cycles, because it neglects the complicated processes of carbon metabolism and focuses only on the stable carbon isotopic compositions of the carbon source and sink compartments. (C) 2005 Elsevier B.V. All rights reserved.
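The core of such a two-end-member isotopic mass balance can be sketched in a few lines. The measured air and soil/plant δ¹³C values below are taken from the abstract; the free-troposphere background of -8.0‰ is an assumed illustration value, not a number from the paper:

```python
def respired_fraction(delta_mix, delta_background, delta_source):
    """Two-end-member isotopic mass balance:
    delta_mix = f * delta_source + (1 - f) * delta_background,
    solved for f, the fraction of the mixture contributed by the source."""
    return (delta_mix - delta_background) / (delta_source - delta_background)

# Measured air: -10.3 per mil; soil/plant source: -25.80 per mil (from the
# abstract); background of -8.0 per mil is an assumed illustration value.
f = respired_fraction(-10.3, -8.0, -25.80)
print(round(f, 3))  # → 0.129
```

The attraction of the approach, as the abstract notes, is that only the isotopic compositions of the source and sink compartments enter the calculation.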

Relevance:

20.00%

Publisher:

Abstract:

A simple and sensitive method for the determination of short- and long-chain fatty acids by high-performance liquid chromatography with fluorimetric detection has been developed. The fatty acids were derivatized to their corresponding esters with 9-(2-hydroxyethyl)carbazole (HEC) in acetonitrile at 60 °C, with 1-ethyl-3-(3-dimethylaminopropyl)carbodiimide hydrochloride (EDC) as coupling agent in the presence of 4-dimethylaminopyridine (DMAP). A mixture of esters of C1-C20 fatty acids was completely separated within 38 min using gradient elution on a reversed-phase C18 column. The maximum fluorescence emission of the derivatized fatty acids is at 365 nm (λex 335 nm). Studies on the derivatization conditions indicate that the reaction of fatty acids with HEC proceeds rapidly and smoothly in the presence of EDC and DMAP in acetonitrile to give highly fluorescent derivatives. The application of the method to the analysis of long-chain fatty acids in plasma was also investigated. The LC separation shows good selectivity and reproducibility for fatty acid derivatives. The R.S.D. (n = 6) for each fatty acid derivative is <4%. The detection limits are at the 45-68 fmol level for C14-C20 fatty acids and even lower levels for

Relevance:

20.00%

Publisher:

Abstract:

In the first part of this paper we show that a new technique exploiting 1D correlation of 2D or even 1D patches between successive frames may be sufficient to compute a satisfactory estimate of the optical flow field. The algorithm is well suited to VLSI implementations. The sparse measurements provided by the technique can be used to compute qualitative properties of the flow for a number of different visual tasks. In particular, the second part of the paper shows how to combine our 1D correlation technique with a scheme for detecting expansion or rotation ([5]) in a simple algorithm which also suggests interesting biological implications. The algorithm provides a rough estimate of time-to-crash. It was tested on real image sequences. We show its performance and compare the results to previous approaches.
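A minimal sketch of 1D patch correlation between successive frames, assuming a sum-of-absolute-differences matching score over candidate shifts (the abstract does not specify the paper's exact matching criterion):

```python
def flow_1d(prev_row, curr_row, x, half_patch=2, max_shift=3):
    """Estimate the horizontal displacement of the patch centred at x by 1D
    correlation: slide the patch from prev_row along curr_row and keep the
    shift with the smallest sum of absolute differences (SAD)."""
    patch = prev_row[x - half_patch:x + half_patch + 1]
    best_shift, best_sad = 0, float("inf")
    for d in range(-max_shift, max_shift + 1):
        lo = x + d - half_patch
        hi = x + d + half_patch + 1
        if lo < 0 or hi > len(curr_row):
            continue  # shifted window would fall off the scanline
        sad = sum(abs(a - b) for a, b in zip(patch, curr_row[lo:hi]))
        if sad < best_sad:
            best_shift, best_sad = d, sad
    return best_shift

# A bright bar that moves 2 pixels to the right between frames
prev = [0, 0, 0, 9, 9, 9, 0, 0, 0, 0, 0, 0]
curr = [0, 0, 0, 0, 0, 9, 9, 9, 0, 0, 0, 0]
print(flow_1d(prev, curr, x=4))  # → 2
```

Applying this at sparse, well-textured locations yields exactly the kind of sparse flow measurements the paper builds on.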

Relevance:

20.00%

Publisher:

Abstract:

This paper consists of two major parts. First, we present the outline of a simple approach to a very-low-bandwidth video-conferencing system relying on an example-based hierarchical image compression scheme. In particular, we discuss the use of example images as a model, the number of required examples, faces as a class of semi-rigid objects, a hierarchical model based on decomposition into different time scales, and the decomposition of face images into patches of interest. In the second part, we present several algorithms for image processing and animation as well as experimental evaluations. Among the original contributions of this paper is an automatic algorithm for pose estimation and normalization. We also review and compare different algorithms for finding the nearest neighbors in a database for a new input, as well as a generalized algorithm for blending patches of interest in order to synthesize new images. Finally, we outline the possible integration of several algorithms to illustrate a simple model-based video-conference system.
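The simplest baseline among such nearest-neighbor searches is a brute-force squared-Euclidean scan over flattened example patches; the sketch and toy data below are illustrative, not the paper's specific algorithms:

```python
def nearest_neighbor(query, database):
    """Return the index of the database patch closest to the query under
    squared Euclidean distance (brute-force baseline for example-based
    compression: the sender transmits only this index)."""
    best_i, best_d = -1, float("inf")
    for i, example in enumerate(database):
        d = sum((q - e) ** 2 for q, e in zip(query, example))
        if d < best_d:
            best_i, best_d = i, d
    return best_i

# Toy 4-pixel "patches" standing in for flattened face patches
db = [[0, 0, 0, 0], [10, 10, 10, 10], [5, 5, 0, 0]]
print(nearest_neighbor([4, 5, 1, 0], db))  # → 2
```

Transmitting the index of the best-matching example, rather than the patch itself, is what makes the example-based scheme suitable for very low bandwidths.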

Relevance:

20.00%

Publisher:

Abstract:

We provide a theory of the three-dimensional interpretation of a class of line drawings called p-images, which the human visual system interprets as parallelepipeds ("boxes"). Despite their simplicity, p-images raise a number of interesting vision questions:

* Why are p-images seen as three-dimensional objects? Why not just as flat images?
* What are the dimensions and pose of the perceived objects?
* Why are some p-images interpreted as rectangular boxes, while others are seen as skewed, even though there is no obvious distinction between the images?
* When p-images are rotated in three dimensions, why are the image sequences perceived as distorting objects, even though structure-from-motion would predict that rigid objects would be seen?
* Why are some three-dimensional parallelepipeds seen as radically different when viewed from different viewpoints?

We show that these and related questions can be answered with the help of a single mathematical result and an associated perceptual principle. An interesting special case arises when there are right angles in the p-image. This case represents a singularity in the equations and is mystifying from the vision point of view. It would seem that (at least in this case) the vision system does not follow the ordinary rules of geometry but operates in accordance with other (and as yet unknown) principles.

Relevance:

20.00%

Publisher:

Abstract:

Evolutionary algorithms are a common tool in engineering and in the study of natural evolution. Here we take their use in a new direction by showing how they can be made to implement a universal computer. We consider populations of individuals with genes whose values are the variables of interest. By allowing them to interact with one another in a specified environment with limited resources, we demonstrate the ability to construct any arbitrary logic circuit. We explore models based on the limits of small and large populations, and show examples of such a system in action, implementing a simple logic circuit.
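A toy sketch of the idea of evolving a population toward a target logic function. The representation (individuals as 4-row truth tables of a two-input gate) and the keep-the-fitter-half selection scheme are illustrative simplifications, not the paper's resource-limited interaction model:

```python
import random

def evolve_gate(target_table, pop_size=20, generations=200, seed=0):
    """Evolve a population of 4-bit truth tables toward a target two-input
    gate. Fitness counts matching rows; the fitter half survives each
    generation and produces single-bit-mutated offspring."""
    rng = random.Random(seed)

    def fitness(g):
        return sum(int(a == b) for a, b in zip(g, target_table))

    pop = [[rng.randint(0, 1) for _ in range(4)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) == len(target_table):
            break  # perfect gate found
        parents = pop[:pop_size // 2]
        children = []
        for p in parents:
            child = p[:]
            child[rng.randrange(len(child))] ^= 1  # point mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

# Evolve an XOR gate: rows ordered (0,0), (0,1), (1,0), (1,1)
print(evolve_gate([0, 1, 1, 0]))  # converges to [0, 1, 1, 0], XOR
```

Because the fitter half is always retained, best fitness is non-decreasing and the search reliably reaches the target table within the generation budget.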

Relevance:

20.00%

Publisher:

Abstract:

A simple method, based on the technique of capillary column switching with backflushing, has been developed for the detailed analysis of aromatic compounds in gasoline. The sample was first separated on a 30 m long OV-2330 polar precolumn and then backflushed onto a nonpolar analytical column. The early-eluting components from the precolumn and the components of interest (aromatic compounds plus heavier compounds) eluting from the analytical column are all directed to the same flame ionization detection system through a T-piece, which permits the quantitative analysis of aromatic hydrocarbons in gasoline by a normalization method using correction factors. The switching time window of the method is ±5 s, resulting in easier operation and higher reliability. The reproducibility of the quantitative analysis was ≤3% RSD for real gasoline samples. (C) 2002 Elsevier Science B.V. All rights reserved.
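The normalization quantification with correction factors amounts to wᵢ = fᵢ·Aᵢ / Σⱼ fⱼ·Aⱼ. A minimal sketch, with hypothetical peak areas and correction factors (not values from the paper):

```python
def normalize(areas, factors):
    """Quantify components by area normalization with response correction
    factors: w_i = f_i * A_i / sum_j(f_j * A_j), returned as mass percent."""
    corrected = [f * a for f, a in zip(factors, areas)]
    total = sum(corrected)
    return [100.0 * c / total for c in corrected]

# Hypothetical peak areas and correction factors for three aromatics
areas = [120.0, 300.0, 80.0]   # e.g. benzene, toluene, xylenes
factors = [1.00, 1.05, 1.10]
print([round(w, 2) for w in normalize(areas, factors)])  # → [22.94, 60.23, 16.83]
```

Because every detected peak reaches the same detector, the percentages sum to 100 by construction, which is what makes internal normalization possible without a separate standard.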

Relevance:

20.00%

Publisher:

Abstract:

There has been much interest in the area of model-based reasoning within the Artificial Intelligence community, particularly in its application to diagnosis and troubleshooting. The core issue in this thesis, simply put, is: model-based reasoning is fine, but whence the model? Where do the models come from? How do we know we have the right models? What does the right model mean anyway? Our work has three major components. The first component deals with how we determine whether a piece of information is relevant to solving a problem. We have three ways of determining relevance: derivational, situational, and an order-of-magnitude reasoning process. The second component deals with defining and building models for solving problems. We identify these models, determine what we need to know about them, and, importantly, determine when they are appropriate. Currently, the system has a collection of four basic models and two hybrid models. This collection of models has been successfully tested on a set of fifteen simple kinematics problems. The third major component of our work deals with how the models are selected.

Relevance:

20.00%

Publisher:

Abstract:

The goal of this work is to navigate through an office environment using only visual information gathered from four cameras placed onboard a mobile robot. The method is insensitive to physical changes within the room it is inspecting, such as moving objects. Forward and rotational motion vision are used to find doors and rooms, and these can be used to build topological maps. The map is built without the use of odometry or trajectory integration. The long-term goal of the project described here is for the robot to build simple maps of its environment and to localize itself within this framework.

Relevance:

20.00%

Publisher:

Abstract:

A simple analog circuit designer has been implemented as a rule-based system. The system can design voltage followers, Miller integrators, and bootstrap ramp generators from functional descriptions of what these circuits do. While the designer works in a simple domain where all components are ideal, it demonstrates the abilities of skilled designers. While the domain is electronics, the design ideas are useful in many other engineering domains, such as mechanical engineering, chemical engineering, and numerical programming. Most circuit design systems are given the circuit schematic and use arithmetic constraints to select component values. This circuit designer is different because it designs the schematic. The designer uses a unidirectional CONTROL relation to find the schematic. The circuit designs are built around this relation; it restricts the search space, assigns purposes to components, and finds design bugs.