887 results for Redundant Manipulator


Relevance:

10.00%

Publisher:

Abstract:

Dynamic power dissipation due to redundant switching is an important metric in data-path design. This paper focuses on the use of operand isolation circuits for low-power design. Operand isolation attempts to reduce switching by clamping or latching the output of a first-level combinational circuit. This paper presents a novel method using power-supply switching, wherein both the PMOS and NMOS stacks of a circuit are connected to the same power supply. The output is thus clamped or latched to the power-supply value with minimal leakage. The proposed circuits use only two transistors to clamp an entire Multiple Input Multiple Output (MIMO) block, and the latch-based designs have higher drive strength than the existing methods. Simulation results show considerable area reduction compared to existing techniques without increasing timing overhead.
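For context, the quantity being reduced is the standard dynamic power of CMOS logic (textbook background, not a formula taken from the paper):

\[ P_{\mathrm{dyn}} = \alpha\, C_L\, V_{DD}^{2}\, f \]

where α is the switching activity factor that operand isolation suppresses on idle data-path blocks, C_L is the switched load capacitance, V_DD the supply voltage, and f the clock frequency.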

Relevance:

10.00%

Publisher:

Abstract:

Heterodimeric proteins with homologous subunits of the same fold are involved in various biological processes. The objective of this study is to understand the evolution of the structural and functional features of such heterodimers. Using a non-redundant dataset of 70 such heterodimers of known 3D structure and an independent dataset of 173 heterodimers from yeast, we note that the mean sequence identity between interacting homologous subunits is only 23-24%, suggesting that, generally, highly diverged paralogues assemble to form such a heterodimer. We also note that the functional roles of interacting subunits/domains are generally quite different. This suggests that, though the interacting subunits/domains are homologous, their high evolutionary divergence is accompanied by high functional divergence, which contributes to the gross function of the heterodimer considered as a whole. The usual inverse relationship between sequence identity and RMSD is not followed by the interacting homologues in these heterodimers. We also addressed the question of formation of homodimers of the subunits of heterodimers by generating models of fictitious homodimers on the basis of the 3D structures of the heterodimers. The interaction energies associated with these homodimers suggest that, in the overwhelming majority of cases, such homodimers are unlikely to be stable. A majority of the homologues of heterodimers of known structure form heterodimers (51.8%) and a small proportion (14.6%) form homodimers. Comparison of 3D structures of heterodimers with homologous homodimers suggests that the interfacial nature of residues is not well conserved. In over 90% of the cases we note that the interacting subunits of heterodimers are co-localized in the cell. Proteins 2015; 83:1766-1786. (c) 2015 Wiley Periodicals, Inc.
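The 23-24% figure is a pairwise sequence identity between aligned subunits. As a rough illustration only (not the authors' pipeline; alignment and denominator conventions vary), percent identity over a pre-computed alignment can be obtained as follows:

```python
def percent_identity(aln_a: str, aln_b: str) -> float:
    """Percent identity between two pre-aligned sequences ('-' marks gaps),
    counted over columns where both sequences have a residue."""
    assert len(aln_a) == len(aln_b), "sequences must come from the same alignment"
    pairs = [(a, b) for a, b in zip(aln_a, aln_b) if a != "-" and b != "-"]
    if not pairs:
        return 0.0
    return 100.0 * sum(a == b for a, b in pairs) / len(pairs)

# toy, hypothetical aligned fragments of two homologous subunits
print(percent_identity("MKV-LSAGHE", "MRVQLT-GHD"))   # 62.5
```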

Relevance:

10.00%

Publisher:

Abstract:

This paper deals with the study of the nonlinear dynamics of a rotating flexible link modeled as a one-dimensional beam, undergoing large deformation and with geometric nonlinearities. The partial differential equation of motion is discretized using a finite element approach to yield four nonlinear, nonautonomous and coupled ordinary differential equations (ODEs). The equations are nondimensionalized using two characteristic velocities: the speed of sound in the material and a velocity associated with the transverse bending vibration of the beam. The method of multiple scales is used to perform a detailed study of the system. A set of four first-order autonomous equations is derived considering primary resonances of the external excitation and one-to-one internal resonances between the natural frequencies of the equations. Numerical simulations show that for certain ranges of values of these characteristic velocities, the slow flow equations can exhibit chaotic motions. The numerical simulations and results are related to a rotating wind turbine blade, and the approach can be used for the study of the nonlinear dynamics of a single-link flexible manipulator.
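For readers unfamiliar with the perturbation scheme, the method of multiple scales (standard form, not the paper's specific equations) expands each response variable over a fast and a slow time scale,

\[ x(t;\varepsilon) = x_0(T_0, T_1) + \varepsilon\, x_1(T_0, T_1) + \cdots, \qquad T_0 = t,\quad T_1 = \varepsilon t,\qquad \frac{d}{dt} = \frac{\partial}{\partial T_0} + \varepsilon\,\frac{\partial}{\partial T_1} + \cdots, \]

and the requirement that secular terms vanish at each order yields the autonomous first-order slow-flow equations analyzed here.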

Relevance:

10.00%

Publisher:

Abstract:

Most pattern mining methods yield a large number of frequent patterns, and isolating a small, relevant subset of patterns is a challenging problem of current interest. In this paper, we address this problem in the context of discovering frequent episodes from symbolic time-series data. Motivated by the Minimum Description Length (MDL) principle, we formulate the selection of a relevant subset of patterns as a search for the subset that achieves the best data compression. We present algorithms for discovering small sets of relevant, non-redundant episodes that achieve good data compression. The algorithms employ a novel encoding scheme and use serial episodes with inter-event constraints as the patterns. We present extensive simulation studies with both synthetic and real data, comparing our method with existing schemes such as GoKrimp and SQS. We also demonstrate the effectiveness of these algorithms on event sequences from a composable conveyor system; this system represents a new application area where the use of frequent patterns for compressing the event sequence is likely to be important for decision support and control.
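As a toy illustration of the MDL idea only (substring patterns and a crude two-part code, not the paper's encoding scheme or its serial episodes with inter-event constraints), a subset of patterns can be selected greedily by total description length:

```python
def description_length(seq, patterns):
    """Crude two-part code length: cost of the pattern table plus length of the
    sequence after greedily replacing pattern occurrences with single tokens."""
    encoded = list(seq)
    for p in patterns:
        out, i = [], 0
        while i < len(encoded):
            if encoded[i:i + len(p)] == list(p):
                out.append(("PAT", p)); i += len(p)
            else:
                out.append(encoded[i]); i += 1
        encoded = out
    return len(encoded) + sum(len(p) + 1 for p in patterns)

def greedy_mdl_select(seq, candidates):
    """Keep adding the candidate pattern that most reduces the code length."""
    chosen, best = [], description_length(seq, [])
    improved = True
    while improved:
        improved = False
        for c in candidates:
            if c in chosen:
                continue
            cost = description_length(seq, chosen + [c])
            if cost < best:
                best, chosen, improved = cost, chosen + [c], True
    return chosen, best

# toy event sequence; "abc", "de", "xa" are hypothetical candidate episodes
seq = "abcdeabcxabcde"
print(greedy_mdl_select(seq, ["abc", "de", "xa"]))
```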

Relevance:

10.00%

Publisher:

Abstract:

To address the drawback that current obstacle-avoidance path-planning algorithms for space manipulators are too computationally expensive for online real-time planning, this work studies the online real-time obstacle-avoidance path-planning problem for a space manipulator. Obstacles are modeled by envelopes of regular solids and, borrowing the idea of the configuration-space (C-space) method, the obstacles and the manipulator are mapped onto two mutually perpendicular planes, converting the three-dimensional problem in the manipulator workspace into a two-dimensional one. Path search is then performed with a binary-tree backward optimization method, which greatly reduces the computational load and meets the requirement of online real-time planning. Finally, the method is tested on a space-robot simulation system, verifying its feasibility.
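A minimal sketch of the projection step only (hypothetical geometry and a crude bounding-box test, not the paper's planning algorithm): collapsing the manipulator and an obstacle envelope onto two mutually perpendicular planes reduces the 3D interference check to two 2D checks.

```python
import numpy as np

def project(points, drop_axis):
    """Project 3D points onto the plane obtained by dropping one coordinate axis."""
    keep = [i for i in range(3) if i != drop_axis]
    return points[:, keep]

def boxes_overlap_2d(a, b):
    """Axis-aligned bounding-box overlap test for two 2D point sets."""
    a_min, a_max = a.min(axis=0), a.max(axis=0)
    b_min, b_max = b.min(axis=0), b.max(axis=0)
    return bool(np.all(a_min <= b_max) and np.all(b_min <= a_max))

def may_collide(link_pts, obstacle_pts):
    """Conservative collision flag: a 3D collision requires overlap in BOTH
    perpendicular projections (here the YZ and XZ planes)."""
    return all(
        boxes_overlap_2d(project(link_pts, axis), project(obstacle_pts, axis))
        for axis in (0, 1)   # drop x -> YZ plane, drop y -> XZ plane
    )

# hypothetical sampled link segment and a box-shaped obstacle envelope
link = np.linspace([0.0, 0.0, 0.0], [1.0, 0.4, 0.8], 20)
obstacle = np.array([[0.5, 0.1, 0.3], [0.7, 0.3, 0.6]])
print(may_collide(link, obstacle))
```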

Relevance:

10.00%

Publisher:

Abstract:

We consider a job contest in which candidates go through interviews (cheap talk) and are subject to reference checks. We show how competitive pressure - increasing the ratio of "good" to "bad" type candidates - can lead to a vast increase in lying and, in some cases, make bad hires more likely. As the number of candidates increases, it becomes harder to induce truth-telling. The interview stage becomes redundant if the candidates, a priori, know each other's type or the result of their own reference check. Finally, we show that the employer can benefit from committing not to reject all the applicants.

Relevance:

10.00%

Publisher:

Abstract:

Summary: The offshore shelf and canyon habitats of the OCNMS (Fig. 1) are areas of high primary productivity and biodiversity that support extensive groundfish fisheries. Recent acoustic surveys conducted in these waters have indicated the presence of hard-bottom substrates believed to harbor unique deep-sea coral and sponge assemblages. Such fauna are often associated with shallow tropical waters, however an increasing number of studies around the world have recorded them in deeper, cold-water habitats in both northern and southern latitudes. These habitats are of tremendous value as sites of recruitment for commercially important fishes. Yet, ironically, studies have shown how the gear used in offshore demersal fishing, as well as other commercial operations on the seafloor, can cause severe physical disturbances to resident benthic fauna. Due to their exposed structure, slow growth and recruitment rates, and long life spans, deep-sea corals and sponges may be especially vulnerable to such disturbances, requiring very long periods to recover. Potential effects of fishing and other commercial operations in such critical habitats, and the need to define appropriate strategies for the protection of these resources, have been identified as a high-priority management issue for the sanctuary. To begin addressing this issue, an initial pilot survey was conducted June 1-12, 2004 at six sites in offshore waters of the OCNMS (Fig. 2, average depths of 147-265 m) to explore for the presence of deep-sea coral/sponge assemblages and to look for evidence of potential anthropogenic impacts in these critical habitats. The survey was conducted on the NOAA Ship McARTHUR-II using the Navy’s Phantom DHD2+2 remotely operated vehicle (ROV), which was equipped with a video camera, lasers, and a manipulator arm for the collection of voucher specimens. At each site, a 0.1-m2 grab sampler also was used to collect samples of sediments for the analysis of macroinfauna (> 1.0 mm), total organic carbon (TOC), grain size, and chemical contaminants. Vertical profiles of salinity, dissolved oxygen (DO), temperature, and pressure were recorded at each site with a small SeaCat conductivity-temperature-depth (CTD) profiler. Niskin bottles attached to the CTD also obtained near-bottom water samples in support of a companion study of microbial indicators of coral health and general ecological condition across these sites. All samples except the sediment-contaminant samples are being analyzed with present project funds. Original cruise plans included a total of 12 candidate stations to investigate (Fig. 3). However, inclement weather and equipment failures restricted the sampling to half of these sites. In spite of the limited sampling, the work completed was sufficient to address key project objectives and included several significant scientific observations. Foremost, the cruise was successful in demonstrating the presence of target deepwater coral species in these waters. Patches of the rare stony coral Lophelia pertusa, more characteristic of deepwater coral/sponge assemblages in the North Atlantic, were observed for the first time in OCNMS at a site in 271 meters of water. A large proportion of these corals consisted of dead and broken skeletal remains, and a broken gorgonian (soft coral) also was observed nearby. The source of these disturbances is not known. However, observations from several sites included evidence of bottom trawl marks in the sediment and derelict fishing gear (long lines). 
Preliminary results also support the view that these areas are important reservoirs of marine biodiversity and of value as habitat for demersal fishes. For example, onboard examination of 18 bottom-sediment grabs revealed benthic infaunal species representative of 14 different invertebrate phyla. Twenty-eight species of fishes from 11 families, including 11 (possibly 12) species of commercially important rockfishes, also were identified from ROV video footage. These initial discoveries have sparked considerable interest in follow-up studies to learn more about the spatial extent of these assemblages and the magnitude of potential impacts from commercial fishing and other anthropogenic activities in the area. It is essential to expand our knowledge of these deep-sea communities and their vulnerability to potential environmental risks in order to determine the most appropriate management strategies. The survey was conducted under a partnership between NOAA’s National Centers for Coastal Ocean Science (NCCOS) and National Marine Sanctuary Program (NMSP) and included scientists from NCCOS, OCNMS, and several other west-coast State, academic, private, and tribal research institutions (see Section 4 for a complete listing of participating scientists). (PDF contains 20 pages)

Relevance:

10.00%

Publisher:

Abstract:

Hyper-spectral data allows the construction of more robust statistical models to sample the material properties than the standard tri-chromatic color representation. However, because of the large dimensionality and complexity of the hyper-spectral data, the extraction of robust features (image descriptors) is not a trivial issue. Thus, to facilitate efficient feature extraction, decorrelation techniques are commonly applied to reduce the dimensionality of the hyper-spectral data with the aim of generating compact and highly discriminative image descriptors. Current methodologies for data decorrelation, such as principal component analysis (PCA), linear discriminant analysis (LDA), wavelet decomposition (WD), or band selection methods, require complex and subjective training procedures, and in addition the compressed spectral information is not directly related to the physical (spectral) characteristics associated with the analyzed materials. The major objective of this article is to introduce and evaluate a new data decorrelation methodology using an approach that closely emulates human vision. The proposed data decorrelation scheme has been employed to optimally minimize the amount of redundant information contained in the highly correlated hyper-spectral bands and has been comprehensively evaluated in the context of non-ferrous material classification.
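For reference, PCA is one of the baseline decorrelation methods the article compares against (not its proposed vision-inspired scheme); band decorrelation of a hyper-spectral cube can be sketched as follows, with made-up data:

```python
import numpy as np
from sklearn.decomposition import PCA

def decorrelate_bands(cube, n_components=10):
    """Reduce a (height, width, bands) hyper-spectral cube to n_components
    decorrelated channels by applying PCA across the spectral dimension."""
    h, w, b = cube.shape
    pixels = cube.reshape(-1, b)              # one spectrum per pixel
    pca = PCA(n_components=n_components)
    reduced = pca.fit_transform(pixels)       # shape (h*w, n_components)
    return reduced.reshape(h, w, n_components), pca.explained_variance_ratio_

# hypothetical 32x32 scene with 128 highly correlated bands
rng = np.random.default_rng(0)
base = rng.normal(size=(32, 32, 1))
cube = base + 0.05 * rng.normal(size=(32, 32, 128))
compact, evr = decorrelate_bands(cube, n_components=5)
print(compact.shape, evr.round(3))
```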

Relevance:

10.00%

Publisher:

Abstract:

Features of the homologous relationships of proteins can provide us with a general picture of the protein universe, assist protein design and analysis, and further our comprehension of the evolution of organisms. Here we carried out a study of the evolution of protein molecules by investigating homologous relationships among residue segments. The motive was to identify detailed topological features of homologous relationships for short residue segments in the whole protein universe. Based on the data of a large number of non-redundant proteins, the universe of non-membrane polypeptides was analyzed by considering both residue mutations and structural conservation. By connecting homologous segments with edges, we obtained a homologous relationship network of the whole universe of short residue segments, which we named the graph of polypeptide relationships (GPR). Since the network is extremely complicated, only subgraphs composed of vital nodes of the GPR were analyzed to obtain an in-depth understanding of its topological transitions. Such analysis of vital subgraphs of the GPR revealed a donut-shaped fingerprint. Utilization of this topological feature revealed the switch sites (where previously hidden fibril-forming "hot spots" first become exposed, providing a further opportunity for protein aggregation; 188-202) of the conformational conversion of the normal alpha-helix-rich prion protein PrPC to the beta-sheet-rich PrPSc that is thought to be responsible for a group of fatal neurodegenerative diseases, the transmissible spongiform encephalopathies. Efforts in analyzing other proteins related to various conformational diseases are also introduced. (C) 2009 Elsevier Ltd. All rights reserved.
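A toy sketch of the graph-construction step (an arbitrary identity threshold stands in for the paper's mutation- and structure-based homology criterion): residue segments become nodes, similar pairs are connected, and the highest-degree ("vital") nodes are pulled out for closer analysis.

```python
import itertools
import networkx as nx

def segment_identity(a: str, b: str) -> float:
    """Fraction of identical residues between two equal-length segments."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def build_gpr(segments, threshold=0.6):
    """Build a homology graph: nodes are segments, edges join similar pairs."""
    g = nx.Graph()
    g.add_nodes_from(segments)
    for a, b in itertools.combinations(segments, 2):
        if segment_identity(a, b) >= threshold:
            g.add_edge(a, b)
    return g

# hypothetical 8-residue segments
segs = ["ACDEFGHI", "ACDEFGHL", "ACDQFGHI", "WYWYWYWY", "ACDEFGKI"]
gpr = build_gpr(segs)
vital = sorted(gpr.degree, key=lambda kv: kv[1], reverse=True)[:3]
print(vital)                                   # highest-degree ("vital") nodes
print(gpr.subgraph(n for n, _ in vital).number_of_edges())
```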

Relevance:

10.00%

Publisher:

Abstract:

Transcription factor binding sites (TFBS) play key roles in gene expression and regulation. They are short sequence segments with definite structure and can be recognized correctly by the corresponding transcription factors. From a statistical viewpoint, candidate TFBS should be quite different from segments formed by randomly combined nucleotides. This paper proposes a combined statistical model for finding over-represented short sequence segments in different kinds of data sets. When an over-represented short sequence segment is described by a position weight matrix, the nucleotide distribution at most sites of the segment should be far from the background nucleotide distribution. The central idea of this approach is to search for such signals. The algorithm is tested on three data sets: the binding-site data set of the cyclic AMP receptor protein in E. coli; PlantProm DB, a non-redundant collection of proximal promoter sequences from different species; and a collection of the intergenic sequences of the whole E. coli genome. Even though the complexity of these three data sets is quite different, the results show that this model is rather general and sensible.
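A small illustration of the position-weight-matrix ingredient (toy sites and an arbitrary pseudocount, not the paper's combined model): per-position log-odds scores against a uniform background quantify how far a segment's nucleotide usage is from the background distribution.

```python
import math

SITES = ["TGTGA", "TGAGA", "TTTGA", "TGTGC"]   # hypothetical aligned binding sites
BASES = "ACGT"

def build_pwm(sites, pseudocount=0.5):
    """Per-position log-odds scores versus a uniform (0.25) background."""
    pwm = []
    for i in range(len(sites[0])):
        column = [s[i] for s in sites]
        scores = {}
        for b in BASES:
            freq = (column.count(b) + pseudocount) / (len(sites) + 4 * pseudocount)
            scores[b] = math.log2(freq / 0.25)
        pwm.append(scores)
    return pwm

def score(segment, pwm):
    """Higher score => nucleotide usage farther from the background distribution."""
    return sum(col[b] for b, col in zip(segment, pwm))

pwm = build_pwm(SITES)
print(round(score("TGTGA", pwm), 2), round(score("ACACC", pwm), 2))
```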

Relevance:

10.00%

Publisher:

Abstract:

This thesis presents a novel framework for state estimation in the context of robotic grasping and manipulation. The overall estimation approach is based on fusing various visual cues for manipulator tracking, namely appearance and feature-based, shape-based, and silhouette-based visual cues. Similarly, a framework is developed to fuse the above visual cues, but also kinesthetic cues such as force-torque and tactile measurements, for in-hand object pose estimation. The cues are extracted from multiple sensor modalities and are fused in a variety of Kalman filters.
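A bare-bones sketch of the cue-fusion idea (a linear Kalman measurement update applied to two hypothetical position estimates with different noise levels; the thesis's filters and state vectors are far richer):

```python
import numpy as np

def kalman_update(x, P, z, H, R):
    """Standard Kalman measurement update for one sensor cue."""
    y = z - H @ x                              # innovation
    S = H @ P @ H.T + R                        # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)             # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# state: 2D object position; vague prior
x, P, H = np.zeros(2), np.eye(2), np.eye(2)

# two cues observing the same pose, e.g. a noisier shape-based estimate and a
# more precise feature-based estimate -- the values are made up
for z, sigma in [(np.array([0.52, 0.31]), 0.10), (np.array([0.49, 0.28]), 0.02)]:
    x, P = kalman_update(x, P, z, H, np.eye(2) * sigma**2)

print(x.round(3), np.diag(P).round(5))
```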

A hybrid estimator is developed to estimate both a continuous state (robot and object states) and discrete states, called contact modes, which specify how each finger contacts a particular object surface. A static multiple model estimator is used to compute and maintain this mode probability. The thesis also develops an estimation framework for estimating model parameters associated with object grasping. Dual and joint state-parameter estimation is explored for parameter estimation of a grasped object's mass and center of mass. Experimental results demonstrate simultaneous object localization and center of mass estimation.

Dual-arm estimation is developed for two-arm robotic manipulation tasks. Two types of filters are explored: the first is an augmented filter that contains both arms in the state vector, while the second runs two filters in parallel, one for each arm. These two frameworks and their performance are compared in a dual-arm task of removing a wheel from a hub.

This thesis also presents a new method for action selection involving touch. This next best touch method selects, from the available actions for interacting with an object, the one expected to gain the most information. The algorithm employs information theory to compute an information gain metric that is based on a probabilistic belief suitable for the task. An estimation framework is used to maintain this belief over time. Kinesthetic measurements such as contact and tactile measurements are used to update the state belief after every interactive action. Simulation and experimental results are demonstrated using next best touch for object localization, specifically a door handle on a door. The next best touch theory is then extended to model parameter determination. Since many objects within a particular object category share the same rough shape, principal component analysis may be used to parametrize the object mesh models. These parameters can be estimated using the action selection technique, which chooses the touching action that best localizes the object and estimates these parameters. Simulation results are then presented involving localizing and determining a parameter of a screwdriver.
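A compact sketch of the information-gain criterion (a discrete 1-D belief over an edge position and a crude binary contact model, both invented for illustration): the selected touch is the one maximizing the expected entropy reduction of the belief.

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def expected_info_gain(belief, positions, probe_x, p_hit_if_before=0.95):
    """Expected entropy reduction from probing at probe_x, assuming a binary
    contact / no-contact measurement: contact is likely if the true edge lies
    at or before the probe location (a deliberately crude sensor model)."""
    p_contact_given_edge = np.where(positions <= probe_x,
                                    p_hit_if_before, 1.0 - p_hit_if_before)
    gain = 0.0
    for outcome_prob in (p_contact_given_edge, 1.0 - p_contact_given_edge):
        p_outcome = float((belief * outcome_prob).sum())
        if p_outcome > 0:
            posterior = belief * outcome_prob / p_outcome
            gain += p_outcome * (entropy(belief) - entropy(posterior))
    return gain

positions = np.linspace(0.0, 1.0, 21)          # candidate edge locations
belief = np.ones_like(positions) / len(positions)
candidates = [0.25, 0.5, 0.75]                  # hypothetical touch actions
best = max(candidates, key=lambda x: expected_info_gain(belief, positions, x))
print(best)
```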

Lastly, the next best touch theory is further extended to model classes. Instead of estimating parameters, object class determination is incorporated into the information gain metric calculation. The best touching action is selected in order to best discern between the possible model classes. Simulation results are presented to validate the theory.

Relevance:

10.00%

Publisher:

Abstract:

Vulval differentiation in C. elegans is mediated by an epidermal growth factor (EGF)-EGF receptor (EGFR) signaling pathway. I have cloned unc-101, a negative regulator of vulval differentiation of the nematode C. elegans. unc-101 encodes a homolog of AP47, the medium chain of the trans-Golgi clathrin-associated protein complex. This identity was confirmed by cloning and comparing the sequence of a C. elegans homolog of AP50, the medium chain of the plasma membrane clathrin-associated protein complex. I provided the first genetic evidence that the trans-Golgi clathrin-coated vesicles are involved in the regulation of an EGF signaling pathway. Most of the unc-101 alleles are deletions or nonsense mutations, suggesting that these alleles severely reduce unc-101 activity. A hybrid gene that contains parts of unc-101 and mouse AP47 rescued at least two phenotypes of unc-101 mutations: the Unc phenotype and the suppression of the vulvaless phenotype of the let-23(sy1) mutation. Therefore, the functions of AP47 are conserved between nematodes and mammals.

unc-101 mutations can cause greater than wild-type vulval differentiation in combination with certain mutations in sli-1, another negative regulator of the vulval induction pathway. A mutation in a new gene, rok-1, causes no defect by itself, but causes greater than wild-type vulval differentiation in the presence of a sli-1 mutation. The unc-101; rok-1; sli-1 triple mutants display a greater extent of vulval differentiation than any double mutant combination of unc-101, rok-1 and sli-1. Therefore, the rok-1 locus defines another negative regulator of the vulval induction pathway.

I analyzed a second gene encoding an AP47 homolog in C. elegans. This gene, CEAP47, encodes a protein 72% identical to both UNC-101 and mammalian AP47. A hybrid gene containing parts of unc-101 and CEAP47 sequences can rescue phenotypes of unc-101 mutants, indicating that the UNC-101 and CEAP47 proteins can be redundant if expressed in the same set of cells.

Relevance:

10.00%

Publisher:

Abstract:

We have used the technique of non-redundant masking at the Palomar 200-inch telescope, together with radio VLBI imaging software, to make optical aperture synthesis maps of two binary stars, β Coronae Borealis and σ Herculis. The dynamic range of the map of β CrB, a binary star with a separation of 230 milliarcseconds, is 50:1. For σ Her, we find a separation of 70 milliarcseconds, and the dynamic range of our image is 30:1. These results demonstrate the potential of the non-redundant masking technique for diffraction-limited imaging of astronomical objects with high dynamic range.

We find that the optimal integration time for measuring the closure phase is longer than that for measuring the fringe amplitude. There is not the close relationship between amplitude errors and phase errors that is found in radio interferometry. Amplitude self-calibration is less effective at optical wavelengths than at radio wavelengths. The primary beam sensitivity correction made in radio aperture synthesis is not necessary in optical aperture synthesis.
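For reference (standard result, not specific to this thesis), the closure phase over a triangle of mask holes (i, j, k) is the sum of measured fringe phases around the loop; aperture-dependent atmospheric phase errors φ_i cancel identically:

\[ \psi_{ij} = \phi_{ij} + \varphi_i - \varphi_j \;\Longrightarrow\; \psi_{ij} + \psi_{jk} + \psi_{ki} = \phi_{ij} + \phi_{jk} + \phi_{ki}, \]

where φ_ij is the true object phase on baseline ij and ψ_ij is the measured fringe phase.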

The effects of atmospheric disturbances on optical aperture synthesis have been studied by Monte Carlo simulations based on the Kolmogorov theory of refractive-index fluctuations. For the non-redundant masking with τ_c-sized apertures, the simulated fringe amplitude gives an upper bound of the observed fringe amplitude. A smooth transition is seen from the non-redundant masking regime to the speckle regime with increasing aperture size. The fractional reduction of the fringe amplitude according to the bandwidth is nearly independent of the aperture size. The limiting magnitude of optical aperture synthesis with τ_c-sized apertures and that with apertures larger than τ_c are derived.

Monte Carlo simulations are also made to study the sensitivity and resolution of the bispectral analysis of speckle interferometry. We present the bispectral modulation transfer function and its signal-to-noise ratio at high light levels. The results confirm the validity of the heuristic interferometric view of the image-forming process in the mid-spatial-frequency range. The signal-to-noise ratio of the bispectrum at arbitrary light levels is derived in the mid-spatial-frequency range.
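For reference, the bispectrum used in this kind of analysis is the standard triple product of the image Fourier transform over closing spatial frequencies; its phase is the closure phase, which is why it survives aperture-dependent phase errors:

\[ B(\mathbf{u}, \mathbf{v}) = \hat{I}(\mathbf{u})\, \hat{I}(\mathbf{v})\, \hat{I}^{*}(\mathbf{u} + \mathbf{v}). \]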

The non-redundant masking technique is suitable for imaging bright objects with high resolution and high dynamic range, while the faintest limit will be better pursued by speckle imaging.

Relevance:

10.00%

Publisher:

Abstract:

The 0.2% experimental accuracy of the 1968 Beers and Hughes measurement of the annihilation lifetime of ortho-positronium motivates the attempt to compute the first order quantum electrodynamic corrections to this lifetime. The theoretical problems arising in this computation are here studied in detail up to the point of preparing the necessary computer programs and using them to carry out some of the less demanding steps -- but the computation has not yet been completed. Analytic evaluation of the contributing Feynman diagrams is superior to numerical evaluation, and for this process can be carried out with the aid of the Reduce algebra manipulation computer program.

The relation of the positronium decay rate to the electron-positron annihilation-in-flight amplitude is derived in detail, and it is shown that at threshold annihilation-in-flight, Coulomb divergences appear while infrared divergences vanish. The threshold Coulomb divergences in the amplitude cancel against like divergences in the modulating continuum wave function.

Using the lowest order diagrams of electron-positron annihilation into three photons as a test case, various pitfalls of computer algebraic manipulation are discussed along with ways of avoiding them. The computer manipulation of artificial polynomial expressions is preferable to the direct treatment of rational expressions, even though redundant variables may have to be introduced.

Special properties of the contributing Feynman diagrams are discussed, including the need to restore gauge invariance to the sum of the virtual photon-photon scattering box diagrams by means of a finite subtraction.

A systematic approach to the Feynman-Brown method of decomposition of single-loop diagram integrals with spin-related tensor numerators is developed in detail. This approach allows the Feynman-Brown method to be straightforwardly programmed in the Reduce algebra manipulation language.

The fundamental integrals needed in the wake of the application of the Feynman-Brown decomposition are exhibited, and the methods which were used to evaluate them -- primarily dispersion techniques -- are briefly discussed.

Finally, it is pointed out that while the techniques discussed have permitted the computation of a fair number of the simpler integrals and diagrams contributing to the first order correction of the ortho-positronium annihilation rate, further progress with the more complicated diagrams and with the evaluation of traces is heavily contingent on obtaining access to adequate computer time and core capacity.

Relevance:

10.00%

Publisher:

Abstract:

The objective of this project is to develop a line of research into options for instrumenting a mechanism with accelerometers. To that end, a signal acquisition and processing system will be built for instrumenting a parallel-kinematics mechanism, based on the knowledge acquired during the course. The work will also be carried out with other students to design and assemble a prototype two-degree-of-freedom parallel-kinematics robot on which the project will be developed and tested. Two lines of work are thus set out for this project: development of a signal acquisition and processing system adaptable to different sensors, and use of the signals from multiple accelerometers to obtain, first, the acceleration and, where possible, the position of points of interest on the mechanism.
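A minimal sketch of the second line of work (single-axis, synthetic data; a real implementation would need bias removal, filtering, and drift compensation or fusion with other sensors): integrating an accelerometer signal once gives velocity and twice gives position.

```python
import numpy as np

def integrate_accel(t, accel, v0=0.0, x0=0.0):
    """Trapezoidal double integration: acceleration -> velocity -> position.
    Raw integration drifts quickly on real sensors, so the signal is normally
    de-biased/filtered first or fused with other measurements."""
    dt = np.diff(t)
    vel = np.concatenate(([v0], v0 + np.cumsum(0.5 * (accel[1:] + accel[:-1]) * dt)))
    pos = np.concatenate(([x0], x0 + np.cumsum(0.5 * (vel[1:] + vel[:-1]) * dt)))
    return vel, pos

# synthetic check: motion x = sin(2*pi*t) sampled at 1 kHz
t = np.linspace(0.0, 1.0, 1001)
true_x = np.sin(2 * np.pi * t)
accel = -(2 * np.pi) ** 2 * np.sin(2 * np.pi * t)      # analytic second derivative
vel, pos = integrate_accel(t, accel, v0=2 * np.pi, x0=0.0)
print(np.max(np.abs(pos - true_x)))                     # small numerical error
```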