877 results for Hyper-redundant manipulators
Abstract:
Phosphatidylinositol 3-kinase (PI3K) isoforms PI3Kbeta and PI3Kgamma are implicated in platelet adhesion, activation, and aggregation, but their relative contribution is still unclear or controversial. Here, we report the first comparative functional analysis of platelets from mice expressing a catalytically inactive form of PI3Kbeta or PI3Kgamma. We demonstrate that both isoforms were similarly required for maximal activation of the small GTPase Rap1b and for complete platelet aggregation upon stimulation of G protein-coupled receptors for adenosine 5'-diphosphate (ADP) or U46619. Their contribution to these events, however, was largely redundant and dispensable. However, PI3Kbeta, but not PI3Kgamma, enzymatic activity was absolutely required for Akt phosphorylation, Rap1 activation, and platelet aggregation downstream of the immunoreceptor tyrosine-based activation motif (ITAM)-bearing receptor glycoprotein VI (GPVI). Moreover, PI3Kbeta was a major essential regulator of platelet adhesion to fibrinogen and of integrin alpha(IIb)beta(3)-mediated spreading. These results provide genetic evidence for a crucial and selective role of PI3Kbeta in signaling through GPVI and integrin alpha(IIb)beta(3).
Abstract:
The optimal utilisation of hyper-spectral satellite observations in numerical weather prediction is often inhibited by incorrectly assuming independent interchannel observation errors. However, in order to represent these observation-error covariance structures, an accurate knowledge of the true variances and correlations is needed. This structure is likely to vary with observation type and assimilation system. The work in this article presents the initial results for the estimation of IASI interchannel observation-error correlations when the data are processed in the Met Office one-dimensional (1D-Var) and four-dimensional (4D-Var) variational assimilation systems. The method used to calculate the observation errors is a post-analysis diagnostic which utilises the background and analysis departures from the two systems. The results show significant differences in the source and structure of the observation errors when processed in the two different assimilation systems, but also highlight some common features. When the observations are processed in 1D-Var, the diagnosed error variances are approximately half the size of the error variances used in the current operational system and are very close in size to the instrument noise, suggesting that this is the main source of error. The errors contain no consistent correlations, with the exception of a handful of spectrally close channels. When the observations are processed in 4D-Var, we again find that the observation errors are being overestimated operationally, but the overestimation is significantly larger for many channels. In contrast to 1D-Var, the diagnosed error variances are often larger than the instrument noise in 4D-Var. It is postulated that horizontal errors of representation, not seen in 1D-Var, are a significant contributor to the overall error here. Finally, observation errors diagnosed from 4D-Var are found to contain strong, consistent correlation structures for channels sensitive to water vapour and surface properties.
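The post-analysis diagnostic mentioned above is typically computed from paired background and analysis departures. As a hedged illustration only (the variable names, the plain sample covariance and the synthetic data are assumptions, not the Met Office processing), a NumPy sketch of diagnosing an interchannel observation-error covariance and its correlations might look like this:

```python
import numpy as np

def diagnosed_obs_error_covariance(d_b, d_a):
    """Estimate an observation-error covariance from departures.

    d_b : observation-minus-background departures, shape (n_samples, n_channels)
    d_a : observation-minus-analysis departures, shape (n_samples, n_channels)
    Returns an (n_channels, n_channels) sample estimate R ~ E[d_a d_b^T],
    symmetrised for convenience.
    """
    d_b = d_b - d_b.mean(axis=0)
    d_a = d_a - d_a.mean(axis=0)
    R = d_a.T @ d_b / d_b.shape[0]
    return 0.5 * (R + R.T)  # enforce symmetry of the sample estimate

def correlation_from_covariance(R):
    # Convert a covariance matrix into an interchannel correlation matrix.
    s = np.sqrt(np.diag(R))
    return R / np.outer(s, s)

# Toy usage with synthetic departures standing in for a few channels.
rng = np.random.default_rng(1)
d_b = rng.normal(size=(500, 8))
d_a = 0.5 * d_b + 0.1 * rng.normal(size=(500, 8))
R_hat = diagnosed_obs_error_covariance(d_b, d_a)
print(correlation_from_covariance(R_hat).round(2))
```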
Abstract:
Traditionally, the formal scientific output in most fields of natural science has been limited to peer-reviewed academic journal publications, with less attention paid to the chain of intermediate data results and their associated metadata, including provenance. In effect, this has constrained the representation and verification of the data provenance to the confines of the related publications. Detailed knowledge of a dataset’s provenance is essential to establish the pedigree of the data for its effective re-use, and to avoid redundant re-enactment of the experiment or computation involved. It is increasingly important for open-access data to determine their authenticity and quality, especially considering the growing volumes of datasets appearing in the public domain. To address these issues, we present an approach that combines the Digital Object Identifier (DOI) – a widely adopted citation technique – with existing, widely adopted climate science data standards to formally publish detailed provenance of a climate research dataset as an associated scientific workflow. This is integrated with linked-data compliant data re-use standards (e.g. OAI-ORE) to enable a seamless link between a publication and the complete trail of lineage of the corresponding dataset, including the dataset itself.
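To illustrate the kind of linked-data aggregation the abstract refers to, a minimal sketch using rdflib and the OAI-ORE vocabulary is given below; the DOI, resource URLs and the use of dcterms:provenance are placeholders chosen for illustration, not the authors' actual data model.

```python
from rdflib import Graph, Namespace, URIRef, Literal
from rdflib.namespace import DCTERMS, RDF

ORE = Namespace("http://www.openarchives.org/ore/terms/")

g = Graph()
g.bind("ore", ORE)
g.bind("dcterms", DCTERMS)

# Hypothetical identifiers: a DOI-resolved aggregation linking a dataset
# to the workflow that produced it.
aggregation = URIRef("https://doi.org/10.0000/example-dataset")
dataset = URIRef("https://example.org/data/dataset.nc")
workflow = URIRef("https://example.org/provenance/workflow.xml")

g.add((aggregation, RDF.type, ORE.Aggregation))
g.add((aggregation, ORE.aggregates, dataset))
g.add((aggregation, ORE.aggregates, workflow))
g.add((workflow, DCTERMS.provenance,
       Literal("Scientific workflow describing how the dataset was produced")))

print(g.serialize(format="turtle"))
```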
Abstract:
Low-power medium access control (MAC) protocols used for communication of energy-constrained wireless embedded devices do not cope well with situations where transmission channels are highly erroneous. Existing MAC protocols discard corrupted messages, which leads to costly retransmissions. To improve transmission performance, it is possible to include an error correction scheme and transmit/receive diversity. It is possible to add redundant information to transmitted packets in order to recover data from corrupted packets. It is also possible to make use of transmit/receive diversity via multiple antennas to improve error resiliency of transmissions. Both schemes may be used in conjunction to further improve the performance. In this study, the authors show how an error correction scheme and transmit/receive diversity can be integrated in low-power MAC protocols. Furthermore, the authors investigate the achievable performance gains of both methods. This is important as both methods have associated costs (processing requirements; additional antennas and power) and for a given communication situation it must be decided which methods should be employed. The authors’ results show that, in many practical situations, error control coding outperforms transmission diversity; however, if very high reliability is required, it is useful to employ both schemes together.
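As a toy illustration of adding redundant information so that data can be recovered from corrupted packets (a simple triple-repetition code with bitwise majority voting, not the specific coding scheme the authors evaluate), consider the following sketch:

```python
def encode_repetition(payload: bytes, copies: int = 3) -> bytes:
    # Add redundancy by transmitting several copies of the payload.
    return payload * copies

def decode_repetition(packet: bytes, copies: int = 3) -> bytes:
    # Recover each byte by a bitwise majority vote across the copies.
    n = len(packet) // copies
    chunks = [packet[i * n:(i + 1) * n] for i in range(copies)]
    out = bytearray()
    for column in zip(*chunks):
        byte = 0
        for bit in range(8):
            ones = sum((c >> bit) & 1 for c in column)
            if ones * 2 > copies:
                byte |= 1 << bit
        out.append(byte)
    return bytes(out)

# A corrupted byte in one copy is repaired by the vote.
pkt = bytearray(encode_repetition(b"sensor-reading"))
pkt[3] ^= 0xFF  # simulate channel errors hitting the first copy
assert decode_repetition(bytes(pkt)) == b"sensor-reading"
```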
Abstract:
IEEE 754 floating-point arithmetic is widely used in modern, general-purpose computers. It is based on real arithmetic and is made total by adding both a positive and a negative infinity, a negative zero, and many Not-a-Number (NaN) states. Transreal arithmetic is total. It also has a positive and a negative infinity but no negative zero, and it has a single, unordered number, nullity. Modifying the IEEE arithmetic so that it uses transreal arithmetic has a number of advantages. It removes one redundant binade from IEEE floating-point objects, doubling the numerical precision of the arithmetic. It removes eight redundant, relational, floating-point operations and removes the redundant total order operation. It replaces the non-reflexive, floating-point, equality operator with a reflexive equality operator and it indicates that some of the exceptions may be removed as redundant, subject to issues of backward compatibility and transient future compatibility as programmers migrate to the transreal paradigm.
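The reflexivity point can be seen directly in standard IEEE 754 behaviour; the transreal-style comparison below is a hypothetical sketch (the class name and representation are illustrative, not taken from the paper):

```python
import math

# IEEE 754: equality is not reflexive for NaN values.
x = float("nan")
print(x == x)                # False -- NaN compares unequal to itself
print(math.inf == math.inf)  # True

# Hypothetical sketch of a transreal-style value in which the single
# unordered element "nullity" is equal to itself (reflexive equality).
class Transreal:
    def __init__(self, value, is_nullity=False):
        self.value = value
        self.is_nullity = is_nullity

    def __eq__(self, other):
        if not isinstance(other, Transreal):
            return NotImplemented
        if self.is_nullity or other.is_nullity:
            # Nullity equals nullity and nothing else.
            return self.is_nullity and other.is_nullity
        return self.value == other.value

NULLITY = Transreal(None, is_nullity=True)
print(NULLITY == NULLITY)    # True -- equality is reflexive
```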
Abstract:
We discuss a floating mechanism based on a quasi-magnetic levitation method that can be attached to the endpoint of a robot arm in order to construct a novel redundant robot arm for producing compliant motions. The floating mechanism is composed of magnets and a constraint mechanism such that the repelling force of the magnets keeps the endpoint part of the mechanism floating stably during the guided motions. The analytical and experimental results show that the proposed floating mechanism can produce stable floating motions with small inertia and viscosity. The results also show that the proposed mechanism can detect small forces applied to the endpoint part because the friction force of the mechanism is very small.
Abstract:
Melt-polycondensation of succinic acid anhydride with oxazoline-based diol monomers gave hyper-branched polymers with carboxylic acid terminal groups. 1H NMR and quantitative 13C NMR spectroscopy, coupled with a DEPT-135 13C NMR experiment, showed high degrees of branching (over 60%). Esterification of the acid end groups by addition of citronellol at 160 °C produced novel white-spirit-soluble resins which were characterized by Fourier transform infrared (FTIR) spectroscopy, gel permeation chromatography (GPC), differential scanning calorimetry (DSC) and thermogravimetric analysis (TGA). Blends of the new hyperbranched materials with commercial alkyd resins resulted in a dramatic, concentration-dependent drop in viscosity. Solvent-borne coatings were formulated containing the hyperbranched polymers. Dynamic mechanical analysis studies revealed that the air-drying rates of the new coating systems were enhanced compared with identical formulations containing only commercial alkyd resins.
Abstract:
Glutamine synthetase (GS) is a key enzyme in nitrogen (N) assimilation, particularly during seed development. Three cytosolic GS isoforms (HvGS1) were identified in barley (Hordeum vulgare L. cv Golden Promise). Quantitation of gene expression, localization and response to N supply revealed that each gene plays a non-redundant role in different tissues and during development. Localization of HvGS1_1 in vascular cells of different tissues, combined with its abundance in the stem and its response to changes in N supply, indicates that it is important in N transport and remobilization. HvGS1_1 is located on chromosome 6H at 72.54 cM, close to the marker HVM074 which is associated with a major quantitative trait locus (QTL) for grain protein content (GPC). HvGS1_1 may be a potential candidate gene to manipulate barley GPC. HvGS1_2 mRNA was localized to the leaf mesophyll cells, in the cortex and pericycle of roots, and was the dominant HvGS1 isoform in these tissues. HvGS1_2 expression increased in leaves with an increasing supply of N, suggesting its role in the primary assimilation of N. HvGS1_3 was specifically and predominantly localized in the grain, being highly expressed throughout grain development. HvGS1_3 expression increased specifically in the roots of plants grown on high NH4+, suggesting that it has a primary role in grain N assimilation and also in the protection against ammonium toxicity in roots. The expression of HvGS1 genes is directly correlated with protein and enzymatic activity, indicating that transcriptional regulation is of prime importance in the control of GS activity in barley.
Abstract:
The health benefits of garlic have been proven by epidemiological and experimental studies. Diallyl disulphide (DADS), the major organosulfur compound found in garlic oil, is known to lower the incidence of breast cancer both in vitro and in vivo. The studies reported here demonstrate that DADS induces apoptosis in the MCF-7 breast-cancer cell line through interfering with cell-cycle growth phases in a way that increases the sub-G0 population and substantially halts DNA synthesis. DADS also induces phosphatidylserine (PS) translocation from the inner to the outer leaflet of the plasma membrane and activates caspase-3. Further studies revealed that DADS modulates the cellular levels of Bax, Bcl-2, Bcl-xL and Bcl-w in a dose-dependent manner, suggesting the involvement of Bcl-2 family proteins in DADS-induced apoptosis. Histone deacetylase inhibitors (HDACi) are known to suppress cancer growth and induce apoptosis in cancer cells. Here it is shown that DADS has HDACi properties in MCF-7 cells as it lowers the removal of an acetyl group from an acetylated substrate and induces histone-4 (H4) hyper-acetylation. The data thus indicate that the HDACi properties of DADS may be responsible for the induction of apoptosis in breast cancer cells.
Abstract:
We extend extreme learning machine (ELM) classifiers to complex Reproducing Kernel Hilbert Spaces (RKHS) where the input/output variables as well as the optimization variables are complex-valued. A new family of classifiers, called complex-valued ELM (CELM) and suitable for complex-valued multiple-input–multiple-output processing, is introduced. In the proposed method, the associated Lagrangian is computed using induced RKHS kernels, adopting a Wirtinger calculus approach formulated as a constrained optimization problem, in a manner similar to the conventional ELM classifier formulation. When training the CELM, the Karush–Kuhn–Tucker (KKT) theorem is used to solve the dual optimization problem, which consists of simultaneously satisfying the smallest training error and smallest output-weight norm criteria. The proposed formulation also addresses aspects of quaternary classification within a Clifford algebra context. For 2D complex-valued inputs, user-defined complex-coupled hyper-planes divide the classifier input space into four partitions. For 3D complex-valued inputs, the formulation generates three pairs of complex-coupled hyper-planes through orthogonal projections. The six hyper-planes then divide the 3D space into eight partitions. It is shown that the CELM problem formulation is equivalent to solving six real-valued ELM tasks, which are induced by projecting the chosen complex kernel across the different user-defined coordinate planes. A classification example of powdered samples on the basis of their terahertz spectral signatures is used to demonstrate the advantages of the CELM classifiers compared to their SVM counterparts. The proposed classifiers retain the advantages of their ELM counterparts, in that they can perform multiclass classification with lower computational complexity than SVM classifiers. Furthermore, because of their ability to perform classification tasks fast, the proposed formulations are of interest to real-time applications.
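For context, each of the real-valued ELM tasks mentioned above reduces to a regularised least-squares problem whose output weights follow from the KKT conditions. A minimal real-valued kernel ELM sketch is given below; the RBF kernel, the regularisation parameter and the toy data are illustrative choices, not taken from the paper.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise squared Euclidean distances -> Gaussian kernel matrix.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * d2)

def kelm_train(X, y, C=10.0, gamma=1.0):
    # One-hot encode class labels as +/-1 targets.
    classes = np.unique(y)
    T = np.where(y[:, None] == classes[None, :], 1.0, -1.0)
    K = rbf_kernel(X, X, gamma)
    # Output weights from the regularised least-squares (KKT) solution:
    # beta = (K + I/C)^{-1} T
    beta = np.linalg.solve(K + np.eye(len(X)) / C, T)
    return classes, beta

def kelm_predict(X_train, classes, beta, X_new, gamma=1.0):
    scores = rbf_kernel(X_new, X_train, gamma) @ beta
    return classes[np.argmax(scores, axis=1)]

# Toy usage with random features standing in for spectral signatures.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 5))
y = (X[:, 0] > 0).astype(int)
classes, beta = kelm_train(X, y)
print(kelm_predict(X, classes, beta, X[:5]))
```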
Abstract:
Present climate in the Nafud desert of northern Saudi Arabia is hyper-arid and moisture brought by north-westerly winds scarcely reaches the region. The existence of abundant palaeolake sediments provides evidence for a considerably wetter climate in the past. However, the existing chronological framework of these deposits is solely based on radiocarbon dating of questionable reliability, due to potential post-depositional contamination with younger 14C. By using luminescence dating, we show that the lake deposits were not formed between 40 and 20 ka as suggested previously, but approximately 410 ka, 320 ka, 200 ka, 125 ka, and 100 ka ago. All of these humid phases are in good agreement with those recorded in lake sediments and speleothems from southern Arabia. Surprisingly, no Holocene lake deposits were identified. Geological characteristics of the deposits and diatom analysis suggest that a single, perennial lake covered the entire south-western Nafud ca 320 ka ago. In contrast, lakes of the 200 ka, 125 ka, and 100 ka humid intervals were smaller and restricted to interdune depressions of a pre-existing dune relief. The concurrent occurrence of humid phases in the Nafud, southern Arabia and the eastern Mediterranean suggests that moisture in northern Arabia originated either from the Mediterranean due to more frequent frontal depression systems or from stronger Indian monsoon circulation. However, based on previously published climate model simulations and palaeoclimate evidence from central Arabia and the Negev desert, we argue that humid climate conditions in the Nafud were probably caused by a stronger African monsoon and a distinct change in zonal atmospheric circulation.
Abstract:
We aim to develop an efficient robotic system for stroke rehabilitation, in which a robotic arm moves the hemiplegic upper limb when the patient tries to move it. In order to achieve this goal we have considered a method to detect the patient's intended motion using EEG (Electroencephalogram), and have designed a rehabilitation robot based on a Redundant Drive Method. In this paper, we propose an EEG driven rehabilitation robot system and present initial results evaluating the feasibility of the proposed system.
Abstract:
Industrial robotic manipulators can be found in most factories today. Their tasks are accomplished through actively moving, placing and assembling parts. This movement is facilitated by actuators that apply a torque in response to a command signal. The presence of friction and possibly backlash has instigated the development of sophisticated compensation and control methods in order to achieve the desired performance, be that accurate motion tracking, fast movement or indeed contact with the environment. This thesis presents a dual drive actuator design that is capable of physically linearising friction and hence eliminating the need for complex compensation algorithms. A number of mathematical models are derived that allow for the simulation of the actuator dynamics. The actuator may be constructed using geared dc motors, in which case the benefits of torque magnification are retained whilst the increased non-linear friction effects are also linearised. An additional benefit of the actuator is the high-quality, low-latency output position signal provided by the differencing of the two drive positions. Due to this and the linearised nature of friction, the actuator is well suited for low-velocity, stop-start applications, micro-manipulation and even hard-contact tasks. There are, however, disadvantages to its design. When idle, the device uses power whilst many other, single-drive actuators do not. Also, the complexity of the models means that parameterisation is difficult. Management of start-up conditions still poses a challenge.
Abstract:
Much has been written on Roth’s representation of masculinity, but this critical discourse has tended to be situated within a heteronormative frame of reference, perhaps because of Roth’s popular reputation as an aggressively heterosexual, libidinous, masculinist, in some versions sexist or even misogynist author. In this essay I argue that Roth’s representation of male sexuality is more complex, ambiguous, and ambivalent than has been generally recognized. Tracing a strong thread of what I call homosocial discourse running through Roth’s oeuvre, I suggest that the series of intimate relationships with other men that many of Roth’s protagonists form are conspicuously couched in this discourse and that a recognition of this ought to reconfigure our sense of the sexual politics of Roth’s career, demonstrating in particular that masculinity in his work is too fluid and dynamic to be accommodated by the conventional binaries of heterosexual and homosexual, feminized Jew and hyper-masculine Gentile, the “ordinary sexual man” and the transgressively desiring male subject.
Abstract:
Coconut, Cocos nucifera L., is a major plantation crop which ensures income for millions of people in the tropical region. Detailed molecular studies on zygotic embryo development would provide valuable clues for the identification of molecular markers to improve somatic embryogenesis. Since there is no ongoing genome project for this species, coconut expressed sequence tags (ESTs) provide an attractive approach for identifying important coconut embryo-specific genes as well as other functional genes in different biochemical pathways. The goal of this study was to analyse the ESTs by examining the transcriptome data of the different embryo tissue types together with one somatic tissue. Here, four cDNA libraries from immature embryo, mature embryo, microspore-derived embryo and mature leaves were constructed. cDNA was sequenced by the Roche-454 GS-FLX system and assembled into 32621 putative unigenes and 155017 singletons. Of these unigenes, 18651 had significant sequence similarities to the non-redundant protein database, from which 16153 were assigned to one or more gene ontology categories. Homologues of genes responsible for embryo development, such as chitinase, beta-1,3-glucanase, ATP synthase CF0 subunit, thaumatin-like protein and metallothionein-like protein, were identified in the embryo EST collection. Of the unigenes, 6694 were mapped into 139 KEGG pathways including carbohydrate metabolism, energy metabolism, lipid metabolism, amino acid metabolism and nucleotide metabolism. This collection of 454-derived EST data generated from different tissue types provides a significant resource for genome-wide studies and gene discovery of coconut, a non-model species.