980 results for "Tibetan coded character set extension A"
Abstract:
Objective. To localize the regions containing genes that determine susceptibility to ankylosing spondylitis (AS). Methods. One hundred five white British families with 121 affected sibling pairs with AS were recruited, largely from the Royal National Hospital for Rheumatic Diseases AS database. A genome-wide linkage screen was undertaken using 254 highly polymorphic microsatellite markers from the Medical Research Council (UK) (MRC) set. The major histocompatibility complex (MHC) region was studied more intensively using 5 microsatellites lying within the HLA class III region and HLA-DRB1 typing. The Analyze package was used for 2-point analysis, and GeneHunter for multipoint analysis. Results. When only the MRC set was considered, 11 markers in 7 regions achieved a P value of ≤0.01. The maximum logarithm of odds score obtained was 3.8 (P = 1.4 × 10^-5) using marker D6S273, which lies in the HLA class III region. A further marker used in mapping of the MHC class III region achieved a LOD score of 8.1 (P = 1 × 10^-9). Nine of 118 affected sibling pairs (7.6%) did not share parental haplotypes identical by descent across the MHC, suggesting that only 31% of the susceptibility to AS is coded by genes linked to the MHC. The maximum non-MHC LOD score obtained was 2.6 (P = 0.0003) for marker D16S422. Conclusion. The results of this study confirm the strong linkage of the MHC with AS, and provide suggestive evidence regarding the presence and location of non-MHC genes influencing susceptibility to the disease.
Abstract:
Species identification based on short sequences of DNA markers, that is, DNA barcoding, has emerged as an integral part of modern taxonomy. However, software for the analysis of large and multilocus barcoding data sets is scarce. The Basic Local Alignment Search Tool (BLAST) is currently the fastest tool capable of handling large databases (e.g. >5000 sequences), but its accuracy is a concern, and it has been criticized for its local optimization. More accurate current software, however, requires sequence alignment or complex calculations, which are time-consuming for large data sets during data preprocessing or during the search stage. It is therefore imperative to develop a practical program for both accurate and scalable species identification for DNA barcoding. In this context, we present VIP Barcoding: user-friendly software with a graphical user interface for rapid DNA barcoding. It adopts a hybrid, two-stage algorithm. First, an alignment-free composition vector (CV) method is utilized to reduce the search space by screening a reference database. The alignment-based K2P-distance nearest-neighbour method is then employed to analyse the smaller data set generated in the first stage. In comparison with other software, we demonstrate that VIP Barcoding has (i) higher accuracy than Blastn and several alignment-free methods and (ii) higher scalability than alignment-based distance methods and character-based methods. These results suggest that this platform is able to deal with both large-scale and multilocus barcoding data with accuracy and can contribute to DNA barcoding for modern taxonomy. VIP Barcoding is free and available at http://msl.sls.cuhk.edu.hk/vipbarcoding/.
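The two-stage idea described above can be sketched in a few lines of Python. This is a hypothetical illustration of the general approach (k-mer composition vectors for alignment-free screening, then the Kimura 2-parameter distance on the shortlisted candidates), not VIP Barcoding's actual implementation; for simplicity it assumes the query and reference sequences are pre-aligned and of equal length at the K2P stage.

```python
import math
from collections import Counter

PURINES = {"A", "G"}

def composition_vector(seq, k=3):
    """Normalized k-mer frequency vector (alignment-free)."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return {kmer: n / total for kmer, n in counts.items()}

def cv_distance(v1, v2):
    """Euclidean distance between two composition vectors."""
    keys = set(v1) | set(v2)
    return math.sqrt(sum((v1.get(x, 0.0) - v2.get(x, 0.0)) ** 2 for x in keys))

def k2p_distance(s1, s2):
    """Kimura 2-parameter distance for two aligned, equal-length sequences."""
    assert len(s1) == len(s2)
    transitions = transversions = 0
    for a, b in zip(s1, s2):
        if a != b:
            if (a in PURINES) == (b in PURINES):
                transitions += 1    # A<->G or C<->T
            else:
                transversions += 1
    p, q = transitions / len(s1), transversions / len(s1)
    x1, x2 = 1 - 2 * p - q, 1 - 2 * q
    if x1 <= 0 or x2 <= 0:
        return float("inf")         # too divergent for the K2P correction
    return -0.5 * math.log(x1) - 0.25 * math.log(x2)

def identify(query, reference, n_candidates=2):
    """Stage 1: CV screening of the database; stage 2: K2P nearest neighbour."""
    qv = composition_vector(query)
    shortlist = sorted(
        reference, key=lambda rec: cv_distance(qv, composition_vector(rec[1]))
    )[:n_candidates]
    return min(shortlist, key=lambda rec: k2p_distance(query, rec[1]))[0]
```

The point of the hybrid design is that the cheap CV screen touches every database record, while the more expensive distance computation runs only on the short candidate list.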
Abstract:
The two-dimensional polymeric structures of the caesium complexes with the phenoxyacetic acid analogues (4-fluorophenoxy)acetic acid, (3-chloro-2-methylphenoxy)acetic acid and the herbicidally active (2,4-dichlorophenoxy)acetic acid (2,4-D), namely poly[[5-(4-fluorophenoxy)acetato][4-(4-fluorophenoxy)acetato]dicaesium], [Cs2(C8H6FO3)2]n, (I), poly[aqua[5-(3-chloro-2-methylphenoxy)acetato]caesium], [Cs(C9H8ClO3)(H2O)]n, (II), and poly[[7-(2,4-dichlorophenoxy)acetato][(2,4-dichlorophenoxy)acetic acid]caesium], [Cs(C8H5Cl2O3)(C8H6Cl2O3)]n, (III), are described. In (I), the Cs+ cations of the two individual irregular coordination polyhedra in the asymmetric unit (one CsO7 and the other CsO8) are linked by bridging carboxylate O-atom donors from the two ligand molecules, both of which are involved in bidentate chelate Ocarboxy,Ophenoxy interactions, while only one has a bidentate carboxylate O,O'-chelate interaction. Polymeric extension is achieved through a number of carboxylate O-atom bridges, with a minimum Cs...Cs separation of 4.3231 (9) Å, giving layers which lie parallel to (001). In hydrated complex (II), the irregular nine-coordination about the Cs+ cation comprises a single monodentate water molecule, a bidentate Ocarboxy,Ophenoxy chelate interaction and six bridging carboxylate O-atom bonding interactions, giving a Cs...Cs separation of 4.2473 (3) Å. The water molecule forms intralayer hydrogen bonds within the two-dimensional layers, which lie parallel to (100). In complex (III), the irregular centrosymmetric CsO6Cl2 coordination environment comprises two O-atom donors and two ring-substituted Cl-atom donors from two hydrogen bis[(2,4-dichlorophenoxy)acetate] ligand species in a bidentate chelate mode, and four O-atom donors from bridging carboxyl groups. The duplex ligand species lie across crystallographic inversion centres, linked through a short O-H...O hydrogen bond involving the single acid H atom. Structure extension gives layers which lie parallel to (001).
The present set of structures of Cs salts of phenoxyacetic acids follows the trend previously demonstrated among the alkali metal salts of simple benzoic acids having no stereochemically favourable interactive substituent groups, namely the formation of two-dimensional coordination polymers.
Abstract:
Theodor Adorno was opposed to the cinema because he felt it was too close to reality, and ipso facto an extension of ideological Capital, as he wrote in 1944 in Dialectic of Enlightenment. What troubled Adorno was the iconic nature of cinema – the semiotic category invented by C. S. Peirce, in which the signifier (sign) does not merely signify, in the arbitrary capacity attested by Saussure, but mimics the formal-visual qualities of its referent. Iconicity finds its perfect example in film's ingenuous surface illusion of an unmediated reality. The genealogy of the iconic, since classical antiquity, lies in the Greek term eikōn ("image"), which referred to the ancient portrait statues of victorious athletes thought to bear a direct similitude to their parent divinities. For the postwar, Hollywood-film spectator, Adorno said, "the world outside is an extension of the film he has just left," because realism is a precise instrument for the manipulation of the mass spectator by the culture industry, for which the filmic image is an advertisement for the world unedited. Mimesis, or the reproduction of reality, is a "mere reproduction of the economic base." It is precisely film's iconicity, then, its "realist aesthetic . . . [that] makes it inseparable from its commodity character."...
Abstract:
This investigation combined musicality and theatricality in the creation of four shows: Bear with Me, The Empty City, Gentlemen Songsters and Warmwaters. Led by creative practice, the research identified four polyvalences that characterise Composed Theatre, a transformational artistic domain which offers distinct challenges for performance makers. These include tensions and resolutions between compositional and theatrical thinking; music and words; setlist and script; and finally persona and character. The research finds that these interplays not only lend Composed Theatre its distinct qualities, but offer a potential set of balances to strike for writers, performers, composers and musicians who mix music and theatre in intermedial performance.
A novel human leucocyte antigen-DRB1 genotyping method based on multiplex primer extension reactions
Abstract:
We have developed and validated a semi-automated fluorescent method of genotyping human leucocyte antigen (HLA)-DRB1 alleles, HLA-DRB1*01-16, by multiplex primer extension reactions. This method is based on the extension of a primer that anneals immediately adjacent to the single-nucleotide polymorphism with fluorescent dideoxynucleotide triphosphates (minisequencing), followed by analysis on an ABI Prism 3700 capillary electrophoresis instrument. The validity of the method was confirmed by genotyping 261 individuals using both this method and polymerase chain reaction with sequence-specific primer (PCR-SSP) or sequencing and by demonstrating Mendelian inheritance of HLA-DRB1 alleles in families. Our method provides a rapid means of performing high-throughput HLA-DRB1 genotyping using only two PCR reactions followed by four multiplex primer extension reactions and PCR-SSP for some allele groups. In this article, we describe the method and discuss its advantages and limitations.
Abstract:
The present work focuses on simulation of the nonlinear mechanical behavior of adhesively bonded DLS (double lap shear) joints at varying extension rates and temperatures using the implicit ABAQUS solver. Load-displacement curves of DLS joints at nine combinations of extension rate and environmental temperature are first obtained by conducting tensile tests in a universal testing machine (UTM). The joint specimens are made from dual-phase (DP) steel coupons bonded with a rubber-toughened adhesive. It is shown that the shell-solid model of a DLS joint, in which the substrates are modeled with shell elements and the adhesive with solid elements, can effectively predict the mechanical behavior of the joint. The exponent Drucker-Prager or von Mises yield criterion, together with nonlinear isotropic hardening, is used for the simulation of the DLS joint tests. It has been found that at a low temperature (-20 degrees C), both the von Mises and exponent Drucker-Prager criteria give close predictions of the experimental load-extension curves. However, at a high temperature (82 degrees C), the von Mises condition tends to yield a perceptibly softer joint behavior, while the corresponding response obtained using the exponent Drucker-Prager criterion is much closer to the experimental load-displacement curve.
Abstract:
High-end network security applications demand high-speed operation and large rule set support. Packet classification is the core functionality that demands high throughput in such applications. This paper proposes a packet classification architecture to meet such high throughput. We have implemented a Firewall with this architecture in reconfigurable hardware. We propose an extension to the Distributed Crossproducting of Field Labels (DCFL) technique to achieve a scalable and high-performance architecture. The implemented Firewall takes advantage of the inherent structure and redundancy of the rule set by using our DCFL Extended (DCFLE) algorithm. The use of the DCFLE algorithm results in both speed and area improvements when implemented in hardware. Although we restrict ourselves to standard 5-tuple matching, the architecture supports additional fields. High-throughput classification invariably uses Ternary Content Addressable Memory (TCAM) for prefix matching, though TCAM fares poorly in terms of area and power efficiency. Using TCAM for port range matching is expensive, as the range-to-prefix conversion results in a large number of prefixes, leading to storage inefficiency. Extended TCAM (ETCAM) is fast and the most storage-efficient solution for range matching. We present, for the first time, a reconfigurable hardware implementation of ETCAM. We have implemented our Firewall as an embedded system on a Virtex-II Pro FPGA based platform, running Linux with the packet classification in hardware. The Firewall was tested in real time with a 1 Gbps Ethernet link and 128 sample rules. The packet classification hardware uses a quarter of the logic resources and slightly over one third of the memory resources of the XC2VP30 FPGA. It achieves a maximum classification throughput of 50 million packets/s, corresponding to a 16 Gbps link rate for the worst-case packet size. A Firewall rule update involves only memory re-initialization in software, without any hardware change.
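The range-to-prefix blow-up mentioned in the abstract is easy to see concretely. The sketch below is an illustration of the standard conversion (not the paper's DCFLE or ETCAM code): it splits a port range into the minimal list of ternary prefixes a conventional TCAM would need, and the well-known worst-case range [1, 65534] expands to 30 entries.

```python
def range_to_prefixes(lo, hi, bits=16):
    """Split the integer range [lo, hi] into the minimal list of
    (value, prefix_len) pairs, i.e. the ternary prefixes a conventional
    TCAM would need to match the range exactly."""
    prefixes = []
    while lo <= hi:
        # Largest power-of-two block aligned at `lo`...
        size = lo & -lo if lo > 0 else 1 << bits
        # ...shrunk until it fits inside the remaining range.
        while size > hi - lo + 1:
            size //= 2
        prefixes.append((lo, bits - size.bit_length() + 1))
        lo += size
    return prefixes

def as_ternary(value, prefix_len, bits=16):
    """Render a prefix as a TCAM-style ternary string, '*' = don't care."""
    return format(value, f"0{bits}b")[:prefix_len] + "*" * (bits - prefix_len)
```

For example, `range_to_prefixes(1024, 2047)` collapses to a single /6 prefix, while `range_to_prefixes(1, 65534)` costs 30 TCAM entries for one rule field, which is why range matching motivates ETCAM.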
Abstract:
Young females with mild hallux valgus (HV) have been identified as having an increased risk of first ray deformation. Little is known, however, about the biomechanical changes that might contribute to this increased risk. The purpose of this study was to compare kinetic changes during walking in high-heel-height shoes between mild HV subjects and controls. Twelve female subjects (six with mild HV and six controls) participated in this study, with heel height varying from 0 cm (barefoot) to 4.5 cm. Compared to healthy controls, patients had significantly higher peak pressure in the big toe area during barefoot walking. As heel height increased, loading was transferred to the medial side of the forefoot, and in the mild HV group the big toe area experienced greater impact than in barefoot walking. This study also demonstrated that the center of pressure (COP) shifted toward the medial side when high-heeled shoes were worn. These findings indicate that people with mild HV should be discouraged from wearing high-heeled shoes.
Abstract:
In treatment comparison experiments, the treatment responses are often correlated with some concomitant variables which can be measured before or at the beginning of the experiments. In this article, we propose schemes for the assignment of experimental units that may greatly improve the efficiency of the comparison in such situations. The proposed schemes are based on general ranked set sampling. The relative efficiency and cost-effectiveness of the proposed schemes are studied and compared. It is found that some proposed schemes are always more efficient than the traditional simple random assignment scheme when the total cost is the same. Numerical studies show promising results using the proposed schemes.
Abstract:
To detect errors in decision tables one needs to decide whether a given set of constraints is feasible or not. This paper describes an algorithm to do so when the constraints are linear in variables that take only integer values. Decision tables with such constraints occur frequently in business data processing and in nonnumeric applications. The aim of the algorithm is to exploit the abundance of very simple constraints that occur in typical decision table contexts. Essentially, the algorithm is a backtrack procedure in which the solution space is pruned by using the set of simple constraints. After some simplifications, the simple constraints are captured in an acyclic directed graph with weighted edges. Further, only those partial vectors are considered for extension which can be extended to assignments that will at least satisfy the simple constraints. This is how pruning of the solution space is achieved. For every partial assignment considered, the graph representation of the simple constraints provides a lower bound for each variable which is not yet assigned a value. These lower bounds play a vital role in the algorithm, and they are obtained efficiently by updating older lower bounds. The present algorithm also incorporates a check of whether or not an (m - 2)-ary vector can be extended to a solution vector of m components, thereby reducing backtracking by one component.
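A minimal sketch of this style of feasibility check follows. It is not the paper's algorithm: it treats the simple constraints as difference constraints x[j] >= x[i] + c (weighted edges i -> j of a digraph), relaxes them to obtain lower bounds for the unassigned variables, and backtracks over integer domains, pruning any partial vector that cannot be extended to satisfy the simple constraints; the remaining general constraints are tested only on complete assignments.

```python
def feasible(domains, simple, general):
    """Backtracking feasibility check for integer constraints.

    domains: per-variable iterables of candidate integer values
    simple:  triples (i, j, c) encoding x[j] >= x[i] + c -- the 'simple'
             constraints, viewed as weighted edges i -> j of a digraph
    general: predicates over a complete assignment tuple
    """
    n = len(domains)

    def lower_bounds(assign):
        """Relax the edges to get lower bounds for unassigned variables."""
        lb = [min(d) for d in domains]
        for k, v in enumerate(assign):
            lb[k] = v
        for _ in range(n):                 # bounded Bellman-Ford-style passes
            changed = False
            for i, j, c in simple:
                if j >= len(assign) and lb[i] + c > lb[j]:
                    lb[j] = lb[i] + c
                    changed = True
            if not changed:
                break
        return lb

    def extend(assign):
        if len(assign) == n:
            return all(g(assign) for g in general)
        k = len(assign)
        lb = lower_bounds(assign)
        for v in domains[k]:
            if v < lb[k]:
                continue                   # cannot satisfy the simple constraints
            # Edges between x[k] and already-assigned variables must hold now.
            if any(i < k and j == k and v < assign[i] + c for i, j, c in simple):
                continue
            if any(j < k and i == k and assign[j] < v + c for i, j, c in simple):
                continue
            if extend(assign + (v,)):
                return True
        return False

    return extend(())
```

With the chain x1 >= x0 + 1 and x2 >= x1 + 1, the lower bounds immediately rule out every candidate below 1 for x1 and below 2 for x2, which is the flavour of pruning the abstract describes.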
Abstract:
This paper considers the one-sample sign test for data obtained from general ranked set sampling when the numbers of observations for each rank are not necessarily the same, and proposes a weighted sign test, because observations with different ranks are not identically distributed. The optimal weight for each observation is distribution free and depends only on its associated rank. It is shown analytically that (1) the weighted version always improves the Pitman efficiency for all distributions; and (2) the optimal design is to select the median from each ranked set.
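The reason ranks can be weighted exactly is that, for continuous distributions, the probability that the r-th order statistic of a set of size k exceeds the population median is distribution free (a binomial tail probability). The sketch below illustrates a weighted sign statistic built on that fact, with user-supplied weights; the optimal rank-specific weights derived in the paper are not reproduced here.

```python
from math import comb, erfc, sqrt

def p_exceed_median(rank, set_size):
    """P(rank-th order statistic of a set of size `set_size` exceeds the
    population median). Distribution free for continuous distributions:
    equals P(Binomial(set_size, 1/2) <= rank - 1)."""
    return sum(comb(set_size, i) for i in range(rank)) / 2 ** set_size

def weighted_sign_test(data, theta0, set_size, weights):
    """Weighted one-sample sign test for H0: median = theta0.

    data:    (rank, value) pairs from ranked set sampling, ranks 1..set_size
    weights: dict rank -> weight (placeholder values here; the paper derives
             the optimal distribution-free weight per rank)
    Returns (z, approximate two-sided normal p-value).
    """
    s = var = 0.0
    for r, x in data:
        p = p_exceed_median(r, set_size)
        w = weights[r]
        s += w * ((x > theta0) - p)        # centred sign contribution
        var += w * w * p * (1 - p)
    z = s / sqrt(var)
    return z, erfc(abs(z) / sqrt(2))
```

Note that the minimum of a set of 3 exceeds the median with probability 1/8, while the median of the set does so with probability 1/2, which is exactly why unweighted pooling of ranks is suboptimal.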
Abstract:
Nahhas, Wolfe, and Chen (2002, Biometrics 58, 964-971) considered optimal set size for ranked set sampling (RSS) with fixed operational costs. This framework can be very useful in practice to determine whether RSS is beneficial and to obtain the optimal set size that minimizes the variance of the population estimator for a fixed total cost. In this article, we propose a scheme of general RSS in which more than one observation can be taken from each ranked set. This is shown to be more cost-effective in some cases when the cost of ranking is not so small. Using the example in Nahhas, Wolfe, and Chen (2002, Biometrics 58, 964-971), we demonstrate that taking two or more observations from each set, even with the optimal set size from the RSS design, can be more beneficial.