960 results for Perfect
Abstract:
The structural characterization in crystals of three designed decapeptides containing a double D-segment at the C-terminus is described. The crystal structures of the peptides Boc-Leu-Aib-Val-Xxx-Leu-Aib-Val-(D)Ala-(D)Leu-Aib-OMe (Xxx = Gly 2, (D)Ala 3, Aib 4) have been determined and compared with those reported earlier for peptide 1 (Xxx = Ala) and the all-L analogue Boc-Leu-Aib-Val-Ala-Leu-Aib-Val-Ala-Leu-Aib-OMe, which yielded a perfect right-handed alpha-helical structure. Peptides 1 and 2 reveal a right-handed helical segment spanning residues 1 to 7, ending in a Schellman motif with Ala(8) functioning as the terminating residue. Polypeptide chain reversal occurs at residue 9, a novel feature that appears to be the consequence of a C-H...O hydrogen bond between the residue 4 C(alpha)-H and residue 9 CO groups. The structures of peptides 3 and 4, which lack the pro-R hydrogen at the C-alpha atom of residue 4, are dramatically different. Peptide 3 adopts a right-handed helical conformation over the 1 to 7 segment. Residues 8 and 9 adopt conformations forming a C-terminal type I' beta-turn, corresponding to an incipient left-handed twist of the polypeptide chain. In peptide 4, helix termination occurs at Aib(6), with residues 6 to 9 forming a left-handed helix, resulting in a structure that accommodates the direct fusion of two helical segments of opposite twist. Peptides 3 and 4 provide examples of chiral residues occurring in the less favored sense of helical twist: (D)Ala(4) in peptide 3 adopts an alpha(R) conformation, while (L)Val(7) in 4 adopts an alpha(L) conformation. The structural comparison of the decapeptides reported here provides evidence for the role of specific C-H...O hydrogen bonds in stabilizing chain reversals at helix termini, which may be relevant in aligning contiguous helical and strand segments in polypeptide structures.
Abstract:
A local algorithm with local horizon r is a distributed algorithm that runs in r synchronous communication rounds; here r is a constant that does not depend on the size of the network. As a consequence, the output of a node in a local algorithm only depends on the input within r hops from the node. We give tight bounds on the local horizon for a class of local algorithms for combinatorial problems on unit-disk graphs (UDGs). Most of our bounds are due to a refined analysis of existing approaches, while others are obtained from new algorithms that we propose. The algorithms we consider are based on network decompositions guided by a rectangular tiling of the plane. The algorithms are applied to matching, independent set, graph colouring, vertex cover, and dominating set. We also study local algorithms on quasi-UDGs, a popular generalisation of UDGs aimed at a more realistic modelling of communication between the network nodes. In the quasi-UDG setting we also relax the assumption of perfect location awareness: the nodes need to know their coordinates only approximately, up to an additive error. Despite the localisation error, the quality of the solutions to problems on quasi-UDGs remains the same as in the case of UDGs with perfect location awareness. We analyse the increase in the local horizon that comes with moving from UDGs to quasi-UDGs.
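To make the tiling idea concrete, here is a minimal sketch (not the thesis's algorithms; the function names and the tile size are assumptions) of a constant-horizon dominating-set routine for a UDG with perfect location awareness: square tiles whose diagonal equals the communication radius make every tile a clique, so after one round in which neighbours exchange IDs and coordinates, the lowest-ID node of each non-empty tile can declare itself a dominator.

```python
import math
import random

def tile_of(x, y, side):
    """Index of the axis-aligned square tile (of the given side) containing (x, y)."""
    return (math.floor(x / side), math.floor(y / side))

def local_dominating_set(nodes, radius=1.0):
    """nodes: dict id -> (x, y). UDG edges join nodes within `radius` of each other.
    Tile diagonal = radius, so nodes sharing a tile are pairwise adjacent; the
    lowest-ID node per non-empty tile dominates that tile (local horizon 1)."""
    side = radius / math.sqrt(2)
    leaders = {}
    for nid, (x, y) in nodes.items():
        t = tile_of(x, y, side)
        if t not in leaders or nid < leaders[t]:
            leaders[t] = nid
    return set(leaders.values())

def is_dominating(nodes, chosen, radius=1.0):
    """Every node must be chosen or within `radius` of a chosen node."""
    return all(
        nid in chosen
        or any(math.hypot(x - nodes[c][0], y - nodes[c][1]) <= radius for c in chosen)
        for nid, (x, y) in nodes.items()
    )

random.seed(0)
pts = {i: (random.uniform(0, 5), random.uniform(0, 5)) for i in range(60)}
ds = local_dominating_set(pts)
print(len(ds), is_dominating(pts, ds))   # a valid dominating set, one node per occupied tile
```

Because a single node can dominate points in only a bounded number of tiles, the selected set is within a constant factor of optimal; moving to quasi-UDGs with only approximate coordinates would require adapting the tile size and adds communication rounds, which is the kind of increase in the local horizon analysed above.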
Abstract:
The crystal and molecular structures of the Tris salt of adenosine 5'-diphosphate were determined from X-ray diffraction data. The crystals are monoclinic, space group P2(1), with Z = 2, a = 9.198(2) Å, b = 6.894(1) Å, c = 18.440(4) Å, and beta = 92.55(2) degrees. Intensity data were collected on an automated diffractometer. The structure was solved by the heavy-atom technique and refined by least squares to R = 0.047. The ADP molecule adopts a folded conformation. The conformation about the glycosidic bond is anti. The conformation of the ribose ring is close to a perfect C(2')-endo-C(3')-exo puckering. The conformation about C(4')-C(5') is gauche-gauche, similar to other nucleotide structures. The pyrophosphate chain displays a nearly eclipsed geometry when viewed down the P-P vector, unlike the staggered conformation observed in crystal structures of other pyrophosphates. The less favorable eclipsed conformation probably results from the observed association of Tris molecules with the polar diphosphate chain through electrostatic interactions and hydrogen bonds. Such interactions may play an important role in Tris-buffered aqueous solutions of nucleotides and metal ions.
Abstract:
Using ab initio methods we have investigated the fluorination of graphene and find that different stoichiometric phases can be formed without a nucleation barrier, with the complete “2D-Teflon” CF phase being thermodynamically the most stable. The fluorinated graphene is an insulator and turns out to be a perfect matrix-host for patterning nanoroads and quantum dots of pristine graphene. The electronic and magnetic properties of the nanoroads can be tuned by varying the edge orientation and width. The energy gaps between the highest occupied and lowest unoccupied molecular orbitals (HOMO-LUMO) of the quantum dots are size-dependent and show a confinement typical of Dirac fermions. Furthermore, we study the effect of different F coverages of graphene (with stoichiometries CF and C4F) on the band gaps, and show the suitability of these materials to host graphene quantum dots with unique electronic properties.
Abstract:
Species identification forms the basis for understanding the diversity of the living world, but it is also a prerequisite for understanding many evolutionary patterns and processes. The most promising approach for correctly delimiting and identifying species is to integrate many types of information in the same study. Our aim was to test how cuticular hydrocarbons, traditional morphometrics, genetic polymorphisms in nuclear markers (allozymes and DNA microsatellites) and DNA barcoding (partial mitochondrial COI gene) perform in delimiting species. As an example, we used two closely related Formica ants, F. fusca and F. lemani, sampled from a sympatric population in the northern part of their distribution. Morphological characters vary and overlap in different parts of their distribution areas, but cuticular hydrocarbons include a strong taxonomic signal and our aim is to test the degree to which morphological and genetic data correspond to the chemical data. In the morphological analysis, species were best separated by the combined number of hairs on pronotum and mesonotum, but individual workers overlapped in hair numbers, as previously noted by several authors. Nests of the two species were separated but not clustered according to species in a Principal Component Analysis made on nuclear genetic data. However, model-based Bayesian clustering resulted in perfect separation of the species and gave no indication of hybridization. Furthermore, F. lemani and F. fusca did not share any mitochondrial haplotypes, and the species were perfectly separated in a phylogenetic tree. We conclude that F. fusca and F. lemani are valid species that can be separated in our study area relatively well with all methods employed. However, the unusually small genetic differentiation in nuclear markers (FST = 0.12) shows that they are closely related, and occasional hybridization between F. fusca and F. lemani cannot be ruled out.
Abstract:
Trafficking in human beings has become one of the most talked-about criminal concerns of the 21st century. But this is not all that it has become. Trafficking has also been declared one of the most pressing human rights issues of our time. In this sense, it has become part of the expansion of the human rights phenomenon. Although it is easy to see that the crime of trafficking violates several of the human rights of its victims, it is still, in its essence, a fairly conventional, although particularly heinous and often transnational, crime, consisting of acts between private actors and therefore lacking the vertical effect traditionally associated with human rights violations. This thesis asks, then, why and how the anti-trafficking campaign has been translated into human rights language. And even more fundamentally: in light of the critical, theoretical studies surrounding the expansion of the human rights phenomenon, especially that of Costas Douzinas, who has declared that we have come to the end of human rights as a consequence of the expansion and bureaucratization of the phenomenon, can human rights actually bring salvation to the victims of trafficking? The thesis demonstrates that the translation of the anti-trafficking campaign into human rights language has been a complicated process involving various actors, including scholars, feminist NGOs, local activists and global human rights NGOs. It has also been driven by a complicated web of interests: the most prevalent one, the sincere will to help the victims, has become entangled with other aims, such as political, economic and structural goals. As a consequence of its fragmented background, the human rights approach to trafficking still seeks its final form and consists of several different claims. After an assessment of these claims from a legal perspective, this thesis concludes that the approach is most relevant with regard to the mistreatment of victims of trafficking at the hands of state authorities. It seems to be quite common that authorities have trouble identifying the victims of trafficking, which means that the rights granted to them in international and national documents are not realized in practice; instead, victims of trafficking are systematically deported as illegal immigrants. It is argued that in order to understand the measures of the authorities, and to assess the usefulness of human rights, it is necessary to adopt a Foucauldian perspective and to view these measures as biopolitical defence mechanisms. From a biopolitical perspective, the victims of trafficking can be seen as a threat to the population, a threat that must be eliminated either by assimilating them into the main population with the help of disciplinary techniques or by excluding them completely from society. This biopolitical aim is accomplished through an impenetrable net of seemingly insignificant practices and discourses of which not even the participants are aware. As a result of these practices and discourses, trafficking victims, only very few of whom fit the myth of the perfect victim produced by biopolitical discourses, become invisible and therefore subject to deportation as (risky) illegal immigrants, turning them into bare life in the Agambenian sense, represented by the homo sacer, who cannot be sacrificed, yet does not enjoy the protection of society and its laws.
It is argued, following Jacques Rancière and Slavoj Žižek, that human rights can, through their universality and formal equality, provide bare life with the tools to formulate political claims, and thereby to use the politicization produced by its very exclusion to return to the sphere of power and politics. Even though human rights have inevitably become entangled with biopolitical practices, they are still perhaps the most efficient way to challenge biopower. Human rights have not, therefore, become useless for the victims of trafficking, but they must be conceived of as a universal tool for formulating political claims and challenging power. In the case of trafficking this means that human rights must be used to constantly renegotiate the borders of the problematic concept of the victim of trafficking created by international instruments, policies and discourses, including those that are sincerely aimed at providing help for the victims.
Abstract:
In this article we present a new, general but simple, microscopic expression for the time-dependent solvation energy of an ion. This expression is surprisingly similar to the expression for the time-dependent dielectric friction on a moving ion. We show that both the Chandra-Bagchi and the Fried-Mukamel formulations of solvation dynamics can be easily derived from this expression. This expression leads to an almost perfect agreement of the theory with all the available computer simulation results. Second, we show here for the first time that the mobility of a light solute ion can significantly accelerate its own solvation, especially in the underdamped limit. The latter result is also in excellent agreement with the computer simulations.
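For context, the quantity usually reported in such studies is the normalized solvation response, and the simplest continuum (Debye) estimate of its relaxation time is the longitudinal time; the new microscopic expression itself is not reproduced in the abstract, so the lines below give only the standard definitions it builds on:

```latex
S(t) \;=\; \frac{E_{\mathrm{solv}}(t) - E_{\mathrm{solv}}(\infty)}
                 {E_{\mathrm{solv}}(0) - E_{\mathrm{solv}}(\infty)},
\qquad
\tau_L \;=\; \frac{\varepsilon_\infty}{\varepsilon_0}\,\tau_D ,
```

where E_solv(t) is the time-dependent solvation energy of the ion, epsilon_0 and epsilon_infinity are the static and high-frequency dielectric constants, and tau_D is the Debye relaxation time. The microscopic formulations mentioned (Chandra-Bagchi, Fried-Mukamel) go beyond this continuum limit, and a self-motion contribution of the ion, absent from the continuum picture, is what produces the acceleration reported here for light solute ions.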
Abstract:
This dissertation examines the impacts of energy and climate policies on the energy and forest sectors, focusing on the case of Finland. The thesis consists of an introductory article and four separate studies. The dissertation was motivated by the climate concern and the increasing demand for renewable energy. In particular, the renewable energy consumption and greenhouse gas emission reduction targets of the European Union were driving this work. In Finland, both the forest and energy sectors play key roles in achieving these targets. In fact, the separation between the forest and energy sectors is diminishing as the energy sector utilizes increasing amounts of wood in energy production and as the forest sector becomes a more and more important energy producer. The objective of this dissertation is to identify and measure the impacts of climate and energy policies on the forest and energy sectors. In climate policy, the focus is on emissions trading, and in energy policy the dissertation focuses on the promotion of renewable forest-based energy use. The dissertation relies on empirical numerical models that are based on microeconomic theory. Numerical partial equilibrium mixed complementarity problem models were constructed to study the markets under scrutiny. The separate studies focus on co-firing of wood biomass and fossil fuels, liquid biofuel production in the pulp and paper industry, and the impacts of climate policy on the pulp and paper sector. The dissertation shows that policies promoting wood-based energy may have unexpected negative impacts. When a feed-in tariff is imposed together with emissions trading, the production of renewable electricity in some plants might decrease as the emissions price increases. The dissertation also shows that in liquid biofuel production, an investment subsidy may cause high direct policy costs and other negative impacts when compared to other policy instruments. The results of the dissertation also indicate that from the climate mitigation perspective, perfect competition is the favored wood market structure, at least if the emissions trading system is not global. In conclusion, this dissertation suggests that when promoting the use of wood biomass in energy production, the favored policy instruments are subsidies that directly promote renewable energy production (i.e. a production subsidy, a renewables subsidy or a feed-in premium). Moreover, the policy instrument should be designed to depend on the emissions price or on the substitute price. In addition, this dissertation shows that when planning policies to promote wood-based renewable energy, the goals of the policy scheme should be clear before decisions are made on the choice of policy instruments.
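As a rough indication of the model class (a generic textbook formulation, not the dissertation's actual model), a partial-equilibrium market can be written as a mixed complementarity problem in which a producer's supply decision and market clearing are paired complementary slackness conditions:

```latex
0 \le q \;\perp\; c'(q) + \tau e - p \;\ge\; 0,
\qquad
0 \le p \;\perp\; q - d(p) \;\ge\; 0,
```

Here q is the supplied quantity, p the market price, c'(q) the marginal production cost, d(p) the demand, and the wedge tau*e (emissions price times emission intensity) is one way a policy instrument can enter: positive output forces the price to cover marginal cost including the wedge, and a positive price forces the market to clear. Stacking such conditions for wood, electricity and fuel markets gives the kind of mixed complementarity models described above.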
Abstract:
We examine the potential for adaptation to climate change in Indian forests, and derive the macroeconomic implications of forest impacts and adaptation in India. The study is conducted by integrating results from the dynamic global vegetation model IBIS and the computable general equilibrium model GRACE-IN, which estimates macroeconomic implications for six zones of India. By comparing a reference scenario without climate change with a climate impact scenario based on the IPCC A2 scenario, we find major variations in the pattern of change across zones. Biomass stock increases in all zones but the Central zone. The increase in biomass growth is smaller, and declines in one more zone, the South zone, despite the higher stock. In the four zones with increases in biomass growth, harvest increases by only approximately one third of the change in biomass growth. This is due to two market effects of increased biomass growth. One is that an increase in biomass growth encourages more harvest, other things being equal. The other is that more harvest leads to a higher supply of timber, which lowers market prices. As a result, the rent on forested land also decreases. The lower prices and rent discourage further harvest, even though they may induce higher demand, which increases the pressure on harvest. In a less perfect world than the model describes, these two effects may contribute to an increase in the risk of deforestation as a consequence of higher biomass growth. Furthermore, higher harvest demands more labor and capital input in the forestry sector. Given a fixed total supply of labor and capital, this increases the cost of production in all the other sectors, although only very slightly. Forestry-dependent communities with declining biomass growth may, however, experience local unemployment as a result.
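The dampened harvest response can be seen in a stylized linear market (an illustration only, not the GRACE-IN specification): with demand d(p) = a - bp and a supply schedule shifted outward by the extra biomass growth Delta, s(p) = c + ep + Delta, the new equilibrium satisfies

```latex
\Delta q \;=\; \frac{b}{b+e}\,\Delta \;<\; \Delta,
\qquad
\Delta p \;=\; -\frac{\Delta}{b+e} \;<\; 0,
```

so only a fraction of the growth increase passes through to harvest while the price (and hence the land rent) falls, mirroring the roughly one-third pass-through and the declining rents reported above.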
Abstract:
An oscillating droplet method combined with electromagnetic levitation has been applied to determine the surface tensions of liquid pure iron, nickel and iron-nickel alloys as a function of temperature. The natural frequency of the oscillating droplet is evaluated using a Fourier analyser. The theoretical background of the method and the experimental set-up are described, and the influence of the magnetic field strength is also discussed. The experimental results are compared with those of other investigators and interpreted using theoretical models (Butler's equation, and the subregular and perfect solution models for the surface phase).
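For reference, in the oscillating droplet method the surface tension follows from the measured frequency of the fundamental (l = 2) Rayleigh mode of a force-free droplet of mass m:

```latex
\nu_R^{2} \;=\; \frac{8\,\sigma}{3\pi m}
\quad\Longleftrightarrow\quad
\sigma \;=\; \frac{3\pi m\,\nu_R^{2}}{8}.
```

In electromagnetic levitation the magnetic pressure and the apparent gravity split and shift this single peak into several, so the spectrum is extracted with a Fourier analyser and a correction (commonly the Cummings-Blackburn procedure) is applied before sigma is evaluated; this is one reason the influence of the magnetic field strength matters here.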
Abstract:
The performance of a program will ultimately be limited by its serial (scalar) portion, as pointed out by Amdahl's Law. Reported studies of instruction-level parallelism have thus far mixed data-parallel program portions with scalar program portions, often leading to contradictory and controversial results. We report an instruction-level behavioral characterization of scalar code containing minimal data-parallelism, extracted from highly vectorized programs of the PERFECT benchmark suite running on a Cray Y-MP system. We classify scalar basic blocks according to their instruction mix, characterize the data dependencies seen in each class, and, as a first step, measure the maximum intrablock instruction-level parallelism available. We observe skewed rather than balanced instruction distributions in scalar code and in individual basic block classes of scalar code; nonuniform distribution of parallelism across instruction classes; and, as expected, limited available intrablock parallelism. We identify frequently occurring data-dependence patterns and discuss new instructions to reduce latency. Toward effective scalar hardware, we study latency-pipelining trade-offs and restricted multiple-instruction-issue mechanisms.
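A toy sketch of how intrablock parallelism can be measured (an assumption-laden illustration, not the paper's methodology): treat a basic block as a DAG of true register dependences and take the ratio of instruction count to the longest dependence chain, i.e. the speedup attainable with unit latencies and unlimited issue width.

```python
def intrablock_ilp(block):
    """block: list of (dest_regs, source_regs) tuples in program order.
    Only true (read-after-write) dependences are tracked, i.e. renaming is
    assumed to remove anti- and output dependences; latencies are one cycle."""
    produced_at = {}                 # register -> depth at which its value is ready
    depths = []
    for dests, sources in block:
        depth = 1 + max((produced_at.get(r, 0) for r in sources), default=0)
        depths.append(depth)
        for r in dests:
            produced_at[r] = depth
    return len(block) / max(depths)  # instructions / critical-path length

# Example block (destination registers, source registers):
block = [
    (("r1",), ("r10", "r11")),   # r1 = r10 + r11
    (("r2",), ("r12", "r13")),   # r2 = r12 * r13   -- independent of the first
    (("r3",), ("r1", "r2")),     # r3 = r1 - r2     -- joins the two chains
    (("r4",), ("r3", "r14")),    # r4 = r3 + r14    -- extends the serial chain
]
print(intrablock_ilp(block))     # 4 / 3 = 1.33..., i.e. limited intrablock ILP
```

A real measurement must also account for memory dependences and the restricted issue mechanisms studied in the paper, both of which can only lower this upper bound.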
Abstract:
Channel assignment in multi-channel multi-radio wireless networks poses a significant challenge due to the scarcity of channels available in the wireless spectrum. Further, additional care has to be taken to account for the interference characteristics of the nodes in the network, especially when nodes are in different collision domains. This work views the problem of channel assignment in multi-channel multi-radio networks with multiple collision domains as a non-cooperative game in which the objective of each player is to maximize its individual utility by minimizing its interference. Necessary and sufficient conditions are derived for a channel assignment to be a Nash equilibrium (NE), and the efficiency of the NE is analyzed by deriving a lower bound on the price of anarchy of this game. A new fairness measure in the multiple-collision-domain context is proposed, and necessary and sufficient conditions for NE outcomes to be fair are derived. The equilibrium conditions are then applied to solve the channel assignment problem through three proposed algorithms, based on perfect or imperfect information, which rely on explicit communication between the players to arrive at an NE. A no-regret learning algorithm, the Freund and Schapire Informed algorithm, which has the additional advantage of low overhead in terms of information exchange, is proposed and its convergence to stabilizing outcomes is studied. New performance metrics are proposed, and extensive simulations are carried out in Matlab to obtain a thorough understanding of the performance of these algorithms on various topologies with respect to these metrics. The proposed algorithms were observed to converge well to NE, resulting in efficient channel assignment strategies.
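The Freund and Schapire algorithm is the exponential-weights (Hedge) scheme; the sketch below is a generic single-radio, single-collision-domain illustration with assumed parameters (loss = fraction of other players on a channel), not the paper's protocol or its Matlab setup, but it shows the multiplicative update that gives the no-regret property.

```python
import random

def hedge_channel_selection(num_players=9, num_channels=3, rounds=3000, eta=0.1, seed=1):
    """Each player keeps one weight per channel, samples a channel from the
    normalized weights, observes how crowded every channel was (informed,
    full-information setting), and applies the Hedge multiplicative update."""
    rng = random.Random(seed)
    weights = [[1.0] * num_channels for _ in range(num_players)]
    for _ in range(rounds):
        choices = []
        for w in weights:
            r, acc, pick = rng.random() * sum(w), 0.0, num_channels - 1
            for c, wc in enumerate(w):
                acc += wc
                if r <= acc:
                    pick = c
                    break
            choices.append(pick)
        counts = [choices.count(c) for c in range(num_channels)]
        for p, w in enumerate(weights):
            for c in range(num_channels):
                others = counts[c] - (1 if choices[p] == c else 0)
                loss = others / max(num_players - 1, 1)   # interference p would suffer on c
                w[c] *= (1.0 - eta) ** loss               # Hedge update, losses in [0, 1]
            s = sum(w)
            for c in range(num_channels):
                w[c] /= s                                 # renormalize for numerical stability
    return choices

print(hedge_channel_selection())   # channel picks after learning; occupancy tends to even out
```

The overhead is low because each player only needs the per-channel occupancy counts, not the full strategies of the other players, which is the "informed" aspect highlighted above.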
Abstract:
The shear alignment of an initially disordered lamellar phase is examined using lattice Boltzmann simulations of a mesoscopic model based on a free-energy functional for the concentration modulation. Here, lambda is the wavelength of the concentration modulation. For a small shear cell of width 8 lambda, the qualitative features of the alignment process depend strongly on the Schmidt number Sc = nu/D (the ratio of the kinematic viscosity and the mass diffusion coefficient). At low Schmidt number, there is a significant initial increase in the viscosity, coinciding with the alignment of layers along the extensional axis, followed by a decrease at long times due to alignment along the flow direction. At high Schmidt number, alignment takes place through the breakage and reformation of layers, because diffusion is slow compared with the shear deformation; this results in faster alignment. The system size has a strong effect on the alignment process: perfect alignment takes place for small systems of width 8 lambda and 16 lambda, while a larger system of width 32 lambda does not align completely even at long times. In the larger system there appears to be a dynamical steady state in which the layers are not perfectly aligned, reflecting a balance between the annealing of defects by shear and their creation by an instability of the aligned lamellar phase under shear. We observe two types of defect creation mechanisms: the buckling instability under dilation, which was reported earlier, and a second mechanism due to layer compression.
Abstract:
For an n(t) transmit, n(r) receive antenna system (an n(t) x n(r) system), a full-rate space-time block code (STBC) transmits min(n(t), n(r)) complex symbols per channel use. In this paper, a scheme is presented for obtaining a full-rate STBC for 4 transmit antennas and any n(r) with reduced ML-decoding complexity. The weight matrices of the proposed STBC are obtained from the unitary matrix representations of a Clifford algebra. By puncturing the symbols of the STBC, full-rate designs can be obtained for n(r) < 4. For any value of n(r), the proposed design offers the lowest ML-decoding complexity among known codes. The proposed design is comparable in error performance to the well-known Perfect code for 4 transmit antennas while offering lower ML-decoding complexity. Further, when n(r) < 4, the proposed design has higher ergodic capacity than the punctured Perfect code. Simulation results that corroborate these claims are presented.
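To illustrate what "ML-decoding complexity" counts, here is a toy brute-force ML decoder for the classical 2-antenna Alamouti STBC (an illustrative stand-in, not the proposed Clifford-algebra code): every candidate symbol pair is tested against the received vector, and the cost is the size of that joint search.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
QPSK = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)   # unit-energy symbols

def alamouti(s1, s2):
    """Alamouti codeword: rows are the two time slots, columns the two antennas."""
    return np.array([[s1, s2],
                     [-np.conj(s2), np.conj(s1)]])

def ml_decode(y, h):
    """Exhaustive ML: pick the symbol pair whose codeword best explains y = X h + n."""
    best, best_metric = None, np.inf
    for s1, s2 in itertools.product(QPSK, repeat=2):
        metric = np.sum(np.abs(y - alamouti(s1, s2) @ h) ** 2)
        if metric < best_metric:
            best, best_metric = (s1, s2), metric
    return best

h = (rng.standard_normal(2) + 1j * rng.standard_normal(2)) / np.sqrt(2)   # flat Rayleigh fading
sent = (QPSK[0], QPSK[3])
y = alamouti(*sent) @ h + 0.05 * (rng.standard_normal(2) + 1j * rng.standard_normal(2))
print("sent:   ", sent)
print("decoded:", ml_decode(y, h))
```

Here the joint search has only |constellation|^2 candidates; for a full-rate 4-antenna code the naive search grows with the number of symbols carried per codeword, and reduced-complexity designs such as the one proposed arrange the weight matrices so that the joint search splits into smaller, conditionally independent groups.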
Abstract:
The crystal structure of a daturalactone derivative has been determined by X-ray structural analysis. The compound crystallizes in the orthorhombic space group P2(1)2(1)2(1) with cell parameters a = 15.141(1) angstrom, b = 18.425(1) angstrom, c = 19.251(2) angstrom. The structure was solved by direct methods and refined to R = 0.082. The asymmetric unit contains two non-equivalent molecules. Extensive hydrogen bonding is present. The conformations of the rings are A: a distorted half-chair, B: a perfect half-chair, C: a chair, D: an envelope/half-chair and E: a twist boat. The ring junctions A/B, B/C and C/D are all trans-fused. The methyl carbons C(18), C(19), C(27) and the lactone moiety are beta-oriented, whereas the methyl carbons C(21) and C(28) are alpha-oriented.