931 results for Optical router, Click, Optical networks, linux
Abstract:
In a previous paper, we described the rapid, selective, reversible, and near-quantitative room-temperature Cu-activated nitroxide radical coupling (NRC) technique to prepare 3-arm polystyrene stars. In this work, we evaluated the Cu-activation mechanism, either conventional atom transfer or single electron transfer (SET), through kinetic simulations. The simulation data showed that the system can be described by either activation mechanism. We also found through simulations that bimolecular radical termination, regardless of activation mechanism, was extremely low and could be considered negligible in an NRC reaction. Experiments were carried out to form 2- and 3-arm PSTY stars using two ligands, PMDETA and Me6TREN, in a range of solvent conditions obtained by varying the ratio of DMSO to toluene, and over a wide temperature range. The rate of 2- or 3-arm star formation was governed by the choice of solvent and ligand. The combination of Me6TREN and toluene/DMSO showed a relatively temperature-independent rate and, remarkably, reached near-quantitative yields for 2-arm star formation after only 1 min at 25 °C.
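As a rough illustration of the kind of kinetic simulation described above, the following Python sketch integrates a toy activation/deactivation/coupling/termination scheme. The species, steps, rate constants and concentrations are illustrative assumptions for the sketch, not the model or parameters used in the paper.

```python
# Toy kinetic scheme, for illustration only (rate constants are arbitrary and
# the species/steps are a simplification of the mechanisms discussed above):
#   P-Br + Cu(I)  -> P*   + Cu(II)   (activation, k_act)
#   P*   + Cu(II) -> P-Br + Cu(I)    (deactivation, k_deact)
#   P*   + T*     -> P-T             (nitroxide radical coupling, k_c)
#   P*   + P*     -> dead chains     (bimolecular termination, k_t)
from scipy.integrate import solve_ivp

k_act, k_deact, k_c, k_t = 1e0, 1e3, 1e8, 1e8   # bimolecular constants, L/(mol s)

def rhs(t, y):
    PBr, CuI, P, CuII, T, PT, D = y
    act = k_act * PBr * CuI
    deact = k_deact * P * CuII
    coup = k_c * P * T
    term = k_t * P * P
    return [-act + deact,                 # P-Br
            -act + deact,                 # Cu(I)
            act - deact - coup - 2*term,  # P* radicals
            act - deact,                  # Cu(II)
            -coup,                        # nitroxide T*
            coup,                         # coupled product P-T
            term]                         # termination events (2 radicals each)

y0 = [5e-3, 5e-3, 0.0, 0.0, 5e-3, 0.0, 0.0]       # mol/L, illustrative
sol = solve_ivp(rhs, (0, 60), y0, method="LSODA", rtol=1e-8, atol=1e-12)
print("coupled:", sol.y[5, -1], "termination events:", sol.y[6, -1])
```

With the fast coupling rate constant dominating, the termination channel stays tiny, which is the qualitative point the simulations in the abstract make.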
Abstract:
The single electron transfer-nitroxide radical coupling (SET-NRC) reaction has been used to produce multiblock polymers with high molecular weights in under 3 min at 50 °C by coupling a difunctional telechelic polystyrene (Br-PSTY-Br) with a dinitroxide. The well-known combination of dimethyl sulfoxide as solvent and Me6TREN as ligand facilitated the in situ disproportionation of Cu(I)Br to the highly active nascent Cu(0) species. This SET reaction allowed polymeric radicals to be rapidly formed from their corresponding halide end-groups. Trapping of these carbon-centred radicals at close to diffusion-controlled rates by dinitroxides resulted in high-molecular-weight multiblock polymers. Our results showed that the disproportionation of Cu(I) was critical in obtaining these ultrafast reactions and confirmed that activation occurred primarily through Cu(0). We took advantage of the reversibility of the NRC reaction at elevated temperatures to decouple the multiblock back to the original PSTY building block by capping the chain-ends with mono-functional nitroxides. These alkoxyamine end-groups were further exchanged with an alkyne mono-functional nitroxide (TEMPO–≡) and 'clicked' by a Cu(I)-catalyzed azide/alkyne cycloaddition (CuAAC) reaction with N3–PSTY–N3 to reform the multiblocks. This final 'click' reaction, even after the consecutive decoupling and nitroxide-exchange reactions, still produced high-molecular-weight multiblocks efficiently. These SET-NRC reactions would have ideal applications in re-usable plastics and possibly as self-healing materials.
Abstract:
High activation of polystyrene with bromine end groups (PSTY-Br) to their incipient radicals occurred in the presence of Cu(I)Br, Me6TREN, and DMSO solvent. These radicals were then trapped by nitroxide species leading to coupling reactions between PSTY-Br and nitroxides that were ultrafast and selective in the presence of a diverse range of functional groups. The nitroxide radical coupling (NRC) reactions have the attributes of a “click” reaction with near quantitative yields of product formed, but through the reversibility of this reaction, it has the added advantage of permitting the exchange of chemical functionality on macromolecules. Conditions were chosen to facilitate the disproportionation of Cu(I)Br to the highly activating nascent Cu(0) and deactivating Cu(II)Br2 in the presence of DMSO solvent and Me6TREN ligand. NRC at room temperature gave near quantitative yields of macromolecular coupling of low molecular weight polystyrene with bromine chain-ends (PSTY-Br) and nitroxides in under 7 min even in the presence of functional groups (e.g., −≡, −OH, −COOH, −NH2, =O). Utilization of the reversibility of the NRC reaction at elevated temperatures allowed the exchange of chain-end groups with a variety of functional nitroxide derivatives. The robustness and orthogonality of this NRC reaction were further demonstrated using the Cu-catalyzed azide/alkyne “click” (CuAAC) reactions, in which yields greater than 95% were observed for coupling between PSTY-N3 and a PSTY chain first trapped with an alkyne functional TEMPO (PSTY-TEMPO-≡).
Abstract:
Road transport plays a significant role in various industries and mobility services around the globe and has a vital impact on our daily lives. However, it also has serious impacts on both public health and the environment. In-vehicle feedback systems are a relatively new approach to encouraging driver behaviour change for improving fuel efficiency and safety in automotive environments. While many studies claim that the adoption of eco-driving practices, such as eco-driving training programs and in-vehicle feedback to drivers, has the potential to improve fuel efficiency, limited research has integrated safety and eco-driving. Therefore, this research seeks to use human-factors theories and practices to inform the design and evaluation of an in-vehicle Human Machine Interface (HMI) providing real-time driver feedback with the aim of improving both fuel efficiency and safety.
Abstract:
A channel router is an important design aid in the design automation of VLSI circuit layout. Many algorithms have been developed based on various wiring models with routing done on two layers. With the recent advances in VLSI process technology, it is possible to have three independent layers for interconnection. In this paper two algorithms are presented for three-layer channel routing. The first assumes a very simple wiring model. This enables the routing problem to be solved optimally in O(n log n) time. The second algorithm is for a different wiring model and has an upper bound of O(n²) on its execution time. It uses fewer horizontal tracks than the first algorithm. For the second model the channel width is not bounded by the channel density.
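For readers unfamiliar with channel routing, the sketch below shows a classic left-edge-style assignment of horizontal net segments to tracks. It is only an illustration of the underlying interval-packing problem that channel routers solve; it is not either of the two three-layer algorithms presented in the paper.

```python
# Illustrative sketch only: left-edge style track assignment for horizontal
# net segments in a channel (no vertical constraints, single routing layer).
from typing import List, Tuple

def left_edge_tracks(nets: List[Tuple[int, int]]) -> List[List[Tuple[int, int]]]:
    """Greedily pack intervals (left, right) into tracks without overlap."""
    tracks: List[List[Tuple[int, int]]] = []
    for left, right in sorted(nets):            # process nets by left edge
        for track in tracks:
            if track[-1][1] < left:             # fits after the last segment
                track.append((left, right))
                break
        else:
            tracks.append([(left, right)])      # open a new horizontal track
    return tracks

# Four nets with pairwise overlaps of density 2 -> packed into 2 tracks.
print(left_edge_tracks([(1, 4), (2, 6), (5, 9), (7, 10)]))
```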
Abstract:
NeEstimator v2 is a completely revised and updated implementation of software that produces estimates of contemporary effective population size, using several different methods and a single input file. NeEstimator v2 includes three single-sample estimators (updated versions of the linkage disequilibrium and heterozygote-excess methods, and a new method based on molecular coancestry), as well as the two-sample (moment-based temporal) method. New features include the following: (i) an improved method for accounting for missing data; (ii) options for screening out rare alleles; (iii) confidence intervals for all methods; (iv) the ability to analyse data sets with large numbers of genetic markers (10000 or more); (v) options for batch processing large numbers of different data sets, which will facilitate cross-method comparisons using simulated data; and (vi) correction for temporal estimates when individuals sampled are not removed from the population (Plan I sampling). The user is given considerable control over the input data and over the composition and format of output files. The freely available software has a new Java interface and runs under MacOS, Linux and Windows.
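As a simplified illustration of the linkage disequilibrium (LD) approach behind one of the single-sample estimators, the sketch below inverts the approximate relation E[r²] ≈ 1/(3·Ne) + 1/S for unlinked loci under random mating. NeEstimator v2 applies bias corrections, rare-allele screening and confidence intervals that are not reproduced here; the function and its inputs are assumptions for illustration.

```python
# Minimal sketch (assumed simplification): a naive LD-based point estimate of
# effective population size. mean_r2 is the mean squared correlation of allele
# frequencies across pairs of unlinked loci; sample_size is the number of
# individuals sampled. NeEstimator's bias-corrected estimators are not shown.

def naive_ld_ne(mean_r2: float, sample_size: int) -> float:
    """Crude Ne estimate from mean squared LD across locus pairs."""
    adjusted = mean_r2 - 1.0 / sample_size   # remove the sampling contribution
    if adjusted <= 0:
        return float("inf")                  # observed LD explained by sampling alone
    return 1.0 / (3.0 * adjusted)

print(naive_ld_ne(mean_r2=0.012, sample_size=100))  # ~167 in this toy example
```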
Abstract:
Event-based systems are seen as good candidates for supporting distributed applications in dynamic and ubiquitous environments because they support decoupled and asynchronous many-to-many information dissemination. Event systems are widely used, because asynchronous messaging provides a flexible alternative to RPC (Remote Procedure Call). They are typically implemented using an overlay network of routers. A content-based router forwards event messages based on filters that are installed by subscribers and other routers. The filters are organized into a routing table in order to forward incoming events to proper subscribers and neighbouring routers. This thesis addresses the optimization of content-based routing tables organized using the covering relation and presents novel data structures and configurations for improving local and distributed operation. Data structures are needed for organizing filters into a routing table that supports efficient matching and runtime operation. We present novel results on dynamic filter merging and the integration of filter merging with content-based routing tables. In addition, the thesis examines the cost of client mobility using different protocols and routing topologies. We also present a new matching technique called temporal subspace matching. The technique combines two new features. The first feature, temporal operation, supports notifications, or content profiles, that persist in time. The second feature, subspace matching, allows more expressive semantics, because notifications may contain intervals and be defined as subspaces of the content space. We also present an application of temporal subspace matching pertaining to metadata-based continuous collection and object tracking.
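A minimal sketch of the covering relation on which such routing tables are organized, assuming filters are conjunctions of numeric range constraints (an assumption made for illustration; the thesis treats a more general content space): filter A covers filter B if every notification matching B also matches A, so a covered filter need not be installed or forwarded separately.

```python
# Minimal sketch of a covering-based routing table over range filters.
from typing import Dict, List, Tuple

Filter = Dict[str, Tuple[float, float]]   # attribute -> (low, high) range

def covers(a: Filter, b: Filter) -> bool:
    """True if every notification matching b also matches a."""
    return all(attr in b
               and a[attr][0] <= b[attr][0]
               and b[attr][1] <= a[attr][1]
               for attr in a)

class CoveringTable:
    """Keeps only filters that are not covered by an existing entry."""
    def __init__(self) -> None:
        self.filters: List[Filter] = []

    def add(self, f: Filter) -> bool:
        if any(covers(g, f) for g in self.filters):
            return False                      # covered: no need to forward
        # a new, more general filter makes some existing entries redundant
        self.filters = [g for g in self.filters if not covers(f, g)]
        self.filters.append(f)
        return True

table = CoveringTable()
print(table.add({"price": (0.0, 100.0)}))     # True: installed and forwarded
print(table.add({"price": (10.0, 50.0)}))     # False: covered by the first filter
```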
Abstract:
This manual consists of written descriptions of jungle perch (Kuhlia rupestris) production and video material demonstrating each of the key production steps. Video links are at the end of each major written section in the document; to activate a link, use Ctrl+click. The videos enhance the instructive value of this manual. The keys to producing jungle perch are:
- maintaining broodstock in freshwater or low-salinity water (less than 5 ppt)
- spawning fish in full seawater at 28 °C
- incubating eggs in full seawater; salinities must not be less than 32 ppt
- ensuring that first-feed jungle perch larvae have an adequate supply of copepod nauplii
- rearing larvae in full seawater under bright light
- using gentle aeration in tanks
- postponing spawns until adequate densities of copepod nauplii are present in ponds
- sustaining copepod blooms in ponds for at least 20 days
- avoiding use of paddlewheels in ponds
- supplementary feeding with Artemia salina and weaning diets from 20 days after hatch
- harvesting fingerlings or fry once they are 25-30 mm in length (50 to 60 days post hatch)
- covering tanks of fingerlings with 5 mm mesh and submerging freshwater inlets to prevent jumping.
Abstract:
Human age is surrounded by an assumed set of rules and behaviors imposed by the local culture and the society people live in. This paper introduces software that measures a person's presence on the Internet and examines the activities he/she conducts online. The paper answers questions such as: how "old" are you on the Internet? How soon will a newbie be exposed to adult websites? How long will it take for a new Internet user to learn about social networking sites? And how many years does a user have to surf online to celebrate his/her first "birthday" of Internet presence? Findings from a database of 105 school and university students, containing every click of their first 24 hours of Internet usage, are presented. The findings provide valuable insights for Internet marketing, ethics, Internet business and the mapping of Internet life onto real life. Privacy and ethical issues related to the study are discussed at the end. © Springer Science+Business Media B.V. 2010.
An FETI-preconditioned conjugate gradient method for large-scale stochastic finite element problems
Abstract:
In the spectral stochastic finite element method for analyzing an uncertain system, the uncertainty is represented by a set of random variables, and a quantity of interest such as the system response is considered as a function of these random variables. Consequently, the underlying Galerkin projection yields a block system of deterministic equations where the blocks are sparse but coupled. The solution of this algebraic system of equations rapidly becomes challenging when the size of the physical system and/or the level of uncertainty is increased. This paper addresses this challenge by presenting a preconditioned conjugate gradient method for such block systems, where the preconditioning step is based on the dual-primal finite element tearing and interconnecting (FETI-DP) method equipped with a Krylov subspace reuse technique for accelerating the iterative solution of systems with multiple and repeated right-hand sides. Preliminary performance results on a Linux cluster suggest that the proposed solution method is numerically scalable and demonstrate its potential for making the uncertainty quantification of realistic systems tractable.
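The outer iteration is a standard preconditioned conjugate gradient loop. The sketch below shows that loop with a placeholder preconditioner (apply_Minv, an assumption for illustration); the FETI-DP preconditioning step and the Krylov-subspace reuse across right-hand sides contributed by the paper are not reproduced.

```python
# Minimal preconditioned conjugate gradient sketch for a symmetric positive
# definite operator A. apply_Minv is a stand-in preconditioner application.
import numpy as np

def pcg(A, b, apply_Minv, tol=1e-8, maxit=500):
    x = np.zeros_like(b)
    r = b - A @ x
    z = apply_Minv(r)
    p = z.copy()
    rz = r @ z
    for _ in range(maxit):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            break
        z = apply_Minv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Tiny usage example with a Jacobi (diagonal) preconditioner as a stand-in.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = pcg(A, b, lambda r: r / np.diag(A))
print(x, A @ x - b)   # solution and residual
```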
Development and characterization of lysine-based tripeptide analogues as inhibitors of Sir2 activity
Abstract:
Sirtuins are NAD(+)-dependent deacetylases that modulate various essential cellular functions. Development of peptide-based inhibitors of Sir2s would prove useful both as pharmaceutical agents and as effectors by which downstream cellular alterations can be monitored. Click chemistry that utilizes Huisgen's 1,3-dipolar cycloaddition permits attachment of novel modifications onto the side chain of lysine. Herein, we report the synthesis of peptide analogues prepared using click reactions on Nε-propargyloxycarbonyl-protected lysine residues and their characterization as inhibitors of Plasmodium falciparum Sir2 activity. The peptide-based inhibitors exhibited parabolic competitive inhibition with respect to the acetylated-peptide substrate and parabolic non-competitive inhibition with respect to NAD(+), supporting the formation of EI2 and E.NAD(+).I2 complexes. Cross-competition inhibition analysis with the non-competitive inhibitor nicotinamide (NAM) ruled out the possibility of the NAM-binding site being the second inhibitor-binding site, suggesting the presence of a unique alternative pocket accommodating the inhibitor. One of these compounds was also found to be a potent inhibitor of the intraerythrocytic growth of P. falciparum, with a 50% inhibitory concentration in the micromolar range.
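"Parabolic competitive" inhibition conventionally refers to a rate law whose inhibitor dependence is quadratic, which is why the observed pattern supports binding of two inhibitor molecules (EI and EI2). The generic textbook form below is shown only to illustrate the terminology; it is not taken from the paper.

```latex
% Generic parabolic competitive inhibition rate law (two inhibitor molecules
% binding to the free enzyme: E + I <-> EI, EI + I <-> EI_2). The quadratic
% [I] term makes the slope replot versus [I] parabolic.
v \;=\; \frac{V_{\max}\,[S]}
             {K_m\left(1 + \frac{[I]}{K_{i1}} + \frac{[I]^2}{K_{i1}K_{i2}}\right) + [S]}
```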
Abstract:
The role of lectins in mediating cancer metastasis, apoptosis as well as various other signaling events has been well established in the past few years. Data on various aspects of the role of lectins in cancer is being accumulated at a rapid pace. The data on lectins available in the literature is so diverse that it becomes difficult and time-consuming, if not impossible, to comprehend the advances in various areas and obtain the maximum benefit. Not only do the lectins vary significantly in their individual functional roles, but they are also diverse in their sequences, structures, binding site architectures, quaternary structures, carbohydrate affinities and specificities as well as their potential applications. An organization of these seemingly independent data into a common framework is essential in order to achieve effective use of all the data towards understanding the roles of different lectins in different aspects of cancer and any resulting applications. An integrated knowledge base (CancerLectinDB) together with appropriate analytical tools has therefore been developed for lectins relevant for any aspect of cancer, by collating and integrating diverse data. This database is unique in terms of providing sequence, structural, and functional annotations for lectins from all known sources in cancer and is expected to be a useful addition to the number of glycan-related resources now available to the community. The database has been implemented using MySQL on a Linux platform and web-enabled using Perl-CGI and Java tools. Data for individual lectins pertain to taxonomic, biochemical, domain architecture, molecular sequence and structural details as well as carbohydrate specificities. Extensive links have also been provided for relevant bioinformatics resources and analytical tools. Availability of diverse data integrated into a common framework is expected to be of high value for various studies on lectin cancer biology.
Abstract:
A key trait of Free and Open Source Software (FOSS) development is its distributed nature. Nevertheless, two project-level operations, the fork and the merge of program code, are among the least well understood events in the lifespan of a FOSS project. Some projects have explicitly adopted these operations as the primary means of concurrent development. In this study, we examine the effect of highly distributed software development, as found in the Linux kernel project, on the collection and modelling of software development data. We find that distributed development calls for sophisticated temporal modelling techniques where several versions of the source code tree can exist at once. Attention must be turned towards the methods of quality assurance and peer review that projects employ to manage these parallel source trees. Our analysis indicates that two new metrics, fork rate and merge rate, could be useful for determining the role of distributed version control systems in FOSS projects. The study presents a preliminary data set consisting of version control and mailing list data.
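The sketch below shows one possible way to measure a "merge rate" from version control data, counting merge commits per month in a git repository. The abstract does not give the paper's exact definitions of fork rate and merge rate; fork events in particular usually have to be reconstructed from clones, branches or mailing-list data rather than from a single repository's history, so this is only an illustrative assumption.

```python
# Illustrative sketch: merge commits per month from a local git repository.
import subprocess
from collections import Counter
from datetime import datetime, timezone

def merge_rate_per_month(repo_path: str) -> Counter:
    # %ct prints the committer timestamp; --merges restricts to merge commits.
    out = subprocess.run(
        ["git", "-C", repo_path, "log", "--merges", "--pretty=format:%ct"],
        capture_output=True, text=True, check=True,
    ).stdout.split()
    months = Counter()
    for ts in out:
        d = datetime.fromtimestamp(int(ts), tz=timezone.utc)
        months[f"{d.year}-{d.month:02d}"] += 1
    return months

# print(merge_rate_per_month("/path/to/linux"))  # hypothetical repository path
```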
Abstract:
A large part of today's multi-core chips is interconnect. Increasing communication complexity has made new interconnect strategies, such as Network on Chip, essential. Power dissipation in interconnects has become a substantial part of the total power dissipation, so techniques to reduce interconnect power have become a necessity. In this paper, we present a design methodology that, in a Network-on-Chip scenario, gives values of bus width for interconnect links and frequency of operation for routers that satisfy the required throughput and dissipate minimal switching power. We develop closed-form analytical expressions for the power dissipation, with bus width and frequency as variables, and then use the Lagrange multiplier method to arrive at the optimal values. We present a 4-port router in a 90 nm technology library as a case study. The results obtained from the analysis are discussed.
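As an illustration of the Lagrange-multiplier step, the sympy sketch below optimizes a hypothetical link power model P(w, f) under a throughput constraint w·f = T. The power expression and its constants are assumptions made for this sketch, not the closed-form expressions derived in the paper.

```python
# Hypothetical power model (illustrative only, NOT the paper's expressions):
#   P(w, f) = k1*w*f + k2*w + k3*f, subject to throughput constraint w*f = T,
# where w is the link bus width and f the router operating frequency.
import sympy as sp

w, f, lam = sp.symbols('w f lam', positive=True)
k1, k2, k3, T = sp.symbols('k1 k2 k3 T', positive=True)

P = k1*w*f + k2*w + k3*f              # assumed switching-power expression
L = P - lam*(w*f - T)                 # Lagrangian with the throughput constraint

stationary = [sp.diff(L, v) for v in (w, f, lam)]
sol = sp.solve(stationary, (w, f, lam), dict=True)
print(sol)   # expect w = sqrt(k3*T/k2), f = sqrt(k2*T/k3) for this toy model
```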