932 results for heterogeneous polymerization
Abstract:
Polymerization of methyl methacrylate in the presence of a mixed-ligand complex, [N,N-ethylenebis(salicylideneiminato)](acetylacetonato)cobalt(III), in benzene was studied. The rate of polymerization was proportional to the square root of the concentration of the chelate, and the monomer exponent was 1.67 at 60°C and 1.69 at 70°C. The activation energy and the kinetic and transfer constants were evaluated. A free-radical mechanism has been proposed.
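The measured dependencies can be collected into an overall rate law of the usual form (a sketch based only on the exponents quoted above; $k$ is an overall constant lumping initiation, propagation, and termination):

```latex
R_p \;=\; k\,[\text{chelate}]^{1/2}\,[\text{MMA}]^{\alpha},
\qquad \alpha \approx 1.67\ (60^{\circ}\mathrm{C}),\ \ 1.69\ (70^{\circ}\mathrm{C})
```

The half-order dependence on the initiating chelate is the classical signature of radicals terminating bimolecularly, consistent with the free-radical mechanism proposed.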
Abstract:
XML documents are becoming more and more common in various environments. In particular, enterprise-scale document management is commonly centred around XML, and desktop applications as well as online document collections are soon to follow. The growing number of XML documents increases the importance of appropriate indexing methods and search tools in keeping the information accessible. Therefore, we focus on content that is stored in XML format as we develop such indexing methods. Because XML is used for different kinds of content, ranging all the way from records of data fields to narrative full texts, Information Retrieval methods face a new challenge in identifying which content is subject to data queries and which should be indexed for full-text search. In response to this challenge, we analyse the relation of character content and XML tags in XML documents in order to separate full text from data. As a result, by selecting the XML fragments to be indexed, we are able both to reduce the size of the index by 5-6% and to improve retrieval precision. Besides being challenging, XML offers many unexplored opportunities that have received little attention in the literature. For example, authors often tag the content they want to emphasise by using a typeface that stands out. Such tagged content constitutes phrases that are descriptive of the content and useful for full-text search. These phrases are simple to detect in XML documents, but easy to confuse with other inline-level text. Nonetheless, search results improve when the detected phrases are given additional weight in the index. Similar improvements are reported when related content, including titles, captions, and references, is associated with the indexed full text. Experimental results show that, at least for certain types of document collections, the proposed methods help us find the relevant answers. Even when we know nothing about the document structure beyond the XML syntax, we can take advantage of the XML structure when the content is indexed for full-text search.
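The emphasis-detection idea can be illustrated with a short sketch (the tag names are hypothetical; real collections may mark emphasis differently, and the thesis's own detection is more elaborate):

```python
import xml.etree.ElementTree as ET

# Hypothetical inline emphasis tags; actual collections may use other names.
EMPHASIS_TAGS = {"b", "i", "em", "strong"}

def emphasized_phrases(xml_text):
    """Collect the character content of inline emphasis elements, which
    make natural candidates for extra weight in a full-text index."""
    root = ET.fromstring(xml_text)
    phrases = []
    for elem in root.iter():
        if elem.tag in EMPHASIS_TAGS and elem.text and elem.text.strip():
            phrases.append(elem.text.strip())
    return phrases

doc = ("<article><p>XML indexing is <em>not a solved problem</em>; "
       "we weight <b>descriptive phrases</b> higher.</p></article>")
print(emphasized_phrases(doc))  # ['not a solved problem', 'descriptive phrases']
```

The detected phrases would then be indexed with a boost, alongside the ordinary full-text terms.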
Abstract:
Wireless technologies are continuously evolving. Second-generation cellular networks have gained worldwide acceptance. Wireless LANs are commonly deployed in corporations and on university campuses, and their diffusion in public hotspots is growing. Third-generation cellular systems have yet to establish themselves everywhere; still, there is an impressive amount of ongoing research on deploying beyond-3G systems. These new wireless technologies combine the characteristics of WLAN-based and cellular networks to provide increased bandwidth. The common direction in which all the efforts in wireless technologies are headed is IP-based communication. Telephony services have been the killer application for cellular systems, and their evolution to packet-switched networks is a natural path. Effective IP telephony signaling protocols, such as the Session Initiation Protocol (SIP) and the H.323 protocol, are needed to establish IP-based telephony sessions. However, IP telephony is just one example of a service built on IP-based communication. IP-based multimedia sessions are expected to become popular and to offer a wider range of communication capabilities than pure telephony. In order to conjoin the advances of future wireless technologies with the potential of IP-based multimedia communication, the next step is to obtain ubiquitous communication capabilities. According to this vision, people must be able to communicate even when no support from an infrastructure network is available, needed, or desired. In order to achieve ubiquitous communication, end devices must integrate all the capabilities necessary for IP-based distributed and decentralized communication. Such capabilities are currently missing; for example, it is not possible to use native IP telephony signaling protocols in a totally decentralized way. This dissertation presents a solution for deploying the SIP protocol in a decentralized fashion without the support of infrastructure servers. The proposed solution is mainly designed to fit the needs of decentralized mobile environments, and can be applied to small-scale ad-hoc networks as well as larger networks with hundreds of nodes. A framework allowing discovery of SIP users in ad-hoc networks and the establishment of SIP sessions among them, in a fully distributed and secure way, is described and evaluated. Security support allows ad-hoc users to authenticate the sender of a message and to verify the integrity of a received message. The distributed session management framework has been extended in order to achieve interoperability with the Internet and with native Internet applications. With limited extensions to the SIP protocol, we have designed and experimentally validated a SIP gateway that allows SIP signaling between ad-hoc networks with private addressing space and native SIP applications in the Internet. The design is completed by an application-level relay that permits instant messaging sessions to be established in heterogeneous environments. The resulting framework constitutes a flexible and effective approach for the pervasive deployment of real-time applications.
Abstract:
The TCP protocol is used by most Internet applications today, including the recent mobile wireless terminals that use TCP for their World-Wide Web, e-mail, and other traffic. Recent wireless network technologies, such as GPRS, are known to cause delay spikes in packet transfer, which trigger unnecessary TCP retransmission timeouts. This dissertation proposes a mechanism, Forward RTO-Recovery (F-RTO), for detecting unnecessary TCP retransmission timeouts, thus allowing TCP to take appropriate follow-up actions. We analyze a Linux F-RTO implementation in various network scenarios and investigate different alternatives to the basic algorithm. The second part of this dissertation focuses on quickly adapting TCP's transmission rate when the underlying link characteristics change suddenly, for example due to vertical hand-offs between GPRS and WLAN wireless technologies. We investigate the Quick-Start algorithm, which, in collaboration with the network routers, aims to quickly probe the available bandwidth on a network path and to let TCP's congestion control algorithms use that information. Through extensive simulations we study the different router algorithms and parameters for Quick-Start, and discuss the challenges Quick-Start faces in the current Internet. We also study the performance of Quick-Start when applied to vertical hand-offs between different wireless link technologies.
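The core F-RTO decision can be sketched as a tiny state check on the first two ACKs observed after the timeout retransmission (a heavily simplified illustration in the spirit of the F-RTO idea, not the Linux implementation analyzed in the dissertation):

```python
def frto_judge(acks_after_rto):
    """Judge an RTO as spurious or genuine from the first two ACKs seen
    after the timeout retransmission. Each ACK is ('new', seq) if it
    advances the window, or ('dup', seq) if it is a duplicate.
    Returns 'spurious', 'genuine', or 'undetermined'."""
    if len(acks_after_rto) < 2:
        return "undetermined"
    first, second = acks_after_rto[0], acks_after_rto[1]
    # A duplicate ACK right after the RTO retransmission indicates the
    # original segment really was lost: conventional recovery is needed.
    if first[0] == "dup":
        return "genuine"
    # The first ACK advanced the window; F-RTO now transmits *new* data
    # instead of retransmitting. If the second ACK also advances the
    # window, both old and new data are getting through, so the
    # timeout was spurious and slow-start retransmissions are avoided.
    if second[0] == "new":
        return "spurious"
    return "genuine"

print(frto_judge([("new", 1000), ("new", 2000)]))  # spurious
print(frto_judge([("new", 1000), ("dup", 1000)]))  # genuine
```

A real implementation must also handle SACK, window-limited senders, and the corner cases that motivate the algorithm alternatives studied in the dissertation.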
Abstract:
Wireless access is expected to play a crucial role in the future of the Internet. The demands of the wireless environment are not always compatible with assumptions that were made in the era of wired links. At the same time, new services that take advantage of advances in many areas of technology are being invented. These services include delivery of mass media such as television and radio, Internet phone calls, and video conferencing. The network must be able to deliver these services to the end user with acceptable performance and quality. This thesis presents an experimental study measuring the performance of bulk-data TCP transfers, streaming audio flows, and HTTP transfers that compete for the limited bandwidth of a GPRS/UMTS-like wireless link. The wireless link characteristics are modeled with a wireless network emulator. We analyze how different competing workload types behave with regular TCP, and how active queue management, Differentiated Services (DiffServ), and a combination of TCP enhancements affect performance and quality of service. We test on four link types, including an error-free link and links with different Automatic Repeat reQuest (ARQ) persistency. The analysis consists of comparing the resulting performance in different configurations based on defined metrics. We observed that DiffServ and Random Early Detection (RED) with Explicit Congestion Notification (ECN) are useful, and in some conditions necessary, for quality of service and fairness, because without DiffServ and RED a long queuing delay and congestion-related packet losses cause problems. However, we observed situations where there is still room for significant improvements if the link level is aware of the quality of service. Only a very error-prone link diminishes the benefits to nil. The combination of TCP enhancements, comprising an initial window of four segments, Control Block Interdependence (CBI), and Forward RTO-Recovery (F-RTO), improves performance. The initial window of four helps a later-starting TCP flow to start faster, but generates congestion under some conditions. CBI prevents slow-start overshoot and balances slow start in the presence of error drops, and F-RTO successfully reduces unnecessary retransmissions.
Abstract:
A model for heterogeneous acetalisation of poly(vinyl alcohol) with limited solution volume is proposed based on the grain model of Sohn and Szekely. Instead of treating the heterogeneous acetalisation as purely a diffusion process, as in the Matuzawa and Ogasawara model, the present model also takes into account the chemical reaction and the physical state of the solid polymer, such as degree of swelling and porosity, and assumes segregation of the polymer phase at higher conversion into an outer fully reacted zone and an inner zone where the reaction still proceeds. The solution of the model for limited solution volume, moreover, offers a simple method of determining the kinetic parameters and diffusivity for the solid-liquid system using the easily measurable bulk solution concentration of the liquid reactant instead of conversion-distance data for the solid phase, which are considerably more difficult to obtain.
Abstract:
Stochastic volatility models are of fundamental importance to the pricing of derivatives. One of the most commonly used models of stochastic volatility is the Heston model, in which the price and volatility of an asset evolve as a pair of coupled stochastic differential equations. The computation of asset prices and volatilities involves the simulation of many sample trajectories with conditioning. The problem is treated using the method of particle filtering. While the simulation of a shower of particles is computationally expensive, each particle behaves independently, making such simulations ideal for massively parallel heterogeneous computing platforms. In this paper, we present our portable OpenCL implementation of the Heston model and discuss its performance and efficiency characteristics on a range of architectures including Intel CPUs, NVIDIA GPUs, and Intel Many Integrated Core (MIC) accelerators.
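The embarrassingly parallel structure of the trajectory simulation is easy to see in a plain NumPy sketch of the Heston SDEs (a standard Euler-Maruyama discretization with full truncation, shown for illustration; it is not the paper's OpenCL kernel, and the parameter values below are arbitrary):

```python
import numpy as np

def heston_paths(s0, v0, kappa, theta, xi, rho, r, T, steps, n_paths, seed=0):
    """Simulate terminal asset prices under the Heston model:
        dS = r*S dt + sqrt(v)*S dW1
        dv = kappa*(theta - v) dt + xi*sqrt(v) dW2,  corr(dW1, dW2) = rho
    Every path is independent of the others, which is what makes the
    workload a natural fit for massively parallel accelerators."""
    rng = np.random.default_rng(seed)
    dt = T / steps
    s = np.full(n_paths, float(s0))
    v = np.full(n_paths, float(v0))
    for _ in range(steps):
        z1 = rng.standard_normal(n_paths)
        z2 = rho * z1 + np.sqrt(1.0 - rho**2) * rng.standard_normal(n_paths)
        vp = np.maximum(v, 0.0)                      # full truncation of v
        s *= np.exp((r - 0.5 * vp) * dt + np.sqrt(vp * dt) * z1)
        v += kappa * (theta - vp) * dt + xi * np.sqrt(vp * dt) * z2
    return s

prices = heston_paths(100.0, 0.04, 2.0, 0.04, 0.3, -0.7,
                      0.05, 1.0, 252, 10_000)
```

In an OpenCL version, the inner time loop runs per work-item, with one work-item (or a small batch) per trajectory; the particle-filter conditioning adds a resampling step between time blocks.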
Abstract:
The behavior of the chelate, ferric dipivaloylmethide, Fe(DPM)3, in vinyl polymerization systems was investigated. The polymerization was found to be of free-radical nature. The rate of polymerization was proportional to the square root of the concentration of the chelate. The monomer exponent was close to 1.5 for the Fe(DPM)3-initiated polymerization of styrene and methyl methacrylate. The kinetic and transfer constants and activation energies for these systems have been evaluated. Spectral studies revealed the possibility of a complex formation between the chelate and the monomer. A kinetic scheme for the Fe(DPM)3-initiated polymerization is derived based on this initial complex formation.
Abstract:
The nature of the interaction between the unsaturated monomer and the chelate, Fe(DPM)3, is studied in detail. The interaction is found to occur only in solution. The stoichiometry of interaction and the equilibrium constant are evaluated. With the help of spectral evidence, attempts are made to point out the specific sites of interaction.
Abstract:
We examined whether C-terminal residues of soluble recombinant FtsZ of Mycobacterium tuberculosis (MtFtsZ) have any role in MtFtsZ polymerization in vitro. MtFtsZ-delta C1, which lacks the extreme C-terminal Arg residue of the C-terminal stretch of 13 residues (DDDDVDVPPFMRR) but retains the penultimate Arg residue (DDDDVDVPPFMR), polymerizes like full-length MtFtsZ in vitro. However, MtFtsZ-delta C2, which lacks both Arg residues at the C-terminus (DDDDVDVPPFM), neither polymerizes at pH 6.5 nor forms even single- or double-stranded filaments at pH 7.7 in the presence of 10 mM CaCl2. Replacement of the extreme C-terminal Arg residue of the deletion mutant DDDDVDVPPFMR with Lys, His, Ala, or Asp (DDDDVDVPPFMK/H/A/D) did not enable polymerization either. Although MtFtsZ-delta C2 showed secondary and tertiary structural changes, which might have affected polymerization, the GTPase activity of MtFtsZ-delta C2 was comparable to that of MtFtsZ. These data suggest that MtFtsZ requires an Arg residue as the extreme C-terminal residue for polymerization in vitro. The polypeptide segment containing the C-terminal 67 residues, whose coordinates were absent from the MtFtsZ crystal structure, was modeled on tubulin and MtFtsZ dimers. Possibilities for the influence of the C-terminal Arg residues on the stability of the dimer, and thereby on MtFtsZ polymerization, are discussed.
Abstract:
Solving large-scale all-to-all comparison problems using distributed computing is increasingly significant for various applications. Previous efforts to implement distributed all-to-all comparison frameworks have treated the two phases of data distribution and comparison task scheduling separately. This leads to high storage demands as well as poor data locality for the comparison tasks, thus creating a need to redistribute the data at runtime. Furthermore, most previous methods have been developed for homogeneous computing environments, so their overall performance is degraded even further when they are used in heterogeneous distributed systems. To tackle these challenges, this paper presents a data-aware task scheduling approach for solving all-to-all comparison problems in heterogeneous distributed systems. The approach formulates the requirements for data distribution and comparison task scheduling simultaneously as a constrained optimization problem. Then, metaheuristic data pre-scheduling and dynamic task scheduling strategies are developed, along with an algorithmic implementation, to solve the problem. The approach provides perfect data locality for all comparison tasks, avoiding rearrangement of data at runtime. It achieves load balancing among heterogeneous computing nodes, thus reducing the overall computation time. It also reduces data storage requirements across the network. The effectiveness of the approach is demonstrated through experimental studies.
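The coupling between task placement and data placement can be illustrated with a toy greedy scheduler (an illustrative sketch, not the paper's metaheuristic; node names and speeds are invented): assigning each pairwise comparison to the least-loaded node, normalized by node speed, simultaneously fixes which datasets that node must store.

```python
from itertools import combinations

def schedule_pairs(items, node_speeds):
    """Greedy data-aware sketch: give each pairwise comparison to the node
    with the smallest speed-normalized load, and record the datasets the
    node must therefore hold locally (so no data moves at runtime)."""
    load = {n: 0.0 for n in node_speeds}
    assignment = {n: [] for n in node_speeds}
    storage = {n: set() for n in node_speeds}
    for pair in combinations(items, 2):
        # normalized load = comparisons done so far / relative node speed
        node = min(load, key=lambda n: load[n] / node_speeds[n])
        assignment[node].append(pair)
        storage[node].update(pair)       # both inputs must be local
        load[node] += 1.0
    return assignment, storage

# A node twice as fast receives roughly twice the comparisons.
assignment, storage = schedule_pairs(list("ABCDE"),
                                     {"fast": 2.0, "slow": 1.0})
```

The constrained-optimization formulation in the paper additionally minimizes total storage across nodes, which a purely greedy pass like this does not attempt.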
Abstract:
With the increasing adoption of wireless technology, it is reasonable to expect an increase in the demand for supporting both real-time multimedia and high-rate reliable data services. Next-generation wireless systems employ an Orthogonal Frequency Division Multiplexing (OFDM) physical layer, owing to the high data rate transmissions that are possible without an increase in bandwidth. Towards improving the performance of these systems, we look at the design of resource allocation algorithms at the medium-access layer, and their impact on higher layers. While TCP-based elastic traffic needs reliable transport, UDP-based real-time applications have stringent delay and rate requirements. The MAC algorithms, while catering to the heterogeneous service needs of these higher layers, trade off between maximizing the system capacity and providing fairness among users. The novelty of this work is the proposal of various channel-aware resource allocation algorithms at the MAC layer, which can result in significant performance gains in an OFDM-based wireless system.
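The capacity/fairness trade-off made by channel-aware MAC schedulers can be illustrated with a one-slot proportional-fair allocation (a common textbook rule, shown here for illustration; it is not claimed to be the thesis's algorithm, and the rate matrices are invented):

```python
import numpy as np

def proportional_fair_allocate(rates, avg_thr):
    """Assign each OFDM subcarrier to the user with the highest ratio of
    instantaneous achievable rate to exponentially averaged throughput.
    Favouring good channels raises capacity; dividing by past throughput
    restores fairness among users.
    rates:   (n_users, n_subcarriers) achievable rates this slot
    avg_thr: (n_users,) average throughput each user has seen so far"""
    metric = rates / avg_thr[:, None]
    return np.argmax(metric, axis=0)   # winning user per subcarrier

rates = np.array([[4.0, 1.0, 3.0],
                  [2.0, 2.0, 3.0]])
avg = np.array([2.0, 1.0])
print(proportional_fair_allocate(rates, avg))  # [0 1 1]
```

User 1, despite worse instantaneous rates on some subcarriers, wins two of the three because its past throughput is lower; a pure max-rate scheduler would ignore that history.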
Abstract:
Randomness in the source condition, in addition to heterogeneity in the system parameters, can be a major source of uncertainty in the concentration field. Hence, a more general form of the problem formulation is necessary to consider randomness in both the source condition and the system parameters. When the source varies with time, the unsteady problem can be solved using the unit response function. In the case of random system parameters, the response function becomes a random function and depends on the randomness in the system parameters. In the present study, the source is modelled as a random discrete process with either a fixed interval or a random interval (the Poisson process). An attempt is made to assess the relative effects of various types of source uncertainty on the probabilistic behaviour of the concentration in a porous medium while the system parameters are also modelled as random fields. Analytical expressions for the mean and covariance of the concentration due to a random discrete source are derived in terms of the mean and covariance of the unit response function. The probabilistic behaviour of the random response function is obtained using a perturbation-based stochastic finite element method (SFEM), which performs well for mild heterogeneity. The proposed method is applied to both 1-D and 3-D solute transport problems. The results obtained with SFEM are compared with Monte Carlo simulation for the 1-D problems.
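The superposition structure behind such expressions can be sketched as follows (the notation is assumed here for illustration, not taken from the original): with unit response function $h$, source pulses $s_k$ released at times $t_k$, and the source statistically independent of the random system parameters,

```latex
c(t) = \sum_{k} h(t - t_k)\, s_k,
\qquad
\mathrm{E}[c(t)] = \sum_{k} \mathrm{E}[h(t - t_k)]\,\mathrm{E}[s_k],
```

```latex
\mathrm{Cov}[c(t_1), c(t_2)] = \sum_{k}\sum_{l}
\Big( \mathrm{E}[h(t_1 - t_k)\,h(t_2 - t_l)]\,\mathrm{E}[s_k s_l]
    - \mathrm{E}[h(t_1 - t_k)]\,\mathrm{E}[h(t_2 - t_l)]\,
      \mathrm{E}[s_k]\,\mathrm{E}[s_l] \Big),
```

so the statistics of the concentration follow once the mean and covariance of the response function (from the perturbation-based SFEM) and of the discrete source process are known.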
Abstract:
The conversion of a metastable phase into a thermodynamically stable phase takes place via the formation of clusters. Clusters of different sizes form spontaneously within the metastable mother phase, but only those larger than a certain size, called the critical size, end up growing into a new phase. There are two types of nucleation: homogeneous, where the clusters appear in a uniform phase, and heterogeneous, where pre-existing surfaces are available and clusters form on them. The nucleation of aerosol particles from gas-phase molecules involves not only inorganic compounds, but also nonvolatile organic substances found in the atmosphere. The question is which of the myriad organic species have the right properties and are able to participate in nucleation phenomena. This thesis discusses both homogeneous and heterogeneous nucleation, using as its theoretical tool the classical nucleation theory (CNT) based on thermodynamics. Different classes of organics are investigated. The members of the first class are four dicarboxylic acids (succinic, glutaric, malonic, and adipic). They can be found in both the gas and particulate phases, and are good candidates for aerosol formation due to their low vapor pressure and solubility. Their influence on the nucleation process has not been investigated at length in the literature and is not fully established. The accuracy of the CNT predictions for binary water-dicarboxylic acid systems depends significantly on good knowledge of the thermophysical properties of the organics and their aqueous solutions; a large part of the thesis is dedicated to this issue. We have shown that homogeneous and heterogeneous nucleation of succinic, glutaric, and malonic acids in combination with water is unlikely to happen under atmospheric conditions. However, it seems that adipic acid could participate in the nucleation process under conditions occurring in the upper troposphere. The second class of organics is represented by n-nonane and n-propanol. Their thermophysical properties are well established, and experiments on these substances have been performed. The experimental data on binary homogeneous and heterogeneous nucleation have been compared with the theoretical predictions. Although the n-nonane - n-propanol mixture is far from ideal, CNT behaves fairly well, especially when calculating the cluster composition. In the case of heterogeneous nucleation, it has been found that better characterization of the substrate-liquid interaction by means of line tension and microscopic contact angle leads to a significant improvement of the CNT prediction. Unfortunately, this cannot be achieved without well-defined experimental data.
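For reference, the central CNT quantities used in such comparisons are (standard single-component textbook forms, a sketch rather than the thesis's working binary equations):

```latex
J = K \exp\!\left(-\frac{\Delta G^{*}}{k_B T}\right),
\qquad
\Delta G^{*}_{\mathrm{hom}} = \frac{16\pi\,\sigma^{3} v_{l}^{2}}{3\,(k_B T \ln S)^{2}},
\qquad
\Delta G^{*}_{\mathrm{het}} = f(m, x)\,\Delta G^{*}_{\mathrm{hom}}
```

where $J$ is the nucleation rate, $K$ a kinetic prefactor, $\sigma$ the surface tension, $v_l$ the molecular volume in the liquid, $S$ the saturation ratio, and $f(m,x) \le 1$ a geometric factor for a pre-existing seed of relative size $x$ and contact parameter $m = \cos\theta$; the line-tension correction mentioned above enters through the microscopic contact angle in $m$.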