933 results for heterogeneous computation


Relevance:

20.00%

Publisher:

Abstract:

With the increasing adoption of wireless technology, it is reasonable to expect an increase in the demand for supporting both real-time multimedia and high-rate reliable data services. Next-generation wireless systems employ an Orthogonal Frequency Division Multiplexing (OFDM) physical layer owing to the high data rate transmissions that are possible without an increase in bandwidth. Towards improving the performance of these systems, we look at the design of resource allocation algorithms at the medium-access (MAC) layer and their impact on higher layers. While TCP-based elastic traffic needs reliable transport, UDP-based real-time applications have stringent delay and rate requirements. The MAC algorithms, while catering to the heterogeneous service needs of these higher layers, trade off between maximizing the system capacity and providing fairness among users. The novelty of this work is the proposal of various channel-aware resource allocation algorithms at the MAC layer, which can result in significant performance gains in an OFDM-based wireless system.
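As a hedged illustration of the capacity-fairness trade-off that such channel-aware MAC algorithms navigate, here is a minimal sketch of proportional-fair scheduling, a canonical example of this class; it is not necessarily the algorithm proposed in this work, and all rates and parameters below are made up.

```python
# Sketch of a proportional-fair (PF) scheduler: each slot, pick the user
# with the highest ratio of instantaneous rate to smoothed average rate,
# balancing total throughput against fairness. Rates are illustrative.

def pf_schedule(inst_rates, avg_rates, slots, alpha=0.1):
    """inst_rates[t][u]: achievable rate of user u in slot t."""
    avg = list(avg_rates)
    picks = []
    for t in range(slots):
        u = max(range(len(avg)), key=lambda i: inst_rates[t][i] / avg[i])
        picks.append(u)
        for i in range(len(avg)):  # exponential smoothing of average rates
            served = inst_rates[t][i] if i == u else 0.0
            avg[i] = (1 - alpha) * avg[i] + alpha * served
    return picks

rates = [[2.0, 1.0], [2.0, 1.9], [0.3, 1.0]]
print(pf_schedule(rates, [1.0, 1.0], 3))  # -> [0, 1, 1]
```

A purely capacity-maximizing scheduler would pick user 0 twice here; PF instead serves user 1 once its relative starvation makes its ratio larger.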

Relevance:

20.00%

Publisher:

Abstract:

Randomness in the source condition, in addition to heterogeneity in the system parameters, can be a major source of uncertainty in the concentration field. Hence, a more general form of the problem formulation is necessary to consider randomness in both the source condition and the system parameters. When the source varies with time, the unsteady problem can be solved using the unit response function. In the case of random system parameters, the response function becomes a random function that depends on the randomness in the system parameters. In the present study, the source is modelled as a random discrete process with either a fixed interval or a random interval (the Poisson process). An attempt is made to assess the relative effects of various types of source uncertainties on the probabilistic behaviour of the concentration in a porous medium while the system parameters are also modelled as random fields. Analytical expressions for the mean and covariance of the concentration due to a random discrete source are derived in terms of the mean and covariance of the unit response function. The probabilistic behaviour of the random response function is obtained by using a perturbation-based stochastic finite element method (SFEM), which performs well for mild heterogeneity. The proposed method is applied to both 1-D and 3-D solute transport problems. The results obtained with SFEM are compared with Monte Carlo simulation for the 1-D problems.
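As a hedged illustration of this construction (the notation is mine, not taken from the paper): if the concentration responds linearly to source pulses of strength $s_k$ released at times $t_k$ through a unit response function $h$, then

$$c(x,t) = \sum_k s_k\, h(x, t - t_k),$$

and, assuming the source strengths are independent of the (random) response function,

$$\mathrm{E}[c] = \sum_k \mathrm{E}[s_k]\,\mathrm{E}[h(x, t - t_k)],$$

$$\mathrm{Cov}[c(x,t_1), c(x,t_2)] = \sum_{k,l} \Big( \mathrm{E}[s_k s_l]\, \mathrm{E}[h_k h_l] - \mathrm{E}[s_k]\,\mathrm{E}[s_l]\, \mathrm{E}[h_k]\,\mathrm{E}[h_l] \Big),$$

with $h_k = h(x, t_1 - t_k)$ and $h_l = h(x, t_2 - t_l)$. This is why the mean and covariance of the concentration can be written entirely in terms of the mean and covariance of the unit response function.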

Relevance:

20.00%

Publisher:

Abstract:

The conversion of a metastable phase into a thermodynamically stable phase takes place via the formation of clusters. Clusters of different sizes form spontaneously within the metastable mother phase, but only those larger than a certain size, called the critical size, end up growing into the new phase. There are two types of nucleation: homogeneous, where the clusters appear in a uniform phase, and heterogeneous, where pre-existing surfaces are available and clusters form on them. The nucleation of aerosol particles from gas-phase molecules involves not only inorganic compounds but also nonvolatile organic substances found in the atmosphere. The question is which of the myriad organic species have the right properties and are able to participate in nucleation phenomena. This thesis discusses both homogeneous and heterogeneous nucleation, using as its theoretical tool the classical nucleation theory (CNT) based on thermodynamics. Different classes of organics are investigated. The members of the first class are four dicarboxylic acids (succinic, glutaric, malonic and adipic). They can be found in both the gas and particulate phases, and represent good candidates for aerosol formation due to their low vapor pressure and solubility. Their influence on the nucleation process has not been investigated extensively in the literature and is not fully established. The accuracy of the CNT predictions for binary water-dicarboxylic acid systems depends significantly on good knowledge of the thermophysical properties of the organics and their aqueous solutions. A large part of the thesis is dedicated to this issue. We have shown that homogeneous and heterogeneous nucleation of succinic, glutaric and malonic acids in combination with water is unlikely to happen under atmospheric conditions. However, it seems that adipic acid could participate in the nucleation process under conditions occurring in the upper troposphere. The second class of organics is represented by n-nonane and n-propanol. Their thermophysical properties are well established, and experiments on these substances have been performed. The experimental data on binary homogeneous and heterogeneous nucleation have been compared with the theoretical predictions. Although the n-nonane - n-propanol mixture is far from ideal, CNT seems to behave fairly well, especially when calculating the cluster composition. In the case of heterogeneous nucleation, it has been found that a better characterization of the substrate-liquid interaction by means of line tension and microscopic contact angle leads to a significant improvement in the CNT prediction. Unfortunately, this cannot be achieved without well-defined experimental data.
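For reference, the central quantities of CNT in their standard textbook form (not results specific to this thesis): for a spherical cluster forming from a vapor at saturation ratio $S$, the formation free-energy barrier and the nucleation rate are

$$\Delta G^{*} = \frac{16\pi\, \sigma^{3} v^{2}}{3\,(k_{B} T \ln S)^{2}}, \qquad J = J_{0}\, \exp\!\left(-\frac{\Delta G^{*}}{k_{B} T}\right),$$

where $\sigma$ is the surface tension, $v$ the molecular volume in the liquid, and $J_0$ a kinetic prefactor. The cubic dependence on $\sigma$ is why accurate thermophysical properties of the acids and their aqueous solutions matter so much for the predictions discussed above.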

Relevance:

20.00%

Publisher:

Abstract:

We consider a scenario in which a wireless sensor network is formed by randomly deploying n sensors to measure some spatial function over a field, with the objective of computing a function of the measurements and communicating it to an operator station. We restrict ourselves to the class of type-threshold functions (as defined in the work of Giridhar and Kumar, 2005), of which max, min, and indicator functions are important examples; our discussion is couched in terms of the max function. We view the problem as one of message-passing distributed computation over a geometric random graph. The network is assumed to be synchronous: the sensors synchronously measure values and then collaborate to compute and deliver the function of these values to the operator station. Computation algorithms differ in (1) the communication topology assumed and (2) the messages that the nodes need to exchange in order to carry out the computation. The focus of our paper is to establish (in probability) scaling laws for the time and energy complexity of the distributed function computation over random wireless networks, under the assumption of centralized contention-free scheduling of packet transmissions. First, without any constraint on the computation algorithm, we establish scaling laws for the computation time and energy expenditure for one-time maximum computation. We show that for an optimal algorithm, the computation time and energy expenditure scale, respectively, as Θ(√(n/log n)) and Θ(n) asymptotically as the number of sensors n → ∞. Second, we analyze the performance of three specific computation algorithms that may be used in practical situations, namely, the tree algorithm, multihop transmission, and the Ripple algorithm (a type of gossip algorithm), and obtain scaling laws for the computation time and energy expenditure as n → ∞. In particular, we show that the computation time for these algorithms scales as Θ(√(n/log n)), Θ(n), and Θ(√(n log n)), respectively, whereas the energy expended scales as Θ(n), Θ(√(n/log n)), and Θ(√(n log n)), respectively. Finally, simulation results are provided to show that our analysis indeed captures the correct scaling. The simulations also yield estimates of the constant multipliers in the scaling laws. Our analyses throughout assume a centralized optimal scheduler, and hence our results can be viewed as providing bounds on the performance with practical distributed schedulers.
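A minimal sketch of the convergecast step behind a tree algorithm for one-shot max computation, assuming a spanning tree rooted at the operator station (the encoding and names are illustrative, not taken from the paper):

```python
# Each node waits for its children's partial maxima, then forwards
# max(own value, children's maxima) to its parent. Rounds are counted
# along the deepest root-to-leaf path, mirroring the intuition that
# computation time grows with tree depth (~ network diameter in hops).

def tree_max(children, values, root):
    """children: dict node -> list of child nodes (a spanning tree);
    values: dict node -> measured value; returns (max, rounds)."""
    def aggregate(node):
        best, depth = values[node], 0
        for child in children.get(node, []):
            child_best, child_depth = aggregate(child)
            best = max(best, child_best)
            depth = max(depth, child_depth + 1)  # one more round per level
        return best, depth
    return aggregate(root)

# Toy usage: a 5-node tree rooted at the operator station (node 0).
children = {0: [1, 2], 1: [3, 4]}
values = {0: 3.1, 1: 7.4, 2: 0.5, 3: 9.9, 4: 2.2}
print(tree_max(children, values, 0))  # -> (9.9, 2)
```

Since the hop diameter of the geometric random graph scales as Θ(√(n/log n)), so does the depth-limited completion time of such a convergecast, consistent with the tree-algorithm scaling law quoted above.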

Relevance:

20.00%

Publisher:

Abstract:

Lung cancer is the second most common type of cancer in the world and the most common cause of cancer-related death in both men and women. Research into the causes, prevention and treatment of lung cancer is ongoing, and much progress has been made recently in these areas; however, survival rates have not significantly improved. It is therefore essential to develop biomarkers for early diagnosis of lung cancer, prediction of metastasis and evaluation of treatment efficacy, as well as to use these molecules to provide some understanding of tumour biology and to translate highly promising findings in basic science research to clinical application. In this investigation, two-dimensional difference gel electrophoresis and mass spectrometry were initially used to analyse conditioned media from a panel of lung cancer and normal bronchial epithelial cell lines. Significant proteins were identified, with heterogeneous nuclear ribonucleoprotein A2B1 (hnRNPA2B1), pyruvate kinase M2 isoform (PKM2), Hsc-70 interacting protein and lactate dehydrogenase A (LDHA) selected for analysis in serum from healthy individuals and lung cancer patients. hnRNPA2B1, PKM2 and LDHA were found to be statistically significant in all comparisons. Tissue analysis and knockdown of hnRNPA2B1 using siRNA subsequently demonstrated both the overexpression of and a potential role for this molecule in lung tumorigenesis. The data presented highlight a number of in vitro derived candidate biomarkers subsequently verified in patient samples, and also provide some insight into their roles in the complex intracellular mechanisms associated with tumour progression.

Relevance:

20.00%

Publisher:

Abstract:

Nucleation is the first step of a first-order phase transition; a new phase always emerges through nucleation. The two main categories of nucleation are homogeneous nucleation, where the new phase forms within a uniform substance, and heterogeneous nucleation, where nucleation occurs on a pre-existing surface. In this thesis the main attention is paid to heterogeneous nucleation. The thesis treats nucleation phenomena from two theoretical perspectives: the classical nucleation theory and the statistical mechanical approach. The formulation of the classical nucleation theory relies on equilibrium thermodynamics and the use of macroscopically determined quantities to describe the properties of small nuclei, sometimes consisting of just a few molecules. The statistical mechanical approach is based on interactions between single molecules and does not carry the same assumptions as the classical theory. This work gathers up the present theoretical knowledge of heterogeneous nucleation and utilizes it in computational model studies. A new exact molecular approach to heterogeneous nucleation was introduced and tested by Monte Carlo simulations. The results obtained from the molecular simulations were interpreted by means of the concepts of the classical nucleation theory. Numerical calculations were carried out for a variety of substances nucleating on different substrates. The classical theory of heterogeneous nucleation was employed in calculations of one-component nucleation of water on newsprint paper, Teflon and cellulose film, and binary nucleation of water-n-propanol and water-sulphuric acid mixtures on silver nanoparticles. The results were compared with experimental results. The molecular simulation studies involved homogeneous nucleation of argon and heterogeneous nucleation of argon on a planar platinum surface. It was found that the use of a microscopic contact angle as a fitting parameter in calculations based on the classical theory of heterogeneous nucleation leads to fair agreement between the theoretical predictions and experimental results. In the cases presented, the microscopic angle was always smaller than the contact angle obtained from macroscopic measurements. Furthermore, the molecular Monte Carlo simulations revealed that the concept of the geometric contact parameter in heterogeneous nucleation calculations can work surprisingly well even for very small clusters.
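As a reference point for the role the contact angle plays here (the standard CNT expression for a flat substrate, not a result of this thesis): the heterogeneous barrier is lowered relative to the homogeneous one by a purely geometric factor,

$$\Delta G^{*}_{\mathrm{het}} = f(\theta)\, \Delta G^{*}_{\mathrm{hom}}, \qquad f(\theta) = \frac{(2 + \cos\theta)(1 - \cos\theta)^{2}}{4},$$

so a smaller microscopic contact angle $\theta$ implies a lower barrier and a higher nucleation rate, which is why fitting $\theta$ below its macroscopic value improves the agreement with experiment noted above.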

Relevance:

20.00%

Publisher:

Abstract:

CMPs enable the simultaneous execution of multiple applications on the same platform, sharing cache resources. Diversity in the cache access patterns of these simultaneously executing applications can trigger inter-application interference, leading to cache pollution. Whereas a large cache can ameliorate this problem, the larger power consumption that comes with increasing cache size, amplified at sub-100nm technologies, makes this solution prohibitive. In this paper, in order to address the power-aware performance of caches, we propose a caching structure that provides: 1) application-specific cache partitions defined as aggregations of caching units (molecules), where the parameters of each molecule, namely size, associativity and line size, are chosen so that its power consumption and access time are optimal for the given technology; 2) application-specific resizing of cache partitions, with variable and adaptive associativity per cache line, way size and variable line size; and 3) a replacement policy that is transparent to the partition in terms of size and heterogeneity in associativity and line size. Through simulation studies we establish the superiority of molecular caches (caches built as aggregations of molecules), which offer a 29% power advantage over an equivalently performing traditional cache.
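A toy sketch of the aggregation idea follows; all names and the per-access energy model are illustrative assumptions of mine, not the paper's simulator.

```python
# Toy sketch: a cache partition built as an aggregation of "molecules",
# each with its own size/associativity/line-size geometry. Energy values
# are illustrative placeholders.
from dataclasses import dataclass

@dataclass
class Molecule:
    size_kb: int          # capacity of this caching unit
    assoc: int            # associativity
    line_bytes: int       # line size
    nj_per_access: float  # assumed per-access energy for this geometry

@dataclass
class Partition:
    molecules: list  # list[Molecule] making up one application's partition

    def capacity_kb(self):
        return sum(m.size_kb for m in self.molecules)

    def access_energy_nj(self):
        # Only the molecules belonging to this partition are activated
        # on a probe, not the whole cache -- the source of the savings.
        return sum(m.nj_per_access for m in self.molecules)

# One application's partition: two small, low-power molecules.
p = Partition([Molecule(8, 2, 32, 0.05), Molecule(16, 4, 64, 0.09)])
print(p.capacity_kb(), p.access_energy_nj())  # -> 24 (KB), ~0.14 (nJ)
```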

Relevance:

20.00%

Publisher:

Abstract:

Clustered-architecture processors are preferred for embedded systems because centralized register file architectures scale poorly in terms of clock rate, chip area, and power consumption. Although clustering helps by improving clock speed, reducing the energy consumption of the logic, and making the design simpler, it introduces extra overheads by way of inter-cluster communication. This communication happens over long global wires, which leads to delays in execution and significantly higher energy consumption. In this paper, we propose a new instruction scheduling algorithm that exploits the scheduling slacks of instructions and the communication slacks of data values together to achieve better energy-performance trade-offs for clustered architectures with a heterogeneous interconnect. Our instruction scheduling algorithm achieves 35% and 40% reductions in communication energy, and the overall energy-delay product improves by 4.5% and 6.5%, respectively, for 2-cluster and 4-cluster machines, with marginal increases (1.6% and 1.1%) in execution time. Our test bed uses the Trimaran compiler infrastructure.
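A minimal sketch of the slack notion being exploited, under assumptions of mine (the DAG, the latencies, and the two-speed interconnect are illustrative): an instruction's slack is the gap between its as-late-as-possible (ALAP) and as-soon-as-possible (ASAP) start times, and a data value whose consumer has slack can ride slower, lower-energy wires.

```python
# Sketch: ASAP/ALAP slack over an instruction DAG. Instructions with
# positive slack can tolerate a slower (cheaper) inter-cluster transfer.

def topo(dag):
    seen, order = set(), []
    def visit(v):
        if v not in seen:
            seen.add(v)
            for w in dag[v]:
                visit(w)
            order.append(v)
    for v in dag:
        visit(v)
    return order[::-1]

def asap(dag, latency):
    t = {v: 0 for v in dag}
    for v in topo(dag):                 # earliest start: push successors later
        for w in dag[v]:
            t[w] = max(t[w], t[v] + latency[v])
    return t

def alap(dag, latency, makespan):
    t = {v: makespan - latency[v] for v in dag}
    for v in topo(dag)[::-1]:           # latest start: pull back from successors
        for w in dag[v]:
            t[v] = min(t[v], t[w] - latency[v])
    return t

# a -> b -> d and a -> c -> d; c sits on the short path, so it gains slack.
dag = {'a': ['b', 'c'], 'b': ['d'], 'c': ['d'], 'd': []}
lat = {'a': 1, 'b': 3, 'c': 1, 'd': 1}
s = asap(dag, lat)
make = max(s[v] + lat[v] for v in dag)
l = alap(dag, lat, make)
print({v: l[v] - s[v] for v in dag})  # {'a': 0, 'b': 0, 'c': 2, 'd': 0}
```

Here c's result can take a slow inter-cluster wire without stretching the schedule, which is the kind of opportunity a slack-aware scheduler hands to the low-energy links of a heterogeneous interconnect.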

Relevance:

20.00%

Publisher:

Abstract:

Geometric phases have been used in NMR to implement controlled phase-shift gates for quantum information processing, but only in weakly coupled systems in which the individual spins can be identified as qubits. In this work, we implement controlled phase-shift gates in strongly coupled systems by using nonadiabatic geometric phases, obtained by evolving the magnetization of fictitious spin-1/2 subspaces over a closed loop on the Bloch sphere. The dynamical phase accumulated during the evolution of the subspaces is refocused by a spin-echo pulse sequence and by setting the delay of the transition-selective pulses such that the evolution under the homonuclear coupling makes a complete 2π rotation. A detailed theoretical explanation of nonadiabatic geometric phases in NMR is given using single-transition operators. Controlled phase-shift gates, the two-qubit Deutsch-Jozsa algorithm, and a parity algorithm in a qubit-qutrit system have been implemented in various strongly dipolar-coupled systems obtained by orienting the molecules in liquid-crystal media.
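For orientation, two standard facts this construction rests on (textbook results, not specific to this paper): a spin-1/2 magnetization taken around a closed loop subtending solid angle $\Omega$ on the Bloch sphere acquires a geometric phase of half the enclosed solid angle, and a controlled phase-shift gate applies such a phase conditionally on the control qubit:

$$\gamma = -\frac{\Omega}{2} \ \ (\text{up to sign convention}), \qquad U_{\mathrm{CP}}(\phi) = \mathrm{diag}\!\left(1,\, 1,\, 1,\, e^{i\phi}\right).$$

Because $\gamma$ depends only on the loop geometry and not on the traversal speed, refocusing the dynamical phase, as described above, leaves a purely geometric conditional phase.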

Relevance:

20.00%

Publisher:

Abstract:

This master's thesis studies how trade liberalization affects firm-level productivity and industrial evolution. To do so, I build a dynamic model that treats firm-level productivity as endogenous in order to investigate the influence of trade on firms' productivity and on market structure. In this framework, heterogeneous firms in the same industry operate differently in equilibrium. Specifically, firms are ex ante identical, but heterogeneity arises as an equilibrium outcome. Under monopolistic competition, this type of model yields an industry represented not by a steady-state outcome but by an evolution that relies on the decisions made by individual firms. I prove that trade liberalization has a generally positive impact on technology adoption rates and hence increases firm-level productivity. This endogenous technology adoption model also captures the stylized fact that exporting firms are larger and more productive than their non-exporting counterparts in the same sector. I assume that the number of firms is endogenous, since, according to the empirical literature, industrial evolution shows considerably different patterns across countries: some industries experience large-scale exit of firms in periods of contracting market share, while others display a relatively stable or gradually increasing number of firms. The term "shakeout" describes a dramatic decrease in the number of firms. In order to explain the causes of shakeouts, I construct a model in which forward-looking firms decide to enter and exit the market on the basis of their state of technology. In equilibrium, firms choose different dates to adopt an innovation, which generates a gradual diffusion process. It is exactly this gradual diffusion process that generates the rapid, large-scale exit phenomenon. Specifically, the model demonstrates a positive feedback between exit and adoption: the reduction in the number of firms increases the incentives for the remaining firms to adopt the innovation. Therefore, in a setting of complete information, this model not only generates a shakeout but also captures the stability of an industry. However, a solely national view of industrial evolution neglects the importance of international trade in determining the shape of market structure. In particular, I show that higher trade barriers lead to more fragile markets, encouraging over-entry in the initial stage of the industry life cycle and raising the probability of a shakeout. Therefore, more liberalized trade generates a more stable market structure from both national and international viewpoints. The main references are Ederington and McCalman (2008, 2009).
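A stylized version of the adoption timing trade-off implicit here, in my own notation rather than the thesis's model: a forward-looking firm adopts at date $t$ when the discounted profit gain from the new technology covers the adoption cost,

$$\int_{t}^{\infty} e^{-r(s-t)} \left[ \pi^{A}(s) - \pi^{N}(s) \right] ds \;\geq\; F(t),$$

where $\pi^{A}$ and $\pi^{N}$ are profit flows with and without the innovation and $F(t)$ is the (possibly declining) adoption cost. As exit thins the market, the gain $\pi^{A} - \pi^{N}$ rises for survivors, which is the positive feedback between exit and adoption described above.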

Relevance:

20.00%

Publisher:

Abstract:

Reliability analysis for computing systems in aerospace applications must account for the actual computations the system performs in its use environment. This paper introduces a theoretical nonhomogeneous Markov model for such applications.
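For context, the standard formalism behind such models (not this paper's specific construction): a nonhomogeneous Markov model lets the transition rates depend on time, so the state-probability row vector $p(t)$ evolves under a time-varying generator $Q(t)$,

$$\frac{d\,p(t)}{dt} = p(t)\, Q(t), \qquad P(s,t) = \mathcal{T} \exp\!\left( \int_{s}^{t} Q(u)\, du \right),$$

where $\mathcal{T}$ denotes the time-ordered exponential; when $Q$ is constant this reduces to the familiar homogeneous case $P(t) = e^{Qt}$. Time-dependence of $Q$ is what allows the failure behaviour to track the workload phases of the mission.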

Relevance:

20.00%

Publisher:

Abstract:

Three different types of consistency, viz., semiweak, weak, and strong, of a read-only transaction in a schedule s of a set T of transactions are defined, and these are compared with the existing notions of consistency of a read-only transaction in a schedule. We present a technique that enables a user to control the consistency of a read-only transaction in heterogeneous locking protocols. Since the weak consistency of a read-only transaction improves concurrency in heterogeneous locking protocols, users can help improve concurrency in such protocols by supplying the consistency requirements of read-only transactions. A heterogeneous locking protocol P′ derived from a locking protocol P that uses exclusive-mode locks only and ensures serializability need not be deadlock-free. We present a sufficient condition that ensures the deadlock-freeness of P′ when P is deadlock-free and all the read-only transactions in P′ are two-phase.
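A minimal sketch of the two-phase property invoked in that final condition (the operation encoding is my own assumption): a transaction is two-phase if all of its lock acquisitions precede its first unlock.

```python
# Sketch: check whether a transaction's operation sequence is two-phase,
# i.e. no lock is acquired after the first unlock. The ('lock'/'unlock',
# item) tuple encoding is an illustrative assumption.

def is_two_phase(ops):
    unlocked = False
    for action, _item in ops:
        if action == 'unlock':
            unlocked = True
        elif action == 'lock' and unlocked:
            return False  # growing phase resumed after shrinking began
    return True

print(is_two_phase([('lock', 'x'), ('lock', 'y'),
                    ('unlock', 'x'), ('unlock', 'y')]))  # True
print(is_two_phase([('lock', 'x'), ('unlock', 'x'),
                    ('lock', 'y')]))                     # False
```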