866 results for Lower bounds
Abstract:
This paper gives a brief review of R&D efforts on the direct and indirect synthesis of light olefins from synthesis gas at the Dalian Institute of Chemical Physics (DICP). The first pilot-plant test, on the methanol-to-olefin (MTO) reaction, was completed in 1993 and was based on a ZSM-5-type catalyst in a fixed-bed reactor. In the meantime, a new indirect route designated SDTO (syngas via dimethyl ether to olefins) was proposed. In this process, a metal-acid bifunctional catalyst was applied to the conversion of synthesis gas to dimethyl ether (DME), and a modified SAPO-34 catalyst, synthesized by a new low-cost method with optimal crystal size, was used to convert the DME to light olefins in a fluidized-bed reactor. The pilot-plant test on SDTO was performed and completed in 1995. Evaluation of the pilot-plant data showed a single-pass yield of 190-200 g of DME per standard cubic meter of synthesis gas. For the second reaction, 1.880 tons of DME or 2.615 tons of methanol produced 1 ton of light olefins, consisting of 0.533 ton of ethylene, 0.349 ton of propylene and 0.118 ton of butene. DICP also paid attention to the direct conversion of synthesis gas to light olefins. A semi-pilot-plant test (1.8 L of catalyst) was completed in 1995 with a CO conversion > 70% and a C2=-C4= olefin selectivity of 71-74% over 1000 h. (C) 2000 Published by Elsevier Science B.V. All rights reserved.
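The per-ton product breakdown quoted in the abstract can be checked with a line of arithmetic. The sketch below is an illustrative sanity check only; all numbers are taken directly from the abstract, and the derived percentages are simple mass fractions, not figures from the paper.

```python
# Sanity check of the SDTO pilot-plant product figures quoted above
# (illustrative arithmetic only; the tonnages come from the abstract).
dme_per_ton_olefins = 1.880      # tons of DME per ton of light olefins
meoh_per_ton_olefins = 2.615     # tons of methanol per ton of light olefins
olefin_breakdown = {"ethylene": 0.533, "propylene": 0.349, "butene": 0.118}

total = sum(olefin_breakdown.values())
print(f"olefin fractions sum to {total:.3f} ton")   # 1.000 ton

# Implied mass fraction of each olefin within the product:
for name, tons in olefin_breakdown.items():
    print(f"{name}: {tons / total:.1%}")
```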
Abstract:
Lower-alkene production by the gas-phase oxidative cracking (GOC) or catalytic oxidative cracking (COC) of hexane (C6) with added syngas was investigated. Adding syngas to the COC process effectively enhanced the selectivity to lower alkenes and decreased the selectivity to COx, because O2 reacts preferentially with the H2 contained in the syngas, whereas the addition had little effect on the conversion of C6 and the product distribution in the GOC process. A high selectivity to lower alkenes of 70% and a low selectivity to COx of 6%, at a C6 conversion of 66%, were achieved over a 0.1% Pt/MgAl2O4 catalyst. The COC process of C6 combined with syngas in the feed can directly produce a gas mixture of lower alkenes, H2, and CO, which is a suitable feedstock for the hydroformylation process.
Abstract:
This paper presents a lower-bound result on the computational power of a genetic algorithm in the context of combinatorial optimization. We describe a new genetic algorithm, the merged genetic algorithm, and prove that for the class of monotonic functions, the algorithm finds the optimal solution, and does so with an exponential convergence rate. The analysis pertains to the ideal behavior of the algorithm where the main task reduces to showing convergence of probability distributions over the search space of combinatorial structures to the optimal one. We take exponential convergence to be indicative of efficient solvability for the sample-bounded algorithm, although a sampling theory is needed to better relate the limit behavior to actual behavior. The paper concludes with a discussion of some immediate problems that lie ahead.
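The abstract does not spell out the merged genetic algorithm, so the sketch below deliberately substitutes a much simpler, well-known evolutionary algorithm, the (1+1) EA on the monotone OneMax function, purely to illustrate the kind of fast convergence on monotonic functions that the paper analyzes. It is not the paper's algorithm.

```python
import random

def one_max(bits):
    """Monotone fitness: strictly improves whenever any 0 flips to 1."""
    return sum(bits)

def one_plus_one_ea(n=32, max_gens=10_000, seed=0):
    """A (1+1) EA: flip each bit independently with prob 1/n, keep the
    child if it is no worse. On monotone functions such as OneMax the
    expected optimization time is O(n log n) generations."""
    rng = random.Random(seed)
    parent = [rng.randint(0, 1) for _ in range(n)]
    for gen in range(max_gens):
        if one_max(parent) == n:
            return parent, gen
        child = [b ^ (rng.random() < 1 / n) for b in parent]
        if one_max(child) >= one_max(parent):
            parent = child
    return parent, max_gens

best, gens = one_plus_one_ea()
print(f"optimum reached after {gens} generations")
```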
Abstract:
We show that if a language is recognized within certain error bounds by constant-depth quantum circuits over a finite family of gates, then it is computable in (classical) polynomial time. In particular, our results imply EQNC^0 ⊆ P, where EQNC^0 is the constant-depth analog of the class EQP. On the other hand, we adapt and extend ideas of Terhal and DiVincenzo [?] to show that, for any family
Abstract:
We consider the problem of delivering popular streaming media to a large number of asynchronous clients. We propose and evaluate a cache-and-relay end-system multicast approach, whereby a client joining a multicast session caches the stream, and if needed, relays that stream to neighboring clients which may join the multicast session at some later time. This cache-and-relay approach is fully distributed, scalable, and efficient in terms of network link cost. In this paper we analytically derive bounds on the network link cost of our cache-and-relay approach, and we evaluate its performance under assumptions of limited client bandwidth and limited client cache capacity. When client bandwidth is limited, we show that although finding an optimal solution is NP-hard, a simple greedy algorithm performs surprisingly well in that it incurs network link costs that are very close to a theoretical lower bound. When client cache capacity is limited, we show that our cache-and-relay approach can still significantly reduce network link cost. We have evaluated our cache-and-relay approach using simulations over large, synthetic random networks, power-law degree networks, and small-world networks, as well as over large real router-level Internet maps.
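The abstract does not specify the greedy algorithm, so the following is a hedged sketch under stated assumptions: clients arrive at known times, an earlier client can relay to a later one only while the stream prefix is still within its cache window `w`, relaying costs the link distance between the two clients, and streaming from the server costs `D`. Each client greedily picks the cheapest feasible option. All names and numbers are illustrative.

```python
# Hedged sketch of a greedy cache-and-relay assignment (illustrative only;
# not the paper's algorithm as specified).

def greedy_cache_and_relay(arrivals, dist, w, D):
    """arrivals: sorted client arrival times; dist[i][j]: link cost between
    clients i and j; w: cache window; D: cost of streaming from the server.
    Each client attaches to the cheapest feasible earlier relay, else the
    server. Returns (total link cost, parent of each client or None)."""
    total_cost = 0.0
    parents = []
    for i, t in enumerate(arrivals):
        # feasible relays: earlier clients whose cached prefix still covers t
        candidates = [(dist[j][i], j) for j in range(i)
                      if t - arrivals[j] <= w]
        cost, parent = min(candidates, default=(D, None))
        cost = min(cost, D)  # never pay more than streaming from the server
        parents.append(parent if cost < D else None)
        total_cost += cost
    return total_cost, parents

arrivals = [0, 2, 3, 9]
dist = [[0, 1, 4, 4],
        [1, 0, 1, 4],
        [4, 1, 0, 4],
        [4, 4, 4, 0]]
cost, parents = greedy_cache_and_relay(arrivals, dist, w=5, D=10)
print(cost, parents)
```

With these numbers the second and third clients chain off earlier arrivals, while the first and last (whose gaps exceed the cache window) fall back to the server.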
Abstract:
snBench is a platform on which novice users compose and deploy distributed Sense and Respond programs for simultaneous execution on a shared, distributed infrastructure. It is a natural imperative that we have the ability to (1) verify the safety/correctness of newly submitted tasks and (2) derive the resource requirements for these tasks such that correct allocation may occur. To achieve these goals we have established a multi-dimensional sized type system for our functional-style Domain Specific Language (DSL) called Sensor Task Execution Plan (STEP). In such a type system, data types are annotated with a vector of size attributes (e.g., upper and lower size bounds). Tracking multiple size aspects proves essential in a system in which Images are manipulated as a first-class data type, as image manipulation functions may have specific minimum and/or maximum resolution restrictions on the input they can correctly process. Through static analysis of STEP instances we not only verify basic type safety and establish upper computational resource bounds (i.e., time and space), but we also derive and solve data and resource sizing constraints (e.g., Image resolution, camera capabilities) from the implicit constraints embedded in program instances. In fact, the static methods presented here have benefits beyond their application to Image data, and may be extended to other data types that require tracking multiple dimensions (e.g., image "quality", video frame-rate or aspect ratio, audio sampling rate). In this paper we present the syntax and semantics of our functional language, our type system that builds costs and resource/data constraints, and (through both formalism and specific details of our implementation) provide concrete examples of how the constraints and sizing information are used in practice.
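The core idea of annotating a data type with a vector of size bounds and solving the implied constraints statically can be sketched in a few lines. Everything below is a hypothetical illustration, not the STEP DSL: the `Sized` class, the `meet` operation, and the two operator signatures are invented for this example.

```python
# Hedged illustration of a multi-dimensional sized type: an Image type
# carries min/max resolution bounds, and composing two pipeline stages
# intersects their constraints statically, before anything runs.
# All names here (Sized, face_detect, thumbnail) are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class Sized:
    """Size bounds on an Image: (min_w, min_h) up to (max_w, max_h)."""
    min_w: int
    min_h: int
    max_w: int
    max_h: int

    def meet(self, other):
        """Intersect two constraint vectors; None means unsatisfiable,
        i.e., a static type error caught before deployment."""
        lo_w, lo_h = max(self.min_w, other.min_w), max(self.min_h, other.min_h)
        hi_w, hi_h = min(self.max_w, other.max_w), min(self.max_h, other.max_h)
        if lo_w <= hi_w and lo_h <= hi_h:
            return Sized(lo_w, lo_h, hi_w, hi_h)
        return None

# Hypothetical input constraints of two pipeline stages:
face_detect = Sized(640, 480, 4096, 4096)   # needs at least VGA input
thumbnail   = Sized(1, 1, 1920, 1080)       # upstream must fit an HD buffer

pipeline = face_detect.meet(thumbnail)
print(pipeline)   # the composed pipeline accepts 640x480 up to 1920x1080
```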
Abstract:
This paper proposes the use of in-network caches (which we call Angels) to reduce the Minimum Distribution Time (MDT) of a file from a seeder – a node that possesses the file – to a set of leechers – nodes who are interested in downloading the file. An Angel is not a leecher in the sense that it is not interested in receiving the entire file, but rather it is interested in minimizing the MDT to all leechers, and as such uses its storage and up/down-link capacity to cache and forward parts of the file to other peers. We extend the analytical results by Kumar and Ross [1] to account for the presence of angels by deriving a new lower bound for the MDT. We show that this newly derived lower bound is tight by proposing a distribution strategy under assumptions of a fluid model. We present a GroupTree heuristic that addresses the impracticalities of the fluid model. We evaluate our designs through simulations that show that our GroupTree heuristic outperforms other heuristics, that it scales well with the increase of the number of leechers, and that it closely approaches the optimal theoretical bounds.
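The baseline bound of Kumar and Ross [1] that the paper extends (the angel-free case) has a well-known closed form, sketched below. The extended bound that accounts for angels is the paper's contribution and is not reproduced here; the rates in the example are arbitrary.

```python
# Minimum distribution time lower bound of Kumar and Ross [1], without
# angels: file size F, seeder upload rate u_s, leecher download rates d_i
# and upload rates u_i, N leechers.

def mdt_lower_bound(F, u_s, d, u):
    N = len(d)
    return max(F / u_s,                 # seeder must upload the file once
               F / min(d),              # slowest leecher must download it
               N * F / (u_s + sum(u)))  # aggregate upload capacity limit

# Example: 100 MB file, seeder at 10 MB/s, 4 identical leechers
# (5 MB/s down, 2 MB/s up). The aggregate-upload term binds here.
print(mdt_lower_bound(100, 10, [5] * 4, [2] * 4))
```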
Abstract:
This thesis proposes the use of in-network caches (which we call Angels) to reduce the Minimum Distribution Time (MDT) of a file from a seeder – a node that possesses the file – to a set of leechers – nodes who are interested in downloading the file. An Angel is not a leecher in the sense that it is not interested in receiving the entire file, but rather it is interested in minimizing the MDT to all leechers, and as such uses its storage and up/down-link capacity to cache and forward parts of the file to other peers. We extend the analytical results by Kumar and Ross (Kumar and Ross, 2006) to account for the presence of angels by deriving a new lower bound for the MDT. We show that this newly derived lower bound is tight by proposing a distribution strategy under assumptions of a fluid model. We present a GroupTree heuristic that addresses the impracticalities of the fluid model. We evaluate our designs through simulations that show that our GroupTree heuristic outperforms other heuristics, that it scales well with the increase of the number of leechers, and that it closely approaches the optimal theoretical bounds.
Abstract:
Personal communication devices are increasingly being equipped with sensors that are able to passively collect information from their surroundings – information that could be stored in fairly small local caches. We envision a system in which users of such devices use their collective sensing, storage, and communication resources to query the state of (possibly remote) neighborhoods. The goal of such a system is to achieve the highest query success ratio using the least communication overhead (power). We show that the use of Data Centric Storage (DCS), or directed placement, is a viable approach for achieving this goal, but only when the underlying network is well connected. Alternatively, we propose amorphous placement, in which sensory samples are cached locally and informed exchanges of cached samples are used to diffuse the sensory data throughout the whole network. In handling queries, the local cache is searched first for potential answers. If unsuccessful, the query is forwarded to one or more direct neighbors for answers. This technique leverages node mobility and caching capabilities to avoid the multi-hop communication overhead of directed placement. Using a simplified mobility model, we provide analytical lower and upper bounds on the ability of amorphous placement to achieve uniform field coverage in one and two dimensions. We show that combining informed shuffling of cached samples upon an encounter between two nodes with the querying of direct neighbors could lead to significant performance improvements. For instance, under realistic mobility models, our simulation experiments show that amorphous placement achieves 10% to 40% better query answering ratio at a 25% to 35% savings in consumed power over directed placement.
Abstract:
A souterrain was discovered here when the weight of a tractor passing overhead caused a collapse of the roof of Chamber I. It was surveyed in March 1976. The landowner, Mr. Thomas Curran of Ballylangdon, has consented to keep the site open for future inspection. The site is not directly connected with any visible surface structure. A small uni-vallate ringfort is, however, situated c. 160m S.S.E. of the site. The bedrock is a slaty sandstone.
Abstract:
Introduction: The prevalence of diabetes is rising rapidly. Assessing quality of diabetes care is difficult. Lower Extremity Amputation (LEA) is recognised as a marker of the quality of diabetes care. The focus of this thesis was first to describe the trends in LEA rates in people with and without diabetes in the Republic of Ireland (RoI) in recent years and then to explore the determinants of LEA in people with diabetes. While clinical and socio-demographic determinants have been well-established, the role of service-related factors has been less well-explored. Methods: Using hospital discharge data, trends in LEA rates in people with and without diabetes were described and compared to other countries. Background work included concordance studies exploring the reliability of hospital discharge data for recording LEA and diabetes, and estimation of diabetes prevalence rates in the RoI from a nationally representative study (SLAN 2007). To explore determinants, a systematic review and meta-analysis assessed the effect of contact with a podiatrist on the outcome of LEA in people with diabetes. Finally, a case-control study using hospital discharge data explored determinants of LEA in people with diabetes, with a particular focus on the timing of access to secondary healthcare services as a risk factor. Results: There are high levels of agreement between hospital discharge data and medical records for LEA and diabetes. Thus, hospital discharge data were deemed sufficiently reliable for use in this PhD thesis. A decrease in major LEA rates in people with diabetes was observed in the RoI from 2005-2012. In 2012, the relative risk of a person with diabetes undergoing a major LEA was 6.2 times (95% CI 4.8-8.1) that of a person without diabetes. Based on the systematic review and meta-analysis, contact with a podiatrist did not significantly affect the relative risk (RR) of LEA in people with diabetes.
Results from the case-control study identified being single, documented CKD and documented hypertension as significant risk factors for LEA in people with diabetes, whilst documented retinopathy was protective. Within the seven-year time window included in the study, no association was detected between LEA in patients with diabetes and the timing of patient access to secondary healthcare for diabetes management. Discussion: Many countries have reported reduced major LEA rates in people with diabetes coinciding with improved organisation of healthcare systems. Reassuringly, these first national estimates in people with diabetes in the RoI from 2005 to 2012 demonstrated reducing trends in major LEA rates. This may be attributable to changes in diabetes care and also to secular trends in smoking, dyslipidaemia and hypertension. Consistent with international practice, LEA trends data in Ireland can be used to monitor quality of care. Quantifying this improvement precisely, though, is problematic without robust denominator data on the prevalence of diabetes. However, a reduction in major diabetes-related LEA rates suggests improved quality of diabetes care. Much controversy exists around the reliability of hospital discharge data in the RoI. This thesis includes the first multi-site study to explore this issue, and it found hospital discharge data reliable for the reporting of the procedure of LEA and the diagnosis of diabetes. This project did not detect protective effects of access to services, including podiatry and secondary healthcare, on LEA in people with diabetes. A major limitation of the systematic review and meta-analysis was the design and quality of the included studies. The available data on the effect of contact with a podiatrist on LEA risk are too sparse to say anything definitive about the efficacy of podiatry.
Limitations of the case-control study include the lack of a diabetes register in Ireland, restricted information from secondary healthcare and a lack of data available from primary healthcare. Due to these issues, duration of disease could not be accounted for in the study, which limits the conclusions that can be drawn from the results. The model of diabetes care in the RoI is currently undergoing a re-configuration with plans to introduce integrated care. In the future, trends in LEA rates should be continuously monitored to evaluate the effectiveness of changes to the healthcare system. Efforts are already underway to improve the availability of routine data from primary healthcare with the recent development of the iPCRN (Irish Primary Care Research Network). Linkage of primary and secondary healthcare records with a unique patient identifier should be the goal for the future.
Abstract:
Existing point estimates of half-life deviations from purchasing power parity (PPP), around 3-5 years, suggest that the speed of convergence is extremely slow. This article assesses the degree of uncertainty around these point estimates by using local-to-unity asymptotic theory to construct confidence intervals that are robust to high persistence in small samples. The empirical evidence suggests that the lower bound of the confidence interval is between four and eight quarters for most currencies, which is not inconsistent with traditional price-stickiness explanations. However, the upper bounds are infinity for all currencies, so we cannot provide conclusive evidence in favor of PPP either. © 2005 American Statistical Association.
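Under the standard AR(1) framing of PPP deviations, a half-life follows directly from the autoregressive root: it is the horizon h solving rho**h = 1/2. The sketch below uses illustrative roots chosen by us, not estimates from the article, to show how quarterly persistence maps onto the quoted 3-5 year range.

```python
import math

def half_life_quarters(rho):
    """Half-life of an AR(1) deviation with quarterly root rho:
    the horizon h solving rho**h = 0.5, i.e. ln(0.5)/ln(rho)."""
    return math.log(0.5) / math.log(rho)

# Illustrative roots (not the article's estimates): rho = 0.95 per quarter
# gives about 13.5 quarters, i.e. roughly a 3.4-year half-life, which sits
# inside the 3-5 year range quoted above.
for rho in (0.90, 0.95, 0.97):
    print(f"rho={rho}: {half_life_quarters(rho):.1f} quarters")
```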
Abstract:
We present a theory of hypoellipticity and unique ergodicity for semilinear parabolic stochastic PDEs with "polynomial" nonlinearities and additive noise, considered as abstract evolution equations in some Hilbert space. It is shown that if Hörmander's bracket condition holds at every point of this Hilbert space, then a lower bound on the Malliavin covariance operator μt can be obtained. Informally, this bound can be read as "Fix any finite-dimensional projection Π on a subspace of sufficiently regular functions. Then the eigenfunctions of μt with small eigenvalues have only a very small component in the image of Π." We also show how to use a priori bounds on the solutions to the equation to obtain good control on the dependency of the bounds on the Malliavin matrix on the initial condition. These bounds are sufficient in many cases to obtain the asymptotic strong Feller property introduced in [HM06]. One of the main novel technical tools is an almost sure bound from below on the size of "Wiener polynomials," where the coefficients are possibly non-adapted stochastic processes satisfying a Lipschitz condition. By exploiting the polynomial structure of the equations, this result can be used to replace Norris' lemma, which is unavailable in the present context. We conclude by showing that the two-dimensional stochastic Navier-Stokes equations and a large class of reaction-diffusion equations fit the framework of our theory.