925 results for nutrients and sulfur application
Abstract:
K. Rasmani and Q. Shen. Data-driven fuzzy rule generation and its application for student academic performance evaluation. Applied Intelligence, 25(3):305-319, 2006.
Abstract:
J. Keppens, Q. Shen and M. Lee. Compositional Bayesian modelling and its application to decision support in crime investigation. Proceedings of the 19th International Workshop on Qualitative Reasoning, pages 138-148.
Abstract:
K. Rasmani and Q. Shen. Subsethood-based Fuzzy Rule Models and their Application to Student Performance Classification. Proceedings of the 14th International Conference on Fuzzy Systems, pages 755-760, 2005.
Abstract:
R. Gohm. A probabilistic index for completely positive maps and an application. Journal of Operator Theory, 54(2):339-361, 2003.
Abstract:
Wireless sensor networks have recently emerged as enablers of important applications such as environmental, chemical and nuclear sensing systems. Such applications have sophisticated spatial-temporal semantics that set them apart from traditional wireless networks. For example, the computation of temperature averaged over the sensor field must take into account local densities. This is crucial since otherwise the estimated average temperature can be biased by over-sampling areas where many more sensors exist. Thus, we envision that a fundamental service that a wireless sensor network should provide is that of estimating local densities. In this paper, we propose a lightweight probabilistic density inference protocol, which we call DIP, that allows each sensor node to implicitly estimate its neighborhood size without the explicit exchange of node identifiers required by existing density discovery schemes. The theoretical basis of DIP is a probabilistic analysis that relates the number of sensor nodes contending in the neighborhood of a node to the level of contention measured by that node. Extensive simulations confirm the premise of DIP: it provides statistically reliable and accurate estimates of local density at very low energy cost and in constant running time. We demonstrate how applications could be built on top of our DIP-based service by computing density-unbiased statistics from the estimated local densities.
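The inference at the heart of such a density service can be illustrated with a toy slotted-contention model. The sketch below is not the authors' DIP protocol; it only assumes that each neighbor transmits in a listening slot independently with a known probability p, so the fraction of idle slots a node observes is roughly (1 - p)^n and the neighborhood size n can be recovered by inverting that relationship. All function names and parameters are hypothetical.

```python
import math
import random

def observe_idle_fraction(n_neighbors: int, p: float, n_slots: int, rng=random) -> float:
    """Simulate n_slots contention slots; a slot is idle iff no neighbor transmits."""
    idle = sum(1 for _ in range(n_slots)
               if all(rng.random() >= p for _ in range(n_neighbors)))
    return idle / n_slots

def estimate_density(idle_fraction: float, p: float) -> float:
    """Invert Pr[slot idle] = (1 - p)**n to estimate the neighborhood size n."""
    idle_fraction = max(idle_fraction, 1e-9)   # guard against log(0) if no slot was idle
    return math.log(idle_fraction) / math.log(1.0 - p)

if __name__ == "__main__":
    true_n, p, slots = 40, 0.02, 2000          # illustrative values only
    f = observe_idle_fraction(true_n, p, slots)
    print(f"observed idle fraction: {f:.3f}, estimated neighbors: {estimate_density(f, p):.1f}")
```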
Abstract:
The World Wide Web (WWW or Web) is growing rapidly on the Internet. Web users want fast response times and easy access to an enormous variety of information across the world. Thus, performance is becoming a major issue in the Web. Fractals have been used to study fluctuating phenomena in many different disciplines, from the distribution of galaxies in astronomy to complex physiological control systems. The Web is also a complex, irregular, and random system. In this paper, we examine the document reference pattern at Internet Web servers and use fractal-based models to understand aspects (e.g. caching schemes) that affect Web performance.
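The abstract does not spell out the specific fractal models used; as an illustrative stand-in, the sketch below applies a standard variance-time analysis to a per-interval request-count series, which is one common way self-similar (fractal) scaling is quantified in traffic and reference traces. Names and parameters are hypothetical.

```python
import numpy as np

def variance_time_pairs(counts: np.ndarray, scales=(1, 2, 4, 8, 16, 32, 64)):
    """Aggregate a per-interval request-count series at several scales and return
    (scale, variance of the scale-m block means); for a self-similar process the
    log-log slope is roughly 2H - 2, where H is the Hurst exponent."""
    pairs = []
    for m in scales:
        usable = len(counts) - len(counts) % m
        blocks = counts[:usable].reshape(-1, m).mean(axis=1)
        pairs.append((m, blocks.var()))
    return pairs

def hurst_from_pairs(pairs) -> float:
    """Least-squares fit of log(variance) vs log(scale): slope = 2H - 2."""
    log_m = np.log([m for m, _ in pairs])
    log_v = np.log([v for _, v in pairs])
    slope = np.polyfit(log_m, log_v, 1)[0]
    return 1.0 + slope / 2.0

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    counts = rng.poisson(lam=20, size=4096).astype(float)   # toy, memoryless trace (H ~ 0.5)
    print(f"estimated Hurst exponent: {hurst_from_pairs(variance_time_pairs(counts)):.2f}")
```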
Abstract:
World-Wide Web (WWW) services have grown to levels where significant delays are to be expected. Techniques such as pre-fetching are likely to help tailor service to users' needs and reduce their waiting times. However, pre-fetching is only effective if the right documents are identified and if the user's next move is correctly predicted. Otherwise, pre-fetching only wastes bandwidth. Therefore, it is worthwhile to determine whether a revisit will occur before starting to pre-fetch. In this paper we develop two user models that help determine the user's next move. One model uses a Random Walk approximation and the other is based on Digital Signal Processing techniques. We also give hints on how to use such models with a simple pre-fetching technique that we are developing.
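Neither of the two user models is detailed in the abstract; purely as an illustration of the idea, the sketch below builds a first-order transition-count predictor from a navigation history and only recommends pre-fetching when the predicted probability of the next document clears a threshold, so that unlikely guesses do not waste bandwidth. All names are hypothetical.

```python
from collections import defaultdict

class NextPagePredictor:
    """Toy first-order model: count observed page-to-page transitions and
    predict the most frequent successor of the current page."""

    def __init__(self):
        self.transitions = defaultdict(lambda: defaultdict(int))

    def observe(self, prev_page: str, next_page: str) -> None:
        self.transitions[prev_page][next_page] += 1

    def predict(self, current_page: str):
        successors = self.transitions.get(current_page)
        if not successors:
            return None, 0.0
        total = sum(successors.values())
        page, count = max(successors.items(), key=lambda kv: kv[1])
        return page, count / total

# Pre-fetch only when the model is confident enough to avoid wasting bandwidth.
predictor = NextPagePredictor()
for prev, nxt in [("/index", "/news"), ("/index", "/news"), ("/index", "/about")]:
    predictor.observe(prev, nxt)
page, prob = predictor.predict("/index")
if prob >= 0.5:
    print(f"pre-fetch candidate: {page} (p={prob:.2f})")
```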
Abstract:
We introduce Collocation Games as the basis of a general framework for modeling, analyzing, and facilitating the interactions between the various stakeholders in distributed systems in general, and in cloud computing environments in particular. Cloud computing enables fixed-capacity (processing, communication, and storage) resources to be offered by infrastructure providers as commodities for sale at a fixed cost in an open marketplace to independent, rational parties (players) interested in setting up their own applications over the Internet. Virtualization technologies enable the partitioning of such fixed-capacity resources so as to allow each player to dynamically acquire appropriate fractions of the resources for unencumbered use. In such a paradigm, the resource management problem reduces to that of partitioning the entire set of applications (players) into subsets, each of which is assigned to fixed-capacity cloud resources. If the infrastructure and the various applications are under a single administrative domain, this partitioning reduces to an optimization problem whose objective is to minimize the overall deployment cost. In a marketplace, in which the infrastructure provider is interested in maximizing its own profit, and in which each player is interested in minimizing its own cost, it should be evident that a global optimization is precisely the wrong framework. Rather, in this paper we use a game-theoretic framework in which the assignment of players to fixed-capacity resources is the outcome of a strategic "Collocation Game". Although we show that determining the existence of an equilibrium for collocation games in general is NP-hard, we present a number of simplified, practically-motivated variants of the collocation game for which we establish convergence to a Nash Equilibrium, and for which we derive convergence and price of anarchy bounds. In addition to these analytical results, we present an experimental evaluation of implementations of some of these variants for cloud infrastructures consisting of a collection of multidimensional resources of homogeneous or heterogeneous capacities. Experimental results using trace-driven simulations and synthetically generated datasets corroborate our analytical results and also illustrate how collocation games offer a feasible distributed resource management alternative for autonomic/self-organizing systems, in which the adoption of a global optimization approach (centralized or distributed) would be neither practical nor justifiable.
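As a rough illustration of how an assignment of players to fixed-capacity resources could emerge from self-interested moves rather than a global optimization, the sketch below runs best-response dynamics for a toy collocation game in which co-located players split a machine's fixed cost in proportion to their demands. The cost-sharing rule, capacities, and names are illustrative assumptions, not the paper's formulation.

```python
def best_response_collocation(demands, capacity, machine_cost, max_rounds=100):
    """Toy best-response dynamics for a collocation-style game.

    Each player i has a demand; any subset of players whose demands fit within
    `capacity` may share one fixed-cost machine, splitting `machine_cost`
    proportionally to demand.  Players repeatedly move to the machine that
    minimizes their own share; we stop when nobody wants to move (a Nash
    equilibrium of this toy game) or after `max_rounds` rounds.
    """
    assignment = list(range(len(demands)))            # start with one machine per player

    def load(machine, skip=None):
        return sum(d for j, d in enumerate(demands)
                   if assignment[j] == machine and j != skip)

    def share(i, machine):
        total = load(machine, skip=i) + demands[i]    # load if player i joined machine
        return machine_cost * demands[i] / total

    for _ in range(max_rounds):
        moved = False
        for i, d in enumerate(demands):
            best_m, best_cost = assignment[i], share(i, assignment[i])
            for m in set(assignment):
                if m != assignment[i] and load(m) + d <= capacity and share(i, m) < best_cost:
                    best_m, best_cost = m, share(i, m)
            if best_m != assignment[i]:
                assignment[i], moved = best_m, True
        if not moved:                                  # no profitable deviation remains
            return assignment
    return assignment

print(best_response_collocation(demands=[0.4, 0.5, 0.3, 0.6], capacity=1.0, machine_cost=10.0))
```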
Abstract:
To construct high performance Web servers, system builders are increasingly turning to distributed designs. An important challenge that arises in distributed Web servers is the need to direct incoming connections to individual hosts. Previous methods for connection routing have employed a centralized node which handles all incoming requests. In contrast, we propose a distributed approach, called Distributed Packet Rewriting (DPR), in which all hosts of the distributed system participate in connection routing. We argue that this approach promises better scalability and fault-tolerance than the centralized approach. We describe our implementation of four variants of DPR and compare their performance. We show that DPR provides performance comparable to centralized alternatives, measured in terms of throughput and delay under the SPECweb96 benchmark. Finally, we argue that DPR is particularly attractive both for small scale systems and for systems following the emerging trend toward increasingly intelligent I/O subsystems.
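The four DPR variants are not described in the abstract; the sketch below only illustrates the general flavor of distributed connection routing, where every host applies the same stateless hash of the client address to decide which back-end owns a connection and rewrites or forwards packets accordingly. Host addresses and function names are hypothetical.

```python
import hashlib

HOSTS = ["10.0.0.1", "10.0.0.2", "10.0.0.3"]   # hypothetical back-end server addresses

def owner_of(src_ip: str, src_port: int, hosts=HOSTS) -> str:
    """Stateless mapping of a connection to a back-end host: every host computes
    the same hash of the client address, so any of them can decide where a
    packet belongs without central coordination."""
    digest = hashlib.sha1(f"{src_ip}:{src_port}".encode()).digest()
    return hosts[int.from_bytes(digest[:4], "big") % len(hosts)]

def route_packet(local_host: str, src_ip: str, src_port: int) -> str:
    target = owner_of(src_ip, src_port)
    if target == local_host:
        return f"deliver locally on {local_host}"
    return f"rewrite destination to {target} and forward"

print(route_packet("10.0.0.1", "192.168.1.77", 51532))
```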
Abstract:
In many networked applications, independent caching agents cooperate by servicing each other's miss streams, without revealing the operational details of the caching mechanisms they employ. Inference of such details could be instrumental for many other processes. For example, it could be used for optimized forwarding (or routing) of one's own miss stream (or content) to available proxy caches, or for making cache-aware resource management decisions. In this paper, we introduce the Cache Inference Problem (CIP) as that of inferring the characteristics of a caching agent, given the miss stream of that agent. While CIP is not solvable in its most general form, there are special cases of practical importance in which it is, including the case in which the request stream follows an Independent Reference Model (IRM) with a generalized power-law (GPL) demand distribution. For these cases, we design two basic "litmus" tests that are able to detect LFU and LRU replacement policies, the effective size of the cache and of the object universe, and the skewness of the GPL demand for objects. Using extensive experiments under synthetic as well as real traces, we show that our methods infer such characteristics accurately and quite efficiently, and that they remain robust even when the IRM/GPL assumptions do not hold, and even when the underlying replacement policies are not "pure" LFU or LRU. We exemplify the value of our inference framework by considering example applications.
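The paper's actual litmus tests are not reproduced here; the toy simulation below only illustrates the setting: an IRM request stream with power-law popularity is fed to an LRU cache, and the resulting miss stream is inspected for a signature (how often the most popular objects still miss) that would look different under LFU, which effectively pins the most popular objects once warm. All parameters are illustrative.

```python
import random
from collections import OrderedDict

def zipf_stream(n_objects, skew, length, rng):
    """Draw an IRM request stream with generalized power-law (Zipf-like) popularity."""
    weights = [1.0 / (r ** skew) for r in range(1, n_objects + 1)]
    return rng.choices(range(n_objects), weights=weights, k=length)

def lru_miss_stream(requests, cache_size):
    """Simulate an LRU cache and return the sequence of missed object ids."""
    cache, misses = OrderedDict(), []
    for obj in requests:
        if obj in cache:
            cache.move_to_end(obj)
        else:
            misses.append(obj)
            cache[obj] = True
            if len(cache) > cache_size:
                cache.popitem(last=False)      # evict least recently used
    return misses

rng = random.Random(0)
reqs = zipf_stream(n_objects=500, skew=0.8, length=50_000, rng=rng)
misses = lru_miss_stream(reqs, cache_size=100)
# Crude "litmus"-style signal: under LFU the most popular objects would (after
# warm-up) almost never appear in the miss stream; under LRU they still do.
top = set(range(20))
print(f"share of misses from the 20 most popular objects: "
      f"{sum(1 for o in misses if o in top) / len(misses):.3f}")
```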
Abstract:
The primary focus of this thesis was the asymmetric peroxidation of α,β-unsaturated aldehydes and the development of this methodology towards the synthesis of bioactive chiral 1,2-dioxane and 1,2-dioxolane rings. Chapter 1 reviews the new and improved methods reported over the last decade for the acyclic introduction of peroxide functionality into substrates. These include a detailed examination of metal-mediated transformations, chiral peroxidation by organocatalytic means and improvements in the methodology of well-established peroxidation pathways. The second chapter discusses the method by which peroxidation of our various substrates was attempted and the optimisation studies associated with these reactions. The method by which the enantioselectivity of our β-peroxyaldehydes was determined is also reviewed. Chapters 3 and 4 focus on improving the enantioselectivity of our asymmetric peroxidation reaction. A comprehensive analysis explored the effect of solvent, concentration and temperature on enantioselectivity. The effect that different catalytic systems have on enantioselectivity and reactivity was also investigated in depth. Chapter 5 details the various transformations that β-peroxyaldehydes can undergo and how these transformations were manipulated to establish several routes to chiral 1,2-dioxane and 1,2-dioxolane rings. Chapter 6 details the full experimental procedures, including spectroscopic and analytical data for the compounds prepared during this research.
Abstract:
As part of the "free-from" trend, biopreservation of bread products has become increasingly important to prevent spoilage, since artificial preservatives are more and more rejected by consumers. A literature review conducted as part of this thesis revealed that the evaluation of more suitable antifungal strains of lactic acid bacteria (LAB) is important. Moreover, increasing the knowledge about the origin of the antifungal effect is fundamental for further enhancement of biopreservation. This thesis investigates Lactobacillus amylovorus DSM19280, Lb. brevis R2 and Lb. reuteri R29 for biopreservation, using in vitro trials and in situ sourdough fermentations of quinoa, rice and wheat flours applied as biopreservatives in breads. Their contribution to the quality and shelf-life extension of bread was compared and related to their metabolic activity and substrate features. Moreover, the quantity of antifungal carboxylic acids produced during sourdough fermentation was analysed. Overall, a specific profile of antifungal compounds was found in the sourdough samples, differing with strain and substrate. The best preservative effect in quinoa sourdough and wheat sourdough bread was achieved when Lb. amylovorus DSM19280-fermented sourdough was used. However, the concentrations of the antifungal compounds found in these biopreservatives were much lower than those produced by Lb. reuteri R29, the highest producer. Nevertheless, the artificial application of the highest concentration of these antifungal compounds in chemically acidified wheat sourdough bread resulted in a longer shelf life than that achieved by acidifying the dough alone. This evidences their partial contribution to the antifungal activity and their synergy. Additionally, an HRGC/MS method for the identification and quantification of the antifungal active compounds cyclo(Leu-Pro), cyclo(Pro-Pro), cyclo(Met-Pro) and cyclo(Phe-Pro) was successfully developed using stable isotope dilution assays with the deuterated counterparts. It was observed that the concentrations of cyclo(Leu-Pro), cyclo(Pro-Pro), and cyclo(Phe-Pro) increased only moderately in MRS-broth and wort fermentations through the activity of the selected microorganisms, whereas the concentration of cyclo(Met-Pro) remained unchanged.
Abstract:
The use of stem cells for tissue regeneration and repair is advancing both at the bench and at the bedside. Stem cells isolated from bone marrow are currently being tested for their therapeutic potential in a variety of clinical conditions including cardiovascular injury, kidney failure, cancer, and neurological and bone disorders. Despite the advantages, stem cell therapy is still limited by low survival, engraftment, and homing to the damaged area, as well as by inefficiencies in differentiating into fully functional tissues. Genetic engineering of mesenchymal stem cells is being explored as a means to circumvent some of these problems. This review presents the current understanding of the use of genetically engineered mesenchymal stem cells in human disease therapy, with emphasis on genetic modifications aimed at improving survival, homing, angiogenesis, and heart function after myocardial infarction. Advancements in other disease areas are also discussed.
Abstract:
INTRODUCTION: Proteins that undergo receptor-mediated endocytosis are subject to lysosomal degradation, requiring radioiodination methods that minimize loss of radioactivity from tumor cells after this process occurs. To accomplish this, we developed the residualizing radioiodination agent N(ϵ)-(3-[(*)I]iodobenzoyl)-Lys(5)-N(α)-maleimido-Gly(1)-D-GEEEK (Mal-D-GEEEK-[(*)I]IB), which enhanced tumor uptake but also increased kidney activity and necessitated the generation of sulfhydryl moieties on the protein. The purpose of the current study was to synthesize and evaluate a new D-amino acid based agent that might avoid these potential problems. METHODS: N(α)-(3-iodobenzoyl)-(5-succinimidyloxycarbonyl)-D-EEEG (NHS-IB-D-EEEG), which contains 3 D-glutamates to provide negative charge and an N-hydroxysuccinimide function to permit conjugation to unmodified proteins, and the corresponding tin precursor were produced by solid phase peptide synthesis and subsequent conjugation with appropriate reagents. Radioiodination of the anti-HER2 antibody trastuzumab using NHS-IB-D-EEEG and Mal-D-GEEEK-IB was compared. Paired-label internalization assays on BT474 breast carcinoma cells and biodistribution studies in athymic mice bearing BT474M1 xenografts were performed to evaluate the two radioiodinated D-peptide trastuzumab conjugates. RESULTS: NHS-[(131)I]IB-D-EEEG was produced in 53.8%±13.4% yield and conjugated to trastuzumab in 39.5%±7.6% yield. Paired-label internalization assays with trastuzumab-NHS-[(131)I]IB-D-EEEG and trastuzumab-Mal-D-GEEEK-[(125)I]IB demonstrated similar intracellular trapping for both conjugates at 1 h ((131)I, 84.4%±6.1%; (125)I, 88.6%±5.2%) through 24 h ((131)I, 60.7%±6.8%; (125)I, 64.9%±6.9%). In the biodistribution experiment, tumor uptake peaked at 48 h (trastuzumab-NHS-[(131)I]IB-D-EEEG, 29.8%±3.6%ID/g; trastuzumab-Mal-D-GEEEK-[(125)I]IB, 45.3%±5.3%ID/g) and was significantly higher for (125)I at all time points. In general, normal tissue levels were lower for trastuzumab-NHS-[(131)I]IB-D-EEEG, with the differences being greatest in the kidneys ((131)I, 2.2%±0.4%ID/g; (125)I, 16.9%±2.8%ID/g at 144 h). CONCLUSION: NHS-[(131)I]IB-D-EEEG warrants further evaluation as a residualizing radioiodination agent for labeling internalizing antibodies/fragments, particularly for applications where excessive renal accumulation could be problematic.