925 results for spray schedules


Relevance: 10.00%

Abstract:

Master's dissertation presented to Universidade Fernando Pessoa in partial fulfilment of the requirements for the degree of Master in Psychology.

Relevance: 10.00%

Abstract:

Speculative Concurrency Control (SCC) [Best92a] is a new concurrency control approach especially suited for real-time database applications. It relies on the use of redundancy to ensure that serializable schedules are discovered and adopted as early as possible, thus increasing the likelihood of the timely commitment of transactions with strict timing constraints. In [Best92b], SCC-nS, a generic algorithm that characterizes a family of SCC-based algorithms, was described, and its correctness established by showing that it admits only serializable histories. In this paper, we evaluate the performance of the Two-Shadow SCC algorithm (SCC-2S), a member of the SCC-nS family, which is notable for its minimal use of redundancy. In particular, we show that SCC-2S (as a representative of SCC-based algorithms) provides significant performance gains over the widely used Optimistic Concurrency Control with Broadcast Commit (OCC-BC) under a variety of operating conditions and workloads.

Relevance: 10.00%

Abstract:

This paper presents an algorithm which extends the relatively new notion of speculative concurrency control by delaying the commitment of transactions, thus allowing other conflicting transactions to continue execution and commit rather than restart. This algorithm propagates uncommitted data to other outstanding transactions thus allowing more speculative schedules to be considered. The algorithm is shown always to find a serializable schedule, and to avoid cascading aborts. Like speculative concurrency control, it considers strictly more schedules than traditional concurrency control algorithms. Further work is needed to determine which of these speculative methods performs better on actual transaction loads.

Relevance: 10.00%

Abstract:

In this paper, we propose a new class of Concurrency Control Algorithms that is especially suited for real-time database applications. Our approach relies on the use of (potentially) redundant computations to ensure that serializable schedules are found and executed as early as possible, thus increasing the chances of a timely commitment of transactions with strict timing constraints. Due to its nature, we term our concurrency control algorithms Speculative. The aforementioned description encompasses many algorithms that we collectively call Speculative Concurrency Control (SCC) algorithms. SCC algorithms combine the advantages of both Pessimistic and Optimistic Concurrency Control (PCC and OCC) algorithms, while avoiding their disadvantages. On the one hand, SCC resembles PCC in that conflicts are detected as early as possible, thus making alternative schedules available in a timely fashion in case they are needed. On the other hand, SCC resembles OCC in that it allows conflicting transactions to proceed concurrently, thus avoiding unnecessary delays that may jeopardize their timely commitment.
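The shadow idea behind SCC can be illustrated with a small toy sketch. This is a schematic illustration, not the paper's algorithm: it only shows how keeping a speculative shadow, forked at the first detected conflict, lets a transaction resume from the conflict point instead of restarting from scratch when optimistic validation fails. All class and method names below are invented for illustration.

```python
class TwoShadowTxn:
    """Toy model of a transaction with an optimistic shadow plus one
    speculative shadow checkpointed at the first detected conflict."""

    def __init__(self, steps):
        self.steps = steps          # total operations in the transaction
        self.done = 0               # progress of the optimistic shadow
        self.checkpoint = None      # speculative shadow's resume point

    def execute(self, conflict_at=None):
        """Run the optimistic shadow to completion; if a conflict is
        detected along the way, fork a speculative shadow there."""
        for i in range(self.done, self.steps):
            if conflict_at is not None and i == conflict_at and self.checkpoint is None:
                self.checkpoint = i  # fork the speculative shadow here
            self.done = i + 1

    def abort_and_resume(self):
        """On validation failure, resume from the speculative shadow
        (the conflict point) rather than from step 0."""
        self.done = self.checkpoint if self.checkpoint is not None else 0
        return self.done

txn = TwoShadowTxn(steps=10)
txn.execute(conflict_at=6)       # conflict observed at step 6
resume = txn.abort_and_resume()  # resume point is 6, not 0
```

The pure-OCC analogue would have no checkpoint and always restart at step 0, which is the redundancy/restart trade-off the abstract describes.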

Relevance: 10.00%

Abstract:

Commonly, research work in routing for delay tolerant networks (DTN) assumes that node encounters are predestined, in the sense that they are the result of unknown, exogenous processes that control the mobility of these nodes. In this paper, we argue that for many applications such an assumption is too restrictive: while the spatio-temporal coordinates of the start and end points of a node's journey are determined by exogenous processes, the specific path that a node may take in space-time, and hence the set of nodes it may encounter, could be controlled in such a way as to improve the performance of DTN routing. To that end, we consider a setting in which each mobile node is governed by a schedule consisting of a list of locations that the node must visit at particular times. Typically, such schedules exhibit some level of slack, which could be leveraged for DTN message delivery purposes. We define the Mobility Coordination Problem (MCP) for DTNs as follows: Given a set of nodes, each with its own schedule, and a set of messages to be exchanged between these nodes, devise a set of node encounters that minimize message delivery delays while satisfying all node schedules. The MCP for DTNs is general enough that it allows us to model and evaluate some of the existing DTN schemes, including data mules and message ferries. In this paper, we show that MCP for DTNs is NP-hard and propose two detour-based approaches to solve the problem. The first (DMD) is a centralized heuristic that leverages knowledge of the message workload to suggest specific detours to optimize message delivery. The second (DNE) is a distributed heuristic that is oblivious to the message workload, and which selects detours so as to maximize node encounters. We evaluate the performance of these detour-based approaches using extensive simulations based on synthetic workloads as well as real schedules obtained from taxi logs in a major metropolitan area.
Our evaluation shows that our centralized, workload-aware DMD approach yields the best performance, in terms of message delay and delivery success ratio, and that our distributed, workload-oblivious DNE approach yields favorable performance when compared to approaches that require the use of data mules and message ferries.
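The schedule slack that both heuristics exploit can be made concrete with a small feasibility check. The function below is a hypothetical illustration (the names, units, and Euclidean travel model are assumptions, not taken from the paper): it tests whether a detour point can be inserted between two scheduled visits without violating the second visit's deadline.

```python
def detour_feasible(a, b, d, speed=1.0):
    """a, b: scheduled visits as (x, y, t); d: candidate detour (x, y).
    Feasible if travelling a -> d -> b at the given speed still meets
    b's scheduled time, i.e. the slack absorbs the extra distance."""
    def dist(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

    travel = (dist(a[:2], d) + dist(d, b[:2])) / speed
    return a[2] + travel <= b[2]

# A node due at (0, 0) at t=0 and at (10, 0) at t=15 has 5 units of
# slack; a detour through (5, 3) fits within it.
print(detour_feasible((0, 0, 0), (10, 0, 15), (5, 3)))
```

A DMD-style planner would rank feasible detours by their benefit to message delivery, while a DNE-style one would rank them by the node encounters they create.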

Relevance: 10.00%

Abstract:

Controlling the mobility pattern of mobile nodes (e.g., robots) to monitor a given field is a well-studied problem in sensor networks. In this setup, absolute control over the nodes' mobility is assumed. Apart from physical ones, no other constraints are imposed on planning the mobility of these nodes. In this paper, we address a more general version of the problem. Specifically, we consider a setting in which the mobility of each node is externally constrained by a schedule consisting of a list of locations that the node must visit at particular times. Typically, such schedules exhibit some level of slack, which could be leveraged to achieve a specific coverage distribution of a field. Such a distribution defines the relative importance of different field locations. We define the Constrained Mobility Coordination problem for Preferential Coverage (CMC-PC) as follows: given a field with a desired monitoring distribution, and a number of nodes n, each with its own schedule, we need to coordinate the mobility of the nodes in order to achieve the following two goals: 1) satisfy the schedules of all nodes, and 2) attain the required coverage of the given field. We show that the CMC-PC problem is NP-complete (by reduction from the Hamiltonian Cycle problem). Then we propose TFM, a distributed heuristic to achieve field coverage that is as close as possible to the required coverage distribution. We verify the premise of TFM using extensive simulations, as well as taxi logs from a major metropolitan area. We compare TFM to the random mobility strategy, which provides a lower bound on performance. Our results show that TFM is very successful in matching the required field coverage distribution, and that it provides at least a two-fold query success ratio for queries that follow the target coverage distribution of the field.
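How closely a mobility strategy matches the required coverage distribution can be quantified with a simple divergence measure. The total-variation distance below is an illustrative choice, not necessarily the metric used in the paper's evaluation:

```python
def coverage_mismatch(target, achieved):
    """Total-variation distance between a target coverage distribution
    and the achieved one (both dicts mapping location -> probability).
    0.0 means a perfect match; 1.0 means completely disjoint coverage."""
    locs = set(target) | set(achieved)
    return 0.5 * sum(abs(target.get(l, 0.0) - achieved.get(l, 0.0)) for l in locs)

target = {"A": 0.5, "B": 0.3, "C": 0.2}    # desired monitoring distribution
achieved = {"A": 0.4, "B": 0.4, "C": 0.2}  # what the nodes actually delivered
print(coverage_mismatch(target, achieved))
```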

Relevance: 10.00%

Abstract:

Most real-time scheduling problems are known to be NP-complete. To enable accurate comparison between the schedules of heuristic algorithms and the optimal schedule, we introduce an omniscient oracle. This oracle provides schedules for periodic task sets with harmonic periods and variable resource requirements. Three different job value functions are described and implemented. Each corresponds to a different system goal. The oracle is used to examine the performance of different on-line schedulers under varying loads, including overload. We have compared the oracle against Rate Monotonic Scheduling, Statistical Rate Monotonic Scheduling, and Slack Stealing Job Admission Control Scheduling. Consistently, the oracle provides an upper bound on performance for the metric under consideration.
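For context, the classical sufficient schedulability test for Rate Monotonic Scheduling is the Liu and Layland utilization bound, sketched below. The task set is illustrative; note that for harmonic periods, as in the oracle's task sets, RMS is in fact schedulable up to full utilization, so this generic bound is conservative there.

```python
def rms_utilization_bound(n: int) -> float:
    """Liu-Layland bound: n periodic tasks are RMS-schedulable if
    their total utilization is at most n * (2**(1/n) - 1)."""
    return n * (2 ** (1.0 / n) - 1)

def rms_schedulable(tasks):
    """tasks: list of (computation_time, period) pairs.
    Returns True if the sufficient utilization test passes."""
    u = sum(c / p for c, p in tasks)
    return u <= rms_utilization_bound(len(tasks))

# Illustrative harmonic task set: utilization 0.25 + 0.125 + 0.125 = 0.5,
# below the three-task bound of about 0.78.
tasks = [(1, 4), (1, 8), (2, 16)]
print(rms_schedulable(tasks))  # True
```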

Relevance: 10.00%

Abstract:

Infant formula is often produced as an agglomerated powder using a spray drying process. Pneumatic conveying is commonly used for transporting this product within a manufacturing plant. The transient mechanical loads imposed by this process cause some of the agglomerates to disintegrate, which has implications for key quality characteristics of the formula including bulk density and wettability. This thesis used both experimental and modelling approaches to investigate this breakage during conveying. One set of conveying trials had the objective of establishing relationships between the geometry and operating conditions of the conveying system and the resulting changes in bulk properties of the infant formula upon conveying. A modular stainless steel pneumatic conveying rig was constructed for these trials. The mode of conveying and air velocity had a statistically significant effect on bulk density at the 95% level, while mode of conveying was the only factor that significantly influenced D[4,3] or wettability. A separate set of conveying experiments investigated the effect of infant formula composition, rather than the pneumatic conveying parameters, and also assessed the relationships between the mechanical responses of individual agglomerates of four infant formulae and their compositions. The bulk densities before conveying, and the forces and strains at failure of individual agglomerates, were related to the protein content. The force at failure and stiffness of individual agglomerates were strongly correlated, and generally increased with increasing protein to fat ratio, while the strain at failure decreased. Two models of breakage were developed at different scales; the first was a detailed discrete element model of a single agglomerate. This was calibrated using a novel approach based on Taguchi methods, which was shown to have considerable advantages over the basic parameter studies that are widely used.
The data obtained using this model compared well to experimental results for quasi-static uniaxial compression of individual agglomerates. The model also gave adequate results for dynamic loading simulations. A probabilistic model of pneumatic conveying was also developed; this was suitable for predicting breakage in large populations of agglomerates and was highly versatile: parts of the model could easily be substituted by the researcher according to their specific requirements.

Relevance: 10.00%

Abstract:

This study has considered the optimisation of granola breakfast cereal manufacturing processes by wet granulation and pneumatic conveying. Granola is an aggregated food product used as a breakfast cereal and in cereal bars. Processing of granola involves mixing the dry ingredients (typically oats, nuts, etc.) followed by the addition of a binder which can contain honey, water and/or oil. In this work, the design and operation of two parallel wet granulation processes to produce aggregate granola products were investigated: a) a high shear mixing granulation process followed by drying/toasting in an oven; and b) a continuous fluidised bed followed by drying/toasting in an oven. In high shear granulation, the influence of process parameters on key granule aggregate quality attributes such as granule size distribution and textural properties of granola was investigated. The experimental results show that the impeller rotational speed is the single most important process parameter influencing granola physical and textural properties. Binder addition rate and wet massing time also showed significant impacts on granule properties. Increasing the impeller speed and wet massing time increases the median granule size while also presenting a positive correlation with density. The combination of high impeller speed and low binder addition rate resulted in granules with the highest levels of hardness and crispness. In the fluidised bed granulation process, the effect of nozzle air pressure and binder spray rate on key aggregate quality attributes was studied. The experimental results show that a decrease in nozzle air pressure leads to a larger mean granule size. The combination of the lowest nozzle air pressure and lowest binder spray rate results in granules with the highest levels of hardness and crispness. Overall, the high shear granulation process led to larger, denser, less porous and stronger (less likely to break) aggregates than the fluidised bed process.
The study also examined the particle breakage of granola during pneumatic conveying, for product made by both the high shear granulation and the fluidised bed granulation processes. Products were pneumatically conveyed in a purpose-built conveying rig designed to mimic product conveying and packaging. Three different conveying rig configurations were employed: a straight pipe, a rig with two 45° bends, and one with a 90° bend. Particle breakage increases with applied pressure drop, and the 90° bend pipe results in more attrition at all conveying velocities relative to the other pipe geometries. Additionally, for the granules produced in the high shear granulator, those produced at the highest impeller speed, while being the largest, also have the lowest levels of proportional breakage, while smaller granules produced at the lowest impeller speed have the highest levels of breakage. This effect clearly shows the importance of shear history (during granule production) on breakage during subsequent processing. For the fluidised bed granulation, no single operating parameter was deemed to have a significant effect on breakage during subsequent conveying. Finally, a simple power law breakage model based on process input parameters was developed for both manufacturing processes. It was found suitable for predicting the breakage of granola breakfast cereal at various applied air velocities using a number of pipe configurations, taking shear histories into account.
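A power law breakage model of the kind described can be fitted by linear least squares in log-log space. The sketch below assumes a single-variable model B = k·v^n in air velocity; the thesis's actual model form and data are not reproduced here, and the sample values are hypothetical.

```python
import math

def fit_power_law(xs, ys):
    """Fit y = k * x**n by linear least squares in log-log space.
    Returns (k, n)."""
    lx = [math.log(x) for x in xs]
    ly = [math.log(y) for y in ys]
    m = len(xs)
    sx, sy = sum(lx), sum(ly)
    sxx = sum(v * v for v in lx)
    sxy = sum(a * b for a, b in zip(lx, ly))
    n = (m * sxy - sx * sy) / (m * sxx - sx * sx)  # slope -> exponent
    k = math.exp((sy - n * sx) / m)                # intercept -> prefactor
    return k, n

# Hypothetical breakage fractions at several conveying air velocities
velocities = [10.0, 15.0, 20.0, 25.0]  # m/s (illustrative)
breakage = [0.02, 0.05, 0.09, 0.15]    # mass fraction broken (illustrative)
k, n = fit_power_law(velocities, breakage)
```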

Relevance: 10.00%

Abstract:

Infant milk formula (IMF) is fortified milk with a composition based on the nutrient content of human mother's milk, 0 to 6 months postpartum. Extensive medical and clinical research has led to advances in the nutritional quality of infant formula; however, relatively few studies have focused on interactions between nutrients and the manufacturing process. The objective of this research was to investigate the impact of composition and processing parameters on the physical behaviour of high dry matter (DM) IMF systems, with a view to designing more sustainable manufacturing processes. The study showed that commercial IMF with similar compositions, manufactured by different processes, had markedly different physical properties in the dehydrated or reconstituted state. Commercial products made with hydrolysed protein were more heat stable than products made with intact protein; however, emulsion quality was compromised. Heat-induced denaturation of whey proteins resulted in increased viscosity of wet-mixes, an effect that was dependent on both whey concentration and interactions with lactose and caseins. Expanding on fundamental laboratory studies, a novel high velocity steam injection process was developed whereby high DM (60%) wet-mixes with lower denaturation/viscosity than conventional processes could be achieved; powders produced using this process were of similar quality to those manufactured conventionally. Hydrolysed proteins were also shown to be an effective way of reducing viscosity in heat-treated high DM wet-mixes. In particular, using a whey protein concentrate in which β-Lactoglobulin was selectively hydrolysed, i.e., α-Lactalbumin remained intact, reduced the viscosity of wet-mixes during processing while still providing good emulsification. The thesis provides new insights into interactions between nutrients and/or processing which influence the physical stability of IMF both in concentrated liquid and powdered form.
The outcomes of the work have applications in areas such as increasing the DM content of spray drier feeds in order to save energy, and controlling final powder quality.

Relevance: 10.00%

Abstract:

This work illustrates the influence of wind forecast errors on system costs, wind curtailment and generator dispatch in a system with high wind penetration. Realistic wind forecasts of different specified accuracy levels are created using an auto-regressive moving average model and these are then used in the creation of day-ahead unit commitment schedules. The schedules are generated for a model of the 2020 Irish electricity system with 33% wind penetration using both stochastic and deterministic approaches. Improvements in wind forecast accuracy are demonstrated to deliver: (i) clear savings in total system costs for deterministic and, to a lesser extent, stochastic scheduling; (ii) a decrease in the level of wind curtailment, with close agreement between stochastic and deterministic scheduling; and (iii) a decrease in the dispatch of open cycle gas turbine generation, evident with deterministic, and to a lesser extent, with stochastic scheduling.
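Synthetic forecast-error series of a specified accuracy can be generated with a low-order autoregressive process. The sketch below uses a plain AR(1) model scaled to a target standard deviation; the paper's actual ARMA specification and parameter values are not given here, so `phi` and `target_std` are purely illustrative.

```python
import random

def synthetic_forecast_errors(n, phi=0.9, target_std=0.1, seed=42):
    """Generate n autocorrelated forecast errors via an AR(1) process
    e_t = phi * e_{t-1} + w_t, with the white-noise variance chosen so
    the stationary standard deviation equals target_std."""
    rng = random.Random(seed)
    # stationary variance of AR(1): sigma_w^2 / (1 - phi^2)
    sigma_w = target_std * (1 - phi ** 2) ** 0.5
    errors, e = [], 0.0
    for _ in range(n):
        e = phi * e + rng.gauss(0.0, sigma_w)
        errors.append(e)
    return errors

# Errors of varying target_std could then be added to an actual wind
# trace to produce forecasts of different specified accuracy levels.
errs = synthetic_forecast_errors(1000, phi=0.9, target_std=0.1)
```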

Relevance: 10.00%

Abstract:

PURPOSE: To compare the efficacy of paclitaxel versus doxorubicin given as single agents in first-line therapy of advanced breast cancer (primary end point, progression-free survival [PFS]) and to explore the degree of cross-resistance between the two agents. PATIENTS AND METHODS: Three hundred thirty-one patients were randomized to receive either paclitaxel 200 mg/m(2), 3-hour infusion every 3 weeks, or doxorubicin 75 mg/m(2), intravenous bolus every 3 weeks. Seven courses were planned unless progression or unacceptable toxicity occurred before the seven courses were finished. Patients who progressed within the seven courses underwent early cross-over to the alternative drug, while a delayed cross-over was optional for the remainder of patients at the time of disease progression. RESULTS: Objective response in first-line therapy was significantly better (P =.003) for doxorubicin (response rate [RR], 41%) than for paclitaxel (RR, 25%), with doxorubicin achieving a longer median PFS (7.5 months for doxorubicin v 3.9 months for paclitaxel, P <.001). In second-line therapy, cross-over to doxorubicin (91 patients) and to paclitaxel (77 patients) gave response rates of 30% and 16%, respectively. The median survival durations of 18.3 months for doxorubicin and 15.6 months for paclitaxel were not significantly different (P =.38). The doxorubicin arm had greater toxicity, but this was counterbalanced by better symptom control. CONCLUSION: At the dosages and schedules used in the present study, doxorubicin achieves better disease and symptom control than paclitaxel in first-line treatment. Doxorubicin and paclitaxel are not totally cross-resistant, which supports further investigation of these drugs in combination or in sequence, both in advanced disease and in the adjuvant setting.

Relevance: 10.00%

Abstract:

We have performed a retrospective analysis to evaluate the impact of age, using a 70 year cutoff, on the safety and efficacy of pegylated liposomal doxorubicin (Caelyx) given at 60 mg/m(2) every 6 weeks (treatment A) or 50 mg/m(2) every 4 weeks (treatment B) to 136 metastatic breast cancer patients in two EORTC trials, of whom 65 were 70 years of age or older. No difference in terms of toxicity was observed between younger and older patients treated with the 4-week schedule, while a higher incidence of hematological toxicity, anorexia, asthenia, and stomatitis was observed in older patients when the 6-week schedule was used. Antitumor activity was not affected by age. In the older cohort of patients, no dependence was found between the incidence of grade 3-4 toxicity or antitumor activity and patients' baseline performance status, number and severity of comorbidities, or number of concomitant medications. The higher therapeutic index of Caelyx 50 mg/m(2) every 4 weeks makes it, of the two dose schedules investigated, the preferred regimen in the elderly.

Relevance: 10.00%

Abstract:

Droplet size and dynamics of blended palm oil-based fatty acid methyl ester (FAME) and diesel oil spray were mechanistically investigated using phase Doppler anemometry. A two-fluid atomizer was applied to disperse viscous blends of biodiesel oil at designated flow rates. It was experimentally found that the atomizer could generate a spray of large droplets, with Sauter mean diameters of ca. 30 μm, at low air injection pressure. Such large droplets traveled with a low velocity along their trajectory after emerging from the nozzle tip. The viscosity of blended biodiesel could significantly affect the atomizing process, resulting in a controlled droplet size distribution. Blended biodiesel with a certain fraction of palm oil-based FAME would be consistently atomized owing to its low viscosity. However, viscosity could exert only a small effect on the droplet velocity profile at air injection pressures above 0.2 MPa.
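The Sauter mean diameter quoted above is the standard volume-to-surface mean, D32 = Σd³/Σd². A minimal computation over a sample of droplet diameters such as a PDA instrument would report (the sample values here are illustrative, not measured data):

```python
def sauter_mean_diameter(diameters):
    """D32 = sum(d^3) / sum(d^2): the diameter of a droplet whose
    volume-to-surface-area ratio equals that of the whole spray."""
    num = sum(d ** 3 for d in diameters)
    den = sum(d ** 2 for d in diameters)
    return num / den

# Illustrative droplet diameters in micrometres
drops = [12.0, 25.0, 31.0, 40.0, 18.0]
d32 = sauter_mean_diameter(drops)
```

Because of the cubic weighting, D32 is dominated by the larger droplets, which is why it is the preferred mean for combustion-relevant spray characterisation.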

Relevance: 10.00%

Abstract:

Drop size and velocity distribution in a spray of fuel play an important role in determining combustion efficiency. Phase Doppler anemometry (PDA) is a well-established technique allowing simultaneous measurement of the velocity and size of droplets. In this work, the effect of the bio-substitute component on the size and velocity of biodiesel droplets generated by a two-fluid nozzle is investigated comprehensively using a PDA.