970 results for Sessile Drop


Relevance: 10.00%

Abstract:

The congestion control mechanisms of TCP make it vulnerable in an environment where flows with different congestion-sensitivity compete for scarce resources. With the increasing amount of unresponsive UDP traffic in today's Internet, new mechanisms are needed to enforce fairness in the core of the network. We propose a scalable Diffserv-like architecture, where flows with different characteristics are classified into separate service queues at the routers. Such class-based isolation provides protection so that flows with different characteristics do not negatively impact one another. In this study, we examine different aspects of UDP and TCP interaction and possible gains from segregating UDP and TCP into different classes. We also investigate the utility of further segregating TCP flows into two classes: a class of short flows and a class of long flows. Results are obtained analytically for both Tail-drop and Random Early Drop (RED) routers. Class-based isolation has the following salient features: (1) better fairness, (2) improved predictability for all kinds of flows, (3) lower transmission delay for delay-sensitive flows, and (4) better control over Quality of Service (QoS) of a particular traffic type.
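The class-based isolation described above can be sketched in a few lines: packets are classified by protocol into separate Tail-drop queues, so an unresponsive UDP burst exhausts only its own class's buffer. This is an illustrative sketch, not the paper's implementation; the class names and queue capacities are assumptions.

```python
from collections import deque

# Hypothetical per-class FIFO queues with independent capacities (Tail-drop).
# Class names and capacities are illustrative, not taken from the paper.
QUEUES = {"tcp": deque(), "udp": deque()}
CAPACITY = {"tcp": 4, "udp": 4}

def enqueue(packet):
    """Classify by protocol; drop the packet if its class queue is full."""
    q = QUEUES[packet["proto"]]
    if len(q) >= CAPACITY[packet["proto"]]:
        return False          # Tail-drop within the class only
    q.append(packet)
    return True

# A UDP burst fills only the UDP queue; TCP packets are still accepted.
for i in range(10):
    enqueue({"proto": "udp", "seq": i})
accepted = enqueue({"proto": "tcp", "seq": 0})
print(accepted)   # True: TCP is isolated from the UDP overload
```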

Relevance: 10.00%

Abstract:

The advent of virtualization and cloud computing technologies necessitates the development of effective mechanisms for the estimation and reservation of resources needed by content providers to deliver large numbers of video-on-demand (VOD) streams through the cloud. Unfortunately, capacity planning for the QoS-constrained delivery of a large number of VOD streams is inherently difficult, as VBR encoding schemes exhibit significant bandwidth variability. In this paper, we present a novel resource management scheme to make such allocation decisions using a mixture of per-stream reservations and an aggregate reservation, shared across all streams to accommodate peak demands. The shared reservation provides capacity slack that enables statistical multiplexing of peak rates, while assuring analytically bounded frame-drop probabilities, which can be adjusted by trading off buffer space (and consequently delay) and bandwidth. Our two-tiered bandwidth allocation scheme enables the delivery of any set of streams with less bandwidth (or equivalently with higher link utilization) than state-of-the-art deterministic smoothing approaches. The algorithm underlying our proposed framework uses three per-stream parameters and is linear in the number of servers, making it particularly well suited for use in an online setting. We present results from extensive trace-driven simulations, which confirm the efficiency of our scheme especially for small buffer sizes and delay bounds, and which underscore the significant realizable bandwidth savings, typically yielding losses that are an order of magnitude or more below our analytically derived bounds.
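A minimal sketch of the two-tiered idea, under assumed numbers: each stream holds a per-stream reservation at a base rate, while excursions up to its peak rate draw from a single shared slack reservation. The function names and rates below are hypothetical, not taken from the paper.

```python
# Illustrative two-tier reservation check; all rates (Mbps) are made up.
def link_capacity_needed(streams, shared_slack):
    """Total reservation: sum of per-stream base rates plus the shared slack."""
    return sum(s["base"] for s in streams) + shared_slack

def peak_demand_fits(streams, shared_slack, active_peaks):
    """active_peaks: indices of streams simultaneously at their peak rate."""
    excess = sum(streams[i]["peak"] - streams[i]["base"] for i in active_peaks)
    return excess <= shared_slack

streams = [{"base": 2.0, "peak": 6.0} for _ in range(10)]
slack = 10.0  # shared reservation sized to multiplex a few simultaneous peaks

print(link_capacity_needed(streams, slack))       # 30.0, vs 60.0 if every stream reserved its peak
print(peak_demand_fits(streams, slack, [0, 1]))   # True: two peaks multiplexed into the slack
print(peak_demand_fits(streams, slack, range(5))) # False: too many simultaneous peaks
```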

Relevance: 10.00%

Abstract:

The majority of the traffic (bytes) flowing over the Internet today has been attributed to the Transmission Control Protocol (TCP). This strong presence of TCP has recently spurred further investigations into its congestion avoidance mechanism and its effect on the performance of short and long data transfers. At the same time, the rising interest in enhancing Internet services while keeping the implementation cost low has led to several service-differentiation proposals. In such service-differentiation architectures, much of the complexity is placed only in access routers, which classify and mark packets from different flows. Core routers can then allocate enough resources to each class of packets so as to satisfy delivery requirements, such as predictable (consistent) and fair service. In this paper, we investigate the interaction between short and long TCP flows, and how TCP service can be improved by employing a low-cost service-differentiation scheme. Through control-theoretic arguments and extensive simulations, we show the utility of isolating TCP flows into two classes based on their lifetime/size, namely one class of short flows and another of long flows. With such class-based isolation, short and long TCP flows have separate service queues at routers. This protects each class of flows from the other, as they possess different characteristics, such as burstiness of arrivals/departures and congestion/sending window dynamics. We show the benefits of isolation, in terms of better predictability and fairness, over traditional shared queueing systems with both tail-drop and Random-Early-Drop (RED) packet dropping policies.
The proposed class-based isolation of TCP flows has several advantages: (1) the implementation cost is low since it only requires core routers to maintain per-class (rather than per-flow) state; (2) it promises to be an effective traffic engineering tool for improved predictability and fairness for both short and long TCP flows; and (3) stringent delay requirements of short interactive transfers can be met by increasing the amount of resources allocated to the class of short flows.
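The lifetime/size-based classification could be sketched as follows, assuming a per-flow byte counter maintained at the access router; the threshold value and the names used here are illustrative, not from the paper:

```python
# Hypothetical edge-router classifier: a flow is marked "short" until it
# exceeds a size threshold, after which its packets are remarked as "long".
THRESHOLD_BYTES = 20_000   # illustrative cutoff, not taken from the paper

bytes_seen = {}            # per-flow byte counter kept at the access router

def classify(flow_id, pkt_len):
    """Return the class to mark this packet with, updating the flow's counter."""
    bytes_seen[flow_id] = bytes_seen.get(flow_id, 0) + pkt_len
    return "short" if bytes_seen[flow_id] <= THRESHOLD_BYTES else "long"

print(classify("f1", 1500))    # short: the flow is still small
for _ in range(20):
    classify("f1", 1500)
print(classify("f1", 1500))    # long: the flow has exceeded the threshold
```

Core routers then need only per-class state: every packet marked "short" goes to the short-flow queue, every packet marked "long" to the long-flow queue.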

Relevance: 10.00%

Abstract:

This study considered the optimisation of granola breakfast cereal manufacturing processes by wet granulation and pneumatic conveying. Granola is an aggregated food product used as a breakfast cereal and in cereal bars. Processing of granola involves mixing the dry ingredients (typically oats, nuts, etc.) followed by the addition of a binder, which can contain honey, water and/or oil. In this work, the design and operation of two parallel wet granulation processes to produce aggregate granola products were investigated: (a) a high shear mixing granulation process followed by drying/toasting in an oven, and (b) a continuous fluidised bed granulation process followed by drying/toasting in an oven. In high shear granulation, the influence of process parameters on key granule aggregate quality attributes, such as granule size distribution and textural properties of granola, was investigated. The experimental results show that the impeller rotational speed is the single most important process parameter influencing granola physical and textural properties; binder addition rate and wet massing time also have significant effects on granule properties. Increasing the impeller speed and wet massing time increases the median granule size and also shows a positive correlation with density. The combination of high impeller speed and low binder addition rate resulted in granules with the highest levels of hardness and crispness. In the fluidised bed granulation process, the effects of nozzle air pressure and binder spray rate on key aggregate quality attributes were studied. The experimental results show that a decrease in nozzle air pressure leads to a larger mean granule size. The combination of the lowest nozzle air pressure and lowest binder spray rate results in granules with the highest levels of hardness and crispness. Overall, the high shear granulation process led to larger, denser, less porous and stronger (less likely to break) aggregates than the fluidised bed process.
The study also examined the particle breakage of granola during pneumatic conveying of product from both the high shear granulation and the fluidised bed granulation processes. Products were pneumatically conveyed in a purpose-built conveying rig designed to mimic product conveying and packaging. Three different conveying rig configurations were employed: a straight pipe, a rig consisting of two 45° bends, and one with a 90° bend. Particle breakage increases with applied pressure drop, and the 90° bend pipe results in more attrition at all conveying velocities relative to the other pipe geometries. Additionally, for the granules produced in the high shear granulator, those produced at the highest impeller speed, while being the largest, also have the lowest levels of proportional breakage, while smaller granules produced at the lowest impeller speed have the highest levels of breakage. This effect clearly shows the importance of shear history (during granule production) on breakage during subsequent processing. For the fluidised bed granulation, no single operating parameter was deemed to have a significant effect on breakage during subsequent conveying. Finally, a simple power law breakage model based on process input parameters was developed for both manufacturing processes. It was found suitable for predicting the breakage of granola breakfast cereal at various applied air velocities using a number of pipe configurations, taking shear histories into account.
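A power law breakage model of the kind described can be sketched as below; the constants k and n are placeholders that would in practice be fitted to the conveying data for each manufacturing process and pipe configuration:

```python
# Sketch of a simple power-law breakage form. The constants k and n are
# illustrative placeholders, not the fitted values from the study.
def breakage_fraction(air_velocity, k=1e-4, n=2.0):
    """Predicted mass fraction broken at a given conveying air velocity (m/s)."""
    return min(1.0, k * air_velocity ** n)

for v in (10, 20, 30):
    print(v, breakage_fraction(v))
# Breakage rises steeply with velocity, consistent with the observation that
# attrition increases with applied pressure drop.
```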

Relevance: 10.00%

Abstract:

The Republic of Ireland became the first European country to implement nationwide smoke-free workplace legislation. Aims: To determine the prevalence of smoking among bar workers and compare the impact of the smoke-free workplace legislation on their smoking behaviour with that of a comparable general population sample. To approximate the influence of tobacco control measures on risk perception of second-hand smoke (SHS) among the general population. To explore the de-normalisation of smoking behaviour and the potential increased stigmatisation of smokers and their smoking. Methods: Prevalence estimates and behavioural changes were examined among a random sample of bar workers before and 1 year after the smoke-free legislation; comparisons were made with a general population sub-sample. Changes in risk knowledge related to SHS exposure were based on general population data. Qualitative interviews were conducted among a purposive sample of smokers and non-smokers four years after the implementation of the legislation. Results: Smoking prevalence was extremely high among bar workers. One year post-ban, smoking prevalence dropped among bar workers and dropped significantly among the general population, while cigarette consumption dropped significantly among bar workers. The disparity in knowledge between smokers and non-smokers of the risk associated with SHS exposure was reduced. A lack of understanding of the risk of ear infections in children posed by SHS exposure was notable. Evidence for advanced de-normalisation of smoking behaviour and intensification of stigma following the introduction of the legislation depended on many factors, with the quality of smoking facilities playing a key role. Conclusions: Ireland's smoke-free legislation was associated with a drop in prevalence and cigarette consumption. The disparity in knowledge between smokers and non-smokers of the risk posed by SHS exposure was reduced; however, knowledge of the risk of ear infections in children needs to be more effectively disseminated.
The proliferation of ‘good’ smoking areas may diminish the potential to reduce smoking behaviour and de-normalise smoking.

Relevance: 10.00%

Abstract:

The work presented in this thesis describes the development of low-cost sensing and separation devices with electrochemical detection for health applications. This research employs macro, micro and nano technology. The first sensing device developed was a toner-based micro-device. The initial development of microfluidic devices was based on glass or quartz devices that are often expensive to fabricate; however, the introduction of new types of materials, such as plastics, offered a new route for fast prototyping and the development of disposable devices. One such microfluidic device is based on the lamination of laser-printed polyester films using a computer, printer and laminator. The resulting toner-based microchips demonstrated potential viability for chemical assays, coupled with several detection methods, particularly Chip-Electrophoresis-Chemiluminescence (CE-CL) detection, which has never before been reported in the literature. Following on from the toner-based microchip, a three-electrode micro-configuration was developed on an acetate substrate. This is the first time that a micro-electrode configuration made from gold, silver and platinum has been fabricated onto acetate by means of patterning and deposition techniques, using the central fabrication facilities in Tyndall National Institute. These electrodes have been designed to facilitate the integration of a 3-electrode configuration as part of the fabrication process. Since the electrodes are on acetate, the dicing step can automatically be eliminated. The stability of these sensors has been investigated using electrochemical techniques, with excellent outcomes. Following on from the generalised testing of the electrodes, these sensors were coupled with capillary electrophoresis. The final sensing devices were on a macro scale and involved the modification of screen-printed electrodes.
Screen-printed electrodes (SPE) are generally far less sensitive than more expensive electrodes such as gold, boron-doped diamond and glassy carbon electrodes. To enhance the sensitivity of these electrodes, they were treated with metal nanoparticles (gold and palladium). Following on from this, another modification was introduced: the carbonaceous material carbon monolith was drop-cast onto the SPE, and the metal nanoparticles were then electrodeposited onto the monolith material.

Relevance: 10.00%

Abstract:

A jazz piece for solo piano and/or jazz ensemble. This piece combines traditional and contemporary approaches. It is primarily old style swing but incorporates many modern harmonic techniques/devices, as evidenced in the score, such as: the use of upper structure triads, drop two voicing techniques, and voicing in fourths.

Relevance: 10.00%

Abstract:

The child is the most precious asset and the focal point of development for any country. However, unless children are brought up in a stimulating and conducive environment, receiving the best possible care and protection, their physical, mental, emotional and social development is susceptible to permanent damage. Because Ethiopia is one of the least developed countries of the world, owing to interrelated and complex socio-economic factors including man-made and natural calamities, a large portion of our population - especially children - is victimized by social evils like famine, disease, poverty, mass displacement, lack of education and family instability. Given that children are the most vulnerable group in society and also constitute half of the population, it is evident that a considerable number of Ethiopian children are living under difficult circumstances. Therefore, as in a number of other third world countries, there are many poor, displaced, unaccompanied and orphaned children in our country. A considerable proportion of these children work on the street, with some even living entirely on the street without any adult care and protection. These children are forced onto the streets in their fight for survival. They supplement their parents' meagre income or support themselves with the small incomes they earn doing menial jobs. In doing this, street children face the danger of accidents and violence; they are exploited and abused, many are forced to drop out of school or never get the chance to be enrolled at all, and some drift into begging or petty crime. This study was undertaken mainly to update the findings of previous studies, monitor changing trends, examine new facets of the problem and gain a better understanding of the phenomenon in the country by covering at least some of the major centres where the problem is acute.
Thus, the outcome of this research can be useful in the formation of the social welfare programme of the country. Finally, in recognition of the urgency of the problem and the limited resources available, the Ministry of Labour and Social Affairs expresses appreciation to all agencies engaged in the rehabilitation of street children and prevention of the problem. The Ministry also calls for more co-operation and support between concerned governmental and non-governmental organizations in their efforts for improving the situation of street children and in curbing the overwhelming nature of the problem.

Relevance: 10.00%

Abstract:

Drop size and velocity distribution in a spray of fuel play an important role in determining combustion efficiency. The Phase Doppler anemometer (PDA) is a well-established technique allowing simultaneous measurement of the velocity and size of droplets. In this work, the effect of a bio-substitute component on the size and velocity of biodiesel droplets generated by a two-fluid nozzle is investigated comprehensively using a PDA.

Relevance: 10.00%

Abstract:

The droplet size distribution of biodiesel oil with various compositions was investigated in this work. The droplets generated by a two-fluid atomizer were measured by a commercial PDA. It was found that the viscosity of the fuel has a strong effect on the drop size distribution. Additionally, the effect of the air injection pressure applied to atomize the spray was taken into account. Shear force induced by the flow field exerts an effect on the distribution of biodiesel droplets in the atomized spray.
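As an aside on how such PDA size data are typically summarised, the sketch below computes the Sauter mean diameter D32 = Σd³/Σd² from a set of droplet diameters; the sample values are synthetic, not measurements from this work:

```python
# Reduce a set of PDA droplet-diameter samples to the Sauter mean diameter,
# the standard spray metric relating droplet volume to surface area.
def sauter_mean_diameter(diameters_um):
    num = sum(d ** 3 for d in diameters_um)   # total volume term
    den = sum(d ** 2 for d in diameters_um)   # total surface-area term
    return num / den

drops = [12.0, 18.0, 25.0, 40.0, 55.0]        # hypothetical samples, in µm
print(round(sauter_mean_diameter(drops), 1))  # → 44.3
```

Note that D32 is weighted toward the largest droplets, which is why a few large drops dominate the result.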

Relevance: 10.00%

Abstract:

In this work we introduce a new mathematical tool for optimization of routes, topology design, and energy efficiency in wireless sensor networks. We introduce a vector field formulation that models communication in the network, and routing is performed in the direction of this vector field at every location of the network. The magnitude of the vector field at every location represents the density of the data being transported through that location. We define the total communication cost in the network as the integral of a quadratic form of the vector field over the network area. With the above formulation, we introduce a mathematical machinery based on partial differential equations very similar to Maxwell's equations in electrostatic theory. We show that in order to minimize the cost, the routes should be found based on the solution of these partial differential equations. In our formulation, the sensors are sources of information, similar to the positive charges in electrostatics; the destinations are sinks of information, similar to negative charges; and the network is similar to a non-homogeneous dielectric medium with variable dielectric constant (or permittivity coefficient). In one application of our mathematical model based on vector fields, we offer a scheme for energy-efficient routing. Our routing scheme is based on setting the permittivity coefficient to a higher value in places of the network where nodes have high residual energy, and to a low value in places where the nodes do not have much energy left. Our simulations show that our method gives a significant increase in network lifetime compared to the shortest path and weighted shortest path schemes. Our initial focus is on the case where there is only one destination in the network, and later we extend our approach to the case where there are multiple destinations in the network.
In the case of having multiple destinations, we need to partition the network into several areas known as regions of attraction of the destinations. Each destination is responsible for collecting all messages being generated in its region of attraction. The complexity of the optimization problem in this case is how to define regions of attraction for the destinations and how much communication load to assign to each destination to optimize the performance of the network. We use our vector field model to solve the optimization problem for this case. We define a vector field, which is conservative, and hence it can be written as the gradient of a scalar field (also known as a potential field). Then we show that in the optimal assignment of the communication load of the network to the destinations, the value of that potential field should be equal at the locations of all the destinations. Another application of our vector field model is to find the optimal locations of the destinations in the network. We show that the vector field gives the gradient of the cost function with respect to the locations of the destinations. Based on this fact, we suggest an algorithm to be applied during the design phase of a network to relocate the destinations for reducing the communication cost function. The performance of our proposed schemes is confirmed by several examples and simulation experiments. In another part of this work we focus on the notions of responsiveness and conformance of TCP traffic in communication networks. We introduce the notion of responsiveness for TCP aggregates and define it as the degree to which a TCP aggregate reduces its sending rate to the network as a response to packet drops. We define metrics that describe the responsiveness of TCP aggregates, and suggest two methods for determining the values of these quantities. 
The first method is based on a test in which we intentionally drop a few packets from the aggregate and measure the resulting rate decrease of that aggregate. This kind of test is not robust to multiple simultaneous tests performed at different routers. We make the test robust to multiple simultaneous tests by using ideas from the CDMA approach to multiple-access channels in communication theory. Based on this approach, we introduce tests of responsiveness for aggregates, which we call the CDMA-based Aggregate Perturbation Method (CAPM). We use CAPM to perform congestion control. A distinguishing feature of our congestion control scheme is that it maintains a degree of fairness among different aggregates. In the next step we modify CAPM to offer methods for estimating the proportion of an aggregate of TCP traffic that does not conform to protocol specifications, and hence may belong to a DDoS attack. Our methods work by intentionally perturbing the aggregate, dropping a very small number of packets from it and observing the response of the aggregate. We offer two methods for conformance testing. In the first method, we apply the perturbation tests to SYN packets being sent at the start of the TCP 3-way handshake, and we use the fact that the rate of ACK packets being exchanged in the handshake should follow the rate of perturbations. In the second method, we apply the perturbation tests to the TCP data packets and use the fact that the rate of retransmitted data packets should follow the rate of perturbations. In both methods, we use signature-based perturbations, which means packet drops are performed at a rate given by a function of time. We use the analogy of our problem with multiple-access communication to find signatures. Specifically, we assign orthogonal CDMA-based signatures to different routers in a distributed implementation of our methods.
As a result of orthogonality, the performance does not degrade because of cross interference made by simultaneously testing routers. We have shown efficacy of our methods through mathematical analysis and extensive simulation experiments.
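The orthogonal-signature idea can be illustrated with a toy example: two routers perturb the same aggregate with mutually orthogonal ±1 signatures, and correlating the observed rate change against each signature recovers that router's measured responsiveness despite the overlap. The signatures and responsiveness values below are invented for illustration:

```python
# Two orthogonal +/-1 perturbation signatures (Walsh-style), one per router.
SIG_A = [+1, +1, -1, -1, +1, +1, -1, -1]
SIG_B = [+1, -1, +1, -1, +1, -1, +1, -1]   # orthogonal to SIG_A

def correlate(observed, signature):
    """Project the observed rate changes onto one router's signature."""
    return sum(o * s for o, s in zip(observed, signature)) / len(signature)

# Observed aggregate rate change = resp_a * SIG_A + resp_b * SIG_B, where
# resp_a and resp_b are the (hypothetical) responsiveness of each test.
resp_a, resp_b = 0.6, 0.2
observed = [resp_a * a + resp_b * b for a, b in zip(SIG_A, SIG_B)]

print(correlate(observed, SIG_A))   # ~0.6: router A's test, despite overlap
print(correlate(observed, SIG_B))   # ~0.2: router B's test
```

Because the signatures are orthogonal, each correlation cancels the other router's contribution, which is the property that prevents cross-interference between simultaneous tests.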

Relevance: 10.00%

Abstract:

BACKGROUND: Dropouts and missing data are nearly ubiquitous in obesity randomized controlled trials, threatening the validity and generalizability of conclusions. Herein, we meta-analytically evaluate the extent of missing data, the frequency with which various analytic methods are employed to accommodate dropouts, and the performance of multiple statistical methods. METHODOLOGY/PRINCIPAL FINDINGS: We searched PubMed and Cochrane databases (2000-2006) for articles published in English and manually searched bibliographic references. Articles on pharmaceutical randomized controlled trials with weight loss or weight gain prevention as major endpoints were included. Two authors independently reviewed each publication for inclusion. 121 articles met the inclusion criteria. Two authors independently extracted treatment, sample size, drop-out rates, study duration, and the statistical method used to handle missing data from all articles, and resolved disagreements by consensus. In the meta-analysis, drop-out rates were substantial, with the survival (non-dropout) rates being approximated by an exponential decay curve e^(-λt), where λ was estimated to be .0088 (95% bootstrap confidence interval: .0076 to .0100) and t represents time in weeks. The estimated drop-out rate at 1 year was 37%. Most studies used last observation carried forward as the primary analytic method to handle missing data. We also obtained 12 raw obesity randomized controlled trial datasets for empirical analyses. Analyses of raw randomized controlled trial data suggested that both mixed models and multiple imputation performed well, but that multiple imputation may be more robust when missing data are extensive. CONCLUSION/SIGNIFICANCE: Our analysis offers an equation for predicting dropout rates, useful for future study planning.
Our raw data analyses suggest that multiple imputation is better than other methods for handling missing data in obesity randomized controlled trials, followed closely by mixed models. We suggest these methods supplant last observation carried forward as the primary method of analysis.
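The reported fit can be reproduced directly: with survival (non-dropout) approximated by e^(-λt), λ = .0088 and t in weeks, the predicted dropout rate at one year matches the 37% figure quoted above.

```python
import math

# Survival (non-dropout) curve e^(-lambda * t) with the reported estimate
# lambda = 0.0088; t is measured in weeks.
LAMBDA = 0.0088

def dropout_rate(weeks):
    """Predicted cumulative dropout rate after the given number of weeks."""
    return 1.0 - math.exp(-LAMBDA * weeks)

print(round(dropout_rate(52), 2))   # → 0.37, the 1-year figure reported above
```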

Relevance: 10.00%

Abstract:

Data from five laboratories using five different techniques were reanalyzed to measure subjects' knowledge of events that occurred over the past 70 years. Subjects were about 20 years of age, so the measures included events that extended up to 50 years before birth. The functions relating knowledge about the events to age do not decrease precipitously at birth but gradually drop to above-chance levels. Techniques usually used to study retention within the individual can be used to study the persistence of ideas and fashions within an age cohort in a culture.

Relevance: 10.00%

Abstract:

INTRODUCTION: Adherence to glaucoma medications is essential for successful treatment of the disease but is complex and difficult for many of our patients. Health coaching has been used successfully in the treatment of other chronic diseases. This pilot study explores the use of health coaching for glaucoma care. METHODS: A mixed methods study design was used to assess the health coaching intervention for glaucoma patients. The health coaching intervention consisted of four to six health coaching sessions with a certified health coach via telephone. Quantitative measures included demographic and health information, adherence to glaucoma medications (using the visual analog adherence scale and medication event monitoring system), and an exit survey rating the experience. Qualitative measures included a precoaching health questionnaire, notes made by the coach during the intervention, and an exit interview with the subjects at the end of the study. RESULTS: Four glaucoma patients participated in the study; all derived benefits from the health coaching. Study subjects demonstrated increased adherence to glaucoma drops in response to the coaching intervention, on both the visual analog scale and the medication event monitoring system. Study subjects' qualitative feedback reflected a perceived improvement in both eye and general health self-care. The subjects stated that they would recommend health coaching to friends or family members. CONCLUSION: Health coaching was helpful to the glaucoma patients in this study; it has the potential to improve glaucoma care and overall health.

Relevance: 10.00%

Abstract:

OBJECTIVE: To assess potential diagnostic and practice barriers to successful management of massive postpartum hemorrhage (PPH), emphasizing recognition and management of contributing coagulation disorders. STUDY DESIGN: A quantitative survey was conducted to assess practice patterns of US obstetrician-gynecologists in managing massive PPH, including assessment of coagulation. RESULTS: Nearly all (98%) of the 50 obstetrician-gynecologists participating in the survey reported having encountered at least one patient with "massive" PPH in the past 5 years. Approximately half (52%) reported having previously discovered an underlying bleeding disorder in a patient with PPH, with disseminated intravascular coagulation (88%, n=23/26) being identified more often than von Willebrand disease (73%, n=19/26). All reported having used methylergonovine and packed red blood cells in managing massive PPH, while 90% reported performing a hysterectomy. A drop in blood pressure and ongoing visible bleeding were the most commonly accepted indications for rechecking a "stat" complete blood count and coagulation studies, respectively, in patients with PPH; however, 4% of respondents reported that they would not routinely order coagulation studies. Forty-two percent reported having never consulted a hematologist for massive PPH. CONCLUSION: The survey findings highlight potential areas for improved practice in managing massive PPH, including earlier and more consistent assessment, monitoring of coagulation studies, and consultation with a hematologist.