10 results for Application times
in CentAUR: Central Archive at the University of Reading - UK
Abstract:
Accelerated failure time models with a shared random component are described, and are used to evaluate the effect of explanatory factors and different transplant centres on survival times following kidney transplantation. Different combinations of the distribution of the random effects and baseline hazard function are considered and the fit of such models to the transplant data is critically assessed. A mixture model that combines short- and long-term components of a hazard function is then developed, which provides a more flexible model for the hazard function. The model can incorporate different explanatory variables and random effects in each component. The model is straightforward to fit using standard statistical software, and is shown to be a good fit to the transplant data. Copyright (C) 2004 John Wiley & Sons, Ltd.
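As a rough illustration of the class of model described above, the sketch below fits a Weibull accelerated failure time model with a shared, normally distributed random effect per transplant centre by maximum likelihood, integrating the random effect out with Gauss-Hermite quadrature. The data are simulated and the variable names and effect sizes are invented; this is not the authors' code or data, and a Weibull baseline with normal centre effects is only one of the combinations the abstract considers.

```python
# Hedged sketch: Weibull AFT model with a shared random effect per centre,
# fitted by maximum likelihood with Gauss-Hermite quadrature (illustrative only).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# --- simulate toy data: 20 centres, 30 patients each, one covariate ----------
n_centres, n_per = 20, 30
centre = np.repeat(np.arange(n_centres), n_per)
x = rng.normal(size=n_centres * n_per)               # hypothetical covariate
b_true = rng.normal(scale=0.4, size=n_centres)       # centre random effects
shape_true, beta_true = 1.3, -0.5
scale = np.exp(1.0 + beta_true * x + b_true[centre]) # AFT scale per subject
t = scale * rng.weibull(shape_true, size=x.size)
c = rng.exponential(np.median(t) * 3, size=x.size)   # censoring times
time, event = np.minimum(t, c), (t <= c).astype(float)

nodes, weights = np.polynomial.hermite.hermgauss(15) # Gauss-Hermite rule

def neg_loglik(params):
    """Negative marginal log-likelihood with the random effect integrated out."""
    intercept, beta, log_shape, log_tau = params
    k, tau = np.exp(log_shape), np.exp(log_tau)
    total = 0.0
    for c_id in range(n_centres):
        idx = centre == c_id
        ll_nodes = []
        for z in nodes:
            b = np.sqrt(2.0) * tau * z                # change of variables
            lam = np.exp(intercept + beta * x[idx] + b)
            z_t = (time[idx] / lam) ** k
            log_f = np.log(k) - k * np.log(lam) + (k - 1) * np.log(time[idx]) - z_t
            log_S = -z_t
            ll_nodes.append(np.sum(event[idx] * log_f + (1 - event[idx]) * log_S))
        ll_nodes = np.array(ll_nodes)
        m = ll_nodes.max()                            # log-sum-exp for stability
        total += m + np.log(np.sum(weights * np.exp(ll_nodes - m)) / np.sqrt(np.pi))
    return -total

fit = minimize(neg_loglik, x0=[0.0, 0.0, 0.0, -1.0], method="Nelder-Mead")
print("estimates (intercept, beta, shape, tau):",
      fit.x[0], fit.x[1], np.exp(fit.x[2]), np.exp(fit.x[3]))
```

The mixture model in the abstract, with separate short- and long-term hazard components, would extend this likelihood rather than being reproduced here.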
Abstract:
The performance benefit of Grid systems comes from several strategies, of which partitioning applications into parallel tasks is the most important. In most cases, however, the gain from partitioning is eroded by synchronization overhead, caused mainly by the high variability of the completion times of the different tasks, which in turn stems from the large heterogeneity of Grid nodes. It is therefore important to have models that capture the performance of such systems. In this paper we describe a queueing-network-based performance model able to accurately analyze Grid architectures, and we use the model to study a real parallel application executed on a Grid. The proposed model improves on classical modelling techniques and highlights the impact of resource heterogeneity and network latency on application performance.
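The synchronization effect described above can be seen even in a toy fork-join simulation: a parallel job finishes only when its slowest task does, so variability in node speed inflates the mean completion time even when the average speed is unchanged. The node rates and latency below are invented for illustration; this is not the paper's queueing-network model.

```python
# Toy Monte Carlo illustration of fork-join synchronization overhead on
# heterogeneous nodes (illustrative assumptions, not the paper's model).
import numpy as np

rng = np.random.default_rng(1)
n_tasks, n_runs = 16, 10_000
latency = 0.05                                   # assumed per-task network latency (s)

def mean_job_time(service_rates):
    """Mean completion time of a fork-join job: max over exponential task times."""
    # one task per node; task time ~ Exp(rate) plus a fixed network latency
    times = rng.exponential(1.0 / service_rates, size=(n_runs, n_tasks)) + latency
    return times.max(axis=1).mean()

homogeneous = np.full(n_tasks, 1.0)              # identical nodes, rate 1 task/s
heterogeneous = rng.uniform(0.3, 1.7, n_tasks)   # similar mean rate, high variability

print("mean job time, homogeneous nodes  :", round(mean_job_time(homogeneous), 3))
print("mean job time, heterogeneous nodes:", round(mean_job_time(heterogeneous), 3))
```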
Abstract:
As consumers demand more functionality from their electronic devices and manufacturers supply that demand, electrical power and clock requirements tend to increase; however, reassessing the system architecture can lead to suitable reductions. To maintain low clock rates and thereby reduce electrical power, this paper presents a parallel convolutional coder for the transmit side of many wireless consumer devices. The coder accepts a parallel data input and directly computes punctured convolutional codes without the need for a separate puncturing operation, while the coded bits are available at the output of the coder in parallel. Because the computation is performed in parallel, the coder can be clocked 7 times slower than the conventional shift-register-based convolutional coder (using the DVB 7/8 rate). The presented coder is directly relevant to the design of modern low-power consumer devices.
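For reference, the sketch below shows the conventional serial architecture that the abstract's parallel coder improves on: a rate-1/2 shift-register convolutional encoder (constraint length 7, generators 171/133 octal, as used in DVB) followed by a separate puncturing step to reach rate 7/8. The puncturing pattern shown is the commonly quoted DVB-S one and should be checked against the standard; the paper's parallel, puncture-free architecture is not reproduced here.

```python
# Baseline bit-serial rate-1/2 convolutional encoder with separate puncturing
# to ~7/8 (illustrative; pattern assumed from DVB-S, verify against the standard).

G1, G2 = 0o171, 0o133          # generator polynomials, constraint length K = 7
K = 7
# puncturing pattern over 7 input bits: keep the (X, Y) output where the flag is 1
PUNCTURE_X = [1, 0, 0, 0, 1, 0, 1]
PUNCTURE_Y = [1, 1, 1, 1, 0, 1, 0]

def parity(v: int) -> int:
    return bin(v).count("1") & 1

def encode_punctured(bits):
    """Encode with the rate-1/2 mother code, then puncture to rate 7/8."""
    state = 0
    out = []
    for i, b in enumerate(bits):
        state = ((state << 1) | b) & ((1 << K) - 1)    # K-bit shift register
        x, y = parity(state & G1), parity(state & G2)  # two mother-code outputs
        if PUNCTURE_X[i % 7]:
            out.append(x)
        if PUNCTURE_Y[i % 7]:
            out.append(y)
    return out

if __name__ == "__main__":
    data = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1]  # 14 input bits
    coded = encode_punctured(data)
    print(len(data), "input bits ->", len(coded), "coded bits")  # 14 -> 16, rate 7/8
```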
Abstract:
BACKGROUND AND AIM: The atherogenic potential of dietary derived lipids, chylomicrons (CM) and their remnants (CMr), is now becoming more widely recognised. To investigate factors affecting levels of CM and CMr and their importance in coronary heart disease risk, it is essential to use a specific method of quantification. Two studies were carried out to investigate: (i) effects of increased daily intake of long chain n-3 polyunsaturated fatty acid (LC n-3 PUFA), and (ii) effects of increasing meal monounsaturated fatty acid (MUFA) content on the postprandial response of intestinally derived lipoproteins. The contribution of the intestinally derived lipoproteins to total lipaemia was assessed by triacylglycerol-rich lipoprotein (TRL) apolipoprotein B-48 (apo B-48) and retinyl ester (RE) concentrations. METHODS AND RESULTS: In a randomised controlled crossover trial (placebo vs LC n-3 PUFA), a mean daily intake of 1.4 g/day of LC n-3 PUFA failed to reduce the fasting and postprandial triacylglycerol (TAG) response in 9 healthy male volunteers. Although the pattern and nature of the apo B-48 response was consistent with the TAG response following the two diets, the postprandial RE response differed on the LC n-3 PUFA diet, with a lower early RE response and a delayed and more marked increase in RE in the late postprandial period compared with the control diet, but the differences did not reach statistical significance. In the meal study there was no effect of MUFA/SFA content on the total lipaemic response to the meals, nor on the contribution of intestinally derived lipoproteins evaluated as TAG, apo B-48 and RE responses in the TRL fraction. In both studies, the RE and apo B-48 measurements provided broadly similar information with respect to the lack of effects of dietary or meal fatty acid composition and the presence of single or multiple peak responses. However, the apo B-48 and RE measurements differed with respect to the timing of their peak responses, with a delayed RE peak, relative to apo B-48, of approximately 2-3 hours for the LC n-3 PUFA diet study (p = 0.002) and 1-1.5 hours for the meal MUFA/SFA study. CONCLUSIONS: It was concluded that there are limitations to using RE as a specific CM marker; apo B-48 quantitation was found to be a more appropriate method for CM and CMr quantitation. However, it was still considered of value to measure RE as it provided additional information regarding the incorporation of other constituents into the CM particle.
Abstract:
This paper discusses how the use of computer-based modelling tools has aided the design of a telemetry unit for use with oil well logging. With the aid of modern computer-based simulation techniques, the new design is capable of operating at data rates 2.5 times faster than those of previous designs.
Abstract:
Four protocols involving the application of low pressures, either toward the end of frying or after frying, were investigated with the aim of lowering the oil content of potato chips. Protocol 1, involving frying at atmospheric pressure followed by a 3 min draining time, constituted the control. Protocol 2 involved lowering of pressure to 13.33 kPa, 40 s before the end of frying, followed by draining for 3 min at the same pressure. Protocol 3 was the same as protocol 2, except that the pressure was lowered 3 s before the end of frying. Protocol 4 involved lowering the pressure to 13.33 kPa after the product was lifted from the oil and holding it at this value over the draining time of 3 min. Protocol 4 gave a product having the lowest oil content (37.12 g oil/100 g defatted dry matter), while protocol 2 gave the product with the highest oil content (71.10 g oil/100 g defatted dry matter), followed by those obtained using protocols 1 and 3 (68.48 g oil/100 g defatted dry matter and 52.50 g oil/100 g defatted dry matter, respectively). Protocol 4 was further evaluated to study the effects of draining times and the vacuum applied, and compared with the control. It was noted that over the modest range of pressures investigated, there was no significant effect of the vacuum applied on the oil content of the product. This study demonstrates that the oil content of potato chips can be lowered significantly by combining atmospheric frying with draining under vacuum.
Abstract:
Novel oxazoline-based comb-polymers possessing linoleyl or oleic side chains have been synthesized and used to produce low viscosity coatings. Inclusion of the polymers in model paint formulations results in coatings that exhibit faster drying times than commercially available alkyd resin formulations. The comb polymers were produced from diol substituted oxazoline monomers that were synthesized through a scalable, solvent free protocol and purified by simple recrystallisation. Co-polymerisation of the oxazolines with adipic acid at 160 °C in the bulk resulted in the targeted polyester comb type polymers. The polymers were soluble in a range of organic solvents and compatible with commercial alkyd resins. Model paint formulations containing up to 40 wt% of the linoleyl-based comb polymers exhibited a dramatic reduction in viscosity (from 35 to 13 Poise at 25 °C) with increasing quantities of polymer added. Dynamic mechanical analysis (DMA) studies revealed that the drying rate of the model paint formulations containing the comb polymers was enhanced when compared with that of commercial alkyd resins.
Abstract:
In this study, the authors discuss the effective usage of technology to solve the problem of deciding on journey start times for recurrent traffic conditions. The developed algorithm guides the vehicles to travel on more reliable routes that are not easily prone to congestion or travel delays, ensures that the start time is as late as possible to avoid the traveller waiting too long at their destination, and attempts to minimise the travel time. Experiments show that in order to be more certain of reaching their destination on time, a traveller has to leave early and correspondingly arrive early, resulting in a large waiting time. The application developed here asks the user to set this certainty factor as per the task in hand, and computes the best start time and route.
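A minimal sketch of the underlying idea, assuming historical travel-time samples are available for each candidate route: for a user-chosen certainty factor, budget the corresponding travel-time quantile on each route and pick the route that allows the latest start time while still meeting the arrival deadline with that probability. The routes, travel-time distributions, and deadline below are invented; this is not the authors' algorithm.

```python
# Illustrative latest-start / route choice under a user-set certainty factor.
import numpy as np

def latest_start(routes, deadline_min, certainty):
    """Return (route, start_time) maximising start time such that
    P(start + travel_time <= deadline) >= certainty."""
    best = None
    for name, samples in routes.items():
        # travel-time quantile we must budget for at this certainty level
        budget = np.quantile(samples, certainty)
        start = deadline_min - budget
        if best is None or start > best[1]:
            best = (name, start)
    return best

# hypothetical historical travel times (minutes) for two recurrent-traffic routes
rng = np.random.default_rng(2)
routes = {
    "motorway": rng.normal(35, 12, 500).clip(min=15),   # faster but more variable
    "arterial": rng.normal(42, 4, 500).clip(min=30),    # slower but reliable
}
deadline = 9 * 60                                       # arrive by 09:00 (minutes)

for cf in (0.80, 0.95, 0.99):
    route, start = latest_start(routes, deadline, cf)
    print(f"certainty {cf:.2f}: take {route}, leave at "
          f"{int(start)//60:02d}:{int(start)%60:02d}")
```

At low certainty the faster, more variable route wins; at high certainty the reliable route allows the later departure, which is the trade-off the abstract describes.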
Abstract:
There were 338 road fatalities on Irish roads in 2007. Research in 2007 by the Road Safety Authority in Ireland states that young male drivers (17-25 years) are seven times more likely to be killed on Irish roads than other road users. The car driver fatality rate was found to be approximately 10 times higher for young male drivers than for female drivers in 2000. Young male drivers in particular demonstrate a high proclivity for risky driving behaviours. These risky behaviours include drink driving, speeding, drug-driving and engaging in aggressive driving. Speed is the single largest contributing factor to road deaths in Ireland; approximately 40% of fatal accidents are caused by excessive or inappropriate speed. This study focuses on how dangerous driving behaviours may be addressed through social marketing. It analyses the appropriate level of fear that needs to be induced in order to change young male driving behaviour.
Abstract:
Twitter has become a dependable microblogging tool for real-time information dissemination and the broadcast of newsworthy events. Its users sometimes break news on the network faster than traditional news agencies, owing to their presence at ongoing real-life events. Different topic detection methods are currently used to match Twitter posts to real-life news from mainstream media. In this paper, we analyse tweets relating to the 2012 English FA Cup final by applying our novel method, TRCM, to extract association rules present in the hashtag keywords of tweets in different time-slots. Our system identifies evolving hashtag keywords with strong association rules in each time-slot. We then map the identified hashtag keywords to event highlights of the game as reported in the ground truth of the mainstream media. The performance effectiveness measures of our experiments show that our method performs well as a Topic Detection and Tracking approach.
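The sketch below is a toy illustration, not the authors' TRCM implementation: it mines simple hashtag association rules (pair support and confidence) within each time-slot and reports the rules that persist across consecutive slots, which is the kind of signal the abstract maps to match highlights. The tweets and thresholds are invented.

```python
# Toy per-time-slot hashtag association rule mining (illustrative only).
from collections import Counter
from itertools import combinations

def rules_for_slot(tweets, min_support=2, min_conf=0.6):
    """Return {(antecedent, consequent): confidence} over hashtag pairs."""
    item_counts, pair_counts = Counter(), Counter()
    for tags in tweets:
        tags = set(tags)
        item_counts.update(tags)
        pair_counts.update(frozenset(p) for p in combinations(sorted(tags), 2))
    rules = {}
    for pair, n in pair_counts.items():
        if n < min_support:
            continue
        a, b = tuple(pair)
        for x, y in ((a, b), (b, a)):
            conf = n / item_counts[x]
            if conf >= min_conf:
                rules[(x, y)] = conf
    return rules

# invented hashtag sets per tweet, bucketed into two consecutive time-slots
slot1 = [{"#facup", "#chelsea"}, {"#facup", "#chelsea", "#goal"},
         {"#facup", "#liverpool"}, {"#chelsea", "#goal"}]
slot2 = [{"#facup", "#chelsea", "#goal"}, {"#facup", "#goal"},
         {"#chelsea", "#goal"}, {"#facup", "#chelsea"}]

r1, r2 = rules_for_slot(slot1), rules_for_slot(slot2)
print("slot 1 rules:", r1)
print("slot 2 rules:", r2)
print("rules persisting across slots:", set(r1) & set(r2))
```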