15 results for non-ideal source
in Aston University Research Archive
Abstract:
A study was made of the effect of blending practice upon selected physical properties of crude oils, and of various base oils and petroleum products, using a range of binary mixtures. The crudes comprised light, medium and heavy Kuwait crude oils. The properties included kinematic viscosity, pour point, boiling point and Reid vapour pressure. The literature related to the prediction of these properties, and the changes reported to occur on blending, was critically reviewed as a preliminary to the study. The kinematic viscosity of petroleum oils in general exhibited non-ideal behaviour upon blending. A mechanism was proposed for this behaviour which took into account the effect of asphaltene content. A correlation was developed, as a modification of Grunberg's equation, to predict the viscosities of binary mixtures of petroleum oils. A correlation was also developed to predict the viscosities of ternary mixtures. This correlation showed better agreement with experimental data (<6% deviation for crude oils and 2.0% for base oils) than currently used methods, i.e. the ASTM and Refutas methods. An investigation was made of the effect of temperature on the viscosities of crude oils and petroleum products at atmospheric pressure. The effect of pressure on the viscosity of crude oil was also studied. A correlation was developed to predict the viscosity at high pressures (up to 8,000 psi), which gave significantly better agreement with the experimental data than the current method due to Kouzel (5.2% and 6.0% deviation for the binary and ternary mixtures respectively). Eyring's theory of viscous flow was critically investigated, and a modification was proposed which extends its application to petroleum oils. The effect of blending on the pour points of selected petroleum oils was studied, together with the effects of wax formation and asphaltene content. Depression of the pour point was always obtained with crude oil binary mixtures. A mechanism was proposed to explain the pour point behaviour of the different binary mixtures. The effects of blending on the boiling point ranges and Reid vapour pressures of binary mixtures of petroleum oils were investigated. The boiling point range exhibited ideal behaviour, but the R.V.P. showed negative deviations from it in all cases. Molecular weights of these mixtures were ideal, but the densities and molar volumes were not. The stability of the various crude oil binary mixtures, in terms of viscosity, was studied over a temperature range of 1 °C to 30 °C for up to 12 weeks. Good stability was found in most cases.
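The modified correlation developed in the thesis is not given in the abstract; for reference, a minimal sketch of the classic Grunberg (Grunberg-Nissan) rule that it modifies, with purely illustrative viscosities and interaction parameter:

```python
import math

def grunberg_nissan_binary(x1, nu1, nu2, g12):
    """Classic Grunberg-Nissan rule for a binary blend:
    ln(nu_mix) = x1*ln(nu1) + x2*ln(nu2) + x1*x2*G12.

    x1       : fraction of component 1
    nu1, nu2 : kinematic viscosities of the pure components (cSt)
    g12      : empirical interaction parameter fitted to blend data
    """
    x2 = 1.0 - x1
    return math.exp(x1 * math.log(nu1) + x2 * math.log(nu2) + x1 * x2 * g12)

# Illustrative only: a 2 cSt light cut blended with a 40 cSt heavy cut.
# g12 = 0 recovers ideal (log-linear) blending; a non-zero g12 absorbs
# the non-ideal behaviour the study attributes to asphaltene content.
for x1 in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(x1, round(grunberg_nissan_binary(x1, 2.0, 40.0, g12=-0.5), 2))
```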
Abstract:
The theory of vapour-liquid equilibria is reviewed, as is the present status of prediction methods in this field. After a discussion of the experimental methods available, the development of a recirculating equilibrium still based on a previously successful design (the modified Raal, Code and Best still of O'Donnell and Jenkins) is described. This novel still is designed to work at pressures up to 35 bar and to measure both isothermal and isobaric vapour-liquid equilibrium data. The still was first commissioned by measuring the saturated vapour pressures of pure ethanol and cyclohexane in the temperature ranges 77-124°C and 80-142°C respectively. The data obtained were compared with available literature experimental values and with values derived from an extended form of the Antoine equation for which parameters were given in the literature. Commissioning continued with a study of the phase behaviour of mixtures of the two pure components, as such mixtures are strongly non-ideal, showing azeotropic behaviour. No published data existed above one atmosphere pressure. Isothermal measurements were made at 83.29°C and 106.54°C, whilst isobaric measurements were made at pressures of 1 bar, 3 bar and 5 bar. The experimental vapour-liquid equilibrium data obtained are assessed by a standard literature method incorporating a thermodynamic consistency test that minimises the errors in all the measured variables. This assessment showed that reasonable x-P-T data-sets had been measured, from which y-values could be deduced, but that the experimental y-values indicated the need for improvements in the design of the still. The final discussion sets out the improvements required and outlines how they might be attained.
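The extended Antoine form and its literature parameters are not reproduced in the abstract; as a point of reference, a minimal sketch of the basic Antoine equation with commonly quoted ethanol constants (an assumption, valid only up to about the normal boiling point, which is why an extended form is needed to cover 77-124°C):

```python
def antoine_pressure_mmHg(T_celsius, A, B, C):
    """Basic Antoine equation: log10(P) = A - B / (T + C)."""
    return 10 ** (A - B / (T_celsius + C))

# Commonly quoted constants for ethanol (P in mmHg, T in degC); these are
# an illustrative assumption, not the extended-form parameters of the thesis.
A, B, C = 8.20417, 1642.89, 230.300
print(antoine_pressure_mmHg(78.3, A, B, C))  # ~760 mmHg at the normal boiling point
```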
Abstract:
We have optically simulated the performance of various apertures used in coded aperture imaging. Coded pictures of extended and continuous-tone planar objects from the Annulus, Twin Annulus, Fresnel Zone Plate and the Uniformly Redundant Array have been decoded using a noncoherent correlation process. We have compared the tomographic capabilities of the Twin Annulus with the Uniformly Redundant Arrays based on quadratic residues and m-sequences. We discuss ways of reducing the 'd.c.' background of the various apertures used. The non-ideal system point-spread function inherent in a noncoherent optical correlation process produces artifacts in the reconstruction. Artifacts are also introduced as a result of unwanted cross-correlation terms from out-of-focus planes. We find that the URA based on m-sequences exhibits good spatial resolution and out-of-focus behaviour when imaging extended objects.
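Digitally, the noncoherent correlation decoding amounts to cross-correlating the coded picture with a decoding array. A minimal sketch under simplifying assumptions (FFT-based circular correlation; a random binary aperture stands in for the URA and annulus patterns actually studied):

```python
import numpy as np

rng = np.random.default_rng(1)

def cross_correlate(a, b):
    """Circular cross-correlation of two arrays via the Fourier domain."""
    return np.real(np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b, s=a.shape))))

# Stand-in aperture: a random 0/1 pattern rather than a true URA or annulus.
aperture = (rng.random((64, 64)) < 0.5).astype(float)
obj = np.zeros((64, 64))
obj[20, 30] = 1.0                          # a single point object
# Idealised coded picture; true shadowgram formation is a convolution,
# which differs from correlation only by a flip of the aperture.
coded = cross_correlate(obj, aperture)
# Decoding with a 'balanced' (+/-1) aperture suppresses the flat d.c.
# background discussed above; for a URA the result approximates a delta.
decoded = cross_correlate(coded, 2.0 * aperture - 1.0)
```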
Abstract:
Horizontal Subsurface Flow Treatment Wetlands (HSSF TWs) are used by Severn Trent Water as a low-cost tertiary wastewater treatment for rural locations. Experience has shown that clogging is a major operational problem that reduces HSSF TW lifetime. Clogging is caused by an accumulation of secondary wastewater solids from upstream processes and decomposing leaf litter. Clogging occurs as a sludge layer where wastewater is loaded on the surface of the bed at the inlet. Severn Trent systems receive relatively high hydraulic loading rates, which cause overland flow and reduce the ability to mineralise surface sludge accumulations. A novel apparatus and method, the Aston Permeameter, was created to measure hydraulic conductivity in situ. Accuracy is ±30%, which was considered adequate given that conductivity in clogged systems varies by several orders of magnitude. The Aston Permeameter was used to perform 20 separate tests on 13 different HSSF TWs in the UK and the US. The minimum conductivity measured was 0.03 m/d at Fenny Compton (compared with 5,000 m/d clean conductivity), which was caused by an accumulation of construction fines in one part of the bed. Most systems displayed a 2 to 3 order of magnitude variation in conductivity in each dimension. Statistically significant transverse variations in conductivity were found in 70% of the systems. Clogging at the inlet and outlet was generally highest where flow enters the influent distribution and exits the effluent collection system, respectively. Surface conductivity was lower in systems with dense vegetation because plant canopies reduce surface evapotranspiration and decelerate sludge mineralisation. An equation was derived to describe how the water table profile is influenced by overland flow, spatial variations in conductivity and clogging. The equation is calibrated using a single parameter, the Clog Factor (CF), which represents the equivalent loss of porosity that would reproduce measured conductivity according to the Kozeny-Carman Equation. The CF varies from 0 for ideal conditions to 1 for completely clogged conditions. Minimum CF was 0.54 for a system that had recently been refurbished, which represents the deviation from ideal conditions due to characteristics of non-ideal media such as particle size distribution and morphology. Maximum CF was 0.90 for a 15-year-old system that exhibited sludge accumulation and overland flow across the majority of the bed. A Finite Element Model of a 15 m long HSSF TW was used to indicate how hydraulics and hydrodynamics vary as CF increases. It was found that as CF increases from 0.55 to 0.65 the subsurface wetted area increases, which causes mean hydraulic residence time to increase from 0.16 days to 0.18 days. As CF increases from 0.65 to 0.90, the extent of overland flow increases from 1.8 m to 13.1 m, which reduces hydraulic efficiency from 37% to 12% and reduces mean residence time to 0.08 days.
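The abstract defines the Clog Factor as the equivalent loss of porosity that reproduces measured conductivity via the Kozeny-Carman equation. A minimal sketch of that reading (the thesis's exact calibration is not given in the abstract; grain size, clean porosity and fluid properties below are assumptions):

```python
def kozeny_carman_K(porosity, d=0.01, mu=1.0e-3, rho=1000.0, g=9.81):
    """Hydraulic conductivity K (m/s) from the Kozeny-Carman equation:
    K = (rho*g/mu) * d^2 * phi^3 / (180 * (1 - phi)^2).

    d : characteristic grain size (m); an assumed value, not from the thesis.
    """
    phi = porosity
    return (rho * g / mu) * d**2 * phi**3 / (180.0 * (1.0 - phi) ** 2)

def clogged_K(clean_porosity, CF, **kw):
    """Clog Factor read as an equivalent fractional loss of porosity:
    CF = 0 is a clean bed, CF = 1 is completely clogged."""
    return kozeny_carman_K(clean_porosity * (1.0 - CF), **kw)

# Illustrative comparison of the reported extremes (clean porosity assumed 0.4):
for cf in (0.54, 0.90):
    print(cf, round(clogged_K(0.4, cf) * 86400, 1), "m/d")  # m/s converted to m/d
```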
Abstract:
A multistage distillation column in which mass transfer and a reversible chemical reaction occurred simultaneously has been investigated, to formulate a technique by which this process can be analysed or predicted. A transesterification reaction between ethyl alcohol and butyl acetate, catalysed by concentrated sulphuric acid, was selected for the investigation, and all the components were analysed on a gas-liquid chromatograph. The transesterification reaction kinetics were studied in a batch reactor for catalyst concentrations of 0.1 - 1.0 weight percent and temperatures between 21.4 and 85.0 °C. The reaction was found to be second order and dependent on the catalyst concentration at a given temperature. The vapour-liquid equilibrium data for six binary, four ternary and one quaternary system were measured at atmospheric pressure using a modified Cathala dynamic equilibrium still. The systems, with the exception of ethyl alcohol - butyl alcohol mixtures, were found to be non-ideal. Multicomponent vapour-liquid equilibrium compositions were predicted by a computer programme which utilised the Van Laar constants obtained from the binary data sets. Good agreement was obtained between the predicted and experimental quaternary equilibrium vapour compositions. Continuous transesterification experiments were carried out in a six-stage sieve plate distillation column. The column was 3" in internal diameter and of unit construction in glass. The plates were 8" apart and had a free area of 7.7%. Both the liquid and vapour streams were analysed. The component conversion was dependent on the boilup rate and the reflux ratio. Because of the presence of the reaction, the concentration of one of the lighter components increased below the feed plate. In the same region a highly developed foam was formed due to the presence of the catalyst. The experimental results were analysed by the solution of a series of simultaneous enthalpy and mass balance equations. Good agreement was obtained between the experimental and calculated results.
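The prediction step combines the binary Van Laar equations with modified Raoult's law; a minimal sketch of that calculation (the Van Laar constants fitted in the thesis are not reproduced, so the values below are placeholders):

```python
import math

def van_laar_gammas(x1, A12, A21):
    """Binary Van Laar activity coefficients:
    ln g1 = A12 * (A21*x2 / (A12*x1 + A21*x2))^2, and symmetrically for g2."""
    x2 = 1.0 - x1
    denom = A12 * x1 + A21 * x2
    g1 = math.exp(A12 * (A21 * x2 / denom) ** 2)
    g2 = math.exp(A21 * (A12 * x1 / denom) ** 2)
    return g1, g2

def bubble_point_vapour(x1, A12, A21, P1_sat, P2_sat):
    """Modified Raoult's law, y_i * P = x_i * gamma_i * P_i_sat, giving
    the equilibrium vapour composition y1 and the total pressure."""
    g1, g2 = van_laar_gammas(x1, A12, A21)
    p1 = x1 * g1 * P1_sat
    p2 = (1.0 - x1) * g2 * P2_sat
    return p1 / (p1 + p2), p1 + p2

# Placeholder constants and pure-component vapour pressures (arbitrary units)
y1, P = bubble_point_vapour(0.4, A12=1.2, A21=0.9, P1_sat=760.0, P2_sat=300.0)
```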
Abstract:
We present three jargonaphasic patients who made phonological errors in naming, repetition and reading. We analyse target/response overlap using statistical models to answer three questions: 1) Is there a single phonological source for errors or two sources, one for target-related errors and a separate source for abstruse errors? 2) Can correct responses be predicted by the same distribution used to predict errors or do they show a completion boost (CB)? 3) Is non-lexical and lexical information summed during reading and repetition? The answers were clear. 1) Abstruse errors did not require a separate distribution created by failure to access word forms. Abstruse and target-related errors were the endpoints of a single overlap distribution. 2) Correct responses required a special factor, e.g., a CB or lexical/phonological feedback, to preserve their integrity. 3) Reading and repetition required separate lexical and non-lexical contributions that were combined at output.
Abstract:
This study examined the extent to which students could fake responses on personality and approaches to studying questionnaires, and the effects of such responding on the validity of non-cognitive measures for predicting academic performance (AP). University students produced a profile of an ‘ideal’ student using the Big-Five personality taxonomy, which yielded a stereotype with low scores for Neuroticism, and high scores for the other four traits. A sub-set of participants was allocated to a condition in which they were instructed to fake their responses as University applicants, portraying themselves as positively as possible. These participants scored higher than those in a control condition on measures of deep and strategic approaches to studying, but lower on the surface approach variable. Conscientiousness was a significant predictor of AP in both groups, but the predictive effect of approaches to studying variables and Openness to Experience identified in the control group was lower in the group who faked their responses. Non-cognitive psychometric measures can be valid predictors of AP, but scores on these measures can be affected by instructional set. Further implications for psychometric measurement in educational settings are discussed.
Abstract:
The effects of haem limitation and iron restriction on cells of non-typable Haemophilus influenzae were investigated. Haem limitation was achieved by adding concentrations of haem to growth media which resulted in substantial decreases in final cell yields. Iron restriction was achieved by substituting protoporphyrin IX (PPIX) for haem in the growth medium and adding an iron chelator to the system. The effect of these nutrient limitations on (a) the outer membrane composition, and (b) the respiratory systems of non-typable H. influenzae was investigated. Several of the strains examined produced new PPIX-specific outer membrane proteins when cultured utilising PPIX as a porphyrin source. The immune response of patients with bronchiectasis to outer membrane antigens of H. influenzae cultured under iron-restricted conditions was analysed by ELISA and immunoblotting techniques. ELISA analysis revealed that individuals with severe bronchiectasis had high titres of antibodies directed against H. influenzae outer membranes in both serum and sputum. Immunoblotting with homologous serum showed that, where PPIX-specific OMPs were produced, they were antigenic and were recognised by patients' serum. This suggested that these H. influenzae OMPs may be expressed in vivo. Additionally, the development of the immune response to non-typable H. influenzae outer membrane antigens was investigated using a rat lung model. Bacteria encased in agar beads were inoculated intratracheally into rat lungs, infection was established, and the immune response was monitored for 6 weeks. The animals developed antibodies to PPIX-specific OMPs during the course of infection, providing further evidence that H. influenzae expresses these novel OMP antigens when growing in vivo. Studies in vitro on the respiratory systems of phenotypically altered H. influenzae showed that bacteria grown utilising PPIX as a porphyrin source, or under conditions of iron restriction, produced tenfold fewer cytochromes than cells grown in nutrient excess, while haem-limited H. influenzae produced no detectable cytochromes. Respiration of various substrates was depressed in haem-limited and in PPIX-grown cultures as compared with cells grown in nutrient excess.
Abstract:
Common approaches to IP-traffic modelling have featured the use of stochastic models, based on the Markov property, which can be classified into black-box and white-box models according to the approach used for modelling traffic. White-box models are simple to understand and transparent, and a physical meaning can be attributed to each of the associated parameters. To exploit this key advantage, this thesis explores the use of simple classic continuous-time Markov models, based on a white-box approach, to model not only the network traffic statistics but also the source behaviour with respect to the network and application. The thesis is divided into two parts. The first part focuses on the use of simple Markov and semi-Markov traffic models, starting from the simplest two-state model and moving up to n-state models with Poisson and non-Poisson statistics. The thesis then introduces the convenient-to-use, mathematically derived Gaussian Markov models, which are used to model the measured network IP traffic statistics. As one of its most significant contributions, the thesis establishes the significance of second-order density statistics, revealing that, in contrast to first-order density statistics, they carry much more unique information on traffic sources and behaviour. The thesis then exploits Gaussian Markov models to model these unique features, and finally shows how the use of simple classic Markov models, coupled with second-order density statistics, provides an excellent tool for capturing maximum traffic detail, which in itself is the essence of good traffic modelling. The second part of the thesis studies the ON-OFF characteristics of VoIP traffic with reference to accurate measurements of the ON and OFF periods, made from a large multi-lingual database of over 100 hours' worth of VoIP call recordings. The impact of the language, prosodic structure and speech rate of the speaker on the statistics of the ON-OFF periods is analysed and relevant conclusions are presented. Finally, an ON-OFF VoIP source model with log-normal transitions is contributed as an ideal candidate to model VoIP traffic, and the results of this model are compared with those of previously published work.
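The contributed source model is described only at this level of detail; a minimal sketch of an ON-OFF source with log-normally distributed sojourn times (the distribution parameters below are illustrative, not the fitted values from the thesis):

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_on_off(n_cycles, mu_on, sigma_on, mu_off, sigma_off):
    """Alternate ON (talkspurt) and OFF (silence) periods whose durations,
    in seconds, are drawn from log-normal distributions; mu and sigma are
    the mean and standard deviation of the underlying log-duration."""
    on = rng.lognormal(mu_on, sigma_on, n_cycles)
    off = rng.lognormal(mu_off, sigma_off, n_cycles)
    trace = []
    for t_on, t_off in zip(on, off):
        trace.append(("ON", t_on))
        trace.append(("OFF", t_off))
    return trace

# Illustrative parameters only; the activity ratio summarises the source.
trace = simulate_on_off(1000, mu_on=0.0, sigma_on=0.6, mu_off=-0.5, sigma_off=0.8)
on_time = sum(t for s, t in trace if s == "ON")
print("activity ratio:", on_time / sum(t for _, t in trace))
```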
Abstract:
Exploratory analysis of petroleum geochemical data seeks to find common patterns to help distinguish between different source rocks, oils and gases, and to explain their source, maturity and any intra-reservoir alteration. However, at the outset, one is typically faced with (a) a large matrix of samples, each with a range of molecular and isotopic properties, (b) a spatially and temporally unrepresentative sampling pattern, (c) noisy data and (d) often, a large number of missing values. This inhibits analysis using conventional statistical methods. Typically, visualisation methods like principal components analysis are used, but these methods can neither easily deal with missing data nor capture non-linear structure in the data. One approach to discovering complex, non-linear structure in the data is through the use of linked plots, or brushing, while ignoring the missing data. In this paper we introduce a complementary approach based on a non-linear probabilistic model. Generative topographic mapping enables the visualisation of the effects of very many variables on a single plot, while also dealing with missing data. We show how using generative topographic mapping also provides an optimal method with which to replace missing values in two geochemical datasets, particularly where a large proportion of the data is missing.
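For orientation, a minimal sketch of the standard GTM algorithm (after Bishop, Svensen and Williams) that the paper builds on. It handles only complete data (the paper's marginalisation over missing values is omitted for brevity), and every parameter value is an assumption:

```python
import numpy as np

def gtm_fit(X, grid=10, n_rbf=4, sigma=1.0, reg=1e-3, n_iter=50, seed=0):
    """Minimal GTM for a complete (N, D) standardised data matrix X.
    Returns posterior-mean latent positions, one 2-D point per sample,
    suitable for plotting many variables on a single map."""
    rng = np.random.default_rng(seed)
    N, D = X.shape
    u = np.linspace(-1, 1, grid)
    Z = np.array([(a, b) for a in u for b in u])              # K latent grid points
    v = np.linspace(-1, 1, n_rbf)
    C = np.array([(a, b) for a in v for b in v])              # M RBF centres
    Phi = np.exp(-((Z[:, None] - C[None]) ** 2).sum(-1) / (2 * sigma**2))  # (K, M)
    W = rng.normal(scale=0.1, size=(Phi.shape[1], D))
    beta = 1.0
    for _ in range(n_iter):
        Y = Phi @ W                                           # mixture centres (K, D)
        d2 = ((X[None] - Y[:, None]) ** 2).sum(-1)            # (K, N) squared distances
        R = np.exp(-0.5 * beta * (d2 - d2.min(0)))            # E-step: responsibilities
        R /= R.sum(0)
        G = np.diag(R.sum(1))
        W = np.linalg.solve(Phi.T @ G @ Phi + reg * np.eye(Phi.shape[1]),
                            Phi.T @ R @ X)                    # M-step for W
        beta = X.size / (R * ((X[None] - (Phi @ W)[:, None]) ** 2).sum(-1)).sum()
    return R.T @ Z                                            # posterior means (N, 2)

# Usage (hypothetical variable name): latent = gtm_fit(geochemical_matrix)
```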
Abstract:
An important field of application of lasers is biomedical optics. Here, they offer great utility for diagnosis, therapy and surgery. For the development of novel methods of laser-based biomedical diagnostics, careful study of light propagation in biological tissues is necessary to enhance our understanding of the optical measurements undertaken, increase research and development capacity, and improve the diagnostic reliability of optical technologies. Ultimately, fulfilling these requirements will increase the clinical uptake of laser-based diagnostics and therapeutics. To address these challenges, informative biomarkers relevant to the biological and physiological function or disease state of the organism must be selected. These indicators result from the analysis of tissues and cells, such as blood. For non-invasive diagnostics, peripheral blood, cells and tissue can potentially provide comprehensive information on the condition of the human organism. A detailed study of the light scattering and absorption characteristics can quickly detect physiological and morphological changes in the cells due to thermal, chemical, antibiotic treatments, etc. [1-5]. The selection of a laser source to study the structure of biological particles also benefits from the fact that gross pathological changes are not induced, and diagnostics make effective use of the monochromaticity, directionality and coherence of laser radiation.
Abstract:
Methods - Ethical approval for the study was granted by both the local National Health Service (NHS) Research Ethics Committee (REC) and Aston University’s REC. Seven focus groups were conducted between October and December 2011 in medical or community settings within inner-city Birmingham (UK). Discussions were guided by a theme plan which was developed from key themes identified by a literature review and piloted via a Patient Consultation Group. Each focus group had between 3 and 7 participants. The groups were digitally recorded and subsequently transcribed verbatim. The transcriptions were then subjected to thematic analysis via constant comparison in order to identify emerging themes. Results - Participants recognised the pharmacist as an expert source of advice about prescribed medicines, a source they frequently felt a need to consult as a result of the inadequate supply of medicines information from the prescriber. However, an emerging theme was a perception that pharmacists had an oblique profit motive relating to the supply of generic medicines, with frequent changes to the ‘brand’ of generic supplied being attributed to profit-seeking by pharmacists. Such changes had a negative impact on the patient’s perceived efficacy of the therapy, which may make non-adherence more likely. Conclusions - Whilst pharmacists were recognised as medicines experts, trust in the pharmacist was undermined by frequent changes to generic medicines. Such changes have the potential to adversely impact adherence levels. Further quantitative research is recommended to examine whether such views are generalisable to the wider population of Birmingham and to establish whether such views impact on adherence levels.
Abstract:
One of the most pressing demands on electrophysiology applied to the diagnosis of epilepsy is the non-invasive localization of the neuronal generators responsible for brain electrical and magnetic fields (the so-called inverse problem). These neuronal generators produce primary currents in the brain, which together with passive currents give rise to the EEG signal. Unfortunately, the signal we measure on the scalp surface does not directly indicate the location of the active neuronal assemblies. This is the expression of the ambiguity of the underlying static electromagnetic inverse problem, partly due to the relatively limited number of independent measures available: a given electric potential distribution recorded at the scalp can be explained by the activity of an infinite number of different configurations of intracranial sources. In contrast, the forward problem, which consists of computing the potential field at the scalp from known source locations and strengths with known geometry and conductivity properties of the brain and its layers (CSF/meninges, skin and skull), i.e. the head model, has a unique solution. Head models vary from the computationally simpler spherical models (three or four concentric spheres) to realistic models based on the segmentation of anatomical images obtained using magnetic resonance imaging (MRI). Realistic models, though computationally intensive and difficult to implement, can separate different tissues of the head and account for the convoluted geometry of the brain and the significant inter-individual variability. In real-life applications, if the assumptions about the statistical, anatomical or functional properties of the signal and the volume in which it is generated are meaningful, a true three-dimensional tomographic representation of the sources of brain electrical activity is possible in spite of the ‘ill-posed’ nature of the inverse problem (Michel et al., 2004). The techniques used to achieve this are now referred to as electrical source imaging (ESI) or magnetic source imaging (MSI). The first issue to influence reconstruction accuracy is spatial sampling, i.e. the number of EEG electrodes. It has been shown that the relationship between the number of electrodes and localization accuracy is not linear, reaching a plateau at about 128 electrodes, provided the spatial distribution is uniform. The second factor relates to the properties of the source localization strategy used with respect to the hypothesized source configuration.
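To illustrate why the forward problem is well posed, a minimal sketch of its simplest possible kernel, a current dipole in an infinite homogeneous conductor (real head models replace this free-space formula with spherical or MRI-derived geometries; the conductivity value is a typical assumption):

```python
import numpy as np

def dipole_potential(r, r0, p, sigma=0.33):
    """Potential of a current dipole in an infinite homogeneous conductor:
    V(r) = p . (r - r0) / (4 * pi * sigma * |r - r0|^3).

    r, r0 : observation point and dipole location (m)
    p     : dipole moment (A m)
    sigma : conductivity (S/m); 0.33 is a commonly assumed brain value
    """
    d = np.asarray(r, float) - np.asarray(r0, float)
    return np.dot(p, d) / (4 * np.pi * sigma * np.linalg.norm(d) ** 3)

# Given the source, the scalp potential follows uniquely; the inverse
# direction admits infinitely many source configurations.
print(dipole_potential(r=[0.0, 0.0, 0.09], r0=[0.0, 0.0, 0.07],
                       p=[0.0, 0.0, 1e-8]))
```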
Abstract:
This chapter investigates resistance by institutional actors to online grocery provision in ambiguous supply chain environments. Recent studies have shown that significant shifts in urban geographies are increasing consumers' expectations of online retail provision. At the same time, however, there is growing evidence that collaborative practice in online grocery provision within urban supply chains is resisted. That these trends are found despite growing demand for online provision highlights both the difficulty of bringing geographically dispersed supply partners together and the problems associated with operating within and across ambiguous environments. Drawing upon twenty-nine in-depth interviews with a range of institutional actors, including retail, logistics, and urban planning experts within an urban metropolis in an emerging market, we detail the different ways that collaboration is resisted in online retail provision. Several patterns of resistance to collaboration were identified, notably ideological, functional, regulatory and spatial. © 2011, IGI Global.
Abstract:
The youth movement Nashi ('Ours') was founded in the spring of 2005 against the backdrop of Ukraine’s ‘Orange Revolution’. Its aim was to stabilise Russia’s political system and take back the streets from opposition demonstrators. Personally loyal to Putin and taking its ideological orientation from Surkov’s concept of ‘sovereign democracy’, Nashi has sought to turn the tide on ‘defeatism’ and develop Russian youth into a patriotic new elite that ‘believes in the future of Russia’ (p. 15). Combining a wealth of empirical detail and the application of insights from discourse theory, Ivo Mijnssen analyses the organisation’s development between 2005 and 2012. His analysis focuses on three key moments, the organisation’s foundation, the apogee of its mobilisation around the Bronze Soldier dispute with Estonia, and the 2010 Seliger youth camp, to help understand Nashi’s organisation, purpose and ideational outlook as well as the limitations and challenges it faces. As such, the book is insightful both for those with an interest in post-Soviet Russian youth culture, and for scholars seeking a rounded understanding of the Kremlin’s initiatives to return a sense of identity and purpose to Russian national life. The first chapter, ‘Background and Context’, outlines the conceptual toolkit provided by Ernesto Laclau and Chantal Mouffe to help make sense of developments on the terrain of identity politics. In their terms, since the collapse of the Soviet Union, Russia has experienced acute dislocation of its identity. With the tangible loss of great power status, Russian realities have become unfixed from a discourse enabling national life to be constructed, albeit inherently contingently, as meaningful. The lack of a Gramscian hegemonic discourse to provide a unifying national idea was securitised as an existential threat demanding special measures. Accordingly, the identification of those who are ‘not Us’ has been a recurrent theme of Nashi’s discourse and activity. With the victory in World War II held up as a foundational moment, a constitutive other is found in the notion of ‘unusual fascists’. This notion includes not just neo-Nazis, but reflects a chain of equivalence that expands to include a range of perceived enemies of Putin’s consolidation project, such as oligarchs and pro-Western liberals. The empirical background is provided by the second chapter, ‘Russia’s Youth, the Orange Revolution, and Nashi’, which traces the emergence of Nashi amid the climate of political instability of 2004 and 2005. A particularly noteworthy aspect of Mijnssen’s work is the inclusion of citations from his interviews with Nashi commissars, the youth movement’s cadres. Although relatively few in number, such insider conversations provide insight into the ethos of Nashi’s organisation and the outlook of those who have pledged their involvement. Besides the discussion of Nashi’s manifesto, the reader thus gains insight into the motivations of some participants and behind-the-scenes details of Nashi’s activities in response to the perceived threat of anti-government protests. The third chapter, ‘Nashi’s Bronze Soldier’, charts Nashi’s role in elevating the removal of a World War II monument from downtown Tallinn into an international dispute over the interpretation of history.
The events subsequent to this securitisation of memory are charted in detail, concluding that Nashi’s activities were ultimately unsuccessful as their demands received little official support. The fourth chapter, ‘Seliger: The Foundry of Modernisation’, presents a distinctive feature of Mijnssen’s study, namely his ethnographic account as a participant observer in the Youth International Forum at Seliger. In the early years of the camp (2005–2007), Russian participants received extensive training, including master classes in ‘methods of forestalling mass unrest’ (p. 131), and the camp served to foster a sense of group identity and purpose among activists. After 2009 the event was no longer officially run as a Nashi camp, and its role became that of a forum for the exchange of ideas about innovation, although camp spirit remained a central feature. In 2010 the camp welcomed international attendees for the first time. As one of about 700 international participants in that year, the author provides a fascinating account based on fieldwork diaries. Despite the polemical nature of the topic, Mijnssen’s analysis remains even-handed, exemplified in his balanced assessment of the Seliger experience. While he details the frustrations and disappointments of the international participants with regard to the unaccustomed strict camp discipline, organisational and communication failures, and the controlled format of many discussions, he does not neglect to note the camp’s successes in generating a gratifying collective dynamic between the participants, even among the international attendees who spent only a week there. In addition to the useful bibliography, the book is supplemented by two appendices, which provide the reader with important Russian-language primary source materials. The first is Nashi’s ‘Unusual Fascism’ (Neobyknovennyi fashizm) brochure, and the second is the booklet entitled ‘Some Uncomfortable Questions to the Russian Authorities’ (Neskol’ko neudobnykh voprosov rossiiskoi vlasti), which was provided to the Seliger 2010 instructors to guide them in responding to probing questions from foreign participants. Given that these are not readily publicly available even now, they constitute a useful resource from the historical perspective.