931 results for Non-Ideal Duffing System
Abstract:
Commercial explosives behave non-ideally in rock blasting. A direct and convenient measure of non-ideality is the detonation velocity. In this study, an alternative model fitted to experimental unconfined detonation velocity data is proposed and the effect of confinement on the detonation velocity is modelled. Unconfined data for several explosives showing various levels of non-ideality were successfully modelled. The effect of confinement on detonation velocity was modelled empirically based on field detonation velocity measurements. Confined detonation velocity is a function of the ideal detonation velocity, the unconfined detonation velocity at a given blasthole diameter, and the rock stiffness. For a given explosive and charge diameter, as confinement increases, detonation velocity increases. The confinement model is implemented in a simple, engineering-based non-ideal detonation model. A number of simulations are carried out and analysed to predict the explosive performance parameters for the adopted blasting conditions.
Abstract:
Adsorption of pure nitrogen, argon, acetone, chloroform and an acetone-chloroform mixture on graphitized thermal carbon black is considered at sub-critical conditions by means of molecular layer structure theory (MLST). In the present version of the MLST, an adsorbed fluid is treated as a sequence of 2D molecular layers, whose Helmholtz free energies are obtained directly from the analysis of experimental adsorption isotherms of the pure components. The interaction of the nearest layers is accounted for in the framework of a mean-field approximation. This approach allows quantitative correlation of experimental nitrogen and argon adsorption isotherms both in the monolayer region and in the range of multi-layer coverage up to 10 molecular layers. In the case of acetone and chloroform the approach also leads to excellent quantitative correlation of adsorption isotherms, while molecular approaches such as non-local density functional theory (NLDFT) fail to describe those isotherms. We extend our new method to calculate the Helmholtz free energy of an adsorbed mixture using a simple mixing rule, and this allows us to predict mixture adsorption isotherms from pure-component adsorption isotherms. The approach, which accounts for the difference in composition between the molecular layers, is tested against experimental data for acetone-chloroform mixture (a non-ideal mixture) adsorption on graphitized thermal carbon black at 50 °C.
Abstract:
Targeted inhibition of oncogenes in tumor cells is a rational approach toward the development of cancer therapies based on RNA interference (RNAi). Tumors caused by human papillomavirus (HPV) infection are an ideal model system for RNAi-based cancer therapies because the oncogenes that cause cervical cancer, E6 and E7, are expressed only in cancerous cells. We investigated whether targeting the HPV E6 and E7 oncogenes renders cancer cells more sensitive to chemotherapy with cisplatin, the chemotherapeutic agent currently used for the treatment of advanced cervical cancer. We designed siRNAs directed against the HPV E6 oncogene that simultaneously target both E6 and E7, which results in an 80% reduction in E7 protein and reactivation of the p53 pathway. The loss of E6 and E7 resulted in a reduction in cellular viability concurrent with the induction of cellular senescence. Interference was specific in that no effect on HPV-negative cells was observed. We demonstrate that RNAi against the E6 and E7 oncogenes enhances the chemotherapeutic effect of cisplatin in HeLa cells. The IC50 for HeLa cells treated with cisplatin was 9.4 μM, but after the addition of a lentivirus-delivered shRNA against E6, the IC50 was reduced almost 4-fold to 2.4 μM. We also observed a decrease in E7 expression with a concurrent increase in p53 protein levels upon co-treatment with shRNA and cisplatin over that seen with either treatment alone. Our results provide strong evidence that loss of E6 and E7 results in increased sensitivity to cisplatin, probably because of increased p53 levels.
Abstract:
To survive adverse or unpredictable conditions in the ontogenetic environment, many organisms retain a level of phenotypic plasticity that allows them to meet the challenges of rapidly changing conditions. Larval anurans are widely known for their ability to modify behaviour, morphology and physiological processes during development, making them an ideal model system for studies of environmental effects on phenotypic traits. Although temperature is one of the most important factors influencing the growth, development and metamorphic condition of larval anurans, many studies have failed to include ecologically relevant thermal fluctuations among their treatments. We compared the growth and age at metamorphosis of striped marsh frogs Limnodynastes peronii raised in a diurnally fluctuating thermal regime and a stable regime of the same mean temperature. We then assessed the long-term effects of the larval environment on the morphology and performance of post-metamorphic frogs. Larval L. peronii from the fluctuating treatment were significantly longer throughout development and metamorphosed about 5 days earlier. Frogs from the fluctuating group metamorphosed at a smaller mass and in poorer condition compared with the stable group, and had proportionally shorter legs. Frogs from the fluctuating group showed greater jumping performance at metamorphosis and less degradation in performance during a 10-week dormancy. Treatment differences in performance could not be explained by whole-animal morphological variation, suggesting improved contractile properties of the muscles in the fluctuating group.
Abstract:
In the analysis and prediction of many real-world time series, the assumption of stationarity is not valid. A special form of non-stationarity, where the underlying generator switches between (approximately) stationary regimes, seems particularly appropriate for financial markets. We introduce a new model which combines a dynamic switching (controlled by a hidden Markov model) and a non-linear dynamical system. We show how to train this hybrid model in a maximum likelihood approach and evaluate its performance on both synthetic and financial data.
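The hybrid model described above can be illustrated with a minimal simulation of regime switching: a hidden Markov chain selects which of two AR(1) dynamics generates each observation. The transition matrix, AR coefficients and noise levels below are illustrative assumptions, not the paper's fitted values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-state transition matrix: rows = current regime, cols = next.
A = np.array([[0.98, 0.02],
              [0.05, 0.95]])
# Per-regime AR(1) dynamics x_t = a * x_{t-1} + noise (illustrative values):
# a calm, persistent regime and a choppier, noisier one.
ar_coef = [0.9, -0.5]
noise_sd = [0.1, 0.3]

T = 500
states = np.empty(T, dtype=int)
x = np.empty(T)
states[0], x[0] = 0, 0.0
for t in range(1, T):
    states[t] = rng.choice(2, p=A[states[t - 1]])   # hidden Markov step
    s = states[t]
    x[t] = ar_coef[s] * x[t - 1] + rng.normal(0.0, noise_sd[s])

print(len(x))  # length of the simulated series
```

Training such a model by maximum likelihood would run the EM (Baum-Welch) recursions over the hidden state sequence, which this sketch does not attempt.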
Abstract:
A study was made of the effect of blending practice upon selected physical properties of crude oils, and of various base oils and petroleum products, using a range of binary mixtures. The crudes comprised light, medium and heavy Kuwait crude oils. The properties included kinematic viscosity, pour point, boiling point and Reid vapour pressure. The literature related to the prediction of these properties, and the changes reported to occur on blending, was critically reviewed as a preliminary to the study. The kinematic viscosity of petroleum oils in general exhibited non-ideal behaviour upon blending. A mechanism was proposed for this behaviour which took into account the effect of asphaltenes content. A correlation was developed, as a modification of Grunberg's equation, to predict the viscosities of binary mixtures of petroleum oils. A correlation was also developed to predict the viscosities of ternary mixtures. This correlation showed better agreement with experimental data (< 6% deviation for crude oils and 2.0% for base oils) than currently-used methods, i.e. ASTM and Refutas methods. An investigation was made of the effect of temperature on the viscosities of crude oils and petroleum products at atmospheric pressure. The effect of pressure on the viscosity of crude oil was also studied. A correlation was developed to predict the viscosity at high pressures (up to 8000 psi), which gave significantly better agreement with the experimental data than the current method due to Kouzel (5.2% and 6.0% deviation for the binary and ternary mixtures respectively). Eyring's theory of viscous flow was critically investigated, and a modification was proposed which extends its application to petroleum oils. The effect of blending on the pour points of selected petroleum oils was studied together with the effect of wax formation and asphaltenes content. Depression of the pour point was always obtained with crude oil binary mixtures. 
A mechanism was proposed to explain the pour point behaviour of the different binary mixtures. The effects of blending on the boiling point ranges and Reid vapour pressures of binary mixtures of petroleum oils were investigated. The boiling point range exhibited ideal behaviour, but the R.V.P. showed negative deviations from ideality in all cases. The molecular weights of these mixtures were ideal, but the densities and molar volumes were not. The stability of the various crude oil binary mixtures, in terms of viscosity, was studied over a temperature range of 1 °C - 30 °C for up to 12 weeks. Good stability was found in most cases.
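The classical Grunberg (Grunberg-Nissan) rule that the thesis modifies can be sketched for a binary blend; the viscosities and interaction parameter below are illustrative, not the thesis's measured values.

```python
import math

def grunberg_nissan(x1, nu1, nu2, g12):
    """Classical Grunberg(-Nissan) rule for a binary blend:
    ln(nu_mix) = x1*ln(nu1) + x2*ln(nu2) + x1*x2*G12,
    where G12 is an empirical interaction parameter."""
    x2 = 1.0 - x1
    return math.exp(x1 * math.log(nu1) + x2 * math.log(nu2) + x1 * x2 * g12)

# G12 = 0 recovers the ideal logarithmic mixing rule; a non-zero G12
# captures the non-ideal behaviour observed on blending.
ideal = grunberg_nissan(0.5, 10.0, 100.0, 0.0)      # cSt, illustrative
non_ideal = grunberg_nissan(0.5, 10.0, 100.0, -0.4)
print(round(ideal, 2))      # geometric-mean value, ~31.62
print(round(non_ideal, 2))  # depressed by the negative interaction term
```

A negative G12, as sketched here, lowers the blend viscosity below the ideal logarithmic mean, which is the direction of deviation commonly attributed to asphaltene-bearing crudes.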
Abstract:
The theory of vapour-liquid equilibria is reviewed, as is the present status of prediction methods in this field. After discussion of the available experimental methods, the development of a recirculating equilibrium still based on a previously successful design (the modified Raal, Code and Best still of O'Donnell and Jenkins) is described. This novel still is designed to work at pressures up to 35 bar and to measure both isothermal and isobaric vapour-liquid equilibrium data. The equilibrium still was first commissioned by measuring the saturated vapour pressures of pure ethanol and cyclohexane in the temperature ranges 77-124°C and 80-142°C respectively. The data obtained were compared with available literature experimental values and with values derived from an extended form of the Antoine equation for which parameters were given in the literature. Commissioning continued with a study of the phase behaviour of mixtures of the two pure components, as such mixtures are strongly non-ideal, showing azeotropic behaviour. No data existed above one atmosphere pressure. Isothermal measurements were made at 83.29°C and 106.54°C, whilst isobaric measurements were made at pressures of 1 bar, 3 bar and 5 bar. The experimental vapour-liquid equilibrium data obtained were assessed by a standard literature method incorporating a thermodynamic consistency test that minimises the errors in all the measured variables. This assessment showed that reasonable x-P-T data-sets had been measured, from which y-values could be deduced, but that the experimental y-values indicated the need for improvements in the design of the still. The final discussion sets out the improvements required and outlines how they might be attained.
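The basic (unextended) Antoine equation used for such vapour-pressure comparisons can be sketched as follows; the constants below are illustrative values of the magnitude typical for ethanol in mmHg/°C units, not the extended-form parameters the thesis took from the literature.

```python
def antoine_p(temp_c, a, b, c):
    """Basic Antoine equation: log10(P) = A - B/(C + T),
    with T in deg C and P in the units implied by the constants."""
    return 10.0 ** (a - b / (c + temp_c))

# Illustrative Antoine constants for ethanol (P in mmHg, T in deg C).
A, B, C = 8.20417, 1642.89, 230.300

p_boil = antoine_p(78.3, A, B, C)
print(round(p_boil))  # near 760 mmHg around the normal boiling point
```

The extended form adds further temperature terms to widen the valid range; the three-constant form above is only reliable over the fitted temperature interval.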
Abstract:
The dramatic effects of brain damage can provide some of the most interesting insights into the nature of normal cognitive performance. In recent years a number of neuropsychological studies have reported a particular form of cognitive impairment where patients have problems recognising objects from one category but remain able to recognise those from others. The most frequent ‘category-specific’ pattern is an impairment identifying living things, compared to nonliving things. The reverse pattern of dissociation, i.e., an impairment recognising and naming nonliving things relative to living things, has been reported albeit much less frequently. The objective of the work carried out in this thesis was to investigate the organising principles and anatomical correlates of stored knowledge for categories of living and nonliving things. Three complementary cognitive neuropsychological research techniques were employed to assess how, and where, this knowledge is represented in the brain: (i) studies of normal (neurologically intact) subjects, (ii) case-studies of neurologically impaired patients with selective deficits in object recognition, and (iii) studies of the anatomical correlates of stored knowledge for living and nonliving things on the brain using magnetoencephalography (MEG). The main empirical findings showed that semantic knowledge about living and nonliving things is principally encoded in terms of sensory and functional features, respectively. In two case-study chapters evidence was found supporting the view that category-specific impairments can arise from damage to a pre-semantic system, rather than the assumption often made that the system involved must be semantic. In the MEG study, rather than finding evidence for the involvement of specific brain areas for different object categories, it appeared that, when subjects named and categorised living and nonliving things, a non-differentiated neural system was involved.
Abstract:
We report the impact of the longitudinal signal power profile on the transmission performance of a coherently-detected 112 Gb/s m-ary polarization-multiplexed quadrature amplitude modulation system after compensation of deterministic nonlinear fibre impairments. Performance improvements of up to 0.6 dB (Qeff) are reported for a non-uniform transmission link power profile. Further investigation reveals that the evolution of the transmission performance with power profile management is fully consistent with parametric amplification of the amplified spontaneous emission by the signal through four-wave mixing. In particular, for a non-dispersion-managed system, a single-step increment of 4 dB in the amplifier gain, with respect to a uniform gain profile, at ~2/3 of the total reach considerably improves the transmission performance for all the formats studied. In contrast, a negative-step profile, emulating a failure (gain decrease or loss increase), significantly degrades the bit-error rate.
Abstract:
A multistage distillation column, in which mass transfer and a reversible chemical reaction occur simultaneously, has been investigated to formulate a technique by which this process can be analysed or predicted. A transesterification reaction between ethyl alcohol and butyl acetate, catalysed by concentrated sulphuric acid, was selected for the investigation, and all the components were analysed on a gas-liquid chromatograph. The transesterification reaction kinetics were studied in a batch reactor for catalyst concentrations of 0.1 - 1.0 weight percent and temperatures between 21.4 and 85.0 °C. The reaction was found to be second order and dependent on the catalyst concentration at a given temperature. Vapour-liquid equilibrium data for six binary, four ternary and one quaternary system were measured at atmospheric pressure using a modified Cathala dynamic equilibrium still. The systems, with the exception of ethyl alcohol - butyl alcohol mixtures, were found to be non-ideal. Multicomponent vapour-liquid equilibrium compositions were predicted by a computer programme which utilised the Van Laar constants obtained from the binary data sets. Good agreement was obtained between the predicted and experimental quaternary equilibrium vapour compositions. Continuous transesterification experiments were carried out in a six-stage sieve-plate distillation column. The column was 3" in internal diameter and of unit construction in glass. The plates were 8" apart and had a free area of 7.7%. Both the liquid and vapour streams were analysed. The component conversion was dependent on the boil-up rate and the reflux ratio. Because of the presence of the reaction, the concentration of one of the lighter components increased below the feed plate. In the same region a highly developed foam was formed due to the presence of the catalyst. The experimental results were analysed by the solution of a series of simultaneous enthalpy and mass equations.
Good agreement was obtained between the experimental and calculated results.
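For a second-order reaction with equal initial reactant concentrations, the integrated rate law is 1/C = 1/C0 + kt; the sketch below uses hypothetical values for C0 and k, not the rate constants measured in the batch-reactor study.

```python
import numpy as np

def second_order_conc(c0, k, t):
    """Integrated second-order rate law for equal initial concentrations:
    1/C = 1/C0 + k*t  =>  C(t) = 1 / (1/C0 + k*t)."""
    return 1.0 / (1.0 / c0 + k * t)

# Hypothetical values: c0 in mol/L, k in L/(mol*min), t in minutes.
c0, k = 2.0, 0.05
t = np.array([0.0, 10.0, 20.0])
c = second_order_conc(c0, k, t)
print(c)  # concentration decays as 1/(0.5 + 0.05*t)
```

A linear plot of 1/C against t is the usual diagnostic for second-order behaviour, and the slope at each catalyst loading gives the concentration-dependent rate constant.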
Abstract:
Timing jitter is a major factor limiting the performance of any high-speed, long-haul data transmission system. It arises for a number of reasons, such as interaction with accumulated spontaneous emission, inter-symbol interference (ISI) and electrostriction. Some effects causing timing jitter can be reduced by means of non-linear filtering, using, for example, a nonlinear optical loop mirror (NOLM) [1]. The NOLM has been shown to reduce timing jitter by suppressing the ASE and by stabilising the pulse duration [2, 3]. In this paper, we investigate the dynamics of timing jitter in a 2R-regenerated system, nonlinearly guided by NOLMs, at bit rates of 10, 20, 40 and 80 Gbit/s. The transmission performance of an equivalent non-regenerated (generic) system is taken as a reference.
Abstract:
Fusarium oxysporum forma specialis cubense is a soilborne phytopathogen that infects banana. The true evolutionary identity of this so-called species, Fusarium oxysporum, is still unknown. Many techniques have been applied in order to gain insight into the observed genetic diversity of this species. The current classification system is based on vegetative compatibility groups (VCGs). Vegetative compatibility is a self/non-self recognition system in which only those belonging to a VCG can form stable heterokaryons, cells containing two distinct nuclei. Heterokaryons, in turn, are formed by hyphal anastomosis, the fusion of two hyphae. Furthermore, subsequent to heterokaryon formation, potential mechanisms exist which may generate genetic variability. One is through viral transfer upon hyphal anastomosis. The other mechanism is a form of mitotic recombination referred to as the parasexual cycle. Very little research has been performed to directly observe the cellular events (hyphal anastomosis, heterokaryon formation, and the parasexual cycle) in Fusarium oxysporum f. sp. cubense (Foc). The purpose of this research was to design and use methods which would allow for the detection of hyphal anastomosis and heterokaryon formation, as well as any characteristics surrounding this event, within and between VCGs in Foc. First, some general growth properties were recorded: the number of nuclei per hypha, the size of the hyphal tip cell, the size of the cell adjacent to the hyphal tip (pre-tip) cell, and the number of cells to the first branch point. Second, four methods were designed in order to assay hyphal anastomosis and heterokaryon formation: 1) pairings on membrane: phase or brightfield microscopy, 2) pairings on membrane: fluorescence microscopy, 3) spore crosses: fluorescence microscopy, and 4) double picks in fractionated MMA. All of these methods were promising.
Abstract:
Digital images are used in many areas to solve day-to-day problems. In medicine, the use of computer systems has improved diagnosis and medical interpretation. Dentistry is no different: procedures assisted by computers increasingly support dentists in their tasks. In this context, the area of dentistry known as public oral health is responsible for the diagnosis and oral health treatment of a population. To this end, visual oral inspections are held in order to obtain oral health status information for a given population. From this collection of information, also known as an epidemiological survey, the dentist can plan and evaluate the actions taken for the different problems identified. This procedure has limiting factors, such as the limited number of qualified professionals available to perform these tasks and differing diagnostic interpretations, among other factors. Given this context arose the idea of using intelligent systems techniques to support these tasks. Thus, this work proposes the development of an intelligent system able to segment, count and classify teeth in occlusal intraoral digital photographic images. The proposed system makes combined use of machine learning techniques and digital image processing. We first carried out a color-based segmentation of the regions of interest, teeth and non-teeth, in the images through the use of a Support Vector Machine (SVM). After identifying these regions, techniques based on morphological operators, such as erosion and the watershed transform, were used for counting and for detecting the boundaries of the teeth, respectively. Once the tooth borders were detected, it was possible to calculate Fourier descriptors for their shape together with position descriptors. The teeth were then classified according to their types using the SVM with the one-against-all method for the multiclass problem. The multiclass classification problem was approached in two different ways. In the first approach we considered three class types: molar, premolar and non-teeth; in the second, five class types were considered: molar, premolar, canine, incisor and non-teeth. The system presented satisfactory performance in the segmentation, counting and classification of the teeth present in the images.
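The one-against-all SVM classification step can be sketched with scikit-learn; the synthetic Gaussian clusters below are illustrative stand-ins for the Fourier shape and position descriptors, not the system's actual features.

```python
import numpy as np
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Toy stand-in features: three well-separated 4-D Gaussian clusters,
# playing the role of classes such as molar / premolar / non-tooth.
X = np.vstack([rng.normal(loc=m, scale=0.2, size=(30, 4))
               for m in (0.0, 1.5, 3.0)])
y = np.repeat([0, 1, 2], 30)

# One-against-all: OneVsRestClassifier trains one binary SVM per class
# and predicts the class whose separator scores highest.
clf = OneVsRestClassifier(SVC(kernel="rbf")).fit(X, y)
print(clf.score(X, y))  # training accuracy on the separable clusters
```

For the five-class variant, the same wrapper simply trains five binary SVMs, one per tooth type, with no change to the calling code beyond the labels.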