934 results for Rule-based techniques


Relevance:

30.00%

Publisher:

Abstract:

Continuous epidural analgesia (CEA) and continuous spinal postoperative analgesia (CSPA) provided by a mixture of local anaesthetic and opioid are widely used for postoperative pain relief. With the introduction of so-called microcatheters, for example, CSPA found its way particularly into orthopaedic surgery. These techniques, however, may be associated with dose-dependent side-effects such as hypotension, weakness in the legs, and nausea and vomiting. At times, they may fail to offer sufficient analgesia, e.g., because of a misplaced catheter. The correct position of an epidural catheter might be confirmed by the supposedly easy and reliable epidural stimulation test (EST). The aims of this thesis were to determine a) whether the efficacy, tolerability, and reliability of CEA might be improved by adding adrenaline or the α2-adrenergic agonist clonidine to CEA, and by the repeated use of EST during CEA; and b) the feasibility of CSPA given through a microcatheter after vascular surgery. Studies I–IV were double-blinded, randomized, and controlled trials; Study V was of a diagnostic, prospective nature. Patients underwent arterial bypass surgery of the legs (I, n=50; IV, n=46), total knee arthroplasty (II, n=70; III, n=72), and abdominal surgery or thoracotomy (V, n=30). Postoperative lumbar CEA consisted of regular mixtures of ropivacaine and fentanyl, either without or with adrenaline (2 µg/ml (I) and 4 µg/ml (II)) or clonidine (2 µg/ml (III)). CSPA (IV) was given through a microcatheter (28G) and contained either ropivacaine (max. 2 mg/h) or a mixture of ropivacaine (max. 1 mg/h) and morphine (max. 8 µg/h). Epidural catheter tip position (V) was evaluated both by EST, at the moment of catheter placement and several times during CEA, and by epidurography as the reference diagnostic test. CEA and CSPA were administered for 24 or 48 h. Study parameters included pain scores assessed with a visual analogue scale, requirements for rescue pain medication, vital signs, and side-effects. Adrenaline (I and II) had no beneficial influence on the efficacy or tolerability of CEA; the total amounts of epidurally infused drugs were even increased in the adrenaline group in Study II (p=0.02, RM ANOVA). Clonidine (III) augmented pain relief with lower amounts of epidurally infused drugs (p=0.01, RM ANOVA) and a reduced need for rescue oxycodone given i.m. (p=0.027, MW-U; median difference 3 mg (95% CI 0–7 mg)). Clonidine did not contribute to sedation, and its influence on haemodynamics was minimal. CSPA (IV) provided satisfactory pain relief with only limited blockade of the legs (no inter-group differences). EST (V) was often beset by technical problems and difficulties of interpretation; e.g., it failed to identify the four patients whose catheters lay outside the spinal canal from the time of placement. As adjuvants to lumbar CEA, clonidine only slightly improved pain relief, while adrenaline did not provide any benefit. The role of EST applied at the time of epidural catheter placement or repeatedly during CEA remains open. The microcatheter CSPA technique appeared effective and reliable, but needs to be compared with routine CEA after peripheral arterial bypass surgery.

Relevance:

30.00%

Publisher:

Abstract:

Four new 5-aminoisophthalates of cobalt and nickel have been prepared by hydro/solvothermal methods: [Co2(C8H5NO4)2(C4H4N2)(H2O)2]·3H2O (I), [Ni2(C8H5NO4)2(C4H4N2)(H2O)2]·3H2O (II), [Co2(H2O)(μ3-OH)2(C8H5NO4)] (III), and [Ni2(H2O)(μ3-OH)2(C8H5NO4)] (IV). Compounds I and II are isostructural, having anion-deficient CdCl2-related layers bridged by a pyrazine ligand, giving rise to a bilayer arrangement. Compounds III and IV have one-dimensional M−O(H)−M chains connected by the 5-aminoisophthalate units, forming a three-dimensional structure. The coordinated as well as the lattice water molecules of I and II could be removed and reinserted by simple heating−cooling cycles under atmospheric conditions. The removal of the coordinated water molecule is accompanied by changes in the coordination environment around the M2+ centre (M = Co, Ni) and in the color of the samples (purple to blue for Co; green to dark yellow for Ni). This change has been examined by a variety of techniques, including in situ single-crystal-to-single-crystal transformation studies and in situ IR and UV−vis spectroscopy. Magnetic studies indicate antiferromagnetic behavior in I and II, field-induced magnetism in III, and canted antiferromagnetic behavior in IV.

Relevance:

30.00%

Publisher:

Abstract:

A considerable amount of work has been dedicated to the development of analytical solutions for the flow of chemical contaminants through soils. Most of the analytical solutions for complex transport problems are closed-form series solutions whose convergence depends on the eigenvalues obtained from a corresponding transcendental equation. Thus, the difficulty in obtaining exact solutions from analytical models encourages the use of numerical solutions for parameter estimation, even though the latter models are computationally expensive. In this paper, a combination of two swarm-intelligence-based algorithms is used for accurate estimation of design transport parameters from the closed-form analytical solutions. Estimation of eigenvalues from a transcendental equation is treated as a multimodal, discontinuous function-optimization problem, and the eigenvalues are estimated using an algorithm derived from the glowworm swarm strategy. Parameter estimation for the inverse problem is handled using a standard PSO algorithm. Integrating these two algorithms enables accurate estimation of design parameters from closed-form analytical solutions. The solver is applied to a real-world inverse problem in environmental engineering; the swarm-intelligence-based inverse model is validated, and its accuracy in parameter estimation is demonstrated. The proposed solver quickly estimates the design parameters with great precision.
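
The eigenvalue step lends itself to a compact illustration. The sketch below is a minimal one-dimensional glowworm swarm optimization (GSO) run against an assumed Robin-type transcendental equation, β·cos β + H·sin β = 0, standing in for the paper's transport eigenvalue equation, which is not reproduced here. The canonical GSO also adapts each agent's decision range and picks neighbours with luciferin-weighted probabilities; both are simplified here, and a few Newton steps polish each cluster of agents.

```python
import numpy as np

def f(beta, H=1.0):
    # Illustrative Robin-type transcendental equation; its roots beta_m
    # stand in for the series-solution eigenvalues.
    return beta * np.cos(beta) + H * np.sin(beta)

def gso_eigenvalues(lo=0.5, hi=20.0, n=200, iters=300,
                    rho=0.4, gamma=0.6, step=0.03, r0=1.0, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, n)                 # glowworm positions
    luc = np.full(n, 5.0)                      # luciferin levels
    for _ in range(iters):
        luc = (1 - rho) * luc - gamma * np.abs(f(x))   # brighter near roots
        for i in range(n):
            nbrs = np.where((np.abs(x - x[i]) < r0) & (luc > luc[i]))[0]
            if nbrs.size:                      # hop toward a brighter
                j = rng.choice(nbrs)           # neighbour (uniform pick here)
                x[i] += step * np.sign(x[j] - x[i])
        x = np.clip(x, lo, hi)
    # Cluster the congregated agents into candidate roots, then polish each
    # with a few finite-difference Newton iterations.
    roots, h = [], 1e-6
    for xi in np.sort(x[np.abs(f(x)) < 0.6]):
        if not roots or xi - roots[-1] > 0.5:
            roots.append(xi)
    polished = []
    for c in roots:
        for _ in range(20):
            c -= f(c) / ((f(c + h) - f(c - h)) / (2 * h))
        polished.append(round(float(c), 4))
    return polished

print(gso_eigenvalues())   # approximate eigenvalues beta_1, beta_2, ... in (0.5, 20)
```

The inverse-problem step, fitting transport parameters so the series solution matches observed concentrations, would wrap a standard PSO around the same series evaluation; it follows the same swarm pattern and is omitted here.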

Relevance:

30.00%

Publisher:

Abstract:

Sugarcane has garnered much interest for its potential as a viable renewable energy crop. While sugar juice has been used for ethanol production for years, interest is growing in using the fibrous co-product known as bagasse to produce renewable fuels and bio-based chemicals. The success of these efforts, together with the development of new varieties of energy canes, could greatly increase the use of sugarcane and sugarcane biomass for fuels while enhancing industry sustainability and competitiveness. Sugarcane-Based Biofuels and Bioproducts examines the development of a suite of established and developing biofuels and other renewable products derived from sugarcane and sugarcane-based co-products, such as bagasse. Chapters provide broad-ranging coverage of sugarcane biology, biotechnological advances, and breakthroughs in production and processing techniques. This text brings together essential information on the development and utilization of new fuels and bioproducts derived from sugarcane. Authored by experts in the field, Sugarcane-Based Biofuels and Bioproducts is an invaluable resource for researchers studying biofuels, sugarcane, and plant biotechnology, as well as for sugar and biofuels industry personnel.

Relevance:

30.00%

Publisher:

Abstract:

Combining the advanced techniques of optimal dynamic inversion and model-following neuro-adaptive control design, an innovative technique is presented for designing an automatic drug administration strategy for effective treatment of chronic myelogenous leukemia (CML). A recently developed nonlinear mathematical model of cell dynamics is used to design the controller (medication dosage). First, a nominal controller is designed based on the principle of optimal dynamic inversion. This controller treats nominal model patients (patients described by the mathematical model with the nominal parameter values) effectively. However, since the system parameters of a realistic model patient can differ from those of the nominal model, simulation studies for such patients indicate that the nominal controller is either inefficient or, worse, ineffective; i.e., the trajectory of the number of cancer cells either shows unsatisfactory transient behavior or grows in an unstable manner. Hence, to make the drug dosage history more realistic and patient-specific, a model-following neuro-adaptive controller is augmented to the nominal controller. In this adaptive approach, a neural network trained online provides the adaptive control term. The training process is based on Lyapunov stability theory, which guarantees both stability of the cancer cell dynamics and boundedness of the network weights. Simulation studies show this adaptive control design to be very effective in treating CML for realistic patients. Sufficient generality is retained in the mathematical development so that the technique can be applied to other similar nonlinear control design problems as well.
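
As a concrete illustration of the nominal (dynamic inversion) step, the sketch below applies the same idea to a toy logistic cell-growth model rather than the paper's CML model: with dynamics x' = r·x·(1 - x/K) - κ·u·x affine in the dose u, solving for the u that enforces first-order error decay e' = -λ·e toward a shrinking target trajectory gives the dose law directly. All parameter values here are invented for illustration.

```python
import numpy as np

# Toy stand-in for the cell-dynamics model (NOT the paper's CML model):
#   x' = r*x*(1 - x/K) - kappa*u*x,  x = cancer-cell count, u = drug dose.
# Inverting the dynamics to enforce e' = -lam*e, with e = x - x_d, yields
# an explicit dose law; all constants below are assumed.
r, K, kappa, lam = 0.1, 1e12, 0.05, 0.2

def dose(x, x_d, xdot_d):
    """Nominal dynamic-inversion controller for the toy model."""
    e = x - x_d
    u = (r * x * (1 - x / K) - (xdot_d - lam * e)) / (kappa * x)
    return max(u, 0.0)                 # a dose cannot be negative

# Drive the cell count down along the target x_d(t) = x0 * exp(-0.05 t).
x = x0 = 1e11
dt = 0.1
for k in range(1000):
    t = k * dt
    x_d = x0 * np.exp(-0.05 * t)
    u = dose(x, x_d, -0.05 * x_d)      # target derivative x_d' = -0.05 * x_d
    x += dt * (r * x * (1 - x / K) - kappa * u * x)   # Euler step
print(f"cells after 100 time units: {x:.3e} (target {x0 * np.exp(-5):.3e})")
```

The paper's neuro-adaptive augmentation addresses exactly the case this sketch cannot handle: when r, K, or κ differ from their nominal values, the inversion alone mis-doses the patient, and the online-trained network compensates for the mismatch.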

Relevance:

30.00%

Publisher:

Abstract:

The technological development of fast multisection helical computed tomography (CT) scanners has made it possible to use CT perfusion (CTp) and CT angiography (CTA) in evaluating acute ischemic stroke. This study focuses on new multidetector CT techniques, namely whole-brain and first-pass CT perfusion, plus CTA of the carotid arteries. Whole-brain CTp data are acquired during slow infusion of contrast material to achieve a constant contrast concentration in the cerebral vasculature. From these data, quantitative maps of perfused cerebral blood volume (pCBV) are constructed. The probability of cerebral infarction as a function of normalized pCBV was determined in patients with acute ischemic stroke. Normalized pCBV, expressed as a percentage of contralateral normal brain pCBV, was determined in the infarction core and in regions just inside and outside the boundary between infarcted and noninfarcted brain. The corresponding probabilities of infarction were 0.99, 0.96, and 0.11; R² was 0.73; and the differences in perfusion between the core and the inner and outer bands were highly significant. Thus, a probability-of-infarction curve can help predict the likelihood of infarction as a function of percentage normalized pCBV. First-pass CT perfusion is based on continuous cine imaging over a selected brain area during a bolus injection of contrast. During its first passage, the contrast material compartmentalizes in the intravascular space, resulting in transient tissue enhancement. Functional maps of cerebral blood flow (CBF), cerebral blood volume (CBV), and mean transit time (MTT) are then constructed. We compared the effects of three different iodine concentrations (300, 350, or 400 mg/mL) on peak enhancement of normal brain tissue, artery, and vein, stratified by region-of-interest (ROI) location, in 102 patients imaged within 3 hours of stroke onset. Monotonically increasing peak opacification was evident at all ROI locations, suggesting that CTp evaluation of patients with acute stroke is best performed with the highest available concentration of contrast agent. In another study, we investigated whether lesion volumes on CBV, CBF, and MTT maps within 3 hours of stroke onset predict final infarct volume, and whether all these parameters are needed for triage to intravenous recombinant tissue plasminogen activator (IV-rtPA). The effect of IV-rtPA on the affected brain was also investigated by measuring the volume of salvaged tissue in patients receiving IV-rtPA and in controls. CBV lesion volume did not necessarily represent dead tissue. MTT lesion volume alone can serve to identify the upper size limit of the abnormally perfused brain, and patients receiving IV-rtPA salvaged more brain than did controls. Carotid CTA was compared with carotid DSA for grading stenosis in patients with stroke symptoms. In CTA, the grade of stenosis was determined by means of axial source and maximum intensity projection (MIP) images as well as semiautomatic vessel analysis. CTA provides an adequate, less invasive alternative to conventional DSA, although it tends to underestimate clinically relevant grades of stenosis.

Relevance:

30.00%

Publisher:

Abstract:

Conventional invasive coronary angiography is the clinical gold standard for detecting coronary artery stenoses. Noninvasive multidetector computed tomography (MDCT) combined with retrospective ECG gating has recently been shown to permit visualization of the coronary artery lumen and detection of coronary artery stenoses. Single photon emission computed tomography (SPECT) perfusion imaging has been considered the reference method for evaluating nonviable myocardium, but magnetic resonance imaging (MRI) can accurately depict structure, function, effusion, and myocardial viability, with an overall capacity unmatched by any other single imaging modality. Magnetocardiography (MCG) noninvasively provides information about myocardial excitation propagation and repolarization without the use of electrodes; this evolving technique may be considered the magnetic equivalent of electrocardiography. The aim of the present series of studies was to evaluate changes in the myocardium caused by coronary artery disease, as assessed with SPECT and MRI; to examine the capability of multidetector computed tomography coronary angiography (MDCT-CA) to detect significant stenoses in the coronary arteries; and to assess remote myocardial infarctions with MCG. Our study showed that in severe, progressing coronary artery disease, laser treatment does not improve global left ventricular function or myocardial perfusion, but it does preserve systolic wall thickening in fixed defects (scar) and prevents ischemic myocardial regions from turning into scar. The MCG repolarization variables are informative in remote myocardial infarction and may perform as well as the conventional QRS criteria in detecting healed myocardial infarction; these ST-T abnormalities are more pronounced in patients with Q-wave than with non-Q-wave infarctions. Compared with conventional coronary angiography in patients with known coronary artery disease, MDCT-CA had a sensitivity of 82%, a specificity of 94%, a positive predictive value of 79%, and a negative predictive value of 95% for stenoses over 50% in the main coronary arteries. Left ventricular wall dysfunction, perfusion defects, and infarctions were detected in 50-78% of sectors assigned to calcifications or stenoses, but also in sectors supplied by normally perfused coronary arteries. In patients with severe aortic stenosis, MDCT showed low sensitivity (63%) in detecting obstructive coronary artery disease; massive calcifications complicated correct assessment of the coronary artery lumen.

Relevance:

30.00%

Publisher:

Abstract:

Novel chromogenic thiourea-based sensors, 4,4'-bis-[3-(4-nitrophenyl)thiourea]diphenyl ether 1 and 4,4'-bis-[3-(4-nitrophenyl)thiourea]diphenyl methane 2, having a nitrophenyl group as the signaling unit, have been synthesized and characterized by spectroscopic techniques and X-ray crystallography. Both sensors show visual detection as well as UV-vis and NMR spectral changes in the presence of fluoride and cyanide anions in organic solvent and in aqueous medium. The absorption spectra indicated that the host-guest complex forms in a 1:2 stoichiometric ratio. (C) 2010 Elsevier B.V. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Bandwidth allocation for multimedia applications during network congestion and failure poses technical challenges owing to the bursty and delay-sensitive nature of such applications. The growth of multimedia services on the Internet and the development of agent technology have led us to investigate new techniques for resolving bandwidth issues in multimedia communications. Agent technology is emerging as a flexible and promising solution for network resource management and QoS (Quality of Service) control in a distributed environment. In this paper, we propose an adaptive bandwidth allocation scheme for multimedia applications that deploys static and mobile agents. It is a run-time allocation scheme that functions at the network nodes: for every congested or failed link, the technique adaptively finds an alternate patch-up route and reallocates bandwidth for the affected multimedia applications. The designed method has been tested, analytically and by simulation, with various network sizes and conditions, and results are presented to assess the performance and effectiveness of the approach. This work also demonstrates some of the benefits of agent-based schemes: flexibility, adaptability, software reusability, and maintainability. (C) 2004 Elsevier Inc. All rights reserved.
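
A sketch of the rerouting step appears below, on a toy topology: when a link (u, v) becomes congested or fails, a node-resident agent searches the residual graph for the shortest patch-up route that bypasses the link and still has the flow's bandwidth free on every hop, then reserves capacity along it. Agent migration, signaling, and QoS bookkeeping from the paper are omitted, and names such as `patchup_route` are illustrative, not the paper's.

```python
import heapq

def patchup_route(graph, residual, u, v, demand):
    """Find an alternate route from u to v that bypasses the congested/failed
    link (u, v) and has at least `demand` residual bandwidth on every hop.
    `graph` maps node -> neighbor list; `residual` maps (a, b) -> free bw."""
    dist, prev = {u: 0}, {}
    pq = [(0, u)]
    while pq:
        d, a = heapq.heappop(pq)
        if a == v:
            break
        if d > dist.get(a, float("inf")):
            continue
        for b in graph[a]:
            if (a, b) == (u, v):               # skip the failed link itself
                continue
            if residual.get((a, b), 0) < demand:
                continue                       # not enough free bandwidth
            nd = d + 1                         # hop-count metric
            if nd < dist.get(b, float("inf")):
                dist[b], prev[b] = nd, a
                heapq.heappush(pq, (nd, b))
    if v not in prev:
        return None                            # no feasible patch-up route
    path, node = [v], v
    while node != u:
        node = prev[node]
        path.append(node)
    path.reverse()
    for a, b in zip(path, path[1:]):           # reserve bandwidth along route
        residual[(a, b)] -= demand
    return path

# Example: reroute a 5 Mb/s flow around the failed link (A, B).
g = {"A": ["B", "C"], "B": ["A", "C"], "C": ["A", "B"]}
res = {("A", "B"): 0, ("B", "A"): 0, ("A", "C"): 10,
       ("C", "A"): 10, ("C", "B"): 8, ("B", "C"): 8}
print(patchup_route(g, res, "A", "B", 5))      # -> ['A', 'C', 'B']
```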


Relevance:

30.00%

Publisher:

Abstract:

Close to one half of LHC events are expected to be due to elastic or inelastic diffractive scattering. Still, predictions based on extrapolations of experimental data at lower energies differ by large factors in estimating the relative rates of the diffractive event categories at LHC energies. Identifying diffractive events enables detailed studies of proton structure. The combined forward physics objects (rapidity gaps, forward multiplicity, and transverse energy flow) can be used to classify proton-proton collisions efficiently. Data samples recorded by the forward detectors, with a simple extension, will allow first estimates of the single diffractive (SD), double diffractive (DD), central diffractive (CD), and non-diffractive (ND) cross sections. This approach, which uses measurements of inelastic activity in the forward and central detector systems, is complementary to the detection and measurement of leading beam-like protons. In this investigation, three multivariate analysis approaches are assessed for classifying forward physics processes at the LHC. It is shown that with gene expression programming, neural networks, and support vector machines, diffraction can be efficiently identified within a large sample of simulated proton-proton scattering events. The event characteristics are visualized using the self-organizing map algorithm.
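
As a rough sketch of the multivariate step, the snippet below trains one of the three classifier families mentioned, a support vector machine, to separate diffractive from non-diffractive events on invented stand-in features (forward charged multiplicity, largest rapidity gap, forward transverse-energy flow). A real analysis would draw these observables and labels from Monte Carlo event generators, and gene expression programming and neural networks would be benchmarked the same way.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Invented stand-in for simulated pp events: label 1 = diffractive (SD/DD/CD),
# 0 = non-diffractive (ND); the feature distributions only mimic the expected
# qualitative behavior (ND events are busier, diffractive events show larger
# rapidity gaps and less forward transverse energy).
rng = np.random.default_rng(1)
n = 4000
y = rng.integers(0, 2, n)
mult = rng.poisson(lam=np.where(y == 0, 25, 8), size=n)    # forward multiplicity
gap = rng.gamma(np.where(y == 0, 1.0, 3.0), 1.2, size=n)   # largest rapidity gap
et = rng.gamma(np.where(y == 0, 6.0, 2.0), 1.5, size=n)    # forward E_T flow
X = np.column_stack([mult, gap, et]).astype(float)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
clf.fit(X_tr, y_tr)
print(f"held-out classification accuracy: {clf.score(X_te, y_te):.2f}")
```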

Relevance:

30.00%

Publisher:

Abstract:

Recently it has been recognized that evolutionary aspects play a major role in the conservation of a species. In this thesis I have combined evolutionary research with conservation studies to provide new insight into both fields. The study object is the house sparrow, a species whose features make it interesting for this type of study. The house sparrow has been ubiquitous almost all over the world, yet even though it is still abundant, several countries have reported major declines, which have taken place in a relatively short time and cover both urban and rural habitats. In Finland the species has declined by more than two thirds in just over two decades. In addition, as the house sparrow lives only in human-inhabited areas, it can also raise public awareness of conservation issues. I used both an extensive museum collection of house sparrows collected in the 1980s from all over Finland and samples collected in 2009 from 12 of the previously sampled localities. I used molecular techniques to study neutral genetic variation within, and genetic differentiation between, the study populations, and combined this knowledge with morphometric measurements. In addition, I analyzed eight heavy metals from the livers of house sparrows that lived in either rural or urban areas in the 1980s and evaluated the role of heavy metal pollution as a possible cause of the declines. Even though dispersal of house sparrows is limited, I found that just as the declines started in the 1980s the house sparrows formed a genetically panmictic population on the scale of the whole of Finland. Compared with Norway, where neutral genetic divergence has been found even over small geographic distances, I concluded that this difference is likely due to contrasting landscapes: in Finland the landscape is rather homogeneous, facilitating the movements of these birds and maintaining gene flow despite the low dispersal. To see whether the declines have affected the neutral genetic variation of the populations, I compared the historical and contemporary genetic data. I showed that even though genetic diversity has not decreased following the drastic declines, the populations have indeed become more differentiated from each other. This shows that even in a still fairly abundant species, declines can affect genetic variation. Genetic diversity and differentiation may approach their new equilibria at different rates, which emphasizes the importance of studying both; if differentiation has increased, it should be taken as a warning sign of a possible loss of genetic diversity in the future. One of the factors suggested to be responsible for the house sparrow declines is heavy metal pollution. Studying the livers of house sparrows from the 1980s, I discovered higher heavy metal concentrations in urban than in rural habitats, but the levels were comparatively low, so heavy metal pollution does not seem to be a direct cause of the declines in Finland. However, heavy metals are known to decrease the number of insects in urban areas, and thus in cities they may have an indirect effect on house sparrows. Although neutral genetic variation is an important tool for conservation genetics, it does not tell the whole story: since neutral genetic variation is not affected by selection, the information it provides can be one-sided.
Even when neutral genetic differentiation is low, there can be substantial variation in additive genetic traits, indicating local adaptation. I therefore compared neutral genetic differentiation with phenotypic differentiation and discovered that two traits out of seven are likely to be under directional selection, whereas the others could be shaped by random genetic drift. Bergmann's rule may underlie the observed directional selection on wing length and body mass. These results highlight the importance of estimating both neutral and adaptive genetic variation.

Relevance:

30.00%

Publisher:

Abstract:

We present a new computationally efficient method for large-scale polypeptide folding using coarse-grained elastic networks and gradient-based continuous optimization techniques. The folding is governed by minimization of an energy based on Miyazawa–Jernigan contact potentials. Using this method we are able to substantially reduce the computation time, on ordinary desktop computers, for simulating polypeptide folding from a fully unfolded state. We compare our results with available native-state structures from the Protein Data Bank (PDB) for a few de novo proteins and two natural proteins, Ubiquitin and Lysozyme. Based on our simulations we are able to draw the energy landscape of a small de novo protein, Chignolin. We also use two well-known software packages, MODELLER and GROMACS, to compare our results. Finally, we show how a modification of the standard elastic network model can lead to higher accuracy and lower simulation time.
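
A minimal sketch of the idea follows, with several stand-ins: one bead per residue, harmonic elastic-network bonds along the chain, a single uniform smooth contact well in place of the residue-pair-specific Miyazawa–Jernigan matrix, and SciPy's gradient-based L-BFGS-B as the continuous optimizer. All constants are assumed values, not the paper's.

```python
import numpy as np
from scipy.optimize import minimize

# One bead per residue; harmonic "elastic network" bonds along the chain plus
# a smooth pairwise contact well standing in for the residue-pair-specific
# Miyazawa-Jernigan potentials (a single uniform depth eps is used here).
N, b0, k_bond = 20, 3.8, 100.0     # residues, bond length (angstrom), stiffness
eps, r_c = 1.0, 6.5                # assumed contact depth and range

def energy(coords):
    x = coords.reshape(N, 3)
    d = np.linalg.norm(x[:, None] - x[None, :], axis=-1)
    bond = k_bond * (np.diag(d, 1) - b0) ** 2             # chain connectivity
    iu = np.triu_indices(N, 2)                            # non-bonded pairs
    contact = -eps / (1.0 + np.exp(2.0 * (d[iu] - r_c)))  # smooth contact well
    clash = 10.0 * np.clip(4.0 - d[iu], 0.0, None) ** 2   # soft excluded volume
    return bond.sum() + contact.sum() + clash.sum()

# Start from a fully unfolded (extended) chain, perturbed slightly so the
# optimizer can bend it off the straight line, and minimize with L-BFGS-B
# (the gradient is taken by finite differences for brevity).
rng = np.random.default_rng(0)
x0 = np.column_stack([b0 * np.arange(N), np.zeros(N), np.zeros(N)])
x0 += 0.1 * rng.standard_normal(x0.shape)
res = minimize(energy, x0.ravel(), method="L-BFGS-B")
print(f"energy: {energy(x0.ravel()):.2f} -> {res.fun:.2f}")  # collapse lowers E
```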

Relevance:

30.00%

Publisher:

Abstract:

The swelling pressure of soil depends upon various soil parameters, such as mineralogy, clay content, Atterberg's limits, dry density, moisture content, and initial degree of saturation, along with structural and environmental factors. It is very difficult to model and analyze swelling pressure while taking all these aspects into consideration, and various statistical/empirical methods have been attempted to predict swelling pressure from the index properties of soil. In this paper, the computational intelligence techniques of artificial neural networks and support vector machines are used to develop models, based on the available experimental results, that predict swelling pressure from five inputs: natural moisture content, dry density, liquid limit, plasticity index, and clay fraction. The generalization of the models to data outside the training set, a prerequisite for the successful application of any model, is discussed, and a detailed study of the relative performance of the two computational intelligence techniques is carried out using different statistical performance criteria.
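
A hedged sketch of the modelling step follows, with synthetic stand-in data since the experimental records are not reproduced here: both model families are fit on the paper's five inputs and compared on a held-out split using R², one of the usual statistical performance criteria.

```python
import numpy as np
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Synthetic stand-in for the experimental records; the five columns follow
# the paper's inputs, and the target relation is invented so the script runs.
rng = np.random.default_rng(7)
n = 300
X = np.column_stack([
    rng.uniform(5, 35, n),     # natural moisture content (%)
    rng.uniform(1.2, 2.0, n),  # dry density (g/cc)
    rng.uniform(25, 90, n),    # liquid limit (%)
    rng.uniform(5, 60, n),     # plasticity index (%)
    rng.uniform(10, 70, n),    # clay fraction (%)
])
y = 2.0 * X[:, 4] + 30.0 * X[:, 1] - 1.5 * X[:, 0] + rng.normal(0, 5, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=7)
models = {
    "ANN": make_pipeline(StandardScaler(),
                         MLPRegressor(hidden_layer_sizes=(10,),
                                      max_iter=5000, random_state=7)),
    "SVM": make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0)),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(name, "held-out R^2:", round(r2_score(y_te, model.predict(X_te)), 3))
```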
