806 results for Representation Targets
Abstract:
This thesis reports the results of studies on the laser blow-off (LBO) plasma produced from a LiF-C (lithium fluoride with carbon) thin-film target, which is of particular importance in tokamak plasma diagnostics. In view of its significance, the plasma generated by irradiating the thin-film target with nanosecond pulses from an Nd:YAG laser has been characterized by fast photography using an intensified CCD. In comparison with other diagnostic techniques, imaging studies provide a better understanding of the plasma geometry (size, shape, divergence, etc.) and of the structural formations inside the plume during different stages of expansion.
Abstract:
The classic experiment of Heinrich Hertz verified Maxwell's theoretical prediction that both radio and light waves are physical phenomena governed by the same physical laws. This ushered in a new era of interest in the interaction of electromagnetic energy with matter. The scattering of electromagnetic waves from a target is cleverly exploited in RADAR, an electronic system used to detect and locate objects under unfavourable conditions or obscuration that would render the unaided eye useless. It also provides a means for precisely measuring the range, or distance, of an object and the speed of a moving object. When an obstacle is illuminated by electromagnetic waves, energy is dispersed in all directions. The dispersed energy depends on the size, shape and composition of the obstacle and on the frequency and nature of the incident wave. This distribution of energy is known as 'scattering' and the obstacle as the 'scatterer' or 'target'.
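As a worked illustration (not part of the original abstract), the two standard relations behind radar range and speed measurement can be stated explicitly. For an echo received after a round-trip delay \(\tau\), the range is

\[ R = \frac{c\,\tau}{2}, \]

and for a target moving with radial velocity \(v_r\), the echo is Doppler-shifted by

\[ f_d = \frac{2 v_r}{\lambda}, \]

where \(c\) is the speed of light and \(\lambda\) the carrier wavelength. For example, a delay of 100 microseconds corresponds to a range of \((3\times10^{8}\ \text{m/s} \times 10^{-4}\ \text{s})/2 = 15\ \text{km}\).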
Squeezed Coherent State Representation of Scalar Field and Particle Production in the Early Universe
Abstract:
The present work is an attempt to explain particle production in the early universe. We argue that nonzero values of the stress-energy tensor evaluated in the squeezed vacuum state can be due to particle production, and this supports the concept of particle production from zero-point quantum fluctuations. In the present calculation we use the squeezed coherent state introduced by Fan and Xiao [7]. The vacuum expectation values of the stress-energy tensor, defined prior to any dynamics in the background gravitational field, give all the information about particle production. Squeezing of the vacuum is achieved by means of the background gravitational field, which plays the role of a parametric amplifier [8]. The present calculation shows that the vacuum expectation values of the energy density and pressure contain terms in addition to the classical zero-point energy terms. The calculation of the particle production probability shows that the probability increases as the squeezing parameter increases, reaches a maximum value, and then decreases.
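For context, a minimal sketch in the standard single-mode formalism (the specific Fan-Xiao construction [7] is not reproduced here): the squeezed vacuum is obtained by acting on the vacuum with the squeeze operator,

\[ |\zeta\rangle = S(\zeta)\,|0\rangle, \qquad S(\zeta) = \exp\!\Big[\tfrac{1}{2}\big(\zeta^{*}\hat{a}^{2} - \zeta\,\hat{a}^{\dagger 2}\big)\Big], \qquad \zeta = r e^{i\theta}, \]

and its mean quantum number,

\[ \langle \zeta|\,\hat{a}^{\dagger}\hat{a}\,|\zeta\rangle = \sinh^{2} r, \]

is nonzero for any squeezing parameter \(r > 0\), which is the sense in which squeezing of the vacuum by the background gravitational field amounts to particle production.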
Abstract:
DNA sequence representation methods are used to denote a gene structure effectively and help in the analysis of similarities/dissimilarities between coding sequences. Many different kinds of representations have been proposed in the literature. They can be broadly classified into numerical, graphical, geometrical and hybrid representation methods. Graphical and geometrical representation methods ease DNA structure and function analysis, since they give a visual representation of a DNA structure. In numerical methods, numerical values are assigned to a sequence, and digital signal processing techniques are used to analyze it. Hybrid approaches have also been reported in the literature for analyzing DNA sequences. This paper reviews the latest developments in DNA sequence representation methods. We also present a taxonomy of the various methods, and compare these methods wherever possible.
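As a concrete illustration (a minimal sketch, not taken from the paper), one widely used numerical representation is the Voss binary indicator mapping, in which each of the four bases generates a 0/1 indicator signal that can then be analyzed with standard DSP tools such as the DFT:

```python
# Minimal sketch of one common numerical DNA representation:
# the Voss binary indicator mapping, which turns a sequence into
# four 0/1 indicator signals (one per base) suitable for DSP analysis.
# Illustrative only; the survey covers many other mappings.

import numpy as np

def voss_indicators(seq: str) -> dict[str, np.ndarray]:
    """Map a DNA string to four binary indicator signals."""
    seq = seq.upper()
    return {base: np.array([1 if ch == base else 0 for ch in seq])
            for base in "ACGT"}

def spectrum(signal: np.ndarray) -> np.ndarray:
    """Power spectrum of one indicator signal via the DFT."""
    return np.abs(np.fft.fft(signal)) ** 2

if __name__ == "__main__":
    indicators = voss_indicators("ATGGCTAGCTAGGATCCA")
    # Summing the four spectra exposes periodicities such as the
    # period-3 component typical of protein-coding regions.
    total = sum(spectrum(s) for s in indicators.values())
    print(total[:5])
```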
Abstract:
Diabetes mellitus is a heterogeneous metabolic disorder characterized by hyperglycemia, with disturbances in carbohydrate, protein and lipid metabolism resulting from defects in insulin secretion, insulin action or both. Currently there are 387 million people with diabetes worldwide, and the disease is expected to affect 592 million people by 2035. Insulin resistance in peripheral tissues and pancreatic beta cell dysfunction are the major challenges in the pathophysiology of diabetes. Diabetic secondary complications (such as liver cirrhosis, retinopathy, and microvascular and macrovascular complications) arising from persistent hyperglycemia and dyslipidemia can be disabling or even life threatening. Current medications are effective for the control and management of hyperglycemia, but undesirable effects, inefficacy against secondary complications and high cost remain serious issues in the present prognosis of this disorder. Hence the search for more effective and safer therapeutic agents of natural origin is in high demand and attracts attention in present drug discovery research. The data available from Ayurveda on various medicinal plants for the treatment of diabetes can efficiently yield potential new leads as antidiabetic agents. For wider acceptability and popularity of the herbal remedies available in Ayurveda, scientific validation through elucidation of the mechanism of action is essential. Modern biological techniques are now available to elucidate the biochemical basis of the effectiveness of these medicinal plants. With this in mind, the research programme of this thesis was planned to evaluate the molecular mechanism responsible for the antidiabetic property of Symplocos cochinchinensis, the main ingredient of Nishakathakadi Kashayam, a well-known Ayurvedic antidiabetic preparation. A general introduction to diabetes, its pathophysiology, secondary complications and current treatment options, and innovative solutions based on phytomedicine is given in Chapter 1. The effect of Symplocos cochinchinensis (SC) on various in vitro biochemical targets relevant to diabetes is described in Chapter 2, including the preparation of the plant extract. Since diabetes is a multifactorial disease, the ethanolic extract of the bark of SC (SCE) and its fractions (hexane, dichloromethane, ethyl acetate and 90 % ethanol) were evaluated by in vitro methods against multiple targets, such as control of postprandial hyperglycemia, insulin resistance, oxidative stress, pancreatic beta cell proliferation, inhibition of protein glycation, protein tyrosine phosphatase-1B (PTP-1B) and dipeptidyl peptidase-IV (DPP-IV). Among the extracts, SCE exhibited comparatively better activity, including alpha-glucosidase inhibition, insulin-dependent glucose uptake (3-fold increase) in L6 myotubes, pancreatic beta cell regeneration in RIN-m5F cells, reduced triglyceride accumulation in 3T3-L1 cells, and protection from hyperglycemia-induced generation of reactive oxygen species in HepG2 cells, with moderate antiglycation and PTP-1B inhibition. Chemical characterization by HPLC revealed the superiority of SCE over the other extracts, owing to the presence of bioactives (beta-sitosterol, phloretin 2'-glucoside, oleanolic acid) in addition to minerals such as magnesium, calcium, potassium, sodium, zinc and manganese. SCE was therefore subjected to an oral sucrose tolerance test to evaluate its antihyperglycemic property in mild diabetic and diabetic animal models.
SCE showed significant antihyperglycemic activity in in vivo diabetic models. Chapter 3 highlights the beneficial effects of the hydroethanol extract of Symplocos cochinchinensis (SCE) against hyperglycemia-associated secondary complications in a streptozotocin (60 mg/kg body weight) induced diabetic rat model. Proper sanction was obtained for all the animal experiments from the CSIR-CDRI institutional animal ethics committee. The experimental groups consisted of normal control (NC), N + SCE 500 mg/kg bwd, diabetic control (DC), D + metformin 100 mg/kg bwd, D + SCE 250 and D + SCE 500. SCE and metformin were administered daily for 21 days, and the animals were sacrificed on day 22. Oral glucose tolerance test, plasma insulin, % HbA1c, urea, creatinine, aspartate aminotransferase (AST), alanine aminotransferase (ALT), albumin, total protein, etc. were analysed. Aldose reductase (AR) activity in the eye lens was also checked. On day 21, DC rats showed a significantly abnormal glucose response, HOMA-IR and % HbA1c, decreased activity of antioxidant enzymes and GSH, and elevated AR activity and hepatic and renal oxidative stress markers compared to NC. DC rats also exhibited increased levels of plasma urea and creatinine. Treatment with SCE protected against these deleterious alterations of the biochemical parameters in a dose-dependent manner, including histopathological alterations in the pancreas. SCE 500 exhibited a significant glucose-lowering effect and decreased HOMA-IR, % HbA1c, lens AR activity, and hepatic and renal oxidative stress and function markers compared to the DC group. A considerable amount of liver and muscle glycogen was replenished by SCE treatment in diabetic animals. Although metformin showed a better effect, the activity of SCE was very much comparable with this drug. The possible molecular mechanism behind the protective property of S. cochinchinensis against insulin resistance in peripheral tissue as well as dyslipidemia in an in vivo high-fructose saturated-fat diet model is described in Chapter 4. Initially, animals were fed a high-fructose saturated-fat (HFS) diet for a period of 8 weeks to develop insulin resistance and dyslipidemia. The experimental groups were normal diet control (ND), ND + SCE 500 mg/kg bwd, high-fructose saturated-fat diet control (HFS), HFS + metformin 100 mg/kg bwd, HFS + SCE 250 and HFS + SCE 500. SCE and metformin were administered daily for the next 3 weeks, and the animals were sacrificed at the end of the 11th week. At the end of week 11, HFS rats showed significantly abnormal glucose and insulin tolerance, HOMA-IR, % HbA1c, adiponectin, lipid profile, liver glycolytic and gluconeogenic enzyme activities, and liver and muscle triglyceride accumulation compared to ND. HFS rats also exhibited increased levels of plasma inflammatory cytokines and upregulated mRNA levels of gluconeogenic and lipogenic genes in the liver. HFS rats showed increased expression of GLUT-2 in the liver and decreased expression of GLUT-4 in muscle and adipose tissue. SCE treatment also preserved the architecture of the pancreas, liver and kidney tissues. Treatment with SCE reversed the alterations of the biochemical parameters and improved insulin sensitivity by modifying gene expression in liver, muscle and adipose tissues. Overall, the results suggest that SC mediates its antidiabetic activity mainly via alpha-glucosidase inhibition and improved insulin sensitivity, together with antiglycation and antioxidant activities.
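For reference, the HOMA-IR index cited repeatedly above is conventionally computed from fasting values with the Matthews et al. formula; the thesis presumably uses this or an equivalent form:

\[ \text{HOMA-IR} = \frac{\text{fasting glucose (mmol/L)} \times \text{fasting insulin } (\mu\text{U/mL})}{22.5}. \]

For example, a fasting glucose of 8 mmol/L with a fasting insulin of 15 µU/mL gives \(\text{HOMA-IR} = (8 \times 15)/22.5 \approx 5.3\), well above the commonly used normal threshold of about 2.5.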
Abstract:
Semantic Web Mining aims at combining the two fast-developing research areas Semantic Web and Web Mining. This survey analyzes the convergence of trends from both areas: Growing numbers of researchers work on improving the results of Web Mining by exploiting semantic structures in the Web, and they use Web Mining techniques for building the Semantic Web. Last but not least, these techniques can be used for mining the Semantic Web itself. The second aim of this paper is to use these concepts to circumscribe what Web space is, what it represents and how it can be represented and analyzed. This is used to sketch the role that Semantic Web Mining and the software agents and human agents involved in it can play in the evolution of Web space.
Abstract:
In this work, we present an atomistic-continuum model for simulations of ultrafast laser-induced melting processes in semiconductors, using silicon as the example. The kinetics of transient non-equilibrium phase transition mechanisms is addressed with the MD method at the atomic level, whereas laser light absorption, the strong generated electron-phonon nonequilibrium, fast heat conduction, and photo-excited free carrier diffusion are accounted for with a continuum TTM-like model (called nTTM). First, we independently consider the applications of nTTM and MD for the description of silicon, and then construct the combined MD-nTTM model. Its development and thorough testing are followed by a comprehensive computational study of the fast nonequilibrium processes induced in silicon by ultrashort laser irradiation. The new model allowed us to investigate the effect of laser-induced pressure and lattice temperature on the melting kinetics. Two competing melting mechanisms, heterogeneous and homogeneous, were identified in our large-scale simulations. Apart from the classical heterogeneous melting mechanism, the nucleation of the liquid phase homogeneously inside the material contributes significantly to the melting process. The simulations showed that, due to the open diamond structure of the crystal, the laser-generated internal compressive stresses reduce the crystal's stability against homogeneous melting. Consequently, the latter can take on a massive character within several picoseconds of the laser heating. Due to the large negative volume of melting of silicon, the material contracts upon the phase transition, relaxing the compressive stresses, and the subsequent melting proceeds heterogeneously until the excess thermal energy is consumed. A series of simulations over a range of absorbed fluences allowed us to find the threshold fluence at which homogeneous liquid nucleation starts contributing to the classical heterogeneous propagation of the solid-liquid interface. A series of simulations over a range of material thicknesses showed that the sample width chosen in our simulations (800 nm) corresponds to a thick sample. Additionally, in order to support the main conclusions, the results were verified with a different interatomic potential. Possible improvements of the model to account for nonthermal effects are discussed, and certain restrictions on the suitable interatomic potentials are identified. As a first step towards the inclusion of these effects in MD-nTTM, we performed nanometer-scale MD simulations with a new interatomic potential designed to reproduce ab initio calculations at a laser-induced electronic temperature of 18946 K. The simulations demonstrated that, similarly to thermal melting, the nonthermal phase transition occurs through nucleation. A series of simulations showed that higher (lower) initial pressure reinforces (hinders) the creation and growth of nonthermal liquid nuclei. For the example of Si, the laser melting kinetics of semiconductors was found to be noticeably different from that of metals with a face-centered cubic crystal structure. The results of this study therefore have important implications for the interpretation of experimental data on the kinetics of the melting process in semiconductors.
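For orientation, the classical two-temperature model on which TTM-like descriptions build couples electron and lattice heat baths (the nTTM used here additionally tracks the photo-excited free-carrier density; those extra equations are not reproduced in this sketch):

\[ C_e \frac{\partial T_e}{\partial t} = \nabla\!\cdot\!\big(k_e \nabla T_e\big) - G\,(T_e - T_l) + S(z, t), \]
\[ C_l \frac{\partial T_l}{\partial t} = \nabla\!\cdot\!\big(k_l \nabla T_l\big) + G\,(T_e - T_l), \]

where \(T_e\) and \(T_l\) are the electron and lattice temperatures, \(C\) and \(k\) the respective heat capacities and conductivities, \(G\) the electron-phonon coupling constant, and \(S(z, t)\) the laser source term.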
Abstract:
Ontic is an interactive system for developing and verifying mathematics. Ontic's verification mechanism is capable of automatically finding and applying information from a library containing hundreds of mathematical facts. Starting with only the axioms of Zermelo-Fraenkel set theory, the Ontic system has been used to build a data base of definitions and lemmas leading to a proof of the Stone representation theorem for Boolean lattices. The Ontic system has been used to explore issues in knowledge representation, automated deduction, and the automatic use of large data bases.
Abstract:
This paper describes a system for the computer understanding of English. The system answers questions, executes commands, and accepts information in normal English dialog. It uses semantic information and context to understand discourse and to disambiguate sentences. It combines a complete syntactic analysis of each sentence with a "heuristic understander" which uses different kinds of information about a sentence, other parts of the discourse, and general information about the world in deciding what the sentence means. It is based on the belief that a computer cannot deal reasonably with language unless it can "understand" the subject it is discussing. The program is given a detailed model of the knowledge needed by a simple robot having only a hand and an eye. We can give it instructions to manipulate toy objects, interrogate it about the scene, and give it information it will use in deduction. In addition to knowing the properties of toy objects, the program has a simple model of its own mentality. It can remember and discuss its plans and actions as well as carry them out. It enters into a dialog with a person, responding to English sentences with actions and English replies, and asking for clarification when its heuristic programs cannot understand a sentence through use of context and physical knowledge.
Abstract:
We present a set of techniques that can be used to represent and detect shapes in images. Our methods revolve around a particular shape representation based on the description of objects using triangulated polygons. This representation is similar to the medial axis transform and has important properties from a computational perspective. The first problem we consider is the detection of non-rigid objects in images using deformable models. We present an efficient algorithm to solve this problem in a wide range of situations, and show examples in both natural and medical images. We also consider the problem of learning an accurate non-rigid shape model for a class of objects from examples. We show how to learn good models while constraining them to the form required by the detection algorithm. Finally, we consider the problem of low-level image segmentation and grouping. We describe a stochastic grammar that generates arbitrary triangulated polygons while capturing Gestalt principles of shape regularity. This grammar is used as a prior model over random shapes in a low level algorithm that detects objects in images.
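As a toy illustration (a minimal sketch under our own assumptions, not the thesis's actual representation or code): a triangulated polygon can be stored as boundary vertices plus index triples, and its dual graph, which links triangles sharing an edge, is a tree for any triangulated simple polygon. That tree structure is what enables efficient dynamic-programming style matching:

```python
# Minimal illustrative sketch (not the thesis's implementation) of a
# triangulated-polygon representation: boundary vertices plus index
# triples, with the dual graph linking triangles that share an edge.
# For a triangulated simple polygon the dual graph is a tree, which is
# what makes efficient (dynamic-programming style) matching possible.

from itertools import combinations

Point = tuple[float, float]

def dual_graph(triangles: list[tuple[int, int, int]]) -> dict[int, list[int]]:
    """Adjacency between triangles sharing an edge (two vertices)."""
    adj: dict[int, list[int]] = {i: [] for i in range(len(triangles))}
    for i, j in combinations(range(len(triangles)), 2):
        if len(set(triangles[i]) & set(triangles[j])) == 2:
            adj[i].append(j)
            adj[j].append(i)
    return adj

if __name__ == "__main__":
    vertices: list[Point] = [(0, 0), (2, 0), (2, 2), (0, 2)]  # a square
    triangles = [(0, 1, 2), (0, 2, 3)]  # fan triangulation
    print(dual_graph(triangles))  # {0: [1], 1: [0]} -- a tree
```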
Abstract:
We present a novel scheme ("Categorical Basis Functions", CBF) for object class representation in the brain and contrast it to the "Chorus of Prototypes" scheme recently proposed by Edelman. The power and flexibility of CBF is demonstrated in two examples. CBF is then applied to investigate the phenomenon of Categorical Perception, in particular the finding by Bulthoff et al. (1998) of categorization of faces by gender without corresponding Categorical Perception. Here, CBF makes predictions that can be tested in a psychophysical experiment. Finally, experiments are suggested to further test CBF.