998 results for Advent sermons.
Abstract:
Kinetic studies of macromolecular ligand-ligate interaction have generated ample interest since the advent of surface plasmon resonance-based instruments such as BIAcore. Most of the studies reported in the literature assume simple 1:1 Langmuir binding and complete reversibility of the system. However, we observed that in a high-affinity antigen-antibody system [human chorionic gonadotropin-monoclonal antibody (hCG-mAb)] dissociation is insignificant, and the sensorgram data cannot be used to measure the equilibrium and kinetic parameters. At low concentrations of mAb the complete sensorgram could be fitted to a single exponential. Interestingly, we found that at higher mAb concentrations the binding data did not conform to a simple bimolecular model. Instead, the data fitted a two-step model, which may be due to surface heterogeneity of the affinity sites. In this paper, we report on the global fit of the sensorgrams. We have developed a method by which a single two-minute sensorgram can be used in high-affinity systems to measure the association rate constant of the reaction and the functional capacity of the ligand (hCG) immobilized on the chip. We provide a rational explanation for the discrepancies generally observed in most BIAcore sensorgrams.
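As a minimal illustration of the kind of single-exponential fit described above (not the authors' implementation), the sketch below fits R(t) = Rmax·(1 − exp(−k_obs·t)) to a synthetic two-minute sensorgram and converts k_obs to an association rate constant via k_obs ≈ ka·C, which holds only when dissociation is negligible; the data, concentration, and function names are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

# Minimal sketch (not the authors' code): fit a single-exponential
# association model R(t) = Rmax * (1 - exp(-k_obs * t)) to one sensorgram.
# With negligible dissociation, k_obs ~= ka * C, so ka = k_obs / C and
# Rmax estimates the functional binding capacity of the immobilized ligand.

def association(t, rmax, k_obs):
    """1:1 binding with negligible dissociation (pseudo-first-order)."""
    return rmax * (1.0 - np.exp(-k_obs * t))

def fit_sensorgram(t, response, mab_conc_molar):
    """Return (Rmax in RU, k_obs in 1/s, ka in 1/(M*s)) for one injection."""
    p0 = (response.max(), 0.01)                 # rough initial guess
    (rmax, k_obs), _ = curve_fit(association, t, response, p0=p0)
    ka = k_obs / mab_conc_molar                 # valid only while kd*t << 1
    return rmax, k_obs, ka

if __name__ == "__main__":
    # Synthetic two-minute sensorgram at a hypothetical 5 nM mAb injection.
    t = np.linspace(0.0, 120.0, 240)            # seconds
    true_rmax, true_kobs = 850.0, 0.02          # RU, 1/s (made up)
    rng = np.random.default_rng(0)
    r = association(t, true_rmax, true_kobs) + rng.normal(0.0, 2.0, t.size)
    print(fit_sensorgram(t, r, mab_conc_molar=5e-9))
```

Extending the model to the two-step (double-exponential) behaviour reported at higher mAb concentrations would amount to summing two such exponential terms in `association`.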
Abstract:
There has been significant progress in our understanding of the pathogenesis of AS. The advent of genome-wide association studies has increased the number of known loci associated with AS to more than 40. The endoplasmic reticulum-resident aminopeptidases (ERAP) 1 and 2 were identified in this manner and are of particular interest. There appears to be a genetic as well as a functional interaction of ERAP1 and ERAP2 with HLA-B27, based on the known functions of these molecules. Recent studies on the structure, immunological effects and peptide-trimming properties of ERAP1 and ERAP2 have helped to provide insight into their pathogenic potential in AS. In this review, we explore the role of ERAP1 and ERAP2 in the pathogenesis of AS. © The Author 2015.
Abstract:
Introduction: Osteoporosis is the commonest metabolic bone disease worldwide. The clinical hallmark of osteoporosis is low-trauma fracture, the most devastating being hip fracture, which has significant effects on both morbidity and mortality. Sources of data: Data for this review have been gathered from the published literature and from a range of web resources. Areas of agreement: Genome-wide association studies in the field of osteoporosis have led to the identification of a number of loci associated with both bone mineral density and fracture risk, and have further increased our understanding of the disease. Areas of controversy: The early strategies for mapping osteoporosis disease genes reported only isolated associations, with replication in independent cohorts proving difficult. Neither candidate gene nor linkage studies showed association at the genome-wide level of significance. Growing points: The advent of massively parallel sequencing technologies has proved extremely successful in mapping monogenic diseases, leading to the utilization of this new technology in complex disease genetics. Areas timely for developing research: The identification of novel genes and pathways will potentially lead to the identification of novel therapeutic options for patients with osteoporosis. © 2014 The Author.
Abstract:
In recent times, CFD tools have become increasingly useful in engineering design studies, especially in the area of aerospace vehicles. This is largely due to the advent of high-speed computing platforms, in addition to the development of new efficient algorithms. Algorithms based on kinetic schemes have been shown to be very robust, and meshless methods offer certain advantages over other methods. Preliminary investigations of blood flow visualization through an artery using a CFD tool have shown encouraging results, which need to be further verified and validated.
Abstract:
Over the years, a wide range of methods to verify identity have been developed. Molecular markers have been used for identification since the 1920s, commencing with blood types and culminating with the advent of DNA techniques in the 1980s. Identification is required by authorities on many occasions, e.g. in disputed paternity cases, identification of the deceased, or crime investigation. To clarify maternal and paternal lineages, uniparental DNA markers in mtDNA and the Y chromosome can be utilized. These markers have several advantages: the male-specific Y chromosome can be used to identify a male from a mixture of male and female cells, e.g. in rape cases. MtDNA is durable and has a high copy number, allowing analyses even from old or degraded samples. However, both markers are lineage-specific rather than individualizing, and are susceptible to genetic drift. Prior to the application of any DNA marker in forensic casework, it is of utmost importance to investigate its qualities and peculiarities in the target population. Earlier studies of the Finnish population have shown reduced variation in the Y chromosome, but for mtDNA the results have been ambiguous. The results obtained here confirmed the low Y-chromosomal diversity in Finland. Detailed population analysis revealed large regional differences, and extremely reduced diversity especially in East Finland. Analysis of the factors affecting Y-chromosomal short tandem repeat (Y-STR) variation and mutation frequencies, together with a search for new polymorphic markers, resulted in a set of Y-STRs with especially high diversity in Finland. In contrast to the Y chromosome, neither reduced diversity nor regional differences were found in mtDNA within Finland. In fact, mtDNA diversity was found to be similar to that of other European populations. The revealed peculiarities in the uniparental markers are a legacy of Finnish population history. The results challenge the traditional explanation, which emphasizes relatively recent founder effects creating the observed east-west patterns. Uniparentally inherited markers, both mtDNA and the Y chromosome, are applicable for identification purposes in Finland. By adjusting the analysed Y marker set to meet the characteristics of the Finnish population, Y-chromosomal diversity increases and regional differentiation decreases, resulting in an increase in discrimination power and thus in the usefulness of Y-chromosomal analysis in forensic casework.
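As a hypothetical illustration of the summary statistics involved (the data below are invented, not from this study), the sketch computes Nei's haplotype diversity, h = n/(n−1)·(1 − Σ p_i²), and the discrimination capacity for a small set of Y-STR haplotypes; these are the kinds of measures used to compare marker sets across populations.

```python
from collections import Counter

# Illustrative sketch (hypothetical data, not from the study): two summary
# statistics commonly used to evaluate Y-STR marker sets in forensic work.

def haplotype_diversity(haplotypes):
    """Nei's unbiased haplotype diversity: h = n/(n-1) * (1 - sum(p_i^2))."""
    n = len(haplotypes)
    counts = Counter(haplotypes)
    sum_p2 = sum((c / n) ** 2 for c in counts.values())
    return n / (n - 1) * (1.0 - sum_p2)

def discrimination_capacity(haplotypes):
    """Fraction of distinct haplotypes observed in the sample."""
    return len(set(haplotypes)) / len(haplotypes)

if __name__ == "__main__":
    # Each haplotype is a tuple of repeat counts at a few assumed Y-STR loci.
    sample = [(13, 24, 16), (13, 24, 16), (14, 23, 16),
              (14, 25, 17), (13, 24, 15), (14, 23, 16)]
    print(f"haplotype diversity     = {haplotype_diversity(sample):.3f}")
    print(f"discrimination capacity = {discrimination_capacity(sample):.3f}")
```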
Abstract:
The advent of large and fast digital computers, and the development of numerical techniques suited to them, have made it possible to revisit the analysis of important fundamental and practical problems and phenomena of engineering which have long remained intractable. The understanding of load transfer between pin and plate is one such problem. In spite of continuous attack on these problems for over half a century, classical solutions have remained limited in their approach and in their value for understanding the phenomena and generating design data. On the other hand, the finite element methods that have grown alongside the recent development of computers have been helpful in analysing specific problems and answering specific questions, but are yet to be harnessed to provide, with economy, a clearer understanding of the phenomena of partial separation and contact, friction and slip, and fretting and fatigue in pin joints. Against this background, it is useful to explore the application of classical simple differential equation methods, with the aid of computer power, to open up this very important area. In this paper we describe some of the recent and current work at the Indian Institute of Science in this direction.
Abstract:
With the advent of VLSI it has become possible to map parallel algorithms for compute-bound problems directly onto silicon. Systolic architectures are very good candidates for VLSI implementation because of their regular and simple design and regular communication pattern. In this paper, a systolic algorithm and a corresponding systolic architecture, a linear systolic array, are proposed for the scanline-based hidden surface removal problem in three-dimensional computer graphics. The algorithm is based on the concept of sample spans or intervals. The worst-case time taken by the algorithm is O(n), n being the number of segments in a scanline. The time taken by the algorithm for a given scene depends on the scene itself, and on average a considerable improvement over the worst-case behaviour is expected. A pipeline scheme for handling the I/O process, suitable for VLSI implementation of the algorithm, has also been proposed.
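To illustrate the span concept only (this is a naive sequential sketch, not the proposed linear systolic array, and it does not achieve the O(n) bound), the code below resolves visibility on one scanline by splitting it into elementary sample spans at segment endpoints and keeping the nearest segment in each span; segment depths are assumed constant across the scanline, which the real algorithm does not require.

```python
# Sequential sketch of span-based scanline visibility (not the systolic
# version from the paper). Each segment on the scanline is modelled with a
# constant depth; smaller depth is assumed to mean closer to the viewer.

def visible_spans(segments):
    """segments: list of (x_left, x_right, depth, seg_id).
    Returns a list of (x_left, x_right, seg_id) for the visible pieces."""
    xs = sorted({x for seg in segments for x in (seg[0], seg[1])})
    result = []
    for x0, x1 in zip(xs, xs[1:]):
        covering = [s for s in segments if s[0] <= x0 and s[1] >= x1]
        if not covering:
            continue
        nearest = min(covering, key=lambda s: s[2])   # smallest depth wins
        # merge with the previous span if the same segment stays visible
        if result and result[-1][2] == nearest[3] and result[-1][1] == x0:
            result[-1] = (result[-1][0], x1, nearest[3])
        else:
            result.append((x0, x1, nearest[3]))
    return result

if __name__ == "__main__":
    scanline = [(0, 10, 5.0, "A"), (4, 8, 2.0, "B"), (7, 12, 3.0, "C")]
    print(visible_spans(scanline))   # [(0, 4, 'A'), (4, 8, 'B'), (8, 12, 'C')]
```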
Abstract:
The advent of high-intensity lasers, coupled with recent advances in crystal technology, has led to rapid progress in the field of nonlinear optics. This article traces the history of materials development that has taken place over the past forty-odd years and dwells on the current status of this important area. The materials aspect is discussed under three classes, viz. inorganic, organic and semiorganic crystals. In the end, some of the crystal growth work that has been carried out in the author's laboratory is presented.
Abstract:
The study of reaction mechanisms involves systematic investigation of the correlation between structure, reactivity, and time. The challenge is to be able to observe the chemical changes undergone by reactants as they change into products via one or several intermediates such as electronic excited states (singlet and triplet), radicals, radical ions, carbocations, carbanions, carbenes, nitrenes, nitrenium ions, etc. The vast array of intermediates and timescales means there is no single "do-it-all" technique. Simultaneous advances in contemporary time-resolved Raman spectroscopic techniques and computational methods have done much towards visualizing molecular fingerprint snapshots of reactive intermediates in the microsecond to femtosecond time domain. Raman spectroscopy and its sensitive counterpart, resonance Raman spectroscopy, are well proven as means for determining the molecular structure, chemical bonding, reactivity, and dynamics of short-lived intermediates in the solution phase, and are advantageous in comparison with the commonly used time-resolved absorption and emission spectroscopies. Today time-resolved Raman spectroscopy is a mature technique; its development owes much to the advent of pulsed tunable lasers, highly efficient spectrometers, and high-speed, highly sensitive multichannel detectors able to collect a complete spectrum. This review article provides a brief chronological development of the experimental setup and demonstrates how experimentalists have conquered numerous challenges to obtain background-free (fluorescence-rejected), intense, and highly spectrally resolved Raman spectra in the nanosecond to microsecond (ns–µs) and picosecond (ps) time domains and, perhaps surprisingly, laid the foundations for new techniques such as spatially offset Raman spectroscopy.
Abstract:
With the advent of the Internet, video over IP is gaining popularity. In such an environment, scalability and fault tolerance are the key issues. Existing video-on-demand (VoD) service systems are usually neither scalable nor tolerant of server faults, and hence are not suited to multi-user, failure-prone networks such as the Internet. Current research on VoD often focuses on increasing the throughput and reliability of a single server, but rarely addresses the smooth provision of service during server as well as network failures. Reliable Server Pooling (RSerPool), which provides high availability by making multiple redundant servers appear as a single source, can be a solution to these failures. During a server failure, continuity of service is retained by another server. In order to achieve transparent failover, efficient state sharing is an important requirement. In this paper, we present an elegant, simple, efficient and scalable approach in which the transfer of state is facilitated by the client itself, using an extended cookie mechanism, ensuring that there is no noticeable disruption or degradation of video quality.
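The following toy sketch illustrates the idea of the client itself carrying its playback state in an extended cookie so that any server in the pool can resume the stream after a failover; the field names, serialization, and functions are hypothetical and are not the RSerPool API or the paper's implementation (a real cookie would also carry an integrity check).

```python
import json
import time

# Toy illustration (hypothetical fields, not the RSerPool API) of a client
# carrying its own session state, so any pool server can resume the stream
# after a failover without server-to-server state sharing.

def make_state_cookie(session_id, video_id, byte_offset, bitrate_kbps):
    """Server side: serialize playback state and hand it to the client."""
    state = {
        "session_id": session_id,
        "video_id": video_id,
        "byte_offset": byte_offset,     # where to resume the stream
        "bitrate_kbps": bitrate_kbps,   # quality level in use
        "issued_at": time.time(),
    }
    return json.dumps(state)

def resume_from_cookie(cookie):
    """Backup server: rebuild the session from the client-supplied cookie."""
    state = json.loads(cookie)
    return (state["video_id"], state["byte_offset"], state["bitrate_kbps"])

if __name__ == "__main__":
    cookie = make_state_cookie("sess-42", "movie.mpg", 1_048_576, 768)
    # ...primary server fails; the client re-attaches to another pool
    # element and presents the cookie it has kept up to date...
    print(resume_from_cookie(cookie))
```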
Abstract:
Today 80% of the content on the Web is in English, which is spoken by only 8% of the world population and 5% of the Indian population. There is a wealth of useful content in the various languages of the world other than English that could be made available on the Internet but, to date, for various reasons most of it is not. India itself has 18 officially recognized languages and scores of dialects. Although the medium of instruction for most higher education and research in India is English, a substantial amount of literature in the form of novels, textbooks and scholarly information is being generated in the other languages of the country. Many of the e-governance initiatives are in the respective state languages. In the past, support for different languages by operating systems and software packages was not very encouraging. However, with the advent of Unicode technology, operating systems and software packages now support almost all the major languages of the world that have scripts. In the work reported in this paper, we explain the configuration changes that are needed for the Eprints.org software to store multilingual content and to create a multilingual user interface.
Abstract:
Chronic recording of neural signals is indispensable for designing efficient brain–machine interfaces and for elucidating human neurophysiology. The advent of multichannel micro-electrode arrays has driven the need for electronics to record neural signals from many neurons. The dynamic range of the system can vary over time due to changes in electrode–neuron distance and background noise. We propose a neural amplifier in UMC 130 nm, 1P8M complementary metal–oxide–semiconductor (CMOS) technology. It can be biased adaptively from 200 nA to 2 µA, modulating the input-referred noise from 9.92 µV to 3.9 µV. We also describe a low-noise design technique which minimizes the noise contribution of the load circuitry. Optimum sizing of the input transistors minimizes the accentuation of the input-referred noise of the amplifier and obviates the need for a large input capacitance. The amplifier achieves a noise efficiency factor of 2.58. The amplifier passes signals from 5 Hz to 7 kHz, and its bandwidth can be tuned to reject local field potentials (LFP) and power line interference. The amplifier achieves a mid-band voltage gain of 37 dB. In vitro experiments are performed to validate the applicability of the low-noise neural amplifier in neural recording systems.
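As a back-of-the-envelope check of the noise efficiency factor quoted above, the sketch below evaluates NEF = V_ni,rms·sqrt(2·I_tot/(π·U_T·4kT·BW)) at one assumed operating point taken from the abstract (3.9 µV input-referred noise, 2 µA total current, 7 kHz bandwidth, T = 300 K); the choice of operating point and the assumption that the total supply current equals the bias current are mine, not the paper's.

```python
import math

# Back-of-the-envelope check (my assumptions, not the paper's calculation)
# of the noise efficiency factor NEF = Vni_rms * sqrt(2*Itot/(pi*Ut*4kT*BW)),
# using the operating point quoted in the abstract: 3.9 uV input-referred
# noise at 2 uA bias over a ~7 kHz bandwidth, at T = 300 K.

k_B = 1.380649e-23                  # Boltzmann constant, J/K
T = 300.0                           # temperature, K
U_T = k_B * T / 1.602176634e-19     # thermal voltage kT/q, ~25.9 mV

def nef(vni_rms, i_total, bandwidth):
    """Noise efficiency factor of an amplifier."""
    return vni_rms * math.sqrt(
        2.0 * i_total / (math.pi * U_T * 4 * k_B * T * bandwidth))

if __name__ == "__main__":
    print(f"NEF ~ {nef(vni_rms=3.9e-6, i_total=2e-6, bandwidth=7e3):.2f}")
    # Prints roughly 2.5, consistent with the 2.58 reported in the abstract.
```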
Abstract:
The technological world has attained a new dimension with the advent of miniaturization, and a major breakthrough has evolved in the form of MOEMS, technically more advanced than MEMS. This breakthrough has paved the way for scientists to research and realize their innovations. This paper presents a mathematical analysis of wave propagation along a non-uniform waveguide, with refractive index varying along the z axis, implemented on the cantilever beam of an MZI-based MOEMS accelerometer. Second, a study of waveguide bends with minimum power loss, focusing on the two main aspects of bend angle and curvature angle, is also presented.
Abstract:
Bangalore is one of the fastest growing cities in India and is branded the ‘Silicon Valley of India’ for heralding and spearheading the growth of Information Technology (IT) based industries in the country. With the advent and growth of the IT industry, as well as numerous industries in other sectors, and the onset of economic liberalisation since the early 1990s, Bangalore has taken the lead in service-based industries, fuelling substantial growth of the city both economically and spatially. Bangalore has become a cosmopolitan city, attracting people and business alike, within and across nations. This profile notes the urban setting and provides an overview of the urban fabric, while discussing various prospects related to infrastructure and governance (Sudhira et al. 2007).
Abstract:
The advent and evolution of geohazard warning systems make for a very interesting study. The two broad fields that are immediately visible are geohazard evaluation and subsequent warning dissemination. Evidently, the latter field lacks any systematic study or standards. Arbitrarily organized and vague data and information on warning techniques create confusion and indecision. The purpose of this review is to systematize the available bulk of information on warning systems so that meaningful insights can be derived through decidable flowcharts, and a developmental process can be undertaken. Hence, the methods and technologies of numerous geohazard warning systems have been assessed by putting them into suitable categories for a better understanding of possible ways to analyze their efficacy as well as their shortcomings. By establishing a classification scheme based on extent, control, time period, and advancements in technology, the geohazard warning systems reported in the literature could be comprehensively analyzed and evaluated. Although major advancements have taken place in geohazard warning systems in recent times, they have generally not served the complete purpose: some systems just assess the hazard and wait for other means to communicate it, and some are designed only for communication and wait for the hazard information to be provided, which usually arrives after the mishap. Most systems are left at the mercy of administrators and service providers and do not operate in real time. An integrated hazard evaluation and warning dissemination system could solve this problem. Warning systems have also suffered from complexity, the requirement of expert-level monitoring, extensive and dedicated infrastructural setups, and so on. The user community, which would greatly appreciate a convenient, fast, and generalized warning methodology, is surveyed in this review. The review concludes with the future scope of research in the field of hazard warning systems and some suggestions for developing an efficient mechanism toward an automated, integrated geohazard warning system. DOI: 10.1061/(ASCE)NH.1527-6996.0000078. (C) 2012 American Society of Civil Engineers.
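Below is a minimal data-structure sketch of the kind of classification record the review describes, along the four named axes (extent, control, time period, technology); the enum members are illustrative placeholders of my own, not the review's actual categories.

```python
from dataclasses import dataclass
from enum import Enum

# Sketch of a classification record for geohazard warning systems along the
# four axes named in the review. Member values are illustrative placeholders.

class Extent(Enum):
    LOCAL = "local"
    REGIONAL = "regional"
    GLOBAL = "global"

class Control(Enum):
    MANUAL = "expert/administrator in the loop"
    AUTOMATED = "fully automated"

class TimePeriod(Enum):
    PRE_EVENT = "forecast / early warning"
    REAL_TIME = "real-time detection"
    POST_EVENT = "post-event notification"

class Technology(Enum):
    IN_SITU_SENSORS = "in-situ sensor networks"
    REMOTE_SENSING = "remote sensing"
    CROWD_SOURCED = "crowd-sourced reports"

@dataclass
class WarningSystem:
    name: str
    hazard: str
    extent: Extent
    control: Control
    time_period: TimePeriod
    technology: Technology

if __name__ == "__main__":
    example = WarningSystem(
        name="hypothetical landslide monitor",
        hazard="landslide",
        extent=Extent.LOCAL,
        control=Control.AUTOMATED,
        time_period=TimePeriod.REAL_TIME,
        technology=Technology.IN_SITU_SENSORS,
    )
    print(example)
```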