972 results for Pan-Americanism
Abstract:
This work focused on the immobilization of enzymes on polymers. A wide range of polymer matrices has been employed as supports for enzyme immobilization; here, polyaniline (PANI) and poly(o-toluidine) (POT) were used. PANI and POT provide excellent supports for enzyme immobilization by virtue of their facile synthesis, superior chemical and physical stability, and large retention capacity. The industrially important starch-hydrolyzing enzymes α-amylase and glucoamylase were selected for the study. The selected enzymes were immobilized via adsorption and covalent bonding. To optimize the catalytic efficiency and stability of the resulting biocatalysts, an attempt was made to understand the effects of immobilization on enzymatic properties. The effects of the pH of the immobilization medium and of the immobilization time on immobilization efficiency were observed. The starch-hydrolyzing activities of free α-amylase and glucoamylase were compared with those of the immobilized forms. Immobilization on solid supports changes the microenvironment of the enzyme, thereby influencing the pH and temperature dependence of the enzymatic activity; hence these parameters were also optimized. The reusability and storage stability of immobilized enzymes are important aspects from an application standpoint, especially in industry. Taking this into consideration, the reusability and long-term storage stability of the immobilized enzymes were investigated.
Abstract:
The present investigation was designed to determine the prevalence of Salmonella in seafood from the Cochin area and to identify the different Salmonella serovars present. Although the distribution of Salmonella serovars in different seafood samples from Cochin has been documented, the present attempt was made to identify the different serovars and determine their prevalence in various seafoods. The first part of this investigation involved the isolation and identification of Salmonella strains using conventional culture methods. The identified isolates were then serotyped, providing information about the prevalent serovars in seafood. The prevalent Salmonella strains were further characterized on the basis of their utilization of different sugars and amino acids, in order to identify the different biovars of a serovar. A major research gap was observed in the molecular characterization of Salmonella in seafood: although previous investigations have reported a large number of Salmonella serovars from food sources in India, very little work has been reported on the genetic characterization of Salmonella serovars associated with food. The second part of this thesis deals with the molecular fingerprint profiles of Salmonella serovars from seafood. Various molecular typing methods, such as plasmid profiling, characterization of virulence genes, PFGE, PCR-ribotyping, and ERIC-PCR, were used for genetic characterization. Conventional culture methods are the mainstay for identification of Salmonella in seafood, and most investigations from India and abroad have relied on them. Hence, the development of an indigenous, rapid molecular method is highly desirable for screening large numbers of seafood samples within a short time.
The final part of this study attempted to develop alternative, rapid molecular methods for the detection of Salmonella in seafood. A rapid eight-hour PCR assay was developed. The performance of three different methods, viz. culture, ELISA and PCR assays, was evaluated for the detection of Salmonella in seafood, and the results were statistically analyzed. Since Salmonella cells have been reported to occur in low numbers in food and environmental samples, a more sensitive method for the enumeration of Salmonella in food samples needs to be developed. A quantitative real-time PCR assay was therefore developed; this method would be useful for the quantitative detection of Salmonella in seafood.
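As a purely illustrative sketch (not the assay developed in this thesis), quantitative real-time PCR results are commonly converted to copy numbers through a standard curve fitted to a dilution series of known quantities; the copy numbers, Ct values and function names below are hypothetical:

```python
import numpy as np

def fit_standard_curve(copies, ct_values):
    """Fit Ct = slope * log10(copies) + intercept (linear standard curve)."""
    slope, intercept = np.polyfit(np.log10(copies), ct_values, 1)
    efficiency = 10 ** (-1.0 / slope) - 1  # amplification efficiency (1.0 = 100%)
    return slope, intercept, efficiency

def quantify(ct, slope, intercept):
    """Invert the standard curve to estimate copy number in an unknown sample."""
    return 10 ** ((ct - intercept) / slope)

# Hypothetical dilution series: 1e2..1e6 copies with an ideal slope of -3.32
copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6])
cts = np.array([34.0, 30.68, 27.36, 24.04, 20.72])
slope, intercept, eff = fit_standard_curve(copies, cts)
estimate = quantify(27.36, slope, intercept)  # ≈ 1e4 copies
```

A slope near -3.32 corresponds to a doubling of product each cycle, which is why the efficiency check is usually reported alongside the fit.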
Abstract:
Exchange-biased Ni/FeF2 films have been investigated using vector-coil vibrating-sample magnetometry as a function of the cooling field strength HFC. In films with epitaxial FeF2, a loop bifurcation develops with increasing HFC as the loop divides into two sub-loops shifted oppositely from zero field by the same amount. The positively biased sub-loop grows in size with HFC until only a single positively shifted loop is found. Throughout this process, the negative and positive (sub-)loop shifts maintain the same discrete value. This is in sharp contrast to films with twinned FeF2, where the exchange field changes gradually with increasing HFC. The transverse magnetization shows clear correlations with the longitudinal sub-loops. Interestingly, over 85% of the Ni reverses its magnetization by rotation, either in one step or through two successive rotations. These results are due to the single-crystal nature of the antiferromagnetic FeF2, which breaks down into two opposite regions of large domains.
Abstract:
Due to the great versatility of the properties of polymer thin films, special interest has been taken in recent years in their preparation and electrical properties. The present thesis is devoted entirely to the study of the formation, structure and electrical properties of plasma-polymerised polyacrylonitrile (PAN) thin films. Even though the studies are confined to a single polymer film, the results are in general applicable to similar polar polymer films.
Abstract:
Halobacteria, members of the domain Archaea that live under extremely halophilic conditions, are often considered a dependable source of novel enzymes, novel genes, bioactive compounds and other industrially important molecules. Protein antibiotics have potential applications as preserving agents in the food and leather industries and in the control of infectious bacteria. Halocins are proteinaceous antibiotics synthesized and released into the environment by extreme halophiles, a universal characteristic of halophilic bacteria. Herein, we report the production of a halocin (SH10) by the extremely halophilic archaeon Natrinema sp. BTSH10, isolated from a salt pan in Kanyakumari, Tamil Nadu, India, and the optimization of the medium for enhanced halocin production. The optimal conditions for maximal halocin production were found to be 42 °C, pH 8.0, and 104 h of incubation at 200 rpm with a 2% (v/v) inoculum in Zobell's medium containing 3 M NaCl, with galactose, beef extract, and calcium chloride as additional supplements. The results indicate scope for the fermentative production of halocin for probable applications using the halophilic archaeon Natrinema sp. BTSH10.
Abstract:
In this computerized, globalised Internet world, our computers collect various types of information about every human being and store them in files hidden deep on the hard drive. Files such as the cache, browser history and other temporary Internet files can be used to store sensitive information like logins and passwords, names, addresses, and even credit card numbers. A hacker can get at this information by illicit means and share it with someone else, or can install malicious software on your computer that will extract your sensitive and secret information. Identity theft poses a very serious problem to everyone today. If you have a driver's license, a bank account, a computer, a ration card number, a PAN card number, an ATM card or simply a social security number, you are more than at risk: you are a target. Whether you are new to the idea of identity theft or you have some unanswered questions, the quick refresher list compiled below should bring you up to speed. Identity theft is a term used to refer to fraud that involves pretending to be someone else in order to steal money or obtain other benefits. It is a serious crime, which has been increasing at a tremendous rate all over the world since the evolution of the Internet. There is widespread agreement that identity theft causes financial damage to consumers, lending institutions, retail establishments, and the economy as a whole. Surprisingly, there is little good public information available about the scope of the crime and the actual damage it inflicts. Accounts of identity theft in the recent mass media and in film or literature have centered on the exploits of 'hackers', variously lauded or reviled, who are depicted as cleverly subverting corporate firewalls or other data protection defenses to gain unauthorized access to credit card details, personnel records and other information. The reality is more complicated, with electronic identity fraud taking a range of forms.
The impact of those forms is not necessarily quantifiable as a financial loss; it can involve intangible damage to reputation, time spent dealing with disinformation, and exclusion from particular services because a stolen name has been used improperly. Overall, we can consider electronic networks an enabler of identity theft, with the thief, for example, gaining information online for action offline, or finding the basis for theft or other injury online. As Fisher pointed out, "These new forms of high-tech identity and securities fraud pose serious risks to investors and brokerage firms across the globe." I am a victim of identity theft. As a victim, I felt the need to create awareness among computer and Internet users, particularly youngsters, in India. Nearly 70 per cent of India's population lives in villages. The Government of India has already started providing computer and Internet facilities even to remote villages through various rural development and rural upliftment programmes. Highly educated people, established companies and world-famous financial institutions are becoming victims of identity theft. The question here is how vulnerable the illiterate and innocent rural people are if they are suddenly exposed to a new device through which someone can extract and exploit their personal data without their knowledge. In this research work an attempt has been made to bring out the real problems associated with identity theft in developed countries from an economist's point of view.
Abstract:
The study of variable stars is an important topic of modern astrophysics. With the advent of powerful telescopes and high-resolution CCDs, variable star data are accumulating on the order of petabytes. This huge amount of data demands automated methods as well as human experts. This thesis is devoted to the analysis of variable star astronomical time series data and hence belongs to the interdisciplinary field of astrostatistics. For an observer on Earth, stars whose apparent brightness changes over time are called variable stars. The variation in brightness may be regular (periodic), quasi-periodic (semi-periodic) or irregular (aperiodic), and is caused by various mechanisms. In some cases the variation is due to internal thermonuclear processes; such stars are generally known as intrinsic variables. In other cases it is due to external processes, such as eclipses or rotation; these are known as extrinsic variables. Intrinsic variables can be further grouped into pulsating variables, eruptive variables and flare stars. Extrinsic variables are grouped into eclipsing binary stars and chromospheric stars. Pulsating variables can again be classified into Cepheid, RR Lyrae, RV Tauri, Delta Scuti, Mira, etc. The eruptive or cataclysmic variables are novae, supernovae, etc., which occur rarely and are not periodic phenomena. Most of the other variations are periodic in nature. Variable stars can be observed in many ways, such as photometry, spectrophotometry and spectroscopy. A sequence of photometric observations of a variable star produces time series data containing time, magnitude and error. The plot of a variable star's apparent magnitude against time is known as a light curve. If the time series data are folded on a period, the plot of apparent magnitude against phase is known as a phased light curve. The unique shape of the phased light curve is a characteristic of each type of variable star.
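The folding step described above can be sketched as follows; the sampling times, period and magnitudes here are synthetic, illustrative values rather than survey data:

```python
import numpy as np

def phase_fold(times, period, epoch=0.0):
    """Fold observation times on a trial period; returns phases in [0, 1)."""
    return ((times - epoch) / period) % 1.0

# Hypothetical unevenly sampled light curve of a sinusoidal variable
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 100.0, 300))           # observation times (days)
true_period = 2.5
mag = 12.0 + 0.3 * np.sin(2 * np.pi * t / true_period)

phase = phase_fold(t, true_period)
# Sorting by phase yields the phased light curve (phase vs. magnitude)
order = np.argsort(phase)
phased_mag = mag[order]
```

Scanning `phase_fold` over a grid of trial periods and measuring the scatter of the folded curve is the essence of non-parametric period searches such as PDM.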
One way to identify the type of a variable star and classify it is for an expert to visually inspect the phased light curve. For the last several years, automated algorithms have been used to classify groups of variable stars with the help of computers. Research on variable stars can be divided into different stages: observation, data reduction, data analysis, modeling and classification. Modeling of variable stars helps to determine their short-term and long-term behaviour, to construct theoretical models (e.g. the Wilson-Devinney model for eclipsing binaries) and to derive stellar properties such as mass, radius, luminosity, temperature, internal and external structure, chemical composition and evolution. Classification requires the determination of basic parameters such as period, amplitude and phase, along with other derived parameters. Of these, the period is the most important, since wrong periods lead to sparse light curves and misleading information. Time series analysis is a method of applying mathematical and statistical tests to data in order to quantify the variation, understand the nature of time-varying phenomena, gain physical understanding of the system, and predict its future behavior. Astronomical time series usually suffer from unevenly spaced time instants, varying error conditions and the possibility of big gaps. This is due to daily varying daylight and weather conditions for ground-based observations, while observations from space may suffer from the impact of cosmic ray particles. Many large-scale astronomical surveys, such as MACHO, OGLE, EROS, ROTSE, PLANET, Hipparcos, MISAO, NSVS, ASAS, Pan-STARRS, Kepler, ESA, Gaia, LSST and CRTS, provide variable star time series data, even though their primary intention is not variable star observation.
The Center for Astrostatistics at Pennsylvania State University was established to help the astronomical community with statistical tools for harvesting and analysing archival data. Most of these surveys release their data to the public for further analysis. Many period search algorithms exist for astronomical time series analysis; they can be classified into parametric methods (which assume some underlying distribution for the data) and non-parametric methods (which do not assume any statistical model such as a Gaussian). Many of the parametric methods are based on variations of the discrete Fourier transform, such as the Generalised Lomb-Scargle periodogram (GLSP) of Zechmeister (2009) and Significant Spectrum (SigSpec) of Reegen (2007). Non-parametric methods include Phase Dispersion Minimisation (PDM) by Stellingwerf (1978) and the cubic spline method by Akerlof (1994). Even though most of these methods can be automated, none of them can fully recover the true periods. Wrong period detections can arise for several reasons, such as power leakage to other frequencies, which is due to the finite total interval, finite sampling interval and finite amount of data. Another problem is aliasing, caused by regular sampling. Spurious periods also appear due to long gaps, and power flow to harmonic frequencies is an inherent problem of Fourier methods. Hence obtaining the exact period of a variable star from its time series data is still a difficult problem for huge databases subjected to automation. As Matthew Templeton (AAVSO) states, "Variable star data analysis is not always straightforward; large-scale, automated analysis design is non-trivial". Derekas et al. (2007) and Deb et al. (2010) state, "The processing of huge amounts of data in these databases is quite challenging, even when looking at seemingly small issues such as period determination and classification".
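As a rough illustration of the non-parametric approach, a minimal Phase Dispersion Minimisation statistic can be written as below. This is a simplified sketch of Stellingwerf's method (fixed, non-overlapping phase bins), and the light curve is synthetic:

```python
import numpy as np

def pdm_theta(times, mags, period, n_bins=10):
    """PDM statistic: pooled within-bin variance of the folded light curve
    divided by the overall variance. A correct trial period concentrates
    similar magnitudes in each phase bin, so theta is minimised there."""
    phase = (times / period) % 1.0
    overall_var = np.var(mags, ddof=1)
    num, denom = 0.0, 0
    for b in range(n_bins):
        in_bin = mags[(phase >= b / n_bins) & (phase < (b + 1) / n_bins)]
        if len(in_bin) > 1:
            num += np.var(in_bin, ddof=1) * (len(in_bin) - 1)
            denom += len(in_bin) - 1
    return (num / denom) / overall_var

# Hypothetical data: recover the period by scanning a grid of trial periods
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0.0, 50.0, 400))
mags = 10.0 + 0.4 * np.sin(2 * np.pi * t / 1.7) + rng.normal(0, 0.02, 400)
trials = np.linspace(1.2, 2.2, 501)
thetas = [pdm_theta(t, mags, p) for p in trials]
best = trials[int(np.argmin(thetas))]  # ≈ 1.7 d
```

Production implementations refine this with overlapping bins and significance tests, but the scan-and-minimise structure is the same.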
It will be beneficial for the variable star community if basic parameters such as period, amplitude and phase can be obtained more accurately when huge time series databases are subjected to automation. In the present thesis work, the theories of four popular period search methods are studied, the strengths and weaknesses of these methods are evaluated by applying them to two survey databases, and finally a modified form of the cubic spline method is introduced to confirm the exact period of a variable star. For the classification of newly discovered variable stars and their entry into the "General Catalogue of Variable Stars" or other databases such as the "Variable Star Index", the characteristics of the variability have to be quantified in terms of variable star parameters.
Abstract:
The realisation of 'ecological networks' is linked to hopes of halting the loss of biological diversity. Plans for establishing connectivity systems are emerging both at the pan-European level (Pan-European Ecological Network - PEEN) and within individual states. In federal Germany, small-scale habitat-network (Biotopverbund) plans are drawn up at the level of the federal states; for a national habitat network, first concepts exist. The present work is directed at these supra-local, strategically preparatory planning levels. The aims of such networks are the preservation of populations, particularly of endangered species, and the enabling of dispersal and migration. Owing to the lack of basic data on species and populations, it is not readily possible to transfer the concepts and models of population ecology to the supra-local planning levels. In line with the aims stated above, however, the planning of connectivity systems should be oriented towards the requirements of the species that depend on connectivity. The aim of this work was the development of a practicable GIS-based planning aid for the greatest possible integration of ecological knowledge under conditions of limited information availability. As a basis, the global, European-international and national framework conditions and requirements for establishing connectivity systems are first compiled in overview form. The strategies for the PEEN deserve emphasis here, as they call for the integration of ecological content, particularly through the consideration of spatial-functional relationships. A comprehensive analysis of the state-wide habitat-network plans of the Federal Republic of Germany revealed in part considerable differences between the state plans, which currently make it impossible to assemble a coherent national concept.
Not all states have state-wide habitat-network plans, and state concepts in which the planned network is based on the requirements of species exist only in rudimentary form. Furthermore, a targeted suitability assessment of existing GIS-based models and concepts for connectivity was carried out, taking into account the data regularly available in Germany. Since no integrative rule-based approaches existed, the vector-based algorithm HABITAT-NET was developed. It works with 'requirement types' for habitat connectivity, which stand in for different ecological groups of (target) species with terrestrial dispersal. The capacity for dispersal is combined with a coarse typology of habitat affinity. The most important input data are the respective (potential) habitats of the species of a requirement type and the surrounding land use. In forming 'habitat networks' (Part I), graded 'functional and connecting spaces' are generated and linked into a spatial system. The current fragmentation of the networks by transport routes can then be shown, in order to identify priority sections for reconnection (Part II). Alongside this, the concept of unfragmented functional spaces (UFR) is developed, with which habitat fragmentation can be indicated at the landscape level. Finally, the suitability of the results as a small-scale target framework, validation tests, comparisons with connectivity plans, and various settings in the GIS algorithm are discussed. Possible applications are explained, for example, in habitat-network and landscape planning, spatial planning, strategic environmental assessment, transport-route planning, support for the habitat-corridor concept, coherence in the NATURA 2000 protected-area system, and the development of environmental information systems.
Finally, a review and outlook is combined with a formulation of further research needs.
Abstract:
The Honda workers' strike in 2010 attracted worldwide attention. It was one of thousands of labor disputes that happen every year in China, but it was the first major one calling for the right of workers to represent themselves in collective bargaining. The question of representation is therefore the main topic of the book. The various contributors to this volume share the view that the Chinese party-state takes the protest against social inequality seriously. It has enacted many laws aimed at channeling dissatisfaction into safe channels. The implementation of these laws, however, lags behind, and they do not include the right of freedom of association. Without this right, super-exploitation will persist and the system of labor relations will remain prone to eruptive forms of protest. The first part of the book provides an overview of the economic context of Chinese labor relations, the transformation of class relations, the evolution of labor law, and government policies intended to set a wage floor. Based on extensive field research, the second part looks at the evolution of labor relations at the industry level. In the third part, the focus shifts to the Corporate Social Responsibility agenda in China. The final part looks at the connection between land reform and social inequality.
Abstract:
It is well known that the parasitic weed Striga asiatica (L.) Kuntze can be suppressed by Striga-tolerant sorghum (Sorghum bicolor L. Moench) cultivars, by Desmodium intortum (Mill.) Urb. (greenleaf desmodium), and by fertilization with nitrogen. The objective of this study was to assess the Striga control provided by integrating Desmodium density, the timing of sorghum-Desmodium intercrop establishment, and nitrogen fertilization. Growth responses and yield of three sorghum cultivars were measured in three pot experiments. A soil naturally infested with Striga was used; the part of the soil that served as the uninfested control was chemically sterilised. Striga numbers and growth were affected significantly by sorghum cultivar, sorghum-Desmodium intercrop ratio and timing of the sorghum-Desmodium association, as well as by their interactions. Desmodium caused 100% suppression of Striga emergence when established at a 1:3 sorghum-Desmodium ratio at sorghum seeding. Total control of Striga was also achieved with the 1:1 sorghum-Desmodium ratio when Desmodium was transplanted 30 days before sorghum seeding. However, these two treatments also caused significant reductions in sorghum yield. In contrast, 100% Striga control and a dramatic increase in sorghum yield were achieved with 100 kg N ha^{-1} in the 1:1 sorghum-Desmodium intercrop. Compatibility of sorghum and Desmodium was evident in the 1:1 intercrop established at sorghum seeding. Overall, the Ethiopian cultivars Meko and Abshir showed better agronomic performance and higher tolerance to Striga than the South African cultivar PAN 8564. It is recommended that the N × Desmodium × sorghum interaction be investigated under field conditions.
Abstract:
To serve Spanish families residing outside Spain, the Ministry of Education, in collaboration with the Ministry of Foreign Affairs, runs a programme called 'education abroad' in which Spanish pre-school, primary and secondary teachers participate, remaining at their destination centre for three to six years. The topics addressed are: the organisational structure of the Ministry, the organisation of teaching and programmes, the integrated centres, the Instituto Cervantes, the education offices abroad, technical advisers, legislation, etc.
Abstract:
In this work we have made significant contributions in three different areas: therapeutic protein stabilization, the thermodynamics of natural gas clathrate-hydrates, and zeolite catalysis. In all three fields, using our various computational techniques, we have been able to elucidate phenomena that are difficult or impossible to explain experimentally. More specifically, for proteins in mixed solvent systems we developed a statistical-mechanical method to model the thermodynamic effects of additives in molecular-level detail. It was the first method demonstrated to have truly predictive (no adjustable parameters) capability for real protein systems. We also describe a novel mechanism that slows protein association reactions, called the "gap effect." We developed a comprehensive picture of methionine oxidation by hydrogen peroxide that allows accurate prediction of protein oxidation and provides a rationale for developing strategies to control oxidation. The method of solvent accessible area (SAA) was shown not to correlate well with oxidation rates. A new property, the averaged two-shell water coordination number (2SWCN), was identified and shown to correlate well with oxidation rates. Reference parameters for the van der Waals-Platteeuw model of clathrate-hydrates were found for structure I and structure II. These reference parameters are independent of the potential form (unlike the commonly used parameters) and have been validated by calculating phase behavior and structural transitions for mixed hydrate systems. The calculations were validated against experimental data for both structures and for systems that undergo transitions from one structure to another. This is the first method of calculating hydrate thermodynamics to demonstrate predictive capability for phase equilibria, structural changes, and occupancy in pure and mixed hydrate systems.
We have computed a new mechanism for the methanol coupling reaction to form ethanol and water in the zeolite chabazite. The mechanism at 400°C proceeds via stable intermediates of water, methane, and protonated formaldehyde.
Abstract:
This material is published as part of HABITAR, a didactic programme for the study of the local environment offered by the Gijón City Council to the educational community. The document itself is preceded by a presentation of the educational principles and other procedural principles that guide the programme. The unit 'Los alimentos' (Food) has two distinct parts: on the one hand the food production process, and on the other human feeding and nutrition. The working technique is 'following the trail', with the aim of integrating the different aspects of food production and relating this process to the characteristics of the pupils and to its importance in human nutrition. A series of activities is proposed, including games (I spy, the empty room, treasure hunt, the market in class, typical dishes, etc.) and others whose materials are grouped as follows: 1. The foods that foods give us 2. Food is produced on the farmstead 3. The chestnut and the apple tree 4. From wheat to bread 6. Balanced shopping. Finally, guides are provided for visits to a farmstead, the market and a bakery.
Abstract:
Nowadays, the oceanographic and geospatial communities are closely related, but they follow parallel paths in data storage, distribution, modelling and data analysis. This situation produces different data model implementations for the same features. While geospatial information systems handle 2 or 3 dimensions, oceanographic models use multidimensional parameters such as temperature, salinity, currents, ocean colour... This implies significant differences between the data models of the two communities and leads to difficulties in dataset analysis for both sciences. These troubles directly affect the Mediterranean Institute for Advanced Studies (IMEDEA (CSIC-UIB)). Researchers at this institute perform intensive processing of data from oceanographic instruments such as CTDs, moorings, gliders… together with geospatial data collected for the integrated management of coastal zones. In this paper, we present a solution approach based on THREDDS (Thematic Real-time Environmental Distributed Data Services). THREDDS allows data access through the standard geospatial data protocol Web Coverage Service, within the European project ECOOP (European Coastal Sea Operational Observing and Forecasting system). The goal of ECOOP is to consolidate, integrate and further develop existing European coastal and regional seas operational observing and forecasting systems into an integrated pan-European system targeted at detecting environmental and climate change.
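As an illustration of the kind of request involved (the actual IMEDEA/ECOOP endpoint is not given here), a WCS 1.0.0 GetCoverage call is simply an HTTP GET with a fixed set of key-value parameters; the server URL and coverage name below are hypothetical:

```python
from urllib.parse import urlencode

def wcs_getcoverage_url(base_url, coverage, bbox, time, fmt="NetCDF3"):
    """Build an OGC WCS 1.0.0 GetCoverage request URL.
    bbox = (min_lon, min_lat, max_lon, max_lat) in decimal degrees."""
    params = {
        "service": "WCS",
        "version": "1.0.0",
        "request": "GetCoverage",
        "coverage": coverage,
        "bbox": ",".join(str(v) for v in bbox),
        "time": time,
        "format": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical THREDDS WCS endpoint and sea-surface-temperature coverage
url = wcs_getcoverage_url(
    "https://example.org/thredds/wcs/med_forecast",
    coverage="sst",
    bbox=(0.0, 38.0, 5.0, 41.0),
    time="2009-06-01T00:00:00Z",
)
```

Fetching this URL from a THREDDS server would return the requested subset as a NetCDF file, which is what lets geospatial WCS clients consume multidimensional oceanographic model output.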
Abstract:
Abstract taken from the publication.