994 results for modern techniques


Relevance: 30.00%

Publisher:

Abstract:

Information and communication technologies (ICTs) are the tools that underpin the emerging "Knowledge Society". Exchange of information or knowledge between people and through networks of people has always taken place, but ICT has radically changed the magnitude of this exchange, and thus factors such as timeliness of information and information dissemination patterns have become more important than ever. Since information and knowledge are so vital for all-round human development, libraries and institutions that manage these resources are indeed invaluable. Library and Information Centres therefore have a key role in the acquisition, processing, preservation and dissemination of information and knowledge. In the modern context, a library provides services based on different types of documents such as manuscripts, printed and digital materials, while the acquisition, access, processing and servicing of these resources have become more complicated than ever before. ICT has been instrumental in extending libraries beyond the physical walls of a building and in helping users navigate and analyse tremendous amounts of knowledge with a variety of digital tools. Thus, modern libraries are increasingly being re-defined as places to get unrestricted access to information in many formats and from many sources. The research was conducted in the university libraries of Kerala State, India. It was identified that even though information resources are flooding the world over and several technologies have emerged to manage the situation and provide effective services to clientele, most of the university libraries in Kerala were unable to exploit these technologies to the maximum level. Though the libraries have automated many of their functions, a wide gap prevails between the possible services and the provided services. There are many good examples the world over of the application of ICTs in libraries for the maximisation of services, and many such libraries have adopted the principles of re-engineering and re-defining as a management strategy. Hence this study was targeted at examining how effectively modern ICTs have been adopted in these libraries for maximising the efficiency of operations and services, and whether the principles of re-engineering and re-defining can be applied towards this. Data was collected from library users (students as well as faculty), library professionals and university librarians using structured questionnaires. This was supplemented by observation of the working of the libraries, discussions and interviews with the different types of users and staff, a review of the literature, etc. Personal observations were also made of the organisational set-up, management practices, functions, facilities, resources, and utilisation of information resources and facilities by the users of the university libraries in Kerala. Statistical techniques such as percentage, mean, weighted mean, standard deviation, correlation and trend analysis were used to analyse the data. All the libraries could exploit only a very few possibilities of modern ICTs and hence could not achieve effective Universal Bibliographic Control or the desired efficiency and effectiveness in services. Because of this, the users as well as the professionals are dissatisfied.
Functional effectiveness in the acquisition, access and processing of information resources in various formats, development and maintenance of OPAC and WebOPAC, digital document delivery to remote users, Web-based clearing of library counter services and resources, development of full-text databases, digital libraries and institutional repositories, consortia-based operations for e-journals and databases, user education and information literacy, professional development with stress on ICTs, network administration and website maintenance, and marketing of information are major areas that need special attention to improve the situation. Finance, the level of ICT knowledge among library staff, professional dynamism and leadership, the vision and support of administrators and policy makers, and the prevailing educational set-up and social environment in the state are some of the major hurdles to reaping the maximum possibilities of ICTs in the university libraries in Kerala. The principles of Business Process Re-engineering were found suitable for re-structuring and re-defining the operations and service systems of the libraries. Most of the conventional departments or divisions prevailing in the university libraries were functioning as watertight compartments, and their existing management systems were too rigid to adopt the principles of change management; hence a thorough re-structuring of the divisions was indicated. Consortia-based activities and the pooling and sharing of information resources were advocated to meet the varied needs of users on the main campuses and off-campuses of the universities, in affiliated colleges and at remote stations. A uniform staff policy similar to that prevailing in CSIR, DRDO, ISRO, etc. was proposed by the study, not only for the university libraries in Kerala but for the entire country. Restructuring of LIS education, and integrated and planned development of school, college, research and public library systems, were also justified for reaping the maximum benefits of modern ICTs.
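
As a minimal illustration of the descriptive statistics mentioned above (the satisfaction scale and response counts below are hypothetical, not taken from the study), a weighted mean and weighted standard deviation of questionnaire responses could be computed as follows:

```python
import numpy as np

# Hypothetical 5-point Likert responses on user satisfaction (1 = very poor, 5 = very good)
scores = np.array([1, 2, 3, 4, 5])
counts = np.array([12, 25, 40, 18, 5])   # number of respondents per score

weighted_mean = np.average(scores, weights=counts)
weighted_var = np.average((scores - weighted_mean) ** 2, weights=counts)
weighted_std = np.sqrt(weighted_var)

print(f"Weighted mean satisfaction: {weighted_mean:.2f}")
print(f"Weighted standard deviation: {weighted_std:.2f}")
```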

Relevance: 30.00%

Publisher:

Abstract:

Wind energy has emerged as a major sustainable source of energy. The efficiency of wind power generation by wind mills has improved considerably during the last three decades, but there is still further scope for maximising the conversion of wind energy into mechanical energy. In this context, wind turbine rotor dynamics has great significance. The present work aims at a comprehensive study of Horizontal Axis Wind Turbine (HAWT) aerodynamics by numerically solving the fluid dynamic equations with the help of a finite-volume Navier-Stokes CFD solver. As a more general goal, the study aims at demonstrating the capabilities of modern numerical techniques for the complex fluid dynamic problems of HAWTs, the main purpose being to better understand and exploit the physics of power extraction by wind turbines. This research demonstrates the potential of an incompressible Navier-Stokes CFD method for the aerodynamic power performance analysis of a horizontal axis wind turbine. The National Renewable Energy Laboratory, USA (NREL, Technical Report NREL/CP-500-28589) had carried out experimental work aimed at the real-time performance prediction of a horizontal axis wind turbine. In addition to a comparison between the results reported by NREL and the CFD simulations, comparisons are made for the local flow angle at several stations ahead of the wind turbine blades. The comparison has shown that fairly good predictions can be made for pressure distribution and torque. Subsequently, the wind-field effects on the blade aerodynamics, as well as the blade/tower interaction, were investigated. The selected case corresponded to an upwind HAWT at a wind speed of 12.5 m/s, zero degrees of yaw and a rotational speed of 25 rpm. The results obtained suggest that the present method can cope well with the flows encountered around wind turbines. The aerodynamic performance of the turbine and the flow details near and off the turbine blades and tower can be analysed using these results. The aerodynamic performance of airfoils differs from one another; it mainly depends on the coefficient of performance, coefficient of lift, coefficient of drag, fluid velocity and angle of attack. This study shows that the velocity is not constant for all angles of attack of different airfoils. The performance parameters are calculated analytically and are compared with standardised performance tests. For different angles of attack, the stall velocity is determined for the better performance of a system with respect to velocity. The research also addresses the effect of the surface roughness factor on the blade surface at various sections. The numerical results were found to be in agreement with the experimental data. A relative advantage of the theoretical aerofoil design method is that it allows many different concepts to be explored economically; such efforts are generally impractical in wind tunnels because of time and money constraints. Thus, the need for a theoretical aerofoil design method is threefold: first, for the design of aerofoils that fall outside the range of applicability of existing catalogues; second, for the design of aerofoils that more exactly match the requirements of the intended application; and third, for the economic exploration of many aerofoil concepts. From the results obtained for the different aerofoils, the velocity is not constant for all angles of attack, and the results obtained for an aerofoil mainly depend on the angle of attack and velocity. The vortex generator technique was meticulously studied, with the formulation of the specification for right-angle-shaped vortex generators (VGs). The results were validated in accordance with the primary analysis phase and were found to be in good agreement with the power curve. The introduction of correctly sized VGs at appropriate locations over the blades of the selected HAWT was found to increase the power generation by about 4%.
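
For reference, the power extracted by a HAWT rotor is conventionally related to the coefficient of performance by the standard relation below (textbook aerodynamics, not a result specific to this study):

```latex
P = \tfrac{1}{2}\,\rho\,A\,v^{3}\,C_{p}, \qquad C_{p} \le \tfrac{16}{27} \approx 0.593 \ \text{(Betz limit)}
```

where ρ is the air density, A the rotor swept area and v the free-stream wind speed; at fixed ρ, A and v, the reported 4% gain in power therefore corresponds to a 4% increase in C_p.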

Relevance: 30.00%

Publisher:

Abstract:

In order to obtain a high-resolution Pleistocene stratigraphy, eleven continuously cored boreholes, 100 to 220 m deep, were drilled in the northern part of the Po Plain by Regione Lombardia in the last five years. Quantitative provenance analysis (QPA; Weltje and von Eynatten, 2004) of Pleistocene sands was carried out using multivariate statistical analysis (principal component analysis, PCA, and similarity analysis) on an integrated data set, including high-resolution bulk petrography and heavy-mineral analyses of the Pleistocene sands and of 250 major and minor modern rivers draining the southern flank of the Alps from west to east (Garzanti et al., 2004, 2006). Prior to the onset of major Alpine glaciations, metamorphic and quartzofeldspathic detritus from the Western and Central Alps was carried from the axial belt to the Po basin longitudinally, parallel to the South-Alpine belt, by a trunk river (Vezzoli and Garzanti, 2008). This scenario changed rapidly during marine isotope stage 22 (0.87 Ma), with the onset of the first major Pleistocene glaciation in the Alps (Muttoni et al., 2003). PCA and similarity analysis of the core samples show that the longitudinal trunk river at this time was shifted southward by the rapid southward and westward progradation of transverse alluvial river systems fed from the Central and Southern Alps. Sediments were transported southward by braided river systems, while glacial sediments carried by Alpine valley glaciers invaded the alluvial plain. Key words: Detrital modes; Modern sands; Provenance; Principal component analysis; Similarity; Canberra distance; Palaeodrainage.
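
A minimal sketch of the multivariate workflow named above (PCA followed by a Canberra-distance similarity measure), applied here to hypothetical detrital modes rather than to the actual petrographic data set:

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.spatial.distance import canberra

# Hypothetical detrital modes: rows = samples, columns = mineral/rock-fragment categories
X = np.array([
    [45.0, 30.0, 15.0, 10.0],   # core sample A
    [40.0, 35.0, 15.0, 10.0],   # core sample B
    [20.0, 10.0, 50.0, 20.0],   # modern river sand C
])

# Principal component analysis of the compositional data
pca = PCA(n_components=2)
scores = pca.fit_transform(X)
print("Explained variance ratio:", pca.explained_variance_ratio_)

# Canberra distance as a (dis)similarity measure between samples
print("Canberra distance A-B:", canberra(X[0], X[1]))
print("Canberra distance A-C:", canberra(X[0], X[2]))
```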

Relevance: 30.00%

Publisher:

Abstract:

Attitudes to floristics have changed considerably during the past few decades as a result of increasing and often more focused consumer demands, heightened awareness of the threats to biodiversity, information flow and overload, and the application of electronic and web-based techniques to information handling and processing. This paper will examine these concerns in relation to our floristic knowledge and needs in the region of SW Asia. Particular reference will be made to the experience gained from the Euro+Med PlantBase project for the preparation of an electronic plant-information system for Europe and the Mediterranean, with a single core list of accepted plant names and synonyms, based on consensus taxonomy agreed by a specialist network. The many challenges (scientific, technical and organisational) that it has presented will be discussed, as well as the problems of handling non-taxonomic information from fields such as conservation, karyology, biosystematics and mapping. The question of regional cooperation and the sharing of efforts and resources will also be raised, and attention drawn to the recent planning workshop held in Rabat (May 2002) for establishing a technical cooperation network for taxonomic capacity building in North Africa as a possible model for the SW Asia region.
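
Purely as an illustration of the idea of a core list of accepted names with synonyms (a toy structure; the names, region codes and fields below are invented and this is not the Euro+Med PlantBase schema), such a list might be represented as:

```python
from dataclasses import dataclass, field

@dataclass
class TaxonRecord:
    accepted_name: str                                       # consensus accepted name
    synonyms: list[str] = field(default_factory=list)
    distribution: list[str] = field(default_factory=list)    # hypothetical region codes

core_list = [
    TaxonRecord(
        accepted_name="Genus species L.",                    # hypothetical example name
        synonyms=["Genus alterum Auct.", "Genus tertium DC."],
        distribution=["Tu", "Ir", "Sy"],
    ),
]

def resolve(name):
    """Resolve a queried name (accepted or synonym) to its accepted name."""
    for rec in core_list:
        if name == rec.accepted_name or name in rec.synonyms:
            return rec.accepted_name
    return None

print(resolve("Genus alterum Auct."))  # -> "Genus species L."
```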

Relevance: 30.00%

Publisher:

Abstract:

We apply modern synchrotron-based structural techniques to the study of serine adsorbed on the pure and Au-modified intrinsically chiral Cu{531} surface. XPS and NEXAFS data in combination with DFT show that on the pure surface both enantiomers adsorb in μ4 geometries (with de-protonated β-OH groups) at low coverage and in μ3 geometries at saturation coverage. Significantly larger enantiomeric differences are seen for the μ4 geometries, which involve substrate bonds of three side groups of the chiral center, i.e. a three-point interaction. The μ3 adsorption geometry, where only the carboxylate and amino groups form substrate bonds, leads to smaller but still significant enantiomeric differences, both in geometry and in the decomposition behavior. When Cu{531} is modified by the deposition of 1 and 2 ML Au, the orientations of serine at saturation coverage are significantly different from those on the clean surface. In all cases, however, a μ3 bond coordination is found at saturation, involving different numbers of Au atoms, which leads to relatively small enantiomeric differences.

Relevance: 30.00%

Publisher:

Abstract:

Although modern control techniques such as eigenstructure assignment have been given extensive coverage in the control literature, there is a reluctance to use them in practice as they are often not believed to be as 'visible' or as simple as classical methods. A simple aircraft example is used, and it is shown that eigenstructure assignment can easily be used to produce a more viable controller than is possible with simple classical techniques.
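
As a generic illustration of the eigenvalue-assignment step of such a design (the two-state model and the chosen poles are hypothetical, not the aircraft example of the paper), SciPy's pole-placement routine can be used as follows:

```python
import numpy as np
from scipy.signal import place_poles

# Hypothetical linearised state-space model x' = Ax + Bu (not the paper's aircraft model)
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
B = np.array([[0.0],
              [1.0]])

# Desired closed-loop eigenvalues (poles)
desired_poles = np.array([-4.0, -5.0])

# Full-state feedback gain K such that eig(A - B K) = desired_poles
result = place_poles(A, B, desired_poles)
K = result.gain_matrix
print("Feedback gain K:", K)
print("Closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))
```

For this single-input sketch the eigenvectors are fixed once the eigenvalues are chosen, so the example shows only the pole-placement part of eigenstructure assignment; multi-input designs additionally shape the closed-loop eigenvectors.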

Relevance: 30.00%

Publisher:

Abstract:

Nowadays, power system operation has become more complex because of the critical operating conditions resulting from the requirements of market-driven operation. In this context, efficient methods for the optimisation of power system operation and planning become critical to satisfy operational (technical), financial and economic demands. Therefore, the detailed analysis of modern optimisation techniques, as well as their application to power system problems, represents a relevant issue from the scientific and technological points of view. This paper presents a brief overview of developments in modern mathematical optimisation methods applied to power system operation and planning. Copyright © 2007 Inderscience Enterprises Ltd.
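
As a toy example of the kind of optimisation problem referred to above (a linear economic-dispatch formulation with hypothetical cost and limit data, not a method advocated by the paper), one could write:

```python
import numpy as np
from scipy.optimize import linprog

# Toy economic dispatch: minimise total generation cost subject to demand and unit limits
cost = np.array([20.0, 35.0, 50.0])        # $/MWh of three generating units
demand = 450.0                             # MW system demand
p_min = np.array([50.0, 50.0, 20.0])       # MW minimum outputs
p_max = np.array([250.0, 200.0, 150.0])    # MW maximum outputs

# Equality constraint: total output equals demand
A_eq = np.ones((1, 3))
b_eq = np.array([demand])

res = linprog(cost, A_eq=A_eq, b_eq=b_eq,
              bounds=list(zip(p_min, p_max)), method="highs")
print("Optimal dispatch (MW):", res.x)
print("Total cost ($/h):", res.fun)
```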

Relevance: 30.00%

Publisher:

Abstract:

Objective. The general aim of this article is to describe the state of the art of biocompatibility testing for dental materials, and to present new strategies for improving operative dentistry techniques and the biocompatibility of dental materials as they relate to their interaction with the dentin-pulp complex. Methods. The literature was reviewed focusing on articles related to biocompatibility testing, the dentin-pulp complex, and new strategies and materials for operative dentistry. For this purpose, the PubMed database was searched and 118 articles published in English from 1939 to 2014 were reviewed. Data concerning types of biological tests and the standardization of in vitro and in vivo protocols employed to evaluate the cytotoxicity and biocompatibility of dental materials were also obtained from the US Food and Drug Administration (FDA), International Standards Organization (ISO) and American National Standards Institute (ANSI). Results. While there is an ongoing search for feasible strategies in the molecular approach to directing the repair or regeneration of the structures that form the oral tissues, it is necessary for professionals to master the clinical therapies available at present. In turn, these techniques must be applied based on knowledge of the morphological and physiological characteristics of the tissues involved, as well as the physical, mechanical and biologic properties of the biomaterials recommended for each specific situation. Thus, particularly within modern esthetic restorative dentistry, the use of minimally invasive operative techniques, associated with dental materials that have excellent properties proven by means of clinical and laboratory studies, must be routine for dentists. This professional and responsible attitude will certainly result in a greater possibility of achieving clinical success, benefiting patients and dentists themselves. Significance. This article provides a general and critical view of the relations that permeate the interaction between dental materials and the dentin-pulp complex, and establishes real possibilities and strategies that favor the biocompatibility of present and new products used in dentistry, which will certainly benefit clinicians and their patients. (C) 2014 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.

Relevance: 30.00%

Publisher:

Abstract:

It is well known that the deposition of gaseous pollutants and aerosols plays a major role in causing the deterioration of monuments and built cultural heritage in European cities. Despite the many studies dedicated to the environmental damage of cultural heritage, in the case of cement mortars, commonly used in 20th-century architecture, the deterioration due to the impact of air multi-pollutants, especially the formation of black crusts, is still not well explored, making this issue a challenging area of research. This work centers on cement mortar–environment interactions, focusing on the diagnosis of the damage to the modern built heritage due to air multi-pollutants. For this purpose three sites, exposed to different urban areas in Europe, were selected for sampling and subsequent laboratory analyses: Centennial Hall, Wroclaw (Poland); Chiesa dell'Autostrada del Sole, Florence (Italy); and Casa Galleria Vichi, Florence (Italy). The sampling sessions were performed taking into account the height from ground level and the protection from rain run-off (sheltered, partly sheltered and exposed areas). The complete characterization of the collected damage layers and underlying materials was performed using a range of analytical techniques: optical and scanning electron microscopy, X-ray diffractometry, differential and gravimetric thermal analysis, ion chromatography, flash combustion/gas chromatographic analysis, and inductively coupled plasma-optical emission spectrometry. The data were elaborated using statistical methods (principal component analysis), and an enrichment factor for cement mortars was calculated for the first time. The results obtained from the experimental activity performed on the damage layers indicate that gypsum, formed by the deposition of atmospheric sulphur compounds, is the main damage product at surfaces sheltered from rain run-off at Centennial Hall and Casa Galleria Vichi. By contrast, gypsum has not been identified in the samples collected at Chiesa dell'Autostrada del Sole; this is connected to the restoration works, particularly surface cleaning, regularly performed for the maintenance of the building. Moreover, the results demonstrated a correlation between the location of the building and the composition of the damage layer: Centennial Hall is mainly subject to the impact of pollutants emitted from the nearby coal power stations, whilst Casa Galleria Vichi is principally affected by pollutants from vehicular exhaust in front of the building.
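
The enrichment factor mentioned above is conventionally defined, for an element X with respect to a chosen reference element, as the ratio below (the particular reference element and background composition chosen in the study are not reproduced here):

```latex
EF_{X} = \frac{\left( C_{X} / C_{\mathrm{ref}} \right)_{\text{damage layer}}}{\left( C_{X} / C_{\mathrm{ref}} \right)_{\text{unaltered substrate}}}
```

where C denotes concentration; values of EF well above 1 point to an external (e.g. atmospheric) contribution to the damage layer.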

Relevance: 30.00%

Publisher:

Abstract:

Synchronization is a key issue in any communication system, but it becomes fundamental in navigation systems, which are entirely based on the estimation of the time delay of the signals coming from the satellites. Thus, even if synchronization has been a well-known topic for many years, the introduction of new modulations and new physical-layer techniques in the modern standards makes the traditional synchronization strategies completely ineffective. For this reason, the design of advanced and innovative techniques for synchronization in modern communication systems, like DVB-SH, DVB-T2, DVB-RCS, WiMAX and LTE, and in the modern navigation system Galileo, has been the topic of this activity. Recent years have seen the consolidation of two different trends: the introduction of Orthogonal Frequency Division Multiplexing (OFDM) in communication systems, and of the Binary Offset Carrier (BOC) modulation in the modern Global Navigation Satellite Systems (GNSS). Thus, particular attention has been given to the investigation of synchronization algorithms in these areas.
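
As a minimal, generic illustration of the time-delay estimation underlying such synchronization (the sequence, delay and noise level are hypothetical, and this is not a DVB or Galileo algorithm), the delay of a known training or spreading sequence can be estimated by cross-correlation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical known training (or spreading) sequence of +/-1 chips
sequence = rng.choice([-1.0, 1.0], size=256)

# Received signal: the sequence delayed by 40 samples, buried in noise
true_delay = 40
received = np.zeros(1024)
received[true_delay:true_delay + sequence.size] += sequence
received += 0.5 * rng.standard_normal(received.size)

# Cross-correlate and pick the peak to estimate the delay
correlation = np.correlate(received, sequence, mode="valid")
estimated_delay = int(np.argmax(np.abs(correlation)))
print("Estimated delay:", estimated_delay, "samples (true:", true_delay, ")")
```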

Relevance: 30.00%

Publisher:

Abstract:

This thesis collects the outcomes of a Ph.D. course in Telecommunications Engineering and is focused on enabling techniques for Spread Spectrum (SS) navigation and communication satellite systems. It provides innovations for both interference management and code synchronization techniques. These two aspects are critical for modern navigation and communication systems and constitute the common denominator of the work. The thesis is organized in two parts: the former deals with interference management. We have proposed a novel technique for enhancing the sensitivity of an advanced interference detection and localization system operating in the Global Navigation Satellite System (GNSS) bands, which allows the identification of interfering signals received with power even lower than that of the GNSS signals. Moreover, we have introduced an effective cancellation technique for signals transmitted by jammers, exploiting their repetitive characteristics, which strongly reduces the interference level at the receiver. The second part deals with code synchronization. In more detail, we have designed the code synchronization circuit for a Telemetry, Tracking and Control system operating during the Launch and Early Orbit Phase; the proposed solution copes with the very large frequency uncertainty and dynamics characterizing this scenario, and performs the estimation of the code epoch, the carrier frequency and the carrier frequency variation rate. Furthermore, considering a generic pair of circuits performing code acquisition, we have proposed a comprehensive framework for the design and analysis of the optimal cooperation procedure, which minimizes the time required to accomplish synchronization. The study is particularly interesting since it enables the reduction of the code acquisition time without increasing the computational complexity. Finally, considering a network of collaborating navigation receivers, we have proposed an innovative cooperative code acquisition scheme, which exploits the code epoch information shared between neighbour nodes according to the Peer-to-Peer paradigm.
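
A common building block of code-acquisition circuits like those discussed above is the circular correlation of the received samples against a local code replica, often computed via FFT; the sketch below uses hypothetical signals and is not the design proposed in the thesis:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical local PRN code replica (+/-1 chips) and received snapshot
code = rng.choice([-1.0, 1.0], size=1023)
code_phase = 300                      # true code offset, in samples
received = np.roll(code, code_phase) + 0.8 * rng.standard_normal(code.size)

# Parallel code-phase search via FFT-based circular correlation
correlation = np.fft.ifft(np.fft.fft(received) * np.conj(np.fft.fft(code)))
estimated_phase = int(np.argmax(np.abs(correlation)))
print("Estimated code phase:", estimated_phase, "(true:", code_phase, ")")
```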

Relevance: 30.00%

Publisher:

Abstract:

During the last few decades an unprecedented technological growth has been at the center of embedded systems design, with Moore's Law being the leading factor of this trend. Today an ever increasing number of cores can be integrated on the same die, marking the transition from state-of-the-art multi-core chips to the new many-core design paradigm. Despite the extraordinarily high computing power, the complexity of many-core chips opens the door to several challenges. As a result of the increased silicon density of modern Systems-on-a-Chip (SoC), the design space exploration needed to find the best design has exploded, and hardware designers are facing the problem of a huge design space. Virtual Platforms have always been used to enable hardware-software co-design, but today they are confronted with the huge complexity of both hardware and software systems. In this thesis two different research works on Virtual Platforms are presented: the first is intended for the hardware developer, to easily allow complex cycle-accurate simulations of many-core SoCs; the second exploits the parallel computing power of off-the-shelf General Purpose Graphics Processing Units (GPGPUs), with the goal of increased simulation speed. The term virtualization can be used in the context of many-core systems not only to refer to the aforementioned hardware emulation tools (Virtual Platforms), but also for two other main purposes: 1) to help the programmer achieve the maximum possible performance of an application, by hiding the complexity of the underlying hardware; 2) to efficiently exploit the highly parallel hardware of many-core chips in environments with multiple active Virtual Machines. This thesis is focused on virtualization techniques with the goal of mitigating, and overcoming where possible, some of the challenges introduced by the many-core design paradigm.

Relevance: 30.00%

Publisher:

Abstract:

Aerosol particles are strongly related to climate, air quality, visibility and human health issues. They contribute the largest uncertainty in the assessment of the Earth's radiative budget, directly by scattering or absorbing solar radiation or indirectly by nucleating cloud droplets. The influence of aerosol particles on cloud-related climatic effects essentially depends upon their number concentration, size and chemical composition. A major part of the submicron aerosol consists of secondary organic aerosol (SOA), which is formed in the atmosphere by the oxidation of volatile organic compounds. SOA can comprise a highly diverse spectrum of compounds that undergo continuous chemical transformations in the atmosphere. The aim of this work was to obtain insights into the complexity of ambient SOA by the application of advanced mass spectrometric techniques. Therefore, an atmospheric pressure chemical ionization ion trap mass spectrometer (APCI-IT-MS) was applied in the field, facilitating the measurement of ions of the intact molecular organic species. Furthermore, the high measurement frequency provided insights into SOA composition and chemical transformation processes at high temporal resolution. Within different comprehensive field campaigns, online measurements of particular biogenic organic acids were achieved by combining an online aerosol concentrator with the APCI-IT-MS. A holistic picture of the ambient organic aerosol was obtained through the co-located application of other complementary MS techniques, such as aerosol mass spectrometry (AMS) or filter sampling for analysis by liquid chromatography / ultrahigh-resolution mass spectrometry (LC/UHRMS). In particular, during a summertime field study at the pristine boreal forest station in Hyytiälä, Finland, the partitioning of organic acids between gas and particle phase was quantified, based on the online APCI-IT-MS and AMS measurements. It was found that low-volatile compounds reside to a large extent in the gas phase. This observation can be interpreted as a consequence of large aerosol equilibration timescales, which build up due to the continuous production of low-volatile compounds in the gas phase and/or a semi-solid phase state of the ambient aerosol. Furthermore, in-situ structural information on particular compounds was obtained by using the MS/MS mode of the ion trap. The comparison to MS/MS spectra from laboratory-generated SOA of specific monoterpene precursors indicated that laboratory SOA barely depicts the complexity of ambient SOA. Moreover, it was shown that the mass spectra of the laboratory SOA more closely resemble the ambient gas phase composition, indicating that the oxidation state of the ambient organic compounds in the particle phase is underestimated by the comparison to laboratory ozonolysis. These observations suggest that micro-scale processes, such as the chemistry of aerosol aging or the gas-to-particle partitioning, need to be better understood in order to predict SOA concentrations more reliably. During a field study at Mt. Kleiner Feldberg, Germany, a slightly different aerosol concentrator / APCI-IT-MS setup made the online analysis of new particle formation possible. During a particular nucleation event, the online mass spectra indicated that organic compounds of approximately 300 Da are main constituents of the bulk aerosol during ambient new particle formation.
Co-located filter analysis by LC/UHRMS supported these findings and furthermore allowed the molecular formulas of the involved organic compounds to be determined. The unambiguous identification of several oxidized C15 compounds indicated that oxidation products of sesquiterpenes can be important compounds for the initial formation and subsequent growth of atmospheric nanoparticles. The LC/UHRMS analysis furthermore revealed that considerable amounts of organosulfates and nitrooxy organosulfates were present on the filter samples. Indeed, it was found that several nitrooxy organosulfate-related APCI-IT-MS mass traces were simultaneously enhanced. Concurrent particle-phase ion chromatography and AMS measurements indicated a strong discrepancy between inorganic sulfate and total sulfate concentrations, supporting the assumption that substantial amounts of sulfate were bonded to organic molecules. Finally, the comprehensive chemical analysis of the aerosol composition was compared to the hygroscopicity parameter kappa, which was derived from cloud condensation nuclei (CCN) measurements. Simultaneously, organic aerosol aging was observed through the evolution of the ratio between a second-generation and a first-generation biogenic oxidation product. It was found that this aging proxy positively correlates with increasing hygroscopicity. Moreover, it was observed that the bonding of sulfate to organic molecules leads to a significant reduction of kappa, compared to an internal mixture of the same mass fractions of purely inorganic sulfate and organic molecules. In conclusion, it has been shown within this thesis that the application of modern mass spectrometric techniques allows for detailed insights into the chemical and physico-chemical processes of atmospheric aerosols.
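
As a small worked example related to the hygroscopicity discussion above (the component kappa values and volume fractions are illustrative, not those derived in the study), the kappa of an internal mixture is commonly estimated with the volume-fraction-weighted mixing rule used in the kappa-Köhler framework:

```python
# Volume-fraction-weighted mixing rule for the hygroscopicity parameter kappa;
# component values below are illustrative, not those derived in the study.
def kappa_mixture(volume_fractions, kappas):
    """kappa of an internal mixture as the volume-fraction-weighted mean."""
    assert abs(sum(volume_fractions) - 1.0) < 1e-9
    return sum(eps * k for eps, k in zip(volume_fractions, kappas))

# Hypothetical internal mixture: 40% ammonium-sulfate-like, 60% SOA-like material
print(kappa_mixture([0.4, 0.6], [0.6, 0.1]))   # -> 0.30
```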

Relevance: 30.00%

Publisher:

Abstract:

With the discovery that DNA can be successfully recovered from museum collections, a new source of genetic information has been provided to extend our comprehension of the evolutionary history of species. However, historical specimens are often mislabeled or carry incorrect information of origin; thus accurate identification of specimens is essential. Due to the highly damaged nature of ancient DNA, many pitfalls exist and particular precautions need to be taken in order to perform genetic analysis. In this study we analyzed 208 historical remains of pelagic fishes collected at the beginning of the 20th century. Through the adaptation of existing protocols, usually applied to human remains, we managed to successfully retrieve valuable genetic material from almost all of the examined samples using a guanidine and silica column-based approach. The combined use of two mitochondrial markers, cytochrome oxidase 1 (mtDNA COI) and the Control Region (mtDNA CR), and the nuclear marker first internal transcribed spacer (ITS1), allowed us to identify the majority of the examined specimens using traditional PCR and Sanger sequencing techniques. The creation of primers capable of amplifying heavily degraded DNA has great potential for future uses, both in ancient and in modern investigations. The methodologies developed in this study can in fact be applied to other ancient fish specimens as well as to cooked or canned samples.

Relevance: 30.00%

Publisher:

Abstract:

The diagnosis and management of patients with renovascular disease and hypertension continue to elude healthcare providers. The advent of novel imaging and interventional techniques, and an increased understanding of the pathways leading to irreversible renal injury and renovascular hypertension, have ushered in commendable attempts to optimize and fine-tune strategies to preserve or restore renal function and control blood pressure. Large randomized clinical trials that compare different forms of therapy, and smaller trials that test novel experimental treatments, will hopefully help formulate innovative concepts and tools to manage the patient population with atherosclerotic renovascular disease.