998 results for Continuously Operating Reference Stations (CORS)
Abstract:
Information and communication technologies (ICTs) are the tools that underpin the emerging “Knowledge Society”. Exchange of information or knowledge between people and through networks of people has always taken place, but ICT has radically changed the magnitude of this exchange, and thus factors such as the timeliness of information and information dissemination patterns have become more important than ever. Since information and knowledge are so vital for all-round human development, libraries and institutions that manage these resources are invaluable, and Library and Information Centres have a key role in the acquisition, processing, preservation and dissemination of information and knowledge. In the modern context, libraries provide services based on different types of documents (manuscripts, printed materials, digital resources, etc.). At the same time, the acquisition, access, processing and servicing of these resources have become more complicated than ever before. ICT has been instrumental in extending libraries beyond the physical walls of a building and in helping users navigate and analyse tremendous amounts of knowledge with a variety of digital tools. Thus, modern libraries are increasingly being re-defined as places to get unrestricted access to information in many formats and from many sources. The research was conducted in the university libraries of Kerala State, India. It was identified that even though information resources are flooding the world over and several technologies have emerged to manage the situation and provide effective services to their clientele, most of the university libraries in Kerala were unable to exploit these technologies to the maximum level. Though the libraries have automated many of their functions, a wide gap prevails between the possible services and the services provided. 
There are many good examples the world over of the application of ICTs in libraries for the maximisation of services, and many such libraries have adopted the principles of re-engineering and re-defining as a management strategy. Hence this study was targeted at examining how effectively modern ICTs have been adopted in our libraries for maximising the efficiency of operations and services, and whether the principles of re-engineering and re-defining can be applied towards this. Data were collected from library users (both students and faculty), library professionals and university librarians, using structured questionnaires. This has been supplemented by observation of the working of the libraries, discussions and interviews with the different types of users and staff, review of the literature, etc. Personal observations of the organisational set-up, management practices, functions, facilities, resources, and the utilisation of information resources and facilities by the users of the university libraries in Kerala have been made. Statistical techniques such as percentage, mean, weighted mean, standard deviation, correlation and trend analysis have been used to analyse the data. All the libraries could exploit only a very few of the possibilities of modern ICTs, and hence they could not achieve effective Universal Bibliographic Control or the desired efficiency and effectiveness in services. Because of this, users as well as professionals are dissatisfied. 
Functional effectiveness in the acquisition, access and processing of information resources in various formats; development and maintenance of the OPAC and WebOPAC; digital document delivery to remote users; Web-based clearing of library counter services and resources; development of full-text databases, digital libraries and institutional repositories; consortia-based operations for e-journals and databases; user education and information literacy; professional development with stress on ICTs; network administration and website maintenance; and marketing of information are major areas that need special attention to improve the situation. Finance, the level of ICT knowledge among library staff, professional dynamism and leadership, the vision and support of administrators and policy makers, and the prevailing educational set-up and social environment in the state are some of the major hurdles to reaping the maximum possibilities of ICTs in the university libraries of Kerala. The principles of Business Process Re-engineering are found suitable for effective application in re-structuring and re-defining the operations and service systems of the libraries. Most of the conventional departments or divisions prevailing in the university libraries were functioning as watertight compartments, and their existing management system was too rigid to adopt the principles of change management. Hence, a thorough re-structuring of the divisions was indicated. Consortia-based activities and the pooling and sharing of information resources were advocated to meet the varied needs of users on the main campuses and off-campuses of the universities, in affiliated colleges and at remote stations. A uniform staff policy similar to that prevailing in CSIR, DRDO, ISRO, etc. 
has been proposed by the study, not only for the university libraries in Kerala but for the entire country. Restructuring of LIS education, and the integrated and planned development of the school, college, research and public library systems, were also justified for reaping the maximum benefits of modern ICTs.
Abstract:
This thesis, entitled “Studies on the Macrobenthic Community of Cochin Backwaters with Special Reference to Culture of Eriopisa chilkensis (Gammaridae: Amphipoda)”, concerns benthic organisms, which are usually studied for environmental impact assessment, pollution control and resource conservation. The benthic monitoring component has three major objectives: 1) characterise the benthic communities to assess estuarine health, 2) determine seasonal and spatial variability in benthic communities, and 3) detect changes in the estuarine community through examination of changes in the abundances of specific indicator taxa and other standard benthic indices. The Cochin backwaters, situated at the tip of the northern Vembanad Lake, are a tropical positive estuarine system. The backwaters of Kerala support as much biological productivity and diversity as a tropical rain forest and are responsible for the rich fishery potential of Kerala. The backwaters also act as nursery grounds for commercially important prawns and fishes. The thesis is subdivided into seven chapters. The first chapter gives a general introduction to the topic and also highlights the scope and purpose of the study. The second chapter covers the methodology adopted for the collection and analysis of water quality parameters, the sediment and the macrobenthic fauna. Chapter 3 deals with the hydrographic features, sediment characteristics and the spatial variation and abundance of macrobenthic fauna in the Cochin estuary. Chapter 4 explains the impact of organic enrichment on the macrobenthic population in the Cochin estuary and includes a comparison of the present data with earlier work in this region. Chapter 5 deals with seasonal variability in the abundance of macrobenthic species in the estuary. The study was conducted at 9 stations during three seasons (pre-monsoon, monsoon and post-monsoon) in 2003. Chapter 6 deals with the life history and population dynamics of Eriopisa chilkensis Chilton (Gammaridae: Amphipoda). 
The life cycle of the gammarid amphipod Eriopisa chilkensis from the Cochin estuary, south west coast of India was studied for the first time under laboratory conditions.
Abstract:
As of 1999, the state of Kerala had 3210 offices of scheduled commercial banks (SCBs). In all, there are 48 commercial banks operating in Kerala, which include PSBs, OPBs, NPBs, FBs and Gramin Banks. The urban areas give a complete picture of the competition in the present-day banking scenario, with the presence of all bank groups. Semi-urban areas of Kerala had 2196 offices and urban areas 593 as of March 1995. The study focuses on selected segments of the urban customers in Kerala, which are capable of revealing the finer aspects of variation in customer behaviour in the purchase of banking products and services. Considering the exhaustive nature of such an exercise, not all the districts in the state have been brought under the purview of the study. Instead, the three districts with the largest volume of business in terms of deposits, advances and number of offices have been shortlisted as representative regions for a focused study. The study focuses on the retail customer segment and their perceptions of the various products and services offered to them. Non-Resident Indians (NRIs) and the Traders and Small-Scale Industries segments have also been included in the study, with a view to obtaining a comparative picture with respect to perceptions of customer satisfaction, service quality dimensions and bank choice behaviour. The research is hence confined to customer behaviour and the implications for possible strategies for segmentation within the retail customer segment.
Abstract:
This research has responded to the need for diagnostic reference tools explicitly linking the influence of environmental uncertainty and performance within the supply chain. Uncertainty is a key factor influencing performance and an important measure of the operating environment. We develop and demonstrate a novel reference methodology based on data envelopment analysis (DEA) for examining the performance of value streams within the supply chain with specific reference to the level of environmental uncertainty they face. In this paper, using real industrial data, 20 product supply value streams within the European automotive industry sector are evaluated. Two are found to be efficient. The peer reference groups for the underperforming value streams are identified and numerical improvement targets are derived. The paper demonstrates how DEA can be used to guide supply chain improvement efforts through role-model identification and target setting, in a way that recognises the multiple dimensions/outcomes of the supply chain process and the influence of its environmental conditions. We have facilitated the contextualisation of environmental uncertainty and its incorporation into a specific diagnostic reference tool.
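The DEA assessment described can be reproduced in miniature: each value stream (decision-making unit) gets an efficiency score from one small linear programme, and the non-zero peer weights identify its reference group. Below is a minimal input-oriented CCR sketch in Python with SciPy; it is illustrative only, not the paper's exact model, and all names and data are assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y, o):
    """Input-oriented CCR efficiency of unit `o`.

    X: (n_units, n_inputs) input matrix, Y: (n_units, n_outputs) output matrix.
    Solves: min theta  s.t.  sum_j lam_j x_ij <= theta * x_oi  (each input),
                             sum_j lam_j y_rj >= y_or          (each output),
                             lam_j >= 0.
    Returns (theta, lambdas); non-zero lambdas are the peer reference group."""
    n, m = X.shape
    s = Y.shape[1]
    # Decision variables: [theta, lambda_1, ..., lambda_n]; minimise theta.
    c = np.zeros(1 + n)
    c[0] = 1.0
    A_ub, b_ub = [], []
    for i in range(m):   # inputs: sum_j lam_j x_ij - theta * x_oi <= 0
        A_ub.append(np.concatenate(([-X[o, i]], X[:, i])))
        b_ub.append(0.0)
    for r in range(s):   # outputs: -sum_j lam_j y_rj <= -y_or
        A_ub.append(np.concatenate(([0.0], -Y[:, r])))
        b_ub.append(-Y[o, r])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0, None)] * (1 + n), method="highs")
    return res.fun, res.x[1:]
```

Units scoring 1.0 are efficient; for an underperformer, the peer weights point at its role models and theta times its inputs gives the numerical improvement targets.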
Abstract:
The general objective of this thesis has been the seasonal monitoring (on a quarterly time scale) of the coastal and estuarine areas of a section of the northern coast of Rio Grande do Norte, Brazil, an area that is environmentally sensitive, subject to intense sediment erosion and involved in oil activities, in order to underpin the implementation of projects for the containment of erosion and the mitigation of the impacts of coastal dynamics. To achieve the general objective, the work was carried out systematically in three stages, which constituted the specific objectives. The first stage was the implementation of the geodetic reference infrastructure for carrying out the geodetic survey of the study area. This process included the implementation of the RGLS (Northern Coast of the RN GPS Network), consisting of stations with precise geodetic coordinates and orthometric heights; the positioning of benchmarks and the evaluation of the available gravimetric geoid, for use in precise GPS altimetry; and the development of software for precise GPS altimetry. The second stage was the development and improvement of methodologies for the collection, processing, representation, integration and analysis of the CoastLine (CL) and of Digital Elevation Models (DEMs) obtained by geodetic positioning techniques. This stage included the choice of the equipment and positioning methods to be used, depending on the required precision and the infrastructure implanted, and the definition of the CL indicator and of the geodetic references best suited to precise coastal monitoring. The third stage was the seasonal geodetic monitoring of the study area. 
The execution times of the geodetic surveys were defined by analysing the pattern of sediment dynamics of the study area; the surveys were performed in order to calculate and locate the areas and volumes of erosion and accretion (areal and volumetric sedimentary balance) occurring on the CL and on the beach and island surfaces throughout the year; and correlations were studied between the variations measured (in area and volume) between each survey and the action of the coastal dynamic agents. The results allowed an integrated study of the spatial and temporal interrelationships of the causes and consequences of the intensive coastal processes operating in the area, especially the measurement of the variability of erosion, transport, sedimentary balance and supply over the annual cycle of construction and destruction of the beaches. In the analysis of the results, it was possible to identify the causes and consequences of the severe coastal erosion occurring on exposed beaches, and to analyse the recovery of beaches and the accretion occurring in tidal inlets and estuaries. In light of the seasonal variations in the CL, human interventions for erosion containment have been proposed with the aim of restoring the previous condition of the beaches undergoing erosion.
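Precise GPS altimetry of the kind used here reduces to subtracting an interpolated geoid undulation N from the GPS ellipsoidal height h to obtain the orthometric height H = h - N. A minimal sketch follows; the regular-grid layout and all names are illustrative assumptions, not the format of the geoid model actually evaluated in the thesis.

```python
import numpy as np

def interp_geoid(lat, lon, lat0, lon0, step, grid):
    """Bilinearly interpolate the geoid undulation N (metres) at (lat, lon)
    from a regular grid whose south-west corner is (lat0, lon0), with
    spacing `step` degrees in both directions (interior points only)."""
    fi = (lat - lat0) / step
    fj = (lon - lon0) / step
    i, j = int(np.floor(fi)), int(np.floor(fj))
    di, dj = fi - i, fj - j
    return ((1 - di) * (1 - dj) * grid[i, j] + (1 - di) * dj * grid[i, j + 1]
            + di * (1 - dj) * grid[i + 1, j] + di * dj * grid[i + 1, j + 1])

def orthometric_height(h_ellipsoidal, undulation):
    """GPS altimetry: orthometric height H = ellipsoidal height h minus
    geoid undulation N."""
    return h_ellipsoidal - undulation
```

In practice the quality of H depends almost entirely on the quality of the gravimetric geoid, which is why the thesis evaluates the available geoid against levelled benchmarks before using GPS altimetry for monitoring.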
Abstract:
Three ranges of increasing temperatures (35–43, 37–45, 39–47 °C) were sequentially applied to a five-stage system continuously operated with cell recycling, so that differences of 2 °C (from one reactor to the next) and 8 °C (between the first reactor, at the highest temperature, and the fifth, at the lowest) were kept among the reactors for each temperature range. The entire system was fed through the first reactor. The lowest values of biomass and viability were obtained for reactor R-3, located in the middle of the system. The highest yield of biomass was obtained in the effluent when the system was operated at 35–43 °C. This non-conventional system was set up to simulate the local fluctuations in temperature and nutrient concentration that occur in different regions of the medium in an industrial bioreactor for fuel ethanol production, mainly in tropical climates. Minimised cell death and continuous sugar utilisation were observed at temperatures normally considered too high for Saccharomyces cerevisiae fermentations.
Abstract:
Lightning discharges represent one of the greatest risk factors for the electric power sector, especially in Amazonia, a region of the world identified with high flash densities and high lightning peak currents. In this context, the STARNET network (Sferics Timing And Ranging NETwork), the only ground-based, free and continuously operating lightning detection system covering the entire Amazon region, was chosen for generating the lightning patterns needed to optimise the protection systems of transmission lines by means of a lightning monitoring system. However, the intermittent operation observed in the operational diagrams of the various STARNET stations affects the overall performance of the system, in particular its detection efficiency. A model was therefore developed to uniformise the lightning data as a function of the network configuration (number and location of the sensors in operation), with the final objective of producing reliable keraunic index and flash density maps. Some regions of Amazonia always present positive flash density anomalies, such as the regions of Belém and Manaus and the state of Tocantins, which affect the transmission lines. The incidence of lightning on transmission lines in Amazonia was then estimated from the peak current distribution recorded by RDR-SIPAM and the corrected flash density from the STARNET network. The evaluation of STARNET's performance, taking the lightning detection network of the Amazon Protection System (RDR-SIPAM) as reference, also showed a strong dependence of STARNET's detection efficiency on the peak current of the discharges.
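In its simplest form, the uniformisation described divides the observed flash counts in each grid cell by the detection efficiency implied by the network configuration, before converting to a density. A toy sketch follows; this is not STARNET's actual correction model, and all names and units are illustrative.

```python
def corrected_flash_density(observed_count, detection_efficiency,
                            cell_area_km2, years):
    """Correct an observed flash count for network detection efficiency
    and normalise it to flashes per km^2 per year.

    detection_efficiency: estimated fraction (0 < DE <= 1) of real flashes
    the network detects under the current sensor configuration."""
    if not 0.0 < detection_efficiency <= 1.0:
        raise ValueError("detection efficiency must be in (0, 1]")
    true_count = observed_count / detection_efficiency
    return true_count / (cell_area_km2 * years)
```

A real model would make the detection efficiency itself a function of sensor number, geometry and, as the evaluation against RDR-SIPAM showed, of the peak current of the discharges.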
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
The research activity carried out during the PhD course in Electrical Engineering belongs to the branch of electric and electronic measurements. The main subject of the present thesis is a distributed measurement system to be installed in Medium Voltage power networks, together with the method developed to analyse the data acquired by the measurement system itself and to monitor power quality. In chapter 2 the increasing interest in power quality in electrical systems is illustrated, by reporting the international research activity on the problem and the relevant standards and guidelines issued. Attention to the quality of the voltage provided by utilities, and influenced by customers, at the various points of a network emerged only in recent years, in particular as a consequence of the liberalisation of the energy market. Previously, the quality of the delivered energy had been associated mostly with its continuity; hence reliability was the main characteristic to be ensured for power systems. Nowadays, the number and duration of interruptions are the “quality indicators” commonly perceived by most customers; for this reason, a short section is dedicated also to network reliability and its regulation. In this context it should be noted that although the measurement system developed during the research activity belongs to the field of power quality evaluation systems, the information registered in real time by its remote stations can be used to improve system reliability too. Given the vast scenario of power-quality-degrading phenomena that can occur in distribution networks, the study has been focused on electromagnetic transients affecting line voltages. 
The outcome of this study has been the design and realisation of a distributed measurement system which continuously monitors the phase signals at different points of a network, detects the occurrence of transients superposed on the fundamental steady-state component and registers the time of occurrence of such events. The data set is finally used to locate the source of the transient disturbance propagating along the network lines. Most of the oscillatory transients affecting line voltages are due to faults occurring at any point of the distribution system, and they have to be observed before the protection equipment intervenes. An important conclusion is that the method can improve the reliability of the monitored network, since knowledge of the location of a fault allows the energy manager to reduce as much as possible both the area of the network to be disconnected for protection purposes and the time spent by technical staff to recover from the abnormal condition and/or the damage. The part of the thesis presenting the results of this study and activity is structured as follows: chapter 3 deals with the propagation of electromagnetic transients in power systems, defining the characteristics and causes of the phenomena and briefly reporting the theory and approaches used to study transient propagation. Then the state of the art concerning methods to detect and locate faults in distribution networks is presented. Finally, attention is paid to the particular technique adopted for this purpose in the thesis, and to the methods developed on the basis of this approach. Chapter 4 reports the configuration of the distribution networks on which the fault location method has been applied by means of simulations, as well as the results obtained case by case. In this way the performance of the location procedure is tested first in ideal and then in realistic operating conditions. 
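The location principle can be illustrated with the classical two-ended travelling-wave formula: if the transient generated by a fault reaches the two ends of a line of length L at times t_A and t_B, and propagates at speed v, the distance of the fault from end A is d_A = (L + v(t_A - t_B))/2. The sketch below uses this generic textbook formula, which is not necessarily the exact method of the thesis; names and values are illustrative.

```python
def fault_distance(t_a, t_b, line_length_km, v_km_per_s=2.9e5):
    """Two-ended travelling-wave fault location.

    t_a, t_b: arrival times (s) of the transient wavefront registered by
    the monitoring stations at the two line ends (common time reference,
    e.g. GPS-synchronised clocks).
    v_km_per_s: assumed propagation speed of the wave along the line.
    Returns the fault distance from end A in km."""
    d = 0.5 * (line_length_km + v_km_per_s * (t_a - t_b))
    if not 0.0 <= d <= line_length_km:
        raise ValueError("arrival times inconsistent with this line")
    return d
```

Note that the method hinges on accurate time-tagging: a 1 microsecond timing error maps to roughly v/2 times 1e-6, i.e. about 0.15 km at typical propagation speeds.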
In chapter 5 the measurement system designed to implement the transient detection and fault location method is presented. The hardware belonging to the measurement chain of every acquisition channel in the remote stations is described. Then the global measurement system is characterised by considering the non-ideal aspects of each device that can contribute to the final combined uncertainty on the estimated position of the fault in the network under test. Finally, this parameter is computed according to the Guide to the Expression of Uncertainty in Measurement, by means of a numeric procedure. In the last chapter a device is described that was designed and realised during the PhD activity with the aim of substituting the commercial capacitive voltage divider belonging to the conditioning block of the measurement chain. This study was carried out with the aim of providing an alternative to the transducer used, one that could offer equivalent performance at lower cost. In this way, the economic impact of the investment associated with the whole measurement system would be significantly reduced, making the application of the method much more feasible.
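For the two-ended arrival-time formula d = (L + v(t_A - t_B))/2, a GUM-style combined standard uncertainty can be propagated analytically: u(d) = sqrt((v/2)^2 (u_tA^2 + u_tB^2) + ((t_A - t_B)/2)^2 u_v^2). The sketch below is illustrative only; the thesis computes the combined uncertainty of the full measurement chain numerically.

```python
import math

def fault_distance_uncertainty(u_t_s, v_km_s=2.9e5, u_v_km_s=0.0, dt_s=0.0):
    """GUM-style combined standard uncertainty on the fault distance
    d = (L + v*(t_A - t_B))/2, assuming equal timing uncertainty u_t_s
    at both stations and (optionally) an uncertainty u_v_km_s on the
    propagation speed, with dt_s = t_A - t_B.

    Sensitivity coefficients: dd/dt_A = v/2, dd/dt_B = -v/2,
    dd/dv = (t_A - t_B)/2."""
    return math.sqrt((v_km_s / 2 * u_t_s) ** 2
                     + (v_km_s / 2 * u_t_s) ** 2
                     + (dt_s / 2 * u_v_km_s) ** 2)
```

With GPS-class synchronisation (u_t around 1 microsecond) and a typical propagation speed, the timing contribution alone is on the order of a couple of hundred metres, which sets the scale for how much line can be spared from disconnection.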
Abstract:
Recently, an ever-increasing degree of automation has been observed in most industrial processes. This increase is motivated by the demand for systems with high performance in terms of the quality of the products and services generated, productivity, efficiency and low costs in design, realisation and maintenance. This trend in the growth of complex automation systems is rapidly spreading over automated manufacturing systems (AMS), where the integration of mechanical and electronic technology, typical of Mechatronics, is merging with other technologies such as Informatics and communication networks. An AMS is a very complex system that can be thought of as constituted by a set of flexible working stations and one or more transportation systems. To appreciate how important these machines are in our society, consider that every day most of us use bottles of water or soda and buy products packed in boxes, such as food or cigarettes. A further indication of their complexity is that the consortium of machine producers has estimated that there are around 350 types of manufacturing machine. A large number of manufacturing machine industries are present in Italy, notably the packaging machine industry; in particular, a great concentration of this kind of industry is located in the Bologna area, which for this reason is called the “packaging valley”. Usually, the various parts of an AMS interact in a concurrent and asynchronous way, and coordinating the parts of the machine to obtain a desired overall behaviour is a hard task. Often, this is the case in large-scale systems organised in a modular and distributed manner. 
Even if the success of a modern AMS from a functional and behavioural point of view is still attributable to the design choices made in the definition of the mechanical structure and the electrical/electronic architecture, the system that governs the control of the plant is becoming crucial, because of the large number of duties associated with it. Apart from the activity inherent in the automation of the machine cycles, the supervisory system is called upon to perform other main functions, such as: emulating the behaviour of traditional mechanical members, thus allowing a drastic constructive simplification of the machine and a crucial functional flexibility; dynamically adapting the control strategies according to the different productive needs and the different operational scenarios; obtaining a high quality of the final product through verification of the correctness of the processing; guiding the machine operator to promptly and carefully take the actions needed to establish or restore the optimal operating conditions; and managing diagnostic information in real time, as a support for the maintenance operations of the machine. The kinds of facilities that designers can find directly on the market, in terms of software component libraries, provide an adequate support for the implementation of either top-level or bottom-level functionalities, typically pertaining to the domains of user-friendly HMIs, closed-loop regulation and motion control, and fieldbus-based interconnection of remote smart devices. What is still lacking is a reference framework comprising a comprehensive set of highly reusable logic control components that, focusing on the cross-cutting functionalities characterising the automation domain, may help designers in the process of modelling and structuring their applications according to their specific needs. 
Historically, the design and verification process for complex automated industrial systems has been performed in an empirical way, without a clear distinction between functional and technological-implementation concepts and without a systematic method for dealing organically with the complete system. In the field of analog and digital control, by contrast, design and verification through formal and simulation tools have long been adopted, at least for multivariable and/or nonlinear controllers of complex time-driven dynamics, as in the fields of vehicles, aircraft, robots, electric drives and complex power electronics equipment. Moving to the field of logic control, typical of industrial manufacturing automation, the design and verification process is approached in a completely different way, usually very “unstructured”. No clear distinction between functions and implementations, or between functional architectures and technological architectures and platforms, is considered. Probably this difference is due to the different “dynamical framework” of logic control with respect to analog/digital control. As a matter of fact, in logic control discrete-event dynamics replace time-driven dynamics; hence most of the formal and mathematical tools of analog/digital control cannot be directly migrated to logic control to enlighten the distinction between functions and implementations. In addition, in the common view of application technicians, logic control design is strictly connected to the adopted implementation technology (relays in the past, software nowadays), leading again to a deep confusion between the functional view and the technological view. In industrial automation software engineering, concepts such as modularity, encapsulation, composability and reusability are strongly emphasised and profitably realised in the so-called object-oriented methodologies. 
Industrial automation has lately been adopting this approach, as testified by the IEC 61131-3 and IEC 61499 standards, which have been considered in commercial products only recently. On the other hand, in the scientific and technical literature many contributions have already been proposed to establish a suitable modelling framework for industrial automation. In recent years it has been possible to note a considerable growth in the exploitation of innovative concepts and technologies from the ICT world in industrial automation systems. As far as logic control design is concerned, Model-Based Design (MBD) is being imported into industrial automation from the software engineering field. Another key point in industrial automated systems is the growth of requirements in terms of availability, reliability and safety for technological systems. In other words, the control system should not only deal with the nominal behaviour, but should also deal with other important duties, such as diagnosis and fault isolation, recovery and safety management. Indeed, together with high performance, fault occurrences increase in complex systems. This is a consequence of the fact that, as typically occurs in reliable mechatronic systems, in complex systems such as AMS an increasing number of electronic devices are present alongside reliable mechanical elements, and the former are more vulnerable by their very nature. The problem of diagnosis and fault isolation in a generic dynamical system consists in the design of an elaboration unit that, by appropriately processing the inputs and outputs of the dynamical system, is capable of detecting incipient faults on the plant devices and of reconfiguring the control system so as to guarantee satisfactory performance. 
The designer should be able to formally verify the product, certifying that, in its final implementation, it will perform its required function while guaranteeing the desired level of reliability and safety; the next step is that of preventing faults and eventually reconfiguring the control system so that faults are tolerated. On this topic, important improvements to the formal verification of logic control, fault diagnosis and fault-tolerant control derive from Discrete Event Systems theory. The aim of this work is to define a design pattern and a control architecture to help the designer of control logic in industrial automated systems. The work starts with a brief discussion of the main characteristics and a description of industrial automated systems in Chapter 1. In Chapter 2 a survey of the state of the software engineering paradigm applied to industrial automation is discussed. Chapter 3 presents an architecture for industrial automated systems based on the new concept of the Generalized Actuator, showing its benefits, while in Chapter 4 this architecture is refined using a novel entity, the Generalized Device, in order to obtain better reusability and modularity of the control logic. In Chapter 5 a new approach is presented, based on Discrete Event Systems, for the problem of formal software verification, together with an active fault-tolerant control architecture using on-line diagnostics. Finally, conclusive remarks and some ideas on new directions to explore are given. Appendix A briefly reports some concepts and results about Discrete Event Systems which should help the reader to understand some crucial points in Chapter 5, while Appendix B gives an overview of the experimental testbed of the Laboratory of Automation of the University of Bologna, used to validate the approach presented in Chapters 3, 4 and 5. In Appendix C some component models used in Chapter 5 for formal verification are reported.
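The discrete-event models invoked here are, at bottom, finite automata over event alphabets: states, a partial transition function and marked states. A minimal sketch follows; it is illustrative only, not the Generalized Actuator or Generalized Device models of the thesis.

```python
class Automaton:
    """Minimal deterministic discrete-event automaton: a partial
    transition function over (state, event) pairs, an initial state,
    and a set of marked (accepting) states."""

    def __init__(self, delta, init, marked=()):
        self.delta = delta      # {(state, event): next_state}
        self.init = init
        self.marked = set(marked)

    def run(self, word):
        """Execute a sequence of events; raise if an event is not
        enabled in the current state."""
        state = self.init
        for e in word:
            if (state, e) not in self.delta:
                raise ValueError(f"event {e!r} not enabled in {state!r}")
            state = self.delta[(state, e)]
        return state

    def accepts(self, word):
        """A word is accepted if it is executable and ends in a marked
        state (e.g. the machine back at rest)."""
        try:
            return self.run(word) in self.marked
        except ValueError:
            return False

# A toy actuator model: it must be started before it can be stopped,
# and only the resting state is marked.
actuator = Automaton(
    delta={("idle", "start"): "running", ("running", "stop"): "idle"},
    init="idle",
    marked=("idle",),
)
```

Formal verification then amounts to questions over such models, e.g. whether any executable event sequence can reach a forbidden state, or whether a diagnoser automaton can always distinguish faulty from nominal behaviours.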
Abstract:
The reference librarian's task is to translate the patron's question into one that can be answered with the library's resources. The first element of that task is to know what the patron wants; the second is to know what resources the library has and how to use them. Reference librarians must learn continuously throughout their careers, both because new resources become available and because patrons present questions requiring new resources. This article focuses on how to determine what kind of information the patron needs through the reference interview.
Abstract:
Data from polar stations
Abstract:
To estimate the kinematics of the SIRGAS reference frame, the Deutsches Geodätisches Forschungsinstitut (DGFI), as the IGS Regional Network Associate Analysis Centre for SIRGAS (IGS RNAAC SIR), yearly computes a cumulative (multi-year) solution containing all available weekly solutions delivered by the SIRGAS analysis centres. These cumulative solutions include the models, standards and strategies widely applied at the time at which they were computed, and cover different time spans depending on the availability of the weekly solutions. This data set corresponds to the multi-year solution SIR11P01. It is based on the combination of the weekly normal equations covering the time span from 2000-01-02 (GPS week 1043) to 2011-04-16 (GPS week 1631), when the IGS08 reference frame was introduced. It refers to ITRF2008, epoch 2005.0, and contains 230 stations with 269 occupations. Its precision was estimated to be ±1.0 mm (horizontal) and ±2.4 mm (vertical) for the station positions, and ±0.7 mm/a (horizontal) and ±1.1 mm/a (vertical) for the constant velocities. The computation strategy and results are described in detail in Sánchez and Seitz (2011). The IGS RNAAC SIR computation of the SIRGAS reference frame is possible thanks to the active participation of many Latin American and Caribbean colleagues, who not only make the measurements of the stations available, but also operate SIRGAS analysis centres processing the observational data on a routine basis (more details at http://www.sirgas.org). The achievements of SIRGAS are a consequence of a successful international geodetic cooperation, not only following and meeting concrete objectives, but also becoming a permanent and self-sustaining geodetic community to guarantee the quality, reliability and long-term stability of the SIRGAS reference frame. The SIRGAS activities are strongly supported by the International Association of Geodesy (IAG) and the Pan-American Institute for Geography and History (PAIGH). 
The IGS RNAAC SIR highly appreciates all this support.
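A cumulative solution like SIR11P01 is used by propagating each station's position from the reference epoch with its constant velocity, x(t) = x(2005.0) + v (t - 2005.0). A minimal sketch with illustrative values (not actual SIR11P01 coordinates):

```python
import numpy as np

def propagate_position(x0_m, v_m_per_yr, t, t0=2005.0):
    """Propagate station coordinates (e.g. geocentric XYZ in metres,
    ITRF2008) from the frame's reference epoch t0 to epoch t (decimal
    years) using the constant velocity of the cumulative solution."""
    return np.asarray(x0_m, dtype=float) + np.asarray(v_m_per_yr, dtype=float) * (t - t0)
```

With horizontal velocities good to ±0.7 mm/a, propagating a decade away from the reference epoch contributes several millimetres of position uncertainty, which is why the multi-year solution is recomputed regularly.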
Abstract:
High-frequency data collected continuously over a multi-year time frame are required for investigating the various agents that drive ecological and hydrodynamic processes in estuaries. Here, we present water quality and current in-situ observations from a fixed monitoring station that operated from 2008 to 2014 in the lower Guadiana Estuary, southern Portugal (37°11.30' N, 7°24.67' W). The data were recorded by a multi-parametric probe providing hourly records (temperature, salinity, chlorophyll, dissolved oxygen, turbidity and pH) at a water depth of ~1 m, and by a bottom-mounted acoustic Doppler current profiler measuring the pressure, near-bottom temperature and flow velocity through the water column every 15 min. The time-series data, in particular the probe records, present substantial gaps arising from equipment failure and maintenance, which are unavoidable with this type of observation in harsh environments. However, prolonged (months-long) periods of multi-parametric observations under contrasting external forcing conditions are available. The raw data are reported together with flags indicating the quality status of each record. River discharge data from two hydrographic stations located near the estuary head are also provided to support data analysis and interpretation.
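Time series with quality flags and gaps like these are typically reduced to daily statistics only where enough valid hourly records exist. A sketch with pandas follows; the completeness threshold and all names are assumptions, not part of the published data format.

```python
import pandas as pd

def daily_means(hourly, min_hours=18):
    """Daily means from an hourly series with gaps.

    A day is reported only if at least `min_hours` hourly records are
    present; otherwise it is returned as NaN, so equipment-failure gaps
    do not produce misleading averages."""
    counts = hourly.resample("D").count()
    means = hourly.resample("D").mean()
    return means.where(counts >= min_hours)
```

Before resampling, flagged-bad records would first be dropped (e.g. keeping only rows whose quality flag marks them as good), so that the completeness check counts only usable data.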