993 results for LINE-DATA-BASE


Relevance: 80.00%

Publisher:

Abstract:

The aim of this article is to present the structure of the relational database that contains all the syntactic information in the Diccionario Crítico Etimológico Castellano e Hispánico by J. Corominas and J. A. Pascual. Although this dictionary contains a wide range of historical information for each of its entries, that information is not presented in structured form, so it was necessary to study and classify all the elements related to syntactic aspects. On the basis of this preliminary study, the different fields of the database were designed; they are grouped into five thematic blocks: lemmatic information; grammatical information; syntactic information; other related aspects; and relevant observations or comments made by the researcher. The database does not merely reproduce the contents of the dictionary, but also includes several interpretive fields. For this reason, Syntax.dbf is a fundamental working tool for researchers interested in the diachronic syntax of Spanish.

Relevance: 80.00%

Publisher:

Abstract:

The thesis involves the development and implementation of a new and robust control system based on permeability trends but at the same time capable of reducing aeration proportionally to permeate flux. Permeability was made a key parameter for directly comparing temporary changes in membrane performance. Transmembrane pressure and flux were gathered every 10 seconds and permeability values were automatically calculated; different mathematical algorithms were applied for the signal filtering of on-line data. Short term and long term permeability trends were compared once a day, and a control action was applied proportionally to the short term/long term permeability ratio without exceeding the aeration flow recommended by the membrane suppliers.
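The control loop described above can be sketched in a few lines. This is a minimal illustration, assuming permeability is computed as permeate flux over transmembrane pressure and that "proportional" means scaling aeration inversely with the short-term/long-term permeability ratio; all names and numbers are our assumptions, not the thesis implementation.

```python
# Minimal sketch of a permeability-based aeration controller.
# Units and the ratio-to-aeration mapping are illustrative assumptions.

def permeability(flux_lmh, tmp_bar):
    """Permeability as permeate flux per unit transmembrane pressure."""
    return flux_lmh / tmp_bar

def aeration_setpoint(short_term_perm, long_term_perm,
                      base_aeration, max_aeration):
    """Scale aeration with the short-/long-term permeability ratio.

    A ratio below 1 means permeability is declining (fouling), so more
    air scour is requested, capped at the supplier-recommended maximum.
    """
    ratio = short_term_perm / long_term_perm
    return min(base_aeration / ratio, max_aeration)
```

For example, a short-term permeability half the long-term value doubles the requested aeration, but the supplier cap still applies.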

Relevance: 80.00%

Publisher:

Abstract:

Environmental legislation has recently become more restrictive regarding the discharge of wastewater containing nutrients, especially in so-called sensitive areas or vulnerable zones. This has stimulated the understanding, development and improvement of nutrient removal processes. The Sequencing Batch Reactor (SBR) is an activated sludge treatment system operated on a fill-and-draw basis. In this type of reactor, the wastewater is added to a single tank that works in batches, repeating a cycle (sequence) over time. One characteristic of SBRs is that all the operations (filling, reaction, settling and drawing) take place in the same reactor. SBR technology is not new; in fact, it appeared before continuous activated sludge treatment systems. The precursor of the SBR was a fill-and-draw system operated in batch mode. Between 1914 and 1920 these reactors ran into difficulties, many of them operational (valves, switching flow from one reactor to another, the large amount of operator attention required), and it was not until the late 1950s and early 1960s, with the development of new equipment and technologies, that interest in SBRs revived. Major improvements in air supply (motorised or pneumatically actuated valves) and in control (level probes, flow meters, automatic timers, microprocessors) now allow SBRs to compete with conventional activated sludge systems. The aim of this thesis is to identify suitable operating conditions for a cycle according to the influent wastewater, the treatment requirements and the desired effluent quality using SBR technology.
These three characteristics (the water to be treated, the treatment requirements and the desired final quality) largely determine the treatment to be carried out. Accordingly, different feeding strategies were studied in order to adapt the treatment to each type of wastewater and its requirements. The process is monitored through on-line measurements of pH, DO and ORP, whose changes provide information on the state of the process. A further parameter that can be calculated from the dissolved oxygen signal is the oxygen uptake rate (OUR), which complements these measurements. Operating conditions for nitrogen removal from a synthetic wastewater were evaluated using a step-feed strategy, by studying the effect of the number of feeds, defining the length and number of phases per cycle, and identifying the critical points with the pH, DO and ORP probes. The step-feed strategy was applied to two different wastewaters: one from a textile industry and the other from landfill leachate. For both wastewaters, process efficiency was studied on the basis of the operating conditions and the oxygen uptake rate. While the main objective for the textile wastewater was to remove organic matter, for the landfill leachate it was to remove both organic matter and nitrogen. Operating conditions for nitrogen and phosphorus removal from an urban wastewater were evaluated using a step-feed strategy, by defining the number and length of the phases per cycle and identifying the critical points with the pH, DO and ORP probes. The influence of pH and of the carbon source on phosphorus removal from a synthetic wastewater was analysed by studying the pH increase in two reactors with different carbon sources and the effect of changing the carbon source.
As shown throughout the thesis, in which different wastewaters were treated for different purposes, one of the most important advantages of an SBR is its flexibility.
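The oxygen uptake rate (OUR) mentioned above is typically derived from the dissolved oxygen (DO) signal: with aeration off, OUR is the negative slope of DO versus time. A minimal sketch with invented readings, not the monitoring code used in the thesis:

```python
# Estimate OUR (mg O2 per litre per hour) as the negative least-squares
# slope of dissolved-oxygen readings taken while aeration is switched off.

def estimate_our(times_h, do_mg_l):
    n = len(times_h)
    mean_t = sum(times_h) / n
    mean_do = sum(do_mg_l) / n
    num = sum((t - mean_t) * (d - mean_do)
              for t, d in zip(times_h, do_mg_l))
    den = sum((t - mean_t) ** 2 for t in times_h)
    return -num / den
```

A DO trace falling from 8.0 to 6.5 mg/L over 0.3 h, for instance, corresponds to an OUR of 5 mg O2/L/h.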

Relevance: 80.00%

Publisher:

Abstract:

Reanalysis data obtained from data assimilation are increasingly used for diagnostic studies of the general circulation of the atmosphere, for the validation of modelling experiments and for estimating energy and water fluxes between the Earth surface and the atmosphere. Because fluxes are not specifically observed, but determined by the data assimilation system, they are not only influenced by the utilized observations but also by model physics and dynamics and by the assimilation method. In order to better understand the relative importance of humidity observations for the determination of the hydrological cycle, in this paper we describe an assimilation experiment using the ERA40 reanalysis system where all humidity data have been excluded from the observational data base. The surprising result is that the model, driven by the time evolution of wind, temperature and surface pressure, is able to almost completely reconstitute the large-scale hydrological cycle of the control assimilation without the use of any humidity data. In addition, analysis of the individual weather systems in the extratropics and tropics using an objective feature tracking analysis indicates that the humidity data have very little impact on these systems. We include a discussion of these results and possible consequences for the way moisture information is assimilated, as well as the potential consequences for the design of observing systems for climate monitoring. It is further suggested, with support from a simple assimilation study with another model, that model physics and dynamics play a decisive role for the hydrological cycle, stressing the need to better understand these aspects of model parametrization.

Relevance: 80.00%

Publisher:

Abstract:

The impact of selected observing systems on the European Centre for Medium-Range Weather Forecasts (ECMWF) 40-yr reanalysis (ERA40) is explored by mimicking observational networks of the past. This is accomplished by systematically removing observations from the present observational data base used by ERA40. The observing systems considered are a surface-based system typical of the period prior to 1945/50, obtained by only retaining the surface observations, a terrestrial-based system typical of the period 1950-1979, obtained by removing all space-based observations, and finally a space-based system, obtained by removing all terrestrial observations except those for surface pressure. Experiments using these different observing systems have been limited to seasonal periods selected from the last 10 yr of ERA40. The results show that the surface-based system has severe limitations in reconstructing the atmospheric state of the upper troposphere and stratosphere. The terrestrial system has major limitations in generating the circulation of the Southern Hemisphere with considerable errors in the position and intensity of individual weather systems. The space-based system is able to analyse the larger-scale aspects of the global atmosphere almost as well as the present observing system but performs less well in analysing the smaller-scale aspects as represented by the vorticity field. Here, terrestrial data such as radiosondes and aircraft observations are of paramount importance. The terrestrial system in the form of a limited number of radiosondes in the tropics is also required to analyse the quasi-biennial oscillation phenomenon in a proper way. The results also show the dominance of the satellite observing system in the Southern Hemisphere. These results all indicate that care is required in using current reanalyses in climate studies due to the large inhomogeneity of the available observations, in particular in time.

Relevance: 80.00%

Publisher:

Abstract:

Severe wind storms are one of the major natural hazards in the extratropics and inflict substantial economic damages and even casualties. Insured storm-related losses depend on (i) the frequency, nature and dynamics of storms, (ii) the vulnerability of the values at risk, (iii) the geographical distribution of these values, and (iv) the particular conditions of the risk transfer. It is thus of great importance to assess the impact of climate change on future storm losses. To this end, the current study employs, to our knowledge for the first time, a coupled approach, using output from high-resolution regional climate model scenarios for the European sector to drive an operational insurance loss model. An ensemble of coupled climate-damage scenarios is used to provide an estimate of the inherent uncertainties. Output of two state-of-the-art global climate models (HadAM3, ECHAM5) is used for present (1961–1990) and future climates (2071–2100, SRES A2 scenario). These serve as boundary data for two nested regional climate models with sophisticated gust parametrizations (CLM, CHRM). For validation and calibration purposes, an additional simulation is undertaken with the CHRM driven by the ERA40 reanalysis. The operational insurance model (Swiss Re) uses a European-wide damage function, an average vulnerability curve for all risk types, and contains the actual value distribution of a complete European market portfolio. The coupling between climate and damage models is based on daily maxima of 10 m gust winds, and the strategy adopted consists of three main steps: (i) development and application of a pragmatic selection criterion to retrieve significant storm events, (ii) generation of a probabilistic event set using a Monte-Carlo approach in the hazard module of the insurance model, and (iii) calibration of the simulated annual expected losses with a historic loss data base.
The climate models considered agree regarding an increase in the intensity of extreme storms in a band across central Europe (stretching from southern UK and northern France to Denmark, northern Germany into eastern Europe). This effect increases with event strength, and rare storms show the largest climate change sensitivity, but are also beset with the largest uncertainties. Wind gusts decrease over northern Scandinavia and Southern Europe. Highest intra-ensemble variability is simulated for Ireland, the UK, the Mediterranean, and parts of Eastern Europe. The resulting changes on European-wide losses over the 110-year period are positive for all layers and all model runs considered and amount to 44% (annual expected loss), 23% (10 years loss), 50% (30 years loss), and 104% (100 years loss). There is a disproportionate increase in losses for rare high-impact events. The changes result from increases in both severity and frequency of wind gusts. Considerable geographical variability of the expected losses exists, with Denmark and Germany experiencing the largest loss increases (116% and 114%, respectively). All countries considered except for Ireland (−22%) experience some loss increases. Some ramifications of these results for the socio-economic sector are discussed, and future avenues for research are highlighted. The technique introduced in this study and its application to realistic market portfolios offer exciting prospects for future research on the impact of climate change that is relevant for policy makers, scientists and economists.
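Steps (i) to (iii) of the coupling strategy can be caricatured in a few lines. Everything below (the cubic damage function, the Gaussian gust sampler, the portfolio value) is an invented placeholder, not the Swiss Re model; it only illustrates how a Monte-Carlo event set turns gust maxima into an annual expected loss.

```python
import random

def damage_fraction(gust_ms, threshold=25.0):
    """Toy damage function: no loss below the gust threshold, cubic above."""
    if gust_ms <= threshold:
        return 0.0
    return min(((gust_ms - threshold) / threshold) ** 3, 1.0)

def annual_expected_loss(portfolio_value, events_per_year,
                         n_years=10000, seed=42):
    """Average the simulated losses of many synthetic storm years."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_years):
        for _ in range(events_per_year):
            gust = rng.gauss(20.0, 6.0)  # synthetic peak gust, m/s
            total += portfolio_value * damage_fraction(gust)
    return total / n_years
```

The cubic shape mimics the disproportionate growth of losses for rare high-impact events noted in the abstract: small increases in gust strength above the threshold produce much larger loss increases.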

Relevance: 80.00%

Publisher:

Abstract:

This Atlas presents statistical analyses of the simulations submitted to the Aqua-Planet Experiment (APE) data archive. The simulations are from global Atmospheric General Circulation Models (AGCM) applied to a water-covered earth. The AGCMs include ones actively used or being developed for numerical weather prediction or climate research. Some are mature application models, and others are more novel and thus less well tested in Earth-like applications. The experiment applies AGCMs with their complete parameterization package to an idealization of the planet Earth which has a greatly simplified lower boundary that consists of an ocean only. It has no land and its associated orography, and no sea ice. The ocean is represented by Sea Surface Temperatures (SST) which are specified everywhere with simple, idealized distributions. Thus in the hierarchy of tests available for AGCMs, APE falls between tests with simplified forcings such as those proposed by Held and Suarez (1994) and Boer and Denis (1997) and Earth-like simulations of the Atmospheric Modeling Intercomparison Project (AMIP, Gates et al., 1999). Blackburn and Hoskins (2013) summarize the APE and its aims. They discuss where the APE fits within a modeling hierarchy which has evolved to evaluate complete models and which provides a link between realistic simulation and conceptual models of atmospheric phenomena. The APE bridges a gap in the existing hierarchy. The goals of APE are to provide a benchmark of current model behaviors and to stimulate research to understand the cause of inter-model differences. APE is sponsored by the World Meteorological Organization (WMO) joint Commission on Atmospheric Science (CAS), World Climate Research Program (WCRP) Working Group on Numerical Experimentation (WGNE). Chapter 2 of this Atlas provides an overview of the specification of the eight APE experiments and of the data collected. Chapter 3 lists the participating models and includes brief descriptions of each.
Chapters 4 through 7 present a wide variety of statistics from the 14 participating models for the eight different experiments. Additional intercomparison figures created by Dr. Yukiko Yamada in AGU group are available at http://www.gfd-dennou.org/library/ape/comparison/. This Atlas is intended to present and compare the statistics of the APE simulations but does not contain a discussion of interpretive analyses. Such analyses are left for journal papers such as those included in the Special Issue of the Journal of the Meteorological Society of Japan (2013, Vol. 91A) devoted to the APE. Two papers in that collection provide an overview of the simulations. One (Blackburn et al., 2013) concentrates on the CONTROL simulation and the other (Williamson et al., 2013) on the response to changes in the meridional SST profile. Additional papers provide more detailed analysis of the basic simulations, while others describe various sensitivities and applications. The APE experiment data base holds a wealth of data that is now publicly available from the APE web site: http://climate.ncas.ac.uk/ape/. We hope that this Atlas will stimulate future analyses and investigations to understand the large variation seen in the model behaviors.

Relevance: 80.00%

Publisher:

Abstract:

Atmospheric dust is an important feedback in the climate system, potentially affecting the radiative balance and chemical composition of the atmosphere and providing nutrients to terrestrial and marine ecosystems. Yet the potential impact of dust on the climate system, both in the anthropogenically disturbed future and the naturally varying past, remains to be quantified. The geologic record of dust provides the opportunity to test earth system models designed to simulate dust. Records of dust can be obtained from ice cores, marine sediments, and terrestrial (loess) deposits. Although rarely unequivocal, these records document a variety of processes (source, transport and deposition) in the dust cycle, stored in each archive as changes in clay mineralogy, isotopes, grain size, and concentration of terrigenous materials. Although the extraction of information from each type of archive is slightly different, the basic controls on these dust indicators are the same. Changes in the dust flux and particle size might be controlled by a combination of (a) source area extent, (b) dust emission efficiency (wind speed) and atmospheric transport, (c) atmospheric residence time of dust, and/or (d) relative contributions of dry settling and rainout of dust. Similarly, changes in mineralogy reflect (a) source area mineralogy and weathering and (b) shifts in atmospheric transport. The combination of these geological data with process-based, forward-modelling schemes in global earth system models provides an excellent means of achieving a comprehensive picture of the global pattern of dust accumulation rates, their controlling mechanisms, and how those mechanisms may vary regionally. The Dust Indicators and Records of Terrestrial and MArine Palaeoenvironments (DIRTMAP) data base has been established to provide a global palaeoenvironmental data set that can be used to validate earth system model simulations of the dust cycle over the past 150,000 years.

Relevance: 80.00%

Publisher:

Abstract:

Global syntheses of palaeoenvironmental data are required to test climate models under conditions different from the present. Data sets for this purpose contain data from spatially extensive networks of sites. The data are either directly comparable to model output or readily interpretable in terms of modelled climate variables. Data sets must contain sufficient documentation to distinguish between raw (primary) and interpreted (secondary, tertiary) data, to evaluate the assumptions involved in interpretation of the data, to exercise quality control, and to select data appropriate for specific goals. Four data bases for the Late Quaternary, documenting changes in lake levels since 30 kyr BP (the Global Lake Status Data Base), vegetation distribution at 18 kyr and 6 kyr BP (BIOME 6000), aeolian accumulation rates during the last glacial-interglacial cycle (DIRTMAP), and tropical terrestrial climates at the Last Glacial Maximum (the LGM Tropical Terrestrial Data Synthesis) are summarised. Each has been used to evaluate simulations of Last Glacial Maximum (LGM: 21 calendar kyr BP) and/or mid-Holocene (6 cal. kyr BP) environments. Comparisons have demonstrated that changes in radiative forcing and orography due to orbital and ice-sheet variations explain the first-order, broad-scale (in space and time) features of global climate change since the LGM. However, atmospheric models forced by 6 cal. kyr BP orbital changes with unchanged surface conditions fail to capture quantitative aspects of the observed climate, including the greatly increased magnitude and northward shift of the African monsoon during the early to mid-Holocene. Similarly, comparisons with palaeoenvironmental datasets show that atmospheric models have underestimated the magnitude of cooling and drying of much of the land surface at the LGM. 
The inclusion of feedbacks due to changes in ocean- and land-surface conditions at both times, and atmospheric dust loading at the LGM, appears to be required in order to produce a better simulation of these past climates. The development of Earth system models incorporating the dynamic interactions among ocean, atmosphere, and vegetation is therefore mandated by Quaternary science results as well as climatological principles. For greatest scientific benefit, this development must be paralleled by continued advances in palaeodata analysis and synthesis, which in turn will help to define questions that call for new focused data collection efforts.

Relevance: 80.00%

Publisher:

Abstract:

A new method to measure the epicycle frequency kappa in the Galactic disc is presented. We make use of the large data base on open clusters completed by our group to derive the observed velocity vector (amplitude and direction) of the clusters in the Galactic plane. In the epicycle approximation, this velocity is equal to the circular velocity given by the rotation curve, plus a residual or perturbation velocity whose direction rotates as a function of time with the frequency kappa. Due to the non-random direction of the perturbation velocity at the birth time of the clusters, a plot of the present-day direction angle of this velocity as a function of the age of the clusters reveals systematic trends from which the epicycle frequency can be obtained. Our analysis considers that the Galactic potential is mainly axisymmetric, or in other words, that the effect of the spiral arms on the Galactic orbits is small; in this sense, our results do not depend on any specific model of the spiral structure. The values of kappa that we obtain provide constraints on the Galactic rotation velocity; in particular, V(0) is found to be 230 +/- 15 km s(-1) even if the short distance scale (R(0) = 7.5 kpc) of the Galaxy is adopted. The measured kappa at the solar radius is 43 +/- 5 km s(-1) kpc(-1). The distribution of initial velocities of open clusters is discussed.
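As a quick consistency check on the quoted numbers (our own aside, not part of the paper): for a flat rotation curve the epicycle frequency reduces to the standard expression kappa = sqrt(2) V0 / R0, and the quoted V(0) = 230 km/s with R(0) = 7.5 kpc indeed reproduces the measured value of about 43 km/s/kpc.

```python
import math

# Epicycle frequency for a flat rotation curve: kappa = sqrt(2) * V0 / R0.
# This is a standard dynamical result; applying it to the abstract's
# numbers is our illustrative consistency check.

def kappa_flat_curve(v0_km_s, r0_kpc):
    return math.sqrt(2.0) * v0_km_s / r0_kpc
```

kappa_flat_curve(230.0, 7.5) is about 43.4 km/s/kpc, comfortably inside the quoted 43 +/- 5 km/s/kpc interval.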

Relevance: 80.00%

Publisher:

Abstract:

In this paper, we construct a dynamic portrait of the inner asteroidal belt. We use information about the distribution of test particles, which were initially placed on a perfectly rectangular grid of initial conditions, after 4.2 Myr of gravitational interactions with the Sun and five planets, from Mars to Neptune. Using the spectral analysis method introduced by Michtchenko et al., the asteroidal behaviour is illustrated in detail on the dynamical, averaged and frequency maps. On the averaged and frequency maps, we superpose information on the proper elements and proper frequencies of real objects, extracted from the data base, AstDyS, constructed by Milani and Knezevic. A comparison of the maps with the distribution of real objects allows us to detect possible dynamical mechanisms acting in the domain under study; these mechanisms are related to mean-motion and secular resonances. We note that the two- and three-body mean-motion resonances and the secular resonances (strong linear and weaker non-linear) have an important role in the diffusive transportation of the objects. Their long-lasting action, overlaid with the Yarkovsky effect, may explain many observed features of the density, size and taxonomic distributions of the asteroids.
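The spectral analysis method referred to above classifies orbital behaviour by the structure of the power spectrum of an orbital-element time series: regular orbits show a few well-defined lines, while chaotic orbits produce a dense forest of peaks. The following toy peak counter is our own simplification of that idea (a naive DFT), not the authors' code:

```python
import cmath
import math

def spectral_number(series, rel_threshold=0.05):
    """Count spectral peaks above a fraction of the strongest peak."""
    n = len(series)
    mean = sum(series) / n
    centred = [x - mean for x in series]
    power = []
    for k in range(1, n // 2):  # naive O(n^2) DFT; fine for a toy series
        s = sum(centred[j] * cmath.exp(-2j * math.pi * k * j / n)
                for j in range(n))
        power.append(abs(s) ** 2)
    peak = max(power)
    return sum(1 for p in power if p > rel_threshold * peak)
```

A single sampled sinusoid yields a spectral number of 1; each added frequency raises the count, and a chaotic signal would push it far higher, which is what the dynamical maps visualise across the grid of initial conditions.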

Relevance: 80.00%

Publisher:

Abstract:

The study shows the current need for security solutions concerning work with information in different areas. The investigations point to important solutions for printers' needs to meet the increasingly hard demands for fast and safe digital communications. Important factors analysed in the investigations are: access to different types of information, workers' authority over information, access to the data base register internally and externally, production solutions for effective fault detection, and data base solutions for orders and distribution. Planned and unplanned stops result in a standard of value in interruptions. Internal data bases are protected by so-called firewalls, watchdogs and Virtual Private Networks. Offset plates are locked in for a definite period of time and subsequently destroyed, and remaining sheets are shredded and recycled. All documentation is digital, in business control systems, which guarantees that no important documents are lying around in workplaces. Fault detection is facilitated by the ability to fully track the order numbers on incoming orders.

Relevance: 80.00%

Publisher:

Abstract:

The problem of finding best facility locations requires a complete and accurate road network with the corresponding population data for a specific area. However, the data obtained from road network databases are usually not suited to this use. In this paper we propose a procedure for converting a road network database into a road graph that can be used in localization problems. The road network data come from the National road data base in Sweden. The derived graph is cleaned and reduced to a level suitable for localization problems, and the population points are processed so that they match the graph. The reduction of the graph maintains most of the accuracy of distance measures in the network.
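One common reduction step for such road graphs (our illustration; the paper's actual procedure involves further cleaning) is to contract pass-through nodes of degree 2, summing edge lengths so that distances between the remaining nodes are preserved:

```python
# Contract degree-2 nodes that are not in `keep`, replacing the two
# incident edges by one edge of summed length. Shortest-path distances
# between kept nodes are unchanged.

def contract_degree2(edges, keep):
    """edges: iterable of (u, v, length); returns {(u, v): length}, u < v."""
    adj = {}
    for u, v, w in edges:
        adj.setdefault(u, {})[v] = w
        adj.setdefault(v, {})[u] = w
    changed = True
    while changed:
        changed = False
        for node in list(adj):
            nbrs = adj.get(node)
            if nbrs is None or node in keep or len(nbrs) != 2:
                continue
            (a, wa), (b, wb) = nbrs.items()
            if a == b:
                continue  # parallel edges; leave as-is
            w = wa + wb
            # keep the shorter connection if a direct a-b edge exists
            if b not in adj[a] or w < adj[a][b]:
                adj[a][b] = w
                adj[b][a] = w
            del adj[a][node], adj[b][node], adj[node]
            changed = True
    return {tuple(sorted((u, v))): w
            for u, nbrs in adj.items() for v, w in nbrs.items()}
```

Contracting a chain such as 1-2-3-4 (lengths 1, 2, 3) while keeping only the endpoints yields a single edge 1-4 of length 6, so distance queries for population points mapped to kept nodes are unaffected.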

Relevance: 80.00%

Publisher:

Abstract:

The world is going through a period of uncertainty about exchange rates. The objective of this work is to present a literature review on the subject, covering the models that attempt to explain how the exchange rate is formed in the short and long run. It discusses the reasons that led Brazil to abandon the exchange-rate band policy in 1999, the measures taken by the economic authorities, and their impact on the economy. In the course of this analysis, the monthly real and real effective exchange rates are calculated with base date July 1994, making it possible to visualise the phases of appreciation and depreciation of the currency. The chapter closes with a theoretical review of the impact of monetary and fiscal policy on the exchange rate under the different existing exchange-rate regimes. Finally, the work reviews the events that led to the currency crises in Mexico, Thailand, Malaysia, Indonesia, South Korea, Russia and Argentina, analysing the main macroeconomic indicators of these economies in an attempt to find common elements that help explain the causes of the crises.
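The real exchange rate series mentioned above is conventionally computed as the nominal rate multiplied by the ratio of foreign to domestic price indices, then rescaled so the base month (here July 1994) equals 100. A sketch with invented numbers, not the work's actual series:

```python
# Real exchange rate index: nominal rate times foreign/domestic price
# ratio, rescaled so the base observation equals 100.

def real_exchange_rate_index(nominal, p_foreign, p_domestic, base=0):
    rer = [e * pf / pd
           for e, pf, pd in zip(nominal, p_foreign, p_domestic)]
    return [100.0 * r / rer[base] for r in rer]
```

Under this convention, an index above 100 indicates real depreciation relative to the base month, and below 100 real appreciation, which is how the appreciation and depreciation phases can be read off the series.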

Relevance: 80.00%

Publisher:

Abstract:

With market globalisation and the high level of competitiveness in the education sector, organisations must be agile and competent in order to survive. In this context, efficient resource management and the production of accurate information to support decision-making depend largely on a cost information system. Such a system must be based on a costing method that provides information to meet the distinct needs of managers at different hierarchical levels and in different areas. This work studies a costing methodology applicable to a private Higher Education Institution (HEI) that addresses three perspectives: providing information to support pricing, to support the decision-making process, and for expenditure planning and control. The starting point was a bibliographic survey of the state of the art on the subject. Through a case study, the cost-information needs of the HEI's managers were identified by means of qualitative research. These needs were then cross-referenced with the existing costing methods, which allowed the most suitable method for the HEI to be identified. At this stage it was possible to compare theory and practice: the proposed method was compared with the method currently adopted by the HEI, which revealed the deficiencies of the current model and their causes. On this basis, a more suitable system is proposed to support decision-making, with the aim of improving the institution's performance. The results obtained demonstrate that the objective was met: considering the managers' cost-information needs, activity-based costing is the most suitable method to support the management of the HEI.