979 results for CLASIFICACION DECIMAL GEOGRAFICA


Relevance:

20.00%

Publisher:

Abstract:

This study was carried out in the central district of the canton of Bagaces, Guanacaste, Costa Rica; its purpose was to evaluate the impact of Phase II of the Arenal-Tempisque irrigation project on the potential habitat of the white-tailed deer (Odocoileus virginianus). The results indicate that Phase II of the irrigation project will affect 4,633 hectares of habitat classified as prime for the deer. SUMMARY: The aim of this paper is to discuss and illustrate the usefulness of GIS in evaluating the impact of Phase II of the Arenal-Tempisque irrigation project on the habitat of the white-tailed deer. The study area is the central district of the canton of Bagaces in Guanacaste, Costa Rica. Once fully developed, the irrigation project will impact 4,633 hectares of prime deer habitat.

Relevance:

20.00%

Publisher:

Abstract:

As its title indicates, this article brings together the application of three technologies in current geography: Global Positioning Systems (GPS), Geographic Information Systems (GIS), and remote sensing, which is relevant both to researchers in these fields and to the general reader, scientist or not. The main purpose of the work, however, is its rigorous, methodologically and theoretically grounded historical analysis of natural resource use and ecological change in one of the countries most devastated by every kind of calamity: natural disasters, social disasters (war), and disease. These conditions make it one of the poorest countries in Central America and in the world. Southeastern Nicaragua is representative of our plundered countries: exporters of raw materials, indebted to financial institutions, and beset by every kind of internal weakness, including unemployment, low wages, unequal land distribution, the business dominance of an oligarchy, subsistence agriculture, and adverse environmental factors. As noted later in the article, "the inhabitants like to say that it rains thirteen months a year." This last factor poses a methodological difficulty, because the amount of cloud cover prevents optimal image work. Nevertheless, the study reveals the consequences of deforestation for grazing and for certain crops (monocultures and products for the export market). In conclusion, the study yields the distribution of land use, the delimitation of the agricultural frontier, and the corresponding map for 1992, elements that make it possible to gauge the considerable advance of the agricultural frontier by comparison with a 1983 map from the Instituto Nicaragüense de Estudios Territoriales (INETER). ABSTRACT: This article focuses on the application of Global Positioning Systems (GPS), Geographic Information Systems (GIS), and remote sensing to the study of the agricultural frontier of southeastern Nicaragua and the elaboration of a land use map of the area. The methodology of map production, based on LANDSAT satellite images, is explained. A report on deforestation processes and the agrarian frontier in Nicaragua within the context of Central America is also included.

Relevance:

20.00%

Publisher:

Abstract:

The Rio Verde basin, located in the Curitiba Metropolitan Region (RMC), Paraná, Brazil, is one of the water supply sources of the region and is particularly important because it supplies water to PETROBRAS's Presidente Getúlio Vargas Refinery, the largest oil complex in the state of Paraná. Its occupation, initially predominantly rural, took place chaotically, as did the later urban occupation, without any planning. This situation was reflected in disorderly land use, with significant alterations to the fluvial environments in its surroundings, and it motivated the present research, whose objective was to analyze the areas of environmental risk in the basin using geographic information systems (GIS) for environmental management purposes. For the classification and analysis of the mapping units, information layers were generated from topographic maps (1:20,000 scale, 1976), aerial photographs (1:30,000 scale, 2000), and a satellite image (2005), complemented by field updates. The results showed that the basin is exposed to environmental risks that demand mitigation measures, including the possibility of eutrophication of the dam reservoir, which could in the future lead to a collapse of the water supply to the PETROBRAS refinery.

Relevance:

20.00%

Publisher:

Abstract:

Theoretical guidelines on capital accumulation at the regional scale and the geographic transfer of value as components of uneven geographic development. An empirical approximation is attempted of the geographic transfer of value as a material force shaping capitalist development in the Limón region.

Relevance:

20.00%

Publisher:

Abstract:

This article analyzes the inclusive-marginalizing agro-export model and the growth styles of Nicaragua.

Relevance:

10.00%

Publisher:

Abstract:

Intelligent agents are an advanced technology utilized in Web Intelligence. When searching for information in a distributed Web environment, information is retrieved by multiple agents on the client side and fused on the broker side. Current information fusion techniques rely on the cooperation of agents to provide statistics; such techniques are computationally expensive and unrealistic in the real world. In this paper, we introduce a model that uses a world ontology constructed from the Dewey Decimal Classification to acquire user profiles. By searching with specific and exhaustive user profiles, information fusion no longer relies on the statistics provided by agents. The model has been successfully evaluated using the large INEX data set, simulating the distributed Web environment.
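
As a rough illustration of the acquisition step (a sketch in Python, not the paper's implementation; the DDC fragment, weights and discount factor below are assumptions), a user profile over a DDC-based world ontology can be represented as a weighted map from DDC classes to interest strength, with part of each weight propagated to ancestor classes:

# Minimal sketch of a user profile built over Dewey Decimal Classification
# (DDC) categories. The hierarchy fragment, weights and discount factor are
# illustrative assumptions, not values from the paper.
from collections import defaultdict

# hypothetical fragment of the DDC hierarchy: notation -> parent notation
DDC_PARENT = {
    "910": "900",   # Geography & travel -> History & geography
    "912": "910",   # Atlases, maps     -> Geography & travel
    "005": "000",   # Programming       -> Computer science
}

def build_profile(clicked_docs):
    """Accumulate interest weights over DDC classes, propagating a
    discounted share of each weight to ancestor classes."""
    profile = defaultdict(float)
    for ddc, weight in clicked_docs:
        node, share = ddc, weight
        while node is not None:
            profile[node] += share
            share *= 0.5                      # discount toward the root
            node = DDC_PARENT.get(node)
    return dict(profile)

# usage: documents the user engaged with, tagged by DDC class
print(build_profile([("912", 1.0), ("005", 0.4)]))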

Relevance:

10.00%

Publisher:

Abstract:

In the commercial food industry, demonstration of microbiological safety and thermal process equivalence often involves a mathematical framework that assumes log-linear inactivation kinetics and invokes concepts of decimal reduction time (DT), z values, and accumulated lethality. However, many microbes, particularly spores, exhibit inactivation kinetics that are not log linear. This has led to alternative modeling approaches, such as the biphasic and Weibull models, that relax the strong log-linear assumptions. Using a statistical framework, we developed a novel log-quadratic model, which approximates the biphasic and Weibull models and provides additional physiological interpretability. As a statistical linear model, the log-quadratic model is relatively simple to fit and straightforwardly provides confidence intervals for its fitted values. It allows a DT-like value to be derived, even from data that exhibit obvious "tailing." We also showed how existing models of non-log-linear microbial inactivation, such as the Weibull model, can fit into a statistical linear model framework that dramatically simplifies their solution. We applied the log-quadratic model to thermal inactivation data for the spore-forming bacterium Clostridium botulinum and evaluated its merits compared with those of popular previously described approaches. The log-quadratic model was used as the basis of a secondary model that can capture the dependence of microbial inactivation kinetics on temperature. This model, in turn, was linked to the spore inactivation models of Sapru et al. and Rodriguez et al., which posit different physiological states for spores within a population. We believe that the log-quadratic model provides a useful framework in which to test vitalistic and mechanistic hypotheses of inactivation by thermal and other processes.
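
For reference, the two models contrasted above can be written as follows (the notation here is ours, not quoted from the paper). The log-linear model is

\[ \log_{10} N(t) = \log_{10} N(0) - \frac{t}{D} \qquad \text{(log-linear)} \]

so the decimal reduction time D is the time required for a tenfold reduction in the surviving population N(t). The log-quadratic model relaxes this to

\[ \log_{10} N(t) = \alpha + \beta t + \gamma t^{2} \qquad \text{(log-quadratic)} \]

which remains linear in the parameters (\alpha, \beta, \gamma) and can therefore be fitted by ordinary linear regression; a D-like value can then be read off as the time at which the fitted curve falls one \log_{10} unit below its initial value.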

Relevance:

10.00%

Publisher:

Abstract:

Infectious coryza is an acute respiratory disease of domestic chickens caused by the bacterium Haemophilus paragallinarum. Exceptionally, pheasants and guinea fowl may also become ill. H. paragallinarum infects the bird via the respiratory route and, after a short incubation period of 1 to 3 days, produces a disease characterized by catarrhal inflammation of the paranasal sinuses. This picture may be associated with inflammation of the wattles, conjunctivitis, or keratitis. Cases of pneumonia and airsacculitis are less frequent but can also occur in pure infections by these haemophili. In laying hens it causes high morbidity, low or no mortality, and a substantial loss in egg production, generally between 10% and 40%. In broilers it can cause a condition described as "swollen head" and occasionally also produce septicemia and death (48). This bacterium is commonly associated with other bacterial, viral, or parasitic agents, and when this occurs the course of the disease is aggravated. Among the most common bacterial agents are the mycoplasmas and the pasteurellae. When H. paragallinarum is associated with other agents, the disease is called "complicated infectious coryza" (48). This review provides details on the classification, identification, and serotyping of the causal agent. It also summarizes the available information on new diagnostic methods and on vaccination programs to prevent the disease. Throughout the review, reference is made to avian haemophili, which for the purposes of this work are defined as gram-negative organisms isolated from birds that necessarily require growth factors in vitro. The two factors that may be required by haemophili for in vitro growth are hemin (factor X) and/or nicotinamide adenine dinucleotide (NAD, factor V).

Relevance:

10.00%

Publisher:

Abstract:

Multiplexers, as in the binary case, are very useful building blocks in the development of quaternary systems. This paper describes the use of the quaternary multiplexer (QMUX) in the implementation of a quaternary adder, subtractor and multiplier. Quaternary-coded decimal (QCD) adder/subtractor and quaternary excess-3 adder/subtractor realizations using QMUXes are also proposed.
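
A behavioral sketch of the technique in Python (the paper describes hardware realizations; the functions here are illustrative): a QMUX routes one of four data inputs according to a base-4 select digit, and a one-digit quaternary adder can be realized by letting the mux data inputs hold the row of the base-4 addition table selected by one operand.

# Behavioral sketch (not the paper's circuit) of a 4:1 quaternary
# multiplexer and a one-digit quaternary adder built from it.

def qmux(d0, d1, d2, d3, sel):
    """4:1 quaternary multiplexer: route one of four data inputs
    according to the base-4 select digit sel in {0, 1, 2, 3}."""
    return (d0, d1, d2, d3)[sel]

def quaternary_add_digit(a, b, carry_in=0):
    """One-digit quaternary adder realized with QMUXes: for each value
    of a, the mux data inputs hold the precomputed (sum, carry) pair
    from the base-4 addition table for b + carry_in."""
    s = b + carry_in
    rows = [((a_ + s) % 4, (a_ + s) // 4) for a_ in range(4)]
    digit = qmux(rows[0][0], rows[1][0], rows[2][0], rows[3][0], a)
    carry = qmux(rows[0][1], rows[1][1], rows[2][1], rows[3][1], a)
    return digit, carry

# usage: 3 + 2 in base 4 is 11 (digit 1, carry 1)
print(quaternary_add_digit(3, 2))   # -> (1, 1)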

Relevance:

10.00%

Publisher:

Abstract:

Raw data from SeaScan™ transects off Wide Bay (south Queensland) taken in August 2007 as part of a study of ecological factors influencing the distribution of spanner crabs (Ranina ranina). The dataset (comma-delimited ASCII file) comprises the following fields:
1. record number
2. date-time (GMT)
3. date-time (AEST)
4. latitude (signed decimal degrees)
5. longitude (decimal degrees)
6. speed over ground (knots)
7. depth (m)
8. seabed roughness (v)
9. hardness (v)
Indices of roughness and hardness (from the first and second echoes respectively) were obtained using a SeaScan™ 100 system (un-referenced) on board the Research Vessel Tom Marshall, with the ship's Furuno FCV 1100 echo sounder and 1 kW, 50 kHz transducer. Vessel speed was generally kept below about 14 kt (typically ~12 kt), and the echo-sounder range was set to 80 m. The data were filtered to remove errors due to data drop-out, straying beyond system depth limits (min. 10 m), or transducer interference.
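
A minimal parsing sketch for these records, assuming a headerless comma-delimited file with the nine columns in the order listed above (the file name and the choice to drop sub-10 m records are illustrative assumptions):

# Parsing sketch for the transect records described above. Column order
# follows the dataset description; the file name is hypothetical.
import csv

FIELDS = ["record", "datetime_gmt", "datetime_aest", "lat_dd", "lon_dd",
          "sog_kt", "depth_m", "roughness_v", "hardness_v"]

def read_transect(path="seascan_widebay_2007.csv"):
    records = []
    with open(path, newline="") as f:
        for row in csv.reader(f):
            rec = dict(zip(FIELDS, row))
            depth = float(rec["depth_m"])
            if depth < 10.0:          # below the stated system depth limit
                continue              # drop suspect shallow records
            rec["depth_m"] = depth
            records.append(rec)
    return records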

Relevance:

10.00%

Publisher:

Abstract:

Volatility is central to options pricing and risk management. It reflects the uncertainty of investors and the inherent instability of the economy. Time series methods are among the most widely applied scientific methods for analyzing and predicting volatility. Very frequently sampled data contain much valuable information about the different elements of volatility and may ultimately reveal the reasons for time-varying volatility. The use of such ultra-high-frequency data is common to all three essays of the dissertation. The dissertation belongs to the field of financial econometrics. The first essay uses wavelet methods to study the time-varying behavior of scaling laws and long memory in the five-minute volatility series of Nokia on the Helsinki Stock Exchange around the burst of the IT bubble. The essay is motivated by earlier findings which suggest that different scaling laws may apply to intraday time scales and to larger time scales, implying that the so-called annualized volatility depends on the data sampling frequency. The empirical results confirm the appearance of time-varying long memory and different scaling laws that, for a significant part, can be attributed to investor irrationality and to an intraday volatility periodicity called the New York effect. The findings have potentially important consequences for options pricing and risk management, which commonly assume constant memory and scaling. The second essay investigates modelling the duration between trades in stock markets. Durations convey information about investor intentions and provide an alternative view of volatility. Generalizations of standard autoregressive conditional duration (ACD) models are developed to meet needs observed in previous applications of the standard models. According to the empirical results, based on data for actively traded stocks on the New York Stock Exchange and the Helsinki Stock Exchange, the proposed generalization clearly outperforms the standard models and also performs well in comparison to another recently proposed alternative to the standard models. The distribution used to derive the generalization may also prove valuable in other areas of risk management. The third essay studies empirically the effect of decimalization on volatility and market microstructure noise. Decimalization refers to the change from fractional pricing to decimal pricing; it was carried out on the New York Stock Exchange in January 2001. The methods used here are more accurate than those in earlier studies and put more weight on market microstructure. The main result is that decimalization decreased observed volatility by reducing noise variance, especially for highly active stocks. The results help risk management and market mechanism design.
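
To make the scaling-law point concrete, a small sketch (our illustration, not the dissertation's code) of annualizing volatility estimated from five-minute returns: under the standard square-root-of-time rule the scaling exponent is 0.5, and a different exponent at intraday scales, as the first essay finds, yields a different annualized figure from the same data. The trading-calendar constants are assumptions.

# Sketch: annualizing volatility measured from five-minute returns.
# With exponent 0.5, variance grows linearly with horizon (sqrt-time
# scaling); other exponents model different scaling laws.
import math

def annualized_vol(five_min_returns, scaling_exponent=0.5):
    """Scale five-minute return volatility to an annual horizon using
    the given scaling exponent (0.5 reproduces sqrt-time scaling)."""
    n = len(five_min_returns)
    mean = sum(five_min_returns) / n
    var = sum((r - mean) ** 2 for r in five_min_returns) / (n - 1)
    # assumed calendar: 252 days x 8 hours x 12 five-minute intervals
    periods_per_year = 252 * 8 * 12
    return math.sqrt(var) * periods_per_year ** scaling_exponent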

Relevance:

10.00%

Publisher:

Abstract:

Visual acuities at the time of referral and on the day before surgery were compared in 124 patients operated on for cataract in Vaasa Central Hospital, Finland. Preoperative visual acuity and the occurrence of ocular and general disease were compared in samples of consecutive cataract extractions performed in 1982, 1985, 1990, 1995 and 2000 in two hospitals in the Vaasa region in Finland. The repeatability and standard deviation of random measurement error in visual acuity and refractive error determination in a clinical environment in cataractous, pseudophakic and healthy eyes were estimated by re-examining the visual acuity and refractive error of patients referred for cataract surgery or consultation by ophthalmic professionals. Altogether 99 eyes of 99 persons (41 cataractous, 36 pseudophakic and 22 healthy eyes) with a visual acuity range of Snellen 0.3 to 1.3 (0.52 to -0.11 logMAR) were examined. During an average waiting time of 13 months, visual acuity in the study eye decreased from 0.68 logMAR to 0.96 logMAR (from 0.2 to 0.1 in Snellen decimal values). The average decrease in vision was 0.27 logMAR per year. In the fastest quartile, the visual acuity change per year was 0.75 logMAR, and in the second fastest 0.29 logMAR; the third and fourth quartiles were virtually unaffected. From 1982 to 2000, the incidence of cataract surgery increased from 1.0 to 7.2 operations per 1000 inhabitants per year in the Vaasa region. The average preoperative visual acuity in the operated eye improved by 0.85 logMAR units (in Snellen decimal values, from 0.03 to 0.2) and in the better eye by 0.27 logMAR units (from 0.23 to 0.43) over this period. The proportion of patients profoundly visually handicapped (VA in the better eye <0.1) before the operation fell from 15% to 4%, and that of patients less profoundly visually handicapped (VA in the better eye 0.1 to <0.3) from 47% to 15%. The repeatability of visual acuity measurement, estimated as a coefficient of repeatability for all 99 eyes, was ±0.18 logMAR, and the standard deviation of measurement error was 0.06 logMAR. Eyes with the lowest visual acuity (0.3-0.45) had the largest variability, with a coefficient of repeatability of ±0.24 logMAR, and eyes with a visual acuity of 0.7 or better had the smallest, ±0.12 logMAR. The repeatability of refractive error measurement was studied in the same patient material as the repeatability of visual acuity. Differences between measurements 1 and 2 were calculated as three-dimensional vector values and spherical equivalents and expressed as coefficients of repeatability. Coefficients of repeatability for all eyes for the vertical, torsional and horizontal vectors were ±0.74 D, ±0.34 D and ±0.93 D, respectively, and for the spherical equivalent for all eyes ±0.74 D. Eyes with lower visual acuity (0.3-0.45) had larger variability in vector and spherical equivalent values (±1.14 D), but the difference between visual acuity groups was not statistically significant. The difference in the mean defocus equivalent between measurements 1 and 2 was, however, significantly greater in the lower visual acuity group. If a change of ±0.5 D (measured in defocus equivalents) is accepted as a basis for a change of spectacles for eyes with good vision, the corresponding basis for eyes in the visual acuity range 0.3-0.65 would be ±1.0 D. Differences in repeated visual acuity measurements are partly explained by errors in refractive error measurements.
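
The conversion between the two acuity scales used above is the standard relation logMAR = -log10(decimal acuity); a small sketch:

# Conversion between Snellen decimal acuity and logMAR, as used in the
# abstract above (logMAR = -log10 of the decimal acuity).
import math

def to_logmar(decimal_acuity):
    return -math.log10(decimal_acuity)

def to_decimal(logmar):
    return 10 ** (-logmar)

# usage: the reported drop from Snellen 0.2 to 0.1 while waiting
print(round(to_logmar(0.2), 2))   # 0.70
print(round(to_logmar(0.1), 2))   # 1.00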

Relevance:

10.00%

Publisher:

Abstract:

The natural frequencies of continuous systems depend on the governing partial differential equation and can be numerically estimated using the finite element method. The accuracy and convergence of the finite element method depend on the choice of basis functions. A basis function will generally perform better if it is closely linked to the problem physics. The stiffness matrix is the same for either static or dynamic loading; hence the basis function can be chosen such that it satisfies the static part of the governing differential equation. However, in the case of a rotating beam, an exact closed-form solution for the static part of the governing differential equation is not known. In this paper, we try to find an approximate solution for the static part of the governing differential equation for a uniform rotating beam. The error resulting from the approximation is minimized to generate relations between the constants assumed in the solution. This new function is used as a basis function, giving rise to shape functions that depend on the position of the element in the beam, the material and geometric properties, and the rotational speed of the beam. The results of finite element analysis with the new basis functions are verified against published literature for uniform and tapered rotating beams under different boundary conditions. Numerical results clearly show the advantage of the current approach at high rotation speeds, with a reduction of 10 to 33% in the degrees of freedom required for convergence of the first five modes to four decimal places for a uniform rotating cantilever beam.
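
For context, a standard form of the governing equation for transverse vibration w(x, t) of a uniform rotating Euler-Bernoulli beam (quoted from the general literature, not from this paper) is

\[ \frac{\partial^{2}}{\partial x^{2}}\!\left( EI\,\frac{\partial^{2} w}{\partial x^{2}} \right) - \frac{\partial}{\partial x}\!\left( T(x)\,\frac{\partial w}{\partial x} \right) + \rho A\,\frac{\partial^{2} w}{\partial t^{2}} = 0, \qquad T(x) = \int_{x}^{L} \rho A\,\Omega^{2}\,\xi \, d\xi \]

where EI is the bending stiffness, \rho A the mass per unit length, \Omega the rotation speed and T(x) the centrifugal tension. The static part, obtained by dropping the inertia term, is the equation for which no exact closed-form solution is known and which the proposed basis functions approximately satisfy.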

Relevance:

10.00%

Publisher:

Abstract:

The power of X-ray crystal structure analysis as a technique is to 'see where the atoms are'. The results are extensively used by a wide variety of research communities. However, this 'seeing where the atoms are' can give a false sense of security unless the precision of the placement of the atoms has been taken into account. Indeed, the presentation of bond distances and angles to a false precision (i.e. to too many decimal places) is commonplace. This article has three themes. Firstly, a basis for a proper representation of protein crystal structure results is detailed and demonstrated with respect to analyses of Protein Data Bank entries. The basis for establishing the precision of placement of each atom in a protein crystal structure is non-trivial. Secondly, a knowledge base harnessing such a descriptor of precision is presented. It is applied here to the case of salt bridges, i.e. ion pairs, in protein structures; this is the most fundamental place to start with such structure-precision representations, since salt bridges are one of the tenets of protein structure stability. Ion pairs also play a central role in protein oligomerization, molecular recognition of ligands and substrates, allosteric regulation, domain motion and alpha-helix capping. A new knowledge base, SBPS (Salt Bridges in Protein Structures), takes these structural precisions into account and is the first of its kind. The third theme of the article is to indicate natural extensions of the need for such a description of precision, such as those involving metalloproteins and the determination of the protonation states of ionizable amino acids. Overall, it is also noted that this work and these examples are also relevant to protein three-dimensional structure molecular graphics software.
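
As a small illustration of reporting to an appropriate precision (using the crystallographic parenthesis convention for standard uncertainties; the helper below is our sketch, not part of SBPS):

# Sketch: report a bond distance to a precision consistent with its
# standard uncertainty, e.g. 2.35(2) Angstrom rather than 2.3456.
import math

def report(value, su):
    """Round `value` so its last digit matches the magnitude of the
    standard uncertainty `su`, shown in parentheses."""
    digits = max(0, -int(math.floor(math.log10(su))))
    scaled_su = round(su * 10 ** digits)
    return f"{value:.{digits}f}({scaled_su})"

print(report(2.3456, 0.02))    # 2.35(2)
print(report(2.3456, 0.002))   # 2.346(2)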

Relevance:

10.00%

Publisher:

Abstract:

This research seeks to identify and analyze the circumstances and theoretical motivations that led, at the end of the 19th century, to the emergence of the field of study called Documentation. It presents biographical information on the principal architects of this project, the Belgian lawyers and bibliographers Paul Otlet and Henri La Fontaine, in order to broaden understanding of the social and cultural milieu in which they worked. It seeks to widen the discussion of the factors that motivated, in 1895, the proposal for the rational organization of all of humanity's intellectual production. It outlines the comprehensive and integrative vision of Documentation which, by eliminating physical barriers, drew on archival, bibliographic and museum collections for the complete registration of the subjects investigated. It discusses the use of the Dewey Decimal Classification (DDC) in the creation of the Universal Bibliographic Repertory and analyzes the process that led to the emergence of the Universal Decimal Classification (UDC). It concludes by suggesting the inclusion of this extensive and integrative vision of documentation in the theoretical framework of Information Science, and it suggests that recovering this theoretical contribution can help address the problems of managing recorded knowledge, produced and accumulated up to the present day in the most diverse formats, media and repositories.