901 results for "Centralize density-based spatial clustering of applications with noise"
Abstract:
In this paper, the concept of available potential energy (APE) density is extended to a multicomponent Boussinesq fluid with a nonlinear equation of state. As shown by previous studies, the APE density is naturally interpreted as the work against buoyancy forces that a parcel needs to perform to move from a notional reference position, at which its buoyancy vanishes, to its actual position; because buoyancy can be defined relative to an arbitrary reference state, so can APE density. The concept of APE density is therefore best viewed as defining a class of locally defined energy quantities, each tied to a different reference state, rather than as a single energy variable. An important result, for which a new proof is given, is that the volume-integrated APE density always exceeds Lorenz's globally defined APE, except when the reference state coincides with Lorenz's adiabatically rearranged reference state of minimum potential energy. A parcel's reference position is systematically defined as a level of neutral buoyancy (LNB): depending on the nature of the fluid and on how the reference state is defined, a parcel may have one, none, or multiple LNBs within the fluid. Multiple LNBs are possible only for a multicomponent fluid whose density depends on pressure. When no LNB exists within the fluid, a parcel's reference position is assigned to the minimum or maximum geopotential height. The class of APE densities thus defined admits local and global balance equations, which all exhibit a conversion with kinetic energy, a production term by boundary buoyancy fluxes, and a dissipation term by internal diffusive effects. Different reference states alter the partition between APE production and dissipation, but affect neither the net conversion between kinetic energy and APE nor the difference between APE production and dissipation. We argue that the possibility of constructing APE-like budgets based on reference states other than Lorenz's reference state is more important than has been previously assumed, and we illustrate the feasibility of doing so in the context of idealised and realistic oceanic examples, using as reference states one with constant density and another defined as the horizontal-mean density field; in the latter case, the resulting APE density is found to be a reasonable approximation of the APE density constructed from Lorenz's reference state, while being computationally cheaper.
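The parcel interpretation above lends itself to a compact numerical illustration. The following Python sketch computes the APE density of a single parcel in the simplest possible setting, a single-component Boussinesq column with a linear equation of state and a prescribed reference density profile; the profile, constants and parcel values are invented, and the multicomponent, nonlinear-EOS machinery of the paper is deliberately left out.

```python
import numpy as np

# Minimal sketch, assuming a single-component Boussinesq column with a linear
# equation of state (the paper treats the multicomponent, nonlinear case).
g, rho0 = 9.81, 1025.0                  # gravity, Boussinesq reference density
z = np.linspace(-1000.0, 0.0, 2001)     # height (m), increasing upward
rho_ref = 1027.0 - 0.002 * z            # illustrative stably stratified reference profile

def ape_density(rho_parcel, z_parcel):
    """Work per unit mass done against buoyancy to move the parcel from its
    level of neutral buoyancy (LNB) to its actual position z_parcel."""
    # LNB: height at which the reference density equals the parcel's density.
    # np.interp needs increasing abscissae, so the profile is reversed; if no
    # LNB exists, the reference position is the top or bottom of the column.
    z_lnb = np.interp(rho_parcel, rho_ref[::-1], z[::-1], left=z[-1], right=z[0])
    zp = np.linspace(z_lnb, z_parcel, 500)                       # displacement path
    b = -g * (rho_parcel - np.interp(zp, z, rho_ref)) / rho0     # parcel buoyancy
    return -float(np.sum(0.5 * (b[1:] + b[:-1]) * np.diff(zp)))  # >= 0 by construction

print(ape_density(rho_parcel=1028.0, z_parcel=-200.0), "J kg^-1")
```

Swapping in a different rho_ref changes the resulting value, which is the sense in which the APE density defines a class of quantities, one per reference state.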
Abstract:
In recent decades, several windstorm series hit Europe, leading to large aggregated losses. Such storm series are examples of serial clustering of extreme cyclones, which presents a considerable risk for the insurance industry. Clustering of events and return periods of storm series for Germany are quantified based on potential losses using empirical models. Two reanalysis data sets and observations from German weather stations are considered for 30 winters. Histograms of events exceeding selected return levels (1-, 2- and 5-year) are derived. Return periods of historical storm series are estimated based on the Poisson and the negative binomial distributions. Over 4000 years of general circulation model (GCM) simulations forced with current climate conditions are analysed to provide a better assessment of historical return periods. Estimates differ between distributions, for example 40 to 65 years for the 1990 series. For such infrequent series, estimates obtained with the Poisson distribution clearly deviate from the empirical data. The negative binomial distribution provides better estimates, even though a sensitivity to the return level and data set is identified. The consideration of GCM data permits a strong reduction of the uncertainties. The present results support the importance of explicitly considering the clustering of losses for an adequate risk assessment in economic applications.
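The distinction between the two count models can be made concrete with a short sketch: given seasonal counts of events exceeding a return level, it compares the Poisson and a method-of-moments negative binomial estimate of the probability, and hence the return period, of a winter with at least k events. The counts, the threshold k and all numbers are invented for illustration and are not the paper's loss-based data.

```python
import numpy as np
from scipy import stats

# Illustrative seasonal counts of storms exceeding a chosen return level
# (30 winters, made up for this sketch).
counts = np.array([0, 1, 0, 3, 0, 0, 2, 0, 5, 1, 0, 0, 1, 4, 0,
                   0, 2, 0, 0, 1, 0, 6, 0, 0, 1, 0, 0, 3, 0, 0])
mean, var = counts.mean(), counts.var(ddof=1)
k = 4                                    # a winter with >= k events counts as a "series"

# Poisson: one parameter, variance forced equal to the mean.
p_pois = stats.poisson.sf(k - 1, mu=mean)

# Negative binomial by method of moments; valid when var > mean, i.e. clustering.
p_nb = np.nan
if var > mean:
    p = mean / var
    n = mean**2 / (var - mean)
    p_nb = stats.nbinom.sf(k - 1, n, p)

print(f"P(>= {k} events) Poisson: {p_pois:.4f} -> return period {1 / p_pois:.0f} winters")
print(f"P(>= {k} events) NegBin:  {p_nb:.4f} -> return period {1 / p_nb:.0f} winters")
```

With overdispersed counts the negative binomial assigns noticeably higher probability to multi-event winters, which is the behaviour the abstract reports for the less frequent series.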
Abstract:
Subspace clustering groups a set of samples drawn from a union of several linear subspaces into clusters, so that the samples in the same cluster are drawn from the same linear subspace. In the majority of existing work on subspace clustering, clusters are built based on feature information, while sample correlations in their original spatial structure are simply ignored. Moreover, the original high-dimensional feature vectors contain noisy and redundant information, and the time complexity grows exponentially with the number of dimensions. To address these issues, we propose a tensor low-rank representation (TLRR) and sparse coding-based subspace clustering method (TLRRSC) that simultaneously considers feature information and spatial structures. TLRR seeks the lowest-rank representation over the original spatial structure along all spatial directions. Sparse coding learns a dictionary in the feature space, so that each sample can be represented by a few atoms of the learned dictionary. The affinity matrix used for spectral clustering is built from the joint similarities in both the spatial and feature spaces. TLRRSC can thus capture the global structure and inherent feature information of the data, and provides robust subspace segmentation of corrupted data. Experimental results on both synthetic and real-world data sets show that TLRRSC outperforms several established state-of-the-art methods.
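The tensor low-rank and dictionary-learning steps of TLRRSC are beyond a short example, but the final stage it shares with most subspace-clustering methods, building an affinity matrix from a representation coefficient matrix and applying spectral clustering, can be sketched as follows. The ridge-regularised self-representation below is only a stand-in for the paper's TLRR and sparse-coding steps, and the data are synthetic.

```python
import numpy as np
from sklearn.cluster import SpectralClustering

rng = np.random.default_rng(0)

def subspace_points(dim, ambient, n):
    """n points drawn from a random dim-dimensional linear subspace."""
    return rng.normal(size=(n, dim)) @ rng.normal(size=(dim, ambient))

X = np.vstack([subspace_points(3, 50, 40) for _ in range(3)])   # 3 subspaces, 120 samples
X = X / np.linalg.norm(X, axis=1, keepdims=True)

# Self-representation X ~ Z X with a ridge penalty (stand-in for low-rank/sparse terms).
G = X @ X.T
Z = np.linalg.solve(G + 0.1 * np.eye(len(X)), G)
np.fill_diagonal(Z, 0.0)

W = np.abs(Z) + np.abs(Z).T                                      # symmetric affinity matrix
labels = SpectralClustering(n_clusters=3, affinity="precomputed",
                            random_state=0).fit_predict(W)
print(labels)
```

Replacing the ridge step with the tensor low-rank plus sparse-coding representation is what distinguishes TLRRSC from this generic pipeline.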
Abstract:
This study aimed to evaluate the influence of landforms on the spatial variability of physical attributes and their relationship with the clay mineralogy of a eutroferric Red Latosol, using geostatistical techniques. Soils were sampled at the intersection points of a grid with regular 10 m intervals, at depths of 0.0-0.2 m, 0.2-0.4 m and 0.4-0.6 m for the physical attributes and 0.6-0.8 m for the mineralogical attributes. The mean values of bulk density and soil penetration resistance are higher in compartment I, where the Ct/(Ct+Gb) ratio is relatively higher, indicating a higher kaolinite content. In compartment II, hydraulic conductivity and macroporosity are higher, probably influenced by the predominance of gibbsite. It is therefore concluded that identifying landform units is very effective for understanding the spatial variability of soil properties, since variations in landscape shape produce distinct patterns of spatial variability in the soil's physical and mineralogical properties.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Several studies suggest that, on a large scale, relief conditions influence the Atlantic Forest cover. The aim of this work was to explore these relationships on a local scale, in Caucaia do Alto, on the Ibiúna Plateau. Within an area of about 78 km², the distribution of forest cover, divided into two successional stages, was associated with relief attribute data (slope, slope orientation and altitude). The vegetation mapping was based on the interpretation of stereoscopic pairs of aerial photographs from April 2000, at a scale of 1:10,000, while the relief attributes were obtained by geoprocessing from digitized topographic maps at a scale of 1:10,000. Statistical analyses based on chi-square tests revealed more extensive forest cover, irrespective of successional stage, in steeper areas (>10 degrees) located at higher altitudes (>923 m), but no influence of slope orientation. There was no sign of a direct influence of relief on forest cover through environmental gradients that might have contributed to forest regeneration. Likewise, there was no evidence that these results could have been influenced by the distance from roads or urban areas or with respect to permanent preservation areas. Relief seems to influence forest cover indirectly, since agricultural land use preferentially occupies flatter and lower areas. These results suggest a general distribution pattern of forest remnants, independent of the scale of study, on which relief has a strong indirect influence, since it determines human occupation.
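A hedged illustration of the chi-square testing behind these results is given below, using an invented contingency table of forest presence against a single slope class; the study's actual tables relate cover classes to slope, slope orientation and altitude.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Invented cell counts (map units): rows = slope class, columns = cover.
table = np.array([[420, 180],    # slope > 10 degrees: forested, not forested
                  [310, 390]])   # slope <= 10 degrees: forested, not forested

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2e}")
```

A small p-value indicates that forest cover is not independent of the slope class, which is the form of association reported above for slope and altitude but not for slope orientation.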
Abstract:
A simple method is presented for designing a digital state-derivative feedback gain and a feedforward gain such that the control law is equivalent to a known and adequate state feedback and feedforward control law of a digitally redesigned system. It is assumed that the plant is a linear, controllable, time-invariant, single-input (SI) or multiple-input (MI) system. This procedure allows the use of well-known continuous-time state feedback design methods to directly design discrete-time state-derivative feedback control systems. State-derivative feedback can be useful, for instance, in the vibration control of mechanical systems, where the main sensors are accelerometers. An example considering the digital redesign of a helicopter with state-derivative feedback illustrates the proposed method. © 2009 IEEE.
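The digital redesign itself is specific to the paper, but the underlying equivalence between state feedback and state-derivative feedback is easy to sketch in continuous time: if u = -Kx and A is invertible, then x = A^{-1}(xdot - Bu) gives the equivalent derivative-feedback gain Kd = (I - K A^{-1} B)^{-1} K A^{-1}. The plant, pole locations and use of scipy's place_poles below are illustrative assumptions, not the paper's helicopter example or its discrete-time formulas.

```python
import numpy as np
from scipy.signal import place_poles

A = np.array([[0.0, 1.0],
              [-2.0, -0.5]])              # illustrative controllable SI plant
B = np.array([[0.0],
              [1.0]])

K = place_poles(A, B, np.array([-2.0, -3.0])).gain_matrix   # known state-feedback gain
Ainv = np.linalg.inv(A)
Kd = np.linalg.solve(np.eye(1) - K @ Ainv @ B, K @ Ainv)    # equivalent derivative gain

# Both closed loops should have the same eigenvalues (-2 and -3).
A_sf = A - B @ K                                            # u = -K x
A_sdf = np.linalg.inv(np.eye(2) + B @ Kd) @ A               # u = -Kd xdot
print(np.linalg.eigvals(A_sf), np.linalg.eigvals(A_sdf))
```

The paper carries a comparable idea over to the discretized plant, which is what makes derivative (accelerometer-based) measurements directly usable in the digital control law.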
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
This study was undertaken in a 1566 ha drainage basin situated in an area with cuesta relief in the state of São Paulo, Brazil. The objectives were: 1) to map the maximum potential soil water retention capacity, and 2) to simulate the depth of surface runoff at each geographical position in the area based on a typical rainfall event. The database required for this research was built in the ArcInfo v.10.1 geographic information system. Undisturbed soil samples were collected at 69 points. The ordinary kriging method was used to interpolate the values of soil density and maximum potential soil water retention capacity. The spherical model gave the better fit to the semivariograms of the two soil attributes for the 0 to 20 cm depth, while the Gaussian model better fitted the spatial behavior of the two variables for the 20 to 40 cm depth. The simulation of the spatial distribution revealed a gradual increase in the depth of surface runoff for the rainfall event taken as an example (25 mm) from the reverse slope to the peripheral depression of the cuesta (from west to east). This gradient has a positive aspect, since the steepest sites, especially those at the front of the cuesta, are closer to the western boundary of the watershed, where the lowest runoff depths occur. This behavior, in conjunction with certain erodibility values and depending on land use and cover, can help mitigate soil erosion processes in these areas.
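A hedged sketch of the interpolation step, ordinary kriging with a spherical semivariogram (the combination that fitted the 0 to 20 cm layer best), is given below; the 69 sample locations, the attribute values and the variogram parameters (nugget, sill, range) are all invented.

```python
import numpy as np

rng = np.random.default_rng(1)
xy = rng.uniform(0, 1000, size=(69, 2))                          # 69 sampling points (m)
obs = 1.2 + 0.0003 * xy[:, 0] + rng.normal(scale=0.05, size=69)  # illustrative soil attribute

def spherical(h, nugget=0.002, sill=0.01, a=600.0):
    """Spherical semivariogram gamma(h) with range a."""
    h = np.asarray(h, dtype=float)
    g = nugget + (sill - nugget) * (1.5 * h / a - 0.5 * (h / a) ** 3)
    g = np.where(h < a, g, sill)
    return np.where(h > 0, g, 0.0)                               # gamma(0) = 0 by definition

def ordinary_kriging(x0):
    """Kriged estimate at location x0 using all 69 observations."""
    n = len(xy)
    D = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    d = np.linalg.norm(xy - x0, axis=1)
    # Kriging system with a Lagrange multiplier enforcing unbiased weights.
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = spherical(D)
    A[n, n] = 0.0
    b = np.append(spherical(d), 1.0)
    w = np.linalg.solve(A, b)
    return w[:n] @ obs

print(ordinary_kriging(np.array([500.0, 500.0])))
```

In practice the variogram parameters would first be fitted to the experimental semivariogram of the samples, which is the model-selection step the abstract refers to when comparing spherical and Gaussian fits.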
Abstract:
As part of this work, a novel route to a wide variety of polymer structures based on the clinically approved polymer poly(N-(2-hydroxypropyl)methacrylamide) (PHPMA) was developed. The synthetic approach relies on the use of reactive-ester polymers on the one hand and on the reversible addition-fragmentation chain transfer (RAFT) polymerisation method on the other. This form of controlled radical polymerisation made it possible to prepare not only better-defined homopolymers but also statistical and block copolymers. The reactive-ester polymers can be converted into HPMA-based systems by simple aminolysis and can therefore be regarded as a promising basis for the synthesis of extensive polymer libraries. The polymers prepared combine different functionalities at a constant degree of polymerisation, which allows optimisation for a targeted application without changing the chain-length parameter. Furthermore, the RAFT polymerisation made it possible to prepare partially biodegradable block copolymers based on polylactides and HPMA by coupling a chain-transfer agent (CTA) to a well-defined polylactide homopolymer. The composition of these structures was varied; they were functionalised with targeting moieties (folates) and labelling elements (fluorescent dyes and positron-emitting radionuclides) and subsequently evaluated in vitro and in vivo. These achievements made it possible to investigate the influence of the polymer microstructure on the aggregation behaviour by means of light scattering and fluorescence correlation spectroscopy. It was shown that only this information about superstructure formation can explain the kinetics of cellular uptake, demonstrating the important role of structure-activity relationships. Thus, in addition to the synthesis, characterisation and initial biological evaluation, this work contributes to a better understanding of the interaction of polymeric particles with biological systems.
Abstract:
Introduction: Advances in biotechnology have shed light on many biological processes. In biological networks, nodes are used to represent the function of individual entities within a system and have historically been studied in isolation. Network structure adds edges that enable communication between nodes. An emerging field is to combine node function and network structure to yield network function. One of the most complex networks known in biology is the neural network within the brain. Modeling neural function will require an understanding of networks, dynamics, and neurophysiology. It is with this work that modeling techniques will be developed to work at this complex intersection. Methods: Spatial game theory was developed by Nowak in the context of modeling evolutionary dynamics, or the way in which species evolve over time. Spatial game theory offers a two-dimensional view of analyzing the state of neighbors and updating based on the surroundings. Our work builds upon this foundation by studying evolutionary game theory networks with respect to neural networks. The novel concept is that neurons may adopt a particular strategy that will allow propagation of information. The strategy may therefore act as the mechanism for gating. Furthermore, the strategy of a neuron, as in a real brain, is impacted by the strategy of its neighbors. The techniques of spatial game theory already established by Nowak are repeated to explain two basic cases and validate the implementation of code. Two novel modifications are introduced in Chapters 3 and 4 that build on this network and may reflect neural networks. Results: The introduction of two novel modifications, mutation and rewiring, in large parametric studies resulted in dynamics that had an intermediate number of nodes firing at any given time. Further, even small mutation rates result in different dynamics more representative of the ideal state hypothesized. Conclusions: In both modifications to Nowak's model, the results demonstrate the network does not become locked into a particular global state of passing all information or blocking all information. It is hypothesized that normal brain function occurs within this intermediate range and that a number of diseases are the result of moving outside of this range.
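The baseline that the two modifications build on, Nowak's spatial prisoner's dilemma in which each lattice site plays its eight neighbours and then imitates the highest-scoring strategy in its neighbourhood, can be sketched as follows. The lattice size, temptation payoff b and initial cooperator fraction are illustrative, and the mutation and rewiring modifications of Chapters 3 and 4 are not included.

```python
import numpy as np

rng = np.random.default_rng(2)
N, b, steps = 50, 1.65, 50
coop = rng.random((N, N)) < 0.9              # True = cooperator, False = defector

def shifted(a):
    """Stack of the grid shifted to all 9 offsets (8 neighbours plus the site itself)."""
    shifts = [(i, j) for i in (-1, 0, 1) for j in (-1, 0, 1)]
    return np.stack([np.roll(np.roll(a, i, 0), j, 1) for i, j in shifts])

for _ in range(steps):
    c = coop.astype(float)
    n_coop_neigh = shifted(c).sum(0) - c                 # cooperating neighbours of each site
    # Payoffs: a cooperator earns 1 per cooperating neighbour, a defector earns b.
    payoff = np.where(coop, n_coop_neigh, b * n_coop_neigh)
    best = shifted(payoff).argmax(0)                     # best-scoring site in the neighbourhood
    coop = np.take_along_axis(shifted(c), best[None], 0)[0].astype(bool)

print(f"cooperator fraction after {steps} steps: {coop.mean():.2f}")
```

In the thesis's neural reading, the two strategies play the role of passing or blocking information, and the added mutation and rewiring are what keep the lattice away from the all-pass or all-block extremes.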
Abstract:
Tropical wetlands are estimated to represent about 50% of the natural wetland methane (CH4) emissions and explain a large fraction of the observed CH4 variability on timescales ranging from glacial–interglacial cycles to the currently observed year-to-year variability. Despite their importance, however, tropical wetlands are poorly represented in global models aiming to predict global CH4 emissions. This publication documents a first step in the development of a process-based model of CH4 emissions from tropical floodplains for global applications. For this purpose, the LPX-Bern Dynamic Global Vegetation Model (LPX hereafter) was slightly modified to represent floodplain hydrology, vegetation and associated CH4 emissions. The extent of tropical floodplains was prescribed using output from the spatially explicit hydrology model PCR-GLOBWB. We introduced new plant functional types (PFTs) that explicitly represent floodplain vegetation. The PFT parameterizations were evaluated against available remote-sensing data sets (GLC2000 land cover and MODIS Net Primary Productivity). Simulated CH4 flux densities were evaluated against field observations and regional flux inventories. Simulated CH4 emissions at Amazon Basin scale were compared to model simulations performed in the WETCHIMP intercomparison project. We found that LPX reproduces the average magnitude of observed net CH4 flux densities for the Amazon Basin. However, the model does not reproduce the variability between sites or between years within a site. Unfortunately, site information is too limited to confirm or refute some model features. At the Amazon Basin scale, our results underline the large uncertainty in the magnitude of wetland CH4 emissions. Sensitivity analyses gave insights into the main drivers of floodplain CH4 emission and their associated uncertainties. In particular, uncertainties in floodplain extent (i.e., the difference between GLC2000 and PCR-GLOBWB output) modulate the simulated emissions by a factor of about 2. Our best estimates, using PCR-GLOBWB in combination with GLC2000, lead to simulated Amazon-integrated emissions of 44.4 ± 4.8 Tg yr⁻¹. Additionally, the LPX emissions are highly sensitive to vegetation distribution. Two simulations with the same mean PFT cover, but different spatial distributions of grasslands within the basin, modulated emissions by about 20%. Correcting the LPX-simulated NPP using MODIS reduces the Amazon emissions by 11.3%. Finally, due to an intrinsic limitation of LPX in accounting for seasonality in floodplain extent, the model failed to reproduce the full dynamics in CH4 emissions, but we proposed solutions to this issue. The interannual variability (IAV) of the emissions increases by 90% if the IAV in floodplain extent is accounted for, but still remains lower than in most of the WETCHIMP models. While our model includes more mechanisms specific to tropical floodplains, we were unable to reduce the uncertainty in the magnitude of wetland CH4 emissions of the Amazon Basin. Our results helped identify and prioritize directions towards more accurate estimates of tropical CH4 emissions, and they stress the need for more research to constrain floodplain CH4 emissions and their temporal variability, even before including other fundamental mechanisms such as floating macrophytes or lateral water fluxes.
Abstract:
There is interest in the potential of companion animal surveillance to provide data to improve pet health and to provide early warning of environmental hazards to people. We implemented a companion animal surveillance system in Calgary, Alberta and the surrounding communities. Informatics technologies automatically extracted electronic medical records from participating veterinary practices and identified cases of enteric syndrome in the warehoused records. The data were analysed using time-series analyses and a retrospective space-time permutation scan statistic. We identified a seasonal pattern of reports of occurrences of enteric syndromes in companion animals and four statistically significant clusters of enteric syndrome cases. The cases within each cluster were examined and information about the animals involved (species, age, sex), their vaccination history, possible exposure or risk behaviour history, information about disease severity, and the aetiological diagnosis was collected. We then assessed whether the cases within the cluster were unusual and if they represented an animal or public health threat. There was often insufficient information recorded in the medical record to characterize the clusters by aetiology or exposures. Space-time analysis of companion animal enteric syndrome cases found evidence of clustering. Collection of more epidemiologically relevant data would enhance the utility of practice-based companion animal surveillance.
Abstract:
Purpose: To examine the association between living in proximity to Toxics Release Inventory (TRI) facilities and the incidence of childhood cancer in the State of Texas. Design: This is a secondary data analysis utilizing the publicly available Toxics Release Inventory (TRI), maintained by the U.S. Environmental Protection Agency, which lists the facilities that release any of the 650 TRI chemicals. Total childhood cancer cases and childhood cancer rates (ages 0-14 years) by county for the years 1995-2003 were obtained from the Texas Cancer Registry, available on the Texas Department of State Health Services website. Setting: This study was limited to the child population of the State of Texas. Methods: Analysis was done using Stata version 9 and SPSS version 15.0. SaTScan was used for geographical spatial clustering of childhood cancer cases based on county centroids, using the Poisson clustering algorithm, which adjusts for population density. Pictorial maps were created using MapInfo Professional version 8.0. Results: One hundred and twenty-five counties had no TRI facilities in their region, while 129 counties had at least one TRI facility. An increasing trend in the number of facilities and total disposal was observed except for the highest category based on cancer-rate quartiles. Linear regression analysis using log-transformed number of facilities and total disposal to predict cancer rates was computed; however, neither variable was found to be a significant predictor. Seven significant geographical spatial clusters of counties with high childhood cancer rates (p<0.05) were identified. Binomial logistic regression categorizing the cancer rate into two groups (<=150 and >150) indicated an odds ratio of 1.58 (CI 1.127, 2.222) for the natural log of the number of facilities. Conclusion: We have used a unique methodology, combining GIS and spatial clustering techniques with existing statistical approaches, to examine the association between living in proximity to TRI facilities and the incidence of childhood cancer in the State of Texas. Although a concrete association was not indicated, further studies examining specific TRI chemicals are required. Use of this information can enable researchers and the public to identify potential concerns, gain a better understanding of potential risks, and work with industry and government to reduce toxic chemical use, disposal or other releases and the risks associated with them. TRI data, in conjunction with other information, can be used as a starting point in evaluating exposures and risks.
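A hedged sketch of the final regression step reported above is given below: a binomial logistic regression of a high/low county cancer-rate indicator on the natural log of the number of TRI facilities, with the odds ratio recovered as the exponential of the coefficient. The simulated county data and the use of statsmodels are illustrative assumptions; the spatial scan step (SaTScan) is not reproduced here.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n_counties = 254
facilities = rng.poisson(3, n_counties) + 1              # +1 avoids log(0) in this sketch
log_fac = np.log(facilities)

# Simulated high-rate indicator with an assumed underlying relationship.
p_high = 1.0 / (1.0 + np.exp(-(-1.0 + 0.45 * log_fac)))
high_rate = (rng.random(n_counties) < p_high).astype(float)

X = sm.add_constant(log_fac)                              # intercept + ln(facilities)
fit = sm.Logit(high_rate, X).fit(disp=0)
print("odds ratio for ln(facilities):", float(np.exp(fit.params[1])))
```

An odds ratio above 1 for ln(facilities), as in the 1.58 reported above, indicates that counties with more TRI facilities are more likely to fall into the high-rate group.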