987 results for "Information fractal dimension"
Abstract:
The Bronze to Iron Age transition in Crete, a period of state collapse and insecurity, saw the island's rugged, high-contrast topography used in striking new ways. The visual drama of many of the new site locations has stimulated significant research over the last hundred years, with explanation of the change as the main focus. The new sites are not monumental in character: the vast majority are settlements, and much of the information about them comes from survey. Perhaps as a result, the new site map has not been much studied from phenomenological perspectives. A focus on the visual and experiential aspects of the new landscape can offer valuable insights into social structures of this period, and illuminate social developments prefiguring the emergence of polis states in Crete by c. 700 BC. Digital reconstructive techniques, which could help to develop, share and evaluate this type of integrated study, are still under-used in this region. I highlight their potential value in addressing a regularly identified shortcoming of phenomenological approaches: their necessarily subjective emphasis.
Abstract:
This Editorial presents the focus, scope and policies of the inaugural issue of Nature Conservation, a new open-access, peer-reviewed journal bridging natural sciences, social sciences and hands-on applications in conservation management. The journal covers all aspects of nature conservation and aims particularly at facilitating better interaction between scientists and practitioners. The journal will impose no restrictions on manuscript size or the use of colour. We will use an XML-based editorial workflow and several cutting-edge innovations in publishing and information dissemination, including semantic mark-up of, and enhancements to, published text and data, and extensive cross-linking within the journal and to external sources. We believe the journal will make an important contribution to better linking science and practice, offering rapid, peer-reviewed and flexible publication for authors and unrestricted access to content.
Abstract:
Ensemble-based data assimilation is rapidly proving itself as a computationally efficient and skilful assimilation method for numerical weather prediction, and can provide a viable alternative to more established variational assimilation techniques. However, a fundamental shortcoming of ensemble techniques is that the resulting analysis increments can only span a limited subspace of the state space, whose dimension is less than the ensemble size. This limits the amount of observational information that can effectively constrain the analysis. This paper presents a data selection strategy that aims to assimilate only the observational components that matter most, and that can be used with both stochastic and deterministic ensemble filters. This avoids unnecessary computations, reduces round-off errors and minimizes the risk of importing observation bias into the analysis. When an ensemble-based assimilation technique is used to assimilate high-density observations, the data-selection procedure allows the use of larger localization domains, which may lead to a more balanced analysis. Results from the use of this data selection technique with a two-dimensional linear and a nonlinear advection model, using both in situ and remote sounding observations, are discussed.
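As a rough illustration of the data-selection idea in this abstract, the Python sketch below ranks observation components by their ensemble-projected signal-to-noise ratio and assimilates only those above a threshold in a toy stochastic EnKF update. The selection criterion, variable names and threshold are illustrative assumptions, not the procedure of the paper itself.

```python
import numpy as np

def select_and_assimilate(X, y, H, r_var, snr_min=1.0, rng=None):
    """Toy stochastic EnKF update that assimilates only 'informative' observations.

    X       : (n, m) ensemble of n-dimensional states, m members
    y       : (p,) observation vector
    H       : (p, n) linear observation operator
    r_var   : (p,) observation error variances (diagonal R)
    snr_min : keep an observation component only if its ensemble spread in
              observation space exceeds snr_min times its error std (illustrative rule)
    """
    rng = np.random.default_rng() if rng is None else rng
    n, m = X.shape
    A = X - X.mean(axis=1, keepdims=True)        # ensemble anomalies
    HA = H @ A                                   # anomalies in observation space
    keep = HA.std(axis=1, ddof=1) / np.sqrt(r_var) >= snr_min   # data selection
    if not keep.any():
        return X                                 # no informative components: skip update
    Hk, yk, rk, HAk = H[keep], y[keep], r_var[keep], HA[keep]
    PHT = A @ HAk.T / (m - 1)                    # cross covariance P H^T
    S = HAk @ HAk.T / (m - 1) + np.diag(rk)      # innovation covariance
    K = PHT @ np.linalg.inv(S)                   # Kalman gain from ensemble statistics
    Y = yk[:, None] + rng.normal(0.0, np.sqrt(rk)[:, None], size=(keep.sum(), m))
    return X + K @ (Y - Hk @ X)                  # perturbed-observation update
```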
Abstract:
Approximate Bayesian computation (ABC) methods make use of comparisons between simulated and observed summary statistics to overcome the problem of computationally intractable likelihood functions. As the practical implementation of ABC requires computations based on vectors of summary statistics rather than full data sets, a central question is how to derive low-dimensional summary statistics from the observed data with minimal loss of information. In this article we provide a comprehensive review and comparison of the performance of the principal methods of dimension reduction proposed in the ABC literature. The methods are split into three classes, which are not mutually exclusive: best subset selection methods, projection techniques and regularization. In addition, we introduce two new methods of dimension reduction. The first is a best subset selection method based on Akaike and Bayesian information criteria, and the second uses ridge regression as a regularization procedure. We illustrate the performance of these dimension reduction techniques through the analysis of three challenging models and data sets.
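The ridge-regression variant mentioned in this abstract can be sketched as a regression of simulated parameters on simulated summary statistics, whose fitted linear map then projects any summary vector down to one dimension per parameter. The following is a minimal numpy sketch under that assumption; the penalty value and variable names are illustrative, not taken from the article.

```python
import numpy as np

def ridge_projection(S_sim, theta_sim, alpha=1.0):
    """Build a dimension-reducing projection for ABC summary statistics.

    S_sim     : (N, k) summary statistics from N pilot simulations
    theta_sim : (N, d) parameter values used to generate those simulations
    alpha     : ridge penalty
    Returns a function mapping (n, k) summaries to (n, d) low-dimensional summaries.
    """
    S_mean = S_sim.mean(axis=0)
    Sc = S_sim - S_mean
    k = Sc.shape[1]
    # closed-form ridge solution: B = (S'S + alpha*I)^(-1) S' theta
    B = np.linalg.solve(Sc.T @ Sc + alpha * np.eye(k), Sc.T @ theta_sim)
    return lambda S: (np.atleast_2d(S) - S_mean) @ B

# Usage: fit the projection on pilot simulations, then run ABC rejection on
# distances between the projected observed and simulated summaries.
```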
Abstract:
Astronomy has evolved almost exclusively through the use of spectroscopic and imaging techniques, operated separately. With the development of modern technologies, it is possible to obtain data cubes in which both techniques are combined simultaneously, producing images with spectral resolution. Extracting information from them can be quite complex, and hence the development of new methods of data analysis is desirable. We present a method of analysis of data cubes (data from single-field observations, containing two spatial dimensions and one spectral dimension) that uses Principal Component Analysis (PCA) to express the data in a form of reduced dimensionality, facilitating efficient information extraction from very large data sets. PCA transforms the system of correlated coordinates into a system of uncorrelated coordinates ordered by principal components of decreasing variance. The new coordinates are referred to as eigenvectors, and the projections of the data on to these coordinates produce images we will call tomograms. The association of the tomograms (images) with the eigenvectors (spectra) is important for the interpretation of both. The eigenvectors are mutually orthogonal, and this property is fundamental for their handling and interpretation. When the data cube shows objects that present uncorrelated physical phenomena, the eigenvectors' orthogonality may be instrumental in separating and identifying them. By handling eigenvectors and tomograms, one can enhance features, extract noise, compress data, extract spectra, etc. We applied the method, for illustration purposes only, to the central region of the low-ionization nuclear emission region (LINER) galaxy NGC 4736, and demonstrate that it has a type 1 active nucleus, not known before. Furthermore, we show that it is displaced from the centre of its stellar bulge.
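A minimal numpy sketch of the PCA step described here, with an assumed cube layout of (ny, nx, nlam): each spatial pixel is treated as one observation of a spectrum, the principal components of the mean-subtracted data give the eigenspectra, and the projections reshaped back to the spatial grid give the tomograms. Array names and the cube orientation are assumptions for illustration.

```python
import numpy as np

def pca_tomography(cube, n_components=5):
    """PCA of a data cube of shape (ny, nx, nlam): returns tomograms (images),
    eigenspectra (eigenvectors) and the fraction of variance each explains."""
    ny, nx, nlam = cube.shape
    X = cube.reshape(ny * nx, nlam)              # one spectrum per spatial pixel
    Xc = X - X.mean(axis=0)                      # subtract the mean spectrum
    cov = Xc.T @ Xc / (Xc.shape[0] - 1)          # (nlam, nlam) covariance matrix
    evals, evecs = np.linalg.eigh(cov)           # eigenvalues in ascending order
    order = np.argsort(evals)[::-1][:n_components]
    eigenspectra = evecs[:, order]               # mutually orthogonal eigenvectors
    tomograms = (Xc @ eigenspectra).reshape(ny, nx, n_components)
    explained = evals[order] / evals.sum()
    return tomograms, eigenspectra, explained
```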
Abstract:
This paper evaluates how information asymmetry affects the strength of competition in credit markets. A theory is presented in which adverse selection softens competition by decreasing the incentives creditors have for competing in the interest rate dimension. In equilibrium, although creditors compete, the outcome is similar to collusion. Three empirical implications arise. First, interest rates should respond asymmetrically to changes in the cost of funds: increases in the cost of funds should, on average, have a larger effect on interest rates than decreases. Second, aggressiveness in pricing should be associated with a worsening in bank-level default rates. Third, bank-level default rates should be endogenous. We then verify the validity of these three empirical implications using Brazilian data on consumer overdraft loans. The results in this paper rationalize seemingly abnormally high interest rates on unsecured loans.
Abstract:
Statement of problem. An increase in occlusal vertical dimension (OVD) may occur after processing complete dentures. Although many factors that generate this change are known, no information is available in the dental literature regarding the effect that the occlusal scheme may have on the change in OVD. Purpose. This in vitro study compared the increase in OVD, after processing, between complete dentures with teeth arranged in lingualized balanced occlusion and conventional balanced occlusion. Material and methods. Thirty sets of complete dentures were evaluated as follows: 15 sets of complete dentures were arranged in conventional balanced occlusion (control) and 15 sets of complete dentures were arranged in lingualized balanced occlusion. All dentures were compression molded with a long polymerization cycle. The occlusal vertical dimension was measured with a micrometer (mm) before and after processing each set of dentures. Data were analyzed using an independent t test (alpha=.05). Results. The mean increase in the OVD, after processing, was 0.87 ± 0.21 mm for the control group and 0.90 ± 0.27 mm for the experimental group. There was no significant difference between the groups. Conclusion. After processing, dentures set in lingualized balanced occlusion showed an increase in OVD similar to those set in conventional balanced occlusion.
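For reference, the comparison reported above corresponds to a standard two-sample (independent) t test at alpha = .05. The sketch below uses simulated placeholder measurements drawn from the reported group means and standard deviations, not the study's raw data.

```python
import numpy as np
from scipy import stats

# Placeholder per-denture OVD increases (mm), 15 per group, drawn from the
# reported means/SDs (0.87 +/- 0.21 conventional, 0.90 +/- 0.27 lingualized).
rng = np.random.default_rng(0)
conventional = rng.normal(0.87, 0.21, size=15)
lingualized = rng.normal(0.90, 0.27, size=15)

t_stat, p_value = stats.ttest_ind(conventional, lingualized)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")   # not significant if p >= .05
```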
Abstract:
This is the first paper in a two-part series devoted to studying the Hausdorff dimension of invariant sets of non-uniformly hyperbolic, non-conformal maps. Here we consider a general abstract model, that we call piecewise smooth maps with holes. We show that the Hausdorff dimension of the repeller is strictly less than the dimension of the ambient manifold. Our approach also provides information on escape rates and dynamical dimension of the repeller.
Abstract:
Multifractal analysis is now increasingly used to characterize soil properties, as it may provide more information than a single fractal model. During the building of a large reservoir on the Paraná River (Brazil), a highly weathered soil profile was excavated to a depth between 5 and 8 m. Excavation resulted in an abandoned area with saprolite materials and, in this area, an experimental field was established to assess the effectiveness of different soil rehabilitation treatments. The experimental design consisted of randomized blocks. The aim of this work was to characterize particle-size distributions of the saprolite material and use the information obtained to assess between-block variability. Particle-size distributions of the experimental plots were characterized by multifractal techniques. Ninety-six soil samples were analyzed routinely for particle-size distribution by laser diffractometry over a range of scales varying from 0.390 to 2000 μm. Six different textural classes (USDA) were identified, with a clay content ranging from 16.9% to 58.4%. Multifractal models described reasonably well the scaling properties of particle-size distributions of the saprolite material. This material exhibits a high entropy dimension, D1. Parameters derived from the left side (q > 0) of the f(α) spectra, namely D1, the correlation dimension (D2) and the range (α0 - αq+), as well as the total width of the spectra (αmax - αmin), all showed dependence on the clay content. Sand, silt and clay contents were significantly different among treatments as a consequence of intrinsic soil variability. D1 and the Hölder exponent of order zero, α0, were not significantly different between treatments; in contrast, D2 and several fractal attributes describing the width of the f(α) spectra were significantly different between treatments. The only parameter showing significant differences between sampling depths was (α0 - αq+). Scale-independent fractal attributes may be useful for characterizing intrinsic particle-size distribution variability.
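As a rough illustration of how generalized dimensions such as D1 and D2 above are obtained by the method of moments, the sketch below estimates Dq from the scaling of the partition function of a measure over dyadic bins. The bin structure, scale range and example measure are illustrative assumptions and do not reproduce the laser-diffraction procedure of the study.

```python
import numpy as np

def generalized_dimension(p_finest, q, n_levels=6):
    """Estimate D_q for a 1-D measure given on 2**n_levels bins at the finest scale.

    q = 1 gives the entropy dimension D1, q = 2 the correlation dimension D2.
    """
    p = np.asarray(p_finest, dtype=float)
    log_eps, log_chi = [], []
    for level in range(n_levels, 0, -1):
        eps = 2.0 ** (-level)                          # box size at this level
        nz = p[p > 0]
        if np.isclose(q, 1.0):
            log_chi.append(np.sum(nz * np.log(nz)))    # sum p*log(p) for q = 1
        else:
            log_chi.append(np.log(np.sum(nz ** q)))    # log of partition function
        log_eps.append(np.log(eps))
        p = p.reshape(-1, 2).sum(axis=1)               # coarsen: merge adjacent bins
    slope = np.polyfit(log_eps, log_chi, 1)[0]         # scaling exponent
    return slope if np.isclose(q, 1.0) else slope / (q - 1.0)

# Example on an arbitrary 64-bin measure (illustrative, not soil data):
rng = np.random.default_rng(1)
w = rng.lognormal(size=64)
w /= w.sum()
print(generalized_dimension(w, 1.0), generalized_dimension(w, 2.0))
```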
Abstract:
Patients suffering from cystic fibrosis (CF) show thick secretions, mucus plugging and bronchiectasis in bronchial and alveolar ducts. This results in substantial structural changes of the airway morphology and heterogeneous ventilation. Disease progression and treatment effects are monitored by so-called gas washout tests, where the change in concentration of an inert gas is measured over a single or multiple breaths. The test result, derived from the profile of the measured concentration, is a marker for the severity of the ventilation inhomogeneity and is strongly affected by the airway morphology. However, it is hard to localize underlying obstructions to specific parts of the airways, especially if they occur in the lung periphery. In order to support the analysis of lung function tests (e.g. multi-breath washout), we developed a numerical model of the entire airway tree, coupling a lumped parameter model for the lung ventilation with a 4th-order accurate finite difference model of a 1D advection-diffusion equation for the transport of an inert gas. The boundary conditions for the flow problem comprise the pressure and flow profile at the mouth, which is typically known from clinical washout tests. The natural asymmetry of the lung morphology is approximated by a generic, fractal, asymmetric branching scheme which we applied to the conducting airways. A conducting airway ends when its dimension falls below a predefined limit. A model acinus is then connected to each terminal airway. The morphology of an acinus unit comprises a network of expandable cells. A regional, linear constitutive law describes the pressure-volume relation between the pleural gap and the acinus. The cyclic expansion (breathing) of each acinus unit depends on the resistance of the feeding airway and on the flow resistance and stiffness of the cells themselves. Special care was taken in the development of a conservative numerical scheme for the gas transport across bifurcations, handling spatially and temporally varying advective and diffusive fluxes over a wide range of scales. Implicit time integration was applied to account for the numerical stiffness resulting from the discretized transport equation. Local or regional modifications of the airway dimension, resistance or tissue stiffness are introduced to mimic pathological airway restrictions typical for CF. This leads to a more heterogeneous ventilation of the model lung. As a result, the concentration in some distal parts of the lung model remains elevated for a longer duration. The inert gas concentration at the mouth towards the end of the expirations is composed of gas from regions with very different washout efficiency. This results in a steeper slope of the corresponding part of the washout profile.
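The transport core of such a model can be illustrated with a much simpler sketch than the coupled, 4th-order, implicit scheme described above: an explicit finite-difference step of the 1-D advection-diffusion equation dc/dt + u dc/dx = D d2c/dx2 in a single airway segment, with first-order upwind advection and central diffusion. Parameter values, boundary conditions and names are illustrative; no bifurcation coupling or acinus model is included.

```python
import numpy as np

def advect_diffuse_step(c, u, D, dx, dt, c_in):
    """One explicit time step of 1-D advection-diffusion on a single airway segment.

    c    : (n,) inert-gas concentration along the segment
    u    : advection velocity, assumed positive (flow from inlet to outlet)
    D    : effective diffusivity; dx, dt: grid spacing and time step
    c_in : inlet concentration (Dirichlet boundary at x = 0)
    """
    c_new = c.copy()
    adv = -u * (c[1:-1] - c[:-2]) / dx                      # first-order upwind
    dif = D * (c[2:] - 2.0 * c[1:-1] + c[:-2]) / dx**2      # central diffusion
    c_new[1:-1] = c[1:-1] + dt * (adv + dif)
    c_new[0] = c_in                                         # inlet condition
    c_new[-1] = c_new[-2]                                   # zero-gradient outlet
    return c_new

# Note: this explicit sketch is only stable for dt <= min(dx/u, dx**2 / (2*D)).
```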
Abstract:
To reach the goals established by the Institute of Medicine (IOM) and the Centers for Disease Control's (CDC) STOP TB USA, measures must be taken to curtail a future peak in Tuberculosis (TB) incidence and speed the currently stagnant rate of TB elimination. Both efforts will require, at minimum, the consideration and understanding of the third dimension of TB transmission: the location-based spread of an airborne pathogen among persons known and unknown to each other. This consideration will require an elucidation of the areas within the U.S. that have endemic TB. The Houston Tuberculosis Initiative (HTI) was a population-based active surveillance of confirmed Houston/Harris County TB cases from 1995–2004. Strengths in this dataset include the molecular characterization of laboratory confirmed cases, the collection of geographic locations (including home addresses) frequented by cases, and the HTI time period that parallels a decline in TB incidence in the United States (U.S.). The HTI dataset was used in this secondary data analysis to implement a GIS analysis of TB cases, the locations frequented by cases, and their association with risk factors associated with TB transmission.

This study reports, for the first time, the incidence of TB among the homeless in Houston, Texas. The homeless are an at-risk population for TB disease, yet they are also a population whose TB incidence has been unknown and unreported due to their non-enumeration. The first section of this dissertation identifies local areas in Houston with endemic TB disease. Many Houston TB cases who reported living in these endemic areas also share the TB risk factor of current or recent homelessness. Merging the 2004–2005 Houston enumeration of the homeless with historical HTI surveillance data of TB cases in Houston enabled this first-time report of TB risk among the homeless in Houston. The homeless were more likely to be US-born, belong to a genotypic cluster, and belong to a cluster of a larger size. The calculated average incidence among homeless persons was 411/100,000, compared to 9.5/100,000 among housed. These alarming rates are not driven by a co-infection but by social determinants. The unsheltered persons were hospitalized more days and required more follow-up time by staff than those who reported a steady housing situation. The homeless are a specific example of the increased targeting of prevention dollars that could occur if TB rates were reported for specific areas with known health disparities rather than as a generalized rate normalized over a diverse population.

It has been estimated that 27% of Houstonians use public transportation. The city layout allows bus routes to run like veins connecting even the most diverse of populations within the metropolitan area. Secondary data analysis of frequent bus use (defined as riding a route weekly) among TB cases was assessed for its relationship with known TB risk factors. The spatial distribution of genotypic clusters associated with bus use was assessed, along with the reported routes and epidemiologic links among cases belonging to the identified clusters.

TB cases who reported frequent bus use were more likely to have demographic and social risk factors associated with poverty, immune suppression and health disparities. An equal proportion of bus riders and non-bus riders were cultured for Mycobacterium tuberculosis, yet 75% of bus riders were genotypically clustered, indicating recent transmission, compared to 56% of non-bus riders (OR=2.4, 95% CI (2.0, 2.8), p<0.001). Bus riders had a mean cluster size of 50.14 vs. 28.9 (p<0.001). Second-order spatial analysis of clustered fingerprint 2 (n=122), a Beijing family cluster, revealed geographic clustering among cases based on their report of bus use. Univariate and multivariate analysis of routes reported by cases belonging to these clusters found that 10 of the 14 clusters were associated with use. Individual Metro routes, including one route servicing the local hospitals, were found to be risk factors for belonging to a cluster shown to be endemic in Houston. The routes themselves geographically connect the census tracts previously identified as having endemic TB. 78% (15/23) of Houston Metro routes investigated had one or more print groups reporting frequent use for every HTI study year. We present data on three specific but clonally related print groups and show that bus use is clustered in time by route and is the only known link between cases in one of the three prints: print 22. (Abstract shortened by UMI.)
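For readers unfamiliar with the reported statistics, an odds ratio such as OR = 2.4 with a 95% CI comes from a 2x2 comparison of clustered versus non-clustered cases among bus riders and non-riders. The sketch below shows the standard Wald calculation on hypothetical counts chosen only to match the quoted 75% and 56% proportions; they are not the HTI data and do not reproduce the published interval exactly.

```python
import numpy as np

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
       a = exposed with outcome,   b = exposed without outcome,
       c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    se_log = np.sqrt(1.0 / a + 1.0 / b + 1.0 / c + 1.0 / d)
    lo, hi = np.exp(np.log(or_) + np.array([-z, z]) * se_log)
    return or_, lo, hi

# Hypothetical counts (not the HTI data): 300 of 400 bus riders clustered,
# 560 of 1000 non-riders clustered.
print(odds_ratio_ci(300, 100, 560, 440))   # OR is approximately 2.4
```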