870 results for "Exploit the images in the building"


Relevance: 100.00%

Abstract:

Bargaining is the building block of many economic interactions, ranging from bilateral to multilateral encounters and from situations in which the actors are individuals to negotiations between firms or countries. In all these settings, economists have long been intrigued by the fact that some projects, trades or agreements are not realized even though they are mutually beneficial. On the one hand, this has been explained by incomplete information. A firm may not be willing to offer a wage that is acceptable to a qualified worker, because it knows that there are also unqualified workers and cannot distinguish between the two types. This phenomenon is known as adverse selection. On the other hand, it has been argued that even with complete information, the presence of externalities may impede efficient outcomes. To see this, consider the example of climate change. If a subset of countries agrees to curb emissions, non-participant regions benefit from the signatories' efforts without incurring costs. These free-riding opportunities give rise to incentives to strategically improve one's bargaining power that work against the formation of a global agreement. This thesis is concerned with extending our understanding of both factors, adverse selection and externalities. The findings are based on empirical evidence from original laboratory experiments as well as game-theoretic modeling. On a very general note, it is demonstrated that the institutions through which agents interact matter to a large extent. Insights are provided about which institutions we should expect to perform better than others, at least in terms of aggregate welfare.

Chapters 1 and 2 focus on the problem of adverse selection. Effective operation of markets and other institutions often depends on good information-transmission properties. In terms of the example introduced above, a firm is only willing to offer high wages if it receives enough positive signals about the worker's quality during the application and wage-bargaining process. In Chapter 1, it is shown that repeated interaction coupled with time costs facilitates information transmission. By making the wage-bargaining process costly for the worker, the firm is able to obtain more accurate information about the worker's type. The cost could be pure time cost from delaying agreement or the cost of effort arising from a multi-step interviewing process. In Chapter 2, I abstract from time costs and show that communication can play a similar role. The simple fact that a worker claims to be of high quality may be informative. In Chapter 3, the focus is on a different source of inefficiency. Agents strive for bargaining power and thus may be motivated by incentives that are at odds with the socially efficient outcome. I have already mentioned the example of climate change. Other examples are coalitions within committees that are formed to secure voting power to block outcomes, or groups that commit to different technological standards although a single standard would be optimal (e.g. the format war between HD DVD and Blu-ray). It is shown that such inefficiencies are directly linked to the presence of externalities and a certain degree of irreversibility in actions.

I now discuss the three articles in more detail. In Chapter 1, Olivier Bochet and I study a simple bilateral bargaining institution that eliminates trade failures arising from incomplete information. In this setting, a buyer makes offers to a seller in order to acquire a good. Whenever an offer is rejected by the seller, the buyer may submit a further offer. Bargaining is costly, because both parties suffer a (small) time cost after any rejection. The difficulties arise because the good can be of low or high quality, and the quality of the good is known only to the seller. Indeed, without the possibility of making repeated offers, it is too risky for the buyer to offer prices that allow for trade in high-quality goods. When repeated offers are allowed, however, both types of goods trade with probability one in equilibrium. We provide an experimental test of these predictions. Buyers gather information about sellers using specific price offers, and rates of trade are high, in line with the model's qualitative predictions. We also observe a persistent over-delay before trade occurs, which reduces efficiency substantially. Possible channels for over-delay are identified in the form of two behavioral assumptions missing from the standard model, loss aversion (buyers) and haggling (sellers), which reconcile the data with the theoretical predictions.

Chapter 2 also studies adverse selection, but interaction between buyers and sellers now takes place within a market rather than in isolated pairs. Remarkably, in a market it suffices to let agents communicate in a very simple manner to mitigate trade failures. The key insight is that better-informed agents (sellers) are willing to reveal their private information truthfully, because by doing so they are able to reduce search frictions and attract more buyers. Behavior observed in the experimental sessions closely follows the theoretical predictions. As a consequence, costless and non-binding communication (cheap talk) significantly raises rates of trade and welfare. Previous experiments have documented that cheap talk alleviates inefficiencies due to asymmetric information. These findings are explained by pro-social preferences and lie aversion. I use appropriate control treatments to show that such considerations play only a minor role in our market. Instead, the experiment highlights the ability to organize markets as a new channel through which communication can facilitate trade in the presence of private information.

In Chapter 3, I theoretically explore coalition formation via multilateral bargaining under complete information. The environment studied is extremely rich in the sense that the model allows for all kinds of externalities. This is achieved by using so-called partition functions, which pin down a coalitional worth for each possible coalition in each possible coalition structure. It is found that although binding agreements can be written, efficiency is not guaranteed, because the negotiation process is inherently non-cooperative. The prospects of cooperation are shown to depend crucially on (i) the degree to which players can renegotiate and gradually build up agreements and (ii) the absence of a certain type of externalities that can loosely be described as incentives to free ride. Moreover, the willingness to concede bargaining power is identified as a novel reason for gradualism. Another key contribution of the study is that it identifies a strong connection between the Core, one of the most important concepts in cooperative game theory, and the set of environments for which efficiency is attained even without renegotiation.
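To make the Chapter 1 mechanism concrete, the toy simulation below shows how costly delay can screen a seller's private quality in a repeated-offer protocol: the buyer concedes a little after every rejection, so a low-cost (low-quality) seller trades early at a low price while a high-quality seller holds out, and every extra round destroys surplus. All parameter values and the buyer's mechanical concession rule are hypothetical simplifications for illustration, not the equilibrium analysis of the thesis.

```python
# Toy repeated-offer bargaining with one-sided private information and
# per-rejection time costs. Numbers and the buyer's concession rule are
# hypothetical; this only illustrates the screening logic described above.

DELAY = 0.25          # per-rejection time cost borne by each party
PRICE_STEP = 1.0      # buyer's concession after a rejection
COST = {"low": 2.0, "high": 6.0}     # seller's cost by quality (private info)
VALUE = {"low": 5.0, "high": 10.0}   # buyer's valuation by quality

def bargain(quality, first_offer=2.0, max_rounds=20):
    price = first_offer
    for rnd in range(1, max_rounds + 1):
        if price >= COST[quality]:          # seller accepts: trade occurs
            waste = 2 * DELAY * (rnd - 1)   # surplus destroyed by delay
            return rnd, price, VALUE[quality] - COST[quality] - waste
        price += PRICE_STEP                 # rejection -> buyer concedes
    raise RuntimeError("no trade within max_rounds")

for q in ("low", "high"):
    rnd, price, surplus = bargain(q)
    print(f"{q:>4} quality: trade in round {rnd} at price {price:.1f}, "
          f"realized surplus {surplus:.2f}")
```

The high-quality good trades only after several rejections, and the accumulated delay cost is exactly the kind of efficiency loss that the experimental "over-delay" finding amplifies.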

Relevance: 100.00%

Abstract:

We examine the impact of sellers' Property Condition Disclosure Laws on residential real estate values. A disclosure law may address the information asymmetry in housing transactions, shifting risk from buyers and brokers to sellers and raising housing prices as a result. We combine propensity score techniques from the treatment effects literature with a traditional event study approach. We assemble a unique set of economic and institutional attributes for a quarterly panel of 291 US Metropolitan Statistical Areas (MSAs) and 50 US states spanning the 21 years from 1984 to 2004, and use it to exploit the MSA-level variation in house prices. The study finds that the average seller may be able to fetch a higher price (about three to four percent) for the house if she furnishes a state-mandated seller's property condition disclosure statement to the buyer. When we compare the results from parametric and semi-parametric event analyses, we find that the semi-parametric (propensity score) analysis generates moderately larger estimated effects of the law on housing prices.
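A minimal sketch of the propensity-score idea, using synthetic data and a simple nearest-neighbour matching rule; the paper's actual covariates, panel structure and event-study design are richer, so everything below is illustrative only.

```python
# Step 1: estimate each MSA's propensity to be covered by a disclosure law
# from observable attributes. Step 2: compare price growth between treated
# MSAs and their closest untreated matches. Synthetic data, hypothetical names.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 400                                    # MSA-level observations
X = rng.normal(size=(n, 3))                # economic/institutional attributes
treated = rng.random(n) < 1 / (1 + np.exp(-(0.8 * X[:, 0] - 0.5 * X[:, 1])))
# Post-adoption price growth: a 3% "law effect" plus covariate effects + noise.
growth = 0.02 + 0.03 * treated + 0.01 * X[:, 0] + rng.normal(0, 0.02, n)

# Step 1: propensity scores from a logit of treatment on attributes.
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# Step 2: nearest-neighbour matching on the propensity score.
controls = np.where(~treated)[0]
effects = []
for i in np.where(treated)[0]:
    j = controls[np.argmin(np.abs(ps[controls] - ps[i]))]
    effects.append(growth[i] - growth[j])

print(f"matched ATT estimate: {np.mean(effects):.4f} (true effect 0.03)")
```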

Relevance: 100.00%

Abstract:

Essential biological processes are governed by organized, dynamic interactions between multiple biomolecular systems. Complexes are thus formed to enable the biological function and are disassembled as the process is completed. Examples of such processes include the translation of messenger RNA into protein by the ribosome, the folding of proteins by chaperonins, or the entry of viruses into host cells. Understanding these fundamental processes by characterizing the molecular mechanisms that enable them would allow the (better) design of therapies and drugs. Such molecular mechanisms may be revealed through the structural elucidation of the biomolecular assemblies at the core of these processes. Various experimental techniques may be applied to investigate the molecular architecture of biomolecular assemblies. High-resolution techniques, such as X-ray crystallography, may solve the atomic structure of the system, but are typically constrained to biomolecules of reduced flexibility and dimensions. In particular, X-ray crystallography requires the sample to form a three-dimensional (3D) crystal lattice, which is technically difficult, if not impossible, to obtain, especially for large, dynamic systems. Often these techniques solve the structure of the different constituent components within the assembly, but encounter difficulties when investigating the entire system. On the other hand, imaging techniques, such as cryo-electron microscopy (cryo-EM), are able to depict large systems in a near-native environment, without requiring the formation of crystals. The structures solved by cryo-EM cover a wide range of resolutions, from very low levels of detail where only the overall shape of the system is visible, to high resolutions that approach, but do not yet reach, atomic level of detail.

In this dissertation, several modeling methods are introduced to either integrate cryo-EM datasets with structural data from X-ray crystallography or to directly interpret the cryo-EM reconstruction. These computational techniques were developed with the goal of creating an atomic model for the cryo-EM data. Low-resolution reconstructions lack the level of detail to permit a direct atomic interpretation, i.e. one cannot reliably locate the atoms or amino-acid residues within the structure obtained by cryo-EM. Therefore, one needs to consider additional information, for example structural data from other sources such as X-ray crystallography, in order to enable such a high-resolution interpretation. Modeling techniques are thus developed to integrate the structural data from the different biophysical sources; examples include the work described in manuscripts I and II of this dissertation. At intermediate and high resolution, cryo-EM reconstructions depict consistent 3D folds, such as tubular features that in general correspond to alpha-helices. Such features can be annotated and later used to build the atomic model of the system; see manuscript III for an alternative approach.

Three manuscripts are presented as part of this PhD dissertation, each introducing a computational technique that facilitates the interpretation of cryo-EM reconstructions. The first manuscript is an application paper that describes a heuristic to generate the atomic model of the protein envelope of the Rift Valley fever virus. The second manuscript introduces evolutionary tabu search strategies to enable the integration of multiple component atomic structures with the cryo-EM map of their assembly. Finally, the third manuscript develops the latter technique further and applies it to annotate consistent 3D patterns in intermediate-resolution cryo-EM reconstructions.

The first manuscript, titled An assembly model for Rift Valley fever virus, was submitted for publication in the Journal of Molecular Biology. The cryo-EM structure of the Rift Valley fever virus was previously solved at 27 Å resolution by Dr. Freiberg and collaborators. This reconstruction shows the overall shape of the virus envelope, yet the reduced level of detail prevents a direct atomic interpretation. High-resolution structures are not yet available for the entire virus, nor for the two component glycoproteins that form its envelope. However, homology models may be generated for these glycoproteins based on similar structures that are available at atomic resolution. The manuscript presents the steps required to identify an atomic model of the entire virus envelope, based on the low-resolution cryo-EM map of the envelope and the homology models of the two glycoproteins. Starting from the results of an exhaustive search to place the two glycoproteins, the model is built iteratively by running multiple multi-body refinements to hierarchically generate models for the different regions of the envelope. The generated atomic model is supported by prior knowledge of virus biology and contains valuable information about the molecular architecture of the system. It provides the basis for further investigations seeking to reveal different processes in which the virus is involved, such as assembly or fusion.

The second manuscript was recently published in the Journal of Structural Biology (doi:10.1016/j.jsb.2009.12.028) under the title Evolutionary tabu search strategies for the simultaneous registration of multiple atomic structures in cryo-EM reconstructions. This manuscript introduces the evolutionary tabu search strategies applied to enable multi-body registration. The technique is a hybrid approach that combines a genetic algorithm with a tabu search strategy to promote the proper exploration of the high-dimensional search space. As with the Rift Valley fever virus, it is common that the structure of a large multi-component assembly is available at low resolution from cryo-EM, while high-resolution structures are solved for the individual components but are lacking for the entire system. Evolutionary tabu search strategies enable the building of an atomic model for the entire system by considering the different components simultaneously. Such simultaneous registration indirectly introduces spatial constraints, as all components need to be placed within the assembly, enabling them to be properly docked in the low-resolution map of the entire assembly. Along with the method description, the manuscript covers the validation, presenting the benefit of the technique in both synthetic and experimental test cases. The approach successfully docked multiple components at resolutions up to 40 Å.

The third manuscript is entitled Evolutionary Bidirectional Expansion for the Annotation of Alpha Helices in Electron Cryo-Microscopy Reconstructions and was submitted for publication in the Journal of Structural Biology. The modeling approach described in this manuscript applies the evolutionary tabu search strategies in combination with a bidirectional expansion to annotate secondary structure elements in intermediate-resolution cryo-EM reconstructions. In particular, secondary structure elements such as alpha helices show consistent patterns in cryo-EM data and are visible as rod-like patterns of high density. The evolutionary tabu search strategy is applied to identify the placement of the different alpha helices, while the bidirectional expansion characterizes their length and curvature. The manuscript presents the validation of the approach at resolutions ranging between 6 and 14 Å, a level of detail at which alpha helices are visible. Up to a resolution of 12 Å, the method measures sensitivities between 70 and 100% in experimental test cases, i.e. 70-100% of the alpha helices were correctly predicted in an automatic manner in the experimental data.

The three manuscripts presented in this PhD dissertation cover different computational methods for the integration and interpretation of cryo-EM reconstructions. The methods were developed in the molecular modeling software Sculptor (http://sculptor.biomachina.org) and are available to the scientific community interested in the multi-resolution modeling of cryo-EM data. The work spans a wide range of resolutions, covering multi-body refinement and registration at low resolution along with annotation of consistent patterns at high resolution. Such methods are essential for the modeling of cryo-EM data, and may be applied in other fields where similar spatial problems are encountered, such as medical imaging.
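The hybrid search at the heart of the second and third manuscripts can be sketched generically: a genetic algorithm whose accepted offspring populate a bounded tabu memory, so recently visited regions of the search space are rejected and the population keeps exploring. The sketch below runs on a toy continuous minimization problem, not on the actual rigid-body registration score against a cryo-EM map; the scoring function, move operators and tabu radius are all stand-ins.

```python
# Generic "evolutionary tabu search" sketch: GA selection/crossover/mutation
# plus a tabu list that blocks re-visiting recently explored neighbourhoods.
import random

def objective(x):
    """Stand-in scoring function (e.g., a negated map cross-correlation)."""
    return sum((xi - 3.0) ** 2 for xi in x)

def is_tabu(x, tabu, radius=0.2):
    """True if x lies within `radius` of any recently visited point."""
    return any(sum((a - b) ** 2 for a, b in zip(x, t)) < radius**2 for t in tabu)

def evolutionary_tabu_search(dim=3, pop_size=20, generations=100, tabu_len=15):
    pop = [[random.uniform(-10, 10) for _ in range(dim)] for _ in range(pop_size)]
    tabu = []
    for _ in range(generations):
        pop.sort(key=objective)              # minimization: best first
        parents = pop[: pop_size // 2]       # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            # uniform crossover + Gaussian mutation
            child = [random.choice(pair) + random.gauss(0, 0.3) for pair in zip(a, b)]
            if not is_tabu(child, tabu):     # reject recently visited regions
                children.append(child)
                tabu.append(child)
                del tabu[:-tabu_len]         # bounded tabu memory
        pop = parents + children
    return min(pop, key=objective)

best = evolutionary_tabu_search()
print("best:", [round(x, 2) for x in best], "score:", round(objective(best), 4))
```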

Relevance: 100.00%

Abstract:

Few high-latitude terrestrial records document the timing and nature of the Cenozoic "Greenhouse" to "Icehouse" transition. Here we exploit the bulk geochemistry of marine siliciclastic sediments from drill cores on Antarctica's continental margin to extract a unique semiquantitative temperature and precipitation record for the Eocene to mid-Miocene (~54-13 Ma). Alkaline elements are strongly enriched in the detrital mineral fraction of fine-grained siliciclastic marine sediments and occur only as trace metals in the biogenic fraction. Hence, terrestrial climofunctions similar to the chemical index of alteration (CIA) can be applied to the alkaline major-element geochemistry of marine sediments on continental margins in order to reconstruct changes in precipitation and temperature. We validate this approach by comparison with published paleotemperature and precipitation records derived from fossil wood, leaves, and pollen and find remarkable agreement, despite uncertainties in the calibrations of the different proxies. A long-term cooling of ≥8°C is observed between the Early Eocene Climatic Optimum (~54-52 Ma) and the middle Miocene (~15-13 Ma), with the onset of transient cooling episodes in the middle Eocene at ~46-45 Ma. High-latitude stratigraphic records currently exhibit insufficient temporal resolution to reconstruct continental aridity and inferred ice-sheet development during the middle to late Eocene (~45-37 Ma). However, we find an abrupt aridification of East Antarctica near the Eocene-Oligocene transition (~34 Ma), which suggests that ice coverage influenced high-latitude atmospheric circulation patterns through albedo effects from the earliest Oligocene onward.
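For reference, the chemical index of alteration mentioned above is computed from molar proportions of the major-element oxides (Nesbitt and Young, 1982). The sketch below shows the standard formula with hypothetical example values; the study's own climofunctions are calibrated variants of this idea.

```python
# CIA = 100 * Al2O3 / (Al2O3 + CaO* + Na2O + K2O) in molar proportions,
# where CaO* is the CaO of the silicate fraction only. Weight percents are
# converted to moles via the oxide molecular weights. Example values are
# hypothetical (roughly fresh granite).

MOLECULAR_WEIGHT = {"Al2O3": 101.96, "CaO": 56.08, "Na2O": 61.98, "K2O": 94.20}

def cia(wt_percent):
    """Chemical index of alteration from oxide weight percents
    (CaO assumed already corrected to its silicate fraction, CaO*)."""
    mol = {ox: wt_percent[ox] / MOLECULAR_WEIGHT[ox] for ox in MOLECULAR_WEIGHT}
    return 100 * mol["Al2O3"] / (mol["Al2O3"] + mol["CaO"] + mol["Na2O"] + mol["K2O"])

# Intense chemical weathering (warm, wet climate) leaches Ca, Na and K and
# drives CIA upward toward ~100; unweathered rock sits near ~50.
print(round(cia({"Al2O3": 15.0, "CaO": 4.0, "Na2O": 3.5, "K2O": 3.0}), 1))
```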

Relevance: 100.00%

Abstract:

In this article I examine the Platonic appropriation of poetic language in the Republic and argue that, despite his criticisms of poetry in Books 3 and 10, poetic language is rightly interwoven into the philosophical fabric to paint the corrupt, the ugly and the immoral. In specific terms, the Platonic adaptation of various poetic motifs and images in the Republic becomes more significant if we pay attention to Socrates as a quasi-painter in the dialogue and interpret his philosophical images as philosophy's response to the deceptive dramatic representations of poetry. In this way, the art of painting, even though it is criticized in Book 10 of the Republic, becomes in Plato's hands a philosophical tool that allows him to investigate the relation of our ordinary sense-perceptible world to the metaphysical realm of the Forms and the place of the human within it.

Relevance: 100.00%

Abstract:

The Long-Term Ecological Research (LTER) observatory HAUSGARTEN, in the eastern Fram Strait, provides us with the valuable opportunity to study the composition of benthic megafaunal communities through the analysis of seafloor photographs. This, in combination with extensive sampling campaigns, which have yielded a unique data set on faunal, bacterial, biogeochemical and geological properties, as well as on hydrography and sedimentation patterns, allows us to address the question of why variations in megafaunal community structure and species distribution exist at regional (60-110 km) and local (<4 km) scales. Here, we present the first results from the latitudinal HAUSGARTEN gradient, consisting of three stations (N3, HG-IV, S3) between 78°30'N and 79°45'N (2351-2788 m depth), obtained via the analysis of images acquired with a towed camera (OFOS - Ocean Floor Observation System) in 2011. We assess variability in megafaunal densities, species composition and diversity, as well as the biotic and biogenic habitat features that may cause the patterns observed. While there were significant regional-scale differences in megafaunal composition and densities between the stations (N3 = 26.74 ± 0.63; HG-IV = 11.21 ± 0.25; S3 = 18.34 ± 0.39 individuals/m**2), significant local differences were only found at HG-IV. Regional-scale variations may be due to the significant differences in ice coverage at each station as well as the different quantities of protein available, whereas local-scale differences at HG-IV may be a result of variation in bottom topography or factors not yet identified.

Relevance: 100.00%

Abstract:

We estimated the relative contribution of atmospheric nitrogen (N) input (wet and dry deposition and N fixation) to the epipelagic food web by measuring N isotopes of different functional groups of epipelagic zooplankton along 23°W (17°N-4°S) and 18°N (20-24°W) in the Eastern Tropical Atlantic. Results were related to water-column observations of nutrient distribution and vertical diffusive flux, as well as to the colony abundance of Trichodesmium obtained with an Underwater Vision Profiler (UVP5). The thickness and depth of the nitracline and phosphocline proved to be significant predictors of zooplankton stable N isotope values. Atmospheric N input was highest (61% of total N) in the strongly stratified and oligotrophic region between 3 and 7°N, which featured very high depth-integrated Trichodesmium abundance (up to 9.4 × 10**4 colonies/m**2), strong thermohaline stratification and low zooplankton delta15N (~2 per mil). Relative atmospheric N input was lowest south of the equatorial upwelling, between 3 and 5°S (27%). Values in the Guinea Dome region and north of Cape Verde were around 45 and 50%, respectively. The microstructure-derived estimate of the vertical diffusive N flux in the equatorial region was about one order of magnitude higher than in any other area (approximately 8 mmol/m**2/day). At the same time, this region received considerable atmospheric N input (35% of total). In general, zooplankton delta15N and Trichodesmium abundance were closely correlated, indicating that N fixation is the major source of atmospheric N input. Although Trichodesmium is not the only N-fixing organism, its abundance can be used with high confidence to estimate the relative atmospheric N input in the tropical Atlantic (r² = 0.95). Estimates of absolute N fixation rates are two- to tenfold higher than incubation-derived rates reported for the same regions. Our approach integrates over large spatial and temporal scales and also quantifies fixed N released as dissolved inorganic and organic N. In a global analysis, it may thus help to close the gap in oceanic N budgets.
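The kind of estimate described above rests on a two-end-member isotope mass balance: newly fixed atmospheric N carries a low delta15N, upward-mixed deep-water nitrate a higher one, and the observed zooplankton value falls in between. The end-member values in the sketch below are typical literature numbers assumed for illustration; the study's actual end-members and trophic corrections may differ.

```python
# Two-end-member mixing: delta_obs = f*DELTA_ATM + (1 - f)*DELTA_DEEP,
# solved for f, the atmospheric fraction. End-members are assumptions.

DELTA_ATM = -1.0   # per mil, newly fixed N (assumed end-member)
DELTA_DEEP = 5.0   # per mil, subsurface nitrate (assumed end-member)

def atmospheric_fraction(delta_obs):
    """Fraction of N derived from atmospheric input under linear mixing."""
    f = (DELTA_DEEP - delta_obs) / (DELTA_DEEP - DELTA_ATM)
    return min(max(f, 0.0), 1.0)   # clip to the physically meaningful range

# Low zooplankton delta15N (~2 per mil, as observed between 3 and 7°N)
# implies a large atmospheric contribution:
for d in (2.0, 3.5, 4.5):
    print(f"delta15N = {d} per mil -> atmospheric fraction = "
          f"{atmospheric_fraction(d):.0%}")
```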

Relevance: 100.00%

Abstract:

The European construction industry is estimated to consume 40% of Europe's natural resources and to generate 40% of European solid waste. Conscious of the great damage that construction activity inflicts on the environment, this work aims to provide building actors with a new tool to improve the current situation. The tool proposed is a model for the comprehensive evaluation of construction products by determining their environmental level. In this research, the environmental level of a construction product is defined as its capacity to fulfil the required construction functions while causing the minimum ecological impact on its surrounding environment. This information allows building actors to choose materials suitable both for the building's needs and for the environment, mainly at the project stage or on the building site, contributing to an improved relationship between buildings and the environment. For the assessment of the environmental level of construction products, five indicators covering global environmental impact across the product life cycle have been identified: CO2 emissions during production, the volume and toxicity of waste generated on the building site, durability, and recycling capacity after the useful life. Thus, the lower the environmental impact a construction product causes, the higher the environmental level it achieves. The model has been tested on 30 construction products that include environmental criteria in their description, and the results obtained are discussed in this article. Furthermore, the model can lay down guidelines for the selection of eco-efficient construction products and for the design of new eco-competitive and eco-committed ones.
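The abstract names the five indicators but does not specify how they are aggregated into a single environmental level. The sketch below shows one plausible weighted-normalization scheme; the normalization rule, weights, bounds and example values are all assumptions made purely for illustration, not the authors' model.

```python
# Hypothetical aggregation of the five indicators into one score in [0, 1].
# For CO2 and waste, lower raw values are better (worst > best); for
# durability and recyclability, higher is better.

def normalize(value, worst, best):
    """Map an indicator onto [0, 1], where 1 = least environmental impact."""
    score = (value - worst) / (best - worst)
    return min(max(score, 0.0), 1.0)

indicators = {
    #                     value  worst   best  weight
    "co2_kg_per_unit":   ( 45.0, 100.0,   0.0, 0.30),
    "waste_volume_m3":   (  0.2,   1.0,   0.0, 0.15),
    "waste_toxicity":    (  2.0,  10.0,   0.0, 0.15),
    "durability_years":  ( 60.0,  10.0, 100.0, 0.20),
    "recyclability_pct": ( 80.0,   0.0, 100.0, 0.20),
}

level = sum(w * normalize(v, worst, best)
            for v, worst, best, w in indicators.values())
print(f"environmental level: {level:.2f} (0 = worst, 1 = best)")
```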

Relevance: 100.00%

Abstract:

This communication, presented at the 12th International Conference on Building Materials and Components, relates to the management and maintenance of the assets that constitute the Spanish Cultural Heritage, assets that share an artistic or historical background. Their conservation and maintenance have become a social demand, necessary for the preservation of public values and requiring the investment of the corresponding resources. Legal protection involves a number of obligations and rights intended to ensure conservation and heritage protection. The duty of maintenance and upkeep extends beyond the useful life of the property, which must endure more for its cultural value than for its usability. The necessary conditions must be established to prevent deterioration and to enable the asset to fulfill its social function, seeking to prolong its life while preserving its physical integrity and its ability to convey the protected values. This obligation implies a substantial financial effort for the holder of the property, whether a public or a private entity, raising a problem of economic sustainability. Economic exploitation, with the aim of contributing to the upkeep of the asset, is sometimes the best way to obtain resources. The work will include different lines of research with the following objectives:
- Establishment of processes for assessing total costs over the building life cycle (LCC), during the planning stages or for maintenance budgets, in order to determine the most advantageous operating system (a minimal cost sketch follows below).
- Analysis of the relationship between the value of the property and its maintenance costs, establishing a sensitivity analysis.
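As an illustration of the first objective, the sketch below discounts all costs over a service horizon to present value, the standard life-cycle cost formulation LCC = C0 + sum over t of C_t / (1 + r)^t. The cost figures, horizon and discount rate are hypothetical; the communication does not fix a specific model.

```python
# Minimal life-cycle cost (LCC) calculation: net present value of the
# initial cost plus all future annual costs. All numbers are hypothetical.

def lcc(initial_cost, annual_costs, discount_rate):
    """Present value of all costs over the life cycle."""
    return initial_cost + sum(
        c / (1 + discount_rate) ** t
        for t, c in enumerate(annual_costs, start=1))

# 30-year horizon: routine maintenance every year, a heavy restoration
# campaign in year 15.
annual = [12_000 if t != 15 else 250_000 for t in range(1, 31)]
print(f"LCC at 3% discount rate: {lcc(1_000_000, annual, 0.03):,.0f}")
```

Comparing such totals across alternative operating systems (e.g. preventive versus corrective maintenance schedules) is what makes the most advantageous option visible at the planning stage.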

Relevance: 100.00%

Abstract:

The present contribution discusses the development of a PSE-3D instability analysis algorithm in which a matrix forming and storing approach is followed. As an alternative to the spectral methods typically used in stability calculations, new stable high-order finite-difference-based numerical schemes for spatial discretization [1] are employed. Attention is paid to the issue of efficiency, which is critical for the success of the overall algorithm. To this end, use is made of a parallelizable sparse-matrix linear algebra package which takes advantage of the sparsity offered by the finite-difference scheme and, as expected, is shown to perform substantially more efficiently than spectral collocation methods. The building blocks of the algorithm have been implemented and extensively validated, focusing on classic PSE analysis of instability in the flat-plate boundary layer, temporal and spatial BiGlobal EVP solutions (the latter necessary for the initialization of the PSE-3D), as well as standard PSE in cylindrical coordinates using the nonparallel Batchelor vortex basic flow model, such that comparisons between PSE and PSE-3D are possible; excellent agreement is shown in all aforementioned comparisons. Finally, the linear PSE-3D instability analysis is applied to a fully three-dimensional flow composed of a counter-rotating pair of nonparallel Batchelor vortices.
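The efficiency argument can be illustrated with a small sketch: a high-order finite-difference derivative operator is a banded sparse matrix whose fill is only a few percent, whereas spectral collocation produces dense differentiation matrices. The scheme below is a standard fourth-order central stencil on a periodic grid, shown only as an analogue of the idea, not the solver's actual discretization or boundary treatment.

```python
# Sparse 4th-order central-difference matrix for d/dx on a periodic grid:
#   f'_i ~ (f_{i-2} - 8 f_{i-1} + 8 f_{i+1} - f_{i+2}) / (12 h)
import numpy as np
import scipy.sparse as sp

def d1_fourth_order_periodic(n, h):
    rows, cols, vals = [], [], []
    for i in range(n):
        for off, c in ((-2, 1.0), (-1, -8.0), (1, 8.0), (2, -1.0)):
            rows.append(i)
            cols.append((i + off) % n)   # modular index = periodic wrap
            vals.append(c / (12 * h))
    return sp.csr_matrix((vals, (rows, cols)), shape=(n, n))

n = 256
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
D = d1_fourth_order_periodic(n, x[1] - x[0])

# Verify accuracy on sin(x) and report the sparsity the linear algebra
# package can exploit (a dense spectral matrix would be 100% filled).
err = np.max(np.abs(D @ np.sin(x) - np.cos(x)))
print(f"max error: {err:.2e}; nonzeros: {D.nnz} of {n*n} "
      f"({100 * D.nnz / n**2:.1f}% fill)")
```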

Relevance: 100.00%

Abstract:

One of the most significant aspects of a building's acoustic behavior is the airborne sound insulation of the room façades, since this determines the protection of its inhabitants against environmental noise. For this reason, authorities in most countries have established in their acoustic regulations for buildings the minimum value of sound insulation that must be respected for façades. In order to verify compliance with the legal requirements, it is usual to perform acoustic measurements in the finished buildings and then compare the measurement results with the established limits. Since there is always a certain measurement uncertainty, this uncertainty must be calculated and taken into account in order to ensure compliance with specifications. The most commonly used method for measuring sound insulation of façades is the so-called Global Loudspeaker Method, specified in ISO 140-5:1998. This method uses a loudspeaker placed outside the building as a sound source. The loudspeaker directivity has a significant influence on the measurement results, and these results may change noticeably when different loudspeakers are chosen, even though they all fulfill the directivity requirements of ISO 140-5. This work analyzes the influence of loudspeaker directivity on the results of façade sound insulation measurements and determines its contribution to the measurement uncertainty. The theoretical analysis is experimentally validated by means of an intermediate precision test according to ISO 5725-3:1994, which compares the values of sound insulation obtained for a façade using various loudspeakers with different directivities.
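A minimal sketch of the quantities involved, assuming the standardized façade level difference of ISO 140-5 and a GUM-style quadrature combination of independent uncertainty contributions; the numerical uncertainty components below (including the directivity term) are hypothetical placeholders, not the values determined in the work.

```python
# Standardized facade level difference:
#   D_ls,2m,nT = L1,2m - L2 + 10*log10(T / T0),  with T0 = 0.5 s,
# and expanded uncertainty U = k * sqrt(sum of squared standard uncertainties).
import math

def d_ls_2m_nT(l1_2m, l2, reverb_time, t0=0.5):
    """Standardized facade level difference in dB (outdoor level at 2 m,
    indoor level, receiving-room reverberation time)."""
    return l1_2m - l2 + 10 * math.log10(reverb_time / t0)

def expanded_uncertainty(components, k=2.0):
    """Root-sum-of-squares combination times coverage factor k
    (k = 2 gives roughly 95% coverage for a normal distribution)."""
    return k * math.sqrt(sum(u * u for u in components))

d = d_ls_2m_nT(l1_2m=85.0, l2=55.0, reverb_time=0.6)
# Hypothetical budget: repeatability, position averaging, loudspeaker directivity.
U = expanded_uncertainty([0.5, 0.4, 0.7])
print(f"D_ls,2m,nT = {d:.1f} dB, U(k=2) = {U:.1f} dB")
```

The point of quantifying the directivity contribution is visible here: if it dominates the budget, the expanded uncertainty U, and hence the margin needed to demonstrate compliance, grows accordingly.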

Relevance: 100.00%

Abstract:

Computed tomography imaging is a non-invasive alternative for observing soil structures, mainly pore space. In soil data, pore space corresponds to empty or free space, in the sense that no solid material is present there, only fluids. Because fluid transport depends on the pore space, it is important to identify the regions that correspond to pore zones. In this paper we present a methodology for detecting pore space and solid soil based on the synergy of image processing, pattern recognition and artificial intelligence. Mathematical morphology is an image processing technique used here for image enhancement. To find groups of pixels with similar gray-level intensity (more or less homogeneous groups), a novel image sub-segmentation based on a Possibilistic Fuzzy c-Means (PFCM) clustering algorithm is used. Artificial Neural Networks (ANNs) are very efficient for demanding, large-scale and generic pattern recognition applications; for this reason, a classifier based on an artificial neural network is finally applied to classify soil images into two classes, pore space and solid soil, respectively.
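A compact sketch of the clustering step on gray-level intensities: for brevity this implements standard Fuzzy c-Means, whereas the PFCM algorithm used in the paper additionally maintains possibilistic typicality values that make it more robust to noise and outliers. The synthetic bimodal intensities stand in for real CT data.

```python
# Fuzzy c-Means on 1-D gray levels: alternate between fuzzy membership
# updates (normalized inverse distances) and weighted center updates.
import numpy as np

def fuzzy_c_means(x, n_clusters=2, m=2.0, n_iter=100, seed=0):
    """Cluster 1-D intensities; returns (centers, membership matrix)."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(x), n_clusters))
    u /= u.sum(axis=1, keepdims=True)          # memberships sum to 1
    for _ in range(n_iter):
        w = u ** m                             # fuzzified memberships
        centers = (w * x[:, None]).sum(axis=0) / w.sum(axis=0)
        dist = np.abs(x[:, None] - centers[None, :]) + 1e-12
        u = 1.0 / dist ** (2 / (m - 1))        # inverse-distance weighting
        u /= u.sum(axis=1, keepdims=True)
    return centers, u

# Synthetic bimodal gray levels: dark pore voxels vs. bright solid matrix.
rng = np.random.default_rng(1)
gray = np.concatenate([rng.normal(40, 8, 500), rng.normal(160, 15, 500)])
centers, u = fuzzy_c_means(gray)
labels = u.argmax(axis=1)                      # hard labels from memberships
print("cluster centers:", np.round(np.sort(centers), 1))
```

In the full pipeline these soft memberships (and, in PFCM, the typicality values) feed the sub-segmentation, and the ANN classifier then assigns the resulting regions to the pore-space or solid-soil class.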