928 results for conceptual hydrogeological model


Relevance: 30.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 30.00%

Publisher:

Abstract:

Functional grammar currently enjoys wide acceptance in linguistics, mainly because it can illuminate the motivation of grammatical facts in the structure of a text. Since its emergence, the tradition of studying grammar in isolation has come to an end. Demonstrative pronouns, for instance, have come to be viewed as efficient tools of textual cohesion, used to resume terms from previous clauses. This approach, however, leaves an endless trail of black boxes: how can the anaphoric functioning of demonstratives be explained if they are originally used to indicate things or people relative to the interlocutors' spatial position? This work aims to show that Cognitive Linguistics arises precisely as an option for opening those black boxes, and focuses on one of its themes, the conceptual blending theory, to support this possibility. First, it was necessary to integrate the cognitive model into complexity theory, following Bybee (2010) and Castilho (2009), who understand language as a complex adaptive system. A brief, updated description of the conceptual blending theory is then given, and its application to some grammatical facts of Brazilian Portuguese is suggested at the morphological and syntactic levels.

Relevance: 30.00%

Publisher:

Abstract:

Pós-graduação em Geociências e Meio Ambiente - IGCE

Relevance: 30.00%

Publisher:

Abstract:

Water management is highly important to the success of many businesses, and to life itself, and understanding its relationship with the environment allows better control of demand. Hydrogeological studies are therefore needed to better understand the behaviour of an aquifer, so that it can be managed without being depleted or harmed. The objective of this work is the transient-regime numerical modeling of a portion of the Rio Claro aquifer formation, in order to obtain answers about its hydrogeological parameters, its main flow direction, and its most sensitive parameters. A literature review and conceptual characterization of the aquifer, combined with field campaigns and monitoring of the local water level (NA), enabled the subsequent construction of a mathematical model by the finite element method, using the FEFLOW 6.1® computational package. The study site includes the UNESP campus and residential and industrial areas of the city of Rio Claro. Its area of 9.73 km² was divided into 318,040 triangular elements spread over six layers, totaling a volume of 0.25 km³. The local topography and geological contacts were obtained from previous geological and geophysical studies, as well as from profiles of campus wells and the SIAGAS/CPRM system. The seven monitoring wells on campus were set up as observation points for calibration and checking of the simulation results. Sampling and characterization of the Rio Claro sandstones show high hydrological and lithological heterogeneity in the aquifer formation. The simulation results indicate hydraulic conductivity values between 10⁻⁶ and 10⁻⁴ m/s, with the transient simulation giving a recharge/rainfall ratio of 13%. Even with the simplifications imposed on the model, it was able to represent the fluctuations of the local NA over a year of monitoring. The result was the exit of 3,774,770 m³ of water and the consequent fall in NA. The model is considered representative for the...
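The transient simulation described above can be illustrated with a far simpler model. The sketch below marches a 1D recharge-driven groundwater-head equation forward in time with explicit finite differences; all parameter values are hypothetical and the abstract's FEFLOW model actually uses finite elements on a 3D mesh, so this is only a conceptual analogue of how recharge raises the water level (NA) between fixed-head boundaries.

```python
import numpy as np

# 1D transient groundwater flow: S * dh/dt = T * d2h/dx2 + R
# All values below are hypothetical, chosen only for illustration.
T = 1e-3                  # transmissivity [m^2/s]
S = 0.1                   # storativity [-]
R = 1e-8                  # recharge rate [m/s]
L, nx = 1000.0, 51        # domain length [m], number of grid points
dx = L / (nx - 1)
dt = 0.25 * S * dx**2 / T         # safely below the explicit stability limit
h = np.full(nx, 100.0)            # initial head [m]
for _ in range(1000):             # march through time
    lap = (h[2:] - 2.0 * h[1:-1] + h[:-2]) / dx**2
    h[1:-1] += dt * (T * lap + R) / S
    h[0] = h[-1] = 100.0          # fixed-head boundaries
# Recharge raises heads in the interior, mimicking a wet-season NA rise;
# removing R (or making it negative) would reproduce the NA fall instead.
```

The explicit scheme is the simplest choice for a sketch; production codes such as FEFLOW use implicit finite-element time stepping precisely to avoid the stability restriction on `dt` visible above.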

Relevance: 30.00%

Publisher:

Abstract:

Metacontingency has been described as the functional relation between interlocking behavioral contingencies, plus their direct and immediate effect, called the aggregated product, and a selecting event dependent on that effect, called the cultural consequence. The analysis of metacontingencies enables discussion of the complexity of human behavior in social systems. In the present study, we aimed to review and discuss: (a) the importance of analyzing basic behavioral processes for the comprehension of social human phenomena; (b) the necessity of constructing and improving experimental models of metacontingencies; (c) the current state of experimental investigations of metacontingencies in humans; (d) the use of animal models as a way to control for the effects of verbal behavior, among other variables, on cultural selection; and (e) a concrete, illustrative proposal for an animal model of metacontingencies.

Relevance: 30.00%

Publisher:

Abstract:

Exergetic analysis can provide useful information, as it enables the identification of the irreversible phenomena that bring about entropy generation and, therefore, exergy losses (also referred to as irreversibilities). As far as human thermal comfort is concerned, irreversibilities can be evaluated from parameters related to both the occupant and his or her surroundings. In an attempt to offer further insights into the exergetic analysis of thermal comfort, this paper calculates irreversibility rates for a sitting person wearing fairly light clothes and subjected to combinations of ambient air and mean radiant temperatures. The thermodynamic framework relies on the so-called conceptual energy balance equation, together with empirical correlations for the invoked thermoregulatory heat transfer rates, adapted for a clothed body. The results suggest that a minimum irreversibility rate may exist for particular combinations of the aforesaid surrounding temperatures. By separately considering the contribution of each thermoregulatory mechanism, the total irreversibility rate proved most responsive to the convective and radiative clothing-influenced heat transfers, with exergy losses becoming lower as the body transfers more heat to the ambient via convection.
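A toy version of this bookkeeping can be sketched as follows. The snippet computes convective and radiative sensible heat losses from a clothed body and the exergy destroyed as each flow is degraded to the ambient (dead-state) temperature. The surface area, temperatures and heat transfer coefficients are generic textbook-style values, not the paper's, and evaporation, respiration and conduction are ignored.

```python
# Illustrative exergy-destruction calculation for a clothed body.
# All numerical values are assumed typical figures, not taken from the paper.
A = 1.8              # body surface area [m^2]
T0 = 295.15          # ambient air / dead-state temperature [K] (22 C)
T_mr = 293.15        # mean radiant temperature [K] (20 C)
T_cl = 303.15        # clothing surface temperature [K] (30 C, assumed)
h_c, h_r = 3.1, 4.7  # convective / radiative coefficients [W/(m^2 K)]

Q_conv = h_c * A * (T_cl - T0)    # convective heat loss [W]
Q_rad = h_r * A * (T_cl - T_mr)   # radiative heat loss [W]
# Exergy destroyed when each heat flow is fully degraded to the dead state T0:
# I = Q * (1 - T0 / T_source), with the clothing surface as the source.
carnot = 1.0 - T0 / T_cl
I_conv = Q_conv * carnot
I_rad = Q_rad * carnot
```

With these assumed values the Carnot factor is small (the clothing surface is only a few kelvin above ambient), so each watt of sensible heat loss destroys only a few hundredths of a watt of exergy; the balance between `I_conv` and `I_rad` shifts with the two surrounding temperatures, which is the kind of trade-off the abstract examines.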

Relevance: 30.00%

Publisher:

Abstract:

In a previous paper, we connected the phenomenological noncommutative inflation of Alexander, Brandenberger and Magueijo [ Phys. Rev. D 67 081301 (2003)] and Koh and Brandenberger [ J. Cosmol. Astropart Phys. 2007 21 ()] with the formal representation theory of groups and algebras and analyzed minimal conditions that the deformed dispersion relation should satisfy in order to lead to a successful inflation. In that paper, we showed that elementary tools of algebra allow a group-like procedure in which even Hopf algebras (roughly the symmetries of noncommutative spaces) could lead to the equation of state of inflationary radiation. Nevertheless, in this paper, we show that there exists a conceptual problem with the kind of representation that leads to the fundamental equations of the model. The problem comes from an incompatibility between one of the minimal conditions for successful inflation (the momentum of individual photons being bounded from above) and the Fock-space structure of the representation which leads to the fundamental inflationary equations of state. We show that the Fock structure, although mathematically allowed, would lead to problems with the overall consistency of physics, like leading to a problematic scattering theory, for example. We suggest replacing the Fock space by one of two possible structures that we propose. One of them relates to the general theory of Hopf algebras (here explained at an elementary level) while the other is based on a representation theorem of von Neumann algebras (a generalization of the Clebsch-Gordan coefficients), a proposal already suggested by us to take into account interactions in the inflationary equation of state.

Relevance: 30.00%

Publisher:

Abstract:

The present study analyses rural landscape changes, focusing on the understanding of the driving forces acting on the rural built environment by means of a statistical spatial model implemented through GIS techniques. It is well known that the study of landscape changes is essential for conscious decision making in land planning. A bibliographic review reveals a general lack of studies dealing with the modeling of the rural built environment, hence a theoretical modelling approach for this purpose is needed. Advances in technology and modernity in building construction and agriculture have gradually changed the rural built environment. In addition, the phenomenon of urbanization has determined the construction of new volumes beside abandoned or derelict rural buildings. Consequently, two main types of transformation dynamics affecting the rural built environment can be observed: the conversion of rural buildings and the increase in the number of buildings. The specific aim of the present study is to propose a methodology for the development of a spatial model that allows the identification of the driving forces that acted on building allocation; indeed, one of the most concerning dynamics nowadays is the irrational sprawl of buildings across the landscape. The proposed methodology consists of several conceptual steps covering the different aspects of developing a spatial model: the selection of a response variable that best describes the phenomenon under study, the identification of possible driving forces, the sampling methodology for data collection, the choice of the most suitable algorithm in relation to the statistical theory and method used, and the calibration and evaluation of the model.
A different combination of factors in various parts of the territory generated more or less favourable conditions for building allocation; the existence of buildings represents the evidence of such an optimum, while their absence expresses a combination of agents unsuitable for building allocation. Presence or absence of buildings can therefore be adopted as indicators of these driving conditions, since they express the action of driving forces in the land-suitability sorting process. The existence of a correlation between site selection and hypothetical driving forces, evaluated by means of modeling techniques, provides evidence of which driving forces are involved in the allocation dynamic and an insight into their level of influence on the process. GIS spatial-analysis tools allow the concepts of presence and absence to be associated with point features, generating a point process: presence points represent the locations of real existing buildings, whereas absence points represent locations where buildings do not exist and are therefore generated by a stochastic mechanism. Possible driving forces are selected, and the existence of a causal relationship with building allocation is assessed through a spatial model. The adoption of empirical statistical models provides a mechanism for explanatory-variable analysis and for the identification of the key driving variables behind the site-selection process for new building allocation. The model developed by following this methodology is applied to a case study to test its validity; the study area is the New District of Imola, characterized by a prevailing agricultural production vocation and where transformation dynamics have occurred intensively.
The development of the model involved the identification of predictive variables (related to the geomorphologic, socio-economic, structural and infrastructural systems of the landscape) capable of representing the driving forces responsible for landscape changes. The model was calibrated on spatial data covering the periurban and rural parts of the study area over the 1975-2005 period, by means of a generalized linear model. The output of the fitted model is a continuous grid surface whose cells assume values, ranging from 0 to 1, of the probability of building occurrence across the rural and periurban parts of the study area. The response variable thus assesses the changes in the rural built environment that occurred in this time interval and is correlated with the selected explanatory variables through a generalized linear model using logistic regression. Comparing the probability map obtained from the model with the actual rural building distribution in 2005 allows the interpretive capability of the model to be evaluated. The proposed model can also be applied to the interpretation of trends in other study areas and over different time intervals, depending on data availability. The use of suitable data in terms of time, information and spatial resolution, together with the costs of data acquisition, pre-processing and survey, are among the most critical aspects of model implementation. Future in-depth studies can focus on using the proposed model to predict short- to medium-range future scenarios for the distribution of the rural built environment in the study area; predicting such scenarios requires the assumption that the driving forces do not change and that their levels of influence within the model remain close to those assessed for the calibration interval.
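The presence/absence logistic GLM described above can be sketched numerically on synthetic data. In the snippet below, the covariates "distance to road" and "slope" and their weights are purely hypothetical stand-ins for the study's driving forces; the fit uses iteratively reweighted least squares (the standard Newton scheme for GLM estimation).

```python
import numpy as np

# Synthetic presence/absence data: 1 = building present, 0 = pseudo-absence.
# Covariates and "true" driving-force weights are invented for illustration.
rng = np.random.default_rng(0)
n = 2000
dist_road = rng.uniform(0.0, 5.0, n)            # distance to road [km]
slope = rng.uniform(0.0, 30.0, n)               # terrain slope [degrees]
logit = 1.5 - 0.8 * dist_road - 0.05 * slope    # assumed true logits
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

X = np.column_stack([np.ones(n), dist_road, slope])
beta = np.zeros(3)
for _ in range(25):                              # IRLS / Newton steps
    p = 1.0 / (1.0 + np.exp(-X @ beta))          # predicted probabilities
    W = p * (1.0 - p)                            # GLM working weights
    beta += np.linalg.solve((X * W[:, None]).T @ X, X.T @ (y - p))
# beta now approximates the driving-force coefficients; evaluating p on
# gridded covariates yields the 0-1 probability surface the study describes.
```

The recovered coefficient for `dist_road` comes out negative, matching the generating model: buildings become less likely farther from roads, which is exactly the kind of driving-force inference the methodology aims at.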

Relevance: 30.00%

Publisher:

Abstract:

The management and organization literature has extensively noted the crucial role that improvisation plays in organizations, variously as a learning process (Miner, Bassoff & Moorman, 2001), a creative process (Fisher & Amabile, 2008), a capability (Vera & Crossan, 2005), and a personal disposition (Hmielesky & Corbett, 2006; 2008). My dissertation aims to contribute to the existing literature on improvisation by addressing two general research questions: 1) How does improvisation unfold at the individual level? 2) What are the potential antecedents and consequences of an individual's proclivity to improvise? The dissertation is based on a mixed methodology that allowed me to address these two questions while maintaining a constant interaction between the theoretical framework and the empirical results. The selected empirical field is haute cuisine, and the respondents are the executive chefs of the restaurants awarded by the Michelin Guide in Italy in 2010. The qualitative section of the dissertation is based on the analysis of 26 inductive case studies and offers a multifaceted contribution. First, I describe how improvisation works both as a learning and as a creative process. Second, I introduce a new categorization of individual improvisational scenarios (demanded creative improvisation, problem-solving improvisation, and pure creative improvisation). Third, I describe the differences between improvisation and the other creative processes detected in the field (experimentation, brainstorming, trial and error through analytical procedure, trial and error, and imagination). The quantitative inquiry is founded on a structural equation model, which allowed me to simultaneously test the relationships between the proclivity to improvise and its antecedents and consequences.
In particular, using a newly developed scale to measure individual proclivity to improvise, I test the positive influence of industry experience, self-efficacy, and age on proclivity to improvise and the negative impact of proclivity to improvise on outcome deviation. Theoretical contributions and practical implications of the results are discussed.

Relevance: 30.00%

Publisher:

Abstract:

Over the past decades, several countries have acquired large volumes of area-covering Airborne Electromagnetic (AEM) data. The contribution of airborne geophysics to both groundwater resource mapping and management has increased dramatically, proving how appropriate these systems are for large-scale, efficient groundwater surveying. We start with the processing and inversion of two AEM datasets from two different systems collected over the Spiritwood Valley Aquifer area, Manitoba, Canada: the AeroTEM III survey (commissioned by the Geological Survey of Canada in 2010) and the "Full waveform VTEM" dataset, collected and tested over the same survey area during the fall of 2011. We demonstrate that, in the presence of multiple datasets, both AEM and ground-based, proper processing, inversion, post-processing, data integration and data calibration constitute the approach capable of providing reliable and consistent resistivity models. Our approach can be of interest to many end users, from geological surveys and universities to private companies, which often own large geophysical databases to be interpreted for geological and/or hydrogeological purposes. In this study we investigate in depth the role of integrating several complementary types of geophysical data collected over the same survey area. We show that data integration can improve inversions, reduce ambiguity and deliver high-resolution results. We further use the final, most reliable output resistivity models as a solid basis for building a knowledge-driven 3D geological voxel-based model. A voxel approach allows a quantitative understanding of the hydrogeological setting of the area and can further be used to estimate aquifer volumes (i.e. the potential amount of groundwater resources) as well as for hydrogeological flow-model prediction.
In addition, we investigated the impact of an AEM dataset on hydrogeological mapping and 3D hydrogeological modeling, comparing it with having only a ground-based TEM dataset and/or only borehole data.

Relevance: 30.00%

Publisher:

Abstract:

This work focuses on the study of saltwater intrusion in coastal aquifers, and in particular on the construction of conceptual schemes to evaluate the associated risk. Saltwater intrusion depends on different natural and anthropic factors, both of which exhibit strongly random behaviour and should be considered for an optimal management of the territory and its water resources. Given the uncertainty in the problem parameters, the risk associated with salinization needs to be cast in a probabilistic framework. On the basis of a widely adopted sharp-interface formulation, the key hydrogeological parameters are modeled as random variables, and global sensitivity analysis is used to determine their influence on the position of the saltwater interface. The analyses presented in this work rely on an efficient model-reduction technique, based on Polynomial Chaos Expansion, able to provide an accurate description of the model without a great computational burden. When the assumptions of classical analytical models are not respected, as happens in many real case studies (including the area analyzed in the present work), one can adopt data-driven techniques based on the analysis of the data characterizing the system under study. A model can then be defined on the basis of connections between the system state variables, with only a limited number of assumptions about the "physical" behaviour of the system.
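A crude, purely illustrative version of such an analysis can be sketched with the classical Ghyben-Herzberg sharp-interface relation and plain Monte Carlo sampling in place of the paper's Polynomial Chaos surrogate. All distributions and parameter values below are invented, and the first-order sensitivity indices are estimated by simple conditional-mean binning rather than a proper Sobol scheme.

```python
import numpy as np

# Random hydrogeological inputs (hypothetical distributions).
rng = np.random.default_rng(1)
n = 20000
K = rng.lognormal(mean=np.log(1e-4), sigma=0.5, size=n)  # hydraulic conductivity [m/s]
W = rng.uniform(1e-9, 5e-9, size=n)                      # recharge [m/s]
x, L = 500.0, 2000.0   # observation point and aquifer length [m], assumed

# Unconfined head above sea level under uniform recharge, then the
# Ghyben-Herzberg interface depth (factor ~40 from the fresh/salt densities).
h = np.sqrt(W * x * (2.0 * L - x) / K)
xi = 40.0 * h

def first_order_index(param, output, bins=50):
    """Var of the conditional mean E[output | param] over total variance."""
    edges = np.quantile(param, np.linspace(0.0, 1.0, bins + 1))
    idx = np.clip(np.searchsorted(edges, param) - 1, 0, bins - 1)
    cond_means = np.array([output[idx == b].mean() for b in range(bins)])
    return cond_means.var() / output.var()

S_K = first_order_index(K, xi)   # first-order sensitivity to conductivity
S_W = first_order_index(W, xi)   # first-order sensitivity to recharge
```

Both inputs turn out to matter here, and their first-order indices sum to roughly one because the toy model has weak interactions; a Polynomial Chaos surrogate, as used in the study, would deliver such indices analytically from the expansion coefficients instead of by brute-force sampling.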

Relevance: 30.00%

Publisher:

Abstract:

Our growing understanding of the human mind and cognition and the development of neurotechnology have triggered debate around cognitive enhancement in neuroethics. The dissertation examines the normative issues of memory enhancement and focuses on two questions: (1) the distinction between memory treatment and enhancement; and (2) how the issue of authenticity bears on memory interventions, including memory treatments and enhancements.
The first part consists of a conceptual analysis of the concepts required for normative considerations. First, the representational nature and the function of memory are discussed: memory is regarded as a special form of self-representation resulting from a constructive process. Next, the concepts of selfhood, personhood, and identity are examined, and a conceptual tool, the autobiographical self-model (ASM), is introduced. An ASM is a collection of mental representations of the system's relations with its past and potential future states. Third, the debate between objectivist and constructivist views of health is considered. I argue for a phenomenological account of health, based on the primacy of illness and negative utilitarianism.
The second part presents a synthesis of the relevant normative issues based on the conceptual tools developed. I argue that memory enhancement can be distinguished from memory treatment by a demarcation regarding the existence of memory-related suffering. That is, memory enhancements are interventions that aim to manipulate memory function in the self-interest of the individual, under standard circumstances and without any unwilled suffering, or potential suffering, resulting from the alteration of memory functions. I then consider the issue of authenticity, namely whether memory intervention or enhancement endangers "one's true self". By analyzing two conceptions of authenticity, authenticity as self-discovery and authenticity as self-creation, I propose that authenticity should be understood in terms of the satisfaction of the functional constraints of an ASM: synchronic coherence, diachronic coherence, and global veridicality. This framework provides clearer criteria for considering the relevant concerns and allows us to examine the moral value of authenticity.

Relevance: 30.00%

Publisher:

Abstract:

This article reviews the psychophysiological and brain imaging literature on emotional brain function from a methodological point of view. The difficulties in defining, operationalising and measuring emotional activation and, in particular, aversive learning will be considered. Emotion is a response of the organism during an episode of major significance and involves physiological activation, motivational, perceptual, evaluative and learning processes, motor expression, action tendencies and monitoring/subjective feelings. Despite the advances in assessing the physiological correlates of emotional perception and learning processes, a critical appraisal shows that functional neuroimaging approaches encounter methodological difficulties regarding measurement precision (e.g., response scaling and reproducibility) and validity (e.g., response specificity, generalisation to other paradigms, subjects or settings). Since emotional processes are not only the result of localised but also of widely distributed activation, a more representative model of assessment is needed that systematically relates the hierarchy of high- and low-level emotion constructs with the corresponding patterns of activity and functional connectivity of the brain.

Relevance: 30.00%

Publisher:

Abstract:

The IDA model of cognition is a fully integrated artificial cognitive system reaching across the full spectrum of cognition, from low-level perception/action to high-level reasoning. Extensively based on empirical data, it accurately reflects the full range of cognitive processes found in natural cognitive systems. As a source of plausible explanations for very many cognitive processes, the IDA model provides an ideal tool to think with about how minds work. This online tutorial offers a reasonably full account of the IDA conceptual model, including background material. It also provides a high-level account of the underlying computational “mechanisms of mind” that constitute the IDA computational model.