989 results for Empirical processes


Relevance:

30.00%

Publisher:

Abstract:

This study examines the notion of object permanence during the first year of life, taking into account the controversy between two approaches to the nature of change: developmental change and cognitive change. Using a longitudinal/cross-sectional design, tasks adapted from the object permanence and operative causality subscales of the Uzgiris-Hunt Scale (Uzgiris & Hunt, 1975) were presented to 110 infants aged 0, 3, 6 and 9 months residing in three cities in Colombia. The results showed three types of strategies: (a) non-resolution, (b) exploratory and (c) resolution, which follow different trajectories in children’s performance. This supports the claim that the adaptive achievements of cognitive development go hand in hand with a variety of strategies. Strategy use reveals adjustments and transformations of action programs that consolidate the notion of object permanence not necessarily with age, but through self-regulatory processes. This empirical evidence contributes to the understanding of the relations between the emergence of novelty in development and performance variability.

Relevance:

30.00%

Publisher:

Abstract:

We argue that impulsiveness is characterized by compromised timing functions such as premature motor timing, decreased tolerance to delays, poor temporal foresight and steeper temporal discounting. A model illustration of the association between impulsiveness and timing deficits is the impulsiveness disorder of attention-deficit hyperactivity disorder (ADHD). Children with ADHD have deficits in timing processes across several temporal domains, and the neural substrates of these compromised timing functions are strikingly similar to the neuropathology of ADHD. We review our published data and present novel functional magnetic resonance imaging data to demonstrate that children with ADHD show dysfunctions in key timing regions of prefrontal, cingulate, striatal and cerebellar location during temporal processes across several time domains, including time discrimination of milliseconds, motor timing to seconds and temporal discounting of longer time intervals. Given that impulsiveness, timing abnormalities and, more specifically, ADHD have been related to dopamine dysregulation, we tested for and demonstrated a normalization effect on all brain dysfunctions in children with ADHD during time discrimination under the treatment of choice, the dopamine reuptake inhibitor methylphenidate. This review, together with the new empirical findings, demonstrates that neurocognitive dysfunctions in temporal processes are crucial to the impulsiveness disorder of ADHD and provides first evidence for normalization with a dopamine reuptake inhibitor.

Relevance:

30.00%

Publisher:

Abstract:

Space weather effects on technological systems originate with energy carried from the Sun to the terrestrial environment by the solar wind. In this study, we present results of modeling of solar corona-heliosphere processes to predict solar wind conditions at the L1 Lagrangian point upstream of Earth. In particular, we calculate performance metrics for (1) empirical, (2) hybrid empirical/physics-based, and (3) full physics-based coupled corona-heliosphere models over an 8-year period (1995–2002). L1 measurements of the radial solar wind speed are the primary basis for validation of the coronal and heliosphere models studied, though other solar wind parameters are also considered. The models are from the Center for Integrated Space-Weather Modeling (CISM), which has developed a coupled model of the whole Sun-to-Earth system, from the solar photosphere to the terrestrial thermosphere. Simple point-by-point analysis techniques, such as mean-square-error and correlation coefficients, indicate that the empirical coronal-heliosphere model currently gives the best forecast of solar wind speed at 1 AU. A more detailed analysis shows that errors in the physics-based models are predominantly the result of small timing offsets to solar wind structures and that the large-scale features of the solar wind are actually well modeled. We suggest that additional “tuning” of the coupling between the coronal and heliosphere models could lead to a significant improvement of their accuracy. Furthermore, we note that the physics-based models accurately capture dynamic effects at solar wind stream interaction regions, such as magnetic field compression, flow deflection, and density buildup, which the empirical scheme cannot.
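The point-by-point validation metrics named in the abstract, mean-square error and the correlation coefficient, can be sketched as follows; the sample values below are hypothetical illustrations, not actual CISM output or L1 measurements.

```python
import math

def mean_square_error(model, obs):
    """Mean-square error between modeled and observed values."""
    return sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs)

def correlation(model, obs):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(obs)
    mm, mo = sum(model) / n, sum(obs) / n
    cov = sum((m - mm) * (o - mo) for m, o in zip(model, obs))
    sm = math.sqrt(sum((m - mm) ** 2 for m in model))
    so = math.sqrt(sum((o - mo) ** 2 for o in obs))
    return cov / (sm * so)

# Hypothetical daily radial solar wind speeds (km/s) at L1
obs   = [420, 450, 510, 600, 550, 480, 440]
model = [430, 445, 490, 580, 570, 500, 450]
print(mean_square_error(model, obs), correlation(model, obs))
```

As the abstract notes, such point-by-point scores penalise small timing offsets heavily, which is why the more detailed event-based analysis paints a kinder picture of the physics-based models.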

Relevance:

30.00%

Publisher:

Abstract:

Managing ecosystems to ensure the provision of multiple ecosystem services is a key challenge for applied ecology. Functional traits are receiving increasing attention as the main ecological attributes by which different organisms and biological communities influence ecosystem services through their effects on underlying ecosystem processes. Here we synthesize concepts and empirical evidence on linkages between functional traits and ecosystem services across different trophic levels. Most of the 247 studies reviewed considered plants and soil invertebrates, but quantitative trait–service associations have been documented for a range of organisms and ecosystems, illustrating the wide applicability of the trait approach. Within each trophic level, specific processes are affected by a combination of traits while particular key traits are simultaneously involved in the control of multiple processes. These multiple associations between traits and ecosystem processes can help to identify predictable trait–service clusters that depend on several trophic levels, such as clusters of traits of plants and soil organisms that underlie nutrient cycling, herbivory, and fodder and fibre production. We propose that the assessment of trait–service clusters will represent a crucial step in ecosystem service monitoring and in balancing the delivery of multiple, and sometimes conflicting, services in ecosystem management.

Relevance:

30.00%

Publisher:

Abstract:

Many different individuals, each with their own expertise and criteria for decision making, are involved in making decisions on construction projects. Decision-making processes are thus significantly affected by communication, in which the dynamic interplay of human intentions leads to unpredictable outcomes. In order to theorise decision-making processes that include communication, it is argued here that these processes resemble evolutionary dynamics in terms of both selection and mutation, which can be expressed by the replicator-mutator equation. To support this argument, a mathematical model of decision making has been constructed by analogy with evolutionary dynamics, with three variables: initial support rate, business hierarchy, and power of persuasion. In parallel, a survey of patterns of decision making in construction projects has been performed through a self-administered mail questionnaire sent to construction practitioners. Comparison between the numerical analysis of the mathematical model and the statistical analysis of the empirical data has shown the significant potential of the replicator-mutator equation as a tool for studying the dynamic properties of intentions in communication.
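The replicator-mutator dynamics invoked above can be sketched numerically. The fitness values and mutation matrix below are hypothetical illustrations, not the paper's calibrated model; the mapping of "power of persuasion" onto fitness is an assumption made here for the sketch.

```python
def replicator_mutator_step(x, f, Q, dt=0.01):
    """One Euler step of the replicator-mutator equation
    x_i' = sum_j x_j * f_j * Q[j][i] - phi * x_i,
    where phi = sum_j x_j * f_j is the mean fitness.
    Rows of Q sum to 1, so sum(x) stays equal to 1."""
    phi = sum(xj * fj for xj, fj in zip(x, f))
    n = len(x)
    dx = [sum(x[j] * f[j] * Q[j][i] for j in range(n)) - phi * x[i]
          for i in range(n)]
    return [xi + dt * dxi for xi, dxi in zip(x, dx)]

# Hypothetical three-option decision: x = initial support rates,
# f = power of persuasion per option, Q[j][i] = probability that a
# supporter of option j ends up advocating option i ("mutation").
x = [0.5, 0.3, 0.2]
f = [1.0, 1.5, 1.2]
Q = [[0.90, 0.05, 0.05],
     [0.05, 0.90, 0.05],
     [0.05, 0.05, 0.90]]
for _ in range(5000):
    x = replicator_mutator_step(x, f, Q)
print(x)  # the most persuasive option (index 1) ends up dominating
```

The selection term rewards persuasive options while the mutation matrix lets intentions drift between options, which is the analogy the abstract draws with communication in project teams.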

Relevance:

30.00%

Publisher:

Abstract:

The paper draws on three case studies of regional construction firms operating in the UK. The case studies provide new insights into the ways in which such firms strive to remain competitive. Empirical data were derived from multiple interactions with senior personnel within each firm. Data collection methods included semi-structured interviews, informal interactions, archival research, and workshops. The initial research question was informed by existing resource-based theories of competitiveness and an extensive review of construction-specific literature. However, subsequent emergent empirical findings progressively pointed towards the need to mobilise alternative theoretical models that emphasise localised learning and embeddedness. The findings point towards the importance of de-centralised structures that enable multiple business units to become embedded within localised markets. A significant degree of autonomy is essential to facilitate entrepreneurial behaviour. In essence, sustained competitiveness was found to rest on the way de-centralised business units enact ongoing processes of localised learning. Once local business units have become embedded within localised markets, the essential challenge is how to encourage continued entrepreneurial behaviour while maintaining some degree of centralised control and coordination. This presents a number of tensions and challenges which play out differently across each of the three case studies.

Relevance:

30.00%

Publisher:

Abstract:

This study was an attempt to identify the epistemological roots of knowledge when students carry out hands-on experiments in physics. We found that, within the context of designing a solution to a stated problem, subjects constructed and ran thought experiments intertwined with the process of conducting physical experiments. We show that alternating between these two modes, empirical experimentation and experimentation in thought, leads towards a convergence on scientifically acceptable concepts. We call this process mutual projection. In the process of mutual projection, external representations were generated. Objects in the physical environment were represented in an imaginary world, and these representations were associated with processes in the physical world. It is through this coupling that constituents of both the imaginary world and the physical world gain meaning. We further show that the external representations are rooted in sensory interaction and constitute a semi-symbolic pictorial communication system, a sort of primitive 'language', which is developed as the practical work continues. The constituents of this pictorial communication system are used in the thought experiments taking place in association with the empirical experimentation. The results of this study provide a model of physics learning during hands-on experimentation.

Relevance:

30.00%

Publisher:

Abstract:

We perturb the SC, BCC, and FCC crystal structures with a spatial Gaussian noise whose dimensionless strength is controlled by the parameter a, and analyze the topological and metrical properties of the resulting Voronoi Tessellations (VT). The topological properties of the VT of the SC and FCC crystals are unstable with respect to the introduction of noise, because the corresponding polyhedra are geometrically degenerate, whereas the tessellation of the BCC crystal is topologically stable even against noise of small but finite intensity. For weak noise, the mean area of the cells of the perturbed BCC and FCC VT increases quadratically with a. In the case of perturbed SC crystals, there is an optimal amount of noise that minimizes the mean area of the cells. Already for moderate noise (a>0.5), the properties of the three perturbed VT are indistinguishable, and for intense noise (a>2), the results converge to the Poisson-VT limit. Notably, two-parameter gamma distributions are an excellent model for the empirical distributions of all considered properties. The VT of the perturbed BCC and FCC structures are local maxima for the isoperimetric quotient, which measures the degree of sphericity of the cells, among space-filling VT. In the BCC case, this suggests a weaker form of the recently disproved Kelvin conjecture. Due to fluctuations in the shape of the cells, anomalous scaling with exponents >3/2 is observed between the areas and the volumes of the cells and, except for the FCC case, also in the limit a -> 0. In the Poisson-VT limit, the exponent is about 1.67. As the number of faces is positively correlated with the sphericity of the cells, the anomalous scaling is heavily reduced when we perform power-law fits separately on cells with a specific number of faces.
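Fitting the two-parameter gamma distribution mentioned above can be done by the method of moments, shape k = mean²/variance and scale θ = variance/mean. The cell-volume sample below is hypothetical, not data from the paper, and a maximum-likelihood fit would be the more rigorous choice.

```python
def fit_gamma_moments(samples):
    """Method-of-moments fit of a two-parameter gamma distribution:
    shape k = mean^2 / variance, scale theta = variance / mean,
    so that k * theta recovers the sample mean."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / (n - 1)
    return mean ** 2 / var, var / mean

# Hypothetical Voronoi cell volumes (arbitrary units)
volumes = [0.92, 1.05, 0.98, 1.10, 0.85, 1.02, 0.97, 1.08, 0.94, 1.01]
k, theta = fit_gamma_moments(volumes)
print(k, theta)
```

For a narrowly peaked sample like a weakly perturbed crystal, the fitted shape parameter k is large, consistent with the near-degenerate distributions described for small a.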

Relevance:

30.00%

Publisher:

Abstract:

The time-mean quasi-geostrophic potential vorticity equation of the atmospheric flow on isobaric surfaces can explicitly include an atmospheric (internal) forcing term of the stationary-eddy flow. In fact, neglecting some non-linear terms in this equation, this forcing can be mathematically expressed as a single function, called the Empirical Forcing Function (EFF), which is equal to the material derivative of the time-mean potential vorticity. Furthermore, the EFF can be decomposed into a sum of seven components, each representing a forcing mechanism of a different nature. These mechanisms include diabatic components associated with the radiative forcing, latent heat release and frictional dissipation, and components related to transient eddy transports of heat and momentum. All these factors quantify the role of the transient eddies in forcing the atmospheric circulation. In order to assess the relevance of the EFF in diagnosing large-scale anomalies in the atmospheric circulation, the relationship between the EFF and the occurrence of strong ridges over the Eastern North Atlantic, which are often precursors of severe droughts over Western Iberia, is analyzed. For such events, the EFF pattern depicts a clear dipolar structure over the North Atlantic; cyclonic (anticyclonic) forcing of potential vorticity is found upstream (downstream) of the anomalously strong ridges. Results also show that the most significant components are related to the diabatic processes. Lastly, these results highlight the relevance of the EFF in diagnosing large-scale anomalies, also providing some insight into their interaction with different physical mechanisms.
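Schematically, and with notation assumed here rather than taken from the paper itself, the definition and decomposition described above can be written as:

```latex
% EFF: material derivative of the time-mean potential vorticity \bar{q}
% following the time-mean flow \bar{\mathbf{v}}, decomposed into seven
% forcing components F_k (radiative, latent-heat and frictional diabatic
% terms plus transient-eddy heat and momentum terms; grouping assumed).
\mathrm{EFF} \;\equiv\; \frac{\overline{D}\,\bar{q}}{Dt}
\;=\; \bar{\mathbf{v}}\cdot\nabla\bar{q}
\;=\; \sum_{k=1}^{7} F_k
```

The advective form follows because the local tendency of a time-mean field vanishes over the averaging period.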

Relevance:

30.00%

Publisher:

Abstract:

It is widely accepted that some of the most accurate Value-at-Risk (VaR) estimates are based on an appropriately specified GARCH process. But when the forecast horizon is greater than the frequency of the GARCH model, such predictions have typically required time-consuming simulations of the aggregated returns distributions. This paper shows that fast, quasi-analytic GARCH VaR calculations can be based on new formulae for the first four moments of aggregated GARCH returns. Our extensive empirical study compares the Cornish–Fisher expansion with the Johnson SU distribution for fitting distributions to analytic moments of normal and Student t, symmetric and asymmetric (GJR) GARCH processes to returns data on different financial assets, for the purpose of deriving accurate GARCH VaR forecasts over multiple horizons and significance levels.
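The Cornish-Fisher expansion named above adjusts a normal quantile using the distribution's skewness and excess kurtosis; a minimal sketch follows. The moment values are hypothetical placeholders, not the analytic GARCH moments derived in the paper.

```python
def cornish_fisher_var(mu, sigma, skew, exkurt, z):
    """Quantile-based VaR from the first four moments, using the
    Cornish-Fisher expansion of the alpha-quantile. z is the
    standard-normal quantile for the chosen significance level;
    skew and exkurt are skewness and excess kurtosis."""
    z_cf = (z
            + (z ** 2 - 1) * skew / 6
            + (z ** 3 - 3 * z) * exkurt / 24
            - (2 * z ** 3 - 5 * z) * skew ** 2 / 36)
    return -(mu + sigma * z_cf)  # VaR reported as a positive loss

# Hypothetical moments of aggregated multi-day GARCH returns
z_1pct = -2.326  # approximate 1% standard-normal quantile
var_normal = cornish_fisher_var(0.001, 0.05, 0.0, 0.0, z_1pct)
var_skewed = cornish_fisher_var(0.001, 0.05, -0.6, 1.8, z_1pct)
print(var_normal, var_skewed)  # negative skew and fat tails raise the VaR
```

With zero skewness and excess kurtosis the formula collapses to the usual normal VaR, which makes the role of the higher moments easy to isolate.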

Relevance:

30.00%

Publisher:

Abstract:

What is the relation between competition and performance? The present research addresses this important multidisciplinary question by conducting a meta-analysis of existing empirical work and by proposing a new conceptual model—the opposing processes model of competition and performance. This model was tested by conducting an additional meta-analysis and 3 new empirical studies. The first meta-analysis revealed that there is no noteworthy relation between competition and performance. The second meta-analysis showed, in accord with the opposing processes model, that the absence of a direct effect is the result of inconsistent mediation via achievement goals: Competition prompts performance-approach goals which, in turn, facilitate performance; and competition also prompts performance-avoidance goals which, in turn, undermine performance. These same direct and mediational findings were also observed in the 3 new empirical studies (using 3 different conceptualizations of competition and attending to numerous control variables). Our findings provide both interpretational clarity regarding past research and conceptual guidance regarding future research on the competition–performance relation.

Relevance:

30.00%

Publisher:

Abstract:

During the last 30 years, significant debate has taken place regarding multilevel research. However, the extent to which multilevel research is overtly practiced remains to be examined. This article analyzes 10 years of organizational research within a multilevel framework (from 2001 to 2011). The goals of this article are (a) to understand what has been done, during this decade, in the field of organizational multilevel research and (b) to suggest new arenas of research for the next decade. A total of 132 articles were selected for analysis through ISI Web of Knowledge. Through a broad-based literature review, the results suggest that there is a balance between the number of empirical and conceptual papers on multilevel research, with most studies addressing the cross-level dynamics between teams and individuals. In addition, this study also found that time still has little presence in organizational multilevel research. Implications, limitations, and future directions are addressed at the end. Organizations are made of interacting layers. That is, between layers (such as divisions, departments, teams, and individuals) there is often some degree of interdependence that leads to bottom-up and top-down influence mechanisms. Teams and organizations are contexts for the development of individual cognitions, attitudes, and behaviors (top-down effects; Kozlowski & Klein, 2000). Conversely, individual cognitions, attitudes, and behaviors can also influence the functioning and outcomes of teams and organizations (bottom-up effects; Arrow, McGrath, & Berdahl, 2000). For example, an organization's reward system may influence employees' intention to quit and the presence or absence of extra-role behaviors. At the same time, many studies have shown the importance of bottom-up emergent processes that yield higher level phenomena (Bashshur, Hernández, & González-Romá, 2011; Katz-Navon & Erez, 2005; Marques-Quinteiro, Curral, Passos, & Lewis, in press).
For example, the affectivity of individual employees may influence their team’s interactions and outcomes (Costa, Passos, & Bakker, 2012). Several authors agree that organizations must be understood as multilevel systems, meaning that adopting a multilevel perspective is fundamental to understand real-world phenomena (Kozlowski & Klein, 2000). However, whether this agreement is reflected in practicing multilevel research seems to be less clear. In fact, how much is known about the quantity and quality of multilevel research done in the last decade? The aim of this study is to compare what has been proposed theoretically, concerning the importance of multilevel research, with what has really been empirically studied and published. First, this article outlines a review of the multilevel theory, followed by what has been theoretically “put forward” by researchers. Second, this article presents what has really been “practiced” based on the results of a review of multilevel studies published from 2001 to 2011 in business and management journals. Finally, some barriers and challenges to true multilevel research are suggested. This study contributes to multilevel research as it describes the last 10 years of research. It quantitatively depicts the type of articles being written, and where we can find the majority of the publications on empirical and conceptual work related to multilevel thinking.

Relevance:

30.00%

Publisher:

Abstract:

We develop a process-based model for the dispersion of a passive scalar in the turbulent flow around the buildings of a city centre. The street network model is based on dividing the airspace of the streets and intersections into boxes, within which the turbulence renders the air well mixed. Mean flow advection through the network of street and intersection boxes then mediates further lateral dispersion. At the same time turbulent mixing in the vertical detrains scalar from the streets and intersections into the turbulent boundary layer above the buildings. When the geometry is regular, the street network model has an analytical solution that describes the variation in concentration in a near-field downwind of a single source, where the majority of scalar lies below roof level. The power of the analytical solution is that it demonstrates how the concentration is determined by only three parameters. The plume direction parameter describes the branching of scalar at the street intersections and hence determines the direction of the plume centreline, which may be very different from the above-roof wind direction. The transmission parameter determines the distance travelled before the majority of scalar is detrained into the atmospheric boundary layer above roof level and conventional atmospheric turbulence takes over as the dominant mixing process. Finally, a normalised source strength multiplies this pattern of concentration. This analytical solution converges to a Gaussian plume after a large number of intersections have been traversed, providing theoretical justification for previous studies that have developed empirical fits to Gaussian plume models. 
The analytical solution is shown to compare well with very high-resolution simulations and with wind tunnel experiments, although re-entrainment of scalar previously detrained into the boundary layer above roofs, which is not accounted for in the analytical solution, is shown to become an important process further downwind from the source.
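The three-parameter structure described above can be illustrated with a toy calculation. This is an assumption-laden sketch, not the paper's actual analytical solution: scalar is assumed to split at each intersection with probability p (plume direction parameter) and to survive detrainment with fraction t per street (transmission parameter), which yields a binomial across-plume profile that tends to a Gaussian after many intersections.

```python
from math import comb

def street_network_concentration(n_steps, p, t, source=1.0):
    """Across-plume concentration profile after n_steps intersections
    on a regular grid. p = branching (plume direction) parameter,
    t = fraction NOT detrained per street (transmission parameter),
    source = normalised source strength. Element k is the scalar that
    turned k times out of n_steps: a binomial profile."""
    return [source * (t ** n_steps) * comb(n_steps, k)
            * (p ** k) * ((1 - p) ** (n_steps - k))
            for k in range(n_steps + 1)]

# Hypothetical parameters: equal branching, 10% detrained per street
profile = street_network_concentration(20, 0.5, 0.9)
print(profile.index(max(profile)))  # plume centreline at k = 10
```

By the de Moivre-Laplace theorem this binomial profile approaches a Gaussian as the number of intersections grows, mirroring the convergence to a Gaussian plume stated in the abstract; the total below-roof scalar decays as t**n_steps, which is the role of the transmission parameter.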

Relevance:

30.00%

Publisher:

Abstract:

Empirical Mode Decomposition is presented as an alternative to traditional analysis methods for decomposing geomagnetic time series into spectral components. Important comments on the algorithm and its variations are given. Using this technique, planetary wave modes of 5-, 10-, and 16-day mean periods can be extracted from the magnetic field components of three different stations in Germany. In a second step, the amplitude modulation functions of these wave modes are shown to contain a significant contribution from solar cycle variation, through correlation with smoothed sunspot numbers. Additionally, the data indicate connections with geomagnetic jerk occurrences, supported by a second data set providing reconstructed near-Earth magnetic field values for 150 years. Since geomagnetic jerks are usually attributed to internal dynamo processes within the Earth's outer core, the question of which phenomenon drives which is briefly discussed here.
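The core of Empirical Mode Decomposition is the sifting loop: subtract the mean of the upper and lower extremal envelopes until an intrinsic mode function (IMF) remains. The sketch below uses piecewise-linear envelopes and a fixed iteration count for brevity, whereas the standard algorithm uses cubic splines and a stopping criterion; the test signal is hypothetical, not geomagnetic data.

```python
import math

def _extrema(x):
    """Indices of interior local maxima and minima."""
    maxima, minima = [], []
    for i in range(1, len(x) - 1):
        if x[i - 1] < x[i] > x[i + 1]:
            maxima.append(i)
        elif x[i - 1] > x[i] < x[i + 1]:
            minima.append(i)
    return maxima, minima

def _envelope(idx, x, n):
    """Piecewise-linear envelope through (i, x[i]) for i in idx,
    held flat before the first and after the last extremum."""
    pts = [(0, x[idx[0]])] + [(i, x[i]) for i in idx] + [(n - 1, x[idx[-1]])]
    env = [0.0] * n
    for (i0, v0), (i1, v1) in zip(pts, pts[1:]):
        for i in range(i0, i1 + 1):
            w = 0.0 if i1 == i0 else (i - i0) / (i1 - i0)
            env[i] = v0 + w * (v1 - v0)
    return env

def sift(x, n_iter=8):
    """Extract the first IMF by repeatedly subtracting the envelope mean."""
    h = list(x)
    for _ in range(n_iter):
        maxima, minima = _extrema(h)
        if len(maxima) < 2 or len(minima) < 2:
            break
        upper = _envelope(maxima, h, len(h))
        lower = _envelope(minima, h, len(h))
        h = [v - (u + l) / 2 for v, u, l in zip(h, upper, lower)]
    return h

# Hypothetical series: a fast oscillation riding on a slower one
n = 200
signal = [math.sin(2 * math.pi * 10 * i / n) + 0.5 * math.sin(2 * math.pi * i / n)
          for i in range(n)]
imf1 = sift(signal)                               # approximates the fast mode
residue = [s - v for s, v in zip(signal, imf1)]   # approximates the slow mode
```

Repeating the procedure on the residue yields the subsequent IMFs; for geomagnetic series these would correspond to the extracted planetary wave modes.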

Relevance:

30.00%

Publisher:

Abstract:

Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a simple empirical system based on multiple linear regression for producing probabilistic forecasts of seasonal surface air temperature and precipitation across the globe. The global CO2-equivalent concentration is taken as the primary predictor; subsequent predictors, including large-scale modes of variability in the climate system and local-scale information, are selected on the basis of their physical relationship with the predictand. The focus given to the climate change signal as a source of skill and the probabilistic nature of the forecasts produced constitute a novel approach to global empirical prediction. Hindcasts for the period 1961–2013 are validated against observations using deterministic (correlation of seasonal means) and probabilistic (continuous ranked probability skill score) metrics. Good skill is found in many regions, particularly for surface air temperature and most notably in much of Europe during the spring and summer seasons. For precipitation, skill is generally limited to regions with known El Niño–Southern Oscillation (ENSO) teleconnections. The system is used in a quasi-operational framework to generate empirical seasonal forecasts on a monthly basis.
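The regression idea behind the system, with the CO2-equivalent concentration as the primary predictor, can be sketched with ordinary least squares on a single predictor. All numbers below are hypothetical training pairs invented for illustration, not the hindcast data of the study, and the real system adds further predictors and a probabilistic dressing.

```python
def linear_fit(x, y):
    """Ordinary least squares for y = a + b*x (single predictor)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

# Hypothetical training data: CO2-equivalent concentration (ppm) as the
# primary predictor of a seasonal-mean temperature anomaly (K)
co2  = [340, 350, 360, 370, 380, 390, 400]
temp = [-0.1, 0.0, 0.1, 0.25, 0.3, 0.45, 0.5]
a, b = linear_fit(co2, temp)
forecast = a + b * 410  # out-of-sample forecast for a hypothetical season
print(b, forecast)
```

Because the predictor trends upward with time, this simple scheme encodes the climate change signal that the abstract identifies as a key source of skill; the additional modes of variability would enter as further regression terms.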