961 results for Multifaceted explanation


Relevance: 10.00%

Abstract:

[ES] The use of analogies in bioethics is very common. Since analogies are especially effective instruments from a rhetorical point of view, it is essential to determine under which conditions their formulation constitutes a legitimate discursive resource. In this article we distinguish between non-discursive and discursive uses of analogies and, within the latter, between explanatory and argumentative uses. On the basis of this classification, we propose different sets of criteria for deciding whether or not a particular analogy constitutes a legitimate discursive resource. To this end, we illustrate our classification with examples taken from the recent ethical and legal debate on biobanks.

Relevance: 10.00%

Abstract:

[ES] This study addresses the problem arising from the failure of the large majority of students at the E. U. de Magisterio de Bilbao, future Primary Education (EP) teachers, to apply creatively the knowledge transmitted in the science classroom, that is, to solve problems and to explain everyday phenomena of the world around us. To this end we analyzed, on the one hand, their competence in several science topics included in the EP Área de Conocimiento del Medio, presented in a context of science in everyday life, together with their self-assessment of their didactic preparation to address those topics in school science classrooms, and, on the other hand, the teaching methodology used in the science classes they received at educational stages prior to university.

Relevance: 10.00%

Abstract:

[ES] This article addresses the problem of causality in the economic sciences. Starting from the distinction between hard and soft sciences, and from the presumed placement of economics in the latter group, it analyzes the different ontological paradigms that underpin scientific research and then shifts the question of causality from the ontological to the epistemological domain. It then examines how the hypothetico-deductive methodology combines with the correlational method to yield a probabilistic notion of causality. This causality, framed in scientific terms, makes it possible to formulate and test predictive inferences, through which the economic sciences can find their place at the most rigorous levels of scientific research.

Relevance: 10.00%

Abstract:

[ES] This introductory article to the special issue on innovation of the journal Cuadernos de Gestión consists of two clearly differentiated sections. The first gathers some ideas that may be useful to readers who approach the topic of innovation and encounter a wide conceptual variety. In particular, it attempts to explain, in a more or less comprehensive way, the concept of Open Innovation (OI). OI arises from the recognition that a firm cannot innovate in isolation and therefore always needs to acquire ideas and resources from its external environment.

Relevance: 10.00%

Abstract:

This study addresses the definition of principles and their validity in the legislative process. It begins by examining the evolution of the State and of the Legislative Branch in order to situate their importance and their rules. It then proposes a delimitation of what principles are and which ones matter most to the legislative process, taking the democratic principle (and related principles) and the principle of bicameralism as starting points. After this exposition, it applies the theoretical framework to the passage of the "Ficha Limpa" bill, focusing on the Senate's wording amendment as a possible affront to the principles cited. The discussion is also examined in light of the actions brought before the Judiciary and the courts' responses to the question.

Relevance: 10.00%

Abstract:

This paper considers a time-varying parameter extension of the Ruge-Murcia (2003, 2004) model to explore whether some of the variation in parameter estimates seen in the literature could arise from this source. A time-varying value for the unemployment volatility parameter can be motivated in several ways, including variation in the slope of the Phillips curve or variation in the preferences of the monetary authority. We show that allowing time variation in the coefficient on the unemployment volatility parameter improves the model fit and helps to provide an explanation of inflation bias based on asymmetric central banker preferences that is consistent across subsamples.
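As a rough illustration of how a single time-varying coefficient can be tracked, the sketch below filters a random-walk coefficient in a linear observation equation with a standard scalar Kalman recursion. It is a minimal, self-contained Python example with made-up variable names (y, x) and arbitrary state and noise variances (q, r); it is not the Ruge-Murcia specification or the estimation procedure used in the paper.

```python
import numpy as np

def kalman_tvp(y, x, q=0.01, r=1.0, b0=0.0, p0=10.0):
    """Filter a random-walk coefficient b_t in y_t = b_t * x_t + e_t.

    q: variance of the coefficient's random-walk innovation
    r: variance of the observation noise
    Returns the filtered path of b_t.
    """
    b, p = b0, p0                  # state mean and variance
    path = np.empty(len(y))
    for t, (yt, xt) in enumerate(zip(y, x)):
        p = p + q                  # predict: random-walk state
        s = xt * p * xt + r        # innovation variance
        k = p * xt / s             # Kalman gain
        b = b + k * (yt - xt * b)  # update with the prediction error
        p = (1.0 - k * xt) * p
        path[t] = b
    return path

# Toy usage with simulated data in which the true coefficient drifts over time.
rng = np.random.default_rng(0)
T = 200
b_true = np.cumsum(rng.normal(scale=0.05, size=T)) + 1.0
x = rng.normal(size=T)
y = b_true * x + rng.normal(scale=0.5, size=T)
print(kalman_tvp(y, x)[-5:])       # filtered coefficient near the end of the sample
```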

Relevance: 10.00%

Abstract:

The paper investigates whether the growing GDP share of the services sector can help explain the Great Moderation in the US. We identify and analyze three oil price shocks and use an SVAR analysis to measure their economic impact on the US economy at both the aggregate and the sectoral level. We find mixed support for the explanation of the Great Moderation in terms of shrinking oil shock volatilities and observe that increases (decreases) in oil shock volatilities are counteracted by a weakening (strengthening) of their transmission mechanism. Across sectors, services are the least affected by any oil shock. As the contribution of services to GDP volatility increases over time, we conclude that a composition effect helped moderate the conditional volatility of US GDP to oil shocks.
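For readers unfamiliar with the workflow, the sketch below shows a generic recursively identified VAR in Python using statsmodels, with the impulse response of aggregate output to an oil price shock ordered first. The variable names (oil, gdp, services), the simulated data, and the Cholesky ordering are illustrative assumptions, not the identification scheme or data of this paper.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Illustrative quarterly series: oil price growth, aggregate GDP growth,
# and services output growth (simulated here; real series would be loaded instead).
rng = np.random.default_rng(1)
data = pd.DataFrame(rng.normal(size=(200, 3)), columns=["oil", "gdp", "services"])

model = VAR(data)
res = model.fit(maxlags=4, ic="aic")       # lag length chosen by AIC

# Orthogonalized impulse responses: with "oil" ordered first, the Cholesky
# factor treats oil price innovations as contemporaneously exogenous,
# a common (though restrictive) recursive identification.
irf = res.irf(12)
oil_to_gdp = irf.orth_irfs[:, data.columns.get_loc("gdp"),
                           data.columns.get_loc("oil")]
print(oil_to_gdp)                           # response of GDP to a one-s.d. oil shock
```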

Relevance: 10.00%

Abstract:

The mapping and geospatial analysis of benthic environments are multidisciplinary tasks that have become more accessible in recent years because of advances in technology and cost reductions in survey systems. The complex relationships that exist among physical, biological, and chemical seafloor components require advanced, integrated analysis techniques to enable scientists and others to visualize patterns and, in so doing, allow inferences to be made about benthic processes. Effective mapping, analysis, and visualization of marine habitats are particularly important because the subtidal seafloor environment is not readily viewed directly by eye. Research in benthic environments relies heavily, therefore, on remote sensing techniques to collect effective data.

Because many benthic scientists are not mapping professionals, they may not adequately consider the links between data collection, data analysis, and data visualization. Projects often start with clear goals, but may be hampered by the technical details and skills required for maintaining data quality through the entire process from collection through analysis and presentation. The lack of technical understanding of the entire data handling process can represent a significant impediment to success. While many benthic mapping efforts have detailed their methodology as it relates to the overall scientific goals of a project, only a few published papers and reports focus on the analysis and visualization components (Paton et al. 1997, Weihe et al. 1999, Basu and Saxena 1999, Bruce et al. 1997). In particular, the benthic mapping literature often briefly describes data collection and analysis methods, but fails to provide sufficiently detailed explanation of particular analysis techniques or display methodologies so that others can employ them. In general, such techniques are in large part guided by the data acquisition methods, which can include both aerial and water-based remote sensing methods to map the seafloor without physical disturbance, as well as physical sampling methodologies (e.g., grab or core sampling).

The terms benthic mapping and benthic habitat mapping are often used synonymously to describe seafloor mapping conducted for the purpose of benthic habitat identification. There is a subtle yet important difference, however, between general benthic mapping and benthic habitat mapping. The distinction is important because it dictates the sequential analysis and visualization techniques that are employed following data collection. In this paper general seafloor mapping for identification of regional geologic features and morphology is defined as benthic mapping. Benthic habitat mapping incorporates the regional scale geologic information but also includes higher resolution surveys and analysis of biological communities to identify the biological habitats. In addition, this paper adopts the definition of habitats established by Kostylev et al. (2001) as a “spatially defined area where the physical, chemical, and biological environment is distinctly different from the surrounding environment.” (PDF contains 31 pages)

Relevance: 10.00%

Abstract:

This paper draws together contributions to a scientific table discussion on obesity at the European Science Open Forum 2008, which took place in Barcelona, Spain. Socioeconomic dimensions of global obesity are discussed, including the factors promoting it, the social perceptions surrounding it, and those related to integral public health solutions. The paper argues that although scientific accounts of obesity point to large-scale changes in dietary and physical environments, the media representations of obesity that frame public policy predominantly follow individualistic models of explanation. While the debate at the forum brought together a diversity of views, all the contributors agreed that this was a global issue requiring an equally global response. Furthermore, an integrated ecological model of obesity proposes that, to be effective, policy will need to address not only human health but also planetary health, and that public health and environmental policies therefore coincide.

Relevance: 10.00%

Abstract:

Strong mechanical forces can, obviously, disrupt cell-cell and cell-matrix adhesions; for example, cyclic uniaxial stretch induces instability of cell adhesion, which then causes cells to reorient away from the stretching direction. However, recent experiments have also demonstrated force-dependent adhesion growth (rather than dissociation). To provide a quantitative explanation for these two seemingly contradictory phenomena, a microscopic model that includes both integrin-integrin interaction and integrin-ligand interaction is developed at the molecular level by treating the focal adhesion as an adhesion cluster. The integrin clustering dynamics and integrin-ligand binding dynamics are then simulated within one unified theoretical framework using Monte Carlo simulation. We find that the focal adhesion grows when the traction force exceeds a relatively small threshold value, and that this growth is dominated by the reduction of the local chemical potential energy by the traction force. In contrast, the focal adhesion ruptures when the traction force exceeds a second threshold value, and the rupture is dominated by the breaking of integrin-ligand bonds. Consistent with the experiments, these results suggest a force map for the various responses of cell adhesion to different scales of mechanical force. PMID: 20542514
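To give a flavor of the kind of stochastic bond-cluster simulation described above, the sketch below runs a Gillespie simulation of parallel bonds that share a load, rupture at Bell-model rates, and rebind at a constant rate. It is a generic, minimal sketch with assumed parameter values (k_on, k_off0, f_bell); it does not include the integrin-integrin clustering dynamics of the paper's model.

```python
import numpy as np

def simulate_cluster(n_total=50, force=10.0, k_on=1.0, k_off0=0.1,
                     f_bell=2.0, t_max=200.0, seed=0):
    """Gillespie simulation of a cluster of parallel bonds under a shared load.

    Closed bonds share the total force equally; each closed bond breaks at the
    Bell-model rate k_off0 * exp(f_per_bond / f_bell), and each open bond
    rebinds at rate k_on. Returns the trajectory of closed-bond counts.
    """
    rng = np.random.default_rng(seed)
    n_closed, t = n_total, 0.0
    times, counts = [0.0], [n_closed]
    while t < t_max and n_closed > 0:
        f_per_bond = force / n_closed
        rate_off = n_closed * k_off0 * np.exp(f_per_bond / f_bell)
        rate_on = (n_total - n_closed) * k_on
        total = rate_off + rate_on
        t += rng.exponential(1.0 / total)   # time to the next event
        if rng.random() < rate_off / total:
            n_closed -= 1                   # one bond ruptures
        else:
            n_closed += 1                   # one bond rebinds
        times.append(t)
        counts.append(n_closed)
    return np.array(times), np.array(counts)

# Toy usage: a moderate force keeps the cluster alive; a much larger force
# (try force=200.0) drives it to rupture (n_closed -> 0).
t, n = simulate_cluster(force=10.0)
print(n[-1])
```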

Relevance: 10.00%

Abstract:

This paper investigates the effect of focal points and initial relative position on the outcome of a bargaining process. We conduct two online experiments. In the first experiment we attempt to replicate Güth, Huck and Müller's (2001) results on the relevance of equal splits. In the second experiment, we recover the choices of participants in forty mini-ultimatum games. This design allows us to test whether the equal split, or any other distribution or set of distributions, is salient. Our data provide no support for a focal-point explanation, but we find support for an explanation based on relative position. Our results confirm that there is a norm against hyper-fair offers: proposers are expected to behave selfishly when the unselfish distribution leads to a change in the initial relative position.

Relevance: 10.00%

Abstract:

This annotated bibliography of selected literature on Olney's three-square (Scirpus olneyi Gray) was compiled for two main reasons: 1) to assist a task force in its pursuit of an explanation for the substantial reduction in marsh acreage at the Blackwater National Wildlife Refuge in Dorchester County, Maryland, and 2) to serve as the author's foundation for the initiation of ecological research on this species in partial fulfillment of the requirements for the degree of Doctor of Philosophy in the Botany Department of the University of Maryland. Both purposes are directly related in that the author's research will be of use to the task force, along with its other technical information and research results, in understanding and possibly correcting the marshland loss problem at the Refuge. (PDF contains 100 pages)

Relevance: 10.00%

Abstract:

There are currently two competing models of our universe. One is Big Bang cosmology with inflation. The other is the cyclic model with an ekpyrotic phase in each cycle. This paper is divided into two main parts according to these two models. In the first part, we quantify the potentially observable effects of a small violation of translational invariance during inflation, as characterized by the presence of a preferred point, line, or plane. We explore the imprint such a violation would leave on the cosmic microwave background anisotropy, and provide explicit formulas for the expected amplitudes $\langle a_{lm}a_{l'm'}^*\rangle$ of the spherical-harmonic coefficients. We then provide a model and study the two-point correlation of a massless scalar (the inflaton) when the stress tensor contains the energy density of an infinitely long straight cosmic string in addition to a cosmological constant. Finally, we discuss whether inflation can be reconciled with Liouville's theorem as far as the fine-tuning problem is concerned. In the second part, we identify several problems in the cyclic/ekpyrotic cosmology. First, the quantum-to-classical transition would not happen during an ekpyrotic phase even for superhorizon modes, and therefore the fluctuations cannot be interpreted as classical. This implies that the prediction of a scale-free power spectrum in the ekpyrotic/cyclic universe model requires further inspection. Second, we find that the usual mechanism for solving fine-tuning problems is not compatible with an eternal universe containing infinitely many cycles in both directions of time. Therefore, all fine-tuning problems, including the flatness problem, still call for an explanation in any generic cyclic model.
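For orientation, a minimal baseline helps make the quantity $\langle a_{lm}a_{l'm'}^*\rangle$ concrete: when statistical isotropy and homogeneity hold, the covariance of the spherical-harmonic coefficients is diagonal, and a preferred point, line, or plane would show up as departures from this diagonal form. The relation below is the standard textbook baseline, not the paper's specific formula for the broken-invariance case.

```latex
% Statistically isotropic baseline: the covariance is diagonal in (l, m)
% and fully described by the angular power spectrum C_l.
\langle a_{lm}\, a_{l'm'}^{*} \rangle = C_{l}\, \delta_{l l'}\, \delta_{m m'}
% A violation of translational invariance generically induces
% nonzero off-diagonal terms (l \neq l' or m \neq m').
```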

Relevance: 10.00%

Abstract:

This thesis covers four different problems in the understanding of vortex sheets, and these are presented in four chapters.

In Chapter 1, free-streamline theory is used to determine the steady solutions for an array of identical hollow or stagnant-core vortices in an inviscid, incompressible fluid. Assuming the array is symmetric under rotation through π radians about an axis through any vortex centre, there are two solutions or no solutions depending on whether A^(1/2)/L is less than or greater than 0.38, where A is the area of a vortex and L is the separation distance. Stability analysis shows that the more deformed shape is unstable to infinitesimal symmetric disturbances that leave the centres of the vortices undisplaced.

Chapter 2 is concerned with the roll-up of vortex sheets in a homogeneous fluid. The flow over conventional and ring wings is used to test the method of Fink and Soh (1974). Despite modifications which improve the accuracy of the method, unphysical results occur. A possible explanation is that small scales are important, and an alternative method based on "Cloud-in-Cell" techniques is introduced. The results show small-scale growth and amalgamation into larger structures.
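As a rough illustration of the "Cloud-in-Cell" idea referenced above, the sketch below deposits the circulation of point vortices onto a uniform grid with bilinear (area) weighting, which is the core deposition step of a CIC vortex method. The grid size, spacing, and variable names are illustrative assumptions; this is not the particular scheme used in the thesis.

```python
import numpy as np

def deposit_circulation(x, y, gamma, nx, ny, dx, dy):
    """Bilinear (cloud-in-cell) deposition of point-vortex circulation.

    x, y  : vortex positions (arrays), assumed to lie inside the grid
    gamma : circulation carried by each vortex
    Returns an (nx, ny) array of circulation accumulated at grid nodes.
    """
    grid = np.zeros((nx, ny))
    i = np.floor(x / dx).astype(int)           # index of the cell's lower-left node
    j = np.floor(y / dy).astype(int)
    fx = x / dx - i                            # fractional position inside the cell
    fy = y / dy - j
    # Each vortex contributes to the four surrounding nodes with area weights.
    np.add.at(grid, (i,     j),     gamma * (1 - fx) * (1 - fy))
    np.add.at(grid, (i + 1, j),     gamma * fx       * (1 - fy))
    np.add.at(grid, (i,     j + 1), gamma * (1 - fx) * fy)
    np.add.at(grid, (i + 1, j + 1), gamma * fx       * fy)
    return grid

# Toy usage: ten unit-circulation vortices on a 16 x 16 grid with unit spacing.
rng = np.random.default_rng(2)
xs, ys = rng.uniform(1, 14, size=10), rng.uniform(1, 14, size=10)
g = deposit_circulation(xs, ys, gamma=np.ones(10), nx=16, ny=16, dx=1.0, dy=1.0)
print(g.sum())   # total circulation is conserved by the deposition (≈ 10)
```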

The motion of a buoyant pair of line vortices of opposite circulation is considered in Chapter 3. The density difference between the fluid carried by the vortices and the fluid outside is assumed to be small, so that the Boussinesq approximation may be used. A macroscopic model is developed that shows the formation of a detrainment filament, and this is included as a modification to the model. The results agree well with the numerical solution developed by Hill (1975b) and show that, after an initial slowdown, the vortices begin to accelerate downwards.

Chapter 4 reproduces in full a paper that has already been published (Baker, Barker, Bofah and Saffman (1974)) on the effect of "vortex wandering" on the measurement of velocity profiles of the trailing vortices behind a wing.

Relevance: 10.00%

Abstract:

There is a growing body of experimental evidence suggesting that people often deviate from the predictions of game theory. Some scholars attempt to explain the observations by introducing errors into behavioral models. However, most of these modifications are situation dependent and do not generalize. A new theory, called the rational novice model, is introduced as an attempt to provide a general theory that takes account of erroneous behavior. The rational novice model is based on two central principles. The first is that people systematically make inaccurate guesses when they are evaluating their options in a game-like situation. The second is that people treat their decisions like a portfolio problem. As a result, non-optimal actions in the game-theoretic sense may be included in the rational novice strategy profile with positive weights.
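To make the idea of a strategy profile that places positive weight on non-optimal actions concrete, the sketch below weights each available action by a softmax of a noisy payoff guess, so actions that are not best responses still receive positive probability. This is only an illustration of that general idea, with made-up parameters (noise scale, sensitivity lam); it is not the rational novice model defined in the thesis.

```python
import numpy as np

def noisy_portfolio_weights(payoffs, noise_sd=1.0, lam=1.0, seed=0):
    """Assign positive weights to every action based on a noisy payoff guess.

    payoffs : true payoffs of the available actions
    noise_sd: scale of the guessing error (cf. principle 1: inaccurate guesses)
    lam     : sensitivity of the weights to the guessed payoffs
    Returns a probability vector over actions (cf. principle 2: a portfolio of
    actions rather than a single best response).
    """
    rng = np.random.default_rng(seed)
    guess = payoffs + rng.normal(scale=noise_sd, size=len(payoffs))
    w = np.exp(lam * (guess - guess.max()))   # subtract max for numerical stability
    return w / w.sum()

# Toy usage: three actions; the second has the highest true payoff, but the
# other two still receive positive weight in the resulting profile.
print(noisy_portfolio_weights(np.array([1.0, 3.0, 2.0])))
```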

The rational novice model can be divided into two parts: the behavioral model and the equilibrium concept. In a theoretical chapter, the mathematics of the behavioral model and the equilibrium concept are introduced. The existence of the equilibrium is established. In addition, the Nash equilibrium is shown to be a special case of the rational novice equilibrium. In another chapter, the rational novice model is applied to a voluntary contribution game. Numerical methods were used to obtain the solution. The model is estimated with data obtained from the Palfrey and Prisbrey experimental study of the voluntary contribution game. It is found that the rational novice model explains the data better than the Nash model. Although a formal statistical test was not used, pseudo R^2 analysis indicates that the rational novice model is better than a Probit model similar to the one used in the Palfrey and Prisbrey study.
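For reference, one common pseudo R^2 for discrete-choice models, McFadden's, compares the maximized log-likelihood of the fitted model with that of an intercept-only null model; whether this exact variant is the one used in the comparison above is an assumption here, not something stated in the abstract.

```latex
% McFadden's pseudo R^2 for a discrete-choice model:
% values closer to 1 indicate a larger improvement over the null model.
R^{2}_{\text{McFadden}} = 1 - \frac{\ln \hat{L}_{\text{model}}}{\ln \hat{L}_{\text{null}}}
```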

The rational novice model is also applied to a first price sealed bid auction. Again, computing techniques were used to obtain a numerical solution. The data obtained from the Chen and Plott study were used to estimate the model. The rational novice model outperforms the CRRAM, the primary Nash model studied in the Chen and Plott study. However, the rational novice model is not the best amongst all models. A sophisticated rule-of-thumb, called the SOPAM, offers the best explanation of the data.