885 results for implicit authentication
Abstract:
The author analyzes, in the first two novels written by Pareja Diezcanseco, external elements of the structure of the fictional world supplied by the extrafictional voice (prefaces, epilogues, footnotes). La casa de los locos is a satire of a society in anarchy, which presents the chaotic conduct of the ruling social classes and prefigures the deformed sodality members of Las pequeñas estaturas; of great importance in it is the "liminar" (preliminary note), in which the extrafictional voice presents its poetics of writing: writing can be a weapon of cultural and ideological combat. In La Señorita Ecuador, one of the extrafictional elements, the epilogue, clarifies and justifies the designation of the text as a novel. In both texts these elements constitute an indispensable counterpoint for understanding the meaning and the social criticism implicit in the textual fictions.
Abstract:
Analyses of migrants in general, and of migrants described as ethnic groups in particular, tend to presuppose a kind of "ethnic solidarity" within their social networks. This assumption takes for granted that everyone who self-identifies as "indigenous" possesses and enjoys social capital based on solidarity. This article takes the case of Kichwa-Otavalo migrant traders both to probe the assumption that ethnicity is the source of collective social capital, and hence of solidarity, and to question whether that solidarity operates implicitly within migrant chains and networks, which are often equated with their social capital.
Abstract:
This paper discusses the creation of a European Banking Union. First, we discuss questions of design. We highlight seven fundamental choices that decision makers will need to make: Which EU countries should participate in the banking union? To which categories of banks should it apply? Which institution should be tasked with supervision? Which one should deal with resolution? How centralised should the deposit insurance system be? What kind of fiscal backing would be required? What governance framework and political institutions would be needed? In terms of geographical scope, we see coverage of the euro area by the banking union as necessary, and coverage of additional countries as desirable, even though this would entail important additional economic difficulties. The system should ideally cover all banks within the countries included, in order to prevent major competitive and distributional distortions. Supervisory authority should be granted either to both the ECB and a new agency, or to a new agency alone. National supervisors, acting under the authority of the European supervisor, would be tasked with the supervision of smaller banks in accordance with the subsidiarity principle. A European resolution authority should be established, with the possibility of drawing on ESM resources. A fully centralised deposit insurance system would eventually be desirable, but a system of partial reinsurance may also be envisaged, at least in a first phase. A banking union would require at least implicit European fiscal backing, with significant political authority and legitimacy. Thus, banking union cannot be considered entirely separately from fiscal union and political union. The most difficult challenge of creating a European banking union lies with the short-term steps towards its eventual implementation.
Many banks in the euro area, and especially in the crisis countries, are currently under stress, and the move towards banking union almost certainly has significant distributional implications. Yet it is precisely because banks are under such stress that early and concrete action is needed. An overarching principle for such action is to minimize the cost to taxpayers. The first step should be to create a European supervisor that will anchor the development of the future banking union. In parallel, a capability to quickly assess the true capital position of the system’s most important banks should be created, for which we suggest establishing a temporary European Banking Sector Task Force working together with the European supervisor and other authorities. Ideally, problems identified by this process should be resolved by national authorities; should fiscal capacities prove insufficient, the European level would take over in the country concerned with some national financial participation, or, in an even less likely adverse scenario, in all participating countries at once. This approach would require the passing of emergency legislation in the countries concerned that would give the Task Force the required access to information and, if necessary, further intervention rights. Thus, the principle of fiscal responsibility of respective member states for legacy costs would be preserved to the maximum extent possible, and at the same time, market participants and the public would be reassured that adequate tools are in place to address any eventuality.
Abstract:
Currently many ontologies are available for addressing different domains. However, it is not always possible to deploy such ontologies to support collaborative working, so that their full potential can be exploited to implement intelligent cooperative applications capable of reasoning over a network of context-specific ontologies. The main problem arises from the fact that ontologies are presently created in an isolated way to address specific needs. However, we foresee the need for a network of ontologies which will support the next generation of intelligent applications/devices and the vision of Ambient Intelligence. The main objective of this paper is to motivate the design of a networked ontology (meta)model which formalises ways of connecting available ontologies so that they are easy to search, to characterise and to maintain. The aim is to make explicit the virtual and implicit network of ontologies serving the Semantic Web.
Abstract:
Alternative meshes of the sphere and adaptive mesh refinement could be immensely beneficial for weather and climate forecasts, but it is not clear how mesh refinement should be achieved. A finite-volume model that solves the shallow-water equations on any mesh of the surface of the sphere is presented. The accuracy and cost-effectiveness of four quasi-uniform meshes of the sphere are compared: a cubed sphere, reduced latitude–longitude, hexagonal–icosahedral, and triangular–icosahedral. On some standard shallow-water tests, the hexagonal–icosahedral mesh performs best and the reduced latitude–longitude mesh performs well only when the flow is aligned with the mesh. The inclusion of a refined mesh over a disc-shaped region is achieved using either gradual Delaunay, gradual Voronoi, or abrupt 2:1 block-structured refinement. These refined regions can actually degrade global accuracy, presumably because of changes in wave dispersion where the mesh is highly nonuniform. However, using gradual refinement to resolve a mountain in an otherwise coarse mesh can improve accuracy for the same cost. The model prognostic variables are height and momentum collocated at cell centers, and (to remove grid-scale oscillations of the A grid) the mass flux between cells is advanced from the old momentum using the momentum equation. Quadratic and upwind-biased cubic differencing methods are used as explicit corrections to a fast implicit solution that uses linear differencing.
Abstract:
In the 12th annual Broadbent Lecture at the Annual Conference, Dianne Berry outlined Broadbent’s explicit and implicit influences on psychological science and scientists.
Abstract:
This paper reports three experiments that examine the role of similarity processing in McGeorge and Burton's (1990) incidental learning task. In the experiments subjects performed a distractor task involving four-digit number strings, all of which conformed to a simple hidden rule. They were then given a forced-choice memory test in which they were presented with pairs of strings and were led to believe that one string of each pair had appeared in the prior learning phase. Although this was not the case, one string of each pair did conform to the hidden rule. Experiment 1 showed that, as in the McGeorge and Burton study, subjects were significantly more likely to select test strings that conformed to the hidden rule. However, additional analyses suggested that rather than having implicitly abstracted the rule, subjects may have been selecting strings that were in some way similar to those seen during the learning phase. Experiments 2 and 3 were designed to try to separate out effects due to similarity from those due to implicit rule abstraction. It was found that the results were more consistent with a similarity-based model than implicit rule abstraction per se.
Abstract:
We develop the linearization of a semi-implicit semi-Lagrangian model of the one-dimensional shallow-water equations using two different methods. The usual tangent linear model, formed by linearizing the discrete nonlinear model, is compared with a model formed by first linearizing the continuous nonlinear equations and then discretizing. Both models are shown to perform equally well for finite perturbations. However, the asymptotic behaviour of the two models differs as the perturbation size is reduced. This leads to difficulties in showing that the models are correctly coded using the standard tests. To overcome this difficulty we propose a new method for testing linear models, which we demonstrate both theoretically and numerically. © Crown copyright, 2003. Royal Meteorological Society
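The standard correctness check the abstract refers to can be sketched as a Taylor-style ratio test: perturb the nonlinear model, divide by the tangent-linear prediction, and watch the ratio approach 1 as the perturbation shrinks. The scalar model `M` and its hand-derived linearization `L` below are illustrative stand-ins of our own, not the paper's semi-implicit semi-Lagrangian shallow-water model:

```python
import math

def M(x):
    # toy nonlinear "model"; stands in for one step of a nonlinear integration
    return math.sin(x) + 0.5 * x * x

def L(x, dx):
    # hand-derived tangent linear model of M about the state x
    return (math.cos(x) + x) * dx

x, dx = 0.7, 1.0
ratios = []
for k in range(1, 6):
    eps = 10.0 ** (-k)
    # ratio of the actual nonlinear increment to the tangent-linear prediction
    ratio = (M(x + eps * dx) - M(x)) / (eps * L(x, dx))
    ratios.append(ratio)
```

In exact arithmetic the ratio tends to 1, with the gap shrinking roughly linearly in eps; the paper's point is that in finite precision the asymptotic behaviour of differently constructed linear models can diverge, which is what makes this standard test hard to interpret and motivates their alternative.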
Abstract:
Three experiments examine whether simple pair-wise comparison judgments, involving the “recognition heuristic” (Goldstein & Gigerenzer, 2002), are sensitive to implicit cues to the nature of the comparison required. Experiments 1 & 2 show that participants frequently choose the recognized option of a pair if asked to make “larger” judgments but are significantly less likely to choose the unrecognized option when asked to make “smaller” judgments. Experiment 3 demonstrates that, overall, participants consider recognition to be a more reliable guide to judgments of a magnitude criterion than lack of recognition and that this intuition drives the framing effect. These results support the idea that, when making pair-wise comparison judgments, inferring that the recognized item is large is simpler than inferring that the unrecognized item is small.
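The comparison rule at issue can be sketched as follows. A strict application of the recognition heuristic would choose the recognized option under a "larger" frame and the unrecognized option under a "smaller" frame; this symmetric inversion is precisely what the experiments show participants do not follow. The function and the city names are illustrative, not taken from the paper's materials:

```python
def recognition_heuristic(pair, recognized, frame="larger"):
    """Pair-wise comparison using recognition alone (Goldstein & Gigerenzer, 2002).

    pair: (a, b) option labels; recognized: set of labels the judge recognizes.
    The heuristic applies only when exactly one option is recognized.
    """
    a, b = pair
    known = [x for x in pair if x in recognized]
    if len(known) != 1:
        return None  # not applicable: both or neither option recognized
    rec = known[0]
    other = b if rec == a else a
    # "larger" frame: pick the recognized option; "smaller": pick the other
    return rec if frame == "larger" else other

# hypothetical judge who recognizes only "Berlin"
larger = recognition_heuristic(("Berlin", "Wuppertal"), {"Berlin"}, "larger")
smaller = recognition_heuristic(("Berlin", "Wuppertal"), {"Berlin"}, "smaller")
```

Experiment 3's finding amounts to saying that people weight the first branch (recognition implies large) more heavily than its logical complement (lack of recognition implies small).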
Recategorization and subgroup identification: predicting and preventing threats from common ingroups
Abstract:
Much work has supported the idea that recategorization of ingroups and outgroups into a superordinate category can have beneficial effects for intergroup relations. Recently, however, increases in bias following recategorization have been observed in some contexts. It is argued that such unwanted consequences of recategorization will only be apparent for perceivers who are highly committed to their ingroup subgroups. In Experiments 1 to 3, the authors observed, on both explicit and implicit measures, that an increase in bias following recategorization occurred only for high subgroup identifiers. In Experiment 4, it was found that maintaining the salience of subgroups within a recategorized superordinate group averted this increase in bias for high identifiers and led overall to the lowest levels of bias. These findings are discussed in the context of recent work on the Common Ingroup Identity Model.
Abstract:
We present a highly parallel design for a simple genetic algorithm using a pipeline of systolic arrays. The systolic design provides high throughput and unidirectional pipelining by exploiting the implicit parallelism in the genetic operators. The design is significant because, unlike other hardware genetic algorithms, it is independent of both the fitness function and the particular chromosome length used in a problem. We have designed and simulated a version of the mutation array using Xilinx FPGA tools to investigate the feasibility of hardware implementation. A simple 5-chromosome mutation array occupies 195 CLBs and is capable of performing more than one million mutations per second.
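In software terms, each cell of a mutation array applies the standard GA bit-flip test to one bit as chromosomes stream through the pipeline. A minimal sketch of that operator, with illustrative 5-bit chromosomes and names of our own choosing (the hardware design itself works bit-serially, which this sequential sketch does not capture):

```python
import random

def mutate(chromosome, rate, rng):
    # bit-flip mutation: each bit flips independently with probability `rate`;
    # in the systolic array, one cell performs this test per bit per cycle
    return [bit ^ 1 if rng.random() < rate else bit for bit in chromosome]

rng = random.Random(0)  # seeded for reproducibility
population = [[0, 1, 1, 0, 1], [1, 1, 0, 0, 0]]  # 5-bit chromosomes
mutated = [mutate(c, rate=0.1, rng=rng) for c in population]
```

Note that the operator never inspects fitness and works for any chromosome length, which is the property the abstract highlights as making the hardware design problem-independent.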
Abstract:
Slantwise convective available potential energy (SCAPE) is a measure of the degree to which the atmosphere is unstable to conditional symmetric instability (CSI). It has, until now, been defined by parcel theory in which the atmosphere is assumed to be nonevolving and balanced, that is, two-dimensional. When applying this two-dimensional theory to three-dimensional evolving flows, these assumptions can be interpreted as an implicit assumption that a timescale separation exists between a relatively rapid timescale for slantwise ascent and a slower timescale for the development of the system. An approximate extension of parcel theory to three dimensions is derived and it is shown that calculations of SCAPE based on the assumption of relatively rapid slantwise ascent can be qualitatively in error. For a case study example of a developing extratropical cyclone, calculations of SCAPE along trajectories determined without assuming the existence of the timescale separation show large values for parcels ascending from the warm sector and along the warm front. These parcels ascend into the cloud head, within which there is some evidence consistent with the release of CSI from observational and model cross sections. This region of high SCAPE was not found for calculations along the relatively rapidly ascending trajectories determined by assuming the existence of the timescale separation.
Abstract:
Existing data on animal health and welfare in organic livestock production systems in the European Community countries are reviewed in the light of the demands and challenges of the recently implemented EU regulation on organic livestock production. The main conclusions and recommendations of a three-year networking project on organic livestock production are summarised and the future challenges to organic livestock production in terms of welfare and health management are discussed. The authors conclude that, whilst the available data are limited and the implementation of the EC regulation is relatively recent, there is little evidence to suggest that organic livestock management causes major threats to animal health and welfare in comparison with conventional systems. There are, however, some well-identified areas, like parasite control and balanced ration formulation, where efforts are needed to find solutions that meet organic standard requirements and guarantee high levels of health and welfare. It is suggested that, whilst organic standards offer an implicit framework for animal health and welfare management, there is a need to resolve apparent conflicts between the organic farming objectives in regard to environment, public health, farmer income, and animal health and welfare. The key challenges for the future of organic livestock production in Europe are related to the feasibility of implementing improved husbandry inputs and the development of evidence-based decision support systems for health and feeding management.
Abstract:
The complexity of rural economies in developing countries is increasingly recognised, as is the need to tailor poverty reduction policies according to the diversity of rural households and their requirements. By reference to a village in Western India, the paper examines the results of a longitudinal micro-level research approach, employed for the study of livelihood diversification and use of informal finance. Over a 25-year period, livelihoods are shown to have become more complex, in terms of location, types of non-farm activities, and combinations of activities. Moreover, livelihood pathways taken continue to be critically affected by economic and social inequalities implicit in the caste system and tribal economy. A longitudinal micro-level research approach is shown to be one that can effectively identify the many complexities of rural livelihoods and the continued dependence on the informal financial sector, providing important insights into the requirements for rural financial products and services.
Abstract:
Diebold and Lamb (1997) argue that since the long-run elasticity of supply derived from the Nerlovian model entails a ratio of random variables, it is without moments. They propose minimum expected loss estimation to correct this problem but in so doing ignore the fact that a non-white-noise error is implicit in the model. We show that, as a consequence, the estimator is biased, and demonstrate that Bayesian estimation, which fully accounts for the error structure, is preferable.
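To make the moments objection concrete, here is the standard Nerlovian partial-adjustment setup in notation of our own (not necessarily Diebold and Lamb's): the long-run elasticity is a ratio of estimated coefficients, and a ratio whose denominator has positive density near zero has no finite moments, the Cauchy distribution being the canonical case.

```latex
% Nerlovian partial-adjustment supply model (illustrative notation)
q_t = \alpha + \beta p_t + \gamma q_{t-1} + \varepsilon_t ,
\qquad \text{long-run elasticity } \theta = \frac{\beta}{1-\gamma} .
% The natural estimator \hat{\theta} = \hat{\beta} / (1 - \hat{\gamma}) is a
% ratio of (asymptotically normal, correlated) estimators. Because
% 1 - \hat{\gamma} has positive density in every neighbourhood of zero,
% E\,|\hat{\theta}| = \infty: as with the Cauchy distribution (the ratio of
% two independent standard normals), no moments of any order exist.
```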