889 results for Information storage and retrieval systems.


Relevance: 100.00%

Abstract:

This in vitro study evaluated the microtensile bond strength of a resin composite to Er:YAG-prepared dentin after long-term water storage and thermocycling. Eighty bovine incisors were selected and their roots removed. The crowns were ground to expose superficial dentin. The samples were randomly divided according to the cavity preparation method (I: Er:YAG laser; II: carbide bur). An etch-and-rinse adhesive system was then applied and the samples were restored with a resin composite. The samples were subdivided according to the water storage (WS) time and number of thermocycles (TC) performed: A) 24 hours WS, no TC; B) 7 days WS, 500 TC; C) 1 month WS, 2,000 TC; D) 6 months WS, 12,000 TC. The teeth were sectioned into sticks with a cross-sectional area of 1.0 mm², which were loaded in tension in a universal testing machine. The data were subjected to two-way ANOVA and the Scheffé and Fisher's tests at a 5% significance level. In general, the bur-prepared group displayed higher microtensile bond strength values than the laser-treated group. After one month of water storage and 2,000 thermocycles, the performance of the tested adhesive system on Er:YAG-laser-irradiated dentin was negatively affected (Group IC), whereas adhesion in the bur-prepared group decreased only after six months of water storage combined with 12,000 thermocycles (Group IID). It may be concluded that adhesion to the Er:YAG laser cavity preparation was more affected by the methods used to simulate degradation of the adhesive interface.

Relevance: 100.00%

Abstract:

This presentation was offered as part of the CUNY Library Assessment Conference, Reinventing Libraries: Reinventing Assessment, held at the City University of New York in June 2014.

Relevance: 100.00%

Abstract:

The paper investigates which of Shannon’s measures (entropy, conditional entropy, mutual information) is the right one for the task of quantifying information flow in a programming language. We examine earlier relevant contributions from Denning, McLean and Gray and we propose and motivate a specific quantitative definition of information flow. We prove results relating equivalence relations, interference of program variables, independence of random variables and the flow of confidential information. Finally, we show how, in our setting, Shannon’s Perfect Secrecy theorem provides a sufficient condition to determine whether a program leaks confidential information.
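The three Shannon measures the paper weighs against each other are straightforward to compute for discrete distributions. As a minimal sketch (the distributions and the leaking program below are illustrative, not taken from the paper), the mutual information between a secret input and a program's public output quantifies how many bits the output leaks:

```python
import math
from collections import Counter

def entropy(dist):
    """Shannon entropy H(X) in bits of a distribution {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), from a joint distribution {(x, y): p}."""
    px, py = Counter(), Counter()
    for (x, y), p in joint.items():
        px[x] += p
        py[y] += p
    return entropy(px) + entropy(py) - entropy(joint)

# Hypothetical example: the secret h is a uniform 2-bit value and the
# program publishes l = h % 2, so the output leaks exactly one bit.
joint = {}
for h in range(4):
    l = h % 2
    joint[(h, l)] = joint.get((h, l), 0.0) + 0.25

leak = mutual_information(joint)  # I(secret; output) = 1 bit
```

A program with no flow of confidential information would give a mutual information of zero, which is the sense in which Shannon's Perfect Secrecy condition appears as a sufficient criterion for non-leakage.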

Relevance: 100.00%

Abstract:

In the past few years, libraries have started to design public programs that educate patrons about different tools and techniques to protect personal privacy. But do end user solutions provide adequate safeguards against surveillance by corporate and government actors? What does a comprehensive plan for privacy entail in order that libraries live up to their privacy values? In this paper, the authors discuss the complexity of surveillance architecture that the library institution might confront when seeking to defend the privacy rights of patrons. This architecture consists of three main parts: physical or material aspects, logical characteristics, and social factors of information and communication flows in the library setting. For each category, the authors will present short case studies that are culled from practitioner experience, research, and public discourse. The case studies probe the challenges faced by the library—not only when making hardware and software choices, but also choices related to staffing and program design. The paper shows that privacy choices intersect not only with free speech and chilling effects, but also with questions that concern intellectual property, organizational development, civic engagement, technological innovation, public infrastructure, and more. The paper ends with discussion of what libraries will require in order to sustain and improve efforts to serve as stewards of privacy in the 21st century.

Relevance: 100.00%

Abstract:

GCM outputs such as CMIP3 are available over the network from the PCMDI web site. Meteorological researchers are familiar with using GCM data, but most researchers in other fields, such as agriculture and civil engineering, as well as the general public, are not. Using a GCM presents some difficulties: 1) downloading the enormous quantity of data, and 2) understanding the GCM methodology, parameters, and grids. To provide quick access to GCM output, the Climate Change Information Database has been developed. The purpose of the database is to bridge users and meteorological specialists and to facilitate the understanding of climate change. The resolution of the data is unified, and the climate change amount or factors for each meteorological element are provided by the database. All data in the database are interpolated onto the same 80 km mesh. The available data are present-to-future projections from 27 GCMs, covering 16 meteorological elements (precipitation, temperature, etc.) and 3 emission scenarios (A1B, A2, B1). We showed a summary of this database to residents of Toyama prefecture, measured the effect of doing so, and assessed their image of climate change using an Internet questionnaire survey. People who feel a climate change at present tend to expect additional changes in the future. It is important to show the monitoring results of climate change to citizens and to promote understanding of the climate change that has already occurred. The survey showed that general images of climate change promote understanding of the need for mitigation, and that it is important to explain climate changes that might occur in the future, even if they have not yet occurred, so that the need for adaptation becomes widely recognized.
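The database's key preprocessing step is resampling every model's output onto one common mesh. The abstract does not say which interpolation scheme is used; the following is a minimal bilinear-resampling sketch under that assumption, with the field values and grid sizes purely illustrative:

```python
def bilinear_interpolate(field, x, y):
    """Bilinearly interpolate a 2-D list `field` (unit grid spacing)
    at fractional index coordinates (x, y)."""
    i, j = int(x), int(y)
    fx, fy = x - i, y - j
    i2 = min(i + 1, len(field) - 1)
    j2 = min(j + 1, len(field[0]) - 1)
    top = field[i][j] * (1 - fy) + field[i][j2] * fy
    bot = field[i2][j] * (1 - fy) + field[i2][j2] * fy
    return top * (1 - fx) + bot * fx

def regrid(field, out_rows, out_cols):
    """Resample `field` onto an out_rows x out_cols target mesh,
    as when unifying GCM grids onto one common (e.g. 80 km) mesh."""
    nr, nc = len(field), len(field[0])
    return [[bilinear_interpolate(field,
                                  r * (nr - 1) / max(out_rows - 1, 1),
                                  c * (nc - 1) / max(out_cols - 1, 1))
             for c in range(out_cols)]
            for r in range(out_rows)]

# Toy 2x2 temperature field resampled to a finer 3x3 mesh.
fine = regrid([[0.0, 1.0], [2.0, 3.0]], 3, 3)
```

In production such regridding is usually done with a library interpolator rather than hand-rolled loops, but the arithmetic is the same.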

Relevance: 100.00%

Abstract:

This study extends the methodology of Fama and French (1988) to test the hypothesis, derived from the Theory of Storage, that the convenience yield of inventories decreases at a decreasing rate as inventories grow. As described by Samuelson (1965), the theory implies that changes in spot and futures (or forward) prices will be similar when inventories are high, but that futures prices will vary less than spot prices when inventories are low. This occurs because supply and demand shocks can be absorbed by inventory adjustments when inventories are high, affecting spot and futures prices similarly. When inventories are low, by contrast, the full absorption of demand or supply shocks falls on the spot price, since economic agents have little ability to adjust the quantity demanded or supplied in the short run.

Relevance: 100.00%

Abstract:

This thesis provides three original contributions to the field of Decision Sciences. The first contribution explores the field of heuristics and biases. New variations of the Cognitive Reflection Test (CRT, a test that measures "the ability or disposition to resist reporting the response that first comes to mind") are provided. The original CRT (S. Frederick [2005] Journal of Economic Perspectives, v. 19:4, pp. 24-42) has items for which the response is immediate, and erroneous. It is shown that merely varying the numerical parameters of the problems produces large deviations in responses. Not only are the final results affected by the proposed variations, but so is processing fluency. It seems that the magnitude of the numbers serves as a cue to activate system-2 reasoning. The second contribution explores Managerial Algorithmics Theory (M. Moldoveanu [2009] Strategic Management Journal, v. 30, pp. 737-763), an ambitious research program stating that managers display cognitive choices with a "preference towards solving problems of low computational complexity". An empirical test of this hypothesis is conducted, with results showing that the premise is not supported. A number of problems are designed to test the predictions of managerial algorithmics against those of cognitive psychology. The results demonstrate (once again) that framing effects profoundly affect choice, and (an original insight) that managers are unable to distinguish computational complexity problem classes. The third contribution explores a new approach to a computationally complex problem in marketing: the shelf space allocation problem (M-H Yang [2001] European Journal of Operational Research, v. 131, pp. 107-118). A new representation for a genetic algorithm is developed, and computational experiments demonstrate its feasibility as a practical solution method.
These studies lie at the interface of psychology and economics (bounded rationality and the heuristics and biases programme); of psychology, strategy, and computational complexity; and of heuristics for computationally hard problems in management science.
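The abstract does not describe the thesis's new genetic-algorithm representation, so the following is only a generic sketch of the technique applied to a toy shelf-space instance: a chromosome assigns each product to a shelf, fitness is total profit minus a penalty for exceeding shelf capacities, and the population evolves by elitist selection, one-point crossover, and mutation. All data are illustrative, not Yang (2001)'s formulation:

```python
import random

random.seed(1)

WIDTHS   = [2, 3, 1, 4, 2, 3]          # facing width of each product
PROFIT   = [[5, 3], [4, 6], [2, 2],    # PROFIT[p][s]: profit of product p on shelf s
            [7, 4], [3, 5], [6, 6]]
CAPACITY = [8, 8]                      # width capacity of each shelf

def fitness(chrom):
    """Total profit, heavily penalized for capacity violations."""
    used = [0] * len(CAPACITY)
    total = 0
    for p, s in enumerate(chrom):
        used[s] += WIDTHS[p]
        total += PROFIT[p][s]
    overflow = sum(max(0, u - c) for u, c in zip(used, CAPACITY))
    return total - 100 * overflow

def evolve(pop_size=40, generations=60):
    n, m = len(WIDTHS), len(CAPACITY)
    pop = [[random.randrange(m) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]        # elitism: keep the top half
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, n)       # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:          # mutation
                child[random.randrange(n)] = random.randrange(m)
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
```

The penalty-based fitness is one common way to handle the capacity constraint; repair operators or feasibility-preserving encodings are the usual alternatives.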

Relevance: 100.00%

Abstract:

This thesis studies price-setting models and their macroeconomic implications. In the first two chapters I analyze models in which firms' pricing decisions take menu costs and information costs into account. In Chapter 1, I estimate such models using price-change statistics from the United States and conclude that information costs are significantly larger than menu costs, and that the data clearly favor the model in which information about aggregate conditions is costly while idiosyncratic information is free. In Chapter 2, I investigate the consequences of monetary shocks and disinflation announcements using the previously estimated models, and show that the degree of monetary non-neutrality is larger in the model in which part of the information is free. Chapter 3 is a joint article with Carlos Carvalho (PUC-Rio) and Antonella Tutino (Federal Reserve Bank of Dallas). In it we examine a price-setting model in which firms are subject to a Shannon-type information-flow constraint. We calibrate the model and study impulse-response functions to idiosyncratic and aggregate shocks. We show that firms prefer to process aggregate and idiosyncratic information jointly rather than investigating them separately. This type of processing generates more frequent price adjustments, reducing the persistence of the real effects caused by monetary shocks.

Relevance: 100.00%

Abstract:

This paper evaluates how information asymmetry affects the strength of competition in credit markets. A theory is presented in which adverse selection softens competition by decreasing the incentives creditors have to compete in the interest rate dimension. In equilibrium, although creditors compete, the outcome is similar to collusion. Three empirical implications arise. First, interest rates should respond asymmetrically to changes in the cost of funds: increases in the cost of funds should, on average, have a larger effect on interest rates than decreases. Second, aggressiveness in pricing should be associated with a worsening in bank-level default rates. Third, bank-level default rates should be endogenous. We then verify the validity of these three empirical implications using Brazilian data on consumer overdraft loans. The results in this paper rationalize the seemingly abnormally high interest rates on unsecured loans.

Relevance: 100.00%

Abstract:

We study the effect of social embeddedness on voter turnout by investigating the role of information about other voters’ decisions. We do so in a participation game, where some voters (‘receivers’) are told about some other voters’ (‘senders’) turnout decision at a first stage of the game. Cases are distinguished where the voters support the same or different candidates or where they are uncertain about each other’s preferences. Our experimental results show that such information matters. Participation is much higher when information is exchanged than when it is not. Senders strategically try to use their first mover position and some receivers respond to this.

Relevance: 100.00%

Abstract:

We analyze the stability of monetary regimes in a decentralized economy where fiat money is endogenously created, information about its value is imperfect, and agents learn only from their personal trading experiences. We show that in poorly informed economies, monetary stability depends heavily on the government's commitment to the long-run value of money, whereas in economies where agents gather information more easily, monetary stability can be an endogenous outcome. We generate dynamics in the acceptability of fiat money that resemble historical accounts of the rise and eventual collapse of overissued paper money. Moreover, our results offer an explanation for the fact that, despite its obvious advantages, the widespread use of fiat money is a very recent development.

Relevance: 100.00%

Abstract:

EMAp - Escola de Matemática Aplicada

Relevance: 100.00%

Abstract:

My dissertation focuses on dynamic aspects of coordination processes such as the reversibility of early actions, the option to delay decisions, and learning about the environment from observing other people's actions. This study proposes the use of tractable dynamic global games in which players privately and passively learn about their actions' true payoffs and are able to adjust early investment decisions to the arrival of new information. It uses this framework to investigate the consequences of liquidity shocks for the performance of a Tobin tax as a policy intended to foster coordination success (chapter 1), and the adequacy of a Tobin tax as a means of reducing an economy's vulnerability to sudden stops (chapter 2). It then analyzes players' incentives to acquire costly information in a sequential decision setting (chapter 3). In chapter 1, a continuum of foreign agents decide whether or not to enter an investment project. A fraction λ of them are hit by liquidity restrictions in a second period and are forced to withdraw early investment, or are precluded from investing in the interim period, depending on the actions they chose in the first period. Players not affected by the liquidity shock are able to revise early decisions. Coordination success is increasing in aggregate investment and decreasing in the aggregate volume of capital exit. Without liquidity shocks, aggregate investment is (in a pivotal contingency) invariant to frictions such as a tax on short-term capital. In this case, a Tobin tax always increases the incidence of success. In the presence of liquidity shocks, this invariance result no longer holds in equilibrium. A Tobin tax becomes harmful to aggregate investment, which may reduce the incidence of success if the economy does not benefit enough from avoiding capital reversals. It is shown that the Tobin tax that maximizes the ex-ante probability of successfully coordinated investment is decreasing in the liquidity shock.
Chapter 2 studies the effects of a Tobin tax in the same setting as the global game model proposed in chapter 1, except that the liquidity shock is stochastic, i.e., there is also aggregate uncertainty about the extent of the liquidity restrictions. It identifies conditions under which, in the unique equilibrium of the model with a low probability of liquidity shocks but large dry-ups, a Tobin tax is welfare improving, helping agents coordinate on the good outcome. The model provides a rationale for a Tobin tax in economies that are prone to sudden stops. The optimal Tobin tax tends to be larger when capital reversals are more harmful and when the fraction of agents hit by liquidity shocks is smaller. Chapter 3 focuses on information acquisition in a sequential decision game with payoff complementarity and an information externality. When information is cheap relative to players' incentive to coordinate actions, only the first player chooses to process information; the second player learns about the true payoff distribution from observing the first player's decision and follows her action. Miscoordination requires that both players privately process information, which tends to happen when information is expensive and the prior knowledge about the distribution of payoffs has a large variance.

Relevance: 100.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)