980 results for Complete Information


Relevance:

60.00%

Publisher:

Abstract:

We analyze a common agency game under asymmetric information about the preferences of the non-cooperating principals in a public good context. Asymmetric information introduces incentive compatibility constraints which rationalize the requirement of truthfulness made in the earlier literature on common agency games under complete information. There exists a large class of differentiable equilibria which are ex post inefficient and exhibit free-riding. We then characterize some interim efficient equilibria. Finally, there also exists a unique equilibrium allocation that is robust to random perturbations. This focal equilibrium is characterized for any distribution of types.

Relevance:

60.00%

Publisher:

Abstract:

We consider a version of the cooperative buyer-seller market game of Shapley and Shubik (1972). For this market we propose a class of sealed-bid auctions in which objects are sold simultaneously under a market-clearing price rule. We analyze the strategic games induced by these mechanisms under the complete information approach. We show that these noncooperative games can be regarded as a competitive process for achieving a cooperative outcome: every Nash equilibrium payoff is a core outcome of the cooperative market game. Precise answers can be given to the strategic questions raised.
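
For reference (an addition, not part of the abstract), the buyer-seller market of Shapley and Shubik (1972) is the assignment game; a minimal LaTeX sketch of its standard formulation, writing buyer i's valuation of seller j's object as $a_{ij}$:

% Worth of the grand coalition of buyers B and sellers S: the value of an optimal
% assignment of buyers to the sellers' objects.
\[
  v(B \cup S) \;=\; \max_{\mu} \sum_{(i,j) \in \mu} a_{ij},
\]
% where \mu ranges over matchings of buyers to objects. Shapley and Shubik showed
% that the core consists exactly of the payoff vectors (u, w) satisfying
\[
  u_i + w_j \ge a_{ij} \ \text{for all } i, j, \qquad u_i, w_j \ge 0, \qquad
  \sum_i u_i + \sum_j w_j = v(B \cup S),
\]
% i.e. the optimal solutions of the dual of the assignment linear program.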

Relevance:

60.00%

Publisher:

Abstract:

We analyze a common agency game under asymmetric information about the preferences of the non-cooperating principals. Asymmetric information introduces incentive compatibility constraints which rationalize the requirement of truthfulness made in the earlier literature on common agency games under complete information. There exists a large class of differentiable equilibria which are ex post inefficient and exhibit free-riding. We then characterize some interim efficient equilibria. Finally, there also exists a unique equilibrium allocation that is robust to random perturbations. This focal equilibrium is characterized for any distribution of types.

Relevance:

60.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

60.00%

Publisher:

Abstract:

In most studies on beef cattle longevity, only the cows reaching a given number of calvings by a specific age are considered in the analyses. With the aim of evaluating all cows with productive life in herds, taking into consideration the different forms of management on each farm, it was proposed to measure cow longevity from age at last calving (ALC), that is, the most recent calving registered in the files. The objective was to characterize this trait in order to study the longevity of Nellore cattle, using the Kaplan-Meier estimators and the Cox model. The covariables and class effects considered in the models were age at first calving (AFC), year and season of birth of the cow and farm. The variable studied (ALC) was classified as presenting complete information (uncensored = 1) or incomplete information (censored = 0), using the criterion of the difference between the date of each cow's last calving and the date of the latest calving at each farm. If this difference was >36 months, the cow was considered to have failed. If not, this cow was censored, thus indicating that future calving remained possible for this cow. The records of 11 791 animals from 22 farms within the Nellore Breed Genetic Improvement Program ('Nellore Brazil') were used. In the estimation process using the Kaplan-Meier model, the variable of AFC was classified into three age groups. In individual analyses, the log-rank test and the Wilcoxon test in the Kaplan-Meier model showed that all covariables and class effects had significant effects (P < 0.05) on ALC. In the analysis considering all covariables and class effects, using the Wald test in the Cox model, only the season of birth of the cow was not significant for ALC (P > 0.05). This analysis indicated that each month added to AFC diminished the risk of the cow's failure in the herd by 2%. Nonetheless, this does not imply that animals with younger AFC had less profitability. Cows with greater numbers of calvings were more precocious than those with fewer calvings. Copyright © The Animal Consortium 2012.
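
As an illustration only (not the authors' code), a minimal Python sketch of the censoring rule described above: a cow is an observed failure (uncensored = 1) if her last calving precedes her farm's most recent calving by more than 36 months, otherwise she is censored (0); the resulting indicator can then feed a Kaplan-Meier fit. Column names and values are hypothetical.

import pandas as pd

# Hypothetical records: farm, cow id, date of last calving, ALC in months.
records = pd.DataFrame({
    "farm": ["A", "A", "B", "B"],
    "cow": [1, 2, 3, 4],
    "last_calving": pd.to_datetime(["2004-05-10", "2010-02-01", "2006-03-15", "2009-11-20"]),
    "alc_months": [98, 140, 85, 120],
})

# Gap (in months) between each cow's last calving and the latest calving on her farm.
latest_on_farm = records.groupby("farm")["last_calving"].transform("max")
gap_months = (latest_on_farm - records["last_calving"]).dt.days / 30.44

# Failure (uncensored = 1) if the gap exceeds 36 months, otherwise censored (0).
records["event"] = (gap_months > 36).astype(int)
print(records[["cow", "alc_months", "event"]])

# Optional Kaplan-Meier fit of ALC with the lifelines package, if installed:
# from lifelines import KaplanMeierFitter
# KaplanMeierFitter().fit(records["alc_months"], event_observed=records["event"])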

Relevance:

60.00%

Publisher:

Abstract:

With the objective of investigating the prevalence of Leishmania/HIV co-infection in patients serologically positive for HIV and with no history of co-infection, a cross-sectional study was carried out based on a clinical-epidemiological form applied to the patients registered at the AIDS reference units in Belém (CASA DIA and URE-DIPE) from July to October 2008. Blood samples were collected from 377 patients who agreed to take part in the study and were analysed by indirect immunofluorescence (IFI) and PCR; the Montenegro skin test (IDRM) was performed in 55 of them. The mean age was 38.2 years and 59% of the patients were female. The mean viral load among the 249 patients with complete information on this variable was 30,952.2. IFI was positive in 8 patients and PCR was positive in 22. A total of 214 patients were under antiretroviral therapy (TARV). Episodes of diseases associated with the HIV-positive condition were recorded in 218 patients. Five patients reported an episode of M. leprae/HIV co-infection, and none of them tested positive for Leishmania by PCR. Intravenous drug use was reported by 27 patients, but only one had a positive PCR, indicating that transmission was not person-to-person. None of the patients submitted to the IDRM showed a positive reaction. In the present study, the PCR technique was more sensitive than the IFI reaction (6% and 2%, respectively).

Relevance:

60.00%

Publisher:

Abstract:

One of the main objectives of the Spanish and Portuguese-Speaking Group of the International Society for Forensic Genetics (GHEP-ISFG) is to promote and contribute to the development and dissemination of scientific knowledge in the area of forensic genetics. To this end, GHEP-ISFG runs several working commissions set up to develop activities on scientific aspects of general interest. One of them, the Mixture Commission of GHEP-ISFG, has organized annually, since 2009, a collaborative exercise on the analysis and interpretation of autosomal short tandem repeat (STR) mixture profiles. So far, three exercises have been organized (GHEP-MIX01, GHEP-MIX02 and GHEP-MIX03), with 32, 24 and 17 participating laboratories, respectively. The exercise aims to give a general overview by addressing, through proposed mock cases, aspects related to the editing of mixture profiles and their statistical treatment. The main conclusions obtained from these exercises may be summarized as follows. Firstly, the data show an increasing tendency of the laboratories toward validating DNA mixture profile analysis following international recommendations (ISO/IEC 17025:2005). Secondly, the majority of discrepancies are encountered mainly at stutter positions (53.4%, 96.0% and 74.9%, respectively, for the three editions). The results submitted also reveal the importance of performing duplicate analyses using different kits in order to reduce errors as much as possible. Regarding the statistical aspect (GHEP-MIX02 and 03), all participants employed the likelihood ratio (LR) parameter to evaluate statistical compatibility, and the formulas employed were quite similar. When the hypotheses used to evaluate the LR value were fixed by the coordinators (GHEP-MIX02), the results revealed a small number of discrepancies, mainly due to clerical reasons. However, the GHEP-MIX03 exercise allowed the participants to freely propose their own hypotheses to calculate the LR value. In this situation the laboratories reported several options to explain the proposed mock cases, and therefore significant differences between the final LR values were obtained. Complete information concerning the background of the criminal case is critical for selecting adequate hypotheses to calculate the LR value. Although this should be a task for the judicial court to decide, it is important for the expert to account for the different possibilities and scenarios, and also to offer this expertise to the judge. In addition, continuing education in the analysis and interpretation of DNA mixture profiles may also be a priority for the vast majority of forensic laboratories. (C) 2014 Elsevier Ireland Ltd. All rights reserved.
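
For context (an addition, not part of the abstract), the likelihood ratio used in mixture interpretation weighs the evidence E under two competing propositions about the contributors; in LaTeX, with H_p and H_d denoting, for example, the prosecution and defense hypotheses:

\[
  \mathrm{LR} \;=\; \frac{\Pr(E \mid H_p)}{\Pr(E \mid H_d)}
\]
% For instance, H_p: "the victim and the suspect contributed to the mixture" may be
% weighed against H_d: "the victim and an unknown, unrelated individual contributed".
% Values above 1 support H_p; values below 1 support H_d.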

Relevance:

60.00%

Publisher:

Abstract:

We have developed a method to compute the albedo contrast between dust devil tracks and their surrounding regions on Mars. It is mainly based on Mathematical Morphology operators and uses all the points along the edges of the tracks to compute the albedo contrast values. This permits the extraction of more accurate and complete information than traditional point sampling, not only providing better statistics but also permitting the analysis of local variations along the entirety of the tracks. This measure of contrast, based on relative quantities, is much better suited to comparisons at regional scales and on a multi-temporal basis using imagery acquired under rather different environmental and operational conditions. In addition, the substantial increase in the detail extracted may permit quantifying differential deposition of dust by computing the local temporal fading of the tracks, leading to a better estimation of the thickness of the topmost layer of dust and of the minimum value needed to create dust devil tracks. The developed tool is tested on 110 HiRISE images depicting regions in the Aeolis, Argyre, Eridania, Noachis and Hellas quadrangles. As a complementary evaluation, we also performed a temporal analysis of the albedo in a region of Russell crater, where high seasonal dust devil activity had already been observed, covering the years 2007-2012. The mean albedo of Russell crater is in this case indicative of the presence of dust devil tracks and can therefore be used to quantify dust devil activity. (C) 2014 Elsevier Inc. All rights reserved.
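
A minimal sketch (not the authors' pipeline) of how an edge-based albedo contrast could be computed with morphological operators, assuming a grayscale image and a binary track mask are already available; the band width and the contrast definition below are assumptions for illustration:

import numpy as np
from scipy import ndimage

def albedo_contrast(image, track_mask, width=3):
    """Mean albedo contrast between a track and a surrounding band of pixels."""
    track_mask = track_mask.astype(bool)
    # Inner band: pixels just inside the track edge (erosion residue).
    inner = track_mask & ~ndimage.binary_erosion(track_mask, iterations=width)
    # Outer band: pixels just outside the track edge (dilation residue).
    outer = ndimage.binary_dilation(track_mask, iterations=width) & ~track_mask
    track_albedo = image[inner].mean()
    surround_albedo = image[outer].mean()
    # Relative contrast: darker tracks give values between 0 and 1.
    return (surround_albedo - track_albedo) / surround_albedo

# Toy example: a dark square "track" on a brighter background.
img = np.full((50, 50), 0.30)
mask = np.zeros((50, 50), dtype=bool)
mask[15:35, 15:35] = True
img[mask] = 0.18
print(round(albedo_contrast(img, mask), 3))  # relative contrast of the toy track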

Relevance:

60.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

60.00%

Publisher:

Abstract:

The Standard Model of elementary particle physics is confirmed experimentally to an excellent degree, but it has unsatisfactory aspects on the theoretical side: for one, the Higgs sector of the theory is inserted by hand, and for another, the description of the observed particle spectrum and that of gravity differ fundamentally. Both of these drawbacks disappear when the Standard Model is formulated in the language of Noncommutative Geometry. The goal here is to capture the spacetime of the physical theory through algebraic data. For example, the full information about a Riemannian spin manifold M is contained in the data set (A, H, D), called a spectral triple: A is the commutative algebra of differentiable functions on M, H is the Hilbert space of square-integrable spinors over M, and D is the Dirac operator. With such a triple (over a noncommutative algebra), both gravity and the Standard Model can be captured with one and the same mathematical tool. In the present thesis, zero-dimensional spectral triples (which correspond to discrete spacetimes) are first classified, and a quantization of such objects is carried out in examples. One problem of spectral triples is their restriction to genuinely Riemannian metrics; approaches to resolving this problem are presented. In the final chapter of the thesis, the so-called 'Feynman proof of the Maxwell equations' is generalized to noncommutative configuration spaces.
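
As a supplementary illustration (an addition, not part of the abstract), the way the metric of M is encoded in the spectral triple (A, H, D) can be seen from Connes' distance formula; in LaTeX:

% For the canonical commutative triple A = C^\infty(M), H = L^2(M, S), D the Dirac
% operator, the geodesic distance between points p and q of M is recovered purely
% from the algebraic data:
\[
  d(p, q) \;=\; \sup \bigl\{\, |f(p) - f(q)| \;:\; f \in A,\ \bigl\| [D, f] \bigr\| \le 1 \,\bigr\}.
\]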

Relevance:

60.00%

Publisher:

Abstract:

OBJECTIVES: To explore the contribution of the biomechanical risk factors of repetitiveness (hand activity level, HAL) and manual force (peak force, PF) to the onset of carpal tunnel syndrome (CTS), we studied a large cohort of industrial workers, using the threshold limit value (TLV©) of the American Conference of Governmental Industrial Hygienists (ACGIH) as a reference. METHODS: The cohort was followed from 2000 to 2011. Occupational exposure was classified with respect to the ACGIH action limit (AL) and TLV as 'acceptable' (below the AL), 'intermediate' (between the AL and the TLV) and 'unacceptable' (above the TLV). Two case definitions were considered: (1) symptoms of CTS; (2) symptoms plus a positive nerve conduction study (NCS). We applied Poisson regression models adjusted for sex, age, body mass index and the presence of conditions predisposing to the disease. RESULTS: In the whole cohort (1710 workers) we found an incidence rate (IR) of CTS symptoms of 4.1 per 100 person-years, and an IR of NCS-confirmed CTS of 1.3 per 100 person-years. Workers exposed 'above the TLV' had a risk of developing CTS symptoms of 1.76 relative to those exposed 'below the AL'. A similar pattern emerged for the second case definition [incidence rate ratio (IRR) 'above the TLV': 1.37 (95% confidence interval (95% CI) 0.84–2.23)]. Workers with 'intermediate' exposure were at higher risk of CTS [IRR for symptoms: 3.31 (95% CI 2.39–4.59); IRR for symptoms plus positive NCS: 2.56 (95% CI 1.47–4.43)]. We observed a stronger association between HAL and CTS. CONCLUSIONS: We found an increasing risk of developing CTS with increasing biomechanical load: the increased risk already observed for workers with 'intermediate' exposure suggests that the current limit values may not be sufficiently protective for some workers. Preventive interventions should be directed at repetitive manual tasks.
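
Purely as an illustration (not the authors' code), a Poisson regression of this kind can be sketched in Python with statsmodels; the column names and counts are hypothetical, person-time enters as an offset, and exponentiated coefficients give incidence rate ratios:

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical aggregated data: CTS cases and person-years per exposure/sex stratum.
df = pd.DataFrame({
    "cases":        [10, 2, 22, 8, 18, 7],
    "person_years": [700.0, 250.0, 1000.0, 400.0, 550.0, 260.0],
    "exposure":     ["below_AL", "below_AL", "AL_to_TLV", "AL_to_TLV", "above_TLV", "above_TLV"],
    "sex":          ["F", "M", "F", "M", "F", "M"],
})

# Poisson model for incidence rates; age and BMI would be added in the same way.
model = smf.glm(
    "cases ~ C(exposure, Treatment(reference='below_AL')) + sex",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["person_years"]),
).fit()
print(np.exp(model.params))  # exponentiated coefficients = incidence rate ratios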

Relevance:

60.00%

Publisher:

Abstract:

In veterinary medicine, the ability to classify mammary tumours based on the molecular profile, and to determine whether the immunophenotype of the regional lymph node and/or systemic metastases is equal to that of the primary tumour, may be predictive when estimating the effectiveness of the various cancer treatments that can be scheduled. Therefore, the aims of the past three years, developed as projects, have been (1) to define the molecular phenotype of feline mammary carcinomas and their lymph node metastases according to a previously modified algorithm, and to demonstrate the concordance or discordance of the molecular profile between the primary tumour and the lymph node metastasis; (2) to analyse, in female dogs, the relationship between the primary mammary tumour and its lymph node metastasis based on immunohistochemical molecular characterization, in order to develop the most specific prognostic-predictive models and targeted therapeutic options; and (3) to evaluate the molecular trend of cancer from its primary location to systemic metastases in three cats and two dogs with mammary tumours. Studies on mammary tumours, particularly in dogs, have gradually drawn increasing attention not only to the epithelial component but also to the myoepithelial cells. Addressing the lack of complete information on a valid panel of markers for identifying these cells in the normal and neoplastic mammary gland, and the lack of investigation of immunohistochemical changes from an epithelial to a mesenchymal phenotype, was the aim of a parallel study. While investigating mammary tumours, it was noticed that only a few studies had focused on the expression of CD117. It was therefore decided to deepen this knowledge by characterizing the immunohistochemical staining of CD117 in normal and neoplastic mammary tissue of the dog, and by correlating the CD117 immunohistochemical results with mammary histotype, histological stage (invasiveness), Ki67 index and patient survival time.

Relevance:

60.00%

Publisher:

Abstract:

This dissertation mimics the Turkish college admission procedure. It started with the purpose of reducing the inefficiencies in the Turkish market. For this purpose, we propose a mechanism under a new market structure, which we prefer to call semi-centralization. In chapter 1, we give a brief summary of Matching Theory. We present the first examples in matching history together with the most general papers and mechanisms. In chapter 2, we propose our mechanism. In its real-life application, that is, Turkish university placements, the mechanism reduces the inefficiencies of the current system. The success of the mechanism depends on the preference profile. It is easy to show that under complete information the mechanism implements the full set of stable matchings for a given profile. In chapter 3, we refine our basic mechanism. The modification of the mechanism has a crucial effect on the results. The new mechanism is what we call a middle mechanism. On one subdomain, this mechanism coincides with the original basic mechanism; on the other partition, it gives the same results as Gale and Shapley's algorithm. In chapter 4, we apply our basic mechanism to the well-known Roommate Problem. Since the roommate problem is a one-sided game, we first propose an auxiliary function to convert the game into a semi-centralized two-sided game, because our basic mechanism is designed for this framework. We show that this process succeeds in finding a stable matching whenever one exists. We also show that our mechanism easily and simply tells us whether a profile lacks stability, by using purified orderings. Finally, we show a method to find all the stable matchings when multiple stable matchings exist. The method is simply to run the mechanism for all of the top agents in the social preference.
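
For readers unfamiliar with the benchmark referred to above, here is a minimal Python sketch of Gale and Shapley's deferred-acceptance algorithm (the standard algorithm, not the dissertation's proposed mechanism), for the one-to-one case with hypothetical preference lists:

def deferred_acceptance(prop_prefs, recv_prefs):
    """Proposer-optimal stable matching via Gale-Shapley deferred acceptance."""
    # rank[r][p] = position of proposer p in receiver r's preference list (lower is better)
    rank = {r: {p: i for i, p in enumerate(prefs)} for r, prefs in recv_prefs.items()}
    free = list(prop_prefs)                    # proposers not yet matched
    next_choice = {p: 0 for p in prop_prefs}   # index of next receiver to propose to
    engaged = {}                               # receiver -> currently held proposer

    while free:
        p = free.pop()
        r = prop_prefs[p][next_choice[p]]
        next_choice[p] += 1
        if r not in engaged:
            engaged[r] = p                     # receiver tentatively accepts
        elif rank[r][p] < rank[r][engaged[r]]:
            free.append(engaged[r])            # receiver trades up; old proposer is free again
            engaged[r] = p
        else:
            free.append(p)                     # proposal rejected; p proposes again later
    return {p: r for r, p in engaged.items()}

# Hypothetical preferences: students propose to universities (one seat each).
students = {"s1": ["u1", "u2"], "s2": ["u1", "u2"]}
universities = {"u1": ["s2", "s1"], "u2": ["s1", "s2"]}
print(deferred_acceptance(students, universities))  # {'s2': 'u1', 's1': 'u2'}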

Relevance:

60.00%

Publisher:

Abstract:

This Ph.D. thesis consists of three research papers focused on the relationship between the media industry and the financial sector. Correctly understanding the effect of media on financial markets is becoming increasingly important now that the hypothesis of fully informed markets has been challenged: if financial markets do not have access to complete information, the importance of information professionals, the media, follows. On the other side, another challenge for economics and finance scholars is to understand how financial features are able to influence the media and to condition information disclosure. The main aim of this Ph.D. dissertation is to contribute to a better comprehension of both phenomena. The first paper analyzes the effects of owning equity shares in a newspaper-publishing firm. The main findings show that a firm that is part of the ownership structure of a media firm ends up receiving more and better coverage, confirming the view that owning a media outlet is a source of conflicts of interest. The second paper focuses on the effect of media-delivered information on financial markets. In the framework of IPOs in the U.S. market, we found empirical evidence of a significant effect of the media's role on IPO pricing: increasing the quantity and the quality of coverage increases first-day returns (i.e. the underpricing). Finally, the third paper summarizes what has been done in studying the relationship between the media and financial industries, putting together contributions from economics, business, and finance scholars. The main contribution of this dissertation is therefore to have underlined the importance and the effectiveness of the relationship between the media industry and the financial sector, adding to the stream of research that investigates the media's role and effectiveness in the financial and business sectors.

Relevance:

60.00%

Publisher:

Abstract:

Justification Logic studies epistemic and provability phenomena by introducing justifications/proofs into the language in the form of justification terms. Pure justification logics serve as counterparts of traditional modal epistemic logics, and hybrid logics combine epistemic modalities with justification terms. The computational complexity of pure justification logics is typically lower than that of the corresponding modal logics. Moreover, the so-called reflected fragments, which still contain complete information about the respective justification logics, are known to be in NP for a wide range of justification logics, pure and hybrid alike. This paper shows that, under reasonable additional restrictions, these reflected fragments are NP-complete, thereby proving a matching lower bound. The proof method is then extended to provide a uniform proof that the corresponding full pure justification logics are $\Pi^p_2$-hard, reproving and generalizing an earlier result by Milnikel.
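
As an illustrative addition (background, not the paper's own definitions), justification logics replace the modality $\Box F$ of modal epistemic logic by explicit justification terms; a minimal LaTeX sketch of the standard application axiom and of the shape of reflected-fragment formulas:

% Application axiom of the basic justification logic J (and of the Logic of Proofs LP):
\[
  s{:}(F \rightarrow G) \;\rightarrow\; \bigl(t{:}F \rightarrow (s \cdot t){:}G\bigr)
\]
% The reflected fragment collects the provable formulas of the form t:F,
% i.e. statements asserting that term t justifies formula F.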