876 results for nonparametric rationality tests
Abstract:
Introduction: Since the launch of the Global Programme to Eliminate Lymphatic Filariasis, more than 70% of the endemic countries have implemented mass drug administration (MDA) to interrupt disease transmission. The monitoring of filarial infection in sentinel populations, particularly schoolchildren, is recommended to assess the impact of MDA. A key issue is choosing the appropriate tools for these initial assessments (to define the best intervention) and for monitoring transmission. Methods: This study compared the pre-MDA performance of five diagnostic methods, namely the thick film test, Knott's technique, filtration, Og4C3-ELISA, and the AD12-ICT card test, in schoolchildren from Brazil. Venous and capillary blood samples were collected between 11 pm and 1 am. The microfilarial loads were analyzed with a negative binomial regression, and the prevalence and associated 95% confidence intervals were estimated for all methods. The accuracies of the AD12-ICT card and Og4C3-ELISA tests were assessed against the combination of the parasitological test results. Results: A total of 805 schoolchildren were examined. The overall prevalence and the prevalence stratified by age group and gender detected by Og4C3-ELISA and AD12-ICT were markedly higher than those estimated by the parasitological methods. The sensitivity of the AD12-ICT card and Og4C3-ELISA tests was approximately 100%, and the positive likelihood ratios were above 6. The specificity of the Og4C3-ELISA was higher than that of the AD12-ICT at different prevalence levels. Conclusions: The ICT card test should be the recommended tool for monitoring school-age populations living in areas with ongoing or completed MDA.
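For reference, the accuracy measures reported above are linked by the textbook definition of the positive likelihood ratio (a standard identity, not a calculation specific to this study):

$$\mathrm{LR}^{+}=\frac{\text{sensitivity}}{1-\text{specificity}}$$

With sensitivity close to 100%, an LR+ above 6 therefore implies a specificity above $1-1/6\approx 0.83$.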
Abstract:
Economics is a social science and, therefore, focuses on people and on the decisions they make, whether in an individual context or in group situations. It studies human choices in the face of needs to be fulfilled and a limited amount of resources with which to fulfill them. For a long time, there was a convergence between the normative and positive views of human behavior, in that the ideal and the predicted decisions of agents in economic models were entangled in one single concept. That is, it was assumed that the best that could be done in each situation was exactly the choice that would prevail. Or, at least, that the facts that economics needed to explain could be understood in the light of models in which individual agents act as if they were able to make ideal decisions. However, in the last decades, the complexity of the environment in which economic decisions are made and the limits on the ability of agents to deal with it have been recognized and incorporated into models of decision making, in what came to be known as the bounded rationality paradigm. This was triggered by the incapacity of the unbounded rationality paradigm to explain observed phenomena and behavior. This thesis contributes to the literature in three different ways. Chapter 1 is a survey on bounded rationality, which gathers and organizes the contributions to the field since Simon (1955) first recognized the necessity to account for the limits on human rationality. The focus of the survey is on theoretical work rather than on the experimental literature, which presents evidence of actual behavior that differs from what classic rationality predicts. The general framework is as follows. Given a set of exogenous variables, the economic agent needs to choose an element from the choice set that is available to him, in order to optimize the expected value of an objective function (assuming his preferences are representable by such a function). If this problem is too complex for the agent to deal with, one or more of its elements is simplified. Each bounded rationality theory is categorized according to the most relevant element it simplifies. Chapter 2 proposes a novel theory of bounded rationality. Much in the same fashion as Conlisk (1980) and Gabaix (2014), we assume that thinking is costly in the sense that agents have to pay a cost for performing mental operations. In our model, if they choose not to think, that cost is avoided, but they are left with a single alternative, labeled the default choice. We exemplify the idea with a very simple model of consumer choice and identify the concept of isofin curves, i.e., sets of default choices which generate the same utility net of thinking cost. Then, we apply the idea to a linear symmetric Cournot duopoly, in which the default choice can be interpreted as the most natural quantity to be produced in the market. We find that, as the thinking cost increases, the number of firms thinking in equilibrium decreases. More interestingly, for intermediate levels of thinking cost, there exists an equilibrium in which one of the firms chooses the default quantity and the other best responds to it, generating asymmetric choices in a symmetric model. Our model is able to explain well-known regularities identified in the Cournot experimental literature, such as the adoption of different strategies by players (Huck et al., 1999), the intertemporal rigidity of choices (Bosch-Domènech & Vriend, 2003) and the dispersion of quantities in the context of difficult decision making (Bosch-Domènech & Vriend, 2003). Chapter 3 applies a model of bounded rationality in a game-theoretic setting to the well-known turnout paradox: in large elections, pivotal probabilities vanish very quickly and no one should vote, in sharp contrast with the observed high levels of turnout. Inspired by the concept of rhizomatic thinking, introduced by Bravo-Furtado & Côrte-Real (2009a), we assume that each person is self-delusional in the sense that, when making a decision, she believes that a fraction of the people who support the same party decides alike, even if no communication is established between them. This kind of belief simplifies the decision of the agent, as it reduces the number of players he believes to be playing against; it is thus a bounded rationality approach. Studying a two-party first-past-the-post election with a continuum of self-delusional agents, we show that the turnout rate is positive in all the possible equilibria and that it can be as high as 100%. The game displays multiple equilibria, at least one of which entails a victory of the bigger party. The smaller party may also win, provided its relative size is not too small; a greater degree of self-delusion among minority-party voters decreases this threshold size. Our model is able to explain some empirical facts, such as the possibility that a close election leads to low turnout (Geys, 2006), a lower margin of victory when turnout is higher (Geys, 2006) and high turnout rates favoring the minority (Bernhagen & Marsh, 1997).
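To make the thinking-cost mechanism of Chapter 2 concrete, the following minimal numerical sketch checks which think/default profiles survive as equilibria of a linear symmetric Cournot duopoly. The parametrization (inverse demand P = a - b(q1 + q2) with a = b = 1, zero marginal cost, default quantity q0 = 0.3) and the code itself are illustrative assumptions, not the thesis's actual specification.

```python
# Minimal sketch (illustrative assumptions, not the thesis's exact model):
# a linear symmetric Cournot duopoly with inverse demand P = a - b*(q1 + q2),
# zero marginal cost, a thinking cost c, and a default quantity q0. A firm
# either "thinks" (pays c and best-responds) or plays the default q0 for free.

def profit(q_i, q_j, a=1.0, b=1.0):
    """Cournot profit of a firm producing q_i against rival output q_j."""
    return q_i * max(a - b * (q_i + q_j), 0.0)

def best_response(q_j, a=1.0, b=1.0):
    """Profit-maximizing quantity against rival output q_j."""
    return max((a - b * q_j) / (2 * b), 0.0)

def equilibria(c, q0, a=1.0, b=1.0):
    """Pure-strategy think/default profiles no firm wants to deviate from."""
    found = []
    for s1 in (True, False):            # True = firm 1 thinks
        for s2 in (True, False):
            q1 = q2 = q0
            for _ in range(200):        # iterate to the profile's fixed point
                q1 = best_response(q2, a, b) if s1 else q0
                q2 = best_response(q1, a, b) if s2 else q0
            stable = True
            for s_i, q_i, q_j in ((s1, q1, q2), (s2, q2, q1)):
                payoff = profit(q_i, q_j, a, b) - (c if s_i else 0.0)
                deviation = (profit(q0, q_j, a, b) if s_i
                             else profit(best_response(q_j, a, b), q_j, a, b) - c)
                if deviation > payoff + 1e-12:
                    stable = False
            if stable:
                found.append(("think" if s1 else "default",
                              "think" if s2 else "default"))
    return found

# As c grows, equilibria with fewer thinking firms take over; at intermediate
# c only the asymmetric (think, default) profiles survive.
for c in (0.001, 0.002, 0.02):
    print(c, equilibria(c, q0=0.3))
```

Under this toy parametrization, both firms think for very small c, only the asymmetric profiles survive for intermediate c, and both take the default for larger c, echoing the comparative statics described above.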
Abstract:
INTRODUCTION: Various methods are used for the diagnosis of visceral leishmaniasis (VL), such as microscopic examination, culture, and inoculation of laboratory animals; however, serological assays, with a wide range of specificities and sensitivities, are commonly used for the detection of antibodies in serum samples. METHODS: The purpose of this study was to compare three serological methods, namely rA2-ELISA, the recombinant KE16 (rKE16) dipstick test and the direct agglutination test (DAT), for the detection of antibodies against VL antigens. The assays utilized 350 randomly selected serum samples from domestic dogs with clinical symptoms, as well as samples from asymptomatic and healthy dogs, from rural and urban areas of the Meshkinshahr district, northwestern Iran. RESULTS: Samples were assessed, and the following positive rates were obtained: 11.5% by rKE16, 26.9% by DAT, and 49.8% by ELISA. The sensitivity among symptomatic dogs was 32.4% with rKE16, 100% with DAT and 52.9% with ELISA. Conversely, rA2-ELISA was less specific for asymptomatic dogs, at 46.5%, compared with DAT, at 88.9%. CONCLUSIONS: This study recommends rA2-ELISA as a parallel assay combined with DAT to detect VL infection among dogs. Further evaluations should be performed to develop an inexpensive and reliable serologic test for the detection of Leishmania infantum among infected dogs.
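For context on the "parallel assay" recommendation: when two tests are combined in parallel (the combination is read as positive if either test is positive), the standard composite accuracy under a conditional-independence assumption is

$$\mathrm{Se}_{\mathrm{par}}=1-(1-\mathrm{Se}_{A})(1-\mathrm{Se}_{B}),\qquad \mathrm{Sp}_{\mathrm{par}}=\mathrm{Sp}_{A}\,\mathrm{Sp}_{B},$$

so pairing rA2-ELISA with DAT raises sensitivity at the cost of some specificity. These are textbook formulas, not calculations reported in the study.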
Abstract:
An integrative literature review was conducted to synthesize available publications regarding the potential use of serological tests in leprosy programs. We searched the databases Literatura Latino-Americana e do Caribe em Ciências da Saúde, Índice Bibliográfico Espanhol em Ciências da Saúde, Acervo da Biblioteca da Organização Pan-Americana da Saúde, Medical Literature Analysis and Retrieval System Online, Hanseníase, National Library of Medicine, Scopus, Ovid, Cinahl, and Web of Science for articles investigating the use of serological tests for antibodies against phenolic glycolipid-I (PGL-I), ML0405, ML2331, leprosy IDRI diagnostic-1 (LID-1), and natural disaccharide octyl-leprosy IDRI diagnostic-1 (NDO-LID). From an initial pool of 3,514 articles, 40 full-length articles fulfilled our inclusion criteria. Based on these papers, we concluded that these antibodies can be used to assist in diagnosing leprosy, detecting neuritis, monitoring therapeutic efficacy, and monitoring household contacts or at-risk populations in leprosy-endemic areas. Thus, available data suggest that serological tests could contribute substantially to leprosy management.
Abstract:
INTRODUCTION: Acceptance of the IT LEISH® and the direct agglutination test made in the Laboratório de Pesquisas Clínicas (DAT-LPC) by healthcare professionals and patients suspected of visceral leishmaniasis (VL) in Ribeirão das Neves was evaluated. METHODS: Ninety-two patients and 47 professionals completed three questionnaires. RESULTS: Eighty-eight (96%) patients considered fingertip blood collection a positive test feature, and 86% (37) and 91% of professionals considered the IT LEISH® easy to perform and interpret, respectively. All professionals classified the DAT-LPC as simple and easy. CONCLUSIONS: Patients and healthcare professionals in Ribeirão das Neves demonstrated a high degree of acceptance of the IT LEISH® and the DAT-LPC.
Abstract:
Liver function and its correlation with bilirubin and hepatic enzymes were evaluated in 30 male chronic asymptomatic or oligosymptomatic alcoholics admitted to a psychiatric hospital for detoxification and treatment of alcoholism. Hypoalbuminemia, lowered prothrombin activity, hypotransferrinemia and hypofibrinogenemia were detected in 32%, 32%, 28%, and 24% of patients, respectively. Transferrin was elevated in 8%. A greater prevalence of hyperbilirubinemia was found in patients with lowered prothrombin activity, hypofibrinogenemia, or hypotransferrinemia. No correlation was found between serum bilirubin or aminotransferase levels and normal or elevated albumin levels, prothrombin time or activity, or fibrinogen levels. Serum alkaline phosphatase was elevated in normoalbuminemic patients, and gamma-glutamyltransferase was elevated in patients with lowered prothrombin activity. Hypoalbuminemia was associated with hypofibrinogenemia; hypotransferrinemia with elevated aspartate aminotransferase or gamma-glutamyltransferase; and hypertransferrinemia with elevated alanine aminotransferase. These data indicate the occurrence of hepatic dysfunction due to liver damage caused directly by alcohol or by alcoholism-associated nutritional deficiencies.
Abstract:
Wireless Sensor Networks (WSN) are networks of sensing and actuating devices that communicate over wireless radios. Successfully implementing a wireless device requires taking into consideration the wide variety of radios available, a large number of communication parameters (payload, duty cycle, etc.), and the environmental conditions that may affect the device's behaviour. Evaluating a specific radio for a particular application may require trial experiments, and with so many devices, communication parameters and environmental conditions to consider, the number of trial cases generated can be surprisingly high. Manual validation of wireless communication technologies through field trials therefore becomes unsuitable. To overcome this issue, an automated test methodology was introduced, making it possible to acquire data on a device's behaviour when testing the technologies and parameters relevant to a specific analysis. This method therefore advances the validation and analysis of wireless radios and allows validation to be performed without specific, in-depth knowledge of wireless devices.
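As an illustration of the combinatorial growth referred to above, the sketch below enumerates trial cases over a few hypothetical radios, parameters and environments and sweeps them with a placeholder automated runner; all names and values are assumptions, not the dissertation's actual test plan.

```python
# Illustrative sketch of why manual trials do not scale: enumerating every
# combination of (hypothetical) radios, communication parameters and
# environments quickly yields hundreds of trial cases, which an automated
# harness can iterate over instead of a human operator.
from itertools import product

radios       = ["cc2420", "nrf24l01", "sx1276"]        # candidate radios (assumed)
payloads     = [16, 32, 64, 128]                       # payload sizes (bytes)
duty_cycles  = [0.01, 0.1, 1.0]                        # radio duty cycles
tx_powers    = [-10, 0, 10]                            # transmit power (dBm)
environments = ["indoor", "outdoor", "industrial"]     # test environments

trial_cases = list(product(radios, payloads, duty_cycles, tx_powers, environments))
print(f"{len(trial_cases)} trial cases")               # 3*4*3*3*3 = 324

def run_trial(radio, payload, duty_cycle, tx_power, environment):
    """Placeholder for one automated trial: configure the device, exchange
    packets, and record metrics such as packet delivery ratio and RSSI."""
    return {"pdr": None, "rssi": None}                 # filled in by real hardware

results = [run_trial(*case) for case in trial_cases]   # automated sweep
```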
Abstract:
In this work, the mode I fracture parameters of steel fibre reinforced self-compacting concrete (SFRSCC) were derived from the numerical simulation of indirect splitting tensile tests. The combined experimental and numerical research allowed a comparison between the stress-crack width (σ-w) relationship obtained directly from uniaxial tensile tests and the σ-w response derived from inverse analysis of the splitting tensile test results. For this purpose, a comprehensive nonlinear 3D finite element (FE) modeling strategy was developed. A comparison between the experimental results obtained from splitting tensile tests and the corresponding FE simulations confirmed the good accuracy of the proposed strategy for deriving the σ-w relationship for these composites. It is concluded that the post-cracking tensile laws obtained from inverse analysis agree closely with the ones obtained from the experimental uniaxial tensile tests.
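The inverse-analysis step can be pictured as the schematic loop below: propose parameters of a post-cracking σ-w law, run the forward model, and minimize the misfit to the measured splitting-test response. In this sketch the nonlinear 3D FE simulation is replaced by a cheap placeholder, and the bilinear law and parameter values are illustrative assumptions, not the paper's calibrated results.

```python
# Schematic inverse-analysis loop (illustrative assumptions, not the paper's
# calibrated model): fit the parameters of a bilinear sigma-w softening law by
# least squares against a measured response. The FE simulation is replaced by
# a placeholder so the sketch is self-contained.
import numpy as np
from scipy.optimize import least_squares

def sigma_w(w, f_t, w1, sigma1, w_max):
    """Bilinear softening law: bridging stress as a function of crack width w."""
    first = f_t + (sigma1 - f_t) * w / w1                          # 0 <= w < w1
    second = np.maximum(sigma1 * (w_max - w) / (w_max - w1), 0.0)  # w >= w1
    return np.where(w < w1, first, second)

def forward_model(params, w_grid):
    """Stand-in for the nonlinear 3D FE simulation that maps a candidate
    sigma-w law to a simulated splitting-test response curve."""
    return sigma_w(w_grid, *params)

w_grid = np.linspace(0.0, 2.0, 50)                        # crack widths (mm)
measured = forward_model([3.5, 0.05, 1.2, 2.0], w_grid)   # synthetic "experiment"

def residuals(params):
    return forward_model(params, w_grid) - measured

fit = least_squares(residuals, x0=[3.0, 0.10, 1.0, 1.8],
                    bounds=([0.1, 0.01, 0.0, 0.5], [10.0, 0.4, 5.0, 5.0]))
print("identified sigma-w parameters:", np.round(fit.x, 3))
```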
Abstract:
This paper presents the findings of an experimental campaign that was conducted to investigate the seismic behaviour of log houses. A two-storey log house designed by the Portuguese company Rusticasa® was subjected to a series of shaking table tests at LNEC, Lisbon, Portugal. The paper contains the description of the geometry and construction of the house and all the aspects related to the testing procedure, namely the pre-design, the setup, instrumentation and the testing process itself. The shaking table tests were carried out with a scaled spectrum of the Montenegro (1979) earthquake, at increasing levels of PGA, starting from 0.07g, moving on to 0.28g and finally 0.5g. The log house did not suffer any major damage and remained in working condition throughout the entire process. The preliminary analysis of the overall behaviour of the log house is also discussed.
Abstract:
Rockburst is characterized by a violent explosion of a block causing a sudden rupture in the rock and is quite common in deep tunnels. It is critical to understand the phenomenon of rockburst, focusing on the patterns of occurrence, so these events can be avoided and/or managed, saving costs and possibly lives. The failure mechanism of rockburst needs to be better understood. Laboratory experiments are under way at the State Key Laboratory for Geomechanics and Deep Underground Engineering (SKLGDUE) in Beijing, and the testing system is described. A large number of rockburst tests were performed, and their information was collected, stored in a database, and analyzed. Data mining (DM) techniques were applied to the database in order to develop predictive models for the rockburst maximum stress (σRB) and the rockburst risk index (IRB), parameters that otherwise require the results of such tests to be determined. With the developed models it is possible to predict these parameters with high accuracy using data about the rock mass and the specific project.
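As a sketch of the data-mining step, the snippet below fits and cross-validates a regression model predicting the rockburst maximum stress from rock-mass descriptors; the feature names, the synthetic data and the choice of random forests are assumptions for illustration, since the abstract does not name the DM algorithms used.

```python
# Hypothetical sketch of the DM step: fit and cross-validate a model that
# predicts the rockburst maximum stress from rock-mass/project descriptors.
# Features and data are synthetic stand-ins for the laboratory database.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([
    rng.uniform(50, 250, n),     # uniaxial compressive strength (MPa)
    rng.uniform(10, 90, n),      # elastic modulus (GPa)
    rng.uniform(200, 2000, n),   # overburden depth (m)
])
# Synthetic target standing in for the measured rockburst maximum stress.
y = 0.4 * X[:, 0] + 0.02 * X[:, 2] + rng.normal(0, 5, n)

model = RandomForestRegressor(n_estimators=300, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print("cross-validated R^2:", scores.mean().round(3))
```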
Abstract:
Wind tunnel tests are a reliable tool to determine the effect of natural ventilation on buildings. This paper presents the results of wind tunnel tests conducted to evaluate the influence of the positioning of ventilation modules on a façade system. The positioning of the modules was modified, resulting in different façade configurations. The tests were carried out on a model, varying the position of the ventilation modules in the façade configuration. The cases tested were six ventilation modules positioned below the window-sill (ventilated window-sill), and three ventilation modules positioned above and below the façade. The proposed façade system was movable and interchangeable, so that the same basic model could be used to test the different ventilation possibilities. Wind speed measurements were taken inside and outside the model for the different façade configurations to evaluate which performed best in terms of natural ventilation. Single-sided and cross ventilation were considered for the wind speed measurements. Results show that the use of six ventilation modules positioned below the window-sill, forming a "ventilated window-sill", is the best solution in terms of natural ventilation.
Abstract:
The receiver-operating characteristic (ROC) curve is the most widely used measure for evaluating the performance of a diagnostic biomarker when predicting a binary disease outcome. The ROC curve displays the true positive rate (or sensitivity) and the false positive rate (or 1-specificity) for different cut-off values used to classify an individual as healthy or diseased. In time-to-event studies, however, the disease status (e.g. dead or alive) of an individual is not a fixed characteristic, and it varies over the course of the study. In such cases, when evaluating the performance of the biomarker, several issues should be taken into account: first, the time-dependent nature of the disease status; and second, the presence of incomplete data (e.g. censored data typically present in survival studies). Accordingly, to assess the discrimination power of continuous biomarkers for time-dependent disease outcomes, time-dependent extensions of the true positive rate, the false positive rate, and the ROC curve have been recently proposed. In this work, we present new nonparametric estimators of the cumulative/dynamic time-dependent ROC curve that allow accounting for the possible modifying effect of current or past covariate measures on the discriminatory power of the biomarker. The proposed estimators can accommodate right-censored data, as well as covariate-dependent censoring. The behavior of the estimators proposed in this study will be explored through simulations and illustrated using data from a cohort of patients who suffered from acute coronary syndrome.
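For reference, the cumulative/dynamic quantities this abstract builds on are standardly defined, for a biomarker $Y$ and event time $T$, as

$$\mathrm{TPR}^{\mathbb{C}}_{t}(c)=P(Y>c\mid T\le t),\qquad \mathrm{FPR}^{\mathbb{D}}_{t}(c)=P(Y>c\mid T>t),$$

$$\mathrm{ROC}^{\mathbb{C}/\mathbb{D}}_{t}(p)=\mathrm{TPR}^{\mathbb{C}}_{t}\big\{(\mathrm{FPR}^{\mathbb{D}}_{t})^{-1}(p)\big\},\qquad p\in(0,1).$$

The estimators proposed in the work additionally condition these quantities on current or past covariate values and accommodate right-censored, possibly covariate-dependent censoring.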
Abstract:
In longitudinal studies of disease, patients may experience several events through a follow-up period. In these studies, the sequentially ordered events are often of interest and lead to problems that have received much attention recently. Issues of interest include the estimation of bivariate survival, marginal distributions and the conditional distribution of gap times. In this work, we consider the estimation of the survival function conditional on a previous event. Different nonparametric approaches will be considered for estimating these quantities, all based on the Kaplan-Meier estimator of the survival function. We explore the finite sample behavior of the estimators through simulations. The different methods proposed in this article are applied to a data set from a German Breast Cancer Study. The methods are used to obtain predictors for the conditional survival probabilities as well as to study the influence of recurrence on overall survival.
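A minimal numerical sketch of the simplest estimator in this spirit: condition on a first event (e.g. recurrence) having occurred by a landmark time s, then apply the ordinary Kaplan-Meier estimator to the second gap time in that subsample. Column names, the toy data and the landmark value are illustrative assumptions; the estimators compared in the paper are more elaborate variants built on the same Kaplan-Meier ingredient.

```python
# Naive conditional survival sketch: Kaplan-Meier on the second gap time,
# restricted to subjects whose first event occurred by the landmark time s.
import pandas as pd
from lifelines import KaplanMeierFitter

df = pd.DataFrame({
    "t1":   [12, 5, 20, 8, 15, 3],    # time to first event (e.g. recurrence)
    "d1":   [1, 1, 0, 1, 1, 1],       # first-event indicator
    "gap2": [30, 14, 10, 22, 9, 40],  # time from first event to death/censoring
    "d2":   [1, 0, 0, 1, 1, 0],       # death indicator for the second gap time
})

s = 10  # landmark: condition on a first event observed by time s
cond = df[(df["d1"] == 1) & (df["t1"] <= s)]

kmf = KaplanMeierFitter()
kmf.fit(cond["gap2"], event_observed=cond["d2"], label="S(t2 | T1 <= %d)" % s)
print(kmf.survival_function_)
```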
Abstract:
Purpose. To analyze dry eye disease (DED) tests and their consistency in similar nonsymptomatic population samples living in two geographic locations with different climates (Continental vs. Atlantic). Methods. This is a pilot study including 14 nonsymptomatic residents from Valladolid (Continental climate, Spain) and 14 sex-matched and similarly aged residents from Braga (Atlantic climate, Portugal); they were assessed during the same season (spring) of two consecutive years. Phenol red thread test, conjunctival hyperemia, fluorescein tear breakup time, corneal and conjunctival staining, and Schirmer test were evaluated on three different consecutive visits. Reliability was assessed using the intraclass correlation coefficient (ICC) and the weighted kappa (κ) coefficient for quantitative and ordinal variables, respectively. Results. Fourteen subjects were recruited in each city, with a mean (±SD) age of 63.0 (±1.7) and 59.1 (±0.9) years (p = 0.08) in Valladolid and Braga, respectively. ICC and κ values of the tests performed were below 0.69 and 0.61, respectively, for both samples, thus showing moderate to poor reliability. Subsequently, comparisons were made between the results corresponding to the middle and higher outdoor relative humidity (RH) visit in each location, as there were no differences in mean temperature (p ≥ 0.75) despite RH values significantly differing (p ≤ 0.005). Significant (p ≤ 0.05) differences were observed between the Valladolid and Braga samples in tear breakup time (middle RH visit, 2.76 ± 0.60 vs. 5.26 ± 0.64 seconds; higher RH visit, 2.61 ± 0.32 vs. 5.78 ± 0.88 seconds), corneal staining (middle RH, 0.64 ± 0.17 vs. 0.14 ± 0.10; higher RH, 0.60 ± 0.22 vs. 0.0 ± 0.0) and conjunctival staining (middle RH, 0.61 ± 0.17 vs. 0.14 ± 0.08; higher RH, 0.57 ± 0.15 vs. 0.18 ± 0.09). Conclusions. This pilot study provides initial evidence that DED test outcomes assessing ocular surface integrity and tear stability are climate dependent. Future large-sample studies should confirm these outcomes, also in DED patients. This knowledge is fundamental for multicenter clinical trials. The lack of consistency in diagnostic clinical tests for DED was also corroborated. (Optom Vis Sci 2015;92:e284-e289)
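As a pointer to the agreement statistics used here, the sketch below computes a weighted kappa for an ordinal grade (e.g. corneal staining) across two visits with scikit-learn; the grades are made up, and linear weighting is an assumption, since the abstract does not state the weighting scheme.

```python
# Weighted kappa for ordinal repeatability across two visits (toy data).
from sklearn.metrics import cohen_kappa_score

visit1 = [0, 1, 2, 1, 0, 3, 2, 1]   # ordinal staining grade at visit 1
visit2 = [0, 1, 1, 2, 0, 2, 2, 1]   # same subjects regraded at visit 2

kappa_w = cohen_kappa_score(visit1, visit2, weights="linear")
print(f"weighted kappa: {kappa_w:.2f}")  # values <= 0.61 read as moderate/poor
```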
Abstract:
OBJECTIVE: To compare the patterns of exploratory eye movements during visual scanning of Rorschach and TAT test cards in people with schizophrenia and controls. METHOD: Ten participants with schizophrenia and 10 controls matched by age, schooling and intellectual level participated in the study. Severity of symptoms was evaluated with the Positive and Negative Syndrome Scale. Test cards were divided into three groups: TAT cards with scene content, TAT cards with interaction content (TAT-faces), and Rorschach cards with abstract images. Eye movements were analyzed for the total number, duration and location of fixations, and the length of saccadic movements. RESULTS: A different pattern of eye movements was found, with schizophrenia participants showing a lower number of fixations but longer fixation durations on Rorschach cards and TAT-faces. The biggest difference was observed for the Rorschach cards, followed by TAT-faces and TAT-scene cards. CONCLUSIONS: The results suggest an alteration in visual exploration mechanisms, possibly related to the integration of abstract visual information.