828 results for expected value of information


Relevance: 100.00%

Abstract:

On May 25, 2018, the EU introduced the General Data Protection Regulation (GDPR), which protects the personal information of EU citizens by requiring companies to explain clearly how people’s information is used. To comply with the new law, European and non-European companies interacting with EU citizens undertook a massive data re-permission-request campaign. However, while the EU Regulator was particularly specific in defining the conditions under which customers’ data may be accessed, it did not specify how the communication between firms and consumers should be designed. This left firms free to develop their re-permission emails as they liked, plausibly coupling the informative nature of these privacy-related communications with other persuasive techniques to maximize data disclosure. We took advantage of this colossal wave of simultaneous requests to provide insights into two issues. First, we investigate how companies across industries and countries chose to frame their requests. Second, we investigate which factors influenced the selection of alternative re-permission formats. To achieve these goals, we examine the content of a sample of 1506 re-permission emails sent by 1396 firms worldwide and identify the dominant “themes” characterizing these emails. We then relate these themes to both the expected benefits firms may derive from data usage and the possible risks they may face from not being fully compliant with the spirit of the law. Our results show that: (1) most firms enriched their re-permission messages with persuasive arguments aimed at increasing consumers’ likelihood of relinquishing their data; (2) the use of persuasion is the outcome of a difficult tradeoff between costs and benefits; and (3) most companies acted in their self-interest and “gamed the system”. Our results have important implications for policymakers, managers, and customers in the online sector.

Relevance: 100.00%

Abstract:

In this work we study an agent-based model to investigate the role of the degree of asymmetric information in market evolution. The model is quite simple and may be treated analytically, since consumers evaluate the quality of a given good taking into account only the quality of the last good purchased together with their perceptive capacity beta. As a consequence, the system evolves according to a stationary Markov chain. The value of a good offered by the firms increases with quality according to an exponent alpha, which is a measure of the technology: it incorporates the full technological capacity of the production system, such as education, scientific development and techniques that change productivity rates. The technological level plays an important role in explaining how information asymmetry may affect market evolution in this model. We observe that, for high technological levels, the market can detect adverse selection. The model allows us to compute the maximum degree of asymmetric information before the market collapses. Below this critical point the market evolves during a limited period of time and then dies out completely. When beta is close to 1 (symmetric information), the market becomes more profitable for high-quality goods, although high- and low-quality markets coexist. The maximum asymmetric information level is a consequence of an ergodicity breakdown in the process of quality evaluation.
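
The dynamics described above can be sketched in a few lines of simulation code. The snippet below is only a rough illustration: it keeps the abstract's notation for the perceptive capacity beta and the technology exponent alpha, but the purchase rule, the replacement rule and all numerical values are assumptions, not the authors' exact model.

```python
import numpy as np

def simulate_market(alpha=2.0, beta=0.8, n_consumers=1000, n_steps=200, seed=0):
    """Toy market sketch: beta weights the consumer's perception, alpha links value to quality."""
    rng = np.random.default_rng(seed)
    quality = rng.uniform(0.0, 1.0, n_consumers)   # quality of the good last offered to each consumer
    active_share = []
    for _ in range(n_steps):
        # Perceived quality mixes the last purchased good's quality with noise,
        # weighted by the perceptive capacity beta (beta = 1 -> symmetric information).
        perceived = beta * quality + (1.0 - beta) * rng.uniform(0.0, 1.0, n_consumers)
        price = quality ** alpha                   # value of the good grows with quality via alpha
        buys = perceived >= 0.5 * price            # assumed purchase rule
        active_share.append(buys.mean())
        # Assumed replacement rule: consumers who did not buy are offered a new random good.
        quality = np.where(buys, quality, rng.uniform(0.0, 1.0, n_consumers))
    return np.array(active_share)

print(f"final share of buyers: {simulate_market(alpha=2.0, beta=0.6)[-1]:.2f}")
```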

Relevance: 100.00%

Abstract:

Background: Left atrial volume indexed (LAVI) has been reported as a predictor of cardiovascular events. We sought to determine the prognostic value of LAVI for predicting the outcome of patients who underwent dobutamine stress echocardiography (DSE) for known or suspected coronary artery disease (CAD). Methods: From January 2000 to July 2005, we studied 981 patients who underwent DSE with off-line measurement of LAVI. The value of DSE over clinical and LAVI data was examined using a stepwise log-rank test. Results: During a median follow-up of 24 months, 56 (6%) events occurred. By univariate analysis, predictors of events were male sex, diabetes mellitus, previous myocardial infarction, left ventricular ejection fraction (LVEF), left atrial diameter indexed, LAVI, and abnormal DSE. By multivariate analysis, independent predictors were LVEF (relative risk [RR] = 0.98, 95% CI 0.95-1.00), LAVI (RR = 1.04, 95% CI 1.02-1.05), and abnormal DSE (RR = 2.70, 95% CI 1.28-5.69). In an incremental multivariate model, LAVI added prognostic value to clinical data for predicting events (χ² = 36.8, P < .001). The addition of DSE to clinical and LAVI data yielded incremental information (χ² = 55.3, P < .001). The 3-year event-free survival was 96% in patients with normal DSE and LAVI ≤ 33 mL/m²; 91% with abnormal DSE and LAVI ≤ 33 mL/m²; 83% with normal DSE and LAVI > 34 mL/m²; and 51% with abnormal DSE and LAVI > 34 mL/m². Conclusion: Left atrial volume indexed provides independent prognostic information in patients undergoing DSE for known or suspected CAD. Among patients with normal DSE, those with larger LAVI had worse outcomes, and among patients with abnormal DSE, LAVI remained predictive. (Am Heart J 2008;156:1110-6.)
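
A hedged sketch of the incremental-value idea used here (does LAVI add to clinical data, and does DSE add to clinical data plus LAVI?) is given below. The study itself used survival analysis with a stepwise log-rank test; for brevity the sketch uses a plain logistic model and a likelihood-ratio chi-square on simulated data, so all variable names and numbers are assumptions.

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats

# Simulated data stand in for the cohort; effect sizes and the baseline rate are
# assumptions chosen only to produce a plausible number of events.
rng = np.random.default_rng(1)
n = 981
lvef = rng.normal(55, 10, n)              # left ventricular ejection fraction (%)
lavi = rng.normal(32, 8, n)               # left atrial volume indexed (mL/m^2)
dse_abnormal = rng.binomial(1, 0.3, n)    # abnormal dobutamine stress echo
logit = -3.3 - 0.02 * (lvef - 55) + 0.04 * (lavi - 32) + 1.0 * dse_abnormal
event = rng.binomial(1, 1 / (1 + np.exp(-logit)))

def fit_ll(columns):
    """Log-likelihood of a logistic model with the given predictor columns."""
    X = sm.add_constant(np.column_stack(columns))
    return sm.Logit(event, X).fit(disp=0).llf

ll_clinical = fit_ll([lvef])
ll_lavi = fit_ll([lvef, lavi])
ll_full = fit_ll([lvef, lavi, dse_abnormal])

for label, ll0, ll1 in [("LAVI over clinical", ll_clinical, ll_lavi),
                        ("DSE over clinical + LAVI", ll_lavi, ll_full)]:
    chi2 = 2 * (ll1 - ll0)                # likelihood-ratio chi-square, 1 df per added term
    print(f"{label}: chi2 = {chi2:.1f}, p = {stats.chi2.sf(chi2, 1):.3g}")
```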

Relevance: 100.00%

Abstract:

Knowledge is central to the modern economy and society. Indeed, the knowledge society has transformed the concept of knowledge and is increasingly aware of the need to overcome a lack of knowledge when it has to make choices or address its problems and dilemmas. One’s knowledge is less based on exact facts and more on hypotheses, perceptions or indications. Even when we use new computational artefacts and novel methodologies for problem solving, such as Group Decision Support Systems (GDSSs), the question of incomplete information is in most situations marginalized. On the other hand, common sense tells us that when a decision is made it is impossible to have a perception of all the information involved and of the nature of its intrinsic quality. Therefore, something has to be done with the information that is available and with the process of its evaluation. It is within this framework that a Multi-valued Extended Logic Programming language is used for knowledge representation and reasoning, leading to a model that embodies Quality-of-Information (QoI) and its quantification along the several stages of the decision-making process. In this way, it is possible to provide a measure of the value of the QoI that supports the decision itself. The model is presented here in the context of a GDSS for VirtualECare, a system aimed at sustaining online healthcare services.
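
The abstract does not give the QoI formula, so the snippet below is only a hypothetical illustration of how a quality-of-information score might be attached to incomplete knowledge (exact value known, one of k admissible values known, or nothing known) and aggregated across the facts supporting a decision; it is not the Multi-valued Extended Logic Programming model itself.

```python
from dataclasses import dataclass

@dataclass
class Statement:
    name: str
    value: object = None      # exact value, if known
    alternatives: int = 0     # number of admissible alternative values, if only a set is known

def qoi(stmt: Statement) -> float:
    """Crude, illustrative QoI score for a single statement."""
    if stmt.value is not None:
        return 1.0                        # exact knowledge
    if stmt.alternatives > 0:
        return 1.0 / stmt.alternatives    # one of k admissible values is true
    return 0.0                            # completely unknown

# Hypothetical facts available to a healthcare-oriented group decision.
facts = [Statement("blood_pressure", value=135),
         Statement("medication", alternatives=3),
         Statement("allergy_history")]

overall = sum(qoi(f) for f in facts) / len(facts)
print(f"QoI per fact: {[round(qoi(f), 2) for f in facts]}, overall = {overall:.2f}")
```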

Relevance: 100.00%

Abstract:

The foot and the ankle are small structures commonly affected by disorders, and their complex anatomy represents a significant diagnostic challenge. SPECT/CT image fusion can add the missing anatomical and bone-structure information to functional imaging, which is particularly useful for increasing diagnostic certainty in bone pathology. However, because of the duration of SPECT acquisition, a patient’s involuntary movements may lead to misalignment between the SPECT and CT images. Patient motion can be reduced using a dedicated patient support. We aimed to design an ankle and foot immobilizing device and to measure its efficacy at improving image fusion. Methods: We enrolled 20 patients undergoing distal lower-limb SPECT/CT of the ankle and foot with and without a foot holder. The misalignment between SPECT and CT images was computed by manually measuring 14 fiducial markers chosen among anatomical landmarks also visible on bone scintigraphy. Analysis of variance was performed for statistical analysis. Results: The absolute average difference without and with the support was 5.1±5.2 mm (mean±SD) and 3.1±2.7 mm, respectively, a significant difference (p<0.001). Conclusion: The introduction of the foot holder significantly decreases misalignment between SPECT and CT images, which may have clinical relevance for the precise localization of foot and ankle pathology.
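
A small sketch of the comparison reported above, using simulated marker offsets in place of the real measurements: the means and standard deviations are chosen to mimic the reported 5.1±5.2 mm and 3.1±2.7 mm, and a one-way analysis of variance stands in for the study's statistical test.

```python
import numpy as np
from scipy import stats

# Simulated absolute misalignment (mm) for 14 fiducial markers in 20 patients,
# with and without the foot holder. Values are assumptions, not the study data.
rng = np.random.default_rng(42)
without_support = np.abs(rng.normal(5.1, 5.2, size=20 * 14))
with_support = np.abs(rng.normal(3.1, 2.7, size=20 * 14))

print(f"without: {without_support.mean():.1f} +/- {without_support.std(ddof=1):.1f} mm")
print(f"with:    {with_support.mean():.1f} +/- {with_support.std(ddof=1):.1f} mm")

# One-way ANOVA across the two acquisition conditions.
f_stat, p_value = stats.f_oneway(without_support, with_support)
print(f"F = {f_stat:.1f}, p = {p_value:.2g}")
```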

Relevance: 100.00%

Abstract:

Introduction: Dengue is prevalent in many tropical and sub-tropical regions. The clinical diagnosis of dengue is still complex, and not much data are available. This work aimed to assess the diagnostic accuracy of the tourniquet test in patients with suspected dengue infection, and its positivity across the different classifications of the disease, as reported to the Information System for Notifiable Diseases in Belo Horizonte, State of Minas Gerais, Brazil, between 2001 and 2006. Methods: Cross-sectional analysis of the diagnostic accuracy of the tourniquet test for dengue, using the IgM-anti-DENV ELISA as the gold standard. Results: We selected 9,836 suspected cases, of which 41.1% were confirmed to be dengue. Classic dengue was present in 95.8%, dengue with complications in 2.5% and dengue hemorrhagic fever in 1.7%. The tourniquet test was positive in 16.9% of classic dengue cases, 61.7% of dengue cases with complications and 82.9% of cases of dengue hemorrhagic fever. The sensitivity and specificity of the tourniquet test were 19.1% and 86.4%, respectively. Conclusions: A positive tourniquet test can be a valuable tool to support the diagnosis of dengue where laboratory tests are not available. However, a negative test should not be interpreted as the absence of infection. In addition, the tourniquet test proved to be an indicator of dengue severity.
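
The sensitivity and specificity quoted above come from a standard 2×2 comparison against the ELISA gold standard. The counts below are approximate, back-calculated from the abstract's percentages purely to illustrate the computation.

```python
# Tourniquet test vs. IgM-anti-DENV ELISA. Approximate counts reconstructed from
# 9,836 suspected cases with 41.1% confirmed, 19.1% sensitivity, 86.4% specificity.
tp, fn = 772, 3271     # ELISA-positive cases with positive / negative tourniquet test
fp, tn = 788, 5005     # ELISA-negative cases with positive / negative tourniquet test

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)   # positive predictive value (derived from the assumed counts)
npv = tn / (tn + fn)   # negative predictive value (derived from the assumed counts)

print(f"sensitivity = {sensitivity:.1%}, specificity = {specificity:.1%}")
print(f"PPV = {ppv:.1%}, NPV = {npv:.1%}")
```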

Relevance: 100.00%

Abstract:

Information systems are widespread and used by anyone with a computing device, as well as by corporations and governments. It is often the case that security leaks are introduced during the development of an application. The reasons for these security bugs are many, but among them is the fact that it is very hard to define and enforce relevant security policies in modern software. This is because modern applications often rely on container sharing and multi-tenancy where, for instance, data can be stored in the same physical space but is logically mapped into different security compartments or data structures. In turn, these security compartments, into which data is classified by security policies, can also be dynamic and depend on runtime data. In this thesis we introduce and develop the novel notion of dependent information flow types, and focus on the problem of ensuring data confidentiality in data-centric software. Dependent information flow types fit within the standard framework of dependent type theory but, unlike usual dependent types, crucially allow the security level of a type, rather than just the structural data type itself, to depend on runtime values. Our dependent function and dependent sum information flow types provide a direct, natural and elegant way to express and enforce fine-grained security policies on programs, namely programs that manipulate structured data types in which the security level of a structure field may depend on values dynamically stored in other fields. The main contribution of this work is an efficient analysis that allows programmers to verify, during the development phase, whether programs have information leaks, that is, whether programs protect the confidentiality of the information they manipulate. We have also implemented a prototype typechecker, which can be found at http://ctp.di.fct.unl.pt/DIFTprototype/.
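
As a loose illustration of the policies such types can express, the toy snippet below is a runtime analogy only: it checks, at run time, a confidentiality level that depends on a value stored in another field of the same record. It is not the static dependent information flow type system developed in the thesis, and the lattice, field names and rules are invented for the example.

```python
# Toy runtime analogy: the confidentiality level of a record's 'content' field
# depends on runtime data in another field ('owner'), as in a multi-tenant store.
SECURITY_LATTICE = {"public": 0, "user": 1, "admin": 2}

def level_of(record: dict) -> str:
    # The security compartment is determined by a value stored in the record itself.
    return "admin" if record["owner"] == "admin" else "user"

def read_content(record: dict, reader_clearance: str) -> str:
    required = level_of(record)
    if SECURITY_LATTICE[reader_clearance] < SECURITY_LATTICE[required]:
        raise PermissionError(f"clearance '{reader_clearance}' below required '{required}'")
    return record["content"]

store = [{"owner": "alice", "content": "alice's notes"},
         {"owner": "admin", "content": "audit log"}]

print(read_content(store[0], "user"))      # allowed
try:
    read_content(store[1], "user")         # rejected: the level depends on 'owner'
except PermissionError as err:
    print("blocked:", err)
```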

Relevance: 100.00%

Abstract:

The economic value of flounder from shore angling around Ireland was assessed. Flounder catches from shore angling tournaments around Ireland were related to domestic and overseas shore angling expenditure in order to determine an economic value for the species. Temporal trends in flounder angling catches and specimen (trophy) flounder reports were also investigated. Flounder was found to be the most caught shore angling species in competitions around Ireland, constituting roughly one third of the shore angling competition catch, although this varied by area. The total value of flounder from shore angling tourism was estimated to be of the order of €8.4 million. No significant temporal trends in flounder angling catches or specimen reports were found; thus there is no evidence from the current study for any decline in flounder stocks. The population dynamics of 0-group flounder during the early benthic stage were investigated at estuarine sites in Galway Bay, west of Ireland. Information was analysed from the March to June sampling period over five years (2002 to 2006). Spatial and temporal variations in settlement and population length structure were analysed between beach and river habitats and sites. Settlement of flounder began from late March to early May each year, most commonly in April. Peak settlement was usually in April or early May. Settlement was recorded earlier than elsewhere, although it was most commonly similar to the southern part of the UK and northern France. Settlement was generally later in tidal rivers than on sandy beaches. Abundance of 0-group flounder in Galway Bay did not exhibit significant inter-annual variability. 0-group flounder were observed in dense aggregations of up to 105 m⁻², which were patchy in distribution. The highest densities of 0-group flounder were recorded in limnetic and oligohaline areas, compared with lower densities in polyhaline and, to a lesser extent, mesohaline areas. Measurements of salinity allowed the classification of beaches, and of tidal river sections near the mouth, into a salinity-based scheme for length comparisons. Beaches were classified as polyhaline, the lower sections of rivers as mesohaline, and the middle and upper sections as oligohaline. Over the March to June sampling period, 0-group flounder utilised different sections at different length ranges and were significantly larger in more upstream sections. During initial settlement in April, 0-group flounder of 8-10 mm (standard length, SL) were abundant on polyhaline sandy beaches. By about 10 mm (SL), flounder were present in all polyhaline, mesohaline and oligohaline sections. 0-group flounder became absent, or present only in insignificant numbers, in polyhaline and mesohaline sections within a few weeks of first appearance. From April to June, 0-group flounder of 12-30 mm (SL) were found in more upstream locations in the oligohaline sections. About one month (May or June) after initial settlement, 0-group flounder became absent from the oligohaline sections. Concurrently, flounder started to reappear in mesohaline and polyhaline areas at approximately 30 mm (SL) in June. The results indicate that 0-group flounder in the early benthic stage are associated with low-salinity areas, but that this association diminishes as they grow. The results strongly suggest that migration of 0-group flounder between habitats takes place during the early benthic phase.

Relevance: 100.00%

Abstract:

Background: The TIMI score for ST-segment elevation myocardial infarction (STEMI) was created and validated specifically for this clinical scenario, while the GRACE score is generic to any type of acute coronary syndrome. Objective: To identify which of the TIMI and GRACE scores has the better prognostic performance in patients with STEMI. Methods: We included 152 individuals consecutively admitted for STEMI. The TIMI and GRACE scores were tested for their discriminatory ability (C-statistic) and calibration (Hosmer-Lemeshow) in relation to hospital death. Results: The TIMI score showed an even distribution of patients across the low, intermediate and high risk ranges (39%, 27% and 34%, respectively), as opposed to the GRACE score, which showed a predominant distribution at low risk (80%, 13% and 7%, respectively). Case fatality was 11%. The C-statistic of the TIMI score was 0.87 (95% CI = 0.76 to 0.98), similar to that of GRACE (0.87, 95% CI = 0.75 to 0.99) - p = 0.71. The TIMI score showed satisfactory calibration, represented by χ² = 1.4 (p = 0.92), well above the calibration of the GRACE score, which showed χ² = 14 (p = 0.08). This calibration is reflected in the expected incidence ranges for low, intermediate and high risk according to the TIMI score (0%, 4.9% and 25%, respectively), in contrast to GRACE (2.4%, 25% and 73%), whose intermediate-range incidence was inappropriate. Conclusion: Although the scores show similar discriminatory capacity for hospital death, the TIMI score had better calibration than GRACE. These findings need to be validated in populations of different risk profiles.
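
Below is a hedged sketch of the two measures compared in this study: discrimination via the C-statistic (area under the ROC curve) and calibration via a Hosmer-Lemeshow chi-square over deciles of predicted risk. The data are simulated and the risk values merely stand in for a TIMI- or GRACE-like predicted probability; nothing here reproduces the study's numbers.

```python
import numpy as np
from scipy import stats
from sklearn.metrics import roc_auc_score

# Simulated predicted probabilities of in-hospital death and observed outcomes.
rng = np.random.default_rng(7)
n = 152
risk = rng.beta(1, 8, n)                 # score-derived predicted probability (assumed)
death = rng.binomial(1, risk)            # observed in-hospital death

# Discrimination: C-statistic (area under the ROC curve).
print(f"C-statistic = {roc_auc_score(death, risk):.2f}")

# Calibration: Hosmer-Lemeshow chi-square over deciles of predicted risk.
deciles = np.quantile(risk, np.linspace(0, 1, 11))
groups = np.clip(np.digitize(risk, deciles[1:-1]), 0, 9)
chi2 = 0.0
for g in range(10):
    mask = groups == g
    n_g = mask.sum()
    observed, expected = death[mask].sum(), risk[mask].sum()
    chi2 += (observed - expected) ** 2 / (expected * (1 - expected / n_g))
print(f"Hosmer-Lemeshow chi2 = {chi2:.1f}, p = {stats.chi2.sf(chi2, df=8):.2f}")
```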

Relevance: 100.00%

Abstract:

Background: The QRS-T angle correlates with prognosis in patients with heart failure and coronary artery disease, reflected by an increase in mortality proportional to the increase in the difference between the axes of the QRS complex and the T wave in the frontal plane. The value of this correlation in patients with Chagas heart disease is currently unknown. Objective: To determine the correlation between the QRS-T angle and the risk of induction of ventricular tachycardia / ventricular fibrillation (VT / VF) during electrophysiological study (EPS) in patients with Chagas disease. Methods: Case-control study at a tertiary center. Patients without induction of VT / VF on EPS were used as controls. The QRS-T angle was categorized as normal (0-105º), borderline (105-135º) or abnormal (135-180º). Differences between groups were analyzed with the t test or Mann-Whitney test for continuous variables, and with Fisher's exact test for categorical variables. P values < 0.05 were considered significant. Results: Of 116 patients undergoing EPS, 37.9% were excluded because of incomplete information / inactive records or because the QRS-T angle could not be calculated correctly (presence of left bundle branch block or atrial fibrillation). Of the 72 patients included in the study, 31 had VT / VF induced on EPS. Among these, the QRS-T angle was normal in 41.9%, borderline in 12.9% and abnormal in 45.2%. Among patients without induction of VT / VF on EPS, the QRS-T angle was normal in 63.4%, borderline in 14.6% and abnormal in 17.1% (p = 0.04). When compared with patients with a normal QRS-T angle, those with an abnormal angle had a fourfold higher risk of inducing ventricular tachycardia / ventricular fibrillation on EPS [odds ratio (OR) 4; confidence interval (CI) 1.298-12.325; p = 0.028]. After adjustment for other variables such as age, ejection fraction (EF) and QRS size, there was a trend for the abnormal QRS-T angle to identify patients at increased risk of inducing VT / VF during EPS (OR 3.95; CI 0.99-15.82; p = 0.052). The EF also emerged as a predictor of induction of VT / VF: for each one-point increase in EF, there was a 4% reduction in the rate of sustained ventricular arrhythmia on EPS. Conclusions: Changes in the QRS-T angle and decreases in EF were associated with an increased risk of induction of VT / VF on EPS.
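
The abnormal-versus-normal comparison can be reconstructed approximately from the percentages given above (14 vs. 7 abnormal angles and 13 vs. 26 normal angles among induced and non-induced patients). The sketch below recomputes the fourfold odds ratio from that 2×2 table; the Fisher's exact p-value shown here illustrates the method but will not match the reported p = 0.028 exactly, which came from the study's own analysis.

```python
from scipy.stats import fisher_exact

# 2x2 table back-calculated from the abstract's percentages (borderline angles are
# left out of this comparison); counts are approximate and for illustration only.
#                       VT/VF induced   not induced
table = [[14,             7],   # abnormal QRS-T angle (135-180 degrees)
         [13,            26]]   # normal QRS-T angle (0-105 degrees)

odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.1f}, Fisher exact p = {p_value:.3f}")
```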

Relevance: 100.00%

Abstract:

ACuteTox is a project within the 6th European Framework Programme, one of whose goals was to develop, optimise and pre-validate a non-animal testing strategy for predicting human acute oral toxicity. In its last 6 months, a challenging exercise was conducted to assess the predictive capacity of the developed testing strategies and to make a final identification of the most promising ones. Thirty-two chemicals were tested blind in the battery of in vitro and in silico methods selected during the first phase of the project. This paper describes the classification approaches studied: single-step procedures and two-step tiered testing strategies. In summary, four in vitro testing strategies were proposed as the best performing in terms of predictive capacity with respect to the European acute oral toxicity classification. In addition, a heuristic testing strategy is suggested that combines the prediction results gained from the neutral red uptake assay performed in 3T3 cells with information on neurotoxicity alerts identified by the primary rat brain aggregates test method. Octanol-water partition coefficients and in silico predictions of intestinal absorption and blood-brain barrier passage are also considered. This approach makes it possible to reduce the number of chemicals wrongly predicted as not classified (LD50 > 2000 mg/kg b.w.).
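
To make the notion of a two-step tiered strategy concrete, the snippet below sketches a purely hypothetical rule: a coarse class derived from a cytotoxicity estimate (a 3T3 neutral red uptake IC50) that a neurotoxicity alert can escalate. The cut-offs and logic are invented for illustration and do not reproduce the ACuteTox strategies or the EU classification boundaries.

```python
def classify(ic50_mg_per_l: float, neurotox_alert: bool) -> str:
    """Hypothetical two-step tiered classification rule (illustrative thresholds only)."""
    # Step 1: coarse class from cytotoxicity alone.
    if ic50_mg_per_l > 1000:
        candidate = "not classified (LD50 > 2000 mg/kg b.w.)"
    elif ic50_mg_per_l > 100:
        candidate = "category 4"
    else:
        candidate = "category 3 or more severe"
    # Step 2: a neurotoxicity alert escalates borderline 'not classified' predictions,
    # reducing the chance of wrongly predicting a toxic chemical as not classified.
    if neurotox_alert and candidate.startswith("not classified"):
        candidate = "category 4"
    return candidate

for chem, ic50, alert in [("A", 2500.0, False), ("B", 1500.0, True), ("C", 40.0, False)]:
    print(chem, "->", classify(ic50, alert))
```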

Relevance: 100.00%

Abstract:

Sampling issues represent a topic of ongoing interest to the forensic science community, essentially because of their crucial role in laboratory planning and working protocols. For this purpose, the forensic literature has described thorough (Bayesian) probabilistic sampling approaches, which are now widely implemented in practice. They allow one, for instance, to obtain probability statements that parameters of interest (e.g., the proportion of a seizure of items that present particular features, such as an illegal substance) satisfy particular criteria (e.g., a threshold or an otherwise limiting value). Currently, there are many approaches that allow one to derive probability statements relating to a population proportion, but questions on how a forensic decision maker - typically a client of a forensic examination or a scientist acting on behalf of a client - ought actually to decide about a proportion or a sample size have remained largely unexplored to date. The research presented here addresses methodology from decision theory that may help to cope usefully with the wide range of sampling issues typically encountered in forensic science applications. The procedures explored in this paper enable scientists to address a variety of concepts, such as the (net) value of sample information, the (expected) value of sample information and the (expected) decision loss. All of these aspects relate directly to questions that are regularly encountered in casework. Besides probability theory and Bayesian inference, the proposed approach requires some additional elements from decision theory that may increase the effort needed for practical implementation. In view of this challenge, the present paper emphasises the merits of graphical modelling concepts, such as decision trees and Bayesian decision networks, which can support forensic scientists in applying the methodology in practice. How this may be achieved is illustrated with several examples. The graphical devices invoked here also serve the purpose of supporting the discussion of the similarities, differences and complementary aspects of existing Bayesian probabilistic sampling criteria and the decision-theoretic approach proposed throughout this paper.
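
As a minimal worked example of the quantities named above, the sketch below computes the expected value of sample information (EVSI) for deciding whether the proportion of positive items in a seizure exceeds a threshold, under a beta-binomial model. The Beta(1, 1) prior, the 0-1 loss and the 0.5 threshold are assumptions chosen only to illustrate the decision-theoretic machinery, not the paper's own numerical setting.

```python
import numpy as np
from scipy import stats

prior_a, prior_b = 1.0, 1.0    # Beta(1, 1) prior on the seizure proportion theta (assumed)
threshold = 0.5                # decide "theta > 0.5" vs. "theta <= 0.5" (assumed)
loss_wrong = 1.0               # loss for a wrong decision, 0 otherwise (assumed)

def expected_loss(a, b):
    """Bayes risk of the optimal decision under a Beta(a, b) posterior."""
    p_above = 1.0 - stats.beta.cdf(threshold, a, b)
    return loss_wrong * min(p_above, 1.0 - p_above)

def evsi(n):
    """Prior expected reduction in loss from inspecting n items."""
    k = np.arange(n + 1)
    # Prior predictive (beta-binomial) probability of observing k positives among n.
    predictive = stats.betabinom.pmf(k, n, prior_a, prior_b)
    posterior_loss = np.array([expected_loss(prior_a + ki, prior_b + n - ki) for ki in k])
    return expected_loss(prior_a, prior_b) - float(np.sum(predictive * posterior_loss))

for n in (0, 5, 10, 30):
    print(f"n = {n:2d}: EVSI = {evsi(n):.3f}")
```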