899 results for the SIMPLE algorithm
Abstract:
This paper considers a framework where data from correlated sources are transmitted with the help of network coding in ad hoc network topologies. The correlated data are encoded independently at sensors, and network coding is employed at the intermediate nodes in order to improve the data delivery performance. In such settings, we focus on the problem of reconstructing the sources at the decoder when perfect decoding is not possible due to losses or bandwidth variations. We show that the similarity of the source data can be exploited at the decoder to enable decoding based on a novel and simple approximate decoding scheme. We analyze the influence of the network coding parameters, and in particular the size of finite coding fields, on the decoding performance. We further determine the optimal field size that maximizes the expected decoding performance as a trade-off between the information loss incurred by limiting the resolution of the source data and the error probability in the reconstructed data. Moreover, we show that the performance of approximate decoding improves as the accuracy of the source model increases, even with simple approximate decoding techniques. We provide illustrative examples showing how the proposed algorithm can be deployed in sensor networks and distributed imaging applications.
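As a hedged illustration of the field-size trade-off described above, the toy Python sketch below picks the Galois field GF(2^k) that minimizes the sum of a quantization-loss term and a decoding-error term; both cost functions are assumed placeholders for illustration only, not the paper's actual model.

```python
# Toy illustration only (assumed cost functions, not the paper's model):
# choose a finite field GF(2^k) balancing the information loss from coarse
# source quantization against the error probability of approximate decoding.
def quantization_distortion(k):
    q = 2 ** k                      # sources quantized to q levels on [0, 1)
    return (1.0 / q) ** 2 / 12.0    # uniform-quantizer distortion ~ step^2 / 12

def decoding_error_penalty(k, loss_rate=0.2, alpha=0.05):
    # assumed penalty: wrong symbols hurt more as the field grows
    return alpha * loss_rate * (1.0 - 1.0 / 2 ** k)

def expected_cost(k):
    return quantization_distortion(k) + decoding_error_penalty(k)

candidates = range(1, 13)           # GF(2) .. GF(4096)
best = min(candidates, key=expected_cost)
for k in candidates:
    print(f"GF(2^{k:2d}): expected cost = {expected_cost(k):.5f}")
print(f"best field size under these toy assumptions: GF(2^{best})")
```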
Abstract:
The two major subtypes of diffuse large B-cell lymphoma (DLBCL), germinal centre B-cell-like (GCB-DLBCL) and activated B-cell-like (ABC-DLBCL), are defined by means of gene expression profiling (GEP). Patients with GCB-DLBCL survive longer with the current standard regimen R-CHOP than patients with ABC-DLBCL. As GEP is not part of the current routine diagnostic work-up, efforts have been made to find a substitute that involves immunohistochemistry (IHC). Various algorithms achieved this with 80-90% accuracy. However, conflicting results on the appropriateness of IHC have been reported. Because it is likely that the molecular subtypes will play a role in future clinical practice, we assessed the determination of the molecular DLBCL subtypes by means of IHC at our University Hospital, and some aspects of this determination elsewhere in Switzerland. The most frequently used Hans algorithm includes three antibodies (against CD10, bcl-6 and MUM1). From records of the routine diagnostic work-up, we identified 51 of 172 (29.7%) newly diagnosed and treated DLBCL cases from 2005 until 2010 with an assigned DLBCL subtype. DLBCL subtype information was expanded by means of tissue microarray analysis. The outcome for patients with the GCB subtype was significantly better than for those with the non-GC subtype, independent of the age-adjusted International Prognostic Index. We found a lack of standardisation in subtype determination by means of IHC in Switzerland and significant problems of reproducibility. We conclude that the Hans algorithm performs well in our hands and that awareness of this important matter is increasing. However, outside clinical trials, vigorous efforts to standardise IHC determination are needed as DLBCL subtype-specific therapies emerge.
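For readers unfamiliar with the Hans algorithm mentioned above, here is a minimal Python sketch of the decision rule as it is commonly summarised (CD10, then BCL6, then MUM1, with a 30% positivity cutoff); the cutoff and ordering follow the usual published description and should be checked against the original Hans et al. criteria, so treat this as illustrative rather than diagnostic.

```python
# Sketch of the Hans immunohistochemistry rule as commonly summarised
# (CD10 -> BCL6 -> MUM1, 30% positivity cutoff); verify against the original
# Hans et al. criteria before any real use.
def hans_subtype(cd10_pct: float, bcl6_pct: float, mum1_pct: float,
                 cutoff: float = 30.0) -> str:
    """Return 'GCB' or 'non-GC' from percentages of positive tumour cells."""
    if cd10_pct >= cutoff:
        return "GCB"
    if bcl6_pct < cutoff:
        return "non-GC"
    return "GCB" if mum1_pct < cutoff else "non-GC"

print(hans_subtype(cd10_pct=60, bcl6_pct=80, mum1_pct=10))   # -> GCB
print(hans_subtype(cd10_pct=5,  bcl6_pct=70, mum1_pct=55))   # -> non-GC
```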
Abstract:
Information overload is a significant problem for modern medicine. Searching MEDLINE for common topics often retrieves more relevant documents than users can review. Therefore, we must identify documents that are not only relevant, but also important. Our system ranks articles using citation counts and the PageRank algorithm, incorporating data from the Science Citation Index. However, citation data is usually incomplete. Therefore, we explore the relationship between the quantity of citation information available to the system and the quality of the result ranking. Specifically, we test the ability of citation count and PageRank to identify "important articles" as defined by experts from large result sets with decreasing citation information. We found that PageRank performs better than simple citation counts, but both algorithms are surprisingly robust to information loss. We conclude that even an incomplete citation database is likely to be effective for importance ranking.
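As a minimal sketch of the two ranking signals compared in this abstract, the Python snippet below ranks a small, made-up citation graph by raw citation count and by PageRank computed with plain power iteration (damping 0.85); the graph and parameters are illustrative, not the system's actual data.

```python
# Minimal sketch: rank a toy citation graph by citation count and by PageRank
# (power iteration, damping 0.85). The graph is illustrative only.
from collections import defaultdict

cites = {"A": ["C"], "B": ["C"], "C": ["D"], "D": [], "E": ["C", "D"]}

counts = defaultdict(int)              # raw citation counts
for src, targets in cites.items():
    for t in targets:
        counts[t] += 1

def pagerank(graph, damping=0.85, iters=100):
    nodes = list(graph)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        new = {v: (1.0 - damping) / n for v in nodes}
        for src, targets in graph.items():
            if targets:
                share = damping * rank[src] / len(targets)
                for t in targets:
                    new[t] += share
            else:                      # dangling node: spread its mass evenly
                for v in nodes:
                    new[v] += damping * rank[src] / n
        rank = new
    return rank

pr = pagerank(cites)
print("citation counts:", dict(counts))
print("PageRank:", {v: round(pr[v], 3) for v in sorted(pr, key=pr.get, reverse=True)})
```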
Abstract:
In this paper, we present the Cellular Dynamic Simulator (CDS) for simulating diffusion and chemical reactions within crowded molecular environments. CDS is based on a novel event-driven algorithm specifically designed for precise calculation of the timing of collisions, reactions and other events for each individual molecule in the environment. Generic mesh-based compartments allow the creation or importation of very simple or highly detailed cellular structures in a 3D environment. Multiple levels of compartments and static obstacles can be used to create a dense environment that mimics cellular boundaries and the intracellular space. The CDS algorithm takes into account volume exclusion and molecular crowding, which may impact signaling cascades in small subcellular compartments such as dendritic spines. With the CDS, we can simulate simple enzyme reactions, aggregation, channel transport, as well as highly complicated chemical reaction networks of both freely diffusing and membrane-bound multi-protein complexes. Components of the CDS are defined generically so that the simulator can be applied to a wide range of environments in terms of scale and level of detail. Through an initialization GUI, a simple simulation environment can be created and populated within minutes, yet the tool is powerful enough to design complex 3D cellular architectures. The initialization tool allows visual confirmation of the environment construction prior to execution by the simulator. This paper describes the CDS algorithm and its design and implementation, provides an overview of the available features, and highlights their utility in demonstrations.
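The abstract's event-driven scheme can be pictured with the generic Python skeleton below: pending events sit in a priority queue keyed by time, the loop always handles the earliest one, and handling an event may schedule new ones. The event type and rate are hypothetical; this is not the CDS code.

```python
# Generic event-driven simulation skeleton (hypothetical events, not CDS code):
# pop the earliest scheduled event, process it, and schedule follow-up events.
import heapq
import random

def run(sim_end=10.0, collision_rate=1.5, seed=0):
    rng = random.Random(seed)
    events = [(rng.expovariate(collision_rate), "collision")]
    while events:
        t, kind = heapq.heappop(events)
        if t > sim_end:
            break
        if kind == "collision":
            print(f"t = {t:6.3f}  handle collision")
            # schedule the next collision after an exponential waiting time
            heapq.heappush(events, (t + rng.expovariate(collision_rate), "collision"))

run()
```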
Abstract:
BACKGROUND Areal bone mineral density is predictive of fracture risk. Microstructural bone parameters evaluated at the appendicular skeleton by high-resolution peripheral quantitative computed tomography (HR-pQCT) display differences between healthy patients and fracture patients. With the simple geometry of the cortex at the distal tibial diaphysis, a cortical index of the tibia combining material and mechanical properties correlated highly with bone strength ex vivo. The trabecular bone score derived from the scan of the lumbar spine by dual-energy X-ray absorptiometry (DXA) correlated ex vivo with the microarchitectural parameters. It is unknown whether these microstructural correlations also hold in healthy premenopausal women. METHODS Randomly selected women between 20 and 40 years of age were examined by DXA and HR-pQCT at the standard regions of interest and at customized subregions to focus on cortical and trabecular parameters of strength separately. For cortical strength, the volumetric cortical index at the distal tibia was calculated directly from HR-pQCT and the areal cortical index was derived from the DXA scan using a Canny threshold-based tool. For trabecular strength, the trabecular bone score was calculated from the DXA scan of the lumbar spine and compared with the corresponding parameters derived from the HR-pQCT measurements at the radius and tibia. RESULTS Seventy-two healthy women were included (average age 33.8 years, average BMI 23.2 kg/m²). The areal cortical index correlated highly with the volumetric cortical index at the distal tibia (R = 0.798). The trabecular bone score correlated moderately with the microstructural parameters of the trabecular bone. CONCLUSION This study in randomly selected premenopausal women demonstrated that microstructural parameters of bone evaluated by HR-pQCT correlate with DXA-derived parameters of skeletal regions containing predominantly cortical or cancellous bone. Whether these indices are suitable for better prediction of fracture risk deserves further investigation.
Abstract:
Is the online trade in second-hand products changing individual consumer behaviour? What is the sustainability potential of this activity? How can daily energy-consuming routines at the workplace be changed? Do major changes in the course of people's lives represent opportunities to modify their consumer behaviour towards greater sustainability? These are only some of the research questions studied in the focal topic "From Knowledge to Action - New Paths towards Sustainable Consumption", which is funded by the German Federal Ministry of Education and Research (BMBF) as part of the "Social-ecological Research Programme" (SÖF). This book gives an insight into the research results of the ten project groups. Their diversity highlights that there is much more to "sustainable consumption" than the simple purchase of organic or fair trade products. In addition, overarching conceptual and normative issues were treated across the project groups of the focal topic. Developed collaboratively and moderated by the accompanying research project, the results of the synthesis process are also presented here, for example how the sustainability of individual consumer behaviour can be evaluated, or which theories of action are particularly useful for specific consumer behaviour phenomena.
Abstract:
The artificial pancreas is at the forefront of research towards automated insulin infusion for patients with type 1 diabetes. Due to the high inter- and intra-patient variability of the diabetic population, the need for personalized approaches has been raised. This study presents an adaptive, patient-specific control strategy for glucose regulation based on reinforcement learning, and more specifically on the Actor-Critic (AC) learning approach. The control algorithm provides daily updates of the basal rate and insulin-to-carbohydrate (IC) ratio in order to optimize glucose regulation. A method for the automatic and personalized initialization of the control algorithm is designed based on the estimation of the transfer entropy (TE) between insulin and glucose signals. The algorithm has been evaluated in silico in adults, adolescents and children for 10 days. Three initialization scenarios, i.e. i) zero values, ii) random values and iii) TE-based values, have been comparatively assessed. The results show that when the TE-based initialization is used, the algorithm achieves faster learning, with 98%, 90% and 73% in the A+B zones of the Control Variability Grid Analysis for adults, adolescents and children, respectively, after five days, compared with 95%, 78% and 41% for random initialization and 93%, 88% and 41% for zero initial values. Furthermore, in the case of children, the daily Low Blood Glucose Index decreases much faster when the TE-based tuning is applied. The results imply that automatic and personalized tuning based on TE reduces the learning period and improves the overall performance of the AC algorithm.
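To make the transfer-entropy initialization concrete, here is a small plug-in (histogram) estimator of TE between two discretized signals; the binning, the lag of one step, and the synthetic insulin/glucose traces are illustrative assumptions, not the study's method or data.

```python
# Minimal plug-in (histogram) estimator of transfer entropy TE_{X->Y}
# with lag 1, in bits; binning and the toy signals are illustrative only.
import numpy as np
from collections import Counter

def transfer_entropy(x, y, bins=4):
    """TE from x to y using empirical probabilities of discretized values."""
    x = np.digitize(x, np.histogram_bin_edges(x, bins)[1:-1])
    y = np.digitize(y, np.histogram_bin_edges(y, bins)[1:-1])
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))        # (y_{t+1}, y_t, x_t)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))              # (y_t, x_t)
    pairs_yy = Counter(zip(y[1:], y[:-1]))               # (y_{t+1}, y_t)
    singles = Counter(y[:-1])                            # y_t
    n = len(y) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_y1_given_y0x0 = c / pairs_yx[(y0, x0)]
        p_y1_given_y0 = pairs_yy[(y1, y0)] / singles[y0]
        te += p_joint * np.log2(p_y1_given_y0x0 / p_y1_given_y0)
    return te

rng = np.random.default_rng(0)
insulin = rng.normal(size=500)
glucose = np.roll(insulin, 1) + 0.5 * rng.normal(size=500)   # y driven by lagged x
print(f"TE(insulin -> glucose) ~ {transfer_entropy(insulin, glucose):.3f} bits")
print(f"TE(glucose -> insulin) ~ {transfer_entropy(glucose, insulin):.3f} bits")
```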
Abstract:
Demographic composition and dynamics of animal and human populations are important determinants of the transmission dynamics of infectious disease and of the effect of infectious disease or environmental disasters on productivity. In many circumstances, demographic data are not available or of poor quality. Since 1999, Switzerland has been recording cattle movements, births, deaths and slaughter in an animal movement database (AMD). The data in the AMD offer the opportunity to analyse and understand the dynamics of the Swiss cattle population. A dynamic population model can serve as a building block for future disease transmission models and help policy makers in developing strategies regarding animal health, animal welfare, livestock management and productivity. The Swiss cattle population was therefore modelled using a system of ordinary differential equations. The model was stratified by production type (dairy or beef), age and gender (male and female calves: 0-1 year, heifers and young bulls: 1-2 years, cows and bulls: older than 2 years). The simulation of the Swiss cattle population reflects the observed pattern accurately. Parameters were optimized on the basis of the goodness of fit (using the Powell algorithm). The fitted rates were compared with rates calculated from the AMD and differed only marginally. This gives confidence in the fitted values of parameters that are not directly derivable from the AMD (e.g. the proportion of calves that are moved from the dairy system to fattening plants).
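As a heavily simplified, hedged companion to the fitting procedure mentioned above, the Python sketch below fits the net growth rate of a single-compartment population ODE to a handful of made-up yearly counts using scipy's Powell optimizer; the real model is stratified by production type, age and gender, which this toy omits.

```python
# Heavily simplified sketch (toy data, single compartment; the real AMD model
# is stratified): fit the net growth rate of dN/dt = r * N with the Powell
# algorithm by minimizing the squared error to observed yearly counts.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize

t_obs = np.array([0, 1, 2, 3, 4, 5], dtype=float)        # years
n_obs = np.array([1.60, 1.58, 1.57, 1.55, 1.54, 1.53])   # million cattle (made up)

def simulate(r):
    sol = solve_ivp(lambda t, n: r * n, (t_obs[0], t_obs[-1]), [n_obs[0]],
                    t_eval=t_obs)
    return sol.y[0]

def sum_of_squares(params):
    return float(np.sum((simulate(params[0]) - n_obs) ** 2))

fit = minimize(sum_of_squares, x0=[0.0], method="Powell")
print(f"fitted net rate: {fit.x[0]:+.4f} per year")
```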
Abstract:
Background: In an artificial pancreas (AP), meals are either manually announced or detected and their size estimated from the blood glucose level. Both methods have limitations, which result in suboptimal postprandial glucose control. The GoCARB system is designed to provide the carbohydrate content of meals and is presented here within the AP framework. Method: The combined use of GoCARB with a control algorithm is assessed in a series of 12 computer simulations. The simulations are defined according to the type of control (open or closed loop), the use or non-use of GoCARB, and the patients' skills in carbohydrate estimation. Results: For poor estimators without GoCARB, the percentage of time spent in the target range (70-180 mg/dl) during the postprandial period is 22.5% and 66.2% for open and closed loop, respectively. When GoCARB is used, the corresponding percentages are 99.7% and 99.8%. In the case of open loop, the time spent in severe hypoglycemia (<50 mg/dl) is 33.6% without GoCARB and is reduced to 0.0% when GoCARB is used. In the case of closed loop, the corresponding percentage is 1.4% without GoCARB and is reduced to 0.0% with GoCARB. Conclusion: The use of GoCARB improves the control of the postprandial response and the glucose profiles, especially in the case of open loop. However, the most efficient regulation is achieved by the combined use of the control algorithm and GoCARB.
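The outcome metrics quoted above (time in the 70-180 mg/dl target range and time below 50 mg/dl) can be computed from a glucose trace with a few lines of Python, sketched below on a synthetic postprandial curve; the trace is purely illustrative.

```python
# Small helper for the outcome metrics quoted above: percentage of time in
# the 70-180 mg/dl target range and in severe hypoglycemia (<50 mg/dl).
# The glucose trace below is synthetic, for illustration only.
import numpy as np

def time_in_ranges(glucose_mgdl):
    g = np.asarray(glucose_mgdl, dtype=float)
    in_target = np.mean((g >= 70) & (g <= 180)) * 100
    severe_hypo = np.mean(g < 50) * 100
    return in_target, severe_hypo

minutes = np.arange(0, 300, 5)                              # 5-min sampling, 5 h
trace = 120 + 60 * np.exp(-((minutes - 60) / 45.0) ** 2)    # synthetic meal peak
target_pct, hypo_pct = time_in_ranges(trace)
print(f"time in 70-180 mg/dl: {target_pct:.1f}%  |  time < 50 mg/dl: {hypo_pct:.1f}%")
```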
Abstract:
In the fermion loop formulation the contributions to the partition function naturally separate into topological equivalence classes with a definite sign. This separation forms the basis for an efficient fermion simulation algorithm using a fluctuating open fermion string. It guarantees sufficient tunnelling between the topological sectors, and hence provides a solution to the fermion sign problem affecting systems with broken supersymmetry. Moreover, the algorithm shows no critical slowing down even in the massless limit and can hence handle the massless Goldstino mode emerging in the supersymmetry broken phase. In this paper – the third in a series of three – we present the details of the simulation algorithm and demonstrate its efficiency by means of a few examples.
Abstract:
Context. The Rosetta encounter with comet 67P/Churyumov-Gerasimenko provides a unique opportunity for an in situ, up-close investigation of ion-neutral chemistry in the coma of a weakly outgassing comet far from the Sun. Aims. Observations of primary and secondary ions and modeling are used to investigate the role of ion-neutral chemistry within the thin coma. Methods. Observations from late October through mid-December 2014 show the continuous presence of the solar wind 30 km from the comet nucleus. These and other observations indicate that there is no contact surface and the solar wind has direct access to the nucleus. On several occasions during this time period, the Rosetta/ROSINA Double Focusing Mass Spectrometer measured the low-energy ion composition in the coma. Organic volatiles and water-group ions and their breakup products (masses 14 through 19), CO+ and CO2+ (masses 28 and 44) and other mass peaks (at masses 26, 27, and possibly 30) were observed. Secondary ions include H3O+ and HCO+ (masses 19 and 29). These secondary ions indicate ion-neutral chemistry in the thin coma of the comet. A relatively simple model is constructed to account for the low H3O+/H2O+ and HCO+/CO+ ratios observed in a water-dominated coma. Results from this simple model are compared with results from models that include a more detailed chemical reaction network. Results. At low outgassing rates, predictions from the simple model agree with observations and with results from more complex models that include much more chemistry. At higher outgassing rates, the ion-neutral chemistry is still limited and high HCO+/CO+ ratios are predicted and observed. However, at higher outgassing rates, the model predicts high H3O+/H2O+ ratios while the observed ratios are often low. These low ratios may be the result of the highly heterogeneous nature of the coma, where CO and CO2 number densities can exceed that of water.
Abstract:
Bargaining is the building block of many economic interactions, ranging from bilateral to multilateral encounters and from situations in which the actors are individuals to negotiations between firms or countries. In all these settings, economists have long been intrigued by the fact that some projects, trades or agreements are not realized even though they are mutually beneficial. On the one hand, this has been explained by incomplete information. A firm may not be willing to offer a wage that is acceptable to a qualified worker, because it knows that there are also unqualified workers and cannot distinguish between the two types. This phenomenon is known as adverse selection. On the other hand, it has been argued that even with complete information, the presence of externalities may impede efficient outcomes. To see this, consider the example of climate change. If a subset of countries agrees to curb emissions, non-participant regions benefit from the signatories' efforts without incurring costs. These free-riding opportunities give rise to incentives to strategically improve one's bargaining power that work against the formation of a global agreement. This thesis is concerned with extending our understanding of both factors, adverse selection and externalities. The findings are based on empirical evidence from original laboratory experiments as well as game-theoretic modeling. On a very general note, it is demonstrated that the institutions through which agents interact matter to a large extent. Insights are provided about which institutions we should expect to perform better than others, at least in terms of aggregate welfare. Chapters 1 and 2 focus on the problem of adverse selection. Effective operation of markets and other institutions often depends on good information transmission properties. In terms of the example introduced above, a firm is only willing to offer high wages if it receives enough positive signals about the worker's quality during the application and wage bargaining process. In Chapter 1, it will be shown that repeated interaction coupled with time costs facilitates information transmission. By making the wage bargaining process costly for the worker, the firm is able to obtain more accurate information about the worker's type. The cost could be pure time cost from delaying agreement or cost of effort arising from a multi-step interviewing process. In Chapter 2, I abstract from time cost and show that communication can play a similar role. The simple fact that a worker claims to be of high quality may be informative. In Chapter 3, the focus is on a different source of inefficiency. Agents strive for bargaining power and thus may be motivated by incentives that are at odds with the socially efficient outcome. I have already mentioned the example of climate change. Other examples are coalitions within committees that are formed to secure voting power to block outcomes, or groups that commit to different technological standards although a single standard would be optimal (e.g. the format war between HD DVD and Blu-ray). It will be shown that such inefficiencies are directly linked to the presence of externalities and a certain degree of irreversibility in actions. I now discuss the three articles in more detail. In Chapter 1, Olivier Bochet and I study a simple bilateral bargaining institution that eliminates trade failures arising from incomplete information. In this setting, a buyer makes offers to a seller in order to acquire a good.
Whenever an offer is rejected by the seller, the buyer may submit a further offer. Bargaining is costly, because both parties suffer a (small) time cost after any rejection. The difficulties arise because the good can be of low or high quality and the quality of the good is only known to the seller. Indeed, without the possibility to make repeated offers, it is too risky for the buyer to offer prices that allow for trade of high-quality goods. When allowing for repeated offers, however, at equilibrium both types of goods trade with probability one. We provide an experimental test of these predictions. Buyers gather information about sellers using specific price offers, and rates of trade are high, much in line with the model's qualitative predictions. We also observe a persistent over-delay before trade occurs, and this reduces efficiency substantially. Possible channels for over-delay are identified in the form of two behavioral assumptions missing from the standard model, loss aversion (buyers) and haggling (sellers), which reconcile the data with the theoretical predictions. Chapter 2 also studies adverse selection, but interaction between buyers and sellers now takes place within a market rather than isolated pairs. Remarkably, in a market it suffices to let agents communicate in a very simple manner to mitigate trade failures. The key insight is that better informed agents (sellers) are willing to truthfully reveal their private information, because by doing so they are able to reduce search frictions and attract more buyers. Behavior observed in the experimental sessions closely follows the theoretical predictions. As a consequence, costless and non-binding communication (cheap talk) significantly raises rates of trade and welfare. Previous experiments have documented that cheap talk alleviates inefficiencies due to asymmetric information. These findings are explained by pro-social preferences and lie aversion. I use appropriate control treatments to show that such considerations play only a minor role in our market. Instead, the experiment highlights the ability to organize markets as a new channel through which communication can facilitate trade in the presence of private information. In Chapter 3, I theoretically explore coalition formation via multilateral bargaining under complete information. The environment studied is extremely rich in the sense that the model allows for all kinds of externalities. This is achieved by using so-called partition functions, which pin down a coalitional worth for each possible coalition in each possible coalition structure. It is found that although binding agreements can be written, efficiency is not guaranteed, because the negotiation process is inherently non-cooperative. The prospects of cooperation are shown to crucially depend on i) the degree to which players can renegotiate and gradually build up agreements and ii) the absence of a certain type of externalities that can loosely be described as incentives to free ride. Moreover, the willingness to concede bargaining power is identified as a novel reason for gradualism. Another key contribution of the study is that it identifies a strong connection between the Core, one of the most important concepts in cooperative game theory, and the set of environments for which efficiency is attained even without renegotiation.
Abstract:
CONTEXT Hyperthyroidism is an established risk factor for atrial fibrillation (AF), but information concerning the association with variation within the normal range of thyroid function, and concerning subgroups at risk, is lacking. OBJECTIVE This study aimed to investigate the association between normal thyroid function and AF prospectively and to explore potential differential risk patterns. DESIGN, SETTING, AND PARTICIPANTS From the Rotterdam Study we included 9166 participants ≥ 45 y with TSH and/or free T4 (FT4) measurements and AF assessment (1997-2012; median follow-up, 6.8 y), with 399 prevalent and 403 incident AF cases. MAIN OUTCOME MEASURES Outcome measures were threefold: 1) hazard ratios (HRs) for the risk of incident AF from Cox proportional-hazards models, 2) 10-year absolute risks taking the competing risk of death into account, and 3) the discrimination ability gained by adding FT4 to the CHARGE-AF simple model, an established prediction model for AF. RESULTS Higher FT4 levels were associated with higher risks of AF (HR 1.63; 95% confidence interval, 1.19-2.22) when comparing those in the highest quartile with those in the lowest quartile. Absolute 10-year risks increased with higher FT4 from 1% to 9% in participants ≤ 65 y and from 6% to 12% in subjects ≥ 65 y. Discrimination of the prediction model improved when adding FT4 to the simple model (c-statistic, 0.722 vs 0.729; P = .039). TSH levels were not associated with AF. CONCLUSIONS There is an increased risk of AF with higher FT4 levels within the normal range, especially in younger subjects. Adding FT4 to the simple model slightly improved discrimination of risk prediction.
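As a hedged sketch of the first outcome measure, the snippet below fits a Cox proportional-hazards model with the lifelines package to compare the highest against the lowest FT4 quartile; the data frame, column names and effect sizes are synthetic placeholders, not the Rotterdam Study data.

```python
# Sketch: hazard ratio for the highest vs lowest FT4 quartile from a Cox
# proportional-hazards model (lifelines). Data and column names are synthetic
# placeholders for illustration only.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 2000
ft4 = rng.normal(15.5, 2.0, n)                           # pmol/l, synthetic
followup = rng.exponential(8.0, n).clip(0.1, 15.0)       # years
af_event = rng.random(n) < 0.05 * (1 + (ft4 - 15.5) / 10)

df = pd.DataFrame({"followup_years": followup, "incident_af": af_event.astype(int)})
quartile = pd.qcut(ft4, 4, labels=False)                 # 0 = lowest, 3 = highest
df["ft4_q4_vs_q1"] = (quartile == 3).astype(int)
df = df[(quartile == 0) | (quartile == 3)]               # compare extreme quartiles

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="incident_af")
print(cph.summary[["coef", "exp(coef)"]])                # exp(coef) = hazard ratio
```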
Abstract:
BACKGROUND & AIMS European and American guidelines have endorsed the Barcelona Clinic Liver Cancer (BCLC) staging system. The aim of this study was to assess the performance of the recently developed Hong Kong Liver Cancer (HKLC) classification as a staging system for hepatocellular carcinoma (HCC) in Europe. METHODS We used a pooled set of 1693 HCC patients combining three prospective European cohorts. Discrimination ability between the nine substages and five stages of the HKLC classification system was assessed. To evaluate the predictive power of the HKLC and BCLC staging systems on overall survival, Nagelkerke pseudo R2, Bayesian Information Criterion and Harrell's concordance index were calculated. The number of patients who would benefit from a curative therapy was assessed for both staging system. RESULTS The HKLC classification in nine substages shows suboptimal discrimination between the staging groups. The classification in five stages shows better discrimination between groups. However, the BCLC classification performs better than the HKLC classification in the ability to predict OS. The HKLC treatment algorithm tags significantly more patients to curative therapy than the BCLC. CONCLUSIONS The BCLC staging system performs better for European patients than the HKLC staging system in predicting OS. Twice more patients are eligible for a curative therapy with the HKLC algorithm, whether this translates in survival benefit remains to be investigated. This article is protected by copyright. All rights reserved.