945 results for Percolation threshold
Abstract:
This paper investigates whether stock market wealth affects real consumption asymmetrically through a threshold adjustment model. The empirical findings for the US show that wealth produces an asymmetric effect on real consumption, with negative 'news' affecting consumption less than positive 'news.' Thus, policy makers may want to focus more attention on preventing asset 'bubbles' than on responding to negative asset shocks.
Abstract:
Using quantile regressions and cross-sectional data from 152 countries, we examine the relationship between inflation and its variability. We consider two measures of inflation - the mean and median - and three different measures of inflation variability - the standard deviation, coefficient of variation, and median deviation. Using the mean and standard deviation or the median and the median deviation, the results support both the hypothesis that higher inflation creates more inflation variability and that inflation variability raises inflation across quantiles. Moreover, higher quantiles in both cases lead to larger marginal effects of inflation (inflation variability) on inflation variability (inflation). Using the mean and the coefficient of variation, however, the findings largely support no correlation between inflation and its variability. Finally, we also consider whether thresholds for inflation rate or inflation variability exist before finding such positive correlations. We find evidence of thresholds for inflation rates below 3 percent, but mixed results for thresholds for inflation variability.
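As a concrete illustration of the dispersion measures compared in the abstract above, the following sketch computes the standard deviation, coefficient of variation, and median (absolute) deviation for a toy inflation series. This is a minimal stand-in using only the Python standard library; the function name and data are illustrative, not taken from the paper.

```python
import statistics

def variability_measures(inflation):
    """Compute the three inflation-variability measures discussed in
    the abstract: standard deviation (sd), coefficient of variation
    (cv = sd / mean), and median absolute deviation (mad)."""
    mean = statistics.mean(inflation)
    median = statistics.median(inflation)
    sd = statistics.stdev(inflation)           # sample standard deviation
    cv = sd / mean if mean != 0 else float("nan")
    mad = statistics.median(abs(x - median) for x in inflation)
    return {"sd": sd, "cv": cv, "mad": mad}

# Toy annual inflation rates (percent) for one hypothetical country:
m = variability_measures([2, 4, 6])
```

Pairing the mean with sd (or the median with mad) gives the combinations for which the abstract reports a positive inflation-variability link, while the cv pairing is the one for which it finds no correlation.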
Abstract:
Many phase II clinical studies in oncology use two-stage frequentist designs such as Simon's optimal design. However, these designs share a logistical problem regarding patient accrual at the interim analysis. Strictly speaking, patient accrual at the end of the first stage may have to be suspended until all patients have events, success or failure. For example, when the study endpoint is six-month progression-free survival, patient accrual has to be stopped until all outcomes from stage I are observed. However, study investigators may be concerned that suspending accrual after the first stage will cost accrual momentum during this hiatus. We propose a two-stage phase II design that resolves the patient accrual problem caused by the interim analysis, and it can be used as an alternative to frequentist two-stage phase II studies in oncology.
Abstract:
Objective. In 2009, the International Expert Committee recommended the use of the HbA1c test for the diagnosis of diabetes. Although it has been recommended for this purpose, its precise test performance among Mexican Americans is uncertain. A strong "gold standard" would rely on repeated blood glucose measurements on different days, which is the recommended method for diagnosing diabetes in clinical practice. Our objective was to assess the test performance of HbA1c in detecting diabetes and pre-diabetes against repeated fasting blood glucose measurements in the Mexican American population living on the United States-Mexico border. Moreover, we wanted to find a specific and precise threshold value of HbA1c for Diabetes Mellitus (DM) and pre-diabetes in this high-risk population, which might assist in better diagnosis and better management of diabetes.

Research design and methods. We used the CCHC dataset for our study. In 2004, the Cameron County Hispanic Cohort (CCHC), now numbering 2,574, was established, drawn from households randomly selected on the basis of 2000 Census tract data. The CCHC study randomly selected a subset of people (aged 18-64 years) in CCHC cohort households to determine the influence of SES on diabetes and obesity. Among the participants in Cohort-2000, 67.15% are female; all are Hispanic.

Individuals were defined as having diabetes mellitus (fasting plasma glucose [FPG] ≥ 126 mg/dL) or pre-diabetes (100 ≤ FPG < 126 mg/dL). HbA1c test performance was evaluated using receiver operating characteristic (ROC) curves. Moreover, change-point models were used to determine HbA1c thresholds compatible with the FPG thresholds for diabetes and pre-diabetes.

Results. When FPG is used to detect diabetes, the sensitivity and specificity of HbA1c ≥ 6.5% were 75% and 87%, respectively (area under the curve 0.895). Additionally, when FPG is used to detect pre-diabetes, the sensitivity and specificity of HbA1c ≥ 6.0% (ADA recommended threshold) were 18% and 90%, respectively. The sensitivity and specificity of HbA1c ≥ 5.7% (International Expert Committee recommended threshold) for detecting pre-diabetes were 31% and 78%, respectively. ROC analyses suggest HbA1c is a sound predictor of diabetes mellitus (area under the curve 0.895) but a poorer predictor of pre-diabetes (area under the curve 0.632).

Conclusions. Our data support the current recommendations for the use of HbA1c in the diagnosis of diabetes in the Mexican American population, as it has shown reasonable sensitivity, specificity, and accuracy against repeated FPG measures. However, use of HbA1c may be premature for detecting pre-diabetes in this specific population because of its poor sensitivity relative to FPG. It may be that HbA1c more effectively identifies those at risk of developing diabetes. Following these pre-diabetic individuals longer-term for the detection of incident diabetes may lead to a more confirmatory result.
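The sensitivity and specificity figures reported above can be computed mechanically for any cut-off from a labeled sample. A minimal self-contained sketch follows; the function name and toy data are hypothetical, not the CCHC data.

```python
def sens_spec(values, labels, threshold):
    """Sensitivity and specificity of the rule `value >= threshold`
    against binary reference labels (1 = diabetic by repeated FPG,
    0 = non-diabetic)."""
    tp = sum(1 for v, y in zip(values, labels) if y == 1 and v >= threshold)
    fn = sum(1 for v, y in zip(values, labels) if y == 1 and v < threshold)
    tn = sum(1 for v, y in zip(values, labels) if y == 0 and v < threshold)
    fp = sum(1 for v, y in zip(values, labels) if y == 0 and v >= threshold)
    return tp / (tp + fn), tn / (tn + fp)

# Toy HbA1c values (%) and reference diagnoses, cut-off 6.5%:
sens, spec = sens_spec([7.0, 6.0, 6.8, 5.5, 6.4], [1, 1, 1, 0, 0], 6.5)
```

Sweeping the threshold over all observed values and plotting sensitivity against (1 − specificity) yields the ROC curve whose area the abstract reports.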
Abstract:
The soybean aphid has been a major pest for producers in Northwest Iowa since its first major outbreak in 2003. Control measures are warranted almost every growing season, and much research is being done on its management. Insecticide applications have been the sole management technique for soybean aphid and will continue to be important in the future. An economic threshold of 250 aphids/plant is the level currently recommended by Iowa State University. This study was conducted to determine whether the current recommendations are useful in managing soybean aphid while maintaining profitability for producers.
Abstract:
In this paper, by employing the threshold regression method, we estimate the average tariff equivalent of fixed costs for the use of a free trade agreement (FTA) among all existing FTAs in the world. It is estimated to be 3.2%. This global estimate serves as a reference rate in the evaluation of each FTA’s fixed costs.
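The threshold regression method mentioned above, in its simplest form, searches over candidate split points and keeps the one that best separates two regimes. The toy sketch below grid-searches a single threshold minimizing squared error with regime-specific means; it is a simplified illustrative stand-in, not the authors' estimator, and all names are hypothetical.

```python
import statistics

def fit_threshold(x, y):
    """Grid-search a single threshold gamma over observed x values,
    fitting a separate mean to y in each regime (x < gamma vs.
    x >= gamma) and keeping the split with the lowest total
    sum of squared errors."""
    best_sse, best_gamma = float("inf"), None
    for gamma in sorted(set(x))[1:-1]:  # interior candidates only
        lo = [yi for xi, yi in zip(x, y) if xi < gamma]
        hi = [yi for xi, yi in zip(x, y) if xi >= gamma]
        sse = (sum((v - statistics.mean(lo)) ** 2 for v in lo)
               + sum((v - statistics.mean(hi)) ** 2 for v in hi))
        if sse < best_sse:
            best_sse, best_gamma = sse, gamma
    return best_gamma

# Two clean regimes: the fitted threshold should fall at the break.
gamma = fit_threshold([1, 2, 3, 10, 11, 12], [0, 0, 0, 5, 5, 5])
```

In applied work the regime-specific means would be replaced by full regressions, but the grid-search-and-minimize logic is the same.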
Abstract:
Related to a research line of the GDS group at ISOM; see http://www.isom.upm.es/dsemiconductores.php
Abstract:
In recent decades, there has been increasing interest in systems comprised of several autonomous mobile robots, and as a result there has been substantial development in the field of Artificial Intelligence, especially in Robotics. Several studies in the literature focus on the creation of intelligent machines and devices capable of imitating the functions and movements of living beings. Multi-Robot Systems (MRS) can often deal with tasks that are difficult, if not impossible, for a single robot to accomplish. In the context of MRS, one of the main challenges is the need to control, coordinate, and synchronize the operation of multiple robots to perform a specific task. This requires the development of new strategies and methods which allow us to obtain the desired system behavior in a formal and concise way. This PhD thesis studies the coordination of multi-robot systems and, in particular, addresses the problem of the distribution of heterogeneous multi-tasks. The main interest in these systems is to understand how, from simple rules inspired by the division of labor in social insects, a group of robots can perform tasks in an organized and coordinated way. We are mainly interested in truly distributed or decentralized solutions in which the robots themselves, autonomously and individually, select a particular task so that all tasks are optimally distributed. In general, to perform the multi-task distribution among a team of robots, the robots have to synchronize their actions and exchange information. Under this approach we can speak of multi-task selection instead of multi-task assignment, meaning that the agents or robots select the tasks instead of being assigned a task by a central controller. The key element in these algorithms is the estimation of the stimuli and the adaptive update of the thresholds: each robot performs this estimate locally, depending on the load, i.e., the number of pending tasks to be performed. In addition, the results of each approach are evaluated by introducing noise into the number of pending loads, in order to simulate the robot's error in estimating the real number of pending tasks. The main contribution of this thesis is the approach based on self-organization and division of labor in social insects. An experimental scenario for the coordination problem among multiple robots, the robustness of the approaches, and the generation of dynamic tasks are presented and discussed. The particular issues studied are:

Threshold models: experiments conducted to test the response threshold model, with the objective of analyzing the system performance index for the problem of the distribution of heterogeneous multi-tasks in multi-robot systems; additive noise in the number of pending loads was also introduced, and dynamic tasks were generated over time.

Learning automata methods: experiments to test the learning automata-based probabilistic algorithms. The approach was evaluated on the system performance index with additive noise and with dynamic task generation, for the same problem of the distribution of heterogeneous multi-tasks in multi-robot systems.

Ant colony optimization: experiments to test the ant colony optimization-based deterministic algorithms for the distribution of heterogeneous multi-tasks in multi-robot systems. In these experiments, the system performance index is evaluated by introducing additive noise and dynamic task generation over time.
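The response-threshold rule referred to above is commonly written as P(engage) = s^n / (s^n + θ^n), where s is the task stimulus and θ the robot's threshold, with θ adapted as the robot performs or skips tasks. A minimal sketch under that standard formulation follows; the function names and update constants are illustrative, not the thesis's specific parameterization.

```python
def engage_probability(stimulus, theta, n=2):
    """Standard response-threshold rule: the probability that a robot
    engages a task grows with the stimulus s and falls with its own
    threshold theta: P = s^n / (s^n + theta^n)."""
    return stimulus ** n / (stimulus ** n + theta ** n)

def update_threshold(theta, performed, xi=0.1, phi=0.05, lo=0.01, hi=10.0):
    """Adaptive threshold update: performing a task lowers theta
    (specialization), idling raises it; theta is clamped to [lo, hi]."""
    theta = theta - xi if performed else theta + phi
    return min(max(theta, lo), hi)

# A robot facing a stimulus equal to its threshold engages half the time;
# a stronger stimulus makes engagement more likely.
p_equal = engage_probability(1.0, 1.0)    # 0.5
p_high = engage_probability(2.0, 1.0)     # 0.8
```

Each robot evaluates these rules locally from its own (possibly noisy) estimate of the pending load, which is what makes the task distribution fully decentralized.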
Abstract:
Ionoluminescence (IL) has been used in this work as a sensitive tool to probe the microscopic electronic processes and structural changes produced in quartz by irradiation with swift heavy ions. The IL yields have been measured as a function of irradiation fluence and electronic stopping power. The results are consistent with the assignment of the 2.7 eV (460 nm) band to the recombination of self-trapped excitons at the damaged regions in the irradiated material. Moreover, it was possible to determine the threshold for amorphization by a single ion impact as 1.7 keV/nm, which agrees well with the results of previous studies.
Abstract:
This paper focuses on the general problem of coordinating multiple robots and, more specifically, addresses the self-selection of heterogeneous specialized tasks by autonomous robots. We adopt a strictly distributed or decentralized approach, as we are particularly interested in solutions where the robots themselves, autonomously and individually, are responsible for selecting a particular task so that all existing tasks are optimally distributed and executed. In this regard, we have established an experimental scenario to solve the corresponding multi-task distribution problem, and we propose solutions using two different approaches: Response Threshold Models and Learning Automata-based probabilistic algorithms. We evaluated the robustness of the algorithms by perturbing the number of pending loads, to simulate the robot's error in estimating the real number of pending tasks, and by generating loads dynamically through time. The paper ends with a critical discussion of the experimental results.
Abstract:
Nervous system and exercise
Abstract:
A great challenge for future information technologies is building reliable systems on top of unreliable components. Parameters of modern and future technology devices are affected by severe levels of process variability, and devices will degrade and even fail during the normal lifetime of the chip due to aging mechanisms. These extreme levels of variability are caused by the high degree of device miniaturization and the random placement of individual atoms. Variability is considered a "red brick" by the International Technology Roadmap for Semiconductors. The session is devoted to this topic, presenting research experiences from the Spanish Network on Variability, called VARIABLES. In this session a talk entitled "Modeling sub-threshold slope and DIBL mismatch of sub-22nm FinFet" was presented.