958 results for Bland Altman analysis
Abstract:
There is a growing need for parametric design software that communicates building performance feedback during early architectural exploration to support decision-making. This paper examines how the design-and-analysis cycle can be closed to provide active, concurrent feedback between the architecture and services engineering domains. It presents the structure of an openly customisable design system that couples parametric modelling and energy analysis software, allowing designers to assess the performance of early design iterations quickly. Finally, it discusses how user interactions with the system foster information exchanges that facilitate the sharing of design intelligence across disciplines.
Abstract:
Obesity is widely regarded as a public health concern because of its adverse impact on individuals' health. Systematic reviews have examined the effect of obesity on depression, but with a major emphasis on general obesity as measured by body mass index. Despite the stronger effect of abdominal obesity on physical health outcomes, to the best of our knowledge no systematic review has addressed the relationship between abdominal obesity and depression. This paper reports the results of a systematic review and meta-analysis of cross-sectional studies examining the relationship between abdominal obesity and depression in the general population. Multiple electronic databases were searched up to the end of September 2009, and 15 articles were systematically reviewed and meta-analyzed. The analysis showed that the odds ratio of having depression for individuals with abdominal obesity was 1.38 (95% CI, 1.22–1.57) compared with those who were not obese. Furthermore, this relationship did not vary with potential confounders, including gender, age, the measurement of depression and abdominal obesity, and study quality.
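A pooled odds ratio with a 95% confidence interval, as reported above, implies a standard error on the log scale. A minimal sketch of that standard back-calculation (routine meta-analysis arithmetic, not code from the review itself):

```python
import math

# Pooled odds ratio and 95% CI as reported in the abstract.
or_pooled, ci_low, ci_high = 1.38, 1.22, 1.57

# On the log scale a 95% CI spans 2 * 1.96 standard errors,
# so SE(log OR) can be recovered from the interval bounds.
se_log_or = (math.log(ci_high) - math.log(ci_low)) / (2 * 1.96)

# z statistic; |z| > 1.96 means the CI excludes OR = 1.
z = math.log(or_pooled) / se_log_or
print(f"SE(log OR) = {se_log_or:.3f}, z = {z:.2f}")
```

This kind of back-calculation is useful when re-pooling published effect sizes whose standard errors are not reported directly.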
Abstract:
Two decades after its inception, Latent Semantic Analysis (LSA) has become part and parcel of every modern introduction to information retrieval. For any tool that matures so quickly, it is important to check its lore and limitations, or else stagnation will set in. We focus here on the three main aspects of LSA that are well accepted, the gist of which can be summarized as follows: (1) that LSA recovers latent semantic factors underlying the document space, (2) that this can be accomplished through lossy compression of the document space by eliminating lexical noise, and (3) that the latter is best achieved by singular value decomposition (SVD). For each aspect we performed experiments analogous to those reported in the LSA literature and compared the evidence brought to bear in each case. On the negative side, we show that the above claims about LSA are much more limited than commonly believed. Even a simple example shows that LSA does not recover the optimal semantic factors intended in the pedagogical example used in many LSA publications. Additionally, and remarkably deviating from LSA lore, LSA does not scale up well: the larger the document space, the less likely it is that LSA recovers an optimal set of semantic factors. On the positive side, we describe new algorithms that replace LSA (and more recent alternatives such as pLSA, LDA, and kernel methods) by trading its l2 space for an l1 space, thereby guaranteeing an optimal set of semantic factors. These algorithms seem to salvage the spirit of LSA as we think it was initially conceived.
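For reference, a minimal sketch of the classic SVD-based LSA pipeline that the abstract critiques. The toy term-document matrix and the rank k are illustrative assumptions, not data from the paper:

```python
import numpy as np

# Toy term-document matrix (rows = terms, columns = documents).
A = np.array([
    [1, 1, 0, 0],   # e.g. "ship"
    [0, 1, 1, 0],   # e.g. "boat"
    [0, 0, 1, 1],   # e.g. "ocean"
    [1, 0, 0, 1],   # e.g. "voyage"
], dtype=float)

# LSA: truncated SVD keeps the k largest singular values, giving the
# best rank-k approximation of A in the l2 (least-squares) sense --
# exactly the l2 choice the abstract argues against.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Documents are compared in the k-dimensional latent space via the
# singular-value-scaled rows of V.
doc_vecs = (np.diag(s[:k]) @ Vt[:k, :]).T

def cos(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

sim_01 = cos(doc_vecs[0], doc_vecs[1])
```

The paper's point is that this l2-optimal compression need not recover semantically optimal factors, especially as the document space grows.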
Abstract:
Defence organisations perform information security evaluations to confirm that electronic communications devices are safe to use in security-critical situations. Such evaluations include tracing all possible dataflow paths through the device, but this process is tedious and error-prone, so automated reachability analysis tools are needed to make security evaluations faster and more accurate. Previous research produced a tool, SIFA, for dataflow analysis of basic digital circuitry, but it cannot analyse dataflow through microprocessors embedded within a circuit, since this depends on the software they run. We have developed a static analysis tool that produces SIFA-compatible dataflow graphs from embedded microcontroller programs written in C. In this paper we present a case study showing how this new capability supports combined hardware and software dataflow analyses of a security-critical communications device.
Abstract:
Data flow analysis techniques can be used to help assess threats to data confidentiality and integrity in security-critical program code. However, a fundamental weakness of static analysis techniques is that they overestimate the ways in which data may propagate at run time. Discounting large numbers of these false-positive data flow paths wastes an information security evaluator's time and effort. Here we show how to automatically eliminate some false-positive data flow paths by precisely modelling how classified data is blocked by certain expressions in embedded C code. We present a library of detailed data flow models of individual expression elements and an algorithm for introducing these components into conventional data flow graphs. The resulting models can be used to accurately trace byte-level or even bit-level data flow through expressions that are normally treated as atomic. This allows us to identify expressions that safely downgrade their classified inputs, and thereby to eliminate false-positive data flow paths from the security evaluation process. To validate the approach, we have implemented and tested it in an existing data flow analysis toolkit.
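The idea of expressions that block or downgrade classified data can be illustrated with bit-level taint masks. This is a hypothetical simplified model for illustration only, not the paper's actual expression library or the SIFA toolkit's representation:

```python
# Each mask records which bits of a classified input x can still
# influence an expression's result (bit set = data may flow).

def taint_and_const(x_mask: int, c: int) -> int:
    # x & c: only the bits kept by the constant can carry data.
    return x_mask & c

def taint_xor_self(x_mask: int) -> int:
    # x ^ x is always 0, so no input bit reaches the output:
    # the expression safely downgrades its classified input.
    return 0

def taint_shift_right(x_mask: int, n: int) -> int:
    # x >> n: surviving bits move down by n positions.
    return x_mask >> n

full = 0xFF  # all 8 bits of x are classified

assert taint_and_const(full, 0x0F) == 0x0F  # only the low nibble leaks
assert taint_xor_self(full) == 0            # path eliminated entirely
assert taint_shift_right(full, 4) == 0x0F   # top bits shifted into low nibble
```

A naive analysis would treat each of these expressions as fully propagating its input; modelling them at the bit level is what removes the false-positive paths.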
Abstract:
Objective: To determine whether remote monitoring (structured telephone support or telemonitoring) without regular clinic or home visits improves outcomes for patients with chronic heart failure. Data sources: 15 electronic databases, hand searches of previous studies, and contact with authors and experts. Data extraction: Two investigators independently screened the results. Review methods: Published randomised controlled trials comparing remote monitoring programmes with usual care in patients with chronic heart failure managed within the community. Results: 14 randomised controlled trials (4264 patients) of remote monitoring met the inclusion criteria: four evaluated telemonitoring, nine evaluated structured telephone support, and one evaluated both. Remote monitoring programmes reduced rates of admission to hospital for chronic heart failure by 21% (95% confidence interval 11% to 31%) and all-cause mortality by 20% (8% to 31%). Of the six trials evaluating health-related quality of life, three reported significant benefits with remote monitoring; of the four studies examining healthcare costs with structured telephone support, three reported reduced costs and one no effect. Conclusion: Programmes for chronic heart failure that include remote monitoring have a positive effect on clinical outcomes in community-dwelling patients with chronic heart failure.
Abstract:
National Housing Relics and Scenic Sites (NHRSSs) in China are the equivalent of national parks in the West, but have contrasting features and broader roles when compared with their Western counterparts. By reviewing and analysing more than 370 academic sources, this paper identifies six major issue clusters and future challenges that will influence the management of NHRSSs over time. It also provides a number of cases to illustrate the particular features of NHRSSs. Identifying the hot issues and important challenges facing Chinese NHRSSs provides valuable insights into priorities now being discussed in highly populated areas of the world.
Abstract:
The multifractal properties of two indices of geomagnetic activity, Dst (representative of low latitudes) and ap (representative of global geomagnetic activity), together with the solar X-ray brightness, Xl, during the period from 1 March 1995 to 17 June 2003, are examined using multifractal detrended fluctuation analysis (MF-DFA). The h(q) curves of Dst and ap in the MF-DFA are similar to each other, but they differ from that of Xl, indicating that the scaling properties of Xl are different from those of Dst and ap. Hence, one should not predict the magnitude of magnetic storms directly from solar X-ray observations. However, a strong relationship exists between the classes of the solar X-ray irradiance (the classes being chosen to separate solar flares of class X-M, class C, and class B or less, including no flares) in hourly measurements and the geomagnetic disturbances (large to moderate, small, or quiet) seen in Dst and ap during the active period. Each time series was converted into a symbolic sequence using three classes. The frequencies of the substrings in the symbolic sequences, yielding the measure representations, then characterize the pattern of space weather events. Using the MF-DFA method and traditional multifractal analysis, we calculate the h(q), D(q), and τ(q) curves of the measure representations. The τ(q) curves indicate that the measure representations of these three indices are multifractal. On the basis of this three-class clustering, we find that the h(q), D(q), and τ(q) curves of the measure representations of these three indices are similar to each other for positive values of q. Hence, a positive flare storm class dependence is reflected in the scaling exponents h(q) in the MF-DFA and the multifractal exponents D(q) and τ(q). This finding indicates that the use of the solar flare classes could improve the prediction of the Dst classes.
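The MF-DFA procedure used above can be sketched in a few steps: build the profile (cumulative sum of the mean-subtracted series), detrend it in non-overlapping segments of scale s, form the q-th order fluctuation function Fq(s), and read h(q) off the log-log slope. The following is a simplified illustration on synthetic white noise; the scales, seed, and linear detrending order are assumptions, not the paper's settings:

```python
import numpy as np

def mfdfa_hq(x, scales, q, order=1):
    """Generalized Hurst exponent h(q) by MF-DFA (simplified sketch, q != 0)."""
    y = np.cumsum(x - np.mean(x))              # step 1: profile
    fq = []
    for s in scales:
        n_seg = len(y) // s
        var = []
        for v in range(n_seg):                 # step 2: non-overlapping segments
            seg = y[v * s:(v + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, order)   # step 3: local polynomial detrend
            var.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        var = np.array(var)
        fq.append(np.mean(var ** (q / 2)) ** (1 / q))  # step 4: F_q(s)
    # step 5: h(q) is the slope of log F_q(s) versus log s
    return np.polyfit(np.log(scales), np.log(fq), 1)[0]

rng = np.random.default_rng(0)
noise = rng.standard_normal(4096)
h2 = mfdfa_hq(noise, scales=[16, 32, 64, 128, 256], q=2)
# For uncorrelated noise, h(2) should come out close to 0.5.
```

Sweeping q over a range of positive and negative values, and checking whether h(q) varies with q, is what distinguishes multifractal from monofractal scaling.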
Abstract:
This paper provides a fundamental understanding of the use of cumulative plots for travel time estimation on signalized urban networks. Analytical modelling is performed to generate cumulative plots based on the availability of data: a) Case-D, with detector data only; b) Case-DS, with detector data and signal timings; and c) Case-DSS, with detector data, signal timings, and saturation flow rate. An empirical study and a sensitivity analysis based on simulation experiments show consistent performance for Case-DS and Case-DSS, whereas the performance of Case-D is inconsistent. Case-D is sensitive to the detection interval and to the signal timings within that interval: when the detection interval is an integral multiple of the signal cycle, both accuracy and reliability are low, whereas for a detection interval of around 1.5 times the signal cycle both are high.
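Travel time estimation from cumulative plots rests on reading the horizontal gap between the upstream and downstream cumulative count curves at a given vehicle number (assuming first-in-first-out). A minimal sketch with illustrative data, not the paper's Case-D/DS/DSS models:

```python
import numpy as np

# Cumulative counts N_up(t) and N_down(t) at the upstream and
# downstream ends of a link (illustrative data, 1-second steps).
t = np.arange(0, 60, dtype=float)
n_up = np.minimum(t * 0.5, 25.0)           # arrivals at 0.5 veh/s, then stop
n_down = np.clip((t - 10) * 0.5, 0, 25.0)  # same curve shifted by 10 s

def travel_time(n, t, n_up, n_down):
    """Travel time of the n-th vehicle: the horizontal gap between the
    cumulative curves at height n (assumes FIFO)."""
    t_in = np.interp(n, n_up, t)    # when vehicle n enters the link
    t_out = np.interp(n, n_down, t) # when vehicle n leaves the link
    return t_out - t_in

tt = travel_time(10.0, t, n_up, n_down)
# Because the downstream curve here is a pure 10 s shift of the
# upstream one, every vehicle's travel time is 10 s.
```

In practice the downstream curve is reconstructed from detector counts (and, in Case-DS/DSS, signal timings and saturation flow), which is where the estimation error discussed in the abstract enters.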
Abstract:
PURPOSE: To identify risk factors for developing complications following treatment of refractory glaucoma with transscleral diode laser cyclophotocoagulation (cyclodiode), in order to improve the safety profile of this treatment modality. METHOD: A retrospective analysis of 72 eyes of 70 patients treated with cyclodiode. RESULTS: The mean pre-treatment intraocular pressure (IOP) was 37.0 mmHg (SD 11.0), with a mean post-treatment reduction in IOP of 19.8 mmHg and a mean IOP at last follow-up of 17.1 mmHg (SD 9.7). The mean total power delivered during treatment was 156.8 Joules (SD 82.7) over a mean of 1.3 treatments (SD 0.6). Sixteen eyes (22.2% of patients) developed complications from the treatment, the most common being hypotony, which occurred in 6 patients, including 4 with neovascular glaucoma. A higher pre-treatment IOP and a higher mean total power delivery were also associated with higher complication rates. CONCLUSIONS: Cyclodiode is an effective treatment option for glaucoma that is refractory to other treatments. By identifying risk factors for potential complications, cyclodiode treatment can be tailored to each patient to improve safety and efficacy.