15 results for elliptic curve based chameleon hashing
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Afdeyu Station is one of the few river gauging stations in the highlands of Eritrea where daily measurements are taken. As a result of damage, the station was refurbished, and the cross section of the gauge was changed to give better control of minimal runoff. The gauge therefore had to be re-calibrated. This publication documents this process and provides the new calibration curve, based on extensive field work carried out in the 2009 rainy season.
Abstract:
CpG island methylator phenotype (CIMP) is being investigated for its role in the molecular and prognostic classification of colorectal cancer patients but is also emerging as a factor with the potential to influence clinical decision-making. We report a comprehensive analysis of clinico-pathological and molecular features (KRAS, BRAF and microsatellite instability, MSI) as well as of selected tumour- and host-related protein markers characterizing CIMP-high (CIMP-H), -low, and -negative colorectal cancers. Immunohistochemical analysis for 48 protein markers and molecular analysis of CIMP (CIMP-H: ≥4/5 methylated genes), MSI (MSI-H: ≥2 unstable genes), KRAS, and BRAF were performed on 337 colorectal cancers. Simple and multiple regression analysis and receiver operating characteristic (ROC) curve analysis were performed. CIMP-H was found in 24 cases (7.1%) and linked (p < 0.0001) to more proximal tumour location, BRAF mutation, MSI-H, MGMT methylation (p = 0.022), advanced pT classification (p = 0.03), mucinous histology (p = 0.069), and less frequent KRAS mutation (p = 0.067) compared to CIMP-low or -negative cases. Of the 48 protein markers, decreased levels of RKIP (p = 0.0056), EphB2 (p = 0.0045), CK20 (p = 0.002), and Cdx2 (p < 0.0001) and increased numbers of CD8+ intra-epithelial lymphocytes (p < 0.0001) were related to CIMP-H, independently of MSI status. In addition to the expected clinico-pathological and molecular associations, CIMP-H colorectal cancers are characterized by a loss of protein markers associated with differentiation and metastasis suppression, and have increased CD8+ T-lymphocytes regardless of MSI status. In particular, Cdx2 loss seems to strongly predict CIMP-H in both microsatellite-stable (MSS) and MSI-H colorectal cancers. Cdx2 is proposed as a surrogate marker for CIMP-H.
Abstract:
Purpose: Development of an interpolation algorithm for re‐sampling spatially distributed CT‐data with the following features: global and local integral conservation, avoidance of negative interpolation values for positively defined datasets, and the ability to control re‐sampling artifacts. Method and Materials: The interpolation can be separated into two steps: first, the discrete CT‐data have to be continuously distributed by an analytic function considering the boundary conditions. Generally, this function is determined by piecewise interpolation. Instead of using linear or high order polynomial interpolations, which do not fulfill all the above mentioned features, a special form of Hermitian curve interpolation is used to solve the interpolation problem with respect to the required boundary conditions. A single parameter is determined, by which the behavior of the interpolation function is controlled. Second, the interpolated data have to be re‐distributed with respect to the requested grid. Results: The new algorithm was compared with commonly used interpolation functions based on linear and second order polynomials. It is demonstrated that these interpolation functions may over‐ or underestimate the source data by about 10%–20%, while the parameter of the new algorithm can be adjusted in order to significantly reduce these interpolation errors. Finally, the performance and accuracy of the algorithm were tested by re‐gridding a series of X‐ray CT‐images. Conclusion: Inaccurate sampling values may occur due to the lack of integral conservation. Re‐sampling algorithms using high order polynomial interpolation functions may result in significant artifacts of the re‐sampled data. Such artifacts can be avoided by using the new algorithm based on Hermitian curve interpolation.
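The abstract does not spell out the integral-conserving Hermitian scheme itself. As a rough illustration of the family of methods involved, here is a minimal sketch of piecewise cubic Hermite interpolation with a hypothetical `tension` parameter standing in for the single control parameter mentioned above (the function names, the finite-difference slope rule, and the role of `tension` are assumptions, not the paper's algorithm).

```python
# Minimal sketch of piecewise cubic Hermite interpolation. The paper's
# integral-conserving variant is not reproduced; `tension` is a
# hypothetical knob that scales the end slopes of each segment.

def hermite(x0, y0, m0, x1, y1, m1, x):
    """Evaluate the cubic Hermite segment on [x0, x1] at x."""
    h = x1 - x0
    t = (x - x0) / h
    h00 = 2 * t**3 - 3 * t**2 + 1   # Hermite basis functions
    h10 = t**3 - 2 * t**2 + t
    h01 = -2 * t**3 + 3 * t**2
    h11 = t**3 - t**2
    return h00 * y0 + h10 * h * m0 + h01 * y1 + h11 * h * m1

def interpolate(xs, ys, x, tension=1.0):
    """Piecewise Hermite interpolation with central-difference slopes,
    scaled by `tension` (tension=0 flattens the end slopes)."""
    # locate the segment containing x
    for i in range(len(xs) - 1):
        if xs[i] <= x <= xs[i + 1]:
            break
    def slope(j):
        lo, hi = max(j - 1, 0), min(j + 1, len(xs) - 1)
        return (ys[hi] - ys[lo]) / (xs[hi] - xs[lo])
    return hermite(xs[i], ys[i], tension * slope(i),
                   xs[i + 1], ys[i + 1], tension * slope(i + 1), x)
```

On linear data this reproduces the data exactly; varying `tension` is the kind of single-parameter control the abstract describes.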
Abstract:
Quantitative characterisation of carotid atherosclerosis and classification into symptomatic or asymptomatic is crucial in planning optimal treatment of atheromatous plaque. The computer-aided diagnosis (CAD) system described in this paper can analyse ultrasound (US) images of carotid artery and classify them into symptomatic or asymptomatic based on their echogenicity characteristics. The CAD system consists of three modules: a) the feature extraction module, where first-order statistical (FOS) features and Laws' texture energy can be estimated, b) the dimensionality reduction module, where the number of features can be reduced using analysis of variance (ANOVA), and c) the classifier module consisting of a neural network (NN) trained by a novel hybrid method based on genetic algorithms (GAs) along with the back propagation algorithm. The hybrid method is able to select the most robust features, to adjust automatically the NN architecture and to optimise the classification performance. The performance is measured by the accuracy, sensitivity, specificity and the area under the receiver-operating characteristic (ROC) curve. The CAD design and development is based on images from 54 symptomatic and 54 asymptomatic plaques. This study demonstrates the ability of a CAD system based on US image analysis and a hybrid trained NN to identify atheromatous plaques at high risk of stroke.
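The area under the ROC curve used as a performance measure here can be computed directly from classifier scores via its rank-statistic (Mann-Whitney) interpretation. A minimal, self-contained sketch follows; the function name and toy data are illustrative, not the study's.

```python
# Sketch: AUC as the probability that a randomly chosen positive case
# (e.g., symptomatic plaque) receives a higher classifier score than a
# randomly chosen negative case, with ties counting one half.

def roc_auc(labels, scores):
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

For example, `roc_auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8])` yields 0.75.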
Abstract:
Let us consider a large set of candidate parameter fields, such as hydraulic conductivity maps, on which we can run an accurate forward flow and transport simulation. We address the issue of rapidly identifying a subset of candidates whose response best match a reference response curve. In order to keep the number of calls to the accurate flow simulator computationally tractable, a recent distance-based approach relying on fast proxy simulations is revisited, and turned into a non-stationary kriging method where the covariance kernel is obtained by combining a classical kernel with the proxy. Once the accurate simulator has been run for an initial subset of parameter fields and a kriging metamodel has been inferred, the predictive distributions of misfits for the remaining parameter fields can be used as a guide to select candidate parameter fields in a sequential way. The proposed algorithm, Proxy-based Kriging for Sequential Inversion (ProKSI), relies on a variant of the Expected Improvement, a popular criterion for kriging-based global optimization. A statistical benchmark of ProKSI’s performances illustrates the efficiency and the robustness of the approach when using different kinds of proxies.
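The Expected Improvement criterion that ProKSI builds on has a simple closed form under a Gaussian predictive distribution. The sketch below shows the standard minimization version, assuming `mu` and `sigma` are the kriging predictive mean and standard deviation of the misfit at a candidate field and `best` is the smallest misfit simulated so far; ProKSI's actual variant is not reproduced here.

```python
import math

# Standard Expected Improvement for minimization:
# EI = (best - mu) * Phi(z) + sigma * phi(z), with z = (best - mu) / sigma,
# where Phi/phi are the standard normal CDF/PDF.

def expected_improvement(mu, sigma, best):
    if sigma == 0.0:
        return max(best - mu, 0.0)
    z = (best - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)
    cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return (best - mu) * cdf + sigma * pdf
```

Candidates with the largest EI are the ones worth sending to the accurate flow simulator next, which is exactly the sequential-selection role the criterion plays above.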
Abstract:
The ActiGraph accelerometer is commonly used to measure physical activity in children. Count cut-off points are needed when using accelerometer data to determine the time a person spent in moderate or vigorous physical activity. For the GT3X accelerometer no cut-off points for young children have been published yet. The aim of the current study was thus to develop and validate count cut-off points for young children. Thirty-two children aged 5 to 9 years performed four locomotor and four play activities. Activity classification into the light-, moderate- or vigorous-intensity category was based on energy expenditure measurements with indirect calorimetry. Vertical axis as well as vector magnitude cut-off points were determined through receiver operating characteristic curve analyses with the data of two thirds of the study group and validated with the data of the remaining third. The vertical axis cut-off points were 133 counts per 5 sec for moderate to vigorous physical activity (MVPA), 193 counts for vigorous activity (VPA) corresponding to a metabolic threshold of 5 MET and 233 for VPA corresponding to 6 MET. The vector magnitude cut-off points were 246 counts per 5 sec for MVPA, 316 counts for VPA - 5 MET and 381 counts for VPA - 6 MET. When validated, the current cut-off points generally showed high recognition rates for each category, high sensitivity and specificity values and moderate agreement in terms of the Kappa statistic. These results were similar for vertical axis and vector magnitude cut-off points. The current cut-off points adequately reflect MVPA and VPA in young children. Cut-off points based on vector magnitude counts did not appear to reflect the intensity categories better than cut-off points based on vertical axis counts alone.
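Cut-off points like these are typically read off the ROC curve by maximizing a criterion such as Youden's J (sensitivity + specificity - 1). The following is a minimal sketch of that selection step with made-up counts; it illustrates the general technique, not the study's exact procedure or data.

```python
# Sketch: choose the count threshold that maximizes Youden's J against a
# calorimetry-based reference classification. Data here are invented.

def best_cutoff(counts, is_mvpa):
    """counts: accelerometer counts per epoch; is_mvpa[i]: 1 if epoch i
    was MVPA by the reference method. Returns the best threshold."""
    thresholds = sorted(set(counts))
    def youden(t):
        tp = sum(c >= t and y for c, y in zip(counts, is_mvpa))
        fn = sum(c < t and y for c, y in zip(counts, is_mvpa))
        tn = sum(c < t and not y for c, y in zip(counts, is_mvpa))
        fp = sum(c >= t and not y for c, y in zip(counts, is_mvpa))
        return tp / (tp + fn) + tn / (tn + fp) - 1
    return max(thresholds, key=youden)
```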
Abstract:
BACKGROUND Driving a car is a complex instrumental activity of daily living, and driving performance is very sensitive to cognitive impairment. The assessment of driving-relevant cognition in older drivers is challenging and requires reliable and valid tests with good sensitivity and specificity to predict safe driving. Driving simulators can be used to test fitness to drive. Several studies have found strong correlations between driving simulator performance and on-the-road driving. However, access to driving simulators is restricted to specialists, and simulators are too expensive, large, and complex to allow easy access for older drivers or the physicians advising them. An easily accessible, Web-based, cognitive screening test could offer a solution to this problem. The World Wide Web allows easy dissemination of the test software and implementation of the scoring algorithm on a central server, allowing generation of a dynamically growing database with normative values, and ensures that all users have access to the same up-to-date normative values. OBJECTIVE In this pilot study, we present the novel Web-based Bern Cognitive Screening Test (wBCST) and investigate whether it can predict poor simulated driving performance in healthy and cognitively impaired participants. METHODS The wBCST performance and simulated driving performance were analyzed in 26 healthy younger and 44 healthy older participants as well as in 10 older participants with cognitive impairment. Correlations between the two tests were calculated. Also, simulated driving performance was used to group the participants into good performers (n=70) and poor performers (n=10). A receiver-operating characteristic analysis was performed to determine the sensitivity and specificity of the wBCST in predicting simulated driving performance.
RESULTS The mean wBCST score of the participants with poor simulated driving performance was reduced by 52%, compared to participants with good simulated driving performance (P<.001). The area under the receiver-operating characteristic curve was 0.80 with a 95% confidence interval 0.68-0.92. CONCLUSIONS When selecting a 75% test score as the cutoff, the novel test has 83% sensitivity, 70% specificity, and 81% efficiency, which are good values for a screening test. Overall, in this pilot study, the novel Web-based computer test appears to be a promising tool for supporting clinicians in fitness-to-drive assessments of older drivers. The Web-based distribution and scoring on a central computer will facilitate further evaluation of the novel test setup. We expect that in the near future, Web-based computer tests will become a valid and reliable tool for clinicians, for example, when assessing fitness to drive in older drivers.
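Sensitivity, specificity, and efficiency (overall accuracy) at a fixed cutoff all follow from a single confusion matrix. A minimal sketch, with hypothetical counts rather than the study's data:

```python
# Sketch: screening-test metrics from confusion-matrix counts at one
# cutoff. tp/fn count poor performers, tn/fp count good performers;
# the counts in the test below are invented.

def screening_metrics(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    efficiency = (tp + tn) / (tp + fn + tn + fp)  # overall accuracy
    return sensitivity, specificity, efficiency
```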
Abstract:
The synthesis and incorporation into oligonucleotides of C-nucleosides containing the two aromatic, non-hydrogen-bonding nucleobase substitutes biphenyl (I) and bipyridyl (Y) are described. Their homo- and hetero-recognition properties in different sequential arrangements were then investigated via UV-melting curve analysis, gel mobility assays, and CD and NMR spectroscopy. An NMR analysis of a dodecamer duplex containing one biphenyl pair in the center, as well as CD data on duplexes with multiple insertions, provide further evidence for the zipper-like interstrand stacking motif that we proposed earlier based on molecular modeling. UV-thermal melting experiments with duplexes containing one to seven I or Y base pairs revealed a constant increase in T(m) in the case of I and a constant decrease for Y. Mixed I/Y base pairs lead to stabilities in between the homoseries. Insertion of alternating I/abasic site or Y/abasic site pairs strongly decreases the thermal stability of duplexes. Asymmetric distributions of I or Y residues on either strand of the duplex were also investigated in this context. Duplexes with three natural base pairs at both ends and 50% of I pairs in the center are still readily formed, while duplexes with blunt-ended I pairs tend to aggregate unspecifically. Duplexes with one natural overhang at the end of an I-I base pair tract can either aggregate or form ordered duplexes, depending on the nature of the natural bases in the overhang.
Abstract:
Palynology provides the opportunity to make inferences on changes in diversity of terrestrial vegetation over long time scales. The often coarse taxonomic level achievable in pollen analysis, differences in pollen production and dispersal, and the lack of pollen source boundaries hamper the application of diversity indices to palynology. Palynological richness, the number of pollen types at a constant pollen count, is the most robust and widely used diversity indicator for pollen data. However, this index is also influenced by the abundance distribution of pollen types in sediments. In particular, where the index is calculated by rarefaction analysis, information on taxonomic richness at low abundance may be lost. Here we explore information that can be extracted from the accumulation of taxa over consecutive samples. The log-transformed taxa accumulation curve can be broken up into linear sections with different slope and intersect parameters, describing the accumulation of new taxa within the section. The breaking points may indicate changes in the species pool or in the abundance of high versus low pollen producers. Testing this concept on three pollen diagrams from different landscapes, we find that the break points in the taxa accumulation curves provide convenient zones for identifying changes in richness and evenness. The linear regressions over consecutive samples can be used to inter- and extrapolate to low or extremely high pollen counts, indicating evenness and richness in taxonomic composition within these zones. An evenness indicator, based on the rank-order-abundance is used to assist in the evaluation of the results and the interpretation of the fossil records. Two central European pollen diagrams show major changes in the taxa accumulation curves for the Lateglacial period and the time of human induced land-use changes, while they do not indicate strong changes in the species pool with the onset of the Holocene. 
In contrast, a central Swedish pollen diagram shows comparatively little change, but high richness during the early Holocene forest establishment. Evenness and palynological richness are related for most periods in the three diagrams, however, sections before forest establishment and after forest clearance show high evenness, which is not necessarily accompanied by high palynological richness, encouraging efforts to separate the two.
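Palynological richness at a constant pollen count is usually obtained by rarefaction. A minimal sketch of the standard rarefaction formula, E[S] = sum_i (1 - C(N - N_i, n) / C(N, n)), with invented per-type counts; this illustrates the general index, not the authors' taxa-accumulation-curve analysis:

```python
from math import comb

# Sketch of rarefaction: the expected number of pollen types observed in
# a subsample of n grains, given the per-type counts of a sample of
# N = sum(type_counts) grains. The counts in the test are invented.

def rarefied_richness(type_counts, n):
    N = sum(type_counts)
    return sum(1 - comb(N - c, n) / comb(N, n) for c in type_counts)
```

Note that rare types contribute little at small `n`, which is exactly the loss of low-abundance information the passage above points out.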
Abstract:
BACKGROUND Existing prediction models for mortality in chronic obstructive pulmonary disease (COPD) patients have not yet been validated in primary care, which is where the majority of patients receive care. OBJECTIVES Our aim was to validate the ADO (age, dyspnoea, airflow obstruction) index as a predictor of 2-year mortality in 2 general practice-based COPD cohorts. METHODS Six hundred and forty-six patients with COPD with GOLD (Global Initiative for Chronic Obstructive Lung Disease) stages I-IV were enrolled by their general practitioners and followed for 2 years. The ADO regression equation was used to predict a 2-year risk of all-cause mortality in each patient and this risk was compared with the observed 2-year mortality. Discrimination and calibration were assessed as well as the strength of association between the 15-point ADO score and the observed 2-year all-cause mortality. RESULTS Fifty-two (8.1%) patients died during the 2-year follow-up period. Discrimination with the ADO index was excellent with an area under the curve of 0.78 [95% confidence interval (CI) 0.71-0.84]. Overall, the predicted and observed risks matched well and visual inspection revealed no important differences between them across 10 risk classes (p = 0.68). The odds ratio for death per point increase according to the ADO index was 1.50 (95% CI 1.31-1.71). CONCLUSIONS The ADO index showed excellent prediction properties in an out-of-population validation carried out in COPD patients from primary care settings.
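A points-based index like the ADO maps to a predicted risk through a logistic regression equation, and the reported odds ratio per point is simply exp(beta) for the score coefficient. The sketch below illustrates that relationship using the published per-point odds ratio of 1.50 and a hypothetical intercept; it is not the actual ADO regression equation.

```python
import math

# Sketch: risk prediction from a points score via logistic regression.
# beta is chosen so the odds ratio per point is 1.50 (as reported);
# the intercept of -5.0 is a made-up placeholder.

def predicted_risk(score, intercept=-5.0, beta=math.log(1.50)):
    """Predicted 2-year mortality risk for a given index score."""
    logit = intercept + beta * score
    return 1 / (1 + math.exp(-logit))
```

By construction, the odds of death increase by a factor of 1.50 for each additional point.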
Abstract:
The currently proposed space debris remediation measures include the active removal of large objects and “just in time” collision avoidance by deviating the objects using, e.g., ground-based lasers. Both techniques require precise knowledge of the attitude state and state changes of the target objects: in the former case, to devise methods to grapple the target by a tug spacecraft; in the latter, to precisely propagate the orbits of potential collision partners, as disturbing forces like air drag and solar radiation pressure depend on the attitude of the objects. Non-resolving optical observations of the magnitude variations, so-called light curves, are a promising technique to determine rotation or tumbling rates and the orientation of the actual rotation axis of objects, as well as their temporal changes. The 1-meter telescope ZIMLAT of the Astronomical Institute of the University of Bern has been used to collect light curves of MEO and GEO objects for a considerable period of time. Recently, light curves of Low Earth Orbit (LEO) targets were acquired as well. We present different observation methods, including active tracking using a CCD subframe readout technique, and the use of a high-speed scientific CMOS camera. Technical challenges when tracking objects with poor orbit predictions, as well as different data reduction methods, are addressed. Results from a survey of abandoned rocket upper stages in LEO, examples of abandoned payloads, and observations of high area-to-mass ratio debris will be presented. Eventually, first results of the analysis of these light curves are provided.
Abstract:
Although there has been a significant decrease in caries prevalence in developed countries, the slower progression of dental caries requires methods capable of detecting and quantifying lesions at an early stage. The aim of this study was to evaluate the effectiveness of fluorescence-based methods (DIAGNOdent 2095 laser fluorescence device [LF], DIAGNOdent 2190 pen [LFpen], and VistaProof fluorescence camera [FC]) in monitoring the progression of noncavitated caries-like lesions on smooth surfaces. Caries-like lesions were developed in 60 blocks of bovine enamel using a bacterial model of Streptococcus mutans and Lactobacillus acidophilus. Enamel blocks were evaluated by two independent examiners using the LF, LFpen, and FC at baseline (phase I), after the first cariogenic challenge (eight days) (phase II), and after the second cariogenic challenge (a further eight days) (phase III). Blocks were submitted to surface microhardness (SMH) and cross-sectional microhardness analyses. The intraclass correlation coefficient for intra- and interexaminer reproducibility ranged from 0.49 (FC) to 0.94 (LF/LFpen). SMH values decreased and fluorescence values increased significantly among the three phases. Higher values for sensitivity, specificity, and area under the receiver operating characteristic curve were observed for FC (phase II) and LFpen (phase III). A significant correlation was found between fluorescence values and SMH in all phases and integrated loss of surface hardness (ΔKHN) in phase III. In conclusion, fluorescence-based methods were effective in monitoring noncavitated caries-like lesions on smooth surfaces, with moderate correlation with SMH, allowing differentiation between sound and demineralized enamel.
Abstract:
BACKGROUND Little information is yet available on zirconia-based prostheses supported by implants. PURPOSE To evaluate technical problems and failures of implant-supported zirconia-based prostheses with exclusive screw-retention. MATERIAL AND METHODS Consecutive patients received screw-retained zirconia-based prostheses supported by implants and were followed over a time period of 5 years. The implant placement and prosthetic rehabilitation were performed in one clinical setting, and all patients participated in the maintenance program. The treatment comprised single crowns (SCs) and fixed dental prostheses (FDPs) of three to 12 units. Screw-retention of the CAD/CAM-fabricated SCs and FDPs was performed with direct connection at the implant level. The primary outcome was the complete failure of zirconia-based prostheses; outcome measures were fracture of the framework or extensive chipping resulting in the need for refabrication. A life table analysis was performed, the cumulative survival rate (CSR) calculated, and a Kaplan-Meier curve drawn. RESULTS Two hundred and ninety-four implants supported 156 zirconia-based prostheses in 95 patients (52 men, 43 women, average age 59.1 ± 11.7 years). Sixty-five SCs and 91 FDPs were identified, comprising a total of 441 units. Fractures of the zirconia framework and extensive chipping resulted in refabrication of nine prostheses. Nearly all the prostheses (94.2%) remained in situ during the observation period. The 5-year CSR was 90.5%, and 41 prostheses (14 SCs, 27 FDPs) comprising 113 units survived for an observation time of more than 5 years. Six SCs exhibited screw loosening, and polishing of minor chipping was required for five prostheses. CONCLUSIONS This study shows that zirconia-based implant-supported fixed prostheses exhibit satisfactory treatment outcomes and that screw-retention directly at the implant level is feasible.
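The cumulative survival rate and Kaplan-Meier curve reported above come from the standard product-limit estimator. A minimal sketch, with invented follow-up data rather than the study's:

```python
# Minimal Kaplan-Meier sketch: cumulative survival from (time, event)
# pairs, where event=1 marks a prosthesis failure (refabrication) and
# event=0 a censored observation. Data in the test are invented.

def kaplan_meier(times, events):
    """Return a list of (time, S(time)) pairs at each failure time."""
    # process failures before censorings at tied times
    order = sorted(zip(times, events), key=lambda te: (te[0], -te[1]))
    at_risk = len(times)
    s = 1.0
    curve = []
    for t, e in order:
        if e:
            s *= (at_risk - 1) / at_risk   # survival drops at each failure
            curve.append((t, s))
        at_risk -= 1                       # failures and censorings both leave the risk set
    return curve
```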
Abstract:
Cointegration methods, as well as non-cointegrated structural VARs identified via either long-run restrictions or a combination of long-run and sign restrictions, are used to explore the long-run trade-off between inflation and the unemployment rate in the post-WWII U.S., U.K., Euro area, Canada, and Australia. Overall, neither approach produces clear evidence of a non-vertical trade-off. The extent of uncertainty surrounding the estimates is, however, substantial, implying that a researcher holding alternative priors about what a reasonable slope of the long-run trade-off might be will likely not see her views falsified.