976 results for "ecliptic curve based chameleon hashing"


Relevance: 30.00%

Publisher:

Abstract:

Three-dimensional flow visualization plays an essential role in many areas of science and engineering, such as the aero- and hydrodynamical systems that dominate various physical and natural phenomena. For popular methods such as streamline visualization to be effective, they should capture the underlying flow features while helping the user observe and understand the flow field clearly. My research focuses on the analysis and visualization of flow fields using various techniques, e.g., information-theoretic techniques and graph-based representations. Since streamline visualization is a popular technique in flow field visualization, how to select good streamlines to capture flow patterns and how to pick good viewpoints to observe flow fields become critical questions. We treat streamline selection and viewpoint selection as symmetric problems and solve them simultaneously using a dual information channel [81]. To the best of my knowledge, this is the first attempt in flow visualization to combine these two selection problems in a unified approach. This work selects streamlines in a view-independent manner, so the selected streamlines do not change across viewpoints. Another work of mine [56] uses an information-theoretic approach to evaluate the importance of each streamline under various sample viewpoints and presents a solution for view-dependent streamline selection that guarantees coherent streamline updates as the view changes gradually. When 3D streamlines are projected to 2D images for viewing, occlusion and clutter become inevitable. To address this challenge, we design FlowGraph [57, 58], a novel compound graph representation that organizes field line clusters and spatiotemporal regions hierarchically for occlusion-free and controllable visual exploration. It enables observation and exploration of the relationships among field line clusters, spatiotemporal regions, and their interconnections in the transformed space.
Most viewpoint selection methods consider only external viewpoints outside of the flow field, which do not convey a clear observation when the flow field is cluttered near the boundary. We therefore propose a new way to explore flow fields: selecting several internal viewpoints around the flow features inside the flow field, then generating a B-spline curve path traversing these viewpoints to provide users with close-up views for detailed observation of hidden or occluded internal flow features [54]. This work has also been extended to unsteady flow fields. Besides flow field visualization, other topics relevant to visualization also attract my attention. In iGraph [31], we leverage a distributed system along with a tiled display wall to provide users with high-resolution visual analytics of big image and text collections in real time. Developing pedagogical visualization tools forms my other research focus. Since most cryptography algorithms use sophisticated mathematics, it is difficult for beginners to understand both what an algorithm does and how it does it. Therefore, we develop a set of visualization tools to give users an intuitive way to learn and understand these algorithms.
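
A hypothetical illustration of the information-theoretic ingredient: selection schemes of this kind typically score candidates by the Shannon entropy of a normalized visibility (or importance) distribution. The sketch below is not the dual information channel of [81]; it only shows the basic entropy computation that such channels build on, with made-up visibility weights.

```python
import math

def shannon_entropy(weights):
    """Entropy (in bits) of a discrete distribution given as non-negative weights."""
    total = sum(weights)
    probs = [w / total for w in weights if w > 0]
    return -sum(p * math.log2(p) for p in probs)

# Hypothetical visibility weights of four streamlines from two viewpoints:
# a viewpoint that sees the streamlines more evenly carries more information.
uneven = shannon_entropy([8, 1, 1, 0])  # one streamline dominates the view
even = shannon_entropy([1, 1, 1, 1])    # all four streamlines equally visible
```

An entropy-based criterion would rank the `even` viewpoint higher, since no single streamline hides the others.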

Relevance: 30.00%

Abstract:

Quantitative characterisation of carotid atherosclerosis and classification into symptomatic or asymptomatic is crucial in planning optimal treatment of atheromatous plaque. The computer-aided diagnosis (CAD) system described in this paper can analyse ultrasound (US) images of the carotid artery and classify them as symptomatic or asymptomatic based on their echogenicity characteristics. The CAD system consists of three modules: a) the feature extraction module, where first-order statistical (FOS) features and Laws' texture energy are estimated; b) the dimensionality reduction module, where the number of features is reduced using analysis of variance (ANOVA); and c) the classifier module, a neural network (NN) trained by a novel hybrid method based on genetic algorithms (GAs) along with the back-propagation algorithm. The hybrid method is able to select the most robust features, automatically adjust the NN architecture, and optimise the classification performance. Performance is measured by accuracy, sensitivity, specificity, and the area under the receiver-operating characteristic (ROC) curve. The CAD design and development is based on images from 54 symptomatic and 54 asymptomatic plaques. This study demonstrates the ability of a CAD system based on US image analysis and a hybrid-trained NN to identify atheromatous plaques at high risk of stroke.
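
The FOS features of module a) are simple moments of the grey-level histogram. A minimal sketch of their computation (the exact feature set used by the CAD system may differ):

```python
import math

def first_order_stats(pixels):
    """First-order statistical (FOS) texture features of a grey-level region:
    mean, variance, skewness and kurtosis of the intensity distribution."""
    n = len(pixels)
    mean = sum(pixels) / n
    var = sum((p - mean) ** 2 for p in pixels) / n
    sd = math.sqrt(var)
    skew = sum(((p - mean) / sd) ** 3 for p in pixels) / n if sd else 0.0
    kurt = sum(((p - mean) / sd) ** 4 for p in pixels) / n if sd else 0.0
    return {"mean": mean, "variance": var, "skewness": skew, "kurtosis": kurt}
```

Each plaque region yields one such feature vector, which ANOVA then prunes before classification.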

Relevance: 30.00%

Abstract:

Let us consider a large set of candidate parameter fields, such as hydraulic conductivity maps, on which we can run an accurate forward flow-and-transport simulation. We address the issue of rapidly identifying a subset of candidates whose responses best match a reference response curve. To keep the number of calls to the accurate flow simulator computationally tractable, a recent distance-based approach relying on fast proxy simulations is revisited and turned into a non-stationary kriging method in which the covariance kernel is obtained by combining a classical kernel with the proxy. Once the accurate simulator has been run for an initial subset of parameter fields and a kriging metamodel has been inferred, the predictive distributions of misfits for the remaining parameter fields can be used as a guide to select candidate parameter fields sequentially. The proposed algorithm, Proxy-based Kriging for Sequential Inversion (ProKSI), relies on a variant of Expected Improvement, a popular criterion for kriging-based global optimization. A statistical benchmark of ProKSI's performance illustrates the efficiency and robustness of the approach when using different kinds of proxies.
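
ProKSI's selection rule is a variant of Expected Improvement. For reference, the classical EI for minimisation under a Gaussian predictive distribution can be sketched as follows (the ProKSI variant differs in details not given in the abstract):

```python
import math

def expected_improvement(mu, sigma, best):
    """Classical EI for minimisation: E[max(best - Y, 0)] with Y ~ N(mu, sigma^2).
    `mu`, `sigma` come from the kriging metamodel; `best` is the lowest misfit so far."""
    if sigma <= 0:
        return max(best - mu, 0.0)  # no predictive uncertainty left
    z = (best - mu) / sigma
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))      # standard normal CDF
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # standard normal PDF
    return (best - mu) * cdf + sigma * pdf
```

The candidate with the highest EI is run next through the accurate simulator, balancing low predicted misfit against high predictive uncertainty.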

Relevance: 30.00%

Abstract:

The ActiGraph accelerometer is commonly used to measure physical activity in children. Count cut-off points are needed when using accelerometer data to determine the time a person spent in moderate or vigorous physical activity. For the GT3X accelerometer, no cut-off points for young children have been published yet. The aim of the current study was thus to develop and validate count cut-off points for young children. Thirty-two children aged 5 to 9 years performed four locomotor and four play activities. Activity classification into the light-, moderate- or vigorous-intensity category was based on energy expenditure measurements with indirect calorimetry. Vertical-axis as well as vector-magnitude cut-off points were determined through receiver operating characteristic curve analyses with the data of two thirds of the study group and validated with the data of the remaining third. The vertical-axis cut-off points were 133 counts per 5 sec for moderate to vigorous physical activity (MVPA), 193 counts for vigorous activity (VPA) corresponding to a metabolic threshold of 5 MET, and 233 counts for VPA corresponding to 6 MET. The vector-magnitude cut-off points were 246 counts per 5 sec for MVPA, 316 counts for VPA at 5 MET, and 381 counts for VPA at 6 MET. When validated, the cut-off points generally showed high recognition rates for each category, high sensitivity and specificity values, and moderate agreement in terms of the Kappa statistic. These results were similar for vertical-axis and vector-magnitude cut-off points. The cut-off points adequately reflect MVPA and VPA in young children. Cut-off points based on vector-magnitude counts did not appear to reflect the intensity categories better than cut-off points based on vertical-axis counts alone.
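
Applying such cut-off points to recorded epoch counts is straightforward. A sketch using the reported vertical-axis values (the function names are illustrative, not from the paper):

```python
# Vertical-axis cut-offs reported above, in counts per 5-s epoch.
MVPA_CUTOFF = 133       # moderate-to-vigorous physical activity
VPA_CUTOFF_6MET = 233   # vigorous activity at the 6-MET threshold

def classify_epoch(counts):
    """Assign one 5-s epoch to an intensity category by its count value."""
    if counts >= VPA_CUTOFF_6MET:
        return "vigorous"
    if counts >= MVPA_CUTOFF:
        return "moderate"
    return "light"

def minutes_in_mvpa(epoch_counts):
    """Total MVPA time in minutes from a sequence of 5-s epoch counts."""
    mvpa_epochs = sum(1 for c in epoch_counts if c >= MVPA_CUTOFF)
    return mvpa_epochs * 5 / 60
```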

Relevance: 30.00%

Abstract:

BACKGROUND Driving a car is a complex instrumental activity of daily living, and driving performance is very sensitive to cognitive impairment. The assessment of driving-relevant cognition in older drivers is challenging and requires reliable and valid tests with good sensitivity and specificity to predict safe driving. Driving simulators can be used to test fitness to drive, and several studies have found strong correlations between driving simulator performance and on-the-road driving. However, access to driving simulators is restricted to specialists, and simulators are too expensive, large, and complex to be easily accessible to older drivers or the physicians advising them. An easily accessible, Web-based cognitive screening test could offer a solution to this problem. The World Wide Web allows easy dissemination of the test software and implementation of the scoring algorithm on a central server, allowing the generation of a dynamically growing database of normative values and ensuring that all users have access to the same up-to-date norms. OBJECTIVE In this pilot study, we present the novel Web-based Bern Cognitive Screening Test (wBCST) and investigate whether it can predict poor simulated driving performance in healthy and cognitively impaired participants. METHODS The wBCST performance and simulated driving performance were analyzed in 26 healthy younger and 44 healthy older participants, as well as in 10 older participants with cognitive impairment. Correlations between the two tests were calculated. In addition, simulated driving performance was used to group the participants into good performers (n=70) and poor performers (n=10). A receiver-operating characteristic analysis was performed to determine the sensitivity and specificity of the wBCST in predicting simulated driving performance.
RESULTS The mean wBCST score of the participants with poor simulated driving performance was reduced by 52% compared to participants with good simulated driving performance (P<.001). The area under the receiver-operating characteristic curve was 0.80, with a 95% confidence interval of 0.68-0.92. CONCLUSIONS When selecting a 75% test score as the cutoff, the novel test has 83% sensitivity, 70% specificity, and 81% efficiency, which are good values for a screening test. Overall, in this pilot study, the novel Web-based computer test appears to be a promising tool for supporting clinicians in fitness-to-drive assessments of older drivers. The Web-based distribution and scoring on a central computer will facilitate further evaluation of the novel test setup. We expect that in the near future, Web-based computer tests will become a valid and reliable tool for clinicians, for example, when assessing fitness to drive in older drivers.
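
Sensitivity and specificity at a chosen cutoff follow directly from the 2x2 confusion table. A generic sketch on toy data (not the study's data; the abstract reports 83% sensitivity and 70% specificity at the 75% score cutoff):

```python
def sensitivity_specificity(scores, poor, cutoff):
    """Treat a test score below `cutoff` as predicting poor driving performance.
    `poor` holds the reference labels (True = poor simulated driving)."""
    tp = sum(1 for s, p in zip(scores, poor) if p and s < cutoff)
    fn = sum(1 for s, p in zip(scores, poor) if p and s >= cutoff)
    tn = sum(1 for s, p in zip(scores, poor) if not p and s >= cutoff)
    fp = sum(1 for s, p in zip(scores, poor) if not p and s < cutoff)
    return tp / (tp + fn), tn / (tn + fp)
```

A ROC analysis simply sweeps `cutoff` over the observed score range and plots the resulting (1 - specificity, sensitivity) pairs.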

Relevance: 30.00%

Abstract:

The synthesis and incorporation into oligonucleotides of C-nucleosides containing the two aromatic, non-hydrogen-bonding nucleobase substitutes biphenyl (I) and bipyridyl (Y) are described. Their homo- and hetero-recognition properties in different sequential arrangements were then investigated via UV melting-curve analysis, gel mobility assays, and CD and NMR spectroscopy. An NMR analysis of a dodecamer duplex containing one biphenyl pair in the center, as well as CD data on duplexes with multiple insertions, provide further evidence for the zipper-like interstrand stacking motif that we proposed earlier based on molecular modeling. UV thermal-melting experiments with duplexes containing one to seven I or Y base pairs revealed a constant increase in T(m) in the case of I and a constant decrease for Y. Mixed I/Y base pairs led to stabilities in between those of the homoseries. Insertion of alternating I/abasic-site or Y/abasic-site pairs strongly decreases the thermal stability of duplexes. Asymmetric distributions of I or Y residues on either strand of the duplex were also investigated in this context. Duplexes with three natural base pairs at both ends and 50% I pairs in the center are still readily formed, while duplexes with blunt-ended I pairs tend to aggregate unspecifically. Duplexes with one natural overhang at the end of an I-I base pair tract can either aggregate or form ordered duplexes, depending on the nature of the natural bases in the overhang.

Relevance: 30.00%

Abstract:

Palynology provides the opportunity to make inferences on changes in the diversity of terrestrial vegetation over long time scales. The often coarse taxonomic level achievable in pollen analysis, differences in pollen production and dispersal, and the lack of pollen source boundaries hamper the application of diversity indices to palynology. Palynological richness, the number of pollen types at a constant pollen count, is the most robust and widely used diversity indicator for pollen data. However, this index is also influenced by the abundance distribution of pollen types in sediments. In particular, where the index is calculated by rarefaction analysis, information on taxonomic richness at low abundance may be lost. Here we explore the information that can be extracted from the accumulation of taxa over consecutive samples. The log-transformed taxa accumulation curve can be broken up into linear sections with different slope and intercept parameters describing the accumulation of new taxa within each section. The break points may indicate changes in the species pool or in the abundance of high versus low pollen producers. Testing this concept on three pollen diagrams from different landscapes, we find that the break points in the taxa accumulation curves provide convenient zones for identifying changes in richness and evenness. The linear regressions over consecutive samples can be used to inter- and extrapolate to low or extremely high pollen counts, indicating evenness and richness in taxonomic composition within these zones. An evenness indicator based on the rank-order abundance is used to assist in the evaluation of the results and the interpretation of the fossil records. Two central European pollen diagrams show major changes in the taxa accumulation curves for the Lateglacial period and the time of human-induced land-use changes, while they do not indicate strong changes in the species pool with the onset of the Holocene.
In contrast, a central Swedish pollen diagram shows comparatively little change, but high richness during the early Holocene forest establishment. Evenness and palynological richness are related for most periods in the three diagrams; however, sections before forest establishment and after forest clearance show high evenness that is not necessarily accompanied by high palynological richness, encouraging efforts to separate the two.
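
Palynological richness at a constant pollen count is conventionally computed by rarefaction. A sketch of Hurlbert's expected richness for a random subsample of n grains (the standard formula, not code from the paper):

```python
from math import comb

def rarefied_richness(counts, n):
    """Expected number of pollen types in a random subsample of n grains
    (Hurlbert's rarefaction), given the observed count of each pollen type."""
    total = sum(counts)
    if n > total:
        raise ValueError("subsample larger than the pollen sum")
    # For each type, the probability that at least one of its grains
    # is drawn equals 1 - C(total - c, n) / C(total, n).
    return sum(1 - comb(total - c, n) / comb(total, n) for c in counts)
```

Rare types contribute little to the expected richness, which is exactly the loss of low-abundance information that the taxa-accumulation approach above tries to recover.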

Relevance: 30.00%

Abstract:

BACKGROUND Existing prediction models for mortality in chronic obstructive pulmonary disease (COPD) patients have not yet been validated in primary care, which is where the majority of patients receive care. OBJECTIVES Our aim was to validate the ADO (age, dyspnoea, airflow obstruction) index as a predictor of 2-year mortality in 2 general practice-based COPD cohorts. METHODS Six hundred and forty-six patients with COPD with GOLD (Global Initiative for Chronic Obstructive Lung Disease) stages I-IV were enrolled by their general practitioners and followed for 2 years. The ADO regression equation was used to predict a 2-year risk of all-cause mortality in each patient and this risk was compared with the observed 2-year mortality. Discrimination and calibration were assessed as well as the strength of association between the 15-point ADO score and the observed 2-year all-cause mortality. RESULTS Fifty-two (8.1%) patients died during the 2-year follow-up period. Discrimination with the ADO index was excellent with an area under the curve of 0.78 [95% confidence interval (CI) 0.71-0.84]. Overall, the predicted and observed risks matched well and visual inspection revealed no important differences between them across 10 risk classes (p = 0.68). The odds ratio for death per point increase according to the ADO index was 1.50 (95% CI 1.31-1.71). CONCLUSIONS The ADO index showed excellent prediction properties in an out-of-population validation carried out in COPD patients from primary care settings.
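
The mechanism behind such a risk prediction can be sketched with a logistic model. The intercept below is a made-up placeholder, not the published ADO regression equation; only the per-point odds ratio of 1.50 is taken from the abstract.

```python
import math

# Illustrative coefficients only: the true ADO equation is in the original
# publication. The sketch shows the mechanism, namely that each additional
# ADO point multiplies the odds of 2-year death by the odds ratio 1.50.
BASELINE_LOG_ODDS = -5.0           # hypothetical intercept at ADO score 0
LOG_OR_PER_POINT = math.log(1.50)  # reported odds ratio per ADO point

def predicted_risk(ado_score):
    """Predicted 2-year all-cause mortality risk for a given ADO score (0-15)."""
    log_odds = BASELINE_LOG_ODDS + LOG_OR_PER_POINT * ado_score
    return 1.0 / (1.0 + math.exp(-log_odds))
```

Calibration is then assessed by comparing the mean of `predicted_risk` within each risk class against the observed 2-year mortality in that class.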

Relevance: 30.00%

Abstract:

The currently proposed space debris remediation measures include the active removal of large objects and "just in time" collision avoidance by deviating the objects using, e.g., ground-based lasers. Both techniques require precise knowledge of the attitude state and state changes of the target objects: in the former case, to devise methods to grapple the target with a tug spacecraft; in the latter, to precisely propagate the orbits of potential collision partners, as disturbing forces like air drag and solar radiation pressure depend on the attitude of the objects. Non-resolving optical observations of magnitude variations, so-called light curves, are a promising technique to determine rotation or tumbling rates and the orientation of the actual rotation axis of objects, as well as their temporal changes. The 1-meter telescope ZIMLAT of the Astronomical Institute of the University of Bern has been used to collect light curves of MEO and GEO objects for a considerable period of time. Recently, light curves of Low Earth Orbit (LEO) targets were acquired as well. We present different observation methods, including active tracking using a CCD subframe readout technique and the use of a high-speed scientific CMOS camera. Technical challenges when tracking objects with poor orbit predictions, as well as different data reduction methods, are addressed. Results from a survey of abandoned rocket upper stages in LEO, examples of abandoned payloads, and observations of high area-to-mass ratio debris will be presented. Eventually, first results of the analysis of these light curves are provided.
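
Extracting a spin period from a light curve can be sketched with a classical periodogram, which also copes with the unevenly sampled data typical of such observations. Real pipelines tend to use more robust tools (Lomb-Scargle, epoch folding); this is only the basic idea, demonstrated on synthetic data in the test.

```python
import math

def periodogram_power(times, mags, freq):
    """Power of the classical periodogram at trial frequency `freq` for a
    (possibly unevenly sampled) light curve of magnitudes `mags`."""
    n = len(times)
    mean = sum(mags) / n
    c = sum((m - mean) * math.cos(2 * math.pi * freq * t) for t, m in zip(times, mags))
    s = sum((m - mean) * math.sin(2 * math.pi * freq * t) for t, m in zip(times, mags))
    return (c * c + s * s) / n

def best_frequency(times, mags, trial_freqs):
    """Trial frequency with the highest periodogram power (candidate spin rate)."""
    return max(trial_freqs, key=lambda f: periodogram_power(times, mags, f))
```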

Relevance: 30.00%

Abstract:

Although there has been a significant decrease in caries prevalence in developed countries, the slower progression of dental caries requires methods capable of detecting and quantifying lesions at an early stage. The aim of this study was to evaluate the effectiveness of fluorescence-based methods (the DIAGNOdent 2095 laser fluorescence device [LF], DIAGNOdent 2190 pen [LFpen], and VistaProof fluorescence camera [FC]) in monitoring the progression of noncavitated caries-like lesions on smooth surfaces. Caries-like lesions were developed in 60 blocks of bovine enamel using a bacterial model of Streptococcus mutans and Lactobacillus acidophilus. Enamel blocks were evaluated by two independent examiners using the LF, LFpen, and FC at baseline (phase I), after the first cariogenic challenge (eight days) (phase II), and after the second cariogenic challenge (a further eight days) (phase III). Blocks were submitted to surface microhardness (SMH) and cross-sectional microhardness analyses. The intraclass correlation coefficient for intra- and interexaminer reproducibility ranged from 0.49 (FC) to 0.94 (LF/LFpen). SMH values decreased and fluorescence values increased significantly across the three phases. Higher values for sensitivity, specificity, and area under the receiver operating characteristic curve were observed for FC (phase II) and LFpen (phase III). A significant correlation was found between fluorescence values and SMH in all phases, and with integrated loss of surface hardness (ΔKHN) in phase III. In conclusion, fluorescence-based methods were effective in monitoring noncavitated caries-like lesions on smooth surfaces, with moderate correlation with SMH, allowing differentiation between sound and demineralized enamel.
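
The reported correlation between fluorescence values and SMH is a standard Pearson coefficient. A minimal sketch of its computation:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between two equally long sequences of measurements,
    e.g. fluorescence readings and surface microhardness values per block."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

A negative `r` is what the study's design predicts: fluorescence rises as demineralization progresses while SMH falls.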

Relevance: 30.00%

Abstract:

BACKGROUND Little information is yet available on zirconia-based prostheses supported by implants. PURPOSE To evaluate technical problems and failures of implant-supported zirconia-based prostheses with exclusive screw-retention. MATERIAL AND METHODS Consecutive patients received screw-retained zirconia-based prostheses supported by implants and were followed over a time period of 5 years. The implant placement and prosthetic rehabilitation were performed in one clinical setting, and all patients participated in the maintenance program. The treatment comprised single crowns (SCs) and fixed dental prostheses (FDPs) of three to 12 units. Screw-retention of the CAD/CAM-fabricated SCs and FDPs was performed with direct connection at the implant level. The primary outcome was the complete failure of zirconia-based prostheses; outcome measures were fracture of the framework or extensive chipping resulting in the need for refabrication. A life table analysis was performed, the cumulative survival rate (CSR) calculated, and a Kaplan-Meier curve drawn. RESULTS Two hundred and ninety-four implants supported 156 zirconia-based prostheses in 95 patients (52 men, 43 women, average age 59.1 ± 11.7 years). Sixty-five SCs and 91 FDPs were identified, comprising a total of 441 units. Fractures of the zirconia framework and extensive chipping resulted in refabrication of nine prostheses. Nearly all the prostheses (94.2%) remained in situ during the observation period. The 5-year CSR was 90.5%, and 41 prostheses (14 SCs, 27 FDPs) comprising 113 units survived for an observation time of more than 5 years. Six SCs exhibited screw loosening, and polishing of minor chipping was required for five prostheses. CONCLUSIONS This study shows that zirconia-based implant-supported fixed prostheses exhibit satisfactory treatment outcomes and that screw-retention directly at the implant level is feasible.
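
The 5-year CSR comes from a life-table / Kaplan-Meier analysis. A sketch of the product-limit estimator on (time, failed) pairs, where censored prostheses simply leave the risk set (illustrative, not the study's code):

```python
def kaplan_meier_csr(times, failed):
    """Kaplan-Meier estimate of the cumulative survival rate after the last
    follow-up. `times`: follow-up time per prosthesis; `failed`: True if it
    had to be refabricated (event), False if censored (still in situ)."""
    data = sorted(zip(times, failed))
    at_risk = len(data)
    survival = 1.0
    i = 0
    while i < len(data):
        t = data[i][0]
        events = removed = 0
        while i < len(data) and data[i][0] == t:  # group ties at time t
            events += bool(data[i][1])
            removed += 1
            i += 1
        if events:
            survival *= 1.0 - events / at_risk
        at_risk -= removed
    return survival
```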

Relevance: 30.00%

Abstract:

Both cointegration methods and non-cointegrated structural VARs, identified based on either long-run restrictions or a combination of long-run and sign restrictions, are used to explore the long-run trade-off between inflation and the unemployment rate in the post-WWII U.S., U.K., Euro area, Canada, and Australia. Overall, neither approach produces clear evidence of a non-vertical trade-off. The extent of uncertainty surrounding the estimates is, however, substantial, implying that a researcher holding alternative priors about what a reasonable slope of the long-run trade-off might be will likely not see her views falsified.

Relevance: 30.00%

Abstract:

We have recently demonstrated a biosensor based on a lattice of SU8 pillars on a 1 μm SiO2/Si wafer, interrogated by measuring vertical reflectivity as a function of wavelength. Biodetection has been proven with the combination of Bovine Serum Albumin (BSA) protein and its antibody (antiBSA). A BSA layer is attached to the pillars; the biorecognition of antiBSA causes a shift in the reflectivity curve related to the concentration of antiBSA. A detection limit on the order of 2 ng/ml is achieved for a rhombic lattice of pillars with a lattice parameter (a) of 800 nm, a height (h) of 420 nm, and a diameter (d) of 200 nm. These results correlate with calculations using the 3D finite-difference time-domain method. A simplified 2D model is proposed, consisting of a multilayer model in which the pillars are replaced by a 420 nm layer with an effective refractive index obtained using a Beam Propagation Method (BPM) algorithm. Results provided by this model are in good correlation with experimental data, reducing computation time from one day to 15 minutes and giving a fast but accurate tool to optimize the design, maximize sensitivity, and analyze the influence of different variables (diameter, height, and lattice parameter). Sensitivity is obtained for a variety of configurations, reaching a limit of detection under 1 ng/ml. The optimum design is chosen not only for its sensitivity but also for its feasibility, both from a fabrication point of view (limited by the aspect ratio and proximity of the pillars) and from a fluidic point of view. (© 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
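
The 2D effective-index model amounts to thin-film optics: the pillar lattice is replaced by a homogeneous 420 nm layer, and the reflectivity follows from the Fresnel formula for a single layer. A normal-incidence sketch (the published model uses BPM-derived effective indices in a full multilayer stack; this shows only the single-layer case):

```python
import cmath
import math

def thin_film_reflectance(n0, n1, n2, thickness_nm, wavelength_nm):
    """Normal-incidence reflectance of a single homogeneous layer of index n1
    and given thickness between an ambient n0 and a substrate n2. This is the
    simplest stand-in for the effective-index layer replacing the pillars."""
    r01 = (n0 - n1) / (n0 + n1)  # Fresnel coefficient, ambient/layer
    r12 = (n1 - n2) / (n1 + n2)  # Fresnel coefficient, layer/substrate
    beta = 2 * math.pi * n1 * thickness_nm / wavelength_nm  # phase thickness
    phase = cmath.exp(-2j * beta)
    r = (r01 + r12 * phase) / (1 + r01 * r12 * phase)
    return abs(r) ** 2
```

Sweeping `wavelength_nm` reproduces a reflectivity curve whose spectral shift, upon a small change of the effective index when antiBSA binds, is the sensing signal.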

Relevance: 30.00%

Abstract:

This paper studies feature subset selection in classification using a multiobjective estimation of distribution algorithm. We consider six functions, namely the area under the ROC curve, sensitivity, specificity, precision, F1 measure, and Brier score, for the evaluation of feature subsets and as the objectives of the problem. One characteristic of these objective functions is the existence of noise in their values, which should be handled appropriately during optimization. Our proposed algorithm consists of two major techniques specially designed for the feature subset selection problem. The first is a solution-ranking method based on interval values to handle the noise in the objectives of this problem. The second is a model estimation method for learning a joint probabilistic model of objectives and variables, which is used to generate new solutions and advance through the search space. To simplify model estimation, l1-regularized regression is used to select a subset of problem variables before model learning. The proposed algorithm is compared with a well-known ranking method for interval-valued objectives and with a standard multiobjective genetic algorithm. In particular, the effects of the two new techniques are experimentally investigated. The experimental results show that the proposed algorithm obtains comparable or better performance on the tested datasets.
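
Two of the ingredients are easy to sketch: the Brier score objective, and interval-based comparison of noisy objective values, where one solution is preferred only when its interval of observed values lies entirely below the other's. This is a simplified stand-in for the paper's ranking method, not its actual rule.

```python
def brier_score(probs, labels):
    """Mean squared difference between predicted probability and binary outcome
    (lower is better); one of the six subset-evaluation objectives."""
    return sum((p - y) ** 2 for p, y in zip(probs, labels)) / len(probs)

def interval(values):
    """Interval spanned by repeated noisy evaluations of one objective."""
    return (min(values), max(values))

def dominates(int_a, int_b):
    """a is surely better (lower) than b only when the intervals do not overlap."""
    return int_a[1] < int_b[0]
```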

Relevance: 30.00%

Abstract:

INTRODUCTION: Objective assessment of motor skills has become an important challenge in minimally invasive surgery (MIS) training. Currently, there is no gold standard defining and determining residents' surgical competence. To aid in the decision process, we analyze the validity of a supervised classifier to determine the degree of MIS competence based on assessment of psychomotor skills. METHODOLOGY: An ANFIS is trained to classify performance in a box-trainer peg transfer task performed by two groups (expert/non-expert). There were 42 participants included in the study: the non-expert group consisted of 16 medical students and 8 residents (< 10 MIS procedures performed), whereas the expert group consisted of 14 residents (> 10 MIS procedures performed) and 4 experienced surgeons. Instrument movements were captured by means of the Endoscopic Video Analysis (EVA) tracking system. Nine motion analysis parameters (MAPs) were analyzed, including time, path length, depth, average speed, average acceleration, economy of area, economy of volume, idle time, and motion smoothness. Data reduction was performed by means of principal component analysis, and the result was then used to train the ANFIS net. Performance was measured by leave-one-out cross-validation. RESULTS: The ANFIS presented an accuracy of 80.95%, with 13 experts and 21 non-experts correctly classified. The total root mean square error was 0.88, while the area under the classifier's ROC curve (AUC) was measured at 0.81. DISCUSSION: We have shown the usefulness of ANFIS for classification of MIS competence in a simple box-trainer exercise. The main advantage of using ANFIS resides in its continuous output, which allows fine discrimination of surgical competence. There are, however, challenges that must be taken into account when considering the use of ANFIS (e.g., training time, architecture modeling).
Despite this, we have shown the discriminative power of ANFIS for a low-difficulty box-trainer task, regardless of the individual significance of each MAP. Future studies are required to confirm the findings, with the inclusion of new tasks, conditions, and sample populations.
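
Several of the MAPs are simple functionals of the tracked instrument trajectory. A sketch of path length and average speed from sampled 3D tip positions (illustrative; EVA's exact definitions may differ):

```python
import math

def path_length(traj):
    """Total 3D path length of an instrument-tip trajectory [(x, y, z), ...]."""
    return sum(math.dist(a, b) for a, b in zip(traj, traj[1:]))

def average_speed(traj, dt):
    """Mean tip speed given a fixed sampling interval dt in seconds."""
    duration = dt * (len(traj) - 1)
    return path_length(traj) / duration
```

Experts typically show shorter path lengths and smoother motion for the same task, which is what gives these parameters their discriminative power.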