980 results for elliptic curve based chameleon hashing
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
This paper reports the development and validation of a simple and sensitive method that uses solid-phase extraction (SPE) and liquid chromatography with ultraviolet detection to analyze fluoxetine (FLX) and norfluoxetine (NFLX) in human plasma samples. A lab-made C18 SPE phase was synthesized via a sol–gel process employing a low-cost silica precursor. This sorbent was fully characterized by nuclear magnetic resonance (NMR), Fourier-transform infrared spectroscopy (FT-IR), and scanning electron microscopy (SEM) to check the particles' shape, size, and C18 functionalization. The lab-made C18 silica was used in the sample preparation step of human plasma in the SPE-HPLC-UV method. The method was validated over the 15 to 500 ng mL⁻¹ range for both FLX and NFLX using a matrix-matched curve. Detection limits of 4.3 and 4.2 ng mL⁻¹ were obtained for FLX and NFLX, respectively. The repeatability and intermediate precision varied from 7.6 to 15.0%, and the accuracy ranged from −14.9 to 9.1%. The synthesized C18 sorbent was compared to commercial C18 sorbents: the average recoveries were similar (85–105%), but the lab-made C18 silica showed fewer interfering peaks in the chromatogram. After development and validation, the method using the lab-made C18 SPE was applied to plasma samples of patients under FLX treatment (n = 6). The concentrations of FLX and NFLX found in the samples varied from 46.8–215.5 and 48.0–189.9 ng mL⁻¹, respectively.
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
This work proposes the development and study of a novel technique for the generation of fractal descriptors used in texture analysis. The novel descriptors are obtained from a multiscale transform applied to the Fourier technique of fractal dimension calculation. The power spectrum of the Fourier transform of the image is plotted against frequency on a log-log scale, and a multiscale transform is applied to this curve. The values obtained are taken as the fractal descriptors of the image. The proposal is validated by using the descriptors to classify a dataset of texture images whose true classes are known in advance. The classification precision is compared to that of other fractal descriptors known in the literature. The results confirm the efficiency of the proposed method.
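To make the pipeline concrete, here is a minimal Python sketch of the Fourier step (radially averaged log-log power spectrum) followed by a multiscale transform. The Gaussian space-scale smoothing is an assumed stand-in for the paper's multiscale transform, which the abstract does not specify.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def fourier_log_log_curve(img, n_bins=64):
    """Radially averaged power spectrum of a 2D image on a log-log scale."""
    power = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
    h, w = img.shape
    y, x = np.indices((h, w))
    r = np.hypot(x - w / 2, y - h / 2)            # radial frequency per pixel
    bins = np.linspace(1.0, r.max(), n_bins + 1)
    idx = np.digitize(r.ravel(), bins)
    spectrum = np.array([power.ravel()[idx == i].mean() if np.any(idx == i) else 0.0
                         for i in range(1, n_bins + 1)])
    freqs = 0.5 * (bins[:-1] + bins[1:])
    return np.log(freqs), np.log(np.maximum(spectrum, 1e-12))

def multiscale_descriptors(log_power, scales=(1, 2, 4, 8)):
    """Smooth the log-log curve at several scales (a simple space-scale
    transform) and concatenate the smoothed curves as descriptors."""
    return np.concatenate([gaussian_filter1d(log_power, s) for s in scales])

# Usage: descriptors for a random test texture
rng = np.random.default_rng(0)
log_f, log_p = fourier_log_log_curve(rng.random((128, 128)))
print(multiscale_descriptors(log_p).shape)        # (256,) feature vector
```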
Abstract:
International Journal of Paediatric Dentistry 2012; 22: 459–466. Aim. This in vitro study aimed to test the performance of fluorescence-based methods in detecting occlusal caries lesions in primary molars compared with conventional methods. Design. Two examiners assessed 113 sites on 77 occlusal surfaces of primary molars using three fluorescence devices: DIAGNOdent (LF), DIAGNOdent pen (LFpen), and a fluorescence camera (VistaProof-FC). Visual inspection (ICDAS) and radiographic methods were also evaluated. One examiner repeated the evaluations after one month. As the reference standard, lesion depth was determined after sectioning and evaluation under a stereomicroscope. The area under the ROC curve (Az), sensitivity, specificity, and accuracy of the methods were calculated at the enamel (D1) and dentine (D3) caries lesion thresholds. Intra- and inter-examiner reproducibility was calculated using the intraclass correlation coefficient (ICC) and kappa statistics. Results. At D1, visual inspection presented higher sensitivities (0.97–0.99) but lower specificities (0.18–0.25). At D3, all the methods demonstrated similar performance (Az values around 0.90). Visual and radiographic methods showed slightly higher specificity (values higher than 0.96) than the fluorescence-based ones (values around 0.88). In general, all methods presented high reproducibility (ICC higher than 0.79). Conclusions. Although fluorescence-based and conventional methods present similar performance in detecting occlusal caries lesions in primary teeth, visual inspection alone seems to be sufficient for clinical practice.
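As an illustration of the statistics reported above, a minimal sketch of computing Az, sensitivity, specificity, and accuracy at one caries threshold with scikit-learn; the readings and the cut-off value are hypothetical.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, confusion_matrix

# 1 = lesion at/after the D1 (enamel) threshold per histology, 0 = sound
truth = np.array([0, 0, 1, 1, 1, 0, 1, 0, 1, 1])
# Device readings (e.g., fluorescence scores); higher = more likely carious
scores = np.array([3, 8, 25, 40, 12, 5, 33, 9, 28, 18])

az = roc_auc_score(truth, scores)             # Az: area under the ROC curve
pred = (scores >= 14).astype(int)             # an assumed diagnostic cut-off
tn, fp, fn, tp = confusion_matrix(truth, pred).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
accuracy = (tp + tn) / truth.size
print(f"Az={az:.2f} Se={sensitivity:.2f} Sp={specificity:.2f} Acc={accuracy:.2f}")
```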
Abstract:
Three-party password-authenticated key exchange (3PAKE) protocols allow entities to negotiate a secret session key with the aid of a trusted server with whom they share a human-memorable password. Recently, Lou and Huang proposed a simple 3PAKE protocol based on elliptic curve cryptography, which is claimed to be secure and to provide superior efficiency compared with similar-purpose solutions. In this paper, however, we show that the solution is vulnerable to key-compromise impersonation and offline password guessing attacks from system insiders or outsiders, which indicates that the empirical approach used to evaluate the scheme's security is flawed. These results highlight the need to employ provable-security approaches when designing and analyzing PAKE schemes.
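The offline password-guessing attack class mentioned here can be illustrated generically. The sketch below assumes the attacker has recorded a protocol value that is a deterministic function of the shared password and public transcript data, which is the essence of why such attacks work against flawed PAKEs; all names are hypothetical, and this is an illustration of the attack class, not the Lou-Huang protocol itself.

```python
import hashlib

def transcript_tag(password: str, public_transcript: bytes) -> bytes:
    """A value a flawed protocol might expose: deterministic in the password."""
    return hashlib.sha256(password.encode() + public_transcript).digest()

def offline_guess(observed_tag: bytes, public_transcript: bytes, dictionary):
    """The attacker tests every candidate password against the recorded tag,
    with no further interaction with the server (hence 'offline')."""
    for guess in dictionary:
        if transcript_tag(guess, public_transcript) == observed_tag:
            return guess
    return None

# Usage: replay recorded public data against a word list of weak passwords.
transcript = b"ids||ephemeral-points||nonces"      # recorded from the wire
tag = transcript_tag("hunter2", transcript)        # victim's real password
print(offline_guess(tag, transcript, ["123456", "password", "hunter2"]))
```

Because passwords are human-memorable and thus low-entropy, such a dictionary search is cheap; a sound PAKE ensures no recorded value lets a guess be verified offline.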
Abstract:
The "sustainability" concept relates to the prolonging of human economic systems with as little detrimental impact on ecological systems as possible. Construction that exhibits good environmental stewardship and practices that conserve resources in a manner that allow growth and development to be sustained for the long-term without degrading the environment are indispensable in a developed society. Past, current and future advancements in asphalt as an environmentally sustainable paving material are especially important because the quantities of asphalt used annually in Europe as well as in the U.S. are large. The asphalt industry is still developing technological improvements that will reduce the environmental impact without affecting the final mechanical performance. Warm mix asphalt (WMA) is a type of asphalt mix requiring lower production temperatures compared to hot mix asphalt (HMA), while aiming to maintain the desired post construction properties of traditional HMA. Lowering the production temperature reduce the fuel usage and the production of emissions therefore and that improve conditions for workers and supports the sustainable development. Even the crumb-rubber modifier (CRM), with shredded automobile tires and used in the United States since the mid 1980s, has proven to be an environmentally friendly alternative to conventional asphalt pavement. Furthermore, the use of waste tires is not only relevant in an environmental aspect but also for the engineering properties of asphalt [Pennisi E., 1992]. This research project is aimed to demonstrate the dual value of these Asphalt Mixes in regards to the environmental and mechanical performance and to suggest a low environmental impact design procedure. In fact, the use of eco-friendly materials is the first phase towards an eco-compatible design but it cannot be the only step. The eco-compatible approach should be extended also to the design method and material characterization because only with these phases is it possible to exploit the maximum potential properties of the used materials. Appropriate asphalt concrete characterization is essential and vital for realistic performance prediction of asphalt concrete pavements. Volumetric (Mix design) and mechanical (Permanent deformation and Fatigue performance) properties are important factors to consider. Moreover, an advanced and efficient design method is necessary in order to correctly use the material. A design method such as a Mechanistic-Empirical approach, consisting of a structural model capable of predicting the state of stresses and strains within the pavement structure under the different traffic and environmental conditions, was the application of choice. In particular this study focus on the CalME and its Incremental-Recursive (I-R) procedure, based on damage models for fatigue and permanent shear strain related to the surface cracking and to the rutting respectively. It works in increments of time and, using the output from one increment, recursively, as input to the next increment, predicts the pavement conditions in terms of layer moduli, fatigue cracking, rutting and roughness. This software procedure was adopted in order to verify the mechanical properties of the study mixes and the reciprocal relationship between surface layer and pavement structure in terms of fatigue and permanent deformation with defined traffic and environmental conditions. The asphalt mixes studied were used in a pavement structure as surface layer of 60 mm thickness. 
The performance of the pavement was compared to that of the same pavement structure with different kinds of asphalt concrete as the surface layer. Three eco-friendly materials, two warm mix asphalts and a rubberized asphalt concrete, were analyzed against a conventional asphalt concrete. The first two chapters summarize the steps needed to satisfy the sustainable pavement design procedure. Chapter I introduces the problem of eco-compatible asphalt pavement design; the low-environmental-impact materials, Warm Mix Asphalt and Rubberized Asphalt Concrete, are described in detail, and the value of a rational asphalt pavement design method is discussed. Chapter II underlines the importance of a thorough laboratory characterization based on appropriate materials selection and performance evaluation. Chapter III introduces CalME, explaining the different design approaches it provides and, specifically, the I-R procedure. Chapter IV presents the experimental program, with an explanation of the laboratory test devices adopted. The fatigue and rutting performances of the study mixes are shown in Chapters V and VI, respectively. From these laboratory test data, the CalME I-R model parameters for the master curve, fatigue damage, and permanent shear strain were evaluated. Lastly, Chapter VII reports the results of simulations of asphalt pavement structures with different surface layers. For each pavement structure, the total surface cracking, total rutting, fatigue damage, and rut depth in each bound layer were analyzed.
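The Incremental-Recursive procedure described above is, in essence, a time-marching loop in which each increment's damaged state feeds the next. A minimal Python sketch follows; the damage-growth laws and all coefficients are hypothetical placeholders standing in for CalME's calibrated fatigue and permanent-shear-strain models.

```python
from dataclasses import dataclass

@dataclass
class LayerState:
    modulus: float         # current layer stiffness, MPa
    fatigue_damage: float  # 0 (intact) .. 1 (fully damaged)
    rut_depth: float       # accumulated permanent deformation, mm

def run_increment(state, axle_load, dt_hours):
    """One time increment: compute responses with the current modulus, grow
    damage, then degrade the modulus recursively for the next increment."""
    strain = axle_load / state.modulus            # simplified response model
    d_damage = 1e-4 * strain ** 2 * dt_hours      # hypothetical fatigue law
    d_rut = 5e-3 * strain * dt_hours              # hypothetical rutting law
    damage = min(1.0, state.fatigue_damage + d_damage)
    modulus = state.modulus * (1.0 - 0.5 * damage)  # stiffness loss with damage
    return LayerState(modulus, damage, state.rut_depth + d_rut)

# Usage: 20-year design life in monthly increments for one bound layer
state = LayerState(modulus=3000.0, fatigue_damage=0.0, rut_depth=0.0)
for month in range(240):
    state = run_increment(state, axle_load=50.0, dt_hours=730)
print(state)
```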
Abstract:
The diagnosis, grading, and classification of tumours has benefited considerably from the development of DCE-MRI, which is now essential to the adequate clinical management of many tumour types owing to its ability to detect active angiogenesis. Several strategies have been proposed for DCE-MRI evaluation. Visual inspection of contrast agent concentration curves versus time is a very simple yet operator-dependent procedure, so more objective approaches have been developed to facilitate comparison between studies. In so-called model-free approaches, descriptive or heuristic information extracted from the raw time series is used for tissue classification. The main issue with these schemes is that they have no direct interpretation in terms of the physiological properties of the tissues. On the other hand, model-based investigations typically involve compartmental tracer kinetic modelling and pixel-by-pixel estimation of kinetic parameters via nonlinear regression applied to regions of interest suitably selected by the physician. This approach has the advantage of providing parameters directly related to the pathophysiological properties of the tissue, such as vessel permeability, local regional blood flow, extraction fraction, and the concentration gradient between plasma and the extravascular-extracellular space. However, nonlinear modelling is computationally demanding, and the accuracy of the estimates can be affected by the signal-to-noise ratio and by the initial solutions. The principal aim of this thesis is to investigate the use of semi-quantitative and quantitative parameters for the segmentation and classification of breast lesions. The objectives can be subdivided as follows: to describe the principal techniques for evaluating the time-intensity curve in DCE-MRI, with a focus on the kinetic models proposed in the literature; to evaluate the influence of the parametrization choice for a classic bi-compartmental kinetic model; to evaluate the performance of a method for simultaneous tracer kinetic modelling and pixel classification; and to evaluate the performance of machine learning techniques trained for the segmentation and classification of breast lesions.
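For the bi-compartmental modelling mentioned above, the standard Tofts model expresses tissue concentration as Ct(t) = Ktrans ∫₀ᵗ Cp(τ) e^(−kep (t−τ)) dτ. A minimal per-pixel fitting sketch with SciPy follows; the arterial input function, noise level, and units are hypothetical, and note the role of the initial solution p0 in the nonlinear regression.

```python
import numpy as np
from scipy.optimize import curve_fit

t = np.linspace(0, 300, 60)                 # s, acquisition time points
aif = 5.0 * t * np.exp(-t / 60.0)           # hypothetical plasma input Cp(t)

def tofts(t, ktrans, kep):
    """Standard Tofts model: Ct = ktrans * (Cp convolved with exp(-kep*t))."""
    dt = t[1] - t[0]
    kernel = np.exp(-kep * t)
    return ktrans * np.convolve(aif, kernel)[: t.size] * dt

# Simulate one pixel's noisy concentration curve
rng = np.random.default_rng(0)
ct = tofts(t, 0.25, 0.01) + rng.normal(0, 0.5, t.size)

# Nonlinear regression; accuracy depends on noise and the initial solution p0
(ktrans, kep), _ = curve_fit(tofts, t, ct, p0=[0.1, 0.005],
                             bounds=([0, 0], [5, 1]))
print(f"Ktrans={ktrans:.3f} /s  kep={kep:.4f} /s")
```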
Abstract:
CpG island methylator phenotype (CIMP) is being investigated for its role in the molecular and prognostic classification of colorectal cancer patients, but is also emerging as a factor with the potential to influence clinical decision-making. We report a comprehensive analysis of clinico-pathological and molecular features (KRAS, BRAF, and microsatellite instability, MSI) as well as of selected tumour- and host-related protein markers characterizing CIMP-high (CIMP-H), -low, and -negative colorectal cancers. Immunohistochemical analysis for 48 protein markers and molecular analysis of CIMP (CIMP-H: ≥4/5 methylated genes), MSI (MSI-H: ≥2 unstable genes), KRAS, and BRAF were performed on 337 colorectal cancers. Simple and multiple regression analyses and receiver operating characteristic (ROC) curve analysis were performed. CIMP-H was found in 24 cases (7.1%) and linked (p < 0.0001) to more proximal tumour location, BRAF mutation, MSI-H, MGMT methylation (p = 0.022), advanced pT classification (p = 0.03), mucinous histology (p = 0.069), and less frequent KRAS mutation (p = 0.067) compared to CIMP-low or -negative cases. Of the 48 protein markers, decreased levels of RKIP (p = 0.0056), EphB2 (p = 0.0045), CK20 (p = 0.002), and Cdx2 (p < 0.0001) and increased numbers of CD8+ intra-epithelial lymphocytes (p < 0.0001) were related to CIMP-H, independently of MSI status. In addition to the expected clinico-pathological and molecular associations, CIMP-H colorectal cancers are characterized by a loss of protein markers associated with differentiation and metastasis suppression, and have increased CD8+ T-lymphocytes regardless of MSI status. In particular, Cdx2 loss seems to strongly predict CIMP-H in both microsatellite-stable (MSS) and MSI-H colorectal cancers. Cdx2 is proposed as a surrogate marker for CIMP-H.
Abstract:
Purpose: Development of an interpolation algorithm for re-sampling spatially distributed CT data with the following features: global and local integral conservation, avoidance of negative interpolation values for positively defined datasets, and the ability to control re-sampling artifacts. Method and Materials: The interpolation can be separated into two steps. First, the discrete CT data have to be continuously distributed by an analytic function that respects the boundary conditions. Generally, this function is determined by piecewise interpolation. Instead of using linear or high-order polynomial interpolations, which do not fulfill all the features mentioned above, a special form of Hermitian curve interpolation is used to solve the interpolation problem with respect to the required boundary conditions. A single parameter is determined, by which the behavior of the interpolation function is controlled. Second, the interpolated data have to be re-distributed onto the requested grid. Results: The new algorithm was compared with commonly used interpolation functions based on linear and second-order polynomials. It is demonstrated that these interpolation functions may over- or underestimate the source data by about 10%–20%, while the parameter of the new algorithm can be adjusted to significantly reduce these interpolation errors. Finally, the performance and accuracy of the algorithm were tested by re-gridding a series of X-ray CT images. Conclusion: Inaccurate sampling values may occur due to the lack of integral conservation. Re-sampling algorithms using high-order polynomial interpolation functions may produce significant artifacts in the re-sampled data. Such artifacts can be avoided by using the new algorithm based on Hermitian curve interpolation.
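The overshoot behaviour described above is exactly what shape-preserving Hermitian interpolation avoids. As an illustrative stand-in for the abstract's algorithm (which additionally enforces integral conservation through its tunable parameter, not reproduced here), SciPy's piecewise cubic Hermite interpolator demonstrates the no-overshoot, no-negative-values property on positively defined data.

```python
import numpy as np
from scipy.interpolate import PchipInterpolator, CubicSpline

x = np.arange(8.0)
y = np.array([0.0, 0.0, 0.0, 10.0, 10.0, 0.0, 0.0, 0.0])  # positive CT-like data

xf = np.linspace(0, 7, 200)
hermite = PchipInterpolator(x, y)(xf)   # shape-preserving Hermite: stays >= 0
spline = CubicSpline(x, y)(xf)          # ordinary cubic spline: overshoots

print(f"Hermite min: {hermite.min():.3f}")  # 0.000 -- no negative artifacts
print(f"Spline  min: {spline.min():.3f}")   # < 0   -- undershoot below zero
```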
Abstract:
Three-dimensional flow visualization plays an essential role in many areas of science and engineering, such as aero- and hydrodynamic systems, which dominate various physical and natural phenomena. For popular methods such as streamline visualization to be effective, they should capture the underlying flow features while facilitating user observation and understanding of the flow field in a clear manner. My research mainly focuses on the analysis and visualization of flow fields using various techniques, e.g. information-theoretic techniques and graph-based representations. Since streamline visualization is a popular technique in flow field visualization, how to select good streamlines that capture flow patterns and how to pick good viewpoints from which to observe flow fields become critical. We treat streamline selection and viewpoint selection as symmetric problems and solve them simultaneously using the dual information channel [81]. To the best of my knowledge, this is the first attempt in flow visualization to combine these two selection problems in a unified approach. This work selects streamlines in a view-independent manner, so the selected streamlines do not change across viewpoints. Another work of mine [56] uses an information-theoretic approach to evaluate the importance of each streamline under various sample viewpoints and presents a solution for view-dependent streamline selection that guarantees coherent streamline updates when the view changes gradually. When projecting 3D streamlines onto 2D images for viewing, occlusion and clutter become inevitable. To address this challenge, we design FlowGraph [57, 58], a novel compound graph representation that organizes field line clusters and spatiotemporal regions hierarchically for occlusion-free and controllable visual exploration. We enable observation and exploration of the relationships among field line clusters, spatiotemporal regions, and their interconnections in the transformed space. Most viewpoint selection methods consider only external viewpoints outside the flow field, which does not provide a clear view when the flow field is cluttered near the boundary. Therefore, we propose a new way to explore flow fields by selecting several internal viewpoints around the flow features inside the flow field and then generating a B-spline curve path traversing these viewpoints, providing users with close-up views for detailed observation of hidden or occluded internal flow features [54]. This work has also been extended to unsteady flow fields. Besides flow field visualization, some other topics relevant to visualization also attract my attention. In iGraph [31], we leverage a distributed system along with a tiled display wall to provide users with high-resolution visual analytics of big image and text collections in real time. Developing pedagogical visualization tools forms my other research focus. Since most cryptography algorithms use sophisticated mathematics, it is difficult for beginners to understand both what an algorithm does and how it does it. Therefore, we developed a set of visualization tools to provide users with an intuitive way to learn and understand these algorithms.
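The internal-viewpoint exploration described above hinges on fitting a smooth B-spline camera path through the selected viewpoints. A minimal sketch with SciPy follows; the viewpoint coordinates are hypothetical.

```python
import numpy as np
from scipy.interpolate import splprep, splev

# Internal viewpoints selected around flow features (hypothetical 3D points)
viewpoints = np.array([[0.2, 0.5, 0.1],
                       [0.4, 0.6, 0.3],
                       [0.5, 0.4, 0.5],
                       [0.7, 0.5, 0.6],
                       [0.8, 0.3, 0.8]])

# Fit a cubic B-spline through the viewpoints (s=0 interpolates them exactly)
tck, _ = splprep(viewpoints.T, s=0, k=3)

# Sample the curve densely to obtain a smooth camera fly-through path
u = np.linspace(0, 1, 100)
path = np.stack(splev(u, tck), axis=1)   # (100, 3) camera positions
print(path[:3])
```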
Abstract:
Quantitative characterisation of carotid atherosclerosis and classification into symptomatic or asymptomatic is crucial in planning optimal treatment of atheromatous plaque. The computer-aided diagnosis (CAD) system described in this paper can analyse ultrasound (US) images of the carotid artery and classify them into symptomatic or asymptomatic based on their echogenicity characteristics. The CAD system consists of three modules: a) the feature extraction module, where first-order statistical (FOS) features and Laws' texture energy are estimated; b) the dimensionality reduction module, where the number of features is reduced using analysis of variance (ANOVA); and c) the classifier module, consisting of a neural network (NN) trained by a novel hybrid method based on genetic algorithms (GAs) along with the back-propagation algorithm. The hybrid method is able to select the most robust features, automatically adjust the NN architecture, and optimise the classification performance. The performance is measured by accuracy, sensitivity, specificity, and the area under the receiver operating characteristic (ROC) curve. The CAD design and development is based on images from 54 symptomatic and 54 asymptomatic plaques. This study demonstrates the ability of a CAD system based on US image analysis and a hybrid-trained NN to identify atheromatous plaques at high risk of stroke.
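The three modules map naturally onto standard tooling. A minimal sketch follows, with synthetic features standing in for the FOS and Laws' texture measures; for brevity, the GA/backprop hybrid is replaced by plain backprop training, with the GA wrapper (feature and architecture search) omitted.

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for texture features from the 108 plaque images
X, y = make_classification(n_samples=108, n_features=30, n_informative=8,
                           random_state=0)  # y: symptomatic vs asymptomatic

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

# Module b: dimensionality reduction via the one-way ANOVA F-test
selector = SelectKBest(f_classif, k=10).fit(Xtr, ytr)
Xtr, Xte = selector.transform(Xtr), selector.transform(Xte)

# Module c: NN classifier trained by back-propagation (the paper adds a GA
# wrapper to pick features and architecture, omitted here)
nn = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                   random_state=0).fit(Xtr, ytr)
print("ROC AUC:", roc_auc_score(yte, nn.predict_proba(Xte)[:, 1]))
```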
Abstract:
Let us consider a large set of candidate parameter fields, such as hydraulic conductivity maps, on which we can run an accurate forward flow and transport simulation. We address the issue of rapidly identifying a subset of candidates whose responses best match a reference response curve. In order to keep the number of calls to the accurate flow simulator computationally tractable, a recent distance-based approach relying on fast proxy simulations is revisited and turned into a non-stationary kriging method in which the covariance kernel is obtained by combining a classical kernel with the proxy. Once the accurate simulator has been run for an initial subset of parameter fields and a kriging metamodel has been inferred, the predictive distributions of misfits for the remaining parameter fields can be used as a guide to select candidate parameter fields sequentially. The proposed algorithm, Proxy-based Kriging for Sequential Inversion (ProKSI), relies on a variant of Expected Improvement, a popular criterion for kriging-based global optimization. A statistical benchmark of ProKSI's performance illustrates the efficiency and robustness of the approach when using different kinds of proxies.
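Under a Gaussian kriging predictive distribution, Expected Improvement has the closed form EI = (f* − μ) Φ(z) + σ φ(z) with z = (f* − μ)/σ, where f* is the best misfit found so far and μ, σ are the predictive mean and standard deviation of a candidate's misfit. A minimal sketch of this criterion for misfit minimization follows; the kriging outputs are hypothetical, and ProKSI uses a variant of this baseline.

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, best_misfit):
    """Closed-form EI for minimization: expected reduction below the best
    misfit so far, given kriging predictive mean (mu) and std dev (sigma)."""
    sigma = np.maximum(sigma, 1e-12)        # guard against zero variance
    z = (best_misfit - mu) / sigma
    return (best_misfit - mu) * norm.cdf(z) + sigma * norm.pdf(z)

# Hypothetical predictive misfit distributions for the remaining candidates
mu = np.array([2.0, 1.1, 0.9, 3.5])
sigma = np.array([0.5, 0.8, 0.05, 1.0])
ei = expected_improvement(mu, sigma, best_misfit=1.0)
print("next candidate for an accurate run:", np.argmax(ei))  # highest EI
```

Note how the criterion trades off exploitation (low predicted misfit) against exploration (high predictive uncertainty): here the uncertain candidate with mean 1.1 outscores the confident one with mean 0.9.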