82 results for Image processing -- Digital techniques -- Mathematical models


Relevance: 100.00%

Abstract:

A new approach to the local measurement of residual stress in microstructures is described in this paper. The technique takes advantage of the combined milling and imaging capabilities of focused ion beam (FIB) equipment to scale down the widely known hole-drilling method. This method consists of drilling a small hole in a solid with inherent residual stresses and measuring the strains/displacements caused by the local stress release that takes place around the hole. In the present case, the displacements caused by the milling are determined by applying digital image correlation (DIC) techniques to high-resolution micrographs taken before and after the milling process. The residual stress value is then obtained by fitting the measured displacements to the analytical solution of the displacement fields. The feasibility of this approach has been demonstrated on a micromachined silicon nitride membrane, showing that the method has high potential for applications in the mechanical characterization of micro/nanoelectromechanical systems.
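The abstract gives no implementation details for the DIC step. As an illustrative sketch only (not the authors' method, which resolves sub-pixel displacements over local subsets), the basic displacement estimate between a "before" and an "after" micrograph can be obtained from the peak of their FFT-based cross-correlation; the function name `dic_shift` is hypothetical:

```python
import numpy as np

def dic_shift(before, after):
    """Estimate the integer-pixel displacement of `after` relative to
    `before` from the peak of their cross-correlation (via FFTs)."""
    f = np.fft.fft2(before)
    g = np.fft.fft2(after)
    corr = np.fft.ifft2(np.conj(f) * g).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # indices beyond N/2 correspond to negative shifts (FFT wrap-around)
    return tuple(p if p <= s // 2 else p - s
                 for p, s in zip(peak, corr.shape))
```

Real DIC refines this with sub-pixel interpolation and repeats it over many small subsets to build a full displacement field around the milled hole.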

Relevance: 100.00%

Abstract:

Populations of phase oscillators interacting globally through a general coupling function f(x) are considered. We analyze the conditions required to ensure the existence of a Lyapunov functional, giving closed expressions for it in terms of a generating function. We also propose a family of exactly solvable models with singular couplings, showing that it is possible to map the synchronization phenomenon onto other physical problems. In particular, the stationary solutions of the least singular coupling considered, f(x) = sgn(x), have been found analytically in terms of elliptic functions. This last case is one of the few nontrivial models of synchronization dynamics that can be solved analytically.
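The paper's results for f(x) = sgn(x) are analytical; purely as a numerical illustration (not the paper's method), the globally coupled model with this singular coupling can be integrated directly. All names and parameter values below are hypothetical:

```python
import math
import random

def simulate(n=30, k=1.0, dt=0.01, steps=1500, seed=0):
    """Euler integration of n identical phase oscillators with the
    singular coupling f(x) = sgn(x), extended 2*pi-periodically:
        dtheta_i/dt = (k/n) * sum_j sgn(sin(theta_j - theta_i))
    (identical natural frequencies, written in the rotating frame).
    Returns the Kuramoto order parameter r; r -> 1 means synchrony."""
    rng = random.Random(seed)
    theta = [rng.uniform(-1.5, 1.5) for _ in range(n)]  # half-circle spread
    sgn = lambda x: (x > 0) - (x < 0)
    for _ in range(steps):
        drift = [k / n * sum(sgn(math.sin(tj - ti)) for tj in theta)
                 for ti in theta]
        theta = [ti + dt * di for ti, di in zip(theta, drift)]
    z = sum(complex(math.cos(t), math.sin(t)) for t in theta) / n
    return abs(z)
```

With attractive coupling the population collapses onto a single cluster, consistent with the existence of a Lyapunov functional for this family of couplings.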

Relevance: 100.00%

Abstract:

The present paper aims to provide a general strategic overview of the existing theoretical models that have applications in the field of financial innovation. Whereas most financial developments have relied upon traditional economic tools, a new stream of research is defining a novel paradigm in which mathematical models from diverse scientific disciplines are applied to conceptualize and explain economic and financial behavior. Indeed, terms such as ‘econophysics’ or ‘quantum finance’ have recently appeared to embrace efforts in this direction. As a first contact with such research, the project presents a brief description of some of the main theoretical models that have applications in finance and economics, and tries to present, where possible, potential new applications to particular areas of financial analysis, or new applicable models. As a result, emphasis is put on the implications of this research for the financial sector and its future dynamics.

Relevance: 100.00%

Abstract:

In most geochemical analyses, log-ratio techniques are required to analyse compositional data sets. When a chemical element is present at a low concentration, it is usually identified as a value below the detection limit and added to the data set either as zero or simply by attaching a less-than label. In either case, the occurrence of such concentrations prevents us from applying the log-ratio approach. We review here the theoretical bases of the most recent proposals for dealing with these types of observation, give some advice on their practical application, and illustrate their performance through some examples using geochemical data.
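One widely used proposal of the kind reviewed here is multiplicative replacement: each below-detection-limit zero is replaced by a small fraction of its detection limit, and the observed parts are shrunk multiplicatively so the composition still sums to one, after which log-ratio transforms apply. A minimal sketch (function names and the 0.65 fraction are illustrative choices, not prescribed by the abstract):

```python
import math

def multiplicative_replacement(x, detection_limits, delta_frac=0.65):
    """Replace zeros (below-detection-limit parts) in a composition x
    summing to 1 by delta = delta_frac * detection limit, rescaling the
    observed parts so the total remains 1."""
    deltas = [delta_frac * dl if xi == 0 else 0.0
              for xi, dl in zip(x, detection_limits)]
    total_delta = sum(deltas)
    return [d if d > 0 else xi * (1 - total_delta)
            for xi, d in zip(x, deltas)]

def clr(x):
    """Centred log-ratio transform; defined only for strictly positive
    parts, which is why the replacement step is needed first."""
    g = math.exp(sum(math.log(xi) for xi in x) / len(x))
    return [math.log(xi / g) for xi in x]
```

The replaced composition is strictly positive, so the clr (or any other log-ratio) transform can then be applied without special cases.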

Relevance: 100.00%

Abstract:

In this work, a new one-class classification ensemble strategy called the approximate polytope ensemble is presented. The main contribution of the paper is threefold. First, the geometrical concept of the convex hull is used to define the boundary of the target class that defines the problem. Expansions and contractions of this geometrical structure are introduced in order to avoid over-fitting. Second, the decision whether a point belongs to the convex hull model in high-dimensional spaces is approximated by means of random projections and an ensemble decision process. Finally, a tiling strategy is proposed in order to model non-convex structures. Experimental results show that the proposed strategy is significantly better than state-of-the-art one-class classification methods on over 200 datasets.
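The random-projection idea rests on a simple fact: the projection of a point inside the hull lies inside the projected hull, so a point falling outside the hull in any projection is certainly outside the hull. A toy sketch using 1-D projections (the paper projects to low dimensions; the names, the 1-D simplification, and the `expand` factor standing in for the paper's expansions/contractions are all assumptions):

```python
import random

def train(points, dim, n_proj=200, expand=1.0, seed=0):
    """Approximate convex-hull model: store the (optionally expanded)
    1-D extent of the training points along random directions."""
    rng = random.Random(seed)
    model = []
    for _ in range(n_proj):
        w = [rng.gauss(0, 1) for _ in range(dim)]
        proj = [sum(wi * xi for wi, xi in zip(w, p)) for p in points]
        lo, hi = min(proj), max(proj)
        mid, half = (lo + hi) / 2, (hi - lo) / 2 * expand
        model.append((w, mid - half, mid + half))
    return model

def inside(model, p):
    """Accept only points inside every projected interval; this is the
    ensemble decision approximating convex-hull membership."""
    return all(lo <= sum(wi * xi for wi, xi in zip(w, p)) <= hi
               for w, lo, hi in model)
```

Setting `expand` above or below 1 mimics the expansions and contractions the paper uses to control over-fitting of the target-class boundary.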

Relevance: 100.00%

Abstract:

The aim of this contribution is to illustrate the state of the art of smart antenna research from several perspectives. The arc spans from transmitter issues via channel measurements and modeling, receiver signal processing, network aspects, and technological challenges to the first smart antenna applications and the current status of standardization. Moreover, some future prospects for the different disciplines of smart antenna research are given.

Relevance: 100.00%

Abstract:

This correspondence addresses the problem of non-data-aided waveform estimation for digital communications. Based on the unconditional maximum likelihood criterion, the main contribution of this correspondence is the derivation of a closed-form solution to the waveform estimation problem in the low signal-to-noise ratio regime. The proposed estimation method is based on the second-order statistics of the received signal, and a clear link is established between maximum likelihood estimation and correlation matching techniques. Compression with the signal subspace is also proposed to improve robustness against noise and to mitigate the impact of abnormal observations or outliers.

Relevance: 100.00%

Abstract:

Background: Current advances in genomics, proteomics and other areas of molecular biology make the identification and reconstruction of novel pathways an emerging area of great interest. One such class of pathways is involved in the biogenesis of iron-sulfur clusters (ISC). Results: Our goal is the development of a new approach based on the use and combination of mathematical, theoretical and computational methods to identify the topology of a target network. In this approach, mathematical models play a central role in the evaluation of the alternative network structures that arise from literature data-mining, phylogenetic profiling, structural methods, and human curation. As a test case, we reconstruct the topology of the reaction and regulatory network for the mitochondrial ISC biogenesis pathway in S. cerevisiae. Predictions regarding how proteins act in ISC biogenesis are validated by comparison with published experimental results. For example, the predicted roles of Arh1 and Yah1, and some of the interactions we predict for Grx5, match experimental evidence. A putative role for frataxin in directly regulating mitochondrial iron import is discarded by our analysis, which also agrees with published experimental results. Additionally, we propose a number of experiments for testing other predictions and further improving the identification of the network structure. Conclusion: We propose and apply an iterative in silico procedure for the predictive reconstruction of the network topology of metabolic pathways. The procedure combines structural bioinformatics tools and mathematical modeling techniques that allow the reconstruction of biochemical networks. Using iron-sulfur cluster biogenesis in S. cerevisiae as a test case, we indicate how this procedure can be used to analyze and validate the network model against experimental results.
Critical evaluation of the results obtained through this procedure allows devising new wet-lab experiments to confirm its predictions or to provide alternative explanations for further improving the models.

Relevance: 100.00%

Abstract:

A method for optimizing the strength of a parametric phase mask for a wavefront coding imaging system is presented. The method is based on an optimization process that minimizes a proposed merit function. The goal is to achieve modulation transfer function invariance while quantitatively maintaining final image fidelity. A parametric filter that copes with the noise present in the captured images is used to obtain the final images, and this filter is optimized. The whole process results in optimum phase mask strength and optimal parameters for the restoration filter. The results for a particular optical system are presented and tested experimentally in the laboratory. The experimental results show good agreement with the simulations, indicating that the procedure is useful.

Relevance: 100.00%

Abstract:

The Cherenkov light flashes produced by extensive air showers are very short in time. A high-bandwidth, fast-digitizing readout can therefore minimize the influence of the background from the light of the night sky and improve the performance of Cherenkov telescopes. The time structure of the Cherenkov image can further be used in single-dish Cherenkov telescopes as an additional parameter to reduce the background from unwanted hadronic showers. A description of an analysis method that makes use of the time information, and of the subsequent improvement in the performance of the MAGIC telescope (especially after the upgrade with an ultra-fast 2 GSamples/s digitization system in February 2007), is presented. The use of timing information in the analysis of the new MAGIC data reduces the background by a factor of two, which in turn results in an enhancement of about a factor of 1.4 in the flux sensitivity to point-like sources, as tested on observations of the Crab Nebula.

Relevance: 100.00%

Abstract:

The CORNISH project is the highest-resolution radio continuum survey of the Galactic plane to date. It is the 5 GHz radio continuum part of a series of multi-wavelength surveys that focus on the northern GLIMPSE region (10° < l < 65°), observed by the Spitzer satellite in the mid-infrared. Observations with the Very Large Array in B and BnA configurations have yielded a 1.5'' resolution Stokes I map with a root mean square noise level better than 0.4 mJy beam⁻¹. Here we describe the data-processing methods and data characteristics, and present a new, uniform catalog of compact radio emission. This includes an implementation of automatic deconvolution that provides much more reliable imaging than standard CLEANing. A rigorous investigation of the noise characteristics and the reliability of source detection has been carried out. We show that the survey is optimized to detect emission on size scales up to 14'', and that for unresolved sources the catalog is more than 90% complete at a flux density of 3.9 mJy. We have detected 3062 sources above a 7σ detection limit and present their ensemble properties. The catalog is highly reliable away from regions containing poorly sampled extended emission, which comprise less than 2% of the survey area. Imaging problems have been mitigated by down-weighting the shortest spacings, and potential artifacts have been flagged via a rigorous manual inspection with reference to the Spitzer infrared data. We present images of the most common source types found: H II regions, planetary nebulae, and radio galaxies. The CORNISH data and catalog are available online at http://cornish.leeds.ac.uk.

Relevance: 100.00%

Abstract:

The aim of this study is to define a new statistic, PVL, based on the relative distance between the likelihood associated with the simulation replications and the likelihood of the conceptual model. Our results, coming from several simulation experiments of a clinical trial, show that the range of the PVL statistic can be a good measure of stability for establishing when a computational model verifies the underlying conceptual model. PVL also improves the analysis of simulation replications because only one statistic is associated with all the replications. The study also presents several verification scenarios, obtained by altering the simulation model, that show the usefulness of PVL. Further simulation experiments suggest that a 0 to 20% range may define adequate limits for the verification problem, if considered from the viewpoint of an equivalence test.

Relevance: 100.00%

Abstract:

The causal mechanism and seasonal evolution of the internal wave field in a deep, warm, monomictic reservoir are examined through the analysis of field observations and numerical techniques. The study period extends from the onset of thermal stratification in the spring until midsummer in 2005. During this time, wind forcing was periodic, with a period of 24 h (typical of land–sea breezes), and the thermal structure in the lake was characterized by the presence of a shallow surface layer overlying a thick metalimnion, typical of small to medium-sized reservoirs with deep outtakes. Basin-scale internal seiches of high vertical mode (ranging from mode V3 to V5) were observed in the metalimnion. The structure of the dominant modes of oscillation changed as stratification evolved on seasonal timescales, but in all cases their periods were close to that of the local wind forcing (i.e., 24 h), suggesting a resonant response. Nonresonant oscillatory modes of type V1 and V2 became dominant after large frontal events, which disrupted the diurnal periodicity of the wind forcing.

Relevance: 100.00%

Abstract:

Phase-encoded nanostructures such as Quick Response (QR) codes made of metallic nanoparticles are suggested for use in security and authentication applications. We present a polarimetric optical method able to authenticate random phase-encoded QR codes. The system is illuminated using polarized light, and the QR code is encoded using a phase-only random mask. Using classification algorithms, it is possible to validate the QR code from the examination of the polarimetric signature of the speckle pattern. We used the Kolmogorov–Smirnov statistical test and Support Vector Machine algorithms to authenticate the phase-encoded QR codes using polarimetric signatures.
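The Kolmogorov–Smirnov test mentioned here compares the empirical distributions of two samples (in the paper, polarimetric speckle signatures of a genuine and a questioned code). A minimal, self-contained sketch of the two-sample statistic itself (the data and function name are hypothetical; the paper's classification pipeline is not reproduced):

```python
import bisect

def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum absolute
    difference between the empirical CDFs of samples a and b."""
    a, b = sorted(a), sorted(b)
    d = 0.0
    for v in sorted(set(a) | set(b)):
        fa = bisect.bisect_right(a, v) / len(a)  # empirical CDF of a at v
        fb = bisect.bisect_right(b, v) / len(b)  # empirical CDF of b at v
        d = max(d, abs(fa - fb))
    return d
```

A small statistic indicates the two signatures could come from the same distribution (authentic); a large one flags a mismatch, which is then typically confirmed against a decision threshold or a trained classifier.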

Relevance: 100.00%

Abstract:

In order to develop applications for visual interpretation of medical images, the early detection and evaluation of microcalcifications in digital mammograms is very important, since their presence is often associated with a high incidence of breast cancers. Accurate classification into benign and malignant groups would help improve diagnostic sensitivity as well as reduce the number of unnecessary biopsies. The challenge here is the selection of features useful for distinguishing benign from malignant microcalcifications. Our purpose in this work is to analyse a microcalcification evaluation method based on a set of shape-based features extracted from the digitised mammogram. The segmentation of the microcalcifications is performed using a fixed-tolerance region growing method to extract the boundaries of calcifications from manually selected seed pixels. Taking into account that the shapes and sizes of clustered microcalcifications have been associated with a high risk of carcinoma based on different subjective measures, such as whether or not the calcifications are irregular, linear, vermiform, branched, rounded or ring-like, our efforts were addressed to obtaining a feature set related to shape. The identification of the parameters concerning the malignant character of the microcalcifications was performed on a set of 146 mammograms whose real diagnosis was known in advance from biopsies. This allowed the following shape-based parameters to be identified as the relevant ones: Number of clusters, Number of holes, Area, Feret elongation, Roughness, and Elongation. Further experiments on a set of 70 new mammograms showed that the performance of the classification scheme is close to the mean performance of three expert radiologists, which allows the proposed method to be considered for assisting diagnosis and encourages continued investigation in the sense of adding new features not only related to shape.