989 results for Ancestral range estimation


Relevância:

20.00%

Publicador:

Resumo:

This review summarizes the development of exclusion chromatography, also termed gel filtration, molecular-sieve chromatography and gel permeation chromatography, for the quantitative characterization of solutes and solute interactions. As well as affording a means of determining molecular mass and molecular mass distribution, the technique offers a convenient way of characterizing solute self-association and solute-ligand interactions in terms of reaction stoichiometry and equilibrium constant. The availability of molecular-sieve media with different selective porosities ensures that very little restriction is imposed on the size of solute amenable to study. Furthermore, access to a diverse array of assay procedures for monitoring the column eluate endows analytical exclusion chromatography with far greater flexibility than other techniques from the viewpoint of the solute concentration range that can be examined. In addition to its widely recognized prowess as a means of solute separation and purification, exclusion chromatography thus also possesses considerable potential for investigating the functional roles of the purified solutes. (C) 2003 Elsevier Science B.V. All rights reserved.
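As a brief illustration of the quantitative basis of the technique (a standard gel-filtration relation, not specific to this review), elution behaviour is commonly expressed through the partition coefficient

K_av = (V_e - V_0) / (V_t - V_0),

where V_e is the solute elution volume, V_0 the void (exclusion) volume and V_t the total bed volume. Over the fractionation range of a given medium, K_av varies approximately linearly with the logarithm of molecular mass, which is what allows mass to be read from a calibration curve, while concentration-dependent shifts in V_e provide the signal used to quantify self-association and ligand binding.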

Relevância:

20.00%

Publicador:

Resumo:

There has been a resurgence of interest in the mean trace length estimator of Pahl for window sampling of traces. The estimator has been dealt with by Mauldon, and by Zhang and Einstein, in recent publications. The estimator is a very useful one in that it is non-parametric. However, despite some discussion regarding the statistical distribution of the estimator, none of the recent works or the original work by Pahl provides a rigorous basis for the determination of a confidence interval for the estimator, or of a confidence region for the estimator and the corresponding estimator of trace spatial intensity in the sampling window. This paper shows, by consideration of a simplified version of the problem but without loss of generality, that the estimator is in fact the maximum likelihood estimator (MLE) and that it can be considered essentially unbiased. As the MLE, it possesses the least variance of all estimators, and confidence intervals or regions should therefore be available through application of classical ML theory. It is shown that valid confidence intervals can in fact be determined. The results of the work and the calculations of the confidence intervals are illustrated by example. (C) 2003 Elsevier Science Ltd. All rights reserved.
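As a generic illustration of how classical ML theory supplies such intervals, the sketch below fits a deliberately hypothetical censored-exponential trace-length model (not the window-sampling likelihood analysed in the paper) and forms a Wald-type confidence interval from the MLE and the numerically evaluated observed information (Python, numpy assumed):

import numpy as np

# Hypothetical data: exponential trace lengths right-censored at a window dimension h.
rng = np.random.default_rng(1)
h, true_mu = 5.0, 3.0
full = rng.exponential(true_mu, 200)
lengths = np.minimum(full, h)
censored = full >= h

def loglik(mu):
    # density terms for fully observed traces, survival terms P(L > h) for censored ones
    ll = -lengths[~censored].sum() / mu - (~censored).sum() * np.log(mu)
    ll += -lengths[censored].sum() / mu
    return ll

mu_hat = lengths.sum() / (~censored).sum()          # closed-form MLE for this toy model
eps = 1e-4
info = -(loglik(mu_hat + eps) - 2 * loglik(mu_hat) + loglik(mu_hat - eps)) / eps**2
se = 1.0 / np.sqrt(info)                            # Wald standard error from the observed information
print(mu_hat, (mu_hat - 1.96 * se, mu_hat + 1.96 * se))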

Relevância:

20.00%

Publicador:

Resumo:

A number of authors concerned with the analysis of rock jointing have used the idea that the joint areal or diametral distribution can be linked to the trace length distribution through a theorem attributed to Crofton. This brief paper seeks to demonstrate why Crofton's theorem need not be used to link moments of the trace length distribution captured by scan line or areal mapping to the moments of the diametral distribution of joints represented as disks, and why it is incorrect to do so. The valid relationships, for areal and scan line mapping, between all the moments of the trace length distribution and those of the joint size distribution for joints modeled as disks are recalled and compared with those that would follow were Crofton's theorem assumed to apply. For areal mapping the relationship is fortuitously correct, but for scan line mapping it is not.
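For the areal-mapping case, the standard size-biased argument (a textbook derivation, not reproduced from the paper) runs as follows. A disk of diameter D intersected by the mapping plane leaves a trace (chord) of length l = 2 sqrt((D/2)^2 - x^2), where x is the in-plane distance from the disk centre to the chord. With disk centres uniformly distributed in space, x is, conditional on intersection, uniform on (-D/2, D/2), so

E[l | D] = (1/D) ∫_{-D/2}^{D/2} 2 sqrt((D/2)^2 - x^2) dx = (pi/4) D.

Since a disk is intersected by the mapping plane with probability proportional to D, areal mapping samples diameters with a size bias, and therefore

E[l] = E[(pi/4) D · D] / E[D] = (pi/4) E[D^2] / E[D].

This is the relationship that happens to agree with the Crofton-based expression for areal mapping; no such agreement should be expected for scan line mapping, where the sampled traces are additionally biased in proportion to their own length.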

Relevância:

20.00%

Publicador:

Resumo:

The use of a fitted parameter watershed model to address water quantity and quality management issues requires that it be calibrated under a wide range of hydrologic conditions. However, rarely does model calibration result in a unique parameter set. Parameter nonuniqueness can lead to predictive nonuniqueness. The extent of model predictive uncertainty should be investigated if management decisions are to be based on model projections. Using models built for four neighboring watersheds in the Neuse River Basin of North Carolina, the application of the automated parameter optimization software PEST in conjunction with the Hydrological Simulation Program-FORTRAN (HSPF) is demonstrated. Parameter nonuniqueness is illustrated, and a method is presented for calculating many different sets of parameters, all of which acceptably calibrate a watershed model. A regularization methodology is discussed in which models for similar watersheds can be calibrated simultaneously. Using this method, parameter differences between watershed models can be minimized while maintaining the fit between model outputs and field observations. In recognition of the fact that parameter nonuniqueness and predictive uncertainty are inherent to the modeling process, PEST's nonlinear predictive analysis functionality is then used to explore the extent of model predictive uncertainty.
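The regularisation idea can be sketched outside the PEST/HSPF tool chain with a toy example (hypothetical model functions, parameters and data, written in Python with scipy): two neighbouring "watershed" models are calibrated jointly while a penalty term keeps their parameter sets close, mirroring the simultaneous-calibration strategy described above.

import numpy as np
from scipy.optimize import least_squares

# Two hypothetical toy "watershed models": parameters p = (recession rate, scale).
def model_a(p, t):
    return p[1] * np.exp(-p[0] * t)

def model_b(p, t):
    return p[1] * np.exp(-p[0] * t) * (1 + 0.1 * np.sin(t))

t = np.linspace(0, 10, 50)
rng = np.random.default_rng(0)
obs_a = model_a([0.30, 2.0], t) + rng.normal(0, 0.02, t.size)   # synthetic "field observations"
obs_b = model_b([0.32, 2.1], t) + rng.normal(0, 0.02, t.size)

mu = 0.5   # regularisation weight on inter-watershed parameter differences

def residuals(p):
    pa, pb = p[:2], p[2:]
    fit = np.concatenate([model_a(pa, t) - obs_a, model_b(pb, t) - obs_b])
    reg = np.sqrt(mu) * (pa - pb)     # penalise differences between the two parameter sets
    return np.concatenate([fit, reg])

sol = least_squares(residuals, x0=[0.1, 1.0, 0.1, 1.0])
print(sol.x)   # calibrated parameters for both watersheds, kept close by the penalty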

Relevância:

20.00%

Publicador:

Resumo:

Breast cancer is the main malignant neoplasm affecting women in Brazil. It is today a disease of major importance for national public health, motivating broad discussion of measures to promote its early diagnosis and to reduce its morbidity and mortality. This research has three objectives, whose results are organized as articles. The first objective was to analyse the completeness of Mortality Information System data on deaths from breast cancer in women in Espírito Santo, the Southeast region and Brazil (1998 to 2007). A descriptive analytical study based on secondary data was carried out, in which the absolute number and percentage of unfilled variables on death certificates were analysed. A score was adopted to grade the degree of incompleteness. The results for the variables sex and age were excellent for Espírito Santo, the Southeast and Brazil alike. Completion of the variables race/colour, level of schooling and marital status shows problems in Espírito Santo: while in the Southeast and Brazil the variables race/colour and schooling show a decreasing trend in incompleteness, in Espírito Santo the trend remains stable, and for the variable marital status incompleteness shows an increasing trend in the state. The second objective was to analyse the evolution of breast cancer mortality rates in women in Espírito Santo from 1980 to 2007. This was a time-series study, with death data obtained from the Mortality Information System and population estimates by age and calendar year from the Instituto Brasileiro de Geografia e Estatística (IBGE). Age-specific mortality coefficients were calculated annually. Trend analysis was carried out by standardizing the mortality rates by the direct method, taking the IBGE 2000 census population as the standard. During the study period there were 2,736 deaths from breast cancer, and the mortality coefficient ranged from 3.41 to 10.99 per 100,000 women. The results indicate an upward trend in breast cancer mortality over the series (p=0.001, with growth of 75.42%). All age groups from 30 years onwards showed a statistically significant upward mortality trend (p=0.001). The growth percentages increased with age, from 48.4% in the 40 to 49 age group to 92.3% in the group aged 80 and over. The third objective was to carry out a spatial analysis of deaths from breast cancer in women in the state of Espírito Santo between 2003 and 2007, with analysis of the spatial correlations of this mortality and of municipal components. The setting was the state of Espírito Santo, comprising 78 municipalities. For data analysis, a Bayesian approach (EBest Global and EBest Local methods) was used to correct the epidemiological rates. Moran's I index was calculated for spatial dependence at the global level, together with the Local Moran statistic. The highest rates are concentrated in 19 municipalities belonging to the following microregions: Metropolitana (Fundão, Vitória, Vila Velha, Viana, Cariacica and Guarapari), Metrópole Expandida Sul (Anchieta, Alfredo Chaves), Pólo Cachoeiro (Vargem Alta, Rio Novo do Sul, Mimoso do Sul, Cachoeiro de Itapemirim, Castelo, Jerônimo Monteiro, Bom Jesus do Norte, Apiacá and Muqui) and Caparaó (Alegre and São José do Calçado). The results of the Bayesian estimation (Moran index) of deaths from breast cancer in women in the state of Espírito Santo, based on the raw and adjusted data, indicate significant spatial correlation for the Local map (I = 0.573; p = 0.001) and the Global map (I = 0.118; p = 0.039). The raw data show no spatial correlation (I = 0.075; p = 0.142).
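As an illustration of the global spatial-dependence statistic used in the third study (a generic implementation of Moran's I in Python; the contiguity weights and rates below are hypothetical, not the study's data):

import numpy as np

def morans_i(x, w):
    # Global Moran's I for values x (length n) and spatial weights matrix w (n x n, zero diagonal).
    x = np.asarray(x, dtype=float)
    z = x - x.mean()
    return (x.size / w.sum()) * (w * np.outer(z, z)).sum() / (z ** 2).sum()

# Hypothetical contiguity weights for 4 municipalities and smoothed mortality rates per 100,000 women
w = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
rates = np.array([10.9, 9.8, 4.1, 3.5])
print(morans_i(rates, w))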

Relevância:

20.00%

Publicador:

Resumo:

For modern consumer cameras, approximate calibration data is often available, making applications such as 3D reconstruction or photo registration easier compared to the purely uncalibrated setting. In this paper we address the setting of calibrated-uncalibrated image pairs: for one image the intrinsic parameters are assumed to be known, whereas the second view has unknown distortion and calibration parameters. This situation arises, e.g., when one would like to register archive imagery to recently taken photos. A commonly adopted strategy for determining epipolar geometry is based on feature matching and minimal solvers inside a RANSAC framework. However, only very few existing solutions apply to the calibrated-uncalibrated setting. We propose a simple and numerically stable two-step scheme to first estimate the radial distortion parameters and subsequently the focal length using novel solvers. We demonstrate the performance on synthetic and real datasets.
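The abstract does not state which radial distortion model the solvers assume; the one-parameter division model is a common choice in radial-distortion minimal solvers and is sketched here purely as an illustration (hypothetical distortion parameter and point coordinates):

import numpy as np

def undistort_division(points, lam):
    # One-parameter division model: x_u = x_d / (1 + lam * r^2), with x_d centred on the distortion centre.
    r2 = np.sum(points ** 2, axis=1, keepdims=True)
    return points / (1.0 + lam * r2)

# Centred pixel coordinates of matched features in the uncalibrated view (illustrative values)
pts = np.array([[120.0, -80.0], [-200.0, 150.0]])
print(undistort_division(pts, lam=-1e-7))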

Relevância:

20.00%

Publicador:

Resumo:

Within the development of motor vehicles, crash safety (e.g. occupant protection, pedestrian protection, low speed damageability) is one of the most important attributes. In order to fulfill increasing requirements under shorter cycle times and rising pressure to reduce costs, car manufacturers keep intensifying the use of virtual development tools such as those in the domain of Computer Aided Engineering (CAE). For crash simulations, the explicit finite element method (FEM) is applied. The accuracy of the simulation process is highly dependent on the accuracy of the simulation model, including the midplane mesh. One of the roughest approximations typically made concerns the actual part thickness, which in reality can vary locally. However, for reasons of complexity, a constant thickness value is almost always defined throughout the entire part. On the other hand, correct consideration of thickness is a key enabler of precise fracture analysis within FEM. Thus, the availability of per-element thickness information, which does not exist explicitly in the FEM model, can significantly contribute to an improved crash simulation quality, especially regarding fracture prediction. Even though the thickness is not explicitly available from the FEM model, it can be inferred from the original CAD geometric model through geometric calculations. This paper proposes and compares two thickness estimation algorithms based on ray tracing and nearest neighbour 3D range searches. A systematic quantitative analysis of the accuracy of both algorithms is presented, as well as a thorough identification of the particular geometric arrangements under which their accuracy can be compared. These results enable the identification of each technique's weaknesses and hint towards a new, integrated approach to the problem that linearly combines the estimates produced by each algorithm.
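A minimal sketch of the nearest-neighbour flavour of thickness estimation (illustrative only; the paper's ray-tracing algorithm, acceleration structures and accuracy analysis are not reproduced). For each midplane sample point, the closest CAD surface points on either side of the midplane, judged by the sign of their offset along the element normal, are found with a k-d tree and their distances summed:

import numpy as np
from scipy.spatial import cKDTree

def thickness_nearest_neighbour(midplane_points, normals, surface_points, max_offset=10.0):
    # midplane_points (n,3), unit normals (n,3), surface_points (m,3): numpy arrays in consistent units.
    tree = cKDTree(surface_points)
    thickness = np.empty(len(midplane_points))
    for i, (p, n) in enumerate(zip(midplane_points, normals)):
        idx = tree.query_ball_point(p, r=max_offset)      # candidate surface points near p
        if not idx:
            thickness[i] = np.nan
            continue
        d = surface_points[idx] - p
        side = d @ n                                       # signed offset along the element normal
        above = np.linalg.norm(d[side > 0], axis=1)
        below = np.linalg.norm(d[side < 0], axis=1)
        thickness[i] = (above.min() if above.size else np.nan) + \
                       (below.min() if below.size else np.nan)
    return thickness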

Relevância:

20.00%

Publicador:

Resumo:

The aim of this paper was to estimate the return on investment in QMS (quality management systems) certification undertaken by Portuguese firms, according to the ISO 9000 series. A total of 426 certified Portuguese firms were surveyed, with a response rate of 61.03 percent. The different payback periods were validated through statistical analysis and the relationship between expected and perceived payback periods was discussed. This study suggests that a firm's sector of activity, size and degree of internationalization are related to the length of the period needed to recover the investment in QMS certification. Furthermore, our findings suggest that the time taken to obtain the certification is not directly related to the economic component of the certification. The majority of Portuguese firms (58.9%) took up to three years to recoup their investment, and 35.5% of companies said they had not yet recovered the initial investment made. The recovery of the investment was measured by the increase in the number of customers and consequent volume of deliveries, improved profitability and productivity of the company, improvement of competitive position and performance (cost savings), reduction in the number of external complaints and internal defects/scrap, and the acquisition of some important clients, among other factors. We compared our work to similar studies undertaken in other countries. This paper contributes to the research on the return on investment in QMS certification according to ISO 9000 and is one of the first studies to undertake this type of analysis in Portugal.

Relevância:

20.00%

Publicador:

Resumo:

In this paper, we present a method for estimating the local thickness distribution in finite element models, applied to injection molded and cast engineering parts. This method features considerably improved performance compared to two previously proposed approaches, and has been validated against thickness values measured by different human operators. We also demonstrate that using this method to assign a distribution of local thickness in FEM crash simulations results in a much more accurate prediction of the real part performance, thereby increasing the benefits of computer simulations in engineering design by enabling zero-prototyping and reducing product development costs. The simulation results have been compared to experimental tests, evidencing the advantage of the proposed method. Thus, the proposed approach to considering local thickness distribution in FEM crash simulations has high potential in the product development process of complex and highly demanding injection molded and cast parts, and is currently being used by Ford Motor Company.

Relevância:

20.00%

Publicador:

Resumo:

Minimally invasive cardiovascular interventions guided by multiple imaging modalities are rapidly gaining clinical acceptance for the treatment of several cardiovascular diseases. These images are typically fused with richly detailed pre-operative scans through registration techniques, enhancing the intra-operative clinical data and easing the image-guided procedures. Nonetheless, rigid models have been used to align the different modalities, not taking into account the anatomical variations of the cardiac muscle throughout the cardiac cycle. In the current study, we present a novel strategy to compensate for the beat-to-beat physiological adaptation of the myocardium. To this end, we aim to show that a complete myocardial motion field can be quickly recovered from the displacement field at the myocardial boundaries, therefore being an efficient strategy to locally deform the cardiac muscle. We address this hypothesis by comparing three different strategies to recover a dense myocardial motion field from a sparse one, namely a diffusion-based approach, thin-plate splines, and multiquadric radial basis functions. Two experimental setups were used to validate the proposed strategy. First, an in silico validation was carried out on synthetic motion fields obtained from two realistic simulated ultrasound sequences. Then, 45 mid-ventricular 2D cine magnetic resonance imaging sequences were processed to further evaluate the different approaches. The results showed that accurate boundary tracking combined with dense myocardial recovery via interpolation/diffusion is a potentially viable solution to speed up dense myocardial motion field estimation and, consequently, to deform/compensate the myocardial wall throughout the cardiac cycle. Copyright © 2015 John Wiley & Sons, Ltd.
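A minimal sketch of the sparse-to-dense recovery step using thin-plate-spline radial basis functions (scipy's RBFInterpolator; the boundary points, displacements and grid below are hypothetical, and the paper's diffusion-based and multiquadric variants are not shown):

import numpy as np
from scipy.interpolate import RBFInterpolator

# Hypothetical sparse displacements at myocardial boundary points (2D)
boundary_pts = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0], [0.5, -0.1]])
boundary_disp = np.array([[0.02, 0.00], [0.01, 0.01], [-0.01, 0.02], [0.00, 0.01], [0.02, -0.01]])

# Thin-plate-spline interpolation of both displacement components at once
interp = RBFInterpolator(boundary_pts, boundary_disp, kernel='thin_plate_spline')

# Dense sampling of the region of interest (illustrative rectangular grid)
gx, gy = np.meshgrid(np.linspace(0, 1, 20), np.linspace(0, 1, 20))
dense_pts = np.column_stack([gx.ravel(), gy.ravel()])
dense_disp = interp(dense_pts)      # (400, 2) dense motion field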

Relevância:

20.00%

Publicador:

Resumo:

The success of dental implant-supported prostheses is directly linked to the accuracy obtained during the implant's pose estimation (position and orientation). Although traditional impression techniques and recent digital acquisition methods are acceptably accurate, a simultaneously fast, accurate and operator-independent methodology is still lacking. To this end, an image-based framework is proposed to estimate the patient-specific implant's pose using cone-beam computed tomography (CBCT) and prior knowledge of the implanted model. The pose estimation is accomplished in a three-step approach: (1) a region of interest is extracted from the CBCT data using 2 operator-defined points at the implant's main axis; (2) a simulated CBCT volume of the known implanted model is generated through Feldkamp-Davis-Kress reconstruction and coarsely aligned to the defined axis; and (3) a voxel-based rigid registration is performed to optimally align both patient and simulated CBCT data, extracting the implant's pose from the optimal transformation. Three experiments were performed to evaluate the framework: (1) an in silico study using 48 implants distributed through 12 three-dimensional synthetic mandibular models; (2) an in vitro study using an artificial mandible with 2 dental implants acquired with an i-CAT system; and (3) two clinical case studies. The results showed positional errors of 67±34 μm and 108 μm, and angular misfits of 0.15±0.08º and 1.4º, for experiments 1 and 2, respectively. Moreover, in experiment 3, visual assessment of the clinical results showed a coherent alignment of the reference implant. Overall, a novel image-based framework for implant pose estimation from CBCT data was proposed, showing accurate results in agreement with dental prosthesis modelling requirements.
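A hedged sketch of the final voxel-based rigid registration step using SimpleITK (the paper does not state its implementation, similarity metric or optimiser; the file names and settings below are hypothetical):

import SimpleITK as sitk

# Hypothetical inputs: the patient CBCT region of interest and the simulated CBCT of the implant model
fixed = sitk.ReadImage("patient_cbct_roi.nii.gz", sitk.sitkFloat32)
moving = sitk.ReadImage("simulated_implant_cbct.nii.gz", sitk.sitkFloat32)

initial = sitk.CenteredTransformInitializer(fixed, moving, sitk.Euler3DTransform(),
                                            sitk.CenteredTransformInitializerFilter.GEOMETRY)

reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
reg.SetOptimizerAsRegularStepGradientDescent(learningRate=1.0, minStep=1e-4, numberOfIterations=200)
reg.SetInterpolator(sitk.sitkLinear)
reg.SetInitialTransform(initial, inPlace=False)

final_transform = reg.Execute(fixed, moving)   # rigid transform from which the implant pose is read off
print(final_transform)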

Relevância:

20.00%

Publicador:

Resumo:

The motivation for this work comes from the author's need to record the notes played on the guitar while improvising. When improvising on the guitar, the musician often does not remember the notes just played. This work concerns the development of an application for guitarists that records the notes played on an electric or classical guitar. The signal is acquired from the guitar and processed under real-time requirements for signal capture. The notes produced by the electric guitar connected to the computer are represented as tablature and/or a musical score. To this end, the application captures the signal coming from the electric guitar through the computer's sound card and uses frequency-detection algorithms and note-duration estimation algorithms to build the record of the notes played. The application is developed with a multi-platform perspective and can run on different Windows and Linux operating systems, using public-domain tools and libraries. The results obtained show that the guitar can be tuned with errors of about 2 Hz relative to the standard tuning frequencies. The tablature output gives satisfactory results, but it can be improved. This will require better implementation of signal-processing techniques, as well as better inter-process communication, to resolve the problems found in the tests performed.
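A minimal sketch of the kind of frequency detection involved (generic autocorrelation-based pitch estimation in Python with a synthetic test tone; not the application's actual algorithms, which also handle real-time capture and note-duration estimation):

import numpy as np

def detect_pitch(frame, fs):
    # Estimate the fundamental frequency of one audio frame by autocorrelation.
    frame = frame - frame.mean()
    corr = np.correlate(frame, frame, mode='full')[len(frame) - 1:]
    min_lag = fs // 1000                    # reject fundamentals above roughly 1 kHz
    lag = min_lag + np.argmax(corr[min_lag:])
    return fs / lag

def nearest_note(freq, a4=440.0):
    # Map a frequency to the nearest equal-tempered note name and its reference frequency.
    names = ['A', 'A#', 'B', 'C', 'C#', 'D', 'D#', 'E', 'F', 'F#', 'G', 'G#']
    n = int(round(12 * np.log2(freq / a4)))
    return names[n % 12], a4 * 2 ** (n / 12.0)

fs = 44100
t = np.arange(0, 0.05, 1 / fs)
frame = np.sin(2 * np.pi * 196.0 * t)       # synthetic open G string tone (G3)
f0 = detect_pitch(frame, fs)
print(f0, nearest_note(f0))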

Relevância:

20.00%

Publicador:

Resumo:

The portfolio generating the iTraxx EUR index is modeled by coupled Markov chains. Each of the industries in the portfolio evolves according to its own Markov transition matrix. Using a variant of the method of moments, the model parameters are estimated from a Standard and Poor's data set. Swap spreads are evaluated by Monte Carlo simulations. Along with an actuarially fair spread, a least-squares spread is considered.
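A hedged sketch of the Monte Carlo spread evaluation (illustrative only: the transition matrix, portfolio size and horizon are hypothetical, discounting is omitted, and the names are simulated independently rather than through the coupled Markov chains used in the paper):

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-state annual rating chain per name: 0 = investment grade, 1 = speculative, 2 = default (absorbing)
P = np.array([[0.92, 0.07, 0.01],
              [0.10, 0.83, 0.07],
              [0.00, 0.00, 1.00]])

n_names, n_years, n_paths = 125, 5, 20000
recovery, notional = 0.4, 1.0 / 125          # equal-weighted index

premium_leg = np.zeros(n_paths)              # risky annuity per unit spread (undiscounted)
protection_leg = np.zeros(n_paths)

for k in range(n_paths):
    state = rng.integers(0, 2, n_names)      # start each name in state 0 or 1
    for _ in range(n_years):
        alive = state != 2
        u = rng.random(n_names)
        cum = np.cumsum(P[state], axis=1)
        new_state = (u[:, None] > cum).sum(axis=1)   # inverse-CDF sampling of the next state
        protection_leg[k] += (alive & (new_state == 2)).sum() * notional * (1 - recovery)
        state = new_state
        premium_leg[k] += (state != 2).sum() * notional   # spread accrues on the surviving notional

fair_spread = protection_leg.mean() / premium_leg.mean()   # actuarially fair spread estimate
print(fair_spread * 1e4, "bp")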