838 results for Mesh generation from image data
Abstract:
We report on the measurement of second-harmonic signals from hyperplastic parenchyma and stroma in malignant human prostate tissue under femtosecond pulsed illumination in the wavelength range from 730 to 870 nm. In particular, the dependence of the second-harmonic generation on the excitation wavelength is measured. The signals from these two tissue regions behave considerably differently and thus provide a possible indicator for identifying tissue components and malignancy.
Abstract:
Self-reported health status measures are generally used to analyse Social Security Disability Insurance (SSDI) application and award decisions as well as the relationship between its generosity and labour force participation. Due to endogeneity and measurement error, the use of self-reported health and disability indicators as explanatory variables in economic models is problematic. We employ county-level aggregate data, instrumental variables and spatial econometric techniques to analyse the determinants of variation in SSDI rates and explicitly account for the endogeneity and measurement error of the self-reported disability measure. Two surprising results are found. First, it is shown that measurement error is the dominant source of the bias and that the main source of measurement error is sampling error. Second, the results suggest that there may be synergies in applying for SSDI when the disabled population is larger. © 2011 Taylor & Francis.
Abstract:
The foliage of a plant performs vital functions. As such, leaf models must be developed for modelling the plant architecture from a set of scattered data captured using a scanning device. The leaf model can be used for purely visual purposes or as part of a further model, such as a fluid movement model or a biological process. For these reasons, an accurate mathematical representation of the surface and boundary is required. This paper compares three approaches for fitting a continuously differentiable surface through a set of scanned data points from a leaf surface with a technique already used for reconstructing leaf surfaces. The techniques considered are discrete smoothing D2-splines [R. Arcangeli, M. C. Lopez de Silanes, and J. J. Torrens, Multidimensional Minimising Splines, Springer, 2004], the thin plate spline finite element smoother [S. Roberts, M. Hegland, and I. Altas, Approximation of a Thin Plate Spline Smoother Using Continuous Piecewise Polynomial Functions, SIAM, 1 (2003), pp. 208--234] and the radial basis function Clough-Tocher method [M. Oqielat, I. Turner, and J. Belward, A hybrid Clough-Tocher method for surface fitting with application to leaf data, Appl. Math. Modelling, 33 (2009), pp. 2582--2595]. Numerical results show that discrete smoothing D2-splines produce reconstructed leaf surfaces that better represent the original physical leaf.
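The scattered-data surface fitting compared in the paper can be sketched with an off-the-shelf thin plate spline interpolant; the scan sites, the analytic stand-in for a leaf height field, and the smoothing value below are assumptions for illustration, not the paper's leaf data:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(1)
pts = rng.random((200, 2))                                         # scattered (x, y) scan sites
z = np.sin(2 * np.pi * pts[:, 0]) * np.cos(2 * np.pi * pts[:, 1])  # stand-in height field
# Thin plate spline fit; small smoothing trades exact interpolation for noise robustness
surf = RBFInterpolator(pts, z, kernel='thin_plate_spline', smoothing=1e-6)
test = rng.random((50, 2)) * 0.8 + 0.1                             # evaluation points away from edges
pred = surf(test)
true = np.sin(2 * np.pi * test[:, 0]) * np.cos(2 * np.pi * test[:, 1])
```

The D2-spline and finite element smoothers compared in the paper differ in how they discretise and regularise the fit, but all consume this same kind of scattered (x, y, z) input.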
Abstract:
The global efforts to reduce carbon emissions from power generation have favoured renewable energy resources such as wind and solar in recent years. Power generation from renewable energy resources has become attractive because of various incentives provided by government policies supporting green power. Among the available renewable energy resources, power generation from wind has seen tremendous growth in the last decade. This article discusses various advantages of the upcoming offshore wind technology and associated considerations related to its construction. The conventional configuration of an offshore wind farm is based on alternating current (AC) internal links. With recent advances in commercialised converters, high-voltage direct current (HVDC) links based on voltage source converters are gaining popularity for offshore wind farms. The planning and construction phases of offshore wind farms, including related environmental issues, are discussed here.
Abstract:
Quantifying the competing rates of intake and elimination of persistent organic pollutants (POPs) in the human body is necessary to understand the levels and trends of POPs at a population level. In this paper we reconstruct the historical intake and elimination of ten polychlorinated biphenyls (PCBs) and five organochlorine pesticides (OCPs) from Australian biomonitoring data by fitting a population-level pharmacokinetic (PK) model. Our analysis exploits two sets of cross-sectional biomonitoring data for PCBs and OCPs in pooled blood serum samples from the Australian population that were collected in 2003 and 2009. The modeled adult reference intakes in 1975 for PCB congeners ranged from 0.89 to 24.5 ng/kg bw/day, lower than the daily intakes of OCPs ranging from 73 to 970 ng/kg bw/day. Modeled intake rates are declining with half-times from 1.1 to 1.3 years for PCB congeners and 0.83 to 0.97 years for OCPs. The shortest modeled intrinsic human elimination half-life among the compounds studied here is 6.4 years for hexachlorobenzene, and the longest is 30 years for PCB-74. Our results indicate that it is feasible to reconstruct intakes and to estimate intrinsic human elimination half-lives using the population-level PK model and biomonitoring data only. Our modeled intrinsic human elimination half-lives are in good agreement with values from a similar study carried out for the population of the United Kingdom, and are generally longer than reported values from other industrialized countries in the Northern Hemisphere.
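The population-level pharmacokinetic reconstruction rests on a one-compartment balance between intake and first-order elimination. A minimal sketch, assuming hypothetical values that loosely echo the abstract (a 6.4-year elimination half-life as reported for hexachlorobenzene, a roughly 0.9-year intake decline half-time, and an arbitrary reference intake):

```python
import numpy as np

# One-compartment population PK sketch: body burden B under an exponentially
# declining intake I(t) with first-order elimination, dB/dt = I(t) - ke * B.
# All numbers are illustrative, loosely echoing the abstract.
t_half_elim = 6.4        # intrinsic elimination half-life, years (HCB in the abstract)
t_half_intake = 0.9      # intake decline half-time, years
I0 = 100.0               # reference daily intake, ng/kg bw/day (hypothetical)
ke = np.log(2) / t_half_elim
kd = np.log(2) / t_half_intake
dt, years = 0.01, 40.0
t = np.arange(0.0, years, dt)
B = np.zeros_like(t)     # body burden, ng/kg bw
for i in range(1, t.size):
    intake = I0 * 365.25 * np.exp(-kd * t[i - 1])    # ng/kg bw/year
    B[i] = B[i - 1] + dt * (intake - ke * B[i - 1])  # forward Euler step
```

Because intake declines faster than elimination here, the body burden peaks within a few years and then decays, which is the qualitative shape such models fit to cross-sectional biomonitoring data.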
Abstract:
In recent years, rapid advances in information technology have led to various data collection systems which are enriching the sources of empirical data for use in transport systems. Currently, traffic data are collected through various sensors including loop detectors, probe vehicles, cell-phones, Bluetooth, video cameras, remote sensing and public transport smart cards. It has been argued that combining the complementary information from multiple sources will generally result in better accuracy, increased robustness and reduced ambiguity. Despite the fact that there have been substantial advances in data assimilation techniques to reconstruct and predict the traffic state from multiple data sources, such methods are generally data-driven and do not fully utilize the power of traffic models. Furthermore, the existing methods are still limited to freeway networks and are not yet applicable in the urban context due to the enhanced complexity of the flow behavior. The main traffic phenomena on urban links are generally caused by the boundary conditions at intersections, un-signalized or signalized, at which the switching of the traffic lights and the turning maneuvers of the road users lead to shock-wave phenomena that propagate upstream of the intersections. This paper develops a new model-based methodology to build up a real-time traffic prediction model for arterial corridors using data from multiple sources, particularly from loop detectors and partial observations from Bluetooth and GPS devices.
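The queueing and shock-wave behaviour at a signalized boundary can be illustrated with a minimal cell transmission model; this is a generic first-order traffic model used as an illustration, not the paper's own method, and every parameter value is arbitrary:

```python
import numpy as np

# Triangular fundamental diagram (dt = dx = 1; all values illustrative)
vf, w, kj, qmax = 1.0, 0.5, 1.0, 0.25   # free speed, backward wave speed, jam density, capacity
ncells, steps = 20, 60
k = np.full(ncells, 0.1)                 # initial density in each cell
inflow = 0.1                             # upstream demand
hist = []
for t in range(steps):
    green = (t % 20) < 10                # fixed-time signal at the downstream end
    S = np.minimum(vf * k, qmax)         # sending (demand) flow of each cell
    R = np.minimum(w * (kj - k), qmax)   # receiving (supply) flow of each cell
    q = np.minimum(S[:-1], R[1:])        # flows across internal cell boundaries
    q_in = min(inflow, R[0])
    q_out = S[-1] if green else 0.0      # red light blocks the exit, so a queue grows
    k = k + np.concatenate(([q_in], q)) - np.concatenate((q, [q_out]))
    hist.append(k.copy())
hist = np.array(hist)
# During red phases the high-density region propagates upstream as a shock wave.
```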
Abstract:
A method for reconstruction of an object f(x), x = (x, y, z), from a limited set of cone-beam projection data has been developed. This method uses a modified form of convolution back-projection and projection onto convex sets (POCS) for handling the limited (or incomplete) data problem. In cone-beam tomography, one needs to have a complete geometry to completely reconstruct the original three-dimensional object. While complete geometries do exist, they are of little use in practical implementations. The most common trajectory used in practical scanners is circular, which is incomplete. It is, however, possible to recover some of the information of the original signal f(x) based on a priori knowledge of the nature of f(x). If this knowledge can be posed in a convex set framework, then POCS can be utilized. In this report, we utilize this a priori knowledge as convex set constraints to reconstruct f(x) using POCS. While we demonstrate the effectiveness of our algorithm for circular trajectories, it is essentially geometry independent and will be useful in any limited-view cone-beam reconstruction.
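A minimal 1-D sketch of the POCS idea (not the cone-beam geometry itself): alternately project onto an affine data-consistency set (a subset of known DFT coefficients, standing in for measured projections) and a convex prior set (nonnegativity plus known support). Both sets contain the true signal, so each projection can only move the iterate closer to it:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64
support = np.zeros(n, dtype=bool)
support[10:30] = True                    # a priori known object support
x_true = np.zeros(n)
x_true[10:30] = rng.random(20)           # nonnegative "object"

measured_idx = np.arange(0, n, 2)        # only half the DFT coefficients are "measured"
measured = np.fft.fft(x_true)[measured_idx]

x = np.zeros(n)
errs = []
for _ in range(200):
    # Projection 1 (affine data-consistency set): restore the measured coefficients
    X = np.fft.fft(x)
    X[measured_idx] = measured
    x = np.real(np.fft.ifft(X))
    # Projection 2 (convex prior set): zero outside the support, clip negatives
    x[~support] = 0.0
    x = np.clip(x, 0.0, None)
    errs.append(float(np.linalg.norm(x - x_true)))
```

In this toy setup the constraint sets intersect only at the true signal, so the iteration recovers it; in limited-view tomography the intersection is larger and POCS returns one feasible reconstruction consistent with the data and priors.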
Abstract:
A series of bimetallic acetylacetonate (acac) complexes, AlxCr1-x(acac)3, 0 <= x <= 1, have been synthesized for application as precursors for the CVD of substituted oxides, such as (AlxCr1-x)2O3. Detailed thermal analysis has been carried out on these complexes, which are solids that begin subliming at low temperatures, followed by melting and evaporation from the melt. By applying the Langmuir equation to differential thermogravimetry data, the vapour pressure of these complexes is estimated. From these vapour pressure data, the distinctly different enthalpies of sublimation and evaporation are calculated using the Clausius-Clapeyron equation. Such a determination of both the enthalpies of sublimation and evaporation of complexes that sublime and melt congruently does not appear to have been reported in the literature to date.
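The Clausius-Clapeyron step amounts to a straight-line fit of ln p against 1/T, with the slope giving -ΔH/R. A sketch on synthetic vapour-pressure data (the enthalpy and temperature range are illustrative, not the paper's values):

```python
import numpy as np

R = 8.314                     # gas constant, J/(mol K)
dH_true = 95e3                # assumed sublimation enthalpy, J/mol (illustrative)
T = np.linspace(380.0, 440.0, 10)            # temperatures, K
p = 1e12 * np.exp(-dH_true / (R * T))        # synthetic Langmuir-derived vapour pressures

# Clausius-Clapeyron: ln p = -(dH/R) * (1/T) + const, so a linear fit recovers dH
slope, intercept = np.polyfit(1.0 / T, np.log(p), 1)
dH_est = -slope * R
```

Fitting the sublimation and melt-evaporation regimes separately, as the paper does, yields the two distinct enthalpies from the two slopes.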
Abstract:
The Fabens method is commonly used to estimate the growth parameters k and L∞ in the von Bertalanffy model from tag-recapture data. However, the Fabens method of estimation has an inherent bias when individual growth is variable. This paper presents an asymptotically unbiased method using a maximum likelihood approach that takes account of individual variability in both maximum length and age-at-tagging. It is assumed that each individual's growth follows a von Bertalanffy curve with its own maximum length and age-at-tagging. The parameter k is assumed to be a constant to ensure that the mean growth follows a von Bertalanffy curve and to avoid overparameterization. Our method also makes more efficient use of the measurements at tagging and recapture and includes diagnostic techniques for checking distributional assumptions. The method is reasonably robust and performs better than the Fabens method when individual growth differs from the von Bertalanffy relationship. When measurement error is negligible, the estimation involves maximizing the profile likelihood of one parameter only. The method is applied to tag-recapture data for the grooved tiger prawn (Penaeus semisulcatus) from the Gulf of Carpentaria, Australia.
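The Fabens baseline fits the increment equation ΔL = (L∞ - L1)(1 - exp(-k Δt)) to tag-recapture pairs by nonlinear least squares. A sketch on simulated data with illustrative parameter values:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(2)
n = 300
Linf_true, k_true = 100.0, 0.3               # illustrative growth parameters
L1 = rng.uniform(20.0, 80.0, n)              # length at tagging
dt = rng.uniform(0.5, 3.0, n)                # time at liberty, years
dL = (Linf_true - L1) * (1.0 - np.exp(-k_true * dt)) + rng.normal(0.0, 1.0, n)

def fabens_increment(X, Linf, k):
    L1, dt = X
    return (Linf - L1) * (1.0 - np.exp(-k * dt))

(Linf_hat, k_hat), _ = curve_fit(fabens_increment, (L1, dt), dL, p0=(80.0, 0.2))
```

When L∞ varies between individuals, this estimator is biased, which is the deficiency the paper's likelihood-based approach addresses.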
Abstract:
We propose a new model for estimating the size of a population from successive catches taken during a removal experiment. The data from these experiments often have excessive variation, known as overdispersion, compared with that predicted by the multinomial model. The new model allows catchability to vary randomly among samplings, which accounts for overdispersion. When the catchability is assumed to have a beta distribution, the likelihood function, which is referred to as beta-multinomial, is derived, and hence the maximum likelihood estimates can be evaluated. Simulations show that in the presence of extra variation in the data, the confidence intervals have been substantially underestimated in previous models (Leslie-DeLury, Moran) and that the new model provides more reliable confidence intervals. The performance of these methods was also demonstrated using two real data sets: one with overdispersion, from smallmouth bass (Micropterus dolomieu), and the other without overdispersion, from rat (Rattus rattus).
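The overdispersion mechanism is easy to see by simulation: drawing a fresh beta-distributed catchability for each removal pass inflates the variance of the catches well beyond the fixed-catchability (multinomial-type) case. All parameter values below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 1000                     # true population size (illustrative)
a, b = 5.0, 45.0             # Beta(5, 45) catchability: mean 0.1
n_pass, n_rep = 4, 2000

def removal_catches(random_q):
    remaining, catches = N, []
    for _ in range(n_pass):
        q = rng.beta(a, b) if random_q else a / (a + b)   # per-pass catchability
        c = rng.binomial(remaining, q)
        catches.append(c)
        remaining -= c
    return catches

reps_fixed = np.array([removal_catches(False) for _ in range(n_rep)])
reps_beta = np.array([removal_catches(True) for _ in range(n_rep)])
```

Both schemes give the same expected catches, but the beta-mixed scheme has a much larger catch variance, which is why multinomial-based confidence intervals are too narrow for such data.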
Abstract:
In this paper, we have probed the origin of SHG in copper nanoparticles by polarization-resolved hyper-Rayleigh scattering (HRS). Results obtained with various sizes of copper nanoparticles at four different wavelengths covering the range 738-1907 nm reveal that the origin of second harmonic generation (SHG) in these particles is purely dipolar in nature as long as the size (d) of the particles remains small compared with the wavelength (λ) of light (the "small-particle limit"). However, the contribution of higher-order multipoles coupled with the retardation effect becomes apparent with an increase in the d/λ ratio. We have identified the "small-particle limit" in second harmonic generation from noble metal nanoparticles by evaluating the critical d/λ ratio at which the retardation effect sets in. We have found that the second-order nonlinear optical property of copper nanoparticles closely resembles that of gold, but not that of silver. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
In order to understand the molecular mechanism of the non-oxidative decarboxylation of aromatic acids observed in microbial systems, 2,3-dihydroxybenzoic acid (DHBA) decarboxylase was purified to homogeneity by affinity chromatography. The enzyme (Mr 120 kDa) had four identical subunits (28 kDa each) and was specific for DHBA. It had a pH optimum of 5.2 and a Km of 0.34 mM. The decarboxylation did not require any cofactors, nor did the enzyme have a pyruvoyl group at the active site. The carboxyl group and the ring hydroxyl group were required for activity. Preliminary spectroscopic properties of the enzyme are also reported.
Abstract:
In multi-vehicle motorcycle crashes, the motorcycle rider is less likely to be at-fault but more commonly severely injured than the other road user. Therefore, not surprisingly, crashes in which motorcycle riders are at-fault and particularly the injuries to the other road users in these crashes have received little research attention. This paper aims to address this gap in the literature by investigating the factors influencing the severity of injury to other road users in motorcyclist-at-fault crashes. Five years of data from Queensland, Australia, were obtained from a database of claims against the compulsory third party (CTP) injury insurance of the at-fault motorcyclists. Analysis of the data using an ordered probit model shows higher injury severity for crashes involving young (under 25) and older (60+) at-fault motorcyclists. Among the not at-fault road users, the young, old, and males were found to be more severely injured than others. Injuries to vehicle occupants were less severe than those to pillions. Crashes that occurred between vehicles traveling in opposite directions resulted in more severe injuries than those involving vehicles traveling in the same direction. While most existing studies have analyzed police reported crash data, this study used CTP insurance data. Comparison of results indicates the potential of using CTP insurance data as an alternative to police reported crash data for gaining a better understanding of risk factors for motorcycle crashes and injury severity.
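An ordered probit of the kind used in the study can be sketched by direct maximum likelihood on simulated data; the single covariate, the three severity categories and the cutpoint values are hypothetical, not the study's variables:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(4)
n = 2000
x = rng.normal(size=n)                           # hypothetical covariate (e.g. a rider-age score)
beta_true, cuts_true = 0.8, np.array([-0.5, 0.7])
ystar = beta_true * x + rng.normal(size=n)       # latent injury severity
y = np.digitize(ystar, cuts_true)                # 0 = minor, 1 = serious, 2 = severe

def nll(params):
    beta, c1, log_gap = params
    c2 = c1 + np.exp(log_gap)                    # enforce c1 < c2
    eta = beta * x
    p = np.where(y == 0, norm.cdf(c1 - eta),
        np.where(y == 1, norm.cdf(c2 - eta) - norm.cdf(c1 - eta),
                 1.0 - norm.cdf(c2 - eta)))
    return -np.sum(np.log(np.clip(p, 1e-12, None)))

res = minimize(nll, x0=np.array([0.0, -1.0, 0.0]), method='BFGS')
beta_hat, c1_hat = res.x[0], res.x[1]
```

The fitted coefficient recovers the sign and rough magnitude of the latent-severity effect, which is how covariates such as rider age group are interpreted in models of this type.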
Abstract:
NeEstimator v2 is a completely revised and updated implementation of software that produces estimates of contemporary effective population size, using several different methods and a single input file. NeEstimator v2 includes three single-sample estimators (updated versions of the linkage disequilibrium and heterozygote-excess methods, and a new method based on molecular coancestry), as well as the two-sample (moment-based temporal) method. New features include the following: (i) an improved method for accounting for missing data; (ii) options for screening out rare alleles; (iii) confidence intervals for all methods; (iv) the ability to analyse data sets with large numbers of genetic markers (10,000 or more); (v) options for batch processing large numbers of different data sets, which will facilitate cross-method comparisons using simulated data; and (vi) correction for temporal estimates when individuals sampled are not removed from the population (Plan I sampling). The user is given considerable control over input data and over the composition and format of output files. The freely available software has a new Java interface and runs under macOS, Linux and Windows.
Abstract:
A soluble enzyme fraction catalyzed the hydroxylation of mandelic acid to hydroxymandelic acid. The enzyme had a pH optimum of 5.4 and showed an absolute requirement for Fe2+, tetrahydropteridine and NADPH. Hydroxymandelate, the product of the enzyme reaction, was identified by paper chromatography, thin-layer chromatography, and UV and IR spectra.