201 results for Computational Geometry and Object Modelling


Relevance: 100.00%

Abstract:

There have been substantial advances in small field dosimetry techniques and technologies over the last decade, which have dramatically improved the achievable accuracy of small field dose measurements. This educational note aims to help radiation oncology medical physicists to apply some of these advances in clinical practice. The evaluation of a set of small field output factors (total scatter factors) is used to exemplify a detailed measurement and simulation procedure, and as a basis for discussing the possible effects of simplifying that procedure. Field output factors were measured with an unshielded diode and a micro-ionisation chamber, at the centre of a set of square fields defined by a micro-multileaf collimator. Nominal field sizes investigated ranged from 6 × 6 to 98 × 98 mm². Diode measurements in fields smaller than 30 mm across were corrected using response factors calculated using Monte Carlo simulations of the full diode geometry and daisy-chained to match micro-chamber measurements at intermediate field sizes. Diode measurements in fields smaller than 15 mm across were repeated twelve times over three separate measurement sessions, to evaluate the reproducibility of the radiation field size and its correspondence with the nominal field size. The five readings that contributed to each measurement on each day varied by up to 0.26% for the “very small” fields smaller than 15 mm, and 0.18% for the fields larger than 15 mm. The diode response factors calculated for the unshielded diode agreed with previously published results within 1.6%. The measured dimensions of the very small fields differed by up to 0.3 mm across the different measurement sessions, contributing an uncertainty of up to 1.2% to the very small field output factors. The overall uncertainties in the field output factors were 1.8% for the very small fields and 1.1% for the fields larger than 15 mm across. Recommended steps for acquiring small field output factor measurements for use in radiotherapy treatment planning system beam configuration data are provided.
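The daisy-chaining step described in this abstract can be summarised in a few lines of code. The sketch below uses hypothetical readings, correction factors and a 30 mm intermediate field purely for illustration; none of the numbers are taken from the study.

```python
# Minimal sketch of daisy-chained field output factors, assuming hypothetical
# readings and correction factors (none of these values come from the study).
REF_FIELD, INT_FIELD = 98.0, 30.0               # reference and intermediate field sizes (mm)

chamber = {98.0: 1.000, 30.0: 0.938}            # micro-chamber readings, normalised to the reference field
diode = {30.0: 0.952, 15.0: 0.901, 6.0: 0.778}  # raw unshielded diode readings
k_diode = {15.0: 0.985, 6.0: 0.962}             # Monte Carlo diode response corrections for very small fields

def field_output_factor(field_mm):
    """Corrected diode ratio to the intermediate field, rescaled by the chamber output factor."""
    diode_ratio = diode[field_mm] * k_diode[field_mm] / diode[INT_FIELD]
    return diode_ratio * chamber[INT_FIELD] / chamber[REF_FIELD]

for f in (15.0, 6.0):
    print(f"{f:4.0f} mm field: output factor = {field_output_factor(f):.3f}")
```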

Relevance: 100.00%

Abstract:

Epigenetic changes correspond to heritable modifications of the chromatin structure, which do not involve any alteration of the DNA sequence but nonetheless affect gene expression. These mechanisms play an important role in cell differentiation, but aberrant occurrences are also associated with a number of diseases, including cancer and neural development disorders. In particular, aberrant DNA methylation induced by H. pylori has been found to be a significant risk factor in gastric cancer. To investigate the sensitivity of different genes and cell types to this infection, a computational model of methylation in gastric crypts is developed. In this article, we review existing results from physical experiments and outline their limitations, before presenting the computational model and investigating the influence of its parameters.

Relevance: 100.00%

Abstract:

This paper reviews a variety of advanced signal processing algorithms that have been developed at the University of Southampton as part of the Prometheus (Programme for European traffic flow with highest efficiency and unprecedented safety) programme to achieve an intelligent driver warning system (IDWS). The IDWS includes the detection, tracking and identification of road edges, lanes and obstacles, estimates of time to collision, and behavioural modelling of drivers for a variety of scenarios. The underlying algorithms are briefly discussed in support of the IDWS.
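As a simple illustration of one of the quantities listed above, the sketch below computes a constant-velocity time-to-collision estimate; it is a textbook simplification (no acceleration term, no sensor or tracking noise model) and not the Prometheus IDWS algorithm itself.

```python
def time_to_collision(range_m: float, closing_speed_mps: float) -> float:
    """Constant-velocity time-to-collision estimate in seconds (illustrative only)."""
    if closing_speed_mps <= 0.0:      # opening or zero closing speed: no collision predicted
        return float("inf")
    return range_m / closing_speed_mps

# e.g. an obstacle 40 m ahead closing at 15 m/s leaves roughly 2.7 s to react
print(f"TTC = {time_to_collision(40.0, 15.0):.1f} s")
```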

Relevance: 100.00%

Abstract:

The most important aspect of modelling a geological variable, such as metal grade, is the spatial correlation. Spatial correlation describes the relationship between realisations of a geological variable sampled at different locations. Any method for spatially modelling such a variable should be capable of accurately estimating the true spatial correlation. Conventional kriged models are those most commonly used in mining to estimate grade or other variables at unsampled locations, and these models use the variogram or covariance function to model the spatial correlations in the process of estimation. However, this usage assumes that the relationships between observations of the variable of interest at nearby locations are influenced only by the vector distance between the locations. This means that these models assume linear spatial correlation of grade. In reality, the relationship with an observation of grade at a nearby location may be influenced by both the distance between the locations and the value of the observations (i.e. non-linear spatial correlation, such as may exist for variables of interest in geometallurgy). Hence a kriged model may lead to inaccurate estimation of the ore reserve when it is used to estimate the grade of unsampled locations in the presence of non-linear spatial correlation. Copula-based methods, which are widely used in financial and actuarial modelling to quantify non-linear dependence structures, may offer a solution. This approach was introduced to geostatistical modelling by Bárdossy and Li (2008) to quantify the non-linear spatial dependence structure in a groundwater quality measurement network. Their copula-based spatial modelling is applied in this research paper to estimate the grade of 3D blocks. Furthermore, real-world mining data is used to validate this model. These copula-based grade estimates are compared with the results of conventional ordinary and lognormal kriging to assess the reliability of this method.
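To make the distance-only assumption concrete, the following sketch computes an ordinary kriging estimate with an exponential covariance model; the sample locations, grades, sill and range are invented for illustration, and the copula-based alternative discussed in the paper is not reproduced here.

```python
import numpy as np

def ordinary_kriging(coords, values, target, sill=1.0, rng=50.0):
    """Ordinary kriging estimate at `target` using an exponential covariance.

    Minimal sketch: the covariance depends only on separation distance, which is
    exactly the distance-only (linear spatial correlation) assumption the copula
    approach relaxes. Sill and range are illustrative, not fitted.
    """
    def cov(h):
        return sill * np.exp(-h / rng)

    n = len(values)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    # Kriging system with a Lagrange multiplier enforcing unbiased weights.
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = cov(d)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = cov(np.linalg.norm(coords - target, axis=-1))
    w = np.linalg.solve(A, b)[:n]
    return float(w @ values)

# Three hypothetical drill-hole samples (x, y in metres) and grades (g/t).
coords = np.array([[0.0, 0.0], [40.0, 10.0], [15.0, 60.0]])
grades = np.array([2.1, 3.4, 1.7])
print(ordinary_kriging(coords, grades, np.array([20.0, 20.0])))
```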

Relevance: 100.00%

Abstract:

Semantic perception and object labeling are key requirements for robots interacting with objects on a higher level. Symbolic annotation of objects allows planning algorithms to be used for object interaction, for instance in a typical fetch-and-carry scenario. In current research, perception is usually based on 3D scene reconstruction and geometric model matching, where trained features are matched with a 3D sample point cloud. In this work we propose a semantic perception method which is based on spatio-semantic features. These features are defined in a natural, symbolic way, such as geometry and spatial relation. In contrast to point-based model matching methods, a spatial ontology is used in which objects are instead described in terms of what they "look like", similar to how a human would describe unknown objects to another person. A fuzzy reasoning approach matches perceivable features with a spatial ontology of the objects. The approach is able to deal with sensor noise and occlusions. Another advantage is that no training phase is needed to learn object features. The use case of the proposed method is the detection of soil sample containers, which have to be collected by a mobile robot, in an outdoor environment. The approach is verified using real-world experiments.
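The sketch below illustrates the flavour of fuzzy matching between perceived spatio-semantic features and a symbolic object description. The membership functions and the "soil sample container" description are invented for this example and are not the ontology used in the paper.

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal membership: 0 outside [a, d], 1 on [b, c], linear in between."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

# Hypothetical symbolic description: roughly cylindrical, about 0.3 m tall, standing on the ground.
container_description = {
    "height_m":     lambda h: trapezoid(h, 0.15, 0.25, 0.35, 0.45),
    "elongation":   lambda e: trapezoid(e, 1.0, 1.5, 3.0, 4.0),    # height/width ratio
    "ground_gap_m": lambda g: trapezoid(g, -0.05, 0.0, 0.05, 0.10),
}

def match(perceived: dict) -> float:
    """Fuzzy AND (minimum) over all feature memberships gives an overall match score."""
    return min(fn(perceived[name]) for name, fn in container_description.items())

# A segmented point-cloud cluster with measured spatio-semantic features:
print(match({"height_m": 0.28, "elongation": 2.2, "ground_gap_m": 0.02}))
```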

Relevance: 100.00%

Abstract:

In 2009, the National Research Council of the National Academies released a report on A New Biology for the 21st Century. The council preferred the term ‘New Biology’ to capture the convergence and integration of the various disciplines of biology. The National Research Council stressed: ‘The essence of the New Biology, as defined by the committee, is integration—re-integration of the many sub-disciplines of biology, and the integration into biology of physicists, chemists, computer scientists, engineers, and mathematicians to create a research community with the capacity to tackle a broad range of scientific and societal problems.’ They define the ‘New Biology’ as ‘integrating life science research with physical science, engineering, computational science, and mathematics’. The National Research Council reflected: 'Biology is at a point of inflection. Years of research have generated detailed information about the components of the complex systems that characterize life––genes, cells, organisms, ecosystems––and this knowledge has begun to fuse into greater understanding of how all those components work together as systems. Powerful tools are allowing biologists to probe complex systems in ever greater detail, from molecular events in individual cells to global biogeochemical cycles. Integration within biology and increasingly fruitful collaboration with physical, earth, and computational scientists, mathematicians, and engineers are making it possible to predict and control the activities of biological systems in ever greater detail.' The National Research Council contended that the New Biology could address a number of pressing challenges. First, it stressed that the New Biology could ‘generate food plants to adapt and grow sustainably in changing environments’. Second, the New Biology could ‘understand and sustain ecosystem function and biodiversity in the face of rapid change’. Third, the New Biology could ‘expand sustainable alternatives to fossil fuels’. Moreover, it was hoped that the New Biology could lead to a better understanding of individual health: ‘The New Biology can accelerate fundamental understanding of the systems that underlie health and the development of the tools and technologies that will in turn lead to more efficient approaches to developing therapeutics and enabling individualized, predictive medicine.’ Biological research has certainly been changing direction in response to changing societal problems. Over the last decade, increasing awareness of the impacts of climate change and dwindling supplies of fossil fuels can be seen to have generated investment in fields such as biofuels, climate-ready crops and storage of agricultural genetic resources. In considering biotechnology’s role in the twenty-first century, the biological futurist Carlson’s firm Biodesic states: ‘The problems the world faces today – ecosystem responses to global warming, geriatric care in the developed world or infectious diseases in the developing world, the efficient production of more goods using less energy and fewer raw materials – all depend on understanding and then applying biology as a technology.’ This collection considers the roles of intellectual property law in regulating emerging technologies in the biological sciences.
Stephen Hilgartner comments that patent law plays a significant part in social negotiations about the shape of emerging technological systems or artefacts: 'Emerging technology – especially in such hotbeds of change as the life sciences, information technology, biomedicine, and nanotechnology – became a site of contention where competing groups pursued incompatible normative visions. Indeed, as people recognized that questions about the shape of technological systems were nothing less than questions about the future shape of societies, science and technology achieved central significance in contemporary democracies. In this context, states face ongoing difficulties trying to mediate these tensions and establish mechanisms for addressing problems of representation and participation in the sociopolitical process that shapes emerging technology.' The introduction to the collection will provide a thumbnail, comparative overview of recent developments in intellectual property and biotechnology – as a foundation to the collection. Section I of this introduction considers recent developments in United States patent law, policy and practice with respect to biotechnology – in particular, highlighting the Myriad Genetics dispute and the decision of the Supreme Court of the United States in Bilski v. Kappos. Section II considers the cross-currents in Canadian jurisprudence in intellectual property and biotechnology. Section III surveys developments in the European Union – and the interpretation of the European Biotechnology Directive. Section IV focuses upon Australia and New Zealand, and considers the policy responses to the controversy of Genetic Technologies Limited’s patents in respect of non-coding DNA and genomic mapping. Section V outlines the parts of the collection and the contents of the chapters.

Relevance: 100.00%

Abstract:

This paper presents a new active learning query strategy for information extraction, called Domain Knowledge Informativeness (DKI). Active learning is often used to reduce the amount of annotation effort required to obtain training data for machine learning algorithms. A key component of an active learning approach is the query strategy, which is used to iteratively select samples for annotation. Knowledge resources have been used in information extraction as a means to derive additional features for sample representation. DKI is, however, the first query strategy that exploits such resources to inform sample selection. To evaluate the merits of DKI, in particular with respect to the reduction in annotation effort that the new query strategy achieves, we conduct a comprehensive empirical comparison of active learning query strategies for information extraction within the clinical domain. The clinical domain was chosen for this work because of the availability of extensive structured knowledge resources which have often been exploited for feature generation. In addition, the clinical domain offers a compelling use case for active learning because of the high costs and hurdles associated with obtaining annotations in this domain. Our experimental findings demonstrate that 1) amongst existing query strategies, the ones based on the classification model’s confidence are a better choice for clinical data, as they perform equally well with a much lighter computational load, and 2) significant reductions in annotation effort are achievable by exploiting knowledge resources within active learning query strategies, with up to 14% fewer tokens and concepts to annotate manually than with state-of-the-art query strategies.
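For context, the confidence-based strategies referred to in finding 1) can be implemented in a few lines. The sketch below shows a generic least-confidence query strategy with a stand-in probability matrix; it is not the DKI strategy itself.

```python
import numpy as np

def least_confidence_query(unlabelled_probs: np.ndarray, batch_size: int = 5) -> np.ndarray:
    """Select the pool samples whose most probable label has the lowest probability.

    unlabelled_probs: (n_samples, n_classes) predicted class probabilities
    returns: indices of the `batch_size` least confident samples
    """
    confidence = unlabelled_probs.max(axis=1)      # confidence in the top prediction
    return np.argsort(confidence)[:batch_size]     # least confident first

# Each active learning iteration: train on labelled data, score the pool,
# query the least confident samples, and send them for manual annotation.
rng = np.random.default_rng(0)
pool_probs = rng.dirichlet(np.ones(4), size=100)   # stand-in for model outputs
print(least_confidence_query(pool_probs, batch_size=3))
```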

Relevance: 100.00%

Abstract:

Lagrangian particle tracking provides an effective method for simulating the deposition of nano-particles as well as micro-particles, as it accounts for both the particle inertia effect and Brownian excitation. However, the use of the Lagrangian approach for simulating ultrafine particles has been limited due to computational cost and numerical difficulties. The aim of this paper is to study the deposition of nano-particles in cylindrical tubes under laminar conditions using the Lagrangian particle tracking method. The commercial Fluent software is used to simulate the fluid flow in the pipes and to study the deposition and dispersion of nano-particles. Different particle diameters as well as different pipe lengths and flow rates are examined. The results show good agreement between the calculated deposition efficiency and different analytic correlations in the literature. Furthermore, for nano-particles of larger diameter, where the effect of inertia becomes more important, the deposition efficiency calculated by the Lagrangian method is lower than that given by analytic correlations based on the Eulerian method, due to statistical error or the inertia effect.
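A minimal Lagrangian random-walk sketch of the problem is shown below: convection by a parabolic laminar profile plus Brownian displacements from the Stokes-Einstein diffusivity. It neglects particle inertia and the slip correction, uses invented tube dimensions and flow conditions, and is only an illustration of the approach rather than a reproduction of the Fluent simulations.

```python
import numpy as np

# Lagrangian random-walk sketch (no particle inertia, no slip correction):
# particles are convected by a parabolic laminar profile and take Brownian
# steps with the Stokes-Einstein diffusivity; reaching the wall counts as deposition.

KB, T, MU = 1.380649e-23, 293.15, 1.8e-5      # Boltzmann constant, temperature (K), air viscosity (Pa s)

def deposition_efficiency(d_p, R, L, u_mean, n_particles=500, dt=1e-3, seed=0):
    D = KB * T / (3.0 * np.pi * MU * d_p)     # Stokes-Einstein diffusion coefficient (m^2/s)
    step = np.sqrt(2.0 * D * dt)              # r.m.s. Brownian displacement per step
    rng = np.random.default_rng(seed)
    x = np.zeros(n_particles)                 # axial position, released at the inlet
    y = np.zeros(n_particles)                 # radial coordinates, released on the axis
    z = np.zeros(n_particles)
    alive = np.ones(n_particles, dtype=bool)

    while alive.any():
        n = alive.sum()
        r2 = y[alive] ** 2 + z[alive] ** 2
        u = 2.0 * u_mean * (1.0 - r2 / R**2)              # Poiseuille velocity profile
        x[alive] += u * dt + step * rng.standard_normal(n)
        y[alive] += step * rng.standard_normal(n)
        z[alive] += step * rng.standard_normal(n)
        deposited = alive & (y**2 + z**2 >= R**2)         # hit the wall
        escaped = alive & (x >= L)                        # left through the outlet
        alive &= ~(deposited | escaped)

    return np.count_nonzero(y**2 + z**2 >= R**2) / n_particles

# 5 nm particles in a 0.2 mm radius, 0.5 m long tube at a 0.05 m/s mean velocity (illustrative values).
print(deposition_efficiency(d_p=5e-9, R=2e-4, L=0.5, u_mean=0.05))
```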

Relevance: 100.00%

Abstract:

The total entropy utility function is considered for the dual purpose of Bayesian design for model discrimination and parameter estimation. A sequential design setting is proposed in which it is shown how to efficiently estimate the total entropy utility for a wide variety of data types. Utility estimation relies on forming particle approximations to a number of intractable integrals, which is made possible by the use of a sequential Monte Carlo algorithm for Bayesian inference. A number of motivating examples are considered to demonstrate the performance of total entropy in comparison to utilities for model discrimination and parameter estimation. The results suggest that the total entropy utility selects designs which are efficient under both experimental goals, with little compromise in achieving either goal. As such, the total entropy utility is advocated as a general utility for Bayesian design in the presence of model uncertainty.
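As a rough illustration of the kind of Monte Carlo utility estimation involved, the sketch below estimates only the model-discrimination component (the mutual information between the model indicator and the data) for a toy two-model design problem using nested Monte Carlo. The paper's total entropy utility also accounts for parameter estimation and is estimated with sequential Monte Carlo particle approximations, so this example is only indicative.

```python
import numpy as np

rng = np.random.default_rng(1)
SIGMA = 0.5                                    # known observation noise standard deviation

def simulate(model, theta, design):
    """Simulate data under model 0 (constant mean) or model 1 (linear trend)."""
    mean = theta[0] if model == 0 else theta[0] + theta[1] * design
    return mean + SIGMA * rng.standard_normal(design.shape)

def log_evidence(y, model, design, n_inner=500):
    """Monte Carlo estimate of log p(y | model, design), averaging over prior draws of theta."""
    thetas = rng.standard_normal((n_inner, 2))  # standard normal priors on the parameters
    means = thetas[:, :1] if model == 0 else thetas[:, :1] + thetas[:, 1:] * design
    loglik = -0.5 * np.sum(((y - means) / SIGMA) ** 2 + np.log(2 * np.pi * SIGMA**2), axis=1)
    return np.logaddexp.reduce(loglik) - np.log(n_inner)

def discrimination_utility(design, n_outer=200):
    """Nested Monte Carlo estimate of I(model; y | design) with equal prior model weights."""
    total = 0.0
    for _ in range(n_outer):
        m = rng.integers(2)
        y = simulate(m, rng.standard_normal(2), design)
        log_pm = [log_evidence(y, k, design) for k in (0, 1)]
        total += log_pm[m] - (np.logaddexp(*log_pm) - np.log(2.0))
    return total / n_outer

# A spread-out design separates the two models better than a clustered one.
for design in (np.array([0.0, 0.1, 0.2]), np.array([0.0, 1.0, 2.0])):
    print(design, round(discrimination_utility(design), 3))
```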

Relevance: 100.00%

Abstract:

The latest generation of Deep Convolutional Neural Networks (DCNN) has dramatically advanced challenging computer vision tasks, especially object detection and object classification, achieving state-of-the-art performance in tasks including text recognition, sign recognition, face recognition and scene understanding. The depth of these supervised networks has enabled the learning of deeper, hierarchical representations of features. In parallel, unsupervised deep learning approaches such as the Convolutional Deep Belief Network (CDBN) have also achieved state-of-the-art results in many computer vision tasks. However, there is very limited research on jointly exploiting the strengths of these two approaches. In this paper, we investigate the learning capability of both methods. We compare the outputs of individual layers and show that many of the learnt filters, and the outputs of corresponding layers, are very similar for both approaches. Stacking the DCNN on top of unsupervised layers, or replacing layers in the DCNN with the corresponding learnt layers in the CDBN, can improve recognition/classification accuracy and reduce the computational expense of training. We demonstrate the validity of the proposal on the ImageNet dataset.
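The layer-wise filter comparison mentioned above can be illustrated with a simple cosine-similarity matching between two filter banks. The banks below are random stand-ins; in practice they would be the learnt first-layer filters of the trained DCNN and CDBN.

```python
import numpy as np

def best_match_similarity(filters_a: np.ndarray, filters_b: np.ndarray) -> np.ndarray:
    """For each filter in bank A, the cosine similarity of its best match in bank B.

    filters_a: (n_a, h, w, c) filter bank; filters_b: (n_b, h, w, c)
    """
    a = filters_a.reshape(len(filters_a), -1)
    b = filters_b.reshape(len(filters_b), -1)
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    sims = np.abs(a @ b.T)            # |cosine|: a filter and its negation count as a match
    return sims.max(axis=1)

rng = np.random.default_rng(0)
dcnn_filters = rng.standard_normal((64, 7, 7, 3))   # stand-in for supervised first-layer filters
cdbn_filters = rng.standard_normal((64, 7, 7, 3))   # stand-in for unsupervised first-layer filters
print(best_match_similarity(dcnn_filters, cdbn_filters).mean())
```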

Relevance: 100.00%

Abstract:

Background: The various cell types and their relative numbers in multicellular organisms are controlled by growth factors and related extracellular molecules which affect genetic expression pathways. However, these substances may have inhibitory and/or stimulatory effects on cell division and cell differentiation, depending on the cellular environment. It is not known how cells respond to these substances in such an ambiguous way. Many cellular effects have been investigated and reported using cell cultures from cancer cell lines, in an effort to define normal cellular behaviour using these abnormal cells. A model is offered to explain the harmony of cellular life in multicellular organisms involving interacting extracellular substances.

Methods: A basic model was proposed based on asymmetric cell division, and evidence to support the hypothetical model was accumulated from the literature. In particular, relevant evidence was selected for the Insulin-Like Growth Factor system from the published data, especially from certain cell lines, to support the model. The evidence has been selective, in an attempt to provide a picture of normal cellular responses derived from the cell lines.

Results: The formation of a pair of coupled cells by asymmetric cell division is an integral part of the model, as is the interaction of couplet molecules derived from these cells. Each couplet cell will have a receptor to measure the amount of the couplet molecule produced by the other cell; each cell will be receptor-positive or receptor-negative for the respective receptors. The couplet molecules will form a binary complex whose level is also measured by the cell. The hypothesis is heavily supported by a selective collection of circumstantial evidence and by some direct evidence. The basic model can be expanded to other cellular interactions.

Conclusions: These couplet cells and interacting couplet molecules can be viewed as a mechanism that provides a controlled and balanced division of labour between the two progeny cells and, in turn, their progeny. The presence or absence of a particular receptor for a couplet molecule will define a cell type, and the presence or absence of many such receptors will define the cell types of the progeny within cell lineages.

Relevance: 100.00%

Abstract:

A new dearomatized porphyrinoid, 5,10-diiminoporphodimethene (5,10-DIPD), has been prepared by palladium-catalyzed hydrazination of 5,10-dibromo-15,20-bis(3,5-di-tert-butylphenyl)porphyrin and its nickel(II) complex, using ethyl and 4-methoxybenzyl carbazates. The oxidative dearomatization of the porphyrin ring occurs in high yield. Further oxidation with 2,3-dichloro-5,6-dicyanobenzoquinone forms the corresponding 5,10-bis(azocarboxylates), thereby restoring the porphyrin aromaticity. The UV/visible spectra of the Ni(II) DIPDs exhibit remarkable redshifts of the lowest-energy bands to 780 nm, and differential pulse voltammetry reveals a contracted electrochemical HOMO–LUMO gap of 1.44 V. Density functional theory (DFT) was used to calculate the optimized geometries and frontier molecular orbitals of model 5,10-DIPD Ni7c and 5,10-bis(azocarboxylate) Ni8c. The conformations of the carbamate groups and the configurations of the CNZ unit were considered in conjunction with the NOESY spectra, to generate the global minimum geometry and two other structures with slightly higher energies. In the absence of solution data regarding conformations, ten possible local minimum conformations were considered for Ni8c. Partition of the porphyrin macrocycle into tri- and monopyrrole fragments in Ni7c and the inclusion of terminal conjugating functional groups generate unique frontier molecular orbital distributions and a HOMO–LUMO transition with a strong element of charge transfer from the monopyrrole ring. Time-dependent DFT calculations were performed for the three lowest-energy structures of Ni7c and Ni8c, and weighting according to their energies allowed the prediction of the electronic spectra. The calculations reproduce the lower-energy regions of the spectra and the overall forms of the spectra with high accuracy, but agreement is not as good in the Soret region below 450 nm.
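The statement that the three lowest-energy structures were weighted "according to their energies" is illustrated below with a Boltzmann weighting at room temperature; both the weighting scheme and the relative energies are assumptions made for the sake of the example, not values from the study.

```python
import numpy as np

# Hedged illustration of weighting conformer spectra by relative energy.
# A Boltzmann weighting at 298 K with placeholder relative energies is assumed.

KB_T_KJ_MOL = 0.008314 * 298.15            # kT at 298 K in kJ/mol

rel_energies = np.array([0.0, 2.1, 3.5])   # hypothetical relative energies (kJ/mol)
weights = np.exp(-rel_energies / KB_T_KJ_MOL)
weights /= weights.sum()
print(weights)                             # fractional contribution of each structure

# A weighted spectrum would then be: sum_i weights[i] * spectrum_i(wavelength)
```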

Relevance: 100.00%

Abstract:

Experimental studies of Bi heteroepitaxy on Si(001) have recently uncovered a self-organised nanoline motif which has no detectable width dispersion. The Bi lines can be grown with an aspect ratio greater than 350:1. This paper describes a study of the nanoline geometry and electronic structure using a combination of scanning tunneling microscopy (STM) and ab initio theoretical methods. In particular, the effect that the lines have on the Si(001) surface structure at large length scales, l > 100 nm, is studied. It has been found that Bi line growth on surfaces that have regularly spaced single-height steps results in a 'preferred' domain orientation.