951 results for Single system image
Abstract:
Three-dimensional reconstruction from volumetric medical images (e.g. CT, MRI) is a well-established technology used in patient-specific modelling. However, there are many cases where only 2D (planar) images may be available, e.g. if the radiation dose must be limited or if retrospective data are being used from periods when 3D imaging was not available. This study aims to address such cases by proposing an automated method to create 3D surface models from planar radiographs. The method consists of (i) contour extraction from the radiograph using an Active Contour (Snake) algorithm, (ii) selection of the closest-matching 3D model from a library of generic models, and (iii) warping of the selected generic model to improve its correlation with the extracted contour.
This method proved to be fully automated, rapid and robust on a given set of radiographs. Mean surface distance errors, measured by comparing models reconstructed from matched pairs of CT scans and planar X-rays, were low (2.57–3.74 mm) and within the ranges reported in similar studies. Benefits of the method are that it requires only a single radiographic image to perform the surface reconstruction task and that it is fully automated. Mechanical simulations of loaded bone with different levels of reconstruction accuracy showed that the error in the predicted strain fields grows in proportion to the geometric error. In conclusion, models generated by the proposed technique are deemed acceptable for realistic patient-specific simulations when 3D data sources are unavailable.
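To make the three-step pipeline concrete, the following Python sketch shows one plausible shape it could take. It is not the study's implementation: the radiograph array, the template library (a dict mapping names to (N, 2) contour arrays resampled to a common point count) and all parameter values are assumptions for illustration, and the final warp is reduced to a 2D Procrustes alignment.

import numpy as np
from scipy.spatial import procrustes
from skimage.filters import gaussian
from skimage.segmentation import active_contour

def extract_contour(radiograph, n_points=200):
    # Step (i): fit a snake, initialised here as a circle around the image centre.
    rows, cols = radiograph.shape
    theta = np.linspace(0, 2 * np.pi, n_points)
    init = np.column_stack([rows / 2 + 0.4 * rows * np.sin(theta),
                            cols / 2 + 0.4 * cols * np.cos(theta)])
    smoothed = gaussian(radiograph, sigma=3, preserve_range=True)
    return active_contour(smoothed, init, alpha=0.015, beta=10, gamma=0.001)

def closest_template(contour, library):
    # Step (ii): pick the generic model whose 2D silhouette best matches the snake.
    # Procrustes analysis removes translation, scale and rotation before comparing,
    # and requires every template to have the same number of points as the contour.
    best_name, best_disparity = None, np.inf
    for name, template in library.items():
        _, _, disparity = procrustes(contour, template)
        if disparity < best_disparity:
            best_name, best_disparity = name, disparity
    return best_name

def warp_template(template, contour):
    # Step (iii): placeholder warp; the study's 3D model warping is richer than
    # this 2D alignment of the template silhouette towards the extracted contour.
    _, aligned, _ = procrustes(contour, template)
    return aligned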
Abstract:
The construction industry is inherently risky, with a significant number of accidents and disasters occurring, particularly on confined construction sites. This research investigates and identifies the various issues affecting the successful management of health and safety on confined construction sites. The rationale is that identifying these issues would assist the management of health and safety, particularly in inner-city centres, which are mostly confined sites. Grounded in an empiricist epistemology, the methodology was based on a qualitative research approach using multiple case studies in three different geographical locations: Ireland, the UK and the USA. Data on each case study were collected through individual interviews and focus group discussions with project participants. The findings suggest that three core issues underlie the management of health and safety on confined construction sites: (i) lack of space, (ii) problems of co-ordination and management of site personnel, and (iii) overcrowding of the workplace. The implication is that project teams and their organisations should see project processes from a holistic point of view, as a unified single system, where quick intervention in solving a particular issue should be the norm, so as not to adversely affect the interrelated sequence of events in project operations. Proactive strategies should be devised to mitigate these issues, and may include detailed project programming, space management, effective constructability review and efficient co-ordination of personnel, plant and materials, among others. The value of this research is to aid the management and operation of brownfield sites by identifying issues impacting health and safety management in the project process.
Abstract:
In this paper we demonstrate a simple and novel illumination model that can be used for illumination-invariant facial recognition. This model requires no prior knowledge of the illumination conditions and can be used when there is only a single training image per person. The proposed illumination model separates the effects of illumination over a small area of the face into two components: an additive component modelling the mean illumination, and a multiplicative component modelling the variance within the facial area. Illumination-invariant facial recognition is performed in a piecewise manner, by splitting the face image into blocks and then normalizing the illumination within each block based on the new lighting model. The assumptions underlying this novel lighting model have been verified on the YaleB face database. We show that magnitude 2D Fourier features can be used as robust facial descriptors within the new lighting model. Using only a single training image per person, our new method achieves high (in most cases 100%) identification accuracy on the YaleB, extended YaleB and CMU-PIE face databases.
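A minimal numpy sketch of the block-wise scheme described above, under stated assumptions: a grayscale face image as a 2D array and a hypothetical block size of 16 pixels; the paper's matching stage is not reproduced. Each block is normalized by removing the additive (mean) component and dividing out the multiplicative (variance) component, after which magnitude 2D Fourier features are taken per block.

import numpy as np

def normalize_blocks(face, block=16, eps=1e-8):
    # Remove the additive (mean) and multiplicative (variance) illumination
    # components independently within each block.
    out = face.astype(float).copy()
    for r in range(0, face.shape[0] - block + 1, block):
        for c in range(0, face.shape[1] - block + 1, block):
            patch = out[r:r + block, c:c + block]
            out[r:r + block, c:c + block] = (patch - patch.mean()) / (patch.std() + eps)
    return out

def fourier_features(face, block=16):
    # Magnitude 2D Fourier descriptors per block; discarding phase gives some
    # robustness to small misalignments within the block.
    normalized = normalize_blocks(face, block)
    feats = []
    for r in range(0, face.shape[0] - block + 1, block):
        for c in range(0, face.shape[1] - block + 1, block):
            feats.append(np.abs(np.fft.fft2(normalized[r:r + block, c:c + block])).ravel())
    return np.concatenate(feats)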
Abstract:
Education has a powerful and long-term effect on people's lives and therefore should be based on evidence of what works best. This assertion warrants a definition of what constitutes good research evidence. Two research designs that are often thought to come from diametrically opposed fields, single-subject research designs and randomised controlled trials, are described, and common features, such as the use of probabilistic assumptions and the aim of discovering causal relations, are delineated. Differences between the two research designs are also highlighted, and this is used as the basis to set out how the two designs might better be used to complement one another. Recommendations for future action are made accordingly.
Abstract:
Attempts to record, understand and respond to variations in child welfare and protection reporting, service patterns and outcomes are international, numerous and longstanding. Reframing such variations as an issue of inequity between children and between families opens the way to a new approach to explaining the profound difference in intervention rates between and within countries and administrative districts. Recent accounts of variation have frequently been based on the idea that there is a binary division between bias and risk (or need). Here we propose seeing supply (bias) and demand (risk) factors as two aspects of a single system, both framed, in part, by social structures. A recent finding from a study of intervention rates in England, the 'inverse intervention law', is used to illustrate the complex ways in which a range of factors interact to produce intervention rates. In turn, this analysis raises profound moral, policy, practice and research questions about current child welfare and child protection services.
Abstract:
In this paper, we introduce a novel approach to face recognition which simultaneously tackles three combined challenges: 1) uneven illumination; 2) partial occlusion; and 3) limited training data. The new approach performs lighting normalization, occlusion de-emphasis and finally face recognition, based on finding the largest matching area (LMA) at each point on the face, as opposed to traditional fixed-size local area-based approaches. Robustness is achieved with novel approaches for feature extraction, LMA-based face image comparison and unseen data modeling. On the extended YaleB and AR face databases for face identification, our method, using only a single training image per person, outperforms other methods using a single training image and matches or exceeds methods which require multiple training images. On the Labeled Faces in the Wild face verification database, our method outperforms comparable unsupervised methods. We also show that the new method performs competitively even when the training images are corrupted.
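One way to picture the LMA idea, as opposed to a fixed-size local area: grow a window around each point while the probe and gallery patches remain similar, and let the achieved size measure the match. The sketch below is only that intuition rendered in Python, not the authors' algorithm; the correlation threshold and window limits are hypothetical.

import numpy as np

def largest_matching_area(probe, gallery, row, col, max_half=32, min_corr=0.7):
    # Grow a square window around (row, col) while the two patches stay
    # sufficiently correlated; occluded or mismatched regions decorrelate
    # quickly and stop the growth early.
    half = 2
    while half <= max_half:
        r0, r1 = row - half, row + half + 1
        c0, c1 = col - half, col + half + 1
        if r0 < 0 or c0 < 0 or r1 > probe.shape[0] or c1 > probe.shape[1]:
            break
        a = probe[r0:r1, c0:c1].ravel().astype(float)
        b = gallery[r0:r1, c0:c1].ravel().astype(float)
        corr = np.corrcoef(a, b)[0, 1]
        if not np.isfinite(corr) or corr < min_corr:
            break
        half += 2
    return half - 2  # last half-width at which the patches still matched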
Abstract:
Information and communication technologies in healthcare are not merely an instrument for good information management, but a strategic factor for more efficient and safer care delivery. Information technologies are a pillar for health systems to evolve towards a citizen-centred model, in which a comprehensive set of patient information should be automatically available to the teams providing care, regardless of where it was generated (geographic location or system). This kind of safe, aggregated use of clinical information is undermined by the widespread fragmentation of health information system implementations. Several approaches have been proposed to overcome the limitations arising from these "islands of information" in healthcare, ranging from total centralization (a single system) to decentralized networks for exchanging clinical messages. In this work, we propose a service-based unification layer built by federating heterogeneous information sources. This clinical information aggregator provides the foundation for developing applications with a regional scope, which we demonstrated by implementing a virtual electronic health record system. Unlike the point-to-point clinical messaging methods popular in health systems integration, we developed a middleware following J2EE architecture patterns, in which the federated information is expressed as an object model accessible through programming interfaces. The proposed architecture was instantiated in the Rede Telemática de Saúde, a platform deployed in the Aveiro region that connects eight partner institutions (two hospitals and six health centres), covers ~350,000 citizens, is used by ~350 registered professionals and provides access to more than 19,000,000 episodes. Beyond the regional collaborative health platform (RTSys), we introduce a second line of research, seeking to bridge care delivery networks and scientific computing networks. In this second scenario, we propose using Grid computing models to enable the massive use and integration of biomedical information. The proposed architecture (not implemented) provides access to existing e-Science infrastructures to create clinical information repositories for health applications.
Abstract:
Doctoral thesis, Informatics (Informatics Engineering), Universidade de Lisboa, Faculdade de Ciências, 2014
Abstract:
Marxian thinking following the TSSI (Temporal Single System Interpretation) of Marx is applied to refute the allegation of a tautology in the resource-based view of the firm, paired with an explanation of how and why resources create value, where resources are synonymous with Marx's categories of constant and variable capital. Refuting the allegation naturally leads to the holy grail of resource-based thinking, i.e. the question of what, conceptually, constitutes a firm's competitive advantage within its industry context. The article achieves its objectives by tying the resource-based view into Marx's theory of value.
Abstract:
In this paper we describe a system for underwater navigation with AUVs in partially structured environments, such as dams, ports or marine platforms. An imaging sonar is used to obtain information about the location of planar structures present in such environments. This information is incorporated into a feature-based SLAM algorithm in a two-step process: (1) the full 360° sonar scan is undistorted (to compensate for vehicle motion), thresholded and segmented to determine which measurements correspond to planar environment features and which should be ignored; and (2) SLAM proceeds once the data association is obtained: both the vehicle motion and the measurements whose correct association has been previously determined are incorporated into the SLAM algorithm. This two-step delayed SLAM process allows the feature and vehicle locations to be determined robustly in the presence of large amounts of spurious or unrelated measurements that might correspond to boats, rocks, etc. Preliminary experiments show the viability of the proposed approach.
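The control flow of the two-step delayed process can be sketched as follows. This is a deliberately simplified, self-contained toy, not the paper's implementation: features are reduced to (angle, range) line parameters, association is a nearest-neighbour gate, and a running average stands in for the SLAM update; it only illustrates that association is resolved before anything enters the filter.

import numpy as np

def associate(measured_lines, landmarks, angle_tol=0.2, range_tol=0.5):
    # Step 1 output: (measurement, landmark index) pairs; unmatched returns
    # are treated as spurious (boats, rocks, ...) and dropped.
    matches = []
    for line in measured_lines:
        for i, lm in enumerate(landmarks):
            if abs(line[0] - lm[0]) < angle_tol and abs(line[1] - lm[1]) < range_tol:
                matches.append((line, i))
                break
    return matches

def delayed_update(landmarks, counts, measured_lines):
    # Step 2: fold only pre-associated measurements into the map estimate.
    for line, i in associate(measured_lines, landmarks):
        counts[i] += 1
        landmarks[i] += (line - landmarks[i]) / counts[i]  # running average
    return landmarks, counts

landmarks = np.array([[0.0, 2.0], [1.5, 4.0]])           # two known planar features
counts = np.ones(len(landmarks))
scan = np.array([[0.05, 2.1], [1.45, 3.9], [0.8, 9.0]])  # last return is spurious
landmarks, counts = delayed_update(landmarks, counts, scan)
print(landmarks)                                         # the (0.8, 9.0) return was ignored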
Abstract:
In this undergraduate project, a tool was designed to determine a company's internationalization potential, identifying the most important strategic areas and evaluating them so they can be placed on a scale. Different theories used to develop the tool are presented, and the tool is applied to a Colombian SME that manufactures glass. The contribution of the tool is that it standardizes all the defined areas and projects them as a single system, so the company can advance to the next level of internationalization. Colombian companies need to compete in the global market in order to develop practices and knowledge that generate a long-term advantage and positioning.
Abstract:
A traditional method of validating the performance of a flood model when remotely sensed data of the flood extent are available is to compare the predicted flood extent to that observed. The performance measure employed often uses areal pattern-matching to assess the degree to which the two extents overlap. Recently, remote sensing of flood extents using synthetic aperture radar (SAR) and airborne scanning laser altimetry (LIDAR) has made the synoptic measurement of water surface elevations along flood waterlines more straightforward, and this has emphasised the possibility of using alternative performance measures based on height. This paper considers the advantages that can accrue from using a performance measure based on waterline elevations rather than one based on areal patterns of wet and dry pixels. The two measures were compared for their ability to estimate flood inundation uncertainty maps from a set of model runs carried out to span the acceptable model parameter range in a GLUE-based analysis. A 1-in-5-year flood on the Thames in 1992 was used as a test event. As is typical for UK floods, only a single SAR image of observed flood extent was available for model calibration and validation. A simple implementation of a two-dimensional flood model (LISFLOOD-FP) was used to generate model flood extents for comparison with that observed. The performance measure based on height differences of corresponding points along the observed and modelled waterlines was found to be significantly more sensitive to the channel friction parameter than the measure based on areal patterns of flood extent. The former was able to restrict the parameter range of acceptable model runs and hence reduce the number of runs necessary to generate an inundation uncertainty map. As a result, there was less uncertainty in the final flood risk map. The uncertainty analysis included the effects of uncertainties in the observed flood extent as well as in model parameters. The height-based measure was found to be more sensitive when increased heighting accuracy was achieved by requiring that observed waterline heights varied slowly along the reach. The technique allows for the decomposition of the reach into sections, with different effective channel friction parameters used in different sections, which in this case resulted in lower r.m.s. height differences between observed and modelled waterlines than those achieved by runs using a single friction parameter for the whole reach. However, a validation of the modelled inundation uncertainty using the calibration event showed a significant difference between the uncertainty map and the observed flood extent. While this was true for both measures, the difference was especially significant for the height-based one. This is likely to be due to the conceptually simple flood inundation model and the coarse application resolution employed in this case. The increased sensitivity of the height-based measure may lead to an increased onus being placed on the model developer in the production of a valid model.
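For reference, the two kinds of performance measure contrasted above can be written compactly. The sketch below assumes boolean wet/dry rasters for the observed and modelled extents and paired waterline elevations; the GLUE weighting and the LISFLOOD-FP runs themselves are outside its scope.

import numpy as np

def areal_fit(observed_wet, modelled_wet):
    # Pattern-matching measure: overlap of wet areas (intersection over union).
    both = np.logical_and(observed_wet, modelled_wet).sum()
    either = np.logical_or(observed_wet, modelled_wet).sum()
    return both / either

def waterline_rmse(observed_heights, modelled_heights):
    # Height-based measure: r.m.s. elevation difference at corresponding points
    # along the observed and modelled waterlines.
    diff = np.asarray(observed_heights) - np.asarray(modelled_heights)
    return np.sqrt(np.mean(diff ** 2))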
Abstract:
Intact, enveloped coronavirus particles vary widely in size and contour, and are thus refractory to study by traditional structural means such as X-ray crystallography. Electron microscopy (EM) overcomes some problems associated with particle variability and has been an important tool for investigating coronavirus ultrastructure. However, EM sample preparation requires that the specimen be dried onto a carbon support film before imaging, collapsing internal particle structure in the case of coronaviruses. Moreover, conventional EM achieves image contrast by immersing the specimen briefly in heavy-metal-containing stain, which reveals some features while obscuring others. Electron cryomicroscopy (cryo-EM) instead employs a porous support film, to which the specimen is adsorbed and flash-frozen. Specimens preserved in vitreous ice over holes in the support film can then be imaged without additional staining. Cryo-EM, coupled with single-particle image analysis techniques, makes it possible to examine the size, structure and arrangement of coronavirus structural components in fully hydrated, native virions. Two virus purification procedures are described.
Abstract:
Oxidized low-density lipoprotein (oxLDL) exhibits many atherogenic effects, including the promotion of monocyte recruitment to the arterial endothelium and the induction of scavenger receptor expression. However, while atherosclerosis involves chronic inflammation within the arterial intima, it is unclear whether oxLDL alone provides a direct inflammatory stimulus for monocyte-macrophages. Furthermore, oxLDL is not a single, well-defined entity, but has structural and physical properties which vary according to the degree of oxidation. We tested the hypothesis that the biological effects of oxLDL will vary according to its degree of oxidation and that some species of oxLDL will have atherogenic properties, while other species may be responsible for its inflammatory activity. The atherogenic and inflammatory properties of LDL oxidized to predetermined degrees (mild, moderate and extensive oxidation) were investigated in a single system using human monocyte-derived macrophages. Expression of CD36 mRNA was up-regulated by mildly and moderately oxidized LDL, but not by highly oxidized LDL. The expression of the transcription factor peroxisome proliferator-activated receptor-gamma (PPARgamma), which has been proposed to positively regulate the expression of CD36, was increased to the greatest degree by highly oxidized LDL. However, the DNA-binding activity of PPARgamma was increased only by mildly and moderately oxidized LDL. None of the oxLDL species appeared to be pro-inflammatory towards monocytes, either directly or indirectly through mediators derived from lymphocytes, regardless of the degree of oxidation.
Abstract:
We consider two weakly coupled systems and adopt a perturbative approach based on the Ruelle response theory to study their interaction. We propose a systematic way of parameterizing the effect of the coupling as a function of only the variables of a system of interest. Our focus is on describing the impacts of the coupling on the long term statistics rather than on the finite-time behavior. By direct calculation, we find that, at first order, the coupling can be surrogated by adding a deterministic perturbation to the autonomous dynamics of the system of interest. At second order, there are additionally two separate and very different contributions. One is a term taking into account the second-order contributions of the fluctuations in the coupling, which can be parameterized as a stochastic forcing with given spectral properties. The other one is a memory term, coupling the system of interest to its previous history, through the correlations of the second system. If these correlations are known, this effect can be implemented as a perturbation with memory on the single system. In order to treat this case, we present an extension to Ruelle's response theory able to deal with integral operators. We discuss our results in the context of other methods previously proposed for disentangling the dynamics of two coupled systems. We emphasize that our results do not rely on assuming a time scale separation, and, if such a separation exists, can be used equally well to study the statistics of the slow variables and that of the fast variables. By recursively applying the technique proposed here, we can treat the general case of multi-level systems.
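Schematically, and in generic notation rather than the paper's own symbols, the parameterization described above gives the system of interest x, coupled with strength \epsilon to a second system, the effective dynamics

\dot{x} = F(x) + \epsilon\, D(x) + \epsilon^{2}\, \sigma(x)\, \eta(t) + \epsilon^{2} \int_{0}^{\infty} h(\tau)\, G\big(x(t-\tau)\big)\, \mathrm{d}\tau ,

where D is the first-order deterministic correction, \eta is a stochastic forcing whose spectral properties are set by the fluctuations of the coupling, and the memory kernel h is determined by the lagged correlations of the second system. The three terms correspond, in order, to the first-order contribution and the two second-order contributions discussed above.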