970 results for single system image


Relevance:

80.00%

Publisher:

Abstract:

Doctoral thesis, Informatics (Informatics Engineering), Universidade de Lisboa, Faculdade de Ciências, 2014

Relevance:

80.00%

Publisher:

Abstract:

Marxian thinking following the Temporal Single System Interpretation (TSSI) of Marx is applied to refute the allegation of a tautology in the resource-based view of the firm, while also explaining how and why resources create value, where resources are synonymous with Marx's categories of constant and variable capital. Refuting the allegation naturally leads to the holy grail of resource-based thinking, i.e. the question of what, conceptually, constitutes a firm's competitive advantage within its industry context. The article achieves its objectives by tying the resource-based view into Marx's theory of value.

Relevance:

80.00%

Publisher:

Abstract:

In this paper we describe a system for underwater navigation with AUVs in partially structured environments, such as dams, ports or marine platforms. An imaging sonar is used to obtain information about the location of planar structures present in such environments. This information is incorporated into a feature-based SLAM algorithm in a two-step process: (1) the full 360° sonar scan is undistorted (to compensate for vehicle motion), thresholded and segmented to determine which measurements correspond to planar environment features and which should be ignored; and (2) SLAM proceeds once the data association is obtained: both the vehicle motion and the measurements whose correct association has been previously determined are incorporated into the SLAM algorithm. This two-step delayed SLAM process makes it possible to robustly determine the feature and vehicle locations in the presence of large amounts of spurious or unrelated measurements that might correspond to boats, rocks, etc. Preliminary experiments show the viability of the proposed approach.
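
The delayed, two-step structure described above can be sketched as follows. This is a toy illustration only, assuming a hypothetical DelayedSLAM container rather than the authors' filter: segmented sonar measurements are buffered until their data association is resolved, and only then are the vehicle motion and the associated measurements applied (a real implementation would use an EKF or similar estimator in place of the crude blending shown here).

```python
import numpy as np
from dataclasses import dataclass, field

@dataclass
class DelayedSLAM:
    """Toy container illustrating the delayed two-step update (not the paper's code)."""
    pose: np.ndarray = field(default_factory=lambda: np.zeros(3))  # x, y, heading
    features: list = field(default_factory=list)                   # planar feature parameters
    pending: list = field(default_factory=list)                    # buffered (motion, measurements)

    def buffer_scan(self, motion, measurements):
        # Step 1 has already undistorted, thresholded and segmented the scan;
        # its output is only buffered here until data association is available.
        self.pending.append((np.asarray(motion, dtype=float), measurements))

    def resolve_and_update(self, associations):
        # Step 2: apply the buffered motion and only the associated measurements.
        for (motion, measurements), assoc in zip(self.pending, associations):
            self.pose += motion
            for m, idx in zip(measurements, assoc):
                if idx is None:                      # spurious return (boat, rock, ...)
                    continue
                m = np.asarray(m, dtype=float)
                if idx == len(self.features):        # newly observed planar feature
                    self.features.append(m)
                else:                                # crude stand-in for a filter update
                    self.features[idx] = 0.5 * (self.features[idx] + m)
        self.pending.clear()
```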

Relevance:

80.00%

Publisher:

Abstract:

In this degree project, a tool was designed to determine a company's internationalization potential by identifying the most important strategic areas and evaluating them so that they can be placed on a scale. Different theories are presented to develop the tool. The tool is applied to a Colombian glass-producing SME. The aim of the tool is to standardize all the defined areas and project them as a single system in order to advance to the next level of internationalization. Colombian companies need to compete in the global market in order to develop practices and knowledge that generate a lasting advantage and long-term positioning.

Relevance:

80.00%

Publisher:

Abstract:

A traditional method of validating the performance of a flood model when remotely sensed data of the flood extent are available is to compare the predicted flood extent to that observed. The performance measure employed often uses areal pattern-matching to assess the degree to which the two extents overlap. Recently, remote sensing of flood extents using synthetic aperture radar (SAR) and airborne scanning laser altimetry (LiDAR) has made the synoptic measurement of water surface elevations along flood waterlines more straightforward, and this has emphasised the possibility of using alternative performance measures based on height. This paper considers the advantages that can accrue from using a performance measure based on waterline elevations rather than one based on areal patterns of wet and dry pixels. The two measures were compared for their ability to estimate flood inundation uncertainty maps from a set of model runs carried out to span the acceptable model parameter range in a GLUE-based analysis. A 1 in 5-year flood on the Thames in 1992 was used as a test event. As is typical for UK floods, only a single SAR image of observed flood extent was available for model calibration and validation. A simple implementation of a two-dimensional flood model (LISFLOOD-FP) was used to generate model flood extents for comparison with that observed. The performance measure based on height differences of corresponding points along the observed and modelled waterlines was found to be significantly more sensitive to the channel friction parameter than the measure based on areal patterns of flood extent. The former was able to restrict the parameter range of acceptable model runs and hence reduce the number of runs necessary to generate an inundation uncertainty map. As a result, there was less uncertainty in the final flood risk map. The uncertainty analysis included the effects of uncertainties in the observed flood extent as well as in model parameters. The height-based measure was found to be more sensitive when increased heighting accuracy was achieved by requiring that observed waterline heights varied slowly along the reach. The technique allows the reach to be decomposed into sections, with different effective channel friction parameters used in different sections, which in this case resulted in lower r.m.s. height differences between observed and modelled waterlines than those achieved by runs using a single friction parameter for the whole reach. However, a validation of the modelled inundation uncertainty using the calibration event showed a significant difference between the uncertainty map and the observed flood extent. While this was true for both measures, the difference was especially significant for the height-based one. This is likely to be due to the conceptually simple flood inundation model and the coarse application resolution employed in this case. The increased sensitivity of the height-based measure may place an increased onus on the model developer in the production of a valid model.
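
The two performance measures contrasted above can be expressed compactly. The sketch below is a minimal Python illustration, assuming one common form of the areal fit statistic (intersection over union of wet pixels) and a root-mean-square waterline height difference; neither function is taken from the paper.

```python
import numpy as np

def areal_fit(observed_wet, modelled_wet):
    # Pattern-matching measure: correctly predicted wet area divided by the
    # union of observed and modelled flood extents (one common form of the
    # measure, not necessarily the exact statistic used in the paper).
    obs = np.asarray(observed_wet, dtype=bool)
    mod = np.asarray(modelled_wet, dtype=bool)
    return (obs & mod).sum() / (obs | mod).sum()

def waterline_height_rmse(observed_heights, modelled_heights):
    # Height-based measure: r.m.s. difference between water surface elevations
    # at corresponding points along the observed and modelled waterlines.
    diff = np.asarray(observed_heights, dtype=float) - np.asarray(modelled_heights, dtype=float)
    return np.sqrt(np.mean(diff ** 2))
```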

Relevance:

80.00%

Publisher:

Abstract:

Intact, enveloped coronavirus particles vary widely in size and contour, and are thus refractory to study by traditional structural means such as X-ray crystallography. Electron microscopy (EM) overcomes some problems associated with particle variability and has been an important tool for investigating coronavirus ultrastructure. However, EM sample preparation requires that the specimen be dried onto a carbon support film before imaging, collapsing internal particle structure in the case of coronaviruses. Moreover, conventional EM achieves image contrast by immersing the specimen briefly in heavy-metal-containing stain, which reveals some features while obscuring others. Electron cryomicroscopy (cryo-EM) instead employs a porous support film, to which the specimen is adsorbed and flash-frozen. Specimens preserved in vitreous ice over holes in the support film can then be imaged without additional staining. Cryo-EM, coupled with single-particle image analysis techniques, makes it possible to examine the size, structure and arrangement of coronavirus structural components in fully hydrated, native virions. Two virus purification procedures are described.

Relevance:

80.00%

Publisher:

Abstract:

Oxidized low-density lipoprotein (oxLDL) exhibits many atherogenic effects, including the promotion of monocyte recruitment to the arterial endothelium and the induction of scavenger receptor expression. However, while atherosclerosis involves chronic inflammation within the arterial intima, it is unclear whether oxLDL alone provides a direct inflammatory stimulus for monocyte-macrophages. Furthermore, oxLDL is not a single, well-defined entity, but has structural and physical properties which vary according to the degree of oxidation. We tested the hypothesis that the biological effects of oxLDL vary according to its degree of oxidation and that some species of oxLDL have atherogenic properties, while other species may be responsible for its inflammatory activity. The atherogenic and inflammatory properties of LDL oxidized to predetermined degrees (mild, moderate and extensive oxidation) were investigated in a single system using human monocyte-derived macrophages. Expression of CD36 mRNA was up-regulated by mildly and moderately oxidized LDL, but not by highly oxidized LDL. The expression of the transcription factor peroxisome proliferator-activated receptor-gamma (PPARγ), which has been proposed to positively regulate the expression of CD36, was increased to the greatest degree by highly oxidized LDL. However, the DNA binding activity of PPARγ was increased only by mildly and moderately oxidized LDL. None of the oxLDL species appeared to be pro-inflammatory towards monocytes, either directly or indirectly through mediators derived from lymphocytes, regardless of the degree of oxidation.

Relevance:

80.00%

Publisher:

Abstract:

We consider two weakly coupled systems and adopt a perturbative approach based on the Ruelle response theory to study their interaction. We propose a systematic way of parameterizing the effect of the coupling as a function of only the variables of a system of interest. Our focus is on describing the impacts of the coupling on the long term statistics rather than on the finite-time behavior. By direct calculation, we find that, at first order, the coupling can be surrogated by adding a deterministic perturbation to the autonomous dynamics of the system of interest. At second order, there are additionally two separate and very different contributions. One is a term taking into account the second-order contributions of the fluctuations in the coupling, which can be parameterized as a stochastic forcing with given spectral properties. The other one is a memory term, coupling the system of interest to its previous history, through the correlations of the second system. If these correlations are known, this effect can be implemented as a perturbation with memory on the single system. In order to treat this case, we present an extension to Ruelle's response theory able to deal with integral operators. We discuss our results in the context of other methods previously proposed for disentangling the dynamics of two coupled systems. We emphasize that our results do not rely on assuming a time scale separation, and, if such a separation exists, can be used equally well to study the statistics of the slow variables and that of the fast variables. By recursively applying the technique proposed here, we can treat the general case of multi-level systems.
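
Schematically, and in notation chosen here for illustration only (the symbols below are not taken from the paper), the parameterization described above replaces the coupling with a deterministic first-order correction plus second-order stochastic and memory terms:

```latex
% x: variables of the system of interest; F: its autonomous dynamics;
% \varepsilon: coupling strength; M: deterministic first-order correction;
% \xi: noise with prescribed spectral properties; h: memory kernel determined
% by the correlations of the second system. All notation is illustrative.
\begin{equation}
  \dot{x}(t) \;=\; F\big(x(t)\big)
  \;+\; \varepsilon\, M\big(x(t)\big)
  \;+\; \varepsilon^{2}\, \sigma\big(x(t)\big)\,\xi(t)
  \;+\; \varepsilon^{2} \int_{0}^{\infty} h\big(\tau, x(t-\tau)\big)\, \mathrm{d}\tau .
\end{equation}
```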

Relevance:

80.00%

Publisher:

Abstract:

The topography of many floodplains in the developed world has now been surveyed with high resolution sensors such as airborne LiDAR (Light Detection and Ranging), giving accurate Digital Elevation Models (DEMs) that facilitate accurate flood inundation modelling. This is not always the case for remote rivers in developing countries. However, the accuracy of DEMs produced for modelling studies on such rivers should be enhanced in the near future by the high resolution TanDEM-X WorldDEM. In a parallel development, increasing use is now being made of flood extents derived from high resolution Synthetic Aperture Radar (SAR) images for calibrating, validating and assimilating observations into flood inundation models in order to improve these. This paper discusses an additional use of SAR flood extents, namely to improve the accuracy of the TanDEM-X DEM in the floodplain covered by the flood extents, thereby permanently improving this DEM for future flood modelling and other studies. The method is based on the fact that for larger rivers the water elevation generally changes only slowly along a reach, so that the boundary of the flood extent (the waterline) can be regarded locally as a quasi-contour. As a result, heights of adjacent pixels along a small section of waterline can be regarded as samples with a common population mean. The height of the central pixel in the section can be replaced with the average of these heights, leading to a more accurate estimate. While this reduces the height errors along a waterline, the waterline is only a linear feature in a two-dimensional space. However, improvements to the DEM heights between adjacent pairs of waterlines can also be made, because DEM heights enclosed by the higher waterline of a pair must be no higher than the corrected heights along the higher waterline, whereas DEM heights not enclosed by the lower waterline must in general be no lower than the corrected heights along the lower waterline. In addition, DEM heights between the higher and lower waterlines can also be assigned smaller errors because of the reduced errors on the corrected waterline heights. The method was tested on a section of the TanDEM-X Intermediate DEM (IDEM) covering an 11 km reach of the Warwickshire Avon, England. Flood extents from four COSMO-SkyMed images were available at various stages of a flood in November 2012, and a LiDAR DEM was available for validation. In the area covered by the flood extents, the original IDEM heights had a mean difference from the corresponding LiDAR heights of 0.5 m with a standard deviation of 2.0 m, while the corrected heights had a mean difference of 0.3 m with a standard deviation of 1.2 m. These figures show that significant reductions in IDEM height bias and error can be made using the method, with the corrected error being only 60% of the original. Even if only a single SAR image obtained near the peak of the flood was used, the corrected error was only 66% of the original. The method should also be capable of improving the final TanDEM-X DEM and other DEMs, and may also be of use with data from the SWOT (Surface Water and Ocean Topography) satellite.
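
A minimal sketch of the two ideas at the core of the method, written here as hypothetical Python helpers (the function names and the simple rectangular averaging window are assumptions, not the paper's implementation): heights are averaged along short waterline sections treated as quasi-contours, and DEM pixels between two waterlines are clamped by the corrected waterline heights.

```python
import numpy as np

def smooth_waterline_heights(heights, half_window=5):
    # Along-waterline averaging: because the waterline is locally a
    # quasi-contour, each waterline pixel height is replaced by the mean of a
    # short section centred on it, reducing the random error roughly by
    # 1/sqrt(n) for a section of n pixels.
    h = np.asarray(heights, dtype=float)
    smoothed = np.empty_like(h)
    for i in range(len(h)):
        lo, hi = max(0, i - half_window), min(len(h), i + half_window + 1)
        smoothed[i] = h[lo:hi].mean()
    return smoothed

def clamp_between_waterlines(dem, inside_higher, inside_lower, z_higher, z_lower):
    # Constrain DEM pixels between two corrected waterlines: pixels enclosed by
    # the higher waterline are capped at its corrected height, and pixels not
    # enclosed by the lower waterline are floored at its corrected height
    # (a simplified, scalar-height version of the constraints described above).
    out = np.asarray(dem, dtype=float).copy()
    out[inside_higher] = np.minimum(out[inside_higher], z_higher)
    out[~inside_lower] = np.maximum(out[~inside_lower], z_lower)
    return out
```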

Relevance:

80.00%

Publisher:

Abstract:

A set of four eddy-permitting global ocean reanalyses produced in the framework of the MyOcean project has been compared over the altimetry period 1993–2011. The main differences among the reanalyses used here come from the data assimilation scheme implemented to control the ocean state by inserting reprocessed observations of sea surface temperature (SST), in situ temperature and salinity profiles, sea level anomaly and sea-ice concentration. A first objective of this work is to assess the interannual variability and trends for a series of parameters usually considered in the community as essential ocean variables: SST, sea surface salinity, temperature and salinity averaged over meaningful layers of the water column, sea level, transports across pre-defined sections, and sea ice parameters. The eddy-permitting nature of the global reanalyses also makes it possible to estimate eddy kinetic energy. The results show that in general there is good consistency between the different reanalyses. An intercomparison against experiments without data assimilation was carried out during the MyOcean project, and we conclude that data assimilation is crucial for correctly simulating some quantities such as regional trends of sea level as well as the eddy kinetic energy. A second objective is to show that the ensemble mean of reanalyses can be evaluated as one single system regarding its reliability in reproducing the climate signals, where both variability and uncertainties are assessed through the ensemble spread and signal-to-noise ratio. The main advantage of having access to several reanalyses differing in the way data assimilation is performed is that it becomes possible to assess part of the total uncertainty. Given that we use very similar ocean models and atmospheric forcing, we can conclude that the spread of the ensemble of reanalyses is mainly representative of our ability to gauge uncertainty in the assimilation methods. This uncertainty varies considerably from one ocean parameter to another, especially for global indices. However, despite several caveats in the design of the multi-system ensemble, the main conclusion from this study is that an eddy-permitting multi-system ensemble approach has become mature and our results provide a first step towards a systematic comparison of eddy-permitting global ocean reanalyses aimed at providing robust conclusions on the recent evolution of the oceanic state.
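
Treating the set of reanalyses as a single system amounts to computing ensemble statistics on co-located fields. The sketch below uses illustrative definitions (ensemble mean, spread as the inter-member standard deviation, and the ratio of the two as a signal-to-noise measure); the study's exact diagnostics may differ.

```python
import numpy as np

def ensemble_statistics(members):
    # members: list of arrays with identical shape (e.g. time x lat x lon),
    # one per reanalysis. Returns the ensemble mean, the spread (inter-member
    # standard deviation) and a simple signal-to-noise ratio.
    stack = np.stack(members)                 # shape: (member, ...)
    mean = stack.mean(axis=0)
    spread = stack.std(axis=0, ddof=1)
    snr = np.abs(mean) / np.where(spread > 0, spread, np.nan)
    return mean, spread, snr
```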

Relevance:

80.00%

Publisher:

Abstract:

Our aim in this paper is to robustly match frontal faces in the presence of extreme illumination changes, using only a single training image per person and a single probe image. In the illumination conditions we consider, which include those with the dominant light source placed behind and to the side of the user, directly above and pointing downwards or indeed below and pointing upwards, this is a most challenging problem. The presence of sharp cast shadows, large poorly illuminated regions of the face, quantum and quantization noise and other nuisance effects makes it difficult to extract a sufficiently discriminative yet robust representation. We introduce a representation based on image gradient directions near robust edges that correspond to characteristic facial features. Robust edges are extracted using a cascade of processing steps, each of which seeks to harness further discriminative information or normalize for a particular source of extra-personal appearance variability. The proposed representation was evaluated on the extremely difficult YaleB data set. Unlike most previous work, we include all available illuminations, train using a single image per person and match against a single probe image. In this challenging evaluation setup, the proposed gradient edge map achieved a 0.8% error rate, demonstrating nearly perfect receiver operating characteristic curve behaviour. This is by far the best performance reported in the literature for this setup, with the best previously proposed methods attaining error rates of approximately 6–7%.
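
As a rough illustration of a gradient-direction representation restricted to strong edges (a far simpler edge selection than the cascade described above; the function name and threshold are assumptions, not the paper's method):

```python
import numpy as np

def gradient_direction_map(image, magnitude_fraction=0.05):
    # Central-difference gradients; directions are kept only near "robust"
    # edges, crudely defined here as pixels whose gradient magnitude exceeds a
    # fraction of the maximum. Returns NaN away from the selected edges.
    img = np.asarray(image, dtype=float)
    gy, gx = np.gradient(img)
    magnitude = np.hypot(gx, gy)
    direction = np.arctan2(gy, gx)
    mask = magnitude > magnitude_fraction * magnitude.max()
    return np.where(mask, direction, np.nan)
```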

Relevance:

80.00%

Publisher:

Abstract:

This paper addresses the task of time-separated aerial image registration. The ability to solve this problem accurately and reliably is important for a variety of subsequent image understanding applications. The principal challenge lies in the extent and nature of transient appearance variation that a land area can undergo, such as that caused by changes in illumination conditions, seasonal variations, or occlusion by non-persistent objects (people, cars). Our work introduces several major novelties: (i) unlike previous work on aerial image registration, we approach the problem using a set-based paradigm; (ii) we show how local, pair-wise constraints in image space can be used to enforce a globally good registration using a constraints graph structure; (iii) we show how a simple holistic representation derived from raw aerial images can be used as a basic building block of the constraints graph in a manner that achieves both high registration accuracy and speed; and (iv) we introduce a new and, to the best of our knowledge, the only data corpus suitable for the evaluation of set-based aerial image registration algorithms. Using this data set, we demonstrate (i) that the proposed method outperforms the state of the art even for pair-wise registration, achieving greater accuracy and reliability while reducing the computational cost of the task, and (ii) that increasing the number of available images in a set consistently reduces the average registration error, with a substantial improvement already from a single additional image.
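
One generic way to realize a constraints-graph formulation, shown here only as an illustrative sketch and not as the paper's algorithm, is to treat images as nodes, pairwise transforms with confidence scores as edges, and to register every image to a reference frame by composing transforms along the most confident path (networkx is assumed for the graph operations):

```python
import numpy as np
import networkx as nx

def set_based_registration(pairwise, reference):
    # pairwise: dict mapping (i, j) -> (H, conf), where H is a 3x3 homography
    # taking image i coordinates to image j coordinates and conf its confidence.
    G = nx.Graph()
    for (i, j), (H, conf) in pairwise.items():
        G.add_edge(i, j, cost=1.0 / max(conf, 1e-9))   # high confidence = low cost

    def edge_transform(a, b):
        # The transform is stored for one direction only; invert if needed.
        if (a, b) in pairwise:
            return pairwise[(a, b)][0]
        return np.linalg.inv(pairwise[(b, a)][0])

    transforms = {reference: np.eye(3)}
    for node in G.nodes:
        if node == reference:
            continue
        path = nx.shortest_path(G, node, reference, weight="cost")
        T = np.eye(3)
        for a, b in zip(path, path[1:]):
            T = edge_transform(a, b) @ T    # step node coords one image closer to reference
        transforms[node] = T                # node coordinates -> reference coordinates
    return transforms
```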

Relevance:

80.00%

Publisher:

Abstract:

This work aims to propose guidelines for the internal control system of the Executive Branch of the State of Rio de Janeiro, so as to incorporate disciplinary activities, ombudsman offices, and transparency and corruption-prevention activities. The current state system is governed by Decree 43.463, of 14 February 2012, which left those activities out. The work contextualizes the importance of integrating such activities into a single system, in order to develop and maintain a permanent exchange of the information produced by each activity. Since internal control is an instrument of accountability, such integration can foster transparency and may contribute to social control initiatives. The premise for the guidelines proposed in this work is the federal model and the model of some states of the federation that have already structured their Executive Branch internal control systems to encompass internal control, disciplinary, ombudsman, and transparency and corruption-prevention activities.

Relevance:

80.00%

Publisher:

Abstract:

Tabletop computers featuring multi-touch input and object tracking are a common platform for research on Tangible User Interfaces (also known as Tangible Interaction). However, such systems are confined to sensing activity on the tabletop surface, disregarding the rich and relatively unexplored interaction canvas above the tabletop. This dissertation contributes tCAD, a 3D modeling tool combining fiducial marker tracking, finger tracking and depth sensing in a single system. It presents the technical details of how these features were integrated and attests to the approach's viability through the design, development and early evaluation of the tCAD application. A key aspect of this work is a description of the interaction techniques enabled by merging tracked objects with direct user input on and above a table surface.

Relevance:

80.00%

Publisher:

Abstract:

The productivity of lettuce groups and rocket grown in intercropping was evaluated relative to their sole crops at UNESP Jaboticabal, under field conditions, in two growing seasons, May to August and September to November 2001, in a randomized block design with four replications. The treatments consisted of combinations of the factors lettuce group (crisp, cv. Vera; smooth, cv. Elisa; and iceberg, cv. Tainá), cropping system (intercrop and sole crop) and rocket sowing date for establishing the intercrop [0, 7 and 14 days after lettuce transplanting (DAT)]. The highest lettuce fresh and dry matter was obtained in spring, most notably for iceberg lettuce. The lettuces were not affected by the cropping system. In autumn-winter, the highest rocket fresh matter was obtained in intercrops established at 0 DAT with the crisp and smooth lettuces and at 7 DAT with iceberg lettuce, whereas in spring it was obtained when rocket was intercropped at 7 DAT with crisp lettuce and at 0 DAT with the smooth and iceberg groups. Rocket dry matter was reduced in intercrops established late, at 14 DAT. The intercrops outperformed the sole crops by 5 to 93%, according to the land use efficiency index. The highest land use efficiency indices were obtained with the rocket and crisp lettuce intercrops at 0 DAT (1.93) in autumn-winter and with the same vegetables at 7 DAT (1.84) in spring.
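
The land use efficiency index referred to above is conventionally computed as the land equivalent ratio; the standard definition is given here for illustration (it is assumed, not quoted from the paper):

```latex
% Land equivalent ratio (LER): relative land area under sole crops needed to
% match the yields obtained from one unit of intercropped land.
\begin{equation}
  \mathrm{LER} \;=\;
  \frac{Y_{\text{lettuce}}^{\text{intercrop}}}{Y_{\text{lettuce}}^{\text{sole}}}
  \;+\;
  \frac{Y_{\text{rocket}}^{\text{intercrop}}}{Y_{\text{rocket}}^{\text{sole}}}
\end{equation}
% Under this definition, a value of 1.93 means that about 93% more sole-cropped
% land would be needed to produce the same yields as the intercrop.
```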