996 results for "quality map"
Abstract:
Postgraduate Programme in Geography - FCT
Abstract:
Automatically recognizing faces captured in uncontrolled environments has been a challenging topic over the past decades. In this work, we investigate cohort score normalization, which has been widely used in biometric verification, as a means to improve the robustness of face recognition under challenging conditions. In particular, we introduce cohort score normalization into the undersampled face recognition problem. Further, we develop an effective cohort normalization method specifically for the unconstrained face pair matching problem. Extensive experiments conducted on several well-known face databases demonstrate the effectiveness of cohort normalization in these challenging scenarios. In addition, to give a proper understanding of cohort behavior, we study the impact of the number and quality of cohort samples on normalization performance. The experimental results show that a larger cohort set gives more stable, and often better, results up to a point at which performance saturates, and that cohort samples of different quality indeed produce different normalization performance. Recognizing faces that have undergone alterations is another challenging problem for current face recognition algorithms. Face image alterations can be roughly classified into two categories: unintentional (e.g., geometric transformations introduced by the acquisition device) and intentional alterations (e.g., plastic surgery). We study the impact of these alterations on face recognition accuracy. Our results show that state-of-the-art algorithms are able to overcome limited digital alterations but are sensitive to more substantial modifications. Further, we develop two useful descriptors for detecting those alterations which can significantly affect recognition performance. Finally, we propose to use the Structural Similarity (SSIM) quality map to detect and model variations due to plastic surgery. Extensive experiments conducted on a plastic surgery face database demonstrate the potential of the SSIM map for matching face images after surgery.
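Below is a minimal sketch of the two ingredients this abstract relies on: a generic T-norm-style cohort score normalization (the paper develops its own variant, so this is only the standard scheme) and the SSIM quality map computed with scikit-image. All data and variable names are illustrative.

import numpy as np
from skimage.metrics import structural_similarity

def cohort_normalize(raw_score, cohort_scores):
    # Generic T-norm-style cohort normalization: rescale a raw match score
    # by the mean and spread of scores obtained against a cohort of
    # non-matching samples (not necessarily the paper's exact method).
    cohort_scores = np.asarray(cohort_scores, dtype=float)
    return (raw_score - cohort_scores.mean()) / (cohort_scores.std() + 1e-12)

def ssim_quality_map(face_a, face_b):
    # Global SSIM score plus the per-pixel SSIM quality map for two aligned
    # grayscale face images of equal shape (uint8 arrays).
    score, ssim_map = structural_similarity(face_a, face_b, full=True)
    return score, ssim_map

# Illustrative usage on synthetic data only.
rng = np.random.default_rng(0)
before = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)
after = np.clip(before.astype(int) + rng.integers(-20, 21, size=(128, 128)), 0, 255).astype(np.uint8)
score, quality_map = ssim_quality_map(before, after)
z_score = cohort_normalize(raw_score=0.73, cohort_scores=rng.normal(0.4, 0.1, size=200))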
Abstract:
In the literature on firm strategy and product differentiation, consumer price-quality trade-offs are sometimes represented using consumer 'value maps'. These represent, geometrically, the price and quality combinations among which consumers are indifferent, as points along curves that are concave to the 'quality' axis. In this paper, it is shown that the value map for price-quality trade-offs may be derived from a Hicksian compensated demand curve for product quality. The paper thereby provides a theoretical link between the analytical methods employed in the existing literature on firm strategy and competitive advantage and the broader body of economic analysis.
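A minimal sketch of the indifference condition behind such a value map, in generic notation (u = utility, q = quality, p = price, m = income) that is not taken from the paper:

u(q,\, m - p) = \bar{u}
\left.\frac{dp}{dq}\right|_{u=\bar{u}} = \frac{\partial u/\partial q}{\partial u/\partial (m - p)}

Holding utility at \bar{u} traces out one value curve, and its slope is the marginal willingness to pay for quality. If the marginal utility of quality diminishes while the marginal utility of money stays roughly constant, the slope falls as q rises, which is what produces curves concave to the quality axis.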
Abstract:
Leadership and Management in Engineering, January 2009
Abstract:
Geospatial information of many kinds, from topographic maps to scientific data, is increasingly being made available through web mapping services. These allow georeferenced map images to be served from data stores and displayed in websites and geographic information systems, where they can be integrated with other geographic information. The Open Geospatial Consortium’s Web Map Service (WMS) standard has been widely adopted in diverse communities for sharing data in this way. However, current services typically provide little or no information about the quality or accuracy of the data they serve. In this paper we will describe the design and implementation of a new “quality-enabled” profile of WMS, which we call “WMS-Q”. This describes how information about data quality can be transmitted to the user through WMS. Such information can exist at many levels, from entire datasets to individual measurements, and includes the many different ways in which data uncertainty can be expressed. We also describe proposed extensions to the Symbology Encoding specification, which include provision for visualizing uncertainty in raster data in a number of different ways, including contours, shading and bivariate colour maps. We shall also describe new open-source implementations of the new specifications, which include both clients and servers.
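For readers unfamiliar with the underlying protocol, the sketch below shows the general shape of a standard OGC WMS 1.3.0 GetMap request; the server URL, layer name and uncertainty style are purely hypothetical placeholders, since the actual vocabularies are defined by the WMS-Q profile and Symbology Encoding extensions described in the paper.

from urllib.parse import urlencode

# Standard WMS 1.3.0 GetMap parameters; LAYERS and STYLES below are
# hypothetical, merely suggesting how an uncertainty visualization
# (e.g. contours over the mean field) might be requested.
params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "sea_surface_temperature",   # hypothetical layer name
    "STYLES": "uncertainty-contours",      # hypothetical style name
    "CRS": "CRS:84",
    "BBOX": "-10,45,5,60",
    "WIDTH": "800",
    "HEIGHT": "600",
    "FORMAT": "image/png",
}
getmap_url = "https://example.org/wms-q?" + urlencode(params)
print(getmap_url)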
Abstract:
Landslide hazard and risk are growing as a consequence of climate change and demographic pressure. Land-use planning represents a powerful tool to manage this socio-economic problem and build sustainable and landslide-resilient communities. Landslide inventory maps are a cornerstone of land-use planning and, consequently, their quality assessment represents a burning issue. This work aimed to define the quality parameters of a landslide inventory and assess its spatial and temporal accuracy with regard to its possible applications to land-use planning. To this end, I proceeded according to a two-step approach. An overall assessment of the accuracy of the geographic positioning of the data was performed on four case study sites located in the Italian Northern Apennines. The quantification of the overall spatial and temporal accuracy, instead, focused on the Dorgola Valley (Province of Reggio Emilia). The assessment of spatial accuracy involved a comparison between remotely sensed and field survey data, as well as an innovative fuzzy-like analysis of a multi-temporal landslide inventory map. Long- and short-term landslide temporal persistence, in turn, was appraised over a period of 60 years with the aid of 18 remotely sensed image sets. These results were eventually compared with the current Territorial Plan for Provincial Coordination (PTCP) of the Province of Reggio Emilia. The outcome of this work suggests that geomorphologically detected and mapped landslides are a significant approximation of a more complex reality. In order to convey this intrinsic uncertainty to end-users, a new form of cartographic representation is needed; a fuzzy raster landslide map may be an option. With regard to land-use planning, landslide inventory maps, if appropriately updated, were confirmed to be essential decision-support tools. This research, however, proved that their spatial and temporal uncertainty discourages any direct use as zoning maps, especially when zoning itself is associated with statutory or advisory regulations.
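One simple way to encode the kind of fuzzy raster landslide map suggested above is to let each cell's membership value reflect how consistently it was mapped as landslide across the multi-temporal inventories; the sketch below is illustrative only and is not the analysis actually used in the thesis.

import numpy as np

def fuzzy_landslide_raster(inventories):
    # Given a list of co-registered binary landslide rasters (1 = mapped as
    # landslide) from different epochs or interpreters, return a membership
    # raster in [0, 1]: the fraction of inventories mapping each cell as landslide.
    stack = np.stack([np.asarray(inv, dtype=float) for inv in inventories])
    return stack.mean(axis=0)

# Illustrative usage with three tiny synthetic 4x4 inventories.
rng = np.random.default_rng(1)
inventories = [rng.integers(0, 2, size=(4, 4)) for _ in range(3)]
membership = fuzzy_landslide_raster(inventories)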
Abstract:
Marker-assisted selection depends on the identification of tightly linked associations between markers and the trait of interest. In the present work, functional (EST-SSR) and genomic (gSSR) microsatellite markers were used to detect putative QTLs for sugarcane yield components (stalk number, diameter and height) as well as for quality parameters (Brix, Pol and fibre) in plant cane. The mapping population (200 individuals) was derived from a bi-parental cross (IACSP95-3018 x IACSP93-3046) from the IAC Sugarcane Breeding Program. As the map is still under construction, single-marker trait association analysis based on the likelihood ratio test was undertaken to detect the QTLs. Of the 215 single-dose markers evaluated (1:1 and 3:1), 90 (42%) were associated with putative QTLs involving 43 microsatellite primers (18 gSSRs and 25 EST-SSRs). For the yield components, 41 marker/trait associations were found: 20 for height, 6 for diameter and 15 for stalk number. An EST-SSR marker with homology to the non-phototropic hypocotyl 4 (NPH4) protein was associated with a putative QTL with a positive effect on diameter and a negative effect on stalk number. In relation to the quality parameters, 18 marker/trait associations were found for Brix, 19 for Pol, and 12 for fibre. For fibre, 58% of the detected QTLs showed a negative effect on this trait. Some markers associated with QTLs with a negative effect on fibre showed a positive effect on Pol, reflecting the negative correlation generally observed between these traits.
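A minimal sketch of the single-marker association idea described above, assuming a Gaussian trait and a presence/absence (single-dose) marker: the statistic compares a one-mean model against a two-mean model with a likelihood ratio test. Variable names and data are illustrative, not the study's pipeline.

import numpy as np
from scipy.stats import chi2

def single_marker_lrt(trait, marker):
    # Likelihood ratio test for a single-dose marker (0/1 presence) against a
    # quantitative trait, assuming normal errors. Returns the LRT statistic
    # and its p-value (chi-square with 1 degree of freedom).
    trait = np.asarray(trait, dtype=float)
    marker = np.asarray(marker)
    n = trait.size
    rss0 = np.sum((trait - trait.mean()) ** 2)                # null: one mean
    rss1 = sum(np.sum((trait[marker == g] - trait[marker == g].mean()) ** 2)
               for g in (0, 1))                               # alternative: one mean per marker class
    lrt = n * np.log(rss0 / rss1)
    return lrt, chi2.sf(lrt, df=1)

# Illustrative usage with simulated data.
rng = np.random.default_rng(2)
marker = rng.integers(0, 2, size=200)
trait = 10 + 0.8 * marker + rng.normal(0, 1, size=200)        # marker adds 0.8 units
stat, pval = single_marker_lrt(trait, marker)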
Abstract:
The principle of using induction rules based on spatial environmental data to model a soil map has previously been demonstrated. Whilst the general pattern of classes of large spatial extent, and of those closely associated with geology, was delineated, small classes and the detailed spatial pattern of the map were less well rendered. Here we examine several strategies to improve the quality of the soil map models generated by rule induction. Terrain attributes that are better suited to landscape description at a resolution of 250 m are introduced as predictors of soil type. A map sampling strategy is developed. Classification error is reduced by using boosting rather than cross-validation to improve the model. Further, the benefit of incorporating the local spatial context of each environmental variable into the rule induction is examined. The best model was achieved by sampling in proportion to the spatial extent of the mapped classes, boosting the decision trees, and using spatial contextual information extracted from the environmental variables.
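A rough sketch of the modelling strategy this abstract describes (class-proportional sampling, boosted decision trees, and simple spatial-context features), using scikit-learn (>= 1.2) as a stand-in for the rule-induction software actually employed; the grid, attribute counts and the 3x3 neighbourhood-mean context feature are illustrative assumptions.

import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier

def add_spatial_context(grid):
    # Append a simple spatial-context feature: the mean of each terrain
    # attribute over a 3x3 neighbourhood (grid shaped rows x cols x attributes).
    padded = np.pad(grid, ((1, 1), (1, 1), (0, 0)), mode="edge")
    neigh = np.mean([padded[r:r + grid.shape[0], c:c + grid.shape[1], :]
                     for r in range(3) for c in range(3)], axis=0)
    return np.concatenate([grid, neigh], axis=2)

# Illustrative data: a 50x50 grid with 4 terrain attributes and known soil classes.
rng = np.random.default_rng(3)
terrain = rng.normal(size=(50, 50, 4))
soil = rng.integers(0, 5, size=(50, 50))

features = add_spatial_context(terrain).reshape(-1, 8)
labels = soil.reshape(-1)

# Sample training cells in proportion to the spatial extent of each class
# (a uniform random sample over cells achieves this in expectation).
train_idx = rng.choice(labels.size, size=1000, replace=False)

# Boosted decision trees in place of the boosted rule-induction model.
model = AdaBoostClassifier(estimator=DecisionTreeClassifier(max_depth=5),
                           n_estimators=50)
model.fit(features[train_idx], labels[train_idx])
predicted_map = model.predict(features).reshape(50, 50)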
Abstract:
The applicability of image calibration to like-values in mapping water quality parameters from multitemporal images is explored. Six sets of water samples were collected at satellite overpasses over Moreton Bay, Brisbane, Australia. Analysis of these samples reveals that waters in this shallow bay are mostly TSS-dominated, even though they are occasionally dominated by chlorophyll as well. Three of the images were calibrated to a reference image based on invariant targets. Predictive models constructed from the reference image were applied to estimating total suspended sediment (TSS) and Secchi depth from another image, with a discrepancy of around 35 percent. Application of the predictive model for TSS concentration to another image, acquired at a time when the water type differed, resulted in a discrepancy of 152 percent. Therefore, image calibration to like-values could be used to reliably map certain water quality parameters from multitemporal TM images so long as the water type under study remains unchanged. The method is limited in that the mapped results could be rather inaccurate if the water type under study has changed considerably; the approach therefore needs further refinement before water quality in shallow waters can be routinely mapped from multitemporal satellite imagery.
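A minimal sketch of calibration to like-values via pseudo-invariant targets, assuming a simple per-band linear regression from the subject image to the reference image; the array names and synthetic data are illustrative only, not the study's processing chain.

import numpy as np

def calibrate_to_reference(subject_band, reference_band, invariant_mask):
    # Fit a linear gain/offset on pixels flagged as invariant targets and
    # apply it to the whole subject band, so its values become comparable
    # ("like-values") with the reference image.
    x = subject_band[invariant_mask].ravel()
    y = reference_band[invariant_mask].ravel()
    gain, offset = np.polyfit(x, y, deg=1)
    return gain * subject_band + offset

# Illustrative usage: a synthetic band pair with ~200 invariant pixels.
rng = np.random.default_rng(4)
reference = rng.uniform(0, 255, size=(100, 100))
subject = 0.8 * reference + 12 + rng.normal(0, 2, size=(100, 100))  # differing radiometry
mask = np.zeros_like(reference, dtype=bool)
mask[rng.integers(0, 100, 200), rng.integers(0, 100, 200)] = True
calibrated = calibrate_to_reference(subject, reference, mask)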
Abstract:
An operational space map is an efficient tool for comparing a large number of operational strategies in order to find an optimal choice of setpoints based on a multicriterion. Typically, such a multicriterion includes a weighted sum of the cost of operation and the effluent quality. Owing to the relatively high cost of aeration, such a definition of optimality results in a relatively high fraction of the effluent total nitrogen being in the form of ammonium. Such a strategy may, however, introduce a risk into operation, because a low degree of ammonium removal leads to a low amount of nitrifiers. This in turn leads to a reduced ability to reject event disturbances, such as large variations in the ammonium load, a drop in temperature, or the presence of toxic/inhibitory compounds in the influent. Hedging is a risk minimisation tool, with the aim to "reduce one's risk of loss on a bet or speculation by compensating transactions on the other side" (The Concise Oxford Dictionary (1995)). In wastewater treatment plant operation, hedging can be applied by choosing a higher level of ammonium removal to increase the amount of nitrifiers. This is a sensible way to introduce disturbance rejection ability into the multicriterion. In practice, this is done by deciding upon an internal effluent ammonium criterion. In some countries, such as Germany, a separate criterion already applies to the level of ammonium in the effluent. However, in most countries the effluent criterion applies to total nitrogen only. In these cases, an internal effluent ammonium criterion should be selected in order to secure proper disturbance rejection ability.
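A minimal sketch of the weighted-sum criterion and the internal ammonium constraint described above, in generic notation that is not taken from the paper (\theta denotes a candidate set of operational setpoints):

J(\theta) = w_{cost}\,\mathrm{Cost}(\theta) + w_{EQ}\,\mathrm{EQ}(\theta)
\min_{\theta} J(\theta) \quad \text{subject to} \quad S_{NH_4}(\theta) \le S_{NH_4}^{int}

Here S_{NH_4}^{int} is the self-imposed (internal) effluent ammonium criterion, chosen tighter than the statutory total-nitrogen limit so that enough nitrifiers are retained to absorb load variations, temperature drops and inhibitory events.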
Abstract:
Lean Thinking is an important pillar in the success of any continuous improvement programme. Its tools are useful means for the analysis, control and organization of the data needed for correct decision making in organizations. The main objective of this project was to design a quality improvement programme at Eurico Ferreira, S.A., based on the evaluation of customer satisfaction and the implementation of 5S. We first selected which business area of the company to address. After this selection, an initial diagnostic procedure was carried out, identifying the various points of improvement to which some Lean Thinking tools were then applied, in particular Value Stream Mapping and the 5S methodology. With the first, we were able to map the current state of the process, in which all stakeholders were represented, as well as the flow of materials and information throughout the process. The 5S methodology allowed us to act on waste, identifying and implementing various process improvements.
Abstract:
This study investigates groundwater quality for irrigation purposes, the vulnerability of the aquifer system to pollution, and the aquifer potential for sustainable water resources development in the Kobo Valley development project. Groundwater quality is evaluated by predicting the most likely spatial distribution of hydrogeochemical parameters using a geostatistical method and comparing them with the water quality guidelines for irrigation. The hydrogeochemical parameters considered are SAR, EC, TDS, Cl-, Na+, Ca++, SO4 2- and HCO3 -. The spatial variability maps reveal that these parameters fall into safe, moderate, and severe (or increasing-problem) classes. To present this clearly, an aggregated Water Quality Index (WQI) map is constructed using the Weighted Arithmetic Mean method. It is found that the Kobo-Gerbi sub-basin suffers from poor water quality for irrigation, the Waja Golesha sub-basin has moderate quality, and Hormat Golena is the best sub-basin in terms of water quality. The groundwater vulnerability assessment of the study area is made using the GOD rating system. It is found that the whole area experiences a moderate to high risk of vulnerability, which is a clear warning that proper management of the resource is needed. The highest risks of vulnerability are noticed in the Hormat Golena and Waja Golesha sub-basins. The aquifer potential of the study area is obtained using weighted overlay analysis: 73.3% of the total area is a good site for future water well development, while the remaining 26.7% is not considered a good site for locating groundwater wells; most of this latter area falls within the Kobo-Gerbi sub-basin.
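For reference, the Weighted Arithmetic Mean WQI commonly takes the following form; the exact sub-index and weight definitions used in this study are not stated in the abstract, so the expressions below are the standard textbook version:

\mathrm{WQI} = \frac{\sum_{i=1}^{n} w_i\, q_i}{\sum_{i=1}^{n} w_i},\qquad q_i = 100\,\frac{C_i}{S_i},\qquad w_i = \frac{K}{S_i}

where C_i is the measured concentration of parameter i, S_i the corresponding irrigation guideline value, q_i the resulting quality rating, w_i a unit weight inversely proportional to the guideline value, and K a proportionality constant.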
Abstract:
Many municipal activities require updated large-scale maps that include both topographic and thematic information. For this purpose, the efficient use of very high spatial resolution (VHR) satellite imagery suggests the development of approaches that enable a timely discrimination, counting and delineation of urban elements according to legal technical specifications and quality standards. Therefore, the nature of this data source and the expanding range of applications call for objective methods and quantitative metrics to assess the quality of the extracted information, going beyond traditional thematic accuracy alone. The present work concerns the development and testing of a new approach for using technical mapping standards in the quality assessment of buildings automatically extracted from VHR satellite imagery. Feature extraction software was employed to map buildings present in a pansharpened QuickBird image of Lisbon. Quality assessment was exhaustive and involved comparisons of extracted features against a reference data set, introducing cartographic constraints from scales 1:1,000, 1:5,000 and 1:10,000. The spatial data quality elements subject to evaluation were: thematic (attribute) accuracy, completeness, and geometric quality assessed on the basis of planimetric deviation from the reference map. Tests were developed and metrics analyzed considering thresholds and standards for the large mapping scales most frequently used by municipalities. Results show that completeness varied with mapping scale and was only slightly higher for scale 1:10,000. Concerning geometric quality, a large percentage of extracted features met the strict topographic standards of planimetric deviation for scale 1:10,000, while no buildings were compliant with the specification for scale 1:1,000.
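A minimal sketch of the kind of per-building quality metrics involved: completeness as the share of reference buildings that were detected, plus a planimetric-deviation check against a scale-dependent tolerance. The matching rule and tolerance values here are simplified assumptions, not the paper's exact procedure.

import numpy as np

def completeness(matched_reference, total_reference):
    # Completeness: fraction of reference buildings with a matching extracted building.
    return matched_reference / total_reference

def planimetric_compliance(deviations_m, tolerance_m):
    # Fraction of matched buildings whose planimetric deviation (in metres)
    # is within the tolerance prescribed for a given mapping scale.
    deviations_m = np.asarray(deviations_m, dtype=float)
    return np.mean(deviations_m <= tolerance_m)

# Illustrative usage: hypothetical deviations and a hypothetical 2.0 m tolerance.
deviations = [0.4, 1.1, 2.6, 0.9, 3.2, 1.7]
print(completeness(matched_reference=412, total_reference=500))   # 0.824
print(planimetric_compliance(deviations, tolerance_m=2.0))        # 4 of 6 within tolerance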
Abstract:
Study carried out during a stay at the Royal Veterinary and Agricultural University of Denmark between March and June 2006. The effect of modified atmosphere packaging (MAP), as well as of red wine marination, on the evolution of bacterial contamination of dark, firm and dry (DFD) meat was investigated. DFD meat is found in the carcasses of animals that were exposed to prolonged muscular activity or stress before slaughter. DFD meat implies significant economic losses due to bacterial contamination and to technological problems related to its high water-holding capacity. Moreover, it is critical for the industry to investigate the diversity of bacterial contamination, to identify the bacterial species and to control them. This is difficult, however, because of the inability to detect some bacteria on standard media, the interactions among them, and the complexity of the contamination sources, such as water, soil, faeces and the environment. Polymerase Chain Reaction - Denaturing Gradient Gel Electrophoresis (PCR-DGGE) can overcome these problems by reflecting the microbial diversity and the bacterial species present. The results indicated that the bacterial diversity of the meat increased with days of packaging regardless of the packaging method, but decreased significantly with the red wine marination treatment. DGGE showed differences in the species found, indicating changes in the bacterial contamination and its characteristics in DFD meat under the different treatments. Although marination is a good alternative and a solution for the commercialisation of DFD meat, sequencing studies are needed to identify the different types of bacteria.