826 results for 2D barcode based authentication scheme
Abstract:
A novel nanosized and addressable sensing platform based on membrane-coated plasmonic particles for the detection of protein adsorption using dark-field scattering spectroscopy of single particles has been established. To this end, a detailed analysis of the deposition of gold nanorods on differently functionalized substrates is performed with respect to various factors (such as pH, ionic strength, concentration of the colloidal suspension, and incubation time) in order to find the optimal conditions for obtaining a homogeneous distribution of particles at the desired surface number density. The possibility of successfully draping lipid bilayers over gold particles immobilized on glass substrates depends on the careful adjustment of parameters such as membrane curvature and adhesion properties, and is demonstrated with complementary techniques such as phase-imaging AFM, fluorescence microscopy (including FRAP), and single-particle spectroscopy. The functionality and sensitivity of the proposed sensing platform are unequivocally demonstrated by the resonance shifts of individually interrogated plasmonic particles upon the adsorption of streptavidin to biotinylated lipid membranes. This new detection approach, which employs particles as nanoscopic reporters for biomolecular interactions, ensures a highly localized sensitivity that offers the possibility to screen lateral inhomogeneities of native membranes. As an alternative to the 2D array of gold nanorods, short-range-ordered arrays of nanoholes in optically transparent gold films or regular arrays of truncated-tetrahedron-shaped particles are built by means of colloidal nanolithography on transparent substrates. Technical issues, mainly related to the optimization of the mask deposition conditions, are successfully addressed such that extended areas of homogeneously nanostructured gold surfaces are achieved. The adsorption of the proteins annexin A1 and prothrombin on multicomponent lipid membranes, as well as the hydrolytic activity of the phospholipase PLA2, were investigated with classical techniques such as AFM, ellipsometry, and fluorescence microscopy. First, the issues of lateral phase separation in membranes of various lipid compositions and the dependence of the domain configuration (sizes and shapes) on the membrane composition are addressed. It is shown that the tendency for phase segregation in gel/fluid-phase lipid mixtures is accentuated in the presence of divalent calcium ions for membranes containing anionic lipids, as compared to neutral bilayers. Annexin A1 adsorbs preferentially and irreversibly on preformed phosphatidylserine (PS)-enriched lipid domains but, depending on the PS content of the bilayer, the protein itself may induce clustering of the anionic lipids into areas with high binding affinity. Corroborating evidence from AFM and fluorescence experiments confirms the hypothesis of a specifically increased hydrolytic activity of PLA2 on highly curved membrane regions, due to facilitated access of the lipase to the cleavage sites of the lipids. The influence of the nanoscale gold surface topography on the adhesion of lipid vesicles is unambiguously demonstrated, which provides, at least in part, an answer to the controversial question in the literature about the behavior of lipid vesicles interacting with bare gold substrates.
The possibility of forming monolayers of lipid vesicles on chemically untreated gold substrates decorated with gold nanorods opens new perspectives for biosensing applications that involve radiative decay engineering of the plasmonic particles.
A new double laser pulse pumping scheme for transient collisionally excited plasma soft X-ray lasers
Abstract:
Within this thesis a new double laser pulse pumping scheme for plasma-based, transient collisionally excited soft x-ray lasers (SXRL) was developed, characterized, and utilized for applications. SXRL operation at photon energies from ~50 up to ~200 electron volts was demonstrated with this concept. As a central technical tool, a special Mach-Zehnder interferometer was developed in the chirped pulse amplification (CPA) laser front-end for the generation of fully controllable double pulses to optimally pump SXRLs.

The Mach-Zehnder device enables the creation of two CPA pulses of different pulse duration and variable energy balance with an adjustable time delay. Besides SXRL pumping, the double-pulse configuration was applied to determine the B-integral in the CPA laser system by amplifying short-pulse replicas in the system, followed by an analysis in the time domain. The measurement of B-integral values in the 0.1 to 1.5 radian range, limited only by the attainable laser parameters, proved to be a promising tool to characterize nonlinear effects in CPA laser systems.

For SXRL pumping, the double pulse was configured to optimally produce the gain medium for SXRL amplification. Focusing the two collinear pulses under the same grazing incidence angle onto the target significantly improved the generation of the active plasma medium. On the one hand, the effect was induced by the intrinsically guaranteed exact overlap of the two pulses on the target; on the other hand, by the grazing-incidence pre-pulse plasma generation, which allows SXRL operation at higher electron densities, enabling higher gain in longer-wavelength SXRLs and higher efficiency in shorter-wavelength SXRLs. The observed gain enhancement was confirmed by plasma hydrodynamic simulations.

The first introduction of double short-pulse single-beam grazing incidence pumping for SXRLs below 20 nanometers at the laser facility PHELIX in Darmstadt (Germany) resulted in reliable operation of a nickel-like palladium SXRL at 14.7 nanometers, with the pump energy threshold strongly reduced to less than 500 millijoules. With the adaptation of the concept, namely double-pulse single-beam grazing incidence pumping (DGRIP), and the transfer of this technology to the laser facility LASERIX in Palaiseau (France), improved efficiency and stability of table-top high-repetition-rate soft x-ray lasers in the wavelength region below 20 nanometers were demonstrated. With a total pump laser energy below 1 joule on the target, 2 microjoules of nickel-like molybdenum soft x-ray laser emission at 18.9 nanometers were obtained at a 10 hertz repetition rate, proving the attractiveness for high-average-power operation. An easy and rapid alignment procedure fulfilled the requirements of a sophisticated installation, and the highly stable output satisfied the need for a reliable, strong SXRL source. The qualities of the DGRIP scheme were confirmed in an irradiation campaign on user samples with over 50,000 shots, corresponding to a deposited energy of ~50 millijoules.

The generation of double pulses with high energies up to ~120 joules enabled the transfer to shorter-wavelength SXRL operation at the laser facility PHELIX. The application of DGRIP proved to be a simple and efficient method for the generation of soft x-ray lasers below 10 nanometers.
Nickel-like samarium soft x-ray lasing at 7.3 nanometers was achieved at a low total pump energy threshold of 36 joules, which confirmed the suitability of the applied pumping scheme. Reliable and stable SXRL operation was demonstrated thanks to the single-beam pumping geometry, despite the large optical apertures. The soft x-ray lasing of nickel-like samarium was an important milestone for the feasibility of applying the pumping scheme at the higher pump pulse energies necessary to reach soft x-ray laser wavelengths in the water window. The reduction of the total pump energy to below 40 joules for 7.3 nanometer lasing now fulfills the requirement for installation at the high-repetition-rate laser facility LASERIX.
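For reference (a textbook relation, not specific to this work), the B-integral measured above is the accumulated nonlinear phase along the amplifier chain, for on-axis intensity I(z) and nonlinear refractive index n_2:

\[
B = \frac{2\pi}{\lambda} \int_0^L n_2\, I(z)\, \mathrm{d}z .
\]

Amplifying a short-pulse replica and analyzing it in the time domain, as described above, probes exactly this accumulated phase.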
Abstract:
In this thesis, I present the realization of a fiber-optical interface using optically trapped cesium atoms, which is an efficient tool for coupling light and atoms. The basic principle of the presented scheme relies on trapping neutral cesium atoms in a two-color evanescent field surrounding a nanofiber. The strong confinement of the fiber-guided light, which also protrudes outside the nanofiber, provides strong confinement of the atoms as well as efficient coupling to near-resonant light propagating through the fiber. In chapter 1, the necessary physical and mathematical background describing the propagation of light in an optical fiber is presented. The exact solution of Maxwell's equations allows us to model the fiber-guided light fields which give rise to the trapping potentials and the atom-light coupling in the close vicinity of a nanofiber. Chapter 2 gives the theoretical background of light-atom interaction. A quantum mechanical model of the light-induced shifts of the relevant atomic levels is reviewed, which allows us to quantify the perturbation of the atomic states due to the presence of the trapping light fields. The experimental realization of the fiber-based atom trap is the focus of chapter 3. Here, I analyze the properties of the fiber-based trap in terms of the confinement of the atoms and the impact of several heating mechanisms. Furthermore, I demonstrate the transportation of the trapped atoms as a first step towards a deterministic delivery of individual atoms. In chapter 4, I present the successful interfacing of the trapped atomic ensemble and fiber-guided light. Three different approaches are discussed, involving the measurement of near-resonant scattering in absorption or of the emission into the guided mode of the nanofiber. In the analysis of the spectroscopic properties of the trapped ensemble we find good agreement with the predictions of the theoretical model discussed in chapter 2. In addition, I introduce a non-destructive scheme for the interrogation of the atomic states, which is sensitive to phase shifts of far-detuned fiber-guided light interacting with the trapped atoms. The inherent birefringence in our system, induced by the atoms, changes the polarization state of the probe light and can thus be detected via a Stokes vector measurement.
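As a sketch of the trapping principle described above (standard nanofiber-trap physics, not the thesis' full calculation): outside a nanofiber of radius a, the intensity of a guided mode with propagation constant \beta decays approximately evanescently,

\[
I(r) \propto e^{-2 q (r - a)}, \qquad q = \sqrt{\beta^2 - k_0^2},
\]

so a red-detuned (attractive) and a blue-detuned (repulsive) field decay with different lengths; since the blue field falls off faster, their sum forms a potential minimum a few hundred nanometers from the fiber surface, where the atoms are trapped.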
Abstract:
The research aims at developing a framework for semantic-based digital survey of architectural heritage. Rooted in knowledge-based modeling, which extracts mathematical constraints on geometry from architectural treatises, as-built information obtained from image-based modeling is integrated with the ideal model in a BIM platform. The knowledge-based modeling transforms the geometry and parametric relations of architectural components from 2D drawings into 3D digital models and, thanks to parametric modeling, creates a large number of variations based on shape grammar in real time. It also provides prior knowledge for semantically segmenting unorganized survey data. The emergence of SfM (Structure from Motion) makes it possible to reconstruct large, complex architectural scenes with high flexibility, low cost, and full automation, but with low metric reliability. We address this problem by combining photogrammetric techniques, including camera configuration, image enhancement, and bundle adjustment. Experiments show that the accuracy of image-based modeling following our workflow is comparable to that of range-based modeling. We also demonstrate positive results of our optimized approach in the digital reconstruction of a portico, where low-texture vaults and dramatic transitions in illumination pose major difficulties for the unoptimized workflow. Once the as-built model is obtained, it is integrated with the ideal model in a BIM platform, which allows multiple forms of data enrichment. In spite of its promising prospects in the AEC industry, BIM has been developed with limited consideration of reverse engineering from survey data. Besides representing the architectural heritage in parallel ways (ideal model and as-built model) and comparing their differences, we address how to create the as-built model in BIM software, which is still an open problem. The research is intended to provide a foundation for the study of architectural history, for the documentation and conservation of architectural heritage, and for the renovation of existing buildings.
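As an illustration of the bundle adjustment step mentioned above, the quantity minimized is the reprojection error. A minimal Python sketch under simplifying assumptions (pinhole camera without lens distortion; the function name is illustrative, not the thesis' code):

    import numpy as np

    def reprojection_residuals(K, R, t, points3d, points2d):
        """Residuals minimized in bundle adjustment: observed 2D features
        minus the projection of 3D points through a pinhole camera (K, R, t).
        Lens distortion terms are omitted for brevity."""
        cam = R @ points3d.T + t[:, None]   # 3D points in the camera frame
        proj = (K @ cam).T                  # homogeneous image coordinates
        proj = proj[:, :2] / proj[:, 2:3]   # perspective division
        return (proj - points2d).ravel()    # stacked x/y residuals

In an SfM pipeline, these residuals over all cameras and points are passed to a nonlinear least-squares solver, which jointly refines camera poses and 3D structure.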
Abstract:
The aim of this work is to present various aspects of the numerical simulation of particle and radiation transport for industrial and environmental protection applications, enabling the analysis of complex physical processes in a fast, reliable, and efficient way. In the first part we deal with the speed-up of the numerical simulation of neutron transport for nuclear reactor core analysis. The convergence properties of the source iteration scheme of the Method of Characteristics, applied to heterogeneous structured geometries, have been enhanced by means of Boundary Projection Acceleration, enabling the study of 2D and 3D geometries with transport theory without spatial homogenization. The computational performance has been verified with the C5G7 2D and 3D benchmarks, showing a considerable reduction in iterations and CPU time. The second part is devoted to the study of temperature-dependent elastic scattering of neutrons for heavy isotopes near the thermal zone. A numerical computation of the Doppler convolution of the elastic scattering kernel based on the gas model is presented, for a general energy-dependent cross section and scattering law in the center-of-mass system. The range of integration has been optimized by employing a numerical cutoff, allowing a faster numerical evaluation of the convolution integral. Legendre moments of the transfer kernel are subsequently obtained by direct quadrature, and a numerical analysis of the convergence is presented. In the third part we focus our attention on remote sensing applications of radiative transfer employed to investigate the Earth's cryosphere. The photon transport equation is applied to simulate the reflectivity of glaciers while varying the age of the snow or ice layer, its thickness, the presence or absence of underlying layers, and the amount of dust included in the snow, creating a framework able to decipher the spectral signals collected by orbiting detectors.
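As a sketch of the quadrature step described above (an illustration, not the thesis' code; the normalization is one common convention), the Legendre moments of a transfer kernel given as a function of the scattering cosine mu can be obtained by direct Gauss-Legendre quadrature:

    import numpy as np
    from numpy.polynomial.legendre import leggauss, Legendre

    def legendre_moments(kernel, max_order, nquad=64):
        """Moments f_l = (2l+1)/2 * integral_{-1}^{1} f(mu) P_l(mu) dmu,
        evaluated by Gauss-Legendre quadrature."""
        mu, w = leggauss(nquad)  # nodes and weights on [-1, 1]
        f = kernel(mu)
        return np.array([(2 * l + 1) / 2.0 * np.sum(w * f * Legendre.basis(l)(mu))
                         for l in range(max_order + 1)])

    # Example: the isotropic kernel f(mu) = 1/2 has f_0 = 1/2
    # and vanishing higher moments.
    moments = legendre_moments(lambda mu: np.full_like(mu, 0.5), 4)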
Abstract:
A new control scheme is presented in this thesis. Based on the NonLinear Geometric Approach, the proposed Active Control System represents a new way of looking at reconfigurable controllers for aerospace applications. The Diagnosis module (providing estimates of generic signals which, depending on the case, can be faults, disturbances, or system parameters), the main feature of the proposed Active Control System, is a characteristic shared by three well-known control schemes: Active Fault Tolerant Control, Indirect Adaptive Control, and Active Disturbance Rejection Control. The standard NonLinear Geometric Approach (NLGA) has been thoroughly investigated and then improved to extend its applicability to more complex models. The standard NLGA procedure has been modified to take into account feasible and estimable sets of unknown signals. Furthermore, the application of the Singular Perturbations approximation has led to the solution of Detection and Isolation problems in scenarios too complex to be solved by the standard NLGA. The estimation process has also been improved, where multiple redundant measurements are available, by the introduction of a new algorithm, here called "Least Squares - Sliding Mode". It guarantees optimality, in the sense of least squares, and finite estimation time, in the sense of sliding mode. The Active Control System concept has been formalized in two controllers: a nonlinear backstepping controller and a nonlinear composite controller. Particularly interesting is the integration, in the controller design, of the estimates coming from the Diagnosis module. Stability proofs are provided for both control schemes. Finally, several aerospace applications are provided to show the applicability and effectiveness of the proposed NLGA-based Active Control System.
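The "Least Squares - Sliding Mode" algorithm itself is specific to the thesis and not reproduced here; for context, a minimal sketch of the weighted least-squares fusion of redundant measurements on which its optimality notion rests (the function name and the linear measurement model z = Hx + v are illustrative assumptions):

    import numpy as np

    def ls_fuse(H, z, R):
        """Weighted least-squares estimate of x from redundant measurements
        z = H x + v with noise covariance R; the sliding-mode component of
        the thesis' estimator is not reproduced here."""
        W = np.linalg.inv(R)                 # weight by inverse covariance
        A = H.T @ W @ H                      # normal-equations matrix
        b = H.T @ W @ z
        return np.linalg.solve(A, b)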
Abstract:
This thesis deals with the development and improvement of linear-scaling algorithms for electronic-structure-based molecular dynamics. Molecular dynamics is a method for the computer simulation of the complex interplay between atoms and molecules at finite temperature. A decisive advantage of this method is its high accuracy and predictive power. However, the computational cost, which in principle scales cubically with the number of atoms, prevents its application to large systems and long time scales. Starting from a new formalism based on the grand-canonical potential and a factorization of the density matrix, the diagonalization of the corresponding Hamiltonian matrix is avoided. The formalism exploits the fact that the Hamiltonian and the density matrix are sparse due to localization. This reduces the computational cost so that it scales linearly with system size. To demonstrate its efficiency, the resulting algorithm is applied to a system of liquid methane exposed to extreme pressure (about 100 GPa) and extreme temperature (2000 - 8000 K). In the simulation, methane dissociates at temperatures above 4000 K. The formation of sp²-bonded polymeric carbon is observed. The simulations provide no indication of diamond formation and therefore have implications for existing planetary models of Neptune and Uranus. Since avoiding the diagonalization of the Hamiltonian matrix entails the inversion of matrices, the problem of computing an (inverse) p-th root of a given matrix is additionally addressed. This results in a new formula for symmetric positive definite matrices. It generalizes the Newton-Schulz iteration, Altman's formula for bounded non-singular operators, and Newton's method for computing roots of functions. It is proved that the order of convergence is always at least quadratic, and that adaptive adjustment of a parameter q leads to better results in all cases.
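The thesis' generalized formula is not reproduced here; for orientation, a minimal sketch of the classical Newton-Schulz-type iteration for the inverse p-th root of a symmetric positive definite matrix, which that formula generalizes (the initial scaling choice and names are illustrative):

    import numpy as np

    def inv_pth_root(A, p, tol=1e-12, max_iter=200):
        """Iterate X_{k+1} = X_k (I + (I - A X_k^p)/p) toward A^(-1/p) for
        symmetric positive definite A. The initial scaling keeps the
        residual norm below one, which is needed for convergence."""
        n = A.shape[0]
        I = np.eye(n)
        X = I / np.linalg.norm(A, 2) ** (1.0 / p)
        for _ in range(max_iter):
            R = I - A @ np.linalg.matrix_power(X, p)  # residual
            if np.linalg.norm(R) < tol:
                break
            X = X @ (I + R / p)
        return X

    # Sanity check: p = 2 gives the inverse square root.
    A = np.array([[4.0, 1.0], [1.0, 3.0]])
    X = inv_pth_root(A, 2)
    assert np.allclose(X @ A @ X, np.eye(2), atol=1e-8)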
Abstract:
Tape drives have so far been the dominant technology for storing the data volumes accumulating in archival systems. With access patterns becoming ever more active and storage media such as hard disks catching up in cost, the architecture of storage systems for archiving must be reconsidered. Reliability, integrity, and durability are the key properties of digital archiving. However, access speed also takes on increased importance when active archives make their entire contents available for direct access. A tape-based system cannot deliver the parallelism, latency, and throughput required for this, which is usually compensated by disk-based systems acting as caches.

In this work we examine the challenges and opportunities of developing a disk-based storage system that targets high reliability and energy efficiency and is suitable for both active and cold archive environments. We first analyze the storage systems and access patterns of a large digital archive, thereby presenting a possible field of application for our architecture. We then introduce mechanisms to improve the reliability of an individual hard disk, and present and evaluate a new, energy-efficient, two-dimensional RAID approach optimized for write-once-read-many access. Finally, we introduce logging and caching mechanisms that support the underlying goals and evaluate the RAID system in a file-system environment.
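As a toy illustration of the two-dimensional parity idea (not the thesis' actual layout or encoding): row and column XOR parity over a grid of data blocks tolerates one lost block per row or column.

    import numpy as np

    rng = np.random.default_rng(0)
    k, blocksize = 3, 16
    data = rng.integers(0, 256, size=(k, k, blocksize), dtype=np.uint8)

    row_parity = np.bitwise_xor.reduce(data, axis=1)  # one parity block per row
    col_parity = np.bitwise_xor.reduce(data, axis=0)  # one parity block per column

    # Recover a lost block (i, j) from the surviving blocks in its row:
    i, j = 1, 2
    survivors = np.bitwise_xor.reduce(np.delete(data, j, axis=1)[i], axis=0)
    recovered = row_parity[i] ^ survivors
    assert np.array_equal(recovered, data[i, j])

Write-once-read-many workloads suit such schemes well, since parity can be computed once at ingest and rarely needs updating afterwards.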
Development of a biorefinery scheme for the valorization of olive mill wastewaters and grape pomaces
Abstract:
In the Mediterranean area, olive mill wastewater (OMW) and grape pomace (GP) are among the major agro-industrial wastes produced. These two wastes have a high organic load and high phytotoxicity, so their disposal in the environment can have negative effects. Second-generation biorefineries are dedicated to the valorization of biowaste through the production of goods from such residual biomasses; this approach can combine bioremediation with the generation of valuable molecules, biomaterials, and energy. The main aim of this thesis work was to study the anaerobic digestion of OMW and GP under different operational conditions to produce volatile fatty acids (VFAs) (first-stage aim) and CH4 (second-stage aim). To this end, a packed-bed biofilm reactor (PBBR) was set up to perform the anaerobic acidogenic digestion of the liquid dephenolized stream of OMW (OMWdeph). In parallel, the solid stream of OMW (OMWsolid), previously separated in order to allow the solid-phase extraction of polyphenols, was subjected to anaerobic methanogenic digestion to obtain CH4. The latter experiment was performed in 100 ml Pyrex bottles maintained at different temperatures (55, 45, and 37°C). Together with previous experiments, the anaerobic acidogenic digestion of fermented GP (GPfreshacid) and of dephenolized and fermented GP (GPdephacid) was performed in 100 ml Pyrex bottles to estimate the concentration of VFAs achievable from each of the aforementioned GPs. Finally, the same GP matrices and non-pre-treated GP (GPfresh) were digested under anaerobic methanogenic conditions to produce CH4. The anaerobic acidogenic and methanogenic digestion processes of the GPs lasted about 33 days, whereas the anaerobic acidogenic and methanogenic digestion processes of the OMWs lasted about 121 and 60 days, respectively. Each experiment was periodically monitored by analysing the volume and composition of the produced biogas and the VFA concentration. Results showed that VFAs were produced in higher concentrations from GP than from OMWdeph: the overall VFA concentration was approximately 39.5 gCOD L-1 from GPfreshacid, 29 gCOD L-1 from GPdephacid, and 8.7 gCOD L-1 from OMWdeph. Concerning CH4 production, OMWsolid reached a higher biochemical methane potential (BMP) at thermophilic temperature (55°C) than at mesophilic ones (37-45°C), with a value of about 358.7 mlCH4 gSVsub-1. In contrast, GPfresh reached its highest BMP at mesophilic temperature, about 207.3 mlCH4 gSVsub-1, followed by GPfreshacid with about 192.6 mlCH4 gSVsub-1 and lastly GPdephacid with about 102.2 mlCH4 gSVsub-1. In summary, based on the gathered results, GP seems to be a better carbon source for acidogenic and methanogenic microorganisms than OMW, since higher amounts of VFAs and CH4 were produced in the anaerobic digestion of GP than in that of OMW. In addition to these products, polyphenols were extracted by means of a solid-phase extraction (SPE) procedure by another research group, and the VFAs were used for the production of biopolymers, in particular polyhydroxyalkanoates (PHAs), by the same research group, in which I was involved.
Abstract:
The present work studies a km-scale data assimilation scheme based on a LETKF developed for the COSMO model. The aim is to evaluate the impact of assimilating two different types of data: temperature, humidity, pressure, and wind data from conventional networks (SYNOP, TEMP, and AIREP reports), and 3D reflectivity from radar volumes. A 3-hourly continuous assimilation cycle has been implemented over an Italian domain, based on a 20-member ensemble with boundary conditions provided by ECMWF ENS. Three different experiments have been run to evaluate the performance of the assimilation over one week in October 2014, during which the Genoa and Parma floods took place: a control run of the data assimilation cycle with assimilation of data from conventional networks only; a second run in which the SPPT scheme is activated in the COSMO model; and a third run in which reflectivity volumes from meteorological radars are also assimilated. Objective evaluation of the experiments has been carried out both on case studies and over the entire week: checking the analysis increments; computing the Desroziers statistics for SYNOP, TEMP, AIREP, and RADAR over the Italian domain; verifying the analyses against data not assimilated (temperature at the lowest model level objectively verified against SYNOP data); and objectively verifying the deterministic forecasts initialised with the KENDA analyses for each of the three experiments.
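For reference, the LETKF analysis step (in the notation of Hunt et al., 2007, not the specifics of the KENDA implementation) computes, in each local region, the mean weight vector

\[
\tilde{P}^a = \left[(k-1)I + \mathbf{Y}^{b\,T} R^{-1} \mathbf{Y}^b\right]^{-1}, \qquad
\bar{w}^a = \tilde{P}^a\, \mathbf{Y}^{b\,T} R^{-1}\left(y^o - \bar{y}^b\right),
\]

where \(\mathbf{Y}^b\) holds the background ensemble perturbations in observation space and k is the ensemble size (here 20); the analysis ensemble follows by adding \(\bar{w}^a\) plus the columns of \([(k-1)\tilde{P}^a]^{1/2}\) to the background mean in perturbation space.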
Abstract:
We propose a new, clinically oriented approach to atlas-based segmentation of brain tumor images. A mesh-free method is used to model tumor-induced soft tissue deformations in a healthy brain atlas image, with subsequent registration of the modified atlas to a pathologic patient image. The atlas is seeded with a tumor position prior, and tumor growth simulating the tumor mass effect is performed with the aim of improving registration accuracy for patients with space-occupying lesions. We perform tests on 2D axial slices of five different patient data sets and show that the approach gives good results for the segmentation of white matter, grey matter, cerebrospinal fluid, and the tumor.
Abstract:
This paper presents methods based on Information Filters for solving matching problems, with emphasis on real-time, or effectively real-time, applications. Both applications discussed in this work deal with ultrasound-based rigid registration in computer-assisted orthopedic surgery. In the first application, the usual workflow of rigid registration is reformulated such that the registration algorithms iterate while the surgeon is acquiring ultrasound images of the anatomy to be operated on. Using this effectively real-time approach to registration, the surgeon receives feedback in order to better gauge the quality of the final registration outcome. The second application circumvents the need to attach physical markers to bones for anatomical referencing. Experiments using anatomical objects immersed in water are performed in order to evaluate and compare the different methods presented herein, using both 2D and real-time 3D ultrasound.
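For context (the standard form, not this paper's specific filter): the information filter propagates the inverse covariance \(Y = P^{-1}\) and the information vector \(\hat{y} = P^{-1}\hat{x}\), so a new measurement \(z = Hx + v\), \(v \sim \mathcal{N}(0, R)\), enters as a purely additive update,

\[
Y' = Y + H^{T} R^{-1} H, \qquad \hat{y}' = \hat{y} + H^{T} R^{-1} z,
\]

which is what makes incrementally accumulating ultrasound measurements, while the surgeon is still acquiring images, natural in this formulation.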
Abstract:
Seventeen bones (sixteen cadaveric bones and one plastic bone) were used to validate a method for reconstructing a surface model of the proximal femur from 2D X-ray radiographs and a statistical shape model constructed from thirty training surface models. Unlike previously introduced validation studies, where surface-based distance errors were used to evaluate the reconstruction accuracy, here we propose to use errors measured on clinically relevant morphometric parameters. For this purpose, a program was developed to robustly extract those morphometric parameters from the thirty training surface models (the training population), from the seventeen surface models reconstructed from X-ray radiographs, and from the seventeen ground-truth surface models obtained either by a CT-scan reconstruction method or by a laser-scan reconstruction method. A statistical analysis was then performed to classify the seventeen test bones into two categories, normal cases and outliers, depending on the measured parameters of each test bone: if all parameters of a test bone were covered by the training population's parameter ranges, the bone was classified as normal; otherwise, as an outlier. Our experimental results showed that, statistically, there was no significant difference between the morphometric parameters extracted from the reconstructed surface models of the normal cases and those extracted from the reconstructed surface models of the outliers. Therefore, our statistical shape model based reconstruction technique can be used to reconstruct the surface model not only of a normal bone but also of an outlier bone.
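The classification rule described above can be sketched in a few lines (a paraphrase assuming parameters stored as name-to-values mappings, not the authors' program):

    def classify_bone(test_params, training_params):
        """Classify a test bone as 'normal' if every morphometric parameter
        lies within the range spanned by the training population,
        otherwise as 'outlier'."""
        for name, value in test_params.items():
            values = training_params[name]
            if not (min(values) <= value <= max(values)):
                return "outlier"
        return "normal"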
Abstract:
In this paper a new 22 GHz water vapor spectro-radiometer, specifically designed for profile measurement campaigns in the middle atmosphere, is presented. The instrument is of compact design and has a simple setup procedure. It can be operated as a standalone instrument, as it maintains its own weather station and a calibration scheme that relies neither on other instruments nor on the use of liquid nitrogen. The optical system of MIAWARA-C combines a choked Gaussian horn antenna with a parabolic mirror, which reduces the size of the instrument in comparison with existing radiometers. For the data acquisition, a correlation receiver is used together with a digital cross-correlating spectrometer. The complete backend section, including the computer, is located in the same housing as the instrument, and the receiver section is temperature-stabilized to minimize gain fluctuations. Calibration of the instrument is achieved through a balancing scheme with the sky used as the cold load, and the tropospheric properties are determined by performing regular tipping curves. Since MIAWARA-C is used in measurement campaigns, it is important to be able to determine the elevation pointing in a simple manner, as this is a crucial parameter in the calibration process; here we present two different methods, scanning the sky and scanning the Sun. Finally, we report on the first spectra and retrieved water vapor profiles acquired during the Lapbiat campaign at the Finnish Meteorological Institute Arctic Research Centre in Sodankylä, Finland. The performance of MIAWARA-C is validated by comparing the presented profiles with the equivalent profiles from the Microwave Limb Sounder on the EOS/Aura satellite.
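As background on the tipping-curve step (a textbook single-layer relation, not the instrument-specific retrieval): for elevation angle \(\theta\) the line of sight traverses an airmass \(A(\theta) \approx 1/\sin\theta\), and the measured sky brightness temperature follows

\[
T_{\mathrm{sky}}(\theta) \approx T_{\mathrm{atm}}\left(1 - e^{-\tau A(\theta)}\right) + T_{\mathrm{bg}}\, e^{-\tau A(\theta)},
\]

so fitting observations at several elevations yields the zenith opacity \(\tau\) used in the calibration; this is also why accurate elevation pointing is crucial.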
Abstract:
A new fragile logo watermarking scheme is proposed for public authentication and integrity verification of images. The security of the proposed block-wise scheme relies on a public encryption algorithm and a hash function. The encoding and decoding methods can provide public detection capabilities even in the absence of the image indices and the original logos. Furthermore, the detector automatically authenticates input images and extracts possible multiple logos and image indices, which can be used not only to localise tampered regions, but also to identify the original source of images used to generate counterfeit images. Results are reported to illustrate the effectiveness of the proposed method.
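As a hedged sketch of the block-wise idea only (the paper's exact construction, embedding, and public-key encryption are not reproduced): a per-block digest can bind the block content to an image index and a logo, so tampering changes the digests of exactly the affected blocks.

    import hashlib

    def block_digest(block_bytes: bytes, image_index: int, logo_bytes: bytes) -> bytes:
        """Illustrative block-wise digest binding image content, an image
        index, and a binary logo; the names and the exact binding are
        assumptions, not the paper's construction."""
        h = hashlib.sha256()
        h.update(image_index.to_bytes(8, "big"))  # ties the digest to the image index
        h.update(logo_bytes)                      # ties the digest to the logo
        h.update(block_bytes)                     # ties the digest to the block content
        return h.digest()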