902 results for Digital techniques


Relevance:

30.00%

Publisher:

Abstract:

CUNHA, Jacqueline; GALINDO, Marcos. Preservação digital: o estado da arte. In: ENCONTRO NACIONAL DE PESQUISA EM CIÊNCIA DA INFORMAÇÃO, 8., Salvador, 2007. Anais... Salvador: ANCIB, 2007. Disponível em:


Contemporary integrated circuits are designed and manufactured in a globalized environment, leading to concerns of piracy, overproduction and counterfeiting. One class of techniques to combat these threats is circuit obfuscation, which seeks to modify the gate-level (or structural) description of a circuit without affecting its functionality, in order to increase the complexity and cost of reverse engineering. Most existing circuit obfuscation methods are based on inserting additional logic (called "key gates") or camouflaging existing gates in order to make it difficult for a malicious user to obtain the complete layout information without extensive computation to determine the key-gate values. However, when the netlist or the circuit layout, although camouflaged, is available to the attacker, he or she can use advanced logic analysis tools, circuit simulation tools and Boolean SAT solvers to reveal the unknown gate-level information without exhaustively trying all input vectors, thus bringing down the complexity of reverse engineering. To counter this problem, some "provably secure" logic encryption algorithms that emphasize methodical selection of camouflaged gates have been proposed previously in the literature [1,2,3]. The contribution of this paper is the creation and simulation of a new layout obfuscation method that uses don't care conditions. We also present a proof of concept of a new functional (or logic) obfuscation technique that not only conceals but modifies the circuit functionality in addition to the gate-level description, and can be implemented automatically during the design process. Our layout obfuscation technique utilizes don't care conditions (namely, Observability and Satisfiability Don't Cares) inherent in the circuit to camouflage selected gates and modify sub-circuit functionality while meeting the overall circuit specification.
Here, camouflaging or obfuscating a gate means replacing the candidate gate with a 4-to-1 multiplexer that can be configured to perform all possible 2-input/1-output functions, as proposed by Bao et al. [4]. It is important to emphasize that our approach not only obfuscates but alters sub-circuit-level functionality in an attempt to make IP piracy difficult. The choice of gates to obfuscate determines the effort required to reverse engineer or brute-force the design. As such, we propose a method of camouflaged gate selection based on the intersection of output logic cones. By choosing these candidate gates methodically, the complexity of reverse engineering can be made exponential, thus making it computationally very expensive to determine the true circuit functionality. We propose several heuristic algorithms to maximize the reverse engineering complexity based on don't-care-based obfuscation and methodical gate selection. Thus, the goal of protecting the design IP from malicious end users is achieved. It also makes it significantly harder for rogue elements in the supply chain to use, copy or replicate the same design with different logic. We analyze the reverse engineering complexity by applying our obfuscation algorithm to the ISCAS-85 benchmarks. Our experimental results indicate that significant reverse engineering complexity can be achieved at minimal design overhead (average area overhead for the proposed layout obfuscation methods is 5.51% and average delay overhead is about 7.732%). We discuss the strengths and limitations of our approach and suggest directions that may lead to improved logic encryption algorithms in the future. References: [1] R. Chakraborty and S. Bhunia, "HARPOON: An Obfuscation-Based SoC Design Methodology for Hardware Protection," IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, vol. 28, no. 10, pp. 1493–1502, 2009. [2] J. A. Roy, F. Koushanfar, and I. L. Markov, "EPIC: Ending Piracy of Integrated Circuits," in 2008 Design, Automation and Test in Europe, 2008, pp. 1069–1074. [3] J. Rajendran, M. Sam, O. Sinanoglu, and R. Karri, "Security Analysis of Integrated Circuit Camouflaging," in ACM Conference on Computer and Communications Security, 2013. [4] B. Liu and B. Wang, "Embedded Reconfigurable Logic for ASIC Design Obfuscation against Supply Chain Attacks," in Design, Automation and Test in Europe Conference and Exhibition (DATE), 2014, pp. 1–6.
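The multiplexer-as-lookup-table construction described above can be sketched in a few lines of Python: the gate's two inputs act as the multiplexer's select lines, and the four data inputs become secret key bits that encode the hidden truth table. The key names below are illustrative, not from the paper.

```python
def mux4(key, a, b):
    """4-to-1 MUX used as a camouflaged gate: the gate inputs (a, b)
    act as the select lines, and the four secret key bits are the data
    inputs -- i.e. a 2-input, 1-output lookup table."""
    return key[(a << 1) | b]

# Each 2-input function is one of the 16 possible key assignments
# (its truth-table rows for (a, b) = 00, 01, 10, 11).
AND_KEY  = (0, 0, 0, 1)
NAND_KEY = (1, 1, 1, 0)
XOR_KEY  = (0, 1, 1, 0)

for a in (0, 1):
    for b in (0, 1):
        assert mux4(AND_KEY, a, b) == (a & b)
        assert mux4(NAND_KEY, a, b) == 1 - (a & b)
        assert mux4(XOR_KEY, a, b) == (a ^ b)
```

An attacker who cannot read the key bits must consider up to 16 candidate functions per camouflaged gate, so n well-chosen gates yield an exponentially large space of candidate netlists, which is the source of the reverse engineering complexity discussed above.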


A variety of physical and biomedical imaging techniques, such as digital holography, interferometric synthetic aperture radar (InSAR), or magnetic resonance imaging (MRI), enable measurement of the phase of a physical quantity in addition to its amplitude. However, the phase can commonly only be measured modulo 2π, as a so-called wrapped phase map. Phase unwrapping is the process of obtaining the underlying physical phase map from the wrapped phase. Tile-based phase unwrapping algorithms operate by first tessellating the phase map, then unwrapping individual tiles, and finally merging them into a continuous phase map. They can be implemented computationally efficiently and are robust to noise. However, they are prone to failure in the presence of phase residues or erroneous unwraps of single tiles. We sought to overcome these shortcomings by creating novel tile unwrapping and merging algorithms, as well as a framework that allows them to be combined in a modular fashion. To increase the robustness of the tile unwrapping step, we implemented a model-based algorithm that makes efficient use of linear algebra to unwrap individual tiles. Furthermore, we adapted an established pixel-based unwrapping algorithm to create a quality-guided tile merger. These original algorithms, as well as previously existing ones, were implemented in a modular phase unwrapping C++ framework. By examining different combinations of unwrapping and merging algorithms, we compared our method to existing approaches. We could show that the appropriate choice of unwrapping and merging algorithms can significantly improve the unwrapped result in the presence of phase residues and noise. Beyond that, our modular framework allows for efficient design and testing of new tile-based phase unwrapping algorithms. The software developed in this study is freely available.
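As background to the tile-based approach, the basic wrapped-phase relationship can be illustrated with a minimal one-dimensional unwrapper (Itoh's classic method, not the tile algorithms of this work), assuming NumPy is available:

```python
import numpy as np

def unwrap_1d(wrapped):
    """Itoh's method: wrap every successive difference back into
    [-pi, pi), then integrate the differences; this recovers the true
    phase whenever the underlying gradient stays below pi per sample."""
    d = np.diff(wrapped)
    d_wrapped = (d + np.pi) % (2 * np.pi) - np.pi
    return wrapped[0] + np.concatenate(([0.0], np.cumsum(d_wrapped)))

true_phase = np.linspace(0, 6 * np.pi, 200)      # smooth ramp over 3 cycles
wrapped = np.angle(np.exp(1j * true_phase))      # measured modulo 2*pi
recovered = unwrap_1d(wrapped)
assert np.allclose(recovered, true_phase, atol=1e-8)
```

Phase residues and noisy jumps violate the below-π-per-sample assumption, which is exactly the failure mode that motivates the more robust tile unwrapping and merging strategies described above.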


This thesis focuses on digital equalization of nonlinear fiber impairments for coherent optical transmission systems. Building on well-known physical models of signal propagation in single-mode optical fibers, novel nonlinear equalization techniques are proposed, numerically assessed and experimentally demonstrated. The structure of the proposed algorithms is strongly driven by optimizing the performance versus complexity tradeoff, with a view to near-future practical application in commercial real-time transceivers. The work initially focuses on the mitigation of intra-channel nonlinear impairments, relying on the concept of digital backpropagation (DBP) associated with Volterra-based filtering. After a comprehensive analysis of the third-order Volterra kernel, a set of critical simplifications is identified, culminating in the development of reduced-complexity nonlinear equalization algorithms formulated in both the time and frequency domains. The implementation complexity of the proposed techniques is analytically described in terms of computational effort and processing latency, by determining the number of real multiplications per processed sample and the number of serial multiplications, respectively. The equalization performance is numerically and experimentally assessed through bit error rate (BER) measurements. Finally, the problem of inter-channel nonlinear compensation is addressed within the context of 400 Gb/s (400G) superchannels for long-haul and ultra-long-haul transmission. Different superchannel configurations and nonlinear equalization strategies are experimentally assessed, demonstrating that inter-subcarrier nonlinear equalization can provide an enhanced signal reach while requiring only marginal added complexity.
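The digital backpropagation concept can be conveyed with a minimal single-step sketch, assuming NumPy, a dispersion-plus-Kerr fiber model, and illustrative parameter values; practical DBP uses many such steps per span together with the kernel simplifications discussed above:

```python
import numpy as np

def dbp_single_step(rx_field, dt, beta2, gamma, span_len):
    """One backward split-step: first undo the span's chromatic dispersion
    in the frequency domain, then undo the Kerr self-phase modulation in
    the time domain (the signs are the negatives of forward propagation)."""
    n = rx_field.size
    w = 2 * np.pi * np.fft.fftfreq(n, dt)            # angular frequency grid
    undo_dispersion = np.exp(0.5j * beta2 * w**2 * span_len)
    field = np.fft.ifft(np.fft.fft(rx_field) * undo_dispersion)
    return field * np.exp(-1j * gamma * np.abs(field)**2 * span_len)

# Round trip: a toy forward step (Kerr rotation, then dispersion) is
# exactly inverted by the backward step above. Values are illustrative.
dt, beta2, gamma, L = 1e-12, -2.1e-26, 1.3e-3, 1e5
t = (np.arange(1024) - 512) * dt
tx = np.exp(-(t / 2e-11) ** 2).astype(complex)       # Gaussian pulse
w = 2 * np.pi * np.fft.fftfreq(t.size, dt)
after_kerr = tx * np.exp(1j * gamma * np.abs(tx)**2 * L)
rx = np.fft.ifft(np.fft.fft(after_kerr) * np.exp(-0.5j * beta2 * w**2 * L))
assert np.allclose(dbp_single_step(rx, dt, beta2, gamma, L), tx)
```

Each step costs two FFTs plus a pointwise nonlinear rotation, which is why the number of real multiplications per processed sample is the natural complexity metric used in the thesis.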


Introduction: This research focuses on the online content offered by the Red Bull brand, whether through its website www.redbull.tv or its Red Bull TV app. As a brand, Red Bull needs to stand out from the advertising noise; it needs to get noticed and reach its consumers. Today, however, audiences are fragmented and no longer subject to the programming schedules dictated by television networks. Increasingly, and above all among young people, the audience with which this brand most identifies, viewers are active: they build their own programming lineups and seek out what they really want to watch. So how does a brand get itself heard? Through Rock&Love: surprise and affection. Hypothesis: Our starting hypothesis is that Red Bull, seeking a connection with its audience, offers content of interest to them that reinforces their attachment to the brand, thereby achieving Rock&Love. Objectives: In line with this hypothesis, three objectives guide this research: first, to identify the keys to Red Bull's presence in the digital environment; second, to analyze the content the brand uses to approach its users, taking new consumption habits into account; and third, to analyze how the content differs according to the medium through which it is distributed. Methodology: To test our hypothesis, we combine qualitative and quantitative methods. The techniques used are documentary observation and content analysis. Conclusions: This ad hoc content analysis yielded quantitative data and enabled a qualitative approach to our objectives; it thereby answered the initial hypothesis and confirmed that the brand relies on Rock&Love in the content shown across the various Red Bull TV platforms.


For this project I prepared a series of recitals featuring music for horn and percussion, in which the horn part featured extended techniques. For this project, I considered anything beyond the open or muted horn an extended technique. These techniques range from common hand-stopped passages to complex new techniques involving half-valves, multiphonics, and more, for new sounds desired by the composer. There are several pieces written for solo horn and percussion, with ensembles ranging from simple duets to solo horn with a full percussion ensemble. However, few include extended techniques for the horn. All of these select pieces are lesser known because of their difficulty, primarily the challenge of the extended techniques requested by the composer. In the introduction to this paper I give a brief background to the project, describe where the current repertoire stands, and recount my experiences with commissioning works for this genre. I then give a brief history of, and how-to on, the more common extended techniques, which were found in almost every piece. I separated these techniques so that they could be referenced in the performance notes without being extremely repetitive in their description. Then follow the main performance notes on the chosen repertoire, each of which includes a brief description of the piece itself and a longer discussion for performers and composers who wish to learn more about these techniques. In this section my primary focus is the extended techniques used, and I provide score samples, with permission, to further the education of the next musicians to tackle this genre. All works performed for this project were recorded and accompany this paper in the Digital Repository at the University of Maryland (DRUM). The following works were included in this project:

- Howard J. Buss, Dreams from the Shadows (2015)
- Howard J. Buss, Night Tide (1995)
- George Crumb, An Idyll for the Misbegotten, trans. Robert Patterson (1986/1997)
- Charles Fernandez, Metamorphosis: A Horn's Life, "Prenatal and Toddler" (2016, unfinished)
- Helen Gifford, Of Old Angkor (1995)
- Douglas Hill, Thoughtful Wanderings… (1990)
- Pierre-Yves Level, Duetto pour Cor en Fa et Percussion (1999)
- David Macbride, Elegy for Horn and Timpani (2009)
- Brian Prechtl, A Song of David (1995)
- Verne Reynolds, HornVibes (1986)
- Pablo Salazar, Cincontar (2016)
- Mark Schultz, Dragons in the Sky (1989)
- Faye-Ellen Silverman, Protected Sleep (2007)
- Charles Taylor, Sonata for Horn and Marimba (1991)
- Robert Wolk, Tessellations (2016)

With this project, I intend to promote these pieces and the techniques used, to encourage more works written in this style, and to show fellow horn players that the techniques should not prevent these great works from being performed. Because of the lack of repertoire, I also successfully commissioned new pieces featuring extended techniques, which were featured in the final recital.


Nigerian scam, also known as advance fee fraud or 419 scam, is a prevalent form of online fraudulent activity that causes financial loss to individuals and businesses. The Nigerian scam has evolved from simple non-targeted email messages to more sophisticated scams targeted at users of classifieds, dating and other websites. Even though such scams are observed and reported by users frequently, the community's understanding of Nigerian scams is limited, since the scammers operate "underground". To better understand the underground Nigerian scam ecosystem and to seek effective methods to deter Nigerian scams and cybercrime in general, we conducted a series of active and passive measurement studies. Relying upon the analysis and insight gained from these measurement studies, we make four contributions: (1) we analyze the taxonomy of the Nigerian scam and derive long-term trends in scams; (2) we provide insight into the Nigerian scam and cybercrime ecosystems and their underground operation; (3) we propose a payment intervention as a potential deterrent to cybercrime operations in general and evaluate its effectiveness; and (4) we offer active and passive measurement tools and techniques that enable in-depth analysis of cybercrime ecosystems and deterrence against them. We first created and analyzed a repository of more than two hundred thousand user-reported scam emails, stretching from 2006 to 2014, from four major scam reporting websites. We selected the ten most commonly observed scam categories and tagged 2,000 scam emails randomly selected from our repository. Based upon the manually tagged dataset, we trained a machine learning classifier and clustered all scam emails in the repository. From the clustering results, we found a strong and sustained upward trend for targeted scams and a downward trend for non-targeted scams. We then focused on two types of targeted scams that target users on Craigslist: sales scams and rental scams. We built an automated scam data collection system and gathered sales scam emails at large scale. Using the system, we posted honeypot ads on Craigslist and conversed automatically with the scammers. Through the email conversations, the system obtained additional confirmation of likely scam activity and collected additional information such as IP addresses and shipping addresses. Our analysis revealed that around 10 groups were responsible for nearly half of the more than 13,000 scam attempts we received. These groups used IP addresses and shipping addresses in both Nigeria and the U.S. We also crawled rental ads on Craigslist, identified rental scam ads among the large number of benign ads, and conversed with the potential scammers. Through in-depth analysis of the rental scams, we found seven major scam campaigns employing various operations and monetization methods. We also found that, unlike sales scammers, most rental scammers were in the U.S. The large-scale scam data and in-depth analysis provide useful insights into how to design effective deterrence techniques against cybercrime in general. Finally, we studied underground DDoS-for-hire services, also known as booters, and measured the effectiveness of undermining the payment systems of DDoS services. Our analysis shows that a payment intervention can have the desired effect of limiting cybercriminals' ability to operate and increasing the risk they face in accepting payments.
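The classify-then-cluster step can be illustrated with a toy multinomial naive Bayes text classifier; this is a stand-in for whatever model was actually trained in the study, and the categories and email snippets below are invented for illustration:

```python
import math
from collections import Counter, defaultdict

def train_nb(labeled):
    """labeled: list of (text, category) pairs. Collects per-category
    word counts for a multinomial naive Bayes model."""
    word_counts = defaultdict(Counter)
    cat_counts = Counter()
    vocab = set()
    for text, cat in labeled:
        words = text.lower().split()
        word_counts[cat].update(words)
        cat_counts[cat] += 1
        vocab.update(words)
    return word_counts, cat_counts, vocab

def classify(model, text):
    """Pick the category maximizing log P(cat) + sum log P(word|cat),
    with add-one smoothing so unseen words do not zero out a class."""
    word_counts, cat_counts, vocab = model
    total_docs = sum(cat_counts.values())
    best, best_lp = None, -math.inf
    for cat in cat_counts:
        lp = math.log(cat_counts[cat] / total_docs)
        denom = sum(word_counts[cat].values()) + len(vocab)
        for w in text.lower().split():
            lp += math.log((word_counts[cat][w] + 1) / denom)
        if lp > best_lp:
            best, best_lp = cat, lp
    return best

training = [  # invented snippets, two per category
    ("inheritance transfer million bank fee", "advance-fee"),
    ("urgent fund transfer million assistance fee", "advance-fee"),
    ("shipping agent payment item pickup craigslist", "sales"),
    ("item payment shipping address craigslist agent", "sales"),
]
model = train_nb(training)
assert classify(model, "million dollar fund transfer fee") == "advance-fee"
assert classify(model, "craigslist item shipping payment") == "sales"
```

Once every email in a repository is labeled this way, trend analysis reduces to counting labels per year, which is how an upward or downward trend per scam category can be derived.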


Secure Multi-party Computation (MPC) enables a set of parties to collaboratively compute, using cryptographic protocols, a function over their private data in such a way that the participants never see each other's data; they see only the final output. Typical MPC examples include statistical computations over joint private data, private set intersection, and auctions. While these applications are examples of monolithic MPC, richer MPC applications move repeatedly between "normal" (i.e., per-party local) and "secure" (i.e., joint, multi-party secure) modes, resulting overall in mixed-mode computations. For example, we might use MPC to implement the role of the dealer in a game of mental poker: the game is divided into rounds of local decision-making (e.g., bidding) and joint interaction (e.g., dealing). Mixed-mode computations are also used to improve performance over monolithic secure computations. Starting with the Fairplay project, several MPC frameworks have been proposed in the last decade to help programmers write MPC applications in a high-level language while the toolchain manages the low-level details. However, these frameworks are either not expressive enough to allow writing mixed-mode applications or lack formal specification and reasoning capabilities, diminishing the parties' trust in such tools and in the programs written using them. Furthermore, none of the frameworks provides a verified toolchain to run MPC programs, leaving open the possibility of security holes that can compromise the privacy of the parties' data. This dissertation presents language-based techniques to make MPC more practical and trustworthy. First, it presents the design and implementation of a new MPC domain-specific language, called Wysteria, for writing rich mixed-mode MPC applications. Wysteria provides several benefits over previous languages, including a conceptual single thread of control, generic support for more than two parties, high-level abstractions for secret shares, and a fully formalized type system and operational semantics. Using Wysteria, we have implemented several MPC applications, including, for the first time, a card-dealing application. The dissertation next presents Wys*, an embedding of Wysteria in F*, a full-featured verification-oriented programming language. Wys* improves on Wysteria along three lines: (a) it enables programmers to formally verify the correctness and security properties of their programs; as far as we know, Wys* is the first language to provide verification capabilities for MPC programs; (b) it provides a partially verified toolchain to run MPC programs; and finally (c) it enables MPC programs to use, with no extra effort, standard language constructs from the host language F*, making it more usable and scalable. Finally, the dissertation develops static analyses that help optimize monolithic MPC programs into mixed-mode MPC programs while providing privacy guarantees similar to those of the monolithic versions.
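The flavor of the secret shares that Wysteria abstracts over can be conveyed with a minimal additive secret-sharing sketch; this illustrates the general technique, not Wysteria's actual protocol. Each party splits its input into random shares that sum to the input modulo a prime, so a joint sum can be computed without any party revealing its value:

```python
import random

P = 2**61 - 1   # prime modulus for the share arithmetic

def share(secret, n_parties):
    """Split a value into n additive shares mod P; any subset of
    fewer than n shares is uniformly random and reveals nothing."""
    shares = [random.randrange(P) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % P)
    return shares

def secure_sum(all_shares):
    """Each party locally sums the shares it holds (one per input);
    combining the n partial sums reveals only the total."""
    partials = [sum(col) % P for col in zip(*all_shares)]
    return sum(partials) % P

inputs = [42, 1000, 7]                       # the parties' private values
all_shares = [share(v, 3) for v in inputs]   # row i = shares of input i
assert secure_sum(all_shares) == sum(inputs) % P
```

The local summing of shares is "normal" mode and the final recombination is the only "secure" interaction, which is exactly the mixed-mode structure the frameworks above are designed to express.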


Preserving the cultural heritage of the performing arts raises difficult and sensitive issues, as each performance is unique by nature and the juxtaposition between the performers and the audience cannot be easily recorded. In this paper, we report on an experimental research project to preserve another aspect of the performing arts—the history of their rehearsals. We have specifically designed non-intrusive video recording and on-site documentation techniques to make this process transparent to the creative crew, and have developed a complete workflow to publish the recorded video data and their corresponding meta-data online as Open Data using state-of-the-art audio and video processing to maximize non-linear navigation and hypervideo linking. The resulting open archive is made publicly available to researchers and amateurs alike and offers a unique account of the inner workings of the worlds of theater and opera.


This Final Scientific Report of the Applied Research Project addresses the topic "Crimes em ambiente digital – Investigação da GNR para a obtenção de prova" (crimes in the digital environment and the GNR's investigative work to obtain evidence). Its purpose is to analyze the GNR's criminal investigation activity with respect to obtaining digital evidence in inquiries delegated by the Judicial Authority. The general objective is to determine the importance of digital evidence for the GNR's criminal investigations. The research also has specific objectives: to determine the capabilities and difficulties of the operational and forensic branches in obtaining digital evidence, and to identify the main types of crime that have relied on this kind of evidence. Logically, the research follows the hypothetico-deductive method; accordingly, the starting point is the formulation of the research questions and the corresponding objectives and hypotheses. Data collection relies on documentary sources, interviews, and questionnaires. The analysis and discussion of the results lead to the conclusions of the study, which in turn allow the hypotheses formulated at the outset to be tested. The main findings show that digital evidence should be prioritized in inquiries, as it can be obtained across a broad spectrum of the criminal typologies falling within the GNR's criminal investigation competence. We also conclude that the Guarda still has considerable room for improvement before it is fully equipped to obtain digital evidence; nevertheless, efforts and competencies are being developed to that end, and some Territorial Commands are more advanced in this area than others.


Journalism has taken great leaps toward digital modernity, passing through multimediality, digital convergence, and transmediality. Hypertextuality and interactivity are defining characteristics of Web 2.0 that break with linear, one-way communication, allowing journalists, media outlets, and users to be closely connected. For this research, an analysis was conducted of transmedia narratives, cyberjournalism, and the characteristics that define a digital journalist. We then carried out a reception study of the media outlet Comunica-Girón, whose functional structure is based in the digital sphere, applying descriptive-exploratory research methods and techniques: observation, documentary compilation, and tabulation of the information.


The United States transportation industry is predicted to consume approximately 13 million barrels of liquid fuel per day by 2025. If one percent of the fuel energy were salvaged through waste heat recovery, there would be a reduction of 130 thousand barrels of liquid fuel per day. This dissertation focuses on automotive waste heat recovery techniques, with an emphasis on two novel techniques. The first technique investigated was a combined coolant- and exhaust-based Rankine cycle system, which utilized a patented piston-in-piston engine technology. The research scope included a simulation of the maximum mass flow rate of steam (700 K and 5.5 MPa) from two heat exchangers, the potential power generation from the secondary piston steam chambers, and the resulting steam quality within the steam chamber. The secondary piston chamber provided supplemental steam power strokes during the engine's compression and exhaust strokes to reduce the pumping work of the engine. A Class-8 diesel engine, operating at 1,500 RPM at full load, showed a maximum increase in brake fuel conversion efficiency of 3.1%. The second technique investigated was the implementation of thermoelectric generators (TEGs) on the outer cylinder walls of a liquid-cooled internal combustion engine. The research scope focused on energy generation, fuel energy distribution, and cylinder wall temperatures. The analysis was conducted over a range of engine speeds and loads in a two-cylinder, 19.4 kW, liquid-cooled, spark-ignition engine. The cylinder wall temperatures increased by 17% to 44%, which correlated well with the 4.3% to 9.5% decrease in coolant heat transfer. Only 23.3% to 28.2% of the heat transfer to the coolant was transferred through the TEG and TEG surrogate material. The gross indicated work decreased by 0.4% to 1.0%. The exhaust gas energy decreased by 0.8% to 5.9%. Due to coolant contamination, the TEG output could not be measured directly; it was instead predicted from cylinder wall temperatures and manufacturer documentation to be less than 0.1% of the cumulative heat release. Higher TEG conversion efficiencies, combined with greater control of heat transfer paths, would be needed to improve the energy output and make this a viable waste heat recovery technique.


Measurement and modeling techniques were developed to improve over-water gaseous air-water exchange measurements for persistent bioaccumulative and toxic chemicals (PBTs). Analytical methods were applied to atmospheric measurements of hexachlorobenzene (HCB), polychlorinated biphenyls (PCBs), and polybrominated diphenyl ethers (PBDEs). Additionally, the sampling and analytical methods are well suited to studying semivolatile organic compounds (SOCs) in air, with applications to secondary organic aerosol formation and to urban and indoor air quality. A novel gas-phase cleanup method is described for use with thermal desorption methods for the analysis of atmospheric SOCs using multicapillary denuders. The cleanup selectively removed hydrogen-bonding chemicals from samples, including much of the background matrix of oxidized organic compounds in ambient air, and thereby improved precision and method detection limits for nonpolar analytes. A model is presented that predicts gas collection efficiency and particle collection artifact for SOCs in multicapillary denuders using polydimethylsiloxane (PDMS) sorbent. An approach is presented to estimate the equilibrium PDMS-gas partition coefficient (Kpdms) from an Abraham solvation parameter model for any SOC. A high-flow-rate (300 L min^-1) multicapillary denuder was designed for measurement of trace atmospheric SOCs. Overall method precision and detection limits were determined using field duplicates and compared to the conventional high-volume sampler method. The high-flow denuder is an alternative to high-volume or passive samplers when separation of gas- and particle-associated SOCs upstream of a filter and a short sample collection time are advantageous. A Lagrangian internal boundary layer transport exchange (IBLTE) model is described. The model predicts the near-surface variation of several quantities with fetch in coastal, offshore flow: (1) the modification in potential temperature and gas mixing ratio; (2) the surface fluxes of sensible heat, water vapor, and trace gases, using the NOAA COARE Bulk Algorithm and Gas Transfer Model; and (3) the vertical gradients in potential temperature and mixing ratio. The model was applied to interpret micrometeorological measurements of the air-water exchange flux of HCB and several PCB congeners in Lake Superior. The IBLTE model can be applied to any scalar, including water vapor, carbon dioxide, dimethyl sulfide, and other scalar quantities of interest with respect to hydrology, climate, and ecosystem science.
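The Abraham solvation parameter estimate of Kpdms mentioned above has the standard linear free energy form log K = c + e·E + s·S + a·A + b·B + l·L. A minimal sketch follows; the coefficients and solute descriptors below are placeholders chosen purely for illustration, not the fitted values from this work:

```python
def log_k_abraham(coeffs, descriptors):
    """Abraham linear free energy relationship:
    log K = c + e*E + s*S + a*A + b*B + l*L,
    where (E, S, A, B, L) are solute descriptors (excess molar
    refraction, dipolarity/polarizability, H-bond acidity, H-bond
    basicity, hexadecane partition descriptor) and the lowercase
    coefficients characterize the partitioning system (here PDMS-gas)."""
    c, e, s, a, b, l = coeffs
    E, S, A, B, L = descriptors
    return c + e * E + s * S + a * A + b * B + l * L

# Placeholder numbers purely for illustration -- not fitted coefficients.
pdms_coeffs = (0.2, 0.1, 0.6, 1.5, 0.3, 0.85)
solute = (0.80, 0.90, 0.00, 0.20, 5.50)
log_kpdms = log_k_abraham(pdms_coeffs, solute)
assert abs(log_kpdms - 5.555) < 1e-9
```

Because the descriptors are tabulated for thousands of compounds, a fitted coefficient set lets this one-line model estimate Kpdms, and hence gas collection efficiency, for any SOC of interest.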


Combinatorial optimization is a complex engineering subject. Although formulations often differ in their setup, design, constraints, and implications depending on the nature of the problem, establishing a unifying framework is essential. This dissertation investigates the unique features of three important optimization problems that span from small-scale design automation to large-scale power system planning: (1) feeder remote terminal unit (FRTU) planning strategy considering the cybersecurity of the secondary distribution network in the electrical distribution grid, (2) physical-level synthesis for microfluidic lab-on-a-chip, and (3) discrete gate sizing in very-large-scale integration (VLSI) circuits. First, an optimization technique based on cross entropy is proposed to handle FRTU deployment in the primary network while considering the cybersecurity of the secondary distribution network. Constrained by a monetary budget on the number of deployed FRTUs, the proposed algorithm identifies pivotal locations on a distribution feeder at which to install the FRTUs over different time horizons. Then, multi-scale optimization techniques are proposed for digital microfluidic lab-on-a-chip physical-level synthesis. The proposed techniques handle variation-aware lab-on-a-chip placement and routing co-design while satisfying all constraints and considering contamination and defects. Last, the first fully polynomial-time approximation scheme (FPTAS) is proposed for the delay-driven discrete gate sizing problem, providing a theoretical perspective where existing works are heuristics with no performance guarantee. The intellectual contribution of the proposed methods establishes a novel paradigm bridging the gaps between professional communities.
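The cross-entropy technique applied to FRTU deployment can be sketched on a toy budgeted-placement objective; the objective, benefits, and parameters below are invented for illustration and are not the dissertation's model:

```python
import random

def cross_entropy_select(score, n, samples=200, elite_frac=0.1,
                         iters=30, smooth=0.7, seed=0):
    """Cross-entropy method over binary deployment vectors: sample
    candidate placements from independent Bernoullis, keep the elite
    fraction by score, and move the probabilities toward the elites."""
    rng = random.Random(seed)
    p = [0.5] * n
    for _ in range(iters):
        pop = [[1 if rng.random() < p[i] else 0 for i in range(n)]
               for _ in range(samples)]
        pop.sort(key=score, reverse=True)
        elites = pop[:max(1, int(elite_frac * samples))]
        for i in range(n):
            freq = sum(e[i] for e in elites) / len(elites)
            p[i] = smooth * freq + (1 - smooth) * p[i]
    return [1 if pi > 0.5 else 0 for pi in p]

# Toy objective: per-site benefit minus a cost penalty per deployed unit,
# standing in for the budget constraint on the number of FRTUs.
benefit = [5, 1, 4, 1, 3, 1]
score = lambda x: sum(b * xi for b, xi in zip(benefit, x)) - 2.5 * sum(x)
best = cross_entropy_select(score, len(benefit))
assert best == [1, 0, 1, 0, 1, 0]   # deploy only where benefit beats cost
```

The appeal of the cross-entropy method here is that the score function is a black box, so it can encode cybersecurity exposure of the secondary network, budget, and time horizons without changing the search procedure.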


Human scent, or the volatile organic compounds (VOCs) produced by an individual, has been recognized as a biometric measurement because of the distinct variations in both the presence and abundance of these VOCs between individuals. In forensic science, human scent has been used as a form of associative evidence, linking a suspect to a scene or object through the use of human scent discriminating canines. The scent most often collected and used with these specially trained canines is from the hands, because a majority of the evidence collected is likely to have been handled by the suspect. However, the scents from other biological specimens, especially those that are likely to be present at scenes of violent crimes, have yet to be explored. Hair, fingernails and saliva are examples of these types of specimens. In this work, a headspace solid phase microextraction gas chromatography-mass spectrometry (HS-SPME-GC-MS) technique was used for the identification of VOCs from hand odor, hair, fingernails and saliva. Sixty individuals were sampled and the profiles of the extracted VOCs were evaluated to assess whether they could be used for distinguishing individuals. Preliminary analysis of the biological specimens collected from an individual (intra-subject) showed that, though these materials have some VOCs in common, their overall chemical profile differs for each specimen type. Pair-wise comparisons, using Spearman rank correlations, were made between the chemical profiles obtained from each subject, per specimen type. More than 98.8% of the collected samples were distinguished between subjects for all specimen types, demonstrating that these specimens can be used for distinguishing individuals. Additionally, field trials were performed to determine the utility of these specimens as scent sources for human scent discriminating canines. Three trials were conducted to evaluate hair, fingernails and saliva in comparison to hand odor, which was considered the standard source of human odor. The trials revealed that canines perform as well with these alternative human scent sources as they do with hand odor, implying that, though there are differences in the chemical profiles released by these specimens, they can still be used for the discrimination of individuals by trained canines.
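The pair-wise Spearman rank comparison of VOC profiles can be sketched as follows, assuming each profile is a vector of relative abundances over a common set of compounds; the numbers are invented for illustration:

```python
def rank(values):
    """1-based ranks, with tied values assigned their average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho: the Pearson correlation of the rank vectors,
    so it compares the ordering of compound abundances, not magnitudes."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx) ** 0.5
    vy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (vx * vy)

# Relative VOC abundances over the same five compounds (illustrative data).
subject_a_run1 = [12.0, 3.1, 8.4, 0.5, 6.7]
subject_a_run2 = [11.5, 2.9, 8.8, 0.6, 6.1]   # same rank order as run 1
subject_b      = [0.4, 9.9, 1.2, 12.3, 2.2]
assert abs(spearman(subject_a_run1, subject_a_run2) - 1.0) < 1e-12
assert spearman(subject_a_run1, subject_b) < 0
```

Rank correlation is a natural choice here because absolute VOC abundances vary between collections, while the relative ordering of compounds within a subject's profile is the reproducible signature.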