896 results for computer forensics, digital evidence, computer profiling, time-lining, temporal inconsistency, computer forensic object model


Relevance: 100.00%

Abstract:

The thesis deals with some non-linear Gaussian and non-Gaussian time series models, concentrating mainly on the properties and applications of a first-order autoregressive process with a Cauchy marginal distribution. Time series relating to prices, consumption, money in circulation, bank deposits and bank clearings, sales and profit in a department store, national income, and foreign exchange reserves, as well as the prices and dividends of shares on a stock exchange, are examples of economic and business time series. The thesis discusses the application of a threshold autoregressive (TAR) model and attempts to fit this model to time series data. Another important non-linear model considered is the ARCH model, and a third is the TARCH model. The main objective is to identify an appropriate model for a given set of data. The data considered are daily coconut oil prices over a period of three years. Since these are price data, consecutive prices may not be independent, and hence a time-series-based model is more appropriate. The study also examines properties such as ergodicity, mixing and time reversibility, as well as various estimation procedures used to estimate the unknown parameters of the process.
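
A minimal illustrative sketch (my own construction, not from the thesis) of the kind of process the abstract centres on: because the Cauchy law is 1-stable, an AR(1) recursion whose innovations are Cauchy with scale (1 − |ρ|)γ has a stationary Cauchy(0, γ) marginal.

```python
# Simulate X_t = rho * X_{t-1} + eps_t with an exact Cauchy(0, gamma) marginal.
# Cauchy scale parameters add under independent sums, so choosing the
# innovation scale as (1 - |rho|) * gamma preserves the stationary marginal.
import numpy as np

rng = np.random.default_rng(0)

def simulate_cauchy_ar1(n, rho=0.6, gamma=1.0):
    x = np.empty(n)
    x[0] = gamma * rng.standard_cauchy()     # start in the stationary law
    eps_scale = (1.0 - abs(rho)) * gamma     # innovation scale
    for t in range(1, n):
        x[t] = rho * x[t - 1] + eps_scale * rng.standard_cauchy()
    return x

x = simulate_cauchy_ar1(10_000)
# For Cauchy(0, gamma) the interquartile range equals 2 * gamma.
q75, q25 = np.percentile(x, [75, 25])
print("estimated gamma:", (q75 - q25) / 2)   # should be close to 1.0
```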

Relevance: 100.00%

Abstract:

This paper presents methods for moving object detection in airborne video surveillance. Motion segmentation in this scenario is usually difficult because of the small size of the objects, the motion of the camera, and inconsistency in the detected object shape. Here we present a motion segmentation system for moving-camera video based on background subtraction. Adaptive background building is used to take advantage of a background created from the most recent frames. Our proposed system offers a CPU-efficient alternative to conventional batch-processing-based background subtraction systems. We further refine the segmented motion by mean-shift based mode association.
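
As a hedged sketch of the general approach (assumed details, not the paper's system), the following numpy routine maintains an exponentially weighted running background built from the most recent frames and flags pixels that deviate from it; the paper's mean-shift refinement step is omitted.

```python
# Background subtraction with an adaptive, exponentially weighted background.
import numpy as np

def segment_motion(frames, alpha=0.05, thresh=25):
    """Yield one binary foreground mask per frame.

    frames: iterable of grayscale images as 2-D uint8 arrays.
    alpha:  background adaptation rate (higher = adapts faster).
    thresh: absolute grey-level difference for a pixel to count as moving.
    """
    background = None
    for frame in frames:
        f = frame.astype(np.float32)
        if background is None:
            background = f.copy()            # bootstrap from the first frame
        mask = np.abs(f - background) > thresh
        # Update the background only where no motion was detected, so moving
        # objects are not absorbed into the background model.
        background[~mask] = (1 - alpha) * background[~mask] + alpha * f[~mask]
        yield mask.astype(np.uint8) * 255
```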

Relevance: 100.00%

Abstract:

Das hier frei verfügbare Skript und die Sammlung an Klausuren mit Musterlösungen aus den Jahren 2006 bis 2015 geht auf die gleichnamige Vorlesung im Bachelorstudiengang Informatik an der Universität Kassel zurück, die von Prof. Dr. Wegner und ab 2012 von Dr. Schweinsberg angeboten wurde. Behandelt werden die Grundlagen der eXtensible Markup Language, die sich als Datenaustauschsprache etabliert hat. Im Gegensatz zu HTML erlaubt sie die semantische Anreicherung von Dokumenten. In der Vorlesung wird die Entwicklung von XML-basierten Sprachen sowie die Transformierung von XML-Dokumenten mittels Stylesheets (eXtensible Stylesheet Language XSL) behandelt. Ebenfalls werden die DOM-Schnittstelle (Document Object Model) und SAX (Simple API for XML) vorgestellt.
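
For readers unfamiliar with the two interfaces named above, here is a small self-contained Python example (illustrative only, not taken from the lecture notes) contrasting DOM, which builds an in-memory tree, with SAX, which reports parsing events as a stream.

```python
import io
import xml.dom.minidom
import xml.sax

XML = "<books><book title='XML Basics'/><book title='XSL in Practice'/></books>"

# DOM: the whole document becomes an in-memory tree that can be queried.
dom = xml.dom.minidom.parseString(XML)
for book in dom.getElementsByTagName("book"):
    print("DOM:", book.getAttribute("title"))

# SAX: a handler receives start-element events while the parser streams by.
class TitleHandler(xml.sax.ContentHandler):
    def startElement(self, name, attrs):
        if name == "book":
            print("SAX:", attrs["title"])

xml.sax.parse(io.StringIO(XML), TitleHandler())
```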

Relevance: 100.00%

Abstract:

We describe a technique for finding pixelwise correspondences between two images by using models of objects of the same class to guide the search. The object models are 'learned' from example images (also called prototypes) of an object class. The models consist of a linear combination of prototypes. The flow fields giving pixelwise correspondences between a base prototype and each of the other prototypes must be given. A novel image of an object of the same class is matched to a model by minimizing an error between the novel image and the current guess for the closest model image. Currently, the algorithm applies to line drawings of objects. An extension to real grey-level images is discussed.
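
A toy reconstruction of the matching step under stated assumptions (the paper's optimisation details are not given here): if the model flow is a linear combination of the given prototype flow fields, the combination coefficients can be fitted by least squares against a flow observed between the base prototype and the novel image.

```python
import numpy as np

def fit_model_coefficients(prototype_flows, novel_flow):
    """prototype_flows: array (k, H, W, 2) of flows base -> prototype_i.
    novel_flow: array (H, W, 2), observed flow base -> novel image.
    Returns coefficients c minimising ||sum_i c_i F_i - novel_flow||^2."""
    k = prototype_flows.shape[0]
    A = prototype_flows.reshape(k, -1).T      # one column per prototype
    b = novel_flow.reshape(-1)
    c, *_ = np.linalg.lstsq(A, b, rcond=None)
    return c

# Toy usage: the "novel" flow is 0.3*F0 + 0.7*F1 plus a little noise.
rng = np.random.default_rng(1)
F = rng.normal(size=(2, 8, 8, 2))
novel = 0.3 * F[0] + 0.7 * F[1] + 0.01 * rng.normal(size=(8, 8, 2))
print(fit_model_coefficients(F, novel))       # approximately [0.3, 0.7]
```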

Relevance: 100.00%

Abstract:

We present a component-based approach for recognizing objects under large pose changes. From a set of training images of a given object we extract a large number of components, which are clustered based on the similarity of their image features and their locations within the object image. The cluster centers form an initial set of component templates from which we select a subset for the final recognizer. In experiments we evaluate different sizes and types of components and three standard techniques for component selection. The component classifiers are finally compared to global classifiers on a database of four objects.
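
A rough sketch of the template-building stage under simplifying assumptions (appearance-only clustering; the paper also clusters by location within the object image): extract patches from training images, cluster them with k-means, and keep the cluster centers as candidate component templates.

```python
import numpy as np
from sklearn.cluster import KMeans

def extract_patches(image, size=12, stride=6):
    """Slide a window over a 2-D grayscale image; return flattened patches."""
    h, w = image.shape
    return np.array([
        image[y:y + size, x:x + size].ravel()
        for y in range(0, h - size + 1, stride)
        for x in range(0, w - size + 1, stride)
    ])

def component_templates(images, n_templates=20):
    patches = np.vstack([extract_patches(img) for img in images])
    # Cluster similar-looking patches; centers act as component templates.
    km = KMeans(n_clusters=n_templates, n_init=10, random_state=0).fit(patches)
    return km.cluster_centers_

rng = np.random.default_rng(2)
train = [rng.random((64, 64)) for _ in range(5)]
templates = component_templates(train)
print(templates.shape)   # (20, 144): 20 templates of 12x12 pixels
```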

Relevance: 100.00%

Abstract:

Male and female homosexual orientation has a substantial prevalence in humans and can be explained by determinants at various levels: biological, genetic, psychological, social and cultural. However, biological and genetic evidence has furnished the main hypotheses tested in scientific research worldwide. This article aims to review research on the existence of genetic and biological evidence that determines homosexual orientation. A review of the literature was conducted using the MedLine/PubMed database and Google Scholar. Papers and books in Portuguese and English were searched using the following keywords: sexual orientation, sexual behavior, homosexuality, developmental biology and genetics. Papers from the last 22 years were selected. Five main theories about the biological components were found: (1) fraternal birth order; (2) brain androgenization; (3) the 2D:4D ratio; (4) brain activation by pheromones; and (5) epigenetic inheritance; along with four theories about the genetic components: (1) genetic polymorphism; (2) pattern of X-linked inheritance; (3) monozygotic twins; and (4) sexually antagonistic selection. We conclude that much scientific evidence has been found over time to explain some of the biological and genetic components of homosexuality, especially in males. To date, however, there is no definitive explanation of the determinants of homosexual orientation.

Relevance: 100.00%

Abstract:

Different optimization methods can be employed to optimize a numerical estimate for the match between an instantiated object model and an image. In order to take advantage of gradient-based optimization methods, perspective inversion must be used in this context. We show that convergence can be very fast by extrapolating to maximum goodness-of-fit with Newton's method. This approach is related to methods which either maximize a similar goodness-of-fit measure without use of gradient information, or else minimize distances between projected model lines and image features. Newton's method combines the accuracy of the former approach with the speed of convergence of the latter.
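
A compact illustration of the numerical core (a generic Newton maximiser, not the paper's matching system): given the gradient and Hessian of a goodness-of-fit measure, Newton steps extrapolate directly toward the maximum, which is why convergence can be very fast.

```python
import numpy as np

def newton_maximize(grad, hess, x0, tol=1e-8, max_iter=50):
    """Find a stationary point by Newton steps x <- x - H(x)^{-1} g(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - np.linalg.solve(hess(x), g)
    return x

# Toy goodness-of-fit: a smooth peak at (1, -2). Newton converges in one
# step here because the function is exactly quadratic.
fit  = lambda x: -(x[0] - 1) ** 2 - 2 * (x[1] + 2) ** 2
grad = lambda x: np.array([-2 * (x[0] - 1), -4 * (x[1] + 2)])
hess = lambda x: np.array([[-2.0, 0.0], [0.0, -4.0]])
x_star = newton_maximize(grad, hess, x0=[0.0, 0.0])
print(x_star, fit(x_star))   # -> [ 1. -2.] 0.0
```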

Relevance: 100.00%

Abstract:

The objective of this study was to determine the potential of mid-infrared spectroscopy coupled with multidimensional statistical analysis for the prediction of processed cheese instrumental texture and meltability attributes. Processed cheeses (n = 32) of varying composition were manufactured in a pilot plant. Following two and four weeks of storage at 4 °C, samples were analysed using texture profile analysis, two meltability tests (computer vision; Olson and Price) and mid-infrared spectroscopy (4000–640 cm⁻¹). Partial least squares regression was used to develop predictive models for all measured attributes. Five attributes were successfully modelled with varying degrees of accuracy. The computer vision meltability model allowed for discrimination between high and low melt values (R² = 0.64). The hardness and springiness models gave approximate quantitative results (R² = 0.77), and the cohesiveness (R² = 0.81) and Olson and Price meltability (R² = 0.88) models gave good prediction results.
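
The modelling step described above can be sketched generically as follows (synthetic data and parameter choices are assumptions, not the study's): partial least squares regression maps high-dimensional spectra to a single measured attribute, evaluated by R² on held-out samples.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(3)

# Synthetic stand-in: 32 samples x 840 wavenumber absorbances (assumption),
# with a texture attribute driven by two spectral regions plus noise.
X = rng.normal(size=(32, 840))
y = 2.0 * X[:, 100] - 1.0 * X[:, 500] + rng.normal(scale=0.1, size=32)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
print("R2 on held-out samples:", r2_score(y_te, pls.predict(X_te).ravel()))
```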

Relevance: 100.00%

Abstract:

The discourse surrounding the virtual has moved away from the utopian thinking accompanying the rise of the Internet in the 1990s. The cyber-gurus of the last decades promised a technotopia removed from materiality and the confines of the flesh and the built environment, a liberation from old institutions and power structures. But since then, the virtual has grown into a distinct yet related sphere of cultural and political production that both parallels and occasionally flows over into the old world of material objects. The strict dichotomy of matter and digital purity has been replaced more recently with a more complex model where both the world of stuff and the world of knowledge support, resist and at the same time contain each other. Online social networks amplify and extend existing ones; other cultural interfaces like YouTube have not replaced the communal experience of watching moving images in a semi-public space (the cinema) or the semi-private space (the family living room). Rather, the experience of viewing is very much about sharing and communicating, offering interpretations and comments. Many of the web's strongest entities (Amazon, eBay, Gumtree etc.) sit exactly at this juncture, applying tools taken from the knowledge management industry to organize the chaos of the material world along (post-)Fordist rationality.

Since the early 1990s there have been many artistic and curatorial attempts to use the Internet as a platform for producing and exhibiting art, but a lot of these were reluctant to let go of the fantasy of digital freedom. Storage Room collapses the binary opposition of real and virtual space by using online data storage as a conduit for IRL art production. The artworks here will not be available for viewing online in a 'screen' environment but only as part of a downloadable package, with the intention that the exhibition could be displayed (in a physical space) by any interested party and realised as ambitiously or minimally as the downloader wishes, based on their means. The artists will therefore also supply a set of instructions for the physical installation of the work alongside the digital files. In response to this curatorial initiative, File Transfer Protocol invites seven UK-based artists to produce digital art for a physical environment, addressing the intersection between the virtual and the material. The files range from sound, video, digital prints and net art to blueprints for an action to take place, something to be made, a conceptual text piece, etc.

About the works and artists:

Polly Fibre is the pseudonym of London-based artist Christine Ellison. Ellison creates live music using domestic devices such as sewing machines, irons and slide projectors. Her costumes and stage sets propose a physical manifestation of the virtual space that is created inside software like Photoshop. For this exhibition, Polly Fibre invites the audience to create a musical composition using a pair of amplified scissors and a turntable. http://www.pollyfibre.com

John Russell, a founding member of 1990s art group Bank, is an artist, curator and writer who explores in his work the contemporary political conditions of the work of art. In his digital print, Russell collages together visual representations of abstract philosophical ideas and transforms them into a post-apocalyptic landscape that is complex and banal at the same time. www.john-russell.org

The work of Bristol-based artist Jem Nobel opens up a dialogue between the contemporary and the legacy of 20th-century conceptual art, around questions of collectivism and participation, authorship and individualism. His print SPACE concretizes the representation of the most common piece of Unicode: the vacant space between words. In this way, the gap itself turns from invisible cipher to sign. www.jemnoble.com

Annabel Frearson is rewriting Mary Shelley's Frankenstein using all and only the words from the original text. Frankenstein 2, or the Monster of Main Stream, is read in parts by different performers, embodying the psychotic character of the protagonist, a mongrel hybrid of used language. www.annabelfrearson.com

Darren Banks uses fragments of effect-laden Hollywood films to create an impossible space. The fictitious parts don't add up to a convincing material reality, leaving the viewer with a failed amalgamation of simulations of sophisticated technologies. www.darrenbanks.co.uk

FIELDCLUB is a collaboration between artist Paul Chaney and researcher Kenna Hernly. Together they developed a project that critically examines various proposals for the management of sustainable ecological systems. Their FIELDMACHINE invites the public to design an ideal agricultural field. By playing with different types of crops found in the south west of England, the user can, for example, create a balanced but protein-poor diet, or simply decide to 'get rid' of half the population. The meeting point of the Platonic field and its physical consequences generates a geometric abstraction that investigates the relationship between modernist utopianism and contemporary actuality. www.fieldclub.co.uk

Pil and Galia Kollectiv, who have also curated the exhibition, are London-based artists and run the xero, kline & coma gallery. Here they present a dialogue between two computers. The conversation opens with a simple textbook problem in business studies, but gradually the language, mimicking the application of game theory in the business sector, becomes more abstract. The two interlocutors become adversaries trapped forever in a competition without winners. www.kollectiv.co.uk

Relevance: 100.00%

Abstract:

The search for ever deeper relationships among the World’s languages is bedeviled by the fact that most words evolve too rapidly to preserve evidence of their ancestry beyond 5,000 to 9,000 y. On the other hand, quantitative modeling indicates that some “ultraconserved” words exist that might be used to find evidence for deep linguistic relationships beyond that time barrier. Here we use a statistical model, which takes into account the frequency with which words are used in common everyday speech, to predict the existence of a set of such highly conserved words among seven language families of Eurasia postulated to form a linguistic superfamily that evolved from a common ancestor around 15,000 y ago. We derive a dated phylogenetic tree of this proposed superfamily with a time-depth of ∼14,450 y, implying that some frequently used words have been retained in related forms since the end of the last ice age. Words used more than once per 1,000 in everyday speech were 7- to 10-times more likely to show deep ancestry on this tree. Our results suggest a remarkable fidelity in the transmission of some words and give theoretical justification to the search for features of language that might be preserved across wide spans of time and geography.

Relevance: 100.00%

Abstract:

In order to investigate the potential role of vegetation changes in megafaunal extinctions during the later part of the last glacial stage and early Holocene (42–10 ka BP), the palaeovegetation of northern Eurasia and Alaska was simulated using the LPJ-GUESS dynamic vegetation model. Palaeoclimatic driving data were derived from simulations made for 22 time slices using the Hadley Centre Unified Model. Modelled annual net primary productivity (aNPP) of a series of plant functional types (PFTs) is mapped for selected time slices and summarised for major geographical regions for all time slices. Strong canonical correlations are demonstrated between model outputs and pollen data compiled for the same period and region. Simulated aNPP values, especially for tree PFTs and for a mesophilous herb PFT, provide evidence of the structure and productivity of last glacial vegetation. The mesophilous herb PFT aNPP is higher in many areas during the glacial than at present or during the early Holocene. Glacial stage vegetation, whilst open and largely treeless in much of Europe, thus had a higher capacity to support large vertebrate herbivore populations than did early Holocene vegetation. A marked and rapid decrease in aNPP of mesophilous herbs began shortly after the Last Glacial Maximum, especially in western Eurasia. This is likely implicated in the extinction of several large herbivorous mammals during the latter part of the glacial stage and the transition to the Holocene.

Relevance: 100.00%

Abstract:

The electronic properties of liquid hydrogen fluoride (HF) were investigated by carrying out sequential quantum mechanics/Born-Oppenheimer molecular dynamics calculations. The structure of the liquid is in good agreement with recent experimental information. Emphasis was placed on the analysis of polarisation effects, dynamic polarisability and electronic excitations in liquid HF. Our results indicate an increase in the liquid phase of the dipole moment (∼0.5 D) and isotropic polarisability (5%) relative to their gas-phase values. Our best estimate for the first vertical excitation energy in liquid HF indicates a blue-shift of 0.4 ± 0.2 eV relative to that of the gas-phase monomer (10.4 eV).

Relevance: 100.00%

Abstract:

A crucial aspect of evidential reasoning in crime investigation involves comparing the support that evidence provides for alternative hypotheses. Recent work in forensic statistics has shown how Bayesian Networks (BNs) can be employed for this purpose. However, the specification of BNs requires conditional probability tables describing the uncertain processes under evaluation. When these processes are poorly understood, it is necessary to rely on subjective probabilities provided by experts. Accurate probabilities of this type are normally hard to acquire from experts. Recent work in qualitative reasoning has developed methods to perform probabilistic reasoning using coarser representations. However, the latter types of approaches are too imprecise to compare the likelihood of alternative hypotheses. This paper examines this shortcoming of the qualitative approaches when applied to the aforementioned problem, and identifies and integrates techniques to refine them.
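
At its simplest, the comparison the paper discusses reduces to Bayes' rule in odds form; the following toy computation (hypothetical numbers, not from the paper) shows the quantity a forensic Bayesian Network would ultimately deliver: posterior odds = prior odds × likelihood ratio.

```python
def posterior_odds(prior_h1, prior_h2, lik_e_given_h1, lik_e_given_h2):
    """Return posterior odds P(H1|E) : P(H2|E) via Bayes' rule."""
    return (prior_h1 / prior_h2) * (lik_e_given_h1 / lik_e_given_h2)

# Hypothetical numbers: H1 = "suspect was at the scene", H2 = "was not".
# The evidence E (say, a fibre match) is far more probable under H1.
odds = posterior_odds(prior_h1=0.10, prior_h2=0.90,
                      lik_e_given_h1=0.70, lik_e_given_h2=0.01)
print("posterior odds H1:H2 =", odds)   # ≈ 7.8, so E shifts support to H1
```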

Relevance: 100.00%

Abstract:

Most water distribution systems (WDS) need rehabilitation owing to aging infrastructure, which leads to decreasing capacity, increasing leakage and, consequently, low performance of the WDS. An appropriate strategy specifying the location and timing of pipeline rehabilitation in a WDS under a limited budget is the main challenge, and it has been addressed frequently by researchers and practitioners. The selection of appropriate rehabilitation techniques and material types is another main issue which has yet to be addressed properly. The latter affects the environmental impacts of a rehabilitation strategy meeting the challenges of global warming mitigation and consequent climate change. This paper presents a multi-objective optimization model for rehabilitation strategies in WDS addressing the abovementioned criteria, focused mainly on greenhouse gas (GHG) emissions, whether directly from fossil fuel and electricity or indirectly from the embodied energy of materials. The objective functions are to minimise: (1) the total cost of rehabilitation, including capital and operational costs; (2) the leakage amount; and (3) GHG emissions. The Pareto optimal front containing optimal solutions is determined using the Non-dominated Sorting Genetic Algorithm (NSGA-II). Decision variables in this optimisation problem are classified into two groups: (1) the percentage proportion of each rehabilitation technique each year; and (2) the material types of new pipeline for rehabilitation each year. The rehabilitation techniques used here include replacement, rehabilitation and lining, cleaning, and pipe duplication. The developed model is demonstrated through its application to the Mahalat WDS, located in the central part of Iran. The rehabilitation strategy is analysed over a 40-year planning horizon. A number of conventional techniques for selecting pipes for rehabilitation are analysed in this study. The results show that the optimal rehabilitation strategy considering GHG emissions is able to reduce total expenses and efficiently decrease leakage from the WDS whilst meeting environmental criteria.
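
The Pareto-dominance test at the heart of NSGA-II can be sketched compactly (a generic illustration for the three minimisation objectives named above, not the paper's implementation):

```python
from typing import List, Tuple

Objectives = Tuple[float, float, float]   # (cost, leakage, ghg), all minimised

def dominates(a: Objectives, b: Objectives) -> bool:
    """a dominates b if it is no worse in every objective and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions: List[Objectives]) -> List[Objectives]:
    """Keep only the non-dominated solutions (rank-1 front of NSGA-II)."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]

# Hypothetical rehabilitation strategies scored on the three objectives.
candidates = [(4.1e6, 120.0, 900.0),   # cheap but leaky
              (5.0e6,  80.0, 700.0),   # balanced
              (5.2e6,  85.0, 950.0),   # dominated by the balanced strategy
              (7.5e6,  40.0, 400.0)]   # expensive, low leakage and GHG
print(pareto_front(candidates))        # the third candidate is filtered out
```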

Relevance: 100.00%

Abstract:

This is a markedly descriptive study, with exploratory stages, which aims to describe the perceptions and reflections revealed by the research subjects in thematic analyses of various questions involving the autonomy of the Official Criminal Forensics (Perícia Criminal Oficial) within the Brazilian Federal Police (Polícia Federal). To this end, the methodology of content analysis was used, following Bardin (1977). The research subjects were chosen according to the criteria of accessibility and the nature of their posts, namely: Federal Police Chief (Delegado da Polícia Federal), Federal Judge, Federal Forensic Expert (Perito Criminal Federal) and Federal Prosecutor (Procurador da República). Given the predominantly qualitative character of this study, there is no expectation of generalising the results obtained in the field, and the selection of these subjects did not prioritise the quantitative representativeness of each post. The theoretical framework was constructed with the purpose of contextualising, and helping the reader to understand, the reality in which the object of study is embedded, describing the terms and concepts necessary for this understanding, such as: (i) what the Criminal Justice System is and how its formation process unfolded in the modern State; (ii) the structure and basic procedural flow of the Brazilian model, highlighting the position occupied by the forensic bodies or Instituto de Criminalística; (iii) the level of effectiveness of this system in Brazil and the main problems affecting the functionality of the Official Forensics within its structure; (iv) the consequences of the State's use of the repressive and preventive paradigms in controlling violence, crime and the impunity of criminals, with a view to guaranteeing the maintenance of public order as a collective good; (v) the relevance of the role of the Official Forensics to the effectiveness of the Criminal Justice System under the preventive paradigm; (vi) what Criminalistics is and the nature of its activity; and (vii) the current administrative structure and client network of the Official Forensics. In approaching the object of study, the researcher sought to describe the origin of the process of autonomy of Criminalistics in Brazil and how this process has been shaped as a public security policy, highlighting the main administrative and normative measures adopted in the country that favoured its consolidation, such as the approval of the PNSP (2002), PNDH I (1996), PNDH II (2002) and PNDH III (2009), as well as the enactment of Lei nº 12.030/2009, which specifically guarantees the technical-scientific and functional autonomy of the criminal forensic function. Special attention was given to the meaning and scope of the conceptual dimensions of the term "autonomy" for the forensic function. Notwithstanding the results obtained, the conclusions reveal that the complexity of the topic, in theory and in practice, awaits continuation in future research.