889 results for Digital Forensics, Forensic Computing, Forensic Science
Abstract:
Historically, new media, when created, appropriate language resources from pre-existing media. As media technologies develop, so do their languages, adapting simultaneously to the medium and its messages, to modes of production, and to ideal conditions of interaction with users. Digital media, by their nature, offer performative interfaces, 'thinking images' that allow more than the mere aesthetic representation of content. This is the context of the research problem: which transdisciplinary theories can contribute to understanding the complex communication processes involved in the relationship between human beings and digital media used for learning? The objective of this research was to extend the model developed by Stephen Littlejohn, incorporating new concepts and generalizations from other branches of science with different 'worldviews', so as to broaden Littlejohn's proposal into a Transdisciplinary Model for Communication with Digital Media which, in our view, helps explain the phenomena involved in the relationship between humans and digital media, especially in science learning processes. The research was carried out using bibliographic and descriptive methods. (AU)
Abstract:
This review discusses menu analysis models in depth to identify each model's strengths and weaknesses, in an attempt to discover opportunities to enhance existing models and evolve menu analysis toward a comprehensive analytical model.
Abstract:
Menu engineering is a methodology to classify menu items by their contribution margin and popularity. The process discounts the importance of food cost percentage, recognizing that operators deposit cash, not percentages. The authors raise the issue that strict application of the principles of menu engineering may result in an erroneous evaluation of a menu item, and also may be of little use without considering the variable portion of labor. They describe an enhancement to the process by considering labor.
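To make the classification concrete, here is a minimal Python sketch of the standard menu-engineering matrix (Stars, Plowhorses, Puzzles, Dogs); the item data, the weighted-average margin threshold, and the 70%-of-equal-share popularity threshold follow the commonly cited Kasavana-Smith convention and are not taken from this paper:

```python
# Hypothetical illustration of menu-engineering classification
# (Kasavana-Smith style); names and thresholds are assumptions,
# not taken from the paper under review.
from dataclasses import dataclass

@dataclass
class MenuItem:
    name: str
    units_sold: int
    price: float
    food_cost: float  # variable cost per unit

    @property
    def contribution_margin(self) -> float:
        return self.price - self.food_cost

def classify(items: list[MenuItem]) -> dict[str, str]:
    total_sold = sum(i.units_sold for i in items)
    # Weighted-average contribution margin across all units sold.
    avg_cm = sum(i.contribution_margin * i.units_sold for i in items) / total_sold
    # Popularity benchmark: 70% of an equal share of sales.
    pop_threshold = (1 / len(items)) * 0.70
    labels = {}
    for i in items:
        high_cm = i.contribution_margin >= avg_cm
        popular = (i.units_sold / total_sold) >= pop_threshold
        labels[i.name] = {
            (True, True): "Star",
            (False, True): "Plowhorse",
            (True, False): "Puzzle",
            (False, False): "Dog",
        }[(high_cm, popular)]
    return labels
```

The labor enhancement the authors describe would, presumably, subtract a variable labor cost per item before the margin comparison; that refinement is not shown here.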
Abstract:
Intrusion Detection Systems (IDSs) provide an important layer of security for computer systems and networks, and are becoming increasingly necessary as reliance on Internet services grows and systems holding sensitive data are more commonly open to Internet access. An IDS's responsibility is to detect suspicious or unacceptable system and network activity and to alert a systems administrator to it. The majority of IDSs use a set of signatures that define what suspicious traffic is; Snort is a popular and actively developed open-source IDS that uses such a set of signatures, known as Snort rules. Our aim is to identify a way in which Snort could be developed further by generalising rules so as to identify novel attacks. In particular, we attempted to relax and vary the conditions and parameters of current Snort rules, using an approach similar to classic rule-learning operators such as generalisation and specialisation. We demonstrate the effectiveness of our approach through experiments with standard datasets and show that we are able to detect previously undetected variants of various attacks. We conclude by discussing the general effectiveness and appropriateness of generalisation in Snort-based IDS rule processing.
Keywords: anomaly detection, intrusion detection, Snort, Snort rules
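As a toy illustration of what "relaxing a rule condition" can mean (this is not the authors' code; the rule and the single operator are invented for the example), the sketch below widens a Snort-style rule's destination port to `any`, so that variants of the same attack aimed at other ports would still match:

```python
# Toy illustration (not the authors' code) of one generalisation
# operator over a Snort-style rule: dropping a specific destination
# port back to 'any', so variants of an attack on other ports match.
RULE = ('alert tcp $EXTERNAL_NET any -> $HOME_NET 21 '
        '(msg:"FTP EXPLOIT overflow"; content:"|90 90 90|"; sid:1000001;)')

def generalise_dst_port(rule: str) -> str:
    """Replace a concrete destination port with 'any'."""
    header, _, options = rule.partition('(')
    # Header layout: action proto src_ip src_port -> dst_ip dst_port
    parts = header.split()
    if parts[-1].isdigit():
        parts[-1] = 'any'
    return ' '.join(parts) + ' (' + options

print(generalise_dst_port(RULE))
# alert tcp $EXTERNAL_NET any -> $HOME_NET any (msg:"FTP EXPLOIT ...
```

A specialisation operator would move in the opposite direction, e.g. narrowing `any` back to a concrete port or adding a content condition.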
Abstract:
Sustainability and responsible environmental behaviour are a vital premise for the development of humankind. Over recent decades, the global energy scenario has been evolving towards a scheme in which Renewable Energy Sources (RES) such as photovoltaic, wind, biomass and hydrogen play an increasingly relevant role. Furthermore, hydrogen is an energy carrier that provides a means of long-term energy storage. Integrating hydrogen with local RES contributes to distributed power generation and to the early introduction of a hydrogen economy. The intermittent nature of many RES, for instance solar and wind, imposes the development of a management and control strategy to overcome this drawback. This strategy is responsible for providing reliable, stable and efficient operation of the system, and implementing it requires a monitoring system. The present paper aims to contribute to the experimental validation of LabVIEW as a valuable tool for developing monitoring platforms for RES-based facilities. To this end, a set of real systems that have been successfully monitored is presented.
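LabVIEW programs are graphical, so they cannot be quoted here; as a rough text-language analogue of the acquisition-and-logging loop at the heart of such a monitoring platform, the following Python sketch polls a set of simulated channels and appends timestamped rows to a log file. All channel names and values are invented:

```python
# Minimal sketch of the acquisition/logging loop behind a monitoring
# platform such as the LabVIEW ones described; the sensor-reading
# function is a placeholder assumption, not part of the paper.
import csv, time, random
from datetime import datetime

def read_channels() -> dict:
    # Stand-in for real DAQ calls (e.g. PV voltage/current, H2 flow).
    return {"pv_voltage_V": 48 + random.uniform(-1, 1),
            "pv_current_A": 8 + random.uniform(-0.5, 0.5),
            "h2_flow_slpm": 2.0 + random.uniform(-0.1, 0.1)}

with open("res_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "pv_voltage_V", "pv_current_A", "h2_flow_slpm"])
    for _ in range(5):                      # a few cycles for the demo
        sample = read_channels()
        writer.writerow([datetime.now().isoformat(), *sample.values()])
        time.sleep(1)                       # sampling period
```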
Abstract:
A comprehensive investigation of sensitive ecosystems in South Florida, with the main goal of determining the identity, spatial distribution, and sources of both organic biocides and trace elements in different environmental compartments, is reported. This study presents the development and validation of a method for the fractionation and isolation of twelve polar acidic herbicides commonly applied in the vicinity of the study areas, including 2,4-D, MCPA, dichlorprop, mecoprop, and picloram, in surface water. Solid phase extraction (SPE) was used to isolate the analytes from abiotic matrices containing large amounts of dissolved organic material. Atmospheric-pressure ionization (API) with electrospray ionization in negative mode (ESP-) on a quadrupole ion trap mass spectrometer was used to characterize the herbicides of interest. The application of Laser Ablation-ICP-MS (LA-ICP-MS) to the analysis of soils and sediments is also reported. The analytical performance of the method was evaluated on certified standards and on real soil and sediment samples. Residential soils were analyzed to evaluate the feasibility of using the technique as a routine, rapid method for monitoring potentially contaminated sites. Forty-eight sediment samples were also collected from semi-pristine areas in South Florida to screen baseline levels of bioavailable elements in support of risk evaluation. The LA-ICP-MS data were used to perform a statistical evaluation of elemental composition as a tool for environmental forensics. A LA-ICP-MS protocol was also developed and optimized for the analysis of a wide range of elements in polymeric filters containing atmospheric dust. A quantitative strategy based on internal and external standards allowed rapid determination of airborne trace elements in filters containing both contemporary African dust and local dust emissions. These distributions were used to assess, qualitatively and quantitatively, differences in composition and to establish provenance and fluxes to protected regional ecosystems such as coral reefs and national parks.
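As an illustration of the internal/external-standard strategy mentioned above (the numbers and element choices are invented, not data from the study), the sketch below builds an external calibration of the analyte-to-internal-standard intensity ratio and uses it to quantify an unknown:

```python
# Illustrative internal-standard quantification for LA-ICP-MS;
# concentrations and count rates are invented for the example.
import numpy as np

# External calibration: known concentrations (ug/g) of the analyte
# in certified standards, and measured intensity ratios
# analyte / internal standard (the IS corrects for ablation yield).
conc_std  = np.array([0.0, 10.0, 50.0, 100.0])
ratio_std = np.array([0.002, 0.101, 0.498, 1.004])

slope, intercept = np.polyfit(conc_std, ratio_std, 1)

def quantify(analyte_counts: float, is_counts: float) -> float:
    """Return analyte concentration (ug/g) from raw count rates."""
    ratio = analyte_counts / is_counts
    return (ratio - intercept) / slope

print(f"{quantify(35000, 70000):.1f} ug/g")  # ratio 0.5 -> ~50 ug/g
```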
Abstract:
The call to access and preserve the state records that document crimes committed by the state during Guatemala's civil war has become an archival imperative entangled with neoliberal human rights discourses of "truth, justice, and memory." In Guatemala's civil war, 200,000 people were killed or disappeared, including acts of genocide in which 85% of massacres involved sexual violence committed against Mayan women. This dissertation argues that, in attempting to tell the official story of the civil war, American human rights organizations and academic institutions have constructed a normative identity whose humanity is attached to a scientific and evidentiary value, as well as to an archival status representing the materiality and institutionality of the record. Consequently, human rights discourses grounded in Western knowledges, in particular archival science and law, which prioritize the appearance of truth, erase the material and epistemological experience of indigenous women during wartime. As a result, the subjectivity that has surfaced on the record as most legible has mostly pertained to non-indigenous, middle-class, urban, leftist men who were victims of enforced disappearance, not genocide. This dissertation investigates this conflicting narrative, which remembers a non-indigenous revolutionary masculine hero and grants him justice in human rights courtrooms simply because a document attests to his death. A main research question addressed in this project is why the promise of "truth and justice" in the name of human rights becomes a contentious site for gendered indigenous bodies. I conduct a discursive and rhetorical analysis of documentary film and of declassified Guatemalan police and military records such as Operation Sofia, a military log known for "documenting the genocide" during rural counterinsurgencies executed by the military. I interrogate the ways in which racialized feminicides, the hyper-sexualized racial violence that has historically dehumanized indigenous women, fall outside the discourses of vision constructed by Western positivist knowledges to reinscribe the ideal human rights subject. I argue for alternative epistemological frames that recognize genocide as sexualized and gendered structures that have simultaneously produced racialized feminicides, in order to disrupt the colonial structures of capitalism, patriarchy and heterosexuality. Ironically, these structures of power remain untouched by the dominant human rights discourse and its academic, NGO, and state collaborators that seek "truth and justice" in post-conflict Guatemala.
Abstract:
The increasing use of social media, applications or platforms that allow users to interact online, ensures that this environment will provide a useful source of evidence for the forensics examiner. Current tools for the examination of digital evidence find this data problematic as they are not designed for the collection and analysis of online data. Therefore, this paper presents a framework for the forensic analysis of user interaction with social media. In particular, it presents an inter-disciplinary approach for the quantitative analysis of user engagement to identify relational and temporal dimensions of evidence relevant to an investigation. This framework enables the analysis of large data sets from which a (much smaller) group of individuals of interest can be identified. In this way, it may be used to support the identification of individuals who might be ‘instigators’ of a criminal event orchestrated via social media, or a means of potentially identifying those who might be involved in the ‘peaks’ of activity. In order to demonstrate the applicability of the framework, this paper applies it to a case study of actors posting to a social media Web site.
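As a toy sketch of the temporal dimension of such a framework (the post format, time window, and peak test are assumptions for illustration, not the paper's method), the following Python snippet buckets posts into hourly windows, flags unusually busy windows, and lists the most active accounts within each peak:

```python
# Toy sketch of the temporal side of such a framework: bucket posts
# into hourly windows, flag windows whose volume is anomalously high,
# and list the most active accounts inside each peak. The post format
# is an assumption for illustration.
from collections import Counter, defaultdict
from datetime import datetime
from statistics import mean, stdev

posts = [  # (author, ISO timestamp) -- fabricated demo data
    ("alice", "2024-05-01T10:05:00"), ("bob",   "2024-05-01T10:20:00"),
    ("carol", "2024-05-01T11:10:00"),
    ("alice", "2024-05-01T12:01:00"), ("bob",   "2024-05-01T12:02:00"),
    ("dave",  "2024-05-01T12:03:00"), ("alice", "2024-05-01T12:04:00"),
    ("eve",   "2024-05-01T12:05:00"),
]

buckets = defaultdict(list)
for author, ts in posts:
    hour = datetime.fromisoformat(ts).replace(minute=0, second=0)
    buckets[hour].append(author)

counts = {h: len(a) for h, a in buckets.items()}
mu, sigma = mean(counts.values()), stdev(counts.values())

for hour, authors in sorted(buckets.items()):
    if counts[hour] > mu + sigma:          # crude peak test
        print(hour, "peak:", Counter(authors).most_common(3))
```

On the demo data, only the 12:00 window is flagged, and "alice" surfaces as the most active account within it, which is the kind of narrowing from a large data set to a small group of individuals of interest that the framework describes.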
Abstract:
The Internet has become an integral part of our nation's critical socio-economic infrastructure. With its heightened use and growing complexity, however, organizations are at greater risk of cyber crime. To aid the investigation of crimes committed on or via the Internet, a network forensics analysis tool pulls together the needed digital evidence. It provides a platform for deep network analysis by capturing, recording and analyzing network events to determine the source of a security attack or other information security incident. Existing network forensics work has mostly focused on the Internet and fixed networks, but the exponential growth and use of wireless technologies, coupled with their unprecedented characteristics, necessitate new network forensic analysis tools. This dissertation fostered the emergence of a new research field in cellular and ad-hoc network forensics. It was one of the first works to identify this problem and to offer fundamental techniques and tools that laid the groundwork for future research. In particular, it introduced novel methods to record network incidents and to report logged incidents. For recording incidents, location is essential to documenting network incidents; however, in network topology spaces, location cannot be measured because there is no 'distance metric'. A novel solution was therefore proposed to label the locations of nodes within network topology spaces and then to authenticate the identity of nodes in ad hoc environments. For reporting logged incidents, a novel technique based on Distributed Hash Tables (DHTs) was adopted. Although the direct use of DHTs for reporting logged incidents would result in uncontrollably recursive traffic, a new mechanism was introduced that overcomes this recursion. These logging and reporting techniques aided forensics over cellular and ad-hoc networks, which in turn increased the ability to track and trace attacks to their source. They are a starting point for further research and development that would equip future ad hoc networks with forensic components to complement existing security mechanisms.
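To illustrate the DHT idea behind the reporting technique (this is a generic consistent-hashing sketch, not the dissertation's mechanism; node names and the incident-ID format are invented), the snippet below maps each logged incident to the node responsible for storing its report:

```python
# Minimal sketch of DHT-style incident reporting: each incident ID is
# hashed onto a ring and stored at the first node clockwise from it
# (consistent hashing). This only illustrates the DHT placement idea;
# the dissertation's recursion-avoidance mechanism is not reproduced.
import hashlib
from bisect import bisect

def ring_pos(key: str, bits: int = 32) -> int:
    digest = hashlib.sha1(key.encode()).hexdigest()
    return int(digest, 16) % (2 ** bits)

nodes = ["node-a", "node-b", "node-c", "node-d"]   # invented node IDs
ring = sorted((ring_pos(n), n) for n in nodes)

def responsible_node(incident_id: str) -> str:
    """First node clockwise from the incident's ring position."""
    pos = ring_pos(incident_id)
    idx = bisect(ring, (pos,)) % len(ring)
    return ring[idx][1]

print(responsible_node("incident:2024-05-01T12:03:00Z:10.0.0.7"))
```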