989 results for Crime detection
Abstract:
Background: Techniques for detecting circulating tumor cells in the peripheral blood of patients with head and neck cancers may identify individuals likely to benefit from early systemic treatment. Methods: Reconstruction experiments were used to optimise immunomagnetic enrichment and RT-PCR detection of circulating tumor cells using four markers (ELF3, CK19, EGFR and EphB4). This method was then tested in a pilot study using samples from 16 patients with advanced head and neck carcinomas. Results: Seven patients were positive for circulating tumor cells both prior to and after surgery, four patients were positive prior to but not after surgery, three patients were positive after but not prior to surgery, and two patients were negative. Two patients tested positive for circulating cells but there was no other evidence of tumor spread. Given that this patient cohort had mostly advanced disease, the detection of circulating tumor cells was, as expected, not associated with significant differences in overall or disease-free survival. Conclusion: For the first time, we show that almost all patients with advanced head and neck cancers have circulating cells at the time of surgery. The clinical application of techniques for detection of spreading disease, such as the immunomagnetic enrichment RT-PCR analysis used in this study, should be explored further.
Abstract:
Analysis of footprints or footwear impressions recovered from a crime scene is a well-known and well-accepted part of forensic investigation. When this evidence is obtained by investigating officers, comparative analysis with a suspect's evidence may be undertaken. This can be done either by detectives or, in some cases, by podiatrists with experience in forensic analysis. Frequently asked questions of a podiatrist include: "What additional information should be collected from a suspect (for the purposes of comparison), and how should it be collected?" This paper explores the answers to these and related questions based on 20 years of practical experience in the field of crime scene analysis as it relates to podiatry and forensics. Elements of normal and abnormal foot function are explored and used to explain the high degree of variability in wear patterns produced by the interaction of the foot and footwear. Based on this understanding, the potential for identifying unique features of the user and correlating these with footwear evidence becomes apparent. Standard protocols adopted by podiatrists allow for more precise, reliable, and valid results to be obtained from their analysis. Complex data sets are now being obtained by investigating officers and, in collaboration with the podiatrist, higher-quality conclusions are being achieved. This presentation details the results of investigations that have used standard protocols to collect and analyse footwear evidence from suspects in recent major crimes.
Abstract:
Given the serious nature of computer crime, and its global nature and implications, it is clear that there is a crucial need for a common understanding of such criminal activity internationally in order to deal with it effectively. Research into the extent to which legislation, international initiatives, and policy and procedures to combat and investigate computer crime are consistent globally is therefore of enormous importance. The challenge is to study, analyse, and compare the policies and practices of combating computer crime under different jurisdictions in order to identify the extent to which they are consistent with each other and with international guidelines, and the extent of their successes and limitations. The purpose ultimately is to identify areas where improvements are needed and what those improvements should be. This thesis examines approaches used for combating computer crime, including money laundering, in Australia, the UAE, the UK and the USA, four countries which represent a spectrum of economic development and culture. It does so in the context of the guidelines of international organizations such as the Council of Europe (CoE) and the Financial Action Task Force (FATF). In the case of the UAE, we also examine the cultural influences which differentiate it from the other three countries and which have necessarily been a factor in shaping its approaches to countering money laundering in particular. The thesis concludes that because of the transnational nature of computer crime there is a need internationally for further harmonisation of approaches for combating computer crime. The specific contributions of the thesis are as follows:
- Developing a new unified comprehensive taxonomy of computer crime based upon the dual characteristics of the role of the computer and the contextual nature of the crime
- Revealing differences in computer crime legislation in Australia, the UAE, the UK and the USA, and how they correspond to the CoE Convention on Cybercrime, and identifying a new framework to develop harmonised computer crime or cybercrime legislation globally
- Identifying some important issues that continue to create problems for law enforcement agencies, such as insufficient resources, coping internationally with computer crime legislation that differs between countries, having comprehensive documented procedures and guidelines for combating computer crime, and reporting and recording of computer crime offences as distinct from other forms of crime
- Completing the most comprehensive study currently available regarding the extent of money laundered in four such developed or fast-developing countries
- Identifying that the UK and the USA are the most advanced with regard to anti-money laundering and combating the financing of terrorism (AML/CFT) systems among the four countries, based on compliance with the FATF recommendations.
In addition, the thesis has identified that local factors have affected how the UAE has implemented its financial and AML/CFT systems, and reveals that such local and cultural factors should be taken into account when implementing or evaluating any country's AML/CFT system.
Abstract:
With the identification of common single locus point mutations as risk factors for thrombophilia, many DNA testing methodologies have been described for detecting these variations. Traditionally, functional or immunological testing methods have been used to investigate quantitative anticoagulant deficiencies. However, with the emergence of the genetic variations factor V Leiden, prothrombin 20210 and, to a lesser extent, the methylene tetrahydrofolate reductase (MTHFR677) and factor V HR2 haplotype, traditional testing methodologies have proved to be less useful and instead DNA technology is more commonly employed in diagnostics. This review considers many of the DNA techniques that have proved to be useful in the detection of common genetic variants that predispose to thrombophilia. Techniques involving gel analysis are used to detect the presence or absence of restriction sites, electrophoretic mobility shifts, as in single strand conformation polymorphism or denaturing gradient gel electrophoresis, and product formation in allele-specific amplification. Such techniques may be sensitive, but are unwieldy and often need to be validated objectively. In order to overcome some of the limitations of gel analysis, especially when dealing with larger sample numbers, many alternative detection formats, such as closed tube systems, microplates and microarrays (minisequencing, real-time polymerase chain reaction, and oligonucleotide ligation assays) have been developed. In addition, many of the emerging technologies take advantage of colourimetric or fluorescence detection (including energy transfer) that allows qualitative and quantitative interpretation of results. With the large variety of DNA technologies available, the choice of methodology will depend on several factors including cost and the need for speed, simplicity and robustness. © 2000 Lippincott Williams & Wilkins.
Abstract:
PCR-based cancer diagnosis requires detection of rare mutations in k-ras, p53 or other genes. The assumption has been that mutant and wild-type sequences amplify with near equal efficiency, so that they are eventually present in proportions representative of the starting material. Work on factor IX suggests that this assumption is invalid for one case of near-sequence identity. To test the generality of this phenomenon and its relevance to cancer diagnosis, primers distant from point mutations in p53 and k-ras were used to amplify wild-type and mutant sequences from these genes. A substantial bias against PCR amplification of mutants was observed for two regions of the p53 gene and one region of k-ras. For k-ras and p53, bias was observed when the wild-type and mutant sequences were amplified separately or when mixed in equal proportions before PCR. Bias was present with proofreading and non-proofreading polymerase. Mutant and wild-type segments of the factor V, cystic fibrosis transmembrane conductance regulator and prothrombin genes were amplified and did not exhibit PCR bias. Therefore, the assumption of equal PCR efficiency for point mutant and wild-type sequences is invalid in several systems. Quantitative or diagnostic PCR will require validation for each locus, and enrichment strategies may be needed to optimize detection of mutants.
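To see why even a small per-cycle bias matters, consider a simple compounding illustration (the efficiencies and cycle number below are assumed values for illustration, not figures from this study). If mutant and wild-type templates start at copy numbers M_0 and W_0 and amplify with per-cycle efficiencies e_m and e_w, then after n cycles

\[
\frac{M_n}{W_n} \;=\; \frac{M_0\,(1+e_m)^n}{W_0\,(1+e_w)^n},
\qquad\text{e.g.}\qquad
\left(\frac{1+0.8}{1+0.9}\right)^{30} \;\approx\; 0.20,
\]

so an efficiency gap of only 0.1 per cycle (0.8 versus 0.9) turns an initially equal mixture into roughly a five-fold under-representation of the mutant after 30 cycles, which is why per-locus validation or mutant-enrichment strategies are advocated above.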
Abstract:
Rates of female delinquency, especially for violent crimes, are increasing in most common law countries. At the same time the growth in cyber-bullying, especially among girls, appears to be a related global phenomenon. While the gender gap in delinquency is narrowing in Australia, the United States, Canada and the United Kingdom, boys continue to dominate the youth who commit crime and have a virtual monopoly over sexually violent crimes. Indigenous youth continue to be vastly over-represented in the juvenile justice system in every Australian jurisdiction. The Indigenisation of delinquency is a persistent problem in other countries such as Canada and New Zealand. Young people who gather in public places are susceptible to being perceived as somehow threatening or riotous, attracting more than their share of public order policing. Professional football has been marred by repeated scandals involving sexual assault, violence and drunkenness. Given the cultural significance of footballers as role models to thousands, if not millions, of young men around the world, it is vitally important to address this problem. Offending Youth explores these key contemporary patterns of delinquency, the responses to them by juvenile justice agencies and, moreover, what can be done to address these problems. The book also analyses the major policy and legislative changes from the nineteenth to twenty-first centuries, chiefly the shift from penal welfarism to diversion and restorative justice. Using original cases studied by Carrington twenty years ago, Offending Youth illustrates how penal welfarism criminalised young people from socially marginal backgrounds, especially Aboriginal children, children from single parent families, family-less children, state wards and young people living in poverty or in housing commission estates. A number of inquiries in Australia and the United Kingdom have since established that children committed to these institutions, supposedly for their own good, experienced systemic physical, sexual and psychological abuse during their institutionalisation. The book is dedicated to the survivors of these institutions who only now are receiving official recognition of the injustices they suffered. The underlying philosophy of juvenile justice has fundamentally shifted away from penal welfarism to embrace positive policy responses to juvenile crime, such as youth conferencing, cautions, warnings, restorative justice, circle sentencing and diversion, examined in the concluding chapter. Offending Youth is aimed at a broad readership including policy makers, juvenile justice professionals, youth workers, families, teachers, politicians as well as students and academics in criminology, policing, gender studies, masculinity studies, Indigenous studies, justice studies, youth studies and the sociology of youth and deviance more generally.-- [from publisher website]
Abstract:
This paper describes an effective method for signal-authentication and spoofing detection for civilian GNSS receivers using the GPS L1 C/A and the Galileo E1-B Safety of Life service. The paper discusses various spoofing attack profiles and how the proposed method is able to detect these attacks. This method is relatively low-cost and can be suitable for numerous mass-market applications. This paper is the subject of a pending patent.
Abstract:
Given the recent emergence of the smart grid and smart grid related technologies, their security is a prime concern. Intrusion detection provides a second line of defense. However, conventional intrusion detection systems (IDSs) are unable to adequately address the unique requirements of the smart grid. This paper presents a gap analysis of contemporary IDSs from a smart grid perspective. It highlights the lack of adequate intrusion detection within the smart grid and discusses the limitations of current IDS approaches. The gap analysis identifies current IDSs as being unsuited to smart grid application without significant changes to address smart grid specific requirements.
Abstract:
In this paper, a plasmonic “ac Wheatstone bridge” circuit is proposed and theoretically modeled for the first time. The bridge circuit consists of three metallic nanoparticles, shaped as rectangular prisms, with two nanoparticles acting as parallel arms of a resonant circuit and the third bridging nanoparticle acting as an optical antenna providing an output signal. Polarized light excites localized surface plasmon resonances in the two arms of the circuit, which generate an optical signal dependent on the phase-sensitive excitations of surface plasmons in the antenna. The circuit is analyzed using a plasmonic coupling theory and numerical simulations. The analyses show that the plasmonic circuit is sensitive to phase shifts between the arms of the bridge and has the potential to detect the presence of single molecules.
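For context, the classical ac Wheatstone bridge relation below (a textbook analogy only, not the plasmonic coupling theory developed in the paper, which treats coupled nanoparticle resonances rather than lumped impedances) shows why a bridging element acts as a sensitive null detector. With arm impedances Z_1 to Z_4 driven by V_in, the bridge output is

\[
V_{out} \;=\; V_{in}\!\left(\frac{Z_2}{Z_1+Z_2}-\frac{Z_4}{Z_3+Z_4}\right)
        \;=\; V_{in}\,\frac{Z_2 Z_3 - Z_1 Z_4}{(Z_1+Z_2)(Z_3+Z_4)},
\]

which vanishes at balance (Z_1 Z_4 = Z_2 Z_3) and responds to small amplitude or phase imbalances between the arms; the plasmonic circuit exploits the same null-detection principle optically, with the antenna nanoparticle playing the role of the bridge detector.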
Abstract:
In previous research (Chung et al., 2009), the potential of the continuous risk profile (CRP) to proactively detect the systematic deterioration of freeway safety levels was presented. In this paper, this potential is investigated further, and an algorithm is proposed for proactively detecting sites where the collision rate is not sufficiently high to be classified as a high collision concentration location but where a systematic deterioration of safety level is observed. The approach proposed compares the weighted CRP across different years and uses the cumulative sum (CUSUM) algorithm to detect the sites where changes in collision rate are observed. The CRPs of the detected sites are then compared for reproducibility. When high reproducibility is observed, a growth factor is used for sequential hypothesis testing to determine if the collision profiles are increasing over time. Findings from applying the proposed method using empirical data are documented in the paper together with a detailed description of the method.
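To make the change-detection step concrete, the following Python fragment is a minimal one-sided CUSUM sketch (the function name, rates and thresholds are made-up illustrations, not the authors' weighted-CRP implementation): it flags the first year in which a site's collision rate shows a sustained upward shift from a reference level.

# Minimal one-sided CUSUM sketch for detecting an upward shift in a
# site's yearly collision rate. All parameter values are illustrative.
def cusum_increase(rates, reference_mean, slack_k, threshold_h):
    """Return the 1-based index of the first year whose cumulative positive
    deviation exceeds threshold_h, or None if no shift is detected."""
    s = 0.0
    for year, rate in enumerate(rates, start=1):
        s = max(0.0, s + (rate - reference_mean - slack_k))
        if s > threshold_h:
            return year
    return None

# Hypothetical yearly collision rates for one site (collisions per
# million vehicle-miles); values are made up for illustration.
rates = [0.82, 0.85, 0.88, 0.97, 1.10, 1.25]
print(cusum_increase(rates, reference_mean=0.84, slack_k=0.05, threshold_h=0.3))  # -> 6

Sites flagged this way would then, per the abstract, be checked for reproducibility across years before sequential hypothesis testing with a growth factor.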
Abstract:
Although many incidents of fake online consumer reviews have been reported, very few studies have been conducted to date to examine the trustworthiness of online consumer reviews. One of the reasons is the lack of an effective computational method to separate untruthful reviews (i.e., spam) from legitimate ones (i.e., ham), given that prominent spam features are often missing in online reviews. The main contribution of our research work is the development of a novel review spam detection method which is underpinned by an unsupervised inferential language modeling framework. Another contribution of this work is the development of a high-order concept association mining method which provides the essential term association knowledge to bootstrap the performance of untruthful review detection. Our experimental results confirm that the proposed inferential language model, equipped with high-order concept association knowledge, is effective in untruthful review detection when compared with other baseline methods.
Abstract:
Data preprocessing is widely recognized as an important stage in anomaly detection. This paper reviews the data preprocessing techniques used by anomaly-based network intrusion detection systems (NIDS), concentrating on which aspects of the network traffic are analyzed, and what feature construction and selection methods have been used. Motivation for the paper comes from the large impact data preprocessing has on the accuracy and capability of anomaly-based NIDS. The review finds that many NIDS limit their view of network traffic to the TCP/IP packet headers. Time-based statistics can be derived from these headers to detect network scans, network worm behavior, and denial of service attacks. A number of other NIDS perform deeper inspection of request packets to detect attacks against network services and network applications. More recent approaches analyze full service responses to detect attacks targeting clients. The review covers a wide range of NIDS, highlighting which classes of attack are detectable by each of these approaches. Data preprocessing is found to predominantly rely on expert domain knowledge for identifying the most relevant parts of network traffic and for constructing the initial candidate set of traffic features. On the other hand, automated methods have been widely used for feature extraction to reduce data dimensionality, and feature selection to find the most relevant subset of features from this candidate set. The review shows a trend toward deeper packet inspection to construct more relevant features through targeted content parsing. These context sensitive features are required to detect current attacks.
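To illustrate the kind of time-based header statistic described above, the sketch below (field names, window length and threshold are assumptions for illustration, not taken from any particular NIDS in the review) counts the distinct destination ports each source IP touches within a sliding time window, a simple scan indicator derivable from TCP/IP packet headers alone.

from collections import defaultdict, deque

WINDOW_SECONDS = 10     # sliding-window length (assumed)
PORT_THRESHOLD = 100    # distinct destination ports per window treated as suspicious (assumed)

def detect_port_scans(packets):
    """packets: time-ordered iterable of (timestamp, src_ip, dst_port) tuples
    taken from TCP/IP headers. Yields (timestamp, src_ip) whenever a source
    exceeds the distinct-destination-port threshold inside the window."""
    recent = defaultdict(deque)   # src_ip -> deque of (timestamp, dst_port)
    for ts, src, dport in packets:
        window = recent[src]
        window.append((ts, dport))
        # evict header records that have aged out of the time window
        while ts - window[0][0] > WINDOW_SECONDS:
            window.popleft()
        if len({port for _, port in window}) > PORT_THRESHOLD:
            yield ts, src

More elaborate header-derived features of this flavour are what the reviewed systems construct before automated feature extraction and selection are applied.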