Abstract:
Human hair fibres are ubiquitous in nature and are frequently found at crime scenes, often as a result of exchange between the perpetrator, victim and/or the surroundings in accordance with Locard's Principle. Hair fibre evidence can therefore provide important information for crime investigation. For human hair evidence, current forensic methods of analysis rely on comparisons of either hair morphology by microscopic examination or nuclear and mitochondrial DNA analyses. Unfortunately, in some instances microscopy and DNA analyses are difficult and often not feasible. This dissertation is arguably the first comprehensive investigation aimed at comparing, classifying and identifying single human scalp hair fibres with the aid of FTIR-ATR spectroscopy in a forensic context. Spectra were collected from the hair of 66 subjects of Asian, Caucasian and African (i.e. African-type) origin. The fibres ranged from untreated to variously mildly and heavily cosmetically treated hairs. The collected spectra reflected the physical and chemical nature of the hair near-surface, particularly the cuticle layer. In total, 550 spectra were acquired and processed to construct a relatively large database. To assist with the interpretation of the complex spectra from various types of human hair, Derivative Spectroscopy and chemometric methods were utilised, including Principal Component Analysis (PCA), Fuzzy Clustering (FC) and the Multi-Criteria Decision Making (MCDM) methods Preference Ranking Organisation Method for Enrichment Evaluation (PROMETHEE) and Geometrical Analysis for Interactive Aid (GAIA). FTIR-ATR spectroscopy had two important advantages over previous methods: (i) sample throughput and spectral collection were significantly improved (no physical flattening or microscope manipulations), and (ii) given recent advances in FTIR-ATR instrument portability, there is real potential to transfer this work's findings seamlessly to in-field applications.
The "raw" spectra, spectral subtractions and second-derivative spectra were compared to demonstrate the subtle differences in human hair. SEM images were used as corroborative evidence of the surface topography of hair, indicating that the condition of the cuticle surface could be of three types: untreated, mildly treated and chemically treated. Extensive studies of potential spectral band regions responsible for matching and discrimination of various types of hair samples suggested that the 1690-1500 cm-1 IR spectral region was to be preferred over the commonly used 1750-800 cm-1 region. The principal reason was the presence of highly variable spectral profiles of cystine oxidation products (1200-1000 cm-1), which contributed significantly to spectral scatter and hence poor hair sample matching. In the preferred 1690-1500 cm-1 region, conformational changes in the keratin protein, attributed to α-helical to β-sheet transitions in the Amide I and Amide II vibrations, played a significant role in matching and discrimination of the spectra and hence of the hair fibre samples. For gender comparison, the Amide II band is significant for differentiation: male hair spectra exhibited a more intense β-sheet vibration in the Amide II band at approximately 1511 cm-1, whilst female hair spectra displayed a more intense α-helical vibration at 1520-1515 cm-1. In terms of chemical composition, female hair spectra exhibited greater intensities for the amino acids tryptophan (1554 cm-1) and aspartic and glutamic acid (1577 cm-1). It was also observed that, for the separation of samples based on racial differences, untreated Caucasian hair was discriminated from Asian hair as a result of its higher levels of the amino acid cystine and of cysteic acid. However, when mildly or chemically treated, Asian and Caucasian hair fibres are similar, whereas African-type hair fibres are different.
In terms of its novel contribution to the field of forensic science, the investigation has allowed the development of a multifaceted, methodical protocol where previously none existed. The protocol is a systematic method to rapidly investigate unknown or questioned single human hair FTIR-ATR spectra of different gender and racial origin, including fibres with different cosmetic treatments. Unknown or questioned spectra are first separated on the basis of chemical treatment (i.e. untreated, mildly treated or chemically treated), then gender, and then racial origin (i.e. Asian, Caucasian or African-type). The methodology has the potential to complement the current forensic analysis methods of fibre evidence (i.e. microscopy and DNA), providing information at the morphological, genetic and structural levels.
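The chemometric workflow described in this abstract (second-derivative preprocessing of spectra followed by PCA) can be illustrated with a minimal NumPy-only sketch. The data here are synthetic Gaussian bands standing in for the Amide I/II window, not the study's spectra, and the preprocessing is deliberately simplified (a plain finite-difference second derivative rather than, say, Savitzky-Golay smoothing):

```python
import numpy as np

def second_derivative(spectra, h=1.0):
    """Finite-difference second derivative along the wavenumber axis."""
    return np.gradient(np.gradient(spectra, h, axis=1), h, axis=1)

def pca_scores(X, n_components=2):
    """PCA via SVD of mean-centred data; returns the sample scores."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

# Synthetic stand-in for baseline-corrected absorbance spectra over the
# 1690-1500 cm-1 region: two groups with band maxima at different positions.
rng = np.random.default_rng(0)
wavenumbers = np.linspace(1690, 1500, 96)
group_a = np.exp(-((wavenumbers - 1650) / 15) ** 2) + 0.02 * rng.standard_normal((10, 96))
group_b = np.exp(-((wavenumbers - 1515) / 15) ** 2) + 0.02 * rng.standard_normal((10, 96))
X = np.vstack([group_a, group_b])

scores = pca_scores(second_derivative(X))
print(scores.shape)  # (20, 2)
```

On data like these, the first principal component separates the two synthetic groups, which is the role PCA plays in the matching and discrimination steps described above.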
Abstract:
It is generally acknowledged that mooting is an effective way to enhance the teaching of practical skills in legal education, as well as to provide an authentic learning experience with links to the real world. However, there are a number of impediments to students participating in mooting, in particular being located off-campus, inexperience and lack of time. It has been suggested that technology may be a means of overcoming these impediments; however, the use of technology in mooting has not been tested. This paper reports on a trial of Second Life, Elluminate and videoconferencing as platforms for the conduct of moots. The trials identified limitations in the use of technology for mooting, particularly in relation to the development of advocacy skills. The paper concludes that these limitations can be overcome by careful consideration of the appropriate technology to be used, depending on the context and the objectives to be achieved by the moot. It also suggests that, in order to provide an authentic use of online communication technology in a court setting, the best available technology should be used for the conduct of moot competitions.
Abstract:
The focus of the present research was to investigate how Local Governments in Queensland were progressing with the adoption of delineated DM policies and supporting guidelines. The study consulted Local Government representatives and hence the results reflect their views on these issues. Is adoption occurring? To what degree? Are policies and guidelines being effectively implemented so that the objective of a safer, more resilient community is being achieved? If not, what are the current barriers to achieving this, and can recommendations be made to overcome these barriers? These questions defined the basis on which the present study was designed and the survey tools developed. While it was recognised that LGAQ and Emergency Management Queensland (EMQ) may have differing views on some reported issues, it was beyond the scope of the present study to canvass those views. The study resolved to document and analyse these questions under the broad themes of:
• Building community capacity (notably via community awareness).
• Council operationalisation of DM.
• Regional partnerships (in mitigation/adaptation).
Data was collected via a survey tool comprising two components:
• an online questionnaire survey distributed via the LGAQ Disaster Management Alliance (hereafter referred to as the “Alliance”) to the DM sections of all Queensland Local Government Councils; and
• a series of focus groups with selected Queensland Councils.
Abstract:
Choi et al. recently proposed an efficient RFID authentication protocol for a ubiquitous computing environment, OHLCAP (One-Way Hash based Low-Cost Authentication Protocol). However, this paper reveals that the protocol has several security weaknesses: (1) traceability based on the leakage of counter information, (2) vulnerability to an impersonation attack by maliciously updating a random number, and (3) traceability based on a physically attacked tag. Finally, a security-enhanced group-based authentication protocol is presented.
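To make the setting concrete, here is a minimal sketch of a hash-based challenge-response tag/reader exchange of the general kind the abstract discusses. This is an illustrative toy, not the actual OHLCAP specification: the tag, reader, key names and message layout are all assumptions. Note how the counter is transmitted in the clear, which is exactly the kind of information leakage that the first weakness (traceability) exploits:

```python
import hashlib
import secrets

def h(*parts: bytes) -> bytes:
    """One-way hash over the concatenated inputs."""
    return hashlib.sha256(b"".join(parts)).digest()

class Tag:
    """Illustrative hash-based tag (toy model, not the OHLCAP spec)."""
    def __init__(self, tag_id: bytes, group_key: bytes):
        self.tag_id = tag_id
        self.group_key = group_key
        self.counter = 0

    def respond(self, challenge: bytes) -> tuple[int, bytes]:
        self.counter += 1
        ctr = self.counter.to_bytes(8, "big")
        # Binding the counter into the hash hides the ID from eavesdroppers,
        # but sending the counter itself in the clear (as here) lets an
        # adversary link successive responses to the same tag.
        return self.counter, h(self.tag_id, self.group_key, challenge, ctr)

class Reader:
    def __init__(self, known_tags: dict):
        self.known_tags = known_tags  # {tag_id: group_key}

    def authenticate(self, tag: Tag) -> bool:
        challenge = secrets.token_bytes(16)
        counter, resp = tag.respond(challenge)
        ctr = counter.to_bytes(8, "big")
        # Recompute the hash for every enrolled tag and look for a match.
        return any(h(tid, key, challenge, ctr) == resp
                   for tid, key in self.known_tags.items())

tag = Tag(b"tag-01", b"group-key")
reader = Reader({b"tag-01": b"group-key"})
print(reader.authenticate(tag))  # True
```

A group-based variant, as proposed in the paper's enhancement, would share `group_key` across a group of tags so the reader's search cost scales with the number of groups rather than the number of tags.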
Abstract:
This paper examines the interactions between knowledge and power in the adoption of technologies central to municipal water supply plans, specifically investigating decisions in Progressive Era Chicago regarding water meters. The invention and introduction into use of the reliable water meter early in the Progressive Era allowed planners and engineers to gauge water use, and enabled communities willing to invest in the new infrastructure to allocate costs for provision of supply to consumers relative to use. In an era where efficiency was so prized and the role of technocratic expertise was increasing, Chicago’s continued failure to adopt metering (despite levels of per capita consumption nearly twice that of comparable cities and acknowledged levels of waste nearing half of system production) may indicate that the underlying characteristics of the city’s political system and its elite stymied the implementation of metering technologies as in Smith’s (1977) comparative study of nineteenth century armories. Perhaps, as with Flyvbjerg’s (1998) study of the city of Aalborg, the powerful know what they want and data will not interfere with their conclusions: if the data point to a solution other than what is desired, then it must be that the data are wrong. Alternatively, perhaps the technocrats failed adequately to communicate their findings in a language which the political elite could understand, with the failure lying in assumptions of scientific or technical literacy rather than with dissatisfaction in outcomes (Benveniste 1972). 
When examined through a historical institutionalist perspective, the case study of metering adoption lends itself to exploration of larger issues of knowledge and power in the planning process: what governs decisions regarding knowledge acquisition, how knowledge and power interact, whether the potential to improve knowledge leads to changes in action, and, whether the decision to overlook available knowledge has an impact on future decisions.
Abstract:
The indecision surrounding the definition of Technology extends to the classroom, as not knowing what a subject "is" affects how it is taught. Similarly, its relative newness, and consequent lack of habitus in school settings, means that it is still struggling to find its own place in the curriculum as well as to resolve its relationship with more established subject domains, particularly Science and Mathematics. The guidance from syllabus documents points to open-ended, student-directed projects, whereas extant studies indicate a more common experience of teacher-directed activities and an emphasis on product over process. There are issues too for researchers in documenting classroom observations and in analysing teacher practice in new learning environments. This paper presents a framework for defining and mapping classroom practice and for attempting to describe the social practice in the Technology classroom. The framework is a bricolage which draws on contemporary research. More formally, the development of the framework is consonant with the aim of design-based research to develop a flexible, adaptive and generalisable theory to better understand a teaching domain where promise is not seen to match current reality. The framework may also inform emergent approaches to STEM (Science, Technology, Engineering and Mathematics) education.
Abstract:
A vast proportion of companies nowadays are looking to design, focusing on end users as a means of driving new projects. However, many companies are still drawn to the technological improvements which drive innovation within their industry context. The Australian livestock industry is no different. To date, the adoption of new products and services within the livestock industry has been documented as being quite slow. This paper investigates how disruptive innovation should be a priority for these technologically focused companies and demonstrates how the use of design-led innovation can bring about a higher-quality engagement between end user and company alike. A case study linking participatory design and design thinking is presented. Within this, a conceptual model for presenting future scenarios to internal and external stakeholders is applied to the livestock industry, assisting companies to apply strategy, culture and advancement in meaningful product offerings to consumers.
Abstract:
In Australia, trials conducted as 'electronic trials' have ordinarily run with the assistance of commercial service providers, with the associated costs being borne by the parties. However, an innovative approach has been taken by the courts in Queensland. In October 2007 Queensland became the first Australian jurisdiction to develop its own court-provided technology to facilitate the conduct of an electronic trial. This technology was first used in the conduct of civil trials. Its use in the civil sphere highlighted its benefits and, more significantly, demonstrated the potential to achieve much greater efficiencies. The Queensland courts have now gone further, using the court-provided technology in the high-profile criminal trial of R v Hargraves, Hargraves and Stoten, in which the three accused were tried for conspiracy to defraud the Commonwealth of Australia of about $3.7 million in tax. This paper explains the technology employed in this case and reports on the perspectives of all of the participants in the process. The representatives for all parties involved in this trial acknowledged, without reservation, that the use of the technology at trial produced considerable overall efficiencies and cost savings. The experience in this trial also demonstrates that the benefits of trial technology for the criminal justice process are greater than those for civil litigation. It shows that, when skilfully employed, trial technology presents opportunities to enhance the fairness of trials for accused persons. The paper urges governments, courts and the judiciary in all jurisdictions to continue their efforts to promote change, and to introduce mechanisms to facilitate more broadly a shift from the entrenched paper-based approach to both criminal and civil procedure to one which embraces the enormous benefits trial technology has to offer.
Abstract:
In recent years, social technologies such as wikis, blogs and microblogging have seen exponential growth in their user base, making this type of technology one of the most significant networking and knowledge-sharing platforms, with potentially hundreds of millions of users. However, the adoption of these technologies has so far been mostly for private purposes. First attempts have been made to embed features of social technologies in the corporate IT landscape, and Business Process Management is no exception. This paper aims to consolidate the opportunities for integrating social technologies into the different stages of the business process lifecycle. Thus, it contributes to a conceptualization of this fast-growing domain and can help to categorize academic and corporate development activities.
Abstract:
The emergence of mobile and ubiquitous computing has created what is referred to as a hybrid space: a virtual layer of digital information and interaction opportunities that sits on top of and augments the physical environment. The increasing connectedness through such media, from anywhere to anybody at any time, makes us less dependent on being physically present somewhere in particular. But what is the role of ubiquitous computing in making physical presence at a particular place more attractive? Acknowledging historic context and identity as important attributes of place, this work embarks on a 'global sense of place' in which the cultural diversity, multiple identities, backgrounds, skills and experiences of people traversing a place are regarded as social assets of that place. The aim is to explore ways in which the physical architecture and infrastructure of a place can be mediated towards making invisible social assets visible, thus augmenting people's situated social experience. The focus is thereby on embodied media, i.e. media that materialise digital information as observable and sometimes interactive parts of the physical environment and hence amplify people's real-world experience, rather than substituting it or moving it to virtual spaces.
Abstract:
With the identification of common single-locus point mutations as risk factors for thrombophilia, many DNA testing methodologies have been described for detecting these variations. Traditionally, functional or immunological testing methods have been used to investigate quantitative anticoagulant deficiencies. However, with the emergence of the genetic variations factor V Leiden, prothrombin 20210 and, to a lesser extent, the methylene tetrahydrofolate reductase (MTHFR677) and factor V HR2 haplotype, traditional testing methodologies have proved to be less useful, and instead DNA technology is more commonly employed in diagnostics. This review considers many of the DNA techniques that have proved to be useful in the detection of common genetic variants that predispose to thrombophilia. Techniques involving gel analysis are used to detect the presence or absence of restriction sites, electrophoretic mobility shifts, as in single-strand conformation polymorphism or denaturing gradient gel electrophoresis, and product formation in allele-specific amplification. Such techniques may be sensitive, but are unwieldy and often need to be validated objectively. In order to overcome some of the limitations of gel analysis, especially when dealing with larger sample numbers, many alternative detection formats, such as closed-tube systems, microplates and microarrays (minisequencing, real-time polymerase chain reaction, and oligonucleotide ligation assays), have been developed. In addition, many of the emerging technologies take advantage of colourimetric or fluorescence detection (including energy transfer) that allows qualitative and quantitative interpretation of results. With the large variety of DNA technologies available, the choice of methodology will depend on several factors including cost and the need for speed, simplicity and robustness. © 2000 Lippincott Williams & Wilkins.
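The first gel-based technique mentioned above, detecting the presence or absence of restriction sites, rests on a simple idea: a point mutation can destroy (or create) an enzyme's recognition sequence, changing the fragment lengths seen on a gel. The sketch below illustrates this with a hypothetical 30 bp amplicon and a hypothetical recognition sequence; the cut position is simplified to the start of the site, and neither the sequence nor the enzyme corresponds to a real assay:

```python
def digest_fragments(seq: str, site: str) -> list[int]:
    """Fragment lengths obtained by cutting seq at every occurrence of site.
    Simplification: the cut is placed at the start of the recognition sequence."""
    cuts, i = [], seq.find(site)
    while i != -1:
        cuts.append(i)
        i = seq.find(site, i + 1)
    bounds = [0] + cuts + [len(seq)]
    return [b - a for a, b in zip(bounds, bounds[1:]) if b > a]

# Hypothetical 30 bp amplicon: a G>A substitution destroys one GAGTC site,
# so the variant allele yields fewer, longer fragments on the gel.
wild_type = "AATTGAGTCCATGAGTCAATCCGGATCCAA"
variant = wild_type.replace("GAGTC", "GAATC", 1)  # first site lost

print(digest_fragments(wild_type, "GAGTC"))  # [4, 8, 18]
print(digest_fragments(variant, "GAGTC"))    # [12, 18]
```

The differing fragment patterns are what an electrophoresis gel visualises, which is also why such assays need objective validation: a partial digest can mimic the mutant pattern.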
Abstract:
Activated protein C resistance (APCR), the most common risk factor for venous thrombosis, is the result of a G to A base substitution at nucleotide 1691 (R506Q) in the factor V gene. Current techniques to detect the factor V Leiden mutation, such as determination of restriction fragment length polymorphisms, do not have the capacity to screen large numbers of samples in a rapid, cost-effective test. The aim of this study was to apply first nucleotide change (FNC) technology to the detection of the factor V Leiden mutation. After preliminary amplification of genomic DNA by polymerase chain reaction (PCR), an allele-specific primer was hybridised to the PCR product and extended using fluorescent terminating dideoxynucleotides, which were detected by colorimetric assay. Using this ELISA-based assay, the prevalence of the factor V Leiden mutation was determined in an Australian blood donor population (n = 500). A total of 18 heterozygotes were identified (3.6%), and all of these were confirmed with a conventional MnlI restriction digest. No homozygotes for the variant allele were detected. We conclude from this study that the frequency of 3.6% is compatible with others published for Caucasian populations. In addition, FNC technology shows promise as the basis for a rapid, automated DNA-based test for factor V Leiden.
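A quick check of the reported figures shows why finding no homozygotes in 500 donors is unsurprising. Using the study's counts (18 heterozygotes, 0 homozygotes), the variant allele frequency and the Hardy-Weinberg expectation for homozygotes work out as follows (the Hardy-Weinberg calculation is our illustration, not part of the original analysis):

```python
# Counts reported in the study: 18 heterozygotes among 500 donors, 0 homozygotes.
n, het, hom = 500, 18, 0

carrier_freq = het / n                # 18/500 = 0.036, i.e. the reported 3.6%
q = (het + 2 * hom) / (2 * n)         # variant allele frequency: 18/1000 = 0.018
expected_hom = q ** 2 * n             # Hardy-Weinberg expected homozygote count

print(f"carriers: {carrier_freq:.1%}, q = {q:.3f}, "
      f"expected homozygotes in {n}: {expected_hom:.2f}")
```

With an expectation of roughly 0.16 homozygotes in a sample of 500, observing none is entirely consistent with the reported carrier frequency.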